In the age of the internet, we’ve seen how important it is to protect the privacy of ordinary citizens online. Big Tech companies have proven they’re willing to take advantage of lax legislation. Facebook has faced lawsuits for collecting biometric data without permission[1], Google uses so-called dark patterns to make opting out of data collection nearly impossible[2], and Amazon’s Echo is essentially a privacy leak sitting in the middle of our homes[3]. In light of these privacy violations, we would like to take some time to audit the current privacy landscape.
Gaps in Privacy Law
Since May 2018, when the European Union’s General Data Protection Regulation (GDPR) went into effect, private citizens around the world have had their eyes opened to the potential of privacy laws to protect them. The United States has been slowly but surely piecing together similar protections, from the Virginia Consumer Data Protection Act[5] to the California Privacy Rights Act of 2020[6]. Unfortunately, these laws have come from individual states, creating a patchwork quilt of digital protections. The internet, however, doesn’t respect the borders of the 50 states, and this patchy approach is insufficient to protect private citizens online.
Congress should look to recreate what California has done with the California Consumer Privacy Act (CCPA) on a federal level. Passed in 2018, the CCPA grants California residents several digital privacy rights, including the right to discover what data businesses have collected about them.
The CCPA requires businesses subject to its jurisdiction to give customers the ability to say no to the sale of their data. You may have noticed a “do not sell my personal information” link on more websites in recent years. You have the CCPA to thank for that.
The CCPA and the GDPR have served as essential foundations for discussions of the future of privacy online. Yet the United States hasn’t passed a similar provision at the federal level; in fact, Congress voted to repeal rules that would have allowed the FCC to institute more stringent privacy protections[7]. Dig a little deeper into this glaring Congressional oversight, and the root cause of the federal reluctance appears to be money. According to the Information Technology & Innovation Foundation, federalizing a bill like the CCPA would carve $122 billion out of the US GDP every year[4].
These costs come in the form of compliance burdens on businesses that handle or sell private data. When data privacy bills go into effect, businesses need to hire customer data stewards, and they become subject to time-consuming privacy audits. These additional hires and audits require businesses to redirect capital toward secure storage solutions and the dedicated staff needed to fulfill customer data requests.
Big Tech’s Loopholes
Big Tech companies like Apple, Google, and Microsoft, along with smaller companies like Ring (before its acquisition by Amazon), have undoubtedly brought us creature comforts previously unheard of. Online shops predict our needs, smart home devices automate our morning routines, and photo albums automatically organize our shots with object and facial detection, streamlining our lives like never before. Yet these companies tend to make their technological strides while trampling, time and again, on our reasonable expectation of privacy. Big Tech has been delving deeper into the world of facial recognition software, and it is using our faces to do it. Social media sites are where the average user saves vacation photos and other precious memories, and in the last few years those photos have become algorithm food. Data that users assumed would be housed safely on social platforms has instead been monetized and meticulously annotated in the name of research or profit, feeding ever more accurate facial recognition databases that are bought and sold to third parties without our say in the matter.
Google is currently facing scrutiny from the Attorneys General of Washington, D.C., and Texas for using “dark patterns” to extract location data from users who would otherwise have opted out of such tracking. Dark patterns are design tricks that use social engineering or deceptive interface design to push users toward harmful decisions they would not otherwise have made. The suits cite the fact that users unwilling to have their location data tracked and sold by Google are effectively boxed out of services provided by non-Google companies like Uber until they grant Google permission to collect their data[8]. The suits claim that Google violated D.C.’s Consumer Protection Procedures Act and Texas’ Deceptive Trade Practices Consumer Protection Act. It is worth noting that Google would have a much harder time extracting data from customers if such protections existed at the federal level.
Privacy Without Compromise
We believe in an internet that doesn’t take advantage of its users. To that end, we’ve built our business on minding our own business. Our decentralized file-sharing servers create a space online for your data that won’t be sold to a third party or glimpsed by prying eyes. While we advocate for privacy laws that give citizens their privacy back, we’ll work to keep your data private as best we can.
You can try AXEL Go Premium, with all features unlocked, free for 14 days. Sign up today and see how AXEL Go can improve your workflow and harden your organization’s cybersecurity.
References
[1] “How to Avoid Unwanted Photos on Social Media.” The Wall Street Journal. Dow Jones & Company, January 23, 2022. https://www.wsj.com/articles/how-to-avoid-unwanted-photos-on-social-media-11642933804.
[2] [8] DeGeurin, Mack. “Google Illegally Used Dark Patterns to Trick Users Into Handing Over Location Data, State AGs Say.” Gizmodo. Gizmodo, January 24, 2022. https://gizmodo.com/google-lawsuit-location-data-attorneys-general-1848410222.
[3] Benjamin, Garfield. “Amazon Echo’s Privacy Issues Go Way beyond Voice Recordings.” The Conversation, April 8, 2021. https://theconversation.com/amazon-echos-privacy-issues-go-way-beyond-voice-recordings-130016.
[4] Ashley Johnson and Daniel Castro. “Why Congress Should Pass Data Privacy Legislation in 2022.” The Hill. The Hill, January 24, 2022. https://thehill.com/opinion/cybersecurity/591022-why-congress-should-pass-data-privacy-legislation-in-2022/.
[5] “Virginia Passes Comprehensive Privacy Law.” Gibson Dunn, March 8, 2021. https://www.gibsondunn.com/virginia-passes-comprehensive-privacy-law/.
[6] “California Consumer Privacy Act (CCPA).” State of California – Department of Justice – Office of the Attorney General, March 28, 2022. https://oag.ca.gov/privacy/ccpa.
[7] Fung, Brian. “The House Just Voted to Wipe Away the FCC’s Landmark Internet Privacy Protections.” The Washington Post. WP Company, March 28, 2017. https://www.washingtonpost.com/news/the-switch/wp/2017/03/28/the-house-just-voted-to-wipe-out-the-fccs-landmark-internet-privacy-protections/.