By Rachel Miller
Abstract
Big tech companies such as Facebook, Google, Instagram, and Twitter offer their platforms for “free” because of the privacy-forsaking data tradeoff inherent in their use. Their claim is that people who don’t want to participate in that tradeoff can simply choose not to use these platforms. In reality, however, these services have become so ingrained in and essential to modern life that people cannot realistically choose NOT to use them. Consumers are thus undercompensated for their data; companies benefit far more than any individual. Users receive an essentially necessary service and pay with their personal information and privacy. The time has come to compensate people, monetarily, for their data.
The Issue
It is news to no one that data is this century's oil. Unlike oil, however, data flows from the individual to corporate giants, which package it, sell it, and disseminate it further in ways unseen by its owner. Consumers nonetheless sense that something is happening - and are largely uncomfortable about it.
Platforms such as Facebook, Google, Instagram, and Twitter cost the user nothing to sign up for and use to their heart's content. But data is a currency in its own right, and it has become common knowledge that people actually pay for these ostensibly free products and services with personal data, at the expense of their privacy.
Over the past few years, especially following the 2018 Facebook-Cambridge Analytica scandal, privacy concerns have mounted to an all-time high. According to a 2020 survey by WhistleOut, 85% of respondents believe that at least one big tech company is currently "spying on them."
The by-and-large response of these tech companies is that users who do not agree with their practices can simply elect not to use the platform or service. And yet, few people have actually opted out. Why?
The Necessity of Big Tech's Services
In reality, these services have become so ingrained in and essential to modern life that individuals cannot realistically choose not to use them. Users' data is dramatically undervalued; tech companies benefit far more than any person. Users receive an essentially “necessary” service and pay with their data - which is monetized and exploited at the expense of their privacy - whether they like it or not.
Modern society operates on the assumption that the average individual holds accounts and is active on the major tech platforms. Universities and student clubs organize events via Facebook groups. NFX Partner James Currier recently wrote, "I am captive...to Google and Twitter because that’s where everyone in my field is. I can’t escape that situation. We are captive to LinkedIn because that’s how we are seen by employers and partners we care about. And we can’t opt out...The more we use them, the more valuable they become to all of us, and the more pain we would feel if we had to stop using them. We’re never [going to] stop."
Sure, an individual can still technically choose not to use these platforms, but only at significant cost: social life, participation in one's professional field, job prospects, and more would be sharply restricted, and alienation or isolation could follow. So people accept the terms and practices of tech giants, absentmindedly ticking a checkbox that exists chiefly to shield companies from liability. There is only an illusion of choice.
Opting out of data sharing and cookies is, in practice, nearly impossible. Some websites place cookies the moment a user visits them; many offer no opt-out capability at all, and some that do offer one do not honor it. That users cannot realistically pursue this option further exacerbates the problem.
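To illustrate how little it takes, here is a minimal, self-contained sketch of a server that tags every first-time visitor. It is a hypothetical toy - the "visitor_id" cookie and the server itself are invented for illustration, using only Python's standard library - but it captures what can happen before any consent banner is even drawn on screen.

```python
# A minimal sketch (hypothetical "visitor_id" cookie, standard library
# only) of a server tagging a visitor on the very first response,
# before any consent dialog is shown.
from http.server import BaseHTTPRequestHandler, HTTPServer
import uuid

class TrackingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # The cookie rides along with the first page the visitor ever
        # receives -- no opt-in, no opt-out.
        self.send_header(
            "Set-Cookie",
            f"visitor_id={uuid.uuid4().hex}; Max-Age=31536000; Path=/",
        )
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Welcome!</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), TrackingHandler).serve_forever()
```

Every subsequent request from that browser now carries the identifier automatically; "opting out" after the fact does nothing to undo the tag.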
The Social Contract
A social contract is a largely unwritten agreement between parties, such as a society and its governing body, to cooperate for mutual benefit. Part of the problem is that an existing social contract keeps big tech's exploitative use of personal data rigid and unchangeable. In the United States, a social contract regarding privacy is already in place: citizens are expected to relinquish a significant portion of it so that governing bodies may optimize protective processes and systems. It's a "tradeoff." In operation, this manifests as social security numbers, drivers' licenses, credit scores, insurance, etc., all of which track what cars we drive, what jobs we hold, where we live, our health information, and more - all information which is arguably private, but is made less so for the sake of society as a whole.
Though governments may be bound by legislative restrictions, principles, and checks and balances, the private sector is largely not: there is currently no federal legislation regulating consumer privacy in the private sector. Indeed, the social contract that governments have set in place with their citizens is mirrored between big tech companies and their captive users. This is, undoubtedly, a problem. Do we really want privately owned, for-profit companies to hold as much power over us as our elected bodies - or even more? Ben Wizner, director of the ACLU's Speech, Privacy, and Technology Project, has said, "For every single one of us, there is some pile of aggregated data that exists, the publication of which would cause us enormous harm and, in some cases, even professional and personal ruin. Every single one of us has a database of ruin." This is the reality of the power that big tech companies have accrued.
And to be clear, governments are not exactly innocent here - they, too, buy and use personal data obtained from the private sector. Recently, the Wall Street Journal reported that federal agencies purchased access to a Venntel database of millions of citizens' movements derived from mobile phone location data; reportedly, the Department of Homeland Security is using this data to enforce immigration regulations. Though Customs and Border Protection has stated that it accesses only minimal amounts of this data, in anonymized form and in applicable instances, that claim is not realistic: a 2018 New York Times investigation showed how easy it is to de-anonymize data using other data points held on the same individual. In effect, "anonymized data" is an oxymoron.
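To illustrate why, consider a toy linkage attack. The data below is entirely fabricated: a table of "anonymized" location pings keyed only by device ID, plus two publicly knowable facts about a target - where they sleep and where they work. Those two points alone are enough to single out one row.

```python
# A toy linkage attack on fabricated data: two publicly knowable facts
# about a person (home and workplace) suffice to single out their
# "anonymous" device ID.
anonymized_pings = {
    "device_a": {("home_suburb", "02:00"), ("office_tower", "09:10")},
    "device_b": {("airport", "02:00"), ("hotel", "09:10")},
    "device_c": {("home_suburb", "02:00"), ("gym", "09:10")},
}

# What an attacker already knows about the target.
known_facts = {("home_suburb", "02:00"), ("office_tower", "09:10")}

matches = [
    device for device, pings in anonymized_pings.items()
    if known_facts <= pings  # subset test: all known facts appear
]
print(matches)  # ['device_a'] -- the "anonymous" ID now has a name
```

Real datasets hold millions of rows, but they also record far more than two points per person; research on mobility data has repeatedly found that a handful of spatio-temporal points is often enough to uniquely identify someone.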
The Value of Data
How has data become so valuable? Most of its value comes from advertisement optimization based on individuals' daily lives and buying patterns. This data comes from everywhere: big life changes such as a change in marital status, moving home, having children, switching jobs, and buying a new car (which may itself be a "smartphone on wheels" capable of detecting its speed, preferred routes, number of passengers, and even drivers' changes in weight). Of course, data is also collected from smaller, individual transactions. Have you recently Googled a skin condition? You may see an advertisement for a skincare product. Looked for workouts on YouTube? Ads will inevitably come up offering different programs and products, trying to solve problems you may not even have been aware of having. And of course, we cannot forget the infamous 2012 case in which a teenage girl's in-store vitamin purchases led to advertising activity that betrayed her pregnancy to her family. Data is even collected from how we physically use our devices - the way we scroll, the duration of a press, and how we type.
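To make that last point concrete, here is a minimal sketch of keystroke dynamics. The timestamps are fabricated for illustration - real capture depends on the platform - but it shows how inter-key timing alone can yield a crude typing "fingerprint."

```python
# A toy sketch of keystroke dynamics on fabricated timestamps: the gaps
# between key presses form a timing profile that can help distinguish
# one typist from another.
import statistics

# (key, seconds since first press) -- assumed sample, not a real capture
keystrokes = [("h", 0.00), ("e", 0.14), ("l", 0.31), ("l", 0.42), ("o", 0.60)]

intervals = [t2 - t1 for (_, t1), (_, t2) in zip(keystrokes, keystrokes[1:])]
mean_gap = statistics.mean(intervals)
jitter = statistics.pstdev(intervals)
print(f"mean gap: {mean_gap:.3f}s, jitter: {jitter:.3f}s")
```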
This information is incredibly valuable to advertisers. From this aggregate data, which encapsulates nearly everything about an individual, purchasing habits and activities can be predicted. Advertisements are targeted to those most likely to interact with them and ultimately make a purchase (or provide more data to be sold), astronomically increasing the return on ad spending. Notably, this also lets brands shape the narratives of our lives around their preferences for our spending habits. Take it from Brusseau's Business Ethics - "We identify ourselves with the products we buy. Consumerism goes beyond the idea that our brands (whether we wear Nike shoes or TOMS shoes, whether we drive a Dodge Charger or a Toyota Prius) are symbols of who we are. Consumerism means our products aren’t just things we wear to make statements. They are us; they incarnate the way we think and act." Brands are shaping our lives, and they're making us pay for it.
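To see why targeting raises return on ad spend, consider a back-of-the-envelope sketch. Every number below - impression counts, conversion rates, prices - is assumed purely for illustration, not drawn from any real campaign.

```python
# Back-of-the-envelope ROAS math with assumed numbers: targeting that
# lifts the conversion rate lifts return on ad spend proportionally.
def roas(impressions, conversion_rate, revenue_per_sale, cost_per_impression):
    revenue = impressions * conversion_rate * revenue_per_sale
    cost = impressions * cost_per_impression
    return revenue / cost

# 1M impressions, $50 sale, half a cent per impression (all assumed).
untargeted = roas(1_000_000, 0.001, 50.0, 0.005)  # ads shown at random
targeted = roas(1_000_000, 0.010, 50.0, 0.005)    # shown to likely buyers
print(f"untargeted: {untargeted:.0f}x, targeted: {targeted:.0f}x")  # 10x vs 100x
```

Under these assumptions, a tenfold lift in conversion rate from targeting translates directly into a tenfold lift in return on the same ad budget - which is why data that predicts who will buy commands such a premium.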
Furthermore, this data is valuable to - and purchased by - governments. Police deploy facial recognition software across many public spaces, and it can even come into play during political protests: the New York Times has reported that protesters can be tracked through their phones' location data, threatening our democracy. Why? Because political parties are now buying that location information.
What Now?
If individuals must participate in these services to keep up with modern society, and businesses have no real incentive to halt their practices, what can be done? The time for federal legislation has long since passed; laws cannot function retroactively to unravel a data economy that is already entrenched. What remains is the idea of compensating individuals, beyond mere access to these services, for their personal data.
These big tech companies have laid out the networks that now facilitate - and necessitate - human communication in modern society. Without tangible restriction, they collect and use individuals' personal data for billions of dollars in profit. They are using our data to make us spend our money.
The time has come and gone for federal privacy legislation; at this point, anything would be too little, too late. Even the California Consumer Privacy Act (CCPA) provides no realistic relief - it is riddled with exemptions and exceptions, and it leaves people feeling no better about their loss of privacy in the current age. Federal legislation could not work retroactively to force corporations to give up personal data that is simply too valuable, and it cannot go back in time to keep that data from being collected in the first place.
If it is the private sector running rampant, perhaps it is the private sector that should offer a solution. Currier offers that "governments and other traditional forms of hierarchical power are ill-equipped to understand the network economy and evolve the new social contract...they live in a hierarchical world of scale, and we live in the networked world."
So what is the solution? Actual compensation.
If corporations are using our data to make money by getting us to spend our money, it's only logical that we receive some money back.
Compensating Individuals for their Data
The idea of compensating individuals for their data is not my own. Many businesses worldwide are currently attempting to solve this ongoing issue and provide innovative ways to compensate and empower consumers. Some provide compensation in the form of points or loyalty programs, others pay cash, and more recently, digital currencies (cryptocurrencies) have arisen as forms of payment. Cash is an obvious choice because it is a tangible and immediate benefit. Cryptocurrency also presents a unique opportunity: over 64% of American adults are interested in cryptocurrency, with an even larger portion of younger generations interested. It is a particularly attractive method of compensation given its potential to appreciate in value over time and its convertibility into other global currencies.
Only time will tell how society and its constituents will continue to grapple with the privacy-related problems they currently face. As partisan disagreements clog up progress in our governmental bodies, smaller companies are stepping up to the plate and offering innovative solutions that benefit both advertisers and consumers. One thing is certain: the public is eager and willing to find solutions and benefit from them.
About The Author:
Rachel Miller holds an LL.B. from Durham University Law School, where her dissertation, "The Commodification of Personal Data: An Examination of Law and Practice in the United States of America" was published in the Durham Law Review. She also holds an LL.M. specializing in Digital Technology Law from UCLA School of Law. She is currently serving as the Director of Marketing Communications at Permission.io. This article is not intended to serve as legal or financial advice.