Signing Away Your Personalized Data: Service for Data Models
By JJ Sahabu | May 29, 2020

In today’s society, roughly one in three people has a Facebook account; the platform has become so widespread that it is effectively part of our digital identity. Many websites offer a “Sign in with Facebook” option, as if Facebook were a medium for online identification. Beyond Facebook, companies like Uber, Google, and Amazon have woven themselves into our daily lives, leaving consumers at the mercy of these companies’ terms and conditions, which often include rights over their personal data. Some may say that if you don’t agree with a company’s terms and conditions, you can simply abstain from using the service. However, the cost of abstinence may be too great, putting non-users at a disadvantage relative to users. Take electricity: if people refuse to purchase it from their local provider, they not only regress to a pre-industrial way of life, they also have no real alternative, and refusing service from these technologies works much the same way. This stems from a larger conversation about data ownership and who has the right to the data. Here we look more closely at the ethical considerations of user data collection.

The Belmont Report discusses the importance of informed consent: the user must be educated to the point where they can meaningfully consent. In the case of terms and conditions, this means they must be presented in a way the user can actually understand. But when tech companies hand over long documents of terms set in small font, does the user really read through and understand what is going on? Moreover, because accepting the terms is mandatory to access the company’s services, the user is left with very limited choices: abide by the terms or abstain from the service. Services like Facebook, Instagram, or Twitter may be easier to give up, but consider more essential services such as Uber, whether riding or driving. Some individuals are financially dependent on driving for Uber and thus have little choice but to accept the terms. And on many social media platforms, users are coerced by a “crowd effect”: they are tempted to join because everyone they know is already there. In either case, the odds are stacked against the user.

This issue exists today because there is very little regulation of these technology firms, largely due to a lack of understanding of how they harness data. When Facebook first came out in 2004, no one expected it to collect and store so much personal information. Facebook has since grown to the point of being “Too Big to Fail,” a term usually applied to banks whose collapse would bring down the financial system. In Facebook’s case, however, the company has already amassed so many users that even if some decide to abstain, the lost usership barely registers, which further reduces the leverage any individual user has over it. Though some features benefit from personalizing the user experience, the scope of the data collected raises serious privacy concerns.

The article referenced below proposes changing from a service-for-data model to a pay-for-service model, enabling users to take back control of their private data. While this would address the data ownership issue, it does not solve the problem for apps like Uber that don’t follow the same business model. It can also be seen as tech companies selling your data back to you, implying that they held first rights to your digital identity all along.

Digital ownership is a major issue at the heart of how tech companies run their businesses. On one hand, our data is used to advance technology, creating personalized content and making us more efficient. On the other, we are sacrificing our privacy. There must be a balance between the potential benefits and costs, but without some form of regulation to strike that balance, tech companies will continue to reap the maximum benefits at the cost of consumer privacy.

References:

Should Big Tech Own Our Personal Data