The Ethics of Not Sharing

By George Tao | April 10, 2020

In this course, we’ve thoroughly covered the potential dangers of data in its many forms. Most of our conclusions have led us to believe that sharing our data is dangerous, and while this is true, we must also remember that data is, and will remain, instrumental to societal development. To switch things up, I’d like to present the ethics of not sharing your data, along with steps that can be taken to improve trust between consumers and corporations.

The Facebook–Cambridge Analytica scandal and the Ashley Madison data leaks are among the many stories of data misuse that have been etched into our minds. Because we remember the bad more vividly than the good, we as consumers seek to hide our data whenever possible to protect ourselves. Yet we must also remember the tremendous benefits that data can provide.

One company has created a sensor that pregnant women can wear to predict when they are going into labor. This device could do a great deal to reduce maternal and infant mortality, but the data it collects is also deeply personal. Childbirth, however, is exactly the kind of area where such invasive data collection could improve current practice. Existing research on labor is severely outdated: the study on which modern medicine bases its practices was conducted in the 1950s on a population of 500 women, all of them white. By allowing this company to collect data on women’s pregnancy and labor patterns, we become able to replace these outdated practices.


This may seem like an extremely naive perspective on sharing data, and it is. As a society, we have not yet reached the point where consumers can trust corporations with their data. One suggestion from the article cited below is that data collectors should give consumers a list of worst-case scenarios for their data, much as a doctor lists the side effects that can come with a medicine. This information not only arms consumers with necessary knowledge but also pushes corporations toward decisions that avoid those outcomes.

I believe one issue that hinders trust between consumer and corporation is the privacy policy itself. Privacy policies and terms of service are filled with legal jargon that makes them too long and too confusing for consumers to read. This is a problem, because privacy policies should be the bridge that builds trust between the consumer and the corporation. My proposed solution is to publish two versions of the same privacy policy: one written for legal purposes and one written for readability. This way, consumers can understand what the policy says, while the corporation loses none of the legal protections the policy provides.

There are many ways to approach the problem of trust, but the goal is ultimately the same: to build trust between the consumer and the corporation. Once that trust is achieved, we can use the data shared because of it to improve practices that have long been outdated.

Works Cited
https://www.wired.com/story/ethics-hiding-your-data-from-machines/
