What We Talk About When We Talk About Data Bias and Feminism

By Julie Lai | May 28, 2021

Data is often treated as if it were immune to bias and embodied the ultimate objective truth. This ideology, otherwise known as positivism, holds that as long as enough data is collected, our understanding of the world can be explained from a value-neutral, transcendent view from nowhere. In fact, this theory fails to acknowledge the complexities and nuances of unequal access to data, which make data science a view from a very specific somewhere. Who do we consider when we collect data? Who is excluded behind these numbers? Who is collecting the data? What are the consequences when people are excluded? When the biases and limitations in data go unexamined, the result is misinterpretation and reinforced bias.

Admittedly, there was a point when I thought of data as inherently objective. However, reading Caroline Criado Perez's Invisible Women: Exposing Data Bias in a World Designed for Men forced me to reconsider my biases. Criado Perez sheds light on the consequences that cost women their time, money, and even lives when we draw conclusions from the biased data that exists and ignore the narratives behind the data that doesn't.

Criado Perez underlines how gender-blindness in tech leads to a “one-size-fits-men” approach for supposedly gender-neutral products and systems. This can mean anything from the average smartphone being too large to fit most women’s hands and pockets, to speech-recognition software trained on recordings of men’s voices, to restrooms, bus routes, and office temperatures designed for men, to a higher rate of health misdiagnoses for women, to cars designed around the body of a “Reference Man”.

At first glance, some of these biases, such as gender-biased smartphone designs, might not seem to have severe consequences. The average smartphone screen is roughly 5.5 inches, which fits comfortably into the average man’s hand, whereas the average woman has to use two hands. While this design is, at the least, extremely annoying, it also affects women’s health: studies that sex-disaggregate their data and adequately represent women have found that women have a higher prevalence of musculoskeletal symptoms and disorders. Similarly, the standard piano keyboard is designed for the average male hand, which affects both women’s level of acclaim and their health. Studies have shown that female musicians suffer disproportionately from work-related injuries, and keyboard players are among those most at risk.

Other biases, such as health misdiagnosis for women and cars designed around the body of a “Reference Man”, have far more explicit consequences. The term “Yentl Syndrome” describes how women are often misdiagnosed, mistreated, or told the pain is all in their heads when they present to their doctors with symptoms that differ from men’s. These consequences are not just wildly frustrating; they can be lethal. In the UK, women are 50% more likely than men to be misdiagnosed following a heart attack, because the symptoms we know and recognize are ‘typical’ male symptoms. Misdiagnoses continue to happen in part because some doctors are still trained on medical textbooks and case studies whose trials typically use male participants. Furthermore, medications don’t always work the same way for women as they do for men. For example, one heart medication meant to prevent heart attacks actually became more likely to trigger one at a certain point in a woman’s menstrual cycle. Problems like these are overlooked because so little research tests drugs at different stages of the menstrual cycle.

Both drug trials and car crash tests rely on a “Reference Man”, typically a white man in his 30s representing the “standard” human. Because the “Reference Man” is used in research that determines drug doses, women are often effectively overdosed on medication. In car crash tests, the “Reference Man” is who cars are designed for: seatbelts are not designed for the female form, so women have to sit further forward. As a result, women are 17% more likely than men to die in a car crash and 47% more likely to be seriously injured.

With biased data come biased algorithms and biased policies, both of which reinforce the biases in the information they are given. As data scientists, we have to consider what biases our algorithms reinforce when we use data blindly. Nor is it enough to look at data through the lens of feminism alone. We must also look at data through the lens of race and, more importantly, consider how the intersection of gender and race biases the data we use.
