Algorithmic Dysphoria: Being Transgender in a Data-Driven World

Lana Elauria | October 14, 2022

Data science, along with the algorithms that push the cutting edge of technology ever forward, is shaped by the cultural context it grows out of. Data scientists spill their own biases and perspectives into the algorithms they code, into the data they collect, and into the visualizations they create. These biases are then expressed to every user of a website or an app, and those attitudes are carried forward into mainstream public opinion, now gilded with claims of “algorithmic objectivity” or “technological fairness.” In reality, however, many algorithms only reinforce or exacerbate existing prejudices and social hierarchies. From racial discrimination in algorithms used by court systems to facial recognition models that don’t know what women of color look like, examples of bias and discrimination bleeding into supposedly fair algorithms are the norm rather than the exception.

What does this mean, then, for a teenage boy who half-jokingly Googles “am I actually a girl?” when he starts to realize that he feels more natural hanging out with the girls in his class than he does with the boys? When that boy looks up whether he could actually be a girl, Google takes note of this. When the boy clicks on several articles, lists, quizzes, videos, and forums where people are asking this exact question, Google takes note of this. Google serves him the answers to his curiosity, helpfully ranked and filtered by a mysterious algorithm, catering to his previous searches and what the algorithm predicts he will engage with. The algorithm doesn’t actually know what makes up a person’s gender identity, but it does know what similar users clicked on, read, and interacted with. The boy crawls dozens, maybe hundreds, of online forums, with just as many opinions on what makes up someone’s gender identity. The boy begins to take his original question much more seriously than he anticipated, with Google’s ranking algorithms providing a guiding hand to lead him through the exploration.
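One way to picture that mechanism is a ranker that knows nothing about gender identity, only about what users with similar histories clicked. The sketch below is a deliberately simplified stand-in, not Google’s actual system; the click log and result titles are invented for illustration.

```python
# A toy stand-in for engagement-based ranking. Nothing here "understands"
# gender identity; results are ordered purely by how often users with
# similar search histories clicked them. All data is invented.
from collections import Counter

# Hypothetical click log aggregated from users who ran similar searches.
clicks_by_similar_users = [
    "forum: am I trans?", "quiz: gender identity", "forum: am I trans?",
    "article: what is gender dysphoria?", "quiz: gender identity",
    "forum: am I trans?",
]

def rank_results(candidates, click_log):
    """Order candidate results by click counts among similar users."""
    counts = Counter(click_log)
    return sorted(candidates, key=lambda result: counts[result], reverse=True)

candidates = [
    "article: what is gender dysphoria?",
    "quiz: gender identity",
    "forum: am I trans?",
    "video: voice training basics",  # never clicked, so it gets buried
]
print(rank_results(candidates, clicks_by_similar_users))
```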

One search result in particular catches his interest: an app that records your voice and tells you whether you sound masculine or feminine. It’s at the top of the search page, and he doesn’t notice the “Ad” tag just below the link. He downloads the app and presses the “Allow” button without reading the terms of use. He doesn’t know that he has just agreed to the use and sharing of his vocal recordings for the company’s “internal research,” or that an unknown data scientist in Silicon Valley could be privy to audio recordings of the boy’s first attempts at “becoming” a woman. He speaks a few sentences into his microphone, in his best imitation of a woman’s voice.

The app is drenched in a blue tinge, and a previously unknown feeling, a new sense of discomfort and disappointment, washes over the boy. The app tells him that his “woman voice” was actually still a man’s voice. Why? The AI model inside the app analyzed features of the boy’s voice recording and classified it as “male.” However, the model was trained on a dataset of voice recordings from mostly white Americans, all of them cisgender men and women. It does not know what a transgender person even sounds like, so it relegates the boy’s voice to the only categories it knows, the only categories provided to it by the developer: “male” and “female.” The boy begins to think: if he can’t convince a computer of his femininity, how can he convince his parents, let alone the rest of the world? He tries again and again, but no matter how he speaks, he is discouraged by a “male” classification for his voice. He begins to hate the sound of his own voice, even though he had no problem with it before, and when he looks in the mirror, his Adam’s apple seems to taunt him.
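To make the failure mode concrete, here is a minimal sketch of how such a classifier might work, assuming (as the app apparently does) that a single feature like average pitch separates “male” from “female” voices. The data and the model are invented for illustration; the point is structural: a model given only two labels can only ever return one of them.

```python
# A minimal sketch of a binary voice classifier. It assumes, as the app
# seems to, that average pitch alone separates "male" from "female" voices.
# The training data is invented and comes only from cisgender speakers.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training set: average pitch in Hz, labeled by the developer.
pitches = np.array([[110.0], [120.0], [130.0], [140.0],
                    [200.0], [210.0], [220.0], [230.0]])
labels = np.array(["male", "male", "male", "male",
                   "female", "female", "female", "female"])

model = LogisticRegression(max_iter=1000)
model.fit(pitches, labels)

# A trans woman early in voice training might average around 160-170 Hz,
# unlike anything in the training data, yet the model must still pick a side.
new_voice = np.array([[165.0]])
print(model.predict(new_voice))        # always one of the two trained labels
print(model.predict_proba(new_voice))  # no "unsure" or "neither" option exists
```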

This is just one example of the kinds of experiences that can exacerbate feelings of gender dysphoria in transgender people, especially transgender youth. The boy’s exploration of his gender identity is a deeply personal and private journey, a whirlwind of strange new feelings and insecurities. Several apps and websites will track him along the way, picking up data from a very vulnerable point in his life and using it for their own business objectives, whatever those may be. At every step in the boy’s exploration of his gender, biases and stereotypes sneak their way into his model of his own identity, presented to him through various algorithms and machine learning models. And this doesn’t even get into the problem with binary classification in the first place: it ignores androgyny and erases a whole spectrum of gender identities from the conversation, because “it’s just easier to work with a binary variable, and most people fall into the binary anyway, right?” Even though these apps are supposedly fair and unbiased, they still propagate ideas about what is inherently “male” and what is inherently “female,” defined by cutoffs, boundaries, and features that are deliberately chosen by the data scientists who lead these projects. So, next time you’re using or developing algorithms like these, think about what they’re learning from you, and what you’re learning from them.
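As one last concrete illustration: even the simplest possible version of such a feature makes the point visible, because somewhere in the code a developer has to pick the boundary. The cutoff below is entirely hypothetical (165 Hz is one commonly cited, and contested, dividing line for perceived vocal gender), but any number in its place encodes the same decision.

```python
# Hypothetical illustration: the "objective" boundary is just a number that
# someone chose. 165 Hz is one commonly cited (and contested) dividing line
# for perceived vocal gender; any value in its place encodes the same choice.
PITCH_CUTOFF_HZ = 165.0

def classify_voice(average_pitch_hz: float) -> str:
    """Forces any voice into one of two labels, with no room for ambiguity."""
    return "female" if average_pitch_hz >= PITCH_CUTOFF_HZ else "male"

print(classify_voice(150.0))  # "male"
print(classify_voice(166.0))  # "female", one hertz away from the opposite verdict
```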

Image Sources: Trans Flag, Voice Pitch Analyzer