Censorship on Instagram and the Ethics of Instagram’s Policy and Algorithm
By Anonymous | July 9, 2021
Facebook-owned Instagram disproportionately censors certain users. For example, folks who are hairy, fat, dark skinned, disabled, or not straight-passing are more likely to be censored. Additionally, folks who express dissent against an institution are often suppressed.
In this article, I want to address the ethics of Instagram’s policies around who is and who is not given space to be themselves on this platform.
Instagram’s community guidelines state that most forms of nudity are not allowed to be posted, except for “photos in the context of breastfeeding, birth giving and after-birth moments, health-related situations (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest.” This raises two questions: what counts as “an act of protest”? And who is permitted to protest on this platform?
Sheerah Ravindren, whose pronouns are they/she, is a creative assistant, activist, and model. You may have seen them in Beyoncé’s “Brown Skin Girl” music video. In their bio, they write that they are a hairy, dark-skinned, Tamil, nonbinary, immigrant femme. Sheerah uses their platform to advocate for marginalized folks and raise awareness around issues that affect them and their communities. She speaks out about the genocide against Tamil people, aims to normalize melanin, body hair, marks, and rolls, and adds a dimension of digital representation for nonbinary folks of the diaspora.
Among the various types of content Sheerah posts, they have posted images of themselves with captions that convey their intent to protest eurocentric beauty standards and societal norms of femininity. Before posting images where they are not wearing a top, they edit the images to fully cover nipples, in order to meet Instagram’s community guidelines. However, when posting such content, Sheerah has been censored by Instagram — their posts have been taken down, and they have received the following message: “We removed your post because it goes against our Community Guidelines on nudity or sexual activity.” From Instagram’s perspective, Sheerah’s post was not considered an act of protest; instead, it was needlessly sexualized. There are numerous other Instagram posts, depicting people who are lighter skinned, less hairy, and skinnier, wearing similar outfits, that did not get removed. For instance, many photography accounts feature skinny, hairless white women who are semi-clothed or semi-nude. What made Sheerah’s post less appropriate?
The policies around what is considered appropriate to post on Instagram seem to be inconsistently enforced, which could be due to algorithmic bias. The algorithm that determines whether a post complies with guidelines may perform better (with higher accuracy) on posts that depict lighter skinned, less hairy, and skinnier folks. This could be due to the model being trained on data that is not fully representative of the population (the training data may lack intersectional representation), among other potential factors. Moreover, the caption that accompanies an image may not be taken into account by the algorithm; but captions could be critical to contextualizing images and recognizing posts that are forms of protest.
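One way researchers surface this kind of disparity is a subgroup accuracy audit: measure how often the moderation model’s decisions match the correct outcome, separately for each group of users. The sketch below is purely illustrative — the group names, labels, and “predictions” are invented, and Instagram’s actual model and data are not public — but it shows the basic shape of such an audit.

```python
# Illustrative subgroup accuracy audit (all data invented for this sketch).
# Each record is (group, true_label, predicted_label), where label 1 means
# "violates guidelines" and 0 means "complies".

def per_group_accuracy(records):
    """Return {group: fraction of predictions that match the true label}."""
    totals, correct = {}, {}
    for group, truth, pred in records:
        totals[group] = totals.get(group, 0) + 1
        if truth == pred:
            correct[group] = correct.get(group, 0) + 1
    return {g: correct.get(g, 0) / totals[g] for g in totals}

# Hypothetical moderation outcomes: group_b's compliant posts are
# repeatedly misflagged as violations, mirroring the pattern described above.
records = [
    ("group_a", 0, 0), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 0, 1), ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 0),
]
rates = per_group_accuracy(records)
# group_a: 4/4 correct (1.0); group_b: 2/4 correct (0.5) -- a gap an audit
# of this kind would flag for investigation.
```

An audit like this cannot explain *why* the gap exists (unrepresentative training data, missing caption context, or something else), but it is a minimal first step toward making the disparity visible and measurable.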
In the context of justice, a basic ethical principle outlined in The Belmont Report, it seems that the benefits and risks of Instagram’s algorithm are not evenly distributed across users. Folks who are already marginalized in their everyday lives outside of Instagram are further burdened by the sociotechnical harm they experience on Instagram when their posts are taken down. The erasure of marginalized folks on this platform upholds existing systems of oppression that shame, silence, and devalue people who are dark skinned, fat, hairy, disabled, or trans, and those who do not conform to heteronormative ideals.
While Instagram’s help center contains documentation on how a user can report content that they think violates the community guidelines, there is no documentation accessible to the user on how to submit an appeal. If a user posts something that follows the community guidelines but is misclassified by the algorithm or misreported by another user and thereby deemed inappropriate, does the user have a real opportunity to advocate for themselves? Is the evaluation process of users’ appeals fair and consistent?
When Sheerah’s post was taken down, they submitted an appeal, and their post was later put back up. But shortly afterwards, their post was taken down again, and they received the same message as before. This back and forth suggests that Instagram did not update their algorithm after reviewing the appeal. By not making that update, Instagram missed a crucial step towards taking accountability, serving the user, and preparing their service to not make the same mistakes when other users post similar content down the line. Presenting the option to appeal but not responding to the appeal in a serious manner disrespects the user and wastes their time.
Currently, Instagram’s community guidelines and the algorithm that enforces them do not protect all users equally, and the appeal process seems performative and ineffective in some situations. The algorithm behind Instagram’s censorship needs transparency, and so does the policy for how Instagram handles appeals. Moreover, the guidelines need to be interpreted more comprehensively regarding what is considered an act of protest. Instagram developers and policymakers must take action to improve the experience of the users who currently bear the most consequences. In the future, I hope to see dark skinned, hairy, queer women of color, like myself, take space on digital platforms without being censored.