Filter Bubbles

During our last live session, we discussed in detail the concept of filter bubbles: the condition in which we isolate ourselves inside an environment where everyone around us agrees with our points of view. It has been said a lot lately, not just during our live session, that these filter bubbles are exacerbated by the business models and algorithms that power most of the internet. For example, Facebook runs on algorithms that aim to show users the information that Facebook thinks they will be most interested in, based on what the platform knows about them. So if you are on Facebook and like an article from a given source, chances are you will keep seeing articles from that source and other similar ones in your feed, and you will probably not see articles from publications that sit far away on the ideological spectrum. The same thing happens with Google News and Search, Instagram feeds, Twitter feeds, and so on. The information that flows to you is based on the profile these platforms have built around you, and they present the information they think best fits that profile.
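
To make this more concrete, below is a minimal, hypothetical sketch (in Python) of how interest-based feed ranking works in principle. The profile data, topic labels, and scoring function are invented for illustration; they are not any platform's actual implementation.

```python
# Toy illustration of interest-based feed ranking (hypothetical, simplified).
# The principle: score each candidate item against a profile inferred from
# past engagement, then show the highest-scoring items first.

from collections import Counter

# Profile built from past likes/clicks: topic -> engagement count (invented data).
user_profile = Counter({"source_a_politics": 12, "technology": 5, "source_b_politics": 1})

candidate_articles = [
    {"title": "Source A: new policy analysis", "topics": ["source_a_politics"]},
    {"title": "Source B: an opposing viewpoint", "topics": ["source_b_politics"]},
    {"title": "Gadget review", "topics": ["technology"]},
]

def relevance(article, profile):
    """Score an article by how much the user has engaged with its topics."""
    return sum(profile[t] for t in article["topics"])

# Rank the feed: items matching past engagement dominate the top slots.
feed = sorted(candidate_articles, key=lambda a: relevance(a, user_profile), reverse=True)
for article in feed:
    print(relevance(article, user_profile), article["title"])
```

Because each new like or click feeds back into the profile, the ranking is self-reinforcing: the more you engage with Source A, the higher it scores next time, and the less you see of everything else.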

Filter bubbles have been highlighted as big contributors to the unexpected outcomes of some major political events around the world during 2016, such as the UK vote to exit the European Union and the result of the US presidential election in favor of Donald Trump. The idea is that in a politically divided society, filter bubbles make it even harder for groups to find common ground, compromise, and work toward a common goal. Another reason filter bubbles are seen as highly influential in collective decision making is that people tend to trust other individuals in their own circles much more than “impartial” third parties. For example, when the two accounts contradict each other, a person would much rather believe what a neighbor posts on their Facebook wall than what an article in a major national newspaper reports, even if the newspaper is a longstanding and reputable news outlet.

This last effect is, to me, the most detrimental aspect of internet-based filter bubbles, because it lends itself to easy exploitation and abuse. With out-of-the-box functionality, these platforms allow trolls and malicious agents to easily identify and join like-minded cohorts and present misleading and false information while pretending to be just another member of the trusted group. This type of exploitation is currently being exposed and documented, for example, as part of the ongoing investigation into Russian meddling in the 2016 US presidential election. But I believe the most unsettling aspect of this is not the false information itself; it is the fact that the tools being used to disseminate it are not backdoor hacks or sophisticated algorithms. It is being done with the very core functionality of the platforms: the ability of third-party advertisers to identify specific groups in order to influence them with targeted messages. That is the core business model and selling point of all of these large internet companies, and I believe it is fundamentally flawed.
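
To show why that targeting capability is so easy to abuse, here is an equally hypothetical sketch of the core mechanic. The user records, field names, and targeting criteria are made up for illustration and do not correspond to any real advertising API.

```python
# Toy illustration of audience targeting (hypothetical, simplified).
# The same profile data that powers the feed lets an advertiser -- or a bad
# actor posing as one -- select a narrow, like-minded cohort and deliver a
# tailored message only to them.

users = [
    {"id": 1, "region": "town_x", "interests": {"source_a_politics", "local_news"}},
    {"id": 2, "region": "town_x", "interests": {"technology"}},
    {"id": 3, "region": "town_y", "interests": {"source_a_politics"}},
]

# Criteria supplied by the "advertiser" (invented for this example).
target = {"region": "town_x", "required_interest": "source_a_politics"}

def matches(user, target):
    """True if the user falls inside the requested audience segment."""
    return (user["region"] == target["region"]
            and target["required_interest"] in user["interests"])

audience = [u["id"] for u in users if matches(u, target)]
print("Deliver targeted message to users:", audience)  # -> [1]
```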

So can we fix it? Do we need to pop the filter bubbles and reach across the aisle? That would certainly be helpful, but it is very difficult to implement. Filter bubbles have always been around. I remember that in my early childhood, in the small town where I grew up, pretty much everyone around me believed somewhat similar things. We all shared relatively similar ideas, values, and world views. That is natural human behavior; we thrive in tribes. But because we all knew each other, it was also very difficult for external agents to use that close-knit community to disguise false information and propaganda. So my recommendation to these big internet companies would not necessarily be to show views and articles from across a wider range of ideas. That would be nice. But most importantly, I would ask them to ensure that the information shared by their advertisers and the profiles they surface on users’ feeds are properly vetted. Put truth before bottom lines.
