Echo Chambers

July 07, 2020 · 4 mins read

In a study of the “follow preferences” of over a million Twitter users, it was found that users who identified themselves as liberals followed accounts that were 56% liberal, while users who identified themselves as conservatives followed accounts that were 65% conservative. This goes to show how we like to follow and hear voices that confirm our biases and reinforce our existing views. The result is echo chambers. The term was popularized by Professor Cass Sunstein in 2001, in his work on the role of the internet in democracy.

Echo chambers are among the most dangerous problems facing society today. They have a double effect: they keep differing views from reaching us, and they reinforce the biases we already hold. The problem is exacerbated by the underlying technology and incentives of social media platforms. Every platform wants to hold your attention and keep you around longer. To do that, platforms pour enormous resources into learning your preferences (machine learning, mostly) and customizing your experience so that you consume more content. This is what powers content and user recommendations. Given our inherent desire to hear voices that agree with us, and the platforms’ incentive to make sure we get exactly that, is there any hope for us?
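
To make the mechanism concrete, here is a toy sketch in Python. The data, scoring function, and parameters are entirely made up for illustration; no platform’s real system looks like this. The point it demonstrates is structural: if “will this user engage?” correlates with “does this match what the user already believes?”, then ranking purely by predicted engagement narrows the feed on its own, with no one intending to build an echo chamber.

```python
import random

random.seed(42)

# Toy content pool: each article has a political leaning
# from -1.0 (left) to +1.0 (right). Purely synthetic data.
ARTICLES = [{"id": i, "leaning": random.uniform(-1, 1)} for i in range(500)]

def predicted_engagement(user_leaning, article_leaning):
    """Hypothetical model: users engage more with content
    close to their own views (the bias the post describes)."""
    return max(0.0, 1.0 - abs(user_leaning - article_leaning))

def recommend(user_leaning, k=10):
    """Rank the whole pool by predicted engagement, serve the top k."""
    ranked = sorted(
        ARTICLES,
        key=lambda a: predicted_engagement(user_leaning, a["leaning"]),
        reverse=True,
    )
    return ranked[:k]

user = 0.2  # a mildly right-of-center user
feed = recommend(user)

pool = [a["leaning"] for a in ARTICLES]
seen = [a["leaning"] for a in feed]
print(f"Pool spans {min(pool):+.2f} to {max(pool):+.2f}")
print(f"Feed spans {min(seen):+.2f} to {max(seen):+.2f}")  # a narrow band around +0.2
```

Even though the pool covers the full spectrum, the feed this user actually sees collapses into a thin band around their own position, which is the echo chamber effect in miniature.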

A South Korean experiment would have us believe there is indeed hope for democracy. In 2016, two young developers built an app that served its users curated news items on various issues, drawn from different news sources, and then polled them for their views. After a while, some users were given the option to select the sources of their news, while other users continued to see items from random sources. Three interesting observations emerged from the usage data:

  • Users regularly updated their preferences to reflect what they read in the news items.
  • As expected, those given the option selected sources that aligned with their existing preferences.
  • Those who could select their sources updated their preferences more often than those who could not, and they generally drifted toward the center.

This is an extremely encouraging result, the reverse of the echo chamber effect. When people control the news sources they see, they are more conscious of their biases: having chosen a specific source, we are more aware of its inclination and more receptive to changing our views. When software algorithms decide what we read instead, that conscious decision-making is numbed and our biases are fed continuously.

Modern social media and content platforms have a huge responsibility to ensure their users don’t end up consuming a mindless stream of information, some of it biased and some of it fake. Users, in turn, need to understand the incentives these platforms have and how they pander to our urge to see what we would like to see. Being aware of the dangers of being constantly bombarded with news items you didn’t choose is step one.


I run a startup called Harmonize. We are hiring and if you’re looking for an exciting startup journey, please write to jobs@harmonizehq.com. Apart from this blog, I tweet about startup life and practical wisdom in books.