"Filter bubble," a term coined by Internet activist Eli Pariser, refers to the intellectual isolation that can occur when technology companies use algorithms to feed users information and content they will like, based on their interests, location, past searches, click history, and more. Companies such as Facebook and Google have been accused of contributing to the phenomenon. Filter bubbles are also reinforced by our social circles, geography, and news media bias.
Some view filter bubbles as unhealthy for both personal psychology and democracy and politics at large because they shut out alternative viewpoints, which ultimately inhibits nuanced thinking. When users do not see information, data, or ideas that might challenge their existing views, their biases are confirmed. Users not only have a harder time finding information that challenges their views, they are also less practiced at digesting new information and integrating it into their worldview or belief system. The more people have their views reinforced without encountering alternative perspectives, the more polarized their thinking becomes.
Filter bubbles make people believe that there is only one answer. This robs individuals of the ability to decide for themselves, and makes them less tolerant of other perspectives.
Many search engines and social media sites only show content that is popular (i.e., has been clicked on a lot), but what's popular is not always what is right. In this way, filter bubbles can fuel political polarization, bias, hatred or negative beliefs about other groups or ideas, and even violence.
On the flip side, many believe that technology companies are simply catering to what users want. A user might have a better experience on a website if they don't have to do a lot of work or digging to find content and information that they want to see and will like. Technology companies simply serve to make information organized and easily accessible.
Some believe that technology companies should change their algorithms in order to expose people to different ideas and perspectives. Others believe these companies have no duty to ensure that users ingest a variety of information and ideas, and that the onus is on the individual to be curious, do their own research and consume media from many different sources.
In addition, the 24-hour news cycle serves more to reinforce filter bubbles than to provide balanced news. Media bias is the norm, and both right-wing and left-wing outlets build loyal customer bases with highly partisan reporting, because it pays: people keep coming back to hear what they want to hear.
Filter bubbles are also reinforced by our social circles, geography, and media bias. A study from the University of Colorado found that “people's attitudes become more extreme after they speak with like-minded others.” When we surround ourselves only with people who agree with us, we are much less likely to entertain an opposing viewpoint.
QUESTIONS TO PLAY WITH:
- Do tech companies have a responsibility to ensure users consume a balanced media diet?
- Do filter bubbles affect our ability to appreciate nuance, other people, and other perspectives? To what degree?
- What are the benefits of filter bubbles? The dangers?