8 Ways Social Media Distorts Our Realities
Note: this content was originally published in The Catalyst, a newsletter from the Center for Humane Technology.
The world we see through social media is distorted, like looking into a funhouse mirror. These distortions are negative externalities of an advertising-driven, engagement-maximizing business model, which affects people and relationships in myriad ways.
The Extreme Emotion Distortion occurs because users have access to virtually unlimited amounts of personalized, emotional content, so any user can find overwhelming evidence for their deeply held beliefs. This creates contradicting “evidence-based” views, resulting in animosity and the fracturing of our collective sensemaking.
The Information Flooding Distortion happens as algorithms and bots flood or curate the information users see based on their likelihood to engage with it. Users come to believe that what is popular (e.g., hashtags, comments, trends) reflects public consensus, when in fact it can be manipulated.
The Micro-Targeting Distortion happens as advertisers send personalized, emotionally resonant — and sometimes opposing — messages to distinct groups of people, resulting in individualized micro-realities that can generate social conflict.
The Moral Outrage Distortion occurs when engagement-maximizing algorithms amplify emotionally charged, moralizing content. This results in polarization, mischaracterizations of “the other side,” and the perception of more moral outrage around us than there really is.
The Engaging Content Distortion happens when social media platforms incentivize competition to create more viral content. This results in more frequent posting, more hyperbolic language, and more posting of extreme views, including conspiracy theories and out-of-context information.
The Anti-Journalism Distortion is created as social media platforms force reputable news organizations to compete in an environment that rewards clickbait headlines and polarizing rhetoric, resulting in less thoughtful, less nuanced reporting.
The Disloyalty Distortion happens when users on public social media feeds try to understand or express compassion for the “other” side and are attacked by their “own” side for doing so.
The Othering Distortion occurs as algorithms amplify divisive, negative, out-of-context content about particular groups. This incentivizes “othering” content, causing us to dehumanize others and view them as unworthy of our understanding.
These distortions don’t just affect individuals. Over time they warp society’s perception of reality, breaking down our ability to find shared understanding.
Shared understanding is needed for democratic functioning. It enables nuanced discussion, debate, and problem solving across party lines. Yet, today's dominant social media platforms are breaking down these critical capabilities at an alarming pace. This is why social media as it operates today is a threat to open societies worldwide.
We can uphold open society values by enabling an information ecosystem that stewards our capacity for shared understanding rather than optimizing for engagement.
The Center for Humane Technology (CHT) is a nonprofit working to "reframe the insidious effects of persuasive technology, expose the runaway systems beneath, and deepen the capacity of global decision-makers and everyday leaders to take wise action." It was launched in 2018 by Tristan Harris, a former Design Ethicist at Google.
This piece was reviewed by Managing Editor Henry A. Brechter (Center bias).
September 16th, 2024