Note: this content was originally published in The Catalyst, a newsletter from the Center for Humane Technology.

The world we see through social media is distorted, like looking into a funhouse mirror. These distortions are negative externalities of an advertising-driven, engagement-maximizing business model, which affects people and relationships in myriad ways.

8 Ways Social Media Distorts Reality

The Extreme Emotion Distortion occurs because users have access to virtually unlimited amounts of personalized, emotional content, so any user can find overwhelming evidence for their deeply held beliefs. This situation creates contradicting “evidence-based” views, resulting in animosity and the fracturing of our collective sensemaking.
The Information Flooding Distortion happens as algorithms and bots flood or curate the information users see based on their likelihood to engage with it, resulting in users believing that what is popular (e.g., hashtags, comments, trends) is public consensus, when in fact it can also be manipulated.
The Micro-Targeting Distortion happens as advertisers send personalized, emotionally resonant — and sometimes opposing — messages to distinct groups of people, resulting in individualized micro-realities that can generate social conflict.
The Moral Outrage Distortion occurs when engagement-maximizing algorithms amplify emotionally charged, moralizing content. This results in polarization, mischaracterizations of “the other side,” and the perception of more moral outrage around us than there really is.

The Engaging Content Distortion happens when social media platforms incentivize competition to create more viral content. This results in more frequent posting, more hyperbolic language, and more posting of extreme views, including conspiracy theories and out-of-context information.
The Anti-Journalism Distortion is created as social media platforms force reputable news organizations to compete in an environment that rewards clickbait headlines and polarizing rhetoric, resulting in less thoughtful, less nuanced reporting.
The Disloyalty Distortion happens when users on public social media feeds try to understand or express compassion for the “other” side and are attacked by their “own” side for doing so.
The Othering Distortion occurs as algorithms amplify divisive, negative, out-of-context content about particular groups. This incentivizes “othering” content, causing us to dehumanize others and view them as unworthy of our understanding.

The Impact

These distortions don’t just affect individuals. Over time, they warp society’s perception of reality, breaking down our ability to find shared understanding.

Shared understanding is needed for democratic functioning. It enables nuanced discussion, debate, and problem solving across party lines. Yet, today's dominant social media platforms are breaking down these critical capabilities at an alarming pace. This is why social media as it operates today is a threat to open societies worldwide.

Actions You Could Take

We can uphold open society values by enabling an information ecosystem that stewards our capacity for shared understanding rather than optimizing for engagement:

  1. Curtail the causes through platform design changes that incentivize trust and understanding. For example, introducing friction to limit virality prevents ideas that trigger powerful emotions from spreading quickly and dominating public discourse. For a deep dive, we recommend reading Renee DiResta and Tobias Rose’s piece, “How to Stop Misinformation Before It Gets Shared.”
  2. Address the crises caused by the breakdown of shared understanding. For technology teams, this means identifying crises among both users and non-users, maintaining cross-team collaboration, and planning ahead for challenges. For instance, teams should consider implementing blackouts for features that may cause harm during certain periods (e.g., elections).
  3. Heal the toxic state of our minds from years of being conditioned to see divisiveness as safe and compassion with the “other side” as risky.
    • Approach mutual understanding as a skill to be developed. Search for Common Ground and The One America Movement provide powerful insight into how public education can cultivate intellectual humility and establish understanding.
    • Rehumanize each other by connecting with shared values and sharing experiences in order to depolarize our communities. For a bit of inspiration, check out this video.
    • Illustrate distortions in order to reveal perception gaps and “alternate” realities. For example, participate in a “reality swap” where you swap feeds with another person to see how the reality presented to them differs from the reality you see.

The Center for Humane Technology (CHT) is a nonprofit working to "reframe the insidious effects of persuasive technology, expose the runaway systems beneath, and deepen the capacity of global decision-makers and everyday leaders to take wise action." It was launched in 2018 by Tristan Harris, a former Design Ethicist at Google. 

This piece was reviewed by Managing Editor Henry A. Brechter (Center bias).