From the Center

This viewpoint is from a writer rated Center.

Last Wednesday, Facebook announced that it would start testing reduced quantities of political content in its News Feed. Testing has already begun in Canada, Brazil, and Indonesia, and it will begin in the United States in the coming weeks.

At first glance, this is a positive development that could slow or even reverse social media’s polarizing impact. Yet many details remain unresolved, and Facebook has not clearly said that reducing political polarization is even a goal of the effort. The American public will likely need to keep pushing Facebook to go further.

Political content on social media often appears in ways that tend to increase polarization. Take Jonathan Haidt and Tobias Rose-Stockwell’s 2019 article, which laid out several reasons social media drives polarization. They cited a Pew study showing that Facebook posts with “indignant disagreement” received twice as much engagement as other kinds of posts. They specifically pointed to the dangers of Facebook’s engagement-based algorithm, which can keep any kind of post near the top of the News Feed regardless of its divisiveness or truthfulness.

Simply showing less political content would likely also reduce the quantity of these emotional, divisive posts that tend to attract outsized attention. Less polarizing content could slow – and possibly even reverse – polarization.

After all, one study found that people who stayed off Facebook for a month became less polarized. Encountering less (polarizing) political content should theoretically have a similar effect, though the effect of Facebook’s change would likely be smaller in magnitude.

Yet many aspects of Facebook’s change remain unclear. The announcement covered only tests, after which Facebook will “decide on the approaches we’ll use going forward.”

Facebook did not explicitly say that slowing or reducing polarization is a goal. It noted only that “people don’t want political content to take over their news feeds.” The criteria it plans to use to evaluate the tests are also vague: the announcement said only that “we’ll survey people about their experience during the tests.”

If Facebook reduced the overall amount of political content but largely kept the most viral and emotional political posts, the site could remain about as polarizing as it is today. The policy change would then do little to reduce polarization.

Hopefully, Facebook will update its policies and algorithms in ways that reduce polarization. But assuming that this early attempt will not go far enough, there are a variety of steps Facebook should take going forward, including the following:

  • Facebook should be more explicit about trying to reduce polarization.
  • Engagement should not be the primary metric for the News Feed algorithm, at least for political content.
  • Not all “political content” is created equal. Facebook should take steps to gauge the divisiveness, truthfulness, and other attributes of posts, and then have its algorithm prioritize or de-prioritize posts according to their potential to polarize. (Forbidding dangerous posts is a hot-button but separate topic not covered here.)
  • Facebook and outside researchers can work to determine which political content depolarizes. Facebook should then support and promote content shown to effectively depolarize specific audiences.
  • Facebook should investigate other ways to reduce polarization on its site, including reducing the visibility of “like” counts and even asking users whether they really want to post messages that algorithms flag as likely to polarize.

It is important to recognize that Facebook is in a difficult position from a business standpoint. Polarizing content often drives more engagement, which means users spend more time on the site, which in turn means more ad revenue for the company. Given the financial incentive to take only modest action, government intervention should be on the table if Facebook does not do enough, or move quickly enough, on its own.

Facebook is taking what appears to be a promising step by starting to reduce political content in its News Feed. However, the announcement leaves many unanswered questions. The American public should be prepared to demand more.

James D. Coan is a depolarization strategist who develops approaches to reduce U.S. political polarization at scale. His interests include social psychology and mass communications. He coordinates an initiative on AllSides that features content designed to politically unite, and he co-directs the Braver Angels Ambassadors program. Professionally, he’s a strategy consultant for the energy industry. James has a Center bias and can be reached at jcoan@braverangels.org.

This piece was reviewed and edited by AllSides.com Managing Editor Henry A. Brechter (Center bias).