
From the Right


"The Facebook Files" investigation by the Wall Street Journal (news bias rated Center), the interview with a Facebook whistleblower on 60 Minutes (CBS online news bias rated lean left), and the Senate hearing with the same whistleblower, have brought even more attention to the need to do something about the ill effects of social media.

But what can we do that will actually have a positive impact without creating new problems? Some talk about having social media platforms censor certain kinds of content, but past efforts have led to the platforms spreading misinformation themselves and supporting the agenda of one powerful group over another. For example, Facebook banned posts claiming that COVID-19 could have come from the virus lab in Wuhan (a ban it later lifted), and it limited (or censored) reports about Hunter Biden’s laptop. (News media, especially biased or partisan outlets, also downplayed or temporarily ignored the Hunter Biden laptop scandal until after the election.)

I propose one practical and effective step the government can take now, a step with a track record of success that does not threaten free speech.

The government can update Section 230(c)(1) of the Communications Decency Act so that social media companies like Facebook no longer get special protection for content they moderate. For that content, they would be held accountable like every other news and media company in the nation.

Social media companies like Facebook could retain these special protections for content they truly do not moderate. They could easily implement and clearly identify separate sections of their websites: one a non-moderated, open forum (with special First Amendment protections), and another containing their own controlled and moderated content (which would not carry special legal protections).

A major problem with Facebook and other social media companies comes down to the fact that they get special treatment: special protection as carriers of “non-moderated content.” They are protected by Section 230(c)(1) of the Communications Decency Act, which shields them from liability for content posted on their platforms. But they have shown that they are moderating content, just as newspapers do.

When the Facebook algorithm chooses or alters what content you see, Facebook is moderating that content. When the company censors, de-monetizes or de-prioritizes content to eliminate it or make it less prevalent, it is acting as a moderator. Facebook likes to blur the lines by pointing out that the content was written by average people, but the same is true when a newspaper publishes letters to the editor. If a letter to the editor is libelous, the newspaper that published it is held responsible. That is because the newspaper, like Facebook and other social media companies, is moderating the content.

If a social media company wants to censor discussion about the possibility that the COVID-19 virus started in a lab in Wuhan, China, that is fine. But it is then acting like a newspaper with editorial control and should not qualify for the protections provided by Section 230. It should be held legally and financially accountable.

If these companies want to keep those special protections, the protections should apply only to content they do not moderate, and for that unmoderated content, they should be required to abide by the same First Amendment principles the government follows, including a prohibition against censoring different perspectives.

Any company, big or small, could have it both ways if it identifies sections as either a non-moderated open forum (with special protections) or its own controlled and moderated content (without special protections). It just can’t have it both ways at the same time, confusing consumers about what is open and what is controlled, manipulated or censored.

This small step goes directly to the core cause of the problems with social media: it changes their financial incentives. If social media companies are made legally accountable for content they (manually or automatically) promote, produce or censor over other content, they will be financially incentivized to avoid actions that harm people, since those actions would open them to lawsuits. This would not restrict or limit individuals’ ability to say or share what they like, but it would make Facebook and other social media companies responsible for the content they promote and push in front of others. If their proactive actions lead to teen suicides or to people taking poison to combat COVID, or amount to libel against another person, they can be sued and held accountable, financially and legally, just as news organizations across the country are held accountable today.

Sure, that is not perfect; there are plenty of problems with news today. But news organizations still enjoy freedom of the press and free speech while being held far more accountable for their actions than social media companies are today.

This would be a major step in the right direction, and a relatively safe and proven approach. It is also simple to do: Congress just has to edit Section 230(c)(1).

Social media’s impact on society is an important problem that should be addressed quickly. If we look for solutions rather than for ways to prove we are right and those other people are wrong, we can make great progress. Revising Section 230(c)(1) is one way to do that.

Editor’s Note: This article was originally published in Sept. 2021; it was expanded upon and edited in Oct. 2021.

John Gable is AllSides' CEO and co-founder. He has a Lean Right bias.

This piece was reviewed by Managing Editor Henry A. Brechter (Center bias), Julie Mastrine (Lean Right bias) and Data Journalist Andrew Weinzierl (Lean Left bias).