From the Center

This viewpoint is from a writer rated Center.

There are disagreements over what role social media companies should play in securing our First Amendment rights and what role, if any, our government should play in regulating hate speech and disinformation. In this context, the Knight First Amendment Institute offers a novel approach to thinking about regulating social media.


In his essay “How To Regulate (and Not Regulate) Social Media,” Yale Information Society Project founder Jack Balkin outlines a balanced approach to regulating social media that may appeal to free-speech Republicans concerned with preserving a free marketplace of ideas, as well as Democrats concerned with mitigating the spread of inflammatory or false information.

Among various reforms, he most clearly advocates for splitting up the responsibilities of social media companies into separate businesses. He considers the most important of these new roles to be that of “information fiduciaries.” These would be separate from social media companies and would be obligated to collect and disseminate user data responsibly. The intent is that these information fiduciaries would change the business incentives of social media companies, encouraging them to moderate content more appropriately.

Before introducing his novel thoughts on regulating social media, Balkin begins by framing how the Internet and social media relate to our First Amendment rights. He says that the First Amendment endows us with the right to freedom of expression, but we need a public sphere to create the very freedom of expression that our Constitution protects.


A public sphere consists not just of individuals, but of many institutions, most of which are private. Social media companies are among these institutions. Balkin argues that having trusted and trustworthy digital media is nothing short of necessary to maintain the health of our public sphere.

Balkin’s ideal for balancing freedom of speech with content moderation might best be described as a model of “content moderation federalism”: a marketplace of diverse and competing content moderators, not controlled by a few powerful interests.

To illustrate, Balkin explains how we have successfully balanced our First Amendment rights with content moderation in print media. We begin with minimalist government regulation that prohibits only criminal content, such as child pornography or calls to violent action. We then leave it to a wide diversity of institutions and communities to enforce their own, more restrictive norms.

For example, public news outlets have adopted restrictive, family-friendly “polite society” norms. Meanwhile, edgier content remains available in a variety of private outlets such as books or specialty magazines. But what about online content? How would a diversity of content norms and moderation even be possible on the Internet, given that the network effects of social media have already created a natural oligopoly of dominant platforms?

Balkin proposes that perhaps the best way to achieve social media federalism is by splitting up the divergent responsibilities of our dominant tech platforms into separate companies. Why are web search and video hosting managed by the same company? Why are social media companies also acting as advertising agencies?

Among Balkin’s many concerns about entrusting a few powerful tech oligopolies with our fundamental freedom of expression, perhaps the most central is the use and sale of user data.


Balkin observes that the fundamental business model of most online services is not to serve a democratic society, nor even their own end users, but to serve advertisers with user data. This business model, says Balkin, overproduces emotionally salient content such as conspiracy theories, underproduces goods that support cultural democracy, and is not conducive to content moderation.

Among the approaches Balkin suggests for addressing this fundamentally flawed business model are fixed monthly user fees, limited data collection practices, and improved moderation. But he most strongly advocates for his concept of “information fiduciaries.”

These fiduciaries are separate institutions whose sole responsibility would be to collect and disseminate user data with an emphasis on care, confidentiality, and loyalty. The driving idea behind Balkin’s information fiduciaries is to improve content moderation indirectly by removing the underlying business incentives that have so far compelled tech companies to overvalue polarizing content and undervalue cultural democracy.

“How To Regulate (and Not Regulate) Social Media” is a broad essay outlining many ideas for how to regulate (and not regulate) our online public spaces. Democrats might appreciate Balkin’s attempts to disincentivize inflammatory content. Republicans might appreciate Balkin’s attempts to create social media federalism and avoid monolithic content moderation. Americans of all persuasions might appreciate how the essay introduces fresh ideas that acknowledge both the need for a free exchange of ideas and sensible content moderation.

Rolf Hendriks is a software engineer with a passion for writing. He became involved in the depolarization movement when he joined Braver Angels, participating in Braver Angels book discussions and debates while publishing a depolarization song for the Braver Angels songwriting contest. Rolf has a Center bias.

This piece was reviewed by James Coan of Braver Angels (Center bias). It was edited by AllSides Managing Editor Henry A. Brechter (Center bias).