
A former Facebook data scientist has leaked thousands of internal documents, given interviews to mainstream news outlets, and testified before the Senate about the company's alleged disregard for its platforms' negative impacts on public health.

Frances Haugen, who left Facebook in May after the company disbanded her misinformation unit, says the company "over and over again, chose to optimize for its own interests, like making more money." She added that "no one at Facebook is malevolent, but the incentives are misaligned." Her leaks formed the basis of an extensive Wall Street Journal investigation into Facebook and led to lawmakers questioning company executives. Meanwhile, Facebook, Instagram, and WhatsApp suffered an unprecedented worldwide outage earlier this week. In a statement published Tuesday, Facebook CEO Mark Zuckerberg said Haugen created "a false narrative that we don't care."

The story has been covered similarly across the spectrum. Most coverage has framed Haugen's claims and the leaked documents as indicative of powerful tech companies' negative effects on society. Some on the left took the opportunity to renew calls to break up Facebook and other Big Tech companies. Some right-rated voices highlighted how Haugen is working with Democratic Party operatives and consultants, and focused on her role in censoring stories about Hunter Biden while she worked for Facebook.




Snippets from the Right

Facebook Whistleblower Claims Profit Was Prioritized Over Clamping Down on Hate Speech

The Epoch Times

"During her appearance on the television program, Haugen also accused Facebook of lying to the public about the progress it made to rein in hate speech on the social media platform. She further accused the company of fueling division and violence in the United States and worldwide. She added that Facebook was used to help organize the breach of the U.S. Capitol building on Jan. 6, after the company switched off its safety systems following the U.S. presidential elections."

Facebook Whistleblower’s Push For Stricter Social Media Regulation Is Raising Free Speech Concerns

The Daily Caller (analysis)


"Facebook whistleblower Frances Haugen called to crack down on the amplification of “hateful” speech and “misinformation” in her Senate testimony Tuesday, stirring controversy among some critics and Republican lawmakers who saw her proposals as an in-roads to further suppression of political speech."


Snippets from the Left  

Facebook whistleblower to talk to January 6 committee

CNN Business

"In August, the select committee sent letters to 15 social media companies, including Facebook, YouTube and Twitter (TWTR), seeking to understand how misinformation and efforts to overturn the election by both foreign and domestic actors existed on their platforms. The panel specifically asked for data and analysis on domestic violent extremists affiliated with efforts to overturn the 2020 election, particularly around the January 6 attack."

Facebook is having a Big Oil moment

Vox (analysis)

"Just like Facebook, there are upsides to fossil fuels. Oil and gas have historically provided us with a relatively cheap, seemingly plentiful energy supply. This has led to cool inventions like the internal combustion engine and the cars it powers. But just like Facebook, fossil fuels come with a lot of downsides — like how our reliance on them is destroying the planet — but it’s also almost impossible to imagine the world functioning without them."


Snippets from the Center  

Facebook Whistleblower’s Testimony Builds Momentum for Tougher Tech Laws

Wall Street Journal

"The documents gathered by Ms. Haugen, which provided the foundation for The Wall Street Journal’s Facebook Files series, show how the company’s moderation rules favor elites; how its algorithms foster discord; and how drug cartels and human traffickers use its services openly."

The Facebook whistleblower says its algorithms are dangerous. Here’s why.

MIT Technology Review

"During her testimony, Haugen particularly blamed Facebook’s algorithm and platform design decisions for many of its issues. This is a notable shift from the existing focus of policymakers on Facebook’s content policy and censorship—what does and doesn’t belong on Facebook."


See more big stories from the past week.