Social media sites are chock full of political discussions, disagreements, and debates. That holds especially true for Twitter, the popularity of which stems from a user’s ability to distill news and opinion into 280-character digestible bites. A team at New York University found that those messages are all the more likely to spread when an individual employs “moral and emotional language.” The cost? A more polarized society.

“The content that spreads the most may have the biggest impact on social media, so individuals, community leaders, and even political elites could see their influence enhanced by emphasizing morality and emotion in their online messaging,” said the study’s lead author, William Brady, in a press release for NYU. “However, while using this type of language may help content proliferate within your own social or ideological group, it may find little currency among those who have a different world view.”

That might not sound like big news — people are moral and emotional beings. But the study has major implications for the bias we see in media today, as well as for our own filter bubbles. 

With the explosion of social media sites—in fact, more people get their news from social networks than from newspapers—users may have noticed a sharp increase in provocative “clickbait” headlines that serve more to rile up a reader than to inform. Take the now-defunct Slant News, which attempted to break into a nearly impenetrable media landscape by paying writers on a per-click basis.

While it didn’t pan out for Slant, this study helps illuminate why a burgeoning outlet would take such an approach. Many news organizations, already struggling to penetrate a crowded market, have cracked the social media code: ratchet up the drama by appealing to a reader’s moral-emotional base, and you just might stay afloat. More clicks, increased shares, larger reach.

It's a business model that financially incentivizes a biased press, one that deepens divisions along partisan lines and pits "us" against "them."

Such morally and emotionally driven posts are also indicative of online filter bubbles—the echo chambers tailored to a user’s preferences as well as those of their likeminded peers and colleagues. The study found that each moral-emotional word in a tweet (e.g., “greed”) was associated with a roughly 20 percent increase in retweets (the equivalent of a Facebook “share”). But this only held true when the tweet circulated within an ideologically consistent network—one where a user’s followers shared the same views.
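That 20 percent figure can be illustrated with a bit of back-of-the-envelope arithmetic. The snippet below is a minimal sketch, assuming (this is an assumption for illustration, not a claim the study makes) that the effect compounds multiplicatively per word; `retweet_multiplier` is a hypothetical helper, not code from the research.

```python
def retweet_multiplier(moral_emotional_words: int, boost: float = 0.20) -> float:
    """Rough expected retweet multiplier for a tweet containing the
    given number of moral-emotional words (e.g., "greed"), assuming
    each word compounds a ~20 percent increase multiplicatively."""
    return (1 + boost) ** moral_emotional_words

# Under that assumption, a tweet with three moral-emotional words
# would be expected to earn about 1.2 ** 3, i.e. roughly 1.73 times
# the retweets of an otherwise similar, neutral tweet.
print(retweet_multiplier(3))
```

Whether the real relationship is additive, multiplicative, or plateaus after a few words is exactly the kind of detail the headline figure glosses over.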

“In the context of moral and political discourse in online social networks, subtle features of the content of your posts are associated with how much your content spreads socially,” said Jay Van Bavel, a co-author of the study and an associate professor in NYU’s Department of Psychology. “However, these results also highlight one process that may partly explain increased differences between liberals and conservatives—communications fusing morality and emotion are more likely to resemble echo chambers and may exacerbate ideological polarization.”