As the 2020 election approaches, more and more news coverage is devoted to opinion polls. This coverage aims to accurately portray how the American public feels about political candidates. However, polls bring with them a host of statistical complexities and political implications that merit further analysis, and they can cause more harm than good if not conveyed properly.

Background

Coverage of polling data is by no means a new phenomenon. In 1984, researchers Charles Atkin and James Gaudino were writing about “the explosive growth of public opinion polling in recent years.” Atkin and Gaudino observed that even then, survey findings were “one of the leading categories of news” during election cycles.

More recently, polling data’s popularity has been attributed to a “Nate Silver effect,” named after the statistician who in 2008 correctly predicted the presidential outcome in 49 of 50 states. Silver’s website FiveThirtyEight (Center) offers a wide range of content and tools for understanding public opinion data, including a tracker for the latest presidential election polling data.

Public trust in polls took a hit in 2016, when confident predictions of a Hillary Clinton presidential election win were seemingly proven wrong. However, an analysis by the Ad Hoc Committee on 2016 Election Polling found that some aspects of these polls weren’t actually that far off. While “state-level polls...clearly under-estimated Trump’s support in the Upper Midwest ... National polls were among the most accurate in estimating the popular vote since 1936.” Of course, the president isn’t elected by the popular vote, so Clinton’s high national poll numbers didn’t translate to a victory.

Nate Silver and others have criticized journalists’ coverage of polls in 2016. According to Silver, journalists had a “probability problem”: their confirmation biases, desire for certainty, and misunderstanding of data prevented them from accurately reporting what the polls were predicting. Throw in the Electoral College, and it’s no wonder why people were surprised by Trump’s victory.

Some experts also attributed 2016’s missed predictions to a failure to “weight by education,” a method where pollsters “compensate for the fact that college-educated people are both more likely to respond to polls and more likely to be Democrats.” Many pollsters have adopted weighting by education for their 2020 polls.
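To make the idea concrete, here is a minimal sketch of how weighting by education works, using invented numbers rather than figures from any real poll: respondents in each education group are rescaled so the sample matches the electorate’s actual composition.

```python
# Illustrative sketch of weighting by education, using made-up numbers
# (not drawn from any real poll). Over-represented groups are scaled down,
# under-represented groups are scaled up.

# Hypothetical shares of the electorate vs. shares of the poll's respondents
population_share = {"college": 0.35, "non_college": 0.65}
sample_share = {"college": 0.50, "non_college": 0.50}

# Hypothetical support for a candidate within each education group
support = {"college": 0.60, "non_college": 0.44}

# Unweighted estimate: pretends the sample mirrors the electorate
unweighted = sum(sample_share[g] * support[g] for g in support)

# Weighted estimate: each group is rescaled to its true population share
weights = {g: population_share[g] / sample_share[g] for g in support}
weighted = sum(sample_share[g] * weights[g] * support[g] for g in support)

print(f"Unweighted: {unweighted:.1%}")  # about 52% -- skewed by extra college grads
print(f"Weighted:   {weighted:.1%}")    # about 49.6% -- closer to the electorate's view
```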

2020 Poll Coverage and How to Understand It

Despite diminished public trust in their accuracy, polls have continued to enjoy broad coverage during the 2020 presidential election. Some outlets conduct their own polls, like Fox News (Lean Right), or conduct them jointly with other outlets, like The Washington Post (Lean Left) and ABC News (Lean Left). Individual polls themselves are often the subject of news stories, as in the case of this article from the Washington Examiner (Lean Right).

Not all polls are created equal. This makes it difficult to navigate poll coverage; how is the average person supposed to judge which polls are trustworthy? According to Wired’s (Center) Emma Grey Ellis, “High-quality public opinion polls tend to come from respected news media, universities, and national polling firms.” Ellis also says to consider sample size and sampling methods when reading about polls. Polls done by campaigns, for instance, are less accurate due to a lack of random sampling.
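One reason sample size matters: under the textbook assumptions of a simple random sample, a poll’s margin of error shrinks only with the square root of the number of respondents. The sketch below uses that standard formula with hypothetical sample sizes; real polls involve weighting and other design effects, so treat it as a rough illustration.

```python
# Rough illustration of why sample size matters: the textbook 95% margin of
# error for a simple random sample shrinks with the square root of the
# sample size. Real polls also adjust for weighting and design effects.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (400, 1000, 2500):
    print(f"n = {n}: +/- {margin_of_error(n):.1%}")
# n = 400:  +/- 4.9%
# n = 1000: +/- 3.1%
# n = 2500: +/- 2.0%
```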

Even when pollsters do everything right, polls can still miss their mark. That’s because people sometimes lie when talking to pollsters. One analysis found that 10.1% of surveyed Trump supporters and 5.1% of Biden supporters said they would likely be untruthful about their vote preference in phone surveys.

It’s also important to watch out for confirmation bias. Just as we tend to believe news that confirms our prior beliefs, people tend to trust polls that show their preferred candidate winning. However, as much as we might wish otherwise, our personal preferences cannot make an unreliable poll into a reliable one.

One tool to help you judge a poll’s trustworthiness is FiveThirtyEight’s pollster ratings. FiveThirtyEight gives polling firms a letter grade based on their methods and past accuracy. For instance, FiveThirtyEight gives Siena College/The New York Times Upshot an A+, Quinnipiac University a B+, and Change Research a C-. FiveThirtyEight uses these grades and other metrics to help calculate its polling average.

Many outlets also calculate polling averages, which combine a variety of polls to produce a more accurate estimate of public opinion. National polling averages often get more permanent placement: FiveThirtyEight’s polling average is featured prominently on its homepage, and Yahoo News (Lean Left) shows RealClearPolitics’ (Center) polling average when you search for “Trump” or “Biden.” Different outlets calculate their averages differently based on how they weight individual polls, leading to different election predictions.
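As a rough illustration of why those weighting choices matter, here is a hypothetical sketch using made-up polls and weights (not any outlet’s actual methodology); it shows how two averages of the same polls can land on noticeably different numbers.

```python
# Hypothetical sketch of a weighted polling average (made-up polls and
# weights, not any outlet's actual methodology). Averagers typically give
# more influence to higher-rated and more recent polls.

polls = [
    # (candidate share in %, pollster quality weight, recency weight)
    (52.0, 1.0, 1.0),   # highly rated, recent poll
    (46.0, 0.3, 1.0),   # lower-rated, recent poll
    (50.0, 0.8, 0.6),   # highly rated, older poll
]

def weighted_average(polls):
    """Weight each poll by quality * recency, then take the weighted mean."""
    total = sum(quality * recency for _, quality, recency in polls)
    return sum(share * quality * recency for share, quality, recency in polls) / total

def simple_average(polls):
    """Plain unweighted mean of the same polls, for comparison."""
    return sum(share for share, _, _ in polls) / len(polls)

print(f"Weighted average:   {weighted_average(polls):.1f}%")  # about 50.4%
print(f"Unweighted average: {simple_average(polls):.1f}%")    # about 49.3%
```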

Another thing to keep in mind is that not every poll or graph measures the same thing. Some polls track “favorability,” for instance, while others track “voting intention.” These metrics can mean very different things for an election, but it’s easy to confuse them if all you’re seeing is blue and red lines on a graph.

Of course, presidential elections are not decided by the national popular vote, but by the Electoral College. To more accurately predict the winner of the election, some outlets build their own election forecast models that take into account factors other than national polling averages. As FiveThirtyEight explains, “Our forecast model will blend the polls with other ways of projecting the outcome in each state, such as what happened in the previous election or its demographics.”
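Here is a simplified, hypothetical sketch of that kind of blending, with invented numbers and weights rather than FiveThirtyEight’s actual model: when a state has few polls, the estimate leans on a non-poll prior, and as polling accumulates, the polls take over.

```python
# Hypothetical sketch of blending a state's polling average with a non-poll
# projection (a "prior" based on, say, the last election's result there).
# Made-up numbers and weights; not FiveThirtyEight's actual model.

def blended_estimate(poll_average, prior, n_polls, polls_for_full_weight=10):
    """Lean on the prior when polling is sparse and on the polls when it is plentiful."""
    poll_weight = min(n_polls / polls_for_full_weight, 1.0)
    return poll_weight * poll_average + (1 - poll_weight) * prior

# Sparsely polled state: the estimate stays close to the prior
print(f"{blended_estimate(poll_average=48.0, prior=53.0, n_polls=2):.1f}")   # 52.0

# Heavily polled state: the polls dominate
print(f"{blended_estimate(poll_average=48.0, prior=53.0, n_polls=12):.1f}")  # 48.0
```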

Here’s a list of some polling averages: FiveThirtyEight (Center); RealClearPolitics (Center); CNN (Lean Left); NBC News (Lean Left); The New York Times (Lean Left); 270toWin

...and a list of some election forecast models: FiveThirtyEight (Center); RealClearPolitics (Center); The Economist (Lean Left); Politico (Lean Left); CNN (Lean Left); YouGov; 270toWin; New Statesman

It’s important to note that few, if any, right-rated news outlets calculate polling averages or build election forecast models. Indeed, most outlets that venture into polling data are rated as left-of-center. This trend is helped by the fact that there are, quite simply, more left-leaning media outlets. However, even Fox News, which regularly conducts and reports on its own polls, does not calculate polling averages.

Also noteworthy is the fact that most 2020 polling averages show Joe Biden up by several percentage points, and most election forecasts say Biden’s chances are much better than Trump’s. Recent coverage in right-rated outlets has questioned this consensus in op-eds and in reporting on outlier predictions; other reports suggest some Democratic voters don’t believe reports about the size of Biden’s purported lead. Some coverage compares Biden’s apparent lead in 2020 to Clinton’s from 2016, although Biden’s lead is consistently larger than Clinton’s and has not begun to shrink like Clinton’s did.

Why News Outlets Cover Polls So Eagerly

Why do news outlets cover polling data so much? There are a few possible explanations:

Demand for polls exists partly because, in a democracy, public opinion is (at least ideally) a major determinant of public policy. Understanding public opinion is therefore crucial for politicians and the public alike in deciding what the country should do next. Since the U.S. is far too big for everyone to talk to one another directly, polling data is needed to provide an accurate picture of the country's desires.

Another reason for the media’s obsession with polls is that American election cycles are really, really long. Measuring from Tulsi Gabbard’s campaign announcement in February 2019, the 2020 presidential election cycle will be over 650 days old when it’s all said and done. That’s 66 days longer than the United States’ involvement in World War I! There are babies born after Gabbard’s announcement who have already said their first words and learned to walk.

Long election cycles create space for polling coverage to satisfy voters’ emotional needs. Candidates do everything in their power to engage voters, so people naturally pay attention. However, when you’re paying close attention to an election that won’t happen for another year, there is no immediate, meaningful resolution to your stress. If you’re a staunch Republican who fears a Bernie Sanders presidency, but it’s September 2019, you won’t have your fears confirmed or allayed until at least the primary elections in the spring of 2020. In the meantime, people need something to tell them just how hopeful or afraid they should be. News outlets fulfill that demand by reporting on polls. Polarization exacerbates this trend, heightening both real and perceived threats from unfavorable election outcomes.

Election cycles in most democracies aren’t nearly as long as ours. In fact, many democracies have laws regulating how long a campaign season can be. Mexico, for instance, limited presidential campaigns to 90 days after a contentious 2006 election. France limits official presidential campaigning to the two weeks before the election. These regulations provide a precedent for potential reforms to lighten the burden of America’s long election cycles.

For the time being, poll coverage will likely remain a fixture of elections. Polls offer a way of keeping score in the democratic contest and a preview of an election’s ultimate consequences. Using the tools mentioned above can help you navigate this coverage in the next two weeks and in the many elections to come.

This piece was written by Daily News Specialist Joseph Ratliff (Lean Left Bias).

This piece was reviewed by AllSides CEO John Gable (Lean Right), Managing Editor Henry A. Brechter (Center), Marketing Director Julie Mastrine (Lean Right) and Research Assistant Rick Wytmar (Lean Left).

Photo by Neon Tommy on Flickr