Why Were The Polls Off? Pollsters Have Some Early Theories

A couple watches the election results at a Republican watch party at Huron Valley Guns in New Hudson, Mich. People watching the results come in saw President Trump outperforming his position in preelection polls.
Seth Herald / AFP via Getty Images

At some point on election night 2020, as CNN's "KEY RACE ALERTS" rolled in and the map turned red and blue, things started to feel eerily like election night 2016.

Specifically, it was that déjà vu feeling of "Huh, maybe the polls were off." It was a feeling that grew as states such as Iowa and Ohio swung even harder for President Trump than polls seemed to indicate, key counties were tighter than expected and Republicans picked up one toss-up House seat after another.

Yes, Joe Biden ended up winning, as forecasters predicted. But polls overestimated his support in multiple swing states — not to mention the fact that Democrats both lost House seats and didn't win the Senate outright, despite being favored to do the opposite.

It will likely be months until pollsters can study this year's misses thoroughly (a comprehensive study came roughly six months after the 2016 election). However, for now, pollsters have some educated guesses about what may have thrown polls off.

How much polls were off

Let's start with the presidential election. Nationally, the polls said Biden would win the popular vote handily — by somewhere around 7.2 percentage points, according to RealClearPolitics' polling average.

That's nearly double the current margin, which stands at 3.8 percentage points, a little more than two weeks after Election Day.

That gap is smaller than it appeared to be on election night, and with votes still being counted, it could shift a bit more. (New York, for example, had about 84% of its vote total reported as of Wednesday.)
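As a quick sanity check on those figures, here is the arithmetic as a minimal sketch, using only the two numbers cited above:

```python
# The figures cited above: RealClearPolitics' final national polling average
# and the counted popular-vote margin about two weeks after Election Day.
poll_average_margin = 7.2   # Biden's polled lead, in percentage points
counted_margin = 3.8        # Biden's counted lead, in percentage points

error = poll_average_margin - counted_margin
ratio = poll_average_margin / counted_margin
print(f"Polls overstated Biden's national lead by about {error:.1f} points")
print(f"The polled lead was roughly {ratio:.1f}x the counted lead")  # ~1.9, i.e. "nearly double"
```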

Meanwhile, polling nearly perfectly captured presidential vote preferences in some battleground states, but it was off by a wide enough margin in others that it threw off election forecasters.

In a handful of battleground states (Arizona, Nevada, North Carolina, Pennsylvania), polling was fairly accurate, off by around a point or even less.

In Florida, Iowa, Ohio, Texas and Wisconsin, the polls were off sizably — by more than 4 points. In all of these cases, the polls underestimated Trump's performance in relation to Biden. This echoes 2016 polls, which also underestimated Trump's standing in key states in relation to Hillary Clinton.

What may ultimately set this year apart from 2016 is that national polls four years ago were very close to the final result — just 1.2 points away. This year, once all ballots are counted, it may be the case that polls were off significantly both nationally and across swing states.

Importantly, not only did polls underestimate Trump's performance in the presidential race, but they also appear to have underestimated Republicans in congressional races.

One telling statistic: Of the seats that were considered "toss-ups" by the Cook Political Report, which NPR often references, every single one that has been called by The Associated Press so far has gone to a Republican. Not only that, but nearly one-third of "lean Democratic" House races ended up going to Republicans, whereas every "lean Republican" seat thus far has gone Republican.

Why the polls were off

1) Who's answering their phones

The polls this year were overwhelmingly off in Biden's favor. That indicates the error was systematic and not random.

"It's safe to say that we don't have enough Republicans in our samples," said Cliff Zukin, a retired professor of political science at Rutgers University who worked in the polling industry for four decades.

One possible explanation is partisan non-response, in which supporters of one party are simply less likely than supporters of the other to take surveys.

The Pew Research Center reported in 2019 that the response rate to its phone polls had dropped to 6%, from 36% in 1997. Cellphones and caller ID have a lot to do with this, as many people don't answer calls from numbers they don't recognize.

Certain types of people are also more likely to answer their phones than others. Combine that with shifting partisan alignments among particular demographics, and it can throw polls off.

"You almost always have too many college graduates who took the survey because they're just more amenable to doing it," said Courtney Kennedy, director of survey research at the Pew Research Center. "And in this political era, that's correlated with support for Democrats."

It's also possible that Trump supporters are uniquely averse to answering polls, said Sean Trende, senior elections analyst for RealClearPolitics.

"The No. 1 prediction of being a Trump supporter was agreeing with the statement, 'People like me don't have much say in this country,' " Trende said. "And those are the exact people that when you hear a phone call and the person says, 'Hi, I'm from The New York Times. Would you take a call?' just go click."

On top of that, it's possible that Democrats were especially likely to answer polls this year, both because they were enthusiastic about defeating Trump and because they were isolating at home, David Shor recently told Vox's Dylan Matthews.

"So the basic story is that, particularly after COVID-19, they were donating at higher rates, etc., and this translated to them also taking surveys, because they were locked at home and didn't have anything else to do."

1.5) Weighting

Closely related to the problem of non-response is whether pollsters can adjust for having, say, too few non-college-graduate respondents in their samples, a process called weighting.

In 2016, the American Association for Public Opinion Research found that a failure to weight for education led to an underestimation of Trump support. This year, more pollsters weighted based on education, Kennedy explained, and pollsters were also making sure they had enough Republican respondents.

"And yet we had the results that we did," Kennedy said. "So what does that mean? That means that the Republicans taking the polls may not have been good proxies for all of the Republicans."

One final complicating factor: ultrahigh turnout this year. Pollsters try to figure out what the electorate will look like, but with an electorate so much bigger than in recent years, predicting its shape may have been even harder.

2) What makes a "likely voter"?

In attempting to figure out the shape of the electorate, pollsters try to figure out which people are most likely to vote.

Many pollsters choose not to simply go by whether voters say they intend to vote or not. That's because many people will say they intend to vote but then won't actually go do it. Instead, some pollsters have more complicated likely-voter models, taking into account things such as whether a person has voted in the past or how much the person is paying attention to the current election.
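As a concrete, entirely hypothetical example of the kind of model described above, a likely-voter screen might score respondents on stated intent, vote history and attention, then keep only those above a cutoff. The point values and cutoff below are illustrative assumptions, not any pollster's actual model.

```python
def likely_voter_score(respondent):
    """Score a respondent on stated intent, vote history, and attention to the race."""
    score = 0
    if respondent["says_will_vote"] == "definitely":
        score += 2
    elif respondent["says_will_vote"] == "probably":
        score += 1
    score += min(respondent["past_elections_voted"], 3)   # cap the credit for vote history
    if respondent["following_election_closely"]:
        score += 1
    return score

def is_likely_voter(respondent, cutoff=4):
    # Respondents below the cutoff are dropped from (or down-weighted in) the poll.
    return likely_voter_score(respondent) >= cutoff

# A highly enthusiastic first-time voter with no vote history can fall below the cutoff.
new_voter = {"says_will_vote": "definitely",
             "past_elections_voted": 0,
             "following_election_closely": True}
print(is_likely_voter(new_voter))   # False under these illustrative settings
```

Note that this made-up screen drops a first-time voter who says they will definitely vote, which is one way, consistent with Zukin's hypothesis below, that such models might discount newly energized voters.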

"The likely-voter models that we use may be biased and may discount Republicans," Zukin said. He hypothesizes that this may result from not taking voter enthusiasm into account enough in those models.

"The Trump voters were much more enthusiastic about casting an affirmative vote for their guy than the Biden voters were," he said.

In the final poll from YouGov, for example, 4 in 10 Biden voters said they were voting for him, as opposed to against Trump. Meanwhile, 8 in 10 Trump voters said they were voting for him.

If pollsters didn't sufficiently take that enthusiasm gap into account in their likely-voter models, Zukin said, that may explain why the polls underestimated Trump.

Interestingly, one of the most accurate pollsters in America doesn't use those models. J. Ann Selzer is the head of Selzer & Company, an Iowa polling firm.

"My method is just that simple: I ask people how likely they are to vote," she said in an email. "In a general election, I take them only if they say 'definitely.' Not 'probably,' not 'might' or 'might not.' "

Days before the election, Selzer's poll found Trump ahead in Iowa by 7 points, when other polls saw a closer race or even a Biden lead. Selzer turned out to be closest, as Trump won the state by more than 8 points.

It may be that other likely-voter models are overcomplicating things. Or it may be that certain models work in some places and not in others.

"In Iowa, her approach may work really well, whereas in other states, it might not, like in a state that has lower [political] engagement, like my own state of New Jersey, where knowing a lot about past voting history and turnout patterns really does help you do better at polling there," said Patrick Murray, director of the Monmouth University Polling Institute.

3) There's something about Trump

It's not as if pollsters somehow lost their touch in 2016 and never got it back. After all, the polls were pretty accurate in the 2018 midterms.

"I'm a little less worried about the state of polling this time around than I was after Election Day four years ago," Murray said, "because we seem to have this phenomenon that whenever Donald Trump is on the ballot and that's who we're asking about, that polling just seems to be off."

So, what is it about Trump?

One possibility that surfaced in 2016 and this year alike is the "shy Trump voter" phenomenon. The idea is that people who support Trump are uncomfortable saying so and lie to pollsters, as opposed to simply not responding to a pollster.

While there may have been anecdotal evidence of this in 2016, the American Association for Public Opinion Research found little evidence that the phenomenon swayed polls that year. Pollsters will likely look into the question again, but one study has already notably found evidence of roughly equal numbers of "shy Trump" and "shy Biden" voters.

"I think the bigger issue is that if you look around the world, we have a lot of these misses," Trende said. Polls, for example, failed to capture the U.K.'s vote for Brexit, as well as Australian Prime Minister Scott Morrison's 2019 win. "These kind of populist-right candidates have outperformed polls — not all the time, but more than 50-50. I think there's a bigger issue going on," Trende said.

Kennedy sees two possibilities, with very different ramifications, for how Trump himself may have thrown the polls off.

"If it's really because of his unique political profile, his unique ability to turn out voters who are not easily modeled, not easily identified, not easily reachable — is it that, or is it something more long lasting and something more fundamental to how surveys are being done these days?" she said.

The latter is a scary possibility — it means pollsters have not only serious problems to fix but problems they haven't identified.

On the other hand: "If it's the first one, then an election in the future when there's no Donald Trump on the ballot, then maybe we go back to more normal times," she said.

Still, that's not a fully satisfying answer. And another politician like Trump, one who similarly confounds the polls, could always come along.

With reporting from NPR's Susan Davis.

Copyright 2020 NPR. To see more, visit https://www.npr.org.