Why do election polls seem to have such a mixed track record?


Political polls underestimated support for Donald Trump and overstated backing for Hillary Clinton in the 2016 presidential election. Four years later, the polling correctly anticipated Joe Biden’s win over Trump, but both national and statewide polls projected a much wider margin than he ultimately received.

A task force report by the American Association of Public Opinion Research called the 2020 race the profession’s biggest misfire since 1980, when polls forecast a close race and instead Ronald Reagan beat incumbent Jimmy Carter by a landslide.

The Gazette spoke with John Anzalone, Biden’s chief pollster in 2020 and a Resident Fellow at the Institute of Politics this fall, about what happened in the past two elections and how the field has tried to make adjustments amid shifts in the nation’s political dynamics.

A co-founder of The Wall Street Journal poll, Anzalone also worked for the presidential campaigns of Hillary Clinton (2016) and Barack Obama (2008 and 2012). His firm, Impact Research, has conducted polls for Vice President Kamala Harris’s campaign, but he is not personally involved in that work. The interview has been edited for clarity and length.

What went wrong in the last two elections, and has the industry made any course corrections?

What’s really important is that we differentiate between what professional pollsters who work for campaigns do and what the public media polls do, because it’s very, very different. You’re not seeing campaign polls.

I’m not saying that there wasn’t error. There was, especially in ’16. But there are not many media polls that are spending a lot of money like we do to do daily interviews the right way, using multimodal methodologies, doing quotas, etc. We get branded by the fact that there are now dozens and dozens of cheap media polls that I think are a problem.

In 2016, there were a lot of legitimate concerns about polling error. What we found internally, in a group of pollsters, is that we weren’t getting the right proportion of non-college-educated voters. We were getting too many service-oriented, non-college-educated voters. We weren’t getting enough people who work with their hands or work in the factory or in agriculture, drivers, and things like that. We also saw that a lot of our small-town rural interviews were in the county seat, not in the rural areas. And so, we changed up a lot in terms of how we’re getting our interviews and quotas with non-college-educated voters.
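For readers unfamiliar with the mechanics, the sketch below shows one common way a skewed sample is corrected: weighting respondents so that each education group matches its assumed share of the electorate. The shares, respondents, and candidates here are invented for illustration only; they are not Anzalone’s figures or his firm’s actual method.

```python
# Minimal sketch of weighting a sample by education.
# The population and sample shares below are hypothetical illustration values,
# not figures from the interview.

population_share = {"college": 0.38, "non_college": 0.62}   # assumed electorate
sample_share     = {"college": 0.50, "non_college": 0.50}   # over-educated raw sample

# Each respondent in a group gets weight = target share / achieved share,
# so under-represented non-college voters count for more.
weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

# Apply the weights to a toy set of respondents.
respondents = [
    {"education": "college", "candidate": "A"},
    {"education": "non_college", "candidate": "B"},
    {"education": "college", "candidate": "A"},
    {"education": "non_college", "candidate": "A"},
]

support = {}
total_weight = 0.0
for r in respondents:
    w = weights[r["education"]]
    support[r["candidate"]] = support.get(r["candidate"], 0.0) + w
    total_weight += w

weighted_share = {cand: s / total_weight for cand, s in support.items()}
print(weighted_share)
```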

You have to acknowledge how much Trump has changed the political dynamics in America. And there was no way to model out who was coming out in 2016. There just wasn’t. We saw a little bit of that in 2020, as well.

I think the challenges have a lot to do with modeling who’s going to turn out. That has been an absolute mystery in the Trump era. I couldn’t tell you who’s going to turn out now.

What metrics do pollsters find best gauge who will turn out to vote?

We do an enthusiasm level, and we do a likelihood, but most of what we’re doing is message development and strategy. Most of what media polls do is the big number/little number, the head-to-head, the traits, the job ratings, etc. Pollsters who are in the political space to help campaigns are message development strategists. Everything that we do goes into a TV ad or a digital ad or a speech. Yes, the head-to-head is important, and we want to get that right, but media polls have turned every pollster into a prognosticator, and that’s a misread of what we do.
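As a rough illustration of how self-reported enthusiasm and likelihood can feed a likely-voter screen, here is a hypothetical sketch; the scales, weights, and cutoff are assumptions made for illustration, not Impact Research’s actual model.

```python
# Hypothetical likely-voter screen combining two self-reported items.
# The 1-10 scales, weights, and cutoff are illustrative assumptions only.

def turnout_score(enthusiasm: int, likelihood: int) -> float:
    """Blend self-reported enthusiasm and likelihood (each 1-10) into a 0-1 score."""
    return (0.4 * enthusiasm + 0.6 * likelihood) / 10.0

def is_likely_voter(enthusiasm: int, likelihood: int, cutoff: float = 0.7) -> bool:
    """Flag a respondent as a likely voter if the blended score clears the cutoff."""
    return turnout_score(enthusiasm, likelihood) >= cutoff

print(is_likely_voter(enthusiasm=9, likelihood=10))  # True
print(is_likely_voter(enthusiasm=3, likelihood=5))   # False
```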

Many who were on the fence about voting when it was a Biden versus Trump matchup now say they plan to vote. How does polling capture this new, still-changing electorate?

All you can do is try to guess what percentage of your sample should be “new voters.” You have the voter history of 2022, 2020, 2018, and 2016, and you have new registrants. That’s not a perfect science. Who says that this cycle will be the same as past cycles, where you have to be up a certain percentage nationally to win the battleground states?
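To make that guesswork concrete, the sketch below blends vote history from past cycles with an assumed turnout rate for new registrants to set a target share of “new voters” in a sample. Every number and the scoring rule are placeholders, not a real turnout model.

```python
# Hypothetical sketch of setting a "new voter" target for a sample.
# Vote-history propensities and the new-registrant turnout rate are placeholders.

def propensity(vote_history: dict) -> float:
    """Score a voter-file record by how many of the last four elections they voted in."""
    voted = sum(1 for year in (2016, 2018, 2020, 2022) if vote_history.get(year))
    return voted / 4.0

voter_file = [
    {"id": 1, "history": {2016: True, 2018: True, 2020: True, 2022: True}},
    {"id": 2, "history": {2020: True}},
    {"id": 3, "history": {}},  # new registrant with no vote history
]

expected_returning = sum(propensity(v["history"]) for v in voter_file)

# Assume (purely for illustration) that new registrants turn out at 50%.
new_registrants = sum(1 for v in voter_file if not v["history"])
expected_new = 0.5 * new_registrants

new_voter_share = expected_new / (expected_returning + expected_new)
print(f"Target share of 'new voters' in the sample: {new_voter_share:.0%}")
```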

This is a tough industry on a good day. It has been a more difficult challenge during the Trump years figuring out how we can get hard-to-reach voters. Now we know there’s a universe that doesn’t want to take a live call or doesn’t trust a live call, so we’ve corrected for a lot of that. We’re constantly going to have to evolve, and we’re going to constantly have to correct and do better because of all of the challenges that we have.

But I’m proud of our industry, and I’m proud of the fact that, professionally, what we do—which you don’t see—we do really well. Polling is really expensive, and most media outlets don’t spend the money necessary to do it right.

What kinds of things might both the Trump and Harris campaigns want to know from their internal polling at this point in the race?

Presidential campaigns, whether you’re Democrat or Republican, are going to test both positive messaging based on the strength of issues and character traits, and they’re going to test all of the contrasts. There’s nothing each side hasn’t tested in terms of positive and negative frames on each candidate. It’s September; they’ve been polling for eight months.

What every campaign does is dial-test their convention speeches and dial-test their debates. [That is, they monitor sample audiences to capture their immediate responses to words, phrases, and ideas in real time.] So, they’re seeing what hits with swing voters. The convention speeches for both of them, you can guarantee they dial-tested, and that helped refine some of the things that they would say in the debate. And then, they’ll dial-test the debates because they’ve got two months of rallies, two months of speeches, and they have TV ads, so the more data, the better. They have the foundation of their research on message development and contrast, and now, it’s all about refinement.
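For readers curious what dial testing yields in practice, here is a minimal sketch of averaging per-second dial readings from a sample audience, broken out for swing voters; the participants, readings, and swing-voter flags are invented illustration data, not output from any campaign’s system.

```python
# Minimal sketch of aggregating dial-test readings (0-100 dials turned in real time).
# The readings and the "swing voter" flags are invented illustration data.

from statistics import mean

# Each participant reports one dial reading per second of the speech.
readings = {
    "p1": {"swing": True,  "dial": [50, 62, 71, 80]},
    "p2": {"swing": False, "dial": [55, 54, 52, 60]},
    "p3": {"swing": True,  "dial": [48, 66, 75, 83]},
}

n_seconds = len(next(iter(readings.values()))["dial"])

# Average the dial position at each second, overall and among swing voters only,
# so the campaign can see which lines move the voters it actually needs.
overall = [mean(p["dial"][t] for p in readings.values()) for t in range(n_seconds)]
swing   = [mean(p["dial"][t] for p in readings.values() if p["swing"])
           for t in range(n_seconds)]

for t, (o, s) in enumerate(zip(overall, swing)):
    print(f"second {t}: all={o:.1f}  swing={s:.1f}")
```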

You say most media polls aren’t very reliable. Which are the better ones?

The Wall Street Journal is the gold standard, without a doubt, because it’s multimodal. I think that Pew Research Center is the gold standard of online polls because they’ve built their own online database. And then, I think the NBC poll is really good because you have a Democratic firm and a Republican firm running it, like The Wall Street Journal poll.


This story is published courtesy of the Harvard Gazette, Harvard University’s official newspaper. For additional university news, visit Harvard.edu.

