We’re about to enter the thick of general-election season, which means we’re about to get a boatload of polls.
Problem is, it can be hard to know which polls to trust or how to make sense of them all. But don’t worry — it doesn’t take an advanced degree in statistics to interpret polling in a smart way. So the next time you come across a poll and are wondering what to make of it, just follow these 10 steps.1
- Check the pollster’s track record. Some pollsters have long-standing reputations for accuracy, and others are more error-prone. You can check which are which using the FiveThirtyEight pollster ratings, which assign (most) pollsters a letter grade based on their historical accuracy and whether they follow best practices in their methodologies. In our view, the “gold standard” of polling methodology is using live phone interviewers, calling cell phones as well as landlines, and participating in the American Association for Public Opinion Research’s Transparency Initiative or the Roper Center for Public Opinion Research archive.2 These gold-standard polls tend to be the most accurate, although there are exceptions — some online pollsters, like YouGov, are quite reliable as well. If a pollster doesn’t show up in our pollster ratings, it’s probably new on the scene, which means you should treat it with more caution because it doesn’t have an established track record we can judge; at worst, it might even be fake. (If you’re not sure if a pollster is trustworthy and want us to do some investigating, feel free to email us at polls@fivethirtyeight.com.)
- Avoid comparisons between pollsters. Anytime you see a new poll, check what that same pollster found previously before declaring that the race has shifted. Some pollsters consistently overestimate one candidate or party relative to what other pollsters find, a phenomenon called “house effects” (there’s a short sketch of how one can be measured after this list). Similarly, especially for non-horse-race polls, pollsters often word the same questions in different ways — for example, asking someone’s opinion about “Obamacare” can yield different results from asking about “the Affordable Care Act” — which makes direct comparisons difficult.
- Note who’s being polled. For elections, polls of likely voters tend to be more accurate than polls of registered voters, which in turn tend to be more accurate than polls of adults. That said, many pollsters won’t start surveying likely voters until the fall, and registered-voter polls are perfectly good substitutes until then — just be aware that the results may be a few points too Democratic. And polls of adults have their place too — such as when you want to know how the entire nation feels about something, like the coronavirus.
- Pay attention to the margin of error. Reputable polls will always include a margin of error or confidence interval — it’ll look something like “± 3 points.” This reflects that polls can’t be exact, but they do promise to be within a certain number of percentage points (in this example, 3 points) almost all of the time (the industry standard is 95 percent of the time). In practical terms, that means that if a poll puts President Trump’s approval rating at 42 percent with a 3-point margin of error, his approval rating could be anything from 39 percent to 45 percent. Note that, in head-to-head polls, the margin of error applies to each candidate’s vote share, so if the same poll gave Trump 46 percent and gave former Vice President Joe Biden 51 percent, Trump could actually be leading 49 percent to 48 percent. (Though he could also be trailing with 43 percent to Biden’s 54, or fall anywhere in between those extremes.) A back-of-the-envelope calculation of where that “± 3 points” comes from appears after this list.
- Consider the source. Partisan groups, or even campaigns themselves, will sometimes release their own polls, but of course, they have an ulterior motive in doing so: Make their side look good. On average, these “internal polls” tend to be about 4 or 5 percentage points too favorable to their sponsor, so don’t take them at face value. Be extra skeptical of internal polls that don’t release full methodological details, like the name of the pollster or the dates of the poll. Similarly, partisan media outlets may exaggerate their side’s standing by extensively covering good polls for their candidate while ignoring bad ones. Even mainstream news outlets can mislead, albeit in a different way: They may be tempted to overhype polls they conduct themselves (e.g., calling it a “shock poll” even if it’s not that shocking) in order to get clicks.
- If a poll has an odd result, there might be a reason for it. Check the poll’s wording — is it accurate and unbiased? For example, some campaigns will release polls showing their candidate doing better after respondents hear a positive statement about them. Check when the poll was conducted; the survey may reflect an outdated reality or have been taken after some major event (e.g., a major military victory) that temporarily swayed public opinion. Even something as basic as the order in which questions are asked can affect the results; for example, if a poll is mostly focused on immigration but then asks about the presidential matchup, respondents may subconsciously choose the candidate they feel is best on immigration, not necessarily whom they support overall.
- That said, don’t try to outguess or “unskew” the polls. People who pick apart a poll by claiming it has, say, too many Democrats or too few black voters in its sample are generally wasting their time (and they usually have an agenda). Polls are almost always weighted to match their target population’s demographics, such as race and age (a bare-bones example of how that weighting works appears after this list). This doesn’t mean all pollsters assign weights in the same way, though, and there are practices like weighting by education on which the industry is split. Not weighting by education likely contributed to some of the most consequential polling errors of 2016, and many pollsters have now begun to factor education into their weighting, but others are still holding out. In an era when graduating from college has a significant bearing on white people’s political preferences, we recommend putting more stock in polls that weight by education than those that don’t. (On the other hand, weighting by partisanship, an idea that’s received some attention lately, is dicey3 and not something most pollsters do. That’s because party identification, unlike many demographic traits, is fluid, so setting it as a constant risks predetermining the poll’s outcome.)
- Heed averages, not outliers. If a poll’s result differs from every other poll, treat it with caution. An outlier poll can sometimes represent the beginning of a new trend (especially after a major event like a debate), but it’s usually just a fluke. Instead, we recommend looking at an average of the polls, which will more accurately reflect the polling consensus. (A toy polling average is sketched after this list.)
- In the aggregate, polls are pretty accurate but not perfect. Since 2000, polls of presidential general elections taken within 21 days of Election Day have a weighted average error4 of 4.0 points. (Polls of Senate, House and gubernatorial races have slightly higher historical error.) That means you can trust the polling average to get pretty close to the final result, but it will rarely nail the election exactly. When an election is close enough that a normal-sized polling error could change who wins, prepare yourself for either outcome.
- Polls are snapshots, not predictions. Even if a poll is a perfectly accurate measure of what would happen if the election were held today, things can always change between now and Election Day. Early general-election polls have been pretty predictive in the last few presidential elections, but with huge uncertainty surrounding major issues like the coronavirus pandemic and economic crisis, we don’t know if that will hold true this year. In general, polls gradually become more accurate the closer you get to the election.
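To make a few of these steps more concrete, here are some rough sketches in Python. First, the “house effects” from the second step: one simple way to spot them is to compare each pollster’s results with what everyone else found over the same stretch. All of the pollster names and margins below are invented for illustration.

```python
# Rough sketch: a pollster's "house effect" measured as its average deviation
# from what other pollsters found over the same period. All numbers are made up.
from statistics import mean

# Hypothetical polls taken in the same week: (pollster, Biden margin in points)
polls = [
    ("Pollster A", 8), ("Pollster B", 7), ("Pollster A", 9),
    ("Pollster C", 5), ("Pollster B", 6), ("Pollster A", 10),
]

def house_effect(pollster, polls):
    """Average gap between this pollster's margins and everyone else's."""
    own = [margin for name, margin in polls if name == pollster]
    others = [margin for name, margin in polls if name != pollster]
    return mean(own) - mean(others)

for name in ("Pollster A", "Pollster B", "Pollster C"):
    print(f"{name}: house effect of {house_effect(name, polls):+.1f} points")
# Pollster A runs about 3 points more favorable to Biden than its peers, so a
# new Pollster A poll should be compared with Pollster A's earlier polls, not
# with a Pollster C poll, before you conclude the race has moved.
```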
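Next, the margin of error from the fourth step. For a simple random sample, the textbook 95 percent margin of error on a proportion is 1.96 × √(p(1 − p)/n). Real polls are weighted, which usually makes the effective margin a bit larger, so treat this as a floor; the sample size and approval figure below are hypothetical.

```python
# Back-of-the-envelope 95 percent margin of error for a simple random sample.
# Weighting and design effects make real margins somewhat larger than this.
import math

def margin_of_error(p, n, z=1.96):
    """95 percent margin of error, in points, for a proportion p among n respondents."""
    return z * math.sqrt(p * (1 - p) / n) * 100

n = 1000          # hypothetical number of respondents
approval = 0.42   # a 42 percent approval rating
moe = margin_of_error(approval, n)
print(f"42% +/- {moe:.1f} points, i.e. roughly {42 - moe:.0f}% to {42 + moe:.0f}%")
# With about 1,000 respondents the margin is roughly +/- 3 points, which is why
# a 42 percent reading really means "somewhere around 39 to 45 percent."
```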
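The demographic weighting described in the seventh step can be sketched with a single variable. Real pollsters weight on several variables at once (race, age, gender, education and so on); this bare-bones version reweights by education only, and every share and respondent below is made up.

```python
# Minimal sketch of weighting a sample to match its target population on one
# variable (education). Real polls weight on several variables simultaneously;
# all of these shares and respondents are invented.
sample_share = {"college": 0.55, "non_college": 0.45}   # who actually answered
target_share = {"college": 0.40, "non_college": 0.60}   # the electorate you want

# Each respondent's weight is (target share) / (sample share) for their group.
weights = {group: target_share[group] / sample_share[group] for group in sample_share}

# Hypothetical respondents: (education group, candidate choice)
respondents = [
    ("college", "Biden"), ("college", "Trump"), ("college", "Biden"),
    ("non_college", "Trump"), ("non_college", "Biden"), ("non_college", "Trump"),
]

def weighted_share(candidate):
    total = sum(weights[group] for group, _ in respondents)
    support = sum(weights[group] for group, choice in respondents if choice == candidate)
    return 100 * support / total

for candidate in ("Biden", "Trump"):
    print(f"{candidate}: {weighted_share(candidate):.0f}% after weighting by education")
# College graduates answered more often than they appear in the target
# population, so their responses get weights below 1 and non-graduates' above 1.
```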
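Finally, the polling average from the eighth step. FiveThirtyEight’s real averages also adjust for pollster quality, house effects and sample size; the toy version below only down-weights older polls, with invented numbers, to show how a single outlier gets diluted rather than driving the headline.

```python
# Toy polling average: a plain mean plus a recency-weighted mean.
# The polls (days old, Biden margin in points) are invented; the 2-point poll
# plays the role of the outlier in this example.
from statistics import mean

polls = [(1, 9), (3, 6), (4, 7), (6, 2), (8, 7)]

simple_average = mean(margin for _, margin in polls)

def recency_weight(days_old, half_life=7):
    """A poll's weight is halved for every 7 days of age."""
    return 0.5 ** (days_old / half_life)

weighted_average = sum(recency_weight(d) * m for d, m in polls) / sum(
    recency_weight(d) for d, _ in polls
)

print(f"Simple average:   Biden +{simple_average:.1f}")
print(f"Recency-weighted: Biden +{weighted_average:.1f}")
# Either way, the outlier nudges the average by about a point instead of
# producing a "race has tightened!" headline on its own.
```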
"how" - Google News
June 15, 2020 at 04:58PM
https://ift.tt/2XZCY1w
How To Read 2020 Polls Like A Pro - FiveThirtyEight
"how" - Google News
https://ift.tt/2MfXd3I
Bagikan Berita Ini
0 Response to "How To Read 2020 Polls Like A Pro - FiveThirtyEight"
Post a Comment