Not all polls are created equal! Some organizations have better methodologies than others. Want to know more so you understand what you're seeing? Pew has a mini course via email about polling.
They also authored this great article:
"Key things to know about U.S. election polling in 2024."
In addition, On the Media has updated its Breaking News Consumer Handbook: Polling Edition. You can listen to the segment here (19:57 timestamp).
Warning: Use of the Factiva service is restricted to reader access of the content for research and education purposes only. Text mining and analytics of the content is not permitted.
iPoll is a database of polling data from the Roper Center.
You can also search for smaller regional polls in our various newspaper databases.
Another great resource for polling data is the Pew Research Center.
Polling has changed a lot over the years. As people have moved away from landlines and answering their phones toward a myriad of other communication options, getting a good sample has become more difficult. Add to this an influx of untested polling organizations, and it is easy to get confused: polls with a solid methodology can be hard to spot. As usual, don't rely on a single source for your news or polling information.
This exchange between a pollster and a journalist highlights the difficulty:
Micah Loewinger: Some recent polls that you've called high quality include Gallup, which used 8 variables; the New York Times's Siena poll, which uses 12. Your organization, Pew, also uses 12 variables. When readers look at polls in the next couple months, is it fair to say they can look at the number of variables for a sense of the weighting that's being used?
Courtney Kennedy: I think it is an indicator. Two questions I really ask of any poll that I'm looking at is, did they address this phenomenon of education, of people who have college graduate level education or higher would be more likely to take surveys? You've got to fix that because that tends to be correlated with support for Democrats. And did the poll do anything to make sure they had the right balance of Republicans versus Democrats? Frankly, if the pollster just is completely hands-off, that doesn't turn out very well, especially with online polls. They will tend to skew more Democratic unless the pollster really has good controls in place to make sure that that balance represents the nation.
Micah Loewinger: Here's a point that you made in your recent article that caught my eye. The real margin of error, you write, is often about double the one reported.
Courtney Kennedy: People assume that those numbers are the total margin of error for the poll, but that's actually not the case. The margin of error only covers one of four different error types that we have in surveys. It's just reflecting the error that's associated with sampling. We don't interview the whole country, but it leaves out some really important errors, like nonresponse, noncoverage, like people that couldn't have been surveyed to begin with, and even measurement, which gets to the shy Trump idea. There's these three other, quite frankly, important error sources that are not reflected. A good rule of thumb is to take the reported margin of error and double it.
Micah Loewinger: The number of active polling organizations has grown quite a bit. From what I understand, the barrier to entry just isn't that high and a lot of new companies are getting into the game. If listeners encounter an unfamiliar poll, one that's maybe not associated with a major news organization, are there signs that they can look for that might indicate that this poll is low quality or it's being conducted by partisan actors?
Courtney Kennedy: Well, I think you named a biggie, which is, first of all, look for the track record. Who's doing the poll? Do they have a track record of doing high-quality polling? The other thing you can look for is, do they provide details about how the poll was done? One thing you can see, a sort of tip-off of a poll that's probably not very trustworthy, is if you look to see, where do they describe how they did their methods? And it just says something like, oh, the poll was done online. Full stop.
That's a huge red flag because the pollsters who are really doing work carefully and putting in a lot of resources, they will go into great depth about where the people came from, how the sampling was done, how the weighting was done. There will be paragraphs of detail. On average, the pollsters that are willing to provide more fulsome details about their methodology, they tend to be more accurate.
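Kennedy's "double it" rule of thumb can be made concrete with a little arithmetic. The margin of error a poll reports typically reflects sampling error alone, usually computed as z·√(p(1−p)/n) for a simple random sample. The sketch below (the sample size of 1,000 and the conservative p = 0.5 are illustrative assumptions, not figures from the interview) shows why a 1,000-person poll reports roughly ±3 points, and what the rule of thumb implies about total error:

```python
import math

def sampling_margin_of_error(n, p=0.5, z=1.96):
    """Sampling-only margin of error for a simple random sample.

    This is the number polls usually report. As Kennedy notes, it
    reflects only sampling error and leaves out nonresponse,
    noncoverage, and measurement error.
    """
    return z * math.sqrt(p * (1 - p) / n)

n = 1000  # hypothetical poll of 1,000 respondents
reported = sampling_margin_of_error(n)  # about 0.031, i.e. roughly +/-3.1 points
rule_of_thumb = 2 * reported            # "double it": roughly +/-6.2 points

print(f"Reported margin of error: +/-{reported * 100:.1f} points")
print(f"Rule-of-thumb total error: +/-{rule_of_thumb * 100:.1f} points")
```

On this rough reading, a reported 48%–45% lead sits well inside the doubled margin, which is why single polls with small leads should be treated cautiously.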