Ever since the polls got Brexit and the Trumpocalypse so wrong, inquiring minds have been wondering: how could the pollsters, and by extension the media, have led us so astray? In the past week, many publications, including The New York Times, have discussed the polling problems.
As survey researchers, my colleagues and I can think of many reasons, but the “science” has its own jargon and is difficult to explain (or perhaps justify). On private chat boards, pollsters are trying to figure out what went wrong, and how to deal with the public-relations problems that follow, muttering things like:
“… the primary goal of the soul searching should not be restoring public confidence in polling/surveys. Rather, the goal should be improving the quality and reliability of polling/surveys.”
Perhaps you’ll be interested in discussing some of the different errors we’re concerned about:
- Non-response error – We’ll never know what people think if they hang up on pollsters, delete email invitations, and otherwise refuse to participate in a poll. Sometimes pollsters just ignore the non-respondents, but you’ve probably heard the joke about the drunk searching under a streetlight at night: “I lost my keys across the street, but the light is better here.” Do YOU respond to surveys? Do you know people who do not? How could pollsters encourage non-respondents to care about answering our questions?
- Sampling error – Did we invite a full range of people to participate in our survey? Perhaps you’ve been asked to be a Nielsen family and share your TV viewing habits, but most of us have not. Have you ever heard somebody say the ratings are bogus because they don’t ask the right people? Now that landlines are almost extinct, so many “marketeers” have been posing as pollsters, and response rates are at an all-time low, polling firms are trying to reach people in other ways. What do you think of the polling firms that call your cell phone?
- Modeling error – It’s a guessing game to predict who will actually turn out to vote in an election, and then to estimate how many voters each survey respondent represents. Trump’s analytics team, Cambridge Analytica, figured out whom to target and what to say to activate them.
- Reporting error due to social desirability – Perhaps some respondents were reluctant to say they were Trump supporters because they worried about “political correctness” (hence the term’s popularity at Trump rallies). One NPR reporter recently said that a bartender in a Rust Belt state described the “lean-ins” – customers who didn’t want to declare their support for Trump out loud, but would admit it one-on-one over the bar. Then there are those who say they plan to vote just because it’s the socially acceptable thing to do, but who aren’t enthusiastic about the candidates.
- Reporting error due to the difference between surveys and elections – When talking with a stranger on the phone or answering an online poll, you can say anything. Somebody who says they prefer a third-party candidate might mean it at the time, but in the voting booth might not want to “waste my vote.” Also, people change their minds. Anything can happen between the survey and the vote; an individual can read something, hear something, or get distracted by something that changes his or her opinion before actually voting.
- The electoral college is complicated. ‘nuff said.
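Two of the errors above can be made concrete with textbook arithmetic. The sketch below is purely illustrative, not any polling firm’s actual model, and all the rates in it are hypothetical: the standard margin-of-error formula quantifies sampling error, and a toy calculation shows how differential non-response skews a raw estimate even in a perfectly tied race.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Sampling error: the classic 95% margin of error for a
    proportion p estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

def raw_poll_estimate(true_support, response_rate_supporters,
                      response_rate_opponents):
    """Non-response error: the expected share of respondents backing
    a candidate when supporters and opponents answer at different
    (hypothetical) rates."""
    supporters = true_support * response_rate_supporters
    opponents = (1 - true_support) * response_rate_opponents
    return supporters / (supporters + opponents)

# A 1,000-person poll of a 50/50 race has roughly a +/-3 point margin:
print(round(margin_of_error(0.50, 1000), 3))  # 0.031

# But if the candidate's opponents are merely more willing to talk to
# pollsters (60% vs. 40% response), the same tied race *looks* like 40/60:
print(round(raw_poll_estimate(0.50, 0.40, 0.60), 3))  # 0.4
```

Note that the second distortion is a bias, not noise: taking a bigger sample shrinks the margin of error but does nothing to fix differential non-response, which is why pollsters weight their data rather than simply poll more people.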
With all these possible errors, it’s a wonder that people still believe in polls. Or do we?