
Monday, March 21, 2016

Why Polling is so unreliable

We have all heard politicians tell us not to believe the polls.  Sometimes they are right and the polls are wrong.  Sometimes the politicians themselves are surprised by the polls.

Polling is a science, but not an exact one.  Sampling, even when done properly, only predicts to a certain degree of probability. In a close race, the result can be well within the margin of error.  That is the best-case scenario.  In most cases, even good polling is done on the cheap.  Sample sizes are often smaller than they should be.  More importantly, samples are often skewed.  Many of us remember the famous Chicago Tribune headline announcing "Dewey Defeats Truman" the morning after Truman defeated Dewey.  Much of that erroneous prediction was blamed on the fact that the polling was done by phone.  It turns out that in the 1940s, Republicans were more likely to have phones than Democrats.  That skewed the sampling in a significant way.
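To put a number on "a certain degree of probability": a minimal sketch of the standard 95% margin-of-error formula for a simple random sample (the sample sizes and the `margin_of_error` helper are illustrative, not taken from any particular poll):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample
    of size n.  p=0.5 is the worst case, giving the widest margin."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical 500-person poll carries roughly a +/- 4.4-point margin,
# so a candidate "leading" by 2 points may not be leading at all.
for n in (500, 1000, 2000):
    print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} points")
```

Note that quadrupling the sample size only halves the margin, which is one reason cheap polls stay small.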

Sometimes pollsters try to weight the sample in order to correct for known problems.  For example, Democrats typically turn out in lower numbers than Republicans on election day.  Young people tend to turn out less than older people.  Minorities tend to turn out less than non-minorities.  So, say you have a random sample in which 10% of respondents are 18-25 year olds.  Suppose that is their percentage of the population, but historically they make up only 7% of those who actually vote.  You might discount their results by 30% on the assumption that this will match their numbers on election day.  The problem comes when that assumption is wrong.  If a candidate has particular appeal to young voters, or there is an especially effective get-out-the-vote campaign targeting young people that year, they may turn out in higher percentages.  That was a problem in 2012 for the Republicans.  They discounted black voters based on their turnout in earlier campaigns.  But black voters turned out in record numbers for Obama and blew away those assumptions.
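The discount described above is just a ratio of expected turnout share to sample share.  A minimal sketch, using the paragraph's own numbers (the `reweight` helper name is illustrative):

```python
def reweight(sample_share, expected_turnout_share):
    """Scale a group's raw polling results toward its expected share
    of actual voters.  Both arguments are fractions of the whole."""
    return expected_turnout_share / sample_share

# The example from the text: 18-25 year olds are 10% of the sample
# but are expected to be only 7% of actual voters.
w = reweight(0.10, 0.07)
print(f"weight = {w:.2f}")  # 0.70, i.e. a 30% discount

# If the assumption fails and the group surges to, say, 12% of
# actual voters, the same poll now understates their influence.
w_surge = reweight(0.10, 0.12)
print(f"weight should have been {w_surge:.2f}")
```

The fragility is visible in the code: the output is only as good as `expected_turnout_share`, which is a guess about the future fed into an otherwise exact calculation.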

Another issue is last-minute changes.  A full poll can take days to complete.  Voters' views are often fluid, especially in primaries, and voters may change their positions in the final days, after the polls are taken.  Since the media now refuse to use exit polls to call elections while the polls are still open, for fear of affecting turnout, most polls available on election day are at least several days out of date.
There is also a problem with voter response.  Voters are sick and tired of pollsters.  Fifty years ago, a pollster could get 80% of those called to respond to a poll.  Today, that number is closer to 8%.  This self-selection creates a bias in the sample.  Sometimes it does not matter.  Other times, there may be a correlation between support for a particular candidate and refusal to respond to polls.  This can be an issue with a politically incorrect candidate.  In some cases, respondents may even lie to pollsters out of embarrassment at admitting to a stranger whom they really support.

There are ways to tease out these variations, often by asking other questions that tell the pollster about sample members' behavior patterns.  But again, this is more art than science.  It requires making assumptions about the sample that may not prove correct.

Polling can be a useful tool for seeing where a campaign is headed, but relying too heavily on polls without knowing the details behind them can lead to major miscalculations.
