The only way to predict, roughly, how people might vote is to survey a representative sample of the electorate. At the moment, the average of all opinion polls compiled by Stephen Fisher of Oxford University suggests Theresa May is heading for a majority of 130. This is not a precise science, because of random variation and changes in politics, but it is much, much better than, say, reading tea leaves.
How can you tell anything from just 1,000 people?
If a sample of 1,000 people is representative of the whole country – by sex, age, income and region – it should give a fairly accurate picture of the country’s views. Pollsters adjust their findings if, for example, they don’t have enough middle-aged women in their sample, and they have to adjust voting intentions by how likely different social groups are to turn out on election day.
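The adjustment pollsters make is a simple reweighting: respondents from an under-represented group are counted for slightly more than one person each, so the weighted sample matches the population. A minimal sketch, with invented figures purely for illustration:

```python
# Hypothetical shares, invented for illustration: suppose middle-aged women
# are 19% of the electorate but only 12% of the raw sample.
population_share = {"middle-aged women": 0.19, "everyone else": 0.81}
sample_share = {"middle-aged women": 0.12, "everyone else": 0.88}

# Each respondent's weight is their group's population share divided by
# that group's share of the sample.
weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

print(weights)  # under-represented respondents count roughly 1.58 times each
```

Turnout weighting works the same way, except the weights come from each group's estimated likelihood of actually voting rather than its size in the population.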
As a rule of thumb, findings will be within 3 percentage points of the true figure in 19 polls out of 20. This “margin of error” means that if a poll put Remain on 51 per cent, as the average final poll did in June 2016, you would expect Remain to be between 48 and 54 per cent. Unfortunately humans, including commentators, are bad at assessing probability in a close binary choice.
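That rule of thumb is not plucked from the air: it follows from the standard formula for the 95 per cent margin of error on a proportion estimated from a simple random sample. A quick sketch (assuming a simple random sample, which real polls only approximate):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p measured in a
    simple random sample of n people (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# A poll of 1,000 putting one side on about 50 per cent:
moe = margin_of_error(0.5, 1000)
print(f"{moe * 100:.1f} points")  # roughly 3.1 points either way
```

With n = 1,000 the margin comes out at just over 3 points, which is where the “within 3 points, 19 times out of 20” rule comes from; in practice weighting and non-response make real-world polling error somewhat larger.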
Why do the polls always get it wrong, then?
They don’t. In the first round of the French presidential election on 23 April, for example, the polls were astonishingly accurate, placing each candidate within one percentage point of the result and, crucially, getting the order of the four closely bunched candidates right. British polls are not usually that accurate. Getting truly representative samples is hard, as is predicting how likely different groups are to turn out to vote. But they are not “terrible”, as Nate Silver, the US polling guru, recently claimed.
It is not as if US polls are perfect. National polls were not too bad, correctly predicting that Hillary Clinton would win more votes, but the state polls were up to three points out in the places that made the difference in the electoral college.
Rob Ford of Manchester University replied to Silver, pointing out that British polls at least tend to err in the same direction: overstating the Labour vote. I’m not sure how reassuring this is. In many ways the big problem for British polls is that they got the 2010 election nearly right, whereas in other recent elections they got the Labour-Conservative gap at least five points wrong in Labour’s favour. Most of the polling companies have changed their methods since 2015, but we won’t know until 8 June whether their figures have over-corrected, are roughly right or are still too pro-Labour.
How come I’ve never been polled?
Most polling now is done online, so if you are not a member of an internet panel, you would be polled only by phone. At a rough guess, there might be a 1 in 20 chance that your landline or mobile would be called in a year, but most people hang up on cold calls, so perhaps only 1 in 100 people would actually be surveyed.
Polling companies are all biased, aren’t they?
Now we are getting into the field of paranoid conspiracy theory. A fringe of Corbyn supporters online assert that YouGov is owned by Conservatives, and enjoy quoting Peter Hitchens – the Mail on Sunday columnist, of all people – saying that polls are used to manipulate public opinion, not to measure it. In fact, YouGov was co-founded by Peter Kellner, a Labour supporter. In any case, as we have seen, election polls have tended to overstate Labour support. And any polling company that is a member of the British Polling Council has to follow its rules, including on openness, so that all its work can be scrutinised.
We should read polls carefully, but celebrate the unpredictability of human beings, who – thank goodness – can always surprise us.