Driven by data; ridden with liberty.
Britain’s referendum on membership of the European Union (EU) offers a particular challenge to polling companies. General elections occur regularly — at least every five years — which provides multiple data points, and a recent point of reference for the next vote.
Referenda, especially national referenda, are rarer, and do not necessarily split along typical partisan lines. The choice is binary: does Britain remain a member of the EU, or does it leave?
Before the 2015 General Election, the final polls released by the polling companies underestimated the Conservative vote share and overestimated the Labour vote share, producing an average error in the Conservative lead of 6.5 percentage points. Matt Singh, of Number Cruncher Politics, and James Kanagasooriam, of Populus, wrote a joint paper on a peculiar disparity in the polls for the EU referendum: the difference between online and telephone polls.
The paper concludes there are two main causes of the polling disparity: how the question is presented, and how the samples differ.
When asking the question on Britain’s EU membership, polling companies performing computer-assisted telephone interviewing (CATI) typically do not tell respondents that they can say “Don’t know”. In polling jargon, listing answers is called “prompting”. Interviewers still accept a verbal answer of indecision.
To test this, the authors split a telephone sample and an online sample. In the ‘Phone B’ sample, the people being surveyed were explicitly offered the opportunity to say “Don’t know”, as well as “Remain” and “Leave”.
In the ‘Online A’ sample, the non-substantive responses were made much less prominent. Offering “Don’t know” reduced the Remain lead in the phone poll from 11 points to 3 points; suppressing “Don’t know” in the online poll cut the Leave lead from 6 points to 1 point.
The subsequent question is: which method is more accurate? The ballot paper will only offer two options: Remain or Leave. Being undecided is not necessarily a proxy for being unlikely to vote. Accentuating or hiding “Don’t know” does make a substantial difference (5 points), but does not explain the whole disparity between the two polling methods (16 points).
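As a rough sketch of this arithmetic (the vote shares below are illustrative, not the paper’s actual figures), the reported lead is simply the gap between the two substantive shares over the whole sample, so prompting soft supporters into a “Don’t know” category shrinks it:

```python
def lead(shares: dict) -> float:
    """Remain lead in percentage points, over the full sample."""
    return shares["remain"] - shares["leave"]

# Hypothetical phone-poll shares, before and after prompting "Don't know":
unprompted = {"remain": 51, "leave": 40, "dk": 9}   # "Don't know" not offered
prompted   = {"remain": 44, "leave": 41, "dk": 15}  # "Don't know" offered

print(lead(unprompted))  # 11-point Remain lead
print(lead(prompted))    # 3-point Remain lead
```

The point of the sketch is that the movers need not split evenly: if more soft Remainers than soft Leavers take the “Don’t know” option when it is offered, the lead narrows even though nobody has switched sides.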
The remaining difference is then derived from distinctions in the polling samples, similar to how unrepresentative samples were considered the primary cause of the polling error in the 2015 General Election. Since 1992, and the failure to poll correctly for that election, polling companies have used past vote recall, along with demographic factors, to weight their samples so they resemble the overall public.
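One way to picture past-vote weighting is as a simple post-stratification: each recalled-past-vote group receives a weight equal to its actual vote share at the last election divided by its share of the sample. The sketch below uses hypothetical figures (the “actual” shares are only roughly the 2015 Great Britain result):

```python
def past_vote_weights(sample_counts: dict, actual_shares: dict) -> dict:
    """Weight each recalled-past-vote group so that the weighted
    sample matches the actual vote shares at the last election."""
    n = sum(sample_counts.values())
    return {group: actual_shares[group] / (count / n)
            for group, count in sample_counts.items()}

# Hypothetical sample of 1,000 that over-represents Labour recallers:
sample = {"con": 300, "lab": 400, "other": 300}
actual = {"con": 0.38, "lab": 0.31, "other": 0.31}  # roughly the 2015 GB shares

weights = past_vote_weights(sample, actual)
# Labour recallers are down-weighted (0.775) and Conservative recallers
# up-weighted (~1.27), so the weighted sample matches the actual result.
```

The snag, as the paper notes, is that this correction only works if people accurately recall how they voted.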
It may seem strange to politically active people, who could say how they have voted in every election since they turned 18 and why, but some people cannot remember how they voted. The British Election Study’s face-to-face survey, whose fieldwork occurred over many weeks, found that recalled UKIP vote share declined as time passed after polling day. Overall, the UKIP vote share was initially overestimated, before falling below its actual level at the general election.
It would be fine, on aggregate, if people misremembered in equal and opposite amounts, so the errors cancelled out, but there is some evidence that respondents overestimate having previously voted for the two main parties, Labour and the Conservatives, and underestimate past votes for other parties.
As the combined vote share of Labour and the Conservatives declines, this recall error grows in significance.
In their analysis of the British Election Study’s face-to-face surveys, the authors investigated which variables have the strongest statistical association with views on the European Union. Along with age, education and voting UKIP, the closest correlations were found in social attitudes relating to gender, racial equality and national identity.
The authors then put the same batch of questions from the BES to 1,004 adults in Great Britain by telephone, and to 4,047 adults online:
Clearly the phone poll produced what might be called the most socially liberal set of results followed by the BES face-to-face interviews, with the online responses eliciting the least socially liberal results of the three.
The authors continued:
But how much of those differences are down to the different social attitudes of samples reached differently by online, phone or face-to-face, and how much could simply be explained by people hiding or modifying their social attitudes in the presence of a telephone or a doorstep interviewer?
Different waves of the British Election Study are then considered, with “first responders” less likely to say they wished to remain in the EU.
Simply put, people who were harder to reach had a more liberal set of attitudes.
What are the causes of the polling gap? One substantial effect is the suppression of the “Don’t know” responses in telephone polls. Another effect is that telephone polls appear too socially liberal, and online polls are not reflective of the more liberal positions of less contactable members of the population.
There is an unexplained “grey area” of three percentage points, which the authors believe is due to the composition of online panels:
If online samples suffer politically because they lack hard-to-reach people, they also seem to have too many of another kind of person: respondents who know exactly what their views are and who back these up with internally consistent opinions when asked for them.
It remains a question for polling companies whether past vote recall is still an effective weighting variable for voting intention polls, and there are similar concerns over its efficacy for polling in this EU referendum.
Masters, A., 2016. The British Polling Council Inquiry Report. In Defence of Liberty. Available from: https://anthonymasters.wordpress.com/2016/04/04/the-british-polling-council-inquiry-report/ [Accessed: 17th April 2016]
Singh, M., and Kanagasooriam, J., 2016. Polls Apart. Populus and Number Cruncher Politics. Available from: http://www.populus.co.uk/wp-content/uploads/2016/03/Polls-Apart-29-March-2016.pdf [Accessed: 17th April 2016]
Curtice, J., 2010. Is it safe to past vote weight? NCRM. Available from: http://eprints.ncrm.ac.uk/892/ [Accessed: 17th April 2016]