
Polls Apart

 


A thousand miles and polls apart. (Source: Populus/Number Cruncher Politics)

Britain’s referendum on membership of the European Union (EU) offers a particular challenge to polling companies. General elections occur regularly, at least every five years, providing multiple data points and a recent point of reference for the next vote.

Referenda, especially national referenda, are rarer, and are rarely split along typical partisan lines. The choice is binary: does Britain remain a member of the EU, or does it leave?

The final polls released by the polling companies before the 2015 general election underestimated the Conservative vote share and overestimated the Labour vote share, meaning there was an average error in the Conservative lead over Labour of 6.5 percentage points [1]. Matt Singh, of Number Cruncher Politics, and James Kanagasooriam, of Populus, wrote a joint paper on a peculiar disparity in the polls for the EU referendum: the difference between online and telephone polls [2].

(Video: Matt Singh)

Do you know, or do you not know?

The paper concludes there are two main causes of the polling disparity: how the question is presented, and how the samples differ.

When asking the question on Britain’s EU membership, polling companies conducting computer-assisted telephone interviews (CATI) typically do not make respondents aware that they can say “don’t know”. In polling jargon, reading out the available answers is called “prompting”. Interviewers will still accept a volunteered answer of indecision.

To test the effect of this, the authors split both a telephone sample and an online sample. In the ‘Phone B’ sample, respondents were offered the opportunity to say “Don’t know”, as well as “Remain” and “Leave”.


Offering “Don’t know”, or suppressing it, substantially changes the result. (Source: Populus/Number Cruncher Politics)

In the ‘Online A’ sample, the non-substantive responses were made much less prominent. Offering “Don’t know” reduces the Remain lead in the phone poll from 11 points to 3 points; suppressing “Don’t know” in the online poll cuts the Leave lead from 6 points to 1 point.

The question that follows is: which method is more accurate? The ballot paper will only offer two options, Remain or Leave, and being undecided is not necessarily a proxy for being unlikely to vote. Accentuating or hiding “Don’t know” does appear to make a substantial difference (5 points), but it does not explain the whole disparity between the two polling methods (16 points).
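To make the arithmetic explicit, here is a minimal sketch using only the headline leads quoted above; the variable names are mine, and a positive number simply encodes a Remain lead.

```python
# Illustrative arithmetic for the split-sample results quoted above. The
# headline leads come from the text; the variable names and subtractions
# are mine. A positive number is a Remain lead, a negative one a Leave lead.

phone_no_prompt = 11    # phone poll, "Don't know" not prompted: Remain +11
phone_prompted = 3      # phone poll, "Don't know" offered: Remain +3
online_prompted = -6    # online poll, "Don't know" shown: Leave +6
online_suppressed = -1  # online poll, "Don't know" suppressed: Leave +1

# Shift within each mode caused by the "Don't know" treatment.
phone_shift = phone_no_prompt - phone_prompted      # 8 points
online_shift = online_suppressed - online_prompted  # 5 points

print(f"Offering 'Don't know' by phone moves the lead by {phone_shift} points")
print(f"Suppressing 'Don't know' online moves the lead by {online_shift} points")
```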

Total recall

The remaining polling difference is then derived from distinctions between the polling samples, much as unrepresentative samples were considered the primary cause of the polling error at the 2015 general election. Since 1992, and the failure to poll that election correctly, polling companies have used past vote recall, along with demographic factors, to weight their samples so that they resemble the wider public.
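As a rough illustration of how past vote recall enters the weighting, the sketch below scales a hypothetical sample towards an assumed past-vote target; all figures are placeholders, and real polling companies combine this with demographic (rim) weighting.

```python
# A minimal sketch of weighting by past vote recall. All figures are
# illustrative placeholders, not actual election results or poll data.

# Recalled 2015 vote in a hypothetical raw sample (proportions).
sample_recall = {"Con": 0.32, "Lab": 0.36, "Other": 0.32}

# Assumed target distribution the sample should be weighted towards.
target = {"Con": 0.38, "Lab": 0.31, "Other": 0.31}

# Respondents in each recall group receive weight = target share / sample share.
weights = {party: target[party] / sample_recall[party] for party in target}

for party, weight in weights.items():
    print(f"{party}: weight {weight:.2f}")
```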

It may seem strange to politically active people, who could recount how they have voted in every election since they turned 18 and their reasons for doing so, but some people cannot remember how they voted. The British Election Study’s face-to-face survey, whose fieldwork took place over many weeks, found that recall of the UKIP vote share declined as time passed from polling day. Overall, the recalled UKIP vote share was initially an overestimate, before falling below the party’s actual share in the general election.


Recall of the UKIP vote share produced overestimates, and then underestimates, of the party’s actual share. (Source: Populus/Number Cruncher Politics)

It would be fine, on aggregate, if people simply misremembered in equal and opposite amounts, so that the errors cancelled out, but there is some evidence that respondents over-report having previously voted for the two main parties, Labour and the Conservatives, and under-report past votes for other parties [3].

As the combined general election vote share of Labour and the Conservatives declines, this recall error grows in significance.
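A small numerical sketch, with invented figures, shows why this matters: if a sample that already matches the electorate misremembers its past vote towards the two main parties, past-vote weighting will distort it rather than correct it.

```python
# Invented figures illustrating how misremembered past votes distort weights.

true_past_vote = {"Con": 0.37, "Lab": 0.30, "Other": 0.33}   # weighting target

# Suppose the sample genuinely matches the target, but some respondents who
# actually voted "Other" recall voting for one of the two main parties.
recalled = {"Con": 0.40, "Lab": 0.33, "Other": 0.27}

weights = {party: true_past_vote[party] / recalled[party] for party in recalled}

for party, weight in weights.items():
    print(f"{party}: weight {weight:.2f}")
# Main-party recallers are down-weighted and "Other" recallers up-weighted,
# even though the underlying sample was already representative.
```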

Social attitudes

In their analysis of the British Election Study’s face-to-face surveys, the authors investigated which variables have the strongest statistical association with views on the European Union. Along with age, education and having voted UKIP, the closest correlations were found with social attitudes relating to gender, racial equality and national identity.
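The kind of association check described here can be sketched crudely by correlating an attitude score with a binary Remain/Leave view; the data below are invented, and the actual BES analysis involves many more variables and proper survey weights.

```python
# Sketch of checking how strongly a social-attitude score is associated with
# a Remain/Leave view. The ten records below are invented.

from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Higher score = more socially liberal (0-10); remain = 1 if favours Remain.
attitude = [8, 7, 9, 3, 2, 6, 4, 1, 7, 5]
remain   = [1, 1, 1, 0, 0, 1, 0, 0, 1, 0]

print(f"Attitude-Remain correlation: {pearson(attitude, remain):.2f}")
```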

The authors then put the same batch of BES questions to 1,004 British adults by telephone, and to 4,047 adults online:

Clearly the phone poll produced what might be called the most socially liberal set of results followed by the BES face-to-face interviews, with the online responses eliciting the least socially liberal results of the three.


Could social attitudes be used to weight opinion polls? (Source: Populus/Number Cruncher Politics)

The authors continued:

But how much of those differences are down to the different social attitudes of samples reached differently by online, phone or face-to-face, and how much could simply be explained by people hiding or modifying their social attitudes in the presence of a telephone or a doorstep interviewer?

Different waves of the British Election Study are then considered, with the earliest respondents less likely to say they wished to remain in the EU.


The earliest respondents in the face-to-face interviews were less likely to say they wanted to stay in the EU than those in later waves. (Source: Populus/Number Cruncher Politics)

Simply put, people who were harder to reach had a more liberal set of attitudes.


Harder-to-reach people are more socially liberal, according to the BES. (Source: Populus/Number Cruncher Politics)
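With invented data, that comparison can be sketched by grouping respondents according to how many contact attempts they needed and comparing average attitude scores.

```python
# Invented records illustrating the comparison: respondents grouped by how
# hard they were to reach, with mean social-liberalism scores per group.

from collections import defaultdict
from statistics import mean

# (contact_attempts_needed, social_liberalism_score) -- invented data
respondents = [(1, 4.2), (1, 3.8), (2, 5.1), (1, 4.0), (3, 6.3),
               (2, 5.5), (4, 6.8), (1, 3.5), (3, 6.0), (5, 7.1)]

groups = defaultdict(list)
for attempts, score in respondents:
    bucket = "reached first time" if attempts == 1 else "needed repeat contact"
    groups[bucket].append(score)

for bucket, scores in groups.items():
    print(f"{bucket}: mean liberalism {mean(scores):.1f} (n={len(scores)})")
```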

Causes

What are the causes of the polling gap? One substantial effect is the suppression of “Don’t know” responses in telephone polls. Another is that telephone samples appear too socially liberal, while online samples do not reflect the more liberal positions of the less contactable members of the population.


The two main causes of disparity are the social liberalism of the phone polls and the suppression of the “Don’t know”. (Source: Populus/Number Cruncher Politics)

There is an unexplained “grey area” of three percentage points, which the authors believe is due to the composition of online panels:

If online samples suffer politically because they lack hard-to-reach people, they also seem to have too many of another kind of person: respondents who know exactly what their views are and who back these up with internally consistent opinions when asked for them.
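Using only the round figures quoted in this post, rather than the paper’s own detailed breakdown, a back-of-the-envelope decomposition of the gap looks something like this:

```python
# Back-of-the-envelope decomposition of the phone-online disparity, using the
# round numbers quoted in this post rather than the paper's own breakdown.

total_gap = 16        # phone-online disparity quoted earlier, in points
dont_know_effect = 5  # difference made by offering/suppressing "Don't know"
grey_area = 3         # unexplained residual attributed to online panels

sample_composition = total_gap - dont_know_effect - grey_area
print(f"Roughly attributable to sample composition: {sample_composition} points")
```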

It remains a question for polling companies whether past vote recall is still an effective weighting variable for voting intention polls, and there are particular concerns over its efficacy for polling in this EU referendum.

References

[1] Masters, A., 2016. The British Polling Council Inquiry Report. In Defence of Liberty. Available from: https://anthonymasters.wordpress.com/2016/04/04/the-british-polling-council-inquiry-report/ [Accessed: 17th April 2016]

[2] Singh, M., and Kanagasooriam, J., 2016. Polls Apart. Populus and Number Cruncher Politics. Available from: http://www.populus.co.uk/wp-content/uploads/2016/03/Polls-Apart-29-March-2016.pdf [Accessed: 17th April 2016]

[3] Curtice, J., 2010. Is it safe to past vote weight? NCRM. Available from: http://eprints.ncrm.ac.uk/892/ [Accessed: 17th April 2016]
