Here at the Polling Observatory, we estimate current referendum sentiment by pooling all the currently available polling data, while taking into account the estimated biases of the individual pollsters. This helps reduce the impact of the random variation that each individual survey inevitably produces and accounts for the systematic differences between polling houses (a consequence of the various methodological choices pollsters must make).
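The pooling idea can be sketched in a few lines of code. This is a deliberately simplified illustration, not the model we actually use (which is considerably more elaborate): the house names and poll figures below are hypothetical, and a simple deviation-from-the-mean stands in for a proper statistical estimate of house effects.

```python
# A minimal sketch of pooling polls while adjusting for house effects.
# All pollster names and figures are hypothetical.
from collections import defaultdict

# (pollster, % remain) -- illustrative numbers only
polls = [
    ("HouseA", 52.0), ("HouseA", 53.0),
    ("HouseB", 48.0), ("HouseB", 49.0),
    ("HouseC", 51.0), ("HouseC", 50.0),
]

overall_mean = sum(share for _, share in polls) / len(polls)

# House effect: how far each pollster's own average sits from the
# industry-wide average.
by_house = defaultdict(list)
for house, share in polls:
    by_house[house].append(share)
house_effects = {h: sum(v) / len(v) - overall_mean for h, v in by_house.items()}

# Pooled estimate: subtract each house's effect before averaging, so that
# no single pollster's methodological choices skew the combined figure.
adjusted = [share - house_effects[house] for house, share in polls]
pooled = sum(adjusted) / len(adjusted)
print(round(pooled, 2), {h: round(e, 2) for h, e in sorted(house_effects.items())})
```

In this toy version, HouseA runs about two points above the industry average and HouseB about two points below; removing those offsets leaves only each survey's random variation around the pooled figure.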
We assume that the polling industry as a whole will not be biased. This assumption could prove wrong, of course, but we have no way of knowing what any biases might be. We can only draw on the historical record – and because referendums are both very rare and very unusual political events, this record must be interpreted cautiously. That important caveat noted, past evidence suggests that it’s more likely that pollsters are underestimating support for staying in the E.U. rather than overestimating it.
For this estimate, we examine polls beginning in May, when the wording of the referendum question was first announced, through the start of November. (In September, the government accepted a recommendation from the Electoral Commission that the wording be changed.) We look at the unadjusted responses, meaning we include undecided voters, in part because we cannot be sure how those undecideds will eventually break.
Over the past six months, there has been a steady rise in support for leaving, from around 33 percent to 38 percent. At the same time, support for remaining first increased slightly, reaching approximately 56 percent, but then fell sharply in August to below 48 percent. This collapse in support coincided with the European refugee crisis.
The steady rise in support for Brexit, in the face of both upward and downward shifts in support for remaining, suggests that the leave campaign has made at least some of its gains by winning over undecideds rather than by converting those who previously wanted to remain in the E.U.
A technical note: An important feature of the estimated series is that the level of uncertainty indicated by the confidence intervals is large, especially by comparison with our usual estimates of party support. That’s in part because there are relatively few polls on the E.U. referendum, which most likely will change as the vote nears.
The large confidence intervals might also be a result of the wide spread of estimated levels of support for leaving and remaining. This variability could arise because pollsters are not able to weight their estimates against self-reported past voting behavior in the way they do for polling on vote intentions for national elections in the United Kingdom. Interestingly, the confidence intervals reveal more agreement on the level of support for leaving the E.U., whereas the uncertainty around support for staying is considerable at certain points, especially during July and August.
Our method also allows us to estimate how each pollster’s approach affects its results relative to the other pollsters. That is, it tells us whether the share of responses each polling house reports for remain, leave and don’t know tends to be above or below the industry average. It does not reveal which pollster is most accurate, since there is no past referendum result against which to compare them.
Pollsters at one extreme or the other might well be giving a more accurate picture of voters’ intentions; we won’t know until the final vote. In the table below, we report each company’s support for remaining and leaving (and for undecideds) relative to the average pollster.
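The logic behind the table can be illustrated with a toy calculation. The numbers and house names here are hypothetical, not the table's actual figures; the point is only the mechanics of expressing each pollster's shares relative to the average pollster.

```python
# A toy illustration (hypothetical figures) of how each pollster's
# remain/leave/don't-know shares can be expressed relative to the
# average pollster.
houses = {
    "HouseA": {"remain": 55.0, "leave": 37.0, "dk": 8.0},
    "HouseB": {"remain": 50.0, "leave": 36.0, "dk": 14.0},
    "HouseC": {"remain": 45.0, "leave": 35.0, "dk": 20.0},
}

categories = ["remain", "leave", "dk"]
avg = {c: sum(h[c] for h in houses.values()) / len(houses) for c in categories}

# Deviation of each house from the average pollster, per category.
relative = {
    name: {c: shares[c] - avg[c] for c in categories}
    for name, shares in houses.items()
}

# Because each house's three shares sum to 100, its deviations sum to
# zero: a house reporting fewer "don't knows" must report more support
# for remain and/or leave.
for name, devs in relative.items():
    assert abs(sum(devs.values())) < 1e-9
    print(name, {c: round(d, 1) for c, d in devs.items()})
```

This zero-sum property is why, in the real table, a pollster that finds unusually few undecideds tends to show above-average support for one of the substantive options.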
Notably, too, the pollsters find very different numbers of undecideds. ComRes (-7.3) and Ipsos-MORI (-6.5) indicate lower than average proportions of “don’t knows,” whereas Survation finds many more (+11.4).
The levels of support for “remain” and the levels of “don’t know” also appear connected. Pollsters such as ComRes and Ipsos-MORI, which report lower levels of “don’t know,” also report higher levels of support for “remain,” while Survation shows the opposite pattern, with more “don’t knows” and less support for “remain.” This might suggest a segment of “soft support” in the public: respondents who shift between “don’t know” and support for the status quo depending on the polling house’s methodology.
These are, however, tentative observations on the initial data. The table clearly illustrates that opinion on the referendum is rather uncertain, and that estimated opinion depends substantially on the polling house conducting the survey. We should be cautious about the findings of any single poll.
While support for leaving has been gaining slightly, many of the small moves we see in the polls in coming months are likely to be nothing more than random fluctuations. Events such as the Paris attacks can introduce additional uncertainty about the referendum outcome. For now, what we know for sure is that opinion remains divided on Britain’s future in Europe.
Robert Ford is senior lecturer in politics at the University of Manchester, United Kingdom. He has various research interests within the broad umbrellas of racial attitudes and intergroup relations, public opinion research and methodology and partisan and electoral politics.
Will Jennings is professor of political science and public policy at the University of Southampton, United Kingdom. He specializes in public opinion, voting, and public policy.
Mark Pickup is associate professor in the department of political science at Simon Fraser University, Canada. He is a specialist in comparative politics and political methodology.
Christopher Wlezien is the Hogg Professor of Government at the University of Texas at Austin. He is author (with Robert S. Erikson) of “The Timeline of Presidential Elections: How Campaigns Do (and Do Not) Matter” (University of Chicago Press, 2012).