Here are some of the theories that may be at play here:
- Online panels are wearing out... respondents are not paying as much attention to the questions as we would like and are clicking through as quickly as possible to collect their incentive.
- Telephone surveys are not reaching representative samples: fewer households use a landline, and people only answer calls from known numbers, keep their mobiles on silent, or don't want to spend time on the phone answering a long survey.
- Modelling based on previous election results or estimated turnouts is not appropriate - turnout in 2015 rose by 1% to 66.1% across the UK but leaped from 63.8% to 71.1% in Scotland following Nicola Sturgeon's strong and successful campaigning (UK Political Info, 2015).
- Voters are not inclined to discuss their voting habits, may not have decided at the point of interview, or may be ashamed of their voting intentions if these are not seen as socially acceptable or deviate from their family's views or class heritage.
- Voters change their mind or vote tactically as they are swayed by campaigning, celebrity endorsements, the results of opinion polls, election manifestos, or values quizzes shared on social media.
- Last-minute PR gimmicks, such as the #EdStone manifesto, can damage voter intentions; even the supposedly neutral BBC showed edited images comparing Ed Miliband to Moses.
Of course, it may well be that there were major flaws in the complex methodologies employed by the polling companies that are yet to be revealed, but I'm not convinced. Polling is a complex process, and there will always be room for improvement (in any survey) in the way questions are phrased and ordered. Should pollsters measure people's values instead and match those values to the parties? Should they ask how people would vote if the election were held today? Should they ask who people voted for last time, and whether they changed their mind at any point?
I think that, within the margin of error (generally +/- 3%), the polls were probably right... voters felt at the time of the interview that they would probably vote for [x] and then... they spoke to their friends and family, or watched an interview, or saw something on social media, or took offence at a gimmick or statement, or thought perhaps things wouldn't turn out quite how they hoped and... they changed their mind. So the opinion polls and the exit polls could both have been right all along; it's just that what we say we'll do and what we actually end up doing are entirely different, as numerous theories of consumer behaviour suggest*.
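As a rough sanity check on that +/- 3% figure: for a simple random sample (which commercial polls only approximate), the 95% margin of error works out from the standard formula z * sqrt(p(1-p)/n). The sketch below assumes a typical UK voting-intention poll of around 1,000 respondents and the worst-case split of p = 0.5; the function name is mine, not any pollster's.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.

    p=0.5 gives the worst case (widest interval); z=1.96 is the
    critical value for 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical voting-intention poll of ~1,000 respondents:
moe = margin_of_error(1000)
print(f"+/- {moe:.1%}")  # roughly +/- 3%
```

So with a sample of 1,000, two parties polling within about three points of each other are statistically indistinguishable before we even get to panel fatigue or late swing.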
Read our post on the Scottish Referendum here.
Read more about the differences between online and postal polling in The Guardian here.
*Some of which you can read about on our blog but lots more to come.