This article is from the source 'bbc'.

You can find the current article at its original source at http://www.bbc.co.uk/news/election-2017-40265714


How wrong was the election polling?
Once again the polls, taken as a whole, were not a good guide to the election result.
Over the course of the campaign the gap between the two main parties narrowed but, with one exception, the final polls all suggested a clearer Conservative lead than the actual outcome.
Having said that, it wasn't an unmitigated disaster. Every poll throughout the campaign put the Conservatives ahead - and that was indeed the result.
The final polls were fairly accurate about the Conservative and Lib Dem shares. It was Labour where they were uniformly wrong. They also overestimated UKIP and the SNP.
Survation were closest to the actual result. Kantar Public's numbers were also reasonably good.
YouGov's final poll, like most of the others, seriously underestimated Labour. Prior to that, they had been suggesting a closer race. They also had a separate seat projection model which had been indicating a hung parliament. That had been met with a lot of scepticism but, with hindsight, was pretty accurate.
In the final weeks of the campaign, the polls were often criticised for being "all over the place". It's true that they were pointing to very different outcomes.
That variation clearly made it difficult to interpret what they were saying. It's surely better, though, that they had different numbers than that they were all wrong in exactly the same way. If you're looking for consistent accuracy then opinion polls are probably not for you.
Pollsters are also sometimes accused of herding - deliberately manipulating their figures so they all say the same thing. That accusation can't be levelled at this election.
In 2015 the polls went wrong because their samples were not representative of the electorate - they contained too many Labour voters. They also misjudged the difference in turnout rates between age groups - they overestimated turnout among young voters.
The pollsters who were furthest from the actual result this time were those, like ICM and ComRes, who had taken the strongest measures to try to rectify the problem from 2015. Survation made no significant changes to their methodology and came out on top.
It looks as though the errors this time were caused, at least in part, by fighting the last war.
We'll never know the exact figures for turnout among young voters - it's a secret ballot, remember - but YouGov's post-election estimate puts it at 57% for 18-19 year olds and 59% for 20-24 year olds.
That's lower than for older voters but considerably higher than estimates for young voters in 2015.
We can also see that the places where the number of voters increased the most were generally those with young populations. The assumption made by some pollsters that young turnout would continue to under-perform was probably wrong.