The night before Britain's general election, Matt Singh, a 33-year-old former trader who'd set himself up as a political blogger, published a post arguing that the opinion polls were all underestimating Conservative support.
For the next 24 hours, the post was the subject of lively online debate. Then the polls closed, and the television exit poll suggested David Cameron was on course for victory by an unexpectedly large margin. Ten minutes later, Singh's website crashed under the weight of traffic.
"I had been working on it for a while," Singh said in an interview. "I could see that the polls as they had been were quite a way from where I expected them to end up. Look at the fundamentals: the leader ratings, the economy, the local election results. There was a big mismatch."
The night of May 7 was a humiliation for Britain's opinion pollsters, who had all agreed in the weeks running up to the vote that the race between the Tories and Labour was too close to call. They're now trying to work out what went wrong. For Singh, who applied to the electoral process the techniques he'd learned trading Scandinavian interest rates for Barclays Plc, it was a triumph.
Like many others watching the election, he'd expected the polls to shift in the Tories' favor as voting approached. When they didn't, he decided the mistake was with them, rather than with his data.
"I thought: OK, this isn't going to be a boring late swing, it's going to be an actual error," Singh said. "I was very confident that the polls were wrong. I just wasn't sure how much."
Singh backed his calculations with his own money, including a bet on a Conservative majority in Parliament, the eventual result. "What I'll say is that the winnings were more like a post-Lehman bonus than a pre-Lehman bonus," was his response when asked how much he made.
Singh's site, Number Cruncher Politics, got more traffic in the 48 hours after his blog post than it had in the previous four months. British and foreign media took notice, and he was cited by the Royal Statistical Society and Oxford University.
The average eve-of-poll prediction from Britain's eight main pollsters had Cameron's Tories tied with Labour on 34 percent of the vote. In the event, the Tories took 38 percent and Labour 31 percent, excluding Northern Ireland, which has different parties.
The pollsters have been undergoing a process of soul-searching over the past two months, offering different explanations for their collective failure. The British Polling Council has begun a year-long review, sifting the data and trying to work out what went wrong. One Labour lawmaker has introduced a bill calling for pollsters to be regulated.
"I can understand why people are cross," said Andrew Cooper, founder of Populus Ltd., one of the polling companies. "I was totally convinced there would be a swing to the Tories. We'd even done the work to identify the people we thought would switch. But we assumed the people who were going to swing would have swung by polling day."
There are a number of theories about what went wrong. One is that, because voting intention was the first question asked in surveys, pollsters might have got different results if they'd asked questions to make people think about political issues such as the economy first — so-called "warm start" polling.
"The problem is that we don't know which of these questions makes the difference," said Cooper. "And it could change. Could there be elections in which the economy is less important?"
Another theory is sampling error — that the people taking part in polls weren't representative of the population as a whole. Populus has compared its samples to available data for the whole population and found they contained too many disabled and low-income people, and not enough with tumble dryers.
Contributing to this difficulty is the rise of automated sales calls, which has made it harder to get voters to answer their phones. When ICM Ltd.'s Martin Boon started at the company in 1995, it took 3,000 to 4,000 calls to produce 2,000 interviews. For this election, he said, it took 30,000.
Boon has rejected suggestions that the polling companies were tweaking their results to ensure they were in line with each other.
"I didn't change the ICM method one iota from the start to the finish of that campaign, or in the whole parliamentary cycle," he told an open meeting on the polls in June. "What we've got is something that worked in 2010 and didn't work in 2015."
Tom Mludzinski, the director of political polling at ComRes, made the same point in an interview. "There's absolutely, categorically no herding whatsoever," he said. "We got in the data and we have to stand by it. Certainly for us the biggest problem was identifying the electorate. We can be fairly confident that Labour were doing well among a lot of people who didn't end up voting."
Published political polling is also done largely for media companies with tight budgets. "You're trying to get at a very complicated piece of information, and you're trying to get at it with only two or three questions," said Cooper.
Still, Mludzinski is confident that polls will still be commissioned and read during the referendum on leaving the European Union. "That's easier, because it's binary," he said. "That's why it's easier in the U.S. They don't have all these fluctuating options."
Singh is now performing his own analysis of why the polls went wrong and blogging about the contest for a new leader of the Labour Party after the opposition's election debacle. On July 20, Singh published a post arguing that bookmakers were underestimating the chances of the anti-austerity candidate, Jeremy Corbyn. This week, Corbyn became the favorite.
There are some things Singh misses about banking: "Comparing politics to financial markets, you have got very, very few good data points."