IT’S easy to forget that the EU referendum should never have happened. By this I don’t mean it shouldn’t have happened because I – along with millions of others – thought Brexit was a terrible idea. It shouldn’t have happened because the Conservative party should not have won a majority of seats in 2015.

It’s easy to forget now, but at the time no-one thought this was going to happen. In January 2013, when David Cameron made that fateful promise of a referendum if – and only if – his party won a majority, it did not seem likely at all. We all remember that Ukip’s popularity was soaring at the time, with polls showing them overtaking the tarnished LibDems to become the third most popular party in the UK, but it’s easy to forget that at the time Ed Miliband’s Labour were a whopping 10 points ahead of the Tories.

As former Labour voters joined former Conservative voters in backing Farage and co, the polls narrowed, but by the eve of the vote on May 7, 2015, the pollsters were unanimous in predicting a hung parliament. When an exit poll showed the Tories just 10 seats short of an absolute majority, most commentators scoffed that this must be a mistake. In fact, it was still a significant underestimate of how well they would do.

Which is a long-winded way of saying that polls – even lots of polls, even polls carried out over months and years – can be wrong. Or not “wrong” per se, but not a reliable indicator of how the population will vote. So be warned.


The BBC’s David Cowling carried out a postmortem a week after the 2015 election and found little evidence to support a “late swing” (people changing their minds closer to election day), a “lazy Labour” effect (the party’s supporters being less likely to turn out on the day) or “shy Tories” (people claiming they didn’t know how they would vote when they had in fact already decided to vote Conservative).

In the immediate aftermath of the 2014 referendum, a Lord Ashcroft poll found that one in seven No voters would be reluctant to tell even their friends, family or colleagues how they had voted. Still, even if we assume that every single “don’t know” in the recent polls is really a shy No, things are still looking pretty good, right? We’re up to 21 polls in a row showing majority support for independence.

While this will of course be music to the ears of every Yes supporter, there is no room for complacency, for two reasons: the questionable reliability of polling data, and the fluidity of the UK’s current political situation. Sure, every time we’ve thought trust in Westminster couldn’t get any lower, the Tories have limbo-danced under an even lower bar, but eventually Boris Johnson will either fall backwards onto his arse or simply exit the dance-off. Yes, Brexit may be bad news (including for some of its strongest supporters) but already the EU’s sluggish vaccination programme has some Remainers reassessing the pros and cons.

It’s certainly not clear – as some seem to believe, or at least hope – that current reported levels of support for Yes are a solid starting base and that the only way is up when campaigning begins in earnest.

We all know the trajectory was upwards between early 2012, when John Curtice put support for independence at 32-38%, and 2014, when 45% voted Yes. If it were a simple case that a Yes campaign – any Yes campaign – was guaranteed to add seven to 13 percentage points to the outcome, then we would be looking at a huge landslide for Yes. But 2014 feels a long time ago now, and there can be no doubt the fight is going to get dirty. Very dirty.

Many believe that presenting a positive vision of an independent Scotland should and will be enough to ensure those currently planning to vote Yes stand firm, but that brings us back to those troublesome polls.


The polling companies may be ready with their technical answers about demographics, sampling and weighting, but there is one significant group of people who are being missed, regardless of whether the polling is carried out by phone or online: people who don’t take part in polls. And this could go some way to explain recent confounding results.

Ahead of the US elections last year the pollsters correctly predicted a Joe Biden win, but got a lot of other things wrong. In an effort to find out why, US website Vox interviewed political data analyst David Shor. “It turns out that people who answer surveys are really weird,” he said, adding that the evidence shows these people are much more politically engaged than average, more agreeable and have higher levels of social trust. In the past, he says, this didn’t pose a problem for pollsters, since a lack of social trust was not a strong indicator of voting intention, but things have changed. “These low-trust folks used to vote similarly to everyone else,” he said. “But as of 2016, they don’t: they tend to vote for Republicans.”

Is it possible Yes voters are considerably more likely than No voters to answer the phone to a polling company, or join an online research panel? Until someone finds a better explanation for recent polling peculiarities, the possibility cannot be ignored.