IT hasn’t been the best of times for the algorithm. In the past two weeks, this mathematical cornerstone has set the UK ablaze, thanks to an exam-grade moderation system that “baked in inequality”.

Terrible tales of teenagers losing their conditional places at universities and colleges or opportunities for apprenticeships have filled our airwaves and newspapers.

More encouragingly, we saw young people marching on Downing Street in a responsibly socially distanced way and speaking up eloquently about their predicament.

It should be said that the Prime Minister is now socially distancing himself in Scotland, but the point was made nonetheless – a practical lesson in the power of protest will do this generation no harm whatsoever.

Last week, we saw a relatively quick reversal in Scotland after this algorithm laid bare the deep-rooted inequalities in our outdated education system, with pupils from the most deprived areas suffering disproportionately. “Honest” John Swinney saw the light just in time to avoid six of the best from MSPs.

Northern Ireland followed suit, scrapping its GCSE results algorithm and now its A-level one, with pupils awarded either their teachers’ predictions or their adjusted mark, whichever is higher.

Eminently sensible, really, given that most teachers know their pupils personally – their efforts, their strengths and weaknesses, their progress across years of study and revision – far better than any algorithm or any postcode lottery ever could.

As the students themselves have declared, they are much more than just “statistics” and do not deserve to have “their futures determined by a computer algorithm”.

Or, even more to the point, think of the reaction of grannies across the country finding out that wee Jeanie wasn’t getting into college because her school wasn’t posh enough.

Given that the current Westminster Government specialises in U-turns, I was unsurprised by Education Secretary Gavin Williamson’s climbdown on Monday – more pet hamster than the tarantula he used to keep in the Tory chief whip’s office. Contrition does not come easily to Williamson, but he started the week in sackcloth and ashes.

“I am sorry for the distress this has caused young people and their parents but hope this announcement will now provide the certainty and reassurance they deserve,” he said.

That is political code for: “Why on Earth did I not pay more attention to what was happening to the Jocks last week and why is the phone reception so bad up there just when I needed Boris’s OK to U-bend?”

Unfortunately for gormless Gavin, some of the damage to young hopes has already been done. “I’m relieved but quite frustrated at the same time. It’s too late,” said Zainab Ali, 18, from London.

Zainab’s predicted grades were an A* in history, an A in psychology, and a C in chemistry.

But after she was given an A, a B and a D on results day last week, she lost her place to study psychology at Queen Mary University of London, and the course is now full.

“I’m facing the consequences for the indecisiveness of people who are in charge,” Zainab said.

Now, it should be said that this young lady is off to her second-choice university – the University of Westminster – but I was struck by her eloquence, in stark contrast to the bumbling incompetence of the Education Secretary. All in all, it’s an alpha for Zainab and an omega minus for Williamson.

The controversy over algorithms reflects a deeper problem with governments that centralise power and cut off the legs of local engagement. When all the decisions are made within the confines of Number 10, for instance, or by just two or three leaders and thinkers at the top, those decisions can become divorced from lived experience and from personal and local knowledge.

Just imagine Johnson and Cummings as the human embodiment of an algorithm, an analogy I’m sure techie-obsessive Cummings would enjoy – the rule-makers want to get the job done, but they follow the wrong steps because they don’t understand the human or the personal.

An algorithm, or indeed an education system, is only as good as the people who write it, or those who govern it and map out the rules. And right now that is not good enough. They must do better.