Government incompetence and u-turns over high school results have perhaps dominated this summer’s news cycle, which is no mean feat given the global pandemic, record-busting recession and looming Brexit crisis. The story has hinged upon an ‘algorithm’ or, to quote the Prime Minister, a ‘mutant algorithm.’ Sadly, our cousins over at Scientists for Labour have picked up on this theme too – blaming science and algorithms for political mistakes.
In a recent blog post, Scientists for Labour explained how the algorithm worked (quite useful if you don’t want to read 319 pages of Ofqual verbiage) and correctly noted that the full processing pipeline applies a sort of smoothing to past performance and is inherently biased in favour of small class sizes at fee-paying schools (public/private, pick your lingo).
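To see why small classes came out ahead, here is a deliberately simplified sketch of the mechanism as it was widely reported: very small cohorts kept their teacher-assessed grades, large cohorts were pulled toward the school’s historical results, with a taper in between. The thresholds and the linear taper below are illustrative assumptions, not a reproduction of Ofqual’s actual 319-page specification.

```python
# Illustrative sketch only: the thresholds (5 and 15) and the linear
# taper are assumptions for demonstration, not Ofqual's real model.

def blend_weight(cohort_size, lower=5, upper=15):
    """Weight given to the centre's historical distribution:
    0.0 = pure teacher-assessed grade, 1.0 = fully statistical."""
    if cohort_size <= lower:
        return 0.0
    if cohort_size >= upper:
        return 1.0
    return (cohort_size - lower) / (upper - lower)

def moderated_grade(teacher_grade, historical_grade, cohort_size):
    """Blend the teacher-assessed grade with the grade implied by the
    centre's past results, weighted by cohort size (grades on the 9-1 scale)."""
    w = blend_weight(cohort_size)
    return (1 - w) * teacher_grade + w * historical_grade

# A class of 4 keeps its optimistic teacher grade; a class of 30 is
# dragged down to the school's historical average.
small_class = moderated_grade(teacher_grade=9, historical_grade=6, cohort_size=4)
large_class = moderated_grade(teacher_grade=9, historical_grade=6, cohort_size=30)
```

Under these (assumed) parameters, the identical pupil scores a 9 in the small private-school class and a 6 in the large comprehensive class, which is the structural bias the Scientists for Labour post identified.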
It is the conclusion drawn by Labour’s scientist community that is worrying:
“In order for an algorithm to take a decision based on some data, it must follow a series of computational processes which have been designed by a data scientist when the model was built.
This data scientist is therefore the one actually making decisions and is entirely unaccountable to the public their model is directly impacting. Due to the lack of accountability and oversight, it was entirely improper to use an algorithm for the purpose of making life-changing decisions for an entire cohort of students.”
What we see here is exactly what we warned against in a recent blog for Lib Dem Voice, and something of a ‘Boris Johnson’ move: attacking the ‘mutant algorithm’ or, in other words, pretending that some Frankenstein science was at fault instead of his government or ministers.
The truth is that this was an entirely politically accountable set of circumstances that emerged from political decisions and pressure from the Education Secretary, Gavin Williamson. He can and should be held to account. When politicians set the prime directives or ‘first parameters’ of any data science model, that is a political act, not a scientific one.
In brief, what we know is:
- Ofqual proposed socially distanced exams; Gavin Williamson chose statistically adjusted predictions instead.
- Williamson and the government set the ‘first parameter’ for the data science that followed. It was important to them that there was no ‘grade inflation’, and everything followed from there.
- Williamson was warned about the inequalities a month before the results came out but he chose to continue with the algorithm.
- Williamson hadn’t evaluated the algorithm until after results had been announced.
Far from this being unaccountable ‘Franken-data’, this was an entirely accountable political process from start to finish, in which the government, and Gavin Williamson especially, set parameters under which any data scientist or algorithm would have failed public expectations.
We must all be clear about where blame lies for this fiasco – and for all the lives put on hold or set back by it. It was a politically accountable decision by the Education Secretary and possibly others in government; it is not a reason to bash data science.