Although it has yet to be published, it is a very lengthy thing ("150 pages" if this article is to be believed:
https://edtechnology.co.uk/he-and-fe/36-of-a-level-grades-in-england-downgraded-by-ofqual-algorithm/) and determines the score from a number of factors (previously discussed), but also seems to be subject to one overriding principle - no grade inflation (
https://www.hepi.ac.uk/2020/08/10/a-levels-2020-what-students-and-parents-need-to-know/).
Just have a think about what's happened here for a minute. There's been a mad panic to write an algorithm to generate a fair and equitable exam score that has, rightly, tried to use the minimum necessary input from teachers, to allow them to get on with other pressing issues. Yes, a politician will have signed it off, but on the principles of how the algorithm works, not the detail. The algorithm seems to have taken three months to write, test and adjust before being used, which is not bad going when you consider this is all new ('unprecedented' is way too overused at the moment). And apologies if I'm teaching granny to suck eggs, but the test runs would generate score distributions, not individual scores, so it was always the case that these issues would only come out once the moderation was run in anger.
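To make the distribution-versus-individual point concrete, here is a minimal sketch (my own illustration, not Ofqual's published method - the function name, grade boundaries and inputs are all invented for the example) of a moderator that fits a class to a historical grade distribution. You can validate that the output *distribution* looks sensible in test runs, but which *individual* lands on which side of a grade boundary only emerges when a real teacher ranking goes in.

```python
# Illustrative sketch (NOT Ofqual's actual algorithm): map a teacher's
# rank order of students onto a centre's historical grade distribution.
# Test runs can confirm the distribution comes out right; individual
# outcomes only appear once a real ranking is fed in.

def moderate(ranked_students, historical_distribution):
    """Assign grades so the class matches a historical distribution.

    ranked_students: list of names, best first (teacher-supplied ranking).
    historical_distribution: dict of grade -> fraction of past students,
    e.g. {"A": 0.2, "B": 0.3, "C": 0.5}; fractions sum to 1.
    """
    n = len(ranked_students)
    grades = {}
    cumulative = 0.0
    idx = 0
    for grade, fraction in historical_distribution.items():
        cumulative += fraction
        # Everyone ranked above this cumulative cut-off gets this grade.
        cutoff = round(cumulative * n)
        while idx < min(cutoff, n):
            grades[ranked_students[idx]] = grade
            idx += 1
    # Rounding can leave a remainder; any leftover students take the last grade.
    last_grade = list(historical_distribution)[-1]
    while idx < n:
        grades[ranked_students[idx]] = last_grade
        idx += 1
    return grades
```

For example, `moderate(["Ann", "Bob", "Cat", "Dan", "Eve"], {"A": 0.2, "B": 0.4, "C": 0.4})` gives Ann an A, Bob and Cat a B, and Dan and Eve a C - fine in aggregate, but whether that is fair to Dan depends entirely on the ranking and the history, not on anything Dan did this year.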
Welcome to the brave new world of algorithms. They are difficult to write and, as they increase in complexity (150 pages is quite complex), it is often difficult to predict how their behaviour will change over time (
https://analyticsindiamag.com/8-rea...thms-turned-rogue-causing-disastrous-results/). Am I using this to excuse the current situation? To a degree, yes, but I suggest this is just an unfortunate result of a set of rules, and it is unlikely that class or race bias was deliberately baked in; that would be all too difficult, especially in three months. What is more likely, as has been seen with previous algorithmic failures, is that historical data derived from processes that themselves contain biases and discrimination will generate biased output (and often amplify it). It will take more than three months to sort out those issues - they are long-standing problems, subject to deep debate, and not going to be fixed by a bunch of mathematicians and coders.
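A toy calculation, with entirely made-up numbers, of the mechanism described above: any moderator that matches each centre's historical grade rates will hand two equally able cohorts different results purely because their centres' pasts differ - the bias lives in the data, not in the code.

```python
# Hypothetical numbers only: two cohorts of identical ability, moderated
# against their centres' past top-grade rates. The historically
# disadvantaged centre's students are marked down for where they studied.

def expected_top_grades(cohort_size, historical_top_grade_rate):
    """Top grades a distribution-matching moderator will award this cohort."""
    return round(cohort_size * historical_top_grade_rate)

# Same cohort size and (by assumption) the same underlying ability:
well_resourced = expected_top_grades(100, 0.30)   # past rate shaped by advantage
under_resourced = expected_top_grades(100, 0.10)  # past rate shaped by disadvantage
```

Here the well-resourced centre gets 30 top grades and the under-resourced one gets 10, and no amount of debugging the function will change that - only changing the historical inputs would.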
TL;DR - Too complex to engineer bias in or out of the code. Too much emphasis on avoiding grade inflation.
The biggest mistake, however, is political: not recognising that there will be losers in this, and not allowing a proper appeals process based on individual performance.