[nexa] Automating Society Report 2020

ac+nexa at zeromx.net
Wed Oct 28 15:29:28 CET 2020


<https://automatingsociety.algorithmwatch.org/>

/by Fabio Chiusi/

On a cloudy August day in London, students were angry. They flocked
<https://twitter.com/HUCKmagazine/status/1294981428602699776>
to Parliament Square by the hundreds, in protest – their placards
emblazoned with support for unusual allies: their teachers, and an even
more unusual target: an algorithm.

Due to the COVID-19 pandemic, schools closed in March in the United
Kingdom. With the virus still raging throughout Europe over the summer
of 2020, students knew that their final exams would have to be canceled,
and their assessments – somehow – changed. What they could not have
imagined, however, was that thousands of them would end up with lower
<https://www.theguardian.com/education/2020/aug/13/almost-40-of-english-students-have-a-level-results-downgraded>
than expected grades as a result. The protesting students knew what was to
blame, as was apparent from their signs and chants: the automated
decision-making (ADM) system deployed by the Office of Qualifications
and Examinations Regulation (Ofqual). It planned
<https://www.gov.uk/government/publications/awarding-gcse-as-a-levels-in-summer-2020-interim-report>
to produce the best data-based assessment for both General Certificates
of Secondary Education and A-level results, in such a way that “the
distribution of grades follows a similar pattern to that in other years,
so that this year’s students do not face a systemic disadvantage as a
consequence of circumstances this year”.
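
The report does not spell out the mechanics of Ofqual's model, but the
stated goal, keeping this year's grade distribution in line with a
historical pattern while relying on a rank order of students, can be
illustrated with a deliberately simplified sketch. Everything below,
including the standardise() function, the student names, and the grade
shares, is invented for illustration and is not Ofqual's actual model.

# Hypothetical sketch of distribution-matching standardisation, in Python.
# NOT Ofqual's model: it only illustrates the idea of assigning grades so
# that their shares follow a given historical pattern, using the
# teachers' best-to-worst rank order of students in a class.

from typing import Dict, List

def standardise(rank_order: List[str],
                historical_share: Dict[str, float]) -> Dict[str, str]:
    """Map a best-to-worst rank order of students onto grades so that the
    resulting grade shares roughly follow historical_share, e.g.
    {"A": 0.2, "B": 0.4, "C": 0.4}, listed from best to worst grade."""
    n = len(rank_order)
    grades: Dict[str, str] = {}
    cumulative = 0.0
    start = 0
    for grade, share in historical_share.items():
        cumulative += share
        end = round(cumulative * n)
        for student in rank_order[start:end]:
            grades[student] = grade
        start = end
    # Any rounding leftovers receive the lowest grade in the table.
    lowest = list(historical_share)[-1]
    for student in rank_order[start:]:
        grades[student] = lowest
    return grades

if __name__ == "__main__":
    ranks = ["Amir", "Beth", "Chen", "Dora", "Eli"]
    history = {"A": 0.2, "B": 0.4, "C": 0.4}
    print(standardise(ranks, history))
    # {'Amir': 'A', 'Beth': 'B', 'Chen': 'B', 'Dora': 'C', 'Eli': 'C'}

Note that under any scheme of this kind, an individual's grade depends as
much on the class's past results as on that individual's work, which is
precisely what the placards described below objected to.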

The government wanted to avoid the excess of optimism that would have
resulted from human judgment alone, according to its own estimates
<https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/909035/6656-2_-_Executive_summary.pdf>:
compared with previous years' grade distributions, grades would have been too high. But
this attempt to be “as far as possible, fair to students who had been
unable to sit their exams this summer” failed spectacularly, and, on
that grey August day of protest, the students kept coming, chanting, and
holding up signs to express an urgent need for social justice.
Some were desperate, some broke down and cried.

“Stop stealing our future”, read one placard, echoing the Fridays for
Future protests of climate activists. Others, however, were more
specifically tailored to the flaws of the ADM grading system: “Grade my
work, not my postcode” and “we’re students, not stats”, they read,
denouncing the discriminatory outcomes of the system.

Finally, a chant erupted from the crowd, one that has come to define the
future of protest: “Fuck the algorithm”. Scared that the government was
casually – and opaquely – automating their future, no matter how
inconsistent with their skills and efforts, students screamed for the
right not to have their life chances unduly affected by bad code. They
wanted to have a say, and what they said should be heard.

Algorithms are neither “neutral” nor “objective” even though we tend to
think that they are. They replicate the assumptions and beliefs of those
who decide to deploy them and program them. Humans, therefore, are, or
should be, responsible for both good and bad algorithmic choices, not
“algorithms” or ADM systems. The machine may be scary, but the ghost
within it
<https://medium.com/startup-grind/the-ghost-in-the-algorithm-a02e5b882afb>
is always human. And humans are complicated, even more so than algorithms.

The protesting students were not so naive as to believe that their woes
were solely the fault of an algorithm, anyway. In fact, they were not
chanting against “the algorithm” in an outburst of technological
determinism; they were motivated by an urge to protect and promote
social justice. In this respect, their protest more closely resembles
that of the Luddites. Like the labor movement that smashed mechanized
looms and knitting frames in the 19th century, they knew that ADM
systems are about power, and should not be mistaken for an
allegedly objective technology. So they chanted “justice for the
working class”, called for the resignation of the Education Secretary, and
portrayed the ADM system as “classism at its finest” and “blatant classism”.

Eventually, the students succeeded in abolishing the system that put
their educational careers and life chances at risk: in a spectacular
U-turn, the UK government scrapped
<https://www.euronews.com/2020/08/17/britain-scraps-algorithm-for-student-exam-grades-after-outcry-over-fairness>
the error-prone ADM system and used the grades predicted by teachers instead.

But there’s more to this story than the fact that the protesters won in
the end. This example highlights how poorly designed, implemented, and
overseen systems that reproduce human bias and discrimination fail to
realize the potential that ADM systems have, such as improving
comparability and fairness.

More clearly than many struggles in the past, this protest reveals that
we’re no longer just automating society. We have automated it already –
and, finally, somebody noticed.

[...]
