The book I’ve found most challenging and rewarding recently is ‘Irrationality’. First published in 1992, it both identifies significant flaws in human thinking and advocates a more standardised, scientific approach to decision-making. It makes a good case.
However, nearly 30 years later, we are living with the consequences of that way of thinking: the algorithm. The non-profit body Algorithm Watch has produced ‘Tracing the Tracers’, an analysis of the way algorithms have been co-opted into the fight against the covid-19 pandemic. The thesis of the report is that, in the initial panic to ‘DO SOMETHING’, the tech sector offered up dramatic solutions to governments. These solutions aren’t yet proven and, due to a lack of transparency, may never be. Furthermore, once algorithms are embedded in a government’s core health functions, they become even harder to disentangle, irrespective of whether it’s the citizen, the government or the technology vendor that benefits most.
“An algorithm isn’t just for Christmas (Lockdown), it’s for life.”
(As the advert almost certainly wouldn’t say)
Once algorithms become an essential (and invisible) part of the ecosystem, they become very hard to remove. And if the legislative balance is wrong, they run the risk of eroding, bit by bit, individual freedoms and privacy. There’s no denying that reading a report on the importance of good algorithm governance is less exciting than “tech-bro saves us from covid-19”, but long after the lateral flow tests have gone to landfill, those algorithms will still be with us.

The author of ‘Irrationality’, Stuart Sutherland, was right: there are problems with how humans handle complicated decision-making. But precisely because of this, it’s all the more important that algorithms are properly and transparently regulated.
Quis vestigat ipsos vestigatores?
Who is tracing the tracers?