Weapons of Math Destruction: How big data increases inequality and threatens democracy

Like the last post (‘Everybody Lies’), Weapons of Math Destruction is written by a data scientist. However, there is a significant difference between the tone of the two texts. Seth Stephens-Davidowitz is clearly a bright guy, still fascinated with the potential of big data (although make no mistake, he can see the flaws and potential dangers). Cathy O’Neil is different. O’Neil is an activist.

Weapons of Math Destruction explores how black box algorithms are being used by businesses and governments to make decisions about justice, education, employment, banking, insurance and democracy itself. Often the most persuasive arguments are illustrated by showing the damaging impact of these systems on people's real lives: the teacher sacked by a faulty rating system, the innocent man doorstepped by the police because an algorithm found him statistically likely to commit a crime, the man unfairly unable to find a job because an assessment-centre test identified him as a risk.

O’Neil argues that the current use of big data is flawed on many levels. Firstly, data scientists don’t understand the limitations of their data. She writes lucidly about how ideas like ‘moneyball’ turbocharged big data, but unlike the analysis of baseball players, many systems don’t have feedback loops built in. Once a ‘poorly-performing’ teacher has been fired, they’re invisible to the system: it has no way of learning whether that teacher was genuinely poor, had a bad year, was teaching students who had been ‘gamed’ through an earlier assessment, or even whether the algorithm itself is broken.

Big data is all about systems. What’s particularly depressing is the extent to which data scientists appear happy to work with the limitations of their data. For example, smart policing algorithms are designed to reduce crime by targeting high-crime areas. All well and good, but the mechanisms tend to use proxies for crime, including arrest rates. If the police saturate an area, inevitably they’ll encounter all manner of serious and trivial criminality: arrest rates will increase, thereby creating a feedback loop. And most of those areas will be populated by poor people, often from ethnic minority backgrounds. Which, in turn, creates a situation in which poor people are increasingly targeted by the police, are arrested and prosecuted, and therefore find it harder to get jobs, meaning that they are more at risk of being involved in crime. It’s a system, but not the one that the designers intended. What stands out most starkly is just how damaging algorithms are to the poor and disadvantaged. Systems designed with good intentions to take the prejudice out of decision-making often do exactly the opposite, trapping poor people in cycles of poverty. Algorithms appear to fossilise disadvantage into an individual or community.
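The proxy problem O’Neil describes can be made concrete with a toy simulation (my illustration, not from the book): two neighbourhoods with identical underlying crime rates, where patrols are allocated in proportion to historical arrest counts. Because arrests scale with patrol presence rather than with crime alone, a single chance arrest early on locks in a permanent disparity:

```python
# Toy model of a predictive-policing feedback loop (hypothetical numbers).
# Two areas have the SAME true crime rate, but patrols are allocated
# proportionally to past arrests -- a proxy that patrols themselves inflate.

def simulate(rounds=10, patrols_total=100):
    true_crime_rate = [0.05, 0.05]   # identical underlying crime in both areas
    arrests = [1.0, 2.0]             # one chance extra arrest in area 1 at the start
    patrols = [patrols_total / 2] * 2
    for _ in range(rounds):
        # Allocate patrols proportionally to accumulated arrests (the proxy).
        total_arrests = arrests[0] + arrests[1]
        patrols = [patrols_total * a / total_arrests for a in arrests]
        # New arrests depend on patrol presence, not just on crime.
        for i in range(2):
            arrests[i] += patrols[i] * true_crime_rate[i]
    return patrols

print(simulate())
```

Because each area's arrest count grows by the same multiplicative factor per round, the initial accidental 1:2 arrest ratio never washes out: area 1 ends up with twice the patrol presence of area 0 indefinitely, despite identical crime. The system is internally consistent and entirely wrong, which is exactly the shape of the problem.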

What may be most dangerous is that algorithms appear to be ‘objective’: they’re ‘hard’ science, machines based on unemotional, colour-blind maths. As such, their decision-making seems unassailable. If a supercomputer has made a decision, who are we to argue? O’Neil argues that given the data-entry mistakes, data limitations and faulty assumptions driving them, we should all be challenging them all the time.

The only surprising aspect of the book is the fact that the word ‘Kafka-esque’ doesn’t appear once.

6 thoughts on “Weapons of Math Destruction: How big data increases inequality and threatens democracy”

  1. Great book and we should hold each other accountable, both as suppliers and consumers. My worry is that consumers do it [use biased data] unknowingly, as the ‘implementors’ of the tech are unclear about what they are buying – I guess that’s the point of the book though… META…

