Today, but worse #2: Murder by Algorithm
Things will go wrong in the worst ways if governments think they can predict our crimes
The government holds huge amounts of information about every one of us. What if they used it as evidence we’re destined to kill?
A few months ago, the UK government quietly confirmed that it is developing a murder prediction tool. Originally named The Homicide Prediction Project, it’s now known as Sharing Data to Improve Risk Assessment. Presumably someone behind the scenes realised the original title had a decidedly dystopian vibe. The project uses data to identify the people most likely to commit future violent crimes.
Right now, it’s not clear whose data will be included. The Guardian reported that it will draw on huge data sets containing the personal information of people with experience of mental health problems, addiction, self-harm, domestic violence, and other contact with the police (so lots of people who have been unfortunate, but not criminal). The government denies this, saying that no data from individuals who haven’t been convicted of a crime is being used. And anyway it’s just research. Trust us.
Even with its new, non-creepy title, the idea behind the project is chilling, and I’m not alone in this sentiment. Predictive models like this use data from our existing - human - systems, and any data fed into the model is just as flawed and institutionally biased as our current justice system. We’re already on shaky ground when it comes to ensuring justice in the present. What happens when we try to predict the future?
We’ve seen this before, haven’t we? In dystopian fiction and in real life. The bureaucratic shifts that begin with good intentions - let’s stop murder before it happens - and end with individuals or entire communities tagged as criminal before any crime has been committed. Just look at how the requirement for Prevent compliance led to the surveillance of Muslim communities.
For dystopian authors, the story is prediction
This is the kind of headline that other dystopian authors and I are fascinated by - not because it’s entertaining, but because it’s familiar. For me, these stories confirm that the future isn’t something I’m pulling from the recesses of my imagination. Dystopian stories are extrapolations of reality, observed in the present and taken to their logical and unpleasant conclusions. What happens if we keep going down the road set out by The Homicide Prediction Project? Today we predict someone’s risk of offending, but tomorrow we jump straight to a guilty verdict? Because why wouldn’t we want to stop crime before it’s committed? Preventing a crime has to be better than cleaning up the emotional and financial fallout afterwards.
The human element - this isn’t just tech
These algorithms don’t exist in isolation, and they’re just as flawed as the people and society that created them. Recently the UK Sentencing Council urged judges to consider the ethnicity of the people on trial before deciding whether to send them to prison. On the face of it this seems fair, doesn’t it? To consider the context of a person’s life before judging them? To say, look, we understand that lives are complicated, that some people are dealt a more difficult hand, and that their actions might be a result of that. So we’ll think about cutting them some slack, showing some compassion. Addressing racial disparity in the justice system. But the result in practical terms is a system where some ethnicities face harsher sentences for the same crimes.
Now imagine that logic filtered through the lens of predictive data. A person’s ethnicity, mental health records, childhood trauma, all fed into a predictive model and used to suggest their “risk” to the rest of society. Membership of a particular community might be enough to mark a person as high-risk, dramatically increasing their chance of being flagged, no matter what they’ve done or experienced. Racial profiling would be given a whole new, eerily dystopian legitimacy.
This is the dystopia we live in. Not future-tense. Now. In the US, similar predictive algorithms have been in place for more than a decade. In 2016, ProPublica raised the alarm about the “risk assessment” algorithms used in the US justice system. These assessments assign a score for a person’s likelihood of reoffending, based on information such as education level, contact with criminals, proximity to people who use drugs, and previous involvement in incidents such as fights. But ProPublica’s investigation highlighted a terrifying pattern: being Black landed people with a much higher risk score, even when criminal history and background were taken into account.
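To make the mechanism concrete, here’s a minimal, purely illustrative sketch in Python (synthetic data, hypothetical feature names, not a reconstruction of any real system). Two groups have identical underlying reoffending rates, but one is policed more heavily, so its recorded police contacts are inflated. A model trained only on those records - with no ethnicity column at all - still scores the over-policed group as riskier:

```python
# Illustrative simulation: identical true risk, biased records, biased scores.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

group = rng.integers(0, 2, n)        # 0 = group A, 1 = group B (hypothetical)
true_risk = rng.random(n) < 0.20     # same 20% base rate for everyone

# Recorded police contact: driven partly by true risk, but inflated for
# group B purely through heavier policing - the institutional bias.
contacts = rng.poisson(1 + 2 * true_risk + 1.5 * group)

# Train on the recorded data only; note there is no "group" column here.
X = contacts.reshape(-1, 1)
model = LogisticRegression().fit(X, true_risk)
scores = model.predict_proba(X)[:, 1]

print(f"mean risk score, group A: {scores[group == 0].mean():.3f}")
print(f"mean risk score, group B: {scores[group == 1].mean():.3f}")
# Despite identical true risk, group B's average score comes out higher,
# because the proxy feature records who gets policed, not who offends.
```

The model never sees ethnicity; it only sees a feature that encodes who gets policed. Bias doesn’t need to be programmed in. It arrives with the data.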
Even before the investigation, then US Attorney General Eric Holder had warned of the danger in 2014: “Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice.”
For Writers: Take Notes
For writers working in dystopian or speculative fiction, stories like this are more than material. They are a responsibility. When governments use data not to protect us but to sort us into categories of threat, writers must step in to explore the consequences. Our job is to trace the thin threads between present-day policies and their terrifying conclusions. Because fiction can help us feel what statistics obscure: that behind every “risk assessment” is a real person with a life.
Dystopian fiction doesn’t invent the future. It just has to expand what’s already here.