Government’s ‘murder prediction’ tool is based on ‘racist’ data
Called ‘chilling and dystopian’
The Ministry of Justice is experimenting with a system using AI to predict crime, but critics say it is based on “dodgy and racist AI and algorithms.”
The UK government is developing a data-driven initiative aimed at identifying those most likely to commit murder, a project critics have slammed as "chilling and dystopian."
Documents obtained through Freedom of Information (FOI) requests by civil liberties watchdog Statewatch revealed the project’s existence.
Originally known as the "Homicide Prediction Project," it has since been rebranded as "Sharing Data to Improve Risk Assessment." The core mission, however, remains: using algorithms and personal data to predict serious violent crime before it happens.
The Ministry of Justice (MoJ) confirmed the programme is currently in the research phase. The department says the purpose is to improve public safety by exploring whether advanced data analytics can enhance existing risk assessment methods used by the prison and probation services.
The project, commissioned during Rishi Sunak's tenure as prime minister, is processing vast datasets from the Probation Service, the Police National Computer and historical records from Greater Manchester Police (GMP).
While officials insist only data from individuals with at least one criminal conviction is being used, Statewatch alleges otherwise.
A section of the data-sharing agreement between the MoJ and GMP, reviewed by Statewatch, includes references to personal data such as the age at which a person first appeared as a victim (including victims of domestic abuse) as well as records of first police contact, regardless of conviction status.
In addition, the programme reportedly analyses "special categories of personal data" including mental health conditions, addiction, suicide risk, disability and incidents of self-harm. Campaigners argue this analysis could unfairly target vulnerable populations.
Justice department sources, speaking to The Register, framed the project as an extension of existing risk-assessment tools already employed in the justice system. They point to the long-established Offender Assessment System (OASys), introduced in 2001, which uses data to predict the likelihood of reoffending, informing decisions on sentencing and prison management.
Sofia Lyall, a researcher at Statewatch, warned of the broader implications: "The Ministry of Justice's attempt to build this murder prediction system is the latest chilling and dystopian example of the government's intent to develop so-called crime 'prediction' systems."
"Time and again, research shows that algorithmic systems for 'predicting' crime are inherently flawed."
Statewatch warns that individuals from more deprived areas, regardless of ethnicity, as well as Black people in particular, are "significantly over-represented" in the data the MoJ is using for its homicide analysis.
The group added that data-driven "predictive" models disproportionately impact racialised communities, further entrenching structural discrimination within the criminal justice system.
The MoJ maintains the tool is not being used operationally and is strictly for academic and methodological evaluation.
A spokesperson said: "This project is being conducted for research purposes only. It has been designed using existing data held by HM Prison and Probation Service and police forces on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence. A report will be published in due course."
However, civil liberties advocates remain sceptical. They fear the project sets a dangerous precedent in pre-emptive policing and echoes the logic of predictive technologies that have faced bans or severe limitations in other democratic countries due to bias and inaccuracy.
Statewatch is calling for the immediate cessation of the project's development, urging the government to invest in welfare services rather than "dodgy and racist AI and algorithms," which Lyall argues undermine safety and wellbeing.