The UK government is working on a new tool that aims to predict who’s most likely to become a murderer. This so-called “homicide prediction project” is led by the Ministry of Justice and has reportedly used data from between 100,000 and 500,000 people.
The project’s goal is to use data science to tackle crime, but experts are sounding the alarm—calling the approach chilling, dystopian, and outright unscientific.

In the film Minority Report (based on a Philip K. Dick short story), set in 2054, police rely on a psychic division to stop crimes before they happen. The plot takes a turn when the main character is framed for a future murder. It’s classic sci-fi: thrilling, philosophical, and ultimately meant to stay in the realm of fiction.
But apparently, it’s leaking into the real world.
The project was brought to light by the NGO Statewatch through documents obtained via Freedom of Information requests. It draws on criminal record data from various sources, including the Probation Service and Greater Manchester Police data from before 2015.
Even more concerning, Statewatch claims the project includes data from individuals who haven’t committed any crimes at all. That means information from people who’ve sought help from the police may be used in the algorithm. This includes domestic abuse victims or individuals experiencing self-harm. Officials strongly deny this, but Statewatch presented part of a data-sharing agreement between the Ministry of Justice and Manchester police that seems to confirm it.
That agreement includes a section titled “type of personal data to be shared.” This section lists health markers expected to have “significant predictive power.” These include data on mental health, addiction, self-harm, suicide risk, vulnerability, and disability.
Sofia Lyall, a researcher at Statewatch, didn’t mince her words:
“The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems.”
Why this is a horrible idea
The first problem with this is that it doesn’t work.
“Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed. Yet the government is pushing ahead with AI systems that will profile people as criminals before they’ve done anything,” says Lyall.
The second problem is that this type of tool can be used to target specific groups. Law enforcement has long had a questionable relationship with AI, and much of the police data feeding these systems is biased. There is mounting evidence, and growing concern, that predictive policing tools can amplify racial biases and lead to disproportionate surveillance and even action against certain communities.
“This latest model, which uses data from our institutionally racist police and Home Office, will reinforce and magnify the structural discrimination underpinning the criminal legal system,” Lyall adds.
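To make the bias problem concrete, here is a minimal, entirely hypothetical sketch (not the Ministry of Justice’s model; every number is invented): two neighbourhoods with the same underlying offence rate, one of which is patrolled twice as heavily. A risk score estimated from recorded arrests alone flags the over-policed area as “riskier”, even though the behaviour in both is identical.

```python
import numpy as np

# Toy simulation (hypothetical, not the MoJ system): two neighbourhoods share
# the same true offence rate, but B is patrolled twice as heavily, so twice
# as many of its offences end up as recorded arrests.
rng = np.random.default_rng(seed=0)

population = 10_000
true_offence_rate = 0.05                # identical in both areas
detection_rate = {"A": 0.3, "B": 0.6}   # share of offences that get recorded

recorded_rate = {}
for area, detection in detection_rate.items():
    offences = rng.random(population) < true_offence_rate
    arrests = offences & (rng.random(population) < detection)
    recorded_rate[area] = arrests.mean()

for area, rate in recorded_rate.items():
    print(f"Neighbourhood {area}: recorded arrest rate of about {rate:.2%}")

# Prints roughly:
#   Neighbourhood A: recorded arrest rate of about 1.5%
#   Neighbourhood B: recorded arrest rate of about 3.0%
# A model trained on these records "learns" that B is twice as risky, and
# sending more patrols to B in response would skew the next round of
# training data even further -- a feedback loop.
```

The same dynamic applies to any data shaped by who gets policed, arrested, or referred to services rather than by who actually offends, which is exactly the kind of data this project relies on.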
There’s no doubt AI and algorithms have an important role to play in policing, but only when they’re based on solid science. When they’re based on pseudoscience and flawed assumptions, the results are bound to be harmful.
AI is reviving dubious sciences
This isn’t the first pseudoscience that AI is being used to revive. Physiognomy, the practice of reading character from faces, was dismissed as junk science centuries ago, yet it’s seeing a high-tech comeback thanks to AI. Despite being widely rejected by researchers as unscientific and racist, physiognomy is now being propped up by a string of new studies claiming that algorithms can infer traits from people’s faces.
A Ministry of Justice spokesperson told The Guardian that this is a research-only project.
“It has been designed using existing data held by HM Prison and Probation Service and police forces on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence. A report will be published in due course.”
But given how secretive the ministry has been about the project, Statewatch is understandably concerned. The line between research and policy implementation can be thin, especially when dealing with systems that carry such high ethical stakes. Critics fear that even pilot programs can lay the groundwork for more invasive surveillance strategies down the road.
“The Ministry of Justice must immediately halt further development of this murder prediction tool. Instead of throwing money towards developing dodgy and racist AI and algorithms, the government must invest in genuinely supportive welfare services. Making welfare cuts while investing in techno-solutionist ‘quick fixes’ will only further undermine people’s safety and wellbeing.”