

Machine learning could solve the US's police violence issue

By flagging high-risk officers, the system allows police departments to limit the chance for violent events.

Alexandru Micu
August 2, 2016 @ 12:25 pm


The Charlotte-Mecklenburg Police Department of North Carolina is piloting a new machine-learning system which it hopes will combat the rise of police violence. Police brutality has been a growing issue in the US in recent years.

The system combs through the police’s staff records to identify officers with a high risk of causing “adverse events” — such as racial profiling or unwarranted shootings.

Image credits Jagz Mario / Flickr.

A University of Chicago team is helping the Charlotte-Mecklenburg PD keep an eye on their police officers, and prevent cases of police violence. The team feeds data from the police’s staff records into a machine learning system that tries to spot risk factors for unprofessional conduct. Once a high-risk individual is identified, the department steps in to prevent any actual harm at the hands of the officer.

Officers are people too, and they can be subjected to a lot of stress in their line of work. The system is meant to single out officers who might behave aggressively under stress. All the information on an individual’s record — details of previous misconduct, gun use, their deployment history, how many suicide or domestic violence calls they have responded to, et cetera — is fed into the system. The idea is to prevent incidents in which a stressed officer lashes out, such as the case in Texas where an officer pulled his gun on children at a pool party after responding to two suicide calls earlier that shift.
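As a rough illustration of the kind of pipeline described above, the sketch below scores officers from record-level features with an off-the-shelf classifier. The article does not describe the team's actual model, so the feature names, the toy data and the random-forest choice are all illustrative assumptions, not the Chicago team's implementation.

```python
# Hypothetical sketch: score officers for adverse-event risk from staff records.
# Feature names, toy data and model choice are assumptions for illustration only.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# One row per officer, built from staff records (toy data).
records = pd.DataFrame({
    "prior_misconduct_count": [0, 3, 1, 5],
    "gun_use_incidents":      [0, 1, 0, 2],
    "years_on_current_beat":  [4, 1, 7, 2],
    "suicide_calls_last_90d": [1, 6, 0, 8],
    "dv_calls_last_90d":      [2, 5, 1, 9],
    "adverse_event":          [0, 1, 0, 1],  # label: later adverse incident
})

X = records.drop(columns="adverse_event")
y = records["adverse_event"]

# In practice the model would be trained on historical records and used to
# score current officers; here we fit and score the same toy table.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Risk scores let a department rank officers and intervene at the top of the
# list, instead of flagging a majority of the force.
risk_scores = model.predict_proba(X)[:, 1]
print(risk_scores)
```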

“Right now the systems that claim to do this end up flagging the majority of officers,” says Rayid Ghani, who leads the Chicago team. “You can’t really intervene then.”

But so far, the system has shown some pretty impressive results. It retrospectively flagged 48 out of 83 adverse incidents that happened between 2005 and now – 12 per cent more than Charlotte-Mecklenburg’s existing early intervention system. Its false positive rate – officers flagged as high risk by the system who didn’t go on to behave aggressively – was 32 per cent lower than the existing system’s.
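To make those two figures concrete, here is how such a retrospective evaluation might be tallied. The 48-of-83 count comes from the article; the officer counts and false-flag counts below are placeholders invented for the example, since the article only reports a 32 per cent relative reduction.

```python
# Sketch of the retrospective evaluation described above.
# 48/83 is from the article; the other counts are placeholder assumptions.

adverse_incidents = 83        # adverse events since 2005 (article)
flagged_by_new_system = 48    # of those, how many the new model flagged (article)

hit_rate = flagged_by_new_system / adverse_incidents
print(f"Retrospective hit rate: {hit_rate:.0%}")  # ~58%

# False positive rate: officers flagged as high risk who had no adverse event,
# divided by all officers with no adverse event. Placeholder counts:
officers_without_adverse_event = 1500
false_flags_existing = 300
false_flags_new = 204         # roughly 32% fewer, matching the reported gap

fpr_existing = false_flags_existing / officers_without_adverse_event
fpr_new = false_flags_new / officers_without_adverse_event
reduction = 1 - fpr_new / fpr_existing
print(f"False-positive rate reduction: {reduction:.0%}")  # ~32%
```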

Ghani’s team is currently testing the system with the Los Angeles County Sheriff’s Department and the Knoxville Police Department in Tennessee. They will present the results of their pilot system at the International Conference on Knowledge Discovery and Data Mining in San Francisco later this month.

So the system works, but exactly what should be done after an officer has been flagged as a potential risk is still up for debate. The team is still working with the Charlotte-Mecklenburg police to find the best solution.

“The most appropriate intervention to prevent misconduct by an officer could be a training course, a discussion with a manager or changing their beat for a week,” Ghani adds.

Whatever the best course of action is, Ghani is confident that it should be implemented by humans, not a computer system.

Or adorable toy police cars, at least.
Image via Pixabay.

“I would not want any of those to be automated,” he says. “As long as there is a human in the middle starting a conversation with them, we’re reducing the chance for things to go wrong.”

Frank Pasquale, who studies the social impact of algorithms at the University of Maryland, is cautiously optimistic.

“In many walks of life I think this algorithmic ranking of workers has gone too far – it troubles me,” he says. “But in the context of the police, I think it could work.”

He believes that while such a system for tackling police misconduct is new, it’s likely that older systems created the problem in the first place.

“The people behind this are going to say it’s all new,” he says. “But it could be seen as an effort to correct an earlier algorithmic failure. A lot of people say that the reason you have so much contact between minorities and police is because the CompStat system was rewarding officers who got the most arrests.”

CompStat, short for Computer Statistics, is a police management and accountability system used to implement the “broken windows” theory of policing — the idea that severely punishing minor infractions like public drinking and vandalism helps create an atmosphere of law and order, and will thus bring down serious crime. Many police researchers have suggested that the approach has led to the current dangerous tension between police and minority communities.

Pasquale warns that the University of Chicago system is not infallible. Just like any other system, it’s going to suffer from biased data — for example, a black police officer in a white community will likely get more complaints than a white colleague, he says, because the police can be subject to racism, too. Giving officers some channel to seek redress will be important.

“This can’t just be an automatic number cruncher.”

