

Machine learning could solve the US's police violence issue

By flagging high-risk officers, the system allows police departments to limit the chance for violent events.

Alexandru Micu
August 2, 2016 @ 12:25 pm


The Charlotte-Mecklenburg Police Department of North Carolina is piloting a new machine-learning system which it hopes will combat the rise of police violence. Police brutality has been a growing issue in the US in recent years.

The system combs through the police’s staff records to identify officers with a high risk of causing “adverse events” — such as racial profiling or unwarranted shootings.

Image credits Jagz Mario / Flickr.

A University of Chicago team is helping the Charlotte-Mecklenburg PD keep an eye on their police officers, and prevent cases of police violence. The team feeds data from the police’s staff records into a machine learning system that tries to spot risk factors for unprofessional conduct. Once a high-risk individual is identified, the department steps in to prevent any actual harm at the hands of the officer.

Officers are people too, and they can be subjected to a lot of stress in their line of work. The system is meant to single out officers who might behave aggressively under stress. All the information on an individual’s record — details of previous misconduct, gun use, their deployment history, how many suicide or domestic violence calls they have responded to, et cetera — is fed into the system. The idea is to prevent incidents in which stressed officers lash out, such as the case in Texas where an officer pulled his gun on children at a pool party after responding to two suicide calls earlier that shift.
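The kind of record-based risk scoring described above can be sketched in a few lines of Python. To be clear, this is a minimal illustration, not the Chicago team's actual model: the feature names, weights, and threshold below are all made-up assumptions.

```python
# Minimal sketch of feature-based risk scoring over officer staff records.
# Feature names, weights, and the threshold are illustrative assumptions,
# NOT the University of Chicago team's actual model.

FEATURE_WEIGHTS = {
    "prior_misconduct_reports": 0.9,
    "gun_draw_incidents": 0.7,
    "suicide_calls_last_quarter": 0.5,
    "domestic_violence_calls_last_quarter": 0.4,
}

RISK_THRESHOLD = 2.0  # score above which an officer is flagged for review


def risk_score(record: dict) -> float:
    """Weighted sum of stress and conduct indicators from a staff record."""
    return sum(weight * record.get(feature, 0)
               for feature, weight in FEATURE_WEIGHTS.items())


def flag_high_risk(records: dict) -> list:
    """Return the IDs of officers whose score exceeds the review threshold."""
    return [officer_id for officer_id, record in records.items()
            if risk_score(record) > RISK_THRESHOLD]


records = {
    "officer_a": {"prior_misconduct_reports": 2, "suicide_calls_last_quarter": 3},
    "officer_b": {"gun_draw_incidents": 1},
}
print(flag_high_risk(records))  # officer_a scores 0.9*2 + 0.5*3 = 3.3 > 2.0
```

In practice the team's system learns its weights from historical data rather than hand-tuning them, but the shape of the decision is the same: score each officer's record, then hand the highest-scoring cases to a human for intervention.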

“Right now the systems that claim to do this end up flagging the majority of officers,” says Rayid Ghani, who leads the Chicago team. “You can’t really intervene then.”

But so far, the system has shown some pretty impressive results. Applied retrospectively, it flagged 48 of the 83 adverse incidents that occurred between 2005 and now – 12 per cent more than Charlotte-Mecklenburg’s existing early intervention system. Its false positive rate – officers flagged as high risk who did not go on to behave aggressively – was 32 per cent lower than that of the existing system.
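The two figures reported here map onto standard evaluation metrics: the 48-of-83 number is the system's recall on past adverse incidents, and the second is a relative reduction in false positive rate. A quick sketch of how those quantities are computed (the incident counts come from the article; the non-adverse counts in the usage line are placeholders):

```python
# Standard confusion-matrix metrics for a flagging system.
# tp = adverse incidents correctly flagged in advance
# fn = adverse incidents the system missed
# fp = officers flagged who did not have an adverse incident
# tn = officers correctly left unflagged

def recall(tp: int, fn: int) -> float:
    """Share of actual adverse incidents the system flagged."""
    return tp / (tp + fn)


def false_positive_rate(fp: int, tn: int) -> float:
    """Share of non-adverse officers incorrectly flagged as high risk."""
    return fp / (fp + tn)


# 48 of 83 retrospective incidents flagged, per the article:
print(round(recall(48, 83 - 48), 3))
```

Lowering the false positive rate matters as much as raising recall here: every wrongly flagged officer dilutes the department's capacity to intervene, which is exactly the problem Ghani describes with systems that "end up flagging the majority of officers."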

Ghani’s team is currently testing the system with the Los Angeles County Sheriff’s Department and the Knoxville Police Department in Tennessee. They will present the results of their pilot system at the International Conference on Knowledge Discovery and Data Mining in San Francisco later this month.

So the system works, but exactly what should be done after an officer has been flagged as a potential risk is still up for debate. The team is still working with the Charlotte-Mecklenburg police to find the best solution.

“The most appropriate intervention to prevent misconduct by an officer could be a training course, a discussion with a manager or changing their beat for a week,” Ghani adds.

Whatever the best course of action is, Ghani is confident that it should be implemented by humans, not a computer system.


“I would not want any of those to be automated,” he says. “As long as there is a human in the middle starting a conversation with them, we’re reducing the chance for things to go wrong.”

Frank Pasquale, who studies the social impact of algorithms at the University of Maryland, is cautiously optimistic.

“In many walks of life I think this algorithmic ranking of workers has gone too far – it troubles me,” he says. “But in the context of the police, I think it could work.”

He believes that while such a system for tackling police misconduct is new, it’s likely that older systems created the problem in the first place.

“The people behind this are going to say it’s all new,” he says. “But it could be seen as an effort to correct an earlier algorithmic failure. A lot of people say that the reason you have so much contact between minorities and police is because the CompStat system was rewarding officers who got the most arrests.”

CompStat, short for Computer Statistics, is a police management and accountability system used to implement the “broken windows” theory of policing — the idea that cracking down hard on minor infractions like public drinking and vandalism helps create an atmosphere of law and order, and will thus bring down serious crime. Many police researchers have suggested that the approach has led to the current dangerous tension between police and minority communities.

Pasquale warns that the University of Chicago system is not infallible. Just like any other system, it’s going to suffer from biased data — for example, a black police officer in a white community will likely get more complaints than a white colleague, he says, because the police can be subject to racism, too. Giving officers some channel to seek redress will be important.

“This can’t just be an automatic number cruncher.”

