Policing With Algorithms


In the 2002 tech-noir film “Minority Report,” Tom Cruise fights to prove his innocence in a dystopian future where crimes are prevented and punished based on the predictions of three psychic humans called “precogs.”

However, the Hollywood fantasy of stopping crimes before they are committed is no longer just science fiction.

Today it is known as predictive policing.

Instead of psychics, some law enforcement agencies are using mathematical algorithms to predict crime. In 2013, the Chicago Police Department began using mathematical analytics to create what it calls “heat lists” — catalogs of people considered statistically more likely to be involved in violent crime, based on the weighing of multiple risk factors such as an individual’s arrest records, known associates, and warrant status.
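The arithmetic behind such a list can be illustrated with a simple weighted-scoring sketch. Every factor and weight below is hypothetical, invented only to show the mechanics; Chicago has not published its actual formula.

```python
# A minimal sketch of how a weighted "heat list" score might be computed.
# Every factor and weight here is hypothetical -- the Chicago Police
# Department has not published its actual formula.

HYPOTHETICAL_WEIGHTS = {
    "prior_arrests": 1.5,       # points per prior arrest
    "violent_associates": 2.0,  # points per known violent associate
    "open_warrant": 3.0,        # flat addition if a warrant is outstanding
}

def risk_score(person: dict) -> float:
    """Combine the weighted risk factors into a single score."""
    score = person["prior_arrests"] * HYPOTHETICAL_WEIGHTS["prior_arrests"]
    score += person["violent_associates"] * HYPOTHETICAL_WEIGHTS["violent_associates"]
    if person["open_warrant"]:
        score += HYPOTHETICAL_WEIGHTS["open_warrant"]
    return score

def heat_list(people: list[dict], top_n: int = 10) -> list[dict]:
    """Rank individuals by score and keep the top_n highest."""
    return sorted(people, key=risk_score, reverse=True)[:top_n]
```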

Today, commercial companies like PredPol and HunchLab offer police the potential, based on the results of these complex algorithms, to predict when and where crimes are likely to occur.

With the ultimate goals of faster response and crime prevention, predictive analytics is fast becoming a legitimate tool for the criminal justice system.

“Predictive policing… is really simply about algorithmic methods that are designed to try and help reduce uncertainty in the decision-making that police officers have to make every day when they’re on the street,” Prof. Jeff Brantingham of UCLA, a co-founder and Chief of Research and Development at PredPol, told a recent conference at New York University examining the impact of predictive analytics on human rights.

While conceding that “eliminating uncertainty is impossible,” Brantingham focuses his work on predicting where and when crime is most likely to occur, based on historical data of past crimes in a given neighborhood.
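A stripped-down illustration of that idea: divide a neighborhood into grid cells, weight each past incident so recent crimes count more than old ones, and flag the highest-scoring cells for patrol. The sketch below shows only this general historical-data approach; it is not PredPol’s actual, considerably more sophisticated model.

```python
import math
from collections import defaultdict

# A simplified place-based sketch: score each grid cell of a neighborhood
# by its past incidents, discounting older events exponentially. This
# illustrates the general historical-data approach; it is NOT PredPol's
# actual model.

def hotspot_scores(incidents, half_life_days=30.0):
    """incidents: iterable of (cell_id, days_ago) pairs."""
    decay = math.log(2) / half_life_days
    scores = defaultdict(float)
    for cell, days_ago in incidents:
        scores[cell] += math.exp(-decay * days_ago)  # recent events count more
    return scores

def top_cells(incidents, k=3):
    """Return the k cells most likely to be flagged for patrol."""
    scores = hotspot_scores(incidents)
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Example: cells are (row, col) grid squares; the number is days ago.
history = [((2, 3), 1), ((2, 3), 5), ((0, 1), 40), ((4, 4), 2), ((2, 3), 60)]
print(top_cells(history))  # [(2, 3), (4, 4), (0, 1)]
```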

According to PredPol’s website, this technology yielded positive results for the Los Angeles Police Department (LAPD). In its Foothill Area alone, crime decreased 23 percent between 2009 and 2013. Since then, the LAPD has applied PredPol’s technology to at least 14 of its divisions.

Is Predictive Policing Bias-Free?

Despite these apparently positive results—and the promise that predicting criminal activity can help police reduce it—skeptics warn that the technology may only reinforce pre-existing biases.

“If you look at history, if you look at overall patterns, there are patterns of disparate policing, often when it comes to race and also socioeconomic status,” said Rachel Levinson-Waldman, senior counsel to the Brennan Center’s Liberty and National Security Program and author of the report What The Government Does With Americans’ Data.

In a recent phone interview, Levinson-Waldman pointed out that an algorithm relying on these patterns of discriminatory policing to decide where officers should go next would naturally send them back to where they have already been, because that is all it knows.

This could not only potentially “magnify those [discriminatory] patterns,” but also create an illusion of neutrality, with the falsely comforting notion that “there’s data involved, and there’s an algorithm, and computers aren’t discriminatory,” she added.
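Her point can be made concrete with a toy simulation: give two precincts the same underlying crime rate, start one with more recorded incidents because it was patrolled more heavily in the past, and allocate patrols in proportion to the records. All of the numbers below are invented purely for illustration.

```python
import random

random.seed(1)

# Toy feedback-loop simulation. Both precincts have the SAME underlying
# crime rate; precinct A simply starts with more recorded incidents
# because it was patrolled more heavily in the past.

TRUE_RATE = 0.5                 # identical chance a patrol records an incident
recorded = {"A": 20, "B": 10}   # historical records skewed toward A

for week in range(10):
    total = sum(recorded.values())
    for precinct in recorded:
        # Allocate 100 patrols in proportion to *recorded* history.
        patrols = round(100 * recorded[precinct] / total)
        # More patrols -> more observed incidents -> more records.
        recorded[precinct] += sum(
            random.random() < TRUE_RATE for _ in range(patrols)
        )

print(recorded)  # the initial 2:1 gap persists, despite identical true rates
```

Because the algorithm only ever sees what officers record, the early disparity feeds on itself.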

Critics like Levinson-Waldman say the assumption that algorithms used in predictive analytics are incapable of racial and socioeconomic discrimination could be misleading—and dangerous.

But Brantingham believes the risk of bias can be addressed through transparency and accountability.

“If we’re going to put these things in the field, we need to do randomized controlled trials and testing of these things to demonstrate that they work in the way that they are supposed to,” he said.

Yet Brantingham also concedes that it is police, not scientists, who will draw conclusions from the data—which in turn requires law enforcement to be accountable for the decisions they make based on predictive policing analytics.

Police are obligated to act, regardless of what any algorithm tells them, “constitutionally or within the framework of human rights guidelines,” he said.

This may also require strict oversight and regulation by authorities, he added. In other words, the usefulness of predictive analytics, in policing or otherwise, depends on how well such regulatory measures can prevent abuses.

New York Test Case

The first test case exploring the wider potential of policing algorithms is already underway in New York City.

In January, the Citizens Crime Commission of New York City launched the Predictive Prevention Lab. The Commission, a nonpartisan group, says the lab is aimed at tackling issues such as cybercrime and youth gun violence by marrying technology with crime prevention.

It will use applications such as CyberSmart, a web-based intelligent tutoring system intended to improve cybercrime-prevention knowledge and reduce risk, and F.A.S.T. (Facebook Analytical Scanning Tool), which analyzes social media posts to predict potential violence in real time and alert anti-violence professionals.
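The Commission has not published F.A.S.T.’s internal workings, but a toy version of keyword-based post scanning conveys the general idea. Everything below, from the watchlist terms to the threshold, is made up for illustration.

```python
# A toy illustration of keyword-based post scanning. The watchlist terms,
# threshold, and alert handling are entirely hypothetical -- the Commission
# has not published how F.A.S.T. actually scores posts.

TRIGGER_TERMS = {"gun", "fight", "shoot"}  # hypothetical watchlist

def flag_post(text: str, threshold: int = 2) -> bool:
    """Flag a post when enough watchlist terms appear in it."""
    words = set(text.lower().split())
    return len(words & TRIGGER_TERMS) >= threshold

def scan_stream(posts):
    """Yield posts that should be routed to an anti-violence professional."""
    for post in posts:
        if flag_post(post):
            yield post

for alert in scan_stream(["there is gonna be a fight with a gun tonight",
                          "great game today"]):
    print("ALERT:", alert)
```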

The data gathered through these applications will not only help identify the root causes of certain types of crime, but will also suggest constructive ways of recognizing and responding to their warning signs.

“We are very focused on behavior and (on) understanding risk factors that we can change to help better somebody’s life,” Stephanie Ueberall, Director of Violence Prevention for the Citizens Crime Commission, told The Crime Report.

Similar data-based approaches are under development to strengthen cybersecurity.

“These are universal tools that can be used for many other purposes outside crime prevention,” added Ina Wanca, Director of Cybercrime Prevention Initiatives for the Commission’s Predictive Prevention Lab.

For example, an application called “Work.Train.Forward” is designed to identify students struggling with motivation and self-confidence and connect them with tutors to resolve those issues. Such systems are already being used in some schools to improve writing and study skills.

Predictive algorithms may be a beneficial tool for social sciences, but do they cross ethical lines when they’re deployed in criminal justice?

Levinson-Waldman laughed off the idea of arrests and incarceration based on any psychic-like premonitions.

“I don’t think we’re in Minority Report world,” she said.

Nevertheless, no one underestimates the potential for misuse—particularly if the tools become a source of conflict between law enforcement and urban communities already concerned that they are the targets of biased policing or profiling.

As algorithms become more widely used as a policing tool, the pressure on regulators to make their deployment as transparent and accountable as possible will increase.

Isidoro Rodriguez is a John Jay College student serving as a journalism intern with The Crime Report. He welcomes readers’ comments.
