A software application widely used by correctional institutions to predict an inmate’s likelihood of re-offending tends to exaggerate the risks for Hispanics, according to a forthcoming research study in the American Criminal Law Review.
The Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) generally fails to accurately predict actual outcomes in a linear manner, argues Melissa Hamilton, a senior law and criminal justice lecturer at the University of Surrey in the United Kingdom.
Hamilton analyzed a large dataset of more than 7,000 pretrial defendants in Broward County, Florida, who were scored on COMPAS soon after their arrests in 2013 and 2014. The data includes two subsets: one tracks general recidivism, and the other violent recidivism.
The follow-up period was two years.
The general recidivism risk scale considers age at first arrest, age at intake, criminal history, drug problems, and vocational and educational factors such as employment status, possession of a skill or trade, high school grades, and school suspensions.
The violent recidivism scale considers a history of violence and noncompliance with release terms, including previous parole violations and arrests while on probation.
The COMPAS algorithms produce outcomes as decile scores of 1-10, with higher deciles representing greater predicted risk. Hamilton, applying multiple definitions of algorithmic unfairness, found that COMPAS was not well-calibrated for Hispanics.
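For readers unfamiliar with the term, a risk score is "well-calibrated" across groups when defendants who receive the same decile score go on to reoffend at roughly the same observed rate regardless of group membership. The sketch below illustrates how such a check might be computed; the helper function and the data in it are hypothetical and invented for illustration, not drawn from the study's Broward County dataset.

```python
from collections import defaultdict

def calibration_by_group(records):
    """Compute the observed recidivism rate for each (group, decile) cell.

    records: iterable of (group, decile, recidivated) tuples, where decile
    is an integer 1-10 and recidivated is 0 or 1.
    """
    counts = defaultdict(lambda: [0, 0])  # (group, decile) -> [recidivists, total]
    for group, decile, recid in records:
        counts[(group, decile)][0] += recid
        counts[(group, decile)][1] += 1
    return {key: n_recid / n_total
            for key, (n_recid, n_total) in counts.items()}

# Invented example: at decile 8, group B's observed rate falls well below
# group A's -- the pattern a researcher would flag as over-prediction of
# risk for group B at that score level.
data = [
    ("A", 8, 1), ("A", 8, 1), ("A", 8, 1), ("A", 8, 0),
    ("B", 8, 1), ("B", 8, 0), ("B", 8, 0), ("B", 8, 0),
]
rates = calibration_by_group(data)
print(rates[("A", 8)])  # 0.75
print(rates[("B", 8)])  # 0.25
```

A calibration analysis like Hamilton's compares these per-decile observed rates between groups; large, systematic gaps at the same score indicate the tool is not well-calibrated.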
Risk assessments are now being used to inform decisions regarding sentencing, early release, and post-incarceration supervision; such decisions historically were based on the intuition or personal experience of the official responsible for making them.
“Officials are becoming heavily invested in risk assessment tools—with their reliance upon big data and algorithmic processing—to inform decisions on managing offenders according to their risk profiles,” Hamilton wrote.
“Algorithmic risk assessment holds promise in informing decisions that can reduce mass incarceration by releasing more prisoners through risk-based selections that consider public safety.”
This year, The Crime Report reported that a group of more than 100 legal organizations and government watchdog groups, including the ACLU and the NAACP, signed onto “A Shared Statement of Civil Rights Concerns” expressing ethical concerns about the use of algorithms.
The dataset used in the current study was first used in a 2016 investigation by ProPublica, which obtained the data through the Freedom of Information Act. That investigation concluded that COMPAS was biased against black people, in that its algorithm over-predicted high risk for that population.
Absent from the ProPublica investigation, however, were Hispanics, a gap the researcher hoped to fill in the current study.
A copy of the study can be downloaded here.
J. Gabriel Ware is a TCR news intern. Readers’ comments are welcome.