A Code of Silence on Criminal Justice Software

Proprietary algorithms are flooding the criminal justice system. Machine learning systems deploy police officers to “hot spot” neighborhoods. Crime labs use probabilistic software programs to analyze forensic evidence. Judges rely on automated “risk assessment instruments” to decide who should be released on bail, or even what sentence to impose, the Washington Monthly reports. Supporters say these tools help correct bias in human decision-making and can reduce incarceration without risking public safety by identifying prisoners who are unlikely to commit future crimes if released. Critics argue that the tools disproportionately harm minorities and entrench existing inequalities in criminal justice data under a veneer of scientific objectivity.

As this debate plays out, the tools come with a largely unnoticed problem: ownership. With rare exceptions, the government doesn’t develop its own criminal justice software; the private sector does. The developers of these new technologies often claim that the details of how they work are “proprietary” trade secrets and, as a result, cannot be disclosed in criminal cases. In other words, private companies increasingly purport to own the means by which the government decides which neighborhoods to police, whom to incarcerate, and for how long. They refuse to reveal how these decisions are made, even to those whose life or liberty depends on them.

In Loomis v. Wisconsin, the U.S. Supreme Court is deciding whether to review the use of a system called COMPAS in sentencing proceedings. Eric Loomis pleaded guilty to fleeing a traffic officer and driving a car without the owner’s permission. When COMPAS ranked him “high risk,” he was sentenced to six years in prison. He argued that using the system to sentence him violated his constitutional rights by penalizing him for being male. The system’s owner, a company called Northpointe, refuses to reveal how it weighs sex in its calculations.
