Many in the justice reform community advocate for a new focus on identifying and intercepting future risks as the organizing criminal justice principle.
Rather than wait for the next victimization and only then impose sanctions for misconduct, they argue, we should use predictive algorithms to compute a score, then handle the locations or the suspects (or defendants, or probationers) according to the scores they have earned.
The algorithms offer a way to anticipate and prevent crimes, rather than just punish them after they’ve been committed.
The traditionally data-free state of criminal justice policy discussion is long overdue for correction, and I agree that the use of predictive data has shown concrete results.
In policing, the development of a science of geographical hot-spot identification has permitted the intelligent direction of resources. By honing the capacity to focus on block-segments (rather than whole neighborhoods) this strategy could even allow for limiting the collateral damage that flows from aggressive crime control practices.
Other efforts, such as the Public Safety Assessment instrument developed by the Laura and John Arnold Foundation, show promise in rationalizing bail decision-making at arraignments. The Public Safety Assessment promises an objective procedure that should encourage timid judges to separate the less dangerous from the more dangerous, and to send the less dangerous home under community-based supervision.
Rather than relying solely on instinct and experience, judges could use the tool to reserve pretrial detention for the data-indicated threats.
Arnold’s PSA is now being extended to assist prosecutors in making important decisions in the pretrial phase, including those related to charging, plea bargaining, and diversion. A movement toward risk-oriented, data-driven (or “intelligence led”) prosecution spearheaded by Manhattan District Attorney Cyrus Vance, Jr., targets high-risk actors for special treatment in surveillance, apprehension, charging, plea leveraging, and sentencing.
But there’s a problem. From beginning to end, the system gradually becomes less interested in imposing just deserts for past acts as things coalesce around a utilitarian strategy of selective deterrence and incapacitation.
“Lock ‘em up,” not for what they have done but for what we can predict they will do.
My own enthusiasm for these developments is muted.
To begin with, a robust debate is underway concerning the role of an existing array of risk assessment tools in aggravating racially disparate impacts. A recent ProPublica series analyzes the startling racial biases built into one widely used proprietary instrument.
Bernard Harcourt of Columbia University has argued that risk has become a proxy for race.
A 2016 study by Jennifer Skeem and Christopher Lowenkamp dismisses Harcourt’s warnings as “rhetoric” but finds that on the level of particular factors (such as many of the criminal history factors on which Arnold’s PSA principally relies) the racial disparities are substantial.
Whether the challenge of developing a neutral risk assessment tool from the race-saturated raw materials we have available can ever be met is an argument I am not statistician enough to join.
Still, a handgun is a neutral tool too—until you pick it up and aim it.
And, for me anyway, there’s something else.
Stand next to an indigent criminal defendant in court and it changes you forever even if you do it only once.
And when you have been doing it for 40 years, as I have, the first thing that strikes you about a risk assessment process is not its statistical elegance; it is the fact that it constitutes a system of portraiture—a system of “representation” expressing the relationship between a copy and an original.
This process moves inexorably away from the flesh-and-blood defendant and fellow citizen standing beside you, to a portrait, to a caricature, to a silhouette, to a score.
It performs a “rendering”: a depiction of the defendant produced by boiling away everything that the system’s operators have learned to treat as inessential to their purposes.
The experience of an African-American man in this country—from our beginnings as a nation up until today—has been one of constant exposure to danger. Threats come from all directions. The anxiety never goes away. Every walk to work can lead to a humiliating encounter. Or a fatal one.
Or a jail term.
It seems perverse to design a criminal justice system in which African American men will be seen as all dangerousness, no vulnerability.
But we are lining ourselves up to drift in that direction.
You may have been moved by Jennifer Gonnerman’s accounts in The New Yorker of the pretrial detention of Kalief Browder and Browder’s subsequent suicide. Now, imagine the doomed young man you met in that series rendered as (or to) “a 3” or “a 6.4.”
That’s what will happen. The score is what the system’s actors will see. The score substitutes for a person in their decision-making, until, of course, it comes time to actually lock someone up: then the real person has to be re-substituted for the score and do the time.
No one advocates for a risk assessment program in which the score is the only information a judge (for example) will use. Everyone agrees that the risk score should be seen as only one tool among many available.
But this is a frail hope.
What we confront here is not what the policymakers seem to picture. It is a self-contained world of cops, lawyers, probation officers, and judges just trying to get through their days. They are not driven by ideological commitments or racist fervor, but they are under intense pressure—from the political and media climates, their caseloads, their peers, their dockets, and the administrators who thirst for “outputs” as performance measures.
Give them a risk assessment tool that predicts accurately 66 percent of the time, and they will know what to do 100 percent of the time.
They will see easily that the risk score is not “everything.” Then they will quickly learn that the score is enough. There will be variations, but those will be tweaks, not fundamental re-examinations. The risk scores will set the prevailing market prices in bail amounts and sentences, and the trend will be upward.
Whatever else it does, the risk score will blaze a path of least resistance.
After all, if a prosecutor or judge locks someone up on the basis of a “false positive” risk score, that mistake can, by definition, never be known. If it did somehow become known, well, “The algorithm made me do it.”
But if prosecutors or judges disregard the risk score, and send someone back to the street on the basis of their own personal “false negative” risk prediction, they can end up on the front pages when the defendant reoffends.
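A back-of-the-envelope sketch shows how little comfort “accurate 66 percent of the time” actually offers. The 66 percent figure comes from the discussion above; the population size and the 20 percent base rate of reoffending below are assumptions chosen purely for illustration:

```python
# Back-of-the-envelope sketch, not a model of any real instrument.
# The 66 percent figure comes from the essay; the population size and
# the 20 percent base rate of reoffending are assumed for illustration.

def confusion_counts(population: int, base_rate: float,
                     sensitivity: float, specificity: float):
    """Return (tp, fp, fn, tn) for a screening tool with the given rates."""
    reoffend = round(population * base_rate)
    safe = population - reoffend
    tp = round(reoffend * sensitivity)   # correctly flagged high-risk
    fn = reoffend - tp                   # released, then reoffend: the visible errors
    tn = round(safe * specificity)      # correctly sent home
    fp = safe - tn                      # detained needlessly: the invisible errors
    return tp, fp, fn, tn

tp, fp, fn, tn = confusion_counts(10_000, 0.20, 0.66, 0.66)
flagged = tp + fp
print(f"flagged as high-risk: {flagged}")                       # 4040
print(f"false positives among them: {fp} ({fp / flagged:.0%})")  # 2720 (67%)
```

On these assumed numbers, two out of three people flagged as high-risk would not in fact have reoffended, and each one detained on that score becomes exactly the kind of error that, by definition, no one can ever observe.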
And when the frontline practitioners mobilize “more than” the risk score in their decisions, the “more” will be drawn from exactly the reservoir of instinct, intuition, racially correlated factors, and (supposedly expert) experience that the objective risk score was supposed to supersede in the first place.
Meanwhile, the capacity to gather more informative facts about an individual—the “full transcript” of his life, as the anthropologists would say—will wither as the felt need for that information declines. Is he also a concert pianist? There’s no place for that on the form.
The inclination to seek the information will dry up as it grows scarcer and the effort required to obtain it increases.
The score comes first. After all, the risk score is objective, and it is right there, this morning, on the report; we can just use that. We have hundreds of these cases to sort; we can find this guy’s rank among them from the score we have, and slot him in. Then we can deal with tomorrow’s list.
None of this will be accomplished on Day One with a clap of thunder. Rather, as safety expert Sidney Dekker explains, “There is a long steady progression of small incremental steps that unwittingly take an operation toward its boundaries. Each step away from the original norm that meets with empirical success (and no obvious sacrifice in safety) is used as the next basis from which to depart just that little bit more.”
The “safety” that will matter here is the practitioners’ own safety, not the safety of communities.
The danger in this tendency is that a campaign to institute an objective, data-based system that lowers risk and avoids racially saturated moralizing can unintentionally threaten to read young African American men out of the moral realm altogether.
The Arnold Foundation’s PSA conscientiously avoids using age of first arrest as a risk factor because age at first arrest strongly correlates with race: very young blacks are arrested far more frequently than very young whites. To include that factor puts a thumb on the scale.
But that first harassing arrest at the age of 13 may have been the pivotal traumatic event in a young African-American’s life. You can’t understand his situation without it.
To avoid one problem with racial difference (overweighting a risk score with a racial correlate) this approach risks aggravating another: it ignores a meaningful difference in the life courses of individuals in our society that can have important explanatory value.
The logical endpoint of this drift, if we do not redirect it, is a version of the criminal justice system imposed by English utilitarians such as John Stuart Mill and James Fitzjames Stephen in imperial India. Our treatment of gangs, suspected gang associates, and potential gang members will begin to resemble the Criminal Tribes Act the utilitarians imposed in India: wide-scale preventive measures applied on the basis of a group classification.
Accountability matters to the public.
Reaction to the past year’s police shootings makes it clear that our communities do not regard imposing just deserts for misconduct to be an archaic concept. The community does not seem to accept the idea that we must allow for a certain amount of randomly inflicted collateral damage in our striving for crime control success.
With risk assessment as our focal point we will be deploying a public health strategy developed on a population level to answer what the community believes should be an individual diagnostic problem: “Does this guy deserve ten years?” As things stand, the principal method (incarceration as incapacitation) available for interdicting future crimes is the same method we use for sanctioning past ones.
I don’t argue for eliminating risk assessment as a tool. I do argue for a little humility in wielding that tool. Turning people into integers is not a promising way to build trust in the law or to re-knit a divided society in this context.
We need, as Anthony Braga has noted, to make criminal justice something we do with communities, not (as reformers tend to believe) “for” communities, or (as communities increasingly see it) “to” communities.
We will have to recover the determination to see the people we are working with as fully human figures, not as a data set of alien Others, and dedicate ourselves to building the resources that will allow the criminal justice system to address the challenges those humans face.
The data revolution in criminal justice provides us with an important way to recognize when we have a problem. It gives us a ladder we can climb to get a new perspective. But when it comes to solving those problems we’ll have to do what Yeats, at the end of his life, was forced to recognize as crucial.
We’ll have to “Lie down where all the ladders start / In the foul rag and bone shop of the heart.”
James Doyle, a Boston defense lawyer and author, is a frequent commentator for The Crime Report. He welcomes comments from readers.