New Tools for Measuring Prosecutors


Here, masquerading as a prosaic set of management tools, is a little fuel for a paradigm shift in criminal justice—for a move toward a culture of safety.

Early in the current wave of criminal justice reform, activists targeted two under-used levers for change: the power of data and the central role of the elected prosecutor.

A recent initiative housed at Florida International University and Loyola University Chicago, supported by the John D. and Catherine T. MacArthur Foundation and the Charles Koch Foundation, has brought those two resources into harness.

The three-year project has produced a list of 55 “indicators” for measuring the performance of prosecutors’ offices. Four local prosecutors’ offices took part in the exploration.

The “dashboard” this list constitutes moves beyond our blinkered tradition of judging prosecutors exclusively on crime and conviction rates and musters a range of measures of three dimensions:

        • capacity and efficiency;
        • community safety and well-being; and
        • fairness and justice.

The idea is that prosecutors equipped with data can become engines of reform.

This is a big stride in the right direction.

Still, as the dashboard’s developers recognize, the road to criminal justice reform and community safety is a long one.

The lasting contribution of this innovation could turn out to be the new light it sheds on just how much further we still have to go, and on the new vehicles we will need to develop before we can complete the journey.

The Lonely Warriors

If you learned about prosecution life from the media, you will see the job as Tom Wolfe’s Bonfire of the Vanities prosecutor, Larry Kramer, saw it when setting out on his career.

No dreary big firm practice for Larry:

He, Kramer, would embrace life and wade up to his hips in the lives of the miserable and the damned and stand up on his feet in the courtrooms and fight, mano a mano, before the bar of justice.

This never accurately described a contemporary prosecutor’s life; fewer than three percent of criminal cases go to a jury trial. The work day usually consists of managing an ever-growing heap of case files, answering docket calls, wrangling cops and witnesses, and preparing for hearings that never happen.

But just as battle remains the organizing event of military life even though very few members of the military personally experience it, courtroom combat still permeates the prosecutorial atmosphere.

Well after he should have known better, Larry Kramer could still argue:

Who was more manly than the young prosecutor, who stood not ten feet from the accused, with nothing between them but thin air, and hurled the charges of the People in his teeth?

Advocates trying to coax prosecutors into reform aren’t oblivious to this mindset. They know that the whole idea of “reform” carries with it the unwelcome implication that prosecutors have not achieved perfection in the past.

So, the reformers have taken a tactful approach, and most of their public statements conjure up images of all-powerful prosecutors doing Justice Reform in their lonely, heroic way.

Organizations such as Fair and Just Prosecution or the Institute for Innovation in Prosecution—although I suspect they know better—are usually careful to leave open the possibility that a progressive prosecutor, suitably mounted on a white charger, can ride to our rescue and straighten this mess out on his or her own.

The first thing that strikes you about the 55 indicators on the new dashboard is that they are not the traditional criteria: conviction rates (“We have the highest!”) and sentences (“We got the longest!”).

The dashboard widens the lens, and takes account of metrics bearing on office capacity, such as the ability to promptly recognize and drop weak or worthless cases, and community safety, such as the felony recidivism rate of diversion program graduates.

But a more important element of the dashboard is that the vast majority of its indicators record events that did not occur within a hermetically sealed prosecution silo, but that were unmistakably system outcomes.

If you look at the dashboard for more than a minute, you will have to recognize that to make progress on any metric it will not be enough to optimize each criminal justice system component; you have to understand the components’ interactions.

The rate of cases rejected at filing, for example, is a measure that depends on the prosecutor’s office’s choices, but it also depends on the kinds of cases the cops bring in.

The rate of acquittals implicates the prosecution’s charging decisions, but it also involves the police investigations, the crime scene techniques, the forensic capabilities of the jurisdiction, the exculpatory evidence turned over (or withheld)—even the performance of the defense bar and the trial court.

Recidivism among probationers involves the original sentencing process and its inputs. That process was certainly influenced by the prosecution, but it was not unilaterally controlled by it.

The diagnostic picture provided by the offender’s mental health evaluations and court contacts played a role. The quality of the probation office’s supervision could have contributed to the outcome. The range, nature, and capacity of the rehabilitative programs available are always an issue.

What emerges from the dashboard approach is something greater than the sum of its individual answers.

A routine practice of reckoning scores for the 55 indicators will eventually create a powerful undertow toward the recognition that each of the outcomes marked down in the process—the good ones and the bad ones—emerges from the contributions of many players: that “the prosecutor did x because the police did (or didn’t) do y.”

Ask “Who is responsible for this outcome?” and the answer will be “Everyone involved, to one degree or another,” and “everyone” in this context includes distant actors who set the budgets, developed the legal architecture, shaped the economic conditions in the neighborhoods, even stood by and did nothing.

Elected prosecutors who bring this dashboard back to their offices can expect no warm welcome from the Larry Kramers in their ranks.

Vehement cries of “I didn’t sign on to be a social worker!” or “I’m a trial lawyer, not an accountant!” will echo through the halls. The dashboard may be seen as a harbinger of a system of surveillance to be gamed or evaded with covert work rules—as a curb on the autonomy that attracted some lawyers to prosecution work in the first place.

But if the head prosecutors stick with it and use these indicators fully and consistently, an innovation that at first seems intended to radically simplify things by breaking the world down into metrics will teach an indispensable lesson about complexity.

It will show that everyone’s work in criminal justice affects everyone else’s, and that everyone in the system is simultaneously shaped by conditions imposed independently from the outside.

If that lesson is learned, there might be a culture change on the horizon: from our Battle Model tradition to a Safety Model prosecution function that sees collaboration—not adversarial combat or inter-agency blame-shifting—as the way forward.

This will help on the inter-agency level, and the fact that the dashboards are envisioned as transparent windows into operations will foster alliances and cooperation with other reform-minded actors in the communities.

But the need for collaboration on the case level will also be illuminated by the dashboards. A line prosecutor trying to arrive at a case disposition that serves community safety and justice (i.e., the best sentence rather than the longest) will need input from the probation office, the community, even from the defense.

The benchmarks that the dashboards lay out help to make that clear. On these metrics no one succeeds—or fails—alone.

Big Data Needs Thick Data

The idea of using data in prosecutors’ offices is not entirely new. (I can remember, during my three months as a Special Assistant United States Attorney in the mid-1970s, wrestling with an infant Prosecution Management Information System called PROMIS.)

Still, the general data-free state of criminal justice practice has amounted to something like a scandal.

The new prosecution dashboard obviously mitigates that failing, but—less obviously—it simultaneously demonstrates that while data is necessary it is not by itself sufficient.

The data-driven classifications that the dashboard enables require a complementary capacity to provide full narrative descriptions: the data points need contextualizing, “thick data.”

This point—and the earlier point about the necessity of multi-stakeholder collaboration—is made by John Chisholm, the Milwaukee District Attorney.

Chisholm worked with the dashboard project and transparently posts his office’s scores. He believes in data.

But Chisholm also joined Milwaukee’s veteran public defender office head, Tom Reed, in writing an insightful paper arguing that to make real progress we need more than the count: we need the capability to probe the meaning behind the Big Data visualizations.

Chisholm and Reed argued that institutions such as the Milwaukee Homicide Review Commission and regular practices such as the Sentinel Events Reviews explored by the National Institute of Justice can provide the stories behind the count—get past what happened to how it happened and why, and how to prevent it happening again.

They pointed to the Sentinel Event Review of a complex homicide convened by Chisholm and conducted by a team of police, prosecutors, juvenile justice supervision workers, educators, and public defenders that uncovered a systemic “structural secrecy”—the necessary information was in the system somewhere, but no one making the crucial decisions had all of it at the time of decision.

As Robert Wears and Ben-Tzion Karsh observed of the analogous multi-stakeholder Emergency Medicine reviews, “It is from processes like these—detailed explications of individual cases, deeply situated in complex contexts—that insights leading to useful reductions in hazards are likely to emerge.”

James Doyle

In other words, the data dashboard will flag problems and tell us where to look.

But the dashboard itself also shows us that we will need the “thick description” of a full-scale all-stakeholders event review (something quite different from a disciplinary performance review) to understand what we see when we do look, to explain what we have seen to others, and to bring about the changes we need.

James M. Doyle is a Boston defense lawyer and author, and a regular columnist for The Crime Report. He welcomes readers’ comments.

One thought on “New Tools for Measuring Prosecutors”

  1. Thank you, Mr. Doyle, for focusing on the issue of bringing big data into the field of prosecution. I agree that the Prosecutorial Performance Indicators indirectly also measure defense and judicial effectiveness and fairness. For example, a decision to divert might be made by a prosecutor/judge, but defense counsel plays an important role in the defendant’s decision to accept a diversion offer.

    Our team from Florida International University and Loyola University Chicago decided nevertheless to focus on prosecutors for a few reasons.

    First, the prosecutorial field has gone unchallenged for too long, without relying on or publishing data. Researcher-prosecutor partnerships are unfortunately rare yet sorely needed. So we thought that prosecutors would benefit the most from a data-heavy initiative that pushes them to become more data-informed and transparent about their decisions.

    Second, we still believe that prosecutors are the most powerful players in the system, especially given that 95% of cases are disposed of through guilty pleas. As such, we wanted to place an emphasis on their accountability and transparency, because with that type of power comes responsibility for greater effectiveness and fairness.

    Third, we wanted to challenge the historic obsession with conviction rates as a primary measure of success in prosecution, as your piece rightfully acknowledges. The ideals of community wellbeing, racial justice, and charging and conviction integrity, for example, should play a bigger role in assessing an office’s performance over time.

    At the same time, we believe that all parts of the system should have a set of indicators that go beyond the silos and inform the communities about the overall impact. This, however, requires greater support from local, state, and federal government as well as philanthropy.

    Thank you again for pushing for the idea of “thick data.” We as researchers certainly appreciate that sentiment and we are hoping we are no longer alone in that.
