Wrongful Convictions: How We Drift into Error


We usually learn about wrongful convictions by reading news stories or social science analyses. Both go “down and in” to find the broken criminal justice system component that “caused” the catastrophe.

Press reports identify villains: the forensic analyst who botched the analysis, or the prosecutor who hid evidence that would have proved innocence.

Social scientists identify a case feature, such as an eyewitness misidentification, that correlates with exonerations, counting its recurrence, controlling for other features, and statistically analyzing the implications.

But to learn from wrongful convictions to prevent future errors, we need a different model of criminal system failure.

Going “down and in” will never be enough.

We also have to go “up and out”—in other words, to understand the influence of actors distant from the sharp-end operator who is the traditional focus.

We have to see the components in relation to each other, and to their broader context.

Our best wrongful conviction model isn’t the failed machine with a broken spring or stripped bolt, but that familiar category of event which is unprintable here, and which the military in its phonetic alphabet abbreviates to “Charlie Foxtrot.”

Or, to be a little more polite about it, the “organizational accident.”

That’s what medical reformers Dr. Mark Chassin and Dr. Elise Becher discovered when they investigated why surgeons had operated on the wrong woman.

Their analysis of the event showed at least 17 separate errors.

Among them: the patient’s face was draped so that the physicians could not see it. A resident left the lab assuming the attending physician had ordered the invasive surgery without telling him. Conflicting charts were overlooked. Contradictory patient stickers were ignored.

But the crucial point was that none of the 17 errors they catalogued could have caused the wrong-patient surgery by itself.

No single error is enough to cause an organizational accident.

The errors of many individuals converge and interact with system weaknesses, increasing the likelihood that individual errors will do harm. Most of the practitioners involved in these tragedies do not choose to make errors; they drift into them.

Many catastrophic events involve normal people, doing normal work, in normal organizations. They suffer, in human-error expert Charles Perrow’s memorable phrase, “normal accidents.”

These insights apply to a “wrong-man” conviction.

Many wrongful convictions reflect (as Diane Vaughan wrote of the space shuttle Challenger launch decision) “a mistake embedded in the banality of organizational life.”

You might not know it from the news stories or the research reports, but lots of things have to go wrong before the wrong man is convicted.

Yes, the eyewitness has to choose the wrong man from a photo array, but the police have to decide to put him into the array in the first place, design the array’s format, and choreograph its display.

Forensic evidence at the crime scene could have been overlooked or, even if properly collected and tested in the lab, ignored or distorted in the courtroom presentation.

Cell phone, Metrocard, or other alibi information could have been ignored or considered insignificant.

Tunnel vision, augmented by clearance rate and caseload pressures from above, may have overwhelmed the investigators and the prosecutors.

Poorly funded defense counsel may have failed to investigate alternative explanations or to execute effective cross-examination.

There may also have been errors by the witness, the police, the technicians, the prosecutors, the defense, the judge or jury, or the appellate court.

No single error would have been enough without the others. The errors combined and cascaded; then there was a tragedy.

The answer to the question, “Who is responsible for this wrongful conviction?” is almost invariably “Everyone involved, to one degree or another”— if not by making a mistake, then by failing to catch one.

Criminal justice needs the capacity for what medicine calls “forward-looking accountability.”

We have to save some energy for a non-blaming, all-stakeholders process of looking at wrongful convictions, wrongful releases and “near misses” as the fault not of people or systems, but of people in systems.

To prevent future tragedies we have to learn why horrendous choices made by practitioners looked like good choices at the time—why practitioners zigged instead of zagged.

We have to look at the legislators who set the budgets, and at the trainers, supervisors, appellate courts, funders, and others who created the environment in which the sharp-end operators made their disastrous choices.

Only then can we uncover why those choices might look like good choices to the next practitioner who comes along, unless we make some changes.

James Doyle is a Boston attorney and the author of True Witness: Cops, Courts, Science and the Battle Against Misidentification (Palgrave, 2005). The opinions expressed here are his own. He welcomes comments from readers.
