Detectives investigating the murder of Dorka Lisker, a 66-year-old woman found stabbed in her Sherman Oaks, Calif., home in 1983, were convinced they had an open-and-shut case.
The most likely suspect was Lisker’s 17-year-old son Bruce, whose hands were covered in blood when police and medical workers arrived, and who appeared to be high on methamphetamine. Although Bruce had himself called 911, local cops had long considered him a troublemaker. When he “confessed” to the crime, that seemed to seal the case.
Although Lisker later recanted, three inmates in the local county jail where he was detained claimed he had confessed to them.
Lisker wasn’t guilty. But he spent 26 years in prison before a federal judge threw out his conviction, ruling that he had been prosecuted with “false evidence.”
The Lisker case was one of 50 examined by two Texas State University researchers in an attempt to explain why wrongful convictions occur.
“Wrongful convictions are a form of criminal investigative failure,” wrote Kim Rossmo and Joycelyn Pollock in a forthcoming study in the Northeastern University Law Review.
Most analyses of investigative failure have focused on errors of procedure, or malfeasance on the part of authorities—a category covering a broad list of ills, such as eyewitness misidentification, flawed forensic evidence, false confessions, deceitful informants, police and prosecutorial misconduct, and a poor defense.
But these mistakes usually have their roots in, or are compounded by, the psychology of those who are leading the investigation or prosecuting the case, the researchers said.
They argued that the conviction of an innocent individual isn’t necessarily the result of a single investigative mistake. Instead, it’s often a product of a cascading series of errors that could be neutralized if authorities were trained to take into account the fact that once we come to a conclusion about an issue, we rarely pay attention to any evidence that might contradict it.
This tendency, known as “confirmation bias,” played a central role in nearly all the wrongful conviction cases examined by the researchers and, by implication, explains hundreds of others that have been identified.
A report issued this year by the National Registry of Exonerations identified 151 exonerations in 2018 alone, representing individuals who had spent a total of 1,639 years behind bars for crimes they didn’t commit.
“Biases, because they are implicit, are difficult to control,” wrote Rossmo, chair of criminology at Texas State University, and Pollock, Distinguished Professor Emeritus at Texas State’s School of Criminal Justice.
“They function independently of one’s intelligence, and awareness of their dangers makes them no easier to avoid.”
The authors, whose study was funded by the National Institute of Justice through its Sentinel Events Initiative, added that while it was difficult to completely remove confirmation bias and related flaws such as “tunnel vision,” “group think” and “rushing to judgment” from investigative work, “research has shown that specialized training may help mitigate their influence.”
The most promising framework for such training, they wrote, is the technique of “Sentinel Event Reviews,” traditionally used in both transportation and medicine to find the root causes of accidents or mistakes.
The concept assumes that catastrophic mistakes are the products of systemic breakdowns, rather than a single individual’s misconduct or incompetence, and it looks for specific slip-ups (sentinel events), no matter how minor, that—if they had been recognized in time—might have helped avoid tragedy.
The technique is gaining traction, haltingly, in the world of criminal justice. Several prosecutors’ offices now employ “conviction integrity units” to scour case records for evidence that may have been overlooked or for questionable conduct by investigators in convictions where questions have been raised.
But can authorities guard themselves against tunnel vision and related thinking in advance?
The Northeastern University Law Review study applies the Sentinel Event Review concept to develop an effective “risk recipe” that police, prosecutors and others can use as a reality check before they fall victim to confirmation bias.
“While it might be argued that wrongful convictions are ultimately the result of flawed decision-making, multiple wrong decisions by different parties are necessary,” the study said. “(That includes) the decision by the police to arrest the wrong person, the decision by the prosecutor to charge the wrong person, the decision by a judge or jury to convict the wrong person.”
An organizational template that helps investigators understand when they may be prematurely rushing to judgment could include, for example:
- Recognizing that “fear, intense media interest, pressure from politicians, organizational stress, personal ego, or a strong desire to arrest a dangerous offender can all lead to premature judgment;”
- Counteracting the “conviction psychology” of prosecutors, in which there is a “pervasive sense that all defendants are guilty, where racking up convictions is akin to ‘wins’ for a sports team;”
- Fostering a culture inside police departments where alternative views of a case are invited and taken seriously;
- Avoiding “premature” shifts to a suspect-based investigation before every effort has been made to complete satisfactory evidence collection.
“The most certain way to prevent a wrongful conviction is to minimize wrongful arrests of innocent people,” the authors said.
But they added:
It is not safe to assume that a wrongful arrest will be later corrected by the district attorney or that a judge or jury will come to the correct finding. Prosecutors may fail to act as an objective check and balance as they can suffer from the same cognitive biases as police investigators. Early mistakes may never be noticed; even if they are, much damage can still occur.
In several of the cases examined by the authors, “detectives refused to abandon the original suspect, justifying their intransigence through highly convoluted reasoning.”
The antidote, they wrote, was fostering a culture of “critical thinking,” which in turn begins with recognizing that “an entrenched position, even an untenable one, can persist through psychological lethargy and organizational momentum.”
In the Lisker case, for example, detectives had already convinced themselves of his guilt on the basis of what they knew of his criminal record, and didn’t bother to look for any evidence to the contrary. That was compounded by the actions of the prosecutor, who took the word of the jailhouse informants as fact, even though one of them had a history of “overhearing” such confessions.
Confirmation bias can affect any player or combination of players in the justice system, as well as the media and politicians.
The majority of the cases examined by the authors were murders, and although most were adjudicated in the U.S., they also explored examples in Canada and Europe.
The full study can be downloaded here.