Unconscious bias in digital forensic analysis creates “undetected miscarriages of justice,” according to a study to be published in Forensic Science International: Digital Investigation.
Digital forensics, or the search, recovery and analysis of evidence found on a suspect's electronic devices, has become increasingly common as more and more people use technology to communicate, leaving traces of evidence behind, reports The Guardian in a summary of the study.
According to The Guardian, 90 percent of today’s criminal cases contain digital evidence.
The study found that a person analyzing digital forensics could interpret data differently based on information or comments given to them about the suspect or the case prior to conducting any of their analysis.
If analysts had been told that the suspect was innocent before conducting their digital search, they were much less likely to find incriminating evidence, even when it existed, researchers found.
The study was conducted by Nina Sunde, a police superintendent at the Norwegian Police University College, and Itiel Dror, a professor at University College London.
The same held true in the opposite circumstances: analysts who had reason to believe the suspect was guilty before starting their analysis were more likely to find incriminating evidence than analysts who had no prior opinion or knowledge of the case.
“We have every reason to believe that an expert acting in good faith, but through a mistake of interpretation, could easily mislead a courtroom,” David Gresty, a senior lecturer at the University of Greenwich, said in The Guardian article.
“Without the defense instructing another expert to review the evidence it is entirely possible this could go unnoticed, and realistically it is likely there are undetected miscarriages of justice where cases have relied heavily on digital evidence,” he added.
Sunde and Dror conducted the study by giving the same hard drive of evidence to 53 different digital analysts across eight countries.
Some analysts were given details that framed the suspect in a guilty light, while some were given reason to believe the suspect was innocent.
The article did not list which countries were included in the study.
The results showed that each analyst's preconceptions about whether the suspect was guilty played a large part in their findings. According to the article, the study also revealed inconsistencies even among analysts who were given the same preliminary context.
According to the article, the report opens up a larger issue of bias within a system that should be solely focused on searching for evidence. Unlike traditional forensic science, which has been practiced and refined for hundreds of years, digital forensics has only been evolving since the global boom in technology.
The study illustrates a lack of standards across digital forensic analysis, its authors said.
“Digital forensics examiners need to acknowledge that there’s a problem and take measures to ensure they’re not exposed to irrelevant, biased information,” said Dror.
Described as the “wild west” of criminal evidence, forensic data extracted from electronic devices needs to be subject to rigorous procedures for finding and analyzing evidence, so that potential biases have less chance of influencing the results, he added.
While evidence found digitally is without a doubt useful to the prosecution and conviction of an individual, studies show that bias also plays a part, both in the person analyzing the data and in the machine that holds the data.
With law enforcement's handling of privacy drawing growing scrutiny, whether through the use of forensic genealogy, predictive policing or biased algorithms, the biases associated with digital forensics could further erode trust in law enforcement.
Emily Riley is a TCR Justice Reporting intern.