TX Checking Thousands Of Cases Back To 1999 For DNA Analysis Errors


Over the summer, the Texas Forensic Science Commission came to an unsettling conclusion, reports NPR: there was something wrong with how state labs were analyzing DNA evidence. The labs were using an outdated protocol for calculating the probability of DNA matches in “mixtures,” crime scene samples containing genetic material from several people. The error may have affected thousands of cases dating back to 1999. When a lab reran the analysis of a DNA match from a murder case in Galveston, the numbers changed considerably. Under the old protocol, says defense lawyer Roberto Torres, DNA from the crime scene was matched to his client with a certainty of more than a million to one. “When they retested it, the likelihood that it could be someone else was, I think, one in 30-something, one in 40. So it was a significant probability that it could be someone else,” he says. “We have to go back and identify which of those cases involved DNA mixtures where the lab may have given incorrect results,” says Jack Roady, the district attorney in Galveston. “It’s going to be a herculean task, but we’re gonna do it.”

It’s unsettling to find out DNA analysis can vary like this, because it threatens to undermine the deep faith people have placed in the technology. “And it’s not faith they should not have had to begin with,” says Keith Inman, who teaches forensic science at California State University, East Bay. Inman says forensic DNA matching is based on sound science, but sometimes labs get ahead of themselves. What happened in Texas, he says, is that labs adopted cutting-edge “testing kits” that can extract tiny traces of DNA from crime scenes, but those samples were then analyzed with math that isn’t suited to “weak” samples combining DNA from many people. The problem, he says, isn’t limited to Texas. The newest and best analysis method, called “probabilistic genotyping,” takes time to roll out, and that has put labs in a quandary.
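The statistical issue is easiest to see with a toy example. The sketch below, in Python, shows one simplified mixture statistic, the Combined Probability of Inclusion (CPI), which multiplies per-locus inclusion probabilities together. The loci, allele frequencies, and the choice of which locus to exclude are invented for illustration only and are not the protocol any Texas lab actually used; real casework draws frequencies from population databases and covers many more loci. It illustrates the point Inman makes: whether an analyst counts a locus where an allele may have dropped out of a weak sample can noticeably shift the reported number, the kind of judgment call probabilistic genotyping is designed to handle more rigorously.

```python
# Simplified, hypothetical illustration of a Combined Probability of
# Inclusion (CPI) calculation for a DNA mixture. All frequencies are
# invented; this is not the method used in any actual case.

# Each locus maps to the population frequencies of the alleles observed
# in the mixed crime-scene sample (made-up values).
observed_alleles = {
    "D3S1358": [0.12, 0.25, 0.22],          # three alleles seen here
    "vWA":     [0.10, 0.20, 0.28, 0.19],
    "FGA":     [0.14, 0.05, 0.21],
    "D8S1179": [0.33, 0.09],                 # weak signal at this locus
}

def locus_inclusion_probability(freqs):
    """Chance that a random person's two alleles both fall within the
    set observed at this locus: (sum of observed frequencies) squared."""
    return sum(freqs) ** 2

def combined_probability_of_inclusion(profile):
    """Multiply the per-locus inclusion probabilities across loci."""
    cpi = 1.0
    for freqs in profile.values():
        cpi *= locus_inclusion_probability(freqs)
    return cpi

# Using every locus, including one where an allele may have dropped out,
# makes the match statistic look stronger.
cpi_all = combined_probability_of_inclusion(observed_alleles)

# A more conservative analyst excludes loci where the signal is too weak
# to rule out allele dropout (here, pretend "D8S1179" is such a locus).
conservative = {k: v for k, v in observed_alleles.items() if k != "D8S1179"}
cpi_conservative = combined_probability_of_inclusion(conservative)

print(f"CPI using all loci:       1 in {1 / cpi_all:,.0f}")
print(f"CPI excluding weak locus: 1 in {1 / cpi_conservative:,.0f}")
```

With these invented numbers, the first figure comes out several times larger than the second, and with a full panel of loci the gap between the two approaches can grow to orders of magnitude, which is why rerunning old mixture cases under updated guidance can change the reported statistic so dramatically.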
