The problem with forensic science

Everyone knows that human fingerprints are unique. After all, even identical twins don’t share exactly the same pattern of whorls and furrows, and in fact, no two sets of human fingerprints have ever been found to be identical.

But it might surprise you to learn that the uniqueness of human fingerprints is an assumption, not a well-researched finding. Fingerprinting became a staple of law enforcement long before this question was even raised, and it has never been proven. This lack of a fundamental scientific basis for the supposed uniqueness of fingerprints, combined with the inability of purported experts to reliably match them or even agree on what constitutes a match, has led some federal courts to reject fingerprint evidence entirely.

This problem is not unique to fingerprints. Forensic science as a whole has run into serious problems with the basic assumptions underlying its individual techniques.

When justice and science collide

Forensic science has more than its fair share of “junk science”: methods or research with no sound scientific basis. Historically, the actual scientific grounding for investigative techniques came long after they were in use, if it came at all. And while some techniques, such as bite mark analysis, were never scientifically established in the first place, all of forensic science has the potential to end up in the dustbin: even when a method is sound in principle, it can be rendered unreliable by low-quality evidence, contamination, or bias.


And while junk science in other fields can undoubtedly have unfortunate and even deadly consequences, such as the now-retracted Lancet article that helped spur the anti-vaccination movement, forensic science occupies a unique position. Criminal prosecutions are among the most consequential actions a state can take against an individual, and forensics can make or break those cases. Bad science can send innocent people to prison for life, or even to their deaths, while letting dangerous offenders walk free.

The problem of individualization

Only recently has the high-profile nature of forensics spurred the US Congress into action: it directed the National Academy of Sciences to study the problem and recommend solutions, and the resulting report was published in 2009. Its findings identify both general problems with forensic science, such as the fragmented nature of the legal system and the differing practices used across disciplines, and problems with specific techniques.

One highlight appears at the beginning of the report, where the committee notes problems with individualization: the ability to show, with some degree of certainty, that a fingerprint, for example, belongs to a specific person. “With the exception of nuclear DNA analysis, however, no forensic method has been rigorously demonstrated to have the capacity to consistently and with a high degree of certainty demonstrate a link between evidence and a particular individual or source,” the authors wrote.

The researchers go on to say that there are significant underlying problems across many forensic disciplines. “The simple reality is that the interpretation of forensic evidence is not always based on scientific research to determine its validity. It’s a serious problem.” And although research has been conducted in certain disciplines, the authors continue, there is a striking lack of peer-reviewed, published studies demonstrating the scientific validity of many forensic techniques.

Not so definitive DNA

DNA has come to fill the role that fingerprints once did: a silver bullet linking a particular person to a particular place. But there are problems with DNA testing that go beyond poor-quality sampling or testing errors. Trace DNA, the minute biological samples recovered from crime scenes, has emerged as a particularly problematic confounding factor, forensic experts have noted. For example, finding someone’s DNA at a crime scene doesn’t even necessarily mean they were there. In one case, a man’s DNA turned up at the scene of a homicide in which the victim was killed at home. But the man that DNA belonged to had been in the hospital the whole time; his DNA was transferred by a paramedic who had brought him to the hospital and who later responded to the homicide call.

M. Chris Fabricant, author of Junk Science and the American Criminal Justice System and Director of Strategic Litigation at the Innocence Project, summarized the problem in an interview with the Texas Observer. “You can take a reliable technique and make it quite unreliable, depending on the quality of the evidence,” he said in the interview. “If you don’t have enough information from a crime scene sample, and we don’t really have any objective thresholds for how much information one actually needs to reach a reliable conclusion, then you can take a reliable technique and make it pretty unreliable.” This potential is evident in fingerprints, Fabricant continues, which are more susceptible to cognitive bias than other techniques, more likely to produce chance matches, and prone to a higher error rate. “This is true to some extent with DNA evidence as well.”

The National Academy of Sciences report made numerous recommendations for improvement. But one of its key conclusions was simply that the adversarial justice system in the US is ill-suited to establishing the reliability of forensic science. As the report states: “Judicial review alone will not cure the ills of the forensics community.” If we are to solve the problems with forensics, systemic and otherwise, we must begin by rehabilitating the science behind it.
