Crime-Solving Isn't a Science (But It Could Be)
The latest victim of Trump administration cuts -- the National Commission on Forensic Science -- could have more accurately been called “The Commission to Help Forensics Become a Real Science.” That would better reflect the magnitude of the job handed to this 30-member panel of experts, tasked with evaluating widely used but scientifically untested crime-solving techniques.
Other areas of science have had their problems with irreproducible results, misleading claims and errors. But when criminal courts present flawed forensic “science,” innocent people can be imprisoned -- or, in the U.S., even sentenced to death by their own government.
The trouble with forensics isn’t that it’s junk science. The commission’s job would be much easier if it only had to weed out a few useless, junky methods. A more insidious problem came to light in a 2009 investigation led by the U.S. National Academy of Sciences (NAS), and later in a report from the President’s Council of Advisors on Science and Technology (PCAST), a White House panel that includes some of the country’s most respected scientists.
Both reports concluded that most commonly used forensic techniques have the potential to help solve crimes, but their precision and error rates have never been scientifically quantified. In other words, we know they’re not perfect, but we don’t know how imperfect they are. The National Commission on Forensic Science was formed to address this problem.
And yet, Attorney General Jeff Sessions recently announced he would not renew the commission’s contract -- which means the Department of Justice has forfeited the chance to get advice from some of the country’s top scientists, working in conjunction with the esteemed National Institute of Standards and Technology, the body responsible for developing standards on everything from gasoline to time.
The commission’s six scientists all signed a public letter asking to continue what they’d started. “This is a several-decades-long problem,” said University of Maryland physics professor and commission member S. James Gates Jr. “This can’t be fixed in a couple of years.”
The techniques in question include the analysis of hair, bite marks, fingerprints and ballistics.
The NAS report made the damning statement that -- with the exception of DNA fingerprinting -- none of the techniques it investigated can connect an individual to a piece of evidence with any sort of consistency or certainty, despite ubiquitous claims of “matches.”
But the field isn’t broken -- it’s just early in the process of becoming a science. After all, it was developed as a tool to help prosecutors. Gates compares forensics to medicine, which has a long tradition of expert knowledge, but only recently started to rely on the kinds of controlled experiments needed to make it an evidence-based endeavor.
DNA fingerprinting is different from other forensic techniques because it was developed by molecular biologists relying on the methods of mainstream science, including clear standards for measuring error rates. Over the course of the 1990s and early 2000s, DNA evidence overturned dozens of convictions that had been based on other forms of forensic evidence. (However, the PCAST report added the increasingly common practice of analyzing DNA mixtures to the list of techniques that might contribute to wrongful convictions.)
As both PCAST and NAS pointed out, some central tenets of mainstream science are missing in the rest of forensics. One of those tenets, said Gates, is the need to quantify the uncertainty in your measurements and conclusions. “We recognize that we are not infinitely capable of observing or predicting. … There are always error bars in science,” he said. “If you look at forensic science you can find this phrase, scientific certainty. … No other part of well-developed science uses that phrase, to my knowledge.”
The word “match” is equally misleading. If police comb through a suspect’s apartment and find a fiber that looks like it came from the victim’s clothing, the claim of a “match” may sound damning. But a scientist would want to know how often similar searches in a random sample of other apartments would turn up a fiber with that level of similarity.
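The base-rate logic here can be made concrete with a toy simulation. The numbers below are invented for illustration -- no real study is being cited -- but they show why a scientist would ask how often a “match” turns up by chance: if a fiber type occurs coincidentally in even a small fraction of unrelated apartments, thousands of searches will produce plenty of spurious matches.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical numbers, chosen purely for illustration:
BASE_RATE = 0.05       # assumed chance an unrelated apartment contains a similar fiber
N_APARTMENTS = 10_000  # simulated searches of apartments with no connection to the crime

# Count how many innocent apartments yield a coincidental "match."
chance_matches = sum(random.random() < BASE_RATE for _ in range(N_APARTMENTS))

print(f"Coincidental matches: {chance_matches} of {N_APARTMENTS} "
      f"({chance_matches / N_APARTMENTS:.1%})")
```

Even at a modest assumed base rate, hundreds of unrelated apartments would yield a “matching” fiber -- which is why a match claim means little until that background frequency has been measured.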
There’s no way to know what a match means without some use of control samples -- another common method of science, according to commission member Arturo Casadevall, a professor of microbiology and immunology at Johns Hopkins University. “We use controls to avoid misleading findings and false conclusions,” he said. “This is something that you do whether you’re an astrophysicist looking at a quasar or a biologist looking at a gene.”
Casadevall sees the end of the commission as a lost opportunity. The 30 members were a mix of crime-lab workers, lawyers, judges and scientists, and it took time and effort for people from such different backgrounds to forge lines of communication, he said.
The commission is actually a cheap substitute for the real recommendation of the NAS report -- a new national institute, sort of like the National Institutes of Health, devoted to doing all the basic science needed to test the validity, reliability and error rates of forensic techniques. Sessions said he would replace the commission with an in-house team. This violates the spirit of the NAS recommendation, which called for an independent organization to set standards.
Scientists pursue different goals than detectives and prosecutors do. The clash of cultures came through in a comment Sessions made at a congressional hearing in the wake of the NAS criticisms: “I don't think we should suggest that those proven scientific principles that we've been using for decades are somehow uncertain.” That may sound like denial, but he was likely referring to the proven usefulness of forensic techniques in catching criminals. Scientists don’t doubt the techniques are useful. What they lament is that these techniques are unproven when it comes to finding the truth. Worst of all, it doesn’t have to be this way.
To contact the author of this story:
Faye Flam at email@example.com
To contact the editor responsible for this story:
Tracy Walsh at firstname.lastname@example.org