Errors and unreliable science can undermine “every step” of the process of collecting forensic evidence, and such evidence should be kept out of the courtroom until research confirms its validity.
That was the consensus of experts gathered for a recent webinar organized by the University of Pennsylvania Carey Law School’s Quattrone Center for the Fair Administration of Justice.
“There are quality and scientific issues at every step of [collecting forensic evidence] – from the moment someone touches the evidence at a crime scene, to the laboratory [and] to the courtroom,” said Brandon Garrett, the L. Neil Williams, Jr. Professor of Law at Duke University.
“Behind something seemingly as simple as a fingerprint match or a firearms comparison, there are 12 different ways that the analysis can and sometimes does go wrong.”
Garrett, author of the recently published Autopsy of a Crime Lab: Exposing the Flaws in Forensics, said that throughout his legal career he has witnessed criminal cases in which “experts overstated the evidence and reached totally wrong conclusions…where people were totally innocent.”
In addition to Garrett, the webinar featured Maneka Sinha, assistant professor of law at the University of Maryland Carey School of Law, and Dr. Itiel Dror, senior cognitive neuroscience researcher at University College London.
The April 14 conversation was moderated by Paul Heaton, academic director of the Quattrone Center for the Fair Administration of Justice.
Garrett said many people assume fingerprinting is more accurate than DNA, yet few understand how little fingerprint evidence is left at the crime scene.
Jurors in the Dark
Moreover, crime laboratories provide no meaningful documentation of their work to lawyers or jurors: They merely indicate that a certain fingerprint and firearm were found at the crime scene.
For this reason, Garrett said, jurors “are left in the dark.”
Reflecting on the dearth of documentation, Garrett said:
You have evidence that may or may not be well-collected. You have poor documentation of what [forensic examiners] do. They come into court claiming expertise, but they have never been tested in any meaningful way…The process they follow may be an ill-defined one…The labs themselves may not have any real testing or auditing or quality control.
Even worse, he added, “judges don’t ask any questions.”
Additionally, Garrett noted that the defense team in criminal cases seldom has the money or the time to hire its own forensic expert. Therefore, in most cases, the only scientist reviewing the forensic evidence is hired by the prosecution.
“Somehow, in criminal cases where life and liberty are at stake, there is rarely a battle of the experts,” Garrett told the webinar.
A Misidentified Fingerprint
Sinha said that as a public defender, she witnessed many of the problems Garrett described.
As an example, she recalled a burglary case a few years ago that hinged on fingerprinting evidence.
Sinha explained that the defense lawyer took many steps to discredit such evidence: she described the issues facing the forensic discipline, the examiner’s lack of scientific training, and the racial and other biases that can affect decisions regarding fingerprint matching.
Importantly, the defense lawyer cross-examined the forensic examiner on the error rates associated with fingerprinting and other forensic disciplines.
The defendant was ultimately acquitted, and Sinha believes the not guilty verdict was due to the defense lawyer’s thorough inquiry into the forensic evidence process.
Dr. Dror continued Sinha’s discussion of error rates.
He commended the recent trend of studying error rates in forensic science but said that many studies fail to include inconclusive evidence or, worse, count such evidence as correct responses.
Therefore, Dr. Dror argued, the current error rate studies are “inaccurate and misleading at the core.”
Dr. Dror added that racial, gender, and other biases affect experts’ decisions on forensic evidence, biases that many systems have yet to acknowledge.
When the panelists were asked what they wish to see going forward, Sinha said that judges, prosecutors, experts and all other players in the justice system must do their part to make forensic evidence more reliable.
According to Sinha, doing so includes conducting more research on the issue, instituting quality control and proficiency testing measures, and revisiting the role of judges as the gatekeepers of evidence.
For now, however, Sinha argues that forensic disciplines that have been deemed unreliable must remain out of the courtroom until there is sufficient research that proves their validity.
Meanwhile, Dr. Dror argued that forensic examiners must not be given access to so-called “task-irrelevant biasing information.”
For example, Dr. Dror explained, research shows that 42 percent of the fingerprint case files given to forensic examiners indicate whether the suspect has a criminal record.
He also said that forensic examiners and crime laboratories often work for law enforcement and/or the prosecution.
Forensic examiners must work independently, according to Dr. Dror.
Additionally, Dr. Dror and Garrett agreed that the defense and prosecution teams must be given equal financial resources as well as access to their own forensic scientists.
Lastly, Garrett hopes that crime laboratories will begin to track which individuals worked on which cases, whether the defendant was convicted, and where the paperwork has been filed, among other things.
Unless such measures are taken, “even when errors come to light, there [will be] little in the way of justice for people who are affected by forensic errors.”
Sadly, Garrett said, “there may be tens of thousands, even hundreds of thousands, of cases affected.”
Brandon Garrett is the L. Neil Williams, Jr. Professor of Law at Duke University.
Maneka Sinha is an assistant professor of law at the University of Maryland Carey School of Law.
Dr. Itiel Dror is a senior cognitive neuroscience researcher at University College London.
A recording of the April 14 webinar can be accessed here.
Michael Gelb is a TCR contributing writer. He welcomes comments from readers.