Forensic Evidence is Often Garbage


CSI has been a television juggernaut for over two decades. Together with its Miami, New York, and Vegas children (and a short-lived “Cyber” progeny best forgotten), the police procedural has earned six Primetime Emmy awards and was the most-watched show on television for many years. The show was so popular that some criminal lawyers began to speak about the “CSI effect” – an expectation by juries that disputed factual issues could always be decided by forensic evidence.

While it’s difficult to determine whether the “CSI effect” is real or how significant it might be, the idea spawned a great deal of handwringing among prosecutors, who worried that they would be expected to produce expert testimony in virtually every case and that juries would be suspicious of criminal investigations that lacked it. Many complained that CSI gave potential jurors an unreasonable sense of the availability of forensic evidence, particularly for rural police departments that lacked dedicated crime labs.

But don’t feel too bad for prosecutors. While CSI certainly engaged in mythmaking – by one estimate, 40% of the forensic techniques depicted on the show simply do not exist [1] – it drew from an older fiction that has been benefitting prosecutors for over a century. And that fiction is that the forensic science done in criminal investigations is broadly reliable. In truth, much of what passes for “science” in the criminal courtroom is pseudo-scientific nonsense devised to put defendants in jail. And even techniques that can work often have their reliability drastically overstated by the experts presenting them.

Skeptical? You should be. Particularly when a “forensic expert” is testifying. Let’s start with the worst offenders.

1.         Bitemark Analysis

You’ve surely seen this on television at some point. A victim is savagely bitten, and police need to prove that a suspect was the attacker. They examine the wounds and compare them to the suspect’s teeth, either through “dental records” or some fresh impressions taken after an arrest. In court, a “bitemark analyst” in a suit and tie, possibly punctuated with a lab coat, points to poster-sized pictures of the wound and pictures of the teeth and says something like “as you can see, the bite marks match the suspect’s teeth perfectly.” The villain goes to jail and the heroes go out for drinks. Roll credits.

While the events described have genuinely happened, you should not feel good about them. A 2016 report commissioned by the Obama Administration reviewed the available literature and concluded that bitemark analysis “is clearly scientifically unreliable at present” and further cautioned that even the studies showing it to be unreliable “are likely to underestimate the false-positive rate.”[2] Why? Well, it turns out that when tested, bitemark analysts generally cannot distinguish a human bitemark from an animal bitemark with any kind of accuracy [3] and have false positive rates as high as 84% even under ideal conditions. [4] Oops! While each person’s teeth may be unique, the damage inflicted on skin during a violent bite is so substantial and so random that conclusive “matches” are basically a fantasy. It’s a bit like punching a wall and assuming that an expert could conclusively identify the fist that made the mark – no matter how unique each person’s hands are, that technique simply does not work.
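
For readers who want to see what a statistic like that actually means: the false positive rate is just the share of comparisons involving an innocent person that the examiner nonetheless calls a match. The counts below are purely illustrative, not drawn from any particular study:

\text{false positive rate} \;=\; \frac{\text{false “matches”}}{\text{total comparisons where the suspect was not the biter}} \qquad \text{e.g., } \frac{84}{100} = 84\%

Put differently, at an 84% false positive rate, an analyst shown 100 bitemarks made by someone other than the suspect would wrongly implicate the suspect in roughly 84 of them.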

2.         Handwriting Analysis

Another example of “this thing looks like that thing” forensics, handwriting analysis assumes that an expert can look at handwritten text – maybe a note, maybe just a signature – and accurately determine the individual who wrote it. Courts have been allowing handwriting experts of one form or another to testify for over a hundred years. Indeed, Bruno Hauptmann, famously executed for the kidnapping and murder of the Lindbergh baby, was largely done in by a handwriting “expert” who concluded that he had written the ransom note.

The only problem is, as Judge Gertner pointed out in United States v. Hines, that “[t]here is no academic field known as handwriting analysis,” that there is “no data that suggests that handwriting analysts can say, like DNA experts, that this person is ‘the’ author of the document,” and that “[t]here are no meaningful, and accepted validity studies in the field.”[5] Judge Rakoff reached a similar conclusion in Almeciga v. Ctr. for Investigative Reporting, Inc., describing handwriting analysis as a field that “bears none of the indicia of science and suggests, at best, a form of subjective expertise” and that “flunks Daubert.”[6] He further pointed out that the few studies that had been done demonstrated that handwriting experts were – at best – “moderately better at handwriting identification than laypeople” and had error rates as high as 24%.[7] While many courts continue to admit handwriting analysis, the reasoning behind these decisions is often dubious, amounting to little more than a breathless declaration that “handwriting comparison testimony has a long history of admissibility in the courts of this country.”[8]

3.         Bloodstain Pattern Analysis

Another CSI classic, bloodstain pattern analysis, sometimes called “blood spatter analysis,” posits that a trained expert can look at the blood appearing at a crime scene and reconstruct various facts about how the crime was committed, such as the kind of injury suffered, whether a weapon was used, and the direction or angle from which a person was attacked.

You might assume that these techniques emerged out of background research in physics or fluid dynamics. But no, the field was largely popularized by a chemist named Herbert MacDonell, who conducted after-hours experiments in his basement while working a day job at Corning Glass Works. [9] After finding some success marketing himself to prosecutors as an expert witness, MacDonell began offering 40-hour “Bloodstain Evidence Institutes” across the United States in which he would certify local law enforcement officers to act as “blood spatter experts.” [10] In 1983 he founded the International Association of Bloodstain Pattern Analysts to act as a credentialing organization for people paying for his courses. [11]

Does it actually work? A 2009 Report from the National Academy of Sciences called it into serious question, noting that the major accrediting organizations had “no educational requirements for certification in bloodstain pattern analysis,” that “the opinions of bloodstain pattern analysts are more subjective than scientific,” and that “experts extrapolate far beyond what can be supported.” [12] Still more doubt was raised by a 2021 study published in Forensic Science International showing that “[c]onclusions by bloodstain pattern analysts were often erroneous and often contradicted other analysts” and that there were substantial disagreements between so-called experts even with respect to “the meaning and usage of BPA terminology and classifications.” [13]

4.         Arson Analysis

Like bloodstain pattern analysis, arson analysis involves examining the physical evidence at a potential crime scene and attempting to determine what happened. For decades, “fire investigators,” often former police officers or firefighters rather than scientists, have testified in court, offering conclusions about whether a fire was “deliberately set” and whether “accelerants” like gasoline were used.

How accurate is arson analysis? Judge Kozinski described it as a field of expertise “built on nothing but guesswork and false common sense,” lamenting that “[m]any defendants have been convicted and spent countless years in prison based on evidence by arson experts who were later shown to be little better than witch doctors.” [14] As the image below from the ABA Journal summarizes, many of the fundamental assumptions of arson investigation, relied upon for decades to convict defendants, have turned out to be completely false:

Indeed, more than two dozen people have been subsequently exonerated after wrongful convictions based on dubious arson evidence, a number that one fire scientist called “just the tip of the iceberg.” [15]

5.         Ballistics Analysis

While there are many analyses that can be run on firearms, a frequent source of both controversy and convictions is the examination of cartridge cases. For over a century, experts have claimed the ability to determine whether a spent cartridge was fired from a particular gun by examining the marks, including fine scratches known as striations, left on the cartridge case. The theory is that each firearm acquires unique imperfections during the manufacturing process and that these imperfections leave identifiable patterns on cartridge cases as they are loaded and fired.

Do they? Maybe. Ballistics analysis is not total nonsense like bitemark analysis. It is definitely the case that specific markings are left on cartridge cases and can be visually identified. And those markings can often exclude the possibility that a cartridge was fired from a particular gun. But whether experts can reliably match the marks to a specific firearm is much less clear. Despite claims from the FBI that ballistics analysis has a 0% error rate, few studies have been done to confirm the accuracy of ballistics matches, and those that have been done have been heavily criticized.
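
There is also a simple statistical reason to distrust any claimed “0% error rate.” Observing zero errors in a validation study does not prove the true error rate is zero; it only caps how large the rate could plausibly be. Under the standard “rule of three” approximation, and assuming (hypothetically) independent test comparisons, zero errors in n trials is consistent, at roughly 95% confidence, with a true error rate as high as about 3/n:

p_{\text{upper}} \;\approx\; \frac{3}{n} \qquad \text{e.g., } n = 300 \;\Rightarrow\; p_{\text{upper}} \approx \frac{3}{300} = 1\%

A perfect score on a few hundred test comparisons, in other words, still leaves room for an error rate of about one in a hundred, which is a very different thing from certainty when someone’s liberty is on the line.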

Indeed, in 2008 when the National Research Council investigated whether it could build an electronic database of firearm markings – akin to databases that exist for fingerprints – it concluded that “the validity of the fundamental assumptions of uniqueness and reproducibility of firearms-related toolmarks” had not yet been demonstrated and that, given current comparison methods, a database search would likely “return too large a subset of candidate matches to be practically useful for investigative purposes.” [16] And just last year, the Maryland Supreme Court reversed a murder conviction after doing an extensive analysis of the available data and concluding that “these reports, studies, and testimony do not, however, demonstrate that that methodology can reliably support an unqualified conclusion that … bullets were fired from a particular firearm.” [17]

6.         Fingerprint Analysis

Fingerprint analysis as a means of solving crimes was first proposed in the 1800s and has been in regular use for more than a century. Like ballistics, fingerprint analysis is not total nonsense. It is certainly the case that the ridge patterns on fingertips vary substantially between individuals and that these variations are expressed in the oil prints left on objects by those fingers. And under controlled conditions, fingerprints can be used for identification. For example, during criminal background checks or after an arrest, fingerprints are routinely run against large databases as a reliable way of identifying individuals.

But the world is not ideal. Most fingerprints recovered from crime scenes (latent fingerprints) are only partial impressions rather than full prints. And many are obscured or smudged by the handling of the object. While these problems reduce the reliability of fingerprint matches, they often do not reduce the certainty with which experts claim to have made a match. This was perhaps best showcased in connection with the 2004 Madrid train bombings, when the FBI detained a suspect for 14 days based on an affidavit from a senior FBI fingerprint examiner stating that he had “100% certainty” that the suspect’s fingerprint matched one taken at the scene, a conclusion that his supervisor, with thirty years of experience, echoed.[18] As it turned out, the suspect had never been to Spain and did not even have a valid passport. [19] The FBI only acknowledged the error after the Madrid police matched the print to a different person. [20]
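
The Madrid episode also illustrates a basic arithmetic problem with trawling a partial print through an enormous database: even a tiny per-comparison chance of a coincidental resemblance, multiplied across tens of millions of stored prints, makes a spurious “hit” likely. The figures below are hypothetical, chosen only to show the math, not actual FBI numbers:

\text{expected coincidental candidates} \;=\; N \times p \qquad \text{e.g., } 50{,}000{,}000 \times 10^{-6} = 50

Under those invented assumptions, a single database search would be expected to surface dozens of innocent people whose prints happen to resemble the latent print, which is precisely the setting in which an examiner’s subjective “100% certainty” is least trustworthy.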

7.         But Rob, What About DNA?

DNA testing, done honestly, is reliable. In fact, many of the forensic methods above were discredited because of the advent of DNA testing, which allowed for the exoneration of suspects convicted on the basis of more dubious science.

However, while the fundamental science behind DNA testing is sound, the actual execution of DNA testing by law enforcement has given rise to numerous scandals. Just this year, a report concluded that a Colorado Bureau of Investigation DNA analyst intentionally manipulated data in the testing process for 15 years, calling the results of 652 cases into serious question. [21] Similar problems have occurred at the Houston Forensic Science Center [22] and at the FBI’s own laboratories. And laboratory problems are not the only source of potential error. Contamination during collection is equally problematic and can render even the most accurate test fundamentally untrustworthy.

 *          *          *

What should you take away from all of this? Be very skeptical of expert testimony, even if you have seen it on television. In this regard, I can hardly improve upon the Wisconsin Supreme Court, writing in 1899:

[S]killed witnesses come with such a bias on their minds that hardly any weight should be given to their evidence. It seems that if a person is called as a witness to support one side of a controversy by opinion evidence, he is quite likely to espouse such side with all the zeal of blind partisanship, to view the situation from the point of interest and necessity of that one side of the controversy with such a degree of mental concentration as to shut out of view everything not within that narrow focus, inducing a mental condition of entire incapability of giving an independent, impartial opinion, and capability only of acting in the line which the interest of the one side suggests, with as much certainty as the hypnotized follows the mental suggestion of the hypnotizer. [23]


[1]           Simon Cole & Rachel Dioso, Law and the Lab, The Wall Street Journal (May 13, 2005), available at: https://web.archive.org/web/20130928024803/http://truthinjustice.org/law-lab.htm

[2]           President's Council of Advisors on Sci. & Tech., Exec. Office of the President, Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods (2016) at 87, available at: https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/PCAST/pcast_forensic_science_report_final.pdf

[3]           Id. at 85.

[4]           Id. at 86.

[5]           55 F. Supp. 2d 62, 69 (D. Mass. 1999).

[6]           185 F. Supp. 3d 401, 419-423 (S.D.N.Y. 2016).

[7]           Id. at 421.

[8]           United States v. Crisp, 324 F.3d 261, 271 (4th Cir. 2003).  As the dissent points out, this is not a terribly persuasive argument. Id. at 272 (“The majority excuses fingerprint and handwriting analysis from any rigorous Daubert scrutiny because these techniques are generally accepted and have been examined for nearly one hundred years in our adversarial system of litigation. These circumstances are not sufficient to demonstrate reliability in the aftermath of Daubert.”).

[9]           Leora Smith, How a Dubious Forensic Science Spread Like a Virus, ProPublica (December 13, 2018) available at: https://features.propublica.org/blood-spatter-analysis/herbert-macdonell-forensic-evidence-judges-and-courts/

[10]         Id.

[11]         Id.

[12]         National Academy of Sciences, Strengthening Forensic Science in the United States: A Path Forward (2009) at 178, available at: http://www.nap.edu/catalog/12589.html.

[13]         R. Austin Hicklin et al., Accuracy and reproducibility of conclusions by forensic bloodstain pattern analysts, Forensic Science International, Vol. 325 (Aug. 2021) available at: https://www.sciencedirect.com/science/article/pii/S0379073821001766

[14]         The Honorable Alex Kozinski, Criminal Law 2.0, 44 Geo. L.J. Ann. Rev. Crim. Proc. 3, 5 (2015).

[15]         Mark Hansen, Long-held beliefs about arson science have been debunked after decades of misuse, ABA Journal (Dec. 1, 2015) available at: https://www.lb7.uscourts.gov/documents/13c6098.pdf

[16]         National Research Council, Ballistic Imaging, The National Academies Press, Washington, DC (2008) at 3-4.

[17]         Abruquah v. State, 483 Md. 637, 648 (2023).

[18]         Jim Hilbert, The Disappointing History of Science in the Courtroom: Frye, Daubert, and the Ongoing Crisis of “Junk Science” in Criminal Trials, 71 Okla. L. Rev. 759, 809-10 (2019).

[19]         Id.

[20]         Id.

[21]         Rebecca Cohen, Investigation finds Colorado DNA analyst intentionally manipulated data, NBC News (March 8, 2024), available at: https://www.nbcnews.com/news/us-news/investigation-finds-colorado-dna-analyst-intentionally-manipulated-dat-rcna142541

[22]         Ted Oberg, Houston cases impacted by DNA analyst's 'false testimony' grows, ABC News (June 8, 2022) available at: https://abc13.com/stephen-adam-vinson-joseph-colone-dna-analyst-misleading-testimony-by/11939587/

[23]         Baxter v. Chicago & N.W. Ry., 80 N.W. 644, 653 (Wis. 1899).
