It may be true that no two sets of fingerprints are alike, although no actual study has shown that. Even if it is true, however, the process of matching fingerprints found at crime scenes to prints from other sources is messy. Most often, crime scene prints are partial or smudged, and plenty of prints share features common enough to appear to match them.
In 2004, an Oregon lawyer, Brandon Mayfield, was falsely implicated in the Madrid train bombing based on a faulty fingerprint analysis performed by the FBI. His fingerprints had surfaced as a “close non-match” in a search aided by the Automated Fingerprint Identification System, the FBI’s fingerprint database.
In 2009, the National Academy of Sciences issued a blockbuster report calling into question all sorts of traditional forensic evidence techniques, including fingerprint matching. It found that, with the exception of DNA analysis, most forensic evidence is not backed up by a great deal of science. Particularly weak were techniques involving pattern matching from crime scene evidence to outside sources like fingerprints, bite marks, ballistics and the like.
Science on the ground level
According to a recent report by The Intercept, leaders in the field of fingerprint analysis took that National Academy of Sciences report to heart and made an effort to upgrade the scientific underpinnings of their techniques. They determined, for example, that it is not scientifically supportable to claim that a fingerprint match is conclusive and excludes all other suspects. Moreover, the report notes that in latent print analysis, the error rate is as high as 1 in 24, although some experts dispute that figure, believing it overstates the problem.
Yet the questionable validity of fingerprint matching hasn’t necessarily made its way down to the level of the individual examiner. Examiners routinely testify that their analyses are scientific and conclusive.
Often enough, they base their credibility on having aced fingerprint analysis proficiency examinations. According to The Intercept, however, these proficiency tests are startlingly easy to pass. In fact, a group of public defenders passed despite having no training at all.
It seems as if almost anyone could pass. On the exam taken by the public defenders, only 12 of the 360 test-takers missed any answers at all. The tests don’t include any partial or smudged prints, or any “close non-matches” to puzzle over, like the kind that confused experienced FBI examiners in the Mayfield case.
Current proficiency tests are unrealistically easy, in part, because when the testing group previously made them more challenging, many analysts complained that the tests were trying to “trick” them. Thus, there is generally no quality control protocol to ensure that the tests represent real-world conditions.
This lack of quality control apparently extends to the examiners themselves. According to The Intercept, most police forces say they cannot afford ongoing quality-assurance testing of their examiners, or have no policies for dealing with potential mistakes.
Yet a single false positive match can put an innocent person in prison for years.