The Appeal Podcast: The Pseudoscience behind Forensic Science
With Jessica Brand, Legal Director at The Justice Collaborative and Appeal contributor.
We’ve watched the scene play out in countless police dramas: slick scientific experts with the latest gadgets finding the Bad Guys with forensic pattern matching: bite marks, fingerprints, a marking on a fired bullet or handwriting on a note. But how scientific are these methods? And do prosecutors and judges wildly overstate their reliability? This week, we are joined by Jessica Brand, Legal Director at The Justice Collaborative and Appeal contributor, to discuss some of the pseudoscience behind forensic science.
The Appeal is available on iTunes and LibSyn RSS. You can also check us out on Twitter.
Adam Johnson: Hi, welcome to The Appeal. I’m your host Adam Johnson. This is a podcast on criminal justice reform, abolition and everything in between. Remember, you can always follow us on Twitter @TheAppealPod, on Facebook at The Appeal magazine’s main webpage and as always you can subscribe to us on iTunes. We’ve watched the scene play out in countless police dramas, slick scientific experts with the latest gadgets and technology finding the bad guys with forensics, specifically pattern matching. Bite marks, fingerprints, the markings on a fired bullet or handwriting of a note, but how scientific are these methods exactly? And how much do prosecutors, judges, the media and police dramas wildly oversell their reliability? This week we’re joined by Jessica Brand, Legal Director at the Justice Collaborative and Appeal contributor, to discuss the pseudoscience behind forensic science.
Jessica Brand: For about a decade there have been three reports, huge reports that have talked about how unreliable ballistics evidence can be, so when you match a bullet from a crime scene to a gun and say, ‘this gun definitely fired that bullet.’ And just recently The New York Times ran an article about how reliable that field was without citing to any of these reports put together by the PCAST committee, the President’s committee on science, or by the National Academy of Sciences, it just gave a ton of credit to this field and that’s The New York Times. And so when you have such a well respected paper like that running these articles it’s really hard for defense attorneys and even real scientists to combat that narrative.
Adam: Hi Jessica. Thank you so much for joining us.
Jessica Brand: Thanks for having me.
Adam: In May of this year, you wrote an explainer for The Appeal called Faulty Forensics: Explained. You write that there’s quote, “no objective standards to guide how examiners reach their conclusions.” And that a lot of forensics, specifically what we’re talking about today, which is pattern matching, is based on a lot of dubious assumptions and quasi-scientific analysis. Can you, can you explain what you mean by the total lack of objective standards?
Jessica Brand: Sure, so I like to think of pattern matching as a little bit like Goldilocks and the three bears. This porridge smells good, this porridge smells bad and this one is just right and that really is a little bit like how the pattern matching fields, with the exception of DNA, operate. So in pattern matching, you’ll take, let’s take fingerprints, you’ll take a fingerprint found at the crime scene, and then if there’s a suspect, you’ll take that suspect’s known fingerprint. It can be found in a database or more often it’s something that a police officer will just find. And the expert, the quote unquote “expert,” will look at these two and try to compare them to see if they match, but there’s a couple problems with that. There are no known standards for how many similarities you need to see between the two fingerprints. There’s no known standard for if something looks different, is that meaningful or is that just a product of an incomplete print left at the crime scene? And those are the kinds of things that you’d want if you’re doing something like declaring a match that could send someone to jail for five, ten, forty, life, or even in a capital case to the death penalty. So those things don’t exist and then there are no objective rules in the field that you really must follow before you can draw a conclusion. So for example, if I follow these rules, I’ll reach conclusion X and then you Adam, if you follow the same rules, you’re going to reach the same exact conclusion that I would. Those things don’t exist. So the work that I do is not repeatable by a different kind of examiner. So there’s no sense that if you put two, three or five examiners in the room following the same exact rules they’ll actually reach the same conclusions as the next person. Which, when you think about it, is pretty scary.
Adam: Yeah. And obviously being able to reproduce results is the entire basis of the scientific method, right? It’s what makes it science.
Jessica Brand: Apparently.
Adam: So the people who are actually doing these examinations, uh, you write a lot about the, to put it mildly, inconsistent qualifications of people who do this. Can we talk about your average police department, who are the people who are actually doing these pattern matching analyses, uh, both on the scene and then back at the crime lab, and also what the origins of this science, such as it is, are?
Jessica Brand: Sure. So on the scene it’s going to be police officers who go and are trained in, for example, how to dust and lift fingerprints and there’s techniques for learning how to do that. But those are mostly law enforcement. Then when you go to the lab, it really depends. It depends on if it’s a law enforcement lab, it depends on if you’re sending it to an external lab. But what’s really critical here is there’s no kind of uniform rigorous standard for how you would be trained to do this. So there was this great 2012 ProPublica article where the journalist took a class online and he got a certificate in pattern matching after like a few-hour class and I think a hundred question multiple choice test. And then, you know, Pam Colloff’s article, also for ProPublica and The New York Times, really highlighted a detective who became an expert in blood splatter and I think he took a week long class, right? So it can be really fast and this became a major critique in the 2009 major report criticizing these fields by the National Academy of Sciences where they talked about what you want for someone to actually be trained in these fields. You want meaningful, rigorous training. Like, I dunno, even a college-like class. We’d want to give serious tests. You’d want standards for when your license is revoked if you’ve made mistakes or if you’ve failed, but none of those things exist in the pattern matching fields. And you can see that: at the St. Paul lab in 2013 they discovered the fingerprint examiner chief had absolutely no certification or training in fingerprint examination, and he was the chief of that lab. So you can really get by in a lot of these laboratories with almost no training and no standards before you’re able to get up and testify in court.
Adam: I would imagine that a meaningful percentage of people who end up in prison do so due to the testimony of experts in forensics labs. Do we have a sense of the scope of how many false positives there are and how many false convictions there are? I know the Innocence Project has written a lot about forensics. Do we have a sense of how many people end up being accused based on science that is at best kind of guesswork?
Jessica Brand: So the Innocence Project estimates that in nearly half, so 45 percent of DNA exoneration cases, faulty forensic science contributed to that wrongful conviction, which is a breathtaking number. But then when you think about how many cases don’t have DNA, so you can’t have an exoneration based on DNA, you may have an even higher number than that. And then of course cases plead out. So if you’re a defendant and your lawyer says they’re going to introduce this fingerprint match, in your case, do you want to plead guilty? You may really not want to roll the dice even though that science is bad and maybe you actually haven’t done it, but you want to get out of prison. You don’t want to serve a long sentence. You might say, guilty. We’re not accounting for any of those cases in that kind of calculation.
Adam: You know one of the things, especially in media criticism, which is what I do a lot of, is the people ingest things through pop culture and it’s kind of depressing but it’s simply the way it is. And the way people perceive forensics, especially, I mean you even and I hate to admit myself, is largely informed by shows like CSI or crime detective movies. To what extent do you think that the sexing up and fetishization of crime scene, both in pop culture and true crime documentaries, to what extent do you think that that has given people a false impression of the precision of these methods?
Jessica Brand: I think that some, although I want to come back to the media critique portion of your question, I mean for sure people watch CSI, John Oliver had a spoof about it, people think that you can find the magic bullet and connect it to the gun and that’s going to lead you to the guy who absolutely committed the crime and you see these examiners in white lab coats and they seem very impressive and then when you get that person on the witness stand, who actually may have no qualifications, you equate the two together. So I think for sure it’s a huge problem, but I also think we in the media, I’m a lawyer, but people in the media give way more credit to these fields than is due. So for about a decade there have been three reports, huge reports that have talked about how unreliable ballistics evidence can be, so when you match a bullet from a crime scene to a gun and say, ‘this gun definitely fired that bullet.’ And just recently The New York Times ran an article about how reliable that field was without citing to any of these reports put together by the PCAST [President’s Council of Advisors on Science and Technology] committee, the President’s committee on science, or by the National Academy of Sciences, it just gave a ton of credit to this field and that’s The New York Times. And so when you have such a well respected paper like that running these articles it’s really hard for defense attorneys and even real scientists to combat that narrative.
Adam: One thing that struck me and something that seems obvious in retrospect, is the degree to which the scientific analysis, the lab analysis such that they are, are not independent of the police system, the police departments, and in fact they work within the police departments for the most part and they report directly to the police, uh, the head of the police and the police chiefs. Um, to what extent is there just this massive conflict of interest baked into the cake of pattern matching?
Jessica Brand: Huge. It’s huge. I mean, to be clear, it’s not just pattern matching, it’s also a lot of these DNA labs.
Adam: Right. Forensics in general.
Jessica Brand: Yeah, exactly. So you know, when labs are affiliated with law enforcement, there’s just an inclination to make your bosses happy. I mean, one is obvious pressure. So we’ve seen lots of cases where law enforcement or the district attorney will send a sample to the lab and say, you know, ‘look at this fingerprint’ and in that note it’ll say, ‘we think this fingerprint belongs to the suspect and here’s the suspect’s fingerprint.’ Well, you can be the best analyst in the world and that’s gonna affect how you look at those two samples and decide whether they match. There’s just no way around that. Study after study shows that people are influenced by that kind of biasing information. And every good defense lawyer will tell you they have seen a case file where law enforcement or a DA has put that kind of note into the law enforcement lab when they put in a sample for analysis. But then there’s just sort of the kind of less sinister type of influence, which is that when you know the police department is cutting your paycheck, who wants to be the lab analyst who says, ‘nope, I don’t think that matches, nope, I don’t think this matches, no, you know, I, I think I need to provide a greater limitation on my conclusion here.’ You’re going to be afraid you’re going to get fired. So we know there’s just that kind of also unconscious biasing effect that goes into this kind of analysis and that’s why report after report after report says we really need to make these labs independent. And even the accreditation that happens, you know, there’s these accrediting agencies like ASCLD/LAB that go in and they look to make sure that the labs are actually following procedures and maybe they’re not as biased as one might expect, but in fact they just turn out to be rubber stamps.
Um, and we’ve seen that where labs actually really were making a lot of mistakes, they were making up methods of DNA analysis like happened in Austin and the accreditation firms just gave them the rubber stamps. And part of that is, you know, there’s just this feeling that law enforcement is reliable so everything that’s happening must be okay. That’s really very dangerous.
Adam: Yeah. One of the things that you talk about in your writing is the extent to which there seems to me like there’s so much invested in a lot of these pseudosciences that to sort of pull back and to analyze them critically is to really kind of call the entire system into question. To what extent are people just sort of scared of opening a Pandora’s Box of appeals and overturns? Was that one of the institutional incentives against kind of really looking into this critically in your opinion?
Jessica Brand: Yeah. I mean they’ve been using fingerprint evidence since 1911, so to say the whole field is discredited really, it does, it opens a Pandora’s Box. Now, judges could say different things. They could say, ‘we believe there’s some science to fingerprints or to ballistics, but examiner, you really can’t give the conclusion that you are giving.’ So for example, examiners in some fields get on the stand and say ‘this is a match to a hundred percent certainty’ or ‘this is a match to a near degree of scientific certainty,’ whatever that means. You could really make examiners hedge those opinions quite a bit without declaring the whole field’s totally unfounded and inadmissible, but even then, judges are really unwilling to do that except for in a few places and I think you’re right. I think it is out of fear of just opening up a lot of cases. Now the flip side is you should really be afraid of a lot of wrongful convictions, but no one wants to think about it.
Adam: You mentioned this kind of nebulous concept of what they call “scientific certainty” or “degree of scientific certainty.” Obviously the methods themselves are always about a degree of precision. The CIA and NSA do this with intelligence, right? You have on one end, you know, ‘I have no idea,’ and then on the other end you have rock solid proof and there’s, there’s always going to be a kind of degree. Um, to what extent do these terms, in terms of how they translate to a jury, mean anything? A high certainty versus what other, any other kinds of gradients? Do they have fixed meanings in different counties and different states?
Jessica Brand: No, it’s total garbage.
Adam: (laughs.) Okay.
Jessica Brand: If you’re a juror and you hear an expert say, ‘I think that’s a match to a near degree of scientific certainty,’ you hear, ‘oh, it’s a match’ and that expert is absolutely sure. I mean it’s a completely worthless cabining of an opinion, in my opinion.
Adam: Yeah, because in many ways the trick here is to make it look like the experts that are working for the police are neutral and any experts that work for the defense are somehow corrupt like mercenaries. This is a trope you see in TV or movies a lot. Does the average juror sort of perceive the police as kind of a neutral party? Is uh, is that one of the kind of main barriers to this?
Jessica Brand: You know, I think the average juror sees the experts as neutral parties. Whether they’re police officers or people who have gotten a certificate, you know, a juror’s views of policing, that’s a whole different conversation. I think it’s changed a lot, especially in the years after the Freddie Gray murder in Baltimore. You see some increased skepticism of police in some parts of the country. Certainly not all, or maybe even most, but I do think there is a vision of the defense lawyer as a slimy guy who will win at all cost and the prosecutor as a trustworthy person who wears an American flag on their lapel and that’s very hard to fight against.
Adam: To what extent are judges complicit? You write a lot about how judges know better. Judges at this point are familiar with the literature. They know that a lot of this stuff is kind of bogus, but they sort of allow it anyway. This is something that directly Radley Balko of The Washington Post writes a lot about. To what extent are judges as an institution kind of going along with it for the purposes of expediency?
Jessica Brand: Judges are tasked with being the gatekeepers and keeping out unreliable evidence. That’s why they don’t let in hearsay. There’s all kinds of rules that they have to follow to keep that stuff out and junk science is one of them, and there are legal standards, in some places it’s called Frye and in some places it’s called Daubert, and they’re supposed to keep out that evidence and they just shy away from doing their jobs. Now I think Radley Balko writes a lot about how maybe that’s because they’re not trained as scientists. I think that’s probably some of it, but you don’t really need to be a scientist to understand why the pattern matching fields haven’t done studies and research and validation studies to support their conclusions. These are smart judges. They can read the PCAST report, the National Academy of Sciences report, and figure that out in the same way as you or I can or lawyers can. I think it really is more, which you articulated earlier, opening up the Pandora’s Box is scary and has a lot of implications in the judicial system.
Adam: I want to drill down on some of the specifics of what we’re talking about here. One of the most dubious, if not the most dubious, is bite mark analysis. You cite that the American Board of Forensics did an informal test and 63 percent of the time they got it wrong. Is there any science at all to bite mark analysis or is it mostly just astrology?
Jessica Brand: It’s astrology. You know, I live in Texas. You’re from Texas. Texas hasn’t really been the leader in criminal justice and yet the Texas Forensic Science Commission has said no more bite mark evidence in Texas because it’s not reliable.
Adam: Oh really?
Jessica Brand: Right. And yet in other parts of the country they’re still using bite mark analysis and I think they’ve never actually overturned a case just on the basis of bite mark analysis. Even though in a state like Texas they’re saying no more. It’s ridiculous and we know it’s led to a ton of wrongful convictions.
Adam: So let’s talk about progress that’s being made. It seems like there’s been an effort in recent years to push back against this. What progress, if any, is being made and can you highlight any examples, like on a state or county level, where there’s been a real kind of reformation about forensics or pattern matching specifically?
Jessica Brand: So there are some responsible researchers out there who are doing important work on trying to analyze characteristics of patterns. So characteristics of fingerprints, for example, and they’re trying to really conduct rigorous research on those things so that you can actually test whether a fingerprint is unique. You can test how many similarities you would need between a fingerprint at the crime scene and a known source before you declare a match. So one of those people, for example, is Henry Swofford, you know, his research is really important and I don’t know what it will show in the end, but there are people who are out there doing it. Now we’ve seen some setbacks under this administration and sort of the dismantling of some of the groups that are supposed to be watchdogs for forensic science and support that research. But some of it is really happening and I think that’s important. And you’re also seeing some judges who are starting to raise an eyebrow at these things. So you know, Judge Easterly, on the DC Court of Appeals, which is different than the DC Circuit, has written a lot about how people really need to be skeptical and pay attention to the studies that show the problems with ballistics evidence. So there is this focus and increased realization I think out there about how we need to be careful and actually maybe we need to do some research before we reach conclusions that send people to jail. So I would have some cause for optimism based on those things.
Adam: Alright. That’s good to hear. We, uh, we try to leave a little optimism. We don’t wanna sow cynicism too much. Maybe someone will come up with a, uh, an Office-like comedy show where it’s a police department where all they do is make mistakes instead of the sort of slick always competent version we see.
Jessica Brand: I look forward to you writing it.
Adam: (Chuckles.) Alright. Well thank you so much for coming on. That was very informative and I look forward to having you back maybe to talk about other forensics at some point down the line.
Jessica Brand: Thanks Adam.
Adam: Thanks again to Jessica Brand, Legal Director of The Justice Collaborative and Appeal contributor. Thanks for listening, this has been The Appeal podcast. Remember, you can always follow us on Twitter @TheAppealPod, you can follow us on Facebook at the main Appeal magazine Facebook page and as always you can subscribe and rate us on iTunes. The show is produced by Florence Barrau-Adams. Production assistant is Trendel Lightburn. Executive producer Sarah Leonard. I’m your host Adam Johnson. Thank you so much. We’ll see you next week.