A panel co-hosted by ProPublica, The New York Times Magazine, Harvard Law School’s Charles Hamilton Houston Institute for Race and Justice, and Harvard Law’s Criminal Justice Policy Program examined the use of unreliable forensic science in the courts and models for reform.
This year, ProPublica senior reporter Pamela Colloff, also a writer-at-large for The New York Times Magazine, has been investigating the case of Joe Bryan, a former high school principal in Clifton, Texas. Bryan has been in prison for 32 years for the murder of his wife, Mickey, a crime he says he didn’t commit. Colloff found that Bryan’s conviction rested largely on the testimony of a local police officer who took the stand as an expert in bloodstain-pattern analysis, even though he had only taken a 40-hour class in the technique.
Colloff’s reporting focused on the questionable use of blood-spatter analysis in the nation’s courtrooms, but it raised broader concerns about other forensic disciplines (including the analysis of hair, bite marks and tire impressions; dog-scent examinations; and handwriting analysis), how they’re being used in the justice system and the devastating consequences they can have for the wrongfully convicted.
A panel of experts explored these and other issues at a recent event, “How Bad Science Is Corrupting the Justice System,” co-hosted by ProPublica, The New York Times Magazine, Harvard Law School’s Charles Hamilton Houston Institute for Race and Justice, and Harvard Law’s Criminal Justice Policy Program. Held on Oct. 25 at Harvard Law School, the discussion convened Colloff and prominent leaders from across the judicial system, including Nancy Gertner, a retired federal district judge and Harvard Law School senior lecturer; Radha Natarajan, executive director of the New England Innocence Project and former public defender; and Nicole Cásarez, an attorney and board chair of the Houston Forensic Science Center, which oversees the City of Houston’s independent crime lab.
“Even among more reliable disciplines, problems can arise when scientific findings are overstated in courtrooms and handled without independent review or input from the scientific community,” said moderator Katy Naples-Mitchell, a Houston Institute legal fellow. Her opening remarks focused on a landmark 2009 National Academy of Sciences report, which found that many wrongful convictions derived from forensic evidence that was flat-out wrong and called for widespread reform.
“The NAS report was from a group of scientists who said, ‘Here’s the measure of what science should be,’” Gertner said. “There should be some way of determining validity. There should be some way of determining reliability. There should be some way to test the individual who is the expert to see what his error rates are. In other words, in this field, which evolved in the four corners of criminal justice prosecutions, we should be able to measure it by what the requirements of science are. … And [the report] had virtually no impact.”
One bright spot might be the city of Houston, the panel noted. Even as most courts continue to resist the scientific community’s criticisms of how forensic disciplines are used in the justice system, Houston has taken heed of a key recommendation: a separation between crime labs and the police or prosecutorial agencies that traditionally oversee their operations. But it took a scandal — and the loss of public trust — to force changes.
“You may have heard of the Houston police crime lab for all the wrong reasons because it had years of scandals,” Cásarez said. “The crime lab in Houston now is overseen by a diverse board of citizen volunteers. We have a technical advisory group of scientists in case we need input on scientific matters.” With scientists in charge, she noted, the lab banned a number of questionable forensic techniques, including hair microscopy, bloodstain-pattern analysis, gunshot residue testing and voice spectrography.
Even if a crime lab is not housed specifically within law enforcement, Natarajan pointed out, crime lab analysts face a variety of pressures, both subtle and overt, that can affect whether or not their conclusions are reliable. “The question is: What are the influences and pressures that are coming into your work, and who do you feel like you’re working for? What’s the supervision? What are the biases that you have about that?” she said.
For example, Natarajan said, if a forensic scientist examining a person’s fingerprint learns that the person confessed — regardless of whether that information is true — it can influence how the fingerprint is viewed. “You need to shield the scientists from information that is not relevant to the task,” she said.
The consequences of junk science are devastating, the panelists said, detailing the stories of lives ruined by bogus forensics. Natarajan told the story of Victor Rosario, whose murder convictions were based on faulty arson science. He was exonerated last year after 32 years in prison. In another case, George Perrot was convicted of rape based on faulty testimony from a hair microscopy expert. “What was so mind-boggling is that this was a case in which the victim and an eyewitness said at the trial, ‘This is not the person who assaulted me,’” Natarajan said. Perrot was exonerated last year after spending 30 years in prison.
As for the case of Bryan, who is still sitting in prison in Texas, the blood-spatter expert whose testimony was key to his conviction recently conceded that his conclusions were wrong.