Monday, February 10, 2014

It's Just Criminal

The State of the Nation when it comes to the Police is overkill, no really, overkill. The Police find it much easier to suit up in the equivalent of Game of Thrones armor and kill a suspect than to actually arrest, handcuff, process, then investigate, send to trial and let the cards fall where they may. The former, you see, saves time and money.

And when the Police do arrest and try a suspect, they may actually get it wrong, and who wants to do all that work? The CSI crime labs don't, the Prosecutor doesn't, and the Judge is busy clearing the docket to ensure that no lengthy trial, or any trial, needs to happen. Or they are signing off on warrants and plea agreements to fill their days. What none of this seems to allow is time to study the law or current legislation and to actually lobby for something of benefit to the community, such as ending mandatory minimum sentencing or other costly practices that make our unwieldy system of Justice more so; the Smarter Sentencing Act, which ironically Prosecutors abhor, is one example. No, that would be work and hard and stuff.

I wonder what judge signed off on this warrant. I mean, this was some haul, a pound of pot and a gun! I do think the body armor was "warranted" (I made a pun!).

I do wonder how many Justices actually take time to study the law, or do they just have clerks do that for them? How many look to other countries, states and federal jurisdictions to see what they do when presented with similar cases? I have no idea; does anyone know anything about any of this? It is about as transparent as it is venal. And aren't criminals that way too?

I found the blog entry below on Jonathan Turley's site, and it documents that crime lab malfeasance goes back to at least 2009. Glad we got right on that one.

And we are actually debating the use of compounded drugs to murder felons. Yes, murder; that is a crime. As I have written, there are many problems with crime labs, and they have led to convictions as well as overhauls, but I wonder if they are the same as they ever were. Once we are "done" with a scandal, do we ever go back to see if anything changed for the better? I bet no one has ever re-examined the Washington State Crime Lab since its problems of 2008. Well, I would bet, but that is illegal.

And as noted, exonerations are rising and more incompetence is being revealed, but what does that do for those whose lives were utterly destroyed by it? Can you turn back time?

America, the land of North Korea, but without the marching in sync.



Think You Can Rely On Your Local Crime Lab For The Unvarnished Truth? Think Again
February 9, 2014, by Mark Esposito, Weekend Contributor

A 2009 report by the National Research Council (NRC) passed quietly into the night (except in legal and forensic circles) while barely garnering more than a ripple in the public’s psyche. It should have been a tidal wave, given news last December that a 48-year-old New Jersey man, Gerard Henderson, who spent 19 years in prison for a murder he didn’t commit, was done in by faulty crime lab work. Henderson was convicted largely on “bite mark” evidence, a process used to examine indentations and anomalies on a victim’s body, ostensibly made by human teeth, which are then matched to a defendant’s dentition in an effort to prove that he/she was the perpetrator of the crime. Convicted in 1995, Henderson proved that state testing of the bite marks on the back of the 19-year-old victim, Monica Reyes, was deeply flawed and conducted without sufficient safeguards to ensure its reliability.

Independent forensic scientists working for the Innocence Project could not reproduce the state crime lab’s findings, and reproducibility is the gold standard of scientific verifiability. Henderson became one of the more than two dozen people wrongfully convicted of rape or murder since 2000 as a direct result of flawed bite mark analysis, all duly attested to as accurate by the local crime lab.

It should have been obvious to the government that something had to be done after the 2009 NRC report, which excoriated state crime labs. “According to the report, nearly every analytical technique, from hair-sampling methods to those used in arson investigation, is unreliable, with too much variability in test results. Only DNA evidence escaped condemnation.” The report documented scores of problems, from funding to lab protocols to evidence gathering, which ensured scientifically unreliable results. In its summary, the council found that:

With the exception of nuclear DNA analysis, however, no forensic method has been rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source.

What that little bit of scientific obfuscation means is that no evidence outside of properly controlled DNA analysis can match a person to a crime scene with the same scientific rigor we apply to introducing new drugs to the market or even to safety testing of new automobiles. No analysis of hair samples, bite marks, or even arson evidence is reliable enough to be deemed “scientific.” And the fault lies not just with lab technicians crippled by a lack of standardized methodology, but in the prevailing methodology itself and the utter lack of peer-reviewed studies establishing a link between the evidence and the ability of the testing to individualize its depositor. This flawed methodology leads to inconsistent results and a fragmented system where justice is merely a hope.

And the problem has real human costs. On February 17, 2004, Texas resident Cameron Todd Willingham was executed by lethal injection for the arson deaths of his three daughters at their home in Corsicana, Texas. The main evidence against him was the expert opinion of law enforcement officials that the fire had spread by means of a liquid accelerant. Proof of the accelerant, the arson investigators said, lay in “char patterns” on the floor in the shape of “puddles” and in a finding of multiple starting points for a fire that had burned “fast and hot.” Willingham denied the charge to his dying day, and no motive was ever established. But Deputy State Fire Marshal Manny Vasquez and others concluded that the burn patterns clearly established the use of an accelerant, and they testified that human agency started the fire.

But how reliable was Vasquez’s opinion? Not very, said Craig Beyler, who holds a Ph.D. in Engineering Science from Harvard and who prepared a written report at the request of the Texas Forensic Science Commission. Writing five years after the execution, and in the same year as the release of the NRC study, Dr. Beyler concluded that investigators ignored the scientific method for analyzing fires described in NFPA 921, Guide for Fire and Explosion Investigations, and relied instead on “folklore” and “myths”. Citing many of the same problems identified by the NRC, Beyler made this chilling assessment of the “methodology” employed to convict a man of a capital crime:

NFPA 921 provides a core methodology, methods for planning and conducting the investigation, and methods for collecting, interpreting, and documenting evidence. Most modern fire investigations texts mirror or amplify upon NFPA 921 (e.g., Icove and DeHaan (2004), DeHaan (2002), Lentini (2006)). The core of the 921 methodology is the application of the scientific method to fire investigation. In the context of fire investigation this involves the collection of data, the formulation of hypotheses from the data, and testing of the hypotheses. Conclusions can only be drawn when only a single hypothesis survives the testing process. None of the investigators employed this methodology. Indeed, in no case was any methodology identified. The testifying investigators admitted on the stand that there were possible alternate hypotheses that were consistent with the facts of the case. In no instance did this cause the testifying investigator to alter his opinions in the least. The overall standard that seems to be in use by the investigator is that his professional opinion with regard to cause was simply the explanation of the case facts that the investigator was personally most comfortable with.
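
To make the contrast concrete, here is a rough Python sketch of the elimination logic Beyler describes. It is my own illustration, not anything taken from NFPA 921 or the report, and the facts and hypothesis names are invented; the point is purely structural: a cause may be declared only when exactly one hypothesis survives testing against every piece of evidence.

    # Illustrative sketch of NFPA 921-style hypothesis elimination.
    # The facts and hypotheses below are hypothetical, invented for this example.

    def investigate(hypotheses, facts):
        """Declare a cause only when exactly one hypothesis survives testing."""
        survivors = [name for name, consistent in hypotheses.items()
                     if all(consistent(fact) for fact in facts)]
        return survivors[0] if len(survivors) == 1 else "undetermined"

    # Facts collected at the scene (hypothetical):
    facts = ["single_point_of_origin", "no_accelerant_residue"]

    hypotheses = {
        # Arson-by-accelerant predicts multiple origins and chemical residue,
        # so it is inconsistent with both facts and gets eliminated.
        "arson": lambda fact: fact not in ("single_point_of_origin",
                                           "no_accelerant_residue"),
        # An accidental electrical fire is consistent with both facts here.
        "electrical_fault": lambda fact: True,
    }

    print(investigate(hypotheses, facts))  # -> "electrical_fault"

Had two hypotheses survived, the only defensible answer would be "undetermined", which is exactly the step Beyler says the Willingham investigators skipped.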

Dr. Beyler found that “a finding of arson could not be sustained” and that key testimony from the fire marshal at Willingham’s trial was “hardly consistent with a scientific mind-set and is more characteristic of mystics or psychics”.

Did Texas execute an innocent man on flawed evidence? In 2010, Judge Charlie Baird thought so, but a motion for recusal crafted by the prosecutor prevailed and stopped entry of an order finding “overwhelming, credible, and reliable evidence” that Willingham was wrongfully convicted of murdering his daughters. Citing a report by fire investigator Gerald Hurst and the Arson Review Committee empaneled by the Texas Innocence Project, Baird concluded that “every indicator relied upon since [by the prosecution's experts] has been scientifically proven to be invalid.”

Against this backdrop, and amid the outcry over the execution of a likely innocent man on flawed scientific evidence, the US Department of Justice and the National Institute of Standards and Technology (NIST) have now created the first US National Commission on Forensic Science. “The panel of 37 scientists, lawyers, forensics practitioners and law-enforcement officials met for the first time this week in Washington DC, and aim to advise on government policies such as training and certification standards. In March, NIST will begin to set up a parallel panel, a forensic-science standards board that will set specific standards for the methods used in crime labs.”

The goal is to put some scientific method into forensic science, which for too long has enjoyed the undeserved status of infallibility in assessing guilt and innocence. “The fundamental issues with forensic science can be solved by fixing the science,” says Suzanne Bell, a forensic chemist at West Virginia University in Morgantown.

The question now is: should courts have swallowed the forensic Kool-Aid so completely, admitting into evidence at the behest of prosecutors what can only be characterized as “junk science”? Maybe defense attorneys were right to complain that modern forensic techniques displaced the deliberative role of the jury in determining guilt or innocence, since the reliability of the testing was clearly oversold. And lest you think it’s only the new-fangled techniques in question, even fingerprint analysis, the hoariest and most famous technique we have, has shown some flaws despite the most rigorous protocols. “A 2011 study found that professional examiners matched two fingerprints incorrectly once in every 1,000 times, and missed a correct match 7.5% of the time.”
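
Those two numbers are worth a moment of arithmetic. Here is a back-of-the-envelope Bayes' theorem sketch in Python using the study's quoted 0.1% false-positive and 7.5% false-negative rates; the prior probabilities are my own assumed inputs for illustration, not figures from the study.

    # Back-of-the-envelope Bayes calculation using the quoted error rates.
    # The priors below are assumptions for illustration, not study figures.

    def posterior_match(prior, false_positive=0.001, false_negative=0.075):
        """P(true match | examiner reports a match), by Bayes' theorem."""
        sensitivity = 1.0 - false_negative  # P(reported match | true match)
        p_reported = sensitivity * prior + false_positive * (1.0 - prior)
        return sensitivity * prior / p_reported

    # Suspect found by trawling a 1,000-person database (prior = 1/1000):
    print(round(posterior_match(1 / 1000), 2))  # 0.48 -- roughly a coin flip
    # Suspect already implicated by independent evidence (prior = 0.5):
    print(round(posterior_match(0.5), 3))       # 0.999 -- now compelling

The same "once in a thousand" error rate can mean near-certainty or a coin flip depending entirely on how the suspect was found, which is precisely why the judicial screening discussed next matters.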

The truth now evident, judges must carefully screen all forensic evidence in light of the requirements of Daubert to ensure both materiality and reliability. The burden that once fell on creative criminal defense lawyers seeking to pioneer new scientific theories now seems to have shifted to prosecutors, who must establish the reliability of things we took for granted for so long.

Did we buy the forensic scientist’s snake oil to the detriment of innocent men? Gerard Henderson and Cameron Todd Willingham seem to suggest that we did. Now what can we do about it?


