The use of science in criminal justice has produced mixed results. While the application of deoxyribonucleic acid (DNA) testing to criminal evidence has been a tremendous advance in determining the guilt or innocence of suspects, other scientific applications have been less productive, and some have been damaging to the advancement of policing. Among the more popular applications of forensic science in law are fingerprint analysis and rifling analysis as evidence leading to criminal convictions. Like other applications of science within criminal justice, these techniques are not without controversy or debate. The more prominent obstacles to a consistent application of science within criminal justice include a disconnect between researchers and practitioners, the nonscientific basis on which the criminal justice system was formed and operates, and the disparity between policy and science.
Police officers, judges, attorneys, witnesses, victims, and suspects are all fallible in their judgment and memory. Their decisions are affected by their social standing, self-identity, education, and physical and mental well-being, as well as by their experiences, motivations, and moral assessments. These characteristics interact to produce inconsistent and unreliable outcomes: the error rate, that is, the rate of false arrests and convictions, is quite high in the U.S. justice system. The complexities that produce imperfect and unpredictable memory and judgment yield outcomes throughout the criminal justice system that cannot be readily anticipated or measured in a scientific way. Testimony from witnesses or experts that is solicited, and sometimes paid for, by attorneys to advance strategic aims introduces errors that cannot be easily detected within the justice process. Witnesses and law enforcement agents routinely misidentify suspects, as evidenced by the Innocence Project, run by Barry Scheck and Peter Neufeld at the Benjamin N. Cardozo School of Law in New York City. The Innocence Project has used DNA evidence to exonerate dozens of individuals who were convicted on faulty eyewitness testimony.
As far back as the 1920s, Frederic Bartlett demonstrated that human memories are highly error-prone and that memory can be influenced, unintentionally or otherwise, by leading questions and other techniques employed by police detectives. This understanding was substantiated by the introduction of DNA evidence. Because DNA evidence did not reach the courts until many decades later, those in the criminal justice system should consider how many inmates might not have been convicted at trial had faulty eyewitness testimony been excluded. Throughout the history of criminal justice there has been little scrutiny of the methodologies and tactics applied by police, attorneys, and judges. Aside from DNA testing, which was developed by biologists for medical purposes, no other technique commonly employed under the guise of forensics is scientifically based. Fingerprint analysis, polygraph testing, handwriting analysis, and rifling testing are a few of the investigative techniques law enforcement officers employ to derive evidence of a suspect's guilt. None of these has withstood scientific scrutiny. The premise of fingerprint and rifling analysis has proven problematic, as the underlying assumption of these techniques, that fingers and guns each leave their own set of unique characteristics, has not held up under scientific examination.
The polygraph examination combines interview techniques with measurements of physiological factors, employed jointly to prompt a confession. A suspect is asked a series of scripted questions. While the suspect answers, his or her heart rate, respiratory rate, blood pressure, and sometimes electrodermal activity are measured. The polygraph treats these physiological rates as a function of a person's anxiety, distress, or apprehension. Federal courts and most state courts do not admit polygraph tests as evidence; however, police and intelligence agencies frequently use polygraph tests in interrogations as a means of collecting information or confirming suspicions. Polygraph tests are also widely used by police departments and government agencies to screen employment candidates.
It is assumed that when a person lies, there is a physiological shift, such as an elevation of respiration rate or pulse. Those who employ polygraph examinations believe that such a shift implies that the person is being untruthful in his or her response. There are several problems with this assumption. First, a person can have these physiological shifts throughout the day; the shifts are not so dramatic that they would cause concern for a medical practitioner. More important, the basic assumption that a person's heart rate or breathing rate responds to a thought or expression in real time only when the person is lying is deeply problematic. This is what statisticians call a validity problem: although a polygraph examiner may be recording a measurement, he or she can never claim to be measuring what he or she thinks is being measured. In other words, it is a logical leap to say that an increased respiration rate means that a person is being untruthful. In physiological terms, an increased respiration rate means nothing more than an elevation in breathing frequency. It could be that the person is simply uncomfortable with the testing process or that his or her attention is divided among multiple thoughts. There is no way a polygraph examiner can determine what, if anything, is causing a shift in someone's breathing rate.
Polygraph examiners attempt to circumvent the problem of eliciting a physiological response and interpreting it incorrectly by first asking the individual mundane questions. Questions such as "Is the color of the wall red?" or "Is today the third day of the month?" are ones whose answers the examiner already knows and that should not generate nervous arousal. The examiner then compares responses to those questions against responses to questions that are more ambiguous or provocative. While this may seem logical, the problem is that any provocative question, such as "Were you present when Jane Doe was shot?," would stimulate a strong reaction in any person as compared with the question about the color of the wall. An innocent person would be just as likely as a guilty person to show a notable physiological shift when asked a provocative question.
As a countermeasure to this problem, some polygraph examiners employ more sophisticated "control" questions. These questions raise ambiguous ideas intended to elicit some response, one stronger than an irrelevant question would create but not strong enough to indicate untruthfulness. Examples include "Have you ever broken the law?" or "Have you ever lied to someone in a position of authority?" Such questions can provoke anxiety but should not stimulate enough of a reaction to cause a significant shift in physiological response. The problem with this technique is that those practiced in deception can easily provide equally ambiguous answers, while those prone to feeling guilty, regardless of actual guilt, tend to show more noticeable responses.
The recurring validity problem has not gone away; in fact, it may have been exacerbated. The deliberate manipulation of questioning and interview techniques tends to arouse the desired responses without uncovering factual evidence. Police officers or polygraph examiners may incite confusion, distrust, or misgivings in the individual being questioned. This may alter nervous system responses, but it would be a breach of logic to assume that the distress a person feels while being psychologically manipulated by those in positions of authority is the same as the distress of a person who is lying.
No polygraph examiner or police officer can assume that an innocent person will be consistently unresponsive to provocative questions while a guilty person reacts strongly to questions reflecting guilt. Innocent persons may be easier to influence, while those practiced in disingenuousness may be better at producing false negatives. Truthful denials by innocent persons generate strong emotional and physiological responses, which is why the polygraph test is prone to producing false positives against innocent people.
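The false-positive concern becomes concrete with simple base-rate arithmetic. The figures below are illustrative assumptions chosen only to show the logic, not measured polygraph accuracy:

```python
# Illustrative base-rate arithmetic for polygraph screening.
# All figures are assumptions for the example, not measured test accuracy.
def screening_outcomes(population, deceptive_rate, sensitivity, specificity):
    """Return (true_positives, false_positives) for a screening test."""
    deceptive = population * deceptive_rate
    truthful = population - deceptive
    true_positives = deceptive * sensitivity        # deceptive and flagged
    false_positives = truthful * (1 - specificity)  # truthful but flagged
    return true_positives, false_positives

# Screen 10,000 applicants; assume 1 percent are actually deceptive and
# the test is (generously) assumed 90 percent accurate in both directions.
tp, fp = screening_outcomes(10_000, 0.01, 0.90, 0.90)
# Roughly 90 true positives against roughly 990 false positives: most
# people flagged as "deceptive" would in fact be truthful.
```

Because truthful test-takers vastly outnumber deceptive ones in any screening population, even a generously accurate test flags far more innocent people than guilty ones.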
Biologists and physiologists did not create the polygraph examination. It was created by those in policing at a time when other methods, such as "truth serum," were employed to elicit confessions. A recurrent belief among police is that a guilty person will not want to take a polygraph test, whereas an innocent person will. There is no scientific foundation to this notion, as people react differently to the same test regardless of their guilt or innocence. A measurement of physiological response to interview questions is no better at revealing guilt than it would be at revealing courage, resilience, or pride.
Fingerprint Analysis and Rifling Techniques
Fingerprint analysis and rifling techniques are both based on the notion that each fingerprint and each gun creates a unique mark. Fingerprint analysis officially became part of the investigative arsenal of American police when the Federal Bureau of Investigation (FBI) was formally authorized to collect and analyze fingerprints in 1924, although both Scotland Yard in England and police departments within the United States had analyzed fingerprints from the late 19th century onward. The foundational belief is that each finger on each hand is unique. Whorls, arches, and loops can be distinguished on each finger via ridges that are visible from birth. Even identical twins, who share the same DNA, do not share the same set of fingerprints, and fingerprints remain the same throughout one's life. These characteristics, combined with the inexpensive methods used to collect and analyze fingerprints, made fingerprinting a routine procedure in criminal justice investigations.
Many believed that a person's fingerprints at a crime scene were irrefutable evidence of guilt. Fingerprints are lifted from a crime scene and then sent for analysis by an expert fingerprint examiner. Often, latent prints are exposed to fumes of cyanoacrylate (more commonly known as superglue) to reveal the lines left by the sweat and oil of one's hand. In 2000, questions arose when scientific analysis concluded that thousands of individuals had been arrested based on false identification and matching of fingerprints. Fingerprints from crime scenes were matched to individuals who had been fingerprinted for other reasons, such as enlistment in the military.
An infamous case highlighted this problem. Brandon Mayfield, a Portland, Oregon, attorney, was arrested as a material witness in connection with the bombings of March 11, 2004, in Madrid, Spain. Mayfield was identified by the FBI from his fingerprints, which, according to the FBI's report, were a 100 percent match to prints found on a bag filled with detonators near the Madrid train station. Mayfield had served in the U.S. Army, so his fingerprints were in a national database controlled and operated by the FBI. An FBI expert fingerprint examiner claimed a match between the agency's file and the prints lifted from the crime scene. The FBI later acknowledged that it had used substandard prints, which created the false positive. The Spanish police, who disputed the FBI's analysis, correctly identified an Algerian man with a criminal record, and the FBI admitted error in its fingerprint analysis. Questions were immediately raised as to why the FBI would use substandard prints to make identifications. While there were several questions about the policies and procedures followed in the FBI lab, no one at the time thought to question the validity of fingerprinting itself as a tool for evidence. Fingerprinting, like nearly all other forms of forensic science, was developed by police officers, not scientists.
Cognitive neuroscientists have tested whether fingerprint examiners are biased in their interpretation of the cases provided to them. Studies published in several peer-reviewed journals, including Forensic Science International and Applied Cognitive Psychology, indicate that fingerprint examiners demonstrate bias reflecting not only their training but also the information provided to them about a case. Because there are no national standards for forensic science, myriad training centers teach fingerprint analysis, and they vary in how many unique characteristics within a fingerprint they require for a match, with thresholds ranging from 5 to 20. In 2009, the National Academy of Sciences completed a comprehensive analysis of forensic science and noted that there was scant science in the field. Notable were its findings on fingerprint analysis, which indicated that the very technology used to develop fingerprints, cyanoacrylate, smudges and distorts them as it chemically alters the residual sweat and oil.
The Mayfield case, along with the findings of the National Academy of Sciences, prompted scientists to evaluate the basic premise of fingerprinting: that every fingerprint is unique. The conclusion was that they are not. A fingerprint image from one person can share enough characteristics with that of another person to present as a match. Despite these findings, fingerprint analysis, like bite-mark analysis and footwear-impression analysis, continues to rest on finding patterns and interpreting them. Technology can give forensic scientists the capacity to scrutinize microscopic details of fingerprints; however, such techniques are costly and labor-intensive.
Rifling is the set of spiral grooves cut into the barrel of a gun, which leave marks on a bullet as it travels through the barrel. It was long assumed that each gun had a distinctive rifling pattern within its barrel and that this pattern was easily detected because bullets reinforced the pattern each time the gun was fired. This led to the premise that a bullet can be traced back to an individual gun. In reality, this is not the case. Normal use of a gun, including cleaning the barrel with wire bristles, or shifts in the hammer, firing pin, or bolt, whether through wear or through accidental or purposeful manipulation, alters the markings a gun leaves. It is questionable whether any bullet can be traced back to the gun from which it was fired. Yet forensic examination is routinely conducted on bullets found at crime scenes to determine the gun that fired them. While it is possible to identify the make and model of the gun that fired a bullet, it is improbable that the exact gun will be identified through forensic evidence.
The problem with forensic science is not limited to the historical basis on which these analysis techniques developed; it also stems from the bias and flexible interpretations of its practitioners. Because most forensic technicians are employed by police agencies and receive instructions and information from law enforcement officers, they are predisposed toward findings that reflect a match. When testifying in court about their findings, they employ scientific and technological terms that create an aura of expertise and thus persuade and influence juries. Regardless of which procedure is employed, forensic techniques other than DNA analysis do not identify individuals; they yield probabilities. Even if one's fingerprints were located at a crime scene, and one assumes that assessment to be accurate, their presence does not indicate that the person was involved in a crime. Simply having been present at some time does not show that a person was present when the crime occurred, much less involved in it.
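The point that forensics yields probabilities rather than identifications can be illustrated with database-search arithmetic. Both numbers below are assumptions chosen only for illustration, not measured error rates:

```python
# Database-search arithmetic: even a tiny random-match probability is
# expected to produce many coincidental "matches" in a large database.
# Both figures below are illustrative assumptions, not measured rates.
random_match_probability = 1 / 100_000  # chance an unrelated print "matches"
database_size = 50_000_000              # prints on file (assumed)

expected_coincidental_matches = database_size * random_match_probability
# About 500 unrelated people would be expected to "match" the crime-scene
# print, so a database hit alone cannot single out one individual.
```

The larger the database searched, the more likely a purely coincidental match becomes, which is one reason a reported match is evidence of probability, not proof of identity.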
Faulty forensic science is a recurring cause of wrongful convictions. According to the Innocence Project, unvalidated forensic science contributed to the wrongful convictions in more than half of the DNA exonerations within the United States. Some of these problems occur when the processes employed by forensic technicians have not received scientific scrutiny; others involve misconduct by technicians who are pressured to generate positive results. Both the police and the courts are under constant scrutiny, and demands from elected officials and the general public create pressure to obtain convictions. Nonetheless, both law enforcement and the courts have an obligation to prevent errors within the criminal justice system. Accepting the limitations of witnesses and forensic science, while recognizing the disruptive techniques employed by law enforcement agencies eager to capture a suspect as soon as possible, is critical to reducing wrongful convictions.
- Dunn, Kevin. “Criminal Element.” Mother Jones (January/February 2013).
- Gardner, John. “Ethics and Law.” In Routledge Companion to Ethics, John Skorupski, ed. New York: Routledge, 2010.
- Goertzel, Ted. “Capital Punishment and Homicide: Sociological Realities and Econometric Illusions.” In Science and Ethics: Can Science Help Us Make Wise Moral Judgments? Paul Kurtz, ed. New York: Prometheus, 2007.
- Gilbert, Neil. “Miscounting Social Ills.” In Fraud and Fallible Judgment, N. Pallone and J. Hennessy, eds. New Brunswick, NJ: Transaction, 1995.
- Goldstein, David. “The Fading Myth of the Noble Scientist.” In Fraud and Fallible Judgment, N. Pallone and J. Hennessy, eds. New Brunswick, NJ: Transaction, 1995.
- Iacono, William G. “Forensic ‘Lie Detection’: Procedures Without Scientific Basis.” Journal of Forensic Psychology Practice, v.1/1 (2001).
- Simon, Dan. In Doubt: The Psychology of the Criminal Justice Process. Cambridge, MA: Harvard University Press, 2012.
- Sparrow, Malcolm K. “Governing Science.” New Perspectives in Policing (January 2011). Washington, DC: National Institute of Justice. https://www.ncjrs.gov/pdffiles1/nij/232179.pdf (Accessed October 2013).