There are three major factors that account for the difficulty the defense has with presenting a general causation case to a jury. They have to do with the ways in which people process information; the relevance of general causation evidence to what the jurors care about; and the complexity of the evidence. This article will discuss the first of these.
In everyday life people do not always think in the most logical and rational fashion. The work of two different Nobel Prize winners is illuminating here. Herbert Simon won the Nobel Prize in Economics in 1978. He developed a theory of decision-making based on the idea of bounded rationality. People face uncertainty about the future, and acquiring information in the present carries costs that are often insurmountable.
Because of this, people are unable to make fully rational decisions. They have only "bounded rationality." People are either unable or unwilling to maximize the rationality of the decisions that they make (because they are unable or unwilling to acquire all the information they would need to do so). Instead, they make decisions by "satisficing": setting an aspiration level which, if achieved, they will be happy enough with (Simon, Herbert (1997) Administrative Behavior, 4th Edition, New York: The Free Press).
Let's take as an example a decision about whether to fly or take the train from Boston to New York when time is of utmost importance. Attempting to make a maximally rational decision, a person might investigate the percentage of on-time arrivals for both the plane and the train; how this is affected by the time of day; and the impact of weather conditions. However, what is more likely is that the person will ask a few people who have taken both modes of transportation about their experiences with on-time arrival. That information will be satisficing.
In the courtroom, the jurors are given enormous amounts of information in a very unnatural manner. What the defense lawyers and their expert witnesses often attempt to do is provide jurors with the information they need to make a maximally rational decision. It would take a superhuman effort to absorb, understand, and remember all this information in the artificial format of the courtroom.
This is particularly true in a complex toxic tort or products liability case. Jurors make decisions based on what they generally know to be an incomplete understanding of the evidence. They seek to make decisions that will be satisficing to them, even if they are not maximally rational. What we often find out from jurors when they have listened to all the evidence on causation is that what is determinative for them is the experience they, their family members, and friends may have had with the substance in question.
Some jurors also stop making the cognitive effort to acquire the information presented to them in the generally mistaken belief that a fellow juror is acquiring it for them (a satisficing result). Instead of asking ourselves what a juror would want to know in order to make a reasonable decision about the relationship between exposure to a substance and a disease, we mistakenly ask ourselves what we want to tell them. The consequence is that they get too much information. Like all of us, jurors have a handy ability to forget, reinterpret, and disbelieve information that is not consistent with the way they have decided to process the information.
The work of the 2002 Nobel Prize winner in Economics, Daniel Kahneman (and his late colleague, Amos Tversky), shows that there are common methods people use to "make judgments under uncertainty" (Tversky, Amos and Daniel Kahneman (1974) "Judgment Under Uncertainty: Heuristics and Biases," Science, 185, 1124-1131). They call these heuristics, or rules of thumb. Again, while they are not necessarily rational in the economic sense, they make sense to the decision-maker.
In a famous example, they point out that many people are willing to drive across town to save $5 on a $15 calculator, but would not be willing to drive across town to save $5 on a $125 coat. Rationally, if the first decision makes sense then the second makes equal sense. We find examples of this same kind of reasoning in the courtroom. Jurors will see a chemical that causes death in ten people in a million as more dangerous than one that causes one death in a hundred thousand, even though the two risks are mathematically identical.
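The arithmetic behind these two comparisons can be made explicit. The following is a purely illustrative sketch using the figures quoted above:

```python
# Calculator vs. coat: the absolute saving is identical in both cases,
# but people instinctively weigh it relative to the purchase price.
saving = 5
calculator_price, coat_price = 15, 125
print(saving / calculator_price)  # a roughly 33% discount feels worth the drive
print(saving / coat_price)        # a 4% discount does not, though $5 is still $5

# "Ten deaths in a million" vs. "one death in a hundred thousand":
# expressed as rates, the two risks are the same number.
rate_a = 10 / 1_000_000
rate_b = 1 / 100_000
print(rate_a == rate_b)  # True
```

The same $5, framed against different denominators, produces opposite decisions; the same mortality rate, framed with different denominators, produces opposite risk judgments.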
Two forms of heuristics stand out in their work in terms of their applicability to science in the courtroom. The first is the availability heuristic. If an event is easier for a person to imagine or recall, the person is more prone to judge it as likely to occur. People overestimate the frequency of low-probability but dramatic hazards. Our research has shown that people overestimate the frequency of deaths from lung cancer in this country. The mortality rate from lung cancer for women in 1999 was 40 out of 100,000 (American Cancer Society (2003), Cancer Facts and Figures 2003, p. 4).
Our research shows that most people believe the mortality rate to be between one and five percent. Other research has shown that people overestimate the likelihood of nuclear power plant accidents. On the other hand, people underestimate high-probability hazards that are less memorable, such as certain diseases (Covello, V.T. (1995) "Risk Comparison and Risk Communication: Issues and Problems in Comparing Health and Environmental Risks," in Kasperson, R.E. and P.M. Stallen (eds), Communicating Risk to the Public, Dordrecht: Kluwer Academic Press).
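The size of that gap is worth spelling out. Using the figures above (the calculation itself is just illustrative):

```python
# Actual 1999 female lung-cancer mortality, per the American Cancer Society
actual = 40 / 100_000                     # i.e., 0.04%

# What most people believe: "between one and five percent"
believed_low, believed_high = 0.01, 0.05

print(round(believed_low / actual))       # 25  -> a 25-fold overestimate at the low end
print(round(believed_high / actual))      # 125 -> a 125-fold overestimate at the high end
```

Even the most conservative lay estimate inflates the true rate by a factor of twenty-five.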
These points are very important in the context of a scientific defense of causation in the courtroom. If the plaintiff has suffered an injury that is familiar to the juror or easily imagined, then the juror is likely to overestimate the likelihood of it happening and believe the defense should have anticipated this likelihood and done something to prevent it (further reinforced by the hindsight bias in which people tend to view something that happened as inevitable and therefore something that could have been anticipated). If the injury or illness is unfamiliar to the juror, the juror is likely to see it as a rare event that therefore is likely to have an idiosyncratic cause (such as the product or substance the plaintiff claimed caused it).
The second heuristic is the representativeness heuristic. This refers to the tendency of people to ignore the size of the sample and the base rate when making inferences about causation. Let us take the example of PCBs. Here the literature used by plaintiffs is replete with scientific studies that rely on very small sample sizes and which make no attempt to compare the prevalence of a disease in the sample to the prevalence of the disease in the general population. But these studies are still appealing to jurors. If someone has PCBs in their blood and a disease that has been related to PCB exposure, jurors tend to ignore the base rate of the percentage of people in the population with PCBs in their blood (which is one hundred percent).
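Why the base rate matters can be shown with a toy calculation. The 100% prevalence figure comes from the text; everything else below is a hypothetical setup for illustration:

```python
# If essentially everyone carries detectable PCBs in their blood, then finding
# PCBs in a plaintiff's blood cannot distinguish the plaintiff's causal theory
# from its alternative: the finding is equally likely either way.
p_pcb_given_disease = 1.0   # prevalence of PCBs among people with the disease
p_pcb_given_healthy = 1.0   # prevalence of PCBs among everyone else

# In Bayesian terms, evidence shifts the odds by its likelihood ratio.
likelihood_ratio = p_pcb_given_disease / p_pcb_given_healthy

prior_odds = 0.10           # hypothetical prior odds on the causal claim
posterior_odds = prior_odds * likelihood_ratio

print(likelihood_ratio)     # 1.0 -> the PCB finding carries no evidential weight
print(posterior_odds)       # unchanged from the prior
```

A likelihood ratio of exactly one means the blood test result, however vivid, should not move a rational fact-finder at all; jurors reasoning by representativeness treat it as decisive anyway.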
The work of Paul Slovic on the perception of risk also tells us a great deal about how laypeople's reasoning affects how they will see the defense's evidence on general causation (for a good summary of his work, see Slovic, Paul and Elke U. Weber (2002), "Perception of Risk Posed by Extreme Events," paper prepared for discussion at the conference "Risk Management Strategies in an Uncertain World," Palisades, New York, April 12-13, 2002). Slovic points out that the way in which experts calculate risk differs greatly from how laypeople calculate risk. In both instances there are subjective components.
Applying Slovic's findings to a toxic tort case, for example, a juror will be more likely to perceive a substance as posing great health risks if, among other things, exposure to the substance was involuntary, the plaintiff did not know he or she was exposed, the effect of exposure was delayed, and the risks and benefits to the public are unfairly distributed. At one end of the spectrum is a situation where a worker who had been warned of the dangers of exposure to a chemical used in manufacturing an important product knowingly exposes himself to the chemical and suffers an immediate injury.
At the other end of the spectrum is a person who as a child was exposed, through groundwater contamination, to the same chemical disposed of as waste, and who develops an injury many years later. A juror's perception of the risk of the chemical to the public at large (or to him or herself) will be much lower in the first situation than in the second. Although proving a causal connection may be more difficult for the plaintiff in the second situation, jurors will be much more bothered by it and therefore motivated to find a connection.
Jurors who reject the defenseâ€™s general causation argument are not necessarily ignorant or stupid. They may simply be employing forms of reasoning that we all use to negotiate everyday life. But among the results of these forms of reasoning are:
1. A tendency to see patterns and clusters in what is actually randomness (a disease cluster, for example).
2. A tendency to believe that any chemical is toxic and that exposure to any amount of that chemical has the ability to cause any disease.
3. A tendency to trust studies based on small sample sizes as well as anecdotal evidence.
4. A tendency to ignore the prevalence of a disease in the general population.
5. A tendency to confuse correlations between exposure to a substance and the presence of a disease with a causal relationship.
In everyday life, scientists and lawyers do not behave like scientists and lawyers. For jurors, a trial is everyday life. They do not and will not reason like experts. To the lawyer and scientist, jurors may appear to be making mistakes in their reasoning, but they are not. They are doing the reasoning of everyday life. How this reasoning is applied to scientific evidence can be understood, and consequently the evidence can be presented to them in a way that makes sense to everyone: lawyers, scientists, and jurors alike.