## Judgement Research

Much of the judgement research we will review has its origins in our everyday experience of revising our opinions in the light of new information or evidence. For example, suppose that you believe that it is very likely that someone has lied to you. If you then discover that another person supports their story, your confidence in your original belief is likely to decrease (this example was suggested by Peter Ayton, personal communication). Changes in beliefs can often be expressed in terms of probabilities. For example, we may initially be 90% confident that the other person is lying, but when their story is confirmed by another person, the probability may be reduced to 60%. The Rev. Thomas Bayes went still further, and produced a mathematical formula which shows how we can combine probabilities in order to calculate the impact of new evidence on a pre-existing probability. More specifically, he focused on situations in which there are two beliefs or hypotheses (e.g., X is lying vs. X is not lying), and he showed how new data or information alter the probabilities of these two hypotheses. The approach adopted by Bayes has been applied to conditional reasoning (see Chapter 16).

According to Bayes' theorem, we need to take account of the relative probabilities of the two hypotheses before the data were obtained (the prior odds), and the relative probabilities of obtaining the data under each hypothesis (the likelihood ratio). Bayesian methods evaluate the probability of observing the data, D, if hypothesis A is correct, written p(D/HA), and if hypothesis B is correct, written p(D/HB). Bayes' theorem itself can be expressed in the form of an odds ratio as follows:

p(HA/D) / p(HB/D) = [p(HA) / p(HB)] × [p(D/HA) / p(D/HB)]

On the left of the equation, we have the relative probabilities of hypotheses A and B given the data (the posterior odds), which is what we are trying to find out. On the right of the equation, we have the prior odds of each hypothesis being correct before the data are collected, multiplied by the likelihood ratio, i.e., the relative probabilities of the data given each hypothesis.
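The odds form of the theorem is easy to compute directly. The sketch below (the function and variable names are my own, not from the chapter) takes the two prior probabilities and the two likelihoods and returns the posterior odds of HA over HB:

```python
def posterior_odds(p_ha, p_hb, p_d_given_ha, p_d_given_hb):
    """Posterior odds of HA over HB: prior odds times the likelihood ratio."""
    prior_odds = p_ha / p_hb
    likelihood_ratio = p_d_given_ha / p_d_given_hb
    return prior_odds * likelihood_ratio

# Example: equal priors, but the data are four times as likely under HA,
# so the posterior odds favour HA by 4 to 1.
print(round(posterior_odds(0.5, 0.5, 0.8, 0.2), 2))  # 4.0
```

Because the priors cancel when they are equal, the posterior odds here simply equal the likelihood ratio.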

In order to clarify what is involved in Bayes' theorem, we will consider the taxi-cab problem used by Tversky and Kahneman (1980):

A taxi-cab was involved in a hit-and-run accident one night. Two cab companies, the Green and the Blue, operate in the city. You are given the following data: (a) 85% of the cabs in the city are Green, and 15% are Blue, and (b) in court a witness identified the cab as a Blue cab.

However, the court tested the witness's ability to identify cabs under appropriate visibility conditions. When presented with a series of cabs, half of which were Blue and half of which were Green, the witness made the correct identification in 80% of the cases, and was wrong in 20% of cases.

What is the probability that the cab involved in the accident was Blue rather than Green? ___ per cent.

We will refer to the hypothesis that the cab was Blue as HA, and the hypothesis that it was Green as HB. The prior probabilities are 0.15 for HA and 0.85 for HB. The probability of the witness saying the cab was Blue when it was Blue, i.e., p(D/HA), is .80, and the probability of the witness saying the cab was Blue when it was Green, i.e., p(D/HB), is .20. If we enter these values in the formula, we obtain the following ratio: (0.15 / 0.85) × (.80 / .20) = .12 / .17. In other words, the posterior odds favour the Blue hypothesis by about .71 to 1, which corresponds to a probability of .12 / (.12 + .17) = .41 that the cab was Blue.
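The taxi-cab arithmetic can be checked mechanically. A minimal Python sketch (the variable names are my own):

```python
# Taxi-cab problem: HA = cab was Blue, HB = cab was Green.
p_ha, p_hb = 0.15, 0.85        # prior probabilities of each hypothesis
p_d_ha, p_d_hb = 0.80, 0.20    # p(D/HA) and p(D/HB): witness says "Blue"

odds = (p_ha / p_hb) * (p_d_ha / p_d_hb)   # posterior odds of Blue over Green
p_blue = odds / (1 + odds)                 # convert odds back to a probability
print(round(odds, 2), round(p_blue, 2))    # 0.71 0.41
```

Despite the witness being correct 80% of the time, the low base rate of Blue cabs keeps the posterior probability below one half.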

