Cognitive Science: An Introduction/Biases and Reasoning Heuristics


Biases and reasoning heuristics

In the course of a day, we repeatedly have to make judgments and decisions. These can concern our relationships with friends, what to eat for dinner, which college to apply to, or which city to settle down in. But how do we go about making these decisions? Research on judgment is concerned with how individuals use various cues to draw conclusions about situations and events. In contrast, research on decision making studies how individuals choose among various options (Eysenck & Keane, 2010). One can say that judgments are evaluated in terms of their accuracy, whereas the value of decisions is typically assessed in terms of their consequences (Eysenck & Keane, 2010). However, the two areas are closely related and somewhat overlapping.

In order not to be overwhelmed by information and stimuli, we need to filter out a lot of it. The same applies to decision making: to reach a decision within a limited time frame, we must reduce the available information to a manageable amount. Heuristics help us achieve this by reducing the cognitive burden of the decision-making process and by allowing us to examine a smaller amount of information. Heuristics are thus mental shortcuts that people use for problem solving and information processing: simple rules of thumb, habitual and automatic, that free us from complete and systematic processing of information. They are usually very functional and useful in everyday life, but sometimes they lead us to make mistakes; our thinking is biased by our tendency to simplify decisions (Sternberg & Sternberg, 2009).

Daniel Kahneman and Amos Tversky have been the most influential psychologists working in the area of human judgment (Eysenck & Keane, 2010). Kahneman, an Israeli-American psychologist who originally studied attention, became world famous when, in the early 1970s, he published a series of experimental studies with Tversky on how people assess probabilities in everyday life, which shortcuts (heuristics) they use, and what biases can occur in such assessments. They also developed a theory of decision making under uncertainty which at key points deviates from prevailing economic models. For this work, Kahneman won the Nobel Prize in Economics in 2002 [1]. Kahneman, Slovic and Tversky (1982) popularized the term 'heuristic reasoning' for thinking and decision making that involves shortcuts [2].

Heuristics and biases

In this section, I will start by presenting a model of some heuristics and biases that people use in their daily decision making, and then elaborate on some of them in more depth. At the end of the section, I will address three possible fallacies that can affect the decision process and reflect on the role heuristics play in our daily life: do they help us or lead us astray?

For each named shortcut below, the Heuristic line describes the mental rule of thumb and the Bias line the error it can produce:

Framing
  Heuristic: Viewing a need in the real world as a "problem" you can work on solving.
  Bias: Mistaking your view of the problem for the real need ("framing bias").

Anchoring & adjustment
  Heuristic: Assuming a starting point and thinking about adjustments from there.
  Bias: Being overly dominated by the assumed starting point.

Status quo
  Heuristic: "Business as usual"; "if it ain't broke, don't fix it".
  Bias: Bias against anything new.

Sunk cost
  Heuristic: Treating the resources already spent on one alternative as an estimate of the resources you would have to spend all over again to start a new one.
  Bias: Treating the resources already spent on one alternative as a real cost of abandoning it for something better.

Confirmation
  Heuristic: If you're leaning towards an action, see if you can prove it's a good one.
  Bias: If you only look for supporting evidence, you could miss a fatal flaw.

Cognitive overconfidence
  Heuristic: Decisiveness; refusal to be haunted by doubt.
  Bias: Self-delusion.

Prudent estimation
  Heuristic: "Conservative estimates".
  Bias: Missed opportunities; especially dangerous in group problem solving.

Risk aversion
  Heuristic: "A bird in the hand is worth two in the bush"; avoid any probability of ruin.
  Bias: Missed opportunities; risk aversion is attractive for the individual but bad for the economy as a whole (the "certainty effect").

Selective perception
  Heuristic: Knowing what you're looking for.
  Bias: None so blind as those who will not see.

Recallability (availability)
  Heuristic: If an idea doesn't fit in with the obvious data, it's surely suspect.
  Bias: Non-obvious things can be the most important, or even the most common.

Guessing at patterns
  Heuristic: Quickly spotting the trend or the big picture.
  Bias: "Outguessing randomness": seeing patterns that don't exist.

Representativeness
  Heuristic: "If it looks like a duck and walks like a duck and quacks like a duck...".
  Bias: Ignoring the base rate can lead to serious, preventable errors.

Most likely scenario
  Heuristic: Avoids wasting time on possibilities that probably won't happen.
  Bias: Rare events can be the most important.

Optimism
  Heuristic: Go for the gold!
  Bias: Chasing after dreams, ignoring risks.

Pessimism
  Heuristic: Avoid unpleasant surprises.
  Bias: Missed opportunities.

[3]

Satisficing

Satisficing was one of the first heuristics to be formulated. Under this heuristic, we consider options one by one and select the first option that is satisfactory, that is, good enough to meet our minimum level of acceptability (Sternberg & Sternberg, 2009). This heuristic may be used when working-memory resources are limited.
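As a rough sketch (not from the source; the apartment options and the acceptability threshold below are invented for illustration), satisficing amounts to accepting the first option that clears a minimum bar:

  # Satisficing: scan options in order and accept the first one that
  # meets the minimum level of acceptability, not necessarily the best.
  def satisfice(options, is_good_enough):
      for option in options:
          if is_good_enough(option):
              return option
      return None  # no option met the threshold

  # Hypothetical apartments, scanned in the order they were found:
  apartments = [
      {"rent": 1200, "rooms": 1},
      {"rent": 950, "rooms": 2},   # first option that is "good enough"
      {"rent": 900, "rooms": 3},   # better still, but never examined
  ]
  choice = satisfice(apartments, lambda a: a["rent"] <= 1000 and a["rooms"] >= 2)
  print(choice)  # {'rent': 950, 'rooms': 2}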

Elimination by aspects

If we are faced with more alternatives than we have time to consider, we may use a process of elimination by aspects, in which we eliminate alternatives by focusing on one aspect of each alternative at a time. You pick an aspect and set a minimum criterion for it (e.g., you want to buy a computer, but it cannot cost more than 1000 EUR), then eliminate all options that do not meet that criterion. For the remaining options you select a new aspect (e.g., the computer must have a CD player), and the process continues until you are finally left with one option. In practice, we can use this process to narrow down the options before applying more thoughtful and careful strategies, which are most useful when selecting among a small number of options (Sternberg & Sternberg, 2009).
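A minimal sketch of elimination by aspects, continuing the computer example above (the specific laptops and criteria are invented for illustration):

  # Elimination by aspects: filter the options one aspect at a time
  # until few enough remain for careful comparison.
  laptops = [
      {"name": "A", "price": 1200, "cd": True},
      {"name": "B", "price": 950, "cd": False},
      {"name": "C", "price": 900, "cd": True},
  ]
  aspects = [
      lambda l: l["price"] <= 1000,  # first aspect: at most 1000 EUR
      lambda l: l["cd"],             # second aspect: must have a CD player
  ]
  remaining = laptops
  for meets_criterion in aspects:
      remaining = [option for option in remaining if meets_criterion(option)]
  print(remaining)  # [{'name': 'C', 'price': 900, 'cd': True}]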

Discounting

Discounting is when you value things less in the future than you value them in the present. The rate at which things are devalued, going forward in time, is your "discounting rate."

Here's a way to think about it: suppose I were to offer you $100 today, or some other amount of money a year from now. What amount of money a year from now would be just as valuable to you as $100 today? If the answer is, say, $103, then your discounting rate is 3% per year. The curve of discounting can be measured; sometimes it's hyperbolic, sometimes exponential.
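A short sketch comparing the two curve shapes, assuming (as in the example above) a discounting rate of 3% per year applied to $100:

  # Present value of $100 received `years` from now, at rate k = 0.03.
  k = 0.03
  for years in [0, 1, 5, 10]:
      exponential = 100 / (1 + k) ** years  # constant proportional decay
      hyperbolic = 100 / (1 + k * years)    # steep at first, then flattens
      print(years, round(exponential, 2), round(hyperbolic, 2))
  # At 1 year both give ~97.09; by 10 years they diverge (74.41 vs 76.92).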

The marshmallow test can be viewed as a measure of discounting: the kids who eat the marshmallow right away may be steeply devaluing the two marshmallows promised in the future. People from chaotic childhood environments have been shown to have very high discounting rates. When people don't trust in the stability of the future, it makes more sense to enjoy what they can get now. This sometimes includes having children; people who grew up in chaos even mature earlier.[1]

Representativeness heuristic

The representativeness heuristic means that we assess the probability of an uncertain event according to two factors:

1. How similar, typical, or representative it is in relation to the population. For example, many believe that the birth sequence GGGGG (all girls) is more likely than BBBBB (all boys) because they assume that more girls than boys are born.

2. The degree to which it reflects the underlying characteristics of the process, such as randomness. For example, most people consider the sequence BGBBBB less likely than GBGBBG, although the two are equally probable, as the simulation below illustrates. The reason is that the second sequence (GBGBBG) looks more randomly distributed and is therefore perceived as more likely.
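As a quick check (not from the source; the trial count and equal birth probabilities are assumptions), a small Monte Carlo simulation confirms that both sequences occur at the same rate:

  # With p(B) = p(G) = 0.5 and independent births, every specific
  # length-6 sequence has probability 0.5 ** 6 = 0.015625.
  import random

  def estimate(sequence, trials=200_000):
      hits = sum(
          "".join(random.choice("BG") for _ in range(len(sequence))) == sequence
          for _ in range(trials)
      )
      return hits / trials

  print(0.5 ** 6)            # exact: 0.015625
  print(estimate("BGBBBB"))  # ~0.016
  print(estimate("GBGBBG"))  # ~0.016 -- the "random-looking" sequence is no likelier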

Thus, this heuristic concerns the extent to which something or someone resembles our prototype for a category, which can affect how we classify that object or person. That we constantly rely on the representativeness heuristic is not surprising: it is simple to use and it often works. Another reason we use it is that we mistakenly believe that small samples resemble the population from which they are drawn; we underestimate the likelihood that the characteristics of a small sample do not adequately represent the characteristics of the entire population (Sternberg & Sternberg, 2009). We also tend to rely on the representativeness heuristic more often when we are influenced by anecdotal evidence (a fallacy in which knowledge is based on a single case). One reason people misuse the representativeness heuristic is that they fail to understand the concept of base rates. The base rate refers to how often an event or characteristic occurs in the relevant population (Sternberg & Sternberg, 2009).
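Base-rate neglect is easiest to see in a worked Bayes' rule calculation. The numbers below are invented for illustration: a condition with a 1% base rate and a test with a 90% hit rate and a 9% false-positive rate:

  # P(disease | positive test) by Bayes' rule, with invented numbers.
  base_rate = 0.01        # P(disease): 1% of the population
  hit_rate = 0.90         # P(positive | disease)
  false_positive = 0.09   # P(positive | no disease)

  p_positive = hit_rate * base_rate + false_positive * (1 - base_rate)
  p_disease_given_positive = hit_rate * base_rate / p_positive
  print(round(p_disease_given_positive, 3))  # 0.092 -- ~9%, not the ~90% many expect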

Availability heuristic

Some judgment errors depend on the availability heuristic, which involves estimating the frequencies of events on the basis of how easily we can call to mind what we perceive as relevant information about a phenomenon (Eysenck & Keane, 2010). Uses of the availability heuristic can be divided in two, depending on the mechanism involved. One is the availability-by-recall mechanism, which is based on the number of people an individual can recall having died from a specific risk, for example cancer. The other is the fluency mechanism, which involves "judging the number of deaths from a risk by deciding how easy it would be to bring relevant instances to mind but without retrieving them" (Eysenck & Keane, 2010).

Anchoring

A heuristic closely linked to availability is the anchoring-and-adjustment heuristic, in which people adjust their evaluations of things relative to certain reference points called end-anchors (Sternberg & Sternberg, 2009). This heuristic reflects our tendency to anchor judgments in a specific starting point and to adjust further information in relation to it. For example, inferences about other people are often rooted in ideas about ourselves: we judge how intelligent or kind someone is by referring to ourselves.

Framing

This heuristic reflects the fact that the way the options of a problem are presented influences which option is selected. For example, when faced with options involving potential gains, we tend to be risk averse: we choose an option that offers a small but certain gain (say, a sure 500 EUR) over a large but uncertain one (say, a 50% chance of 1000 EUR) (Sternberg & Sternberg, 2009).

Biases

If our heuristics fail to produce a correct judgment, the result may be a cognitive bias: a tendency to draw incorrect conclusions based on cognitive factors. In the following, I discuss some biases that occur in decision making: illusory correlation, overconfidence, and hindsight bias.

Illusory correlation

Our predisposition to see particular events, attributes, or categories as going together is a phenomenon called illusory correlation. For instance, in the case of attributes, we may use personal prejudices to form and apply stereotypes, and in the case of events, we may see false cause-and-effect relationships (Sternberg & Sternberg, 2009). Illusory correlation occurs when people perceive a correlation between events even when no such relationship exists (Pelham & Blanton, 2013/2017).

Overconfidence

Overconfidence is a common error in which individuals overvalue their own skills, judgment, or knowledge. One reason for such overconfidence may be that people do not realize how little they know and that their information may come from unreliable sources. As a result, people sometimes make poor decisions.

Hindsight bias

When we look back at a situation, we believe we can easily see all the signs and events that led to a particular outcome; this is called hindsight bias. It is common when intimate personal relationships are in trouble, where people often fail to notice signs of the difficulties until the problems get too big. In retrospect, one may ask oneself, "Why didn't I see it coming?" Because this bias impairs one's ability to compare one's expectations with the outcome, it may hinder learning (Sternberg & Sternberg, 2009).

Fallacies

Because heuristics play a central role in the decision process, incorrect inferences are easily made. In closing, I will address three possible fallacies that can affect the decision process.

Gambler's fallacy and the hot hand

The misconception that the probability of a given random event, such as winning or losing a game, is influenced by previous random events is called the gambler's fallacy, also known as the Monte Carlo fallacy (Sternberg & Sternberg, 2009). This misconception can be related to the representativeness heuristic, and rests on the belief that a pattern of past events is due to change. A contrast to this fallacy is the hot-hand effect, which refers to the belief that a streak of events will continue.
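A small simulation makes the independence point concrete (a sketch with assumed parameters: a fair coin and runs of five heads):

  # After a run of five heads, the next fair-coin flip is still 50/50.
  import random

  flips = [random.random() < 0.5 for _ in range(1_000_000)]
  streaks = 0
  heads_after_streak = 0
  for i in range(5, len(flips)):
      if all(flips[i - 5:i]):      # the previous five flips were all heads
          streaks += 1
          heads_after_streak += flips[i]
  print(heads_after_streak / streaks)  # ~0.5 -- the streak has no influence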

Conjunction fallacy

Another fallacy is the conjunction fallacy, in which an individual gives a higher probability estimate to a subset of events than to the larger set of events containing that subset (Sternberg & Sternberg, 2009). The availability heuristic may lead to such a fallacy.
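The error can be stated arithmetically: for any events A and B, P(A and B) = P(A) x P(B given A), which can never exceed P(A). A sketch using invented numbers, loosely modeled on Tversky and Kahneman's famous "Linda" example:

  # For any events A and B: P(A and B) = P(A) * P(B | A) <= P(A).
  p_bank_teller = 0.05            # invented: P(Linda is a bank teller)
  p_feminist_given_teller = 0.30  # invented: P(feminist | bank teller)

  p_teller_and_feminist = p_bank_teller * p_feminist_given_teller
  print(p_bank_teller)          # 0.05
  print(p_teller_and_feminist)  # 0.015 -- necessarily no larger than 0.05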

Sunk-cost fallacy

The last fallacy is the sunk-cost fallacy, which holds that the decision-making process takes into account not only potential consequences but also past commitments. Under the sunk-cost fallacy, we continue to invest in something simply because we have invested in it before and hope to recover our contribution. For example, if you have spent a lot of money repairing your car, you are likely to repair it again, even when it would be more economical to buy a new car (Sternberg & Sternberg, 2009).

Do heuristics help us or lead us astray?

As we have seen, one of the most common heuristics is the representativeness heuristic, which rests on the belief that small samples of a population resemble the whole population. Our misunderstanding of probability and base rates can also lead us into other mental shortcuts, as in the conjunction fallacy. Another common heuristic is the availability heuristic, in which we make judgments based on information that is readily available in memory, without bothering to seek less available information (Sternberg & Sternberg, 2009). Other heuristics and biases, such as anchoring and adjustment, framing effects, and illusory correlation, often distort the fast decisions we make. Once a person has made a decision and its outcome is known, we may engage in hindsight bias. Perhaps the most serious of all biases is overconfidence, which seems to be resistant to evidence of our own errors.

Heuristics do not always lead to poor decisions or wrong judgments; rather, they may function as simple ways of drawing sound conclusions. The work on heuristics and biases shows the importance of distinguishing between intellectual competence and intellectual performance as it manifests itself in daily life. For example, experts in statistics and probability can find themselves falling into faulty patterns of decision making and judgment in their everyday lives. Even though people may be intelligent in a conventional, test-based sense, they may show the same biases as someone with a lower test score. This may indicate that people often fail to utilize their intellectual competence in daily life.

References

  1. Store norske leksikon: Daniel Kahneman.
  2. Blackwell Publishing: What is heuristic reasoning? What are the pros and cons of heuristic reasoning?
  3. Georgia State University: Heuristics and biases.
  4. Sternberg, R. J., & Sternberg, K. (2009). Cognition (6th ed.). Wadsworth: Cengage Learning.
  5. Eysenck, M. W., & Keane, M. T. (2010). Cognitive Psychology (6th ed.). Sussex: Psychology Press.
  6. Hutson, M. (2015). When destructive behavior makes biological sense. Nautilus, Nov/Dec, 107-113. http://nautil.us/issue/31/stress/when-destructive-behavior-makes-biological-sense