Availability bias

Availability bias can be defined as a misjudgment of risk: an assessment based on false assumptions, in which a person believes something is more likely than it actually is because they have heard, read, or talked about it a disproportionate number of times. It may also arise from a situation the person experienced earlier, which they then project onto the present or future. The mental availability of corresponding examples is therefore decisive for our judgment: the actual question is unconsciously replaced by a heuristic question that is easier to answer.

The term availability bias goes back to the psychologists Amos Tversky and Daniel Kahneman, who coined it in 1973. In their work, they showed that people tend to overestimate the likelihood of a scenario if they can easily imagine or remember it[1].

Applications of the concept

The availability bias appears in everyday life in scenarios such as the following[2][3][4][5]:

  • Education: A study by Fox examines whether the difficulty of recalling information affects judgment. Two groups of students filled out a course evaluation form: one group had to note two positive aspects of the course and two suggestions for improvement, while the other group had to write down ten suggestions for improvement and two positive comments. Both groups were then asked to rate the course on a scale from 1 to 7. The students asked for ten suggestions rated the course better, because generating that many criticisms was difficult: most of them wrote down no more than two, since they could not recall further instances of dissatisfaction with the class. The students asked for only two complaints found them easy to recall and therefore rated the course more critically.
  • Finance: In complex financial decisions, there is a danger that categorizing information oversimplifies what is actually happening. Wrong decisions can result, particularly when investors rely too heavily on their own, often limited, experience. One speaks of resonance when decision-relevant information happens to match the investor's personal situation or experience; such information is usually perceived more intensively and can steer the intake and processing of information in a particular direction.
  • Media: Our sense of the frequency and likelihood of events is skewed by the media coverage and emotional intensity of the news stories to which we are exposed. This can lead us to grossly overestimate certain risks simply because the media report on them more often. Public events such as politicians' corruption scandals or celebrity divorces draw a lot of attention and are therefore easy to recall, so we overestimate their frequency. Dramatic events such as plane crashes likewise increase the mental availability of an event or category: after a plane crash, intensive media coverage creates temporarily high mental availability, and we grossly overestimate the frequency of plane crashes and the risk of air travel, even though the extremely low statistical risk has not changed (a minimal simulation of this distortion follows after this list).
  • Perceived risk: Imagining a hypothetical event can make it appear more likely by establishing causal relationships. Such effects can be produced by the availability heuristic, since making an event easier to imagine increases its subjective likelihood. The availability heuristic often leads to risk-averse behavior: people try to avoid dangerous situations even when the danger is unlikely.
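
The media example above can be illustrated with a small simulation. This is a purely illustrative sketch, not part of the cited studies: the event probabilities, the media weighting, and all other numbers are invented assumptions. It models an observer who judges how common an event is by how easily examples come to mind, where heavily reported events leave many more memory traces than mundane ones:

```python
import random

# Illustrative sketch: all numbers below are invented assumptions.
TRUE_P = {"plane crash": 0.0001, "car crash": 0.01}  # assumed daily probabilities
MEDIA_WEIGHT = {"plane crash": 500, "car crash": 1}  # assumed coverage per event

random.seed(42)
DAYS = 100_000

# Simulate which events actually happen; each occurrence is stored in
# "memory" with a strength proportional to its media coverage.
memories = []
for _ in range(DAYS):
    for event, p in TRUE_P.items():
        if random.random() < p:
            memories.extend([event] * MEDIA_WEIGHT[event])

# Availability-based judgment: estimate relative frequencies by sampling
# from memory, where heavily covered events are easier to recall.
sample = random.choices(memories, k=10_000)
total_p = sum(TRUE_P.values())
for event, p in TRUE_P.items():
    estimated = sample.count(event) / len(sample)
    print(f"{event}: true share {p / total_p:.3f}, "
          f"availability-based estimate {estimated:.3f}")
```

Under these made-up numbers, plane crashes account for about 1% of the events that actually occur but for roughly 80% of the availability-based estimate, mirroring how disproportionate coverage inflates perceived risk.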

Footnotes

  1. Cheng, Li, Liu, 2020, pp. 3141-3143
  2. Fox, 2006, pp. 86-89
  3. Asl, Gholipour, Gholipour, Rostami, Sadi, 2010, pp. 234-237
  4. Ruscio, 2000, pp. 22-26
  5. Hertwig, Pachur, Steinmann, 2012, pp. 314-316

Author: Max Bachmann