Subject: Availability Bias – The Letter “K”
Event: Daniel Kahneman is born, 1934
Imagine that you are driving a friend to the airport. As you drive, your passenger begins talking about her fear of flying; she says, for example, that she had nightmares last night about dying in a fiery crash. Now she is doing everything she can to muster the courage to go through with her plans to fly. What might you say to reason with her?
You might begin by gently explaining to her that what she is doing right now, riding in a car, is far more dangerous than flying. While more than 40,000 Americans are killed each year in car accidents, fewer than 1,000 die in airplane accidents. Her fear of flying, therefore, is based more on a false perception of danger than on the actual facts.
To further calm your passenger, you might recommend that she read a book called Thinking, Fast and Slow by Nobel Prize-winning psychologist Daniel Kahneman, who was born on this day in 1934. In his book, Kahneman explores the way humans think, explaining the two separate operating systems we use to do so: System 1 and System 2.
System 1 is our default mode of thinking. It’s fast, intuitive, and emotional. System 2 is our slower, more deliberate mode of thinking; it requires more attention and energy because instead of being automatic, it uses logic and reason to draw its conclusions.
Even though the speed and effortlessness of System 1 have their advantages, this mode of thinking can often lead us to take mental shortcuts that steer us in the wrong direction. Kahneman reviews a number of these errors, which are called cognitive biases.
Your friend’s fear of flying, for example, resulted from a cognitive bias called availability bias (also known as the availability heuristic). This bias stems from our preference for System 1 thinking; when we recall information, we rely on details that come easily from our memory. The information that comes to mind, therefore, is not necessarily the most accurate; instead, it is the most easily accessible. Because your friend probably has seen more news stories on plane crashes than car accidents, she has formed an inaccurate perception of the frequency of aircraft accidents.
To experience the availability bias for yourself, consider the following question about words:
Is the letter K more likely to appear as the first letter in a word OR as the third letter?
Despite the fact that there are roughly twice as many English words with K as the third letter, most people falsely perceive that there are more words that begin with K. The reason for this is the availability bias: when we try to think of words, it’s much easier to retrieve them by their first letter; as a result, we confuse ease of access with reality.
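If you want to check the intuition for yourself, the comparison is easy to automate. The sketch below counts how often "k" appears as the first versus the third letter across a list of words; the short sample sentence is purely illustrative, not a representative corpus, so to test the real claim you would feed in a full word list or a long text of your own.

```python
def count_k_positions(words):
    """Return (first, third): how many words have 'k' as their
    first letter vs. their third letter."""
    first = sum(1 for w in words if len(w) >= 1 and w[0] == "k")
    third = sum(1 for w in words if len(w) >= 3 and w[2] == "k")
    return first, third

# Illustrative sample only; swap in a real word list to test the claim.
sample = "the baker asked the king to make a cake like a kite".split()
print(count_k_positions(sample))  # → (2, 5)
```

Even in this tiny sample, third-position K words (baker, asked, make, cake, like) outnumber the K-initial ones (king, kite), yet the K-initial words are the ones that spring to mind first.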
It’s not easy to avoid the availability bias. We like taking the kind of mental shortcuts that allow us to make quick decisions: System 1 is our default mode of thinking, and we like to avoid the mental effort that System 2 thinking requires. The best thing we can do is remember that just because something comes to mind easily does not mean it is true. Perception is not reality. We are influenced by the things we see and hear most frequently and by the things that are most vivid and emotion-packed. Knowing this is the nudge we need to put forth the extra effort required to shift from System 1 to System 2.
Recall, Retrieve, Recite, Ruminate, Reflect, Reason: What is the availability bias, and how does it relate to System 1 and System 2 thinking? Why does the availability bias lead people to get the “K” problem wrong?
Challenge – Boatload of Biases: The availability bias is just one of hundreds of cognitive biases. Take a look at Wikipedia’s “List of Cognitive Biases,” where you will find over one hundred different examples. Select one, and write a PSA explaining what it is and how it can be avoided to think more clearly and cogently.
1. Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.