Brian Backman, Anacortes High School
1. FAST and SLOW: Know the Difference Between Fast and Slow Thinking: Our brains have the capacity for solving problems quickly based on intuition or past experience. There are times, however, when this kind of fast thinking can mislead us. For example, imagine I tell you there is a girl named Julie who is a senior in high school and that Julie learned to read when she was three years old. If I asked you to estimate her GPA, what would you say? Because I told you that Julie was an early reader, you probably estimated that her GPA was fairly high — at least 3.7. However, the truth of the matter is there are a lot of other factors to consider that might result in Julie having a low GPA. Maybe she had a childhood disease that caused her to miss a lot of school, maybe she had a family crisis in her junior year, or maybe she fell into a bad crowd at some point in middle school or high school. The reality is that instead of generating these possibilities, the mind fixates on the data provided and jumps to the fastest, easiest conclusion. Cognitive psychologists call this the availability bias. It’s a mental shortcut — or heuristic — where we latch onto the first ideas that come to mind. As the Nobel Prize-winning psychologist Daniel Kahneman says, “Mental effort, I would argue, is relatively rare. Most of the time we coast.” You can avoid jumping to hasty conclusions by slowing down and taking the time to question, analyze, and evaluate a variety of possible conclusions along with the evidence that supports them.
2. RIDER and ELEPHANT: Explore the Relationship Between the Rider and the Elephant: Much of what happens in our brains, such as basic drives and emotions, happens at the subconscious level in the limbic system. The conscious, reasoning part of our brain exists in the neocortex, specifically the frontal cortex, which seems to be the site of the brain’s executive functioning. Psychologist Jonathan Haidt has an ingenious metaphor that helps us understand our divided self and the brain’s two parts: the old unconscious brain of emotion and the new conscious brain of reason. Haidt’s metaphor is a single rider atop an elephant. The rider, representing reason, holds the reins, attempting to direct and control the elephant, representing emotion. The elephant (emotion), being larger and older than the rider (reason), has a mind of its own. The rider can vie for control, but the elephant will often follow its own desires and passions, disregarding the tug of reason. When you procrastinate or when you make an impulse purchase that you later regret, you’ve surrendered to the elephant. Consider the rider and the elephant metaphor when you sit down to tackle a thinking task. As Haidt says in his book The Happiness Hypothesis, “Our ancestors were mere animals governed by the primitive emotions and drives of the limbic system until they received the divine gift of reason, installed in the newly expanded neocortex” (10). Like new software that we haven’t quite mastered, reasoning requires focused attention and effort. Likewise, we must put in extra effort if the rider (reason) is to control the direction we move in rather than allowing the elephant (emotion) to pull us wherever it pleases. The philosopher David Hume famously argued that reason is the slave of the passions. However, with knowledge of the relationship between the elephant and the rider, we are better prepared to rein in our emotions, freeing ourselves to let reason be our guide.
3. WHAT and WHY: Think About Not Just WHAT You Believe, but Also WHY You Believe It: Socrates used an analogy to describe the difference between unsound truth and sound truth. He imagined two beautiful statues by the sculptor Daedalus. The unsound truth, arrived at through intuition, is like a statue placed precariously atop a pillar. The first strong wind that comes along will knock it over. The sound truth, however, is anchored to the ground by tethering cables, making it impervious to even gale-force winds. Supported by reasons, evidence, and awareness of counterarguments, the sound truth will stand up to scrutiny without falling (de Botton 26). Asking yourself why you believe what you believe will help you ground your thinking in reasoning.
4. CLAIM and PROOF: Ask Yourself HOW You Know What You Know: Too often we incorrectly focus on the claim instead of the proof that the claim is based upon. For example, if someone claims that “climate change is not influenced by human activity,” you might respond by saying, “That’s not true!” However, a better response would be to focus on the proof by asking, “How do you know that?” This response allows you to determine the evidence upon which the person’s claim is based and to examine that evidence. Instead of focusing on a conclusion, you can examine the process by which the person reached that conclusion. Likewise, you should ask yourself this same question when you are examining your own beliefs. By focusing on the proof instead of the claim, you will better understand the sources of your beliefs, and you will be more likely to question and test your evidence. A good analogy for this focus on proof rather than claim is juggling. When you are learning to juggle, there is a temptation to look at the ball as it falls into your hand. If you do this, however, you will never be successful. Instead of looking down as the ball falls, you need to keep your eyes looking straight ahead at the top of the ball’s arc. Learning to juggle requires you to catch the ball without looking at it so that the toss and the arc of the ball stay true. In critical thinking, you are more likely to avoid dropping the ball by keeping your eyes on the proof rather than the claim.
5. HUBRIS and HUMILITY: Watch Out for Hubris, and Embrace Humility in Your Thinking: Too often we become so wrapped up in appearing smart that we become scared of simply saying, “I don’t know.” Instead we pretend that we know something when we don’t, and even worse, we sometimes delude ourselves into believing we know when we really don’t. In the 1990s, David Dunning and Justin Kruger of Cornell University conducted studies demonstrating that the less competent a person is, the more likely that person is to overestimate their competence. An example of the Dunning-Kruger Effect in action is McArthur Wheeler. In 1995, Wheeler robbed two Pittsburgh banks. He made no attempt to disguise himself, and within hours he was apprehended after pictures of him taken from the banks’ surveillance cameras were shown on the evening news. After he was arrested, Wheeler was shocked, saying, “But I wore the juice!” Apparently, Wheeler believed that rubbing lemon juice on his face before robbing the banks would make him invisible to the banks’ cameras. The ancient Greeks had a word for this kind of excessive pride and overconfidence: hubris. And Shakespeare wrote in As You Like It, “The fool doth think he is wise, but the wise man knows himself to be a fool.” Keep your ego in check, and don’t allow it to lead you astray. Instead, approach each thinking task with humility.
6. CLAIMS and COUNTERCLAIMS: Test Your Claims By Generating Counterclaims: Avoid haphazard thinking by being systematic. When the oracle at Delphi declared that no one was wiser than he, Socrates was incredulous. However, when he humbly went searching for people wiser than himself, he couldn’t find any. He found plenty of people who claimed to know things and who were not shy about spouting their opinions. However, when Socrates questioned these people about why they believed what they believed, they could not defend their claims. Socrates concluded, finally, that the oracle was correct, saying to himself, “At least I know that I don’t know.” Socrates realized that sound thinking is systematic thinking. His method began with a vital FIRST step: with wonder, with inquiry, with questions. SECOND, he would hypothesize an answer to his question by stating a definition or a claim. His THIRD step was to examine the reasoning and evidence for the claim by asking, “How do I know this is true?” Socrates’ FOURTH step mirrored the scientific method by looking for counterarguments or counterexamples that would undermine his own claims. He understood that this step was vital to counter the natural human tendency to focus on evidence that supports our claims while ignoring evidence that disproves them. This tendency is called confirmation bias (or “myside bias”). Using this method to test your ideas does not guarantee that you will discover absolute truth, but if you have spent time thinking about the proof that supports your opinions as well as the alternatives to them, your opinions are more likely to be correct.
Take the following advice from Charles Darwin: “I followed a golden rule, namely that whenever a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favorable ones.”
7. ACTIVE and PASSIVE: Thinking is an Active Process, Not a Passive One: To contrast sound thinking with the passive way most people approach it, Socrates compared his thinking methodology to pottery. Few would tackle the complex task of creating a vase based on intuition alone, yet many rely on mere intuition when approaching decisions about key issues in their lives. Like sound reasoning, making a pot involves a complex process: selecting clay, placing it on a wheel, spinning it, forming it, glazing it, and firing it. Each stage requires active attention to detail, practice, and rigor. Like a well-crafted piece of pottery, well-crafted thought only holds water if you put in the hard work to do it correctly.
8. THINKING and METACOGNITION: Think for Yourself, and Think about Your Thinking: Each individual is a member of a community and of a larger society. As members of decision-making groups and as participants in democratic processes, we should be wary not only of the potential pitfalls of individual thinking, but also of the pitfalls of collective thinking, such as the power of conformity and of blind obedience to authority. Take ownership of your beliefs by asking yourself why you believe what you do. Don’t just believe something because someone told you that it’s true, because the majority of people believe it, or because it appears to be common knowledge or conventional wisdom. Engage with others by asking questions, testing claims, evaluating evidence, and generating counterclaims. By thinking about your own thinking — a process known as metacognition — you will be better equipped to contribute to society as a reasoning, thoughtful citizen. A classic case study in poor group decision making is the failed Bay of Pigs invasion of 1961. The plan, put together by John F. Kennedy’s administration, was to invade Cuba and overthrow Fidel Castro. Even though President Kennedy involved some of the brightest people in the world in his planning process, the invasion failed miserably. This puzzled the president, but once he began to examine what happened, he realized his error. Instead of encouraging his subordinates to scrutinize and question the invasion plan, he had allowed his men to simply go along with it, telling him what they thought he wanted to hear. When we work in groups, we are often too quick to try to maintain harmony and to establish consensus. The lesson of the Bay of Pigs debacle is that to avoid this kind of groupthink, the individuals in a group need to feel free to speak their minds and to scrutinize a plan’s weaknesses as well as its strengths.
Kennedy capitalized on the lessons of 1961 when the Soviet Union secretly deployed nuclear missiles to Cuba in October 1962. After discovering the missiles, Kennedy made sure that he heard multiple points of view and that everyone involved was encouraged to debate, to argue, and to disagree. This time, instead of failure, Kennedy achieved success. The Soviets agreed to remove the missiles from Cuba, and nuclear conflict was averted.*
*For additional case studies in group psychology, look up the following: The Asch Paradigm (Solomon Asch), Cognitive Dissonance (Leon Festinger), The Milgram Experiments (Stanley Milgram), The Stanford Prison Study (Philip Zimbardo).
de Botton, Alain. The Consolations of Philosophy. New York: Vintage International, 2000.
Haidt, Jonathan. The Happiness Hypothesis. New York: Basic Books, 2006.
Kahneman, Daniel. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.