We begin the course by reviewing what scientists call “critical thinking skills.” How do you KNOW what you know? This question is also known by a more daunting name: epistemology. We are going to spend today talking about the difference between opinion, belief, and knowledge, and about the two different systems our brains use for deciding what we think is true and what we think is false. The title “Thinking, Fast and Slow” comes from a 2011 book by Daniel Kahneman. He and his late colleague Amos Tversky developed the two-system model we’ll discuss today. Kahneman received a Nobel Prize for their joint work (Tversky died before the prize was awarded, and Nobels are not given posthumously), and together they helped found the discipline of behavioral economics.


Knowledge occurs at the intersection of belief and truth. In this diagram, things are either true (or not) and either believed (or not). Notice that not all true things are believed, and that there is plenty of room on the right-hand side of the diagram for things that we believe but that are not true. What is an example of something we believe in but which isn’t true? (Until we’re about 7 or 8, probably Santa Claus.) Knowledge is created when we take a belief (call it a hypothesis or an opinion if you want to) and put it to the test. If our belief passes the test, we can now say it is true and count it as knowledge.
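In set-diagram shorthand (my notation, not the slide’s):

Knowledge = Truth ∩ Belief

Everything outside that overlap is either an unrecognized truth or an unjustified belief.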


As we try to expand our knowledge, we can use things we already know for sure to try to prove things we are unsure of. This method of taking pieces of knowledge (also known as “facts”) and linking them through logic to form a conclusion is known as constructing an argument. The classic form of an argument is called a syllogism. Here, statements of fact (called “premises”) are linked through logic to reach a conclusion.
Example:
Premise 1: Socrates is human.
Premise 2: All humans are mortal.
Conclusion: Therefore (using the logic of the transitive property), Socrates is mortal.
Note: the transitive property should be familiar from math class. If A = B and B = C, then A = C.


Another way of looking at the syllogism is through set theory.
Here, let Set A be Socrates, a subset of Set B (all humans). Set B is itself a subset of Set C (all things that are mortal). Therefore, Set A (Socrates) is a subset of Set C (things that are mortal).
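Written out in set notation, the whole chain is:

{Socrates} ⊆ Humans, and Humans ⊆ Mortals, therefore {Socrates} ⊆ Mortals

(Strictly speaking, Socrates is an element of the set of humans, so we treat “Set A” as the one-member set containing only Socrates.)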


You need to separate the truth of the premises from the logic used to connect them. You can have a valid argument (that is, one with valid logic) whose premises are flawed, resulting in a false conclusion. Arguments can go bad in one of two ways. First, the logic can be flawed (invalid or fallacious), as in the third example above. Or the premises can be flawed (false), as in the second example above: we know that not all men are mad, so that premise is false. You’ll discover tomorrow that arguments built on bad logic are way more common than you might have thought.


The type of reasoning we’ve been describing is often referred to as deductive logic or deductive reasoning. Most mathematics follows this type of reasoning. You start off with some axioms or premises and you wind up with a conclusion. In deductive logic, if the premises are true and the logic is valid, the result is always true. Completely true, not “probably true”.


Here is a good example of a flawed or false premise. Is Southeast normal or typical for a high school? That raises the question: what do we mean by “normal” or “typical”? The Greek philosopher Socrates said we must first define our terms before we can use deduction. Insofar as Wake County is only about 20% African-American, Southeast is not “normal.” Insofar as Southeast has the same basic classes as all other Wake County high schools, Southeast is “normal.”


Inductive reasoning is familiar to anyone who has studied the Scientific Method. We make observations and form a hypothesis. We test our hypothesis by making more observations. We either accept or reject our hypothesis and reach a conclusion. With induction, we can never be 100% sure of our conclusion; new observations might force us to reject it. Inductive conclusions are either strong (backed up by lots of observations) or weak (backed up by very few observations).


As it stands now, my conclusion is weak. There are certainly many other objects that are yellow and sour that are NOT lemons. I could improve the quality of my inductive argument by making the premises more specific. If I rewrite the first premise as “Most FRUITS that are yellow and sour are also lemons” and then turn the second premise into “This FRUIT is both yellow and sour,” I have a much stronger argument.
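Reconstructed from the description above (the slide with the original argument isn’t reproduced here), the strengthened version reads:

Premise 1: Most fruits that are yellow and sour are lemons.
Premise 2: This fruit is both yellow and sour.
Conclusion: Therefore, this fruit is probably a lemon.

Even in its strong form, the conclusion is only “probably” true; that qualifier is what separates induction from deduction.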


Most people don’t use deduction or induction to decide what is or isn’t true. Most people rely on “gut feelings” or “instinct.” Two psychologists, Amos Tversky and Daniel Kahneman, studied this and concluded that humans have evolved two separate, independent systems in our brains for deciding what is true or false. The first and most common is the quick “gut feeling” or instinctual approach. The second involves deduction and induction and is much slower than the first. System #2, as described by Kahneman and Tversky, is also unique to humans. It may be what makes us human.


System #1 is fast. It is emotional. It is based on instinct. For most people, relying on System #1 heuristics is how they get through life. An example of a heuristic: the availability heuristic says that if we hear a lot of lion roars in the distance, there must be lions nearby. That works pretty well with lions but breaks down in modern situations. Even though crime has been declining steadily for decades, we still hear about it frequently in the news media, giving us the mistaken impression that crime rates are actually going up. Most racism is based on a faulty heuristic.


While System #1 may work fine when it comes to lions and the probability of lion attacks, it breaks down when faced with even the simplest quantitative question. In the problem above, the majority of students say the ball must cost ten cents. They are using a rule of thumb to arrive at their answer and they are wrong. The ball actually costs a nickel.


System #2 is a fairly recent addition to human behavior. It seems to require writing to really make it work, and we’ve only had writing for about 5,000 years. System #2 is slow and logical and precise. When System #2 arrives at a decision using premises that are true and logic that is valid, the conclusion is almost always correct. It’s hard to fool System #2. In the ball and bat example on the previous slide, I followed an algorithm to convert my word problem to an equation and then used another algorithm to solve the equation and interpret the answer.
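Assuming the standard version of the bat-and-ball problem (together they cost $1.10, and the bat costs $1.00 more than the ball), the System #2 algebra looks like this:

Let x = the cost of the ball, so the bat costs x + 1.00.
x + (x + 1.00) = 1.10
2x = 0.10
x = 0.05

The ball costs five cents and the bat costs $1.05. The intuitive answer fails the check: a ten-cent ball would make the bat $1.10 and the total $1.20.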


“Analysis paralysis” refers to the inability to make a simple decision (like what to watch on Netflix) if you have to choose between dozens or hundreds (or thousands) of possible choices. A lot of our bad decisions, a lot of our false beliefs arise because we use System #1 when we should be using System #2. On the other hand, using System #2 for everything can be exhausting and counter-productive.


Here’s an example of how System #1 thinking or “common sense” can lead you astray. In our next two examples, we’re going to focus on the representativeness heuristic, which says that everything can be put into categories. If it quacks like a duck, it’s probably a duck, right? Simple common sense. But how realistic would that assumption be if I heard a quacking-type noise while in Antarctica? Ducks are not commonplace in Antarctica; I’m probably hearing the noises made by a penguin instead. How common a category is at a given place and time is called its base rate. The base rate for ducks in Antarctica is pretty low, but I’d need to be using System #2 to realize that.


Roughly 1% of the nation is employed directly in agriculture. That’s a few million people. Only a tiny, tiny fraction of the population plays trumpet in major symphony orchestras, maybe fewer than a hundred people in total. Thus, the base rate for farmers is far greater than the base rate for trumpet players. System #1 says Tom plays trumpet. System #2 says it is more likely that Tom is a farmer.
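Putting rough numbers on it (using the figures above and assuming a U.S. population of roughly 330 million):

base rate for farmers ≈ 3,300,000 / 330,000,000 = 1 in 100
base rate for symphony trumpeters ≈ 100 / 330,000,000 ≈ 1 in 3,300,000

Even if Tom’s description fits trumpet players far better than it fits farmers, the farmer base rate is tens of thousands of times larger, so “farmer” remains the better bet.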


It looks pretty obvious that Linda is probably a feminist. But remember, our choices are “bank teller” versus “feminist bank teller.” Here, you need to realize that the set of feminist bank tellers is a subset of the set of all bank tellers and thus has a smaller base rate.
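This is Tversky and Kahneman’s famous “Linda problem,” and the rule behind it is the conjunction rule of probability:

P(bank teller AND feminist) ≤ P(bank teller)

A conjunction of two conditions can never be more probable than either condition alone, no matter how well the description seems to fit. Picking “feminist bank teller” anyway is called the conjunction fallacy.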


In France, Austria, and Belgium, you are automatically signed up as an organ donor when you get a driver’s license. You can opt out if you want, but that requires a separate form. In England, the Netherlands, and Germany, the situation is reversed: you have to opt in for organ donation by checking a special box on your license form. In each case, the overwhelming majority of people stick with the default. This demonstrates how little people like to think and how committed they are to using System #1 thought processes.

 


Critical Thinking: Resources Day 1


Activity: How Hard Is It to Change Someone's Mind?


Recommended Book: Thinking, Fast and Slow


Related Lesson Plans

Biases and Fallacies, Day 2
Statistical Literacy, Day 3