Research in psychology has led to a clearer picture of common pitfalls in human reasoning: instincts people are wired with that may have helped our caveman ancestors but that now lead people to make biased decisions or incorrect assumptions.
Woo-kyoung Ahn, a psychology professor at Yale University who directs the Thinking Lab there, decided to teach an introductory class called “Thinking” that lays out the most common errors of human reasoning and strategies to correct them. When she last offered it, in 2019, it was the most popular class at the university that semester, drawing about 450 students to the largest lecture hall on campus.
Helping students understand these pitfalls can improve not only the decisions they make in their own lives, but also the decisions they will make as future citizens and leaders on pressing issues like climate change and health care, she argues. For that reason, she believes it’s the kind of course every college, and possibly every high school, should offer.
“It’s not just about learning how stupid people are and how many errors we can make in our thinking,” she says. “It’s more about why we make these errors, why we have evolved to think that way. And as a result, we can also think about what we can do to prevent this.”
The popularity of the course led her to assemble the lessons into a book, “Thinking 101: How to Reason Better to Live Better.”
EdSurge recently connected with Ahn to hear her key takeaways from the book, and about how cognitive biases can impact educational systems such as college admissions.
Listen to the episode on Apple Podcasts, Overcast, Spotify, Stitcher or wherever you get your podcasts. Or read a partial transcript below, lightly edited for clarity.
EdSurge: Why is there a need for this book on how to reason better? Is it because of all the information flowing at all of us these days?
Woo-kyoung Ahn: We do talk about the importance of rational thinking for climate change issues and for racism, sexism and other social issues. But I'm a psychologist, so I also study how it affects our individual well-being.
So my favorite example is a fallacy that I commit myself all the time, which is imposter syndrome. It's a very simple mechanism: it’s called confirmation bias. … For instance, in course evaluations I seek out the negative reviews. I search for the negative comments, the worst possible ones. And that's called the negativity bias. So even though 96 percent of the course evaluations were positive, the 4 percent is what causes me to ruminate: Why did I do that? How can I fix that? And of course it can be good for improvement, but I have to maintain my sanity as well.
So even though you study these instincts, you still have to remind yourself what is happening and work against it?
Right. I didn't use the term instinct, but that's actually a great way of thinking about it. These biases are ingrained in our brains for evolutionary reasons, and that's why they're so difficult to get rid of. So that's one of the themes that I wanted to emphasize in the book: it's not only the bad [or uneducated] people who commit these fallacies. Especially when we're dealing with political issues, when you hear the other party’s opinion, you think, ‘Wow, they are crazy. How on earth could they be thinking that? They're so dumb.’ That's not the case. We are all prone to making all these errors.
There’s one example in your book about college admissions committees and how they interpret GPAs. Can you share that one?
So here's how the experiment went, and it was my own experiment. We made up fictitious transcripts for two students. One student, whom we'll call the A, B, C student, has a mixture of A, B and C grades, and her average grade is about a B. The other student's grades are a mixture of B-plus, B and B-minus, so let's call her the B, B, B student. We constructed these transcripts such that the average GPAs for both students are identical, so there shouldn't be any difference in which one is preferred.
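(A quick worked example of how the two transcripts can tie; this is a sketch under assumed values, since the interview doesn't specify the grading scale. On a standard 4.0 scale, with B-plus worth 3.3 and B-minus worth 2.7:

\[
\frac{4.0 + 3.0 + 2.0}{3} = 3.0 = \frac{3.3 + 3.0 + 2.7}{3}
\]

Both transcripts average out to a 3.0, a straight B.)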
So the subjects were asked to decide whom they would admit, or who was going to do better in college.
Now, top colleges emphasize that students should demonstrate passion about something. Given this, the B, B, B student does not really look like she has a lot of passion, because everything is just mediocre. But the A, B, C student looks like she has some passion for something. So there might be reasons why the A, B, C student is the better student for a college.
But then there’s the negativity bias. The B, B, B student doesn't have anything really bad, but the A, B, C student has a C grade, and if you overweight that C grade, it will not only cancel out the A grade but make the A, B, C student seem even worse than the B, B, B student.
So we did the study with Yale undergraduates, with admissions officers who were willing to participate, and with the general public. And consistently, all three groups preferred the B, B, B student over the A, B, C student, even though the average GPAs were identical.
Back to your “Thinking” class at Yale. Why do you think it has drawn so much interest from students?
For many of them it's because they want to outsmart everybody in the room; they want to make better decisions than others. There are some students who told me that they got a job at a high-powered finance firm because they cited some of the experiments that I covered in the course.
What does the research say can be done about all the misinformation online?
There are many reasons why fake news spreads. Our brains don’t have unlimited capacity, so we need to store only the most important information. For instance, George Washington was the first president of the United States, but do you remember who first taught you that? No. We have a tendency to store the content of the information, the fact that George Washington was the first president, but not the source of the information: where or when we learned it, or who taught it to us. That kind of information is not as important as the content in many cases. It's actually a very adaptive system, because you are storing the more important information and just forgetting the less relevant information.
And that can be the problem with fake news. Even if you read a news article in The Onion or another satire site, and you knew at the time that it was fake, after a while you may forget the source and misremember it as true news.
So that's one of the reasons why fake news can spread. You may have seen something in a Facebook post and thought, ‘Oh, that's just BS; this cannot be true.’ But after a while you forget the source of it, and when you see it again, you might think, ‘Oh, that sounds familiar. It might have been true news or something.’ And that has actually been experimentally demonstrated.
There are many, many studies now popping up in the field trying to fix this issue. And hopefully within a couple of years we will have more synthesized theories or more systematic recommendations about what to do about this.