Factually! with Adam Conover - The Intelligence Trap, Moral Algebra and Disrationalia with David Robson
Episode Date: September 11, 2019. Science journalist and author of the book The Intelligence Trap – Why Smart People Make Stupid Mistakes, David Robson joins Adam this week to discuss how being intelligent can actually amplify your errors, the definition of disrationalia, how to stay intellectually humble, and more! This episode is brought to you by Super Size Me 2, Invitae (www.invitae.com), Acuity (www.acuityscheduling.com/factually), and Dashlane (www.dashlane.com/factually). Learn more about your ad choices. Visit megaphone.fm/adchoices See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
You know, I got to confess, I have always been a sucker for Japanese treats.
I love going down a little Tokyo, heading to a convenience store,
and grabbing all those brightly colored, fun-packaged boxes off of the shelf.
But you know what? I don't get the chance to go down there as often as I would like to.
And that is why I am so thrilled that Bokksu, a Japanese snack subscription box,
chose to sponsor this episode.
What's gotten me so excited about Bokksu is that these aren't just your run-of-the-mill grocery store finds.
Each box comes packed with 20 unique snacks that you can only find in Japan itself.
Plus, they throw in a handy guide filled with info about each snack and about Japanese culture.
And let me tell you something, you are going to need that guide because this box comes with a lot of snacks.
I just got this one today, direct from Bokksu, and look at all of these things.
We got some sort of seaweed snack here.
We've got a buttercream cookie. We've got a dolce. I don't, I'm going to have to read the
guide to figure out what this one is. It looks like some sort of sponge cake. Oh my gosh. This
one is, I think it's some kind of maybe fried banana chip. Let's try it out and see. Is that what it is? Nope, it's not banana. Maybe it's a cassava
potato chip. I should have read the guide. Ah, here they are. Iburigako smoky chips. Potato
chips made with rice flour, providing a lighter texture and satisfying crunch. Oh my gosh, this
is so much fun. You've got to get one of these for yourself. And get this: for the month of March,
Bokksu has a limited edition cherry blossom box and 12 month subscribers get a free kimono
style robe. And get this: while you're wearing your new duds, learning fascinating things
about your tasty snacks,
you can also rest assured that you have helped to support small family-run businesses in
Japan because Bokksu works with 200 plus small makers to get their snacks delivered straight
to your door.
So if all of that sounds good, if you want a big box of delicious snacks like this for yourself,
use the code factually for $15 off your first order at Bokksu.com.
That's code factually for $15 off your first order on Bokksu.com. I don't know the way. I don't know what to think. I don't know what to say. Yeah, but that's alright. Yeah, that's okay. I don't know anything.
Hello, everybody. I'm Adam Conover. Welcome to Factually. And in all of literature,
few characters stand for the clarifying power of rationality like Sherlock Holmes. Yes,
he was aided by his pipe, a magnifying glass, and a hot new wonder drug called cocaine.
But Holmes' greatest skill was his ability to reason through a problem rationally and skeptically. He said, for instance, that it is a capital mistake to theorize before
you have all the evidence. It biases the judgment. I wasn't going to do a full British accent,
all right? I'm not trained in dialect. But it's statements like these that made Holmes the archetype
of a hard-nosed critical thinker for generations after him. I mean, I'm a fan of Sherlock Holmes
myself. I've read all the stories. And, you know, one of the appeals of the character is that we'd
all like to think that in our best moments, we think just as logically as
the famous consulting detective, right? So, it might be reasonable to assume that his creator,
Sir Arthur Conan Doyle, who was himself a doctor and public intellectual, was just as rational as
his creation, right? Well, unfortunately, quite the opposite is true. Doyle could not follow his own iconic
character's advice. Even though he was the literal inventor of Holmes' no-nonsense,
ultra-empirical approach to investigation, Doyle was susceptible to the most obvious
claptrap, hokum, and straight-up bullshit his age had to offer: spiritualism, the fad in which
wealthy English gentlefolk came to believe that, quote,
spirit mediums could communicate with the dead.
And Doyle went all in on it.
The dude was insatiable.
He attended five or six seances a week.
He even believed his wife, Jean, was a literal psychic.
For instance, in 1922 on a book tour,
Doyle met up with the famous escape artist and
magician Harry Houdini in Atlantic City. And now Doyle knew that Houdini was grieving for his
mother's death, so he did the friendly thing and suggested a seance where his wife Jean would try
and contact the late Mama H on the other side. In a hotel room, Doyle's wife, Jean, sat with a pencil in her hand and entered a trance.
She then banged on the table. A spirit was present. She asked the spirit if it believed in God and
then made the sign of the cross. Then Jean started to write frantically. And when the seance ended,
she had 20 pages of material from Houdini's ghost mom. Now, Doyle thought this was amazing. He was
like, holy crap, my wife is talking
to your dead mother. What do you think about that? Don't you feel better now, Harry? But Houdini,
who we should remember was a genius at tricking people and creating illusions, called the scene
out for what it was, a performance. You know, he had a couple of natural questions like,
why would his mom, a Jew, respond to the sign of the cross? And more importantly,
why would Houdini's Hungarian mother suddenly know how to communicate in English? So it was clear
the seance was a sham, no question, right? But Doyle, again, a brilliant man who created the
most analytical character in all of literature, refused to listen to Houdini's valid empirical
points. In fact, he went so far as to
write and publish an article dismissing all of Houdini's criticisms and claiming that Houdini
himself was a psychic medium. Now, this is clearly the height of irrationality, but at the same time,
we can't say that Arthur Conan Doyle was a dumb guy. I mean, to the contrary, he was brilliant.
Hell, he revolutionized an entire
genre of literature. The problem was Doyle's intelligence actually worked against him when
it came to understanding the truth about spiritualism. His brain, like any intelligent
brain, is like a powerful steam engine. See, it can haul any load at top speed, but that means if
you load it up with bullshit, you can still end up on a one-way trip
to the wrong station. Man, I know how to overextend a metaphor. Look, on a previous podcast, I
interviewed the psychologist David Dunning, and his namesake discovery, the Dunning-Kruger effect,
explains how those who know the least about something are the most likely to overestimate
their abilities. But it's not only beginners and amateurs who get things wrong. Intelligent people like Arthur Conan Doyle and even experts fall into their own mental traps.
The truth is, knowledge and intelligence do not form an impenetrable shield against error.
Well, to teach us more about that, our guest today is David Robson,
a science journalist who wrote a fantastic book called The Intelligence Trap.
This book covers all the ways that smart people lead themselves into false beliefs and what
they might do about it.
And it is full of fascinating stories, just like the one I just told you about Arthur
Conan Doyle and Harry Houdini.
Without further ado, please welcome David Robson.
Well, David, thank you so much for being on the show.
Oh, yeah, it's completely my pleasure.
Thanks.
So we sort of understand that people who are not very expert in a topic are likely to get
things wrong.
We had David Dunning on the show, and he talked all about how having a little bit of knowledge
is a dangerous thing and can make you overestimate your own competence.
But smart people get things wrong as well.
And how does that happen?
I mean, we generally think of experts as the people who are most likely to understand any issue best.
And so they would be most protected from error.
Yeah, I mean, that's what was so counterintuitive when I began to look into kind of the science of intelligence and decision making.
Because, OK, we could accept that smart people,
educated people aren't perfect, like they're always going to make mistakes, but you would
assume that they would be generally less likely to make mistakes. But actually, what you find is
although that is true, in general, there are lots of situations where actually greater intelligence
or expertise or education can actually amplify your errors. So it can really cause you to be more wrong than
if you were slightly less intelligent. And in my book, The Intelligence Trap, I kind of explain
lots of forms of this phenomenon with lots of mechanisms. But my favourite is really this idea
of motivated reasoning. And that's kind of where, like, you have an opinion,
maybe it's, like, really caught up in your identity,
like it's a kind of political ideology or a religious belief. And that kind of emotional
pull is so strong that it just kind of completely redirects your reasoning. So rather than looking
at the facts in a really rational kind of open minded way, you just use your intelligence to
justify your opinions and to demolish any of the arguments
against you. So this, for example, can explain why lots of people who maybe have a more capitalist
kind of political ideology are more likely to deny climate change, even though there's
this huge scientific consensus. And actually, the more educated they are, the more likely
they are to deny it. So it's
really, you can see how their intelligence there is actually backfiring. It's causing them to
reject the facts and to have this kind of really irrational belief. So even though
the premise of their argument is incorrect, and the conclusion is also incorrect,
and a dispassionate, objective
view would let you know that, they are so intelligent that they're able to manipulate
the argument endlessly in order to find justifications for why they might be right.
And those arguments might even be, like, internally consistent in some ways,
except that they're happening because of motivated reasoning. Is that the idea?
Yeah, that's exactly the idea. So, I mean, I gave a political example, but one of the favorite kind
of stories or anecdotes that I use in my book is the case of Arthur Conan Doyle. So, he was-
I talked about this in the intro, but please elaborate on it.
Sure. So, you know, he's like incredibly intelligent
in the kind of analytical scientific way.
He's also incredibly creative, you know,
writing all of those fabulous novels.
But then he had these kind of odd ideas about spiritualism.
So he was fooled by countless kind of fraudulent mediums.
And he believed in fairies, you know, really bizarre beliefs.
Exactly.
Mediums, exactly the sort of people who Sherlock Holmes would see right through.
You know, so many Sherlock Holmes stories are, oh, someone, there's a con artist who's given some kind of false impression.
And then Sherlock Holmes finds the gap in it and figures out that this person is pulling the wool over everyone's eyes.
Right, exactly. So he, it wasn't just that he was so intelligent.
He knew, really knew very well the kind of principles of deductive reasoning.
Yeah.
But he wasn't applying those because of this kind of emotional pull,
because he really wanted to believe in the idea of the afterlife.
So he just kind of used that intelligence to completely demolish
any of the arguments against it,
even when his friends were trying to kind of tell him the facts.
So he was friends with Harry Houdini, the illusionist,
who was also a skeptic when it came to spiritualism. But rather than listening to
Houdini's arguments, he just kind of came up with this really bizarre idea that Houdini himself must
be a paranormal being, and that he was kind of, for some reason, trying to kind of cover up his
tracks by persuading Arthur Conan Doyle that the spirit world didn't exist. And I feel like only a
genius like Arthur Conan Doyle
could have come up with such a bizarre explanation
for something so bleedingly obvious.
Right, yeah, you'd have to be very smart
to come up with that false argument for the false thing that you believe.
And that's why it's a trap, right?
Because the more intelligent you are,
the better you are at coming up with arguments for wrong ideas.
Is that it?
Yeah, so that's exactly it.
And, you know, like lots of psychologists talk about kind of intelligence as being a little bit like the engine of a car.
So it's obviously an advantage if you're driving to have a powerful engine because you can get places more quickly.
But that's only useful if you have the kind of checks and balances to make sure that you're going in the right direction. And I feel like
you see a very similar parallel here that, you know, like intelligence is the engine of thought
and, you know, education kind of feeds into that. But you also need to make sure you're applying it
in the right way. Otherwise, it's just going to drive you off a cliff in this way that we see
with Arthur Conan Doyle. Yeah, if you don't have other positive habits, I guess, elsewhere in your life or in your mental world, then you can very easily end up going on the wrong track.
Yeah, that's exactly it.
The kind of habits like intellectual humility, so always questioning what your assumptions are, whether you might be wrong, open mindedness, you know, curiosity, which is now being studied very rigorously by scientific
research. And you see that more curious people who are always looking for facts, even if it
kind of contradicts their previous beliefs, that they are actually immune to this kind of
motivated reasoning. And now I like to think of myself as a curious person,
that's sort of part of my self-identity, but I would also imagine that a lot of people who are
falling prey to this trap that you describe also maybe feel that way or feel that they have good
mental habits, right? There are certainly climate change denialists who are very intelligent and
believe that they're intelligent
and believe that they have the correct habits of mind
and that everybody else is wrong.
We just happen to know because of the overwhelming scientific evidence
that these people are incorrect.
So how does one tell the difference
if the capacity for self-delusion is so high even in experts?
Yeah, that's a great point because, you know,
we're not very self-aware of our kind of mental capabilities, as I'm sure David Dunning told you.
So, say for curiosity, there's this wonderful research by Dan Kahan, who I believe is at Yale
University. And so, he did devise these tests that kind of more objectively measure curiosity. So,
it's the kind of thing like when people came into his lab, he would kind of leave different scientific magazines out on the table. And he would just
see like whether they were reading them really for pleasure, or you know, whether they picked up just
some other magazine not related to science or just kind of were happy to sit there and not kind of
build their mind in that way. So he found that then it was the more curious people,
the people who are more likely to kind of look for evidence just for the fun of it,
look for new ideas just for the fun of it, who were immune to this motivated reasoning.
And what's interesting there is that that wasn't solely explained by kind of education or other
measures of intelligence. The curiosity seemed to be a separate kind of mental trait that was having a separate effect on their motivated reasoning.
So, you've also written about how the expert mind is different from the inexpert mind,
that once someone becomes very expert in a topic, our sort of mental processes change.
Can you go into that at all?
Yeah, sure. So, you know, I certainly don't want this to be a kind of,
I don't want my book to be taken as an anti-elitist argument.
Or anti-intellectual argument. Yeah.
Right. Exactly. It's quite the opposite, I think, because it's kind of saying
experts actually just need to do better. But, but.
Well, yeah, and especially people who have a great deal of expertise should be self-critical,
as self-critical as anyone else.
And, you know, I think it's very intuitive to say, hey, even if you're sort of at the top of the
intellectual mountain, if you think you're above criticism or above self-reflection,
that's certainly an error. And I think that's a fair argument to make.
Yeah, that's exactly it. And so, you know, like, we do know that when people have a lot of
experience in their field and a lot of education, their brain processing does change quite profoundly.
And that they start to kind of, well, there's a number of different processes.
I mean, first of all, they just become more automatic and intuitive in their decision making.
You know, that kind of professional expertise that can allow a doctor to diagnose someone as soon as they walk into the room.
I think there's this kind of folklore amongst doctors that most diagnoses happen within 30 seconds of seeing the
patient. But then we all know someone who went to the doctor and the doctor insisted that
it was a certain problem and the doctor was incorrect, but they refused to listen to any
of the other symptoms. That's a very common experience people have in doctor's offices,
where the doctor has this rigid diagnosis.
Yeah, so that's exactly the problem
because even though that kind of intuition
often is correct,
like in about 15% of cases, it isn't.
But the problem is that thanks to
kind of their entrenchment
and their belief in their own expertise,
they become very close-minded
to any other possibilities.
So it's a kind of double-edged sword in some ways. It's like,
you know, very quick, very efficient, but it's also wrong in quite a sizable number of cases that lots of experts just aren't really taking into consideration.
Got it. You've written about examples where even geniuses are led astray by their expertise,
and you call it the Nobel disease,
which is a phrase I love.
Can you go into that?
Yeah, yeah, sure.
So I didn't coin that phrase, but I do love it.
I mean, I'm not sure if we know the exact origins,
but it's used a little bit by science writers like myself.
And it's this idea that you actually look
at lots of Nobel Prize winners,
and after they've won their prize,
they often come up with these really bizarre theories that are just so unscientific and,
you know, so sometimes quite dangerous. So in the book, I discussed the case of Kary Mullis, who
kind of discovered or invented the polymerase chain reaction that's behind all genetic testing. So, you know, a really
revolutionary moment in biology,
and you can't overestimate his genius there. But then in his autobiography, he writes about all these
bizarre ideas, like he's a climate change denier. He was an AIDS denialist. So he
denied that the HIV virus was causing AIDS. He believed he was abducted by this kind of glowing
raccoon alien who could also speak. He believed that he could travel through the astral plane.
Okay, okay. Wow, this is even bigger than I thought. Yeah.
You almost couldn't have stranger beliefs than he does. It's like if you came across those beliefs, you would not think it was the same
mind that had also come up with that amazing new development in biology. And the fact is that he,
I think he just suffered from motivated reasoning and also this idea of earned dogmatism, which is
when you kind of feel like your achievements so far have
just given you the right now to be closed minded, you just believe that your own judgment, your
intuitions must be right, without questioning them and without kind of updating your beliefs
in light of new evidence. And you know, in that case, I think that is dangerous, because lots of people
would think, well, Kary Mullis has won a Nobel Prize, so he must be right about
these things. But the scientific consensus and the overwhelming evidence shows that he's not.
Yeah, we talked about a similar case to that on our show a couple years ago,
we talked about the case of Linus Pauling, who was, you know, Nobel Prize winning chemist and,
you know, American intellectual celebrity, and then sort of went nuts and started to believe that
massive doses of vitamin C could cure all manner of diseases. And that's actually a big reason that
people now erroneously believe that vitamin C can cure colds and things like
that. But it was that exact disease where he was able to go on every talk show and interview in every magazine because of his celebrity.
Well, this guy's a genius.
And so, of course, he must have thought these things through.
But it wasn't – none of it was true.
It was all a fantasy.
Yeah, exactly.
And, you know, he actually – yeah, he won two Nobel Prizes, one for chemistry, one for the Nobel Peace Prize. And the writer Paul Offit,
who's also a physician, said he was like one of the greatest scientists, but also one of the
greatest quacks because of this strange belief that vitamin C was this kind of panacea. It's just
so bizarre and like so dangerous, I think, because even now, some people would reject
good cancer treatment and
believe that they could just change their biology by eating lots of fruits and vegetables, which is
never going to work. Yeah, there's this sort of bizarre tendency we have to believe about other
people, but also to believe about ourselves that success and genius in one area must mean that we
have it in all areas. Like you see it in, you know, you see it
in the tech industry in a huge way where these tech CEOs sort of, you know, say, oh, I'm a
programmer, so I should be able to figure out the exact nutritional composition every human needs
to survive. And they create something like Soylent, which is nonsense. Or, you know, Elon Musk being the canonical example where, you know, the guy creates
PayPal and, you know, is able to get, you know, pull together good people to, you know, build
SpaceX and stuff like that, but now believes he has expertise in transportation and, you know,
human psychology and neurology and all these different fields. And we all believe it for a bit.
We all sort of buy the line until it sort of reveals itself as not true.
Yeah, yeah, yeah, totally.
And, you know, I kind of think like in my book,
I discuss a lot of these kind of extreme examples.
You know, Steve Jobs was another one.
So he actually, you know, obviously a great visionary,
like so creative in his designs. But
then the same intelligence, the same amazing brain power, also kind of caused him to
fool himself with his own diagnosis of pancreatic cancer. So he was one of these people who did
change his diet rather than opting for surgery. And, you know, some doctors think that he could have
survived if he'd just gone for the conventional medical treatment originally. So, you know,
we see all of these extreme examples and they are so fascinating in their own right. But I think
what's also interesting and really important is the fact that it's also just kind of everyday
geniuses, you know, who are working as doctors or forensic scientists or judges or lawyers,
who are also susceptible to
all of these problems. And that's having a huge effect in our medical system, our justice system,
you know, and they're kind of just overlooked at the moment, because we assume that the
way we train people and the way we cultivate expertise is enough, and it really isn't enough.
Yeah, we give those folks a lot of credit. When you sort of meet a doctor, you assume that they have this general high level of expertise about anything.
And it's easy for them, I assume, to believe that about themselves.
But so often that knowledge doesn't port.
Yeah, exactly.
So, you know, I think it's like 10 to 15 percent of diagnoses are wrong.
And overall, these diagnostic errors cause more deaths than common illnesses like
breast cancer. And yet they can be corrected. So this research from Rotterdam shows us that like,
if you force or not force, but persuade doctors to question their assumptions, to listen to their
intuitions, but then to kind of look for the contradictory evidence rather than only pulling
in the evidence that supports their point of view, you can reduce diagnostic errors by 40%.
And that would be, yeah, it's a really profound effect for, you know, not much. It's not a huge
intervention. It doesn't cost anything. It's not, you know, requiring any kind of really specialized
training, just, you know, encouraging them to follow a more definite thought process. And that
could save a lot of lives.
So I really think like, you know, there are easy solutions to these problems,
but we're not even recognizing the problem at the moment.
Right. And the solutions all have in common that idea of sort of stepping out of yourself
and maybe distrusting your own immediate mental processes,
your own immediate intelligence and finding ways to check yourself externally, perhaps? Yeah, that's exactly it. So there's that method called self-distancing,
which sounds a bit odd, but it's like, essentially, you kind of talk about your problem as if you're
in the third person. So this is much more suitable for like everyday dilemmas, like if you're
deciding whether to go for a new job or, you know, in your love life or to buy a new house.
And you just start talking, for example, I would say, David wants to go for this new opportunity because of this.
But the downsides might be that David will do this, blah, blah, blah.
And that sounds, you know, you sound a bit like, I don't know, like Elmo from Sesame Street.
Like it doesn't sound like sophisticated thinking.
But it does just create this little bit of distance in the way you're considering
the problem. It kind of mutes your emotions just very slightly. So you can appraise the evidence
more rationally without just cherry picking the facts that you want to believe,
given your first intuition. Yeah. It's such a difficult problem though, because
when I think about things that,
that I'm expert in, right? Like if I'm an expert in anything, it's comedy, you know, and I, I,
I perform live, uh, you know, as often as I can, I've been doing it for over 10 years.
And there's so many times that when I'm going about that work, I just need to trust myself,
right? I actually don't have the time to pre-think
what I'm going to do for every single show.
And sometimes I just need to go up there and say,
hey, guess what?
I've got, I put in the time and I'll be funny on stage.
I know what to do.
I can let my body take over.
I can sort of let my built-in skills take over.
And I don't have to stress about every single moment.
And I find that I need to do that
sometimes in order to do good work.
Maybe if I'm stressing too much, then I'm less funny on stage, you know.
But this is, you're sort of describing that,
wait, no, we need to constantly be doubting our own abilities
which frightens me a little bit as someone who like,
I also need to rely on my abilities.
Yeah, yeah, yeah.
And I definitely think that is an issue
but I kind of think it all depends on the timing of when you apply that kind of self-reflection and awareness. Because I think in lots of professions, and especially creative ones like your own, you know, you kind of have to go with the flow in the moment, or you're just going to kind of choke or lose it or, you know, be too hesitant, not a great performer.
But then I think you can still apply the self-reflection afterwards.
Like you can maybe try to take the more distanced approach at the end of a show and just kind of, which you probably do already,
and just think critically, well, like that worked and that didn't.
And that's how people do steadily improve.
And I think it's the biggest difference between the people who succeed
and the people who don't.
It's just whether you're willing and capable of applying that self-reflection. So yeah, I think obviously the
way you apply these techniques depends on the job at hand. But I think that an element of
self-reflection is essential in all professions at one time or another.
It can be painful, though. The comedian's version of self-reflection is you tape your set and you listen back to it. And it's so painful to do that sometimes. So often I tape it and I never listen to it because it takes time to listen to it. But then also, you hear that punchline that didn't work and you're like, oh no, I feel bad about myself. It's like an ego hit. It's difficult in that way. And so it's very easy to go through life without having that self-reflection.
Yeah, yeah, I totally agree.
And, you know, like writing this book, I kind of did feel that sometimes it did put me in situations where it kind of really does like knock your ego, knock your sense of like who you are.
But I think you do come out of that kind of better in the end.
What is, you write about a process called dysrationalia.
Can you tell me about that?
Yeah, totally.
So this was coined by Keith Stanovich, who's this researcher at the University of Toronto.
And his wife actually has kind of specialized in learning difficulties.
And he kind of took this idea of dyslexia, where, you know, the kind of problems
with reading and writing aren't related to someone's intelligence, it's kind of an isolated
issue. And he wondered if that's also true of rationality. So whether you could have someone
who is incredibly intelligent on all other measures, you know, their SAT scores and their
IQ, but they do have a particular problem with rationality. And the way he looked at
rationality there was to look at those kind of classic cognitive biases that people like Daniel
Kahneman have studied. So, you know, things like the sunk cost effect, where your kind of initial
attachment to the investment that you've made means that you just pour more and more resources
into a project, even when it's failing, and you're actually losing, you know, much more money in the long run. And what he has shown is that dysrationalia does exist.
So no matter how intelligent you are, you're probably just as likely or unlikely to suffer
from the sunk cost bias. Being really smart in an academic sense doesn't make you more
rational, according to these particular cognitive biases.
That's fascinating. So, the idea of someone who is very, very well educated, an expert on a topic,
and intelligent broadly, but specifically has a problem with rationality. That's a type of mind
that I haven't really thought about existing. But
it makes sense that there are people like that, that sort of, I can start to think of people who
fit that description for me. Yeah, me too. Yeah. And you know, that again, is because like you,
like, it's about the way you apply your intelligence. So, you know, if you're sat
down in front of, like, a maths exam or something, you know then, like, OK, I have to kind of apply my brain power here.
And, you know, you can find like the kind of correct answer.
And, you know, it's quite step by step.
It's quite straightforward.
But the problem is that just because you are able to apply that brain power doesn't mean that you actually do it all the time. So often you might just go through the world thinking kind of intuitively,
using heuristics, but not really thinking things through step by step.
And so that's one of the explanations for this dysrationalia.
Like if you're extremely intuitive in your approach to everything,
rather than sort of explicitly laying out your logical steps of thought.
Yeah, that's exactly it.
So, you know, it kind of, I think that's very prevalent in business, like we've mentioned,
and it explains some of those doctor's biases.
And I think the point here is that dysrationalia can be combined with motivated reasoning. So you come to a decision or an idea intuitively, and then only later do you apply your intelligence just to justify it.
And that combination, I think, is the most disastrous.
You're explaining, you're actually describing someone who comes to their ideas intuitively and is entirely engaged in motivated reasoning, and that's
like every boss I've ever had, basically, in a business context, right?
Yeah, yeah, yeah.
Everyone who runs a company.
Yeah, totally. And I work for the BBC, which is like a wonderful institution,
but I would be totally lying if I said it's not prevalent there. I just see it every day,
not necessarily in my immediate team, you know, just across the organization there are
definitely problems there. And, you know, sometimes the organizational culture can kind of
amplify this, I think, because there's this sociological work looking at kind of corporate
cultures and finding that there's this phenomenon known as functional stupidity. And that is kind
of where you almost, like, discourage your employees from being thoughtful or curious, because they might
start asking questions. And you know, like, it might kind of interrupt their productivity if
they're thinking too much. But the problem is that then that just allows all of these biases
and errors to kind of accumulate. And at the end, you can get some like serious issues. So the example I use in my book is the Deepwater Horizon catastrophe, where,
you know, there wasn't any one particular person who was to blame. It was just the kind of overall
culture that has stopped multiple people from noticing the errors that caused the eventual
explosion. Right, because you have the organization focusing on finishing its process and on everything going smoothly, which disincentivizes people from raising red flags and using their minds rationally. Instead, everyone's focused on like, no, no, no, just make sure we can move on to the next step. Is that the sort of issue?
It was the same thing with, like, NASA with the Columbia disaster, in that they had really focused there on, like, speed and efficiency. And so, like, the employees accidentally were turning a blind eye
to some serious problems. You know, there had been plenty of warning signs in lots of the kind
of previous flights, but just because they had succeeded, they hadn't been asking, like,
what should we learn from those warning signs? And that eventually caused the disaster. And so ideally, a company could be really efficient, but you'd also be asking the employees to take
notice of those things and to investigate them and give them the freedom to do that.
This is actually very helpful for my own work. Because, you know, I make television,
and we have this big process, and we have to turn around, you know, I make television and we have this big process and
we have to turn around, you know, a large number of scripts in a short period of time and they
need to be researched and written and then fact-checked and it's on, everything's on a
treadmill, you know? It's like, okay, we've got a deadline coming up a week from now. Okay, this is
the pitch for the story? Great, let's research it. Great, let's move forward. Okay, now we're
going to start writing it. And the moment at which someone on our staff says, oh, hold on a second.
This actually is not really true or we shouldn't tell the story this way or there's an error
in what we've been doing.
That's hitting stop on the, on the conveyor belt.
And I should have said conveyor belt, not treadmill.
That's, that's hitting stop on that.
And it's disruptive to the process, but also we're so focused on getting the process as
smooth as possible so that things aren't late
and people don't have to work late every night that there's like a push and pull between those
two things. And, you know, on our show, we've done corrections episodes where we've talked about
things we got wrong. Though, when I look back at why those things happened, it was often because
we were trying to keep the process running smoothly. All right, that sounds great. Let's do it.
Let's not ask too many questions, right?
And so we've, as we've been going on,
tried to build in more and more room for anyone on our staff to say,
hold on a second, there's a problem with this topic.
Let's talk about it a little bit more
so that there isn't that disincentive problem.
Yeah, and I think that's exactly it.
It's like the easiest way to avoid that
is to just build in the kind of room and space
to kind of make sure, you know,
if you finish early, that's great,
but you have to have the space
to acknowledge and correct an error.
And that's just what lots of companies try to avoid
because often you are kind of losing that time
when you could be producing more content.
It's also just like within the kind of team hierarchies itself.
It's just making sure that the managers and the people above are willing to take criticism from the people below
and that they'll respond to news even if they don't especially like what they're hearing,
rather than just kind of only telling people to come to me with good
news or don't bring it to me at all. Right. And if you don't do that, you can end up with a system
that's a lot like that genius who gets things wrong. You can end up with an organization that
is very, very functional at doing shitty work. Yeah, that is exactly what functional stupidity
is. And I think there's this movement where companies try to
be, like, relentlessly optimistic and positive. And it seems like
a good thing, it's like there's no wrong answers. Well, like, sometimes you just have to accept there
are wrong answers and some ideas are bad ideas. And, like, that doesn't reflect badly on the person
who came up with them. Like, you know, we don't want to define the person by having had a bad idea,
but you want to be able to have the kind of freedom to acknowledge that like someone's made a mistake and for them not to feel scared that they're going to be punished because of that.
Well, all this is so fascinating. We have to take a quick break,
but we'll be right back with more questions for David Robson. Okay, we're back.
So we've been talking about all of the errors that intelligence and expertise specifically can lead us into.
You know, I often think about the state that I was in when I first got to college, right?
When I was like really starting to
try to understand the world around me. And I was like very, I knew very little, but I was extremely
like open to any sort of new idea and any, you know, letting my curiosity take me where it led.
And I often think of that as being like a really special state that I try to come back to
in my own thinking. And that makes me wonder, from your perspective, and having researched this issue, what advantages our minds
have when we're not experts versus when we are experts? Are there ways in which we are more
flexible in our thinking or anything like that? Yeah, absolutely. So this is actually an idea
that comes partly from Zen Buddhism, where they really valued like the beginner's mind.
Yeah, it's a phrase in Buddhism. It's fertile, it's flexible,
you know. And the idea is that we want to kind of build our expertise and our knowledge, but we
still want to maintain that kind of curiosity and open-mindedness to things that might question our
assumptions. So you're always looking for ways to update your knowledge rather than feeling that
your learning has finished and that there's nowhere else to go because you know everything already.
Yeah, often it seems like that even happens in art, for instance.
We've all had that experience of like, oh, the artist who we love, their best work is done in their early years.
Then after a while, it gets sort of calcified and, you know, like, oh, how is this the same person who did such creative things then is doing such boring work?
Now, it's not always the case.
A lot of artists come into their own much later in life.
But there's also that, yeah, we sort of understand intuitively that's how it works a little bit.
Yeah, exactly.
That's it.
I mean, so say someone like Picasso, I think, was always kind of innovating and changing the mediums he was using and his style. And, you know, I think that is something that
is special and unique to some artists and not to others. It's like always kind of being on your toes
and ready to adapt and change and to explore some new idea. And that's, you know, not common, but I
think it's something that everyone can kind of
try to learn to adopt in whatever profession they're doing.
You've written about how that applies to chess players.
Is that right?
Yeah, that's it.
So, you know, if chess players actually, they can,
it's like they have this huge memory
of like tens of thousands of possible moves.
The professional ones, the ones who are very, very high expertise.
Yeah, like the best ones.
But then if you present them with a totally new or random configuration,
they really struggle to understand how that could fit into their knowledge.
They can't mesh it.
You also see it with taxi drivers, actually.
So there were studies of London's taxi drivers
who had this kind of huge mental map of the city.
They make them take that test.
In London, they have to memorize all the different streets
to have a GPS in their heads, right?
Yeah, it's called the knowledge, I think.
But then the cityscape is always changing,
and especially around the Docklands in East London,
there have been lots of new buildings built.
And actually the experts, more than the novices,
really struggle to incorporate that into their mental map
because all of their previous knowledge was just too deeply entrenched.
And you can see that with other professions like accountants or computer programmers.
You know, they actually struggle to adapt and upgrade their knowledge when new changes
come along.
You know, I remember that being a fear that I had when I was younger, when I was in that
sort of beginner's mind state about the world, because I was so curious about everything.
And then I'd encounter someone older than me, in their 50s, who I would try to get excited about some new idea,
and they'd say, I don't really care about it.
I'm not interested.
And I remember thinking, why are older people
or people who are very smart so incurious about the world?
And how do I avoid that as I grow older?
And so how do we get that sort of beginner's mind in our daily lives
as we continue to become more expert and continue to age?
Yeah, I mean, that's a great point,
because it's something that I had kind of come across in my career as a journalist.
You know, as like an editor, I would always rather work with like an intern
rather than like a journalist who'd been around for 20 or 30 years,
because I just think they always did have a fresher approach
and they were just more curious.
So there is some evidence that curiosity can be kind of cultivated.
And what I find especially interesting is it's like
the more you start learning about a new subject,
the more curious you can become about it.
Because it's like, you know, you start to realize that there are all these kind of holes that you don't know. And I think that's the difference between the beginner and the
expert is that the beginner is just a bit more aware of those kind of gaps in their knowledge,
and they want to fill them. And if you're an expert, whatever you're an expert in,
you should start to kind of ask those questions and to just think, well, what do I actually know? And what would I like to know?
And just to try to kind of pursue new questions that you just might not have considered before.
So just always be questioning yourself.
Just actively looking for those gaps in your knowledge and really seeing them and trying
to fill them.
Yeah, that's exactly it. It's like cultivating that awareness and then kind of actively pursuing it.
Yeah, in my own life, I recently realized I want to learn about geology.
I don't know anything about geology.
I know very little.
And I was on vacation in a national park and I was like, oh, geology is cool.
I don't know anything about it.
I haven't figured out yet how to learn about it.
I'm working on it. But is it that sort of cultivating of like, oh, here's something I don't know that I want to find out about that will let me see the world in a new way?
Yeah, and someone like Richard Feynman, you know, seemed to just maintain that curiosity right up until he died.
And, you know, his last few years were spent on this very strange project where he was just fascinated by this small Soviet country called Tuva,
I think it was called.
You know, it started out just by a random conversation around the dinner table.
And eventually he was just absolutely hooked on learning the language. And it was very difficult to visit because it was behind the iron curtain,
but he came up with all of these ingenious ways to try to arrange travel there. And he just seemed
very good at being able to see the spark of an interest and seize it and allow that to
kind of really build in his life and light up his life.
You've written a lot about how wisdom is also something that we should strive for
and how that can help us solve these problems.
Can you talk about that?
Sure.
Yeah, because like wisdom, you would think is maybe a bit too intangible for scientific study.
And, you know, there had been some attempts to kind of measure it,
but they maybe hadn't been very robust. But then there's been some really exciting new research by
Igor Grossman at the University of Waterloo, who essentially kind of looked at the philosophy of
wisdom, you know, from people like Socrates to the present day and tried to distill some
really important principles that seem to come up again and again. And what do we mean when we say wisdom? Because we've been talking
about knowledge and expertise, but wisdom, we mean something different. What do we mean by that
when we say that? Yeah, sure. So wisdom isn't just based on your kind of factual knowledge,
but it's really the capacity to kind of achieve your goals and to live a kind of meaningful life.
And then it's defined by these different kind of traits like intellectual humility.
So recognizing the limits of your knowledge and perspective taking,
like being able to consider different viewpoints and integrate them.
And all of this stuff that had been discussed in philosophy but then was kind of ignored by science in a way
when we had the IQ test.
And so Grossman designed these great tests
that he would ask people to kind of discuss various dilemmas
and then other psychologists would rate them on these different traits.
And then he found that those scores actually predicted people's life satisfaction
and health and well-being and relationship satisfaction much better than standard measures
of education or intelligence. So it's really looking at people's decision-making in real life
and their ability to make the decisions that really matter to them to building the best life
possible.
And what is evidence-based wisdom? That's something you've written about as well.
Right. So, that's really what I'm describing here. It's evidence-based because, like, the advice that Igor Grossman and others are giving is looking at ways to improve wise
reasoning, but with a solid scientific basis behind it rather than something that you might read
in like Deepak Chopra
or, you know, the kind of standard self-help book.
It's like really trying to say,
actually, like we've done these carefully controlled trials,
just like in, you know,
you might with a medical treatment
or other kind of intervention.
And we've shown that if you apply these techniques
compared to if you don't apply the techniques,
that you get better outcomes overall.
So it's not, yeah, it's not wisdom as a form of woo-woo
or an anti-intellectualism or anything like that.
It's a, no, we're talking about something
that's very discrete and measurable
that we can scientifically study
and that you can deliberately cultivate.
Yeah, that's exactly it.
And, you know, I see, like,
I don't see it as being in opposition to intelligence,
but I just see it, maybe wisdom in this context
is really just looking at whether you can
apply your intelligence correctly and fairly
rather than just falling prey to all of those
kind of traps that we discussed previously.
Part of your book is about how
the life of Benjamin Franklin can teach us
how to use wisdom in our lives.
Can you talk about that?
Yeah, totally.
So, you know, I think it's important to acknowledge that Benjamin Franklin, you know, looking back, he had some kind of issues.
So, you know, he was a slave owner early in life, although he then...
You also mentioned Richard Feynman, who is a serial sexual harasser.
So, yeah, Yeah, exactly.
This is what bothers me with all of this.
All of these people had quite serious flaws and were very much the product of their time.
So I certainly don't want to paint them as these heroes.
But I do think in specific, in lots of areas of their lives, you can still see that they were applying like what we now know of as evidence-based wisdom.
And certainly in Benjamin Franklin's political negotiations, such as the signing of the US Constitution.
You know, he was able to kind of guide the discussions to find this kind of compromise
that eventually worked and just managed to please the most number of people.
And he did seem to apply that to lots of areas of his own life too. You know, he really valued
things like intellectual humility and perspective-taking, and that's really
obvious in his writings, and he has specific techniques to achieve that. And one of them is moral algebra,
which is a bit like doing a kind of pros and cons list, but he was just a bit more
dedicated to the whole idea and a bit more systematic with the way he did it. So,
you know, he would list every reason for his initial conviction and every reason against,
he would, like, leave that list for a few days to kind of let new ideas
pop into his mind and, you know, make sure that he wasn't just kind of going with his initial gut
reactions. And then he would number them and weigh them up, and whichever side
came up with the highest score would eventually
be his decision. And it's just this way of kind of applying more
analysis and more deliberative thinking to your life, rather than just going on your intuitions,
and making it more of a process and more ingrained in the process.
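To make that weighing-up step concrete, here is a minimal sketch of Franklin-style moral algebra written out as a small Python script. The dilemma, the reasons, and the numeric weights are purely hypothetical examples for illustration; they are not from the book or the episode.

# A toy sketch of Franklin-style "moral algebra": list the reasons for and
# against a decision, give each one a numeric weight, then compare the totals.
# Every reason and weight below is a made-up example.

def moral_algebra(reasons_for, reasons_against):
    """Sum the weights on each side and return both totals plus a verdict."""
    total_for = sum(reasons_for.values())
    total_against = sum(reasons_against.values())
    verdict = "for" if total_for > total_against else "against"
    return total_for, total_against, verdict

# Hypothetical dilemma: should David take the new job?
pros = {"more interesting work": 3, "higher pay": 2, "better team": 1}
cons = {"longer commute": 2, "less job security": 3}

for_score, against_score, verdict = moral_algebra(pros, cons)
print(f"For: {for_score}, Against: {against_score} -> leaning {verdict}")

The arithmetic itself is trivial; the point of Franklin's method, as described above, is the habit of writing the reasons down, letting the list sit for a few days, and only then weighing the sides against each other.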
Yeah, it's a kind of metacognition. You're thinking about your own thought process
and the feelings that are coming up based on what you're thinking about. And you're
trying to isolate where
those feelings are coming from and whether they should matter to you, I suppose.
Yeah, that's exactly it. It's, you know, it's kind of like making sure that that is a regular
part of your thinking, rather than just kind of assuming that it will come naturally or easily,
is kind of acknowledging that you actually need to apply that in every step of your thinking.
Should we be, you know, when we're trying to make a decision and trying to figure out if our intelligence or our expertise is leading us astray, when we have those, you know, strong emotions about one course or another,
are those things that we should be using our metacognition to disregard,
or should we be employing them in some way? I think you definitely need to employ them,
and you need to recognize your emotions and intuitions. And there's some good research
showing that, actually, that if you just ignore your emotions completely, you can be stuck in this
kind of analysis paralysis, where you just struggle to make any good decisions. Yeah. Because we do know that actually emotions are really good at kind of giving us signals from
non-conscious processing and, you know, helping to guide us in important ways. The problem is
that they can also be swayed by really irrelevant factors. So, you know, like one of the famous
examples is that if you're interviewed on a rainy day, you're much less likely to get the job.
Right.
Than if you're interviewed on a sunny day, because the interviewer mistakes the kind of good feeling that comes from good weather for your performance in the interview.
They're like, everyone's, everything is damp in here.
I don't like, I don't have a bad feeling about this person because my socks are wet.
Yeah, that's exactly it. And you know,
some people are just kind of, it's influencing the way they feel, but they're not aware that
that could be influencing their decisions. And you do see that the people who are just a bit more
aware of their feelings in general and better able to describe them in more nuanced ways
are less susceptible to those kinds of biases. So, you know, we want
our emotions, we want our intuitions, but we want to be more critical of them and more and to think
about them in a more analytical way and really interrogate them rather than just taking that as
evidence that, you know, we might be right. Do you think that the problems that we run into
with intelligence and expertise in, well, I was gonna say this
country, but this country and England and, and, and the other, you know, sort of broad countries
we'd, we'd classify as being, you know, sharing our sort of culture around these ideas. Is it,
are any of these problems because of the way our culture treats intelligence, education,
those sort of issues?
Or is there anything to be learned from the ways that it's done elsewhere?
Yeah, totally.
So, you know, I think like from as far as I know,
like the American and English education systems
do value kind of the same kind of qualities.
So we kind of reward children for being very quick
at learning and answering questions.
You know, it's always like
who can put their hand up first and answer the question the quickest that's, like, rewarded, rather
than the kids who might be sitting back and being a bit more deliberative. And, you know, we
kind of assume that, like, if learning is easy for someone, then they're learning it better, which
isn't necessarily the case, because there's lots of good research now looking
at the science of memory and learning, showing that actually the process of
frustration during learning can actually improve your long-term recall and your kind of deeper
conceptual understanding.
But in England and America, we try to kind of avoid any frustration or confusion. We want like slick textbooks that present everything in a very fluent and easy to understand manner.
And, you know, we kind of teach kids like how to solve a problem and then give them the problems.
Whereas what you see in East Asian countries, especially Japan, where this has been studied most extensively, they actually use the opposite approach. So they might benefit, the students benefit from what's called productive
failure, where they're actually given problems that they don't really know how to solve and just
asked to solve them. And even though that is frustrating, when they're actually then told and
the teacher explains how to solve the problem, then having been through that process
of frustration, their learning is actually a lot better. So their factual understanding and their
conceptual understanding is so much better. And their memory later on is so much better
for having been through the process of frustration. And the kids are given longer
to kind of answer questions. It's not just a kind of race for who's the quickest. And,
you know, all of these things that seem to
just encourage deeper thinking on a problem. And you do see that this seems to result in
a more curious, open-minded, intellectually humble mindset overall. So it's not just
improving their learning, it's just encouraging better thinking and reasoning overall.
Right. If you are confronted with your own failure to understand something and you
have that feeling of frustration, then of course you're going to be more intellectually humble
overall. But that's just it: when we have a great deal of expertise, we can tend to
want to avoid that feeling of not knowing, oh, no, I know the answer
immediately. I can use my reason to come to a conclusion very, very quickly that's going to
satisfy me and make sure I don't feel the pain of not knowing or being at sea with a problem.
But really, that feeling of constant fluency can blind us to understanding the issue.
Yeah, that's exactly it. It just kind of takes away the fear of failure and the fear of being
wrong. And that means that, you know, they're just more likely to preempt their kind of errors
because they're not trying to kind of cover up the kind of faults in their thinking. And, you know,
it's, I think that's something that we could definitely learn in the UK and the US: to cultivate a wiser way of reasoning, right from very early
on in someone's life, rather than always making everything so easy. That way, when people
come to the real world and they have really complex problems that don't have an easy solution,
they're not just expecting that there will be an easy solution
and trying to
falsely claim to have that particular answer to a problem. It's more rewarding too. I mean,
this is such a tangential example, but like I play a lot of video games and in video
games, you know, there's sort of two trends, like the trend that is more popular now is to
tutorialize everything, tell the player exactly how to do what they're going to do,
and then to tune all the challenges so that you never give them a challenge that they don't have
the knowledge of how to overcome. And so that they sort of almost never fail. You know, a lot of
games are designed to, well, be a little bit challenging, but you're probably not going to
fail that often. You'll actually just have the flow experience of overcoming every single challenge.
And that I now find kind of boring. And the games I enjoy now more
are the ones where you're thrown into a situation
where you are almost immediately going to lose and fail,
or you can't figure out what the goal is.
And the only thing drawing you through is your knowledge.
Hey, this is a video game.
There must be a way to solve this problem.
I just have to do it through trial and error.
I have to use my own knowledge to figure it out
and actually become good at it.
And that is such a deeper, more rewarding experience because that's how you actually end up building a skill within the world of the game.
And I can think of that, the same thing imagining in learning.
I mean, I remember taking a German class in my college
where none of the instruction was in English, it was just entirely in German. So the
first page of the textbook didn't have a word of English on it. You just had to figure out through
context clues what the words meant, which was more frustrating, but it was so much more,
I mean, I knew what those words meant because I had to figure it out for
myself, you know?
Yeah, yeah, yeah. And so much research has shown exactly that, especially with learning vocabulary, for example, that you kind of benefit from what's called a pre-test where you're kind of asked to just guess the meaning of the words without actually being told what they are.
And even though you're likely to get most of them wrong, having been through that process of thinking about it and getting it wrong, you then learn it a lot better and your memory is so much better
later on. So I just think that applies to all kinds of areas of life that we just need to
be more conscious of accepting ambiguity and confusion and struggle and seeing that not as
something to fear or to ignore or to cover up, but something that
should actually, we should embrace as an essential part of the thinking process. And if you do that,
then you're going to be protected from all of those other problems like motivated reasoning
later on, because you're not just trying to justify yourself, you're trying to actually
find the best possible solution. So let's end with this. When we're talking about those people who are very, very
intelligent and they're using their reason
in order to justify
false beliefs via motivated reasoning,
right? We've
talked about ways to avoid doing that
yourself, but when we're confronted
with someone who appears to have
a great deal of expertise, appears to have a
great deal of intelligence, is giving us a very
fluid argument for why we should believe that climate change is not dangerous or any other
sort of similar argument. And we are trying to evaluate whether or not that argument or that
person is on the level. How can we do that? How can we tell the difference between the intelligent
person who actually knows what they're talking about and the intelligent person who is just using their powers of reason to spin falsehoods?
Right.
You know, that's such an important question with the kind of political landscape being the way it is.
Yes.
You know, especially online, people can seem to produce this really convincing material, but the deeper
you look, the more it kind of crumbles. I don't think there's any easy solution, but I do think
it is something that you can learn, and it's kind of going back to the principles of
critical thinking. So, you know, we can learn about the common logical fallacies. Those are
those kinds of arguments that seem convincing,
but actually when you look at whether they actually logically prove the point,
they don't at all.
They just create the illusion of an argument rather than being a good argument.
So things like that.
You know, there's also lots of research showing that it helps when you combine that
with studying the process of misinformation and false argument
in lots of different contexts.
So, you know, when they've trained students on the tactics used by the tobacco industry in the 70s or 60s
to question the link between smoking and cancer, teaching people that specific context
can actually protect them against misinformation in all areas, you know, political conspiracy theories or climate change denialism. So it's just helping to kind of plant red flags in their brains to
recognise when you should start to interrogate an argument more carefully. And so I
think that's something we can all do: just maybe try to look a bit deeper, try to kind of,
you know, read about those kind of previous cases and to then apply
that same kind of thinking when we're talking about politics today or, you know, the environment
or anything else that's important.
But then, okay, let me throw this back at you because I was going to make that be the
last question, but this inspired me to ask you this because I read, I don't remember
where I read it, but it was a really interesting argument about skepticism as a process, right? That it can get so strong that you start to apply it to areas where it shouldn't apply,
you know, that you start using it to demolish arguments that are actually true in exactly the same way.
Your reason becomes so good that you're able to do that.
And the line that leapt out at me was, you know,
it's not rational to disbelieve something
that's right in front of your face, right?
And, you know, in exactly the same way,
a climate change denier would say,
no, no, I'm just being skeptical, I'm the true skeptical thinker, right?
When really they're disbelieving what's right in front of their face.
So how do we then make sure that we're not overusing it?
Because obviously we want to think skeptically and think critically, but, um, we could fall
into that exact same trap if we overuse that same process in a motivated, irrational way.
Yeah, yeah, yeah, totally. And I think that is a kind of danger. But the
surprising thing that these researchers found, in the example that I mentioned, when they trained
people about the tobacco industry's tactics and then presented them with misinformation
about climate change, was that actually they weren't falling into this trap that you described. So even people who were naturally more right wing,
more capitalist, who on paper should be more likely to deny climate change, were
actually still just as sceptical about the material on climate change denialism. So they
had kind of overcome that emotional motivated reasoning, because they
knew those kinds of tactics had been used in a really misleading way previously with the tobacco
industry. And one of the researchers just said to me, it's like no one wants to be hoodwinked, really.
You know, if you can appeal to that sense of wanting to be right and wanting to
know the truth, and if you can try to kind of instill that in someone,
then you will overcome that other kind of emotional motivated reasoning.
So I don't think there's a simple answer,
but I do think the research suggests that actually
what you've described is less likely to happen than you might fear.
So despite the fact that intelligence and,
you know, quote, rationality and skepticism and all these things can lead us into error,
we should still have trust in these general processes to lead us in the right direction overall, as long as we're still keeping those good, wise habits of mind.
Yeah, that's exactly it.
And I think it's always about looking at the kind of balance of evidence.
So, you know, like, yeah, I think it's actually really healthy to kind of interrogate evidence on anything like medical issues, climate change, you know, to question it.
But it's about looking at whether, on balance, the evidence for climate change is more convincing than the evidence against climate change. And, you know, I think if you look at it really honestly, and without that kind of motivated reasoning, you do see that the evidence for climate change is more convincing. And so I think that's it. Like, there's always going to be some areas that
are gray or some holes in the argument, but you're just trying to kind of balance them and see
which overall seems the most convincing at the time of you doing that analysis.
This is fascinating stuff.
I really thank you for coming on the show to talk to us about it, David.
Cool. Thanks so much. It's been a pleasure.
Well, thank you once again to David Robson for coming on the show.
His book, once again, is called The Intelligence Trap.
Check it out.
And that is it for us this week on Factually.
I want to thank our producer, Dana Wickens,
our researcher, Sam Roudman,
and Andrew WK for giving us our theme song.
And hey, if you want more fascinating information
or if you just want to find out what I'm up to,
head to my website adamconover.net
to read some updates or sign up for my mailing list.
Once again, that's it for us this week.
We'll see you next time.
Thanks for listening.
That was a HeadGum Podcast.