Modern Wisdom - #332 - Julia Galef - Learn To Improve Your Decision Making
Episode Date: June 10, 2021. Julia Galef is the co-founder of the Center for Applied Rationality, a podcaster and an author. Boris Johnson’s former chief adviser Dominic Cummings said that tens of thousands of Covid deaths could have been prevented if the Government had read Julia's book. Why is it that he swears by Julia's rationalist manifesto? Expect to learn what most people get wrong about confidence, the difference between a soldier and scout mindset, why attitude is more important than knowledge for effective judgement, how to avoid being self-deceptive, what the rationality movement has got most wrong and much more... Sponsors: Reclaim your fitness and book a Free Consultation Call with ActiveLifeRX at http://bit.ly/rxwisdom Get 20% discount on the highest quality CBD Products from Pure Sport at https://puresportcbd.com/modernwisdom (use code: MW20) Extra Stuff: Buy The Scout Mindset - https://amzn.to/2RM1RNT Follow Julia on Twitter - https://twitter.com/juliagalef Get my free Ultimate Life Hacks List to 10x your daily productivity → https://chriswillx.com/lifehacks/ To support me on Patreon (thank you): https://www.patreon.com/modernwisdom - Get in touch. Join the discussion with me and other like-minded listeners in the episode comments on the MW YouTube Channel or message me... Instagram: https://www.instagram.com/chriswillx Twitter: https://www.twitter.com/chriswillx YouTube: https://www.youtube.com/ModernWisdomPodcast Email: https://www.chriswillx.com/contact Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
Hi friends, welcome back.
My guest today is Julia Galef.
She's the co-founder of the Centre for Applied Rationality, a podcaster and an author.
Boris Johnson's former chief advisor Dominic Cummings said that tens of thousands of
Covid deaths could have been prevented if the government had read Julia's book.
Why is it that he swears by this rationalist manifesto?
Today I expect to learn what most people get wrong about confidence, the difference between
a soldier and a scout mindset, why attitude is more important than knowledge for effective
judgment, how to avoid being self-deceptive, what the rationality movement has got most
wrong, and much more.
I did not think that I would be using Dominic Cummings as an example for
an intro to this podcast, but here we are. It is 2021. Julia's awesome. Her insights into
the absolute cutting edge of what is effective human judgment and decision making are phenomenal.
She is one of the best voices coming out of the rationality movement and I'm confident
that you're going to take tons and tons away from today. But now it is time for the wise and wonderful
Julia Galef.
Julia Galef, welcome to the show. Thank you, good to be here, Chris.
Is being as rational as possible something that everyone should aim to do all the time?
Well, so, like any philosophy nerd, I want to immediately define all of the terms that you just used in that sentence
so we can understand how we're using different words.
But I'll just, I'll start with the idea of being rational.
I think a lot of people hear that word and they think of being like,
spock from Star Trek where you're not allowed to have any emotion
and you're only allowed to do things that you can justify in dollars and cents or in time efficiency
or something like that.
That's not actually, that's not how academics use
the word rational.
It's not how I'm used to thinking of the word rational.
Rationality is just about forming beliefs that are as accurate
as you can.
That's one kind of rationality.
And another kind is just making decisions that
help you achieve your goals. Whatever those goals might be, those goals could be helping people you care about or it could be enjoying life or succeeding at whatever you want, but the goal of
being more rational is to just find more effective ways to get those goals for lower cost or sacrifice. So, defined that way, I think it's harder to argue that being rational is not something we should strive for.
And so, yes, I think improving your, they're called epistemic and instrumental rationality, is a really valuable goal.
And I tend to argue that improving your ability
to see things clearly and have accurate beliefs actually is a really good way to achieve
your goals, to be more instrumentally rational.
There's a real synergy between those two kinds of rationality that I think people should
be more aware of.
The problem is that most people aren't aware of just how irrational they are being.
Most people presume that the decisions that they're making are going to take them towards
their goals, right?
If they have goals, they don't purposefully make decisions that don't take them toward
them, but it's kind of like the emperor has no clothes in a weird way.
They don't know that the things that they're doing are irrational.
Yeah, I mean, I think in the moment we never feel like,
yes, I'm doing an irrational thing.
But I do think that many people are able to kind of zoom out,
take a step back and think, okay, yes,
I do sometimes do things that are like predictably bad
for my goals.
I do sometimes put off working on a project
until the day before it's due.
And I know from past experience that that tends to shoot me in the foot and I do a worse job.
And but I still do it anyway because for various reasons, you know, there's always like a short-term
temptation that can pull you into doing something that you know is bad for your goals.
Or, you know, you might lash out in anger at someone, even though there's no real expectation there that you're
going to improve the situation by yelling at someone, but you just feel compelled to do
it anyway.
And so I think people can often recognize that at least in the past they have done things
that they now accept were not the most rational strategy for getting the things that they want.
But in the moment, it's really hard to recognize that.
So why do some people see things more clearly and others don't?
So this is a big question that I spend a lot of time thinking and writing and talking to people
about. And I'll just try to give you a very, very simplified answer so I don't spend two hours answering this one question. I think some people have, A, some people
care more about seeing things clearly and accurately.
And so they're more motivated to double check
their initial intuitive guesses about what's going on.
They're more inclined to say, well,
before I share this story on social media that
seems really true to me, maybe I'll just spend a moment fact checking it just to see if there's any obvious flaw I'm missing.
So some people just, you know, are already more motivated to do that because they like being
the kind of person who just sees things as accurately as possible.
And then also a big part of the puzzle, I think, is that some people just for various reasons
have developed more effective emotional skills to cope with some
of the stressful or difficult aspects of seeing things clearly.
And it can be stressful and difficult, you know.
It can be uncomfortable to notice that you were wrong about some political position you've been arguing
passionately to your friends and family or saying publicly; noticing that can be very uncomfortable,
or just to seriously consider,
unflattering or difficult truths about yourself
that maybe I do tend to lash out at people unfairly,
or maybe I did screw up at that problem at work,
maybe it's my fault after all.
That's not fun to recognize,
but I think some people have developed tools
that allow them to kind of tackle those situations
and be willing to see things clearly even though
it's not the easiest thing in the world.
So I think a lot of the task of moving towards
what I call Scout mindset,
the ability to really want to see things as
clearly as possible, even if the truth isn't what you wish it were.
I think a lot of that progression involves just developing
these emotional skills that make seeing the truth easier on you.
What's the opposite of scout mindset?
Right. So I have this metaphor that I talk about a lot called
Soldier mindset versus Scout mindset.
Soldier mindset is my term for this often unconscious motivation that's guiding your thinking when
you're reading an article or listening to an argument where the motivation in the back
of your mind is trying to defend your pre-existing beliefs or trying to defend things that you
want to believe against any evidence that might threaten those beliefs.
And so you're going to be much more accepting of evidence
if it supports what you already believe or want to believe.
And you're going to be much more motivated to find flaws
in evidence that goes against your beliefs.
So this phenomenon isn't new.
It has many names.
You might have heard it under the name Motivated Reasoning.
That's what cognitive scientists often call it.
And then there's other sort of colloquial terms like rationalizing or wishful thinking
or self-justification.
Those are all facets of what I call soldier mindset.
And then scout mindset is just the alternative to that because the scout's role, unlike
the soldier, is not to attack or defend.
It's to go out, see what's really there as clearly as possible,
and to form as accurate a map of a situation or a topic
as possible, including all the things
that you don't yet know or that you can't be certain about,
and always being motivated to learn new things that could help
you revise your map and make it more accurate.
So the map is always a work in progress
that is going to be subject to revision.
And so Scout mindset is basically just trying
to be intellectually honest and as objective as possible
and just curious about what's actually true.
Why do you think most people default to a soldier mindset
rather than a Scout mindset?
I know far more soldiers than I do scouts.
Yeah, I mean, so to be clear, we're all a mix of both.
We're all, you know, somewhat soldiers and somewhat scouts
and we can fluctuate between the two mindsets.
You know, I'm often in scout mindset
when I'm thinking about some intellectual issue,
but if someone is confronting me and arguing with me
and especially if they're being kind of a jerk about it, then it's much easier for me to slip into soldier mindset, where I really just want to prove myself right.
Or, you know, the context can matter a lot, like the topic. Someone might be
really good at being a scout at work, like let's say they're a trader, a financial trader,
and maybe they're really good at trying to test their own assumptions about the market
and they're quite happy to discover that they were wrong about something
because it means they can improve their trading strategy.
But then they come home and they're a soldier in their personal relationships
and they're unwilling to even consider the possibility that other people's perspectives might be valid
or completely closed to the possibility of problems in their relationship, things like that.
So, we're all a mix of both, but some people are better at being more of a scout, especially
in these trickier, you know, emotionally and ideologically fraught situations.
And so I forget your original question.
Just why do people, uh, default toward it?
Oh, why do we, why do we so often default to the soldier?
Yeah.
Yeah, that's right.
Right, so there are things that soldier mindset does for us,
at least in the short term, that can be very tempting
and can pull us into soldier mindset,
even if scout mindset is better for us in the long run.
So one of those benefits is just comfort,
just like feeling better
about our lives or the world. We often are motivated to reach for, reach for defenses
of some narrative in which we're a good person and things that happen weren't our fault
and, you know, things that are wrong with our life are, are someone else's fault. So,
I don't know, suppose you're, suppose you tend to do poorly at standardized testing,
like SATs or IQ tests or whatever,
you might be especially motivated to believe articles
that tell you that standardized testing
doesn't actually measure anything important.
And it's like a meaningless test or something
that might be an especially appealing belief
for you to try to accept.
So, soldier mindset can give us comfort,
it can make us feel better about ourselves and our lives.
It can also, people also sometimes use soldier mindset
to help motivate themselves.
They'll try to convince themselves that,
if I just work hard, I'm guaranteed to succeed,
which working hard is great
and it will help your chance of success.
But there's a lot of randomness and luck in life
and a lot of things you can't control
and we're often kind of motivated to downplay that
because we are afraid it's going to demoralize us.
And yeah, there's lots of other things
that people try to use soldier mindset for,
to feel good or to look good to other people.
And that's all totally understandable.
I'm not gonna come on here and say,
you shouldn't worry about feeling good or looking good
because those are very valuable parts
of being
a happy and fulfilled human being.
But I think there are better ways to get those things
that don't require you to tell yourself falsehoods
and deceive yourself.
It definitely seems to reduce complexity.
When you don't consider all of the options
and the fact that you might be wrong,
conviction goes up, culpability probably goes down and
complexity gets reduced.
And that creates a scenario in which you're just more certain, and the parameters,
the degrees of freedom within which you could be wrong, have been reduced.
Right.
No, that's very well put.
And I think that simplicity can be very appealing to people.
And it can also be a way that people are trying to
avoid a common failure mode, which I agree is a failure mode that we should avoid, which is like analysis paralysis, where people are afraid, well, if I don't just stick to one way of looking
at things, or I don't just stick to the first plan I think of, then I'm just going to get
mired in uncertainty. I'm never gonna be able to act.
Yeah, or nihilism, right, exactly.
If you allow yourself to consider other moral perspectives
or other, whatever.
Yeah, so this is also a very legitimate concern
because analysis paralysis can be a terrible thing
and I've seen people get stuck in it.
But again, I think there are better ways
of avoiding analysis paralysis
that don't require you to insist on false certainty.
So for example, a lot of the very successful scouts, people who are good at scout mindset
who I know, are just really good at the skill of saying, well, I'm going to spend a little
while considering some options, then I'm just going to go with whatever my best guess
is right now.
And I'm just going to act on that.
I'm going to make this business plan that seems best. Like, I've bounded the amount of time I'm
going to consider this, and I'll just go with whatever my best plan is that I can come
up with in a day or something.
And I'll just go on with that until some new evidence comes to light that makes it worth
reconsidering.
If we get some feedback from our beta testers that is surprising and, you know, makes this plan seem flawed, then we'll reevaluate
then, or maybe I'll reevaluate at a set time in six months or something. But until then,
I'll just act on this assumption that this plan is worth doing, and then we'll, you know,
revise if and when we need to. And so they're able to keep that awareness of uncertainty
in the back of their mind.
It's bounded in a way. Yeah, I mean, it has to be bounded, right? We humans have limited time and
energy and computational power. And so even if you wanted to, you know, consider all the possibilities
every time you wouldn't have the time to do that. So it's just really important to be able to
act on imperfect information and just
trust that you can, you know, iterate and revise as you go. And I think that's like a much
healthier approach than trying to be 100% confident in, you know, whatever your current
guess is so that you don't have to deal with the complexity.
Hmm. Can you tell us that story from the book about the guy who got stranded at sea
in the raft?
Oh, yeah, I just, I found this such an engrossing and moving story.
So the guy, his name is Steve Callahan, and he was a sailor, just a, you know, on his
own, not a professional sailor.
And he set off, I think this was in the seventies or 80s, I can't remember. He set off on a solo
voyage across the Atlantic. And he just had a stroke of bad luck. I think it was a whale hit his
craft and broke it, and so it started sinking. And he managed to escape in a life raft that he
had brought with him on the boat. But so he was alive, but he's stuck in the middle of the Atlantic Ocean.
The nearest land was 1400 miles away, and he had a little bit of food and water that
he'd managed to salvage as the ship sank, but that was it.
So it was a pretty dire situation. And so to back up for a minute, the reason that I focus on this story is that,
you know, as I mentioned, soldier mindset can give us comfort and can reassure us in times of
stress or insecurity or fear. And these kinds of survival situations where, you know, the stakes
are life or death can be extremely stressful and fear inducing. And because of that, people in these situations are very often tempted to reach for soldier
mindset to assuage their fears.
They're tempted to insist that I will definitely be rescued.
There's no question.
I will be rescued.
Or sometimes they go the other way and they just collapse into fatalism and say, well,
there's no hope.
There's nothing I can do to salvage the situation because sometimes that little bit of hope is even more uncomfortable than having no hope at all
because it means you have to try and you have to worry and you have to et cetera. So
there's all of these temptations towards soldier mindset in a survival situation.
And that's unfortunate because a life or death situation is when you most need clear-eyed judgment. So when you most need to be able to weigh the odds as best you can of, well, should I wait
here to be rescued or should I go out to seek help, which one strategy is more likely to work?
For Steve Callahan, the kinds of decisions he had to face were things like,
I have a few flares that I've salvaged. When a ship passes by, you know, I have to judge, is it close
enough that I should waste one of my flares in hopes of being seen, or is that just going
to be a waste of a flare because it's too far away? How much water can I afford to drink per
day? So there are all these really tough judgment calls that make a real difference in whether
you end up surviving or not, and your best hope of making those decisions well is to just be
as clear-eyed objective and realistic as possible.
And so what Steve Callahan was able to do was to comfort his own fear and anxiety without deceiving himself,
without telling himself falsehoods like I'm definitely going to be rescued. So he found other ways. I call them honest coping strategies,
ways of comforting yourself without lies.
So for example, one thing he did was just to remind himself, he used this mantra, he said,
all I can do is the best I can.
I'm doing the best I can, which was true.
He was doing the best he could.
He just kept reminding himself of that, and that was comforting to him.
And he also worked on his memoirs while he was in the boat, because he figured,
you know, even if I die,
maybe someday this raft will be found,
and my memoirs of this experience could be useful
to someone else, and he found that comforting too.
So, these honest coping strategies that Callahan relied on
allowed him to keep going without deceiving himself,
and his story has a happy ending.
He eventually made it after three months,
almost three months of drifting in this raft
slowly across the Atlantic Ocean.
He got to the Caribbean.
I think he'd lost over a third of his body weight
at that point, but he made a full recovery
and now his memoirs are this amazing book called Adrift,
which is a story of how he survived both logistically
and emotionally in the Atlantic for all that time.
So I love his story just as like a gripping adventure story and also as this kind of emotional role model that I try to follow.
And he had five pints of water left when he was found as well.
He did. I was so impressed at that. What self-restraint, you know?
Outrageous.
Oh, I should mention he, yeah, I mean he had very little water at the beginning, but he managed to rig up this device to collect rainwater on the raft so that he could replenish his supply of water.
Still, it was a very slow rate of replenishment, so he could only afford to drink, I think
it was like a liter a day or something.
It amounted to about one gulp of water per hour, I think.
Yeah, it's interesting that you say that he found comfort in rationality.
So I was reading Red Rising, which is a series by Pierce Brown,
it's just a sci-fi series that I've addicted half of the audience to. So good, it's
outrageously good. And in it there's this line, and the protagonist is talking about the fact
he's in this emotionally charged situation, he's at war in space and stuff, because space. And he, he says, someone comes in and asks him about how he's
feeling or sort of brings him back to this social issue that he's having while he's in the middle
of planning the attack, this new attack that he's doing out in space. And he says that he was ripped out
of the cold comfort of rationality and into the emotion of it all.
And I thought that was really poignant. I highlighted it. I've saved it down in my favorite quote.
And this is something that I think, I've been thinking about this for a while to do with the rationality movement at large. So LessWrong and Slate Star and all this sort of stuff.
I wonder how much of the comfort that people take in the rationality movement at large
is them trying to wrangle the infinite complexity of the world and the inherent fear that comes
from being a finite creature, like we are surrounded by infinite complexity.
I wonder how much of the compulsion and the desire and the enjoyment of the rationality
movement comes from creating a little bit of order,
trying to reduce the amount of chaos that you're surrounded by. What are your thoughts on that?
Well, I guess that would seem more plausible to me if the non-rationalists were all wrestling
with uncertainty and feeling, but like...
It's a Pandora's box with that though, right, isn't it? It's like, if you don't know that it's there, you can't know your own ignorance if you're still
oblivious to it. Oh, I see your suggestion. Okay, okay. Got it. So you're suggesting like,
the rationalists are people who have, they have opened Pandora's box. They have, they have
become, you know, aware of how much they've stepped out of Plato's cave. Yeah. Right.
To pick another ancient Greek metaphor. So they've
become aware of all the uncertainty. And so they're clinging to rationality
as a way to try to, you know, wrangle it, wrangle the uncertainty into order or something.
That's just something that I considered. It's possible. I mean, I'm maybe one of the worst people to try to give a nice unbiased take on your
psychoanalysis of the rationalists, since I'm like more inside of the group than outside
of it.
I mean, I'd be curious for your take on the actual analysis or approach to wrangling uncertainty
that rationalists tend to use, because it's, again, I'm not unbiased here, but it's not
that simplistic.
It's pretty complicated.
There's always one rationalist will suggest one thing and then another rationalist will
be like, well, but we have to add this nuance here, or, you know, well, but there's this flaw in that approach.
And so it doesn't feel to me like we're just like reaching for some simplistic explanation.
No, no, I don't think it's, I don't think it's sort of a reductionist or super, super simplistic
approach.
But I do wonder whether the deep core motivation part of that comes from this desire for us to just make the world feel a little bit less crazy.
Yeah, because we can't rely, especially the people in the rationality movement, they can't rely on the old archetypes and narratives that would have given them that sense.
Why, everything's in service of God, or it'll be fine when judgment day arrives.
Another question I had for you given that you are
knee deep in the rationality movement.
What do you think that the rationality movement's
got most wrong over the last 10 years or so?
What are the blind spots that they've overlooked?
I'm not hesitating because I have no answer to this. I'm hesitating because I'm trying to, like,
winnow down all of my thoughts into something concise.
Um, I...
Well, okay, one... Sorry. One thing that I've been kind of converging on in the last few
years is that, so one of the main things that defines the rationality community is a,
an emphasis on these kind of norms of discourse that you're supposed to follow if you want to participate in discussions.
And there are norms that are intended to kind of help push us all or keep us all in scout mindset
and make it easier for us to kind of collectively get closer to the truth, hopefully. And so these
norms include things like, you know, you shouldn't dismiss someone's argument just because you don't like them. You should like take it at face value or assume, you know,
that your fellow, your interlocutor is arguing in good faith.
Don't just say, you know, well, you would say that
because you're a white male or you would say that
because you're just trying to defend your country
or whatever, just like deal with the arguments
that people make as they make them and just talk about the ideas.
And that, it also includes things like,
you should be willing to talk about the evidence
that you have for your beliefs,
or even if you don't have randomized controlled trials
behind everything you say,
you should at least be able to talk about,
well, here's, here's what I think my intuition
might be based on, instead of just saying your belief
and expecting people to take it as the truth,
you should be willing to talk about the reasons.
So I think these are great norms, of course I do,
because I'm part of this community.
But there is a potential downside to them
that I think I've become more aware of in the last few years,
which is that they're somewhat gameable, or they, to put
it a different way, they create the potential for exploitation by bad actors.
So, you know, knowing that we have these norms, someone can come in and say, well, you know,
here are all of my theories about whatever controversial topic.
And they could actually be quite disingenuous and even harmful claims, but the rationality
community has committed itself to evaluating all arguments as if they're being made in
good faith and to not dismiss something just because it sounds evil or crazy or weird.
And so that can lead to some ideas being taken more seriously or given more of a hearing
just because the rationality community doesn't feel like they're allowed to reject something
just because it sounds bad.
Because someone's a dick.
Yeah, because someone's a dick.
And so this has come back to bite us in some ways.
Slate Star Codex, who you mentioned, or Scott Alexander, who wrote the blog, Slate Star Codex,
and now blogs at Astral Codex Ten, has, you know, he's really committed to these norms of
discourse and the comment threads on his blog and also on some of the, like forums on Reddit
that he's moderated. And, you know, there are, as you would expect
in any forum where people are committed
to taking ideas seriously, even if they sound weird
or fringe, that attracts some people with like very
unpopular unsavory ideas.
And the fact that, you know, a prominent minority
of commenters in his community,
have these unsavory ideas, has definitely been used like against him
But anyway, I was supposed to be criticizing the rationalists and not defending them.
I think that's a good point, that the playing field, the gracious playing field which is given in the hope of intellectual purity, is kind of one of its flaws.
To take it back a little bit of a step,
I've got a suggestion that comes from kind of like
the kindergarten kiddie's play pen version
of the rationalist movement, which is
the mental models world, which has been popularized
by Shane Parrish from FS.blog.
And there's been basically an unlimited number of books
about mental models written over the last few years.
And what I think you've done a good job of with the Scout mindset is you identify that
simply being able to glossary term define a whole bunch of different mental models doesn't
really do very much for us at all.
The fact that I know that I can define availability bias or the fundamental attribution
error or survivorship bias or Occam's razor or Hanlon's razor or whatever. The fact
that I can identify all of these different things doesn't really get me all that closer
to rationality. It means that post hoc I can look back and perhaps see what I did wrong,
but you have this quote that says, our judgment isn't limited
as much by knowledge as it is by attitude. And I think that really hits the nail on the
head that we want, we need to be attitude and action focused. If we are going to aim
towards seeing the world more accurately, making more rational decisions, being less self-deceptive,
it's something, it's not something that you can armchair philosophize about, and I think that the rationalist
movement, at least the parts of it that I have seen, sometimes attract people who enjoy
the philosophical debate, but when it comes to putting skin in the game and actually making
changes, that's perhaps a step a little bit further beyond,
it's one that's more difficult to prove
on the internet as well, so there is a caveat there.
It's like, I don't know quite how much
these people are making changes,
but there's certainly people within this.
There's also the selection problem of like,
the people you see arguing on the internet
about rationality are like,
by stipulation, the ones who are less motivated
to go out and do things.
Do the properly rational thing.
But that's it, just that I think naming,
there was a period about sort of three or four years ago
where everyone was adamant that simply by knowing
all of the different mental models they would somehow
get closer to rationality.
And they didn't, all that they did was take up mental bandwidth
learning glossary definitions.
Yeah.
And it's called insight porn sometimes.
Insight porn.
Insight porn.
Insight.
Yeah.
Yeah.
Yeah.
I mean, I agree with that as a common pitfall or failure mode of like the rationalist approach.
And I think I fall prey to it too, that I have a bias in favor of ideas that are intellectually interesting, or I have a bias in favor of supporting causes that are kind of, yeah, intellectually interesting, or even, to be less flattering to myself, showing you what I know. Oh, you don't know that? Don't worry, don't worry, I'll explain it to you. Sit down. Right. Like, if it turned out that the
results of my, you know, my mental calculations was that the most important thing to care about was
climate change, then like in a sense that would be a bit disappointing, given my, you know,
personality, just because like everyone cares about climate change. It's not like a very cool
thing to care about.
Like I want to care about something more esoteric or obscure or sophisticated or something.
And so I mean, I'm like trying to psychoanalyze myself and other people like me here.
And I'm just guessing that this is an effect that's working in the back of my mind.
And I do try to correct for it when I can. Like I try to, you know, I try to
motivate myself to be the kind of person who just really tries to care about whatever happens
to be the most important thing and I like the idea of being that kind of person, hopefully
more than I like the idea of being like some person with esoteric tastes and interests. So I'm trying, but I agree that that is a kind of blind spot
that I suspect rationalists fall prey to
more than the average person does.
I would agree.
So dig into that attitude and knowledge limitation.
What sort of attitudes can we have
that will foster better self awareness?
Yeah, well, the first thing is just to actually want to notice yourself being in soldier
mindset to feel good about yourself when you notice, oh, I was just being defensive there.
Oh, I, you know, something I often notice is, oh, when I answered that person's objection
or that person's question, I didn't actually think about it.
I was just like, if I pay attention to the mental motions I was going through, I was just
reaching for something I could say that would knock down their objection.
I never actually stopped to think about whether they could be right.
So let me interject there.
What's the thing that you do to create a break point in that?
Have you got a practice that you have?
In the reflex to stop the reflex.
Well, part of it is just simply paying attention to it
and building the habit of noticing.
But there are some concrete things as well that help.
One thing is to step back and remind myself of, this is an example of kind of an honest coping strategy,
like an emotional tool to make yourself more motivated to be in scout mindset instead
of soldier mindset, even when the temptation to soldier mindset is there.
And a specific thing I often do is just to remind myself that, oh, if I admit that I'm wrong
in this argument, that might be uncomfortable now,
but the silver lining, the upside is that by doing so,
I'm going to be building this track record
in which I've proven that I'm arguing in good faith
and I'm not the kind of person who's just going to stick
to what she's saying just for the sake of it,
just to stick to her guns.
And so I'm kind of investing in my future credibility by, you know, if it turns out that I actually
was wrong in this argument, then, you know, the silver lining to that is that I can build
a reputation as someone who's a credible, good-faith arguer.
And so sometimes just remembering that silver lining doesn't necessarily make it fun to notice that I was
wrong, but it makes it tolerable.
It makes it just palatable enough that I'm kind of willing to do it.
So I think that can help.
Noticing the silver linings or the upsides to being wrong or to changing your mind.
And then I also have some tools for the noticing part, like tools to help you notice that you're
actually in soldier mindset.
There's a whole category of things called thought experiments where you, just for example,
suppose you find yourself defending a politician on your side who just did something that he's
coming under attack for, a thought experiment you could do that would be useful in that situation is to imagine
that a politician from the other party had done the exact same thing.
What would your reaction have been?
Would you feel like defending him too?
Or would you say, like, oh no, that's terrible.
That's, you know, he should be just taken out of office or something like that.
I just, I call that an example of, or that is an example of what I call the double standard test
where you kind of check to see,
am I applying a double standard to people on my side
versus someone else?
Am I applying a double standard to judging my own actions
versus someone else?
Am I applying a double standard to claims I want to believe
or, you know, that I would apply to claims
I don't want to believe?
So in situations where I think it's possible,
I'm in soldier mindset, I will sometimes do a thought experiment like that just to test.
What other thought experiments do you like? Well, another one that I use a
fair bit is called the outsider test. And the goal there is to check to see if
your judgment about what's best to do in a given situation would be different if
it wasn't you in the situation.
So an example of some people who aren't me
using the outsider test comes from Andy Grove
and Gordon Moore, who are two founders of Intel.
They made memory chips originally,
and in the 80s their market share had been really high
but then started plummeting as Japan found ways to make memory chips much faster and cheaper
than Intel. And so everyone was kind of tearing their hair out and like wringing their hands. And
in his memoir Andy Grove tells a story of being in his office with Gordon just going back and forth about what to do. And he asked Gordon, what do you think, like, if the board kicked us out and got a new CEO
and brought him in, he walked into this office, what's the first thing he would do?
And Gordon said, oh, he would get us out of the memory chip business.
And Andy was like, yeah, he would.
So what's stopping us from just walking out the door coming back in and doing that ourselves?
And the answer was nothing except that they had been so kind of stuck in the status quo that they were used to of like, well, we're a memory chip company. We should
always make memory chips. It was kind of their identity. Andy said it felt almost like
blasphemy to consider giving that up. But a new CEO would
have no such attachment. So just imagining what an outsider would do in that
situation or like what you should, what you think an outsider should do can often cause you to
realize, oh, this is, you know, my judgment was being clouded just by my past attachments.
It's almost like, I forget who wrote this. Some writer had this great metaphor of like,
it's like hanging a sign around your neck
that says under new management.
And just like getting rid of that past baggage
or imagining what it would be like
can give you a very different take on a situation.
So double standard test, outsider test,
two examples of thought experiments
that can help you notice and then hopefully compensate for bias in your judgments.
That third party perspective is so important.
It's like, it's such a scalable cure-all for so many of the things that you do because
we're all capable of giving friends amazing bits of advice.
You know, we say the thing to the friend that doesn't know what to do in the relationship,
and you're firm but compassionate with them and you're understanding but you're not a total walkover,
and you maybe give a few different points of view. And then when it comes to your own situations,
you just lambast yourself about how your shortcomings are awful and you're
failing at this and you watch your own blunders from a front row seat and then give yourself a kick in the ass
on the way out of the door.
It's such a, and you're like, well, hang on a second.
If there's anyone that you should be a best friend to,
it should be yourself.
And yet we find it so hard to treat ourselves
like someone that we are responsible for helping
to quote Jordan Peterson.
And that third party perspective
to develop that
metacognizance and to step away from the identity, I think is super, super important.
You looked at whether self-deceived people are happier. What did you find out about that?
Yeah, so this was a bit of common wisdom that I investigated for my book. I'm sure many of your
listeners have seen the articles and books saying things like
self-deception makes you happy or you know a little bit of delusion is good for your mental health
or things like that. There have been a lot of articles and books like this in the last 15 years or so
and so I looked into the research that these articles and books were citing and was kind of appalled at how bad it was.
So I'll just, I'll give you one example. Here's a, a scale measuring self-deception. So it's like a
measure of how self-deceived a person is. And the way the researchers measure this was to ask people questions like, do you ever
get angry, and they ask you to answer on a scale from one, never, to seven, all the
time.
And their rule was, if anyone gives a one or a two, then they're a self-deceiver.
So in that example, you might think, yeah, okay, well, I mean, everyone gets angry.
So anyone who says they never or rarely get angry must be lying to themselves, right?
Well, I mean, maybe I know some people very well who, you know, they've been my friends
for years and they get angry like once a year.
I think that's me.
I would be in that category, yeah.
Yeah.
Right.
And so maybe it's not that common, but there's definitely people who rarely get angry.
And so they would honestly answer two out of seven,
and then they would be classified as self-deceivers.
And then the questions get weirder from there.
So another question is,
are you attracted to members of your same sex ever?
Like do you ever find someone of your same sex attractive?
And if you answer one or two, like never or rarely,
then you're classified as a self-deceiver. So the reasoning is that everybody's a bit gay.
Yeah, we've done that. Everyone's a little bit gay.
You're a little bit gay.
Are we sure that the researchers didn't have a, was this for the Gay Times? Was this for
the Gay Times?
Oh, the Gay Times.
Yeah.
I think that's deeper. This is like, these are respected researchers whose work has been cited in like very best selling books
and in very popular podcasts, and I won't name names,
but it's like a bad high school problem.
Yeah, and then I'll just give you one more question
since I can't get enough of how dumb this study is.
Another question was, do you ever fantasize about raping
or being raped? And if you say one or two,
like, i.e. never or rarely, you are a self-deceiver. I don't think that this research tells us
very much about self deception. Although it does perhaps tell us a little bit about the
psychology of the researchers themselves. So yeah, and you know, that's one particular metric that was just used in some studies,
not all of them, but most of the studies I found investigating this question of like self-deception,
you know, or finding that self-deception makes you happier, most of those studies had some
equivalent problem in the study design or how they were measuring self-deception.
And so I ended up just feeling like, well, okay, maybe self-deception can make you happier
that might be true sometimes, but I don't believe that because of the research.
The research itself is...
So, inconclusive.
I mean, I have other reasons for thinking self-deception may or may not make you happy,
but not research.
What's your outcome from your own ideas?
Oh, I think just based on being a human in the world and like
observing things and experiencing things, I think self-deception can make you happier
in the short term. I think it's much less clear whether it makes you happy in the long term.
And my guess is that just cultivating the ability to look at things realistically and
and cope with reality
is probably a better strategy for happiness in the long term. But at least in the short term,
I think, yeah, it's pretty clear that it can make you happy to convince yourself that you weren't
at fault or that your mistakes aren't really that bad or everyone loves you. Of course,
yeah, that can make you happy in the short term. Do you think that learning to be wrong is a skill
then that we need to acquire in a way?
Yeah, in many ways.
I mean, it's partly an emotional skill as I was kind of talking about
throughout this conversation.
It involves emotional tools, like recognizing the positive,
the upside of admitting you were wrong.
It's also a, it involves kind of a different frame,
like a frame shift in the way you think about being wrong.
So, you know, I kind of alluded to one of the aspects of being a scout
is acknowledging uncertainty and never being 100% certain of your beliefs,
always having
your confidence level somewhere on a scale from 0 to 100, shades of gray and not black and white.
And so that's kind of a different way of thinking about things.
It's a cognitive shift or a frame shift.
And one nice thing about that I find is that it can make it easier to acknowledge evidence against
your views because doing that doesn't mean giving up a belief entirely. It just means
a kind of incremental downgrade in your confidence level. So if you start out being 85% confident
that a paleo diet is the healthiest diet someone can have. And then you read a new study suggesting that,
oh, actually, researchers didn't find any benefit
to the paleo diet over whatever traditional
locality diet or something.
Reading that study doesn't have to immediately
undermine everything you believe.
It just means maybe you should downgrade your confidence
from 85 to 65 or 70 or something.
And then as you read more and experience more,
and maybe there's a follow-up to the study,
maybe your confidence level will go back up, or maybe we'll go down even more, but either way,
each time the update that you have to make to your map of reality is kind of gradual and incremental.
And so I think that's like an underappreciated benefit of recognizing uncertainty in the world
is that it makes the adjustments you have to do
kind of emotionally easier. So there's a number of things like that that are just
like differences in the way you see the world that can make changing your mind easier.
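[Editor's note: to make the incremental updating Julia describes concrete, here is a minimal sketch, not from the episode, of how a confidence level can shift with new evidence via a simple Bayesian odds update. The likelihood ratio used is an illustrative assumption, not real data about diets.]

```python
# A minimal sketch (illustrative, not from the episode) of the incremental
# belief updating described above: confidence lives on a 0-100% scale and
# each piece of evidence nudges it, rather than flipping "right" to "wrong".

def update_confidence(prior: float, likelihood_ratio: float) -> float:
    """Bayesian update: convert the prior probability to odds, multiply by
    the likelihood ratio of the new evidence, and convert back."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Start out 85% confident that a paleo diet is the healthiest option.
confidence = 0.85

# A new study finds no benefit. Assume (purely for illustration) that such a
# result is three times more likely if the belief is false: likelihood ratio 1/3.
confidence = update_confidence(confidence, 1.0 / 3.0)
print(f"Confidence after the study: {confidence:.0%}")  # roughly 65%
```

Either direction of adjustment is gradual, which is the point made above about why acknowledging uncertainty makes changing your mind emotionally easier.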
We see being right or wrong as binary, right? We think, okay, it's either my
proposal, my thesis, or something else.
And that's it.
It's interesting that you talk about confidence intervals
because Scott Alexander every year makes his predictions
for the end of the year, right?
I'm 75% likely, there's a 75% likelihood
that my bedroom will be painted yellow
and this and this and this,
and this, I've been rejected.
I accidentally listened to that
on the Slate Star Codex podcast, which was the worst thing
to listen to it on because it's just somebody reading off statistics.
It's a 15 minute bar chart graph thing that he was reading through.
He didn't try to narrate the graph that he drew of his calibration?
And then there's a data point.
No, he does.
He tries to. But anyway, it's slightly ugly.
But yeah, I like thinking about things in terms of that.
You also talk about beliefs and identities.
And I suppose that this adds sort of a psychological layer
on top of this, the fact that you can wrap the person
that you believe that you are
around being right or another group being wrong, so tightly that this further creates
more self-deception or more murky waters to see rationality through.
Right, and that noticing that you were wrong about some, you know, random political or ideological or
intellectual issue can be a blow to your identity just because you pride yourself on being
the kind of person who's always right.
Is that what you're saying?
Yeah, yeah, I agree.
Yeah, I think it's very important to kind of tie your identity not to having the right
beliefs because then you're never allowed to change the beliefs,
but instead tie your identity to taking the right actions.
And by the right actions, I mean,
things that actually help you improve over time
or help you make your map of the world more accurate over time.
And so, you know, I would put things in that category
like being able to say that you were wrong
or being able to understand the views
of people who disagree with you, like being able to accurately represent the opposing perspective,
you know, fairly and charitably enough that people on the other side go, yes, thank you,
that is what we believe as opposed to a straw man of the other side. And so I think if
you pride yourself on being able to do things like that to change your mind and understand opposing views,
then over time you're incentivizing yourself to do the things that make you stronger and more accurate over the long term,
as opposed to priding yourself on holding a particular view that you then can't let go of.
It's inevitable that you're going to have to loosen your grip on your identity if you are constantly seeking out counterpoint
to your position. Right, you can't be a good faith actor intellectually in any kind of
discourse and steelman somebody else's position properly without thinking, well, this actually
does mean that I need to let go of the identity that I have that's wrapped around my particular
ideology or my particular viewpoint a little bit. It has to. And I think thinking about it
from a competitive advantage standpoint, there are so many ideologically bound
bad actors that are just sort of dogmatic single track thinkers. Now, especially
in 2021 with social media and sort of extremism out to both sides,
etc. Now more than ever. Yeah. You can stand out from the crowd by being radically reasonable,
far more easily than you can by trying to be an extremist. Yeah. This is the,
I think it's an underrated niche, honestly. Honestly, an extreme centrist is a really radical position to hold now,
surprisingly.
I did this video about Sam Harris a few months ago where I said that,
what you do by holding a nuanced viewpoint that is non-typical and also somewhere in the middle,
you're paying a high cost because you guarantee disagreement from both sides. By being in the middle, you guarantee disagreement
from both. At least when you're out on one extreme, you know that you're going to
be agreed with by that group, right? Right, right, right. So what it is, it's a hard-to-fake
signal of intellectual honesty because you need to pay a very high price to be
in that middle. And what are you signalling? I am the sort of person who
has thoroughly thought through his ideas. I am the sort of person who's prepared to take
flak from both sides simply because I want to be right here, even though it would be
easier for me to be out towards the edges. And if people want to separate themselves out
from the pack, I think that's quite
and again, you can gamify the system and you can purposefully choose to have a nuanced
viewpoint and always talk about how you always see both sides of the thing, but overall,
I think, yeah, sitting in the middle makes a lot of sense.
Yeah, I mean, I think that's very well put and I think, you know, you're never going
to appeal to everyone.
And so, you know, you just have to decide like, what kind of audience am I optimizing for
impressing and winning the respect of?
And you know, you could optimize for winning the respect of the audience that just like
passionately believes one thing and they just want to hear you say that and
reaffirm it. That's like a choice you could make. But you could also optimize
for winning the respect of the audience that is like more
discerning and cares a little bit more about what's actually true and is
more more interested in hearing challenges to the things that they believe.
And you know it's not a tiny fraction.
It's a minority of the overall population,
but I don't think it's a small minority.
And I think this nuance tends to get washed out sometimes
when people say things like, oh, no one wants nuance
or no one wants to hear uncertainty.
No, actually a lot of people do.
Not like 90%, but like a lot of people.
And in my experience, the subsection of the population that appreciates the nuance
and the challenge to their views is kind of a cooler subset
than most of the other subsets.
I agree, I agree completely.
Yeah, this is something that I was on my podcast a few months ago,
I was talking to Vitalik Buterin,
a co-founder of Ethereum.
And he's someone I reached out to.
I've been following for years and always admired his kind of intellectual
honesty and his commitment to talking about things that he now thinks he was wrong
about with respect to Ethereum, or acknowledging the downsides of the
plan, the direction that he's chosen, and saying, you know, this is what I think
is best overall, but I agree it has some downsides.
Or acknowledging his uncertainty about different aspects
of the future of the currency.
And so I asked him, has that had downsides for you,
like that intellectual honesty?
And he said, well, in some ways, yes.
Like sometimes journalists will take a quote out of context
to make it sound like I,
you know, have no faith in Ethereum or something like that.
But it's a conscious strategy I've chosen, he said, because the kind of people who like
that intellectual honesty are the kind of people who I think are going to make Ethereum
a stronger community, the kind of people I like and whose respect I care about.
And I think they just, yeah, make for a stronger community building what I want to build.
And so that's kind of how I feel too, that's the kind of people I want to optimize for,
even if not a majority.
If it's good enough for the guy that founded Ethereum, I think it's good enough for the rest
of us.
Julia Galef, ladies and gentlemen. The Scout Mindset will be linked in the show notes
below, Why Some People See Things Clearly and Others Don't.
If people want to check out more of your stuff, where should they go?
Well, my website is juliagalef.com, and you can also follow me on Twitter.
I'm just juliagalef.
I'm the only juliagalef in the world, so I'm not hard to find.
You can join the, you know, I like to talk about these ideas on Twitter and especially
kind of hash out some of the ideas around the edges that I'm
less confident about or you know think I might be wrong about and so you know come join the fun.
I love it. Thank you, Julia. Thank you, Chris. This was so much fun.