Hidden Brain - Facts Aren't Enough
Episode Date: July 22, 2019
Sometimes when we believe something, we resist data that can change our minds. This week, we look at how we rely on the people we trust to shape what we believe, and why emotions can be more powerful than facts. This episode features new reporting and favorite conversations with neuroscientist Tali Sharot and philosopher of science Cailin O'Connor.
Transcript
This is Hidden Brain, I'm Shankar Vedantam.
In 2012 Miranda Dinda had a lot on her plate.
She was 18.
All her friends were getting ready for college and busy being teenagers.
But Miranda was on a different path.
She was pregnant.
So I was very nervous and I felt really uneasy but excited at the same time.
To calm her anxieties, Miranda did some research on childbirth.
I decided I wanted a home birth.
For that, she would need a midwife.
She searched for months for someone who could come to her apartment in rural Pennsylvania.
Eventually, she found a woman who seemed perfect.
We'll call her M, her first initial, to protect her privacy.
From the very first moment they met, Miranda knew everything was going to be fine.
She said she had been a home birth midwife for over a decade. She was no-nonsense, the mother of eight.
She was very open and honest and friendly, so she definitely seemed like someone who I could trust.
About an hour into that first meeting,
M brought up a question that struck Miranda as strange.
She said, have you ever considered not vaccinating?
Miranda hadn't.
Vaccines had never crossed her mind.
So I asked her, I was very confused. I was like, what do you mean? Why would I consider that?
M explained that years ago, something bad had happened after she vaccinated her first child.
She went on to describe a progression of events that led some parents to a powerful but faulty conclusion.
M told Miranda that right after her son got his shots,
his development regressed.
One minute he was fine, the next he was autistic.
She said the light had left his eyes.
So Em decided not to vaccinate her other children.
And she very much implored me to do the same and to look into it.
So I did.
Miranda started on Google.
It led her to Facebook groups.
It's very easy to find them.
So yeah, even if you just Google support groups for parents who don't vaccinate, you will find
a lot.
The moms in these groups echoed what M had told her.
And they welcomed me with open arms and immediately were just practically bombarding me with information,
telling me, your midwife's right, this is why you shouldn't vaccinate.
This is why I don't vaccinate.
This is what happened to my child who I did vaccinate versus my child who I didn't vaccinate, things like that.
Everyone was caring and attentive. They didn't just talk about vaccines. They talked about regular mom stuff.
Things that Miranda found hard to talk about with anyone else.
Diapers and birth plans and hospitals and midwives and breast pumps and stuff like that.
Miranda trusted them.
To me, it seemed so clear.
It seemed like I had just found this secret information
that only some people come across.
And I thought, why would I not use this information?
Why would I not use this to my benefit,
to my child's benefit?
So it did not take me very long at all
before I was solidly saying,
I will not vaccinate my child when she's born.
["Ramona come closer, shut softly, you watery eyes."
She named her daughter Ramona
after the song by Bob Dylan.
["I'll be the sadness will pass as your senses will rise.
Ramona, as a newborn, she was very active.
She was very bright.
She was very happy.
She was a great baby, honestly.
She was a wonderful baby.
When the doctor said it was time to vaccinate Ramona, Miranda was ready.
She had a script she had been practicing in her head for months.
And I said, no thank you, I have decided that I do not want to vaccinate.
Please respect my opinions, thank you very much.
For the next two years, Miranda continued to say no to vaccines.
Occasionally, when she encountered information that conflicted with her decision, a pamphlet at the doctor's office or a website, she
dismissed it. I just very quickly went, that's not true, I don't agree with that, and I
moved on. At some point, though, her conviction started to waver. Those doting
moms on Facebook, they had some weird beliefs. People denying that AIDS exists.
People saying that the reason there are gay people
is vaccines, on and on and on with really crazy conspiracy
theories.
And then it hit her.
If she didn't believe those ideas,
why was she trusting them on vaccines?
And I stepped back.
I stopped going to the Facebook group as much.
And I decided I needed to look at this issue
from a purely logical perspective.
No emotion in it.
No, oh my God, what if something happens to my baby?
And I completely readdressed the issue all over again
pretty much from the start.
Miranda started seeking out perspectives that the moms had urged her to avoid.
Information from the Centers for Disease Control, and medical journals.
I started reading all kinds of things, basically opening my mind to more than just vaccines
are bad to the other side of the coin.
It didn't take her long to change her mind.
She got Ramona vaccinated.
Looking back, Miranda can't believe how easy it was
to embrace beliefs that were false.
And what I would say to someone who's
about to become a new mom, especially if they're a young mom,
is don't try to confirm your own fears online. It is so, so easy to Google what if this happens
and find something that's probably not true
that confirms your fear, that confirms your anxieties.
Don't do that.
Miranda's story tells us a lot about the psychology of false beliefs, how they spread and how they
persist even in the face of conflicting information.
Today we look at how we rely on people we trust to shape what we believe and why emotion
can be more powerful than facts.
We start with neuroscientist Tali Sharot, a professor at University College London.
She is the author of The Influential Mind: What the Brain Reveals About Our Power to Change Others.
Tali is a mom and so she understands on a personal level why Miranda was so worried
about the safety of her child.
A few years ago, when her baby was just a few weeks old, Tali was listening to a Republican
presidential debate.
You take this little beautiful baby, and you pump...
I mean, it looks just like it's meant for a horse, not for a child.
Candidate Donald Trump had been asked a question
about the safety of childhood vaccines.
And we've had so many instances,
people that worked for me just the other day.
Two years old, two and a half years old, a child,
a beautiful child, went to have the vaccine
and came back and a week later got a tremendous fever,
got very, very sick, now is autistic.
So when I was listening to Trump at that debate,
it really tapped into this fear that I had,
and the anxiety that I already had.
And when he talked about this huge syringe,
a horse-sized syringe that was going to go into the baby
in my mind, I could imagine this syringe
inserted into my small little child
and all the bad things
that could happen.
And this was a very irrational reaction on my end because I know that there's not an
actual link between autism and vaccines.
But it's not enough to have the data.
Ben Carson, Dr. Ben Carson, was on the other end.
Well, let me put it this way. There have been numerous studies and they have not demonstrated
that there was any correlation between vaccinations and autism.
But that wasn't enough because the data is not enough and even if the data is based on
very good science, it has to be communicated in a way that will really tap into people's needs, their desires.
If people are afraid, we should address that.
I'm curious when you sort of contrasted, you know, the weight of the evidence on the one hand
and this very powerful image of the horse syringe and your seven-week-old baby on the other hand,
how did you talk yourself into trusting the data
over that emotional image?
What really helped is that I understood what was happening to me.
Because this is what I study, I knew what my reaction was, I knew where it was
coming from, I knew how it was going to affect me, and I think that
awareness helped me to put it aside
and say, okay, I know that I am anxious
for the wrong reasons.
And this is the action that I should take.
It's a little bit like when you're on a plane
and there's turbulence and you get scared.
But telling yourself, I know that turbulence
is not actually anything that's dangerous
I know the statistics on safety and planes and so on, it helps. It helps people reduce their anxiety.
But facts don't always relieve our anxieties. Sometimes they harden our views.
Some time ago, Tali wanted to test how people update their beliefs when confronted with
new information.
So she presented statements to two kinds of people, those who believe that climate change
was real and those who were deniers.
She found that for both groups, when the statement confirmed what they already thought, this
strengthened their beliefs.
But when it challenged their views, they ignored it.
Tali says it's because of a powerful phenomenon
known as confirmation bias.
Confirmation bias is our tendency to take in any kind of data
that confirms our prior convictions
and to disregard data that does not conform
to what we already believe.
And when we see data that doesn't conform to what we believe, what we do is we try to
distance ourselves from it.
We say, well, that data is not credible, right?
It's not good evidence for what it's saying.
So we're trying to reframe it to discredit it, right?
So I give an example in my book where if someone comes in and says,
I just saw pink elephants flying in the sky,
and I have a very strong belief, obviously,
that no pink elephants fly in the sky.
I would then think that they're either delusional or they're lying.
And there's good reason for me to believe that.
So it's actually the correct approach to assess data
in light of what you believe.
There's four factors that determine whether we're going to change our beliefs.
Our old belief, our confidence in that old belief, the new piece of data, and our confidence
in that piece of data.
The further away the piece of data is from what you already believe, the less likely it
is to change your belief, and on average, as you go about the world, that is not a bad
approach.
However, it also means that it's really hard to change false beliefs.
So if someone holds a belief very strongly,
but it is a false belief, it's very hard to change it with data.
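The four factors Sharot lists map naturally onto Bayesian updating. As a minimal sketch (my illustration, not from the episode; the numbers are arbitrary), this is why a strongly held prior barely moves when the conflicting evidence is judged untrustworthy:

```python
# Minimal sketch (not from the episode): Bayes' rule with a prior belief and
# a judgment about how credible the new evidence is under each hypothesis.

def update_belief(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of a claim after one piece of evidence.
    `prior` is how strongly the claim was believed beforehand; the two
    likelihoods encode how expected the evidence is if the claim is
    true versus false (i.e., confidence in the data)."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# A weakly held belief moves a lot when the evidence is judged credible...
print(update_belief(prior=0.5, likelihood_if_true=0.9, likelihood_if_false=0.1))   # ~0.90

# ...but a conviction held at 0.99 barely budges when the conflicting data
# is discounted, echoing Sharot's point about strongly held false beliefs.
print(update_belief(prior=0.99, likelihood_if_true=0.2, likelihood_if_false=0.6))  # ~0.97
```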
So if data and facts don't work, what does?
How do you get people to buy the truth?
Well, you could try scaring them.
The vast majority of Americans today do not feel safe.
On Sunday, Americans woke up to a nightmare that's become mind-numbingly familiar.
This could be the great trojan horse of all time.
Politicians use fear to get us to vote.
TV programs use fear to get us to keep watching.
Public health officials use fear to get us to quit smoking.
I asked Tally whether fear might be an effective way to persuade people to change their minds
and maybe even their behavior.
Fear works in two situations. It works when people are already stressed out,
and it also works when what you're trying to do is get someone not to do something,
an inaction. For example, if you try to get someone not to vaccinate their kids, fear may work.
If there's, you know, an apple that looks bad, I don't eat it. Fear is actually not such a good motivator for inducing action,
while hope is a better motivator on average for motivating action.
You talk about one study in your book where a hospital managed to get its workers
to practice hand hygiene to get staff members to wash their hands regularly.
But it turned out the most effective thing wasn't frightening the staff
about the risks of transmitting infections. It was something else.
So in a hospital on the East Coast, a camera was installed to see how often medical staff
actually sanitized their hands before and after entering a patient's room.
And the medical staff knew that the camera was installed,
and yet only one in 10 medical staff
sanitize their hands before and after entering a patient's room.
But then an intervention was introduced,
an electronic board that was put above each door.
And it gave the medical staff in real time positive feedback.
It showed them the percentage of medical staff that washed their hands in the current shift
and the weekly rate as well.
So anytime a medical staff will wash their hands, the numbers will immediately go up and
there will be a positive feedback saying, you know, good job.
And that affected the likelihood of people washing their hands significantly.
It went up from 10% to 90% and it stayed there. Instead of using the normal approach,
instead of saying, you know, you have to wash your hands because otherwise you'll
spread the disease, basically instead of warning them of all the bad things that can happen in the future, which actually results in inaction, they gave them positive feedback.
I wrapped up my conversation with Tali by exploring another idea about how we might convince
others to listen to views that conflict with their own.
It had to do with a study of Princeton students who got their brains scanned while they listened
to stirring emotional speeches.
What they found was the brains of the different people listening to those speeches started synchronizing.
So if we all listen, for example, to Kennedy's famous moon speech, our brains would likely
look very much alike. Those who came before us made certain that this country rode the first waves
of the industrial revolution.
And this is not only in regions that are important for language and hearing,
it's also in regions that are important for emotion,
in regions that are important for what's known as theory of mind,
our ability to think about what other people are thinking, and in regions that are important for associations.
And you try to think, well, what's common to all these influential speeches that can cause so many people's activity to synchronize?
And one of the most important things is emotion.
If the storyteller or the person giving the speech is able to elicit emotion in the other person,
then he's actually having somewhat of a control on that person's state of mind.
And this generation does not intend to founder in the backwash of the coming age of space.
We mean to be a part of it, we mean to lead it.
So think about it like this.
If you're very sad and I'm telling you a joke, well, you're sad. So you're not going to perceive the joke as I perceive it when I'm happy.
But if I'm able to first make you happy and then tell you the joke, well,
then you perceive it more from my point of view.
So by eliciting emotion, what you're able to do is change the perception of everything
that comes after, to perceive information as the person who's giving the
speech wants you to perceive it. So you can see how this coupling, this idea that
the audience's mind and the speaker's mind are in some ways coupled together.
You can see how this could potentially be used to spread good information.
You have a great teacher in high school and you're captivated by the teacher and you're
being pulled along by the story the teacher is telling you, maybe about history or maybe
about geography.
But you can also see equally how the same thing can work in the opposite direction.
That you could be listening to a demagogue or you could be listening to somebody who has
a very sort of captivating rhetorical style, and this person could also lead you
astray in just the same way that the great teacher can lead you to knowledge and positive
things.
Absolutely.
All the different factors that affect whether we will be influenced by one person or ignore
another person are the same whether the person has good intentions or bad intentions, right?
The factors that affect whether you're influential can be: can you elicit an emotion in the other
person?
Can you tell a story?
Are you taking into account the state of mind of the person
that's in front of you? Are you giving them data that conforms to their
preconceived notions? All those factors that make one speech more influential than
the other or more likely to create an impact can be used for good and can be
used for bad. Tali Sharot, I want to thank you for joining me on Hidden Brain today.
Thank you so much for having me.
We've been exploring why we cling to beliefs. After the break, we look at how we spread them,
from person to person to person. We'll talk to a mathematician about the power of social networks
to circulate ideas.
I'm Shankar Vedantam, and you're listening to Hidden Brain.
This is Hidden Brain, I'm Shankar Vedantam.
During the Middle Ages, word spread to Europe about a peculiar plant found in Asia.
This plant had a long stalk with heavy pods attached.
When you cut those pods open, inside you would find a
tiny little lamb. Complete with flesh and wool like a live animal lamb. This
creature, half plant, half animal, came to be known as the Vegetable Lamb of
Tartary. Various travel writers wrote that they had either heard about this or that they had eaten
one of these lambs, and many of them said they had seen the kind of downy wool from the
lamb.
When these narratives made their way to Europe, people felt they had a view of a different
world.
Of course, no one in Europe had ever seen the vegetable lamb of
Tartary because there was no such thing. But for centuries, people kept talking
about this fantastical creature as if it were real. It even came up in scholarly
works right next to pictures of oak trees and rabbits. If people hadn't been
telling each other about these things, nobody would believe that
there were vegetable lambs because nobody had ever seen them, right?
And this is by no means a unique happening at that time.
Today, of course, we would never fall for vegetable lambs.
We live in an era of science, of evidence-based reasoning, of calm, cool analysis.
But maybe there are vegetable lambs that persist even today, even among highly trained scientists, physicians, and researchers. Maybe there are spectacularly bad ideas
that we haven't yet recognized as spectacularly bad.
Cailin O'Connor is a philosopher and mathematician at the University of California,
Irvine. She studies how information, both good and bad, can pass from person to person. She is co-author
with James Weatherall of the book The Misinformation Age: How False Beliefs
Spread. Cailin, welcome to Hidden Brain. Oh, thank you for having me. So one of the
fundamental premises in your book is that human beings are
fundamental premises in your book is that human beings are
extremely dependent on the opinions and knowledge of other people. And this is what creates channels
for fake news to flourish and spread. Let's talk about this idea. Can you give me some sense of our dependence on what you call the testimony of others?
So one reason we wrote this book is that we noticed that a lot of people thinking about
fake news and false belief were thinking about problems with individual psychology.
So the way we have biases in processing information, the fact that we're bad at probability.
But if you think about the things you believe, almost every single belief you have has come from another person.
And that's just where we get our beliefs because we're social animals.
And that's really wonderful for us. That's why we have culture and technology.
You know, that's how we went to the moon.
But if you imagine this social spread of beliefs
as opening a door when you open a door for true beliefs
to spread from person to person,
you also open the door for false beliefs
to spread from person to person.
So it's this kind of double-sided coin.
And what's interesting, of course, is that if you close the door,
you close the door to both,
and if you open the door, you open the door to both.
That's right.
So if you want to be social learners who can do the kinds of cultural things we can do,
it has to be the case that you also have to have this channel by which you can spread
falsehood and misinformation, too.
So as I was reading the book, I was reflecting on the things that I know, or the things that I think I know,
and I couldn't come up with a good answer for how I actually know that it's the earth that revolves around the sun and not the other way around.
Yeah, that's right. 99% of the things you believe probably you have no direct evidence of yourself.
You have to trust other people to find those things out, get the evidence, and tell it to you.
And so one thing that we talk a lot about in the book is the fact that we all have to
ground our beliefs in social trust.
So we have to decide what sources and what people we trust, and therefore what beliefs
we're going to take up.
Because there's just this problem where we cannot go, verify everything that we learn directly,
we have to trust someone else to do that for us.
We trust the historian who teaches us about Christopher Columbus.
We trust the images from NASA showing how our solar system is organized.
Now, we say we know Columbus was Italian,
and we know the Earth revolves around the sun,
but really what we mean to say is we trust the teacher and we trust NASA to tell us what is true.
And the social trust and ability to spread beliefs, I mean it's remarkable what it's let humans do,
you know, no other animal has this ability
to sort of transfer ideas and knowledge
dependably from person to person over generation
after generation to accumulate that knowledge.
But you do just see sometimes very funny examples
of false beliefs being spread in the same way.
As a philosopher of science,
Cailin studies how scientists communicate and share information.
If we rely on scientists to tell us what to believe,
who do they rely on?
Turns out, other scientists.
Now showing that this is the case isn't easy.
The process by which scientists change their minds
on questions such as the spread of disease
or the movement of objects through space is very complex.
Studying this complex process can be mind-boggling.
Say, for instance, Dr. A talks
to Dr. B one day about her research.
Hello.
It also turns out that Dr. B is collaborating with Dr. C, who recently met Dr. D at a conference.
Now, Dr. D frequently reads Dr. A's papers, but doesn't know about Dr. C's research.
A couple of years later, Dr. E reads what Dr. B has written about what Dr. A said in an article
that Dr. C cited before Dr. F had even published her results.
Empirically, it's hard to study scientists because things like theory change will happen over the course of 10 or 20 years and involve thousands and thousands of interactions between different
scientists. You know, how would you ever study that?
Because Cailin can't follow all these interactions,
she recreates them in a computer simulation.
You'd want to think of it as a really kind of simplified representation of what's happening in the real world.
She creates groups of fictional scientists and she gives them a series of rules like who
they can talk to and who they trust.
These simulated scientists collect data and discuss their simulated research.
Cailin sits back and watches what happens.
So one thing we find sometimes in these models is that one agent or scientist will get data supporting the false belief.
They'll share it with the entire community of scientists, and then everyone will come
to all believe the false thing at once and sort of ignore a better theory.
And part of what happens there is this social spread of knowledge and belief, causing everyone
to turn away from a good theory.
So if you have almost too much social influence
within a community, that can be really bad
because everyone can stop gathering data
since the entire community is exposed
to the same spurious results.
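To make the kind of model Cailin describes concrete, here is a toy simulation in that spirit; the structure and parameters below are my assumptions for illustration, not her exact setup.

```python
# Toy sketch (assumptions mine): simulated scientists choose between an
# established theory with a known success rate and a slightly better new
# theory whose success rate they must learn from data that everyone shares.
import random

N_AGENTS = 10          # scientists in the community
TRIALS_PER_ROUND = 20  # experiments each scientist runs per round
P_OLD = 0.50           # known success rate of the established theory
P_NEW = 0.51           # true success rate of the better new theory

# Each agent starts with a random prior about the new theory, tracked as
# Beta-style counts of (successes, failures).
beliefs = [[random.uniform(1, 4), random.uniform(1, 4)] for _ in range(N_AGENTS)]

for _ in range(500):
    shared_results = []
    for successes, failures in beliefs:
        # Only agents who currently favor the new theory bother to test it.
        if successes / (successes + failures) > P_OLD:
            wins = sum(random.random() < P_NEW for _ in range(TRIALS_PER_ROUND))
            shared_results.append((wins, TRIALS_PER_ROUND - wins))
    # Fully connected network: everyone updates on everyone else's data.
    for belief in beliefs:
        for wins, losses in shared_results:
            belief[0] += wins
            belief[1] += losses

believers = sum(s / (s + f) > P_OLD for s, f in beliefs)
# On some runs the whole community settles on the better new theory; on
# others an early unlucky batch of shared results pushes every belief below
# 0.5, experimentation stops, and the group locks in the worse theory.
print(f"{believers}/{N_AGENTS} agents end up favoring the new theory")
```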
If I hear you correctly, what you're saying
is that psychological factors can have an effect,
but you can have the spread of bad information
even in the absence of biases or stupidity.
Yeah, so one way that the models we look at are really useful is that you can kind of
pare away things that are happening in the real world and see, well, suppose we didn't have any psychological biases,
suppose we were perfectly rational, would we always come to the right answer in science
and in our day-to-day lives? And see that the answer is no.
When we come back: more stories from scientific communities that show how good information sometimes fails to spread and how bad information can metastasize. I'm Shankar Vedantam, and this is NPR.
This is Hidden Brain, I'm Shankar Vedantam. Mathematician and philosopher Cailin O'Connor studies how information spreads through social networks.
People who know and trust one another efficiently pass information back and forth and learn from one another.
Unfortunately, the same rules of social trust can sometimes be a roadblock for the truth.
Mary Wortley Montague learned this lesson hundreds of years ago. She was an English aristocrat who found herself living for a while
in what is modern day Turkey.
So Mary Montague seems to have been really enchanted by Turkish culture.
You know, she was coming from England and aristocratic culture there.
In Turkey, she discovered these beautiful shopping centers,
bath houses, she seems to have been enchanted by bath houses
where there would be a lot of women sort of lounging naked,
going in the hot water, drinking hot drinks together.
Another thing that struck Mary about Turkish women, they used an innovative technique to
limit the spread of smallpox.
It was called variolation.
What this involved, I mean, it's a bit like vaccination now.
You would scratch maybe the arm of a patient and take pus from a smallpox pustule and put that
pus into the scratch.
So what would happen after you did that is that the patient would get a very mild smallpox
infection.
Some small percentage of patients would die, but many, many fewer than who would die of
an actual smallpox infection.
And after they had that more mild infection, they would actually be immune to smallpox.
So this was practiced commonly in Turkey, basically unheard of in England at the time.
Mary Montague had herself had smallpox and survived when she was younger.
She had lost a brother to smallpox. And so when she encountered
variolation in Turkey, she decided, well, you know, why don't we do this in England? She had her own
son variolated, and she decided she was going to try to spread this practice in her native country.
So when she returns to Britain, in some ways, Mary Montague here functions like one of your
agents in your computer models, because you have one cluster over here in Turkey, and one cluster
over here in Britain. And essentially, you have an agent walking over from Turkey to Britain.
And Mary Montague says, here's this wonderful idea. We can limit the spread of smallpox in Britain.
Britain, in fact, at the time was actually facing a smallpox crisis.
How are her ideas received?
So, her ideas were not received very well when she first came back.
One thing we talk a lot about in the book is that almost everyone has what you might call
a conformist bias.
We don't like to publicly state things that are different
from the people in our social networks, we don't like to have beliefs that are different from
the people around us, it's somehow very socially uncomfortable to do that, and we don't like our
actions to not conform with the people who we know and love. So when she got back to England,
you know, it was already the case that all
these physicians in England didn't believe in variolation. They thought this was a crazy
idea. And none of them were going to stand out from the pack of physicians and say, yeah,
I'm the person who's going to try this or going to believe that this practice works because
they were all busy conforming with each other.
And of course, these ideas were coming from another country,
a country with very different cultural practices
that seemed in some ways very foreign, the idea
and the country itself seemed very foreign.
That's right. So it's not just that it's a weird new idea
that none of them believe in their kind of in-group.
It's also that it's coming from Turkey and furthermore, it's coming from women in Turkey,
so it was a practice mostly done by women, and a woman is bringing it to England as well.
So they also don't really trust her as a woman and someone who's not a physician.
So social trust is a really important aspect in understanding how people form
beliefs because we can't go out and figure out ourselves whether the things
people tell us are true. Usually we just always have to decide who to trust and
people have little shortcuts in how they do this.
They tend to trust those who are more like them.
They also tend to trust those who share beliefs and values and practices with them.
So for example, if you are a physician, you might tend to trust a physician.
If you believe in homeopathy, you might tend to trust someone who believes in homeopathy.
We all use these kinds of tricks.
So what we saw in the variolation case with Mary Montague, the physicians aren't going to
trust this woman who doesn't share their beliefs and practices, who isn't much like them.
Now you could argue that the physicians who rejected Mary Montague's ideas were not behaving like real scientists.
They weren't being dispassionate, they weren't being objective.
They were bringing psychological biases into the picture.
Sexism, xenophobia, tribalism.
In the real world, misinformation spreads because of some
combination of network effects and psychological and cognitive biases. You see the same thing
in the case of the Hungarian physician Ignaz Semmelweis. He was an insider, a man, and a doctor.
He even had the assistance of scientific evidence to support his claims.
But it turned out even these were not enough to overcome the barriers that confront the truth.
Ignaz Semmelweis was a physician living in Vienna.
He was put in charge of this clinic, the first obstetrical clinic in Vienna.
Next door was the second obstetrical clinic of Vienna.
He was in charge of training new doctors in obstetrics,
and at the second clinic, they were training midwives.
And shortly after he took over,
he realized that something really terrible was going on
because in his clinic, 10% of the women were dying,
mostly of childbed fever. While the midwives
next door, who presumably, you know, they would have thought had less
expertise, only three to four percent of their patients were dying. So
Semmelweis was obviously really worried about this. He had patients who would be
begging on their knees to be transferred to the other clinic.
He had this kind of breakthrough moment when a colleague of his was conducting an autopsy and accidentally cut himself. And then shortly thereafter, he died of something that looked a lot
like childbed fever. Semmelweis realized, well, I've got all these physicians who are conducting
autopsies on cadavers and
then immediately going and delivering babies.
And he thought, well, maybe there's something transferred on their hands.
And he called these cadaverous particles.
Of course, now we know that that is bacteria, but they didn't have a theory of bacteria
at the time.
So he started requiring the physicians to wash their hands in a chlorinated solution
and the death rate in his clinic dropped way down.
And of course, the way we think about science, we say, all right, someone's discovered
something wonderful. Everyone must have instantly adopted this brilliant new idea.
You would think, right? And he has this wonderful evidence, right? It was 10%. He introduced
the practice, goes down to 3%, but that's not what happened.
So he published his ideas,
and the other gentleman physicians did not take them up.
In fact, they found them kind of offensive,
they thought this is, you know,
he's writing that we have dirty hands,
we have unclean hands, but in fact we're gentleman.
They also thought it was just really far out of the range of theories that could possibly
be true.
So they didn't believe him despite the really good evidence and the deep importance, you
know, people's lives were really at stake.
And it took decades for his handwashing practice to actually spread.
In fact, I understand that Semmelweis himself eventually suffered a nervous breakdown.
How did his own story end?
So the way the story goes, though this is a little hard to verify, is that he was so frustrated
that people weren't adopting his handwashing practice that he had a nervous
breakdown as a result.
He was put into a V&E's mental hospital where he was beaten by guards and died of a blood
poisoning a few weeks later.
We've seen how being an outsider or breaking with tradition can be barriers to the spread
of good scientific information.
But you could argue that these examples were from a long gone era of gentlemen physicians
and amateur scientists.
But even in the modern day of science where researchers demand hard evidence to be convinced,
it turns out that false, inaccurate, and incomplete information can still take hold.
In 1954, E.D. Palmer published a paper that changed how doctors thought about stomach ulcers.
So what he did was looked at a lot of stomachs. I believe somewhere in the range of a thousand.
And he found that there were no bacteria whatsoever
in the stomachs that he investigated.
A lot of people at that time had been arguing
over whether stomach ulcers were caused by stomach acid
or some kind of bacteria.
This was taken as really decisive evidence showing that, okay, well, it can't be bacteria
because everyone thought Palmer's study showed there are no bacteria in stomachs, so it
absolutely must be stomach acid.
And of course, in this case, Palmer was not trying to fabricate his data or make up data.
He was sincerely arriving at what he thought was a very good conclusion. That's right. And it seems that it just was a problem with his methodology.
Of course, there are bacteria in our stomachs. He just didn't see them because of the way he was
doing his particular experiment. This was not a fabrication at all. One of the things that's
interesting about this episode involving Palmer and the stomach
ulcers is that as individuals essentially came around to believe what Palmer was telling them,
there was a consensus that started to grow.
And as each new person added to the consensus, it became a little bit stronger, which made
it even harder to challenge.
Yeah, so although they had been arguing for decades about whether ulcers were caused
by acid or by bacteria, at this point people started to share Palmer's results, pretty much
everybody saw them, and this consensus was arrived at: okay, it's acid, and everyone who
had been studying the possibility that bacteria caused stomach ulcers stopped studying that.
Well, not everyone.
Fast forward a few decades to the early 1980s.
In Australia, a physician named Barry Marshall
grew skeptical of the acid theory.
His experiment suggested that ulcers were caused
by bacteria, not stomach acid.
But this theory was met with stony-faced resistance. He couldn't
even get his articles published. Scientists sniped at him behind his back, even though
as it turns out, his data was far better than the stomach studies by E.D. Palmer.
Barry Marshall was frustrated that no one seemed willing to listen to his findings.
So he figured out a way to get everyone's attention.
The only person in the world at that time who could make an informed consent was me.
So I had to be in my own experiment.
And so he did this demonstration.
He took bacteria from the stomach of one of his sick patients.
So we cultured a patient with gastritis.
He stirred it into a broth, and then...
I drank the bacteria,
ten to the ninth colony-forming units.
He gave himself stomach ulcers,
and then he later cured them with antibiotics,
in this publicity stunt almost,
to convince people that in fact ulcers were caused by bacteria.
Eventually Barry Marshall and Robin Warren went on to win the Nobel Prize in Medicine for
their discoveries. Mary Montague, the woman who faced resistance in bringing
variolation to England, never won a prestigious prize, but she also found a way to spread
the truth.
Like Barry Marshall, she found it had more to do with her sales pitch than with the evidence.
So in the end, she did something really smart, which took advantage of the ways that we
use our social connections to ground our beliefs and our trust.
So she ended up convincing Princess Caroline of Ansbach to variolate her own two small daughters
and to do it in this kind of public way.
So she got one of the most influential people in the entire country to engage in this practice.
So that did two things.
So number one, it made clear, you know, because she did it in this kind of public way and her daughters
were fine, it gave people evidence that this is in fact a safe practice and it's a good
idea.
But it also made clear to people that if they want to conform to the norm, if they want
to share a practice with this really influential person, then they should do the same thing.
And after Princess Caroline did this, variolation spread much more quickly, especially among
people who had a personal connection to either Mary Montague or to the princess.
What's fascinating here is that this wasn't in some way
a rational way to solve the problem.
It wasn't saying, look, there's really convincing evidence here.
You're almost using a technique that's pretty close to propaganda.
It is a propaganda technique, absolutely.
So, propagandists tend to be very savvy
about the ways that people use their social connections
to ground trust and knowledge and choose their beliefs,
and they take advantage of those.
In this case, it was using that social trust for good,
but in many cases, people use it for bad.
And if you look at the history of industrial propaganda
in the US, or if you look at the way Russia conducted
propaganda before the last election,
people have taken advantage of these kinds of social ties
and beliefs to try to convince us of whatever it is
they're selling.
One last idea on how you counter bad information.
Semmelweis, as we saw, did not succeed in persuading other doctors
during his lifetime to wash their hands thoroughly
before they were treating patients.
But of course, now that idea is widely adopted.
What does that tell us, Cailin, about how science in some ways
might be self-correcting?
It might not be self-correcting at the pace that we want, but over time, it appears that
good ideas do beat out the bad ones.
Yeah, so we have thousands and thousands of examples in science of exactly that happening
of good ideas beating out the bad ones.
Of course, now we can look back and say,
oh, well, that good idea won out,
and that good idea won out.
We can't actually look at right now
and know which of the ideas we believe now
are correct ones or good ones.
So there are actually philosophers of science
like Larry Laudan and Kyle Stanford
who argue for something called the pessimistic meta-induction, which is something
like this, because scientific theories in the past have always eventually been overturned,
we ought to think that our theories now will probably be overturned as well.
But there's actually an optimistic side to this, which is that if you look at many theories in
the past, ones that were overturned, often the reason people believe them is that even if they were wrong,
they were a good guide to action.
Even the theory of stomach acid causing ulcers, well if you treat stomach acid, it actually
does help with ulcers.
It wasn't a completely unsuccessful theory.
It's just that it wasn't totally right,
and it wasn't as successful as the bacteria theory of ulcers
because antibiotics do better.
One of the interesting implications about all of this
is how we should think about the truth.
And in some ways, I think the picture that I'm getting
from you is a picture that says the truth is not a binary question. Is it true? Is it false? I mean,
some questions, of course, perhaps can be reduced to, is it true? Is it false? But really,
scientists in the business are producing probability estimates for various claims. And I think what
you're saying is that for us to actually be on the right side of the
misinformation/information divide, it's helpful for us to think in probabilistic terms
rather than in binary terms.
Yeah, that's absolutely right.
So we do think it's really important to think about belief in terms of degrees and evidence and believing something strongly enough.
And part of the reason is that there has been the strategy where people who are
trying to subvert our beliefs will say, but we're not sure about something.
They'll say, evolution's just a theory or there's some doubt about global
warming.
But ultimately not being sure about something is not what matters. We're never really 100% sure about anything.
And if you think about it, think about any belief you could have,
you know, that the sun will come up tomorrow.
Well, it always has in the past, but that doesn't mean we're 100% sure it will
tomorrow. There's a really good chance it will tomorrow.
We shouldn't be looking for certainty. Instead, we need to be saying to ourselves, when do
we have enough evidence to make good decisions?
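One way to make that last question concrete (my framing, not from the episode): treat acting on a belief as an expected-value question at some degree of confidence, rather than a demand for certainty.

```python
# Illustrative sketch (my framing, hypothetical numbers): you act when the
# expected benefit of acting outweighs the expected cost of being wrong,
# not when you reach 100% certainty.

def worth_acting(p_claim_true, benefit_if_true, cost_if_false):
    """True when acting on the claim is the better bet in expectation."""
    return p_claim_true * benefit_if_true > (1 - p_claim_true) * cost_if_false

# 90% confidence is plenty when the payoff of acting dwarfs the downside...
print(worth_acting(p_claim_true=0.90, benefit_if_true=10.0, cost_if_false=1.0))  # True

# ...while 60% confidence may not justify acting when a mistake is costly.
print(worth_acting(p_claim_true=0.60, benefit_if_true=1.0, cost_if_false=10.0))  # False
```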
Cailin O'Connor is a philosopher and mathematician at the University of California, Irvine.
She studies how social networks can spread both good information and bad.
Along with James Weatherall, she is co-author of the book The Misinformation Age: How
False Beliefs Spread.
Cailin, thank you for joining me today on Hidden Brain.
Oh, thank you so much for having me.
This episode was produced by Maggie Penman, Camila Vargas Restrepo, and Laura Kwerel.
Our team includes Jenny Schmidt, Parth Shah, Rhaina Cohen, and Thomas Lu.
Our supervising producer is Tara Boyle.
For more Hidden Brain, please follow the show on Facebook and Twitter.
I'm Shankar Vedantam.
I'll see you next week.