How to Talk to People - How to Trust Your Brain Online
Episode Date: May 27, 2024

This episode explores the web's effects on our brains and how narrative, repetition, and even a focus on replaying memories can muddy our ability to separate fact from fiction. How do we come to believe the things we do? Why do conspiracy theories flourish? And how can we train our brains to recognize misinformation online? Lisa Fazio, an associate psychology professor at Vanderbilt University, explains how people process information and disinformation, and how to debunk and pre-bunk in ways that can help discern the real from the fake. Music by Forever Sunset ("Spring Dance"), baegel ("Cyber Wham"), Etienne Roussel ("Twilight"), Dip Diet ("Sidelined"), Ben Elson ("Darkwave"), and Rob Smierciak ("Whistle Jazz"). Write to us at howtopodcast@theatlantic.com. Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
Your teen requested a ride, but this time not from you.
It's through their Uber teen account.
It's an Uber account that allows your teen to request a ride under your supervision
with live trip tracking and highly rated drivers.
Add your teen to your Uber account today.
At Bet 365, we don't do ordinary.
We believe that every sport should be epic.
Every goal, every game, every point, every play. From the moments that are remembered forever to the ones you've already forgotten.
Whether it's a game-winning goal in the final seconds of overtime or a shot on goal in the first period.
So whatever the sport, whatever the moment, it's never ordinary at Bet 365. Must be 19 or older, Ontario only.
Please play responsibly.
If you or someone you know has concerns about gambling, visit connexontario.ca.
When I was growing up, I always believed that bluebonnets, which are the Texas state flower
where I live, that they're illegal to pick in Texas. And this is something that I feel like so many people very firmly believe.
You hear it all the time.
You cannot pick the state flower, the bluebonnet, and come to find out when I was an
adult that there actually is no state law to this effect.
I was 100% convinced of this as a fact. And I bet if you polled the average Texan, there's going to be probably a healthy contingent
of them that also believe it's a fact.
So sometimes we just internalize these bits of information.
They kind of come from somewhere.
I don't know where.
And they just they stick with you.
Oh, that's so interesting.
So not quite a false memory, but a false sense of reality in the present.
Something like that. Wow.
And I love it, too, because it protects the flower.
So, hey, that's not a bad side effect. Yeah, not a bad side effect.
I'm Andrea Valdez.
I'm an editor at The Atlantic.
And I'm Megan Garber, a writer at The Atlantic.
And this is How to Know What's Real.
Andrea, you know, lots of mistakes like that are commonly shared. One of them, I think about sometimes, involves Nelson Mandela, former president of South Africa, who a lot of people
became convinced that he had died in the 1980s when he was in prison.
But of course, he didn't die in the 1980s.
He died in 2013.
But the misconception was so common that researchers began to talk about the, quote, unquote, Mandela
effect to describe, I think, what we're talking about, these false memories that somehow become
shared and somehow become communal. And they're often really low stakes things,
you know, like how many people remember
the line from Star Wars.
I hope this is not a spoiler, but
the line from Star Wars is Luke, I am your father,
which is definitely what I thought the line was.
Of course, everybody does.
Right?
But you know what it is actually,
cause it's not that.
I do know what it is, but only because I
feel like this has come up so much that people
have the wrong idea.
It's, no, I am your father.
Yeah, exactly.
So there's no Luke in that, which
is such a small distinction and so tiny in one way.
But it's also kind of humbling to think
how that mistake just kind of took over the reality
and how it took on a life of its own.
You know, there's something actually really innocent
about getting things wrong.
In casual conversation, you might say something wrong
and it's okay, we all do that.
But I think the forgiveness for that comes
because the information trail you're creating,
it goes cold pretty quickly.
Maybe you have a kooky aunt and she tells you something when you're a kid,
and you just accept that it's a fact.
And then maybe you take that kooky aunt fact and you repeat it to a friend,
and then it just kind of stops there, right?
It doesn't get passed along and along.
But we live in a world right
now where it feels like there's rampant, never-ending misinformation. And with the internet and
the sharing culture that we have on social media, this misinformation, it goes viral.
And then it's just as if we're all sick with the same misinformation.
And sickness is such a good metaphor and one that scientists also use, right?
They compare bad information to bad health and just like you said, like a virus that
spreads from person to person as a contagion.
And the very fact that it's also easily transferable makes it really hard to fight off.
And I wanted to understand a little bit more about that dynamic and really
about what happens in our brains as we try to sort out the true information from the
false. Dr. Lisa Fazio is an expert on that question and thinks a lot about how our minds
process information. I asked her more about how we come to believe what we believe and how we end up holding on to incorrect information.
So the short answer is in the same ways that we learn correct information. So the same
principles of learning and memory apply. What's different with incorrect stuff is sometimes
we should have the knowledge to know that it's wrong.
And sometimes that means that we can avoid learning incorrect stuff, and sometimes that
means we actually don't notice the contradiction, and so we remember it anyways.
Could you tell me a little bit more about the distinctions there, and especially how
the new information interacts with the knowledge we already have?
My favorite example of this is something that we call the Moses illusion.
So you can ask people, how many animals of each kind did Moses take on the ark?
And almost everyone will respond "two," but once you actually point out to them that it was
Noah and not Moses who took the animals on the ark, everyone goes, oh, of course, I knew that.
So that knowledge is in your head, but you're not using it in the moment.
And so we've been calling this knowledge neglect,
that you've got it stored in memory someplace,
but in the moment you fail to use that knowledge
and you instead learn this incorrect information.
Oh, that's so interesting. What do you attribute that to?
It really seems to be that when things are close enough,
we don't flag them as wrong.
So if I asked you how many animals of each kind
did Reagan take on the Ark, you won't answer that question.
You'll notice the error there.
And it actually makes a lot of sense in our day-to-day lives.
When we're talking to each other,
we make speech errors all the time.
But to have a conversation, we don't point each one out. We just keep
going.
So, why then can we be so sure that we are correct?
I think it's one of the most fascinating things about our memory system that we can
have these times that we are absolutely certain that we have seen this thing, we have
experienced this thing, and it's just not true.
And I think part of it is that we often think about our memories for events as being kind
of video cameras, that like we're just recording the event and then when it's time to recall
it, we play it back.
And that's not at all how it happens.
Instead, what you remember is partially what parts of the event were important enough for
you to pay attention to, for you to encode.
And do we encode certain types of information differently from others?
Memory researchers sometimes talk about the difference between what we call episodic memory
and semantic memory, where episodic memory is your memory for events, your kind of autobiographical
memory versus semantic memory is just kind of all the stuff that you know about the world.
So the sky is blue, my name is Lisa, all the just kind of general facts and things that
we know.
And I will say there's arguments in the field, are these actually different memory systems,
or is it just one that's remembering two types of material?
There's some evidence from kind of brain lesions
and some neuropsychology that they are separate systems,
but then there's also evidence
that really it's all the same thing.
How does fiction fit into that,
or the facts of fiction, I guess?
How does our brain make sense of the difference between the real fact and the fictional one,
or does it?
So, there's interesting work trying to figure out kind of when we're thinking about fiction,
do we kind of compartmentalize it and think of it as something separate from our knowledge
about the real world?
And it seems to be that that's not really what happens.
So there's much more blending of the two, and you really keep them straight more by
kind of remembering that one is Lord of the Rings and one is reality.
But they can blend in interesting ways.
So we have studies where we've had people read fictional stories.
We tell them they're fictional.
We warn them that, hey, authors of fiction often take liberties with certain facts or
ideas in order to make the story more compelling.
So some of what you read will be false.
And then we have them read a story that contains a bunch of true and false facts about the
world.
And then later that day or a few weeks later,
we just give them a trivia quiz where we ask them
a bunch of questions and see what they answer.
And what they read in those stories bleeds over.
So even though they knew it was fictional,
it sometimes affected their memory and they would recall
what was in the story rather than what they knew
to be correct kind of two weeks earlier.
So Dr. Fazio is saying a couple of things.
One, sometimes we can inadvertently create false memories for ourselves.
We play back a memory in our head, but we have an incomplete picture of that memory.
So maybe we insert some additional not-quite-right details to flesh that memory back out,
and that ends up distorting the memory.
And then the second thing is there's our memories about facts about the world.
And sometimes we're recalling those facts from all sorts of information we've stored
in our brain, and the fictional or false stuff mixes in
with the real and accurate stuff.
You know, I've been thinking a lot, too,
about the efforts experts have been
making to distinguish between the different types
of information that we're confronted with every day,
and specifically the bad information that we deal with.
So there is misinformation, which is a claim that's just generally incorrect.
And then there's disinformation with a D, which is generally understood to be
misinformation that's shared with the intention to mislead.
Right.
Yeah.
Misinformation would be if someone who doesn't know much about Taylor Swift, for
example, messes up and keeps telling people that she's been dating Jason
Kelce. When in fact, as any Swiftie would know, she has been dating his brother Travis
Kelce.
Right, right. And disinformation would be, if I know that's wrong, but then I turned
around and I purposely told my friend who's a big football fan, that Jason and Taylor are dating
just to mess with him.
Yes, exactly.
Which is also then getting into the realm of propaganda if a troll kept posting that
whole idea that the Taylor-Travis relationship is not in fact genuine, but a PSYOP designed
to promote a liberal agenda, which was actually a real
claim people made.
Oh, boy. But to your point, I can see how this is confusing for folks. All these terms
are so similar and hard to disentangle, you know? And we have all these ways to categorize
these different errors. But I guess what I'm curious is are we really able to discern between all of these subtle distinctions?
You know, we can intellectualize them, but can we really feel them?
Oh, it's such a good question.
And something I was thinking about too is I talked with Dr. Fazio, and one answer might
be that intellectualizing those questions could also be a way to feel them.
Where just being aware of how our brains are processing new information
might give us that extra bit of distance that would allow us to be more critical
of the information we're consuming.

Nobody goes on vacation for the moments that are just... okay.
That's why Sunwing vacationers go all in like it's a buffet of fun.
Whether you're skimming the treetops like Tarzan's long-lost twin, or deep end swimming
with your flippers and fins.
Or maybe you're just perfecting the art of doing absolutely nothing.
Whatever vacationer you are, with Sunwing, you save more so you can do more.
Book with your local travel agent or...
I know you've talked about the difference between debunking misinformation and pre-bunking.
And I love that idea of pre-bunking.
Can you talk a little bit about what that is
and what it achieves?
So debunking is when people have been exposed
to some type of false information,
and then you're trying to correct their memory.
So they've had an experience,
they likely now believe something false,
and you're trying to correct that.
And we find that debunking in general is useful.
The problem is it never gets you back to baseline.
Having no exposure to the misinformation is always better than the debunk.
Seeing a debunk is better than nothing, but even better would be just no exposure to the
misinformation.
Yeah.
What pre-bunking interventions try and do is to kind of prepare you before
you see the misinformation. So sometimes this is done with something that's often called
inoculation where you warn people about the types of manipulative techniques that might
be used in misinformation. So using really emotional language, false experts, trying to kind of
increase polarization, things like that. But then you can also warn people about kind of
the specific themes or topics of misinformation. So like in this next election, you will likely
see a story about ballots being found by a river. In general, that ends up being misinformation.
So just keep an eye out for that and know that if you see a story, you should really make sure
it's true before you believe it.
And along those lines, how would you make sure that it's true,
especially with our memories working as they do? How do we even trust what seems to be true?
So I tell people to pay attention to the source. Is this coming from some place that you've heard
about before? The best way I think is our multiple sources telling you that. And one of the things I
also remind people is like in the fast-moving social media environment, if you see something
and you're not sure if it's true or false,
one thing you can do is just don't share that.
Like, don't continue the path forward.
Just pause, don't hit that share button
and try and stop the chain a little bit there.
Yeah, if you see something, don't say something.
Exactly. There we go. That's our new motto: see something, don't say something.
And do you find that people are receptive to that?
Or is the impulse to share so strong that they just
want to anyway?
Yeah, so people are receptive to it generally.
So when you remind people that, hey, Americans really
care about the accuracy of what they hear,
they want to see true information on their social media feeds and
that they'll kind of block people that constantly post false information.
We've got some studies showing that people do respond to that and
are less willing to share kind of really false and
misleading headlines after those types of reminders.
Could you tell me more about emotion and how it resonates with our brains?
Dr. Jay Van Bavel has some interesting work along with some colleagues finding that
moral emotional words, so words that convey a lot of emotion but
also a sense of morality, those really capture our attention.
And lead to kind of more shares on social media.
Our brains pay a lot of attention to emotion, they pay a lot of attention to morality.
When you smoosh them together, then it has this sense of a superpower of getting us to just really focus in on that information.
Which is another cue that people can use kind of if something makes you feel a really strong emotion,
that's typically a time to pause and kind of double check, is this true or not?
And along those lines, you know, media literacy has been offered sometimes as an explanation
or as a solution really to, you know, just if the public were a little bit more educated
about the basics of how news gathering works, for example,
that maybe they would be even more
equipped to do all the things that you're talking about
and to be a little bit more suspicious
to question themselves.
How do you feel about that idea?
And how do you feel about news literacy as an answer,
one answer among many?
Yeah, I mean, I think that's the key point, that it's one
answer among many.
I think there are no silver bullets here that are just gonna fix the problem, but I do think
media literacy is useful.
I think one thing it can be really useful for is increasing people's trust of good
news media.
Yeah.
Because one of the things we often worry about with misinformation is that we'll just make
people overly skeptical of everything and become kind of this nihilistic, nothing is true, I can't tell what's true or false, so I'm
just going to check out and not believe anything.
And we really want to avoid that.
So I think an important role of media literacy can be understanding here's how journalists
do their jobs and why you should trust them and all the steps they go to to make sure
that they're providing correct information.
And I think that can be a useful counterpart.
And what are some of the other factors that affect whether or not we're more likely to believe information?
Yeah, so one of the findings that we do a lot of work on is that repetition in and of itself increases our belief in information.
So the more often you hear something, the more likely you are to think that it's true.
And they're not huge effects, but just kind of things gain a little bit of plausibility
every time you hear them.
So you can imagine the first time that people heard the Pizzagate rumor that Clinton was
molesting children in the basement of a pizza parlor
in D.C. That seemed utterly implausible. There was no way that was happening. And the second
time you heard it, the tenth time you've heard it, it becomes just slightly less implausible
each time. You likely still don't think it's true, but it's not as outrageous as the first
time you heard it. And so I think that has a lot of implications
for kind of our current media environment
where you're likely to see the same headline
or the same rumor or the same false piece of information
multiple times over the course of a day.
And it occurs to me too that repetition
can also work the other way, right,
as a way to solidify good information.
Yeah, and we know there's even some findings that kind of rhyming sayings are thought to
be a little more truthful than sayings that don't rhyme.
So anything that makes it easy to understand, easy to process, is going to seem a little more true.

A lot of what Dr. Fazio talked about reminds me of a process known as heuristics,
which are these mental shortcuts we take when we're presented with information and we need
to make a quick decision or draw a quick conclusion or make a quick judgment.
And those mental shortcuts, they can really easily be exploited.
There's this great article in Undark magazine about how our brains are, they're just inherently
lazy.
And that puts us at an informational disadvantage.
In it, the writer makes the point that just simply using our brain requires a
lot of energy.
Like literally, it requires calories, it requires glucose. Just living is like fueling up for a race, and you have to fuel up your brain just to process the world.
Right. And so this article argues that as humans were evolving, we didn't always know where our
next meal was going to come from, so we'd save some of that energy. So decisions and judgments
were made really quickly, with survival first and foremost in mind.
Oh, yeah.
And so, cognition and critical thinking, those are two things that require heavier mental
lifting and our brain really prefers to not lift heavy thoughts. And it's probably part
of the reason that we're so easy to exploit is because we often just default to our lizard
brain.
And it's also probably part of why conspiracy theories
work so well, right?
They take this world that's so complicated
and so full of nuance and contradiction
and reduce it to something really simple.
All these questions that we have reduced down
to this one single answer that kind of explains everything.

Yeah, and that's a huge part of their appeal.
And it's so interesting to think about too because one idea you hear about a lot these
days is that, you know, we're living in a golden age of conspiracy theories, or maybe
a fool's gold age.
Ah, very nice.
But I was reading more actually about that, and it turns out that the theories themselves
actually don't seem to be more pervasive now than they have been in the past. So there
was a study in 2022 that reported that 73% of Americans believe that conspiracy theories
are currently quote unquote out of control. And 59% of people agree that
they are more likely to believe conspiracy theories compared with 25 years ago. But the
study itself couldn't find any evidence that any specific conspiracy theories or just general
conspiracism have actually increased over that time.
So even our perception of misinformation is a little bit misinformed.
That's so fascinating.
And it feels right.
Right.
No, exactly.
Or wrong, maybe.
Who knows?
Right, right, right.
Yes, the wrongness feels right.
And 77% blame social media and the internet more broadly for their perception that
conspiracies had increased. And it's sort of hard to prove out fully, but it does
seem to have merit because it's not just that we're often wrong online, but it's
also that we just talk so much about the wrongness. So the environment itself can
be a little bit misleading.
You know, and social media, it feels actually pretty rudimentary compared to
what's coming with the AI revolution. So if we're already having such a tough
time distinguishing between what's real and what's fake right now, I can only
imagine that's gonna get worse with AI.
Dr. Fazio, I wonder about how AI will affect the dynamics we've been talking about so
far.
How are you thinking about AI and especially the effect it might have on how we know and
trust the world around us?
So I go back and forth here from like optimistic to really pessimistic.
So the optimistic case is we've dealt with changes before.
So we had photography and then we had Photoshop
and Photoshop was gonna ruin all of us.
We'd never be able to tell when a photo was real or not.
And that didn't happen.
We figured out ways to authenticate photos.
We still have photojournalism.
Photoshop didn't kind of ruin our ability to tell what's
true or false. And I think a similar thing could be happening with generative AI. It
could go either way, but there's definitely a case to be made that we'll just figure this
out and things will be fine. The pessimistic view is that we won't be sure if what we're seeing is true or false, and
so we'll disbelieve everything.
And so you could end up in a spot where a video is released showing some sort of crime
and everyone can just say, well, that's not real.
It was faked.
And it can become a way to disregard actual evidence.
And do you, do you at this moment have a sense of which of those scenarios might win out?
Yeah, so I will say we're starting to see people do a little bit of the latter.
Anytime you see anything, oh, that's just not real, that's faked.
And that worries me.

And, I mean, how do you think about the sort of, you know, preemptive solutions, like you
said, you know, in previous iterations of this with photography, with so many new technologies,
people did find the answer. And what do you think would be our answer here if we were
able to implement it?
I mean, I think the answer again comes down to paying attention to the source of the information.
I mean, so we just saw with the Kate Middleton picture that kind of reputable news organizations
like AP noticed the issue, took the photo down.
And I think it's going to be on these organizations to really verify that this is actual video and
to become a little bit the gatekeepers there of kind of we trust this and you should trust
us.
And that's going to require transparency of kind of what are you doing, why should we
trust you, how do we know this is real.
But I'm hoping that that type of relationship can be useful.

Well, thank you for the perfect segue to my next question,
which is when it comes to news in particular,
how can we assess whether something is real?
And in your own life,
how do you think about what and who to trust?
Yeah, so I think one of the useful cues to what's real
is this sense of consensus.
So are multiple people saying it?
And more importantly, are multiple people who have kind of knowledge about the situation
saying it?
So not multiple people being random people on the internet, but multiple people being
ones with the expertise or the knowledge or the firsthand experience.

There's a media literacy strategy called lateral reading, which encourages people that when
you're faced with something that you're unsure if it's true or false, it's counterproductive
to dive into the details of that information.
So like if you're looking at a webpage, you don't want to spend a lot of time on that
webpage trying to figure out if it's trustworthy or not.
What you want to do is see what are other people
saying about that website.
So open up Wikipedia, type in the name
of the news organization.
Does it have like a page there?
Or type in the name of the foundation.
Is it actually funded by oil companies talking about climate change,
or is it actually a bunch of scientists?
Figuring out what other people are saying about a source can actually be a really useful
tool.
Andrea, I find that idea of lateral reading to be so useful, both on its own as a way to kind of decide for myself
which pieces of information to trust, but also as a reminder that when it comes to making
those decisions, we actually do have more tools at our disposal than it might seem.
Right.
And there is some comfort in having so many resources available to us, more sources, that
means more context, a fuller understanding.
But it cuts both ways, right? Taking in too much information is actually the thing that
short-circuits our lizard brains.
Yes. Yes.
And actually, there's this whole school of thought that flooding the zone with a bunch
of trash information is actually a really good way to confuse and control people.
And it's so useful to remember how connected confusing people and controlling them really
are.
Yes.
And going back even to the language of wrongness, for me at least when I hear the term misinformation,
I automatically associate it with politics, you know, and think about propaganda and all
of that.
But misinformation is a matter of psychology too. And people who study propaganda
actually talk about how often its aim is not just to mislead the public, but really to
just dispirit them, to make them basically give up on the idea of truth itself and get
them to a place where, like that old line goes, everything is possible and nothing is true.
That is dispiriting.
I know, I'm sorry.
I mean, it just encourages this nihilistic or apathetic view.
Right, right.
And I wonder too whether that view would be exacerbated by the influx of AI-generated content.
Yes, yeah.
Like with the rise of deepfakes, I think that's going to challenge our default assumption that, you know, seeing is believing.

Given the way that evolution has worked and
the evolution of our information ecosystem, maybe seeing is not enough. But if you want
to fight that nihilism, it's almost like you need to fight the evolutionary instinct of
making quick judgments on just a single piece of information that's presented to you.
Yes. And one way to do that might simply be just what we've been talking about,
appreciating how our brains are wired and remembering that as we make our way
through all the information out there. Almost like a form of mindfulness, you
know, this idea that awareness of your thoughts and sensations is such a
crucial first step in moving beyond our lizard brain impulses. So just being aware of how
our brains are processing new information might give us that bit of distance that would
allow us to be more critical of the information we're consuming, whether it's images or otherwise.
Oh, right. You know, I guess seeing tells you part of the story,
but telling yourself the most truthful story,
that just takes work.
That's all for this episode of How to Know What's Real.
This episode was hosted by Andrea Valdez
and me, Megan Garber.
Our producer is Natalie Brennan.
Our editors are Claudina Bade and Jocelyn Frank.
Fact check by Ena Alvarado.
Our engineer is Rob Smierciak.
Rob also composed some of the music for this show.
The executive producer of audio is Claudina Bade
and the managing editor of audio is Andrea Valdez.
Next time on How to Know What's Real.

The way surveillance and privacy works is that it's not just about the information that's
collected about you.
It's like your entire network is now caught in this web and it's just building pictures
of entire ecosystems of information.
So that's a huge part of what defines surveillance.

What we can learn about surveillance systems, deepfakes, and the way they affect our reality.
We'll be back with you on Monday.