Hidden Brain - The Air We Breathe
Episode Date: June 13, 2020
President Trump said this week that a few "bad apples" were to blame for police killings of black people. But research suggests that something more complicated is at play — a force that affects everyone in the culture, not just police officers. In this bonus episode, we revisit our 2017 look at implicit bias and how a culture of racism can infect us all.
Transcript
Hey there Shankar here. Today we wanted to bring you a bonus episode.
It's a story we reported several years ago. It's about race and bias and the power of culture
to affect our behavior. A quick note before we get started, this story begins with the police
shooting of a man named Terence Crutcher, and there is discussion of police violence against African Americans throughout the episode.
On a September evening in 2016, Terence Crutcher's SUV stopped in the middle of a road in Tulsa,
Oklahoma. A woman saw him step out of the car. The doors of the car were open. The engine
was still running. The woman called 911.
Officer Betty Shelby was on her way to an unrelated incident when the call came in.
Terence was 40, African-American, born and raised in Tulsa.
He was a church-going man with four children.
Betty was 42, white, a mother.
She was born in a small town not far from Tulsa.
In an ideal world, these two Oklahoma natives, close in age, ought to have had more to bring
them together than hold them apart.
But on this evening, there was no small talk or friendly chatter.
The police officer told Terence to take his hands out of his pockets.
According to her attorney, he first complied.
He then put his hands up in the air.
Moments later, he put his hands back in his pockets.
By this point, multiple police officers had gathered and drawn their guns and tasers.
Overhead, a police chopper filmed events as they unfolded.
From the video, it's hard to tell exactly what's happening on the ground, but an officer
in the helicopter thinks Terence is not cooperating.
Time for a taser, I think.
That looks like a bad dude, too.
Probably on something.
Moments later, one officer on the ground does fire a taser.
Betty Shelby fires her gun.
Shots fired!
321, we have shots fired.
We have one suspect down.
We need EMSA.
She kills Terence Crutcher.
Later, police discover that he was unarmed.
At a press conference after the shooting, a journalist asked Scott Wood,
Betty Shelby's attorney, why she opened fire.
Did him being a big black man play a role in her perceived danger?
No, him being a large man played a role in her perceiving danger.
She's worked in this part of town for quite some time.
And, you know, just the week before,
she was at an all black high school homecoming football game.
She's not afraid of black people.
Terence Crutcher's sister Tiffany sees it very differently.
She thinks the officer shot her brother
because he was black. That big bad dude was my twin brother.
That big bad dude was a father.
That big bad dude was a son.
That big bad dude was enrolled at Tulsa Community College.
Just wanting to make us proud.
Betty Shelby's daughter Amber defended her mother.
I am here to reveal to you the side of my mother that the public does not know.
My mother is an incredible, supportive, loving and caring woman.
She is a wife, a mother, and a grandmother, with a heart of gold.
She has always fought for the underdog and stands up for the weak.
Betty Shelby was acquitted of manslaughter charges. Still, the tenor of the back and forth,
the psychological accusations and psychological defenses, is very revealing.
When an incident like this occurs, we want to hear the story of
what happened. We want to know what was going on in the mind of the shooter and
the mind of the victim. Was Terence Crutcher truly a threat? Did Betty
Shelby dislike black people? What clues explain the behavior of these
individuals? We home in, dig for facts, and look for psychological explanations.
But what if there is another way to think about what happened, one that has less to do with the
individuals involved, and more to do with the context in which the shooting occurred?
What we're discovering here is that the individual mind sits in society and the connection between
mind and society is an extremely important one that should not be forgotten.
Individual behavior and the mind of the village, this week on Hidden Brain.
I'm Mahzarin Banaji.
Mahzarin is a psychology professor at Harvard.
She's made a career out of studying the invisible.
For the past 30 years, I've been interested in studying
those aspects of our minds that are hidden
from our own conscious awareness.
Mahzarin's interests began in graduate school.
She was teaching psychology at Yale
and looking for a way to measure people's biases. There was debate over the right scientific method to do this. You could
simply ask people their views, but because prejudice is a sensitive topic, you often don't
get anything. You couldn't walk up to somebody and say, do you agree that Italians are lazy,
and have them say yes or no. They'll just refuse to answer that question.
Our deep-seated discomfort about discussing prejudice was one hurdle for a researcher
looking to study the phenomenon.
Mahzarin realized there was another barrier.
What if some forms of prejudice are so deeply buried that people don't even realize they
harbor such bias?
Perhaps we behave in ways that are not known to our own conscious awareness,
that we are being driven to act in certain ways,
not because we're explicitly prejudiced,
but because we carry in our heads the thumbprint of the culture.
Was there a way to decipher this thumbprint and expose people's hidden biases?
Eventually, Mahzarin, with the help of her mentor Tony Greenwald and then-graduate student Brian Nosek, developed a simple,
ingenious test. It's called the implicit association test or the IAT. It's based on the way
we group things in our minds. When you say bread, my mind will easily think butter,
but not something unrelated to it.
Like say, a hammer.
Our brains, it turns out, make associations,
and these associations can reveal important things about the way we think.
So the way the IAT works is to simply ask people to sort things.
So imagine that you're given a deck of playing cards,
and you're asked to sort all the red cards to the left
and all the black cards to the right.
I'll predict that it will take you about 20 seconds
to run through that deck.
Next, Mahzarin says, shuffle the deck and re-sort the cards.
This time, I'd like you to put all the spades
and the diamonds to one side and the clubs and the hearts to the other side.
And what we'll find is that this will take you nearly twice as long to do.
Why? Because a rule that your brain had learned,
red and red go together, black and black go together, is no longer available to you.
Remember that in both scenarios, you are grouping two suits together.
In the first scenario, you are grouping hearts with diamonds and clubs with spades.
In the second scenario, you are grouping hearts with clubs and diamonds with spades.
Because there's a simple rule for the first task, group red with red and black with black, that task is easy.
In the second scenario, you need a fraction
of a second to think about each card. You can't follow a simple rule of thumb.
Mahzarin, Tony, and Brian had an important insight. These rules of association
apply to many subjects, including the way we think about other human beings.
So they created a new sorting task. Sort for me, faces of black people and bad things together, words like devil and bomb and
vomit and awful and failure. Sort those on one side of the playing deck.
On the other side, put the faces of white people, and words like love and peace and joy and sunshine and friendly and so on
to the other side. This turns out to be pretty easy for us to do because as my colleagues and I
will argue the association of white and good and black and bad has been made for us in our culture.
The test doesn't end there. After sorting white and good into one group and black and bad into another, you now have
to do it again, this time grouping black with good and white with bad.
And when you try to do that and when I try to do that, the data show that we will slow
down, that we can't do it quite as fast, because black and good are not practiced responses for us.
They're not habitual responses for us.
We have to exert control to make that happen because it doesn't come naturally and easily to us.
That's the IAT.
By the way, if you're wondering whether the order of the test makes a difference, it doesn't.
The researchers have presented tests to volunteers in different ways.
It doesn't make a difference if you ask people to first group black with bad or first ask
them to group black with good.
In both cases, people are faster to associate white faces with positive words and black
faces with negative words.
Mahzarin thinks the IAT is measuring a form of bias that is implicit or unconscious.
Mahzarin herself has taken the IAT many times. To her dismay, the test shows she has robust levels of unconscious bias.
My brain simply could not make the association of black with good as quickly as I could make the association of white with good. And that told me something. It told me it's not the IAT that screwed up, it's my head
that screwed up.
Because the IAT is a timed test, the results can be precisely measured.
Implicit bias, in other words, can be quantified.
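Because the test is timed, scoring comes down to arithmetic on reaction times. The published scoring algorithm (the Greenwald, Nosek, and Banaji D-score) divides the difference in mean response time between the two sorting conditions by the standard deviation of all responses. Here is a minimal Python sketch with made-up reaction times; the real algorithm also trims outliers and penalizes errors, which this omits:

```python
# Simplified IAT D-score: difference in mean reaction time between the two
# sorting conditions, divided by the pooled standard deviation of all trials.
# Reaction times below are invented for illustration.
from statistics import mean, stdev

def d_score(congruent_rts, incongruent_rts):
    """Positive score means the 'incongruent' pairing was slower."""
    pooled_sd = stdev(congruent_rts + incongruent_rts)
    return (mean(incongruent_rts) - mean(congruent_rts)) / pooled_sd

# Faster when sorting white+good / black+bad together (milliseconds)...
congruent = [650, 700, 680, 720, 690]
# ...slower when sorting black+good / white+bad together.
incongruent = [820, 860, 900, 840, 880]

print(round(d_score(congruent, incongruent), 2))  # prints 1.82
```

The point of dividing by the standard deviation is that the score becomes a standardized effect size, comparable across people who respond at different overall speeds.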
Most psychological tests are only available in the lab, but Mahzarin and her colleagues
decided to do something radical.
They put their test on the internet.
You can find it today at implicit.harvard.edu.
Millions of people have taken this test.
The data has been collected, shared, disseminated.
The IAT is widely considered today to be the most influential test of unconscious bias.
As Mahzarin, Tony, and Brian were developing the IAT, other researchers were developing
different ways to measure bias.
Psychologist Joshua Correll found himself diving into the field shortly after a black man
was shot and killed in New York City in 1999.
His name, Amadou Diallo.
Diallo was standing unarmed on the front stoop of his apartment building and the police
thought he looked suspicious and they approached him and they ended up shooting him and the
question that everybody was asking, and this was, I mean, something that people across the country were wondering about,
was: Was he shot because he was black?
At the time, Joshua was starting graduate school.
I took that question pretty seriously,
and we tried to figure out how we could test it
in a laboratory.
Joshua and his colleagues eventually developed a video game.
It was a pretty bad video game, but it did the trick.
It's more like a slideshow where there are a series of
backgrounds that pop up on the screen, and then in one of
those critical backgrounds, a person will suddenly appear.
So we've got photographs of, say, 25 or so white men and
25 black men, and we've photographed these guys holding a
variety of different objects.
Cell phones, a can of Coke, a wallet, a silver pistol, a black pistol. And so we've
just edited the photographs so that the person pops up in the background holding
an object and the player has to decide how to respond and they're
instructed if the guy on the screen has a gun, he's a bad guy and you're
supposed to shoot him and you're supposed to do that as quickly as you possibly can. What Joshua
wanted to know was whether players would respond differently depending on the race of the target on
the screen. Say a black guy pops up holding a wallet and a white guy pops up holding a wallet,
what's the likelihood that the black guy gets shot and the white guy doesn't? If current events are any clue, you may guess the answer.
Here we're looking at, say, you know, the player is responding to a target who's holding
a wallet, and the correct decision is to say don't shoot, and what we found is that they
are faster to say don't shoot, if the target is white rather than black.
The same held true for armed targets.
Test takers were faster to shoot black targets,
slower to shoot white ones.
Now, you might think that Joshua would conclude
that his test takers were just racist.
But one important similarity between Joshua's test
and Mahzarin's test is that they do not presume
that the people with such biases
have active animosity toward African Americans.
These are not members of the Ku Klux Klan.
It was just exactly what we had predicted.
And I guess both kind of hoped and feared, right?
I mean, it's an exciting scientific moment,
but it also suggests something kind of deeply troubling
that these participants who are presumably
nice people with no bone to pick. They're not
bigots, they're not angry at black people in any way, but what we saw in their
data very, very clearly is a tendency to associate black people with
threat and to shoot them more quickly. You could say that both of these psychological
tests are academic exercises. Do they say anything about how people behave in real life?
Joshua Correll is very clear that his video game experiment cannot replicate real life.
It's impossible, he says, to recreate in a lab the fear and stress that a real world
police confrontation can generate.
The IAT, too, has been criticized for a somewhat hazy link between test results and real-world
behavior.
Hundreds of studies have been conducted looking at whether the IAT explains or predicts how
people will act.
The results have been mixed.
In some studies, unconscious racial bias on the test seems to predict how people will
behave.
Research has found, for example, that doctors who score high in implicit bias are
less likely to prescribe clot-busting heart drugs to black patients compared to
white patients. But other studies, also looking at doctors and black and white
patients, find no correlation between results on the bias test and actual
behavior. This discrepancy bothers psychologist
Phil Tetlock at the University of Pennsylvania.
He is a critic of the IAT.
It's a test that is enormously intuitively appealing.
I've never seen a psychological test take off the way the IAT has, and it's gripped the
popular imagination the way it has, because it just seems on its surface to be measuring
something like prejudice.
Tetlock and other critics are concerned that just because someone shows bias on the IAT
doesn't mean that they're going to act in biased ways in real life.
If a test cannot predict how you're going to act, isn't it just an academic exercise?
There is the question of whether or not people who score as prejudiced on the IAT actually
act in discriminatory ways toward other human beings in real-world
situations. And if they don't, if there is very close to zero
relationship between those two things, what exactly is the IAT
measuring? It turns out, a lot. There's new evidence that
suggests the IAT does in fact predict behavior. But to see it,
you have to zoom out.
You have to widen the lens to look beyond the individual and into the community.
Hello, my name is Eric Hehman.
He's a psychology professor at McGill University.
Eric got interested in the IAT as he was researching the use of lethal force in policing.
He was trying to design a statistical model
that would predict where in the United States,
people of color are disproportionately likely
to be shot and killed by police.
First, he needed some baseline data.
This proved hard since the federal government
does not require police departments
to report deadly shootings by officers.
We really had no idea about really basic questions,
such as how often they were happening,
where they were happening, and who they were happening to.
But in 2015, some news outlets, including the Washington Post
and the British newspaper The Guardian,
began to compile their own database on police homicides
in the United States.
According to official terminology, these are called
justifiable homicides.
So what they were putting together was the most comprehensive list of these justifiable homicides
in the United States. Eric used this data to pinpoint where disproportionate police shootings
of minorities were most likely. Then he turned to the IAT data. Eric suspected that if bias was a factor in police shootings,
it was likely that implicit bias,
rather than overt racism, was at play.
Traditionally, the field has found that
explicit biases predict behaviors
that are under our conscious control,
whereas implicit biases predict things
that are a little bit more automatic,
a little bit more difficult to control.
And this is exactly the sort of behavior
that we thought might be involved in police shootings.
People take the IAT anonymously,
but they need to provide some information
like their race and where they live.
With the millions of data points the IAT provided,
Eric painted a map of bias across the United States.
Some places seem to have lots of bias. Others very little.
Now he had two databases. He cross-referenced them to see if there was any connection between
communities with disproportionate numbers of police shootings of minorities and communities
showing high levels of implicit bias. A powerful correlation emerged. So we find that in communities in which people have more racial biases, African-Americans
are being killed more by police than their presence in the population would warrant.
Let me repeat this because it's important.
In places where implicit bias in a community is higher than average, police shootings of
minorities are also higher than average.
Eric's analysis effectively pinpoints where police shootings are likely to happen.
But here's what makes the finding crazy.
Most people who take the IAT are not police officers.
So we're predicting police behavior without measuring the police themselves at all.
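In spirit, Eric's method is a region-level correlation: average the IAT scores of everyone tested in an area, measure how disproportionate police shootings of minorities are in that same area, and correlate the two across areas. Here is a toy Python sketch of that idea; all numbers are invented, and a real analysis would control for demographics and many other factors:

```python
# Region-level correlation in the spirit of Hehman's analysis: does a
# community's average implicit bias track its shooting disparity?
# All numbers below are invented for illustration.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One entry per hypothetical metro area.
avg_iat_score = [0.28, 0.35, 0.31, 0.42, 0.25, 0.39]  # mean resident IAT D-score
shooting_disparity = [1.1, 1.8, 1.3, 2.4, 0.9, 2.0]   # shootings of black residents
                                                       # relative to population share

print(f"r = {pearson(avg_iat_score, shooting_disparity):.2f}")  # prints r = 0.99
```

The key design point, mirroring the episode's argument, is that the unit of analysis is the community, not the individual: each data point is an aggregate over many test takers, most of whom are not police.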
Coming up, we explore how a test can predict how people will behave, even when they're not the people who take the test.
Stay with us.
The world is complicated, but knowing the past can help us understand it so much better. That's where we come in.
I'm Rund Abdelfatah.
I'm Ramtin Arablouei, and we're the hosts of Throughline, NPR's history podcast.
Every week we'll dig into forgotten stories from the moments that shaped our world.
Throughline, from NPR. Listen and subscribe now.
Psychologist Eric Hehman found a way to predict police behavior by comparing places that
have high levels of implicit bias with places where police shootings of minorities are higher
than average.
Since police don't typically take the IAT,
how could the IAT be predicting how police would behave?
Eric thinks the test has tapped into the mind
of the community as a whole.
Say there's a neighborhood that's traditionally
associated with threat or danger
and the people who live in that neighborhood
have these associations between African Americans and threat
or African Americans and danger.
And these would be anybody in this community.
This could be my mother or the person who lives down the street,
not necessarily the police officers themselves.
But there's this idea that this attitude is pervasive
across the entire area.
And that when officers are operating in that area,
they themselves might share that same attitude,
that might influence their behaviors in these split-second, challenging life and death decisions.
Implicit bias is like the smog that hangs over a community.
It becomes the air people breathe.
Or as Mahzarin might say, the thumbprint of the culture is showing up in the minds of the people living in that community.
There are many examples of this idea that individual minds shape the community, and the community shapes what happens in individual minds.
Seth Stephens-Davidowitz is a data scientist who used to work at Google.
In his book, Everybody Lies, Seth explains how big data from Google searches can predict
with great accuracy, things like the suicide rate in a community, or the chances that a hate
crime will take place.
We've shown that you can predict hate crimes against Muslims based on searches people
make.
People make very, very, very disturbing searches, searches such as Kill Muslims or I hate
Muslims.
And these searches can predict on a given week how many hate crimes there will be against
Muslims.
But I think the right approach to this is not to target any particular individual, to show
up at the door of any particular individual who makes these searches.
But if there are many, many searches in a given week, it will be wise for police departments
to put extra security around mosques because there is greater threat of these attacks.
In other words, what the Google search data is doing is effectively taking the temperature
of an entire community.
That's what you're really saying, that you're picking up on things that are in the ether
if you will in the community that might not show up in the individual, but are likely to
show up in the aggregate.
Yeah, and I think you don't really know the reason that any particular person makes a
search, right?
Someone could be searching kill Muslims because they're doing research, or they're just curious
about something, or they made a mistake while typing.
There are a lot of reasons that an individual can make these searches.
But if twice as many people are making these searches, well, if I were Muslim American, I'd want some extra security around my mosque, right?
Asking whether implicit bias affects the behavior of every individual is a little like investigating everyone who types an offensive search term into Google.
A lot of the time, you're going to find nothing.
And yet, when you look at the use of search terms in aggregate, it can tell you with great precision,
which areas will see the most hate crimes.
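The logic Seth describes is purely aggregate: ignore who is searching and watch the weekly volume. A minimal sketch of that kind of monitoring, with invented counts and a hypothetical spike threshold:

```python
# Aggregate signal, not individual intent: count a hostile search term per
# week and flag weeks where volume far exceeds the recent baseline.
# Weekly counts are invented for illustration.
from statistics import mean

weekly_counts = [40, 38, 45, 41, 39, 44, 95, 42]  # hypothetical search volume

baseline = mean(weekly_counts[:6])      # baseline from the earlier weeks
flagged = [week for week, n in enumerate(weekly_counts)
           if n > 1.5 * baseline]       # 1.5x is an arbitrary example threshold

print(flagged)  # prints [6]: the spike week, a cue for extra security
```

No individual search in the flagged week is interpreted; only the doubling of total volume carries information, which is exactly the point of the "temperature of a community" metaphor.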
For her part, Mahzarin Banaji believes Eric's work
is a key link between her psychological data
on how individuals behave and sociological insights
on how communities behave.
What we're discovering here is that the individual mind
sits in society and the connection between mind and society
is an extremely important one that should not be forgotten.
And that more than any other group of people, social psychologists
owe it to the beginnings of their discipline
to do both, and to do it even-handedly,
to be focused on the individual
mind and to be talking about how that
mind is both influenced by and is
influencing the larger social group around
her. This is why, says Mahzarin, when a
problem has spread throughout a community,
when it has become part of the culture,
you can't fix it by simply focusing on individuals.
One of the difficulties we've had in the past is that we have looked at individual people
and blamed individual people.
We've said, if we can remove these 10 bad police officers from this force, we'll be fine.
And we know as social scientists, and I believe firmly, that that is no way to change anything.
This new way of thinking about bias showed up in the last presidential election.
Democrat Hillary Clinton said implicit bias probably played a role in police shootings.
I think implicit bias is a problem for everyone, not just police.
I think unfortunately too many of us in our great country jumped to conclusions about each other.
Republican Mike Pence, now Vice President, bristled at the idea.
He said that Clinton was calling cops racist.
When an African-American police officer is involved in a police action shooting involving an African-American,
why would Hillary Clinton accuse that African-American police officer of implicit bias?
I guess I can't believe you are defending a position that there is no bias.
But as Mahzarin says, it's not quite right to think of people with implicit bias as
harboring the kind of racial hostility we typically think of when we say
someone is a racist. Small kids show implicit bias. African Americans themselves
show implicit bias against other African Americans.
The test isn't picking up the nasty thoughts of a few angry outliers.
It's picking up the thumbprint of the culture on each of our minds.
So what can we do?
Mahzarin is skeptical of those who offer training courses that promise quick-fix solutions.
There are many people across the country who say that they offer such a thing, called implicit bias training.
And what they do is explain to large groups of people what might be going on that's keeping them from reaching their own goals and being the good people that they think they are. And my concern is that when I'm an old woman, I will look back at this time and think,
why didn't I do something about this? Because I don't believe this training is going to do
anything.
In Mahzarin's view, you can't easily erase implicit bias, because you can't erase the
effect of the culture when people are living day in and day out in that same culture.
But she and others argue that there might be ways to prevent such biases from influencing your behavior.
Let's return to psychologist Joshua Correll.
Remember, he created the shooter video game that found test takers were more likely to shoot black targets than white ones.
Many of Joshua's initial test takers were students.
Eventually, he decided to see what would happen
if police officers took the test.
So he went to the Denver police.
We brought down a bunch of laptops and button boxes,
a bunch of electronic equipment that we were using
to do this study, and we would set it up in their roll call room.
And it was just complete chaos
and really, really fun. And some of the police really wanted nothing to do with us, but
a huge number of them volunteered and they wanted to talk with us afterwards.
At first, the police officers performed exactly the same as everyone else. Their levels
of implicit bias were about the same as laypeople who take the test, both in response times and in mistakes.
But when it came to the actual shooting of targets, the police were very different.
The police officers did not show a bias in who they actually shot.
Those early test takers, college students and other lay people,
displayed their bias on response time, mistakes, and who they shot.
Not the police.
But whereas those stereotypes may influence the behavior of the college students
and of you and me, the police officers are somehow able to exert control.
So even though the stereotype, say of threat, may come to mind,
the officer can overcome that stereotype and respond based on the information
that's actually present in the scene
rather than information that the officer
is bringing to it through his or her stereotypes.
Joshua wondered whether there were certain factors
that might keep police officers
from exerting this kind of cognitive control
over their biases.
He found, among other things,
that sleep made a difference.
Those who were getting less sleep were more likely to show racial bias in their decisions
to shoot.
And again, that's just consistent with this idea that they might be able to exert control
to use cognitive resources to avoid showing stereotypic bias in their decisions, but when
those resources are compromised, they can't
do it.
They could be compromised in a variety of ways.
Sleep is just one way that we can compromise it.
This, once again, is evidence that you can't train people not to have unconscious bias.
But as Joshua suggests, you can do things to make it less likely that people will be
affected by their bias.
To be clear, Joshua's experiments are laboratory experiments.
We know in real life that beat police officers do shoot people in error.
Now this could be because, in a real life encounter, stuff happens that makes it very difficult
for you to actually think about what you're doing.
So on the street, when somebody pulls a gun on you, it's scary, right? Like,
cops when they're involved in these firefights report some crazy, crazy
psychological distortions because they're legitimately freaked out. If they
think somebody poses a life and death threat, they may panic, and it may be hard
to bring those cognitive resources online. In several recent high-profile cases,
as with the case of Terence
Crutcher and Betty Shelby, police officers shoot people who are unarmed. Of course, officers
do not always know whether someone is armed, it's only in hindsight that we know the officer was
or wasn't in real danger. Joshua's larger point is that police encounters can be inherently
stressful. The uncertainty embedded in a confrontation
can make it very difficult to think objectively. To put it another way, if you're running a
police department and want to reduce errors and shootings, it may be less useful to give
cops a lecture on how they shouldn't be racist and more useful to build procedures that give
cops an extra half second when they're making decisions under pressure.
With practice and a bit of time to exercise conscious control,
people can reduce the risk of falling prey to their implicit biases.
There is the potential to control it, right?
The performance of the regular police officers,
or even people that we train in our lab, suggests that people don't have to succumb
to those stereotypic influences. They can exert control in certain circumstances.
Mahzarin Banaji has a similar solution. She thinks we need more of what she calls
in-the-moment reminders. For example, it's been found that some doctors prescribe painkillers
to white patients far
more often than they do to black patients who are reporting exactly the same levels of pain.
The only difference is the patient's skin color.
This suggests that bias is at work.
Mahzarin says if the bias is implicit, meaning physicians are acting biased without intending
to be biased, a timely reminder can help doctors exercise cognitive control over their unconscious associations.
You type in a painkiller that you want to prescribe to a patient into your electronic system while the patient is sitting next to you.
And it seems to me quite simple that when you type in the name of any painkiller, let's say codeine, that a little graph pops up
in front of you that says, please note,
in our hospital system, we have noticed
that this is the average amount of painkiller we give
to white men.
This is the average amount we give to black men
for the same reported level of pain.
In other words, giving doctors an opportunity
to stop for a second to make a decision consciously and deliberately,
instead of quickly and automatically,
this can reduce the effects of implicit bias.
Psychology has spent many years
understanding the behavior of individuals,
but tools such as the IAT might give us a way
to understand communities as a whole.
Maybe even countries as a whole.
We did a study some years ago, Brian Nosek led this particular project in which we looked
at gender stereotypes across many countries in the world.
How strongly do we associate female with science and male with science? And then we looked at the performance of girls and boys,
roughly around eighth grade on some kind of a standardized test.
And what we discovered is that the stronger the gender bias in a country,
that is to say the stronger the association of male with science in a country,
the less well girls in that country did on that mathematics test.
That's very similar to the Hehman kind of result, because we didn't measure the gender bias in the girls
and the boys who took the test. We were measuring something at the level of a country in that case.
And yet, it did predict something systematic about the difference in performance between boys and girls.
When we look at an event like a police shooting, we invariably seek to understand it at the level of individuals.
If something bad happens, we think it has to be because someone had bad intentions.
Implicit bias certainly does act on individuals, but it's possible that its strongest effects
are at the level of a community as a whole.
This might be why some police shootings of African-American men are carried out by African-American
police officers, and why some physicians who are not prescribing pain medications to people
of color might themselves be people of color.
Individuals can do their part to limit the effects of bias on their behavior,
but if you want to fix the bias itself, well, that takes the whole village.
This episode of Hidden Brain was produced by Jenny Schmidt, Maggie Penman, and Raina Cohen.
It was edited by Tara Boyle.
The music in today's show was composed by Ramtin Arablouei.
Our team includes Parth Shah, Thomas Lu, Laura Kwerel, Cat Schuknecht, and Lushik Wahba.
This week, our unsung hero is Caroline Drees, the senior director of field safety and security at NPR.
In this time of coronavirus and nationwide protests, Caroline works tirelessly to make
sure the people of NPR stay safe and healthy.
Thank you Caroline for all that you do.
For more hidden brain, you can find us on Facebook and Twitter.
If you like this episode, please do share it with a friend. I'm Shankar
Vedantam, and this is NPR. You may have noticed something at all these protests over police violence.
There are a lot more white people there than you'd expect.
But how long will that last?
This "awokening" among white American voters: how far are they really willing to go beyond
dethroning Trump?
Adam Serwer on race and lessons from history.
Listen and subscribe to It's Been a Minute from NPR.