The Daily - Wednesday, Feb. 21, 2018
Episode Date: February 21, 2018
The indictment secured by the special counsel, Robert S. Mueller III, makes it clear that the most powerful weapon in Russia's campaign to disrupt the 2016 election was Facebook. We look at how Russia used social media to sow divisions in the United States. Guest: Kevin Roose, who writes about technology for The New York Times. For more information on today's episode, visit nytimes.com/thedaily.
Transcript
From The New York Times, I'm Michael Barbaro. This is The Daily.
Today, the Mueller indictment makes clear the most powerful weapon in Russia's campaign to disrupt the U.S. election was Facebook.
What Russia understood before we did about how an American technology company could be turned against Americans.
It's Wednesday, February 21st.
February 14th, 2018, Valentine's Day, Ash Wednesday,
and a mass shooting at a school in Florida, the White House briefing.
We are continuing to follow an active shooter situation at a South Florida high school.
These are live pictures.
They're in Parkland.
This is just north of Fort Lauderdale, Marjory Stoneman Douglas High School.
You can't see them here in that particular picture, but this is...
Just one hour after this gunman opened fire, killing 17 people in Parkland, Florida,
there were all kinds of messages on social media.
Shortly after the news of this shooting breaks,
a bunch of Twitter accounts start releasing hundreds of different tweets.
Kevin Roose covers technology for The Times.
They're carrying hashtags. Some of them say, gun control now, gun reform now. But researchers
started to notice that some of the bots that were tweeting these things had also been previously used
to spread other unrelated hashtags about other topics.
And they think that that's because these are, in fact, Russian bots
that are being centrally controlled by state actors in Russia.
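To make that signal concrete, here is a minimal sketch in Python of the kind of heuristic the researchers describe, flagging accounts whose hashtag history jumps across unrelated hot-button topics. The topic clusters and hashtags are made up for illustration; this is not the researchers' actual method.

```python
from collections import Counter

# Hypothetical topic clusters; a real analysis would derive these from data.
TOPIC_CLUSTERS = {
    "gun_debate": {"#guncontrolnow", "#gunreformnow", "#nra", "#ar15"},
    "immigration": {"#buildthewall", "#daca"},
    "race": {"#blacklivesmatter", "#bluelivesmatter"},
}

def topics_touched(hashtag_history):
    """Count how many distinct topic clusters an account's hashtags fall into."""
    counts = Counter()
    for tag in hashtag_history:
        for topic, tags in TOPIC_CLUSTERS.items():
            if tag.lower() in tags:
                counts[topic] += 1
    return counts

def looks_coordinated(hashtag_history, min_topics=3):
    """Flag accounts that pivot across several unrelated divisive topics."""
    return len(topics_touched(hashtag_history)) >= min_topics

# Example: an account that tweeted about guns, immigration, and race in quick
# succession would be flagged for closer review.
history = ["#GunControlNow", "#AR15", "#BuildTheWall", "#BlackLivesMatter"]
print(looks_coordinated(history))  # True
```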
And the message coming from these bots connected to Russia is advocating for gun control.
Well, not exactly. They're sort of playing both sides of the issue.
So some of the bots started spreading the idea that the suspected gunman was mentally ill and sort of a lone wolf.
They said that he had searched for Arabic phrases before the shooting.
He had searched for Arabic phrases before the shooting.
And they also took the gun control side and started posting hashtags about AR-15s and the NRA.
So they're just trying to sort of stir up discord and inflame tensions.
And they really jumped in very quickly to do that. So this is happening the same week that the Mueller investigation releases its indictment of these 13 Russians showing the extent to which Russia used social media in the same way to try
to get Donald Trump elected. What are they doing a year later, getting involved in a school shooting?
So during the election, obviously, there was a thesis that these bots had a mission,
that they were partisan, that they were trying to help Donald Trump get elected,
which kind of made sense.
I mean, Trump has been favorable to Russian policy.
It made sense that they would try to create a situation
where he could win.
But since the election and through the Mueller investigation,
we've really learned that these bots
don't have so much of a political agenda.
They're not partisan in the way
that we typically think of partisan actors. Instead, their goal is to create chaos. They will take both sides of an issue and push
them as hard as they can to foment division and really get people riled up.
And I guess, why not keep doing this online after the election? Because clearly what the Russians were up to during the election
was a pretty significant success for them,
and nothing really happened to them punitively as a result.
So why should they stop now?
So it's been more than a year since the election,
and they've learned that social media is still a very fertile
ground for them to try to influence conversations in America. And so they've kept at it.
And they've learned that maybe their biggest avenue of influence is on Facebook.
How did Facebook become the go-to platform for the Russians? Well, it's sort of a
long story. And it starts seven or eight years ago, when Facebook begins to shift from a social
network that was for connecting with your friends and family to really something much more public.
It's the heart of your Facebook experience,
completely rethought from the ground up. They began changing what they called the news feed,
which was the sort of central thing that you saw when you opened up Facebook. This design
reflects the evolving face of your news feed. Rather than sorting it chronologically, they began
sorting it by the things that you thought you were interested in. They sorted it algorithmically.
It's designed for the way that we're all sharing today and the trends that we see going forward.
And what they noticed was that if it was just your friends and family, there was a point at which you ran out of new stuff.
Your family couldn't be interesting a hundred times a day.
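As a rough illustration of that shift, here is a minimal sketch in Python, with a hypothetical per-post interest score standing in for Facebook's actual, unpublished ranking model; nothing here reflects the real system's details.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime
    predicted_interest: float  # hypothetical score from a ranking model, 0.0 to 1.0

def chronological_feed(posts):
    """The older news feed: newest posts first, regardless of who posted them."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def ranked_feed(posts):
    """The algorithmic news feed: posts the model predicts you care about come first."""
    return sorted(posts, key=lambda p: p.predicted_interest, reverse=True)

# A publisher's attention-grabbing post can outrank a friend's newer, quieter one.
posts = [
    Post("your cousin", "lunch photo", datetime(2016, 5, 2, 12, 5), 0.2),
    Post("a news page", "outrage headline", datetime(2016, 5, 2, 9, 0), 0.9),
]
print([p.author for p in chronological_feed(posts)])  # ['your cousin', 'a news page']
print([p.author for p in ranked_feed(posts)])         # ['a news page', 'your cousin']
```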
You know, there are lots of different, you know, newspapers or media
channels that people can plug into, and they tend to not be personalized, right? So at the same time,
Facebook notices that much of the public conversation about news, about current events,
is happening on Twitter. Right. So they begin to get very jealous of this. So they open up this newsfeed to posts from all kinds of places,
newspapers, magazines, digital media companies.
So the newsfeed suddenly has real news.
Right. So the newsfeed suddenly goes from being this place where you go
to figure out what your friends and family ate for lunch today
to becoming this kind of global broadcasting service.
And what we're trying to do with Facebook and News Feed in particular is personalize
it and show you the stuff that's going to be the most interesting to you.
So Mark Zuckerberg, the CEO of Facebook, starts going around and evangelizing this new and
improved News Feed, which is going to be a one-stop shop for all the information you
care about happening anywhere in the world.
Our goal is to build the perfect personalized newspaper for every person in the world.
Facebook does this interesting thing where it decides that even though it has become
this enormous media broadcasting operation,
it is not going to operate like a traditional media company.
It's not going to choose what are the most important stories.
Instead, it's going to kind of run a global popularity contest. As anyone who is sort of
involved in the production of reality TV could tell you, conflict is very popular, and tension
is very popular, and arguments are very popular. And so this environment of a popularity contest for media begins to incentivize people to create even more and more outrageous and divisive content. And that environment is the one that Russia saw as being ripe for exploitation.
So what does Russia do?
Well, Russia has been using social media for years to meddle in elections, to stoke division. And by 2016, its techniques are pretty well honed.
And Facebook is dealing with this other controversy.
Facebook claims that a software algorithm blindly combs the internet for what people are reading
and spits out a list of what is trending,
regardless of politics.
But a former Facebook employee says
conservative stories never make the list.
So there were humans, journalists and editors who would comb through the news of the day and identify some of the biggest stories and put links to those stories in a prominent place on Facebook's homepage. And one of those contractors accused Facebook
of stripping out links to conservative news sites,
of essentially being biased against right-wing news sources.
And this became a huge and problematic story for Facebook.
What's trending on Facebook? Liberalism.
To me, companies like Facebook and others out there,
they really prize gender and racial diversity.
They don't so much seem to like diversity of thought.
Thanks a lot, Mark Zuckerberg. I see you.
They're getting letters from Congress. They're being accused of partisan bias.
So they decide they're not going to make any changes to Facebook that could be perceived as coming down
harder on one side of the political spectrum than another. Facebook has reportedly fired the entire
editorial staff on its trending team as it cuts back the human role in the controversial section
of its website. People will no longer write news descriptions and headlines for the company.
All employees will now go through political bias training.
The timing here is sort of incredible.
In response to criticism
that Facebook is suppressing conservative perspectives,
the company loosens up on regulations.
At the same moment,
this highly divisive conservative figure of Donald Trump
is becoming a viable candidate for president. So this is all unfolding in a way that Russia
probably could never have even hoped when this influence operation began.
Yeah, I mean, the irony is that Facebook was terrified of being seen as potentially influencing the election.
But their reaction to that ended up having a much bigger effect on the election than I think anyone expected.
So what does all this deregulation within Facebook allow Russia to do during the campaign and the election?
Well, as we saw in the indictment,
it allows them to create a network of pages and a network of fake accounts
and to use those accounts and those pages
to drive people to divisive and polarizing content
and also to drive them to events,
to get real people to go to rallies in Florida for Trump.
Southwest Florida for Trump!
What do we know about what Facebook knew at this time?
Does Facebook understand as this is happening that it has a Russia problem?
So famously, after the election and Donald Trump wins,
Mark Zuckerberg gets up on stage in an event and says that he thinks it's crazy. You know, personally, I think the idea that, you know, fake news on Facebook,
of which, you know, it's a very small amount of the content,
influenced the election in any way, I think, is a pretty crazy idea.
Obviously, he's been forced to sort of walk that back and apologize for it. But there was this real
feeling at the time that this was just social media. This wasn't affecting people's voting
habits. This wasn't going to translate to any real world consequences.
Isn't Mark Zuckerberg claiming that that kind of influence by Facebook is, in his words, crazy?
Isn't that sort of undermining the idea of his business model, which is that Facebook is wildly influential, that it's kind of the new digital town square?
Yeah, it is. And more to the point, Facebook had been spending years telling advertisers,
and specifically political advertisers, that they could use Facebook to influence their voters. I mean, they had case studies on their website, which they've now taken down,
with examples of Facebook being used to successfully win campaigns in local elections and state elections.
So they were kind of talking out of both sides of their mouth.
On one hand, they were saying,
we have this enormously powerful platform
to influence people's thoughts and behaviors.
And on the other hand, Zuckerberg is saying,
well, obviously people don't form their opinions
based on what they see on social media.
Would Russia have known that,
that Facebook was marketing itself as an influencer,
potentially a decisive influencer
in U.S. elections? Well, they could have. I mean, it was right there on their website. But I think
they understood more broadly that this was where a lot of people were spending a lot of time and
getting a lot of ideas and news about the world. And so if you wanted to sort of have a low-cost
way to reach a lot of people with a very targeted message,
that was your best bet.
Responsibility for what happens on the platform of two billion users is finally being owned by Zuckerberg.
Facebook CEO Mark Zuckerberg put out a new statement tonight.
Zuckerberg saying, in part because it's a long statement, after the election I made a comment that I thought the idea of misinformation on Facebook changed the outcome of the election was a crazy idea. Calling that crazy was dismissive,
and I regret it. This is too important an issue to be dismissive. So in December of 2016,
after the election, they sort of make this step to crack down on fake news. All right,
switching gears here, both Google and Facebook
announced they are updating their company policies to ban fake news sites from using their advertising
networks. They hire some journalists to sort of talk with the publishing industry. They say that
they're going to be partnering with fact-check organizations to label and flag disputed stories
on the news feed.
Facebook users, sweeping changes are coming for you in the next few weeks after saying his personal challenge for 2018 was fixing Facebook.
And then they start announcing that they're going to make major changes
to the news feed and the platform that are going to emphasize friends and family again.
Facebook's changing its news feed to encourage more meaningful interactions between family and friends.
Over the next few months, users will see less public content from businesses, brands, and media.
And they're not going to do this all at once.
They're going to reduce the amount of professional news in your news feed by something like 20 percent.
But it's a major change that sort of signals that they're essentially going to
try to take Facebook back to what it was before they started shoving all of this sort of professional
news content into people's news feeds. So Facebook's response to all this is to de-emphasize
news. And that feels somehow like a bit of a sad commentary, if that's the best solution.
Well, I think you could see it both ways. I mean, news is certainly divisive. It certainly sometimes is stressful, and it might
even make people not want to go on Facebook if all they're seeing is sort of the bad and crazy
news that comes out every day. Baby pictures are a lot more fun to look at than stories about
war and conflict and political strife. So it might be better for
their business to focus on content that's posted by your family and friends. Simultaneously,
Mark Zuckerberg seems to have had what I think is probably a very genuine crisis of conscience
about this. He built this thing to connect people, and all of a sudden it's being used to sow
division and hatred.
And I think he's genuinely disturbed by what has happened on Facebook. So I think he's trying to
emphasize what he calls time well spent, that the time you're spending on Facebook is not just being used to sort of passively observe the world around you, but is being used to
connect and move closer to your family and friends and, you know, have a good experience on Facebook.
Presumably anything that reduces the ability to broadcast a message, whether it's fake,
whether it's from Russia, to the largest possible number of people, would make Facebook less
valuable.
Right. And Mark Zuckerberg has said publicly that he's willing to accept a drop in
the time that people spend on Facebook in order to preserve the integrity of the platform. But I
think there's a point at which that actually would become a real problem for them, and they would have
to sort of scramble to figure out how to make money again. Kevin, I'm wondering if there's a way of looking at what Russia has done
and this whole meddling operation
as Russia essentially recognizing,
maybe before the rest of us fully did,
how powerful a platform Facebook is.
I think in a large and broad sense,
this Russia problem
and this episode in Facebook's history sort of raises a much larger question, which is how do you operate a network of two billion people that's influential in almost every country in the world in a way that doesn't turn into a kind of global shouting match? And how do you do that across cultures?
How do you do that across borders?
How do you do that in an atmosphere
where people are polarized, there's already division?
How do you connect people?
How do you connect a species over the internet
without sowing more division and more disunity?
And I don't know the answer to that.
I don't know whether it's possible
that we're not really supposed to be all connected
in this very easy and seamless way,
whether we're not supposed to be able to send things
careening around the world with one click.
Like, it's not clear to me
that that's a healthy structural force.
And I don't know what we do about it
because now we're here.
And if it's not Facebook, it'll be someone else.
And we have to kind of learn a coping mechanism
and sort of evolve into a species
that can kind of handle this enormous power.
Thank you, Kevin.
Thank you for having me.
We'll be right back.
Here's what else you need to know today.
After the deadly shooting in Las Vegas, I directed the Attorney General to clarify whether certain bump stock devices like the one used in Las Vegas are illegal under current law.
That process began in December. During a speech at the White House on Tuesday, President Trump announced that his Justice Department was working to ban the bump stock,
legislation that had gained political traction after Las Vegas,
but then quietly stalled out.
Just a few moments ago, I signed a memorandum directing the Attorney General
to propose regulations to ban all devices that turn legal weapons into machine guns.
I expect that these critical regulations will be finalized, Jeff, very soon.
It was the second time since last week's school shooting in Florida that Trump has answered
demands for gun control with calls for modest steps like improving the federal background
check system rather than making major changes to federal gun
laws. And on Tuesday, students from Stoneman Douglas High School traveled to Tallahassee
ahead of a vote by the state legislature on whether to take up a bill that would ban
assault weapons like the one used to kill 17 people at their school.
Chris, tell us about the goal for this trip,
what you guys are hoping to accomplish.
Well, to put it simply, we're hoping to pass some
common-sense gun safety laws and extensive background checks.
As the students looked on,
the Republican-controlled House of Representatives
voted by a wide margin to reject the possibility of even debating the measure.
That's it for The Daily. I'm Michael Barbaro. See you tomorrow.