The Daily - 'Rabbit Hole,' Episode 4: Headquarters
Episode Date: May 9, 2020
Note: This episode contains strong language.
Today, we're sharing Episode 4 of "Rabbit Hole," a New York Times audio series with the tech columnist Kevin Roose. In this episode, our reporter interviews the woman running the world's largest and most influential video empire: Susan Wojcicki, the chief executive of YouTube. If you're tuning in to "Rabbit Hole" for the first time, start with the prologue. You can find more information about the podcast at nytimes.com/rabbithole.
Transcript
From the New York Times, I'm Michael Barbaro.
Today, Rabbit Hole.
Episode 4.
Tell me what Internet is.
Can you explain what Internet is?
What, do you write to it like mail?
How do I become a member of internet?
Now, as you may or may not know,
I have poured over 12 years of my life into my YouTube channel,
this YouTube channel.
Heart, mind, body, soul, money, resources, travel, expertise.
And I stand by what I have done.
And then let me tell you what happened in a 24-hour period.
I have a strike against my channel.
Okay, so Kevin, not long after you and I went down to West Virginia and talked to Caleb,
things at YouTube started changing.
Right.
Apparently, you see, I had violated community standards.
So as stories like Caleb's have started coming out about how YouTube and its recommendation algorithm may be pushing people toward these extreme views.
This was a direct shot across the bow.
YouTube has started telling some video makers
that their content is against YouTube's new policies.
What they want to do is to have me start to self-censor and say, well...
Like Stefan Molyneux and...
So I've just been told I'm on double secret probation.
Gavin McInnes.
Some sort of 30-day watch on YouTube
where if I do anything wrong, I'm done forever.
People who had been associated with the alt-right.
What content could possibly be acceptable?
They also affected people like Stephen Crowder.
The demonetization specifically targeting our opinions as unpopular speech.
Who didn't really have the same association with the alt-right,
but just like Stefan Molyneux and Gavin McInnes,
his videos got demonetized.
For being, quote, what they term now, on the border.
Like, they couldn't make any ad money
off of their YouTube videos anymore.
Yeah, meaning it's gotten a lot harder
for some of them to make a full-time living
making YouTube videos.
It renders us unable to create a show for you,
and if it keeps going this way,
the channel will cease to exist.
I am going to need to figure out what to do.
I am begging you.
Philosophy begs you.
The future begs you.
Please help out the show at freedomainradio.com.
They are shutting down the discussion.
Other people, like Alex Jones.
They disappeared me.
Just got banned altogether.
If this isn't 1984, baby,
I don't know what is.
So, a few months ago...
Give me a level, Kevin Roose.
Our colleague Julia Longoria flew out to San Francisco.
We are here in San Bruno, California, a little south of San Francisco.
And the two of us drove down to YouTube's headquarters.
That building that looks like an insurance company or something,
that's YouTube.
And why are we here?
And we went there to talk to YouTube's CEO, Susan Wojcicki.
She's someone who has seen kind of the entire evolution of the modern internet.
She's got the steering wheel of this giant, powerful artificial intelligence that runs YouTube.
Hey guys, how's it going?
Hi.
Do you guys have badges?
We're here for an interview.
Sorry about that.
No worries.
I need to check.
I got you.
Confirmation.
No worries.
Thank you.
Thanks.
And anything that she does with that power, it has all these downstream consequences.
So a couple years ago, I guess now, they had a shooting here.
A person who was a YouTube creator who was upset about her channel and ad money and things like that.
She actually came onto this campus with a gun and shot three people and shot herself.
It was a really horrible, traumatic thing for the people here.
And so as a result, they've really stepped up security.
So I'm not surprised that we're getting scoped out by the security guys.
So they lead us inside.
So this is the lobby.
And we go down this set of back stairs.
I've been here, but not to the basement.
Is this just studios down here?
And that's where we meet...
Hey, how are you?
Susan.
Hi, I'm Dina.
Hi, hi.
Nice to meet you.
Before we get into the interview itself,
what is Susan like as a person?
What was your first impression of her?
Here, let's grab you a chair.
So, how are you?
Doing okay, how are you?
She doesn't really have, like, tech CEO vibes.
I did not major in computer science.
I have a history and literature undergraduate degree.
She doesn't really get the same kind of attention as Mark Zuckerberg or Jack Dorsey.
What was an early memory for you of the Internet and what you thought it was?
Well, I first saw the Internet probably in my late 20s.
But she's actually been around the tech world for longer than either of them.
And of course, I remember Netscape.
Like, Google actually started in her garage.
What does that mean?
It's sort of a wild piece of Silicon Valley trivia. She was working at Intel and living in Silicon Valley. And these two Stanford grad students named Sergey Brin and Larry Page, they were looking for a place to house their brand new startup, this search engine called Google. And eventually they convinced her to come over and help them build it as a business. And she became Google's 16th employee.
And so, you know, then you became history's most successful landlord.
And you, you know, you got to Google and started, you know, working in the tech world.
And in 2014, you know, made the decision that you were going to come over and run YouTube.
And what do you remember about YouTube when you came in? Like, what was it like?
Well, first of all, you know, when I was first asked whether or not I wanted to have this role,
I had been running our advertising business. And so Larry asked me, Larry Page, and I
immediately said yes. I have always had a love for the visual arts. I was a photographer.
I love creative. I had created Google's image search. And I could just see
clear as day that YouTube had huge potential in front of it. And so in many ways, I thought,
oh, I'll go to YouTube and I'll get to work with all these fun entertainment creators.
It probably seemed like you were going from a very serious job to a very fun job.
Yeah.
YouTube at the time was really much more of an entertainment site.
It was not seen as a very serious site.
Yesterday, the internet blew up over this video of a rat carrying a whole slice of pizza
down the...
And actually one of the areas that I really pushed our team a lot on was actually
freshness. Because I would get frustrated, I would come to YouTube repeatedly and I would
keep seeing the same videos. One of the areas I'd push them on and I've consistently pushed them on
is actually exploring new areas, which is, of course, the hardest area for us to discover.
Interests that you haven't necessarily told us that you're interested in.
Or that you might not know you're interested in.
Or you might not know, but we think they're interesting.
Like that exploratory part is a really important part.
And I do think we've gotten better at it.
And we certainly have gotten better at predicting what people are interested in.
So part of why I wanted to talk to Susan was to ask her about this 2015-2016
period. This period when YouTube was trying to engineer this new algorithm that ended up opening
the door to more polarization and extreme views and was also right when Caleb was getting introduced
to all of these new characters. And it was kind of striking that almost right off the bat,
she owned the fact that she had driven that change.
I want to talk more about the recommendations thing.
As you know, I've been very interested in it.
And one of the people I've talked to and wrote about was a guy named Caleb Cain.
Just to ask the most blunt possible version of this question,
what did you think of that story?
I mean, I thought of the story,
you were trying to understand how our recommendation systems were working
and what the impact of them were.
And what was her take on Caleb's story?
Well, she didn't really get into the specifics of his story.
And it wasn't just you. There are many people that have raised questions or concerns on how our recommendation systems work.
And we wanted to make sure we're taking that seriously.
She did acknowledge that YouTube had taken from stories like his that they needed to start making some changes.
And we have taken it seriously. And I think, you know, we've made a lot of changes to how our systems work.
I want to talk about all those changes. But I'm struck by the fact that you said that one of the challenges for the algorithm and your design of it was getting people to explore new interests.
That strikes me as something that happened to him, where he went to YouTube looking for self-help videos. He was going through a rough patch. He wanted to find something to cheer him up.
And then he got introduced to these people who would do self-help, but then they would also talk about politics and they would go on each other's shows. And they were creating this kind of
bridge between like the self-help part of YouTube and the more political part of YouTube.
Was that something that you observed happening on the platform?
I mean, it's interesting that you say that because I guess I want to say going back from my early initial days, which is that we couldn't get people interested in news or get people interested in politics.
I don't think we had any indication that this was something that people were interested in.
Like people were interested in gaming and music, entertainment.
They came to laugh.
They came to escape in many ways.
She went back to this idea that YouTube just wasn't seeing data that suggested that people
were looking for politics on YouTube.
I acknowledge that there are many political commentators and a lot of political views
that have emerged on the platform, but that was not an initial part of how YouTube worked. And
I guess the only reason I'm bringing that up is because I do want to say that, you know, YouTube, for all its creators, for, you know, millions of creators, there is a set of creators who do find success with that, but it's a small
set. And in the end, she said that essentially, like this kind of political content, it's still
a really small percentage of everything else on YouTube.
And is that true?
Like, because it seems like there's a lot of it, I guess.
I mean, there's no way to know for sure without having access to their internal data.
But like, even if it were true
that only 1% of YouTube videos
consisted of this kind of political content,
because YouTube is so big,
that would still translate to millions and millions of people around the world watching it.
Yeah. And we've talked to some people who were at YouTube earlier in its history around the time
that you came and even before then, who said, you know, that there was sort of this obsession with
growth, that there was a very strong push to expand the watch time on the platform, and that any challenges that were brought to management around that just weren't given a real hearing.
Does that resonate with you at all? And I asked about Guillaume too, about the red flags that he
had tried to raise back when he was at YouTube. I mean, I think I've certainly heard people say that.
And I mean, you know,
like I came from a company that was very focused on quality,
quality of information.
It was always, like, the most important thing. We always prioritized quality.
And she didn't really deny it, but she pivoted pretty fast to her work at Google and this
concept of quality, which is basically like the term that Google uses to talk about its
search engine and how it wants people to get good, reliable information when they go looking
for something.
And so I tried to figure out how do
you reconcile that with a company that is a more entertainment based company, right? So what does
that mean to have quality? If the main thing that you're doing is gaming videos, cat videos,
and music videos, then what does that really mean for quality? And I think what she's getting at is that at the time,
like YouTube just didn't really think that it was capable of doing much harm.
And I guess the reason I'm bringing that up is that
one of the biggest realizations for me was that we needed to start adapting and changing
and that we needed a very different set of ways of managing information
than you want to manage entertainment.
Was there a moment that crystallized that for you?
Um, well...
It's the moment celebration became nightmare.
In 2016...
People were screaming, kids were crying.
There was a terrorist attack that had happened in France.
Horror tonight in the French resort town of Nice.
Susan says that in 2016, on Bastille Day...
In the coastal city of Nice, just as the fireworks ended, a truck plowed into the crowd.
...when an ISIS terrorist attacked a crowd.
At least 84 people were killed. Dozens of others were hurt. And the government there declared a state of emergency.
And I remember reading that and being just extremely upset and thinking, our users need to know about it.
YouTube decided that for people in France, it was actually going to push news about the attack onto their homepages.
The attack turned the streets into a scene of chaos.
And for people looking for information about the attack.
The mayor of Nice warning people to stay indoors.
Fears tonight of another terror attack.
They were actually going to prioritize videos
from what they considered authoritative sources.
But that didn't perform very well on our platform.
She says that basically the engineers at YouTube
were telling her like,
The users don't want to actually see it.
No one is clicking on these videos. That's just not what they want to see. And so what do you do as a
platform? Do you show it to them anyway? And so that's actually, I remember that very clearly
because that was the first time I said to them, you know, it doesn't matter. We have a responsibility.
Something happened in the world and it's important for our users to know. And according to her,
she said basically, like, too bad, we're showing it to them anyway.
And it was the first time we started using the word responsibility
and the fact that we needed to put information in our site that was relevant,
even if our users were not necessarily engaging with it
in the same way that they would with the entertainment videos.
So if I'm understanding this right,
what Susan is saying is that the Nice attack marked
the first time that YouTube took this idea of responsibility and prioritized that over watch
time.
Right.
At what point did that sense of responsibility extend beyond, like, a big news event?
It took a little while.
I mean, I think over 2017 and 18 and into 2019,
I think pressure was starting to build on YouTube.
CNN reports that YouTube ran ads from large brands like Adidas, Amazon, and Hershey
before videos which promoted extreme content.
There were reports in The Times and other places.
You're about to meet a man who says he was radicalized...
Regulators, parents.
Advertisers, they were all chiming in. Good morning, everyone. The subcommittee on
consumer protection and commerce will now come to order. Congress actually held hearings about
these internet platforms. Congress has unfortunately taken a laissez-faire approach
to regulation of unfair and deceptive practices online over the past decade,
And platforms have let them flourish.
And invited experts, including former Google employees.
So there you are, you're about to hit play in a YouTube video.
And you hit play, and then you think you're going to watch this one video.
And then you wake up two hours later and say, oh my God, what just happened?
And the answer is because you had a supercomputer pointed at your brain.
To talk about how YouTube was essentially built to pull people into these polarizing rabbit holes.
It's sad to me because it's happening not by accident, but by design.
And that's when YouTube started making these big changes last year.
They disappeared me.
So I've just been told I'm on double secret probation.
Apparently, you see, I had violated community standards.
You know, one of the biggest changes that we made from our recommendation system, and it was probably one of the later changes we made, and I think it's because it's a harder one to grapple with,
which is that we realized that there was a set of content that even if people were repeatedly
engaging with it, we thought it was important to make the recommendations more diversified
and to show them more quality content alongside.
What she's essentially said is that there's a certain kind of video
where even if a lot of people are watching it
and it's generating all this viewing time,
the site will actually intervene.
If a video has hate speech
in it, or if it's made by like a neo-Nazi who's denying the Holocaust happened, that will come
totally down off YouTube. But there's this other class of videos that YouTube has a harder time
categorizing.
You know, we began to identify content that we called borderline content, and that if users were repeatedly engaging with this borderline content, we will show them other content alongside that we've ranked as more high quality content.
Borderline content doesn't quite break the rules, but it's also not the kind of thing that YouTube wants to boost and recommend. So what they've done is essentially to train
algorithms to identify these kinds of videos and then demote them in people's feeds and in their
sidebars so that they don't get seen as often. And what that has done is it's actually had a
70% reduction of views of borderline content in the U.S.
And we've been in the process of rolling that out globally.
It's in most English language countries right now
and a few large markets like France and Germany,
but we continue to roll that out.
And how do you define like borderline content?
Like what's borderline versus just, you know, edgy jokes?
How do you sort of separate the two?
Yeah, I mean, it is a complicated and nuanced area.
And, I mean, we have a whole process and algorithm to find it.
Right.
And I guess what I'm wondering is, like, these are sort of interventions in an automated system, right?
Like, you set up this automated system, let it run, realize, like, some of the things that it produces, realize that maybe those aren't creating an optimal experience for people.
And then, you know, humans come in and sort of tinker with it to produce different results.
Do you feel like when you...
Well, hopefully we do more than tinker.
We like have a lot of advanced technology.
Of course, of course.
Very scientifically to make sure we do it right.
Right.
I guess a sort of overarching question is,
do you think there was an over-reliance in YouTube's early days on automation, on machines?
Do you think there was a sort of underappreciation for the role that sort of judgment and human intuition played in this stuff?
No. I mean, I think both are really important. And if we look at how our systems work, we definitely have a lot of, we have basically a system that is very heavy with humans that is extended with machines.
There is an automated component, but then there's also, of course, a human component.
So is she saying that humans are now doing more work than the AI?
No. I mean, like, yeah, like they do have more people looking at videos than they used to.
We have a number of raters that all watch and review videos.
And based on the feedback, we identify a set of content that looks borderline.
And then based on those borderline...
But YouTube is so big, they could hire a hundred times as many people and they still wouldn't be able to watch a fraction of everything that's being uploaded to YouTube on any given day.
So it's still the case, then, that most videos that I'm recommended, that anyone who goes to YouTube is recommended, those recommendations are coming from different algorithms being run by an artificial intelligence.
Yeah, the robots are still very much running the show.
But what she's saying is that they're trying to give humans
a bigger role in supervising them.
You can't see Andrea, but she's waving us off.
But we have like a couple more minutes. We're good.
Okay, well, we'll start wrapping it up.
I do have some more questions.
One is, I wanted to ask about the shooting here in 2018.
I wonder what that was like for you.
Yeah.
I mean, it was, first of all, just a horrible event to go through.
We have a report of subject with a gun.
This will be from the YouTube building.
Like when something like that happens to you, you can't really process that it's happening.
A flood of YouTube employees streaming out towards safety, some still clutching tightly to their laptops.
Turning what was an ordinary, normal lunch hour on this tech campus into an all-too-familiar shooting scene in America.
You don't know, is there one shooter? Are there five shooters? Where are they?
New details about the woman police say shot three people at YouTube's headquarters
before taking her own life.
I'm being discriminated and filtered on YouTube,
and my old videos that used to get
many views stopped getting views. Accusing the website of filtering her channels to keep her
videos from getting views, something she blames on new closed-minded YouTube employees. I'm very
thankful in some ways like that, you know, nobody was killed. I think it also just showed any kind of information
company, you know, has some risk of people being upset with information that's reported.
And so, you know, my key takeaway was that we need to make sure that our employees are always safe
and that we're doing everything we possibly can. And I wonder, was that a moment for you where you
realized the stakes of the decisions
you made here in a different way? Did it change the way you looked at your responsibility?
I mean, information is a tough business. I mean, you are in the information business yourselves,
and you understand that sometimes people can get really upset and they can get really angry about
ways that they're covered or ways that
information is displayed. And just like a journalistic organization like yourselves,
you need to do what's right. I want to look back on this time and feel that I made the right
decisions. I don't want to say, oh, I allowed this type of content because I was afraid.
Yeah. I mean, I remember sort of that day and I, you know, other shootings since that
day and that have been sort of related in some way to the internet and just feeling like a sense of
loss. Like I, I grew up on the internet. I love the internet. I have wondered if we're ever going
to get back to a time when these things feel fun and useful and like they're not leading to these culture wars
and polarization.
Like, do you think that we'll ever feel like that
about the internet again?
I love the internet.
I think that the internet has enabled
so many different opportunities
for people that wouldn't have had it.
And I think you're right.
Like in the last couple of years,
there's been a lot of scrutiny
and it's because of the size that we are and because people realize the significance
of our platform. And I recognize that and I take that very seriously.
My focus on responsibility is probably one of the most important things I'll do in my life.
Why?
Because I think we live at this time where there's tremendous change.
And so, yes, we've had years of all this fun and gaming and cat videos,
but there are a lot of really important decisions to be made
about what this future will hold and what will platforms be
and how will they operate in the future.
And those rules haven't been written yet,
or we're in the process of writing them.
Yeah.
Well, thank you.
Thank you.
I appreciate you making time.
Thank you.
Thank you so much.
All right.
So, Kevin, this is where things stood a couple months ago.
Mm-hmm.
But then came the coronavirus.
Right. A handful of people is spreading the idea on social media that the rollout of 5G cell towers is responsible for the COVID-19 epidemic.
Some of the towers have even been set on fire.
So this April, as we were all dealing with the coronavirus and quarantining in our houses, I started to see...
Let's talk about what's really going on.
As usual...
Something just since the beginning
hasn't seemed right with this coronavirus.
The internet was filling up with misinformation.
You're saying that silver solution would be effective.
Totally eliminated, kills it, deactivates it within 12 hours.
There were all these miracle cures.
Just drinking hot water can kill it.
If you drink bleach, you will not get the coronavirus.
Vitamin C can kill it, no problem.
You can prevent this from happening with vitamin C.
There were these conspiracy theories about...
Anyone else find it coincidental that Dr. Fauci
is on the leadership council for the Bill and Melinda Gates Foundation?
Oh, it's all coming out about Fauci working with Gates. He knew all about it.
Bill Gates or the Jews are behind this.
We are quite literally watching a bio warfare drama play out before us.
Stuff about like, is this being caused by 5G towers?
Yes.
5G cell phone towers causing coronavirus. Anybody want to make one guess as to where the first
completely blanketed
5G city in the world was?
Wuhan.
Exactly.
Like the exact sort of thing that
Susan would consider
low quality content. Right.
Another incredible sight
in this surreal ordeal as we battle this coronavirus pandemic, the
Capitol steps, the scene of a protest.
Of course, like a lot of misinformation, this didn't just stay online.
Are you concerned about this virus?
I was in the beginning until I've done my research and found out the realities
and the media's overreach on it and that it's not as serious as they made it out to be.
It actually contributed to these protests that were going on at state capitals all around the
country. These sort of anti-lockdown protesters who were refusing to go along with social
distancing and the other recommendations
that medical and health authorities were making about how to keep society safe.
The disease is not that serious that we should quarantine.
We don't quarantine for the flu. We should not quarantine for COVID-19.
Hi, got you in your closet there?
Hello, yes, I'm in my closet.
I'm in my office. I told my kids not to come in.
So in early April...
Are you recording on your end, or can you record on your end?
Sure. I'm in a town in London. I don't normally record myself.
I'm in my very fancy home studio.
Susan, if you could hold the phone as if you're talking on the phone, that'll be the right placement.
And Julia and I got Susan back on the phone.
The last time we met, we talked about all of the ways that you were trying and had tried to improve the quality of the information on YouTube.
And since then, like, there's this coronavirus, which seems like maybe kind of the highest stakes possible version of that mission, where the difference, you know,
between people getting good information and bad information can literally be life or death. So
what are you doing now to make sure that people are getting good information about the virus?
From the very beginning, we took this extremely seriously.
And we took our team and decided,
what are all of the things that we can do to be as responsible as possible?
And right away, she said, like, all these coronavirus conspiracy theories and miracle cures.
Just drinking hot water can kill it.
Very quickly, we were reviewing all of the videos, so we are very proactive. And anything that would be harmful or anything that would promote medically unproven methods...
If you drink bleach, you will not get the coronavirus.
We have been removing that type of content.
You can prevent this from happening with vitamin C.
She said YouTube has been aggressively hunting down and deleting them as fast as they can.
Hi, I'm Dr. John Brooks with the CDC.
We put links to the CDC.
Hello, everyone, and welcome to this regular press conference on COVID-19 from Geneva.
The World Health Organization or to the associated equivalent globally in the home feed
under videos that were about coronavirus and in our searches.
Let's work together to keep ourselves healthy,
our families healthy, and our communities healthy.
On every video about the coronavirus,
they put this little link that directs people to the CDC's website
or other health authorities.
So the 5G story is complete and utter rubbish.
It's nonsense.
It's the worst kind of fake news.
We have a new shelf that we launched
that gives users information from authoritative sources.
Far-fetched conspiracy theories like this
are really damaging
because they undermine public health efforts
to stop the spread of the virus
by convincing some people that it's not the real problem.
They created this feature that basically like when you search for information about the coronavirus,
the first thing you see is this section called top news that contains essentially like information
from authoritative, trusted sources.
Staying inside saves lives.
Stay home.
They also got popular YouTubers
to make these social distancing PSAs.
I really think togetherness is the superpower of our species.
Let's do it together.
We will keep each other company.
So they weren't just hearing it from YouTube,
they were hearing it from their favorite creator.
Come yoga with me. Each day can be a different thing.
Which they then promoted to all
their subscribers. And when we looked at the combined number of subscribers that they have,
it's over 1.5 billion subscribers. You can slow the growth of this and save lives.
We have everyone at YouTube working throughout this crisis to make sure that they are following what's happening, making changes on the policy, making sure that we are taking
down content, making sure that we're adjusting whatever changes we need to our algorithm.
And as she's going through her list of everything that YouTube is doing on the coronavirus,
it really strikes me like how much work it takes to make sure that this algorithm that YouTube has built,
that it's not leading people in a dangerous direction.
I mean, I could go on. We did a lot of different steps here to make sure we were doing everything possible.
Yeah, I mean, just personally, like I've been impressed by how hard it's been to find misinformation about the coronavirus on YouTube.
And I guess I'm wondering, like, why isn't it always like this? Like what is preventing YouTube
from taking this approach all the time to every subject?
Well, we do take a responsibility
approach to everything that we do.
What's a little bit different about this one is there is a very clear authority, which is the World Health Organization.
And there are recommendations that health organizations are making.
And as a result, it's very clear to us that anything that would contradict social distancing, for example, would be a violation of our policies. And in a lot of other areas, there may be some agreement on the science, but there could be politically different views. And so we want to enable a broad
range of views.
So Susan's saying that she's basically comfortable doing this kind of work
when it comes to something like the coronavirus, because it's just science and there are clear authority figures to turn to. But politics is a completely different story.
And maybe there's some obvious answer to this, but like, why is that?
We want to enable a broad range of views. We want to make sure that the full spectrum
of political views or social ideas are represented there. But in this case, it's very clear
that there's an authority and information behind that authority.
I mean, I think on an ideological level, there are a lot of people at YouTube who still want it to be this open platform where every kind of view is welcome and is equal.
And I think there's also this more practical angle to it, which is that they don't want to be seen as putting their thumb on the scale. They don't want politicians and regulators, people who might try to break them
up or enforce new rules on them to think that they're taking a partisan stand.
Right. But I feel like, because of the time that we're living in, in part due to internet platforms like YouTube and the powerful AIs that have been designed to capture and keep our attention at all costs, that, like, now everything feels polarizing and political, even everything around the coronavirus. And so it sounds like Susan is saying, you know, okay, we recognize that we helped let this genie out of the bottle.
Right. We didn't mean to.
We were just trying to entertain.
We're now going to try and put it back.
But I just wonder, like, in the environment that they helped to create,
if that is even possible.
Yeah.
I mean, it's kind of a reckoning for them.
YouTube is part of Google.
And I've been at Google for over 20 years now.
And, you know, it's an information company, which means our goal, our mission is to deliver
users the most relevant information, right?
And that it's accurate.
And that's true for YouTube, too, that we want to deliver accurate, useful information.
And I think in the information area, it's very important.
Yeah. And I guess it strikes me as, like, there's a trust crisis right now. I mean, people just don't necessarily trust the institutions that maybe they would have years ago to sort of give them accurate information. So you're kind of trying to rebuild trust, it strikes me, like, you know, by placing content from organizations like the CDC and the WHO prominently on YouTube. Do you feel like that's part of what you're trying to do right now, to kind of help people on YouTube understand that they actually can trust these sort of mainstream authorities?
Well, part of what we want to do for YouTube is make sure that we have all of the voices
represented on the platform. And YouTube started out with a lot of individuals just posting videos
about what was happening in their lives. But what we've really done over the last couple of years
is reach out to make sure that news organizations and trusted scientific, health, and government officials all have the resources
needed to be able to come onto the platform on YouTube. And what I've seen happen with COVID-19
is it's really accelerated public health officials recognizing that YouTube is an important medium to
be communicating with users about anything involving health. And so, you know,
in many ways, I think this would have happened naturally. It might have taken a few years. It
just accelerated a few years. And we plan to continue to work with all of these organizations
to make sure they can get the right information out to their users.
Thank you.
Thank you.
I think we're good.
What Susan is saying, this change in YouTube,
it's actually like a fundamental shift in the YouTube universe.
Like, for its entire existence,
YouTube has been defined as an alternative media space.
And the people that were there were like these insurgents and upstarts,
these kind of like rebels and misfits.
And now YouTube is basically saying,
when it comes to certain things,
we're going to let the establishment win.
Which is tricky because the establishment
isn't always right.
Like the World Health Organization
has changed its guidance on things like face masks
during the pandemic. So the ramifications of this change, they're not totally clear yet. But what is clear
is that this is going to be a huge battle.
Old school media does not like internet personalities because they're scared of us. We have so much influence and such a large voice.
Because the internet culture that grew up on YouTube,
not only has it gotten bigger than mainstream culture.
Why is this an article?
Because clickbait. That's why.
And not only does it distrust a lot of mainstream institutions.
If there's anything I learned about the media from being a public figure
is how they blatantly misrepresent people for their own personal
gain.
But over time, it's basically come to despise them.
I'm still here.
I'm still making videos.
Nice try, Wall Street Journal.
Try again, motherfuckers.
To hear more, go to
whatever app you're using to hear me right now.
Search for Rabbit Hole, two words, and hit subscribe.