Pod Save America - Offline: Alex Stamos on Leaving Facebook and Zuckerberg’s Reign
Episode Date: January 9, 2022. This week on Offline, Jon is joined by Alex Stamos, Facebook's former Chief Security Officer. As Jon's first guest who's worked at a social media company, Alex gives us a first-hand look at Facebook's internal politics, delivering insight on Russian hackers and the Haugen papers. He also makes the case that it's time for Mark Zuckerberg to step down. For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.
Transcript
Offline is brought to you by Splendid Spoon.
We all know we are what we eat,
but we also know how hard it is to plan and prep each week.
Splendid Spoon will send you plant-based,
gluten-free, ready-made meals delivered right to your door,
saving you a trip to the grocery store,
as well as the need to buy specialty ingredients
that can add up.
Splendid Spoon believes in plant-based eating
as the single most effective tool we have
to feel our best day in and day out.
When you make a habit out of plant-based eating,
it allows you to be the best version of yourself with side benefits like more energy,
weight loss, improved sleep, and better skin.
Sounds good.
Yeah.
Sounds good.
Have you tried the vegan meatballs and marinara?
Sounds delicious, John.
You should.
You should.
It's delicious.
Also, the strawberry chocolate smoothie.
Take one of those right now.
I'd take two.
Get started and save $35 on your first order of delicious plant-based meals at splendidspoon.com slash crooked.
That's splendidspoon.com slash crooked to save $35 on your first order.
That's only $6.66 per meal.
Thank you, Splendid Spoon, for sponsoring this podcast.
Was there a moment where you said to yourself, like, I think I have to leave?
This is, I've butted heads enough here and I got to go?
Yeah, I mean, in 2018, that's what happened.
You know, Facebook's this place that, like, if you go and you're, like, a lower down employee,
a new employee, it's like Care Bear land, right?
Like, everything's wonderful and the food is free.
And, like, the posters say things about people supporting people and stuff, right?
So it's like the Care Bears for 80, 90% of the company.
And if you're in the top 10% of the company, it's Game of Thrones.
I'm Jon Favreau. Welcome to Offline.
Hey, everyone. Happy New Year and welcome back to Offline.
So my guest this week is Alex Stamos,
who's currently director of the Stanford Internet Observatory,
where they do all kinds of research on the negative impact of technology and disinformation.
But it's Alex's last job that we spend most of this episode talking about.
From 2015 to 2018, he was the chief security officer at Facebook.
We've heard a lot about product manager turned whistleblower Frances Haugen in the last few months, but Alex was actually the first high-ranking employee to leave the company
over a dispute with other senior executives about how Facebook handles the spread of
disinformation on its platform. Basically, he wanted the company to be more transparent about
the issue. They disagreed. Stamos has plenty of criticism for his former employer, but
not too surprisingly,
he also takes issue with some of the criticism directed towards Facebook from media outlets and liberals like me.
We get into all that.
And we also talk about how he and his team first noticed that the Russian government was using Facebook to influence the 2016 election.
Why he decided to quit the company.
What the internal culture and decision-making process is like at Facebook,
why he thinks it's time for Mark Zuckerberg to step down,
and what changes he thinks would make the platform better.
It's the first episode we've done with someone who's actually worked at a social media company,
and I think it's a valuable perspective to hear.
As always, if you have questions, comments, or complaints about the show,
feel free to email us at offline@crooked.com.
Here's Alex Stamos.
Alex Stamos, thanks for joining Offline.
Hey, thanks for having me, Jon.
So I've been doing
this series on all the ways that the internet is breaking our brains, and you're the first
guest we've had who's actually worked at a social media company.
So you're here to answer for the sins of all.
Great, I appreciate that.
Thank you.
So you were Facebook's chief security officer
from 2015 to 2018.
Is that right?
What did that job entail?
So the majority of that job is kind of the traditional information security job, right? So, you know, I led a team whose job it was to keep people from breaking
Facebook, the company, stealing data, stealing money, doing, you know, kind of your standard
hacker things. And then there was a component that was about preventing people from doing
harmful things on the platform. So I had, for example, a child safety
investigations team that would do the investigations, kind of the worst of the worst of the
child safety issues. While I was there, we started a counterterrorism investigation team due to
mostly ISIS, but other terrorist groups who were doing, you know, kind of advertisement
on the platform. We had an anti-fraud team, and then we had a big team that worked on
state-sponsored hacking, so whose job it was to kind of track what Russia and China and Iran and
North Korea and other countries like that were doing to cause harm as well as to try to attack
the company to get information out of it. I was going to say, things got pretty interesting for you and your team in 2016. Yeah.
Can you talk about how and when you guys discovered that the Russian government was using Facebook to influence the election in the U.S.? So there are really two totally different kinds of online Russian campaigns by two totally different Russian groups, for which the connections between them are actually pretty loose.
There's not a lot of good evidence,
at least in the unclassified side that we have,
that those groups are working together.
So the part that a lot of people talk about
from a disinformation perspective
is the troll factory, right?
So the Russian Internet Research Agency
is a private company that belongs
to one of Putin's oligarchs and pushes propaganda in support of Putin.
Most of their propaganda is actually in Russian, but they do have divisions that focus on the United States, Western Europe, Eastern Europe and such.
And then there's a totally different section, which is the hack and leak campaign by the GRU. The GRU is actually part of the government; it's the Russian military.
And they have both hackers and disinformation actors, but they have the actual hands-on
hacking capability, which is something that the Internet Research Agency and the other troll farms
lack. And so that had a very different texture, which is those were the attacks against John Podesta, against the DNC, the DCCC and such, as well as then the leak campaign of
using wiki leaks and then a variety of fake outlets that they created to push the propaganda
that they wanted. So we actually saw the second one first, right? So often if you're a professional
hacker working for a government, the first step in what we call the kill chain, right, which are the steps a hacker has to take to do what they want to do is reconnaissance.
And so reconnaissance will often be done on social media to try to find out, you know, everything about a target and figure out the ways that you can possibly attack them.
Right. Can I text them here? Can I send them a message here? Stuff like that. And we saw reconnaissance activity in the spring of 2016 by accounts that
we were able to tie to GRU. We had a dedicated guy who all he did was track this group and had
been doing so during actions in Ukraine, during attacks against the World Anti-Doping Agency,
stuff like that. And so then when did you start sort of disagreeing with other Facebook executives
about either the transparency of like how to release this information, how to talk about it,
the spread of disinformation on the platform itself?
A lot of the problems that Facebook is still facing, I think, were set up in that kind of
August, September of 2017, where there have always been a couple of different,
you know, people talk about, Facebook wants this, but the truth is, you know, a big company with
50,000 people will have all these different groups that want different things. And the big fight has
been since that time, should we be open and transparent and honest about these failings
to demonstrate: one, to help other people figure it out; two, to coordinate with others; and three, to kind of demonstrate forward movement. Or is it better to just be kind of,
to say the minimum amount possible and to, you know, keep as much close to the vest as
possible. You know, one of the things that complicated the whole thing was we had privately
given the data to the Mueller team that summer, right? And Mueller, as an actual prosecutor,
was able to go to a judge and get a
search warrant. And then he took all this stuff, and then he gave us a gag order saying that we
weren't allowed to talk about the fact that we had given it to the Mueller team. And then, you know,
Facebook comes out and says, hey, this bad thing happened, but we're not going to share it with
anybody. But also can't say at the time, hey, but because we're working with the Mueller team.
And it becomes this big back-and-forth where the company takes like the most aggressive interpretation of the privacy laws around this to not share it with
Congress. And this becomes like this, you know, big brouhaha that eventually the data gets shared
to Congress and then Congress can share it with other folks. And that's how we're able to now see
all of that. But that that fight at Facebook has happened continuously since then. Right. And,
you know, I I've always thought that
being open about this stuff is the only way that we're going to fix it because this is a society
wide problem, and you can't, you know, a company by itself cannot go do this, right? This requires
not just the private sector to work together, but the private and public sector to work
together. And when you try to control everything and make it about the PR outcome, it makes it
very hard to fix the problem. And from my perspective, like, if Facebook wants to fix their PR problems, the best thing
to do is to fix the underlying issues themselves. That's actually the shortest path.
Like you can't really spin your way out of these things.
But that's what started there.
Now we've had many, many examples since then of that kind of fight happening over and over
again.
Well, I was going to say, like, you know, the specific Mueller gag order issue aside, why do you think specifically the product, comms, and legal side of Facebook is always erring on the side of less transparency and not more?
So there's a couple of, like, kind of fundamental organizational flaws at Facebook that I think are real problems.
One is the Mark-Sheryl split, right? So everybody thinks Sheryl Sandberg is the second most powerful person at Facebook. She's like the 10th or something, right? The truth is
there's a bunch of people who run product teams, or don't even run product teams, who are lower
down, but who do important things like metrics and such, who actually have more power over
what the real impact is, because Facebook's real impact is what is built into the products, right?
Sheryl has the legal department, the comms department, the policy team.
They're kind of the clean up on aisle five folks.
At the time, my team reported to Sheryl,
which is actually kind of another fundamental problem,
is that you had the, you know,
we were part of the team of kind of cleaning up the product mess.
And so that kind of division, I think, is really harmful in that kind of the idea is like, oh, you can throw this stuff out
in the world and break a bunch of things. And then these people will clean it up. And that's
not our responsibility, is kind of the feeling on the product side. The other kind of fundamental
policy problem that I think goes directly to this issue is that the comms and policy team is huge, right? So there's this humongous team
that now reports to Nick Clegg, the, you know, former deputy prime minister of the UK, but was all
big together at the time. And that included the people who were trying to decide kind of
substantively what the rules should be at Facebook. And I think there's actually a fundamental problem
here, in that, you know, their goal is to keep governments happy with the company, so the company can operate everywhere and
make money everywhere. And I think probably the biggest kind of driver, you know, there's a
natural kind of comms reaction of, you know, this phrase, like, "let's not break into jail":
if we don't release things, then people won't talk about it. Like, that has failed over and over again
for Facebook, right? So, like, I hope people don't say that inside anymore.
Never a good political comms strategy;
doesn't seem like a good corporate comms strategy either.
Right, right.
Yeah, it doesn't work.
And I think like we probably talk about this,
like I think there's probably a lot of parallels
between Facebook and government,
both from like all the different actors who are fighting,
but also of getting pulled into these policy discussions
for which there's no real right answer.
And so no matter what,
you're going to be in trouble with somebody.
So you can't, like, make everybody happy. And specifically at
the time, you've got this new Republican administration, you've got, you know,
Republicans in Congress. And so you have a policy team whose job it is to make the
government happy, which is then very close to the comms team. And so you end up with this general
strategy of, Hey, let's not get involved in this because at the time it was like super controversial to talk about Trump and Russia,
right? That was seen as like a really political thing. And so their impulse was, well, instead
of us coming out and talking about this publicly and honestly, it's better to just avoid it and
stay out of it because we don't want, you know, we don't want to take sides. And that's like a
constant problem at Facebook is people not wanting to take sides when one side is right and one side
is wrong and neutrality is actually the problem. Yeah, no, I've noticed that from
the outside. I mean, when you say that their job is to keep governments happy, it sounds like what
you're saying is their job is to make sure that a government, whether in the US or elsewhere in
the world, doesn't get so pissed off that they regulate Facebook in ways that Facebook doesn't
want. Is that right? Yeah. I mean, yeah. I mean,
keeping a government happy is regulatory, but then also just kind of the administrative state, right? Like
even if you don't actually pass laws, there's all kinds of powers governments have to punish
companies and make things hard for them. And so, you know, just like any other government affairs shop,
they're trying to keep... and when I say governments, that's the ruling party, which is the really
scary thing, because, you know, governments don't necessarily reflect all or even, you know, a majority of the people in the country.
And so if you are just keeping the ruling party happy, you end up twisting yourself in all these decisions.
And so I think a lot of it came out of like, let's not piss off Trump.
Let's not piss off the Republicans. Let's not take a side, because if we're being honest about this stuff, then that's kind of taking a side, which I think is ridiculous and has really hurt the company.
But that's kind of this impulse you see over and over again.
Was there a moment where you said to yourself, like, I think I have to leave? This is, I've butted heads enough here and I got to go?
Yeah, I mean, in 2018, that's what happened. Effectively, you know, I was trying to fight to kind of make my
team much more effective. I was, you know, trying to get out from under Sheryl, be closer to the product
teams, have the ability to actually affect things, massively grow the number of people who are
investigating and working on this stuff. And some of that stuff ended up happening, but
effectively there were a handful of executives.
You know, Facebook's this place that, like, if you go and you're, like, a lower-down employee,
a new employee, it's like Care Bear land, right?
Like, everything's wonderful and the food is free.
And, like, the posters say things about people supporting people and stuff, right?
So it's like the Care Bears for 80, 90% of the company.
And if you're in the top 10% of the company,
it's Game of Thrones, right? And I was the victim of a very smart political machination
by somebody who basically said, oh, well, I can take care of the problems Alex is working on,
but I can do so without pissing you off, aka telling the truth, right? And so there were
basically a bunch of decisions that were made that I would not be able to make things any better.
And so I had to leave.
Offline is brought to you by BetterHelp. Connect with a licensed counselor in a safe and private online environment. It's so convenient you can start communicating in under 48 hours.
It's not a crisis line.
It's not self-help.
It is professional counseling done securely online.
You can send a message to your counselor anytime.
You'll get timely and thoughtful responses.
Plus, you can schedule weekly video or phone sessions all without ever having to sit
in an uncomfortable waiting room.
BetterHelp is committed to facilitating
great therapeutic matches
so they make it easy and free
to change counselors if needed. It's more affordable than traditional offline counseling and financial aid is available.
The service is available for clients worldwide. Find the particular expertise you need online.
Don't limit yourself to the counselors located near you. You can find licensed professional
counselors who specialize in everything from depression, stress, anxiety, grief, self-esteem.
Anything you share is confidential. It's convenient, professional, affordable.
You can check out the testimonials posted daily on their site.
BetterHelp is not a crisis line.
In fact, so many people have been using BetterHelp.
They are recruiting additional counselors right now in all 50 states.
Hey, John, how was your week?
My week was, well, it's only Tuesday, but so far so good.
He's dissembling.
Oh, now I got both of you. It's rare to have two people in therapy as the counselors. Usually it's one.
I'm sure that's gonna work well.
You're bad, John. Wanna pick up on anything we talked about last week?
Are you guys gonna play good cop, bad cop?
Don't pigeonhole me. It's bad cop, lost cop.
He doesn't want to talk about anything.
Anyway,
thanks for your help.
I want all of you
to start living a happier life today.
As a listener,
you'll get 10% off your first month
by visiting our sponsor
at betterhelp.com
slash offline.
Join over 1 million people who have taken charge of their mental health.
Again, that's betterhelp, H-E-L-P dot com slash offline.
Support for this podcast and the following message comes from Disney+.
With National Geographic's Welcome to Earth starring Will Smith, a six-part original series
now streaming on Disney+.
Tommy, I know you've seen this.
I've seen them all.
And you loved it. It's very good. It's Will Smith doing shit that's legitimately quite dangerous, and he does it with cool explorers. Like, they're not your typical explorers. Like, there was
one guy who was blind, there's one guy who's missing half a leg, and they're still, like,
rafting down rapids. It's crazy. It's a good show. Check it out. Very cool. Check it out.
Through the lens of Academy Award
nominated filmmaker Darren Aronofsky
and with some of the world's top explorers,
Will embarks on an ends of the earth adventure
to explore the planet's greatest wonders.
Welcome to Earth.
All episodes are now streaming
only on Disney+.
Offline is brought to you by Wealthfront.
The beginning of a new year is a great time
to finally start things like diets,
workout routines, or thinking about your financial future.
Even if you don't plan on getting off your couch in 2022, you should at least, at the very least,
do one responsible thing while you sit there. Check out Wealthfront.com. You can start investing
in no time with Wealthfront's classic portfolio or make it your own with things that you care
about, like socially responsible funds, technology, crypto trusts, or hundreds of other investments. Wealthfront was designed by financial experts to help you turn
your good ideas into great investments without the hassle of doing everything yourself. Don't
want to spend hundreds of hours trying to lower your tax bill? They help you do that. Not sure how
to rebalance your portfolio, or even what rebalancing is? They do that for you automatically.
Pretty heavily into Theranos. Oh, yeah.
Good for you.
Looking for a rebalancing.
Get in the ground floor.
Yeah, there you go.
You're in the DeVos family.
Buying the dip.
Buying the dip.
Wealthfront is trusted
with over $28 billion in assets,
helping nearly half a million people
build their wealth.
Best part is their product
is so simple yet powerful.
It's got a 4.9 out of 5 stars
in the Apple App Store.
Try to beat that, huh?
To start building your wealth and get your first $5,000 managed for free for life, go to wealthfront.com slash crooked.
That's W-E-A-L-T-H-F-R-O-N-T dot com slash crooked to start building your wealth.
Go to wealthfront.com slash crooked to get started today.
I talked to Charlie Warzel about Facebook for this series. Yeah, yeah, I saw that. I thought
that was a good interview. Oh, thank you. Well, he said in that interview that he
thinks there's very little cynicism in the upper ranks of Facebook management, because
most people are true believers in what they're doing, that they basically think they're doing
the right thing. Was that your experience? Yeah. That is definitely... lack of cynicism is not the term I would use, right?
I would say, one, you know, Mark Zuckerberg really does truly believe that letting everybody
in the world talk to each other is a good thing, right?
Now, I'm not the right person to ask about this because I have spent my entire career
working on the downside of letting people use computers, right?
Like I'm a total Luddite.
Like all I've done is security and safety.
And so once you've worked on the sexual abuse of children online, once you've worked on counterterrorism issues, it is very difficult to feel good about the Internet.
And so I'm way farther on the other side.
And I think there's a truth somewhere in the middle.
Right. But Mark really believes that.
And one of the basic problems at Facebook
is it's quite a small-c conservative company
in that the people who run it
have barely changed in like 12, 14 years, right?
One of Zuckerberg's smart moves
in the beginning when he started Facebook
was he realized that he was just a kid
and there's a bunch of things he didn't know.
So he surrounded himself with all these adults
from Silicon Valley who knew, how do you sell ads?
How do you build a business?
How do you hire people?
How do you build data centers, right?
How do you run multimillion computer infrastructures?
He had no idea how to do that being like a Harvard dropout.
He had a product idea.
And so he surrounded himself with those people
and that group has barely changed since the IPO, right?
Since, like, 2010, 2011. And so I think that's like one of the fundamental problems, is that, one, those people
believe everything. But the other problem is that there's all these decisions that kind of made
sense when Facebook was a scrappy upstart, right? When you're like, oh no, Google plus is going to
defeat us. Or when people are saying, hey, the Instagram acquisition is crazy, or mobile is
going to crush Facebook. Facebook will never be profitable. The IPO is a disaster. And so they make all these decisions when they're a scrappy upstart. And
then once you're like the dominant platform and you are intermediating most of the human
interactions on the planet, those decisions don't make sense anymore. Right. But the people who made
them are exactly the same. And it's really hard for them to go back and to be like, oh, maybe I
need to change my mind now. And I think like,
again, like, it's this small-c conservatism. There's all this talk about how the good old days
were the good old days, right? And, like, oh, things were much better when we were smaller
and stuff. And it's kind of like, the company's incredibly successful. You can celebrate
that, but then you have to recognize like, man, we have a huge amount of responsibility now that
didn't exist back in those good old days. And so things do have to be different. Like you do have to have more bureaucracy in the things that
product people complain about, because when you have more responsibility, you've got to spend a
bunch of time thinking about what bad things are going to happen. And I think that's like a
fundamental problem that has not been solved is that the turnover at the top ranks is incredibly
slow. And those people create kind of this bubble where Zuckerberg gets to be detached.
This is the other issue.
You know, Mark's never had another job, right?
Like this is all the guy's done since Harvard.
And so he never had like a real kind of young adulthood
where he went into a company
and he saw how a company was run and stuff.
And I've seen a bunch of companies
and like really well-run companies,
the CEOs know they're in a bubble and they pop the bubble.
And Mark does not pop the bubble.
He's okay being in this bubble of people who are telling him, you know, not necessarily what he wants to hear, but they are formatting things in the way he wants them to be structured.
Right.
And that will never be, there's a fundamental problem in, like, your business plan.
It's always going to be, here's a metric that we can make better.
Yeah, it does seem, from my conversations with people higher up in Facebook that I've had too, that he believes in his mission.
He's a little clueless about everything else that's going on.
When he gets public criticism, he's pretty stubborn about it.
And he thinks that because what he's doing is right and good, and he's getting unfair criticism, and the people close to him are telling him that it's unfair criticism, he thinks to himself, well, I'm just a computer guy,
I'm just a product guy, and I didn't think I was going to be the CEO, that that's fine, that that's all he has to do.
Yeah, it's a real problem.
And I've said this publicly for years now.
Like I said to Kara Swisher on stage a couple of years ago in Canada, he just shouldn't be the CEO.
And he shouldn't be the CEO for a couple reasons.
One, he has made all these decisions that have to be revisited, and so it's time for somebody to come in.
They need a major culture change, and it's very hard to change the culture of a company while it's the same people.
There's a bunch of tech companies that have had really great second acts. And the best example is Microsoft. They were kind of stuck in their Windows monopoly world of the 1990s, because it was, you know, Bill Gates,
and then Steve Ballmer, who was part of that inner circle. And it was not until
those guys were gone and Satya Nadella came in that it was like, we're going to redo everything,
right? And they're kicking butt in a lot of ways. And they fixed a lot of the
problems that they created in society. And Facebook needs kind of the equivalent cultural change.
I have never seen a company do that
while the same people are in charge.
And so I think like no matter what,
he probably should not be CEO,
partially because he also,
he's just really bad at representing the company, right?
It's like a fundamental job of a public company CEO
is you have to be able to go to the Senate
and take a bunch of crap from people
who don't really know what they're talking about that well, and who perhaps, you know, some of their
criticisms are correct, some of them are totally off base and crazy, read: Ted Cruz. And,
you have to kind of sit there and take it and then be able to represent the company well,
and he can't do that. So it's like, if he didn't own a voting majority of the shares,
I think like a very reasonable board of directors would say you're failing at the fundamental job. And I actually think they kind of missed an opportunity here because, you know,
they became Meta. And when Google did this, again, it was that kind of shift where Larry Page went up
into the stratosphere of I'm going to be the CEO of this holding company and I'm going to be able
to work on rockets and all kinds of fun stuff. And then they hired a manager to run Google,
the big part. And so there
was an opportunity, I think, when Meta happened that he could go play with his virtual reality
and go do the fun stuff that he obviously wants to do and then let somebody else take over as CEO
as Facebook, preferably from the outside, and kind of revamp it culturally and fix a bunch of
these problems. And they didn't do that. He's both the CEO of Meta and still running
Facebook day to day, in a way that I think is really unhealthy. I mean, do you see a scenario where he ever steps down? Like,
is it possible that there's internal pressure from within, especially after, you know, the Facebook
papers and everything over the last several months? It doesn't seem like anything has changed
or there's a desire to change there. From my perspective, it seems like they were just trying to get through
that last controversy and hope that it quiets down and then continue on their merry way. Yeah, no, I
mean, I think the strategy is working out. Like, the company's never been richer, it's never been making
more money, it's never had more users. They're not facing really true regulatory pressure, at
least in the United States, because, I mean, the fundamental thing is that Republicans and Democrats
agree that they hate Facebook, but it's for completely opposite reasons, right? And they
finally figured that out. There's this weird period, I remember this very distinctly,
where Elizabeth Warren tweeted this big criticism
of Facebook and then Ted Cruz retweeted hers. And I was like, that can't hold, right? Because
yes, they're both angry, but they want completely different futures. And, you know, they are facing, I think, some reasonable antitrust pressure.
And so, you know, if there's an antitrust move and you end up breaking off the cool new sexy
stuff, then I think there is a world where Mark goes with it, right? Like, if there was an Oculus
metaverse spinoff. Because in the end, he's like a nerd who likes building stuff.
He has a product vision and he's quite good at kind of predicting where things
are going to go from a product direction.
That's what he wants to do.
Not like deal with this huge social network with a gazillion languages and all
these people causing all these problems. So in that case,
he might go with the smaller company.
But that's the only situation I could think of.
I mean,
the guy's going to live for a long time. He's very young still. And so you don't really have this Bill Gates situation
where Gates finally gave it up and decided he was going to give all his money away.
Mark's already giving his money away. His wife's doing all the good work, right? And so I think
part of his long-term plan is like, Priscilla is going to save my reputation by curing cancer or
whatever. So yeah, I'm not sure. And there's no good way to create
that pressure from the outside because the board of directors is captured by his votes.
Well, so there's the obvious leadership issues, but I'm interested in your take on Frances Haugen
and the document she leaked. To me, I think the most damning and compelling takeaway is that
Facebook is not just some neutral public square where people are acting like people,
that the company is making intentional decisions about the information that we see and that we
engage with. And the effect of those decisions, whether intentional or not, is often to amplify
the worst parts of human nature. Would you agree with that? Well, that last one. So this is the
problem is we don't know what the Haugen documents say.
So there's like 1,500 documents, something like 30,000 pages of documents and only a couple dozen.
The couple dozen that were like the steamiest and the, you know, that were the most interesting to a handful of news outlets got published.
Right.
I've seen a couple of the documents that have been published.
A majority of those documents are people actually doing a good job, which is: bad things happen online, we should study how it happens, we should come up with mitigations, we should test those mitigations, and then we should see what the outcome is. And that's what you want, right? That's actually what we want. And they could have gotten ahead of them by being much more public about this, right? Like, most of these documents, if they were published in a blog post or in a journal or something, would not have been a scandal.
And when you say they would reveal that people were basically doing a good job,
are you saying that they would reveal that people were taking steps to reduce
the potential harm that the platform caused or that people on the platform were causing?
Right. So these documents exist because Facebook has created this org called Integrity, right?
Which is a bad word, but like is effectively...
It's like a trust and safety.
It's exactly.
Integrity is the Facebook term for trust and safety.
They should just call themselves trust and safety instead of...
Facebook always has to come up with their own branding for everything.
But anyway, it is because Facebook has like by far the largest trust and safety team in
all of social media, right?
And so there are more
quantitative social science PhDs doing research on Facebook's negative effects than in probably
the rest of the industry combined. And so now, are those people always effective? And I think
one of the outcomes from the documents that I have seen is that there continues to be this problem
that understanding what is going on does not help you if you run into either a policy team who wants you to be neutral or if you run into a growth team that doesn't want to do anything that hurts their metrics.
Right.
And so I think that is like a continuous problem of, you could say, we've noticed this uptick in some kind of bad behavior.
We have some ideas to fix it.
And then you have that getting vetoed by a growth team that's like, well, your idea to fix it hurts our metrics by one and a half percent, so we're not even going to consider it, right?
We're not even going to test it. And so I think that continues to be a fundamental
problem. Um, but you know, again, this is kind of like the direction we want to go. And the
documents are not really well-timed here for the industry because it is good that Facebook has
started on this path. They have not gone there yet. And now we're at this junction where everybody else is watching and is trying to decide, like, is it worth it for us
to know what the bad things are? If knowing is what creates the possibility that all of a sudden
we're going to get criticized for it, which is like, again, this constant battle inside of
Facebook of, like, publicly identify and fix our problems, or just kind of cover it all up and
hope that it passes us by, it's, like, which one of those teams is winning? It looks like the pass-it-by
team, from reading the tea leaves on the outside, from the people who have quit and the people who
have been fired. There's a bunch of people who work on transparency and integrity stuff that
are now gone. And so on the overall, like, I think it's a much more mixed bag, because
we want them to do this work, right?
And most of the companies in the world, media and social media, don't do it, right?
That's the other kind of great irony here is that we're reading about all this in the Wall Street Journal, which belongs to Rupert Murdoch.
The idea of Rupert Murdoch having a civic integrity team, like, measuring what is the outcome of the Wall Street Journal and Fox News on American democracy, is just
laughable, right? Yeah, they're never going to do that, right? And so we kind of want this
work to happen. We should be critical when the work does not make it into actually fixing
problems. We should not be critical when it's just, okay, well, you identified a problem, right? Because
here's also the truth here: every single platform has these, right? Facebook is not the
largest advertiser in the world.
They're not the biggest website in the world anymore.
Right. TikTok is actually the biggest in the United States now.
YouTube has more time spent.
Facebook, I think, has the most unique users when you add up all the products.
Right. So it is the biggest by one measurement, but not by all of them.
All of them have this, but most companies don't look.
And so I think we need to create an incentive structure for there to be transparency and kind of inner self-reflection on these problems and not
punish that self-reflection by just saying these documents say that they're bad.
That I completely agree with. Like stipulated, it's good that Facebook is introspective like
this and tries to find and identify problems on the platform. Every social media platform should do the same thing.
The ones that aren't should be criticized for that.
I think, at least in the documents that were leaked in the stories,
the problem seems to be when researchers, employees within the company,
can say, hey, you know what?
This is a problem with the product.
This is an algorithm that, if we tweaked it this way, would cause less harm,
and then running up against a brick wall of senior leadership saying, no, we're not going to
make that change because we want to prioritize growth over safety, right? And so, just to be more specific, which revelations from those stories do you think
Facebook deserves criticism for?
And what do you think that Haugen and maybe the media got wrong about Facebook?
And what criticisms do you think are unfair?
Maybe just split it into two categories.
So again, of the documents that have been publicly discussed, I really want to get access to these.
And I want them to be public so we can have a public discussion.
Because also, there's a huge amount that we can learn both at academia and other companies can learn from learning from facebook's mistakes right um look i will i will
just say that it's always bothered me that like the the press strategy to deliver documents to
a media consortium and not to just make them public so that everyone could see them has always
bothered me and i don't really understand them i i get why she didn't just make it public without
doing any kind of censure because you have all these names of people. And the truth is, so I had people who
I was doxxed by ISIS on a telegram channel, right? Like I had people who had got death threats from
ISIS. So you really don't want of like, here is one of our low level employees who writes a report
about ISIS all of a sudden is now getting death threats at home. And so like, I, as you can
imagine, I support that because there's probably, like,
things in there about the Ministry of State Security
in China or the GRU that have my name on it
where I'm saying not-so-great things
about people who could, you know, make my life worse.
Yeah, yeah, you don't want to piss those people off.
Yeah, no, I get that.
You've been there.
You've been there.
I've been there.
Yeah, yeah, right.
And so, like, I work for Mike McFaul, right?
Like, I've seen what that's like
to have the GRU on your ass.
Oh, yeah.
And so, anyway.
And so, but yes, I think there is an alternate universe
in which you go through a reasonable step
of getting rid of people's names and stuff like that,
and then they are released on a rolling basis publicly,
or at least to a group of academic groups
that can then release them publicly later.
But that pisses me off.
Anyway, so of the things that are the worst.
I mean, the worst one for me
is the one that was always a problem, and I think is a continuous problem
at Facebook, which is, there's an article in there about the underinvestment in kind of basic trust and
safety outside of English, in the developing world, right? Which is, the amount of
investment... you know, the sad part is, like, the Facebook that we get as Americans is the best
one that exists on the planet. Maybe the German one's a little better depending on a couple of different metrics, but, like, English and German are the ones for which
a lot of work happens. French, a little bit. And then outside of those languages,
it drops off precipitously. And so that specific one was about human trafficking,
where there are a bunch of different components that are happening on Facebook, but the most
egregious was that the people were advertising the jobs to pull people into this
trafficking. They're advertising jobs on Facebook. Now, you know, doing a job advertisement is
obviously not illegal, but this guy was able to pull together that, like, the people doing this job
advertisement are then, on the back end, basically selling these people as slaves,
right? And so this is the kind of thing that, if there was proper investment, you could take care
of. And this is one of my kind of issues overall
with kind of the media discussion of Facebook
is that 99% of it is about disinformation,
like among like kind of the elite corporate,
mostly liberal media in the United States,
99% of it is about political disinformation in the US, right?
Which is bad, but disinformation is the hardest
of the problems here
because there are legitimate free expression concerns and legitimate concerns around kind of the
political power you want the platforms to have.
There's a bunch of bad things that happened for which there is no
counter-argument,
right?
Like human trafficking should not happen.
The abuse of children should not happen.
The organizing of terrorist attacks should not happen.
And so I think, of all the articles, that is the one that kind of hit home for me with my personal
experience: the investment outside of a handful of countries, the ones that have really high media
criticism and really high kind of potential regulatory problems, is just way too low,
and that causes a lot of unnecessary suffering, especially in the developing world.
And you could fix that problem just with more investment, more staff, more resources in these other countries. This is not like one
of those really tricky problems to solve. Right. You don't have to decide like what is vaccine
disinformation, which is an angels-on-the-head-of-a-pin kind of question that 50 lawyers at
Facebook will argue over for a month. Deciding human trafficking should not happen on the
advertising platform is pretty simple. All the rules are there. You just need the investment in the people and the languages
and the skill sets to pull it through at the end. Part of the problem there too,
is that a lot of the investigatory work at Facebook is in a loop with the public sector,
right? So like when we found a bad guy hurting children in America, we were able to get law
enforcement on like that, right? If we found a potential terrorist attack in Western Europe,
that guy was arrested like that.
When you're dealing with some of these problems
in the developing world,
it is hard to get law enforcement to get involved.
And so what happens is, instead of continuing with, like,
well, we're gonna do the best we can on our own,
I think there's a little bit where that loop has to exist,
where law enforcement comes back to you
and says, this is a problem and I wanna work on it.
So I think Facebook needs to kind of double down
in the places where
maybe we can't get this guy in jail:
we are going to double down on mitigating the harm,
even if this person is going to continue to exist
and we know they're going to come back over and over again.
Offline is brought to you by Chili Sleep.
Science tells us that the best way to achieve and maintain consistent deep sleep
is by lowering core body temperature.
Temperature-controlled sleep restores testosterone levels,
repairs muscle after a hard day's work, and improves cognitive function.
Chili Sleep makes customizable climate-controlled sleep solutions
that help you improve your entire well-being.
Chili Sleep makes the OOLER and Cube Sleep Systems,
hydro-powered, temperature-controlled mattress toppers
that fit over your existing mattress to provide your ideal sleep temperature.
These luxury mattress pads keep your bed at the perfect temperature for deep sleep,
whether you sleep hot or cold. These sleep systems are designed to help you fall asleep,
stay asleep, and give you the confidence and energy to power through your day.
Imagine waking up and not feeling tired. Chili Sleep can help make that happen. I love my OOLER.
I did not sleep well for the two weeks. I guess it was more like one week I was gone,
and when I got home, instantly slept better.
Same.
Sunday night, I got home, got in that bed, had my mattress, had my OOLER on top, felt cool,
warmed me up in the morning.
It's fantastic.
Head over to chilisleep.com to learn more and check out a special offer available exclusively
for offline listeners and only for a limited time.
That's Chili, C-H-I-L-I, sleep.com slash crooked to take advantage of our exclusive discount and wake up refreshed every day.
I want to tell you about a podcast called Decoder with Nilay Patel.
It's a podcast about big ideas and other problems.
Nilay, the co-founder and editor-in-chief of The Verge, talks to CEOs, policymakers, and business and technology experts about how their decisions affect the world.
He's asked Instagram's Adam Mosseri about how much responsibility platforms have for what their algorithms promote,
Mark Cuban about social networks and Section 230, and Senator Amy Klobuchar about antitrust reform and its limits.
Nilay has been interviewing leaders from the worlds of technology, business, and policy for over a decade.
In past episodes, he's covered the global chip shortage, why charging your phone is such a complex business,
and why Olivia Rodrigo keeps giving away songwriting credits on her new album.
Not really new anymore.
Yeah, not that new.
Still a great episode that I can't wait to listen to.
Great album, though.
Nilay asks business leaders how they run their organizations, make decisions, and consider policy
that could fundamentally change entire industries. And in every episode, he finds the thoughtful,
critical questions you won't hear anywhere else. Follow Decoder wherever you find your podcasts.
New episodes come out every Tuesday.
What about the story regarding the meaningful social interaction algorithm there?
That was one that really got me because I feel like it gets to the core of at least my problem with Facebook, which is not so much about like let's moderate content everywhere, but they are specifically prioritizing in news feeds and people's feeds.
Right.
content that gets people angry, enraged,
etc. Well, that's not exactly what that story said. And so this is where I get a little... like, both the
Wall Street Journal and the Washington Post stories here were not really matching up with
what the documents said. And so, like, this is where I think it gets pretty complicated. Part of it
is, whenever you make these changes, the company doesn't, like, make a change in a vacuum.
And so I think, like, with the meaningful social interaction change,
when you make a change like that,
even if it's well-meaning, there will be an outcome.
And so, yes, I think that like the bad part there
is that clearly there was like the growth team
is putting their finger on the scale
in a bunch of different situations
where their metrics,
they believe have to monotonically go up, right?
Like you can never go
backwards, it is not okay to make a trade-off. And that is, again, a fundamental kind of Zuck problem,
is that he allows the growth team to have that power without holding them accountable for what
the outcomes are. But I want to caution against the idea... there's kind of this...
you're using me as a stand-in for all of Silicon Valley, so I'll use you as the stand-in for all
liberal media, right? Please. And so, like, there's this first kind of liberal
media bubble of, everything's Russian propaganda. 99% of the bad stuff that happens is Americans
doing it to ourselves, right? So that bubble's popped. You're past that, right? Yeah, right? Like,
everybody says it wasn't them, but you go roll the tape back to 2017, and a lot of people... No, I know, a lot of people go, yeah, everyone gets swept up in it. I get it, everyone gets swept up, right? The second bubble was kind of like, okay, well, and this bubble continues
to exist, you can content-moderate your way out of this, right? That, like, as long as you enforce your
rules, then the bad stuff I don't like will go away. Now, that should be true for a bunch of things
which are not kind of politically difficult, but for the things that a lot of, you know, people are
talking about these days, like vaccine disinformation, political disinformation, it's not that easy to come
up with those rules, right? And that's where, like, the adversarial stuff gets
really tough. And now there's kind of this new one of like, it's just the algorithm. If you just
tweak the algorithm, all this bad activity will go away. And I'm just gonna tell you, that's not
true. One, this happens on every single platform, right? So, like, everybody has all these
different algorithms. And one of the Haugen documents that was leaked is actually an experiment that Facebook did where a certain percentage of
people saw reverse chrono, right? Which is just all the stuff that people have shared who are
friends of yours on Facebook, in reverse chronological order, no ranking at all. And that
was worse by almost every metric of badness than the ranking, right? Which makes sense, because people
do things like,
if you're a spammer, you do a lot of spam.
If you're a clickbaiter, you do a lot of clickbait.
And so it is quite complicated of,
if I tweak this algorithm, you can make it better.
And I think that's like this bubble
that continues to inflate
because people just kind of imagine, like,
all these bad problems will go away
if you're able to figure out the algorithm and tweak it.
And I just don't think that's true.
I don't think that's empirically,
there's not empirical evidence for that, either in the Haugen documents or kind of in the public academic literature. I don't necessarily think that if
you just tweak the algorithms, you're going to fix these problems. But I think that what has
been revealed is that what you're seeing on Facebook is not just... it's not just a public
square where everyone just gathers and shares news. Like, the team does have its hand on the lever.
Like, it can deprioritize this or prioritize this. So, like, there's a lot of power here.
And I think the fact that if you make one change it causes one problem, and
then if you change it back it causes another problem, just speaks to the fact that something
has been created that is so unwieldy that even the people in charge of it can't fully control
the harms that it causes, which is probably a bad thing for society. Well, the harms that it causes, so
Facebook causes, you're saying? It's people.
No, no, no.
What I'm saying is
we have to separate out
the things that Facebook makes worse
from the things that are
a reflection of society.
And that's the,
that is now the kind of overall
media liberal bubble
that is inflated
is the idea that like
you want the social media companies
to make people better.
And I actually,
I think that is like
the scariest direction here.
Yeah, I don't think their job is,
their job is not to make us better.
I think they're doing a lot of amplification of bad information, bad impulses, stuff like that.
But, like, I don't think that... look, I mean, the other thing, too, is can they really be neutral?
Right. Like, you know, you've talked about this in other countries,
right? Which is, like, Facebook's policy in other countries is, we follow the local law. Well, what happens when
the local law is supporting an autocratic regime? And does Facebook really want to say that they're
neutral in the global struggle between autocracy and democracy right now? Well, by their own
mission statement, you know, bringing the world
closer together and building community, it doesn't sound like that's neutrality. It sounds like that's
on the side of democratic values, but it doesn't seem like all of their internal policy decisions
either here or around the world necessarily match that. Right. And that goes back to way too many
of the policy decisions are being driven by the government affairs responsibility, which outside of the United States,
this is way worse because at least in the US,
when Trump was president,
his power to get Facebook to take down his opponents
and such was pretty limited, right?
Yeah.
What you see right now in India, especially,
so India is by far the most important country,
I've said this for years,
that it is the most important country
from a regulatory perspective
and just predicting the future of the internet.
And right now you've got Modi
who is trying to recreate kind of the power
that the People's Republic of China has
within a democracy
using the fact that he is democratically elected.
He legitimately has the support
of hundreds of millions of people.
And he has very few legal
and constitutional restrictions to the kinds of powers he can have. And you are seeing a future in which the American
companies, Facebook and Twitter in particular, and now YouTube, all of them saying, well,
we follow local law. India is a democracy, and so we're going to follow their law. And the
outcome is support of autocracy, right? And so, yes, I mean, I totally agree. And I think that is
the neutrality argument here is both punting responsibility and also happens to be in the
economic benefit of the company, because you can't make money unless you have ad salespeople and all
the folks that you can have in a country. And so the future in which Facebook pulls out of India
is just completely, you know, impossible to imagine from a financial and a growth perspective.
But like not imagining that then puts them in this trap where they're like, well, then the only other option is we follow what the local law is.
And the outcome of that is going to be really, really negative. If you were Mark Zuckerberg, what internal policy changes do you think could and should be made to make the platform a better place?
Right. I mean, so first I would quit and spend my money, you know, live on Kauai and let somebody else take care of and fix these problems, right?
Finish buying an entire island in Hawaii and just hang out. Yeah, OK.
All right. So I think the policy team has to be split, right? Like, you have to have the government affairs people not be in charge of
platform policy. I think you need to specifically have somebody very high up whose job it is to represent the human rights responsibility here, and so get that out of the general comms and policy argument, and have the platform policy people, the people who decide
what is allowed and what is not allowed, you know, have a specific set of human rights goals that you
want for them that effectively, like you said, are not being neutral between democracy and autocracy,
right? That you're going to say, we're going to be on the side of pluralism and democracy,
even in situations where that means going up against a democratically elected government. That's the step there that's complicated, right? Because saying, well, the guy's elected, right, like Duterte was elected or, you know, Bolsonaro was elected, so we should do what the guy says, is not actually in support of democratic values. Right. And so I think you have to split that team.
More importantly, I think you have to take the integrity team that creates these metrics of, these are the bad things that are happening, and not only do you have to invest there, but you have to give them power over the growth and other product teams, so that they have the ability to basically say, our metrics are going to win this one, right?
So exactly how you do that gets a little bit complicated, but a lot of it has to do with what kind of metrics you're tying to executive compensation, right? So for those executives who are running product teams, you need to change around their compensation plan so that, yes, your growth goals are good, but if you slip on those growth goals and you've improved on these goals around safety and security, then you will get your bonus anyway. I highly doubt there's anybody at the VP level at Facebook that gets a big pat on the back if growth goes down but integrity goes up, including the people who are running the integrity team. Right. And I think that's like a fundamental problem here
is that like, you have to create that tension inside the company and then you have to let
this new upstart team whose job it is to measure downside and to fix downside win.
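To make the compensation idea concrete, here is a minimal sketch, purely illustrative and not anything Facebook actually uses; the function name, metrics, and weights are all invented:

```python
# Hypothetical sketch of the compensation idea Stamos describes:
# an executive's bonus is not gated on growth alone, so hitting
# integrity/safety targets can offset a miss on growth targets.
# All names and weights here are invented for illustration.

def bonus_multiplier(growth_actual: float, growth_target: float,
                     integrity_actual: float, integrity_target: float) -> float:
    """Return a bonus multiplier between 0.0 and 1.5."""
    growth_score = growth_actual / growth_target           # e.g. 0.95 = missed by 5%
    integrity_score = integrity_actual / integrity_target  # e.g. 1.10 = beat by 10%

    # Blend the two, weighting integrity enough that it can
    # compensate for slipping on growth (the key design choice).
    blended = 0.5 * growth_score + 0.5 * integrity_score
    return max(0.0, min(blended, 1.5))

# Growth slipped 5%, integrity beat its target by 10%: full bonus anyway.
print(bonus_multiplier(0.95, 1.0, 1.10, 1.0))  # 1.025
```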
And then I think publicly, they need to decide that they're going to be much more... like, they missed an opportunity here to release the Haugen documents themselves, right? They still have them. Like, they're not public. They could go do the redactions themselves and, you know, get people's names out and such, but pretty much nothing else, and then go release them and say, this is the reality of what's going on in the world, and we're going to be more public about this. And I think that would be a huge step forward in kind of turning over a new leaf on this, and then continuing that process of, we're going to work with
outside researchers. We're not going to get rid of CrowdTangle. You know, there's all this stuff... it looks like CrowdTangle is going away. CrowdTangle is the tool that our team uses to study disinformation, that everybody else uses to study disinformation. It is a huge pain in the ass for Facebook, right? Because there's no YouTube CrowdTangle, there's no TikTok CrowdTangle. Everybody else gets away with murder because they don't provide the ability to study their platform. And so they're headed toward less transparency.
I'd like them to go to more and then to try to, you know, make that a competitive advantage
in the public market of, you know, we are going to be more transparent and open about the kinds
of things that are happening. A big problem there is Facebook groups. What's going on in groups is actually a very hard issue to study, and we don't really have time for that one, because it's a really complicated privacy trade-off: at what point is a group large enough that you want transparency into what's going on in it, and at what point do you want to protect people's privacy?
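For readers who haven't used it: CrowdTangle exposes a researcher-facing REST API, and the kind of query researchers run against it might look like the sketch below. The endpoint and parameter names are recalled from the public documentation and may have changed; the token is a placeholder:

```python
# Illustrative sketch of the kind of query researchers run against
# CrowdTangle's REST API. Endpoint and parameter names follow my
# recollection of the public docs and may have changed.
import requests

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # issued per dashboard by CrowdTangle

resp = requests.get(
    "https://api.crowdtangle.com/posts",
    params={
        "token": API_TOKEN,
        "count": 10,                   # number of posts to return
        "sortBy": "interaction_rate",  # surface overperforming posts
    },
    timeout=30,
)
resp.raise_for_status()

# Response layout as I recall it: {"result": {"posts": [...]}}
for post in resp.json()["result"]["posts"]:
    stats = post["statistics"]["actual"]
    print(post.get("postUrl"), stats.get("shareCount"))
```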
If we had a functioning Congress, what laws do you think would help or what regulations?
Right. So now we're really in fantasy land, right? Okay. So my colleague Nate Persily has put out a draft law around transparency that has two components. One, it's a hold harmless for researchers to do this kind of work. So, you know,
there's a group at NYU that has been trying to study ads on the platform. Facebook has an ad archive, unlike a bunch of other platforms, but it's not that great. And so they get around the fact that the ad archive is not that great by scraping a bunch of data in a way that Facebook prohibits. This actually gets, again, pretty complicated, because a lot of this goes back to the overreaction around Cambridge Analytica, where everybody freaked out about Cambridge Analytica being the worst thing ever. And so as a result, academic study of the platform has been massively clamped down on. And so this law basically says, well, you know,
if researchers operate within these parameters, they are allowed to do this work. And then the
company is not responsible for it, right? So, you know, if NYU does something bad, NYU is responsible for it, but as long as you're doing something good, they're protected. And so I think that's a complicated balance to strike, but Nate has created this balance. And then the second component is required transparency: if a platform hits a certain size, at least the content that is public has to be available via APIs, and certain engagement stats have to be available programmatically to certain researchers, right? And again, there's some privacy issues there, but I think they're
all fixable. Because right now, Facebook and Twitter are the ones who are out there. As a result, 99% of the "I saw something bad on the internet" stuff is written about Facebook and Twitter, and almost nothing is written about the platforms that are actually, by some measures, larger. And so we don't want Twitter and Facebook feeling this pressure to backslide.
We want to take the bar of where they are right now and set the bar on transparency even a little bit higher and then try to get them to meet that and then try to get these other platforms to get there, too.
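As a purely hypothetical illustration of the mandate Stamos describes, and not anything in Persily's actual draft, a researcher-facing transparency API might look something like this; every endpoint, field, and credential below is invented:

```python
# Purely hypothetical sketch of the draft law's mandate: public
# content and engagement stats available programmatically to
# vetted researchers. Every endpoint, field, and credential here
# is invented for illustration.
import requests

BASE = "https://api.example-platform.com/transparency/v1"  # hypothetical
RESEARCHER_KEY = "ISSUED_TO_VETTED_RESEARCHERS"            # hypothetical

resp = requests.get(
    f"{BASE}/public-posts",
    headers={"Authorization": f"Bearer {RESEARCHER_KEY}"},
    params={"since": "2022-01-01", "limit": 100},
    timeout=30,
)
resp.raise_for_status()

for post in resp.json()["posts"]:
    # Engagement stats exposed alongside the public content itself.
    print(post["id"], post["text"][:80], post["engagement"]["shares"])
```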
Your current job is director of the Stanford Internet Observatory.
You're also part of the Aspen Institute's Commission on Information Disorder.
You guys issued a final report.
It starts by saying we're in a crisis of trust and truth that exacerbates all other crises.
What have you learned about the root causes of that crisis and how we even begin to get a handle on it as a society?
Yeah.
I mean, when we talk about kind of the disinformation crisis in the United States, it's complicated by the fact that things have changed a lot between 2016 and 2020, right? The texture of disinformation has changed. You used to have all of this crap, either from Russians or from, you know, just troll farms and propaganda farms, that was being spread by a large number of people, whereas the vast majority of disinformation now is spread by a relatively small number of people. And we know exactly who they are, right? They're all friends of yours on Twitter, I'm sure. People who love you.
Right. And so we have this huge report that people can get from eipartnership.net. But in it, if you look for the top 25 spreaders of disinformation on Twitter, you recognize all the names, right? It's Gateway Pundit. It's Donald Trump Jr. It's a bunch of kind of blue-checkmark accounts that are specifically spreading disinformation, because that's what they do all day. And those actors
are multimedia. And that's one of the other challenges that we've got right now: you know, a Dan Bongino has an AM radio show, he has spots on Fox News, he goes on Newsmax, and then he also has a big presence on Facebook. And so you end up with all of those different outlets kind of reinforcing each other, in a way that is then very difficult to act on. And in only one of those cases... I mean, I guess in the Newsmax or Fox News case you've got a central actor, but really only in the Facebook case do you have a central actor who has ever taken steps against disinformation. And so you can push on it on one side and then it just kind of squeezes out to the other, right? Now, that doesn't mean the company shouldn't do anything, but I do think it goes to the complexity of the problem, because we're now at the point at which it is a really profitable business to be somebody who lies to people all day. They know they're being lied to, but they want it, right? And you're going to provide that to them, and you have some different outlets to do that profitably. And if some of them cut you off, you've built the base elsewhere. And there's not a really good, easy solution for that. So, I mean,
that's something we dealt with a lot in the Aspen commission, which was effectively the Fox News problem, right? Like, you know, we can come up with all these recommendations, because we know that there's a possibility of some kind of regulation that affects social media platforms, and some kind of interest in the social media platforms to fix something, even if you don't think it's enough. Whereas then there's these outlets like the Newsmaxes and such, whose entire reason for existence is that it is very, very profitable for them to spread disinformation.
Yeah. I mean, the attention economy incentivizes shameless assholes.
That's what's profitable. So that's a much larger problem than just any social media platform.
Right. And I mean, in some ways, like, we have this much more... you know, you've started a media company that couldn't have existed 20 years ago, right? Like, you've got podcasts because you can spread stuff on the internet. You guys are huge on Twitter. You've got Facebook pages and stuff. And so there are positives, but is there a way we can get those positives without having the negatives? I'm not sure, especially when you have people who are just straight-up motivated to... it's a supply and demand issue. And I think that's the other issue I'm always dealing with personally: it's easy to be kind of a supply-sider, where you believe that if you cut off the supply of disinformation, this problem is gone. And almost nothing is ever done around the fact that there's a huge demand to be lied to.
Yeah.
And at the same time, we have kind of a continuous problem of people not believing in institutions and centralized institutions anymore, which those institutions make worse by the way they act, right? Like, if you look at all the vaccine disinformation stuff, you can look at misstep after misstep by the CDC and the WHO and Dr. Fauci and others in how the emerging consensus on these issues is handled, which creates the opening for the hucksters. When the institutions say, we're not really sure what's going on, the hucksters can come in and say, I am absolutely positive, and I'm going to sell you on my vision as totally true. And there's a huge demand for people just to be told that this is the absolute truth and that they don't have to have any ambiguity.
Well, these things feed each other. The distrust in institutions fuels disinformation.
The disinformation fuels more distrust in institutions.
And so it becomes a vicious cycle.
Last quick question that I'm asking all the guests: what's your favorite way to unplug, and how often do you do it?
Oh God. Yeah, I like to go sailing with my kids. Haven't done it in a while. The weather hasn't been great, and we've been, you know, all stuck inside, unfortunately. But yeah, I like to get outside. I mean, I've got three kids, and so basically any time I'm not working, I'm with them.
Excellent. Alex Stamos, thank you so much for joining Offline. Appreciate it.
Thanks, Jon.
Offline is a Crooked Media production.
It's written and hosted by me, Jon Favreau.
It's produced by Andy Gardner Bernstein and Austin Fisher.
Andrew Chadwick is our audio editor.
Kyle Seglin and Charlotte Landes sound engineered the show.
Jordan Katz and Kenny Siegel take care of our music.
Thanks to Tanya Somanader, Michael Martinez, Ari Schwartz,
Madison Hallman, and Sandy Girard for production support.
And to our digital team, Elijah Cone,
Nar Melkonian, and Amelia Montooth,
who film and share our episodes as videos every week.