Behind the Bastards - Part One: Let's Look at the Facebook Papers
Episode Date: November 16, 2021. Robert and Jamie Loftus sit down to discuss the massive, damning 'Facebook leaks'. Learn more about your ad-choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.
Transcript
Alphabet Boys is a new podcast series that goes inside undercover investigations.
In the first season, we're diving into an FBI investigation of the 2020 protests.
It involves a cigar-smoking mystery man who drives a silver hearse.
And inside his hearse are, like, a lot of guns.
But are federal agents catching bad guys or creating them?
He was just waiting for me to set the date, the time, and then for sure he was trying to get it to happen.
Listen to Alphabet Boys on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts.
What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science?
And the wrongly convicted pay a horrific price.
Two death sentences and a life without parole.
My youngest? I was incarcerated two days after her first birthday.
Listen to CSI on trial on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts.
America. I don't know why I opened the show this way.
I really don't know why you did either.
I don't know why I did that.
Jamie. Jamie Loftus.
What?
Jamie. Hi.
What?
Hi.
Is this a podcast?
Yeah, it's like after all these years, it's a podcast.
It's a podcast. It's a podcast.
Sometimes...
I get on the phone to podcast, and I think that I'm just talking to my friends.
But then I remember that every relationship in my life is dominated by podcasting.
And I don't even know how to interact with people outside of the filter of a Zoom call anymore.
That is incredibly accurate to how I feel.
That is...
I really...
I really...
That's actually the saddest thing I've ever heard in my entire life.
Thank you, Jamie.
How are you doing?
I was going to agree, and I...
I was terrible, and then I was great, and now I'm fine.
Excellent.
That sounds like a solid trajectory.
That's a good little hero's journey.
That's like the Green Knight, more or less.
I didn't see that movie.
It looked long.
It is. It is.
But it's quite a film.
I would not see a movie over an hour and 40 minutes anymore.
That's my hard out.
You know what I don't recommend for short movies?
It's Herbert West: Re-Animator, or The Re-Animator.
Sorry, it was before the Herbert West one, which was a Halloween movie I loved, and then
rewatched this Halloween, and it was really great up until one really horrifying scene
that I had kind of forgotten.
I haven't seen Reanimator before.
It's got some amazing stuff in it, if you're a horror movie fan, and then there is an incredibly
uncomfortable sexual assault scene that is like really bad.
Like really bad.
Wow, I think we're opening this episode in a really strong, powerful, and focused way.
I want to recommend a horror movie to you.
If it's Midsommar, I'm not going to listen.
It's not.
It's not.
I only watch Midsommar to get all horny for Will Poulter, and then I turn it off after
he dies.
Of course.
That's a reasonable thing for a person to say.
Absolutely.
The person that I want to have sex with dies, I turn off the movie because it's boring after
that.
It's a very healthy way to go through life.
Thank you so much.
Okay, so the movie, which I will show you at some point, is called Pin.
Okay.
I've heard of Pin.
No.
So, Pin is about a pediatrician who has like a life-size medical dummy sitting in his
office.
Oh, this doesn't seem like it's going a good place.
The pediatrician, the only way, it's like a psychological thriller.
It's not really that gory, but the pediatrician, the only way that he can communicate with emotional
honesty with his children is by making a little ventriloquist voice for the dummy.
And so the ventriloquist dummy gives the kids sex ed.
The ventriloquist dummy does all this stuff, and then, and then one of the kids thinks that
the ventriloquist dummy, whose name is Pin, is real, and then Pin starts to like control
his thoughts and actions, and then there's a scene where a nurse has sex with Pin, but
he's just a dummy.
Oh, okay.
It's the greatest movie I've ever seen.
I'm not doing it justice.
That sounds like quite a film, Jamie.
That sounds like quite a film.
A nurse has sex with Pin, Robert.
Yeah, I mean, that definitely does sound like a scene that would make me have very specific
physical reactions.
And the best part about that scene is that the nurse never comes back and it's never addressed
in the movie again.
It's kind of a mess.
Now you're speaking the language, the language of shoddy filmmaking.
Something horrible happens, and then you just canonically have to forget in order to watch
the rest of the movie.
No, I'm on board with that.
You know, Jamie, now that you mentioned quitting a movie as soon as the person you want to
have sex with dies, that may explain why I've never made it past the halfway point in the
first Star Wars movie.
Really?
Wow.
Oh, yeah.
Once Alec Guinness is out of the picture, why even keep watching, you know?
Well, yeah, you're like, well, the hottest person is gone.
Not a single fuckable face in the rest of that film.
No.
Bunch of uggos.
Jamie, you know who else is a bunch of uggos?
Oh, that was so perfect.
Oh, here's the transition.
We did it.
Perfect.
The people who run Facebook.
It is, it is.
We're talking today about the Facebook papers, which is, we'll talk about this a little bit
more in detail, but an enormous cache of internal Facebook documents that just got leaked, revealing
a tremendous amount of fucked up shit.
And I think we have to start with the uncomfortable situation, that is, everybody talking shit
about how Mark Zuckerberg looks like an android.
I feel so mixed about it, because on one hand, yes.
I'm bored.
I'm bored of it, honestly.
The thing that's bad about him is not his appearance, but also he does hit the uncanny
valley.
There's something missing in his eyes.
Look, there's something, and that's Ivy League boy syndrome, right?
Yes.
Like, that's not just him.
That's anyone who graduated from Harvard.
The sunscreen photo haunts us all.
Oh, yes.
I truly, I mean, even though we did just refer to the entire cast of Star Wars one as a bunch
of uggos.
Oh, yeah.
Hittiest.
Well, like the worst, the most lazy thing you could do is go after how someone looks
when there are so many other evil facets of him.
I will agree that there is no light in his eyes.
There's certainly no light in his eyes.
There's nothing, like the pupils are very, there's not, that little thing that's supposed
to be there is not there.
Yeah.
He could watch a kitten get murdered and it would just be a dial tone inside his soul.
He looks like a character from the polar express.
He looks, he looks like Herbert West from the movie Re-Animator, but less charismatic.
So now I have to check that.
He looks like Pin.
Look up Pin.
I'm not going to look up Pin.
You're going to look up Pin.
At this point in the history of the show, Jamie, we've recorded a lot of podcast episodes
about Facebook.
You have been there for what, three of them?
Yeah.
I forget, I forget entirely how many episodes we did on Facebook.
We did three episodes on, like, the creation of Facebook, and it's kind of a brief list of its crimes.
At least one follow-up, maybe two follow-ups.
And then we've mentioned Facebook fuckery in episodes of, like, It Could Happen Here and wherever else.
Facebook is personal to me for a couple of reasons.
Number one, a number of people who helped raise me have slowly lost their grip on reality in the face of viral propaganda spread via Facebook's engagement algorithm.
And that's kind of bummed me out.
And number two, my friends and I all lost our jobs at the company we built for a decade due to the fact that Facebook told criminal lies about video metrics, which they have since been fined $40 million for, which also frustrated me.
And it sounds like you're actually over it.
It sounds like you're really over it and you've... No, we've talked about this, I don't know if it was on mic or not, but yeah, I also lost my job to that.
Yeah.
I mean, we worked at the same place.
Yeah.
At least one of them, yeah.
And then we also worked at like, yeah, I mean, they all disappeared.
The whole industry went up in flames.
It's so, I'm still mad about it.
Yeah.
I'm still mad about it.
Even though like things have been going fine, great for me career-wise.
It's just like, it's kind of bullshit.
It's kind of frustrating.
I'm sorry, Sophie, did you catch that?
Robert's career is going great.
Yeah.
It is going great, Jamie.
In my head, I went, you're welcome.
I'm a human rocket ship.
I'm the Iggy Pop of talking about Facebook.
You're welcome.
It is kind of nice that we all just had to pivot to like, okay, you can still talk about
what you're passionate about, but just no one has to look at you anymore.
And I'm like, that's actually not the worst thing that's ever happened to me.
And in my case, it was like, you don't have to write articles.
You just have to talk on a microphone, which involves writing an article, but I don't know.
They're easier.
It's true.
It's true.
You can like, you can really have a series of bullet points and be like, well, you just
don't have to edit anymore.
Right.
That's what we, that's what we got rid of in this pivot to podcasting: editors.
We're all Greenwalding a little bit.
It all worked out, but you know, but also QAnon stealing all of our souls.
Yeah.
QAnon, the fact that there will soon be death squads roving many of our neighborhoods, like there's downsides to it too, you know.
For sure.
But then also Meta, you know, so there's.
Yeah, Meta.
Thank God we're getting Meta.
We'll be talking about Meta at some point.
But yeah, like it's, it's Facebook's bad.
I don't like Facebook.
But in one of those episodes, and I forget which of those episodes, I said something along
the lines of, at this point, there's no moral case for remaining employed by Facebook.
And earlier this year, a Facebook employee named Frances Haugen came to the same conclusion
on her own.
Rather than jump out with her stock options or whatever perks she'd accrued and then get another tech industry job, which is what a lot of people do.
I know people who have done this when they were like, Facebook's kind of fucked up.
I'm just going to go hop to another company and make even more money.
Instead of doing that, Frances spent weeks painstakingly photographing internal documents. Facebook has its own internal communications network that is patterned off of Facebook, but it's for, like, the corporate... the employees to use.
That's depressing.
Well, I mean, it's, it's like, it's like Slack, but probably more all consuming and
soul-destroying.
Yeah.
Yeah.
But she.
There's like nothing worse.
I mean, personality-wise, I, I can't stand someone who's like really killing it on Slack
that's one of my least favorite traits in a person is if they're, they're really giving
it 110% on Slack, I'm like, just, I'm, I'm asleep.
I hate it.
No.
I mean, Sophie barely hears from me when we need to work, let alone when we don't need
to work.
I'm like, just text me when I'm late. That's 100% not true.
Good stuff.
So rather than, you know, take the money and run.
So she gets in this, like, internal communications app that Facebook has, and because they have protections, like, cause they know that this is a risk, people leaking internal documents is a risk, they have, like, security things set up. And to get past them, she just photographs a bunch of internal documents on her phone, a huge amount, like, she copied a lot of shit.
And then she then leaks those files to several news agencies and our good friends at the
SEC.
This week, we're going to go through some of those documents and all of the damning
shit they reveal.
The first and perhaps most important revelation is this.
Many Facebook employees understand that their company is a force for evil.
Some of them have vowed to fix it from the inside.
Others are convinced the evil is outweighed by some nebulous good.
But at the core of it, they know that what they're a part of is problematic and a lot
of them hate themselves for it.
You can really see that coming across in some of these conversations.
Evidence of the.
Yeah, it's good stuff.
Don't you love?
Yeah.
I mean, human beings compromise the very nature of their soul in seeking profit.
Yeah.
And then, you know, you watch the light leave their eyes and then you're supposed to feel
bad for them.
Evidence of the struggle over the soul of Facebook can be found in the reactions of
employees to the growth of the Black Lives Matter movement after the murder of George
Floyd in 2020 by a cop.
That June, as protests reached their height, a Facebook employee posted a message to the
company's racial justice chat boards stating, get Breitbart out of News tab.
He was enraged at the fact that the far right publisher was pushing disinformation about
violence at protests and included screen grabs of Breitbart articles with titles like Minneapolis
Mayhem, massive looting, buildings in flames, bonfires, exclamation point, and BLM protesters
pummel police cars.
I wonder how much more attention they paid the police cars than the man who was choked
to death by a cop.
Anyway, good stuff.
Good journalism, nailing it.
This employee claimed that these articles were part of a concerted effort by Breitbart
and other right-wing media sites to, quote, paint Black Americans and Black-led movements
in a negative way.
He argued that none of those hyper-partisan sites deserve to be highlighted by the Facebook
News tab.
So Facebook's News tab consists of two tiers of content providers, right?
It's just like the tab that tells you what's going on in the world and all of the people
whose stories get in there have been vetted to some extent by Facebook.
So there's a first tier of big publishers like The New York Times, The Wall Street Journal,
like The Big Dogs, and they get paid.
Facebook gives them money to be a part of the News tab.
And then there is a second tier of news sites who are not paid but did have to get vetted
as a reliable news source for Facebook to put them on their News tab.
And Breitbart is in that latter tier, which means Facebook isn't giving them money directly
but is institutionally pumping a shitload of traffic towards their propaganda and throwing
a lot of their propaganda out into people's news feeds.
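For concreteness, here is a minimal Python sketch of the two-tier structure described above; the data layout and function are invented for illustration, and only the publisher examples come from the episode.

```python
# Toy model of the News tab's two tiers as described above (structure invented).
news_tab_partners = {
    "tier_1_paid": ["The New York Times", "The Wall Street Journal"],  # Facebook pays these outlets
    "tier_2_vetted_unpaid": ["Breitbart"],  # not paid, but vetted; still gets News tab traffic
}

def in_news_tab(publisher):
    # A publisher appears in the News tab if it sits in either tier.
    return any(publisher in outlets for outlets in news_tab_partners.values())
```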
In their public-facing statements, Facebook claims to only include sites on their News tab that do quality news reporting; sites that repeatedly share disinformation, it claims, are banned.
This functions on a strike system.
In July of 2020, President Trump tweeted a Breitbart video claiming,
You don't need a mask to protect against COVID-19.
The video also spread misinformation about hydroxychloroquine.
Despite the fact that this video clearly violated Facebook's stated standards, it was able to
reach millions of people through the News tab before Facebook took it down.
From The Wall Street Journal,
According to Facebook's fact-checking rules, pages can be punished if they acquire too many
strikes, meaning they publish content deemed false by third-party fact-checkers.
It requires two strikes within 90 days to be deemed a repeat offender, which can result
in a user being suspended from posting content.
Four strikes can lead to reductions in distribution and advertising revenue.
In a town hall, Mr. Zuckerberg said Breitbart wasn't punished for the video because that
was its only infraction in a 90-day period, according to internal chats described to the
media.
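As a rough illustration of the strike mechanics quoted above, here is a hedged sketch; the two thresholds come from the Journal's description, while the names and structure are hypothetical.

```python
from datetime import datetime, timedelta

WINDOW = timedelta(days=90)
REPEAT_OFFENDER_STRIKES = 2  # two strikes in 90 days: "repeat offender," possible posting suspension
DEMOTION_STRIKES = 4         # four strikes: reduced distribution and ad revenue

class Page:
    def __init__(self, name):
        self.name = name
        self.strikes = []  # timestamps of fact-check strikes

    def add_strike(self, when):
        self.strikes.append(when)

    def status(self, now):
        # Only strikes inside the rolling 90-day window count.
        recent = [s for s in self.strikes if now - s <= WINDOW]
        if len(recent) >= DEMOTION_STRIKES:
            return "demoted"
        if len(recent) >= REPEAT_OFFENDER_STRIKES:
            return "repeat offender"
        return "ok"

page = Page("example-publisher")
page.add_strike(datetime(2020, 7, 1))
# One infraction in the window, as with the mask video: no penalty.
print(page.status(datetime(2020, 8, 1)))  # -> "ok"
```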
Trump tweeted it and it spread, and Facebook didn't want to take it down until it had made
them some money.
I also think it's just like they don't put a lot of work into checking on this stuff.
We're talking about all this, but they also just don't want to piss any conservatives
off.
There's a lot of things going into why this stuff is not, in fact, vetted to any degree.
Now, you expressed surprise at the fact that Breitbart only had one strike in 90 days.
Let's talk about why.
Yeah.
So thanks to Frances Haugen's leaked documents, we now know that Breitbart was one of the
news sites Facebook considered managed partners.
These sites are part of a program whereby the social network pairs handlers who work
at Facebook with the website.
These handlers give Facebook a back channel to sites that spread disinformation, which
allows them to have content removed or altered without giving the content maker a strike.
So in other words, they put out the content, it gets viewed millions of times, a Facebook employee, like, messages an editor and says, like, hey, you need to change this; now it gets changed, after it's spread around, and they avoid a strike and thus stay on the News tab.
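Extending the hypothetical sketch above, the managed-partner back channel amounts to a branch that never records a strike; all names here are invented.

```python
def handle_fact_check_violation(page, content, is_managed_partner, now):
    if is_managed_partner:
        # Stand-in for the human back channel: a Facebook handler messages an
        # editor, the content gets changed after it has already spread, and no
        # strike is recorded -- so the page keeps its place in the News tab.
        print(f"asking publisher to quietly edit: {content!r}")
    else:
        page.add_strike(now)  # everyone else accrues a strike toward the thresholds
    return page.status(now)
```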
Oh, okay.
That's a good way to do it.
So it is like a back channel.
Yeah.
That's so dark.
Okay.
I mean, I guess if you're looking for a way to keep misinformation up, that is a logical
way to go about it.
Yeah.
Yeah.
Yeah.
They do a perfect job.
So you're saying that Breitbart is accurate, right?
Yes.
That's what I'm saying.
Perfectly accurate.
That is what we always say about Breitbart.
And Andrew Breitbart, a man who did not do cocaine until he died.
Talk about someone who's got light in his eyes and a fire in his heart.
Not anymore he doesn't.
No.
So actual strikes were automatically escalated for review by senior Facebook executives who
could decide to overturn the punishment and remove a strike.
Through these methods, Facebook's strike system for spreading disinformation actually
proved to be nothing at all.
Any sufficiently large right-wing website was given numerous opportunities to avoid
strikes without being delisted.
This was a problem that went further than Breitbart, as the Wall Street Journal reports.
In an internal memo, the engineer said that he based his assessment in part on a queue
of three dozen escalations that he had stumbled onto, the vast majority of which were on behalf
of conservative content producers.
A summary of the engineer's findings was posted to an internal message board.
One case he cited regarded pro-Trump influencers Diamond and Silk: third-party fact-checkers rated as false a post on their page that said, yeah.
That sounds like porn stars.
Wait, Diamond and Silk?
Oh, do you not know about Diamond and Silk?
No.
They are.
Are they banned?
They're not great.
No, no, but they're bad.
They're not nice people, not good people.
So they got, yeah, fact-checkers rated false a post that Diamond and Silk made, stating: how the hell is allocating $25 million in order to give a raise to House members that don't give a damn about Americans going to help stimulate America's economy?
When fact-checkers rated that post false, a Facebook staffer involved in the partner
program argued that there should be no punishment, noting the publisher has not hesitated going
public about their concerns around alleged anti-conservative bias on Facebook.
So this is a pretty minor case, but it shows what's going on there.
They post something that's not accurate.
This raise is not something that's going through.
And fact-checkers flag it as inaccurate, which should mean it gets removed.
But then someone at Facebook is like, if we remove it, they're going to yell about us
removing their post, and it's going to be a pain in the ass for us.
So just fuck it.
Yeah, I feel like this is always the route that Facebook goes, is just like this big,
gigantic bureaucratic-style operation that people do shitty things so that they're not
inconvenienced or yelled at by someone else.
It's also insidious and also so boring at the same time.
Yeah, it's the consequences that aren't boring.
And to some extent, this is true of a lot of the worst things in history.
There were an awful lot of men in totalitarian societies who signed effectively or literally
the death warrants of their fellow man, because otherwise it's going to be a real pain in
the ass for me.
My day at the office is going to be terrible.
I don't want to take this to the boss.
I don't want to escalate this yet.
Just kill them.
Yeah.
I hate this, so the most evil stuff is done in a very slow and boring way, I feel like.
It's just because if you can get people to fall asleep, you can get away with fucking
murder.
Literally.
Yep, it's good stuff.
Woohoo.
Loving these papers, Robert.
Thank you.
Yeah, they're very fun.
So Diamond and Silk were able to lobby the third-party fact-checker into changing the
rating on their article down to partly false, and with the help of the managed partner escalation
process, all of their strikes were removed.
The chat conversations that the Journal viewed showed that inside the company, Facebook employees repeatedly demanded that higher-ups explain the allegations.
Quote, we are apparently providing hate speech policy consulting and consequence mitigation
services to select partners, wrote one employee.
Leadership is scared of being accused of bias, wrote another.
So that seems bad, right?
That doesn't seem good.
Now I'm picturing.
That seems like the root of a lot of problems we've been having as a society.
Like, well, conservatives are loud when they're angry, so let's just let them lie and try
to get people killed.
Go to sleep.
Go to sleep.
Not that Diamond and Silk were doing that in that case, but that's a thing in right-wing media.
Now when you're saying, I don't know what to picture when you say Diamond and Silk.
So at first I was picturing porn stars, then I was picturing a hair metal band, and now
I'm picturing two.
They look more like gospel singers.
I was picturing two Springer Spaniels most recently, and I think I'll stay there.
Yeah, no, I wouldn't.
You should look them up.
Okay.
Yeah, they're two musicians who have posed with Trump and have like a, I think they're
on TikTok.
They're just like right-wing media influencers, and they're not great people.
In a farewell memo to colleagues in late 2020, one member of Facebook's integrity team, and the integrity team, their job is to reduce harmful behavior on the platform, complained that Facebook's tolerance for Breitbart stopped them from effectively fighting hate speech. Quote, we make special exceptions to our written policies for them, and we even explicitly endorse them by including them as trusted partners in our core products.
Hmm.
Yeah.
Okay.
It's bad.
And you can see, like, there's this constant, with what the Facebook papers revealed, there's this constant seesawing struggle between the integrity team, the people whose job is to reduce the harm the site does, and everyone else, whose only real job is to increase engagement on the site, right?
That is how you get your bonus.
That is how you get the kudos from the boss, is keeping people on the site for longer.
So most of Facebook, that is their job, and a small number of people, their job is to
try and make sure the site doesn't contribute to an ethnic cleansing.
And the ethnic cleansing people, like the people trying to stop that, the best way to
do that is always going to be to do things that cut down on engagement with the site,
and so they nearly always lose the fights they have with everybody else.
Ugh.
Jesus Christ.
Yeah.
It's great.
Okay.
Okay.
Yeah.
That is the scariest extension of that logic.
Yep.
Yep.
One thing we know, thanks to the Facebook papers, is that the social network launched
a study in 2019 to determine what level of trust its users had in different media organizations.
Out of dozens of websites across the US and UK, Breitbart was dead last.
Users in both countries rated it as low quality, which again, based on the company's own claims about how they decide who to include in the News tab, would disqualify Breitbart.
And guess what?
Breitbart is still a trusted Facebook partner.
Oh hey, what's this unrelated news clip from a November 2021 Washington Post article
doing in my script?
Quote, Breitbart is the most influential producer of climate change denial posts on Facebook.
According to a report released Tuesday that suggests a small number of publishers play
an outsized role in creating content that undermines climate science.
Good shit, right?
Wow.
That's rad.
Still number one after all these years.
What a treat.
Isn't that good?
Isn't that a good thing?
Well I think this means that they haven't said two inaccurate things in the last 90
days, which I find to be completely believable.
Never.
Facebook's terror at the thought of offending conservatives by cracking down on hate speech
and rampant disinformation started, I don't know if it started, but it really hit the
ground running in 2016 during the only election that was even worse than this last election.
In May of that year, Gizmodo wrote an article reporting that Facebook's trending topics
list suppressed conservative views.
A handful of ex-employees made claims that seemed to back these allegations up.
Now reporting later in the year by NPR made it clear that this was bullshit.
Quote, NPR called up half a dozen technology experts, including data scientists who have
special access to Facebook's internal metrics.
The consensus, there is no statistical evidence to support the argument that Facebook does
not give conservative views a fair shake.
And truth never matters when you're arguing with conservatives.
They needed a reason to threaten Facebook with regulation, et cetera.
And when Trump won later that year, the social network decides these threats may have teeth,
and so we're going to spend the next four years allowing them to say whatever the fuck
they want, no matter how racist, no matter how conspiratorial, no matter how many shootings
it may help to inspire, no matter how many shootings may be live-streamed on the platform,
like the Christchurch shooting, we're going to let it all in.
All in.
Okay.
Yeah.
Why?
Because money?
Well, because otherwise they'll get yelled at and maybe regulated.
Oh, right.
The conservatives are good angry.
I don't want the conservatives to get angry.
The funny thing is there's no stopping them from getting angry, right?
You know how this works.
I know how this works.
They're going to be angry and they're going to claim bias no matter what, which is what
they do.
And so as Facebook gives them a free pass and their content is consistently the most influential
stuff on the entire site, allegations of anti-right-wing bias continue to spread, even though, again,
like eight to nine out of the 10 top-shared posts on any given day are from right-wing
media.
But you know what's not from right-wing media, Jamie?
What?
All these products and services that you're, I mean, you can't be sure.
You can't be sure.
Not at all.
Not at all.
No, we have a different brand of brain pills than the ones Alex Jones sells.
And ours have less than half the lead.
Centrist brain pills.
Oh, good.
Less than half the lead.
That is a promise, Jamie.
However much lead you think a pill should have, it's less than that because we care.
Oh, I'll take your sick little centrist brain pill.
See if I care.
I'll start watching.
I can start watching MSNBC at any moment.
Okay.
Take brain cooked.
You get brain cooked.
Here's some other products.
What would you do if a secret cabal of the most powerful folks in the United States told
you, hey, let's start a coup?
Back in the 1930s, a Marine named Smedley Butler was all that stood between the U.S. and fascism.
I'm Ben Bowlin.
And I'm Alex French.
In our newest show, we take a darkly comedic, and occasionally ridiculous, deep dive into
a story that has been buried for nearly a century.
We've tracked down exclusive historical records.
We've interviewed the world's foremost experts.
We're also bringing you cinematic, historical recreations of moments left out of your history
books.
I'm Smedley Butler, and I got a lot to say.
For one, my personal history is raw, inspiring, and mind-blowing.
And for another, do we get the mattresses after we do the ads, or do we just have to
do the ads?
From iHeart Podcast and School of Humans, this is Let's Start a Coup.
On the iHeart Radio app, Apple Podcasts, or wherever you find your favorite shows.
What if I told you that much of the forensic science you see on shows like CSI isn't based
on actual science?
The problem with forensic science in the criminal legal system today is that it's an awful
lot of forensic and not an awful lot of science.
And the wrongly convicted pay a horrific price.
Two death sentences and a life without parole.
My youngest, I was incarcerated two days after her first birthday.
I'm Molly Herman.
Join me as we put forensic science on trial to discover what happens when a match isn't
a match and when there's no science in CSI.
How many people have to be wrongly convicted before they realize that this stuff's all
bogus.
It's all made up.
Listen to CSI on trial on the iHeart Radio app, Apple podcasts or wherever you get your
podcasts.
I'm Lance Bass and you may know me from a little band called NSYNC.
What you may not know is that when I was 23, I traveled to Moscow to train to become the
youngest person to go to space.
And when I was there, as you can imagine, I heard some pretty wild stories.
But there was this one that really stuck with me about a Soviet astronaut who found himself
stuck in space with no country to bring him down.
It's 1991 and that man Sergei Krikalev is floating in orbit when he gets a message that
down on Earth, his beloved country, the Soviet Union, is falling apart.
And now he's left defending the Union's last outpost.
This is the crazy story of the 313 days he spent in space, 313 days that changed the
world.
Listen to the last Soviet on the iHeart Radio app, Apple podcasts or wherever you get your
podcasts.
All right, so we're back.
Okay.
So we're back.
So we're back.
In 2018.
Are things going to get happy?
Are things going to get happy?
Are things going to get funny?
What?
Really?
Okay.
Just checking.
Yeah.
No, that's not really.
I mean, Mark Zuckerberg will like, I don't know, fall down a manhole someday.
Maybe.
If we're lucky.
Oh, that would be great.
That would be funny.
That would be great.
In 2018, a Facebook engineer claimed on an internal message board that the company was
intolerant of his beliefs.
The reality is almost certainly that his coworkers found him to be an obnoxious bigot.
I say this because he left the company shortly thereafter and hit the grifting circuit, showing
up on Tucker Carlson's show.
He does the thing that like, you remember 2018, 19, a bunch of these guys were like leaving
big tech companies and like going on the Alex Jones show.
There was one guy who left Google and, like, brought a bunch of leaks, but they weren't anything, because there was never anything there; people were just like, this guy kind of seems like he sucks.
It's very funny.
Those press tours were, yeah, that was truly, that feels like it was 10 years ago, but yeah.
It was.
2018.
It was funny because like, I think the first one of these dudes did all right, money wise,
but after that, like the spigot dried up and so they were just like detonating their careers
in the tech industry for nothing, going to work for gab afterwards.
It was really fun to watch.
After the 2016 election, and I apologize for the rate that we're jumping around here on the timeline, but it's unavoidable, Facebook became the subject of bad PR from the left
as well.
The Cambridge Analytica scandal broke and the outrage in the wake of Trump's election
meant that Facebook was being pressured to do something about bigotry and disinformation
spreading on their platform.
At the same time, the Republicans are in charge now, so they can't actually do anything,
otherwise they'll be attacked for being biased and maybe regulated.
So they tested a couple of different changes.
One was a tool called sparing sharing, which sought to reduce the reach of hyper posters.
Hyper posters are exactly what they sound like.
These are users that had been shown to mostly share false and hateful information.
And so reducing their virality was seen as potentially helpful.
This seems like a sensible change, right?
Oh, these people are sharing at an incredible rate, and it's all violent trash.
Let's reduce the number of people who see this stuff.
Right.
And I guess that's a real band-aid, just to be like, okay, we're going to have them, they can still share stuff, but just less hateful stuff.
Yeah.
And it's not less garbage.
It's not even a shadow ban, because a shadow ban would imply that you are actually artificially reducing the spread.
They're no longer artificially inflating their reach, like, because their stuff gets great
engagement, right?
Because it pisses people off, even though it's untrue.
And the algorithm, the algorithm's default is, oh, this pisses people off.
Let's let everybody see what this asshole's saying.
And they're just being like, well, let's not do that for these specific assholes, right?
That's all they're doing.
It's not a ban.
It's a, we're going to stop inflating these people's reach to the same extent that we
were.
Seems like a sensible change.
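For concreteness, a hedged sketch of what a tool like Sparing Sharing might look like; the threshold and damping factor are invented, since the leaks don't give exact values.

```python
HYPER_POSTER_THRESHOLD = 50  # invented cutoff; the real criteria aren't public

def amplification(user, base_boost):
    # "Sparing Sharing," roughly: stop artificially inflating hyper posters'
    # reach without banning them. The 0.2 damping factor is hypothetical.
    if user["daily_shares"] > HYPER_POSTER_THRESHOLD:
        return base_boost * 0.2
    return base_boost
```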
You know who disagreed with that, Jamie Loftus?
Who, Robert Evans?
Joel Kaplan, former deputy chief of staff to George W. Bush and Facebook head of public policy, um, famous right-wing shithead Joel Kaplan, who is huge at Facebook, um, and is a major driving force behind don't piss off conservatives.
How are we supposed to work together if we're pissing off the conservatives?
It actually, it's a rising tide, right?
Yeah.
So Kaplan's like, most of these hyper posters are conservatives.
This is, this is, you know, unfair.
And he convinces Zuckerberg to weaken, have his engineers weaken the tool so that they
do kind of reduce the influence that these hyper posters have, but not by as much as
they wanted to.
And it doesn't really seem to have much of an impact.
As we will talk about later, this is still the way Facebook works.
So however, to whatever extent they did reduce the harm, it was not by much.
Hyper poster is also, like, way too cool of a word to describe what that is, which is spreaders of hate speech.
Why give them a cool name like that?
Yeah.
Why give them a cool name like that?
Um, although, I don't know, that might have, that sounds like something we might have said
as like an insult to, uh, people when I was young and on the internet.
You're a hyper poster.
I don't know.
Dude, you're like hyper posting right now.
You need to chill the fuck out.
I'm picturing, uh, someone sitting at their, a filthy keyboard in a Power Rangers suit.
That's a hyper poster.
I am.
I am imagining a filthy and the filthy Power Rangers suit, Jamie.
Oh, it's really dirty and it doesn't fit.
It's either way too big or way too small.
Yeah.
They, they, they have soiled themselves in it on more than one occasion.
Well, cause they can't stop posting, Robert.
Cause they're posting too much.
You can't take a pee break.
It was, it was not an accident.
It was a choice they made in order to finish an argument.
I'm going to make an oil painting of that exact image.
Jamie, I swear to you, I will put that up in my living room.
I will put it on my roof like the Sistine Chapel.
Really?
Absolutely.
Don't underestimate how much free time I have, Robert.
I would never do that.
You work for the internet.
So another attempted tool to make Facebook better was called informed engagement.
This was supposed to reduce the reach of posts that Facebook determined were more likely
to be shared by people who had not read them.
This rule was instituted and over time, Facebook noticed a significant decline in disinformation
and toxic content.
They commissioned a study, which is where the problem started. From the Wall Street Journal:
Love that.
The study dubbed a political ideology analysis suggested the company had been suppressing
the traffic of major far-right publishers, even though that wasn't its intent, according
to the documents.
Very conservative sites, it found, would benefit the most if the tools were removed, with Breitbart's traffic increasing an estimated 20%, the Washington Times' 18%, the Western Journal's 16%, and the Epoch Times' 11%, according to the documents.
The study was designed...
That's why you never conduct a study, Robert.
Yeah.
They find, basically, that if you reduce the spread of posts shared by people who have not read the article, Breitbart's traffic drops 20%.
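A minimal sketch of the "informed engagement" idea under a simple down-ranking rule; the function and the demotion factor are invented, since the real values aren't public.

```python
def ranking_score(post):
    # Demote reshares from users who never opened the article.
    score = post["base_engagement_score"]
    if post["shared_without_click"]:
        score *= 0.5  # hypothetical demotion factor
    return score
```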
God, I still think that the tools that have developed over time to be like, are you sure
you don't want to read the article are so goofy.
I do like when Twitter, I feel like they're like, I just picture like a little shaking,
are you sure you don't want to read that article before you retweet it?
What do you think?
I was like, no.
I felt so bad because there's been times when I've retweeted, shared my own articles
that I've written and because I wrote them, I didn't necessarily click the link before
sharing.
I just woke up and I shared it and it's like, are you sure you don't want to read this?
And I just clicked to share because it's like 9 or 10 in the morning and I haven't had my
coffee yet.
But I feel bad even though it's like, well, I wrote the motherfucker.
I know what's in there.
Usually by the time I share something, I have already read it.
But I think that that function, I think it has a good purpose and it like pings something
in your brain that is useful.
No.
Yeah.
I think it is a good thing.
It's just funny.
It's this little Oliver Twist that appears in front of you and is like, are you sure you don't want to read the article before you share it?
Like, no, I'm good.
I can read.
It's all good.
Would you like to maybe read the article before suggesting that an ethnic group be slaughtered for their crimes?
All right.
Right.
And that's where he really comes in handy is in those moments.
Yeah.
That guy shouldn't be British.
So the study, like the reason they conduct the study is in the hopes that it will like
allow them to say that it's not biased.
But then it turns out that like, I wouldn't call it biased, but this change, which is
unequivocally a good thing, impacts conservative sites, which are lower quality and more often
shared by people who haven't read the articles, but are incensed by a shitty aggressive headline
like the Breitbart ones we just read.
Those get shared a lot and they don't read the article and that's great for Breitbart.
But they decided like, oh, shit, actually this study, the results of this study were
absolutely going to be called out for bias.
One of the researchers wrote in an internal memo, we could face significant backlash for
having experimented with distribution at the expense of conservative publishers.
So the company dumps this plan.
They kill it.
Okay.
This is bad for Breitbart.
Good for the world.
Bad for Breitbart.
If it's bad for the Bart, we got to can the plan.
Bad for the Bart got to can the plan.
You have always said that.
I work at Facebook.
Yeah.
I was saying that at meetings.
You are Sheryl Sandberg, actually.
Not a lot of people know that.
Yeah.
You know, listen, I've made a couple hundred dollars off of...
Yeah.
...making fun of Sheryl Sandberg.
You sure have.
So the good news is that Facebook didn't just make craven decisions when threatened
with the possibility of being called out for bias.
They were also craven whenever a feature promised to improve the safety of their network at
the cost of absolutely any profitability.
In 2019, Facebook researchers opened a study into the influence of the like button, which
is one of the most basic and central features of the entire platform.
Unfortunately, as we'll discuss in more detail later, likes are one of the most toxic things
about Facebook.
Researchers found that removing the like button along with removing emoji reactions from posts
on Instagram reduced, quote, stress and anxiety felt by the youngest users on the platform,
who all reported significant anxiety due to the feature.
But Facebook also found that hiding the like button reduced the rate at which users interacted
with posts and clicked on ads.
So now this is more of, yeah.
Yeah.
So this is, I will say, more of a mixed bag than the last thing, because removing the
like didn't, like it made one group of young people feel better, but not other groups of
young people.
Like it didn't reduce, it reduced like kids' social anxiety.
It didn't have as much of an impact on teens, really.
So it's not as clear cut as the last one.
But still a protective effect had been found among the most vulnerable people on Instagram
in particular.
But they don't, they don't do anything about it because, you know.
That's so frustrating because like that is genuinely very, very valuable, interesting
information where, I don't know, I mean, I feel like it probably didn't affect teenagers
because by that point it's like, I mean, you don't want to say like too late, but by that
point you're so hardwired to be like, well, I can tell what is important or like what
is worth discussing based on likes.
And once that that's just such a sticky, I mean, I still feel that way, even though it's
like you can objectively know it's not true.
But once you've been kind of pilled in that way, it's very hard to undo.
Yep.
Yeah.
It's, I don't know what it is, Jamie.
It's not great.
Upsetting.
That's what it is, you killed it.
Yeah.
It's not great.
So as time went on, research made it increasingly clear that core features of Facebook products
were fundamentally harmful.
From Buzzfeed, quote, time and again, they determined that people misused key features
or that those features amplified toxic content, among other effects.
In August, in an August 2019 internal memo, several researchers said it was Facebook's
core product mechanics, meaning the basics of how the product functioned that had let
misinformation and hate speech flourish on the site.
The mechanics of our platform are not neutral, they concluded.
So there's, Facebook employees recognize internally: we are making decisions that are allowing hatred and just unhealthy, toxic content to spread.
And we're not, the bias is not in us fighting it.
Our bias is in refusing to fight it.
Like we are not being neutral because we're letting this spread.
The people making the site work recognize this.
They talk about it to each other.
It makes, they feel guilt over it.
They talk about it.
You know, we know this.
Yeah.
Yeah.
Yeah.
I mean, we've discussed that before of just like the existential stress of working
at Facebook, not the most sympathetic problem in the world.
Not at all.
Yeah.
It was a clear paper trail though of deep existential guilt and distress.
Yep.
Now, it's pretty cool.
Yeah.
It's pretty cool.
It's pretty cool.
So rather than expanding their tests on the impact of removing the like button on a broader
scale, Mark Zuckerberg and other executives agreed to allow testing only on a much more
limited scale, not to reduce harm, but to quote, build a positive press narrative around
Instagram.
So not to actually help human beings.
No, but to give us something to brag about, right?
Or for him to be like, I'm such, we're so nice.
We're so cool.
Yeah.
Look at how fucking rad we are.
I'm the guy who made a site to stalk girls.
Mark Zuckerberg with the fucking dated slang.
He's going to be like, yeah, this is going to be, we got to get some lit press around
Instagram.
Know what I mean?
Yeah.
In three years, someone's going to tell him the word poggers and then he's going to
say it like 30 times in a week.
To all of his imaginary alien friends on Meta, like, yeah, that's pog, bro. Fucking Mark Zuckerberg.
I hate that. Screaming into a void.
In September of 2020, Facebook rolled out a study of the share button.
This came in the wake of a summer of unprecedented political violence, much of it stoked via viral
Facebook posts and Facebook groups.
The shit at Kenosha started on Facebook in a lot of ways.
If you were tracking it that night, a hell of a lot of that shit got started on Facebook.
A lot of the like, let's get a militia together and protect businesses.
It's good stuff.
Company researchers identified one of the problems leading to all of this as what they called reshare aggregation units. These were automatically generated groups of posts that were sent to you, that were posts your friends liked.
They recognized this is how a lot of this bad shit is spreading.
Right.
That's creating a feedback loop on purpose.
Yes.
In part because users are much more likely to share posts that have already been liked
by other people they know, even if those posts are hateful, bigoted, bullying, or contain
inaccurate information.
So, if somebody gets the same post in two different ways, if they just like see a bigoted
article pop up on their Facebook feed, but they're not being informed that other people
they know have liked it or shared it, they're less likely to share it than if like, well,
my buddy shared it.
So maybe now I have permission.
Right?
So you can think about how this happens on like a societal level, how this has contributed
to everything we're dealing with right now.
Absolutely.
I feel like everyone knows someone who has probably was very, very influenced by that
exact function.
That's, oh God, that's awful.
So company researchers in September of 2020 are like, these reshare aggregation units, we don't have to do it this way, right? We could only show people the articles that their friends comment on, at the very least, as opposed to ones they just like or just share without much commentary. Like, they have a number of options here that could at the very least reduce harmful content, because, I think the number is, people are like three times as likely to share content that's presented to them in this way.
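A toy model of the dynamic just described; the only figure taken from the episode is the roughly three-times sharing multiplier, and the field names are invented.

```python
def reshare_probability(base_rate, shown_with_friend_likes):
    # Users were reportedly about 3x as likely to share a post surfaced
    # with friends' likes attached.
    return base_rate * (3 if shown_with_friend_likes else 1)

def build_aggregation_unit(posts, require_comment=False):
    # The floated mitigation: only surface posts friends actually commented on,
    # rather than everything they merely liked or reshared.
    if require_comment:
        return [p for p in posts if p.get("friend_commented")]
    return [p for p in posts if p.get("friend_liked") or p.get("friend_reshared")]
```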
So in May of that year, while, you know, so this is actually months before Facebook researchers
find this out, myself and a journalist named Jason Wilson published what I still think
is the first proper forensic investigation into the Boogaloo movement.
He noted how the spread of the movement in its crucial early days was enabled entirely
by Facebook's group recommendation algorithm, which was often spread to people by these
reshare aggregation units.
You'd see, oh, my buddy joined this group where everybody's sharing these like Hawaiian
shirt photos and pictures of guns, maybe I'll hop in there and, you know, the cycle goes
on from there when you've joined one group, it sends you advice like, hey, check out this
other group, check out this other group.
And it starts with like, ah, we're sharing memes about like Hawaiian shirts and, and
you know, the Boogaloo, and then five or six groups down the line, you're making serious
plans to assassinate federal agents or kidnap a governor, you know, I mean, I remember where
I was when I read it, because of how steep the escalation is and how quickly it like
it's not, I guess, not completely surprising, but at the time I was like, oh, that, that
is a very short timeline from, yeah, I don't like it.
It's not good.
It's not good.
And in fairness, there are actually some Facebook, like, it kind of becomes accepted, the stuff that Jason and I were writing about in May, by a lot of Facebook researchers around September of that year. But there were people within Facebook who actually tried to blow the lid on this earlier, in fact, significantly earlier. In mid-2019, an internal researcher created an experimental Facebook account, which is something that, like, certain researchers would do from time to time to see what the algorithm is feeding people.
This experimental account was a fake conservative mom, and they made this account because they
wanted to see what the recommendation algorithm would feed this account.
And I'm going to read from BuzzFeed again here.
The internal research titled Carol's Journey to QAnon detailed how the Facebook account
for an imaginary woman named Carol Smith had followed pages for Fox News and Sinclair Broadcasting.
Within days, Facebook had recommended pages and groups related to QAnon, the conspiracy
theory that falsely claimed Mr. Trump was facing down a shadowy cabal of Democratic pedophiles.
By the end of three weeks, Carol Smith's Facebook feed had devolved further.
It became a constant flow of misleading, polarizing, and low-quality content, the researcher wrote.
Carol!
Yeah, yeah, yeah, yeah, so some jerk was like, let's call it Carol, like, the Carol stereotype.
Statistically, not unlikely that it could have been a Carol.
That is, I mean, that's interesting that we also all know a Carol.
We do all know a Carol, yeah, unfortunately.
They're in the Dunkin' Donuts drive-thru, I walk among them.
Yeah, they live among us, they are day walkers.
They're in the drive-thru and you walk among them, that's so funny.
You eat hot dogs next to them.
Here's my people, yeah, we're in line.
I think that hot dog eaters are maybe a more politically astute bunch, but the Dunkin'
Donuts line is just absolute unmitigated chaos.
The politics of the Dunkin' Donuts line are all over the fucking place.
They are, they are.
You have anarchists in the line, you have the scariest people you've ever met in the
line.
You have Ben Affleck in the line, looking like his entire family just died in a bus crash.
Ben Affleck's in the line and you can see his little dragon back piece peeking out.
Oh my God, it's a Phoenix, Jamie, come on, come on.
I'm sorry, that was disrespectful, that was disrespectful and you're right, and you're
right.
Mm-hmm, thank you, thank you.
So this study with this fake account that immediately gets radicalized, this study,
it comes out in the Facebook papers, right, this year, but it was done in 2019.
And when this year, like information of this dropped and journalists start writing about
it, they do what journalists do, which is you put together your article and then you
go for comment, right?
And so the comment that Facebook made about this experiment that this researcher did in
2019 was, well, this was a study of one hypothetical user, it is a perfect example of research
the company does to improve our systems and helped inform our decision to remove QAnon
from the platform.
That did not happen until January of this year.
Oh.
They didn't do shit for two years after this.
They only did shit because people stormed the fucking Capitol waving QAnon banners.
You motherfuckers.
Sounds like them, sounds like them, they're like, oh, let's wait until things get so desperately
bad that the company will be, you know, severely impacted if we don't do something.
Yeah.
A huge amount of the radicalization QAnon gets supercharged by the lockdowns, right?
Because suddenly all these people, like a lot of them are in financial distress, they're
locked in their fucking houses, they're online all the goddamn time, and they knew they could
have dealt with this problem and reduced massively the number of people who got fed this poison
during the lockdown if they'd done a goddamn thing in 2019.
They had the option, they did not.
Yeah.
Okay.
Well, no surprises here, but you know, on the bird side, nothing bad happened, right?
It did not.
Name one bad thing that happened.
Well, all of 2020 happened, actually, and was pretty heavily tied to this.
In August of 2020, that researcher left the company, she wrote an exit note where she
accused the company Facebook of, quote, knowingly exposing users to harm.
We've known for over a year now that our recommendation systems can very quickly lead
users down a path to conspiracy theories and groups.
In the meantime, the fringe group slash set of beliefs has grown to national prominence with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream.
Out of fears over potential public and policy stakeholder responses, we are knowingly exposing
users to risks of integrity harms.
During the time that we've hesitated, I've seen folks from my hometown go further and
further down the rabbit hole.
It has been painful to observe.
Wow.
Okay.
Okay.
And it is, I mean, no arguments there.
It is very painful to observe that happen to people who are not actually, yeah.
That's a Facebook employee who's not going to get any shit from me.
She identified the problem.
She did good research to try to make clear what the problem was.
And when she realized that her company was never going to take action on this because
it would have reduced their profits, she fucking leaves and she does everything she can to
let people know how unacceptable the situation is within the company, you know?
That's good.
I mean, that is good.
I'll hand it to her.
That is the minimum.
That is the minimum, right?
Yeah.
I mean, it is a little silly to be like, and I just recently realized that I don't think
Facebook is ethical.
You're like, okay.
You're like, shut up.
No way, girl.
I don't know when this person started.
But like, yeah.
You got there.
She got there and she's clearly horrified by what she realized the company was doing.
Again, we've all been where she is, where you just see these people you grew up with
lose their goddamn minds.
And it's bad.
It's real bad.
And then you're like, oh, and I'm at the nucleus of the problem.
Interesting.
Yeah.
I had to quit working at Purdue Pharmaceuticals.
Yeah.
I know that was really hard for you.
I know.
I know.
I know.
I know.
All those late nights on the phone.
I was a great salesman.
Oh.
You were so good.
Yeah.
You know who plays a Purdue Pharmaceuticals salesman?
Who?
Or no, maybe it's that Will Poulter.
Oh, yeah.
Okay.
You know who didn't is Alec Guinness because he never lived to see opiates become what
they are today.
Tragic.
Wow.
That's so sad.
Tragic.
He never lived to taste the sweet taste of tramadol or Dilaudid.
And we don't talk about that enough.
We don't talk about that enough.
What a shame.
No.
In this essay, I will.
What if Alec Guinness had access to high quality pharmaceutical grade painkillers, an essay.
I think it would have been sweet.
I'm pitching it.
I'm pitching it.
Okay.
Someone who's better at movies than I am could have made a Trainspotting joke there, because, Ewan McGregor in the heroin movie, then he played Obi-Wan... I don't know. There's some joke there, but I didn't come up with it.
Okay.
Someone, yeah.
Someone figure that out in the Reddit and then don't tell us about it.
Do not tell us, because I've never seen Trainspotting.
I'm just aware that Ewan McGregor is on heroin in it. Yeah, no, I haven't seen it, but, like, you know what it's about.
I'm so relieved.
Yeah.
I know what it's about.
I also get so stressed out when I, anytime, it's not often, but anytime I've had to say Ewan McGregor's name, I say it so weird, and it's the worst thing in the world.
Saying Ewan McGregor's name is the most frightening experience a human can have in 2021.
I can't make my mouth make that shape.
It's embarrassing.
No, no.
I think, with what he has to live with, thankfully, he's gorgeous.
That must make it easier.
Yeah.
I mean, being sexy has to help.
It has to help.
We should ask him.
As the 20, yeah.
You know who does know how to pronounce Ewan McGregor's name and never feels any anxiety over it, because they're friends?
They hang out on the weekend.
Oh, nice.
Who's that?
The products and services that support this podcast are all good buddies with Ewan McGregor.
Ewan McGregor hangs out with the Highway Patrol.
Good to know.
Yeah.
What would you do if a secret cabal of the most powerful folks in the United States told
you, hey, let's start a coup.
Back in the 1930s, a Marine named Smedley Butler was all that stood between the U.S.
and fascism.
I'm Ben Bowlin.
And I'm Alex French.
In our newest show, we take a darkly comedic and occasionally ridiculous deep dive into
a story that has been buried for nearly a century.
We've tracked down exclusive historical records.
We've interviewed the world's foremost experts.
We're also bringing you cinematic, historical recreations of moments left out of your history
books.
I'm Smedley Butler and I got a lot to say.
For one, my personal history is raw, inspiring and mind blowing.
And for another, do we get the mattresses after we do the ads or do we just have to
do the ads?
From I Heart Podcast and School of Humans, this is Let's Start a Coup.
Listen to Let's Start a Coup on the I Heart Radio app, Apple Podcast, or wherever you
find your favorite shows.
What if I told you that much of the forensic science you see on shows like CSI isn't based
on actual science?
The problem with forensic science in the criminal legal system today is that it's an awful
lot of forensic and not an awful lot of science.
And the wrongly convicted pay a horrific price.
Two death sentences and a life without parole.
My youngest, I was incarcerated two days after her first birthday.
I'm Molly Herman.
Join me as we put forensic science on trial to discover what happens when a match isn't
a match and when there's no science in CSI.
How many people have to be wrongly convicted before they realize that this stuff's all
bogus.
It's all made up.
Listen to CSI on trial on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
It's 1991, and cosmonaut Sergei Krikalev is floating in orbit when he gets a message that down on Earth, his beloved country, the Soviet Union, is falling apart.
And now he's left defending the Union's last outpost.
This is the crazy story of the 313 days he spent in space, 313 days that changed the
world.
Listen to The Last Soviet on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
Oh, we are back.
As the 2020 election grew nearer, disinformation continued to proliferate on Facebook and the
political temperature in the United States rose ever higher.
Facebook employees grew concerned about the wide variety of worst case scenarios that
might result if something went wrong with the election.
They put together a series of emergency break glass measures.
These would allow them to automatically slow or stop the formation of new Facebook groups
if the election was contested.
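As a rough sketch of what a "break glass" switch like that might look like, here's a minimal, entirely hypothetical version in Python. The flag, limits, and function names are all invented for illustration; nothing here comes from Facebook's actual systems.

# A minimal, entirely hypothetical sketch of a "break glass" switch;
# the flag, limits, and function names are invented for illustration.
EMERGENCY_MODE = False       # flipped on if the election is contested
MAX_JOINS_PER_HOUR = 5       # hypothetical throttle while the flag is on

def can_create_group() -> bool:
    # In emergency mode, stop the formation of new groups entirely.
    return not EMERGENCY_MODE

def can_join_group(joins_last_hour: int) -> bool:
    # In emergency mode, slow growth by rate-limiting joins.
    if EMERGENCY_MODE:
        return joins_last_hour < MAX_JOINS_PER_HOUR
    return True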
This was never stated, but you get the feeling they were looking at Kenosha, and at how Facebook groups had allowed spontaneous and deadly militias to form up around viral news stories.
My interpretation is that they were terrified of the same sort of phenomenon, that it would lead to Facebook, like, fueling a civil war.
Like I think they were literally worried that like something will go wrong, people will
form militias on Facebook and there will be a gunfight that a shitload of people die
in that escalates to something worse and everyone will say it started on Facebook because that
happened in Kenosha.
Like this is not a...
I mean, that literally happened.
Yeah.
It happened several times last year.
That's a fair anxiety.
We've seen it happen.
Yeah.
Okay.
It was never a mass exchange of gunfire.
Thank fucking God.
Yeah.
But they saw that possibility.
This is the thing I was worried about, the entirety of 2020.
And we got really close to it several times.
I was there for a few of them, it sucked.
So they're worried about this and they start coming... They try to figure out like emergency
measures they can take to basically like shut all that shit down, like stop people
from joining and making new Facebook groups if they have to, right?
If it becomes obvious that something needs to be done.
And yeah.
So in September, Mark Zuckerberg wrote on an internal company post that his company
had, quote, a responsibility to protect our democracy.
He bragged about a voter registration campaign the social network had funded and claimed they'd
taken vigorous steps to eliminate voter misinformation and block political ads, all with the stated
goal of reducing the chances of violence and unrest.
The election came and it went all right from Facebook's perspective.
The whole situation was too fluid and confusing in those early days after the election, you know, we were getting the counts in, everything's down to the fucking wire.
It was all too messy for there to be much in the way of violent on-the-ground action while that was getting sorted out, because people just didn't know where it was going to land.
So they think, oh, we dodged a bullet, everything was fine because they're dumb.
Oh, baby.
Yeah.
The reality, of course, is that misinformation about election integrity spread immediately,
like wildfire.
On November 5th, one Facebook employee warned colleagues that disinformation was filling
the comment section of news posts about the election.
The most incendiary and dangerous comments were being amplified by Facebook's algorithm
to appear as the top comment on popular threads.
On November 9th, a Facebook data scientist reported in an internal study that one out of every 50 views on Facebook in the United States, and fully 10% of all views of political information on the site, was content claiming the election had been stolen.
10% of Facebook's political views were "the election was stolen."
Yeah.
One out of 50 views on Facebook is someone saying the election's stolen.
The shit's out of control.
Presumably, everyone engaging to agree.
Wow.
Okay.
Honestly, I would have guessed that it would have been higher, but one in 10 is still...
There's a lot going on on Facebook.
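Those two numbers use different denominators, which is easy to trip over: one in 50 of all views is 2%, while the 10% figure is out of political views only. A toy calculation, with totals invented purely for illustration (none of these counts are in the quoted excerpts):

# Toy numbers, invented purely to show the two denominators;
# real view counts are not in the quoted excerpts.
total_views = 1_000_000_000       # all U.S. Facebook views (hypothetical)
political_views = 200_000_000     # views of political content (hypothetical)

stolen_views = total_views / 50               # "one out of every 50 views"
print(stolen_views)                           # 20,000,000.0
print(stolen_views / political_views)         # 0.1 -> the "10%" figure
# The two stats agree if political content is roughly 20% of all views.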
The researcher noted, "there was also a fringe of incitement to violence," and I would quibble over the word fringe, because I don't think it was very fringe, but otherwise the data is interesting.
If it's one in 10, that's not fringe.
There's a lot of...
One out of 10 people is not the fringe.
He's saying the incitement to violence was a fringe of the posts claiming the election
had been stolen.
I disagree with his interpretation of that based on my own amount of time that I spent
looking through these same posts, but whatever, maybe we were looking at different posts.
There was a lot going on on Facebook in those days.
Facebook did not blow the whistle or sound the alarm or do anything but start canceling its emergency procedures.
They were like, we did it.
The critical period's over.
Everything's going to be fine and dandy, baby.
They thought the danger of post-election violence was over, and most of all, they thought
that if they took action to stop the reach of far-right propaganda users, then conservatives
would complain.
As we now know, the most consequential species of disinformation after November 3rd would be the Stop the Steal movement.
The idea behind the campaign had its origins in the 2016 election as essentially a fundraising grift from Roger Stone. Ali Alexander, who is a shithead, adapted it in the wake of the 2020 election, and it wound up being a major inspiration for the January 6th Capitol riot.
As we now know from a secret internal report, Facebook was aware of the Stop the Steal movement from the beginning.
Quote, from Facebook,
The first Stop the Steal group emerged on election night.
It was flagged for escalation because it contained high levels of hate and violence and incitement (V&I) in the comments.
The group was disabled, and an investigation was kicked off looking for early signs of coordination and harm across the new Stop the Steal groups that were quickly sprouting up to replace it.
With our early signals, it was unclear that coordination was taking place, or that there was enough harm to constitute designating the term.
It wasn't until later that it became clear just how much of a focal point the catchphrase would be, and that it would serve as a rallying point around which a movement of violent election delegitimization could coalesce.
In the earliest groups, we saw high levels of hate, V&I, and delegitimization, combined with meteoric growth rates.
Almost all of the fastest-growing Facebook groups were Stop the Steal during their peak growth.
Because we were looking at each entity individually, rather than as a cohesive movement, we were only able to take down individual groups and pages once they exceeded a violation threshold.
We were not able to act on simple objects like posts and comments, because they individually tended not to violate, even if they were surrounded by hate, violence, and misinformation.
That is such garbage, and I know that you have examined far more of these posts than I have in depth, but it's just the fast-and-looseness with which people interpret... it's like free-form jazz, the way people interpret Facebook community rules.
Because I feel like in groups like that, and in groups less inflammatory than that, there is just constant breaking of the Facebook community rules.
It's just, ugh, it's, yeah.
Yeah.
Yeah, they're not really moderated at all.
No.
Yeah.
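To make the enforcement gap the report describes concrete, here's a toy sketch, with invented scores and an invented threshold, of per-entity takedowns versus looking at the same groups as one movement:

# Invented violation scores and threshold, purely to illustrate
# the per-entity enforcement gap the internal report describes.
groups = {
    "stop_the_steal_1": 0.70,
    "stop_the_steal_2": 0.60,
    "stop_the_steal_3": 0.65,
    "unrelated_group": 0.10,
}
THRESHOLD = 0.8  # hypothetical per-entity takedown bar

# Per-entity enforcement: every group is scored alone, nothing trips the bar.
takedowns = [name for name, score in groups.items() if score > THRESHOLD]
print(takedowns)  # [] -- no single group violates enough

# Movement-level view: aggregate the sibling groups; the cluster is the signal.
movement = [s for name, s in groups.items() if name.startswith("stop_the_steal")]
print(len(movement), sum(movement) / len(movement))  # 3 groups averaging 0.65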
So, this is interesting to me for a few reasons.
For one, it lays out what I suspect is the case these researchers and Facebook employees
needed to believe and be able to argue in order to not hate themselves.
The idea that we just didn't recognize it was coordinated.
We thought it was all kind of grassroots and happening organically, and so it was much
more complicated for us to try to deal with.
I think they need to believe this, but I'm going to explain why it's not a good excuse.
Starting in December 2019 and going until May 2020, the Boogaloo movement expanded rapidly in Facebook groups, incitements to violence semi-regularly got groups nuked, and members adopted new terms in order to avoid getting deplatformed.
It became gradually obvious that a number of these groups were cohesive and connected,
and this was revealed throughout the year in a string of terrorist attacks by Boogaloo
types in multiple states.
When these attacks began, Facebook engaged in a much more cohesive and effective campaign
to ban Boogaloo groups.
The Boogaloo movement and Stop the Steel are, of course, not one-to-one analogs, but the
fact that this occurred earlier in the same year, resulting in deaths and widespread violence,
shows that Facebook fucking knew the stakes.
They could have recognized what was going on with the Stop the Steal movement earlier, and they could have recognized that it was likely much more cohesive than it may have seemed.
A decision was made at Facebook not to do this, not to act on what they had learned earlier that year, and I would argue, based on everything else we know, that the reason this decision was made was primarily political.
They didn't want to piss off conservatives, you know?
Yeah, I mean, that is like a criminal level of negligence.
I would argue that's leaving a loaded gun with a six-year-old, you know, and being like, well, I was pretty sure it had a safety. God, there's just... I miss when they were radicalized by FarmVille.
Yeah.
Yeah.
White supremacy.
Yeah.
Those are the days.
So, my critiques aside, this internal report does provide us with a lot of really useful
information.
Info that would have been very helpful to experts seeking to stop the spread of violent extremism online, if they had had it back when Facebook found it out.
So it's rad that Facebook absolutely never intended to release any of this.
Isn't that cool?
Isn't that sweet?
That's really cool.
That they were never going to put any of this out.
There's like really useful data.
I have a quote in here.
I don't think I'll read it because it's a bunch of numbers, and it's only really interesting to nerds about this, but it's about, like, how many of the people who get into Stop the Steal groups come in through invites, how many people are actually responsible for the invites, what the average number of invites per person is. It's really interesting stuff.
I'll have the links for it in there.
I know these are internal Facebook posts, but, you know what, I'll read the quote.
Stop the Steal was able to grow rapidly through coordinated group invites.
67% of Stop the Steal joins came through invites.
Moreover, these invites were dominated by a handful of super-inviters: 30% of invites came from just 0.3% of inviters.
Inviters also tended to be connected to one another through interactions.
They comment on, tag, and share one another's content.
Out of 6,450 high-engagers, 4,025 (63%) of them were directly connected to one another, meaning they interacted with one another's content or messaged one another.
When using the full information corridor, 77% were connected to one another.
This suggests that the bulk of the Stop the Steal amplification was happening as part of a cohesive movement.
This would have been great data to have in January of 2021, right?
That would have been really good.
No.
Yeah.
Speaking as a guy who does this professionally, that would have been great to have.
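For the curious, those connectivity figures are simple graph statistics. A minimal sketch with an invented interaction graph (the accounts, edges, and invite counts are all made up) shows how you'd compute the "directly connected" share and the super-inviter concentration:

# Toy interaction graph; accounts, edges, and invite counts are invented.
edges = [("a", "b"), ("b", "c"), ("d", "e")]      # who interacted with whom
high_engagers = {"a", "b", "c", "d", "e", "f"}

# "Directly connected" share: accounts that interacted with at least one other.
connected = {account for edge in edges for account in edge}
print(len(connected & high_engagers) / len(high_engagers))  # 5/6 ~= 0.83

# Super-inviter concentration: share of all invites sent by the top inviter.
invites = {"a": 300, "b": 5, "c": 3}              # inviter -> invite count
print(max(invites.values()) / sum(invites.values()))  # ~0.97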
But this is all just internal, like, okay, so we know.
So we know this.
There's this paper trail.
Let us never speak of it again.
We know exactly what the problem is.
Yeah.
Yeah.
Now we'll deny it to anyone who alleges this while we have this excellent data that we
will not hand out because we're pieces of shit.
Yeah.
Yeah.
Okay.
On January 6th, Facebook employees were as horrified as anyone else by what happened at the Capitol.
This horror was tweaked up several degrees by the undeniable fact that their work for
the social network had helped to enable it.
That it was their fault.
Yeah.
That it was their fault.
That would feel bad.
Yeah.
Probably.
Yeah.
In the same way that, like, when I finish having a gasoline and match fight in my neighbor's
house and then an inexplicable tragedy ensues, I can't help but feel somewhat responsible,
you know?
I'm like, wait, hold on.
I'm feeling this vague melancholy.
And I know it's not my fault.
Where is this going?
Don't worry.
I know it's not my fault.
No.
But I feel that way.
Yeah.
I think we can hold the match accountable.
The match is responsible.
The match, the neighbor for having a house.
A lot of people are to blame here.
That's not me.
And that's on them.
Exactly.
And that's on them.
Exactly.
So they're all horrified.
Everybody's horrified.
Much of the riot itself was broadcast in a series of Facebook Live streams.
As Mike Pence's Secret Service detail scrambled to extricate him from the Capitol grounds, Facebook employees tried to enact their "break the glass" emergency measures, originally conceived for the immediate post-election period.
This was too little too late.
That evening, Mark Zuckerberg posted a message on Facebook's internal messaging system with
the title, Employee FYI.
He claimed to be horrified about what had happened and reiterated his company's commitment
to democracy.
Chief Technology Officer Mike Schroepfer, one of the most internally respected members of Facebook's leadership, also made a post, asking employees to hang in there while the company decided on its next steps.
One employee responded.
And then he played the theme from Trolls, like with Jeffrey Katzenberg firing the entire, hang in there folks, hang in there folks, while you're hanging in there, here's an amazing song by Ms. Anna Kendrick.
That's right.
That's what he did.
Awesome.
So he tells them this and an employee responds, we have been hanging in there for years.
We must demand more action from our leaders.
At this point, faith alone is not sufficient.
Another message was more pointed, all due respect, but haven't we had enough time to
figure out how to manage discourse without enabling violence?
We've been fueling this fire for a long time and we shouldn't be surprised it's now out
of control.
The Atlantic, yeah, fair.
The Atlantic has done that.
I want a poster with a little kitten hanging from the branch that says, we have been hanging
in there for years.
Yeah.
Or how about, we have been fueling this fire for a long time and shouldn't be surprised it's now out of control.
We could do that with the, this is fine meme, with the guy sitting in the fire.
This is on us, you know, we shouldn't be, this isn't surprising.
We had ample warning of the fire.
But hang in there.
That said, hang in there.
Hang in there, kiddos.
Yeah.
So I think The Atlantic has done some of the best reporting I've found on this particular moment, when Mark and Schrep get up and, like, say, don't worry, hang in there, we love democracy.
And, like, people go off on them. On January 6th, people were a little more open about the frustration they felt about all this stuff, and then they stopped being that open.
It's frustrating, and everybody's treating these people with kid gloves, whatever.
The Atlantic has done really good reporting on this exact moment, which seems to have been kind of a dam-breaking situation for unrest within the company.
One employee wrote, what are we going to do differently to make the future better?
If the answer is nothing, then we're just that apocryphal Einstein quote on the definition
of insanity.
Another added, to Mike Schroepfer: please understand, I think you are a great person and genuinely one of the good people in leadership that keeps me here, but we cannot deal with fundamentally bad-faith actors by using good-faith solutions.
That's a good way to put it, yeah.
Yeah.
I would also like Democratic leadership to know that, but um...
Well, let's not set the bar too high, they're not going to figure that out.
Now in the wake of January 6th, an awful lot of people, me included, exhaustively documented
Facebook's contribution to the attack and criticized the company for enabling political
violence on a grand scale.
The company responded the way they always respond, with lies.
Mark Zuckerberg told Congress in March that Facebook, quote, did our part to secure the integrity of our election. Sheryl Sandberg, girlboss and chief operating officer for the company, claimed in mid-January that the Capitol riot was, quote, largely organized on platforms that don't have our abilities to stop hate.
I, oh god, I mean, Robert, first of all, as you know, as a big Sandberg advocate, you
can't talk about her that way because she told women that they should negotiate their
own salary, you fucking loser.
Did you ever think about negotiating your own salary, you fucking dweeb?
That'll be $17.
And I love that.
I do love that too.
That's my favorite thing that she did.
So Sandberg is a lot smarter than Mark Zuckerberg, and her statement was the very clever sort of not-technically-a-lie that spreads a lot more disinformation than a normal lie would ever manage, because it's technically true that more actual organizing for the riot was done in places like Parler, as well as more private messaging apps.
But it's also a lie, because the organizing of the actual movement that spawned the riot, the Stop the Steal shit, that was almost entirely Facebook.
So like, yeah, most of the people didn't go into Facebook groups and be like, okay, we're doing a caravan.
Although some people did, and we have quotes of that.
A lot of that happened in other apps, but, like, the overall organizing...
But would they have been on those other apps if they had not first been on Facebook?
No, no, no, no, yeah, that's what I'm saying.
Like, that's why her statement is smarter, because Mark Zuckerberg is just lying to Congress, because they didn't.
Sheryl Sandberg is being very intelligent and also kind of backhandedly complimenting Facebook in its hour of most blatant failure, within the United States at least. Not its most blatant failure overall.
That would be the ethnic cleansing.
So far, so far.
I mean, check the date that you're listening, you know, check the date we recorded this.
We may have had an ethnic cleansing enabled by Facebook by the time this episode drops.
As of this recording.
True.
Yeah.
As of this recording, of the non-ethnic-cleansing things, the stuff Facebook has done in the United States, this tops the list.
Does not top the list of their overall crimes, which include the deaths of tens of thousands.
That is true.
Yep.
Good stuff.
Good stuff.
Good stuff.
I just, I felt like the need to celebrate.
I do.
Oh, those were nice.
Those are nice.
My crush used to send me those in high school and now that website is responsible for the
deaths of tens of thousands of people.
Sick.
Yeah.
It's good stuff.
So what I find so utterly fascinating about the leaks we have of Facebook employees responding
to their bosses on the evening of January 6th is that it makes it irrevocably, undeniably
clear that Zuckerberg and Sandberg and every other Facebook mouthpiece lied when they claimed
the company had no responsibility for the violence on January 6th.
The people who worked for Facebook on the day it happened immediately blamed their own
company for the carnage, quote.
Really do appreciate this response and I can imagine the stress leadership is under, but
this feels like the only avenue where we opt to compare ourselves to other companies rather
than taking our own lead.
If our headsets shocked someone, would we say, well, it's still much better than PlayStation
VR and it's unprecedented technology?
I wish I felt otherwise, but it's simply not enough to say we're adapting because
we should have adapted already long ago.
The atrophy occurs when people know how to circumvent our policies and we're too reactive
to stay ahead.
There were dozens of Stop the Steal groups active until recently, and I doubt they minced words about their intentions.
Again, hugely appreciate your response and the dialogue, but I'm simply exhausted by
the weight here.
We're at Facebook, not some naive startup; with the unprecedented resources we have, we should do better.
Yeah, but that's, like, part of the Zuckerberg ethos: to continue to behave like he's a 22-year-old with a good idea.
Yeah.
Yeah.
Or, well, or more iconically, what is his, the quote that's on my shirt?
Oh.
You can be unethical and not be breaking the law, and that's how I live my life, haha.
That's the key and he's still like, yeah, I mean, and I feel like that is whatever.
I'm sure.
That does say so much about him because like a good person can say, you can break the law
and not be unethical and that's how I live my life.
And that's fine because the law is generally trash.
Zuckerberg is specifically saying, I get to be a piece of shit as long as I don't technically
break the law.
And because I have money, I'm never technically breaking the law.
Is that not sweet?
And then don't forget, haha.
Haha.
You can be unethical and still be legal.
That's the way I live my life.
Haha.
I'll never forget.
I'll never forget.
Never forget.
I'm getting that tattooed on a.
Ugh.
I have it on the little shirt.
I would also recommend getting it on the other side right next to your phoenix that I know
you're planning out.
Oh, yeah.
Full back.
It's actually going to be a perfect replica of the tattoo that Ben Affleck has and then
over my chest, a perfect photorealistic tattoo of Ben Affleck picking up Dunkin' Donuts
and looking like he's just watched his dog shoot itself.
No.
I don't, I like appreciate his devotion to Dunkin' Donuts.
I don't know how he, I mean, I guess he's just tired because I'm like, I don't look
that way at Dunkin' Donuts.
No.
Every picture I've seen of myself at Dunkin' Donuts, I look so happy to be there.
How could you not be thrilled?
I don't know.
I don't know.
I mean, I'd say it's Boston, but you're from Boston, right?
I'm from Boston and I'm so happy to be there.
Yeah.
People keep saying, no, he doesn't look miserable.
He just looks like he's from Boston.
And I think he just is miserable.
Well, there's some truth to that.
There's some.
Okay.
Well, I don't go to cities like Boston.
I'm not like other girls, Robert.
I don't know if you know that.
I'm not like other girls.
So I'm happy at the Dunkin' Donuts.
Okay.
Okay.
Fair enough.
Yeah.
So yeah, I don't know.
At the beginning, I talked about the fact that I have said, I think working for Facebook
is an immoral activity today, given what's known.
That said, there were some points made during this employee bitch session that do make me
kind of hesitant to suggest employees just bounce from the company en masse, quote, please
continue to fight for us.
Shrep.
This is the person talking to the.
Shrep.
I keep hearing Shrek.
I'm sorry.
You keep saying Shrep.
I'm hearing Shrek.
You're hearing Shrek.
I know.
I know.
I know.
Facebook engineering needs you representing our ethical standards in the highest levels
of leadership.
Unless Zuck wants his products built by a cast of mercenaries and ghouls, we need to employ
thousands of thoughtful, caring engineers, and it will be difficult to continue to hire
and retain them on our present course.
That's not a terrible point.
Okay.
That feels like a half step.
Yeah.
It's one of those things where, like, on one hand, it is bad to work for a company whose entire job is to do harm at scale, which is what Facebook does.
On the other hand, if they are replaced by people who don't have any ethical standards
at all, that also probably isn't great.
Now, I don't know.
I would agree with that.
Yeah.
It's complicated because like, I guess you could argue that like, well, if all of the
good engineers leave and they have to hire the ghouls, like it'll fall apart eventually.
And I guess it's the question of like, when does the damage done by Facebook like fading
in popularity hopefully eclipse the damage done by the fact that everyone working there
is the Blackwater equivalent of a guy coding a fucking algorithm?
Like, I don't know.
Right.
That's tricky.
Yeah.
It's whatever.
It's just something to think about, I guess.
Yeah.
Yeah.
Facebook is having a hiring crisis right now.
I think it's gotten a little better recently, but they've had massive shortages of engineers and failed to meet their hiring goals in 2019 and 2020.
I don't know if it's gotten better this year or not.
I don't know how much any of that's going to like help matters.
I don't know.
It seems unlikely that anything will get better anytime soon.
No.
I mean, yeah.
I think the real solution is to make the company run by even worse people who are less qualified.
I don't know.
That doesn't, that doesn't sound great either.
I don't know.
I think there are volcanoes and that probably has part of the solution to the Facebook problem
in it.
There you go.
Cast their servers into the fires.
So Mark Zuckerberg and his fellow nerd horsemen of the apocalypse have basically built a gun
that fires automatically into a crowd called society every couple of seconds.
The engineers are the people who keep the gun loaded.
That's a turn of phrase.
It's good.
Bravo.
Yes.
Yes.
So the engineers, they keep the gun loaded, but also sometimes they jerk it away from
shooting a child in the face.
And if they leave, the gun might run out of bullets eventually, but it's just going to keep shooting straight into the crowd until that point.
So maybe it's better to have engineers jerking it away.
Yeah.
I had to end with a metaphor.
I don't know.
It's complicated, whatever.
No.
I want to end this episode.
Yeah.
It's, it's just a mess.
It's a messy thing to think about.
We should never have let it get this far.
No.
We've been putting fuel on the fire for a while.
We shouldn't be surprised that it's burning everything.
I mean, truly it has become, it feels like it is slowly becoming just like an annual document
drop of like, yeah.
Things have steadily gotten worse.
Yeah.
The hell company is pretty shitty.
Yeah.
Hell company is still shitty.
Bad stuff at Nightmare Corp.
It's so bad.
It's so bad.
It is really bad.
It's funny how bad it is.
I want to end this episode with my very favorite response from a Facebook employee to that
message from CTO Shrep, a Facebook employee.
If you happen to be listening to this episode, please hit me up because I love you.
Here it is.
Never forget the day Trump rode down the escalator in 2015, called for a ban on Muslims entering
the U.S. and we determined that it violated our policies and yet we explicitly overrode
the policy and didn't take the video down.
There is a straight line that can be drawn from that day to today, one of the darkest
days in the history of democracy and self-governance.
Would it have made a difference in the end?
We can never know, but history will not judge us kindly.
Wow.
Yeah.
Yeah, they know what they're doing.
Yeah.
They know exactly what they're doing, and we get an annual reminder.
We get an annual little Pelican Brief-style drop of documents saying that, no, they still know what they're doing.
They have not forgotten what they're doing.
I might suggest that that overtakes "are we the baddies?" as the moment you know things need to change.
When you're like, boy, I think history might judge me for my employment decisions.
I think I may be damned by the historians of the future when they analyze my role in society.
Right.
And it's like, if you're at that place, that's not good.
No.
We passed the point of no return, like, five years ago with Facebook.
It's just good lore.
I mean, yeah.
And I do applaud the whistleblowers and the people who are continuing to drop documents.
And at this point, it also just feels like getting punched in the face repeatedly, because it's like, well, I'm glad that there is the paper trail.
I'm glad that there's the evidence, but who, at this point, is surprised?
Like, when people... it's whatever.
I mean, technically, I think by the definition of the word, these are revelations, but they're also very much not.
They're just confirmations of things that appeared very obvious based on the conduct
of this company already.
Yep.
Yeah.
Woo.
Woo.
Jamie, do you have any pluggables for us?
Do you perhaps have a Facebook-owned Instagram meta thing?
We don't need to call it meta.
I don't think anyone needs to call them what they want to be called.
There's only one Meta, and it's Metta World Peace.
Wow.
I don't know what that is.
I know you don't.
What is that?
It's Ron Artest.
He was a basketball player.
Basketball player Ron Artest changed his name to Metta World Peace, like, many years ago.
So he did the Meta first.
Oh.
Okay.
Well.
All right.
The metaverse was already taken.
You can listen to my podcast, I got a bunch.
You can listen to The Bechdel Cast, My Year in Mensa, Aack Cast, Lolita Podcast, or none.
I won't know.
I'll know.
Actually, Sophie will know, so maybe you better listen to them or I'll lose my livelihood,
which would be interesting.
You can follow me on Twitter at JamieLoftusHELP or Instagram at JamieChristSuperstar, where I bravely continue to use Zuckerberg's Tools of Havoc.
Yeah.
Isn't it funny, Jamie, that we call it our livelihood, which is just a nicer way of saying
our ability to not starve to death in the street?
Yeah.
Yeah, whoever back in the day rebranded survival to livelihood, a real genius sleight of hand there.
Incredible.
This is why I think we should have a program in schools where we determine which kids are
going to be good at marketing, and then we ship them to an island, and we don't think
about it after that point.
Into the island, they go, it's a nice island, like a good one, like a solid island.
Are you marketing the island?
You're marketing the island as good.
I think we put them on an island and we divert all of the world's military forces to making
sure that nothing gets to that island that isn't a marketer or leaves it.
And then make sure that, and make fucking sure there's no Wi-Fi signal on the island.
Oh, good God, no.
Or we're fucked.
Or absolutely not.
Or we're absolutely fucked.
No.
Once a day, they can watch Shark Tank together.
But that's all they're getting.
Oh, God.
Oh, that's sad.
That actually kind of sounds nice.
Living on an island and watching one episode of Shark Tank a day.
That's like, that's my like dream lobotomy.
That's kind of nice.
Yeah.
Well.
That's the episode.
Poor Robert.
Jamie, how are you doing?
How are you feeling?
I'm feeling, you know, I am feeling just kind of a low thrumming of despair, but I have
felt worse.
Good.
Yeah, that's right.
That's how you're supposed to feel.
I have felt worse at the end of this show, which I don't know if that says more about
like my threshold for despair or, you know, it happens to be a coincidence.
But, you know, I'll say I'm hanging in there, but also I've been hanging in there for years.
Yeah.
That's all you can do is hang in there.
Yeah.
Keep hanging in there.
Are you hanging in there, Robert?
Allegedly.
Yeah.
I mean, by all accounts, you're hanging in there, but, like, internally? Who could say?
No, no, I'm as unmoored and adrift as the rest of us are in these.
When I visit, I'm going to show you pin and I really think you're going to like it.
Oh, good.
Okay.
Yeah.
Well, as-a.
Alphabet Boys is a new podcast series that goes inside undercover investigations.
In the first season, we're diving into an FBI investigation of the 2020 protests.
It involves a cigar-smoking mystery man who drives a silver hearse.
And inside his hearse were like a lot of guns.
But are federal agents catching bad guys or creating them?
He was just waiting for me to set the date, the time, and then for sure he was trying
to get it to happen.
Listen to Alphabet Boys on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
Did you know Lance Bass is a Russian-trained astronaut? That he went through training in a secret facility outside Moscow, hoping to become the youngest person to go to space?
Well, I ought to know because I'm Lance Bass and I'm hosting a new podcast that tells
my crazy story and an even crazier story about a Russian astronaut who found himself
stuck in space with no country to bring him down.
With the Soviet Union collapsing around him, he orbited the Earth for 313 days that changed
the world.
Listen to The Last Soviet on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
What if I told you that much of the forensic science you see on shows like CSI isn't based
on actual science and the wrongly convicted pay a horrific price?
Two death sentences and a life without parole.
My youngest, I was incarcerated two days after her first birthday.
Listen to CSI on trial on the iHeart Radio app, Apple Podcasts, or wherever you get
your podcasts.