The Daily - A Ruling That Could End the Internet as We Know It
Episode Date: February 23, 2023

Since 1996, the modern internet has been defined by a sweeping law that prevents tech companies such as Facebook and Google from being held responsible for the content posted on their sites. This week, the Supreme Court heard arguments in a case that could take that legal immunity away.

Guest: Adam Liptak, who covers the Supreme Court for The New York Times.

Background reading: The decision on website immunity has the potential to alter the very structure of the internet. Lawmakers are targeting big tech “amplification.” What does that mean?

For more information on today’s episode, visit nytimes.com/thedaily. Transcripts of each episode will be made available by the next workday.
Transcript
From The New York Times, I'm Michael Barbaro.
This is The Daily.
Today, the modern Internet has been defined
by a sweeping legal shield created by Congress.
This week, the Supreme Court heard arguments
in a case that could take that shield away.
My colleague, Adam Liptak, was in the room.
It's Thursday, February 23rd.
Adam, tell us about this big case that was argued before the Supreme Court just a few days ago.
So the case is called Gonzalez versus Google, and it arises from a terrible terrorist attack in Paris in November of 2015, where more than 100 people were killed, including at the Bataclan concert hall.
And among those people was a young American student,
a 23-year-old college student named Nohemi Gonzalez,
and ISIS claimed responsibility for the attacks.
The lawsuit brought by Nohemi Gonzalez's family sues YouTube, Google's video service,
on the theory that YouTube bore some responsibility for these attacks by hosting
and posting videos that the family says helped inspire and radicalize the terrorists.
Now, there's little direct proof that any actual terrorist looked at an actual particular video. But nonetheless, atmospherically, it is the case that YouTube posted some of these videos and served them up to you one after the other if
you were interested. So through this lawsuit, this family is attempting to do legally what
many people wish they could do, which is hold an internet company responsible for how
ugly online content can hurt people in the real world. In this case,
through someone being killed. Right. I mean, these companies are enormously powerful.
They control the information that people receive. They target information based on data they gather about people.
So there are some concerns, and this is on both sides of the political aisle,
that these companies are too big and too dangerous.
Got it.
But those impulses run into a legal barricade, and it's one that is at play in the lawsuit brought by Nohemi Gonzalez's family, a 1996 law called Section 230 of the Communications Decency Act.
Adam, this is a law that we've talked about a fair amount on the show, but just remind us exactly what it does and how it fits into this case.
Well, big picture, Michael, it immunizes tech platforms, Google, Facebook, eBay,
Yelp. It immunizes them from liability for the content posted by their users.
Right. This is the law that says that if I, Michael Barbaro, decide to go on Twitter and
say something stupid, inflammatory, and false— Hard to imagine. Totally hard to imagine. That is
on me. That's not on Twitter. Twitter cannot be sued under this law for that stupid, inflammatory,
incorrect content that I posted. Right. So what's the story behind this broad and kind of fascinating level of immunity that Internet companies have?
Why did that ever come into being?
It arises in the earliest days of the Internet.
Pretty fancy computer there.
Want to give it a test drive?
Me?
In the days of services called CompuServe.
CompuServe combines the power of your computer with the convenience
of your telephone. AOL
and Prodigy.
Prodigy? What Prodigy does
is connect our computer through a
modem with a fantastic range of services
like banking,
shopping, games.
Here's a bulletin board for my food and
wine club. You're taking us way
back. Dial-up connections.
And the Internet was this fledgling, nascent thing that needed to be nurtured.
Unlimited access to the Prodigy service can be yours for only $9.95 a month.
Call now.
But in 1995, a New York state court holds Prodigy responsible for something that had been posted on a user board
on the theory that Prodigy had been doing some amount of content moderation,
trying to keep the craziness and the filth and the violence off the message board.
And because it was doing that, it was responsible for what else was left up on the board.
So just let me make sure I understand.
So because Prodigy, this new fledgling internet service provider,
was doing some modest level of monitoring,
the court said that if someone posted something deeply problematic, Prodigy could be held liable as a result of this moderation role that it was
playing. Right. You can think of it by way of analogy. So a bookstore sells a book. Nobody
thinks that in the ordinary case, the bookstore is responsible for what's in the book. It doesn't know.
A newspaper edits and curates and decides what to publish and what to put on the front page and
what to put on the op-ed page. The newspaper is generally liable for what it does because it's
doing something. It's speaking, it's editing, it's curating. Interesting. So in this case, this New York State Court in 1995
says Prodigy is more like a newspaper and therefore is liable for what users had put
on the message board. Which of course, if you're Prodigy, is not great news.
No, in fact, it's very hard to imagine how you can survive
unless you devote enormous resources
to taking down every potentially problematic thing.
It really could destroy this seedling
that was the internet in 1995 and 1996.
So Adam, what happens in response to this ruling
that poses such a seemingly existential threat to the young Internet?
Congress acts. It enacts Section 230, and the heart of the section is a 26-word provision: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
And in that sentence, Adam, another information content provider is a user, a user like me, screwing around on Twitter. Yeah. Anybody who's posting stuff on somebody's site may well be directly liable, but the site itself will not be.
It gives tech platforms enormous protection.
It treats them like bookstores, even when they're doing some of the things we associate with newspapers.
Even when they're moderating,
even when they're taking down problematic speech,
they get protection from lawsuits.
Got it.
So Section 230 is a kind of compromise,
a pretty nifty compromise, that lets these internet companies moderate,
which in theory makes the internet a better, safer place.
But it protects those companies from being held liable for the awful things that are posted on their site,
and therefore lets the companies kind of have it both ways.
They get to be both a newspaper and a bookstore, but with the protections of the bookstore.
That's right. It's a compromise. And there's no question, Michael, that Section 230 led to and nurtured the rise of the internet, which is a wonderful thing and in some ways a
horrifying thing. And over time, the internet may have outgrown Section 230. It has become much more dynamic.
It's not a matter of individual users posting messages on message boards.
It's a matter of these enormous tech platforms harvesting data and using algorithms to serve up videos.
And there's a question now whether this old law still makes sense now that the
internet has grown into the enormous force in modern life that it is.
Some people think that it gives too much power to internet companies and gives them too much
immunity from being held responsible for the negative consequences
that still exist, of course, on the internet.
And that's what this case before the Supreme Court is all about.
Should Section 230 operate to protect YouTube
from leading people to really vile and dreadful speech?
We'll be right back.
So Adam, take us into the court
on Tuesday morning
when you are hearing oral arguments
in this case challenging Section 230.
We'll hear our argument this morning
in Case 21-1333, Gonzalez versus Google.
Mr. Schnapper?
Mr. Chief Justice, and may it please the court.
It starts with a lawyer for the family, Eric Schnapper.
Section 230(c)(1) distinguishes between...
And he makes a basic point.
He says, I get it.
If all YouTube is doing is posting ISIS videos,
that's one thing.
But if it's recommending them, that recommendation is different.
That's no longer ISIS's speech. That's YouTube's speech.
And that recommendation is YouTube talking.
In some circumstances, the manner in which third-party content is organized or presented could convey other information.
And that means it's no longer a question about can you sue YouTube over what a user said.
Now the question is can you sue YouTube over what YouTube said.
Adam, how exactly is this YouTube speaking? Just explain that. So what Schnapper says is that you find a video and you watch it.
If I might turn to the practice of displaying thumbnails.
On the side of the YouTube screen are a number of thumbnails of other things you might like to watch that YouTube has put together, that YouTube has curated, that YouTube recommends.
They're intended to encourage the viewer to click on them and then go see a video.
It's the use of algorithms to generate these thumbnails that's at issue.
And that speech, Schnapper says, is YouTube's speech. It's no longer protected by Section 230 because it's not about what users posted.
It's about what YouTube is recommending.
Got it.
I type in ISIS video and there's going to be a catalog of thumbnails which they created.
It's as if I went into the bookstore and said, I'm interested in sports books.
And they said, we've got this catalog, which we wrote.
To return to that metaphor of a bookstore we were talking about earlier,
the idea that a bookstore isn't liable for the books on its shelves,
Schnapper is basically saying that's true so far as it goes.
But these are recommendations.
He says this is more like a catalog printed by a bookstore that recommends books to you.
And the bookstore is at least in part speaking there because it's making that recommendation to you that here's something we think you're interested in.
Got it. So, Adam, how do the justices respond to this argument?
They have a bunch of responses.
Mr. Schnapper,
the first one comes from Justice Clarence Thomas.
Are you saying that YouTube's application
of its algorithms
is the same algorithm across the board?
It's the same algorithm across the board.
Who says, well, wait a second.
These algorithms are a neutral way to organize information.
So if you're interested in cooking, you don't want thumbnails on light jazz.
If you're interested in cooking videos,
say rice pilaf from Uzbekistan,
you would hope that YouTube serves up a rice pilaf recipe.
I don't see how that is any different
from what is happening in this case.
And that there's nothing special about the algorithm
as it relates to ISIS,
because it's just trying to anticipate
what that viewer wants.
So Thomas's argument is that the algorithm has no intention,
it doesn't weigh one interest over another.
It's essentially an automated machine.
Right.
And that means it's no different from many other ways of trying to organize this huge pile of information that is the Internet.
And therefore, it's protected by Section 230 like anything else.
And what does the lawyer for the Gonzalez family say to that point?
The Gonzalez family's lawyer, Eric Schnapper, says, yeah, many, many recommendations for rice pilaf are not subject to Section 230, but they're also not harmful.
Nobody dies from rice pilaf, usually.
Our view is that the fact that an algorithm is neutral doesn't alter the application
of the statute. But when you're in the area of really harmful speech and you're adding your own
speech on top of it, when it's YouTube doing the talking, Schnapper says, that's different and
that's outside of Section 230. So on the one hand, you have this idea that you have neutral algorithms
that try to give people what they want, and that's fine.
On the other hand, you have the idea that if the algorithm ends up recommending
really harmful speech that recruits terrorists and incites terrorism,
you're in a different area, and the tech platforms ought not to be shielded
from lawsuits on those theories.
Right. What the algorithm spits out matters.
And that's what the Gonzalez family's lawyers are saying.
That's right.
Okay. What happens next?
Justice Kagan?
Mr. Schnapper, can I give you three kinds of practices?
Justice Kagan tries to test the limits of this theory and asks Eric Schnapper a series of questions thinking that he's going to draw a line between them.
So the first question is, OK, we've got the algorithm that's serving up ISIS videos.
She says, I understand you think that's not subject to Section 230.
What about something more routine?
Facebook or Twitter.
What about the Twitter feeds and the Facebook feeds
that give you posts about things you think you're interested in?
Is that a different area?
And then the third is just a regular search engine.
And then she goes, well, what about a search engine?
And a search engine, after all, is making recommendations
about what it thinks the best answer to your query is,
and it takes account of who you are.
If you ask for a Chinese restaurant,
you will get a Chinese restaurant in your neighborhood,
not a Chinese restaurant in China.
And so he says, no, all of those things can be subject to lawsuits and not protected if they end up recommending harmful speech. The short question is, could they be liable
for the way they prioritize things? And the answer is, I think so. It's going to depend on what happened.
And the example, I could...
So even all the way to the straight search engine,
that they could be liable for their prioritization system?
Yes, there was a...
Okay.
So he's now arguing, and it's kind of sweeping,
that almost any kind of online search
that produces a theoretically dangerous result should no longer be protected by Section 230, which could really undermine what this law was set up to do.
Well, what it is really is an attack on the structure of the modern Internet, which runs on algorithms.
And it's not just about
terrorism speech. It's about defamation. It's about products liability. It's about hate speech
that inflicts emotional distress. All kinds of lawsuits are conceivable. And this shield that protects the nature of the Internet would go away under his theory.
So this is really a bold and comprehensive theory that undermines the very nature of the Internet as we've come to know it.
So you are going to the extreme.
Assume I don't think you're right.
And it seems to frustrate the justices who are looking
for a more modest solution to this case. Got it. And when you say frustrated, should we understand
that to mean that they are skeptical of this legal argument? They're skeptical about going that far.
I mean, we're a court. We really don't know about these things.
You know, these are not like the nine greatest experts on the internet.
They candidly acknowledged themselves not to be experts in this area,
and they were hoping for some guidance that could land them in a sensible place,
and they're not getting it from Schnapper.
Thank you, counsel.
Okay, so let's turn now to the arguments from Google's lawyers,
who presumably are watching this and know that they have a real opening,
given the justices' skepticism.
Ms. Blatt?
Mr. Chief Justice, and may it please the court. Yeah, Google's lawyer, Lisa Blatt, embraces the idea that has resonated with the justices
that adopting at least the extreme version
of the other side's argument
would do real damage to the internet.
And Blatt kind of plays out what the options are.
She imagines a world without Section 230,
without that immunity, in which
platforms really have two choices. The first is websites just won't take down content,
and that just defeats the whole point. And you basically have the internet of filth,
violence, hate speech, and everything else that's not attractive. They can go back to the full
bookstore model and sell everything, including, you know, vile, harmful speech.
Or...
You have websites taking everything down, you know, basically you take down anything that anyone might object to.
They can actually turn into a newspaper and use intense content moderation to scrub their sites of everything harmful.
And then you basically have,
and I'm speaking figuratively and not literally,
but you have the Truman Show versus a horror show.
You have only anodyne, cartoon-like stuff
that's very happy talk,
and otherwise you just have garbage on the internet.
And what Lisa Blatt says is it destroys what we know and love about the internet.
Right, because in the absence of that nifty compromise that 230 represented,
internet companies have to choose one of these paths.
That's what she's saying.
Right, and she's saying neither one is attractive,
and the world would be a much worse place if those were your choices.
But I got to say, she also goes to a fairly extreme place
and also doesn't satisfy the justices with some of her answers
about just how far she would go.
Explain that.
Well, Justice Kagan said: what if the algorithm was not neutral?
Suppose that this were a pro-ISIS algorithm.
In other words, it was an algorithm that was designed to give people ISIS videos,
even if they hadn't requested them or hadn't shown any interest in them.
Still the same answer, that a claim built on that would get 230 protection?
Yes, except for the way...
Lisa Blatt says Section 230 still protects us.
That's an editorial choice.
We're allowed to do that.
And that also didn't seem like what the court wanted to hear.
In general, whether it's neutral or whether it's not neutral,
whether it is designed to push a
particular message, does not matter under the statute, and you get protection either way.
That's correct. The court seemed, you know, maybe content with the idea that a neutral algorithm
that just gave users what they want is one thing. But of course, you can set up your algorithms in all kinds of
different ways. And the Google argument was they're entitled to do that any way they like,
even at the cost of producing real harm. And I should say this is a hypothetical question.
Google actually takes down terrorist content, and very effectively. But even so, what the court is
interested in is the structure and limits of the
legal arguments, and both sides found themselves at far opposite ends of a continuum,
giving the justices a difficult task in finding a middle path.
Adam, it feels like there could be a middle ground here, which is a modified Section 230, one that is rewritten
in ways that meet this moment, but that don't feel extreme. And that wasn't presented by either side
in this case, but it could very easily be achieved, it would seem, by Congress going back all these
years later and rewriting the law. You're right, Michael. It's not the kind of thing the court is good at,
crafting a comprehensive code of Internet conduct.
This is really a job for Congress,
but Congress seems incapable of doing anything.
You know, we talk a lot about the court as ideologically divided and ruling six to three in all kinds of areas of the law.
This was a refreshing day where the justices were just trying to figure out
the correct legal answer, not dividing into warring camps.
And their frustration was that the parties weren't really helping.
And I think it's at least possible
that the justices duck the question for now.
And they have a way to do this.
On Wednesday, the court heard argument
in a related case involving Twitter,
this one asking the question
of whether internet platforms can be sued at all
under a different law
that forbids aiding and abetting terrorism.
And if the answer to that is no,
then you don't need to get to the question of do they
have immunity from such suits, because such suits aren't available at all. And if the court rules in
favor of Twitter in Wednesday's case, it may find an exit ramp from the Google case argued on Tuesday,
doesn't have to decide it, and the justices may be less eager to decide it because they
couldn't find the line they were searching for during that argument.
Got it.
So the justices may really hope that they never have to weigh in on this at all, which
would leave everything the way it is right now when it comes to Section 230 and the internet,
which means it would remain a place, as you said,
that is both beautiful and absolutely horrifying.
So it's an imperfect solution,
but it might be the best available solution,
and that may be where the court goes.
Adam, as always, thank you very much.
Thank you, Michael.
We'll be right back. Here's what else you need to know today.
On Wednesday, at least 10 Palestinians were killed and more than 100 others were wounded,
according to Palestinian officials, during an hours-long gun battle between Israeli security forces
and armed Palestinian groups in the West Bank.
It was the latest in a series of deadly clashes in the Israeli-occupied West Bank
over the past month. According to Israel, the firefight occurred during an operation to arrest
members of an armed Palestinian group planning imminent attacks on Israel.
But video footage seemed to show Israeli soldiers
shooting at two unarmed civilians
as they ran from the gunfire.
And the Biden administration
has announced its toughest policy yet
to crack down on undocumented immigration.
The new policy would presume that migrants are ineligible for asylum if they entered the country unlawfully, a significant rollback of the country's traditional approach to
those fleeing other countries.
The policy is timed to coincide with the expiration of a pandemic-era rule
that allows the U.S. to swiftly expel migrants back to Mexico.
The White House hopes it will discourage a surge of border crossings
when that rule goes away.
Today's episode was produced by Will Reid, Rob Szypko, Jessica Cheung, and Mary Wilson.
It was edited by Lexie Diao and Paige Cowett, contains original music by Dan Powell,
and was engineered by Corey Schreppel.
Our theme music is by Jim Brunberg and Ben Landsverk of Wonderly.
That's it for The Daily.
I'm Michael Barbaro.
See you tomorrow.