The Daily - A Strategy to Treat Big Tech Like Big Tobacco
Episode Date: November 15, 2023
A historic set of new lawsuits, filed by more than three dozen states, accuses Meta, the country's largest social media company, of illegally luring children onto its platforms and hooking them on its products. Natasha Singer, who covers technology, business and society for The New York Times, has been reviewing the states' evidence and trying to understand the long-term strategy behind these lawsuits.
Guest: Natasha Singer, a reporter covering technology, business and society for The New York Times.
Background reading: Meta was sued by more than three dozen states that accuse it of knowingly using features on Instagram and Facebook to hook children. Industry lawsuits are stymying new laws on children and social media.
For more information on today's episode, visit nytimes.com/thedaily. Transcripts of each episode will be made available by the next workday.
Transcript
From The New York Times, I'm Michael Barbaro.
This is The Daily.
Today, a look inside a historic set of new lawsuits filed by dozens of states accusing
the country's largest social media company of luring children onto its platforms
and hooking them on its products.
My colleague, Natasha Singer,
has been reviewing the state's evidence
and trying to understand the long-term strategy
behind the lawsuits.
It's Wednesday, November 15th.
Natasha, the last time you and I spoke, the state of Utah had just tried to restrict the
use of social media by children, passing a first-of-its-kind law.
Today, you come to us with something related,
but on a vastly different scale.
Right. When we were talking about Utah, it was a single state trying to restrict social media
on behalf of young people.
And now what we have...
Good morning, everyone.
Good morning.
Good morning, and thank you for joining me today.
There is a youth mental health crisis in America and we need to act is dozens of states banding together.
So today, myself and 42 other attorney generals across this nation are announcing collective action in a kind of astonishing coalition of
red states like Texas and Tennessee and blue liberal states like Massachusetts. Today,
my office has filed a lawsuit against Meta Platforms Incorporated, the company formerly
known as Facebook, for knowingly harming the mental health of young social media users.
Coming together to sue Meta, which is the social media giant, as you know,
that owns Instagram and Facebook and WhatsApp.
Right.
I do very much see this as a public health crisis in the same way that tobacco was.
And the scale of this investigation, the way the states are banding together to investigate and the kind of parallel
lawsuits they filed is right out of the big tobacco playbook.
Meta has been allowed to addict our children to a product that interferes with their mental
and physical health.
So from the point of view of the attorneys general, they have said they view the case against Meta as a case about severe health harms and hazards to young people, in the same way that states viewed the health hazards and harms to young people with cigarettes or Juul or opioids.
Of course, Natasha, the industries you just mentioned stood accused of quite literally poisoning people, including children, with their products, cigarettes and vaping pens.
Meta is a technology company, so does the parallel to those industries kind of end there?
So I think the answer to your question is both yes and no. Technology is different. And the question of whether social media is addictive at the heart of this case
will have to be proven. It's not clear, right? So the methodology the states are using,
all coming together to sue one company is similar. The content is different.
Big tobacco is hard to say anything good about. Social media, lots of people use to connect
to their friends, their family, their colleagues, to figure out what celebrities are doing, to look
up recipes, right? There's a lot of good things that happen with social media. And so it is not
the same in that way as Big Tobacco or Juul.
But when the case was first announced last month, we didn't really know what evidence the states had against Meta because their legal filings were redacted and it was all blacked out and we couldn't
see anything. But since then, the Massachusetts attorney general has gotten that state's complaint unsealed. And that gave us a much
better sense not only of the Massachusetts case, but of similar cases that attorneys general in
other states were filing. Right. So I want to better understand this lawsuit that all these
states are involved in. So walk us through the case, Natasha. The case has a really interesting backstory.
It starts in the fall of 2020,
which is when Netflix released a really ominous docudrama called The Social Dilemma.
A lot of people think Google's just a search box
and Facebook's just a place to see what my friends are doing.
What they don't realize is there's entire teams of engineers
whose job is to use your
psychology against you. And it featured former Facebook, Google, and Twitter executives warning
about how social media platforms hacked users' psyches. We get rewarded by hearts, likes, thumbs up,
and we conflate that with value, and we conflate it with truth. And state regulators around the country began to see the documentary.
Those kids are the first generation in history that got on social media in middle school.
And they began talking about how worrisome social media was for young people.
How do they spend their time?
They come home from school and they're on their devices.
And so some of them are parents who have seen their own kids use social media, and it's personal for them.
I spoke to the attorney general of Massachusetts, Andrea Joy Campbell, and she said, look, I'm not just the attorney general.
I'm the mother of two young boys. And these attorneys general had been
watching spiking rates of teen depression, anxiety, and suicide in their states. And they believed
that social media was one of the causes. And while the attorneys general were discussing the film and
the alarms it raised for them. Facebook is developing a children's version of the popular app Instagram for youngsters
13 and under. Meta, right, Facebook's parent company, announced in early 2021 plans to develop
a new service, Instagram for Kids. And that caused even more alarm bells because attorneys
general were worried that Meta was trying to create kind of training wheels for Instagram proper. And so more than 40 states got together and their attorneys general wrote a
letter to Mark Zuckerberg asking him to halt the plans to develop Instagram for kids. And soon
after they sent that letter, a whistleblower from inside Meta appeared. Many of Facebook's
internal research reports indicate that Facebook has a serious negative harm
on a significant portion of teenagers and children.
Frances Haugen, a former Facebook product manager,
had taken tens of thousands of internal documents
and she spoke to the Wall Street Journal
and she testified in Congress
that the company knew
that they were making young
women feel worse. The company's leadership knows how to make Facebook and Instagram safer,
but won't make the necessary changes because they have put their astronomical profits before people.
And it causes a firestorm. And Meta announced that it was going to pause the development
of Instagram for kids.
And more than 40 states announced that they were going to investigate whether Meta,
and particularly Instagram, had deliberately created a platform to addict kids and knew that it could cause them harm.
So that's how we get to these lawsuits.
It sounds like a slow-burning realization by these states' attorneys general that there is a problem inside Meta with Instagram and that there's something they are uniquely capable of doing about it.
Exactly. And over the last two years, the attorneys general have amassed thousands of
internal documents from Meta showing how Instagram works and using that and also testimony from other
whistleblowers. They contend they now have a really strong case in which they are accusing Meta, particularly Instagram, of deliberately designing
addictive features that harm children, of lying about the harms, and also of allowing underage
users under 13 on its platform. Okay, well, let's walk through those three claims in these lawsuits one by one. Let's start with the first. So the states are accusing Meta, but particularly Instagram,
of being an all-powerful social media slot machine that has knowingly ensnared,
addicted, and harmed millions of young people. And so if you think about how a slot machine works,
and some of the attorneys general have used this phrase, right? It can be endless.
And so that is one of the things they say is an addictive feature, that there's no natural end for young people, that it's really, really hard to get off.
Second of all, they say that Instagram bombards young people with all kinds of notifications.
And if you've used Instagram, you have probably seen them.
Chad just posted a new photo.
Chloe has a new reel.
Katie Couric's going live.
Right.
And so, first of all, that, like, is a dopamine hit because you're going to see something new, right?
And you instantly have to go check.
Second of all, there's social pressure, right? If you don't like your friend's post fast enough, what is that going to mean? And then there's also fear of missing out, because some of those things, like Instagram Live or stories, are temporal. If you
don't do it now or in the next 24 hours, you're not going to see it. And so the attorneys general
allege that all this stuff is by design, that these compelling features have a particularly
pernicious effect on young people because it overrides their brains.
Right. I mean, here I have to confess that I have lost two to three hours at a stretch on Instagram because of the very features you're describing, and I'm 44 years old.
But I think we all are, right?
You're scrolling down and there's this nanosecond where you're like salivating in anticipation like Pavlov's dogs. And then you see something new and you get a dopamine hit. And look,
there's George and Amal Clooney, right? Look, there's Khloe Kardashian's new thong. There's
Lionel Messi waving his pink jersey. Things you absolutely don't need to know. What are the specific claims of harm to children from all these features we're talking about?
So there are two sets of harms, right?
One is psychological.
The lawsuits are not simply saying that spending all this time on Instagram is causing kids to lose sleep or take time away from school
or distract from homework. What they're saying is that the features they contend are designed
to addict kids cause compulsive use of Instagram. And that compulsive use can lead to increased
depression among young people, increased anxiety, increased isolation and
loneliness, and particularly among young women, increased dislike of their own bodies. For example,
the lawsuit focuses on these cosmetic surgery and beauty filters that you can use if you're
on Instagram, because you can use the filters to make your arms look thinner or your boobs look bigger or to give you a stronger chin.
And there's a conversation inside Instagram
that's cited in one of these lawsuits
in which even Instagram executives are saying,
you know, we're actively encouraging young girls
to hate their bodies with these filters.
Right, because a filter like that basically encourages the people who use it to think
that something is wrong and needs to be improved.
Right.
And apart from the mental health harms, the state lawsuits also allege that there are
concrete harms that come from young people using the platforms.
For example, a whistleblower recently came forward with internal company documents and said that a survey of Instagram users found that 22% of 13 to 15-year-olds said
they were bullied on the platform just within the last week.
He also said that about 22 percent of young users had said they received unwanted sexual advances, again, just within the last week.
So in the world of these lawsuits, Instagram, especially, is not a place where you go to innocently post about your life and catch up with your friends. Instagram, according to this lawsuit, is an addictive
product whose most popular features make it dangerous for kids. Right. That's what the lawsuits are saying. And not only that, the states are accusing Meta of knowingly concealing those harms.
We'll be right back.
Natasha, tell us more about this allegation in these lawsuits, the second of the allegations,
that Meta knew Instagram was harmful to kids, but tried to hide that from the public.
That's one of the really surprising things about these state complaints. They describe how the company regularly did research on teen mental health, regularly surveyed its users aged 13 to 17, and knew that they were having negative experiences and that they were experiencing harms. And yet, the complaints say, company executives
from Mark Zuckerberg on down testified in Congress or gave interviews on TV saying that they cared
about the well-being of their youngest users, they were doing all this work to protect them,
and that the platform was safe. In other words, that they knew better than what they were saying in public about what Instagram did or didn't do to kids.
They had internal evidence showing it was bad, and they'd go out and say it's not bad or even that it's good.
That's what the states contend.
I'm curious what, according to the lawsuits, did Meta do with this data it had about what its users were feeling, this data that often
showed these harms? The lawsuits describe how a number of Meta employees were worried by their
own internal data on their impact on teen users. And these Meta employees proposed different ways
to mitigate the problems and the harms that young people were having.
And yet, their proposals were often shut down by their bosses.
Give us a couple examples of that.
One of the internal projects was about likes. And so, according to one complaint,
Meta's research found that teen users often, like, compare their accounts to their friends.
And when they see other people getting more validation,
it's kind of a negative comparison.
And it led to negative outcomes like increased loneliness or worse body image or negative mood.
And so to try to change that,
Meta did a test program called Project Daisy,
where in some cases they basically hid all the like
counts you saw on Instagram except for your own. And then another experiment where like counts from
like really popular accounts were visible, but not like normal people. And they found that both
of those experiments reduced users' experience of negative feelings and negative social comparisons.
That's fascinating.
And so the solution was, let's hide the likes by default, and it might make teens happier.
And around 2020, Meta executives did this whole publicity tour saying that they were
going to put this into effect. But ultimately, Meta did not take away the likes.
And why not?
That's a really important question that the complaints don't directly answer.
The implication in the lawsuits is that there was a profit motive for not taking the likes away.
But because Meta had publicly said they were going to take the likes away,
there was some kind of pressure to do something.
And so at the end of the day, Meta offered an opt-in option that you could choose to hide the likes.
And the lawsuit says that Meta knew that that was really not going to make a difference because they studied it.
And they found that if you offered people the chance to hide their like counts, less than 1% of people would do it. But if it was opt out by default, 35% would leave the like counts hidden. And so basically,
the allegation is that here was something easy that Meta could have done to reduce social anxiety
for teens, and they chose not to do it. Do we get a sense from these lawsuits of who at the company is blocking these efforts to make Instagram better, safer for kids?
Yes, we do.
There's one striking example in the Massachusetts lawsuit against Meta, which described those cosmetic surgery filters we mentioned before.
Yep. And there's this whole internal discussion in this lawsuit
between different executives saying,
these cosmetic filters are overwhelmingly used by teen girls.
We know it's not good for them.
Outside experts say that these filters are not good for young women.
Let us disallow them.
Again, a little bit like the likes,
do something easy, low-hanging fruit,
just make it go away.
Yeah.
And there is an internal discussion
among a handful of top executives at Instagram.
And Mark Zuckerberg, according to the lawsuit,
is part of this chain.
And there's supposed to be a meeting with Mark Zuckerberg
to discuss getting rid of these filters.
And one day before the meeting, it's canceled.
And then, according to the lawsuit, Mark Zuckerberg vetoed the proposal to formally ban these plastic surgery camera filters himself.
Wow.
And in the email, it says he specifically directed staff to relax.
And he said there was a clear demand for
these filters. And according to the lawsuit, he said in the email that he had not seen data
suggesting that these filters were harmful. So here you have the CEO of the company saying,
I don't see a problem here. And I reject the idea of getting rid of these filters.
You have the CEO saying, according to the lawsuit, these filters are popular,
but these are selected quotes from emails. We do not have the full correspondence,
nor do we know what data he looked at or didn't look at.
Natasha, the final allegation you mentioned earlier in these lawsuits is that Meta knowingly allowed very young children onto the platform. What does
the lawsuit have to say about that? So Meta's terms of use say that you cannot set up an account
on Instagram if you are under 13. And the reason for that is there's a federal law that says
companies that know they have kids on their platform must get
permission from parents to let them create an account that would involve collecting their
personal data. Got it. But what the lawsuit said is that Meta made it easy for users under 13 to
sign up for accounts. And that initially there was a drop-down menu that automatically generated a date and year of birth for new users that would make them over 13.
So the default was basically to suggest what birth date to pick to be 13.
Which, of course, if you're under 13, you'd pick.
And then they changed it because somebody internally, according to this lawsuit, said it should be neutral.
But of course, it's very easy to lie and pick years that make you older than 13. And the result was, the lawsuit says,
that millions of kids under 13 were on Instagram in violation of the federal children's privacy law.
And if Meta were found guilty of violating that law, the fines can be more than
$50,000 per violation. So just to summarize this entire case, Natasha, is that Meta designed
Instagram to be addictive, knew that from its own research, and pretty much lied to the public
about that fact, and internally rejected employees' requests
to do something about this
and make Instagram better for kids.
And all the while,
it is not doing all that much to stop kids under 13
who are most vulnerable
to all the things we're talking about
from using the platform.
That is what the majority of state attorneys general in
the United States contend. So let's turn to Meta's defense against this lawsuit. What has the company
said about the allegations and I think more importantly about the evidence that's contained
in these lawsuits? So I reached out to Meta to ask them what the response was to all these lawsuits.
They got back to me and they said, first of all, the company has made a major and ongoing investment in protecting young folks on their platforms, and that there were now more than 30 tools and resources to protect teens and help keep them safe and away from potentially harmful content and unwanted contact on their platforms. They also said that the states' complaint is basically cherry-picked.
Meta said that the lawsuit was filled with like selective quotes from hand-picked documents
and that they didn't provide real context of how the
company operates and how it makes decisions. On the other issues, they pointed out that
Instagram's terms of service prohibit users under the age of 13, and that when the company
finds users under 13, they remove those accounts. And as for the beauty filters we talked about,
Meta said that it banned filters that directly promoted cosmetic surgery or extreme weight loss.
And I also asked them about the comparisons to big tobacco that some of the attorneys general
were making. And Meta said that it was an absurd comparison. Unlike tobacco,
they said Meta's apps add value to people's lives. So they basically completely rejected
the comparison to tobacco. So at the end of the day, Natasha, based on your reporting,
how strong is this case that's being made against Meta by the states and how likely are the states to win? You know, Michael, it's not a slam dunk
because the states are accusing Meta of several distinct and really big things,
and those can be hard to prove. It's going to be hard to prove, for instance, that notifications
and likes cause addiction and that addiction leads to depression or isolation. There are studies linking social media use to increased symptoms of depression or feelings of isolation or negative self-esteem.
We do not have decades of research, and particularly, we don't have rigorous research showing that the typical use of social media by typical kids directly causes harm.
But there are larger legal issues at play here, including Section 230, which is part of the Communications Decency Act. And in simple terms, that provision generally allows digital services like Instagram to curate speech and content any way they like
and not be held liable. So you could see in these lawsuits from the states that they're carefully
trying to avoid talking about content. They're talking about tools like algorithms. But social
media companies have argued that they're entitled under this Section 230 to curate content as they
see fit. And that the algorithms do that curation. And therefore, the companies are not liable for
the content. Okay. I want to, just for a moment, Natasha, ask us to put ourselves in Meta's headspace, because we've been spending so much time talking about the states' case, the states' evidence, the states' worries about what Meta's doing.
If you're Meta and you're watching this case unfold, I wonder if it's likely that they're asking the question, why are all these state attorneys general focused on us? We're not the
only social media platform out there where all this stuff is happening. There's TikTok,
there's Snapchat, there's many others. So are they feeling a little bit kind of picked on?
Right. Why me? So what you're saying is absolutely true, right? Like TikTok also has these features that the states are concerned about, like endless feeds. And so it would be completely legitimate for them to say, why are you picking on us?
Except that the states are not only picking on Instagram. In 2022, the states announced that
they were investigating TikTok for many of the same practices or similar practices that they were
already investigating Meta for. And Tennessee and Colorado are leading that investigation into
TikTok. It's been going on about a year and a half. And remember in the Meta case, right,
they announced it two years ago and now they're actually filing the lawsuit.
Right. So this case isn't the only case, it's just the furthest along.
Right.
And the implication of that is this is a moment, the AG's hope, of possible reckoning
for these platforms, all of them.
I think that that is the design of the lawsuits, not only to try to litigate their way into causing Meta to change some of the things we discussed already, but it's going to attract a lot of publicity, right?
And it's going to reinforce some of the concerns that lawmakers, the Surgeon General, and many other people have already been voicing.
And so I think that it's kind of the beginning of a snowball. And so I think that
they're hoping to use these lawsuits to cause Meta to change, and then therefore other social
media platforms, whether they win the lawsuit or not. Natasha, we had started this conversation
using the analogy of the state cases against tobacco companies all those years ago.
And when I think about those cases, one of the clear outcomes was
that everybody started to think of cigarettes as dangerous.
I wonder if the states in the case against Meta,
even if they don't win in court, would be happy in a world where
lots more parents walk around the world thinking of social media platforms like Instagram
as a danger to their kids?
Would that be a successful outcome for the states?
Yeah, I don't think it would, Michael.
I think that we already have a lot of parents walking around thinking that social media is problematic, including some attorneys general.
And parents are struggling to keep their kids off their devices and not on social media for hours and hours at a time.
So like newsflash, social media is problematic.
I don't think that's news, right?
I think what they want is they want the companies to stop using or dial down some of the features we talked about, endless feeds, endless bombarding of young people with notifications.
They want that stuff to stop.
You know, I think about Jonathan Skrmetti,
who is the attorney general of Tennessee,
who co-led the investigation into Meta.
And what he said to me was, you know,
social media companies know what they did
to make their platforms as habit-forming as possible for kids.
And so the companies ought to know where the switches are to turn those habit-forming features off.
And so, really, that's the endgame for these attorneys general.
They want the companies to either turn these features off or dial them back.
Well, Natasha, thank you very much.
Thank you.
We'll be right back.
Here's what else you need to know today.
Look at what Hamas is holding inside the hospital.
These are explosives.
These are vests, vests with explosives.
We have hand grenades, Kalashnikovs, and then we have the RPGs.
On Tuesday, Israel released a pair of videos that it said were recorded from inside Gaza's main children's hospital
that showed weapons and explosives purportedly stockpiled there by Hamas.
This is Hamas firing RPGs from hospitals.
The world has to understand who is Israel fighting against.
Israel shared the videos to press its case that Hamas is using hospitals as cover for its military operations and to justify Israel's operations aimed at evacuating the hospitals, which have sparked outrage.
Gaza's health ministry, which is run by Hamas, denied nearly every Israeli claim in the video.
But the health ministry acknowledged that the footage was taken from inside Gaza's main children's hospital.
And on Tuesday afternoon, the Biden administration said that U.S. intelligence sources had information supporting Israel's claims.
And a temporary spending bill that would avert a government shutdown at the end of the week was adopted by the Republican-controlled
House after more than 200 Democrats crossed party lines to back it. The bill was seen as a major
test of the new House Speaker, Mike Johnson, who chose keeping the government open over pleasing his party's far right.
The bill, which would fund some government departments until mid-January
and the rest of the government through early February,
did not include the spending cuts that conservatives had demanded,
prompting more than 90 of them to vote against it.
Today's episode was produced by Alex Stern, Will Reed, and Carlos Prieto, with help from
Stella Tan. It was edited by John Ketchum with help from Michael Benoist,
contains original music by Marion Lozano and Dan Powell,
and was engineered by Alyssa Moxley.
Our theme music is by Jim Brunberg and Ben Landsverk of Wonderly.
That's it for The Daily.
I'm Michael Barbaro.
See you tomorrow.