The Daily - A Criminal Underworld of Child Abuse, Part 2
Episode Date: February 20, 2020
Yesterday on “The Daily,” we heard about the government’s failure to crack down on the explosive growth of child sexual abuse imagery online. In the second half of this series, we look at the role of the nation’s biggest tech companies, and why — despite pleas from victims — the illicit images remain online. Guests: Michael H. Keller, an investigative reporter at The New York Times, and Gabriel J.X. Dance, an investigations editor for The Times, spoke with the mother and stepfather of a teenager who was sexually abused as a child. For more information on today’s episode, visit nytimes.com/thedaily. Background reading: The tech industry has recently been more diligent in identifying online child sexual abuse imagery, but it has consistently failed to shut it down, a Times investigation found. Facebook accounted for more than 85 percent of the imagery flagged by tech companies last year. Two sisters opened up about their lives after being sexually abused as children; photos and videos of them online continue to remind them of the horrors they experienced. Here’s the first episode in this two-part series, describing how a finding from a tipster led to The Times’s monthslong investigation of online child abuse imagery.
Transcript
From The New York Times, I'm Michael Barbaro.
This is The Daily.
Yesterday, my colleagues Michael Keller and Gabriel Dance
described the government's failure to crack down
on the explosive growth of child sexual abuse imagery online.
Today, the role of the nation's
biggest tech companies
and why, despite
pleas from victims, those
illicit images can still
be found online.
It's Thursday, February 20th.
Would it be possible, I don't want it to get too steamy, but for the sound.
I can turn it down, we'll just get warm.
It gets too warm.
Last summer, Gabriel and I got on a plane and flew out to the West Coast to meet in a lawyer's office with a family that the lawyer had been representing.
As Mike said, once we started looking into it, there's so many facets.
So we explained to them a little bit about our reporting
and why it was so important that we speak with them.
Yeah, we're here to answer your questions, too.
And who is this family?
All we can say is that this is a mom and a stepdad who live on the West Coast.
And that's because they only felt comfortable speaking with us if we could protect their privacy.
I mean, we started this not knowing anything about it.
And I might get emotional here, but, you know, as parents, we're trying to figure out what's the best way for our kid to deal with this.
And they started to tell us a story about what happened to their daughter.
It was August 21st, 2013.
How's it work?
She was shopping with our two middle children.
So one day, six years ago, the mom is out with her kids doing some back-to-school shopping
and she gets a call
from the police.
And they tell her she has to come in
immediately.
Just an odd call to get a call
from a detective saying,
you need to come down to the station right now. We need to talk to you.
We feel your kids are in danger.
And I'm like, what?
She called me panicked, going, they want to talk to me.
I said, go talk to them.
We don't have anything.
We're not criminals, so.
It was just odd.
So she goes into the sheriff's office.
And there's this FBI agent, introduces himself.
He said, you know, we think your children are in danger from their father,
particularly the youngest one. And I was just shocked and had no idea what they were talking
about. No idea. Mind you, my two other kids who I was shopping with were in the room next door
playing or doing something. I don't know what they do. And he just went on to say that, you know,
we're going to start investigating him or we've been investigating him.
And there'll be an agent at our house Friday to tell us more.
And she showed up Friday morning, August 23rd, nine in the morning.
Can I talk to you outside? And we talk outside and she drops this bomb on us.
And what the agent tells her is that her children's biological father
had been sexually abusing her youngest child,
starting at the age of four years old
and going on for four years.
And not only that,
he had been documenting this abuse with photos and videos and had been posting them online.
So I remember that moment. I mean, I think I kind of doubled over or something. I was just shocked.
I said, when did this start? How long has this been going on?
How did I not know I'm mom?
So all of those questions and all those feelings and I mean, everything just came crashing down.
What does this couple tell their children at this point?
The FBI agent said, actually, it's better if I'm the one to tell your kids.
In her experience, it would be best for the news to come from her as opposed to me telling the kids directly that their dad was arrested because kids might blame me and point the
finger.
I said, yeah, whatever you think.
She passed around her FBI badge and showed the kids and built a rapport with them.
And then she said, you know, your dad has been arrested.
I don't think we even talked about what it was for or why he was arrested,
just that he was arrested and they're still investigating.
You know, even to this day, you know, I think about, again, I was married to this guy.
How did I miss that?
Still, and this is six years later, you know, how did I miss that?
So there's a piece of that guilt and that's always going to be there.
I think.
Yeah.
But it isn't your fault.
It's not your fault.
And how can somebody do that to their own child?
I still, I don't think I'll ever understand a person like that.
Her daughter is now a young teenager, and she's not only dealing with all of the normal things that young teenagers have to deal with, but the trauma of being sexually abused at such a young age.
She's just now developmentally dealing with the effects of it. She's angry and she's acting out. She's in counseling now, you know, and has a counselor. So that's something she's just now having to learn: how to say no, how to inform others that she's not comfortable with something.
And she's only 13.
And meanwhile, even though the physical abuse ended six years ago, the images continue to circulate to this day. And the parents know that because they get notified whenever someone is arrested with these images in their possession.
And why would they be notified?
What's the point?
It feels like it would just be a horrendous reminder
every time that this is still happening.
One of the abilities that victims of this crime have
is to seek restitution from the offenders.
So when they're notified,
their lawyers can petition the court
to get a certain sum of money,
which is a really good thing
and helps to rectify some of the damage that was done.
But Mike, you're right.
Oh my gosh, this person in Kansas and New York,
somebody in Wisconsin saw my kid.
And oh my God.
It is a brutal double-edged sword, these notifications.
Oh my God, there's people out there who know about this and they can watch it over and over.
And for this young woman, her parents and their lawyer received more than 350 notices in just the last four years.
Wow. So something on the order of 100 times a year, somebody is convicted of having looked at these photos of their daughter.
That's right.
What do you think should be kind of in the front of our minds?
What do you think is really important for us to understand?
The internet technology companies need to really do something about this because, I don't know, it's just...
I don't know enough about technology, but I just...
Where is this crap, you know?
Certainly for the stepfather, his question, which is a legitimate question, is
How do we get it off?
Why can't they just take it down?
How do we, how do we, how do we make it go away?
And that's the same question we had.
Why, six years later, are these images and videos still showing up on some of the largest technology platforms in the world?
Figure it out. We've got this technology.
We can go to the moon and Mars and stuff.
We can get this crap off the web.
And are the companies doing enough to prevent it?
We'll be right back.
Gabriel, Michael,
how do you begin to answer these parents' very reasonable questions about why these images of their child's sexual abuse keep showing up online?
So before we can answer the question of why these images keep showing up,
we needed to understand where the images were online
and what companies were responsible
for people sharing them. But the National Center, which you'll remember is the designated
clearinghouse for this information, wouldn't tell us.
But they know, right? So why wouldn't they tell you? Why wouldn't they give you that information?
Part of the reason why they don't divulge these numbers
is because the companies are not required by law to look for these images. It's voluntary.
It's voluntary to look for them. Right. Without the help of these companies, they have no idea
where these images are, where they're coming from, how many of them there are.
That's right. And the National Center is concerned that if they disclose
these numbers, that they might damage those relationships, which they depend on. But then
we start to hear anecdotally that there is one company responsible for the majority of the reports,
and that was Facebook. So we are doing everything we can to run down that number, but very few people know it.
However, we ultimately do find somebody who has documentation that reveals that number.
So in 2018, the National Center received over 18.4 million reports of illegal imagery.
And the number this person provides us shows that of those 18.4 million,
nearly 12 million came from Facebook Messenger.
Wow. So the vast majority of them.
Almost two out of every three reports came from Facebook Messenger.
So just that one service of Facebook.
That's right. Just the chat application. This doesn't include groups or wall posts or any of the other public information that you might post or share. This is specifically from the chat application.
But then, after we reported that number,
the DOJ actually comes out and says that Facebook in total,
so Messenger plus the other parts of the platform,
is responsible for nearly 17 million of the 18.4 million reports that year.
Wow, this is a Facebook problem.
That's what the numbers at first glance would suggest.
But we realized we needed to talk to people that really understood this
to know what conclusions to come to from these numbers.
Hello?
So we called up somebody who would know better than almost anybody.
Alex.
Alex Stamos.
Hey.
Gabe Dance.
Mike Keller here.
Hey, how's it going?
I'm doing okay.
I'm getting back to my office.
Alex was the chief security officer at Facebook for the past several years.
He's now a professor at Stanford University.
So he's somebody who very much would have seen this happening,
would have understood what was going on inside Facebook when it comes to child sexual abuse.
Absolutely.
You've obviously worked on this area for years.
Facebook Messenger is responsible for about 12 million of the 18.4 million reports last
year to the National Center.
That seems like a lot.
Can you help us understand kind of what's happening on that platform?
Yeah. So we've been discussing one very important number, 18.4 million, which is the number of...
So Stamos tells us something that actually is a little counterintuitive, that this huge number of reports coming from the company...
That's not because Facebook has the majority of abuse. It's because Facebook does
the most detection. It's them working the hardest to find this type of content and report it.
I expect that pretty much any platform that allows the sharing of files is going to be
absolutely infested with child sexual abuse. If everybody was doing the same level of detection,
we'd probably be in the hundreds of millions of reports.
What is he telling you? That of the 18 million, the reason why Facebook has so many is because other companies are not reporting this at all?
Essentially, yes. What he's saying is that Facebook is reporting such a high number of
images and videos because they're looking for them. And that a lot of other companies
aren't even doing that.
Facebook checks effectively every image that transfers across the platform in an unencrypted manner to see whether or not it is known child sexual abuse material.
Every single time somebody uploads a photo or a video to Facebook, it's scanned against a database of previously identified child sexual abuse imagery.
And in doing so, Facebook is finding far, far more of this content than any other company.
So he's saying don't shoot the messenger here.
That's what he's saying.
So as of now, this is the best method these companies have to identify and remove the imagery.
What Facebook is doing.
That's right.
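To make the mechanics concrete: matching every upload against a database of previously identified imagery is typically done by hashing. What follows is a minimal sketch in Python, under the simplifying assumption of an exact-match fingerprint set; the names and entries are hypothetical, and deployed systems such as Microsoft's PhotoDNA use perceptual hashes so that resized or re-encoded copies of an image still match.

```python
import hashlib

# Hypothetical stand-in for the industry databases of previously
# identified abuse imagery (illustrative only; real systems store
# perceptual hashes, not SHA-256 digests).
KNOWN_FINGERPRINTS = {
    # placeholder entries would go here
}

def fingerprint(file_bytes: bytes) -> str:
    """Reduce an uploaded file to a fixed-size fingerprint."""
    return hashlib.sha256(file_bytes).hexdigest()

def scan_upload(file_bytes: bytes) -> bool:
    """Return True if the upload matches previously identified imagery.

    Exact hashing only catches byte-identical copies; perceptual
    hashing is what lets deployed systems match altered versions.
    """
    return fingerprint(file_bytes) in KNOWN_FINGERPRINTS

# Every photo or video is checked at upload time; a match triggers
# a report to the National Center and removal of the content.
if scan_upload(b"...uploaded file contents..."):
    print("Match: file a report and disable the account.")
```

The matching step itself is cheap; what costs a company something, as the reporters discuss next, is everything a match sets in motion.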
So why doesn't every technology company do this?
It seems pretty straightforward.
Well, the short answer is that it's not baked into the business model. It's not something that helps them grow their base of users.
It's not something that provides any source of revenue.
And it's in fact something that works against both of those things if done correctly.
What do you mean?
Well, every time Facebook detects one of these images, they shut down that account.
You're deleting your own users.
That's right. You're deleting your own users.
And it costs money to delete your own users.
Right. You have to spend money to hire people to find accounts that are doing something wrong that you're then going to lose as a customer.
You got it.
So both of those things fly in the face
of almost all of these companies' business models.
And Stamos actually told us something else interesting.
The truth is that the tech industry is still pretty immature
at the highest levels about the interaction between executives.
And that's that these companies aren't really working together
to solve this problem.
You know, if you look at, say, the banking industry,
these big investment banks, the CEOs hate each other, but they understand that their boats all
rise and fall together. And so they are able to work together on what kind of regulation they'd
like to see. But a lot of the top executives at the tech companies really kind of personally
despise one another. And it is very difficult to get them to agree to anything from a policy perspective.
And in our reporting, we found some pretty egregious disparities
in how different companies police this on their platforms.
Amazon, who has a massive cloud storage business, for example,
they handle millions of uploads and downloads a day.
They don't scan whatsoever. Apple has an encrypted messaging platform, so that doesn't get scanned.
They also choose not to scan their photos in iCloud. Snapchat, Yahoo, they don't scan for
videos at all, even though everyone knows video is a big part of the problem. And it's not because
the solutions don't exist,
they just have chosen not to implement them.
And now Facebook, the company looking for this content most aggressively,
is starting to rethink that policy.
Over the last few years, a lot of tech companies have realized
that privacy is an important feature that a lot of their customers are expecting. And citing privacy
concerns, Facebook announced that it will soon encrypt its entire messenger platform, which would
effectively blind the company to any type of automated scanning of Messenger,
which again was responsible for nearly 12 million of those 18.4 million reports.
So they would stop searching for this child sexual abuse material.
Right. They would limit their own ability to be aware of it.
And privacy is important enough that they would handicap their ability to find this criminal conduct
and these horrible photos. Based on the criticism that a lot of tech companies have received over
the last few years, moving towards encryption is a really attractive option for them because it
lets them say, we really care about the privacy of your conversations, and we're going to make that more secure.
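To see why end-to-end encryption blinds that scanning, here is a rough sketch, using the Python cryptography package's Fernet cipher purely as a stand-in for Messenger's actual protocol (an assumption for illustration): once only the two endpoints hold the key, the server relaying the message can no longer compute the fingerprint that the matching step above depends on.

```python
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

# The key lives only on the two users' devices, never on the server.
shared_key = Fernet.generate_key()
sender = Fernet(shared_key)

photo = b"...photo bytes..."
ciphertext = sender.encrypt(photo)

# The platform relays ciphertext but holds no key, so a server-side
# scan can only hash the ciphertext, which matches no known fingerprint.
print(hashlib.sha256(ciphertext).hexdigest())  # useless to the scanner
print(hashlib.sha256(photo).hexdigest())       # what the scanner needed
```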
Gabriel, it occurs to me that the debate over privacy is enormous and loud.
We've done episodes of The Daily about it, many episodes of The Daily about it.
But the child sexual abuse subject is not as well-known.
It's not as widely debated.
It's the first time we're talking about it.
Is that reflected in this decision, the attention that these two subjects get?
It is.
And in fact, it's one of the main reasons we chose this line of reporting.
The issue of child sexual abuse imagery really brings the privacy issue to a head.
Because we're faced with these very, very stark decisions that we're discussing here right now, which is: is it more important to encrypt the communications on a platform where it's well known that children interact with adults? Is it worth it to encrypt those communications to protect people's privacy when we know what the ramifications for children are?
But the child protection question is also a privacy question. And when you are
Facebook and you're saying, we're going to encrypt conversations for the privacy of our users,
the child advocates will say, well, but you're doing nothing to protect the privacy of the child in the image.
In fact, you're making it harder for their life to ever be private.
Exactly.
And this means that next year, or whenever Facebook moves ahead with its plan to encrypt,
they won't be sending nearly 17 million reports to the National Center.
They'll be sending far fewer. And that means for the family that we spoke with on the West Coast,
who's receiving about 100 notifications a year that someone has been caught with images and
videos of their daughter, they'll likely be getting far fewer notifications. But not because people aren't looking at these images.
They still will be.
The family just won't know about it.
That's my big thing.
I want people to care about this because there is a human factor to this, obviously.
You know, we have to force people to look at it.
You know, the tech companies, they have to do something about it.
So where does that leave this family now?
They feel abandoned and angry with the tech companies.
They have to do something about it because knowing that that's out there,
I don't know, it's just being traumatized all over again.
They keep getting these notifications that their daughter's imagery has been found.
It's ongoing. It's lifelong.
There's nothing they can do about being a victim for the rest of their life.
And the way they describe it is it's like getting hit by a truck,
only to get back up and get hit by a truck time after time after time.
They can't stop being in a car wreck every day.
There's no other way to say it.
That will be there forever until it's removed, until somebody comes up with a way to take that off.
And so what this family has decided
is that they're not telling their daughter.
She doesn't know.
They're not going to tell her that her images are online.
There's no good it would do.
There's no benefit, at least from our perspective, to tell her.
I mean, she needs to worry about soccer and...
Making the team. I mean, she worries about sleepovers and wrestling.
You know, she doesn't need to be worrying about the worst part of her life available on the Internet.
I don't want her to be fearful of what other people might be seeing of her
when they do a Google search.
Up pops her image. That's horrible.
But there is a clock ticking.
I just found out when she turns 18 it isn't a choice.
The FBI will get a hold of her because she's an adult victim now. When she turns 18,
the federal government is going to start sending her the notifications.
So what they're hoping is that in the four years until that happens,
the tech companies are going to solve this problem.
Which is to say, get these images offline for good.
That's what they
hope. My motivation for this is we have to explain to our daughter, this is on the internet and she
has to live with that. But being able to tell her, you know, if you can tell me in five years,
this will be something that was, not is, that'd be great. The dad asked us the question,
can you tell me that when I talk to my daughter about this, when she turns 18,
that I can tell her that these horrific images used to be online, but no longer are?
And what did you say?
What I wished I could tell him was yes.
I think that's why we're doing this story, to be honest with you.
What I did tell him was that that was the point of us doing this reporting,
was the hopes that something would change
and that in five years those images would no longer be there.
But once we publish this article and we see how they respond
and we see how not only tech companies...
But from everything we've learned,
it's only getting worse.
What if it was your daughter?
Yeah, you know, put yourself in that kid's shoes.
What if you were the person they're looking at the rest of your life?
If we could tell that this happened instead of this is happening,
the world would be a lot better off.
She'll be a lot better off.
Michael, Gabriel, thank you.
Thank you.
Thank you.
A few weeks ago, the National Center for Missing and Exploited Children released new data for 2019.
It said that the number of photos, videos, and other materials related to online child sexual abuse grew by more than 50 percent, to 70 million.
We'll be right back.
For highlights and analysis of Wednesday night's Democratic debate in Las Vegas, the first to include former New York City Mayor Michael Bloomberg, listen to this morning's episode of The Latest.
You can find it on The Daily Feed or by searching for The Daily.
I'm Michael Barbaro.
See you tomorrow.