The Daily - Why Apple Is About To Search Your Files
Episode Date: August 20, 2021
Two years ago, a multipart Times investigation highlighted an epidemic of child sexual abuse material which relied on platforms run by the world's largest technology companies. Last week, Apple revealed its solution — a suite of tools which includes an update to the iPhone's operating system that allows for the scanning of photographs. That solution, however, has ignited a firestorm over privacy in Silicon Valley.
Guest: Jack Nicas, a technology reporter for The New York Times.
Sign up here to get The Daily in your inbox each morning. And for an exclusive look at how the biggest stories on our show come together, subscribe to our newsletter.
Background reading: Are Apple's new tools against child abuse bad for privacy? The backlash to the company's efforts shows that in the debate between privacy and security, there are few easy answers.
For more information on today's episode, visit nytimes.com/thedaily. Transcripts of each episode will be made available by the next workday.
Transcript
Previously on The Daily.
So we started doing this test across a number of different search engines.
And to sit there and watch as the program starts ticking away
and see the first time it flashes that it got a match,
and then to see it match again, and match again, and match again.
A Times investigation revealed an epidemic of child sexual abuse material.
Any platform that allows the sharing of files is going to be absolutely infested with child sexual abuse.
On a staggering scale.
They told us that in 2018 alone, they received over 45 million images and videos.
An epidemic that relies on platforms run by the world's largest tech companies.
Every major tech company is implicated.
In page after page after page, we see social media platforms.
Facebook, Kik, Tumblr.
We see cloud storage companies.
Google Drive, Dropbox.
And that those companies are aware of.
And from these documents, it becomes clear that the companies know.
I mean, there's so many cases for each company that they all know.
But have done little to try to stop.
Why are these images and videos still showing up
on some of the largest technology platforms in the world?
The tech companies, they have to do something about it.
Today, Kevin Roose spoke to our colleague, Jack Nicas,
about how one company is now taking action
and why those actions are triggering a heated debate over privacy.
It's Friday, August 20th.
Jack, I remember a few years ago, our colleagues at The Times had this big multi-part investigation
of child sexual abuse material, what used to be called child porn.
What was the reaction to that investigation inside the tech world?
I think that that investigation opened a lot of eyes. There was a sense that this is an enormous
problem and it is happening on our watch. Now, different companies had different opinions,
and that's because some companies were actually quite good at reporting child sexual abuse images, and some companies
were quite bad. Facebook, for instance, was reporting millions of images each year, and a
company like Apple was reporting just a few hundred. So at certain companies like Apple,
there was a feeling that they weren't doing enough. Right, because the conclusion of this investigation was basically that tech companies can, in theory,
find ways to kind of scour their platforms
and find these horrible images
and kind of match them against known child sexual abuse images
that are gathered by these activist groups,
and that that was maybe what they should
be doing more of if they're a company that doesn't monitor for much of this stuff.
Right. And there were some proven methods to do so. And therefore, there certainly were,
I think, reasonable questions on why more companies weren't doing more to attack such
a pernicious problem.
So what has been happening at Apple in the two years since this investigation?
Well, after that investigation came out, there actually was a hearing in Congress in late 2019 about encryption. And one of the things that happened at that hearing was some members of
Congress told Apple, if you're not going to solve this problem, we're going to force you to do it. And what we know is around that time or sometime
after, Apple set out to figure this out. And last week, kind of out of nowhere, Apple abruptly made
an announcement that it had indeed figured it out and it was going to do a lot more to try to
combat child sexual abuse on its products.
And what was this announcement? What did Apple say it was going to do?
So essentially, there are three separate tools that are part of this announcement.
The first one involves Siri, the virtual assistant that Apple has on iPhones.
Going forward, people who ask Siri how to report child sexual abuse will be pointed to resources for that.
And also people who search for child sexual abuse imagery will be pointed to resources to get help
with their issue. Got it. That sounds pretty straightforward. What's the next one? The second
part of their announcement involved text messages. And basically, it is a feature for parents to turn on for their children.
And if a child is under 13 and receives or tries to send a nude image,
the parents can be notified of that.
If there's a child between 13 and 17 years old,
that child will at least be warned and asked whether or not they truly want to view
or send a nude image in a text message. And Apple is not looking at those photos. Presumably,
that's just being done by artificial intelligence or some algorithm is determining like, that looks
like a nude photo. Let's warn your parents or ask you if you really want to send that.
Exactly. So what Apple has said is that the way this works is that for a child's device
on which this feature is turned on, every photo that is either sent or received in a text message
will be scanned by artificial intelligence, a machine learning model that analyzes it and tries to
determine whether or not it includes nudity. And if it does, then either a parent will be notified
or the child will be warned.
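To picture the decision logic Jack just described, here is a minimal sketch in Python. It is an illustration under assumptions, not Apple's implementation: the classify_nudity function, the 0.9 confidence cutoff, and the return labels are hypothetical placeholders; only the opt-in check and the under-13 versus 13-to-17 branching follow the description above.

```python
# Hypothetical sketch of the Messages feature's decision flow as described
# in this episode. The classifier, threshold, and labels are placeholders.

from dataclasses import dataclass

NUDITY_THRESHOLD = 0.9  # assumed confidence cutoff; not a published Apple value


@dataclass
class ChildAccount:
    age: int
    feature_enabled: bool  # the parent has opted in to the protection


def classify_nudity(image_bytes: bytes) -> float:
    """Placeholder for an on-device machine learning model that returns a
    nudity-likelihood score between 0.0 and 1.0."""
    return 0.0  # stub so the sketch runs; a real model would analyze pixels


def handle_image(child: ChildAccount, image_bytes: bytes) -> str:
    """Decide what happens when an image is sent to or from a child's device."""
    if not child.feature_enabled:
        return "deliver"  # the feature is opt-in; nothing is scanned otherwise

    if classify_nudity(image_bytes) < NUDITY_THRESHOLD:
        return "deliver"  # the model did not flag the image

    if child.age < 13:
        # Under 13: the child is warned, and the parent can be notified.
        return "warn_child_and_notify_parent"
    if child.age <= 17:
        # 13 to 17: the child is warned and asked to confirm; no parent alert.
        return "warn_child_only"
    return "deliver"


if __name__ == "__main__":
    kid = ChildAccount(age=12, feature_enabled=True)
    print(handle_image(kid, b"\x00"))  # prints "deliver" with the stub model
```

Okay, so that's the second change. What's the third?
So the final change is that Apple is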
going to create software that scans photos on people's iPhones to see if there are matches
in a database of known images of child sexual abuse. And if there are enough matches, Apple
will report those photos to authorities. And so some people
have described this as the first time that you can get in trouble and get reported to the police
for simply having something on your device.
And how is that approach different from the kinds of
scanning that, for example, Google might do on the photos that you store in Google Photos,
or that Facebook might do on the photos that you upload to its services?
Sure. So in many cases in the past, companies like Facebook and Dropbox have been doing scanning
on photos that have been uploaded to those companies' servers. And so there is somewhat of
an assumption that once I share these photos and upload them to another company's servers,
they no longer are completely my property, and those companies can be a little more aggressive in looking at what those photos are and whether or not they break the law.
But in this case, Apple's approach involves looking at photos that are stored on a user's device.
That they might have felt were not visible to anyone except themselves.
Yes, so it is important to note that Apple is only doing this scanning on photos
that a user has also chosen to upload to iCloud.
And if a user doesn't want their photos to be scanned,
they can simply turn off iCloud backups.
But that is a policy decision because the first part of the scanning is done on the device itself.
And so ostensibly, if Apple wanted to, it could scan and search all the photos on a person's device, not just ones that are being uploaded to iCloud.
Yeah, that does feel like a pretty big difference.
It's sort of like the difference between a friend who calls the cops on you if you bring drugs to his house versus a friend who has
the ability to see inside your house, see whether you have any drugs there,
and then tells the cops if he sees any. It certainly is a much more powerful investigative tool for
law enforcement. I'm sure that the way that Apple has figured out to scan for this material is very
complicated and technical, but can you just describe on a basic level how this scanning
and detection works? Sure. So Apple basically reduces each photo to a set of numbers. And this is kind of like a
fingerprint. It's called a hash. And it means that each photo has its own unique code. And so there
already is a database that's been created by child safety groups like the National Center for Missing
and Exploited Children that have also reduced known images of child sexual abuse to similar codes, these hashes. And so Apple will take the hashes that it creates from your
photos, and it will match them against this database of known abuse imagery and see if there are
matches. And once there are 30 or more matches, an Apple employee will then review versions of
these photos to see if there is indeed child
sexual abuse. And if there is, then that will be sent to authorities and the person's account will
be locked. Why 30 matches? Why not one or two? It's a good question. Apple talks about setting
this higher threshold in part because it wants to ensure that it is only looking at the photos
of an actual child abuser. Essentially, Apple is trying to create a very high bar to ensure that
people who aren't looking at child sexual abuse imagery aren't having their photos looked at by
an Apple employee.
And this process of making the hashes and matching them up against the database of hashes
of known child sexual abuse images, is this going to be happening on every Apple device?
Every iPhone and iPad and all the billion-plus Apple devices are going to start scanning
for this kind of material right away?
No. What it does mean is that the technology and
the capability will now be built into the new operating system on every new iPhone,
but it only will be turned on in the U.S., at least at the start. And this is basically Apple
being asked to do more following this investigation to stop the spread of this
horrible imagery. And this is them basically saying, we're doing more. This is our plan.
This is how we're going to step up our detection and our enforcement against this material.
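As a rough picture of the scanning flow Jack lays out above: hash each photo, compare against the database of known-image hashes, and only escalate for human review once the match count crosses the threshold of roughly 30. The sketch below, in Python, is illustrative only. The photo_fingerprint function uses SHA-256 purely so the code runs; Apple's system relies on a perceptual hash (NeuralHash) and on-device cryptographic matching that this sketch does not attempt to reproduce, and every name and value here beyond the 30-match threshold is an assumption.

```python
# Illustrative sketch of hash-and-threshold matching as described above.
# The fingerprint function and database are stand-ins, not Apple's system.

import hashlib
from typing import Iterable, List, Set

MATCH_THRESHOLD = 30  # the threshold figure discussed in the episode


def photo_fingerprint(photo_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real perceptual hash maps visually
    similar images to the same code; SHA-256 is used only so the sketch runs."""
    return hashlib.sha256(photo_bytes).hexdigest()


def count_matches(photos: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many of a user's photos match the known-image database."""
    return sum(1 for photo in photos if photo_fingerprint(photo) in known_hashes)


def scan_account(photos: List[bytes], known_hashes: Set[str]) -> str:
    """Apply the threshold: only past it would flagged photos go to human review."""
    if count_matches(photos, known_hashes) >= MATCH_THRESHOLD:
        return "escalate_for_human_review"
    return "no_action"


if __name__ == "__main__":
    # A toy database of "known" hashes and two ordinary photos.
    database = {photo_fingerprint(b"known-image-%d" % i) for i in range(100)}
    user_photos = [b"vacation-photo-1", b"vacation-photo-2"]
    print(scan_account(user_photos, database))  # prints "no_action"
```

The threshold is the point of the design Jack describes: a handful of stray matches never reaches a human reviewer.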
Exactly. And I think that for many parents and law enforcement groups and child safety groups, this is a set of really common sense measures that are going to protect children and give parents more oversight over their children's lives and hopefully root out a really pernicious problem from one of the most popular tech platforms in the world.
And so I think there's been a lot of praise for these changes.
It actually seems like an instance in which a problem was shown and a tech company responded
in a real substantive way. And you said that there was a lot of praise, but obviously with any change
this monumental, I mean, scanning photos on potentially more than a billion devices,
it can't be universally welcomed, right?
Far from it.
In fact, even though child sexual abuse is obviously illegal and completely abhorrent,
and there is near consensus in Silicon Valley
that this is a problem worth solving,
Apple's approach kind of ignited a firestorm
in many corners of Silicon Valley.
We'll be right back.
So Jack, you said there's been a firestorm over these changes by Apple.
What are the company's critics saying is wrong here?
The thing that they're most concerned about is the precedent that this sets.
For the first time, this is a major tech company that has created a technology that can go into someone's private device and look at their personal information or
data and decide whether or not that data breaks the law and should be reported to authorities.
And privacy advocates and civil liberties lawyers and cybersecurity experts are all pretty concerned
about the potential for abuse that governments and law enforcement around the world
can see this technology and say, oh, wow, that'd be great to surveil certain portions of the
population or to look for dissidents of the government. And now that Apple has created
this technology, they're worried it's kind of difficult to put it back in the box.
It does seem kind of surprising that Apple is the company doing all this.
Because up to this point, all of their marketing and promotion has been around their ironclad commitment to privacy.
I drive past a billboard for iPhones every day that says
something like, you know, what happens on your iPhone stays on your iPhone. And so this move
from Apple feels particularly surprising, given what they've been saying.
Indeed, you know, privacy has been at the center of Apple's brand and certainly its marketing pitch for many years now.
And this kind of goes back to the mass shooting in San Bernardino in 2015.
The suspects have been identified as Syed Farook and his wife, Tashfeen Malik.
The FBI says it found a smartphone used by Farouk that could contain critical evidence.
Investigators think it might have important information on the couple's whereabouts leading up to...
You know, the FBI in that case wanted to get into the gunman's phone, and it couldn't.
Apple is now being ordered to help the FBI break in.
Federal agents argue they just want a one-time weapon in the fight against terror.
And so it asked Apple to figure out how to get in.
Apple is refusing to hack into Syed Farook's
iPhone despite a court order to help. Most of the tech world has lined up behind Apple
as the company fights a judge's order to unlock the data on the work-issued iPhone. And Apple's response
was, listen, we built these phones with really strong encryption, and that means that if we don't know the passcode, we can't get in.
A new CBS News New York Times poll suggests that Americans are divided over whether Apple should unlock the encrypted iPhone.
Digital rights advocates agree with Apple.
Protesters rallying at Apple stores to support the tech giant's refusal.
A lot of people feel very strongly.
They are chanting again.
So a lot of people coming out here tonight
to voice their opinion.
But some family members of those fatally shot and wounded
weighing in as well.
The husband of one of those killed in the San Bernardino
shootings says Apple is wrong.
And Apple was ready to go to court against the FBI
on this issue. What is at stake here is,
can the government compel Apple to write software that we believe would make hundreds of millions
of customers vulnerable around the world? And actually in that case, you know, Apple's
principal argument was it would be very dangerous for us to build that. A piece of software that we view as sort of the software equivalent of cancer.
Because if we build a way into this encrypted phone,
that means that we now will likely have to use it in many other cases
when we are facing demands from governments.
And that is an important point in this scenario, in the context of today's debate,
because, you know, in the past, if a government had asked Apple to search the photos on a person's
iPhone, Apple would say the operating system on the iPhone literally will not allow us to do that.
We can't do it. But now that
Apple has actually built the technology into the iPhone software, it now can't argue we can't do
that. It only can argue we won't do that. Right. And that feels like a pretty big difference.
That is what privacy groups are so worried about, that Apple will have far less leverage
in those conversations with law
enforcement and governments. The system will already be present on the phone. It will be
much more difficult for Apple to push back against demands to use it for different purposes.
And what is Apple's track record on those sorts of requests from law enforcement or prosecutors to get into people's
devices? Well, it's pretty mixed. What we can say is that in the U.S., Apple has, as we noted,
pushed back against the FBI several times about trying to build a backdoor into encryption.
But they also do provide the contents of people's iCloud accounts to law enforcement.
Now, that only happens with a warrant, but that's data that's not fully encrypted. And Apple has
said, under the law, we're required to turn it over with a warrant, and we do. But we've also
seen Apple make compromises in other countries. We reported earlier this year that in China,
Apple has made a number of compromises that really contradict its principles on privacy. Most notably, Apple is now storing the
data of its Chinese customers on servers that are owned and run in effect by the Chinese government.
Hmm. So the fear of privacy advocates, if we sort of follow this slippery slope down, is that if the Chinese government asked Apple to scan iPhones for something that isn't child sexual abuse material, like, you know, photos of the Tiananmen Square massacre or something,
then Apple not only could technically do that,
but they might just go ahead and do it in order to be allowed to continue to do business in China.
Exactly. That is the concern. And you have to consider that Apple is a private company. They want access to markets. And in a place like China, they also want access to the supply chain.
They build virtually all of their products in China and are hugely dependent upon the country.
So when the Chinese government asks for something, it's very difficult for Apple to say no.
And that is not just the case in China.
Apple wants access to a lot of markets.
And increasingly, governments across the world are becoming more concerned about not being able to get access to individual citizens' devices.
And so we're seeing similar sorts of demands happen in India, and we're also seeing similar legislation being considered in Europe.
And I guess that is what is giving rise to this fear that ultimately Apple's tool, this tool to detect child sexual abuse material, might turn into
something much darker, like scanning people's text messages on their iPhones to see who is
disloyal to an authoritarian regime. Exactly. You know, I think at the beginning, you could think
about it being used for child sexual abuse or to look for mentions or images related to terrorist activity.
But, you know, certain governments have been shown to try to find dissidents of the government.
Or in some places, it's illegal to be gay.
There are a lot of things that would really run against Apple's values, and I think Western values, that could be abused using this sort of technology.
I guess one thing I keep coming back to is this question over trade-offs between detecting child sexual abuse material and giving people a sense that what's on their phone will stay on their phone.
And I guess the question that sort of leads into that
is how effective do we think this tool
is actually going to be?
Does Apple and do people who study child sexual abuse
think that scanning people's phones
for this kind of imagery is really going to be
a significant step in ending the distribution of this content?
Or maybe it will just drive people to different platforms
where their devices and accounts won't be scanned.
Right, I think that at the very least we can say
it is better than the alternative
in terms of combating child sexual abuse.
It almost certainly will catch some people,
and Apple's numbers of reports will certainly rise.
How effective it will be remains to be seen.
So what does Apple say to all of this criticism and feedback?
How are they responding to the privacy advocates?
You know, I actually asked Apple that in a briefing at the end of last week.
And their response was one of frustration. I think
that they feel like they're not getting the benefit of the doubt, that they are trying to
thread the needle on a really difficult problem. They say that they're doing their best to create
as many safeguards as possible, and they really think that they've created a solution that both
solves a difficult problem and protects the privacy of their users.
And the privacy groups should trust that Apple will turn down governments if they make demands
to abuse this sort of tool. So I think there's a lot of frustration at Apple that it's being
criticized for, in their minds, trying to do the right thing. And have they changed their plans or
walked back any of these proposed changes in response to all this criticism?
They haven't yet walked anything back, but they did announce additional safeguards.
So a few days after the initial announcement, Apple said that it was going to have a third-party auditor involved to ensure that the system was working effectively and safely. But I don't think Apple is going to be able to make changes
that will appease privacy groups
because their concern is about the fundamental nature
of what Apple is proposing.
Right, because their objection isn't about
how this is being implemented.
It's that it's happening at all.
It's more of a foundational objection.
Exactly.
What the privacy groups are saying is
what you've proposed here
is very dangerous. Jack, that feels like a pretty hardline stance to me. And I'm just thinking back
to our colleagues' investigation and the horrible stories they found about these families of
children who had been victimized and how they would eventually have to tell those children
that there were these images of them on the internet
and that they might not be able to get rid of them.
And I guess the question that this all raises
is like, what is the price
that we are collectively willing to pay to protect
those people to stop this from happening? Yeah, I think that's what makes this so tricky,
is that what Apple is trying to solve here is a very real active problem and one that has real victims. And the other side of it is more concerned about theoretical harm.
But that theoretical harm is also really worrisome. And so both sides have really important
and I think valid viewpoints here. And unfortunately, they both seem to be in contradiction.
And that means for a company like Apple
and for many of the tech companies
that are trying to figure this out,
there are really few easy solutions.
And are there alternatives here?
Are there ways that Apple could protect people's privacy
in a way that privacy advocates found sufficient while also doing enough to stop the spread of child sexual abuse material?
Or does it kind of have to be one or the other?
Well, the thing is, that's exactly what Apple set out to do.
You know, for years, Apple ignored this problem in some ways because they were concerned about privacy.
And they set out to figure out a way to do both, to create a solution that solved this very real problem while also protecting their users' privacy.
And Apple really thought that it had done that.
But now, given the reaction, I think it's pretty clear
that it's not nearly as easy as Apple had thought.
Jack, thank you.
Thank you.
We'll be right back.
Here's what else you need to know today.
Throughout the past week on the show,
we've met Afghans who are desperately trying to find a way out of the country now that the Taliban has taken control.
I think it's the worst day of my life to see what I'm seeing. One of them was a 33-year-old woman and vocal critic of the Taliban
who asked that we refer to her as R to protect her identity.
We are just waiting to see how,
if we will be able to make it to the airport sometime soon.
I don't know.
After days of trying to secure a seat on a flight,
she finally got word.
She would be leaving soon on a plane to France.
So we are on the plane, ready to take off.
I have all different kind of emotions, mixed emotions. Sadness, grief, anger,
outrage,
disappointment, hopelessness at the same time. I'm just praying to the universe that I will be back soon.
Yeah, I hope it happens soon.
R landed in Paris on Thursday.
Now that she's safe,
R told the Times
we could reveal her full name,
Rada Akbar.
Hello, dear madam.
Hopefully you are doing well.
This is the Kabul airport.
You can see.
We also told the story of Abdul,
the interpreter who worked for the U.S. Army
and was seeking a visa that would let him and his family resettle in the United States.
After the Taliban arrived in Kabul,
Abdul said he made his way to the airport with his wife and two children.
When the people, they saw the Taliban,
everybody right now jumping on the wall.
He's also starting firing from the U.S. Army side also.
But after several hours there, they were turned away.
I just heard from Abdul,
this man and his family should not be in this situation.
Colin Daniels, who worked alongside Abdul as a lieutenant in the U.S. Army, has been following his journey. The long story short is that we took direction from the State Department after putting him on the list of people who are going to be tried to be evacuated.
He was told to wait and shelter in place, but he just sent me two video messages from his safe house
that the Taliban is setting up a checkpoint right outside.
So we're running out of options.
You can see they're walking.
AK-47, there is the Czech one.
You see the Taliban, they come back.
There's a gun.
There's a weapon.
There's everything.
So for now, Abdul and his family are in hiding in an undisclosed location in Kabul, watching
the Taliban out their window and hoping that the Taliban doesn't find them.
Today's episode was produced by Eric Krupke, Rob Szypko, and Chelsea Daniel,
with help from Lindsay Garrison.
It was edited by Dave Shaw, contains original music by Marion Lozano and Dan Powell,
and was engineered by Chris Wood.
That's it for The Daily.
I'm Michael Barbaro.
See you on Monday.