Betrayal - S2: Bonus Ep 1 - More on "What Went Wrong?" with the New York Times
Episode Date: July 20, 2023
New York Times investigative reporters Michael Keller and Gabriel Dance share more about what tech companies and the government are doing and aren't doing to confront the vast trading of child sexual abuse material across the internet. If you would like to reach out to the Betrayal Team, email us at betrayalpod@gmail.com. To report a case of child sexual exploitation, call The National Center for Missing and Exploited Children's CyberTipline at 1-800-THE-LOST. If you or someone you know is worried about their sexual thoughts and feelings towards children, reach out to stopitnow.org. In the UK, reach out to stopitnow.org.uk. Read the article by Michael Keller and Gabriel Dance, The Internet Is Overrun With Images of Child Sexual Abuse. What Went Wrong? See omnystudio.com/listener for privacy information.
Transcript
The true crime podcast Sacred Scandal returns for a second season to investigate alleged sexual abuse at Mexico's La Luz del Mundo megachurch.
Journalist Robert Garza explores survivor stories of pure evil experiences at the hands of a self-proclaimed apostle who is now behind bars.
I remember as a little girl being groomed to be his concubine, that's how I was raised.
It is not wrong if you take your clothes off for the Apostle. Listen to Sacred Scandal on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
911 what's your emergency?
It's a nightmare we could never have imagined.
And a killer who is still on the loose.
In the 1980s, we were in high school losing friends, teachers, and community members.
We weren't safe anywhere.
Would we be next?
It was getting harder and harder to live in Mompine.
Listen to the Murder Years on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Get your tickets now at AXS.com for our 2023 iHeartRadio Music Festival, coming back to Las Vegas September 22nd and 23rd at T-Mobile Arena, streaming live only on Hulu. Buy your tickets for our 2023 iHeartRadio Music Festival now, starting at just $55 plus taxes and fees at AXS.com.
Topics featured in this episode may be disturbing to some listeners. Please take care while listening.
This is a crime that thrives in the shadows and people needed to hear what was
actually going on. One of the biggest problems with reporting on this is nobody wants
to hear about the problem because of how awful it is.
I'm Andrea Gunning, and this is a Betrayal bonus episode.
In episode 4, you heard from New York Times reporters Michael Keller and Gabriel Dance,
as they spoke about their 2019 investigative piece on child sexual abuse material.
It's called The Internet Is Overrun With Images of Child Sexual Abuse.
What Went Wrong?
If you have a chance to read it, look it up, because it's superb investigative reporting.
There's a link in our show notes to the article.
We wanted to dive a little deeper.
How are crimes being reported?
What role are technology companies playing?
And how is the government responding?
Here's Michael Keller.
This is a crime that thrives in the shadows, and people needed to hear what was actually
going on.
Here's reporter Gabriel Dance.
One of the biggest problems with reporting on this is nobody wants to hear about the problem
because of how awful it is.
And to be honest, we were nervous.
But we know from season one of Betrayal that our audience is genuinely interested in letting
the light in on dark stories.
One of Michael and Gabriel's most important revelations was that our legislators don't
really want to hear about it.
State lawmakers, judges, and members of Congress have avoided attending meetings and hearings when it was on the agenda.
They just aren't showing up.
One of the big things was the failures of the federal government to live up to its own promises that it made around 2008
to develop a strong national response. The government had not really followed through on its grand plans. The high-level position at DOJ was never fully created. The strategy
reports that were supposed to come out on a regular basis, there have only been two of them over the last decade.
You know, the number of reports has risen, but federal funding to these special task forces has largely remained flat.
I mean, there are so many of these offenses going on. There are so many reports.
There are not enough police in the United States, seemingly, to solve this problem.
ICAC, or the Internet Crimes Against Children Task Force,
is working on the front lines every day.
There's at least one ICAC in every state.
Hearing what they go through daily, it's truly harrowing.
What I will say is speaking with members
of these ICAC task forces, I was always
in such admiration and awe of their work,
dealing with this kind of content
and this kind of horrible crime
and really the survivors and how hurt some of them are
for sometimes the rest of their lives.
We spoke with an ICAC agent in Kansas
who had served in the Iraq War, and he said that he would almost
rather go back and serve another tour than continue in his position dealing
with these types of crimes. He said that he worked at ICAC, and then to take a
break he did a tour in Iraq, and then came back and felt like, right, now I can go back and keep doing this
work.
I'm in awe of all of these law enforcement officers.
They choose this work because they want to save children, but it really is akin to war
in an emotional sense.
Some people like viscerally cannot deal with this issue because it is truly one of the most awful crimes
that we commit against one another.
And the descriptions, Michael and I probably read hundreds
and hundreds of search warrants and legal documents
that would describe videos and photos and the acts in them.
One strategy ICAC uses to write reports is to turn the video off when documenting the
audio and turn the sound off when documenting the video because it's too much to handle
at the same time.
These ICAC task force members can only do so much with what they are given.
They triage the cases, often prioritizing the youngest victims.
But they can only investigate about a third of all the tips, because the caseload is so overwhelming.
Of course, predators are the biggest problem and bear the most responsibility.
But we need to acknowledge there's another culpable participant when it comes to the explosion of CSAM: technology companies. Before the internet, the US Postal Service
was the leading reporter of CSAM and was stopping the dissemination of material
via the mail. However, with millions of images plaguing the internet, is it time
we started holding technology companies responsible for their lack of action?
What I do think they're certainly responsible for is allowing this problem to get very serious
before they started to take responsibility for their role in it.
As early as 2000, tech companies knew this was a very serious problem and were doing nothing to solve it.
So I would say that tech companies are certainly responsible for allowing the problem to spiral
out of control in the early part of this century.
And I'm encouraged that from what we've seen, several of them have begun to take the
problem much more seriously.
If the technology exists to root out criminal behavior, why aren't tech companies deploying it?
Microsoft, along with Professor Hany Farid, came up with a technology called PhotoDNA.
This takes a database of image fingerprints, and whenever a photograph gets uploaded to an internet platform, that company
can scan it to see if it's in the database of verified, illegal imagery.
And so that's the main tool that tech companies use, which is great because it's largely
automated and easy to use.
It's been around for a long time.
So a company like Facebook or others that are doing automated scanning,
they can generate a large number of reports just through this software.
They also generally have a team of human moderators that review it,
and that serves an important role of verifying what was found and also escalating it
if there's evidence of actual hands-on abuse of a child.
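The scanning workflow described above can be sketched in a few lines of code. This is only an illustration: PhotoDNA itself is a proprietary perceptual hash that matches images even after resizing or re-encoding, so this sketch substitutes an ordinary SHA-256 digest as a stand-in fingerprint, and the database contents and function names here are hypothetical.

```python
# Illustrative sketch of hash-based upload scanning, loosely modeled on the
# PhotoDNA workflow described above. SHA-256 is a stand-in fingerprint only;
# PhotoDNA uses a proprietary perceptual hash, not a cryptographic one.
import hashlib

# Hypothetical database of fingerprints of known, verified illegal images.
known_fingerprints = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image (stand-in for PhotoDNA)."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known fingerprint.

    A True result would be escalated to human moderators for review,
    mirroring the automated-scan-then-review pipeline described above.
    """
    return fingerprint(image_bytes) in known_fingerprints

print(scan_upload(b"example-known-image-bytes"))  # True  -> escalate to review
print(scan_upload(b"some-other-image"))           # False -> no match
```

Because the lookup is a set membership test, this step is cheap and largely automated, which is why (as noted above) platforms can generate a large number of reports from it; the expensive, human part is the moderation that follows a match.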
If you talk with most technology policy people, one perspective that you hear a lot is that
technology companies don't have that much pressure to get rid of harmful content
on their platform because they don't face any legal liability for it. Technology companies,
of course, would say, we have every reason to get rid of this harmful content; we don't want
to be a place for exploitation. The legislative solutions that have been proposed so far
try to go after Section 230 of the Communications Decency Act,
which shields technology companies from liability for content that users post.
There have been a few proposals to try and change that, both from Democrats and Republicans.
It's been one of the few areas of bipartisan support.
Those proposals have not gone through, but over the last few years, you do see people
trying to find ways to increase the incentives for tech companies to clamp down on this more.
Let's take Facebook's parent company Meta as an example.
Meta is the leading reporter of Child Sexual Abuse
Material to the National Center for Missing and Exploited
Children.
Almost all of the illegal content gets transmitted
through their messenger app.
That isn't necessarily because it has the most CSAM,
but because they are using PhotoDNA and finding offenders.
Currently, Messenger does not
encrypt its messages. However, Meta has announced that this year, it will make end-to-end
encryption the default. Meta executives have admitted that encryption will decrease its
ability to report CSAM, saying, if it's content we cannot see, then it's content we cannot report.
The Virtual Global Taskforce, a consortium of 15 law enforcement agencies, is practically begging Meta not to do it. Meta's CEO, Mark Zuckerberg, stated,
encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things.
When billions of people use a service to connect, some of them are going to misuse it for truly
terrible things, like child exploitation, terrorism, and extortion.
The more communications are encrypted, the less capable tech companies are of using these
automated scanning tools to find and report CSAM.
That's a much broader conversation that should be had, and
oftentimes it gets shorthanded to everything should be encrypted or nothing should be encrypted.
It was not even so much that he liked sex. He wanted someone to prey on.
It's the largest cult in the world
that no one has ever heard of.
For three generations,
La Luz del Mundo had incredible control
over its community that began in Mexico
and then grew across the United States, until one day.
A day of reckoning for the man whose millions of followers called him the Apostle.
Their leader was arrested and survivors began to speak out about the sexual abuse, the murder,
and corruption.
This is just a business and their product are people.
They want you to know that they will kill you.
Listen to all episodes now on the iHeartRadio app, Apple Podcasts, or wherever you get
your podcasts.
911, what's your emergency?
You shot her!
Oh my God!
It's a nightmare we could never have imagined.
And a killer who is still on the loose.
My small town rocked by murder.
There are certain murders I'm scared to discuss.
In the 1980s, we were in high school losing friends, teachers, and community members.
One after another, after another, for a decade.
We weren't safe anywhere.
We were teenagers terrified to leave our own homes.
Would we be next?
Who is killing all the kids? And why?
In that moment, I saw rage.
And why do some want the town secrets
to stay dead and buried forever?
I'm not sure why you're digging up all this old stuff again,
but I'd be careful.
Don't say I didn't warn you, Nancy.
Listen to the Murder Years on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Facing Evil is back and we're bringing you conversations that truly matter.
I can't tell you how many times she had said he's going to kill me.
I will never escape him. He will find me and he will kill me.
We're talking with experts and change makers devoted to making a difference in these tragic true crime stories. Our system failed us and we need to make sure that that does not happen again
for anyone in this country. We are all trying to bring the light to the darkness.
The main aim of spreading the story is to teach everyone that everyone deserves equal rights.
Everyone deserves to understand that history is taught and shared so that it can never be repeated again.
New episodes of Facing Evil are available now.
Listen on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
The encryption conversation is often complicated by this particular issue, and it's held up by both sides,
both by law enforcement and by tech companies and people who believe
that all communications should be encrypted, as a wedge issue.
I think there can be more nuance to that conversation,
particularly when you come to platforms
and social media networks where adults can engage with children.
Just by definition, children are at such a disadvantage.
Something that's important to note as well is that many of these social networks also
give predators an opportunity to engage with children in a way that was never before possible.
You have documented cases of grown men going on Facebook, pretending to be children, and then sexually
extorting other children into sending images of themselves, after which they continue
to force them to produce more and more imagery.
Gabe is referring to what is commonly known as sextortion: tricking a young person
into sending an image and then
essentially blackmailing the child into sending more with threats of exposure or harm.
The encryption debate won't be solved anytime soon, but it's clear that protecting children
from abuse is not enough of a reason to compel for-profit tech companies to consider changing
their approach. Social media websites and messaging platforms are ground zero for the production and sharing of CSAM.
Through the Dark Web and encrypted groups,
appalling communities have developed.
Take the site Welcome To Video.
This Darknet site hosted in South Korea
amassed more than 250,000 child exploitation videos
in only two years.
Welcome To Video created a community of users who bought and traded appalling content.
Videos were sold for Bitcoin.
According to an April 2020 article in Wired Magazine, the site's upload page instructed,
do not upload adult porn.
The last two words highlighted in red for emphasis.
The page also warned that uploaded videos would be checked for uniqueness, meaning only
new material would be accepted.
In a lot of online groups, these images are like a currency.
In order to gain access to people's collections,
it's required that you produce new, never-before-seen images.
So you also have that dynamic where people that want to get images
are pushed into abusing children and documenting that abuse and sharing it online.
Welcome To Video was brought down by a joint effort
between the FBI and the South Korean government.
It was the result of dogged detective work
and internet sleuthing.
And while it was hosted in South Korea,
many of its users were United States citizens.
There are so many people who don't realize just how big this problem is, and how close to home,
it actually hits. So with all of this information we have, what can we do to make the public more
aware of this problem? What I came away with as the clearest call to action
from our reporting is spreading awareness and educating
parents and encouraging them to educate their children.
This is not necessarily a problem that tech companies
can solve, and certainly don't seem determined to solve.
We spoke with a few online child safety experts who had a few pieces of advice.
One brought up the idea that the industry is not in the business of promoting safety,
and she said that she would love to see whenever she buys a cell phone, a pamphlet
that comes along with it that says how to keep your children safe with this device.
The key thing is to not keep abuse secret.
The less we talk about this, the more the offenders have an advantage.
They thrive on the feelings of guilt and blame that a child may have if they were tricked
into sending a nude photograph; that shame is really what gives them more power.
If you or someone you know has been a victim of sextortion, you can get help.
Email the National Center for Missing and Exploited Children, or call 1-800-THE-LOST.
Many thanks to Michael Keller and Gabriel Dance from the New York Times.
See our show notes for a link to their article, The Internet Is Overrun With Images of Child Sexual
Abuse. What Went Wrong?
Since we spoke with Michael and Gabriel, Meta has been caught up in controversy again.
A recent investigation by the Wall Street Journal and researchers at Stanford University
and the University of Massachusetts Amherst found that Instagram was helping to link predators
and people selling child sexual abuse material.
Its algorithm connected accounts, offering to sell illicit sex material with people seeking it.
According to the Wall Street Journal, Instagram allowed users to search for terms that its own
algorithms know may be associated with illegal material. And it's not like they were hiding it.
Instagram enabled people to search hashtags like #pedowhore and #preteensex,
Then connected them to accounts advertising CSAM for sale.
If that wasn't troubling enough, a pop-up screen warned users,
these results may contain images of child sexual abuse,
and then offered users options.
One of them was see results anyway.
Meta has set up an internal task force to address the problem.
If you would like to reach out to the Betrayal team, email us at betrayalpod@gmail.com.
That's betrayalpod@gmail.com. To report a case of child sexual exploitation,
call the National Center for
Missing and Exploited Children's CyberTipline at 1-800-THE-LOST. If you or someone you
know is worried about their sexual thoughts and feelings towards children, reach out
to StopItNow.org. In the United Kingdom, go to StopItNow.org.uk. These organizations
can help.
We're grateful for your support. One way to show support is by subscribing to our show
on Apple Podcasts, and don't forget to rate and review Betrayal.
Five star reviews go a long way. A big thank you to all of our listeners.
Betrayal is a production of Glass Podcasts, a division of Glass Entertainment Group, in
partnership with iHeart Podcasts. The show was executive produced by Nancy Glass and Jennifer Fason,
hosted and produced by me, Andrea Gunning,
Written and produced by Carrie Hartman,
also produced by Ben Fetterman,
Associate Producer, Kristin Melcuri,
our iHeart team is Ali Perry and Jessica Crinecheck,
special thanks to our talent Ashley Litten
and production assistant Tessa Shields,
audio editing and mixing by Matt Alecchio,
Betrayal's theme composed by Oliver Baines,
Music Library provided by Mike Music,
and for more podcasts from iHeart,
visit the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.