The Daily Show: Ears Edition - How Social Media Filters Are Destroying Our Mental Health | Frances Haugen
Episode Date: January 26, 2024. Social media and photo editing are giving teens unrealistic beauty expectations for themselves, causing them to turn to plastic surgery. Here's why that needs to change. Plus, former Facebook product manager-turned-whistleblower Frances Haugen explains how Facebook's algorithm stokes anger and division, and discusses the social media giant's impact on teen mental health. See omnystudio.com/listener for privacy information.
Transcript
Survivor 47 is here, which means we're bringing you a brand new season of the only official Survivor podcast, On Fire.
And this season we are joined by fan favorite and Survivor 46 runner-up, Charlie, Charlie, I'm excited to do this together.
Thanks, Jeff. So excited to be here, and I can't wait to bring you inside the mind of a survivor player for season 47.
Listen to On Fire, the official
Survivor podcast, starting September 18th, wherever you get your podcasts.
You're listening to Comedy Central.
I'm talking about photo filters. Yes, they've helped mankind realize its dream of
puking rainbows but some of the most popular filters just help you look more attractive, which
may sound harmless, but it could be anything but.
Cutting-edge apps and social media filters are allowing ordinary people to
enhance their online photos to impossible perfection. In some cases it's
sparking a concerning phenomenon. With apps like FaceTune, you have the power to completely transform yourself.
Bigger eyes, skinnier nose, and jawline.
Smaller butt or flatter belly, whiter teeth, smoother skin, you can do it right on your phone.
When I take a selfie, I always use filters. I wish I could look like my filtered self in real life.
This obsession with personal appearance that selfie culture encourages may have darker implications
for mental health.
A study in the Journal of the American Medical Association
says filtered pictures can take a toll on self-esteem,
body image, and even lead to body dysmorphic disorder.
I do feel like we're losing touch with what reality looks like.
We're already getting there to the point where we're expecting people to look as
unhuman as possible.
Yeah, photo editing filters set unrealistic expectations for beauty.
The same way Fruit Ninja sets up unrealistic expectations of how easy it is to slice
floating fruit.
Yeah!
And once you have this filtered version of yourself in your head, you become dissatisfied
with what you really look like.
So in essence, we're basically catfishing ourselves.
But if these editing apps can turn adults into quivering blobs of insecurity, just imagine
what they're doing to kids.
Psychologists warn these photo filters can be particularly troubling for teens and young people
who are still developing their sense of self.
80% of girls in one survey say they compare the way they look to other people on social media.
On Instagram, I follow people like Kendall Jenner and Kylie Jenner,
they all have this like time measure, like body image that everyone is expecting from this generation.
Young girls on social media have a negative body perception
with one in seven girls reporting being unhappy with the way they look at the end of elementary school
and that number almost doubling to nearly one in three by age 14. 80% of young girls are using
photo retouching apps to change the way they look before posting pictures.
And those with high scores for manipulating their photos were associated with high scores
for body-related and eating concerns.
Any of you ever question your body because of what you see on social media?
Shame man, this is a vicious cycle for teenagers.
Social media makes them unhappy with how they look, so then they use filters, which perpetuate the unrealistic expectations for themselves and others.
Plus, they're teenagers, so they're doing all of this while they're driving,
which puts everyone at risk.
And all the insecurity this creates is harmful for teenagers, because I know it's hard to tune
this stuff out, but teens shouldn't be obsessing over it. Like, I honestly wish I could sit all teenagers down
and say, hey, don't worry about how you look.
The planet's gonna die out before you're 30.
It doesn't matter.
Now, it's bad enough when people wish
they had the perfect Instagram look in real life.
What's worse is when they actually try to make it happen.
The more people look at doctored-up images,
the more likely they are to actually start
seeking out cosmetic procedures at younger ages.
These cosmetic procedures are becoming so popular with teens.
Plastic surgeons have coined a new syndrome for it.
Snapchat dysmorphia.
And the number of kids getting nip-tucks may astound you.
In 2017, nearly 230,000 teens had cosmetic procedures.
Kids as young as 13 are getting them.
Doctors seeing an influx of people of all ages turning to plastic surgery to look more like their filter.
62% of plastic surgeons reported their patients wanted to go under the knife
because of dissatisfaction with their social media profile.
57% said their patients wanted to look better in selfies.
Absolutely.
It's becoming more and more common
where people will show me images on their Instagram
or even something that posted on Facebook
and go, this is really how I want to look.
Just last week I had a patient come in
and asked me for more of an anime eye,
and she couldn't figure out why it's not possible. Okay, man, this is really disturbing. Thirteen-year-olds, in particular,
should not be getting plastic surgery.
I mean, when you're 13,
your physical appearance is already naturally changing.
That's what our faces are doing.
It's like long-term plastic surgery.
I mean, this is what I looked like when I was 13.
You gotta let that shit play out. Honestly, though, I don't blame the teenagers.
I blame the parents and the plastic surgeons.
I mean, how are you going to let them do this to themselves?
They can't even buy cigarettes, but you're going to let them buy a new face?
Clearly, this is getting out of hand, which is why many influencers have started speaking up on this issue, admitting that they've presented
altered images in the past and are opening up the conversation.
Some are even posting raw, totally unedited photos of themselves and breaking down how people
on your Instagram feed may be manipulating their angles and lighting to get that
quote-unquote perfect selfie.
There are many celebrities exposing the dangers of digital distortion. They are posting
images of themselves unedited, unfiltered, online. And this is a great example to
young girls. Pop star Lizzo made a big splash when she posted a selfie in the nude and unretouched.
There's no shame anymore, and I just kind of post myself.
It's like you take me as I am.
You gonna have to love me.
British MP Luke Evans has proposed the digitally altered body image bill,
which would require advertisers and publishers to display a logo whenever a person's face
or body has been digitally enhanced.
Okay, first of all, I love the idea of putting disclaimers on photos of people who have
been digitally altered. I love it.
And honestly, I don't think we should stop there.
We need to do this with everything that's been digitally altered, like food ads.
Those are the worst.
Every fast food burger looks great on TV.
But then when I order it, it looks like it fell asleep in a hot tub, but I'm glad that we're finally learning the truth about what celebrities look like.
Now, personally, I'm waiting for SpongeBob to join this movement.
I mean, no way that guy is that square naturally.
Have you seen that?
It's like, it's not even... maybe it's not real.
Then how would you have a TV show? Now, I'm not naive enough to think that society is going to change overnight, but I do hope that we can better educate our kids and ourselves that our own natural
bodies are beautiful.
I mean, except for that flap of old people's skin we have on our elbows.
That shit is gross.
I don't care who you are.
It looks like a mid-arm ball sack.
But everything else is beautiful. But because this movement could take a while, we here at The Daily Show decided to come up with a filter of our own that might help.
Are social media filters giving you body image issues?
Are you depressed you don't look as good as your filter?
Then good news.
You'll never have to worry about living up to your filter again with Rudify.
It's a brand new filter that turns your face
into Rudy Giuliani.
You'll never be happier with how you look in real life.
And obviously, this wouldn't be effective
if you could turn it off.
So, Rudify overrides all other filters.
And just to be safe,
Rudify retroactively applies itself to every face and every photo in your phone.
The best part is the filter is permanent.
Just like Rudy himself. You can never get rid of it.
With Rudify, you'll be overflowing with self-esteem.
Warning, use of this filter by Rudy Giuliani will rupture the fabric of space and time.
Hey, I'm Ben Meiselas. I'm Brett Meiselas. And I'm Jordy.
We are the hosts of the MeidasTouch podcast, the top-rated podcast for pro-democracy
content.
Every single day we release new episodes reporting on the issues that matter most, without any
of that both sides, corporate media BS that we are all so sick of.
We also have conversations with incredible guests, like President Joe Biden.
Reminding, the best politics is truth.
You're a child and the truth.
Second gentleman, Doug Emhoff,
Secretary Pete Buttigieg,
Representatives Jasmine Crockett,
Jared Moskowitz, and more.
And it's much more than just a podcast.
We have over 3 million subscribers on YouTube, so come see what the buzz is all about. Subscribe to the MeidasTouch podcast wherever you get your podcasts.
That's MeidasTouch, M-E-I-D-A-S-T-O-U-C-H, podcast.
Jordi anything to add?
Shout out to the Meidas Mighty! My guest tonight is former Facebook product manager turned whistleblower Frances Haugen.
She's here to talk about how Facebook prioritizes profits over public safety and teen mental health.
Frances Haugen, welcome to The Daily Show.
Thank you for inviting me. Happy to be here.
Your name isn't as big as the story that you really, I mean, leaked to the world.
You know, a lot of people, you say,
Frances Haugen, and they'll be like, who is that?
But then if you say to people, hey,
remember how we all learned that Facebook
and what social media is doing is essentially destroying
teenagers' brains and it's harming all of us?
Yeah, that report happened because of you. You've worked for Google, you've worked for Pinterest, you've worked for Yelp, and yet you blew the whistle on Facebook. Why did you
feel like, oh this is something I can't sit on? So when I joined Facebook, I
thought I was going to work on misinformation in the United States, and I
was surprised, I actually was charged with working on misinformation only outside the
United States. And like many people and many technologists,
I had never really focused on Facebook's impact internationally.
And very rapidly, I realized kind of the horrifying magnitude
of the danger that we are facing,
that Facebook's algorithms give the most reach to the most extreme and divisive ideas.
And that process is destabilizing some of the most fragile places
in the world, like Ethiopia or what happened in Myanmar.
You see, when you say that, there are some people who will accept it immediately.
And then there are some people who are like, that's not true at all.
But I'm sure there are a lot of people who will say, but I mean, aren't there always going to be extreme people in the world? You know, isn't there always somebody saying something?
You know, Facebook themselves say they go like, hey, we're not, we're not part of the problem.
We're merely a platform that people put their views on.
We are not part of the problem.
And you disagree with that.
So let's imagine you had a relative who had particularly extreme ideas. You know, say, like the crazy uncle.
Right, right, right.
That person, if we're talking to them one-on-one or we're talking to them at a family
gathering, that scale allows for resolution of ideas.
Okay.
What's happening on Facebook right now is that the ideas that are the most able to elicit a reaction, and the shortest path to a click is anger, those get the most reach. And ideas that are more moderate
or that try to bring us into synthesis
or to help us find a middle path,
those aren't as likely to elicit a comment from you
or elicit a like.
And so they don't get as much distribution from Facebook.
So Facebook is essentially propelling certain ideas out.
So they may say, we're the platform, but really what they're also doing is they're advocating for certain viewpoints because they push those
out because those get people engaged and spinning in the cycle. So I think
it's a thing where it's not that they sit down and go, how can we have more extreme
ideas? I don't think that's what they're trying to do. But they do know that there are lots of solutions for how we change the dynamics of these systems, so that you could have good speech counter bad speech.
How do we give those more of an equal say at the table?
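To make the mechanism Haugen describes here concrete, the following is a minimal illustrative sketch in Python. The weights, the click-prediction numbers, and the example posts are all hypothetical, not Facebook's actual ranking code; it only shows how scoring a feed purely by predicted engagement can hand the most reach to the post most likely to provoke an angry reaction.

```python
# Illustrative sketch only: hypothetical weights and posts, not Facebook's actual system.
# It shows how ranking purely by predicted engagement can favor anger-provoking content.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float      # model's guess at likes per impression
    predicted_comments: float   # comments per impression (angry threads score high)
    predicted_reshares: float   # reshares per impression

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments and reshares count far more than likes,
    # because they keep people reacting and clicking the longest.
    return (1.0 * post.predicted_likes
            + 5.0 * post.predicted_comments
            + 10.0 * post.predicted_reshares)

feed = [
    Post("Long, nuanced explainer on mask research", 0.04, 0.01, 0.005),
    Post("MASKS DON'T WORK, wake up!!!",              0.03, 0.12, 0.090),
]

# Rank the feed by predicted engagement: the inflammatory post wins,
# even though nobody explicitly chose it for being inflammatory.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.text}")
```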
You know, it's interesting that you lay it out like this because I remember talking
to my friends about this saying, have we become angrier in the world?
Have we become less able to handle misunderstandings?
because I sit with my friends and I go like I don't agree on so many things with my friends but I don't remember a time
when that meant that I couldn't be friends with them. Yeah. In the world of
social media right now it feels like if I disagree with you on this I disagree
I disagree with you on everything. It seems like Facebook goes, if
you disagree, we're going to find more ways to find where you disagree, or how to make you disagree. Yeah, so some people have said this idea of, like, Facebook's always looking for fault lines,
right? So a high quality piece of content is one that gets lots of engagement. And it turns
out an angry comment thread where, you know, it causes us to yell at each other,
that is viewed as a higher quality piece of content. And so, you know, during COVID, I've had friends who wrote very long
thoughtful pieces that went and digested lots of information. That's going to
cause less knee-jerk reaction than something that really offends you.
So, masks don't work. Yeah. People are clicking on that. Here's the thing
about masks we have to consider in our society. Boo. Yeah, Facebook's not pushing that. So what's interesting, you know, in everything that you've done is, even though you're a whistleblower, even though
you came out with this information, you've never said I'm anti-Facebook, you've
never said shut Facebook down, you've never said this thing needs to go away.
What you're arguing for is that we change our relationship with social media. Is it more about, like, should we use our phones so much?
But I'm encouraging us to have a conversation about what's our relationship with these companies,
or what's the companies' relationship with us?
Okay.
Facebook knows that they have a level of unaccountability that's very different than either
Google or other big platforms because in the case of Google, we can scrape, we can download their results and analyze them
and see if there are biases.
In the case of Apple, we can take apart the phones
and put up YouTube videos saying Apple phones
do or don't work the way they claim.
But in the case of Facebook, researchers and activists
have been telling Facebook,
hey, we have found all these examples, we think there's a pattern here. We think there's too much human trafficking. We think kids are suffering.
And Facebook keeps coming back because they know no one can call them on it
and saying that's just anecdotal, that's not real.
But when you release the information,
we realize it wasn't anecdotal, it was real because Facebook had done the research.
And then, I mean, this is interesting. It reminds me of, you know, the fossil fuel companies. They do the
research, they find out something really bad and then I mean obviously they go
like, we're not going to put this out there. But we know one of the
biggest things that's really gotten people worried is how social media
affects younger brains. I mean, it affects my brain, and I don't consider
myself young anymore, but young people seem like they're at
the highest risk. Yeah, Facebook's internal research says that the highest rates
of problematic use peak at age 14. So you're not allowed to be on the platform
until you're 13. So I guess it takes a little bit of time to form that
habit, but they've done studies that have a hundred thousand people in them, these are not tiny studies, and they find that the largest fraction of any of the age cohorts they sampled who say, I
can't control my usage and I know it's impacting my health, my employment, or my
school, is at age 14. And so there's a real thing where young people have both, you
know, they're struggling with issues in their life and they don't have the self-control yet. And they see this. They say this to researchers. They say, I know this is making me feel bad, and I can't stop.
I fear I'll be ostracized if I leave.
And those factors mean it's not easy for kids just to walk away.
It's also not easy for people to understand how to fix it.
You have lawmakers now who are trying to figure this thing out, but this thing seems way ahead of them, right? Like way ahead of them. Lawmakers, you see these hearings,
they'll be like, what if my Facebook is an Insta flap?
They don't understand what's happening.
So how do we begin creating a world
where we're not destroying the companies,
but we're regulating them the way they should be?
So the main thing I'm advocating for is around transparency, because one of the problems today is that Facebook is the only one that holds the cards,
like they can see, they can see whether or not
they're holding a royal flush or not,
what they claim.
We need to move to a world where we have more access to data.
And that can be aggregate data.
It can be privacy-sensitive data. That's a false choice, privacy versus transparency. And once we have the ability to have conversations,
we can stop talking about the boogeyman of social media.
I see.
And we can start having conversations
on how do we solve these problems.
I think what we need to do, and this is what I'm going to spend
2022 doing, is we need to start organizing people.
Like, I want to release this information, because I don't think Facebook is just going to do it automatically out of the goodness of
its heart. We have to force them to.
And I think there's lots, lots of opportunities where we can begin putting pressure on them,
either socially or financially.
As a user of social media, as a person who speaks to all my friends on social media,
I have my enemies on social media. I appreciate this conversation because I think we're living in a world where we're all fighting, really, just because an algorithm is trying to make us fight half
of those fights.
I've noticed, even with the show, if I say a thing on the show, and this has been
like the most interesting thing for me, depending on what I say, Facebook sends
that to different people.
That's a real thing. Yeah, one time we had a thing on the show where literally I'd
laid out both arguments. I said, here's one way to see it, here's another way to
see it, but then on Facebook only one way went to each group. Yeah. And I was
I was fascinated by that, because then people were like, why didn't you say it the other way? And then you realize, you know? So one of the things that happens is,
so let's say you take a video clip.
So let's say you have, out of any given one of your shows,
people go and put on YouTube lots of different little chunks.
People can re-seed those on to Facebook over and over again,
and they do.
And each time, it enters into the network in a different way. And the algorithm begins picking up data on that and it says, oh,
interesting, like these kinds of people engage, these kinds don't.
And so, and because it's entering from so many different points,
it actually gets a chance to learn what kinds of people or communities it engages the
most with. And so that echo chamber is real, like the algorithm pulls us towards homogeneity. It's almost like Facebook
knows that you react most to your neighbor when they play loud music.
There's no reaction from you when your neighbor is doing other things but when
your neighbor plays that loud music, that's when you react.
Actually, Facebook goes, I'm just gonna... It's even worse. It's even worse.
How can it be worse? Let's imagine there's music that you like and there's music you don't like. It actually knows you really don't like, I don't know, reggaeton or something.
And one thing that I think is really sad, one of the things that was in
the documents, is that people talk about, especially people from
marginalized communities, that they will go and correct people who
are spreading racist, homophobic comments. And guess what? Now Facebook knows you engage with those keywords.
So they send you more of that.
And so they send more to you.
Yeah, yeah.
I didn't know they were doing that.
They don't do it on purpose.
It's a side effect.
It's a side effect of the way the algorithm works.
So the algorithm is designed to get as much engagement as possible. If I see a car accident, I'm going to engage. Yeah, yeah.
Or, if they had an option of showing you a stream of different things, they know
you looked at the car accident, so they show you more.
Man, this is hard because on the one hand I go like, well, I need to stop looking at
car accidents, but the brain is the brain.
It's the brain. AI is not intelligent, right? We like to say artificial intelligence. Yes. But people who actually study it call it machine learning
because it's not intelligent.
It's just a hill climber.
It's optimizing.
And you know, think of all the different pieces of content
you get exposed to.
It's not trying to show you the most extreme content.
It just happens to, to fulfill its goal function.
It mindlessly pushes you towards it. And so this is about Facebook making choices to not fix it.
That Facebook has lots and lots of options that don't deal with content.
They're letting the machine run wild.
They're letting the machine run wild.
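To illustrate the "hill climber" point above, here is a toy sketch with made-up numbers, not anything from Facebook's systems: a greedy optimizer that only ever asks whether its click metric went up and, as a side effect, drifts toward more provocative content without ever being told to.

```python
# Illustrative sketch only: a toy greedy hill-climber, not Facebook's actual algorithm.
# "extremeness" and the click model below are hypothetical; they exist to show the dynamic:
# the optimizer never "decides" to show extreme content, it just climbs the metric.

import random

def predicted_clicks(extremeness: float) -> float:
    # Hypothetical click model: clicks rise with how provocative the content is,
    # with a little noise thrown in.
    return extremeness * 0.8 + random.uniform(-0.05, 0.05)

extremeness = 0.1          # start with a fairly moderate content mix
best_clicks = predicted_clicks(extremeness)

for step in range(50):
    # Try a small random tweak to the content mix and keep it if clicks go up.
    candidate = min(1.0, max(0.0, extremeness + random.uniform(-0.05, 0.05)))
    clicks = predicted_clicks(candidate)
    if clicks > best_clicks:
        extremeness, best_clicks = candidate, clicks

print(f"content extremeness after optimizing for clicks: {extremeness:.2f}")
```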
Money.
Money. That's what it is.
Thank you so much.
This has really been enlightening.
Explore more shows from The Daily Show wherever you get your podcasts. Watch The Daily
Show weeknights at 11/10 Central on Comedy Central and stream full episodes
anytime on Paramount Plus.
This has been a Comedy Central podcast. Survivor 47 is here, which
means we're bringing you a brand new season of the only official
Survivor podcast, On Fire. And this season we are joined by fan favorite and Survivor 46
runner-up Charlie Davis, to bring you even further inside the action. Charlie, I'm excited
to do this together. Thanks, Jeff, so excited to be here, and I can't wait to bring you inside
the mind of a survivor player for season 47.
Listen to On Fire, the official Survivor podcast starting September 18th, wherever you
get your podcasts.