Stuff You Should Know - How Effective Altruism Works
Episode Date: March 3, 2022
A branch of philanthropy led by philosophers is dedicated to finding the most impactful ways to help humans survive and thrive. Anyone can find that agreeable, but it can be tough to hear it also means your donations to local charities are kind of a waste.
Transcript
Hey, I'm Lance Bass, host of the new iHeart podcast Frosted Tips with Lance Bass.
Do you ever think to yourself, what advice would Lance Bass and my favorite boy bands
give me in this situation? If you do, you've come to the right place because I'm here to help.
And a different hot sexy teen crush boy bander each week to guide you through life.
Tell everybody, yeah, everybody about my new podcast and make sure to listen so we'll never,
ever have to say bye, bye, bye. Listen to Frosted Tips with Lance Bass on the iHeart
radio app, Apple podcast, or wherever you listen to podcasts.
I'm Mangesh Hattikudur, and it turns out astrology is way more widespread than any of us want to
believe. You can find it in Major League Baseball, international banks, K-pop groups, even the White
House. But just when I thought I had a handle on this subject, something completely unbelievable
happened to me and my whole view on astrology changed. Whether you're a skeptic or a believer,
give me a few minutes because I think your ideas are about to change too. Listen to
Skyline Drive on the iHeart radio app, Apple podcast, or wherever you get your podcasts.
Welcome to Stuff You Should Know, a production of iHeart Radio.
Hey and welcome to the podcast. I'm Josh Clark and there's Charles W. Chuck Bryant and Jerry's
here, so that appropriately makes this Stuff You Should Know. That's right. Before we get going,
we want to give a little special plug to our good friend, John Hodgman and our buddy David Rees.
Yeah. Because they got a season two of their awesome animated show, Dicktown, as in private
detective dick, by the way. Yeah. I mean, it's cool enough to get one season of a show, but if
you've gotten a second season and they're tossing you on FX these days, you've made it. So their
show has finally made it and it's well deserved too because it's a pretty awesome cartoon. It is.
It's very funny. It is actually live now. It premiered on March 3rd at 10 p.m. on FXX.
You can watch it on Hulu and the whole jam here is that John and I watched the whole first season.
The whole, they're short episodes. The whole first season was less than two hours long,
which really makes a great case for just streaming the whole thing and laughing a lot
in one night. But it's about two detectives, John Hunchman and David Purefoy, played by Hodgman and David Rees.
And Hodgman is a private detective. He was a former boy detective, like an Encyclopedia Brown type,
and Dave was sort of his Bugs Meany, his nemesis in high school. Man, great reference.
He's now his buddy and his sort of muscle and his driver, and they solve cases together, and
season two I think is even bigger and weirder and it's sort of Scooby-Doo. It's just a lot of fun,
a really, really fun show. Yeah, the first season they did nothing but solve children's
mysteries and they were humiliated by that. So they've kind of expanded now to be grownups.
They've resolved to be grownups and they're solving grownup mysteries for grownups now,
which is really something else. So yeah, like you said, you can stream the whole first season
on Hulu and you can catch the second season on FXX. I wasn't aware of the extra X.
Yeah, two Xs. I don't take back what I originally said. It's still big time, but FXX.
That's right. And it is rated PG-13. So if you're 13 and up, you should enjoy it. It's got a few
swear words, some adult themes here and there, but it's great. It's a lot of fun.
And happy for Hodgman and Rees. Happy, happy Hodgman, happy, happy Rees.
And I just like saying Dicktown. Sure. It's a great name for a great show.
It is. Should we talk about effective altruism? Yeah, I was gonna say we're talking about that
today and this one kind of, I don't know if you noticed a similarity, but this one really
kind of ties into that short stuff that we released before the end of the year about
charitable giving. Did you notice? I did. Although in that episode, it was like,
we were like, yeah, you know, find a charity that speaks to you and maybe something that's
local or if you have animals or if you had, you know, a family member with cancer. And
this basically says, don't do any of that. Right. The only way you should give is by just kind of
coldly calculating what would help a human the most on planet earth. Yes. So effective altruism
is one of those movements. It's a pretty new movement. I think it really started in earnest
around 2010. And it's one of those movements that like elicits passion one way or another.
It's a very polarizing idea. Yeah. If you just take it at its bare
bones, which people love to do. And the reason why people love to take it at its bare bones,
at its extremes, is because it is at heart a philosophical movement rooted in utilitarianism.
And utilitarianism is even more polarizing than effective altruism is, and has been for centuries.
And I think if everybody would just move past the most extreme parts of it and just kind of
take effective altruism at its most middle ground, where most of it seems to have accumulated and
settled and where most of the work is being done, it would be really difficult to disagree with the
ideas behind it. It's when you trot out Peter Singer and some of his most extreme views or
when you say, oh, it's all Silicon Valley billionaires. When you just look at it like that,
that's when people get all riled up and they're like, I hate effective altruism. If you really
just kind of take it in a much more level headed way, it's actually pretty sensible and pretty
great because at the end of the day, you're saving people's lives and you're figuring out
how to save the most lives possible. Yeah. I think anything that has some of its roots
in philosophical movements and tech bros is a hard sell for a lot of people.
But let's talk about a few things that it is, which is the idea that there's a lot of good
that can be done with money and if you can provide for yourself and your own basic needs,
you should probably be giving to charity. You can take a cold hard look at your finances with
literal, strict financial calculations. If you are a person without kids making $40,000 a year,
you are in the 97.4th percentile on planet earth as far as your wealth goes.
And you might think, hey, I make $40,000 a year, and then I have taxes, and
I really think people with a lot of money should give to charities. I really don't have enough
to spare. The idea is that no, you have some to spare. You can give a little bit,
like 10% of your money, and still be around the 96th percentile, and you can literally save
human lives on planet earth. That's the big thing that they're trying to get across here,
that the money that you're giving is saving lives that otherwise would be crippled with disease
or just not around like they would die if you didn't give this money. The fact that you are
giving this money, those people are now living what are called quality adjusted life years,
where they're living an additional healthy year or more because of that intervention that you
gave your money for and that yes, it's based on the premise that basically everyone living in
the United States is rich compared to entire swaths of the rest of the world and that basically
anyone living in the United States can afford to give 10% of their income and forego some clothes
or some cars or something like that to help other people literally survive. Right off the bat,
we've reached levels of discomfort for the average person, especially the average American,
that are really tough to deal with, and so the first challenge effective altruists have
is to kind of tamp down that overwhelming sense of guilt and responsibility and shame
at not doing that, which immediately crops up in people when they hear about
this. Yeah, so I think maybe let's talk a little bit about the history and some of the main
organizations that are tackling this and maybe through that what some of the founders describe
as the core commitments. Like you said, it took hold in about 2010, and there's a group of organizations
under what is now an umbrella organization called the Center for Effective Altruism, CEA. It started
off with philosophers Toby Ord and Will MacAskill founding a group called Giving What We Can,
self-defined as an international community of people committed to giving more,
and giving more effectively. A couple of years later, MacAskill and a man named Benjamin Todd
founded something called 80,000 Hours. The idea is that you might devote 80,000 hours to a career
so when choosing a career, be very thoughtful about the impact that career has for both good and
evil. We'll get way more into all this, and then there are other groups, not fringe or weird,
just on the outskirts, called The Life You Can Save and Animal Charity Evaluators,
and we'll get into how animals figure in. But let's talk a little bit, I guess, about Will
MacAskill and what he sees as, what he calls, the core commitments of EA.
So yeah, and Will MacAskill, he's out of Oxford and so is Toby Ord, and I first came across this,
Chuck, when I was researching The End of the World podcast, and I deeply admire Toby Ord on a personal
level. He actually walks the walk, and his whole family does. They donate a significant portion of
their family income to charity and forego all sorts of stuff and he's literally trying to save the
world. So in that sense, I'm really kind of open to the ideas that come out of that guy's mouth
and Will MacAskill. You mentioned The End of the World with Josh Clark, available wherever you
can find your podcasts. Yes. Your wonderful, heady, highly produced 10-part series. Thank you very
much. That was nice of you. Where you tackle the existential risks of the universe.
Yes. Okay. I just want to make sure I had the right one. The one and the same and I was not
doing that to set you up for a plug. I was doing it in like kind of full disclosure that I'm a
little, I'm probably a little less than objective at this one. Yeah, but you know, that's a great
show and it's still out there just because it is, you know, a few years old now. It's very evergreen.
I think it's at least in these times. Yeah, the world hasn't ended yet, so it's still evergreen.
Good point. So, but I mentioned that in part just to kind of fully disclose that I think Toby Ord
is one of the greatest people walking the earth right now, but also Will MacAskill, who I don't
know, seems to be in lockstep with Toby too. And so he's kind of one of the founders of this movement,
and he said that there's four tenets. He wrote a 2018 paper, and so there's basically four tenets
that form the core of effective altruism. One is maximizing the good, which we can all pretty much
get on board with like you want to make as much good as possible for as many people as possible.
The second is aligning your ideas, your contributions, with science, using
evidence to decide where you're going to put your donations, to use that to guide you
rather than your heart. It's a big one, and it's a tough one for people to swallow.
Another one's welfarism, where by maximizing the good, you're improving the welfare of others.
That's the definition of good in that sense of maximizing the good. And then the last one is impartiality.
That's a tough one. That's harder, I think, for people to swallow than science alignment,
because what you're saying then, Chuck, is that every single person out there in the world
equally deserves your charitable contribution. Yeah, and that's a big one because I'm trying
to find the number here of how much Americans give abroad. Where is that? Okay, here we go.
Out of the, what is it, $470 billion that Americans donate?
Yeah, in 2020, I think. Yeah, $471 billion. $25.9 billion of that went outside of America to
international affairs. So that's a lot of money, but it's not a lot of money in the total pot.
And the idea for EA is to sort of shatter your way of thinking about trying to help
the people in your city or the people in your state or your country and to look at every human
life as having equal value. And not even human life, but every life.
Yeah, they include animals too, like you mentioned before, and we'll get into a little more.
But the key is that if every single person living on earth is equally important,
and you're trying to maximize the help you can do, then from a strict EA perspective,
you're wasting your money if you're an American donating that money in America,
because just by virtue of the value of a dollar,
$1 can do exponentially more good in other developing, poverty-stricken areas of the
world than it can here in the United States. So that right there sets it up for critics of EA
to point out, well, wait a minute, wait a minute, are you saying that we shouldn't
donate locally here at home, that we shouldn't save the animals in the animal shelter, that we
shouldn't donate to your local food pantry, that you shouldn't donate to your church?
And if you really back an effective altruist into a corner, they would say,
look, just speaking of maximizing your impact and everybody around the world is equally important,
no, you shouldn't be doing any of those things. And you certainly shouldn't be donating any money
to your local museum or symphony or something like that.
Yeah. And they say that with their head down and they're kind of drawing on the floor with
their foot and they're saying like, yeah, that's kind of what we're saying.
That's right. Yes. And that's really tough for people to swallow. It's just this huge jagged
pill that they're asking people to swallow. But if you can step back from it, what they're
ultimately saying is, look, man, you want to do the most good with your charitable donations.
Here's how to do it. Just put your sentimentality aside.
Yeah. Do you want to feel good about it or really do the good?
Exactly. And that's what they're doing. That's the whole basis of effective altruism is they're
saying all of your charitable giving is for you. You're doing it for yourself. That's why you give.
This takes that out of the equation and says now you're giving to genuinely help somebody else.
All right. I think that's a great beginning. Maybe let's take a break
now that everyone knows what this is and everyone is choking on their coffee
because they just donated to their local neighbor organization.
And we'll come back and talk about some of the other philosophical founders right after this.
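To make the arithmetic above concrete, here is a minimal Python sketch of the pledge math being described, using the episode's figures (a $40,000 income with no kids sits around the 97.4th percentile globally, and giving 10% leaves you around the 96th). The function name is hypothetical, just for illustration, not from any EA organization's actual tooling.

```python
# A minimal sketch of a Giving What We Can-style 10% pledge, using the
# income and percentile figures cited in the episode. The function name
# is hypothetical, just for illustration.

def annual_pledge(income: float, fraction: float = 0.10) -> float:
    """Yearly donation under a fixed-fraction giving pledge."""
    return income * fraction

income = 40_000                    # ~97.4th percentile globally, per the episode
donation = annual_pledge(income)   # $4,000 a year
remaining = income - donation      # $36,000, still around the 96th percentile

print(f"Give ${donation:,.0f}, live on ${remaining:,.0f}")
```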
Hey, I'm Lance Bass, host of the new iHeart podcast Frosted Tips with Lance Bass.
The hardest thing can be knowing who to turn to when questions arise or times get tough
or you're at the end of the road. Okay. I see what you're doing.
Do you ever think to yourself, what advice would Lance Bass and my favorite boy bands
give me in this situation? If you do, you've come to the right place because I'm here to help.
This, I promise you. Oh, God. Seriously, I swear. And you won't have to send an SOS because I'll
be there for you. Oh, man. And so will my husband, Michael. Um, hey, that's me. Yep. We know that,
Michael. And a different hot sexy teen crush boy bander each week to guide you through life
step by step. Oh, not another one. Kids, relationships, life in general can get messy.
You may be thinking, this is the story of my life. Just stop now. If so, tell everybody,
yeah, everybody about my new podcast and make sure to listen so we'll never ever have to say bye,
bye, bye. Listen to Frosted Tips with Lance Bass on the iHeart radio app, Apple podcast or wherever
you listen to podcasts. I'm Mangesh Hattikudur, and to be honest, I don't believe in astrology,
but from the moment I was born, it's been a part of my life in India. It's like smoking. You might
not smoke, but you're going to get secondhand astrology. And lately, I've been wondering if
the universe has been trying to tell me to stop running and pay attention because maybe there
is magic in the stars if you're willing to look for it. So I rounded up some friends and we dove in
and let me tell you, it got weird fast. Tantric curses, major league baseball teams, canceled
marriages, K-pop. But just when I thought I had a handle on this sweet and curious show about astrology,
my whole world came crashing down. Situation doesn't look good. There is risk to father.
And my whole view on astrology, it changed. Whether you're a skeptic or a believer,
I think your ideas are going to change too. Listen to Skyline Drive and the iHeart Radio app,
Apple Podcast or wherever you get your podcasts.
A couple of people we should mention really quickly because they're going to come up.
As far as organizations, we did not mention GiveWell yet. They were founded in 2007.
They're a big part of the EA movement, backed by Facebook co-founder Dustin Moskovitz and his wife.
Is it Kari or Cari Tuna? I'm going with Cari. I think so. It's C-A-R-I. So they have partnered up
to create Open Philanthropy. Philanth... philanthropy. Why did it sound weird?
It sounded weird. I wanted to say philanthropy. It's too early in the episode for that.
So they're big donors and big believers in the cause.
And then another person you mentioned is... Well, first of all, you mentioned utilitarianism,
the philosophical movement developed by Jeremy Bentham, who we've talked about
before, and John Stuart Mill. It's the idea that people should do what causes the
most happiness and relieves the most suffering. And the other guy you mentioned that's sort of
controversial, I guess you could say is Peter Singer. He is an author and a philosopher and a
Ted Talker who kind of became... I don't know about famous because a lot of people don't know
any modern philosophers. But in these circles became famous from an idea, a thought experiment in
1972 from his essay, Famine, Affluence, and Morality, which is: you're going to work. You just bought
some really expensive, great new shoes. You see a kid drowning in a shallow pond. Do you wade in
there and ruin those new shoes, rescue the kid, and make yourself late for work? And 99% of people,
if asked, would say, well, of course you do. You don't just let that kid drown. So the flip to that
is, well, that's happening every day all over the world and you're essentially saving your new shoes
by letting these kids die. Yeah. So what's your problem, Bob? You're buying those new shoes rather
than donating that money to save a child's life. It's morally speaking, it's the exact same thing.
And the essay, I read it last night, it's really good. Do you want to feel like a total
piece of garbage for not doing enough in the world? Read that. He basically goes on to destroy
any argument about, well, that's a kid that you see in a pond. You're actually physically saving
that kid. He's like, well, it's so easy to donate to help a child on the other side of the world
right now that for all intents and purposes, it's as easy as going into a pond to save the kid
these days. It is easier. You don't even have to get wet. You just call in your credit card,
basically. Yes. So he just destroys any argument you could possibly have. And he is an extremist
utilitarian philosopher in that he's basically saying, not just that giving money to the point where
you are just above the level of poverty of the people you're giving to, really cutting into your
luxuries to help other people, is a good thing if you do it; it's that not
doing it is a morally bad thing. It's morally wrong to not do that. So he will really turn
the hot plate up under you and just really make you feel uncomfortable. But he's saying this is
my philosophical argument and it's pretty sound if you hear me out. And if you hear him out,
it is pretty sound. The problem is he's a utilitarian philosopher and a very strict one too.
And so there's a lot of like, you can take that stuff to the nth degree, some really terrible
extremes to where it becomes so anti-sentimental that it actually can be nauseating sometimes.
Like strictly speaking, under a utilitarian view, this one's often trotted out, it is morally good
to murder one person to harvest their organs to save the lives of five other people with that
murdered person's organs. Technically speaking, in a utilitarian lens, that's maximizing the good
in the world. The thing is, if that's what you're focusing on and you're equating effective altruism's
desire to get the most bang for your donation buck to murdering somebody to harvest their
organs to save five people, you've just completely lost your way. Sure you can win an argument against
utilitarianism in that respect, but the fact that it's leveled and trained on this movement,
this charitable philanthropy movement, is totally unfair, even though, yes, it is pretty much part
and parcel with utilitarianism. Yeah, Singer is a guy, I think one of his philosophies is
about the journey of life, and that interrupting a life before it has goals or after it's accomplished
its goals is okay. If you mention his name, there are a lot of people who will point to this idea that
he says things like, it's okay to kill a disabled baby right after they're born in some cases,
especially if it will lead to the birth of another infant with better prospects of a happy
and productive life or an older person who has already accomplished those goals and the idea
being that the disabled baby doesn't have goals yet, you know, that's obviously some controversial
stuff. Yeah. And he's a hardliner and doubles down on this, but again, to sort of
throw that in, that has nothing to do with effective altruism. No, he wrote that paper,
Famine, Affluence, and Morality, which basically provides the general contours of the effective
altruist movement. Yes. But it's not like he's just the leading heartbeat of the movement or
anything like that. No, that's not their Bible or anything like that. No, and unfortunately,
he's an easy target that people can like point to because the effective altruist movement has kind
of taken some of his ideas and they're like, oh yeah, you like Singer? Well, what about Singer's
argument about this? It's like that has nothing to do with effective altruism. He makes a really
good, easy, easily obtained straw man that people like to pick on. That's right. Let's talk about
numbers a little bit. We mentioned that in the United States, $471 billion was donated in 2020.
About $324 billion of that came from individuals, which is amazing. Yeah, those corporate guys are really
pulling their weight, huh? Yeah, no kidding. Individuals, and that boils down to about $1,000
per person in the USA, which is not that much money if you think about it. No. And out of that,
there are a couple of pledges that EA endorses, one called Giving What We Can, which is promising
to give away 10% of your lifetime income. And then another one called The Founder's Pledge,
where if you're a startup founder, you promise to give away a percentage of your eventual proceeds.
And then there's also Try Giving, which is a temporary pledge to donate. And it's
only about 12 years old, only about 8,000 to 10,000 people have taken these pledges so far.
Which is still, I mean, that's a decent amount of people, especially considering that most of
the people involved in this movement are high earning, extremely educated people who are probably
like 10% of their income is going to add up to quite a bit over the course of their careers. And
that's the thing they're saying, like, I'm going to give this 10% a year for my career. And the
reason why they've really kind of targeted careers, that's part of 80,000 hours. 80,000 hours is this
idea that we spend about 80,000 hours working. So if you took that 80,000 hours and figured
out how to direct your energy the most effectively towards saving the world, you can really do some
good just by the virtue of having to have this career to support yourself. And so there's a
couple of ways to do it. One is to have a job that you make as much money as you possibly can at,
and then you donate as much as you comfortably can. And then maybe even then some, say 10%,
or some people donate 25%. There's a NASA engineer named Brian Ottens, who is profiled in the Washington
Post, who said he specifically got the most stressful, high earning job he could handle
in order to give away, I think, a quarter of his income. And that's great. That's one way to do it.
But another way to do it is to say, okay, actually, I'm going to figure out something that I really
love, but I'm going to adjust it so that it's going to have the most impact possible.
Yeah, I think it's interesting. There are two ways to think about it. The first one that you
were talking about, they call it earning to give. The idea is that if you are capable of getting
a really high-paying job in the oil industry with the idea that you're going to give most of that
away, on the earning-to-give philosophy side of things, they're saying, yeah, go do that. It
doesn't have to be a socially beneficial job. Make the most money you can and give it away.
Don't go get the job at the nonprofit, because there are tons of people that will go and get
that job at the nonprofit. Someone will fill that position. 80,000 Hours, though,
says that's not the best way. Theirs is more the second one you mentioned,
which is: don't take a job that causes a lot of harm. Being happy is part of being productive,
and you don't have to go grind it out at a job you hate just because you make a lot of money
so you can give it away. Make yourself happy. Don't take a job that causes harm. Do a job where
you have a talent. Policymaking is one field. Media is another. I would argue that we have a job where,
we didn't know it, but it turns out we have a talent for doing this, and we can leverage
our voice, and we occasionally do, to point out things that we think make a difference in the
world and to mobilize people. That's not the goal of our show, but we can dabble in that,
which is great. That's not what we intended going into it, but I think we woke up one day
and found that we had a lot of ears, so we could throw in episodes that I think lead to good.
Right. Yeah, I agree, which means we can shave a little off of that 10% we're
morally obligated to donate every year, right? Yeah, that's 7%.
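As an aside, the organization's name is itself a piece of arithmetic. Here is a one-liner, using the commonly cited breakdown; the episode only gives the 80,000-hour total, so the per-week and per-year figures are the standard ones, not quoted from the show.

```python
# Rough career-length arithmetic behind the "80,000 Hours" name:
# 40 hours a week, 50 weeks a year, over a 40-year career. This is the
# commonly cited breakdown, not spelled out in the episode itself.
hours_per_week, weeks_per_year, career_years = 40, 50, 40
print(hours_per_week * weeks_per_year * career_years)  # 80000
```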
So a good example of that, of like figuring out how to direct your career path more toward
improving the world. On the, I guess, the 80,000 Hours site, they profiled a woman who wanted to
become a doctor and she did some research and said, well, this is cool, but most doctors in
Australia treat Australians who are relatively very well off and very healthy. And so instead,
she decided that she wanted to go into a different field of medicine. I think she went into like
epidemiology and figured out how to get, how to direct her interest in medicine toward getting
vaccines out to market faster to get them through the clinical trial process. And so she's not
going to get to be a doctor, but she's going to get to focus on medicine and she's going to get
to have the satisfaction that she's improving the world demonstrably through her job. And she
might not donate a dime of that. I suspect she's probably going to because she's on the
80,000 hours website. But even if she didn't, she's still figuring out how to use
evidence, to make evidence-based decisions, to maximize the 80,000 hours she's going to spend
in her career to make the world a better place. Right. Because one of the ideas of EA, and a lot
of, you know, the Charity Navigator and CharityWatch-like good websites that we endorse, that
we're not poo-pooing at all, is that they tend to focus a lot on, you know, how much goes to overhead,
how much goes to the programs, which is, which is good. But EA is like, no, what we want to see
are data, literal scientific measurables, on how much return you're getting for that dollar.
And some charities do this and are a little more open about it, but they basically say,
you know, every charity should say, here's how far your
dollar goes and exactly what it does. Right. And the charities all said, come on,
really? Nervously. Right. When they're asked that, when they're told that they should be doing that,
because they just don't. Part of the reason why is it's very expensive to run what effective
altruists like to use as the gold standard: randomized controlled trials. Basically, you know what
UX testing is, user experience testing for like a website. So there's A/B testing, where you've got
some people who are using your website and they're getting one banner ad while the B testers are
getting a totally different banner ad, and you just see which gets the most clicks. It's basically
that, but for the work that a charity is carrying out: some group gets malaria nets,
another group doesn't, and then you study which group had the best outcome. And then you could
say, oh, well, these malaria nets increase quality-adjusted life years by 30%, which means that it
comes out to 0.5 quality-adjusted life years per dollar compared to 0.2 for the control
group. Ergo, we want to put our money into these groups that distribute malaria nets in Africa,
because they are demonstrably saving more lives than groups that don't. Like they want data like
that, and you just don't get that with most charities. The good thing is, they're pushing
charities to do that because if you do care about that kind of thing, then if you can come up with
that kind of evidence, you can get these effective altruist dollars. And there's a lot of dollars
coming from that group, even though it is relatively small. Yeah, it is interesting because, you know,
in that example, if you were to just say on your website, people with malaria nets fare better,
like everyone knows that, but they really want to have that to drill down and have that measurable
where they can point to a number and say that, you know, this is the actual result. We all know
malaria nets help, but maybe, I mean, maybe they think it speaks to people more. It
certainly speaks to them, but I guess they think it would speak to the masses. Because these
things cost money. I mean, that's one of the criticisms of these randomized controlled trials,
that they're sort of expensive, and like maybe that money should be used to actually
donate instead of doing these trials. But they must think it speaks to people to have actual data
like that. Well, it speaks to them because the way that you figure out how to maximize your money
is to have data to look at, to decide rather than your heart. It makes sense. These are techies,
because they're all about that data. Very much so. And there's some problems with that,
with relying on that. There are some criticisms, I should say, but they're problems too. One is that
there's a lot of stuff that you can't quite quantify in terms like that. Like, if you're
saying, like, no, I want to see how many lives saved your work is doing per dollar.
Well, then the High Museum's going to be, like, zero, saving zero lives, but that doesn't mean
that they're not enriching or improving lives through the donations that they're receiving,
this art museum. You know what I mean? Olivia, who helps us with this article, gives an example.
She's saying, like, you couldn't really do a randomized controlled trial for the 1963 March
on Washington. Yeah, for sure. That helped solidify the civil rights movement. Right.
And yet it'd be really hard to argue that that didn't have any major, like, effects on the world.
So that's a big argument. Then the other thing is that sometimes these randomized controlled
trials, like you can hold one one year in one part of the world and go to another part
of the world the next year, and what should be the same is just not the same. And so if you're
basing all of your charitable giving on these things, they better be reproducible or else what
are you doing? Yeah, I mean, you get why this is such a divisive thing and why it's such a hard
sell to people because people give with their hearts generally. They give to causes they find
personal to them, like I mentioned earlier, a family member with cancer or a family member with MS
or just name anything. Generally, people have a personal connection somehow, which makes them
want to give. And that's sort of it: the heart of philanthropy has always been the heart.
And it's a tough sell for EA to say, I'm sorry, you have to cut that out of there.
It's a very subjective thing, even what constitutes a problem. When it comes to the animal thing,
like when people give for animal charities, they're generally giving to dogs and cats and
stuff like that, these great organizations that do great work here in America. But the concentration,
from the EA perspective, should be factory-farmed animals. Only about 1% of charitable spending
in the US goes to the suffering of farmed animals, and that's what we should be concentrating on
because of the massive, massive scale. Again, to try and do the most good, you would look at like
where are the most animals? And sadly, they're on farms. Yeah, I mean, just from sheer numbers,
you can make a case, utilitarianly speaking, that your money would be better spent
improving the lives of cows that were going to be slaughtered for beef.
That will still eventually be slaughtered for beef, but you can improve their welfare during
their lifetimes. And that technically is maximizing the impact of your dollar by reducing suffering
just because there's so many cows awaiting slaughter in the world compared to humans that
are dying in Africa. Heck yeah, that's a tough sell. And I think this is where it makes sense to
just maintain a certain amount of common sense where it's like, yeah, man, if you really want
to maximize your money, go look at the EA sites, go check out 80,000 hours, get into this and
actually do that. But there's no one who's saying like, but if you give $1 to that local symphony
that you love, you're a sucker, you're a chump, you're an idiot, nobody's saying that. And so
maybe it doesn't have to be all or nothing one way or the other, which seems to be the push and
the pull, and I think the issue here. Yeah, we should read directly from Will MacAskill. He defends
EA and he says this, effective altruism makes no claims about what obligations of benevolence one
has, nor does EA claim that all ways of helping others are morally permissible, as long as they
help others the most. Indeed, there's a strong community norm against promoting or engaging
in activities that cause harm. So they flat out say like the whole murder of someone to harvest
their organs, like we're not down with that. That's not what we're about. Please stop mentioning
Peter Singer. Right, yeah. And he says it doesn't require that I always sacrifice my own interest
for the good of others. And that's actually very contradictory to Peter Singer's essay. He says,
no, you're morally obligated to do that. And if you don't, it's morally bad. They're saying like,
no, let's, let's all just be reasonable here. Like, yeah, we're philosophers, but you know,
we can also like think like normal human beings too. And that's what we're trying to do. We're
trying to take this kind of philosophical view based in science, based in evidence and try to
direct money to get the biggest impact. Yeah, like you said, can we stop? Can we stop bringing
up Peter Singer, please? How about we take another break and we'll talk a little bit
about, jeez, what else? Long-termism and EA's impact right after this.
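Before moving on, here is a minimal sketch of the cost-effectiveness comparison described in the randomized controlled trial discussion above. The per-dollar figures are the illustrative ones from the conversation (0.5 versus 0.2 quality-adjusted life years per dollar), not real trial results, and the function name is hypothetical.

```python
# Ranking interventions by QALYs bought per dollar, as in the malaria-net
# example above. Numbers are the episode's illustrative ones, not data.

def qalys_per_dollar(qalys_gained: float, dollars_spent: float) -> float:
    """Cost-effectiveness: quality-adjusted life years per dollar."""
    return qalys_gained / dollars_spent

net_arm = qalys_per_dollar(0.5, 1.0)      # arm that received malaria nets
control_arm = qalys_per_dollar(0.2, 1.0)  # arm that did not

print(f"Nets buy {net_arm / control_arm:.1f}x the QALYs per dollar")  # 2.5x
```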
Okay.
God, seriously, I swear. And you won't have to send an SOS because I'll be there for you. Oh, man.
And so will my husband, Michael. Um, hey, that's me. Yeah, we know that, Michael. And a different hot,
sexy teen crush boy bander each week to guide you through life step by step. Not another one.
Kids relationships life in general can get messy. You may be thinking this is the story of my life.
Just stop now. If so, tell everybody, everybody about my new podcast and make sure to listen.
So we'll never ever have to say bye, bye, bye. Listen to Frosted Tips with Lance Bass on the
iHeart radio app, Apple podcast or wherever you listen to podcasts. I'm Mangesh Hattikudur, and
to be honest, I don't believe in astrology. But from the moment I was born, it's been a part of my
life in India. It's like smoking. You might not smoke, but you're going to get second hand astrology.
And lately, I've been wondering if the universe has been trying to tell me to stop running and
pay attention because maybe there is magic in the stars. If you're willing to look for it.
So I rounded up some friends and we dove in and let me tell you, it got weird fast. Tantric curses,
major league baseball teams, canceled marriages, K-pop. But just when I thought I had a handle on
this sweet and curious show about astrology, my whole world came crashing down. Situation doesn't
look good. There is risk to father. And my whole view on astrology, it changed. Whether you're a
skeptic or a believer, I think your ideas are going to change too. Listen to Skyline Drive and the
iHeart Radio App, Apple Podcast, or wherever you get your podcasts.
So long termism is part of the EA movement. And this is the idea of, hey, let's not just think
about helping people now. If we really want to maximize impact to help the most people,
which is at the core of our mission statement, we need to think about the future because there
will be a lot more people in the future to save. And so long-termism is really where your
dollar is going to go the furthest, if you think deep into the future even.
Yeah. Like if humanity just kind of hangs around planet earth for another billion or so years,
which is entirely possible if we can make it through the great filter,
there will be like quadrillions of human lives left to come. Yeah. And a lot of
philosophers who think about this kind of thing kind of make the case or can make the case if
they want to that their lives are probably going to be vastly more enjoyable than ours,
just from the technology available and not having to work and all sorts of great stuff
that's going to come along. And so technically, just by virtue of the fact that there's so many
more of them, we should technically be sacrificing our own stuff now for the benefit of these
generations and generations and generations of humans to come that vastly outnumber the total
number of humans who've ever lived. Like 108 billion humans have ever lived. We're talking
quadrillions of humans left to come. That very much dovetails with the kind of discomfort you
can elicit from somebody who says that your money's better spent relieving the suffering
of cattle awaiting slaughter than it is saving children's lives in Africa. You know what I'm
saying? Yeah. And they're not just talking about climate change and like obviously that kind of
existential risk. They dabble in AI and stuff like that. And I know that we don't need to go down
that rabbit hole. You should listen to The End of the World with Josh Clark, the AI episode.
I mean, it's all about that. But it does have to do with that kind of stuff. It's not just like
we need to save the planet so it's around in a billion years. You know, they tackle
like all kinds of existential risk, basically. Yeah. And a lot of these
guys are dedicating their careers to figuring out how to avoid existential risk, because they've
decided that that is the greatest threat to the future that would cut out any possibility of
those quadrillions of lives. So that's literally why they have dedicated themselves to thinking about
and alleviating these risks because they're trying to save the future of the human race,
because they've decided that that is the best way to maximize their careers for the most good,
which is just astounding if you stop and think about what they're actually doing in real life.
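A heavily hedged sketch of the long-termist expected-value arithmetic implied here: the 108 billion figure is the one cited in the episode, while the future-lives and risk-reduction numbers are hypothetical placeholders, included only to show why tiny reductions in extinction risk can dominate this kind of math.

```python
# Illustrative long-termist arithmetic. Only the 108 billion figure comes
# from the episode; the other two numbers are hypothetical placeholders.

past_lives = 108e9        # humans who have ever lived, per the episode
future_lives = 1e15       # "quadrillions" of potential future lives
risk_reduction = 1e-6     # hypothetical: one-in-a-million less extinction risk

expected_lives = future_lives * risk_reduction
print(f"{expected_lives:,.0f} expected future lives saved")  # 1,000,000,000
```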
Oh yeah. We mentioned the kind of money that's in this. Even though it's not a huge movement so far,
I think we said like somewhere around 8,000 people have made these pledges, I think overall,
the co-founder of 80,000 Hours, Benjamin Todd, says about $46 billion is committed to EA going
forward. Like you said, a lot of it is because there are a lot of rich people and tech
people that are backing this thing. So a lot of that money comes from people like Dustin Moskovitz
and Cari Tuna and Sam Bankman-Fried. He's a cryptocurrency guy. So a lot of that money
comes from them, but they're trying to just raise awareness to get more and more regular
people on board that, you know, if they have, you know, $2,000 or $3,000 to give a year,
they're saying, I think they estimate that $3,000 to $4,500 is like the amount of money it
takes to save a human life and give them additional quality years. Yeah. So if you cough up
that much and you direct it toward one of the charities that they've identified as the most
effective through their sites, like GiveWell is a place to go look for charities like that,
that have been vetted by effective altruists, you're literally saving the life of a child every year.
It's like you're saving a child from drowning in a pond every single year, and all you're
doing is ruining your new shoes. You know, more than your new shoes, you're ruining your
really nice vacation that year. Right, because, you know, you sent this one thing, I don't
know where it came from, but the idea of someone running into a burning building and pulling a
child out or a kid out of a pond, they're written up in the newspaper as a hero, but you can do that.
You can save a kid a year or more every year for the rest of your life. It's a little less dramatic.
You're not going to have a newspaper. You're not going to be above the fold.
Sure. You know what I'm saying? Yeah.
But that's, I mean, EA is all about, it's the antithesis of that.
Antithesis. What is it? Yeah. Antithesis. Yeah, I like it. You know what I mean.
It's no frills version of antithesis. Yes.
The thing is, too, I mean, it's still very relative. Like, $4,500
is a very large amount or a so-so amount or not much at all,
depending on how much you make. And again, nobody in the effective altruist movement is
saying that you should personally sacrifice unless you really want to, unless you're driven to,
but you're not morally required to personally sacrifice to cough up that $4,500 when it means
you're not going to be able to eat for a week or you're not going to be able to have
a place to live. Like, nobody's saying that, and nobody's being flippant about the idea that $4,500
isn't that much. What they're saying is $4,500 can literally save a child's life. And if you stop
and look at your life and think that you could come up with that, you could donate it to a certain
place that will go save a child's life in real life. That's, that's what they're saying.
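A back-of-the-envelope version of that claim, using the episode's estimate that roughly $3,000 to $4,500 funds the interventions that save one life. The function name is just for illustration, not anyone's published model.

```python
# Lives saved per donation at the episode's cost-per-life estimates.

def lives_saved(donation: float, cost_per_life: float = 4_500.0) -> float:
    """Expected lives saved at a given cost-per-life estimate."""
    return donation / cost_per_life

print(lives_saved(4_500))          # 1.0 per year at the high-end estimate
print(lives_saved(4_500, 3_000))   # 1.5 per year at the low-end estimate
```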
Yeah, this would be a hard sell to Emily. What? I'm thinking about our
charity conversation we have every year. And I'm trying to imagine myself saying,
what if we don't give to the local animal shelter and neighbors in need like we usually do, and
instead we do this. Right. She would just be like, I see what you're saying, but, but no,
get, get out of my face with that. But I mean, you could be like, well, how about we do both?
You know, exactly. So I think, I think that's the thing. That's my take on it. Like, we support
CoEd, the Cooperative for Education, and I have no qualms about supporting CoEd. Even after doing all this
research and understanding effective altruism even more, no qualms whatsoever. I'm sure that
money could be directed better to help other people in other parts of the world.
I still think it's money well spent, and it's helping people, and I'm very happy
with that. I think that's great. And I don't have any guilt or shame about that at all.
And I don't think I should. What you're saying at that point, like with CoEd,
it is an organization dedicated to helping children in a not very well-off country
live better and longer lives. So it essentially is effective altruism in a way,
except effective altruism is like, no, no, no, no, no. The data says that this one is,
look at the numbers, point-this-much better and goes further. Like, they really,
it's a numbers and data game that makes it tough for a lot of people to swallow, I think.
Yeah. It's anti-sentimentalism, basically in the service of saving the most lives possible.
I know. It's, it's interesting. And it doesn't surprise me that it
has its roots in philosophy, because it is, it is really a philosophical sort of head
scratcher at the end of the day. Yeah, for sure. It's pretty interesting stuff, isn't it?
It really is. I think it's, I think it's fascinating.
Yeah. So there's, I mean, there's a lot more to read, both criticisms and, you know,
support, pro-EA stuff. And seriously, you could do worse than reading Peter Singer's essay.
What is it called? Famine, Affluence, and Feeling Bad.
Famine, Affluence, and Morality. Yeah. It's like 16 pages. It's a really quick read.
It's really good. So read that too and just see what you think, see what you think about yourself
too. And maybe take some time and examine, you know, if you could give to some of these charities,
or if you're not giving to charity at all, seriously, do spend some time and see where
you could make that change. Yeah. And since I said make that change and Chuck said, yeah,
that means of course it's time for listener mail.
This is follow-up to the albinism episode. I knew we would have someone who has albinism write in. I'm
glad we did. And then we had listeners out there. And this is from Brett. Hey guys,
longtime listener. I have albinism. So I thought I'd throw in my perspective.
First off, I know you were struggling to decide how to describe it, albino or albinism. My preference
is using the term albinism like you guys did, as to me, it denotes a condition, while saying
if someone or something is albino, it feels like you're relegating them to a different species.
Right. Being called albino used to bug me growing up, and that was usually because
kids were trying to get a rise out of me. Fortunately, I was a big kid, so it never really
escalated to physical bullying. I like this idea. Yeah. Like the kid with albinism who's like huge
and someone says something, they're like, excuse me. What did you just say? Yeah, I didn't say
anything. Being a child of the 70s and 80s, like you guys, it was pretty rough at times.
On the physical side, my eyes are very light sensitive. They're blue, and again, while growing
up, some of the kids would keep asking me why my eyes were closed when it was bright. And of
course, the low vision comes into play as well. I'm considered legally blind. Pretty much
every other person with albinism I have met has the same issue. There were ways to adjust in school
and ways they could assist me with large print books, magnifiers, monoculars, or the
teachers simply letting me look at their slides afterward and have more time with them.
Yeah, that's great. As for how people with albinism are portrayed in TV and movies,
I don't think being portrayed as a hitman or even someone with magical powers bugs me
that much, because I know that it's fake; it would be really hard to be a hitman
with the kind of eyesight that we have. I love that. So practical. And Brett had a lot of other
great things to say, but that is from Brett and a longtime listener.
Thanks a lot, Brett. That was great. Glad you wrote in. And yeah, thanks a lot.
If you want to be like Brett and get in touch with us and say, hey, you guys did pretty good,
or hey, you guys could have done a lot better, or hey, I'm mad at you guys, or whatever you want
to say, we would love to hear from you. We can take it all and you can address it all to Stuff
Podcast at iHeartRadio.com. Stuff You Should Know is a production of iHeartRadio. For more
podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to
your favorite shows. Hey, I'm Lance Bass, host of the new iHeart podcast, Frosted Tips with Lance Bass.
Do you ever think to yourself, what advice would Lance Bass and my favorite boy bands give me in
this situation? If you do, you've come to the right place because I'm here to help. And a
different hot sexy teen crush boy bander each week to guide you through life. Tell everybody,
yeah, everybody about my new podcast and make sure to listen so we'll never ever have to say bye,
bye, bye. Listen to Frosted Tips with Lance Bass on the iHeart Radio app, Apple podcast, or wherever
you listen to podcasts. I'm Mangesh Hattikudur, and it turns out astrology is way more widespread than
any of us want to believe. You can find it in Major League Baseball, international banks,
K-pop groups, even the White House. But just when I thought I had a handle on this subject,
something completely unbelievable happened to me and my whole view on astrology changed.
Whether you're a skeptic or a believer, give me a few minutes because I think your ideas are about
to change too. Listen to Skyline Drive on the iHeart Radio app, Apple podcast, or wherever you get your podcasts.