Factually! with Adam Conover - Why Trust Science? with Naomi Oreskes
Episode Date: May 26, 2021. We know that not all Americans trust science. But why, exactly, should they, when scientists are fallible humans, just like the rest of us? Acclaimed science historian and author Naomi Oreskes is on the show this week to answer exactly that. Check out her new book, Why Trust Science?, at factuallypod.com/books. Learn more about your ad choices. Visit megaphone.fm/adchoices See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
You know, I got to confess, I have always been a sucker for Japanese treats.
I love going down a little Tokyo, heading to a convenience store,
and grabbing all those brightly colored, fun-packaged boxes off of the shelf.
But you know what? I don't get the chance to go down there as often as I would like to.
And that is why I am so thrilled that Bokksu, a Japanese snack subscription box,
chose to sponsor this episode.
What's gotten me so excited about Bokksu is that these aren't just your run-of-the-mill grocery store finds.
Each box comes packed with 20 unique snacks that you can only find in Japan itself.
Plus, they throw in a handy guide filled with info about each snack and about Japanese culture.
And let me tell you something, you are going to need that guide because this box comes with a lot of snacks.
I just got this one today, direct from Bokksu, and look at all of these things.
We got some sort of seaweed snack here.
We've got a buttercream cookie. We've got a dolce. I don't, I'm going to have to read the
guide to figure out what this one is. It looks like some sort of sponge cake. Oh my gosh. This
one is, I think it's some kind of maybe fried banana chip. Let's try it out and see. Is that what it is? Nope, it's not banana. Maybe it's a cassava
potato chip. I should have read the guide. Ah, here they are. Iburigako smoky chips. Potato
chips made with rice flour, providing a lighter texture and satisfying crunch. Oh my gosh, this
is so much fun. You've got to get one of these for yourself. And get this: for the month of March,
Bokksu has a limited edition cherry blossom box, and 12-month subscribers get a free kimono-style robe. Picture it: you, wearing your new duds, learning fascinating things
about your tasty snacks.
You can also rest assured that you have helped to support small family run businesses in
Japan because Bokksu works with 200 plus small makers to get their snacks delivered straight
to your door.
So if all of that sounds good, if you want a big box of delicious snacks like this for yourself,
use the code factually for $15 off your first order at Bokksu.com.
That's code factually for $15 off your first order on Bokksu.com.

I don't know the way. I don't know what to think. I don't know what to say. Yeah, but that's alright. Yeah, that's okay. I don't know anything.
Hello, welcome to Factually, I'm Adam Conover, and this week, let's talk about science.
You know, we hear all the time in the media that a lot of Americans don't trust science.
And we've seen that during the pandemic.
Public health advice from some of the most experienced, knowledgeable scientists in the country was treated by many like a partisan football, with arguments erupting everywhere
over whether those scientists are really trustworthy.
The pandemic in many ways highlighted what seems to be a divide in how Americans view
scientific expertise. And so a lot of people, especially in the media, started asking,
what do we do about that? How do we make people trust science? Well, you know what? I think that
that is the wrong question to ask. I don't think we can treat science as just some authority figure that we must trust and we must bully other people into trusting.
Instead, we need to ask a different question and we need to make sure that that question is answered
to the satisfaction of our fellow citizens. And that question is, why exactly should we trust
science? I mean, seriously, ask yourself this. It's a harder question to
answer than you might think. I mean, the truth is that, contrary to what some people assert,
scientists are not infallible angels strumming truth harps on fact clouds. We shouldn't just
trust them because they are scientists. They are, in fact, fallible people working for fallible
institutions funded by fallible sources like governments and businesses.
Like anyone else, scientists are subject to biases and distortions and mistakes.
Science is as messy as every other fucking human enterprise on this goddamn planet.
And when it makes mistakes, they can be big ones.
For instance, in the 19th and 20th centuries, scientific racism in the form of studies like phrenology and eugenics was the toast of science town.
Many, many scientists asserted and believed these racist propositions that we now know
are false, with devastating results.
So given that fallibility and that history, why exactly should we believe that scientists
are more trustworthy than anyone else or that science is a more trustworthy enterprise?
Well, here's one argument that you might advance.
Unlike many other human enterprises over the last several centuries, science has made indisputable progress in our understanding of the universe.
We absolutely, indisputably know more about the world and how it functions this year than we did last year.
And last year we knew more than the year before that.
And this process has been going on in some form for hundreds, if not thousands of years.
And it's not just little bitsy pieces of info we're compiling.
We are breaking off big old chunks of reality from the darkness and bringing it into the light of human understanding.
Let's say you're a baby who was born in 1900, but you're a really smart baby,
so you're able to read all the scientific knowledge
of that time, okay?
Well, even though you were very, very smart,
you would not know anything
about the structure of DNA or RNA.
You wouldn't know about the variety
of weirdo subatomic particles that make up matter.
You wouldn't know that antibiotics could stop infections.
You wouldn't know that black holes are real.
So without question, science has delivered the goods
in terms of our understanding of the universe.
But we still have to ask the question,
what is it about science that makes it so successful?
Why should we trust science and scientists when they speak?
And honestly, what is science to begin with? There is
a lot that we need to know here before we know where to put our intellectual trust. Well, to
answer those questions, I am so honored to say that we have one of the great scholars of science
and the history of science on the show today, Naomi Oreskes. She's a professor of history at
Harvard and the author of the landmark
book, Merchants of Doubt, as well as her new book, fittingly titled, Why Trust Science, which you can
get at our special bookstore at factuallypod.com slash books. Please welcome Naomi Oreskes.
Naomi, thank you so much for being here. Thank you for having me.
So your newest book, let's just jump right into it, is called Why Trust Science? I'm sorry to make you summarize the
whole book in my first question. I imagine you might have a good answer to this question. And
why is it an important question to ask? Well, it's obviously an important question to ask
because we've seen what's happened
this past year when people don't trust science.
When people, whether it's politicians, governors, the president of the United States, or ordinary
citizens, refuse to accept scientific advice, refuse to trust what scientists have learned
about the world, people get hurt, people get sick, people die.
Science is our best way of understanding the world around us.
And when we understand the
world around us, we can use that knowledge to protect ourselves against disease, to do things
we want to do, like go to the moon or invent better technologies. But when we ignore that
evidence, we do so at our peril. And so it's an incredibly important issue for all of us.
And that's why I wrote the book even before COVID-19. The answer to the question is twofold.
So scientists are the people who have the job
of trying to understand the natural world.
That's what they do for a living.
Every day they wake up and they get to work,
and their work is the work
of understanding the natural world.
So if we have questions about the natural world,
then scientists are the experts to whom we turn.
So on one level, the answer to my
question is very simple. If you have a toothache, you go to the dentist. If your car breaks down,
you bring it to a car mechanic. And if you have a question about nature, then you go to a scientist.
So that's the simple answer. And then I can get into more detail if you're interested.
Yeah, well, please. I mean, that's why we're here is to go into detail. I mean,
so that's a really good, you know, heuristic to use throughout our lives. You know, we've
talked about on the show before, when dealing with misinformation, that one of the things we say too much is, go do your own research. What you should really do, often, is go
to an expert and find out what an expert thinks. But, you know, I don't think you can just reduce
that to an argument from authority.
Right. That like, hey, this is the person who knows. Therefore, everything they say must be unimpeachable.
Just to take the devil's advocate position. There have been plenty of things that scientists have been wrong about.
Certainly, you know, innumerable statements, some of which were very harmful.
So how do we,
how do we integrate that into an argument for trusting science?
Well, integrate is the operative word because there are conspicuous examples where in hindsight,
we would say scientists were wrong and sometimes in damaging ways. And the second chapter of my
book is all about that. So I look at some specific examples where scientists did go wrong
and ask the
question, well, why did they go wrong and what can we learn from that? But if we put those examples
in context, what we find is that they're actually not that common. That if we look at the track
record of contemporary and modern science, and I think as a historian, I would say, I think we can
legitimately say that science as an institution, science as we know it, has been
around since about the 17th century.
So that's a long time.
So we have a long historical record.
And that historical record is very well documented.
It's very well studied.
And what we see is an amazing track record of success.
So many things about the world that we understand now because of the work that scientists did
and so much work that we've been able to use in good ways to cure disease, to create safe and effective medicines and vaccinations, to build safer houses, to predict where and when earthquakes may occur, to understand volcanic hazards, to build effective technologies.
I mean, there are so many things we've done and so many things that we take for granted in our daily lives. And I think one of the problems we face is that when everything's
going right, we tend not to notice. It's like we take our car for granted when it doesn't break
down. You know, we take Zoom for granted when it works right. We tend to notice when things go
wrong. So I think there's been a bit of undue emphasis on some of the mistakes or errors of
science against a larger background of a tremendous track record of success.
Yeah, you know, science seems really remarkable in that there's so many fields of human endeavor
that we've been doing as long as science that have not made progress in the same way. I've
had philosophers on the show, and we've talked about how, you know, philosophy hasn't advanced in the same way. And that's really the wrong way to look at
it. But, you know, it's not like, you know, we're like, oh, we've learned so much about philosophy
since Socrates' day. We still read Socrates and think he has some good points. And recent
philosophers, you know, have no better claim to truth than he does in many ways. Architecture, right,
we've been doing for centuries, but it's not as though architecture is like much better than it
was. It's better in some ways, but I'm here in Miami today and I'm running around. Architecture
is not great here in Miami, Florida. But science, by contrast, has made progress throughout the
last hundred, few hundred years.
And I suppose if you want to call, you know, sort of proto scientific endeavors that were done,
you know, millennia past, like we do actually understand the world in a concrete way,
much more fully than we did a century ago. Like we have made undeniable progress. And that's
rare in humanity.
Well, exactly. And of course, historians never want to use the word undeniable because, of course,
you can always find someone to deny it. But I think what is fair to say is, if you think about
the very word progress, it's a little bit of a vexed word because progress for one group of
people might actually be regress for another. Progress is a very value-laden concept,
so different people have different conceptions of what constitutes legitimate progress.
But nevertheless, it is true that the very word progress is often associated in many of our minds
with scientific progress and with technological progress. So why is that? Why do we have this
close coupling in our minds between science and technology and the
notion of progress? And I think the answer is exactly embedded in your question, because science
is a process of learning. It's about learning about the natural world. And we do have a notion
of progress being tied up with learning. Like we could talk about children progressing in school.
We could talk about making progress in learning to play the violin or progress in
learning to speak Chinese. When we're thinking about learning, we have a strong sense of progress
because we can track our own learning process and say, oh yeah, like I play the piano better today
than I did a year ago, or my French has become more fluent, or we know more about the natural
world. There are questions about the ocean, the atmosphere,
about life on earth, about the structure of the atom that we can answer now in powerful and
convincing ways that we couldn't 10 years ago, 100 years ago, 200 years ago. So I think that's
that notion of learning that makes us feel that, yes, there is a sense of progress in science.
And it's not that art isn't good. But if you compare, say, contemporary art to, let's say, 18th century landscape painting.
Well, yeah, it might be better. It might not be. It's more a matter of taste.
Yeah, you're right. It's really embedded in there.
Like, I think a lot about how my sister is a physics reporter for
Science News. She writes about particle physics, and, you know, she's been telling me about
the most recent advancements, about how, you know, particle physics
specifically has made such enormous advances in the last, you know, 50, 70, 100 years,
constant discoveries about like the fundamental nature of matter of existence,
right? Almost metaphysical questions are almost being answered. Except that in recent years,
they've sort of run out of frontier. There's been this problem where the physicists are like, wait,
we've got questions that we're trying to answer, but nothing new is coming up. We're not yet finding the avenues for
exploration. And when an experiment does turn up like an anomalous result that contradicts what
they know, everyone gets excited because they're like, oh my God, there might be new physics that
we can do. And if that doesn't happen, they're frustrated because they're like, well, we can't
know everything. There must be more to learn, but what is it? And they can't like even figure out what they don't know.
And I think that's such an interesting dynamic
that that's like so buried in our expectation
about what science is.
Because you would never, an artist would never do that, right?
An artist would never say, oh, what, there must,
actually maybe an artist would feel that way.
I think in literature, you can find people
who have said that like after the novel, what's next?
So maybe that's not so unique.
But what I would point out there, that seems to be something a little peculiar to physics. I'm not
sure what it is about physicists that they sometimes feel as if they've run out of questions,
because I've never known a biologist or geologist or chemist to ever speak in that kind of language.
But physicists sometimes do. And of course, you know, one thing any historian would point out is
they were talking like that in 1904. There was a very widespread sense in physics at the start of the 20th century that they were done. And well, I mean, we look back now, we think, wow, that's pretty funny that they thought they were done when they were actually right on the eve of relativity and quantum mechanics. So I'm confident that physicists will find new and more and good things to do, even if they don't see it right now.
I mean, do you think that that emphasis on progress, is that a mistake at all? Or is that,
you know, part of what we, is that part of what science is?
Yeah, that's a really great question. I think it's tricky because, as I said, as a historian,
I'm mindful of the idea that progress is a very value-laden concept. And the notion that there is progress in some
absolute sense is one that almost all historians would challenge both, you know, intellectually
and politically. But nevertheless, as a historian of science, I'm mindful of the idea that in science,
there is this sense of progress. It's what a historian would call an actor's category. That
is to say, scientists believe that their enterprise progresses, and they believe it
in a way, as you said, that philosophers don't generally.
So the fact that scientists experience what they do as progress, and we, as people who
use science, also experience this progress, tells me that it's not a category that we
just want to throw away.
It's a category we want to understand. And as I said, the way I understand it is to think of it as progress in learning,
that we learn more things, we learn new things, we discover things in the world that we didn't
know were there. And that's a kind of progress. Yeah. Well, so what is it about science that has
caused this remarkable, you know, success, as you put it?
Because there's certainly plenty of fields of, like I said, human endeavor that are trying to
progress. There are certainly plenty of philosophers who thought that they were,
you know, progressing the field. There were, you know, many, many alchemists
who felt they were understanding the nature of reality, but were failing to do so.
Yet science has actually been successful at it.
So what is it about science that causes it to work where other fields do not?
Well, I think there's two things, maybe three.
One is the sustained engagement.
So artists might not have a sense of progress in art, that we might not say that, say, cubism is progress over representational art, but art continues.
There is a sustained project that we call art.
So in that sense, we can say science is not that different from other activities.
Scientists have hung with the project.
They've sustained the project of investigating and understanding the natural world.
And so some of the success is simply that it has been sustained, that people have stuck with it.
They haven't quit. And especially when the going got tough. I mean, science can be really hard.
Many of the questions that the people I study have tried to answer were really, really hard
questions. I mean, I have another new book about the history of oceanography in which I look at
the question of deep ocean circulation. This is something that scientists argued about for 300 years before they finally
got to a point where they said, okay, yes, we think we know what's going on. So some of it is
a kind of patience and persistence that science as an enterprise has illustrated. The second thing
is money. I mean, part of the reason scientists have been able to sustain this
project is because people have been willing to fund it, and particularly in the 20th century
governments, because there's been a perception that science is useful. And particularly in the
20th century, I'm sorry to say, useful for warfare. The project, the book I wrote on
oceanography is called Science on a Mission, and it's all about military funding of oceanography
because of its relevance for submarine and anti-submarine warfare. So there's been a very
steady stream of funding into science, particularly in the last 100 to 150 years,
which has made it possible for scientists to have the kind of sustained engagement that they do have.
Other activities could have maybe been successful if they'd had the same kind of institutional and financial support. But science, we know, has had it. And then the third thing that I focus on in this book, in Why Trust Science, is what I describe as the critical vetting of claims.
That is, it's not enough to do the work, to do the investigations, to come up with ideas, to think that you personally have an answer to an important question; you have to expose that conclusion to the critical scrutiny
of your colleagues, your peers, your fellow experts in the field. And that critical scrutiny
is really, really tough. The philosopher of science, Helen Longino, refers to it as
transformative interrogation.
And I really like that term because I think it really hits the nail on the head of
what's going on. It's interrogation because it's tough. Your colleagues see
their job, and it's part of the cultural values of science that we see our role this way, as challenging
our colleagues. So if you come, you, Adam,
come to a conference, you say, hey, I have this brilliant new theory that I think is going to
explain everything. And here's my argument. I'm going to say, okay, that's great, but what's your
evidence? And then you say, oh, I have evidence. And then someone else is going to say, well,
yeah, that's great, but is that really enough evidence? Is that evidence really robust? Does
it really hold up? And so that critical scrutiny
means that a scientific fact, what we call a scientific fact, is never the opinion of one
person, no matter how smart or clever that person is. It's something that has been accepted by a
group of peers who have looked at it in a tough way from many, many angles. And then one more
thing. So as I said, Helen Longino calls this transformative
interrogation because typically what happens is that the initial claim doesn't entirely hold up
to scrutiny. So you come and you say, I have a theory of everything. And I say, hold on,
wait a minute. And then we have a back and forth. You know, we discuss it in conferences,
you submit it to a peer-reviewed journal, the reviewers criticize it, then it gets published, maybe then there's more feedback and commentary.
And it might be that you say, okay, well, yeah, my theory of everything turns out not actually to work for everything, but it does work really well for these two things.
And so it becomes a theory of those two things.
Or you modify the theory to make it work better in response to the critiques and
comments of your colleagues. And so the process is transformative. By the time we finish and we
agree on something that we think is true, it's typically not the same claim as what we started
with initially. And that starting point might have been a year ago, it might have been 10 years ago,
it might have been 100 years ago. That's really interesting. I'm really struck by the fact that I feel like if I
asked a lot of scientists or people who think about science that same question, they would
have answered me in a more, you know, epistemological sense. They would have said, oh, well, science,
here's how it works. You advance a claim and you test the claim and you throw out what isn't true.
And, you know, even though you might have mistakes, over time that winnows out the bad ideas, etc. And the fact that the community of scientists operates as a group of people,
which is often left out of that discussion. I think that's a really interesting emphasis that
you have. I wonder if you could tell me a little bit about that. Well, the odd thing about scientists
is that they're often extremely unscientific about their own enterprise. So many scientists
have ideas, they have ideologies, they have kind of ideals of what
they think science is or should be. And often these are ideals that were passed down to them
by their teachers or mentors, but they're actually not data-based. They're not based on the evidence.
And so this is why my field exists. This is why there are professional historians and philosophers
and sociologists and anthropologists of science, because we actually study science as it is. And in a way, I mean, I don't think of my project as exactly a science, but it's a kind of, it's an empirical project. We look at evidence, we look at data, we say, well, what are these people really doing? Set aside the theories and the ideals and the ideology, what are they actually doing? And when we actually look at what they're doing,
that's when we get this more complicated picture and a picture in which these social institutions of science, institutions like conferences, scientific societies, peer-reviewed journals,
these are crucial to the vetting of claims. And if you actually look at what scientists do,
you find they spend huge amounts of time in conferences. They spend huge amounts of time
preparing papers for peer-reviewed journals, revising those papers when they get them back,
when they get the criticism of the reviews back, resubmitting them. Sometimes the revision and
resubmission can take two, three, four, or five rounds. So these are the activities of scientists.
It doesn't mean that they don't also sometimes do experiments or test a hypothesis. Of course, they do that, too. But it's just a part of the process. And
well, I just put it that way. It's a part of the process. It's certainly important in many
contexts, but it's by no means the whole picture. I love that, that you as a historian are doing
science, or you're at least doing investigation on the scientists themselves. Who, of course... like, a physicist is not thinking scientifically about their own work.
They have not been trained or have any interest in studying the dynamics of human groups.
But that's what you do. And I think that is so often left out of our discussion of science.
Like so many of the times when there's a debate within scientific circles about, you know, inequity, gender or race disparities,
or any other sort of cultural issues in science that might be happening, there'll be a faction
who says, no, science is just about the scientific method. That's all it is. It's that ideological
sort of version of it. And you're really pointing out, no, this is where science happens.
That's such a startling observation, that science is happening in those conference rooms. Like, yeah, all those people are at the Large Hadron Collider, sure, and they're spinning the particles around real fast, but a big portion of science is happening when they go out to the bar afterwards and talk about it. And then when they go to the conference, and then they go to the bar at the conference. Everything happens in bars is what I'm saying. Everyone's drunk when science is happening.
Right. OK, well, maybe not quite that far. But, you know, you have the general picture. Yes.
Well, so my question about that then is that like, OK, if the scientists themselves aren't
really thinking along that dimension, that that is so much a part of what keeps what they do a healthy, successful enterprise, you know, it's not crossing their minds because they're not really necessarily trained to think that way.
Well, that suddenly makes science feel like a much more precarious enterprise than I thought.
For instance, when we're booking the show, we try to find folks who, OK, we want to make sure that our guests, when they're scientists, are part of the sort of
like community, right? That mainstream community in their field. But there is such a thing as,
you know, brilliant people who are outside of that, who are not able to hang,
right? Who are seen by that group as being, you know, outside of it. And there's also such a
thing as groupthink and cultural groups that make mistakes like that. So being that you have that focus on the cultural organization of science, how is it successful, even though there's all these sort of built in biases in human groups?
Well, the picture I put forward in my book is not more precarious. It's more human. And I think it is important for us to recognize the human dimension, because if we put science up on a pedestal and make it seem
as if scientists are some kind of gods, and then we discover that they have feet of clay, then we're
all disappointed and we're crushed. And we saw that. I was just talking, did another interview
where the issue of the so-called Climategate scandal came up, when some climate scientists'
emails were stolen. And what really came out of that? What did we really learn? We learned that the climate
scientists were people. We learned that there are people who sometimes get frustrated and angry.
We learned that they didn't really know how to cope with organized disinformation that they were
facing in their work. And in their frustration, they considered the possibility of some things
that probably were not a great idea. Now, did they actually do those things? No, there was no evidence that any of the
scientists involved had actually done anything wrong. But there was the suggestion that they
had maybe possibly considered doing some things that weren't entirely great. So this is splashed
all over the front page of newspapers. And there were even people who said, oh, you know, like my
faith in science has been shattered. And I wanted to, I mean, honestly, can I say this? This is all right, this is, like, not a family show. I wanted to hit these people. I'm like, what do you expect?
These are human beings. People make mistakes, right? And the good news of that story was they
didn't actually do the bad things they thought about, but they did think about it. Well, come
on, guys. I mean, have you never thought about doing something wrong in your life? Have you never been tempted to,
well, I don't want to say what I've been tempted to do, but you know, have you never been tempted
to do something inappropriate in a moment of frustration? Of course you have. We all have.
But here's the crucial thing. That makes science less precarious because the reality is that what do we really want from these scientists? We don't need them to be perfect human beings. We don't need them to be saints. What we need is for the scientific claims they make, in this case about the climate system, to be accurate, to be reliable, and as far as we know, to be true. And how is that achieved? That's achieved in that social dimension that I was talking about, because it's not just about the one climate scientist. He's got to take those claims
into the court of scientific opinion, and they've got to be vetted by other colleagues who are going
to ask hard questions and who are going to say, well, what is the evidence for this claim?
And so that human dimension is actually what makes science strong because it's not just the opinion of one person who might be fallible,
but it's the collective wisdom of a whole group of very smart people who have
looked hard at these issues.
Now,
the other thing I'd like to add about that is,
so one of the interesting things,
so my work draws heavily on work in the sociology of science that was being
done in the seventies and eighties,
particularly in the 1980s when I went to graduate
school. And in the 1990s, a lot of scientists took offense at this work. And they considered
that work that looked at the social dimensions of science to be somehow a kind of debunking.
I think those scientists got it completely backwards, that actually the social dimension
of science is part of its strength. And when we really look closely at how these institutional structures operate,
we realize, oh, well, it makes sense this is the strength
because you're not relying on an individual.
That would be, you know, a slim reed to hang your hat on.
I just mixed my words, but you know what I mean, right?
If you were only relying on one person
and that person turned out to be problematic,
then you would be in trouble.
But if you're actually relying on the wisdom and the knowledge of a collective group of people who
are all engaged in trying to find the right answer to these questions, that's a much stronger
position. So I often say, you know, I worked for years on these projects. I read unbelievable
amounts of stuff, a lot of it difficult and boring. And then at the end, I come down to
something that could be summarized by a cliche. So we all know the cliche that two heads are
better than one. But in this case, it's not just two heads, it's 200 or 2000.
Yeah. It's that group gestalt of all those minds working at it together that comes up with those
accurate, reproducible results in most cases, and that in most cases
corrects when there's been a mistake. Sometimes scientists like to say that science is self
correcting. And I don't know what that means, because that kind of makes it sound as if
science is alive. It's an entity and it fixes itself. No, I mean, science is not a robot that
knows how to repair itself. But the process of science is involved in identifying and
correcting error. So even once we accept a claim and we say, okay, well, we know the climate system
has warmed, for example, and we are pretty sure it's warmed just about one degree centigrade.
But the scientific process is always open to the possibility of revision, that even things we think
we know very well can be revisited if, and of course, this is a big if, if someone has evidence.
And so this also helps us distinguish between what I would call legitimate revision and legitimate dissent versus disinformation, misinformation,
fake news and all the other bogus stuff that we've been trying to deal with in our lives in recent years.
So in science, we do revisit questions.
And my early work was about one example of that. So in the early 20th century, the idea was raised that our continents were not
fixed, that the earth was in motion, that big pieces of the earth were moving around. And that's
what caused earthquakes and volcanoes. And that idea was debated. There was an open discussion
about it. The people who proposed it were not neglected geniuses. They had their day in court. But particularly here in the United States, American geologists in general rejected the idea. There was some acceptance, but not a lot. Mostly rejected. Europe was a little bit more mixed. But even there, it was not taken as established to be true.
About 30 years later, the debate was reopened and there was a whole body of new evidence available that made it possible to look at those questions again
in a different way. And with those new bodies of evidence, scientists said, you know what?
Actually, yeah, Alfred Wegener was right. The continents really do move. So now you could look
at that as a failure that scientists got it wrong the first time around,
that Alfred Wegener didn't get the credit he deserved. Or you could look at it as a success
story that, yes, it took 30 years and that is too bad. It would have been nicer if it happened
sooner. But actually scientists came back to the question and they sorted it out. And now we're
pretty sure that continents really do move and we're not likely to be revisiting that anytime soon.
Although, again, it's always possible, right?
A good scientist keeps open the possibility that we may be revisiting these claims.
Yeah, I mean, that's what you're talking about is plate tectonics, which is something that I learned about in, I don't know, sixth, seventh grade.
It was like a very bedrock piece
of my education in geology to the extent that I had one. Bedrock education. I like the pun there.
I didn't intend it, but I'll take it. Yeah. So the idea that it was actually rejected by
scientists in the United States for 30 years. Now, yes. Now, some of that had to do with World War II. So in my book,
Science on a Mission, I revisit this story. And one of the things I point out is that actually,
so scientists mostly rejected it. The debate mostly took place in the 1920s. But in the 1930s,
there were a group of scientists. It was a group of Americans and Europeans who were working on a
model to explain how continents might move. And they were very, very close to what we would now say is true.
It wasn't exact, but it was awfully close.
And they actually presented that work at a set of conferences in 1933 and 1936.
But then World War II broke out and the key scientists involved ended up doing military work that was classified.
It was secret. And so they couldn't talk about their work.
And after the war, a lot of the key data became classified.
So in the new book, I argue that even though military funding of science made a lot of things possible, made it possible for scientists to do a lot of new work that had not been possible before.
It also led to conditions of secrecy that
limited conversations on certain questions. And one of those was plate tectonics. And so it's not
until the 1950s, when a group of British scientists start looking at the question again, that the
debate gets reopened, not by Americans, but by British scientists. But once they reopen it,
then the Americans say, oh, wait, wait, wait. Yeah, we want to be part of this too.
Wow. This is fascinating. OK, well, you raise the question of military and government involvement in science. I have a question for you about that, but we got to take a really quick break. We'll be right back with more Naomi Oreskes.
I don't know anything.

Okay, we're back with Naomi Oreskes.
So you were talking about, right before the break,
about World War II and plate tectonics.
A realization that I've had in the last 10 years
is that I was brought up by scientists.
I come from a family of scientists,
and I very much came from that idea that scientists, you know, really care about scientific truth. That is what they're into it for. That is why they're doing it. And I believe that's why they're doing it.
You know, I read that book Sapiens a couple of years ago, and I think a lot of that book is a little glib.
But it has lots of little insights that stuck with me.
And one was that, you know, throughout human history, as much as we want to believe that that's true of scientists, and it is true of the scientists on the ground,
science is always funded by someone else who has another interest, for the most part. You know, agricultural
science is funded by the USDA, which is trying to increase crop yields, et cetera. You know, the Manhattan Project was funded by the government to create an
atom bomb. And there are these pressures upon science in that way that what science gets done
and what science has money poured into it is sort of determined by social factors, not by,
you know, some group of scientists deciding, oh, what's the best
science that we could do right now? How does that affect science in your view? Well, it affects a
lot. And this is certainly a really important question that I think both ordinary people and
the scientific community should be talking about more, because there are really big questions about
what we want science to do for us. You know, what do we want our scientists to be studying?
And I think all of us should be involved
in that conversation.
And yet that conversation almost never happens.
So in Science on a Mission,
the subtitle of my book, Science on a Mission,
is how military funding shaped what we do
and don't know about the oceans.
And the argument I make is that funding structures
very, very strongly determine what gets studied
and what gets ignored or
neglected. And so this becomes an argument for diversity in funding sources. The U.S. Navy was
a good patron of science. It funded a lot of outstanding research. We learned a lot of things
about the ocean thanks to Navy support. But there are also a lot of things we didn't learn. There
were a lot of things that were neglected because it wasn't something that mattered to the Navy. And that's not a criticism of the Navy. The Navy was doing its job. In fact, I think you could argue that it would have been a misuse of federal funds if the Navy had given money for things that were irrelevant to the military mission.
But we need to recognize that if we rely too much on, let's say, you know, the Defense Department to fund science or too much on the Department of Agriculture to fund plant research, we will end up with a lopsided vision of the world.
And it's not a criticism of patrons.
Patrons, of course, have something they're interested in.
Like you said, Department of Agriculture wants better corn yield, let's say.
And that's fine. There's nothing wrong with better corn yield.
But we need to recognize that there might be other kinds of questions about plants.
There might be. For many, many years, the Department of Agriculture funded what was known as applied entomology,
the applied study of insects.
But what was applied entomology really?
It was actually mostly how to kill insects.
It was mostly about pesticides.
And so,
okay, well, there are some contexts in which killing insects, you know, might be what you want to do, but there are an awful lot of other things about insects, like lots of good insects,
like bees that we need, or insects that play important, you know, roles in ecosystems in
terms of pollinating flowers, or insects that are food for birds, and lots of people love birds.
So if you only study insects to kill them, you will definitely miss out on a lot of important
things that you might actually benefit from knowing about insects.
How do we get funding for the things we actually want to know about insects? And in your view,
do we have enough of it?
Well, this is a really good question. I think this is something that I'd love to see more people
talking about. Rush Holt, who was a former congressman, one of the few members of Congress to ever have a PhD in
physics, has recently written a new preface to Science, the Endless Frontier, which was the
report that was written by Vannevar Bush at the end of World War II, that's generally viewed as
the blueprint for science as we've known it in America since the end of World War II.
And in his preface to it, Rush says,
you know, we really need a new social contract for science.
We need to rethink what the relationship is
between the American people and the scientific enterprise.
And I think he's right about that.
And so how do we do that?
I don't know.
I think we all need to think about it,
but maybe conversations like this,
maybe having journalists who are interested in science beginning to think through, well, we've done it this way for, you know, what, something like 70 years now. And it's worked pretty well in a lot of ways. But we can also identify some areas in which it hasn't worked so well. And maybe we might want to make some adjustments.
Yeah. But in your view, that's not a reason to trust science less,
the fact that it can have this lopsided funding structure. Correct. It just means that we need to know that the knowledge we have is going to be incomplete. And so we might want to ask
ourselves questions. Oh, for example, here's a good one. I mean, I've given a lot of talks this
year about COVID-19. And one of the things I always say is the scientists have done a great job. They've done exactly what we needed
them to do. They identified the virus, took a little while, but they figured out how it was
transmitted. They sequenced the genome. And with that information, they created these amazing
messenger RNA vaccines that many of us here in America have now got that are proving astonishingly
successful. 95% effectiveness in the clinical trials, and seemingly something like 98 to 99%
effective in real life, which is unusual. Usually it goes the other way around. And apparently
virtually 99 to 100% effective in preventing death. This is an astonishing success. And it shows us
scientists have delivered exactly what we wanted of them. However, okay, so that's the good news.
But here's the bad news. We have about 20% of our fellow Americans who have said they don't intend
to get vaccinated. Not now, not ever. And why is that? Is it because the vaccine doesn't work? No. Is it because the vaccine is unsafe?
No.
So what is it?
Well, we have indications that it's cultural.
It has to do with personal identity.
It's ideological.
It has to do with how you feel about the government.
It's partly tribal.
People in certain states are much less likely to get vaccinated than others.
So these are not questions that will be resolved by better sequencing of the genome, right?
These are social scientific questions that might be resolved if we had a better understanding
of people's relationship to science and people's relationship to their governments.
So that can be answered with social scientific research.
But as most of us know, social sciences have been massively underfunded in this country. So this is a shameless plug for what I do. I think we've
learned in the COVID crisis that we need social science as much as we need physical science.
Yeah. Wow. That's an amazing point because, yeah, the vaccines are exactly what you're
talking about. It's proof that the science that we funded has had enormous success.
mRNA vaccines are one of the coolest scientific innovations.
It's literally hacking our cells' protein-making machinery in a way that allows us to fight a novel disease within months of it appearing.
And that combined with
the spike protein, all these other discoveries all coming together in this enormous thing.
But, you know, I've talked to plenty of folks on this show about how half of the battle with
public health or maybe even more than half the battle is communication, is knowing how we get
people to understand how to protect themselves, how to wash their hands, how to, you know, all over the world this is the case.
You know, people, so much of our progress, you know, getting rid of communicable diseases has been teaching people to use toilets, for instance, in places throughout the world.
You know, the government through the NIH funds medical science to this enormous degree, billions upon billions of dollars flowing through the NIH, one of the largest research organizations of any kind in the world devoted to coming up with things like new vaccines.
But precious little devoted to understanding the people who are going to take the vaccines, who we need to convince to take the vaccines, who we need to understand why they might not take it.
And yeah, we don't. Why isn't the government massively funding a program to understand its own people better and why they might be hesitant?
Exactly. Right. And so that has to do with, I mean,
the why to that has to do with the legacies of the Cold War and the
relationship between science and the military.
That's one reason why I became interested in the science-military relationship
because a whole scientific infrastructure
was built to a very great extent
because of the perception of how science could help
in national defense.
And that wasn't necessarily wrong.
I mean, you may take objection to it or not,
depending upon your point of view,
but it wasn't necessarily wrong, but it's incomplete.
And the COVID-19 crisis really illustrates that. And we can even take
it a step further. So as you said, public health is about the public. And so if you want public
health to work, you have to understand the public, but then you can even take it one step further.
So we know from opinion polls, we have about 20% of Americans who don't intend to get the vaccine
if they can avoid it. But we also know there's another percentage, and I'm not sure the numbers here, but people who would like to get the
vaccine, but they don't get paid sick leave. They can't take a day off from work to get the vaccine
without sacrificing salary. And maybe, like I know where I got vaccinated, well, I got vaccinated in
Utah, so there were no vaccines available on Sunday in Utah.
But I was fortunate that I could go on a Tuesday. But then there's also people who,
in addition to not being able to take a day off or an afternoon off to get the vaccine,
are worried that they'll get sick because we know that many people do have some degree of adverse reactions, particularly to the second shot. Many of us who've had the second shot have
been pretty sick for a day or two afterwards, but they don't have sick leave. So they feel like, so now I'm looking at
maybe three days of lost work and I can't afford that. Now, the NIH is not going to intervene in
that issue because that would be considered politics. And it's not the job of NIH to figure
out how do we fix the fact that millions of Americans don't have sick leave. And yet that political piece is crucial to the public health part. How do we sort that out?
That also, again, though, just goes to your point of how successful science has been as a field of
human enterprise that we, you know, we have advanced the point that we have these vaccines,
but our political culture, our political systems are clearly far less advanced.
Like if we were as successful in organizing our society as we are in organizing science specifically, people would have that sick leave.
People would not feel those pressures. People would have more trust in their government.
Like, a lot of people not wanting to take the vaccine is just based on the legacy of the Tuskegee
experiment and all these sorts of things. Well, hold on. I need to just jump in there
because that's actually not true. That's been... Oh, OK. Correct. Yeah. That's a good example of,
again, where we actually need to study something. So a lot of people have been saying that and
suggesting that, well, African-Americans have a legitimate reason to be nervous about
the public health system because of abuses like Tuskegee. Well, there's actually very little evidence to support that.
And the poll data we have, we don't have a lot because this is, you know, it's a situation that's
very mobile. It's changing by the day. But actually, most of the data we have has shown that
most African-Americans and people of color in this country do want to get the vaccine. And if they're
not getting it, it's because of these issues of access that I was just referring to.
They can't get time off from work. They don't have sick leave.
There's no vaccination center close to their homes.
Now, the good news on that is when you realize that that's what's going on, then you can figure out, you say, OK, well, maybe we can bring the vaccines to the workplace and people can get their vaccine at their lunch break.
Or maybe we can bring the vaccines into their communities.
And in Los Angeles, there was a very effective intervention a couple of months ago where doctors did that and they spread the word in the community through churches.
And, you know, thousands of people showed up.
So this is why it's so important to have the social science research to know what is really the obstacle for folks.
Because if it were the legacy of Tuskegee, that would be a different kind of problem with a different
solution than if we just have to send a truck with vaccines into a workplace, right?
Yeah, fair point. I mean, that legacy is real, but we should understand what the problem that
we are trying to address is, or we're going
to come up with the wrong solution. I want to make sure that we touch on denialism,
by which I don't mean, I think when we talk about denialism, we often think of like
the sort of stereotype of a person who's just, I hate science, blah, blah, blah.
I'm talking more about the concerted effort to disrupt the scientific process by, say, the fossil fuel industry, by other, you know, sort of bad actors that are specifically, sometimes, almost harnessing the language of science in order to disrupt the work of science.
Tell us a little bit about it.
Well, thank you for asking because this obviously is a very important part of this landscape.
So I'll start with good news. The good news is that public opinion polls do show that the majority
of Americans do trust science broadly, and they do believe that in general, science makes
our lives better, and that in general, scientists do have the interest of the American people
at heart. But we also know that there is suspicion and rejection of science in particular areas.
And one of those areas is climate change, which is the area I've studied most closely, as you said.
And here, this is where the role of deliberate disinformation is an important part of this
landscape. So we've documented that for 30 years now, the fossil fuel industry
and their allies in conservative ideological circles like libertarian think tanks have
promoted the idea that we don't really know if climate change is true, that scientists have
exaggerated it, that scientists are not to be trusted, that they're just in it for the money,
that they don't actually care about the truth. And they have deliberately promoted distrust in science in order to promote distrust in climate science and prevent action that would
hurt their bottom line. And so it's really important for us to understand this, to know that
this is, in my opinion, malfeasance on a grand scale, and that it needs to be called out,
and that the corporations that do this should be named and shamed because
not only are they preventing us from acting on climate change, which is an incredibly
important issue in our lives, but they've also now done damage, which we've seen expressed
itself in distrust of science more broadly, which then contributes to people not wanting
to wear a mask or not wanting to become vaccinated.
And so what are the things that they do to wage this campaign,
if you could give us a couple of examples? Oh, well, we only have a few minutes left,
and there are so many examples. But in Merchants of Doubt, we did document the coordinated
campaigns. They involve things like television advertisements, advertisements in print media,
paid lecturers who would go on a lecture circuit, workshops at libertarian think tanks
like the Cato Institute, for example, just to name one, attacks on scientists, coordinating
with right-wing members of Congress to investigate scientists.
So we saw in the previous administration, we saw quite a bit of that, all of which has the same general modus operandi, which is to cast doubt on the reliability of science, to cast doubt on the integrity of scientists.
So people say, why should I wear a mask? Or, since I don't know, I don't want to spend money. I don't want to have to pay a carbon tax. And after all, why should I spend money on a carbon tax? Why should I accept a
higher price for gas if we don't even really know if the science is right anyway? Sometimes,
you know, I've covered in my past work cases in which groups like this, whether the fossil fuel industry or another motivated group, have even done their own science, or their own science-looking work.
I mean, even fucking Gatorade has like their sports hydration institute or whatever, right,
where they fund studies on how great Gatorade will make you at football or whatever it is.
And, you know, how much is that an issue?
I mean, there have been cases where, speaking about sort of the right-wing movement, they've funded chairs at universities that have a particular point of view in economics,
right? And that's influenced our discussion about economics in this country. How much do you worry
about that as a distorting influence on not just public perception of science, but like the actual
work of science itself? It's a big problem. It's a big problem on two levels. First of all, because there's a lot of it.
And also because, as you suggest, to the degree that real science is compromised by
conflicts of interest and distorting influences, then the American people begin to have a legitimate
reason to distrust at least some science. And then it really, really muddies the water.
And we've seen this now in a number of areas. The fossil fuel industry was among the people who first developed this as a fine art. And,
I'd say it plays out on two kinds of levels. One is deliberately funding academics in order to
influence their findings. And so we have good evidence that comes out of the tobacco industry.
One of the really scary things about the tobacco industry was that they funded research in practically every major university in America, including most major medical schools, and very,
very few schools turned them down. So this led to a huge distorting effect because, you know,
if you're doing research that's funded by the tobacco industry, and you know very well that
the tobacco industry does not want you to find that their product is bad, this is likely to influence your work at least a little bit. And we have studies
that show, in fact, that's the case, that tobacco-funded research was less likely to find
that tobacco caused disease than independent research. And this went on for the better part
of 30 years. It still continues in some quarters. And I think it definitely contributed to delaying action to control tobacco. It contributed to confusing the American people,
confusing journalists, so that even in the 1970s, for example, long after it had been proven that
tobacco tar caused cancer, the New York Times was still quoting tobacco executives saying,
oh, well, we have this study that says, you know, we're not really sure, we don't really know. And we've seen similar things in fossil fuels and plastics, and, as you said, in food; there's a lot of distorting industry research in the food domain. Now, in addition to that, though,
there's also what I would call the funding of distracting research. So in this case,
the research is not necessarily influenced, but it's used to distract attention from the issue at stake. And again,
the tobacco industry really raised this to a fine art. So here's one of the things the tobacco industry did. We know that smoking causes lung cancer, but it's also the case that there are
other causes of lung cancer. For example, asbestos causes lung cancer.
Radon causes lung cancer.
So what did the tobacco industry do?
They abundantly funded researchers who worked on asbestos and radon.
And then if somebody would raise the link between tobacco and cancer, they would say,
well, but you know, we have these studies that show cancer is caused by radon and asbestos.
And of course, it wasn't a lie. It was true. Cancer is caused by radon and asbestos. But now
the listener is confused. Oh, well, so maybe tobacco is not that bad. Maybe my Aunt Dinah's
cancer wasn't because she smoked two packs a day. Maybe somewhere in the past, she was exposed to
radon. And now again, you've muddied the waters and you've made it harder to
have effective public policy because you've made the scientific landscape seem more confusing
than in fact it actually was. And you're also changing what gets studied. Like you say, our oceanography is mostly funded by the Navy, and so we have a certain gap in our understanding. I mean, food companies provide a lot of the funding for food science, and therefore their concerns are the ones that get studied. Even when it's not distorting the findings, it means we're getting a certain sort of finding and we're maybe not researching other topics that we might want to know about food. So, given those pressures, is that something that you think the scientific process is still able to correct for?
Well, I think it's difficult. I think
we have seen areas where the scientific waters have definitely been muddied. So I think
universities need to do a better job of vetting
funders. I think scientists need to take this more seriously. I've certainly been in cases where
scientists have taken money directly from industry to fund specific studies, but they say, oh, no,
no, no, I'm objective. It's fine. Well, yeah, maybe. And of course, you know, if my argument
is correct, then in many cases it actually will be fine, because the scientists may present those results at a conference and other scientists will push back and say, well, we're not sure that that's true. But peer review doesn't catch everything: editors are busy, reviewers are busy. Peer review ends up being a pretty low bar.
So anything that can't pass peer review
is probably not high quality,
but a lot of things do get through peer review
that are not that great.
So I do think it's a big issue.
And I think we need to do more; the scientific and academic communities need to take this on board as part of their own housekeeping, to say, yeah, this actually is a problem, and we need to figure out how to do more to clean house, so that the public can maintain their trust in our activity, because we're doing our own homework to keep our own house in order.
Well, let me end with this.
When we, the public, are trying to evaluate science
that we see come across our transom, across our social media feed.
I saw an interview where you mentioned, for instance, there was a, you know, article that
got a lot of press in Outside Magazine about why we shouldn't wear sunscreen and another article
that said why flossing isn't really good for you. And I remember seeing both of these and going,
well, I'm not so sure about this, but I mean, if scientists say so,
I don't know. I didn't quite swallow it, but I definitely thought about it a little bit.
What should we as lay people be doing when we receive, you know, information like that,
which we're bombarded with? The media is constantly saying new study about this,
new study about that. How do we as lay people evaluate it?
Well, I think the first question to ask is who's saying this? So Outside Magazine is not a
scientific organization, right? And so when they start publishing a thing about science, it's
legitimate to say, well, okay, maybe. And it doesn't mean that you necessarily reject it out
of hand, but if it's something you're really seriously interested in, you need to do a little
bit more work. And don't throw away the sunscreen yet, right? And, you know, you often hear this phrase now, I think you even mentioned it, I'll go do my own research. Right? I mean, there's
nothing wrong with people being well-informed, but it means you have to actually go to legitimate
sources. So if you're interested in COVID-19, then you need to go to NIH or CDC. And if you're interested in climate change,
NASA, for example, has a great website about the evidence, the scientific basis for understanding
of climate change. The information is there, but you can't just Google climate change. You know,
you have to go to NASA and then see what they have to say. So in other words, start with the
source, not the topic, because if you just start with the topic, you will find yourself in a
labyrinth of disinformation and fake news. The other thing I would say is that, you know, I think
this responsibility cuts both ways. A lot of scientific organizations don't do a good job
of making information available to people in plain language. And I know, when I've tried to learn about something, like the sunscreen thing, for example... I mean, NASA's great. I love NASA. I think they've done
a fantastic job with their public outreach. And I think partly because NASA has always had a
public facing element. I think the CDC could do a lot better in some of its communication.
And then of course, journalists play a role here too, right? Journalists have to stop running one-source stories with something sensational. I mean, we saw with that dental floss case, it was very sensational to say, oh, flossing does you no good. And all the newspapers ran with it. And there were all these headlines filled with schadenfreude.
But hold on, guys.
So this came out of the AP, which is not a scientific organization.
And who were they quoting and where was the data coming from? And what did the American Dental Association have to say about this?
And the reality was the dentists were all saying, no, dental floss is good.
We have lots of clinical evidence.
But the problem is you can't do a double blind trial of flossing because people know whether they're flossing or not.
So this gets into another issue that I talked about in the book, which is what I call methodological fetishism.
Some people become very obsessed with the idea that, you know, if you don't have a double-blind randomized clinical trial, you have nothing.
And that's just ridiculous because there are many forms of evidence that are useful and valuable besides
double-blind clinical trials. And for some problems like dental floss, like most areas of
nutrition, you can't do a double-blind trial. People know what they eat. People know if they're
brushing their teeth. So you have to look at other forms of evidence. And so it means it's not always
easy. It means you can't be lazy. You have to do some work.
But, you know, if you're a journalist, you're getting paid, you have a job, you can do the work, right?
There's also, I mean, yeah, you're right.
This sort of like methodological fetishism.
And I've fallen into this trap before myself, where I say, oh, there's no study that conclusively shows this, there's no evidence for this. But with flossing, I think after I read that article, I did cut back on flossing for a while, because I don't like it.
I don't think I was completely convinced by it, but I was like, let me see what happens if I don't
floss for a little bit. And when I didn't floss for a while, my gums started bleeding, and I went to the dentist and they were like, your gums are really bleeding. If you floss, your gums won't bleed so much.
And I was like, OK, I will.
And then I started flossing more and my gums stopped bleeding.
And like that's evidence, too.
And part of being empirical is saying, here's a physical fact.
It's not just the abstract fetishization of, like, an ideological attachment to the scientific method.
It's, I experienced this. It happened in the real world. Exactly. And that's what all the dentists
said. They said, look, talk to any dentist. The dentist can tell whether you've been flossing or
not. It's obvious in the state of your teeth. And in fact, we did have good evidence that people
who didn't floss were more likely to develop gum disease and in some cases, even serious illness
where bacteria in the gum could spread to the heart.
Evidence comes in lots of sizes, shapes, and colors.
Some evidence is more robust than others.
But if we can't get the most robust form for whatever reason, then we can turn to other
forms of information.
And in some cases, that does involve our own personal experience.
As you said, nobody likes flossing.
I mean, that's partly why that article got so much attention. So many of us wanted it to be true. We wanted them to be able to say, oh, yeah, just throw away the dental floss.
But we know from our own experience, as you said, when you don't floss, your gums bleed and
bleeding's bad. So, hello, that's evidence, right? You really seem like an optimist about science,
despite all the issues we've been discussing, you know, funding distorting what science gets done, denialism, all the challenges and headwinds. You still seem to be a real optimist about it.
Well, I think the thing is this. So I was trained as a scientist originally. I still
keep nice rocks on my desk. This is a very beautiful piece of fluorite from Australia.
Gorgeous.
The scientific enterprise is one of the great human endeavors: understanding the world we live in.
The world is so complex and so beautiful and so amazing.
I mean, when you learn how things work, when you understand bumblebees, or the ways in which flowers work, you know, it's what Darwin meant when he said there's grandeur in this view of life. Science is this very cool and
exciting enterprise. And when you do work and you feel like you actually understand something about
the natural world, it's deeply, deeply gratifying. And I think science popularizers, you know,
you think about someone like Carl Sagan, that's partly what they managed to convey: that sense of excitement and wonder in the natural world. So I see science as a great enterprise, and I think it's done a lot of good in the world. It's not all good. But I think it's something that's definitely worth preserving and protecting, particularly against organized disinformation by people with vested economic or ideological interests. Because if we lose science,
we will lose a lot in our lives. And so, yes, I'm optimistic about something that I think is really
valuable and wonderful and that I think has made almost all of our lives better overall.
Maybe spreading that view of wonder a little bit more is key to fighting denialism and helping people trust science more. Like, you know, the vaccine is presented as, oh, scientists came up with this. You should take
it. We made it real fast. But if people really could understand what mRNA vaccines do and why
they're so incredible, and if we were better at telling that story, maybe we'd be better
at, you know, fighting the disease and helping people build
that trust in the process. Yeah, I think so. I mean, obviously, the whole vaccine thing is
complicated. I don't think we'll solve it entirely with just the wonder of nature. You know, that
would be nice. I do think there's a way in which nobody likes to feel disempowered and nobody likes
to feel condescended to. And so if you just say, here, take this vaccine, there will always be people
who will recoil from that. But if you can say, look, let me explain how this works. And it is
kind of cool. It's kind of amazing, right? That in this short amount of time, we could send
instructions to our cells so that they can fight this disease on their own, which is what the vaccine is doing. It's pretty amazing, actually. And I do think the more scientists can take the time to do that kind of work, that kind of explanation,
and also to really listen. I mean, I hear a lot of scientists talk about how they need to talk
to people more. And I say, yeah, talking's good, but listening's better, right? To actually listen
to people and ask them, well, what is your concern? Is it about Tuskegee? Or is it because
you can't get a day off from work? And it might be both. For some people, it's one. For some people,
it's the other. For some people, it might be some combination. But there's no way to find
out what it is if you don't listen. And so I think the scientific community, one way or the other,
has to find better ways to really hear the American people and find out more about what
people's concerns really are. Because I think in most
cases, those concerns can be addressed. I mean, we have taken steps to ensure that Tuskegee doesn't
happen again. There are still problems with healthcare delivery in minority communities,
but I don't think we're going to have another Tuskegee. So, you know, these are things that
can be addressed, but they don't get addressed most of the time.
And so we have to figure out better ways of addressing them.
Well, thank you so much for coming on the show to talk to us.
This has been a fascinating conversation, and I can't thank you enough.
The book is called Why Trust Science, and people should really pick it up.
Thank you so much for being on the show.
You're welcome.
It's been fun.
Well, thank you again to Naomi Oreskes for coming on the show. If you want to get a copy of Why Trust Science, that URL once again is factuallypod.com slash books. And because it's through
bookshop.org, you will be supporting not just this show, but your local bookstore when you
purchase there. But of course, if you have a local bookshop in your area, please go to them first and support those fine people
who sell books to you. That is it for our show this week. I want to thank our producers,
Chelsea Jacobson and Sam Roudman; our engineer, Andrew Carson; Andrew WK for the use of our theme
song, the fine folks at Falcon Northwest for building me the incredible custom gaming PC
that I'm recording this very episode on. You can find me online at adamconover.net or @AdamConover, wherever you
get your social media. Thank you so much for listening. We'll see you next week on Factually.
That was a HeadGum podcast.