Stuff You Should Know - Research Bias: Sort It Out, Science
Episode Date: October 5, 2021
There’s a sticky issue scientists have to deal with – science is carried out by humans. We humans have flaws (and how) and they can end up in our work. Fortunately, science is waking up to research bias. In the meantime, here’s what to look out for. Learn more about your ad-choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.
Transcript
Hey, I'm Lance Bass, host of the new iHeart podcast Frosted Tips with Lance Bass.
Do you ever think to yourself, what advice would Lance Bass and my favorite boy bands
give me in this situation? If you do, you've come to the right place because I'm here to help.
And a different hot sexy teen crush boy bander each week to guide you through life.
Tell everybody, yeah, everybody about my new podcast and make sure to listen so we'll never,
ever have to say bye, bye, bye. Listen to Frosted Tips with Lance Bass on the iHeart
radio app, Apple podcast, or wherever you listen to podcasts.
I'm Mangesh Hattikudur and it turns out astrology is way more widespread than any of us want to
believe. You can find it in Major League Baseball, international banks, K-pop groups, even the White
House. But just when I thought I had a handle on this subject, something completely unbelievable
happened to me and my whole view on astrology changed. Whether you're a skeptic or a believer,
give me a few minutes because I think your ideas are about to change too. Listen to Skyline Drive
on the iHeart radio app, Apple podcast, or wherever you get your podcasts.
Welcome to Stuff You Should Know, a production of iHeart Radio.
Hey, and welcome to the podcast. I'm Josh Clark and there's Charles W. Chuck Bryant and Jerry's
here. Jerry's back, everybody, looking well rested and sun kissed and everything. And this is Stuff
You Should Know. She's like a beautiful, juicy orange. That's right. That's right, Chuck. That's
a really apt description. Ready to be squoze. I wish I could squeeze her, but we're still not
squeezing. No, not in this pandemic. Are you crazy? Are you out of your mind? No, no squeezing. Yeah,
even Robert Plant wouldn't let you anywhere near him. Okay, Chuck, figure out that joke. Robert
Plant. Squeeze my lemon. The lemon song? Nope, till the juice runs down my leg. Yeah, that's the lemon
song, right? No, I don't think so. I think it is. I don't think it is. Okay. I don't think it is the
lemon song, man. I think it's Whole Lotta Love. It's Whole Lotta Love. All right. It's maybe the
dirtiest thing that was ever said in like a top 10 song. Okay. Regardless. I'll just let the emails
take care of this. I really think, no, the lemon song is... I love those lemons. Right. No, the
lemon song is all about how you have friends, like you want to have friends and like friends are
good to have. Okay. Yeah. I may be all wrong. No, I think it's Whole Lotta Love. Yeah, it is. I'm
100% sure, buddy. All right. Well, I encourage you not to Google the lyrics then. Well, we could ask
our good friend and Stuff You Should Know writer Ed Grabianowski, the Grabster. Oh, look at that
segue. Because he is in a band and has been for a while. We've mentioned it before, Space Lord,
which has just a super cool Zeppelin-esque sound to them. And they just... They cover some Zeppelin,
too. But they do. You're in there. And they just released a new single, which you can find on
Bandcamp by searching Space Lord. The Space Lords. No, not Vinnie and the Space Lords. Yeah, Space
Lord. Just look for Space Lord with some cool graphics, and you know that's Ed. You'll know
it's the Grabster. Yeah. Good stuff. We also have a game out that Trivial Pursuit made.
Yeah, we should plug our own stuff every now and then. We just did. Yes. It is a co-branded game
with Trivial Pursuit from Hasbro. And it is not a Trivial Pursuit game that you are used to. It
is a stuff you should know game that Trivial Pursuit was happy to co-brand with. So just
what I don't want is emails that are like, this isn't Trivial Pursuit. This is some other different
game. You're always worried about the emails, aren't you? I'm going to just ignore them,
let them roll off my back. I know. Like, I'm disappointed in you guys for this. I haven't
even listened to the episode, but I'm disappointed about that. You just got one of those. Did you
see that? It just rolls off your back. Yeah. Yeah, those are always great. I didn't listen, but...
Here's what was wrong. I wrote that person back actually. I was like, we actually
kind of did exactly what you hoped we would do. They're like, oh, sorry for being presumptuous.
Anyway. Oh, all is forgiven. Yeah. So we're talking today about a bias, Chuck. And I want to set
the scene a little bit because, you know, one of the things that I'm always like harping about is
like the death of expertise, right? Oh, sure. And it's a real problem. Like this idea that
science can't be trusted, that people who go and spend a decade or more learning about a specific
thing that they go out and become an expert in, or that's their profession, that's their training,
that those people, what they have to say is basically meaningless or that it's no better than
somebody on the internet's opinion about that specific subject that that person spent 10 or
12 years being trained to be an expert in. Like that kind of stuff to me is like super dangerous.
Like there's an erosion of something and it's an erosion of intelligence to start with,
but it's also an erosion of just believing in facts and knowing that you're not being taken
for a ride or hustled. It is a huge, enormous problem that we're just beginning to wake up to
and is still unfolding. It's not like it happened and now we're like reeling from it. It's still
happening in real time and it is a massive, huge issue. One of the biggest issues that
humanity faces, I think because it encompasses so many other large issues like climate change,
existential risks, the pandemic, politics, all of them kind of fall under this this erosion of
belief and facts and that there are people out there who know more than you do. It's a big problem.
Yeah. Imagine being someone who studied and researched something intensely for 10 or 15 years
and, when presenting facts, being met with: I don't know about that. That's a response I hear a lot
in the south. Yeah. Or that they saw something on YouTube that flatly contradicts that and it's
like that it doesn't matter. Like what you just said is ridiculous that somebody posted something
on YouTube and that that has as much weight as what somebody who spent 10 or 12 years studying
this very thing, who knows exactly what they're talking about, has to say about
it. It's maddening. Yeah. There's something about people from the south in general, I think,
that are in this group that I have literally heard that response from a lot of different people
when I've been like, oh, no, no, no, no, here are the facts actually. And then when presented with
something that they can't refute, they say, I don't know about that. And that's it. That's the end of
the conversation. That's different than the people I've encountered. The people I encountered like
their brow furrows and they start pointing fingers and their tone goes up. Are you hanging out at
the country club or something? I think it's different types of people. There's ignorance and
then there's also people that actually think they're better informed that will fire back with YouTube
clips. Right. So the reason I brought that up is because one of the reasons that that is being
allowed to exist, that that does exist. I think it's a reaction to something else that's going on
simultaneously, which is there are a lot of experts out there who are performing really sloppy science,
sometimes outright fraudulent science, and they're frittering away whatever faith the
general public or society has in their expertise and in their profession. And there are a ton
of scientists out there. I would say the vast majority by far of scientists are legitimate,
upstanding, upright, dedicants to science. Right. That's where they place their, that's
where they hang their hat. That's where their heart is. That's what they believe in and that's
what they work to support. But science has like kind of a problem, Chuck, in that it's allowing
way too much for bias, which is what we're going to talk about, to creep into science
and undermine science and basically produce papers that are just useless and trashed and
there's a whole lot of reasons for it. But it's something that needs to be addressed
if we're ever going to get back on a footing with a faith in experts and expertise and just
facts that there are such things as objective facts. Yeah. I mean, a lot of times it's financially
related, whether it's a lack of funding, a desire for more funding, a desire just to keep
your lab running and people paid on staff, which, you know, all this stuff is understandable. You
want to keep doing this work, but you can't let that get in the way. It's like in Rushmore
at the end when Margaret Yang faked the results of that science experiment
because she didn't want it to be wrong, you know? I don't remember what,
I don't remember that part. Was that like a deleted scene? No, no, no. Was it in the end when
they meet up and he's flying the, I think he's flying the kite with Dirk and she's talking
about her science fair project and he was really impressed with it and she was like,
I faked the results. And the reason why was because she didn't want to be wrong.
And I think a lot of times people will get into a certain body of research or data too
because they want to prove a certain thing and if they can't, it might be really hard to live
with that. So that weighs into it. Money for personal gain, advancing your career, you know,
publish or perish, that whole thing. Like we're going to talk about all this, but there are a
lot of reasons that it's been allowed to creep in, but all of it is at the disservice of their,
the fundamentals of what they base their careers on to begin with.
Yeah. It's at the, it's at the disservice of science itself, right? Because the whole point
of science and then scientific publishing, the whole publishing industry is to, to basically
create a hypothesis, test your hypothesis and then share the results with the world. And that's
ideally what would happen because you're building this body of scientific knowledge.
But money and corporate interests and academic publishing have all kind of come in and taken
control of this whole thing. And as a result, a lot of the stuff that gets published are trash
papers that shouldn't be published. A lot of the really good papers that don't come up with
sexy results don't get published. And then like you said, people using science for personal gain,
there are a very small cadre of thoroughly evil people who are willing to use their scientific
credentials to create doubt in the general public to, to prevent like people from understanding
that climate change is real for 20 years or that fossil fuels actually do contribute to,
to anthropogenic climate change. But what we're mainly focusing on is like bias in the, in the
sense that people carrying out studies are human beings and the human beings are flawed. We're just
flawed and we bring those flaws to our studies and that you really have to work hard at rooting
those flaws and those biases out to produce a really good, thorough scientific study with good,
reliable results that can be reproduced by anybody using the same methods. And that science is just
starting to wake up to the idea that it is really biased and it needs to take these into account
in order to, to progress forward from the point that it's at right now.
Which is tenuous, I think.
Which is perhaps more tenuous than ever.
The point that science is at?
I think so.
Science isn't going away. It's not going anywhere. It's probably the greatest achievement humans
have ever come up with, right?
No.
It's not going anywhere, but it is a terrible position that it's in.
And it's going to take some genuine leadership in the scientific community from a bunch of
different quarters and a bunch of different fields to basically step up and be like, guys, this is
really bad and we need to change it now. And a lot of people need to be called out.
And science typically shies away from naming names and calling out by name fraudulent scientists
because scientists seem to like to, um, assume the best in people, which is not always the case.
Right. And having said all of this, there could, we could root out every bias and, and, and really
clean up the scientific publishing community a hundred percent. And there's still a set,
a certain set of people in this country and in the world who that wouldn't matter to
and would still shut down facts and because it doesn't fit their narrative. So.
For sure.
But Chuck, those people have always, they've always been there, right? And they're always
going to be there. There's always, it's just contrarians that they are, you can call them
free thinkers, you can call them stubborn, you can call them purposefully, purposefully ignorant.
Who knows? They're always going to exist. The problem that this crisis that science finds
itself in right now is that it's allowed that, that population to grow and grow and like people
who otherwise didn't ever really question science have been allowed to kind of trickle into that
fold and that those are the people that we should be worried about. The ones who would,
would know better if they believed in science again.
Right. And our way into this is to talk about different kinds of biases
in true Stuff You Should Know fashion, in a top 10 that is not a top 10.
That's exactly right. We ate into at least three in this intro.
And hopefully shining a light on some of this stuff. People at least be more aware of different
biases. And the first one is good old confirmation bias. I mean, these aren't ranked because
confirmation bias would probably be number one as far as people's awareness of it. But
there are different examples that people use for confirmation bias. And I kind of enjoyed
the one from the HowStuffWorks article, even though it's from 1903. After X-rays were discovered
in Germany, there was a French scientist named René Blondlot.
Yeah. Yeah. He looked at X-rays and said, moi aussi.
Well, who said, hey, I see n-rays. I've discovered n-rays. And everyone's like, what's an n-ray?
He said, well, it's like a corona when electricity discharges from a crystal and you can only see
it in your peripheral vision and American Robert Wood laid the wood and said, I'm going to come
to your lab and check this out and secretly remove the crystals during one of the experiments and
Blondlot still saw these n-rays. And so that's confirmation bias. He wanted to see those n-rays
and then later, even though it was disproved, other French scientists supposedly published papers
or published papers based on that research, because they wanted it to be true. So that's
what confirmation bias is. It's when you're starting out with a hypothesis that is going to shape
the methodology of your study to confirm it. Right. And then it can also occur where you're
interpreting info to fit your hypothesis. So you're seeking out stuff that supports your
hypothesis and then the stuff that's just there in front of you, the results that are there in
front, you're like, aha, this thing proves that those n-rays actually exist. Or this phenomenon
cannot be due to anything but n-rays. Therefore, n-rays exist. All of it's confirmation bias.
Like you said, that's number one because that's not just a scientific bias. Every human uses
confirmation bias in that it's twofold. We avoid contradictory information because we
don't like to be wrong and we find information that confirms our point of view because we like
to be right. That's confirmation bias and it's everywhere among everyone. That's right. Although
I will say, I know it happens a lot politically, but myself and the people that I congregate with
question their own leaders as much as they do leaders from the other parties.
Oh, that's good. It's very good to do. There shouldn't be sacred calves in politics. That's
a bad jam. Well, no. And it's like, I've always been like at the forefront of calling out my own
party's wrongs and saying, no, no, no, you need to do better than that. Whereas I see a lot of
other people in other situations truly bury and ignore those things because they just don't want
to face that. Yeah. And it's not even like I don't want to face it. It just doesn't fit
their worldview. So they just don't include it. It just gets tossed out. But the point is, it's
not an active process necessarily. Right. I think we should probably take our first break.
I think so too, Chuck. All right. We'll be right back and talk about sampling bias right after this.
situation. If you do, you've come to the right place because I'm here to help. This, I promise you.
Oh, God. Seriously, I swear. And you won't have to send an SOS because I'll be there for you. Oh,
man. And so my husband, Michael, um, hey, that's me. Yep. We know that Michael and a different hot,
sexy teen crush boy bander each week to guide you through life step by step. Oh, not another one.
Kids relationships life in general can get messy. You may be thinking this is the story of my life.
Just stop now. If so, tell everybody, everybody about my new podcast and make sure to listen.
So we'll never ever have to say bye, bye, bye. Listen to Frosted Tips with Lance Bass on the
iHeart radio app, Apple podcast, or wherever you listen to podcasts. I'm Mangeh Shatikler. And to
be honest, I don't believe in astrology. But from the moment I was born, it's been a part of my life
in India. It's like smoking. You might not smoke, but you're going to get second hand astrology.
And lately, I've been wondering if the universe has been trying to tell me to stop running and
pay attention. Because maybe there is magic in the stars, if you're willing to look for it.
So I rounded up some friends and we dove in and let me tell you, it got weird fast. Tantric curses,
major league baseball teams, canceled marriages, K-pop. But just when I thought I had a handle on
this sweet and curious show about astrology, my whole world came crashing down. Situation doesn't
look good. There is risk to father. And my whole view on astrology, it changed. Whether you're a
skeptic or a believer, I think your ideas are going to change too. Listen to Skyline Drive and the
iHeart Radio App, Apple Podcast, or wherever you get your podcasts.
All right, Chuck, we're back and we're coming back with something called sampling bias, which is,
it turns out, a subtype of a larger thing called selection bias. Oh, and one other thing we should
say, we kind of got into it before I could say this, there are different stages in a study where
bias can occur. It can happen in the planning, the pre-study phase, and it can happen during the
actual study. And then it can happen after the study as well. And so when we're talking about
any kind of selection bias, including sampling bias, this is pre-study bias: when you're
actually setting up the study, that's where the bias is going to happen.
Yeah, and you know what, I think it also bears saying that with bias, you have to work really
hard to avoid it because it's almost like a disease that's always trying to get involved.
And it's not like, just do better, everybody, and quit being biased. It's like, it's way more
complicated than that because it is always knocking at the door, like you said, in all three phases,
trying to sneak in there. And it takes a lot of work in all three phases to avoid it. So
it's not as, I don't want it to come across as easy as us just saying like, you shouldn't do that,
stop it. No, but the first step is recognizing that there's a lot of bias and different kinds
of bias that are just sitting there waiting for a scientist. And then if you start admitting that
it's there, you can start being on the lookout for it and you can start adjusting for it. And
then other people who read your papers or hear, you know, read news articles about your papers
can be on the lookout for that kind of thing. Yeah, so exactly. Sampling bias is your, you know,
your sample set not being accurate and a good representation of the whole. A lot of times,
you'll find this in either studies that are really small scale because you don't have a
large sample and you don't have the kind of money like near you, like maybe you work for
university. So you work with university students as your first sample set who are not indicative
of anything but, you know, people 18 to 21 years old or so. Now remember, we talked about WEIRD:
Western, educated, industrialized, rich and democratic. Yeah, that's exactly the thing. It's
like, I mean, it's a, it's a decent place to start if you don't have much money and you want to get
the ball rolling. It's not like, oh, you shouldn't do university studies at all using students.
It's, but those findings definitely don't represent the, the, the wider nation and it needs to grow
and get more funding if you want to actually have a legitimate claim to something.
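Just to put made-up numbers on that, here is a minimal sketch in Python of how a students-only convenience sample can miss the population value. The ages, sample size, and the sleep figures are all invented for illustration, not taken from any real study.

```python
import random

random.seed(42)

# Made-up population: ages 18 to 80, plus a made-up age-dependent outcome
# (average nightly hours of sleep), purely for illustration.
population_ages = [random.randint(18, 80) for _ in range(100_000)]
sleep_by_age = {age: 9.0 - 0.03 * (age - 18) for age in range(18, 81)}

def mean_sleep(ages):
    return sum(sleep_by_age[a] for a in ages) / len(ages)

# What you'd like to estimate: the whole population's average.
print("whole population:", round(mean_sleep(population_ages), 2))

# What a campus convenience sample actually gives you: only 18-to-21-year-olds.
students = [a for a in population_ages if a <= 21]
print("campus sample:   ", round(mean_sleep(random.sample(students, 500)), 2))
```

The campus number lands well away from the population value, not because of anything interesting about students, but simply because of who was easy to recruit.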
Another way that sampling bias can come up is from like the, the group that you're recruiting
from. Like if you're doing a strictly online survey, but you're trying to apply your findings
to the wider society, that's just not going to happen because there's so many people who aren't,
aren't internet savvy enough to take an internet survey. Like by, by nature, you are a little
savvier than the average person if you're hanging out on the internet and taking a survey.
I like to tell myself that, at least. And then, kind of tangential to that, is something called
self-selection bias. Let's say
you're doing a study on wellness and, you know, what eating tuna can do for your health.
People who are interested in wellness and health are going to be much more likely to volunteer for
that study than people who couldn't care less about health and have no desire whatsoever to
further science's understanding of what makes you healthy. So you would have to go out and
find those people and recruit them rather than just relying on the people who volunteered
based on the flyer you put up in the student union. Right. Or, you know, poll all financial demographics
rather than just one. And, you know, sometimes the methodology in which they do try and recruit people
steers them in that direction unknowingly. I know in the article they talked about the 1936 presidential
campaign with Roosevelt and Republican Alf Landon. They were doing polling with, like, country club rosters
and people who drove cars and stuff, which at the time was kind of a luxury. So it was all out
of whack. Everyone's like, Landon's going to win in a landslide. It's because you just kind of
basically stuck your polling to, you know, I don't know about wealthy individuals,
but people who are a little more well off. And I think we talked about that in our polling episode
that is that fiasco with polling. I also saw one more too that I want to mention because it has a
really great anecdote attached to it. It's called survivorship bias, where when you're studying
something, say like business or something, you're probably going to just be looking at the extant
businesses, the businesses that have survived 20 years, 30 years, 50 years or something like that,
and you're not taking into account all of the failures. So when you put together like a prognosis
for business in America, it might have a sunnier outlook than it should because all you're looking
at are the ones that manage to survive and thrive. And that's survivorship bias. And did you see
that anecdote about the World War II fighter pilots? It was actually pretty funny because they
studied planes that had been returned, that had been fired upon, but managed to get back safely.
They were like, well, let's look at all these different bullet holes and where this plane was
hit. And let's beef up all those areas on the body. And a mathematician named Abraham Wald said,
no, those are the places where they got shot and did okay. What you should really do is find
these planes that actually went down and beef up those sections of the plane. Exactly. And that's
survivorship bias. It's just, it's failing to take into account the failures that have to do with
what you're trying to study. What about channeling bias? Channeling bias is another kind of selection
bias. Did you get this one? It wasn't the best example of channeling bias. Yeah. I mean, I got
it. Okay. Did you not get it? I got it, but it took a lot of work before I finally did.
Well, it's basically when, let's say you have a patient and their degree of illness might influence
what group they're put into. So if a doctor, if a surgeon was trying to study outcomes of a particular
surgery, they might, because they're surgeons and they want to help people out, they may perform
that surgery on maybe younger, healthier people who might have better outcomes than someone who is
in a different higher age group. Right. And the article kind of ends it there. And I was like,
so? What's the problem? And I finally found this example where it says like, okay, let's say you're
studying a new heartburn medicine or something, or something to treat like GERD. And it's new,
it's hardcore, it's cutting edge. And the people who are likeliest to get this new hardcore antacid
are the ones who are probably in worse shape, right? So say they're on the verge of going to the ER
anyway. Well, if you look back at all of the people who've ever been prescribed this new hardcore
antacid, you're going to see like a lot of them ended up in the ER, even though it had nothing
to do with this hardcore antacid. And then similarly, the people who have so-so GERD, that's not
particularly bad, they'll probably be prescribed the old drug, the standby that everybody knows is
fine, that's going to work. So if you compare the old drug and the new drug, it looks like the old
drug is super safe, but the new drug will put you into the ER. Whereas that's channeling, you've
channeled different people with different prognoses into different groups and they're kind of pitted
against each other in an effort to obscure the truth. If you wanted to really know the genuine
health outcomes for that antacid, you would have to give it to people with not so bad GERD and people
with really bad GERD and see what happens, see if the ER visits continue for people who wouldn't
otherwise be going to the ER. You want to find the outcome for everyone. Right. And not just for
you. If you're debating surgery, you're like, oh, well, it shows really good outcomes. You're like,
well, yeah, but who are they operating on? Right, right. Yes. So I would like to invite anyone
who got what I was saying or got channeling because of what I was saying. I invite you to
email and let me know. I'm doing a little bit of surveys here and I'd like to know if I confuse
things more or make it more understandable. Well, I know it's funny either way. I got that part.
I'm just trying to figure out if it's understandable. But here, with your methodology, you're
talking about a Stuff You Should Know listener who by nature is smarter than your average bear.
Well, I'm not going to publish it. I'm going to file-drawer it either way. Oh, what a teaser.
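To see the antacid example as numbers rather than words, here is a minimal simulation with invented risk figures. In this sketch neither drug does anything at all; the only difference between the two groups is who gets channeled where.

```python
import random

random.seed(0)

def ends_up_in_er(severity):
    # In this sketch the ER risk depends only on how bad the GERD is;
    # the drug itself does nothing either way.
    return random.random() < {"mild": 0.02, "severe": 0.30}[severity]

records = []
for _ in range(10_000):
    severity = random.choice(["mild", "severe"])
    # Channeling: doctors steer most of the severe cases toward the new antacid.
    drug = "new" if severity == "severe" and random.random() < 0.8 else "old"
    records.append((drug, ends_up_in_er(severity)))

for drug in ("old", "new"):
    er_visits = [er for d, er in records if d == drug]
    print(drug, "antacid ER rate:", round(sum(er_visits) / len(er_visits), 3))
```

Both drugs are identical here, yet the new one comes out looking several times riskier purely because sicker patients were channeled toward it, which is exactly the trap being described.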
Question order bias is the next one. And this is mainly, obviously, when you're doing like polling and stuff
or an online survey, or it could be just
asking people a set of questions, like in a social science research setting. And the way you order things
can affect the outcome. And this can come from everything from the brain's tendency to
organize information into patterns to the brain simply paying attention more and being more
interested early on. Like I know there was one, the General Social Survey, which was a big long-term study
of American attitudes. And in 1984, people were asked to identify the three most important
qualities for a child to have. And they were given a list of these qualities. When honesty was listed
higher on the list, it was picked 66% of the time. When it was further down on the list,
it was picked just 48% of the time. And that's simply because people are just reading this list and are like,
Honesty is important. And then, yeah, by the time they got down, you know, three quarters of the
way through the list, they'd started thinking about what they're going to have for dinner or
people get pooped when you're giving them lists of stuff. Or you can prime people and get them
all sort of worked up. Like if you have a question like during the Trump administration, how mad were
you at that guy about stuff he did? And you're like super mad. And then you were like, well,
how did you feel generally about how your life was affected during his administration? You might
say it was awful. Whereas if they hadn't have asked that first question, they were just like,
what was your life like from 2000? I'm blocking out the dates. What, 2016 to 2020? Yeah. You might
say, oh, you know, it wasn't, it was okay. I ate a lot of sandwiches.
Just over those four years, though. Right. Yeah. So, and like you said, that's priming,
which is a big, it's a big thing that you have to worry about when you're doing any kind of survey.
So there's some of the ways that you can combat that you can randomize your question order.
Sometimes you'll have a survey where one question is predicated on a previous question.
So one thing you might want to do is ask that set of questions in a few different ways.
So that, yeah, so that you can kind of compare the answers to all three, add them up and
divide them by three. And there's your average answer kind of thing. There's a lot of things
you could do to kind of, I guess, manipulate to de-manipulate your respondent when you're
doing a survey like that. Manipulate to de-manipulate. Look it up. You won't find anything on it,
but you could look it up still. Oh, it's a Roxy Music album. Interesting. Wow, Chuck. Wow.
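As a rough sketch of the randomization fix being described, here is one way a survey script might shuffle the options separately for each respondent so no single item always sits at the top of the list. The listed qualities are placeholders, not the actual GSS wording.

```python
import random

# Placeholder qualities, not the actual GSS items.
QUALITIES = ["honesty", "hard work", "curiosity", "kindness", "obedience", "independence"]

def options_for(respondent_id):
    # Seed a private RNG per respondent so each person gets their own
    # stable shuffled order and no item always benefits from being first.
    rng = random.Random(respondent_id)
    options = QUALITIES[:]      # copy; the master list stays untouched
    rng.shuffle(options)
    return options

print(options_for(1))
print(options_for(2))
```

Across a big enough sample, every quality spends roughly the same amount of time at the top of the list, so the primacy effect gets averaged out instead of quietly boosting whichever item happened to be typed first.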
Nice work. Yeah, that was great. What's next? So with question order bias,
we've entered the during the study kind of bias. This is why you're actually conducting the study.
And so is interviewer bias. And interviewer bias, it's kind of like, well, question order
bias has to do more with the study design, but it's a bias that emerges during the study.
Interviewer bias just straight up is in the middle of the study. And it has to do with the
person actually asking the questions in an interview. It can also, I think, apply to somebody
conducting like a clinical trial on a drug if they know whether somebody is getting placebo
or not, it might affect their behavior. But ultimately, what it is is the person who's wearing
the white lab coat is influencing the outcome of the study just through their behavior,
through their tone of voice, through the way that they're asking a question. Sometimes it
can be really overt. And let's say a super devout Christian is doing a study on
what part of the population believes Jesus saves, and the question is, do you think Jesus saves? And they might be like,
you know, you do think Jesus saves, don't you? Like it seems like it,
huh? That kind of thing would be a pretty extreme example. But it's sometimes how you
understand things is in the absurdities, you know? Yeah, I thought this example in the HowStuffWorks
article was kind of funny. It was about just like a medical questionnaire where the
interviewer knows that the subject has a disease that they're talking about. And they may probe
more intensely for the known risk factors. And they gave smoking as an example. And it said,
so they may say something like, are you sure you've never smoked? Never? Not even once?
Wait a minute! Like if I heard that coming from a researcher, even without knowing a lot about
this, I would say, what kind of a researcher are you? Yeah. Like, it seems like you're looking for
an answer. You should say you are ethically compromised. Or even facial expressions or,
you know, body language, all that stuff weighs in. I don't know why. Why don't they just have the
robots, Alexa or Google or Siri or somebody ask them? Well, that's one good thing about
something like an internet survey is like it's just questions. And as long as you design the
questions and you randomize their presentation, like it's going to be fairly helpful in that
respect. But then it's got its own pitfalls and pratfalls. You can attract a lot of people who
are just taking it to mess with you. Right. And there's a lot of problems with all of it. But
again, if you're aware of all the problems you can plan for them, and then even if you can't
plan for them or control them, you can write about it in the actual study and be like this.
I remember running across studies before where they're basically like there are, you know,
there was a kind of bias that we couldn't control for. So we can't really say whether it affected
this, the outcome or not. And I thought, wow, this is really refreshing and like even daring
kind of like I was thrilled. But you don't see that very often. But that from what I understand
is the direction that science is going toward now. Well, and the reason you don't see that and
then something we'll talk about is, is what actually ends up getting published. It may be
less likely to get published if they're like, Hey, you know what, dude, you know what I'm saying?
Yeah, I know. So let's do recall and acquiescence bias because they're very much related. And then
we'll take a break. That's our plan. What do you think of it? Everyone says, sounds good to me.
All right. So this is also during study. And this is so in the very much the way that an
interviewer can influence the outcome. The participant can actually influence the outcome too,
especially if they're being asked questions or they're being asked to self report.
There's a couple of ways that us just being humans can foul up the works on the findings
of a study. The first is recall bias. Yeah, this is when you're obviously you're trying to recall
something from the past. And it's amazing what might jump out at you from your past
when probed with a certain question, certain correlations that really have nothing to do with
it. But you may be like, Oh, well, you know what? Now that I think back, I remember around that time,
I was, I was watching a lot of dancing with the stars. I kind of binge that show. So maybe
that's why I had homicidal tendencies. I don't think you need to study to prove that. I think
that's just intuition, you know? Yeah. But yeah, so and if enough people do that, especially if
there's something out kind of in like the zeitgeist about that, how like people who watch too much
dancing with the stars want to kill other people, like a number of your participants might recall
that thing. Whereas other people who don't watch dancing with the stars aren't going to recall
that. And so in the same way that survivorship bias influences it, those people who don't have
that memory to recall, that memory can't possibly be included in the study results, which means that
dancing with the stars is going to kind of percolate to the top as like a major risk factor in
homicidal tendencies. Right. That's not good. You don't want, you don't want dancing with the
stars unfairly canceled. You want it to be canceled because it is terrible. I've never seen it. I'm
sure it's great if you're into dancing. I haven't either. But watch, we're going to be asked to be on.
Oh my God. And they'd have to change the name of the show to dancing with the
mid-level internet famous. Right. Exactly. Wow. Dust off my jazz shoes.
It would be us and Chocolate Rain and, you know. I love that guy. Tay Zonday is his name.
Yeah. We actually met him that time, remember? He was great, man. Another thing, Chuck, that people,
that has to do with recall bias is that like we just tend to have faultier memories with stuff
that makes us look bad. Like say unhealthy habits. Oh, sure. So if you're doing a study on junk food
and health outcomes and you interview a bunch of people who are in terrible health and all of them
are like, oh, I only ate like cheez-its like once in a blue moon or something like that. And the
researcher writes, I want some blue moon cheez-its. Like the results of the study are going to suggest
that it takes just a very small amount of cheez-its to put you in the hospital with long-term chronic
health conditions. Right. And that is a problem with recall bias. Like it's the participants
affecting it in this case because they just aren't paying attention or aren't really thinking about,
no, you've eaten a lot of cheez-its and it takes a lot of cheez-its to put you in the
hospital, not a very little amount. It's not the best example, but it kind of gets the point
across I think. Now, is this part of acquiescence bias? No, that was the end of recall bias.
Acquiescence bias is different, but it's certainly related. Both of them kind of fall
under an umbrella of participant bias. Yeah. And acquiescence bias, I feel like there's
the opposite too. I just don't know if they have a name because acquiescence bias is generally like
people want to be agreeable and they want to answer in the affirmative and they want to,
especially they found, if you are maybe less educated, you might be more willing to just go
along with something and say, yeah, sure. Yeah, yeah. To maybe appear smarter or just to be more
agreeable. I do think it's the opposite can happen too, especially with political research
and social studies in that I think there are also people that are like, oh, you're from the what?
Well, yeah, sure. I'd love to be interviewed. And then they go into it with a sort of opposite
mentality where they're completely disagreeable no matter what anyone says or asks. Yeah. I didn't
run across that, but I'm absolutely sure that that is a bias out there. But you can avoid these
by doing it more smartly, right? More smartly? Yeah, there's ways that you can frame your
questions like people don't like to admit that they didn't actually vote
Right. American democracy. Some people. So there was a
Pew suggestion, pew pew, where they said, rather than asking, did you vote in the last election?
A lot of people who didn't vote are gonna be like, sure, yeah, of course, why would you ask that?
Instead, you would phrase it as like, in the 2012 presidential election,
did things come up that prevented you from voting or were you able to vote?
Right. And you would probably actually want to train your researcher to use that same intonation
to make it seem casual either way. Like you want to give the person a sense of comfort that
they're not being judged no matter how they answer. Give them a backdoor. That's a good way.
That's a good way to get around acquiescence bias. Yeah, absolutely.
Yep. The old backdoor policy.
That's right, where you can squeeze a lemon. Right.
All right, are we taking a break? I think we're mandated by the FCC to do that after that joke.
All right, well, we'll be back and finish up with our final two biases right after this.
Hey, I'm Lance Bass host of the new iHeart podcast frosted tips with Lance Bass.
The hardest thing can be knowing who to turn to when questions arise or times get tough
or you're at the end of the road. Okay, I see what you're doing. Do you ever think to yourself,
what advice would Lance Bass and my favorite boy bands give me in this situation? If you do,
you've come to the right place because I'm here to help. This, I promise you. Oh, God.
Seriously, I swear. And you won't have to send an SOS because I'll be there for you. Oh, man.
And so my husband, Michael. Um, hey, that's me. Yep, we know that Michael and a different hot,
sexy teen crush boy bander each week to guide you through life step by step. Oh, not another one.
Kids, relationships, life in general can get messy. You may be thinking,
this is the story of my life. Just stop now. If so, tell everybody, everybody about my new
podcast and make sure to listen. So we'll never, ever have to say bye, bye, bye. Listen to frosted
tips with Lance Bass on the iHeart radio app, Apple podcast, or wherever you listen to podcasts.
I'm Mangesh Hattikudur. And to be honest, I don't believe in astrology. But from the moment I was
born, it's been a part of my life. In India, it's like smoking. You might not smoke, but you're
going to get secondhand astrology. And lately, I've been wondering if the universe has been
trying to tell me to stop running and pay attention. Because maybe there is magic in the stars,
if you're willing to look for it. So I rounded up some friends and we dove in and let me tell you,
it got weird fast. Tantric curses, major league baseball teams, canceled marriages, K-pop?
But just when I thought I had a handle on this sweet and curious show about astrology,
my whole world came crashing down. Situation doesn't look good. There is risk to father.
And my whole view on astrology, it changed. Whether you're a skeptic or a believer,
I think your ideas are going to change too. Listen to Skyline Drive and the iHeart radio app,
Apple Podcast, or wherever you get your podcasts.
Also, I want to apologize to all the parents who listen with their six-year-olds
these days. Oh, my daughter's six. She doesn't care about what we do.
That's great. So it's still just flying overhead, right?
I mean, she doesn't even listen. She liked Movie Crush a little bit.
Well, some kids that are six listen and hey, shout out to all of you guys.
I know. Whenever I see that, whenever someone writes in and says their kid,
my daughter's age actually listens. I'm like, really? Oh, yes. This is my daughter. She loves it.
Right. Yeah. And she voted in the last election too.
I'm like, uh, my daughter likes to watch videos of kids playing with toys on YouTube kids.
Is she into that now? Those are the worst videos.
I'm starting to get on her now with just in terms of taste. I'm like,
hey, you can watch something, but like watch something with a story that's good.
It's like, this is garbage. I've never seen chips.
She goes, I like it. I can totally see her saying it just like that.
Defiant and happy. That was a great impression.
All right. Publication bias is one we kind of poked around earlier a little bit with the whole
publish or perish mentality. Can I add something more to that real quick
before we get into publication bias? Sure. You, you don't mind?
Add something to what? To the, to just talking about publication in general.
Oh yeah. So I don't think that it's fully grasped by most people.
It certainly wasn't by me until really diving into this that the academic publishing industry
has a stranglehold on science right now. Yeah.
In a very similar way to the effect that 24-hour cable news had on, like, journalism,
where it became this voracious beast that was willing to just spit out money
constantly in exchange for, yeah, feed it. Give me more stories. Give me more.
Give me more pundits. Give me like, that was the rise of pundits.
Pundits didn't really exist prior to that. They just hung out on the editorial pages of newspapers
and then 24 hour news came along and there's not possibly enough news stories,
like good news stories to keep going for 24 hours. So you have to talk about the news stories and
analyze them and then you start getting into who's wrong and all that stuff. The, the publishing
industry is very much like that now where it's this beast that must be fed. And so there's,
there can't possibly be that many high quality scientific papers. So scientific papers have
just kind of dipped down in quality. And then one of the other things that the publishing
industry has done is said, we really like studies that have results. They're called positive results
where like it turned up that you found a correlation between something or the compound you tried on
that tumor shrunk the tumor. Like those are what we're interested in. That whole furthering of
science with positive and negative outcomes just to say this did work. This doesn't work. Don't bother
trying it. We don't care about that kind of stuff. And that's a huge issue for the scientific
community. Like they have to get control of the publishing community again if, if they're going
to come out from under this dark cloud. Oh yeah. I mean, they found in 2010, in a study about papers,
that the social sciences especially were about two and a half, I'm sorry, 2.3 times more likely to show
positive results than papers in the physical sciences even. So some, some bodies of research are even
more apt to publish positive results. And that means if you're going, you know this going into
your profession and you know this going into your set of research and it's, you know, that's when
it becomes sort of put up or shut up time as far as standing firm on doing good work, even if it
doesn't get published. Right. And so that, that confirmation bias can really come in where you
start, hopefully inadvertently, but certainly not in all cases inadvertently, cherry-picking
data to get a positive outcome where there really wasn't one there before, or you use a kind
of a weird statistical method to, to suss out the correlation between the variables so that you can
have a positive outcome because if you're not publishing papers like your academic career is
not progressing and you can actually like lose jobs. So you need to be published. The publishing
industry wants your paper, but they just want positive outcomes. So a high quality, well-designed,
well-executed study that found a negative outcome to where they said, well, this compound we tried
didn't actually shrink the tumor. That's, that's going to be ignored in favor of a low quality
paper that found some compound that shrunk a tumor just because they like positive outcomes.
It's ridiculous. Yeah. And I mean, that kind of goes hand in hand with the last one. You know,
there's a lot of overlap with these and a lot that work sort of in concert with one another
and file drawer bias is, you know, it is what it sounds like. It's like you got a negative outcome
and whether or not you were being funded by a company that definitely doesn't want that information
getting out there or if it's just as a result of it being less likely to be published because it
doesn't have a positive outcome, you just stick it in the file drawer and it goes bye-bye.
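Here is a minimal sketch, with made-up numbers, of why the file drawer warps the record: the simulated drug truly does nothing, but if only the "significant" trials get written up, the published literature still fills up with apparent effects. The significance test is deliberately crude.

```python
import random
import statistics

random.seed(7)

def run_trial(n=50):
    # A drug with zero true effect: treated and control are drawn
    # from the exact same distribution.
    control = [random.gauss(0, 1) for _ in range(n)]
    treated = [random.gauss(0, 1) for _ in range(n)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(treated) / n + statistics.variance(control) / n) ** 0.5
    return diff, abs(diff) > 1.96 * se    # crude stand-in for "p < 0.05"

results = [run_trial() for _ in range(1_000)]
published = [abs(diff) for diff, significant in results if significant]

print("trials run:        ", len(results))
print("trials 'published':", len(published))
print("average effect size in the published record:", round(statistics.mean(published), 2))
print("true effect size:                            0.0")
```

Every one of those published "effects" is noise; the honest null results are the ones sitting in the file drawer, and anyone reading only the literature would swear the drug does something.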
Right. And again, like part of the point of science and scientific publishing is to
generate this body of knowledge. So if you're about to do a study, you can search and say,
oh, somebody already tried the same exact thing and they found that it doesn't work.
I'm going to not try to reproduce that. I'm just going to not go with it
and move on to try something else. It's a huge waste of resources otherwise.
And then also you could, you can, if you aren't publishing that kind of stuff, you're missing
out on... Well, I mean, you're missing out on the real data if the bad data is file drawered,
like you're missing out on the truth. You're missing out on the whole picture, right?
And also, again, it's not just that, oh, the poor negative outcomes, they need to be included too.
Yes, that's true. But you're also promoting positive outcome studies that actually aren't
good studies. There's this thing called the Proteus Effect, where the initial studies,
these initial papers on a subject, in 70% of cases, a follow-up study that seeks to reproduce them
can't reproduce them. They don't come to the same findings, the same conclusions,
which suggests that a study was really terrible. If it can't be reproduced or if it's reproduced,
somebody comes to a different finding, different conclusion, that's not a good study.
So the idea of publishing positive and negative outcomes together would definitely kind of slow
that whole crazy 24-hour news cycle positive outcome study. Yeah, I don't see how it's even legal
to bury, well, not bury, but I guess just file-drawer, a study
that included a drug having negative effects. And I know that Congress has stepped up to try and
pass laws to, I think there was one in 2007 requiring researchers to report results of
human studies of experimental treatments. And then they tried to strengthen that in 2016,
basically this like, even if your drug doesn't come to market, we need to have these studies
and the results. How is it even legal? It seems like you're burying it and it's almost
falsification. Well, it is for sure because you're also like, if you're talking about studies where
you have multiple studies on say one drug that's an antidepressant and all you're doing is publishing
the ones that have positive outcomes for that antidepressant and you're just not publishing
the ones that showed no outcomes or maybe even harm, then yeah, that should be illegal, especially
when you're talking about something like an antidepressant or in the biomedical field.
But it's certainly unethical for any field of science to just bury the stuff
you don't like that doesn't support your conclusion. It's a kind of a meta form of
confirmation bias and just putting aside the stuff that doesn't fit your hypothesis or your
worldview and just promoting the stuff that does. That's right, boo. I saw one way around this is
The Lancet, the very respected medical journal, I think it's British. The Lancet has taken to
accepting papers based on the study design and methodology and goals. When you first plan your
study and you have it all together before you ever start, that's when you would apply to have
your paper published in the Lancet and that's when they decide whether it's a high quality
enough study to publish or not. Then they're locked into publishing your study, whether your
outcome is negative or positive. It has the knock on effect of the Lancet basically being like,
hey, this is a trash study. We would never publish this. Don't even bother. It's saving funds and
then the high quality studies are the ones that are going to get published and then also the
positive outcomes and the negative outcomes get published regardless because they have no idea
what the outcome is going to be because they accept the paper before the paper, before the study's
even been conducted. I saw another thing that said that a paper would be more likely to get
published in the Lancet if it had cool illustrations. That's right. That never hurts. Everybody
knows that. That's not unethical. Especially in color. Just put up a few of those New Yorker
cartoons in there. Forget about it. Everybody loves those. You got anything else? I got nothing
else. This is a little soapboxy, but this is something that we believe in. It's kind of like
our episode on the scientific method a little bit. I like the two. Thanks for doing it with me, man.
Thank you for doing it with me. Thank you for squeezing my lemon. Sure.
If you want to know more about scientific bias, there's a lot, fortunately, a lot of sites and
great articles dedicated to rooting that stuff out and to make you a smarter consumer of science.
So go check that out and learn more about it. Since I said learn more about it,
it means it's time for listener mail.
You know, sometimes the listener mail dovetails quite nicely
with the topic, and that was the case today, with our inclusion on the media bias list,
which was pretty exciting. Yeah, what an honor. There's something called the media bias.
Is it called the media bias list? I believe so. What it does is it takes news outlets and
newspapers and TV and stuff like that. It's a big chart where they're ranked according to how
bias they are, kind of up, down, left, and right. And they included podcasts this year.
They did, and we were on the list, and it was really kind of cool. We had a bunch of people
right in, and this is from Nicholas Beto. He said, I found this post while I was scrolling through
Facebook and waiting for the NFL season to start. Ad Fontes Media. Is it Fonts or Fontes?
Yes. We should know this. I'm not sure. One of the two. I'm going to say Fonts.
It's a watchdog organization known for the media bias chart. They do a media bias chart where
they rank every news outlet's political bias, and in the recent update, they included you guys.
And wouldn't you know it, the most politically fair piece of media you can possibly consume
in all the known universe is Stuff You Should Know. That is so cool. You guys say you're liberal,
but until I heard Chuck outright state it, I didn't even know. Wow. I think it slips through
there some. Well, yeah. We're certainly human beings, and we have our own biases, but we definitely
try to keep them in check. We try to, and I think it's just really important because they're not
just like, listen to a couple of shows, and oh, these guys seem okay. They really listen,
and they really rank people. Yeah. They probably saw that too, or perhaps they listened to the
North Korea episode where Josh suggested Wolf Blitzer apply hot paper clips to his inner thighs
while writing a nice piece on Trump's Korean relations. Hilarious. Either way, thank you guys
for your fairness and hilarity of all these years. You're both the best. That is from Nicholas Bedot.
Thanks a lot, Nicholas. Thanks to everybody who wrote in to say that they saw that. We appreciate it,
and it was neat to see ourselves right in the middle of the rainbow. Love being in the middle
of that rainbow. I do too, Chuck. It's nice and warm and cozy in there, isn't it? Yes. Well,
if you want to get in touch with us like Nicholas and the gang did, you can send us an email to
StuffPodcast@iHeartRadio.com. Stuff You Should Know is a production of iHeart Radio. For more
podcasts from iHeart Radio, visit the iHeart Radio app, Apple Podcasts, or wherever you listen to
your favorite shows.