Sawbones: A Marital Tour of Misguided Medicine - Sawbones: Interpreting Research
Episode Date: April 4, 2024
With so much medical research out there on the internet, how can you sort through what's based in science and what's just three people in a room? Dr. Sydnee discusses all the different factors to think about when deciding how much weight to give any article or study. There's a lot! Music: "Medicines" by The Taxpayers https://taxpayers.bandcamp.com/
Transcript
Sawbones is a show about medical history, and nothing the hosts say should be taken
as medical advice or opinion.
It's for fun.
Can't you just have fun for an hour and not try to diagnose your mystery boil?
We think you've earned it.
Just sit back, relax, and enjoy a moment of distraction from that weird growth.
You're worth it.
Alright, this one is about some books.
One, two, one, two.
Hello everybody, and welcome to Sawbones: a marital tour of misguided medicine. I'm your co-host, Justin McElroy.
And I'm Sydnee McElroy.
And I'm so excited to be here with you, Sid
It's so rare that my work in the research fields inspires people, and so, I mean, it's really touching to me that I inspired an episode. You know, I mean, it's beautiful, actually.
Well, I mean, honey, you inspire me in so many ways.
Oh man, I didn't think you'd say that,
but gosh, I'd love to hear more about it.
Well, and I will say, inspire is an interesting word
because you didn't so much inspire as commit me
to doing this episode next.
Commit you to, yes.
Okay, the memories are flooding back now.
And it was, I was gonna say live.
You threw my hat over the fence, is what you did.
Inspire, I inspired you to go get your hat that I threw.
You said here, there's your hat over there, go get it.
Yeah, Justin, last week you talked about a lot of studies.
Yeah, a lot of science that I unearthed.
And I talked about how once I reached
my inbuilt bias that I was recognizing,
I think as we were recording the episode
that might be problematic is that if I am trying
to look into something on my own and I hit a study,
a scientific study. Now, I know, by the way, I know about research. Like, we've talked about quality research here. Like, I understand,
I know about, like, sample sizes and all that kind of stuff. Like, I know
that's bad, right? But one thing I've never seen in one of these, like, studies or the abstracts or whatever:
it never says at the top, like, hey dog, we did this on eight people,
so let me just say that up front, and then go into the rest of the study. That would be choice.
No, it doesn't say that up top.
That's an example, okay, of why I have a hard time telling good
research from bad research when I'm not having, like, a Jon Ronson, for example, explaining to me why the research is bad.
A Sydnee McElroy, a Freakonomics, a Bill Nye, you know.
Well, one of our listeners just did a TikTok about this as well.
Yes.
Ask a neuroscientist.
Yes.
Yes, on TikTok.
Sena did a wonderful explainer of this as well.
Yeah.
Piggybacked off of your request last week.
Yes.
Yes.
And a lot of the things that you're going to hear
about studies like that,
I'm gonna hit on a lot of these points.
Some things that you may have thought of,
like sample size,
some things that you might not necessarily think of.
And then some areas, I think,
that are still a little fraught and debated
as to how do we generate the best possible research,
make sure it is accessible to everyone,
not only, like, widely accessible,
but as quickly as possible, while verifying
that it is well-done, accurate research.
And then what do we do about once people read that stuff
and start changing their minds
or making decisions based on it?
It's a really complicated question.
Yes, it does not surprise me.
So we're gonna get it,
I'm just gonna kind of walk you through,
and if you're somebody who works in the sciences,
a lot of these things are probably concepts
that you're familiar with,
but if not, I hope I'm introducing some new ideas
as to, like, when you send me links to studies
like you did last week, what am I going to think about
as I read these studies?
Wow, my husband is really, really good at finding science.
I mean, you found studies. You found them.
I remember, babe.
I remember I brought them to you.
I was just as pleased as an old outdoors cat
bringing a mouse to a farmer.
I was just as pleased as Pudge, I think.
Now, let's start with,
before we get into anything about the study itself,
when you send me a study,
the first thing I'm going to look at
is where was it published?
Okay?
Okay?
That's number one.
You sent me one study,
acute inositol stabilized arginine silicate
improves cognitive outcomes in healthy adults.
Hey folks, I don't know about you other laymen out there,
but let me hear you.
If that doesn't already, just seal the deal.
I don't know if you all heard all that,
but it is unhinged the number of scientific words
that were in that.
It is a ton of scientific words.
And if you also like, I will say the title itself,
and it's important to look at what the title
of the article is claiming.
Like, is it outsized compared to what the study actually shows?
What this is saying is that this particular molecule improves cognitive outcomes in healthy
adults.
That's a big claim, cognitive outcomes.
If you don't tease out what exactly they're saying: it makes your brain better.
Even if you're already healthy and don't have anything
that you think you need medicine for,
this medicine will make your brain even better.
In some way.
In some way.
This is from the journal Nutrients.
So.
Is that one of the good ones or one of the bad ones?
I don't, okay, I'm gonna work really hard
not to say good or bad.
But I will, I can say that.
And by the way, you found this on PubMed,
so this just re-emphasizes,
just because it's on PubMed.
It actually says at the beginning, there's this disclaimer.
Did you see the disclaimer?
As a library, NLM provides access to scientific literature.
Inclusion in an NLM database does not imply endorsement of
or agreement with the contents by NLM
or the National Institutes of Health.
So there's that.
Okay.
Okay, so I looked into the journal Nutrients,
and if you're curious about a journal,
I mean, you can go to the site for the journal,
but like, they're not really gonna tell you,
hey, we published some stuff that maybe is a little sketchy.
They're not gonna tell you that, right?
Right.
They're gonna tell you how great the journal is.
They're gonna try to look really good.
So like, you can, but you can read articles
about the journal.
You can look how long has it been a journal
and how is it put together?
Is it a peer-reviewed journal?
Do they have a process?
Like, obviously somebody's doing a research article,
it's getting published in this journal somehow.
How does that happen?
So you can look into the process
behind the journal Nutrients,
and you can find out that this is a journal
that we call a pay-to-publish journal.
Okay, what does that mean?
You pay to get your articles in it.
Can you see any problems with that model?
And I am not, when I say this,
I don't mean that every time this form exists in a journal,
it's automatically like disregarded,
but can you see where that could be problematic?
Yes, I can see where,
I can see where, like,
it's sort of like the "name a star
after you" thing a little bit,
because it's like,
that doesn't necessarily mean it's a good name for that star,
but it did mean that you had the money to pay for the star.
If you get,
if your journal is funded by taking money from authors
and putting the studies in your journal.
If you're sent 20 articles,
I'm just picking random numbers,
and 15 of them are not great.
Like their design is bad,
there's something about them that you think
is probably not really strong,
then you're only gonna get money for five.
The other 15, if you don't publish them,
you're not gonna get any money.
Well, how are you gonna keep your journal going that way?
So there's an incentive to continue to publish
as many articles as possible on the part of the journal.
That doesn't mean that the science is bad.
You still have a team of people whose job it is
is to make sure the science is good.
But you can see where, I mean,
anytime money is part of the equation,
you have to at least ask the question.
Right.
And that's important, because looking at Nutrients now, in a way that I will admit I
didn't previously, it says that they're open access.
So it's free to read, which is something that we've talked about can be kind of a double-edged
sword in the past.
So that's one of the things I want to talk about.
And just one more thing about Nutrients.
If a journal is consistently putting out studies that are questionable,
you usually will find some other articles about it
somewhere on the internet.
I mean, I would just use a search engine like Google
and look up information about the journal.
And I quickly found that Nutrients back in 2018
had 10 of their editors quit because they were concerned
that articles were getting through
that were not
very scientific or rigorous or properly reviewed because of this financial incentive.
Okay.
So you can find these things out.
And generally, if a journal is really known to publish stuff that is biased or scientifically
not rigorous, you'll find enough, you'll find other scientists out there telling you that.
Is there a meta way of, like what Charity Navigator is for charities, like that for journals?
Like being able to put in, you know.
Let me skip to, I was going to talk about open access, but in answer to your question,
let me just really quickly go over, I think a lot of people use something called impact factor
for exactly what you're talking about, okay?
What is an impact factor of a journal?
Do you know?
Oh, you're asking.
I would say that it's how, because of the context,
how seriously you should take this thing.
So a lot of people use impact factor as a way of deciding how important or serious you
should take a journal.
However, this is a number, and the way you calculate it: it's the number
of citations to the journal's citable items from the last two years,
divided by the number of items the journal published
over those same two years.
So like for instance, there's an example,
and this is from a Wikipedia article
on how do you calculate impact factor
in case you're curious
where I'm getting this very secret information.
They have an example that in the journal Nature in 2017,
they had 74,090 citations in other places
to the 880 articles they published in 2016
and the 902 articles they published in 2015.
So you add the total number of articles,
over top you've got the citations,
and the number that results is your impact factor.
So for them, their impact factor was about 41.6.
So it's how often you're cited
versus how often you publish.
Exactly. Your hit rate.
There you go.
And the idea is that if you're cited more,
it must be really important.
It's kind of your KD ratio, you might say.
There you go.
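To put that calculation in one place: here's a worked restatement of the Nature example quoted above, using the same Wikipedia numbers, written out as the formula:

```latex
\mathrm{IF}_{2017} = \frac{\text{citations in 2017 to items from 2015--16}}{\text{citable items published in 2015--16}} = \frac{74{,}090}{902 + 880} \approx 41.6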
So there is this like, you can look at that.
And there are a lot of websites out there
that just list different journals
and like give information about them,
but this isn't an official database.
I can't give you official databases of journals
that say like, here are the journals
and here's how good they are.
But a lot of people will use impact factor
as a way of like deciding if a journal is relevant.
And that's fraught.
Okay.
Because different fields of science for one
are more likely to be cited than others.
So when you look at, like, the medicinal
or biological sciences in general,
you'll see a much higher percentage, generally,
that they're cited, because of the different areas
of research that intersect, because of pop-sci
articles and things like that that cite these things.
So you see like a lot more citations in general
in those sorts of journals
than you do in like a mathematics journal, right?
So it's already fraught there.
Then articles that tend to be cited,
I mean, the usual stuff that you encounter
when humans are involved,
there's bias depending on race,
depending on nationality,
depending on a general idea of the journal to begin with.
So it tends to be cumulative.
We all sort of think that the New England Journal
of Medicine is great.
So maybe you're gonna turn to it for citations first.
And I'm not saying it's not great.
I'm just saying like,
it can be hard to get that snowball rolling in a different direction once it's started.
Does that make sense?
Yeah, absolutely.
And so there are problems with impact factor, but that is, I think, as close to what you're
talking about as exists out there.
But again, that's, like, if you are using it at all, if you do think there's any value
in it: one of the studies that you sent me, effects of L-theanine on cognitive function
in middle-aged and older subjects,
a randomized placebo-controlled study,
is from the Journal of Medicinal Food,
and it has an impact factor most recently of like 2.4,
which is quite low.
That's quite low, but I'll also say it's interesting.
I noticed in the headline for this one,
I don't know if that's a term you'd use, probably title,
but they don't make a claim.
It's just...
That's true.
It's what it is.
Man, that can be harder to parse, I think,
if you're a layman.
Well, and I think that that's a good thing to look for.
What exactly are they saying this thing does,
and then does the study support it?
If you see somebody publish like,
this molecule will make you smarter than everybody else,
I think you know that whatever you're about to read
is not very trustworthy.
That's a wild statement.
Right. Right.
Well-
Because you would have heard about it.
Yes.
The Limitless pill.
Right. You would have heard about it.
You would have heard about it.
To get to the open access point,
because we mentioned it briefly,
and this is a huge area of debate, by the way,
again, in scientific publishing.
The idea that a journal will be open to everyone,
to access for free,
there's a really beautiful, equitable idea about that,
and as someone who depends on research for my job,
meaning this podcast,
when I find an article that's free, I'm very excited
because it can be really expensive
if you're reading tons of articles every week
and you had to pay for them all
or subscribe to every journal that you would need,
which for me is lots of them
because of the breadth of our show.
So there's a really beautiful idea there
that scientific information should be accessible to everyone.
The flip side of that is if you're not going to pay
to subscribe or pay for the article,
how is the journal existing?
Who's funding it?
And that's where you just have to have a critical eye.
It doesn't mean that those can't be good,
but it does mean is it funded by authors paying
to put their articles in,
and what's that process for submission,
and how long is that peer review, and is it rigorous,
and are the editors being listened to?
What does that process look like?
Is it being funded by an outside source, by a third party?
Is there perhaps someone in a pharmaceutical industry
who has an interest in funding this journal,
somebody in a specific area of tech or science
who wants to fund this journal,
you can see where this could get really fraught.
Right. Right?
And certainly that exists.
There are journals out there from publishing companies
who have very specific interests
and are publishing lots of articles
with a very specific goal in mind.
Right.
And if that's where you're coming from,
that's not science and that's not very helpful.
So you just have to think about it
before you just immediately read an article
from an open access journal and say,
yep, perfect, I'm changing everything I do in my life
because of this article.
I mean, it's tough because in the model that we have,
like you have to have the money coming in for the work
that you are doing. So if you're not going to charge people to read it, the money has
to come from somewhere. And that's a debate that people are having in lots of different
fields. I mean, web media has been struggling with this for a decade, right? Do you put
it behind a paywall or do you, you know,
try to go with ads and figuring out how to pay
for stuff like that?
I mean, that quality, that signal-to-noise ratio,
I think, is always tricky.
Yeah, I mean, it's a very complicated question.
I am all for information being available to everybody
and as quickly as possible.
That's actually been a recent edict
from the Biden administration: if you have relevant information,
especially about people's health,
that you have studied and published,
making that available to the public as soon as possible,
so that they can have access to that information too,
is a big priority.
And that makes sense.
Now, part of that process has to be making sure
that all of the rigor has gone into evaluating that study
before it is released to let everybody just have at it.
And the only other thing,
before we get into the study itself,
let me just mention when is it published?
More and more, depending on what discipline you're in,
how recent that study was done
is becoming more and more important,
especially in fields like medicine,
where I mean, I've been a doctor for, I don't know,
I graduated in 2009.
Everything I know about medicine,
maybe not everything, that's an exaggeration.
Most of what I know about medicine has changed
since I graduated medical school.
Things change quickly, things evolve,
things completely flip and 180,
like what I thought I knew is absolutely wrong
from 10 years ago.
So looking at the date of when the study is published
is also really important.
You'd be shocked how often you can be tricked
by a study that was published in like 1975.
And that's not to say it might not still be relevant.
But if you're trying to give somebody
the most up-to-date information
on like their cholesterol management,
I guarantee you we know more than we did in 1975.
It could be a sociological study,
how cool are the Doobie Brothers?
And then it's like, well, this data is no longer relevant.
I think there are some sciences that might be more,
although even as I say that,
I guarantee, like, any scientific discipline that I said
maybe is a little more stable,
anything I name, I would have, like-
I mean, let's hope geology, right?
Let's hope geology, but I don't know, probably not.
Probably getting wild over there too.
I mean, I feel like even our understanding of physics
has evolved so much.
Just the laws of the universe are different than what we, you know.
Yeah, so, like, everything is stressful. We can all agree. So it's important to look at that.
Yes.
Now, this is even before you've read the study. Okay, I want us to go ahead and read these studies.
All right, but first we got to go to the billing department. Let's go. The medicines, the medicines, that escalate my cough for the mouth.
Thanks to everyone who contributed during this year's Max Fun Drive.
We truly couldn't do what we do without you.
With the drive in the rearview, it's time for another proud tradition, our annual charity pin sale. This year, the proceeds for the pin sale will support VoteRiders,
a nonprofit dedicated to expanding ballot access nationwide. Members at $10 a month
or more can purchase MaxFunDrive pins featuring shows from across the network, and all members
are able to buy our Network Pin Design, exclusive to
this charity sale.
The sale is live now and it ends Friday, April 12th.
For more info, head to MaximumFun.org slash pin sale.
And thanks again for your support. From the twisted minds that brought you the Adventure Zone, balance and amnesty and graduation
and either see and steeplechase and uterus space and all the other ones.
The McElroy brothers and dad are proud to reveal a bold vision for the future of actual play
podcasting.
It's, um, it's called The Adventure Zone Versus Dracula.
Yeah, we're gonna kill Dracula's ass.
Well, we haven't recorded all of it yet.
We will attempt to kill Dracula's ass.
The Adventure Zone versus Dracula.
Yes, a season I will be running
using the D&D fifth edition rule set.
And there's two episodes out for you to listen to right now.
We hope you will join us.
Same bat time, same bat channel.
And bats.
I see what you did there.
All right, Sid, I've cracked open the study.
Okay, so one of the-
Find the science juice within.
One of the baselines is that this study
has been peer reviewed, meaning that after whoever performed the study
and then wrote about the study,
they handed all of that to other people in their discipline
to look it over and make sure that it was done appropriately,
it makes sense, the design, the statistics.
Would that peer review typically be like them reviewing
like an abstract or are they giving them like the raw data
and just seeing if they get the same results from it?
No, they're giving them like the physical paper
that they've written.
Well, not physical, they've probably emailed it, right?
But they've given them the study that you're going to read.
So they don't double check the actual science?
No, they're not necessarily gonna review their data.
You're gonna have to take some, I mean,
Yeah.
I mean, obviously people just flat out lie,
but that's more rare.
Yeah.
Then you can look to like, we're getting into this now.
Well, one, the journal is gonna tell you
if it's peer reviewed.
Two, look at the author's affiliations.
Are these studies being put on by,
like is this being funded or sponsored by like a university?
Because if we're looking at like places
where you study things,
putting together rigorous scientific studies,
that makes sense, right?
Like that is something we feel is more reliable,
generally speaking.
You assume that the goal is educational or somewhat noble in nature because of the nature of the institution.
Exactly. If you see affiliations of authors, like in one of the studies that you sent to me,
where is the exact study? New insights, here it is.
New insights on effects of a dietary supplement
on oxidative and nitrosative stress in humans,
from Food Science & Nutrition.
Based on the novel Push by Sapphire.
This study, when you open up the author information,
which by the way, if you look at the top of these,
if you look at these studies on an online database,
you're gonna have the title of the study.
And then underneath there,
you're gonna have some little hyperlinks
for author information or author affiliation
or something like that, click on it.
Because then you'll see right there it says,
this study was supported by VDF FutureCeuticals, Inc.
And one of the authors-
So I really hate "FutureCeuticals,"
as a neologism.
One of the authors, while working at the University of Illinois, also works for FutureCeuticals,
Inc.
Meaning?
Well, what I'm betting is that the dietary supplement
they're studying is made by FutureCeuticals.
Ah, interesting.
And so, I mean, and that's something that you would need to, like, OK, why would FutureCeuticals
have an interest in conducting this study and
collecting this data?
I don't know.
I mean, probably because there's a financial incentive.
Honey, every hair on my skin is standing up.
I don't know how deep this goes.
I'm terrified that FutureCeuticals is upstairs and they're going to at any moment kick open
the door because we got too close to the truth.
FutureCeuticals.
Look what you've brought down on us.
Stop saying it.
Stop saying it.
They're hearing all of this.
FutureCeuticals, worldwide supplier of ingredients and supplements.
And they've got, and one of their brands is the medicine
that they're studying in this, Spectra.
Anyway.
Sydnee keeps saying Future Pseudocles,
please just get them out of the state.
My point is if you open that tab
and you see a bunch of like universities
or like research organizations,
if the NIH is doing this study, you know what I mean?
Like you can use some of this stuff we understand.
Who studies things in an interest of scientific pursuit
and who studies things because they really wanna
sell you something.
Now, all that being said, and this gets into like
who funds the study, this is tricky
because if you look at a lot of the studies,
but probably all, I shouldn't say all,
there might be one or two.
Probably most of the studies that we have used
to decide which cholesterol med, which diabetes med,
which blood pressure med, all the medicines
that we think work or know work,
they almost always are studies funded by pharmaceutical companies.
Oh man.
Now they are also funded by your taxpayer dollars
that went into funding the government research
that underlies a lot of these medications.
So like you, thank you, you are helping fund them,
we are helping fund them too.
So it's tricky.
So then I think you have to start thinking like,
okay, just because something's funded,
that doesn't mean it,
like if you look at the study and it's rigorous,
I mean, yeah, they wanted to sell the pill,
but they proved it worked first.
And that's how we've decided science works in this country.
So we can only-
And a lot of times,
is there a difference between the company itself conducting the research
and then funding a third party to do the research?
Is there maybe like a hope that there's a bit more?
You wanna look for distance.
You wanna look for, we are doing this study here,
even though the money is coming from over here.
You wanna look for that distance.
You also wanna look for if the authors themselves
are paid by the company.
Was the research funded by a company or is this person writing this study going on speaking tours
where they make a lot of money promoting this drug?
Those are two very different things and you will find both.
You will find that sometimes the author of the study is making quite a bit of money off of going around
and telling people to take this medicine.
Or they might sit on the board of directors.
That's very common too.
This was actually a big controversy when it came to,
oh gosh, it feels like forever ago now,
but in the medical world, it's fairly recent.
When we started, we changed how we recommended
cholesterol meds for people, a specific kind
of cholesterol med called statins.
So this specific kind of cholesterol med, we used to, the guidelines we used to prescribe
it by used to have a lot to do with, like, what your cholesterol actually is.
Well, they changed so that, while that could be part of it, it also just had to do with
your own risk factors.
Stuff that we don't have to do any labs on you to know.
Your age, your weight, your blood pressure,
what other health conditions you have,
things like that would tell us, you know what,
even before I check your cholesterol,
I already know you need to be on this med.
So all these guidelines changed.
And the result of that change is that like,
I don't know, two billion people worldwide
should be put on this cholesterol medicine.
Right? That's wild.
That's like, that's wild.
I mean, and that, and that's what a lot of people
in response to the new guidelines went, holy crap.
That's a lot of people you think we should,
that we're being told we should put on it.
And a lot of the debate around that centered on the fact
that quite a few people who made those consensus guidelines
sat on the board of directors for companies
that made statins or received money
for going around and peddling statins.
Right, so they would get a,
they had a financial incentive.
They had a financial incentive
in telling everybody to take a statin.
Does that mean that we shouldn't take statins?
No, we still might, maybe we should.
That's still what the guidelines say.
The guidelines still say we should.
But it's harder to trust, right?
It makes you question.
Anyway, so that's all part of what you need to look into,
the conflicts of interest, the funding, the affiliations.
You need to look for confounders in a study.
This is when it starts to get a little trickier.
If you say that, okay, if I were to make the claim
that, for people in the United States, eating tofu
makes you lose weight. If I were to make that claim,
I could probably just based on, we live in the US,
I think our cultural perception of tofu
is that it's a health food.
I would bet that if you just randomly selected
people who eat tofu over people who don't eat tofu,
you might find a weight difference between those two groups.
Does that mean that eating tofu makes you lose weight?
No, it's a correlation.
There might be a correlation, but that's not causation.
And there could be confounders in there,
which is like, I don't know,
not as many people eat tofu in the US
who aren't on some sort of specific diet,
like vegetarian or vegan or something like that.
And it could have more to do with your perceptions
of health food or where you live.
And so there's lots of other variables that go into tofu.
People have cultural issues talking about certain things.
Like you're gonna get skewed data
if people don't wanna be forthcoming about something.
Yes, I could make lots,
I could make up lots of things that are probably true,
but the two aren't really related,
but I could put together data that makes you think they are.
So you need to see,
did they acknowledge confounders in their study?
Did they address them?
Did they talk about how they tried to control for them?
In some way in the statistical analysis.
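To make the tofu example concrete, here's a minimal toy simulation, with entirely made-up numbers, of how a confounder (here, being on a plant-based diet) can produce a weight difference between tofu eaters and non-eaters even though tofu itself does nothing in the model:

```python
import random

# Toy model of a confounder, with made-up numbers: a plant-based diet
# (the confounder) makes someone both more likely to eat tofu AND
# lighter on average. Tofu itself has zero effect in this simulation.
random.seed(0)

people = []
for _ in range(10_000):
    plant_based = random.random() < 0.2                      # 20% of the population
    eats_tofu = random.random() < (0.7 if plant_based else 0.1)
    weight = random.gauss(165 if plant_based else 185, 15)   # depends only on diet
    people.append((eats_tofu, weight))

tofu = [w for eats, w in people if eats]
no_tofu = [w for eats, w in people if not eats]
print(f"mean weight, tofu eaters:     {sum(tofu) / len(tofu):.1f} lb")
print(f"mean weight, non-tofu eaters: {sum(no_tofu) / len(no_tofu):.1f} lb")
# Tofu eaters come out lighter even though tofu does nothing here --
# a correlation produced entirely by the plant-based-diet confounder.
```

A study that controlled for diet type in its statistical analysis would see that apparent tofu effect disappear.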
You need to look at, of course, the study size.
And let me say, before we even get to the study size,
your first question is, was it done on humans?
Yes.
Because one of the studies you sent me was dietary
uridine enhances the improvement in learning and memory
produced by administering DHA to gerbils.
Now for that one, there is a decent chance that I bailed
at some point during that very long sequence of words.
Well, did you say, you didn't-
And maybe didn't make it to gerbils.
So maybe that's a tip you could have led with, Sid, is read the whole title.
And that's a common sense one that I can give people.
So now, does that mean we disregard animal studies?
Of course not, of course not.
We build to human studies, often with animal studies.
But if you are using a study that was done in gerbils
to tell people what they should do in their human bodies,
you've skipped a step.
That's not, no, no, no, no, no, no.
Where's the study in humans that says it works?
Right?
This is not enough.
We are not gerbils.
The studies you sent me, the Theanine study was 69 people,
the arginine study was 19 people,
the lion's mane study was 41 people.
None of these groups are enough to represent.
How many is enough?
Well, part of it depends,
this is something you can calculate.
So there's statistics that go into like,
how many people do I need to test this on
to see if there's a difference between these two groups,
or is it coincidence?
I guarantee you it's more than 41.
I mean, because if you're talking about something
you're going to give to the entire human population,
do you think 41 people could possibly be representative
of the 8 billion of us that live on planet earth?
So you have to start thinking about the statistical power.
If it's less, I mean, this is similar
to what our TikTok friend said.
Generally speaking, if it's less than 100,
I'm raising an eyebrow.
Unless you're looking for qualitative data.
This is a different thing.
I'm talking about a lot of studies
that are looking at numbers
and trying to tell you a result based on numbers and math.
If you're just trying to find out
how people feel about things
or get a general sense of something,
you can do qualitative studies.
And you don't need as many people for those
because you're not asking for numbers.
You're asking for...
Give me narrative feedback about this.
And you're creating an impression,
a gestalt impression of things, right?
That's a little different.
We're looking for numbers
if we're gonna say a dietary supplement does something.
And these studies don't have the numbers.
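As a rough sketch of the sample-size calculation Sydnee alludes to, here's what a standard power analysis looks like, using statsmodels; the effect size here is an assumed, illustrative number, not one taken from any of the studies discussed:

```python
# Minimal power-analysis sketch: how many people per group do you need
# to detect a smallish effect? The 0.3 effect size (Cohen's d) is an
# assumption for illustration.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.3,  # assumed standardized mean difference
    alpha=0.05,       # conventional false-positive rate
    power=0.8,        # conventional 80% chance of detecting a real effect
)
print(f"participants needed per group: {n_per_group:.0f}")
# ~175 per group, ~350 total -- far more than the 19-69 person
# supplement studies mentioned above.
```

Shrink the assumed effect size and the required sample grows fast, which is why a 41-person study raises an eyebrow.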
The study design, I won't get into too much
because I think giving you really nitpicky things
to get in the weeds about isn't very helpful.
Generally speaking, there needs to be two groups,
one that got the thing and one that didn't get the thing.
If there's only one group of participants
and they all got the thing,
I don't know what to do with that.
You didn't compare it to a placebo group.
So like a lot of arginine studies
don't have a placebo group.
So already I don't, I mean, that's not,
I didn't, it's barely a study.
So look for that, look for blinded,
people don't know what they're taking.
If you know what you're taking, that's a big problem.
You could be biased for or against it.
What's double blind?
Double blind means the researchers
also don't know what you're taking.
That's even better.
It's better if me, the researcher, is giving you a pill
and you don't know what it is and I don't know what it is
because then there's nothing in that interaction
that might tip us off.
Okay.
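As a sketch of how double-blinding gets operationalized in practice (hypothetical participant IDs, not from any study discussed here): a third party generates coded assignments, and neither the participant nor the researcher handing out pills sees the key until the data are collected:

```python
import random

# Hypothetical double-blind assignment. A third party (not the
# researchers) runs this and keeps the key sealed until the study ends.
random.seed(42)  # in practice, the third party controls this

participant_ids = [f"P{i:03d}" for i in range(1, 11)]
shuffled = participant_ids[:]
random.shuffle(shuffled)
half = len(shuffled) // 2
key = {pid: ("drug" if i < half else "placebo")
       for i, pid in enumerate(shuffled)}

# Researchers and participants only ever see coded IDs on the bottles,
# so nothing in the interaction can tip anyone off.
print(sorted(key))  # IDs only -- the drug/placebo labels stay sealed
```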
And then statistical significance again,
I'm not gonna get in the weeds with that,
but generally speaking, there are ways to know
and they have to, they should put this in the conclusions
or in the analysis of their data,
the results of their data,
and then mention it in the conclusions.
If they did see a difference between the two groups,
was it statistically significant?
And they should say there was a significant difference
or there wasn't.
Sometimes they hide it in numbers
that not everyone will understand.
They'll give you a p-value
and you don't know what the p-value means, maybe.
I know many of you do, but let's say you don't; then you don't really know what that means. They need to come right out and say, hey,
Maybe we saw a connection, but it didn't reach statistical significance
They need to own that if that's the case
We saw something that makes us want to do it in a bigger group and see if we can get statistical significance
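For anyone who wants to see what a p-value is actually reporting, here's a minimal sketch with made-up scores for a placebo group and a supplement group (toy data, not from any study mentioned here), using scipy:

```python
# Toy two-group comparison: is the difference in means bigger than
# chance alone would plausibly produce? All numbers are invented.
from scipy import stats

placebo    = [71, 68, 75, 70, 72, 69, 74, 73, 70, 71]
supplement = [74, 70, 76, 72, 75, 71, 77, 73, 74, 72]

t_stat, p_value = stats.ttest_ind(supplement, placebo)
print(f"p = {p_value:.3f}")
# Convention: p < 0.05 gets called "statistically significant."
# If it isn't, the authors should say so plainly, not bury it.
```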
And then of course has it been replicated?
It almost never happens, if we have an entire body
of scientific evidence that has been researched
and studied and collected that says the sky is blue,
and one study is published that says,
well, actually, the sky is red,
that we turn around and go, oh my gosh,
science has been flipped on its ear.
The sky was red all along.
You get somebody else to do it.
We get somebody else to do the study and see,
do you think the sky is red?
And so I think a lot of times these studies
will be kind of presented in a vacuum
as if look at this thing that has changed
our understanding of science forever.
And it's like, your study in 12 gerbils...
that you didn't have a placebo group for
and was funded by, I don't know,
the maker of the gerbil supplement,
does not overturn our entire understanding
about what amino acids are and what they can and can't do
in the human brain.
Sid, I want to ask you a question before we wrap this up.
And it's something that, it's sort of the main question that I've been thinking about as we've been talking about this.
It seems to me, and tell me if I'm
incorrect here: this is a really interesting skill to possess and learn about and sort of work on,
but it still feels to me like I,
you shouldn't, if you are someone who does not have
medical training, you shouldn't be basing healthcare
decisions on your ability to interpret research data.
Do you think that's fair?
Oh, I mean, you're asking me such a loaded question.
Oh, I didn't mean to. Oh no, I mean, you're asking me such a loaded question as- Oh, I didn't mean to.
Oh no, I mean, I'm gonna say what I truly think
because I tend to do that.
And I understand that some people might not like me
saying it because I am a physician
and I have been trained in this.
I don't think you should try to do this on your own.
I don't think it's a good idea if you don't have any,
if you've never, if this isn't your field of study,
if this isn't your area of expertise,
and if you've never been trained in how to interpret
these kinds of studies and analyses,
I don't think you should try to do it on your own
for your own safety and for the safety of others.
I think you should ask people whose job it is
to know this stuff, who have expertise in this area.
Now, why do I feel differently if we're talking about something that is less significant in
terms of health, right?
Like if I'm looking for what people have had success with in terms of like supplements
or anxiety or, what, things that are not sort of more life-and-death maybe, it might be
a little, see what's worth asking about, or I don't know.
I think it's fine, well, what's worth asking about.
Okay, so I think what we're talking about
is the difference between what I might,
I wanna know what other human experiences
are like out there and so I'm going to sort of
peruse the internet for what other thoughts
and feelings about this have been
because I wanna know what other people's experiences are,
that's valid.
That's part of how we make decisions.
There's nothing wrong with that.
It's conflating that with knowing hard facts
about science because you read a study.
Right.
Those aren't the same thing.
And being able to read a lot of data on a topic
and come up with a,
this is the best possible answer to this question
that we have at this moment,
that is not something you can do
from just sort of surfing the internet, right?
And I think both of those are really needed and valid
and help you go to your health
care provider and say, I've read all this, this makes sense to me, but what do you think?
What does the data actually say? That's how that conversation should go. Not, well, I
Googled it and I think I know better because just because you looked at it on the internet,
I mean, you don't necessarily know what you're looking at.
And this is, I am speaking from an area
of just my expertise.
There are tons of studies out there about stuff
that I would not even begin to try to interpret for you.
There are lots of areas of science
that I don't know nearly as much as other people about,
and I wouldn't try to tell you.
When we talk about physics, I suck at physics.
I'm never gonna try to school people on physics.
If you ask me for legal advice, I will refer you to a lawyer.
I won't try to give it to you.
And I won't try to read, you know, legal journals
and then tell you what I think,
because I don't have that expertise.
I think that as a society,
we all benefit from valuing each other's expertise,
whatever it is, and learning from each other. And that open exchange of expertise is how
we all get to a place where we make the best decisions for ourselves.
But if you can secretly check in the bathroom to see if the St. John's wort study that your
aunt brought up at Thanksgiving is worth anything,
and then burst back out of the bathroom triumphantly,
maybe that's definitely a skill worth having.
And you're welcome, you're all very welcome.
Thank you, Sid.
And thanks to you for listening.
Thank you if you supported us during the Max Fun Drive.
We really appreciate it.
People really came out for us,
and we smashed through all of our goals
and it was very kind of you, so thank you so much.
Yes, thank you.
We appreciate it so much.
And thanks to The Taxpayers for the use of their song "Medicines" as the intro and outro
of our program.
And thanks to you for listening.
That's gonna do it for us.
Until next time, my name is Justin McElroy.
I'm Sydnee McElroy.
And as always, don't drill a hole in your head.
All right!
Yeah!
Maximum Fun.
A worker-owned network.
Of artist-owned shows.
Supported directly by you.