Ear Biscuits with Rhett & Link - 117: Should We Fear Artificial Intelligence? ft Veritasium | Ear Biscuits Ep. 117
Episode Date: October 23, 2017
Rhett & Link are joined by Derek Muller aka Veritasium for a conversation about artificial intelligence, the future of government, and if Derek thinks we should fear our coming robot overlords on this week's Ear Biscuits. Follow This Is Mythical: Facebook: http://facebook.com/ThisIsMythical Instagram: http://instagram.com/ThisIsMythical Twitter: http://twitter.com/ThisIsMythical Other Mythical Channels: Good Mythical Morning: https://www.youtube.com/user/rhettandlink2 Good Mythical MORE: https://youtube.com/user/rhettandlink3 Rhett & Link: https://youtube.com/rhettandlink Hosted By: Rhett & Link Executive Producer: Stevie Wynne Levine Managing Producer: Cody D'Ambrosio Production Manager: Jacob Moncrief Technical Director: Meggie Malloy Editor: Meggie Malloy & Ty Schmieder Graphics: Matthew Dwyer Set Design/Construction: Cassie Cobb Content Manager: Becca Canote Logo Design: Carra Sykes Featuring: Derek Muller To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
Transcript
This, this, this, this is Mythical.
Welcome to Ear Biscuits, I'm Rhett.
And I'm Link.
This week at the round table of dim lighting,
we're gonna talk about the future.
With Derek Muller, known everywhere on the internet
as Veritasium, amazing YouTube creator
and scientifically minded person who is,
we have had a very stimulating conversation.
We have had it.
We've had it and we're teeing it up for you, Ear Biscuit-eer.
Lot of fun for us to speculate about things
that will happen in the future and I know that it's striking,
it's speaking your love language.
I like to talk about the future.
I like to talk about the future with people
who know things about the future
and actually have informed guesses about the future
as opposed to myself that just has formed guesses
about the future.
You know what I'm saying?
Formed meaning you formed it, meaning you made them up.
Meaning I formed them, meaning I got lots of guesses
about the future and I have formulated them
but they're not informulated necessarily.
So we get into a lot of things
in our conversation with Derek and we also clear up,
I will say that you're the one who made it awkward
but I think we cleared up something that was a little
an interpersonal conflict between Derek and I.
It's been festering for a few years.
If you wanna lay blame here,
blame number one responsible party is Derek.
Oh okay.
Number two responsible party is you
because of the way you reacted and number,
I'm just saying technical responsibility.
He's the one who started it.
Like scientific responsibility.
Yeah but that doesn't mean, but then you got
very close second because of the way you took it
and now I'm just a guy who likes to be entertained.
I'm third on this totem pole, man.
I'm third.
You need to invert that.
Okay.
Entirely.
All right.
You, then me, then Derek.
You can do that if you want.
But it was fun and it was interesting
and it might have been a little awkward.
I mean his mom was watching for goodness sakes.
Yeah that's what was gonna be so great.
I wanted her to come out and referee.
So that's what you got in store for you
as you're listening tonight.
I feel like you're listening at night for some reason.
Well now you just alienated everyone
who doesn't listen at night.
Even if it's during the day, it's just.
It's night somewhere.
It's night in your soul right now.
Oh gosh.
You're listening in your ear holes.
Night of the soul is like, dark night of the soul,
that's a bad thing.
So we talked about,
we talked about cars. Cars.
And I didn't mention this,
but we can get into it a little bit more.
We got a, the Neils got a vehicle.
Well, pretty big deal here.
A replacement vehicle for the illustrious minivan.
Now I haven't talked to you about this,
but I wanted to.
And it does something that is very futuristic.
Right, but we'll talk about that later
because the thing that the most interesting aspect of this
is the fact that you un-minivaned,
which is a, you know, minivanning as a decision
is already a bit of a decision, right?
I've never made that decision.
And then you un-minivaned.
And I just wanna know what went into the process.
I mean, I've been a proud minivanner for many years.
Maybe seven years.
Maybe more than that.
More than that.
I've been through three minivans.
You got one right after you had your first kid.
Oh yeah.
So I mean.
No I didn't.
Did I?
You had a minivan.
With a second kid, bro.
Definitely with Lincoln, so we're talking 12 years.
Which two kids does not require a minivan.
So yeah, we made a choice to go for it,
for the spaciousness.
And the drivability, man, it drives like a car,
it fits a lot of people in it, it's very practical.
You can get a lot of people in it.
I remember right when we got it,
we went on a white water rafting trip
and it was me, my dad, you, Greg, Greg's dad,
and another person.
And I was like, oh, we'll take the minivan.
Yeah.
You know, it's like, it's perfect.
And while I appreciate all that,
I have a certain level of pride,
I'm not saying it's a good thing,
but I, and I only had two kids so I didn't need a minivan.
You have a stigma.
Yeah, well.
You let the stigma get to you.
Minivans have a stigma.
Yeah.
And so, you know, in the way that our families work
is that, you know, we've got the car that we typically drive
and then you've got the car that your wife typically drives
and she's gonna be the one that has the bigger one
because she's picking up the kids from school
and more often dealing with them than we are during the day.
But then.
Oh I wouldn't be caught dead driving a minivan, just me.
That's crazy.
No, but I'm just saying that there are moments,
and I'm not saying this is a healthy thing,
but there are just times when I'm in a place
and I don't want people to just pigeonhole me
as the guy with the minivan, so that's the reason,
I recognize how practical it is, so I was just like,
I'm gonna get an SUV that basically,
I know it's not as convenient, it's harder to get in
and out of all these things, but it was just a sense of external appearance that keeps a lot of people
from going to the minivan.
You made a shallow choice to not get a minivan.
Yeah, but because.
I made a deep choice to get a minivan.
But that is the way the world works
because people are shallow, and so sometimes
you have to make shallow decisions
to live in a shallow world.
Okay, if that makes you feel better.
And you've now made a shallow decision.
Well I'll say that years of being minivan driver
I think got to us.
So like we could only fight that good fight so long.
And one day, Christian and I were just sitting down
and it was just one of those like silent moments.
The kids aren't around, here we are just sitting down.
A moment of reflection.
And then it's, yeah, I think it turned reflection-y.
Reflective.
And then.
I like reflection-y, could you just use that word.
And then all of a sudden Christy was like,
I wouldn't like another minivan.
Oh, it was her!
It got to her, and I was like.
Did she say it?
Hold on, did she phrase it exactly that way?
Because it sounds so, I wouldn't like another minivan.
Like that's an interesting way to say it.
Or did you just say that right now?
I did say it right now, I heard myself say it.
I think that was her sentiment.
I can't, you know, I don't remember conversations.
I'm willing to bet she did not just out of the blue say,
I wouldn't like another minivan.
And then I was like,
I too am in agreement.
You guys are weird.
In your reflectionary moments.
Reflectiony moments.
We were both feeling it.
You know, like the minivan was getting a lot of miles.
Once it breaks 100,000 miles,
you gotta replace the transmission in that puppy.
Yeah.
And I did that twice.
100,000.
And I was like, I am not doing that again.
That's when I think the problems really start.
So then when we were gonna get in,
I was like, we're breaking 100 grand in terms of miles,
not in terms of.
We're gonna spend 100 grand on a minivan.
That is quite a minivan.
It's got gold rims.
100,000 miles, we gotta shed this skin.
Yeah, yeah, yeah.
And we're like, let's not get another minivan.
But was there a discussion beyond that
or was it just a feeling during a reflectionary,
reflection-y moment where you just like.
That was it, we were in agreement.
You vibed, yeah.
Yeah, we were vibing about it and then so,
we wanted that, I mean with three kids,
we wanted a car that had captain's chairs
in the second row so then basically we got a,
we just picked an SUV that had captain's chairs
in the second row so that the kids wouldn't be
in a bench seat, they have to have physical separation.
They have to have their own seat
or it's just gonna be bad.
It can't be like.
Oh it's not about access to the back row,
it's about physical separation.
That's the first reason is everyone has their own seat
and it's not two people on a big bench that then,
it's like well how much of the bench are you taking?
I just didn't wanna have that argument.
So that narrowed down what we were gonna go with.
And I don't even wanna say what we bought
because A, they're not paying me,
and B, I don't care enough to say, honestly.
But I am tempted to brag a little bit
because it's got an auto-parking feature on it.
Well, I mean, okay.
And I didn't really know that cars had this
because I have a minivan with 100,000 miles on it.
Right, but I saw like a Nissan commercial
from like seven years ago where they were advertising
this technology.
And I probably saw it and I was like, that's stupid.
I can park my own car.
One of the things that I took the most pride in
when I first started dating Christy
was my ability to park in a space on a date
and I would always back in.
You were back in her.
I would back in and she was really impressed with that.
I would always back in my 1987 Nissan pickup.
Doesn't take much.
Shut up.
That sounded like you insulted my wife.
Or I can't figure out, did you insult me or my wife?
I'm just saying a woman who is really impressed
by a man who backs in, I'm just saying.
Take it back.
Don't insult my wife on a podcast.
I am insulting.
Insult me.
I'm insulting you through your wife.
That's the way I went at it that time.
Well, I don't appreciate it.
Okay, I take it all back in.
What?
I didn't even hear.
I said I take it all back in,
because you back in.
It was actually an incredible joke
and everyone in the room laughed.
But you didn't get it because you're spending
too much time thinking about backing into spaces
so you can get out quick.
And that's what matters, that you said something funny,
not the apology, that's fine.
So anyway, this car, it only, when it parks automatically,
it does it by backing only.
Did you know that?
That self-parking cars back in exclusively?
Or at least this one.
It will not go into a,
because I know it parallel parks,
but it also goes into a regular space.
A proper parallel parking procedure is to back in.
Right.
You can't, the physics of how a car works,
you can't parallel park nose first, doesn't work.
Nearly as well.
But you can't even choose to pull into a space.
Yeah, it'll automatically park in parking spaces
that are like normal, like cars beside each other,
not end to end but side by side.
Well.
But it will only back into those.
And it's a scary thing and it feels awesome.
But I've lost a little bit of myself.
I've given myself over and that's what we talked about
when we talk about the future is how much of ourselves
are we gonna give over to the robot overlords?
We discussed that in depth.
Well it makes me.
And I've broken the seal personally now.
Well it makes me think that maybe,
again, because we talk about this as well,
about how I have this faith in AI
helping us be able to solve problems
and what you've basically proven here, Link,
is that at least with your car,
the AI has determined that the best way
to get into a spot is to back in,
which might potentially settle our longstanding argument.
And I'm willing.
You're trying to make up for insulting my wife?
I will accept it.
I'm willing. Yes.
I think you're missing the bigger point here.
Finally, I, yeah.
The whole point.
I've been right the whole time.
The whole point of appealing to AI
is that it is something beyond the two people
and their flawed understandings can achieve.
Now, if there's other cars out there
that don't just exclusively back in,
then I'm gonna take it all back.
But at least as far as this particular car company
that I'm not gonna mention is concerned
and this particular model that you have is concerned,
if that's the AI we're deferring to.
I love it.
You win the argument for now.
Yep.
Until I get a car that pulls in.
This episode of Ear Biscuits is supported by HelloFresh.
HelloFresh is the meal kit delivery service
that makes cooking more fun so you can focus
on the whole experience, not just the final plate.
Here's how it works.
Each week, HelloFresh creates new delicious recipes
with step-by-step instructions designed to take around
30 minutes to make for everyone from novices
to seasoned home cooks
short on time.
And you know what, the thing we love is that
the kids get involved.
Lily loves cooking stuff and this is very empowering
for her to cook for the whole family and not waste anything.
You know, like all the ingredients are there
and try new things she's never had.
Even Lando gets involved.
He likes to cook stuff now because of the HelloFresh box.
Well what'd you have this week?
Oven roasted Mediterranean cauliflower,
because we picked the veggie option
with tzatziki and chili cumin oil.
I mean that's something that I don't think
we would have ever eaten, but then it's like,
whoa this is great, and we made it.
Well we had something that took me back home,
Carolina barbecue chicken with mac and cheese
and green beans, pretty simple, classic meal.
We did not let the kids get involved
because that's when things go wrong.
And it has nothing to do with the instructions,
it just has to do with just the tendencies of my children.
Because the recipes are very simple. I can follow them.
I don't spend a lot of time cooking,
but I don't have to spend a lot of time cooking with this.
Easy to follow directions.
Now, HelloFresh delivers food to your doorstep
in a recyclable, insulated box for free,
and this all comes out to less than $10 a meal.
Yeah, so Mythical Beasts, we have a deal for you.
For $30 off your first week of HelloFresh,
all you gotta do is visit HelloFresh.com
and enter promo code Ear30.
That's HelloFresh.com, enter promo code Ear30.
Remember, supporting our sponsors like HelloFresh
is a great way to support Ear Biscuits.
Now, on with the biscuit.
I remember it was a bus and it was the daytime
and we were, that's where we met.
So it was some YouTube event.
And then there were guys who created weird vehicles
and they had like a YouTube channel but they were,
they had an accent but I don't know.
Yeah they were Australian.
They were Australian.
They were these guys who do Mighty Car Mods.
And you knew them or of them.
I'm also kind of Australian.
Kind of.
Yeah I was born there.
I was born there but I don't have the accent.
Right so.
Well I can try to put one on if you like.
Well I mean just do it.
It's too late now.
That would really annoy.
As you please.
That would really annoy the audience.
Did you ever have it?
Did you ever have the accent?
No, no, because I moved away when I was like 18 months old.
We moved to Canada, so I'm essentially Canadian.
Then I moved back when I was 21 and lived there for a decade.
But I think by the time you're 21, you've got your accent.
You know what it's going to be.
Well, no, there are different kinds of people
because there are definitely people who put on.
Who are.
No, no, there are people who are adults.
That's the type.
Adults who go to a place, like I had,
I don't even remember who this was,
but I've had multiple friends who just went
on a European vacation and they come back
talking differently.
It's just like people who are just so,
they can be influenced so easily.
I mean, or they want to sound that way.
I had a friend who went to London for like six months
and came back and everything was like,
oh, a cup of Hawaii tea.
It was like, yeah.
You gotta guard against that.
You gotta go in guarding against that.
Exactly.
And we just met your mom.
She's here.
She is indeed, yeah.
She's a wonderful person. She's keeping an eye on you.
She's visiting, keeping an eye on you here,
making sure we don't get you in any trouble
here on this Ear Biscuit.
If my mom was visiting and this wasn't my show,
I don't know if I would bring my mom to it
because I just think she would, when it was over,
she'd try to be nice but she would just be like,
it's so boring.
Yeah.
I don't think my mom would care too much about
coming to a podcast but she wanted to go to the doctors
last time, remember that?
She needed to see a doctor?
No, the doctors, the show.
And she took Christy and Jessie.
She loved it.
Yeah.
She loved it but she didn't ask to sit in on a podcast.
If my mom has a superpower, it's making everything seem the best version of itself.
You know, like I could do anything here.
This could be an awful podcast that we could do
and she would still walk out with glowing reviews.
That I can guarantee you.
Oh wow.
You're gonna go back and brag to your friends
at your high school reunion that you're going to next.
She is absolutely gonna do that.
And if we can get a link and stuff,
you know, she will be sharing that around that crowd.
We'll give you some business cards.
You can't just push Derek's channel anymore.
You gotta push some Good Mythical Morning
and some Ear Biscuits amongst your friends.
You know, put in a good word for us.
I'll try really hard to be entertaining right now.
More than usual on this podcast.
And we are trying to break into that demo,
the demo that's currently going to their
50 year high school reunion.
That's, we heard they got a lot of purchasing power.
Oh yeah.
She been buying you stuff?
No.
Okay, she's saving it up.
She's gonna buy some of our merch.
Okay, now I don't know if Link wants to bring this up,
so I feel like I can be the mediator here.
Oh.
I think I know what this is, actually.
Oh really, okay.
Oh really?
There's some beef.
Yeah, there's a little bit of beef.
But did I talk about this?
You know what, you did mention it,
because I could tell it was on your mind.
To you?
Yeah.
Because I don't remember that.
I know, you know, I could see it in your eyes a little bit
the next time I saw you.
But also, you know, I felt bad about it.
We gotta give them the back story.
When it happened, yeah, I mean.
Oh man.
But.
This is good.
We haven't talked about.
I think we have.
I think I did mention something.
Cause there's no way that you would have known.
There was some kind of follow up.
Well, you can tell it from your perspective.
I didn't even know it happened when it happened.
You told me about it later.
This is how I remember it.
It was VidCon.
And so VidCon's kind of crazy.
Like we were, I'm pretty sure we were introducing people
on the main stage and it was just like,
okay, here's some, we'll introduce some people,
they'll do eight minutes, five minutes,
and then they'll come off, we're emceeing the thing,
and then we'll come back out, we'll crack a joke
and then introduce the next person.
So, we introduced you, I guess.
Yeah, I got out on stage somehow.
Or you were out there somehow.
I assume we introduced you, but then when you came back,
when you were leaving and we were coming out to stage
to introduce the next act, you punched me.
Hey, now listen.
It's that simple.
That makes it seem so aggressive.
I mean, as I remember it.
And then you kept going backstage
and I kept going to do my job.
But it happened in my periphery
and I knew nothing had happened at the time
but you told me later that he gave me kind of like a punch,
like a hey, do a good job kind of punch,
but it really hurt.
Yeah, it wasn't in the face, it wasn't in the chest,
like right in the center of the sternum, it was,
it was, it was.
He frogged you.
It was on the arm, like where,
That's what we called it, did you ever call it a frog?
I've not called it that, that's what we called it.
I think you were aiming for the deltoid.
Yeah.
With a little, you know like.
A little fist bump to the deltoid.
Yeah, fist bump, it was a fist bump motion.
That was the attempt.
But it was a little low, so it was that spot
where the deltoid ends and then the next muscle
starts to pick up.
And it got right to the bone.
There's a lot of bone there on me.
Well, look, you know, I'm just gonna say in my defense,
I think what happened was the adrenaline of being on stage.
Exactly.
You know how it can happen.
And it was an attempt at a friendly gesture between two performers entering and exiting stage.
Yeah, yeah, yeah.
But I remember upon making the punch that it felt harder than I intended.
You know how these things are?
Like, whoa, okay.
And then I was leaving the stage
and I was feeling like a jerk,
because I was like, now he's gotta go out there
and he's got a sore arm.
Oh, you had feelings about it.
I sensed immediately, you know, it's like one of those things.
You probably saw my eyes shudder.
I was like, oh god.
Yeah, it's one of those things where you really,
you don't intend it, but you know immediately
that you've done it wrong.
And I'm sure I didn't help because
I do remember having an immediate emotional reaction.
Like for some reason, I just,
the first illogical response was
like you did it on purpose, which is stupid.
Oh, okay.
And I knew that that was not the case.
I think it was a, I'm not gonna try to defend myself
with science, but let me try to defend myself with science.
Okay.
You hurt me.
Unintentionally, and it was well intended,
it was a brotherly gesture.
Where I'm from, they call it a love lick.
My granddad used to hurt me all the time.
Pinch me, punch me, and it was all,
and it was a love lick, and he'd say,
I'm giving him a love lick.
That was how he showed affection.
And I now believe that's what you were doing.
Or you're just being friendly, let's not make it weird.
Where's the science?
But you hurt me.
You went to granddad real quick.
When it hurt me, yeah, we're getting into therapy,
not science, I need some help with that.
I had a physical pain response that then was like,
fight or flight, and for some, you know me,
sometimes I'm like, even though I have no business
wanting to fight anybody, like, my body said fight.
Wow.
My body said, that was a provocation.
This is the part of the story I did not know.
I did not know this.
How could you know?
How could anyone assume that somebody is a total jerk
and that would be me in that moment?
Link just wanted to fight me.
Link wants to fight me, I'd say every 17 minutes.
It was not.
So don't take it personally.
It was not personal, it was biological.
Right.
It was somebody just hurt me, I'm either gonna run
or I'm gonna try to hurt them back.
It was your reptilian brain.
It was my reptilian brain.
Well I mean, can I say I'm sorry.
Yeah.
That I didn't mean to hurt,
and I'm sorry that you were hurt.
And I'm also saying sorry as a Canadian,
which I'm very cognizant of at this moment.
The other thing that I would offer to you
is an opportunity for a Love Lick back.
Oh, ghost.
If that is something that you desire,
I think it's warranted.
It's something that I would permit right here.
Can we sell tickets to it,
or do we just need to do it right now?
We're going to go Conor McGregor on this thing?
Yeah, I mean, hey, they both made 200 million.
It's like, if we could get a piece of that.
We do like the YouTube geek off version of that.
You need to make $200.
Like I have glasses and you know a lot of stuff,
so that's really how I'm, I'm lumping us together
in that way.
Let's let that be a teaser.
Let's say we're not gonna get out of here,
we're not gonna let Derek out of here
until you give him a love lick back.
Oh I thought you were gonna say an answer.
Yeah.
An answer about whether.
Well I'm the fight promoter here.
I'm saying it's happening.
I'm not gonna set it off at a different date.
I don't wanna deal with all the publicity
and the pay-per-view stuff.
It's a total headache.
So you want me to not,
you want me to suspend forgiveness?
I mean the dude said he was sorry for something
that he didn't even have to apologize for.
This isn't about forgiveness, this is about revenge.
This is about vengeance, man.
I think this is about audience retention,
let's be honest. Yeah, exactly.
This is a teaser.
Of course, people can just skip to the end
and say, just a good and bad.
I can't do it.
I can't let Derek sit over there and say sorry
and me not say I forgive you, I forgive you.
Okay, I appreciate that. But you know what?
I'm still gonna hit you.
Yeah, no.
At the end of the show.
I feel like I should apologize.
No, come on.
Because that's ridiculous.
Now I don't, honestly I didn't hold a grudge.
I mean once my logic kicked in, I was like,
that made me mad but it would be stupid for me to think
that he had, Derek had anything to do with it.
This is the conversation we had.
Logic doesn't always kick in.
But we did have the conversation.
It's actually, when logic kicks in for Link,
it's like an audible noise.
It's like a machine turning on.
Oh that generator that hadn't started
in a couple of years is starting now.
Okay so now that that's out of the way
and we're gonna do something to the end with it.
So what we do often on this podcast
is just the two of us talk about things
and we don't really know anything about anything
but we like to talk like we do know things about.
Especially me, I can make you think I know about things.
I'm glad you said that.
I could have an incredible cult following of stupid people.
You know there are people that could do that.
I would be a great cult starter.
That's just not something you say out loud though.
No but it is true.
It's a little bit like L. Ron Hubbard or something.
Yeah I would, yeah.
You could actually turn this into something.
Yeah right, I'm working on my first novel right now.
Okay.
And I gotta come up with my version of Dianetics, but.
I am the only thing standing in the way of him going over
some kind of edge, I won't even give it a label.
But the exciting thing about having you here
is that you actually do know stuff.
I know some things, yeah.
And so this is one of those situations where
instead of just us pontificating about things
and then checking ourselves later,
and we usually just check ourselves on Wikipedia.
I mean we don't even go to the source.
Sure.
So we're bringing the source to Ear Biscuits.
Now and the cool thing about what you do is that
you know, you kinda started this career on YouTube
and now it's gotten a lot bigger than that
and now you've got people like Bill Nye
sending you out places to be an official correspondent.
So this is legitimate.
Because here's the thing, you know this as well as we do
that when you get invited to these places
and into these environments, but the number one qualifier
next to your name is YouTuber,
you kinda feel like you have something to prove.
We feel like we got something to prove as comedians
when we show up in a place,
but as a, how do you characterize yourself, as a scientist?
Yeah, you know, I'd probably just say YouTuber these days.
Right, but do you ever feel like you have to like,
kind of, well, I'm doing it on YouTube,
but it's like,
it's kind of legitimate at the same time.
Yeah.
I,
maybe I've kind of given up on that.
You know,
when I come through customs and stuff, they'll ask me what I do, and I say I make
videos on YouTube and you know,
they actually look at my visa and the technical term for my visa,
cause I'm not American.
The technical term is alien of extraordinary ability.
Oh, gosh.
That's incredible.
Right?
Because we won't let you in here unless you can do something that no one here can.
Exactly.
Or something.
There's some sort of rationale.
And so when I say to them at the border, you know, I say I'm a YouTuber,
and they look at the visa and they're like, what?
Extraordinary ability?
Like, what can you do?
Let's see it.
And then I have to say I'm on Netflix too.
As soon as I say that,
that's fine.
That's legitimate.
I understand.
I'm on Netflix too.
Depending on who I'm talking to,
it doesn't say alien of extraordinary ability,
but it says E11, which is the category.
Now when I said let's see it,
I was role playing as the guy at the gate
but I am glad you're showing this to me.
Because that's what I show the guy at the gate.
E11.
Yeah.
Extraordinary.
Of extraordinary ability.
That's probably trademarked in some way.
Somebody would want to make a movie about that.
That's a good looking photo.
It could be a lot worse.
Yeah it definitely could.
It could be a lot worse.
Okay so.
So yeah, I understand where you're coming from.
When you mention Netflix, they let you right in.
Right. That's right.
As soon as I say I'm on a show with Bill Nye on Netflix,
people are like, oh yeah, he's for real.
I feel it, man, I know.
But you said you've given up, but congrats
on the Streamy win, by the way.
Thank you.
You could also say that, I won a, I don't know.
If I said I won a Streamy, they'd still be like,
man, that sounds made up.
E11, extraordinary Streamy winner.
No, no.
As long as I keep winning Streamys,
I'm not gonna dog it.
No, I'm not gonna dog it.
I'm gonna let Jon Cozart do that.
Yeah.
Okay, so we talk a lot about self-driving cars.
You know, we're in the middle of the transition.
But I think the question.
I would say the beginning of the transition really.
The beginning of the transition.
Okay, see, I've already been corrected.
It's all good.
But when can we just not think anymore
about getting from point A to point B?
When is that gonna happen?
I think it's gonna be a while, but I'm going to say,
you know, five years, you could probably make it happen.
You know, that's my horizon for-
That's-
Yeah.
That's quick.
It's soon.
Yeah.
But I don't think it's going to be widespread adoption
at that time, but that'll be, you know,
your friend has one, you know,
that's when it's really coming in.
And then 10 years, you know,
you see more widespread adoption.
But I think, you know, five years
is a good time horizon for that.
And it will be a Tesla.
Or, by that point, do you guys drive Teslas?
No, I wish.
I do drive a Tesla.
I drove one here today and it drove itself
most of the way here because I was just on the 101.
Right.
So at what point, I've actually never, let's see.
Joe Penna had a Tesla.
A really early one.
He did.
He had an early one and I rode it, I got in it
but then I didn't even move it.
He had to be somewhere.
I got inside of it. I was like just let me get in it.
I've been inside a Tesla.
So hang on, you guys have not ridden in a Tesla?
No, never?
I have not been in a Tesla.
But now John who works here.
Has one.
Has one so we've got one in the parking lot
that we could probably take whenever we wanted.
I don't understand why you guys haven't done this yet.
You're sitting around talking about self-driving cars
and you haven't even sat in one.
Well I got a car that will park itself
and I don't want to now get in a Tesla
and realize how stupid I am for getting excited
about just being able, it'll parallel park itself.
Yeah, well, I mean, that's a useful feature.
It is.
But it's much more useful to just drive
for like half an hour on an open road.
So, let's take the trip here as an example
of how much of the time,
when could you go into autopilot, so to speak?
Well, you can go in whenever you like,
but at the moment the Tesla does not recognize stop signs or traffic lights,
so you've got to look out for those things if you're going to drive in that kind of situation.
So the best place is a highway because you can just pop it in there
and you can go for, you know, an hour or more.
You do have to keep sort of touching the steering wheel to let it know that you're still there.
I think that's some legal requirement.
Okay.
So it will, after a certain period of time, it will start trying to wake you up.
It flashes at you.
Yeah.
And that time seems to depend on your speed.
So if you're on a highway and you're crawling through traffic at like 20 miles an hour, you seem to be able to go five, 10 minutes
without it alerting you.
But if you're going super fast, 65, 70,
it'll ask you probably every minute or so.
But it's still, so when you get off on the surface streets,
it knows that it's on a surface street
and it doesn't even let you engage?
No, no, you can do it.
But you can totally engage on a surface street.
And would it go through a stop sign if you let it?
Yeah.
Oh, yeah.
I mean, it'll just go up to the car in front of you, basically.
And so...
See, but that's...
It's informed by all of the other cars.
So basically, it's a follower car, more or less.
So it's not...
But what if there's no car in front of you?
Then it's not gonna stop for any stop sign,
it's not gonna stop for any red light.
But as long as there's a car in front of you,
that car stops at the stop sign,
you'll kinda stop behind it too
and you'll do like a kind of a rolling.
Is there an orientation to this?
No, it's totally just like good luck man.
Here it is.
Because I don't, I mean, all I knew is that
everybody that I know who has one
just does that on the highway.
I didn't know it was because it didn't work
on the surface streets.
I just thought it wasn't capable of doing it
on the surface streets.
Yeah, I mean, it'll do it.
It just, you'll run into more potential hazards, for sure.
Oh, man.
But you can't program in a destination
and it make any turns for you.
So, the one that's commercially available right now doesn't do that,
but Tesla has released a video showing their next generation model doing exactly that.
You basically putting in your destination and it just gets you there.
Yeah, that's the time I want.
That includes stop signs, that includes red lights, and they have the ability to do it today.
But it's not been released yet. It's not been, uh, I think, legally permitted yet. But I think, you know, the technology is there today. It's probably still a little rough, but it's there.
Yeah, the thing I don't understand is, you know, obviously there's the resistance to it and people who just don't trust it, which I just, I cannot, I appreciate it,
but it really frustrates me when people are like,
I don't trust it, and it's like, but you trust humans?
You trust people to drive cars?
A million people die every year in car crashes?
I think that our consciousness, I mean,
I think a big thing about consciousness
is that it tricks yourself into trusting yourself.
Like, even something like I know,
like this is what I believe.
It's like, okay, I heard, you heard that analogy too
where it's like, okay, what do you believe about like if something scares you
and then you like jump in the,
I'm butchering this analogy, but like, you jump.
I'm not with you yet.
You jump in the bed and you pull over the covers
and you would never say you believe that a comforter
is gonna protect you from danger yet it's revealed
in that moment what you actually believe.
I think that applies to cars as well.
That we, part of consciousness is believing
that the most trustworthy person is yourself.
But you're saying that's not true when it comes to driving.
But I do wanna have a comforter in my car.
That's the whole point.
I wanna be able to take a frickin' nap.
I want to go to bed
in my car
and let it drive me.
So you don't trust yourself
to drive.
And you're making
the active decision
to not trust yourself,
to trust the machine.
Whoa, I feel like
you're framing this
in a way that I don't
necessarily frame it,
which is I find driving
just a less stressful experience
when I let the car drive.
And that's not because
I don't trust myself.
It's because I can not really focus on driving as much.
Yeah, well, every time, I mean, you can just relax a bit.
You don't have to push the pedal, do anything.
It's just going, especially if you get stuck in traffic.
I mean, it's just, it's a real stressor.
For the stressful yet kind of mind-numbing experience
of freeway driving, bumper-to-bumper stuff,
you're gonna let the machine take over.
Exactly.
But when it gets complicated, you're gonna grab the wheel.
For now, that is absolutely true.
But at a certain point, there's gonna be a threshold.
What's the threshold look and feel like
if you're saying five years from now?
How's that gonna change?
Five years from now, like, I think five years from now it's going to be happening because, like I say, they...
How do we get there?
They have the cars built today. The technology exists. It's just a matter of fine-tuning it and the software, and then getting the legal approval for it.
I think the big blockage is that we like a story where it's like, oh, this guy was drunk and he hit someone. It's obviously his fault. He caused these deaths.
We can point to someone.
We can say someone's wrong and evil.
But if there's an autonomous car, and this has happened, right, the guy who got killed driving a Tesla.
And, you know, who do you blame in that situation?
And I think we have a real problem with that.
If self-driving cars start killing people, who are you going to be upset with?
Who's at fault there? Yeah, there's this, whatever the equation is,
it's like how many car-caused deaths
equal human-caused deaths.
It's like we have some weird scale
that we're willing to put up with all these people
being killed by themselves in cars.
Well, the interesting thing about that,
and the guy who died was behind the wheel of the Tesla.
Right? Yep.
But, and the thing I remember about that is,
I don't remember the details of what happened,
but I remember that Elon Musk published the details
in a super seemingly transparent way
that was mind blowing to me in terms of hey,
this is the actual data, this is actually what was happening
and what the car was saying and the level of transparency
there that he seemed to be putting forward I think
is what's gonna get us over the threshold
because it's not spin.
Like there was nothing about the statement
that came out from Tesla afterward
that felt like the spin zone.
Or, oh we gotta protect our money bags kind of a thing.
Do you remember that?
I do, I do.
I didn't dig too much into the details
but I seem to recall a stat like, you can look at the miles driven with the autonomous feature engaged and compare that to the same number of miles driven by people.
And they find that it's 40% safer, 40% fewer accidents or whatever per mile for those driven by the machine.
So the machine is better.
I mean, it's more vigilant with analyzing what's around it.
And people are on their phones all the time now.
I was driving behind a guy today
who was just like swerving all over the place
because he was on his phone.
And probably shaving and reading too.
Exactly.
I mean, I've literally seen that in traffic.
You know, to me, it's like
whenever you started using PayPal,
remember that first feeling of like,
whoa, whoa, whoa, whoa, whoa, whoa, I'ma connect
this internet site to my physical bank account.
I can see greenbacks just disappearing
through this thing called PayPal.
It's even a stupid name, why would I trust this thing?
This guy's not my pal, I just bought something from him.
But then you get over it and then you're like,
when did I not have PayPal or Venmo or whatever it is
these days to pay the babysitters, you know?
You're saying it'll be the same way
and you're saying it'll be five years.
Five years, that's what I'm saying.
At some point before the five years,
you won't even realize that you made a decision.
You'll just get in the car and you'll be like,
oh, I didn't touch it.
And then you'll think back and you'll be like,
oh yeah, there was a guy who worked with me
who had a cool car and I got in it and it was fun.
Oh yeah, we'll tell our grandkids about
the way we thought about driving
and it will blow their minds.
They just won't have any, they just won't be able to comprehend
that it was ever a thing,
you know, that we approach things in this way.
You put those wing doors on it,
people wanna buy it.
That's not gonna happen.
Okay, so.
We've already got the wing doors, the Tesla.
That's a fad, that's a fad, man.
Oh, that's going away?
The wing doors are great.
You think that's a staple of the future?
I don't think it's a staple, but I do love it.
Yours has that?
Yeah, we got the wing doors.
And it's really helpful for the baby, for the baby seat.
Okay, maybe I'm wrong.
Now, okay, so you're kind of getting into something as well, though, that as we transition, you know, when all cars are autonomous,
with the exception of, like, you know, okay,
I've got this really weird thing I have to net.
Even beyond that, when it's completely autonomous,
you know, they're all talking to each other.
You know, I can't wait for the day when
light turns green and the entire mass moves together
at the same time because there's no reaction
because they're all talking together in the same network.
Like that's gonna be like second nature, right?
But that kind of moves into this other subject
that we wanna talk to you about,
which is your perspective on AI.
And all these things kinda merging,
you've got, okay, all the cars are autonomous,
all the banking is, I mean, even banking
and financial management ultimately can be and should be
and is in a lot of ways controlled
by artificial intelligence.
So what are we really up against?
Where do you stand?
Are you on the Neil deGrasse Tyson side
which is like this is no big deal,
we'll just unplug the machine.
Are you on the Elon Musk, Sam Harris side
that are so worried about it that this is
the greatest problem along with climate change
that we face as a species.
I feel like I'm more on the Neil deGrasse Tyson side.
But that could just be because both he and I are scientists
and maybe we're not fully immersed in where AI is right now.
But I find it hard to believe that...
No, I got to choose my words carefully here
because someone's going to be playing this tape back for me
After the AI apocalypse of like 2030
You're assuming that you'll be alive
Yeah
I'm hoping that I made it through
As long as you're there to hear it when they play it back
It'll be a robot playing it back though
Listen to this
Here's what you said
And their voice will sound like that
You know there's a whole bunch of things,
obviously not AI,
but that we keep contained,
like, you know, very dangerous diseases and things.
And we put them in, you know, quarantined environments
because we don't want them getting out.
And I feel like you should be able to do the same thing
with an AI.
And didn't Facebook have an AI
that came up with its own language
and they just, just shut it down?
When they saw that?
Well they started, I don't know if this is the same thing,
but there was two machines that were communicating
with each other and they quickly realized
that English, that language, human language
was really inefficient and they started
just communicating to each other directly with bits, just straight digital communication.
And at that point,
What are they saying?
The scientists could no longer figure out
what they were saying.
And that is, but it's so crazy,
because it's soon,
Pull the plug!
This is why I, but the reason I'm scared,
because I mean, I don't,
Are you more on the Elon Musk side?
I am just because I tend to,
He's a hypochondriac.
Yeah, but I mean I'm super excited about the future.
Like I embrace a lot of things,
probably with less apprehension than I should have.
But I, yeah I think that because they're gonna be,
the moment that they have true, you know,
general artificial intelligence,
and we can't tell the difference between, you know,
a conversation with a human and a robot,
they're already gonna be so much more advanced than us
just because of processing and all that stuff
that they already have.
And at that point, I just feel like they could,
all the barriers that you're talking about,
like keeping things contained, it's like the moment they just decide,
well, we don't want to be contained.
But what are they connected to?
Or do they have arms and legs?
Have we given them that?
Well, they're going to be,
I mean, they're going to have to be connected to the internet.
See, that's a terrible idea.
But how are they not going to be?
I mean, there's some guy working in his lab on AI.
You know, I think that's the rule.
It's like when you go to these places that have really harmful diseases,
you have like three airlocks and stuff.
You don't connect them to the atmosphere because if they could get out,
they would, and that would be awful.
In the same way, you keep your AI really locked up.
I mean, I don't know how we're gonna do it though.
Because at the point that you've got, again, I don't,
yeah, I don't think it's gonna be, well,
Westworld situation where you've got, you know,
robots walking around that, I mean, what we will have,
but that will be just for us, right?
It won't, it'll be for our entertainment or whatever,
but the most powerful robots won't be contained
in the same way, but they're gonna have to have,
in order to be useful, they're gonna have to be connected
to a network that we're using.
Yeah, well I feel like you wanna be careful
about the ones you connect to your network.
This could be the key plot point of the movie
of how this all goes wrong, where someone's like.
The one guy, the mad scientist.
What's he mad about?
The one guy who lets the virus out of the lab.
And this isn't like a virus, though.
It is, but it's way worse.
I still wonder.
A server is not going to come after you.
You know what I mean?
Like a giant lumbering server,
is that going to be the evil villain here?
But I guess to look at what could go wrong,
if everything's connected to the internet
and all your, you know, stuff in the hospitals and all the autonomous cars, and they get control of those, yeah, it could be, it could be pretty bad.
Yeah. But who knows, we don't, I mean, it's gonna, we're gonna be, it's gonna happen. Uh, we're gonna be in the midst of it.
And the thing is, is that politicians,
the guys who are making the decisions about this stuff
just totally don't know anything about it.
Like you've got a few politicians who go like
the extra mile to kind of get briefed
by some research assistant, right?
But how many people who hold governmental office actually know the potential threats, right?
It's like, it's not many, that's for sure.
Do you guys ever think about the mass unemployment
that may come with the robots?
Well, I feel like that's a clearly more pressing,
or it's a more imminent issue.
It's already happening.
Yeah, it's happening, and I feel like that
you're gonna deal with first before you're gonna deal
with killer AI.
Yeah.
So we got a serious, we got a set of waves coming at us.
Let's talk about that.
I mean, it's interesting, in Buddy System season two,
we kind of explore, I don't wanna give too much away.
Well, it's the first scene.
I mean the first scene of the whole season
is you losing your job.
To a robot. To a robot.
Yeah.
Now it is a comically ridiculous robot
who's no more capable than a human.
But it's. And that's the joke.
It's all, I mean we're not exploring
what we should talk about right now,
which is what you're getting at.
But it is the kind of job that we will replace.
And I mean, number one most common job
in the US is truck driver.
And Tesla is building a truck
that obviously is gonna be autonomous at some point.
So what are those dudes gonna do?
And at first there'll be somebody sitting there.
Just in case.
Yeah.
They'll work with us.
You know, the laws will require it.
Yeah.
Right?
Yep.
And he's just gonna, like you said,
he's gonna have to touch that big steering wheel
on the big rig every so often.
He's just a, he's a wheel toucher.
That's what he will be called on Monster.com.
We need some wheel touchers.
They're not drivers, they're truck wheel touchers.
And then.
But how long is that gonna last?
After a while.
Three years?
Yeah, not even.
They'll be gone.
Yeah.
They'll be gone, but before they're gone,
I think that they'll be just seat sitters.
And then they will replace the seat sitters
with basically like a CPR dummy.
Someone who from the outside.
That is never gonna happen.
Yes it is, yes it is.
No, no, no.
You got this big rig barreling down the road,
you wanna look and see nobody in the cab?
You wanna see what you think is somebody.
You think they're gonna have a mannequin in there.
They're gonna have a freaking mannequin in there
with a trucker hat.
How about just a sticker on the outside of the window
that just shows the silhouette of a person?
Well that's not as believable.
I mean that's a lot cheaper.
That'll be next, yeah, you're skipping ahead.
A decal, can we just go, I wanna go,
I'm straight to the decal kinda guy.
Let's just do a truck driver decal.
Can I be the decal?
Oh it could be your profile, sure, I don't care.
I'm not making the decisions.
I do think that, I mean, really, so much of this
is public perception and comfort level.
But what are those guys gonna do?
If you can make things cool, if you can make.
But the real question is.
That is the bigger question.
What are we gonna do as a population?
As long as we can agree that there will be dummies
in seats in order to make everybody else feel okay,
then we can move on.
I feel like that's more frightening.
Like you're driving up next to a truck
and you look over and you're like,
wait, hang on, that's not a real person, that's a dummy.
Well, it depends on your definition of beauty.
Is that what we're talking about now?
What makes a dummy beautiful?
No, we're not talking about that.
We're not talking about that.
I don't think that's it.
So do you think that,
because I know some people think that they'll just be the decal guy.
They'll be the guy who places the mannequin in the truck.
There's always something for people to do.
The new job.
You've got to run out of that eventually, right?
Yeah.
So, I mean, something that I was really interested to see looking back at the history of work is that the number of hours that we work has dropped a lot.
So if you look in 1830, the average number of hours that someone would work would be 70 hours per week.
Really?
Yeah.
And you look today, and it's down to 40.
And in 20 years, we may be down to 20 hours or something.
We may continue to see that decline.
So maybe even if the number of jobs doesn't decrease that much,
the amount of work per job may drop and we may just be making more and/or spending less because
all the work's being done by robots, which are much cheaper than humans. So, I mean, that's the
hopeful utopian future is that we could all decrease our work hours down close to zero
and yet still survive and have wonderful lives
because all the things that we want to buy
are really cheap.
And so we don't have to do that much work
to get the money to pay for those things.
But what are we, like, so what,
I mean, what are the threats to that utopia, right?
You got the overpopulation problem, right?
Yeah.
Eventually you've got, if you've got so many people,
like I feel like the financial problem can be solved.
Like the people not producing, it's like,
well the robots are producing,
but what about this being that many people,
limited amount of resources,
and all we do is just hang out and have a good time.
Yeah, well to me one of the big threats
is that we sort of segment our society
further into people who have and the have nots.
There are gonna be people who are making a killing off this
and there's gonna be people who just are out of work
and can't make any money.
And so I think without a big rethink of how society
functions we may be in a really tricky situation.
What do you think about universal basic income?
I think it's interesting and I think it's probably the way of the future.
How does that work?
Well, you just give everyone some money.
Yeah.
There's a minimum amount that every single citizen receives automatically.
From what organization?
From the government.
From what government?
The world government?
Well, you know.
Whatever government that you're a part of.
Every nation.
We assume the nations still exist.
And they would have to buy into this.
Well, I mean, the nations would, in theory, do it because it would be better or easier than setting up a whole bunch of other welfare systems.
Yeah.
And, you know.
And it's being experimented with somewhere.
Finland, they've currently got a trial where they just picked 2,000 unemployed people at random and they said, we're going to give you like 500 euros a month.
Yeah. But yeah, it seems absolutely inevitable with all that, with the trends that we're talking about.
It's funny because you, you say it seems inevitable, and I kind of think it's a good idea, but I was in Switzerland, uh, last year and they actually had a ballot about it. So this is one of those things where if you get enough signatures, they'll have a public referendum about universal basic income.
And they did this in Switzerland, and they voted it down, I think, 75-25.
Oh, yeah.
So it was pretty harshly against.
And I was talking to this guy at a university who's a smart guy, professor,
and he was like, this is the most idiotic thing ever.
And to him, it was obvious that this was never going to happen.
And of course the great concern with universal basic income
is that nobody works anymore because they can just get income and live off that.
So why would you ever work?
So the idea of disincentivizing any work.
And that's why I think it's inevitable because I think it's getting it mixed up.
It's saying that we can't institute universal basic income because then people
will no longer work.
Well, what I'm saying is that people are gonna
no longer work in the near future
because of the advance of automation.
And so you gotta have an answer to that.
And it can't just be like, well,
these people won't work.
I mean, I don't.
Yeah, I was looking into this actually.
I made a recent video about robots taking our jobs.
And I was looking at the percentage of males between the ages of 25 and
54 who don't have a job.
And if you look back in the 1960s, it's about 5 or 6%.
You know, this is prime working age males.
So like, what are they doing if they're not working?
Mm-hm.
But you look more recently,
and that's up towards 18 to 20%.
So we're looking at a tripling of males
in their prime working age who just don't have a job.
And some of that may be due to globalization, outsourcing,
but another part of that is automation.
And I can only see that going up.
So you've got to do something.
Well, let's just, let's say that utopia is going to come. We're going to, we're going to solve all
our problems with technology and we're going to find a way to quit exploiting all the resources.
We're going to solve climate change. Let's just say all that happens. And our grandkids are
sitting there, they've got some sort of stipend, everything, all their needs are taken care of.
At that point, what is life like, right?
Because don't, I mean, I think we can definitely say
that we find purpose in our work, right?
You know, that you do what you do
because you find purpose in it.
Is it just substituting other things?
Yeah, I mean, I think everyone could,
they could just have hobbies instead of work.
I'm gonna do the little airplanes.
Like making little airplanes?
No, you're on that side of town.
You know the guys over at, where are they at?
Across in the Japanese Garden.
Yeah, over there in Balboa, the Balboa Lake area.
The guys with the remote control airplanes.
I have not seen this.
There's a whole airport dedicated to miniature planes.
Those guys are so serious.
Hang on, hang on, a whole airport?
Is it a miniature airport?
Yes, it's a miniature airport.
It's just a slab, really, but they're so into it
and they go out there every weekend.
There's no terminals, it looks more like baseball dugouts
because sometimes they need to stand in the shade
in order to glare up.
But here's the weird thing.
Squint up at the sun and pilot their planes.
But even those guys.
If you lay down on your belly and spread your arms out,
if you got spread eagle on the floor,
they have planes that big.
Oh yeah.
It's a pretty big plane.
And they're like gas powered.
Some of them are, some of them are electric.
But here's the thing.
I think they're all powered by passion for the hobby.
We can't just go out and do that because then
all of a sudden a guy's gonna show up and he's like,
I've got an automatic plane, I've got an autonomous plane.
And now it's just a bunch of dudes
watching autonomous planes.
You can't even have a hobby, we're gonna watch planes
make their own decisions and that's our hobby?
All we can do is put decals on them.
See, it just comes back to decals.
Why is that plane shooting down all the other planes?
I don't know. Let's find the owner.
I have no control over it. Some robot in the corner.
So this is your idea of a utopia.
Well, we're exploring it, you know? If you're saying everyone does hobbies.
I wonder if people get caught up in gossip,
if that just explodes.
Yeah.
And then a number of gossip mags out on the market.
It's just, you know, it's everywhere.
The whole store is dedicated to it.
I guess not stores.
You just get it delivered.
No, you're right.
Stores.
I think it'll be a retro thing where it's like,
you know, a whole store that replicates the experience
of exiting a grocery store.
You know, all of that impulse buy crap?
This is a good idea.
This is my business.
No robot's gonna come up with this in the future.
A store that's nothing but the last little bit
of grocery aisle before you check out.
The checkout rack.
Checkout area.
Yeah, checkout area everywhere-ia is what I'm gonna call it.
Okay, that's catchy.
And then when you come in, you check out.
When you get in, you check out again.
You just keep checking out,
because it's like oh man, I gotta make a decision.
I can grab one more thing.
And then it starts over.
Yeah.
And then you just.
You got all those little dividers,
and you could be, yeah, I love it.
Yeah, you're, yeah, you wanna invest in this.
This is a great business idea.
Let's invest everywhere.
But it's retro, because you're right.
It would all be digital, it'd all be in your brain.
But there's no aisles, there's no area to actually buy
the useful staples of life.
There's only the checkout area.
I'm basically describing a dollar store I think,
but let's gloss over that.
You recreated a dollar store.
A lot more conveyor belts is what we're saying.
Now I think one of the interesting things
that is probably also gonna happen with this is,
and this is, I kind of foreshadowed this in the past
and how I thought about AI and how it,
I don't know how I set it up, but basically,
I think that we've got all these issues
that are controversial, people can't agree on things,
and then we've got this faulty government model
that we rely on with a questionable dude in charge
of the whole thing.
And wouldn't you just like to say, okay,
I want to ask this robot that is completely familiar
with all the data, like what do you think about this issue?
And then they would just spit it out
and it would be like, oh I can't argue with that robot
because that robot has access to all the information
at once and is perfectly rational and has no bias at all,
except the bias that makes its way in from the programmers
which, okay.
Well I think you're describing a scientist
in its purest form.
Well, you know, but scientists are people
and so you have that inherent distrust of people.
But I think, you know, since scientists make the machine,
I don't know if there's any way to keep their biases
out of the machine and therein lies your problem.
It is a problem, but can't we, can't it be a more perfect
rationality than we can ever achieve, right?
What I'm getting at is aren't we gonna have
an AI government at some point?
I mean, is it, or at least a government
that is assisted by AI?
So in other words, it would probably be.
You know how quick governments are to adopt technology.
They're always the early adopters.
This is distant future, so you've got like.
It's gotten, I think your point is to move,
it's like for the financial system thing.
Well if you did.
Is that things have to get really bad
for entire governments to agree to do something.
We're talking about the Switzerland vote.
Just think about the way that the government
works right now, think about the way that people
get elected and like campaigning
and like influencing people's thoughts.
People who the vast majority don't ever think about
things on a deep level, they just go out and vote
for somebody who has an ad that appeals to them
in some way or some ideology that they identify with
and that person gets put in this position,
they don't know about this issue and that issue
and then they just start listening to people
who don't know about that issue,
tell them what they're supposed to think about that issue
and it's all these very, very flawed humans
making all these decisions.
Like that system, when you've got an intelligence
that is ultimately a more capable intelligence
that is growing, at some point you're gonna have to tap into that
for the sake of humanity, right?
I mean, I'm not into Star Trek enough to know
more than saying, you're talking about Spock.
I mean, let's figure out, Spock was like utterly logical.
Spock for president.
So what do we learn, yeah, what do we learn from Spock?
Sometimes his logic wasn't enough.
That's what we learn in Star Trek.
Because Spock didn't write his own scripts.
That's why you have the panel of people
who are keeping the AI government in check.
I'm not saying it's completely autonomous.
But then they're like, hold on, okay,
we need to know what to think about this.
Let's throw the problem into the machine,
see what the machine says, and then this panel
of wise people decides what to do.
So it's not completely led by the AI,
but you gotta be consulting it, right?
You're saying it's not the president,
it's like the cabinet, and I mean literally a cabinet.
Like the president's chief advisor is AI.
Is a cabinet.
A literal cabinet?
Yeah.
With a server in it.
Yeah, it's like throw it in the top of the cabinet
and then it'll spit something out of the bottom,
an answer out of the bottom of the cabinet.
It sounds like a cool idea.
I always just fear about that whole garbage in, garbage out
kind of problem of technology where if you don't phrase the question right
or you don't give it the right data,
you're not gonna get the right thing out at the end.
So it's still kind of troubling.
This morning, my son Lando, who's seven,
was like, okay Google, play the Slinky song
because he was playing with a really big Slinky.
And it started playing a really inappropriate R&B track by some guy named Slinky or something, and I was like, whoa, Google, whoa, Google.
Yeah, it's hard, it's hard to rein her in.
I also wonder, you know, when do we, when do we get an artificial intelligence that is, you know, truly deserving of that name?
Like to me, artificial intelligence always meant
something that was a computer that had its own consciousness.
But I don't know that we're close to that.
And I also don't know that anyone can time that out for you.
Like, oh, we're not there yet, but give us five years,
we'll be there, or 10 years.
It could be 100 years, could be 1,000 years.
Because we don't know how the human brain works yet,
I think it's really tough to make something
that replicates those capabilities.
Right, we actually don't know how we're gonna
cross that threshold.
And we've, but we've always been obsessed with it.
I mean, Frankenstein wasn't written a few years ago.
No.
I don't know when it was written.
Don't ask me, but I know it's old.
You know, as long as we've been conscious,
we've been obsessed with creating consciousness.
So I think if we're just driven,
we're bound to figure it out.
Do you think we're gonna get there?
Just based on that principle alone,
like just drive.
What if it's impossible?
What if it's impossible to replicate in a machine, you know,
the things that happen biologically?
That is a spiritual question.
So I have no clue.
It's a very fair question.
My feeling on this, and this is just going with my gut,
which could be totally wrong, right, because it's not a thinking organ.
But really, I just don't feel like we're gonna get there,
and certainly not soon.
I definitely don't think it's gonna be soon.
So I don't think it's gonna be like 2045 or something.
The singularity.
Right, but I don't know.
I just feel like,
I don't think it's gonna be like somebody's gonna walk out one day and be like, I got it, I got this, here it is,
talk to it.
Here it is, Frankie.
I think it's gonna get so close and we're gonna have
intelligent conversations with machines
and then after a while we're gonna be like,
well we've kind of given this machine all we can give it
and I kind of think that it's conscious now.
But at the same time we're gonna be,
I mean we keep trying to find where consciousness exists,
right, in us, like where's the self inside a brain.
And we have no idea.
We have no idea.
So I don't know, I agree with you that I don't think
that it's, I don't think it's gonna happen in our lifetime.
I really don't.
I think things will get unrecognizable in our lifetime
technology wise, but I don't think it's gonna,
we're not gonna have AI president.
Well let me ask you, what are you most excited about
the future, it can be the distant future,
and what are you most concerned about?
Maybe that's climate change, maybe that's something else.
Well, tough questions, because I mean,
I don't often think about the distant future,
but I do like the idea of humans becoming
a multi-planetary species.
Oh, let's save the good, start with the bad,
because I want to end with the good.
Oh, okay, great.
So we'll come back to that one.
Start with the bad.
Yeah, it's tough to pick what's your biggest concern.
I'm not a big worrier, so I don't have a lot of concerns.
What about you guys?
You packed quite a punch, though.
It doesn't come from a place of hate.
You tend to be violent under some circumstances.
It only comes from a place of love.
Maybe I haven't fully forgiven you.
I don't know why that came out.
It did, it does seem like it right now.
I can feel the tension across this table.
Yeah, what are the biggest problems?
Being a guy like you are and being an expert
in the things that you are in this particular,
like in America, right?
Like we have this really weird sort of dichotomy
of like being super advanced and like world leader
but then we also have like there's so much resistance
to scientific thinking and also a lot of resistance
to just scientists in general and a lot of distrust.
Where we come from, North Carolina,
it's just like scientists are suspects
and scientists have agendas and scientists
are trying to get grants and so that's the only reason
they're talking about climate change or this or that, right?
Yeah, yeah, it's a diverse country. You can say that.
I also saw this study which looked at opinions of scientists
in countries around the world,
and there was a clear negative correlation between GDP,
how developed a country was, and its perception of scientists.
So countries that are still developing,
they love scientists and have this idea
that they are solving problems
and really helping everyone live better lives.
And in countries where they're already very developed,
everyone's quite skeptical of scientists.
I think it's interesting that sort of
the more benefit science gives you,
the more you don't like it or trust it
or think it's a great thing.
It's kind of unfortunate.
I consider this sometimes like the curse of prevention.
This applies to things like vaccines, where people don't know what the disease is like, and so they freak out about a shot because they can't even see, what am I preventing?
So, yeah, that is the curse of prevention, doing all this good stuff.
I think, you know, everyone has biases and they really cloud your judgment.
And so when you talk about, you know, people distrusting scientists, it comes from a place, I think, of, you know, not wanting to be told what to do.
And being told that what you do is wrong and hurting the planet
and that you have to change and it's going to be painful and expensive and hard.
And, you know, for something that seems very intangible
and deep in the future and invisible.
So I can really understand where this is like a Rorschach test,
where, you know, even the people on the pro, you know, we need to do something about climate change side can maybe be somewhat detached from the evidence.
Uh, and people on the anti side are also detached from the evidence and they just battle in the middle over, uh, you know, some sort of ideological or, you know, virtue signaling ground or identity politics, all that sort of stuff.
So everything gets really, really awful, I think.
Well, and it seems like, and it's interesting
because Bill Nye is really,
he tends to insert himself into these things.
For sure.
And it doesn't really matter what issue it is.
He is doing a lot of great things, but at the same time, there's this,
when he says, okay, I'll go and I'll debate this guy,
or I'll go on this network and it'll be me
and then this other guy, and it creates
this false perception that this is a real,
there's a real debate to be had here.
And I'm not saying there's not a debate to be had
about issues, but sometimes when you just reduce it down
to two pundits, you know, and these, you know,
what do you call it, the news bit, what do you call the thing?
Talking heads?
Talking heads, but like you got a news bite or whatever.
Sound bite.
Sound bite, yeah, reduce things down to sound bites.
It paints the issue wrong for a lot of people
and gives you this idea that, you know.
I don't know, I mean I agree with you.
I just don't know who's, you know,
what other alternative you have there.
What do you do?
It's like you have no option
because either Bill goes on that show
and actually debates some guy
or that guy just
pontificates by himself.
And gives the same sort of bad talking points.
Yeah, it's a really tough one.
Because the networks are gonna still do it,
I mean they're gonna do that.
That's how they're going to dispense the information.
Yeah.
This kind of makes me think of something,
a little bit of a tangent, but that people on the pro-climate change side, I don't know, pro-climate change, the people who think this is an issue and they want to do something about it, sometimes they'll-
Climate change acknowledgers?
Yeah, acknowledgers.
They will sometimes write inflammatory rhetoric.
There are some people like Naomi Klein, and I haven't read her book about climate change, but I have read the back cover of it.
And it basically says that, you know, climate change is an indication that our system is not working, that capitalism has failed, and that, you know, we need a massive revolution to, you know, change everything.
And, you know, let's let's harness this issue as an opportunity to change everything the way we want it to be.
And I can understand that, freaking out people who don't have that sort of ideology and really getting their backs up and making them more suspicious of what scientists are talking about.
So in a way, I feel like stuff like that does a real disservice.
I would love for us all just to come down to the fundamental issue of raising the temperature of the planet by a couple degrees, which I don't think is a contentious issue scientifically.
And then thinking about that as kind of a risk factor.
There's a guy who's a kind of lobbyist for Republicans.
And I love the way he sort of talks about it, which is we acknowledge this as a potential risk for the future.
And you don't know how bad it's going to be.
It might be really, really bad or it might just be like kind of bad.
But banks, for one, have a lot of experience with dealing with potential
risks in the future. And what do you do when you have a risk which has a small probability of being
terrible? You hedge against it and you sort of pay in a certain amount up front to make sure that
never happens. Because if it did happen, there's nothing you could do that would make you whole.
So it's just, it's fundamentally this sort of, it's a risk
management game. And he's trying to talk to people and say like, who cares how bad you think this is
going to be? And who cares, you know, how much you really acknowledge this. Consider that it is a
risk, that it is something with uncertainty, which everyone acknowledges. And what do you do in the
face of an uncertain risk? Take steps to mitigate it.
And that's where I'd love to see us go,
is this place where we acknowledge it as a risk,
some uncertainty in the future,
and something that we should probably take some measures to prevent.
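The risk-management framing described above is basically an expected-cost comparison: a small certain payment now versus a small chance of a huge loss later. A minimal sketch of that arithmetic, with entirely made-up placeholder numbers (nothing here comes from the episode or from any real climate model):

```python
def expected_cost(p_catastrophe: float, catastrophe_cost: float,
                  hedge_cost: float, hedge_effectiveness: float = 1.0):
    """Compare the expected cost of doing nothing vs paying a hedge up front.

    All inputs are hypothetical placeholders, just to show the structure
    of the hedging argument, not an actual climate-risk estimate.
    """
    # Expected loss if we do nothing: probability times severity.
    do_nothing = p_catastrophe * catastrophe_cost
    # Hedged: pay the premium now, keep only the unmitigated residual risk.
    hedged = hedge_cost + p_catastrophe * catastrophe_cost * (1 - hedge_effectiveness)
    return do_nothing, hedged

# Say a 5% chance of a 1000-unit loss vs paying 20 units to fully prevent it:
nothing, hedged = expected_cost(0.05, 1000.0, 20.0)
print(nothing, hedged)  # doing nothing costs more in expectation
```

Expected value alone understates the argument, though: as the transcript puts it, if the bad outcome did happen, "there's nothing you could do that would make you whole," which pushes even harder toward paying the hedge.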
Right.
And you're saying that right now the two sides of the debate are alarmism
and kind of demonizing anybody
who questions it and then people who just deny it.
And those are the two camps.
And so where's the action gonna take place?
We just be like, well guys, something could be happening
and that something that's happening could be really bad.
And what can we do?
What are the practical steps that we can agree on?
I think that's a pretty reasonable perspective. I'd love for us to get there.
I feel like media has made things,
and I kind of refer to the internet and social media
has all made this much worse.
It's funny to think, where would we be policy-wise
if it was
1999 now?
You know what I mean in terms of technology and everything?
We didn't have Twitter or Facebook
or YouTube. Obviously we wouldn't be here.
if you know what I mean, the internet's allowed people
to find their voice.
And also the people like
them and really get into
these camps.
Oh yeah.
Would we be more integrated
if we had fewer communications technologies?
Yeah, that is the really ironic thing that's happened.
We're more connected and more divided than ever.
I was talking with somebody about this the other day
and we were talking about what we thought about issues
in the 80s, or how we learned about things, and it was just like, there's a couple of people
that knew stuff, there's a couple of outlets
and of course, in one sense, that information
was controlled by people who may have been deceiving
the public but in another sense, it's like you didn't
have this like, this is my guy, this is my internet guy,
this is the website I go to,
and these are the people who have it right.
That's right, and I also feel like people had
a greater sense of integrity for some reason.
Maybe it was out of this false notion
that they couldn't push the envelope too far.
You know, like even if someone was controlling the news,
you know, back in the 80s,
they would never dream of putting on false stories or anything because, you know, that's ridiculous.
Who would ever do that? No, we have some integrity.
And these days, you know, it's like anyone could put anything out there and then it can blow up without anyone's oversight.
So that's a really scary position to be in.
I also think, you know, media has made it so there's a big market
if you want to be extreme on this
side or extreme on this side. There's not
a lot of a market for the middle.
And that's really kind of unfortunate
and I think does a
massive disservice to the entire
discourse. Well it's interesting because
podcasts have kind of changed this in one
sense. It's like, I think about that
all the time when I'm listening to people talk about
complex issues on podcasts.
I'm like, this would never happen on the news.
Oh yeah.
Like you'd never get into something like that
and actually hear somebody break something down.
Yeah, not a sound bite, a sound meal.
Sound meal.
Sound meal.
That's what we should have called this podcast.
Perhaps a biscuit.
Maybe that's what your biscuit actually means, Link.
Sound meal.
But let me shift gears to the speculative distant future
that you are most excited about.
Like it could be, I don't know,
it could be an autonomous beard trimmer.
Well he said multi-interplanetary travel.
That's what he said.
Is that what it is?
Yeah.
You know, multi-planetary species.
That's my vision for the future.
And it's also really that thing of liberation from work,
which is something we kind of talked on earlier in the podcast.
But you think about the history of humans as a species.
And from the very earliest times, you know, we've been about doing the fundamental work needed to survive, you know, getting food and taking care of, you know, family and stuff.
And that's about all we had time for.
Then you move through the agricultural revolution, and you start to get, you know, artists and writing and poetry, and, you know, things start to enrich our lives.
And move through the industrial revolution, and the number of people who can do those sorts of things really increases.
And now we're going through, you know, further revolutions with this technology.
And so I'm imagining a future where people don't think about, you know, what is my job so that I can pay for a living and I can support a family, I can find food,
you're thinking about people who really pursue
their interests and their passions
and they have the freedom to do that
and also do it in space.
Yeah, replacing desperation with hobbies.
Is that how we're gonna overcome the population problem,
is going to other planets?
I somewhat am optimistic that we can support everyone on Earth.
Right.
You know, there may be a few more billion.
But, again, you look at the massive trends of the last, you know, few thousand years,
and it is reducing the amount of work, and it's also having fewer babies in developed countries.
Right, yeah, the birth rate goes down.
Yeah, I'm hoping that we see that.
I think we're already seeing it, the birth rate starting to level off in a lot of places.
And so hopefully we reach a point and we have the capability and the technology to produce
enough food.
And I think it is possible.
But I think for the long-term survival
of the human species, it's a good idea
to be on multiple planets.
And I also think, you know, space exploration
would be one of my top goals.
If I was in government, it wouldn't be growing the military.
It would be growing the NASA budget.
So what about getting to these,
I mean, Mars is one thing, right?
Are you not excited about Mars?
No.
Are you like, I'm waiting for Pluto.
Well, you know, we went to JPL
and they let us walk around the Mars yard
and it was, I mean, I gotta say it was pretty underwhelming.
I mean it was just.
Yeah, I've been there too.
Just kind of a dirt.
I mean it is Pasadena, it's not Mars.
Right.
To be fair.
Well it's a dirty back lot.
But if that's what Mars is gonna be like,
I'm like, well what about the Earth-like planets
that we can actually get to?
Wow.
Yeah, because we can't get to any right now.
Right.
So we're gonna have to figure out some tech
if we're gonna get to Earth-like planets.
So how are we gonna do that?
What's the closest one that they've discovered?
Like how many light years?
I mean 20 light years is probably your best bet
in terms of Earth-like planets.
And so is this gonna be, are we in a,
what's the movie where Chris Pratt's on the.
Was that Passengers?
Yeah, Passengers.
So is it that kind of situation where we've got people
in some sort of, you know.
Cryogenic freezing.
Some state and we just,
it's just, we're propelling them across the universe.
I feel like the first thing we'd send
is robots. Did they send robots first in that movie?
I haven't seen it. I heard it was bad.
I didn't watch it. Yeah, I heard it was bad too.
It just started in the middle of the journey so
no robot. Right.
I wonder
if you just send a colony
a group of people who could live and have
kids and they could have kids and by the time you get there
it's your fifth grandchild who settles on the place.
But your other option, uh, you know, the crazy thing about relativity is that if you go fast enough, the distance gets shorter.
This is the crazy thing about special relativity.
So for example, if you could go, uh, nearly the speed of light, 0.999 the speed of light, then to someone on Earth,
it would seem to take about 20 years to get that distance
for that person to arrive there.
But for the people inside the spacecraft,
it would actually take much less time.
This is the crazy fact of relativity.
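The time-dilation claim here checks out with a few lines of standard special relativity (this is textbook physics, not anything computed in the episode): on-board proper time is the Earth-frame time divided by the Lorentz factor γ = 1/√(1 − v²/c²).

```python
import math

def travel_times(distance_ly: float, speed_c: float):
    """Earth-frame and on-board travel times for a trip at constant speed.

    distance_ly: distance in light-years; speed_c: speed as a fraction of c.
    """
    earth_years = distance_ly / speed_c          # time as seen from Earth
    gamma = 1.0 / math.sqrt(1.0 - speed_c ** 2)  # Lorentz factor
    ship_years = earth_years / gamma             # proper time on the ship
    return earth_years, gamma, ship_years

earth, gamma, ship = travel_times(20.0, 0.999)
print(f"Earth frame: {earth:.2f} years")   # about 20 years, as stated
print(f"Lorentz factor: {gamma:.1f}")      # roughly 22
print(f"On board: {ship:.2f} years")       # well under a year
```

So for the 20-light-year trip at 0.999c that Derek describes, Earth observers wait about 20 years, while the crew experiences under a year.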
But how are we gonna get that fast?
Well that's the key.
We've got to find a good energy source,
and that's probably going to be something like fusion.
It could be fission.
It's probably going to be fusion, nuclear fusion.
So the same thing that powers the sun.
Doesn't sound dangerous at all.
Right, but, I mean, people are working on it right now.
They're trying to get a reactor up in Europe right now
to actually make power out of fusion.
So the same thing happens in the sun,
just combining hydrogen atoms together, squeezing them together, making helium and other sorts of things.
And that's happening?
Yeah, people are working on it.
In fact, they've been working on it for decades.
The problem that they've been having is they put so much energy in
to heat the particles up and get them ready to do their fusion thing
that they actually get less energy out of the fusion process.
Oh, okay.
So it's an input-output issue.
But I think we're very close to that sort of tipping point, inflection point,
where we're actually going to start getting more out than we put in.
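The "input-output issue" described here is conventionally summarized by the fusion gain factor Q, the ratio of fusion power produced to heating power put in, with Q = 1 as scientific breakeven. A minimal sketch (the sample numbers below are illustrative only, not measurements from any specific reactor):

```python
def fusion_gain(p_out_mw: float, p_in_mw: float) -> float:
    """Fusion energy gain factor Q = fusion power out / heating power in.

    Q < 1 means a net loss; Q = 1 is scientific breakeven;
    a practical power plant would need Q well above 1.
    """
    return p_out_mw / p_in_mw

# Illustrative numbers only:
print(round(fusion_gain(16.0, 24.0), 2))   # below breakeven: more in than out
print(round(fusion_gain(500.0, 50.0), 2))  # well past breakeven
```

By this measure, the decades-long problem Derek mentions is exactly that experiments have sat at Q < 1, and the "tipping point" is pushing Q past 1 and then far enough beyond it to cover the rest of the plant's overhead.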
And at the moment that you reach that point, doesn't that basically just, that becomes our energy?
I mean, you don't even need solar at that point.
Right.
I mean, that's like solar on Earth.
You're making your own star kind of thing.
But actually getting the technology to be affordable
and stable, easy to use, it'll take time.
So it's kind of like showering so hard
that you make yourself stink.
I don't know that I get that.
You expend so much energy.
Oh yeah.
Yeah, you go hard in the shower, you gotta be careful. Right.
I've never contemplated this.
Don't go hard in the shower.
I don't go hard, I take it easy.
I just lay back.
Right, take a tub bath.
Oh man.
You gotta get out more than you put in.
That's right.
You don't wanna stink coming out of the shower.
Definitely not.
This is a question I've often had with my friends,
like swimmers when they go in the pool, do they sweat?
Oh gosh.
See?
And can they answer the question?
I don't know.
I'm no swimmer and I'm not friends with swimmers.
All you gotta do is take a sample, man.
But how do you do it?
I mean they're wet anyway.
That's a good, that's like a video for your channel.
Yeah it is, it is.
Do swimmers sweat?
Hey, title thumbnail, everything.
It's all done, it's all done.
Can we collab on the Go Hard in the Shower?
Yeah we'll do Go Hard in the Shower.
That's a good title.
On our channel.
And then it's just total clean science.
It's not what you think it is.
That'll get clicks.
That's a good one.
Okay well, this has been a great conversation.
Of course we can't just end it here,
because we promised that you guys would settle your score.
Yeah.
You guys.
So, why don't, before you settle your score,
remind everybody where they can find you on the internet.
You can find me with the word veritasium.
Everywhere.
Yeah, veritasium finds me.
I can't even say it now.
You can't even say it because of the anger.
It's because of the anger that is deep inside you.
Veritasium is found wherever it's found, it's you.
Yeah.
Hopefully no one's impersonating you.
I got that trademark.
So you want me to punch him?
Yeah.
To end the podcast?
I want you to give him a love lick.
I think it would make you feel better, I think.
Just getting that energy out.
You think that I felt tense, but.
This would close the circle, you know what I'm saying?
I got so much pleasure out of turning this
into entertainment for everybody
that I feel like I can't punch you.
You could hug him.
Real hard.
Hug it out?
Listen, I'm gonna put it this way.
I'm not gonna, I have no revenge in my heart.
I have only love and forgiveness and respect.
Wow.
And I just consider it an honor
to have been punched by the hand
that held the world's smoothest object.
Roundest, roundest object.
Roundest.
Yes.
Okay, well you know what I mean.
It was the most perfect sphere
that anyone's ever made. It was the most spherical.
Yeah, you screwed it up when you tested it.
It wasn't smooth?
Well of course it has to be smooth
in order to be the roundest.
But it's not the smoothest?
Well that is a question.
That's a baby's bottom.
We'll collab on that too.
So you're not gonna hit him?
No I'm not gonna hit him, that's stupid.
I mean they're still listening, mission accomplished.
Okay.
But now we let him down.
Yeah.
Because see this is him man,
this is him playing producer.
And it's not real.
I'm Don King man.
Yeah.
I'm Dana.
Look at that, we're shaking it out.
I'm Dana White.
Audio listeners, we're giving it a nice,
semi-firm handshake.
Yeah, squeeze it, Link.
There's some vibrational energy happening there.
Wow, yep.
I love the science on this podcast, it's fantastic.
I love how this is fizzling out, too.
The forgiveness, forgiveness gives a good fizzle.
It does.