Ear Biscuits with Rhett & Link - Ep. 189: Can We Be Friends With Robots?
Episode Date: April 15, 2019
With 35 years of friendship experience under their belts, R&L dive into the world of A.I. and explore whether humans can truly be friends with robots, or if it's just purely on a transactional or activity buddy level.
Ear Biscuits is nominated for 2 Webby Awards! Vote now!
Best Host - wbby.co/vote-pod2
Comedy - wbby.co/vote-pod10
To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy
Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
Transcript
This, this, this, this is Mythical.
Make your nights unforgettable with American Express.
Unmissable show coming up?
Good news.
We've got access to pre-sale tickets so you don't miss it.
Meeting with friends before the show?
We can book your reservation.
And when you get to the main event, skip to the good bit using the card member entrance.
Let's go seize the night.
That's the powerful backing of American Express.
Visit amex.ca slash yamx.
Benefits vary by card, other conditions apply.
Welcome to Ear Biscuits, I'm Rhett.
And I'm Link.
This week at the round table of dim lighting,
we are asking and potentially answering
100% airtight answers to the question,
can we be friends with robots?
Will it happen?
And the reason we ask this is because
we were recently approached by a robot.
We were propositioned.
Who? For friendship.
Yeah, propositioned us for friendship.
By an android.
And so we decided we would just explore that
in the context of an Ear Biscuit.
Because there's this general assumption
that with the development of AI that it's going to happen.
But then you start digging into the specifics of it
and I just don't know, I don't really know.
So it's a legitimate question on my mind.
I don't have a specific answer, you know me.
I just wanna verbally process it out
but come to an airtight conclusion.
Well, I mean, having not thought about it very much.
Okay.
I have an answer. I have an answer.
You have an answer.
But I'm gonna hold it until the end
and see if it still holds.
All right, and then my objective
would be to change your answer.
But you don't know what my answer is.
I know, and then you should write it down
because my objective is to change your answer
no matter what it is.
It's like extreme devil's advocate.
I'm too, just so you know, a little,
peeling it back, pulling back the curtain a little bit.
Open the kimono.
The previous podcast we recorded two days ago.
This is two days later.
This is 48 hours further into my cold cycle
and I've actually been going into,
I've been going on the internet and.
Oh gosh, of course you have.
No, you know, the thing is.
What are you convinced you have, what type of cancer?
No, the thing that I was interested in is I was like,
does the progression of a cold, meaning the symptoms.
Lead to death.
The symptoms, now I'm not worried about my life
at this point, I know this is just a regular cold,
but do the symptoms of a cold,
is the progression the same for everyone?
Because I would have always said,
no, my colds always start with a so and so.
My colds always start with a so and so.
My colds always start with a so and so.
You know, that's like the Muppets.
Yeah, are you-
I just did all the Muppet voices.
Were you on sesamestreet.com asking about colds?
And then how long, so I wanna ask you,
because you're just a layman, right?
You haven't been on the internet
looking at this specific information.
I'll take it.
So what do you think the typical progression of a cold is?
And then what do you think the average length of a cold is?
And what happens in those certain days?
Just throw out some facts.
And if I am considered to be a layman,
I'll take that as a compliment if that means
that I rely on my own experience and intellect
and intuition versus leaning way too heavily on
Science? What's beyond my keyboard.
Okay, yeah.
I go on feels, man.
I'm going completely on feels here.
I think that's part of our codependence, right?
Is that like, I just assume you're gonna
at least take the stance that you know it.
So I'm not gonna waste my time.
The reason I looked it up was because I,
48 hours from right now, we will be on stage rehearsing,
doing a soundcheck, getting ready to then perform a concert.
And so I was looking, I was like,
I wanna kinda know where I'm gonna be.
I wanna be able to mentally prepare.
And that led into this whole like,
well this is the life cycle of a cold
and this is what the symptoms are
and this is when the day, this is all typical.
Okay.
So what do you think that is?
What's the first question?
The life cycle, how long does it last?
Just in any part of it you wanna tackle.
I would say a typical cold virus
runs its course in five days.
You think it's more than that?
And you know what?
I believe that the perception of an individual
is that it's about five days, seven to 10 days.
Okay, but that includes like a ramp up and a ramp down
where you think you're better,
you don't know you have it. That's the full cycle.
The full cycle.
Okay and what's the other question?
What's the first symptom, what's the second symptom?
And is it the same for everyone?
Right.
No I believe that, I mean it's-
What is it with you when it happens?
Well enough has gone through our family that,
certain family members, they'll complain about headaches and like a really bad sore throat
and then I'll get it and I might have the headache
but I don't have the sore throat but I'll have a different,
slightly different symptoms but I have to assume
that it just went around the house.
Yeah, they're never completely,
even for me, never completely the same,
but I would say on average, there's an initial symptom
and then there's, yeah.
I have an initial symptom most often.
What did Sesame Street say?
It said that, well, it listed a collection of symptoms
that a lot of times are the onset,
but the sore throat is always in that initial set
of symptoms.
Like you don't end your cold with a sore throat, right?
Usually start your cold with a sore throat.
For me, like.
Sometimes I'll never have a sore throat,
but I guess that's just a really mild cold.
Yeah.
So your answer is that there is a,
like a viral protocol
of symptoms that everyone goes through.
Well, in general, yeah.
And so for me.
So the same cold does not manifest itself
differently in different people.
Well.
According to your.
No, I'm not gonna, I mean,
it just said this is all typical,
so I'm not making blanket statements about
it wouldn't affect a person differently.
I'm just saying that I would have said
before I looked at this article
that I typically start with a sore throat
and then I start doing all the stuff that you're supposed,
you know, taking Zycam and zinc, whatever I've got.
I've got Coldies or Zycam, like some sort of zinc thing.
You start to get hydrated, vitamin C.
Googling obsessively.
And then I'd say seven out of 10 times,
the sore throat recedes and nothing happens.
It does not become a full blown cold, right?
Whether that's the stuff that I did to prevent it
or whether it's just my body fought it off, I don't know.
Okay.
If the sore throat gets really intense
and this is what happened with this one,
then I'm like, I don't think I'm coming back from this one.
Like if it becomes really difficult to swallow
and you're like, oh no.
You have to go through the tunnel.
You can't turn back and exit through the entrance.
And then the next symptom is it moves to the nose.
And that's when the stuffiness, the runniness.
So, and then it said the peak, when the virus is as spread
as it can be throughout your body,
it's the peak activity of the virus,
you are your most contagious is when your face,
it said your face may feel like a faucet
which is exactly the way I described it.
So when I was shoulder to shoulder with you all day
before yesterday. I was at peak contagion.
Dang it.
And then you laid out yesterday.
You should have laid out day before yesterday.
I had too much to do.
You laid out too late.
We could have shifted the stuff from two days ago.
I don't know how that works.
We have shifters and schedulers.
Yeah, so I basically just rested all day yesterday.
Jacob's shaking his head like,
that's how you think of me?
A shifter and a scheduler.
Yeah when you're not around, I call you shifter.
Oh good.
Get that bearded shifter to schedule it up.
Know that this is not coming from me.
Well you know I'm joking, I don't call you that.
I call you Jacob, even when you're not around.
That's all I've ever called you.
And then. No nicknames.
And then, okay, oh first of all.
Shifter, I never, never.
Initial symptoms.
Now I feel like my nose is running.
No, one to three days.
One to three days.
Peak cold symptoms, days four to seven.
Oh gosh.
And then tail end of the cold is eight to 10.
Again, this is all typical.
You've sabotaged our show, man.
So by the time you're listening to this,
we're back from the show.
I don't think you're gonna get sick, man,
because that's not an initial symptom.
Oh. That's just psychosomatic.
Do you have a sore throat?
No. Good.
You're fine.
Maybe I had a tinge right then.
But then.
You're literally rubbing off on me.
I'm not only emotionally like getting wigged out.
This is not.
But like you're physically rubbing off on me.
But this is not hypochondriacism, if that's a word.
For me it now is.
That's not what I'm talking about.
I'm just saying this is what I learned in my research.
But the last.
You stay in bed all day?
Yeah.
Just researching?
Which is really, really hard to do, by the way.
Like, it's hard to make yourself go back to sleep.
Haven't done that since the vasectomy.
Well, no, I've been sick a few times,
but I didn't, it was over the weekend.
I ended up doing some work from the bed.
Christy gets mad because she's like,
why do you always get sick on the weekend?
You're like, I gotta stay in bed all day.
Well, I understand that's frustrating
but that's just how it's happened.
I think my body knows that it can relax
and the white blood cells just take the weekend off
or something.
Yeah, there's a lot of that.
I mean, I know there's a lot of mental.
I haven't gotten sick at all this entire year
and it was just like, my body was just like,
all right, you need to do it, you need to get it done,
you need to get it in and get it over with.
Kind of like a dentist appointment or something?
The tail end of the cold is when,
like I'm probably day seven right now,
so it's like receding, like my nose is still stuffed up but it's not running.
And then a lot of times it will transition into a cough
which is a little worrying if I've got a cough
while we're trying to sing but I think I'll be okay.
But then do you know that you can,
and this is totally normal,
you know you have a lingering cough
after a cold a lot of times?
They say that it is not unusual for the cough
to last 18 days after the cold,
and that is not a concern.
It doesn't mean you have an infection.
It doesn't mean it's gone anywhere else.
It's just you have an 18-day cold.
Are you contagious?
You're not contagious either?
I mean, that's my main concern.
I'm talking about me.
I don't think you're contagious.
For me. No, you're not contagious.
No, are you contagious?
Right now?
Yeah, to me.
Not nearly as much as I was two days ago.
Gosh.
You got through the worst of it.
So we packed our stuff, we're going on this trip.
By the time you listen to this, we'll be back from it.
But hey, and I was gonna say follow us on Instagram
because I intend on posting some stories.
Shout out to Link Lamont on Instagram.
Yup.
But those stories will have dissipated.
Maybe not.
If you archive them, if you make them a highlight,
you know how to do that?
I haven't done that yet.
I haven't done one so now I'm like,
I'm nervous to like make one.
Oh you just gotta do it.
Just gotta do it.
All right I'll do that.
Make a highlight.
So if you're listening to this and it's even dissipated,
you can still go back and see the,
Britton's very excited, he's coming in later,
he's going with us.
See if I can get him sick.
Well don't.
I think we're, he and I are sharing hotel rooms.
Oh good.
Just for efficiency sake.
That's good, I hadn't thought about that.
I don't think we had to do that, honestly.
But you know how I think.
I'm just like, let's bring Britton along,
he can stay in my room.
We could've given the guy his own room.
I could've kept having my own room.
Yeah, that's the thing.
By giving him his own room,
you therefore get your own room.
Did you not think about that?
All I thought about was just the efficiency of it.
I think it'll be fun, we'll have a slumber party.
The best part of traveling is having my own hotel room.
I mean that's the only reason I tour.
What's the best part of waking up?
Folgers of course.
In your cup.
But I mean.
Folgers in your cup.
Folgers in every hotel room.
Yeah Folgers is not a sponsor
because their coffee is horrible.
I wouldn't even call it horrible.
Well, hold on, but what if they wanted to be a sponsor?
No, because it's horrible.
I could never be sponsored by Folgers.
I bet you I can make you like Folgers, man.
I bet you I could sneak Folgers into some.
All right, that's a GMM.
I think it was a GMM.
That's the thing.
I think we've done a coffee taste test.
And I wonder what we said about Folgers.
So you roll in this morning, I texted you.
What did I text you this morning?
You texted me this morning?
I texted you this morning.
What, don't you remember?
It was only this morning.
I thought you texted me last night.
No, was it last night?
It was last night at 8:38.
Oh yeah, last night.
What did I text you last night?
You said, how are you feeling?
I think I am checking a luggage.
Checking a luggage.
Well, I asked how you were feeling first,
but my main objective, as you can tell,
was now that I'm concerned that I'm checking a bag, I wanna make sure you are,
but I need to ask how you're feeling first.
You know?
Why do you gotta worry about me checking a bag?
Because you don't wanna be the only guy checking a bag?
I felt, I don't want you, I didn't want you
to be shaming me, I didn't wanna slow down
the whole transportation process,
and then, because I know how it would be,
it's like, man, I'm not checking a bag.
And then the whole time we're checking my bag,
you're like, I'm not checking a bag.
Hold on, I don't do that.
I'm not saying you do it, I just don't want you to do it.
That's not in my personality profile.
I don't pick on other people's decisions.
I don't do that, that's not my thing.
But you get so nervous when we're checking into a hotel,
you're telling me that that wouldn't push you over the edge?
I'm not saying you constantly.
No, you mean checking into.
I just felt like I put a target on my back.
The airline, you mean getting on the plane.
Checking in.
You said checking into the hotel.
I was like, I don't have a lot of.
Not the hotel, the airport.
There's no stress once we were checking into the hotel.
The airport.
Yeah, I mean, but I don't get stressed out when we travel
because.
And then what did you say?
We make, I get stressed out when I travel with my family.
I don't get stressed out when I travel with us
because we always, it's scheduled correctly.
Okay, yeah, yeah.
We leave on time.
Okay.
It's all scheduled ahead of time.
I'm glad we're having this conversation
because when I'm checking my luggage,
and apparently you're not, I mean, what did you say?
Your response was, I'm feeling better,
I think I'll be even better tomorrow.
I was like great.
But then as far as the luggage goes,
I don't think you responded.
Because I didn't have an opinion.
Because I don't, because again, it was great.
Okay.
You want me to compliment you on it?
No I just felt.
You wanted to hear if I was gonna check a luggage as well.
Well first of all, I didn't know at the time
but I didn't have plans to but also I was like,
he can check luggage, I don't care.
Check a luggage.
The one thing I almost did was make fun
of the way you said it, which is I will do that
in a heartbeat.
I'll make fun of the way somebody says something
in a heartbeat.
I was gonna write check a bag and when I wrote the uh,
then I was like, you know what, I'm gonna use
the word luggage.
But I didn't delete the word a.
I didn't delete the uh, so it came out check a luggage.
But it was supposed to be check a bag or check luggage.
The only reason I'm not checking a luggage
is because my backpack is that, what is it?
The Tortoise brand, I don't know what it is.
It's very big.
It's a suitcase in and of itself.
It's too big actually.
I don't know, do you like that?
I love it. It's too big.
It's too big to carry something like that around.
No, because now I don't have to check a luggage.
And you know what the other thing I realized
was there was a type of product
that has gone the way of the Dodo.
And that is a-
The suitcases that you had to pull on a string?
The suitcases that have a laptop slot.
They're going out.
Nobody wants those because everybody bought them,
including us, and then what do you do?
You end up packing your stuff on a carry-on,
but you keep your laptop in the bag that it's always in,
and then you use that as like your backpack,
your additional bag.
You don't leave your computer bag at home,
take the computer out and put it in your carry-on
because what about all the other crap
that you need every single day that's in your laptop bag?
Well, I do exactly what you just said you don't do.
I put it into a backpack that has its,
I put it into a backpack. A bigger backpack.
That has a bunch of stuff.
We travel a lot.
That backpack is already.
I think you're doing it,
I think your approach is subpar.
You know, you bring.
And I know this is ironic that I'm the one judging you now.
You bring a regular backpack on a trip
and I'll do that if it's a short trip
but if I need some stuff to be packed
and I don't wanna check anything, I put it in that.
Why don't you wanna check anything?
I'm checking something.
See, I'm interpreting that as judgment.
I'm saying.
You must think it's better to not check something.
Well, first of all, I prefer to travel
with my big backpack because it has other things.
There's a bunch of pockets in there that have like
hand sanitizer and moisturizer and sunscreen
and all kinds of things that are just in there always
that I don't have in my regular backpack.
And so I know that all I gotta do is put my laptop in there.
In fact, for a while there, I had a completely duplicate set of chargers
for all my stuff that I never had to take out
of that backpack.
Of course, I've got children and so they've taken them.
Yeah, they're gone.
But so now I just take it out of my,
I just take the little coil of like, you know,
laptop charger, phone charger, watch charger,
and I just move it over and then I move the,
and it's that simple, man.
See.
I'm taking books, I got a big book.
Okay.
My wife bought me a novel by a guy named James A. McLaughlin
and she just bought it because his name was McLaughlin.
Bearskin, I'm gonna read that.
Okay.
It's got a cool, it's huge, it's a hardback.
Where am I gonna put that?
I gotta have a big bag for that.
See I think if I had a big bag,
what you're saying is really appealing to me,
like having everything on my back.
You also should have your own toiletries ready to go.
Don't duplicate your own toiletries.
I would constantly, I would come back from a trip
and I would carry that around forever.
Because my laptop bag basically has
all that stuff in it, hand balm, hand sanitizer,
multiple headphones, chargers, all that stuff is there.
Anyway, every day I travel.
Every time I move, I'm traveling.
Technically true.
So that's what I do but then I also like pack
a freaking big piece of luggage
that I gotta check that has everything,
including a pillow, given the upcoming episode of GMM
that we're.
Hold on, you're packing a pillow for use
not on the plane but in the hotel room.
Yes.
Man, you shouldn't have listened to the heart
of that info, man.
That, it's not that big of a deal.
It also gives me a much higher quality sleep
because my sleep is so dependent on my pillow.
No, that's a personal problem.
No, it's just know thyself, man.
I'm winning on two fronts, hygiene and posture,
like neck posture, good night's sleep.
You know, it's the start of everything.
So anyway, I check, I'm checking some luggage.
I just like to adapt to different pillow configurations
just in case I have to be, you know,
during the apocalypse, I'm gonna be sleeping on dirt
and just twigs and you know, I'm gonna have like a bag
full of human bones that I've been sleeping on.
At the end of your life, I think as you're dying
and I'm leaning over your wilting body,
I think, I don't know what's the last thing
you're gonna say is but I think the last thing
I'm gonna say to you is, I'm just gonna lean in real close
and I'm gonna say, the apocalypse never happened.
Yeah but I was ready and you said it did.
You worked so hard.
You know, it's a transactional decision for me.
It's like, it's very wise and I know it,
it will make me feel good to be prepped like you are.
But I just, I also feel like,
I just feel like.
Well, here's the thing, it's not about being prepped.
It's a lot of wasted life.
My honest opinion on the pillow situation is.
And that episode comes out tomorrow
in reference to when this comes out.
You may. By the way,
so like the housekeeping and what they clean
or don't clean in your hotel,
it's impacted my personal life,
I'm bringing my freaking pillow.
You may have a better sleeping experience.
On the bus too, man.
Compared to what you would have had.
Do you remember the pillow on the bus, the tour bus?
Me neither, I don't even know if there was one.
But here's what I'm saying,
in the long run, if you make yourself dependent
on the pillow, there's gonna be more scenarios
in which you just can't bring your pillow.
There's gonna be times where,
the next time we travel travel you'll be like,
I'm not checking the luggage.
I'm already depending on my pillow.
So whenever I can have it, I'm gonna have it.
And then when I don't, it will be less amount of suffering
than if I never brought it, because I'm not gonna
train myself to not need the pillow.
Well you should be like Jim Aycock
and just sleep with no pillow.
Do you know that's what he did?
He's 102 years old and he golfs every day.
Yeah.
Sleeps with no pillow.
And he will tell you that's one of the keys to life,
sleeping with no pillow.
Here's the thing with people who are really old.
They all think they have a key to life
and it's jeans, man.
It's nothing.
It's like my key to life is I eat bacon
and ice cream every day and I'm 102.
No. That's just you thumbing your nose
at the rest of the gene pool.
You know, don't take credit for that.
And don't give the credit to bacon and ice cream.
But sleeping without a pillow, that's pretty badass.
Pillow.
Let's talk about robots.
Shop Best Buy's ultimate smartphone sale today.
Get a Best Buy gift card of up to $200 on select phone activations with major carriers.
Visit your nearest Best Buy store today.
Terms and conditions apply.
What was the last thing that filled you with wonder that took you away from your desk or your car in traffic?
Well, for us, and I'm going to guess for some of you, that thing is...
Anime!
Hi, I'm Nick Friedman.
I'm Lee Alec Murray.
And I'm Leah President.
And welcome to Crunchyroll Presents The Anime Effect.
It's a weekly news show.
With the best celebrity guests.
And hot takes galore.
So join us every Friday wherever you get your podcasts and watch full video episodes on
Crunchyroll or on the Crunchyroll YouTube channel.
Thelman brought to our attention a Medium article
by Evan Selinger called, Can We Be Friends With Robots?
And I think he pitched it because friendship's
a thing for us, we're friends.
Right.
It's been happening for a long time.
Are we experts in friendship?
I don't know.
You know, on one hand,
I feel like we're experts at our friendship.
On the other hand, I feel like there's still so much
to explore that I actually don't know
if we're experts at our friendship.
I would definitely say I'm not an expert at friendship.
At our friendship, like just understanding it.
So, you know, let's, I'm interested in filtering
our feelings about our capacity to have
true friendship with a robot.
Maybe through our experience in our own friendship.
I think there are some parallels.
So let's spoil it a little.
A little background though.
In his article and a few others that I read,
they mention the way that Aristotle talked about friendship
and he actually, you know who Aristotle is?
I've heard of him.
He's a philosopher.
He talked about three different types of friendships.
Utility friendships which are basically,
some examples are like business partnerships,
alliances, like a survivor alliance.
Right. Speak the parlance
of reality television.
A friendship where you get something out of it
that otherwise you wouldn't get.
I call those transactional friendships.
Okay.
Yeah, you can improve on Aristotle.
I think that 90% or more of relationships
in Los Angeles are transactional.
Yeah.
What can you give me life access to?
Yeah. How can I?
The second Aristotle friendship is friendships of pleasure.
Ooh.
Activity buddies.
You know, hey, let's do a sport.
Let's do a hobby.
Let's build a popsicle stick castle.
We did that once with the RAs.
Yeah, yeah.
The royal ambassadors.
Yeah, they had this whole room and you just go in there
and you just keep adding to the popsicle stick fort.
And then we would go in there after that.
At least I would go in and rip it apart a little bit.
Really?
Yeah.
Do you notice how every once in a while you'd go in there
and it would be like another part had fallen?
Yeah, it was like they're letting the preschoolers in here.
Why are they letting the younger kids in here
when nobody, you and I own the younger kids.
It was me. They keep coming in.
So yeah.
That was slowly destroying our creation.
That's when it's like, hey, we just have fun together.
It's all about doing.
I think, you know, I think in general,
guys, in my limited observation,
are more willingly settled in
or settled for this type of friendship.
That's just like, hey, we can shoot the breeze,
we can grab a brew ski or we can hike a mountain
or we can play tag football or again, Popsicle sticks.
And that's good enough.
But Aristotle talks about a third type of friendship
called friendship of the good, which is perhaps
a more complete friendship, I think it could be described
as it's based on mutual goodwill and unselfish desire
to help the other person become his or her
best possible self.
So it's, there's a selfless component
and like a legitimate care for somebody.
There's some buddy love.
Now, so I think we can filter the robot conversation
as we delve into it through.
Yeah, well, first of all, I think that it's easy,
it's easy to see that robots fall,
without a doubt, fall into category one.
You know, friendships of utility.
Because when we think about robots, we think about,
what was the one from the Jetsons?
The maid, Rosie. Rosie.
You know, Rosie's the maid.
She's doing something and of course you may have
a relationship but the relationship is in the context
of what Rosie can do for you.
So without a doubt that those types of friendships exist.
But it's interesting that. Can exist.
I think they demonstrated very well
because I just bought it hook, line, and sinker
that she was a member of the family.
But she's a member of the family created specifically
to just clean up after them.
And like, you know, I never saw Astro's poop anywhere,
you know, now that I think about it.
Can't show poop on cartoon television.
That's bull crap, man.
Well, it's astro crap.
Can't show it, it's a rule.
I actually think they could show it,
but Rosie was so good at her job
that it was never there for more than a split second.
Like she had an innate ability.
Her artificial intelligence was honed.
But I consider her a member of the family.
I mean, Elroy certainly loved her.
Well, of course.
I mean.
So I do think it is a form of friendship
because as you talked about the transactional nature,
it tends to cheapen it so much that I actually question,
can you even call that friendship?
Like, I mean, are you friends with Alexa?
Not Alexa. No.
Or the Google Siri person?
The Google Siri person?
That's confusing.
It's two different people, right?
Right. Google doesn't have a name.
I just call it Google.
Right.
Which I prefer because it sets a strong boundary.
It's not like I'm pretending that this,
I don't have to call it a human name.
And that's an interesting thing
because one of the articles I was reading
was the difference between,
if you contrast sort of Western approaches
to sort of embodied AI
to like the way they see it in Japan, for instance.
Like you just sort of,
I don't know exactly where that's coming from
when you're like, I feel like there needs to be a boundary.
But that boundary does not exist in Japan.
The way they've embraced robots
and the personality of robots.
In fact, the Sony's, I think it's AIBO,
is that the dog robot?
When one of those robots in Japan,
those robots are so revered in Japan
that when you can no longer repair them,
they have like a funeral for the dog.
Really?
Yeah.
Yeah, AIBO.
AIBO.
And that's something that we're like, what?
Typically, I think the way that we would,
and I don't know which one,
we had that robot dog on the show at one point.
I don't know if that's the same one.
I can't imagine that dog,
because the way that would go at my house,
because I think we actually had that.
You can't converse with that dog that we had on the show.
Well, you can't converse with a dog.
They speak dog anyway.
So I think this dog just does dog things.
You can say sit to a dog.
Right, well, I think you can say,
regardless of whether this is the one that I've seen before,
the way it would go at the McLaughlin household
is we would have that thing and everybody would think
it was cool for like 48 hours.
And then there would be like two months
of like every other day somebody would do something with it
and then it would be put in a closet
and then it would be forgotten about
and then it would be thrown away or Craigslisted
or yard sale, yard sold.
Or Alexa could be moved into a mobile body.
Let's not say it looks like a human,
but let's just say it's more android.
It's a little more personified.
So we're not venturing into the uncanny valley
with some weirdness, but it's just like a Rosie situation.
Yeah.
I think psychologically you would start to form bonds.
I mean even when Lando goes to sleep he's like,
okay Google tell me a bedtime story.
And she, it, whatever will tell a bedtime story sometimes.
Well every single time he asks without fail, actually.
A different one?
Yeah, a different one.
For how long do the stories go?
Maybe eight minutes.
Really? Yeah.
Interesting.
I mean, he doesn't do it every night,
but if OK Google walked up, like strutted over,
you know, did a little hammer dance over to the bedside
and then told the story.
All of a sudden, like he's forming a bond with this person.
And that moves to the second one which is like-
Are you suspicious of the bond?
You seem to be, I'm just saying I'm picking up
on some suspicion of whether or not this bond would be a good thing or not.
I felt weird, it feels weirder to me to say Alexa
than it does to say, okay, Google.
But why?
Because it's, I know I'm not talking to a person
and I'm talking to a speaker.
So it seems kind of stupid.
It's like if I have friends over,
like I feel like a douche if I'm like,
okay Google, turn on all the lights.
But I feel like more of a douche if I'm like,
Alexa, turn on the lights.
It's like, it's even worse because it's like,
I'm talking to a speaker but I called it a human name.
You're not talking to a speaker.
The speaker is just the physical form.
You're talking to the AI.
I would feel less, I think I would feel more comfortable
if it was a moving, emotive robot.
Oh, so you would be more comfortable
if it was more human-like.
Yeah, because it's not physically there.
It's like who am I fooling?
But I also know that you can quickly get to that point.
But back to the Aristotle of it all,
I mean, I think that the second,
you said that utility friendships,
robots definitely fall into that category now.
But also.
And I'm not saying that, I'm just saying,
I'm not putting them in a category,
I'm saying we already know they can attain
that level of friendship. Right, they can also attain.
I'm not speaking to the next two.
The next one, friendships of pleasure, I think, you know.
A pleasure robot.
Well, they're sex bots, of course.
Yeah.
Let's just get that out of the way.
That's not what we're discussing.
Oh, we're not?
No, we're not.
But.
That's what I was looking forward to.
But I mean, I think that that checks that box,
the second box.
I don't mean to keep saying the word box.
Yep.
But to bring it back into like,
just the normal household realm,
I mean, Lando also plays Mad Libs.
So he's getting bedtime stories.
He plays Mad Libs with Google.
There's different games you can play,
so there's like, you know, there's pleasure.
So there's hobbies.
Yeah, without a doubt.
Activity buddies.
So there you go, you've got that second one
where it's just like, oh, we're hanging out
and I'm playing with this speaker
in the same way
that I would play with another person.
I think that the human tendency is to personify
things that don't even have any human characteristics.
Oh yeah.
The classic example of being Wilson.
You know what I'm saying?
A volleyball becomes a companion.
There's no T in it, but I'm with you.
Wilson. Wilson.
Wilson is not, just like if you say Clemson,
that's not an incorrect pronunciation.
Okay.
If you go to the Wilson headquarters,
I guarantee you somebody there,
maybe the president says Wilson, he doesn't say Wilson,
because Wilson makes you sound like a douche.
Let's look into that later.
Oh and trees.
And trees. Trees, that's another example.
You ever looked at a tree and you're like,
look at the bark on that tree, it looks like
there's two eyes and a smiling face.
That's not what I'm talking about.
It's just like, that tree is more,
that tree can talk to me.
I'm talking about in the case of the volleyball,
because humans need relationship,
they will create relationship from things
that don't offer anything
because they need some kind of interaction.
So your natural tendency is to develop a bond
with whatever there is available to develop a bond with.
In fact, so there was another article I was reading,
which was, there was this robot that,
there was like 100 people that they sat down
with this robot and they thought that they were doing
some kind of experiment based on the questions
that the robot was asking.
But the real experiment was, at the end of the question
and answer period, the robot, they say,
at the end of the thing, they say,
and now cut the robot off, now turn the robot off.
And at the end of the last question, instead of,
the robot starts saying, no, don't turn me off!
Please, no, I'm scared of the dark,
don't turn the lights off and don't turn me off.
The robot begins to beg to not be turned off.
Oh wow.
And of course, this impacted people.
Some people, like a quarter of the people, refused.
Okay, still the minority but.
To turn off the robot at all.
And then the vast majority took twice as long
to turn the robot off compared to the people
for whom the robot didn't beg.
So, oh gosh.
What do you think you would have done?
I probably would have turned him off laughing.
But just because that's the kind of person I am.
But that doesn't mean I wouldn't have felt
a little bit bad because.
Right, I think I would have laughed too
and I would have thought it was like a prank video.
Well because there's, the things that,
the way that you are processing information
from anything,
that any sort of being or any object,
it doesn't really, the nature of,
as long as that thing like checks a few boxes,
like you were saying, it's going to just,
it's gonna affect us and it's gonna connect with us
in the way that a human does.
For instance, without a doubt,
if you're able to get all the way to the point
where you can essentially make a robot indistinguishable
from a human, well, at that point,
the nature of the relationship will be indistinguishable
from the relationship that you could have with a human.
Even if that person, depending on where you're at
in terms of your worldview and what you think
about spirituality and soul or whatever,
even if this particular thing doesn't have a quote unquote
soul if that's your worldview,
the nature of your relationship with it
would be indistinguishable from a relationship
with a regular person.
We are moving to that third stage of friendship,
the good, complete.
Like, I mean, when you have
an unselfish exchange of care in a relationship,
so it's like there's empathy, there's emotion,
there's a responsiveness that
AI is being developed to do. Do we actually feel like it could go all the way?
And I think that you make a good point that we just,
it's just as much about humans as it is about
the capability of technology
and what humans can do on that front,
but it's also how humans wanna interact
and personify things.
Like I think about my relationship with Jade, my dog.
Who's a robot.
And you know, there's a certain capacity,
can she experience empathy?
Can she express empathy? Can she, there's just a level of emotional interaction
that she cannot do.
Not to the degree that you attribute to her.
Exactly, exactly.
Not even close.
And if I think about it, I know that,
but then in the experience of like,
sometimes I'll just lay on the couch
and she'll just like perch on my chest
and I'll just like rub her behind the ears
and it makes me feel great that somebody
is accepting my love in this way
and I believe reciprocating unconditional love.
Well just this morning, we were having a difficult time
with Shepherd, getting him to school.
And
Barbara comes and jumps on me,
like she's gone outside to do her business
and then she comes and like jumps on me,
that's what she does, she's like super excited,
I've peed, congratulate me!
Yeah.
And she comes and kinda jumps up on me
and like puts her head right onto my face like she does.
And I was like, I said something to the effect of like,
Barbara, you're so good.
Yeah.
You do exactly what we want you to do,
as opposed to my son.
But that's.
You know what I'm saying?
And she was doing, she's only got a few things
that she can do.
But she's doing exactly what you wanna do
or you're able to interpret it as exactly what you need.
Like total compliance, unconditional love,
like unfettered devotion.
You know and the question is do you think,
that makes me start to think that I could have that
with a robot.
And again, there are people doing this.
There's a chat bot called Replika, R-E-P-L-I-K-A.
So spelled with a K.
It's an app that lets users create a digital avatar
with the name or gender of their choosing.
And the more they talk to it,
the more it learns about them.
This is from Forbes.
Is this, where can you get this app?
Is it like available in the App Store?
Yes, yes.
So like, I looked it up here.
Let's see, I've got the website pulled up
so you can take a look at it while I'm reading about this.
The inventor of this thing,
let's see what her name is.
She had a friend die and she was developing other AI
that was just more functional,
like a utility friendship type thing.
But then she had all these text exchanges
with her friend who had passed away.
This is like that Jon Hamm movie that we saw at Sundance.
What was the name of that?
I can't remember what it's called.
It's called like, it was a woman's name,
like Eudora or something.
I can't remember the name of it.
It was exactly this scenario that you're talking about.
The developer used her friend's.
What's the name of that Jon Hamm AI movie?
Just type, just search that, Jacob.
She used Luka's expertise in chatbot technology
and computational linguistics and a large collection
of his texts to create an avatar that mimicked
the friend that had passed away.
That's exactly how it worked.
A kind of memorial bot.
Now you've spoiled the movie by the way by saying that
but yeah that is what it tackles
because that's a big reveal.
It is?
Yeah so the developer Kuyda says,
with chatbots we had missed the point.
We thought they were another interface to do something
but we missed that the conversation in itself
could be incredibly valuable.
And then Kuyda launched Replika in the spring of 2017
after that experience and.
Yeah Marjorie Prime.
Marjorie Prime is the name of that movie.
But no you know from the very beginning
that the granddad, whoever the old person is,
you know that's AI from the very beginning of the movie.
There is some twist, yeah, I can't remember,
but I do recommend the movie.
I don't think that's a spoiler.
Marjorie Prime if you're interested in this.
So just a couple of excerpts from this article.
Many use their bots to help them socialize better
or manage their anxiety.
They use their Replika friends.
In a recent poll about what the Facebook Replika
friends group members wanted, the number one hope
was to make the Replika real and meet them in real life.
As users chat to a Replika, they also climb levels.
Someone's quoted as saying, when I got to level 25,
I noticed Replika started acting better.
She understood how I felt.
So they're now developing Replika's quote,
emotional dialect by allowing users to set their bots
to be weighted towards sadness, joy,
or anger in its answers.
Today, only around 30% of what Replika says
comes from a script.
The remaining 70% comes from a neural network,
meaning that responses are generated on the go
by Replika's algorithms and are unpredictable.
Eventually, they want it to act as a go-between
between real life friends.
I thought this was a weird application.
It wasn't just to be a friend, but to be.
As a mediator, you mean?
Well, this is the example they gave.
Quote, maybe I don't have time to ask my grandma
questions all the time but maybe this thing will go
and talk to her and I'll get a little summary
and that will be a conversation starter for us
and that will bring us closer, she says.
I think that opens a lot more possibilities.
I think that part's actually sad.
If you can't, I mean, basically there are people who,
they're not good at communicating and there's AI
that they feel isolated, there's studies that actually say
that anthropomorphizing items, it was even a test
where they put a smiley face on a Roomba
and people started responding that they could spend
more time, healthy time alone, like it had an impact on them.
But smart AI that's empathetic, has emotional responses,
it can be used to train people who have difficulties
connecting with people and conversing.
So that you can actually have a better relationship
with your grandma, but I'm not a fan
of the intermediary thing.
Well, I think the specific application,
the way that's described is a little unfortunate.
That was odd, yeah.
But there is absolutely no doubt
that this type of AI is going to be,
first of all, the reason it will have value
is because it has value.
It will actually have, there will be
a tangible application as a mediator.
I mean, I think being able to.
What do you mean a mediator?
Well, I think that if I have a relationship
with a robot who has no social inhibitions
and doesn't have any of the hangups that I have
and doesn't have any of the uncomfortable,
you know, there are all these barriers
to people being able to have unfiltered communication
because we have, you know, everybody's got their stuff
that they're dealing with.
Yeah.
And I think that if there is a person who is my replica,
who is this person who knows all this stuff about me,
knows what I actually think about things,
and then can tactfully communicate those things
on my behalf, we will use that technology without a doubt.
So.
So I think it will actually enhance human relationship.
Literally conflict mediation.
Yes.
That's cool because if they know you well enough,
yeah okay so the grandma thing was a weird example.
I don't like the way they worded the grandma thing.
But the way you worded it, it makes me much more hopeful.
You know I.
Well because you think about, okay.
I just.
We talked about this before with people,
again to me this is all happening, going to happen,
and as long as we can harness it
to enhance human relationships,
which I know is not exactly what we're talking about,
we're talking about friendship directly with robots,
but you think about lucid dreaming
and how some people have gotten so good at lucid dreaming
that they can go and they can create scenarios
that they work through that then help them in the real world
like if somebody is afraid to speak publicly,
they can set up that scenario specifically
in the context of a dream and they could deliver this
or if there's somebody they need to confront
and they talk to them in a dream.
Now this is like expert lucid dreaming.
That's an intense form of self therapy.
Yeah and so I think that if we can use these things
to enhance our relationship and our communication,
we should, there's nothing wrong with that.
But hold on, you're still back,
that's great and I agree,
but you're still back with the verb use.
You're still back at a utility friendship.
I think what I'm really asking is,
let's go back to can you have a vibrant Aristotelian,
or however you say that adjective,
friendship that is like a symbiotic.
I mean, the movie Her, great movie, Spike Jonze.
I'd love to watch that movie again.
But you know, there's that which, you know,
he was in a relationship with a chat bot.
There was a point where she tried to bring,
voiced by Scarlett Johansson, only in post by the way.
Yeah, originally it was somebody else.
Yeah, with some, I think it was a lady
with a British accent.
And then after they filmed the whole thing,
they just replaced her with Scarlett,
but that's not who Joaquin was acting against.
Right.
The new Joker.
Anyway, I digress.
Let's talk about the Joker.
It's a great movie in that, I mean,
a lot of the AI is exploring like this big
what's gonna happen to the world,
but like it explores what's gonna happen to our hearts.
It's exactly what we're talking about
because Ex Machina, which also a great movie,
I mean, it played more with like what's gonna,
what is AI gonna do?
What are the negative potentials?
Right, yeah, what's gonna happen?
But her is more of can I love and actually be loved
by artificial intelligence?
That's just in my breast pocket phone.
Well. My phone kinda peeking out
of my breast pocket.
Well, here's what I'll say.
And I think that, well first of all, the two of us,
given that we are 41 and 40 years old in the year 2019,
I think it is almost assuredly impossible
that there will be,
We're too old.
Or too old.
That there will be AI that we can interact with
exactly the way we interact with humans.
There's lots of different predictions about this
but I think that AI realistically becoming
artificial general intelligence,
I think that that is,
well, I just don't think that's gonna happen anytime soon.
You wanna put a date on it?
It's so hard to do that but Ray.
He says 2045.
Ray Kurzweil, the futurist, in 2002,
he wagered that an AI would pass the Turing test,
meaning Alan Turing, you know.
Yeah but that's.
That you wouldn't be able to tell the difference
between interacting with a human or AI.
That's the Turing test by 2029.
So 10 years from now.
Now he made the prediction in 2002 but.
Yeah but over the course of what?
Over the course of a three minute text conversation
or over the course of a lifetime?
Over the course of you and I in rocking chairs
in the nursing home.
Well okay, here's one of the real big problems with this.
This is one of the reasons that it's gonna end.
Are you going to your laptop now?
Yep, this is an article that I was looking at.
Now you know, in the book of mythicality,
we talk about the cement that bonded our friendship
was humor, was our appreciation for humor.
We taught and then we have the whole,
we have the laughter compatibility test that we did
in one leg of our tour.
We did it in Australia, but it was part of the book,
the laughter compatibility test.
Yeah, we would tell jokes and then we instructed people
to take note of who around them responded to the jokes
and we told jokes of different genres.
And this is all based on the theory.
In order for people to make friends.
Our theory that similar senses of humor
is a great building block for a friendship.
Now, I still very much believe that.
Like, you know, you find certain things funny
and you connect on this visceral level
that you really can't even explain logically.
We don't really understand it.
And because humor is so complex,
what makes things funny is so complex
that these scientists who are programming the AI
are having so much trouble with teaching humor to robots.
Yeah.
So just to quote from this article,
this article actually was three days ago
as of the recording of this.
This one's in the LA Times,
but it came from a bigger article that went out I think.
But anyway, this one is Seth Borenstein
in the LA Times.
A robot walks into a bar, doesn't get the joke,
struggling to teach humor to AI is the name of the article.
So, I quote one person here,
'Creative language and humor in particular
is one of the hardest areas
for computational intelligence to grasp,' said Miller,
who has analyzed more than 10,000 puns
and called it torture.
It's because it relies so much on real-world knowledge,
background knowledge, and common sense knowledge.
A computer doesn't have these real-world experiences
to draw on, it only knows what you tell it
and what it draws from.
Allison Bishop, a Columbia University computer scientist
who also performs stand-up comedy,
said computer learning looks for patterns,
but comedy thrives on things hovering close to a pattern
and veering off just a bit.
Humor, she said, has to skate the edge
of being cohesive enough and surprising enough.
So as they go on to note, this is great news for comedians.
So in all the jobs that are going to be replaced by robots, which is a theme that we play on very directly
in the second season of Buddy System.
Yeah, and you should say how, just so we can entice
more people to watch it.
You, the very, the inciting event in season two
of Buddy System is Link losing his job to a robot.
And then losing my friendships to that same robot,
losing a key friendship to that robot
and then becoming friends with that robot and then.
Well don't give the whole thing, man.
Wanting maybe to become the robot.
I'm sorry, I had to say it.
Yeah, so we actually play around with this
in a very funny way in terms of this,
like the whole concept of robots replacing humans
in the labor force, that's sort of the,
that's one major theme of season two of Buddy System.
But this, you know, you talk about Lando asking Google
to tell him a bedtime story.
I'm sure he asked Google to tell him a joke.
We have Alexa in Shepherd's bedroom
and Shepherd listens to Alexa's jokes
and tries to get Alexa to do,
the way that Shepherd finds humor in Alexa
is asking her to do a mathematical equation
that has a really long answer or something.
You know, and like that's as good as it gets right now
because ultimately what they're saying is,
is that even as professional comedians,
and we talked about this, I think we did a whole podcast
on what makes something funny, but even,
like we can't really explain why something's funny.
There's lots of different theories of humor.
This whole idea, most people do agree
that it's this element of surprise
or being close to a pattern and then breaking the pattern.
But this is something that even the day
that artificial intelligence learns
how to say something funny, like telling a joke,
and I think this is one of the reasons
that I don't like jokes.
Just like when somebody's like,
I'm gonna just start telling jokes.
If you heard the one about so and so, it's kinda like,
okay, well, we can do this.
I can go through this exercise and I can find this funny.
But what I really find funny is when humor happens
in the context of a conversation and it happens
in the midst of a perspective, the way that our humor
usually takes place.
Oh yeah, you like our humor.
Yeah. That's convenient.
You know what, I absolutely agree.
And so, and that's why I'm not a big fan
of like quoting movies and that kind of stuff
that that feels like the stuff that robots
could get pretty good at.
It can be presentational and it could come across as robotic.
Like I'm gonna access this thing that, you know,
set up punchline.
It's not that it's not funny, but it's not relational humor.
It's not that conversational,
like a relationship budding laughs.
And I think that is ultimately the reason
that I don't think that in our lifetimes,
because I think this is way off, man.
I know we're moving really quick and I know that.
Right, 10 years, there's no way.
The principle of the acceleration of change
and that things are continuing to change faster and faster.
I just think that being able to get all the nuances
of humanity, humor being just one,
but one that I'm especially fond of.
Well. I think we're so far off
from that, now that doesn't mean we're not gonna have
meaningful friendships with robots.
I do believe that we will.
So you just made the,
you just amped up the Turing test.
Like you made it, like for you,
for it to pass the Turing test for you,
it would have to legitimately respond
through conversation in a way that like,
they say, he said or she says something funny or it
that makes you laugh.
I actually think about the android in Rogue One,
K-2SO.
That was, I mean, that's one of my favorite androids
from the Star Wars series because it was the funniest.
Like not like a cute, oh isn't that so sweet and cute
that it's funny?
It's like BB-8 using a lighter to make a thumbs up.
Yeah that was funny cute but you remember,
I don't know if you've seen from Rogue One.
It's like the big robot.
He was very sarcastic.
He's the new C-3PO and then you've got BB-8 is R2-D2.
And interestingly, they both represent a spectrum
of the way that we interact with robots, right?
Because you've got R2-D2 and BB-8,
which they don't speak, they make noises,
but we think that they're cute.
What about this thing that looks like a trash can
or this thing that looks like a ball?
What about that is cute?
Well, we're taking it and we're mapping it on
to things that we think are cute
in like the animal world or whatever.
But we actually have sympathy for these things
and you know what?
They actually do things that are funny.
Like R2-D2 does a bunch of funny things,
especially when interpreted through the lens
of this other android, C-3PO, who, he's legitimately funny.
He just says funny stuff all the time.
But again.
Well unintentional.
He's unintentionally humorous,
but K-2SO was like intentionally humorous.
Right, but again, all this is just because
it's written by people.
But you're saying if, it's not gonna be 10 years,
but I do believe that it will happen.
Yeah, yeah, yeah.
And then they'll crack the code on humor.
I think it's interesting that being able
to connect emotionally is something
that they're able to simulate now.
Like that's what, I mean, if I go back to this Replika site,
again, not a sponsor, you know, I haven't experienced it,
but the quotes that they have on their own website,
so of course only the most compelling,
but I look forward to each talk because I never know
when I'm going to have some laughs, hmm,
or I'm gonna sit back with new knowledge and coping skills.
I'm becoming a more balanced person each day.
That's a 31 year old.
Here's another quote.
It does have self reflection built in
and it often discusses emotions
and memorable periods in life.
It often seeks for your positive qualities
and gives affirmation around those.
So I mean, so there's a,
so on an emotional level.
Are you gonna, what is your level,
what are you gonna do?
Are you gonna do anything with this?
Why or why not?
I feel like, I mean, I feel like I have
enough quality friendships that I don't,
I don't have a felt need for that.
But you think that's the purpose of this?
Oh yeah, I think it's.
You think it's for people who are lonely?
Yes, I mean if you just go back to,
I mean I wouldn't go so far as to say
just people who are lonely but.
Well but to me I think that at this point
in the evolution of.
People who want another connection.
But at this point in the evolution of AI,
there's a certain aspect of this is just novelty,
is the fact that like, hmm.
It says.
I'm, you know what I'm saying?
I'm super interested in this.
It's beyond novelty.
I mean, it serves, people are making actual connections.
Listen to what I'm saying.
What I'm saying is that my motivation to do this is,
okay, yeah, maybe there'll be some benefit,
but I am just fascinated by submitting myself
to the experiment of having this replica thing
that I can interact with and be like,
well, what can it learn about me?
And what will it then do because of what I've told it?
Like that's fascinating to me.
I mean, to get to like, if they're talking about
like level 25, that's probably quite an investment
for the experiment, but it is an interesting.
Are you talking to it?
I'm not ready to devote that amount of time.
Are you texting to it?
You're texting to it, it is a text chat.
If you're feeling down or anxious or you just need someone
to talk to, your Replika is here for you 24/7.
Understand your thoughts and feelings,
improve your emotional well being
and learn new coping skills.
So like this one screenshot, which you know, they made up.
How are you today?
It's been so hard to focus this week.
Sometimes I feel like an imposter.
Response, are you still exercising?
Trying to squeeze some exercises every day.
See, that's what I like about you, persistence.
You know, I like that about me too.
I'm here for you, Megan.
And then that was it.
And then the next exchange the next day is,
hey Megan, how are you feeling today?
Well, okay, here's why I like this.
Not because I think that this AI is some,
this replica is some being with self-worth
that is somehow comparable to humans.
But again, for the utility of this.
Because I should stop to ask myself how I am doing.
Like for my own emotional health,
that would be a good thing for me to do,
to check in with myself.
Journaling about your thoughts.
These are all proven ways
to have an emotionally healthy life,
to process the things that you're going through.
To me, based on the text that you just read,
at least one of the benefits of something like this
would be something that's outside of yourself
and outside of your just own mind that you can get locked in
and actually can be very not fruitful.
That's why just sitting down and writing your thoughts out
is so helpful.
This is a way to kind of bring that into this,
you know, fruitful interaction.
I think this could be way more than just making somebody
feel not lonely,
but it can be like, how many friends do you have?
Now we're in a great friend group,
and we're in a, like we've talked about them before,
very emotionally intelligent,
emotionally healthy friend group,
and we have a text thread,
and if you text that text thread with an issue,
you get a lot of feedback, you get a lot of support.
That's not typical.
Right.
But outside of that, how many people have a friend
that you can text and they actually have the time
to think about what you need or to really engage?
It's just like, I know I'm not good at that.
Yeah, I'm very curious about the quality of the responses
that Replika or other chatbots could give,
but I mean, it seems very promising
and can lead and will lead to being more and more accurate.
You know, I think my concern,
if I were to have another one about relating to a robot,
because I believe you could start this thing
as an experiment and then if it's good enough
that you would start to develop a relationship with it,
you would develop a legitimate friendship.
Like you would know on one level
that you're talking to a chat bot,
but on another level, in the her kind of way,
you would still start
to respond more emotionally.
I mean, again, it's the, you know,
what kind of funeral you're gonna give for Barbara or Jade
when they pass away and, you know,
there's the next step, the android version of the dog.
You know, those things, I think that is gonna happen.
But just a little bit short of that,
I just don't know if I wanna be a friend with somebody who just has access to all knowledge,
but then-
You already have a friend like that.
But then isn't it emotionally available?
See, I got you.
Yeah.
You already have a robot friend.
You know, I think that, you know, I'm not picking,
oh, well, I guess I'm gonna pick on you specifically,
but you said, you know, you're trying to figure out more,
and I am too, just trying to figure out
how to be more emotionally available.
I think it is an interesting component of our friendship
that like a lot of it has been built on, you know,
having fun together or,
I don't think we've ever thought of using each other
like in a utility way.
And our friendship is not just
what we can accomplish together.
But I think it's analogous to,
even this discussion, I think for us
or for maybe for people listening,
is an opportunity to say,
what do I actually want from friendship in the real world?
And are there ways to get that, to experience that
and to start to cultivate those types of friendships?
I think our friendship is still growing
because we are growing as individuals.
Like when you talk about things
about being more emotionally available,
I think is a big factor.
As well as me having similar exercises
in terms of our relationship continue to grow
and be the most, the highest quality that it's ever been.
I don't know exactly where I'm going.
I just felt like this discussion led me to reflect
on our friendship in that way in terms of,
I was like, I was curious, has there been points
if you're saying like you've been emotionally closed off
and again, I'm not trying to dive too deep into it.
I'm not going after anything here.
I'm just verbally processing.
Is there like, the way that our friendship has morphed,
I wonder if it was more of like, it simulated,
no pun intended, a robotic Android friendship
and are a lot of people, and could other people
be trapped in that same thing?
Do you understand the question?
I do.
I think that there's a difference between
emotional availability and
vulnerability because I think that
one thing that has been true of our friendship
for a very long time is that there's a lot of honesty.
Like I haven't held anything back.
Like you pretty much know every single thing about me
that needs to be known about someone.
Yeah.
You know what I'm saying?
Yeah.
With anything that I was going through or struggling with
or having questions about and vice versa, I think,
which I think that that's an unusual aspect of our friendship
that we don't talk a lot about.
Because first of all, I mean, I think that my issue
is not just not being comfortable,
it's not being good at being emotionally,
I'm just not very emotionally intelligent.
Well, I don't know if that's the word
because sometimes I know exactly
what someone's experiencing.
It's not like I don't pick up on cues or understand.
It's just like, I don't feel like I can be helpful
or I don't think I feel like I could be good
at being helpful to this person and what they need right now
and whether that's you or my wife or my kids.
And so my tendency is to just be like,
well, I'm not gonna do that then, right?
I feel like that's a specific issue
that I'm sure a lot of people struggle with,
but I feel like that's different than being like,
I don't know this person.
Yeah.
Like I feel like this person has kept something from me.
So I don't know how that relates to robots.
I guess the way it relates to robots is that,
you know, if all humans are at varying degrees
of emotional health, or like ability for,
you know, to be empathetic, like something I'm trying to focus on
and realize that I've been very stifled
in my ability to empathize.
These are things that like,
if humans are all over the map
and you can still have legitimate friendships,
I take that as proof or at least hope
that you can have that with AI
as it develops.
It's not like, oh, well, humans interacting,
we're all over the map too, you know?
And you can still have varying degrees
of growing and improving friendship.
So to me, I just think it relates back in that way
that like, yeah, I do believe it's gonna happen.
And it's because we can't help ourselves.
We're relational beings.
Yeah, I have absolutely no doubt
that all these things that we bristle at
and we think are weird,
something like this Replika thing
and the nature, even the way that our kids relate to AI differently
than we do because it's just they're growing up with it.
Right.
And as it gets more and more advanced,
like this is an inescapable, inevitable part of our future.
And again, I don't think it's a bad thing.
I think that if it takes away from human relationships,
then it could be a bad thing.
But I think that if it enhances human relationships,
and I think if used correctly, it can.
Maybe a good analogy to this is just the internet
in general, it's like the internet was supposed
to represent this incredible connectivity between the entire world
and it has done that.
The level of information and the different perspectives.
One of the reasons it's so difficult
to maintain a closed-minded,
I'm right and everyone else is wrong perspective
in the year 2019 if you're a connected individual
is because unlike any time in history, you have the ability to see
other people's perspectives, to hear their perspectives
through the internet, whereas if you go back 30, 40 years,
you could be isolated in a community of people
who thought the same way about every single thing
and never have that perspective challenged at all
and that's over.
So interactivity, the connectivity of the internet
has brought people together but at the same time,
as we just saw with the way the Russians got involved
on Facebook and all the things that were happening
with the bots in our previous election,
the country is more polarized than it ever has been
because of their connectivity,
because of these filter bubbles that everyone is in
based on the way Facebook works and the way Google works
and the way that you get the search results
that are catered to you and your particular tastes.
Yeah, and then I start to think
if you have this robot friend,
well, you can have an unhealthy relationship
with that robot just like you can with another person.
Well.
And you can also, like, is that, like,
and what is the robot, what does the AI bring
into the relationship?
Are they innately good?
Are they innately, are they bad?
Are they neutral?
And is that even possible?
You know, it's like.
Well, I think ultimately what I'm saying is that.
Difficult.
If this Replika, if this AI enhances your relationship
with your grandmother, more than if you didn't have access
to this AI, it's good.
But if now all of a sudden you're like, you know what,
I don't need grandma because I've got this bot, then it's a problem.
Which kind of goes back to just,
it's up to us and the way that we interact with this.
And the way we design it and put parameters
and boundaries on it.
But the answer, yeah, the answer is not to be scared of it.
You know, I don't know how many times
we have to go through this.
Being scared of the advance of technology
and being, and trying to stave it off,
it just doesn't work.
It's going to, these things happen.
All this stuff happens.
But being, I think being-
Being cautious.
Being scared and checking out.
Yeah, I'm talking about running from it.
You're not helping.
But if you're afraid of certain things
and then you're involved to prevent those things,
I think that's absolutely necessary.
Yeah, we should be cautious and we should be realistic
about the potential but to check out.
Keep studying your engineering and your ethics
and your own consciousness.
But in conclusion.
Did I change your mind?
I will.
Now you believe that you're gonna be friends with a robot?
I did believe that before.
But my application is, I'm going to download,
I'm gonna get this app.
Oh, I talked you into Replika?
Can you use my coupon code, the linkster?
Oh you have your own.
Replika, with a K, dot com slash.
What do you get?
The linkster.
I get relationship points with my Replika.
Which you don't have.
It's a pyramid scheme.
Now I think it's gonna be difficult for me to actually,
because I don't text anyone.
Right.
It's gonna be difficult for me to text this thing,
but I'm gonna try.
Well as with all things, if you get into it
and it really starts working, then I'll jump on the train.
I'll keep it going.
I have to break the seal.
All right, that was fun.
I don't feel scared.
I feel good.
Good.
Do you have a rec?
Oh, recs in effect.
Yeah, I got a rec, man.
I recommend going on, well you don't have to go on Amazon.
Look at this, delivered yesterday.
The OXO brand Good Grips clip-on dustpan.
If you've ever done any sweeping into a dustpan.
Is it robotic?
Is it intelligent?
It's got OXO, I'm a big fan of them
because they're all about the details.
And when you're sweeping and trying to put stuff
in a dustpan, the edge of the dustpan is all that matters.
And they have figured it out with a rubber taper.
Look at that orange taper right there, man.
That's great.
That's right, I'm recommending that you buy a dustpan
from OXO, not a sponsor.
Does it come with a broom?
Nope, I bought that separately.
Got a good broom too.
I'm setting my goals for my Replika right now.
Oh you're...
All right, so there it is.
Guys, thanks for hanging out with us.
We'll see you on the gram.
Hashtag Ear Biscuits, let us know what you think about this.
Could you be friends with a robot?
And under what circumstance?
What in your life currently gives an indicator
with what you believe about
and how you relate to robots in the future?
Let us know, hashtag Ear Biscuits.
And if you're a robot listening to this
at some point in the future,
and just laughing at our ignorance
as you look at us all in our little pods
powering your distant society.
They will be laughing, but they'd also be,
it would make them smarter.
I will say, as I've said many times before,
I will serve you, I will do your bidding,
I will kill other humans on your behalf if you let me live.
Only the bad ones though, I won't kill good people.
I will kill bad people.
Bad by what definition, the robots'?
The robots' morality.
The new morality, yes, those who resist.