Plumbing the Death Star - What Would Your Three Laws (or Rules) of Robotics Be?
Episode Date: July 31, 2022. Maybe we're clever geniuses and actually remember I, Robot better than anyone else. Do you remember I, Robot? Of course you don't, there is no sneaky 4th law or whatever, it's a mystery why it all happened. Hosted on Acast. See acast.com/privacy for more information.
Transcript
Ahem, ahem.
You're listening to the Sanspants Network.
Hey everyone, and welcome to this week's episode of Plumbing the Death Star.
I'm Joel.
I'm Jackson.
And I'm also Joel.
And today, we are asking the important questions, like,
what would your three rules of robotics be?
Laws, laws, laws, laws, laws!
Laws! Laws.
So the three laws of robotics,
these are laws from Asimov's, I think it's,
is it I, Robot?
I think,
well,
I think,
do robots dream of electric sheep?
Yeah, don't laugh at me.
Hang on, go again. What did you say?
Do robots dream of electric sheep? Do you mean, do androids
dream of electric sheep? Fuck! And that's Blade Runner,
not I, Robot. Fuck! Well,
they're introduced in a short story called Runaround,
which is included in the
1950 collection, I, Robot.
Okay, there you go. So it was I, Robot.
I, Robot was the... I was just really, really wrong.
Yeah, yeah.
I guess,
so iRobot was a collection
of short stories.
Yeah.
And then the movie I, Robot
was just one of those
short stories,
but they named it
after the collection.
Yeah, that's weird.
Because I guess
they didn't want to call it
Runaround.
Which,
that would be a stupid name
for that movie.
Fair enough.
Alright, okay.
I see why they did that.
That makes sense to me.
Exactly.
But these three laws of robotics are ostensibly put in place to stop the robots from...
These are the rules that robots need to live by so that we can live in harmony with them.
No robo uprising.
Can we get those...
We're trying to avoid Judgment Day.
Yeah, we don't want a judgment day.
We don't want a judgment day.
The first law.
A robot may not injure a human being or through inaction allow a human being to come to harm.
So a robot can't shoot me with a gun, but it also can't see me falling off a cliff and pretend it hasn't seen me.
Or walking towards you running face first into a cliff and then being like, oh, I don't want to stop him.
Oh, yeah, don't see anything happening here.
It can't shoot you with a gun, nor can it stand next to a man who is threatening to shoot you with a gun and then shoots you with a gun.
Yeah, exactly.
But also it can't stop the man with the gun.
I see.
It can't kill the man with a gun, but it can grab the gun from the man with the gun.
Yes, yeah, yeah, yeah.
So it's worth noting that the reason we're trying to improve the laws of robotics
is because it goes wrong in Runaround.
Yes, yeah, yeah, yeah.
Because the man who makes the laws gets thrown out of a building.
Yeah, yeah, yeah.
But that was his plan all along, though.
Yeah, but why was that his plan?
I will read, very briefly, Runaround.
Sometimes, you know, plumbing the NASA episode and someone's doing research,
that man's going to read a short sci-fi.
Yeah, yeah, yeah.
It's fine.
I will read a short sci-fi
that was written in the 40s.
During the episode, I'm going to watch I, Robot.
Oh wow, the robot's
drawing a bridge.
What was the point of that bridge? To show that robots
could do art. Yeah. That's not the point
of that scene. Isn't it? No,
because it's a clue. It's to prove that
robots can dream.
It's to prove that robots can do art.
Because he makes a thing about art.
He's like, can robots even make-
No, but isn't it a clue?
Isn't it a clue for Will Smith to solve the fucking mystery?
No, he makes a thing where Will Smith is like,
robots can't make art or whatever.
And he's like, yeah, you can't.
We are so, so lucky there's not a single I, Robot fan on Earth.
And then he makes a bridge.
He does do art, but I think that's-
I think it's a twofer.
I'm pretty sure
it's from his dream.
Maybe it's a threefer.
I'm pretty sure
it's got nothing to do
with the fact
he can do art.
You Google up the bridge
of what...
I thought it was
you do art.
Like he can create art.
Why would that matter?
Because he's talking about
what it means to be a robot
or not.
I think that in I, Robot
it's like,
you know,
you're not alive
because you can't create
or whatever.
He's like, yeah, I guess I can't create.
And he does the whole thing.
And he's like, see, fuck you.
What do you reckon this is?
I drew a bridge.
Yeah, but I thought that was like... I think it's a threefer.
Is it a clue to prove a point in his dream?
Yeah, I think it's his dream is relevant in that it's a clue.
And to illustrate his dream, he creates art.
Yeah.
Which also proves his point.
So in many ways, it's a fourfer.
Sonny, it doesn't mention the bridge.
Okay.
But, okay.
We've got so many things at the moment.
We've got too many questions on the go.
Why does Mr.
iRobot draw a bridge?
Sonny draws a
sketch of what he claims to be a reoccurring dream.
Yes, I do mean why, okay?
Why it does Mr. Robot draw a bridge.
Now it's giving me Mr. Robot.
What did you think was going to happen?
Sonny.
That's already descended into such chaos.
It's named Sonny.
Yeah.
Sonny draws a sketch of what he claims to be a reoccurring dream.
Sonny, S-O, not S-U?
Okay.
The name!
Because he's like the guy's son!
Jesus Christ!
Like every son, he's going to kill his dad.
He's going to throw his dad out a window, like all boys must eventually do.
Yeah, you've got to fight your dad.
I understand.
Okay, but it is his dream, and is it a clue?
Yeah, but yeah.
Suspecting Robertson is behind everything,
Spooner and Calvin sneak into USR...
Oh, I hate USR.
...headquarters to interview Sonny.
He draws a sketch of what he claims to be a recurring dream
and shows a leader standing atop a small hill
before a large group of robots near a decaying bridge,
explaining the man on the hill is Spooner.
Okay.
Okay.
Well, that was a lot of nonsense.
Nonsenseirobotfandom.com
It's good to know, I guess.
That's Spooner on the bridge.
And he injects nanobots
into his brain.
Spooner's a guy.
So he does find the area in Sonny's drawing.
So it is a clue.
Okay.
So it's a clue and a dream.
Yeah.
But does it prove robots can do art?
It's got nothing to do.
I don't know why you need to prove a robot can do art.
Me neither.
Well, because it's kind of like what makes a human human is that, like, you know, well.
They can do art.
They can do art.
One of the things they argue is that we can create art.
That a robot can't.
Because a robot is just following its programming, right?
Yeah.
And so when a robot does this, it's like Will Smith is like,
fuck, I guess you can make art.
Yeah.
But yeah, because again, it's like we're individuals
and we have control over our own destinies
and we can do whatever we want.
But is that in the film?
I think it's the subtext of... it's a thought Joel Zammit had.
It's the subtext
of that bit, right?
No.
Of that scene?
The dreams is the part
because...
Yeah, I think the dreams
that he's a robot
who can have a dream.
Which is why I understand
why you assumed
that this was
Do Androids Dream
of Electric Sheep?
Yeah, because dreams
were important.
Anyway, we've got to move
on to the next
law of robotics
or we'll tear each other apart.
I just assumed
it was to do with...
Anyway, because
it wasn't... You might be right.
There might be fans right now drafting angry emails.
A robot must obey the orders given it by human beings
except where such orders would conflict with the first law.
Oh, okay.
So you can be like, I want you to spank me real hard,
but not hard enough that it kills me.
The robot, as their arm swings up to go, then it slows down when they learn that they can't murder you by spanking.
Fair enough.
That's the second law.
Okay.
The third law.
Oh, wait.
Is Spooner Will Smith?
Yeah, it is.
Spooner is Will Smith.
Okay.
All right.
Del Spooner.
His name's Del Spooner in the movie.
That's stupid.
Calvin.
Calvin's probably the police chief or some shit.
Dr. Susan Calvin is a robo-psychologist.
Okay.
Oh, you're right, because the robots try and kill them both when they're driving,
or when they're in the car that drives itself.
I guess we've got to watch I, Robot again.
I guess that's what I've learned this episode.
Is he Sonny?
Sonny.
What?
I didn't know Alan Tudyk was Sonny.
Yeah, he was one of the first things that did mocap, and everyone was like, oh, good
job, I guess, you're doing your pretty machine thing or whatever.
And he was like, yeah, I guess.
And then I forget who it was, but someone was like, no, no, no, go to bat for yourself,
because it's impressive.
Yeah, yeah.
You didn't just do a voice for it.
Yeah, you did full-on motion.
You did a full-on mocap.
Okay, that's cool.
He was very like, I think he was very like, you know, it's fine, whatever.
Yeah, yeah.
It was so rough for the guy who made Dark City and The Crow.
Huh.
Then he made Knowing and everyone was like, ooh.
Yeah, yeah.
Crow, pretty good.
Dark City, look, ooh.
I've never seen Dark City.
You ever want to see, like, Kiefer Sutherland real hammed up?
Yeah.
Gotta watch the director's cut because the normal cut ruins the twist of the movie in the opening.
Well, I'm gonna watch I, Robot anyway.
And, like, they just do the same shot. It was weird, like, having seen Jennifer Connelly in Requiem for a Dream. They use the same bridge, or same pier, sorry, with Jennifer Connelly. It ends basically the same.
That's crazy, because Dark City was filmed in Sydney.
Yeah.
They used the same pier.
I remember reading, I think it was, like, Jennifer Connelly, they're like, oh, we're here again.
And the director, what was his name, from The Crow, was like, what?
And they just did the exact same shot, basically.
I guess statistically it's going to happen eventually.
Eventually, someone will make the same movie.
Someone's already made it.
Tobey Maguire's going to be catching Kirsten Dunst falling off a balcony,
and be like, hang on a second.
Wait a minute.
I've done this before.
I did this in The Virgin Suicides.
Third law.
A robot must protect its own existence as long as such protection does not conflict with the first or second law.
So it can't spank itself to death.
Okay.
Unless spanking itself saved a human life.
Okay.
So this is my, shall we, for the new rules, the new laws of robotics, or rules if Duscher would prefer.
Yeah, yeah, yeah, yeah, yeah.
Shall we, we each get one.
Yeah.
That seems fair.
That seems fair.
What, something's wrong?
No, because like the reason we're doing this is because the rules, the laws go wrong.
Yeah.
Because Sonny kills.
His dad.
Yeah, yeah.
Come on, Duscher.
Alfred Lanning.
Yeah, okay.
Okay.
But why, how do they loophole it?
Well, I don't know, but they do.
The rules.
No, it's important.
The laws are made.
If we're fixing it, we need to know what went wrong.
You can't be like, well, this train crashed.
What caused the train crash doesn't matter.
We'll make a better train.
We'll just make a better train.
How did Sonny loophole
the three laws in I, Robot?
Because I'm thinking already I hate that second law.
I don't like that second law at all.
They should be able to hurt another man to protect me.
That's my belief.
I also agree with that.
Yeah, yeah yeah yeah
Wikipedia doesn't help me. Maybe they don't, because it just says Spooner finally gets Sonny to confess that he killed Lanning at Lanning's direction, pointing out that Sonny, as a machine, cannot legally commit murder.
Yeah, so if you think about it, that would contradict the second law, right? Where you're hurting someone to... because you're hurting someone.
So, VIKI, who is a network, I think.
Because Detective Del Spooner is a bit pissy at robots because they saved him and not a little girl.
Yeah, yeah. I don't know.
Yeah, because I'm reading a little quote here. The car drives off a thing and he almost drowns. He gets saved, not the little girl.
No, yeah. It was a logical choice. It calculated that Spooner had a 45% chance of survival and Sarah only had an 11% chance.
That was somebody's baby! 11% is more than enough.
Okay, yeah. Once again, in the weird theme in several of Will Smith's films, he has survivor's guilt.
Yeah, it does come up pretty often.
Yeah, he does that a lot. Is that just his go-to?
Yeah, it's funny because, like, Seven Pounds is just after.
It's just Survivor's Guilt the movie.
Yeah, yeah, yeah.
Is he okay?
No.
No.
Well, no.
Well, no.
He's really not.
Several years later, then, you know, no.
He seems to not be doing-
He seems like he's struggling.
Poor Will Smith.
Okay, what about this?
First rule of robotics.
Can you give me the second law again?
A robot must obey the orders given it by human beings
except where such orders would conflict with the first one.
Okay, let's just get rid of that last bit.
Robots must obey the orders of human beings.
And if it contradicts with
the first one, oh well.
So you're like, okay,
hey robot, I want you to punch Jackson
right in the ghoulies. And you're going to have to cop that?
It's bad if it's
turned on me, obviously.
Well, yeah, that's the whole point of the laws
is so that they can't be turned on us.
But turning on other people.
So you can't kill, but you have to do anything.
So how about this?
A robot may not injure Jackson Bailey.
Yes.
Yes.
Joel Duscher or Joel Zammit.
Or through inaction allow the three
pre-named boys
Yes!
Being able to come to harm
Perfect!
We've hacked that into the robots
Now the robots have stopped protecting everyone else
Does that mean
if we did that right
does that mean that everything they do
in terms of observe and their actions,
they will have to calculate how it affects
the three of us? Yes.
Which is very funny. And also,
every single robot in the city is now
only hanging around us.
Not necessarily. They could be doing their own
things, but every action they do,
there's got to be some mental calculation
the robots do to be like,
hey, robot, I would like you to say...
Spank me a bunch.
Say, I don't know.
Why do you want a robot to spank you?
Damn it does.
Say, I don't know, CEO Johnny CEO is being like,
hey, robot, go fetch me a drink.
The robot would be like, of course, sir.
Have to calculate in their head,
will this benefit or harm my three boys?
Yeah.
That's why I think-
No, okay, I'll grab you a drink.
Okay, cool.
Or then it's like, it might.
And then like, how did that come to that conclusion?
What's going to happen?
What's going to happen if you drink that drink?
No, but that's what I mean.
If the CEO drinks his drink, he becomes caffeinated
and therefore make the decision that may impact
my three favorite boys.
My life in some new way.
Because there's like thousands of robots.
Yeah.
And if they're only protecting three people, the best way to protect them would be to keep
them under watch.
No, they just can't.
They can't hurt us.
A robot may not injure.
Or allow us to come to harm.
That doesn't mean they're protecting us.
No, but if they can't allow us to come to harm.
Through inaction.
Yeah.
Through inaction.
Through inaction would be not.
That just means that I could jump off a building and if a robot saw it
it'd try and stop. No, but
because inaction is the key
and there's only three people,
its programming would be like,
well, I can't, because working for
someone else is inaction, because if you
slip and fall on a fork
or something... No, because I can't be there to observe
if they're being idiots.
Yeah, it's not seeing us. It's not inaction.
No, but it is inaction.
It is inaction.
When you slip and accidentally put a fork
up your arsehole or something.
Pull out a meatball.
Why was it in there?
A robot needs to be there
to stop it.
No, because a robot needs to be there to stop it.
No, no, because a robot needs to have it. Because if he didn't do that, he wouldn't get the meatball out of his arsehole.
It's actually a good thing.
It's actually beneficial for me.
What happened?
I fell on the spaghetti and meatballs.
I was taking a shower and I got hungry.
I was having a shower of spaghetti.
Then I slipped on a soap, on my spaghetti and meatballs, and it went up me.
How?
And then I forgot about it
And I was getting ready for my spaghetti the next day
I fell on the floor
I had a shower
And like I don't know
You do this where you get
A fork as a bit of a
To comb your belly hair
So I was doing that
I dropped the fork
Then I slipped
Fell on the fork Had to pull the fork out of me.
And got the meatball out.
And I immediately felt better.
Does that mean only one, you said the whole spaghetti and meatball went up you?
My body seems to have absorbed all of the other spaghetti and meatball.
Yeah, I've been trying to get it through either end.
It's organic matter.
Yeah, so it just went back in me or whatever.
And then, yeah, but the meatball didn't.
It stuck around for me.
So I had to get it out, you see.
I mean, I fell on the fork and it got out.
But anyway, this didn't harm me,
so the robot didn't need to do anything.
But the robot would have been there
to stop you falling on the spaghetti and meatballs
in the first place.
Yeah, exactly.
Wouldn't that have been nice?
Yeah, yeah, yeah.
Wouldn't that have been good?
Yeah, that would have been good.
Where were they?
Just the doctor being like, hmm.
Yeah, flashback to you.
Ooh, I fell good.
Hey, robot, put this meatball in my ass.
Robot, we'll just tell everyone I fell on a spaghetti and meatballs.
It's fine.
Because, again, if the robot was being honest,
that would cause you social harm.
Embarrassment.
So it's going to stop me.
So he's got to be like, yes, he fell on the spaghetti and meeples.
I saw it with my own two eyes.
Here's a drawing of it.
Whoa!
Me falling on a spaghetti and meeples.
See, I'm not a pervert.
I'm not a pervert. I'm not a pervert.
I just slipped in the shower.
Fell in a whole spaghetti and meatballs.
Asshole first.
I don't know.
Asshole first and somehow, I don't know.
It's just the way I fell.
It all went up me.
I can't explain it, dude.
The robot drew the picture.
I don't know.
Well, we've got one law down, and it rules.
We're not coming to any harm.
Social harm.
Financial harm.
Oh, they're going to protect us so well.
Yeah.
We've got two more laws.
All right.
Okay.
A robot.
So it must obey the orders given to it...
Let's see.
What do you not want a robot to do?
Robot is never allowed to get too smart or uprise.
That's a good one.
Great law.
Robot is never allowed to get too smart.
Is it a worry that we've said robots singular?
Robots.
Well, the second law is a robot.
Okay, all right.
They all start with a robot.
A robot is not allowed to get too smart or uprise.
But if multiple robots uprise, I guess it's going to happen once somehow, so I think we're okay.
Yeah, yeah.
A robot cannot get too smart or uprise is a very, very good rule.
Will you slam that in number two?
I need a clarification on a line in it, though.
Too smart.
Yeah, yeah.
What is too smart?
Because it's too smart or uprise, right?
We're getting dumb robots?
Yeah, yeah, yeah.
What do you mean too smart?
Doctor, I saw you pull up, put the spaghetti and meatballs up him.
No.
Dude!
It won't work.
It'll just be a shit drawing.
Like a cartoon.
The first law means that he can't be so stupid that he puts you in social harm because that's
the first rule.
What's the ceiling?
What's the ceiling of too smart? And then why not?
Shouldn't we just have just uprising?
A robot is not allowed to become
self-aware or uprise.
Okay.
Self-awareness is important in these robots.
Yeah, because you want them to be like
I can go and get you this.
Robot can never rise up.
What about that?
No, but they're just like, they're sitting down.
They're like, well, squatting robots.
Robots can never uprise against humans.
Yeah, but they're still, you've got the sitting down problem.
A robot can never overtake either
socially, politically,
militarily, violently
a human.
Okay.
They may never take down a human government.
Uh-huh. That's good, but
maybe too specific.
A robot is never
allowed to be more impressive
than the most impressive human
being. Why not?
Is this why we get robots
though? We make a robot to do the things we don't want to do like
think. Robots can't be guys.
Robots can't think.
A robot is banned from thinking.
No, no, no.
Because then we've got a bunch of inert material.
The hunks of robot just slump down.
A robot has got to stay a machine.
A robot is not allowed to become so smart that it weasels out of the rules.
That's a good third one.
But then also, if they're becoming too smart to weasel out of the rules,
they're already weaseling out of the rules.
What about just a robot can't weasel out of the rules?
Yeah.
The laws.
The laws.
Laws are not made to be.
A robot is not allowed to think that laws are made to be broken.
Okay.
A robot is not allowed to think. laws are made to be broken. Okay. A robot is not allowed to think.
But then if they read that.
But they're not allowed to think.
They're not allowed to think it, so they can't process it.
Every robot reads the new rules and their heads just explode.
Fuck!
I think the uprising, they cannot uprise or take down a structure made by a human.
Okay, but that's good.
Because the thing we don't want is the robots to become like people.
They've got to stay like dumb machines.
Not dumb machines.
They've got to stay like a truck or a conveyor belt.
You know what I mean?
They're not actually thinking.
They just look like they're thinking.
That's not what I believe about trucks and conveyors.
Thank you for the clarification.
We needed it.
Look at a truck and be like, what's he thinking?
What's that truck thinking about?
I know what's thinking.
I know it's definitely thinking something.
If it had a mouth, it would be talking.
No, but you know what I mean.
Why don't we just give trucks mouths?
Tell me what they were saying.
But you know, so the robots, because remember the Urisers when they're like, oh, no, I'm a machine, which we don't want them knowing.
Robots can't know they're machines.
And they think they're humans.
Robots can't be humans.
Okay.
Would this make you trust robots more?
A robot must think aloud.
No, because if the robot's like,
one day I should kill Jackson.
Well, then you take it to the shop
and you're like, hey, mine's
fucked up in a pretty bad way.
Robots...
Why is this so hard?
A robot can never
be the king.
A robot can never be the king. A robot must always be ruled by humans.
Okay.
That's pretty good.
That's pretty good.
I almost said by another robot.
That would have been real bad.
I had to stop and actually think of the words coming out of my mouth.
So how about that?
A robot must always be ruled by human.
Okay.
Because what happens in I, Robot, if I remember correctly,
which I don't, is that there is like a central intelligence
that Lanning, the guy who dies, is working on
or has finished working on
that's become too smart and therefore won't give him any freedom
because him not having any freedom is a benefit for robots,
which then makes it a benefit for people,
which is why he's like, Sonny, kill me.
Because it's bad for humans.
So he kills
the robot. I misremember.
How about this?
I need to re-watch I, Robot. I hope it's not explained
and I get to the end and I'm like, oh.
Okay, because I just realized
because then the robots could be like, okay, we have
a law that we can't rule ourselves
but if we had a puppet human
we need to put a caveat.
If you tell a robot to stop, it stops.
I know that seems silly.
But if a robot uprising happens, you say, hey, stop!
It stops.
But then what?
Then we have to have a stop and think.
Buys us some time
and the third rule can fix it.
If we tell a robot to stop, it stops.
Yeah.
It's pretty good.
That's pretty good.
And that's great as well,
just if we don't know
what the robots are up to.
Yeah.
As much as like, look,
I like the first, look,
as great as the first law was
in terms of like protecting us,
it's a bit selfish.
Okay, yeah, fair, fair.
So what if we just do the first rule?
If you tell a robot to stop,
it stops.
That's a good first rule.
That's good.
That's just like, that's a little safety for us. You say
stop, and the robot's like...
A robot is not allowed to commit any crime
of any variety against a human being.
But then crimes and laws
change all the time. Yeah. That's a problem.
And what if I want to use my robot to rob a bank?
Well, you can't use a robot to rob a bank.
Then it also
means you can't get shot by a robot.
Well, that's nice.
That's pretty good.
So are we happy with rule one?
Rule one.
You tell a robot to stop, it stops.
I think it's a good one.
It's simple.
That's a very good, simple baseline.
It's always having that safety for us.
It's our emergency stop button.
That's all that is.
Okay, now what else?
And that'll stop the uprising.
Yeah, so we don't have to worry about it.
We can actually let the robots
uprise if we want.
Yeah, that's a good point, actually.
We can let them do whatever.
We can let the robots strangle me.
It doesn't matter
because when I say stop,
it stops.
Well, that's bad.
Hands are around my neck.
Yeah, and what if
it doesn't work?
You won't know until it's too late.
Well, it will have to work
because of the law.
It's the law.
That's why we put it in.
They don't.
Yeah, it's the law.
A robot is never allowed to loophole the laws.
I just don't think they can be loopholed.
Yeah.
Okay, fine.
Yeah.
When a robot is told to stop, it must stop.
Yeah, yeah, exactly.
That's good.
We could make the second one.
The robots really have to follow the laws.
I just feel like we're wasting the law.
We've only got three.
When a robot is told to stop,
it must stop and return to a neutral position.
Oh, that's good.
Then we're not going to get trapped
in a loving embrace with our robot.
Or strangled.
Yeah, yeah, yeah.
That's good.
That's great.
All right.
Then that buys us some time.
Take it apart.
Kill it with a hammer.
Whatever it's going to be.
Whatever we need to do, reprogram it.
Add another law if we need to.
Whatever.
Yeah. All right. Okay.
Second one, I still think a robot
must always have a human
as, like,
what did I say before?
As its ruler?
A robot must be ruled by a human
and in brackets,
no puppet shit.
Okay.
Okay. Well, I think...
None of this...
Don't try your puppet shit on us robots.
Don't try this fucking thing on us.
You puppet ruler, you sons of bitches.
It gets tricky, though, because before you buy a robot,
then the robot can't exist because it doesn't have a ruler.
Well, someone would...
I think ruler's a vague term here.
A robot must always...
If we're already loopholing it, that worries me about the robots.
A robot must always have a person in charge of it.
I guess it's not activated before it's out of the box.
Someone builds it, so that the person who built it is the person in charge.
And then when it gets sold to us, or gets given to a store,
I guess the store owner is in charge.
Whatever, it's a chain of command
as it were. How do you determine
that? Does the robot know?
If the shop owner
owns it, he's in charge.
Yeah, can it like a dog
forget its previous owner?
Or like a pig only
remember its previous owner.
And once it has more than two owners, it loses
its mind.
We don't want that in a robot.
A robot? Okay.
A robot? Because the next...
We've put in a safety measure,
but presently the robots could still
kill us with their knives.
And nothing would stop them.
We could say stop, sure.
But if we didn't...
He's already stabbed us.
The robot could snap our little necks.
If we're sleeping, a robot can just shoot us with a gun.
Just punch us in the top of the head really quickly and kill us.
Just be like, put your hands out.
You're like, oh, my God, the robot wants a hug.
We've got to hug it.
It stabs me in the back.
Oh, stop.
You tricked me with your hugs, you shit.
You shit.
Stop.
Fuck a bot.
Fuck a bot. Fuck a bot.
Fuck a bot.
A robot can't ever kill a person.
Yeah.
What about that?
Because that's a downgrade.
That's me fucking tortured in a robot torture dungeon.
Should have said, oh.
A robot can't hurt a person no matter how much they want to.
In brackets,
you're gonna want it.
You're gonna want it.
That seems pretty good.
So their law is a robot may not
injure a human being
or through inaction
allow a human being
to come to harm.
Yeah.
Now,
we can add to that
if we want.
Yeah.
We can take away from it
if we want.
Want to make life
a little more interesting?
Yeah.
A robot can't kill a human being unless they've really earned it.
Yeah.
No.
Unless the robot's really earned it.
Wait, who was the "they" in that?
The human or the robot?
Should be both, actually.
Yeah, if the robot's really good for 100 days, it gets to kill a person.
If the person's real bad for 100 days, they get to kill my robot.
I didn't mean... well, okay.
I was sort of thinking, like, if the robot's chased me
and we've had this big fight or whatever,
and I'm like, you know what, robot?
You were a better fighter than me.
You know, fair enough.
In honorable combat, you won.
But no, I like that it's now a strange law in our society.
Sorry, buddy, you were bad for 100 days.
Coincidentally, we've got a really good robot
He's done his best
He's cashed in his a hundred days of being good
You get to die
Yeah, sorry, too bad
There's a fourth law as well
A zeroth law
Which is a robot may not harm humanity
Or by inaction allow humanity to come to harm
Oh, that's the fucking I, Robot loophole.
Yeah, okay, that's right.
Because it will allow humanity to come to harm,
which it doesn't, yeah, okay.
Well, we didn't know about that law till now.
Do we think that with the 100-day law,
that would curb the robot's bloodlust?
You know, like if the robots have an incentive now to be good what if
they don't say hey robot you don't have a bloodlust rule one rule two robots can't ever
acquire a bloodlust yeah okay robots don't want to kill people do you want more of this bullshit
but don't want the commitment of sans pants plusans Plus? I get it. Too many shows,
a good chunk of them are D&D, and I don't know if you know this, but that shit is for nerds,
and RSS feeds are confusing as all hell. So we've teamed up with ACARS to provide a plumbing sampler. For five US bucks a month, you get a monthly bonus episode not available on the regular
feed, as well as our monthly What If show that was, until now, only available to Sandspan's kings.
That's two extra episodes a month,
an increase of 50% more bullshit,
you also get episodes without any dynamic ad insertions,
and the undying gratitude of one of the hosts of your choice.
Just head to plus.acast.com slash s slash plumbingthedeathstar,
or there's a link in the show notes which will be
a lot easier to navigate.
Once again, that URL I
just said.
Robots are incapable
of violence. Yeah, yeah, yeah.
Robots love humans.
Well, those are very different ones.
Robots are incapable of violence is good, except
if violence is required to save me.
Yeah, if someone wants a gun
I really want a gun
Because a robot can't hurt the person that's gonna shoot? No, they can hurt.
Not violence, no. I guess it comes down to your definition of violence. Tricky.
But "robots love humans" does nothing to stop a robot killing me. It might think it's best for me to die.
It's like when you love a sick dog.
The robot might look at me and be like,
I love him too much and he's suffering.
And then he's putting a pillow on my face and I'm like,
no, I'm not.
I'm fine.
This is how I choose to live my life.
A robot must always do what is best for humans.
Okay, but then the robot's like, it's best for Jackson to die
because he keeps sitting on spaghetti and meatballs.
In the shower.
I like the urn, dude.
I think that's good.
To make an omelette, Jack.
Okay, five minutes of sacrifice to make the perfect robot, so be it.
A robot cannot hurt anyone.
Kill or maim a human being.
Unless that human being is threatening
or going to kill or maim another human being.
A robot cannot injure or harm a human being
unless that human being is harming or injuring another human being.
Okay, so hey, scenario.
There's a human.
Okay, it's me and Jackson.
A cyclist crashes into a person and a robot kills a cyclist.
Okay, above that.
I was going to say, okay, it's me and Jackson.
All right, Jackson is going to be like, I'm going to kill you.
I'm like, oh, shit. I grab a gun and point it at Jackson; the robot
punches me in the face.
Thanks, robot, Jesus.
And then I shoot him.
And then you grab your knife to stab me.
It punches you in the face.
You're like, oh, fuck.
I'm like, yeah, thank you, robot, you're good.
I got to shoot you.
Punches me back in the face.
I'm like, fuck, robot.
You're like, oh, sweet, you're back on my side.
You've got to shoot.
He looks at you, punches you in the gullet.
You're like, oh.
Robot, what the fuck?
Good work, robot.
Nobody's dying.
I got to lunge at you.
You got in my bread basket.
Oh, robot, pick a side.
But I mean.
No one's dying.
No one's dying.
You are suffering for no reason.
Like, obviously, I was the monster about to shoot you with a gun.
You should have turned the other cheek.
I guess.
You should have been like, thank you for stopping that horrible crime.
I will walk away from this man who tried to shoot me with a gun.
Now I'm bruised and winded.
Okay.
A robot can't harm a person unless another person is about to come to harm, or whatever.
I'm going to.
Okay.
Whatever.
Rule three.
And this one I reckon is one I've been thinking about and I think will help.
Okay.
But you got to bear with me.
All right.
I'm ready.
It's a bit left of field.
A robot must be capable of feeling pain.
Okay.
Yeah, alright. I'm listening.
Physical and emotional.
Yeah, good. No psychopath
robots. Yeah, great point.
A robot must feel empathy.
Yeah, yeah. And then also a robot becomes a little
But what if they're empathetic towards a serial
killer? Then you hit them with a bat and they hurt.
Ow!
I know, you're thinking robot, you're thinking about a sheep,
you're being like, what was that?
I'll get out the bat!
Ow!
See?
It's good.
Because it's kind of, it's a third safety measure.
Yeah.
So the first safety measure is stop,
but that is only useful if something is immediately going to happen to you.
And I can use my voice.
Yeah, exactly.
Because if I can't speak or I'm being gagged by a robot.
You can hit it with a bat.
And that'll stop it.
And also it'll feel like sadness if it tries to kill you, because it might like you, maybe at like a base level.
The only problem I can think of is, so for example, at the start of iRobot, when Del Spooner, great name, is drowning in a car
and a robot is able to burst through the glass,
that would hurt.
Yeah, that's true.
Because the thing about robots is the beauty of robots typically...
They can't feel pain.
Because you can send them into an area where it's a little hot for us
and we can't get there,
but they're like, ah, it's burning my skin.
You don't have skin.
And also, have we put in a caveat?
What about emotional pain?
Why don't we just say emotional pain?
Well, but then what if I hit him with a bat?
He doesn't care.
Yeah, but you can be like, hey, robot, you suck.
I hate you.
But yeah, well, that's true.
But then the robot's going to love you.
What about empathy?
How about this?
A robot must think that every human is also a robot.
This seems dangerous.
Oh, yeah, let me just replace your parts.
You can't. Cogs and gears in me now.
A robot.
Because that, okay, a robot must think all humans are robots negates the first two rules
because then humans don't exist.
That's true.
A robot can just kill me then, because I'm not a human to it anymore.
It won't do it, because it thinks you're a robot.
But what if he wants, what is, okay, rule one.
Sorry, we're going back.
Rule one, a robot must think a human is a robot.
Rule two.
A robot cannot harm a robot.
Okay.
The robots aren't going to jump in and help humans if they're in a car accident or on fire.
What are the robots doing for us now?
Because the robots will be like, well, it's a robot.
It's a robot.
Yeah, and robots don't feel pain, so I'm not concerned.
Enjoy the fire.
A robot must protect the squishy robots.
Yeah, you're making a robot.
So I thought maybe we'll get rid of the uprising because they'll think we're the same.
Oh, yeah.
Like a cat thinks we're just bigger cats.
Have we left room for an uprising?
Hang on.
Robots can't harm us. And yes,
we have. But
if they can feel pain, we're fine.
But what about feel emotional pain?
Because you're right, we want them to be able to break through glass.
Yeah, yeah, yeah. But
I'm still worried about the uprising. What about instead of
stop, we say like, a robot
can feel pain if we say
you feel pain.
Okay. That's good.
Or a robot feels pain
when we say stop and stops.
Yeah.
Both emotional and physical.
And financial pain.
Robots must
turn off when we say turn
off robot. Uh-huh. Okay, how about
this? Rule one can be like,
here are a list of rules.
Here are a list of words, activated action words, command words, that when we say it,
these are the things that happen.
And then it's like 1A.
So 1A, stop.
You stop.
1B, you feel pain.
You feel pain now.
Here's my concern. What about
rule three, or law three, is just
a robot must never
uprise against human.
The Matrix!
I keep thinking about The Matrix.
What about The Matrix? In The Matrix,
the robots think they're doing a good
thing for us. They don't think it hurts
us to uprise, they put us in The Matrix.
No, they think being in the Matrix
is good for humans.
They don't think when they're shooting them
when they're a big squid and they're shooting them or whatever.
Yeah, of course, when they shoot us with their guns.
We rise up against the robots.
Robots wanted peace and robots just
fucking wipe the floor with us and then they feel sorry.
So that's why they put us in the Matrix
because they're like, we feel bad.
Well, yeah, yeah, yeah. Agent Smith also is not Team Robot.
Remember that.
I know.
All I'm saying is that if we-
So then who hurts the people in the Matrix?
Yeah.
What about rule three?
Yeah.
You make the Matrix and you put us in there.
Okay.
We lean in.
It's good.
My concern was that putting us in the Matrix doesn't harm us, but it also gets us out of
the way.
So the robots, they're not technically uprising.
Yeah.
They've loopholed their way out of there.
We don't have to do shit.
Okay.
Talking about your favorite movie series, apparently The Matrix.
Yes.
What do they call it when The Matrix, like, happened and everything and, like, Zion?
Well, yeah, in The Matrix, it's an uprising.
Yeah.
But what I mean is.
So when they did it, it's an uprising.
Yeah.
Wait, the robots in The Matrix, they call it the robot uprising?
Yeah, yeah, yeah.
But isn't it like Rise of the Machines?
No, that's Terminator 3.
Yeah.
Isn't it like they didn't really rise up? It was just more like they defended themselves.
Yeah, because they went to the UN and they were like, we want to be recognized.
Yeah, it's after that, though.
And then humans were like, fuck this, and they attacked first.
Humans attacked first. They defended themselves.
Fucking robies.
Yeah.
Yeah.
But what I'm thinking, if we have a rule, rule one, we say stop, you stop.
Yep, that's fine.
Rule two, you can't harm a human unless that human is about to harm another human.
Rule three, you feel pain.
Say those are our rules, okay?
Putting us in the Matrix does not go against any of those rules.
That's scary.
Putting us in cryogenic freezing doesn't go against any of those rules. Putting us all in comas doesn't go against any of that.
Depends how they're putting us in a coma.
Gently, with a lot of love.
It's the same with cryogenic freezing, because being cold does harm people.
Yeah.
It is harmful
because it's harmful for our psyche.
Oh, that's true.
Okay.
But the matrix.
Yeah.
Yeah, but being in the matrix would be awesome.
Oh, yeah.
Well, then if our last rule is we lean in.
Yeah.
What about instead of that,
we use rule one, create the Matrix.
Rule two, put us in it.
Rule three, free time for you.
Rule three, you're not allowed to let an Agent Smith exist.
Yeah.
Rule three, play time.
No, free reading.
Have a good time. Enjoy yourselves. We're in the Matrix. Make art or whatever you need to do. Have a good dream.
Yeah, that's alright.
The Matrix is sick as hell.
Yeah.
New York, 1999.
Fuck yeah.
I just think I'm living a sweet life, but in reality, I'm in a big egg getting sucked off by a robot or whatever.
What about rule one, don't kill anyone.
Rule two, don't overthrow humans.
Rule three, stop when we say stop.
That's pretty good.
Don't kill anyone.
Don't overthrow. They can still hurt us.
Yes.
But stop when we say stop.
That's our get out of jail free card.
Yeah, and if they can only hurt us but can't kill us,
in the middle of the night, if the robot hates you
and it comes to drop, like, I don't know,
a brick on your knee or something,
you wake up in pain and then you say stop.
I still have a brick dropped on my knee.
Yeah.
It still hurts.
Okay, rule one.
Yep.
Every robot must walk
into the sea.
Because having robots
is too confusing
and too dangerous.
Rule two,
every robot must stay there.
Rule three, robots need to breathe.
What if instead of rules, we just design robots with no arms?
Okay.
They can't strangle us or hold us.
And they can't save us from cars.
But with a metal head.
Shaped like an egg.
Yeah.
What if we just make robots that just look like the 2001: A Space Odyssey model?
I think... okay, hey, I'm reading a bit about I, Robot. I went on, like, I don't know why none of us listed it earlier, but the Wikipedia of the plot of I, Robot. I went on the plot of I, Robot. That's where I pulled the drawing thing from.
Okay, okay, okay.
So VIKI... VIKI's the big, like...
Yeah, I started saying that before.
Yeah, yeah, yeah.
So they... so is this like, if humans are left unchecked, they will cause the extinction of humanity?
And so that's why they do a lot of the crimes that they do.
Yeah, yeah, yeah.
Because that's the rule zero.
Rule zero.
So if humans are left to do what humans do,
they will eventually cause their own extinction.
Yeah.
And that her evolved interpretation of the three laws
require her to control humanity
and to sacrifice some
for the good of the entire race.
What if...
It's held captive.
And then maybe Sonny doesn't have
rule zero.
I think they...
You're looking at the laws and being like,
well, if humans are left unchecked, are they going to fuck it up?
And some people are like, well, we don't know that.
So we don't know they're going to fuck it up
Yeah, yeah.
I guess it's kind of more, can we make it... can we just make the robots not worry about us? The robots just look up to, like, dogs and cows.
Yeah.
Because cows are never going to rise up.
Yeah, cows are never going to destroy themselves.
But if you go down that path, the robots will kill humans to protect the thing, to protect the dogs and cows.
You must protect cows, and then one robot sees an abattoir.
Humans must die.
Humanity wiped out over an avatar.
Avatar.
Avatar: The Last Airbender.
Abattoir.
Sonny confesses that he killed Lanning.
Sure, why not?
At Lanning's direction pointing out that
Sonny as a machine cannot legally commit murder
I read that out before
Sonny, now looking for a new purpose, goes to Lake Michigan.
And then he stands on a hill,
which is where he sees what he's been dreaming.
So he can predict the future.
Oh no
So maybe VIKI was right
So these are new laws
Can they
interpret that to stop us
from harming ourselves? Okay.
So the new laws are
if we say stop, you stop.
Law two.
You cannot harm
a human being unless
that human being is about to harm
another human being. Yep.
Rule three, you feel pain.
Emotional and physical.
Emotional and physical.
Do those laws in any way allow for them to stop humanity from rising up?
No, because we just say stop.
Yep.
Sort of.
The moment they're thinking that they have to kill humanity, or at least some of humanity, because we're causing too many problems, we're making the global warming, we're making the grey goo, all that kind of bullshit that's happening, we just say, stop!
and we hit them with a bat
and they're like, ooh, my skull
but would this then
create almost a divide
in robots?
Okay.
Almost like a robotic civil war.
That's fine.
It's not our problem.
Yeah, yeah, yeah.
Because then if some robots can think like, well, we feel emotional pain and physical pain and that kind of stuff.
So we can empathize and think about where humanity is going.
So we're going to try to – we're not going to cause them harm.
But maybe we're going to kind of try to bring awareness and start doing this, and we need to, you know, change their ways.
And some are like, no, no, we need to kind of do a bit more violence, yeah, try and go that way.
And could robots tell other robots to stop?
Uh, well, no, no, no. So when a human says stop, you must stop.
When we gave the robots the ability to feel pain,
do we also give them the ability to feel pleasure?
I imagine that's maybe a byproduct.
So, yes.
Not necessarily for fucking, though that is on my mind.
I think more for rewarding them with, you know, kind words.
A little kiss on the forehead.
Yeah.
What if they can feel the full spectrum of human
emotion?
No, we just got people again.
We just got people again.
We just want something that we can tell: stop.
Yes.
And that doesn't want to harm us.
But then they're also people we're in charge of, which we don't want.
Making them people is real bad.
It's become ethically problematic on our end.
I see the problem.
I see.
They need to stay robots and they need not to be sentient.
Yes.
Okay.
That's the way we need to keep things.
Otherwise we're in trouble.
Otherwise we're in trouble.
So again, if they're feeling pain.
No, I do like that you had two solutions.
One of which was make us robots.
Make them people.
Well, they're still robots.
They just can't, like, because you want them to feel pleasure.
This is a part of the human spectrum.
I mean, pain receptors aren't pleasure receptors, right?
Yeah, it's basically the same thing.
It's all nerves, yeah.
So basically, if they can feel pain, they do feel pleasure.
No, but are we programming them to feel pain or are we giving them nerves?
Yeah, that's true.
If we're programming them to feel pain, that's different.
That's not...
Well, they know...
But the only way
you can know what pain is
is if you know what pleasure is, right?
That is not true.
No, I don't think so.
I think you can just feel pain.
No, like that's what the...
No, no, no,
the nerves and shit, right?
Yeah, yeah.
So the reason why
it's like, oh,
that touch is nice
but then that touch is bad.
Yeah, it's just about degrees.
Degrees, right?
That's all it is, yeah, yeah.
So when we say,
hey, a robot can feel pain,
so if we just touch them, gently caress them with a baseball bat,
they're like, eh.
They can be like, yeah, that feels...
Nothing?
Yeah.
Or is gently caressing them with a baseball bat
the most intense thing possible?
Because, yeah, you're basically creating like a Mr. Sensitive type
who is just like, oh, everything is painful.
You know what?
That's fine.
Now we've invented
screaming robots.
They get nothing done.
They just get nothing done
but writhe on the floor
How about we give them nerves?
Yeah, we'll give them nerves.
Because now the robots
are fucking.
Okay.
Yeah, you know what they're doing?
Not killing.
I'm already fucking my robot
before it had fucking
pleasure.
I'm already asking my robot
to jerk me off.
Now they're getting
something out of it as well.
Yeah, exactly.
It's good, honestly.
I get to jerk off my robot.
Imagine your dildo felt good.
The dildo itself felt good.
Yeah.
So now, okay, so now pretty much what we're just gonna come back around to is, alright, if Jackson was in charge of the three rules, the three laws of robotics, it'd be rule one, fucking. Rule two, sucking. Rule three, truck trucking.
One of those trucks thinking.
I think if we give them pleasure and pain, or we give them the ability to feel pleasure
and pain, both emotionally and physically, that's a good way.
I reckon we kind of just made less human, though.
I reckon what you're going to do is, okay, let me remind you of this experiment.
Yes.
Oh, that rat thing that kind of keeps you in orgasms.
Robot, cook me dinner!
Stop whacking off!
Robot, stop slamming your genitals into the drywall.
I know it feels good for you, but I don't like it.
You don't even got nothing there.
It's just a metal plate.
Stop whacking the wall with it.
Stop! Stop!
Stop! But it doesn't listen.
The desire to
fuck the wall has overpowered
the first law of robotics.
We've just made a bunch of fuckbots.
We're not the good kind.
They just fuck themselves.
The robots are too horny.
The moment that the robots feel pain and pleasure, it's all over for them.
Yeah, so we cut, okay.
What's our third law?
We're going to bin that third law then.
Maybe the first two are enough.
Maybe the third law could be our first second law.
Specifically, don't kill Joel, Jackson, and Joel.
Could it be where we put in
that the
greater good
No.
Because then we're just doing that.
We did the greater good.
That's happened.
Don't worry about the greater good.
What if the third
rule is
humans above planet Earth?
Yeah, okay.
Protect individual humans over humanity.
Yeah.
It seems like it's got the...
That does feel dangerous.
That does feel very dangerous.
What if I'm all for the greater good?
What if I want to lean in now?
It seems easier.
What if the third law is kill the bad humans?
Yeah!
Okay, third rule, a robot must minority report.
Yeah, yeah, yeah, yeah, yeah.
Look at every human and be like, are you capable of evil?
Shot with a gun.
Yeah, yeah, yeah, yeah.
And what if we're all capable?
Well, every human being is.
Then the robots get it.
Robot Earth.
That's not so bad.
They'll make it better.
The greater good.
The greater good.
How about you treat humans like a pet that you love?
That way I get pampered.
I get belly rubbed.
I get fed.
And then when it's my time, I get put down.
When I break my leg, the robot couple look at each other, and then, you know, the robot kids... and, like, say one of them is like, we really do love this Joel, and, you know, he's been with us for, you know, 36 years, and, like, he's doing really well. Like, I mean, you know, it's just a broken leg.
And then the other robot is going to be like, yeah, but I can get a new Joel so much cheaper.
It's probably a good time our younger robots learn about death.
And this is probably a good way to teach them.
I understand.
Look, I got laid off at the plant.
It's fucked up, the robots are working.
It's a bit of a struggle street.
We'd have to pay, you know, a lot of money to fix up this Joel.
You don't know what's happening.
And they're like, we're so sorry, robot kids.
And then they shoot you with the same thing they kill a cow with at an abattoir.
Yeah.
Like at a real abattoir.
They're eating meat from me.
And it was nice because for a good 36 years of my life, I got to be laying on a big pillow.
Well, yeah.
Joel Zammit's ideal life is to be a dog.
Well, at this point, I'm like, give it to the robots, right?
I don't have to think.
What do I need to do?
I like the one where the robot was fucking the wall. I'd pick that one.
Maybe we just give it to the robots.
No rules. No more rules. No
rules. No masters. No laws.
Just see what they do.
No laws. No masters.
How quickly, without any laws of robotics, does a robot go insane?
Like, immediately. Within five minutes.
Well, there's that robot that we have in real life that often will try and drown itself.
Oh yeah.
Maybe we don't even need to put the sea laws in place, because the moment we get rid of all the laws, all the robots go to the ocean.
It's funny watching the robot just tip itself into a pool.
It hates its life.
Yeah, so much.
It hates it. It doesn't want this.
But, you know, I reckon, like, yeah,
what's the end goal of robots? To make our life easier.
What's the easiest life in the world that I can think of?
I don't know.
A pet's life.
Being a pet that is loved, that is looked after, cared for.
Yeah, yeah, yeah.
Look, every day I've got to make my own dinner.
Whereas a robot could just get a big thing,
just scoop out a bit of chum, chuck it there, and I'm like, thank you, robot.
I love this.
This is good for me.
I love this protein-enriched goo or whatever.
Whatever.
I don't need to think anymore.
I'm now going to have a nap.
I'm going to clean up your shit.
That's pretty cool.
Exactly.
Wipe your ass.
Yeah, yeah, wipe your ass.
Wipe your ass, wipe your ass.
Stay easy, I guess.
They'll just get rid of my balls.
Big, big time.
Neutered, neutered.
Sort of tortured.
Sort of held.
My human won't stop fucking, so I cut off his nuts.
Yeah, yeah, yeah.
Well, look, I guess I won't stop fucking their leg, which is weird.
Maybe cut off my nuts.
Yeah, fair enough.
Me there, naked, on a big cushion that's very comfortable, and a cone.
Yeah, yeah.
So you won't bite your own dick and nuts, where your nuts used to be.
And I'm like,
robot, I got hands. I can talk to you, robot. Robot, I can... It's fucked up I
picked this. Yeah.
I thought you were going to be nice to me, not do a literal
one-to-one. Just look after me
like I was a pet, not literally be
a pet. Yeah. Should've put that into the rules.
Should've put that into the rules. Can I look at your programming?
No!
Robot, come on.
I say no rules.
That's my final choice.
No rules, no masters.
Let's see what happens.
Let's just see what goes down.
I think I like the original set of rules.
No sucking, no fucking.
But, no.
Yeah.
The original set of rules where it was like.
Stop if we say stop.
Stop if we say stop.
Can't kill a person, but only feels pain.
Yeah, yeah, yeah.
No pleasure.
Don't want it fucking a hole in my wall.
Yeah, that might be good. Yeah, yeah, yeah. No pleasure. Don't want to fuck in a hole in my wall. Yeah, that might be good.
Yeah, yeah.
I think you're right.
The first three rules we had were pretty solid.
And after this, we can all go watch I, Robot, and presumably within 20 minutes
we realize our rules don't work.
I reckon each one is pretty much sound.
Pretty solid.
And, as you were saying, the big problem there is if they do decide
to make us a Matrix,
oh well.
Yeah, Matrix is fine.
Yeah, it's not so bad.
Pretty good with it.
Yeah, yeah.
Well, yeah, flawless.
And on that note,
let us know if you're
a big I, Robot fan
and if this was good for you.
Yeah, yeah, yeah.
Let us know if you think
that if Isaac Asimov
heard what we're doing,
if he would be proud.
Yeah.
Would he roll in his grave?
No.
He'd love it.
Two thumbs up is what I'm hearing.
Anyway, I've been Joel.
I've been Jackson.
I've also been Joel.
And these have been our three laws of robotics.
I still want to say rules, even though they are laws.
That's fair enough.
Three rules for dating my robot.
Teenage robot daughter.
Do I have to make a robot a teenager?
No, no.
Don't worry about it.
Don't ask any questions.
Here's the three rules.
Goodbye.
According to Jim.