Armchair Expert with Dax Shepard - Ken Goldberg (roboticist)

Episode Date: January 8, 2025

Ken Goldberg (Why Don't We Have Better Robots Yet?) is an award-winning artist, roboticist, and engineering professor. Ken joins the Armchair Expert to discuss being born in Nigeria, growing up in the rough-and-tumble City of Brotherly Love, and how that taught him not to take things lying down. Ken and Dax address the elephant panties in the room, how a course he took in 1981 began his trajectory in robotics and AI, and the tragic archetype of Pygmalion and the hubris of falling in love with your creation. Ken explains the Czech etymology of the word "robot," why we don't have better robots yet, and how he stays optimistic doing a job predicated on failure.

Follow Armchair Expert on the Wondery App or wherever you get your podcasts. Watch new content on YouTube or listen to Armchair Expert early and ad-free by joining Wondery+ in the Wondery App, Apple Podcasts, or Spotify. Start your free trial by visiting wondery.com/links/armchair-expert-with-dax-shepard/ now. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript
Starting point is 00:00:00 Wondery Plus subscribers can listen to Armchair Expert early and ad-free right now. Join Wondery Plus in the Wondery app or on Apple Podcasts. Or you can listen for free wherever you get your podcasts. Welcome, welcome, welcome to Armchair Expert, Experts on Expert. My friend is here today. Yes.
Starting point is 00:00:19 Ken Goldberg. You'll hear our origin story, but I met him somewhere. I was judgmental of him. Then of course I fell in love with him. And now I'm just smitten with this gentleman. Ken Goldberg is the William S. Floyd Distinguished Chair in Engineering at UC Berkeley and an award-winning roboticist, filmmaker, artist,
Starting point is 00:00:38 and public speaker on AI and robotics. Now really quick, I've read a couple comments that people are over AI, and I get it. People feel a little inundated; it's the topic of the day. A, this isn't a very heavy AI talk, and then B, this is a million times more playful than you could ever imagine robotics could be.
Starting point is 00:00:59 I'll also add that he has an art exhibit that is going until March, don't wait till then, go now, at Skirball, if you live in LA or are visiting, Skirball Cultural Center in LA, and it's called Ancient Wisdom for a Future Ecology: Trees, Time, and Technology. Very, very cool art project with these tree rings that are gorgeous and very creative.
Starting point is 00:01:22 Ken Goldberg, I love you. I think y'all will love him too. Please enjoy. He's an armchair expert. He's an armchair expert. He's an armchair expert. He's an armchair expert. Ken, take a seat. Get comfy. I prepped Monica by saying, you're going to really like my friend Ken, if Fred Armisen was a roboticist, this would be Ken.
Starting point is 00:01:56 And I think maybe am I unique in that comparison or have you ever heard that before? I remember you saying that. I don't hear it a lot, but I take it as a compliment. You should. He's one of our favorites. We love him. Really? Oh yeah. Very unicorn-y, like you.
Starting point is 00:02:09 Unicorn-y though, does that mean rare? Well not corny, rule out the corny part. Focus on the unicorn part of that. It is mixed messages because he calls people unicorns he likes, but then he also says he doesn't like unicorns. So it is tricky, there's confusion. You're right to be confused. Okay, all right, because I don't know if you meant
Starting point is 00:02:26 rare as a human being. Rare and special. Rare and special, okay, I'll take that. That's true. Rare, unique, special, colorful, vibrant. Playful. Oh, our favorite word. Love it, good.
Starting point is 00:02:37 I think we should start with how we met, because I would love to hear your perspective. I have a very specific perspective. And I don't even know if I let you in on the full details. Well, I'm curious now. So you and I were at a conference, and it's the kind of conference I would have never imagined getting invited to.
Starting point is 00:02:52 There's a lot of people there like yourself, professors and stuff, and I'm a- Smarty pants. Smarty pants, billionaires. A fun group, actually. And we're walking into this event, and they are very militant about everyone wearing name tags, as they should be,
Starting point is 00:03:07 because everyone there thinks everyone knows their name, but people don't know each other's names. So I see this guy with crazy hair, and he doesn't have a name tag on. Fucking the system. I'm like this, which is funny, because I should immediately love that. I should go, yeah, fuck these name tags.
Starting point is 00:03:22 That's my essence. But for some reason, because I've complied, I'm thinking, who does this guy think he is, that he's so famous he's the only one here that doesn't need a name tag? So I'm immediately a little triggered. And I say to Kristen, I'm like, who's this guy with the wild hair that doesn't have a name tag on him?
Starting point is 00:03:36 And then by luck, somehow I hear your name, Ken Goldberg. And then I immediately go into the bathroom, okay? I go into the bathroom before the little event starts. You don't know any of this. What? Don't worry, you're not being led in a bad direction. I go into the bathroom and I Google you and I see robot professor at Berkeley.
Starting point is 00:03:54 And I immediately am like, that's a cool job. Okay, so he's not a billionaire who thinks he doesn't need to wear a name tag. This guy's just kind of an absent-minded genius maybe. Yeah, because I didn't know about the name tag. That says so much more about Dax than it does about you. These are all my shortcomings and character defects. But they work out beautifully because then, 20 minutes later,
Starting point is 00:04:18 and most of the seminars we were sitting through were very AI heavy. And I have a chip on my shoulder about AI. So then now I'm standing next to you randomly. And this is probably where I would enter your life story because I just lean over to you and I say, you're a robotics professor. Yeah. And you go, yeah, these robots are like so far away from doing our laundry and working on our car for us or doing anything really. I was like, they're saying AI's gonna take over everything, you're gonna be a leisure class,
Starting point is 00:04:47 what are we gonna do with all this? And I'm like, where are the fucking robots? And you go, oh, I'm so delighted this is your question. All right, so I should tell you my side of this because, oh, I'll just wrap it up by saying, within 30 seconds of talking to you, I'm like, oh, this is my favorite guy here. By a long shot, I hope I'm at every dinner with him.
Starting point is 00:05:04 And then we since developed a friendship. Yes. Okay, I remember that I was in this place also a little intimidated because there were a lot of A-list people, and I was sitting there and I forget who it was. He was on the stage and you made a comment and said something about how he looked really sharp. Pharrell.
Starting point is 00:05:22 Pharrell, yes. I think I said like he was really dazzling. Exactly, yeah. And I just love the way you said that. I thought it was such an unusual thing to say. It was spot on, but it was just the kind of thing that nobody would normally say. Yeah, unicorn-y.
Starting point is 00:05:35 Very unicorn-y. And so I think it was after the lunch or something, I saw you standing over there and I just went over and I said, hey, I love that comment you made. And then we started talking, that's how I remember it. I didn't know anything about you. You must have known Kristen though. No, we were just having this fun conversation
Starting point is 00:05:53 and you guys were so charming. And then Tiffany came over. Your wife. My wife. Also dazzling and unicorn-y. Yes, we walked away and she said, you don't know who they were? And I was like, no.
Starting point is 00:06:04 Of course not, I hear about important stuff. And then we had several great conversations at that thing. But it was really, really comforting to hear you say that as someone who is an authority in the space, because I've heard many people lecture on AI and I'm hearing all of the... What are you peeking at? Underwear on the floor? Oh my god. Okay, I gotta walk, so sorry Ken. But obviously this needs to be addressed. That took me a second. Okay, so here's what happened.
Starting point is 00:06:38 I just put the pieces together. There's an explanation. There is. Monica and I did a commercial yesterday, as I told you. You told me that. When I arrived, I changed my clothes and I put them in a bag and I brought a bag of extra shoes and pants they asked me to bring.
Starting point is 00:06:51 And then I threw this sweater in there and then that underwear was in there and then I just threw on my sweater just now. And clearly my panties were attached and now they've fallen off. Wow. So for the viewer, I would be so angry if I didn't get to see this.
Starting point is 00:07:07 Oh my God, we don't need to see that. You have to if you were watching and you were like everyone's seeing these panties and I'm not. Wouldn't you throw your computer out the car window? I love it, full disclosure. Okay, so these are the offensive panties. They have elephants.
Starting point is 00:07:20 They do. They're quite nice. That's MeUndies, a former sponsor. Are you being polite? That was quite nice. Or do I now have a Christmas idea for you? Well, actually, yeah. I'm gonna buy a pair of those.
Starting point is 00:07:32 MeUndies is a great brand. Really? It's a brand? Okay, I'm always looking for good ones. Very comfortable, very playful. It's almost as if, if they were a current sponsor, we planned all that. That would be great. Placement. Payola.
Starting point is 00:07:42 Placement. Paola. Sorry, I just had to call that out. Yes, the look on your face, I thought there was maybe a squirrel under my chair. Well, it was a little surprising. Kind of thinking like what else was going on. Right.
Starting point is 00:07:56 Yeah, yeah, there was some implication. This is video, so I know people can see that there's something on the floor, so I had to say it. Glad you did, because if you had just not said it, it would have been sort of like this lingering presence. An elephant in the room, if you will. An elephant in the room!
Starting point is 00:08:09 Oh my god, did they not feel really planted? Oh my god, that is really, oh my god. The elephant panties in the room. Oh man. We're not gonna top that in the episode, we should wrap it up. Okay, so back to AI, everyone's quite scared, and I think there's a lot of reasons to be scared,
Starting point is 00:08:26 but also I think maybe we're a little more panicked than we need to be. I just want you to be a kind of a comforting force. Oh, good, good, good. So we became friends and you've been over and we love you and your wife, and you're also an artist, so you're impossibly interesting.
Starting point is 00:08:38 Let's start though with, of course you would be born in Nigeria. Oh. Is that where you were born? I was. Of course. Of course. All the best unicorns are.
Starting point is 00:08:48 How did that happen? So my parents were idealists during the 60s and they were at Penn in Philadelphia and they were going on civil rights marches and things like that. So they wanted to continue that idea of doing things for civil rights. So when they were graduating, they wrote to various people in Africa and they said, we'd
Starting point is 00:09:09 love to help. And so one person who ran a school there, and he's actually quite famous in Nigeria, Tai Solarin, he invited them to come to his school and work for two years. So they basically got over there and there was no running water and no electricity when they got there. So it was very rough and they lived kind of under these circumstances. My dad taught physics and my mom taught English. They were graduate students at Penn or undergrad?
Starting point is 00:09:35 Undergrads, they just finished their undergrad. Also they were ahead of the curve because you were born in 61? Yep. So the civil rights movement in its full velocity is later. Yeah, that's a good point. It was starting. Philadelphia, the city of brotherly love,
Starting point is 00:09:48 has a lot of integration, great history there. And so they were starting there. But that was also around the time of Nigerian independence. There was a real movement across Africa. I was very proud of them. I'm still proud of them for doing that. Were you delivered at a hospital? Yeah, so I always thought it must have been an accident
Starting point is 00:10:05 because like, why would you do that? Who would plan? Yeah, and so I never asked because it was like that elephant in the room, I just didn't want to know. But a couple of years ago, my mom said, we wanted to have a baby because we had all this time together
Starting point is 00:10:17 and we knew it would be a time to focus on you. So I was born in a hospital nearby in Ibadan, which is about an hour from this village, but I have a really big vaccination mark from that. My father had that one, right? Is it the size of like a quarter? Yes. And indented?
Starting point is 00:10:33 Yes. Yeah, and it's a specific vaccine that would do that, I think. I think you're right. I don't know what it is, but yes, exactly. I would gaze at it on my father's shoulder all the time. Looks like someone put a cigar out in us. Yes, that's it.
Starting point is 00:10:44 That's a great way to put it. That's exactly what it is. Oh my God. What if that was the vaccine? The doctor just lit up a cigar. How long were you there as a baby? Just six months. And did you get any kind of citizenship out of that deal?
Starting point is 00:10:57 No, I looked into that too because I thought it'd be nice to have a dual citizenship. Yeah, they're getting hot water out there. Yeah, yeah. It's always good if you need to flee the country. No, but apparently you can't have both. They don't allow it.
Starting point is 00:11:08 They're like, fuck you, you are not a side dish. You can't have both, yeah. Or the entree. Right. So then you do grow up in, I guess, would it be a suburb of Philadelphia? Yeah, Bethlehem, Pennsylvania, steel town. Right, Bethlehem steel.
Starting point is 00:11:21 Yeah, so my parents come back there because my dad was a metallurgist. Well, this is fascinating. In addition to being a physics teacher? Yeah, well, physics was what he could teach as a, he was an engineering undergrad. So then he went back and he actually got his PhD at Ohio State and then we moved to Bethlehem.
Starting point is 00:11:37 Bethlehem was known for where the time and motion studies were done. There's a whole history of scientific management. You know about this, Frederick Taylor? No, but is this to increase productivity through the scientific method or something? Yes, yes, yes. Oh, okay, tell us.
Starting point is 00:11:51 All right, so this is fascinating. You've heard of time and motion studies, you know, where they have a stopwatch and they would time people doing their work. I hadn't heard of that. Oh. Just to see for productivity? Yeah, efficiency experts.
Starting point is 00:12:02 Uh-huh, okay. This was very big in the early part of the 20th century. Datafying. Yes, making work scientific by quantifying it. So what they would do is they had all these things, mostly stopwatches back then, but they would time how long it took you to, say, carry a shovel of ore from one end of this lot to another.
Starting point is 00:12:19 And then they would clock people, and then they would try and get them to increase their speeds. And so this guy wrote this book called The Scientific Management, something like that, and it was very influential on Stalin, apparently. Oh, really? Yes, but workers hated it.
Starting point is 00:12:33 Sure. For obvious reasons, right? Yeah, because you're getting enough efficiency out of me. Exactly. And that was this whole thing, was that you could increase the productivity of the average worker by a factor of two or more if you followed these methods, but it would squeeze the workers to the breaking point. So they didn't like it, but it became popular until unions came and pushed back.
Starting point is 00:12:52 But this whole wave was still around in the form of industrial engineering, which is actually the department I'm in at Berkeley, which used to do these kinds of studies where it was sort of how do you arrange your office to be the most efficient, or the assembly line to be most efficient for workers and now machines? Well, and by the way, and we'll just earmark this, among the AI accomplishments that I find most fascinating is their ability to make things more efficient. I know there was a server farm they let AI loose on and it had been studied forever and within hours it figured out how to make it like 30% more efficient or something crazy. Yeah, so energy efficient. So it could lower the amount of electricity it used, which is a really, undeniably good thing.
Starting point is 00:13:32 That's good for the environment and everything else. Yeah. Okay, so you're growing up in Bethlehem. Mom and dad didn't teach in your childhood? No, he was working at the research lab. Actually she did, she taught at the elementary school. We were very close. She was a great mom. I was lucky, it was a good town research lab. Actually, she did. She taught at the elementary school. We were very close. She was a great mom.
Starting point is 00:13:46 I was lucky. It was a good town to grow up in, but a little rough and tumble because you had to fight. Okay, great. So here's my guess, because I'm from Detroit. And so you have this enormous working class. Many of the folks had migrated up from Kentucky to fulfill these roles.
Starting point is 00:14:01 So you have this culture of pride. And yeah, violence was on the table at all times. Yes. So it's interesting you say the pride thing. I didn't know about of pride. And yeah, violence was on the table at all times. Yes. So it's interesting you say the pride thing. I didn't know about that, but the pecking order was all about fighting. And kids would call you out and say, I'll see you after school.
Starting point is 00:14:12 And you had to do it. Everyone would go watch. And there'd be a few additional fights for the people who got excited watching the first fight. Exactly. The most dangerous thing was watching one of those fights, because afterwards there was going to be a few more. Another fight break out.
Starting point is 00:14:23 I mean, I had both my front teeth knocked out. You did. In what grade? Like 10th grade. In school? Okay, so the story is that I was at a party and a girl asked me to take her home because she was having a fight with her boyfriend.
Starting point is 00:14:37 Oh, okay. And I was being a nice guy, I thought. And I drove her, and then I didn't even know who her boyfriend was, but she said it was Eddie. Okay. He was a very tough guy. Perfect name for him. Exactly. I was like, oh no, that's not good. I did not want to cross Eddie. And so the next thing I know, we had gone over, I dropped her at her friend's, and the doorbell rings. Oh, and it was Eddie. I came out, Eddie just cocked
Starting point is 00:15:05 Right out of the gate. Right out of the gate like a sucker punch and I remember it was snowing and all this blood on the snow. Yeah, and that was my front teeth. Now do you have the same thing I have which is we're both really lucky and we're running in circles that are mostly people that are college bound and stuff.
Starting point is 00:15:21 And I try to explain the level of violence that was kind of ever present. I can tell there's no connection to what I'm saying. And then I wonder, was it an era? Do you wonder if it's still like that in Bethlehem? Cause I'm curious, was that just our generation? I don't know, it's interesting. Cause it wasn't talked about, you know, we didn't report it.
Starting point is 00:15:37 I don't remember it even occurring to me to even tell anyone. You'd be teased? Yeah, like I wasn't gonna. Well that would lead to more abuse. Probably, so you just sucked it up and you took it. It was definitely rough.
Starting point is 00:15:49 Although, you know, it's interesting because now, the way it does come up, a few years ago I was in this academic setting and this guy double crossed me. And he basically said, well, we're going to do it my way. And I remember sitting across from him and I was really upset because I had put all this work into something and he was basically going to trash it and put somebody else in to take the credit. And I said, you don't know, but where I come from, I don't stand for that.
Starting point is 00:16:13 I said, I'm going to really... You've got your hands full. I'm going to get physical. Oh no, I'm not going to get physical. Somehow I can't remember actually the language, but I wasn't saying I'm going to hit you. I was going to basically say, I'm going to come back, I'm going to fight this.
Starting point is 00:16:27 Yeah, I'm not gonna be a pushover. Yeah, I'm not gonna just take this lying down. For better or worse, at least in my experience, you walk away with this weird paradigm, which is it's better to get beat up and stand up for yourself. Because if you don't, it's gonna lead to so much more suffering.
Starting point is 00:16:42 It's just an equation. You come to accept it and that's it. All right, so how do you feel about bullies? Number one pet peeve in life is, yeah, I really, I hate bullies, hate them. I was big, so I can't claim that I was some victim, but I also was a punk rock skateboarder, snowboarder. So I was alternative. Bullies, for me, it was a big, big thing. I've learned that the best way to deal with them was to stand up to them, even if they were bigger than you.
Starting point is 00:17:05 And then oftentimes, they would cave in. They were cowards. And or they would at least move to someone else who wasn't gonna stand up for themselves. They'd pick another guy. You only had to kind of do it once or twice. Oh, that's interesting, because there was a big reputation thing.
Starting point is 00:17:17 It was very weird. You had this whole pecking order, and so people knew not to mess with you, and you had to be in a few fights, and then people would lay off. Yeah, then you could exist. Okay, but now back to... Wild.
Starting point is 00:17:27 So you did engineering type stuff with dad. You were bonding with dad over that as a kid. But then you decided you kind of want to be an artist at some point? Yeah, because my mom was an artist. She would paint and she took us to this art school in the neighboring town and I really loved that. They were both into modern art and would take us to museums, like in New York
Starting point is 00:17:48 or Philadelphia. Philadelphia Museum, I have very fond memories. They discouraged you from pursuing that. I remember talking to my mother and saying, oh, I think I'm going to major in art. And she said, oh great, you can major in art after you finish your engineering degree. Sure, sure, sure.
Starting point is 00:18:04 It's a very immigrant parent thing to do, actually. So that's funny. Well, it was also because my parents had a lot of financial troubles growing up. And so it was hard because there were times when we didn't go on vacation for many years. I want to be really careful because I don't want to sound like we were suffering. Like there was some real poverty out there and we weren't facing that. But we had money problems and my mom and dad would fight a lot about that.
Starting point is 00:18:27 See, that's, I think, the most relevant aspect is, was it this concept in your life that every time it was brought up, you saw fear on the faces of the adults and it was the cause of fighting and it is this big elephant panties in the room. I think just once you have that association with it. So yes, there were a lot of people poorer than me, but I was raised by a single mom and I still
Starting point is 00:18:49 have the most unrealistic relationship with money to this day. It's just so grounded in fear that there's really nothing I can do to overcome it. Yeah, you know a simple thing is like whenever I look at a menu I'm always studying the price. Yeah, so I would never order the most expensive thing on the menu. And you could. Yeah, but it's interesting, because I'll contrast that with my wife,
Starting point is 00:19:09 and she'll say, what do you think of this dress? And I'll come over and I'll be like, what does it cost? Yeah, and she'll be like, oh. She'll be like, oh. She doesn't know. She doesn't know. Yeah. Right, she's evaluating if it's beautiful.
Starting point is 00:19:18 But the first thing I'm looking at is the price. Before I try something on it, I want to know if I could even afford it. Right. Yes. Even though you can. Probably, but it's still in that mind. Totally. Oh, how about this, without revealing your net worth,
Starting point is 00:19:33 let's just say, if you had a billion dollars, don't you think you'd probably still be the same way? Yes. That's what I'm saying, that it's an irrational relationship with it. It's not grounded in the facts at all. No, no, it's true. Another pet peeve is if I'm in a hotel or something
Starting point is 00:19:47 and they charge like $20 for a Diet Coke and then they deliver it and it's an additional delivery charge and a tip on top of it. And then it comes, you're gonna give a tip on top of that, so it's like gonna be $40. You're in triple digits for a Diet Coke. Yeah. That actually triggers a second issue I have, which is,
Starting point is 00:20:04 and we had an expert on talking about this bias of being taken advantage of, to be the sucker. So then I have two things going. I've got my financial insecurity, and then I have like, these people think I'm a fucking idiot. Like I'm a sucker, I'm gonna pay this much for a night, so it's a lot going on. So you do what they urge you to do,
Starting point is 00:20:19 and then you go to Penn, and you double major in economics, and in? Engineering. Summa cum laude in both. Wow. Yeah. That's wild. For a double major, that's impressive.
Starting point is 00:20:31 I was a double major and I was also summa. We won't say what the majors were. Because that might dilute what I'm saying. You were a double major too? Yeah. For the same reason, because I wanted to do theater. My parents were like, probably not.
Starting point is 00:20:50 You're going to need to do something else. You have a safety net. A realistic plan in place as well. You study abroad in Edinburgh. Yeah, which by the way, when I read Edinburgh today, I was like, isn't it Edin-burra? Yes, it is Edin-burra. Oh, it's Edin-burra, but we don't have an O at the end of that. I know, but that's the way it is. This language is madness. I know. But it is Edin-burra. Okay. Thank God. I thought I was insane. I've been saying that wrong for 30 years. You're absolutely right, but I'm glad you brought that up, because that was a huge turning point in my life. Yeah, tell me where in this eight-year schooling?
Starting point is 00:21:27 My junior year abroad, and also my dad was very sick. He had gotten leukemia because at the plant there were a lot of toxic chemicals and stuff, but then he went into remission, so he was feeling better, and I was so stressed with that whole thing, I wanted to take some time off, and Penn had this junior year abroad program. I had never left the country, actually, since I was a baby. Yeah, right, right. But I had set off on this
Starting point is 00:21:48 year-long adventure. Oh I had so much fun. Yeah, junior year, you're like 21, 20? 19. I remember that distinctly because I remember saying I'm 19 years old and I'm on my own, and I had a backpack and the Let's Go Europe, this big volume that was my Bible, and I would just travel as much as I could. Yeah, Eurail, the train schedule. Yep. Oh my God, I can go all the way to here?
Starting point is 00:22:12 One of the highlights was going to Morocco. Back to your continent of birth. Oh. Yeah. It's very interesting you say this because this is the story I always like to tell, which is that basically we have some friends that said, let's go somewhere really exotic, we'll go to Morocco.
Starting point is 00:22:25 It was over Christmas and we went to Spain, Madrid, and then we were taking the train down, and on the train, it was like all these soldiers, everybody was drunk, and it was really super fun, and we're having this blast going to the last stop, and it was packed. When we get there, we get on the ferry, and we all have to turn in our passports for processing.
Starting point is 00:22:41 And my mother had warned me, she said, you're going to Morocco, but you're Jewish. It's an Arab country and you should be careful. But I was like, oh, that's ridiculous. And so we get off the ferry and all my friends, there were about three or four of us, they had gotten their passports. I didn't get mine.
Starting point is 00:22:57 So we're waiting. And then it got stretched into like 45 minutes. And I was like, listen guys, I think there's gonna be a problem. Go ahead, I'll just go back. But I was definitely queasy about what would happen. Yeah, because you're already now on the side of it. We were on the other side,
Starting point is 00:23:10 but we're still on the ferry because we can't get off without the passports. So then this door opens, I'll never forget this, I can see it like it was yesterday. Across the back of the ferry, I see this guy walking over. He's like in a full keffiyeh headdress, very Arab looking.
Starting point is 00:23:24 Yeah, yeah, yeah, yeah. And he's walking toward us, and now I'm starting to think, oh, oh, she was right. Yeah. And then he holds up this passport and he says, Monsieur Goldbierge, you know, they're French, and I walked forward, and then he looks at me, he looks at the passport, he looks at me,
Starting point is 00:23:40 and then he throws open his arms, and with this big smile with gold teeth, I remember he says, welcome home. What? And I have no idea what he's talking about. Right? Oh, that's sweet. And then it hits me, just what you said,
Starting point is 00:23:55 it was the first time I had set foot on the continent of Africa. Yeah, since you were born, oh my God. Does he have any sense of that? It's in the passport that I was born in Africa, but I have no stamps or anything else. Oh wow. That is like very heartwarming.
Starting point is 00:24:09 It was wonderful. The delta between what you were expecting and what came may be the biggest delta of your life. Oh my God, yeah, exactly. I was ready for them to clap on the handcuffs or something. Welcome home with a hug is incredible. Also for people with backgrounds like the both of you, you're expecting the worst thing to happen,
Starting point is 00:24:26 so it's really nice to have evidence that it could go a positive way. Totally, totally. Sometimes it goes the other way. Yeah. So is it in Edinburgh? At some point you get introduced to, I guess, the concept of AI?
Starting point is 00:24:39 Yes, they actually have one of the few AI departments in the world there. I didn't know that. Is this Edinburgh University? Yeah, they had have one of the few AI departments in the world there. I didn't know that. Is this Edinburgh University? Yeah, they had connections with Alan Turing and all the early work in AI. So they had this department, and I remember going into this fair,
Starting point is 00:24:53 all the departments had these little tables, geography, art history, and I saw this little table with AI, and I was like, what? And so I walk over, and sure enough, they had a class that you could take in AI. And this is in 1987? 1981.
Starting point is 00:25:08 Oh no, you graduated in 84. 81. Yeah, that's early early. And I loved it. It was a great course. We had a little bit of robotics, a little bit of natural language, a little bit of computer vision, all these different aspects of AI. And it was so much fun.
Starting point is 00:25:21 So at that point in 1981, what was the robotics component of that course? Basically controlling these arms to move in certain ways. And the kind you used to see in like automotive assembly plants? Those claws. The big thing was like, how can you get it to just move around on a table or something like that?
Starting point is 00:25:37 Why do you think that was so exciting? Well, I loved machines like that. I guess it was partly my dad's influence. We had a go-kart when I was a kid. I was really into that and rockets, model rockets, were a big thing. And also of course I watched those shows like Lost in Space and things and I liked robots from that. So you then go and you get this PhD in computer science and then you teach at USC for a minute which is interesting. Yeah so I was there yesterday and it's actually so nice to
Starting point is 00:26:04 return. It was wonderful. Okay, the story is that when I graduated in 1990, there was a backlash to a lot of the over-optimism about robots. So there's this thing called AI winters. Yeah, a few people have talked about these. Yeah, we had Fei-Fei Li on, yeah, and they give up on it, then something happens, they come back to it. Yes, since maybe the year 2000, it's only been positive, no negative. So the students don't know. They've never seen that. But Fei-Fei and I, we've lived through it and it's quite dramatic when suddenly everybody decides it's not going to work and it's not useful and all the funding dries up. Yes. That time it was very interesting.
Starting point is 00:26:54 Japan was on the rise and everybody thought Japan was the future. There was a whole lot of hope that robots were going to do all the things we're saying today, and it didn't work. And so it was dismissed. It was a backlash. They were like, we tried it, but it didn't work. Exactly. Am I wrong?
Starting point is 00:27:08 Even Honda was one of the first big corporations that committed some real money to building a robot. That was later. But in 1990, I was looking at jobs in Japan. And that was my only option. And then this job at USC came along. I was so lucky. I was so happy that they hired me.
Starting point is 00:27:24 I don't know how you teach here and then you leave at any point. This is a fly trap, LA. The weather's just too good, especially if you're from Pennsylvania. Especially if you're from Pennsylvania. Yeah, it was quite good. Oh, and let's add, this is a sincere question.
Starting point is 00:27:36 The most shocking thing that occurred to me when I moved to California, I remember it so clearly, I was at a Carrows restaurant, which is like a Denny's. I'm at a booth by myself
Starting point is 00:27:57 That's just unavoidable where I came from and then he's just looking at me and I realized this guy's just looking at me And I'm looking at him we can do this here, that's California. I couldn't believe you could just look at somebody. Okay, so my version of this is that I was visiting California and we were driving from San Francisco down to LA. This was a couple years before. Someone had figured out that you could go to Esalen
Starting point is 00:28:20 in the middle of the night and just go in and experience the hot tubs. Esalen's this kind of crazy retreat, a hippie-ish vibe, nudity's welcomed, right? That's the vibe. Have you ever been there? No, but my father weirdly used to go there from Detroit. What?
Starting point is 00:28:33 Oh my gosh. Yes, yes. Well, it's been around for a long time, right? And it's on the coast. I remember we go and it's dark and there's nobody there. And I remember thinking, they're not going to open up in the middle of the night. What are you talking about?
Starting point is 00:28:43 So we were standing there and I was like, let's go. And then all of a sudden this mysterious figure comes and unlocks this gate. We go in and sure enough, a couple comes out of those shadows and it's these two beautiful women. Okay. And they come in there with it. So there's three guys. Are they clothed? Everybody's clothed at this point. But then we go in and they say, well, that's the thing because they say you can leave your clothes here and we'll walk down to the thing. I'm sort of like, I don't know what they mean by that. So I take off everything but my jeans. Okay. I go down there. As soon as I get down there, it's on the coast like right on the cliffs.
Starting point is 00:29:15 I realize everyone's naked. Yeah. And so they all get in so I take my pants off and I jump in and I'm sitting there at this moment with the stars above me and everything else and thinking, this is where I want to be. I'm never leaving. I'm never leaving, this is California. We're talking about culture shift. California, baby.
Starting point is 00:29:31 Pennsylvania. After being there for an hour or so, we get up to leave and I pick up my pants and I had put them down in a place where the water was rushing through so they were completely soaked. Soaked. Seawater dungarees.
Starting point is 00:29:43 For the rest of the road. Exactly, my only pair. So how do you end up at Berkeley? I love Berkeley because I love the counterculture part too. When I was a kid there was also a head shop in Bethlehem. So I got a little taste, like the Furry Freak Brothers. And I listened to the Grateful Dead. Where you go buy a water pipe and some tie-dye clothing.
Starting point is 00:30:03 Yeah. Countercultural. They also had those blue black light posters. Yeah. Remember those? Yeah, and they also had a smell to them, remember? Yep, patchouli kind of. Patchouli, exactly.
Starting point is 00:30:15 I don't like them, but I'm glad you like them. Oh, interesting. Okay, all right, well that was a big thing for me. Like it was somehow a little bit illicit and off-limits, and I found it very interesting to see what was going on there, and I also loved the Beat Poets and all the rebels, so Berkeley was a big attraction. Stay tuned for more Armchair Expert, if you dare.
Starting point is 00:30:48 These are just a few of the reasons that some trips are better in an Airbnb. We're going to tell you more about why you should choose Airbnb for your next trip later in the show. Stay tuned. On January 5th, 2024, an Alaska Airlines door plug tore away mid-flight, leaving a gaping hole in the side of a plane that carried 171 passengers. This heart-stopping incident was just the latest in a string of crises surrounding the aviation manufacturing giant, Boeing.
Starting point is 00:31:16 In the past decade, Boeing has been involved in a series of damning scandals and deadly crashes that have chipped away at its once sterling reputation. At the center of it all: the 737 MAX. The latest season of Business Wars explores how Boeing, once the gold standard of aviation engineering, descended into a nightmare of safety concerns and public mistrust, the decisions, denials, and devastating consequences bringing the titan to its knees, and what, if anything, can save the company's reputation now. Follow Business Wars on the Wondery app or wherever you get your podcasts.
Starting point is 00:31:49 You can binge Business Wars: The Unraveling of Boeing early and ad-free right now on Wondery+. I like the physicist history there as well.
Starting point is 00:32:18 And that I've always admired, from Oppenheimer. They discovered all these elements and all these Nobel Prize winners. They're as rad as they are fringe, but they're crushing it. They're like doing it in a totally untraditional way, but they're still bringing in the results. Rigorous. You know, it's not about just being space cadets. Being on acid all day. Right.
Starting point is 00:32:35 Because there is a sort of Berzerkeley kind of idea, right? But it's not, because you have to be grounded. Berkeley is not the easiest school to attend because it's big and it doesn't hold your hand. I've driven up to just look at it and you don't get the sense of, oh, I'm entering this secure border. It's very shaggy. There's no real border.
Starting point is 00:32:52 There's no gates. By the way, USC has these gates. You have to have an ID and everything. Nothing like that at Berkeley. A hippie vibe is still floating around. It's definitely got a lot of coffee shops, which I love that. But the rigor is that you have to work hard
Starting point is 00:33:05 and you have to be willing to get what you want. Like you can't get into this class, but you go and you camp out in your sleeping bag next to the professor and talk your way in. You have to be motivated, self-motivated. Motivated and grit. Angela Duckworth. That's right.
Starting point is 00:33:18 Yes, exactly. She's the queen. I think that word really sums it up. Students, when they ask, should I come to Berkeley or not, or even faculty, and I say, well, if you want to be comfortable, maybe not. Right, right. You know, it's not the most comfortable. When does automation arrive? It goes way back; there were these ancient figures, automata, that could sort of move their arms and stuff. So that has a long history. But they were functional or these were ideas drawn? No, they're functional. Wow.
Starting point is 00:34:08 Yeah, with like levers and chambers that would fill with fluid and then they would raise their arm. But obviously they're using some kind of hydraulics or something. Yeah, it wasn't steam per se, but it was just like liquid that would fill up a chamber, but they had simple mechanisms.
Starting point is 00:34:22 That goes up through the Greeks, and then there's all these stories like Pygmalion, you know, about a statue who comes to life, and the story, of course, there is that he falls in love with the statue. I don't know Pygmalion, so help me. Pygmalion is a really good story to know. It's one of the Greek myths, and it's a sculptor
Starting point is 00:34:39 who's renowned for being incredibly skilled, and he, at one point point sculpts this beautiful woman, and it's so lifelike that he falls in love with her. How could he not? How could he not? But then it has a tragic end because he won't eat, and it never returns his affections. It's literally the movie Her.
Starting point is 00:34:55 Yeah, yeah, yeah. That's this archetype. It's Frankenstein. It's the same story that you see over and over again. Falling in love with an inanimate object. Falling in love with your creation. Oh, your creation. Oh oh that's a good detail. Yeah, it's hubris.
Starting point is 00:35:07 Yeah, oh, yeah, that's juicy. That's so deeply rooted in Western culture that we're warned against these kinds of things. It's overstepping to try to take on this God-like role. It's challenging God, being a creator. Exactly, so there's a lot of this idea that that's gonna come to a bad end. And I think that's largely behind all these fears. It underpins a lot of our current fears. Totally.
Starting point is 00:35:26 It underpins a lot of our current. Totally. Even if we don't believe in God, we have some sense that we're not supposed to be tampering. Let's just use nature as God. That it's in the back of our minds that if we do this, there's gonna be some price to pay. It's gonna run amok.
Starting point is 00:35:40 And that's the story with Frankenstein, right? It runs amok. And then the Golem story, it precedes Frankenstein. In the 14th century, there's a rabbi, and there were a lot of pogroms in this little village. So he makes a robot out of clay, just a being out of clay. He puts these words on its forehead that bring it to life. And then it goes around and it basically defends
Starting point is 00:36:01 against all the bad guys and it defends the community. But then when it's done, he's like, well, now why don't you go fetch some water for me? Then he goes to sleep. Anyway, he wakes up and he's drowning because the robot just keeps fetching water over and over again and he has to stop it, but he doesn't know how. So he then reaches up and he wipes off the words on the forehead and the golem collapses on top and kills him. Now that one is specifically, I hear this antidote all the time
Starting point is 00:36:28 that you could deploy AI for this simple innocuous task, but it could determine to execute that task, it would be best that we kill all the humans. This is the very common one that goes around right now. Right, if you want to save energy, get rid of these humans. Yeah. If you're not careful.
Starting point is 00:36:44 The most efficient way is to do this. It has such a myopic command it's following. Right, oh great. Okay, so then when do we get into something that is actually practically helping us? I'm an idiot, I'm thinking Henry Ford, that kind of stuff. No, no, it's actually really good.
Starting point is 00:36:59 So the Industrial Revolution with the invention of the steam engine and all those things, all the machinery starts coming out. And so Henry Ford is definitely part of the steam engine, all those things, all the machinery starts coming out. And so Henry Ford is definitely part of the assembly line and the car, but robots actually also start at Ford. There are some very early robots in the late 50s, early 60s. They're like a programmable machine.
Starting point is 00:37:18 And so you can basically tell it to go from point A to point B. And so it's very primitive, but they're in like the World's Fair and people start talking. Oh, and by the way, the word robot is coined in 1920. By a sci-fi writer or? Yeah. By a sci-fi writer. It's actually a playwright in Czechoslovakia. So interestingly, it's right around the time of the pandemic, the 1918 pandemic. And also, I think significantly, Sigmund Freud wrote this essay called The Uncanny in 1919.
Starting point is 00:37:45 So a year later, this play comes out about, essentially, robots. That's the first time this word was ever used. Yeah, robot, which means worker, or forced worker, in Czech. Now the 60s Ford robot, was it being employed to move objects or was it like a CNC cutting device?
Starting point is 00:38:05 It's much like a CNC, computer numerical control. So they were able to program these spinning drill bits with three axes, right? So it could say go up, go left, go right. And through those little three-axis movements, it could carve out the perfect shape of a part from a block of agate. Steel or something.
Starting point is 00:38:24 Steel, whatever. So you go like, oh, I want this rim for your car. I start with this big block of aluminum. And this thing, just with a drill, it can go through. It has all these steps programmed in, and it just, like a sculptor, shaves off everything you don't want. And it's programmable, so then you can make a whole bunch of them over and over again. And that's still used, by the way, very heavily. And then what's the next big leap forward?
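To make the "it has all these steps programmed in" idea concrete, here is a minimal sketch, with hypothetical coordinates and a print statement standing in for a real motor controller, not any actual machine's API: the part program is just a stored sequence of three-axis moves that gets replayed identically for every part.

```python
# A toy sketch of the CNC idea described above (hypothetical coordinates,
# no real controller API): a part program is a stored list of 3-axis moves
# that the machine replays identically for every part.
from typing import List, Tuple

Waypoint = Tuple[float, float, float]  # (x, y, z) tool position in mm

PART_PROGRAM: List[Waypoint] = [
    (0.0, 0.0, 5.0),     # start above the stock
    (0.0, 0.0, -1.0),    # plunge the cutter 1 mm into the block
    (50.0, 0.0, -1.0),   # cut along x
    (50.0, 30.0, -1.0),  # cut along y
    (0.0, 0.0, 5.0),     # retract
]

def run_program(program: List[Waypoint]) -> None:
    """Replay the stored moves; a real machine would drive motors instead."""
    for x, y, z in program:
        print(f"move tool to x={x} y={y} z={z}")

# Programmability is the whole point: the same list makes part after part.
for part in range(3):
    print(f"--- part {part + 1} ---")
    run_program(PART_PROGRAM)
```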
Starting point is 00:38:38 so then you can make a whole bunch of them over and over again. And that's still used, by the way, very heavily. And then what's the next big leap forward? There's a lot of fear around that time that robots are gonna take over. In the newspaper, there's all these articles that they're gonna do all the work,
Starting point is 00:38:51 but that doesn't happen. And the first robotics conference is in 1984. Then there's a big research field that starts to grow around robotics. But then it started really taking off in factories, especially automotive. The big thing that it's used for is welding and spray painting. The welding is awesome.
Starting point is 00:39:09 Yeah, the welding is fun because you get those sparks. It's like little pinches just coming boom, boom, right? And welding sheet metal is very hard to do. For humans. You burn right through it so easy. Right, so it's very delicate, but then you're just basically doing the same thing over and over again. Now the big frontier is learning and AI. Really? Yeah. To get robots to pick things up. What's it called? Macavarker's paradox, what is it? Oh yeah, yeah, Moravec's.
Starting point is 00:40:06 There we go, there we go. Okay, so Moravec was actually at CMU when I was there. He was this very eccentric guy. And he wrote this book and he was saying, it's a paradox that what's easy for robots, like lifting a heavy car, is hard for humans. But what's easy for humans, like stacking some blocks, that's hard for robots, like stacking some blocks,
Starting point is 00:40:25 that's hard for robots. And that's still true today. Yes, you have a great TED Talk. I urge everyone to watch it. It's called... Where Are The Robots? Where Are Our Robot Butlers? That is not the title of it, but that is really close.
Starting point is 00:40:41 I found it. Why Don't We Have Better Robots Yet? That's the title of your TED Talk and it's very, very good. Yeah, so it's incredibly hard for a robot to grasp things. There's a bunch of reasons, right? Yeah, it's very counterintuitive because for humans it's so easy. But we've sort of evolved over millions of years, just like dogs and crows. Crows are able to pick up things amazingly.
Starting point is 00:41:03 They can put coins in slots and they can do eight-step problems. They're far more dexterous than robots. With robots, there's a lot of uncertainty in the environment. And even if you tell a robot to go to one specific spot, because of the motors and levers and gears that are in it, it won't go to that exact spot. You want it to put its jaws or something at a specific point to grasp this cup, it'll be slightly off,
Starting point is 00:41:32 and that will cause it to miss, drop the object. Right, because every single movement's going to have some margin of error, and then that's going to compound. The more movements you add, all these different little margins of error start stacking up. Exactly. And then the other is sensing. So we can take a high-resolution picture of an environment, like this room.
Starting point is 00:41:51 But there's no sensor that can give me the depth, the three-dimensional part of this room. What if you used a 3D camera, so you had bilateral vision? There's errors, there's little noise in those things. If you look at the result of that, there'll be a depth map, which is like a 3D camera image, but you'll see lots of noise and imprecision and mistakes in those. And those are inevitable. There's no camera that really works reliably for 3D.
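To put a number on the compounding they describe, here is a minimal sketch, with made-up error magnitudes and no real robot involved: every commanded move lands slightly off, and over a chain of moves those small offsets stack up past the tolerance a grasp needs.

```python
import random

# Toy model: each move adds a small random positioning error
# (the 0.5 mm figure is illustrative, not from any real robot).
def final_miss(num_moves: int, error_mm: float = 0.5) -> float:
    """Distance from the target after chaining num_moves imperfect moves."""
    offset = 0.0
    for _ in range(num_moves):
        offset += random.gauss(0.0, error_mm)  # this move lands slightly off
    return abs(offset)

trials = [final_miss(num_moves=10) for _ in range(10_000)]
print(f"average miss after 10 moves: {sum(trials) / len(trials):.2f} mm")
# If the gripper jaws must land within ~1 mm to hold the cup, many trials miss.
```

Noisy depth sensing behaves the same way: error in the depth map just adds another offset before the arm has even moved.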
Starting point is 00:42:13 Okay, I don't know if this is the time, but this is one of the parts I want to talk about. As I've been frustrated with the exuberance of AI, one thing that has occurred to me is that our fascination with ourselves seems to be about our intelligence. And that, in fact, is what AI is trying to replicate and/or surpass: our executive function, our problem solving, all these different things. But if you look at us as an animal, our motor control took 300 million years to evolve as mammals, and our big brain and everything we're trying to replicate in the AI space started six million years ago when hominids
Starting point is 00:42:52 arrived. You have so much time spent evolving to do all these tasks that we think are just standard, and then we think this last-minute thing that took the least amount of time to evolve is somehow superior to that. So when I look at this, it's like, forget artificial intelligence, try to figure out artificial physicality. That's such a good way to put it.
Starting point is 00:43:11 And that's exactly right. And so if you look at that history, hundreds of millions of years of evolution to get to mobility and being able to manipulate, just the opposable thumb and all of those things. And so all these other things, like math, are relatively very recent. Yeah, we think that's the high water mark,
Starting point is 00:43:27 but I think the most impressive thing is us moving through time and space, smelling, these five senses, touching. I would imagine if you could quantify it, that's like this much, and then our intelligence is like this much. That I think is helpful for people to understand why we've made all this progress in these, quote, hard problems like playing chess and Go.
Starting point is 00:43:44 Yeah, but we really haven't made much progress in just being able to like clear the coffee table. Okay, so then my question is, is that a software or hardware problem? So one of the things I think about, it must be so hard when you're trying to design a robot, is you're limited to all of these pulleys to operate a hand the way ours moves.
Starting point is 00:44:05 And as much as we are pulleys, we're also not, right? Because the muscles are such a unique way to operate the pulley. It opens up infinite options of movement. Whereas these robots are really confined to kind of this, this, this, this, right? You're right. The muscles in the human body,
Starting point is 00:44:21 there are hundreds of muscles and bones, and they pull in all these nuanced ways, and we have the skin that's very complex. What's amazing is how much we don't know about human biology. We don't understand how touch works. Touch is incredibly complex. Like, we can feel things that are so small, they're much smaller than human hair. We can perceive up to very complex vibrations and other things. You add in temperature we can feel,
Starting point is 00:44:47 you add in moisture. Have you ever read the book An Immense World? Yes. You have! Oh yes. By Ed Yong. I love it. Holy fuck.
Starting point is 00:44:56 He's fascinating. I love that book. And you get into this mole that has this star shaped nose and so that's a touch sense. Definitely. And that touch sense can detect the movement of moisture in the sand it's exploring at a distance of like 12 inches. It actually can see through everything,
Starting point is 00:45:12 but it's not seeing through and it's not smelling through. It's touch feeling through. Yes. Oh my God. How would you replicate that with a machine? Exactly. Well, also humans don't even have the ability to do a lot of the stuff that these animals can
Starting point is 00:45:26 do. So which one are you even aiming for? Robotic touch sense is extremely primitive. What do they think would be the mechanism that could replicate it? Would it be electricity? What would it be? When I was in undergrad, I tried to use electricity to do that and it failed. It did not work.
Starting point is 00:45:43 But what people are using now is light. And they transform the touch into light. And so imagine that you have a little camera in your fingertip that is looking inside at a pad from the bottom. And so when the pad gets indented, you see the pattern of what it's touching. Okay.
Starting point is 00:45:59 So it's like a membrane, and above that, the membrane's being observed. That's exactly what it is. But what happens is the membrane gets rubbed off, or over time those sensors get deformed, and so it doesn't solve the problem. It's just the latest method that we're trying.
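To make the camera-behind-a-membrane idea concrete, here is a minimal toy sketch in Python. This is not the code of any real sensor (optical tactile sensors like GelSight work roughly along these lines): compare the live membrane image against a reference frame of the untouched membrane, and treat the difference as a contact map. All names, shapes, and thresholds here are made up for illustration.

import numpy as np

def contact_map(reference, current, threshold=0.05):
    # Toy optical-tactile readout: pixels where the membrane image
    # differs from its untouched reference frame are treated as contact.
    # Both inputs are grayscale images in [0, 1] with the same shape.
    diff = np.abs(current.astype(float) - reference.astype(float))
    return diff > threshold  # boolean mask of "touched" pixels

# Fake data: a flat membrane, then a small round indentation.
ref = np.zeros((64, 64))
cur = ref.copy()
yy, xx = np.mgrid[0:64, 0:64]
cur[(yy - 32) ** 2 + (xx - 32) ** 2 < 50] = 0.3  # pressed region brightens
print("contact pixels:", contact_map(ref, cur).sum())

Real systems go further and estimate the full indentation depth from the illumination pattern, which is exactly why the membrane wear he mentions is such a problem: once the pad deforms permanently, the reference frame no longer means anything.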
Starting point is 00:46:15 This is why the Roomba worked, because it didn't have to use any digits or anything. It was just sort of moving at random. And the Roomba replicated the very first multi-cell organism. What it actually ended up knocking off was something that can only move forward, and then turn and move forward, and turn and move forward. Like some paramecium. What we've achieved is like...
Starting point is 00:46:35 That's where we are. Yeah. Multi-cell organism. You're right. The Roomba is the most successful robot of all time. So when they count robots out there, they count these Roombas, where there's like 10 million of those, but that's the robot, right? And it's very simple.
Starting point is 00:46:48 It's basically just random motion, and over time it does cover your carpet, and it's pretty reliable. But of course it also has this problem that it gets stuck all the time, and tangled up in stuff. And so it's not ideal, and it can't go upstairs. Also, a lot of people bought them as a novelty.
Starting point is 00:47:05 There's a lot of them sitting in the closet somewhere. Yeah, they want to see it work once, or maybe even they bought it and they got intimidated about even turning it on. Yeah, yeah. That would be my thought. So even understanding, I don't really want to deal with pulling it out of the box
Starting point is 00:47:17 and figuring out how I deploy this thing. I actually have a drone that's in that category. Oh yeah, I have two drones. I'm like, I'm not gonna be able to figure it out. No, I need like six hours to basically figure it out and it's sitting in the box. Yeah, you're right, I'm like, that's gonna require a day and I don't know if I'm gonna enjoy flying it enough
Starting point is 00:47:33 to justify a day of me figuring it out. Okay, good, so I like that we're kind of agreeing that we're really underestimating how complicated our physical abilities are and we're really overplaying our mental capabilities. Right, so everybody's impressed. The analogy is, if you say, okay, you can beat the best person in the world at chess, then that means you have a very powerful machine,
Starting point is 00:47:54 artificial intelligence, now it can beat the best person at Go. And nobody can play Go or chess that well, so you think this is smarter than everybody. Yeah. That's the way people reason. But then it can't drive a car; there's a whole bunch of things it can't do. And anything physical, just picking up
Starting point is 00:48:09 or opening this can like I just did, that is impossible for a robot. Tell people about your folding project. I don't know how many years you're into this, but one of your projects. Is folding clothes? Folding clothes. That's one that I think everybody would like to have.
Starting point is 00:48:23 If it just sat on your washing machine and you could dump it in a barrel and then, fuck, yeah, that would be. Also, can you do a dishes one, like putting them. Have you ever watched them do this? No. There's nothing funnier than watching the robot try to cook breakfast or do dishes. It just smashes everything.
Starting point is 00:48:39 Splatters, broken glass. But I agree, taking things in and out of the dishwasher would be great, just unloading and loading the dishwasher. And someone would say that the dishwasher is a robot. It is, very successful. See, there's this idea that if you can use humans and robots together, that's very powerful. So that's what I call complementarity.
Starting point is 00:48:56 Where if you figure out that you have a machine, you can use it, but you have the human do the parts that we're good at, and then let it do the parts it's good at. Together you have a great system, and the dishwasher is a beautiful example. And the washing machine, they do all this, but we have to load it and unload it. In the laundry aspect, it's also that you want your clothes to be folded at this precise time, right when they come out, because then they're at the perfect stage.
Starting point is 00:49:17 No wrinkles. No wrinkles and if you do it too soon they're kind of soggy, if they're too late they get all wrinkled. So having a machine to do that would be quite good. And there's some really interesting new results that just came out about this. But we've been working on it too. And one of the ideas is you fling the clothes up
Starting point is 00:49:35 and you use air to help smooth them out. Like humans do that all the time, right? You snap, you know. That has only been really done in robotics in the last five years. Oh my God. This is your job. I'm just so annoyed all day long. Yeah, that's a perfect time for me to ask you: knowing your work
Starting point is 00:49:54 will experience failure. I don't know that there's one that would surpass it. Like, it's just failure, failure, failure. How do you stay optimistic? So I'm super optimistic. I love working on this topic, and I feel like we have a lot more work to do.
Starting point is 00:50:08 So that's also encouraging. I don't worry that it fails. I actually love the times when it does succeed. That's super rewarding knowing how hard it is. You're like a fan of hockey instead of basketball. Why? They're only gonna score once a game. Oh.
Starting point is 00:50:23 Or it's basketball, where you're gonna score 110 points. So you're like, oh yeah, I don't get it. But when I get it, boy. Oh, that's interesting. Why? And suddenly deep learning is actually at the center of this. Oh, that's great. Yeah, so DexNet was our system. We worked on it for five years, and we basically applied deep learning techniques to be able to figure out where to grasp objects. And it started working better than anything had been done before. And I was so surprised, because I had been trying to work on this problem, and then it suddenly was able to pick up almost everything we could put in front of it.
Starting point is 00:51:41 Well, there's this critical mass point for all these things, isn't there? Yeah, in her book, she needed such a humongous pile of data, and halfway there gets you nothing; it's like stagnation, and then the acceleration is probably shocking for you. Yeah, no, that's a really great point. There is a critical point when you get enough data and suddenly it starts working. It took a lot, it was 80 million plus images
Starting point is 00:52:03 that Fei-Fei put together. And in our case, we had seven million grasp examples, and suddenly it starts working. So it was just grasping, and think of it with a very simple gripper, just a parallel pincer. So you would put a bin of objects in front of it and it would start picking them out one by one. And so we would test it by going into the basement, the garage; we'd just throw all kinds of stuff in there. Yeah, and it would just pick them up consistently and clear the bin. Oh, and we would try and fool it. You must have been elated. It was so much fun.
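As a rough illustration of the loop he's describing (sample candidate grasps on a depth image, score each with a learned quality model, execute the best one), here is a hypothetical sketch in Python. The grasp_quality function below is only a stand-in for the trained network; in the real system that score came from deep learning over millions of labeled grasp examples, not from this heuristic, and every name and number here is invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

def grasp_quality(depth, grasp):
    # Placeholder for a learned grasp-quality model: here we just prefer
    # grasp centers over locally high, flat regions of the depth image.
    x, y, _theta = grasp
    patch = depth[max(y - 2, 0):y + 3, max(x - 2, 0):x + 3]
    return float(patch.mean() - patch.std())

def pick_from_bin(depth, n_candidates=100):
    # Sample random parallel-jaw candidates (x, y, gripper angle)
    # and return the highest-scoring one.
    h, w = depth.shape
    candidates = [(int(rng.integers(0, w)), int(rng.integers(0, h)),
                   float(rng.uniform(0, np.pi))) for _ in range(n_candidates)]
    return max(candidates, key=lambda g: grasp_quality(depth, g))

bin_depth = rng.random((48, 48))  # fake depth image of a cluttered bin
print("best grasp (x, y, theta):", pick_from_bin(bin_depth))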
Starting point is 00:52:50 There's a story where we got invited to show this to Jeff Bezos, and he invited us down to this event in Palm Springs. He said, bring the robot, I want to see this. We had never left the lab before, so it was a big deal to put it on a truck. We had 300 objects, and we were so relieved: he was trying it with different things and it was just like it was in the lab. And everything was going great. And then his assistant was standing there and took off his shoe. And he said, well, can I try my shoe? And I remember my mouth goes dry
Starting point is 00:53:16 because of all the things we've tried it with, we've never tried a shoe. So I have no idea, but what can we do? We have to say, go ahead. Otherwise it feels all mapped out, maybe. Yeah. Yeah, right. Like the panties on the ground. Right, exactly.
Starting point is 00:53:30 So he drops his shoe into the bin and we're all sitting there and the robot just reached over and picked up the shoe. It did. Took it right out. Wow. I remember calling Tiffany, my wife, and I said, this was the best moment of my life. Yeah. And she said, what about our wedding?
Starting point is 00:53:48 Yeah, exactly, exactly. But you can't tell it, like it has the bin and it has all the objects, and you can't say, pick the shoe. That's exactly right, that's very important. You can't tell it to select a specific object. You can't say, go through this bin and find me all the pennies.
Starting point is 00:54:04 No, that's called rummaging. That's very interesting. We're looking at that now. Much different problem, much harder. That could be incredible for recycling. Yes, and also, you think about it, you do this all the time: if you reach in your pocket and you want to pull out a pen, you can always find the pen. Or in your purse.
Starting point is 00:54:19 People are very good at that. What's going on is very complicated. A robot cannot do that at all. But pulling one thing out of the bin is really interesting, because you have to kind of move things around. This might be a good moment to bring up: you say in that TED talk, and I think this would shock people, that we are much better at predicting the trajectory of an asteroid that's a million miles away
Starting point is 00:54:53 than we are how a plastic bottle on a table will move if we poke it. Yeah, because of the physics. We really don't understand friction. And friction is so important. It's what lets us all sit here, and keeps things from slipping around. We can approximate it, and there's this model,
Starting point is 00:55:15 Coulomb friction, et cetera, but to really get friction right is actually impossible. If I want to push something across the table, the way it's going to move and react to my pushing force is going to depend on what's underneath it. So if you have one grain of sand, it's going to change it. Yeah, if it's in the right corner, the whole thing's going to rotate clockwise. Exactly. If it's in the left corner, it's going to rotate the other way, but I can't know that. The robot can't know it.
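His grain-of-sand example can be sketched with the Coulomb model he just named: the friction force at each support point is at most mu times the normal load there, directed against the sliding motion. In this toy Python model (entirely illustrative, a hand-rolled sketch rather than any real simulator), shifting a little extra load to one corner flips the net frictional torque, and with it the direction the object tends to rotate when pushed.

import numpy as np

MU = 0.5  # Coulomb friction coefficient: friction force <= MU * normal load

def friction_torque(supports, loads, push_dir):
    # Net frictional torque (z component) about the origin while the
    # object slides along push_dir: each support resists with a force of
    # magnitude MU * load, opposite to the motion. Torque = sum of r x F.
    unit = push_dir / np.linalg.norm(push_dir)
    forces = -MU * loads[:, None] * unit
    tz = supports[:, 0] * forces[:, 1] - supports[:, 1] * forces[:, 0]
    return float(tz.sum())

# Object resting on its four corners, pushed to the right.
supports = np.array([[-1.0, -1.0], [1.0, -1.0], [1.0, 1.0], [-1.0, 1.0]])
push = np.array([1.0, 0.0])

even = np.array([1.0, 1.0, 1.0, 1.0])   # weight spread evenly
grain = np.array([1.0, 1.5, 1.0, 1.0])  # extra load under one right corner

print(friction_torque(supports, even, push))   # 0.0: it slides straight
print(friction_torque(supports, grain, push))  # negative: it veers clockwise

The unknowable part is exactly the loads array: the true distribution of contact pressure under a real object, down to the grain of sand, which is why the prediction fails in practice.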
Starting point is 00:55:34 So right there is like one of the great mysteries of nature, right? You don't have to talk about quantum physics. That is one unknowable thing that's sitting right in front of us. Oh my God. Wow. And we deal with it all the time.
Starting point is 00:55:46 So you might say, what do we do? Well, we kind of compensate. When we reach for a glass, we don't just reach our gripper right up to it. We scoop it up. We're almost anticipating the many different ways it could go wrong. Exactly. We haven't figured out how to do that for robots yet. Oh man.
Starting point is 00:56:02 And again, is that a software problem or a hardware problem? It's both. It's largely software, because we don't have the sensors, we don't have the control. And there's been so much hype. And that's what I worry about. Do you worry this whole field will go through one of those other stagnation patterns that we've already seen a bunch of times? I really do. I think we're on a collision course with a kind of bubble that's gonna burst, because people are expecting that we're almost there, especially when they see these videos.
Starting point is 00:56:41 Okay, great. Yeah, tell me about these videos because you watch them and you think we're there. And this is a big problem. Those guys that are running. Right. Okay, the first thing to ask is how many takes were required?
Starting point is 00:56:52 Many times they get it to work once and that's the video they show. Out of hundreds of attempts. Hundreds? Yeah, you show the clip of a robot doing a backflip, which is mind-blowing. You're like, well, fuck it, we're there. That thing's going to work on my car next month.
Starting point is 00:57:03 Right. That was one in 200 takes. You're like, well, fuck it, we're there. That thing's gonna work on my car next month. Right, that was one in 200 takes. And the other 199 takes, this thing is like flying off the table and smashing around. It's violent when it gets it wrong. In a research lab, that's what we're dealing with. Okay, it's always failing. You'd be lucky if you get it to work once.
Starting point is 00:57:16 But if we put it on YouTube, we'll say the success rate is one out of 200 or something. But nowadays, there's so much hype that is not putting those caveats in there. Well, because you're an academic, and this becomes one of my next questions: a lot of these videos, I am imagining, are coming from startups that are trying to raise funding.
Starting point is 00:57:33 So they're heavily incentivized to mislead you. Someone might say, I have to do that, that's how I'm gonna get the next round of funding. But I really cringe about that, because that goes against my instincts. I like to say under-promise and over-deliver, and I'm gonna be really careful to never over-promise about what we're doing or a result in a paper. We're always careful: don't exaggerate the result. And it really is a problem for
Starting point is 00:57:56 robotics, where you see the videos. And then the other thing is they can be teleoperated, and there's a human behind the curtain. Okay. So what about this Tesla robot? Optimus. What's it called? Optimus. She owns one? I guess he gave her one. He gave her one?
Starting point is 00:58:11 I guess she's the only one. What a romantic gesture. Yeah, I know. Well. What's the deal? Okay, so I know this is going to disappoint people when I tell you this, but it's far from being human-like in its abilities.
Starting point is 00:58:24 In dexterity it's very, very weak. It looks good, and they're beautiful designs, and they actually have made progress in the motors and the hardware, so it can move more smoothly, and they're also getting very good at walking. So there's definitely something positive there, but what can they do? And you see, this triggers that old idea that we've had in the back of our mind, which is we want these things. We've been reading about them, watching them in movies and on TV, we think they're going to come, and yet there's this huge gap. So if you watch the demos carefully, they're being somewhat teleoperated.
Starting point is 00:59:00 That means that there's a human moving them around, essentially remote controlling them, or if you watch what their hands are doing, they're very primitive. Now there's a lot of work trying to address that. Stay tuned for more Armchair Expert, if you dare. Okay, so now I want to ask you how impacted are you by this general paradigm shift, where most technology for the 20th century was coming out of military government spending, DARPA, MIT, these great institutions, academia. And then as you saw these private corporations
Starting point is 00:59:45 amass trillion-dollar valuations, the government can't really compete now, and academia can't compete. And what do we think the price of that'll be? What are your thoughts on that whole realm of this? So it's a great question, because it just came up yesterday at USC, because people were talking about
Starting point is 01:00:01 there's not so much funding available from the government agencies, DARPA and others, that used to fund a lot of this research because it was more esoteric. But Google and Nvidia and many others are actually publishing their results. We work with them, they work with us. Robotics is surprisingly open. The minute they get a result, they publish it and we all see it. It's the kind of openness you'd expect to exist in academia. We are freely exchanging information. Students are coming from China. I just came from a conference; half the papers were from China. It's a free, open exchange.
Starting point is 01:00:49 No, science has always been that way, and it's so cool, it's so punk rock. They're always like, before I'm German or I'm Hungarian or I'm American, I'm a physicist. The collaboration and how everyone got along is something to be really modeled. They have a higher calling, which is knowledge. Yeah, but wasn't that part of the whole thing with Oppenheimer? I mean, they were trying to keep it to America. Yes, Oppenheimer had some folks working under him that were spies for Russia and were leaking our nuclear technology.
Starting point is 01:01:17 So, yeah, I guess that's not to say it was devoid of any statecraft, but just in general, if a Chinese roboticist has a breakthrough, he doesn't give a fuck where the other guy came from. He wants to know about the robotics breakthrough. And China's not gonna, yeah, yeah, yeah. Yeah, exactly, other countries aren't, so what are we doing? A lot of it is open now. Facebook, or Meta, has actually been quite exemplary in that they share all the models.
Starting point is 01:01:49 They're doing their AI as open source, right? Yeah. Which is unique. Really wonderful, and actually we use those tools all the time, and it's very, very helpful. And then others like OpenAI, we use the tool, but we don't actually get access to the source code, so we don't know how it's doing it behind the scenes. But they let us access it and use it, and it's doing some incredible things.
Starting point is 01:02:07 But that idea... I guess your question is about government versus private sector. Let's just take, really quick, a hard example. And I'm not an Elon Musk hater or lover. I respect him as a modern-day Edison; he is a once-in-a-generation kind of engineer, and I respect that. I agree his other self is questionable to me, but whatever. Even with that said, I can't say that I love that he has 5,000 satellites in orbit around the planet, en route to having 12,000.
Starting point is 01:02:37 It's just an interesting level of power to give one individual when I think I'd feel safer if University of Michigan had 8,000 satellites. Or the US government maybe further down that list, but still I'd prefer that. It's a dicey situation when someone has a monopoly on a technology that's hugely impactful. It's a great point. And not only that, but that individual also has a lot
Starting point is 01:03:00 of power in terms of, let's say, Twitter and X. Media. Media, and he seems to know how to use it very effectively. So I think it's a concern. Coming back to the fear: the reason roboticists, and I'm not the only one, almost all of us, are not fearful that robots are going to take over, or that they're going to eliminate humans, is because it's just not that sophisticated, by a long shot. And certainly the other fear is about jobs. And I don't see them taking over jobs
Starting point is 01:03:27 and putting people out of work because all the jobs that require manual labor are extremely difficult to automate. I looked it up this morning. So 39% of US jobs are still manual labor. Oh, that's a good number. But I think even more importantly, that represents one third of your waking day.
Starting point is 01:03:45 You have another third of your waking day where you're still gonna have to do your laundry, make dinner, right? So you have this whole sector too that no matter who you are, is still manual. So you add those two together, and now you're really looking at a number that is like 78% of the stuff done on planet Earth
Starting point is 01:04:00 in a day is manual. A huge amount of those jobs cannot be replaced. You know, the gardener. The mechanic. The mechanic. The mechanic. The dream for me is I got a robot that's maintaining all this bullshit I bought. Diagnose why the car is not running, fix it,
Starting point is 01:04:11 get in there. How intricate is working on an engine? Forget it, that's like 100 years away. Exactly, like mechanics are so complex, where they can reach around things and feel, they can take off screws. Bell housing on a trans, you can't see anything. So that's the kind of thing way beyond robots. And I think that there's a shortage of workers.
Starting point is 01:04:28 The trades, because we're aging and we need more people to be doing all these jobs, they're not going to be unemployed. Right, right, right. In fact, it's probably most job security. People are realizing that they're actually in demand, which is great. They're actually getting higher wages.
Starting point is 01:04:44 Okay, so my theory on this is that the people that are being interviewed for the media, where we're getting our information about AI, the people that are getting consulted and interviewed, are the actual people whose jobs could be replaced. So they are very misled by their own... You're talking to, like, computer programmers about all these domains where actually AI will threaten those jobs, and they're the mouthpiece of this whole thing. So it's all very lopsided, because those are the jobs. Well, I have a theory about this.
Starting point is 01:05:14 I think some of the most vocal doomsayers, who are saying we're on the verge of these things taking over... it was very telling, the person who just won the Nobel Prize, Geoff Hinton. He said, we have never encountered anything more intelligent than ourselves. That was his big line.
Starting point is 01:05:28 And I read that and I thought, wait, I encounter something more intelligent than myself every day. I mean, there's tons of people around that are more intelligent than me. And I'm not afraid of them. They don't freak me out. I want to talk to them.
Starting point is 01:05:41 I want to get to know them. Yeah, we spent a week with Bill Gates and we loved it. We loved it. We weren't scared. Right, exactly. And so most of us aren't afraid of something more intelligent than us, but there's a small group,
Starting point is 01:05:52 and I think they think they're the most intelligent. That's interesting. Wow, great take. For someone to be smarter than them, where they're at the most upper echelon, is scary to them. It's a threat to their identity. Like, if the AI robot's good at drifting in a car, I'm fucked. Because I'm defining my whole identity and self-esteem on my ability to do that.
Starting point is 01:06:11 Yeah. Oh wow. That's a great insight. That's what I think. So it's not really something the rest of us need to worry about. We're constantly bumping into people smarter than us and it's fine. Yeah. It's great actually.
Starting point is 01:06:23 It's great. And I actually think this is something about AI, that it actually can be this interesting partner for us and can enhance our world and our abilities. Well, like this thing I was talking to you guys about. You sent us. NotebookLM. Is that what it's called? Yes.
Starting point is 01:06:36 You sent Monica and I a podcast that is entirely AI. There's a male host and a female host. Yes. And you said, do you think this thing was trained on you guys? Yes! Which was so flattering. Guys, do you not hear that?
Starting point is 01:06:51 I mean, it sounds so much like you. By the way, it's great too. Like, I was listening to it and I was like, it sounds like NPR. Like, I would listen to this if the information was good. It's so good, but it has a rhythm that is very much like you two. When you said it, I heard it.
Starting point is 01:07:03 Do they ever find panties though? That's what keeps us human. That's what keeps us human. Let's see. But it has a rhythm that is very much like you two. Do they ever find panties though? That's what keeps us human. But it has these amazing insights, like it'll just come up with these analogies and things and they weren't in the document that you gave it. It just came up with these other things. But the back and forth, like the pacing and everything else is so good.
Starting point is 01:07:20 The cadence, yeah, they got it. It has a sense that they're comfortable with each other, they're kind of back and forth. You can feel their rapport. Rapport. Yeah. Perfect. That's the word I wasn't thinking of.
Starting point is 01:07:30 They have fabricated rapport. But that's scary, that's the thing you think as a person you cannot replicate and you can. Yeah. But yeah, we're thinking doomsday, like we're out of a job, but think about this, what if we own the thing and it puts out the show and we're on a beach somewhere
Starting point is 01:07:44 and the show's just're on a beach somewhere and the show's just as good, you know? I would love it. It's kind of like the Picasso story where it's like, yes, I drew this in five minutes, but it took me 40 years to learn how to do this. Exactly. So it's like, yes, we're not doing it anymore,
Starting point is 01:07:54 but because we did so much of it, we've earned this. That's how we'll justify our beach life. Well, you guys are not going to be replaced. Don't worry. You have a very special, and also by the way, if you listen to a couple of these and I have, it starts to become a little repetitive. It's not like that's going to do a whole podcast series.
Starting point is 01:08:10 Right. There's something that's not quite new and fresh about them. At first it sounds really great, but after a while it's not really satisfying. And do you think that the missing ingredient is that we cannot help but evolve and change? We're aging. Our bodies are morphing. Our children get older. You know, all these elements that really funnel into this aren't existing there.
Starting point is 01:08:45 Unless we can figure out ways to sort of feed it and prompt it. And so we can use it as a tool to discover new things. And sometimes it's good at that, where you give it a paper and you say, come up with 10 extensions of this paper, and maybe one of them will be really interesting. Or you could do it with new ones of those. Like our Armchair Anonymous. Yeah. Like prompts, like what are some good prompts? Yeah.
Starting point is 01:09:11 It would feel a little true minimally for us if we could use our data set. Exactly. Then I would feel a little less fraudulent about it. Right, you give it your set and then build on that and see what it does. Okay, I have a couple of rapid fire questions and I want to talk about your art.
Starting point is 01:09:24 Where are we ahead and where are we behind in our expectations? Are we ahead anywhere? It's hard to say. One thing I do think we're going to see is the extension of the Roomba: a robot that can pick up clothes and declutter around the house. I like that. And I think it might have four legs. Oh. So it'll be a little like a dog with an arm. You thought it might have a scooper on its back, like the tail would be a scooper? I think it might have a tail, because tails are actually really important.
Starting point is 01:10:03 For balance. For also user interface. A dog's tail is very interesting, and that's very deeply rooted also in our psychology. We have a reaction when you see a dog with a tail wagging. An emotional reaction. But if you notice, none of these dogs that are out there yet have tails.
Starting point is 01:10:17 So we're building one. Okay, you and your wife are incredible artists. You have an exhibit that's at Skirball right now. Oh yeah, I'd love to tell you. So she's a filmmaker, and has been involved in technology for a long time, Tiffany Shlain. She and I have collaborated a little bit, but this is our first big collaboration. We've been having so much fun. We got invited as part of this city-wide exhibition on art and science that the Getty is doing.
Starting point is 01:10:40 It's going on for a whole year. And the science of tree-ring dating: you know how you count the rings? If you've ever been to Muir Woods, there's a great cross section of a redwood, and they put on the rings different events in history this tree has been alive for. It has Jesus on the ring. Right, and they're very Western, patriarchal.
Starting point is 01:11:19 And so she had started, actually, over the pandemic, doing a feminist tree ring. Okay. And then a couple of other ones. She's been developing these sculptures and they're salvaged wood. So we don't cut down trees to do this,
Starting point is 01:11:31 but there's a lot of these big redwoods and other forms out there. So for this show, we wanted to do something that started with the tree of knowledge. Uh-huh, this is very cool. We found a tree stump that was gigantic, almost as big as this room. It's 7,000 pounds.
Starting point is 01:11:46 Holy shit. It's a eucalyptus, but it was uprooted and sort of fell over. And then one side is sanded down. And so when you walk into the gallery, you see the back end of it. So it's all this knotted, gnarly roots. Mold.
Starting point is 01:11:59 And then around the other side, we inscribed it with questions, trying to talk about the history of knowledge and how it evolved from like, what is fire? And can I eat this? Which is thousands of years ago, the kind of questions we asked. But those evolved into, will machines be intelligent?
Starting point is 01:12:15 Yeah. And on the far end. So it has 600 questions or something on there. Wow, that's awesome. One of them that's super cool, and it's so wild to think the tree was around at that point: the first mark on the ring is from 530 BC, and it's Pythagoras's theorem.
Starting point is 01:12:33 You see a squared plus b squared equals c squared. And it goes all through these great breakthrough math equations. There's another piece, actually, that we call Abstract Expression, which is a redwood. Yes, it starts with Pythagoras, but remember, it's not literal, because that tree wasn't 5,000 years old. We take some liberties. Okay, okay. How old was that tree? I think like 400 years old, maybe 500, but that's the idea.
Starting point is 01:12:55 It's like we're kind of playing off of that known concept, but this time we wanted to tell the history of science and do it through just equations. And we never say Pythagoras on there, it just has the equation. But those equations are kind of beautiful in their own right. And they're kind of artistic, because in a way art takes an image, like especially with physics... In different directions, you could orient it. And I showed it to my advisor,
Starting point is 01:13:43 and he was very excited about it, and he said, well, can you prove that that would work for any part? I worked on this problem for a year and a half. Wow. I tried all these methods, and it was basically extremely difficult to try and prove that it would work
Starting point is 01:13:57 for all of these geometries. I was living at the end of this alley, and it was down some stairs, and so I was sitting on my porch all the time, just like working on this. I have this moment where this pops into my head to use this step function and it looks like stairs. I remember writing down these equations
Starting point is 01:14:13 and crossing off terms, and everything turned into zero. And then it worked. Oh my God. It was this moment where, when the whole thing integrated to zero, it meant that there had to be a solution for any polygonal part. Did it feel transcendent? It felt quite transcendent.
Starting point is 01:14:30 Yeah. Like you had tapped into something a little mystical. Totally. It was not something that I felt like I did. It was just revealed. You were like the vessel for this. Yes. Very much.
Starting point is 01:14:43 I still remember that very distinctly. And I'll admit that one of the equations on the tree is yours. Oh, good! Oh, good! You should be doing that. I put it in there. That's great.
Starting point is 01:14:55 Up with Gauss and Einstein. Yeah. But you're right, there's something really elegant about those formulas. And you think of the most famous one, like E equals MC squared: on the surface it's just so simple, and no one could think of that for so many years. Right, the elegance of some of these. Euler's equation is the one that mathematicians truly love. What's that? It's e to the i pi plus 1
Starting point is 01:15:19 equals 0. It's amazing, because you have these three quantities: you have e, the base of the natural logarithm, which is like this 2.718 blah blah blah, and then you have pi, 3.14159, and then you have i, which is for imaginary numbers. And those three, there's no reason that those should all relate. Right, because their digits all go on to infinity, so none of them work out neatly in math. The irrational numbers, yes. And they all came from very different sources, but they all come together at this magical moment, and it's mind-blowing.
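For reference, the identity they're describing, written out (this is standard math, not something from the episode): e^{i\pi} + 1 = 0. It follows from Euler's formula e^{i\theta} = \cos\theta + i\sin\theta evaluated at \theta = \pi, since \cos\pi = -1 and \sin\pi = 0; here e ≈ 2.71828 is the base of the natural logarithm and i is the imaginary unit, with i^2 = -1.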
Starting point is 01:15:49 Like to a mathematician, it shouldn't be. And it's like one of those moments where you're like, the universe makes sense. By the way, I feel like that's where I believe in a higher power. Yes, so I'm a hardcore atheist, I believe in nothing, I don't want to believe in nothing. There's something happening with my kids
Starting point is 01:16:07 that I feel like is something I can't articulate. I really kind of open my mind to like, maybe there is some kind of magic happening. There's symmetry at the very least. From where, I don't know. There's structure. There's some kind of beauty that makes sense, and that's out there. And when we discover, we get a kind of structure,
Starting point is 01:16:45 we had the one breakthrough with AI. When you look at evolution, they talk about punctuated evolution. It's like nothing happens and then everything happens and then nothing happens. Exactly, Dax. You're the anthropologist. Punctuated equilibrium, you have a sort of plateau and then there's a breakthrough or a change. And then there's a long plateau where it kind of gets digested and everything. And then another, but that's really what progress looks like. Right. You would think it's just this nice linear. Not at all, and people say exponential,
Starting point is 01:17:28 that's not the case at all. We're not living in exponential times. Most technologies do not increase like that. There'll be a little breakthrough and then a long plateau. Look at air travel, it hasn't really improved. There was a huge jump when it started. And we have had things like carbon fiber planes, and there's definitely things
Starting point is 01:17:45 that make it more energy efficient, but in terms of comfort level, there's been no breakthrough. Yeah, I think it's actually gone backwards. Yeah, right. Well, Ken, you are a blessing on planet Earth. I think you are so fun and interesting and encouraging. You're like a polymath, kind of cosmopolitan.
Starting point is 01:18:02 You're an artist. You have all these interests. You're youthful beyond your years. It's a pleasure to know you. I'm so glad you came in and talked to us about all this. You guys are so great. I have to say, I get this real joy listening to you because you're so open and the way you bring out
Starting point is 01:18:18 the best in people. Now I see how it works. You sit down and you just make us feel so at home. Oh, good. But your real genuine rapport is so incredible and I think that's why you're so incredibly successful. Because people hear it, it's a pleasure. And it's so genuine.
Starting point is 01:18:37 Thank you. That was really nice. I love it. Yeah. Okay, well to the many dinners we will have in the future. How fun. Thank you so much. Thank you so much. What a pleasure. Fantastic, thank you guys.
Starting point is 01:18:47 Stay tuned to hear Miss Monica correct all the facts that were wrong. That's okay though, we all make mistakes. I'm in an incredibly beautiful new sweater that my friend got me. It looks gorgeous. I just put it on for the first time and I'm truly blown away.
Starting point is 01:19:01 The green is really nice. And the fit is really kind of perfect. I know, they know how to do it. And I think I like these cuffs so you have to roll them up. They're too long on their own. They're clearly designed to be rolled. See, look, that's a seven inch, nine inch cuff.
Starting point is 01:19:14 It's nice though. Yeah, but you wouldn't wear it like that, right? You're supposed to go. Well, I'm not allowed, because I'm short, and they have a rule that if you're short and you're wearing oversized clothing, you have to show a little bit of skin on your arms.
Starting point is 01:19:27 Because you'll get lost in it. I just rode my bicycle. Oh, nice. Yeah. My first time in biker shorts. How'd it go? Well, I just can't believe I'm a person that owns biker shorts and wears them now.
Starting point is 01:19:40 I'm having a hard time. Yeah, well you're 50. Well, I feel like that's- So a lot of things have changed. Yeah, but also I think that's something that's best done much younger. Almost the 50 compounds it. First, I never envisioned myself as being someone
Starting point is 01:19:51 that would be in those biker shorts. Yeah, sure. But they have a pad built into them. And the seat is very tiny on the road bike. And it hurts your anus. Yeah. I don't wanna say it hurts here. It does hurt.
Starting point is 01:20:04 Yeah, it hurts. I was like, oh, this is a nice padding. I put them on for the first time. I felt like Peney always talks about, like, eventize your run. I was like, well, these are built for nothing other than riding a bicycle. And let's do that. And the padding was nice.
Starting point is 01:20:18 And I love that there's no fabric flowing anywhere else. I think I went up the hill faster because of them. Probably aerodynamics. I'm not gonna adopt the Lycra shirt though, I decided. I just wore a wife beater. So you're back home. I'm back home. I'm very, very, very happy to be back home. I got you a present for your birthday.
Starting point is 01:20:35 Do you wanna open it? I would love to open it. Let me really take my time here. I'm looking at a beautiful tissue paper. Yeah. With, oh, can I say one thing? This will sound derogatory, but let me preface it by saying, Let me really take my time here. I'm looking at a beautiful tissue paper with, oh, can I say one thing? This will sound derogatory,
Starting point is 01:20:48 but let me preface it by saying, I could be in the tourism board for Mexico City. I love it. It's an enchanted romantic city. Foods dynamite. If you ever go, go to Havre 77 French restaurant. We went twice. The French onion soups are the best I've ever had in my life.
Starting point is 01:21:07 On the second trip on my birthday night, I got two bowls of it to start. Oh wow, like when you got two steaks? Yes, and I would tear out a fingernail right now to have it again and share it with you. It was the most incredible. But anyways, the facial tissue, and I had a cold, it wasn't ideal.
Starting point is 01:21:24 And where it really hit me was that. One ply? Maybe less. I was at a nice hotel, mind you. Yeah, very. We got on the flight, I went into the bathroom, and I pulled the tissue out of the mirror that's in the lavatory of the airplane,
Starting point is 01:21:38 and the second I touched it, I was like, ooh, that's soft. And then I thought, how bad was the tissue where the airplane tissue felt like Puffs Plus with lotion? Oh my. Just to make it relative. Yeah, cause that's one ply. Yeah, I think it was like 0.6 ply.
Starting point is 01:21:54 Oh, okay. Anyways, beautiful tissue paper with purple flowers. Really nice. The tissue is from Nickey Kehoe. The present is not. Oh, this is a multi-stage gift. Yeah. Okay, beautiful tissue paper and then a burlap sack.
Starting point is 01:22:08 Yeah, also from Nickey Kehoe. That's how they wrap. Wonderful. Oh, buddy. The stories of Raymond Carver. Will You Please Be Quiet, Please? Is this an original? I bought it as a first edition and it is signed.
Starting point is 01:22:28 It's signed. Yeah. Did you pay the face value of $8.95? I know, it was on sale actually, half off. Ha ha ha. What year was this published? Cause we can, I think it's fascinating that a hardcover, beautifully bound book was $8.95.
Starting point is 01:22:45 I know, that's true. I know I'm all over the place and a little manic, but I just gotta add, back to Little Women, which I love. As you know, I'm Greta Gerwig's number one super fan now. At the end of that movie, they show them pressing and making her first book, the book, Little Women. I don't know if you remember that sequence. I don't know if I remember it. But the amount of time and effort it took to make a book
Starting point is 01:23:13 in the 1890s, where they're pressing it all, they were cutting it with a saw, they were sewing the binding by hand, and then they were cutting leather out and in a pattern and then gluing and putting that in a press. I'm like, it took like a week to make a single volume. They should have been $600. Exactly, well that's why they're so rare.
Starting point is 01:23:33 And it explains why, I think it was Carnegie who invented the library. There were no libraries. Books were just too expensive. They were like, probably in today's dollars, they probably were hundreds of dollars with that amount of manpower. Okay, so this was first published. We think about wealth disparity now,
Starting point is 01:23:50 but then in order to even read a book, you had to be a millionaire. Yeah, I'll get the number wrong, but to put it into perspective, like, so I guess Elon is now worth $400 billion recently, although that stock just fell, whatever, let's just say he hit 400 billion. 400 billion of our total GDP and national amount of money
Starting point is 01:24:13 isn't even 0.01%. When Rockefeller hit a billion, they say he actually had like 15 cents of every dollar that existed in America. So it's like as bad as it feels now, it was- It was worse. Exponential order of magnitude crazier with the first rich people.
Starting point is 01:24:33 Yeah, that's true. Okay, so this was 1963. So this book cost $8.95 in 1963. How much do we think that is now? Rob, can you put it in? Well, that's great, we have that technology. Yeah, we sure do. I added a new one. I actually wrote up my resolutions
Starting point is 01:24:50 last night. Oh, great. Which I don't know if I've ever written them down. Yeah, I wrote some down too. You did, did you journal this morning? I did. Congratulations. I've journaled every day. I'm proud of you.
Starting point is 01:24:59 I had therapy too, and we talked about it, and she said I could. Burn them? Yeah, or shred them or whatever. Can I have her number? No, she's like, if that's gonna allow you to really be able to be honest and truthful with yourself in a way you won't be able to otherwise.
Starting point is 01:25:22 And let it out of your body. You know, sometimes her and I talk about, like there are things that I talk about with her that only she gets to hear. And she said, you know, it's not just me, you also have you. Yeah. And you have a dialogue with,
Starting point is 01:25:37 you can have a dialogue with yourself. Yeah. Especially via the journal. Yeah. But yes, of course I have to be very honest with myself there. And so if I'm out of fear not doing that, then it's not worth it.
Starting point is 01:25:55 So I do, I'm still deciding. We may have talked about this, but, and I had mentioned there was a period I stopped journaling over the last 20 years. And then I had a relapse, obviously, and I didn't even put all this together, but through therapy with Mark, I think what occurred to me was,
Starting point is 01:26:17 there were things I couldn't write down, just like you were saying, are you afraid someone's gonna find it? And I'm like, no, but in truth, there was a moment, yes, I'd be afraid someone would find it. And I had this weird dedication to never lie to that journal. Right, right.
Starting point is 01:26:33 So I just kinda, I didn't, it didn't feel like I was making a decision to stop journaling. It just was like, this is really weird. I've been journaling for 17 years or whatever, and I haven't in a while, but I'm not overthinking it. But of course, in reflection, I was like, I couldn't really be dishonest to this thing.
Starting point is 01:26:50 Yeah. I love this. This is such a thoughtful, wonderful present. It would have cost $89.18. Wow. $89 for a book. That's a lot. It's not enough though.
Starting point is 01:27:04 I wish it was 5,000. Okay, this is a fantastic present. Very thoughtful, thank you so much. You're welcome. Okay, how was therapy? It was good. It's my first therapy of the new year. For a second, I was debating,
Starting point is 01:27:20 I was like, maybe I only need to go in for check-ins now. Maybe I don't really need to be on this consistent of a schedule. But then today I was like, no, I need to keep up my once every two weeks. Well, look, I've stopped. So I really am in no position to say this.
Starting point is 01:27:38 But it definitely falls into the umbrella of, like, well, it couldn't hurt to go. It does not hurt. And it potentially could hurt to not go. Yeah, yeah. It's kind of the vitamin debate. It's like the scientific community's kind of split down the middle whether vitamins work or not.
Starting point is 01:27:54 Yes. But it's like, I don't know, on the chance that they work, they're not gonna harm you. All right, someone's gonna comment. Yes, I hear you. Well, there are some bad ones. Oh, I know, and you can have too much of certain things. But just in general, if you're taking the, you know,
Starting point is 01:28:07 not above the daily dose of any one thing, it's not gonna harm you. Speaking of, okay, you know how I'm always paranoid about drowning my cells? In too much water? Yeah, or people in general, like drinking too much water and then drowning their cells. You know, Gundry's new movement is less water.
Starting point is 01:28:24 Not shockingly. Ha ha ha ha ha ha. So him and I are aligned. Soulmates. Why he's got those fresh hands. He doesn't drink any water. No hydration. Oh my gosh.
Starting point is 01:28:37 I'm gonna put my hair up real time. If you wanna see it. It looks so good down, but go ahead. Let's see what happens there. Okay. If you wanna see it, go to YouTube. Do you ever do an up and then a braid and back? Yeah, well, I did it for, when's this out?
Starting point is 01:28:52 The eighth. I did it for a commercial we were just in together. Oh yeah, that comes out yesterday. It came out yesterday. Oh my God. Our little commercial. Yes, our second commercial of, I hope, many. Yes, exactly. It was out yesterday.
Starting point is 01:29:08 It was so fun. And it's out. It was out yesterday. It's on our Instagrams. And in it, I do have a ponytail with a braid that I love. It's just really hard for me to do it on my own. I had a hairstylist that day.
Starting point is 01:29:23 Oh, right, right. But I do like it. Maybe your therapist could style your hair on the days you don't wanna share. Oh, hair play? I would go every day. Yeah, yeah, yeah. I'd pay for that.
Starting point is 01:29:33 Anyway, okay, so drowning cells, everyone laughs at me, they guffaw. Uh-huh. And I met someone who drowned his cells. Oh, tell me. And it was really bad. Tell me more. Okay.
Starting point is 01:29:48 Who did you meet? Where'd you meet him in front of 7-Eleven? No, he's a real person I know. I'm not gonna say, I'm not gonna out him. Him or her's name. Right. He's a friend of a friend. This is a sad story.
Starting point is 01:29:58 I'm transitioning into a sad story. When I was home. If you were having fun and laughing, stop. Yeah, stop. A big group of friends was meeting and one, Robbie. Yeah, sweet Robbie from our chain. Yes, from the connections chain, wasn't there. I was like, where's Robbie?
Starting point is 01:30:15 And his wife said, oh, he's at the hospital with. The mutual. Yes, with. The unmentionable. No, because. He's not underwear. Because that's his name. If... A mutual. Yes, with... The unmentionable. No, because... He's not underwear. Because that's his name?
Starting point is 01:30:28 No, his name is unmentionable. We can't call him untouchable because he's Indian. Oh, he is? Yeah, so now I'm giving a lot of info away. Yeah, it's pretty easy to narrow this down at some point. If you know an Indian in Atlanta who's friends with Robbie. That's true, there is one. There is one. Anyway, this is one. There is one.
Starting point is 01:30:45 Anyway, this is sad, this is sad. And he had a seizure. And I guess he had already had a seizure a year before and was on seizure medication and stuff. But when he went- You're the perfect person to tell this story because you have the same condition. Right, exactly.
Starting point is 01:31:04 And you're Indian. And I'm Indian. When he went the first time after his seizure, they checked his salinity levels and they were so low, and he did drink a really excessive amount of water. Do we know why he drank? And he drowned his cells. Yeah, he got rid of too much salt, but-
Starting point is 01:31:24 He drowned his cells. That was the got rid of too much salt, but. He drowned his cells. That was the medical? Yep. Oh, okay. You have a good deal of salt, I think, from your diet. Don't take offense to that. Are you referring to like the potatoes I made or something? No, but you like, you only have a nice seasoned chicken.
Starting point is 01:31:42 Oh, sure. I think you have a good amount of salt in your diet. Yeah, I feel fine about my salinity. Yes, so. I don't drink any water, so I'm good there. Do we know why he was drinking so much water? Was he on like an exercise routine? He was on an exercise routine and I'm not sure why.
Starting point is 01:32:00 Anyway, so turns out per usual, I'm right, you can drown yourselves. Per usual. And. Unsurprisingly. Please look out for that. Okay, yeah. I don't know why I brought that up.
Starting point is 01:32:12 You and Gundry should collab on this. I'm happy to join forces. Also, just if you are having a lot of water, maybe use some. Electrolytes. Electrolytes, that's right. Keep an eye on your electrolytes. Yeah.
Starting point is 01:32:26 The only cases I've ever heard of is like no one's ever died from ecstasy, but people have drank too much water on ecstasy. Exactly, they drown their cells. Yeah, okay. Okay. I wonder if they drown their cells or if when they drink way too much water,
Starting point is 01:32:40 it backs up like congenital heart failure basically, like ends up filling up their body. Cause you know, my father who had congenital heart, I don't know if it's congenital, he had heart disease. And what would regularly happen is his heart was too big on one side and normal on one side. And so it would pump in a lot, but it couldn't pump out a lot.
Starting point is 01:33:02 And then it just ends up backing your whole body up with water and you get really bloated and you put on all this water weight and then it starts really affecting your breathing and your lungs and everything else. And so my dad would go into the hospital for like four days and all he'd be on diuretics and he'd just be getting rid of gallons of water.
Starting point is 01:33:20 Right, oh my God. Yeah. Okay, it says: yes, cells can drown in a condition called water intoxication, or hyponatremia, which occurs when there's too much water in the body. Sodium levels drop, causing water to move into cells and causing them to swell.
Starting point is 01:33:39 This can be especially dangerous for brain cells as it can lead to pressure in the brain, confusion, drowsiness. Wait, no, epilepsy to pressure in the brain, confusion, drowsiness. Wait, no. Epilepsy, pressure in the brain might have been completely all related. Exactly. Oh, man. Well, I'm sending love and well wishes to this anonymous person. Untouchable. Why does Robbie have two very close Indian epileptic friends?
Starting point is 01:34:02 I know. He's very over-indexed. He is very over-indexed. He is extremely over-indexed. I consider myself kind of unique in America, low percentage where I have a best friend who's Indian and epileptic. And he's got now two. I know.
Starting point is 01:34:19 He has a fetish. I know you don't like that word, but. You think it's a kink? Ask if there's a third. If there's a third, he has a condition. Yeah, it is weird. Then I wondered, is it like epilepsy... What ethnicity is Robbie's wife?
Starting point is 01:34:34 White. White? Yeah, she doesn't have it. Actually. No, no, no, no, no. Oh my God. No, no, no, no. Is he giving everyone this? He's poisoning everyone.
Starting point is 01:34:45 Oh my God. He's so sweet, that would make sense. He's one of the sweetest people I've ever met over text. Yeah, but he is a dark side. Ooh. Hi, Robbie. Oh, nasty. So his wife is my oldest best friend.
Starting point is 01:34:59 And when we were in high school, she had seizures. And they were dating at that time. Okay. And she got in this car accident because she had one. Hers were different though. She had, like, she didn't have grand mal seizures. Were you about to say petite? Petit mal, that's what they're called.
Starting point is 01:35:17 That's so cute. And then she had to- So you picture like a mall, you'd walk in, but there's only three stores. And then there's the food court is like four food carts. They should call it boutique seizures. That's way better. Yeah, that's cute.
Starting point is 01:35:29 Great re-brand. You and Gundry can work on that. Anyway, yes, he has three. There's a fourth. I mean, I only know of three people in his life and all three of them have seizures. So certainly there's more. Should we get Robbie on the phone?
Starting point is 01:35:45 Do you want to? Yeah. We gotta grill him about this. I mean, he's definitely at work. On Saturday? Oh, I forgot. During the NFL playoff game. I'm so sorry, Georgia lost by the way.
Starting point is 01:35:56 Oh no, the Sugar Bowl. You didn't know that. They lost. They lost. To who? Don't say Texas. They didn't lose to Texas, but Texas won theirs. Texas is still in it.
Starting point is 01:36:07 Notre Dame. But they're still in it because, oh no. That game we saw was one of their only two losses. They ended up being really. I know, but they played. Welcome to the SEC, bitch, is that what you said? Yeah, I did. Hold on, I gotta call Robbie.
Starting point is 01:36:22 He's the one also that knows about all of this. Yeah, he's not at work. He's at the hospital with one of his many epileptics. Don't say that. Knock on wood. Hello? Hey, Robbie? Yeah?
Starting point is 01:36:35 You're on candid camera. You are on air. Armchair Candid. You're on air. I'm on air. And can you hear, do we have your consent and can you hear me? Yes, yes, yes to both, yes.
Starting point is 01:36:47 Okay, great. Well, we started, we wanted to call you about some, one thing, but now we have two things to talk to you about that are very important. And we did not name any names, but I'm just learning of the fact that you have a second Indian friend with epilepsy, which I find to be almost statistically impossible.
Starting point is 01:37:05 And then Monique said, it doesn't stop there. His wife has epilepsy. Well, she doesn't have, okay, not specific epilepsy, but you do have three people in your life that have had seizures and it's, now we're starting to worry. And think you're at the- I know, yeah.
Starting point is 01:37:20 I see where you're going now. I honestly hadn't ever thought of this. Ah! That's what he would say. So my, I had some- Monica, Monica, my sister too. Oh my God. I knew it. I fucking knew it. I said, I, Robbie, I said there's a fourth for sure.
Starting point is 01:37:36 Fuck. Robbie. What are you doing to everyone? I don't know. I really don't know. Oh my gosh. I'm gonna be looking at things in my life. I don't know.
Starting point is 01:37:48 This is wild. Do you think it's because you're so calm and sweet, all of a sudden the other person's brain feels erratic and unhinged? Is it like relative to your calmness, people short circuit? It could be. I mean, yeah, that's the best. I think that's the best we have to work with right now.
Starting point is 01:38:03 My guess is Geno would say otherwise. I agree. But this is wild, four. Robbie, four is a lot. Now I mean it sincerely. Is there something environmental in Duluth where half the population is having seizures? No, cause mine happened once I left.
Starting point is 01:38:25 But you grew up with that water. Oh, you think it's the water? Yeah, you have late onset. Because I didn't drink enough water and then it caught up. I'm not, I'm not sure about the logic of that. But what I'm saying is there's something in the soil where you grew up where 70% of all people have seizures. There's gotta be, Monica's house was super close to mine.
Starting point is 01:38:48 The other friend also lived like right down the road too. Oh, and Gina. Yeah. So honestly, if you draw, if you draw like a polygon of the four points, it's like a very small area. And so likely shared whatever water source. Yeah, it's pretty narrow there.
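A quick aside for the curious: both claims here, the "almost statistically impossible" cluster and the "very small area" polygon, are easy to sanity-check. Here is a minimal sketch with made-up inputs; the ~1.2% figure is the commonly cited US epilepsy prevalence, n = 20 is a guess at the size of a close friends-and-family circle, and the coordinates are hypothetical stand-ins for the four homes, not real locations.

```python
# Minimal sketch with made-up inputs (see note above): how unlikely is it
# that 4 of ~20 close friends/family independently have epilepsy at ~1.2%
# prevalence, and how big is the polygon the four homes would enclose?
from math import comb

p, n, k = 0.012, 20, 4
prob = comb(n, k) * p**k * (1 - p)**(n - k)
print(f"P(exactly {k} of {n}) = {prob:.1e}")  # about 8e-05

# "Draw a polygon of the four points": shoelace formula over (x, y)
# offsets in km; the points below are hypothetical, not real addresses.
pts = [(0.0, 0.0), (0.4, 0.1), (0.3, 0.5), (0.1, 0.4)]
area = 0.5 * abs(sum(x1 * y2 - x2 * y1
                     for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1])))
print(f"enclosed area = {area:.2f} sq km")  # 0.12 sq km here
```

So the instinct holds up: under independence, four cases in one small circle would be roughly a one-in-ten-thousand coincidence, which is exactly why apparent clusters get investigated, and also why many of them still turn out to be chance.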
Starting point is 01:39:09 You're right. Guys, did we just break an enormous case? Do we need to call the New York Times immediately? Oh, fuck. You're gonna have to do a new podcast. You're gonna start a new podcast where you investigate this issue. Wow.
Starting point is 01:39:21 It's gonna be called Poison Paradise. Under the veil of suburban beauty and tranquility. Oh my God. Lies a burbling poison that results in the shudders. That's a lot of words, that's a lot. That's too many words, you need it to be small. No, no, first was the title, and then I was entering into the first episode.
Starting point is 01:39:40 Oh. Yeah, he got going already. Oh my God. I mean, you're halfway there, it sounds like. Wow. This thing writes itself. Okay, now we have, moving on to point number two that's- Well, no, I have one follow up on that, Robbie. Okay.
Starting point is 01:39:51 In your free time, which I know you don't have much of, just, can you sniff around, see if any more folks have had seizures? Yeah. Okay. I will, yeah, I'll report back. Yeah, I'll just, I'll start kinda casually throwing that into the conversations I have.
Starting point is 01:40:04 Like, so, by the way, this is kind of weird, but do you have a history of epilepsy? And just kind of move on from there. Yeah. Sounds like a good plan. That's gonna work. Okay. Now point number two is football. And you are my main source of information for football. I was texting you during the Texas-Georgia game and we were secretly gloating while I was amongst a bunch of Texans.
Starting point is 01:40:31 And then Dax just told me that then Texas went on to like win all the rest of their games. They're still in it, yeah. They're still in it, yeah. So they have a tough matchup against Ohio State because Ohio State looks really good right now. But yeah, they're still in it. But was that like a fluke?
Starting point is 01:40:50 Shut up. Well, it can't be a fluke because the only team to beat Texas this year is Georgia. Georgia beat Texas twice this year. Oh, twice. Yeah. So, but then it's kind of a fluke that we aren't, like it doesn't make sense that we beat them twice
Starting point is 01:41:04 and we're now out. Yeah, you know how it's like, how can Federer be the best ever if he can't ever beat Nadal? You know? It's very similar. We all have our albatrosses. Yeah, exactly.
Starting point is 01:41:17 Yeah, all right, well that clears that up, I guess. And I just wanna end on this, Robbie, your voice was built for radio. He has a great voice. You must be involved in Poison Paradise. Oh, thank you. I'd love to help you. Let me know.
Starting point is 01:41:32 I'm a hard worker too, so yeah, just let me know what you need. All right. All right, thanks, Robbie. All right, thanks guys. Take it easy. Bye. Bye.
Starting point is 01:41:41 Stay tuned for more Armchair Expert, if you dare. Well, that was a great use of time. I wasn't expecting his voice to be that velvety. He's a very handsome man. You know, I only have handsome and beautiful friends. This has started from day one. Good for you. I know.
Starting point is 01:42:06 Anywho, all right. Okay, good luck to, will you plug your ears? Good luck to UT Hook'em. I'm cutting that. This is the same as that story I told about the people flying to LA to watch the Red Sox play LA, hoping the Red Sox would lose because they had just beat New York.
Starting point is 01:42:24 But the other guy was like, no, they must win, that way New York's number two. Wouldn't you want your team to have twice beat the champions? I guess you're right. I think it's time for you to transition into rooting for them for your own. For my own gain. Yeah.
Starting point is 01:42:40 Okay, I see that logic. Speaking of which, and I know we're all over the map and have taken up too much time, but I just, I want that logic. Speaking of which, and I know we're all over the map and have taken up too much time, but I just, I wanna go on to say that I finished the Churchill documentary on the flight home yesterday, and I got very swept up in it. This has happened a few times,
Starting point is 01:42:56 and I'm sure you've watched shows on this. When you are forced to watch what the Brits went through, 57 nights in a row of carpet bombing of London. Everyone sleeping in the subway, no bathrooms, getting up, going straight to work, and carrying the fuck on. And they were so outgunned and outmanned and out everything. And they alone took on Nazi Germany at that point.
Starting point is 01:43:24 Everyone was already defeated. Yeah. The amount of will and resolve is so historic. I found myself like, this is so cheesy, I found myself being like really proud that I know Jethro. Oh, that's nice. Yeah, I was like, by God, that little island, you motherfuckers refused.
Starting point is 01:43:51 Yeah. And Churchill, he is a very flawed person. He was horrendous to India. I'll acknowledge that. But truly one man got those people to that state of mind. If you watch this doc, you're like, who knows if that person doesn't exist, what happens? Because he had two burdens.
Starting point is 01:44:12 One is to be fighting off these Nazis who are just bombing every single night, trying to keep morale high, and he has got to get America into the war or they're gonna die, everyone's gonna die, because they're not gonna surrender. And so his skill at wooing FDR and developing this relationship
Starting point is 01:44:30 and slowly getting us more and more involved is so impressive. And his own story is so unique in that he was a soldier in his youth, and he was an incredible soldier. Then he went into politics and he was a boy wonder because he was right, the whole time he was in the war, he was also a reporter.
Starting point is 01:44:51 So he was reporting firsthand from all these wars, and he's one of the best writers to ever live. So he was in this crazy, unique situation where he leaves the service as a hugely popular figure in Britain, goes into politics, has this meteoric rise, and then plateaus and then plummets. And he's completely on the outs
Starting point is 01:45:11 and he can't get anything done. And then World War I comes along and he decides in his 40s or 50s to rejoin the army. He becomes a commander, he wins all this glory, returns and for four years is begging Britain to understand Hitler cannot be trusted and don't believe a thing he's saying and we can't be signing these deals, and no one's listening, no one's listening, he never relents.
Starting point is 01:45:38 And finally the Brits realize he has been right the whole time. And overnight he becomes prime minister. Like the story of the up and the down and the out and the miscast and the, it's, what a story. Yeah. Horrible to the Indians. Let's be clear, a colonist, grew up in Victorian England, definitely wanted the empire to stay alive. Also a miraculous feat of will and resolve in the poetry with how he
Starting point is 01:46:08 motivated people. He gave this speech to our Congress to help us embrace the fact that we were entering the war and it's like the most incredible speech. I cannot recommend the doc enough. I don't know why I went on that tangent, but it's been burning a hole in my brain. I know, I'm making you nervous. My energy level is a 15, I'm home. It's not making me nervous, it's like. Go ahead. No, it's just like, where's it going? Oh, I'm just sharing all the things
Starting point is 01:46:43 that I missed out on sharing in the last three weeks. God. You're so much like my father. I am. He just loves to explain stuff. Yeah, it's kind of a male trait. But does that story, like is there a male female thing going on?
Starting point is 01:47:02 Is this the Roman Empire? Like does that whole chapter just like not interest you? Parts do, but not that part. Of an individual story where someone's like completely discarded and publicly reviled, then finds their way back, then becomes so valued and important, then gets discarded again, and then it doesn't quit.
Starting point is 01:47:24 Like has a calling that can't be ignored and then matched with this like Shakespearean ability to write speeches. Yeah. No. No. I'm more into like the Anne Frank story of that era. Like I don't, I guess I'm really not drawn deeply
Starting point is 01:47:48 to people in power. Like I'm not, that's not a thing that- You're drawn to the disenfranchised. Yeah. This makes total sense. Well, I just find that way more, as a human story, way more compelling. I find that kind of overcoming,
Starting point is 01:48:07 like a true overcoming, much more compelling than someone who's like just feeding off power. I think the thing that interests me about it is as big as this world is, and as complex and dynamic as it is, single individuals radically changed the face of the world. Oh, I agree, yes. I find that fascinating.
Starting point is 01:48:33 Those figures, they don't do it for me. Yeah, they don't get you going. I'm kind of like, towards them, you know? For the listeners, she just kind of, it was an interesting one. It wasn't an eye roll. It was a back and forth, side to side. Speaking of.
Starting point is 01:48:49 Go ahead. Eye roll. You found the origin of your. I figured it out. I figured out where my eye roll comes from. We thought it was an Indian thing. Or just maybe a genetic innate thing. Thought it was maybe just a full resentment
Starting point is 01:49:05 I have of everything and everyone. We didn't know, but I knew that's not right. That's not it. It's a habit, but why? And now I know. Well, you sent it to me, so I saw it. Well, I'm gonna show the world. The world, show the world.
Starting point is 01:49:19 And I'm gonna have to describe for the listener because let's just be clear, 98% of our audience is still just listening and not watching. Check us out on YouTube and you can see this. Yes, please do. All right, so for the listener, it is a two- or three-year-old Mary-Kate and/or Ashley Olsen from the Full House program.
Starting point is 01:49:38 It says, duh, across the screen. She's shaking her head and she gives the most expressive eye roll you've ever seen. And she has, or they have, enormous Disney eyeballs where it's very expressive and clear. Yes. Yes. We got it?
Starting point is 01:49:55 We got it. All right, now Full House was my original friend. I was obsessed with it. The only time I was ever punished by my parents, the punishment was I couldn't watch Full House that night. That's in my cells. That's where I got it. I got it from original Mary-Kate and Ashley Full House.
Starting point is 01:50:14 You started probably reenacting it. Always. Yeah, aping it. Yeah, mimicking. They were my models then and now. Yeah. It might be all the way though at the end. I think it is.
Starting point is 01:50:27 They might be your Aaron Weekly. I mean, you already have your Aaron Weekly. I think it would be sad if they're my Aaron Weekly because they don't know me, but they are my ride or die. What I'll say is they're radically different people, which is so fascinating.
Starting point is 01:50:47 Well, they're not identical twins. You know that, right? They have to be. Baloney. They're fraternal twins. No. Well, sisters have never looked that much alike. It's crazy.
Starting point is 01:51:01 Do we know this for positive? God, don't make me, okay. AI Google says that. Thank you. I know, I know. They're not? That's like me saying I know something about Valentino Rossi.
Starting point is 01:51:14 Ooh, tell me, what do you know? I know nothing. Yellow 46? Exactly, that's the whole point. I'm impressed you remembered his name. Thank you. So yeah, fraternal. One's left-handed, one's right-handed.
Starting point is 01:51:26 But that's super common in twins. And one's one inch taller than the other. Even when they're identical. Well, they're fraternal. But that could be a posture thing. Okay, whatever. All right, let's stop. They're fraternal twins. I believe you.
Starting point is 01:51:38 And- You'd never know it by looking at them. Don't judge a book by its cover. You would not. I mean, I agree with you. It's shocking. Yeah. I've met a lot of boy, girl twins
Starting point is 01:51:49 and they have all had the experience where someone asked if their twin was identical, even though they knew one was a boy, one was a girl. What? Yes, I'm telling you. Okay, well, some people don't understand twins. They don't understand what identical means versus fraternal. Yeah.
Starting point is 01:52:09 They must not, or... It must be way lower percentage that you get a boy and a girl than same gendered twins. For fraternal? For fraternal. I think the opposite. I feel like most fraternal twins I know are boy and girl.
Starting point is 01:52:25 Oh really? That's why they are very confusing. We should have a twins expert on. Because what that means is that there were two ova in the uterus and that one male sperm and one female sperm hit the two. And generally you would think, well either the males were making it because they swim slower and they're more robust
Starting point is 01:52:44 or vice versa and one swims fast. So it's weird that one would swim fast, but you know what I'm saying? I don't know. It seems like it's- The body is a wonderland. It is a wonderland. John Mayer.
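For what it's worth, the question they're circling, whether boy-girl fraternal twins should be rarer than same-sex ones, comes down to simple counting. A minimal sketch, assuming each fraternal twin's sex is an independent 50/50 draw (the real ratio at birth is closer to 51/49, which barely changes the numbers; identical twins, by contrast, are always same-sex):

```python
# Minimal sketch: enumerate the four equally likely sex combinations for a
# fraternal pair, with each twin an independent 50/50 draw.
from itertools import product

outcomes = list(product("BG", repeat=2))        # BB, BG, GB, GG
mixed = sum(a != b for a, b in outcomes) / len(outcomes)
print(f"P(boy-girl pair) = {mixed:.2f}")        # 0.50
print(f"P(two boys) = P(two girls) = {(1 - mixed) / 2:.2f}")  # 0.25 each
```

So Monica's instinct is the right one: about half of all fraternal pairs should be boy-girl, the single most common outcome, and a boy-girl pair can never be identical.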
Starting point is 01:52:53 All right, let's do a little bit of facts. This is for Ken Goldberg. He was wonderful. I really, really liked him. Yeah, what a unicorn. A lot. Okay, now this episode starts with your underwear on the floor.
Starting point is 01:53:07 Which was interesting. That was shocking. That's an experience, to look down and your underwear is outside your pants. Cause your first thought is, my underwear. Yeah. Came off my. With doll.
Starting point is 01:53:18 Yeah. It doesn't seem to be torn in half. Yeah. That's a real like, where am I at in time and space that my underwear has made itself off of my body and onto the floor. I mean, it's so obvious later when you think, it was clearly in my pant leg.
Starting point is 01:53:32 I know, but in the moment, you can't think straight. My underwear is falling off. It's like in I Think You Should Leave, when Robinson, they put a whoopee cushion on his chair and he doesn't understand it. He goes, what happened? Like he really is shook, because he didn't feel himself fart,
Starting point is 01:53:47 but he heard a fart, what happened? Oh my God, that's so funny. Okay, but also, so that happened, the underwear, but then I realized when I was editing it, the inside out of my pant pocket- Was exposed? Was exposed the whole time. That's a weird coincidence.
Starting point is 01:54:07 It is weird, but no one caught that, so the whole episode, the inside of my pant pocket is out. Which people could have thought might be her underwear. Like, you know it's the lining of your pocket, but other people could be like, why are both of their underwear falling off? Ha ha ha ha ha. Now the vaccination mark. The smallpox vaccine scar is a small mark you might have on your
Starting point is 01:54:30 upper arm if you received the Dryvax or ACAM2000 smallpox vaccines. It's a sign that the vaccine successfully spurred an immune response in your body to protect you against smallpox. Not many people receive a smallpox vaccine today, so the scar is far less common than it used to be. The smallpox vaccine leaves a scar because it causes a minor infection in your skin. Your body fights off the infection,
Starting point is 01:54:56 but this process leaves behind a small mark on your skin where the infection and related inflammation took place. That makes a lot of sense. I assumed wrongly now that it had something to do with the mechanism of injecting it. Like, did they use some weird thing? Because again, my dad's was, I have such a good memory of my dad's.
Starting point is 01:55:15 I don't know that my mom has one, weirdly. But my dad's is like seared in my brain. And I was like, it looked like, I think I said, a cigar. Like they administered it with a burning cigar. You can look at pictures online, they have them. And they do look like that. Okay, the book, the scientific management book that was influential on Stalin is called
Starting point is 01:55:32 The Principles of Scientific Management by Frederick Taylor. See, my diatribe on Churchill paid off, because Stalin was the trickiest figure in that triumvirate. Okay, now, so Kim Kardashian posted some pictures with the Optimus robot, and it said that Elon gave it to her, and she denies that she was paid for those pictures. Okay, other than the free robot.
Starting point is 01:56:01 Right. That she may or may not have. Right. Okay. This is what it says the robot can do, the Tesla Optimus robot. Okay, it says it can do physical labor. It says it can move materials, assemble parts, and load items onto machinery.
Starting point is 01:56:22 Okay. Yeah, I'm skeptical of that. I'm skeptical. That's how we do it without getting sued. I'm highly skeptical. This is also on the AI overview. So they're like buddies. Okay.
Starting point is 01:56:32 So he's got. They're all in cahoots. Yeah. Inventory management, Optimus can use barcode or RFID scanning to track inventory in real time. Home chores, Optimus can carry groceries, help the elderly, and perform other home tasks. What if it only helped the elderly?
Starting point is 01:56:51 I mean, that would be good. Data collection and research. Optimus can be used in labs or remote monitoring environments to collect data. I mean, that's just like the brain. That's a computer. Yeah. Smart home integration. Optimus can link up with Tesla cars and energy systems to become part of a smart home. Optimus can walk among people and serve drinks at a bar. I doubt it, but I'm sorry, I'm skeptical. But have you heard about that?
Starting point is 01:57:15 Okay, apparently there's a place in like Culver City or something that is run by, it's like a burger place that is run by robots and the robots drop off your food. Okay, I think I've heard that, but also my assumption of what that was was like very simple mechanized arms, not bipedal robots walking about.
Starting point is 01:57:35 Like it can make it in the kitchen and it goes on a conveyor belt and then it lands exactly in front of your thing. Doesn't necessarily mean that a bipedal robot carried it as much as there might be automation that gets it all the way to your. I think it's saying it delivers it to your table, but it might not be bipedal. We should go.
Starting point is 01:57:52 We should go. I'd love to go to a robot restaurant. What is it? Cali Express in Pasadena. Oh, it's in Pasadena. That's much closer. Yeah. That just upped the odds of us actually doing that by a lot.
Starting point is 01:58:02 I do think there's a little guy that rides around and brings the food. A little fowler? CaliExpress by Flippy, the world's first fully autonomous restaurant. Grill and fry stations are automated. It looks like a little thing with serving trays and American flags that goes to your table.
Starting point is 01:58:20 We'll have to go. But okay, it says, Optimus can perform precise movements and heavy lifting. Optimus can adapt its behavior over time to reach the desired results. Optimus can play games like rock, paper, scissors. Okay. So, anyway, that's what AI claims its buddy Optimus can do. Okay.
Starting point is 01:58:43 They're best friends. Okay. And our robot feels a little left out. No, he's more boy-like, remember? AI claims its buddy Optimus can do. They're best friends. And our robot feels a little left out. No, he's more boy-like, remember? Big time glass half full. He's wondering what's going on, because there's a lot of other robots now. There are a lot of other robots.
Starting point is 01:58:57 But he's becoming charming and flawed, Wabu Sabi. Robby Sabi. Robby Sabi. Robby Rob Sabi. Wabi Sabi. Wabi Sabi. Robi Robi Sabi. There was a Prada has these bag chains that I really want that are robots. Bag trash, is that what it's called?
Starting point is 01:59:19 No, it's a bag chain. I'm learning this from Nicole. This is the movement now is like you have these very fancy handbags and then you put all these little trinkets that pour off the side. And I think she calls it like bag trash. Oh, she might, but they're called bag charms.
Starting point is 01:59:34 And look, Prada has this one. This one's in like, this one's in like snow gear. Yeah, that's really cute. Isn't it? Yeah, it's really cute. Isn't it? Yeah, I was about to be critical. I just think it's funny, fashion is very funny. Sure.
Starting point is 01:59:50 So you get this perfect, outrageously expensive bag and then you're supposed to like drape some trash off. It's like downplay it. It's like, what's happening? I agree, but it's not trash. This is $1,100. Well, I didn't say it was inexpensive. Oh.
Starting point is 02:00:03 Yeah. Well, okay, but I agree. I would not put, people love bag charms, and I think that's great, and it's a way to show your identity. But they're not for me on my bag, but I want this little robot to just sit in my house.
Starting point is 02:00:17 Yeah, yeah, that's great. Yeah, he's pretty big. Look at him compared to the bag. Oh, that's preposterous. He's larger than the bag. Yeah. You said 39% of US jobs are still manual labor. The Bureau of Labor Statistics
Starting point is 02:00:30 reported that 39.1% of the civilian workforce in the US performs physically demanding jobs that require lifting, carrying, pushing, pulling, kneeling, stooping, crawling, and climbing activities in varied environmental conditions. Sucking, fucking, don't leave out sex workers. That's manual labor. No, we like, don't we honor sex workers?
Starting point is 02:00:50 Yeah, but I'm just wondering, is it really manual? Yeah. It's definitely manual. It's laborious. All right, well, that's it for Ken. I'm glad we ended on that note for Ken. I think he would appreciate that. All right. All right, bye Ken. I'm glad we ended on that note for Ken. I think he would appreciate that.
Starting point is 02:01:06 All right. Bye Ken. Love you. Love you. Love you.
Starting point is 02:01:19 Music, or wherever you get your podcasts. You can listen to every episode of Armchair Expert early and ad free right now by joining Wondry Plus in the Wondry app or on Apple podcasts.
Starting point is 02:01:37 Before you go, tell us about yourself by completing a short survey at Wondry.com slash survey.
