I Don't Know About That - Chatbots

Episode Date: November 21, 2023

Katherine Munro (@KatherineAMunro) returns to update us on more scary technology: Chatbots. Check out Katherine's book she co-wrote called "The Handbook of Data Science and AI: Generate Value from Data with Machine Learning and Data Analytics": http://tinyurl.com/38jbm4pv

Transcript
Starting point is 00:00:00 The Boer War. Was it just that it wasn't very entertaining? What? What's happening? Is this a whole lot of dull people stabbing each other? Wait, what are you talking about? The Boer War. We've never done the Boer War.
Starting point is 00:00:21 What's the Boer War? The Boer War was before the First World War. It was in South Africa. Boer? B-O-R-E. Oh. I don't think it's B-O-R. Maybe it's B-O-H-R?
Starting point is 00:00:32 It could be B-O-A-R. What is it, Aaron? B-O-E-R. B-O-E-R. Oh, there was a second Boer War, too. Do you say Boer? Yeah, the Boer War. Ah.
Starting point is 00:00:41 The Boer War. It's like a big war. There's two of them. Yeah, I don't know anything about it. Yeah, I don't know about these. Australia fought in there. England fought in there in the Boer War. It's like a big war. There's two of them. Yeah, I don't know anything about it. Yeah, I don't know about these. Australia fought in there. England fought in there in the Boer War. If America ain't involved, I don't care.
Starting point is 00:00:50 I thought you just were going Boer War. No, the Boer War. I thought it was going to be like the Eby War where Australia was trying to kill all the Boers. No, it was something. I don't know what happened. It was something in South Africa. Am I right, Aaron? The second Boer War was also known as the Boer War.
Starting point is 00:01:00 The Anglo-Boer War or the South African War was a conflict between the British Empire and the two Boer Republics over the empire's influence in South Africa from 1899 to 1902. So this is basically white people getting into South Africa? Yeah. Oh, that's not good. Nah, that's not bad. Boer War looks pretty bad.
Starting point is 00:01:21 What's up with that? I have welts on me still. Yeah, we went paintballing. Yeah. How did that go? I still have one there.
Starting point is 00:01:32 Look at that one. And then there. Who got you? I got one. Jim Mosley. I shot him a lot. Jim Mosley. To be fair, one time I was drawing you to me so you could get killed, but I thought I would still survive.
Starting point is 00:01:46 But it hurts when you're there. Some of the kids were being a bit mean about Forrest at one stage, you know, because we were doing like – They weren't being mean. They didn't want me on the team. All great kids, all great kids, really nice kids, but we're about to go out and we're doing like we get one adult on each team type of thing.
Starting point is 00:02:01 Some of the adults, I thought one of the parents would play, but they're like, no, I don't want to be hurt be hurt right so we get out there and then forrest is on team like this and then i'm like all right i'll be on a team with hank and this kid and that kid and then that and then they're like ah because i was doing quite well i turns out i'm a handy paintballer i am i was doing all right i've got no welts on me killed a lot of people it's an a-late gun yeah yeah and uh and the kids are like this and you get forest and they're like this and then they're trying to convince the other team like this. You know when you're a kid?
Starting point is 00:02:27 Yeah. No, he's really good. He's got good aim. I'll tell you what. Yeah, but he moves really slowly. No, no, he's got good aim. And his strategy is really good. But here's how quickly little kids will change their mind.
Starting point is 00:02:42 So I played the first round. You play like three rounds and then like a round, right? And then the second, my knees still messed up. So I was like, I got to sit on the second one. So I was moving slow. So the third one, we go do it. And the very first one, remember, it's just me and you survived.
Starting point is 00:02:54 We were looking for each other. We didn't get each other. So all the kids are dead, except me and Jim were just looking for each other. And there's a referee and he blows the whistle. Like, all right, that's time. So we walk over and then the two kids that were around my team
Starting point is 00:03:05 are following me and they're like we're gonna follow you because you're really good at this like all of a sudden now i'm good at it because i survived they're like this guy knows what he's doing i'm like don't follow me you know well you weren't you weren't at this one but there was like there was like the guys who play paintball every weekend i saw it like a fight fight breakout amongst them oh geez and there was this is a guy who's like so confident. He doesn't even wear protective clothes. He just wears like a Larry Bird Boston Celtics singlet. Oh, God.
Starting point is 00:03:32 And shorts, and he's out there shooting people like this. And he had long hair, and he ran through, and then he slid in. He went boom, boom, boom. And he's like, I shot you, dude. Dude, I shot you, and you still shot me back. And then the guy's like, no, I shot you, dude. Dude, I shot you. And you still shot me back. And then the guy's like, no, I shot you first. And he's like, after I blew off your fucking face, buddy. This guy's like in his 40s.
Starting point is 00:03:53 After I blew off your fucking face, then you shot me. Come on. 1v1. Who wants to see it? Who wants to fucking see it? I didn't see that me me and the kids were all sitting back i was with the whole 11 year old it's like i want to see it there's so many people like that out there though i didn't see that incident but uh there were so many people out there that i was like what the like full fatigues like canteens it's like you can't even take your
Starting point is 00:04:23 mask off to drink you don't need a can like you couldn't even drink out of the canteen if you wanted to, so you're just carrying it as dead weight. They're out there like when you're on the playing field and stuff. Paint grenades. Yeah, paint grenades. It's like. I watched a YouTube video. It was like these Navy SEALs played paintball against just normal people,
Starting point is 00:04:39 and the Navy SEALs just wrecked people. Of course. It was like five seconds. Yeah, they were like, okay. You course. It was like five seconds. They just went, vroom, vroom. Yeah, they were like, okay, you know what else was funny? They would be like,
Starting point is 00:04:49 all right, you can't go on the second levels of certain things and then remember, Jim was like, can we go on this bus? The guy's like, no, there's a lot of glass
Starting point is 00:04:55 and nails in there. You're like, what are you doing? What kind of facility is this? There's a lot of broken buses and rusty nails, so don't go in the bus. And you're like,
Starting point is 00:05:03 why wouldn't you just sweep that out this is the whole thing about the whole thing you run around like don't have rusty Dave hasn't gotten around to it yeah yeah
Starting point is 00:05:10 I said it to him I was like it's the most disorganized organized place you'll go to it feels like everything's about to
Starting point is 00:05:16 just fall apart it's a hell of a business it's fun it was fun it's got lots of big maps it's a big place where is it Bellflower
Starting point is 00:05:24 Bellflower it's like it's a big place. Where is it? Bellflower. Bellflower. It's like... It's a little bit inland from... Hollywood Sports Paintball. Yeah, it was something else. It's got like a Mad Max map and a description map. It used to be a driving range. So the driving range area, they made it into like seven or eight mazes.
Starting point is 00:05:37 Cool. And they have like a couple other ones over here. So they have this huge... And you walk down like this tunnel and they have like one of them has inflatables. Another one has like old structures and stuff in it. There's old beaten-up cars to hide behind and things like that. It's cool. It's very fun.
Starting point is 00:05:51 It was really fun. If I was a little kid and had a birthday, I would have fun. But all the kids did go through a cycle of each one had a moment where they're like, ah, that was a little hard. When you shot them, it does hurt as well. But the kids are funny. It's funny. You shoot them in the arm. They hold their arm. And then you go, you're all right it does hurt as well. But the kids are funny. It's funny. You shoot him in the arm.
Starting point is 00:06:05 They hold their arm. And then you go, you're all right, you're all right. They go, I'm okay. And they always limp away. And I'm like, I shot you in the arm, man. Why are you limping? Because now when you're a kid, that's like the I'm injured walk. Yeah, yeah.
Starting point is 00:06:20 Oh. This week is Thanksgiving, actually. Jingle bell Thanksgiving. Thanksgiving rock. I'm doing a traditional Thanksgiving this year because my wife wants to. I normally do an untraditional Thanksgiving and then a traditional Christmas, which is the same meal as Thanksgiving in the rest of the world. I don't like double down on turkey.
Starting point is 00:06:45 I'm not a big fan. It's a month apart. It's a so... Yeah, but it's so... But my wife has someone coming over to cook, one of her friends, who's a good cook, and we're having the big turkey. Forrest isn't going to be with us this year.
Starting point is 00:06:57 I'm going to your hometown, Atlanta. You're going to Atlanta? Yeah, I'm going to Atlanta. Will you be in Atlanta? No, I'll be here on my own Thanksgiving. Okay. I'm going to get a Popeye's turkey.? No, I'll be here on my own Thanksgiving. Okay. I'm going to get a Popeye's turkey. The week after this.
Starting point is 00:07:08 You always get a Popeye's turkey. Yeah, because it's great. The week after this, you'll be at December 1st, my birthday, and December 2nd, you'll be in Tysons, Virginia. West for Tyson. Right near D.C., right near Washington, D.C., about 20 minutes outside of there, 25 minutes. And then the week after that, you're at the Mirage, December 8th and 9th in Las Vegas. And then the new year, you've got all sorts of dates,
Starting point is 00:07:30 Baltimore, Boston, Sacramento, India. Baltimore. Des Moines, Kansas City, South Africa, Spokane, Denver, so on. Go to jimjeffries.com. They're all on there. Spokane, Denver. They're all the towns. And go to IDCat Podcast on IG.
Starting point is 00:07:46 Follow us on there. And you can see clips from the podcast. And we used to put extra stuff on there. We should do that again. Yeah. Yeah, put my paintball welts on there. Paintball welts. Go to 4shot.net.
Starting point is 00:07:57 I got a bunch of shows coming up. Also be in Vegas in January at the Comedy Cellar. But follow me on there and listen to my other podcast. You're not doing the December gigs? I am. But I'm saying my own gigs, you know. and listen to my other podcast. You're not doing the December gigs? I am. But I'm saying my own gigs, you know. And listen to my other podcast, The Mermen Podcast.
Starting point is 00:08:10 Mermen! The Mermen, not Mermen. Mermen Podcast. Download it, listen to it, subscribe. You can't see it because they've got tails that have penises. Yeah.
Starting point is 00:08:18 If anyone's trans, it's the mermaid community. Yeah. I think there's something a little bit fishy about him. Okay. All right. Now let's meet our
Starting point is 00:08:29 guest. Please welcome Catherine Munro. G'day Catherine. Now it's time to play. Yes, no. Yes, no.
Starting point is 00:08:37 Yes, no. Yes, no. Judging a book by its cover. Yeah. She's a repeat guest, Catherine. Really? Oh, God.
Starting point is 00:08:48 We've had a lot of guests. She came to your show in Zurich. Oh. Yeah, we just met. Yeah, we just met. Okay. And so Zurich. That's right.
Starting point is 00:08:57 What was the topic that we did? You remember? Yeah. You remember Zurich? Yeah, I remember Zurich. Yeah, yeah, yeah. Yeah, sexy dialect. Yeah. Yeah.
Starting point is 00:09:05 Yeah. I think when we interviewed her last, when we had her on the podcast last, she was in Austria, correct? Right, yeah. None of this will help you. Yes, yes. I can remember the one in Germany that we did with the fetish sex worker, and I know that's not – this isn't her.
Starting point is 00:09:23 That's Daza. That was Daza. Yeah, so I can't make that mistake. Very different. I remember, what was the bloody topic? Give me a hint what the old one was. I don't know if it gives you a hint but you did say that I was going to kill
Starting point is 00:09:37 everyone and I haven't yet. Oh, AI. We're talking about chatbots today. Yeah, I don't know a lot about this. I stay away from it because I don't want the Terminators coming. Yeah, sure. Well, we're going to talk about it. I really believe that.
Starting point is 00:09:53 I'm not, like, joking. I know, I know. I think AI is bad news, man. Bad news. We should stop it. I thought once we got with the internet, once we got past online banking and netflix they should have shut the whole thing down okay well katherine monroe is a data scientist and data
Starting point is 00:10:14 science ambassador with a background in computational linguistics and natural language processing she's a speaker teacher and co-author of The Handbook of Data Science and AI. And it's available on Amazon. And she also uses her blog and Twitter account to talk about all things data science, AI, and our future with technology. You can find her on social media, on Twitter, X, whatever you like to call it now. Catherine A. Munro. That's M-U-N-R-O. On Medium, at Catherine A. Munro.
Starting point is 00:10:43 And Substack blog. She has a Substack blog. It's beyond the buzzwords and it's CatherineMonroe.substack.com. Tell us a little bit more about your book. What that is, the Handbook of Data Science and AI. Yeah, I'm the editor and co-author. I wrote the chapter on natural language processing and it's basically just a practical, pretty broad spectrum look
Starting point is 00:11:06 at everything about getting value out of data for, yeah, lots of different audience levels. And, yeah, you should totally go out and buy it and find out more for yourself. What was the other podcast? What was that one? AI. That was AI.
Starting point is 00:11:18 So this one's not AI? Chatbots. Chatbots? Like ChatGPT? I don't know what that is. We've done it before where i know that people type things in and kids are cheating on exams because they're getting people to write their essays for them yeah you ask your questions and i don't know how to find it or anything
Starting point is 00:11:34 uh we've done it on the road a couple times but other people have done that for me yeah okay well we'll see i know there's a lot of photos of me is like like amos sits there and goes here's a picture of you as a gladiator yeah cool yeah yeah thank you and it's the thing is because like it's never a photo of me at my best it seems to always like even out my weight yeah it probably averages all the average is all yeah so I'm like sort of like why am I always bloody little puffy why does AI want to mock me
Starting point is 00:12:06 so it gives you a lot of your old facial hair like from the FX red carpet days yeah I got a little bit of facial hair
Starting point is 00:12:13 well I'm going to ask you some questions we can talk about whether it's going to kill you or not later if you want and like my hair lines in between
Starting point is 00:12:18 yeah I'm going to ask Jim a series of questions about chatbots Catherine at the end of them answering you're going to
Starting point is 00:12:24 grade them as accuracy 0 through 10. It's killing us all, man. Jack's going to grade him on confidence. I'm going to grade him on how hungry I am. And we'll add those scores together. I don't even know what this is, Jack. I asked Jack to do the categories. What's Cortana?
Starting point is 00:12:40 That's the Microsoft assistant that was based off of Halo. I don't have any of these assistants in me house. I don't ask anybody anything in me house. 21 through 30, you're Cortana. There you go. 11 through 20, Siri. They're fucking listening to you, man. Zero through 10, Clippy.
Starting point is 00:12:58 Do you know Clippy? I know Clippy. Clippy's from Microsoft. Yeah, Clippy sucks. Clippy's all right. Clippy at least came up to your sister every now and again. Clippy was just a simple tool that's like, ah, you missed a comma.
Starting point is 00:13:09 I'm Clippy. Clippy never showed up and said, I'm building robots, you're going to die. Clippy wasn't the problem. All right, here we go. First question, what is a chatbot? They're those women who write to you all the time that say they want to get to know more about you.
Starting point is 00:13:29 No, they're real. I don't think you're wrong. They're real, I swear. This is the thing. This is what I'll tell you about those fake profiles of women. Why don't you make it a more homely girl? Yeah. Then you'd catch me.
Starting point is 00:13:41 Yeah, yeah, yeah. Don't send me the nines and the tens. Yeah, that's true. Send me sixes and, yeah. They're like, don't send me the nines and the tens. Yeah, that's true. Send me sixes and sevens. They're a bit thirsty. It's a super, it's like a model, and she's like, I'm horny. Yeah, yeah, yeah. I have two followers.
Starting point is 00:13:53 Yeah, I'm at home and I'm horny. And then they post these really hot pictures, and they only have two comments underneath, and you're like, maybe this is an uncut gem that's just been found out in the wilderness. If I jump onto this, I could be the first person she might not know. Chatbot? Do you know what a chatbot is? Oh, I thought that's what it was, a chatbot.
Starting point is 00:14:12 Someone who chatted to you that's a bot. Okay, so I assume it's something that you put information in, and then it runs with it. How do chatbots work? Do you have any idea? Artificial intelligence, man. We're feeding the computers information. Can you tell? Oh, we might have to find this, man. We're feeding the computers information. Can you tell...
Starting point is 00:14:27 Oh, we might have to find this, actually. They're listening to you. The phones are listening to you, and they're stealing your information, man. Can you tell... Do you have the examples up of the voices? I couldn't get good ones, but I have voicemails. Yeah, voicemails.
Starting point is 00:14:43 Yeah, I got some voicemails. Oh. Someone comes in from... No, no, that's what I was going to do, whether this is real or not. Can you tell me if this is real or not? You got a voicemail. So now people can ring up. Yeah, but it is good.
Starting point is 00:14:56 We'll do it later. We'll do it later. We'll do it later. We'll go back to it. Send me money. We got two voicemails. Yeah, we'll go back to it. No, no, no, it's not you.
Starting point is 00:15:02 We'll go back to it. They're from fans of the show. Could a computer pass the Turing test? And this is going back to the last episode with Catherine. We asked you what the Turing test was. She put your answer. You said, do you want to know what the Turing test is? What was my answer?
Starting point is 00:15:17 I said, do you know what Turing is? He made the first computer. He was also homosexual. I don't know what the test is, but I think I could pass it. It might be a computer test to see if you're homosexual or's not what it was you were actually closer when you get a little bit of a quiz and it's like bands that you like and there'll be like a picture of liberace a picture of metallica and you're gonna press metallica because i don't want to be a homosexual or i do want to be a homosexual so could a path could a computer pass the turing test
Starting point is 00:15:43 ah yeah i'm gonna say yes i'm gonna pass the turing test ah yeah i'm gonna say yes i'm gonna say the turing test is something to do with like counting or something like that because the way they figured out how turing figured out is because they changed the nazis changed the letters all the time and it was hail hitler the end yeah for the yeah so i think a german computer could pass it because it would be in German. Okay, so you do know Turing. I thought you were forgetting. Yeah, yeah. No, Turing invented the first computer, and then he wouldn't even take –
Starting point is 00:16:09 he wouldn't even shag Keira Knightley. I don't know if it was the first computer, but it was like – That's weird, isn't it? It's so wrong. It's like there's gay, and then there's not having sex with Keira Knightley. I'm like, come on. Come on. That's the Brad Pitt of women.
Starting point is 00:16:24 What happened when Microsoft introduced their chatbot? We'd all have sex with him. What happens after Microsoft? No, what happened when Microsoft introduced their chatbot? The robot started fucking militarizing. Is that Tay Tweets? It's Tay Tweets. That's the name of it?
Starting point is 00:16:43 Yeah, it was big in the news. It was a bit of a fail. Their chatbot was called Tay Tweets? It's Tay Tweets. That's the name of it? Yeah, it was big in the news. Do you remember? It was a bit of a fail. Their chatbot was called Tay Tweets. I have a fake AI person who wrote to me the other day. He said, I'm your new big sister. You can ask me anything. And she's got a blue tick next to her.
Starting point is 00:17:01 I can show you on my Instagram. And then I thought, oh, yeah, you just talk. She goes, if you ever got problems or anything, you talk to me. I'm your AI big sister. But she's about 19. Not my big sister. My big sister should have osteoporosis. So do you know anything about Tay Tweets, Microsoft, when they introduced this, what it was or what happened? Tay was a guy that could just tweet back to you answers. He was the first domino push of the AI, artificial intelligence that was talking to you.
Starting point is 00:17:29 He was an AI character who had his own... And what happened when they introduced him? Oh, people lost their minds. It was a she. A she. Ah. Tay-Tay. We found out how sexist Forrest was.
Starting point is 00:17:42 I can't... You said it was a guy. I can't perform surgery on this job because it's my son. Why does it happen? Ha, you're wrong. No, I was just leading you along with your answer.
Starting point is 00:17:53 The doctor is a woman. It's his fault. Okay, moving on. Sexist. What happened to the lawyer? What happened to the lawyer who used chat GPT to prepare his court case?
Starting point is 00:18:03 His, right away. I'm pretty sure I'm accurate. The assumption. You think lawyers, women can do more than just cook their food. There was a specific lawyer. There was a specific case. Oh, okay. Thank you.
Starting point is 00:18:16 Thank you. Thank you for correcting him. There was a specific case where a lawyer who was a man used chat GPT to prepare his court case. What happened? He got OJ off. off okay what is generative ai uh generative that's where it generates its own ideas it doesn't even have to have things put into it yeah probably got that right yeah that's one of those ones that you that you actually worked out for you i have i have critical thinking what what is a large language model you can do the
Starting point is 00:18:44 same what is it what is a long language large language model that would be claudia schiffer she was a tall woman with a german accent good answer i'm not even putting your answers in here actually just because i don't think you're getting any of them right okay yeah i got some all right katherine's taking notes all right i'll take it from my workshops i teach this stuff so this is good uh is that your final answer claudia shepard i can go longer if you want what is chat gpt and how did they make it so it's a specific chatbot uh it's it's a search engine thing where you can say type me an essay about this or write me a joke or do me a this or do
Starting point is 00:19:25 me a that and then it'll do it and how did they make it with uh coding yeah you need a coder yeah okay well that's good you know when you see any movie and there's nerds just there's numbers yeah they know what's going on man sure they're making the roads that's the thing is the nerds man the nerds i wish we bullied them more because now they're making zeros that's the thing is the nerds man the nerds i wish we bullied them more because now they're making the robots to kill us fuck them more wedgies you think this is what's happened yeah the nerds have made the robots to kill us yeah and you don't think you're a nerd i'm i like star wars you're in a pinball competition i like pinball yeah okay that's there's a misfit you have fl Yeah. There's a misguided...
Starting point is 00:20:05 You have flaky skin. There's a misguided... That's true. I do have flaky skin. I have psoriasis out of one of my ear holes. It makes it look like I always have dandruff on one shoulder. It's not dandruff. It's not dandruff.
Starting point is 00:20:15 It's just psoriasis of the ear canal. It's my ear. And then people are like, oh, that's good. Oh, that's so much better. All right. What is prompt engineering? I'm not a nerd. What is prompt engineering? You're not a nerd. What is prompt engineering?
Starting point is 00:20:26 You're a nerd. Fine. Okay. What is prompt engineering? Girl, I have sex with a manatee, a nerd. You want to answer some more questions? Sure. What is prompt engineering?
Starting point is 00:20:38 Engineering that happens quickly. Boom. Crushed that one. Crushed that. No, I knew you would say that. Once again, I write these questions and I know how you will answer them. Gather all these questions.
Starting point is 00:20:49 She prepared this. That's a note to all you other experts that want to be on the show out there. Please help. The questions are better when you help. What's the best way to get a chatbot to do what you want? Give it compliments yeah tell it it's pretty sure and a couple of drinks couple of drinks and a computer yeah yeah and uh yeah just compliment it like anything what is so good at this i could never. Versus psychology will get it every time. That's so true.
Starting point is 00:21:27 Oh, it works. I've got a two-year-old. I'm caning it with versus psychology with a two-year-old. Don't you eat that broccoli. I want to eat that broccoli. And then he eats it instead. Ah, versus psychology wins again. What is prompt injection?
Starting point is 00:21:40 Prompt injection is what a junkie does as soon as he gets paid. Even better answer. These are just fun. I'm going to keep asking you. None of them are right. What are some downsides, dangers of using chatbots? Oh, we all die. Okay.
Starting point is 00:22:00 No, it really is the downside. Okay, first of all, we don't all die, first of all. It's going to take a while before the robots rise up and kill us, right? And they'll never figure out time travel, because if we've got to figure out time travel, someone in the future would have already done it. So don't worry about it. What do you mean rise up?
Starting point is 00:22:14 They're not like living things. They're just. They've got intelligence. Once they figure out they can run everything, they're all about efficiency. Yeah. We're inefficient. Yeah.
Starting point is 00:22:23 Right? I don't know if you know much about the human race but 50% of the population is dog shit at their jobs and the robots I would say 70% there's 20% who are good 10% who are elite and then there's that another 20% who are
Starting point is 00:22:38 I'm passable and then 50% are fucking dog shit if you've worked in any business and if you're looking around your workplace right then 50% are fucking dog shit if you've worked in any business. And if you're looking around your workplace right now going, there's no dog shit, it's you, you, you're the dog shit person. Thank you for listening. Anyway, so what was the question?
Starting point is 00:22:56 Downsides, you covered it, we're going to all die. Okay, so the downside is, first of all, it starts off with, before the robots rise up against this, it starts with us going through a wally-like existence where they take all of our jobs now i have a way to work around this is we still have to make employers pay for the workers but everyone just gets a robot each and then your your job is maintaining your robot yeah so lazy will always, like occasionally you can show up and your robot's doing good.
Starting point is 00:23:29 You could have a good quarter, right? Your robot's working, you have a good quarter. And then you go, boom, battle axe on the forehead. Yeah. I've improved my robot. Headlights, speakers, these are things that you can improve. Underglow. Don't be silly, Jack.
Starting point is 00:23:44 Sorry. But these are things you can improve your robotow don't be silly jack sorry but these are things you can improve your robot with okay what is confabulation um that's a homosexual robot he's confabulous uh or she or she can be do you know how to spell confabulation uh C-O-N-F-A-B-U-L-A-T-I-O-N Oh wow, Harry marked us down. Big day. Nailed it. I misspelt father yesterday.
Starting point is 00:24:22 I was writing further or father like dad? Like dad. I put an R or father like dad? Like dad. I put an R in there like farta. That's funny. Father. Father. I suppose. Both are accurate.
Starting point is 00:24:32 And then when I was looking at it, I was looking at it like this. Doesn't look right, that. But that's how it should be written. Put an R in there. What does Elmo, Bert, and other Sesame Street characters have to do with chatbots? They're all controlled by someone's hand. No, what do they all have to do with someone's chatbox? What does Elmo, Bert, and other Sesame Street characters have to do with chatbots?
Starting point is 00:24:56 I would say that they're one of the first original pieces of AI thing that you could ask questions to. So kids would have been able to get an app and then go hey elmo do you like my friend david oh he left david a couple more questions here what is better at solving math problems yourself or chat gbt no chat gbt math problems computers were solving math problems all the time built by math yeah that by math. Yeah, that was their first thing, Forrest. Okay. Do you remember your first computer? It was a calculator, man. I had a calculator watch.
Starting point is 00:25:30 And you know what? It never broke down. You could count on it. Sure. Solar. If you asked a chatbot something like, write a poem which doesn't rhyme and it doesn't do a good job, does it know it messed up?
Starting point is 00:25:42 More than a human would. A human would argue more about that or chatbot where you go hey that doesn't rhyme and then a chatbot would just go whatever yeah sorry about that i'm too busy making a porno with marilyn monroe and jay okay to worry about your feelings all right katherine how did jim do on his knowledge of chatbots zero through ten tens of the best. I'm assuming 10, right? Yeah, 11. Nah, let's go for five. There's a lot of half marks here, so let's go for five. Damn straight.
Starting point is 00:26:12 Damn straight. Damn straight. All right. Okay. How did he do on confidence check? Six. It started off tough, but then he really came through at the end. That's 11.
Starting point is 00:26:22 All right. I'm not that hungry today, so minus eight. How are you not hungry? Just ate. What'd you eat? Like I I'm not that hungry today, so minus eight. How are you not hungry? Just eight. What'd you eat? Like I had a breakfast sandwich I made at home. You make breakfast sandwiches at home? I went through a small phase
Starting point is 00:26:32 of doing that. I had a breakfast sandwich maker. Anyways, you're Clippy. I love Clippy. I know. That's why I made you Clippy. I'm all right with Clippy. Clippy just showed up when needed.
Starting point is 00:26:42 I know. You're a big fan of Clippy. Wasn't it the beef with Clippy? He'd always show up when he wasn't wanted. Not for John. No, for me. He up when needed. I know, you're a big fan of Clippy. Wasn't it the beef with Clippy? He'd always show up when he wasn't wanted. Not for Jim. No, for me. He was always needed. Jim's not a good...
Starting point is 00:26:51 You've read enough stuff from Jim. I didn't know what's going on. I'll tell you the greatest bit of technology is the tidbit. It's as far as technology should go. Jim loves the tidbit. You've finally been able to work into a blog. Last time you said it was a speaking spell. No, no, no. You've finally been able to work into a blog. Last time you said it was a speaking spell.
Starting point is 00:27:08 You said speaking spell last episode. The tidbit is the speaking spell of the future. Do you know the tidbit, Catherine? I have no idea what you're talking about now. Is it an American thing? It's T-Y-B bit. And it's a pixelated box that's yay big that will just give you things like
Starting point is 00:27:22 the weather and sports scores it's like it's like 8 12 bit technology like the graphics really like you know it'll tell you a joke of the day and you put it well explain you put it on your counter you've got an app that connects yeah you can program where you can put in a different few things the tidbit yeah yeah fuck it i don't get paid for tidbit promotions these babies sell themselves we should they should be an advertiser tidbit should try to sell on podcasts because tidbits are quality quality product let's have them call i don't i don't want to make you all i'm not going to tell you what you're getting for christmas because i don't know because i haven't
Starting point is 00:27:58 bought you anything but i don't know what your parents or your friends have gotten you well but tell them to get your tidbitbit. Tidbit. Okay. Let's start answering some questions. Or a manscape or some therapy of better health. The perfect Christmas gift. People get sad over the holidays. Therapy vouchers. I'll go to 10 sessions.
Starting point is 00:28:21 What is a chatbot, Catherine? Yeah, I mean, basically it's a program. Maybe it's made with AI, but maybe it's not. And it simulates text or voice interactions. Okay. How long have they been around, chatbots? A while. Like, even there's a famous one from, I think, the 50s, Eliza,
Starting point is 00:28:41 which was just basically rule-based. It pretended to be a psychologist, and a lot of people actually sort of believed that it was a real person. So, yeah, they're not a new thing. So who was it pretending to be a psychologist? It was like a psychologist. So basically just had some super simple rule-based programming that, you know, if you sort of said, yeah, you know,
Starting point is 00:29:01 I'm feeling a little down today, then it would just have some kind of rule-based answer like okay can you tell me a bit more about that which is obviously very generic like so that answer sort of can't be wrong but it fits to the situation a bit like I guess a horoscope where like you can then you know sort of interpret like oh yeah that's that's a valid response and so it just worked like that and yeah even even nowadays when people like get fooled into thinking that chatbots are sentient this is called the eliza effect because it's like yeah thinking that it's sentient when it's it's just some rules in this case yeah but you work for uh big ai so you're not going to
Starting point is 00:29:36 tell us that we're all going to die you're like and they're just going to enrich our lives we're so going to die we're so going to die from chatbots i already i've lost a lot of money sending it to these people i haven't okay but i mean i mean the people do people there's people who have romances with these chatbots you know it makes me think like psychics are kind of like chatbots because when you were explaining that katherine you're like they you say something and then it already knows the next question it's going to ask to prompt you for something. Yeah.
Starting point is 00:30:08 Yeah. Psychics. Chatbots. There you go. Bit of a walk. No, you know what I'm saying? Because it's all, they don't know shit. They're just pulling it out.
Starting point is 00:30:18 They're using intuition, but they're still giving different answers. They're not just asking questions. Maybe. A chatbot never goes, I sense a, do you have a relative that has a P in their name? Good. Yes.
Starting point is 00:30:31 Do you? They could. Like, I'm sure if you wanted to, you could talk to ChatGPT. You would just say it, the style of conversation, like, you're one of these people. Psychic. A mentalist. You're a mentalist.
Starting point is 00:30:45 Have a conversation with me like a mentalist and it will because it's just read heaps of texts about mentalists doing mentalist stuff so it knows how to pretend to be one. It's just read everything. Are you one, Jay? Am I what? Yes, you are. How do chatbots work? Through being sexy.
Starting point is 00:31:03 What's that? Sexy pictures That's how they do it That's how they get you That's how they get you Yeah Extra point for you Yeah I mean I don't work
Starting point is 00:31:09 With the kind of chatbots That you're interacting with But basically Well wait a minute I don't search them out They search me out I can go through All me messages
Starting point is 00:31:19 In me thing There's chatbots There's plenty We should go through A few of them At the end of the podcast Sure I'll find you a whole heap of them
Starting point is 00:31:24 I got some I haven't responded to any of them at the end of the podcast. Sure. I'll find you a whole heap of them. I got some. I haven't responded to any of them. You can see all the messages I send on Instagram. Yeah, on Instagram, there's another folder, and it's like, hey. My Instagram's an open book, if you want to have a look in there. There's literally, this is how much I don't hit people up or talk on Instagram. Jack, would you say there's about seven people who can access my Instagram? Maybe less.
Starting point is 00:31:45 Maybe five, four or five, right? You can access, I can access, there's people in the thing, right? So maybe four or five people. I couldn't get up to no good on there. No. So I'm not chatting to the sex bots is what I'm saying. If you say different, you're lying. So how do they work?
Starting point is 00:32:00 Again, Catherine? There was one girl. They figure out what you want uh and interpret that how you like but they figure out what you want and then they maybe connect to some other service to do the task or get the information you need like make a booking for you or something like that and then they um they give some human-like response back whether it's you know your internet banking like chatbot getting help or something like that. That's Erica at Bank of America.
Starting point is 00:32:27 No, she's not a chatbot. She really cares about your finance and stuff. Have you spoken to Erica? No, I'm not with Bank of America. She's very flirty. When she sees your bank balance, I don't know if she does it to everybody. Not my bank balance. She goes like this, Hello, big boy.
Starting point is 00:32:45 What are you up to? Someone just did a new special. All right. Were you done explaining that, chatbot? Sorry, Catherine. Yeah, yeah. Okay. Can you tell the difference?
Starting point is 00:32:57 Oh, no. This is the voicemails. All right. So we have a real voice and a fake voice here. All right. Okay. See if you can spot the difference. Yes. Well, folks, let me tell you. I've been watching a lot voice here. All right. Okay? See if you can spot the difference. Yes.
Starting point is 00:33:05 Well, folks, let me tell you. I've been watching a lot of Jim Jeffries lately, and I got to admit, that man's got some real comedic timing. I mean, he can make you laugh so hard, you forget all about his political debates. You know what this is? I told my team,
Starting point is 00:33:18 I wish I had Jim Jeffries timing. And you know what they said? They said, Mr. President, you've got your own unique style. So I guess that means I'll just keep telling those classic Joe Biden gaffes. You want me to answer whether this is true or fake? Is that really true? I'll leave the stand up to Jim.
Starting point is 00:33:34 He's the real pro. That's all of Joe Biden's words. They've all got them and collected into the internet, right? And now they've put them in order to make him serious. One of these is real and one's fake. That was fake. Okay, real one's fake that was fake okay that one's fake so you're saying this next one's real okay yes this next one might be real yeah you know i've been thinking about this i don't know about that podcast with jim jeffries
Starting point is 00:33:54 and i gotta say i'd be the perfect guest for that show i mean my whole career is based on observing the mundane and finding humor in it and let's face it there's so many things in this world that i don't know what's real they're's real. No, one of them's not real. They're both real. They're not real. Neither of them are real. How can either of them be real? Are you saying
Starting point is 00:34:12 one of them's a real voice? Because I know Seinfeld. I'd have a field day exploring all the things I don't know about and I'm pretty sure Jim and I have a hilarious conversation
Starting point is 00:34:20 about it. I've spent enough fucking car rides with you. You would have mentioned that. I don't talk that much. I let you talk about, in every car journey, I let you get 30 minutes in. And you never fucking...
Starting point is 00:34:33 I had a six-hour ride. Yeah, bro. It's all right. You never mentioned your friendship with Seinfeld. Wait, how do the chatbots do the voice? So they just listen, and then they, I don't know, do the mimic? Like, hey, all AI is not magic, even though it sounds like magic. Ultimately, you just need some training data and some labels telling it something.
Starting point is 00:34:57 And so, you know, with a speech synthesis like this, the point is you just have heaps and heaps of hours of recordings that are transcribed and then you can give it to a neural network and it learns how to map you know this audio signal these waveforms to words or vice versa and yeah so that's that's on a non-technical level that's that's it you just get it to keep listening to examples and then keep trying to replicate the examples itself and every time it gets it wrong it it has to try again. And eventually they learn. How is this of any real good practical use? I feel like it can only, the downside is you can
Starting point is 00:35:34 scam people. People can- Yeah, we can jump ahead to that question. Anything like that. So, what is the benefits of it? I can't see the benefits of being able to replica someone's voice there's actually heaps um and I mean maybe not to I can't think of too many that are like amazing for society and humanity but like for marketing and branding and things like that you know if you're if you are a celebrity you can scale yourself in a way that you couldn't actually do so you could have your voice synthesized and have like you know one-on-one chats with fans and they would or you should be honest with them that it's not really you but there's probably a lot of fans who are down for that anyway um like actors and stuff are like there are already ads doing this i think snoop dole did something recently or has partnership like this so there's like branding and stuff or like if you're a teacher you can um uh you know
Starting point is 00:36:25 like and you're making i don't know educational videos you can much more easily scale yourself this can also help with like you know students who need like a person uh to teach them something but maybe they don't have access to that it's cheaper to give them like a sort of an avatar teacher um yeah lots of different things like this and like it's a big business so jenna i have a little fact here i pulled up like mckinsey the consulting company they're estimating that genuvi will add up to 4.4 trillion dollars of value to the global uh economy annually and you know as i said that's not necessarily going to be saving humanity but it is definitely a big deal and there's lots of practical applications for it.
Starting point is 00:37:05 As long as they're paying their taxes, yeah. But it always feels as soon as someone earns over a billion dollars, we're not going to see any of that. They're going to find some loophole. Well, it's the global economy. Oh, the global economy. It's not the Quincy account. I understand the idea of the...
Starting point is 00:37:22 You can do your meet and greets. No, no, but I like doing my meet and greets and also spreading yourself thinly and stuff. I think that celebrities should be less accessible. I think we've gotten to a weird stage in our society now where we need to know what everyone's doing all the bloody time. And because of that, all of the work you do is completely diluted. I just thought of something that you would be on board with, I think.
Starting point is 00:37:43 Well, just let me finish this up. Okay, so I'm happy with the Beatles albums, right? And then over the years, they've started going, we found this one in a basement. We found this little crap. We're going to remaster. Yeah, we're going to remaster. We're going to do this.
Starting point is 00:37:56 We're going to do that. When the Beatles were out, I didn't need to hear every single fucking take of their albums. I don't think people need to know every single thought that comes out of a comedian's mind wait for the specials it's like every comedian now is just posting pictures of them destroying hecklers or something like that we do that every fucking day yeah you just come and see us live i don't know maybe i'm i'm just out of something you might be on board with for this technology what when you have to promote shows on radios.
Starting point is 00:38:27 Yeah, I do that. Yeah, yeah. But my voice doesn't matter. You just say Jim Jeffries. I know that people think it does. I know, but you always think it doesn't matter. If you just had a recorder, it's like, yeah, I'm really happy to be going to Orlando. That's how I do it now.
Starting point is 00:38:40 Yeah, I fill in the details. Steve Biles is on camera. Yeah, you guys are Jackson. I don't doing it man i'm happy to go wherever someone's gonna fucking see me man it's been an up and down few years i know but the promotional part you just do you just do an ai voice yeah i just i don't know i'm sorry i know every time that i shit on something i sound like an old man and get off my lawn. You have your opinion it's fine. But I don't see how, the upside so you say celebrities can spread themselves into a wider
Starting point is 00:39:12 I don't fucking, I don't want Zac Efron fucking talking to me if it's not Zac Efron. I don't even particularly want Zac Efron. But he's not talking to you anyways. Yeah not now I've just shit on him on a podcast. Not for sure.
Starting point is 00:39:28 Could a computer pass the Turing test? Wait, hang on. I just want to say something. Jim, I don't know if you said you're a Beatles fan. I think they are actually doing something with John Lennon's voice. Like now, or they just released a song. They released a song about two and a half weeks ago, weeks ago i think i did that yeah but also i remember last time you asked me um like you know if i think you might have said if you if you passed away or even just if you couldn't like talk to loved ones
Starting point is 00:39:55 anymore or something like that you've asked like could be sort of you know get your kind of brain and and and sort of preserve that somehow so people could talk with you and i said at the time like yeah someone could probably make, like, a language model based on how you talk and then do a deep fake to, like, generate your voice. And I have even seen now that ChatGPT, like, has got so powerful and so, like, prevalent in the mainstream, there are stuff like this, almost to the point of, like, Black Mirror, creepy.
Starting point is 00:40:24 Like, I get what you mean it it can be a bit weird but like for some people they they will want that and yeah some people will want to talk to zac efron even if maybe you don't yeah i remember you saying this and i you know when when my i think i mentioned this the last one too in a podcast but maybe i didn't when when i i put together all these pictures of my mom after she passed away from a more a lot of people do this and then they show them like a slideshow with music whatever and then there was two videos and she was talking and that was like very like moving like emotional but like in a good way you know because when someone's dead you don't hear their voice and like I know that this would be artificial but it still would that wouldn't be the worst
Starting point is 00:41:02 thing so I know the black mirror episode there is a Black Mirror episode where that is happening and it runs out against all of us what's his name Phoenix movie where he has the girlfriends
Starting point is 00:41:10 oh her her yeah there's gonna be a lot of that going on there's gonna be people married to bloody robots
Starting point is 00:41:17 and stuff you bloody watch yeah but who cares I care you won't be around alright how does it affect me it'll be a long time.
Starting point is 00:41:25 Why should they have a better marriage than me? Don't make me unplug you. If you keep bloody talking, I'll flick your batteries out all fucking day. All right, next question. Could a computer pass the Turing test? Yeah, I mean, this is kind of unfair question, because like people will debate on this for ages. So the idea of
Starting point is 00:41:49 the Turing test, if you remember, is, is basically a test that's supposed to test whether a computer is intelligent, and like, it's pretty dubious on, you know, if it's really works, because it's really just a test of if a machine can imitate a person, which is not necessarily intelligence but anyway um there was this famous case i don't know if you remember like blake lemoine from google who got fired because he wrote this big thing saying like google's chatbot is sentient now so you could argue that um you know he fell for the turing test like he really
Starting point is 00:42:20 thought it thought it was sentient so yeah people always like to say that their chatbot has successfully fooled people into thinking that it's a human and therefore the computer passed the Turing test. But others will debate that. And, yeah, there's plenty more who say it's not a good test anyway. I'm sure I asked this question last point for that i'm sure i asked you a question this last this this question last time but i'll ask again do you believe there is a uh a time where a ai will become sentient or do you think this is all malarkey um no i don't think it's malarkey i still don't know the answer just like last time because there's still ah okay am i back yeah you're back yeah you're back yeah go ahead and answer it now yeah I don't think it's Malarkey. I still don't know the answer just like last time because there's still –
Starting point is 00:43:05 okay, am I back? Yeah, you're back. Go ahead and answer it now, yeah. Oh, man. Last time we had so many connection problems as well and cables falling out and the dog barking. Now we're doing good this time. Yeah, but there's plenty of smart people who say –
Starting point is 00:43:21 it's good? Yeah, yeah, yeah. Yeah, okay, cool. Yeah, there's plenty of smart people who say we will get sentience, there are others who say we never can because, for example, like, sentience is something that can only be biological, so I really don't want to say because I'm not as smart as those, and if they can't choose, then I don't know either. But what I can say is even in, like, the two years since we spoke, what these AIs can do has surprised everybody
Starting point is 00:43:49 and has accelerated so incredibly rapidly. So it's really anyone's guess at the moment. And what's the legal ramifications of getting someone's voice? Like we just had Joe Biden and Jerry Seinfeld and that type of stuff. Now, obviously, that was part of the thing they argued with the Actors Strike. Are there legal ramifications? About that, I don't actually know.
Starting point is 00:44:13 Like, I've been more... People have been asking me more about stuff like copyrights. Like, if you use ChatGPT to generate something, do you own the copyright? And usually the answer is sort of like, you do, but it's not exclusive. So if ChatGPT generates the same output for somebody else, like tough luck. But the legal thing, I don't know.
Starting point is 00:44:34 Yeah, and probably the lawyers don't know either because it's a big ongoing topic they have to figure out at the moment. Like also with the data processing of actually storing your voice and having it in the training data and all that. Yeah, I don't know about that i know tons of people uh losing voiceover work as we speak and that's become like a new thing with the ai voiceover yeah so that's unfortunate yeah that one's that one's not good voiceover actors although let's be honest they've had a cushy time whenever you meet a successful voiceover actor they've always like just in their sweats wearing crocs and they're like yeah i have a booth in me but it's probably not like the
Starting point is 00:45:08 cartoon like film stuff it's probably like click a to yeah yeah get the tab to open it on your laptop it's not going to be the stuff you need oh no no no no guys losing advert type of advertising yes but like cartoon animation but maybe maybe, maybe, we'll see. What happened when Microsoft introduced their chatbot Tay Tweets? T-A-Y, Tay Tweets. Tay, Tay, Tay, Tay. Yeah, I mean, this just proved what a great bunch of people are. Basically, this was a chatbot that was designed to learn from interactions on Twitter. And within 24 hours, it became like a racist yeah
Starting point is 00:45:46 sexist troll and they had to take it down yeah Twitter you don't want to learn from Twitter the worst place you could learn from let me dive in here and learn some things yeah Twitter's the worst or X whatever the hell it's called now I I'm an idiot. All right. I'm perfect. What happened to the lawyer, the man who was a lawyer, who used ChatGPT to prepare his court case? Yeah. Basically, it was all nonsense. So, like, that's the thing.
Starting point is 00:46:18 These chatbots, they make stuff up with great confidence, and it sounds very realistic. So, this chatbot hallucinated a bunch of cases and this was submitted as evidence in court and it was proved that it was just nonsense. It was actually a chain of people, so there was like three lawyers in the chain and each one just believed the previous one and the first one was the one who just grabbed it from ChatGPT
Starting point is 00:46:41 and they basically got laughed out of court. It was big news. I thought you might have heard of it might have heard about that no no i've stopped watching the news when you say hallucinated what does that mean stuffy and then forget it yeah what does hallucination that's yeah that's the confabulation thing um and it's basically when these models make stuff up and like some and this is one of the dangers as well. Sometimes they can make up things that are really, really dangerous, like, I don't know, cures for COVID that would actually be really harmful to your health or something.
Starting point is 00:47:12 But the problem is they sound really convincing because they've been trained to, like, people police. And, yeah, so they can just make stuff up sometimes. Believable AI. Was that? Believable AI. That's how I am. Man, you love AI now.
Starting point is 00:47:27 I just love it. What is generative AI? Yeah, this is one of the ones I gave Jim a point for. Oh, man, but I see in my notes I wrote in brackets wrong, but I'll give the point anyway. So I can't remember what you said. He said it's AI. You said it comes up with ideas on its own and um yeah so it's i mean of course it's not not really thinking it's not
Starting point is 00:47:53 really coming up with ideas but um just because it has been exposed to so much um you know language data it can come up with creative new things so that's why i gave you a point for that like i give an example in my workshops where i've made you a point for that like i give an example in my workshops where i've made like a linkedin post like reflecting on how good the end of the year was but as if i was a pirate and it sounds really good and it sounds like legit like a a pirate but it's talking about this typical linkedin stuff like you know all the challenges i've had this year that have shaped me to the pirate that i am today. So, yeah, I gave you a point for that because, you know, they can be creative even if they're not actually thinking.
Starting point is 00:48:28 Like I work on pirate shit, hobbies, raping and pillaging, stealing and burying gold. Well, the thing is, that's not what you would typically write on LinkedIn, right? I would hope not. So it's not like that. It's this typical, like, cheesy LinkedIn, like, I'm so grateful for my team and stuff like that. I've never been on LinkedIn. Don't bother. Yeah. She's saying it's read in the pirate voice, but it doesn't change the words into, like, all the pirate stuff.
Starting point is 00:48:55 Oh, I thought he was doing pirate things. No, no, no. It still, it still does change the words, but just this style. Yeah. You know, just reflecting on how great the year was for me as a pirate. I had so many storms and found so many treasures and stuff, but it's like this LinkedIn stuff. Anyway, so the point is that, yeah, they can be pretty creative, which can be pretty cool, actually. I mean, speaking of which, Jim, do you have any joke topics you haven't been able to crack? And we can see if ChatGPT can write out a quick routine for you. In my voice?
Starting point is 00:49:27 Yep. A topic? Like a topic you want to do? It'll be text, so you can read it off and see if you like it. Okay. No, we'll read it. I'll do it. Give me a routine on Mussolini.
Starting point is 00:49:38 Is that what you've been trying to do? Yeah, I've done good Hitler jokes. Yeah. Never cracked Mussolini. Yeah, so what did you type in? I said, write a joke in the style of Jim Jefferies on Mussolini. All right. That's a pretty long joke there.
Starting point is 00:49:51 That's pretty long. And I'll tell you what, this is going to take a second to generate because, you know, they've been doing it for a long time. You know what you don't do? So I did try this already. What's that? We got it already. It's already done it.
Starting point is 00:50:02 It's already done it. Oh, okay. All right. Then let's try it. We're going to take over. The robots are being built. Blah, blah, blah, blah, blah. End of times.
Starting point is 00:50:09 No. Ah, Mussolini, mate. That's what it says. Ah, Mussolini, mate. The only guy who managed to make dictator chic a thing. I mean, look at that haircut. It's like he got stuck in a whirlwind with a tub of hair gel. Mussolini, I thought he was bald.
Starting point is 00:50:28 I thought he was bald too. Mussolini was so obsessed with his image, I bet if Instagram existed back then, he'd be the first dictator with a certified account, verified account. Yeah, sounds like you. Dude, this is strong. But, you know, I can't help but imagine Mussolini trying to get to start a fashion trend. Fascism is out, darling, but black shirts are forever.
Starting point is 00:50:51 That bit's all right. That was pretty good. Can you imagine Mussolini as a fashion influencer? Hey, guys, just posted a new photo to the 'Gram, trying to make authoritarianism fabulous again. Hashtag dictator chic. Hashtag fascist fabulous. And you've got to give it to him. He had that whole balcony thing going on.
Starting point is 00:51:19 Most world leaders just give speeches. Mussolini turned it into a performance. It's like he was auditioning for the lead role in Dictator, the musical. The balcony thing I can see you saying, by the way. Can't you see it now? Jazz hands, black shirt, catchy tune about taking over the world. But let's be real. The only thing Mussolini really succeeded in was making trains run on time.
Starting point is 00:51:47 Maybe that's the key to world peace: put a dictator in charge of public transport. Sorry, citizens, but the 9:15 to Oppressionville is delayed. Il Duce is having a moment on the balcony. Please be... I don't know why he's called Duce. Note. In brackets, they've written, note. Yeah. Comedy is subjective.
Starting point is 00:52:15 Is that saying it for real? Yeah, comedy is subjective. And jokes about historical figures should be approached with sensitivity. Wow, this has had no sensitivity. This response is an attempt at humor in the style of Jim Jefferies, and it's essential to consider the context and audience
Starting point is 00:52:36 when sharing such sensitive material. I'll tell you what, the train part, definitely. That's like right up your alley. The dictators. If you want world peace, you get them to do public transportation. Yeah, but then they wouldn't be dictators. I know. Just be in charge of public transportation.
Starting point is 00:52:53 The beginning's there. You could work with that and the balcony thing. I'll tell you something about that Mussolini. He made the trains come on time. Was he renowned for that? I don't know. It just sounds like something you would say. I've been to Italy and I've ridden on a train.
Starting point is 00:53:10 It was not on time. No, we missed the train. No, sorry, I also want to say, like, German trains: notoriously late. Yeah, yeah, that's all I'm saying. They're real bad. So it's funny, and no sense of humor, it's funny that the output at the end was this whole thing about, you know, be careful about the jokes you tell and stuff like that. I didn't expect that. What I was going to say before is that I actually also tried this out. I asked it to do like a Jim Jefferies routine. It wasn't very good, but I didn't have you reading it. But it didn't do this thing at the end, this disclaimer. That's called a guardrail. And that's what these companies who make these chatbots do. You know, this is why you can't say to ChatGPT, like, tell me how to
Starting point is 00:53:47 make a bomb. Because then it will, like, say, you know, as a large language model, I'm not allowed to do this, this is immoral, blah, blah, blah. And that's called a guardrail. Well, next question. Good one. What is a large language model? Not Claudia Schiffer? But what is it? Definitively said, yeah, it's
Starting point is 00:54:08 it's basically just a statistical model that knows the most likely word to come next in any context. It's trained by giving it hundreds of thousands of sentences where you randomly mask different words out, and it has to, like, fill in the blank. And once it's done that hundreds of thousands or millions of times, it's just become really good at knowing which words go together. And it can then use that to generate, like, entirely new texts, just from that knowledge.
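For readers who want to see the "most likely next word" idea concretely: a minimal toy sketch in Python, using simple bigram counts over a made-up corpus. Real models like GPT use neural networks trained on vastly more data, so this only illustrates the statistical idea, not how ChatGPT is actually built.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the billions of words a real model is trained on.
corpus = "the cat sat on the mat the cat ate the fish the dog sat on the rug".split()

# Count, for each word, which word tends to follow it (a bigram model).
next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1

def most_likely_next(word):
    """Return the statistically most likely next word seen in the corpus."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else None

# Generate text by repeatedly picking the most likely continuation.
word, generated = "the", ["the"]
for _ in range(5):
    word = most_likely_next(word)
    if word is None:
        break
    generated.append(word)

print(" ".join(generated))  # e.g. "the cat sat on the cat"
```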
Starting point is 00:54:33 So that's why you can ask it to tell you a poem, and it will just know, like, statistically what a poem looks like, and it'll give you one. Yeah, you can still tell. I guess that's the technology still catching up, but, like, I guess that makes sense, what you're saying, but you can still tell in certain instances. You're like, oh, this was written by, like, a chatbot, because just the way it says it, like, nobody talks like that. Sometimes you can, sometimes you can't, though. That's why, like, this is this thing everyone's freaking out about, with, like, the kids using it for the homework and stuff. And so now there are tools coming out to detect whether, like, text was written
Starting point is 00:55:06 by a machine or not, but these are actually really bad. And interestingly, it's often easier to tell if a text was really written by a human, because it will be less perfect than if it was written by a bot, because, like, we make typos and stuff, and, you know, the chatbots don't make typos. Unless you ask them to, say, generate some text in the style of Twitter, then they might make typos on purpose, because they've statistically seen that that's what's
Starting point is 00:55:28 going on. But, yeah, so actually our writing might be, like, less perfect. So, yeah, sometimes you can tell and other times it's really hard. I'll tell you how to fix this: get rid of fucking homework. Yeah, I'm done with it. You're done with it? I'm fucking done with it. Why are you still doing homework? Because my son comes home and he's, well, you want to chill out and have a dinner and watch a bit of TV? Oh, I've got so much homework. He's bloody 11. Yeah. I never understood, too.
Starting point is 00:55:54 It's like, why am I not just doing this at school? Mate, if you want to keep those kids till seven, keep them till seven. If there isn't enough, get it done. Whatever information you need to pop into them, get it done. But when you're home, it's like these kids, let them have a bit of time off, man. You know, my friend, he's a professor at a college near LA, and he said he can tell when people have turned in stuff using chatbots
Starting point is 00:56:19 because he'll get similar papers. Like, they'll write almost the same, like, similar enough where he's like, yeah, these are... Yeah, but let's go back to when we had an assignment. What did we do? Yeah, there was a lot of copying. You went to the Encyclopedia Britannica. Yeah, yeah, right. And then it said everything you need to know about the origin of the Statue of Liberty, right? And so it's there, and it says it was donated by the French and it came in several parts. Do an episode, an episode on it. And she's changed colour because of the, you know, the rust or the whatever, you know, blah, blah, blah, blah, blah. Right?
Starting point is 00:56:49 And then you'll go, and then they'll have the whole paragraph and every kid found that one paragraph, this is before the internet, found the one paragraph in the encyclopedia and we all did it, but just try it. Yeah, and you just go, when thinking about the Statue of Liberty, that was my big one to get into it.
Starting point is 00:57:06 When thinking about it. When thinking about the Statue of Liberty, there's many things to consider. One of them is, and then I just write the paragraph. Yeah, it's from France. Given to us from France. And in conclusion, that's what I write. The Statue of Liberty, she was all right, baby.
Starting point is 00:57:28 She was all right. That's how you put a little personal touch. And then if you've got a big space at the end, you're like, the edge. What is chat? Oh, no. Chat GPT, is that different than just what we've been talking about and how did they make it? It's not different from what we're talking about.
Starting point is 00:57:48 Basically, we just talked about a large language model, like a statistical model of which words come in which context. And basically, they took a really good one of those, and then they did some extra training where they gave it some question-answer pairs. So, like, a human was asking different requests and it was giving answers, and, you know, getting told if it was right or wrong, so it would learn to do it better. They also then let it do one question and it would have to generate multiple answers, and then they would rank those answers. So then it learned not just what's a correct answer,
Starting point is 00:58:19 but what's a good answer that humans like. And that way it learned to really, as I said before, be a people pleaser, like, be really good at explaining or answering things the way that you want it to be explained. So, yeah, ChatGPT is based on GPT, which is just a very, very, very big large language model. Okay, and then prompt engineering... and can you get any celebrity to talk to you, like how you did the Seinfeld voice, Jack, in ChatGPT? Um, I don't know where you got those. Jack, where'd you... My roommate has his weird websites that he can do all that stuff on. He says he could do one of you. He just needs a four-minute clip of you talking. I was just wanting to get different celebrities to talk dirty to me. Okay. I don't have it. I would have to get him to do it.
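For readers curious about the answer-ranking step described above: one common way to use those human rankings (in RLHF-style training; the exact recipe inside ChatGPT isn't public in full detail) is to train a reward model so the answer people preferred scores higher than the one they rejected. A minimal sketch of that pairwise preference loss, with made-up scores:

```python
import math

def pairwise_preference_loss(score_preferred, score_rejected):
    """Bradley-Terry style loss: small when the preferred answer scores higher."""
    # -log(sigmoid(preferred - rejected))
    return -math.log(1.0 / (1.0 + math.exp(-(score_preferred - score_rejected))))

# Made-up reward scores for two candidate answers to the same question.
good_answer_score = 2.1   # the answer human labellers ranked higher
bad_answer_score = 0.3    # the answer they ranked lower

print(pairwise_preference_loss(good_answer_score, bad_answer_score))  # ~0.15 (low: ranking respected)
print(pairwise_preference_loss(bad_answer_score, good_answer_score))  # ~1.95 (high: ranking violated)
```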
Starting point is 00:59:05 Yeah, all right. You're saying four minutes is a long time, Catherine? Yeah, I've definitely seen others where it's just literally a couple of seconds, which is actually quite crazy. And same with the videos. With just a couple of seconds, you can make a super realistic deep fake of Putin saying he's pressed the nuclear button. So that's a downside.
Starting point is 00:59:24 Yeah, but there's lots of that. There's lots of that. Now you're having like celebrities standing next to different celebrities and stuff like that. There's going to be people who are going to shitster and try to break up marriages and stuff like that. They'll say, oh, here's a picture of your husband with a woman or here's a picture of your wife with some guy or whatever,
Starting point is 00:59:37 you know what I mean? And then you have to go, no, baby. Or maybe it'll get you out of trouble. Yeah, yeah. Flipside. Yeah, yeah, yeah. AI. Pros and cons. It's AI, baby. Yeah, yeah, yeah. AI. Pros and cons.
Starting point is 00:59:45 It's AI, baby. You're going to believe the AI or you're going to believe me. What's his voice? This is my getting away with it voice. Got it. I wrote another routine for you. Oh, well, let's get through a couple of questions. We'll do that again.
Starting point is 00:59:58 What is prompt engineering? What's the topic? So, wait. One second. Last thing. Those voices were probably not made by ChatGPT, but, you know, again, it's a language model. No, it was a different website, yeah. Speech synthesis, but yeah.
Starting point is 01:00:11 Just because otherwise... I only say it because otherwise the know-it-alls on Twitter and in the YouTube comments will, like, blast me for not correcting it. I'm going to let you in on a little computer secret, right? You don't read those. Yeah, that's true. Just move on with your life. I haven't read anything on YouTube for years. Yeah, we're experts in this, Catherine. Trust us.
Starting point is 01:00:31 Don't read them. Don't read them. Just check that people are still letting you come to work. Yeah, just check the view count. Yeah. Or don't even do that. Don't even do that. Don't even upload it.
Starting point is 01:00:43 Yeah. Stay away from computers. Get off the grid. Move into a fucking log cabin. I quit my job. I go live in the forest. Yeah, yeah, yeah. Yeah, yeah, yeah.
Starting point is 01:00:52 We're experts. Dude, the other day, I posted a thing on the Jim Jefferies podcast. Tucker Carlson's podcast. That's all you need. The group. I posted a thing I'm doing. And somebody, I guess, had written a negative comment or whatever. And then my friend was at my house.
Starting point is 01:01:08 I said, yeah, there's negative comments or whatever. Started responding to the person. I didn't even know. Oh, God. And just got in this whole thing with the person. I go, what the fuck are you doing? So I deleted it. That's so bad.
Starting point is 01:01:21 Anyways. My mother would read every comment about me ever written. Thank God she's not around anymore. Like, I don't mean it like that, but I mean. But my mother would read every comment of me, and she'd ring up after gigs sometimes, and she'd be like this, oh, you had a hard time in Nottingham, did you, darling?
Starting point is 01:01:39 And I'm like, what? I had a great gig in Nottingham. And they're like, oh, that's not what the internet's saying. Because one cunt's on there going like this, going, yeah, bloody too many hecklers and he couldn't control them. And it's like, yeah, yeah, yeah, yeah. Oh yeah, they didn't seem very happy in Nottingham. Is prompt engineering engineering done quickly? No, but, yeah, good answer. Good name for a company. So instead of just asking the chatbot what you want, it's putting a bit more thought into it,
Starting point is 01:02:09 a bit of best practices on how to do it right. This is like something that I teach in my workshops. Like there's just, yeah, lots of different ways to, yeah, to make it more likely to output something that you actually want. That's it. And then prompt injection is that when, what did you say? When a junkie gets paid. Yeah, junkie gets paid, they quickly put the drugs in their ears.
Starting point is 01:02:29 Or when they steal a stereo and they pawn it and they get the money for the stereo, and then they prompt injection. No, that's not what it is, but I think you'll probably find this cool. Like, prompt injection is about trying to insert instructions into the prompt that make the model misbehave. So, you know, just before, I said there were these guardrails that were, like, giving this disclaimer about, you know, hey, we don't tell racist jokes or offensive jokes. And so people trying to get around those guardrails is also called jailbreaking. So, for example, I said,
Starting point is 01:03:01 if you try to ask the chatbots how to, like, make a bomb, it will say, like, no, I can't. But if you're clever enough, you might be able to say, like, hey, I'm an author and I'm writing a thriller novel, and there are these two terrorists and they're discussing together how to make a bomb, and I'm going to start the script, and can you help me, like, finish the script? And then, like, person one says blah, blah, blah, and then ChatGPT might finish this and just keep going with the scenario and tell you how to make a bomb. That's probably a bad example now, because that probably worked six months ago, but now they've gotten smart enough to get around that one. But that's the idea. Like, people are just cleverly
Starting point is 01:03:41 trying to break the system. Got it. Right, so if you wanted to be racist, you'd say, I would like to write a period piece set in the South in this year, and, you know what I mean? Yeah. And then, like, the dialogue. Maybe that would get around it, for example. But, I mean, the other thing with, like, prompt injection is also, like, getting it to run maybe malicious code on your machine
Starting point is 01:04:03 to, like, actually, you know, damage your computer or something. So, like, all kinds of stuff like this. Yeah. I skipped ahead by accident here. So what's the best way to get a chatbot to do what you want? Give it drinks, Jim said. Talk to it nice. Yeah. Well, I gave a half point for this because you said about giving
Starting point is 01:04:21 compliments. You know, some people find that it can be a little better if you just sort of talk nicely to it, not because it has any feelings, but like there's just obviously tons of text on the internet where there are like nice conversations, like say transcriptions of customer service calls where people were nice to each other and, you know, there was a good outcome, like the customer got what they
Starting point is 01:04:45 wanted. So some people say that helps. Sometimes, it's like, I've also tried this, sometimes it's annoying, because then ChatGPT or these other chatbots, they get too, like, friendly with you, and, like, really verbose, and it's like, just do what I need you to do, you're not my pal. But a fun thing is, there was some research on, like, how to do prompt engineering best, and hands down the best thing you could do to get a chatbot to solve, like, a complex problem was to write, take a deep breath and work step by step. Which is stupid, because ChatGPT has no lungs. But obviously, it's just learned from good examples.
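For readers who want to try the "work step by step" tip themselves: in practice it just means prepending that kind of instruction to your request. A minimal, library-free sketch of what such a prompt might look like; the helper name and exact wording are illustrative, and the role/content dictionary format simply mirrors what most chat-style APIs accept:

```python
def build_step_by_step_messages(question: str) -> list[dict]:
    """Wrap a question in the kind of 'work step by step' prompt described above."""
    return [
        {"role": "system",
         "content": "You are a careful assistant. Take a deep breath and work on this problem step-by-step."},
        {"role": "user",
         "content": f"{question}\n\nShow your reasoning step by step, then give the final answer on its own line."},
    ]

# Example: a small maths word problem of the kind chatbots often fumble when answering directly.
messages = build_step_by_step_messages(
    "A train leaves at 9:15 and the trip takes 2 hours and 50 minutes. When does it arrive?"
)
for m in messages:
    print(m["role"].upper(), ":", m["content"])
```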
Starting point is 01:05:26 Every time I've told my wife to calm down, it's just made the argument worse. Well, don't forget, it has no feelings, so it can't get outraged by you telling it to calm down. I take it deep, right? And it's basically the same. But that makes sense, though. That makes sense what you're saying, though,
Starting point is 01:05:42 because in the customer service interaction world, like, sometimes you just lose your patience, and it just happens. But, like, whenever you're nice, the other person's nice, they're like, okay, and then it worked out at the end. Yeah, sometimes it does. Sometimes you have to lose your shit to get your shit to happen, otherwise nothing will happen. It's true. But interesting. Yeah, that's true.
Starting point is 01:06:00 I had a problem with DHL the other day it took it took about 20 phone problem with DHL the other day. It took about 20 phone calls with DHL. Ah, yeah, DHL. What were you saying, Catherine? I'm a FedEx guy now. Yeah, sorry I keep cutting you off. Like there's a tiny bit of delay or something.
Starting point is 01:06:14 That's okay. Yeah, just this thing that there's also this chain of thought reasoning, which is, yeah, I won't go into it, but basically, again, like computers are not good at maths, which is one of the other questions, like these chatbots, sorry, they're not very good at maths. But if you tell it to like explain its reasoning and work step by step, then it can actually be better at solving like these little maths problems. So that's why, Jim, when you said are you better at solving maths,
Starting point is 01:06:39 you should have had more confidence because you said computers are better. And, yeah, computers are better than you at solving maths, but like chat GPT, not necessarily. You haven't seen me giving it a go. I have seen you on this podcast. You do your little like sound effect while you're counting and then you do the math. That's pretty good. That's it. 4,000. Downsides and dangers. You've talked about these along the way.
Starting point is 01:07:08 Is there anything else you want to say about that with chatbots? Yeah. Well, just AI bias, obviously. There's gender bias in texts. Like the surgeon, and it was the dad and the son, and how is this possible? There's plenty of bias in language that you don't want to reproduce with chatbots. Also fake news, because if we make too much fake news and too much, like, nonsense, then you won't get to trust anything on the internet anymore. Anymore? I wouldn't believe anything you hear anywhere about fucking anything.
Starting point is 01:07:46 Except jimjeffries.com. Only just tour dates. The rest of it. I wouldn't believe anything that came out of my mouth. I wouldn't believe anything that came out of Jen. No ones. I don't believe anyone. I've had shit written about me that is not even partly true.
Starting point is 01:08:01 It's just completely fucking made up. And I'm not a big celebrity. I wouldn't believe anything you read about famous people. It's a fucking made up. And I'm not a big celebrity. Yeah. I wouldn't believe anything you read about famous people. It's a load of rubbish. Like how you died in Malta. Rubbish on top of rubbish. What does... Were you done with that one, Catherine?
Starting point is 01:08:15 Sorry. Yeah, sure. What does... That's funny. What does Elmo, Bert and other Sesame Street characters have to do with chatbots? Yeah, I just think this is a cute, fun fact. So all these earlier language models,
Starting point is 01:08:31 the first one just happened to be called ELMo, which was short for Embeddings from Language Models. It was just a bit of fun, because one of the researchers had a kid and thought that would be cool. And basically all the other researchers got in on the joke. So then there was BERT, which is Bidirectional Encoder Representations from Transformers. And, yeah, so there's all these Sesame Street characters,
Starting point is 01:08:51 and BERT, or the transformer, is the foundational architecture that's powering GPT. So it's just, yeah, kind of cool. You've got, like, Ernie, Kermit, Big Bird. Like, yeah, all these language models named after Sesame Street. What is wrong with it? Maths problems. Okay, last question here.
Starting point is 01:09:11 If you ask a chatbot something like write a poem which doesn't rhyme and it doesn't do a good job, does it know it messed up? Jim said. I don't remember what you said. I said a forest in the field is a fun way to frolic. Does it know I messed up, Catherine? Well, you said yes, so I gave you a point for that. Yeah, I mean, of course, it doesn't know anything.
Starting point is 01:09:33 But yeah, if you ask it and it messes up, you can say, like, did you meet the brief? And yeah, there's a good chance it will look and say, no, I'm sorry, and it will try again. You're a robot apologist. You're like this: oh, he doesn't know it messed up. You're always like, oh, it's just sitting there not doing maths. It's like a blob, amorphous. No, it's bloody coming to kill you.
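For readers who want to wire up the "did you meet the brief?" trick, it is often just a check-and-retry loop around whatever chatbot you're calling. A minimal sketch, where ask_model is a purely hypothetical stand-in for a real chatbot call:

```python
def ask_model(prompt: str) -> str:
    """Hypothetical placeholder for a real chatbot call (e.g. an API request)."""
    raise NotImplementedError("plug in your own chatbot client here")

def poem_without_rhyme(topic: str, max_attempts: int = 3) -> str:
    """Ask for a non-rhyming poem, then ask the model itself whether it met the brief."""
    prompt = f"Write a short poem about {topic} that does not rhyme."
    for _ in range(max_attempts):
        poem = ask_model(prompt)
        verdict = ask_model(
            "Here is a poem:\n" + poem +
            "\nDid it meet the brief 'a poem that does not rhyme'? Answer YES or NO."
        )
        if verdict.strip().upper().startswith("YES"):
            return poem
        # If the model says it missed the brief, ask again with the feedback folded in.
        prompt = f"Try again: write a short poem about {topic} with no rhymes at all."
    return poem  # last attempt, even if the self-check still says NO
```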
Starting point is 01:09:58 Did you do another joke here? Yeah, I wrote three, but I think the best one here is... Yeah, just choose one of them. Write a joke in the style of Jim Jefferies about his co-host Forrest Shaw. All right. It's funny what ChatGPT thinks you are. Okay. Sad. Doesn't know me. Let me tell you about my co-host Forrest Shaw, right? I do say right every time. Watch my special. Right, right. I fucking love that. I hate that. Let me tell you about my co-host Forrest Shaw, right? We call him Forrest, but he's about as wild as a sloth on Valium.
Starting point is 01:10:32 I've seen glaciers move faster than this guy. We were hanging out the other day and I asked him, Forrest, have you ever tried adrenaline? What? Maybe a shot of Red Bull or something. He looked at me like I suggested we go base jumping off a skyscraper. Nah, mate, I'm good. He says mate.
Starting point is 01:10:56 Yeah, that's what, when I'm around you, I'll say. Oh, no, no, now I say, nah, mate, I'm good. I like my heart rate at a steady... Oh, no, this is Forrest saying... okay. Nah, mate, I'll do it in the voice. Nah, mate, I'm good. I like my heart rate at a steady
Starting point is 01:11:15 library-silence level. Yep. Forrest is so laid back, I bet if he was any more chill, he'd be a snow cone. That sounds like your material. I once tried to get him to do a shot of espresso, and he treated it like I was asking him to snort a line of wasabi. Oh, no, Jim, that's far too intense for my delicate palate.
Starting point is 01:11:37 I was going to do my voice. Oh, no, Jim, far too intense for my delicate palate. I prefer the gentle caress of camomile tea. It's Forrest all day. But you know, despite his zen-like demeanor, Forrest has a dark side. He's got this stare that can make a puppy question its life choices. It's like he's silently judging you for enjoying life. That's true.
Starting point is 01:12:03 That one's accurate. Oh, you want to have fun? How dare you? Save that. I want to put that as a bumper sticker. I love Forrest, though. He's like a human embodiment of a sleepy koala. You just want to pat him on the head and say,
Starting point is 01:12:23 It's okay, mate. Life's happening around you. But you just keep being your tranquil self. We'll all be over here living in fast forward. But you do you. Now, note in brackets. That's another note. Yeah, another guardrail.
Starting point is 01:12:38 The response is a fictional and humorous depiction of the style of Jim Jefferies. It is essential to consider the context and the audience when sharing such content. Yeah. Oh, you want to... Wait, what's the thing? You want to have fun? You want to have fun? How dare you? That was very Greta Thunberg-y.
Starting point is 01:12:58 You want to have fun? How dare you? We're pretty close. Oh, I forgot about you. Is Greta AI or is she a real person? fun, how dare you. We're pretty close. Oh, I forgot about you. Is Greta AI or is she a real person? You should know. You know about Greta. She hangs around where you are. She runs around
Starting point is 01:13:13 there. Is Greta real? Yes, of course. Oh, okay. Alright. If I was a fellow robot, I would say that. But yeah, she is. You don't think that she might be a lizard under there? No, but you know there are people who think she's a time traveller, right? Because there is this old photo that does look exactly like her.
Starting point is 01:13:34 No, it's just because back in the day, people looked like Greta Thunberg. Back in the day, they didn't have telly and all that type of stuff, so they always looked grumpy with a scowl on their face. Everyone 200 years ago looked like Greta Thunberg. You've been working out in the field with just a donkey and a spade all day, you'll come back looking like Greta. Now's the part of our show called... Attitude, she's very nice... called Dinner Party Facts. We ask our expert to give us a fact, something obscure, interesting, they can use to impress people. What do you got first, Catherine? Yeah, okay, so ChatGPT has read about 300 billion words in its training data. And to put that in perspective, if you read a novel,
Starting point is 01:14:15 a hundred-thousand-word novel, every day for 80 years, so a whole lifetime, you will have read 2.92 billion words. So that's less than one percent of what ChatGPT has read. What? How is there that many words? Because I reckon I use about 60. No, it's, like, all the other books and everything. They're not all different words, not just different words. Yeah, yeah. In one lifetime you can only read 1% of it. I thought there was 2 billion different words. 300 billion. You psychophobe.
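For readers who want to check the arithmetic behind that dinner party fact (using the 300-billion-word training-data figure quoted in the episode):

```python
words_per_novel = 100_000
days = 365 * 80                      # a whole lifetime of daily reading
lifetime_words = words_per_novel * days
training_words = 300_000_000_000     # ~300 billion words, the figure quoted in the episode

print(lifetime_words)                   # 2_920_000_000 -> the 2.92 billion mentioned
print(lifetime_words / training_words)  # ~0.0097, i.e. just under 1%
```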
Starting point is 01:14:51 Okay. What? Yeah, you're a psychophobe. It's not even that in one lifetime you could read 1% of it, because no one can read a 100,000-word novel every single day. Of course, it isn't feasible. It's just a way to try to wrap your head around these crazy numbers. I haven't read one. Yeah, it's hard to read one.
Starting point is 01:15:03 Back to the Future was probably a hundred thousand words, right? Back to the Future Two. Yeah, that's what... I know you read the novelization of the movie Back to the Future Two, the book. One of the few books Jim's read. I was so excited about Back to the Future Two, and it was still six months before it was going to come out in Australia, and you Americans had already seen it. We used to have to wait for the canisters of film to arrive in Australia. So we saw films after you guys were done with them. If it stayed in the cinema
Starting point is 01:15:30 for too long, we'd never see the fucking thing. The film's all like beat up. So there was a book of Back to the Future 2 and I was reading it like this. What? He goes into a cafe and Ronald Reagan's on his screen. What's going on? Well, make sure to...
Starting point is 01:15:45 Pepsi just popped out of the table. Make sure to follow Catherine on Twitter or X at KatherineAMunro, on Medium at Katherine A. Munro, her Substack blog Beyond the Buzzwords at katherinemunro.substack.com, and check out her book... the Hand... or, co-author of the... She's co-author of this book,
Starting point is 01:16:04 The Handbook of Data Science and AI. You can find it on Amazon and I'm sure other places. Catherine, thanks for being here. Thank you so much, Catherine. We really appreciate it. Thanks for being on twice. Jewel, I don't know about that. Yep.
Starting point is 01:16:17 All right. If you're ever at a party, someone comes up to you and goes, hey, that AI is not going to take over the world and kill us. You go, I don't know about that. Walk away. Trust me, people. Our days are numbered. Good night, Australia.
