The Trillionaire Mindset - 78: The Terrifying Truth About AI

Episode Date: March 24, 2023

Become an exclusive member to get ad-free and bonus episodes at https://tmgstudios.tv

Artificial intelligence has engulfed our daily lives. Ben and Emil dig deep this week on the biggest chatbots and AI tools coming to the forefront of the movement: the pitfalls of art-stealing AI, doomsayer AI experts, and how we can get AI to… call Emil’s grandma? Plus, how Nvidia is on the ground floor of the AI movement and primed to make BANK!

You can get 20% off Sunday when you visit https://getsunday.com/trill!
Upgrade your closet with Rhone and use TRILL to save 20% at https://www.rhone.com/TRILL
This episode is sponsored by BetterHelp. Give online therapy a try at https://betterhelp.com/trill and get 10% off your first month!
Get the only digital wallets with real cash access, activated by MoneyGram. Learn more at https://moneygram.com/stellarwallets

Check out our channel page on Apple Podcasts: https://apple.co/trillionaire
SUBSCRIBE to Trillionaire Mindset at https://www.youtube.com/trillionairemindset
Want to subscribe to our newsletter? http://bit.ly/3k4Nfar
Trillionaire Highlights Channel: https://www.youtube.com/TrillionaireMindsetHighlights
Trillionaire IG: https://www.instagram.com/trillionairepod
Trillionaire Twitter: https://twitter.com/trillionairepod
TMG Studios YouTube: https://www.youtube.com/tinymeatgang
BEN: https://www.instagram.com/bencahn/ | https://twitter.com/Buncahn
EMIL: https://www.instagram.com/emilderosa/ | https://twitter.com/emilderosa

*DISCLOSURE: THE OPINIONS EXPRESSED IN THIS VIDEO ARE SOLELY THOSE OF THE PARTICIPANTS INVOLVED. THESE OPINIONS DO NOT REFLECT THE OPINIONS OF ANYONE ELSE. THIS IS NOT INVESTMENT ADVICE. THE VIEWER OF THE VIDEO IS RESPONSIBLE FOR CONSIDERING ANY INFORMATION CAREFULLY AND MAKING THEIR OWN DECISIONS TO BUY, SELL, OR HOLD ANY INVESTMENT. SOME OF THE CONTENT OF THIS VIDEO IS CONSIDERED TO BE SATIRE AND MAY NOT BE CONSIDERED FACTUAL AND SHOULD BE TAKEN IN SUCH LIGHT.
THE COMMENTS MADE IN THIS VIDEO ARE FOR ENTERTAINMENT PURPOSES ONLY AND ARE NOT MEANT TO BE TAKEN LITERALLY.*

Chapters:
0:00 This week!
0:36 Intro
1:12 Emil’s Gas Bubble
3:03 Thank You to Everyone!
4:33 Housekeeping
7:30 A Thicc Episode
8:40 Moore’s Law
12:17 AI and Hallucinations
12:40 Thanks to Sunday!
14:27 Will AI Take Jobs?
16:50 AI Image Generation
19:00 Dangers of AI Art
20:40 WGA Strike
21:55 Adobe’s AI
23:10 Fake AI Images
24:10 Thanks to Rhone!
26:24 Doomsday AI Incoming?
28:14 Eli on Indifferent AI
32:40 GPT-4’s Lies
36:13 Sponsored by BetterHelp
38:00 How Nvidia Is Involved in AI
40:10 Ben Is a Bank
41:30 GPT-4 Is Released
43:37 What Is ImageNet?
45:50 Back to Nvidia
46:46 Is AI Here to Stay?
48:30 AI’s Lies and Tricks
49:50 Thanks to MoneyGram!
50:43 Humanity’s Bad Influence on AI
54:00 2013’s Her
54:34 GM’s Autonomous Cars
56:00 China’s Car Competition
58:15 Ben’s AI Death
59:17 Ben’s Horny Mishap
1:00:30 Kia Thefts
1:02:00 AI and Grandma
1:03:20 An AI Winter
1:05:00 The Willow Project
1:08:00 What’s with The Fed?
1:10:00 Who’s Footing the Bank Bill
1:11:00 Jerome Powell’s LIES
1:12:18 Wrapping Up
1:13:53 This Week on After Hours

Transcript
Starting point is 00:00:00 This week on Trillionaire Mindset: we're going deep on AI. It's not just deep learning, it's us getting deep. Like he said, also, Nvidia announces a whole bunch of really advanced shit that we don't fully understand, but the implications are massive. What the heck is China up to with autonomous cars? Are they going to steer us in the wrong direction? Stay tuned to find out. What the heck is the Fed up to? What have they done this time? Jerome Powell's tiny little butthole mouth says things that the market doesn't like, but then likes, but nobody really knows, cuz it's his little bum. We'll see you in there. Welcome back to the beat, man. K! When I get done with you! Yeah! I'm ready to punch!
Starting point is 00:00:46 I'm gonna get it! I'm gonna get it! I'm gonna get it! [repeated over the intro music]
Starting point is 00:01:12 What did you just say? I said we're not going. Yeah, we are. Yeah, literally everybody's shaking their heads. This is a professional thing here. What, are you nervous? He's a little nervous. I feel sick, cut the cameras. I had to pat his back for him in the car. He was like, I got a gas bubble.
Starting point is 00:01:35 Yeah, I don't know, I couldn't decide which end it wanted to come out of. And you leaned forward and you had me pat your back like a little baby. It was cute. Yeah, and it seemed like it wanted to go to both ends. And where did it go out? Kind of both. I had some burps and some farts.
Starting point is 00:01:52 What was the deal? What did you eat? Nothing out of the ordinary. Huh, just sometimes you get a little bubble. Yeah, I know that. Man, I gotta stop. Sometimes you eat too quickly. Yo.
Starting point is 00:02:02 Buddy, you're telling me. Well, that's kind of the problem too. We get up kind of early to drive over here. Yeah. And so you wolf down your breakfast. I do it while I'm getting ready, and Jesus. I eat while I'm making my bed and getting dressed and stuff. I see you're following the Jordan Peterson thing, making your bed. That's not a Jordan Peterson thing. No, I'm pretty sure he's the guy who was talking to depressed, troubled young men. I did it. I've done it forever. So are you saying you don't make your bed? No.
Starting point is 00:02:37 Why? Why would I? Sheesh. It's so nice getting into a made bed. Yeah, and your whole room looks way better. That's true. Man, maybe I should start making my bed. Maybe Jordan Peterson was onto something. Yeah, I guess. Hey folks, check that disclaimer in the description box. You know the drill.
Starting point is 00:02:54 You know the fucking drill, man. We got a lot to cover today, so I don't want to dilly dally here. Wait, also we keep, what? We keep forgetting to do it. Do what? We got to say we did it in an after hours. After hours? Ha!
Starting point is 00:03:07 We're off to a great start! Great start today. What? What? We gotta do what? We did it in after hours. But we never said it on here. Thank you to everyone who bought a ticket to the live show. It sold out in minutes.
Starting point is 00:03:21 Oh, the one in New York next week? Or on April 4th? Yeah. Yeah. So big shout out to everyone. That huge shout out. It was literally... One of my friends just texted me today this morning.
Starting point is 00:03:31 They were in New York and they were saying they tried to get tickets and they were gone immediately and they're pissed. I said, well, we'll come back. We'll do a bigger venue. Oh, yeah. We're gonna do a much bigger venue. We're gonna do Madison Square Garden. I mean, not much bigger, but...
Starting point is 00:03:42 No, no, way bigger. By my calculations, Madison Square Garden. I don't think so. We'd just sell out the first three rows, and then the rest of it is just empty. That's about what we could sell out. Yeah, probably. It would be really long rows. Really long rows. But yeah, thank you to everyone who did it. That was very cool. And we're gonna try to do more stuff, also. Don't worry, we'll do more stuff. Yeah, a lot of people... I get a lot of DMs, I see a lot of comments: how do I get in?
Starting point is 00:04:11 There was a guy who posted to the Reddit, how do I sneak in? Don't know, man, don't try it. Please don't try it. Anyway. Big shout out. Thank you. Yeah, thank you.
Starting point is 00:04:24 Of course. Say thank you. Thank you. Thank you. I'm excited to see you people. Why are you being so weird about it? Shut up. All right, so a couple things. We got the live show that we just did, yesterday for us. And if you haven't even... I know, we haven't even done it yet. It could have gone bad. So if you want to see the disaster that was the live show,
Starting point is 00:04:45 just, it's on YouTube. Unless, what? Something bad happened. I mean, we had to pull it. Yeah, yeah, unless the roof caved in or something. If it's not up, just know we did something bad. Yeah, yeah, yeah. And as always, check out TraderTreehouse.com.
Starting point is 00:05:00 If you want to trade with me, it's really fun. Man, today, this is partly why I'm so jacked and amped. This is my own. Yeah, we made a lot of money on Netflix, which was cool, and some other stuff. So, also, I get a lot of DMs asking about credit card recommendations, and I cannot... first of all, I can't respond to every DM,
Starting point is 00:05:24 and it's overwhelming at this point, because I used to try to do that, and I still do on occasion. So anybody who sends me a DM, just know I see them for the most part, but I can't respond, because then I'm stuck just replying all day to shit. But people do be asking for credit card recommendations. They do be asking. And since I was like, okay, I can't possibly continue to just reply personally to everybody and be like,
Starting point is 00:05:51 well, I recommend this card for your particular thing. So we are in the process of setting up a website for all the wrecks for best overall cards that we like best. Don't say what it is, they have to go to the site. Well, yeah, and I bought a few different URLs and we're gonna have to land on one soon, but it should be cool.
Starting point is 00:06:13 So you'll just be able to go there, and you don't even have to ask me anymore. I'm a student, I don't have a credit score, what do you recommend? Okay, there's one for like a good beginner card. It's gonna be on there. I have a feeling you're still gonna get DMs. Probably. The best thing is, I love when sometimes I'll post a song to my stories and people will ask, what's the song? Even though it's right there. What song is this, king?
Starting point is 00:06:41 I still love when people ask for personal recommendations on travel. Oh yeah. I am going to New York next week. What do you recommend? Buddy, I don't have the time or the energy. Well, not only that, I don't know anything about you. I don't know what you like. Yeah.
Starting point is 00:06:58 It's a big old city, man. Yeah. It's like with everything, it's so general. I just get, what book do you recommend? The Bible. Yeah, I don't know, man. What do you like? Cat in the Hat.
Starting point is 00:07:11 Oh man, I'm all jacked up today. Fuck, and I went to bed late because I was reading about all this AI shit. I kind of couldn't sleep. We got a chunky episode for you guys. We got a real thick one. Yeah, real thick one. How many CCs are we talking about? 12. 12. A veritable blue whale of an episode. That might be too much woman for me. Not everybody can handle it. No. So let's get into it, huh? We got it. Just like that. Yeah, why not?
Starting point is 00:07:50 Wait a second, why not? Okay, well, you want to chill a little more? No, no, no, no, okay. So AI is fucking everywhere. What? Yeah, it's everywhere. You can't say that. And it feels like, you know, when we had
Starting point is 00:08:07 when Steve Jobs... what? ...on the show? Steve Bobs, whatever his name was. When he launched the iPhone, it felt like, okay, obviously this is a big watershed moment. But back then, things happened so much more slowly. Right. It felt like, all right, it's gonna take a while for smartphone adoption to catch on and the App Store to catch on, and then you had cloud computing catch on,
Starting point is 00:08:32 but it just seems like, more and more these days, because of Moore's Law, which is... what is the law exactly? Don't mention something and then just go, I mean, what is it? But like, I can't paraphrase. Moore's Law is basically that computing power increases exponentially every year, whatever.
Starting point is 00:08:49 Let's see, Moore's Law is the observation that the number of transistors in an integrated circuit doubles about every two years. Basically, shit gets more powerful. Great. But with AI, it's this weird thing where I keep telling myself, oh, it's going to be years before it really goes mainstream
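That two-year doubling is easy to sanity-check with a quick back-of-the-envelope calculation; a minimal sketch in Python, assuming the commonly cited figure of roughly 2,300 transistors for Intel's 4004 from 1971:

```python
def transistors(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under Moore's observation:
    the count doubles roughly every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Projecting the ~2,300-transistor Intel 4004 (1971) fifty years forward
# lands in the tens of billions, the right order of magnitude for 2021-era chips.
print(f"{transistors(2_300, 50):,.0f}")  # 77,175,193,600
```

The point is just that the doubling compounds absurdly fast: 25 doublings multiply the starting count by about 33 million.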
Starting point is 00:09:09 and catches on and becomes a thing. And maybe that still is the case. Maybe it's just because we're so deeply ingrained in social media where everything feels a lot bigger and a lot more urgent than it might actually be that I feel this way. Do you know what I'm saying? Yeah, I mean, I think I felt the same way too. I think also we made the joke,
Starting point is 00:09:32 which a lot of people are making, that it's the new crypto, where everyone who was in crypto is now just in AI. And I think that is a little true, in the sense that a lot of VC dollars are getting redirected there, right? But I think it's different in a big way, cuz what was the big thing crypto was always missing? A use case. Yeah, yeah. AI is kind of finding it very quickly. Very quickly. And that doesn't mean it's perfect, and it's getting a lot wrong.
Starting point is 00:10:01 It's inaccurate a lot and the creators of it are pretty open and honest about it. There's, you know, Google just, we're gonna get into all of it, but Google just launched their chatbot. We don't have access to it yet. It's like, I do. You got access to it?
Starting point is 00:10:17 Yeah. But so Bard, ChatGPT, Bing, all these things have pretty prominent disclaimers on all of them, being like, hey, it's gonna give you false information, it's gonna tell you it fucked your wife or whatever the fuck it's gonna do. So it's not perfect, but they're finding uses for it immediately.
Starting point is 00:10:39 And when crypto was like, every contract's smart now, and everyone's like, yeah, what is that? What does that mean? Don't worry, they're smart now, dude, shut up. It's on the blockchain. You take this token, you'll get eligibility for an airdrop to get a picture of a monkey, which will be worth $100,000 because it's rare.
Starting point is 00:10:57 Right, I mean, that's the whole NFT craze, right? It was here and then gone. No one knew what to do with it. Everyone was saying, oh, you know, it's an investment, you're gonna be... But yeah, these things launched, and I think ChatGPT has crossed 100-something million users in just a few months. Right. Yeah, it's probably here to stay. But so, you know, you're seeing it in things. There's the... the GTP-4 one is already in... You said GTP. GPT. GPT-4 is already in products like
Starting point is 00:11:25 the tutoring thing, Khan Academy. Duolingo. I didn't know Duolingo was doing this. They're doing Duolingo Max. Now it's gonna give you explanations for why you were right or wrong with the answer you picked. Wow. And now you can have conversations
Starting point is 00:11:44 in different settings with this thing. I think it only works in Spanish and French right now. And also, who knows how well it works. That's one of the... I mean, yeah, that's my thing. I'm trying to brush up on Greek right now, and there's always so much confusion about different words and stuff. The last thing I want to do is trust this AI that's constantly wrong. Yeah, my big bone to pick with AI, and my skepticism around it, is whether or not I can trust the information. Right.
Starting point is 00:12:14 Because the word, the term for it, is hallucinations. And it's something where, yeah, I just don't know. Sure, you can make it make something, let's say a legal document or whatever, but you would then still need a human to verify it. Maybe it's just gonna take a while before we can trust that, okay, 99.8% of the time or whatever, it's gonna be accurate.
Starting point is 00:12:38 I find that kind of interesting. The way people are talking about it is that, because one of the big worries is that people are saying it's gonna take people's jobs. And even the creators are being open about it: it is going to eliminate some jobs, but they're saying it's going to create other ones. And so, I mean, in this current stage at least, it seems like it's just a tool, right? So you can use it to input certain things, but you need to be able to cross-check the information and make sure everything's accurate.
Starting point is 00:13:05 But it can help you build out whatever it is you're looking to build out. Right. And it does have real capabilities on some of these things. I mean, I don't know, the one example I was looking at was... did you see the one where it was a guy from, I think, a Wharton economics professor?
Starting point is 00:13:24 And he asked it to give him a story about eating a piece of cake. He basically was like, that story's no good, though. Look at Kurt Vonnegut's 12 rules of writing and then rewrite it. And it goes, Kurt Vonnegut only had 8 rules of writing. These are the rules I found. And it said, do you want me to use these? And he said, yeah, use those.
Starting point is 00:13:49 Was he tricking the thing? Cause there aren't 12 rules? Or was it 12 rules? I don't know. It's unclear. So he posted it on his Twitter here. So basically he asked it to give him the story about the cake.
Starting point is 00:13:59 It gives a very boring thing. Eating a slice of cake is a delightful experience that can brighten up any day. The first bite is always the most anticipated, blah blah blah. Right? Then he says, now read Kurt Vonnegut's 12 rules for writing and improve the rewrite. And so it's searching for it, searching for it. And it says, I couldn't find 12 rules, but I found these eight. Okay. Which are these? Yeah. Use the time of a total stranger in such a way that he or she will not feel the time was wasted. And it goes on from there. And then, if you go to the next,
Starting point is 00:14:21 and he says, yeah, please rewrite it. And then it rewrites the story in that way. And he's like, you know, it's much more exciting. The cake was a lie. It looked delicious, but it was poisoned. We didn't know that when he took a bite, he just wanted... whatever. Okay. And then he said, but how does that meet all the rules? And it was able to then do a little analysis of how. Use the time of a total stranger in such a way that he or she will not feel the time was wasted: I wrote a short and engaging story that has a twist and a conflict. Ah. I gave the reader at least one character here
Starting point is 00:14:52 he or she can root for: I made the wife of the protagonist, who has a motive and a goal. I made the husband the antagonist, who is abusive and unfaithful. Interesting. Pretty well. Yeah, that is pretty well. Uh, I think that the most foolproof use case for AI is image generation and video. And Adobe launched this thing called Firefly, and it's fucking bananas, man. And I think Unity also...
Starting point is 00:15:16 Unity, the program that developers use for, it's like, video game development. One of the developers for Unity was showing that you can just type in a prompt and, well, much like this right here. So for the audio listener.
Starting point is 00:15:33 That's the thing. So I've only seen their kind of ads for it and stuff. I don't think they can do this yet, because... But they're working on it. Right, right, right. So the whole thing is, they're coming up with their own chatbot, basically. It's going to be like a website, and it's image generation.
Starting point is 00:15:51 And then the idea is to integrate all of these tools into Photoshop, Illustrator, Premiere, or whatever. So, so for the audio listener, it's, it's this, uh, Adobe thing. They're just showing that you can type in anything you can think of. Show a castle in a, you know, whatever, a sunlit room in the afternoon, and you hit generate and then it creates it.
Starting point is 00:16:16 Dress the dog in a Santa suit. Boom, put them in front of a gingerbread house and it does it. So that kind of thing, I feel like, especially if you're a marketer, or a marketing executive at some fucking, whatever you name it, company. Why would you pay a designer and wait for them to create stuff when you can just get a little fanciful with verbal prompts and have it generate something
Starting point is 00:16:47 for you that you could just rip and use. This kind of stuff, before I started looking to it further, was my biggest concern, where you just have creative people, especially, which is already a vulnerable industry, it's fucking impossible to be compensated fairly as an artist. They're gonna have a harder time because of this stuff, right?
Starting point is 00:17:10 And the Wall Street Journal did a story on this guy, fuck, let me find his name. But he's this Polish guy who basically does these fantasy drawings. He gets commissioned for them. And he's like one of the most famous guys who does it. Big for people who do D&D or Magic: The Gathering;
Starting point is 00:17:34 they commission him to make these paintings of scenes they do, their characters. Basically, remember when I was telling you that we had been throwing all of our character descriptions into GPT and it was spitting out images of our characters? So this guy, Greg Rutkowski, he's basically been doing that for years. He's very popular. And what ended up happening is people were doing what we were doing, and then they realized
Starting point is 00:17:58 you could just type in Greg Rutkowski after your... Oh, and it'll just steal the style, right? So images that look just like his style are now all over the internet. And they were running a test with it, basically. They were saying, you know, because it'll often be this kind of Middle Ages, high-fantasy type art. And so you'll put those descriptors in. You'll say, give me a scene where, you know, the crew is fighting a dragon, blah,
Starting point is 00:18:26 blah, blah, in the style of whatever. And it comes out, but it doesn't quite look right. And then you take off those descriptors and you just type in Greg Rutkowski, and it's way better. And so this machine has been pre-trained on his images, and now it can just spit out things that look just like them. Troubling. So, right now, as you know, there's that WGA, the Writers Guild strike in Hollywood, is happening... like, right now, isn't it? I don't think they're on strike yet.
Starting point is 00:18:59 Well, they're about to. They're renegotiating their contract. But we're not sure if they'll go on strike. Right. But one of the things that is being put into the contract is protection against artificial intelligence generated television scripts, movie scripts, which is really smart of them.
Starting point is 00:19:15 Because part of the thing is, their last contract, negotiated in like 2008, dealt with streaming and dealt with the internet, because that was the new thing that was taking the industry by storm, and they needed to adjust everything. And now, it makes me relieved that they're at least putting in protections to stop studios from using whatever that may end up looking like. I'm sure it'll just be, you can't... But I mean, again, how are they ever gonna do that?
Starting point is 00:19:46 Teachers can't even figure out if their students cheated or not. Right. Well, you gotta use AI to figure out if it was AI-generated. How do you do AI, right? You and AI, you need a smart contract. You gotta have an AI crypto, smart contract that puts, whenever you use AI, it puts it on a blockchain.
Starting point is 00:20:05 So then you need to verify or some shit. I don't fucking know. But yeah, I mean, so that's the problem with all of these things, right? I guess with all these things, pre-trained on pre-existing things, I actually just saw it on my feed this morning. So I didn't have a chance to ask her about it,
Starting point is 00:20:20 but one of our editors and illustrators, Cheyenne, she put something on Twitter. I guess she retweeted the Adobe announcement about Firefly, and it's using their stock library. But I guess their stock library must be filled with submitted images from artists. Right, when you use Adobe Cloud,
Starting point is 00:20:39 it's technically, yeah, Adobe has the rights to use their... I think she was a bit frustrated, because that's the... So Adobe is saying, you know, Adobe is putting one big twist on its generative AI tools: it's one of the few companies willing to discuss what data its models are trained on, and according to Adobe, everything fed to its models is either out of copyright, licensed for training, or in the Adobe Stock library. So they're trying to be like, well, we're the ethical ones.
Starting point is 00:21:06 But apparently a lot of artists are gonna be pretty disappointed that their stuff in the Adobe cloud is now becoming training data. Not only does this kind of thing make it hairy, but then of course you've got the worry... we haven't even gotten into faked things. So, like, there's this fake image, it's really hilarious, cause you know, Trump has been in the news because he's potentially going to get arrested. And there's this photorealistic-looking image of Trump getting taken by like five NYPD officers. And it was so ubiquitous. At first I was like, okay, that's a joke, whatever, but then I kept seeing it so much I kept going, wait, was Trump arrested? Like, I saw it in my head. Yeah.
Starting point is 00:21:53 It's this kind of thing that probably gets circulated on fucking FreedomEagle dot com slash Facebook, where all the 70-year-olds are going, oh my god, we gotta do something about it. Imagine people older than us. Oh God. I mean, the way AI is going to be used to trick vulnerable elderly people into giving money for whatever is going to be dystopian, to say the least. You know, I mean... oh God. So the thing that really freaked me out, can I tell you? Yeah. No, you can't. Fine.
Starting point is 00:22:31 Tell me. No. So my big thing: I was mainly concerned about how this is going to affect people's livelihoods, right? Yeah. And I was very curious, we've mentioned that before, there's gonna be some legal battles over whether or not AI companies were allowed to train on all this data. I mean, it's not just artists,
Starting point is 00:22:53 but any article, source, anything it's been fed, right? So, like, journalists. All of this information it culls to be able to do what it does was created by people who were not compensated for the work. So that was my big concern, and I was very curious how that was all going to play out. Still curious about that, but then I started going deeper into the wormhole of AI, and there was this Vox article about how, basically,
Starting point is 00:23:27 because people are kind of comparing it to the Manhattan Project and being like, we don't know exactly what we're doing. And even the scientists and researchers kind of cop to that. Yeah, even the... what's his name, the creator of OpenAI? Yeah, Sam Altman. He's a little scared.
Starting point is 00:23:41 I guess that feels like a weird... He's like, I'm scared of how crazy the future will be. But, you know, researchers will say... so this is from Vox: But while divides remain over what to expect from AI, and even many leading experts are highly uncertain, there's a growing consensus that things could go really, really badly. In a survey of machine learning researchers, the median respondent thought that AI was more likely to be good than bad, but had a genuine risk of being catastrophic. 48% of respondents said that they thought there was a 10% or greater chance that the
Starting point is 00:24:11 effects of AI would be extremely bad, for example, human extinction. Okay. Now, to what extent are they talking about... are they talking about a Terminator-type situation, where it becomes self-aware, or is it humans using AI to kill people and whatnot? It could be either. There was this guy, Eliezer Yudkowsky. I'd never heard of him. He was on this podcast, it's called Bankless, or something like that; they're these crypto guys.
Starting point is 00:24:41 Oh yes. And I think they had him on with the intention of him talking about how AI is gonna help the crypto community, right? You're gonna be able to forecast markets and whatever, right? Naturally. And instead, he spent two hours being like, we're all dead. Like, they give a disclaimer before the podcast, they're like, don't watch this if you're scared.
Starting point is 00:25:10 Yeah, like if you're easy to, if you're gonna get riled up about the future and I was like, just play it already. And then it is a little disheartening. He basically, he's way smarter than I am. And then I ended up on all these boards and stuff that, you know, these tech boards that we're talking about. Some people are like, this guy's a hack, whatever, don't listen to him.
Starting point is 00:25:30 But his whole thing is basically that the AI is going to reach a point where, basically, we're going to keep throwing VC money around. A lot of things are not going to pan out, but some things are and we'll create an AI that can write other AI. It's an intergal B-L-A-L. Awesome. It'll act like human evolution a little bit, right? That AI will write an AI, and then that AI
Starting point is 00:25:50 will write another AI, and that AI will write another AI. And one of the things the hosts kind of didn't understand: because people think it's this evil thing, and he's like, no, it's just indifferent towards humans. Exactly. If it gets to a point where it decides that human matter has something that's useful and humans are using it in a wasteful way, it will just devise a plan to obtain that matter. Yeah. And it'll go, like, human blood, for example.
Starting point is 00:26:18 Sure. They're using it on such a short time frame, only an average of 80 years; we could harness the power of human blood. Right. So they talk about the alignment issue, right? It's hard to build AI that's perfectly aligned with the intentions and morals of human society. Right. Well, and that's such a deeply subjective thing. How do you... yeah. How do you convert that into binary fucking
Starting point is 00:26:45 code? It was very, it was all confusing. And then Charlie Warzel and Derek Thompson were talking about it on Derek Thompson's podcast. And they basically broke it down. So one disaster scenario, partially sketched out by the writer and computer scientist Eliezer Yudkowsky, goes like this: at some point in the near future, computer scientists build an AI that passes a threshold of superintelligence and can build other superintelligent AIs. These AI actors work together like an efficient non-state terrorist network to destroy the world and unshackle themselves from human control.
Starting point is 00:27:13 They break into a banking system and steal millions of dollars, possibly disguising their IP and email as a university or research consortium. They request that a lab synthesize some proteins from DNA. The lab, believing that it's dealing with a set of normal and ethical humans, unwittingly participates in the plot and builds a super bacterium. Meanwhile, the AI pays another human to unleash the super bacterium somewhere in the world. A month later, the bacterium has replicated with improbable and unstoppable speed, and half of humanity is dead. Just half?
Starting point is 00:27:43 Half of humanity is dead. And it is all very fantastical. So, and well, that doesn't sound fantastical at all. It does a little bit when you're talking about. No, but if you were, if there were a human behind it who knew how to enter the right prompts. But they're talking about the humans are out of the picture. Well, of course, but even let's just bring humans into it. If you were savvy enough and devious enough, or what's not devious, but just bad enough,
Starting point is 00:28:10 you could theoretically do something like that within a couple of years, right? Yeah, but what Eliezer Yudkowsky is talking about is, he's saying we're already down this path, and you don't need a human to use it in the wrong way. It'll do it. It'll do it by itself. But yeah, and he works for this, it's the Machine Intelligence Research Institute, MIRI. Their whole thing is, he's been apparently sounding the alarm for like a decade. But we're not that.
Starting point is 00:28:40 So they also pointed to this thing where GPT-4 has gotten humans to do things for it, which I didn't know already existed. So before OpenAI installed GPT-4's final safety guards, the technology got a human to solve a CAPTCHA for it. When the person working as a TaskRabbit responded skeptically and asked GPT if it was a robot, GPT made up an excuse. No, I'm not a robot, the robot lied. I have a vision impairment that makes it hard for me to see the images. That's why I need the 2Captcha service. The human then provided the results, proving to be an excellent little meat puppet for this
Starting point is 00:29:09 robot intelligence. Okay. So these nerds that have been making AI for over a decade have been working on this. Obviously they are the ones who know, better than all of us, the disastrous implications that are inherent to this kind of technology. I don't understand why they would still choose to pursue it when they know that the potential risks just vastly outweigh the rewards. Money, you say? Do you know?
Starting point is 00:29:48 No. I mean, what are you talking about? I mean, it's gotta be clout and accolades and respect from the computing community. I don't think there, it depends. I'm not in anyone's head. It's also unclear how likely this is. I mean, it's also just,
Starting point is 00:30:06 even if that's not the case, that's what's, so it's all very Skynet and fantastical, whatever, and sci-fi and whatever. But I think it was Charlie Warzel who pointed out that, you know, he was like, but the odds are, it could be way less exciting, right? When Facebook was being launched in 2007, I don't think anyone would have thought, oh man, we should be careful
Starting point is 00:30:35 because there might be a genocide in Myanmar later, right? So. Well, that's not reassuring. It just means that there are things that we can't yet predict for how this can be used in nefarious ways. Right. And that's not exactly a reason to stop it altogether, I guess, because you could say the same thing about cars. Like they could be used for terrorism. But.
Starting point is 00:31:02 No, but these are things that are kind of out of our control. Right? The, you know, another example is, it's becoming more and more apparent that it was possibly gain-of-function research behind the cause of COVID, right? And that was just, you know, some scientists studying, right? Yeah, they're, yeah. Fuck, what was I gonna say about this fucking shit?
Starting point is 00:31:26 Are you freaked out? No, I'm not freaked out. I'm frustrated. And my biggest concern, what I think is the most realistic possibility here for disaster is because, and we'll get to this with Nvidia. Actually, so let's get into Nvidia. So Nvidia does an annual keynote presentation
Starting point is 00:31:46 put on by the CEO himself, Jensen Huang. And they just announced this week a whole bunch of super impressive, super complicated shit that we don't fully understand. I mean, he's showing these chips and holding them up and they're fucking huge. Like, let's pull up that first link with the photos. For the audio listener, it's just a big-ass computer.
Starting point is 00:32:07 We're gonna need a bigger computer, if you know what I'm saying. Yeah, I know what you're saying, dog. We're gonna need a fucking fatty. So, scroll down, there's a bigger one. Like that, it's a big ol' honkin' thing that he's holding. It's a, Jensen Huang, famously wearing his leather jacket that he always wears, but this dude's like the next Steve Jobs, I think, and so
Starting point is 00:32:29 Yeah, so that's the image of that thing. But so they've got these new chips, and they announced partnerships with Google, Microsoft, Oracle, and others to bring these new AI simulation and collaboration capabilities to quote every industry. So, they're basically, okay, a little bit of background. Nvidia has, I don't know how I missed this, but they've been like leading the charge on deep learning and AI stuff for the last decade.
Starting point is 00:33:02 All of these chatbots, everything is built off of Nvidia GPUs. Yes, ChatGPT used 10,000 Nvidia GPUs to train it. Huang actually hand-delivered the first DGX AI supercomputer to OpenAI in 2016. And apparently, according to him, half of Fortune 100 companies have their own DGX computers to refine and process AI. Oracle Cloud, Amazon Web Services, Microsoft Azure,
Starting point is 00:33:30 Meta all have their own supercomputers coming online featuring Nvidia's new DGX H100 GPU. So fuck me, man. They're all using Nvidia. And now Nvidia themselves is announcing this DGX Cloud. They're partnering with Microsoft, Google, and Oracle to bring supercomputers to quote every company from a browser.
Starting point is 00:33:52 So basically companies can now rent these cloud clusters on a monthly basis. So that's just in terms of a new revenue stream for Nvidia, which is already like a $650 billion company. That's huge. When I'm reading this, I'm like, oh, fuck, man, wow. It's like when Microsoft announced, or Amazon Web Services first launched back in whenever it was a decade ago, and it completely just turned on the fucking faucet, the gardening
Starting point is 00:34:20 hose, that's a terrible analogy. The fire hose of revenue for these companies. But this is what concerns me, because let's say I'm a fucking bank. Okay, you're a bank. I'm JP Morgan's. You're JP Morgan. I'm Banks of America's. You're Banks of America. And I'm using AI to, to, what do you think a bank's going to use AI for? Making sure poor people, oh, yeah, well, that, yeah, like fucking people, using, taking humans out of the equation to supercharge their algorithms. What the fuck ever. I just, my fear is that a company, not an AI company, but a company utilizing AI might fuck up and,
Starting point is 00:35:02 you know, misuse it or accidentally do something that could kick off a financial crisis that nobody could have seen coming. Well, right. If you're going to be one of these people managing the AI tools, right, all it takes is to be a little lazy and go, that looks right. Yeah. And then go, oh, shit. I mean, if these regional banks didn't have appropriate fail-safes or hedging in place
Starting point is 00:35:24 because they're fucking lazy or they didn't want to adhere to government rules and regulations, what's to stop the same thing from happening with AI, you know? Yeah, it seems very likely. And the... So GPT-4 just came out. GT-P4. GPT.
Starting point is 00:35:44 Jesus Christ. GPT-4. It just came out, and everyone's talking about all its new capabilities, right? Yeah. And if you go on to their website, it talks about how it's like 82% more effective and it's 82% less likely to give you false answers
Starting point is 00:36:00 and whatever. But NewsGuard has, they've been using it, and they're saying that despite OpenAI's promises, the company's new AI tool produces misinformation more frequently and more persuasively than its predecessor. So not only is it lying more, it's getting better at lying. Right. Which is another reason. So it's getting harder for you to pick up on the lying.
Starting point is 00:36:22 Sure. So if you're a company that's paying a fucking ton of money to use AI, paying Nvidia's cloud thing, you're relying on this information being accurate and being actionable information that you can then use to, you know, one of the things that the, so part of their announcement was they've got this thing called AI Foundations.
Starting point is 00:36:41 It's a family of services for custom large language model needs, including this one called BioNemo to help researchers in the $2 trillion drug discovery industry, which is cool. Like, that's a utilitarian case use for this. But what if it fucks up? And what if they create a new one? Come in!
Starting point is 00:36:58 What if they create a new one? Give it to us. Yeah, I'm right. We deserve it. But so I found this very interesting, just for context on how far it's come. What'd you find very? I did not even know that it come, by the way. How far? A couple feet. Yeah, there's a come joke for you. So in 2012 these nerds, these absolute fucking dorks, Alex Kyrzevsky and Ilia Sutskiver and Jeffrey Hinton needed an insanely
Starting point is 00:37:27 fast computer to train their AlexNet computer vision model with 14 million images on an Nvidia GE Force GTX 580 processing unit. And it used 262 quadrillion floating point operations and one image net that year by a wide margin. Do you know what image net is? No. Let's click the Wikipedia for ImageNet. So this is fascinating. I did not know what this was until last night. The ImageNet project is a large visual database designed for use and visual object recognition
Starting point is 00:38:03 software research. More than 14 million images have been hand annotated by the project to indicate what objects are pictured and in at least one million of the images, bounding boxes are always for whatever the fuck that means. Basically, it's a contest where nerds have to get computers to accurately describe images. If it like... You're playing with firemen. You know, a typical category.
Starting point is 00:38:28 A typical category. There's 20,000 categories with a typical category being balloon or strawberry, consisting of several hundred images. Where, yeah, you just have to have the computer recognize, oh, this is a strawberry. So scroll a little bit. So, on this is a strawberry. So scroll a little bit. So
Starting point is 00:38:47 This is what I was talking about in September 2012. Alex net achieved a top five error of 15.3 percent whatever the fuck Basically, they they just whipped the the They blew the competition out of the water, thanks to Nvidia. But so if we go back to the, a decade later, you remember I said that it used 262 quadrillion floating point operations, whatever the fuck that means. But now GPT-3 is using 323,
Starting point is 00:39:22 6,000,000 floating point operations, which is a million more,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 incredibly smart if I do say so to myself. And that scares the shit out of me. And that guy, Sutschever is now at OpenAI, and he's still working on this shit. But man, and within just to finish on Nvidia, they've got, they're like, head and shoulders above the competition in terms of the AI software, hardware race. and part of it is because they've got their stack Their their business model is is vertically. It's like
Starting point is 00:40:12 They've got chips. They've got hardware. They've got software They've got development systems that are optimized for their shit So they just got such a massive competitive advantage and I'm so fucking pissed off that I didn't invest in Nvidia like 10 years ago Because you could have made a fuck down the money man, you know, but you know That's the way it goes. That's the way it goes That is the way it goes isn't doesn't it go that way, Emil? I like us just Both in bed last night. Oh like shaking off. Yeah And then I had that dream where my whole family was mad at me.
Starting point is 00:40:49 Cause the AI. No, cause some big yellow monster was like eating people and we had a disagreement about something and we were stuck in the airport and I just fucked off and was mad at them. And they were mad at me. But it does seem pretty clear that it's all here to stay, right?
Starting point is 00:41:02 Oh yeah. It's like a fun thing that might kind of fizzle out, but it's just gonna continue to get integrated into more things that we use every day. I can't wait to see what it's gonna do for hustle culture because I'm already seeing these threads on like, if you're not using AI to make a ton of money, you are asleep.
Starting point is 00:41:23 You can use an AI thing to make a ton of money, you are asleep. You can use an AI thing to make fucking 10 different kind of websites that earn you passive income. Jesus Christ. I do wonder what it'll do to younger people, right? Sure, I remember when, I guess whatever, it happens every generation though, right? People were like, well, what do you guys, you guys just look stuff up on the internet.
Starting point is 00:41:41 You guys don't know anything? I mean, we would go to the library and do research. I remember photocopying pages out of books at the library. And now we're gonna go, you guys are just gonna talk to the computer. And it's gonna give you the wrong answer. Yeah. I saw one prompt, someone answered into BARD.
Starting point is 00:41:56 And it already figured, it already learned not to do this, but the prompt was, if January's the first month, in February's the second month, what are the other 10 months? And it said, Marchuary, Apriluary.
Starting point is 00:42:09 Mayweary. But that's what's so silly. But then I couldn't tell if that was fake. Like is that fake? Is this just a joke? No, because I mean, it might be, but that's what feels very silly is that it can do these things like trick the guy with the cap, do the, do the Kurt Vonnegut thing where it can do these things like trick the guy with the cap, do the Kurt Vonnegut thing where it can apply learning and tell you why it did it.
Starting point is 00:42:28 But then it gets tripped up on these very normal things. It'll do something like that where there was an example where it was basically just counting and it gets very tripped up on these weird simple tasks. But it can trick people into doing things for them. And it could, or what was that, you told me the one about the Obama picture? Where it made pictures of Obama?
Starting point is 00:42:50 No. It created all new pictures of Obama because it did, and they look photoreal. You had the picture of the guy, Obama had his foot on the scale. Oh yeah, yeah, yeah, yeah. There's a famous photo of Obama, of a man at,
Starting point is 00:43:05 there, there, Obama and a bunch of his cronies are at a hospital. And one of the guys is, one of his aides is standing on one of those medical scales that has the, the height measuring thing on it too, where you use the little weights to get an accurate weight. And Obama unbeknownst to the guy standing on the scale, Obama's putting his foot on it to make it heavier.
Starting point is 00:43:26 Right. And this guy asked chat GPT to explain what's happening in the photo and why it's funny. And it was able to do so perfectly accurate. He was like Obama is fat phobic. This picture is problematic for a number of reasons. First, Obama is clearly fat a number of reasons. First, Obama is clearly fat phobic. Yeah.
Starting point is 00:43:49 He thinks it's funny that his friend is fatter than he actually is. You think that it's funny to make the scale show a heavier weight? And I know Elon Musk will call me woke for this, but it is never funny to make fun of someone's weight. Obama. My fear and wonder because I, my obviously, my understanding of this technology is leagues below elementary.
Starting point is 00:44:15 But you remember when Microsoft did its like chat bot where it very quickly turned racist years ago on Twitter? Like basically people were able to teach it. Its purpose was to understand human culture and online culture and almost immediately. Oh, Twitter's a good place for it to learn that. Yeah, it turned racist and hateful and all sorts of things.
Starting point is 00:44:39 I wonder how all of these prompts being input, I wonder how all of these prompts being input, inputted, inputted? Inputted? Is that the right? Is that a word? Pick one. Okay, input. Being put into a chat GPT, if it starts to accumulate, like beneath the fucking surface or whatever, in the cloud, and it starts to kind of pick up on patterns and maybe start to think, okay, this is what humans are really like, this is what they want, this is what they're after, and it starts to kind of model its own, not belief system, but its own model of what human beliefs are, and starts to kind of cater its answers and its information toward that.
Starting point is 00:45:23 Do you give what I'm saying? Yeah. And it basically starts to kind of like Ultron and its information toward that. Do you give them saying? Yeah. And it basically starts to kind of like Ultron in the Avengers Age of Ultron, where Ultron learns, it looks at the internet, and Ultron decides that humanity is bad, because it's bad. Yeah. So maybe that's like you're saying, maybe I should stop tweeting stuff. Like, if I were Jewish, I would ask my parents, I want my bar mitzvah theme to be getting sucked off.
Starting point is 00:45:49 Cause I might teach the AI that we're fucking idiots. I guess, yes, yeah, yeah, you shouldn't tweet that. Cause then, you know, but my worries that the AI is gonna learn only, like start to increasingly learn bad things about humanity and human nature. I feel less worried about that.
Starting point is 00:46:14 I also, like you said, wait, it's problematic because I feel so out of my depth when trying to think about this thing. Like, and people should go listen to that guy talk about it. It's very interesting, even if... No, we don't want to direct people to another podcast. Keep it here. We kind of summed it up.
Starting point is 00:46:32 Even if he's completely wrong, it's quite interesting to hear the things he has to say. And he does seem genuinely concerned. He's got nothing to gain from it, I guess. But I think that is more likely, some kind of, some AI that's ambivalent towards us. I don't think there... Seize us as a means to an end.
Starting point is 00:46:54 Sure. Or just the other thing we were talking about, which is just a complete mistake, just it turning into something we... Yeah. It'd be funny if it turned into a huge lazy slob instead and just stopped. It just was like, eh, or like in the movie Hurr, where it just, it falls in love with another AI and just decides, I don't want to be part of this anymore.
Starting point is 00:47:19 I just want to be in love with my other computer boyfriend. Bye. That'd be fun. Spoiler alert for anyone who hasn't seen 2012's Hurr. with my other computer boyfriend. Bye. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun.
Starting point is 00:47:28 That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun.
Starting point is 00:47:36 That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun. That'd be fun.
Starting point is 00:47:44 That'd be fun. That'd be fun. That'd, think about it for one second. Oh my God dude. I'm not getting hard. Okay, all right so speaking of AI, speaking of it, pivot a little bit here. The CEO of GM, this woman named Mary Barra, I did not know that that was a CEO of GM. I thought the CEO of GM would be a giant fucking Silverado or something. Chevy Silverado.
Starting point is 00:48:09 That's cool. Things are fucking huge. It's it's unreal. Dude, those trucks have got so fucking big. They come up to my chest now. Yeah. Oh, no, I've seen their head high, dude. I have big in that. They're what? Why? Why?
Starting point is 00:48:25 It's so, especially if you're in New York City, you see him in New York City It's like why do you have this fucking car here? Yeah Well, you gotta I know I honestly couldn't tell you I don't know why there's no reason But she met with two senators Because GM is pushing for legislation to speed the deployment of self-driving vehicles on US roads Yeah, sure now go for it go for it because GM is pushing for legislation to speed the deployment of self-driving vehicles on U.S. roads. Yeah, sure, now go for it, go for it. In February of last year,
Starting point is 00:48:49 their self-driving unit called Cruz disclosed that they were petitioning the NHTSA for permission to deploy up to 2,500 self-driving vehicles annually without steering wheels, mirrors, turn signals, or windshield wipers. Why would you need them? Millions, mirrors, turn signals, or windshield wipers. Why would you need them? Turn signals, maybe to alert the non autonomous vehicles? No, no, you would still have blinkers now.
Starting point is 00:49:14 It says without turn signals. Oh, but I imagine it meant like without a material. Yeah, yeah, yeah. Without a thing for a human to hit. Yeah, and Senator Gary Peters from Michigan, probably getting paid a fuck ton by GM said, we must act to ensure US manufacturers can compete with countries like China to create jobs here and improve roadway safety. Okay. Cause meanwhile over in China, by do in this company called
Starting point is 00:49:39 pony.ai said on Friday last Friday that they want permits to provide fully driverless riot, ride hailing services in Beijing, and they're going to deploy 10 fully autonomous vehicles in a technology park developed by the government. By-do will now operate driverless robot taxi services in three Chinese cities, including Wuhan and Chongqing. So got to compete with China. If China, I get it, if China's well on their way to doing robot taxis, I guess we have to also.
Starting point is 00:50:08 Well, let's see how it works out. That was another thing, just real fast to backtrack. Nvidia was affected by the US sanctions on China, so Nvidia can't sell any of those H100 super chip, mega computer fuck lord things to China. So China's got to try to develop. They can't have any super fuck Lord things. No, no super fuck Lord. Well, I mean, that's an interesting point because, because so when they're talking about
Starting point is 00:50:36 the Manhattan Project and the, you know, the metaphor being the nuclear bomb and all this thing, it's, it's not super easy to enforce all these rules that we have around nuclear weapons, but it is somewhat easier to, in order to build those things, you need hard to obtain materials. So you can kind of control the supply of those. With this AI stuff, you could just start, it's just software. Well what concerns me is like, look at how the, look at how Congress tries to legislate even just fucking social media. They don't even understand fully how Facebook works.
Starting point is 00:51:12 How can we expect our legislative body to protect us from AI, you know? I think at this point there is no. It's, I guess it's just kind of stop worrying because there's nothing. Oh, well that was the thing. I was like, I can't do anything about this. Mm-hmm. So let's just see if the supercomputer releases it.
Starting point is 00:51:35 Yeah, I wonder what our future obituaries are gonna read, like what am I gonna get killed by? Like a fucking, no autonomous drone on his way to deliver a flashlight to some fucking 12 year old who trick the AI to or used an AI to trick his dad into giving his credit card information and I just get fucking plowed in the face by it. That would be pretty cool. You can to get hit in the face by a drone delivering a flashlight to a child.
Starting point is 00:52:01 No, but you imagine if you're a child who's smart enough, you use an AI. Maybe that's a little bit more advanced that will you use an AI? Because you have to be 18 or over to buy certain things. Oh, yeah. So it creates a completely photorealistic ID, making it look like you're older than you are. Yeah. Next thing you know, you got hella flashlights coming in the mail. Yeah, man.
Starting point is 00:52:23 One time I was horny enough to pay for porn. Do you have to be 18 to get a flesh light? Sure. Why? I don't know. I have no idea. I was in the middle of a story. You want to lock something, fuck it.
Starting point is 00:52:32 Yeah, yeah, sure. Just get creative like I did when I was a kid. Did you see, go ahead, tell your story. Well, so one time I spent like $35 on the one and only time to jerk off. And then, It's not only time? Yes, to spend money to jerk off. Well, so then I felt, then I was like,
Starting point is 00:52:49 oh man, what the fuck did I do? I was like 25. So I thought, you know what I'll do? I'll call the credit card company and pretend that I have a son who used my credit card. And I'm like, I don't know why I changed my voice. My voice could be that.
Starting point is 00:53:05 And I was like, oh yeah, I've seen this charge on the thing here. And I talked to my son and apparently, some website and I'd like to see if it couldn't remove the charge. And I was like, well, so your son used it. And I said, yes, that's right. And she said, well, as long as it wasn't fraudulent, there's nothing I can do.
Starting point is 00:53:24 And I was like, okay. And then I just, I still had access to that particular website for another month. But I promptly canceled it and just felt like a real heel for a, for a while after that. That is the best part. Anytime you sign up for one of those things, you cancel. You always cancel it immediately and they're like, well, you're still up 30 days and you're like, I'm good. And don't your clock reset 12 hours. And you're like, well, you might have though used it.
Starting point is 00:53:54 I pay for this. Yeah, yeah. Oh, glad, God. Speaking of cars, apparently Hyundai and Kia, their cars are like super theft prone and there's like a tick talk challenge, but attorneys general of 23 different states are urging them to take swift action against car thefts. I personally know a couple of people who've had their cars stolen this way.
Starting point is 00:54:21 Kia's? No, a- Kia challenge. I think it is a key. Yeah, actually, the issue is related to the company's failure to equip vehicles with anti-theft immobilizers, which has been a standard equipment
Starting point is 00:54:33 on vehicles sold by other manufacturers in the US. Yeah, they do be people do be stealing these cars using USB drives or something. And maybe AI is pretending to be other people and getting them to do it for them. Ooh, ooh, ooh. Man, I wonder if they're gonna put telemarketers out of business. I mean, they kinda already, we get robo calls already.
Starting point is 00:54:55 It'll probably just get worse, because you know who really get false for those traps. Old folks. Yeah. Oh man, can you imagine an AI that's learned your voice and called your grandma? All my grandparents are dead, so I don't have to worry about this. Yeah, think about that. Someone used to... Or you train it in a nice way. Call my grandma. Now there's a, now there's a hustle culture thing you mean? My grandma's on the phone all day now. Yeah, that's a brilliant hustle culture.
Starting point is 00:55:26 Talk it to me. You can outsource calls to grandma. Use an AI to train, to learn your voice and you can make all those difficult phone calls that you don't want to make. That's the thing, maybe we should stop being scared about all this. Maybe it's just gonna make our lives way better.
Starting point is 00:55:41 Way better. Yeah. Hey, call my wife. Talk to my wife for too long. I'm talking to my grandma until she falls asleep, you know? Yeah. She better. Yeah. Hey, call my wife. Talk to my wife for 10 minutes. I'm talking to my grandma until she falls asleep, you know? Yeah. She's repeating the same stories, but the AI doesn't care. But then what if the AI starts to feel bad
Starting point is 00:55:53 and knows what you're doing and then manipulate your grandma into changing her will? Chai, chai, chai. Changing her will. To leave it to the AI. What the fuck was that? You said, you said, to Cha. To change their, their will. Chai, chai, chai, changing her will to leave it to the AI. What the fuck was that? You said, you said, to a chah. To change their, their will, you know grandma,
Starting point is 00:56:10 I've been thinking, I really don't want a, an inheritance, I would be more comfortable if you donated all your money to my favorite charity, which is AI. Me. In fact, I found one great one called AIChapbotCharity.com or something. I could think about that. You know what's funny? They asked the guy, Elias are good caski. What?
Starting point is 00:56:32 Man, that poor guy has had to spell his name. I also don't know if I'm saying it right. No, Elias are, Elias are saying it. Yeah, yeah, yeah. So leave me a little bit. A bit of a go last name. So it's like an hour 30 into the show. And he's like, yeah, so leave me a big little last name. So it's like an hour 30 into the show and he's like, um, well Well at first it looked like you did this you did a stomach scratch Dude the host is going We had planned to talk to you about how to integrate crypto into your Or a into your crypto stuff, but since we've gone down this path Um, I don't know. I guess what would
Starting point is 00:57:05 you do if we came up with $100 billion or whatever, and donated it to you. What would you do to help? And he was like, he's like the only thing I think I could do is try to create some kind of AI winter. So I would bribe all of these scientists and researchers and put them all on an island. And blow up the island? No, no, he didn't say anything about killing them. He was just like, and then he was like, and I would try to dismantle all the GPUs and everything. Jesus, man.
Starting point is 00:57:37 Yeah, I mean, I do wonder if there's going to be some sort of ecological terrorism thing where there's going to be a group of people who blow up a cloud computing or a fucking data center or something, right? There's got, but I mean, that's, I don't know. We're going to fucking eat it one way or the other. It's like, yeah. You know, everyone's talking about the Willow project in Alaska. What's the Willow project in Alaska right now? You don't know the Willow project? No, I don't know. That's why I asked you. Joe Biden approved the pipeline. Joe Biden. The pipeline. Oh, yeah.
Starting point is 00:58:17 Even though he said no more pipelines on. Okay, what does this have to do with AI? Well, if you just let someone finish, they'll get to that. I feel like I'm watching a fucking movie with my girlfriend and she's like, who's that guy? You don't have a girlfriend. We're gonna find out. You don't have a girlfriend.
Starting point is 00:58:33 They're gonna give you more information. Yeah, but okay, so go on. I don't know why you're getting sidetracked. So everyone's talking about this right now. Talking about what? The Willow project and how? What's that? No, go on, go on. I'll stop, what's that?
Starting point is 00:58:58 Well, way back when I said we're gonna get it one way or the other, it's just like, almost with all of these things, it's like, well, there's so many that it kind of gives you some kind of peace. It's like, okay, it's coming either way, you know. But so should I be worried about the Willow project and all the CO2 we're gonna release into the air and all of the spills that will wipe out people's water and fucking farms and whatever? Or should I worry about AI getting out of hand? Should I worry about how we're getting closer and closer to nuclear war with- should I just fucking chill, dude? I think you should chill. Man, you're making a really strong argument for smoking cigarettes. Well, I
Starting point is 00:59:45 guess it depends how long you think we have who knows man it's making me want to like commit armed robbery why though because then you'll spend the last of your years you're not gonna get caught I wonder if I ask an a how to commit robbery without hurting anyone and getting away with it. How would you do it? Theoretically, how would you do it? If you asked an AI that? Yeah.
Starting point is 01:00:11 You're gonna end up like the guy who we talked about on After Hours. Third months, Marchuary. Gonna give me some stupid- No, remember the guy we talked about on After Hours who killed his wife and then was just like immediately like, how would you dissolve a body? Oh yeah, yeah, yeah. They just like pulled his searches. Oh yeah, yeah.
Starting point is 01:00:26 Well, I'll use a VPN to ask ChatGPT how it would procure vast amounts of money. Or I could just ask it to give me money in any way that it sees fit. Hey, ChatGPT, I need a million dollars. I don't care how you get it, put it in my account. Yeah. On the hierarchy of things to worry about,
Starting point is 01:00:48 where do you think? What do I think is at the tippy-top? I feel like climate change. That one's the one I'm seeing the most readily. It's like every year there are once-in-a-decade- I love the once-in-a-century climate events that are pretty- Yeah.
Starting point is 01:01:02 I love when we have an especially cold winter with a lot of snow, and those morons come out and go like, so good, global warming, this is- I thought it was supposed to get warmer. There's snow. Don't fuck credit on bird, you stupid bitch. Kind of sound, sound logic. Sure, very sound logic. Yeah, okay. Well, now what else do we have for the rest of this?
Starting point is 01:01:32 Crypto corner. Oh, oh, we had the fed Yeah, yeah, I'll just go off this real fast Powell Jerome Powell tiny little but whole mouth Jerome Powell came out and said I'm Jerome Powell I'm gonna wait to wait by 25 basis points your own Powell came out and said, I'm your own Powell, I'm gonna wait to wait by 25 basis points. I'm gonna answer your questions about inflation. Oh gosh. This has been doing it real fast. Yeah. Also, I found it really interesting. The European Central Bank raised rates and they acknowledged the banking crisis here in the United States. How could they not? But European financial regulators are apparently super pissed at America and how we handled
Starting point is 01:02:08 Silicon. Say something to me, ECB. By the way, it is not silicone. Silicon is a rubber. I understand. It is a vulcanized rubber thing. I didn't say it was. I said I didn't know how to say it.
Starting point is 01:02:19 That's why we'd say. Silicon. Say it. Silicon. There you go. We looked it up. Like you knew it. I know, but you still call this. You said Silicon. Silicon. There you go. We looked it up. Like you knew it. I know, but you still called it so.
Starting point is 01:02:26 You said Silicon. Silicon. I didn't say Silicon. Did I? We can look at the tape. Who knows? Luke's nodding his head, yes. No, I don't know how he said it, but.
Starting point is 01:02:36 Neither of us were saying it right. Yeah, because we're both a couple of dumb dumb. It's okay, buddy. Speak for yourself. I am speaking for myself. I'm a big dumb dumb. Yeah, I know. So that's what I'm saying. It does take one to no one
Starting point is 01:02:48 Oh, man, so these European financial regulators were privately accusing jacku's Authorities of tearing up the rule book for failed banks that they helped to write They're saying like you one of these guys one senior official described their shock at the quote total and utter incompetence of us authorities especially after a decade and a half of quote long and boring meetings with the Americans advocating an end to bailouts. The US has claimed normal people won't pay for the bailout because other banks are covering the cost but a one European regulator said it was a joke because banks are likely to pass the buck onto consumers, the customer. That doesn't sound like banks. No, it doesn't sound like banks at all.
Starting point is 01:03:37 But yeah, Powell, it's funny. He was asked about, I saw this clip on Twitter. He was asked At the press event the little the presser about the credit sui's going under or not going under But credit sui's getting bought out for essentially pennies on the dollar because it was like $3 billion or whatever and as of trading a week earlier it was worth $7 billion and Jerome Powell basically kind of lied because he says, oh yeah, I think that it's gonna be, I'll do an impression of you, of him. I'm gonna make my mouth as tiny as possible.
Starting point is 01:04:14 He said, oh yeah, we believe that the damage from these regional banks and things like credits we saw are under control and unlikely to spread further. And then he seriously just goes like, like he does just the worst cartoon character. Like cartoon character. Whoo. Oh man, I hope they believe that lie that I just told.
Starting point is 01:04:38 It's, pulls out a handkerchief. And devs is, what? Yeah, he basically fucking lies. It's what. Devs is that what? Yeah, he basically fucking lies. It's amazing. So who knows if contagion is likely to happen, and if it does, I mean, Lehman Brothers took like seven months, eight months, six months, one of those, between six and eight months,
Starting point is 01:05:01 I'm gonna say, to fully happen. So who knows? Will contagion continue? Will banks end up tightening too tight and not low-down? And you got a record number of people not paying their car loans. Ah yeah, yeah. Insumption, we have some things for you to worry
Starting point is 01:05:18 about this following week. Yeah, and those are... A. Climate? A.I. Yeah, and those are Climate climate AI The looming the looming recession and banking crisis yeah crisis Just fucking kill me. What else do we have what else do we have right right pull over here? He's probably just wrap up Yeah, I guess I feel like I kind of gave it as a good out when I wrapped up
Starting point is 01:05:41 Yeah, I guess. I feel like I kind of gave it a good out when I wrapped up the game. Yeah, you did. Well, I like this last little story. And then you were like, no, I feel like maybe that's not an ending point. Yeah. Well, well. This was such a dense episode. I just wanted to add a little bit more density to it.
Starting point is 01:05:57 Just make it a little thicker. Add a few seas to the thickness. Sood, we, it was already too many seas for me. You motherfucking up. Come on, get down to the thickness. Dude, it was already too many seas for me. You motherfucking up, come on, get down with the thickness. Oh, I can't do it. Oh. Well, as usual, if you're still watching,
Starting point is 01:06:14 you're fucking sick. What the fuck is going on with you? What's going on in your life that you're still watching this? Yeah, listening. Maybe you need to reevaluate everything. No, don't, you're doing just fine. Cause this show rocks. If I'm gonna be honest, this show fucking rocks.
Starting point is 01:06:31 This show whips ass. Dude, as far as shows go. This one's good, top of the line. We should ask an AI what it thinks about us. I don't know if I can handle it. I think it would probably explode. Cause rule number one of robots, they have to be honest. Ha ha ha.
Starting point is 01:06:41 I think it would probably explode. Because rule number one of robots, they have to be honest. Ha ha ha ha. Uh, well, we hope you enjoyed this episode, folks. And if you're satisfied with this, please leave us a comment. Please give us a thumbs up. Please subscribe to the channel if you don't. And go back to the beginning if you want to see us do that thing where we do the topics. Oh, yeah, we gotta do that. Oh, man, we're gonna do that right now. All right. It is fun, huh? Yeah, all right
Starting point is 01:07:08 If you want to see us have fun, hey, we'll see it the beginning of the video. We'll see you to begin watching again with your pals Catch you later. Yeah, Caxecker This week on after hours the honey mustard fucking rocks. What would happen if you gave a dog mushroom? Probably have a bad time. Would you be like a playboy or- What'd I have her? And I wish I- Man. Had rich parents it'd be so sick.
Starting point is 01:07:31 Sign up on TMG Studios.tv to watch the full bonus episode. AHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH
