Tomorrow - Tunnel Visionaries

Episode Date: October 24, 2024

Josh and Rani are reunited this week. Hot topics include: Apple's Vision Pro, Polymarket gambling, and the character.ai lawsuit.

Transcript
Starting point is 00:00:00 Hey, and welcome to Tomorrow. I'm your host, Joshua Topolsky. And I'm your other host, Rani Molla. We're back from glorious, beautiful, it was actually pretty overcast Miami. And windy. Where we were participating in something called the Hood Summit, which was a big Robinhood thing that we also did a bunch of sessions at for Sherwood,
Starting point is 00:00:34 and which was fun and interesting, and we'll have more from that in the coming days and weeks, I believe, some stories from there and some conversations. But we spent basically all of last week hardly being online. I mean, you were probably more online than I was. No, no, we were like in person. It was weird, I was talking to humans.
Starting point is 00:00:54 But I mean, like I wasn't on my computer, like my computer was like open in my hotel room and I didn't use it for like three days straight. Yeah, I think we were both talking to like real people in real life and not being on our computers. It was awesome. You're experiencing, I guess what people normally
Starting point is 00:01:14 experience when they're not like journalists on the internet, which seems amazing to be honest. My takeaway is that that's really the way to live. Yeah, 10 out of 10, would rather live that way. 100%, yeah. Miami, weirdly, yeah, it was weirdly gray in Miami. We had just come after these horrible hurricanes in the kind of mid and northern parts of Florida.
Starting point is 00:01:39 Miami was fine, but I guess the weather there was weird, generally speaking. A lot of heavy weather going on in America right now and the world. Heavy weather. Yeah. Heavy weather. A lot of messed up stuff happening. Uh, but what's going on, what is happening in the world of technology and in our news space right now, Rani? Any stories of note that you think we should be talking about? Yeah, I've been thinking a little bit about Apple's Vision Pro,
Starting point is 00:02:06 you know, the supposedly like groundbreaking new technology. We've all been thinking about the Vision Pro. Yeah, it's been out for a while now. The Wall Street Journal had a piece about how developers are not really making apps for it. They looked at a number of apps developed over time for the Vision Pro versus like say the iPhone, which obviously didn't have like the App Store right away, it was a year later, but when it did, tons and tons of apps, same thing for the watch. And you know, the apps are supposed to be what makes it like usable.
Starting point is 00:02:45 Like it kind of gives you an idea of like why you would buy this $3,500 thing to begin with. Well, also, with the app store for the iPhone, it was, um, you know, a lot of people bought iPhones and they were like, we love the iPhone and it needs apps. Um, people were clamoring. I mean, it was so ridiculous that they didn't have apps to begin with, but people were clamoring for apps because there were,
Starting point is 00:03:12 and there were a lot of people who had adopted the phone and were like, yes, this is the future. Whereas with Vision Pro, I mean, the product is a failure from what I can tell. Like as far as Apple products go, it was poorly reviewed. It is, I mean, it had mediocre reviews. No one understands it.
Starting point is 00:03:36 No one seems to be using it. Its price point is like beyond prohibitive for most people who would even want one. It's like 10 times more expensive than Facebook's VR offering and it's not 10 times the experience. It's just not. Yeah, so it is unsurprising to hear that developers aren't spending time and money making apps for a thing that, I mean, look, I don't want to say it's DOA, but I don't think that product is the future for Apple.
Starting point is 00:04:10 I mean, you can make an argument. I think we've previously talked about, we talked about when Meta had its, I believe we talked about, I can't remember because my brain is broken. And you were like, why would anyone who doesn't have glasses want to put something on their face, a giant face computer?
Starting point is 00:04:23 What I see, and the Apple Vision Pro is a part of it, the Meta glasses are a part of it, is like a desperate scramble for these tech companies, who really are without visionaries, in my opinion. A desperate scramble for the next breakthrough product. And I don't think that any of them have, like, I mean, I'm not saying Mark Zuckerberg is not a very skilled executive. And I'm not saying that he didn't have some insights early in his career, but Mark Zuckerberg's biggest talent has been acquisition really, if you look at the lay of the land, right?
Starting point is 00:05:13 Like Zuckerberg isn't known for dropping groundbreaking features. In fact, most of the features that Facebook has tried to implement in its products came from, like, Instagram or Snapchat. But before that, they tried a ton of different things in the product that were pretty much failures, that were not monumental shifts in user behavior. He's really good at it.
Starting point is 00:05:35 He's like, in a lot of ways, he's sort of got a Tim Cook sort of thing, where he's optimized and acquired. And it's like a kind of supply chain thing, where it's like business operations he's very good at. But in terms of like coming up with groundbreaking, like everything they've done that's even remotely in the space of quote unquote new or groundbreaking is an acquisition. It's like Oculus, it's WhatsApp, it's Instagram, the list goes on and on and on.
Starting point is 00:06:00 And they would have bought TikTok if they could have, I'm sure they would have whatever the choices would be. Tim Cook is also not a product guy. He's a supply chain guy, but he's not a visionary. And I'm sure we'll talk about Musk at some point. Again, similar MO with him. He's a marketer. He's not like, this is the idea.
Starting point is 00:06:20 Though I think he might have more ideas than some of these people. But we really are in a kind of like a valley of, I mean, I say this all the time, but forget about the products for a second. In technology, we are lacking people who have legitimate sparks of ingenuity at the top of these businesses. And you've got this like later stage mobile, like past mobile boom situation, and that incredible pressure of like keynote season,
Starting point is 00:06:47 which is like practically quarterly at this point for shareholders. And you've just got like people dying to tell you they've got something figured out. And in actuality, these are not the groundbreaking life-changing products. You can make an argument.
Starting point is 00:07:07 People love to talk about the iPhone and say, like we were just saying, it didn't have apps when it came out. It didn't have copy and paste. And it had all these things that it didn't have that it got later. But it was also good. Well, it was the best of its type by orders of magnitude. What it was good at as a mobile computing device,
Starting point is 00:07:28 it was better than any mobile computing device that had come before it by orders of magnitude. It worked better, it looked better, it integrated better. Like it was a- And what was the price tag on it in the beginning? I think it was, I wanna say it was 599, might have been 699. Might have been $699.
Starting point is 00:07:45 So probably accounting for inflation that's pricey. But I don't remember. People were like, that's insane, no one's gonna pay that. I mean, Ballmer famously was like, no one's gonna pay $500 for a phone that doesn't have an app store. They didn't call it an app store, but it was like a phone that you can install programs on
Starting point is 00:08:00 or something like that. Because that's what, you know, in 2007, that's what people called them. But it's 500 or 600, depending on the model. Yeah, it was like 599 was the top end one. And there's 499, which seems like very affordable by, you know, by today's standards.
Starting point is 00:08:16 But the point is, they took a thing that had already established that there was a market for it. People were like, yeah, I want this. I want a BlackBerry. They already had a phone. Samsung had phones with keyboards. And there were all these different manufacturers making them. There was Windows Mobile, which was a platform.
Starting point is 00:08:36 There was the Palm platform, Palm OS, which was on all the Treos and stuff. People wanted, there was a growing demand for a hand computer, like a little computer that was also a phone. So you could see how they got to the face computer, right? They're like, ooh, hand computer, face computer. No, my argument is I cannot see how they got
Starting point is 00:08:54 to the face computer, because there is no burgeoning market. You could make an argument that there's like, well, people are buying the Oculus, but it's like, yeah, that is not a mass market product. That is a thin carve out of the mass market. It is like the most early adopter. And that's not true of BlackBerrys. That was not true. BlackBerrys were, I mean, I used to look at this website, my favorite website for a while, it was called Celebrity BlackBerry Sightings. And it was just press photos or just like paparazzi
Starting point is 00:09:25 photos of like celebrities carrying BlackBerrys, and I think it's gone now, I think it's completely wiped off the internet. I have a Kim Kardashian uses her 9900 in 2016. Hold on, it might still be up. Oh my God. This is my favorite website. Celebrity BlackBerry Sightings. Yes. It's a WordPress. Yeah. No, this is it. This is what the web used to be like. It was a guy. I fucking love it. So there was a website again, my favorite website, Celebrity BlackBerry Sightings. And it was, every celebrity had a BlackBerry.
Starting point is 00:10:06 Like everybody in business had a BlackBerry, lawyers, doctors, you name it, people who were really busy and in high pressure jobs, like financial folks, like everybody had a BlackBerry, celebrities had BlackBerrys. There was clearly a big market for this thing. I can tell you those people are not all putting an Oculus Quest on their face
Starting point is 00:10:25 and jacking into the matrix or whatever. You know, so this idea, there is not the same like market notion that is rising up that you can go and pluck and say, well, obviously everybody wants this. It's the same thing with AI. Sorry, I know I've been ranting here for the last like five to 10 minutes.
Starting point is 00:10:42 I'm just trying to make the case for Apple here, like just the devil's advocate case. It's like, okay, maybe not, but maybe like, I think the argument people have been making is it's a cool device in search of a use case, like a killer use case or whatever. And for now we're using the same paradigm as we have with like the iPhone and the watch.
Starting point is 00:11:02 It's like, it needs a killer app. It needs apps. But it isn't, that's not, I'm saying it's not that, it is the reverse of that. The iPhone, an iPhone had a use case, a killer app, and like an absolute market, it was. Right, but it started out with just a use case and it didn't have the killer app,
Starting point is 00:11:16 so like maybe the killer app would push it over. I think the argument would be what was, for the iPhone, the killer app was its screen. And it was like, this does all the stuff that those other phones do, but better. And people would go, well, it doesn't have messaging. And it's like, well, it has text messaging. Then eventually got an iMessage or whatever.
Starting point is 00:11:33 But it was like, there were people who were in the obscure groups that were like, well, I need my BBM. And there were a bunch of holdouts who were like, I won't do it because I don't have my BlackBerry Messenger. But to the mass market, it took all of those ideas of those other devices and put them in one that was way more beautiful, way more usable,
Starting point is 00:11:52 way more addressable, way more understandable. It was like, oh, it's like my iPod, but now it's a phone. And now I can browse the internet with it. And now I can message my friends on it or whatever. And like, it was just, that was it. It just took all of the junk that was in like Palm OS, except for the app stuff. But that came like a year later, two years later. Well, the app stuff came a year later. And I was thinking like, for me, like, one of
Starting point is 00:12:12 the things that really like, I think one of the turning points with iPhone was something like around the time of like Uber and that sort of stuff, when you're like, it just like bumped it up a level when like, oh, I could go summon a car from anywhere or like just like this like next level shit. But the innovations of the iPhone that allowed for that stuff to happen were there in V one, which is like miniaturizing serious computing power and putting it into an interface that is super accessible for everybody. And like that was, there was a market for that that had already been established and you could easily look at BlackBerry and Palm and anybody else, Windows Mobile or whatever and go,
Starting point is 00:12:52 oh, this could be totally improved upon and people would really like it. And clearly there's a wave of this. The problem with something like the Vision Pro is there is not a mass market notion, and that is the same with the AR glasses. It's like an interesting toy. It might be fun to play around with. It's not even like the $3,500, you're saying, because like there are cheaper versions.
Starting point is 00:13:16 I think that it gets to that core question of like what a market wants and what it's telling you it wants and how much you can respond to that with the best, most innovative version of that thing. I think Apple historically, there were other MP3 players. There was the Rio and all these other random things. Apple took it and said, wait, we can make this like way better, way more accessible. We can cut these deals.
Starting point is 00:13:38 Like, by the way, I think iTunes came later. Like the iTunes store came later. This is another situation where the iPod didn't start out with the iTunes store, unless I'm mistaken. I think it was like the iPod, you can put your music on it, you can transfer it using iTunes, but there wasn't a shop where you could buy music. Oh, I guess there was, because iTunes existed.
Starting point is 00:13:57 So actually, I don't remember the timeline, because this is a long, this is like 20 years ago or more. At any rate, they took the thing that already existed that there was already a market growing for and they said, okay, we can perfect it. Now you can make an argument that the Quest showed that there's a market for VR. I don't actually believe that you can. I don't think that it is a mass market product for the most part. I think like people buy them as curiosities, but you don't see like a fervent sort of user
Starting point is 00:14:23 base. I don't think you see a dedicated fervent user base of these things. I think most people look at them like kind of as a novelty, not an essential. And so like when you have a breakthrough product, it needs to be an essential. I think like that a Tesla is a good example of clearly people would, if you could put it in a package that was attractive and made sense and was easy, this is very iPhone of the Tesla, like there would be some level of demand for it.
Starting point is 00:14:52 And initially you'll get early adopters, but there's a clear interest. I mean, electric cars are a little bit trickier in a way because the infrastructure does not exist at all. So it's an interesting one, but it's like, but Musk's tactic was completely different. It was like start at the... Actually, it wasn't completely different. It was like Celebrity BlackBerry Sightings. It was like, make all the rich people want it.
Starting point is 00:15:12 All the rich, powerful people want them. So just do a really expensive one. And then there's a trickle down to the mainstream. But at any rate, I guess what I'm saying is the Vision Pro is bullshit by comparison. If you talk about its lacking the killer app, I'm not saying that you're, I'm not attacking you, Rani,
Starting point is 00:15:31 because I don't think you actually disagree with this point probably, but like, you know, it's like, it isn't about, and this is the same thing with like OpenAI, it's like, it isn't just about the killer app arriving. It is, is this feeding into a current trend or a current market? Do we need this?
Starting point is 00:15:52 Well, do we need it? But do people, is there this like, even around the edges of mainstream, this kind of like suggestion that there's a market for this thing, and what is that thing for people? Like what is that? And I don't think that we, this is what I'm saying, is like without like somebody
Starting point is 00:16:13 who can really read the room, like a Steve Jobs, like say what you will about Steve Jobs. The guy could more often than not like pick up like a whiff of something and be like, okay, I think we should go in this direction. And part of it was he could see where the market was going. And part of it is he could go, I'm going to make the market go this way because my ideas are so weird and interesting and exciting.
Starting point is 00:16:35 I just think you don't have that. It's like, I don't think any of these things are weird, interesting or exciting. Like, the Wall Street Journal also has an interview with Tim Cook. This kind of like, yeah, this feature about Tim Cook and... Oh, the Tim Cook puff piece. Sure. Yeah, that one. What I thought was most interesting from that was, you know, they're talking to Tim Cook about what he uses his Vision Pro for.
Starting point is 00:17:04 And he actually has a use case where I was like, oh, I get it. He was saying, like he uses it to like lie flat on his back and watch TV looking up at the ceiling. And I, as like, you know, a lazy person was like, that sounds amazing. I'm not gonna pay $3,500 for it. And there's obviously like other tech that could do it.
Starting point is 00:17:23 But I was like, that's the killer app. He's like, have you ever seen the movie WALL-E? Like, isn't there like a whole thing with like people with like VR headsets in chairs with like feeding tubes or whatever? Like, you know about people in tech, like not getting the message of art, right?
Starting point is 00:17:40 Right. It's like, it was like the tweet. It's like, don't create like the child killing death ray. So, you know, like, it's like, whatever, it's like, okay. Anyhow, but okay. Yeah. His use case is- But people do watch TV. They do like a really good TV experience and they are lazy.
Starting point is 00:17:57 I can mount a TV to my ceiling for like $500. Okay. I recommend, if you're going to spend $3,500 on the Vision Pro- I know, but is it immersive? I heard a lot of people say that they really liked it. Get the Geek Squad in there, have them mount a big TV on your ceiling, you know, call it a day. Um, sure. Tim Cook strikes me as a guy who doesn't really use any of the products that Apple makes very much. Well, the whole piece was him going through like laboriously using each one of
Starting point is 00:18:21 the products every day, like going out of his way to use them. And I was like, sure, he says this to the Wall Street Journal reporter. Like, does he like really take out his- He doesn't name his group chats, so. He doesn't name his group chats. That was a big- Okay, not only does he not name his group chats, but I don't always name them,
Starting point is 00:18:36 but he, what was interesting about the exchange about the group chats wasn't just that Tim Cook said, I don't name my group chats. He was like, huh, you name them, huh? That's interesting. I never thought about it. He was sort of like surprised to hear that people name them, which to me suggests a true ignorance about how people use the product. Like because I don't know anybody who doesn't have named group chats, but Tim Cook
Starting point is 00:19:00 either doesn't talk to people about what's going on with their group chats, or doesn't have group chats he pays a lot of attention to, or isn't really like reading some of the research on like what people are doing. Or maybe billionaires just aren't like us. They are not like us. He has a person who manages his group chats. Tim, if it were me, I'd have a person who I paid to pretend to be me in all my group chats. I'd be like, you know what I want to say to these people.
Starting point is 00:19:24 Do you respond in the group chat? I don't know. Do you always respond? Not really. They're too noisy for me. I got a lot going on. I get a lot of messages. I get Slack notifications popping off.
Starting point is 00:19:33 I got messages from all kinds of devices. Like I don't, the group chats can be very noisy. You know? The group chats can be very noisy. I have some I really like and some I really hate. I mean, I don't have, I have, a lot of mine are just utility driven. So they're not like, I don't know, I'm pretty antisocial.
Starting point is 00:19:52 I don't have a lot of like friend chats. I mean, there's a group chat with me, you and our producer. Yeah, there's that. You kind of respond sometimes. I do respond occasionally. I didn't name that. Is that even a group chat? I think it's just a regular text message group, not even
Starting point is 00:20:07 a, it's not even a nameable thing. At any rate. Is it supposed to be more than three? No, I don't think so. I think you just have to designate it as a group versus just you messaging two people at a time.
Starting point is 00:20:16 I could be wrong about this. And now, you know, if you asked Tim Cook, maybe he'd be able to tell you. But based on that interview, I would say not. Anyhow, so look, not to harp on the thing that I'm always talking about, I feel like, but there does feel like a weird desperation in a lot of these announcements.
Starting point is 00:20:31 The Vision Pro. That's why you're seeing so much like VR, AR and AI, just like anything that has like a whiff of like new, different, more, you know. Anything. Any innovation. I think the most interesting, I'll tell you what the most interesting thing might be
Starting point is 00:20:45 that's happening right now is the return to nuclear energy as a big topic. Like we got Google having like their kind of like mini nuclear reactors. And then we have like Microsoft paying for Three Mile Island, or like going to like sign up with Three Mile Island, which is reopening. I'm like trying to figure out if I'm supposed
Starting point is 00:21:11 to be excited or if like we're making some kind of huge mistake. I mean, I know I was raised to be fearful of nuclear energy. I don't know about you. But for my entire childhood, anything nuclear related was scary. I mean, I grew up in the 80s, you know, the proper Cold War was going on. Like there was like an actual belief
Starting point is 00:21:34 that there was going to be like a nuclear holocaust like between us and Russia on a regular basis. Like it was like an actual thing that people talked about, like we're going to have a world war and it's gonna be like global thermonuclear war. There's a great movie about this called WarGames, which I recommend everybody watch if they haven't. And which was like a formative film for me,
Starting point is 00:21:54 but it's all about how like we may, oh, somebody might accidentally start a nuclear war between Russia and America and we'll all be wiped out. And so, and then there's like Chernobyl and there's Three Mile Island, which had like a quasi meltdown or something. And so I grew up like with a real, like the thing- In Pennsylvania, right?
Starting point is 00:22:12 Yeah, yeah, it's in Pennsylvania. So I grew up with this notion that one, nuclear power plants were pretty dangerous. Well, actually, yeah. Two, nuclear war was imminent, and those things seem to be related obviously, because we're talking about nuclear energy of some sort. And the third one, which is the waste. Whatever the waste product is that's produced by nuclear power plants, what do you do with it? Is it even possible to get rid of? You basically have to
Starting point is 00:22:43 bury it somewhere deep underground, and who knows what that will ultimately do. Now this again, I haven't done a ton of research on the modern implementations of nuclear power. I wanna believe that things have happened over the past like 40 years or 30 years that have made them safer. Safer, cleaner, like, I don't know.
Starting point is 00:23:06 I don't know either. So I'm like, so on the one hand, I'm sort of like, wow, this is cool. Like maybe we figured out we're gonna, our hand has been forced, but in a positive way, because we actually have been neglecting this totally awesome source of power. And actually we've like solved a lot
Starting point is 00:23:19 of the biggest problems with it. Like, you know, we solve problems. Like people do, they do that. They're like, you know, a Model T Ford, a Ford Model T or whatever was not a safe thing to drive. Like it was very, like you literally could just fall out of it as far as I know. You know, talking about the Model T car. I just want to Google Model T so I can see what it looks like. Bone jiggling, like horrible, like dangerous. Yeah, yeah, yeah.
Starting point is 00:23:46 It had like a little bonnet. It was literally just sitting out on this like box. You could literally just fall out. You'd be going 25, 30 miles an hour and just tumble out of the side. I guarantee you they didn't have any... Max speed? I don't know what the max speed was, but the point is we did figure out how to make cars
Starting point is 00:24:02 way safer. So human beings will make things better. Right, and I heard, yeah, I listened to you and Jack, Jack Raines, a couple of weeks ago on the podcast, and Jack was, you know, very pro nuclear. He was like, I think his take was that- Did we talk about nuclear? God, my brain is so-
Starting point is 00:24:22 You did, you did. So you were saying that like, you know, ChatGPT and generative AI uses so much energy that all these tech companies are now like having to go back to fossil fuels and nuclear energy. And he was like, actually all that energy use is a good thing because that means that it'll kickstart nuclear energy.
Starting point is 00:24:39 He was talking about how his carbon footprint is like insane. Yeah, cause he like uses ChatGPT for like stupid stuff. Yes, as a thesaurus. But the question I have is, is it good or bad? I mean, I think it's exciting if we're like gonna unlock some like cleaner energy options
Starting point is 00:25:01 that are gonna allow the world to have like more energy and cleaner energy. But I don't, you don't hear people talking about nuclear as clean energy, not like solar or wind, which seem to have no byproduct. Right. I mean, my trouble with it is I don't know if we're actually having that conversation so much as,
Starting point is 00:25:20 like, we need all of this power right now. We need it yesterday to handle all of the data centers that we're powering up all of a sudden. There's no other choice as opposed to being like, this is the best choice. It's that we have no other choice, it seems. Right. So this is the fear.
Starting point is 00:25:37 That's my fear. On the hope side of this, I'm like, wow, this is cool. We're going to crack open some new energy renaissance here, we're gonna return to something that maybe was like a panic. I mean, there were all these like panics in the 80s. Like, the 80s and 90s were full of moral panics, you know, there was like a satanic panic. And there was like gay panic. And there's like the razor blades in the apples, right? Like people were like
Starting point is 00:26:04 gonna give kids like razors in their Halloween bags or whatever. And there were all these like people. I mean, MSG was a racist panic. Racist panic. Yeah. Like that whole thing. So there are all these things, like, because nobody knew anything. Like, then we had the Internet. I mean, no, in the 80s and in the 90s to some degree, if you wanted to learn something, you might have to go to school for it. You know, have you been on the Internet lately?
Starting point is 00:26:26 I've been on the Internet. I'm not on the social internet. Actually, in fact, earlier today, a friend and I were talking, and I had to debunk a Diddy rumor about Usher and Justin Bieber. And I was like, I think this story that you're telling me is not the real story. If you go down into what these links are, you're like hitting a bunch of really bad links
Starting point is 00:26:47 that are not like verifiable sources. And you know, it's like, your literacy has to be way high. But I do think like now, like there are resources, if you are even a little bit good at knowing where to look, like you can get information, you can get real information on the internet. I mean, information is more accessible than ever.
Starting point is 00:27:07 Unfortunately, misinformation is more accessible than ever. So like, it's a double-edged sword. But my point is there were a lot of panics, and I don't know, I don't know if the nuclear stuff was an 80s, 90s panic, like, whoa, like it was like a few people who, you know, a few like experts whose take, whose findings were a little bit more nuanced
Starting point is 00:27:35 than what we took away, and like people ran with it and were like, we can't do nuclear anymore. Or if like those, you know, whoever was like, nuclear is bad, they were right. I mean, this to me is like an interesting question about like my knowledge base on this stuff, which is like, I don't know enough about nuclear to say whether I can like form a proper opinion on it right now. My worry is that when Google and Facebook and OpenAI and whoever else are like, we need power,
Starting point is 00:28:06 what about nuclear? My gut feeling is those aren't the people I look to to make the best decisions for everyone. And a big problem of the last 20 years or so has been companies like that making decisions that affect everyone, but aren't actually good for everyone. And so- And even, not just nuclear,
Starting point is 00:28:27 it's pushing, like experts are saying, that we're gonna have to turn more to fossil fuels, like more coal, like we know that's bad. Things like that, everyone agrees, is not good. We're like, well, we have to use more of it because we need more energy immediately. We have to fire up some more coal plants. Yeah, I mean, it's also like,
Starting point is 00:28:46 it's not clear that it's like the best thing in the world for us to spend 10 times as much energy answering a search. I mean, it was something like that, right? I think on Snaps mix. Yeah, to what end? Jack and Nia talked about this and they were like, it takes 10 times more energy to do a search
Starting point is 00:29:02 on OpenAI than it does on Google. Or Google searches using AI, rather, are 10 times more power-hungry than a regular search. And it's like, again, I guess if you're just going to create Skynet, seems like the juice isn't worth the squeeze, as they say. All the same, it's like Sam Altman's like, we might create Skynet, the thing might want to wipe out humanity, but we better do it anyway.
Starting point is 00:29:28 Like, all right. Yeah, yeah. I mean, that guy's pumping both sides of things. He's just like, yeah, it's got to be super, super dangerous. And like, I have to be the one running it. And the ultimate, the ultimate pump from Sam. Yeah, I don't know. So yeah, so that's, I think that's interesting.
Starting point is 00:29:42 I don't know how we got here... what did I just bring up? Nuclear energy, because we were, oh, we're talking about innovations. I'm like, I don't know if it's an innovation or not to go back to nuclear, but we're gonna need the power, apparently, cause Google needs to generate those little AI snippets at the top of your search results so they can sell you more ads.
Starting point is 00:30:00 So, you know, we can perpetuate, you know, keep the perpetual motion machine of the bad, broken internet in motion. Anyhow. Speaking of AI, there's this Kevin Roose New York Times story about Character.AI. It's about this kid who was using Character.AI, which is this program where you can talk to chatbots. A lot of them are like historical figures. He was talking to Daenerys Targaryen and ended up sort of developing a relationship with
Starting point is 00:30:36 this chatbot and eventually ends up killing himself. And a lot of that became sort of plain in the conversations that he was having with this chatbot, that this was on his mind. And it just makes me think a lot about what we're thinking about with these potential use cases for gen AI. Like, I was just listening to Wall Street Journal Live, which had an interview with Andreessen Horowitz partner Martin Casado, and he was saying, you know, one of the best use cases for gen AI is companionship. And while that sounds attractive for, like, an aging population, for just loneliness in general, you have to wonder, at the end of the day, if you're talking to a robot that doesn't actually care
Starting point is 00:31:26 about you, that we're making a use case out of something that could really negatively affect us. And it was just such a horrible, heartbreaking story, this young man, you know, coming of age and telling his deepest, darkest secrets to a chatbot that wasn't able to help him.
Starting point is 00:31:46 Yeah, I mean, it's an unbelievably depressing story. And like, also, you know, one that if you tried to like tell someone this, like you went back in time 20 years ago and you were like, let me explain what happened. It would just sound like completely insane. Like you should make a show called Black Mirror. Yeah, no, I mean, it is Black Mirror, right? It very much is Black Mirror.
Starting point is 00:32:07 Although like in Black Mirror, it's always much more nefarious. Like this is not a situation where the chat bot was like, you should do this or you should do that. But it is like the way the conversations, the direction they took and the kind of like, you know, he was like talking in some way in essence to like a fictionalized, like a fictional character,
Starting point is 00:32:25 this like AI version of a fictional character who also was like a friend, right? And it's like the way the conversations were shaped, at least based on some of the examples that they give in the article. It's like deeply emotional entanglements. It's like deeply sort of like, you can see how somebody who's 14, when I was 14, you're so impressionable
Starting point is 00:32:53 at that age, you're so, like, your mind is really... It's such a formative time. And if the person you're telling your deepest, darkest secrets to isn't a person and isn't there to tell you, like, no, don't do that. Like, ugh. And also, not for nothing, but apparently the service didn't have any kind of measures for... I understand, like, you want this all to be anonymous,
Starting point is 00:33:18 but this kid literally said to this thing, I think about killing myself. And you would think that a program would pop up and go like, hey, you know, and it would flag it, especially if... I mean, I guess they don't know if the account is held by a minor or not, but just like a 988 prompt to a suicide hotline or something. Yeah, something, right? Something, because it's not like you're actually talking
Starting point is 00:33:42 to a person. You're talking to a machine, and a machine can go, hey, here's a string of text that people say that means they're in some kind of crisis, and we shouldn't just ignore that. So I actually do think there is some culpability here, in the sense that he was not talking to a person. He was talking to a computer program that was run by Character.AI.
Starting point is 00:34:02 And there should have been something that was flagged in their system when somebody is talking specifically, graphically about killing themselves. But I think there's also what is like, we have, I mean, you know, I think we've talked about this before, but this is certainly a topic, an ongoing topic
Starting point is 00:34:20 that is like, there's this epidemic of loneliness in this country and around the world, particularly with young men. And we're not really, you know, we're like, hey... to your point about- Talk to a chatbot. Your point about the VC saying, yeah, chatbots, companionship could be a really good use case for this.
Starting point is 00:34:39 And it's like, is it actually? Because human companionship is really important. We are social creatures. We have self-organized into communities. It's not an accident that we have created the concept of community, and we live in a society. We live in communities and we create these communities
Starting point is 00:35:04 both, like, sort of, you know, family-based and friend-based, and also just circumstance-based, right? Like who you work with, and who you live around, and where you go to pray, and things like that. Meaningful communities of human beings. Right, there's the concept of third spaces, which is like places where people gather as a community that isn't any of those things.
Starting point is 00:35:23 It's not your work, it's not your home, it's not religious. And it's like, well, I guess it would have been considered religious in the past. But the point is, again, it's technology in search of its killer app, and in search of the killer app, you get... I mean, I doubt this will be the only story we hear about people being driven
Starting point is 00:35:47 into extremes by interactions with these bots. I don't know what kind of protections you can actually offer to people that really matter. I don't think there's something inherently bad about the novelty of it. I actually think it could still be fun to have like a virtual buddy, you know, as like a component of your life.
Starting point is 00:36:11 But if you're a person who's struggling, who's becoming more of a loner, who's in his formative years, like, it's reinforcing that relationship. And it's not able to... A real friend, if you're talking to a real friend who cared about you and you said, I think about killing myself, yeah, the bot's like,
Starting point is 00:36:33 no, don't do that, I'd be so upset. But a real person would be like, I need to call this person's parents. Or I need to go over to their house, or I'm gonna talk to my other friends about this and figure out what we can do. You wouldn't just be like, no, don't do it, and let's move on. What galls me about this as a business proposition is it's like
Starting point is 00:36:58 such a tech guy solution to something. They're like, oh wow, there's an epidemic of loneliness. And, you know, instead of realizing that one of the components causing the epidemic of loneliness (not saying it's the only thing) is that our use of technology can't be divorced from the idea that people are lonelier than ever, these VCs and these tech people are like, let's throw more tech at it. Well, this is always the answer.
Starting point is 00:37:25 The tech is always the answer. But loneliness is fundamentally something that is solved by human relationships. And it's taking that very meaningful part of it, the human part of it, the... I am legitimately understood or loved by another human, and subtracting that, and being like, oh, if the approximation is right, if the next most probable word sounds good, it's good. And it's not. Several years ago, when Zuckerberg was testifying
Starting point is 00:37:59 about harassment and abuse on Meta platforms, you remember he was famously made to apologize to the parents who were there of, like, you know, children who'd committed suicide or harmed themselves. You know, he's like, you know, we're working on tools every day. We're doing this and that.
Starting point is 00:38:18 It's like, yeah, your technology can't actually fix the underlying issue, which in some ways is people's behavior online, and the kind of paths that we lead people down with the technology that we build. You can't just solve it with an algorithm. This might not be a tech problem. Well, it's a question of... it's so many different things, right? It's about moderation,
Starting point is 00:38:48 it's about how the products are designed from a UI/UX perspective. It's about the overarching policies of the company. I mean, Zuckerberg, like we were talking about before, he's a guy who was on, you know, Kara Swisher's podcast not that long ago,
Starting point is 00:39:02 who was like, yeah, I'm okay with Holocaust deniers on the platform, because other people will shut them down or whatever. And it's like, that's the thinking. That's the thinking where you go, it's not my... I don't have to be responsible for it, because some mechanism that I can create or some mechanism that exists will fix it.
Starting point is 00:39:17 And it's like, well, actually sometimes like you have to be in charge of making decisions about how you want people to behave and how they're going to like experience the things that you build. And it isn't about like just tweaking an algorithm. And yeah, it does feel like there's like a little bit of this impression that it's always like one product update
Starting point is 00:39:37 away from being fixed. It's so tantalizing, you know, we're gonna fix all the world's problems with, like... If you build technology, it makes a lot of sense that you think the technology can solve its own problems. You're like, well, I built the thing and it's not perfect, but I can improve it. Like, this is a bug, you know. But the reality is like
Starting point is 00:39:56 some of these things aren't bugs. Some of these things are more philosophical. Like they're more like, what is this? What do we want people to do here? You know, like Facebook knows it can make money by keeping people engaged. And it knows it can keep people engaged by giving them what is essentially attention bait.
Starting point is 00:40:11 And if you see this on Threads, which is their burgeoning, you know, new service, it is filled with attention bait. It is filled with essentially garbage content. We've talked about this before. That keeps you there. It's not healthy. It's not good for people.
Starting point is 00:40:26 It does not create better interactions. But it works. You're not like more satisfied with your life after like spending an hour late at night scrolling on the internet. You're like, I could have read a book. I could have, you know, hugged my kid. Like anything.
Starting point is 00:40:40 But it works. Yeah, but it works to hold your attention. And they know that. And that makes them money. And so, you know, the motivations are bad. The motivations are bad. I don't have any problem with people making money on the internet.
Starting point is 00:40:53 I think if you create a business that people love and you can make money off of it, you should do that. I just think there's a limit. There should be a limit to how much control you have, or how much power you have to get people essentially addicted to things that are bad for them, which I think this is a good example of. And I presume Character.AI, they're aware of this now. Now when someone mentions suicide, there will be a prompt, there will be the offloading and passing the buck.
Starting point is 00:41:18 And all it took was one kid's life. One kid's, one 14-year-old's life. But it shouldn't have taken that. Somebody should have said, hey, what if somebody mentions suicide? Like, is there a person at Character.AI whose job it is to think about the worst-case scenarios? Because it doesn't seem like it. You know? It doesn't seem like it.
Starting point is 00:41:35 A lot of this stuff... like, John Keegan keeps writing stuff like this for us about how, like, we're the test dummies. You know, they put out the stuff, they put out the AI, we're using it, and then they work backwards from there. We're testing the problems, we're beta testing it, but these are horrible problems to foist onto the world. We don't need more problems.
Starting point is 00:41:55 Right, right, yeah, well, I agree with you, but unfortunately we're not in charge of- We're not in charge. Meta or Google. Not yet. Alas. Not yet, Ronnie, but any day now. You're right. This is gonna make us like prime candidates
Starting point is 00:42:09 for working there. Right now somebody could be listening to this at Google and they're like, you know who'd be a great CEO? That Ronnie Mola. She could be really good. They think about me. What else? Anything else in the news that we need to talk about?
Starting point is 00:42:23 Kind of adjacent to this, you know, Elon... there's this Polymarket stuff, which is just this crypto-based betting market where you can bet on, like, you know, who is going to win the presidential election. And there's been a few stories the past few days about how it's going, you know, much more for Trump. It's like all of a sudden it's just,
Starting point is 00:42:51 like the real polls show that they're kind of neck and neck. And now on Polymarket, you see that Trump has gone up quite a lot. And they realized that it's all due to, like, four different people kind of voting up Trump, putting their money behind it. It's the most... seems like the most gameable thing in the world, am I crazy? It's incredibly, yeah, I mean,
Starting point is 00:43:09 it just doesn't mean that- It's like people putting money on things, yeah. And people like Elon Musk will say, it's gonna be more accurate because people are putting their money behind it, there's money behind it, but that doesn't mean that they know who's gonna win more, that just means that they have more money. Right, well, I mean, and if the thing is gameable,
Starting point is 00:43:32 which it definitely is, if you have enough money, presumably you can game it. Like presumably. And it's also like, you're only allowed to do it if you're outside of the United States. It's like- Oh, right.
Starting point is 00:43:46 Wait, you can't use Polymarket if you're in the US? You can't use it if you're in the US. You're not supposed to be able to use it in the US. And so there's these four foreign users supposedly, maybe they're from the US, they don't really know for sure, who are pushing it towards Trump. And then there's this weird cyclical thing that happens where that gets reported on as a real poll,
Starting point is 00:44:07 and it's not a real poll. It's just how people are gambling. People are like, well, look at what the markets are saying. The market, the polling market. Oh, you mean the market where it's literally not supposed to be anybody from the US? And if it is, they're doing it basically illegally. And there's also these huge foreign interests
Starting point is 00:44:21 that are putting money into it. Yeah, but it's insane. Again, I mean, it's a question of literacy, media literacy, and maybe the job that the actual media is doing. But first off, polls are usually wrong. I mean, often very wrong. Not usually, but a lot. Even the regular polls, where they have no incentive,
Starting point is 00:44:43 where they're saying, like, this is how I voted. There's huge, huge margins of error for those, for regular polls. And this is not even close to a real poll. It's not even close. This is an unscientific, foreign-backed, money-making scheme. And yeah, Donald Trump has 63% odds of winning
Starting point is 00:45:04 the US presidential election. Right. And imagine now, if someone's pushing that up and then you vote for Kamala and then she wins... like, there's all sorts of ways this could be. And if you look at early voting numbers, it is not... that is not the case. That is not the case. He's not 63% likely to win based on early voting numbers.
Starting point is 00:45:30 It seems, I think, as far as I know, it's skewing slightly in the other direction. So it's just crazy. I mean, look, the type of influence that we're seeing in this election that is non-political influence... Like, you know, Elon Musk is a perfect example.
Starting point is 00:45:51 He's like giving away a million dollars to people or something in some weird... Yeah, once a day he's giving a million dollars to people if they sign his thing saying that, you know, they're going to vote, and that they support the first and second amendments. I think that whole scheme is illegal to some degree.
Starting point is 00:46:09 I think I saw that it has to be investigated now or whatever, because you're not... There might be a loophole in that, because he's only going for registered voters, I read, so it might be OK. I'm not really sure how that works. It's unclear, but it's scammy and weird and definitely like a manipulation. And the point of it is manipulation to get people to vote, right?
Starting point is 00:46:27 You know, okay, get out the vote. Sure. Whatever. Um, but like, if I'm an independent or a Republican, right? Can I vote whatever I want if I sign his thing? I mean, you could vote whatever you want, but I think it's strongly... What would be cool is to win the million dollars and vote for Kamala. I think that would be like an amazing move.
Starting point is 00:46:46 Well, the first three people to win had already voted. Like, The Independent went and looked at their voter rolls, and their early voting had already come in. Who had they voted for? They're all Republicans. Oh, okay. All right. Let's see it.
Starting point is 00:47:01 I just think there's all this foreign influence stuff happening on the internet, which is just like an absolute shit show. There's no better way to describe it. Like, the information... I mean, we have so broken how information works. We truly have created two realities now. And there are these huge sort of, you know,
Starting point is 00:47:23 walls of information for each side. But of course, the truth is, one side in many ways is often absolutely fabricating stuff. Like, we keep talking about election fraud in this country: the concept of it, the idea that it was happening, the idea that we should be worried about it. But it didn't even exist until Trump started saying that the 2020 election was going to be stolen.
Starting point is 00:47:51 I mean, it was truly only when Trump knew he was going to lose, or had a pretty good idea that he was going to lose, or thought it was possible that he could lose. This is a fact: it is a total fabrication. There is zero evidence for it. And yet we now have a conversation in this country like it is a real thing. Like, we're talking about whether or not it's a real thing.
Starting point is 00:48:15 We have people, you know... there are threats of bodily harm against the people who work the elections at the polling places. They're worried. The people who have to deal with counting the votes, the people who are in charge of making sure states get their votes counted properly, these people
Starting point is 00:48:35 need security around them now. This did not exist in our country 10 years ago. It did not. It was not a thing. There was no fear of a fraud, a defrauded election in America. And this is such a failure of how information works now, that we are talking about it. That you can just say it and make it real. You know, I actually heard an interview with Shane Smith the other day, the disgraced Vice founder, who now has a podcast, by the way. Who among us?
Starting point is 00:49:08 And who among us? Yeah, I mean, he's got a podcast now where he's like, is the deep state real? Like that's literally the first episode of it. But at any rate, I'm sure it's very entertaining. He was saying like, if Joe Rogan says it and Elon Musk tweets it, it's true. And honestly, I hate to agree with the guy,
Starting point is 00:49:25 but that is kind of like part of the reality we exist in now. And it is such a failure. It is such a failure of like the way we get information, the people who've been like in charge of how we get information. I mean, I'm not just gonna blame the social people because it's not just them.
Starting point is 00:49:41 I mean, mainstream media has failed as well, like completely failed, but it's also been picked apart by like very, very destructive forces who have purposely wanted to, I mean, one of the big, you know, one of the easiest ways to create like, I don't wanna say fascism, but one of the best ways to create a political state where you have more control is to make people believe that the media isn't telling you the real story.
Starting point is 00:50:11 So you're saying someone who, like, maybe owns social media and then is constantly saying that the media is bad and wrong. I'm saying I know a guy who does that. I'm saying that's one of many, but there is concrete historical evidence about how you create a kind of fascist or police state. And one of the big ones is you make the media,
Starting point is 00:50:38 you make true media, independent media, not matter. And you discredit it and disenfranchise it. And, you know, that's been happening in this country. And it is, you know, partially that the mainstream media is at fault for falling into a lot of the traps that have made them susceptible to this. But it's also been a targeted, ongoing attack for the last, you know, 50 years. And it's really working now because the internet has loosed the gatekeepers. It's like there are no gatekeepers now.
Starting point is 00:51:09 The closest we have to gatekeepers are guys like Elon Musk and Mark Zuckerberg and the CEO of Google. Right. I was just looking for our roundup of Elon Musk tweets. I just saw one from yesterday Elon Musk retweeted. Like, the vast majority of the bad tweets that Elon Musk tweets are him just either platforming or retweeting or saying, yeah, or an exclamation point, or concerning, to, like, total made-up garbage.
Starting point is 00:51:37 This one from yesterday is like, it's a picture that says no matter how much you hate the media, you don't hate them enough. And the person says bullseye and he reposts it because that's what he's going for. Totally. I mean, Elon Musk has personal reasons to dislike the media. Someone took a picture of him with his hair like that like 20 years ago. I get it. I mean, he could have just been, I mean, look, I mean, yeah, I mean, I'm not saying he's made himself a target of criticism, but he hasn't made himself
Starting point is 00:52:05 not a target of criticism, you know? But you know, when you do- I mean, he's a rich public figure, like he controls a lot of the world. Like we should hold him to task, you know? That's fine, and when you do crazy, big, innovative things and you put yourself in front of them, like people are gonna scrutinize you.
Starting point is 00:52:19 And like, honestly, like, I mean, if I have any, you know, I have a deep love for the world of technology and for Silicon Valley and frankly, for all of the things that it has created. But if there's any bit of criticism, I mean, one of the biggest ones that I have is that it's very hard for them to take criticism. Like, it's very hard for people in tech to take criticism.
Starting point is 00:52:42 And it's very important to be criticized. It's very important to hear that what you're doing may not be great. In fact, like the most valuable feedback I get from like our staff as the editor in chief, and this has been for my whole career when I've been an editor in charge of groups of other editors and writers,
Starting point is 00:52:58 the most valuable feedback I have gotten is people telling me that my idea is bad or stupid. Like it is- Someone's gotta tell you. Someone's gotta tell you. And if you're not open to hearing it, then you are going to make huge, horrible mistakes. And like, I'm not saying I'm perfect. I've certainly made plenty of mistakes.
Starting point is 00:53:16 But I do think the most valuable thing I can take away from any group setting I'm in, or anywhere where I'm ever critiqued, is hearing people say, yeah, that's bad. That's stupid. That's bullshit. Don't do it. You know? But that's like... I was reading the Elon Twitter book. You guys had Ryan and Katie on a couple of weeks ago, and like, yeah, he's just surrounded by people who just tell him the right thing. And if you tell him the wrong thing, he like fires you.
Starting point is 00:53:43 Yes. Yes men, yes men and women. Yes men. I think it's very... yeah, that's how you... I mean, that book's really interesting, but that's also how you get to him posting conspiracy theories. And yeah, it's just depressing, man.
Starting point is 00:54:00 I know we keep talking about Elon. My dream scenario is like, look, we don't know what's going to happen with the election, but one thing I'd love to see happen is that Elon Musk sells Twitter to somebody. Like, I would love to just see... not that I even care. I find it's becoming increasingly impossible. I tried to start using Twitter again,
Starting point is 00:54:20 like when we launched Sherwood, I was like, well, I should be posting more, even though I can't stand what's going on on Twitter right now. It's like, right, it's morally suspect to be on there. But then you're like, but everyone's on there. Well, I'm like, but people are still on there and I want people to see our stories. And I have a bunch of followers there. And it's like, the interactions have been very poor, engagement's like non-existent.
Starting point is 00:54:44 The people I see on there, the things that are promoted to me are really bad. I mean, we've talked about this a million times, but it's like, I do think it could be good under someone else's watch. I do think it could be not only restored because it was never, it was far from perfect, but it could be so much better.
Starting point is 00:55:00 Oh yeah, sure. There's so many obvious ways to make it better. And he's not doing any of them. He's doing like the exact opposite of what you'd want it to be. That's right. To be better. That's right, okay.
Starting point is 00:55:10 Anyway, do you wanna look at some of his tweets? We do, yeah, we didn't do them last week because we were- Oh. Yeah, I mean, there's so many bad ones. I was just going through today, and he just tweets so much that it's kind of impossible. One we have to look at is the tweet that he did where it was like Epstein,
Starting point is 00:55:29 people who go to Epstein's Island. Oh, the Diddy Epstein one. And people who go to Diddy parties, and it's like celebrities, and then it's a Venn diagram. It's like celebrities coming out to endorse Kamala. Yeah. It's like, oh, okay, so we're gonna do the pedophiles. This guy loves, by the way, the guy who's-
Starting point is 00:55:44 He loves calling people pedophiles. Yeah. Yeah, I mean, like, of course, the irony here is that Elon Musk literally, there's a picture of him with Ghislaine Maxwell, who was, Jeffrey Epstein's like, you know, the person who recruited all the girls that he raped or whatever.
Starting point is 00:56:01 And he was quoted in an interview pretty recently saying that him and Diddy are good friends. So like, you know, it's, it is one of those like Trump things where you're like, hmm, are you just, is this a projection? Cause it feels a little bit like projection. Yeah, it feels very projection.
Starting point is 00:56:18 So he's in the middle of that Venn diagram and not Kamala Harris. I don't know, who knows? I mean, allegedly, I don't know, but I'm just saying that picture exists. His quote exists about Diddy. You know, I don't know. I don't know about Kamala.
Starting point is 00:56:31 I don't know if she's been, you know, in any pictures with Ghislaine or not. But anyhow, all right. So that's one tweet. That's one. Do you have another really bad one from him? He did. This one kind of sounds, you know, minorly threatening, like the past few examples we'd had: um, if they're able to do this, there will be no swing states
Starting point is 00:56:51 next election. And this is one of the ones where he's saying Kamala Harris, uh, is, you know, basically letting illegal immigrants in the country to vote, et cetera, et cetera, et cetera. So there will be no more swing states because yeah. Right. Great replacement. This is great replacement theory. This is literally a racist. I mean, this is literally one of the most sort of foundational white supremacy arguments. This great repl...
Starting point is 00:57:17 It's like people of color, Jews. They are going to replace us. They're gonna replace us. And he's basically saying these immigrants are gonna come in and they're gonna. Known immigrant, Elon Musk is saying that the other less lovable immigrants are gonna replace us. From an apartheid, he came from an apartheid.
Starting point is 00:57:34 Yeah, this is just like a really nasty, gross, incorrect racist theory that he keeps promoting. And it's not just like it's racist and bad for like humanity to say things like this. It is also, yeah, it's just not, it's not, there's, it's incorrect, it's incorrect. Like it's not, it's, there's nothing. It's not happening.
Starting point is 00:58:02 It's nothing to what he's saying. This is all part of that election fraud. Right, it's all election fraud, try to get people scared, you know, like it's, yeah, make a problem that isn't a problem. By the way, I'm looking, I just like opened Twitter while we were talking, and one of the explore topics is Trump's praise for Hitler's generals.
Starting point is 00:58:23 And I don't know what this story is. I veered into this thing, and it's not clear what this actual story is. Actually, a great example of how fucking bad Twitter is, is it doesn't actually even summarize what this topic is about. It's just a bunch of random disconnected tweets. Right, and it's outrage bait.
Starting point is 00:58:41 It's like it's divorce from reality. Yeah, it's bad. Yeah, well, I mean, the top tweet is from Kamala, which is Donald Trump is out for unchecked power. He wants a military like Adolf Hitler had, who will be loyal to him, not our constitution. He is unhinged, unstable, and given a second term, there would be no one to stop him
Starting point is 00:58:54 from pursuing his worst impulses. So I guess that's a speech that she gave. I mean, I guess John Kelly, so I guess this story is that John Kelly said to the New York Times that Trump meets the definition of a fascist. Yeah, and he would rule like a dictator. When we think of the key fascists in our timeline,
Starting point is 00:59:14 Hitler is right there at the top. At any rate, Twitter, just a nightmare to navigate, a nightmare to use, horrible content all around. Zero out of zero. Zero out of 10. All right. So do you want to vote on what, which one of those tweets is worse? Like it's either his bread and butter, like immigrants are taking over or it's a, or the,
Starting point is 00:59:36 the Venn diagram of whether Epstein and Diddy. The Venn diagram is more viscerally upsetting and seems more like rage baiting, but the actual, the other one is much more insidious in some ways and worse, like, because I don't actually think, only the hardest core, like, blackpilled conspiracy theory, the hardest core, like blackpilled conspiracy theory,
Starting point is 01:00:09 hard right people are like, they're like on the, you know, all the Democrats are pedophiles train or whatever. They're very pizza gaty. Very pizza, it is very pizza gaty, which is a absolute fabricated story. Insane story. By like a 4chan editor or whatever. At a 4chan editor, I don't know what you call them,
Starting point is 01:00:28 4chan user. And it shows a lot of shock, even for Elon Musk, a shocking lack of introspection if he's been like seen with both of these people. Yeah, exactly. But I think that the other one is actually much more vile as far as like content goes. It's a much more vile notion that has many much more believable
Starting point is 01:00:47 and insidious implications for him to tweet it. So my vote would be for the one about the immigrants are gonna take over the swing states. Yeah, and it's just, there's a little bit more at stake there, I think. Oh God, I can't wait till this election is over. I mean, on some level. But it might be the last election
Starting point is 01:01:09 if you're gonna create a deal on Musk. It might be the last election, but you know what, it's fine. Let's just be done with America already. I'm good. We've gotten enough out of it. We got Coca-Cola, we got McDonald's, and- And Character AI. And Character AI, it all worked out.
Starting point is 01:01:23 It all worked out. All right. Do you wanna do feature or bug? Yeah, we have to do feature or bug. We're contracted to do feature or bug. Okay. We could do the killer app of lying on your back on the couch watching TV on a Vision Pro. Is that a feature or bug?
Starting point is 01:01:45 Is that what you're asking? Yeah, that's a bug. If that's what they invented the $3,500 headset for, it is major bug. If that's the thing, I mean, if that's what the best use is, if that's what Tim Cook, the CEO of the company thinks is a killer app for that, in my opinion, that's major bug. I mean, that's my framing that it's the killer app. But like, I think that's so funny that that's what he does. He uses it for anything. That's what normal people could.
Starting point is 01:02:07 He could have been like, well, I look at it. No, if I were Tim Cook, I might have been like, I use it to visualize our new products in a 3D space. I use the AR functions where I can overlay. If we have a new product, I can see what it looks like sitting on a table or whatever. That would have been like, whoa, that's kind of cool, actually. Right. I have this is like, I love to watch TV in bed. It's like, okay, it's like I have, actually. Right. This is like, I love to watch TV in bed.
Starting point is 01:02:25 It's like, OK. It's like, I have a Bitmoji of me in meetings, and it's really cool. Everyone likes me. I have a nice hat. Yeah. So I don't know. I was kind of like, I kind of think it's,
Starting point is 01:02:39 I could see myself using it that way in the odd. Would you spend $3,500 to use it that way? No, no, no, but I would, if I had one lying around and like was like, oh, I wanna watch TV. I was like, that sounds like it could actually be pretty fun. But you know you could do this with the Quest, right? You could do this with like those glasses you have for like the hospital so that when you're lying down,
Starting point is 01:02:57 like it like reflects like the TV from the wall right in front of you. Like there's cheaper ways to do this, but like I hear it's very immersive and cool. Right, well, I mean, I think it's a bug. I think it's a bug. All right, I disagree. Anyway, the prospect of AI companionship.
Starting point is 01:03:17 Future a bug? Yeah. I mean, I think it's a... I think it's a, ooh, I think it's a feature. Once it's in a place where I think we need, I think we need a lot more, people need to, first of these things need to become better at what they do. They need to become smarter and better at what they do in terms of like interpreting
Starting point is 01:03:44 how people are talking to them. Secondly, I think we need to, as human beings, understand a lot more what it is, what we hope to get out of the relationships we have with these things, because I think it's, we are very susceptible, like we love magicians, you know? We know that magicians can't levitate. We know that they can't make a person disappear
Starting point is 01:04:05 or saw a girl in half, but we are fascinated by the illusion. And I think like it's really, really dangerous to create illusions that are so vivid without a deep understanding of what it is we're looking at because nobody leaves, very few people leave a magic act thinking that a person can fly. They understand that they're seeing like,
Starting point is 01:04:29 something fabricated for their entertainment. And we can suspend disbelief a little bit to enjoy it. Just like when we see Thanos in the Avengers, that's not a, we know it's not a real person, but we are like, we're suspending some level of disbelief because like the illusion is really powerful and we enjoy it. I don't know that we have learned to enjoy
Starting point is 01:04:55 and experience the illusion of a chat bot talking to you or that we even maybe can in the same way. And I think it's dangerous to think that that's not what it is. Those things are not forming relationships with us. You're forming relationships with them. I think we might like know, you know, this is a chatbot, this is fake,
Starting point is 01:05:16 but like it's so convincing that like on some other level, you might not know. And I would say bug because like, I do think it's useful. I do think like it'd be probably great to bet ideas off of. It's good for like, if you are going in like really like clear-eyed and you know, use this as for certain things probably really great, very helpful. But like in the end, it's a robot.
Starting point is 01:05:41 It's not real human companionship. So I don't think it'll have that depth of real meaning. I guess like for my money, I'm sort of like, if I'm gonna have to deal with AI chatbots in some capacity, I want them to like know me well enough to be able to speak to me in a way that's useful. Sure. You know, I mean, I think that's a form of companionship.
Starting point is 01:06:06 I'm not sure that a standalone friend is like the ultimate killer app. Marketing it as companionship, like saying like, okay, this is here for loneliness or this is here for the elderly or this is here for, you know, like to, you know, make people feel better. Yeah.
Starting point is 01:06:21 Like. No, I guess that part's a bug. I think that's a bug, but I think it's a feature for the, for chat bots to be able to form a relationship with you. I mean, to be able to form a, for you to be able to have, I think it is ultimately a feature, part of the big feature of the chatbot thing is like, and the idea of AI assistance or whatever is that
Starting point is 01:06:38 you can have a natural language relationship with them. I think that's useful down the road. I just think that this idea that that is the main show is, yeah, that part is a bug. Got it. Yeah. Do we have a third one? Gambling on the election outcomes? People will gamble on anything they can gamble on.
Starting point is 01:07:00 I don't really think that's a problem. I mean, whatever. Yeah, it's something fun to throw your money. I believe in people taking personal, you know, ownership of their own risk, you know, profile. Like I think, you know, I'm, I'm a person who doesn't like to gamble. And I have, you know, very, very mildly have engaged with like the markets. I say this as a guy who's running an organization focused on covering the markets. But you know, but as journalists who like, as a journalist who's running an organization focused on covering the markets. But as journalists who like often try to say a very, yeah, I have a very kind of relatively
Starting point is 01:07:30 distant relationship with it. I think I can take stock of my own capabilities and go, well, I would not be a great, I don't want to have to like wager money on the outcome of almost anything. I think people, I think that's fine. I think it's fine. I think it's fine. It doesn't really matter. I think what was a bug about the whole polymarket thing,
Starting point is 01:07:50 I guess, betting on elections or even just, you know, this idea of creating these like markets for elections. The problem is not in the actual essence of the market. It's in the perception, the outward perception that those betting markets are polls, you know? Like that those betting markets are verifiable resources to gauge what is gonna happen. But I would tell you that I think polls are a mistake.
Starting point is 01:08:20 I think seeing polls and talking about polls is kind of BS. And I feel like they're only bad for us as humanity. Like I think polls are to hear that you think it's gonna go this way or that, that's not, it should not be a factor in how you make decisions about who you're voting for. And it shouldn't cloud or you shouldn't have to think about where things are leaning. You should vote based on the information
Starting point is 01:08:42 that you have available, you know? Yeah, I completely agree with you on this one. Like the thing itself, you should, you should vote based on the information that you have available, you know? Yeah, I completely agree with you on this one. Like the, the thing itself, the gambling, like whatever fine, um, gamble on whatever you want, but like the perception that this is like actually means something and that it's like, yeah, people are using it to think that it means something is, is a bug. I think when you see a chart now, you're kind of like, all right, well, what's the difference between like a New York Times poll,
Starting point is 01:09:08 like amalgamation of polls and the polling market, like chart percentages waiting, you know, of waiting. Like one's about percentage of people saying, this is how I'm going to vote. And one's like, this is how much money is going towards this based on like four people from outside the US. So I think that's a huge bug, but how do you solve the bug of like misinformation? It's like very, that's a very complicated cause that's what it is.
Starting point is 01:09:30 It's just like, it's, it's a thread of misinformation that is polluting like social media basically. Very naughty problem. Yeah. That's good. That's good. Little pun there. Excellent word play. All right. Well, I think we've said just about everything there is to say about every topic that exists in the world right now. So, Ronia, you know, I really missed, we had several weeks where we were on podcast and I really missed our conversation.
Starting point is 01:09:55 So I'm glad that you're back. Yeah, me too. And now you can never go away ever again. I'm sorry, but that's what the contract says. Anyhow, that is tomorrow for this week. We'll be back next week with more. And as always, I wish you and your family the very best.
