Lex Fridman Podcast - #218 – Jaron Lanier: Virtual Reality, Social Media & the Future of Humans and AI

Episode Date: September 7, 2021

Jaron Lanier is a computer scientist, composer, artist, author, and founder of the field of virtual reality. Please support this podcast by checking out our sponsors: - Skiff: https://skiff.org/lex to... get early access - Novo: https://banknovo.com/lex - Onnit: https://lexfridman.com/onnit to get up to 10% off - Indeed: https://indeed.com/lex to get $75 credit - Eight Sleep: https://www.eightsleep.com/lex and use code LEX to get special savings EPISODE LINKS: Jaron's Website: http://www.jaronlanier.com/ Jaron's Books: https://amzn.to/3tlhl9T PODCAST INFO: Podcast website: https://lexfridman.com/podcast Apple Podcasts: https://apple.co/2lwqZIr Spotify: https://spoti.fi/2nEwCF8 RSS: https://lexfridman.com/feed/podcast/ YouTube Full Episodes: https://youtube.com/lexfridman YouTube Clips: https://youtube.com/lexclips SUPPORT & CONNECT: - Check out the sponsors above, it's the best way to support this podcast - Support on Patreon: https://www.patreon.com/lexfridman - Twitter: https://twitter.com/lexfridman - Instagram: https://www.instagram.com/lexfridman - LinkedIn: https://www.linkedin.com/in/lexfridman - Facebook: https://www.facebook.com/lexfridman - Medium: https://medium.com/@lexfridman OUTLINE: Here's the timestamps for the episode. On some podcast players you should be able to click the timestamp to jump to that time. (00:00) - Introduction (08:04) - What is reality? (12:17) - Turing machines (13:35) - Simulating our universe (19:49) - Video games and other immersive experiences (23:37) - Death and consciousness (32:08) - Designing human-centric AI (33:42) - Empathy with robots (37:33) - Social media incentives (49:53) - Data dignity (57:26) - Jack Dorsey and Twitter (1:09:10) - Bitcoin and cryptocurrencies (1:13:51) - Government overreach and freedom (1:24:06) - GitHub and TikTok (1:26:16) - The Autodidactic Universe (1:31:07) - Humans and the mystery of music (1:37:17) - Defining moments (1:48:03) - Mortality (1:49:56) - The meaning of life

Transcript
Starting point is 00:00:00 The following is a conversation with Jaron Lanier, a computer scientist, visual artist, philosopher, writer, futurist, musician, and the founder of the field of virtual reality. To support this podcast, please check out our sponsors in the description. As a side note, you may know that Jaron is a staunch critic of social media platforms. He and I agree on many aspects of this, except perhaps I am more optimistic about it being possible to build better platforms and better artificial intelligence systems that put long-term interests and happiness of human beings first. Let me also say a general comment about these conversations.
Starting point is 00:00:40 I try to make sure I prepare well, remove my ego from the picture, and focus on making the other person shine as we try to explore the most beautiful and insightful ideas in their mind. This can be challenging when the ideas that are close to my heart are being criticized. In those cases, I do offer a little pushback, but respectfully, and then move on, trying to have the other person come out looking wiser in the exchange. I think there's no such thing as winning in conversations, nor in life. My goal is to learn and to have fun. I ask that you don't see my approach to these conversations
Starting point is 00:01:16 as weakness. It is not. It is my attempt at showing respect and love for the other person. That said, I also often just do a bad job of talking, but you probably already knew that. So please give me a pass on that as well. As usual, I'll do a few minutes of ads now, no ads in the middle. I try to make this interesting, so hopefully you don't skip, but if you do, please still check out the sponsor links in the description. It is, in fact, the best way to support this podcast. I use their stuff, I enjoy it, maybe you will too. This show is brought to you by SKIF, an end-to-end encrypted and decentralized collaboration
Starting point is 00:01:59 platform built for privacy from the ground up. What Signal is to messaging, Skiff is to document writing and collaboration. It's like Google Docs, but with a lot more security features. From the early days, I happen to be a big user of Google Docs, probably have over a thousand documents on there. I also use Evernote, Notion, Google Keep for various kinds of note-taking. So I'm very picky on the usability front of document writing and collaboration. And that's the magic of Skiff. Not only is it secure, the writing and collaboration experience in it, in my opinion, is better than
Starting point is 00:02:38 Google Docs. On Skiff, only you can decrypt your data. No one, not even Skiff, can ever see it. If you like using Signal, which I do, you'll love using Skiff. They're offering listeners of this podcast early access to their platform. You get to skip their over 60,000-person waitlist. You know all those times we have to wait in line outside of a club just to see those people who are on the list get in without any waiting? That could be you with Skiff. Sign up to Skiff's beta at skiff.org slash lex.
Starting point is 00:03:14 I'm thinking of doing a fun collaborative document with a free-for-all to all that listen to this podcast. What could possibly go wrong? Anyway, go to skiff.org slash lex to sign up for their early access. This show is also brought to you by Bank Novo, which is a business banking app. The process is simple. You sign up and then they will mail you a Novo debit card and you get free ATM use. In my opinion, if there's any industry that needs to be disrupted, revolutionized, it's the old school banking industry, and that's what Bank Novo does.
Starting point is 00:03:51 It's backed with FDIC insurance. There's no hidden fees, unlike traditional banks, no monthly fees or minimum balance requirements, easy to use mobile app. Like I said, you apply in under 10 minutes. Super easy. There's human-powered customer service, free transfers, mailed checks and incoming wires. It integrates with other small business tools like Stripe, Shopify, QuickBooks, and many more. It refunds all ATM fees and there are thousands of dollars in exclusive perks. Go to banknovo.com slash Lex to sign up for free. That's banknovo.com slash Lex. This episode is also brought to you by Onnit,
Starting point is 00:04:32 a nutrition, supplement and fitness company. They make Alpha Brain, which is a nootropic that helps support memory, mental speed and focus. I really enjoy using Alpha Brain as a boost when thinking through a difficult problem. So if I anticipate there's a deep work session where I'm going to spend two, three hours on a particularly difficult problem, especially in front of a sheet of paper, where I'm designing some aspect of a system or some aspect of a process, that's when I take an Alpha Brain. It's kind of this catalyst for really getting in the zone, really focusing. It clears the mind, helps me maintain focus, get to a focused
Starting point is 00:05:11 place and maintain it. Anyway, go to lexfridman.com slash onnit to get up to 10% off Alpha Brain. That's lexfridman.com slash onnit. This episode is brought to you by Indeed, a hiring website. I've used them as part of many hiring efforts I've done for the teams I've led in the past. They have tools like Indeed Instant Match, giving you quality candidates whose resumes on Indeed fit your job description immediately. I think work is a source of meaning and happiness for many people, maybe even most people, maybe even all people. So if that's the case, when you're in a position of selecting the team with whom you work, it really pays off to invest a lot of effort and use the best tools to select that team.
Starting point is 00:06:03 In terms of tools, Indeed should definitely be a service you consider, especially for that initial candidate selection. They help with the whole process, but the initial candidate selection is just stellar. Anyway, right now, get a free $75 sponsored job credit to upgrade your job post at indeed.com slash Lex. Get it at indeed.com slash Lex. This offer is valid through September 30th; terms and conditions apply. Join three million businesses that use Indeed by going to
Starting point is 00:06:33 indeed.com slash Lex. This episode is also brought to you by the awesome, the amazing, the comfortable Eight Sleep and its Pod Pro mattress. It controls temperature with an app, it's packed with sensors, it can cool down to as low as 55 degrees on each side of the bed separately. It is so damn hot outside right now in Austin, Texas. I cannot tell you how pleasant it is to lay down
Starting point is 00:07:01 in a cool bed. Of course, I have air conditioning too, but there's something really nice about having a cold bed, a relatively cold room with a warm blanket is just heaven. It makes me look forward to the naps, to the power nap, it makes me look forward to sleep. They have a pod pro cover,
Starting point is 00:07:20 so you can just add that to your mattress without having to buy theirs, but I have theirs and it's pretty nice. The thing can track a bunch of metrics like heart rate variability, but cooling alone is honestly worth the money. Go to eightsleep.com slash lex to get special savings. That's eightsleep.com slash lex. This is the Lex Fridman Podcast, and here is my conversation with Jaron Lanier. You're considered the founding father of virtual reality. Do you think we will one day spend most or all of our lives in virtual reality worlds? I have always found the very most valuable moment in virtual reality to be the moment when
Starting point is 00:08:23 you take off the headset and your senses are refreshed and you perceive physicality afresh, you know, as if you were a newborn baby, but with a little more experience. You can really notice just how incredibly strange and delicate and peculiar and impossible the real world is. So the magic is, and perhaps forever will be, in the physical world? Well, that's my take on it. That's just me. I mean, I think I don't get to tell everybody else how to think or how to experience virtuality. And at this point, there have been multiple generations of younger people who've come along and liberated me from having to worry about these things. But I should say also, even in what I called mixed reality back in the day, and these days it's called augmented reality, but with something like a HoloLens, even then one of my
Starting point is 00:09:21 favorite things is to augment a forest, not because I think the forest needs augmentation, but when you look at the augmentation next to a real tree, the real tree just pops out as being astounding. It's interactive, it's changing slightly all the time if you pay attention, and it's hard to pay attention to that, but when you compare it to the virtuality, all of a sudden you do. And even in practical applications, my favorite early application of virtuality, which we prototyped going back to the 80s when I was working with Dr. Joe Rosen at Stanford Med near where we are now. We made the first surgical simulator. And to go from the fake anatomy of the simulation,
Starting point is 00:10:05 which is incredibly valuable for many things, for designing procedures, for training, for all kinds of things, then to go to the real person, boy, it's really something like surgeons really get woken up by that transition. It's very cool. So I think the transition is actually more valuable than the simulation.
Starting point is 00:10:23 That's fascinating. I never really thought about that. It's almost, it's like traveling elsewhere in the physical space can help you appreciate how much you value your home once you return. Well, that's how I take it. I mean, once again, people have different attitudes towards it. All are welcome. What do you think is the difference between the virtual world and the physical meatspace world, such that you personally are still drawn to the physical world? Like there's clearly then a distinction.
Starting point is 00:10:56 Is there some fundamental distinction or is it the peculiarities of the current set of technology? In terms of the kind of virtuality that we have now, it's made of software, and software is terrible stuff. Software is always the slave of its own history, its own legacy. It's always infinitely messy and arbitrary. Working with it brings out a certain kind of nerdy personality in people, or at least in me, which I'm not that fond of. And there are all kinds of things about software that I don't like.
Starting point is 00:11:34 And so that's different from the physical world. It's not something we understand as you just pointed out. On the other hand, you know, I'm a little mystified when people ask me, well, do you think the universe is a computer? And I have to say, well, I mean, what on earth could you possibly mean if you say it isn't a computer? If it isn't a computer, it wouldn't follow principles consistently and it wouldn't be intelligible,
Starting point is 00:12:00 because what else is a computer ultimately? I mean, and we have physics. We have technology, you know, so we can do technology, so we can program it. So I mean, of course, it's some kind of computer, but I think trying to understand it as a Turing machine is probably a foolish approach. Right, that's the question.
Starting point is 00:12:19 Whether this computer, we call it the universe, performs the kind of computation that can be modeled as a universal Turing machine, or is it something much more fancy? So fancy, in fact, that it may be beyond our cognitive capability to understand. Turing machines are kind of, I call them teases in a way, because if you have an infinitely smart programmer with an infinite amount of time, an infinite amount of memory, and an infinite clock speed, then
Starting point is 00:12:53 they're universal. But that cannot exist. So they're not universal in practice. In practice they're actually a very particular sort of machine within, you know, the constraints, within the conservation principles, of any reality that's worth being in, probably. So I think universality of a particular model is probably a deceptive way to think. Even though it's some sort of limit, of course, something like that's got to be true at some sort of high enough limit, but it's just not accessible to us, so what's the point?
Starting point is 00:13:35 Well, to me, the question of whether we're living inside a computer or a simulation is interesting in the following way. There's a technical question here. How difficult is it to build a machine, not one that simulates the universe, but one that makes it sufficiently realistic that we wouldn't know the difference, or better yet, sufficiently realistic that we would kind of know the difference, but we would prefer to stay in the virtual world anyway? I want to give you a few different answers. I want to give you the one that I think has the most practical importance to human beings right now, which is that there's kind of an assertion sort of built into the way the question is usually asked that I think is false, which is a suggestion that people have a fixed level of ability to perceive reality in a given way. And actually, people are always learning, evolving, forming themselves. We're fluid too.
Starting point is 00:14:34 We're also programmable, self-programmable, changing, adapting. And so, my favorite way to get at this is to talk about the history of other media. So for instance, there was a peer-reviewed paper that showed that an early wire recorder playing back an opera singer behind a curtain was indistinguishable from a real opera singer. And so now of course to us, it would not only be distinguishable,
Starting point is 00:14:58 but it would be very blatant because the recording would be horrible. But to the people at the time, without the experience of it, it seemed plausible. There was an early demonstration of extremely crude video teleconferencing between New York and DC in the 30s, I think, that people viewed as being absolutely realistic and indistinguishable, which to us would be horrible. And there are many other examples. Another one, one of my favorite ones, is in the Civil War era, there were itinerant photographers
Starting point is 00:15:30 who collected photographs of people who just looked kind of like a few archetypes. So you could buy a photo of somebody who looked kind of like your loved one, to remind you of that person, because actually photographing them was inconceivable and hiring a painter was too expensive, and you didn't have any way for the painter to represent them remotely anyway. How would they even know what they looked like? So these are all great examples of how in the early days of different media, we perceived the media as being really great, but then we evolved through the experience of the media. This gets back to what I was saying. Maybe the greatest gift of photography
Starting point is 00:16:05 is that we can see the flaws in a photograph and appreciate reality more. Maybe the greatest gift of audio recording is that we can distinguish that opera singer now from that recording of the opera singer on the horrible wire recorder. So we shouldn't limit ourselves by some assumption of stasis that's incorrect. So that's the first
Starting point is 00:16:27 thing, that's my first answer, which is I think the most important one. Now, of course, somebody might come back and say, oh, but you know, technology can go so far. There must be some point at which it would surpass. That's a different question. I think that's also an interesting question, but I think the answer I just gave you is actually the more important answer to the more important question. That's profound. Yeah. But the second question, which you're now making me realize is way different: is it possible to create worlds in which people would want to stay
Starting point is 00:16:56 instead of the real world? Well, like en masse, like large numbers of people. What I hope is, you know, as I said before, I hope that the experience of virtual worlds helps people appreciate this physical world we have and feel tender towards it and keep it from getting too fucked up. That's my hope. Do you see all technology in that way? So basically technology helps us appreciate the more sort of technology-free aspect of life. Well, media technology, you know, I mean, you can stretch that. I mean, you can,
Starting point is 00:17:39 let me say, I could definitely play McLuhan and turn this into a general theory. It's totally doable. The program you just described is totally doable. In fact, I will psychically predict that if you did the research, you could find 20 PhD theses that do that already. I don't know, but they might exist. But I don't know how much value there is in pushing a particular idea that far. Claiming that reality isn't a computer in some sense seems incoherent to me, because we can program it, we have technology, it seems
Starting point is 00:18:12 to obey physical laws. What more do you want from it to be a computer? I mean, it's a computer of some kind, we don't know exactly what kind, we might not know how to think about it, we're working on it. But, uh... You're absolutely right. Like that's my fascination with AI as well, is that it helps. In the case of AI, I see it as a set of techniques that help us understand ourselves, understand us humans. And in the same way virtual reality,
Starting point is 00:18:37 and you're putting it brilliantly, it's a way to help us understand reality. Sure, appreciate it and open our eyes more richly to reality. That's certainly how I see it. And I wish people who become incredibly fascinated, who go down the rabbit hole of the different fascinations
Starting point is 00:18:58 with whether we're in a simulation or not or you know that there's a whole world of variations on that. I wish they'd step back and think about their own motivations and exactly what they mean, you know, what, and I think the danger with these things is, so if you say, is the universe some kind of computer broadly, it has to be, because it's not coherent to say that it isn't, on the other hand, to say that that means, you know, anything about what kind of computer, that's something very different. And the same thing is true for the brain, the same thing is true for anything where you
Starting point is 00:19:35 might use computational metaphors. Like, we have to have a bit of modesty about where we stand. And the problem I have with these framings of computation, these ultimate cosmic questions, is that it has a way of getting people to pretend they know more than they do. Can you maybe... this is a therapy session, just for a second. I really like the Elder Scrolls series. It's a role-playing game, Skyrim, for example.
Starting point is 00:20:01 Why do I enjoy so deeply just walking around that world? And there's people you can talk to, and you can just, like, it's an escape, but you know, my life is awesome. I'm truly happy, but I also am happy with the music that's playing and the mountains and carrying around a sword, and just, I don't know what that is. It's very pleasant though to go there and I miss it sometimes. I think it's wonderful to love artistic creations. It's wonderful to love contact with other people. It's wonderful to love play and ongoing, evolving meaning and patterns with other people. I think it's a good thing. You know, I'm not like anti-tech and I'm certainly not anti-digital
Starting point is 00:20:58 tech. I'm anti, as everybody knows by now, I think the, you know, manipulative economy of social media is making everybody nuts and all that. I'm anti that stuff. But the core of it, of course, I worked for many, many years on trying to make that stuff happen, because I think it can be beautiful. Like, why not, you know? And by the way, there's a thing about humans, which is, we're problematic. Any kind of social interaction with other people is going to have its problems. People are political and tricky. And like, I love classical music, but when you actually go to a classical music thing and it turns out, oh, actually, this is like a backroom power deal kind of place and a big status ritual as well. And that's kind of not as fun. That's part of the package. And the thing is, it's always going to be. There's always going to be a mix of things. I don't, I don't
Starting point is 00:22:01 think the search for purity is gonna get you anywhere. So I'm not worried about that. I worry about the really bad cases where we're becoming, where we're making ourselves crazy or cruel enough that we might not survive. And I think, you know, the social media criticism rises to that level, but I'm glad you enjoyed it. I think it's great.
Starting point is 00:22:22 And I like that you basically say that every experience has both beauty and darkness, as in with classical music. I also play classical piano, so I appreciate it very much. But it's interesting. I mean, even the darkest, Man's Search for Meaning with Viktor Frankl in the concentration camps, even there, there's opportunity to discover beauty. Mm-hmm. And so that's the interesting thing about humans, is the capacity
Starting point is 00:22:50 to discover beauty in the darkest moments, but there's always the dark parts, too. Well, I mean, our situation is structurally difficult. We are... no, it is true. We perceive socially. We depend on each other for our sense of place and perception of the world. I mean, we're dependent on each other.
Starting point is 00:23:17 And yet, there's also a degree in which we inevitably let each other down. We are set up to be competitive as well as supportive. I mean, our fundamental situation is complicated and challenging, and I wouldn't have it any other way. Okay, let's talk about one of the most challenging things, one of the things I, unfortunately, am very afraid of, being human, allegedly. You wrote an essay on death and consciousness, in which you write, quote:
Starting point is 00:23:53 Certainly the fear of death has been one of the greatest driving forces in the history of thought and in the formation of the character of civilization. And yet it is under-acknowledged. The great book on the subject, The Denial of Death by Ernest Becker, deserves a reconsideration. I'm Russian, so I have to ask you about this. What's the role of death in life? See, you would have enjoyed coming to our house, because my wife is Russian, and we also have a piano of such spectacular qualities.
Starting point is 00:24:26 You would have freaked out. But anyway, we'll let all that go. So the context in which I remember that essay, this was maybe the 90s or something, and I used to publish in a journal called the Journal of Consciousness Studies, because I was interested in these endless debates about consciousness and science, which certainly continue today.
Starting point is 00:24:56 And I was interested in how the fear of death and the denial of death played into different philosophical approaches to consciousness. Because I think on the one hand, the sort of sentimental school of dualism, meaning the feeling that there's something apart from the physical brain, some kind of soul or something else, is obviously motivated in a sense by a hope that whatever that is will survive death and continue. And that's a very core aspect of a lot of the world religions, not all of them, really, but most of them.
Starting point is 00:25:50 The thing I noticed is that the opposite of those, which might be this sort of hardcore, no, the brain's a computer and that's it, is in a sense motivated in the same way, with a remarkably similar chain of arguments, which is, no, the brain's a computer and I'm going to figure it out in my lifetime and upload it, upload myself, and I'll live forever. That's interesting. Yeah, that's the implied thought, right?
Starting point is 00:26:18 Yeah. And so in a funny way, it's the same thing. It's peculiar to notice that these people who would appear to be opposites in character and cultural references and their ideas actually are remarkably similar. And to an incredible degree, the sort of hardcore computationalist idea about the brain has turned into medieval Christianity. Like, there's the people who are afraid that if you have the wrong thought, you'll piss off the super AIs of the future who will come back and zap you, and all that stuff.
Starting point is 00:27:03 Yeah, it's like, it's really turned into medieval Christianity all over again. So Ernest Becker's idea is that the fear of death is the worm at the core, which is, like, the core motivator of everything we see humans have created. The question is if that fear of mortality is somehow core, like a prerequisite. So you just moved across this vast cultural chasm that separates me from most
Starting point is 00:27:36 of my colleagues, in a way, and I can't answer what you just said on that level without this huge deconstruction. Yes, should I do it? Yes, what's the chasm? Okay. Let us travel across it. When I was young, my main mentor was Marvin Minsky, who's the principal author of the computer-as-creature rhetoric that we still use. He was the first person to have the idea at all, and he certainly populated AI culture with most of its tropes, I would say, because a lot of the time still, people will say, did you hear this new idea about AI? And I'm like, yeah, I heard it in 1978. Sure, yeah, I remember that. So Marvin was really the person.
Starting point is 00:28:28 And Marvin and I used to argue all the time about the stuff, because I always rejected it. And of all of his, I wasn't formally his student, but I worked for him as a researcher, but of all of his students and student-like people of his young adoptees, I think I was the one who argued with him about this stuff in particular, and he loved it. Yeah, I would have loved to hear that conversation. It was fun.
Starting point is 00:28:59 Did you ever converge to a place? Oh, no, no. So the very last time I saw him, he was quite frail and I was in Boston, and I was going to the old house in Brookline, his amazing house. And one of our mutual friends said, listen, Marvin's so frail, don't do the argument with him. Don't argue about AI. You know, and so I said, but Marvin loves that. And so I showed up, and he was frail. He looked up and he said, are you ready to argue? He's such an amazing person. So it's hard to summarize this because it's decades of stuff.
Starting point is 00:29:40 The first thing to say is that nobody can claim absolute knowledge about whether somebody or something else is conscious or not. This is all a matter of faith. And in fact, I think the whole idea of faith needs to be updated, so it's not about God, but it's just about stuff in the universe. We have faith in each other being conscious. And then I used to frame this as a thing called the circle of empathy in my old papers. And then it turned into a thing for the animal rights movement. So I noticed Peter Singer using it, I don't know if it was coincidence or not, but anyway, there's this idea that you draw a circle around yourself, and the stuff inside is more like you, might be conscious, might be deserving of your empathy, of your consideration, and the stuff outside the circle isn't.
Starting point is 00:30:29 And outside the circle might be a rock or, I don't know. And the bounds of that circle are based off faith. Well, your faith in what is and what isn't. The thing about this circle is it can't be pure faith. It's also a pragmatic decision, and this is where things get complicated. If you try to make it too big, you suffer from incompetence. If you say, I don't want to kill a bacteria, I will not brush my teeth, I don't know, like, what do you do? There's a competence question, where you do have to draw the line.
Starting point is 00:31:05 People who make it too small become cruel. People are so clannish and political and so worried about themselves ending up on the bottom of society that they are always ready to gang up on some designated group. And so there's always these people who are trying to shove somebody out of the circle. And so aren't you shoving AI outside the circle? Well, give me a second. All right. So there's a pragmatic consideration here. And the biggest questions are probably fetuses and animals lately, but AI is getting there. Now, with AI, I think, and I've had this discussion so many times, people say, but aren't you afraid if you exclude AI, you'd be cruel to some consciousness?
Starting point is 00:31:51 And then I would say, well, if you include AI, you exclude yourself from being able to be a good engineer or designer. And so you're facing incompetence immediately. So like, I really think we need to subordinate algorithms and be much more skeptical of them. Your intuition, you speak about this brilliantly with social media, how things can go wrong. Isn't it possible to design systems that show compassion,
Starting point is 00:32:20 not to manipulate you, but give you control and make your life better if you so choose to, like grow together with systems the way we grow with dogs and cats, with pets, with significant others, in that way grow to become better people. I don't understand why that's fundamentally not possible. You're saying oftentimes you get into trouble
Starting point is 00:32:42 by thinking you know what's good for people. Well, look, there's this question of what frame we're speaking in. Do you know who Alan Watts was? So Alan Watts once said morality is like gravity, that in some absolute cosmic sense there can't be morality, because at some point it all becomes relative, and who are we anyway? Like, morality is relative to us tiny creatures. But here on Earth, we're with each other. This is our frame and morality is a very real thing. Same with gravity.
Starting point is 00:33:12 At some point, you get into interstellar space and you might not feel much of it, but here we are on Earth. And I think in the same sense, this identification with a frame that's quite remote cannot be separated from a feeling of wanting to feel sort of separate from and superior to other people, or something like that. There's an impulse behind it that I really have to reject. And we're just not competent yet to talk about these kinds of absolutes.
Starting point is 00:33:45 Okay, so I agree with you that a lot of technologists sort of lack this basic respect, understanding and love for humanity. There's a separation there. The thing I'd like to push back on, and it's not that you disagree, but I believe you can create technologies, and you can create a new kind of technologist, engineer, that does build systems that respect humanity, not just respect, but admire humanity, that have empathy for common humans, have compassion. No, no, I think, yeah, I mean, I think musical instruments are a great example of that. Musical instruments are technologies that help people connect in fantastic ways. And that's a great example.
Starting point is 00:34:31 My invention or design during the pandemic period was this thing called Together Mode, where people see themselves seated sort of in a classroom or a theater instead of in squares and it allows them to semi-consciously perform to each other as if they have proper eye contact as if they're paying attention to each other nonverbally and weirdly that turns out to work. And so it promotes empathy so far as I can tell. I hope it is of some use to somebody.
Starting point is 00:35:10 The AI idea isn't really new. I would say it was born with Adam Smith's invisible hand, with this idea that we build this algorithmic thing and it gets a bit beyond us and then we think it must be smarter than us. And the thing about the invisible hand is absolutely everybody has some line they draw where they say, no, no, we're going to take control of this thing. They might have different lines. They might care about different things, but everybody ultimately became a Keynesian, because it just didn't work. It really wasn't that smart. It was sometimes smart and sometimes it failed.
Starting point is 00:35:36 And so people who really, really, really want to believe that the invisible hand is infinitely smart screw up their economies terribly. You have to recognize the economy as a subservient tool. Everybody does when it's to their advantage. They might not when it's not to their advantage. That's kind of an interesting game that happens.
Starting point is 00:35:59 But the thing is, it's just like that with our algorithms. You can have a sort of a Chicago school, you know, economic philosophy about your computers: no, no, my thing's come alive, it's smarter than anything. I think that there is a deep loneliness within all of us. This is what we seek. We seek love from each other. I think AI can help us connect deeper.
Starting point is 00:36:24 Like this is what you criticize social media for. I think there's much better ways of doing social media that doesn't lead to manipulation, but instead leads to deeper connection between humans. At least you're becoming a better human being. And what that requires is some agency on the part of AI to be almost like a therapist. I mean, a companion.
Starting point is 00:36:43 It's not telling you what's right. It's not guiding you as if it's an all-knowing thing. It's just another companion that you can leave at any time, that you have complete transparency and control over. There's a lot of mechanisms you can have that are counter to how current social media operates, that are subservient to humans, or no, that deeply respect human beings and are empathetic to their experience, and all those kinds of things. I think it's possible
Starting point is 00:37:14 to create AI systems like that. And I think, I mean, that's a technical discussion of whether they need to have something that looks more like AI versus algorithms, something that has identity, something that has a personality, all those kinds of things. And you've spoken extensively about how AI systems manipulate you within social networks. And the biggest problem isn't necessarily that social networks present you with advertisements that then get you to buy stuff. That's not the biggest problem. The biggest problem is they then manipulate you. They alter your human nature to get you to buy stuff or to get you to
Starting point is 00:38:08 do whatever the advertiser wants. Maybe you can correct me. Yeah, I don't see it quite that way, but we can work with that as an approximation. Sure. I think the actual thing is even sort of more ridiculous and stupider than that, but that's okay. So my question is, let's not use the word AI, but how do we fix it? Oh, fixing social media. That diverts us into this whole other field, in my view, which is economics, which I always thought was really boring,
Starting point is 00:38:38 but we have no choice but to turn into economists if we want to fix this problem, because it's all about incentives. But I've been around this thing since it started, and I've been in the meetings where the social media companies sell themselves to the people who put the most money into them, which are usually the big advertising holding companies and whatnot.
Starting point is 00:39:01 And there's this idea, that I think is kind of a fiction, and maybe it's even been recognized as that by everybody, that the algorithm will get really good at getting people to buy something. Because I think people have looked at their returns and looked at what happens, and everybody recognizes that it's not exactly right. It's more like a cognitive access blackmail payment at this point. Like, just to be connected, you're paying the money. It's not so much the persuasion algorithms. So Stanford renamed its program, but it used to be called Engage Persuade. The Engage part works. The persuade part is iffy, but the thing is that once people are engaged, in order for you to exist as a business, in order for you to be known at all, you have to put money into that.
Starting point is 00:39:53 It's a giant cognitive access blackmail scheme at this point, because the science behind the persuade part, it's not entirely a failure, but we play make-believe that it works more than it does. That's not where the damage comes from. Honestly, as I've said in my books, I'm not anti-advertising. I actually think advertising can be demeaning
Starting point is 00:40:21 and annoying and banal and ridiculous and take up a lot of our time with stupid stuff. Like, there's a lot of ways to criticize advertising that are accurate, and it can also lie and all kinds of things. However, if I look at the biggest picture, I think advertising, at least as it was understood before social media, helped bring people into modernity in a way that actually did benefit people overall. And you might say, am I contradicting myself, because I was saying you shouldn't manipulate people? Yeah, I probably am here. I mean, I'm not pretending to have this perfect airtight worldview without some contradictions. I think there's a bit of a contradiction there. So, you know, looking at the long arc of history, advertisement has in some parts benefited society
Starting point is 00:41:08 because it funded some efforts that, perhaps... I mean, I think there's a thing where sometimes I think it's actually been of some use. Now, where the damage comes from is a different thing though. Algorithms in social media have to work on feedback loops where they present you with stimulus, and they have to see if you respond to the stimulus.
Starting point is 00:41:33 Now, the problem is that the measurement mechanism for telling if you respond in the engagement feedback loop is very, very crude. It's things like whether you click more or occasionally if you're staring at the screen more, if there's a forward facing camera that's activated, but typically there isn't. So you have this incredibly crude back channel of information. And so it's crude enough that it only catches sort of the more dramatic responses from you,
Starting point is 00:42:00 and those are the fight-or-flight responses. Those are the things where you get scared or pissed off or aggressive or horny. You know, these are these ancient, sort of what are sometimes called the lizard brain circuits or whatever, these fast-response, old, old evolutionary circuits that we have, that are helpful in survival once in a while, but are not us at our best. They're not who we want to be. They're not how we relate to each other.
Starting point is 00:42:28 They're this old business. But so then when you're engaged using those intrinsically, totally aside from whatever the topic is, you start to get incrementally just a little bit more paranoid, xenophobic, aggressive, you know, you get a little stupid and, like, you become a jerk. And it happens slowly. It's not like everybody is instantly transformed,
Starting point is 00:42:50 but it does kind of happen progressively, where people get hooked, kind of get drawn more and more into this pattern of being at their worst. Would you say that people are able to, when they get hooked in this way, look back at themselves from 30 days ago and say, I am less happy with who I am now, or I'm not happy with who I am now versus who I was 30 days ago? Are they able to self-reflect when they take themselves outside of it? Sometimes. I wrote a book suggesting people take a break from their social media to see what happens, and maybe even to... Actually, the title of the book was just the arguments to delete your accounts. Yeah, ten arguments.
Starting point is 00:43:30 Although I always said, I don't know that you should. I can give you the arguments. It's up to you. I was very clear about that. But I don't have a social media account, obviously. And it's not that easy for people to reach me. They have to search out an old fashioned email address on a super crappy, antiquated website.
Starting point is 00:43:48 Like, it's actually a bit... I don't make it easy. And even with that, I get this huge flood of mail from people who say, oh, off my social media I'm doing so much better, I can't believe how bad it was. But the thing is, what's for me a huge flood of mail would be an imperceptible trickle from the perspective of a Facebook, right?
Starting point is 00:44:04 And so I think it's rare for somebody to look at themselves and say, oh boy, I sure screwed myself over. Well, it's a really hard thing to ask of somebody. None of us find that easy, right? Well, the reason I ask this is, is it possible to design social media systems that optimize for some longer-term metric of you being happy with yourself, personal growth? I don't think you should try to engineer personal growth or happiness.
Starting point is 00:44:33 I think what you should do is design a system that's just respectful of the people and subordinates itself to the people and doesn't have perverse incentives. And then at least there's a chance for something decent happening. But you have to recommend stuff, right? So you're saying like, be respectful. What does that actually mean, engineering-wise? Like, people. Yeah, curation. People have to be responsible. Algorithms shouldn't be recommending. Algorithms don't understand enough to recommend. Algorithms are crap in this era. I mean, I'm sorry, they are. And I'm not saying this as somebody who is a critic from the outside. I'm in the middle of it. I know what they can do.
Starting point is 00:45:05 I know the math. I know what the corpora are. I know the best ones. Our office is funding GPT-3 and all these things that are at the edge of what's possible. And they do not have it yet. I mean, it still is statistical, emergent pseudo-semantics. It doesn't actually have
Starting point is 00:45:27 the representation emerging of anything. It's just not, like, I mean, I'm speaking the truth here and you know it. Well, let me push back on this. There's several truths here. So what you're speaking to is the way certain companies operate currently. I don't think it's outside the realm
Starting point is 00:45:43 of what's technically feasible to do. There's just no incentive, like, companies have no reason to fix this thing. I am aware that, for example, the YouTube search and discovery has been very helpful to me. There's so many videos that it's nice to have a little bit of help. Have you done that? But I'm still in control. Let me ask you something. Have you done the experiment of letting YouTube recommend videos to you, either starting
Starting point is 00:46:11 from an absolutely anonymous random place where it doesn't know who you are, or from knowing who you or somebody else is, and then going 15 or 20 hops? Have you ever done that, just let it take the top recommended video and then just go 20 hops? I've done that many times. Now, because of how large YouTube is and how widely it's used, it's very hard to get to enough scale to get a statistically solid result on this. I've done it with high school kids, with dozens of kids doing it at a time. Every time I've done an experiment, the majority
Starting point is 00:46:45 of times, after about 17 or 18 hops, you end up in really weird, paranoid, bizarre territory. Because ultimately, that is the stuff the algorithm rewards the most, because of the feedback crudeness I was just talking about. So I'm not saying that the thing never recommends something cool. I'm saying that its fundamental core is one that promotes a paranoid style, that promotes increasing irritability, that promotes xenophobia, promotes fear, anger, promotes selfishness, promotes separation between people. And I would encourage... the thing is, it's very hard to do this work solidly. Many have repeated this experiment and yet it still
Starting point is 00:47:25 is kind of anecdotal. I'd like to do like a large, you know, citizen science thing sometime and do it. But then I think the problem with that is YouTube would detect it and then change it. Yes. I would definitely love that kind of stuff. And Twitter, so, Jack Dorsey has spoken about doing healthy conversations on Twitter, or optimizing for healthy conversations. What that requires within Twitter, or most likely citizen experiments, is figuring out what healthy conversations actually look like, and how do you incentivize those healthy conversations? You're describing what often happens,
Starting point is 00:47:57 and what is currently happening, what I'd like to argue is it's possible to strive for healthy conversations, not in a dogmatic way of saying, I know what healthy conversations are, and I will tell you. I think one way to do this is to try to look around at social, maybe not things that are officially social media, but things where people are together online and see which ones have more healthy conversations. Even if it's hard to be completely objective in that measurement,
Starting point is 00:48:25 you can kind of, at least crudely, do some subjective approximation of this, like have a lot of it crowdsourced. Yeah, one that I've been really interested in is GitHub, because, it could change, I'm not saying it'll always be, but for the most part, GitHub has had a relatively quite low poison quotient. And I think there's a few things about GitHub that are interesting. One thing about it is that people have a stake in it. It's not just empty status games. There's actual code, there's actual stuff being done.
Starting point is 00:48:59 And I think as soon as you have a real world stake in something, you have a motivation to not screw up that thing. And I think that that's often missing, that there's no incentive for the person to really preserve something, if they get a little bit of attention from dumping on somebody's TikTok or something, that they don't pay any price for it. But you have to kind of get decent with people
Starting point is 00:49:25 when you have a shared stake, you know. Little secret, GitHub does a bit of that. GitHub is wonderful, yes. But I'm tempted to play the Jaron back at you, which is that GitHub is currently amazing, but the thing is, if you have a stake, then if it's a social media platform, they can use the fact that you have a stake to manipulate you, because you want to preserve the stake. Right. Well, this is why this gets us into the economics. So there's this thing called data dignity. Yes, I've been studying it for a long time. I wrote a book about an earlier version
Starting point is 00:49:59 of it called Who Owns the Future. And the basic idea of it, once again, this is a 30-year conversation. It's a fascinating topic. Let me do the fastest version of this I can do. The fastest way I know how to do this is to compare two futures. All right. So future one is the normative one, the one we're building right now, and future two is going to be data dignity, okay? And I'm gonna use a particular population. I live on the hill in Berkeley, and one of the features about the hill is that as the climate changes, we might burn down and all lose our houses or die or something.
Starting point is 00:50:36 Like it's dangerous, you know, and it didn't use to be. And so who keeps us alive? Well, the city does, the city does some things. The electric company kind of sort of, maybe hopefully better. Individual people who own property, take care of their property, that's all nice, but there's this other middle layer,
Starting point is 00:50:53 which is fascinating to me, which is that the groundskeepers who work up and down that hill, many of whom are not legally here, many of whom don't speak English, cooperate with each other to make sure trees don't touch and transfer fire easily from lot to lot. They have this whole little web that's keeping us safe.
Starting point is 00:51:13 I didn't know about this at first. I just started talking to them because they were out there during the pandemic. And so I tried to just see who these people are, who are these people who are keeping us alive. Now, I wanna talk about the two different futures for those people, future one and future two. Future one: some weird kindergarten paint job van
Starting point is 00:51:35 with all these cameras and whatnot drives up, observes what the gardeners and groundskeepers are doing. A few years later, some amazing robots that can shimmy up trees and all this show up, all those people are out of work, and there are these robots doing the thing, and the robots are good and they can scale to more land and they're actually good. But then there are these people out of work. The problem with that solution is, every time in history that you've had some centralized thing that doles out the benefits, that thing gets seized by people, because it's too centralized and it gets seized. This happened to every communist experiment I can find. So I think that turns into a poor future that will be destabilized. I don't think people will feel
Starting point is 00:52:23 good in it. I think it'll be a political disaster with a sequence of people seizing this central source of the basic income. And you'll say, oh no, an algorithm can do it. Then people will seize the algorithm. They'll seize control. Unless the algorithm is decentralized and it's impossible to seize the control.
Starting point is 00:52:40 Yeah, but 60-something people own a quarter of all the Bitcoin. Like, the things that we think are decentralized are not decentralized. So let's go to future two. Future two, the gardeners see that van with all the cameras and the kindergarten paint job, and they say, hey, the robots are coming. We're going to form a data union. And amazingly, California has a little baby data union law emerging, on the books. Yes. Interesting. And so what they say is, we're going to form a data union, and not only are we going to sell our data to this place,
Starting point is 00:53:20 but we're going to make it better than it would have been if they were just grabbing it without our cooperation. And we're going to improve it. We're going to make the robots more effective. We're going to make them better, and we're going to be proud of it. We're going to become a new class of experts that are respected. And then here's the interesting thing: there's two things that are different about that world from future one. One thing, of course, is the people have more pride, they have more sense of ownership, of agency, but also what the robots do changes. Instead of just this functional, like, we'll figure out how to keep the neighborhood
Starting point is 00:53:56 from burning down, you have this whole creative community that wasn't there before thinking, well, how can we make these robots better so we can keep on earning money? There'll be waves of creative groundskeeping, with spiral pumpkin patches and waves of cultural things. There'll be new ideas like, wow, I wonder if we could do something about climate change mitigation with how we do this. What about fresh water? Can we make the food healthier? All of a sudden, there'll be this whole creative community on the case. And isn't it nicer to have a high-tech future with more creative classes than one with more dependent classes? Isn't that a better future? But, but, but:
Starting point is 00:54:35 But but but but. Future one and future two have the same robots and the same algorithms. There's no technological difference. There's only a human difference. Yeah. And that's second future two, that's data dignity. The economy that you're, I mean, the game theory here is on the humans and then the technologies, just the tools that enable you know, I mean, I think you can believe in AI and be in future two. I just think it's a little harder. You have to do more contortion, it's possible. In the case of social media, what does a data dignity look like?
Starting point is 00:55:13 Is it people getting paid for their data? Yeah, I think what should happen is in the future, there should be massive data unions for people putting content into the system. And those data unions smooth out the results a little bit. So it's not winner take all, but at the same time, and people have to pay for it too. They have to pay for Facebook the way they pay for Netflix with and allowance for the poor. There has to be a, there has to be a way out too. But the thing is people
Starting point is 00:55:45 do pay for Netflix. It's a going concern. People pay for Xbox and PlayStation. There's enough people to pay for stuff they want this could happen to. It's just that this precedent started that moved it in the right direction. And then what has to happen, the economy's a measuring device. If it's an honest measuring device, the outcomes for people form a normal distribution, a bell curve. And then, so there should be a few people who do really well, a lot of people who do okay. And then we should have an expanding economy, reflecting more and more creativity and expertise flowing through the network. And that expanding economy moves the result just a bit forward.
Starting point is 00:56:24 So more people are getting money out of it than are putting money into it. So it gradually expands the economy and lifts all boats and the society has to support the lower wing of the bell curve too, but not universal basic income. It has to be for the, you know, because if it's an honest, if it's an honest economy, there will be that lower wing, and we have to support those people. There has to be a safety net. But see what I believe, I'm not gonna talk about AI, but I will say that I think there'll be more
Starting point is 00:56:53 and more algorithms that are useful. And so I don't think everybody's gonna be supplying data to grounds keeping robots, nor do I think everybody's gonna make their living with TikTok videos. I think in both cases, there'll be a rather small contingent that do well enough at either of those things, but I think there might be many, many, many, many of those niches that start to evolve, is there more and more algorithms, more and more robots, and it's that large number that will create the economic
Starting point is 00:57:20 potential for a very large part of society to become members of new creative classes. Do you think it's possible to create a social network that can piece with Twitter and Facebook that's large and centralized in this way? Not centralized, they're large. Large. How do you get, all right, so I've got to tell you how to get from where we are to anything kind of in the zone of what I'm talking about is challenging. I know some of the people who run, like I know Jack Dorsey, and I view Jack as somebody who's actually, I think he's really striving and searching and trying to find a way to make it better, but it's kind of like, it's very hard to do it while in flight.
Starting point is 00:58:09 And he's under enormous business pressure too. It's a jack-o'-sit to me is a fascinating study because I think his mind isn't in a lot of good places. He's a good human being, but there's a big titanic ship that's already moving in one direction. It's hard to know what to do with it. I think that's the story of Twitter. I think that's the story of Twitter. I think that's the story of Twitter. One of the things that I observe is that if you just want
Starting point is 00:58:29 to look at the human side, meaning like how are people being changed? How do they feel? What does the culture like? Almost all of the social media platforms that get big have an initial sort of honeymoon period where they're actually kind of sweet and cute. Like if you look at the early years of Twitter, it was really sweet and cute, but also look
Starting point is 00:58:48 at Snap, TikTok, and then what happens is as they scale and the algorithms become more influential instead of just the early people, when it gets, when it gets big enough that it's the algorithm running it, then you start to see the rise of the paranoid style and then they start to get dark. And we've seen that shift in TikTok rather recently. But I feel like that scaling reveals the flaws within the incentives. I feel like I'm torturing you. I'm sorry.
Starting point is 00:59:18 No, it's not torture. No, because I have hope for the world with humans and I have hope for a lot of things that humans create, including technology. And I just, I feel it is possible to create social media platforms that incentivize a lot of different things than the current. I think the current incentivization is around like the dumbest possible thing that was invented like 20 years ago, however long and it just works and so nobody's changing it. I just think that there could be a lot of innovation for more. See you kind of push back this idea that we can't know what long-term growth or happiness is. I if you give control to people to define what their long-term happiness and goals are, then you that
Starting point is 01:00:09 optimization can happen for each of those individual people. You know, well, I mean, imagine a future where probably a lot of people would love to make their living, doing TikTok dance videos, but people recognize generally, that's kind of hard to get into. Nonetheless, dance crews have an experience that's very similar to programmers working together on GitHub, so the future is like a cross between TikTok and GitHub, and they get together,
Starting point is 01:00:40 and they have rights, they're negotiating for returns. They join different artist societies in order to soften the blow of the randomness of who gets the network effect benefit because nobody can know that. And I think an individual person might join a thousand different data unions in the course of their lives, or maybe even 10,000. I don't know, but the point is that we'll have like these very hedge distributed portfolios of different data unions were part of, and some of them might just trickle in a little money for nonsense stuff where we're contributing to health studies or something. But I think people will find their way. They'll find their way to the right GitHub-like community in which they find their value in the context of supplying inputs and data and taste and correctives and all
Starting point is 01:01:34 of this into the algorithms and the robots of the future. And that is a way to resist the is a way to resist the lizard brain-based funding system mechanism. It's an alternate economic system that rewards productivity, creativity, value as perceived by others. It's a genuine market. It's not rolled out from a center. There's not some communist person deciding who's valuable.
Starting point is 01:02:00 It's an actual market. And the money is made by supporting that instead of just grabbing people's attention in the cheapest possible way, which is definitely how you get the lizard brain. Yeah, okay, so we're finally at the agreement. But I just think that, so yeah, I'll tell you what I think, fake social media, I'll tell you what how I think Fix social media. There's a few things. There's a few things
Starting point is 01:02:29 So one I think people should have complete control over their data and transparency of What that data is and how it's being used if they do hand over the control another thing They should be able to delete walk away with their data at any moment easy, like with a single click of a button, maybe two buttons, I don't know, just easily walk away with their data. The other is control of the algorithm, individualized control of the algorithm for them. So each one has their own algorithm. Each person has their own algorithm, they get to be the decider of what they've seen in this world.
Starting point is 01:03:03 And to me, that, I mean, that's, I guess, fundamentally decentralized in terms of the key decisions being made. But if that's made transparent, I feel like people will choose that system over Twitter of today, over Facebook of today, when they have the ability to walk away, to control their data, and to control the kinds of things they see. Now, let's walk away from the term AI. You're right. In this case, you have full control of the algorithms that help you if you want to use their help, but you can also say a few to those algorithms and just consume the raw, beautiful
Starting point is 01:03:40 waterfall of the internet. I think that, to me, that's not only fixed social media, but I think it would make a lot more money. So I would like to challenge the idea, I know you're not presenting that, but the only way to make a ton of money is to operate like Facebook. I think you can make more money by giving people control. Yeah, I mean, I certainly believe that. We're definitely in the territory of wholehearted agreement here.
Starting point is 01:04:11 I do wanna caution against one thing, which is making a feature that benefits programmers versus this idea that people are in control of their data. So years ago, I co-founded an advisory board for the EU with the guy named Giovanni P Paturelli who passed away. It's one of the reasons I wanted to mention it. A remarkable guy who'd been, he was originally a prosecutor who was throwing
Starting point is 01:04:33 Maffio so in jail in Sicily. So he was like this intense guy who was like, I've dealt with death threats Mark Zuckerberg doesn't scare me, whatever. So we worked on this path of saying, let's make it all about transparency and consent. And it was one of the theaters that led to this huge data privacy and protection framework in Europe called the GDPR. And so therefore, we've been able to have empirical feedback on how that goes. And the problem is that
Starting point is 01:05:12 most people actually get stymied by the complexity of that kind of management. They have trouble and reasonably so. I don't. I'm like a techie. I can go in and I can figure out what's going on. But most people really do. And so there's the problem that it differentiates those who kind of have a technical mindset and can go in and have a feeling for how this stuff works. I kind of still want to come back to incentives. And so if the incentive for whoever's, if the commercial incentive is to help the creative people of the future make more money because you get a cut of it, commercial incentive is to help the creative people of the future make more money because you get a cut of it, that's how you grow an economy. Not the programmers. Well, some movie programmers, it's not anti-programmer.
Starting point is 01:05:51 I'm just saying that it's not only programmers, you know. So, I mean, I definitely, so yeah, you have to make sure the incentives are right. I mean, I like control is an interface problem. Do you have to create something that's compelling to everybody, to the creators, to the public? I mean, there's, I don't know, creative comments, like the licensing, there's a bunch of legal speak,
Starting point is 01:06:22 just in general, the whole legal profession. It's nice when you can be simplified in the way that you can truly, you know, simply understand everybody can simply understand the basics. And that's the same way it should be very simple to understand how the data is being used and what data is being used for people. But then yours are arguing that in order for that to happen, you have to have the incentive to like, I mean, a lot of the reason that money works is actually information hiding and information loss. Like one of the things about money is a particular dollar you get
Starting point is 01:06:59 might have passed through your enemy's hands and you don't know it. But also, this is, I mean, this is what Adam Smith, if you wanna give the most charitable interpretation possible to the invisible hand, is what he was saying is that, like there's this whole complicated thing, and not only do you not need to know about it, the truth is you'd never be able to follow it if you tried
Starting point is 01:07:17 and just like let the economic incentives solve for this whole thing, and that in a sense, there were is like a neuron and a neural net. If he'd had that metaphor, he would have used it and let the whole thing settle to a solution and don't worry about it. I think this idea of having incentives that reduce complexity for people can be made to work. And that's an example of an algorithm that could be manipulative or not going back into your question before about can you do it in a way that's not manipulative. And I would say a GitHub like if you just have this vision, GitHub plus TikTok combined, is it possible? I think it is. I'm
Starting point is 01:07:59 not going to be able to on see that idea of creators on TikTok collaborating in the same way that people on GitHub collaborate. I like that kind of version. I like it. I love it. I just like right now when people use by way father of teenage daughter. So it's all about TikTok.
Starting point is 01:08:18 So, you know, when people use TikTok, there's a lot of it's kind of funny. I was going to say cat-in-us, but I was just using the cat as this exemplar of overcoming it. I contradict myself, but anyway, there's all this cat-iness for people who are like, ee, this person is like, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, ee, a better, can we get a better musician? Like, and they do that. But that's the part that's kind of off the books right now. You know, that should be like right there. That should be the center. That's where the, that's the really best part. Well, that's, that's where the invention of Git period, the versioning is brilliant. And so some of the, some of the things you're talking about technology, algorithms, tools,
Starting point is 01:09:04 can empower. And that's, that's the thing for humans to, to, algorithms, tools, can empower. And that's the thing for humans to connect, to collaborate and so on. Can we, uh, can we upset more people a little bit? You're already maybe we'd have to try. No, no, can we, uh, can ask you to elaborate? Cause I, my intuition was that you would be a supporter of something like cryptocurrency and Bitcoin because it is fundamentally emphasizes decentralization. So can you elaborate on what? Yeah.
Starting point is 01:09:31 Okay. Look, your thoughts on Bitcoin. It's kind of funny. I wrote, I've been advocating some kind of digital currency for a long time. And when the, when Bitcoin came out and the original paper on blockchain, my heart kind of sank because I thought, oh my God, we're applying all of this fancy thought and all these very careful distributed security measures
Starting point is 01:10:01 to recreate the gold standard. Like it's just so retro, it's so dysfunctional. It's so useless from an economic point of view. So it's always a minute. And then the other thing is using computational inefficiency at a boundless scale as your form of security is a crime against the atmosphere. Obviously, a lot of people know that now, but we knew that at the start. Like the thing is when the first paper came out, I remember a lot of people saying,
Starting point is 01:10:25 oh my god, this thing scales. It's a carbon disaster, you know? And I just like, I'm just mystified. But that's a different question than when you asked. Can you have a cryptographic currency or at least some kind of digital currency that's of a benefit? And absolutely.
Starting point is 01:10:44 Like I'm, and there are people who are trying to be thoughtful a benefit. And absolutely, like I'm, and there are people who are trying to be thoughtful about this. You should, if you have it, you should interview Vitalik Buterin sometime. Yeah, they're people. They're people. Okay, so like there are people in the community
Starting point is 01:10:54 who are trying to be thoughtful and trying to figure out how to do this better. It has nice properties, all right? So one of the nice properties is that like government centralized, it's hard to control. And then the other one, to fix some of the issues that you're referring to
Starting point is 01:11:06 I'm sort of playing devil's advocate here, you know, there's lightning network, there's ideas how to how you build stuff on top of Bitcoin, similar with gold, that allow you to have this kind of vibrant economy that operates not on the blockchain, but outside the blockchain, and you use this Bitcoin uses Bitcoin for checking the security
Starting point is 01:11:27 of those transactions. So Bitcoin's not new. It's been around for a while. I've been watching it closely. I've not seen one example of it creating economic growth. There was a succession with the idea that government was the problem. That idea that government's the problem,
Starting point is 01:11:43 let's say government earned that wrath honestly, because if you look at some of the things that governments have done in recent decades, it's not a pretty story. Like after a very small number of people in the US government decided to bomb and land mine, Southeast Asia, it's hard to come back and say, oh, government's a great thing. But then the problem is that this resistance to government is basically resistance to politics. It's a way of saying, if I can get rich,
Starting point is 01:12:17 nobody should bother me. It's a way of not having obligations to others. And that ultimately is a very suspect motivation. But does that mean that the impulse that the government should not overreach its power is flawed? Well, I mean, what I want to ask you to do is to replace the word government with politics. Like our politics is people having to deal with each other.
Starting point is 01:12:44 But my theory about freedom is that the only authentic form of freedom is perpetual annoyance. So annoyance means you're actually dealing with people because people are annoying. Perpetual means that annoyance is survivable so it doesn't destroy us all. So if you have perpetual annoyance then you have freedom. And that's politics. That's politics. If you don't have perpetual annoyance, then you have freedom. And that politics sets politics. If you don't have perpetual annoyance, something's gone very wrong.
Starting point is 01:13:09 And you suppress those people. It's the only temporary. It's going to come back and be horrible. You should seek perpetual annoyance. I'll invite you to a Berkeley City Council meeting so you can know what that feels like. What professional. But anyway, so freedom is being the test of freedom is that you're annoyed by other people.
Starting point is 01:13:26 If you're not, you're not free. If you're not, you're trapped in some temporary illusion that's going to fall apart. Now this quest to avoid government is really a quest to avoid that political feeling, but you have to have it. You have to deal with it. It sucks, but that's the human situation. That's the human condition. And this idea that we're going to have this abstract thing that protects us from having
Starting point is 01:13:48 to deal with each other is always an illusion. The idea, and I apologize, I'm overstretched to use the word government, the idea is this should be some punishment from the people when a group, when a bureaucracy, when a set of, when a set of people or a particular leader, like in an authoritarian regime, which more than half the world currently lives under, if you, like if they become, they start, stop representing the people, it stops being like a Berkeley meeting and starts being more like, like a dictatorial kind of situation. And so the point is, it's nice to give people the populace in decentralized way, power
Starting point is 01:14:34 to resist that kind of like government becoming more authoritarian. Yeah, but people see this idea that the problem is always the government being powerful as false. The problem can also be criminal gangs. The problem can also be weird cults. The problem can be abusive, abusive clergy. The problem can be, uh, uh, uh, fall infrastructure that fails. The problem can be, uh, poisoned water. The problem can be poisoned water, the problem can be failed electric grids. The problem can be a crappy education system that makes the whole society less and less able to create value. There are all these other problems that are different from an overbearing government. Like,
Starting point is 01:15:19 you have to keep some sense of perspective and not be obsessed with only one kind of problem because then the others will pop up. But empirically speaking, some problems are bigger than others. So like, some, like, groups of people, like governments or gangs or companies lead to a more US citizen. Yes. Has the government ever really been a problem for you? Well, okay.
Starting point is 01:15:42 So first of all, I grew up in the Soviet Union. And actually, my wife did too. So I have seen, you know, and has the government bothered me, I would say that that's a really complicated question, especially because the United States is such a special place in like a lot of other countries. My wife's family were refused next. And so we have like a very end her dad was sent to the gulag for what it's worth on my father's side. All but a few were killed by a pogrom in post-Soviet pogrom in Ukraine.
Starting point is 01:16:21 So I would say because you did a little trick of eloquent trick of language that you switched to the United States to talk about government. So I am I believe, unlike my friend Michael Malice who's an anarchist, I believe government can do a lot of good in the world. That's exactly what you're saying, which is it's politics. The thing that Bitcoin folks and cryptocurrency folks argue is that one of the big ways that government can control the populace is centralized bank like control the money that was the case in the Soviet Union, too. There's, you know, inflation can really make poor people suffer. And so what they argue is this is one way to go around that power
Starting point is 01:17:09 that government has of controlling the monetary system. So that's a way to resist. That's not actually saying government bad, that's saying some of the ways that central banks get into trouble can be resisted to the sense of us. So let me ask you, on balance today in the real world, in terms of actual facts, do you think cryptocurrencies are doing more to prop up corrupt, murderous, horrible regimes or to resist those regimes? Where do you think the balance is right now? I know exactly having talked to a lot of cryptocurrency folks what they would tell me, right?
Starting point is 01:17:47 It's hard. It's, I don't know. No, I'm asking it as a real question. There's no way to know the answer perfectly. There's no way to know the answer perfectly. However, I gotta say, if you look at people who've been able to decode blockchains, and they do leak a lot of data.
Starting point is 01:18:05 They're not as secure as this widely thought. There are a lot of unknown Bitcoin whales from pretty early, and they're huge. And if you ask, who are these people, there's evidence that a lot of them are quite not the people you'd want to support, let's say. And I just don't, like I think empirically this idea that there's some intrinsic way that bad governments will be, will be disempowered and people will be able to resist them more than new villains or even villainous governments will be empowered. There's no basis for that assertion. It just is kind of circumstantial. And I think in general, Bitcoin ownership is one thing, but Bitcoin transactions have
Starting point is 01:18:57 tended to support criminality more than productivity. Of course, they would argue that was that was the story of its early days that now more and more Bitcoin is being used for legitimate transactions. But that's a different, I didn't say for legitimate transactions. I said for economic growth for creativity. Like I think what's happening is people
Starting point is 01:19:20 are using it a little bit for, for buying, I don't know, maybe some of these companies make it available for this and that, they buy a Tesla with it or something. Yeah. Investing in a startup, hard, it might have happened a little bit, but it's not an engine of productivity, creativity, and economic growth, whereas old fashioned currency still is. And anyway, I'm, look, I think something, I'm, I'm pro the idea of digital currencies. I am anti the idea of economics wiping out politics as a result. I think they have to exist in some balance to avoid the worst
Starting point is 01:20:05 dysfunctions of each. In some ways, there's parallel star discussion of algorithms and cryptocurrency is your pro, the idea, but it can be used to manipulate, you can use, be used poorly by a forementioned humans. Well, I think that you can make better designs and worse designs. And I think, and you know, the thing about cryptocurrency that's so interesting is how many of us are responsible
Starting point is 01:20:36 for the poor designs because we're all so hooked on that Horatio Alger story, unlike I'm going to be the one who gets the viral benefit. You know, way back when all this stuff was starting, I remember it would have been in the 80s. Somebody had the idea of using viral as a metaphor for network effect. And the whole point was to talk about how bad network effect was, that it always created distortions that ruined the usefulness of economic incentives that created dangerous distortions. But then somehow, even after the pandemic,
Starting point is 01:21:10 we think of viral as this good thing, because we imagine ourselves as the virus, right? We want to be on the beneficiary side of it. But of course, you're not likely to be. There is a sense because money is involved. People are not reasoning clearly always, because they want to be, they want to be part of that first viral wave that makes them rich, and that blinds people from their basic morality. I had an interesting conversation. I don't, I sort of feel like I
Starting point is 01:21:39 should respect some people's privacy, but some of the initial people who started Bitcoin, some people's privacy, but some of the initial people who started Bitcoin, I remember having an argument about like, it's intrinsically a Ponzi scheme, like, you know, the early people have more than the later people, and the further down the chain you get, the more you're subject to gambling like dynamics, where it's more and more random and more and more subject to weird network effects and whatnot, unless you're a very small player, perhaps, and you're just buying something. But even then, you'll be subject to fluctuations because the whole thing is just kind of like that as it fluctuates, it's going to wave around the little people more. And I remember the conversation turned to gambling because gambling's a pretty large economic sector. And it's always struck me as being non-productive,
Starting point is 01:22:26 like somebody goes to Las Vegas and they lose money. And so one argument is, well, they got entertainment. They paid for entertainment as they lost money. So that's fine. And Las Vegas does up the losing of money in an entertaining way. So why not? It's like going to a show.
Starting point is 01:22:40 So that's one argument. The argument that was made to me was different from that. Is that, no, what they're doing is they're getting a chance to experience hope. And a lot of people don't get that chance. And so that's really worth it, even if they're going to lose. They have that moment of hope. And they need to be able to experience that. And it's a very interesting argument. That's so heartbreaking because I, well, I see it, but I've seen that way that I have that a little bit of a sense I've thought to some young people who invest in cryptocurrency and what I see is this hope this is the first thing they gave them hope and that's so heartbreaking to me that you got in hope from
Starting point is 01:23:19 the so much is invested but it's like hope from somehow becoming rich as opposed to something to me. I apologize, but money is in the long term not going to be a source of that deep meaning. It's good to have enough money, but it should not be the source of hope. And it's heartbreaking to me how many people is the source of hope. Yeah. Yeah, you've just described the psychology of virality or the psychology of trying to base a civilization on semi-random occurrences of network effect peaks. Yeah. And it doesn't really work.
Starting point is 01:23:57 I mean, I think we need to get away from that. We need to soften those peaks. And except Microsoft, which deserves every penny penny but in every other case. Well you mentioned GitHub. I think what Microsoft did with GitHub was brilliant. I was very, okay, if I can give a, not a critical but on Microsoft because they recently purchased Bethesda, so Elder Scrolls is in their hands. I'm watching you, Microsoft, do not screw up my favorite game. So, yeah.
Starting point is 01:24:29 I'm not speaking for Microsoft. I haven't explicit arrangement with them where I don't speak for them. Obviously, that should be very clear. I do not speak for them. I am not saying, I like them. I think such is amazing. The term data dignity was coined by such a.
Starting point is 01:24:48 Like so, you know, we have, it's kind of extraordinary, but you know, Microsoft is a giant thing. It's gonna screw up this or that, you know, it's not, I don't know. It's kind of interesting. I've had a few occasions in my life to see how things work from the inside of some big thing. And you know, it's always just people kind of, it's, I don't know, there's always like coordination
Starting point is 01:25:11 problem. And there's always a human problems. Oh, God, you know, there's some bad people. It's always, I hope Microsoft doesn't screw up. And I hope they bring clippy back. You should never kill clippy. Bring clippy back. Oh, Clippy, but Clippy promotes the myth of AI. Well, that's why I just fly. How about it? How about it if we, all right. Could we bring back Bob instead of Clippy? Which one was Bob?
Starting point is 01:25:35 Oh, Bob was an everything like. Bob was this other screen character who was supposed to be the voice of AI. Cortana, Cortana with Cortana dude, and Cortana is too corporate. I like it. It's actually fine. There's a woman in Seattle who's like the model for Cortana.
Starting point is 01:25:52 Did Cortana's voice and was there was like, no, the voice is great. We had a vision. We had her as a, she's to walk around and if you were wearing hollow ones for bed, I don't think that's happening anymore. I think I don't think you should turn
Starting point is 01:26:04 a software software into a creature. Well, you. I think I don't think you should turn a software to creature. Well, you and I, get a dog. Get a dog. Yeah. Yeah. A hedgehog. Yeah. You co-authored a paper.
Starting point is 01:26:18 We mentioned Lee Smollin, titled the Autodagdactic Universe, which describes our universe as one that learns its own physical laws. That's a trippy and beautiful and powerful idea. What are, what would you say are the key ideas in this paper? Okay, well I should say that paper reflected work from last year and the project, the program has moved quite a lot. So it's a little, there's a lot of stuff that's not published that I'm quite excited about. So I have to kind of keep my frame in that last year's things.
Starting point is 01:26:55 I have to try to be a little careful about that. We can think about it in a few different ways. The core of the paper, the technical core of it is a triple correspondence. One part of it was already established and then another part is in the process. The part that was established was, of course, understanding different theories of physics as matrix models. The part that was fresher is understanding those as machine learning systems, so that we could move fluidly between these different ways of describing systems. And the reason to want to do that is just to have more tools and more options because, well, theoretical physics is really hard, and a lot of programs have kind of
Starting point is 01:27:43 Well, theoretical physics is really hard and a lot of programs have kind of Run into a state where they feel a little stalled, I guess I can I I want to be delicate about this because I'm not a physicist I'm the computer scientist collaborating so I don't mean to diss anybody's So this is almost like gives a framework for generating new ideas and physics as you as we start to publish more about where it's gone I think you'll start to see there's tools and ways of thinking about theories that I think open up some new paths that will be of interest. There's the technical core of it, which is this idea of a correspondence to give you more facility. But then there's also the storytelling part of it. And this is something Lee loves stories and I do. And the idea here is
Starting point is 01:28:33 that a typical way of thinking about physics is that there's some kind of starting condition, and then there's some principle by which the starting condition evolves. And the question is like why the starting condition, like how the starting condition has to get kind of purr, there's this has to be fine tuned and all these things about it have to be kind of perfect. And so we were thinking, well, look, what if we could push the storytelling about where the universe comes from much further back by starting with really simple things that evolve and then through that evolution explain how things got to be how they are through very simple
Starting point is 01:29:14 principles, right? And so we've been exploring a variety of ways to push the start of the storytelling further and further back, which, and it's an interesting, it's really kind of interesting because for all of his Lee is sometimes considered to be, to have a radical quality in the physics world. But he still is like, no, this is going to be like the kind of time we're talking about in which evolution happens is the same time we're now and we're talking about something that starts and continues. And I'm like, well, what if there's some other kind of time that's time like and it sounds
Starting point is 01:29:54 like metaphysics, but there's an ambiguity, you know, like it has to start from something. And it's kind of interesting. So there's this a lot of the math can be thought of either way, which's kind of interesting. So there's this, a lot of the math can be thought of either way, which is kind of interesting. So I pushed this so far back that basically all the things we take for granted and physics that becoming an emergent, emergent. I really want to emphasize this is all super baby steps.
Starting point is 01:30:18 I don't want to over claim. It's like, I think a lot of the things we're doing, we're approaching some old problems in a pretty fresh way informed. There's been a zillion papers about how you can think of the universe as a big neural net or how you can think of different ideas and physics as being quite similar to, or even equivalent to some of the ideas in machine learning.
Starting point is 01:30:40 And that actually works out crazy well. Like, I mean, that is actually kind of eerie when you look at it. Like, there's probably two or three dozen papers that have this quality and some of them are just crazy good. And it's very interesting. What we're trying to do is take those kinds of observations and turn them into an actionable framework
Starting point is 01:31:00 where you can then start to do things with landscapes with theories that you couldn't do before and that sort of thing. So in that context, or maybe beyond, how do you explain us humans? How unlikely are we, this intelligent, sufficient, or is there a lot of others or are we alone in this universe? Yeah. You seem to appreciate humans very much. I've grown fond of us. We're okay. We have the our nice qualities. I like that.
Starting point is 01:31:39 I mean, we're kind of weird. We spread this here on our heads. I don't know. We're sort of weird animals. That's the feature, not a bug, I think, the weirdness. I hope so. I hope so. Um, I, I think if I'm just going to answer you in terms of, um, truth, the first thing I'd say is we're not in a privileged enough position, at least
Starting point is 01:32:05 as yet to really know much about who we are, how we are, what we're really like in the context of something larger, what that context is, like all that stuff, we might learn more in the future, our descendants might learn more, but we don't really know very much, which you can either view as frustrating or charming like that first year of TikTok or something. But I'll- I'll- I'll- I'll-
Starting point is 01:32:31 I'll- I'll- I'll- I'll- I'll- I'll- I'll- I'll- I'll- I'll-
Starting point is 01:32:39 I'll- I'll- I'll- I'll- I'll- I'll- I'll- I'll- I'll- I'll- I'll- I sometimes think that if you are just quiet and you do something that gets you in touch with the way reality happens, and for me it's playing music, sometimes it seems like you can feel a bit of how the universe is, and it feels like there's a lot more going on in it, and there is a lot more life and a lot more stuff happening and a lot more stuff flowing through it. I'm not speaking as a scientist now. This is kind of a more my artist side talking. And it's, uh, uh, so I feel like I'm suddenly in multiple personalities with you. But, uh, Kerouac, Jack Kerouac said that music is the only truth. What do you,
Starting point is 01:33:22 uh, it sounds like you might be at least in part. There's a there's a passage in Carrewex book, Dr. Sacks, where somebody tries to just explain the whole situation with reality and people and like a paragraph. And I couldn't reproduce it for you here. But it's like, yeah, like there are these boldest things that walk around. And they make these sounds. You can sort of understand them, but only kind of, and then there's like this, and it's just like this amazing, like just really quick, like if some spirit being or something was gonna show up in our reality and had knew nothing about it,
Starting point is 01:33:52 it's like a little basic intro of like, okay, here's what's going on here. An incredible passage. Yeah, yeah. It's like a one or two sudden summary in H.H.I.K. as Guides of the Galaxy, right, of what this, mostly harmless. Mostly harmless. Yeah, do you Haka's guide to the galaxy, right, uh, of what this, uh, mostly harmless, mostly harmless. Yeah. Do you think there's truth to that, that, uh, music somehow connects to something that
Starting point is 01:34:11 words cannot? Yeah. Music is something that just towers above me. I don't, I don't, I don't feel like I have an overview of it. It's just the reverse. I don't, I don't fully understand it. Because on one level, it's simple. Like you can say, oh, it's, it's a thing people evolved to coordinate our brains on a pattern level or a, or something like that. There's all
Starting point is 01:34:36 these things you can say about music, which are, you know, some of that's probably true. It's also, there's kind of like this. This is the mystery of meaning. Like there's a way that just instead of just being pure abstraction music can have like this kind of substantiality to it that is philosophically impossible. I don't know what to do with it. Yeah. The amount of understanding I feel I have when I hear the right song at the right time is not comparable to anything I can read on Wikipedia. Anything I can understand read in language. There's the music that's connected to something.
Starting point is 01:35:24 There's this thing there. Yeah, there's some kind of a thing in it. I've never, I've read across a lot of explanations from all kinds of interesting people, like that is some kind of a flow language between people or between people and how they perceive. And that kind of thing, and that sort of explanation is fine, but it's not quite it either. There's something about music that makes me believe that panpsychism could possibly be true, which is that everything in the universe is conscious. It makes me think, everything in the universe is cautious. It makes me think, it makes me be humble in how much or how little I understand about the functions
Starting point is 01:36:11 of our universe that everything might be cautious. Most people interested in theoretical physics, eventually land in panpsychism, but I'm not one of them. I still think there's this pragmatic imperative to treat people as special. So I will proudly be a dualist without people in cats. Yeah, I'm not quite sure where to draw the line or why the lines there or anything like that, but I don't think I should be required to all the same questions or equally mysterious
Starting point is 01:36:49 for no line. So I don't feel disadvantaged by that. So I shall remain a dualist. But if you listen to anyone trying to explain where consciousness is and a dualistic sense, either believing in souls or something special thing in the brain or something, you pretty much say, screw this, I'm gonna be a pan psychist. So I'm gonna be a pan psychist.
Starting point is 01:37:11 So I'm gonna be a pan psychist. So I'm gonna be a pan psychist. So I'm gonna be a pan psychist. Fair enough, well put. Is there moments in your life that happen that we're defining in the way that you hope others, you're the always the best. Well, listen, I gotta say, that were defining in the way that you hope others your dog. Well, listen, I gotta say the moments that defined me were not the good ones.
Starting point is 01:37:31 The moments that defined me were often horrible. I, I've had successes, you know, but if you ask what defined me, my mother's death, being under the World Trade Center and the attack. The things that have had an effect on me where the most were sort of real world, terrible things, which I don't wish on young people at all. And this is the thing that's hard about giving advice to young people that they have to learn their own lessons and lessons don't come easily. And a world which avoids hard lessons will be a stupid world.
Starting point is 01:38:22 You know, and I don't know what to do with it. That's a little bundle of truth that has a bit of a fatalist equality to it, but I don't, I don't, this is like what I'm saying that, you know, freedom equals eternal annoyance, like you can't, like, there's a degree to which honest advice is not that pleasant to give. advice is not that pleasant to give. And I don't want young people to have to know about everything. I think you don't want to wish hardship on them. Yeah, I think they deserve to have a little grace period of naivety that's pleasant. I mean, I do, you know, if it's possible, if it's, these things are, this is like, this is tricky stuff. I mean, if you, if you, okay, so let me try a little bit on this advice thing. I think one thing, and any serious, broad advice will have been given a thousand times before for a thousand years. So I'm not going to claim originality. But I think trying to find a way to really pay attention to what you're feeling
Starting point is 01:39:36 fundamentally, what your sense of the world is, what your intuition is, if you feel like an intuitive person, what you're... Like to try to escape the constant sway of social perception or manipulation, whatever you wish, not to escape it entirely. That would be horrible. But to find cover from it once in a while, to find a sense of being anchored in that, to believe in experience as a real thing, believing in experience as a real thing is very dualistic. That goes that goes with my philosophy of dualism. I believe there's something magical and I instead of squirting the magic dust on the programs, I think experience is something real and something apart and something mystical and something. Your own personal internal experience that you just have. And then you're saying,
Starting point is 01:40:30 sounds the rest of the world enough to hear that, like whatever that. Find magic justice. Yeah, find, find with what, what is there. And I think that's what, that's one thing. Another thing is to recognize that kindness requires genius, that it's actually really hard. That facile kindness is not kindness, and that it'll take you a wall to have the skills, to have kind impulse, that's what it be kind, you can have right away. To be effectively kind is hard. you can have right away. To be effectively kind is hard. To be effectively kind, yes. It takes skill. It takes hard lessons.
Starting point is 01:41:14 It's, you'll never be perfect at it. To the degree you get anywhere with it, it's the most rewarding thing ever. Let's see what else would I say. that it's the most rewarding thing ever. Let's see what else would I say. I would say when you're young, you can be very overwhelmed by social and interpersonal emotions. You'll have broken hearts and jealousies, you'll feel socially down the
Starting point is 01:41:48 ladder instead of up the ladder. It feels horrible when that happens. All of these things, and you have to remember what a fragile crest all that stuff is, and it's hard because right when it's happening, it's just so intense. And if I was actually giving this advice to my daughter, she'd already be out of the room. So I'm just, this is for some like hypothetical teenager that doesn't really exist that really wants to sit and listen to my music for your daughter in a 10 years from now. Maybe. Can I ask you a difficult question? Yeah, sure. You talked about losing your mom. Yeah. Do you do you miss her? Yeah, I mean, I still connect her through music. She was a, I mean, I still connected her through music. She was a young prodigy piano player in Vienna and she survived the concentration camp and then died in a car accident here in the US.
Starting point is 01:42:58 What music makes you think of her? Is there a song? Well, she was in Vienna, so she had the whole Viennese music thing going, which is this incredible school of absolute skill and romance bundled together and wonderful in the piano, especially I learned to play some of the Beethoven's sonatas for her to play them in this exaggerated drippy way. I remember when I was a kid and exaggerating meaning a dooflil of emotion. Yeah, like just like it's not the only way to play Beethoven. I mean, I didn't know this. That's a reasonable question. I mean, the fashion these days is to be slightly appellonian even with Beethoven, but one imagines that actual Beethoven playing might have been different. I don't know. I've gotten to play a few instruments he played and tried to see if I could feel anything about how it might have been for him. I don't know, really.
Starting point is 01:43:59 I was always against the clinical precision of classical music. I thought a great piano player should be like in pain, like emotionally, like truly feel the music and make it messy sort of, maybe play classical music the way I don't know blues, pianist plays blues. Like it seems like they actually got happier and I'm not sure if Beethoven got happier. I think it's a different, I think it's a different kind of concept of the place of music. I think the blues, the whole-American tradition, was initially surviving awful, awful circumstances. You could say, you know, there were some of that in the concentration camps and all that, too. And it's not that Beethoven circumstances were brilliant, but he kind of also, I don't know, this is hard. Like, I mean, it would seem to be his misery with some what's off imposed maybe through.
Starting point is 01:45:09 I don't know. It's kind of interesting. Like, I've known some people who've loathed Beethoven. Like, the composer, late composer, Pauline Oliverus, wonderful modernist composer, played in her band for a while. And she was like, oh, Beethoven,
Starting point is 01:45:22 like, that's the worst music ever. It's like, all ego, completely, it turns, it turns information in, I mean, it turns emotion into your enemy. And it's, it's ultimately all about your own self importance, which is what has to be at the expense of others. Could what else, what else could it be? And blah, blah, blah. So she had, I shouldn't say, I don't mean to be disnose, but I'm just saying like her position on Beethoven was very negative and very unimpressed, which is really interesting for the man or the music. I think I don't know. I mean, she's not here to speak for herself. So it's a little hard for me to answer that question. I'm, but it was interesting
Starting point is 01:46:00 because I know it's how to Beethoven. I was like, whoa, you know, this is like Beethoven. It is like really the dude, you know, and it's just like, oh, you know, this is like, paid home, it is like really the dude, you know, and it's just like, yeah, you know, paid home and schmatoven, you know, it's like not really happening. Yeah, it's still even though it's cliche. I like playing personally just for myself, Moonlight's an honor.
Starting point is 01:46:15 I mean, I just, Moonlight's amazing. You know, I, you know, you're talking about comparing the blues and that sensibility from Europe is so different in so many ways. I, one of the musicians I play with is John Batista as the band on, on Colbert Show. And he'll, he'll sit there playing jazz and suddenly going to Moonlight. He loves Moonlight. And what's kind of interesting is he's found a way to
Starting point is 01:46:45 debate hoven and he by the way he can really debate hoven like he went through Giulia art and one time he went to my house he's saying hey you have the book of Beethoven's not to say yeah I want to find one I haven't played any site read through the whole damn thing perfectly I'm
Starting point is 01:46:58 like oh god I just get out of here I can't even deal with this but anyway yeah but anyway the thing is, he has this way of with the same persona and the same philosophy moving from the blues into Beethoven. That's really, really fascinating to me. It's like, I don't want to say he plays it as if it were jazz, but he kind of does. It's kind of really, and he talks, he'll, well, he well, he was not raising, he talks like Beethoven's talking to him. Like he's like, oh, yeah, here these do this, I can't do John,
Starting point is 01:47:30 but you know, it's like, it's really, it's really interesting. Like, it's very different. Like for me, I was introduced to Beethoven as like, almost like this god-like figure, figure, and I presume Pauline was too. That was really kind of a press for an art to deal with. And for him, it's just like. It's a conversation, it's having. he's playing James P. Johnson or something. It's like another musician who did
Starting point is 01:47:49 something and they're talking. And it's very cool to be around. It's very kind of freeing. Just see someone have that that that that that relationship. I would love to hear and play Beethoven. That sounds that sounds amazing. He's great. We talked about Ernest Becker and the how much value he puts on our mortality and our denial of our mortality. Do you think about your mortality? Do you think about your own death? You know what's funny is I used to not be able to, but as you get older, you just know people who die and there's all these things, I just becomes familiar and more of a more ordinary, which is what it is. But are you afraid? Sure, although less so. And it's not like I didn't have some kind of insight or revelation to become less afraid. I think I just, like I say, it's kind of familiarity.
Starting point is 01:48:50 It's just knowing people who have died. And I really believe in the future. I have this optimism that people or this whole thing of life on earth, this whole thing we're part of. I don't know where to draw that circle, but this thing is going somewhere and has some kind of value. And you can't both believe in the future and want to live forever.
Starting point is 01:49:17 You have to make room for it. You have to, that optimism has to also come with its own like humility. You have to make yourself small to believe in the future. And so it actually in a funny way comforts me. Wow, that's powerful. And optimism requires you to kind of step down after time. Yeah, I mean, that said life seems kind of short,
Starting point is 01:49:44 but you know, whatever. Do you think there's, I've tried to find, I can't find that said life seems kind of short, but you know, whatever. Do you think there's I've tried to find I can't find the complaint department, you know, I really want to I want to bring this up with the customer service number never answers. Yeah, email bounces one way. So yeah. Do you think there's meaning to it to life? Ah, we'll see, um, meanings of funny word. Like we sell these things as if we know what they mean, but meaning we don't know what we mean when we say all these things as if we know what they mean, but meaning, we don't know what we mean when we say meaning. Like we obviously do not.
Starting point is 01:50:09 And it's a funny little mystical thing. I think it ultimately connects to that sense of experience that dualists tend to believe in. Like is there a why? Like if you look up to the stars and you experience that awe, inspiring, like if you look up to the stars and you experience that awe, inspiring, like joy, whatever, when you look up to the stars, I don't know, like for me, that kind of makes me feel joyful, maybe a little bit of melancholy, just some weird
Starting point is 01:50:37 super feelings. And ultimately, the question is like, why are we here in this vast universe? That question, why? Have you been able in some way, maybe through music, answer it for yourself? My impulse is to feel like it's not quite the right question to ask, but I feel like going down that path is just too tedious for the moment, and I don't want to do it. But the wrong question. Well, just because, you know, I don't know what meaning is. And I think, I do know that sense of awe.
Starting point is 01:51:26 I grew up in Southern New Mexico and the stars were so vivid. I've had some weird misfortunes, but I've had some weird luck. Also, one of our near neighbors was the head of optics research at at White Sands and when he was young he discovered Pluto, his name was Clyde Tombow. And he taught me how to make telescopes as grinding mirrors and stuff.
Starting point is 01:51:53 And my dad had also made telescopes when he was a kid, but Clyde had like, backyard telescopes that would put the shame a lot of it. I mean, he really, he did his telescopes, you know, and so I remember he'd let me go and play with them and just like looking at a globular cluster and you're seeing the actual photons and with a good telescope, it's really like this object like you can really tell this isn't coming through some intervening information structure. This is like the actual photons and it's really a three dimensional object. And you have even a feeling for the vastness of it. And it's, it's, it's, I don't know.
Starting point is 01:52:31 So I definitely, I was very, very fortunate to have a connection to this guy that way when I was a kid to have had that experience again, the experience. I, um, it's kind of funny like I, I feel like sometimes like I've taken, um, when she was younger, I took my daughter and her friends to, to like a telescope. There are a few around here that are actually kids can go and use and they would like look at Jupiter's moons or something. I think like Galilean moons. And I, I don't know if they quite had that
Starting point is 01:53:06 because it's like two, it's been just two normalized. And I think maybe when I was screwing up screens weren't that common yet. And maybe it's like too confusingable with a screen. I don't know. You know, somebody brought up in conversation to me somewhere. I don't remember who they kind of positive this idea that if humans,
Starting point is 01:53:31 early humans weren't able to see the stars like if Earth atmosphere was such there was cloudy that we would not develop human civilization. There's something about being able to look up and see a vast universe is like that's fundamental to the development of human civilization. I thought that was a curious kind of thought. That reminds me of that old Isaac Asimov story where the cl- you know, there's this planet where they finally get to see what's in the sky when Seno Alana turns out there in the middle of a globular cluster and I'll just start. I forget what happens exactly. That's from when I was the same age as a kid.
Starting point is 01:54:06 I don't really remember. But yeah, I don't know. It might be right. I'm just thinking of all the civilizations that grew up under clouds. I mean, like the Vikings needed a special, diffracting piece of my could and navigate, because they could never see the sun.
Starting point is 01:54:24 They had the sun called a sun stone that they found from this one cave. You know about that? So they were in this like, they were trying to navigate both, you know, in the North Atlantic with, with having to see the sun because it was cloudy and so they, they used a chunk of myka to diffract it
Starting point is 01:54:44 in order to be able to align where the sun really was because it couldn't tell by eye. And navigate. So I'm just saying there are a lot of civilizations that are pretty impressive that have to deal with a lot of clouds. Yeah, I'm the Sonyans invented our agriculture and they were probably under clouds a lot. I don't know. I don't know. To me personally, the question of the meaning of life becomes most vibrant, most apparent when you look up at the stars because it makes me feel very small that we're not small. But then you ask, it still feels that we're special. And then the natural question is like, well, if we are specials, I think we are.
Starting point is 01:55:30 Why the heck are we here in this vast universe? That ultimately is the question of, right, well, the meaning of life. I mean, look, there's a confusion sometimes in trying to use, to set up a question or thought experiment or something that's defined in terms of a context to explain something where there is no larger context, and that's a category error. If we want to do it in physics, or in computer science,
Starting point is 01:56:06 it's hard to talk about the universe as a terrain machine because a terrain machine has an external clock and an observer and input and output. There's a larger context implied in order for it to be defined at all. And so if you're talking about the universe, you can't talk about it coherently as a terrain machine. Quantum mechanics is like that.
Starting point is 01:56:23 Quantum mechanics has an external clock and has some kind of external context depending on your interpretation that's either the observer or whatever. And there's that they're similar that way. So maybe maybe Turing machines and and quantum mechanics can be better friends or something because they have a similar setup. But the thing is if you have something that's defined in terms of an outer context, you can't talk about ultimates with it because obviously it's not suited for that. So there's some ideas that are their own context, general relativity, is it's own context? It's different. It's hard to unify. And I think the same thing is true when we talk about these types of questions. Like, meaning is in a context. And to talk about ultimate meaning is therefore a category
Starting point is 01:57:17 or it's not a, it's not a resolvable way of thinking. It might be a way of thinking that is experientially or aesthetically valuable because it is awesome in the sense of awe, you know, awe-inspiring, but to try to treat it analytically is not sensible. Maybe that's what music composed here for. but to try to treat it analytically is not sensible. Maybe that's what music composure for. Yeah, maybe. I think music actually does escape any particular context. That's how it feels to me, but I'm not sure about that.
Starting point is 01:57:53 That's once again crazy artist talking, not scientists. Well, you do both masterfully. Jaren, like I said, I'm a big fan of everything you've done, the you as a human being. I appreciate the fun argument we had today that I'm sure continue for 30 years as it did with Mark Midsky. Honestly, I deeply appreciate that you spend
Starting point is 01:58:18 your really valuable time with me today. It was a really great conversation. Thank you so much. Thanks for listening to this conversation with Jaren Lanier. To support this podcast, please check out our sponsors in the description. And now, let me leave you with some words
Starting point is 01:58:33 from Jaron Lanier himself: a real friendship ought to introduce each person to unexpected weirdness in the other. Thank you for listening, and hope to see you next time.
