The Problem With Jon Stewart - Jon Talks Misinformation With Ev Williams and Dr. Joan Donovan

Episode Date: February 10, 2022

Well, Jon has gone and stepped in it again! So this week, along with writers Jay Jurden and Kris Acimovic, Jon is joined by Harvard professor and misinformation expert Dr. Joan Donovan to discuss RoganGate and how to combat misinformation. Later, Jon speaks with Ev Williams, CEO of Medium, who also happens to be the co-founder of Twitter, the website where misinformation goes to superspread alongside Wordle scores.

CREDITS
Hosted by: Jon Stewart
Featuring, in order of appearance: Kris Acimovic, Jay Jurden, Dr. Joan Donovan, Ev Williams, Alexa Loftus
Executive Produced by: Jon Stewart, Brinda Adhikari, James Dixon, Chris McShane, and Richard Plepler
Lead Producer: Sophie Erickson
Producers: Caity Gray, Robby Slowik
Assoc. Producer: Andrea Betanzos
Sound Designer & Audio Engineer: Miguel Carrascal
Senior Digital Producer: Kwame Opam
Digital Coordinator: Norma Hernandez
Supervising Producer: Lorrie Baranek
Head Writer: Kris Acimovic
Elements: Kenneth Hull, Daniella Philipson
Talent: Brittany Mehmedovic, Haley Denzak
Research: Susan Helvenston, Andy Crystal, Anne Bennett, Deniz Çam, Harjyot Ron Singh
Theme Music by: Gary Clark Jr.
The Problem with Jon Stewart podcast is an Apple TV+ podcast produced by Busboy Productions. https://apple.co/-JonStewart

Transcript
Starting point is 00:00:00 We're ready to, we're ready to do this thing. Should I, do I have to wait or no? We're good. Already rolling. Here we go, here we go. Hello, everybody, and welcome to the Problem with Jon Stewart podcast. We're joining you live, Jay Jurden and Kris Acimovic, where, you know, last week we did a podcast as well.
Starting point is 00:00:37 You know, I don't even know this, we do this every week now. We do. We do do it every week. It's required. Jay Jurden was on the podcast last week. I was. Along with Chelsea Devantez, yes, who was our head writer. And now we have our new head writer, Kris Acimovic.
Starting point is 00:00:53 That's right. That's right. That's right. The podcast last week was so crazy, we had to change head writers. We had to switch. She was like, I'm leaving. I'm done with this. You know what?
Starting point is 00:01:05 I am out of this joint. I am out of here. She heard the fallout and she said, you know what? Back to California. I'm going to miss that woman dearly, but Chris, we're delighted to have you step up. And the reason, Chris, you know, we have, obviously there's a few writers on the show, but Chris, we elevated because of her fierce disciplinarian nature. She is.
Starting point is 00:01:26 Jon, we already have a summer reading from Chris. We already have homework. It's the physical intimidation for me. It's not even the way that she forces. Yeah. Jon's like, if we're going to bring someone up, she's got to be just a huge bitch. And I was there. It was an easy call.
Starting point is 00:01:44 So we're very excited, Kris, and delighted as we move forward. Solid podcast today. We're going to be talking about misinformation, which I'm not sure if it's a problem or not. I haven't heard much about it recently. But first, we're going to talk about last week's episode, which was a big, big pig fuck. Yeah. I'm a professional.
Starting point is 00:02:02 Pig fuck is a great term. Last week, we were talking about Spotify and Joe Rogan and some other things, and I had some comments on it. I thought, well, geez, I'm always someone who prefers engagement. And generally the commentary back from it, I thought was very measured coming my way. We essentially just sat back and waited for it to explode all over. Amongst the fuck you I'm done with you, Stewart, I'll never forgive you for X, Y, Z. You're off the rails, old man, go away.
Starting point is 00:02:47 I thought there were some interesting stuff, if you sifted through it, that was constructive. And I think some people made the point that economic pressure is also just another pressure point that you can apply to misinformation. I think one of the greatest issues that we will face is that nexus of misinformation and disinformation. And how do you deal with it on a corporate level, on a personal level? So I thought to myself, well, geez, what if I get somebody who studied this sort of thing, who is a doctor, who knows something about it.
Starting point is 00:03:25 That's right. And so that is Jay Jurden. And so Jay is a doctor, and no, we are delighted to be joined by Dr. Joan Donovan. Dr., thank you so much for joining us. Thank you. Please tell me, what is your role at the Harvard Kennedy School, Shorenstein Center? Yeah, at the Shorenstein Center, I'm the research director, and I'm also the director of the Technology and Social Change Project.
Starting point is 00:03:58 So I've been researching the internet for about a decade. I've been at Harvard about three years now, and I look at misinformation, disinformation, media manipulation campaigns, and I look at how the internet is a tool, just like any other, for bringing about the world you want to live in. Dr., you are the Rosetta Stone. And we appreciate you being here, because this is the key to the future. So did you have a chance to listen at all to the podcast from last week? I did.
Starting point is 00:04:29 Oh, okay. I did. What in your opinion did you feel like we got right, and what did we get wrong, and was the reaction to it, in your mind, expected, knowing what you know of, how the internet is where nuance goes to die? Yes, and the ivory tower is also where knowledge goes to die. So we are very much, I mean, well, well, the problem is this, which is that, you know, people who are experts, doctors have a really hard time communicating, getting their message
Starting point is 00:05:08 out there. So in this moment, when I was listening to what you were struggling with, there were a couple of key things that I think some definitions might help with, not to sound too pedantic here, but when we talk about misinformation in this field, we really are talking about information that is shared where people don't know its veracity or its accuracy. So Joe Rogan really falls into the misinformation camp of someone who's just asking questions, has some ideas, he wants to hear from a range of different people, but the misinformation is error corrected, right?
Starting point is 00:05:45 Like a good editorial magazine or even a newspaper is going to have a way to do corrections that the next day you're going to hear, hey, you know, we printed this thing and it was totally wrong. But the internet has really facilitated this flow of information where error correction just never happens. And the way in which... How does that differ from disinformation? That's a campaign or an operation where you have people who are purposeful, there is intent
Starting point is 00:06:16 there, and which gets all the lawyers really rankled because they're like, how can you know what's in a man's heart? But we know Rudy Giuliani was really out here to try to, you know, upset the outcome of the election. He said as much and then there's a lot of background information that helps us make sense of it. But disinformation, to put it simply, is sharing inaccurate information, mostly for political ends and sometimes financial ends. And with purpose.
Starting point is 00:06:48 With real purpose and veracity and planning and... But when we talk about like digital disinformation, this is different than when you would say something like, you know, the way in which we handled the question around weapons of mass destruction in the 90s where, you know, politicians really had hoaxed some of the journalists into believing a state-sponsored campaign. Right. Digital disinformation. By the way, I just want to point out very quickly, when you started talking about disinformation
Starting point is 00:07:19 doctor, Jay Jurden disappeared from the program. He's there audio-wise, but all of a sudden his video has gone out, which is a very convenient topic. They don't want us to talk about this, Jon. It's what they don't want us to know. That is exactly my point. I don't want to... It's in that conjuring of the they that is where we start to look around for evidence
Starting point is 00:07:46 of who the they is. Dr. Donovan, the reason this is so funny is because I didn't know that... I didn't know that I disappeared. So you tell me, I might just be a bad actor, but not even know that I'm spreading misinformation. I'm just... Yeah, that's sort of like... But that's the like, my uncle said it on Facebook kind of excuse, right? He's spreading misinformation.
Starting point is 00:08:12 Right, right, right. What changes it for someone like Rogan is it's his brand. Controversy is his brand. Controversial conversations is what Spotify paid $100 million for. And Spotify wants to reject the fact that they're a publisher. And in this moment of the digital revolution that we're going through to call yourself a platform actually means a very specific thing where Spotify wants all the benefits of being called a platform where there's a lot of user generated content which creates a lot of chaos
Starting point is 00:08:47 and opportunities for disinformation and misinformation. But they really knew what they were getting. All of these problems were there when Rogan was primarily using YouTube as a means to gather an audience. So here's where I run into trouble. And here's why I believe... Well, first of all, I know Joe. So I think you always grant more understanding and nuance to people.
Starting point is 00:09:14 You know, because you know them as more three-dimensionally than what that appearance is. And we always demonize those that we maybe feel alien to us. So that goes right in there. I'm already guilty of a bias, right? But the second part of it where I run into trouble is the thing you just said. You talked about they want the benefit, but they don't want the accountability. And you mentioned weapons of mass destruction. And it reminded me like the New York Times, right?
Starting point is 00:09:49 Was a giant purveyor of misinformation and disinformation. I don't know that the Times was purposeful, but misinformation. And that's as vaunted a media organization as you can find. But there was no accountability for them. And I think where I get nervous is in the run-up to the Iraq War and in the prosecution of the Iraq War, I was very vocal and sometimes cursed about that. But the mainstream view, the New York Times was they have weapons of mass destruction. They have these tubes that can only be used for nuclear war.
Starting point is 00:10:44 Saddam Hussein is this, he's that. Couldn't I have gone down and fallen down this if Viacom or Comedy Central had wanted to censor me or had wanted to take me out of the pot? Look, I'm not owed a platform. Nobody is. So it's not a First Amendment issue. It's not any of that. Once you're in bed with a corporation, the deal is they have to sell enough beer and you get to do what you want. But my point is these are shifting sands. And I think I get concerned with, well, who gets to decide what that...
Starting point is 00:11:29 I mean, in the Iraq War, I was on the side of what you would think on the mainstream is misinformation. I was promoting what they would call misinformation. But it turned out to be right years later. And the establishment media was wrong. And not only were they wrong, in some respects, you could make the case that they enabled a war that killed hundreds and hundreds of thousands of people and never paid a price for it and never had accountability. And just having an ombudsman print a retraction, to me, is an accountability. So it's very easy to attack Rogan.
Starting point is 00:12:23 And I'm not saying that that's not your right and that there aren't things there to talk about. But what I'm saying is let's be careful because the sands can shift. Yeah. People are in a new information ecosystem and they're trying their best. But that's the thing about these platforms over the years is that we've asked no accountability from them. They're not built by librarians who are actual stewards of information. And so it's taken us a long time, at least the last decade, to get into the moment where we ask more from these companies. We're asking we want access to the truth. We want it to come up first. We want more public interest contents.
Starting point is 00:13:12 And so someone like Rogan really straddles the line because he reaches so many people and he described it as a juggernaut. It's out of his control now. But really it's not. It is in his control. It's well within Spotify's control. So that's kind of the idea then is because what I was saying is I'm generally more concerned about the algorithm than I am about the individual. Because if the algorithm can earn your trust, it will place you into places that, you know, you assume that there's a gatekeeper. It's sort of like the news when it's in the New York Times, you assume that there is a gatekeeper there that has vetted this. But in reality, our modern media is kind of an information laundering system where where the information comes from gets laundered through the aggregation process or the clickbait process or any of that. And so gossip and rumor become truth and fact become canon very quickly.
Starting point is 00:14:22 Well, this is interesting because if you think about the history of the Internet, the early websites that were really popular, Perez Hilton, right? People were here for gossip and rumor. They weren't here for the truth. And news was actually really slow to get online. It's part of the infrastructure of the Internet itself. So to be in this moment where we're demanding more truth means that these platforms are becoming institutions, right? Like the New York Times in a way where people are asking them to be more accountable to the audiences that they claim to serve. And the Neil Young part of this is really interesting because tied up within it and very little is talked about is a labor dispute about Rogan getting a hundred million dollars and musicians getting a penny for 350 plays. Oh, wow.
Starting point is 00:15:15 And so within Neil Young's twist. So there's also an economic aspect to this that's very different. Exactly. And a lot of it gets brought up, you know, in the moment where Neil Young just is like a catalyst for a bunch of different grievances that have been happening in the background, particularly about Rogan, but then also about Spotify having to stand on its two feet and say, this is what we are. So there's been allegations of racism with Rogan. COVID misinformation, of course, that Neil Young talks about. There's also lots of people that believe Rogan is anti-transgender, right, and has his own opinions about trans people in sports.
Starting point is 00:15:54 But where is the platform at equal volume that allows trans people to counterweight what Rogan is saying, right? And so there's something happening in the information ecosystem where we went from platforms that were supposed to serve people. Don't you think that ended years and years ago? Because I would point to, let's use this as an example. Let's say you're the Simpsons, right? All of the Simpsons. All of the Simpsons. You're Lisa Bart, Homer.
Starting point is 00:16:20 I'm Lisa, though. I'm Lisa. All right. I'm Homer, unfortunately. You can be Homer. Okay. Yeah, I knew what was coming. You're on Fox, right?
Starting point is 00:16:31 The same people that pay you pay Hannity. Where's your responsibility when you say where's the platform of equal voice? That's the fairness doctrine, not for politics, but for social issues, right? But we've never had that. And as an artist, what's your responsibility to the tube that you're in and the company that you're on? And I struggle with that. I don't know what you do with that. Do we now expect the Simpsons to say we will no longer be a part of this company?
Starting point is 00:17:17 Or how far do we go with that? It's a big question right now. And I think we're moving from culture wars to content wars in the sense that the way in which these fragmented, fringy opinions start to bubble up and coalesce online makes it seem like there's a lot more people with the same ideas. There were a lot of people that thought Lady Gaga should have gotten an Oscar nom, and that's not the case. It's not. It's not the case at all. And that's innocent.
Starting point is 00:17:51 Leave Lady Gaga out of this. I just want to point out very quickly, that's at Jay Jurden. I would rather think about it not from the perspective of being an artist or a comedian, but from the perspective of owning a technology company. Well, what do you do? How do you protect and provide safety for your audiences or your customers? Which by and large, the customers are advertisers; the users are like the rows of cabbage that just get harvested, right? Like me and you are inconsequential individually. Did you just refer to me as a row of cabbage?
Starting point is 00:18:32 I mean, you're just being, you know, you're being cultivated. I think just a single cabbage. A single head of cabbage. Yeah, you're just one cabbage. Yeah. Listen, I'm a sociologist and it's an old reference to sociological theory by Weber where it's, you know, like it really depends upon. You don't have to explain to us, Doctor. We don't know.
Starting point is 00:18:52 No, no. We don't know. You know, you know, I knew you would. I didn't know this was a freshman course. The idea here is pretty simple, which is to say that if you're a technology company, there's been all this confusion about what a platform is. A platform can mean a place from which to speak. A platform can mean a political agenda and a platform now can mean a computational infrastructure that delivers content. And so that designation of a platform is something that we're going to argue about because we're going to say is that the responsibility of the person who's speaking to be responsible to these audiences.
Starting point is 00:19:32 Or we're going to say it's the computational infrastructure, the actual technology itself. And what's interesting about the New York Times example is the New York Times would put the burden or the onus of disinformation on the sources and say these are the people who are most responsible. And we've seen very similar things with Facebook where they're saying, well, we don't know what's traveling on this crazy super highway of information. We don't know where it goes. And then you see something like Stop the Steal happen and you're like, if not for your technology, these groups never would have been able to get aligned and meet and plan. And so you are culpable or responsible or accountable for these actions. And that's really why we have to understand these platforms as doing the organization and the coordination of things that happen from the wires to the weeds. And it's really important that we understand that.
Starting point is 00:20:32 So what is our vetting system for this? Is it crowdsourcing? Like, I mean, Wikipedia to some extent does that. But, you know, is the answer the blockchain of information? God, no. No. That would be wrong. I have just failed out of this class.
Starting point is 00:20:50 It's slow. And it's reactionary in the sense that what we need to do is actually stem the flow and the tide of the damage that individuals or small groups of people can do. And this is where the problem of platforming really comes in. Because if you are able to distribute your information to millions and millions of people and you have no responsibility to what the aftermath is, the true costs are then paid by journalists that have to debunk it. The true costs are paid by academics that have to research it. And then down the line, anybody who might be harmed by believing, you know, that you can treat COVID with some kind of bleach or light therapy. And so what's important is that we think about platforming differently in terms of the scale.
Starting point is 00:21:43 And one of the things that I've really advocated for is that we reduce the scale and the speed by which information travels. So as to be able to do what you're suggesting, to have some kind of crowdsourced intervention, or not to let information scale to a huge amount of people before we can actually have any evidence that it's true or false. And this is like misinformation that ends up leading us to a run on, you know, the grocery stores where there's no toilet paper. And we're like, why are we having a toilet paper shortage? And it's because people are reading that, you know, there's going to be a military call to arms. What? And everything's getting shut down. Again, it's happening again.
Starting point is 00:22:34 Get toilet paper, Jon. Get toilet paper. But that's the thing is it has these real world effects. The wires to the weeds is an important point. So in terms of platforming, is engagement, in your mind, a fool's errand? Do you recommend pulling back or do you recommend engagement? Because I still believe in engagement. Like, I've talked...
Starting point is 00:22:59 I mean, damn, I had Donald Rumsfeld on my show. Like how do you learn nuance without engagement? And how do you get understanding without nuance? And I guess that's my fear is that we lose that. Can I say something also about that? Please. There's there's one more thing that also kind of ties into the story and the narrative of that for any other person for any marginalized group. You don't have the privilege or even the space to not engage with people who might not have your best interest at heart.
Starting point is 00:23:36 There was never a time when women, black people or queer people were able to live in America and say, I don't want to engage. That is a new take on interactions and power dynamics, but it is also a very privileged take. And I don't throw that word around all the time because it starts to like lose a lot of its meaning. But I'm a queer black person from Mississippi. Wait, what? Yes. Yes, yes, yes. How did you get past our heart and muscles?
Starting point is 00:24:09 Jon, I want to say Twitter does a lot of bad, but it also does some good. Now, the way that people talk about not engaging with someone is they go, Oh, I just would never even talk to them. And that's amazing. As long as that person isn't your boss, as long as that person doesn't provide you with housing, as long as that person doesn't provide you with an opportunity to make some money. And as long as that person isn't a gatekeeper for you to have access just on day to day things. So I think engagement for people online shouldn't start to mean engagement in general and in total. Because the story of America is a lot of people having to engage with other people who do not have their best interest at heart. It's not a kumbaya moment, and you don't even always have to engage with them in a positive way.
Starting point is 00:25:05 But I think there's an ability of a lot of people to say, Hey, fuck you. That's wrong. And that's still engagement. I think black people saying I don't like this black people telling Joe Rogan. Hey, man, I know you have friends that are black comics who will fucking get in your ass over this new compilation that just popped up on the Internet. That's a level of engagement that you have to discern and you have to be very intentional and specific when you say, Oh, I will never engage. Because women don't have the ability minorities don't have that ability. A lot of people for the longest time haven't had that ability.
Starting point is 00:25:43 Well, that's Jay. I think that's such a great point. Right. For the majority of the world, you're right. Jay, it's not a quit. It's not the privilege of like, I'm taking my ball and going home. This is my life. And you saying fuck you, or saying I don't believe that, or even saying, OK, this is a study you're going to cite,
Starting point is 00:26:04 This is a study I'm going to cite. That's engagement. Yeah. Doctor. Well, you're all right. Everybody gets an ambulance. Hit the air horn. But I think there's, you know, there's a couple of things going on here where engagements.
Starting point is 00:26:23 On the one hand, we're talking about it is this interpersonal relationship, something where you're saying, you know, should I talk to my aunt who's like gone down this rabbit hole. Right. And the familiarness that you're talking about earlier about why and when you're willing to give someone a pass and what you're going to engage on and how you're going to have your boundaries. That's really important. We just came out of four years of having one of the most divisive presidencies political polarization moments in our history, not just because like Trump is who he is, but because everybody was called to a tone. Everybody was called to say something. Everybody was called to have an opinion and social media became the way in which they expressed that. And then Twitter trends and other kinds of technologies and these algorithms really worked on harnessing those and that information and then polarizing it so that you would go one way or another.
Starting point is 00:27:21 And remember that the Internet itself is highly participatory. Facebook is empty shelves. Amazon is an empty warehouse, you know, like YouTube is a blockbuster on a Friday night where you can't get any of the new releases. You know, there are all these places where people fill in the content. Right. So you have to think about it in terms of like, how do you measure this participation where everybody's being called to have an opinion on things that they might not have had an opinion on and would maybe in public conversations say, gee, I don't know which way to come down on that issue seems highly contentious. I don't think that's allowed anymore to not know what side to come down on. It's not.
Starting point is 00:28:09 It's like, but it's as you think about it though and you start to scale up and the aggregation of all of those opinions makes us feel further and further pulled away from each other, which is why we look to influencers to set the kind of terms of the public debate that we're going to have. And so someone like you engaging with someone like Rogan, people are going to argue that he or you are punching up. I'm reminded, of course, of the Crossfire moment where you were just like, can you take this work more seriously, you know, to Tucker Carlson. But even that was an important moment though. But it was much, much misinterpreted because everybody said it was about civility. It had nothing to do with civility. It had everything to do with honesty. I don't care if people yell at each other.
Starting point is 00:28:58 I just want them to be honest. Yeah, you could see a trajectory in his hardening of his position over the years as he became more and more wedded to being an opinion maker, more and more wedded to being someone that has these really outside of the mainstream positions on race. I can see the Newsweek article now. Jon Stewart responsible for Tucker Carlson. It's a villain origin story. My God. It kind of is, you know. But it's to say that media is part of the issue that we have to address.
Starting point is 00:29:36 And would you have talked to Rumsfeld, doctor? I have a brain trust of people that would never, ever let me do a public show with Rumsfeld. Because their jobs are on the line. Because they don't have a Ouija board? Is that one? Last question, doctor. Here's the last question. We have a show and we have a platform.
Starting point is 00:30:01 So what would you suggest for us as measures to guard against, you know, even accidental harm, but still maintain kind of like my belief in engagement, which I think unfortunately I'm going to end up, you know, having that forever? Yeah, I think that's okay. And that's, you know, be an advocate for the truth. What brings us towards clarity is hearing from other people and understanding from other people, but don't get hoaxed. Going through the vetting process and making sure that the person isn't just trying to turn a dime on colloidal silver or whatever supplement of the day. You know, that they're not going to come out here and be like, I got a great new treatment. It's called horse tranquilizers. You won't feel a thing, right?
Starting point is 00:30:55 You know, just do that background research and always try to tell the impact story. And this is something that I tell journalists all the time, which is: platform the people who are harmed by this stuff, platform the people who don't have voices in the debate, or the people who are struggling with how to understand the world around them and what's going to matter. Thank you so much, doctor, for taking the time and really enjoy the conversation. Such a fascinating world. Well, always welcome to clarify things. And if you need me at a moment's notice, I'll be here. Thank you very much for all that. Thank you.
Starting point is 00:31:36 And I'm so glad not to have gotten detention. Well, I'll see you in my office at five. Holy shit. Dr. Donovan was, she was awesome. Yeah, I see why Harvard wants her there. Too funny for Harvard, if you ask me. Too funny. That's a solid point.
Starting point is 00:32:01 I would have pegged her as maybe Brown. Maybe. Pegged. Yeah, yeah. That's some Dartmouth level humor. I liked her. Here's the good news. Apparently now we have earned a third of a doctorate. Now. That is good news.
Starting point is 00:32:17 That was just from that. But fascinating conversation. Jay Jurden, by the way, I thought your point on minorities, people of color and women, like not having the privilege of disengaging, fucking top notch. I just think that we have to look at these things historically if we're going to say, oh, I would never engage because at a certain point you are kind of showing your implicit bias because you have to at least, it sucks. I hate using Twitter words on here, but you have to be able to say,
Starting point is 00:32:49 no, my engagement is saying, hey, fuck you or hey, I don't like this. That is a level of engagement people. You're absolutely right. And it's part of the dialogue. But along the lines of that, I did get a chance to talk to Ev Williams who founded Twitter. You went straight to the source. Straight to the source. Here's why he founded it.
Starting point is 00:33:14 So his name is only two letters. If my name can be two letters, I could probably create a whole platform where people only communicate with like, let's give him 11 letters and then people are like, I think you're going to have to expand it a little bit. Does he know how many people are naked on Twitter now? Is he aware of that transformation? Yeah, what do you mean naked on Twitter? By the way, Jay Jurden, if I can say,
Starting point is 00:33:41 I don't know that there's a technology that has ever been invented for humankind that didn't very quickly find a way to distribute porn. I just want to say, he's got the best porn website out. Ev Williams, thank you. Twitter. Yes. Hey, you guys look in the chat. I'm sending a link.
Starting point is 00:33:58 Okay. Oh, great. That's going to completely corrupt my computer and the whole thing and it's not going to be good. All right, so we're going to talk to Ev Williams, co-founder of Twitter, of Blogger, founder and CEO of Medium. If you've ever communicated through the spoken or typed word, Ev Williams is why. This man can't be stopped. Here we go.
Starting point is 00:34:25 Ev, thanks so much for joining us. Thanks for having me, Jon. Very much my pleasure. You know, you have empowered people to express themselves with these new technologies in a way that was unheard of years ago. In my day, not to date myself, but the only way you could express yourself is sitting at a bar yelling at the TV. What drove you to begin this journey of giving people the tools that they need to express themselves in a personal way? Well, I grew up in the middle of the cornfields in Nebraska. So you were a literal child of the corn.
Starting point is 00:35:12 You were raised in? In Nebraska, in really the middle of the cornfields. And this is probably the other reason I wanted to get into computers, because I was not really agreeable with the farm life and or the football, which is the other important thing in Nebraska. So I think when you refer to it as the football, that would be your, is it obvious? That would be the giveaway. Before the Internet, I did not have a lot of outlets, didn't travel much, didn't really connect with a lot of people.
Starting point is 00:35:45 But I was very, very curious and really wanted to, you know, see the bigger world. So when the Internet came along, I was just like, mind blown. And to me, the cool thing was the idea that you could access knowledge and ideas and put your knowledge and ideas out there. It was sort of that utopian technological vision of the world without gatekeepers and we're all going to be smarter. And this is before I knew that that was the same thing said about radio and cable television and every other media that had come before. But that really captivated me. Did you think we would all vote on it? That that would be the way that we, I guess that's Reddit though.
Starting point is 00:36:27 It would all be upvoting. I just thought it'd be obvious. Oh, right. You make a good point, sir. You know, I agree with you. Let's go with that. Yes. But you start out with that idea of blogger and sort of this, it's, it's essayistic.
Starting point is 00:36:42 It's almost an online diary. And how does that lead to, you know, the concision of Twitter? You know what, let's start with these paragraphs are a little much. Let's let's try a sentence and an emoji. If you look at the evolution of almost all technology, it's about making things easier. And that wasn't necessarily the conscious thought behind Twitter. We saw it as a new form. We didn't see it as, oh, this is better than blogging or writing at length because it's easier.
Starting point is 00:37:18 It was just, what would happen if we made it really easy? And, and we made it more real time. Twitter used to say, what are you doing? It wasn't even thought of initially as, make a statement about something. People would write things like eating tacos. It was all very innocent. I don't want to say anything, but I think that's still on there. I think there's a great deal of taco eating still going on on Twitter.
Starting point is 00:37:47 Was there a moment in the evolution of this and where you sort of felt like we've done it. You created this utopian community and democratized communication form. There must be a sense of elation in that. Yeah. Well, I was a 20 something kid trying to do something cool that would make money and whatever. But then I started getting emails and this is back in blogger from women in Iraq who wrote me and said they could never express themselves before and now they were writing. And I was just like, I didn't quite know how to handle that.
Starting point is 00:38:29 But that started to open my mind a bit to this. This may be, this may be kind of important and a really powerful thing. And then with Twitter, I was a little more prepared for that. But again, it was about eating tacos and then, you know, we saw all kinds of crazy things happen. Right. And that was pretty weighty to hear about. And we never thought, aha, we did it. We created the ultimate utopian world.
Starting point is 00:39:00 But I think in those early days, we were very elated by the success. Was there an oh shit moment? Was there a moment in that where you went, this will be perverted by those who don't view it as benevolently as we do. Yeah. Yeah. I think that became pretty apparent a couple years in 2008, 2009, when that stuff was happening. We got a call from the State Department because we had posted about doing some maintenance again because it was barely working. And the US State Department would ask us to reschedule our maintenance because there was going to be a protest happening at that time.
Starting point is 00:39:46 I forget which country. I think that was Iran. I think that was in Tehran. They were cracking down and they were worried that the maintenance was going to prevent people from being able to effectively organize. Right. Right. We were 50 people and we certainly didn't have any policy experts. We didn't have any political expertise.
Starting point is 00:40:07 And mostly we were seeing upside. But I think it was later that we saw more of the downside, and people could say it should have been obvious at that time; hindsight is 20/20. It's interesting to hear your perspective on it because in many respects, it is like you say, it's a bunch of guys in a room trying to figure out how do you get something like this to work? Oh, that'd be neat. What if we call it grunter? No, that's not going to work. What if we call it? You know, I think when it emerges, there's always a sense of grand design and grand intention.
Starting point is 00:40:49 And to hear your stories about it, it does seem a lot more happenstance than design. Yeah. And I don't want to underestimate the design of lots of people who are very thoughtful about the system. Some people have talked about, is the retweet a good thing or a bad thing? And there have been people who say that the retweet is a very bad thing because it makes it too easy to spread disinformation. And I was intimately involved in the design of the retweet, and there was an intent behind that design. Certainly it wasn't to spread misinformation; it was to spread quality information. It was like the idea of Twitter, we thought at the time, it's not a social network.
Starting point is 00:41:35 We kind of opposed that definition, at least I did early on, because I wanted it to be an information network. And I thought the best possible thing is if you get quality information faster and it can spread, and we're going to use the intelligence of the world, going back to that original notion of the internet. It's going to make everybody smarter. It's going to make them more informed about important things. So let's build the most efficient mechanisms possible to spread when something's good. And in fact, that was meant to fight an echo chamber where you're only hearing from your friends. If there's something good that anyone could say, the retweet is this amazing mechanism to broadcast it throughout the network based on its quality because it's an intentional choice. And if your friend says this is important and you want to hear it, that was actually the intent.
Starting point is 00:42:27 And I think I would defend that decision as well. And a lot of users didn't like it because they're like, oh, there's strangers in the timeline. I don't know these people who are in my timeline. I don't like this retweet thing. And we're like, well, that's how it works because Twitter is not just about your friends. Right. We all now know what everybody thinks all the time for better or for worse. But in terms of the more high-minded theoretical communication aspects of it, like this is a way for us to spread information more quickly or it's a way for you to get information that other people think is important that could help expand your worldview.
Starting point is 00:43:08 How much of that is tied to what you thought the business model of this whole thing was? I think business model has a huge effect on how systems are designed, especially once they're growing, especially once they are public companies. I think for Twitter, the equation is more or less, how do we get more usage? More usage means more money, period. Just a matter of what is the most efficient system for generating clicks from the right people? And Facebook and Google are dramatically more efficient at that than anyone else in the world because they have more data and they don't pay anything for content. And so once that happened, then that became the game. They just sucked enormous amounts of money out of both existing advertising money and new money.
Starting point is 00:43:55 It doesn't matter what you say. Quality has no relevance on whether you can make ad money. So it starts to incentivize the lurid or the extreme because that's what's going to draw wider engagement, more engagement, conflict, and excess is going to sell better. I guess it's funny because it just speaks to the high-mindedness of sort of democratized communication versus the reverse engineering of those who wish to weaponize information. And that battle feels like the crux of all of this. I actually worry a little less about the weaponizing, oddly, because I think it's easier to identify. And weaponizing runs the gamut from political intentions to profit intentions, but it's more blatant. Remember early days of email, there was a lot of spam.
Starting point is 00:44:58 I was like, oh, it's just going to get worse and worse. And we kind of figured out spam. I didn't figure out spam, but smart people at companies figured out spam, and we don't deal with spam that much. You could create 1,000 Twitter accounts a day. Now to create a Twitter account, you actually have to enter a phone number. Now, can sophisticated people generate phone numbers? Yes, but your average person was like, oh, how do I get another phone number? Sort of like we thought we had a utopian society, so no locks on our doors, and we left the keys in our car like we did in the cornfields.
Starting point is 00:45:35 And turns out we got to lock the doors, and I think that will happen, and that'll get better and better. Is the difficulty there if the weaponization incentive aligns also with the profit incentive? And that's what we see when a Facebook will come in and they'll say, hey, your algorithm drives extreme content and radicalization. But unfortunately for them, it's also their business model. And so that puts you in kind of a netherworld. Yeah, for sure. And the thing I worry about more is where good, well-meaning people are spreading misinformation because they believe it. My uncle shares an anti-vax story.
Starting point is 00:46:19 So wait, that was your uncle? That was my uncle. I knew it was because those were coming to my inbox as well. Yeah, well, these systems are efficient at spreading that information. Then it's essentially the same problem as offline media, I believe. You're more aware of this than most people, I think, but like all the consternation about social networks spreading bad information. And then you have Fox News and they play off each other at this point to an extreme degree. But Fox News has a very clear profit motive, but the people who then propagate the stuff online don't have the profit motive.
Starting point is 00:47:02 They actually are concerned citizens. And so how do you deal with the concerned citizen who has bad ideas and is spreading those to other people? In the beginning, it's like, oh, well, people will just, like, it will be obvious what's true. Clearly, that was not the case. Well, in some ways, when you look back, I don't know if you feel this way, but I would imagine anybody that created the radio or a form of communication didn't realize how quickly it could be perverted. I would imagine Alexander Graham Bell, when he made the first phone call, his second question to Watson was, and what are you wearing? Because ultimately, these are tools and tools will always be only as good as those that are wielding them. And it does feel like we're in a moment where the tools of social media are too easily exploited by those who seek to create instability.
Starting point is 00:48:07 Like you say, oh, we have to lock the doors, like these tools can be used for good or bad. And that's the negotiation I think that we make with everything in our society is that we have, as they say, all these wonderful tools, but they're only as good as the quality of the individual that's wielding them. And I guess the difficulty is there are actors out there who have the time and tenacity to use them for perverse gains of power and money. And it feels like we don't have a lot of bulwarks against that. Gotham City seems to be thriving, not a lot of Batmans. How do we create more Batmans? Yeah, some people think the answer is decentralization and that the problem is, well, there's only a few of these massive networks and it's not likely that a lot more of them are going to be created. But there are two things happening.
Starting point is 00:49:13 One is those big networks will exist, I think, for a long time. There's more niche networks where the behavior is much worse. Oh, sure. Well, they started them because they were not allowed to behave in the way that they wanted to. Right, right. It's like there's this massive house party going on and then there's some people who are kicked out and there's questionable behavior and then the people are kicked out. It's like, oh, we're going to go start our own party. Right.
Starting point is 00:49:42 You're all upset about Reddit and then you go to 8chan and go, holy shit. Right. What just happened? One could argue that Twitter and Facebook have done a relatively good job of keeping things civil for hundreds of millions of people. You go and check out what could be happening, and that will continue, and that will be completely unmoderated, in things like Telegram and Signal and just private groups. Encrypted services. Right, right. No one's watching them.
Starting point is 00:50:13 And then there are those who in a decentralized world, whether it's people trying to imagine, what's the Twitter that no one owns and that can't be controlled centrally? It's not clear at all why that won't just be more anarchy and chaos. If it were to be successful, I don't think that would necessarily be successful. But the flip side of that is, can you also decentralize the enforcement of standards and reputation and moderation? I think that's kind of interesting. If the analogy is like these things should be policed like a city or we need more Batmans, can that be something that people get together to do as citizens or private companies even that can do that? They can't currently, if these systems are completely closed and centralized, but in a decentralized world, theoretically, that would be possible. Maybe what the idea is then, if you, you might not look at it like decentralization, but view it as a utility to some extent, that if this is a utility, then maybe an algorithm of engagement is not necessarily the manner in which they need to conduct themselves.
Starting point is 00:51:27 But part of it is you have to look at what are the corrosive elements of it and what are the things that are valuable to society and how do we maximize some of that value while trying to minimize maybe some of the corrosion. And it's hard not to have it come back to business model at some level because that's how it's been designed. How do you break that apart without destroying what is their model? I think YouTube and TikTok, you know, share that and maybe even more directly is like if you're into something, you can go very deep and that can be amazing. People can learn incredible things from these systems in that sense. It's by design, but I wouldn't say necessarily it's by design that more engagement is good on almost any system, especially if it's advertising based. Even if your purpose is to help people learn something, you could go on a path on TikTok that is all about conspiracies and you could go very quickly down that road and you can go down a completely different road. You can learn how to rebuild an old car or join ISIS and it's basically the exact same algorithm that's going to take you there.
Starting point is 00:52:49 And you said something earlier that maybe I thought was a clue to it, which was not crowdsourcing in the sense of Wikipedia, but in the way of crypto or blockchain. Is there a decentralized vetting that can occur? Look, I think we'd all agree that going down a rabbit hole to be radicalized for terrorism is very possible on some of these social media, but is a bad outcome. Right. These are really sophisticated algorithms and I find it hard to believe that the part that illuminates can't be made brighter and the part that corrodes can't be slowed. It strikes me as not that that doesn't feel right. I completely agree. I think that is entirely possible and there's a bunch of ways you could imagine doing that.
Starting point is 00:53:51 And I think that the companies are trying. It's really not obvious what to do and they do a bunch of stuff, but it's not obvious how do we enable this going down this rabbit hole but not this one is really hard. But that said, I'm optimistic about a couple of approaches. One would be building more reputation into the system. So yes, these are powerful tools. The solution isn't to get rid of the tools for everybody. It's like, we can't have nice things. That's right.
Starting point is 00:54:26 It's that you earn your way to have influence over time. And then you get the question, well, who should earn that and on what basis? And that's where maybe the answer is not just the company or how much engagement they're driving, which is basically how reputation systems work today. And I think a flaw with them is like. It's user generated usually, those types of reputation things. They are, but they can be. Yeah, but I'm hopeful about the decentralization path in some aspects. I just think it's going to take a very long time to work it out.
Starting point is 00:55:01 I think people opt in to healthier systems. That might not be completely different systems. It could be just like, well, I'm only going to see tweets from people who have been verified. I'm making that up on the spot. So I'm not actually saying that's a good idea. I think these systems will evolve where people can kind of shield themselves. That could be bad. It could be like, okay, then we're all living in our own gated communities.
Starting point is 00:55:27 I think it's important that everybody get a sense of what everybody's thinking. I think where it changes is when the algorithm is driving you to see more and more divisive and conflict oriented stuff. Like if the platform itself was relatively neutral, it would still move towards the lurid. But I think it's when it comes to that the business model itself is built on creating conflict and misinformation. Because misinformation, it travels better. Crazy shit travels better than normal shit. That's just a maxim. The difficulty is when your business model celebrates that dichotomy.
Starting point is 00:56:15 And I know that, and you've been inside it, you say there's a lot of thoughtful conversation that goes around it. But I never get the sense that there is an urgency from those individuals over what exactly this is doing. How this is hyper accelerating the kinds of conflicts that we're seeing. My concern is more if your business model celebrates that and you don't have something in there to help mitigate it. What are we doing? It's too important, it seems, to be left to thoughtful boardroom conversation. I think it's a fair critique to say there should be more urgency and more to be done. It's been quite a while since I was on the inside of any of those conversations.
Starting point is 00:57:07 And so I can't speak to the current nature of them. I know that a lot more is done and a lot of the questions are just really, really thorny. That's not meant to be an excuse. I just think the business model is really, get people to engage and then they'll click on ads. It happens to be, you're right, that misinformation and some of these things are profitable. It's not the intent. I mean, I would say my Twitter is delightful if you control it. It's full of smart people who are...
Starting point is 00:57:49 Ev Williams, he's a good follow. Hey, everybody. Yeah, I barely tweet. I barely tweet. I'm saying the people I follow, it's full of smart people. But I mean, this is part of the reason. At Medium, we've totally focused on how do we align ourselves as a business with quality content? And that's why...
Starting point is 00:58:09 Right. And you've made it subscription. It's a totally different model. 100% subscription and people pay for quality content. When we switched to subscription, we were thinking about advertising for a while, and it became very clear to me once I thought deeply about it, especially when it comes to content, written content, that ads were completely contrary to quality. Not just because bad information gets more clicks, but good information is expensive. Yeah.
Starting point is 00:58:39 No, you're right. Right. And it costs money. It's not just what gets most attention. It's what costs the least amount of money to create. And that's the vast majority of content online. It's like, we got to get this stuff out there. It's clicks per dollar, really, is the equation.
Starting point is 00:58:56 And that applies to a ton of media, as you know. When you create Medium, is that a direct reaction to some of the negatives you saw from the more virulent social media? And where do you see its place? Where does it get a foothold? And are the more thoughtful communication tools relegated to a narrower audience just by the nature of them? I started Medium almost 10 years ago, so a lot has happened since. And so much of what was being published because of this dynamic I was just talking about, it was just getting worse and worse. There wasn't the right feedback loops to really support really good stuff online.
Starting point is 00:59:37 I think it should be much, much better than it is. And incentive structures, I think everything comes back to incentive structures. No question. Money and status are the main reasons people put things out there. And so I wanted to create a system that rewarded really good stuff. And that was longer form ideas and knowledge. And so that's really what we've been focused on. So this is the way I sort of look at the media ecosystem.
Starting point is 01:00:06 So the 24-hour news channels or radio are the content engines. They're sort of the sun. They're the fission that is creating just a constant stream of thoughts and ideas and talk and everything else. And then you have the sort of satellite internet aggregators. And they're almost like what you would consider the, you know, the, what do they call the panhandlers? And they're just, they got the pans out and they're sifting through and they're looking for nuggets. So out of all this mass of content, they're looking for shit that's going to pop on a headline. But that's the business model.
Starting point is 01:00:50 I'm going, I'm a bot finding that I'm going to pull out a nugget that I can weaponize, put it out there and then watch it. Watch it go. Right. And watch it spread. And it's a nuclear reactor. You know, it's pretty clear how this ecosystem has evolved. Right. I did an interview a few years ago with the New York Times and a reporter kept asking me if I was sorry about Trump.
Starting point is 01:01:21 Right. You know, if not for Twitter, Trump wouldn't have been elected probably. And was I, and he kept saying like, all right, you know, do you feel bad about it? And I just hemmed and hawed. And, you know, I finally said something that sounded like, yes. I know the headline. I see it right now. Yeah.
Starting point is 01:01:38 And Ev Williams apologizes for Twitter's role in Trump's rise. Exactly. And the New York Times headline wasn't that, but every other headline was that. Because they're the ones that are mining for that gold. Yeah. And what I've also seen is there's a secondary effect of there's the people mining for the gold. Like, how do we reframe this in a way to get a click? And I'm not letting the Times off the hook, because what I see there is there's the media ecosystem that plays out largely on Twitter where people are writing for each other.
Starting point is 01:02:20 And you know what's very uncool? If you're a journalist, it's writing anything positive about a tech company or any institution whatsoever. For that, you will get some shit on Twitter. It's a different form of mob. This is one of the things that I remember at the Daily Show towards the end. I started to feel the weight of social media. And it's, you know, one of the things I think is so important about any artistic pursuit is you develop kind of an internal barometer and integrity about the quality of what you're doing. And you don't ever want that bent by the magnetism of an audience reaction.
Starting point is 01:03:03 But it reminds me that we're not supposed to know what everybody thinks of us. 100%. I learned not to look at my mentions on Twitter long ago. It's bad for your mental health. Right. Is it Frankenstein's monster? Did you create a monster or do you still feel like, you know what? On the whole, it's worth it.
Starting point is 01:03:32 On the whole, I think it's great in general. And I use Twitter a lot. I would read people's tweets. I don't tweet much and I don't look at what people are saying. Because I completely agree with what you're saying. In fact, last night, I was talking to my wife. I'm talking to John Stewart tomorrow. I used to do a lot of more publicity.
Starting point is 01:03:50 And I was like, well, I'm kind of nervous about this. And she's like, well, why are you nervous? Like, well, you know, Jon's really smart, but I'm out of practice, and it's a fucking minefield out there. That's right. And it does affect what you say. And even when it's not a minefield, like even when you're not trying to get somebody, somebody will find a way. To make it that. Dude, man, I've really enjoyed this.
Starting point is 01:04:14 It's fascinating. I hope it was all worthwhile and that you don't feel nervous. Well, thanks for having me, Jon. It's been a pleasure. All right, sir. It's interesting that Twitter came about because he did not like football. I did not know that was going to be the reason. Right, right.
Starting point is 01:04:34 He's corn fed. People talk about football on Twitter, Ev. I think honestly, it's, you know, the way he described it was like, I was bored. And so I created, I just, there was nothing to do out there. And so I decided to experiment on the world. Yeah. You know what? I think also it kind of speaks to, you always have the mindset that things are more intentional and manipulative than they are.
Starting point is 01:05:06 And there's a lot of happenstance and a lot of coincidence and a lot of good fortune and a lot of things that occur that aren't. I would like to reduce communication to grunts and it's going to cause an explosion and it's going to help fuel the divide in this country. Like it just, it's not as by design as you might think. Yeah, it's all an accident. They just, they have this idea and then it gets super, super big and then they're like, how do we make money off this thing that everybody has? That's right. And it's not, it doesn't work out very well ever when they do that. And venture capitalists say, you don't have to worry about money for the first few years.
Starting point is 01:05:50 Yeah, just make it huge. I will give you, I will give you money if you can make everyone do this. There was a gold rush. I don't know if they have it anymore, but like in the day when people are like, what, if anything had a dot com on it, here you go. $10 million on the startup. I mean, people were just pouring money into these things. Well, and still the, the number one thing they ask you when you're in like early seed rounds for these ideas is how can you scale it to essentially everybody in the world? And that's the only plan you really need.
Starting point is 01:06:18 You don't really need a good idea. You need to know how to. How to scale. Kris, you understand, but that's like the problem we talked about with Dr. Donovan is scale and the ability to disseminate anything. Faster than you can even think. But we have always had problems. I don't think the human immune system adjusts very well to these new technologies. And it is my hope that my kids, or theirs, will develop an immunity to it.
Starting point is 01:06:48 I mean, radio, when it first came out, was just mind-bogglingly destructive. TV as well. And now this, but I do think if the immune system can, can catch up and be more robust, we may find ourselves in a better, a better place. Well, or, or we become cyborgs and the AI becomes part of us and we just accept. I don't want to hear any more of your porn fantasies. I just, I'm just done. Yeah. My favorite part of that porn was when the girl went 001010111100101011. Also, before we get out of here, we do have a new segment coming from Alexa Loftus, one of our fine fine writers.
Starting point is 01:07:36 It's a little something called Let Me Distract You. I hope it's juggling. I always find that to be the best distraction, but here's Alexa Loftus. Hello there. There's some stressful news out there like another winter storm cancels hundreds of flights. U.S. deploys troops to NATO allies in Eastern Europe. And don't get me started on high mercury levels in the Amazon. I will lose sleep about monkeys.
Starting point is 01:08:04 So let me distract you with some not so stressful headlines. A recent pig to human kidney transplant was successful, and that's good news for everyone. Except pigs. Not only could this save your life, it could also give you a great opening line at a party. All you have to do is say, hi, I'm Alexa. Inside of me is a pig's kidney. It hasn't changed me at all, except now I crave the taste of slop. A man invented an instrument that measures odors,
Starting point is 01:08:38 which is a great way to brag about how you have a lot of free time. I'll buy one though if it's under 20 bucks. And then when I'm like, does something stink? I can whip out my nasal ranger and be like, yup. And finally, whales don't choke. And this kind of stuff is why I pay for the New York Times. They say it's because they have huge throats. Yeah, I feel like that was assumed.
Starting point is 01:09:04 They weigh 400,000 pounds. Thanks for letting me distract you. Now feel free to get back to, Tesla will drop self-driving feature that runs stop signs. Hey, how thoughtful, Tesla. I have to say though, it is unsatisfying giving the finger to an empty car. That is our show, everybody. Thanks to Kris, Jay. This is awesome, guys. Thanks so much. Thanks to Dr. Joan Donovan, Ev Williams.
Starting point is 01:09:39 Thanks for listening, for more content from The Problem. Check out our newsletter, subscribe at our website, which is like a newsletter, but with an address. I don't really know the difference, to be quite honest with you. It's got content and links, lots of words. Problem.com, check out the Apple TV Plus show, link in the episode description. And we'll talk again next week. Yes, we will.
Starting point is 01:10:01 Until then, bye-bye. Bye.
