Factually! with Adam Conover - Feeling Secure with Bruce Schneier
Episode Date: March 18, 2020. Online security expert Bruce Schneier joins Adam this week to discuss the concept of "security theater," how the surveillance state makes us less secure, whether we can change anything to help our vulnerability, and more.
Transcript
You know, I got to confess, I have always been a sucker for Japanese treats.
I love going down to Little Tokyo, heading to a convenience store,
and grabbing all those brightly colored, fun-packaged boxes off of the shelf.
But you know what? I don't get the chance to go down there as often as I would like to.
And that is why I am so thrilled that Bokksu, a Japanese snack subscription box,
chose to sponsor this episode.
What's gotten me so excited about Bokksu is that these aren't just your run-of-the-mill grocery store finds.
Each box comes packed with 20 unique snacks that you can only find in Japan itself.
Plus, they throw in a handy guide filled with info about each snack and about Japanese culture.
And let me tell you something, you are going to need that guide because this box comes with a lot of snacks.
I just got this one today, direct from Bokksu, and look at all of these things.
We got some sort of seaweed snack here.
We've got a buttercream cookie. We've got a dolce. I don't, I'm going to have to read the
guide to figure out what this one is. It looks like some sort of sponge cake. Oh my gosh. This
one is, I think it's some kind of maybe fried banana chip. Let's try it out and see. Is that what it is? Nope, it's not banana. Maybe it's a cassava
potato chip. I should have read the guide. Ah, here they are. Iburigako smoky chips. Potato
chips made with rice flour, providing a lighter texture and satisfying crunch. Oh my gosh, this
is so much fun. You've got to get one of these for yourself. And get this: for the month of March, Bokksu has a limited edition cherry blossom box, and 12-month subscribers get a free kimono-style robe. And get this: while you're wearing your new duds, you're learning fascinating things about your tasty snacks.
You can also rest assured that you have helped to support small family run businesses in
Japan because Bokksu works with 200 plus small makers to get their snacks delivered straight
to your door.
So if all of that sounds good, if you want a big box of delicious snacks like this for yourself,
use the code factually for $15 off your first order at Bokksu.com.
That's code factually for $15 off your first order on Bokksu.com.
Hey folks, Adam here.
I want to give this show a special intro since we're obviously in a special time.
I, like many of you, am holed up in my house right now.
I'm working from home.
I got my dog here.
I got my partner, Lisa.
I got my cup of tea.
I got my somewhat less professional podcast recording set up.
And I'm trying to make sense as best I can of the COVID-19 pandemic that's changing all of our lives so drastically.
Now, first of all, I have to address this because people have been asking me what I think, right?
Adam, we watch your show.
You're always debunking things.
Are you going to debunk this?
You're going to talk about, oh, is the media just overhyping coronavirus?
And, you know, is it causing a panic that we shouldn't be having?
And I just
want to tell you, so you hear it from me, if that matters to you, no, I do not believe that
this pandemic is overhyped. Every expert in medical science, in epidemiology, in communicable
diseases, these folks are telling us that we need to take drastic actions right now to reduce the spread of the disease so that many, many more people don't die and so that we don't overwhelm the limits of our health care system.
And so we're acting accordingly, working from home, not getting together in large groups, not going to bars or restaurants, staying inside as much as we can, going on jogs, but avoiding human contact
as hard as that might be. And I hope that if you're in a position to be able to make those
changes, that you are considering doing so yourself, because it really, really can make
a difference in this time. And I also want to mention that there's a lot of folks out there
who can't make those changes. There's unhoused folks living on the streets. There's workers who, because of what their job requires,
cannot take time off work, cannot work from home, either because they rely on those wages or because
their job is just so damn important. And I hope you're doing what you can to support those folks in your
community as well. And that's really all that we can do right now is try to take those steps
and try to keep aware, try to keep reading, try to keep understanding what's happening.
I want this podcast to be a place where we can do that investigation,
where we can be watching and listening. And we are trying to steer the ship in order to be able
to do episodes like that. We're trying to book some guests right now who can come on to talk
to us about coronavirus. Let me tell you, it's a little hard to book them. They're a little bit
busy, if you can imagine that. Folks are like, oh, do I have time to go and be on a podcast for an hour? No, I think I might have to
save some lives today. But we're doing our best to line some guests up for you.
And in the meantime, we've got a nice backlog of episodes that we're going to be putting up for you
on our normal weekly schedule. It's just that those were recorded before the pandemic really seized our country the way it has in the last week.
And so, you know, they might be a little bit from the pre-coronavirus point of view.
Not to say that they're not full of interesting facts that you got to know.
You know, they're just not about the thing that we're all thinking about 24-7 right now.
And in the interview we recorded with Bruce Schneier today, we actually do get into coronavirus a little bit, but we did record it last Wednesday.
And so, you know, things have accelerated since then.
So, look, with that being said, I hope you enjoy the show today.
I hope that you are staying safe.
I hope that you are washing your hands for 20 seconds.
Seriously, 20 seconds really does matter because that's how long it takes for the soap to really break down the virus.
The time you spend really does make a difference.
I hope you are keeping in touch with the folks you care about and encouraging them to do the same.
And I hope we all get through this on the other side, none the worse for wear.
Either way, I will be there with you, putting out new episodes for as long as we can,
trying to bring you new information,
new perspectives, fascinating people
to keep you entertained and informed.
And thank you for listening.
Thank you for being a part of the community
of people who listen to this podcast.
We need each other now more than ever
in our communities
and in our broader communities on the Internet.
So thank you for listening and hope you enjoy today's show.
I don't know the truth.
I don't know the way.
I don't know what to think.
I don't know what to say.
Yeah, but that's all right.
Yeah, but it's okay, I'm Adam Conover, and look, since every piece of our lives is
online now, our pictures, our health data, our banking information, our rather suspicious
browsing histories.
Internet security is now everything security. Now, you might believe, and tech companies would have us believe, that if you want to be secure, you just need the right cryptography, the right
fancy tech inside your machine. And wango bango, you're good, right? Actually, that high-tech
approach neglects one crucial weakness in our security armor, which is that good old-fashioned
fallible humans are in charge of it. See, humans make mistakes all the time, multiple times a day
in my case. So it doesn't take a supercomputer or a team of evil geniuses to get us to divulge
private information. All it takes is a convincing email or phone call. This technique is called
social engineering. It's the oldest hacking
technique in the book, and all the high-tech cryptography in the world does nothing to stop
it. To take one famous example, in 2012, the journalist Matt Honan was hacked, and the hack
was surprisingly low-tech. First, the hackers figured out he had an Apple ID account, and in
order to get into that account and ruin Matt Honan's week, all they needed was his name, his billing address, which they were able to find from public domain name records,
and the last four digits of his credit card. So to get those last four digits, they called Amazon.
They pretended to be Honan. They convinced the Amazon rep to let them enter a new credit card
number onto the account. Then they called back and gave a different operator the new credit card number
they'd just made up. By tricking Amazon's network of human operators, they were able to access
Honan's Amazon account and look at his actual credit card number. And from there, all they had
to do was call Apple, pretending to be Matt, and get into that account. And from there, they were
able to destroy his Google account, all the data on his iPhone, iPad, and MacBook, and take over his Twitter account, which was the whole point of the hack to begin with.
In the end, they just wanted his Twitter account because it was three letters long, at M-A-T.
I don't know.
They just wanted it for street cred or something.
Maybe they don't like typing long usernames.
Who the hell knows? The point is, they were able to destroy his digital life
just by lightly tricking two Amazon phone representatives. And imagine if they'd wanted
more. Imagine if he were a politician they wanted state secrets from, or if it were a hospital they wanted to hold for ransom, as happens so often now. Ransomware attacks are becoming far more common.
The fact is, new attacks are happening every day.
And in response, tech companies, again, are constantly trumpeting the new, expensive, high-tech security measures they're deploying.
They're scanning your fingerprint, your face.
They've got all the new crap, right?
But the fact is, it doesn't require fancy math or a supercomputer
to hack someone. All it really needs is a dedicated con artist and a gullible human.
And if your security system doesn't protect against that, if it doesn't rigorously plan
for every angle of attack, no matter how simple, then that expensive high-tech security is really
nothing but security theater, giving you the feeling of security instead of actual
security. Well, here today to tell us more about how important good security design is and how hard
it is to get it right is today's guest. He's an expert on online security, and he's the inventor
of the concept of security theater. Please welcome public interest technologist and Harvard Kennedy
School fellow, Bruce Schneier.
Bruce Schneier, thanks so much for being on the show.
Thanks for having me.
You were last on the previous installment of this podcast, the Adam Ruins Everything
podcast about three years ago.
What's new in the world of security since then?
What has been changing?
What is scaring you today?
Yeah, there's not a lot new and everything's new, right?
A lot of the vulnerabilities, the attacks are the same.
What's changed is computers and everything.
Yeah.
When we last talked, there weren't computers in your refrigerator and your toaster and your thermostat.
Right.
I mean, that's pretty new.
And your doorbell.
And your doorbell.
And then a lot of these disinformation attacks.
Mm.
Right, it's new.
It has happened since we last talked.
And none of that is actually new,
but it's happening to an extent that is surprising to most people,
and it's become political.
It's become something the average person now cares about.
Yeah, well, let's talk about the first one first.
There's computers in everything now.
Home, smart home products are everywhere.
You go to a Home Depot, they're falling off the shelves.
Hey, install this wireless security light fixture that'll change colors and also stop your home from being robbed by scary people.
And you can use it to watch your baby, and et cetera.
And that's complicated.
There's a lot of perhaps negative knock-on effects of that, aren't there?
Well, I think about the fact that there are all of these computers in things that are not traditionally computers.
We're used to laptops, desktops, phones, and now it is everything.
And the same care that Apple and Microsoft and Google take with software, and we can argue whether it's good enough, but they certainly take more care than the people who make internet-connected thermostats and toys and even medical devices. So we're seeing this explosion of things on the internet without commensurate security,
plus they affect the world in a direct physical manner. So it's not like your phone, which is just about data. My thermostat turns my heat on. I live in Minneapolis. If it's cold in the winter
and someone hacks it, my heat turns off and I'm not there, my pipes freeze. Even worse, it could be a medical device in my body
where if it's hacked, even worse happens.
Or a car, right?
These things are touching the world.
Yeah.
And that really changes the risk profile,
even though the computers haven't changed at all.
And you're right that, you know, say, I think Apple probably has pretty good security practices. They certainly, you know, market themselves as having them. And, you know, based on what I've read, okay, this is done well, that's done well, etc.
They're certainly one of the companies putting the most money into securing those devices, I would guess.
And even they, you know, there's, hey, occasionally people's iClouds get hacked.
Like there's, you know, it's not like this is an impregnable fortress.
And when you compare that to, like, the Ecobee thermostat in my house, I don't know this company. I don't know what their security is like; hopefully they're good. Right. But now when I go look at my, you know,
home router and I go see how many devices are connected, there's, there's about 40 devices
connected in my house. And you might not know what they all are. Yeah. I mean, something shows up and it connects.
So, and you think about Apple or Microsoft or Google, they're the big companies.
I mean, yeah, they're doing a better job, but Patch Tuesday comes for Windows and they're
like 80 vulnerabilities being patched every month.
Yeah.
That's what good looks like.
Now, that's being agile: we can't write good software, so we patch quickly.
That's the mechanism we have.
Now, that doesn't translate down to that Ecobee 3000, whatever it is you have.
Your home router, the way you patch it is you throw it away and buy a new one.
That's the mechanism.
Right.
So that whole agile security,
we're going to fix it really fast when we find a vulnerability,
doesn't translate to those low-cost embedded systems.
That mechanism starts failing.
And you've talked about occasions
where one of these weird Internet of Things devices,
like some weird little device,
has been used to hack a larger system as like an entry point for a dedicated attack, right?
So the Dyn botnet is the example we all like to use.
2016, this was a vulnerability in basically webcams and DVRs.
They had default and weak passwords.
And some guy collected, I don't know, a whole bunch of them into a botnet,
used them to attack a domain name server,
which had a knock-on effect of dropping
20 or so major websites offline.
Wow.
So there you have that cascade of vulnerabilities.
You can imagine that targeted against a power system,
something more critical, a hospital.
Yeah.
These are not theoretical.
These happen.
And so this guy sort of mass hacked all these devices
And so this guy sort of mass hacked all these devices and then zombified them in order to get them to do his bidding to take down this important piece of internet infrastructure.
And sadly, zombified is the technical term we have for it.
I do read a security blog on occasion, so I may have grasped that.
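To make the default-password problem concrete, here is a minimal sketch of the kind of audit you could run against devices you own, checking whether they still accept a factory login on their web admin page. The addresses and credential list below are invented for illustration; real devices expose their admin interfaces in many different ways, and you should only ever point something like this at hardware that's yours.

```python
# Illustrative only: check devices YOU own for factory-default web logins.
# The IPs and credentials below are made-up examples, not a real survey list.
import requests

MY_DEVICES = ["http://192.168.1.20", "http://192.168.1.35"]       # hypothetical local devices
COMMON_DEFAULTS = [("admin", "admin"), ("admin", "password"), ("root", "12345")]

for device in MY_DEVICES:
    for user, pw in COMMON_DEFAULTS:
        try:
            # Many cheap devices gate their admin page behind HTTP Basic Auth.
            resp = requests.get(device, auth=(user, pw), timeout=3)
        except requests.RequestException:
            break  # unreachable or not speaking HTTP; move to the next device
        if resp.status_code == 200:
            print(f"{device} still accepts default credentials {user}/{pw} -- change them")
            break
```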
I mean, that's, that's
stunning. And how much should, I mean, do you think the average person should feel differently
when they go to the Home Depot and they look at the ring doorbell and they consider getting it?
So this is hard and we're struggling with how we tell the consumer this. There are people working
on effectively security nutrition labels. The idea is we can put a label
on products
that has similar information
that a consumer can use
to make a buying decision.
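As a rough illustration of the idea, a machine-readable version of such a label might look something like the mock-up below. The field names are assumptions made up for this example, not any adopted labeling standard.

```python
# A made-up "security nutrition label" for a hypothetical smart thermostat.
import json

label = {
    "product": "Example SmartTherm 3000",
    "security_updates_guaranteed_until": "2025-12-31",
    "ships_with_default_password": False,
    "data_collected": ["temperature", "occupancy schedule"],
    "data_shared_with_third_parties": False,
    "encryption_in_transit": "TLS 1.2 or later",
    "vulnerability_disclosure_policy": "https://example.com/security",
}

# The kind of summary a retailer or comparison site could display next to the price.
print(json.dumps(label, indent=2))
```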
Right now,
the consumer is looking at features.
They're buying a Nest thermostat.
They're buying an Amazon Echo.
Amazon Echo is marketed as
this thing will spy on you
and good things happen as a result. Right. It listens to everything that you say.
So it's going to be hard. You know, a lot of us think the right way to do this is actually
product safety standards. I mean, when you go buy, you know, a pair of children's pajamas,
there isn't a label that says whether or not they'll catch fire.
It's just not allowed to sell
flammable children's pajamas.
We make a rule.
We get that they're cheaper.
Yeah, you can't buy them.
You can't sell them.
That's just the way we are.
There isn't such a rule anywhere
for internet security.
Yeah.
And I think that hurts us.
And whether it's the Federal Trade Commission
or some other body,
I think we need more regulation
in the consumer goods space
because we're getting a lot of shoddy merchandise.
There are no real liabilities here.
And the consumer can't tell the difference
between a secure product and an insecure product.
They're going to make the same claims.
Yeah.
And you're going to buy it based on features. So is this becoming like an Unsafe at Any Speed moment for internet devices? You know, the famous Ralph Nader book about the car industry that resulted in, hey, these cars were blowing up or, you know, causing injuries or death at even low-speed collisions. And, you know, there was a sea change where, okay, we're going to require airbags, crumple zones, all the different things that may have made our cars much safer, at least for the people in them, maybe not for people who get hit by them.
But yeah, pedestrians aren't doing too well. It's true.
Yeah, exactly. Those big front SUV grills will kill them. And where's NHTSA on that? But,
you know, still there was a sea change and we started requiring these things because, hey, consumers can't make their cars safe
by themselves.
We need a basic floor on the safety.
Is it starting to seem the same way
for internet enabled devices?
Yeah, I hope so.
I doubt it.
I mean, you could tell the same story.
The difference is automobiles: 40,000 deaths a year.
A lot of people die in car crashes.
This is the most dangerous thing
most of us do in our life.
And we're at historic lows for that in many ways.
That's right, we are.
And you're right, pedestrian deaths are up
and passenger deaths are down.
So we're not yet at that space on the internet.
We're not seeing deaths because of insecure IoT devices.
The only exception, and this isn't proven, this is my guess, we're seeing a lot of ransomware
against hospitals where critical systems are shut down.
Yeah.
And we have had hospitals have to turn away ambulances, right?
Go to that other hospital across town because we can't take you.
Our computers are all down.
Now, right. It's possible and probable that has resulted in at least one death. I don't have any
data, but we're not seeing the cars crash. We're not seeing the pacemakers crash. We're not seeing
the sorts of loss of life, which would spur a government to take on the industry that doesn't want the
regulation and force it. So we're not at an Unsafe at Any Speed moment, but we might be approaching it.
I really hate that we would have to get there to do something, but we as a species are terrible
at being proactive. You know, we are reactive. I've noticed. We are not, let's worry about the thing that could happen in the future. We are, let's panic about the thing that happened in the past. And make sure it can't happen again, ideally. But yeah, I mean, it's.
Or at least in exactly the same way. Yeah. But that means, you know, I've, I've read that when
you're talking about risks that, hey, if you're reacting to something bad after it happens, you're too late in many ways.
And really, you want to prevent the thing. And we're not even close to doing that.
It depends what the thing is.
I mean, there are things that are rare and spectacular.
Airplane terrorism, you know, new coronaviruses.
Those are so rare.
You have to be proactive or you're not going to do the job.
If it's credit card fraud, then reactive is probably okay.
I mean, new kind of fraud is invented,
credit companies notice it, they put in defenses,
and there's a steady state.
You see that in spam, right?
And you might notice every couple of months
you get a lot of spam for a day.
That's because the spammers invented a new technique and it takes a while for the anti-spam
companies to figure it out. So you can be reactive in kind of that low level garden variety attacks
and fraud because it's a sort of a steady state system. When you're talking about rare and
spectacular, if you're reactive, yeah,
you're way behind, you're not going to do well. You need to be proactive. We are, of course,
learning that this month. Well, what do you think? I love your framing of product standards that like
right now we're selling all this stuff willy nilly and we're selling like baby gates with holes in
them. We're selling, you know, products that are dangerous or
insecure in ways that are unclear to the customer because it's so new. Do you think that the
solution to that is, you've talked about regulation, is there also a consumer education part of it?
Should people be having better security practices themselves? Or is that impossible to ask the average person to keep
data security in mind 24-7? So I think it's largely impossible,
just like we don't ask consumers to have medical degrees. We make sure that the drugs they buy are
safe. We don't ask them to have auto mechanic licenses. We don't ask them to be civil engineers,
right? I'm in a building and this building has been constructed well, and I can just not care, right?
I'm going to fly home tonight
and I'm going to trust that the airline has a system
to put well-trained and well-rested pilots
into well-maintained aircraft.
And I don't know how it works and it doesn't matter.
Yeah.
Because regulation requires it.
We cannot ask consumers to be experts in everything.
We have to have some system by which they are largely kept safe in all of these areas.
Yeah. And there's no other option. I mean, I can't think of an industry in the past 150 years
that has improved consumer safety or security without being forced to by the government.
Yeah.
Pharmaceuticals, cars, airplanes, medical devices, food production, restaurants, workplace safety, consumer goods, most recently the safety of financial products.
The market rewards getting stuff out there quick and rewards features, doesn't reward
safety and security. So the market pretty much never fixes this problem in their industry until
they're forced to, in which case they do. But a lot of the time, once they're forced to,
we see the pattern again and again. Once you have that safety, that's actually good for business because, you know, the USDA inspected stamp on your meat, right, means we go from Upton Sinclair times where,
you know, everyone's getting sick from eating contaminated meat to, hey, we can all, you know,
most Americans believe that our meat is safe and therefore meat consumption goes up. That sort of
pattern, you know, and right now people don't believe that their devices are secure.
Everyone sort of believes, yeah, they're all spying on us.
We're all being hacked all the time.
Everything you buy is unsafe.
And if that perception were changed via the government setting some basic standards,
maybe that would benefit business.
And it does.
So here's the basic economic problem.
It's individual action versus collective action. The industry does better when they're regulated and their products are safe. Each individual
company doesn't want to be the first one to spend that money to make their products safer because
they'll be penalized because they'll be more expensive. So everybody really wants someone to come in and say, look, all of you have to maintain your aircraft.
All of you have to test your drugs.
All of you have to make sure that your iPhone apps aren't spying on people.
Right.
Then consumers get more confidence.
And yes,
the industry benefits and you get innovation because now there's a market for
all these security systems that these manufacturers can buy and use.
One of the problems I have with my security is it's largely out of my hands.
My email is on some cloud provider.
My photos are somewhere else.
My documents are somewhere.
My credit card data is somewhere else.
There's nothing I can do.
I am powerless.
Right.
Right.
But what I want is for all of those companies to secure my data better, but I
can't force it.
Yeah.
I have no leverage.
I mean, I could not have an email address.
That's dumb.
Or I could not have a credit card.
That's even dumber.
Or if you're an ultra nerd, you can run your own email server, but then you are personally responsible for, it's like building your own home security system from scratch.
That's right.
And sadly, I'm probably one of the few people left who does manage my own email.
Right.
But, you know, last time I checked, Google has about half of my email because even though I don't use Google, everyone else does.
So I'm, you know, I'm screwed either way. Yeah. It's in
other people's inboxes. That's right. Well, and this is a pretty basic form of like American
regulation you're talking about, just the government setting basic standards. It seems
very sensible to me. Is there something though that you think would make that happen? Are you
a pessimist about it? You know, I tend to be near-term pessimistic,
long-term optimistic. Okay. And this isn't the thing that's going to stifle society and
civilization. We'll figure this out. We figured out harder things. Near-term, the companies have
a lot of power and government's really still afraid to do any serious regulation. Yeah.
So, you know, you're not seeing much.
You're seeing things out of Europe.
That's a start.
You're seeing things out of California.
That's a start.
But you don't really have regulatory bodies
that have a footprint that matches these companies.
Yeah.
So we have this mismatch.
So near term, I actually don't see a lot of hope. This is not a presidential issue, right? No one is campaigning on it. Not even Andrew Yang campaigned on this. Right. And he was as nerdy as we got. Yeah, for real. And he did not.
So, you know, if it's not a campaign issue, elected officials don't care that much.
And the voting public doesn't care that much. So they're more likely to do what companies want, because the companies care, because the money matters.
Right.
So you're not getting near term movement.
Europe seems to be the exception.
And the neat thing about this is a good regulation in Europe helps the entire world.
Because software is write once, sell everywhere.
So right now, the car you buy in the US is not the car you buy in Mexico.
Environmental laws are different and the manufacturers tune the engines.
But the Facebook you get in the US is exactly the same Facebook you get in Mexico.
Because it's easier.
So California,
beginning of this year, a new internet of things security law went into effect. Not that great. One of the things it says is no default passwords. So imagine that thermostat manufacturer reads the law,
changes their software, no default passwords. They're not going to have two versions,
one for California, one for everyone else. They'll make that change worldwide because it's cheaper and easier to do it that way.
So a regulation in any large enough economy gets promulgated throughout the world.
Right.
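To sketch what "no default passwords" means on the manufacturer's side, here is one simplified way a firmware team might comply: either preload a unique password per unit at the factory, or refuse to enable network features until the owner sets one. The function names are illustrative, not from any real device SDK or the text of the law.

```python
# Illustrative sketch of two ways a device maker might satisfy a "no default passwords" rule.
import secrets

def provision_at_factory(serial_number):
    """Option 1: generate a unique credential per unit and print it on the device label."""
    return {"serial": serial_number, "password": secrets.token_urlsafe(12)}

def first_boot_setup(config, user_supplied_password):
    """Option 2: block network features until the owner picks a real password."""
    if not user_supplied_password or user_supplied_password.lower() in ("admin", "password"):
        print("Set a new password before this device will connect to the network.")
        return False
    config["password"] = user_supplied_password
    return True
```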
GDPR is the European data privacy standard.
Right.
You have gotten a lot more annoying pop-ups on your browser because of it, even though you're not in Europe. Right. And I used to work for IBM. They actually said we're going to implement this European law worldwide because that is easier than figuring out who a European is.
And these, some of these pop-ups are irritating.
There's been some irritating effects of GDPR, certainly.
Like, for instance, let me just give you an example.
truTV hosts Adam Ruins Everything's bibliographies, right?
We have bibliographies for every episode.
And they actually, rather than make the website,
truetv.com, secure enough for GDPR,
they just made it inaccessible to anyone outside of the United States.
Yeah, I've seen that. Some retailers do that too. It's pretty funny.
And so I've got people who watch the show on YouTube in Italy or whatever saying,
hey, how do I see the bibliography? I can't access it. And there's nothing I can do about it.
So you want a little tip? So tell them to install a VPN and pretend they're from the United States. I mean, it's not like these systems work all that well.
Right.
It's trivial to get around.
But in the best case scenario, yeah, the, you know, the better parts of the GDPR that
are, you know, regulating how my data can be used and et cetera, I benefit from even
though I don't actually live in any of those nations.
And that's true.
And hopefully that'll keep improving
because Europe is the regulatory superpower on the planet, right?
They are the ones who are willing to slap fines
on these companies that the companies notice.
US is not, no one else is capable.
I feel like this is an optimistic note
that we actually ended on.
So I'm going to, let's take a really quick break.
When we get back, I want to talk to you about disinformation. And I also want to get your thoughts on
the coronavirus. We'll be right back with more Bruce Schneier.
OK, we're back with Bruce Schneier. Bruce, let's turn to disinformation.
Give me your thoughts on these campaigns.
Do you want me to lie about it or tell you the truth about it?
Because if I lie about it, that'll be super meta.
Yeah, and it'll also catch fire across Twitter.
We know from research that lies travel further on Twitter than the truth.
So that means we'll go far more viral if you go that way.
But yeah, no.
So you said this is a new type of attack that we've seen targeted disinformation campaigns.
How do those look from your perspective as a security researcher?
It's, again, it's new and it's not new. I mean, propaganda is ancient.
And certainly after World War Two, Cold War, propaganda was a big deal, right?
Russians in the U.S., everyone practiced it.
What's different is being able to do it on the internet.
The internet has different affordances.
Internet makes things cheaper.
Internet makes things travel faster.
Internet changes how we share information.
And it turns out certain types of propaganda are easier to do and cheaper to do and are more effective.
And some of it is the fact that these platforms optimize for engagement, not accuracy.
Facebook makes more money the more time you spend on Facebook.
And it turns out you spend more time on Facebook when you're pissed off. Yep. Right. So if these platforms can get you riled up,
you will engage more. And whether that is Twitter or Facebook or YouTube, they push
extremist content, not because they're malicious, because the algorithms say it is more effective
for them to achieve their goals,
which is to show you more ads.
And that's being taken advantage of.
The thing about all the disinformation
is we actually don't know how effective it is.
I mean, all of the analysis,
we can't tell, did it change anybody's vote?
What it does seem to do is it makes people more partisan and it reduces our trust in
our institutions.
Yeah.
Right.
It makes us mistrust the election process, the census.
So I'm going to give you my theory here.
Now, it's going to be a little wonky, but I think it's worth doing.
So here's the question.
Why is it that the same disinformation campaigns that are bad in a country like the U.S. or are destabilizing in a country like the U.S. are stabilizing in a country like Russia?
That's an interesting question.
So here's my theory.
Societies have two kinds of information.
There's something we'll call shared political knowledge,
which is things we all agree on,
like who's in charge,
like how elections work,
like what the laws are.
We all agree on that.
Then there's contested political knowledge.
That's the stuff we disagree on.
That's things like, what should the tax rates be?
What should regulatory structure look like?
You know, I mean, all the things we battle politically.
Is voter fraud real?
Stuff like that.
Right.
So in a democracy, we harness disagreements to solve problems.
We have elections.
We vote.
We debate policy.
We like disagreement.
That's how we solve problems. Okay, now move to an autocracy like Russia.
You do have common political knowledge, like who's in charge.
But in fact, contested political knowledge is dangerous.
And governments like Russia suppress it.
And things like how
popular the opposition is, right? You know, what the government is doing that tends to be suppressed.
Yeah. So an open internet in a country like Russia challenges their monopoly on information and is
dangerous, right? Free and open exchange of information is dangerous
to a country like Russia. In the United States, our danger are attacks that turn
shared political knowledge to contested political knowledge. If we disagree on the results of the
election, we have problems. If we disagree on the fairness of the census, the fairness of the courts, the fairness of the police,
we have problems.
So the same attacks, the exact same campaigns, in Russia, which basically make people unsure about what's true, stabilize their regime.
In the US, being unsure about what's true destabilizes us. So we actually have this vulnerability based on the fact that normally disagreements are valuable in our country. And how active are these campaigns? I
mean, you know, you log onto Twitter and you see everybody's accusing everybody else of being a
Russian bot, you know, everyone's like, ah, Putin put you up to that, you know, and it's, or they're saying anyone who says that is a moron, right? Anyone
who thinks that this is an issue at all, from your just on the ground objective view as a security
expert, like how real is this as a day-to-day fact of life? How much is this affecting us?
So it's interesting. And this is a really interesting point that it is
real, but a lot of it
isn't happening. I think we're seeing Russians under every
bed and they're not. So the real data
shows the attacks are less
common than we believe, but in
fact, now we're doing the work for them.
Yeah, right. If we are
looking at this stuff and mistrusting it,
you don't need the Russian
bots because we're doing it to ourselves.
Yeah.
Once the seed is planted,
once the mistrust is there,
we can take it on our own.
Right.
And that's bad.
I mean, yeah, Putin hires, you know,
50 guys in the Ukraine to tweet a couple times
for six months,
and then we freak out for five years, right?
If you're trying to do a disinformation campaign, that's even better for you.
Right.
So right now there's a lot of misinformation about COVID-19.
Yeah.
We do not know if they're coordinated government-run disinformation campaigns.
There's been accusations.
The State Department has said there are.
They have not offered any evidence.
The companies I know and researchers that monitor this haven't seen anything coordinated yet. So we don't know. But the
fact that there could be is already damaging. Yeah. Well, when it comes to fighting back
against disinformation campaigns, do you have a similar optimistic view or what is your
view of our prospects for fighting back against that trend? Because as you said, it's extremely
damaging to our society over the next five years. Where do you think that pushback comes from? The
government? Does it come from these corporations changing their practices? Foreign policy? What is
it? So lots of research and the short answer is nobody knows.
Is it education?
Is it the tech companies taking down
misinformation? But what do you do if the president
tweets it? Does that count?
Does it involve
going after the
organized actors, right? The US
going into the Internet Research Agency
in Russia and shutting it down,
which supposedly we did during the 2018 election cycle.
A few days before we went in and like turned out their light
so they couldn't do what they were doing, right?
Was that effective?
We think so.
But, you know, we really don't know.
Companies changing their practices
is gonna be a big part of it,
but they're not gonna do it
as long as their profit is elsewhere.
So now does the government force companies to do it?
And how do we do that?
And it runs quickly afoul of free speech laws.
So we really have some serious problems of what's effective.
I've seen research that shows priming people for accuracy matters.
But a lot of it comes from the fact that politics is now tribal.
You know, my team good, your team bad.
I'm going to retweet the thing that says your team is bad, regardless of whether it's true.
I don't even care if it's true or not.
Because I'm tweaking your team.
Yeah.
I'm, you know, it's like the truth in sports
about a close call.
You don't care about the truth.
You care about it coming out on your side.
Yeah.
You're not, you don't want accuracy.
You want your side to win.
And that kind of hyper-partisanship
in politics fuels it.
The economics of surveillance capitalism
fuels it.
So we have some really
deep core issues that we have to address before you can do disinformation on top of it.
Maybe think about what Cambridge Analytica did in 2016. If they did it for Kellogg's,
Kellogg's would have gotten an award for savvy marketing. That's what the system allows.
We just didn't want Russia to do it.
But it's really hard to make that distinction.
I want to ask you about some more big security stories in the news that I've seen over the last six months, because I only get to talk to you once every couple of years.
This story where the-
I blame you for this, by the way.
Well, we'll have you on every month.
You're great.
But OK, so the story where the Justice Department was pressuring Apple to, you know, give them access to encrypted iPhone
backups. Right. One of the good security practices that Apple has, I believe, you tell me how good
it is, is they encrypt, you know, what's on everybody's iPhone in a way that makes it very
difficult for anyone to get onto it. But the Justice Department's been pressuring Apple in order to give them a backdoor into this. What's your view on that?
So this is actually an important policy question, and it's being framed wrong.
What the FBI wants is for you to have security unless they have a warrant. And I can't do that.
I can't make security operate differently
in the presence of a certain piece of paper.
So we have a choice here and you have one choice.
Either we make our iPhones insecure with a backdoor,
which means the FBI can get access,
but also the Russian government,
the Chinese government, cyber criminals, everyone else can get access.
Because if there's a special lock for the FBI, then that lock is pickable too.
Right. That lock's available to anybody who knows about it.
Or I can make the iPhone secure, in which case the iPhone in the pocket of every single elected official and judge and police officer and CEO and
nuclear power plant operator is secure. But that phone in the pocket of a criminal and terrorist
is also secure. So this is security versus security. There is security that comes from
the fact that the FBI can open iPhones and there's security
of the fact that iPhones can't be opened by anybody.
Right.
Which one is more important?
Right.
I think, as we move to this internet of things, where computers affect the world in a direct physical manner, we have to adopt a defense-dominant strategy.
Yeah.
Defense has to win here.
Yeah.
It's too important.
Because there's going to be billions of people walking around with iPhones
and the FBI is only going to need to crack
into however many a year.
Like the fact that everyone,
the thing that literally everyone has in their pocket
being more secure,
that is probably the more important thing of the two.
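One simplified way to picture the "special lock is pickable too" point: imagine every device's key is also wrapped under a single escrow key held for lawful access. Whoever ends up holding that escrow key, whether it's the FBI or someone who steals or leaks it, can read everyone's data. This is a toy sketch of the general concept only, not how any real proposal or product is specified.

```python
# Toy model of key escrow: one master key quietly unlocks every device.
# Requires the third-party "cryptography" package.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()      # the hypothetical "warrant-only" master key
escrow = Fernet(escrow_key)

def make_device():
    """Each device gets its own key, but a wrapped copy is kept 'just for warrants'."""
    device_key = Fernet.generate_key()
    return device_key, escrow.encrypt(device_key)

alice_key, alice_wrapped = make_device()
bob_key, bob_wrapped = make_device()
alice_msg = Fernet(alice_key).encrypt(b"medical records")
bob_msg = Fernet(bob_key).encrypt(b"power plant maintenance schedule")

# Anyone holding escrow_key -- legitimate or not -- recovers every device key.
for wrapped, ciphertext in [(alice_wrapped, alice_msg), (bob_wrapped, bob_msg)]:
    recovered = Fernet(escrow.decrypt(wrapped))
    print(recovered.decrypt(ciphertext).decode())
```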
And the FBI has other ways to get at data.
Yeah.
And a lot of this is, I think, bad training.
So the average career length in the FBI is about 20 years.
Then you get your pension and go elsewhere.
20 years ago, 25 years ago, there was no iPhone.
They were getting this data from lots of places.
They were using real-life forensics to collect data.
We all now have an iPhone, and the default is get the data off the phone.
And I think once we deny that and say, look, you can't have that for these really good security reasons.
The FBI needs to get retrained on all of these other investigative techniques.
The data is out there.
There's lots of ways you can get at it, and you'll see it in hearings. You saw it in the Roger Stone indictments. There's a point where we're reading transcripts of a secure messaging app that's encrypted. The FBI can't get at it. And at one point he says, you know, something about the fact that we're secure and the FBI, no one, can read the data. And yet we read the rest of it. They probably got it from the other party in the conversation. They might've gotten it when he put a backup online.
Yeah.
There are lots of ways to get at this data, and just getting the phone
is a lazy way to do it. We need to recognize that the FBI needs much better forensic analysis of electronic evidence.
Yeah.
And we need to provide it for them.
And that's going to take money and training.
So we need to secure the systems because of all the badness resulting from insecurities
and then also train law enforcement to handle a world of these secure systems.
I mean, because we currently live in a world where, hey, Jeff Bezos's phone, right?
One of the most powerful men in tech, like can be devastatingly hacked in a really high
profile way like that.
It seems very clear that, yeah, defense is extremely important here.
So that's another interesting story.
So this is a company called NSO Group.
They're an Israeli cyber weapons arms manufacturer.
They produce products that they sell to countries like Saudi Arabia.
They produce cyber weapons that they sell to countries that you and I don't want to have those cyber weapons, like Kazakhstan.
Okay.
Right?
Like Syria, like Saudi Arabia.
Yeah.
And those countries use those cyber weapons against journalists
and human rights workers and CEOs of large tech firms.
Wow.
Now, this is another issue that we need to wrestle with.
They are making an industry out of insecurity.
And there's an arms race between the companies, the tech companies, and these cyber weapons arms
manufacturers. And whenever Apple figures out, okay, that's what they use, they patch it, and
it might take a year for it to get into the phone in your pocket. But we are forever trying to make
these devices secure.
But now there is this extremely large industry in giving these capabilities to governments
that really shouldn't have them.
And this isn't just like, you know, Donald Trump's one guy sitting on a mattress somewhere
figuring out how to hack something.
This is like these are companies operating in the light of day, just making ways to hack into consumer electronics like this.
That's right.
And then selling them to countries, sold to local law enforcement in the United States and elsewhere.
I mean, this is really arms trafficking.
It's internet arms trafficking, but that's what it is.
Wow.
And we tend to be poor on this planet at regulating arms trafficking. Yeah.
I've noticed. And we're really poor about internet arms. Well, probably most people who could
regulate it don't even know it exists. Like it's a, it's a pretty new concept. And also most people
in positions of power are kind of stupid about stuff like this. Yeah. And then we have countries
that are profiting. I mean, the NSO is an Israeli company. We should be able to pressure Israel to do something.
There's another company that is a German company, an Italian company, a UK company.
These are the cyber weapons arms manufacturers.
There's also dual use products.
There are products that are legitimate corporate products that in the hands of someone else can be used for censorship.
So countries will buy these products designed to ensure corporate secrets don't leave the corporation and use them to ensure that there's no free speech in their country.
Yeah.
This is bad.
This is not what we want to export.
How much should we be worried about our own government developing these hacking technologies?
I mean, there was this story about the, I believe, correct me if I'm wrong, the CIA setting up a front, like sponsoring a company that gave them a backdoor into cryptography around the world.
Can you just fill me in on this and your view on it?
Yeah, so this is a company that, post-World War II, made electromechanical encryption devices.
They look kind of like typewriters with lights on them.
And embassies around the world use them to encrypt diplomatic traffic back and forth.
Many, many countries use these. There was a
company in Switzerland called Crypto AG that provided the equipment for much of the world's
embassies. And through a complicated story, both the German and US governments managed to buy the company and insert backdoors into the products used in embassies around the world.
I mean, like, you know, if you're writing a spy novel, this is a cool subplot.
Yeah.
Right.
And no one would believe it.
But we did it.
And for many decades.
And for many decades.
And, you know, that kind of thing continued into the computer era.
That, you know, in the 80s and 90s, we were, the US was putting back doors in software encryption products. It was limiting the length of keys in exported products in an effort to ensure that they were always eavesdropping on government traffic.
Now, in our world, that's legit, right?
You're allowed to spy in other countries.
No one goes to war on that.
Everyone says, you know, yeah, you're allowed to do that.
Even a couple of years ago,
China breaks into the Office of Personnel Management in the United States, steals the personal data files
of pretty much every American employee
with a security clearance.
And I think it was the head of U.S. Cyber Command who was testifying before Congress.
One of the Congress people asked him about this attack,
and he stopped the person and said, this is not an attack, right?
That's espionage.
That's normal.
That's what we do.
He didn't say that part, but that's what he meant.
Yeah.
Right.
That's normal stuff.
This is part of doing business. Yeah.
That's part of doing business in the modern world. We spy on each other.
Yeah.
You know, so taking out your power grid is bad, but, you know, poking around in your power grid,
that's kind of okay. Unless it's not, in which case it really isn't.
But part of this strikes me that like, so the CIA was selling a security product that had a backdoor to governments around the world.
All those governments are less secure than they thought they were because there's a backdoor.
And like this is degrading security.
And I could imagine this being done for like American domestic security products as well.
Is that the case?
Generally not.
So there are exceptions.
There are times when we see the government
try to put a backdoor in American products.
There was a standard that is used in cryptography
that the government seems to have put a backdoor in.
It wasn't used a lot because it was a weird standard,
but that example is used.
And to be fair, the CIA wasn't selling the products.
The CIA owned the company through several shell corporations and a whole lot of secrecy
that sold the insecure products.
Okay.
To me, it sounds like they sold the product, but that's fine.
Yeah.
I mean, but, you know, technically it was much more, I mean, you know, it was more worthy
of the CIA than that.
It wasn't like CIA brand security products.
Like buy them for your government today.
That would be a bad marketing scheme.
No, of course, they were secretly doing it because they're the CIA.
Right, right, right.
It's like in their name.
It's not, but it should be in their name.
It's secretly in their name.
But the principle, though, is that if the government is able to break into things that we think are secure, that makes the thing less secure, which is bad for us. Am I
right? And that's right. And there's this notion of fragility. So if you think about those old
electromechanical encryption typewriters, if anyone else learned, and we actually don't know,
if another country learned about the backdoor, I mean, CIA had moles working for Moscow.
Yeah.
Did one of them tell Moscow about the backdoor? Maybe.
Right.
You know, did someone else figure it out? Did someone else take the product, reverse engineer it, and say, wait a second, there's a backdoor in this? I wonder if it's true all around the world.
No, wait, it is.
It's very fragile security.
And that's the problem with these
only-we-know-about-the-trick tricks.
They're very fragile.
So I can build a backdoor in Apple iPhones
and like not tell anybody about it,
but that's really fragile.
Once someone learns about it, the jig is up.
And there have been cases, right,
where like NSA security or hacking technology
has like leaked into the wild.
Am I right about this?
There've been a lot of cases recently.
So NSA hacking tools in 2016,
near as we can tell,
the Russians popped
some kind of staging server somewhere,
got a whole bunch
of NSA hacking tools,
and dumped them on the internet. Wow.
And they have been used by other governments and criminals
ever since. A huge freaking disaster.
There's also
a bunch of CIA hacking tools,
and actually there was a trial of a person
who may or may not have been
the person who leaked them to WikiLeaks.
It was a mistrial.
So he was neither convicted nor declared innocent. They might just do the trial again.
So there's a whole lot of
CIA hacking tools that hit the internet
I think the same year, maybe the year after.
It's called Vault 7 if your
listeners want to look it up.
The NSA ones are the Shadow Brokers. You can look them up too. So yes, it turns out we are terrible at keeping our hacking
tools secret. So, you know, not having them would be a detriment to national security,
but having them and leaking them is probably a bigger detriment to national security.
Which just goes to show like how you have to focus on security being the most important thing,
not your ability to break it, right? Right. Defense dominant strategy in all of these debates against attack and defense, defense has to win. It's just too important and too critical.
I mean, 10 years ago, offense could win.
Now the stuff is so essential to national security, it has to work right. You know,
these vulnerabilities can be used to drop our power grid. That would be no fun.
Yeah. Well, we only have a few more minutes. So I do want to ask just because, hey, we're all thinking about COVID-19, the novel coronavirus every minute of every day.
What take do you have on it as a security researcher, security engineer that maybe most folks don't have? How do you look at this? Is there any security perspective on it?
So there is, let's insert all the real medical advice and information by reference. So I don't
have to say it, but assume that that's all true. Right. I look at it very much as a risk management
balance. Yeah. We're again, dealing with individual interest versus group interest,
right? I want everyone to basically stay put for a couple of weeks to flatten out our infection
curve. But, you know, individually, we want to go do the things we want to do. So we're less
likely to do it. Yeah. I got to go do standup shows, man. I'm a comic. That's right. Otherwise you don't make a living and we don't have health
insurance for you. Sorry. So you've got to work. I'm thinking about everyone working from home
and thinking how insecure those home networks are and what kind of vulnerabilities that'll
open up into the corporate networks as we force people to work from home. That'll be interesting
to watch. I'm looking at the disinformation campaigns.
I'm worried about the security of the medical information.
You want to really cause havoc?
Start hacking hospitals and changing medical information.
Don't do that.
I hope that doesn't happen.
Okay, but don't do that.
Right.
So, you know, there is an interesting angle because all this stuff is computerized.
Yeah. But right now it is all about, about medicine and there's a lot we don't know.
So there's a lot of fear. We tend to fill in information we don't know with information
that is fearful. And I worry actually a lot more about the human overreaction than the disease at this point.
Yeah.
You know, I mean, I worry about there being food shortages, not because there are food shortages, but because people are overbuying food.
Yeah.
You know, we don't have enough masks, not because we don't have enough masks, but because people who don't need them are hoarding them.
And it forces, it changes your behavior.
I've talked to friends, especially in New York City, where obviously there's a lot more people and access to goods can be
rough. People say, hey, I'm stockpiling
just because other people are. If I want toilet
paper at all, I have to
go behave this way because that's what everybody
else is doing. It's flying off the shelves.
And that's a race to the bottom.
And then as a group, we all lose
because the individual self-interest
is running contrary to the group interest.
And you see that a lot in cybersecurity.
You see that a lot in human reactions.
And this is why our government coordination has value.
But, you know, we don't have a government that's doing much coordination at this point.
So we're hoping that spring will lessen it.
I think the lessons right now from Singapore and Australia are that it won't.
But we don't know.
You know, we don't actually know a lot about the infection rates and death rates
because we're just so under testing.
We don't have good data yet.
There's really so much ignorance.
So we just have to
prepare for not the worst,
but, you know, on the higher end
of bad and hope we're wrong.
Yeah. But then if we're
wrong, people are going to say, well, that was a waste, which is of course not the way to think
about it, but that's human reaction as well. Do you have a message for folks who are listening
to this and getting a little bit frightened either when it comes to COVID-19 or to security
in general, best practices that you try to tell friends and family members to take
in an elevator, if you've got that much time to talk to them?
All right. Don't lick the doorknobs, I guess is first. You know, in both of those,
it's often the common things we can do that make the difference and the rest of it's out of our
hands. Yeah. Right. For sure. For our computers, make sure you're installing all your patches, have an antivirus
program, make good backups. For the virus, reduce contact, wash your hands, don't touch your face.
If someone has the disease, it's on their hands, so stop shaking hands.
And it's those simple things. There's a lot more that we as society can do,
but for individuals,
it often comes down to basic hygiene.
Yeah.
Whether it's computer hygiene or personal hygiene.
Thank you so much for being on, Bruce.
It's fascinating to talk to you as always.
It's always fun.
Thank you.
Well, thank you once again to Bruce Schneier
for being on the show
I want to thank you folks for listening
and I want to thank our producer Dana Wickens
our engineer Ryan Connor
our researcher Sam Roundman
and Andrew WK for our theme song
you can follow me wherever you like
at Adam Conover
and sign up for my mailing list
and check out my tour dates
at adamconover.net
Thanks so much for listening.
We'll see you again next week.
That was a HeadGum podcast.