THE ADAM BUXTON PODCAST - EP.106 - SHOSHANA ZUBOFF
Episode Date: October 25, 2019

Adam talks with American author and academic Shoshana Zuboff, whose book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power is about how the big tech companies (Google and Facebook especially) are employing increasingly sophisticated methods to watch how we behave online, and using that data not only to sell us things, but also to make predictions about our future behaviour and even the kind of opinions we're likely to have. This predictive data is in some instances then traded to other companies and organisations seeking to gain a financial or a political advantage by knowing how customers, or voters, might behave.

Thanks to Séamus Murphy-Mitchell for production support and Matt Lamont for additional editing.

RELATED LINKS

THE GREAT HACK (VARIETY REVIEW by OWEN GLEIBERMAN)
https://variety.com/2019/film/reviews/the-great-hack-review-cambridge-analytica-1203277059/

THE PERIL AND POTENTIAL OF THE GENERAL DATA PROTECTION REGULATION (FROM CENTRE FOR INTERNATIONAL GOVERNANCE INNOVATION WEBSITE, JULY 2019)
https://www.cigionline.org/articles/peril-and-potential-gdpr?gclid=CjwKCAjwusrtBRBmEiwAGBPgE3kVYBBy06B_MCO2nR74oy1ikeYrQXaq79C6feEQZWhd4NmUwDbrnxoCaJYQAvD_BwE

THE AGE OF SURVEILLANCE CAPITALISM (GUARDIAN REVIEW by JAMES BRIDLE, FEB 2019)
https://www.theguardian.com/books/2019/feb/02/age-of-surveillance-capitalism-shoshana-zuboff-review

TALKING POLITICS PODCAST - THE NIGHTMARE OF SURVEILLANCE CAPITALISM
https://www.talkingpoliticspodcast.com/blog/2019/144-the-nightmare-of-surveillance-capitalism

RECODE DECODE PODCAST - SHOSHANA ZUBOFF
https://www.stitcher.com/podcast/vox/recode-decode/e/58889787?autoplay=true

Hosted on Acast. See acast.com/privacy for more information.
Transcript
I added one more podcast to the giant podcast bin.
Now you have plucked that podcast out and started listening.
I took my microphone and found some human folk.
Then I recorded all the noises while we spoke.
My name is Adam Buxton, I'm a man.
I want you to enjoy this, that's the plan.
Hey, how you doing, podcats?
Adam Buxton here.
Thanks very much indeed for joining me for podcast number 106.
How's things?
I'm all right.
It's a bit dreary out here today in the Norfolk countryside.
We've had a lot of rain this week.
Mud everywhere.
The skies are overcast.
It's a little bit David Blowey as you can probably hear.
But, oh no, the clocks are going back this weekend, aren't they?
27th of October.
So we're going to be plunged into even gloomier gloom.
Sorry, I don't mean to start the podcast on a negative note.
But it is always a bit of a sad time when you lose a whole extra hour of daylight.
But hey, Christmas is the reward at the end of this often stressful period.
Christmas podcast with corn balls, doodle story.
Oh, anyway.
I don't need to tell you all about that now. What I need to tell you about is my conversation with the American author and academic Shoshana Zuboff.
Shoshana, a professor emeritus at the Harvard Business School (I had to look up what professor emeritus means; it means a retired professor), is the author of The Age of Surveillance Capitalism,
The Fight for a Human Future at the New Frontier of Power. It's a big book about big things,
about how the big tech companies, particularly but not exclusively Google and Facebook,
are using increasingly sophisticated methods to watch how we behave online and about how they use that data
not only to sell us things more efficiently,
which sometimes can be quite convenient,
but also about how they can use that data
to make predictions about how we might behave
and even the kind of opinions we might have.
This predictive data is, in some instances,
then traded to other companies and organisations
seeking to gain a financial or political advantage by knowing how customers or voters
might behave. Well, so what, you might think. That's just a clever modern way of getting to
know your customer. But apart from the fact that this data gathering and profiling
is usually conducted in a sneaky and underhand way,
what the Cambridge Analytica scandal demonstrated, for example,
is that the data can then be used to do all sorts of things
which we definitely didn't sign up for.
For example, influencing elections.
Now, you may feel that your plate of doom is already full
and you don't have room for one more thing to worry about,
and I heartily sympathise.
I personally find it hard to get worked up about things like targeted advertising,
especially when it's often quite crap.
For example, I bought some maracas the other day online
because I'm reforming the best band in the world quite soon with my pals. And now, after I bought the maracas, every time I open a browser window there's an ad for the exact same pair of maracas, as if I'm going to think, oh brilliant, yes, I need some more maracas, because then I can have maracas all over the house in case it's suddenly very important to do this.
I mean, that is important. Don't get me wrong.
But does anyone need more than one pair of maracas?
Other than, I don't know, bears.
But the thing is, that shit attempt to sell me more maracas
has profound and disturbing implications
for the kind of lives that we will end up living in the future
if companies are allowed to carry on doing whatever they want
with the information trails that we leave online.
However, I hope if you listen to my conversation with Shoshana,
which was recorded
earlier this year, 2019, in the offices of her UK publishers in London, I hope the message you'll
take away is not, oh great, there's another way that we're all fucked. But instead, here's the
thing that it's important for us all to be aware of. And it's a conversation that should
spread as wide as possible throughout society, so that businesses and corporations using this
technology feel obliged to do so more responsibly. That's the idea, in my mind anyway. Now Shoshana
is used to being interviewed by people who are, and let's be positive about this, even more intelligent and articulate than I am.
So I began by explaining a little bit about this podcast and what she could expect.
Also, as you will hear, we were interrupted at one point by someone bringing Shoshana a chicken salad.
Apparently she hadn't eaten for about three days as she was giving a round of interviews before flying back to the US.
And I was very grateful that she agreed to be on the podcast,
so I let her eat her salad, which I think was nice.
Of me, that is. I don't know what the salad was like.
It looked fine.
Back at the end for a little more solo waffle
But right now, here we go!
Ramble Chat
Let's have a Ramble Chat
We'll focus first on this
Then concentrate on that
Come on, let's chew the fat
And have a Ramble Chat
Put on your conversation coat and find your talking hat.
Bye.
So set me up a little bit.
Yes.
About your podcast.
So the podcast, I don't know if, I mean, my background is comedy.
And the podcast, sometimes it's with comedians.
Other times it's with writers.
Sometimes they're quite funny, sometimes they're more serious. You know, I talk to Maya Foa, who's the director of Reprieve, who represent people on death row and in Guantanamo and lobby for human rights all over the world. So that's quite a serious one, I'll say. But then I'll talk to Charlie Brooker, who does Black Mirror. Have you ever seen Black Mirror?
Of course I have.
Yeah. Okay, so Charlie's a good fusion of serious and stupid, you know. Like his main impulse is just to be silly about things,
but there's quite a serious underlying set of concerns beneath that.
Yeah.
So there you go.
That's too much information for you.
No, not at all.
I'm fascinated.
I'm probably not the funniest person in the world, which is just because I've always been that way.
I mean, even as a little girl, I was very serious.
Yeah, were you?
Yeah.
But that doesn't mean I'm not game. I'm game for anything, so wherever you want to go, it's fine.
All right. So I'm really interested in how you caught on to the book. And, well... you read it?
Yes, I have read it in its entirety and listened to the audiobook.
Well done.
Do you know the audiobook? Are you happy with it?
I actually have not been able to bring myself to listen to it. I was really upset that they didn't ask me to record it, actually.
Oh, really?
Yes.
Why didn't they give you the option?
I don't know. I don't know. That would have been great.
Have you got any annoying vocal tics?
Not that I'm aware of. You'll have to tell me when we're done.
Yeah
Have you spoken to sort of students, young adults about the book?
Yes, yes
And what sort of responses do you get from them?
Well, it's interesting because so often we hear
Oh, you know, the young people, the millennials, kids today
They've grown up with this, they don't care about it, and so on and so forth. I find that to be not true at all. First of all, just from a point of view of data, if you look at the surveys, you see that, you know, people, especially I would say between 16 or 17 and 25 to 30, are very concerned about what's going on, very switched on to it. They're the ones who are spending so much time thinking about, how do we hide from this, you know, how do we camouflage ourselves, how do we disguise ourselves.
Yes, my son was the person that taught me about VPNs.
VPNs. So these are forms of resistance that are very individualized, you know, where each one of us ends up searching for these modes and methods of hiding in our own lives.
What is a VPN, for people that don't know?
It's a virtual private network, so that you're on a line where ostensibly there's no access. There are some of them that will scramble your location on a random basis, so you can't tell what part of the world you're in. And then there are other programs you can get that will constantly block trackers, and some that will create random patterns in your browsing, so that you throw off the predictive analysis, because there's no obvious pattern in your browsing.
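To make the "random patterns in your browsing" idea concrete, here is a minimal sketch of the kind of decoy-traffic tool being described; browser extensions such as TrackMeNot work on this principle. The search endpoint, the topic list and the timings below are illustrative assumptions, not a recommendation of any particular service.

```python
# Illustrative sketch only: periodically issue throwaway searches on unrelated
# topics so that anyone profiling the traffic sees noise rather than a clean
# behavioural pattern. Endpoint, topics and timings are assumptions.
import random
import time
import urllib.parse
import urllib.request

DECOY_TOPICS = [
    "maracas", "ent migration patterns", "model t restoration",
    "gutenberg printing press", "maine lighthouse weather",
]

def decoy_search(topic: str) -> None:
    """Fetch one search results page for an unrelated topic and discard it."""
    url = "https://duckduckgo.com/html/?q=" + urllib.parse.quote(topic)
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        resp.read()  # only the request pattern matters, not the page content

if __name__ == "__main__":
    while True:
        decoy_search(random.choice(DECOY_TOPICS))
        time.sleep(random.uniform(60, 600))  # irregular gaps, no obvious rhythm
```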
Have you ever looked into the so-called incognito window on Google Chrome?
Are you familiar with that?
I am familiar with it, and don't quote me on this,
but quite recently, and I haven't had time to look into it
in the depth that I normally do to say something in public,
but quite recently there's been a whole bunch of new forensic exposés of Incognito on Chrome, and the idea that Incognito is not incognito, and Chrome is in fact a privacy-leaking, not a privacy-protecting, tool. So, you know, people have to be real careful, and you've got to read. But the bottom line for me in all of this, Adam, is that I resent this kind of thing very much. People should not be required to hide in their own lives.
That this is, in fact, intolerable.
And the fact that we spend so much time thinking about how to camouflage ourselves
is one of the early warning signs of great iniquity,
of great injustice in the kind of society that we are allowing to be built around us.
We should have freedom in our lives.
We should neither have to hide from the forces of the market, nor should we have to hide from the forces of the state.
And the fact that so much of our young people's mental real estate, let alone the creativity and artistry of our most avant-garde painters and poets and filmmakers, are going into these themes of how do we hide. Young people are
writing their doctoral theses about how to scramble your identity and camouflage yourself. And the artistic
vanguard is preoccupied with this. This is robbing us as humanity of what the real creativity,
you know, should be about. It should be about how we expand and how we explore and how we adventure, not how we hide. So I find this to be a very disturbing sign that things are not okay.
Speaking of unwanted invasion, hang on one second, we'll just pause. It's lunchtime. What have you got?
I need a protein hit.
Yeah, absolutely.
Nice. Would you like some?
I'm all right, actually.
Thank you.
I didn't eat dinner last night, and I didn't eat lunch yesterday.
Yeah.
And I didn't eat breakfast today.
Come on.
You're not looking after yourself, Shoshana.
It's a chronic problem.
You're prioritizing your commitments.
I do that.
My kids are on the warpath with me.
Are they?
Mm-hmm. I'm sure you've had an exhausting year.
I can hear them in my head: Mama, eat.
Where do you live in the States?
I live in Maine.
Oh right, beautiful. And is that where your house was, that got immolated?
Immolated. Unbelievable.
So, the book, you tell this story,
and it's a way of you drawing an analogy between where we're at now with the advance of surveillance capitalism.
And you talk about an incident when your house was struck by lightning.
Yes.
And a fire started.
Smoke was coming through the house.
You felt as if you had probably a few minutes just to close a few doors to stop the fire spreading.
Take out a few of your most valued possessions onto the landing, the photo albums you grabbed.
But within minutes, a fire marshal was in, grabbing you by the shoulder and saying,
You've got to get out now.
And a couple of minutes later, you were watching your house burn to the ground.
Explode.
Explode.
Why did it explode?
The lightning went into a wall and traveled down, and I think it found the oil tank.
Right.
That was in a machine room in the lower level of the house.
Wow. And that precipitated an explosion and everything went from there.
Worst case scenario, lightning strike.
Lightning strike right into our kitchen.
Whoa.
Did the insurance cover that?
The insurance was actually... I was, you know, being a critical scholar. And, you know, I was all prepared for the insurance to be diabolical, because usually we hear those horrible stories. Actually, our insurance people were angels, and they took care of us, and they were more than fair. And it was Chubb insurance, folks. I can't say enough good things about them. And I'm a very critical person, so, uh, these were very good, kind people with tremendous integrity, and kept their contracts.
Wow. A positive insurance story, how often do you hear that?
Yeah. So that's a silver lining,
but obviously in so many other ways that must have been completely devastating.
Well, it was devastating, Adam.
And the reason that I tell this unusual story in the book is because I learned so many things from the fire.
When you go through a traumatic experience like that, there's so much to learn. But one of the things I learned was how completely unprepared our minds are for events that are unprecedented.
That literally it's impossible, nearly impossible, very difficult to recognize discontinuity.
I literally could not perceive that I was in the midst of an existential threat.
That within moments, the entire building and the home that was the soul of that building.
How long did you live there?
And there's a home.
At that point, we had lived there for, oh, about 15 years.
And, you know, we were married there and our children were
bat mitzvahed there. You expected to live the rest of your life there?
We had, the house was, all of my parents and grandparents' things were there and my husband's
grandparents' and parents' things were there. And literally it was a shrine to our family. We homeschooled our children.
We had nearly 20,000 books in the house.
All of my scholarly work, all the books in progress that I was writing.
We made music in our home. The old Steinway that had been in the family since like 1908.
It was really our home in the deepest possible way. Everything we did centered around that place. It was inconceivable that it wouldn't exist, and yet literally four hours after that lightning strike there was only ash and char. You couldn't even tell where the refrigerator or the ovens had been. So I learned from that something about how our perception works, how bonded we are to the norm, to the way things are, and how our view of the future always proceeds from how things are now. I've changed in the way I think about life.
I think about life now as every moment is a coin toss, that the unexpected will happen
and can happen at any moment. So I hold the future lightly. I have minimal expectations. I approach each moment with a very open mind and a kind of ability to pivot, emotionally and physically, to whatever's happening now.
And you trace that back to the house burning?
I trace it directly back to that, because it was my belief in continuity that made me ineffective.
I was closing doors to rooms that would no longer exist.
I was salvaging photo albums on porches that were about to be extinguished.
So I was completely ineffective at the exact moment when I thought I was being so clever.
Right.
And I write about that in the book in part of the introduction because I feel like this is the kind of moment we're living through now.
What's lighting society on fire is an economic logic that is transforming our lives largely outside of our awareness,
without our ability to notice, therefore without our ability to resist or combat.
Any thought of consent is just laughable, because you can't consent to something that is
designed to be hidden, which is, ergo, the "surveillance" in surveillance capitalism, if you will.
Yes. You know, I mean, for most people, certainly in the EU, when you're online, consent is represented by the annoying pop-ups you now get when you visit a website, since the General Data Protection Regulation came in in 2018.
You know, you have to, before you read anything, you have to click this thing that says: in support of our communities, we and our third-party partners set cookies to deliver personalized content and ads. By continuing to use this forum, including clicking the "OK, understood" button below, you consent to the use of the collected data and cookies on this site. And most people, myself included, don't go any further than
that. I mean, we wouldn't have even read that. We would have just clicked OK, because you want to
read the thing that you went there to read. And you think, OK, they have to flash this thing up now because of the GDPR that came into effect in 2018.
But, you know, for most people, they don't have the time or the inclination to dig deep into what that actually signifies.
It's an inconvenience that you brush aside and you don't really think about? I have nothing but respect and awe
for the women and men who worked on GDPR for so many years.
They've really pushed the frontier of privacy law
and it's very, very important.
And so it is ironic that after all of that work,
we're still in this kind of kabuki situation where, you know, I have a minute, you know, I'm online for just a minute, I need to read this thing or sign this thing, or get this piece of information, or get the health results from my doctor's office, or get my kids' homework from the school, or whatever it might be. And here we are confronted with the message that you just described. And just as when we had the
notice and consent designed by the companies, we're in the same kabuki. We're just checking
agree because we've got to get on with it. We don't have time. And it's not that we're stupid.
It's not that we're compliant.
It's not that we like it.
It's not that we fundamentally agree with it.
But we just have to get on with it.
And we really have no choice
because any other site you go for
is going to have the same kind of little box to tick.
So for all of their brilliant work, on an everyday basis we really haven't proceeded that far.
If anything, you might imagine that for some companies it's made them more brazen, because they now have this get-out clause of flashing this consent form up.
Yes, which legitimates them.
Yeah, it's a kind of ethics washing. So they could build in some more outrageous stuff into the terms and conditions that most people would never see.
You'd never see it.
So that is very disturbing and disappointing. Despite the importance of antitrust law, those are legislative constructions that really come from the 20th century and come from a different technological era. What we need now are specific forms of regulation and law that are designed to bite on the unique and unprecedented mechanisms and methods of surveillance capitalism. And it's, again, it's that challenge of being able
to cognize and reckon with what is unprecedented and to take that on board and realize that,
hmm, data ownership isn't going to get us there, or, quote, breaking them up isn't going to get
us there. We need to outlaw the new things that it's doing that we even still barely understand. And so that's a 21st century challenge that we have now,
how we build on the past, but in a way that addresses an unprecedented, audacious economic
logic that has unilaterally, without asking any kind of permission, claimed our private human experience
as its free raw material for turning into behavioral data, which it then says it owns,
which it then computes, out of which it makes products, computational products, which are
actually predictions of our behavior. So once it says it can take our
experience, turn it into data, and it claims ownership of the data, and it sends it to its
computers, and it turns out these products, well, then it gets to sell the products.
So now it's selling business customers predictions of what we will do now, soon, and later.
And these turn out to be extremely lucrative markets.
These are markets that trade exclusively in human futures.
Digital futures, you refer to them as. That's right.
I mean, just think, we have markets that trade in pork belly futures,
markets that trade in oil futures.
Now we have markets, and they are becoming the dominant markets,
that trade in human futures. And what I've learned from my work on this book is that
as you reverse engineer the competitive dynamics of those markets, you discover their pernicious consequences and how they undermine democracy in a variety of
ways that are deeply, deeply troubling. Before we talk about some of those consequences. Yes, sir.
Because I think that's the thing for a lot of people is that they're like, well,
have you seen The Great Hack? Oh, I have. In fact, I've been asked by the producers quite a few times to
appear at the screenings to talk about the film afterward. For those who haven't seen it,
The Great Hack is a documentary centered mainly around the Cambridge Analytica scandal. And I'm
going to read a review from The Hollywood Reporter,
a section of the review from Owen Gleiberman.
And it was a sort of mixed review.
I think he enjoyed the way it was put together and produced,
but not entirely convinced by the tone of it
and felt that it was sort of overdramatic and overparanoid in some ways
and had a problem with some of the central assumptions that the doc was based on.
So he says,
Much of the data mined and analyzed by Cambridge Analytica was in fact public.
After all, social media is about declaring who you are in a public forum.
Gathering that data and forming profiles out of it isn't illegal.
The liberal dream of the connected world was always a starry-eyed fantasy, one created and
sold by technological capitalists like Steve Jobs, who marketed the dream as their own seductive form
of propaganda. What the Trump campaign did and is continuing to do
is not a violation of that dream.
In some horrible way, it's a fulfillment of it.
The reality is that once we agreed as individuals
to plug our lives and identities
into a social network of information bombardment,
we left ourselves open to being propagandized and manipulated.
The way I read that is that he's saying this is the reality of a world we bought into and got
excited by. And it is an aspect of capitalism in the modern world. And, you know, to act as if you're scandalized and outraged and frightened by it
is in some way disingenuous. So how do you feel about that sort of perspective, which I've heard
sort of mentioned by a couple of people about that film specifically, but more in general about
the idea of data and ownership? Well, I would say that Owen doesn't yet understand
how surveillance capitalism works.
And if he did, I think his assessment here would be different,
or at least it should be different.
So let's talk about data.
Owen's criticism suggests that the data
that the surveillance capitalists have about us is
simply the data that we give them. That is not correct. What people don't understand,
and there are good reasons not to understand it, because it's hidden. It is engineered to be
hidden and secretive and keep us in ignorance.
But essentially what's happening is that we give a little bit.
What they have amassed on us is gigantic compared to what we give.
What they're really looking for are predictive signals, the qualities of data that allow them to lift
highly predictive signals of our behavior. So let me give you an example. I may be on Facebook and
my four-year-old daughter has a birthday party and I want to share the pictures with my
friends and family. And so I post pictures on Facebook to with my friends and family.
And so I post pictures on Facebook to share with friends and family.
That's what I'm doing.
And my expectation is that they will be shared with friends and family only.
Well, at some point, Facebook decided that it was going to develop the most advanced facial recognition
systems on earth. The way that Facebook folks describe their data flows of faces
is with the words nearly infinite. So you've got a couple billion people.
Everybody's uploading photos.
I think my photos are just going to you and to her and to him.
But at some point, especially when they started asking us to tag the photos,
they are ingesting all of these photos
for building, training artificial intelligence systems
that can recognize faces in the wild with a high degree of accuracy.
Right, to create a massive data set.
A massive data set that allows their system to recognize faces.
And it's not just recognizing the identity of the face.
The real interest in the face is there are all these tiny, tiny muscles in the face
that can combine into hundreds of different facial gestures. And now there's something
called affective computing, invented at MIT.
Affective computing are programs that can take a face and can compute very finely grained emotions that the face is displaying.
And it turns out that emotions are incredibly powerful predictors of future behavior.
So Facebook now has the most powerful or one of the most powerful and accurate facial recognition systems ever conceived
based on all of the photos that we innocently uploaded,
thinking that I'm sharing them with Mom and Dad
and Aunt Helen and Uncle Sam and so on and so forth.
And this now becomes a source of incredible predictive power.
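For a concrete sense of what "affective computing" on a single photo can look like from the outside, here is a minimal sketch. It uses the open-source DeepFace library purely as a stand-in example; it is not Facebook's system, and the file name and the idea of treating the dominant emotion as a behavioural "signal" are illustrative assumptions only.

```python
# Illustrative sketch only: estimate emotion scores from one face photo with
# the open-source DeepFace library (pip install deepface). A toy stand-in for
# the far larger, hidden pipelines described in the conversation.
from deepface import DeepFace

def emotion_signal(image_path: str) -> dict:
    """Return DeepFace's per-emotion scores for the face in one photo."""
    result = DeepFace.analyze(img_path=image_path, actions=["emotion"])
    face = result[0] if isinstance(result, list) else result  # newer versions return a list
    return face["emotion"]  # e.g. {"happy": 71.2, "neutral": 20.4, ...}

if __name__ == "__main__":
    scores = emotion_signal("birthday_party_photo.jpg")  # hypothetical file name
    dominant = max(scores, key=scores.get)
    print(f"Dominant emotion: {dominant} ({scores[dominant]:.1f}%)")
```

The point of the sketch is the shape of the operation: a photo uploaded for one purpose can, with off-the-shelf tools, be turned into fine-grained emotional measurements, which is the kind of predictive signal being described.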
And their justification for it
is that it will increase efficiency for apps and services that people want to sign up to and use, presumably?
Well, I'm not sure that they really have a justification for it. I mean,
the justification for it is these are the products that we sell to our business customers.
And aren't those targeted ads? Wonderful. And we now know that these same kinds of operations, you know, have traveled far beyond online targeted advertising markets.
We're seeing them not only as the default model in the tech sector, but in every part of the economy: retail, health, education, real estate, transportation, all the way back to basic product manufacturing and service creation. Everybody is trying to get into this act because
what I call the surveillance dividend, that margin that comes from selling people data
that predicts what folks are going to do, that's a really important margin that
companies no longer feel that they can ignore. Because in a world of globalized competition,
commodification, you know, it's really hard to get a value-added margin on your product or service.
And now the surveillance dividend is the way companies are doing it, including, ironically, Adam, all the way kind of full circle to the early 20th century,
the birthplace of mass production, which is a very different kind of capitalism.
We were actually giving people, you know, in the Ford Motor Company,
a Model T that farmers and shopkeepers found had tremendous value for them
to allow them to live their lives better and more effectively. Well, now the CEO of Ford Motor says,
hey, folks aren't buying automobiles around the world the way they used to.
Well, yeah, and that's not going to come back anytime soon.
And so we would have hoped that he would say,
well, this is our opportunity to put all of our creativity and our capital and our science into building a car that doesn't burn carbon.
Wouldn't that be wonderful?
That would be alignment with the real needs of populations.
But that's not what he says.
What he says is, we want market capitalizations like Google and Facebook.
We want to attract investors like they do.
I've got a good idea.
There are 100 million people driving Ford vehicles.
So let's stream data from all of those drivers.
And then we can combine it with the data that we have in Ford Credit,
where he says we already know everything about you.
Now we have data sets comparable to Google and Facebook.
What investor would not want to put their money in with Ford Motor?
Right.
Right.
So instead of chasing something
that people and the world
really, really need,
he's chasing the surveillance dividend
to drive his stock price
and his market capitalization.
Right?
This is why I call surveillance capitalism
a rogue mutation of capitalism. It is about us, but it is not for
us. But then could you not say that about capitalism throughout the ages? It's just a new
tool in capitalism's toolbox. And the idea of profiling people in that way certainly has been part of the advertising world for about
100 years, breaking them down into demographics and selling to them on that basis. It's always
been a kind of distasteful part of advertising that most people would lampoon, but it's a reality
of it that people are aware of. So I suppose some of the people involved with those efforts to manipulate data
in the modern world would say it's just an extension of that way of being part of the
world of capitalism. Well, you could indeed say that. But if you were to say that,
then you would be making the same mistake that I made, closing doors to rooms that did not exist,
and throwing my photo albums out on porches that were about to go up in flames, it would be a failure to recognize the unprecedented.
So let me put it this way. It's true that capitalism has long evolved by taking things
that live outside the market dynamic and bringing them into the market dynamic unilaterally,
turning them into commodities that can be sold and purchased.
Famously, industrial capitalism took nature
and its various innocent manifestations
and brought it into the market dynamic
to be sold and purchased for the gain of the capitalist.
And it took us a very long time, and we had to develop a lot of science
before we realized that those operations were destroying our planet.
Well, surveillance capitalism does have this continuity.
It's also taking something that lives outside the market and dragging it into the market, turning it into a commodity.
But now with a dark and I would say startling twist.
And what it's claiming for the marketplace is private human experience.
It's like capitalism has colonized and transformed everything.
It's even in space.
And it's like the roving eye of capitalism looking around for the last virgin wood. And it turns out that the last virgin wood is us, is our lives, our behavior, our feelings, our futures. That's the last frontier where there's an opportunity to create a margin,
to create a value that somebody will pay extra for,
and that value is predicting what we will do.
Not for our sake, but for the sake of business customers
who want to exploit that information for their private gain.
This means that just as nature was transformed by industrial capitalism,
human nature will be transformed by surveillance capitalism,
should it be allowed to stand, which I don't believe it will be, but should it be allowed to stand, we are necessarily transforming human nature. Because, look, what are these folks
really selling when they're selling predictions of our future? They're selling the promise of
certainty to their business customers. Certainty, as everybody knows,
you just have to think about it for a moment.
Certainty is the opposite of freedom.
The more certainty you have,
the more you have to extinguish freedom
because free will is a wild card
and that can really undermine certainty.
We live in democratic societies
that cherish freedom, freedom of will.
And therefore, we've decided to live with uncertainty. And we have various ways of
living with uncertainty. We have something called trust. We trust each other. So when I'm in my car
and you're in your car, I come to a green light.
I trust that you're going to stop at the red light.
Without that trust, our society could not exist.
We trust each other.
And we also have contracts, which is a way of formalizing that trust.
And we have rules and norms and values and laws.
And all these things allow us to live together in freedom with uncertainty.
Well, what surveillance capitalism wants to do is produce certainty because it's lucrative.
So it turns out that if you're selling certainty and you're in a competitive marketplace,
there's some things
that you have to do in order to compete for the best predictions, the ones that most approximate
certainty. One thing is you need a heck of a lot of data. Everybody knows artificial intelligence
wants to feed on the maximum amount of data. So you're heading toward totality of data,
economies of scale. And then you discover, well, we really need varieties of data. You know, not just what folks are doing online, but we want them out in the world. We want their location, we want their voice, we want their posture, we want their gait, we want what they're eating, what they're buying, who they're talking to. We want their cars, we want their homes, we want their bloodstream, we want their brain waves. We need varieties.
Emotions, their faces.
Yes.
It's like, the way you're describing it,
it makes me think of the first time you saw CG versions of human beings in films, like the early Pixar films, and you used to think, wow, look at that, that's cool, that looks like a person. But now, if you compare those early Pixar versions of human beings to what CG human beings look like, it's just mind-blowing. You know, they've got all the detail of the hair, and the way that skin is luminescent, and the way...
That's a great analogy. Absolutely, that's a brilliant analogy. And so that's what's happening with the data.
So all of these data are created, you know, not only to profile us, but to profile people like
us, you know, to create these predictive patterns over scale, over a population. It could be
people who live where I live, or people people of my sex or people who have other characteristics that I have.
Or it could be people in my city or people in my country.
And ultimately, in this competitive march, so we've got economies of scale, lots of data.
We've got varieties of data, which is economies of scope, these varieties.
And by the way, aided and abetted by that
supercomputer that you keep in your pocket, and all of the apps that you download on it,
which can take all sorts of things that turn out to be lusciously predictive. And then,
as competition rages on, there is a new insight, and that is that the most predictive signals come from actually learning how to nudge and shape and modify behavior in real time, so that we can kind of push people in the direction that we want them to go, directions that satisfy our customers' outcomes.
Right, which was seen so boldly with the Cambridge Analytica thing
and the bombardment of Trump voters.
Which is the perfect experiment in that, absolutely.
The Trump voters got all their anti-Hillary videos,
and that turned out to be something that was seeded by Cambridge Analytica.
What they did in Cambridge Analytica was they used highly predictive data
that they were able to extract from this fake personality test
that their contact, Alexander Kogan, put online as an app.
And he was already known to Facebook. He already had good relations in Facebook. So they let him in there and they let him extract these data that
were known for their rich predictive signals because they could predict personality. Facebook
also allowed Kogan to take not only data about the people who filled out that personality profile,
but the network of all their friends. So this was not supposed to be able to happen, but it did.
Also, applications developers aren't supposed to sell their data, and they're not supposed to sell
it for nefarious purposes. He did both.
All right, so right away, they're doing something that is not supposed to be done,
something that we did not sign up for, something that is hidden from us. So going back to Owen
Gleiberman's thesis, his thesis is incorrect, because this was not just the normal activity that we think we're participating in on a site like Facebook.
It's a shadow operation, a highly detailed, lucrative, highly capitalized shadow operation, which is intentionally engineered to be hidden from us.
operation, which is intentionally engineered to be hidden from us. In fact, when the Facebook researchers write about these things, they celebrate the fact that they are engineered to
bypass user awareness. We have no way of detecting that they're there or what they're doing.
So all Cambridge Analytica did was, you know, draw the data from the host, surveillance capitalism data, draw the
techniques, the methods and mechanisms from the host, all invented under surveillance capitalism,
and draw the opportunity from the host. Because what does the host care about? The host cares
about engagement. It wants you on there as much as possible, giving as much stuff, interfacing as
much as they can possibly get you to interface, so that they've got more raw material from which
to draw this behavioral surplus. So Cambridge Analytica simply repurposed this whole shadow
operation, pivoted it a couple of degrees to achieve political outcomes and influence
rather than commercial outcomes. Yeah, so it was successful initially for Ted Cruz's campaign and
then when Trump won the Republican nomination for Trump and supposedly it was a part of the
Brexit campaign as well. And of course, the real pity here, and what I wish
Owen would take his fine mind and get into more deeply, is that we don't even know yet
the full extent of what it did and what it did or did not accomplish. And that's really where
the frontier is right now, of how we demand that our governments demand from Facebook and
whatever other companies might be implicated, so that we are able to do the forensic analyses
to truly understand what did or did not occur, and what was or was not accomplished.
The strange thing is that this is all happening in a world that initially started out to be very optimistic at the dawn of the net age,
when people were feeling that at last they would be empowered and that they would be sort of actualized individuals
who were able to take control over companies and be independent of the wishes of those companies in a way that they never had been before, and they would be able to interact with each other in a far more transparent and honest way, and a direct way. And then people became...
Something happened. A funny thing happened on the way to the forum.
Yeah. But even though people are much more cynical about that overall, I would say,
I don't think that many people believe that the internet is now this unregulated wonderland.
Evidently, it's not.
But still, there is a sense that's hung over from the whole thing that,
overall, it is a positive place to interact with your fellow humans.
Even to the extent that I was watching a film the other day called Late Night.
A comedy.
Oh,
with me.
I saw a bunch of that on the flight over.
Right.
Yeah.
Emma Thompson.
Yes.
And she,
Emma Thompson is this British talk show host in the States.
A kind of grouchy...
Grouchy.
Yeah, grouchy.
The implication is that she's lost her edge.
She was once this legendary, well-loved,
Letterman-style talk show host,
but her best years are behind her
and she's become set in her ways.
And Mindy Kaling is this writer who comes along
and she ends up being the first female on her writing staff.
And she kind of is a breath of fresh air in all sorts of ways.
But part of what Mindy Kaling's character says to Emma Thompson is that you don't interact online.
There's no viral clips.
There's no interaction on social media.
And that makes people think that you're being superior, that you're above them. So that, I feel, is the mindset. And you know, as a comedian, as a performer or whatever, I'm occasionally asked by people to tweet things, to tell people about shows I'm doing or whatever, and if I express any reluctance whatsoever, or just say that I can't be bothered, then it's like, well, what are you doing? You're self-sabotaging. That's the modern world, that's how we... you know, what, are you above it all? It's a strange position now to be in. When you're talking about the colonization of your interior private space and your mind, there is now that mindset, that strange mindset that I think makes people feel as if opting out is somehow antisocial. That actually,
it's great for the surveillance capitalists, because there is this feeling, I think, that
pervades more and more that you should be part of all this, you should be engaging online.
And what's your problem if you're not? It's a sort of weird position that you're in. I mean, even, you know, you're an author, you want people to buy your book, you're on Twitter, you have to engage to that degree, don't you? But in the book, you're not advocating total disengagement. You're not saying that the technology itself is the problem. It's just the way that it's used.
What I'm saying is that they have sold us a bill of goods. They have claimed our experience, turned it into behavioral predictions, and amassed these unprecedented concentrations of
knowledge about us and the power that goes with that knowledge to increasingly affect our behavior
and even control our behavior. And even when I say these words today, I say, oh, is somebody
going to think I'm paranoid or some, you know, like out there extremist?
But honestly, folks, I've been studying this so deeply for so many years.
And these are reluctant conclusions that I come to.
And they are not melodramatic.
But the thing is, they have tried to make us believe that these are inevitable consequences of digital
technology. And that's where I really get angry. Because I believe that the digital does have
tremendous democratizing and empowering capabilities. And that as 21st century citizens,
we as individuals and our societies overall,
we need and we deserve these capabilities.
We need so-called big data to solve the climate crisis,
to cure disease, to educate every person on earth,
to make sure no one goes hungry, to eliminate plastics
in the ocean and the arctic snow. I mean there are so many urgent things for which we need
these technologies on a global scale to radically improve our lives and our future possibilities, but also just on a personal
level. We should be able to learn what we want to learn and get the kids' grades from the school's
technology platform, get my test results from my doctor's office, talk to my friends and family online without having to march through, a forced march through,
surveillance capitalism supply chains, where I have to make myself available for their extraction operations
just in order to be effective in the most mundane circumstances of my daily life.
This is unacceptable, Adam. This is intolerable. We deserve the digital. It belongs to our society and it belongs to democracy.
The digital century was supposed to be Gutenberg on steroids, the most freedom and empowerment and democratization ever.
Which Gutenberg is that? It's not Steve.
Not Steve.
The original Gutenberg who gave us the printing press and the Bible
that put the prayers in the hands of the people,
and we were no longer mediated by the priests.
The good news here, may I say something about the good news?
Please.
The good news is that everything we've been discussing is only about 20 years old.
The surveillance capitalists like to deride democracy. And they say democracy is slow,
implying that democracy is stupid. And law only gets in the way of innovation.
That's another part of their rhetoric.
That is rhetoric designed to allow them to continue to operate in lawless space.
The truth is, democracy is slow because it should be, because it requires deliberation.
You know what I think of when I hear that put down from them?
I think of that part of Tolkien in Lord of the Rings with the Ents,
who are the big old trees who have to deliberate as to whether or not
they're going to gather themselves and go to war against Sauron.
Right.
And, you know, Pippin and, who's Pippin's colleague there?
Anyway, they're, you know, they're so...
Merry.
Pippin and Merry are so impatient with the Ents.
Yes.
Because they take such a long time and they speak so slowly.
And, you know, we've got to mobilize and we've got to be fast.
Yeah.
But the point is that once they have their discussion
and they come to consensus and they mobilize,
they move on Sauron and they are successful.
They transform the landscape.
They destroy the enemy in a way that is utterly irreversible, right? And this is
democracy. This is how democracy proceeds. So surveillance capitalism is only 20 years old.
Thanks to Cambridge Analytica, thanks to the work of the courageous journalists who brought those
stories to light, and the whistleblower, and everything we've been
able to learn about it, that has alerted us to this even deeper and larger landscape
of surveillance capitalism. I'm certainly hoping my book makes a contribution. And there's other
work that's making a contribution to this. So now we're getting informed, we're waking up.
And as I go around the world and talk
to large groups of people everywhere I go, people are getting switched on about this. People are
saying this is not okay. We're worried about freedom and control and manipulation. What do we
do? It is our work now to speak out, to come together in groups.
One of the worst mistakes we can make is to think that privacy is private. A society that cherishes privacy is a society that cherishes freedom.
A society that destroys privacy is a society that chooses certainty over freedom. So privacy is a collective
action problem, and it determines what kind of society we will live in, whether or not it is a
free society. So we have to mobilize ourselves in new forms of collective action. Just like a
century ago, people mobilized in unions, and they got the right to bargain
collectively, and they got the right to strike. Those were new forms of collective action
for those threats. Now we're going to have to discover the new forms of collective action
for these threats. And we have to mobilize our lawmakers. Because above all, there is one thing that surveillance capitalists fear.
Above all else, they fear law. They fear democracy. And that's why they put it down.
They try to make us feel small and impotent. But we are big and we are powerful. So it's now our turn to mobilize our lawmakers, to mobilize our democratic institutions,
and to insist on law. And on a practical level, like someone listening to this,
who's maybe never even thought about all this stuff before, what do you say to people about
how they change their behavior in the short term? Or whether it's useful to change their behavior in the short term?
You know, I have a couple of paragraphs that I wrote on that subject. I wrote it for my children and my students, but that might be an interesting way for us to respond to that question. Yeah.
Okay. Because there is some advice I have for individuals. Yeah. This very last section of this last chapter is called Be the Friction.
And if you haven't figured that out already,
it's because they want us to be docile.
So being the friction is a kind of call to action.
Be the friction.
So here's something I wrote
for all the young people listening today.
And if you're not someone who's under 30, you're probably, or you may be, someone who has children under 30.
Either way, these paragraphs are for you.
When I speak to my children or an audience of young people,
or an audience of young people,
I try to alert them to the historically contingent nature of the thing that has us.
By calling attention to ordinary values and expectations
before surveillance capitalism began its campaign of psychic numbing.
It is not okay to have to hide in your own life.
It is not normal, I tell them.
It is not okay to spend your lunchtime conversations comparing software that will camouflage you and protect you from continuous unwanted invasion.
Five trackers blocked, four trackers blocked, 59 trackers blocked, facial features scrambled, voice disguised. I tell them that the word "search" has meant a daring existential journey, not a finger tap to already existing answers. That "friend" is an embodied mystery
that can be forged only face to face and heart to heart. And that recognition is the glimmer
of homecoming we experience in our beloved's face, not facial recognition. I say that it is not okay
to have our best instincts for connection, empathy, and information exploited by a draconian
quid pro quo that holds these goods hostage to the pervasive strip search of our lives. It is not okay for every move,
emotion, utterance, and desire to be cataloged, manipulated, and then used to surreptitiously
hurt us through the future, through the future tense for the sake of someone else's profit.
These things are brand new, I tell them.
They are unprecedented.
You should not take them for granted because they are not okay.
If democracy is to be replenished in the coming decades,
it is up to us to rekindle the sense of outrage and loss over what is being taken from us.
In this, I do not mean only our personal information.
What is at stake here is the human expectation of sovereignty over one's own life and authorship of one's own experience.
What is at stake is the inward experience from which we form the will to will, and the public spaces to act on that will. What is at stake
is the dominant principle of social ordering in an information civilization, and our rights as individuals and societies to answer the questions,
who knows? Who decides? Who decides who decides?
That surveillance capitalism has usurped so many of our rights in these domains
is a scandalous abuse of digital capabilities
and their once grand promise to democratize knowledge and meet our thwarted needs for effective life.
Let there be a digital future, but let it be a human future first.
Thanks very much, Shoshana.
Thank you, Adam.
That was great.
You know what I was trying to get at when I was crapping on about Mindy Kaling and Late Night was that sense of, you know, obviously...
Yes, we didn't really quite finish all that, did we?
No, I mean, it was only a half-formed thought that I had, but I thought afterwards, well, maybe what it was was that the situation we're in with surveillance capitalism exists because we have allowed it to exist, in a lot of ways, and because we willed it into being in some ways. I think that's what Owen Gleiberman was getting at as well.
We willed the internet into being.
For sure. But what we
didn't will into being is that surveillance capitalism
would own and operate
the internet. Right? That came out of left field, and we never saw it coming.
Yeah, yeah. So what you're hoping for is a sort of sea change in attitudes. It's not that you are saying we've got to scrap capitalism.
I'm neither saying scrap capitalism, nor am I saying scrap the internet. Right? I'm saying let's get round to an information capitalism that is sane and healthy, which is actually not predatory and simply extractive, but actually is giving us the things that we need and want,
which go way beyond these silly targeted ads.
Yeah.
I'll show you this one thing.
You'll get a kick out of it.
I carry this around.
Here's Google's ad.
This is very expensive real estate.
It's the rear page of New Yorker magazine.
Here's what Google says it does for us.
I'm reading from this advert that Shoshana's handed me.
Google ad. Helping you name the 21st president. Convert pounds to kilograms. Say hello in German.
Make a safe password. Get a deal on a flight. Unsend that sent email. That's quite useful.
Leave on time.
Arrive at the right place.
Find your car.
Get there before they close.
Find a four-letter word for assist.
Move that meeting.
Cancel that meeting.
Do the floss.
Find a Thai place.
Open now.
Take better pictures at night.
Learn something new.
Every day. So those are the things that Google claims to be doing for you.
So if you're a yuppie living, you know, in a nice flat in San Francisco,
this probably seems like a very delightful list.
Yeah, there's a few things on there that I thought were quite good.
You know, if your island or your village or your city is about to be overrun by the sea
because of the melting glaciers, or if you're dying of cancer or if your children are hungry
or if your schools are subpar, or if you have no place to live, this actually isn't doing very much for any of us. It's trivial, and it trivializes capitalism, because capitalism grows and evolves by meeting the needs of populations.
Did you see a documentary, a Panorama documentary recently over here in the UK about Facebook?
I think it was.
But what was emphasized throughout was that these were all people, young people, principled
people working for companies like Google and Facebook
and they have their
do no evil. Oh yes, they want to save the world.
Do no evil.
But we've blown past
all of that. Yeah, you reckon?
Oh yeah. But those people working
for those companies,
most of them are not excited to be part
of something that has a malign influence
on society.
They genuinely believe that they are helping.
No, I think there are many, many people who are working there and really, you know,
have a personal mission of doing good and they want to make a good contribution.
And I think there are many people working in the companies who themselves
don't have a clear, critical grasp of how the companies operate and what they do.
That's getting harder.
And that's why we're seeing more pushback, you know,
from Google employees and Amazon employees, Facebook employees.
We're seeing more pushback.
But, you know, there's a cultish quality.
You know, there's a sort of drinking the Kool-Aid kind of phenomenon.
And it will take some things to break through to that.
Yeah, things do change, when you think about how attitudes to all sorts of things have changed fundamentally.
Things do change, and things will change again.
Yeah.
Of this, I mean, I am profoundly optimistic about this, because, you know, we've fought back worse.
We fought back the robber barons.
We fought back the Great Depression.
After World War II, we used democracy and regulation and law to make a commitment to having more equal societies.
You know, we've used our democracy to do good things in conjunction with capitalism.
I believe we can do that again.
Wait, this is an advert for Squarespace.
Every time I visit your website,
I see success
Yes, success
The way that you look at the world makes the world want to say yes
It looks very professional
I love browsing your videos and pics and I don't want to stop and I'd like to access your members area and spend in
your shop
these are the kinds of comments people will say about your website if you build
it with Squarespace just visit Squarespace.com slash Buxton for a free
trial and when you're ready to launch, because you will want to launch,
use the offer code Buxton to save 10% off your first purchase of a website or domain.
So put the smile of success on your face with Squarespace.
Yes.
Continue.
This really is not about money.
Hey, welcome back, podcats.
So that was Shoshana Zuboff there, of course,
and I was very grateful to her indeed for her time.
If you're interested to explore the subject further, there's a few links in the description of this podcast.
There is that variety review that I quoted by Owen Gleiberman of The Great Hack.
And if you haven't seen The Great Hack, you can watch it on Netflix. I never actually found out what Shoshana thought of The Great Hack. It's certainly worth seeing. A couple of
good podcast appearances featuring Shoshana. One called Recode, Decode. Sounds like a good podcast
anyway, with lots of good other episodes. Similarly, the Talking Politics podcast. And also, if you
download the Talking Politics podcast, the notes that you get in the description are like a whole
book in themselves. They're the most thorough and impressive notes for a podcast I think I've ever
seen. So there you go. Shoshana Zuboff. Hope you found that interesting, thought-provoking at least.
Come on, Rosie, let's have a fly past.
She's coming up the track.
She's loping.
She is not racing.
Rosie's feeling a lot better this week.
If you were listening to last week's podcast,
I mentioned that she's been a bit off, or she was a bit off last week, and I was sort of worried about her.
I hadn't seen her personality change in that way.
She seemed preoccupied and anxious.
And we took her to the vet. The vet said she was fine.
Anyway, she's back to her usual self now.
Bouncing about and being very friendly.
So I'm glad about that.
Alright, dog log.
You can go off and
run about. So how's things
with you listeners?
Podcats? Brexit, eh?
Yeah, I should do some
sort of
topical news podcast, shouldn't I?
Whoa!
Whoa!
Look, it's blowing up a hurricane, so I'm going to go home now.
Thank you very much indeed, once again, to Shoshana Zuboff.
Thanks to Seamus Murphy Mitchell for production support on this
episode. Thanks very much indeed to Matt Lamont for his work editing the conversation. Thanks to
ACAST for hosting this podcast. I really appreciate all the work they put in, finding me sponsors and the like and thanks most especially to you for listening
right to the end. That's crazy. Do you want a hug? All right. Just... what about a small one?
There we go. Four more. Okay, take care. I love you. Bye! Bye.