Theories of Everything with Curt Jaimungal - Carissa Véliz: Surveillance, Covid, Privacy, Data, Identity, Digital vs. Analog
Episode Date: May 31, 2024
Please consider signing up for TOEmail at https://www.curtjaimungal.org
Support TOE:
- Patreon: https://patreon.com/curtjaimungal (early access to ad-free audio episodes!)
- Crypto: https://tin...yurl.com/cryptoTOE
- PayPal: https://tinyurl.com/paypalTOE
- TOE Merch: https://tinyurl.com/TOEmerch
Follow TOE:
- *NEW* Get my 'Top 10 TOEs' PDF + Weekly Personal Updates: https://www.curtjaimungal.org
- Instagram: https://www.instagram.com/theoriesofeverythingpod
- TikTok: https://www.tiktok.com/@theoriesofeverything_
- Twitter: https://twitter.com/TOEwithCurt
- Discord Invite: https://discord.com/invite/kBcnfNVwqs
- iTunes: https://podcasts.apple.com/ca/podcast/better-left-unsaid-with-curt-jaimungal/id1521758802
- Pandora: https://pdora.co/33b9lfP
- Spotify: https://open.spotify.com/show/4gL14b92xAErofYQA7bU4e
- Subreddit r/TheoriesOfEverything: https://reddit.com/r/theoriesofeverything
Transcript
Okay, Professor Carissa Véliz, I'm going to start with a quotation from you, from your
TED Talk.
In the data economy, everything you do gets translated into data and sold to the highest
bidder.
Thousands of corporations and governments around the world know who you are, what you
do, where you live, who your family is, what you eat, how much you weigh, who you sleep
next to, whether you're having an affair, what you hope for, what you fear, what you
desire, what tempts you and where you hurt.
All of this incredibly sensitive data gets compiled and sold to almost anyone who wants
to buy it, and it turns out there's a lot of people who want to buy it,
most of whom don't have your best interests at heart.
It's a perverse system and we need to stop it.
Please expand.
So for those of us who were born before the internet was such a common mode of communication
and of really being in the world, whenever there was talk of tracking someone, whether it was
through video or taping them or following them, we used to call this spying and spyware.
And when the internet kicked off, it was designed by people who weren't thinking about how it
might be misused and what kind of business model companies might have on there.
A company like Google was a protagonist in this story because they were a startup and,
on one hand, they were very successful because their search engine was incredibly good
in comparison with alternatives.
But on the other hand, they hadn't managed to find a sustainable business model.
One of the ironies of this story is that the founders of Google had written a paper in 1998 in which they explicitly argued that any search engine
that depended on ads was not going to be loyal to their users, it was going to be loyal to
their actual customers, which were the companies publishing the ads.
And so they didn't want to follow that model.
They wanted to follow a much more academic model in which what mattered was the quality
of the information.
But after a few years in which they weren't getting revenue, investors got really impatient
and they put a lot of pressure on Google.
And not only did Google end up kind of biting the bullet and subscribing to this ad model,
but it was much, much worse than that because it's not only that they depended on ads, they
developed very personalized ads.
So they used all that data that they had on users about what we search for and what time
and from where and so on.
And they used it for marketing.
And from that moment on, it became a common thing to do,
the normal thing to do,
almost the normal business model on the internet
to spy on people as a way of earning money.
So what can we do about it?
We can ban it.
So there are all kinds of business models that we think are unethical and that we don't
allow.
So even in the most capitalist societies, we agree that we don't buy or sell people
because it's unethical and we don't buy or sell votes and we don't buy or sell the results of sports matches.
And for the same reason we shouldn't buy or sell personal data.
Um, it gets used to much the same effect as if we bought or sold votes.
And you know, there are other things we ban, like theft.
Yeah, it might be very profitable.
Yeah.
So there's precedent for this, but this seems like a much larger issue, one that when
people think about it, they just throw their hands up in the air and say, well, I
guess that's the way it is.
This is far too large, far too financially lucrative.
Well, I don't think so.
Again, we have some examples from the past in which very, very lucrative modes of behavior got outlawed.
Some people think that we might be in a financial bubble and that if we're not careful, we might
cause a certain kind of crisis, similar to the 2008 crisis. Because it turns out that the people who built the subprime mortgage market that eventually
caused the 2008 crisis were the same people who built the market for ads.
So the way ads happen online is you go on a website and in the seconds it takes to load,
your data is sent to hundreds of companies
who then bid against one another to show you their ad.
And in those seconds or microseconds, your data gets shared with a lot of people.
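The bidding flow described here can be sketched with a toy model. This is only an illustrative sketch of a second-price auction, not the actual ad-exchange protocol; the bidder names, valuations, and the user profile below are all hypothetical:

```python
def run_auction(user_data, bidders):
    """Toy model of real-time bidding: every bidder receives the user's
    profile, returns a bid, and the highest bidder wins, here paying the
    second-highest price (a common exchange rule)."""
    bids = sorted(
        ((b["name"], b["value"](user_data)) for b in bidders),
        key=lambda pair: pair[1],
        reverse=True,
    )
    winner, top_bid = bids[0]
    price = bids[1][1] if len(bids) > 1 else top_bid  # second-price rule
    return winner, price

# Hypothetical bidders whose valuations depend on the leaked profile.
user = {"location": "Ohio", "age": 25, "interests": ["cars"]}
bidders = [
    {"name": "ad-network-a", "value": lambda u: 1.20 if "cars" in u["interests"] else 0.10},
    {"name": "ad-network-b", "value": lambda u: 0.80},
    {"name": "ad-network-c", "value": lambda u: 0.50 if u["age"] < 30 else 0.05},
]

winner, price = run_auction(user, bidders)
# Note: every bidder saw the user's profile, whether or not it won.
```

The point the sketch makes concrete is the final comment: the user's data is disclosed to every participant in the auction, not only to the winner.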
And it turns out that this kind of bidding system suffers from many of the same problems
that the other system suffered.
And one of the most important ones is that it's very opaque.
So if you were the publisher of an ad and you wanted an ad in the papers, you would pay
the paper some amount of money.
And then on the Sunday you would buy the paper and you could check whether the ad was
there and you knew how many people bought the paper.
And so you had a rough idea of how many people were watching that ad. But today there are estimates that around 50% of the ads that
get sold are never seen for one reason or another. In some cases it's directly fraud,
in other cases people have ad blockers, in other cases they're not seen by the people
you think are seeing them. And it's very hard to check
because if you want to target, say, people who live, I don't know, in Ohio or wherever, who are 20 to 30,
and you're not there and you're not in that age range, you're not going to be able to verify who's
seeing that ad. So I think there are other reasons for why, even though this seems like a very profitable financial enterprise, we should be careful.
Furthermore, it turns out that personalized ads are very, very expensive.
So you might sell about 4% more if you use a personalized ad, but it'll cost you about 98% more than a contextualized ad.
And so it is very questionable whether it's in the interest
of companies to use these ads.
Not only does it expose them to certain kind of liabilities,
but it's also arguably not worth it.
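The 4% and 98% figures quoted here can be turned into a back-of-the-envelope comparison. The baseline revenue and ad spend below are made-up numbers, chosen only to make the trade-off concrete:

```python
# Hypothetical baseline: what a campaign with contextual ads brings in.
baseline_revenue = 100_000.0  # revenue with contextual ads
baseline_ad_cost = 10_000.0   # spend on contextual ads

# Apply the figures quoted in the conversation.
personalized_revenue = baseline_revenue * 1.04  # ~4% more sales
personalized_ad_cost = baseline_ad_cost * 1.98  # ~98% higher cost

contextual_net = baseline_revenue - baseline_ad_cost
personalized_net = personalized_revenue - personalized_ad_cost
# Under these assumptions the personalized campaign nets less,
# despite selling more.
```

Whether this holds depends on the baseline ratio of ad spend to revenue; the sketch only illustrates how a small revenue uplift can be swamped by a large cost increase.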
If it wasn't worth it monetarily,
then why would a corporation do it?
Well, for a few reasons.
One is that corporations are very vulnerable to ideology, just like anyone else.
And when there is a trend and everybody's doing it, companies are very, very afraid
of missing out.
And they kind of follow the herd.
And very often, there is no scientific research verifying whether we should be doing what
everyone else is doing.
That's the first thing.
The second thing is there have been a lot of misleading marketing for personalized ads.
So, I mean, one example was Facebook.
At one point they were saying that their ads were getting excellent numbers of impressions,
and it turns out that it was a very, very inflated number.
So it might be that they're not thoroughly informed of what they're buying.
There's a great YouTube channel, I'm not sure if you've heard of it, called The Hated One,
which is about how you keep your data private from companies like Google and Apple, and also the hypocrisy of them saying that they care
plenty about privacy, and, well, then they reveal how they don't. These techniques, like using a VPN and
burner phones, etc. You're not a fan of that because it's too individualistic, or what?
It's more of a nuanced answer. On the one hand, we absolutely need regulation and this
problem isn't going to go away until we have good regulation.
However, it's true that individuals have much more power than companies want us to realize.
And if enough individuals create enough of a resistance, companies listen. And there is a competitive advantage in privacy. So I think there is a lot of value in protecting your privacy,
both because it's an expressive act of what you care about and what you stand for, and also even if you're not going to
be perfect and still you lose more privacy than you should, you might save yourself from
a case of identity theft or something. You never know which data point is going to be
the one that's a problem. So I'm definitely in favor of being careful about your data.
There was also a case that you outlined in World War II
with the Dutch, how they attempted to burn some papers.
Can you please reiterate that for the audience
so that they can understand that it's not just about
"I want to be private."
It can be life-saving, not just life-saving
at the level of tens, it could be life-saving
at the level of millions.
Yeah, so one of the things I argue in my book, Privacy Is Power, is that we should
think about data as a kind of toxic asset.
Um, yes, it is very cheap to mine.
Yes, it's very, um, it can be profitable and so on, but it also exposes us to all
kinds of harms, and the analogy I use is something like asbestos.
So asbestos is a very cheap mineral, it's very easy to mine, and it's very useful in
many ways because it's very durable, it doesn't catch fire easily, but it's also incredibly
toxic and every year hundreds of thousands of people die from exposure to asbestos.
So if you think about personal data as a kind of toxic thing, the less you have of it, the
better.
And especially as a society, because if you have the personal data of millions of people,
then the risk is a lot higher.
So one suggestion is to learn from the mistakes of the past.
And one reason to do that is because the best predictor that something might happen in the future is
that it's already happened in the past and
during the Second World War
one of the first things that the Nazis did when they invaded a city was go to the registry because that's where the data was held
and the Dutch had particularly
big amounts of personal data, more than other
European countries because they had somebody who was a pioneer in population statistics.
His name was Lentz and he had wanted to design a system that follows people literally from cradle
to grave. And so in 1943, there was a resistance cell in Amsterdam that realized how dangerous the registry was.
So they decided to try to destroy the records.
So they went into the registry, they set fire to the records, and they had a deal with the fire department
because there were some sympathizers in the fire department. And the fire department was going to try to arrive late and to use more water than necessary
to destroy as many records as possible.
And unfortunately they were quite unsuccessful.
They only managed to destroy about 15% of the records and the Nazis found and killed
70,000 Jews in Amsterdam.
They also got caught and killed.
They being the people who set the fire?
Yeah.
And the people at the fire department or no?
I don't know about the fire department.
I know about the other people.
There were 12 people.
The Dutch had made two mistakes.
The first mistake was that they had a lot more personal data
than was necessary for a well-ordered society.
They had data like who was Jewish
and things like where did your grandparents live?
And the second thing was that they didn't have an easy way to delete that data in
the event of an emergency.
And we are making both of those mistakes.
And at grand scale never seen before because the amount of personal data that
the Dutch had in the 1940s was nothing compared to the amount of data we are collecting.
So practically speaking, what can be done?
So the person who's watching, who's listening, they care about privacy for themselves, maybe
for their spouse or their family or their friends and children.
What can they do?
So I have a whole chapter in the book, in Privacy Is Power, about what people can do.
And I suggest different things for different people
because it depends. It depends on who you are and it depends on how much effort you want to exert.
So are you a journalist whose life is on the line or are you somebody in a democratic country with
a stable job and already you own a house and it depends on who you are and who you think
might want to collect your data and how much you want to
fight for privacy.
But for many things, it's really quite simple.
So instead of using WhatsApp, use Signal.
It's free, it works just as well.
It doesn't collect your data.
Instead of using Google search, use DuckDuckGo.
It works very well, it doesn't collect your data.
Instead of using Gmail or one of these, use ProtonMail.
It's very good, it doesn't collect your data.
Instead of using Dropbox, use ProtonDrive, use ProtonVPN.
There are all kinds of alternatives.
And in general, respect other people's privacy.
So don't share or retweet things that might expose someone.
If you want a really good party, ask your friends not to take out their phones and certainly
not to upload pictures or videos.
If you want a really good conversation in your classroom, turn off the microphones and
the cameras.
And in general, just make an effort without making your life too hard.
It's not very complicated to make an effort and to make a real difference.
And then of course, not only should you choose companies that are more respectful of privacy,
but you should also write to the companies who have your data and request that they send
you the data they have on you and that they delete the data afterwards.
You should contact your political representatives
and tell them that you're worried about privacy
and ask them what they're doing to protect your privacy.
That last one seems like the most effective option
because it will be at the governmental level
and it will ban it, like you mentioned
in the beginning of this conversation.
So these individual practices, Google already has, at least I
don't know about Apple, but Google already has a service where you can go into the back
end and see what data they have, and you can check and uncheck and even download and
delete. So when you say write to the software companies, you can also just do this yourself.
You don't have to physically write. That's something people should be aware of. Now,
when it comes to trying to get a law passed,
how does that work and what sort of law specifically?
Well, at the moment, there are a few bills
trying to make their way through Congress in the US.
In states like California, they have a good privacy law.
In Europe, we have the GDPR, the General Data Protection Regulation.
And it's slowly making strides. I think the most important aspect of a privacy law would be to
ban the trade in personal data, and we still don't have that.
So one suggestion is to send Privacy Is Power to your local representative and your political
representatives.
Another important thing is to make the default no data collection.
So instead of having to say no to cookies, the default should be no data collection and
if you really want your data collected for some reason,
then you can say yes. But even then we should have minimum requirements of safety, and that means
certain practices should be banned because they're not only too risky for the person,
but they're too risky for society at large. So say, you know, I don't care about my privacy,
I'm quite masochistic, I would like my identity to be stolen, I would like to live in an authoritarian
regime, so I'm quite happy to give up my privacy. But other people might not be, and when you give
out your personal data, you usually give out the personal data of other people as well. So if I
give up my genetic data, I'm not only giving up my genetic data, I'm giving up the genetic data of my parents, my siblings,
and even very distant kin who I've never met, but who could face consequences like being deported.
So privacy has this interesting dichotomy. On the one hand,
we tend to associate it
with something being very personal
and being very individual, and that's partly right.
But it also has this collective aspect.
And the collective aspect is that on the one hand,
when I give out my personal data,
often I'm giving out the personal data
of other people as well.
But also, even when I'm not,
my giving out personal data
might have collective consequences.
So if I give out my personal data to somebody like Cambridge Analytica, and they use that
data to profile voters around the world and try to sway elections, then I'm having an
effect on society. And it's not clear that I have the moral authority to do that. Just
like I may not have the moral authority to, say, release a virus just to see what it feels like, because I might put other people at risk.
You mentioned that during COVID, there was a light that was placed on many of these privacy
issues. So what specifically, how?
Well, during COVID, we were forced to be much more online than anyone would have wanted
to.
And that of course made it obvious that our online interactions were not entirely voluntary.
So previous to that, I think the argument was like, well, if you don't want to be on
social media, you can just opt out.
Or if you don't want to use Google, just don't.
But after that, it was harder to make that argument, right?
If you want to be a fully functioning member of your society, if you want to have
a job, if you want to access education, chances are you will be forced to use either Zoom
or Teams or one of these.
Saying no is not that easy and it has a very, very, very high cost.
So that's one thing.
But another thing was how, instead of focusing on the kinds of solutions
that are important for a pandemic, like protective equipment and developing a
vaccine,
many countries had a first reaction of, oh, let's develop an app, as if an app is what
we need.
And of course, because we're surrounded by tech companies who are very powerful and they're
the most successful companies of our time.
And I think they were well-intentioned in wanting to bring out solutions.
But what is a tech
company going to do? They're not going to develop a vaccine. They're going to come up with an app.
And an app is not necessarily the solution to every problem, and that became very obvious during COVID.
But I think COVID also
emphasized the value of the analog and how much the analog is so much richer and more meaningful
than the digital. The digital is a pale imitation and it's a second best. So if you can't see
someone in person, yes, it's better to talk with them online than not to talk with them
at all. But I think everyone would agree that if you can see someone in person, it's
just much better.
There's something there in physical presence that you don't find.
And it's not only about people, it's also about places and nature.
So if we neglect the analog for the virtual, we will regret it because we're forgetting
that the virtual depends on the analog.
And when you neglect the analog, you will lose it and it's very hard to recover.
So if you don't go to your local coffee shop and it has to close, it's very hard to bring
it back.
So this whole data economy impacts personal autonomy.
So does it also impact our understanding of free will?
Yeah, absolutely.
So I have a new book, an academic book, called The Ethics of Privacy and Surveillance.
And in that book, I argue that surveillance is not neutral.
Over and over we get told by tech companies that technology is neutral and it depends on how you use it.
But that is absolutely ridiculous. So it would never occur to you to cook with a gun.
A gun is designed for something and it's not designed to cook.
It's designed to threaten or kill.
And in the same way, surveillance technology is not neutral.
It's designed to control.
So when we set up an architecture of surveillance, we condemn ourselves to being tracked by it.
And so when we transform the analog into digital,
what we do is we turn something that wasn't data into data.
And what that means is that we turn something that wasn't trackable
into something trackable, taggable and searchable.
And what is it to track if it's not to surveil?
And once we have surveillance, that necessarily leads
to a lack of or a diminishment of freedom. Because when you're tracked, you feel much
more vulnerable and you second guess what you do and you don't do things that you would
do otherwise if you weren't tracked.
In behavioral economics, one of the landmark results you just referenced, or alluded to at least, was that adoption rates drastically increase when you have to opt out versus opt in.
What are some other behavioral economic principles that we can use to decrease privacy risks, besides, say, default heuristics?
Yeah, so in general, people stay with whatever default settings they get.
So it's very important to choose the right default settings.
But another thing is, it's partly cultural.
It's not so much about behavioral economics in the sense of like nudging, but it's about
the kind of stories we tell ourselves.
And people think of tech companies as these geniuses that come up with amazing gadgets.
I think they are geniuses in many ways, but I think the most genius thing that they do is actually not gadgets but stories.
They're very good storytellers, and they tell stories that are very compelling, even though they're also very misleading.
And we should be critical of those stories. And the most important part of the story to be critical about is that
the digital is better than the analog.
So I think we should value the analog. So, for instance, instead of
buying an ebook on your Kindle, go to your local bookshop and buy a paper book with
cash.
That is a very different action from the point of view of privacy.
Why did they blur the books in the background on your TED talk?
That's a very good question.
Okay, so you didn't have a say in that, or a discussion about it?
People might not know, but, well, maybe they know now, that you're a professor of philosophy and ethics.
So let's see. So I have two questions. One would be about a philosopher of the past.
Like, what one question would you ask, and which philosopher would it go to, if you could?
I don't think I would have a particular question, like a burning question that I'm really trying to figure out.
And like this philosopher could help me. But I would really like to hang out with some philosophers and that would include Plato and
Socrates and Aristotle and Simone de Beauvoir and Philippa Foot. Yeah, I would be more
Socratic in the sense of just kind of going for a walk and having a conversation as opposed to a
particular question.
So speaking of the walk, the evolution of the podcast for me, the next stage isn't,
you'll see many podcasters where they upgrade the quality of their sets and then they maybe
bring people onto a special stage or a special room and it's beautifully filmed and they
increase the quality of the editing.
So for me, the next stage of
the podcast isn't that. I would spend a tenth of that and go for a walk and talk
with someone and film that. Yeah, I like that idea. I remember there was this
very good documentary that was very philosophical. I think it was by
the filmmaker Astra Taylor.
I think the documentary is called The Examined Life,
and it's her talking to different people, and they're usually on a walk.
And yeah, I think there's something that goes very deep
in the activity of walking and talking. It's a different way of relating to one another.
It's certainly analog.
It's certainly analog, that's true.
It's being more in sync and in conversation with the context and the environment.
But it's also, I think, just you probably get more blood flow.
Yes, exactly.
It's probably good for ideas.
You also mentioned when you were speaking to your students,
well, you didn't say that you tried this out,
but I'm about to ask, have you tried it out in class?
You tell people, okay, let's turn our phones off,
everyone prove you have your phone off,
now let's converse.
Have you tried that?
So, no, not exactly,
but I do ask them not to be on their phones, for sure.
Well, that goes without saying, it's just basic manners.
Don't be on your phone when you're listening to a lecture of mine, please.
You'd be surprised.
And I have asked in many contexts to kind of stop the recording and let's just kind of have a more informal
and more honest conversation. And on one occasion, I remember the organizer was a bit upset about
this, even though he had not warned me that there would be a recording. So I got there
and I said, like, look, this wasn't part of the deal. No recording. And he didn't like
that. But at the end, he very kindly said, like, actually, you were right. We did have
a much more interesting conversation than we usually have.
And that to me, I mean, I appreciate the honesty, but it also made me think about how much sometimes
we lose things with technology and we don't even notice that we've lost them.
So I think if you ask that person, do we lose anything by recording a conversation?
He would have said no, but after not recording a conversation, he realized,
oh, actually we are losing something there.
Yeah.
Although that was a single data point and you were a varying factor.
So it could have been that Carissa just brings out the best in people.
Maybe, maybe.
But I do think that there's a good chance. I mean, it's certainly the way I feel if I have a camera in front of me. I don't have the same conversation that I would if I didn't.
Yes. What personal experiences did you have that made you such a staunch supporter of privacy, an advocate for privacy?
I think on the one hand, I do value my privacy at a personal level.
But I think interestingly, it wasn't personal experiences that led me to defend privacy.
It was more researching privacy and realizing that there was this whole
political aspect to privacy that I hadn't been aware of. Before I started researching this
topic, I kind of assumed what everybody else assumed, that privacy is a personal preference,
and to discover all these political reasons for why we should protect privacy was
kind of a shift in perspective.
What is the right to be forgotten?
The right to be forgotten is a part of the GDPR and the history of it comes from Spain.
So there was this guy in Spain who had some debts and he had paid his debts.
But when you searched for his name on Google, the first thing that came up was this link
to a newspaper called La Vanguardia in which he was quoted as somebody who had debts.
And so he complained about this at the data protection agency in Spain.
They thought that La Vanguardia had a right to have that notice there because, at the time, that
was accurate information, and La Vanguardia defended this.
And then the case ended up in Europe, and the ruling was that it was
unfair for this man to still be haunted by the debt that he had already paid.
And it was an interesting decision, because the decision was that
the responsibility lay not with La Vanguardia, because they shouldn't
delete that information, since it was accurate at the time, but it was the
responsibility of Google as a search engine that certain things that are out of date,
or very private and not in the public interest,
they not make so accessible. So basically, that they delete that link.
And that was interesting because even though you can see the logic of it,
so a very rudimentary way of thinking about personal data is to ask, well, is it in the private sphere or is it in the public domain?
But that doesn't work at all because you can have something
that is technically in the public domain,
but it's in a registry office in the middle of nowhere
and nobody's ever going to see it.
Whereas if you have something on page one of Google
when you search somebody's name,
that makes it very, very accessible.
And so I think it was a development of privacy
and kind of inserting more nuance and understanding about the relationship between privacy and accessibility.
However, what was strange about it is that it left the decision in the hands of Google
as a private company.
And so to this day, if you want something removed from Google, you ask Google to do
it. And only if they don't do it do you then go to public authorities to complain about that.
If it's the case, which I hear from people who are advocates of privacy, that look,
your online personality, not just your personality, but other traits about you,
it's so readily identifiable. People can reconstruct
who you are from it. That to me implies that there's something, it's akin to, I'm making an
analogy here, but it's akin to a graspable brick that identifies you. So that seems to me that
that's an advantage in this case of us wanting to create a bill or a law, because then we can say, hey, that graspable brick that can identify Carissa, let's delete that.
Let's delete what can identify Curt.
Now, maybe it's not so simple.
Maybe it's much more diffuse, but if it is diffuse, then it's not obvious that
it's the case that there's this readily identifiable personality on the internet
that can be reconstructed that is you.
Right. So it is the case that your data is very easy to de-anonymize. So at the beginning
of data collection, we thought that, well, as long as your name isn't there, then you're
anonymized. But of course, that's not very good because there's only, for instance, one person who
works where you work and who sleeps where you sleep, typically.
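The re-identification risk described here, that a home/work location pair is usually unique to one person, can be sketched as a simple join. All records below are fabricated, and real attacks use far richer quasi-identifiers, but the mechanism is the same:

```python
# "Anonymized" records: names removed, but home and work locations kept.
anonymized = [
    {"home": "cell-17", "work": "cell-42", "visits": ["clinic", "bar"]},
    {"home": "cell-03", "work": "cell-42", "visits": ["gym"]},
]

# A separate, public dataset pairing names with the same locations.
public_directory = [
    {"name": "Alice", "home": "cell-17", "work": "cell-42"},
    {"name": "Bob",   "home": "cell-03", "work": "cell-42"},
]

def reidentify(anon_rows, directory):
    """Join the two datasets on the (home, work) quasi-identifier."""
    lookup = {(p["home"], p["work"]): p["name"] for p in directory}
    return {lookup[(r["home"], r["work"])]: r["visits"] for r in anon_rows}

# The supposedly anonymous visit histories are now attached to names.
named_visits = reidentify(anonymized, public_directory)
```

The sketch shows why stripping names alone is weak anonymization: any auxiliary dataset sharing the quasi-identifier can restore the link.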
So the danger is that companies might want to have their cake and eat it too, right?
So you might ask, hey, I want you to delete my data.
And the company might say, oh, but we
don't know what data we have on you.
And it's not kind of identifiable.
So let's imagine, a good example is ChatGPT and OpenAI.
If you go to OpenAI and say, hey, OpenAI, please
delete my data, OpenAI is going to say,
we don't know what data we have on you.
Because we scraped the whole internet.
And we don't keep a record of exactly what's there.
So actually we don't know.
And so you might say, oh, great, because then it's diffuse and so I'm protected.
But you're not, because it only takes one good prompt from someone and ChatGPT might
spew out your personal data.
So that's the thing with data.
It's not only the data that people have, it's also the kinds of inferences that are made
possible by that data.
And so one of the components of a good privacy law, which I also write about
in Privacy Is Power, would be to ban certain kinds of personal inferences.
So if I have data about what music
you like, I shouldn't be allowed to make inferences about your sexual orientation with that kind of
data, which is done all the time. Do you have advice for people who are entering the field of
philosophy?
Advice for people entering philosophy.
Just enjoy it and follow your nose.
Don't think about strategy.
Just think about what you think is important and what other people might not be thinking about.
And follow your curiosity, because if you're curious about something, chances are there's something in there that other people will be interested in as well.
What do you mean don't follow a strategy?
What would be an example of a strategy that someone maybe shouldn't follow?
I get a lot of people asking me, you know, what should I study so that I'll have a job in 10 years?
And I'm not sure that's the way to go about it.
And that's not the way to go about it because philosophy is just not something that people are looking for in terms of hiring, or is it not the correct way because of some other reason?
Well, it's very hard to predict what people are going to be looking for in 10 years.
And so if you are very strategy minded and the strategy fails, then you're
going to be in a pretty bad place.
But if you follow your instincts and your curiosity and you're looking at something
because you care about the truth of something, then even if you face hardship
in the professional side, it will be much
more fulfilling because you're doing what you think is important and what you think
matters.
So it's akin to, okay, look, if there's some large long-term goal that you have that's
over a decade away, that may or may not come to fruition.
What is true is that you have an interest somewhere.
So go follow that on a local scale and just keep pointing your flashlight and going, okay, there's an interesting little tidbit.
I'm going to learn about that.
I'm going to learn about that.
That looks interesting.
Let me go here.
Yeah.
And I think philosophy is a very vocational thing, right?
So I mean, a lot of people study administration without loving administration for strategic
purposes and fair enough, that's a different way to go.
But if you're going to study philosophy, I mean, you're not doing it for the fame, you're not doing it for the money,
and if you'd do it for the strategy, then, you know, maybe go into something else.
You do it for something else, and so, you know, fully commit to that.
Now you're speaking to the audience directly and they're sold on wanting more privacy.
They can follow some of the practices that you mentioned, use DuckDuckGo, and then you
slip and you use the Chrome browser, and then that's not perhaps the greatest browser for
keeping your data to yourself.
So they want to do something that would create a large change.
They like the GDPR.
Although the GDPR is quite annoying because now every website prompts you and then you
have to click.
And what I find annoying, by the way, is that often the reject button, reject all cookies,
is buried and you have to say more options and then sometimes you have to manually click
each one off.
But anyhow, it's a step in the right direction.
So they want more of that.
What can they do specifically?
You outlined some practices already, but now again, you're speaking to them directly.
And they've got your book.
Privacy Is Power.
So in general, just look for opportunities for privacy.
They're everywhere and it's not very demanding.
So not using Chrome, it's not a big sacrifice actually.
Except for like very, very few exceptions.
Other browsers work even better.
Cherish the analog.
I think one of the best activities anyone can do is to read a paper book.
It's not only that it'll better inform you about the world, it'll be an antidote to these
kind of problems that we are facing from social media and our attention spans becoming shorter and us kind of struggling
with focusing.
But it's also a practice that in itself cultivates privacy.
So up until, I don't remember exactly which century, but for a long time to read was to read out loud and to read in a group.
And that was very valuable in itself for other reasons, and that's not to disparage that.
But when we developed and when we started practicing silent reading, it really developed creativity in a different way and it enriched people's
just mental landscapes. And you know what goes for a paper book goes for going for a walk in nature,
looking people in the eyes, enjoying a concert or going to the coffee shop.
Just don't forget that the analog is really what sustains us.
What makes a city interesting is all analog.
So don't get too seduced by the digital, and remember that the digital depends on the analog
and that in the analog there are many more opportunities for privacy, for intimacy and
for meaningful connections.
Thank you so much, Professor.
Thank you so much, Kurt.
Firstly, thank you for watching, thank you for listening.
There's now a website, curtjaimungal.org, and that has a mailing list.
The reason being that large platforms like YouTube, like Patreon, they can disable you
for whatever reason, whenever they like.
That's just part of the terms of service.
Now a direct mailing list ensures that I have an untrammeled communication with you.
Plus soon I'll be releasing a one-page PDF of
my Top 10 TOEs. It's not as Quentin Tarantino as it sounds like. Secondly, if
you haven't subscribed or clicked that like button, now is the time to do so.
Why? Because each subscribe, each like helps YouTube push this content to more
people like yourself, plus it helps out Kurt directly, aka me.
I also found out last year that external links count plenty toward the algorithm, which means
that whenever you share on Twitter, say on Facebook or even on Reddit, etc., it shows
YouTube, hey, people are talking about this content outside of YouTube, which in turn
greatly aids the distribution on YouTube.
Thirdly, there's a remarkably active Discord and subreddit for Theories of Everything,
where people explicate TOEs, they disagree respectfully about theories, and build as
a community our own TOE.
Links to both are in the description.
Fourthly, you should know this podcast is on iTunes, it's on Spotify, it's on all of
the audio platforms.
All you have to do is type in Theories of Everything and you'll find it.
Personally, I gain from rewatching lectures and podcasts.
I also read in the comments that, hey, TOE listeners also gain from replaying.
So how about instead you re-listen on those platforms like iTunes, Spotify, Google Podcasts,
whichever podcast catcher you use.
And finally, if you'd like to support more conversations like this, more content like this, then do
consider visiting patreon.com slash curtjaimungal and donating with whatever you like.
There's also PayPal, there's also crypto, there's also just joining on YouTube.
Again, keep in mind, it's support from the sponsors and you that allows me to work on
TOE full time.
You also get early access to ad free episodes, whether it's audio or video.
It's audio in the case of Patreon, video in the case of YouTube. For instance, this episode
that you're listening to right now was released a few days earlier. Every dollar helps far
more than you think. Either way, your viewership is generosity enough. Thank you so much.