
2020-09-17_Gabriel_Custodiet_Interview



00:00:30.800 | - Welcome to Radical Personal Finance,
00:00:32.240 | a show dedicated to providing you with the knowledge,
00:00:33.800 | skills, insight, and encouragement you need
00:00:35.680 | to live a rich and meaningful life now,
00:00:37.760 | while building a plan for financial freedom
00:00:39.120 | in 10 years or less.
00:00:40.360 | My name is Joshua Sheats,
00:00:41.360 | and today we focus on the insight,
00:00:44.040 | because we're gonna talk about history.
00:00:46.240 | I'm proud to welcome back to Radical Personal Finance,
00:00:48.260 | my friend and coworker, Gabriel Custodiet.
00:00:50.920 | Gabriel, welcome back to Radical Personal Finance.
00:00:54.120 | - Joshua, it's a pleasure to be here.
00:00:56.280 | I know that you don't invite too many people these days,
00:00:59.280 | and I'm very pleased to have connected with you
00:01:01.240 | and a lot of your audience at some of your events.
00:01:04.000 | And so I know they're very intelligent people.
00:01:05.480 | I know they'll enjoy this episode,
00:01:07.400 | and yeah, glad to be here.
00:01:08.720 | - Absolutely.
00:01:09.720 | Yeah, we've been working together for a couple of years.
00:01:12.480 | Primarily, we've done a number of courses
00:01:16.200 | that we've taught together,
00:01:17.520 | and I've recommended your book on privacy.
00:01:19.960 | That's what I think you were most well-known for
00:01:22.120 | in the beginning.
00:01:23.160 | And you've recently come out with a new book
00:01:25.440 | called "Privacy and Utopia: A History,"
00:01:28.800 | which is completely different from your previous work,
00:01:31.860 | which was very practical.
00:01:33.480 | This one is historical and very detailed
00:01:36.960 | and quite a fascinating and interesting book
00:01:38.840 | as we talk about the history of privacy.
00:01:41.240 | So before we begin with the actual concepts
00:01:43.800 | that we're gonna talk about,
00:01:44.640 | and by the way, just for the audience,
00:01:46.800 | here at Radical Personal Finance,
00:01:48.420 | I talk about privacy as a useful strategy in the toolbox
00:01:53.240 | for various aspects of financial planning.
00:01:56.560 | There are a number of different assets.
00:01:58.280 | There are a number of different ways this can be expressed.
00:02:00.680 | It can be a tool in the toolbox
00:02:02.680 | for asset protection planning.
00:02:04.520 | It can be a tool in the toolbox
00:02:05.720 | just for living a low-hassle, low-risk life in many ways.
00:02:09.040 | It can be a tool in the toolbox
00:02:10.480 | for protecting you from identity theft
00:02:12.340 | and other financial risks and frauds that are very common.
00:02:15.560 | And of course, this resonates with me
00:02:18.280 | due to my kind of philosophical bent.
00:02:20.720 | So this episode is not going to be specifically
00:02:23.920 | personal finance-y in the sense of
00:02:26.400 | here's what you specifically should invest in,
00:02:28.840 | but it's going to be extremely informative
00:02:30.780 | as we talk about the history of privacy as a concept,
00:02:35.320 | and then think about how to apply that in the modern world.
00:02:38.120 | So with that introduction,
00:02:39.840 | I'd love to hear the story of the book.
00:02:41.440 | Why, what caused you to want to write
00:02:43.480 | this particular type of history book?
00:02:45.340 | - Yeah, so the people who are aware of me from your show
00:02:51.120 | or elsewhere, they know that I do tend to talk about
00:02:53.660 | a lot of practical things,
00:02:54.600 | but I am interested in the ideas.
00:02:57.000 | And frankly, I'm more interested in the ideas, right?
00:03:00.600 | I kind of got into the Watchman Privacy Podcast
00:03:03.200 | trying to teach people some of the tools and the skills,
00:03:05.560 | but always kept my foot in the door
00:03:08.200 | of talking philosophy, talking ideas.
00:03:10.860 | And I decided to write a book about,
00:03:14.480 | basically, we have this important question,
00:03:19.480 | and I know a lot of the people in your audience
00:03:21.480 | are kind of libertarian-leaning,
00:03:24.160 | and this sort of people.
00:03:25.400 | And we see the last 100 years as a decline
00:03:28.320 | in personal freedom, and consequently,
00:03:30.560 | a decline in many other things.
00:03:32.400 | And we all have this question, what caused it?
00:03:35.160 | What was the origin?
00:03:36.100 | What caused this to be the case?
00:03:38.440 | Hoping that we can kind of change this.
00:03:41.560 | And I set out to do this, focusing on privacy.
00:03:44.540 | And of course, that led me to bigger concepts
00:03:48.200 | like the rise of centralization and things of this sort.
00:03:51.240 | And it took me back about 100 years ago
00:03:53.040 | where I start this book, "Privacy and Utopia."
00:03:55.960 | But yeah, it was really just an opportunity to explore,
00:03:59.200 | okay, what happened to our freedom and privacy,
00:04:02.720 | and try to tackle it in a very historical way,
00:04:05.840 | but also a very philosophical way.
00:04:07.940 | It's an intellectual history,
00:04:09.060 | so I'm not just kind of detailing this law of this year,
00:04:13.160 | and the welfare state gets set up
00:04:15.840 | on this particular occasion,
00:04:17.760 | but also trying to explore the undergirding ideas
00:04:21.000 | of the time that themselves led to these particular dates
00:04:25.140 | being cemented in history.
00:04:26.200 | So yeah, that was the intent,
00:04:27.920 | just trying to get to the first principles
00:04:30.120 | of the things that I talk about on a daily basis.
00:04:32.640 | - Yeah, the book is very much a philosophical history,
00:04:35.240 | which is what makes it so interesting.
00:04:37.320 | Why do you begin your historical narrative in the 1890s?
00:04:42.320 | - Yeah, so you've got to start somewhere.
00:04:46.320 | And I think the 1890s are a very important decade
00:04:51.000 | because, in my view, we have the rise of centralization.
00:04:56.000 | Okay, and centralization,
00:04:59.360 | when it reaches its culmination,
00:05:01.240 | makes privacy impossible
00:05:03.720 | because a centralized system,
00:05:05.640 | whether that is a server that is controlling
00:05:08.880 | various webpages, whether that is the ISP,
00:05:11.400 | whether that's the central government,
00:05:13.560 | a centralized system has to know all of its components,
00:05:16.880 | okay, in order to function,
00:05:18.620 | as opposed to a decentralized system
00:05:20.280 | where there's no center, there's no one,
00:05:24.060 | to use a good image, no Eye of Sauron that is looking at
00:05:27.760 | and controlling everything.
00:05:28.640 | So in a centralized system,
00:05:31.640 | privacy is metaphysically impossible.
00:05:33.640 | And in the 1890s, we have this serious beginning
00:05:37.200 | of the rise of centralization.
00:05:41.120 | And there are two different things at work here, Josh.
00:05:44.840 | There are the ideas of the time,
00:05:46.460 | and there's the technology that,
00:05:48.320 | well, I don't know which causes which,
00:05:49.840 | so I'll just say they're mutually reinforcing.
00:05:52.280 | So in the 1890s, you have kind of the culmination
00:05:55.120 | of the industrial revolution.
00:05:56.840 | We have trains, we can now cross a huge landmass,
00:06:00.300 | and we now have the ability to police nations, right?
00:06:03.020 | This is the rise of nationalism.
00:06:05.420 | In the United States, 100 years before this,
00:06:09.000 | Thomas Jefferson had said that there was no way
00:06:11.000 | a central government could ever have any hope
00:06:13.680 | of patrolling the landmass of North America.
00:06:16.240 | Well, flash forward to 1890s, now we have trains,
00:06:19.960 | now we have the early automobiles,
00:06:22.320 | now we have steam engines
00:06:23.680 | that can go against the wind in the ocean.
00:06:26.560 | We have telegraphs, we have all this technology
00:06:28.380 | that basically makes it possible
00:06:30.580 | to have a strong central government.
00:06:32.720 | So this is the rise of statism,
00:06:35.720 | to use the term that a lot of libertarians
00:06:38.320 | and anarchists like to use.
00:06:39.760 | And so we have technology that is leading
00:06:42.320 | to centralization in the 1890s.
00:06:44.920 | And of course, the decline of privacy
00:06:46.760 | is basically on the same graph
00:06:48.640 | as the rise of central government over the last 100 years.
00:06:51.440 | So this is kind of the origins of this.
00:06:54.080 | And also at the time,
00:06:55.780 | in addition to the technology,
00:06:58.000 | we have this idea in the 1890s.
00:07:00.820 | And I think people started getting this idea
00:07:03.560 | partly because of the technology that they see,
00:07:06.440 | but also just because there's something
00:07:07.720 | in the water of the time, right?
00:07:09.020 | We have Marxism, Marx is writing a couple of decades
00:07:13.800 | before this, he's writing out his ideas
00:07:15.800 | about how society influences people
00:07:18.960 | and how we need to essentially have a central governing body
00:07:22.960 | that can reallocate things in the best way.
00:07:25.920 | There's this idea that the state or some powerful figure
00:07:29.380 | should be intervening into society
00:07:31.960 | and should be changing things, should be adjusting them.
00:07:34.760 | The welfare state starts around this time
00:07:37.080 | at the turn of the century, early 20th century.
00:07:40.820 | We get all kinds of ideas
00:07:43.020 | where suddenly you're not just an individual,
00:07:46.020 | you're part of a collective.
00:07:47.780 | There's a lot of things in the water,
00:07:49.100 | which we can go into more detail
00:07:51.060 | about how individualism is no longer cool,
00:07:54.680 | laissez-faire is no longer acceptable.
00:07:57.700 | One of the figures I talk about is H.G. Wells.
00:08:00.060 | He has a lot of good lines talking about how
00:08:02.700 | the 19th century was about people doing what they want,
00:08:06.180 | right, individualism.
00:08:07.320 | That's not acceptable in the 20th century.
00:08:09.800 | In the 20th century, we're going to have
00:08:12.080 | to have a common purpose, a common plan.
00:08:14.760 | We're gonna have to have top-down planning.
00:08:16.840 | And of course, anytime you have planning, centralization,
00:08:20.040 | then you have a decline of freedom
00:08:21.360 | and consequently a decline of privacy.
00:08:23.240 | So there was just something going on in the 1890s,
00:08:25.560 | a result of the technology,
00:08:27.060 | but also a result of the ideas of collectivism
00:08:29.720 | that start sprouting at this time.
00:08:31.320 | That makes it a very consequential decade
00:08:33.680 | for us to consider.
00:08:36.160 | - Tell me more.
00:08:37.000 | You just mentioned H.G. Wells
00:08:38.360 | and you talk a lot about his impact in the book.
00:08:41.740 | I'm not knowledgeable enough
00:08:43.980 | to understand a lot of that history.
00:08:45.960 | Why was H.G. Wells such an interesting
00:08:49.320 | and pivotal figure at this time?
00:08:50.920 | - Yeah, we could talk for an hour just about H.G. Wells.
00:08:55.080 | Let me just give the summary to pique people's interest.
00:08:58.520 | So first of all, a lot of us know H.G. Wells
00:09:01.720 | as the guy who was prominent in early science fiction.
00:09:04.600 | We have "War of the Worlds,"
00:09:06.120 | we have "The Invisible Man," "The Time Machine,"
00:09:09.520 | "The Island of Dr. Moreau,"
00:09:10.880 | for those who are familiar with that one.
00:09:12.480 | These are all written in the 1890s
00:09:14.680 | and H.G. Wells would more prominently,
00:09:17.600 | if you can believe it, become known as a huge proponent,
00:09:20.880 | perhaps the biggest proponent of world government.
00:09:24.200 | He literally wrote the book called "The New World Order."
00:09:27.520 | All right, 1939, writes this book.
00:09:30.520 | So this is a guy who devoted his entire life
00:09:33.540 | after he wrote a handful of science fiction books
00:09:35.920 | to world government, world order.
00:09:38.640 | He disbelieved in individualism.
00:09:40.700 | In fact, he would write a thesis
00:09:42.920 | trying to argue at a biological level
00:09:44.680 | that individualism is a biological delusion.
00:09:48.220 | He actually wrote that as a PhD thesis later in life.
00:09:51.600 | And so this was a great enemy of human individualism,
00:09:55.320 | freedom, a huge proponent of world government
00:09:59.400 | and centralization.
00:10:00.240 | He wrote a number of books that I talk about in my book,
00:10:03.700 | "A Modern Utopia,"
00:10:05.260 | which is a perfectly dystopian world
00:10:07.800 | where everything is controlled and top-down planning
00:10:10.620 | and anybody who disagrees is sent to an island prison.
00:10:14.100 | But that's who H.G. Wells was.
00:10:19.620 | He was very influential.
00:10:21.600 | George Orwell says of him that, let's see.
00:10:24.940 | Yeah, here's the quote.
00:10:26.140 | This is what George Orwell says.
00:10:27.260 | He says, "Thinking people who were born
00:10:28.820 | about the beginning of the century
00:10:30.380 | are in some sense H.G. Wells's own creation.
00:10:33.500 | I doubt whether anyone who was writing books
00:10:35.460 | between 1900 and 1920, at any rate in the English language,
00:10:39.220 | influenced the young so much.
00:10:41.140 | The minds of all of us, and therefore the physical world,
00:10:43.900 | would be perceptibly different if Wells had never existed."
00:10:46.880 | So this was not just a proponent of all these things,
00:10:49.180 | of centralization in its purest form,
00:10:52.260 | but he influenced the welfare state.
00:10:53.940 | He was friends with Winston Churchill
00:10:56.140 | in the early 20th century, when the welfare state in Britain
00:10:58.900 | came about, that great emblem of centralization.
00:11:02.420 | So he was an incredibly influential person.
00:11:05.180 | And you know, Josh, if we look back at some of these stories
00:11:08.420 | that we are familiar with, let's just do that for a moment.
00:11:10.980 | We have "The War of the Worlds."
00:11:12.900 | Think about "The War of the Worlds."
00:11:14.740 | First of all, why is humanity destroyed
00:11:17.460 | if you read that book?
00:11:18.300 | And I do a close reading of all these books.
00:11:20.080 | It's destroyed because it is too disorganized.
00:11:22.620 | It's too decentralized.
00:11:24.260 | The people have, and this is exactly what he says,
00:11:26.580 | no common plan in London, right?
00:11:30.040 | So all these people are scattered
00:11:31.740 | because they don't have a plan.
00:11:34.340 | Now, the Martians who have, by the way,
00:11:35.900 | been looking at them for a long time, right?
00:11:38.500 | This is the great Eye of Sauron,
00:11:39.740 | this great organized, perfectly technocratic society,
00:11:44.640 | for lack of a better word.
00:11:46.440 | They're not blamed at all for their invasion.
00:11:48.040 | In fact, they're encouraged for their invasion.
00:11:49.860 | This is actually something that the narration encourages,
00:11:53.920 | because this is a more advanced, more developed,
00:11:56.660 | more evolutionarily advanced race of beings
00:12:01.660 | who are centralized, who do have a plan,
00:12:04.340 | and who destroy humanity because humanity is too laissez-faire.
00:12:09.340 | We look at "The Invisible Man," right,
00:12:11.340 | where individualism basically becomes pathological:
00:12:18.080 | if you're so much an individual, right,
00:12:20.180 | like this invisible man, you become psychotic, right?
00:12:23.420 | You become a monster.
00:12:24.880 | That's kind of one of the takeaways
00:12:28.060 | of that particular book.
00:12:29.700 | You look at "The Island of Dr. Moreau,"
00:12:32.900 | this very disturbing technocratic island
00:12:36.380 | where this doctor goes to experiment on animals
00:12:39.020 | and turn them into humans, right?
00:12:41.060 | This transhumanism that is in this early work,
00:12:44.040 | which is encouraged.
00:12:45.340 | H.G. Wells was actually surprised
00:12:46.860 | that people were so critical of it and so disturbed by it.
00:12:49.700 | For him, these were just his ethics.
00:12:52.620 | And just to finish my point here,
00:12:54.140 | this was a man who was one of the first human beings
00:12:57.060 | who had a purely scientific education.
00:12:59.340 | He was part of this experimental college in London
00:13:02.620 | where he was taught actually by T.H. Huxley,
00:13:05.340 | who was known as Darwin's bulldog.
00:13:07.100 | So this is a man who was saturated
00:13:08.900 | in evolutionary thinking.
00:13:10.620 | He was saturated in scientific thinking.
00:13:13.540 | He was an atheist.
00:13:15.540 | He had a huge disdain for humanity as individuals.
00:13:19.180 | He thought that they should be directed from the top down,
00:13:21.820 | and he wrote a hundred books, okay?
00:13:23.460 | He wrote a hundred books,
00:13:24.900 | almost all of them dedicated to describing how humanity can
00:13:29.900 | and should be directed
00:13:33.260 | from a top-down approach.
00:13:36.140 | And that's what he dedicated his life to.
00:13:37.820 | And many people were influenced,
00:13:39.780 | not just by his early fiction,
00:13:41.400 | but by his more prescriptive books later on in his life.
00:13:46.200 | - I admire your having read through his work.
00:13:48.660 | I've never read a single H.G. Wells book.
00:13:50.700 | And so I admire your having gone through
00:13:52.660 | and traced that history,
00:13:54.420 | 'cause I just hear about his name.
00:13:56.220 | But him being the origin of so many of these ideas
00:14:01.220 | is something that I need to learn more about.
00:14:04.300 | After your chapter on H.G. Wells,
00:14:06.140 | you go to another writer named Joseph Conrad.
00:14:08.920 | Tell us about Conrad and what he contributed to this saga.
00:14:13.040 | - Yeah, so there's a lot of connections here
00:14:16.820 | that I tried to bring together.
00:14:18.580 | And Joseph Conrad is actually a friend of H.G. Wells
00:14:21.420 | in the 1890s. When H.G. Wells started going on his tirade
00:14:24.780 | for world government, they stopped being friends.
00:14:27.280 | So that kind of tells you the difference between the two.
00:14:29.340 | And I simply elevate Joseph Conrad,
00:14:31.940 | this Polish-born author who actually managed
00:14:37.300 | to become arguably the best English novelist
00:14:40.300 | despite writing in his third language. Incredible story.
00:14:42.940 | He wrote books such as "Heart of Darkness" and "Nostromo."
00:14:46.220 | The guy who wrote "The Great Gatsby"
00:14:49.860 | actually said, "If I could write any book ever written,
00:14:53.340 | "I would prefer to have written 'Nostromo.'"
00:14:54.940 | So Joseph Conrad is this really amazing writer,
00:14:57.320 | much better than H.G. Wells.
00:14:59.020 | And they basically have a disagreement.
00:15:00.820 | I kind of put H.G. Wells as the great centralist,
00:15:04.300 | and I put Joseph Conrad as the decentralist, right?
00:15:07.320 | In his fiction, he's writing in a narrative way
00:15:09.640 | that is emphasizing individualism,
00:15:11.760 | that is emphasizing this idea
00:15:13.180 | that we don't live in a perfect world.
00:15:15.420 | We should not be pursuing this utopia,
00:15:18.140 | this perfect world on earth, that humans are flawed.
00:15:20.980 | And his fiction is basically just recognizing
00:15:24.100 | that humans are flawed and appreciating humanity.
00:15:27.020 | He has this great line.
00:15:27.900 | He writes to Wells as their friendship is breaking up.
00:15:30.420 | He says, "The difference between us, Wells, is fundamental.
00:15:33.700 | "You don't care for humanity,
00:15:35.180 | "but think there are to be improved.
00:15:37.300 | "I love humanity, but no, they are not."
00:15:40.180 | So that's this strong dichotomy
00:15:42.020 | between the utopian and the anti-utopian view.
00:15:46.600 | And I just see Joseph Conrad and his fiction
00:15:48.420 | and his ideas as emblematic of this 19th century,
00:15:51.880 | more laissez-faire, more constrained vision of humanity.
00:15:55.860 | And so he's there as a contrast to H.G. Wells.
00:15:58.440 | - The next section in your book
00:16:00.580 | is the one that I was particularly fascinated with
00:16:02.940 | just because it has such a great bearing on modern finance.
00:16:06.580 | So in your third section,
00:16:09.020 | you title it "Privacy, Utopia, and the Welfare State."
00:16:14.020 | As a financial planner
00:16:15.640 | and one who pays attention to government finances
00:16:18.420 | and things like that,
00:16:19.260 | I'm intensely conscious of the welfare state.
00:16:21.560 | So how did these ideas influence the creation
00:16:26.560 | of the modern welfare state?
00:16:29.240 | - So if you go back in your minds to this period of time
00:16:36.060 | where suddenly we now have journalism in a serious way,
00:16:40.220 | we now have photography,
00:16:41.940 | we now have census data that is right there in front of us,
00:16:45.280 | it's hard for you as a politician
00:16:47.140 | or as somebody reading the newspaper
00:16:48.620 | not to pick it up and say,
00:16:50.140 | "Oh, hey, we have all these statistics now.
00:16:52.060 | We see how many people are poor.
00:16:54.140 | We see how many people are destitute."
00:16:56.460 | And of course, this has been the case
00:16:57.940 | throughout human history.
00:16:59.140 | The real miracle, of course, Joshua,
00:17:02.900 | is why that is not the case everywhere, right?
00:17:05.820 | Why free markets, capitalism, et cetera,
00:17:09.060 | made it so that at least not everybody is poor,
00:17:09.060 | right?
00:17:09.900 | But that's not how people looked at it, right?
00:17:11.740 | Of course, in the 1890s, around this period of time,
00:17:14.500 | people looked at all this new data,
00:17:17.060 | this photographic realism about how destitute people were.
00:17:20.420 | And of course, people said,
00:17:21.780 | "Well, we should do something about it," right?
00:17:23.940 | Those infamous words.
00:17:25.940 | And they actually had the ability to make it happen, right?
00:17:28.540 | Because now we have all these central institutions,
00:17:31.040 | government is playing a bigger role in people's lives.
00:17:34.700 | There's a lot more urbanization.
00:17:36.380 | And so everybody's just kind of crowded around
00:17:38.620 | and, you know, it's kind of right in front of you.
00:17:41.300 | You know, there's a good statistic in the UK, in Britain,
00:17:45.100 | in 1800, 10% of people are urban.
00:17:48.300 | And fast forward 100 years, 1900, 90% of people are urban.
00:17:53.300 | So there's a lot of things just happening
00:17:56.540 | on the face of the earth within the West
00:17:59.380 | that are showing that, hey,
00:18:00.900 | we have all these poor and destitute people.
00:18:03.140 | And it makes sense if that's right in front of you,
00:18:06.620 | and you have this growing idea
00:18:07.940 | that individualism is old-fashioned,
00:18:11.460 | that we need to have a common plan,
00:18:13.340 | we need to have a collective solution.
00:18:15.340 | It makes sense that, hey,
00:18:16.820 | what if we give some money to these people, okay?
00:18:20.500 | And it's just kind of a logical conclusion.
00:18:22.660 | You can understand how that came about.
00:18:24.660 | And so we get the welfare state in 1905,
00:18:28.460 | which I think is when Britain passes its first welfare laws;
00:18:32.740 | that is the camel's nose under the tent, of course.
00:18:36.060 | You know, the main politician
00:18:37.900 | who was behind this,
00:18:40.700 | David Lloyd George, who was partners with Winston Churchill,
00:18:43.740 | also a big proponent of the welfare state,
00:18:45.660 | he had this great speech he gave
00:18:47.980 | talking about how we need to wage war on poverty.
00:18:51.540 | And if that sounds familiar,
00:18:53.300 | that was certainly happening 100 years ago.
00:18:55.580 | So basically it was politicians
00:18:57.220 | who were starting to realize
00:18:58.540 | they had increased power over people.
00:19:01.020 | They could increase their power.
00:19:02.860 | Maybe they thought they were helping people.
00:19:04.300 | I'm sure there was some of that.
00:19:05.300 | I don't think it was exclusively that,
00:19:07.100 | but either way, the welfare state
00:19:08.620 | is just the camel's nose under the tent
00:19:10.580 | of letting government intervene in our lives,
00:19:13.740 | the rise of statism.
00:19:16.780 | And once that happened,
00:19:18.180 | you can easily see how, fast forward from 1905 to 1914,
00:19:23.180 | we had the First World War.
00:19:24.940 | Suddenly we have a nationalist conflict
00:19:27.180 | and you are now a citizen, right?
00:19:28.860 | Passports also crop up during this time.
00:19:31.260 | So you are now a citizen of this nation, right?
00:19:33.860 | Germany was not really a nation before this time.
00:19:37.020 | Suddenly it's all kind of conglomerated into one.
00:19:39.580 | The same for Italy.
00:19:40.420 | These were previously kind of disparate little states
00:19:44.220 | and now they're one entity.
00:19:45.540 | So now you are part of a nation.
00:19:47.820 | You're a citizen.
00:19:48.660 | You have the yoke of citizenship, as I call it.
00:19:51.140 | You're given welfare perhaps in some cases.
00:19:52.900 | So you have an obligation even
00:19:54.820 | to fight for your government, to fight for the state.
00:19:57.340 | So this is just the origin of big government as we know it.
00:20:00.700 | It started in a nice way, right?
00:20:02.260 | Let's help the poor people.
00:20:03.860 | And then suddenly we get to the point
00:20:05.260 | where you cannot leave the country in Italy.
00:20:07.900 | We've revoked passports of young men
00:20:11.140 | so that they literally cannot leave Italy.
00:20:13.420 | This was, you know, first world war.
00:20:16.260 | It was a very quick, but logical, ethical succession
00:20:20.820 | from welfare state to warfare state to where we are today.
00:20:26.460 | - You move from here into talking
00:20:28.980 | about the concept of utopia.
00:20:31.500 | And what I find so fascinating about that word
00:20:34.620 | is that word expresses something that I think we all want.
00:20:39.620 | We all want to live in a perfect world.
00:20:42.480 | So why do you see the utopian mindset as a problem?
00:20:46.660 | - Yeah, it's a good question.
00:20:50.820 | So the reason I see it as a problem is
00:20:54.460 | because the way the utopians of this era,
00:20:57.940 | of which HG Wells was a prominent figure,
00:21:00.820 | 1905, I think he publishes his book, "A Modern Utopia."
00:21:05.820 | And it's because there's a difference, Joshua,
00:21:09.340 | between an idea of, let's say, a Christian idea of utopia,
00:21:12.460 | and then this atheistic, rationalistic, eugenic idea
00:21:16.660 | that we need to create utopia now, right?
00:21:20.020 | Because these utopians who really gained footing
00:21:24.300 | in the period of time that I'm discussing
00:21:26.860 | are people who said, hey, we have a powerful state.
00:21:29.900 | We have a scientific understanding.
00:21:32.660 | We should apply this to the world and create what we,
00:21:37.260 | of course, this is a group of elites, right?
00:21:39.040 | What we think is best for everybody.
00:21:41.560 | And so utopia, according to this idea, is always coercive.
00:21:45.340 | And you could argue that by that fact alone, it is evil.
00:21:48.620 | I start off my book with a great epigraph
00:21:52.860 | from "The Tempest," Shakespeare's famous final play,
00:21:56.260 | "The Tempest," where there's this character talking about,
00:21:59.860 | yeah, there's gonna be this perfect world,
00:22:01.700 | and everybody's going to be idle,
00:22:04.380 | and all the women are going to be pure,
00:22:06.220 | and there's not gonna be any sovereignty.
00:22:08.160 | And then his friend cuts him off and he says,
00:22:09.620 | yeah, but you're gonna be the king of this place,
00:22:11.900 | aren't you?
00:22:12.740 | And it's, as always, Shakespeare understands human nature
00:22:16.620 | and the way things work very carefully.
00:22:18.580 | Behind every utopia, every atheistic utopia in this sense,
00:22:22.600 | there's always some coercive force, right?
00:22:25.020 | You look at H.G. Wells's 1905 "A Modern Utopia."
00:22:28.660 | Everything seems nice, everybody has this,
00:22:30.500 | everybody has that, but wait, you disagree?
00:22:32.860 | Well, we're sending you off to an island prison.
00:22:36.260 | And if you don't agree to share
00:22:38.820 | your possessions with others,
00:22:40.420 | well, then that's not acceptable.
00:22:41.900 | We're going to have a force that is going to take it from you
00:22:44.660 | and give it to somebody else.
00:22:45.620 | So this idea of utopianism,
00:22:48.420 | in the sense that I described in the book,
00:22:50.340 | is a very coercive idea, right?
00:22:52.680 | Marxism is a utopian idea.
00:22:54.520 | It sounds good until you realize that, wait a second,
00:22:57.280 | there's going to have to be somebody in charge.
00:22:59.200 | It's not going to be me.
00:23:00.800 | It's going to be somebody else.
00:23:02.520 | And they're going to have their own view of things.
00:23:04.680 | And they themselves are going to be fallen humans
00:23:07.500 | with their own problems.
00:23:09.000 | And of course we know,
00:23:11.120 | as a lot of this audience is libertarian,
00:23:13.280 | that power corrupts and absolute power corrupts absolutely.
00:23:17.060 | So that's the problem with utopia as I see it
00:23:20.080 | in the way I explain it in this book.
00:23:23.060 | - So why, then, did the dystopian movement arise,
00:23:31.280 | especially the early dystopian novels?
00:23:33.800 | Was that just a natural reaction,
00:23:35.840 | or were people already seeing some of the downsides
00:23:39.480 | of these social movements
00:23:41.120 | and then writing and extrapolating from there?
00:23:43.440 | - So this is a very interesting thing
00:23:47.360 | that I discovered in writing this book,
00:23:49.640 | that H.G. Wells,
00:23:52.280 | especially his 1905 book, "A Modern Utopia,"
00:23:55.120 | actually inspired the dystopian novel.
00:23:58.880 | And you can look at H.G. Wells' book, "A Modern Utopia,"
00:24:03.160 | and I think everybody should read it.
00:24:04.520 | It's a great historical document.
00:24:05.640 | It actually literally influenced the welfare state.
00:24:09.100 | Winston Churchill actually says he read it
00:24:10.940 | with great sympathy.
00:24:12.380 | And he was one of the chief architects
00:24:14.280 | of the welfare state.
00:24:15.720 | And this book itself can be considered dystopian.
00:24:19.520 | Now, H.G. Wells did not write it with that mindset.
00:24:22.880 | So, what is dystopia, first of all?
00:24:24.440 | Dystopia is simply basically recognizing
00:24:27.040 | what I've recognized and just explained: that
00:24:29.280 | in any utopian world, there is a coercive element.
00:24:32.760 | There are people in control, people in charge at the top,
00:24:35.800 | and these people have their own agenda.
00:24:37.880 | And that's what pretty quickly a lot of people recognize.
00:24:41.460 | So in 1909, just a few years later, E.M. Forster,
00:24:45.260 | the famous English novelist
00:24:48.040 | who wrote books like "Howards End"
00:24:49.440 | and some of these famous books,
00:24:50.680 | actually wrote a long short story
00:24:52.440 | called "The Machine Stops."
00:24:54.440 | Now, "The Machine Stops" is a dystopian short story
00:24:59.160 | about a society that's underground.
00:25:01.080 | They kind of messed up the earth,
00:25:02.200 | trying to mess with various things.
00:25:04.000 | So they have to live underground.
00:25:05.120 | They all live in these little cubicles.
00:25:07.120 | They don't interact with each other.
00:25:08.900 | They have telescreens where they get all their news
00:25:12.120 | and information, and the machine, who is never seen,
00:25:15.060 | basically takes care of them.
00:25:16.920 | It gives them what they need.
00:25:18.040 | It does all these things.
00:25:19.560 | And of course, as you can imagine,
00:25:21.380 | these are very depressed people.
00:25:22.520 | These are very purposeless people.
00:25:24.520 | And E.M. Forster basically describes
00:25:26.960 | this seemingly utopian world that's actually,
00:25:29.880 | when you get to the end of it, dystopian.
00:25:31.300 | So he has a negative take on the idea of utopia.
00:25:35.080 | So H.G. Wells's own work would literally inspire
00:25:38.880 | the dystopian genre.
00:25:39.760 | He literally created the dystopian genre.
00:25:43.160 | People like E.M. Forster; shortly later,
00:25:45.280 | you have Aldous Huxley and "Brave New World,"
00:25:47.520 | another famous dystopia.
00:25:49.240 | George Orwell specifically mentions H.G. Wells' work
00:25:53.480 | as inspiration for his own take on things,
00:25:57.280 | which is simply, hey, this idea of utopia is silly,
00:26:02.280 | in the sense that it gives power to a group of people
00:26:06.400 | that is going to use it for their own ends.
00:26:08.080 | We don't want any of that.
00:26:09.000 | We're going to expose it.
00:26:10.420 | So that's the origin of dystopia.
00:26:13.640 | - Bring it forward, then, to where we are today.
00:26:18.640 | As you have now been informed of the origins
00:26:25.040 | of some of these ideas, what do you see working out
00:26:28.920 | in today's world, and how does that impact your life
00:26:32.120 | and our lives?
00:26:33.360 | - So part of writing a book about 100 years ago
00:26:40.680 | is so that I don't have to answer this question right away.
00:26:44.680 | But let me do my best.
00:26:46.440 | - I gave you 26 minutes of background,
00:26:48.400 | and now it's time for the, what do I do,
00:26:50.400 | which is what I care about.
00:26:51.240 | - Yeah, yeah, fair enough.
00:26:53.520 | And I don't have a good answer.
00:26:54.840 | I think, obviously, people listening have a sense of things.
00:26:58.520 | You have a good sense of things.
00:27:00.120 | So I think maybe the main takeaway of this book
00:27:05.120 | is simply to recognize the origin of some of the ideas
00:27:09.520 | out there.
00:27:10.360 | When you see people saying, hey, we need to fix things,
00:27:13.280 | we need a powerful person in charge, and so on,
00:27:17.200 | you can see these ideas even to this day. We didn't talk about eugenics,
00:27:20.160 | but eugenics was a big part of this era.
00:27:23.120 | And in eugenics, you can see the two-sided aspect
00:27:28.120 | of this progressive utopian view, which is, hey,
00:27:31.360 | we have all this data, we can see that moronic people,
00:27:34.360 | that's what they call them,
00:27:35.480 | are literally holding our society back.
00:27:37.480 | In fact, they presented the evidence, right?
00:27:39.680 | People with low IQs are actually holding back our economy.
00:27:42.480 | They gave lectures on this.
00:27:43.760 | Everybody you know from the era was a proponent of eugenics,
00:27:46.680 | including Winston Churchill, Virginia Woolf,
00:27:49.000 | Helen Keller, you name it.
00:27:50.080 | They were all proponents of eugenics; of course, H.G. Wells was too.
00:27:54.480 | And you can see that these ideas remain with us.
00:27:58.040 | This idea that we need to tamper with humanity,
00:28:00.920 | we need to go beyond our physical nature.
00:28:05.640 | Of course, Neuralink and things of this sort,
00:28:08.280 | I remember listening to a talk on an Asian guy
00:28:12.680 | who said that Asian people are shorter,
00:28:14.960 | they consume fewer resources,
00:28:16.160 | so we should in the future modify people to be shorter.
00:28:20.000 | Of course, we have CRISPR,
00:28:21.120 | which has the ability to change genes.
00:28:23.280 | So if we recognize the fundamental assumptions
00:28:27.320 | of this worldview, which are that
00:28:29.080 | we should go beyond our nature, right?
00:28:31.080 | We are in control of our nature.
00:28:32.920 | We can conquer our nature.
00:28:34.200 | Of course, as C.S. Lewis says,
00:28:35.800 | "If we conquer our nature,
00:28:36.880 | who exactly will have won that battle?"
00:28:39.280 | That's a good question.
00:28:40.560 | When we understand and really ruminate
00:28:44.920 | on how flawed humans are,
00:28:46.400 | we're not going to elevate them to positions of power.
00:28:48.880 | We're not going to use that same human nature
00:28:50.800 | to try to transcend ourselves.
00:28:53.000 | We're still trying to figure out ourselves.
00:28:55.040 | And so we need to have this more humble,
00:28:56.840 | constrained view of humanity,
00:28:59.760 | which will necessarily lead to more limited governments,
00:29:02.800 | more limited power,
00:29:04.080 | more people working as communities together,
00:29:07.040 | more people who are just trying to do things
00:29:09.360 | in a decentralized, local way.
00:29:11.520 | That's kind of the main takeaway
00:29:13.320 | that I hope people get from reading about this.
00:29:16.320 | I end the book with a great quotation
00:29:19.000 | from the famous anarchist Leo Tolstoy.
00:29:22.080 | He says, "Everybody thinks about changing the world,
00:29:25.680 | but nobody thinks about changing himself."
00:29:28.560 | What we need is an internal revolution.
00:29:31.560 | The regeneration of the inner man.
00:29:34.040 | And I think there's a lot of wisdom in that.
00:29:36.240 | - Ideologically, I'm sold on these ideas.
00:29:42.240 | They're what I think is correct and broadly effective.
00:29:49.040 | So I'm aligned with you.
00:29:50.960 | But I've also been considering the other side,
00:29:54.480 | just thinking a lot about whether
00:29:57.200 | I'm just too extremist.
00:30:00.320 | Am I dogmatically committed to a certain approach
00:30:04.200 | without considering the evidence?
00:30:06.160 | And I'll give a practical example,
00:30:07.960 | which is quite chilling when you talk about the novels
00:30:12.960 | and the things of olden days.
00:30:16.960 | I recall seeing, I don't know,
00:30:19.160 | a couple of months back,
00:30:20.760 | some tweet that you had made
00:30:22.400 | about how people are constantly using AI.
00:30:25.680 | And everyone says they're opposed to the machine
00:30:29.040 | and then they're turning everything in their life
00:30:30.760 | over to ChatGPT or whatever local AI platform.
00:30:34.720 | And I see the integration of ChatGPT
00:30:37.160 | as like the perfect current example
00:30:39.000 | of this philosophical challenge.
00:30:41.240 | I find it to be one of the most incredible daily tools
00:30:44.600 | that I use and I use it constantly
00:30:47.320 | because it is profoundly helpful.
00:30:50.760 | And yet it's also profoundly flawed.
00:30:53.520 | It makes up stuff left, right, and center.
00:30:57.120 | Constantly just making stuff up out of thin air,
00:30:59.440 | constantly getting stuff wrong.
00:31:01.240 | And it does become a crutch.
00:31:04.320 | It becomes something that you kind of lean on and say,
00:31:06.280 | well, if I can use AI to do this,
00:31:08.160 | then I don't have to think it through myself.
00:31:12.920 | I can just have the machine do this.
00:31:15.200 | And so I'm trying to find the place to put my feet down
00:31:19.280 | where I'm not just a Luddite,
00:31:21.640 | just anti-technology for the sake of it.
00:31:23.960 | I'm not just dogmatically opposed to progress
00:31:27.000 | because I'm a contrarian.
00:31:30.120 | But I'm also trying to take the informed approach
00:31:32.920 | and learn and say, well, where are my weaknesses?
00:31:37.160 | And so I've been trying to sort out
00:31:39.520 | kind of the proper way to do that.
00:31:40.760 | And I don't expect you to answer that
00:31:42.800 | because each person, each of us has to make our own decisions.
00:31:46.720 | But it's fascinating to think about my challenges of myself
00:31:51.320 | and think about how can I create this balance.
00:31:54.920 | Maybe I just like to ride the fence,
00:31:56.840 | but I don't want to be that person who says,
00:32:00.000 | well, I'm opposed to progress.
00:32:01.160 | I want to be the person who's building towards a utopia,
00:32:04.160 | but I want to do it in an informed way,
00:32:05.880 | recognizing the flawed nature,
00:32:08.040 | the sin nature of human beings,
00:32:10.880 | the fallen nature so that we build proper safeguards
00:32:13.960 | around it that create towards progress
00:32:17.360 | without having to believe the lie
00:32:21.520 | that humans are somehow fundamentally different today
00:32:23.960 | than they were 100 years ago.
00:32:26.680 | - There's a lot of wisdom in that, Joshua.
00:32:29.680 | I don't have too much to say on it.
00:32:32.120 | Maybe I'll just recommend a good book for you
00:32:35.800 | or for other people, "The Abolition of Man" by C.S. Lewis.
00:32:40.440 | He was a guy who, I think this is written in 1943.
00:32:43.520 | And so he saw all the technocratic things
00:32:47.440 | that were building up.
00:32:48.640 | He saw the idea of transhumanism early on.
00:32:50.960 | He saw the idea of the welfare state and its consequences.
00:32:55.440 | And he's got some interesting things to say in there,
00:32:58.840 | but I have been very anti-AI
00:33:05.200 | and anti a lot of these things.
00:33:05.200 | And I think that all I'll say is,
00:33:07.920 | I think we should be okay with putting our foot down
00:33:10.120 | about certain things and say, yeah, I'm not,
00:33:12.600 | this is not progress, right?
00:33:14.000 | Because this is maybe one of the nastiest consequences
00:33:18.240 | of the utopian era, the progressive era,
00:33:21.040 | is that they suddenly have us really questioning things
00:33:24.280 | and saying, why shouldn't I do this?
00:33:26.680 | Why shouldn't I try this?
00:33:28.160 | That's progress after all,
00:33:29.600 | but we need to define our terms very carefully.
00:33:32.200 | What is progress?
00:33:34.560 | If I have a fridge that can know what I need
00:33:39.560 | and order more of it, there's obviously,
00:33:44.480 | and this is something I talk about a lot
00:33:46.480 | in privacy circles, there's a consequence
00:33:48.640 | to all of this stuff.
00:33:49.920 | And of course, the consequence of technology
00:33:54.000 | is that we lose our ability to be self-sufficient,
00:33:58.840 | our memories degrade, there's all kinds of consequences
00:34:01.560 | that I would not categorize as progressive.
00:34:04.040 | So there's a lot of technology, smartwatches, for example,
00:34:07.160 | that I think are not progressive at all.
00:34:11.200 | I think more human time is lost
00:34:14.520 | to all the alerts and all the rest of a smartwatch
00:34:16.760 | than is gained in actual benefit.
00:34:18.360 | So I would just encourage people to be more critical
00:34:21.320 | about this idea of progress.
00:34:23.080 | Not everything, not all movement is progress.
00:34:25.680 | And just to quote another great conservative thinker,
00:34:28.680 | Edmund Burke, he says,
00:34:30.640 | "We think that there are no new discoveries
00:34:33.000 | to be made in morality."
00:34:35.080 | - Interesting.
00:34:38.880 | I can see the point that you're making
00:34:40.280 | and I could lend some arguments to it
00:34:42.360 | from my kind of watching modern society.
00:34:44.920 | We've created, with all of our progressive culture,
00:34:48.600 | we've created a culture that demands less of human beings
00:34:53.400 | and that easy living seems to be toxic to human beings.
00:34:58.400 | The rates of depression, the rates of suicide,
00:35:03.040 | the rates of just general unhappiness,
00:35:05.640 | the rates of loneliness,
00:35:07.120 | all of them are much higher
00:35:08.280 | than they have been historically.
00:35:10.240 | And even just the continuation of our species,
00:35:14.080 | that, in general, the most progressive cultures,
00:35:22.200 | where there are the fewest problems,
00:35:24.200 | tend to be the cultures that are literally dying out
00:35:27.760 | and literally going extinct over the course of time,
00:35:31.960 | unless we can find some way to correct it.
00:35:34.440 | Meanwhile, the cultures that have not progressed
00:35:38.880 | in the same way
00:35:40.120 | and have not just openly embraced every single technology
00:35:43.880 | tend by those metrics to have higher rates of happiness,
00:35:47.600 | higher rates of satisfaction,
00:35:49.920 | higher rates of human connection
00:35:51.560 | and higher birth rates for the continuation of the species.
00:35:54.660 | So it's fascinating that what we would think
00:35:57.000 | would be an unalloyed good of making life easy
00:36:01.120 | doesn't seem to be the pathway for humans
00:36:03.720 | that automatically makes for better outcomes.
00:36:06.880 | But I also simultaneously find myself frustrated
00:36:11.680 | by people who just stand against progress.
00:36:14.440 | So once again, I try to straddle the fence.
00:36:16.920 | - Right.
00:36:17.760 | And I think I do a good job of straddling this, Josh.
00:36:20.680 | So I don't use any Apple products.
00:36:23.200 | And so I miss out on that whole ecosystem of,
00:36:25.280 | oh, hey, all your devices are integrated
00:36:27.520 | and you have Siri.
00:36:29.240 | Oops, maybe I shouldn't have said that.
00:36:31.000 | You have our favorite female AI,
00:36:34.640 | Apple Assistant helping us out
00:36:36.480 | and all these kinds of things.
00:36:38.240 | And you know what?
00:36:39.080 | I've used those from friends at times.
00:36:40.960 | And I have to say that not having them in my life,
00:36:43.520 | I see no difference, no difference.
00:36:46.840 | In fact, sometimes I think I'm more productive
00:36:49.760 | not having the distractions, not having the worry.
00:36:51.920 | Obviously my privacy is more preserved.
00:36:54.060 | So I can still use a computer.
00:36:56.120 | I'm looking at a computer right now.
00:36:57.800 | It's a Linux computer.
00:36:59.120 | I'm using all these useful programs.
00:37:01.040 | I just don't buy into the idea
00:37:03.000 | that I need an app for this and an app for that.
00:37:06.000 | 'Cause you don't.
00:37:06.840 | You can benefit from the underlying technology
00:37:09.680 | of the internet and computers
00:37:11.200 | and all these sorts of things
00:37:12.360 | without going that bizarre extra mile of saying,
00:37:15.600 | yeah, but progress is always pursuing
00:37:18.280 | that next little gadget.
00:37:20.240 | In my experience, that is not the case.
00:37:22.840 | There are technologies that are useful
00:37:25.040 | and there are many that are worthless.
00:37:26.400 | And I think that unfortunately,
00:37:27.580 | most of the things that humans are producing
00:37:29.580 | in terms of technology these days are actually useless.
00:37:33.120 | - Yeah.
00:37:34.040 | So it seems like privacy,
00:37:36.080 | do you consider that privacy is a good,
00:37:38.680 | the word is failing me,
00:37:42.440 | but like a good proxy, I guess,
00:37:49.900 | for some of this stuff,
00:37:52.240 | that by maintaining privacy,
00:37:55.360 | you can advance,
00:37:56.960 | but basically have a digital minimalist lifestyle
00:38:00.620 | to channel Cal Newport's book.
00:38:03.280 | You can make appropriate progress,
00:38:05.440 | but not go so far that you wind up getting sucked
00:38:08.360 | into the utopian vortex that winds up destroying you.
00:38:11.080 | Do you see it as a useful proxy,
00:38:13.520 | kind of a metric that aligns with these other metrics
00:38:16.640 | that lead to human flourishing?
00:38:18.200 | - My first, my very first episode
00:38:21.780 | on the Watchmen Privacy Podcast,
00:38:23.160 | I talk about digital minimalism.
00:38:26.280 | And for me, a privacy lifestyle started
00:38:29.400 | from the very beginning in,
00:38:31.200 | wait a second, how do these systems work?
00:38:33.580 | So it started with understanding these systems,
00:38:35.760 | computer systems, how data is shared,
00:38:37.920 | how the internet works.
00:38:39.320 | And once I understood them, I could say, okay,
00:38:41.560 | do I really need to participate in all these things?
00:38:43.880 | And the answer was no.
00:38:45.040 | And that immediately gives me privacy.
00:38:47.200 | And then I could start to understand
00:38:50.420 | what tech I need and don't need.
00:38:52.480 | So yeah, I found that a privacy lifestyle,
00:38:54.680 | especially as I talk about it,
00:38:56.280 | is a way of understanding our technology
00:38:59.380 | to a better degree,
00:39:01.540 | hiding from a lot of the surveilling systems as a result.
00:39:06.360 | But yeah, just being more in touch with
00:39:08.920 | what is important and what is not important.
00:39:12.240 | So it is more of a holistic philosophy.
00:39:14.440 | And as a result, you avoid sharing your data
00:39:17.880 | with all these systems.
00:39:18.720 | So yeah, I think the mindset of understanding your threat,
00:39:22.640 | your threat model, understanding the system,
00:39:24.800 | understanding the technology,
00:39:26.560 | and deciding purposely to refuse
00:39:29.920 | a lot of these technologies is a very good mindset.
00:39:33.600 | And I feel very at peace in where I am as a result.
00:39:37.260 | - Yeah, excellent.
00:39:38.780 | Is there anything in the book that I haven't asked you about
00:39:41.700 | that you think would be interesting
00:39:42.940 | for this kind of podcast format to talk about?
00:39:47.040 | - Yeah, let me think about this for a second.
00:39:52.860 | - By the way, while you're thinking,
00:39:55.540 | I just want to congratulate you on your media.
00:39:55.540 | I enjoy reading your writing
00:39:57.220 | because you write with a flair that is often gone
00:40:00.660 | in modern business writing.
00:40:03.220 | I know you do these podcasts and things
00:40:04.700 | because you have to, to promote your writing.
00:40:06.320 | And I just want to say, keep up the good work, keep writing,
00:40:08.760 | because you do so, you're a great writer,
00:40:11.120 | and I really enjoy reading your prose.
00:40:13.020 | - I appreciate that.
00:40:14.800 | Maybe that's one thing to point out
00:40:15.920 | for people who are listening and say,
00:40:16.960 | well, is this overwhelming?
00:40:18.480 | No, I've decided from now on,
00:40:19.840 | I'm not going to write a book more than 200 pages.
00:40:22.420 | I don't think it's necessary.
00:40:23.760 | I think it's a waste of time.
00:40:25.160 | So I'm the kind of person,
00:40:26.540 | I rewrote certain paragraphs 30 times
00:40:29.120 | just to remove certain things,
00:40:30.400 | to say in one sentence
00:40:32.800 | things that didn't need 10 sentences.
00:40:34.520 | So I think if you give the book a try,
00:40:36.720 | you'll realize that there's a lot of ideas
00:40:38.160 | condensed in a very approachable way.
00:40:40.400 | I'm not trying to use crazy vocabulary.
00:40:43.360 | And it's actually more like 160 pages
00:40:45.880 | when you kind of cut out some of the intermediary parts.
00:40:49.520 | So that is my intent to not just have good ideas,
00:40:53.180 | but to explain them in a good way in writing
00:40:56.720 | that people can really grasp onto.
00:40:58.760 | And there's a whole lot here.
00:40:59.840 | We talk about science fiction,
00:41:01.680 | utopianism, defining privacy early on,
00:41:04.520 | have a good philosophical discussion
00:41:06.320 | about decentralization,
00:41:08.040 | which is almost synonymous with privacy
00:41:09.880 | as I argue early in the book,
00:41:11.760 | as well as have some historical facts
00:41:14.020 | to kind of grasp onto.
00:41:16.120 | We talk about eugenics,
00:41:17.520 | some good explication of some famous
00:41:20.580 | and really important books throughout history.
00:41:22.800 | So yeah, I don't know that I have too much more
00:41:27.800 | to discuss in terms of a conversation.
00:41:30.720 | - Where and how would you prefer people to buy the book?
00:41:34.200 | And then I want to go to your new project
00:41:36.600 | that's more practical,
00:41:38.440 | but where and how would you prefer people buy the book?
00:41:40.940 | - Yeah, so easy.
00:41:43.880 | If you want a physical copy, you have to go to Amazon.
00:41:45.980 | So just go to Amazon, search for Privacy and Utopia.
00:41:48.800 | by Gabriel Custodiet.
00:41:50.240 | It should pop up right away.
00:41:51.320 | Very attractive cover.
00:41:53.040 | I try to do everything with style.
00:41:54.740 | And so I think you'll find that the cover itself is,
00:41:57.840 | I can see three symbols in it,
00:41:59.640 | just looking at it very quickly.
00:42:00.800 | So it's a very pretty book.
00:42:02.600 | I've held physical copy.
00:42:03.640 | So Amazon's the way to go there.
00:42:05.240 | I don't have a Kindle version.
00:42:07.120 | Amazon has a problem with you
00:42:09.320 | selling it digitally elsewhere.
00:42:10.960 | And also I don't like to participate
00:42:12.980 | in the Kindle surveillance.
00:42:14.360 | So if you want a digital copy,
00:42:16.560 | you would go to my new website,
00:42:18.120 | escapethetechnocracy.com.
00:42:19.640 | We're gonna give you a code for Joshua Sheats in a moment,
00:42:22.900 | but you can buy it
00:42:23.920 | and you just go through the checkout process.
00:42:26.020 | Bam, you get a PDF instantly, no DRM.
00:42:28.820 | That is your PDF to own and do with as you'd like
00:42:31.240 | and share it as you'd like.
00:42:33.080 | So either Amazon or escapethetechnocracy.com
00:42:35.880 | and you can buy the PDF there.
00:42:37.200 | - Yeah, now let's talk more
00:42:39.040 | because not only is the book available
00:42:40.560 | at escapethetechnocracy.com,
00:42:42.120 | but you have moved all of your training courses there.
00:42:46.320 | We've closed down the previous trainings
00:42:48.280 | and courses that we've done together
00:42:49.920 | and you have reworked everything
00:42:51.840 | and created a whole new world.
00:42:53.680 | So tell us more about the new training that you offer
00:42:56.300 | at Escape the Technocracy.
00:42:57.600 | - Yeah, I'd love to.
00:42:59.320 | So I think a lot of the audience
00:43:00.920 | who we've talked to before about Bitcoin privacy,
00:43:03.040 | just digital privacy generally,
00:43:05.260 | hack proofing yourself.
00:43:06.960 | You mentioned like what's the connection
00:43:08.200 | of privacy to finance?
00:43:10.140 | Well, $10 trillion of cyber crime every year.
00:43:12.640 | How's that for a connection?
00:43:14.300 | So we did have these courses.
00:43:16.000 | We've since closed them down for various reasons,
00:43:19.200 | but I have resurrected them in escapethetechnocracy.com.
00:43:24.200 | Definitely use your code RPF for 15% off.
00:43:32.320 | And basically I've rebuilt them with a different partner this time,
00:43:34.640 | although you're an affiliate,
00:43:36.840 | so when people buy,
00:43:39.600 | they're contributing to Joshua as well as me.
00:43:39.600 | And you can have these excellent courses on digital privacy,
00:43:43.360 | how to use cryptocurrency,
00:43:44.960 | all of the important stuff for escaping
00:43:47.080 | this technocratic regime: preserving your privacy
00:43:50.240 | and picking up some cybersecurity along the way.
00:43:52.680 | So yeah, it's a whole new website,
00:43:54.080 | also a course called Escape the Technocracy.
00:43:57.560 | And we have a code for you to use
00:43:59.240 | for radical personal finance.
00:44:00.320 | You get 15% off of that.
00:44:02.360 | And so I'd encourage everybody,
00:44:03.480 | if you're interested in privacy,
00:44:04.560 | I really do think these are the best privacy tutorials,
00:44:07.520 | video tutorials online.
00:44:09.640 | So if that is your cup of tea,
00:44:11.700 | and even if you've already been in the Bitcoin course
00:44:14.680 | and the Hack Proof course,
00:44:15.680 | definitely go check this out.
00:44:16.680 | There's a lot of new stuff here.
00:44:18.240 | So you're gonna definitely want to go check it out.
00:44:20.400 | - Yeah, and what I would say is
00:44:23.400 | there's lots of material out there in the privacy space.
00:44:27.560 | If you're interested in the topic,
00:44:29.360 | go and read 58 books on it.
00:44:32.000 | If you just wanna know what works
00:44:33.560 | from somebody who knows the market
00:44:35.840 | and is able to articulate
00:44:38.160 | in a straightforward, direct fashion
00:44:40.800 | and say, here, do this,
00:44:43.320 | follow these instructions
00:44:46.720 | in a series of relatively short videos,
00:44:48.760 | comprehensive but straightforward,
00:44:51.800 | then go to Escape the Technocracy
00:44:51.800 | and use Gabriel for that,
00:44:52.760 | 'cause that's what he is really, really good at.
00:44:54.600 | And that format, which he updates continually,
00:44:58.960 | is the best way to convey this kind of work.
00:45:02.120 | So if you're a reader, get his books,
00:45:04.000 | but if you just want to cut straight to the meat,
00:45:06.200 | as I usually do and know what to do,
00:45:09.240 | then go through his course there.
00:45:11.120 | Use code RPF, I'll get a small commission for that
00:45:13.480 | and you'll save 15% on the sales price
00:45:16.280 | and we'll keep Gabriel doing what he does really well,
00:45:19.400 | which is writing and researching
00:45:21.920 | and producing things that help the rest of us.
00:45:24.760 | Anything else, Gabriel?
00:45:25.960 | - There's also a URL.
00:45:28.040 | You can go to escapethetechnocracy.com/RPF
00:45:32.120 | and that coupon will also be automatically applied.
00:45:34.520 | I appreciate it.
00:45:35.360 | - Excellent, I will link to that directly in the show notes.
00:45:39.160 | Enjoy your time in Western Finland.
00:45:41.160 | I know you are finishing up your autumn in Finland.
00:45:43.920 | I can't do my jokes.
00:45:46.760 | - It is.
00:45:48.040 | No, it's great this time of year, of course.
00:45:50.640 | Between Finland and Karachi, I definitely get around.
00:45:53.520 | - Exactly, exactly.
00:45:55.320 | All right, my friend, I look forward to seeing you soon.
00:45:56.800 | Thanks for coming on.
00:45:58.560 | - Thank you.