2020-09-17_Gabriel_Custodiet_Interview
Welcome to Park City, Utah, naturally winter's 00:00:32.240 |
a show dedicated to providing you with the knowledge, 00:00:46.240 |
I'm proud to welcome back to Radical Personal Finance, 00:00:50.920 |
Gabriel, welcome back to Radical Personal Finance. 00:00:56.280 |
I know that you don't invite too many people these days, 00:00:59.280 |
and I'm very pleased to have connected with you 00:01:01.240 |
and a lot of your audience at some of your events. 00:01:04.000 |
And so I know they're very intelligent people. 00:01:09.720 |
Yeah, we've been working together for a couple of years. 00:01:12.480 |
You, primarily we've done a number of courses 00:01:19.960 |
That's what I think you were most well-known for 00:01:28.800 |
which is completely different than your previous work, 00:01:48.420 |
I talk about privacy as a useful strategy in the toolbox 00:01:58.280 |
There are a number of different ways this can be expressed. 00:02:05.720 |
just for living a low-hassle, low-risk life in many ways. 00:02:12.340 |
and other financial risks and frauds that are very common. 00:02:20.720 |
So this episode is not going to be specifically 00:02:26.400 |
here's what you specifically should invest in, 00:02:30.780 |
as we talk about the history of privacy as a concept, 00:02:35.320 |
and then think about how to apply that in the modern world. 00:02:45.340 |
- Yeah, so the people who are aware of me from your show 00:02:51.120 |
or elsewhere, they know that I do tend to talk about 00:02:57.000 |
And frankly, I'm more interested in the ideas, right? 00:03:00.600 |
I kind of got into the Watchman Privacy Podcast 00:03:03.200 |
trying to teach people some of the tools and the skills, 00:03:19.480 |
and I know a lot of the people in your audience 00:03:32.400 |
And we all have this question, what caused it? 00:03:41.560 |
And I set out to do this, focusing on privacy. 00:03:44.540 |
And of course, that led me to bigger concepts 00:03:48.200 |
like the rise of centralization and things of this sort. 00:03:53.040 |
where I start this book, "Privacy and Utopia." 00:03:55.960 |
But yeah, it was really just an opportunity to explore, 00:03:59.200 |
okay, what happened to our freedom and privacy, 00:04:02.720 |
and try to tackle it in a very historical way, 00:04:09.060 |
so I'm not just kind of detailing this law of this year, 00:04:17.760 |
but also trying to explore the undergirding ideas 00:04:21.000 |
of the time that themselves led to these particular dates 00:04:30.120 |
of the things that I talk about on a daily basis. 00:04:32.640 |
- Yeah, the book is very much a philosophical history, 00:04:37.320 |
Why do you begin your historical narrative in the 1890s? 00:04:46.320 |
And I think the 1890s are a very important decade 00:04:51.000 |
because, in my view, we have the rise of centralization. 00:05:13.560 |
a centralized system has to know all of its components, 00:05:33.640 |
And in the 1890s, we have this serious beginning 00:05:41.120 |
And there are two different things at work here, Josh. 00:05:49.840 |
so I'll just say they're mutually reinforcing. 00:05:52.280 |
So in the 1890s, you have kind of the culmination 00:05:56.840 |
We have trains, we can now cross a huge landmass, 00:06:00.300 |
and we now have the ability to police nations, right? 00:06:05.420 |
where suddenly the United States 100 years before this, 00:06:16.240 |
Well, flash forward to 1890s, now we have trains, 00:06:26.560 |
We have telegraphs, we have all this technology 00:06:35.720 |
that's to use the term that a lot of libertarians 00:06:48.640 |
as the rise of central government over the last 100 years. 00:07:03.560 |
partly because of the technology that they see, 00:07:09.020 |
We have Marxism, Marx is writing a couple of decades 00:07:18.960 |
and how we need to essentially have a central governing body 00:07:25.920 |
There's this idea that the states or some powerful figure 00:07:31.960 |
and should be changing things, should be adjusting them. 00:07:37.080 |
or in the turn of the century, early 20th century. 00:07:43.020 |
where suddenly you're not just an individual, 00:07:57.700 |
One of the figures I talk about is H.G. Wells. 00:08:02.700 |
the 19th century was about people doing what they want, 00:08:16.840 |
And of course, anytime you have planning, centralization, 00:08:23.240 |
So there was just something going on in the 1890s 00:08:27.060 |
but also a result of the ideas of collectivism 00:08:38.360 |
and you talk a lot about his impact in the book. 00:08:50.920 |
- Yeah, we could talk for an hour just about H.G. Wells. 00:08:55.080 |
Let me just give the summary to pique people's interest. 00:09:01.720 |
for the guy who was prominent in early science fiction. 00:09:06.120 |
we have "The Invisible Man," "The Time Machine," 00:09:17.600 |
if you can believe it, become known as a huge proponent, 00:09:20.880 |
perhaps the biggest proponent of world government. 00:09:24.200 |
He literally wrote the book called "The New World Order." 00:09:33.540 |
after he wrote a handful of science fiction books 00:09:40.700 |
In fact, he would write a thesis later in life 00:09:48.220 |
He actually wrote that as a PhD thesis later in life. 00:09:51.600 |
And so this was a great enemy of human individualism, 00:09:55.320 |
freedom, a huge proponent of world government 00:10:00.240 |
He wrote a number of books that I talk about in my book, 00:10:07.800 |
where everything is controlled and top-down planning 00:10:10.620 |
and anybody who disagrees is sent to an island prison. 00:10:35.460 |
between 1900 and 1920, at any rate in the English language, 00:10:41.140 |
The minds of all of us, and therefore the physical world, 00:10:43.900 |
would be perceptibly different if Wells had never existed." 00:10:46.880 |
So this was not just a proponent of all these things, 00:10:56.140 |
Early 20th century, when the welfare state in Britain 00:10:58.900 |
comes about, this great emblem of centralization 00:11:05.180 |
And you know, Josh, if we look back at some of these stories 00:11:08.420 |
that we are familiar with, let's just do that for a moment. 00:11:20.080 |
It's destroyed because it is too disorganized. 00:11:24.260 |
The people have no plan, this is exactly what he says, 00:11:26.580 |
there was no common plan in London, right? 00:11:39.740 |
this great organized, perfectly technocratic society, 00:11:46.440 |
They're not blamed at all for their invasion. 00:11:48.040 |
In fact, they're encouraged for their invasion. 00:11:49.860 |
This is actually something that the narration encourages, 00:11:53.920 |
because this is a more advanced, more developed, 00:12:04.340 |
and who destroy humanity because humanity is too laissez-faire. 00:12:11.340 |
where individualism is basically concomitant, 00:12:20.180 |
this invisible man, you become psychotic, right? 00:12:36.380 |
where this doctor goes to experiment on animals 00:12:41.060 |
This transhumanism that is in this early work, 00:12:46.860 |
that people were so critical of it and so disturbed by it. 00:12:49.700 |
For him, this was just, these were his ethics. 00:12:54.140 |
this was a man who was one of the first human beings 00:12:59.340 |
He was part of this experimental college in London 00:13:15.540 |
He had a huge disdain for humanity as individuals. 00:13:19.180 |
He thought that they should be directed from the top down, 00:13:24.900 |
almost all of them dedicated to describing how humanity can 00:13:41.400 |
but by his more prescriptive books later on in his life. 00:13:56.220 |
But him being the origin of so many of these ideas 00:14:01.220 |
is something that I need to learn more about. 00:14:06.140 |
you go to another writer named Joseph Conrad. 00:14:08.920 |
Tell us about Conrad and what he contributed to this saga. 00:14:18.580 |
And Joseph Conrad is actually a friend of H.G. Wells 00:14:21.420 |
in the 1890s, when H.G. Wells started going on his tirade 00:14:24.780 |
for world government, they stopped being friends. 00:14:27.280 |
So that kind of tells you the difference between the two. 00:14:37.300 |
to become arguably the best English novelist, 00:14:40.300 |
despite writing in his third language, incredible story. 00:14:42.940 |
Wrote books such as "Heart of Darkness," "Nostromo," 00:14:46.220 |
the guy who wrote "The Great Gatsby" 00:14:49.860 |
actually said, "If I could write any book ever written, 00:14:54.940 |
So Joseph Conrad is this really amazing writer, 00:15:00.820 |
I kind of, I put H.G. Wells as the great centralist, 00:15:04.300 |
and I put Joseph Conrad as the decentralist, right? 00:15:07.320 |
In his fiction, he's writing in a narrative way 00:15:18.140 |
this perfect world on earth, that humans are flawed. 00:15:20.980 |
And his fiction is basically just recognizing 00:15:24.100 |
that humans are flawed and appreciating humanity. 00:15:27.900 |
He writes to Wells as their friendship is breaking up. 00:15:30.420 |
He says, "The difference between us, Wells, is fundamental. 00:15:42.020 |
between the utopian and the anti-utopian view. 00:15:48.420 |
and his ideas as emblematic of this 19th century, 00:15:51.880 |
more laissez-faire, more constrained vision of humanity. 00:15:55.860 |
And so he's there as a contrast to H.G. Wells. 00:16:00.580 |
is the one that I was particularly fascinated with 00:16:02.940 |
just because it has such a great bearing on modern finance. 00:16:09.020 |
you title it "Privacy, Utopia, and the Welfare State" 00:16:15.640 |
and one who pays attention to government finances 00:16:19.260 |
I'm intensely conscious of the welfare state. 00:16:21.560 |
So how did these ideas influence the creation 00:16:29.240 |
- So if you go back in your minds to this period of time 00:16:36.060 |
where suddenly we now have journalism in a serious way, 00:16:41.940 |
we now have census data that is right there in front of us, 00:17:05.820 |
why they made it so that at least not everybody is poor, 00:17:09.900 |
But that's not how people looked at it, right? 00:17:11.740 |
Of course, in the 1890s, around this period of time, 00:17:17.060 |
this photographic realism about how destitute people were. 00:17:21.780 |
"Well, we should do something about it," right? 00:17:25.940 |
And they actually had the ability to make it happen, right? 00:17:28.540 |
Because now we have all these central institutions, 00:17:31.040 |
government is playing a bigger role in people's lives. 00:17:36.380 |
And so everybody's just kind of crowded around 00:17:38.620 |
and, you know, it's kind of right in front of you. 00:17:41.300 |
You know, there's a good statistic in the UK, in Britain, 00:17:48.300 |
And fast forward 100 years, 1900, 90% of people are urban. 00:18:03.140 |
And it makes sense if that's right in front of you, 00:18:16.820 |
what if we give some money to these people, okay? 00:18:28.460 |
I think is when Britain passes its first welfare laws, 00:18:32.740 |
that is the camel's nose under the tent, of course. 00:18:37.900 |
who was the main politician who was behind this, 00:18:40.700 |
David Lloyd George, who was partners with Winston Churchill, 00:18:47.980 |
talking about how we need to wage war on poverty. 00:19:10.580 |
of letting government intervene in our lives, 00:19:18.180 |
you can easily see how fast forward from 1905 to 1910, 00:19:28.860 |
You have passports also crop up during this time. 00:19:31.260 |
So you are now a citizen of this nation, right? 00:19:33.860 |
Germany was not really a nation before this time. 00:19:37.020 |
Suddenly it's all kind of conglomerated into one. 00:19:40.420 |
These were previously kind of disparate little states 00:19:48.660 |
You have the yoke of citizenship, as I call it. 00:19:54.820 |
to fight for your government, to fight for the state. 00:19:57.340 |
So this is just the origin of big government as we know it. 00:20:16.260 |
It was a very quick, but logical, ethical succession 00:20:20.820 |
from welfare state to warfare state to where we are today. 00:20:31.500 |
And what I find so fascinating about that word 00:20:34.620 |
is that word expresses something that I think we all want. 00:20:42.480 |
So why do you see the utopian mindset as a problem? 00:21:00.820 |
1905, I think he publishes his book, "A Modern Utopia." 00:21:05.820 |
And it's because there's a difference, Joshua, 00:21:09.340 |
between an idea of, let's say, a Christian idea of utopia, 00:21:12.460 |
and then this atheistic, rationalistic, eugenic idea 00:21:20.020 |
Because these utopians who really gained footing 00:21:26.860 |
are people who said, hey, we have a powerful state. 00:21:32.660 |
We should apply this to the world and create what we, 00:21:41.560 |
And so utopia, according to this idea, is always coercive. 00:21:45.340 |
And you could argue that by that fact alone, it is evil. 00:21:52.860 |
from "The Tempest," Shakespeare's famous final play, 00:21:56.260 |
where there's this character talking about, 00:22:08.160 |
And then his friend cuts him off and he says, 00:22:09.620 |
yeah, but you're gonna be the king of this place, 00:22:12.740 |
And it's, as always, Shakespeare understands human nature 00:22:18.580 |
Behind every utopia, every atheistic utopia in this sense, 00:22:25.020 |
You look at H.G. Wells' 1905 "A Modern Utopia." 00:22:32.860 |
Well, we're sending you off to an island prison. 00:22:41.900 |
We're going to have a force that is going to take it from you 00:22:54.520 |
It sounds good until you realize that, wait a second, 00:22:57.280 |
there's going to have to be somebody in charge. 00:23:02.520 |
And they're going to have their own view of things. 00:23:04.680 |
And they themselves are going to be fallen humans 00:23:13.280 |
that power corrupts and absolute power corrupts absolutely. 00:23:17.060 |
So that's the problem with utopia as I see it 00:23:23.060 |
- So why then, was the dystopian not, 00:23:35.840 |
or were people seeing already some of the downsides 00:23:41.120 |
and then writing and extrapolating from there? 00:23:58.880 |
And you can look at H.G. Wells' book, "A Modern Utopia," 00:24:05.640 |
It actually literally influenced the welfare state. 00:24:15.720 |
And this book itself can be considered dystopian. 00:24:19.520 |
Now, H.G. Wells did not write it with that mindset. 00:24:29.280 |
in any utopian world, there is a coercive element. 00:24:32.760 |
There are people in control, people in charge at the top, 00:24:37.880 |
And that's what pretty quickly a lot of people recognize. 00:24:41.460 |
So 1909, just a few years later, E.M. Forster, 00:24:54.440 |
Now, "The Machine Stops" is a dystopian short story 00:25:08.900 |
They have telescreens where they get all their news 00:25:12.120 |
and information, and the machine, who is never seen, 00:25:26.960 |
this seemingly utopian world that's actually, 00:25:31.300 |
So he has a negative take on the idea of utopia. 00:25:35.080 |
So H.G. Wells's own work would literally inspire 00:25:39.760 |
He literally created the dystopian genre of people writing against his ideas. 00:25:49.240 |
George Orwell specifically mentions H.G. Wells' work 00:25:57.280 |
which is simply, hey, this idea of utopia is silly, 00:26:02.280 |
in the sense that it gives power to a group of people 00:26:13.640 |
- Bring it forward, then, to where we are today. 00:26:25.040 |
of some of these ideas, what do you see working out 00:26:28.920 |
in today's world, and how does that impact your life 00:26:33.360 |
- So part of writing a book about 100 years ago 00:26:40.680 |
is so that I don't have to answer this question right away. 00:26:54.840 |
I think, obviously, people listening have a sense of things. 00:27:00.120 |
So I think maybe the main takeaway of this book 00:27:05.120 |
is simply to recognize the origin of some of the ideas 00:27:10.360 |
When you see people saying, hey, we need to fix things, 00:27:13.280 |
we need a powerful person in charge, we need to, 00:27:17.200 |
you can even see to this day, we didn't talk about eugenics, 00:27:23.120 |
And in eugenics, you can see the two-sided aspect 00:27:28.120 |
of this progressive utopian view, which is, hey, 00:27:31.360 |
we have all this data, we can see that moronic people, 00:27:39.680 |
People with low IQs are actually holding back our economy. 00:27:43.760 |
Everybody you know from the era was a proponent of eugenics, 00:27:50.080 |
They were a proponent of eugenics, of course, HG Wells. 00:27:54.480 |
And you can see that these ideas remain with us. 00:27:58.040 |
This idea that we need to tamper with humanity, 00:28:05.640 |
Of course, Neuralink and things of this sort, 00:28:08.280 |
I remember listening to a talk by an Asian guy 00:28:16.160 |
so we should in the future modify people to be shorter. 00:28:23.280 |
So if we recognize the fundamental assumptions 00:28:46.400 |
we're not going to elevate them in positions of power. 00:28:48.880 |
We're not going to use that same human nature 00:28:59.760 |
which will necessarily lead to more limited governments, 00:29:13.320 |
that I hope people get from reading about this. 00:29:22.080 |
He says, "Everybody thinks about changing the world, 00:29:44.040 |
they're what I think is correct and broadly effective. 00:30:00.320 |
Am I dogmatically committed to a certain approach 00:30:07.960 |
which is quite chilling when you talk about the novels 00:30:25.680 |
And everyone says they're opposed to the machine 00:30:29.040 |
and then they're turning everything in their life 00:30:30.760 |
over to ChatGPT or whatever local AI platform. 00:30:41.240 |
I find it to be one of the most incredible daily tools 00:30:57.120 |
Constantly just making stuff up out of thin air, 00:31:04.320 |
It becomes something that you kind of lean on and say, 00:31:08.160 |
then I don't have to think it through myself. 00:31:15.200 |
And so I'm trying to find the place to put my feet down 00:31:23.960 |
I'm not just dogmatically opposed to progress 00:31:27.000 |
because I'm a contrarian. 00:31:30.120 |
But I'm also trying to take the informed approach 00:31:32.920 |
and learn and say, well, where are my weaknesses? 00:31:42.800 |
because each person, each of us has to make our own decisions. 00:31:46.720 |
But it's fascinating to think about my challenges of myself 00:31:51.320 |
and think about how can I create this balance. 00:32:01.160 |
I want to be the person who's building towards a utopia, 00:32:10.880 |
the fallen nature so that we build proper safeguards 00:32:21.520 |
that humans are somehow fundamentally different today 00:32:32.120 |
Maybe I'll just recommend a good book for you 00:32:35.800 |
or for other people, "The Abolition of Man" by C.S. Lewis. 00:32:40.440 |
He was a guy who, I think this is written in 1943. 00:32:50.960 |
He saw the idea of the welfare state and its consequences. 00:32:55.440 |
And he's got some interesting things to say in there, 00:33:07.920 |
I think we should be okay with putting our foot down 00:33:14.000 |
Because this is maybe one of the nastiest consequences 00:33:21.040 |
is that they suddenly have us really questioning things 00:33:29.600 |
but we need to define our terms very carefully. 00:33:54.000 |
is that we lose our ability to be self-sufficient, 00:33:58.840 |
our memories degrade, there's all kinds of consequences 00:34:04.040 |
So there's a lot of technology, smart watches, for example, 00:34:14.520 |
with all the alerts and all the rest of a smartwatch 00:34:18.360 |
So I would just encourage people to be more critical 00:34:23.080 |
Not everything, not all movement is progress. 00:34:25.680 |
And just to quote another great conservative thinker, 00:34:44.920 |
We've created, with all of our progressive culture, 00:34:48.600 |
we've created a culture that demands less of human beings 00:34:53.400 |
and that easy living seems to be toxic to human beings. 00:34:58.400 |
The rates of depression, the rates of suicide, 00:35:10.240 |
And even just the continuation of our species, 00:35:24.200 |
tend to be the cultures that are literally dying out 00:35:27.760 |
and literally going extinct over the course of time, 00:35:34.440 |
Meanwhile, the cultures that have not progressed 00:35:40.120 |
and have not just openly embraced every single technology 00:35:43.880 |
tend by those metrics to have higher rates of happiness, 00:35:51.560 |
and higher birth rates for the continuation of the species. 00:35:57.000 |
would be an unalloyed good of making life easy 00:36:03.720 |
that automatically makes for better outcomes. 00:36:06.880 |
But I also simultaneously find myself frustrated 00:36:17.760 |
And I think I do a good job of straddling this, Josh. 00:36:23.200 |
And so I miss out on that whole ecosystem of, 00:36:40.960 |
And I have to say that not having them in my life, 00:36:46.840 |
In fact, sometimes I think I'm more productive 00:36:49.760 |
not having the distractions, not having the worry. 00:37:03.000 |
that I need an app for this and an app for that. 00:37:06.840 |
You can benefit from the underlying technology 00:37:12.360 |
without going that bizarre extra mile of saying, 00:37:29.580 |
in terms of technology these days are actually useless. 00:37:47.360 |
that you can just kind of plug one proxy, I guess, 00:37:56.960 |
but basically have a digital minimalist lifestyle 00:38:05.440 |
but not go so far that you wind up getting sucked 00:38:08.360 |
into the utopian vortex that winds up destroying you. 00:38:13.520 |
kind of a metric that aligns with these other metrics 00:38:33.580 |
So it started with understanding these systems, 00:38:39.320 |
And once I understood them, I could say, okay, 00:38:41.560 |
do I really need to participate in all these things? 00:39:01.540 |
hiding from a lot of the surveying systems as a result. 00:39:18.720 |
So yeah, I think the mindset of understanding your threat, 00:39:29.920 |
a lot of these technologies is a very good mindset. 00:39:33.600 |
And I feel very at peace in where I am as a result. 00:39:38.780 |
Is there anything in the book that I haven't asked you about 00:39:42.940 |
for this kind of podcast format to talk about? 00:39:47.040 |
- Yeah, let me think about this for a second. 00:39:57.220 |
because you write with a flair that is often gone 00:40:04.700 |
because you have to, to promote your writing. 00:40:06.320 |
And I just want to say, keep up the good work, keep writing, 00:40:19.840 |
I'm not going to write a book more than 200 pages. 00:40:45.880 |
when you kind of cut out some of the intermediary parts. 00:40:49.520 |
So that is my intent to not just have good ideas, 00:41:20.580 |
and really important books throughout history. 00:41:22.800 |
So yeah, I don't know that I have too much more 00:41:31.880 |
where and how would you prefer people buy the book? 00:41:43.880 |
If you want a physical copy, you have to go to Amazon. 00:41:45.980 |
So just go to Amazon, search for Privacy and Utopia. 00:41:54.740 |
And so I think you'll find that the cover itself is, 00:42:19.640 |
We're gonna give you a code for Joshua Sheats in a moment, 00:42:23.920 |
and you just go through the checkout process. 00:42:28.820 |
That is your PDF to own and do with as you'd like 00:42:42.120 |
but you have moved all of your training courses there. 00:42:53.680 |
So tell us more about the new training that you offer 00:43:00.920 |
who we've talked to before about Bitcoin privacy, 00:43:10.140 |
Well, $10 trillion of cyber crime every year. 00:43:16.000 |
We've since closed them down for various reasons, 00:43:19.200 |
but I have resurrected them in escapethetechnocracy.com. 00:43:29.400 |
And basically I have with a different partner this time, 00:43:39.600 |
And you can have these excellent courses on digital privacy, 00:43:47.080 |
this technocratic regime of preserving your privacy 00:44:04.560 |
I really do think these are the best privacy tutorials, 00:44:11.700 |
and even if you've already been in the Bitcoin course 00:44:18.240 |
So you're gonna definitely want to go check it out. 00:44:20.400 |
- Yeah, and what I would say is 00:45:23.400 |
there's lots of material out there in the privacy space. 00:45:48.760 |
straightforward, then go to Escape the Technocracy 00:44:52.760 |
'cause that's what he is really, really good at. 00:44:58.960 |
that format is the best way to convey this kind of work. 00:45:04.000 |
but if you just want to cut straight to the meat, 00:45:11.120 |
Use code RPF, I'll get a small commission for that 00:45:16.280 |
and we'll keep Gabriel doing what he does really well, 00:45:21.920 |
and producing things that help the rest of us. 00:45:32.120 |
and that coupon will also be automatically applied. 00:45:35.360 |
- Excellent, I will link to that directly in the show notes. 00:45:41.160 |
I know you are finishing up your autumn in Finland. 00:45:50.640 |
Between Finland and Karachi, I definitely get around. 00:45:55.320 |
All right, my friend, I look forward to seeing you soon. 00:45:59.400 |