Tim Urban: Tribalism, Marxism, Liberalism, Social Justice, and Politics | Lex Fridman Podcast #360
Chapters
0:00 Introduction
2:16 Human history
18:15 Greatest people in history
26:04 Social media
32:46 Good times and bad times
44:16 Wisdom vs stupidity
46:24 Utopia
1:00:33 Conspiracy theories
1:13:44 Arguing on the Internet
1:33:44 Political division
1:43:38 Power games
1:51:37 Donald Trump and Republican Party
2:08:45 Social justice
2:31:28 Censorship gap
2:38:59 Free speech
2:43:01 Thinking and universities
2:51:24 Liv Boeree joins conversation
3:03:44 Hopes for the future
A radical political movement, of which there will always be a lot in the country, has managed 00:00:06.280 |
to do something that a radical movement is not supposed to be able to do in the US, which 00:00:10.880 |
is they've managed to hijack institutions all across the country, and hijack medical 00:00:16.400 |
journals and universities and the ACLU, all the activist organizations and non-profits. 00:00:26.280 |
And the way I view a liberal democracy is it is a bunch of these institutions that were 00:00:30.840 |
trial and error crafted over hundreds of years, and they all rely on trust, public trust, 00:00:38.560 |
and a certain kind of feeling of unity that actually is critical to a liberal democracy's functioning. 00:00:44.680 |
And what I see this thing is, is a parasite on that, whose goal is—and I'm not saying 00:00:50.920 |
each individual in this, I don't think they're bad people. 00:00:54.360 |
I think the ideology itself has the property that its goal is to tear apart 00:00:59.960 |
the pretty delicate workings of the liberal democracy and shred the critical lines of trust. 00:01:07.600 |
The following is a conversation with Tim Urban, his second time on the podcast. 00:01:11.880 |
He's the author and illustrator of the amazing blog called "Wait, But Why?" and is the author 00:01:17.720 |
of a new book coming out tomorrow called "What's Our Problem?" 00:01:23.920 |
We talk a lot about this book in this podcast, but you really do need to get it and experience it for yourself. 00:01:30.160 |
It is a fearless, insightful, hilarious, and I think important book in this divisive time. 00:01:37.280 |
The Kindle version, the audiobook, and the web version should all be available on that date. 00:01:42.800 |
I should also mention that my face might be a bit more beat up than usual. 00:01:48.840 |
I got hit in the chin pretty good since I've been getting back into training Jiu-Jitsu, 00:01:55.360 |
a sport I love very much, after recovering from an injury. 00:01:59.120 |
So if you see marks on my face during these intros or conversations, you know that my jiu-jitsu training is to blame. 00:02:09.640 |
To support this podcast, please check out our sponsors in the description. 00:02:16.960 |
You wrote an incredible book called What's Our Problem? 00:02:23.400 |
In the beginning, you present this view of human history as a thousand-page book where each page covers 250 years. 00:02:35.680 |
And it's a brilliant visualization because almost nothing happens for most of it. 00:02:40.280 |
So what blows your mind most about that visualization when you just sit back and think about it? 00:02:47.720 |
So 950 pages, 95% of the book, hunter-gatherers kind of doing their thing. 00:02:52.120 |
I'm sure there's obviously some major cognitive advancements along the way in language. 00:02:57.560 |
And I'm sure the bow and arrow comes around at some point. 00:03:00.320 |
So tiny things, but it's like, oh, now we have 400 pages until the next thing. 00:03:03.600 |
But then you get to page 950, and things start moving. 00:03:10.000 |
So basically, the bottom row is when anything interesting happens. 00:03:13.120 |
There's a bunch of agriculture for a while before we know anything about it. 00:03:24.160 |
So when we think of prehistoric, we're talking about pages 1 through 975 of the book. 00:03:35.920 |
If you were reading the book, AD would be like the epilogue, the last little 10 pages. 00:03:43.400 |
2,000 years—the Roman Empire, 2,000 years ago. 00:03:48.440 |
Human history has been going on for over 2,000 centuries. 00:03:53.760 |
That is-- it's just-- it's hard to wrap your head around. 00:03:57.320 |
And this is-- I mean, even that's just the end of a very long road. 00:04:01.000 |
The 100,000 years before that, it's not like that was that different. 00:04:06.840 |
So it's just-- there's been people like us that have emotions like us, that have physical bodies like ours. 00:04:21.220 |
And it's-- I think we have no idea what it was like to be them. 00:04:25.420 |
The thing that's craziest about the people of the far past is not just that they had 00:04:29.820 |
different lives, they had different fears, they had different dangers and different responsibilities, 00:04:33.660 |
and they lived in tribes and everything, but they didn't know anything. 00:04:37.460 |
We just take it for granted that we're born on top of this tower of knowledge. 00:04:41.060 |
And from the very beginning, we know that the Earth is a ball floating in space. 00:04:54.060 |
Those were all incredible epiphanies quite recently. 00:04:58.100 |
And the people a long time ago, they just had no idea what was going on. 00:05:01.500 |
And I'm kind of jealous, because I feel like it-- I mean, it might have been scary to not 00:05:04.660 |
know what's going on, but it also, I feel like, would be-- you'd have a sense of awe 00:05:09.300 |
And you don't know what's going to happen next. 00:05:10.660 |
And once you learn, you're kind of like, oh, that's a little grim. 00:05:14.100 |
But they probably had the same capacity for consciousness to experience the world, to 00:05:20.100 |
wander about the world, maybe to construct narratives about the world and myths and so on. 00:05:25.220 |
They just had less grounded, systematic facts to play with. 00:05:30.660 |
They still probably felt the narratives, the myths, they constructed as intensely as we do. 00:05:52.820 |
So how many people in history have experienced a hot shower? 00:06:03.660 |
So like George Washington never had a hot shower. 00:06:14.220 |
But that's like an unbelievable life experience to have a warm rain in a controlled little booth. 00:06:23.440 |
And then you get out and it's not everywhere. 00:06:27.380 |
That was like—a lot of people probably lived and died without ever experiencing hot water. 00:06:31.520 |
Maybe they had a way to heat water over a fire. 00:06:34.980 |
It's just like, there's so many things about our lives now that are complete, just total 00:06:42.580 |
It makes you wonder, like, what is the thing that they would notice the most? 00:06:44.940 |
I mean, the sewer system, like it doesn't smell in cities. 00:06:52.300 |
I mean, it gets rid of waste efficiently in such a way we don't have to confront it, both 00:07:01.020 |
Plus all the medical stuff associated with sewage. 00:07:04.700 |
- How about the cockroaches and the rats and the disease and the plagues? 00:07:10.340 |
And so they caught more diseases, but then when they caught a disease, there was nothing they could do about it. 00:07:15.960 |
So they often would die or they would just be in a huge amount of pain. 00:07:25.020 |
They just had no idea that tiny little animals were causing these diseases. 00:07:28.980 |
In the bubonic plague, in the Black Death, the 1300s, people thought that it was an act of God. 00:07:38.220 |
Because why would you not think that if you didn't know what it was? 00:07:42.760 |
And so the crazy thing is that these were the same primates. 00:07:46.580 |
I know in some sense what it's like to be them, because I am a human as well. 00:07:51.740 |
And to know that this particular primate, that I know what it's like to be, experienced such a different world. 00:07:59.140 |
Our life is not the life that this primate has experienced almost ever. 00:08:07.940 |
I have a sense that we would get acclimated very quickly. 00:08:11.740 |
Like if we threw ourselves back a few thousand years ago, it would be very uncomfortable 00:08:16.580 |
at first, but the whole hot shower thing, you'll get used to it. 00:08:22.860 |
There's a few—I'm trying to remember which book it was that talks about hiking the Appalachian 00:08:27.780 |
Trail—but you kind of miss those hot showers. 00:08:31.420 |
But I have a sense like after a few months, after a few years. 00:08:37.060 |
- Yeah, I was saying the other day to a friend that whatever you're used to, you start to 00:08:40.860 |
think that, oh, the people that have more than me or are more fortunate, like it must be great. 00:08:48.100 |
Because the experience, what would happen is you would get these new things or you would 00:08:53.420 |
get these new opportunities and then you would get used to it—the hedonic treadmill. 00:08:58.940 |
And likewise though, because you think, oh my God, what if I had to have this kind of 00:09:03.340 |
job that I never would want or I had this kind of marriage that I never would want? 00:09:08.900 |
If you did, you would adjust and you get used to it and you might not be that much less happy. 00:09:13.700 |
So on the other side of that, you being okay going back—we would survive if we had to go back. 00:09:19.100 |
We'd have to learn some skills, but we would buck up, and people have gone to war before 00:09:23.860 |
who were shopkeepers a year before that. 00:09:28.720 |
But on the other hand, if you brought them here, I always think it would be so fun to 00:09:32.420 |
just bring, forget the hunter gatherers, bring a 1700s person here and tour them around, 00:09:36.900 |
take them on an airplane and show them your phone and all the things it can do. 00:09:40.160 |
Show them the internet, show them the grocery store. 00:09:45.140 |
Likewise I think they would be completely awestruck and on their knees crying tears of joy. 00:09:50.900 |
And then they'd get used to it and they'd be complaining about like, "You don't have..." 00:09:56.880 |
- The grocery store is a tough one to get used to. 00:09:58.860 |
Like when I first came to this country, the abundance of bananas was the thing that struck me. 00:10:05.540 |
Or like fruits in general, but food in general. 00:10:11.620 |
That you could just eat them as much as you want. 00:10:16.940 |
Probably took several years to really get acclimated to that. 00:10:25.060 |
- The number of bananas, fresh bananas, that weren't available. 00:10:33.020 |
- Yeah, it's like we don't even know what to—like, we don't even know the proper levels of gratitude. 00:10:38.780 |
Like, you know, walking around the grocery store, I don't know to be like, "The bread's 00:10:41.780 |
nice, but the bananas—we're so lucky." 00:10:44.900 |
I'm like, "Oh, it could have been the other way." 00:10:46.900 |
- Well, it's interesting then where we point our gratitude in the West, in the United States. 00:10:55.260 |
Do we point it away from material possessions, or do we just aspire 00:11:04.660 |
to do that, towards other human beings that we love? 00:11:07.740 |
Because in the East, in the Soviet Union, growing up poor, having food is the gratitude. 00:11:21.820 |
And now, but see within that, the deep gratitude is for other human beings. 00:11:26.860 |
The penguins huddling together for warmth in the cold. 00:11:32.420 |
I mean, I'm sure, yes, of course, in the West, people on average feel gratitude towards different 00:11:36.460 |
things or maybe a different level of gratitude. 00:11:38.700 |
Maybe we feel less gratitude than countries that... 00:11:41.700 |
Obviously, I think the easiest, the person that's most likely to feel gratitude is going 00:11:46.380 |
to be someone whose life happens to be one where they just move up, up, up throughout 00:11:51.900 |
A lot of people in the greatest generation, people who were born in the '20s or whatever— 00:11:57.580 |
the story is the greatest generation grew up dirt poor and they often ended up middle class. 00:12:02.060 |
And the boomers, some of them started off middle class and many of them ended up quite wealthy. 00:12:06.740 |
And I feel like that life trajectory is naturally going to foster gratitude, right? 00:12:15.420 |
Because you're not going to take for granted these things because you didn't have them. 00:12:18.380 |
I didn't go out of the country really in my childhood very much. 00:12:23.340 |
We traveled, but it was to Virginia to see my grandparents or Wisconsin to see other 00:12:27.620 |
relatives or maybe Florida for going to the beach. 00:12:30.900 |
And then I started going out of the country like crazy in my '20s because I really became obsessed with travel. 00:12:36.980 |
And I feel like, if I had grown up always doing that, it would have been another thing I took for granted. 00:12:42.500 |
But I still, every time I go to a new country, I'm like, oh my God, this is so cool. 00:12:46.420 |
And in another country, this thing I've only seen on the map, I'm like, I'm there now. 00:12:50.340 |
And so I feel like a lot of times it's a product of what you didn't have and then you suddenly have. 00:12:57.060 |
But I still think it's case by case, in that there's like a meter in everyone's head that goes from one to 10. 00:13:04.220 |
I think at a 10, you're experiencing just immense gratitude, which is a euphoric feeling. 00:13:14.860 |
And it makes you happy to savor what you have, to look down at the mountain of stuff that 00:13:21.860 |
you have that you're standing on, to look down at it and say, oh my God, I'm so lucky. 00:13:27.220 |
And I'm so grateful for this and this and this. 00:13:31.580 |
Now when you move that meter down to six or seven, maybe you think that sometimes, but 00:13:36.300 |
you're not always thinking that because you're sometimes looking up at this cloud of things 00:13:42.220 |
that you don't have and the things that they have but you don't or the things you wished 00:13:45.580 |
you had or you thought you were going to have or whatever. 00:13:50.700 |
And either that's envy, that's yearning, or often, if you think about your past, it's resentment. 00:13:59.780 |
And so then you go down to a one and you have someone who feels like a complete victim. 00:14:03.860 |
They are just a victim of the society, of their siblings and their parents and their circumstances. 00:14:10.900 |
And they are wallowing in everything that's happened wrong to me, everything I should 00:14:16.380 |
have that I don't, everything that has gone wrong for me. 00:14:19.500 |
And so that's a very unhealthy, mentally unhealthy place to be. 00:14:24.620 |
There's an endless list of stuff you can be aggrieved about and an endless list of stuff you can be grateful for. 00:14:30.340 |
And so in some ways it's a choice and it's a habit. 00:14:33.260 |
And maybe it's part of how we were raised or our natural demeanor, but it's such a good habit to cultivate. 00:14:40.260 |
- Well, like you are constantly just saying, "Man, I'm lucky," or like, "I'm so grateful." 00:14:48.500 |
And it's a good thing to do because you're reminding yourself, but you're also reminding the people around you. 00:14:58.100 |
And so anyway, I think that scale can go from one to 10. 00:15:02.640 |
But I think trying to be above a five, and looking down at the things you have more often 00:15:08.620 |
than you are looking up at the things you don't, or being resentful about the things you don't have, is a good place to be. 00:15:14.060 |
- Well, the interesting thing, I think, is an open question, but I suspect that you can control where that meter sits. 00:15:24.700 |
You could choose where you are as a matter of habit, like you said. 00:15:28.260 |
But you can also probably control that on a scale of a family, of a tribe, of a nation, 00:15:36.300 |
You can describe a lot of the things that happened in Nazi Germany and different other 00:15:39.780 |
parts of history through a sort of societal envy and resentment that builds up. 00:15:46.020 |
Maybe certain narratives pick up and then they infiltrate your mind, and then now your 00:15:51.100 |
knob goes from gratitude for everything to resentment and envy and all the rest. 00:16:04.780 |
So yeah, and then when you're soaking in a culture, so there's kind of two factors, right? 00:16:10.980 |
It's what's going on in your own head and then what's surrounding you. 00:16:13.660 |
And what's surrounding you kind of has concentric circles. 00:16:16.020 |
There's your immediate group of people, because that group of people, if they're a certain 00:16:20.260 |
way, if they feel a lot of gratitude and they talk about it a lot, that kind of insulates you. 00:16:25.340 |
Because the people who are going to have the most impact on you are the ones closest to you. 00:16:29.620 |
But often, all the concentric circles are saying the same thing. 00:16:33.020 |
The people around you are feeling the same way that the broader community is feeling. 00:16:39.500 |
And this is why I think American patriotism, nationalism, can be tribal, can be not a good thing. 00:16:55.260 |
If you love your country, you should love your fellow countrymen. 00:17:00.060 |
Patriotism is, I think, a feeling of unity, but it also comes along with an implicit kind of gratitude. 00:17:10.140 |
Because it's like, we are so lucky to live in... 00:17:12.100 |
People think it's chauvinist to say, "We live in the best country in the world." 00:17:15.780 |
And yes, when Americans say that, no one likes it. 00:17:21.860 |
It's a way of saying, "I'm so grateful for all the great things this country gives to me." 00:17:27.860 |
And I think if you heard a Filipino person say, "You know what? 00:17:30.580 |
The Philippines is the best country in the world." 00:17:31.980 |
No one in America would say, "That's chauvinist." 00:17:35.100 |
Because when it's coming from someone who's not American, it sounds totally fine. 00:17:42.340 |
Now again, that can quickly translate into xenophobia, nationalism, and so you have to be careful. 00:17:50.100 |
Like you talk about—we'll talk about high-rung progressivism, high-rung conservatism. 00:17:55.660 |
Those are two different ways of embodying patriotism. 00:18:02.100 |
So you could talk about maybe loving the tradition that this country stands for, or you could 00:18:06.340 |
talk about loving the people that ultimately push progress. 00:18:11.060 |
And those are, from an intellectual perspective, good ways to represent patriotism. 00:18:15.620 |
We've got to zoom out because this graphic is epic. 00:18:18.700 |
A lot of images in your book are just epic on their own. 00:18:24.260 |
But this one has famous people for each of the cards. 00:18:31.620 |
And by the way, good for them to be the person that... 00:18:35.740 |
It's not that I couldn't have chosen lots of people for each card, but I think most people would agree with these. 00:18:44.700 |
You crushed it if you can be the person for your whole 250-year page. 00:18:48.100 |
Well, I noticed you put Gandhi, you didn't put Hitler. 00:18:50.740 |
I mean, there's a lot of people going to argue with you about that particular last page. 00:18:58.220 |
I actually, I was thinking about Darwin there too. 00:19:04.220 |
Did you think about putting yourself for a second? 00:19:08.580 |
That would have endeared the readers to me from the beginning of the first page of the book. 00:19:12.460 |
A little bit of a messianic complex going on. 00:19:14.940 |
But yeah, so the list of people just so we know. 00:19:23.020 |
And so it goes Gandhi, Shakespeare, Joan of Arc, Genghis Khan, Charlemagne, Muhammad, 00:19:29.380 |
Constantine, Jesus, Cleopatra, Aristotle, Buddha. 00:19:33.580 |
It's so interesting to think about this very recent human history. 00:19:37.860 |
It's 11 pages, so it would be 2750, almost 3000 years. 00:19:42.940 |
Just that there's these figures that stand out and that define the course of human history. 00:19:49.100 |
It's like the craziest thing to me is that Buddha was a dude. 00:19:53.460 |
He was a guy with arms and legs and fingernails that he maybe bit, and he liked certain foods, 00:20:01.100 |
and maybe he got like, you know, he had like digestive issues sometimes, and like he got sick. 00:20:09.220 |
And like he was a guy and he had hopes and dreams and he probably had a big ego for a 00:20:14.020 |
while before I guess Buddha totally overcame that one. 00:20:17.180 |
But it's like, who knows, you know—the mythical figure Buddha, who knows how much of it is true. 00:20:23.360 |
But the fact is—same with Jesus—this was a guy. 00:20:35.700 |
Yeah, like having a tantrum because he's frustrated, because he's in the terrible twos. 00:20:49.980 |
It just, I mean, listen, hearing about Genghis Khan, it's incredible to me because it's just 00:20:55.060 |
like this was some Mongolian, you know, herder guy who was taken as a slave and he was like 00:21:03.980 |
dirt poor, you know, catching rats as a young teen to feed himself and his mom and I think his siblings. 00:21:12.620 |
And it's just like the odds on when he was born, he was just one of, you know, probably 00:21:21.980 |
tens of thousands of random teen boys living in Mongolia in the 1200s. 00:21:27.620 |
The odds of that person, any one of them being a household name today that we're talking 00:21:32.020 |
about, it's just crazy like what had to happen. 00:21:35.820 |
And for that guy, for that poor, dirt poor herder to take over the world, I don't know. 00:21:42.420 |
So history just like continually blows my mind. 00:21:45.820 |
And he's the reason you and I are related probably. 00:21:49.260 |
I mean, that's the other thing—some of these dudes, by becoming king, 00:21:55.140 |
by having a better army at the right time—you know, William the Conqueror or whoever 00:21:58.860 |
is in the right place at the right time with the right army, you know, and there's 00:22:02.260 |
a weakness at the right moment and he comes over and he exploits it, and ends up probably 00:22:06.740 |
having, you know, I don't know, a thousand children, and those children are high-up people who 00:22:11.860 |
might have a ton of descendants—the species is different now because of him. 00:22:16.540 |
Forget that England's different or, you know, that European borders look different. 00:22:21.340 |
Like, we look different because of a small handful of people, you know? 00:22:28.300 |
Sometimes I think, I'm like, oh, you know, in this part of the world I can recognize 00:22:31.740 |
someone's Greek, you know, someone's Persian, someone's wherever, because of how they look. 00:22:38.140 |
I mean, obviously that's a population, but it may be that like someone 600 years 00:22:42.540 |
ago that looked like that really spread their seed, and that's why the ethnicity looks kind of like that. 00:22:53.500 |
Do you think individuals like that can turn the direction of history, or is that an illusion? 00:23:02.100 |
I mean, so I mentioned William the Conqueror, right? 00:23:05.780 |
It's not that Hitler was born and destined to be great at all. 00:23:10.020 |
I mean, in a lot of cases he's a frustrated artist with a temper who's turning over the 00:23:13.820 |
table in his studio and hitting his wife and being kind of a dick, and a total nobody. 00:23:20.660 |
I think in almost all the timelines where you could have put baby Hitler on earth, nothing like that happens. 00:23:26.500 |
You know, and maybe sometimes he becomes, you know, some kind 00:23:29.620 |
of—you know, he uses the speaking ability, because that ability was going to be there 00:23:32.660 |
either way, but maybe he uses it for something else. 00:23:35.620 |
But that said, I also think—it's not that World War II was going to happen either way. 00:23:44.980 |
It's that like these circumstances were one way and this person came along at the right 00:23:49.020 |
time and those two made a match made in this case hell. 00:23:52.060 |
But it makes you wonder, yes, it's a match in hell, but are there other people that could 00:23:57.940 |
have taken his place, or do these people that stand out have the rare spark of that 00:24:04.100 |
genius—whether it takes us towards evil or towards good—whether those figures singularly define the trajectory. 00:24:13.580 |
You know, what defines the trajectory of humanity in the 21st century, for example, might be 00:24:17.860 |
the influence of AI, might be the influence of nuclear war, negative or positive, not 00:24:23.740 |
in the case of nuclear war, but the bioengineering, nanotech, virology, what else is there? 00:24:36.500 |
Maybe the structure of governments and so on. 00:24:41.660 |
There could be singular figures that stand up and lead the way for humanity. 00:24:46.860 |
- But I wonder if the society is the thing that manifests that person, or if that person shapes the society. 00:24:55.800 |
- I think it's probably a spectrum where there are some cases when a circumstance was such 00:25:00.800 |
that something like what happened was gonna happen. 00:25:04.460 |
If you pluck that person from the earth, I don't know whether the Mongols is a good example 00:25:08.540 |
or not, but maybe it could be that if you plucked Genghis Khan as a baby, there was still going to be something like that, 00:25:13.860 |
because of the specific way Chinese civilization was at that time and the specific climate 00:25:21.860 |
that was causing a certain kind of pressure on the Mongols and the way they still had 00:25:25.340 |
their great archers and they had their horses and they had a lot of the same advantages. 00:25:31.340 |
It was gonna happen either way and it might not have happened to the extent or whatever. 00:25:36.820 |
Or you could go the full other direction and say, actually, this was probably not gonna happen. 00:25:44.100 |
I think World War II really was the work of Hitler—of course, it relied on all these other circumstances. 00:25:53.420 |
But I think if you take Hitler out, I'm pretty sure World War II doesn't happen. 00:25:59.580 |
- Well then, it seems easier to answer these questions when you look at history, even recent history. 00:26:05.220 |
Let's look at, I'm sure we'll talk about social media. 00:26:17.660 |
- There's a meme going around where MySpace is the perfect social media 'cause no one is on it anymore. 00:26:26.060 |
At the time, we were like, oh man, Tom only made a few million dollars. 00:26:32.980 |
Tom might be living a nice life right now where he doesn't have this nightmare that 00:26:38.140 |
- Yeah, and he's always smiling in his profile picture. 00:26:44.500 |
That's kind of intermingled into that whole thing, into the development of the internet. 00:26:51.900 |
I mean, there's people playing with the evolution of social media. 00:26:54.900 |
And to me, that seems to be connected to the development of AI. 00:27:01.540 |
And it seems like those singular figures will define the direction of AI development and 00:27:06.260 |
social media development with social media seeming to have such a huge impact on our 00:27:13.700 |
- It does feel in one way like individuals have an especially big impact right now in 00:27:19.140 |
that a small number of people are pulling some big levers. 00:27:24.300 |
And there can be a little meeting of three people at Facebook and they come out of that 00:27:28.860 |
meeting and make a decision that totally changes the world, right? 00:27:32.380 |
On the other hand, you see a lot of conformity. 00:27:37.020 |
You see a lot of, they all pulled the plug on Trump the same day, right? 00:27:41.660 |
So that suggests that there's some bigger force that is also kind of driving them, in 00:27:51.460 |
I mean, to me, leadership is the ability to move things in a direction that the cultural 00:28:00.620 |
A lot of times people seem like a leader because they're just kind of hopping on the cultural 00:28:05.380 |
wave and they happen to be the person who gets to the top of it. 00:28:07.900 |
Now it seems like they're, but actually the wave was already going. 00:28:11.580 |
Like real leadership is when someone actually changes the wave, changes the shape of the 00:28:19.940 |
Like I think Elon with SpaceX and with Tesla, like genuinely shaped a wave. 00:28:27.740 |
Maybe you could say that EVs were going to happen anyway, but there's 00:28:31.740 |
not much evidence they would have happened when they did. 00:28:35.700 |
You know, if we end up on Mars, you can say that Elon was a genuine leader there. 00:28:40.420 |
And so there are examples now—like Zuckerberg definitely has done a lot of leadership along the way. 00:28:46.220 |
He's also potentially kind of like caught in a storm that is happening, and he's one of the people riding it. 00:28:55.060 |
- And it's possible that he is a big shaper if the metaverse becomes a reality. 00:29:00.160 |
If in 30 years we're all living in a virtual world. 00:29:03.240 |
To many people it seems ridiculous now—like that was a poor investment. 00:29:06.780 |
- He talked about getting, you know, I think it was something like a billion people with 00:29:11.140 |
a VR headset in their pocket by, you know, I think it was 10 years from then, back in 2015. 00:29:19.180 |
But when he was talking about that, and honestly, this is something I've been wrong about. 00:29:23.980 |
Because I went to like one of the Facebook conferences and tried out all the new Oculus stuff. 00:29:31.340 |
And I was like, you know, pretty early talking to some of the major players there. 00:29:34.060 |
Because I was going to write a big post about it that then got swallowed by this book. 00:29:39.900 |
Because what I would have said was that this thing—when I tried it, I was like, this is incredible. 00:29:47.420 |
Some of them make you nauseous and they're just not that, you know—the headsets weren't great yet. 00:29:51.940 |
But I was like, the times when this is good, I have this feeling I haven't had in a long time— 00:29:56.060 |
it reminds me of the feeling I had when I was five and I went to a friend's house that had a Nintendo. 00:30:00.900 |
And he gave me the controller and I was looking at the screen and I pressed a button and Mario 00:30:05.900 |
And I said, I can make something on the screen move. 00:30:10.700 |
And the same feeling I had the first time someone showed me how to send an email. 00:30:15.220 |
And I was like, it goes—I can press enter on my computer and something happens on your computer. 00:30:19.260 |
Those were—obviously, you know, when you have that feeling, it often means you're witnessing the beginning of something huge. 00:30:27.060 |
And I still kind of think it is, but it's kind of weird that it hasn't taken off, you know? 00:30:35.660 |
My first and still instinct is this feels like it changes everything. 00:30:39.460 |
VR feels like it changes everything, but it's not changing anything. 00:30:43.220 |
Like a dumb part of my brain is genuinely convinced that this is real. 00:30:48.060 |
But that's why the dumb part was like, we're not walking off that cliff. 00:30:53.660 |
The dumb part of my brain is like, I'm not walking off the cliff. 00:30:57.180 |
- I feel like it's waiting for like that revolutionary person who comes in and says, I'm going to make this work. 00:31:06.020 |
- Honestly, a little bit of a Carmack-type guy, which is why it was really interesting that he was at Oculus. 00:31:11.060 |
It's basically, how do we create a simple dumb thing that's a hundred bucks, but actually works really well? 00:31:17.380 |
And then there's going to be some viral killer app on it. 00:31:20.300 |
And that's going to be the gateway into a thing that's going to change everything. 00:31:23.780 |
I mean, I don't know what exactly was the thing that changed everything with the personal computer. 00:31:37.140 |
- Wasn't the '84 Macintosh like a moment when it was like, this is actually something that regular people can use? 00:31:48.140 |
- And I just think it had some like Steve Jobs user-friendliness already to it that made it accessible. 00:31:55.980 |
I remember, like, because I'm old enough to remember MS-DOS from when I was a kid. 00:32:01.460 |
And then suddenly this concept of like a window you drag something into, or you double-click 00:32:05.220 |
an icon, which now seems like so obvious to us, was like revolutionary because it made it intuitive. 00:32:16.820 |
I forget what the big leaps were, because that was Windows 2000, and it sucked. 00:32:29.540 |
Well, us, the people, still use Windows and Android, the device and the operating system 00:32:35.780 |
of the people, not you elitist folk with your books and your, what else? 00:32:46.820 |
You write, "More technology means better good times, but it also means badder bad times. 00:32:52.500 |
And the scary thing is, if the good and bad keep exponentially growing, it doesn't matter 00:32:58.900 |
If the bad gets to a certain level of bad, it's all over for us." 00:33:04.380 |
Why is there, why does the bad have that property? 00:33:09.860 |
That if it's all exponentially getting more powerful, then the bad is gonna win in the end? 00:33:17.020 |
One of the things I noticed was a trend, which was like, over the centuries, the good is getting better and the bad is getting worse. 00:33:24.380 |
Like the 20th century was the best century yet in terms of prosperity, in terms of GDP 00:33:30.140 |
per capita, in terms of life expectancy, in terms of poverty and disease—every metric improved. 00:33:37.780 |
It also had the biggest wars in history, the biggest genocide in history, the biggest existential risks. 00:33:45.580 |
The Depression was probably as big an economic crisis as any. 00:33:48.500 |
So it's this interesting thing where the stakes are getting higher in both directions. 00:33:53.380 |
And so the question is like, if you get enough good, does that protect you against the bad? 00:33:59.780 |
The dream, and I do think this is possible too, is the good gets so good. 00:34:03.700 |
You know, have you ever read the Culture series, the Iain Banks books? 00:34:06.100 |
- Not yet, but I get criticized on a daily basis by some of the mutual folks we know for not having read it. 00:34:16.340 |
And I read six of the 10 books and they're great. 00:34:19.660 |
But the thing I love about them is like, it just paints one of these futuristic societies 00:34:25.020 |
where the good has gotten so good that the bad is no longer even an issue. 00:34:31.100 |
Like basically, and the way that this works is the AI, you know, the AIs are benevolent 00:34:39.420 |
And so like, there's one random anecdote where they're like, you know, what happens if you 00:34:43.180 |
murder someone—'cause, you know, there's still people with rage and jealousy and all that. 00:34:48.660 |
So someone murders someone, first of all, that person's backed up. 00:34:51.540 |
So it's like they have to get a new body and it's annoying, but it's like, it's not death. 00:34:56.180 |
And secondly, that person, what are they gonna do? 00:34:59.500 |
They're just gonna send a slap drone around, which is this little like tiny, you know, 00:35:01.740 |
random drone that just will float around next to them forever. 00:35:05.460 |
Like, it's kind of fun to have a slap drone, but it's just making sure that they never 00:35:08.820 |
And it's like, I was like, oh man, it could just be, everyone could be so safe and everything 00:35:13.140 |
could be so like, you know, you want a house, you know, the AIs will build you a house. 00:35:16.660 |
There's endless space, there's endless resources. 00:35:18.840 |
So I do think that that could be part of our future. 00:35:21.060 |
That's part of what excites me—like, today would seem like a utopia to Thomas Jefferson. 00:35:26.300 |
Thomas Jefferson's world would seem like a utopia to a caveman. 00:35:30.260 |
There is a future, and by the way, these are happening faster, these jumps, right? 00:35:34.600 |
So the thing that would seem like a utopia to us, we could experience in our own lifetimes, 00:35:39.780 |
It's, especially if, you know, life extension combines with exponential progress. 00:35:45.900 |
And I think that part of what makes it a utopia is you don't have to be as scared of 00:35:50.220 |
the worst bad guy in the world trying to do the worst damage because we have protection. 00:35:54.860 |
But that said, I'm not sure how that happens. 00:35:57.860 |
Like, it's easier said than done. 00:36:01.740 |
Nick Bostrom uses the example of if nuclear weapons could be manufactured by microwaving 00:36:06.740 |
sand, for example, we probably would be in the stone age right now because 0.001% of 00:36:14.180 |
people would love to destroy all of humanity, right? 00:36:16.700 |
Some 16 year old with huge mental health problems who right now goes and shoots up a school 00:36:21.380 |
would say, oh, even better, I'm going to blow up a city. 00:36:27.020 |
And so that's like—as our technology grows, it's going to be easier for the worst bad guys to do real damage. 00:36:41.700 |
So it takes a tiny, tiny number of these people with enough power to do bad. 00:36:46.460 |
So that to me—I'm like, the stakes are going up, because what we have to lose is this incredible civilization. 00:36:56.540 |
They probably earlier thought that was never possible. 00:37:01.860 |
And so to me, that trend is the exponential tech is a double edged sword. 00:37:08.700 |
I'm happy to be alive now overall because I'm an optimist and I find it exciting. 00:37:14.980 |
And the dumbest thing we can do is not be scared. 00:37:18.060 |
Dumbest thing we can do is get cocky and think, well, my life is always—the last couple of centuries have only gotten better, so they always will. 00:37:26.420 |
What percentage of trajectories take us towards the, as you put it, unimaginably good future? 00:37:38.060 |
I mean, all I can-- one of the things we can do is look at history. 00:37:45.580 |
I'm actually listening to a great podcast right now called The Fall of Civilizations. 00:37:50.660 |
And it's literally every episode is like a little two hour deep dive into some civilization. 00:37:55.900 |
Some are really famous, like the Roman Empire. 00:37:57.700 |
Some are more obscure, like the Norse in Greenland. 00:38:05.380 |
But what's-- I mean, there's a lot of civilizations that had their peak. 00:38:11.140 |
There's always the peak, right, when they're thriving. 00:38:23.980 |
But the peak is the great-- you know, if I could go back in time. 00:38:25.860 |
It's not that the farther you go back, the worse it gets. 00:38:29.740 |
You want to go back to a civilization during-- I would go to the Roman Empire in the year 00:38:34.620 |
You don't want to go to the Roman Empire in the year 400. 00:38:36.540 |
We might be in the peak right now here, whatever this empire is. 00:38:39.700 |
Honestly, I think about the '80s, the '70s, the '80s. 00:38:49.900 |
It's just like-- when I listen to these things, I'm thinking, you know, the '80s and '90s. 00:38:57.620 |
Like, Clinton was a superstar around the world. 00:39:07.100 |
It was a little probably like the '50s, you know, coming out of the World War and the Depression. 00:39:11.820 |
It was like this kind of like everyone was in a good mood kind of time, you know? 00:39:15.420 |
It's like, finish a big project and it's Saturday. 00:39:17.540 |
It was like-- I feel like the '50s was kind of like everyone was having a good time. 00:39:21.020 |
You know, the '20s, I feel like everyone was in a good mood randomly. 00:39:28.360 |
But the '90s, I think we'll look back on it as a time when everyone was in a good mood. 00:39:32.040 |
And it was like, you know, again, of course, at the time, it doesn't feel that way necessarily. 00:39:35.380 |
But I look at that, I'm like, maybe that was kind of America's peak. 00:39:39.200 |
And like, maybe not, but like America hasn't been as popular since, really, worldwide. 00:39:44.180 |
It's gone in and out depending on the country, but like it hasn't reached that level of, like, universal appeal. 00:39:50.900 |
And the political situation has gotten really ugly. 00:39:55.460 |
And maybe it's social media, maybe who knows. 00:39:57.980 |
But I wonder if it'll ever be as simple and positive as it was then. 00:40:03.900 |
Like, maybe we are in the-- it feels a little like maybe we're in the beginning of the downfall-- 00:40:10.780 |
Because these things don't just-- it's not a perfect smooth hill. 00:40:15.940 |
- It's unclear whether public opinion, which is kind of what you're talking about, is correlated with actual decline. 00:40:23.700 |
You could say that even though America has been on a decline in terms of public opinion, 00:40:27.980 |
in the exporting of technology America has still, with all the talk of China, 00:40:33.460 |
been leading the way in terms of AI, in terms of social media, in terms of just technological innovation. 00:40:45.380 |
You could argue that Google and Microsoft and Facebook are no longer American companies. 00:40:51.580 |
But they really are still headquartered in Silicon Valley, broadly speaking. 00:40:57.260 |
So, and Tesla, of course, and just all of the technological innovation still seems to be coming from here. 00:41:05.780 |
Although culturally and politically, this is not-- 00:41:13.420 |
- Well, maybe that could shift at any moment when all the technological development can 00:41:18.180 |
actually create some positive impact in the world. 00:41:22.180 |
That could shift it with the right leadership and so on, with the right messaging. 00:41:26.740 |
- Yeah, I think, I don't feel confident at all about whether—no, no, I don't mean that. 00:41:32.980 |
I don't mean, I don't feel confident in my opinion that we may be on the downswing or not. 00:41:39.180 |
It's like, I think the people—these are really big macro stories that are really hard to see. 00:41:45.900 |
It's like being on a beach and running around a few miles this way and trying to suss out the shape of the coastline. 00:41:53.020 |
It's just really hard to see the big picture. 00:41:56.620 |
You get caught up in the micro stories, the little tiny ups and downs that are part of 00:42:03.300 |
And also giant paradigm shifts happen quickly nowadays. 00:42:05.780 |
The internet came out of nowhere and suddenly was like, changed everything. 00:42:09.860 |
So there could be a changed everything thing on the way. 00:42:11.660 |
It seems like there's a few candidates for it. 00:42:14.300 |
But I mean, it feels like the stakes are just high, higher than it even was for the Romans, 00:42:19.060 |
higher than it was for them, because we're more powerful as a species. 00:42:24.460 |
We have God-like powers with technology that other civilizations at their peak didn't have. 00:42:30.500 |
I wonder if those high stakes and powers will feel laughable to people that live, humans, 00:42:37.140 |
aliens, cyborgs, whatever lives 100 years from now. 00:42:40.700 |
That maybe are a little like, this feeling of political and technological turmoil is quaint. 00:42:48.780 |
So right now—you know, the 1890s was like a super politically contentious decade in the US. 00:42:55.500 |
There was like immense tribalism, and the newspapers were all like lying and telling, 00:43:00.660 |
you know—there was a lot of like what we would associate with today's media, the worst of it. 00:43:05.060 |
And it was over gold or silver backing the currency—I don't know, it was something like that. 00:43:09.220 |
But the point is, it was a little bit of a blip, right? 00:43:12.220 |
It happened, it felt, it must've felt like the end of days at the time. 00:43:14.960 |
And then now we look, and most people don't even know about that. 00:43:17.980 |
Versus, you know, again, the Roman Empire actually collapsed. 00:43:21.420 |
And so the question is just like, yeah, you know, in 50 years, will this be like, 00:43:27.820 |
Oh, they had like, oh, that was like a crazy few years in America and then it was fine. 00:43:32.680 |
Or is this the beginning of something really big? 00:43:37.220 |
- Well, I wonder if we can predict what the big thing is at the beginning. 00:43:41.240 |
It feels like we're not, we're just here along for the ride. 00:43:44.260 |
And at the local level, and at every level, we're trying to do our best. 00:43:50.780 |
That's the one thing I know for sure, is that we need to have our wits about us and do our best. 00:43:57.380 |
You know, we have to be as wise as possible, right? 00:44:01.900 |
And wisdom is an emergent property of discourse. 00:44:06.340 |
- So you're a proponent of wisdom versus stupidity? 00:44:08.660 |
'Cause you can make an—I can steelman the case for stupidity. 00:44:15.540 |
But there's some—I think wisdom, and you talk about this, can come with a false confidence, an arrogance. 00:44:26.260 |
If you're being arrogant, you're being unwise. 00:44:28.820 |
- Yeah, I think wisdom is doing what people 100 years from now with the hindsight that 00:44:33.380 |
we don't have would do if they could come back in time and they knew everything. 00:44:36.300 |
It's like, how do we figure out how to have hindsight when we actually don't? 00:44:39.820 |
- What if stupidity is the thing that people from 100 years from now will see as wise? 00:44:46.500 |
- The Idiot by Dostoevsky—being naive and trusting everybody, maybe that's the one. 00:44:52.740 |
Then maybe you get to a good future by stumbling upon it. 00:44:59.260 |
I think a lot of—America, the great things about it are a product of the wisdom of previous generations. 00:45:07.500 |
The Constitution was a pretty wise system to set up. 00:45:14.660 |
- Well, there is—I mean, with Dostoevsky's The Idiot, Prince Myshkin, and Brothers Karamazov, 00:45:21.460 |
there's Alyosha Karamazov—you err on the side of love and almost like a naive trust in people. 00:45:32.860 |
And that turns out to be, at least in my perspective, in the long term, the right approach for the success of the species. 00:45:46.100 |
So I think we should have—a compass is nice, but you know what else is nice? A flashlight. 00:45:51.660 |
You can't see that far, but you can see—oh, you can see four feet ahead instead of one foot. 00:45:57.380 |
That is open, vigorous discussion in a culture that fosters that is how the species, how 00:46:04.820 |
the American citizens as a unit can be as wise as possible, can maybe see four feet 00:46:13.220 |
- That said, Charles Bukowski said that love is a fog that fades with the first light of reality. 00:46:18.780 |
So I don't know how that works out, but I feel like there's an intermixing of metaphors here. 00:46:23.460 |
Okay, you also write that quote, "As the authors of the story of us," which is this 1,000-page 00:46:28.780 |
book, "we have no mentors, no editors, no one to make sure it all turns out okay. 00:46:37.520 |
This scares me, but it's also what gives me hope. 00:46:40.540 |
If we can all get just a little wiser together, it may be enough to nudge the story onto a 00:46:45.300 |
trajectory that points towards an unimaginably good future." 00:46:51.260 |
Do you think we can possibly define what a good future looks like? 00:46:55.900 |
I mean, this is the problem that we ran into with communism, of thinking of utopia, of 00:47:06.460 |
having a deep confidence about what a utopian world looks like. 00:47:13.020 |
That was a deep confidence about the instrumental way to get there. 00:47:17.020 |
It was that, you know, I think a lot of us can agree that if everyone had everything 00:47:21.500 |
they needed and we didn't have disease or poverty and people could live as long as they 00:47:25.180 |
wanted to and choose when to die and there was no existential, major existential threat 00:47:31.260 |
because we control, I think almost everyone can agree that would be great. 00:47:34.940 |
Communism was—they said, "This is the way to get there." 00:47:40.500 |
And that is, that's a different question, you know? 00:47:44.340 |
So the unimaginably good future I'm picturing, I think a lot of people would picture too. And sure, 00:47:51.060 |
there's a lot of people out there who would say humans are the scourge on the earth and 00:47:53.580 |
we should de-growth or something, but I think a lot of people would agree that, you know, 00:47:58.060 |
just again, take Thomas Jefferson, bring him here. 00:47:59.980 |
He would see it as a utopia for obvious reasons—the medicine, the food, the transportation, 00:48:07.540 |
just how, the quality of life and the safety and all of that. 00:48:14.500 |
Now we're Thomas Jefferson, you know—what's the equivalent for us? 00:48:19.100 |
And the big question is—I actually don't try to say, "Here's the way to get there. 00:48:23.980 |
Here's the actual specific way to get there." 00:48:26.140 |
I try to say, "How do we have a flashlight so that we can together figure it out? 00:48:30.420 |
Like how do we give ourselves the best chance of figuring out the way to get there?" 00:48:33.940 |
And I think part of the problem with communists and people, ideologues, is that they're way 00:48:40.340 |
too overconfident that they know the way to get there. 00:48:44.140 |
And it becomes a religion to them, this solution. 00:48:47.440 |
And then you know, you can't update once you have a solution as a religion. 00:48:51.440 |
- I felt a little violated when you said communists and stared deeply into myself. 00:48:57.880 |
In this book, you've developed a framework for how to fix everything. 00:49:04.580 |
- Okay, it's not a framework for how to fix everything. 00:49:08.060 |
- I'll explain it to Tim Urban at some point. 00:49:12.900 |
- It's a framework of how to think about collaboration between humans such that we could fix things. 00:49:21.080 |
- Yeah, it's like a, it's a ruler that we can, once we look at it together and see what 00:49:28.040 |
it is, we can all say, oh, we want to go to that side of the ruler. 00:49:38.280 |
This orange guy, this primitive mind—this is our software. 00:49:42.400 |
That is the software that was in a 50,000 BC person's head, that was specifically optimized to survive. 00:49:52.480 |
And not even—not just survive, but help them pass their genes on in that world. 00:49:58.720 |
And civilization happened quickly and brains change slowly. 00:50:04.040 |
And so that unchanged dude is still running the show in our head. 00:50:18.720 |
And it's because, for the primitive mind, in the world that it was programmed for, there was no such thing as a Skittle. 00:50:27.640 |
And if there was a dense, chewy, sweet fruit like that, it meant you just found like a jackpot. 00:50:33.160 |
Energy, energy, take it, take it, eat as much as you can. 00:50:40.320 |
And now we're good on energy for a while. 00:50:44.240 |
So today Mars Inc is clever and says, let's not sell things to people's higher minds—let's sell to their primitive minds. 00:50:55.360 |
And let's trick them into thinking this is this new thing you should eat a ton of. 00:51:04.800 |
So those are the two things that make up this bigger mind, that is the modern human mind. 00:51:13.480 |
There's people who will yell at me for saying there's two minds, and you know that, but to 00:51:16.920 |
me it's still a useful framework, where you have this primitive software making decisions, and then the higher mind. 00:51:27.720 |
And the higher mind is the part of you that knows that Skittles are not good and can override the instinct. 00:51:31.240 |
And the reason you don't always eat Skittles is because the higher mind says, no, no, no, we're not doing that. 00:51:36.360 |
And I know that right now you can apply that to a lot of things. 00:51:39.480 |
The higher mind is the one that knows I shouldn't procrastinate. 00:51:41.560 |
The primitive mind is the one that wants to conserve energy and not do anything icky and hard. 00:51:46.080 |
So you procrastinate. You know, you can apply this everywhere. 00:51:48.280 |
Now, I, in this book, apply it to how we form our beliefs, as one of the ways. 00:51:54.880 |
And then eventually to politics and political movements. 00:51:56.880 |
But like, if you think about, well, what's the equivalent of the Skittles tug-of-war 00:52:01.960 |
in your head for how you form your beliefs? 00:52:06.840 |
And it's that the primitive mind, in the world that it was optimized for, wanted to feel certainty. 00:52:20.600 |
It wanted to be sure that it was right—it wanted to feel conviction, and it wanted to agree. 00:52:29.840 |
It wanted to fervently agree with the tribe about the tribe's sacred beliefs. 00:52:34.320 |
And there's a big part of us that wants to do that, that doesn't like changing our mind. 00:52:37.280 |
It feels like it's part of us—the primitive mind identifies with beliefs. 00:52:40.320 |
It feels like a threat, a physical threat to you, to your primitive mind, when you change 00:52:45.120 |
your mind or when someone disagrees with you in a smart way. 00:52:48.800 |
So there's that huge force in us, which is confirmation bias. 00:52:52.560 |
It's this desire to keep believing what we believe, and this desire 00:52:56.980 |
to also fit in with our beliefs, to believe what the people around us believe. 00:53:03.920 |
We all like the same sports team and we're all super into it and we're all going to be in it together. 00:53:08.560 |
I mean, that it's not always bad, but it's not a very smart way to be. 00:53:13.000 |
And you're actually, you're working kind of for those ideas. 00:53:15.980 |
Those ideas are like your boss and you're working so hard to keep believing those. 00:53:20.000 |
Those ideas are—you know, a really good paper comes in that you read that conflicts with them. 00:53:24.960 |
And you will do all this work to say that paper is bullshit, because you're working for those ideas. 00:53:32.720 |
Now the higher mind, to me—the same part that can override the Skittles can override 00:53:36.320 |
this and can, and can search for something that makes a lot more sense, which is truth. 00:53:41.240 |
Because what rational being wouldn't want to know the truth? Who wants to be delusional? 00:53:45.600 |
And so there's this tug of war because the higher mind doesn't identify with ideas. 00:53:51.020 |
Why would you? It's an experiment you're doing, and it's a mental model. 00:53:53.920 |
And if someone can come over and say, you're wrong, you'd say, where, where? Show me, show me. 00:53:57.640 |
And if they point out something that is wrong, you'd say, oh, thanks. 00:54:07.320 |
So there's both of these in our heads and there's this tug of war between them. 00:54:11.280 |
And sometimes, you know, if you're telling me about something with AI, I'm probably going 00:54:14.400 |
to think with my higher mind because I'm not identified with it. 00:54:16.600 |
But if you go and you criticize the ideas in this book or you criticize my religious 00:54:20.000 |
beliefs, or you criticize—I might have a harder time, because the primitive mind says, no, those are mine. 00:54:26.000 |
So yeah, so that's, that's one way to use this ladder is like, it's a spectrum, you 00:54:30.080 |
know, at the top, the higher mind is doing all the thinking. 00:54:32.520 |
And then as you go down, it becomes more of a tug of war. 00:54:34.280 |
And at the bottom, the primitive mind is in total control. 00:54:37.120 |
And this is distinct, as you show, from the spectrum of ideas. 00:54:40.960 |
So this is how you think versus what you think. 00:54:48.820 |
We have all these horizontal axes—left, right, center, or, you know, this opinion all the way to that opinion. 00:54:53.520 |
But it's like, what's much more important than where you stand is how you got there, 00:54:59.320 |
So this helps: if I can say this person's kind of on the left or on the right, but they're 00:55:04.920 |
up high—in other words, I think they got there using evidence and 00:55:08.880 |
reason and they were willing to change their mind. 00:55:10.880 |
Now that means a lot to me what they have to say. 00:55:12.560 |
If I think they're just a tribal person and I can predict all their beliefs from hearing 00:55:16.120 |
one, because it's so obvious what their political tribe believes, that person's views are irrelevant to me. 00:55:23.820 |
They came from a tribe's kind of, you know, sacred 10 commandments. 00:55:29.340 |
I really like the comic you have in here with the boxer. 00:55:39.700 |
If he's never fought anyone, then how do you know he's the best boxer in the world? 00:55:43.140 |
Now, I mean, this connects with me, and I think with a lot of people, just because in martial 00:55:46.680 |
arts it's especially kind of true—there's this whole legend about different martial 00:55:50.740 |
artists, and people kind of would construct like action figures, like, you know, thinking that 00:55:58.820 |
Steven Seagal is the best fighter in the world, or Chuck Norris. 00:56:03.380 |
He's done really well in competition, but still, the ultimate test, particularly for 00:56:07.180 |
martial arts, is what we now know as mixed martial arts, UFC and so on. 00:56:16.140 |
I mean, it's within certain rules, and you can criticize those rules—like, this doesn't 00:56:19.500 |
actually represent the broader combat that you would think of when you're thinking about real fighting. 00:56:24.220 |
But the reality is you're actually testing things. 00:56:26.540 |
And that's when you realize that Aikido and some of these kind of woo-woo martial arts, 00:56:31.740 |
in their certain implementations, don't work in the way you think they would in the context of a real fight. 00:56:37.940 |
And I think this is one of the places where everyone can agree, which is why it's a really 00:56:41.700 |
nice comic, because then you start to talk about it. 00:56:46.580 |
Map this onto ideas that people take personally, 00:56:50.300 |
and it starts becoming a lot more difficult to basically highlight that we're thinking 00:56:57.500 |
not with our higher mind, but with our primitive mind. 00:57:01.020 |
I mean, if I'm thinking with my higher mind—and now here I can use different things. 00:57:06.340 |
So here the metaphor is a boxer for one of your conclusions, one of your beliefs. 00:57:12.820 |
And if all I care about is truth—in other words, that means all I care about is having the strongest boxer. 00:57:21.800 |
I would say, go, yeah, try, see if this person is good. 00:57:26.420 |
In other words, I would get into arguments, which is throwing my boxer out there to fight 00:57:31.620 |
And if I think my argument is good, by the way, I love boxing. 00:57:33.860 |
If I think my guy is amazing, Mike Tyson, I'm thinking, oh yeah, bring it on. 00:57:45.620 |
Now what would you think about my boxer if not only was I telling you he was great, 00:57:52.140 |
but then, when your idea came over to try to punch him, 00:57:55.740 |
I screamed and I said, what are you doing? 00:58:01.780 |
I don't want to be friends with you anymore. You would think this boxer obviously sucks, 00:58:06.300 |
or at least that I think he sucks deep down, because why else would I be so against anyone testing him? 00:58:13.900 |
So I call this a ladder, right? 00:58:19.140 |
If you're in low-rung land, whether it's a culture, a debate, an argument, 00:58:23.620 |
when someone says, no, that's totally wrong, what you're saying about that, and here's 00:58:32.180 |
why, people are going to say, oh, wow, we got in a fight. 00:58:39.700 |
It's a culture where you don't touch each other's ideas. 00:58:48.620 |
I mean, in every one of your podcasts, whether you're agreeing or disagreeing, the tone is 00:58:56.220 |
identical, because you're just playing intellectually either way. 00:59:00.900 |
- At its best. But people do take stuff personally. 00:59:05.140 |
And that's actually one of the skills of conversation, as a fan of podcasts: when you sense 00:59:09.780 |
that people take a thing personally, there are sort of methodologies and 00:59:15.460 |
little paths you can take to calm things down. 00:59:21.540 |
- You're trying to suss out which of their ideas are sacred to them. 00:59:26.660 |
- And that's the skill of it, I suppose: sometimes 00:59:29.700 |
certain wordings in the way you challenge those ideas are important. 00:59:35.140 |
You can challenge them indirectly and then walk together in that way. 00:59:40.100 |
Because what I've learned is people are used to their ideas being attacked in certain ways. 00:59:49.420 |
And if you just avoid those, like, for example, if you have political discussions and just 00:59:53.580 |
never mention left or right or Republican and Democrat, none of that, just talk about 00:59:59.380 |
different ideas and avoid certain kinds of triggering words, 01:00:03.780 |
you can actually talk about ideas, versus falling into this path that's well-established. 01:00:12.420 |
- When you say triggering, I mean, who's getting triggered? 01:00:15.740 |
So what you're trying to do, what you're saying in this language is how do you have conversations 01:00:19.740 |
with other people's higher minds, almost like whispering without waking up the primitive 01:00:25.780 |
And as soon as you say something, the left primitive mind gets up and says, "What? 01:00:33.340 |
- What do you make of conspiracy theories under this framework of the ladder? 01:00:37.740 |
- So here's the thing about conspiracy theories is that once in a while they're true, right? 01:00:43.780 |
'Cause sometimes there's an actual conspiracy. 01:00:45.720 |
Actually humans are pretty good at real conspiracies, secret things. 01:00:50.260 |
And then I just watched the Madoff doc, great new Netflix doc, by the way. 01:00:56.540 |
And so the question is, how do you create a system where you put the conspiracy 01:01:06.220 |
theory in and it either goes, "Eh," or it says, "This is interesting, let's keep exploring." 01:01:12.820 |
How do you build something that can assess it? 01:01:14.060 |
And so again, I think high-rung culture is really good at it, because with any conspiracy theory, 01:01:21.420 |
what's gonna happen is it's like a little machine you put in the middle of 01:01:24.620 |
the table, and everyone starts firing darts at it or bows and arrows or whatever, and everyone 01:01:27.900 |
starts kicking it, and almost all conspiracy theories quickly crumble. 01:01:34.060 |
Trump's election one, I actually dug in and I looked at, like, 01:01:38.540 |
every claim that he or his team made, and I was like, none of these hold up. 01:01:46.900 |
So that was one that, as soon as it was open to scrutiny, it crumbled. 01:01:51.320 |
The only way a conspiracy theory can stick around in a community is if it is a culture where 01:01:58.020 |
it's being treated as a sacred idea that no one should kick or throw a dart at. 01:02:03.740 |
And so what you want is a culture where no idea is sacred. 01:02:11.180 |
And so I think what you'll find is that 94 out of 100 conspiracy theories crumble. 01:02:17.580 |
Maybe four of the others come in and there's something there, but it's not 01:02:22.080 |
the full story. And then maybe one is a huge deal and it's actually a real conspiracy. 01:02:25.360 |
- Well, isn't there a lot of gray area and a lot of mystery? 01:02:28.920 |
Isn't that where the conspiracy theories seep in? 01:02:32.560 |
So it's great to hear that you've really looked into the Trump election fraud claims, but 01:02:40.000 |
aren't they resting on a lot of kind of gray area, like fog, basically saying that there 01:02:46.080 |
are dark forces in the shadows that are actually controlling everything? 01:02:49.840 |
I mean, the same thing with... there are, like, safer conspiracy theories, less 01:02:55.840 |
controversial ones like, have we landed on the moon? 01:03:02.720 |
The reason those conspiracy theories work is you could construct 01:03:08.800 |
incentives and motivations for faking the moon landing. 01:03:11.880 |
There's very little data supporting the moon landing 01:03:20.000 |
that's very public, and it kind of looks fake; space kind of looks fake. 01:03:22.800 |
- And that would be a big story if it turned out to be fake. 01:03:26.080 |
- That's the argument, that would be the argument against it. 01:03:28.080 |
Like are people really as a collective going to hold onto a story that big? 01:03:33.960 |
Yeah, but the reason they work is there's mystery. 01:03:38.560 |
- Yeah, there's a great documentary called Behind the Curve about flat earthers. 01:03:42.480 |
And one of the things that you learn about flat earthers is they believe all the conspiracies, 01:03:49.640 |
They are convinced the moon landing is fake, they're convinced 9/11 was an American con 01:03:55.240 |
job. They're convinced, you know, name a conspiracy and they believe it. 01:03:59.480 |
And it's so interesting: I think of it as a skepticism spectrum. 01:04:07.520 |
It's like a filter in your head, a filter in the beliefs section of your head. 01:04:14.120 |
On one end of the spectrum, you are perfectly gullible; on the other, you're paranoid, in denial of everything. 01:04:27.720 |
Now, I think the healthy place is to be somewhere in the middle. 01:04:33.280 |
But also, you can learn to trust certain sources, and then you don't have 01:04:36.120 |
to apply as much skepticism to them. 01:04:38.960 |
And so here's what happens when you start having a bias, just say you have a political bias: 01:04:46.480 |
when your side says something, you will find yourself moving towards the gullible side 01:04:51.280 |
of the spectrum. You read an article that supports your views, you move to the gullible side 01:04:54.440 |
of the spectrum and you just believe it. Where's that skepticism now? 01:04:58.280 |
And then as soon as it's the other person talking, the other team talking, 01:05:01.840 |
you move to the skeptical side, closer to the in-denial, paranoid side. 01:05:13.080 |
So it's so interesting, because they're the people who are saying, ah, nah, 01:05:17.880 |
No, everyone else is gullible about the moon landing. 01:05:20.240 |
And then yet when there's this evidence, like, oh, because you can't 01:05:24.360 |
see the buildings of Seattle over that horizon and you should, which isn't true, 01:05:28.140 |
if the earth is round, you wouldn't be able to see them, 01:05:32.100 |
so suddenly they become the most gullible person. 01:05:34.060 |
They hear any theory about the earth being flat, they believe it. 01:05:37.380 |
So they're actually jumping back and forth between refusing to believe anything and believing everything. 01:05:44.220 |
But I think when it comes to conspiracy theories, the people that get themselves into trouble 01:05:48.220 |
are the ones who become really gullible when they hear a conspiracy theory that kind 01:05:53.820 |
of fits what they already believe. And likewise, when there's something that's kind of obviously true and it's not 01:05:57.260 |
a big lie, they'll think it is one. 01:06:01.700 |
They just tighten up their kind of skepticism filter. 01:06:05.980 |
So the healthy place to be is in between, because you also don't want to 01:06:09.540 |
be the person for whom the words "conspiracy theory" sound 01:06:12.600 |
like a synonym for quack-job crazy theory. 01:06:19.660 |
So yeah, I think it's: be somewhere in the middle of that spectrum, and learn which sources to trust. 01:06:25.220 |
Because every time you hear a new conspiracy theory, you should approach it on its own merits. 01:06:31.700 |
And also, if you don't have enough time to investigate, which most people don't, 01:06:36.860 |
kind of still have the humility not to make a conclusive statement that it's nonsense. 01:06:41.500 |
There's a lot of social pressure, actually, to immediately laugh off any conspiracy theory. 01:06:50.940 |
Otherwise, you will quickly get mocked and laughed at and not taken seriously 01:06:54.420 |
if you give any credence. You know, the lab leak was a good one, where it 01:06:59.020 |
turned out that it was at least very credible, if not true. 01:07:03.700 |
And that was a perfect example of one where, when it first came out, someone who took it seriously had his reputation sullied. 01:07:11.820 |
And then I, in a totally different conversation, said something complimentary about him online. 01:07:19.040 |
And people were saying, Tim, you might have gone a little off the deep end. 01:07:22.180 |
You're like quoting someone who is like a lab leak person. 01:07:24.900 |
So I was getting my reputation dinged for complimenting, on a different topic, someone 01:07:31.220 |
whose reputation was totally sullied because they had questioned an orthodoxy. 01:07:37.820 |
So you see what that makes me want to do. 01:07:42.740 |
At least, that's the incentive. And what does that make other people want to do? 01:07:47.500 |
Not say it out loud, because you don't want to become someone that no one wants to compliment. 01:07:51.900 |
So you can see the social pressure. And of course, when there is a conspiracy... 01:07:59.040 |
- Because the people outside are seeing that social pressure in action, like 01:08:06.140 |
Tim Urban being pushed more and more and more extreme to the other side, and so they're 01:08:09.180 |
going to take the more and more and more extreme position themselves. 01:08:11.260 |
I mean, what do you see that the pandemic, that COVID, did to our civilization in this regard? 01:08:26.580 |
- Yeah, so COVID, I thought, might be... you know, we always say aliens are the ultimate example of a topic that would unite us. 01:08:35.140 |
- Although honestly, I don't even have that much faith there. 01:08:36.260 |
I think there'd be some people who are super pro-alien, and some people who are anti-alien. 01:08:42.420 |
- I was actually, sorry to interrupt, because I was talking to a few astronomers, and they're 01:08:47.460 |
the first folks that made me kind of sad, in that if we did discover life on Mars, for 01:08:55.020 |
example, there's going to be potentially a division over that too, with half the country on each side. 01:09:02.060 |
- Well, because we live in a society where the political divide has subsumed everything. 01:09:17.020 |
We're in a really bad one. In the book, I call it a vortex, almost 01:09:22.860 |
like a whirlpool that pulls everything into it. 01:09:27.740 |
And so normally you'd say, okay, immigration, naturally going to be contentious. 01:09:33.860 |
But COVID seemed like, oh, that's one of those that will unite us all. 01:09:43.300 |
No one's sensitive, no one's getting hurt when we insult the virus. 01:09:46.940 |
Let's all unite: we have this common threat, a threat to everyone of every 01:09:50.620 |
nationality and every country, every ethnicity. 01:09:59.000 |
So it pulled COVID in, and suddenly masks: if you're on the left, you like them; on the right, you hate them. 01:10:05.300 |
And suddenly lockdowns: if you're on the left, you like them, and on the right, you hate them. 01:10:11.780 |
When Trump first started talking about the vaccine, Biden, Harris, Cuomo, they're all 01:10:18.780 |
saying I'm not taking that vaccine, not from this CDC. 01:10:21.380 |
- Because it was too rushed or something like that? 01:10:24.020 |
- Because I'm not trusting anything that Trump says. 01:10:26.620 |
Trump wants me to take it, I'm not taking it. 01:10:29.980 |
So this was, Trump was almost out of office, but at the time, if he had stayed in, it would've flipped: 01:10:37.340 |
the right likes vaccines, the left doesn't like vaccines. 01:10:42.180 |
And all those people are suddenly saying, they were actually specifically saying, that 01:10:46.420 |
if you're saying the CDC is not trustworthy, that's misinformation, 01:10:51.540 |
which is exactly what they were saying about the other CDC. 01:10:54.380 |
And they were saying it because they genuinely didn't trust Trump, which is fair. But now, 01:10:58.060 |
when other people don't trust the Biden CDC, suddenly it's misinformation. 01:11:03.900 |
So it was a sad moment, 'cause there was a week, even a month 01:11:08.260 |
or so at the very beginning when it felt like a lot of our other squabbles were, 01:11:12.860 |
like, oh, kind of irrelevant right now. 01:11:15.700 |
And then very quickly the whirlpool sucked it in. 01:11:19.300 |
And in a way where I think it damaged the reputation of, and the trust in, 01:11:23.060 |
a lot of these institutions for the long run. 01:11:25.300 |
- But there's also an individual psychological impact. 01:11:27.900 |
It's like a vicious negative feedback cycle where people were deeply affected on an emotional 01:11:32.700 |
level and just were not their best selves. 01:11:39.900 |
I mean, one thing that we've been dealing with for our whole human history is pathogens. 01:11:47.460 |
It brings things out in us. There are really interesting studies where they studied the phenomenon 01:11:56.460 |
of disgust, which is one of these universal expressions. Smiling is universal. 01:12:01.260 |
You don't have to ever translate a smile, right? 01:12:03.980 |
Throwing your hands up when your sports team wins is universal. 01:12:10.780 |
And so is disgust: making this face where you wrinkle up your nose 01:12:13.900 |
and you kind of put out your tongue and maybe even gag. 01:12:17.020 |
That's to expel whatever it is, because it's the reaction when something is potentially contaminating. 01:12:26.700 |
But they did this interesting study with two groups. The control group, 01:12:33.500 |
you know, was shown images of, and I might be getting two studies mixed up, but they 01:12:37.940 |
were shown images of, like, car crashes, things disturbing but not disgusting. 01:12:41.580 |
And the other group was shown, like, rotting things, just things 01:12:48.740 |
that are disgusting. And the group that had the disgust feeling pulsing through their body was way more likely 01:12:54.160 |
to prefer immigrants from white countries. 01:12:58.980 |
And the group that had been shown car accidents still preferred the 01:13:02.820 |
groups from white countries, but much less so. 01:13:06.820 |
It's because the disgust impulse makes us scared of, you know, sexual practices that 01:13:12.060 |
are foreign, of ethnicities that don't look like us. It's xenophobia. 01:13:19.180 |
This is, of course, also how the propaganda worked: the 01:13:24.540 |
Rwandan propaganda was cockroaches, the Nazis' was rats. 01:13:27.500 |
And it's specifically a dehumanizing emotion. 01:13:31.660 |
So anyway, we were talking about COVID, but I think it does 01:13:37.900 |
tap deep into the human psyche, and, 01:13:41.180 |
like you said, I think it brings out an ugly side in us. 01:13:45.100 |
- You describe an idea lab as being the opposite of an echo chamber. 01:13:51.540 |
And you said there's basically no good term for the opposite of an echo chamber. 01:13:57.900 |
- Yeah, well, first of all, we think of an echo chamber as, like, a group 01:14:01.500 |
maybe, or even a place, but it's a culture. 01:14:06.820 |
And this goes along with the ladder: high-rung and low-rung thinking is individual. 01:14:10.460 |
So I was talking about what's going on in your head, but this is very connected to the group. 01:14:16.860 |
And so groups will do high-rung and low-rung thinking together. 01:14:24.100 |
Basically, an echo chamber to me is collaborative low-rung thinking. 01:14:27.740 |
It's a culture based around a sacred set of ideas. 01:14:34.700 |
And the coolest thing you can do in an echo chamber culture is talk about how great 01:14:39.420 |
the sacred ideas are and how bad and evil and stupid and wrong the people who disagree are. 01:14:47.580 |
And it's quite boring, you know. It's very hard 01:14:53.260 |
to learn, and changing your mind is not cool in an echo chamber culture. 01:14:58.940 |
It makes you seem like you're waffling or flip-flopping or whatever. 01:15:06.700 |
Showing conviction about the sacred ideas in echo chamber culture is awesome. 01:15:10.020 |
If you're just like, obviously this, it makes you seem smart while not really knowing what you're talking about. 01:15:15.220 |
So now flip all of those things on their heads and you have the opposite, which 01:15:18.900 |
is idea lab culture, which is collaborative high-rung thinking. 01:15:23.820 |
But it's also just a totally different vibe. 01:15:33.180 |
And criticizing the thing everyone believes actually makes you seem interesting. 01:15:40.300 |
And expressing too much conviction makes people lose trust in you. 01:15:44.300 |
It doesn't make you seem smart, it makes you seem stupid if you don't really know what 01:15:46.860 |
you're talking about, but you're acting like you do. 01:15:48.940 |
- I really like this diagram where on the x-axis is agreement and on the y-axis is decency. 01:16:01.300 |
I think this is a really important thing to understand, the difference between, 01:16:05.900 |
you call it decency here, asshole-ishness and disagreement. 01:16:10.500 |
- So my college friends, we love to argue, right? 01:16:13.060 |
And no one thought anyone was an asshole for it; it was just for sport. 01:16:16.640 |
Sometimes we'd realize we weren't even disagreeing on something, and that would be disappointing. 01:16:24.820 |
And one of the members of this group brought her new boyfriend to one of our 01:16:31.180 |
hangouts, and there was a heated, heated debate. 01:16:37.780 |
And afterwards, the next day, he said, is everything okay? 01:16:44.920 |
He was like, you know, the fight last night. 01:16:46.220 |
And she was like, you mean the arguing? 01:16:51.260 |
And so that's someone who is not used to Idea Lab culture coming into it 01:16:55.220 |
and seeing it as, like, are they still friends, right? 01:16:59.100 |
And Idea Lab is nice for the people in them because individuals thrive. 01:17:11.020 |
And you also have people criticizing your ideas, which makes you smarter. 01:17:14.100 |
It doesn't always feel good, but you become more correct and smarter. 01:17:18.580 |
And Echo Chamber is the opposite, where it's not good for the people in it. 01:17:22.180 |
Your learning skills atrophy, and I think it's boring. 01:17:26.860 |
But the thing is, they also have emergent properties. 01:17:29.420 |
So the emergent property of an Idea Lab is like super intelligence. 01:17:37.620 |
If we're working together on something, and we're being really grown up about it, 01:17:42.500 |
no one's sensitive about anything, 01:17:45.860 |
we're each going to find flaws in the other one's arguments that we wouldn't have found alone. 01:17:50.420 |
And we're going to have double the epiphanies, right? 01:17:52.780 |
So it's almost like the two of us together are as smart as 1.5 people. 01:17:56.500 |
It's like 50% smarter than either of us alone, right? 01:17:58.460 |
So you have this 1.5 intelligent kind of joint being that we've made. 01:18:03.420 |
Now bring the third person and fourth person in, right? 01:18:06.540 |
And this is why science institutions can discover relativity and quantum mechanics and these 01:18:12.820 |
things that no individual human was going to come up with without a ton of collaboration, 01:18:19.540 |
So it has an emergent property of super intelligence. 01:18:22.180 |
An echo chamber is the opposite: it has the emergent property of stupidity. 01:18:27.380 |
I mean, it has the emergent property of a bunch of people all paying fealty to the sacred ideas. 01:18:37.020 |
And so you lose this magical thing about language and humans, which is collaborative intelligence. 01:18:45.300 |
But there is that axis of decency, which is really interesting, because you kind of painted 01:18:50.420 |
this picture of you and your friends arguing really harshly. 01:18:54.000 |
But underlying that is a basic camaraderie, respect. 01:19:01.260 |
There's all kinds of mechanisms we humans have constructed to communicate mutual respect, 01:19:07.980 |
or maybe communicate that you're here for the Idea Lab version of this. 01:19:20.660 |
People are respected in an Idea Lab, and ideas are disrespected. 01:19:26.380 |
So with friends, you've already done the signaling. 01:19:31.380 |
The interesting thing is online, I think you have to do some of that work. 01:19:35.900 |
To me, it's steelmanning the other side, or, no, having empathy and hearing out, being 01:19:42.940 |
able to basically repeat the argument the other person is making, and showing, 01:19:48.820 |
I could see how you could think that, before you make a counter-argument. 01:19:52.380 |
There's just a bunch of ways to communicate that you're here not to do, what is 01:20:00.140 |
it, low-rung shit talking, mockery, derision, but are actually here ultimately 01:20:06.620 |
to discover the truth in the space of ideas and the tension of those ideas. 01:20:10.660 |
And I think it's, I think that's a skill that we're all learning as a civilization of how 01:20:17.340 |
to do that kind of communication effectively. 01:20:20.340 |
I think disagreement, as I'm learning on the internet, is actually a really tricky skill. 01:20:27.820 |
There's a really good debate podcast I get to listen to, Intelligence Squared. 01:20:34.780 |
And like, they can go pretty hard in the paint. 01:20:38.820 |
- Exactly. But, like, how do we map that to social media? 01:20:42.700 |
When people will say, well, Lex, or anybody, you hate disagreement. 01:20:51.700 |
No, I love Intelligence Squared-type disagreement. 01:20:58.260 |
- And for me personally, I don't want to reduce assholery. 01:21:01.940 |
I kind of like assholery, it's like fun in many ways. 01:21:04.860 |
But the problem is when the asshole shows up to the party, they make it less fun for 01:21:12.940 |
And the other people, especially the quiet voices at the back of the room, they leave. 01:21:20.220 |
- Well, Twitter, political Twitter to me is one of those parties. 01:21:23.580 |
It's a big party where a few assholes have really sent a lot of the quiet thinkers away. 01:21:34.740 |
And so if you think about this graph again, in some place like Twitter, a great way 01:21:43.300 |
to get followers is to be an asshole pumping a certain ideology. 01:21:50.980 |
And the followers you're going to get, the people who like you, 01:21:56.780 |
are probably going to be people who are really thinking with their primitive mind, because 01:22:00.660 |
they're seeing you being an asshole, but because you agree with them, they love you. 01:22:06.140 |
And they think, they don't see any problem with how you're being. 01:22:10.260 |
- Well, because look, look at the thing on the right. 01:22:14.620 |
So if you're in that mindset, the bigger the asshole, the better. 01:22:25.820 |
- There's a fascinating dynamic that happens because I have currently hired somebody that 01:22:31.420 |
looks at my social media and they block people because the assholes will roll in. 01:22:35.580 |
They're not actually there to have an interesting disagreement, which I love. 01:22:42.700 |
And then when they get blocked, they then celebrate that to their echo chamber. 01:22:51.020 |
- Or they'll say some annoying thing, like, if I had done 01:22:56.140 |
this, they'll say, oh, he says he likes Idea Labs, but he actually wants to create an echo 01:23:04.820 |
chamber. No. Look at the other 50 people on this thread that disagreed with me respectfully. 01:23:09.860 |
- And so they see it as some kind of hypocrisy, because, again, they only see the one axis. 01:23:14.900 |
And they're not understanding that there are two axes, or that I see it as two axes. 01:23:18.940 |
And so you seem petty in that moment, but it's like, no, no, no, this is very specific 01:23:26.300 |
- And generally, I give all those folks a pass and just send them love telepathically. 01:23:31.540 |
But yes, getting rid of assholes in the conversation is the way you allow for the disagreement. 01:23:38.520 |
I think when primitive-mindedness comes at you, at least on Twitter, 01:23:44.340 |
I don't know what you're feeling internally in that moment, but you do a lot of this: 01:23:50.740 |
you come out and you'll be like, thanks for all the criticism, I love you. 01:23:56.440 |
And that's actually an amazing response, because what it does is it un-riles that person's 01:24:07.740 |
primitive mind and actually wakes up their higher mind, who says, oh, okay, you know. 01:24:17.060 |
But the thing is, they do seem to drive away high quality disagreement. 01:24:23.060 |
'Cause it takes so much effort to disagree in a high quality way. 01:24:30.620 |
One of the things I pride myself on is that my comment section is awesome. 01:24:38.940 |
No one's afraid to disagree with me and tear my post apart, but in a totally respectful 01:24:44.080 |
way, where the underlying thing is, I'm here 'cause I like this guy and his writing. 01:24:48.620 |
And people disagree with each other and they get into these long threads, and it's interesting and useful. 01:24:53.020 |
And then a couple posts, especially the ones I've written about politics, 01:24:56.780 |
it's like, it seems like any other comment section. 01:24:59.140 |
People are being nasty to me, they're being nasty to each other. 01:25:02.200 |
And then I looked down one of them and I realized, almost all of this is the work of a handful of really aggressive people. 01:25:10.660 |
You're not being thin-skinned, you're not being petty by blocking them. 01:25:17.180 |
Because what really aggressive people like that do is they'll turn it into their own turf. 01:25:21.820 |
Because now everyone is scared to disagree with them; it's unpleasant. 01:25:24.760 |
And so the people who will chime in are the people who agree with them, and suddenly they've taken over. 01:25:29.540 |
- And I kind of believe that those people on a different day could actually do high-effort 01:25:33.980 |
disagreement, it's just that they're in a certain kind of mood. 01:25:37.620 |
And a lot of us, just like you said, with a primitive mind, could get into that mood. 01:25:41.900 |
And I believe it's actually the job of the technology, the platform, to incentivize those 01:25:47.680 |
folks to be like, "Are you sure this is the best you can do? 01:25:52.180 |
If you really want to talk shit about this idea, do better." 01:25:57.260 |
And then we need to create incentives where you get likes for high-effort disagreement. 01:26:02.740 |
Because currently you get likes for something that's slightly funny and is a little bit 01:26:10.580 |
mocking, something that basically signals to some kind of echo chamber that this person is a horrible person. 01:26:20.460 |
That feels like it's solvable with technology. 01:26:23.020 |
Because I think in our private lives, none of us want that. 01:26:26.140 |
- I wonder... it's making me think that a much easier way for 01:26:30.460 |
me to do it, just for my world, would be to say something like, "Here's this axis. 01:26:37.460 |
This is part of what I like about the ladder." 01:26:42.460 |
Specifically, what we're talking about is high-rung disagreement being good. 01:26:50.460 |
So what I would say is I would have my readers understand this axis. 01:26:56.000 |
And then I would specifically say something like, "Please do the Wait But Why community a favor 01:27:03.280 |
and upvote regardless of what they're saying horizontally, regardless of what their actual 01:27:11.700 |
opinion is. They can be disagreeing with me, they can be praising me, whatever. 01:27:16.060 |
Upvote high-rungness and downvote low-rungness." 01:27:19.040 |
And if enough people are doing that, suddenly there's all this incentive to try to say, 01:27:22.180 |
"I need to calm my emotion down here and not be personal, because I'm going to get voted down." 01:27:28.400 |
I think a lot of people would be very good at that. 01:27:32.680 |
And not only would they be good at that, they would want that. 01:27:36.880 |
That task of saying, "I know I completely disagree with this person, but this was a high-rung argument." 01:27:44.680 |
It gets everyone thinking about that other axis too. 01:27:46.280 |
You're not just looking at where you stand horizontally. 01:27:48.400 |
You're saying, "Well, how did you get there and how are you... 01:27:51.840 |
Are you treating ideas like machines or are you treating them like little babies?" 01:27:56.040 |
- And there should be some kind of labeling of personal attacks versus idea disagreement. 01:28:04.080 |
- Right. There should be a disincentive for personal attacks versus idea attacks. 01:28:12.640 |
Say I'm on someone else's Twitter, and I see you put out a thought, and I see someone say, "Lex, I think you're wrong. 01:28:23.200 |
Here's where I think you went wrong," and they're just explaining. 01:28:26.040 |
I'm thinking that if Lex reads that, he's going to be interested. 01:28:29.440 |
He's going to want to post more stuff, right? 01:28:32.120 |
If I see someone being like, "Wow, this really shows the kind of person that you become," 01:28:38.320 |
I'm thinking, "That person is making Lex want to be on Twitter less." 01:28:43.240 |
What that person's actually doing is 01:28:46.400 |
chilling discussion, because they're making it unpleasant. 01:28:48.760 |
They're making it scary to say what you think. 01:28:52.160 |
The first person is making you want to say more stuff. 01:29:00.240 |
Great disagreements with friends in meatspace are like that. 01:29:10.440 |
Honestly, there could even be some shit talk, like personal attacks. 01:29:15.360 |
- Because you know them well and you know that that shit talk... 01:29:18.280 |
Because, yeah, friends shit talk all the time playing a sport or a game. 01:29:22.360 |
And again, it's because they know each other well enough to know that this is fun. 01:29:33.120 |
- Yeah, that "obviously I love you" that underlies a lot of human interaction seems to be easily 01:29:39.960 |
lost online. I've seen some people on Twitter and elsewhere just behave their worst. 01:29:44.600 |
And it's like, I know that's not who you are. 01:29:49.240 |
- I know someone personally who is one of the best people. 01:30:01.160 |
And if you looked at his Twitter only, you would think he's a culture warrior, an awful 01:30:15.040 |
person. And I'm not going to give any other specific info about it. 01:30:19.840 |
- It comes out of a good place because he really cares about what... 01:30:26.440 |
Once you know someone like that, you can realize, okay, apply that to everyone. 01:30:28.960 |
Because a lot of these people are lovely people. 01:30:32.480 |
Even just, you know, back before social media, 01:30:38.160 |
some people had this dickishness on text or email that they didn't have in person. 01:30:41.760 |
And you're like, "Wow, email-you is kind of a dick." 01:30:45.720 |
Certain people have a different persona behind the screen. 01:30:48.440 |
- It has, for me personally, become a bit of a meme that Lex blocks with love. 01:30:54.000 |
But there is a degree to that where this is... 01:30:56.400 |
I don't see people on social media as representing who they really are. 01:31:04.600 |
I see this as some weird side effect of online communication. 01:31:09.800 |
And so it's like, to me, blocking is not some kind of a derisive act towards that individual. 01:31:17.320 |
- Well, a lot of times what's happened is they have slipped into a very common delusion. 01:31:28.240 |
They're dehumanizing you, or whoever they're being nasty to, in a way they never would in person. 01:31:34.840 |
Because in person, they're reminded that's a person. 01:31:36.840 |
Remember I said the dumb part of my brain when I'm doing VR won't step off the cliff, 01:31:41.240 |
but the smart part of my brain knows I'm just on the rug. 01:31:43.640 |
That dumb part of our brain is really dumb in a lot of ways. 01:31:47.760 |
It's the part of your brain where you can set the clock five minutes fast to help you be on time. 01:31:52.800 |
The smart part of your brain knows that you did that, but the dumb part will fall for it. 01:31:56.760 |
So that same dumb part of your brain can forget that the person behind that screen, behind 01:32:05.040 |
that username, is a real person. And that doesn't mean they're a bad person for forgetting that, because it can happen to anyone. 01:32:08.600 |
- Well, there's this really interesting idea, and I wonder if you're right, 01:32:12.920 |
that both primitive-mindedness and high-mindedness tend to be contagious. 01:32:18.760 |
I hope you're right that it's possible to make both contagious. 01:32:23.400 |
Because our sort of popular intuition is that only one of them, primitive-mindedness, is contagious. 01:32:32.680 |
- To compliment you again, don't you think that your Twitter, I was just 01:32:37.480 |
looking down it, and I mean, it's just high-mindedness. 01:32:41.880 |
It's just high mindedness, down, down, down, down, down. 01:32:44.380 |
It's gratitude, it's optimism, it's love, it's forgiveness, it's all these things that 01:32:48.880 |
are the opposite of grievance and victimhood and resentment and pessimism, right? 01:32:54.280 |
And there's I think a reason that a lot of people follow you, because it is contagious. 01:33:02.200 |
- I don't know, I've been recently, over the past few months, attacked quite a lot. 01:33:08.720 |
And it's fascinating to watch, because it's over things that... I think I probably have 01:33:13.400 |
done stupid things, but I'm being attacked for things that are totally not worthy of attack. 01:33:22.160 |
- I saw that, by the way, I thought it was great. 01:33:24.840 |
- But you can always kind of find ways to... I guess the assumption is, this person surely 01:33:32.280 |
is a fraud, or some other explanation, surely he has dead bodies in the basement he's 01:33:37.240 |
hiding or something like this, and then I'm going to construct a narrative around that. 01:33:42.480 |
I don't know how that works, but there does, and I think you write this in 01:33:46.920 |
the book, seem to be a gravity pulling people towards the primitive mind. 01:33:52.080 |
- When it comes to anything political, right, religious, certain things are bottom-heavy. 01:34:01.640 |
They have a magnet that pulls our psyches downwards on the ladder. 01:34:05.320 |
And why does politics pull our psyches down on the ladder? 01:34:09.200 |
Because for the tens of thousands of years that we were evolving, politics was life and death. 01:34:22.500 |
And so there's actually an amazing study where they challenged like 20 different 01:34:32.960 |
beliefs, and they had an MRI going. Different parts 01:34:36.960 |
of the person's brain lit up when non-political beliefs were challenged versus political beliefs. 01:34:42.080 |
When non-political beliefs were challenged, the 01:34:46.560 |
rational, prefrontal-cortex-type areas lit up. 01:34:52.020 |
When the political beliefs were challenged, and I'm getting in over my head here, it's 01:34:55.600 |
like the default mode network, the parts of your brain associated 01:34:58.880 |
with introspection and your own identity, that lit up. 01:35:03.840 |
And people were much more likely to change their mind on the non-political 01:35:10.080 |
beliefs. When that default mode network part of your brain lit up, you were going to, if anything, 01:35:15.160 |
get more firm in those beliefs when you had them challenged. 01:35:18.260 |
So politics is one of those topics that just literally lights up a different part of the brain. 01:35:25.560 |
And again, I think we come back to primitive mind, higher mind here. 01:35:28.200 |
This is one of the things our primitive mind comes programmed to care about. 01:35:34.760 |
And so it's going to be very hard for us to stay rational and calm and looking for truth. 01:35:42.080 |
- Well, it's weird, because politics, like, what is politics? 01:35:44.600 |
Like you talk about, it's a bunch of different issues, and each individual issue, if we really dug into it... 01:35:52.760 |
- I don't think we're actually that emotional about each issue. I mean, yeah, we're emotional about something else. 01:35:57.560 |
- Yeah, I think what we're emotional about is: my side, the side I've identified with, 01:36:03.000 |
is in power and making the decisions, and your side is out of power. 01:36:08.040 |
And if your side's in power, that's really scary for me because that goes back to the 01:36:12.360 |
idea of who's pulling the strings in this tribe, right? 01:36:21.520 |
We might not have food if we don't win this kind of whatever, chief election. 01:36:26.560 |
So I think that it's not about the tax policy or anything like that. 01:36:30.800 |
And then it gets tied to this broader thing. I think a lot of our tribalism has really 01:36:37.120 |
moved into politics. We don't have that much religious tribalism in the US, right? 01:36:39.480 |
It's not like the Protestants and the Catholics hate each other. 01:36:44.080 |
And honestly, people like to say we have racial tribalism and everything, but even a kind 01:36:51.800 |
of racist white conservative guy, I think, takes the black conservative over the woke 01:37:01.480 |
white guy. It tells me that political tribalism is way stronger right now. 01:37:05.200 |
I think that white racist guy loves the black conservative guy compared to the white woke guy. 01:37:13.040 |
So again, not that racial tribalism isn't a thing. 01:37:16.440 |
Of course, it's always a thing, but like political tribalism is the number one right now. 01:37:21.560 |
So race is almost a topic for the political division, versus the actual axis of tribalism? 01:37:30.400 |
I mean, this is dark, because this is a book about human civilization, 01:37:36.520 |
a book about human nature, but it's also a book about politics. 01:37:44.160 |
It is just the way you list it out in the book. 01:37:48.680 |
It's kind of dark how we just fall into these left and right checklists. 01:37:54.640 |
So if you're on the left, it's: maintain Roe v. Wade, universal healthcare good, mainstream 01:38:01.640 |
media fine, guns kill people, US is a racist country, protect immigrants, tax cuts bad, climate change is a crisis. 01:38:09.160 |
And on the right it's the flip of that: reverse Roe v. Wade, universal healthcare bad, mainstream 01:38:14.600 |
media bad, people kill people, not guns. 01:38:17.880 |
US was a racist country, protect borders, tax cuts good, climate change overblown, and so on. 01:38:24.560 |
You almost don't have to think about any of this. 01:38:29.160 |
Well, when you say it's a book about politics, it's interesting, because it's a book about 01:38:35.120 |
the vertical axis. It's specifically not a book about the horizontal axis, in that I'm not actually arguing those issues. 01:38:45.440 |
Rather than arguing, or having another book about those issues, about right 01:38:50.440 |
versus left, I wanted to do a book about this other axis. 01:38:54.600 |
And so on this axis, the reason I had this checklist is that this is the low part of the ladder. 01:39:03.800 |
Low rung politics is a checklist and that checklist evolves, right? 01:39:08.000 |
Like, Russia suddenly is popular with the right, as opposed to, you know, it used 01:39:11.400 |
to be, in the sixties, that the left was the one defending Stalin. 01:39:17.960 |
This is the approved checklist of the capital P party. 01:39:23.680 |
The high rungs, this is not what high-rung politics is like. 01:39:29.640 |
If you tell me one belief, I have no idea what you think about anything else, right? 01:39:32.440 |
And you're going to say, I don't know, about a lot of stuff, because inherently you're not 01:39:36.440 |
going to have that strong an opinion, because you don't have that much information; these are 01:39:44.000 |
complicated issues. You know you're talking to someone who has been subsumed by 01:39:50.120 |
the checklist when, if they tell you their opinion on any one of these issues, 01:39:54.400 |
you could just rattle off their opinion on every single other one. 01:39:56.960 |
And if in three years it becomes fashionable to have some new view, they're 01:40:00.760 |
going to have it. You're not thinking; that's echo chamber culture. 01:40:05.240 |
And I've been using kind of a shorthand of "centrist" to describe this kind of high-rung 01:40:11.680 |
thinking, but it seems to be difficult to be a centrist. 01:40:18.440 |
It's like people want to label you as a person who's too cowardly to take a stance, 01:40:26.040 |
as opposed to someone saying, I don't know, as a first statement. 01:40:28.920 |
Well, the problem with "centrist" is that would mean that on each of these, tax 01:40:35.080 |
cuts good or bad, you're in the middle. It means that you are saying, I think we should have some tax cuts, but not that many. 01:40:39.560 |
But you might not think that. You might actually come do some research 01:40:41.640 |
and say, actually, I think tax cuts are really important. 01:40:45.600 |
That doesn't mean, oh, I'm not a centrist anymore. 01:40:50.840 |
So what you're trying to be when you say "centrist" is high-rung, which means you might be all 01:40:55.660 |
over the map. You might agree with the far left on this thing, the far right on this thing, you might 01:40:58.560 |
agree with the centrists on this thing. But calling yourself a centrist 01:41:03.020 |
is putting yourself in a prison on the horizontal axis. 01:41:07.480 |
And it's saying that, whatever the topic, I'm 01:41:14.080 |
in the middle. So yeah, we're badly missing this other axis. 01:41:18.080 |
I mean, I still do think it's like, for me, I am a centrist when you project it down to 01:41:24.680 |
the horizontal, but the point is you're missing so much data by not considering the vertical 01:41:30.600 |
because, like, on average, maybe it falls somewhere in the middle, but in reality, there's just 01:41:35.720 |
a lot of nuance issue to issue that involves just thinking and uncertainty and changing 01:41:40.840 |
your views given the context of the current geopolitics and economics; it's always considering, 01:41:47.800 |
always questioning, always evolving your views, all of that. 01:41:50.640 |
Not just about, like, oh, I think we should be in the center on this. 01:41:53.860 |
But another way to be in the center is: if there's some phenomenon happening, you know, 01:41:58.280 |
there's a terrorist attack, and one side wants to say, this has nothing to 01:42:03.700 |
do with Islam, and the other side wants to say, this is radical Islam, right? 01:42:08.480 |
What's in between those is saying, this is complicated and nuanced, and we have to 01:42:12.940 |
look into it. It probably has something to do with Islam and something to do with the economic circumstances 01:42:17.060 |
and something to do with, you know, geopolitics. 01:42:19.660 |
So in a case like that, you actually do get really un-nuanced when you go to the extremes 01:42:24.740 |
and all of that nuance, which is where all the truth usually is, is going to be in the 01:42:29.980 |
- And there's a lot of truth to the fact that if you take that nuance on those issues, like 01:42:33.420 |
war in Ukraine, COVID, you're going to be attacked by both sides. 01:42:38.900 |
People who have, who are really strongly on one side or the other hate centrist people. 01:42:43.780 |
I've gotten this myself, and, you know, the slur that I've had thrown at 01:42:47.860 |
me is that I'm an "enlightened centrist," in a very mocking way. 01:42:53.620 |
It means someone who is, it's what Steven Pinker or Jonathan Haidt get accused of: 01:42:58.040 |
that they're in their highfalutin intellectual world, and they don't actually take a stand. 01:43:05.840 |
They don't actually get their hands dirty, and they can be superior to both sides without any risk. 01:43:12.460 |
So I see the argument, and I disagree with it, because I firmly believe that the hardcore 01:43:17.740 |
tribes, they think they're taking a stand, and they're out in the streets, and they're protesting. 01:43:22.660 |
I think what they're doing is they're just driving the whole country downwards. 01:43:25.260 |
And I think they're hurting all the causes they care about. 01:43:28.380 |
And so it's not that we need everyone to be sitting 01:43:33.180 |
in the middle. It's that you can be far left and far right, but be upper left and upper right. 01:43:37.260 |
If we talk about this: you use the word "liberal" a lot in the book to mean something that we 01:43:48.060 |
don't typically mean in political discourse. And then you use "progressive" to mean the left and "conservative" to mean the right. 01:43:54.180 |
Can you describe the concept of liberal games and power games? 01:43:58.340 |
So the Power Games is basically just the laws of nature: 01:44:05.540 |
when the laws of nature are the laws of the land, that's the Power Games. 01:44:10.140 |
So, animals. Watch any David Attenborough special, 01:44:13.860 |
when the little lizard is running away from the bigger animal or whatever. 01:44:22.900 |
Do bears eat bunnies? They probably don't, but pretend bears eat bunnies. 01:44:25.460 |
So in the Power Games, the bear is chasing the bunny. 01:44:31.100 |
And, you know, what's legal? 01:44:33.540 |
If the bear is fast enough, it can eat the bunny. 01:44:36.460 |
If the bunny can get away, it gets to keep living. 01:44:41.820 |
Now humans have spent a lot of time in essentially that environment. 01:44:45.620 |
So when you have a totalitarian dictatorship, that's the Power Games. And what's the rule of the Power 01:44:51.260 |
Games? The bear can do whatever it wants if it has the power to do so. 01:44:54.620 |
So if the bunny gets away, the bunny actually has more power than the bear in that situation. 01:44:59.300 |
And likewise, in the totalitarian dictatorship, there are no rules. 01:45:04.220 |
They can torture, they can flatten a rebellion with a 01:45:08.340 |
lot of murder, because they have the power to do so. 01:45:12.740 |
And that's kind of the state of nature. 01:45:15.180 |
You know, when you look at a mafia, watch a mafia movie, 01:45:19.540 |
you see we have it in us. 01:45:21.180 |
We can all snap into Power Games mode when it becomes all about survival. 01:45:30.500 |
Now, the Liberal Games is something that civilizations have been working on for thousands of years. 01:45:37.300 |
It's not invented by America or modern times, but America was kind of the latest 01:45:42.100 |
crack at it, which is this idea: instead of everyone can do what they want if they 01:45:46.780 |
have the power to do so, it's everyone can do what they want as long as it doesn't harm anyone else. 01:45:52.980 |
And the idea is that everyone has a list of rights which are protected by the 01:45:57.580 |
government, their inalienable rights, and those are protected 01:46:02.780 |
from invasion by other people. 01:46:08.620 |
And so you have this kind of fragile balance. 01:46:11.020 |
And so the idea with the Liberal Games is that there are laws, but it's not totalitarian. 01:46:16.500 |
They build very clear, strict laws kind of around the edges of what you can and can't do. 01:46:24.340 |
So unlike a totalitarian dictatorship, actually, it's very loose. 01:46:27.740 |
A lot of things can happen and it's kind of up to the people, but there 01:46:31.500 |
are still laws that protect the very basic inalienable rights and stuff like that. 01:46:36.020 |
Now, there's a vulnerability there. But the benefits of it are obvious: 01:46:47.140 |
equality of opportunity seemed like the most fair thing, and 01:46:53.900 |
equality before the law, due process and all of this stuff. 01:46:57.340 |
So it seemed fair to the founders of the US and other Enlightenment thinkers. 01:47:01.420 |
And it also is a great way to manifest productivity, right? 01:47:04.940 |
You know, you have Adam Smith saying it's not from the benevolence of the butcher 01:47:09.500 |
or the baker that we get our dinner, but from their own self-interest. 01:47:12.420 |
So you can harness kind of selfishness for progress. 01:47:16.260 |
But it has a vulnerability, which is that the laws are minimal. Totalitarian 01:47:21.540 |
regimes don't have an excess of laws for no reason. 01:47:26.740 |
And in the US, we say we're not going to do that. 01:47:29.060 |
And so it's almost two puzzle pieces. 01:47:31.900 |
You have the laws and then you've got a liberal culture. 01:47:35.620 |
Liberal laws have to be married to liberal culture, kind of a defense of liberal spirit 01:47:40.820 |
in order to truly have the liberal games going on. 01:47:44.100 |
And so that's vulnerable. Because with free speech, you can have the First Amendment, that's the 01:47:50.140 |
legal piece. But if you're in a culture where anyone who speaks out against orthodoxy 01:47:56.660 |
is going to be shunned from the community, well, you're lacking the second piece of the 01:48:02.500 |
puzzle. And so therefore, you might as well not even have the First Amendment. 01:48:08.080 |
And there's a lot of examples like that, where the culture has to do its part for the true Liberal Games to work. 01:48:14.380 |
So it's just much more complicated and much more nuanced than the power games. 01:48:17.740 |
It's kind of a set of basic laws that then are coupled with a basic spirit 01:48:25.100 |
to create this very awesome human environment that's also very vulnerable. 01:48:32.420 |
So what do you mean, the culture has to play along? 01:48:34.320 |
So for something like freedom of speech to work, there has to be a basic, what, decency? 01:48:41.300 |
That if all people were perfectly good, then perfect freedom without any restrictions would be fine. 01:48:47.900 |
It's where the human nature starts getting a little iffy. 01:48:50.700 |
We start being cruel to each other, we start being greedy and desiring of harm, and also 01:48:56.980 |
the narcissists and sociopaths and psychopaths in society, all of that, that's when you start 01:49:01.980 |
to have to inject some limitations on that freedom. 01:49:05.020 |
Yeah, I mean, what the government basically says is we're going to let everyone be mostly 01:49:12.340 |
free, but no one is going to be free to physically harm other people or to steal their property, 01:49:21.860 |
things like that. And so we're all agreeing to sacrifice that 20% of our freedom, and then 01:49:27.500 |
in return, all of us in theory can be 80% free, and that's kind of the bargain. 01:49:34.700 |
But now that's a lot of freedom to leave people with, and a lot of people choose, it's like 01:49:39.900 |
you're so free in the US, you're actually free to be unfree if you choose. 01:49:42.840 |
That's kind of what an echo chamber is to me. 01:49:45.100 |
You can choose to kind of be friends with people who essentially make 01:49:53.980 |
it so uncomfortable to speak your mind that there's no actual effective difference for you. 01:50:04.340 |
If you can't, you know, criticize Christianity in a certain community, you have a First Amendment, 01:50:11.180 |
so you're not going to get arrested by the government for criticizing Christianity. 01:50:16.300 |
But if the social penalties are so extreme that it's just never worth 01:50:22.460 |
it, you might as well be in a country that imprisons people for criticizing Christianity. 01:50:29.420 |
And so that same thing goes for wokeness, right? 01:50:31.860 |
This is what people call cancel culture and stuff. 01:50:34.700 |
The reason these things are bad is because they're depriving 01:50:40.180 |
Americans of the beauty of the freedom of the Liberal Games by imposing a 01:50:46.620 |
social culture that is very Power Games-esque. 01:50:49.300 |
Basically, a Power Games culture comes in, and you might as well be living in the Power Games. 01:50:54.460 |
And so, if you live in a liberal democracy, there will always be challenges to a liberal culture. 01:51:05.860 |
There'll always be challenges from people who are much more interested in power. 01:51:12.740 |
And there has to be kind of an immune system that stands up to that culture and says, "That's 01:51:16.220 |
not how we do things here in America, actually. 01:51:18.740 |
We don't excommunicate people for not having the right religious beliefs, and 01:51:23.500 |
we don't disinvite a speaker from campus for having the wrong political beliefs." 01:51:28.060 |
And if it doesn't stand up for itself, it's like the immune system of the country failing. 01:51:37.900 |
So before chapter four in your book, and the chapters that will surely result in you being 01:51:45.260 |
burned at the stake, you write, "We'll start our pitchfork tour in this chapter by taking 01:51:51.420 |
a brief trip through the history of the Republican Party. 01:51:54.020 |
Then in the following chapters, we'll take a Tim's-career-tanking deep dive into America's 01:52:00.140 |
social justice movement," as you started to talk about. 01:52:08.500 |
I'm looking at this through my vertical ladder. 01:52:10.380 |
What does this familiar story of the Republicans from the '60s to today look like through that lens? 01:52:21.740 |
And is there an interesting story here that's been kind of hidden because we're always looking horizontally? 01:52:25.500 |
Now, the horizontal story, you'll hear people talk about it and they'll say something like 01:52:29.820 |
the Republicans have moved farther and farther to the right. 01:52:46.060 |
So we're using this, again, it's just like you're calling yourself centrist when it's 01:52:48.340 |
not exactly what you mean, even though it also is. 01:52:52.380 |
So again, I was like, "Okay, look, this vertical lens helps with other things. 01:52:57.020 |
And here's what I saw: I looked at the '60s and I saw an interesting story, which I don't 01:53:04.260 |
think everyone's familiar with, what happened in the early '60s. 01:53:08.900 |
But in 1960, the Republican Party was a plurality of factions. 01:53:14.460 |
You had genuine progressives, like Rockefeller, pretty progressive people, all the way to, 01:53:20.660 |
well, the moderates like Eisenhower and Dewey, 01:53:25.180 |
and then all the way to the farther right, you had Goldwater and what you might call the hard right. 01:53:34.900 |
And so it's this interesting plurality, right? 01:53:40.420 |
And what happened was the Goldwater contingent, which was the underdog, they were small, right? 01:53:47.540 |
Eisenhower was the president, or had just been the president, and it seemed 01:53:52.400 |
like the moderates were in charge. He said, "You have to be close to the center of the chessboard." 01:53:58.980 |
These people were very far from the center of the chessboard, but they ended up basically taking over. 01:54:06.220 |
And they did it by breaking all of the kind of unwritten rules and norms. 01:54:12.020 |
So they did things like, they first started with the College Republicans, which was 01:54:15.220 |
this feeder group where a lot of the politicians started. 01:54:19.500 |
And they went to the election and they wouldn't let the current president, the incumbent, speak. 01:54:26.700 |
And they were throwing chairs and there were fistfights. 01:54:28.580 |
And eventually people gave up; they just sat there talking 01:54:31.100 |
for their candidate until everyone eventually left, and then they declared victory. 01:54:35.560 |
So basically, there was a certain set of rules, agreed-upon rules, and 01:54:41.980 |
they came in playing the power games, saying, "Well, actually, 01:54:47.460 |
we have the power to take it if we just break all the rules." 01:54:53.500 |
And that became this hugely influential thing, and then they conquered California through the same tactics. 01:55:00.740 |
These proper Republican candidates were appalled by the kind of like the insults that were 01:55:04.740 |
being hurled at them and the intimidation and the bullying. 01:55:07.140 |
And eventually they ended up at the National Convention, which became like a right-wing rally. 01:55:12.500 |
The Republican National Convention in '64 was just – again, there was jeering, 01:55:16.780 |
and they wouldn't let the moderates or the progressives even speak. 01:55:21.780 |
You know, Jackie Robinson was there, and he was a proud Republican, and he said that he 01:55:25.300 |
felt like a Jew in Hitler's Germany with the way that blacks were being treated there. 01:55:31.620 |
They had enough of a plurality to win, and they won. 01:55:36.660 |
They ended up getting crushed in the general election and they kind of faded away. 01:55:39.660 |
But to me, I was like, "What's – that was an interesting story." 01:55:42.220 |
I see it as – I have this character in the book called the Golem, which is a 01:55:46.420 |
big, dumb, powerful monster that's the emergent property of a political movement. 01:55:56.340 |
And to me, it was like a golem rose up, conquered the party for a second, knocked it on its back, and then faded away. 01:56:05.260 |
And to me, when I look at the Trump revolution, and not just Trump, the last 01:56:09.220 |
20 years, I see that same lower-right monster kind of making another 01:56:17.780 |
charge for it, but this time succeeding and really taking over the party for a long period of time. 01:56:22.820 |
And I see the same story, which is that the power games are being played in a situation where 01:56:28.340 |
it had always been that the government relies on all these unwritten rules and norms to function. 01:56:33.260 |
But for example, you have in 2016, Merrick Garland gets nominated by Obama, and the unwritten 01:56:40.420 |
norm says that when the president nominates a justice, then you pass them through unless something is really wrong. 01:56:47.620 |
But they said, "Actually, this is the last year of his presidency and the people should decide." 01:56:51.820 |
So you set a new precedent where the president can't nominate a Supreme Court justice in an election year. 01:56:58.540 |
So they block it, and the seat ends up going to Gorsuch. 01:57:04.140 |
Now three years later, it's Trump's last year and it's another election year, and Ginsburg dies. 01:57:12.980 |
They said, "No, actually, we changed our mind. 01:57:17.460 |
So to me, that is classic power games, right? 01:57:20.420 |
There's no actual rule; they did technically have the power to 01:57:23.740 |
block the nomination then, and then they technically had the power to put someone in, and they're 01:57:26.780 |
pretending there's some principle to it, but they're just going for the short-term edge 01:57:31.860 |
at the expense of the workings of the system in the long run. 01:57:36.540 |
And then what do the Democrats have to do in that situation? 01:57:38.860 |
Because both parties have been doing this: they can either lose every time or do the same thing back. 01:57:44.260 |
And now you have a prisoner's dilemma where both end up doing this thing and 01:57:48.860 |
everyone ends up worse off. The debt ceiling, all these power plays that are being made, 01:57:53.380 |
holding the country hostage, this is power games. 01:57:56.740 |
And to me, that's what Goldwater was doing in the '60s, but it was a healthier time in 01:57:59.500 |
a way, because the plurality within the parties reduced some of the national 01:58:04.620 |
tribalism, and there wasn't as much of an appeal to that. 01:58:07.780 |
But today, it's just like do whatever you have to do to beat the enemies. 01:58:11.000 |
And so I'm seeing a rise in power games, and I talk about the Republicans because they 01:58:16.240 |
have been a little bit more egregious, but both parties have been doing it over the years. 01:58:20.200 |
- Can you place blame, or maybe there's a different term for it, at the subsystems of society? 01:58:29.680 |
Is it the politicians, like in the Senate and Congress? 01:58:36.160 |
Or maybe it's us human beings, maybe social media versus mainstream media? 01:58:44.480 |
Is there a sense of where, what is the cause and what is the symptom? 01:58:49.120 |
So Ezra Klein has a great book, Why We're Polarized, where he talks about a lot of this. 01:58:52.280 |
And some of this is really no one's fault. 01:58:56.120 |
First of all, the environment has changed in a bunch of ways you just mentioned. 01:59:00.160 |
And what happens when you take human nature, which is a constant, and you put it into an environment that changes? 01:59:07.240 |
When the environment changes, the dependent variable, the behavior, changes with it, right? 01:59:11.600 |
And so the environment has changed in a lot of ways. 01:59:13.620 |
So one major one is, for a long time, first the Republicans 01:59:22.240 |
and then the Democrats just had a stranglehold on Congress. 01:59:32.040 |
And so, therefore, it actually was a decent environment to compromise in. 01:59:37.440 |
Because what you want is Congress people thinking about their home 01:59:40.680 |
district and voting yes on a national policy because we're going to get a good deal out of it. 01:59:46.640 |
That's actually healthy, as opposed to voting in lockstep together because this is what 01:59:51.880 |
the red party is doing, regardless of what's good for my home district. 01:59:56.720 |
There were certain Republican districts that would have actually been benefited 02:00:01.800 |
by Obamacare, but every Republican voted against it. 02:00:04.720 |
And part of the reason is because there's no longer this obvious majority. 02:00:11.960 |
And that's partially because we've been so subsumed with this one national 02:00:17.060 |
divide of left versus right that people are voting for 02:00:23.840 |
the same party from president all the way down the ticket now. 02:00:27.400 |
And so you have this just kind of 50/50 color war, and that's awful for compromise. 02:00:31.240 |
So there's like 10 of these things, like redistricting, but also social media. 02:00:39.680 |
In the sixties, you had kind of distributed tribalism. 02:00:41.800 |
You had some people that were worked up about the USSR, right? 02:00:48.200 |
You had some people that were saying left versus right, like they are today. 02:00:51.380 |
And then other people were fighting within the party. 02:00:59.080 |
So you kind of got rid of a lot of the in-party fighting. 02:01:01.200 |
And then there hasn't been that big of a foreign threat, nothing like the USSR, for a long time. 02:01:06.960 |
And what's left is just this left versus right thing. 02:01:09.560 |
And so that's kind of this hypercharged whirlpool that subsumes everything. 02:01:14.760 |
And so, yeah, I mean, people point to Newt Gingrich; there's certain 02:01:19.520 |
characters that enacted policies that stoked this kind of thing. 02:01:23.120 |
But I think this is a much bigger kind of environmental shift. 02:01:26.120 |
- Well, that's going back to our questions about the role of individuals in human history. 02:01:30.440 |
So the interesting, one of the many interesting questions here is about Trump. 02:01:37.200 |
Because he seems to be, from the public narrative, such a significant catalyst for some of this division. 02:01:44.440 |
- This goes back to what we were talking about earlier, right? 02:01:48.240 |
I think he's a perfect example of it's a both situation. 02:01:51.320 |
I don't think, if you pluck Trump out of this situation, I don't think that Trump was inevitable. 02:01:56.600 |
But I think we were very vulnerable to a demagogue. 02:02:00.160 |
And if we hadn't been, Trump would have had no chance. 02:02:03.440 |
And why were we vulnerable to a demagogue? Because you have these real problems. 02:02:13.000 |
If you actually look at the stats, it's pretty bad. 02:02:15.080 |
Because it's not just about who voted for Trump. 02:02:17.680 |
A lot of people just vote for the red team, right? 02:02:19.520 |
What's interesting is who voted for Obama against Romney and then voted for Trump? 02:02:33.280 |
Places that had economic despair, where bridges were not working well. 02:02:42.640 |
So I think that you had this, a lot of these kind of rural towns, you have true despair. 02:02:47.880 |
And then you also have the number one indicator of voting for Trump was distrust in media. 02:02:54.580 |
And the media has become much less trustworthy. 02:02:57.880 |
And so you have all these ingredients that actually make us very vulnerable to a demagogue. 02:03:04.840 |
And a demagogue is someone who takes advantage, right? 02:03:06.880 |
There's someone who comes in and says, "I can pull all the right strings and push all 02:03:11.880 |
the right emotional buttons right now and get myself power by taking advantage of the moment." 02:03:20.520 |
It makes me wonder how easy it is for somebody who's a charismatic leader to capitalize on 02:03:27.360 |
cultural resentment when there's economic hardship to channel that. 02:03:32.680 |
So Jonathan Haidt wrote a great article about how, basically, truth is at an all-time low right now. 02:03:41.440 |
MSNBC, Fox News, these are not penalized for being inaccurate. 02:03:45.600 |
They're penalized if they stray from the orthodoxy. 02:03:49.880 |
On social media, it's not the truest tweets that go viral. 02:03:53.240 |
And so Trump understood that better than anyone. 02:03:59.240 |
He was living in the current world when everyone else was stuck in the past. 02:04:06.760 |
Everything he said, truth was not relevant at all. 02:04:11.040 |
It's truly not relevant to him and what he's talking about. 02:04:11.040 |
He doesn't care, and he knew that neither do a subset of the country. 02:04:17.960 |
I was thinking about this, just reading articles by journalists, especially when you're not 02:04:23.960 |
a famous journalist yourself, but you're more like a New York Times journalist. 02:04:30.520 |
So the big famous thing is the institution you're a part of. 02:04:34.400 |
You can just lie, 'cause you're not going to get punished for it. 02:04:38.120 |
You're going to be rewarded for the popularity of an article. 02:04:41.240 |
So if you write 10 articles, there's a huge incentive to just make stuff up. 02:04:49.920 |
And culturally, people will attack that article and say it's dishonest. 02:04:54.480 |
One half of the country will attack that article for being dishonest. 02:05:03.080 |
But there won't be a memory like, "This person made up a lot of stuff in the past." 02:05:06.160 |
No, they'll take one article at a time, and the reputation hits will land on the institution, not on you. 02:05:13.880 |
So for the individual journalist, there's a huge incentive to make stuff up. 02:05:19.000 |
- And it's scary, because it's almost like you can't survive if you're just an old-school, 02:05:24.120 |
honest journalist who really works hard and tries to get it right and does it with nuance. 02:05:27.640 |
What you can be is you can be a big-time substacker or a big-time podcaster. 02:05:31.880 |
A lot of people do have a reputation for accuracy and rigor, and they have huge audiences. 02:05:38.000 |
But if you're working in a big company right now, I think that many of the big media brands have this problem. 02:05:50.640 |
But I will say that the ones that are controlled by the right are even more egregious, not 02:05:54.800 |
just in terms of accuracy, but also in terms of... 02:05:57.680 |
The New York Times, for all of its criticisms, they have a handful of... 02:06:13.400 |
They wrote an article recently criticizing the free speech situation on campuses. 02:06:17.880 |
And they have a couple very left-progressive-friendly conservatives, but they have conservatives writing there. 02:06:30.240 |
Breitbart, you're not seeing thoughtful progressives writing there. 02:06:34.520 |
There's some degree to which the New York Times, I think, still incentivizes and values 02:06:42.760 |
So you're allowed to have a conservative opinion if you do a really damn good job. 02:06:53.080 |
And if you kind of pander to the progressive sensibilities in all the right ways. 02:06:58.400 |
I always joke that TED, they always have a couple token conservatives, but they get on 02:07:02.760 |
stage and they're basically like, "So totally, you're all... 02:07:06.920 |
Progressivism's right about all of this, but maybe libertarianism isn't all bad..." 02:07:18.520 |
I think you can see the New York Times tug of war, the internal tug of war. 02:07:22.440 |
You can see it, because then they also have these awful instances, like the firing of... 02:07:35.520 |
In the '70s, you had these three news channels, and they weren't always right, and they definitely 02:07:40.880 |
sometimes spun a narrative together, maybe about Vietnam or whatever. 02:07:44.160 |
But if one of them was just lying, they'd be embarrassed for it. 02:07:49.720 |
They'd be dinged, and they'd be known as, "This is the trash one." 02:07:52.280 |
And that would be terrible for their ratings, because they weren't just catering to half 02:07:57.200 |
So both on the axis of accuracy and on the axis of neutrality, they had to try to stay 02:08:04.480 |
somewhere in the reasonable range, and that's just gone. 02:08:08.920 |
One of the things I'm really curious about is... 02:08:14.400 |
I'm very curious to see how the book is written about by the press, because I could see... 02:08:19.760 |
I could myself write, with the help of ChatGPT, of course, clickbait articles in no time. 02:08:27.000 |
Your whole book is beautifully written for clickbait articles. 02:08:30.520 |
If any journalists out there need help, I can write the most atrocious criticisms. 02:08:43.680 |
So speaking of which, you write about social justice. 02:08:48.840 |
You write about two kinds of social justice, liberal social justice and SJF, social justice fundamentalism. 02:08:59.400 |
So the term "wokeness" is so loaded with baggage. 02:09:01.400 |
It's kind of like mocking and derogatory, and I was trying not to do that in this book. 02:09:06.800 |
If it's a term loaded with baggage, you're already kind of... 02:09:09.760 |
You're from the first minute, you're already behind. 02:09:13.840 |
So to me, also, when people say "wokeness is bad," "social justice is bad," they're 02:09:22.800 |
throwing the baby out with the bathwater, because the proudest tradition in the US is liberal social justice. 02:09:30.400 |
And what I mean by that, again, is liberal with a lowercase l. 02:09:37.620 |
So Martin Luther King, classic example, his "I Have a Dream" speech, he says stuff like 02:09:41.640 |
"This country has made a promise to all of its citizens, and it has broken that promise 02:09:54.200 |
In other words, liberalism, the Constitution, the core ideals, those are great. 02:10:02.900 |
So civil disobedience, the goal of it wasn't to hurt liberalism. 02:10:06.520 |
It was to specifically break the laws that were already violating liberalism, 02:10:11.480 |
to expose that this is illiberal, that the 02:10:16.080 |
Constitution should not have people of different skin color sitting in different parts of the bus. 02:10:20.960 |
And so it was really patriotic, the Civil Rights Movement. 02:10:25.600 |
It was saying, "Liberalism is this beautiful thing, and we need to do better at it." 02:10:32.840 |
And it used the tools of liberalism to try to improve the flaws that were going on. 02:10:47.600 |
And what were the leftists doing in the '60s on Berkeley campus? 02:10:51.200 |
They were saying, "We need more free speech," because that's what liberal social justice 02:10:57.640 |
But you can also go back to the '20s, women's suffrage. 02:11:00.800 |
Emancipation. America obviously has all of its ugly history. 02:11:05.200 |
These are all ugly things that it had to get out of, but it got out of them one by one, using liberal tools. 02:11:12.480 |
And liberal social justice basically is the practice of saying, "Where are we not being liberal enough?" 02:11:20.920 |
So that's the idea of liberalism that permeates the history of the United States. 02:11:26.120 |
You have so many good images in this book, but one of them is highlighting the interplay 02:11:32.320 |
of different ideas over the past, let's say, 100 years. 02:11:36.900 |
So liberalism is on one side, there's that thread. 02:11:40.360 |
There's Marxism on the other, and then there's postmodernism. 02:11:47.080 |
So it's interesting, because Marxism and all of its various descendants, and obviously 02:11:53.720 |
there's a lot of things that are rooted in Marxism that aren't the same thing as what Marx wrote, 02:12:06.360 |
they actually think the opposite of what Martin Luther King and other people in the 02:12:16.040 |
civil rights and other movements thought. 02:12:18.840 |
He thinks liberalism is good, we need to preserve it. 02:12:23.360 |
The Marxist thinks these other problems with racism and inequality that we're seeing are the inevitable results of liberalism. 02:12:30.880 |
Liberalism is a rigged game, and it's just the power games in disguise. 02:12:35.720 |
There's the upper people that oppress the 02:12:39.400 |
lower people, and they convince the lower people, it's all about false consciousness, 02:12:43.320 |
they convince the lower people that everything is fair. 02:12:46.200 |
Now the lower people vote against their own interests, and they work to preserve the system that oppresses them. 02:12:52.800 |
It's much more revolutionary: we need to actually overthrow liberalism. 02:12:58.840 |
People think, oh, what we call wokeness is just normal social justice activism. 02:13:06.040 |
No, no, it's the polar opposite, polar opposite. 02:13:12.800 |
Now postmodernism is this term that is super controversial, and I don't think anyone calls 02:13:18.080 |
themselves a postmodernist, so take all of this with a grain of salt in terms of the labels. 02:13:23.600 |
The definition of radical to me is how deep you want change to happen. 02:13:30.600 |
A liberal progressive and a conservative will disagree about policies, 02:13:35.880 |
the liberal progressive wants to change a lot of policies, change, change, change, and 02:13:41.760 |
the conservative wants to keep things the way they are. 02:13:45.040 |
But they're both conservative when it comes to liberalism beneath it, the liberal foundation 02:13:50.880 |
of the country; they both become conservatives about that. 02:13:55.400 |
The Marxist is more radical because they want to go one notch deeper and actually overthrow liberalism. 02:14:01.640 |
Now what's below liberalism is kind of the core tenets of modernity, this idea of reason 02:14:09.320 |
and the notion that there is an objective truth, and science, the scientific method. 02:14:17.560 |
These things are actually beneath liberalism, and even the Marxists, if you look at the Frankfurt 02:14:20.320 |
School, these post-Marxist thinkers and Marx himself, they were not anti-science. 02:14:29.960 |
They actually wanted to preserve modernity, but they wanted to get rid of liberalism on top of it. 02:14:34.480 |
The post-modernist is even more radical, because they want to actually go down to the bottom floor. 02:14:38.680 |
They think science itself is a tool of oppression, they think it's a tool where oppression kind 02:14:44.400 |
of flows through, they think that the white western world has invented these concepts, 02:14:50.760 |
claiming that there's an objective truth and that there's reason and science, and they 02:14:54.320 |
think all of that is just one meta-narrative, and it goes a long way to serve the interests of the powerful. 02:15:00.800 |
- So in the sense that it's almost caricatured, but that is, to the core, their belief, that even math is oppressive? 02:15:09.480 |
- Not the education of math, but literally math, the mathematics. 02:15:13.760 |
- The notion in math that there's a right answer and a wrong answer, that they believe 02:15:18.200 |
is a meta-narrative that serves white supremacy, or the post-modernist might have said it serves the powerful. 02:15:27.080 |
So what social justice fundamentalism is, is you take the Marxist thread that has been 02:15:31.720 |
going on in lots of countries, and whoever the upper and lower is, that's what they all 02:15:38.360 |
have in common, but the upper and lower for Marx was the ruling class and the oppressed class. 02:15:47.280 |
But you come here and the economic class doesn't resonate as much here as it did maybe in some 02:15:53.280 |
of those other places, but what does resonate here in the '60s and '70s is race and gender 02:15:58.440 |
and these kind of social justice disagreements. 02:16:01.760 |
And so what social justice fundamentalism is, is basically this tried and true framework 02:16:07.320 |
of this Marxist framework, kind of with a new skin on it, which is American social justice, 02:16:16.200 |
and then made even more radical with the infusion of post-modernism, where not just is liberalism 02:16:21.600 |
bad, but actually, like you said, math can be racist. 02:16:25.360 |
So it's this kind of philosophical Frankenstein, this stitching together of these threads, and so again, 02:16:34.120 |
they wear the same uniform as the liberal social justice. 02:16:36.160 |
They say social justice, racial equality, but it has nothing to do with liberal social justice. 02:16:42.680 |
It is directly opposed to liberal social justice. 02:16:44.480 |
- It's fascinating, the evolution of ideas, if we ignore the harm done by it, it's fascinating 02:16:51.120 |
how humans get together and evolve these ideas. 02:16:53.280 |
So as you show, Marxism is the idea that society is a 02:17:01.920 |
zero-sum struggle between the ruling class and the working class, with power being exerted from above. 02:17:08.160 |
Then you add critical theory, Marxism 2.0, on top of that, and you add politics and culture. 02:17:17.040 |
And then on top of that, for postmodernism, you add science, you add morality, basically everything. 02:17:22.600 |
- The stitched-together Frankenstein, and if you notice, which is not necessarily bad, 02:17:26.560 |
but in this case, I think it's actually violating the Marxist tradition by being anti-science. 02:17:32.080 |
And it's violating the postmodernism, because postmodernists were radical skeptics, 02:17:37.240 |
not just of the way things were, but of their own ideas. 02:17:42.240 |
And social justice fundamentalism is not at all self-critical. 02:17:48.840 |
It says that we have the answers, which is the opposite of what postmodernists would 02:17:54.160 |
And it's also violating, of course, the tradition of liberal social justice in a million ways. 02:18:01.720 |
Meanwhile, liberal social justice doesn't have a Frankenstein. 02:18:05.720 |
It's a crisp ideology that says, we're trying 02:18:10.880 |
to get to a more perfect union, they're trying to keep the promises made in the Constitution. 02:18:21.440 |
- So you write, "My big problem with social justice fundamentalism isn't the ideology itself. 02:18:27.160 |
It's what scholars and activists started to do sometime around 2013, when they began 02:18:32.000 |
to wield a cudgel that's not supposed to have any place in a country like the US." 02:18:40.080 |
- Well, to be clear, I don't like the ideology. 02:18:45.600 |
I think it's morally inconsistent; you know, it flip-flops on its morals depending on the situation. 02:18:54.400 |
I think it's full of inaccuracies and kind of can't stand up to debate. 02:19:01.000 |
So I think it's low-rung, but there's a ton of low-rung ideologies I don't like. 02:19:06.160 |
I don't like a lot of political doctrines, right? 02:19:08.960 |
The US is a place inherently that is a mishmash of a ton of ideologies, and I'm not going 02:19:14.320 |
to like two-thirds of them at any given time. 02:19:16.360 |
So my problem, the reason I'm writing about this, is not because I'm like, "By the way, here's an ideology I don't like." 02:19:23.440 |
The reason that it must be written about right now, this particular ideology, is because of what it's doing. 02:19:31.680 |
If you want to be a hardcore, you know, evangelical Christian, the US says, "Live and let live." 02:19:39.440 |
Not only are you allowed to have an echo chamber of some kind, it's actively protected here. 02:19:46.400 |
Now, if the evangelical Christian started saying, "By the way, anyone who says anything 02:19:51.080 |
that conflicts with evangelical Christianity is going to be severely socially punished, 02:19:56.280 |
and they have the cultural power to do so," which they don't in this case. 02:20:00.200 |
They might like to, but they don't have the power. But imagine they're able to get anyone fired 02:20:03.920 |
who they want, and they're able to actually change the curriculum in all these schools 02:20:07.760 |
and classes to suddenly not conflict with their doctrine, no more evolution in the textbooks. 02:20:13.360 |
Now I would write a book about evangelical Christianity, because that's what every liberal should push back on. 02:20:20.120 |
Regardless of what you think of the actual horizontal beliefs, it doesn't matter what they 02:20:24.480 |
believe when they start violating live and let live and shutting down other segments of society. 02:20:33.000 |
It's not the best analogy, but an echo chamber is like a benign tumor. 02:20:41.040 |
What you have to watch out for is a tumor that starts to metastasize, starts to forcefully spread. 02:20:48.040 |
That's what this particular ideology has been doing. 02:20:51.840 |
- Do you worry about it as an existential threat to liberalism in the West, in the United States? 02:21:03.680 |
Is it a problem, or is it the biggest problem that's threatening all of human civilization? 02:21:15.880 |
If it turns out in 50 years someone says it actually was, I wouldn't be shocked, 02:21:15.880 |
but I also wouldn't bet on that, because there's a lot of problems. 02:21:26.640 |
It is popular to say that kind of thing, though, and it's less popular to say the same thing 02:21:32.400 |
about AI or nuclear weapons, which worries me, because I'm more worried about nuclear weapons and AI. 02:21:42.120 |
- So I've gotten, I've had probably a thousand arguments about this. 02:21:45.520 |
That's one nice thing about spending six years procrastinating on getting a book done is 02:21:49.640 |
you end up battle-testing your ideas a million times. 02:21:52.080 |
So I've heard this one a lot, which is there's kind of three groups of former Obama voters. 02:22:02.760 |
And the third is what you just said, which is sure, wokeness is over the top. 02:22:08.320 |
You're not woke, but you think the anti-woke people have totally lost their minds, and it's overblown. 02:22:16.000 |
Now here's why I disagree with that, because it's not wokeness itself. 02:22:22.920 |
It's that a radical political movement, of which there will always be a lot in the country, 02:22:30.960 |
has managed to do something that a radical movement is not supposed to be able to do 02:22:34.520 |
in the US, which is they've managed to hijack institutions all across the country and hijack 02:22:44.200 |
medical journals, and universities, and the ACLU, all the activist organizations, and non-profits. 02:23:01.760 |
So it's not that I think this thing is so bad. 02:23:06.000 |
The reason Trump scares me is not because Trump's so bad. 02:23:08.440 |
It's that it reveals that we were vulnerable to a demagogue candidate. 02:23:14.960 |
And what wokeness reveals to me is that we are currently, and until something changes, 02:23:18.520 |
will continue to be vulnerable to a bully movement, and a forcefully expansionist movement 02:23:29.120 |
that wants to actually destroy the workings, the liberal gears, and tear them apart. 02:23:38.360 |
And so here's the way I view a liberal democracy is it is a bunch of these institutions that 02:23:43.440 |
were trial and error crafted over hundreds of years. 02:23:48.000 |
And they all rely on trust, public trust, and a certain kind of feeling of unity that 02:23:53.440 |
actually is critical to a liberal democracy's functioning. 02:23:57.280 |
And what I see this thing as is a parasite on that. And I'm not saying 02:24:03.560 |
each individual in this is bad; I don't think they're bad people. 02:24:07.000 |
I think that the ideology itself has the property that its goal is to tear apart 02:24:12.600 |
the pretty delicate workings of the liberal democracy and shred the critical lines of trust. 02:24:18.840 |
And so you talk about AI, and you talk about all these other big problems, nuclear, right? 02:24:23.280 |
The reason I wrote about this, and I like writing about that stuff a lot more than I like writing about politics, 02:24:26.840 |
that was the fun topic for me, is because I realized that all of those things, if we're 02:24:32.600 |
going to have a good future with those things, and they're actually threats, like I said, 02:24:35.880 |
we need to have our wits about us, and we need the liberal gears and levers working. 02:24:43.720 |
And so if something's threatening to undermine that, it affects everything else. 02:24:48.440 |
- We need to have our scientific mind about us, about these foundational ideas. 02:24:53.120 |
But I guess my sense of hope comes from observing the immune system respond to wokeism. 02:25:00.680 |
There seems to be a pro-liberalism immune system. 02:25:05.800 |
And not only that, so like there's intellectuals, there's people that are willing to do the work. 02:25:14.200 |
And there is a hunger for that, such that those ideas can become viral, and they take hold. 02:25:20.120 |
So I just don't see a mechanism by which wokeism accelerates, like exponentially, and takes over. 02:25:29.320 |
It feels like as it expands, the immune system responds. 02:25:34.480 |
The immune system of liberalism, of basically a country, at least in the United States, 02:25:40.400 |
that still ultimately, at the core of the individual, values the freedom of speech, just freedoms in general. 02:25:50.720 |
- So to me it is like a virus in an immune system. 02:25:54.120 |
And I totally agree, I see the same story happening. 02:25:57.440 |
I'm sitting here rooting for the immune system. 02:26:03.000 |
So a liberal democracy is always gonna be vulnerable to a movement like this, right? 02:26:10.880 |
Because it's not a totalitarian dictatorship. 02:26:13.480 |
Because if you can socially pressure people to not say what they're thinking, 02:26:19.680 |
you can break the liberalism of the liberal democracy quite easily, and suddenly a lot is lost. 02:26:25.360 |
On the other hand, the same vulnerability, the same system that's vulnerable to that, has an advantage. 02:26:34.240 |
Because now the Maoists, right, similar kind of vibe. 02:26:37.600 |
They were saying that science is evil, and that the intellectuals are the enemy, all of this stuff. 02:26:49.560 |
And they had the hard cudgel in their hand, right? 02:26:58.960 |
And you can conquer a country with the hard cudgel. 02:27:03.200 |
So what they have is a soft cudgel, which can have the same effect initially. 02:27:10.200 |
They maybe can't imprison them and murder them. 02:27:11.720 |
But if you can socially ostracize them and get them fired, that basically is gonna have the same effect. 02:27:16.380 |
So the soft cudgel can have the same effect for a while. 02:27:19.220 |
But the thing is, it's a little bit of a house of cards, because it relies on fear. 02:27:25.360 |
And as soon as that fear goes away, the whole thing falls apart, right? 02:27:31.320 |
The soft cudgel requires people to be so scared of getting canceled or getting whatever. 02:27:37.540 |
And as soon as some people start speaking up-- you know, Tobi Lütke of Shopify, I always think about him. 02:27:42.280 |
He just said, you know what, I'm not scared of this soft cudgel, and spoke up and said, 02:27:45.880 |
we're not political at this company, and we're not a family, we're a team, and we're gonna-- 02:27:52.620 |
It seems like a fascinating-- - He's amazing. 02:27:54.740 |
- He spoke up, he's saying that we're not gonna-- - He's one of the smartest and kindest 02:27:58.980 |
dudes, but he also has courage at a time when it's hard. 02:28:03.060 |
But here's the thing: you need so much less courage against 02:28:07.140 |
a soft cudgel than you do-- the Iranians throwing their hijabs into the fire, those people's 02:28:12.540 |
courage just blows away any courage we have here, 'cause they might get executed. 02:28:19.020 |
That's the thing, is that you can actually have courage right now, and it's-- 02:28:31.300 |
And you talk about two things to fight this: awareness and courage. 02:28:39.380 |
The awareness piece is first just understanding the stakes, like getting our heads out of 02:28:47.500 |
the sand and being like, technology's blowing up exponentially, our society's trust is devolving, 02:28:53.340 |
like we're kind of falling apart in some important ways, we're losing our grip on some stability 02:28:58.180 |
at the worst time, that's the first point, just the big picture. 02:29:01.660 |
And then also, awareness of, I think, this vertical axis, or whatever your version of 02:29:05.180 |
it is, this concept of, how do I really form my beliefs, where do they actually come from? 02:29:12.460 |
Are they someone else's beliefs, am I following a checklist? 02:29:19.300 |
I used to identify with the blue party or the red party, but now they've changed, and 02:29:23.620 |
I suddenly am okay with that. Is that because my values changed with it, or am I just following along? 02:29:30.780 |
Asking yourself these questions, looking for: where do I feel disgusted by fellow human beings? 02:29:38.100 |
Maybe I'm being a crazy tribal person without realizing it. 02:29:41.020 |
How about the people around me, am I being bullied by some echo chamber without realizing it? 02:29:48.220 |
So that's the first, I think just to kind of do a self-audit, and I think that just 02:29:57.220 |
some awareness like that, just a self-audit about these things can go a long way, but 02:30:02.460 |
if you keep it to yourself, it's almost useless, because awareness without courage doesn't do much. 02:30:10.780 |
So courage is when you take that awareness and you actually export it out into the world, 02:30:17.420 |
And so courage can happen on multiple levels. 02:30:19.180 |
It can happen by, first of all, just stop saying stuff you don't believe. 02:30:23.220 |
If you're being pressured by a kind of an ideology or a movement to say stuff that you 02:30:27.980 |
don't actually believe, just stop, just stay on your ground and don't say anything. 02:30:43.740 |
And it's not every group, but you'd be surprised. 02:30:46.700 |
And then eventually, maybe start speaking out in bigger groups. 02:30:54.060 |
Look, some people will lose their jobs for it. 02:30:57.620 |
Most people won't lose their jobs, but they have the same fear as if they would, right? 02:31:02.140 |
And it's like, what, are you going to get criticized? 02:31:03.140 |
Or are a bunch of, you know, angry Twitter people going to criticize you? 02:31:09.060 |
Like, yeah, it's not pleasant, but actually that's a little bit like our primitive mind's 02:31:13.420 |
fear; back when it was programmed, that kind of ostracism or criticism could leave you dead. 02:31:24.780 |
And the people who realize that can exercise incredible leadership right now. 02:31:28.620 |
- So you have a really interesting description of censorship, and of self-censorship also, as a gap. 02:31:38.940 |
And this gap... I hope you write even more than you've written in the 02:31:44.180 |
book about these ideas, because it's so strong. 02:31:47.100 |
These censorship gaps that are created between the dormant thought pile and the kind of thing people actually say out loud. 02:31:56.940 |
So first of all, I think a useful tool is this thing called 02:32:01.780 |
a thought pile, which is, on any given issue, you have a horizontal spectrum, 02:32:07.620 |
and say I could take your brain out of your head and put it on the spectrum 02:32:11.300 |
right where your belief happens to fall on that issue. 02:32:14.300 |
Now I did that for everyone in the community or in a society. 02:32:17.740 |
And you're gonna end up with a big mushy pile that I think will often form a bell curve. 02:32:21.220 |
If it's really politicized, it might form like a camel with two humps, because it's polarized. 02:32:26.020 |
But for a typical issue, say fear of AI, it'll just form a bell curve. 02:32:32.740 |
Now the second thing is a line that I call the speech curve, which is what people are actually saying out loud. 02:32:38.060 |
So the speech curve is high when not just a lot of people are saying it, but it's being 02:32:40.860 |
said from the biggest platforms, being said in the New York Times, and so on. 02:32:49.140 |
Those things are the top of the speech curve. 02:32:52.060 |
And then when the speech curve's lower, it means it's being said either whispered in 02:32:55.580 |
small groups or it's just not very many people are talking about it. 02:32:58.540 |
Now, when a free speech democracy is healthy on a certain topic, you've got 02:33:04.820 |
the speech curve sitting right on top of the thought pile. 02:33:07.180 |
They mirror each other, which is naturally what would happen. 02:33:09.940 |
More people think something, it's gonna be said more often and from higher platforms. 02:33:14.700 |
What censorship does, and censorship can be from the government, so I use the tale of King Mustache. 02:33:20.020 |
King Mustache, he's a little tiny tyrant, and he's very sensitive, and people are making 02:33:25.620 |
fun of his mustache, and they're saying he's not a good king, and he does not like that. 02:33:29.540 |
He enacts a policy, and he says, "Anyone who is heard criticizing me or my mustache will be hanged." 02:33:39.460 |
And immediately the town reacts, because his father was very liberal, and there was always free speech. 02:33:46.460 |
But now King Mustache has taken over, and he's saying these are the new rules now. 02:33:49.580 |
And so a few people yell out, and they say, "That's not how we do things here." 02:33:52.940 |
And that moment is what I call a moment of truth. 02:33:56.300 |
Did the king's guards stand with the principles of the kingdom and say, "Yeah, King Mustache, 02:34:00.300 |
that's not what we do," in which case he would kind of have to, he's nothing he can do. 02:34:05.500 |
So in this case, it's as if he laid down an electric fence over a part of the thought pile 02:34:09.860 |
and said, "No one's allowed to speak over here." 02:34:11.660 |
Maybe people will still think these things, but the speech curve cannot go there. 02:34:16.980 |
But the electric fence wasn't actually electrified until the king's guards, in a moment of truth, 02:34:21.820 |
get scared and say, "Okay," and they hang the five people who spoke out. 02:34:25.160 |
So in that moment, that fence just became electric. 02:34:28.380 |
And now no one criticizes King Mustache anymore. 02:34:32.660 |
Now, of course, he has a hard cudgel because he can execute people. 02:34:36.100 |
But now when we look at the US, what you're seeing right now is a lot of pressure, the same thing. 02:34:41.780 |
An electric fence is being laid down saying, "No one can criticize these ideas. 02:34:45.500 |
And if you do, you won't be executed, you'll be canceled." 02:34:52.980 |
Now, the mob doesn't work at the company, they can't fire you. 02:34:56.100 |
But they can start a Twitter mob when someone violates 02:35:00.380 |
that speech rule, and then the leadership at the company has the moment of truth. 02:35:07.100 |
And what the leaders should do is stand up for their company's values, which is almost 02:35:11.780 |
always in favor of the employee and say, "Look, even if they made a mistake, people make mistakes, 02:35:17.500 |
Or maybe that person actually said something that's reasonable and we should discuss it. 02:35:20.340 |
But either way, we're not going to fire them." 02:35:22.540 |
And if they say no, what happens is the Twitter mob actually can't execute the threat. 02:35:28.780 |
And the fence is proven to have no electricity. 02:35:29.780 |
And that's been the problem of the past few years: what's happened again and again 02:35:33.020 |
is the leader gets scared of the Twitter mob and they fire them. 02:35:39.620 |
And now, actually, if you cross that fence, it's not just a threat. 02:35:52.500 |
Now what happens when an electric fence goes up and it's proven to actually be electrified? 02:35:56.300 |
The speech curve morphs into a totally different position. 02:35:59.860 |
And now these new people say, instead of having the kind of marketplace of ideas that turns 02:36:04.000 |
into a kind of a natural bell curve, they say, "No, no, no, these are the rules now." 02:36:11.440 |
It's the rules of their own echo chamber that they're now applying to everyone. 02:36:16.540 |
And so you end up with now instead of one region, which is a region of kind of active 02:36:20.820 |
communal thinking, what people are thinking and saying, you now have three regions. 02:36:25.380 |
You have a little active communal thinking, but mostly you now have this dormant thought 02:36:28.780 |
pile, which is all these opinions that suddenly everyone's scared to say out loud. 02:36:32.580 |
Everyone's thinking, but they're scared to say it out loud. 02:36:35.580 |
And then you have this other region, which is the approved ideas of this now-dominant cultural orthodoxy. 02:36:43.340 |
And those are being spoken from the largest platforms, and they're being repeated by the 02:36:46.860 |
president, and they're being repeated all over the place, even though people don't believe them. 02:36:53.700 |
And what happens is the society becomes really stupid because active communal thinking is 02:36:58.420 |
the region where we can actually think together. 02:37:01.700 |
And it gets siloed into small private conversations. 02:37:06.020 |
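Tim's thought-pile and speech-curve model is concrete enough to sketch numerically. Below is a minimal Python sketch under assumptions of mine rather than the book's (a normal-shaped thought pile and an arbitrary fence position are illustrative choices): a healthy speech curve mirrors the thought pile, an electrified fence zeroes out speech past a cutoff, and the gap between the two curves is the dormant thought pile.

```python
import numpy as np

# Horizontal spectrum of positions on one issue.
x = np.linspace(-3, 3, 601)

# Thought pile: a bell curve of what people privately believe.
thought_pile = np.exp(-x**2 / 2)

# Healthy free-speech democracy: the speech curve mirrors the thought pile.
speech_healthy = thought_pile.copy()

# Electrified fence at x = 1: no public speech to the right of it,
# no matter how many people privately hold those views.
fence = 1.0
speech_censored = np.where(x < fence, thought_pile, 0.0)

# Dormant thought pile: everything thought but no longer said.
dormant = thought_pile - speech_censored

# Share of all thought that has gone dormant.
print(f"Dormant share: {dormant.sum() / thought_pile.sum():.1%}")
```

Plotting speech_censored against thought_pile makes the censorship gap visible: the region between the two curves is exactly the dormant thought pile described above.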
It's really powerful what you said about institutions and so on. 02:37:08.780 |
It's not trivial from a leadership position to be like, "No, we defend the employee, 02:37:14.020 |
the person who is with us," because there's no actual ground 02:37:24.460 |
to whatever violation we're hearing about. 02:37:28.340 |
It's ultimately up to the leader, I guess, of a particular institution or a particular company. 02:37:35.340 |
If it were easy, there wouldn't be all of these failings. 02:37:40.300 |
And by the way, that's the immune system failing. 02:37:42.960 |
That's the liberal immune system of that company failing, but it's also then an example that scares everyone else. 02:37:51.980 |
Of course it's not easy, because we have primitive minds that are wired to care so much about social approval. 02:37:58.140 |
First of all, we're scared that it's gonna start a... 02:38:03.500 |
They don't just say, "I'm gonna criticize you. 02:38:05.220 |
I'm gonna criticize anyone who still buys your product. 02:38:08.540 |
I'm gonna criticize anyone who goes on your podcast." 02:38:12.060 |
It's now suddenly, if Lex becomes tarnished enough... 02:38:16.240 |
Now I go on the podcast and people are saying, "Oh, I'm not buying his book." 02:38:23.400 |
You've been smeared, and we're in such a bad time that the smear travels to me. 02:38:27.360 |
And now meanwhile, someone who buys my book and tries to share it gets told, "You're one of them." 02:38:42.040 |
But we're wired to care about it so much because it meant life or death back in the day. 02:38:51.200 |
We can probably smear each other in this conversation. 02:39:00.640 |
What do you think about freedom of speech as a term and as an idea, as a way to resist 02:39:04.740 |
this mechanism of the dormant thought pile and artificially generated speech? 02:39:10.680 |
This ideal of the freedom of speech, and protecting speech, and celebrating speech. 02:39:15.880 |
Well, so this is kind of the point I was talking about earlier, about how King Mustache made a rule. 02:39:28.480 |
One of the amazing things about your book, as you get later and later in the book, you 02:39:32.200 |
cover more and more difficult issues as a way to illustrate the importance of the vertical 02:39:37.400 |
But there's something about using hilarious drawings throughout that makes it much more 02:39:45.120 |
fun, and it takes you away from the personal somehow. 02:39:47.840 |
And you start thinking in the space of ideas versus outside of the tribal type of thinking. 02:39:57.660 |
I recommend it to anyone who writes controversial books: have hilarious drawings. 02:40:01.240 |
Put a silly stick figure in your thing and it lightens the mood... 02:40:07.200 |
- It reminds people that we're all friends here. 02:40:11.800 |
Let's laugh at ourselves, laugh at the fact that we're in a culture war a little bit and 02:40:16.680 |
now we can talk about it as opposed to getting religious about it. 02:40:21.400 |
But basically, King Mustache had no First Amendment. 02:40:24.040 |
His was government censorship, which is very common around the world. 02:40:32.200 |
You can argue there's some controversial things recently, but basically in the US, the First Amendment holds. 02:40:39.360 |
No one is being arrested for saying the wrong thing, but this graph distortion is still happening. 02:40:44.680 |
And so freedom of speech, what people like to say is, if someone's complaining about 02:40:52.960 |
cancel culture and saying, "This is anti-free speech," people like to point out, "No, it's 02:40:59.720 |
not a First Amendment issue. The government's not arresting you for anything. 02:41:03.980 |
This is called criticism: you're putting your ideas out and you're getting criticized." 02:41:12.120 |
And this is not making a critical distinction between cancel culture and criticism culture. 02:41:19.640 |
Criticism culture is a little bit of this kind of high-rung idea lab stuff we talked 02:41:24.400 |
about. Criticism culture attacks the idea and encourages further discussion. 02:41:42.300 |
Criticism culture says, "Here's why this idea is so bad. 02:41:45.100 |
Cancel culture says, "Here's why this person is bad and no one should talk to them and 02:41:52.740 |
It makes everyone scared to talk, and it's the opposite. 02:41:58.100 |
But First Amendment plus cancel culture equals you might as well be in King Mustache's kingdom. 02:42:04.900 |
First Amendment plus criticism culture, great. 02:42:07.020 |
Now you have this vibrant marketplace of ideas. 02:42:13.660 |
And so when people criticize the cancel culture, and then someone says, "Oh, see, you're doing the same thing. 02:42:18.740 |
Look, you're doing the cancel culture yourself." 02:42:24.220 |
Every good liberal, and I mean that in the lowercase sense, meaning anyone who believes 02:42:28.520 |
in liberal democracies, regardless of what they believe, should stand up and say no to 02:42:32.700 |
cancel culture and say, "This is not okay," regardless of what the actual topic is. 02:42:37.580 |
And that makes them a good liberal versus if they're trying to cancel someone who's 02:42:41.440 |
just criticizing, they're doing the opposite. 02:42:43.320 |
Now they're shutting down speech themselves; it's the opposite thing. 02:42:46.420 |
You can see people take advantage of the conflation, and sometimes they just don't know it themselves. 02:42:55.420 |
Without that wording, suddenly it looks like someone who's criticizing cancel culture is the one being illiberal. 02:43:02.340 |
- You apply this thinking to universities in particular. 02:43:07.580 |
There's a great, yet another great image on the trade-off between knowledge and conviction. 02:43:14.980 |
And it's what's commonly... actually, you can maybe explain to me the difference, but 02:43:20.300 |
it's often referred to as the Dunning-Kruger effect, where, when you first learn 02:43:24.420 |
of a thing, you have an extremely high confidence in your self-estimation of how well you understand it. 02:43:32.020 |
You actually say that Dunning-Kruger means something else. 02:43:34.300 |
- So yeah, when I posted this, everyone's like, "Dunning-Kruger," and it's what everyone thinks Dunning-Kruger is. 02:43:41.780 |
You have a diagonal line like this one, which is, wherever you are on it, 02:43:49.580 |
it's exactly the right level of humility based on what you know. 02:43:54.860 |
When you're below the line, you don't have enough confidence, because you know more than you're giving yourself credit for. 02:43:58.420 |
And when you're above the line, you're in the arrogance zone, right? 02:44:05.420 |
And Dunning-Kruger is basically a straight line that just has a lower slope. 02:44:09.620 |
So you still are getting more confident as you go along, but you start 02:44:17.740 |
off above that line, and as you learn more, you end up below the line later. 02:44:28.860 |
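To make the two lines being contrasted concrete, here is a tiny Python sketch with made-up illustrative numbers (the 0.4 intercept and 0.5 slope are assumptions, not from the book or the original paper): perfect calibration is confidence equal to knowledge, while the Dunning-Kruger-style line rises more slowly, so it starts above the calibration line and ends below it.

```python
import numpy as np

# Knowledge on a 0-to-1 scale.
k = np.linspace(0, 1, 11)

# Perfect calibration: confidence exactly matches knowledge.
calibrated = k

# Dunning-Kruger-style line: confidence still grows with knowledge,
# but with a shallower slope, so novices land above the calibration
# line (arrogance zone) and experts land below it (humility zone).
dk = 0.4 + 0.5 * k

for ki, ci, di in zip(k, calibrated, dk):
    if np.isclose(di, ci):
        zone = "calibrated"
    elif di > ci:
        zone = "arrogance zone"
    else:
        zone = "humility zone"
    print(f"knowledge={ki:.1f}  calibrated={ci:.2f}  DK={di:.2f}  {zone}")
```

Running this shows the crossover: below the crossing point the shallow line sits above the diagonal (overconfidence), and past it the same line sits below (underconfidence).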
So this idea, for people just listening: there's a child's hill, where you're pretty damn sure you know it all. 02:44:39.180 |
And then there's an insecure canyon, where you crash down, acknowledging that you don't know that much. 02:44:50.300 |
Where, after you feel ashamed and embarrassed about not knowing that much, you begin to 02:44:54.860 |
realize that knowing how little you know is the first step in becoming someone who actually knows something. 02:45:05.940 |
You're saying that in universities, we're pinning people at the top of the child's hill. 02:45:14.020 |
I think of myself with this, because I went to college, like a lot of 18-year-olds, and I thought I knew everything. 02:45:22.460 |
And when it came to politics, I was like bright blue, just because I grew up in a bright blue 02:45:26.500 |
suburb and I wasn't thinking that hard about it. 02:45:30.420 |
And what I did when I went to college is I met a lot of smart conservatives and a lot of smart people with other views. 02:45:35.420 |
But I met a lot of people who weren't just going down a checklist and they knew stuff. 02:45:39.820 |
And suddenly I realized that a lot of these views I have are not based on knowledge. 02:45:49.020 |
It's just, everyone else thinks that's true, so now I think it's true. 02:45:51.900 |
I'm actually like, I'm transferring someone else's conviction to me. 02:45:57.780 |
They might have conviction because they're transferring from someone else. 02:46:02.260 |
Why am I giving away my own independent learning abilities here and just adopting other views? 02:46:14.500 |
And it wasn't just about politics, by the way. 02:46:16.580 |
It was that I had strong views about a lot of stuff and I just, I got lucky, or not lucky, 02:46:21.460 |
I sought out, you know, the kind of people who love to disagree and debate. 02:46:29.220 |
And so you're quickly in, you know, again, an idea lab culture. 02:46:33.220 |
And also, I started getting in the habit, I started loving listening to 02:46:36.620 |
people who disagreed with me, because it was so exhilarating listening to a smart person 02:46:39.900 |
argue a side I thought there was no credence to, right? 02:46:46.900 |
I wanted to see an Intelligence Squared on that debate. 02:46:49.220 |
I wanted to go see, I actually got into Intelligence Squared in college. 02:46:52.100 |
I wanted to see a smart person who disagrees with me talk. 02:47:02.820 |
That shoved me down the humble tumble here, number three. 02:47:06.340 |
It shoved me down, and then I went the other way, where I realized 02:47:09.740 |
that a lot of my identity had been based on this faux feeling of knowledge. 02:47:17.100 |
Now that I didn't have that, I felt really dumb, and I felt almost embarrassed. 02:47:22.780 |
And so that's where I call this insecure canyon. 02:47:24.340 |
I think it's sometimes like that when you're so used to thinking you know everything, and then you suddenly don't. 02:47:28.100 |
And then you start to realize that actually really awesome thinkers, they don't judge 02:47:34.940 |
They totally respect if I say, I don't know anything about this. 02:47:37.300 |
They say, oh cool, you should read this and this and this. 02:47:41.620 |
And by the way, this is not to say I'm now on Grown-Up Mountain for good. 02:47:46.460 |
I often find myself drifting up with like a helium balloon. 02:47:49.940 |
Oh, I think I read about the new thing and suddenly I think I know it. I read three 02:47:55.060 |
things about, you know, a new AI thing and I'm like, I'll go do a talk on this. 02:47:59.820 |
And I'm going to just be spouting out the opinion of the person I just read. 02:48:06.300 |
Now, the reason for my problem with colleges today, and I graduated in 2004, 02:48:12.660 |
this is a recent change, is that all of those speakers I went to see who disagreed with me, 02:48:20.980 |
So many of those speakers would not be allowed on campuses today. 02:48:24.100 |
And so many of the discussions I had were in big groups or classrooms. 02:48:27.940 |
And this was still, you know, this was a liberal campus. 02:48:31.180 |
So many of those disagreements, they're not happening today. 02:48:35.900 |
And I've interviewed a ton of college students. 02:48:41.220 |
So what's happening is not only are people losing that push off Child's Hill, which was 02:48:45.420 |
so valuable to me, so valuable to me as a thinker. 02:48:48.980 |
It kind of started my life as a better thinker. 02:48:51.660 |
They're losing that, but actually a lot of the college classes and the vibe 02:48:55.380 |
in colleges are saying that there is one right set of views, and it's this checklist. 02:49:03.620 |
And anyone who disagrees with it is bad, and don't speak up, you know, unless you agree. 02:49:09.860 |
It's nailing people's feet 02:49:15.140 |
to Child's Hill. It's teaching people that these views are right. 02:49:17.700 |
And you should feel complete conviction about them. 02:49:29.140 |
Is it part of, like, actually instilling in the individual, in 18-year- 02:49:34.700 |
olds, the idea that the beautiful way to live is to embrace the disagreement? 02:49:45.540 |
On the awareness piece, people need to see what's happening 02:49:49.340 |
here: kids are not going to college and becoming better thinkers. 02:49:57.780 |
They're actually going to college and becoming zealots. 02:50:00.860 |
And the website still advertises, you know, a wide variety of viewpoints. 02:50:06.860 |
- You list all the universities, yeah, Harvard. 02:50:09.340 |
It's still saying, here, you're coming here for a wide intellectual experience; basically they're 02:50:13.020 |
advertising, this is an idea lab, and you get there and it's like, actually, it's an echo chamber. 02:50:17.380 |
So if people realize that, they start to get mad, hopefully, and then comes courage. 02:50:25.220 |
There's been some very brave students who have started, you know, big think clubs and 02:50:29.180 |
stuff like that, where it's like, we're going to present both sides of an issue. 02:50:34.140 |
And that takes courage, but it's also about courage in leadership. 02:50:38.700 |
Like, if you look at these colleges, it's specifically the leaders who show strength that do better. 02:50:50.340 |
The college presidents who have 02:50:55.180 |
shown some strength, they actually don't get as much trouble. 02:50:59.060 |
It's the ones who pander, the ones who, in that moment of truth, 02:51:06.940 |
shrink away, then they get a lot more trouble. 02:51:12.060 |
For the listener: podcast favorite, and your friend, Liv Boeree just entered the room. 02:51:24.860 |
So Liv, you mentioned something that there's a funny story about, we haven't talked at 02:51:29.300 |
all about the actual process of writing the book. 02:51:50.980 |
So Liv was FaceTiming me, and she was being intimidating. 02:51:54.980 |
I took a screenshot and I made it my phone background. 02:51:59.100 |
So to give the background of this: if you hadn't noticed, Tim started writing this 02:52:08.780 |
as sort of a response to, like, the Trump stuff. 02:52:10.860 |
Not even, yeah, it was just supposed to be a mini post. 02:52:12.980 |
I was like, I'm looking at all these future tech things, 02:52:17.180 |
and I feel this uneasiness, like, ah, we're going to mess up all these things. 02:52:23.460 |
And I opened up WordPress to write a one-day little essay. 02:52:30.340 |
It was going to be on this feeling I had that 02:52:37.260 |
our tech was just growing and growing and we were becoming less wise. 02:52:43.020 |
And I just wanted to write a little thousand-word essay 02:52:44.500 |
on something I think we should pay attention to. 02:52:46.700 |
And that was the beginning of this six year nightmare. 02:52:49.860 |
Did you anticipate the blog post would take a long while? 02:52:54.860 |
I don't remember the process fully, but I remember you saying, oh, this is actually bigger. 02:53:01.740 |
You know, because the more we talked about it, 02:53:03.740 |
I was like, oh, this goes deep, because I didn't really understand the full scope of the situation. 02:53:12.700 |
And then the more we dug into it, the deeper and deeper and deeper it went. 02:53:16.100 |
But no, I did not anticipate it would be six years. 02:53:19.700 |
And when was your TED talk on procrastination? 02:53:25.260 |
And I started this book three months later and fell into the biggest procrastination hole of my life. 02:53:33.620 |
I mean, it just shows how much cred I have for that TED talk. 02:53:44.620 |
Because, I mean, he did, you know, he did intend it to start out as a blog post. 02:53:48.140 |
But then you're like, actually, this needs to be multiple posts. 02:53:54.660 |
Yeah, and what Liv witnessed a few times, and my wife has 02:53:58.820 |
witnessed like 30 of these, is these 180 epiphanies, where I'll 02:54:05.780 |
have a moment when, and I don't know what triggers it, you know, 02:54:10.780 |
sometimes it's like I'm just dreading having to finish this the way it is, 02:54:14.180 |
and so there's an epiphany where it's like, you know what, I need to start over from the 02:54:16.260 |
beginning and just make this a short list of like 20 little blog posts, and then I'll be done. 02:54:22.460 |
And then later I was like, no, no, no, I have a new epiphany. 02:54:23.780 |
And yeah, it's kind of like being the crazy person a little bit. 02:54:32.820 |
So things came to a head when we were all on vacation in the Dominican Republic. 02:54:43.460 |
And I remember you'd been in the ocean for like an hour, just bobbing in there. 02:54:48.460 |
And we got talking and we were talking about the book. 02:54:51.060 |
And you were expressing, you know, just the horror of the situation, basically. 02:54:58.220 |
You're like, look, I'm so close, but there's still this, and then there's this. 02:55:02.540 |
And an idea popped into my head, which is that poker players will often set ourselves forfeits. 02:55:12.300 |
You know, like essentially, if we don't get a job done, then we have to do something we really don't want to do. 02:55:18.380 |
So instead of having a carrot, a really, really big stick. 02:55:26.260 |
What is the worst organization or individual, the one that you 02:55:33.300 |
would most loathe to give a large sum of money to? 02:55:36.460 |
And he thought about it for a little while and he gave his answer. 02:55:40.340 |
And I was like, all right, what's your net worth? 02:55:46.800 |
If you don't get the draft in... because, that's right, 02:55:49.480 |
just before that, I asked him: if you had a gun to your head, or to your wife's 02:55:53.900 |
head, and you had to get the book into a state where you could send off an edit to the editor, how long would it take? 02:56:01.820 |
He's like, oh, I guess I could get it like 95 percent good in a month. 02:56:06.140 |
So: in one month's time, if you do not have that draft handed in, 02:56:13.380 |
then 10 percent of your net worth is going to this thing that you really, really think is awful. 02:56:20.240 |
And the kicker was, because, you know, procrastinators, they self-defeat... 02:56:26.000 |
And then Liv says, I'm going to sweeten the deal: I'm going to basically match you, 02:56:33.920 |
I'm going to send a huge amount of my own money there too. 02:56:41.400 |
So I can't let that happen; that would be really bad. 02:56:44.500 |
Not only are you screwing yourself, you're screwing a friend. 02:56:46.740 |
And she was like, and as your friend, because I'm your friend, I will send it. 02:57:09.060 |
Actually, it was funny, because it was supposed to be in by the summer solstice. 02:57:15.220 |
No, I got it in at four a.m. the next morning. 02:57:18.700 |
But then they were both like, that doesn't count. 02:57:25.580 |
Can you imagine how fucked in the head you have to be? 02:57:27.580 |
So, like, literally, technically going past the deadline by four hours, with an obscene amount of money on the line. 02:57:37.660 |
I knew that there was no way she was going to actually send that money, because it was too much. 02:57:43.420 |
So, yeah, I should actually punish you and just send, like, a nominal amount to it anyway. 02:57:51.180 |
But are there some micro lessons from that on how to avoid procrastination when writing? 02:58:00.020 |
I mean, first, don't write, like, a dissertation about proving some grand thesis. 02:58:08.620 |
Like, I would have been an awful PhD student for that reason. 02:58:11.460 |
And so I'm going to do another book, and it's going to be a bunch of short chapters 02:58:14.300 |
that are one-offs, because it just doesn't work when your book is like one giant interconnected argument. 02:58:36.580 |
Secondly, yeah, basically there's two ways to fix procrastination. 02:58:42.500 |
You have a boat that's leaking and it's not working very well. 02:58:45.740 |
You can get your hammer and nails out and your boards and actually fix the boat or you 02:58:50.780 |
can duct tape it for now to get yourself across the river. 02:58:55.300 |
So ideally, down the road, I will have repaired whatever kind of bizarre mental illness 02:59:01.660 |
I have that makes me procrastinate, so that I just don't self-defeat in this way anymore. 02:59:06.980 |
But in the meantime, I can duct-tape the boat by bringing what I call the Panic Monster 02:59:11.540 |
into the situation, via things like this bet and this scary person. 02:59:16.860 |
Having external pressure of some kind is critical for me. 02:59:21.100 |
Yes, I don't have the muscle to do the work I need to do without external pressure. 02:59:25.700 |
By the way, Liv, is there a possible future where you write a book? 02:59:31.580 |
And meanwhile, by the way, huge procrastinator. 02:59:34.860 |
Yeah. I mean, how long did your last video take? 02:59:38.900 |
Is there advice you can give to Liv on how to get the videos done faster? 02:59:43.860 |
Actually, I can give good procrastination advice. 02:59:51.220 |
You know, it's, um, we should actually just do another bet. 02:59:59.500 |
And the thing is the time. 03:00:02.860 |
It's like, if you could take three weeks on a video, and instead you take 10 03:00:06.820 |
weeks, it's not like, oh, well, I'm having more fun in those 10 weeks. 03:00:14.420 |
So you're just having a bad time, 03:00:15.740 |
and you're getting less work done and putting less work out. 03:00:17.820 |
And it's not like you're enjoying your personal life. 03:00:48.580 |
It's so awesome that I'm going to save it when I'm behind a computer and can take notes. 03:00:55.180 |
Of course, that resulted in, like, everything, everything, everything I'm doing being last-minute. 03:01:01.500 |
You know, people self-defeat in different ways. 03:01:02.980 |
Some people don't have this particular problem. 03:01:04.820 |
Adam Grant, he calls himself a pre-crastinator, where he gets an assignment, 03:01:09.220 |
he will go home and do it until it's done and handed in, which is also not necessarily ideal. 03:01:14.340 |
You know, it's like you're rushing it either way, but it's better. 03:01:17.020 |
But some people have the opposite thing, where the looming deadline makes them do it immediately. 03:01:26.820 |
And the procrastinator, I think, has a similar anxiety, but they resolve it in a totally different way. 03:01:36.220 |
Now, I think there's an even bigger group of people. 03:01:37.220 |
So there's these people, the Adam Grants, there's people like me, and then there's people 03:01:40.900 |
who have a healthy relationship with deadlines, but they're still part of a bigger group of 03:01:43.900 |
people who actually need a deadline there to do something. 03:01:52.900 |
So they actually, they still are motivated by a deadline. 03:01:57.140 |
And with all the things in life that don't have a deadline, like working 03:02:00.140 |
out, and like working on that album you wanted to write, they don't do anything either. 03:02:03.660 |
That's why procrastination is a much bigger problem than people realize, 03:02:07.140 |
because it's not just the funny last second people. 03:02:09.500 |
It's anyone who actually can't get things done that don't have a deadline. 03:02:14.780 |
You dedicate your book, quote, to Tannis, who never planned on being married to someone 03:02:20.060 |
who would spend six years talking about his book on politics, but here we are. 03:02:25.100 |
What's the secret to a successful relationship with a procrastinator? 03:02:30.460 |
Well, I think the first and most important thing. 03:02:34.300 |
You already started with a political answer, I can tell. 03:02:37.580 |
No, the first and most important thing is this: for people who don't procrastinate, 03:02:41.740 |
the instinct is to judge it, to either 03:02:48.540 |
think the person is just being, like, a loser, or to take it personally, you know. 03:02:52.700 |
And instead, to see this as some form of addiction, or some form of ailment. 03:03:01.580 |
You know, they're not just being a dick, right? 03:03:03.140 |
Like, they have a problem, and so some compassion, but then also maybe finding that line where 03:03:08.340 |
you can, you know, maybe apply some tough love, some middle ground. 03:03:12.820 |
On the other hand, you might say that, you know, you don't want the significant other 03:03:16.780 |
relationship where it's like, they're the one nagging you. 03:03:18.640 |
Maybe you don't want them even being part of that. 03:03:20.780 |
And I think maybe it's, you know, better to have a Liv do it instead. 03:03:23.500 |
Right, having someone who can create the infrastructure, where they aren't the direct 03:03:27.620 |
stick. You need a bit of carrot and stick, right? 03:03:30.340 |
Maybe they can be the person who keeps reminding them of the carrot. 03:03:34.100 |
And then they set up the friend group to be the stick. 03:03:36.260 |
And then that keeps your relationship in a good place. 03:03:39.620 |
Yeah, a stick, like looming in the background, that's your friend group. 03:03:43.460 |
Okay, at the beginning of the conversation, we talked about how all of human history can fit in a thousand-page book. 03:03:52.540 |
What are you excited about for the 1,001st page, that next first page? 03:04:00.220 |
So the next 250 years, what are you most excited about? 03:04:05.000 |
I'm most excited about, well, have you read the Fable of the Dragon-Tyrant? 03:04:11.380 |
Okay, well, it's an allegory for death, by Nick Bostrom, and he 03:04:16.300 |
compares death to a dragon that eats 60 million people a year, or whatever the number is. 03:04:23.100 |
And every year, we shepherd those people up and they get fed to the dragon. 03:04:26.180 |
And there's a Stockholm syndrome where we say, that's just life, man. 03:04:31.340 |
And anyone who says maybe we should try to beat the dragon, they get called vain and narcissistic. 03:04:36.340 |
But someone who goes and does chemo, no one calls them vain or narcissistic. 03:04:40.760 |
They say, you know, good for you, right? 03:04:47.420 |
And I think that if we can get out of that Stockholm syndrome and realize that death 03:04:51.740 |
is just the machine, the human physical machine, failing, and that there's no law of nature 03:04:59.340 |
that says you can't, with enough technology, repair the machine and keep it going. 03:05:05.980 |
And no one, I don't think anyone, wants to live forever. 03:05:11.780 |
And I think when we hit a world where we can, we have enough tech that we can continue to 03:05:17.340 |
keep the human machine alive until the person says, I'm done, I'm ready. 03:05:20.860 |
I think we will look back, and we will think of anything before that time as 03:05:25.260 |
BC, before the big advancement, and it'll seem so sad. 03:05:31.820 |
And people will say, I can't believe that humans like us had to live with that when 03:05:36.180 |
they lost loved ones and they died before they were ready. 03:05:39.240 |
I think that's the ultimate achievement, but we need to stop criticizing and smearing the people working toward it. 03:05:47.760 |
- So you think that's actually doable in the next 250 years? 03:05:53.400 |
- A lot happens in 250 years, especially when technology really exponentially, yeah. 03:05:58.400 |
- And you think humans will be around, versus AI completely taking over, where mortality wouldn't even apply? 03:06:03.800 |
- I mean, look, the optimist in me, and maybe the stupid kind of 2023 person in me, says, 03:06:09.200 |
yeah, of course, we'll make it, we'll figure it out. 03:06:11.360 |
But, you know, I mean, we are going into the unknown here. You know, I have a friend who knows as much about this stuff as anyone. 03:06:19.440 |
I mean, he's really, he's a big investor in, you know, future tech, and he's really on the 03:06:23.960 |
pulse of things, and he just says, the future's gonna be weird. 03:06:26.480 |
That's what he says, the future's gonna be weird. 03:06:29.480 |
Don't look at the last few decades of your life and apply that forward and say, that's 03:06:33.240 |
No, no, no, it's gonna be weird and different. 03:06:34.840 |
- Well, some of my favorite things in this world are weird. 03:06:38.600 |
And speaking of which, it's good to have this conversation. 03:06:45.800 |
And thanks for talking with me a bunch more times. 03:06:51.520 |
- Thanks for listening to this conversation with Tim Urban. 03:06:53.320 |
To support this podcast, please check out our sponsors in the description. 03:06:57.440 |
And now let me leave you with some words from Winston Churchill. 03:07:01.320 |
When there's no enemy within, the enemies outside cannot hurt you. 03:07:06.880 |
Thank you for listening, and hope to see you next time.