
Tim Urban: Tribalism, Marxism, Liberalism, Social Justice, and Politics | Lex Fridman Podcast #360


Chapters

0:00 Introduction
2:16 Human history
18:15 Greatest people in history
26:04 Social media
32:46 Good times and bad times
44:16 Wisdom vs stupidity
46:24 Utopia
60:33 Conspiracy theories
73:44 Arguing on the Internet
93:44 Political division
103:38 Power games
111:37 Donald Trump and Republican Party
128:45 Social justice
151:28 Censorship gap
158:59 Free speech
163:01 Thinking and universities
171:24 Liv Boeree joins conversation
183:44 Hopes for the future


00:00:00.000 | A radical political movement, of which there will always be a lot in the country, has managed
00:00:06.280 | to do something that a radical movement is not supposed to be able to do in the US, which
00:00:10.880 | is they've managed to hijack institutions all across the country, and hijack medical
00:00:16.400 | journals and universities and the ACLU, all the activist organizations and non-profits
00:00:24.000 | and many tech companies.
00:00:26.280 | And the way I view a liberal democracy is it is a bunch of these institutions that were
00:00:30.840 | trial and error crafted over hundreds of years, and they all rely on trust, public trust,
00:00:38.560 | and a certain kind of feeling of unity that actually is critical to a liberal democracy's
00:00:43.680 | functioning.
00:00:44.680 | And what I see this thing is, is a parasite on that, whose goal is—and I'm not saying
00:00:50.920 | each individual in this, I don't think they're bad people.
00:00:54.360 | I think that the ideology itself has the property that its goal is to tear apart
00:00:59.960 | the pretty delicate workings of the liberal democracy and shred the critical lines of
00:01:04.560 | trust.
00:01:07.600 | The following is a conversation with Tim Urban, his second time on the podcast.
00:01:11.880 | He's the author and illustrator of the amazing blog called "Wait But Why" and is the author
00:01:17.720 | of a new book coming out tomorrow called "What's Our Problem?"
00:01:21.640 | A Self-Help Book for Societies.
00:01:23.920 | We talk a lot about this book in this podcast, but you really do need to get it and experience
00:01:29.160 | it for yourself.
00:01:30.160 | It is a fearless, insightful, hilarious, and I think important book in this divisive time
00:01:36.080 | that we live in.
00:01:37.280 | The Kindle version, the audiobook, and the web version should all be available on the date
00:01:41.800 | of publication.
00:01:42.800 | I should also mention that my face might be a bit more beat up than usual.
00:01:48.840 | I got hit in the chin pretty good since I've been getting back into training Jiu-Jitsu,
00:01:55.360 | a sport I love very much, after recovering from an injury.
00:01:59.120 | So if you see marks on my face during these intros or conversations, you know that my
00:02:05.200 | life is in a pretty good place.
00:02:07.480 | This is the Lex Fridman Podcast.
00:02:09.640 | To support it, please check out our sponsors in the description.
00:02:12.560 | And now, dear friends, here's Tim Urban.
00:02:16.960 | You wrote an incredible book called What's Our Problem?
00:02:20.160 | A Self-Help Book for Societies.
00:02:23.400 | In the beginning, you present this view of human history as a thousand-page book where
00:02:32.200 | each page is 250 years.
00:02:35.680 | And it's a brilliant visualization because almost nothing happens for most of it.
00:02:40.280 | So what blows your mind most about that visualization when you just sit back and think about it?
00:02:46.000 | It's a boring book.
00:02:47.720 | So 950 pages, 95% of the book, hunter-gatherers kind of doing their thing.
00:02:52.120 | I'm sure there's obviously some major cognitive advancements along the way in language.
00:02:57.560 | And I'm sure the bow and arrow comes around at some point.
00:03:00.320 | So tiny things, but it's like, oh, now we have 400 pages until the next thing.
00:03:03.600 | But then you get to page 950, and things start moving.
00:03:07.000 | Recorded history starts at 976.
00:03:09.000 | Right, right.
00:03:10.000 | So basically, the bottom row is when anything interesting happens.
00:03:13.120 | There's a bunch of agriculture for a while before we know anything about it.
00:03:18.200 | And then recorded history starts.
00:03:19.600 | Yeah, 25 pages of actual recorded history.
00:03:24.160 | So when we think of prehistoric, we're talking about pages 1 through 975 of the book.
00:03:31.200 | And then history is page 976 to 1000.
00:03:35.920 | If you were reading the book, it would be like epilogue AD, the last little 10 pages
00:03:40.380 | of the book.
00:03:41.380 | We think of AD as super long, right?
00:03:43.400 | 2,000 years, the Roman Empire, 2,000 years ago.
00:03:46.320 | That's so long.
00:03:48.440 | Human history has been going on for over 2,000 centuries.
00:03:53.760 | That is-- it's just-- it's hard to wrap your head around.
00:03:57.320 | And this is-- I mean, even that's just the end of a very long road.
00:04:01.000 | The 100,000 years before that, it's not like that was that different.
00:04:06.840 | So it's just-- there's been people like us that have emotions like us, that have physical
00:04:12.280 | sensations like us for so, so long.
00:04:17.880 | And who are they all?
00:04:19.940 | And what was their life like?
00:04:21.220 | And it's-- I think we have no idea what it was like to be them.
00:04:25.420 | The thing that's craziest about the people of the far past is not just that they had
00:04:29.820 | different lives, they had different fears, they had different dangers and different responsibilities,
00:04:33.660 | and they lived in tribes and everything, but they didn't know anything.
00:04:37.460 | We just take it for granted that we're born on top of this tower of knowledge.
00:04:41.060 | And from the very beginning, we know that the Earth is a ball floating in space.
00:04:46.700 | And we know that we're going to die one day.
00:04:48.780 | And we know that we evolved from animals.
00:04:54.060 | Those were all incredible epiphanies quite recently.
00:04:58.100 | And the people a long time ago, they just had no idea what was going on.
00:05:01.500 | And I'm kind of jealous, because I feel like it-- I mean, it might have been scary to not
00:05:04.660 | know what's going on, but it also, I feel like, would be-- you'd have a sense of awe
00:05:07.660 | and wonder all the time.
00:05:09.300 | And you don't know what's going to happen next.
00:05:10.660 | And once you learn, you're kind of like, oh, that's a little grim.
00:05:14.100 | But they probably had the same capacity for consciousness to experience the world, to
00:05:20.100 | wander about the world, maybe to construct narratives about the world and myths and so on.
00:05:25.220 | They just had less grounded, systematic facts to play with.
00:05:30.660 | They still probably felt the narratives, the myths they constructed, as intensely as we do.
00:05:36.980 | Oh, yeah.
00:05:37.980 | They also fell in love.
00:05:38.980 | They also had friends.
00:05:41.260 | And they had falling outs with friends.
00:05:43.340 | Didn't shower much, though.
00:05:44.340 | No, they did not smell nice.
00:05:47.820 | Maybe they did.
00:05:49.540 | Maybe beauty's in the eye of the beholder.
00:05:51.820 | Maybe it's all relative.
00:05:52.820 | So how many people in history have experienced a hot shower?
00:05:56.820 | Like almost none.
00:05:58.460 | That's like, when were hot showers invented?
00:06:00.020 | 100 years ago?
00:06:01.820 | Like less?
00:06:03.660 | So like George Washington never had a hot shower.
00:06:07.580 | It's like, it's just kind of weird.
00:06:08.740 | Like he took cold showers all the time.
00:06:12.980 | And again, we just take this for granted.
00:06:14.220 | But that's like an unbelievable life experience to have a rain, a controlled little booth
00:06:20.420 | where it rains hot water on your head.
00:06:23.440 | And then you get out and it's not everywhere.
00:06:24.900 | It's like contained.
00:06:27.380 | That was like, a lot of people probably lived and died without ever experiencing hot water.
00:06:31.520 | Maybe they had a way to heat water over a fire.
00:06:33.940 | But like then it's, I don't know.
00:06:34.980 | It's just like, there's so many things about our lives now that are complete, just total
00:06:41.580 | anomaly.
00:06:42.580 | It makes you wonder like, what is the thing they would notice the most?
00:06:44.940 | I mean, the sewer system, like it doesn't smell in cities.
00:06:49.580 | Incredible.
00:06:50.940 | What does the sewer system do?
00:06:52.300 | I mean, it gets rid of waste efficiently in such a way that we don't have to confront it
00:06:56.020 | with any of our senses.
00:06:58.220 | And that probably wasn't there.
00:06:59.540 | I mean, what else?
00:07:01.020 | Plus all the medical stuff associated with sewage.
00:07:02.700 | - Yeah, I mean, how about the disease?
00:07:03.700 | - Yeah.
00:07:04.700 | - How about the cockroaches and the rats and the disease and the plagues?
00:07:10.340 | And then when they got, so they caught more diseases, but then when they caught the disease,
00:07:13.960 | they also didn't have treatment for it.
00:07:15.960 | So they often would die or they would just be in a huge amount of pain.
00:07:20.620 | They also didn't know what the disease was.
00:07:22.860 | They didn't know about microbes.
00:07:24.020 | That was this new thing.
00:07:25.020 | They just had these tiny little animals that were causing these diseases.
00:07:27.980 | So what did they think?
00:07:28.980 | In the bubonic plague, in the Black Death, in the 1300s, people thought that it was an act
00:07:35.500 | of God because God's angry at us.
00:07:38.220 | Because why would you not think that if you didn't know what it was?
00:07:42.760 | And so the crazy thing is that these were the same primates.
00:07:45.020 | So I do know something about them.
00:07:46.580 | I know in some sense what it's like to be them, because I am a human as well.
00:07:51.740 | And to know that this particular primate that I know what it's like to be experienced such
00:07:57.260 | different things.
00:07:59.140 | And like this isn't, our life is not the life that this primate has experienced almost ever.
00:08:04.220 | So it's just a bit strange.
00:08:06.940 | - I don't know.
00:08:07.940 | I have a sense that we would get acclimated very quickly.
00:08:11.740 | Like if we threw ourselves back a few thousand years ago, it would be very uncomfortable
00:08:16.580 | at first, but the whole hot shower thing, you'll get used to it.
00:08:19.500 | - Oh yeah.
00:08:20.500 | - You would not even like miss it.
00:08:22.860 | There's a few, I'm trying to remember which book that talks about hiking the Appalachian
00:08:27.780 | Trail, but you kind of miss those hot showers.
00:08:31.420 | But I have a sense like after a few months, after a few years.
00:08:35.060 | - Your scale recalibrates.
00:08:36.060 | - Yeah.
00:08:37.060 | - Yeah, I was saying the other day to a friend that whatever you're used to, you start to
00:08:40.860 | think that, oh, that the people that have more than me or are more fortunate, like it
00:08:44.780 | just sounds incredible.
00:08:45.780 | I would be so happy.
00:08:47.100 | But you know that's not true.
00:08:48.100 | Because the experience, what would happen is you would get these new things or you would
00:08:53.420 | get these new opportunities and then you would get used to it and then you would, the hedonic
00:08:56.780 | treadmill, you'd come back to where you are.
00:08:58.940 | And likewise though, because you think, oh my God, what if I had to have this kind of
00:09:03.340 | job that I never would want or I had this kind of marriage that I never would want?
00:09:07.900 | You know what?
00:09:08.900 | If you did, you would adjust and you get used to it and you might not be that much less
00:09:12.180 | happy than you are now.
00:09:13.700 | So on the other side of the, you being okay going back, we would survive if we had to
00:09:18.100 | go back.
00:09:19.100 | We'd have to learn some skills, but we would buck up, and people have gone to war before
00:09:23.860 | who were shopkeepers the year before that.
00:09:26.380 | They were in the trenches the next year.
00:09:28.720 | But on the other hand, if you brought them here, I always think it would be so fun to
00:09:32.420 | just bring, forget the hunter gatherers, bring a 1700s person here and tour them around,
00:09:36.900 | take them on an airplane and show them your phone and all the things it can do.
00:09:40.160 | Show them the internet, show them the grocery store.
00:09:42.580 | And take them to a Whole Foods.
00:09:45.140 | Likewise I think they would be completely awestruck and on their knees crying tears
00:09:49.620 | of joy.
00:09:50.900 | And then they'd get used to it and they'd be complaining about like, "You don't have
00:09:53.380 | the oranges in stock?"
00:09:55.880 | And that's-
00:09:56.880 | - The grocery store is a tough one to get used to.
00:09:58.860 | Like when I first came to this country, the abundance of bananas was the thing that struck
00:10:04.540 | me the most.
00:10:05.540 | Or like fruits in general, but food in general.
00:10:09.660 | But bananas somehow struck me the most.
00:10:11.620 | That you could just eat them as much as you want.
00:10:14.060 | That took a long time for me.
00:10:16.940 | Probably took several years to really get acclimated to that.
00:10:21.940 | - Why didn't you have bananas?
00:10:25.060 | - The number of bananas, fresh bananas, that wasn't available.
00:10:29.900 | Bread, yes.
00:10:32.020 | Bananas, no.
00:10:33.020 | - Yeah, it's like we don't even know what to, like we don't even know the proper levels
00:10:37.780 | of gratitude.
00:10:38.780 | Like, you know, walking around the grocery store, I don't know to be like, "The bread's
00:10:41.780 | nice, but the bananas are like, we're so lucky."
00:10:43.900 | I don't know.
00:10:44.900 | I'm like, "Oh, it could have been the other way.
00:10:45.900 | I have no idea."
00:10:46.900 | - Well, it's interesting then where we point our gratitude in the West, in the United States.
00:10:55.260 | Probably, do we point it away from materialist possessions towards, or do we just aspire
00:11:04.660 | to do that towards other human beings that we love?
00:11:07.740 | Because in the East, in the Soviet Union, growing up poor, it's having food is the gratitude.
00:11:15.740 | Having transportation is gratitude.
00:11:18.700 | Having warmth and shelter is gratitude.
00:11:21.820 | And now, but see within that, the deep gratitude is for other human beings.
00:11:26.860 | The penguins huddling together for warmth in the cold.
00:11:30.100 | - I think it's a person-by-person basis.
00:11:32.420 | I mean, I'm sure, yes, of course, in the West, people on average feel gratitude towards different
00:11:36.460 | things or maybe a different level of gratitude.
00:11:38.700 | Maybe we feel less gratitude than countries that...
00:11:41.700 | Obviously, I think the easiest, the person that's most likely to feel gratitude is going
00:11:46.380 | to be someone whose life happens to be one where they just move up, up, up throughout
00:11:50.860 | their life.
00:11:51.900 | A lot of people in the greatest generation, people who were born in the '20s or whatever,
00:11:55.980 | and a lot of the boomers too.
00:11:57.580 | The story is, the greatest generation grew up dirt poor and they often ended up middle
00:12:00.700 | class.
00:12:02.060 | And the boomers, some of them started off middle class and many of them ended up quite
00:12:05.740 | wealthy.
00:12:06.740 | And I feel like that life trajectory is naturally going to foster gratitude, right?
00:12:15.420 | Because you're not going to take for granted these things because you didn't have them.
00:12:18.380 | I didn't go out of the country really in my childhood very much.
00:12:23.340 | We traveled, but it was to Virginia to see my grandparents or Wisconsin to see other
00:12:27.620 | relatives or maybe Florida for going to the beach.
00:12:30.900 | And then I started going out of the country like crazy in my 20s because it really became
00:12:35.620 | my favorite thing.
00:12:36.980 | And I feel like because I, if I had grown up always doing that, it would have been another
00:12:40.500 | thing.
00:12:41.500 | I'm like, yeah, it's just something I do.
00:12:42.500 | But I still, every time I go to a new country, I'm like, oh my God, this is so cool.
00:12:46.420 | And in another country, this thing I've only seen on the map, I'm like, I'm there now.
00:12:50.340 | And so I feel like a lot of times it's a product of what you didn't have and then you suddenly do.
00:12:57.060 | But I still think it's case by case in that there's like a meter in everyone's head that goes from one to 10.
00:13:04.220 | I think at a 10, you're experiencing just immense gratitude, which is a euphoric feeling.
00:13:12.740 | It's a great feeling.
00:13:14.860 | And it makes you happy to savor what you have, to look down at the mountain of stuff that
00:13:21.860 | you have that you're standing on, to look down at it and say, oh my God, I'm so lucky.
00:13:27.220 | And I'm so grateful for this and this and this.
00:13:29.300 | And obviously, that's a happy exercise.
00:13:31.580 | Now when you move that meter down to six or seven, maybe you think that sometimes, but
00:13:36.300 | you're not always thinking that because you're sometimes looking up at this cloud of things
00:13:42.220 | that you don't have and the things that they have but you don't or the things you wished
00:13:45.580 | you had or you thought you were going to have or whatever.
00:13:48.460 | And that's the opposite direction to look.
00:13:50.700 | And either that's envy, that's yearning, or often it's, if you think about your past,
00:13:58.060 | it's grievance.
00:13:59.780 | And so then you go down to a one and you have someone who feels like a complete victim.
00:14:03.860 | They are just a victim of the society, of their siblings and their parents and their
00:14:08.620 | loved ones.
00:14:10.900 | And they are wallowing in everything that's happened wrong to me, everything I should
00:14:16.380 | have that I don't, everything that has gone wrong for me.
00:14:19.500 | And so that's a very unhealthy, mentally unhealthy place to be.
00:14:23.620 | Anyone can go there.
00:14:24.620 | There's an endless list of stuff you can be aggrieved about and an endless list of stuff
00:14:28.500 | you can have gratitude for.
00:14:30.340 | And so in some ways it's a choice and it's a habit.
00:14:33.260 | And maybe it's part of how we were raised or our natural demeanor, but it's such a good
00:14:36.180 | exercise.
00:14:37.180 | You are really good at this, by the way.
00:14:38.260 | Your Twitter is like--
00:14:39.260 | - Go on.
00:14:40.260 | - Well, like you are constantly just saying, "Man, I'm lucky," or like, "I'm so
00:14:47.300 | grateful for this."
00:14:48.500 | And it's a good thing to do because you're reminding yourself, but you're also reminding
00:14:52.620 | other people to think that way.
00:14:53.940 | And it's like, we are lucky.
00:14:58.100 | And so anyway, I think that scale can go from one to 10.
00:14:59.820 | And I think it's hard to be a 10.
00:15:01.020 | I think you'd be very happy if you could be.
00:15:02.640 | But I think trying to be above a five and looking down at the things you have more often
00:15:08.620 | than you are looking up at the things you don't, or being resentful about the ways
00:15:12.580 | that people have wronged you.
00:15:14.060 | - Well, the interesting thing, I think, is an open question, but I suspect that you
00:15:19.340 | can control that knob for the individual.
00:15:22.220 | Like you yourself can choose.
00:15:23.220 | It's like the Stoic philosophy.
00:15:24.700 | You could choose where you are as a matter of habit, like you said.
00:15:28.260 | But you can also probably control that on a scale of a family, of a tribe, of a nation,
00:15:33.980 | of a society.
00:15:36.300 | You can describe a lot of the things that happened in Nazi Germany and different other
00:15:39.780 | parts of history through a sort of societal envy and resentment that builds up.
00:15:46.020 | Maybe certain narratives pick up and then they infiltrate your mind and then now your
00:15:51.100 | knob goes to, from the gratitude for everything, it goes to resentment and envy and all the
00:15:56.180 | things.
00:15:57.180 | - Germany between the two world wars.
00:15:58.180 | Like you said, the Soviet kind of mentality.
00:16:04.780 | So yeah, and then when you're soaking in a culture, so there's kind of two factors, right?
00:16:10.980 | It's what's going on in your own head and then what's surrounding you.
00:16:13.660 | And what's surrounding you kind of has concentric circles.
00:16:16.020 | There's your immediate group of people, because that group of people, if they're a certain
00:16:20.260 | way, if they feel a lot of gratitude and they talk about it a lot, that kind of insulates
00:16:23.580 | you from the broader culture.
00:16:25.340 | Because the people who are going to have the most impact on you are the ones closest.
00:16:29.620 | But often, all the concentric circles are saying the same thing.
00:16:33.020 | The people around you are feeling the same way that the broader community, which is feeling
00:16:36.340 | the same way as the broader country.
00:16:39.500 | And this is why I think American patriotism, nationalism, can be tribal, can be not a good
00:16:47.020 | thing.
00:16:48.020 | But patriotism, I think, is a great thing.
00:16:52.260 | Because really, what is patriotism?
00:16:55.260 | If you love your country, you should love your fellow countrymen.
00:16:57.340 | That's a Reagan quote.
00:17:00.060 | Patriotism is, I think, a feeling of unity, but it also comes along with an implicit kind
00:17:08.420 | of concept of gratitude.
00:17:10.140 | Because it's like, we are so lucky to live in...
00:17:12.100 | People think it's chauvinist to say, "We live in the best country in the world."
00:17:15.780 | And yes, when Americans say that, no one likes it.
00:17:18.660 | But actually, it's not a bad thing to think.
00:17:20.860 | It's a nice thing to think.
00:17:21.860 | It's a way of saying, "I'm so grateful for all the great things this country gives to
00:17:26.140 | me and this country has done."
00:17:27.860 | And I think if you heard a Filipino person say, "You know what?
00:17:30.580 | The Philippines is the best country in the world."
00:17:31.980 | No one in America would say, "That's chauvinist."
00:17:33.900 | They'd say, "Awesome."
00:17:35.100 | Because when it's coming from someone who's not American, it sounds totally fine.
00:17:40.020 | But I think national pride is actually good.
00:17:42.340 | Now again, that can quickly translate into xenophobia, nationalism, and so you have
00:17:45.980 | to make sure it doesn't go off that cliff.
00:17:47.660 | Yeah, there's good ways to formulate that.
00:17:50.100 | Like you talk about, we'll talk about high-rung progressivism, high-rung conservatism.
00:17:55.660 | Those are two different ways of embodying patriotism.
00:18:02.100 | So you could talk about maybe loving the tradition that this country stands for, or you could
00:18:06.340 | talk about loving the people that ultimately push progress.
00:18:11.060 | And those are, from an intellectual perspective, a good way to represent patriotism.
00:18:15.620 | We've got to zoom out because this graphic is epic.
00:18:18.700 | A lot of images in your book are just epic on their own.
00:18:22.900 | It's brilliantly done.
00:18:24.260 | But this one has famous people for each of the cards.
00:18:29.620 | Like the best of.
00:18:30.620 | Yeah.
00:18:31.620 | And by the way, good for them to be the person that...
00:18:35.740 | I mean, I could have chosen lots of people for each card, but I think most people
00:18:39.140 | would agree.
00:18:40.140 | It's a pretty fair choice for each page.
00:18:43.020 | And good for them to be...
00:18:44.700 | You crushed it if you can be the person for your whole 250 year page.
00:18:48.100 | Well, I noticed you put Gandhi, you didn't put Hitler.
00:18:50.740 | I mean, there's a lot of people going to argue with you about that particular last page.
00:18:55.220 | True.
00:18:56.220 | Yes, you're right.
00:18:57.220 | I could have put...
00:18:58.220 | I actually, I was thinking about Darwin there too.
00:19:00.220 | Darwin, yeah.
00:19:01.220 | Einstein.
00:19:02.220 | Yeah, exactly.
00:19:03.220 | You really could have put anyone.
00:19:04.220 | Did you think about putting yourself for a second?
00:19:05.580 | Yeah, I should have.
00:19:06.580 | I should have.
00:19:07.580 | That would have been awesome.
00:19:08.580 | That would have endeared the readers to me from the beginning of the first page of the
00:19:11.460 | book.
00:19:12.460 | A little bit of a messianic complex going on.
00:19:14.940 | But yeah, so the list of people just so we know.
00:19:16.820 | So these are 250 year chunks.
00:19:19.620 | The last one being from 1770 to 2020.
00:19:23.020 | And so it goes Gandhi, Shakespeare, Joan of Arc, Genghis Khan, Charlemagne, Muhammad,
00:19:29.380 | Constantine, Jesus, Cleopatra, Aristotle, Buddha.
00:19:33.580 | It's so interesting to think about this very recent human history.
00:19:37.860 | It's 11 pages, so it would be 2750, almost 3000 years.
00:19:42.940 | Just that there's these figures that stand out and that define the course of human history.
00:19:49.100 | It's like the craziest thing to me is that Buddha was a dude.
00:19:53.460 | He was a guy with arms and legs and fingernails that he maybe bit and he liked certain foods
00:20:01.100 | and maybe he got like, you know, he had like digestive issues sometimes and like he got
00:20:07.820 | cuts and they stung.
00:20:09.220 | And like he was a guy and he had hopes and dreams and he probably had a big ego for a
00:20:14.020 | while before I guess Buddha totally overcame that one.
00:20:17.180 | But like, and it's like who knows, you know, the mythical figure Buddha, who knows how
00:20:22.360 | similar he was.
00:20:23.360 | But the fact, same with Jesus, this was a guy.
00:20:25.500 | Like to me, he's a primate.
00:20:28.300 | What an impact.
00:20:29.660 | He was a cell first and then a baby.
00:20:31.620 | Yeah, he was a fetus at some point.
00:20:33.620 | A dumb baby trying to learn how to walk.
00:20:35.700 | Yeah, like having a tantrum because he's frustrated, because he's in the terrible twos.
00:20:40.140 | Jesus was in the terrible twos.
00:20:41.540 | Buddha never had a tantrum, let's be honest.
00:20:44.660 | The mother was like, this baby's great.
00:20:46.660 | Like wow.
00:20:47.660 | Let's figure something out.
00:20:49.980 | It just, I mean, listen, hearing about Genghis Khan, it's incredible to me because it's just
00:20:55.060 | like this was some Mongolian, you know, herder guy who was taken as a slave and he was like
00:21:03.980 | dirt poor, you know, catching rats as a young teen to feed him and his mom and I think his
00:21:11.620 | brother.
00:21:12.620 | And it's just like the odds on when he was born, he was just one of, you know, probably
00:21:21.980 | tens of thousands of random teen boys living in Mongolia in the 1200s.
00:21:27.620 | The odds of that person, any one of them being a household name today that we're talking
00:21:32.020 | about, it's just crazy like what had to happen.
00:21:35.820 | And for that guy, for that poor, dirt poor herder to take over the world, I don't know.
00:21:42.420 | So history just like continually blows my mind.
00:21:44.820 | Like, you know.
00:21:45.820 | And he's the reason you and I are related probably.
00:21:48.260 | Yeah, no.
00:21:49.260 | I mean, it's also, that's the other thing is that some of these dudes by becoming king,
00:21:55.140 | by being, having a better army at the right time, you know, William the Conqueror or whatever
00:21:58.860 | has, is in the right place at the right time with the right army, you know, and there's
00:22:02.260 | a weakness at the right moment and he comes over and he exploits it and ends up probably
00:22:06.740 | having, you know, I don't know, a thousand children, and those children are high-up people who
00:22:11.860 | might have a ton of, the species is different now because of him.
00:22:16.540 | Like if that, forget England's different or, you know, European borders look different.
00:22:21.340 | Like, like we are, like we look different because of a small handful of people, you
00:22:27.300 | know.
00:22:28.300 | When I sometimes I think I'm like, oh, you know, this part of the world I can recognize
00:22:31.740 | someone's Greek, you know, someone's Persian, someone's wherever because, you know, they
00:22:34.860 | kind of have certain facial features.
00:22:36.700 | And I'm like, it may have happened.
00:22:38.140 | I mean, obviously it's that that's a population, but it may be that like someone 600 years
00:22:42.540 | ago that looked like that really spread their seed and that's why the ethnicity looks kind
00:22:48.500 | of like that now.
00:22:49.500 | Sorry.
00:22:50.500 | Anyway.
00:22:51.500 | Yeah.
00:22:52.500 | Yeah.
00:22:53.500 | Do you think individuals like that can turn the direction of history, or is that an illusion,
00:22:58.420 | a narrative we tell ourselves?
00:23:01.100 | Well it's both.
00:23:02.100 | I mean, so I said that William the Conqueror, right.
00:23:03.540 | Or Hitler, right.
00:23:05.780 | It's not that Hitler was born and destined to be great at all.
00:23:08.780 | Right.
00:23:10.020 | I mean, in a lot of cases he's frustrated artist with a temper who's turning over the
00:23:13.820 | table in his studio and hitting his wife and being kind of a dick and a total nobody.
00:23:19.580 | Right.
00:23:20.660 | I think almost all the times you could have put Hitler baby on earth.
00:23:24.500 | He's a he's a rando.
00:23:25.500 | Right.
00:23:26.500 | You know, and maybe he's a, you know, maybe sometimes he becomes a, you know, some kind
00:23:29.620 | of, you know, he uses the speaking ability because that ability was going to be there
00:23:32.660 | either way, but maybe he uses it for something else.
00:23:35.620 | But that said, I also think it's not that World War II was going to happen either way.
00:23:42.980 | Right.
00:23:43.980 | So it's both.
00:23:44.980 | It's that like these circumstances were one way and this person came along at the right
00:23:49.020 | time and those two made a match made in this case hell.
00:23:52.060 | But it makes you wonder, yes, it's a match in hell, but are there other people that could
00:23:57.940 | have taken his place or do these people that stand out, they're the rare spark of that
00:24:04.100 | genius, whether it takes us towards evil, towards good, whether those figures singularly define
00:24:11.060 | the trajectory of humanity.
00:24:13.580 | You know, what defines the trajectory of humanity in the 21st century, for example, might be
00:24:17.860 | the influence of AI, might be the influence of nuclear war, negative or positive, not
00:24:23.740 | in the case of nuclear war, but the bioengineering, nanotech, virology, what else is there?
00:24:36.500 | Maybe the structure of governments and so on.
00:24:39.660 | Maybe the structure of universities.
00:24:40.660 | I don't know.
00:24:41.660 | There could be singular figures that stand up and lead the way for humanity.
00:24:45.860 | - There will be.
00:24:46.860 | - But I wonder if the society is the thing that manifests that person or that person
00:24:52.740 | really does have a huge impact.
00:24:55.800 | - I think it's probably a spectrum where there are some cases when a circumstance was such
00:25:00.800 | that something like what happened was gonna happen.
00:25:04.460 | If you pluck that person from the earth, I don't know whether the Mongols is a good example
00:25:08.540 | or not, but maybe it could be that if you plucked Genghis Khan as a baby, it still would have happened,
00:25:13.860 | because of the specific way Chinese civilization was at that time and the specific climate
00:25:21.860 | that was causing a certain kind of pressure on the Mongols and the way they still had
00:25:25.340 | their great archers and they had their horses and they had a lot of the same advantages.
00:25:28.860 | So maybe it was waiting to happen.
00:25:31.340 | It was gonna happen either way and it might not have happened to the extent or whatever.
00:25:35.820 | So maybe.
00:25:36.820 | Or you could go the full other direction and say, actually, this was probably not gonna
00:25:39.580 | happen.
00:25:42.220 | I think World War II is an example.
00:25:44.100 | I think World War II really was the work of, of course, it relied on all these other circumstances.
00:25:50.260 | You had to have the resentment in Germany.
00:25:51.740 | You have to have the Great Depression.
00:25:53.420 | But I think if you take Hitler out, I'm pretty sure World War II doesn't happen.
00:25:59.580 | - Well then, it seems easier to answer these questions when you look at history, even recent
00:26:03.700 | history, but let's look at now.
00:26:05.220 | Let's look at, I'm sure we'll talk about social media.
00:26:07.740 | So who are the key players in social media?
00:26:10.420 | Mark Zuckerberg.
00:26:11.420 | What's the name of the MySpace guy?
00:26:14.740 | - Tom, it's just Tom, yeah.
00:26:17.660 | - There's a meme going around where MySpace is the perfect social media 'cause there's no
00:26:21.700 | algorithmic involvement.
00:26:23.780 | Everybody's happy and positive.
00:26:24.780 | - Also, Tom did it right.
00:26:26.060 | At the time, we were like, oh man, Tom only made a few million dollars.
00:26:29.740 | Ooh, it sucks to not be Zuck.
00:26:32.980 | Tom might be living a nice life right now where he doesn't have this nightmare that
00:26:36.820 | these other people have.
00:26:38.140 | - Yeah, and he's always smiling in his profile picture.
00:26:42.300 | So there's Larry Page with Google.
00:26:44.500 | That's kind of intermingled into that whole thing, into the development of the internet.
00:26:48.140 | Jack Dorsey, now Elon.
00:26:50.900 | Who else?
00:26:51.900 | I mean, there's people playing with the evolution of social media.
00:26:54.900 | And to me, that seems to be connected to the development of AI.
00:27:01.540 | And it seems like those singular figures will define the direction of AI development and
00:27:06.260 | social media development with social media seeming to have such a huge impact on our
00:27:11.500 | collective intelligence.
00:27:13.700 | - It does feel in one way like individuals have an especially big impact right now in
00:27:19.140 | that a small number of people are pulling some big levers.
00:27:24.300 | And there can be a little meeting of three people at Facebook and they come out of that
00:27:28.860 | meeting and make a decision that totally changes the world, right?
00:27:32.380 | On the other hand, you see a lot of conformity.
00:27:37.020 | You see a lot of, they all pulled the plug on Trump the same day, right?
00:27:41.660 | So that suggests that there's some bigger force that is also kind of driving them, in
00:27:46.420 | which case it's less about the individuals.
00:27:49.140 | I think, what is leadership, right?
00:27:51.460 | I mean, to me, leadership is the ability to move things in a direction that the cultural
00:27:57.140 | forces are not already taking things, right?
00:28:00.620 | A lot of times people seem like a leader because they're just kind of hopping on the cultural
00:28:05.380 | wave and they happen to be the person who gets to the top of it.
00:28:07.900 | Now it seems like they're a leader, but actually the wave was already going.
00:28:11.580 | Like real leadership is when someone actually changes the wave, changes the shape of the
00:28:18.940 | wave.
00:28:19.940 | Like I think Elon with SpaceX and with Tesla, like genuinely shaped a wave.
00:28:27.740 | Maybe you could say that EVs were actually, they were going to happen anyway, but there's
00:28:31.740 | not much evidence of that, at least of it happening when it did.
00:28:35.700 | You know, if we end up on Mars, you can say that Elon was a genuine leader there.
00:28:40.420 | And so there are examples now, like Zuckerberg definitely has done a lot of leadership along
00:28:44.640 | the way.
00:28:46.220 | He's also potentially kind of like caught in a storm that is happening and he's one
00:28:53.060 | of the figures in it.
00:28:54.060 | So I don't know.
00:28:55.060 | - And it's possible that he is a big shaper if the metaverse becomes a reality.
00:29:00.160 | If in 30 years we're all living in a virtual world.
00:29:03.240 | To many people it seems ridiculous now that that was a poor investment.
00:29:06.780 | - He talked about getting, you know, I think it was something like a billion people with
00:29:11.140 | a VR headset in their pocket by, you know, I think it was 10 years from then, back in 2015.
00:29:16.660 | So we're behind that.
00:29:19.180 | But when he was talking about that, and honestly, this is something I've been wrong about.
00:29:23.980 | Because I went to like one of the Facebook conferences and tried out all the new Oculus
00:29:30.340 | stuff.
00:29:31.340 | And I was like, you know, pretty early talking to some of the major players there.
00:29:34.060 | Because I was going to write a big post about it that then got swallowed by this book.
00:29:38.420 | But I would have been wrong in the post.
00:29:39.900 | Because what I would have said was that this thing is, when I tried it, I was like, this
00:29:45.780 | is, you know, some of them suck.
00:29:47.420 | Some of them make you nauseous and they're just not that, you know, the headsets were
00:29:50.500 | big and, you know.
00:29:51.940 | But I was like, the times when this is good, it is, I have this feeling I haven't had,
00:29:56.060 | it reminds me of the feeling I had when I first was five and I went to a friend's house
00:29:59.340 | and he had Nintendo.
00:30:00.900 | And he gave me the controller and I was looking at the screen and I pressed a button and Mario
00:30:04.900 | jumped.
00:30:05.900 | And I said, I can make something on the screen move.
00:30:10.700 | And the same feeling I had the first time someone showed me how to send an email.
00:30:13.220 | It was like really early.
00:30:14.220 | And he's like, you can send this.
00:30:15.220 | And I was like, it goes, I can press enter on my computer and something happens on your
00:30:18.260 | computer.
00:30:19.260 | Those were obviously, you know, when you have that feeling, it often means you're witnessing
00:30:22.800 | a paradigm shift.
00:30:24.380 | And I thought, this is one of those things.
00:30:27.060 | And I still kind of think it is, but it's kind of weird that it hasn't, you know, like
00:30:31.020 | where's the VR revolution?
00:30:33.020 | Like, yeah, I'm surprised.
00:30:34.660 | Because I'm with you.
00:30:35.660 | My first and still instinct is this feels like it changes everything.
00:30:39.460 | VR feels like it changes everything, but it's not changing anything.
00:30:43.220 | Like a dumb part of my brain is genuinely convinced that this is real.
00:30:46.780 | And then the smart part knows it's not.
00:30:48.060 | But that's why the dumb part was like, we're not walking off that cliff.
00:30:50.900 | The smart part's like, you're on your rug.
00:30:52.660 | It's fine.
00:30:53.660 | The dumb part of my brain is like, I'm not walking off the cliff.
00:30:56.180 | So it's like, it's crazy.
00:30:57.180 | - I feel like it's waiting for like that revolutionary person who comes in and says, I'm going to
00:31:02.660 | create a headset.
00:31:03.660 | Like, honestly-
00:31:04.660 | - Yeah, Steve Jobs, iPhone of-
00:31:06.020 | - Honestly, a little bit of a Carmack type guy, which is why it was really interesting
00:31:09.220 | for him to be involved with Facebook.
00:31:11.060 | It's basically, how do we create a simple dumb thing that's a hundred bucks, but actually
00:31:15.820 | creates that experience.
00:31:17.380 | And then there's going to be some viral killer app on it.
00:31:20.300 | And that's going to be the gateway into a thing that's going to change everything.
00:31:23.780 | I mean, I don't know what exactly was the thing that changed everything with a personal
00:31:27.340 | computer.
00:31:28.340 | Is that understood?
00:31:30.540 | Why? Was it maybe graphics?
00:31:33.660 | What was the use case?
00:31:35.140 | - I mean-
00:31:36.140 | - Exactly.
00:31:37.140 | - Wasn't the '84 Macintosh like a moment when it was like, this is actually something that
00:31:43.540 | normal people can and want to use?
00:31:45.460 | - Because it was less than $5,000, I think.
00:31:48.140 | - And I just think it had some like Steve Jobs user friendliness already to it that
00:31:52.020 | other ones hadn't had.
00:31:53.380 | I think Windows 95 was a really big deal.
00:31:55.980 | I remember like, because I'm old enough to remember the MS-DOS when I was like, kind
00:31:59.820 | of remember the command.
00:32:01.460 | And then suddenly this concept of like a window you drag something into, or you double click
00:32:05.220 | an icon, which now seems like so obvious to us, was like revolutionary because it made
00:32:10.380 | it intuitive.
00:32:12.540 | So I don't know.
00:32:14.380 | - Windows 95 was good.
00:32:15.820 | - It was crazy, yeah.
00:32:16.820 | - I forget what the big leaps were, because then there was Windows 2000, it sucked.
00:32:20.340 | And then Windows XP was good.
00:32:22.980 | I moved to Mac around 2004, so I stopped.
00:32:26.020 | - You sold your soul to the devil?
00:32:27.540 | - Yeah.
00:32:28.540 | - I see.
00:32:29.540 | Well, us, the people, still use Windows and Android, the device in the operating system
00:32:35.780 | of the people, not you elitist folk with your books and your, what else?
00:32:42.340 | And success.
00:32:44.300 | Okay.
00:32:46.820 | You write, "More technology means better good times, but it also means badder bad times.
00:32:52.500 | And the scary thing is, if the good and bad keep exponentially growing, it doesn't matter
00:32:56.780 | how great the good times become.
00:32:58.900 | If the bad gets to a certain level of bad, it's all over for us."
00:33:02.900 | Can you elaborate on this?
00:33:04.380 | Why is there, why does the bad have that property?
00:33:09.860 | That if it's all exponentially getting more powerful, then the bad is gonna win in the end?
00:33:16.020 | Or am I misinterpreting that?
00:33:16.020 | - No.
00:33:17.020 | One of the things I noticed was a trend, which was like, over the centuries, the good is getting
00:33:23.180 | better every century.
00:33:24.380 | Like the 20th century was the best century yet in terms of prosperity, in terms of GDP
00:33:30.140 | per capita, in terms of life expectancy, in terms of poverty and disease, every metric
00:33:34.860 | that matters.
00:33:35.860 | The 20th century was incredible.
00:33:37.780 | It also had the biggest wars in history, the biggest genocide in history, the biggest existential
00:33:41.860 | threat yet with nuclear weapons, right?
00:33:48.500 | The Depression was probably as big an economic crisis as ever.
00:33:48.500 | So it's this interesting thing where the stakes are getting higher in both directions.
00:33:53.380 | And so the question is like, if you get enough good, does that protect you against the bad?
00:33:59.780 | The dream, and I do think this is possible too, is the good gets so good.
00:34:06.100 | You know, have you ever read the Culture series, the Iain Banks books?
00:34:06.100 | - Not yet, but I get criticized on a daily basis by some of the mutual folks we know
00:34:10.620 | for not having done so.
00:34:11.620 | - Yeah, lots of us.
00:34:12.620 | - And I feel like a lesser man for it.
00:34:13.860 | Yes, I need to change that.
00:34:14.860 | - Yeah, that's how I got onto it.
00:34:16.340 | And I read six of the 10 books and they're great.
00:34:19.660 | But the thing I love about them is like, it just paints one of these futuristic societies
00:34:25.020 | where the good has gotten so good that the bad is no longer even an issue.
00:34:31.100 | Like basically, and the way that this works is the AI, you know, the AIs are benevolent
00:34:38.140 | and they control everything.
00:34:39.420 | And so like, there's one random anecdote where they're like, you know, what happens if you
00:34:43.180 | murder someone in, 'cause you're still, you know, there's still people with rage and jealousy
00:34:47.660 | or whatever.
00:34:48.660 | So someone murders someone, first of all, that person's backed up.
00:34:51.540 | So it's like they have to get a new body and it's annoying, but it's like, it's not death.
00:34:56.180 | And secondly, that person, what are they gonna do?
00:34:57.500 | Put them in jail?
00:34:58.500 | No, no, no.
00:34:59.500 | They're just gonna send a slap drone around, which is this little like tiny, you know,
00:35:01.740 | random drone that just will float around next to them forever.
00:35:04.340 | And by the way, kind of be their servants.
00:35:05.460 | Like, it's kind of fun to have a slap drone, but it's just making sure that they never
00:35:07.780 | do anything.
00:35:08.820 | And it's like, I was like, oh man, it could just be, everyone could be so safe and everything
00:35:13.140 | could be so like, you know, you want a house, you know, the AIs will build you a house.
00:35:16.660 | There's endless space, there's endless resources.
00:35:18.840 | So I do think that that could be part of our future.
00:35:21.060 | That's part of what excites me is like, there is, like today would seem like a utopia to
00:35:24.860 | Thomas Jefferson, right?
00:35:26.300 | Thomas Jefferson's world would seem like a utopia to a caveman.
00:35:30.260 | There is a future, and by the way, these are happening faster, these jumps, right?
00:35:34.600 | So the thing that would seem like a utopia to us, we could experience in our own lifetimes,
00:35:38.780 | right?
00:35:39.780 | It's, especially if, you know, life extension combines with exponential progress.
00:35:44.900 | I want to get there.
00:35:45.900 | And I think in that part of what makes it utopia is you don't have to be as scared of
00:35:50.220 | the worst bad guy in the world trying to do the worst damage because we have protection.
00:35:54.860 | But that said, I'm not sure how that happens.
00:35:57.860 | Like it's, it's, it's, it's either easier said than done.
00:36:01.740 | Nick Bostrom uses the example of if nuclear weapons could be manufactured by microwaving
00:36:06.740 | sand, for example, we probably would be in the Stone Age right now because 0.001% of
00:36:14.180 | people would love to destroy all of humanity, right?
00:36:16.700 | Some 16 year old with huge mental health problems who right now goes and shoots up a school
00:36:21.380 | would say, oh, even better, I'm going to blow up a city.
00:36:24.580 | And now suddenly there's copycats, right?
00:36:27.020 | And so that's like, as our technology grows, it's going to be easier for the worst bad
00:36:39.660 | guys to do tremendous damage.
00:36:39.660 | And it's easier to destroy than to build.
00:36:41.700 | So it takes a tiny, tiny number of these people with enough power to do bad.
00:36:46.460 | So that to me, I'm like the stakes are going up because what we have to lose is this incredible
00:36:51.420 | utopia.
00:36:52.420 | But also like dystopia is real.
00:36:54.300 | It happens.
00:36:55.300 | The Romans ended up in a dystopia.
00:36:56.540 | They probably earlier thought that was never possible.
00:36:58.700 | Like we should not get cocky.
00:37:01.860 | And so to me, that trend is the exponential tech is a double edged sword.
00:37:07.260 | It's so exciting.
00:37:08.700 | I'm happy to be alive now overall because I'm an optimist and I find it exciting.
00:37:12.840 | But it's really scary.
00:37:14.980 | And the dumbest thing we can do is not be scared.
00:37:18.060 | Dumbest thing we can do is get cocky and think, well, my life is always-- the last couple
00:37:21.020 | generations everything's been fine.
00:37:23.700 | Stop that.
00:37:25.180 | What's your gut?
00:37:26.420 | What percentage of trajectories take us towards the, as you put, unimaginably good future
00:37:31.500 | versus unimaginably bad future?
00:37:34.420 | As an optimist.
00:37:36.980 | It's really hard to know.
00:37:38.060 | I mean, all I can-- one of the things we can do is look at history.
00:37:42.220 | And on one hand, there's a lot of stories.
00:37:45.580 | I'm actually listening to a great podcast right now called The Fall of Civilizations.
00:37:50.660 | And it's literally every episode is like a little two hour deep dive into some civilization.
00:37:55.900 | Some are really famous, like the Roman Empire.
00:38:03.300 | Some are more obscure, like the Norse in Greenland.
00:38:03.300 | But each one is so interesting.
00:38:05.380 | But what's-- I mean, there's a lot of civilizations that had their peak.
00:38:11.140 | There's always the peak, right, when they're thriving.
00:38:12.660 | And they're at their max size.
00:38:15.220 | And they have their waterways.
00:38:17.100 | And they have their-- it's civilized.
00:38:19.900 | And it's representative.
00:38:20.900 | And it's fair.
00:38:21.980 | And whatever.
00:38:22.980 | Not always.
00:38:23.980 | But the peak is the great-- you know, if I could go back in time.
00:38:25.860 | It's not that the farther you go back, the worse it gets.
00:38:28.740 | No, no, no.
00:38:29.740 | You want to go back to a civilization during-- I would go to the Roman Empire in the year
00:38:33.620 | Sounds great, right?
00:38:34.620 | You don't want to go to the Roman Empire in the year 400.
00:38:36.540 | We might be in the peak right now here, whatever this empire is.
00:38:39.700 | Honestly, I think about the '80s, the '70s, the '80s.
00:38:43.900 | Oh, here we go.
00:38:44.900 | The music.
00:38:45.900 | No, no.
00:38:46.900 | I hate the '80s.
00:38:47.900 | It's so much better.
00:38:48.900 | No, the '80s culture is so annoying.
00:38:49.900 | It's just like-- when I listen to these things, I'm thinking, you know, the '80s and '90s.
00:38:54.260 | Oh, my god, in the '90s, America was popular.
00:38:56.620 | People forget that now.
00:38:57.620 | Like, Clinton was a superstar around the world.
00:39:00.060 | Michael Jordan was exported internationally.
00:39:02.540 | Then basketball was everywhere suddenly.
00:39:04.540 | You had, like, music, the sports, whatever.
00:39:07.100 | It was a little probably like the '50s, you know, coming out of the World War and the
00:39:10.820 | Depression before it.
00:39:11.820 | It was like this kind of like everyone was in a good mood kind of time, you know?
00:39:15.420 | It's like, finish a big project and it's Saturday.
00:39:17.540 | It was like-- I feel like the '50s was kind of like everyone was having it.
00:39:21.020 | You know, the '20s, I feel like everyone was in a good mood randomly.
00:39:25.580 | Then the '30s, everyone was in a bad mood.
00:39:28.360 | But the '90s, I think we'll look back on it as a time when everyone was in a good mood.
00:39:32.040 | And it was like, you know, again, of course, at the time, it doesn't feel that way necessarily.
00:39:35.380 | But I look at that, I'm like, maybe that was kind of America's peak.
00:39:39.200 | And like, maybe not, but like it hasn't been popular since really worldwide.
00:39:44.180 | It's gone in and out depending on the country, but like it hasn't reached that level of like
00:39:47.860 | America's awesome around the world.
00:39:50.900 | And the political situation has gotten really ugly.
00:39:55.460 | And maybe it's social media, maybe who knows.
00:39:57.980 | But I wonder if it'll ever be as simple and positive as it was then.
00:40:03.900 | Like, maybe we are in the-- it feels a little like maybe we're in the beginning of the downfall--
00:40:09.780 | or not.
00:40:10.780 | Because these things don't just-- it's not a perfect smooth hill.
00:40:12.980 | It goes up and down, up and down.
00:40:14.220 | So maybe there's another big upcoming.
00:40:15.940 | - It's unclear whether public opinion, which is kind of what you're talking to, is correlated
00:40:21.620 | strongly with influence.
00:40:23.700 | You could say that even though America has been on a decline in terms of public opinion,
00:40:27.980 | the exporting of technology, that America has still, with all the talk of China, has
00:40:33.460 | still been leading the way in terms of AI, in terms of social media, in terms of just
00:40:38.380 | basically any software-related product.
00:40:40.380 | - Like chips.
00:40:41.380 | - Yeah, chips.
00:40:42.380 | So hardware and software.
00:40:43.380 | I mean, America leads the way.
00:40:45.380 | You could argue that Google and Microsoft and Facebook are no longer American companies.
00:40:50.340 | They're international companies.
00:40:51.580 | But they really are still headquartered in Silicon Valley, broadly speaking.
00:40:57.260 | So and Tesla, of course, and just all of the technological innovation still seems to be
00:41:02.980 | happening in the United States.
00:41:05.780 | Although culturally and politically, this is not--
00:41:10.900 | - No, it's not good.
00:41:13.420 | - Well, maybe that could shift at any moment when all the technological development can
00:41:18.180 | actually create some positive impact in the world.
00:41:22.180 | That could shift it with the right leadership and so on, with the right messaging.
00:41:26.740 | - Yeah, I think, I don't feel confident at all about whether, no, no, I don't mean that.
00:41:32.980 | I don't mean, I don't feel confident in my opinion that we may be on the downswing or
00:41:37.260 | that we may be, 'cause I truly don't know.
00:41:39.180 | It's like, I think the people, these are really big macro stories that are really hard to
00:41:44.580 | see when you're inside of them.
00:41:45.900 | It's like being on a beach and running around a few miles this way and trying to suss out
00:41:51.180 | the shape of the coastline.
00:41:53.020 | It's just really hard to see the big picture.
00:41:56.620 | You get caught up in the micro stories, the little tiny ups and downs that are part of
00:42:00.980 | some bigger trend.
00:42:03.300 | And also giant paradigm shifts happen quickly nowadays.
00:42:05.780 | The internet came out of nowhere and suddenly was like, changed everything.
00:42:09.860 | So there could be a changed everything thing on the way.
00:42:11.660 | It seems like there's a few candidates for it.
00:42:14.300 | But I mean, it feels like the stakes are just high, higher than it even was for the Romans,
00:42:19.060 | higher than it was for because that we're more powerful as a species.
00:42:24.460 | We have God-like powers with technology that other civilizations at their peak didn't have.
00:42:30.500 | I wonder if those high stakes and powers will feel laughable to people that live, humans,
00:42:37.140 | aliens, cyborgs, whatever lives 100 years from now.
00:42:40.700 | That maybe are a little like, this feeling of political and technological turmoil is
00:42:46.300 | nothing.
00:42:47.300 | - Well, that's the big question.
00:42:48.780 | So right now, you know the 1890s was like a super politically contentious decade in
00:42:54.500 | the US.
00:42:55.500 | There was like immense tribalism and the newspapers were all like lying and telling,
00:43:00.660 | you know, there was a lot of like what we would associate with today's media, the worst
00:43:04.060 | of it.
00:43:05.060 | And it was over gold or silver being this, I don't know, it was very, it's something
00:43:08.220 | that I don't understand.
00:43:09.220 | But the point is, it was a little bit of a blip, right?
00:43:12.220 | It happened, it felt, it must've felt like the end of days at the time.
00:43:14.960 | And then now we look, and most people don't even know about that.
00:43:17.980 | Versus, you know, again, the Roman Empire actually collapsed.
00:43:21.420 | And so the question is just like, is yeah, you know, will in 50 years, will this be like,
00:43:26.820 | or like McCarthyism?
00:43:27.820 | Oh, they had like, oh, that was like a crazy few years in America and then it was fine.
00:43:32.680 | Or is this the beginning of something really big?
00:43:34.980 | And that's what I don't know.
00:43:37.220 | - Well, I wonder if we can predict what the big thing is at the beginning.
00:43:41.240 | It feels like we're not, we're just here along for the ride.
00:43:44.260 | And at the local level, and at every level, we're trying to do our best.
00:43:48.060 | - Well, how do we do our best?
00:43:50.780 | That's the one thing I know for sure, that we need to have our wits about us and do our
00:43:55.160 | best, and the way that we can do that is,
00:43:57.380 | you know, we have to be as wise as possible, right?
00:44:00.500 | To proceed forward.
00:44:01.900 | And wisdom is an emergent property of discourse.
00:44:06.340 | - So you're a proponent of wisdom versus stupidity?
00:44:08.660 | 'Cause I can steelman the case for stupidity.
00:44:12.980 | - Do it.
00:44:13.980 | - I probably can't.
00:44:15.540 | But there's some, I think wisdom, and you talk about this, can come with a false confidence,
00:44:21.300 | arrogance.
00:44:22.300 | I mean, you talk about this in the book.
00:44:24.180 | That's too easy.
00:44:25.180 | - That's not wisdom then.
00:44:26.260 | If you're being arrogant, you're being unwise.
00:44:27.820 | - Unwise.
00:44:28.820 | - Yeah, I think wisdom is doing what people 100 years from now with the hindsight that
00:44:33.380 | we don't have would do if they could come back in time and they knew everything.
00:44:36.300 | It's like, how do we figure out how to have hindsight when we don't actually have it yet?
00:44:39.820 | - What if stupidity is the thing that people from 100 years from now will see as wise?
00:44:45.500 | - I mean--
00:44:46.500 | - The idiot by Dostoevsky being naive and trusting everybody, maybe that's the one.
00:44:50.300 | - Well, then you get lucky.
00:44:52.740 | Then maybe you get to a good future by stumbling upon it.
00:44:57.100 | But ideally, you can get there.
00:44:59.260 | I think a lot of, America, the great things about it are a product of the wisdom of previous
00:45:06.500 | Americans.
00:45:07.500 | The Constitution was a pretty wise system to set up.
00:45:12.860 | - There's not much stupid stumbling around.
00:45:14.660 | - Well, there is, I mean, with Dostoevsky's The Idiot, Prince Mishkin, and Brothers Karamazov,
00:45:21.460 | there's Alyosha Karamazov, you err on the side of love and almost like a naive trust
00:45:31.100 | in other human beings.
00:45:32.860 | And that turns out to be, at least in my perspective, in the long term, for the success of the species
00:45:38.060 | is actually wisdom.
00:45:39.740 | - It's a compass.
00:45:40.740 | - But we don't know.
00:45:41.740 | - It's a compass when you're in the fog.
00:45:43.100 | - In the fog, yeah.
00:45:44.100 | - Love is a compass.
00:45:45.100 | - Okay, but here's the thing.
00:45:46.100 | So I think we should have, a compass is nice, but you know what else is nice?
00:45:50.340 | It's a flashlight in the fog.
00:45:51.660 | You can't see that far, but you can see, oh, you can see four feet ahead instead of one
00:45:54.860 | foot.
00:45:55.860 | And that, to me, is discourse.
00:45:57.380 | That is open, vigorous discussion in a culture that fosters that is how the species, how
00:46:04.820 | the American citizens as a unit can be as wise as possible, can maybe see four feet
00:46:11.700 | ahead instead of one foot ahead.
00:46:13.220 | - That said, Charles Bukowski said that love is a fog that fades with the first light of
00:46:17.780 | reality.
00:46:18.780 | So I don't know how that works out, but I feel like there's intermixing of metaphors
00:46:22.460 | that works.
00:46:23.460 | Okay, you also write that quote, "As the authors of the story of us," which is this 1,000-page
00:46:28.780 | book, "we have no mentors, no editors, no one to make sure it all turns out okay.
00:46:35.860 | It's all in our hands.
00:46:37.520 | This scares me, but it's also what gives me hope.
00:46:40.540 | If we can all get just a little wiser together, it may be enough to nudge the story onto a
00:46:45.300 | trajectory that points towards an unimaginably good future."
00:46:51.260 | Do you think we can possibly define what a good future looks like?
00:46:55.900 | I mean, this is the problem that we ran into with communism, of thinking of utopia, of
00:47:06.460 | having a deep confidence about what a utopian world looks like.
00:47:11.220 | - Well, it's a deep confidence.
00:47:13.020 | That was a deep confidence about the instrumental way to get there.
00:47:17.020 | It was that, you know, I think a lot of us can agree that if everyone had everything
00:47:21.500 | they needed, and we didn't have disease or poverty, and people could live as long as they
00:47:25.180 | wanted to and choose when to die, and there was no major existential threat
00:47:31.260 | because we had things under control, I think almost everyone can agree that would be great.
00:47:34.940 | Communism was them saying, "This is the way to get there."
00:47:40.500 | And that's a different question, you know?
00:47:44.340 | So the unimaginably good future I'm picturing, I think a lot of people would picture, and
00:47:48.820 | I think most people would agree.
00:47:50.060 | Now, not everyone.
00:47:51.060 | There's a lot of people out there who would say humans are the scourge on the earth and
00:47:53.580 | we should de-growth or something, but I think a lot of people would agree that, you know,
00:47:58.060 | just again, take Thomas Jefferson, bring him here.
00:47:59.980 | He would see it as a utopia for obvious reasons, the medicine, the food, the transportation,
00:48:07.540 | the quality of life and the safety and all of that.
00:48:11.700 | So extrapolate that forward for us.
00:48:14.500 | Now we're Thomas Jefferson, you know, what's the equivalent?
00:48:17.940 | That's what I'm talking about.
00:48:19.100 | And the big question is, I actually don't, I don't try to say, "Here's the way to get
00:48:22.980 | there.
00:48:23.980 | Here's the actual specific way to get there."
00:48:26.140 | I try to say, "How do we have a flashlight so that we can together figure it out?
00:48:30.420 | Like how do we give ourselves the best chance of figuring out the way to get there?"
00:48:33.940 | And I think part of the problem with communists and people, ideologues, is that they're way
00:48:40.340 | too overconfident that they know the way to get there.
00:48:44.140 | And it becomes a religion to them, this solution.
00:48:47.440 | And then you know, you can't update once you have a solution as a religion.
00:48:50.440 | And so.
00:48:51.440 | - I felt a little violated when you said communists and stared deeply into myself.
00:48:57.880 | In this book, you've developed a framework for how to fix everything.
00:49:02.580 | It's called The Ladder.
00:49:03.580 | Can you explain it?
00:49:04.580 | - Okay, it's not a framework for how to fix everything.
00:49:07.060 | I would never say that.
00:49:08.060 | - I'll explain it to Tim Urban at some point.
00:49:09.900 | - Okay.
00:49:10.900 | - How this humor thing works.
00:49:11.900 | - Yeah, no.
00:49:12.900 | - It's a framework of how to think about collaboration between humans such that we could fix things.
00:49:20.080 | - I think it's a compass.
00:49:21.080 | - Yeah, it's like a, it's a ruler that we can, once we look at it together and see what
00:49:28.040 | it is, we can all say, oh, we want to go to that side of the ruler.
00:49:30.760 | Not this side.
00:49:33.040 | And so it gives us a direction to go.
00:49:34.440 | - And so what are the parts of The Ladder?
00:49:36.960 | - So I have these two characters.
00:49:38.280 | This orange guy, this primitive mind, is, this is our software.
00:49:42.400 | That is the software that was in a 50,000 BC person's head that was specifically optimized
00:49:50.520 | to help that person survive in that world.
00:49:52.480 | And not even really survive, but help them pass their genes on in that
00:49:55.440 | world.
00:49:58.720 | And civilization happened quickly and brains change slowly.
00:50:04.040 | And so that unchanged dude is still running the show in our head.
00:50:10.560 | And I use the example of like Skittles.
00:50:13.240 | Like why do we eat Skittles?
00:50:15.560 | It's trash.
00:50:16.560 | It's obviously bad for you.
00:50:18.720 | And it's because the primitive mind in the world that it was programmed for, there was
00:50:24.720 | no Skittles.
00:50:25.720 | It was just fruit.
00:50:27.640 | And if there was a dense, chewy, sweet fruit like that, it meant you just found like a
00:50:31.640 | calorie gold mine.
00:50:33.160 | Energy, energy, take it, take it, eat as much as you can.
00:50:36.760 | Gorge on it.
00:50:37.760 | Hopefully you get a little fat.
00:50:38.760 | That would be the dream.
00:50:40.320 | And now we're so good with energy for a while.
00:50:42.160 | We don't have to stress about it anymore.
00:50:44.240 | So today Mars Inc is clever and says, let's not sell things to people's higher minds,
00:50:51.520 | who's the other character.
00:50:52.800 | Let's sell to people's primitive minds.
00:50:54.360 | Primitive minds are dumb.
00:50:55.360 | And let's trick them into thinking this is this new, this, this thing you should eat
00:50:58.920 | and then they'll eat it.
00:50:59.920 | Now Mars Inc is a huge company.
00:51:01.480 | Actually, just to linger real quick.
00:51:03.000 | So you said primitive mind and higher mind.
00:51:04.800 | So those are the two things that make up this bigger mind that it, that is the modern human
00:51:09.480 | being.
00:51:10.480 | Yeah.
00:51:11.480 | It's like, you know, it's not perfect.
00:51:12.480 | Obviously there's a lot of crossover.
00:51:13.480 | There's people who will yell at me for saying there's two minds and you know that, but to
00:51:16.920 | me it's still a useful framework where you have this software that is making decisions
00:51:22.640 | based on a world that you're not in anymore.
00:51:25.080 | And then you've got this other character.
00:51:26.720 | I call it the higher mind.
00:51:27.720 | And it's the part of you that knows that Skittles are not good and can override the instinct.
00:51:31.240 | And the reason you don't always eat Skittles is because the higher mind says, no, no, no,
00:51:34.560 | we're not doing that because that's bad.
00:51:36.360 | And I know that right now you can apply that to a lot of things.
00:51:39.480 | The higher mind is the one that knows I shouldn't procrastinate.
00:51:41.560 | The primitive mind is the one that wants to conserve energy and not do anything icky and
00:51:45.080 | you know, can't see the future.
00:51:46.080 | So he procrastinates. You know, you can apply this.
00:51:48.280 | Now, in this book, I apply it to how we form our beliefs, as one of the ways.
00:51:54.880 | And then eventually to politics and political movements.
00:51:56.880 | But like, if you think about what, well, what's the equivalent of the Skittles tug of war
00:52:01.960 | in your head for how do you form your beliefs?
00:52:06.840 | And it's that the primitive mind in the world that it was optimized for, it wanted to feel
00:52:18.080 | conviction about its beliefs.
00:52:20.600 | It wanted to be sure, it wanted to feel conviction, and it wanted to agree
00:52:27.840 | with the people around it.
00:52:28.840 | It didn't want to stand out.
00:52:29.840 | It wanted to fervently agree with the tribe about the tribe's sacred beliefs.
00:52:33.320 | Right?
00:52:34.320 | And there's a big part of us that wants to do that, that doesn't like changing our mind.
00:52:40.320 | It feels like it's part of our identity; the primitive mind identifies with beliefs.
00:52:40.320 | It feels like it's a threat, a physical threat to you, to your primitive mind when you change
00:52:45.120 | your mind or when someone disagrees with you in a smart way.
00:52:48.800 | So there's that huge force in us, which is confirmation bias.
00:52:51.560 | That's where that comes from.
00:52:52.560 | It's this desire to keep believing what we believe and this desire
00:52:56.980 | to also fit in with our beliefs, to believe what the people around us believe.
00:53:01.920 | And that can be fun in some ways.
00:53:03.920 | We all like the same sports team and we're all super into it and we're all going to be
00:53:07.040 | biased about that call together.
00:53:08.560 | I mean, that it's not always bad, but it's not a very smart way to be.
00:53:13.000 | And you're actually, you're working kind of for those ideas.
00:53:15.980 | Those ideas are like your boss and you're working so hard to keep believing those.
00:53:20.000 | A really good paper comes in that you read that conflicts
00:53:23.960 | with those ideas.
00:53:24.960 | And you will do all this work to say that paper is bullshit because you're a
00:53:30.360 | faithful employee of those ideas.
00:53:32.720 | Now the higher mind, to me, the same part that can override the Skittles can override
00:53:36.320 | this and can search for something that makes a lot more sense, which is truth.
00:53:41.240 | Because what rational being wouldn't want to know the truth? Who wants to be delusional?
00:53:45.600 | And so there's this tug of war because the higher mind doesn't identify with ideas.
00:53:51.020 | Why would you, it's an experiment you're doing and it's a mental model.
00:53:53.920 | And if someone can come over and say, you're wrong, you'd say, where? Where? Show me.
00:53:57.640 | And if they point out something that is wrong, you'd say, oh, thanks.
00:54:00.320 | Oh, good.
00:54:01.320 | I just got a little smarter, right?
00:54:02.320 | I'm not going to identify with the thing.
00:54:03.320 | I'll go, yeah, kick it.
00:54:04.320 | See if you can break it.
00:54:05.320 | If you can break it, it's not that good.
00:54:06.320 | Right?
00:54:07.320 | So there's both of these in our heads and there's this tug of war between them.
00:54:11.280 | And sometimes, you know, if you're telling me about something with AI, I'm probably going
00:54:14.400 | to think with my higher mind because I'm not identified with it.
00:54:16.600 | But if you go and you criticize the ideas in this book or you criticize my religious
00:54:20.000 | beliefs or you criticize, I might have a harder time because the primitive mind says, no,
00:54:23.320 | no, no, those are, are special ideas.
00:54:26.000 | So yeah, so that's, that's one way to use this ladder is like, it's a spectrum, you
00:54:30.080 | know, at the top, the higher mind is doing all the thinking.
00:54:32.520 | And then as you go down, it becomes more of a tug of war.
00:54:34.280 | And at the bottom, the primitive mind is in total control.
00:54:37.120 | And this is distinct as you show from the spectrum of ideas.
00:54:40.960 | So this is how you think versus what you think.
00:54:43.320 | And those are distinct.
00:54:44.320 | Those are different dimensions.
00:54:46.240 | We need, we need a vertical axis.
00:54:48.820 | We have all these horizontal axes, left, right, center, or, you know, this opinion all the
00:54:52.520 | way to this opinion.
00:54:53.520 | But it's like, what's much more important than where you stand is how you got there,
00:54:58.320 | right?
00:54:59.320 | So this helps: if I can say this person's kind of on the left or on the right, but they're
00:55:04.920 | up high, in other words, I think they got there using evidence and
00:55:08.880 | reason and they were willing to change their mind.
00:55:10.880 | Now that means a lot to me what they have to say.
00:55:12.560 | If I think they're just a tribal person and I can predict all their beliefs from hearing
00:55:16.120 | one, because it's so obvious what their political beliefs are, that person's views are irrelevant
00:55:20.080 | to me because they're not real.
00:55:21.540 | They didn't come from information.
00:55:23.820 | They came from a tribe's kind of, you know, sacred 10 commandments.
00:55:29.340 | I really like the comic you have in here about with the boxer.
00:55:32.900 | This is the best boxer in the world.
00:55:34.500 | Wow, cool.
00:55:35.500 | Who has he beaten?
00:55:37.700 | No one.
00:55:38.700 | He's never fought anyone.
00:55:39.700 | Then how do you know he's the best boxer in the world?
00:55:42.140 | I can just tell.
00:55:43.140 | Now, I mean, this connects with me, and I think with a lot of people, just because in martial
00:55:46.680 | arts it's especially true that there's this whole legend-building about different martial
00:55:50.740 | artists, where people kind of construct them like action figures, you know, thinking that
00:55:58.820 | Steven Seagal is the best fighter in the world or Chuck Norris.
00:56:01.460 | But Chuck Norris actually has it backed up.
00:56:03.380 | He's done really well in competition, but still the ultimate test, particularly for
00:56:07.180 | martial arts, is what we now know as mixed martial arts, UFC and so on.
00:56:12.260 | That's the actual scientific testing ground.
00:56:14.140 | It's a meritocracy.
00:56:15.140 | Yeah, exactly.
00:56:16.140 | I mean, there's within certain rules and you can criticize those rules like this doesn't
00:56:19.500 | actually represent the broader combat that you would think of when you're thinking about
00:56:23.220 | martial arts.
00:56:24.220 | But reality is you're actually testing things.
00:56:26.540 | And that's when you realize that Aikido and some of these kind of woo woo martial arts
00:56:31.740 | in their certain implementations don't work in the way you think they would in the context
00:56:36.940 | of fighting.
00:56:37.940 | And I think this is one of the places where everyone can agree, which is why it's a really
00:56:41.700 | nice comic. Because then, when you start to
00:56:46.580 | map this onto ideas that people take personally,
00:56:50.300 | it starts becoming a lot more difficult to basically highlight that we're thinking
00:56:57.500 | not with our higher mind, but with our primitive mind.
00:57:00.020 | Yeah.
00:57:01.020 | I mean, if I'm thinking with my higher mind, here I can use different things
00:57:04.980 | as a metaphor for an idea.
00:57:06.340 | So here the metaphor is a boxer for one of your conclusions, one of your beliefs.
00:57:12.820 | And if all I care about is truth, in other words, that means all I care about is having
00:57:20.100 | a good boxer.
00:57:21.800 | I would say, go, yeah, try, see if this person is good.
00:57:26.420 | In other words, I would get into arguments, which is throwing my boxer out there to fight
00:57:29.900 | against other ones.
00:57:31.620 | And if I think my argument is good, by the way, I love boxing.
00:57:33.860 | If I think my guy is amazing, Mike Tyson, I'm thinking, oh yeah, bring it on.
00:57:40.100 | Who wants to come see?
00:57:41.100 | I bet no one can beat my boxer.
00:57:42.460 | I love a good debate, right, in that case.
00:57:45.620 | Now what would you think about my boxer if not only was I telling you he was great, but
00:57:51.100 | he's never boxed anyone.
00:57:52.140 | But then you said, OK, well, your idea came over to try to punch him.
00:57:55.740 | And I screamed and I said, what are you doing?
00:57:57.860 | That's violence.
00:57:58.860 | And you're an awful person.
00:58:01.780 | And I don't want to be friends with you anymore because you would think this boxer obviously
00:58:05.300 | sucks.
00:58:06.300 | Or at least that I think he sucks deep down, because why else would I be so anti anyone boxing him?
00:58:11.700 | No boxing allowed.
00:58:13.900 | So I think if you're in-- so I call this a ladder, right?
00:58:19.140 | If you're in low-rung land, whether it's a culture or whatever, a debate, an argument,
00:58:23.620 | when someone says, no, that's totally wrong, what you're saying about that, and here's
00:58:28.620 | why, you're actually being totally biased.
00:58:31.100 | It sounds like a fight.
00:58:32.180 | People are going to say, oh, wow, we got in a fight.
00:58:34.540 | It was really awkward.
00:58:35.540 | Are we still friends with that person?
00:58:37.460 | Because that's not a culture of boxing.
00:58:39.700 | It's a culture where you don't touch each other's ideas.
00:58:41.620 | It's insensitive.
00:58:42.620 | Versus in a high-rung culture, it's sport.
00:58:48.620 | I mean, like every one of your podcasts, whether you're agreeing or disagreeing, the tone is
00:58:54.220 | the same.
00:58:55.220 | It's not like, oh, this got awkward.
00:58:56.220 | It's like the tone is identical because you're just playing intellectually either way because
00:58:59.900 | it's a good high-rung space.
00:59:00.900 | - At its best, but people do take stuff personally.
00:59:05.140 | And that's actually one of the skills of conversation just as a fan of podcasts is when you sense
00:59:09.780 | that people take a thing personally, you have to like, there's sort of methodologies and
00:59:15.460 | little paths you can take to calm things down.
00:59:18.500 | Go around.
00:59:20.140 | Don't take it as a violation or anything like that.
00:59:21.540 | - You're trying to suss out which of their ideas are sacred to them.
00:59:24.380 | And which ones are, bring it on.
00:59:26.660 | - And sometimes it's actually, I mean, that's the skill of it, I suppose, that sometimes
00:59:29.700 | it's certain wordings in the way you challenge those ideas are important.
00:59:35.140 | You can challenge them indirectly and then together, walk together in that way.
00:59:40.100 | Because what I've learned is people are used to their ideas being attacked in a certain
00:59:46.940 | way, in a certain tribal way.
00:59:49.420 | And if you just avoid those, like for example, if you have political discussions and just
00:59:53.580 | never mentioned left or right or Republican and Democrat, none of that, just talk about
00:59:59.380 | different ideas and avoid certain kind of triggering words.
01:00:03.780 | You can actually talk about ideas versus falling into this path that's well-established through
01:00:10.340 | battles that people have previously fought.
01:00:12.420 | - When you say triggering, I mean, who's getting triggered?
01:00:14.740 | The primitive mind.
01:00:15.740 | So what you're trying to do, what you're saying in this language is how do you have conversations
01:00:19.740 | with other people's higher minds, almost like whispering without waking up the primitive
01:00:23.780 | mind.
01:00:24.780 | The primitive mind is there sleeping, right?
01:00:25.780 | And as soon as you say something, the left primitive mind gets up and says, "What?
01:00:29.540 | What are you saying about the left?"
01:00:30.540 | And now everything goes off the rails.
01:00:33.340 | - What do you make of conspiracy theories under this framework of the latter?
01:00:37.740 | - So here's the thing about conspiracy theories is that once in a while they're true, right?
01:00:43.780 | 'Cause sometimes there's an actual conspiracy.
01:00:45.720 | Actually humans are pretty good at real conspiracies, secret things.
01:00:50.260 | And then I just watched the Madoff doc, great new Netflix doc, by the way.
01:00:56.540 | And so the question is, how do you create a system that is good at this: you put the conspiracy
01:01:06.220 | theory in and it either goes, "Eh," or it says, "This is interesting, let's keep exploring."
01:01:12.820 | How do you build something that can do that, how do you assess?
01:01:14.060 | And so again, I think the high-rung culture is really good at it because with a real conspiracy,
01:01:21.420 | what's gonna happen is you put it, it's like a little machine you put in the middle of
01:01:24.620 | the table and everyone starts firing darts at it or bow and arrow or whatever, and everyone
01:01:27.900 | starts kicking it and trying to, and almost all conspiracy theories, they quickly crumble,
01:01:33.060 | right?
01:01:34.060 | Because they actually do. Take Trump's election one: I actually dug in and I looked at like
01:01:38.540 | every claim that he or his team made and I was like, "All of these, none of these hold
01:01:43.900 | up to scrutiny, none of them."
01:01:44.900 | I was open-minded, but none of them did.
01:01:46.900 | So that was one that as soon as it's open to scrutiny, it crumbles.
01:01:51.320 | The only way that a conspiracy theory can stick around in a community is if it is a culture where
01:01:58.020 | that's being treated as a sacred idea that no one should kick or throw a dart at because
01:02:02.340 | if you throw a dart, it's gonna break.
01:02:03.740 | And so what you want is a culture where no idea is sacred.
01:02:10.180 | Anything can get thrown at.
01:02:11.180 | And so I think that then what you'll find is that 94 out of 100 conspiracy theories
01:02:15.940 | come in and they fall down.
01:02:17.580 | The other, maybe four of the others come in and there's something there, but it's not
01:02:21.080 | as extreme as people say.
01:02:22.080 | And then maybe one is a huge deal and it's actually a real conspiracy.
01:02:25.360 | - Well, isn't there a lot of gray area and there's a lot of mystery.
01:02:28.920 | Isn't that where the conspiracy theories seep in?
01:02:32.560 | So it's great to hear that you've really looked into the Trump election fraud claims, but
01:02:40.000 | aren't they resting on a lot of kind of gray area, like fog, basically saying that there
01:02:49.840 | are dark forces in the shadows that are actually controlling everything.
01:02:49.840 | I mean, the same thing with maybe you can, there's like safer conspiracy theories, less
01:02:55.840 | controversial ones like, have we landed on the moon?
01:02:58.720 | Did the United States ever land on the moon?
01:03:02.720 | You know, the reason those conspiracy theories work is you could construct them: there's
01:03:08.800 | incentives and motivation for faking the moon landing.
01:03:11.880 | And there's very little data supporting the moon landing
01:03:20.000 | that's very public, and it kind of looks fake, space kind of looks fake.
01:03:22.800 | - And that would be a big story if it turned out to be fake.
01:03:26.080 | - That's the argument, that would be the argument against it.
01:03:28.080 | Like are people really as a collective going to hold onto a story that big?
01:03:33.960 | Yeah, but the reason they work is there's mystery.
01:03:38.560 | - Yeah, there's a great documentary called Behind the Curve about flat earthers.
01:03:42.480 | And one of the things that you learn about flat earthers is they believe all the conspiracies,
01:03:47.880 | not just the flat earth.
01:03:49.640 | They are convinced the moon landing is fake, they're convinced 9/11 was an American con
01:03:55.240 | They're convinced, you know, that, name a conspiracy and they believe it.
01:03:59.480 | And so it's so interesting is that I think of it as a skepticism spectrum.
01:04:07.520 | So on one side, you, it's like a filter in your head, a filter in the beliefs section
01:04:13.120 | of your brain.
01:04:14.120 | On one end of the spectrum, you are gullible, perfectly gullible.
01:04:18.000 | You believe anything someone says, right?
01:04:19.280 | On the other side, you're paranoid.
01:04:20.280 | You think everyone's lying to you, right?
01:04:22.920 | Everything is false.
01:04:23.920 | Nothing anyone says is true, right?
01:04:24.920 | So obviously those aren't good places to be.
01:04:27.720 | Now, I think the healthy place is to be somewhere
01:04:32.280 | in the middle.
01:04:33.280 | But also, you can learn to trust certain sources, and then, you know, you don't have
01:04:36.120 | to apply as much skepticism to them.
01:04:38.960 | And so here's what, like, when you start having a bias, just say you have a political bias,
01:04:46.480 | when your side says something, you will find yourself moving towards the gullible side
01:04:50.280 | of the spectrum.
01:04:51.280 | You read an article written that supports your views, you move to the gullible side
01:04:54.440 | of the spectrum and you just believe it and you don't have any, where's that skepticism
01:04:57.280 | that you normally have, right?
01:04:58.280 | And then you move, and then you, soon as it's the other person talking, the other team talking,
01:05:01.840 | you move to the skeptical, the closer to the, you know, in denial, paranoid side.
01:05:07.120 | Now, flat earthers are the extreme.
01:05:09.600 | They are either a 10 or one.
01:05:13.080 | So it's like, it's so interesting because they're the people who are saying, ah, nah,
01:05:15.880 | I won't believe you.
01:05:16.880 | I'm not gullible.
01:05:17.880 | No, everyone else is gullible about the moon landing.
01:05:19.240 | I won't.
01:05:20.240 | And then yet when there's this evidence like, oh, because you can see Seattle, you can
01:05:24.360 | see the buildings over that horizon and you shouldn't be able to if the earth were round,
01:05:28.140 | which isn't true.
01:05:32.100 | Therefore it's flat, so suddenly they become the most gullible person.
01:05:34.060 | They hear any theory about the earth being flat, and they believe it.
01:05:35.980 | It goes right into their beliefs.
01:05:37.380 | So they're actually jumping back and forth between refusal to believe anything and believe
01:05:41.700 | anything.
01:05:42.700 | And so they're the extreme example.
01:05:44.220 | But I think when it comes to conspiracy theories, the people that get themselves into trouble
01:05:48.220 | are the ones who, they become really gullible when they hear a conspiracy theory that kind
01:05:52.100 | of fits with their worldview.
01:05:53.820 | And likewise, when there's something that's kind of obviously true and it's not
01:05:57.260 | a big lie, they'll think it is one.
01:06:01.700 | They just tighten up their skepticism filter.
01:06:04.620 | And so, yeah.
01:06:05.980 | So I think the healthy place is somewhere in between, because you also don't want to
01:06:09.540 | be the person for whom the word conspiracy theory just sounds
01:06:12.600 | like a synonym for, like, quack job crazy theory.
01:06:17.660 | Right?
01:06:18.660 | So, yeah.
01:06:19.660 | So I think, yeah, I think it's: be somewhere in the middle of that spectrum and learn
01:06:22.820 | to fine-tune it.
01:06:23.820 | Which is a tricky place to operate.
01:06:25.220 | Because every time you hear a new conspiracy theory, you should approach
01:06:29.500 | it with an open mind.
01:06:31.700 | And also, if you don't have enough time to investigate, which most people don't, you should
01:06:36.860 | still have the humility not to make a conclusive statement that it's nonsense.
01:06:41.500 | There's a lot of social pressure, actually, to immediately laugh off any conspiracy theory,
01:06:48.660 | if it's done by the bad guys, right?
01:06:50.940 | You will quickly get mocked and laughed at and not taken seriously.
01:06:54.420 | If you give any credence, you know, back to the lab leak was a good one, where it's like,
01:06:59.020 | turned out that that was at least very credible, if not true.
01:07:03.700 | And that was a perfect example of one where when it first came out, and not only so, so
01:07:09.700 | Brett Weinstein talked about it.
01:07:11.820 | And then I, in a totally different conversation, said something complimentary about him on
01:07:16.500 | a totally different subject.
01:07:19.040 | And people were saying, Tim, you might have gone a little off the deep end.
01:07:22.180 | You're like quoting someone who is like a lab leak person.
01:07:24.900 | So I was getting my reputation dinged for complimenting on a different topic, someone
01:07:31.220 | whose reputation was totally sullied, because they had, you know, they questioned an orthodoxy,
01:07:36.820 | right?
01:07:37.820 | So you see, so what does that make me want to do?
01:07:41.380 | Distance myself from Brett Weinstein.
01:07:42.740 | That's the, at least that's the incentive that's, and what does that make other people
01:07:45.500 | want to do?
01:07:46.500 | Don't become the next Brett Weinstein.
01:07:47.500 | Don't say it out loud, because you don't want to become someone that no one wants to compliment
01:07:50.900 | anymore, right?
01:07:51.900 | So you can see the social pressure. And of course, when there is a conspiracy,
01:07:55.700 | that social pressure is its best friend.
01:07:59.040 | - Because then the people on the outside see that social pressure in action, like
01:08:06.140 | Tim Urban being pushed more and more toward the extreme of the other side, and so they're
01:08:09.180 | going to take the more and more extreme position themselves.
01:08:11.260 | I mean, this, what do you see that the pandemic did, that COVID did to our civilization in
01:08:19.660 | that regard, in the forces?
01:08:21.700 | Why was it so divisive?
01:08:23.540 | Do you understand that?
01:08:26.580 | - Yeah, so COVID, I thought, might be one. We always know the ultimate example of a topic that
01:08:32.460 | will unite us all is the alien attack.
01:08:34.140 | - Yeah.
01:08:35.140 | - Although honestly, I don't even have that much faith then.
01:08:36.260 | I think there'd be like, some people are super pro-alien, and some people are anti-alien,
01:08:41.420 | but anyway.
01:08:42.420 | - I was actually sorry to interrupt, because I was talking to a few astronomers, and they're
01:08:47.460 | the first folks that made me kind of sad in that if we did discover life on Mars, for
01:08:55.020 | example, that there's going to be potentially a division over that too, where half the
01:08:59.820 | people won't believe it's real.
01:09:02.060 | - Well, because we live in a current society where the political divide has subsumed everything,
01:09:12.340 | and that's not always like that.
01:09:15.380 | It goes into stages like that.
01:09:17.020 | We're in a really bad one where it's actually, in the book, I call it like a vortex, almost
01:09:22.860 | like a whirlpool that pulls everything into it.
01:09:27.740 | And so normally you'd say, okay, immigration, naturally going to be contentious.
01:09:31.700 | That's always political, right?
01:09:33.860 | But COVID seemed like, oh, that's one of those that will unite us all.
01:09:39.020 | Let's fight this not human virus thing.
01:09:43.300 | No one's sensitive, no one's getting hurt when we insult the virus.
01:09:46.940 | Let's all be, we have this threat, this common threat that's a threat to everyone of every
01:09:50.620 | nationality and every country, every ethnicity.
01:09:55.380 | And it didn't do that at all.
01:09:57.380 | The whirlpool was too powerful.
01:09:59.000 | So it pulled COVID in, and suddenly masks, if you're on the left, you like them.
01:10:03.380 | If you're on the right, you hate them.
01:10:05.300 | And suddenly lockdowns, if you're on the left, you like them, and on the right, you hate
01:10:08.580 | them.
01:10:09.580 | And vaccines, this is, people forget this.
01:10:11.780 | When Trump first started talking about the vaccine, Biden, Harris, Cuomo, they're all
01:10:18.780 | saying I'm not taking that vaccine, not from this CDC.
01:10:21.380 | - Because it was too rushed or something like that?
01:10:24.020 | - Because I'm not trusting anything that Trump says.
01:10:26.620 | Trump wants me to take it, I'm not taking it.
01:10:28.060 | I'm not taking it from this CDC.
01:10:29.980 | So this was when Trump was almost out of office, but if Trump had stayed in office, I'm pretty
01:10:35.220 | sure it would've stayed that way:
01:10:37.340 | the right likes vaccines, the left doesn't like vaccines.
01:10:40.380 | Instead, the president switched.
01:10:42.180 | And all those people are suddenly saying, they were actually specifically saying that
01:10:46.420 | if you're saying the CDC is not trustworthy, that's misinformation,
01:10:51.540 | which is exactly what they were saying about the other CDC.
01:10:54.380 | And they were saying it because they genuinely didn't trust Trump, which is fair, but now
01:10:58.060 | when other people don't trust the Biden CDC, suddenly it's this kind of misinformation
01:11:02.620 | that needs to be censored.
01:11:03.900 | So it was a sad moment, 'cause there was a month
01:11:08.260 | or so at the very beginning when it felt like a lot of our other squabbles were kind of
01:11:12.860 | like, oh, they're kind of irrelevant right now.
01:11:15.700 | And then very quickly the whirlpool sucked it in.
01:11:19.300 | And in a way where I think it damaged the reputation of these, a lot of the trust in
01:11:23.060 | a lot of these institutions for the long run.
01:11:25.300 | - But there's also an individual psychological impact.
01:11:27.900 | It's like a vicious negative feedback cycle where they were deeply affected on an emotional
01:11:32.700 | level and people just were not their best selves.
01:11:36.380 | - That's definitely true.
01:11:37.380 | I mean, talk about the primitive mind.
01:11:39.900 | I mean, one thing that we've been dealing with for our whole human history is pathogens.
01:11:45.780 | And it's emotional, right?
01:11:47.460 | It brings out, you know, there's really interesting studies where they studied the phenomenon
01:11:56.460 | of disgust, which is one of these universal things, like smiling is universal.
01:12:01.260 | You don't have to ever translate a smile, right?
01:12:03.980 | Throwing your hands up when your sports team wins is universal because
01:12:08.700 | it's part of our coding.
01:12:10.780 | And so is disgust, making this face where you wrinkle up your nose
01:12:13.900 | and you kind of put out your tongue and maybe even gag.
01:12:17.020 | That's to expel, expel whatever, because it's the reaction when something is potentially
01:12:22.220 | a pathogen that might harm us, right?
01:12:24.740 | Feces, vomit, whatever.
01:12:26.700 | But they did this interesting study where people were split into two groups. The control group,
01:12:33.500 | and I might be getting two studies mixed up, was shown images of
01:12:37.940 | things like car crashes, disturbing but not disgusting.
01:12:41.580 | And the other group was shown, you know, rotting things and just things
01:12:45.180 | that were disgusting.
01:12:46.260 | And then they were asked about immigration.
01:12:47.260 | These were Canadians.
01:12:48.740 | And the group that had the disgust feeling pulsing through their body was way more likely
01:12:54.160 | to prefer like immigrants from white countries.
01:13:02.820 | And the group that had been shown car accidents still preferred the
01:13:05.300 | groups from white countries, but much less so.
01:13:05.300 | And so what does that mean?
01:13:06.820 | It's because the disgust impulse makes us scared of, you know, sexual practices that
01:13:12.060 | are foreign, of ethnicities that don't look like us. It's still xenophobia.
01:13:17.180 | So it's ugly.
01:13:18.180 | It's really ugly stuff.
01:13:19.180 | This is, of course, also how that propaganda worked: in Rwanda it
01:13:24.540 | was cockroaches, and with the Nazis it was rats.
01:13:27.500 | And, you know, it's specifically, it's a dehumanizing emotion.
01:13:31.660 | So anyway, we were talking about COVID, but I think it taps
01:13:37.900 | deep into the human psyche, and like you said,
01:13:41.180 | I think it brings out an ugly side in us.
01:13:45.100 | - You describe an idea lab as being opposite of echo chambers.
01:13:50.220 | So we know what echo chambers are.
01:13:51.540 | And you said like, there's basically no good term for the opposite of an echo chamber.
01:13:56.540 | So what's an idea lab?
01:13:57.900 | - Yeah, well, first of all, both of these, we think of an echo chamber as like a group
01:14:01.500 | maybe, or even a place, but it's, it's a culture.
01:14:04.460 | It's an intellectual culture.
01:14:06.820 | And this goes along with the high-rung, low-rung idea. So high-rung and low-rung thinking is individual.
01:14:10.460 | So I was talking about what's going on in your head, but this is very connected to the
01:14:14.660 | social scene around us.
01:14:16.860 | And so groups will do high rung and low rung thinking together.
01:14:24.100 | Basically, an echo chamber to me is collaborative low-rung thinking.
01:14:27.740 | It's a culture based around a sacred set of ideas.
01:14:34.700 | And it's the coolest thing you can do in an echo chamber culture is talk about how great
01:14:39.420 | the sacred ideas are and how bad and evil and stupid and wrong the people are who have
01:14:45.740 | the other views.
01:14:47.580 | And it's quite boring, you know. It's very hard
01:14:53.260 | to learn, and changing your mind is not cool in an echo chamber culture.
01:14:57.540 | It makes you seem wishy-washy.
01:14:58.940 | It makes you seem like you're waffling and flip-flopping or whatever.
01:15:06.700 | Showing conviction about the sacred ideas in echo chamber culture is awesome.
01:15:10.020 | If you're just like, you know, obviously this is true, it makes you seem smart, while being
01:15:14.220 | humble makes you seem dumb.
01:15:15.220 | So now flip all of those things on their heads and you have an, you have the opposite, which
01:15:18.900 | is idea lab culture, which is collaborative high rung thinking.
01:15:21.980 | It's collaborative truth finding.
01:15:23.820 | But it's also just, it's just a totally different vibe.
01:15:26.460 | It's a place where arguing is a fun thing.
01:15:31.620 | It's not, no one's getting offended.
01:15:33.180 | And criticizing like the thing everyone believes is actually, it makes you seem like interesting.
01:15:37.980 | Like, oh, really?
01:15:38.980 | Like, why do you think we're all wrong?
01:15:40.300 | And expressing too much conviction makes people lose trust in you.
01:15:44.300 | It doesn't make you seem smart, it makes you seem stupid if you don't really know what
01:15:46.860 | you're talking about, but you're acting like you do.
01:15:48.940 | - I really like this diagram of where on the X axis is agreement, on the Y axis is decency.
01:15:54.140 | That's in an idea lab.
01:15:55.980 | In echo chamber, there's only one axis.
01:15:58.100 | It's asshole to non-asshole.
01:16:00.300 | - Right.
01:16:01.300 | - I think this is a really important thing to understand about the difference between,
01:16:05.900 | you call it decency here, about asshole-ishness and disagreement.
01:16:10.500 | - So my college friends, we love to argue, right?
01:16:13.060 | And no one thought anyone was an asshole for, it was just for sport.
01:16:16.640 | Sometimes we'd realize we're not even disagreeing on something and that would be disappointing
01:16:19.580 | and be like, oh, I think we agree.
01:16:20.580 | And it was kind of like sad.
01:16:22.060 | It was like, oh, well, there goes the fun.
01:16:24.820 | And one of the members of this group brought her new boyfriend to one of our
01:16:31.180 | hangouts and there was a heated, heated debate.
01:16:35.380 | You know, just one of our typical things.
01:16:37.780 | And afterwards, you know, the next day he said like, is everything okay?
01:16:40.900 | And she was like, what do you mean?
01:16:42.260 | And he said like, after the fight.
01:16:43.260 | And she was like, what fight?
01:16:44.920 | And he was like, you know, the fight last night.
01:16:46.220 | And she was like, you mean like the arguing?
01:16:49.620 | And he was like, yeah.
01:16:51.260 | And so that's someone who is not used to Idea Lab culture coming into it.
01:16:55.220 | And seeing it is like, that was like, this is like, are they still friends, right?
01:16:59.100 | And an idea lab is nice for the people in it because individuals thrive.
01:17:04.100 | You don't want to just conform.
01:17:05.100 | That makes you seem boring in an idea lab.
01:17:06.820 | But you want to be yourself.
01:17:07.820 | You want to challenge things.
01:17:08.820 | You want to have a unique brain.
01:17:10.020 | So that's great.
01:17:11.020 | And you also have people criticizing your ideas, which makes you smarter.
01:17:14.100 | It doesn't always feel good, but you become more correct and smarter.
01:17:18.580 | And Echo Chamber is the opposite, where it's not good for the people in it.
01:17:22.180 | Your learning skills atrophy, and I think it's boring.
01:17:26.860 | But the thing is, they also have emergent properties.
01:17:29.420 | So the emergent property of an Idea Lab is like super intelligence.
01:17:34.620 | Just you and me alone, just the two of us.
01:17:37.620 | If we're working together on something, but we're being really grown up about it, when we're
01:17:42.500 | disagreeing, you know, no one's sensitive about anything.
01:17:45.860 | We're going to each find flaws in the other one's arguments that you wouldn't have found
01:17:48.980 | on your own.
01:17:50.420 | And we're going to have double the epiphanies, right?
01:17:52.780 | So it's almost like the two of us together is like as smart as 1.5.
01:17:56.500 | It's like 50% smarter than either of us alone, right?
01:17:58.460 | So you have this 1.5 intelligent kind of joint being that we've made.
01:18:03.420 | Now bring the third person and fourth person in, right?
01:18:05.540 | It starts to scale up.
01:18:06.540 | And this is why science institutions can discover relativity and quantum mechanics and these
01:18:12.820 | things that no individual human was going to come up with without a ton of collaboration,
01:18:17.860 | because it's this giant Idea Lab.
01:18:19.540 | So it has an emergent property of super intelligence.
01:18:22.180 | An echo chamber is the opposite, where it has the emergent property of stupidity.
01:18:27.380 | I mean, it has the emergent property of a bunch of people all paying fealty to this
01:18:35.260 | set of sacred ideas.
01:18:37.020 | And so you lose this magical thing about language and humans, which is collaborative intelligence,
01:18:43.060 | you lose it.
01:18:44.060 | It disappears.
01:18:45.300 | But there is that axis of decency, which is really interesting, because you kind of painted
01:18:50.420 | this picture of you and your friends arguing really harshly.
01:18:54.000 | But underlying that is a basic camaraderie, respect.
01:19:01.260 | There's all kinds of mechanisms we humans have constructed to communicate mutual respect,
01:19:07.980 | or maybe communicate that you're here for the Idea Lab version of this.
01:19:11.580 | Totally.
01:19:12.580 | You don't get personal, right?
01:19:15.220 | You're not getting personal.
01:19:16.220 | You're not taking things personally.
01:19:20.660 | People are respected in an Idea Lab, and ideas are disrespected.
01:19:24.420 | And there's ways to signal that.
01:19:26.380 | So with friends, you've already done the signaling.
01:19:29.660 | You've already established a relationship.
01:19:31.380 | The interesting thing is online, I think you have to do some of that work.
01:19:35.900 | To me, sort of steelmanning the other side, or no, having empathy and hearing out, being
01:19:42.940 | able to basically repeat the argument the other person is making before you, and showing
01:19:47.380 | like respect to that argument.
01:19:48.820 | I could see how you could think that before you make a counter-argument.
01:19:52.380 | There's just a bunch of ways to communicate that you're here not to do kind of, what is
01:20:00.140 | it, low rung, you know, shit talking, mockery, derision, but are actually here ultimately
01:20:06.620 | to discover the truth in the space of ideas and the tension of those ideas.
01:20:10.660 | And I think it's, I think that's a skill that we're all learning as a civilization of how
01:20:17.340 | to do that kind of communication effectively.
01:20:20.340 | I think disagreement, as I'm learning on the internet, it's actually a really tricky skill,
01:20:24.700 | like high effort, high decency disagreement.
01:20:27.820 | I got to listen to, there's a really good debate podcast, Intelligence Squared.
01:20:34.780 | And like, they can go pretty hard in the paint.
01:20:37.260 | - It's a classic idea lab.
01:20:38.820 | - It's exactly, but like, how do we map that to social media?
01:20:42.700 | When people like will say, will say, well, like Lex or anybody, you're not, you hate
01:20:48.980 | disagreement.
01:20:49.980 | You want to censor disagreement.
01:20:51.700 | No, I love Intelligence Squared type of disagreement.
01:20:55.260 | That's fun.
01:20:56.260 | - You want to reduce assholery.
01:20:58.260 | - And for me personally, I don't want to reduce assholery.
01:21:01.940 | I kind of like assholery, it's like fun in many ways.
01:21:04.860 | But the problem is when the asshole shows up to the party, they make it less fun for
01:21:11.140 | the party that's there for the idea lab.
01:21:12.940 | And the other people, especially the quiet voices at the back of the room, they leave.
01:21:17.260 | And so all you're left is with assholes.
01:21:20.220 | - Well, Twitter, political Twitter to me is one of those parties.
01:21:23.580 | It's a big party where a few assholes have really sent a lot of the quiet thinkers away.
01:21:34.740 | And so if you think about this graph again, what, some place like Twitter, a great way
01:21:43.300 | to get followers is to be an asshole, you know, pumping a certain ideology.
01:21:49.380 | You'll get a huge amount of followers.
01:21:50.980 | And for those followers, and the followers you're going to get, the people who like you
01:21:56.780 | are probably going to be people who are really thinking with their primitive mind because
01:22:00.660 | they're seeing you're being an asshole, but because you agree with them, they love you.
01:22:06.140 | And they think, they don't see any problem with how you're being.
01:22:07.980 | - Yeah, they don't see the asshole.
01:22:09.260 | This is a fascinating thing.
01:22:10.260 | - Well, because look, look at the thing on the right.
01:22:12.180 | Decent and decency are the same.
01:22:14.620 | So if you're in that mindset, the bigger the asshole, the better.
01:22:17.380 | If you're agreeing with me, you're my man.
01:22:18.820 | I love what you're saying.
01:22:19.820 | Yes, show them.
01:22:20.820 | Right?
01:22:21.820 | And the algorithm helps those people.
01:22:23.860 | Those people do great on the algorithm.
01:22:25.820 | - There's a fascinating dynamic that happens because I have currently hired somebody that
01:22:31.420 | looks at my social media and they block people because the assholes will roll in.
01:22:35.580 | They're not actually there to have an interesting disagreement, which I love.
01:22:40.340 | They're there to do kind of mockery.
01:22:42.700 | And then when they get blocked, they then celebrate that to their echo chamber.
01:22:47.940 | Like, look at this.
01:22:48.940 | I got them or whatever.
01:22:51.020 | - Or they'll say some annoying thing. Like, if I had done
01:22:56.140 | this, they'll say, oh, he says he likes Idea Labs, but he actually wants to create an echo
01:23:00.080 | chamber.
01:23:01.080 | I'm like, nope, you're an asshole.
01:23:04.820 | Look at the other 50 people on this thread that disagreed with me respectfully.
01:23:07.860 | They're not blocked.
01:23:08.860 | - Yep, exactly.
01:23:09.860 | - And so they see it as some kind of hypocrisy because again, they only see the thing on
01:23:13.580 | the right.
01:23:14.900 | And they're not understanding that there's two axes or that I see it as two axes.
01:23:18.940 | And so you seem petty in that moment, but it's like, no, no, no, this is very specific
01:23:22.860 | what I'm doing.
01:23:23.860 | You're actually killing the conversation.
01:23:26.300 | - And generally, I give all those folks a pass and just send them love telepathically.
01:23:31.540 | But yes, getting rid of assholes in the conversation is the way you allow for the disagreement.
01:23:38.520 | You do a lot of like, I think when primitive mindedness comes at you, at least on Twitter,
01:23:44.340 | I don't know what you're feeling internally in that moment, but you do a lot of like,
01:23:48.840 | I'm gonna meet that with my higher mind.
01:23:50.740 | And you come out and you'll be like, thanks for all the criticism, I love you.
01:23:56.440 | And that's actually an amazing response because what it does is it un-riles that person's
01:24:07.740 | primitive mind and actually wakes up their higher mind who says, oh, okay, you know,
01:24:10.820 | this guy's not so bad.
01:24:11.820 | And suddenly like, civility comes back.
01:24:14.380 | So it's a very powerful--
01:24:15.380 | - Hopefully long term.
01:24:17.060 | But the thing is, they do seem to drive away high quality disagreement.
01:24:23.060 | 'Cause it takes so much effort to disagree in a high quality way.
01:24:28.260 | - I've noticed this on my blog.
01:24:30.620 | One of the things I pride myself on is like my comment section is awesome.
01:24:36.660 | They're being respectful.
01:24:38.940 | No one's afraid to disagree with me and say, tear my post apart, but in a totally respectful
01:24:44.080 | way where the underlying thing is like, I'm here 'cause I like this guy and his writing.
01:24:48.620 | And people disagree with each other and they get in these long, and it's interesting and
01:24:51.380 | I read it and I'm learning.
01:24:53.020 | And then I, a couple posts, especially the ones I've written about politics, it's not
01:24:56.780 | like it seems like any other comment section.
01:24:59.140 | People are being nasty to me, they're being nasty to each other.
01:25:02.200 | And then I looked down one of them and I realized like, almost all of this is the work of a
01:25:06.640 | group of like three people.
01:25:08.460 | That's who you need to block.
01:25:09.660 | Those people need to be blocked.
01:25:10.660 | You're not being thin-skinned, you're not being petty doing it.
01:25:13.700 | You're actually protecting an idea lab.
01:25:17.180 | Because what really aggressive people like that do is they'll turn it into their own
01:25:20.420 | echo chamber.
01:25:21.820 | Because now everyone is scared to kind of disagree with them, it's unpleasant.
01:25:24.760 | And so people who will chime in are the people who agree with them and suddenly they've taken
01:25:28.540 | over the space.
01:25:29.540 | - And I kind of believe that those people on a different day could actually do high-effort
01:25:33.980 | disagreement, it's just that they're in a certain kind of mood.
01:25:37.620 | And a lot of us, just like you said, with a primitive mind, could get into that mood.
01:25:41.900 | And I believe it's actually the job of the technology, the platform, to incentivize those
01:25:47.680 | folks to be like, "Are you sure this is the best you can do?
01:25:52.180 | If you really want to talk shit about this idea, do better."
01:25:57.260 | And then we need to create incentives where you get likes for high-effort disagreement.
01:26:02.740 | Because currently you get likes for something that's slightly funny and is a little bit
01:26:08.580 | like mockery.
01:26:10.580 | Like, basically signals to some kind of echo chamber that this person is a horrible person,
01:26:18.060 | is a hypocrite, is evil, whatever.
01:26:20.460 | That feels like it's solvable with technology.
01:26:23.020 | Because I think in our private lives, none of us want that.
01:26:26.140 | - I wonder if it's making me think that I want a like, because a much easier way for
01:26:30.460 | me to do it just for my world would be to say something like, "Here's this axis.
01:26:37.460 | This is part of what I like about the latter."
01:26:40.460 | It's a language that we can use.
01:26:42.460 | Specifically what we're talking about is high-rung disagreement, good.
01:26:47.020 | Low-rung disagreement, bad.
01:26:48.980 | So it gives us a language for that.
01:26:50.460 | So what I would say is I would have my readers understand this axis.
01:26:56.000 | And then I would specifically say something like, "Please do the Wait But Why comment section a favor
01:27:03.280 | and upvote regardless of what they're saying horizontally, regardless of what their actual
01:27:07.780 | view is.
01:27:08.780 | Upvote high-rungness.
01:27:09.780 | They could be tearing me apart.
01:27:11.700 | They can be saying great, they can be praising me, whatever.
01:27:16.060 | Upvote high-rungness and downvote low-rungness."
01:27:19.040 | And if enough people are doing that, suddenly there's all this incentive to try to say,
01:27:22.180 | "I need to calm my emotion down here and not be personal because I'm going to get voted
01:27:26.400 | into oblivion by these people."
01:27:28.400 | I think a lot of people would be very good at that.
01:27:32.680 | And not only would they be good at that, they would want that.
01:27:36.880 | That task of saying, "I know I completely disagree with this person, but this was a
01:27:40.360 | high-effort, high-rung disagreement."
01:27:44.680 | It gets everyone thinking about that other axis too.
01:27:46.280 | You're not just looking at where do you stand horizontally.
01:27:48.400 | You're saying, "Well, how did you get there and how are you...
01:27:51.840 | Are you treating ideas like machines or are you treating them like little babies?"
01:27:56.040 | - And that there should be some kind of labeling on personal attacks versus idea disagreement.
01:28:01.200 | Sometimes people throw in both a little bit.
01:28:03.080 | - Right.
01:28:04.080 | - That's like, "All right, no, there should be a disincentive at personal attacks versus
01:28:07.040 | idea attacks."
01:28:08.040 | - Well, you can also...
01:28:09.040 | One metric is a respectful disagreement.
01:28:12.640 | If I see, just say someone else's Twitter and I see you put out a thought and I see
01:28:17.240 | someone say, "I don't see it that way.
01:28:23.200 | Here's where I think you went wrong," and they're just explaining.
01:28:26.040 | I'm thinking that if Lex reads that, he's going to be interested.
01:28:29.440 | He's going to want to post more stuff, right?
01:28:31.120 | He's going to like that.
01:28:32.120 | If I see someone being like, "Wow, this really shows the kind of person that you become,"
01:28:37.320 | or shows something...
01:28:38.320 | I'm thinking, "That person is making Lex want to be on Twitter less."
01:28:41.240 | It's making him...
01:28:42.240 | And so what's that doing?
01:28:43.240 | What that person's actually doing is chilling discussion,
01:28:46.400 | because they're making it unpleasant, making it scary to say what you think.
01:28:50.920 | And the first person isn't at all.
01:28:52.160 | The first person is making you want to say more stuff.
01:28:54.880 | And those are people who both disagree with you.
01:28:57.680 | - Exactly, exactly.
01:28:59.240 | I want to...
01:29:00.240 | Great disagreements with friends in meat space is like you're...
01:29:06.360 | They disagree with you.
01:29:08.240 | They could be even yelling at you.
01:29:10.440 | Honestly, they could even have some shit talk where it's like personal attacks.
01:29:14.080 | It still feels good.
01:29:15.360 | - Because you know them well and you know that that shit talk...
01:29:18.280 | Because, yeah, friends shit talk all the time playing a sport or a game.
01:29:22.360 | And again, it's because they know each other well enough to know that this is fun.
01:29:27.120 | We're having fun and obviously I love you.
01:29:30.880 | And that's important online.
01:29:32.120 | It's a lot harder.
01:29:33.120 | - Yeah, that "obviously I love you" that underlies a lot of human interaction seems to be easily
01:29:38.960 | lost online.
01:29:39.960 | I've seen some people on Twitter and elsewhere just behave their worst.
01:29:44.600 | And it's like, I know that's not who you are.
01:29:46.840 | Like why are you...
01:29:47.840 | Who is this human?
01:29:49.240 | - I know someone personally who is one of the best people.
01:29:54.280 | I love this guy.
01:29:56.160 | One of the best, fun, funny, nicest dudes.
01:30:01.160 | And if you looked at his Twitter only, you would think he's a culture warrior, an awful
01:30:05.940 | culture warrior.
01:30:07.800 | And biased and just stoking anger.
01:30:14.040 | And it comes out of a good place.
01:30:15.040 | And I'm not going to give any other info about it, specific.
01:30:17.840 | But like...
01:30:18.840 | - I think you're describing a lot of people.
01:30:19.840 | - It comes out of a good place because he really cares about what...
01:30:21.720 | You know, it comes out...
01:30:22.760 | But it's just, I can't square the two.
01:30:24.440 | And that's...
01:30:25.440 | You have to...
01:30:26.440 | Once you know someone like that, you can realize, okay, apply that to everyone.
01:30:28.960 | Because a lot of these people are lovely people.
01:30:31.480 | Even just, you know, back before social media,
01:30:34.280 | did you ever have a friend who had this
01:30:38.160 | dickishness on text or email that they didn't have in person?
01:30:41.760 | And you're like, "Wow, email you is kind of a dick."
01:30:44.640 | And it's like, it just...
01:30:45.720 | Certain people have a different persona behind the screen.
01:30:48.440 | - It has, for me personally, become a bit of a meme that Lex blocks with love.
01:30:54.000 | But there is a degree to that where this is...
01:30:56.400 | I don't see people on social media as representing who they really are.
01:30:59.640 | I really do have love for them.
01:31:01.160 | I really do think positive thoughts of them.
01:31:03.160 | Throughout the entirety of the experience.
01:31:04.600 | I see this as some weird side effect of online communication.
01:31:09.800 | And so it's like, to me, blocking is not some kind of a derisive act towards that individual.
01:31:16.320 | It's just like saying...
01:31:17.320 | - Well, a lot of times what's happened is they have slipped into a very common delusion
01:31:23.760 | that dehumanizes others.
01:31:26.120 | So that doesn't mean they're a bad person.
01:31:27.240 | We all can do it.
01:31:28.240 | But they're dehumanizing you or whoever they're being nasty to, because in a way they would
01:31:33.720 | never do in person.
01:31:34.840 | Because in a person, they're reminded that's a person.
01:31:36.840 | Remember I said the dumb part of my brain when I'm doing VR won't step off the cliff,
01:31:41.240 | but the smart part of my brain knows I'm just on the rug.
01:31:43.640 | That dumb part of our brain is really dumb in a lot of ways.
01:31:47.760 | It's the part of your brain where you can set the clock five minutes fast to help you
01:31:51.800 | not be late.
01:31:52.800 | The smart part of your brain knows that you did that, but the dumb part will fall for it.
01:31:56.760 | So that same dumb part of your brain can forget that the person behind that screen, behind
01:32:01.200 | that handle is a human that has feelings.
01:32:05.040 | And that doesn't mean they're a bad person for forgetting that, because it's possible.
01:32:08.600 | - Well, this really interesting idea, and I wonder if it's true that you're right, is
01:32:12.920 | that both primitive mindedness and high mindedness tend to be contagious.
01:32:18.760 | I hope you're right that it's possible to make both contagious.
01:32:23.400 | Because our sort of popular intuition is only one of them, the primitive mindedness is contagious,
01:32:31.000 | as exhibited by social media.
01:32:32.680 | - To compliment you again, your Twitter, to me, I was just
01:32:37.480 | looking down it, and it's just high mindedness.
01:32:41.880 | It's just high mindedness, down, down, down, down, down.
01:32:44.380 | It's gratitude, it's optimism, it's love, it's forgiveness, it's all these things that
01:32:48.880 | are the opposite of grievance and victimhood and resentment and pessimism, right?
01:32:54.280 | And there's I think a reason that a lot of people follow you, because it is contagious.
01:33:00.480 | It makes other people feel those feelings.
01:33:02.200 | - I don't know, I've been recently, over the past few months, attacked quite a lot.
01:33:08.720 | And it's fascinating to watch, because it's over things that, I think I probably have
01:33:13.400 | done stupid things, but I'm being attacked for things that are totally not worthy of
01:33:18.280 | attack.
01:33:19.280 | I got attacked for a book list.
01:33:22.160 | - I saw that, by the way, I thought it was great.
01:33:24.840 | - But you can always kind of find ways to, I guess the assumption is, this person surely
01:33:32.280 | is a fraud, or some other explanation, he sure has dead bodies in the basement he's
01:33:37.240 | hiding or something like this, and then I'm going to construct a narrative around that
01:33:41.200 | and mock and attack that.
01:33:42.480 | I don't know how that works, but there is, there does, and I think you write this in
01:33:46.920 | the book, there seems to be a gravity pulling people towards the primitive mind.
01:33:52.080 | - When it comes to anything political, right, religious, certain things are bottom heavy,
01:33:58.840 | you know, for our psyche.
01:34:01.640 | They have a magnet that pulls our psyches downwards on the ladder.
01:34:05.320 | And why, why does politics pull our psyches down on the ladder?
01:34:09.200 | Because for the tens of thousands of years that we were evolving, you know, during human
01:34:17.800 | history, it was life or death.
01:34:20.800 | Politics was life or death.
01:34:22.500 | And so there's actually an amazing study where it's like, they challenged like 20 different
01:34:30.600 | beliefs of a person.
01:34:32.960 | And different parts of the person's brain, and they had an MRI going, different parts
01:34:36.960 | of the person's brain lit up when non-political beliefs were challenged versus political beliefs
01:34:41.000 | were challenged.
01:34:42.080 | When non-political beliefs were challenged, the
01:34:46.560 | rational, like the prefrontal cortex type areas were lit up.
01:34:52.020 | When the political beliefs were challenged, and I'm getting over my head here, but it's
01:34:55.600 | like the parts of your brain, the default mode network, the parts of your brain associated
01:34:58.880 | with like introspection and like your own identity were lit up.
01:35:03.840 | And they were much more likely to change their mind on all the beliefs, the non-political
01:35:09.080 | beliefs.
01:35:10.080 | When that default mode network part of your brain lit up, you were going to, if anything,
01:35:15.160 | get more firm in those beliefs when you had them challenged.
01:35:18.260 | So politics is one of those topics that just literally lights up different parts
01:35:24.560 | of our brain.
01:35:25.560 | And again, I think we come back to primitive mind, higher mind here.
01:35:28.200 | This is one of the things our primitive mind comes programmed
01:35:33.680 | to care a ton about.
01:35:34.760 | And so it's going to be very hard for us to stay rational and calm and looking for truth
01:35:40.400 | because we have all this gravity to it.
01:35:42.080 | - Well, it's weird because politics, like what is politics?
01:35:44.600 | Like you talk about, it's a bunch of different issues and each individual issue, if we really
01:35:49.280 | talk about it- - Yeah, tax policy.
01:35:50.800 | Like why are we being emotional about this?
01:35:52.760 | - I don't think we're actually that, I mean, yeah, we're emotional about something else.
01:35:57.560 | - Yeah, I think what we're emotional about is my side, the side I've identified with
01:36:03.000 | is in power and making the decisions and your side is out of power.
01:36:08.040 | And if your side's in power, that's really scary for me because that goes back to the
01:36:12.360 | idea of who's pulling the strings in this tribe, right?
01:36:16.980 | Who's the chief?
01:36:17.980 | Is it your family's patriarch or is it mine?
01:36:21.520 | We might not have food if we don't win this kind of whatever, chief election.
01:36:26.560 | So I think that it's not about the tax policy or anything like that.
01:36:30.800 | And then it gets tied to this like broader, I think a lot of our tribalism has really
01:36:36.000 | coalesced around this.
01:36:37.120 | We don't have that much religious tribalism in the US, right?
01:36:39.480 | Not like, you know, the Protestants and the Catholics hating each other.
01:36:42.120 | We don't have that really, right?
01:36:44.080 | And honestly, people like to say we have racial tribalism and everything, but even a kind
01:36:51.800 | of a racist white conservative guy, I think takes the black conservative over the woke
01:36:57.840 | white person any day of the week right now.
01:37:00.000 | So that's the strongest source of division.
01:37:01.480 | It tells me that I think politics is way stronger tribalism right now.
01:37:05.200 | I think that that white racist guy loves the black conservative guy compared to the white
01:37:12.040 | woke guy, right?
01:37:13.040 | So again, not that racial tribalism isn't a thing.
01:37:16.440 | Of course, it's always a thing, but like political tribalism is the number one right now.
01:37:21.560 | So race is almost a topic for the political division versus the actual sort of element
01:37:26.400 | of the tribe.
01:37:27.400 | It's a political football.
01:37:28.400 | Yeah.
01:37:29.400 | Yeah.
01:37:30.400 | So there's a, I mean, it's, this is dark because, so this is a book about human civilization.
01:37:36.520 | This is a book about human nature, but it's also a book of politics about politics.
01:37:44.160 | It is just the way you list it out in the book.
01:37:48.680 | It's kind of dark how we just fall into these left and right checklists.
01:37:54.640 | So if you're on the left, it's maintain Roe v. Wade, universal healthcare good, mainstream
01:38:01.640 | media fine, guns kill people, US is a racist country, protect immigrants, tax cuts bad,
01:38:07.480 | climate change awful, raise minimum wage.
01:38:09.160 | And on the right is the flip of that, reverse Roe v. Wade, universal healthcare bad, mainstream
01:38:14.600 | media bad, people kill people, not guns kill people.
01:38:17.880 | US was a racist country, protect borders, tax cuts good, climate change overblown, don't
01:38:23.560 | raise minimum wage.
01:38:24.560 | I mean, it has, you almost don't have to think about any of this.
01:38:28.160 | It's like literally.
01:38:29.160 | Well, so when you say it's a book about politics, it's interesting because it's a book about
01:38:33.120 | the vertical axis.
01:38:34.120 | Right.
01:38:35.120 | It's specifically not a book about the horizontal axis in that I'm not talking, I don't actually
01:38:38.760 | talk about any of these issues.
01:38:40.360 | I don't put out an opinion on them.
01:38:43.200 | Those are all horizontal, right?
01:38:45.440 | But when you, so rather than arguing or having another book about those issues, about right
01:38:50.440 | versus left, I wanted to do a book about this other axis.
01:38:54.600 | And so on this axis, the reason I had this checklist is that this is a low part of the
01:39:00.520 | low rung politics world, right?
01:39:03.800 | Low rung politics is a checklist and that checklist evolves, right?
01:39:08.000 | Like Russia suddenly is like popular with the right as opposed to, you know, it used
01:39:11.400 | to be, you know, in the sixties that left was the one defending Stalin.
01:39:14.960 | So they'll switch.
01:39:15.960 | It doesn't even matter.
01:39:16.960 | The substance doesn't matter.
01:39:17.960 | This is the approved checklist of the capital P party.
01:39:20.680 | And this is what everyone believes.
01:39:22.360 | That's a low rung thing.
01:39:23.680 | The high rungs, this is not what high rung politics is like.
01:39:28.400 | You tell me your one view on this.
01:39:29.640 | I have no idea what you think about anything else, right?
01:39:32.440 | And you're going to say, I don't know about a lot of stuff because inherently you're not
01:39:36.440 | going to have that strong an opinion, because you don't have that much info and these are
01:39:38.960 | complex things.
01:39:39.960 | So there's a lot of, I don't know.
01:39:42.080 | And people are all over the place.
01:39:44.000 | You know you're talking to someone who has been subsumed with
01:39:48.440 | low rung politics
01:39:50.120 | when, if they tell you their opinion on any one of these issues, you could just
01:39:54.400 | rattle off their opinion on every single other one.
01:39:56.960 | And if in three years it becomes fashionable to have some new view, they're
01:40:00.760 | going to have it. That's not thinking, that's echo chamber culture.
01:40:05.240 | And I've been using kind of a shorthand of centrist to describe this kind of a high rung
01:40:11.680 | thinking, but people tend to, I mean, it's, it seems to be difficult to be a centrist
01:40:16.400 | or whatever, a high rung thinker.
01:40:18.440 | It's like people want to label you as a person who's too cowardly to take a stance somehow
01:40:26.040 | as opposed to asking, saying, I don't know, as a first statement.
01:40:28.920 | Well, the problem with centrist is that would mean that on each of these, tax cuts bad, tax
01:40:34.080 | cuts good,
01:40:35.080 | it means that you are saying, I think we should have some tax cuts, but not that
01:40:38.560 | many.
01:40:39.560 | You might not think that. You might actually go do some research.
01:40:41.640 | You say, actually, I think tax cuts are really important.
01:40:45.600 | That doesn't mean, oh, I'm not a centrist anymore.
01:40:47.160 | I guess I'm a far, you know, no, no, no.
01:40:49.440 | That's why we need the second axis.
01:40:50.840 | So what you're trying to be when you say centrist is high rung, which means you might be all
01:40:54.160 | over the place horizontally.
01:40:55.660 | You might agree with the far left on this thing, the far right on this thing, you might
01:40:58.560 | agree with the centrists on this thing, but calling yourself a centrist actually
01:41:03.020 | is putting yourself in a prison on the horizontal axis.
01:41:07.480 | And it's saying that, you know, on the different topics, I'm
01:41:10.840 | right in between the two policy-wise.
01:41:12.960 | That's not where you are.
01:41:14.080 | So yeah, that's what we, we're badly missing this other axis.
01:41:17.080 | Yeah.
01:41:18.080 | I mean, I still do think it's like, for me, I am a centrist when you project it down to
01:41:24.680 | the horizontal, but the point is you're missing so much data by not considering the vertical
01:41:30.600 | because like on average, maybe it falls somewhere in the middle, but in reality, there's just
01:41:35.720 | a lot of nuance issue to issue that involves just thinking and uncertainty and changing
01:41:40.840 | in the, given the context of the current geopolitics and economics is just always considering,
01:41:47.800 | always questioning, always evolving your views, all of that.
01:41:50.640 | Not just, not just about like, oh, I think we should be in the center on this.
01:41:53.860 | But another way to be in the center is if there's some phenomenon happening, you know,
01:41:58.280 | there's a terrorist attack, you know, and one side wants to say, this has nothing to
01:42:02.700 | do with Islam.
01:42:03.700 | And the other one, the other side wants to say, this is radical Islam, right?
01:42:08.480 | What's in between those, the saying, this is complicated and nuanced and we have to
01:42:11.940 | learn more.
01:42:12.940 | And it probably has something to do with Islam and something to do with the economic circumstances
01:42:17.060 | and something to do with, you know, geopolitics.
01:42:19.660 | So in a case like that, you actually do get really un-nuanced when you go to the extremes
01:42:24.740 | and all of that nuance, which is where all the truth usually is, is going to be in the
01:42:27.980 | middle.
01:42:28.980 | So yeah.
01:42:29.980 | - And there's a lot of truth to the fact that if you take that nuance on those issues, like
01:42:33.420 | war in Ukraine, COVID, you're going to be attacked by both sides.
01:42:37.900 | - Yes.
01:42:38.900 | People who have, who are really strongly on one side or the other hate centrist people.
01:42:43.780 | I've gotten this myself and you know, the, this, the, the slur that I've had thrown at
01:42:47.860 | me is I'm an enlightened centrist in a very mocking way.
01:42:51.620 | So what are they actually saying?
01:42:52.620 | What does enlightened centrist mean?
01:42:53.620 | It means someone who, you know, Steven Pinker or Jonathan Haidt get accused of this,
01:42:58.040 | that they're up in their highfalutin, you know, intellectual world, and they don't actually
01:43:03.260 | take a side.
01:43:05.840 | They don't actually get their hands dirty and they can be superior to both sides without
01:43:10.460 | actually taking a stand.
01:43:11.460 | Right.
01:43:12.460 | So I see the argument and I disagree with it because I firmly believe that the hardcore
01:43:17.740 | tribes, they think they're taking a stand and they're out in the streets and they're
01:43:21.660 | pushing for something.
01:43:22.660 | I think what they're doing is they're just driving the whole country downwards.
01:43:25.260 | And I think they're, they're hurting all the causes they care about.
01:43:28.380 | And so it's not that, it's not that, you know, it's not that we need everyone to be sitting
01:43:31.420 | there, you know, refusing to take a side.
01:43:33.180 | It's that you can be far left and far right, but be upper left and upper right.
01:43:37.260 | If we talk about the, you use the word liberal a lot in the book to mean something that we
01:43:43.260 | don't in modern political discourse mean.
01:43:45.540 | So it's this higher philosophical view.
01:43:48.060 | And then you use the words progressive to mean the left and conservative to mean the
01:43:53.180 | right.
01:43:54.180 | Can you describe the concept of liberal games and power games?
01:43:58.340 | So the power games is, is what I call the like, basically just the laws of nature as
01:44:05.540 | the, when laws of nature are the laws of the land, that's the power game.
01:44:10.140 | So animals, watch any David Attenborough special.
01:44:13.860 | And when the little lizard is running away from the, you know, the bigger animal or whatever,
01:44:19.940 | I use an example of a bunny and a bear.
01:44:21.540 | I don't even know if bears eat bunnies.
01:44:22.900 | They probably don't, but pretend bears eat bunnies.
01:44:24.460 | All right.
01:44:25.460 | So it's like in the power games, the bear is chasing the bunny.
01:44:28.780 | There's no fairness.
01:44:29.780 | There's no, okay, well, what's right?
01:44:31.100 | What's legal?
01:44:32.540 | No, no, no.
01:44:33.540 | If the bear is fast enough, it can eat the bunny.
01:44:36.460 | If the bunny can get away, it stays alive.
01:44:39.820 | So that's it.
01:44:40.820 | That's the only rule.
01:44:41.820 | Now humans have spent a lot of time in essentially that environment.
01:44:45.620 | So when you have a totalitarian dictatorship, it's, and so what's the rule of the power
01:44:50.260 | games?
01:44:51.260 | The bear can do whatever they want if they have the power to do so.
01:44:53.300 | It's just a game of power.
01:44:54.620 | So if the bunny gets away, the bunny actually has more power than the bear in that situation.
01:44:58.300 | Right.
01:44:59.300 | And likewise, the totalitarian dictatorship, there's no rules.
01:45:02.540 | A dictator can do whatever they want.
01:45:04.220 | They can torture, they can, you know, flatten a rebellion with a
01:45:08.340 | lot of murder because they have the power to do so.
01:45:10.740 | What are you going to do?
01:45:11.740 | Right.
01:45:12.740 | And that's, that's kind of the state of nature.
01:45:14.180 | That's our natural way.
01:45:15.180 | You know, when you look at a mafia, watch a mafia movie, you know,
01:45:19.540 | we have it in us.
01:45:21.180 | We all can snap into power games mode when it becomes all about, you know,
01:45:28.980 | just actual raw power.
01:45:30.500 | Now the liberal games is, is, you know, something that civilizations for thousands of years
01:45:36.300 | have been working on.
01:45:37.300 | It's not invented by America or modern times, but America's kind of was like the latest
01:45:42.100 | crack at it yet, which is this idea instead of everyone can do what they want if they
01:45:46.780 | have the power to do so, it's everyone can do what they want as long as it doesn't harm
01:45:49.980 | anyone else.
01:45:50.980 | Now that's really complicated.
01:45:51.980 | How do you define harm?
01:45:52.980 | And the idea is that everyone has a list of rights which are protected by the
01:45:57.580 | government, and then they have their inalienable rights, and those are protected, you
01:46:02.780 | know, from an invasion by other people.
01:46:08.620 | And so you have this kind of fragile balance.
01:46:11.020 | And so the idea with the liberal games is you, that there are laws, but it's not totalitarian.
01:46:16.500 | They will build very clear, strict laws kind of around the edges of what you can and can't do.
01:46:22.660 | And then everything else, freedom.
01:46:24.340 | So unlike a totalitarian dictatorship, actually it's, it's very loose.
01:46:27.740 | You can, there's a lot of things can happen and it's kind of up to the people, but there
01:46:31.500 | are still laws that protect the very basic inalienable rights and stuff like that.
01:46:34.820 | So it's this much looser thing.
01:46:36.020 | Now, the vulnerability there... so the benefits of it are obvious,
01:46:44.140 | right?
01:46:45.140 | Freedom is great.
01:46:46.140 | It's fair.
01:46:47.140 | They, you know, that, that equality of opportunity seemed like the most fair thing and, and,
01:46:53.900 | you know, equality before the law, you know, due process and all of this stuff.
01:46:57.340 | So it seems fair to the founders of the US and other enlightenment thinkers.
01:47:01.420 | And it also is a great way to manifest productivity, right?
01:47:04.940 | You know, you have, you have Adam Smith saying it's not from the benevolence of the butcher
01:47:09.500 | or the baker that we get our dinner, but from their own self-interest.
01:47:12.420 | So you have, you can harness kind of selfishness for, for progress.
01:47:16.260 | But it has a vulnerability, which is that, it's like the totalitarian
01:47:21.540 | laws: they don't have an excess of laws for no reason.
01:47:25.460 | They want to control everything.
01:47:26.740 | And in the US we say, we're not going to do that.
01:47:29.060 | And so it's almost two puzzle pieces.
01:47:31.900 | You have the laws and then you've got a liberal culture.
01:47:35.620 | Liberal laws have to be married to liberal culture, kind of a defense of liberal spirit
01:47:40.820 | in order to truly have the liberal games going on.
01:47:44.100 | And so that's vulnerable because free speech, you can have the first amendment, that's the
01:47:48.980 | laws part.
01:47:50.140 | But if, if you're in a culture where anyone who, you know, speaks out against orthodoxy
01:47:56.660 | is going to be shunned from the community, well, you're lacking the second piece of the
01:48:00.420 | puzzle there.
01:48:01.420 | You're lacking liberal culture.
01:48:02.500 | And so therefore, you might as well not even have the
01:48:06.700 | first amendment.
01:48:08.080 | And there's a lot of examples like that where the culture has to do its part for the true
01:48:11.860 | liberal games to be enjoyed.
01:48:14.380 | So it's just much more complicated and much more nuanced than the power games.
01:48:17.740 | It's kind of, it's kind of a set of basic laws that then are coupled with a basic spirit
01:48:25.100 | to create this very awesome human environment that's also very vulnerable.
01:48:32.420 | So what do you mean the culture has to play along?
01:48:34.320 | So for something like a freedom of speech to work, there has to be a basic, what, decency?
01:48:41.300 | That if all people are perfectly good, then perfect freedom without any restrictions is
01:48:46.900 | great.
01:48:47.900 | It's where the human nature starts getting a little iffy.
01:48:50.700 | We start being cruel to each other, we start being greedy and desiring of harm, and also
01:48:56.980 | the narcissists and sociopaths and psychopaths in society, all of that, that's when you start
01:49:01.980 | to have to inject some limitations on that freedom.
01:49:05.020 | Yeah, I mean if, so what the government basically says is we're going to let everyone be mostly
01:49:12.340 | free, but no one is going to be free to physically harm other people or to steal their property,
01:49:20.740 | right?
01:49:21.860 | And so we're all agreeing to sacrifice that, you know, that 20% of our freedom, and then
01:49:27.500 | in return, all of us in theory can be 80% free, and that's kind of the bargain.
01:49:34.700 | But now that's a lot of freedom to leave people with, and a lot of people choose, it's like
01:49:39.900 | you're so free in the US, you're actually free to be unfree if you choose.
01:49:42.840 | That's kind of what an echo chamber is to me.
01:49:45.100 | It's, you know, you can choose to kind of be friends with people who essentially make
01:49:53.980 | it so uncomfortable to speak your mind that it's no actual effective difference for you
01:50:02.780 | than if you lived in a country without free speech.
01:50:04.340 | If you can't, you know, criticize Christianity in a certain community, you have a First Amendment,
01:50:11.180 | so you're not going to get arrested by the government for criticizing Christianity.
01:50:16.300 | But if you have this, if the social penalties are so extreme that it's just never worth
01:50:22.460 | it, you might as well be in a country that imprisons people for criticizing Christianity.
01:50:29.420 | And so that same thing goes for wokeness, right?
01:50:31.860 | This is what people get, you know, cancel culture and stuff.
01:50:34.700 | So when the reason these things are bad is because they're actually, they're depriving
01:50:40.180 | Americans of the beauty of the freedom of the liberal games by, you know, imposing a
01:50:46.620 | social culture that is very Power Games-esque.
01:50:49.300 | It's basically, a Power Games culture comes in and you might as well be in the Power Games.
01:50:54.460 | And so, if you live in a liberal democracy, there will always be challenges to a liberal
01:51:01.460 | culture, lowercase l, liberal.
01:51:05.860 | There'll always be challenges to a liberal culture from people who are much more interested
01:51:10.140 | in playing the Power Games.
01:51:12.740 | And there has to be kind of an immune system that stands up to that culture and says, "That's
01:51:16.220 | not how we do things here in America, actually.
01:51:18.740 | We don't excommunicate people for not having the right religious beliefs, and, you
01:51:23.500 | know, we don't disinvite a speaker from campus for having the wrong political beliefs."
01:51:28.060 | And if it doesn't stand up for itself, it's like the immune system of the country failing
01:51:33.220 | and Power Games rushes in.
01:51:37.900 | So before chapter four in your book, and the chapters that will surely result in you being
01:51:45.260 | burned at the stake, you write, "We'll start our pitchfork tour in this chapter by taking
01:51:51.420 | a brief trip through the history of the Republican Party.
01:51:54.020 | Then in the following chapters, we'll take a Tim's-career-tanking deep dive into America's
01:52:00.140 | social justice movement," as you started to talk about.
01:52:03.260 | Okay, so let's go.
01:52:05.980 | What's the history of the Republican Party?
01:52:08.500 | I'm looking at this through my vertical ladder.
01:52:10.380 | What is this familiar story of the Republicans from the '60s to today, what does it look
01:52:17.140 | like through the vertical lens?
01:52:18.940 | Right?
01:52:19.940 | Does it look different?
01:52:21.740 | And is there an interesting story here that's been kind of hidden because we're always looking
01:52:24.500 | at the horizontal?
01:52:25.500 | Now, the horizontal story, you'll hear people talk about it and they'll say something like
01:52:29.820 | the Republicans have moved farther and farther to the right.
01:52:36.740 | And to me, that's not really true.
01:52:38.980 | Like, was Trump more right-wing than Reagan?
01:52:42.060 | I don't think so.
01:52:43.060 | I think he's left.
01:52:44.060 | In terms of actual policy, yeah.
01:52:45.060 | Yeah.
01:52:46.060 | So we're using this, again, it's just like you're calling yourself centrist when it's
01:52:48.340 | not exactly what you mean, even though it also is.
01:52:52.380 | So again, I was like, "Okay, look, this vertical lens helps with other things.
01:52:56.020 | Let's apply it to the Republicans."
01:52:57.020 | And here's what I saw is I looked at the '60s and I saw an interesting story, which I don't
01:53:04.260 | think not everyone's familiar with what happened in the early '60s.
01:53:08.900 | But in 1960, the Republican Party was a plurality.
01:53:14.460 | You had progressives, like Nelson Rockefeller, pretty progressive people, all the way to
01:53:20.660 | – then you had the moderates like Eisenhower and Dewey.
01:53:25.180 | And then you go all the way to the farther right, you had Goldwater and what you might
01:53:30.340 | call – I call them the fundamentalists.
01:53:34.900 | And so it's this interesting plurality, right?
01:53:39.140 | Something we don't have today.
01:53:40.420 | And what happened was the Goldwater contingent, which was the underdog, they were small, right?
01:53:47.540 | Eisenhower was the president, or had just been the president, and was – it seemed
01:53:52.400 | like the moderates were – he said, "You have to be close to the center of the chess
01:53:56.060 | board.
01:53:57.060 | That's how you maintain power."
01:53:58.980 | These people were very far from the center of the chess board, but they ended up basically
01:54:02.260 | having like a hostile takeover.
01:54:03.940 | They conquered their own party.
01:54:06.220 | And they did it by breaking all of the kind of unwritten rules and norms.
01:54:12.020 | So they did things like they first started with like the college Republicans, which was
01:54:15.220 | like this feeder group that turned in – a lot of the politicians started there.
01:54:19.500 | And they went to the election and they wouldn't let the current president, the incumbent,
01:54:25.700 | speak.
01:54:26.700 | And they were throwing chairs and there were fistfights.
01:54:28.580 | And eventually people gave up and they just sat there and they sat in the chair talking
01:54:31.100 | for their candidate until everyone eventually left and then they declared victory.
01:54:35.560 | So basically they came in – there was a certain set of rules, agreed upon rules, and
01:54:41.980 | they came in playing the power games, saying, "Well, actually, if we do this, you won't
01:54:47.460 | have the power – we have the power to take it if we just break all the rules."
01:54:52.260 | And so they did and they won.
01:54:53.500 | And that became this hugely influential thing, and then they conquered California the
01:54:57.220 | same way. Again, these people were taken aback.
01:55:00.740 | These proper Republican candidates were appalled by the kind of like the insults that were
01:55:04.740 | being hurled at them and the intimidation and the bullying.
01:55:07.140 | And eventually they ended up in the National Convention, which was called like the right-wing
01:55:11.500 | Woodstock.
01:55:12.500 | It was like the Republican National Convention in '64 was just – again, there was jeering
01:55:16.780 | and they wouldn't let the moderates or the progressives even speak.
01:55:20.780 | And there was racism.
01:55:21.780 | You know, Jackie Robinson was there and he was a proud Republican and he said that he
01:55:25.300 | feels like he was a Jew in Hitler's Germany with the way that blacks were being treated
01:55:28.620 | there.
01:55:29.620 | And it was nasty.
01:55:30.620 | But what did they do?
01:55:31.620 | They had fiery, you know, plurality enough to win and they won.
01:55:36.660 | They ended up getting crushed in the general election and they kind of faded away.
01:55:39.660 | But to me, I was like, "What's – that was an interesting story."
01:55:42.220 | I see it as – I have this character in the book called the Golem, which is a big, kind
01:55:46.420 | of a big, dumb, powerful monster that's the emergent property of like a political
01:55:50.980 | echo chamber.
01:55:51.980 | It's like this big giant.
01:55:52.980 | It's stupid, but it's powerful and scary.
01:55:56.340 | And to me, it was like a golem rose up, conquered the party for a second, knocked it on its
01:56:00.460 | ass and then faded away.
01:56:05.260 | And to me, when I look at the Trump revolution and a lot – and not just Trump, the last
01:56:09.220 | 20 years, I see that same lower right, that lower right monster kind of making another
01:56:17.780 | charge for it, but this time succeeding and really taking over the party for a long period
01:56:21.820 | of time.
01:56:22.820 | And I see the same story, which is the power games are being played in a situation when
01:56:28.340 | it had always been – the government relies on all these unwritten rules and norms to
01:56:32.260 | function.
01:56:33.260 | But for example, you have in 2016, Merrick Garland gets nominated by Obama and the unwritten
01:56:40.420 | norm says that when the president nominates a justice, then you pass them through unless
01:56:45.620 | there's some egregious thing.
01:56:46.620 | That's what has happened.
01:56:47.620 | But they said, "Actually, this is the last year of his presidency and the people should
01:56:50.820 | choose.
01:56:51.820 | So you set a new precedent where the president can't nominate a Supreme Court justice in
01:56:57.540 | the last year."
01:56:58.540 | So they hold the seat open and it ends up going to Gorsuch.
01:57:02.100 | And so they lose that seat.
01:57:04.140 | Now three years later, it's Trump's last year and it's another election year and Ginsburg
01:57:09.140 | dies.
01:57:10.820 | And what did they say?
01:57:11.820 | They say, "Oh, let's keep our precedent."
01:57:12.980 | They said, "No, actually, we changed our mind.
01:57:14.860 | We're going to nominate Amy Barrett."
01:57:17.460 | So to me, that is classic power games, right?
01:57:20.420 | There's no actual rule and what you're doing is they did technically have the power to
01:57:23.740 | block the nomination then and then they technically had the power to put someone in and they're
01:57:26.780 | pretending there's some principle to it, but they're just going for the short-term edge
01:57:31.860 | at the expense of what is like the workings of the system in the long run.
01:57:36.540 | And then what do the Democrats have to do in that situation?
01:57:38.860 | Because both parties have been doing this is they either can lose now all the time or
01:57:42.420 | they start playing the power games too.
01:57:44.260 | And now you have a prisoner's dilemma where it's like both end up doing this thing and
01:57:48.860 | everyone ends up worse off, the debt ceiling, all these power plays that are being made
01:57:53.380 | with these holding the country hostage, this is power games.
01:57:56.740 | And to me, that's what Goldwater was doing in the '60s, but it was a healthier time in
01:57:59.500 | a way because there was this plurality within the parties, which reduced some of the national
01:58:04.620 | tribalism and there wasn't as much of an appeal to that.
01:58:07.780 | But today, it's just like do whatever you have to do to beat the enemies.
01:58:11.000 | And so I'm seeing a rise in power games and I talk about the Republicans because they
01:58:15.140 | did a lot of these things first.
01:58:16.240 | They have been a little bit more egregious, but both parties have been doing it over the
01:58:18.880 | last 20, 30 years.
01:58:20.200 | - Can you place blame, or maybe there's a different term for it, at the subsystems of
01:58:27.640 | this?
01:58:28.640 | So is it the media?
01:58:29.680 | Is it the politicians like in the Senate and Congress?
01:58:33.600 | Is it Trump?
01:58:34.600 | So the leadership?
01:58:36.160 | Is it, or maybe it's us human beings, maybe social media versus mainstream media?
01:58:44.480 | Is there a sense of where, what is the cause and what is the symptom?
01:58:48.120 | - It's very complex.
01:58:49.120 | So Ezra Klein has a great book, Why We're Polarized, where he talks about a lot of this.
01:58:52.280 | And there's some of these, it's really no one's fault.
01:58:56.120 | First of all, the environment has changed in a bunch of ways you just mentioned.
01:59:00.160 | And what happens when you take human nature, which is a constant, and you put it into an
01:59:02.960 | environment, behavior comes out.
01:59:05.840 | The environment's the independent variable.
01:59:07.240 | When that changes, the dependent variable, the behavior, changes with it, right?
01:59:11.600 | And so the environment has changed in a lot of ways.
01:59:13.620 | So one major one is, it used to, for a long time, actually, first it was the Republicans
01:59:22.240 | and then the Democrats just had a stranglehold on Congress.
01:59:26.200 | There was no, it was not even competitive.
01:59:28.280 | The Democrats for 40 years had the majority.
01:59:32.040 | And so therefore, it actually is a decent environment to compromise in.
01:59:37.440 | Because now we can both, what you want is Congress people thinking about their home
01:59:40.680 | district and voting yes on a national policy because we're going to get a good deal on
01:59:45.640 | it back at home.
01:59:46.640 | That's actually healthy, as opposed to voting in lockstep together because this is what
01:59:51.880 | the red party is doing, regardless of what's good for my home district.
01:59:55.720 | An example is Obamacare.
01:59:56.720 | There were certain Republican districts that would have actually officially been benefited
02:00:01.800 | by Obamacare, but every Republican voted against it.
02:00:04.720 | So and part of the reason is because there's no longer this obvious majority.
02:00:08.920 | Every few years, it switches.
02:00:09.920 | It's a 50/50 thing.
02:00:11.960 | And that's partially because it's become so, we've been so subsumed with this one national
02:00:17.060 | divide of left versus right that people are voting for
02:00:23.840 | the same party for president all the way down the ticket now.
02:00:27.400 | And so you have this just kind of 50/50 color war, and that's awful for compromise.
02:00:31.240 | So there's like 10 of these things. There's redistricting, but also it is social media.
02:00:36.560 | It is, I call it hypercharged tribalism.
02:00:39.680 | In the sixties, you had kind of distributed tribalism.
02:00:41.800 | You had some people that are worked up about the USSR, right?
02:00:45.200 | They're national.
02:00:46.200 | That's what they care about.
02:00:47.200 | US versus foreign.
02:00:48.200 | You had some people that were saying left versus right, like they are today.
02:00:51.380 | And then other people that were saying that they were fighting within the party.
02:00:54.860 | But today you don't have that.
02:00:57.560 | You have ideological realignment.
02:00:59.080 | So you kind of got rid of a lot of the in-party fighting.
02:01:01.200 | And then there's hasn't been that big of a foreign threat, nothing like the USSR for
02:01:04.960 | a long time.
02:01:05.960 | So you kind of lost that.
02:01:06.960 | And what's left is just this left versus right thing.
02:01:09.560 | And so that's kind of this hypercharged whirlpool that subsumes everything.
02:01:14.760 | And so, yeah, I mean, people point to Newt Gingrich, and people like there's certain
02:01:19.520 | characters that enacted policies that stoked this kind of thing.
02:01:23.120 | But I think this is a much bigger kind of environmental shift.
02:01:26.120 | - Well, that's going back to our questions about the role of individuals in human history.
02:01:30.440 | So the interesting, one of the many interesting questions here is about Trump.
02:01:34.560 | Is he a symptom or a cause?
02:01:37.200 | Because he seems to be, from the public narrative, such a significant catalyst for some of the
02:01:43.440 | things we're seeing.
02:01:44.440 | - This goes back to what we were talking about earlier, right?
02:01:46.200 | Like is it the person or is it the times?
02:01:48.240 | I think he's a perfect example of it's a both situation.
02:01:51.320 | I don't think, if you pluck Trump out of this situation, I don't think that Trump was inevitable.
02:01:56.600 | But I think we were very vulnerable to a demagogue.
02:02:00.160 | And if we hadn't been, Trump would have had no chance.
02:02:03.440 | And why were we vulnerable to a demagogue is because you have these, I mean, I think
02:02:10.280 | it's specifically on the right.
02:02:13.000 | If you actually look at the stats, it's pretty bad.
02:02:15.080 | Like the people who, because it's not just who voted for Trump.
02:02:17.680 | A lot of people just vote for the red, right?
02:02:19.520 | What's interesting is who voted for Obama against Romney and then voted for Trump?
02:02:25.320 | These are not racists, right?
02:02:27.120 | These are not hardcore Republicans.
02:02:29.560 | They voted for Obama.
02:02:31.460 | And where did the switch come from?
02:02:33.280 | Places that had economic despair, where bridges were not working well.
02:02:36.920 | That's a signifier.
02:02:39.320 | Where paint's chipping in the schools.
02:02:41.640 | These little things like this.
02:02:42.640 | So I think that you had this, a lot of these kind of rural towns, you have true despair.
02:02:47.880 | And then you also have the number one indicator of voting for Trump was distrust in media.
02:02:54.580 | And the media has become much less trustworthy.
02:02:57.880 | And so you have all these ingredients that actually make us very vulnerable to a demagogue.
02:03:04.840 | And a demagogue is someone who takes advantage, right?
02:03:06.880 | There's someone who comes in and says, "I can pull all the right strings and push all
02:03:11.880 | the right emotional buttons right now and get my self-power by taking advantage of the
02:03:16.240 | circumstances."
02:03:17.320 | And that is what Trump totally did.
02:03:20.520 | It makes me wonder how easy it is for somebody who's a charismatic leader to capitalize on
02:03:27.360 | cultural resentment when there's economic hardship to channel that.
02:03:32.680 | So Jonathan Haidt wrote a great article about how, basically, truth is at an all-time low right now.
02:03:38.720 | The media is not penalized for lying.
02:03:41.440 | MSNBC, Fox News, these are not penalized for being inaccurate.
02:03:45.600 | They're penalized if they stray from the orthodoxy.
02:03:49.880 | On social media, it's not the truest tweets that go viral.
02:03:53.240 | And so Trump understood that better than anyone.
02:03:57.840 | He took advantage of it.
02:03:59.240 | He was living in the current world when everyone else was stuck in the past.
02:04:02.820 | And he saw that, and he just lied.
02:04:06.760 | Everything he said, truth was not relevant at all.
02:04:11.040 | It's just truly, it's not relevant to him and what he's talking about.
02:04:14.440 | He doesn't care, and he knew that neither do a subset of the country.
02:04:17.960 | I was thinking about this, just reading articles by journalists, especially when you're not
02:04:23.960 | a famous journalist yourself, but you're more like a New York Times journalist.
02:04:30.520 | So the big famous thing is the institution you're a part of.
02:04:34.400 | You can just lie, 'cause you're not going to get punished for it.
02:04:38.120 | You're going to be rewarded for the popularity of an article.
02:04:41.240 | So if you write 10 articles, there's a huge incentive to just make stuff up.
02:04:46.880 | - You got to get clicks.
02:04:47.920 | - To get clicks.
02:04:48.920 | That's the first and foremost.
02:04:49.920 | And culturally, half the country will attack that article,
02:04:54.480 | saying it's dishonest.
02:04:57.520 | But they'll kind of forget.
02:05:00.320 | You will not have a reputational hit.
02:05:03.080 | There won't be a memory like, "This person made up a lot of stuff in the past."
02:05:06.160 | No, they'll take it one article at a time, and the reputation hits will attach
02:05:11.140 | to the New York Times, the institution.
02:05:13.880 | So for the individual journalist, there's a huge incentive to make stuff up.
02:05:17.000 | - Totally.
02:05:18.000 | - It's wild.
02:05:19.000 | - And it's scary, because it's almost like you can't survive if you're just an old-school,
02:05:24.120 | honest journalist who really works hard and tries to get it right and does it with nuance.
02:05:27.640 | What you can be is you can be a big-time substacker or a big-time podcaster.
02:05:31.880 | A lot of people do have a reputation for accuracy and rigor, and they have huge audiences.
02:05:38.000 | But if you're working in a big company right now, I think that many of the big media brands
02:05:48.160 | are very much controlled by the left.
02:05:50.640 | But I will say that the ones that are controlled by the right are even more egregious, not
02:05:54.800 | just in terms of accuracy, but also in terms of...
02:05:57.680 | The New York Times, for all of its criticisms, they have a handful of...
02:06:05.000 | Here and there, they put out a pretty...
02:06:08.800 | An article that strays from the...
02:06:10.520 | Bari Weiss wrote there for a long time.
02:06:12.400 | And then you've got...
02:06:13.400 | They wrote an article criticizing free speech on campus stuff recently.
02:06:17.880 | And they have a couple very left-progressive-friendly conservatives, but they have conservatives
02:06:25.380 | that are writing the op-eds.
02:06:26.380 | Fox News, you're not seeing thoughtful...
02:06:30.240 | Breitbart, you're not seeing thoughtful progressives writing there.
02:06:34.520 | There's some degree to which the New York Times, I think, still incentivizes and values
02:06:40.280 | the vertical, the high effort.
02:06:42.760 | So you're allowed to have a conservative opinion if you do a really damn good job.
02:06:50.120 | If it's a very thorough, in-depth kind of...
02:06:53.080 | And if you kind of pander to the progressive senses in all the right ways.
02:06:58.400 | I always joke that TED, they always have a couple token conservatives, but they get on
02:07:02.760 | stage and they're basically like, "So totally, you're all...
02:07:06.920 | The progressivism's right about all of this, but maybe libertarianism isn't all about..."
02:07:13.240 | So there is an element, but you know what?
02:07:15.000 | It's something.
02:07:16.000 | It's better than being a total tribal.
02:07:18.520 | I think you can see the New York Times tug of war, the internal tug of war.
02:07:22.440 | You can see it, because then they also have these awful instances, like the firing of
02:07:26.120 | James Bennett, which is a whole other story.
02:07:28.440 | But they have...
02:07:29.440 | Yeah, you can see it going both ways.
02:07:31.960 | But in the '60s, what did you have?
02:07:34.520 | You had ABC, NBC, CBS.
02:07:35.520 | In the '70s, you had these three news channels, and they weren't always right, and they definitely
02:07:40.880 | sometimes spun a narrative together, maybe about Vietnam or whatever.
02:07:44.160 | But if one of them was just lying, they'd be embarrassed for it.
02:07:48.720 | They would be penalized.
02:07:49.720 | They'd be dinged, and they'd be known as, "This is the trash one."
02:07:52.280 | And that would be terrible for their ratings, because they weren't just catering to half
02:07:54.920 | the country.
02:07:55.920 | They all were catering to the whole country.
02:07:57.200 | So both on the axis of accuracy and on the axis of neutrality, they had to try to stay
02:08:04.480 | somewhere in the reasonable range, and that's just gone.
02:08:08.920 | One of the things I'm really curious about is...
02:08:12.360 | I think your book is incredible.
02:08:14.400 | I'm very curious to see how it's written about by the press, because I could see...
02:08:19.760 | I could myself write, with the help of Chad J. Petit, of course, clickbait articles in
02:08:24.000 | either direction.
02:08:25.000 | Yeah.
02:08:26.000 | It's easy to imagine.
02:08:27.000 | Your whole book is beautifully written for clickbait articles.
02:08:30.520 | If any journalists out there need help, I can write the most atrocious criticisms.
02:08:37.400 | Yeah.
02:08:38.400 | I'm ready.
02:08:40.400 | I'm braced.
02:08:42.520 | Yeah.
02:08:43.680 | So speaking of which, you write about social justice.
02:08:48.840 | You write about two kinds of social justice, liberal social justice and SJF, social justice
02:08:55.800 | fundamentalism.
02:08:57.400 | What are those?
02:08:58.400 | Yeah.
02:08:59.400 | So the term "wokeness" is so loaded with baggage.
02:09:01.400 | It's kind of like mocking and derogatory, and I was trying not to do that in this book.
02:09:06.800 | If it's a term loaded with baggage, you're already kind of...
02:09:09.760 | You're from the first minute, you're already behind.
02:09:13.840 | So to me, also, when people say "wokeness is bad," "social justice is bad," they're
02:09:22.800 | throwing the baby out with the bathwater because the proudest tradition in the US is liberal
02:09:29.400 | social justice.
02:09:30.400 | And what I mean by that, again, liberal meaning with lowercase l.
02:09:35.960 | It is intertwined with liberalism.
02:09:37.620 | So Martin Luther King, classic example, his "I Have a Dream" speech, he says stuff like
02:09:41.640 | "This country has made a promise to all of its citizens, and it has broken that promise
02:09:51.040 | to its black citizens."
02:09:54.200 | In other words, liberalism, the Constitution, the core ideals, those are great.
02:09:59.360 | We're not living up to them.
02:10:00.720 | We're failing on some of them.
02:10:02.900 | So civil disobedience, the goal of it wasn't to hurt liberalism.
02:10:06.520 | It was to specifically break the laws that were already violating...
02:10:11.480 | The laws that were a violation of liberalism to expose that this is illiberal, that the
02:10:16.080 | Constitution should not have people of different skin color sitting in different parts of the bus.
02:10:20.960 | And so it was really patriotic, the Civil Rights Movement.
02:10:25.600 | It was saying, "Liberalism is this beautiful thing, and we need to do better at it."
02:10:31.520 | So I call it liberal social justice.
02:10:32.840 | And it used the tools of liberalism to try to improve the flaws that were going on.
02:10:42.480 | So free speech.
02:10:43.480 | Mario Savio in the '60s, he's a leftist.
02:10:47.600 | And what were the leftists doing in the '60s on Berkeley campus?
02:10:51.200 | They were saying, "We need more free speech," because that's what liberal social justice
02:10:56.640 | was fighting for.
02:10:57.640 | But you can also go back to the '20s, women's suffrage.
02:11:00.800 | Emancipation, the thing that America obviously has all of its...
02:11:05.200 | These are all ugly things that it had to get out of, but it got out of them one by one,
02:11:09.680 | and it's still getting out of them.
02:11:10.680 | That's what's cool about America.
02:11:12.480 | And liberal social justice basically is the practice of saying, "Where are we not being
02:11:16.960 | perfect liberals?
02:11:18.360 | And now let's fix that."
02:11:20.920 | So that's the idea of liberalism that permeates the history of the United States.
02:11:25.120 | But then there's interplay.
02:11:26.120 | You have so many good images in this book, but one of them is highlighting the interplay
02:11:32.320 | of different ideas over the past, let's say, 100 years.
02:11:36.900 | So liberalism is on one side, there's that thread.
02:11:40.360 | There's Marxism on the other, and then there's postmodernism.
02:11:45.200 | How do those interplay together?
02:11:47.080 | So it's interesting because Marxism is, and all of its various descendants, obviously
02:11:53.720 | there's a lot of things that are rooted in Marxism that aren't the same thing as what
02:11:58.280 | Karl Marx preached.
02:12:00.100 | But what do they all have in common?
02:12:02.600 | They think liberalism is bad.
02:12:06.360 | They actually think the opposite of what Martin Luther King and other people in the
02:12:16.040 | civil rights and other movements believed. They think the opposite.
02:12:18.840 | He thinks liberalism is good, we need to preserve it.
02:12:20.960 | They said liberalism is the problem.
02:12:23.360 | These other problems with racism and inequality that we're seeing, those are inevitable results
02:12:29.720 | of liberalism.
02:12:30.880 | Liberalism is a rigged game, and it's just the power games in disguise.
02:12:34.520 | There are no liberal games.
02:12:35.720 | It's just the power games in disguise, and there's the upper people that oppress the
02:12:39.400 | lower people, and they convince the lower people, it's all about false consciousness,
02:12:43.320 | they convince the lower people that everything is fair.
02:12:46.200 | Now the lower people vote against their own interests, and they work to preserve the system
02:12:50.080 | that's oppressing them.
02:12:51.800 | What do we need to do?
02:12:52.800 | We need to actually, it's much more revolutionary, we need to overthrow liberalism.
02:12:58.840 | People think, oh, what we call wokeness is just normal social justice activism,
02:13:05.040 | but it's more extreme.
02:13:06.040 | No, no, it's the polar opposite, polar opposite.
02:13:11.280 | Now that's the Marxist thread.
02:13:12.800 | Now postmodernism is this term that is super controversial, and I don't think anyone calls
02:13:18.080 | themselves a postmodernist, so take all of this with a grain of salt in terms of the
02:13:20.800 | term, but what's the definition of radical?
02:13:23.600 | The definition of radical to me is at how deep a level you want change to happen.
02:13:30.600 | A liberal progressive and a conservative will disagree about policies,
02:13:35.880 | the liberal progressive wants to change a lot of policies, change, change, change, and
02:13:41.760 | the conservative wants to keep things the way they are.
02:13:45.040 | But they're both conservative when it comes to liberalism, beneath it, the liberal foundation
02:13:50.880 | of the country, they both become conservatives about that.
02:13:55.400 | The Marxist is more radical because they want to go one notch deeper and actually overthrow
02:14:00.200 | that foundation.
02:14:01.640 | Now what's below liberalism is kind of the core tenets of modernity, this idea of reason
02:14:09.320 | and the notion that there is an objective truth and science as the scientific method.
02:14:17.560 | These things are actually beneath, and even the Marxist, if you look at the Frankfurt
02:14:20.320 | School, these post-Marxist thinkers and Marx himself, they were not anti-science, they
02:14:25.800 | believed in that bottom, bottom foundation.
02:14:29.960 | They actually wanted to preserve modernity, but they wanted to get rid of liberalism on
02:14:33.480 | top of it.
02:14:34.480 | The post-modernist is even more radical because they want to actually go down to the bottom
02:14:37.680 | level and overthrow it.
02:14:38.680 | They think science itself is a tool of oppression, they think it's a tool where oppression kind
02:14:44.400 | of flows through, they think that the white western world has invented these concepts,
02:14:50.760 | they claim that there's an objective truth and that there's reason and science, and they
02:14:54.320 | think all of that is just one meta-narrative, and it goes a long way to serve the interests
02:14:59.800 | of the powerful.
02:15:00.800 | - So in the sense that it's almost caricatured, but that is to the core their belief that
02:15:06.400 | math could be racist, for example.
02:15:08.480 | - Oh yeah, absolutely.
02:15:09.480 | - Not the education of math, but literally math, the mathematics.
02:15:13.760 | - The notion in math that there's a right answer and a wrong answer, that they believe
02:15:18.200 | is a meta-narrative that serves white supremacy, or the post-modernist might have said it serves
02:15:23.360 | just the powerful, or the wealthy.
02:15:27.080 | So what social justice fundamentalism is, is you take the Marxist thread that has been
02:15:31.720 | going on in lots of countries, and whoever the upper and lower is, that's what they all
02:15:38.360 | have in common, but the upper and lower, for Marx was the ruling class and the oppressed
02:15:43.360 | class, it was economic.
02:15:47.280 | But you come here and the economic class doesn't resonate as much here as it did maybe in some
02:15:53.280 | of those other places, but what does resonate here in the '60s and '70s is race and gender
02:15:58.440 | and these kind of social justice disagreements.
02:16:01.760 | And so what social justice fundamentalism is, is basically this tried and true framework
02:16:07.320 | of this Marxist framework, kind of with a new skin on it, which is American social justice,
02:16:16.200 | and then made even more radical with the infusion of post-modernism, where not just is liberalism
02:16:21.600 | bad, but actually, like you said, math can be racist.
02:16:25.360 | So it's this kind of philosophical Frankenstein, this stitched-together thing, and so again,
02:16:34.120 | they wear the same uniform as the liberal social justice.
02:16:36.160 | They say social justice, racial equality, but it has nothing to do with liberal social
02:16:41.680 | justice.
02:16:42.680 | It is directly opposed to liberal social justice.
02:16:44.480 | - It's fascinating, the evolution of ideas, if we ignore the harm done by it, it's fascinating
02:16:51.120 | how humans get together and evolve these ideas.
02:16:53.280 | So as you show, Marxism is the idea that society is a zero-sum, I mean, I guess the zero-sum
02:16:59.320 | is a really important thing here.
02:17:01.920 | Zero-sum struggle between the ruling class and the working class with power being exerted
02:17:06.280 | through politics and economics.
02:17:08.160 | Then you add critical theory, Marxism 2.0 on top of that, and you add to politics and
02:17:14.400 | economics, you add culture and institutions.
02:17:17.040 | And then on top of that, for postmodernism, you add science, you add morality, basically
02:17:21.520 | anything else you can think of.
02:17:22.600 | - The stitched together Frankenstein, and if you notice, which is not necessarily bad,
02:17:26.560 | but in this case, I think it's actually violating the Marxist tradition by being anti-science.
02:17:32.080 | And it's violating postmodernism, because the postmodernists were radical
02:17:36.160 | skeptics.
02:17:37.240 | They were radical skeptics not just of the way things were, but of their
02:17:40.240 | own beliefs.
02:17:42.240 | And social justice fundamentalism is not at all self-critical.
02:17:48.840 | It says that we have the answers, which is the opposite of what postmodernists would
02:17:51.440 | ever say.
02:17:52.440 | No, you just have another meta-narrative.
02:17:54.160 | And it's also violating, of course, the tradition of liberal social justice in a million ways,
02:17:57.440 | because it's anti-liberal.
02:17:59.680 | And so this Frankenstein comes together.
02:18:01.720 | Meanwhile, liberal social justice doesn't have a Frankenstein.
02:18:04.600 | It's very clear.
02:18:05.720 | It's a crisp ideology that says, we're trying
02:18:10.880 | to get to a more perfect union, we're trying to keep the promises made in the Constitution.
02:18:17.480 | And that's what it's trying to do.
02:18:19.000 | And so it's much simpler in a lot of ways.
02:18:21.440 | - So you write that my big problem with social justice fundamentalism isn't the ideology
02:18:26.160 | itself.
02:18:27.160 | It's what scholars and activists started to do sometimes around 2013, when they began
02:18:32.000 | to wield a cudgel that's not supposed to have any place in the country like the US.
02:18:36.960 | So it's the actions, not the ideas.
02:18:40.080 | - Well, to be clear, I don't like the ideology.
02:18:43.000 | I think it's a low-rung ideology.
02:18:45.600 | I think it's morally inconsistent based on, you know, it flip-flops on its morals, depending
02:18:50.680 | on the group.
02:18:52.440 | I think it's echo-chambery.
02:18:54.400 | I think it's full of inaccuracies and kind of can't stand up to debate.
02:19:01.000 | So I think it's a low, but there's a ton of low-rung ideologies I don't like.
02:19:04.360 | I don't like a lot of religious doctrines.
02:19:06.160 | I don't like a lot of political doctrines, right?
02:19:08.960 | The US is a place inherently that is a mishmash of a ton of ideologies, and I'm not going
02:19:14.320 | to like two-thirds of them at any given time.
02:19:16.360 | So my problem, the reason I'm writing about this is not because I'm like, "By the way,
02:19:19.280 | this ideology is not something I like."
02:19:21.400 | That's not interesting.
02:19:23.440 | The reason that it must be written about right now, this particular ideology, is because
02:19:29.160 | it's not playing nicely with others.
02:19:31.680 | If you want to be a hardcore, you know, evangelical Christian, the US says, "Live and
02:19:38.440 | live."
02:19:39.440 | Not only are you allowed to have an echo chamber of some kind, it's actively protected here.
02:19:43.400 | Live and let live.
02:19:44.400 | They can do what they want.
02:19:45.400 | You do what you want.
02:19:46.400 | Now, if the evangelical Christian started saying, "By the way, anyone who says anything
02:19:51.080 | that conflicts with evangelical Christianity is going to be severely socially punished,
02:19:56.280 | and they have the cultural power to do so," which they don't in this case.
02:20:00.200 | They might like to, but they don't have the power. But if they were able to get anyone fired
02:20:03.920 | who they want, and able to actually change the curriculum in all these schools
02:20:07.760 | and classes to suddenly not conflict with, no more evolution in the textbooks, because
02:20:12.360 | they don't want it.
02:20:13.360 | Now I would write a book about evangelical Christianity, because that's what every liberal,
02:20:20.120 | regardless of what you think of the actual horizontal beliefs, doesn't matter what they
02:20:24.480 | believe when they start violating live and let live and shutting down other segments
02:20:32.000 | of society.
02:20:33.000 | It's almost like a, it's not the best analogy, but an echo chamber is like a benign tumor.
02:20:41.040 | What you have to watch out for is a tumor that starts to metastasize, starts to forcefully
02:20:44.080 | spread and damage the tissue around it.
02:20:48.040 | That's what this particular ideology has been doing.
02:20:51.840 | - Do you worry about it as an existential threat to liberalism in the West, in the United
02:21:01.560 | States?
02:21:03.680 | Is it a problem, or is it the biggest problem that's threatening all of human civilization?
02:21:11.400 | - I would not say it's the biggest problem.
02:21:14.880 | It might be.
02:21:15.880 | I wouldn't, if it turns out in 50 years someone says actually it was, I wouldn't be shocked,
02:21:20.860 | but I also wouldn't bet on that, because there's a lot of problems.
02:21:24.360 | - I'm a little sorry to interrupt.
02:21:26.640 | It is popular to say that kind of thing, though, and it's less popular to say the same thing
02:21:32.400 | about AI or nuclear weapons, which worries me that I'm more worried about nuclear weapons
02:21:39.440 | even still than I am about wokeism.
02:21:42.120 | - So I've gotten, I've had probably a thousand arguments about this.
02:21:45.520 | That's one nice thing about spending six years procrastinating on getting a book done is
02:21:49.640 | you end up battle-testing your ideas a million times.
02:21:52.080 | So I've heard this one a lot, which is there's kind of three groups of former Obama voters.
02:21:58.360 | One is super woke now.
02:22:00.880 | Another one is super anti-woke now.
02:22:02.760 | And the third is what you just said, which is sure, wokeness is over the top.
02:22:08.320 | You're not woke, but I think that the anti-woke people have totally lost their minds, and it's
02:22:13.560 | just not that big a deal.
02:22:16.000 | Now here's why I disagree with that, because it's not wokeness itself.
02:22:22.920 | It's that a radical political movement, of which there will always be a lot in the country,
02:22:30.960 | has managed to do something that a radical movement is not supposed to be able to do
02:22:34.520 | in the US, which is they've managed to hijack institutions all across the country and hijack
02:22:44.200 | medical journals, and universities, and the ACLU, all the activist organizations, and
02:22:53.280 | nonprofits, and NGOs.
02:22:55.560 | And certain tech companies.
02:22:57.080 | Yeah, and many tech companies.
02:23:01.760 | So it's not that I think this thing is so bad.
02:23:03.440 | It's a little like we said with Trump.
02:23:06.000 | The reason Trump scares me is not because Trump's so bad.
02:23:08.440 | It's that it shows, it reveals, that we were vulnerable to a demagogue candidate.
02:23:14.960 | And what wokeness reveals to me is that we are currently, and until something changes,
02:23:18.520 | will continue to be vulnerable to a bully movement, and a forcefully expansionist movement
02:23:29.120 | that wants to actually destroy the workings, their liberal gears, and tear them apart.
02:23:38.360 | And so here's the way I view a liberal democracy is it is a bunch of these institutions that
02:23:43.440 | were trial and error crafted over hundreds of years.
02:23:48.000 | And they all rely on trust, public trust, and a certain kind of feeling of unity that
02:23:53.440 | actually is critical to a liberal democracy's functioning.
02:23:57.280 | And what I see this thing is is a parasite on that whose goal is-- and I'm not saying
02:24:03.560 | each-- by the way, each individual in this is-- I don't think they're bad people.
02:24:07.000 | I think that it's the ideology itself has the property of its goal is to tear apart
02:24:12.600 | the pretty delicate workings of the liberal democracy and shred the critical lines of
02:24:17.160 | trust.
02:24:18.840 | And so you talk about AI, and you talk about all these other big problems, nuclear, right?
02:24:23.280 | The reason I-- I like writing about that stuff a lot more than I like writing about politics.
02:24:26.840 | This was a fun topic for me, is because I realized that all of those things, if we're
02:24:32.600 | going to have a good future with those things, and they're actually threats, like I said,
02:24:35.880 | we need to have our wits about us, and we need the liberal gears and levers working.
02:24:42.120 | We need the liberal machine working.
02:24:43.720 | And so if something's threatening to undermine that, it affects everything else.
02:24:48.440 | - We need to have our scientific mind about us, about these foundational ideas.
02:24:53.120 | But I guess my sense of hope comes from observing the immune system respond to wokeism.
02:25:00.680 | There seems to be a pro-liberalism immune system.
02:25:05.800 | And not only that, so like there's intellectuals, there's people that are willing to do the
02:25:10.280 | fight.
02:25:11.280 | You talk about courage, being courageous.
02:25:14.200 | And there is a hunger for that, such that those ideas can become viral, and they take
02:25:19.120 | over.
02:25:20.120 | So I just don't see a mechanism by which wokeism accelerates, like exponentially, and takes
02:25:27.800 | over, like it expands.
02:25:29.320 | It feels like as it expands, the immune system responds.
02:25:34.480 | The immune system of liberalism, of basically a country, at least in the United States,
02:25:40.400 | that still ultimately at the core of the individual values the freedom of speech, just freedoms
02:25:45.600 | in general, the freedom of an individual.
02:25:48.160 | But that's the battle, which is stronger.
02:25:50.720 | - So to me it is like a virus in an immune system.
02:25:54.120 | And I totally agree, I see the same story happening.
02:25:57.440 | I'm sitting here rooting for the immune system.
02:25:59.680 | - But you're still worried.
02:26:01.360 | - Well, here's the thing.
02:26:03.000 | So a liberal democracy is always gonna be vulnerable to a movement like this, right?
02:26:09.360 | And there will be more.
02:26:10.880 | Because it's not a totalitarian dictatorship.
02:26:13.480 | Because if you can socially pressure people to not say what they're thinking, you can
02:26:17.280 | suddenly start to just take over, right?
02:26:19.680 | You can break the liberalism of the liberal democracy quite easily, and suddenly a lot
02:26:23.440 | of things are illiberal.
02:26:25.360 | On the other hand, the same vulnerability, the same system that's vulnerable to that
02:26:31.200 | also is hard to truly conquer.
02:26:34.240 | Because now the Maoists, right, similar kind of vibe.
02:26:37.600 | They were saying that science is evil, and that the intellectuals are, it's all this
02:26:43.320 | big conspiracy.
02:26:45.900 | But they could murder you.
02:26:49.560 | And they had the hard cudgel in their hand, right?
02:26:53.240 | And the hard cudgel is scary.
02:26:58.960 | And you can conquer a country with the hard cudgel.
02:27:01.040 | But you can't use that in the US.
02:27:03.200 | So what they have is a soft cudgel, which can have the same effect initially.
02:27:08.000 | You can scare people into shutting up.
02:27:10.200 | You can't maybe imprison them and murder them.
02:27:11.720 | But if you can socially ostracize them and get them fired, that basically is gonna have
02:27:15.240 | the same effect.
02:27:16.380 | So the soft cudgel can have the same effect for a while.
02:27:19.220 | But the thing is, it's a little bit of a house of cards, because it relies on fear.
02:27:25.360 | And as soon as that fear goes away, the whole thing falls apart, right?
02:27:31.320 | The soft cudgel requires people to be so scared of getting canceled or getting whatever.
02:27:37.540 | And as soon as some people start-- you know, Tobi Lütke of Shopify I always think about.
02:27:42.280 | He just said, you know what, I'm not scared of this soft cudgel, and spoke up and said,
02:27:45.880 | we're not political at this company, and we're not a family, we're a team, and we're gonna
02:27:48.620 | do this.
02:27:49.620 | And you know what?
02:27:50.620 | Like, they're thriving.
02:27:51.620 | - He will be on this podcast.
02:27:52.620 | It seems like a fascinating-- - He's amazing.
02:27:54.740 | - He spoke up, he's saying that we're not gonna-- - He's one of the smartest and kindest
02:27:58.980 | dudes, but he's also, he has courage at a time when it's hard.
02:28:03.060 | But here's the thing, it's different in that you need so much less courage against
02:28:07.140 | a soft cudgel than you do-- the Iranians throwing their hijabs into the fire, those people's
02:28:12.540 | courage just blows away any courage we have here, 'cause they might get executed.
02:28:19.020 | That's the thing, is that you can actually have courage right now, and it's--
02:28:22.780 | - I'm not sure I understand.
02:28:25.900 | - Don't worry about it.
02:28:28.900 | - Oh man, the irony of that.
02:28:31.300 | And you talk about, so two things to fight this, there's two things, awareness and courage.
02:28:36.740 | What's the awareness piece?
02:28:39.380 | The awareness piece is first just understanding the stakes, like getting our heads out of
02:28:47.500 | the sand and being like, technology's blowing up exponentially, our society's trust is devolving,
02:28:53.340 | like we're kind of falling apart in some important ways, we're losing our grip on some stability
02:28:58.180 | at the worst time, that's the first point, just the big picture.
02:29:01.660 | And then also, awareness of, I think, this vertical axis, or whatever your version of
02:29:05.180 | it is, this concept of, how do I really form my beliefs, where do they actually come from?
02:29:12.460 | Are they someone else's beliefs, am I following a checklist?
02:29:17.260 | How about my values?
02:29:19.300 | I used to identify with the blue party or the red party, but now they've changed, and
02:29:23.620 | I suddenly am okay with that, is that because my values changed with it, or am I actually
02:29:27.300 | anchored to the party, not to any principle?
02:29:30.780 | Asking yourself these questions, asking, looking for where do I feel disgusted by fellow human
02:29:37.100 | beings?
02:29:38.100 | Maybe I'm being a crazy tribal person without realizing it.
02:29:41.020 | How about the people around me, am I being bullied by some echo chamber without realizing it?
02:29:46.020 | Am I the bully somewhere?
02:29:48.220 | So that's the first, I think just to kind of do a self-audit, and I think that just
02:29:57.220 | some awareness like that, just a self-audit about these things can go a long way, but
02:30:02.460 | if you keep it to yourself, it's almost useless, because if you don't have, you know, awareness
02:30:08.500 | without courage does very little.
02:30:10.780 | So courage is when you take that awareness and you actually export it out into the world,
02:30:15.740 | and it starts affecting other people.
02:30:17.420 | And so courage can happen on multiple levels.
02:30:19.180 | It can happen by, first of all, just stop saying stuff you don't believe.
02:30:23.220 | If you're being pressured by a kind of an ideology or a movement to say stuff that you
02:30:27.980 | don't actually believe, just stop, just stay on your ground and don't say anything.
02:30:32.420 | That's courage.
02:30:33.420 | That's one first step.
02:30:35.260 | Start speaking out in small groups.
02:30:38.100 | Start actually speaking your mind.
02:30:40.060 | See what happens.
02:30:41.060 | The sky doesn't usually fall.
02:30:42.060 | Actually, people usually respect you for it.
02:30:43.740 | And it's not every group, but you'd be surprised.
02:30:46.700 | And then eventually, maybe start speaking out in bigger groups.
02:30:50.420 | Start going public.
02:30:51.420 | Go public with it.
02:30:52.580 | And you don't need everyone doing this.
02:30:54.060 | Look, some people will lose their jobs for it.
02:30:55.700 | I'm not talking to those people.
02:30:57.620 | Most people won't lose their jobs, but they have the same fear as if they would, right?
02:31:02.140 | And it's like, what, are you going to get criticized?
02:31:03.140 | Or are you going to get a bunch of people, you know, angry Twitter people will criticize you?
02:31:09.060 | Like, yeah, it's not pleasant, but actually that's a little bit like our primitive minds
02:31:13.420 | fear that really back when it was programmed, that kind of ostracism or criticism will leave
02:31:19.460 | you out of the tribe and you'll die.
02:31:21.060 | Today, it's kind of a delusional fear.
02:31:22.900 | It's not actually that scary.
02:31:24.780 | And the people who realize that can exercise incredible leadership right now.
02:31:28.620 | - So you have a really interesting description of censorship, of self-censorship also, as
02:31:36.420 | you've been talking about.
02:31:37.940 | Who's King Mustache?
02:31:38.940 | And this gap, I think, I hope you write even more, even more than you've written in the
02:31:44.180 | book about these ideas, because it's so strong.
02:31:47.100 | These censorship gaps that are created between the dormant thought pile and the kind of thing
02:31:54.660 | under the speech curve.
02:31:55.940 | - Yeah.
02:31:56.940 | So first of all, I think a useful tool is this thing called
02:32:01.780 | a thought pile, which is if you have a, on any given issue, you have a horizontal spectrum
02:32:07.620 | and just say, I could take your brain out of your head and I put it on the thought pile
02:32:11.300 | right where you happen to believe about that issue.
02:32:14.300 | Now I did that for everyone in the community or in a society.
02:32:17.740 | And you're gonna end up with a big mushy pile that I think will often form a bell curve.
02:32:21.220 | If it's really politicized, it might form like a camel with two humps, because it's
02:32:25.020 | like concentrated here.
02:32:26.020 | But for a typical issue, like fear of AI,
02:32:28.420 | You're gonna have a bell curve, right?
02:32:30.300 | Things like this.
02:32:31.580 | That's the thought pile.
02:32:32.740 | Now the second thing is a line that I call the speech curve, which is what people are
02:32:37.060 | saying.
02:32:38.060 | So the speech curve is high when not just a lot of people are saying it, but it's being
02:32:40.860 | said from the biggest platforms, being said in the New York Times, and it's being said
02:32:46.700 | by the president on the State of the Union.
02:32:49.140 | Those things are the top of the speech curve.
02:32:52.060 | And then when the speech curve's lower, it means it's being said either whispered in
02:32:55.580 | small groups or it's just not very many people are talking about it.
02:32:58.540 | Now a healthy, when a free speech democracy is healthy on a certain topic, you've got
02:33:04.820 | the speech curve sitting right on top of the thought pile.
02:33:07.180 | They mirror each other, which is naturally what would happen.
02:33:09.940 | More people think something, it's gonna be said more often and from higher platforms.
02:33:14.700 | What censorship does, and censorship can be from the government, so I use the tale of
02:33:18.980 | King Mustache.
02:33:20.020 | And King Mustache, he's a little tiny tyrant, and he's very sensitive, and people are making
02:33:25.620 | fun of his mustache, and they're saying he's not a good king, and he does not like that.
02:33:28.540 | So what does he do?
02:33:29.540 | He enacts a policy, and he says, "Anyone who is heard criticizing me or my mustache or
02:33:35.640 | my rule will be put to death."
02:33:39.460 | And immediately at the town, because his father was very liberal, there was always free speech
02:33:44.700 | in his kingdom.
02:33:46.460 | But now King Mustache has taken over, and he's saying these are the new rules now.
02:33:49.580 | And so a few people yell out, and they say, "That's not how we do things here."
02:33:52.940 | And that moment, it's what I call a moment of truth.
02:33:56.300 | Did the king's guards stand with the principles of the kingdom and say, "Yeah, King Mustache,
02:34:00.300 | that's not what we do," in which case he would kind of have to, there's nothing he can do.
02:34:04.500 | Or are they going to execute?
02:34:05.500 | So in this case, it's as if he laid down an electric fence over a part of the thought pile
02:34:09.860 | and said, "No one's allowed to speak over here."
02:34:11.660 | The speech curve, maybe people will think these things, but the speech curve cannot
02:34:15.220 | go over here.
02:34:16.980 | But the electric fence wasn't actually electrified until the king's guards, in a moment of truth,
02:34:21.820 | get scared and say, "Okay," and they hang the five people who spoke out.
02:34:25.160 | So in that moment, that fence just became electric.
02:34:28.380 | And now no one criticizes King Mustache anymore.
02:34:31.660 | So I use this as an allegory.
02:34:32.660 | Now, of course, he has a hard cudgel because he can execute people.
02:34:36.100 | But now when we look at the US, what you're seeing right now is a lot of pressure, which
02:34:40.780 | is very similar.
02:34:41.780 | An electric fence is being laid down saying, "No one can criticize these ideas.
02:34:45.500 | And if you do, you won't be executed, you'll be canceled.
02:34:48.820 | You'll be fired."
02:34:49.820 | Now, is that fence electrified from there?
02:34:52.980 | No, they don't work at the company, they can't fire you.
02:34:56.100 | But they can start a Twitter mob when someone violates that speech curve, when someone violates
02:35:00.380 | that speech rule, and then the leadership at the company has the moment of truth.
02:35:07.100 | And what the leaders should do is stand up for their company's values, which is almost
02:35:11.780 | always in favor of the employee and say, "Look, even if they made a mistake, people make mistakes,
02:35:16.500 | we're not going to fire them.
02:35:17.500 | Or maybe that person actually said something that's reasonable and we should discuss it.
02:35:20.340 | But either way, we're not going to fire them."
02:35:22.540 | And if they said no, what happens is the Twitter mob actually doesn't have power, they can't execute you.
02:35:27.780 | They go away.
02:35:28.780 | And the fence has proven to have no electricity.
02:35:29.780 | And that's been the problem of the past few years: what's happened again and again
02:35:33.020 | is the leader gets scared of the Twitter mob and they fire them.
02:35:37.180 | Boom, that fence has electricity.
02:35:39.620 | And now, actually, if you cross that, it's not just a threat.
02:35:45.220 | You'll be out of a job.
02:35:46.860 | It's really bad.
02:35:48.140 | You'll have a huge penalty.
02:35:49.140 | You might not be able to feed your kids.
02:35:51.220 | So that's an electric fence that goes up.
02:35:52.500 | Now what happens when an electric fence goes up and it's proven to actually be electrified?
02:35:56.300 | The speech curve morphs into a totally different position.
02:35:59.860 | And now these new people say, instead of having the kind of marketplace of ideas that turns
02:36:04.000 | into a kind of a natural bell curve, they say, "No, no, no.
02:36:06.960 | These ideas are okay to say.
02:36:08.140 | Not just okay.
02:36:09.140 | You'll be socially rewarded.
02:36:10.440 | And these ones don't."
02:36:11.440 | That's the rules of their own echo chamber that they're now applying to everyone.
02:36:14.020 | And it's working.
02:36:15.140 | And so the speech curve distorts.
02:36:16.540 | And so you end up with now instead of one region, which is a region of kind of active
02:36:20.820 | communal thinking, what people are thinking and saying, you now have three regions.
02:36:25.380 | You have a little active communal thinking, but mostly you now have this dormant thought
02:36:28.780 | pile, which is all these opinions that suddenly everyone's scared to say out loud.
02:36:32.580 | Everyone's thinking, but they're scared to say it out loud.
02:36:33.980 | Everyone's thinking, but no one's saying.
02:36:35.580 | And then you have this other region, which is the approved ideas of this now cultural
02:36:40.800 | kind of dictator.
02:36:43.340 | And those are being spoken from the largest platforms, and they're being repeated by the
02:36:46.860 | president, and they're being repeated all over the place, even though people don't believe them.
02:36:52.700 | And that's this distortion.
02:36:53.700 | And what happens is the society becomes really stupid because active communal thinking is
02:36:58.420 | the region where we can actually think together.
02:37:00.540 | And now no one can think together.
02:37:01.700 | And it gets siloed into small private conversations.
02:37:06.020 | It's really powerful what you said about institutions and so on.
02:37:08.780 | It's not trivial from a leadership position to be like, "No, we defend
02:37:14.020 | the employee, the person with us on our..." because there's no actual ground
02:37:24.460 | to any kind of violation we're hearing about.
02:37:26.740 | So they resist the mob.
02:37:28.340 | It's ultimately up to the leader, I guess, of a particular institution or a particular company.
02:37:33.340 | And it's difficult.
02:37:34.340 | - Oh yeah, no, no.
02:37:35.340 | If it were easy, there wouldn't be all of these failings.
02:37:40.300 | And by the way, that's the immune system failing.
02:37:42.960 | That's the liberal immune system of that company failing, but also then it's an example, which
02:37:47.100 | means that a lot of other...
02:37:48.100 | It's failing to the country.
02:37:50.980 | It's not easy.
02:37:51.980 | Of course it's not, because we have primitive minds that are wired to care so much about
02:37:55.540 | what people think of us.
02:37:56.780 | And even if we're not gonna...
02:37:58.140 | First of all, we're scared that it's gonna start a...
02:38:00.700 | 'Cause what do mobs do?
02:38:03.500 | They don't just say, "I'm gonna criticize you.
02:38:05.220 | I'm gonna criticize anyone who still buys your product.
02:38:08.540 | I'm gonna criticize anyone who goes on your podcast."
02:38:10.620 | So it's not just you.
02:38:12.060 | It's now suddenly, if Lex becomes tarnished enough...
02:38:16.240 | Now I go on the podcast and people are saying, "Oh, I'm not buying his book.
02:38:18.440 | He went on Lex Fridman.
02:38:19.440 | No thanks."
02:38:20.800 | And now I get...
02:38:21.800 | I call it a smear web.
02:38:23.400 | You've been smeared and we're in such a bad time that the smear travels to me.
02:38:27.360 | And now meanwhile, someone who buys my book and tries to share it, someone said, "You're
02:38:29.960 | buying that guy's book?
02:38:30.960 | He goes on Lex Fridman."
02:38:32.880 | You see how this happens.
02:38:33.880 | So that hasn't happened in this case.
02:38:35.760 | So we are so wired.
02:38:37.720 | A, that is kind of bad.
02:38:39.480 | That is actually bad for you.
02:38:42.040 | But we're wired to care about it so much because it meant life or death back in the day.
02:38:46.280 | - Yeah, yeah.
02:38:47.280 | And luckily in this case, we're both...
02:38:51.200 | Probably can smear each other in this conversation.
02:38:53.240 | This is wonderful.
02:38:54.240 | - I smear you all the time.
02:38:55.920 | - Given the nature of your book.
02:39:00.640 | What do you think about freedom of speech as a term and as an idea, as a way to resist
02:39:04.740 | the mechanism, this mechanism of dormant thought pile and artificially generated speech?
02:39:10.680 | This ideal of the freedom of speech and protecting speech and celebrating speech.
02:39:14.880 | - Yeah.
02:39:15.880 | Well, so this is kind of the point I was talking about earlier about King Mustache made a rule
02:39:23.800 | against...
02:39:24.800 | He's created official...
02:39:25.800 | - I just love...
02:39:28.480 | One of the amazing things about your book, as you get later and later in the book, you
02:39:32.200 | cover more and more difficult issues as a way to illustrate the importance of the vertical
02:39:36.000 | perspective.
02:39:37.400 | But there's something about using hilarious drawings throughout that make it much more
02:39:45.120 | fun and it takes you away from the personal somehow.
02:39:47.840 | And you start thinking in the space of ideas versus outside of the tribal type of thinking.
02:39:53.120 | So it's a really brilliant...
02:39:54.640 | I mean, I would advise for anybody to do...
02:39:57.660 | When they write controversial books to have hilarious drawings.
02:40:00.240 | - It's true.
02:40:01.240 | Put a silly stick figure in your thing and it lightens...
02:40:03.600 | It does, it lightens the mood.
02:40:04.600 | It gets people's guard down a little bit.
02:40:06.200 | - Yeah.
02:40:07.200 | - It reminds people that we're all friends here.
02:40:11.800 | Let's laugh at ourselves, laugh at the fact that we're in a culture war a little bit and
02:40:16.680 | now we can talk about it as opposed to getting religious about it.
02:40:21.400 | But basically, King Mustache had no First Amendment.
02:40:24.040 | He said, "The government is censoring," which is very common around the world.
02:40:29.040 | Governments censor all that.
02:40:30.040 | The US, again, there's some...
02:40:32.200 | You can argue there's some controversial things recently, but basically the US, the First
02:40:36.240 | Amendment isn't the problem.
02:40:39.360 | No one is being arrested for saying the wrong thing, but this graph is still happening.
02:40:44.680 | And so freedom of speech, what people like to say is if someone's complaining about a
02:40:52.960 | cancel culture and saying, "This is anti-free speech," people like to point out, "No, it's
02:40:59.720 | The government's not arresting you for anything.
02:41:00.720 | This is called the free market, buddy.
02:41:03.980 | This is called... you're putting your ideas out and you're getting criticized and your
02:41:07.960 | precious marketplace of ideas, there it is."
02:41:10.240 | I've gotten this a lot.
02:41:12.120 | And this is not making a critical distinction between cancel culture and criticism culture.
02:41:19.640 | Criticism culture is a little bit of this kind of high-rung idea lab stuff we talked
02:41:24.400 | about. Criticism culture attacks the idea and encourages further discussion.
02:41:34.500 | It enlivens discussion.
02:41:36.300 | It makes everyone smarter.
02:41:38.660 | Cancel culture attacks the person.
02:41:40.740 | Very different.
02:41:42.300 | Criticism culture says, "Here's why this idea is so bad.
02:41:44.100 | Let me tell you."
02:41:45.100 | Cancel culture says, "Here's why this person is bad and no one should talk to them and
02:41:48.540 | they should be fired."
02:41:50.220 | And what does that do?
02:41:51.280 | It doesn't enliven the discussion.
02:41:52.740 | It makes everyone scared to talk and it's the opposite.
02:41:55.140 | It shuts down discussion.
02:41:56.660 | So you still have your First Amendment.
02:41:58.100 | But First Amendment plus cancel culture equals you might as well be in King Must... you might
02:42:01.400 | as well have government censorship.
02:42:04.900 | First Amendment plus criticism culture, great.
02:42:07.020 | Now you have this vibrant marketplace of ideas.
02:42:09.380 | So there's a very clear difference.
02:42:13.660 | And so when people criticize the cancel culture and then someone says, "Oh, see, you're so
02:42:17.740 | sensitive now.
02:42:18.740 | Look, you're doing the cancel culture yourself.
02:42:20.340 | You're trying to punish this person for..."
02:42:22.300 | No, no, no, no, no.
02:42:24.220 | Every good liberal, and I mean that in the lower case, which is that anyone who believes
02:42:28.520 | in liberal democracies, regardless of what they believe, should stand up and say no to
02:42:32.700 | cancel culture and say, "This is not okay," regardless of what the actual topic is.
02:42:37.580 | And that makes them a good liberal versus if they're trying to cancel someone who's
02:42:41.440 | just criticizing, they're doing the opposite.
02:42:43.320 | Now they're shutting... so it's the opposite thing.
02:42:45.100 | But it's very easy to get confused.
02:42:46.420 | You can see people take advantage of the... and sometimes they just don't know it themselves.
02:42:51.500 | The lines here can be very confusing.
02:42:53.100 | The wording can be very confusing.
02:42:55.420 | Without that wording, suddenly it looks like someone who's criticizing cancel culture is
02:43:00.380 | canceling but they're not.
02:43:02.340 | - You apply this thinking to universities in particular.
02:43:07.580 | There's a great, yet another great image on the trade-off between knowledge and conviction.
02:43:14.980 | And it's what's commonly... actually, you can maybe explain to me the difference, but
02:43:20.300 | it's often referred to as the Dunning-Kruger effect, where you... when you first learn
02:43:24.420 | of a thing, you have an extremely high confidence about self-estimation of how well you understand
02:43:31.020 | that thing.
02:43:32.020 | You actually say that Dunning-Kruger means something else.
02:43:34.300 | - So yeah, when I post this, everyone's like, "Dunning-Kruger," and it's what everyone thinks
02:43:38.780 | Dunning-Kruger is.
02:43:39.780 | Dunning-Kruger is a little different.
02:43:41.780 | You have a diagonal line like this one, which is the place you are...
02:43:46.540 | I call it the humility tightrope.
02:43:48.140 | It's the humility sweet spot.
02:43:49.580 | It's exactly the right level of humility based on what you know.
02:43:52.380 | If you're below it, you're insecure.
02:43:53.380 | You actually have too much humility.
02:43:54.860 | You don't have enough confidence because you know more than you're giving yourself credit for.
02:43:58.420 | And when you're above the line, you're in the arrogance zone, right?
02:44:00.900 | You need a dose of humility, right?
02:44:02.780 | You think you know more than you do.
02:44:04.260 | So we all want to stay on that tightrope.
02:44:05.420 | And Dunning-Kruger is basically a straight line that's just... has a lower slope.
02:44:09.620 | So you start off... you still are getting more confident as you go along, but you start
02:44:17.740 | off above that line, and as you learn more, you end up below the line later.
02:44:22.460 | So, but anyway...
02:44:23.460 | So this wavy thing...
02:44:24.860 | This wavy thing is a different phenomenon.
02:44:27.340 | It's related, but...
02:44:28.860 | So this idea, so for people just listening, there's a child's hill, pretty damn sure you
02:44:35.580 | know a whole lot and feeling great about it.
02:44:37.980 | That's in the beginning.
02:44:39.180 | And then there's an insecure canyon, you crash down, acknowledging that you don't know that
02:44:44.220 | much.
02:44:45.220 | And then there's a growth mountain.
02:44:46.220 | - Grown up mountain.
02:44:48.140 | - Grown up mountain.
02:44:50.300 | Where after you feel ashamed and embarrassed about not knowing that much, you begin to
02:44:54.860 | realize that knowing how little you know is the first step in becoming someone who actually
02:44:59.940 | knows stuff.
02:45:00.940 | And that's the grown up mountain.
02:45:03.300 | And you climb and climb and climb.
02:45:05.940 | You're saying that in universities, we're pinning people at the top of the child's hill.
02:45:11.580 | - So for me, this is a very...
02:45:14.020 | I think of myself with this, because I went to college, like a lot of 18 year olds, and
02:45:17.820 | I was very cocky.
02:45:20.340 | I just thought I knew a lot, you know?
02:45:22.460 | And when it came to politics, I was like bright blue, just because I grew up in a bright blue
02:45:26.500 | suburb and I wasn't thinking that hard about it.
02:45:28.380 | And I thought that, you know...
02:45:30.420 | And what I did when I went to college is met a lot of smart conservatives and a lot of
02:45:33.840 | smart progressives.
02:45:35.420 | But I met a lot of people who weren't just going down a checklist and they knew stuff.
02:45:39.820 | And suddenly I realized that a lot of these views I have are not based on knowledge.
02:45:46.060 | They're based on other people's conviction.
02:45:49.020 | Everyone else thinks that's true, so now I think it's true.
02:45:51.900 | I'm actually like, I'm transferring someone else's conviction to me.
02:45:56.500 | And who knows why they have conviction?
02:45:57.780 | They might have conviction because they're transferring from someone else.
02:46:00.420 | And I'm a smart dude, I thought.
02:46:02.260 | Why am I giving away my own independent learning abilities here and just adopting other views?
02:46:13.020 | So anyway, it was this humbling experience.
02:46:14.500 | And it wasn't just about politics, by the way.
02:46:16.580 | It was that I had strong views about a lot of stuff and I just, I got lucky, or not lucky,
02:46:21.460 | I sought out, you know, the kind of people I sought out were the type that love to disagree
02:46:26.180 | and they were, man, they knew stuff.
02:46:29.220 | And so you're quickly in, you know, again, an idea lab culture.
02:46:32.220 | It was an idea lab.
02:46:33.220 | And also, I also went to, I started getting in the habit, I started loving listening to
02:46:36.620 | people who disagreed with me because it was so exhilarating listening to a smart person.
02:46:39.900 | When I thought there was no credence to this other argument, right?
02:46:44.460 | This side of this debate is obviously wrong.
02:46:46.900 | I wanted to see an Intelligence Squared on that debate.
02:46:49.220 | I wanted to go see, I actually got into Intelligence Squared in college.
02:46:52.100 | I wanted to see a smart person who disagrees with me talk.
02:46:56.180 | It became so fascinating to me, right?
02:46:57.700 | It was the most interesting thing.
02:46:59.020 | That was a new thing.
02:47:00.020 | I didn't think I liked that.
02:47:01.180 | And so what did that do?
02:47:02.820 | That shoved me down the humble tumble here, number three.
02:47:06.340 | It shoved me down where I started to, and then I went the other way where I realized
02:47:09.740 | that I had been, a lot of my identity had been based on this faux feeling of knowledge,
02:47:14.780 | this idea that I thought I knew everything.
02:47:17.100 | Now that I don't have that, I was like, I felt really like dumb and I felt really almost
02:47:21.060 | like embarrassed of what I knew.
02:47:22.780 | And so that's where I call this insecure canyon.
02:47:24.340 | I think it's sometimes when you're so used to thinking you know everything and then you
02:47:27.100 | realize you don't.
02:47:28.100 | And then you start to realize that actually really awesome thinkers, they don't judge
02:47:33.940 | me for this.
02:47:34.940 | They totally respect if I say, I don't know anything about this.
02:47:37.300 | They say, oh cool, you should read this and this and this.
02:47:38.780 | They don't say, you don't know anything.
02:47:39.780 | They don't say that.
02:47:41.620 | And so, and not that I'm, by the way, this is not to say I'm now on Grown Up Mountain
02:47:45.460 | and you should all join me.
02:47:46.460 | I often find myself drifting up with like a helium balloon.
02:47:49.940 | Oh, I think I read about the new thing and suddenly I think I have, I think I read three
02:47:55.060 | things about, you know, a new AI thing and I'm like, I'll go do a talk on this.
02:47:58.820 | I'm like, no, I won't.
02:47:59.820 | I don't, I just, I'm going to just be spouting out the opinion of the person I just read.
02:48:04.060 | So I have to remind myself, but it's useful.
02:48:06.300 | Now, my problem with colleges today, and I graduated in 2004,
02:48:12.660 | so this is a recent change, is that all of those speakers I went to see who disagreed with me, a lot
02:48:19.260 | of them were conservative.
02:48:20.980 | So many of those speakers would not be allowed on campuses today.
02:48:24.100 | And so many of the discussions I had were in big groups or classrooms.
02:48:27.940 | And this was still, you know, this was a liberal campus.
02:48:31.180 | So many of those disagreements, they're not happening today.
02:48:35.900 | And I've interviewed a ton of college students.
02:48:37.740 | It's chilly.
02:48:38.740 | It is, you know, people keep to themselves.
02:48:41.220 | So what's happening is not only are people losing that push off Child's Hill, which was
02:48:45.420 | so valuable to me, so valuable to me as a thinker.
02:48:48.980 | It kind of started my life as a better thinker.
02:48:51.660 | They're losing that, but actually a lot of the college classes and the vibe
02:48:55.380 | in colleges are saying that there is one right set of views and it's this
02:48:59.860 | kind of woke ideology and it's right.
02:49:03.620 | And anyone who disagrees with it is bad, and don't speak up, you know, unless
02:49:08.380 | you're going to agree with it.
02:49:09.860 | It's teaching people that Child's Hill is that, you know, it's nailing people's feet
02:49:13.980 | to Child's Hill.
02:49:15.140 | It's teaching people that these are right, these views are right.
02:49:17.700 | And like, you don't have any, you should feel a complete conviction about them.
02:49:24.260 | - How do we fix it?
02:49:27.140 | Is it part of the administration?
02:49:28.140 | Is it part of the culture?
02:49:29.140 | Is it part of the, is it part of like actually instilling in the individual, like 18 year
02:49:34.700 | olds, the idea that this is the beautiful way to live is to embrace the disagreement
02:49:40.500 | and the growth from that?
02:49:41.980 | - It's awareness and courage.
02:49:43.540 | It's the same thing.
02:49:44.540 | It's the same thing.
02:49:45.540 | The awareness is, people need to see what's happening
02:49:49.340 | here, that kids are not going to college and becoming better,
02:49:54.300 | tougher, more robust thinkers.
02:49:57.780 | They're actually going to college and becoming zealots.
02:49:59.500 | They're getting taught to be zealots.
02:50:00.860 | And the website still advertises, you know, wide variety of, you know, the website is
02:50:05.860 | a bait and switch.
02:50:06.860 | - You list all the universities, yeah, Harvard.
02:50:08.340 | - It's a bait and switch.
02:50:09.340 | It's still saying, here, you're coming here for a wide intellectual experience, basically they're
02:50:13.020 | advertising, this is an idea lab and you get there and it's like, actually it's an
02:50:15.580 | echo chamber that you're paying money for.
02:50:17.380 | So if people realize that they start to get mad, hopefully, and then courage, I mean,
02:50:23.420 | starts, you know, yes, brave students.
02:50:25.220 | There's been some very brave students who have started, you know, big think clubs and
02:50:29.180 | stuff like that, where it's like, we're going to have, you know, present both sides of a
02:50:33.140 | debate here.
02:50:34.140 | And that, that takes courage, but also courage and leadership.
02:50:38.700 | Like it's like, if you look at these colleges, it's specifically the leaders who show strength,
02:50:46.620 | who get the best results.
02:50:48.220 | Remember, the cudgel is soft.
02:50:50.340 | So if a leader of one of these places says, you know, the college presidents who have
02:50:55.180 | shown some strength, they actually don't get as much trouble.
02:50:59.060 | It's the ones who pander, the ones who, in that, you know, in that moment of truth, they
02:51:06.940 | shrink away, then they get a lot more trouble.
02:51:10.420 | The mob smells blood.
02:51:12.060 | For the listener, the podcast favorite Liv Boeree, your friend, just entered
02:51:18.380 | the room.
02:51:19.380 | Do you mind if she joins us?
02:51:21.060 | Please.
02:51:22.060 | I think there's a story she has about you.
02:51:24.860 | So Liv, you mentioned something that there's a funny story about, we haven't talked at
02:51:29.300 | all about the actual process of writing the book.
02:51:33.460 | Is there, you guys made a bet of some kind?
02:51:36.380 | Yeah.
02:51:37.380 | Is this a true story?
02:51:38.660 | Is this a completely false fabrication?
02:51:40.260 | No, no, it's true.
02:51:41.980 | Liv is, she's mean.
02:51:44.340 | I did not know mean Liv.
02:51:46.780 | She's like, she's like a bully.
02:51:48.460 | She's like scary.
02:51:49.460 | I had to have that screenshot.
02:51:50.980 | So Liv was FaceTiming me and she was like, she was like being intimidating.
02:51:54.980 | I took a screenshot and I made it my phone background.
02:51:57.100 | So every time I opened it, I was like, ah.
02:51:59.100 | So to give the background of this, it's because, if you hadn't noticed, Tim started writing
02:52:03.280 | this book, how many years ago?
02:52:05.780 | 2016.
02:52:06.780 | Mid 2016.
02:52:07.780 | Right.
02:52:08.780 | As sort of a response to like the Trump stuff.
02:52:10.860 | Not even, yeah, it was just supposed to be a mini post.
02:52:12.980 | I was like, oh, I'm so like, I was like, I'm looking at all these like future tech things
02:52:17.180 | and I feel this like uneasiness, like, ah, we're going to like mess up all these things.
02:52:21.460 | There's like some cloud over our society.
02:52:22.460 | Let me just write a mini post.
02:52:23.460 | And I opened it up to WordPress to write a one day little essay.
02:52:28.220 | And things went.
02:52:29.220 | On politics.
02:52:30.340 | It was going to be on like this feeling I had that we were,
02:52:37.260 | our tech was just growing and growing and we were becoming less wise.
02:52:41.020 | What's up?
02:52:42.020 | What's up with that?
02:52:43.020 | And I just wanted to write like just like a little like a little thousand word essay
02:52:44.500 | on like something I think we should pay attention to.
02:52:46.700 | And that was the beginning of this six year nightmare.
02:52:49.860 | Did you anticipate the blog post would take a long while?
02:52:54.860 | I don't remember the process fully in terms of, I remember you saying, oh, I'm actually
02:52:58.740 | writing.
02:52:59.740 | I'm actually writing into a bigger thing.
02:53:00.740 | And I was like, hmm.
02:53:01.740 | You know, because the more we talked about, we were talking about it.
02:53:03.740 | I was like, oh, this goes deep because I didn't really understand the full scope of the situation.
02:53:07.740 | Like nowhere near.
02:53:09.780 | And you sort of explained it.
02:53:11.060 | I was like, OK, yeah, I see that.
02:53:12.700 | And then the more we dug into it, the sort of the deeper and deeper and deeper it went.
02:53:16.100 | But no, I did not anticipate it would be six years.
02:53:18.220 | Let's put it that way.
02:53:19.700 | And when was your TED talk on procrastination?
02:53:22.340 | So that was that was March of 2016.
02:53:25.260 | And I started this book three months later and fell into the biggest procrastination
02:53:29.620 | hole that I've ever fallen into.
02:53:32.500 | The irony isn't lost on me.
02:53:33.620 | I mean, it's like it's I just like I like how much credit I have as as for that TED
02:53:38.660 | talk.
02:53:39.660 | I'm like, I am legit procrastinator.
02:53:40.660 | That is not I'm not just saying it like.
02:53:43.300 | It wasn't just that.
02:53:44.620 | Because, I mean, he did you know, he did intend it to start out as a blog post.
02:53:48.140 | But then you're like, actually, this needs to be multiple.
02:53:50.060 | Actually, let's make it into a full series.
02:53:52.660 | You know what?
02:53:53.660 | I'll turn it into a book.
02:53:54.660 | Yeah, and also what Liv witnessed a few times and my wife has
02:53:58.820 | witnessed like 30 of these is these 180 epiphanies where I'll
02:54:05.780 | have a moment, and I don't know, you know, sometimes it's
02:54:09.260 | that there's a really good idea.
02:54:10.780 | Sometimes it's like I'm just dreading having to finish this the way it is.
02:54:14.180 | And so there's epiphanies where it's like, you know what, I need to start over from the
02:54:16.260 | beginning and just make this like a short like 20 little blog post list and then I'll
02:54:21.460 | do that.
02:54:22.460 | And I was like, no, no, no, I have like a new epiphany.
02:54:23.780 | I have to, and yeah, it's kind of like the crazy person a little bit.
02:54:28.500 | But anyway, can I tell the story of the bed?
02:54:30.820 | Go for it.
02:54:31.820 | All right.
02:54:32.820 | So things came to a head when we were all on vacation in the Dominican Republic.
02:54:37.580 | Tim and his wife, me and Igor.
02:54:41.380 | And we were in the ocean.
02:54:43.460 | And I remember you'd been in the ocean for like an hour just bobbing in there becoming
02:54:48.460 | And we got talking and we were talking about the book.
02:54:51.060 | And you were expressing, you know, just the horror of the situation, basically.
02:54:58.220 | You're like, look, I'm so close, but there's still this, and then there's this.
02:55:02.540 | And an idea popped into my head, which is that we poker players will often set ourselves
02:55:10.620 | negative bets.
02:55:12.300 | You know, essentially, if we don't get a job done, then we have to do something we
02:55:16.900 | really don't want to do.
02:55:18.380 | So instead of a carrot, a really, really big stick.
02:55:21.780 | So I had the idea to ask Tim, OK.
02:55:26.260 | What is the worst organization or individual, the one that you
02:55:33.300 | would most loathe to give a large sum of money to?
02:55:36.460 | And he thought about it for a little while and he gave his answer.
02:55:40.340 | And I was like, all right, what's your net worth?
02:55:43.100 | He said his net worth.
02:55:44.100 | All right.
02:55:45.100 | 10 percent of your net worth to that thing,
02:55:46.800 | if you don't get the draft in. Because, that's right,
02:55:49.480 | just before that, I asked him: if you had a gun to your head, or to your wife's
02:55:53.900 | head, and you had to get the book into a state where you could send off a
02:55:58.940 | draft to your editor, how long would it take?
02:56:01.820 | He's like, oh, I guess like I could get it like 95 percent good in a month.
02:56:04.780 | I was like, OK, great.
02:56:06.140 | In one month's time, if you do not have that draft handed in,
02:56:12.380 | which is really scary,
02:56:13.380 | then 10 percent of your net worth is going to this thing that you really, really think is
02:56:16.960 | terrible.
02:56:17.960 | But you're forgetting the kicker.
02:56:20.240 | The kicker was that because, you know, procrastinators, they self-defeat.
02:56:25.000 | That's what they do.
02:56:26.000 | And then Liv says, I'm going to sweeten the deal: I am going to basically match you,
02:56:33.920 | and I'm going to send a huge amount of my own money there
02:56:40.400 | if you don't do it.
02:56:41.400 | And I can't, that would be really bad.
02:56:44.500 | Not only are you screwing yourself, you're screwing a friend.
02:56:46.740 | And she was like, as your friend, because I'm your friend, I will send it.
02:56:52.300 | I will send the money.
02:56:55.660 | I mean, that's, you know, like tyranny.
02:57:00.940 | And I got the draft in.
02:57:03.060 | I got the draft in.
02:57:04.060 | Just!
02:57:06.060 | I know.
02:57:07.060 | Well, I was.
02:57:08.060 | Igor can attest to this.
02:57:09.060 | Actually, it was funny because it was supposed to be by the summer solstice
02:57:10.660 | or whatever it was.
02:57:11.660 | It was a certain date.
02:57:13.620 | And I got it in at four a.m.,
02:57:15.220 | like, the next morning.
02:57:18.700 | But then they were both like, that doesn't count.
02:57:21.580 | I'm like, it does.
02:57:22.580 | For me,
02:57:23.580 | it's still the same day.
02:57:24.580 | It's OK.
02:57:25.580 | Can you imagine how fucked in the head you have to be?
02:57:26.580 | Yeah.
02:57:27.580 | So he literally, technically, passed the deadline by four hours, risking an obscene amount of money
02:57:32.820 | to a thing he loathes.
02:57:34.260 | That's how bad his sickness is.
02:57:36.180 | Because I knew the hard deadline.
02:57:37.660 | I knew that there was no way she was going to actually send that money because it was
02:57:41.420 | four a.m.
02:57:42.420 | So I knew I actually had the whole night.
02:57:43.420 | So, yeah, I should actually punish you and just send like a nominal amount to
02:57:48.180 | that thing.
02:57:49.180 | No, thanks.
02:57:51.180 | But are there some micro lessons you've learned from that on how to avoid procrastination while writing
02:57:57.020 | a book?
02:57:59.020 | Well, I've learned a lot of things.
02:58:00.020 | I mean, first, don't write a dissertation proving some
02:58:03.300 | grand theory of society, because that's...
02:58:06.980 | Really procrastinating.
02:58:08.620 | Like I would have been an awful PhD student for that reason.
02:58:11.460 | And so I'm going to do another book, and it's going to be a bunch of short chapters
02:58:14.300 | that are one-offs, because it just doesn't feed into...
02:58:18.580 | Your book is like a giant framework.
02:58:19.580 | There is grand theory,
02:58:20.580 | I know, throughout your book.
02:58:21.580 | I know.
02:58:22.580 | And I learned not to do that again.
02:58:23.580 | I did it once.
02:58:24.580 | I don't want to do it again.
02:58:25.580 | Oh, with the book.
02:58:26.580 | Yeah.
02:58:27.580 | I learned the book is a giant mistake.
02:58:29.580 | Don't do another one of these.
02:58:30.580 | Look, look, some people should.
02:58:31.580 | It's just not for me.
02:58:32.580 | I just did it.
02:58:33.580 | I know.
02:58:34.580 | And it almost killed me.
02:58:35.580 | OK, so that's the first one.
02:58:36.580 | The second one is, yeah, basically there's two ways to fix procrastination.
02:58:40.780 | One is you fix it. It's like this picture:
02:58:42.500 | you have a boat that's leaking and it's not working very well.
02:58:44.740 | You can fix it in two ways.
02:58:45.740 | You can get your hammer and nails out and your boards and actually fix the boat or you
02:58:50.780 | can duct tape it for now to get yourself across the river.
02:58:53.780 | But it's not actually fixed.
02:58:55.300 | So ideally, down the road, I will have repaired whatever kind of bizarre mental illness
02:59:01.660 | I have that makes me procrastinate, so that I just don't self-defeat in this way
02:59:05.980 | anymore.
02:59:06.980 | But in the meantime, I can duct-tape the boat by bringing what I call the panic monster
02:59:11.540 | into the situation, via things like this, and this scary person.
02:59:16.860 | Having external pressure of some kind is critical for me.
02:59:21.100 | Yes, I don't have the muscle to do the work I need to do without external pressure.
02:59:25.700 | By the way, Liv, is there a possible future where you write a book?
02:59:31.580 | And meanwhile, by the way, huge procrastinator.
02:59:33.860 | That's the funny thing about this.
02:59:34.860 | Yeah, I mean, I'm... How long did your last video take?
02:59:37.900 | Oh, my God.
02:59:38.900 | Is there advice you can give to Liv on how to get the videos done faster?
02:59:42.180 | Well, it would be the same exact thing.
02:59:43.860 | Actually, I can give good procrastination advice.
02:59:46.100 | Panic monster.
02:59:47.100 | Yeah, well, we should do it together.
02:59:48.900 | It should be like we have this day.
02:59:50.220 | But right.
02:59:51.220 | You know, we should actually just do another bet.
02:59:54.260 | I have to have my script done by this time.
02:59:56.780 | I got to get the third part out.
02:59:57.820 | Because then you'll actually do it.
03:00:02.860 | And the thing is the time.
03:00:06.820 | It's like, if you could take three weeks on a video, and instead you take 10
03:00:11.420 | weeks, it's not like, oh, well, I'm having more fun in those 10 weeks.
03:00:11.420 | The whole 10 weeks are bad.
03:00:12.420 | Yeah, it's torture.
03:00:14.420 | So you're just having a bad time.
03:00:15.740 | And you're getting less work done, and less work out.
03:00:17.820 | And it's not like you're enjoying your personal life.
03:00:19.100 | It's bad for your relationships.
03:00:20.740 | It's bad for your own...
03:00:22.380 | You keep doing it anyway.
03:00:23.940 | Yeah, well, a lot of people.
03:00:28.460 | Why do people have trouble keeping a diet?
03:00:28.460 | Right?
03:00:29.460 | Yeah.
03:00:30.460 | Primitive mind.
03:00:31.460 | Why'd you point at me?
03:00:32.460 | Because I'm expensive.
03:00:33.460 | What's your procrastination weakness?
03:00:35.340 | Do you have one?
03:00:36.340 | Everything.
03:00:37.340 | Everything.
03:00:40.340 | Preparing for a conversation.
03:00:41.660 | I had your book.
03:00:43.340 | Amazing book.
03:00:44.340 | I really enjoyed it.
03:00:45.340 | I started reading it.
03:00:46.340 | I was like, this is awesome.
03:00:48.580 | It's so awesome that I'm going to save it for when I'm behind a computer and can take notes.
03:00:53.900 | Like good notes.
03:00:55.180 | Of course, that resulted in last-minuting everything, everything I'm doing
03:00:59.500 | in my life.
03:01:00.500 | Not everyone's like that.
03:01:01.500 | You know, people self-defeat in different ways.
03:01:02.980 | Some people don't have this particular problem.
03:01:04.820 | Adam Grant calls himself a pre-crastinator: he gets an assignment,
03:01:09.220 | he will go home and do it until it's done and handed in, which is also not necessarily
03:01:13.340 | good.
03:01:14.340 | You know, it's like you're rushing it either way, but it's better.
03:01:17.020 | But some people have the opposite thing, where the looming deadline makes them
03:01:23.940 | so anxious that they go and fix it.
03:01:25.820 | Right?
03:01:26.820 | And the procrastinator I think has a similar anxiety, but they resolve it in a totally
03:01:30.220 | different way.
03:01:31.220 | They don't solve it.
03:01:32.220 | They just live with the anxiety.
03:01:33.220 | Right.
03:01:34.220 | Right.
03:01:35.220 | They just live with the anxiety.
03:01:36.220 | Now, I think there's an even bigger group of people.
03:01:37.220 | So there's these people, the Adam Grants, there's people like me, and then there's people
03:01:40.900 | who have a healthy relationship with deadlines, but they're still part of a bigger group of
03:01:43.900 | people that need a deadline there to do something.
03:01:52.900 | So they still are motivated by a deadline.
03:01:57.140 | And as soon as you get to all the things in life that don't have a deadline, like working
03:02:00.140 | out and working on that album you wanted to write, they don't do anything either.
03:02:03.660 | So that's why procrastination is a much bigger problem than people realize,
03:02:07.140 | because it's not just the funny last-second people.
03:02:09.500 | It's anyone who actually can't get things done that don't have a deadline.
03:02:14.780 | You dedicate your book, quote, to Tannis, who never planned on being married to someone
03:02:20.060 | who would spend six years talking about his book on politics, but here we are.
03:02:25.100 | What's the secret to a successful relationship with a procrastinator?
03:02:28.540 | Maybe for both of you.
03:02:30.460 | Well, I think the first and most important thing.
03:02:34.300 | You already started with a political answer, I can tell.
03:02:36.580 | Okay, go ahead.
03:02:37.580 | No, the first and most important thing is, for people who don't procrastinate,
03:02:41.740 | the instinct is to judge it, to either
03:02:48.540 | think the procrastinator is just being a loser, or to take it personally, you know.
03:02:52.700 | And instead to see this as some form of addiction, or some form of ailment.
03:03:01.580 | You know, they're not just being a dick, right?
03:03:03.140 | Like, they have a problem, and so some compassion, but then also maybe finding that line where
03:03:08.340 | you can, you know, maybe apply some tough love, some middle ground.
03:03:12.820 | On the other hand, you might say that, you know, you don't want the significant other
03:03:16.780 | relationship where it's like, they're the one nagging you.
03:03:18.640 | Maybe you don't want them even being part of that.
03:03:20.780 | And I think maybe it's, you know, better to have a Liv do it instead.
03:03:23.500 | Right, having someone who can create the infrastructure where they aren't the direct
03:03:27.620 | stick. You need a bit of carrot and stick, right?
03:03:30.340 | Maybe they can be the person who keeps reminding them of the carrot.
03:03:34.100 | And then they set up the friend group to be the stick.
03:03:36.260 | And then that keeps your relationship in a good place.
03:03:39.620 | Yeah, a stick, like looming in the background, that's your friend group.
03:03:43.460 | Okay, at the beginning of the conversation, we talked about how all of human history can
03:03:47.700 | be presented as a thousand-page book.
03:03:52.540 | What are you excited about for, what would you say, the 1,001st page?
03:04:00.220 | So the next 250 years, what are you most excited about?
03:04:05.000 | I'm most excited about... have you read the Fable of the Dragon-Tyrant?
03:04:11.380 | Okay, well, it's an allegory for death by Nick Bostrom, and he
03:04:16.300 | compares death to a dragon that eats 60 million people, or whatever the number
03:04:22.100 | is, every year.
03:04:23.100 | And every year we shepherd those people up and feed them to the dragon.
03:04:26.180 | And there's a Stockholm syndrome when we say that's just our lot, man,
03:04:29.820 | and that's what we have to do.
03:04:31.340 | And anyone who says maybe we should try to beat the dragon, they get called vain and
03:04:34.380 | narcissistic.
03:04:36.340 | But someone who does chemo, no one calls them vain or narcissistic.
03:04:40.760 | They say, you know, good for you, right?
03:04:43.060 | You're a hero.
03:04:44.060 | You're fighting the good fight.
03:04:46.060 | So I think there's some disconnect here.
03:04:47.420 | And I think that if we can get out of that Stockholm syndrome and realize that death
03:04:51.740 | is just the machine, the human physical machine failing, and that there's no law of nature
03:04:59.340 | that says you can't, with enough technology, repair the machine and keep it going until...
03:05:05.980 | I don't think anyone wants to live forever.
03:05:08.060 | People think they do; no one does.
03:05:09.420 | But until people are ready.
03:05:11.780 | And I think when we hit a world where we have enough tech that we can continue to
03:05:17.340 | keep the human machine alive until the person says, I'm done, I'm ready.
03:05:20.860 | I think we will look back and we will think that anything before that time, that'll be
03:05:24.260 | the real AD/BC.
03:05:25.260 | You know, we'll look back at BC before the big advancement and it'll seem so sad and
03:05:30.580 | so heartbreaking, barbaric.
03:05:31.820 | And people will say, I can't believe that humans like us had to live with that when
03:05:36.180 | they lost loved ones and they died before they were ready.
03:05:39.240 | I think that's the ultimate achievement, but we need to stop criticizing and smearing people
03:05:46.360 | who talk about it.
03:05:47.760 | - So you think that's actually doable in the next 250 years?
03:05:51.400 | - Yes.
03:05:52.400 | - Okay.
03:05:53.400 | - A lot happens in 250 years, especially when technology is growing exponentially, yeah.
03:05:58.400 | - And you think humans will be around, versus AI completely taking over, where mortality
03:06:02.800 | means something completely different?
03:06:03.800 | - I mean, look, the optimist in me, and maybe the stupid kind of 2023 person in me, says,
03:06:09.200 | yeah, of course, we'll make it, we'll figure it out.
03:06:11.360 | But you know, I mean, I have a friend who knows as much
03:06:18.080 | about the future as anyone I know.
03:06:19.440 | I mean, he's really, he's a big investor in, you know, future tech and he's really on the
03:06:23.960 | pulse of things and he just says, the future's gonna be weird.
03:06:26.480 | That's what he says, the future's gonna be weird.
03:06:27.480 | - Good prediction.
03:06:28.480 | - And it's gonna be weird.
03:06:29.480 | Don't look at the last few decades of your life and apply that forward and say, that's
03:06:32.240 | just what life is like.
03:06:33.240 | No, no, no, it's gonna be weird and different.
03:06:34.840 | - Well, some of my favorite things in this world are weird.
03:06:38.600 | And speaking of which, it's good to have this conversation.
03:06:41.800 | It's good to have you as friends.
03:06:43.280 | This was an incredible one.
03:06:44.280 | Thanks for coming back.
03:06:45.800 | And thanks for talking with me a bunch more times.
03:06:48.520 | This was awesome.
03:06:49.520 | - Thank you, Lex.
03:06:50.520 | - Thank you.
03:06:51.520 | - Thanks for listening to this conversation with Tim Urban.
03:06:53.320 | To support this podcast, please check out our sponsors in the description.
03:06:57.440 | And now let me leave you with some words from Winston Churchill.
03:07:01.320 | When there's no enemy within, the enemies outside cannot hurt you.
03:07:06.880 | Thank you for listening, and hope to see you next time.