
Niall Ferguson: History of Money, Power, War, and Truth | Lex Fridman Podcast #239


Chapters

0:00 Introduction
1:34 University of Austin (UATX)
34:29 Sam Harris
52:56 Elon Musk
61:15 Money
71:10 Hyperinflation
76:35 Bitcoin
93:17 Ethereum and smart contracts
102:04 Worst disasters in human history
124:02 How history will remember the current pandemic
137:36 Hope for the future
146:06 Love
152:44 Meaning of life

Whisper Transcript

00:00:00.000 | The following is a conversation with Niall Ferguson,
00:00:02.640 | one of the great historians of our time,
00:00:04.960 | at times controversial and always brilliant,
00:00:07.660 | whether you agree with him or not.
00:00:09.800 | He's an author of 16 books on topics
00:00:12.960 | covering the history of money, power, war,
00:00:16.240 | pandemics, and empire.
00:00:18.440 | Previously at Harvard, currently at Stanford,
00:00:21.360 | and today launching a new university here in Austin, Texas
00:00:25.920 | called the University of Austin,
00:00:28.720 | a new institution built from the ground up
00:00:31.840 | to encourage open inquiry and discourse
00:00:34.520 | by both thinkers and doers,
00:00:36.240 | from philosophers and historians
00:00:38.120 | to scientists and engineers,
00:00:40.200 | embracing debate, dissent, and self-examination,
00:00:43.900 | free to speak, to disagree, to think,
00:00:46.880 | to explore truly novel ideas.
00:00:49.760 | The advisory board includes Steven Pinker,
00:00:52.080 | Jonathan Haidt, and many other amazing people,
00:00:55.180 | with one exception, me.
00:00:58.560 | I was graciously invited to be on the advisory board,
00:01:01.240 | which I accepted in the hope of doing my small part
00:01:04.320 | in helping build the future of education and open discourse,
00:01:08.040 | especially in the fields of artificial intelligence,
00:01:10.480 | robotics, and computing.
00:01:12.200 | We spend the first hour of this conversation
00:01:14.440 | talking about this new university
00:01:16.560 | before switching to talking about
00:01:18.880 | some of the darkest moments in human history
00:01:21.160 | and what they reveal about human nature.
00:01:23.640 | This is the Lex Fridman Podcast.
00:01:26.480 | To support it, please check out our sponsors
00:01:28.640 | in the description.
00:01:29.920 | And now, here's my conversation with Niall Ferguson.
00:01:33.800 | You are one of the great historians of our time,
00:01:37.260 | respected, sometimes controversial.
00:01:40.100 | You have flourished in some of the best universities
00:01:42.160 | in the world, from NYU to London School of Economics
00:01:45.160 | to Harvard, and now to Hoover Institution at Stanford.
00:01:49.480 | Before we talk about the history of money, war, and power,
00:01:53.220 | let us talk about a new university.
00:01:55.740 | You're a part of launching here in Austin, Texas.
00:01:59.720 | It is called University of Austin, UATX.
00:02:02.720 | What is its mission, its goals, its plan?
00:02:08.100 | - I think it's pretty obvious to a lot of people
00:02:13.960 | in higher education that there's a problem.
00:02:17.600 | And that problem manifests itself
00:02:19.480 | in a great many different ways.
00:02:22.080 | But I would sum up the problem as being
00:02:25.800 | a drastic chilling of the atmosphere
00:02:30.640 | that constrains free speech, free exchange,
00:02:34.520 | even free thought.
00:02:35.760 | And I had never anticipated that this would happen
00:02:39.640 | in my lifetime.
00:02:40.460 | My academic career began in Oxford in the 1980s
00:02:43.160 | when anything went.
00:02:45.600 | One sensed that a university was a place
00:02:48.040 | where one could risk saying the unsayable.
00:02:51.320 | And debate the undebatable.
00:02:54.320 | So the fact that in a relatively short space of time,
00:02:58.640 | a variety of ideas, critical race theory, or wokeism,
00:03:02.120 | whatever you want to call it,
00:03:02.960 | a variety of ideas have come along that seek to limit,
00:03:06.560 | and quite drastically limit what we can talk about,
00:03:10.040 | strikes me as deeply unhealthy.
00:03:12.580 | And I'm not sure, and I've thought about this
00:03:14.500 | for a long time, you can fix it
00:03:15.880 | with the existing institutions.
00:03:17.700 | I think you need to create a new one.
00:03:20.060 | And so after much deliberation, we decided to do it.
00:03:24.000 | And I think it's a hugely timely opportunity
00:03:29.000 | to do what people used to do in this country,
00:03:33.080 | which was to create new institutions.
00:03:34.640 | I mean, that used to be the default setting of America.
00:03:37.520 | We sort of stopped doing that.
00:03:38.600 | I mean, I look back and I thought,
00:03:39.600 | why are there no new universities?
00:03:41.840 | Or at least, if there are,
00:03:42.960 | why do they have so little impact?
00:03:45.080 | It seems like we have the billionaires,
00:03:48.000 | we have the need, let's do it.
00:03:50.460 | - So you still believe in institutions, in the university,
00:03:53.960 | in the ideal of the university?
00:03:56.060 | - I believe passionately in that ideal.
00:03:59.220 | There's a reason they've been around
00:04:00.740 | for nearly a millennium.
00:04:02.780 | There is a unique thing that happens
00:04:07.620 | on a university campus when it's done right.
00:04:11.980 | And that is the transfer of knowledge between generations.
00:04:16.580 | That is a very sacred activity,
00:04:18.300 | and it seems to withstand major changes in technology.
00:04:22.840 | So this form that we call the university
00:04:25.200 | predates the printing press,
00:04:27.280 | survived the printing press,
00:04:29.620 | continued to function through the scientific revolution,
00:04:32.080 | the enlightenment, the industrial revolution to this day.
00:04:37.080 | And I think it's because,
00:04:39.320 | maybe because of evolutionary psychology,
00:04:41.580 | we need to be together in one relatively confined space
00:04:46.580 | when we're in our late teens and early twenties
00:04:51.860 | for the knowledge transfer
00:04:53.240 | between the generations to happen.
00:04:56.340 | That's my feeling about this.
00:04:58.220 | But in order for it to work well,
00:05:00.760 | there need to be very few constraints.
00:05:03.700 | There needs to be a sense
00:05:05.760 | that one can take intellectual risk.
00:05:07.920 | Remember, people in their late teens and early twenties
00:05:10.760 | are adults, but they're inexperienced adults.
00:05:14.220 | And if I look back on my own time as an undergraduate,
00:05:17.700 | saying stupid things was my MO.
00:05:21.680 | My way to finding good ideas
00:05:24.960 | was through a minefield of bad ideas.
00:05:28.720 | I feel so sorry for people like me today,
00:05:33.260 | people age 18, 19, 20 today,
00:05:36.120 | who are intellectually very curious,
00:05:40.500 | ambitious, but inexperienced,
00:05:42.980 | because the minefields today are absolutely lethal
00:05:47.900 | and one wrong foot and it's cancellation.
00:05:52.060 | I said this to Peter Thiel the other day,
00:05:54.380 | imagine being us now.
00:05:57.020 | I mean, we were obnoxious undergraduates.
00:06:01.180 | There's nothing that Peter did at Stanford
00:06:03.500 | that Andrew Sullivan and I were not doing at Oxford,
00:06:06.380 | and perhaps we were even worse.
00:06:08.040 | But it was so not career-ending
00:06:11.700 | to be an absolutely insufferable,
00:06:15.540 | obnoxious undergraduate then.
00:06:21.040 | Today, if people like us exist today,
00:06:24.500 | they must live in a state of constant anxiety
00:06:28.240 | that they're going to be outed for some heretical statement
00:06:32.860 | that they made five years ago on social media.
00:06:35.260 | So part of what motivates me is the desire
00:06:38.540 | to give the me's of today a shot at free thinking
00:06:43.540 | and really, I'd call it,
00:06:48.420 | aggressive learning, learning where you're really pushed.
00:06:54.300 | And I just think that stopped happening
00:06:56.340 | on the major campuses,
00:06:58.180 | because whether at Harvard where I used to teach
00:07:00.260 | or at Stanford where I'm now based,
00:07:01.720 | I sense a kind of suffocating atmosphere of self-censorship
00:07:06.720 | that means people are afraid
00:07:09.580 | to take even minimal risk in class.
00:07:12.500 | I mean, just take, for example,
00:07:14.740 | a survey that was published earlier this year
00:07:17.760 | that revealed, this is of undergraduates
00:07:20.380 | in four-year programs in the US,
00:07:22.380 | 85% of self-described liberal students
00:07:26.620 | said they would report a professor
00:07:28.780 | to the university administration
00:07:30.820 | if he or she said something they considered offensive.
00:07:33.780 | And something like 75% said they'd do it
00:07:36.560 | to a fellow undergraduate.
00:07:38.080 | That's the kind of culture that's evolved
00:07:40.400 | in our universities.
00:07:41.740 | So we need a new university in which none of that is true,
00:07:44.760 | in which you can speak your mind, say stupid things,
00:07:48.120 | get it completely wrong, and live to tell the tale.
00:07:52.460 | There's a lot more going on, I think,
00:07:54.280 | because when you start thinking about what's wrong
00:07:56.800 | with a modern university,
00:07:58.340 | many, many more things suggest themselves.
00:08:01.820 | And I think there's an opportunity here
00:08:03.520 | to build something that's radically new in some ways
00:08:06.980 | and radically traditional in other ways.
00:08:10.140 | For example, I have a strong preference
00:08:11.880 | for the tutorial system that you see
00:08:14.420 | at Oxford and Cambridge, which is small group teaching
00:08:17.780 | and highly Socratic in its structure.
00:08:21.300 | I think it'd be great to bring that to the United States
00:08:23.660 | where it doesn't really exist.
00:08:25.780 | But at the same time, I think we should be doing
00:08:28.320 | some very 21st century things,
00:08:30.900 | making sure that while people are reading
00:08:33.140 | and studying classic works,
00:08:35.620 | they're also going to be immersed in the real world
00:08:39.580 | of technological innovation,
00:08:41.700 | a world that you know very well.
00:08:43.420 | And I'd love to get a synthesis
00:08:47.260 | of the ancient and classical,
00:08:49.660 | which we're gradually letting fade away,
00:08:52.300 | with the novel and technological.
00:08:55.540 | So we wanna produce people
00:08:57.280 | who can simultaneously talk intelligently
00:09:00.600 | about Adam Smith, or for that matter, Shakespeare or Proust,
00:09:05.600 | and have a conversation with you about where AI is going
00:09:11.440 | and how long it will be before I can get driven here
00:09:14.920 | by a self-driving vehicle,
00:09:17.040 | allowing me to have my lunch and prepare
00:09:18.800 | rather than focus on the other crazy people on the road.
00:09:22.280 | So that's the dream, that we can create something
00:09:25.320 | which is partly classical and partly 21st century,
00:09:29.220 | and we look around and we don't see it.
00:09:31.100 | If you don't see an institution
00:09:32.980 | that you really think should exist,
00:09:35.060 | I think you have a responsibility to create it.
00:09:38.100 | - So you're thinking including something bigger
00:09:41.940 | than just liberal education,
00:09:43.180 | also including science, engineering, and technology.
00:09:46.060 | I should also comment that I mostly stay out of politics
00:09:51.500 | and out of some of these aspects of liberal education
00:09:55.640 | that's kind of been the most controversial
00:09:57.480 | and difficult within the university,
00:09:59.560 | but there is a kind of ripple effect of fear
00:10:04.480 | within that space into science and engineering
00:10:08.920 | and technology that I think has a nature
00:10:13.920 | that's difficult to describe.
00:10:16.160 | It doesn't have a controversial nature,
00:10:17.680 | it just has a nature of fear,
00:10:19.360 | where you're not, you mentioned saying stupid stuff
00:10:22.560 | as a young 20-year-old.
00:10:24.180 | For example, deep learning, machine learning
00:10:30.080 | is really popular in computer science now
00:10:32.760 | as an approach for creating
00:10:34.280 | artificial intelligence systems.
00:10:35.880 | It is controversial in that space to say
00:10:41.720 | anything against machine learning,
00:10:43.640 | to explore ideas suggesting
00:10:46.460 | this is going to lead to a dead end.
00:10:49.120 | Now, that takes some guts to do as a young 20-year-old
00:10:54.120 | within a classroom, to think like that,
00:10:57.340 | to raise that question in a machine learning course.
00:10:59.720 | It sounds ridiculous, 'cause it's like
00:11:01.360 | who's going to complain about this,
00:11:03.800 | but the fear that starts in a course on history
00:11:08.800 | or some course that covers society,
00:11:13.560 | the fear ripples and affects those students
00:11:16.940 | that are asking big out-of-the-box questions
00:11:18.860 | about engineering, about computer science.
00:11:22.080 | And there's a lot, there's linear algebra
00:11:24.520 | that's not going to change,
00:11:26.220 | but then there's applied linear algebra,
00:11:29.140 | which is machine learning,
00:11:30.200 | and that's when robots and real systems touch human beings,
00:11:34.940 | and that's when you have to ask yourself
00:11:36.840 | these difficult questions about humanity,
00:11:39.980 | even in the engineering and science and technology courses.
00:11:42.980 | - And these are not separate worlds in two senses.
00:11:46.380 | I've just taken delivery of my copy of the book
00:11:50.020 | that Eric Schmidt and Henry Kissinger
00:11:51.580 | have co-authored on artificial intelligence,
00:11:54.980 | the central question of which is
00:11:56.600 | what does this mean for us broadly?
00:11:59.620 | But they're not separate worlds in C.P. Snow's sense
00:12:04.060 | of the chasm between science and arts,
00:12:07.020 | because on a university campus,
00:12:09.780 | everything is contagious from a novel coronavirus
00:12:13.620 | to the behaviors that are occurring
00:12:16.340 | in the English department.
00:12:18.340 | Those behaviors, if denunciation becomes a norm,
00:12:22.980 | you know, undergraduate denounces professor,
00:12:25.140 | teaching assistant denounces undergraduate,
00:12:27.060 | those behaviors are contagious and will spread inexorably,
00:12:29.820 | first to social science and then to natural sciences.
00:12:32.460 | And I think that's part of the reason why
00:12:35.940 | when this started to happen,
00:12:37.340 | when we started to get the origins
00:12:38.980 | of disinvitation and cancel culture,
00:12:42.340 | it was not just a few conservative professors
00:12:45.500 | in the humanities who had to worry,
00:12:47.280 | everybody had to worry,
00:12:48.580 | because eventually it was going to come
00:12:51.580 | even to the most apparently hard stem part of the campus.
00:12:56.580 | It's contagious.
00:12:58.860 | This is something Nicholas Christakis should look at,
00:13:00.860 | because he's very good at looking at the way
00:13:03.220 | in which social networks,
00:13:05.100 | like the ones that exist in a university,
00:13:06.840 | can spread everything.
00:13:08.700 | But I think when we look back and ask,
00:13:10.540 | why did wokeism spread so rapidly
00:13:13.560 | and rapidly out of humanities
00:13:15.600 | into other parts of universities?
00:13:17.160 | And why did it spread across the country
00:13:19.580 | and beyond the United States
00:13:21.600 | to the other English speaking universities?
00:13:23.460 | It's because it's a contagion.
00:13:25.380 | And these behaviors are contagious.
00:13:28.500 | The president of a university I won't name
00:13:32.060 | said to me that he receives every day
00:13:34.740 | at least one denunciation,
00:13:37.520 | one call for somebody or other to be fired
00:13:40.880 | for something that they said.
00:13:43.060 | That's the crazy kind of totalitarianism light
00:13:46.880 | that now exists in our universities.
00:13:50.580 | And of course, the people who want to downplay this say,
00:13:52.740 | oh, well, there only have been
00:13:54.020 | a hundred and something disinvitations,
00:13:55.800 | or, oh, there really aren't that many cases.
00:13:57.920 | But the point is that the famous events,
00:14:00.400 | the events that get the attention,
00:14:02.500 | are responsible for a general chilling
00:14:05.020 | that as you say, spreads to every part of the university
00:14:07.840 | and creates a very familiar culture
00:14:10.780 | in which people are afraid to say what they think.
00:14:13.400 | Self-censorship, look at the Heterodox Academy data on this,
00:14:16.820 | grows and grows.
00:14:17.860 | So now a majority of students will say,
00:14:20.340 | this is clear from the latest Heterodox Academy surveys,
00:14:22.940 | we are scared to say what we think
00:14:24.740 | in case we get denounced, in case we get canceled.
00:14:28.520 | But that's just not the correct atmosphere
00:14:31.300 | for a university in a free society.
00:14:33.980 | To me, what's really creepy
00:14:36.580 | is how many of the behaviors I see
00:14:39.100 | on university campuses today
00:14:40.740 | are reminiscent of the way that people used to behave
00:14:43.380 | in the Soviet Union, or in the Soviet bloc,
00:14:45.580 | or in Mao's China.
00:14:47.260 | The sort of totalitarianism light
00:14:49.740 | that I think we're contending with here,
00:14:52.740 | which manifests itself as denunciations,
00:14:56.460 | people informing on superiors,
00:14:59.660 | some people using it for career advantage,
00:15:02.820 | other people reduced to hapless, desperate apology
00:15:07.100 | to try to exonerate themselves,
00:15:08.980 | people disappearing, metaphorically, if not literally.
00:15:12.460 | All of this is so reminiscent of the totalitarian regimes
00:15:16.100 | that I studied earlier in my career
00:15:17.760 | that it makes me feel sick.
00:15:20.100 | And what makes me really feel sick
00:15:21.480 | is that the people doing this stuff,
00:15:23.660 | the people who write the letters of denunciation
00:15:26.520 | are apparently unaware
00:15:28.020 | that they're behaving exactly like people
00:15:29.780 | in Stalin's Soviet Union.
00:15:31.220 | They don't know that.
00:15:32.620 | So they clearly have,
00:15:33.700 | there's been a massive educational failure.
00:15:36.020 | If somebody can write an anonymous
00:15:37.700 | or non-anonymous letter of denunciation and not feel shame,
00:15:41.220 | I mean, you should feel morally completely contaminated
00:15:44.620 | as you're doing that,
00:15:45.580 | but people haven't been taught
00:15:47.540 | the realities of totalitarianism.
00:15:49.740 | For all these reasons, I think you need to try
00:15:52.620 | at least to create a new institution
00:15:54.820 | where those pathologies will be structurally excluded.
00:15:59.820 | - So maybe a difficult question.
00:16:04.080 | Maybe you'll push back on this,
00:16:06.040 | but you're widely seen politically as a conservative.
00:16:09.140 | Hoover Institution is politically conservative.
00:16:12.420 | What is the role of politics at the University of Austin?
00:16:15.900 | Because some of the ideas, people listening to this,
00:16:18.740 | when they hear the ideas you're expressing,
00:16:21.300 | they may think there's a lean to these ideas.
00:16:23.940 | There's a conservative lean to these ideas.
00:16:26.180 | Is there such a lean?
00:16:28.320 | - There will certainly be people who say that
00:16:30.140 | because the standard mode
00:16:32.900 | of trying to discredit any new initiative is to say,
00:16:36.580 | oh, this is a sinister conservative plot.
00:16:40.260 | But one of our co-founders, Heather Heying,
00:16:45.600 | is definitely not a conservative.
00:16:49.160 | She's as committed to the idea of academic freedom as I am.
00:16:53.260 | But I think on political issues,
00:16:54.700 | we probably agree on almost nothing.
00:16:57.320 | And at least I would guess.
00:16:59.940 | But politics, Max Weber made this point a long time ago,
00:17:04.740 | that politics really should stop at the threshold
00:17:07.300 | of the classroom, of the lecture hall.
00:17:09.700 | And in my career, I've always tried to make sure
00:17:11.900 | that when I'm teaching,
00:17:13.680 | it's not clear where I stand politically,
00:17:17.960 | though of course undergraduates
00:17:19.300 | are insatiably curious and want to know,
00:17:22.100 | but it shouldn't be clear from what I say
00:17:24.620 | because indoctrination on a political basis
00:17:28.020 | is an abuse of the power of the professor,
00:17:30.860 | as Weber rightly said.
00:17:32.940 | So I think one of the key principles
00:17:35.940 | of the University of Austin
00:17:37.820 | will be that Weberian principle,
00:17:39.860 | that politics is not an appropriate subject
00:17:43.860 | for the lecture hall, for the classroom.
00:17:48.340 | And we should pursue truth
00:17:51.420 | and enshrine liberty of thought.
00:17:55.480 | If that's a political issue, then I can't help you.
00:17:58.740 | I mean, if you're against freedom of thought,
00:18:01.100 | then we don't really have much of a discussion to have.
00:18:04.900 | And clearly there are some people
00:18:06.060 | who politically seem quite hostile to it.
00:18:08.300 | But my sense is that there are plenty of people
00:18:11.020 | on the left in academia.
00:18:12.460 | Think of that interesting partnership
00:18:14.480 | between Cornel West and Robby George,
00:18:18.260 | which has been institutionalized
00:18:20.460 | in the Academic Freedom Alliance.
00:18:22.340 | It's bipartisan, this issue.
00:18:23.980 | It really, really is.
00:18:25.340 | After all, 50 years ago,
00:18:27.340 | it was the left that was in favor of free speech.
00:18:30.420 | The right still has an anti-free speech element to it.
00:18:33.900 | Look how quickly they're out to ban critical race theory.
00:18:37.100 | Critical race theory won't be banned
00:18:38.500 | at the University of Austin.
00:18:39.940 | Wokeism won't be banned.
00:18:41.900 | Everything will be up for discussion,
00:18:44.020 | but the rules of engagement will be clear.
00:18:46.060 | Chicago principles, those will be enforced.
00:18:49.380 | And if you have to give a lecture on,
00:18:53.540 | well, let's just take a recent example,
00:18:56.420 | the Dorian Abbott case.
00:18:57.820 | If you're giving a lecture on astrophysics,
00:19:02.060 | but it turns out that in some different venue
00:19:04.780 | you express skepticism about affirmative action,
00:19:08.180 | well, it doesn't matter.
00:19:09.300 | It's irrelevant.
00:19:10.180 | We want to know what your thoughts are on astrophysics
00:19:13.060 | 'cause that's what you're supposed to be giving a lecture on.
00:19:16.020 | That used to be understood.
00:19:17.620 | I mean, at the Oxford of the 1980s,
00:19:19.220 | there were communists and there were ultra-Tories.
00:19:22.780 | At Cambridge, there were people who were so reactionary
00:19:25.700 | that they celebrated Franco's birthday,
00:19:28.100 | but there were also out-and-out communists
00:19:30.100 | down the road at King's College.
00:19:32.620 | The understanding was that that kind of intellectual diversity
00:19:36.060 | was part and parcel of university life.
00:19:38.740 | And frankly, for an undergraduate,
00:19:39.900 | it was great fun to cross the road
00:19:41.900 | and go from outright conservatism,
00:19:45.220 | ultra-Toryism to communism.
00:19:47.580 | One learns a lot that way.
00:19:49.860 | But the issue is when you're promoting
00:19:52.220 | or hiring or tenuring people,
00:19:54.740 | their politics is not relevant.
00:19:57.340 | It really isn't.
00:19:58.900 | And when it started to become relevant,
00:20:01.380 | and I remember this coming up
00:20:03.180 | at the Harvard History Department late in my time there,
00:20:06.700 | I felt deeply, deeply uneasy
00:20:08.980 | that we were having conversations
00:20:11.460 | that amounted to,
00:20:13.380 | "Well, we can't hire X person
00:20:15.660 | despite their obvious academic qualifications
00:20:19.860 | because of some political issue."
00:20:24.020 | That's not what should happen at a healthy university.
00:20:27.420 | - Some practical questions.
00:20:30.780 | Will University of Austin be a physical in-person university
00:20:36.060 | or a virtual university?
00:20:37.940 | What are some, in that aspect,
00:20:40.540 | where the classroom is?
00:20:42.300 | - It will be a real space institution.
00:20:45.700 | There may be an online dimension to it
00:20:50.620 | because there clearly are a lot of things
00:20:52.260 | that you can do via the internet.
00:20:56.300 | But the core activity of teaching and learning,
00:21:00.140 | I think, requires real space.
00:21:01.980 | And I've thought about this a long time. I debated
00:21:04.860 | Sebastian Thrun about this many, many years ago
00:21:07.300 | when he was a complete believer in,
00:21:09.460 | let's call it the metaversity, to go with the metaverse.
00:21:11.940 | I mean, the metaversity was going to happen, wasn't it?
00:21:13.980 | But I never really believed in the metaversity.
00:21:16.860 | I didn't do MOOCs because I just didn't think
00:21:19.300 | you'd A, be able to retain the attention,
00:21:22.220 | B, be able to cope with the scaled grading
00:21:25.860 | that was involved.
00:21:26.940 | I think there's a reason universities have been around
00:21:29.900 | in their form for about a millennium.
00:21:32.180 | You kind of need to all be in the same place.
00:21:34.460 | So I think answer to that question,
00:21:36.820 | definitely a campus in the Austin area,
00:21:39.900 | that's where we'll start.
00:21:41.700 | And if we can allow some of our content
00:21:45.060 | to be available online, great, we'll certainly do that.
00:21:48.500 | - Another question is,
00:21:49.900 | what kind of courses and programming will it offer?
00:21:52.460 | Is that something you can speak to?
00:21:54.020 | What's your vision here?
00:21:55.660 | - We think that we need to begin more like a startup
00:22:00.420 | than like a full-service university from day one.
00:22:05.020 | So our vision is that we start with a summer school,
00:22:08.140 | which will offer, provocatively, the Forbidden Courses.
00:22:12.380 | We want, I think, to begin by giving a platform
00:22:17.380 | to the professors who've been most subject
00:22:21.820 | to cancel culture, and also to give an opportunity
00:22:23.900 | to students who want to hear them to come.
00:22:25.700 | So we'll start with a summer school
00:22:27.100 | that will be somewhat in the tradition
00:22:29.980 | of those institutions in the interwar period
00:22:33.300 | that were havens for refugees.
00:22:34.780 | So we're dealing here with the internal refugees
00:22:37.340 | of the woke era.
00:22:39.740 | We'll start there.
00:22:41.380 | It'll be an opportunity to test out some content,
00:22:45.460 | see what students will come and spend time in Austin to hear.
00:22:50.460 | So that's part A, that's the sort of,
00:22:53.620 | if you like the launch product.
00:22:56.100 | And then we go straight to a master's program.
00:23:00.260 | I don't think you can go to undergraduate education
00:23:03.340 | right away because the established brands
00:23:06.620 | in undergraduate education are offering something
00:23:09.220 | it's impossible to compete with initially
00:23:10.980 | because they have the brand, Harvard, Yale, Stanford,
00:23:14.820 | and they offer also this peer network,
00:23:17.980 | which is part of the reason people want so badly
00:23:21.660 | to go to those places, not really the professors,
00:23:24.060 | it's the classmates.
00:23:25.500 | So we don't want to compete there initially.
00:23:28.100 | Where there is, I think, room for new entrants
00:23:31.180 | is in a master's program.
00:23:34.940 | And the first one will be in entrepreneurship
00:23:37.820 | and leadership, because I think there's a huge hunger
00:23:42.340 | amongst people who want to get into,
00:23:44.100 | particularly the technology world,
00:23:45.980 | to learn about those things.
00:23:47.060 | And they know they're not really going to learn
00:23:48.820 | about them at business schools.
00:23:50.700 | The people who are not going to teach them leadership
00:23:53.100 | and entrepreneurship are professors.
00:23:55.500 | So we want to create something that will be a little like
00:23:59.420 | the very successful Schwarzman program in China,
00:24:02.620 | which was come and spend a year in China
00:24:05.100 | and find out about China.
00:24:07.500 | We'll be doing the same, essentially saying,
00:24:09.460 | come and spend a year and find out about technology.
00:24:12.220 | And there'll be a mix of academic content.
00:24:15.100 | We want people to understand some of the first principles
00:24:17.900 | of what they're studying.
00:24:19.420 | There are first principles of entrepreneurship
00:24:21.580 | and leadership, but we also want them to spend time
00:24:23.500 | with people like one of our co-founders, Joe Lonsdale,
00:24:26.340 | who's been a hugely successful venture capitalist
00:24:29.980 | and learn directly from people like him.
00:24:33.140 | So that's the kind of initial offering.
00:24:35.740 | I think there are other master's programs
00:24:37.660 | that we will look to roll out quite quickly.
00:24:39.820 | I have a particular passion for a master's in applied history
00:24:43.500 | or politics and applied history.
00:24:45.460 | I'm a historian driven crazy by the tendency
00:24:48.660 | of academic historians to drift away from
00:24:51.580 | what seemed to me the important questions
00:24:53.660 | and certainly to drift away from addressing
00:24:56.380 | policy relevant questions.
00:24:57.820 | So I would love to be involved in a master's
00:25:01.060 | in applied history.
00:25:02.740 | And we'll build some programs like that
00:25:05.980 | before we get to the full liberal arts experience
00:25:10.980 | that we envisage for an undergraduate program.
00:25:14.980 | And that undergraduate program is an exciting one
00:25:17.060 | 'cause I think we can be innovative there too.
00:25:19.660 | I would say two years would be spent
00:25:22.220 | doing some very classical and difficult classical things,
00:25:26.620 | bridging those old divides between arts and sciences.
00:25:30.940 | But then there would also be in the second half
00:25:34.900 | in the junior and senior years,
00:25:37.500 | something somewhat more of an apprenticeship
00:25:41.060 | where we'll have centers,
00:25:42.500 | including a center for technology,
00:25:45.500 | engineering, and mathematics,
00:25:47.660 | that will be designed to help people make that transition
00:25:51.740 | from the theoretical to the practical.
00:25:54.780 | So that's the vision.
00:25:56.980 | And I think like any early stage idea
00:26:01.980 | we'll doubtless tweak it as we go along.
00:26:04.220 | We'll find things that work and things that don't work.
00:26:07.340 | But I have a very clear sense in my own mind
00:26:10.340 | of how this should look five years from now.
00:26:13.980 | And I don't know about you.
00:26:14.820 | I mean, I'm unusual as an academic
00:26:16.780 | 'cause I quite like starting new institutions
00:26:18.780 | and I've done a bit of it in my career.
00:26:22.260 | You got to kind of know what it should look like
00:26:25.140 | after the first four or five years
00:26:27.140 | to get out of bed in the morning
00:26:28.660 | and put up with all the kind of hassles of doing it,
00:26:31.900 | not least the inevitable flak
00:26:34.180 | that we're bound to take from the educational establishment.
00:26:38.080 | - And I was graciously invited to be an advisor
00:26:41.920 | to this University of Austin.
00:26:45.300 | And the reason I would love to help
00:26:49.540 | in whatever way I can is several.
00:26:52.840 | So one, I would love to see Austin,
00:26:55.480 | the physical location flourish intellectually
00:26:58.980 | and especially in the space of science and engineering.
00:27:03.420 | That's really exciting to me.
00:27:05.420 | Another reason is I am still a research scientist at MIT.
00:27:09.860 | I still love MIT.
00:27:12.580 | And I see this effort that you're launching
00:27:16.780 | as a beacon that leads the way
00:27:21.220 | to the other elite institutions in the world.
00:27:24.420 | I think too many of my colleagues,
00:27:26.500 | and especially in robotics,
00:27:28.140 | don't see robotics as a humanities problem.
00:27:35.900 | But to me,
00:27:39.660 | robotics and AI will define much of our world
00:27:43.820 | in the next century.
00:27:45.260 | And not to consider all the deep psychological,
00:27:49.660 | sociological, human problems associated with that would be a mistake.
00:27:53.760 | To have real open conversations, to say stupid things,
00:27:58.860 | to challenge the ideas of how companies are being run,
00:28:03.860 | for example, that is the safe space.
00:28:08.140 | It's very difficult to talk about the difficult questions
00:28:11.540 | about technology when you're employed by Facebook
00:28:14.380 | or Google and so on.
00:28:16.140 | The university is the place to have those conversations.
00:28:19.420 | - That's right.
00:28:20.260 | And we're hugely excited that you want
00:28:22.020 | to be one of our advisors.
00:28:23.300 | We need a broad and eclectic group of people.
00:28:28.300 | And I'm excited by the way that group has developed.
00:28:33.460 | Some of my favorite intellectuals are there.
00:28:37.540 | Steve Pinker, for example.
00:28:40.300 | But we're also making sure that we have people
00:28:43.860 | with experience in academic leadership.
00:28:48.860 | And so it's a happy coalition of the willing
00:28:54.020 | looking to try to build something new,
00:28:56.640 | which, as you say, will be complementary
00:28:58.860 | to the existing and established institutions.
00:29:02.300 | I think of the academic world as a network.
00:29:06.260 | I've moved from some major hubs in the network to others,
00:29:11.260 | but I've always felt that we do our best work,
00:29:15.900 | not in a silo called Oxford,
00:29:18.380 | but in a silo that is really a hub connected to Stanford,
00:29:23.020 | connected to Harvard, connected to MIT.
00:29:26.540 | One of the reasons I moved to the United States
00:29:28.340 | was that I sensed that there was more intellectual action
00:29:32.340 | in my original field of expertise, financial history.
00:29:35.780 | And that was right.
00:29:37.300 | It was a good move.
00:29:39.020 | I think I'd have stagnated if I'd stayed at Oxford.
00:29:43.020 | But at the same time,
00:29:44.540 | I haven't lost connection with Oxford.
00:29:46.420 | I recently went and gave a lecture there
00:29:48.780 | in honor of Sir Roger Scruton,
00:29:50.500 | one of the great conservative philosophers.
00:29:52.820 | And the burden of my lecture was
00:29:56.220 | the idea of the Anglosphere,
00:29:58.060 | which appealed a lot to Roger,
00:29:59.980 | will go horribly wrong if illiberal ideas
00:30:04.500 | that inhibit academic freedom
00:30:05.900 | spread all over the Anglosphere.
00:30:07.780 | And this network gets infected with these,
00:30:11.460 | I think, deeply damaging notions.
00:30:14.980 | So yeah, I think we're creating a new node.
00:30:18.540 | I hope it's a node that makes the network
00:30:20.540 | overall more resilient.
00:30:23.340 | And right now there's an urgent need for it.
00:30:25.820 | I mean, there are people
00:30:27.300 | whose academic careers have been terminated.
00:30:30.300 | I'll name two who were involved.
00:30:32.660 | Peter Boghossian, who was harassed out of Portland State
00:30:37.660 | for the reason that he was one of those intrepid figures
00:30:42.700 | who carried out the grievance studies hoaxes,
00:30:47.060 | exposing the utter charlatanry going on
00:30:50.540 | in many supposedly academic journals
00:30:53.020 | by getting phony gender studies articles published.
00:30:56.940 | It was genius.
00:30:57.820 | And of course, it so put the noses of the
00:31:00.900 | academic establishment out of joint
00:31:02.300 | that he began to be subject to disciplinary actions.
00:31:05.260 | So Peter is going to be involved.
00:31:07.300 | And in a recent shocking British case,
00:31:10.300 | the philosopher Kathleen Stock
00:31:11.660 | has essentially been run off the campus
00:31:13.540 | of Sussex University in England
00:31:15.500 | for violating the increasingly complex rules
00:31:21.340 | about discussing transgender issues and women's rights.
00:31:26.180 | She will be one of our advisors.
00:31:28.260 | And I think also one of our founding fellows
00:31:30.740 | actually teaching for us in our first iteration.
00:31:35.020 | So I think we're creating a node that's badly needed.
00:31:38.820 | Those people, I mean, I remember saying this
00:31:40.820 | to the other founders
00:31:43.220 | when we first began to talk about this idea,
00:31:45.580 | to Bari Weiss and to Pano Kanelos
00:31:50.300 | as well as to Heather Heying.
00:31:52.660 | We need to do this urgently
00:31:55.020 | because there are people whose livelihoods
00:31:56.900 | are in fact being destroyed
00:31:58.820 | by these extraordinarily illiberal campaigns against them.
00:32:02.620 | And so there's no time to hang around
00:32:05.060 | and come up with the perfect design.
00:32:07.340 | This is an urgently needed lifeboat.
00:32:10.140 | And let's start with that.
00:32:11.580 | And then we can build something spectacular
00:32:13.620 | taking advantage of the fact that all of these people have,
00:32:16.700 | well, they now have very real skin in the game.
00:32:19.260 | They need to make this a success
00:32:21.780 | and I'm sure they will help us make it a success.
00:32:24.540 | - So you mentioned some interesting names
00:32:27.660 | like Heather Heying, Bari Weiss and so on.
00:32:30.420 | Steven Pinker, somebody I really admire.
00:32:32.180 | He too was under quite a lot of fire.
00:32:35.240 | Many reasons I admire him,
00:32:37.860 | one, because of his optimism about the future
00:32:40.580 | and two, how little of a damn he seems to give
00:32:44.540 | about walking through the fire.
00:32:48.020 | There's nobody more Zen about walking through the fire
00:32:50.340 | than Steven Pinker.
00:32:51.340 | But anyway, you mentioned a lot of interesting names.
00:32:54.220 | Jonathan Haidt is also interesting there.
00:32:56.980 | Who is involved with this venture at this early days?
00:33:00.940 | - Well, one of the things that I'm excited about
00:33:04.780 | is that we're getting people from inside and outside
00:33:08.220 | the academic world.
00:33:09.220 | So we've got Arthur Brooks, who for many years
00:33:12.860 | ran the American Enterprise Institute very successfully,
00:33:17.660 | has a Harvard role now teaching.
00:33:19.980 | And so he's somebody who brings, I think,
00:33:23.700 | a different perspective.
00:33:26.700 | There's obviously a need to get experienced
00:33:31.700 | academic leaders involved,
00:33:36.060 | which is why I was talking to Larry Summers
00:33:38.620 | about whether he would join our board of advisors.
00:33:43.520 | The Chicago principles owe a debt
00:33:46.900 | to the former president of Chicago
00:33:50.220 | and he's graciously agreed to be on the board of advisors.
00:33:54.140 | I could go on, it would become a long and tedious list,
00:33:56.260 | but my goal in trying to get this happy band to form
00:34:01.260 | has been to signal that it's a bipartisan endeavor.
00:34:06.020 | It is not a conservative institution
00:34:08.420 | that we're trying to build.
00:34:09.340 | It's an institution that's committed to academic freedom
00:34:12.260 | and the pursuit of truth that will mean it
00:34:15.740 | when it takes Robert Zimmer's Chicago principles
00:34:20.420 | and enshrines them in its founding charter.
00:34:22.900 | And we'll make those something other than honored
00:34:26.540 | in the breach, which they seem to be at some institutions.
00:34:29.660 | So the idea here is to grow this organically.
00:34:33.340 | We need, rather like the Academic Freedom Alliance
00:34:36.620 | that Robby George created earlier this year,
00:34:39.080 | we need breadth and we need to show
00:34:41.380 | that this is not some kind of institutionalization
00:34:45.500 | of the intellectual dark web,
00:34:47.620 | though we welcome founding members of that nebulous body.
00:34:52.420 | It's really something designed for all of academia
00:34:55.460 | to provide a kind of reboot
00:34:57.300 | that I think we all agree is needed.
00:35:00.260 | - Is there a George Washington type figure?
00:35:02.860 | Is there a president elected yet?
00:35:04.580 | Or who's going to lead this institution?
00:35:07.260 | - Pano Kanelos, the former president of St. John's,
00:35:10.180 | is the president of University of Austin.
00:35:12.420 | And so he is our George Washington.
00:35:14.380 | I don't know who Alexander Hamilton is.
00:35:16.980 | I'll leave you to guess.
00:35:18.580 | - It's funny you mentioned IDW, intellectual dark web.
00:35:21.820 | Have you talked to your friend Sam Harris
00:35:24.900 | about any of this?
00:35:26.980 | He is another person I really admire
00:35:30.740 | and I've talked to online and offline quite a bit
00:35:34.500 | for not belonging to any tribe.
00:35:39.180 | He stands boldly on his convictions
00:35:43.020 | when he knows they're not going to be popular.
00:35:45.620 | He basically gets canceled by every group.
00:35:50.620 | He doesn't shy away from controversy
00:35:54.260 | and not for the sake of controversy itself,
00:35:56.980 | he is one of the best examples to me
00:36:00.220 | of a person who thinks freely.
00:36:02.620 | I disagree with him on quite a few things,
00:36:06.020 | but I deeply admire that he is what it looks like
00:36:10.820 | to think freely by himself.
00:36:12.780 | It feels to me like he represents a lot of the ideals
00:36:15.300 | of this kind of effort.
00:36:16.540 | - Yes, he would be a natural fit.
00:36:18.500 | Sam, if you're listening, I hope you're in.
00:36:21.380 | I think in the course of his recent intellectual quests,
00:36:25.660 | he did collide with one of our founders, Heather Heying.
00:36:28.100 | So we'll have to model civil disagreements
00:36:31.020 | at the University of Austin.
00:36:32.620 | It's extremely important that we should all disagree
00:36:35.540 | about many things, but do it amicably.
00:36:38.780 | One of the things that has been lost sight of,
00:36:41.180 | perhaps it's all the fault of Twitter
00:36:42.860 | or maybe it's something more profound,
00:36:44.340 | is that it is possible to disagree in a civil way.
00:36:47.660 | And still be friends.
00:36:49.660 | I certainly had friends at Oxford
00:36:52.020 | who were far to the left of me politically,
00:36:54.460 | and they are still among my best friends.
00:36:56.500 | So the University of Austin has to be a place
00:36:58.620 | where we can disagree vehemently,
00:37:03.340 | but we can then go and have a beer afterwards.
00:37:06.540 | That's, in my mind, a really important part
00:37:09.740 | of university life, learning the difference
00:37:12.780 | between the political and the personal.
00:37:15.660 | So Sam is, I think, a good example, as are you,
00:37:19.060 | of a certain kind of intellectual hero
00:37:24.060 | who has been willing to go into the cyber sphere,
00:37:29.140 | the metaverse, and carve out an intellectual space,
00:37:35.580 | the podcast, and debate everything fearlessly.
00:37:42.940 | His essay, it was really an essay on Black Lives Matter
00:37:47.940 | and the question of police racism,
00:37:50.820 | was a masterpiece of 2020.
00:37:53.620 | And so he, I think, is a model of what we believe in.
00:38:00.700 | But we can't save the world with podcasts,
00:38:03.700 | good though yours is,
00:38:06.860 | because there's a kind of solo element
00:38:11.860 | to this form of public intellectual activity.
00:38:15.100 | It's also there in Substack,
00:38:16.740 | where all our best writers now seem to be,
00:38:19.860 | including our founder, Bari Weiss.
00:38:22.580 | The danger with this approach is, ultimately,
00:38:26.860 | your subscribers are the people who already agree with you,
00:38:30.380 | and we are all, therefore,
00:38:32.040 | in danger of preaching to the choir.
00:38:35.780 | I think what makes an institution
00:38:37.100 | like University of Austin so attractive
00:38:38.940 | is that we get everybody together
00:38:41.540 | at least part of the year,
00:38:44.460 | and we do that informal interaction at lunch, at dinner,
00:38:49.460 | that allows, in my experience, the best ideas to form.
00:38:56.620 | Intellectual activity isn't really a solo voyage.
00:39:00.460 | Historians often make it seem that way,
00:39:02.420 | but I've realized over time that I do my best work
00:39:06.020 | in a collaborative way,
00:39:08.140 | and scientists have been better at this
00:39:10.620 | than people in the humanities.
00:39:12.420 | But what really matters,
00:39:13.780 | what's magical about a good university
00:39:16.100 | is that interdisciplinary, serendipitous conversation
00:39:19.500 | that happens on campus.
00:39:21.460 | Tom Sargent, the great Nobel Prize-winning economist,
00:39:24.500 | and I used to have these kind of random conversations
00:39:27.820 | in elevators at NYU or in corridors at Stanford,
00:39:31.540 | and sometimes they'd be quite short conversations,
00:39:34.680 | but in that short, serendipitous exchange,
00:39:38.040 | I would have more intellectual stimulus
00:39:40.540 | than in many a seminar lasting an hour and a half.
00:39:44.420 | So I think we want to get the Sam Harrises
00:39:47.300 | and Lex Fridmans out of their darkened rooms
00:39:51.460 | and give them a chance to interact
00:39:54.020 | in a much less structured way than we've got used to.
00:39:59.020 | Again, it's that sense that sometimes
00:40:02.740 | you need some freewheeling, unstructured debate
00:40:05.620 | to get the really good ideas.
00:40:07.300 | I mean, to talk anecdotally for a moment,
00:40:08.940 | I look back on my Oxford undergraduate experience,
00:40:12.020 | and I wrote a lot of essays and attended a lot of classes,
00:40:14.700 | but intellectually, the most important thing I did
00:40:18.220 | was to write an essay on the Viennese satirist Karl Kraus
00:40:22.420 | for an undergraduate discussion group
00:40:25.940 | called the Canning Club.
00:40:27.460 | And I probably put more work into that paper
00:40:30.020 | than I put into anything else
00:40:31.500 | except maybe my final examinations,
00:40:33.820 | even though there was only really one senior member present,
00:40:37.100 | the historian Jeremy Catto,
00:40:38.820 | I was really just trying to impress my contemporaries.
00:40:41.900 | And that's the kind of thing we want.
00:40:45.760 | The great intellectuals,
00:40:47.900 | the great intellectual leaps forward occurred
00:40:51.120 | often in somewhat unstructured settings.
00:40:54.200 | I'm from Scotland,
00:40:55.100 | you can tell from my accent a little at least.
00:40:58.320 | The enlightenment happened in late 18th century Scotland
00:41:03.080 | in a very interesting interplay between the universities,
00:41:07.300 | which were very important, Glasgow, Edinburgh, St. Andrews,
00:41:10.900 | and the coffee houses and pubs of the Scottish cities
00:41:15.900 | where a lot of unstructured discussion
00:41:19.780 | often fueled by copious amounts of wine took place.
00:41:24.120 | That's what I've missed over the last few years.
00:41:27.320 | Let's just think about how hard
00:41:29.360 | academic social life has become.
00:41:32.520 | That we've reached the point
00:41:35.080 | that Amy Chua becomes the object
00:41:39.360 | of a full-blown investigation and media storm
00:41:43.640 | for inviting two Yale Law School students
00:41:47.240 | over to her house to talk.
00:41:50.000 | I mean, when I was at Oxford,
00:41:51.880 | it was regarded as a tremendous honor
00:41:54.200 | to be asked to go to one of our tutors' homes.
00:41:58.120 | The social life of Oxford and Cambridge
00:41:59.720 | is one of their great strengths.
00:42:01.160 | There's a sort of requirement
00:42:02.640 | to sip unpleasant sherry with the Dons.
00:42:06.600 | And we've kind of killed all that.
00:42:08.400 | We've killed all that in the US
00:42:09.600 | 'cause nobody dares have a social interaction
00:42:12.160 | with an undergraduate or exchange an informal email
00:42:15.340 | in case the whole thing ends up
00:42:16.720 | on the front page of the local or student newspaper.
00:42:19.560 | So that's what we need to kind of restore,
00:42:22.080 | the social life of academia.
00:42:25.600 | - So there's magic.
00:42:26.440 | We didn't really address this sort of explicitly,
00:42:29.380 | but there's magic to the interaction between students.
00:42:33.520 | There's magic in the interaction between faculty,
00:42:37.440 | the people that teach,
00:42:38.400 | and there's the magic in the interaction
00:42:39.920 | between the students and the faculty.
00:42:41.640 | And it's an iterative process
00:42:43.840 | that changes everybody involved.
00:42:46.080 | So it's like world experts in a particular discipline
00:42:48.960 | are changed as much as the students,
00:42:52.880 | as the 20-year-olds with the wild ideas,
00:42:56.920 | each are changed.
00:42:58.160 | And that's the magic of it.
00:42:59.160 | That applies in liberal education.
00:43:01.200 | That applies in the sciences too.
00:43:03.560 | That's probably, maybe you can speak to this,
00:43:05.600 | why so much scientific innovation
00:43:08.800 | has happened in the universities.
00:43:10.680 | There's something about the youthful energy
00:43:13.640 | of like young minds, graduate students,
00:43:16.520 | undergraduate students that inspire
00:43:18.540 | some of the world experts to do some of the best work
00:43:20.740 | of their lives.
00:43:21.580 | - Yeah.
00:43:22.420 | Well, the human brain we know is at its most dynamic
00:43:25.160 | when people are pretty young.
00:43:27.320 | You know this with your background in math.
00:43:29.640 | People don't get better at math after the age of 30.
00:43:33.000 | And this is important when you think about
00:43:36.840 | the intergenerational character of a university.
00:43:39.840 | The older people, the professors have the experience,
00:43:44.220 | but they're fading intellectually
00:43:46.840 | from much earlier than anybody really wants to admit.
00:43:50.260 | And so you get this intellectual shot in the arm
00:43:55.760 | from hanging out with people who are circa 20,
00:43:59.360 | don't know shit, but the brains are kind of like cooking.
00:44:03.440 | - Yeah.
00:44:04.280 | - I look back on the career I've had in teaching,
00:44:07.400 | which is over 25 years where Cambridge, Oxford, NYU, Harvard.
00:44:11.920 | And I have extremely strong relationships
00:44:15.480 | with students from those institutions
00:44:19.360 | because they would show up,
00:44:23.200 | whether it was at office hours or in tutorials
00:44:26.160 | and disagree with me.
00:44:28.160 | And for me, it's always been about encouraging
00:44:31.800 | some active intellectual rebellion,
00:44:34.120 | telling people, "I don't want your essay to echo my views.
00:44:38.720 | If you can find something wrong with what I wrote, great.
00:44:41.760 | Or if you can find something I missed that's new, fantastic."
00:44:45.360 | So there is definitely, as you said,
00:44:47.040 | a magic in that interaction across the generations.
00:44:49.840 | And it's extraordinarily difficult, I think,
00:44:53.280 | for an intellectual to make the same progress
00:44:57.240 | in a project in isolation
00:45:00.120 | compared with the progress that can be made
00:45:03.280 | in these very special communities.
00:45:05.840 | What does a university do amongst other things?
00:45:09.320 | It creates a somewhat artificial environment
00:45:13.120 | of abnormal job security,
00:45:15.280 | and that's the whole idea of giving people tenure,
00:45:18.560 | and then a relatively high turnover, new faces each year,
00:45:22.640 | and an institutionalization of thought experiments
00:45:27.080 | and actual experiments.
00:45:29.040 | And then you get everybody living
00:45:30.360 | in the same kind of vicinity
00:45:31.600 | so that it can spill over into 3 a.m. conversation.
00:45:34.720 | Well, that always seems to me
00:45:36.520 | to be a pretty potent combination.
00:45:39.200 | Let's ask ourselves a counterfactual question next.
00:45:41.720 | Let's imagine that the world wars happen,
00:45:47.560 | but there are no universities.
00:45:49.760 | I mean, how does the Manhattan Project happen
00:45:53.880 | with no academia, to take just one of many examples?
00:45:58.200 | In truth, how does Britain even stay in the war
00:46:01.200 | without Bletchley Park,
00:46:02.400 | without being able to crack the German cipher?
00:46:07.400 | The academics are unsung,
00:46:11.680 | partly sung heroes of these conflicts.
00:46:14.360 | The same is true in the Soviet Union.
00:46:16.960 | The Soviet Union was a terribly evil and repressive system,
00:46:20.600 | but it was good at science,
00:46:22.280 | and that kept it in the game,
00:46:24.840 | not only in World War II, it kept it in the Cold War.
00:46:28.840 | So it's clear that universities
00:46:31.640 | are incredibly powerful intellectual force multipliers,
00:46:36.520 | and our history without them would look very different.
00:46:41.120 | Sure, some innovations would have happened without them.
00:46:43.280 | That's clear.
00:46:44.120 | The Industrial Revolution didn't need universities.
00:46:46.000 | In fact, they played a very marginal role
00:46:48.320 | in the key technological breakthroughs
00:46:49.840 | of the Industrial Revolution in its first phase.
00:46:53.000 | But by the second Industrial Revolution
00:46:54.960 | in the late 19th century,
00:46:56.600 | German industry would not have leapt ahead
00:46:58.840 | of British industry if the universities
00:47:00.600 | had not been superior.
00:47:02.400 | And it was the fact
00:47:03.240 | that the Germans institutionalized scientific research
00:47:06.120 | in the way that they did,
00:47:07.840 | that really produced a powerful, powerful advantage.
00:47:12.160 | The problem was that,
00:47:14.000 | and this is a really interesting point,
00:47:15.760 | that Friedrich Meinecke makes
00:47:17.440 | in "Die deutsche Katastrophe,"
00:47:18.640 | "The German Catastrophe,"
00:47:20.520 | the German intellectuals became technocrats,
00:47:23.800 | homo faber, he says.
00:47:25.560 | They knew a great deal about their speciality,
00:47:28.720 | but they were alienated from, broadly speaking, humanism.
00:47:32.440 | And that is his explanation,
00:47:33.680 | or one of his explanations,
00:47:34.760 | for why this very scientifically advanced Germany
00:47:38.280 | goes down the path of hell led by Hitler.
00:47:42.080 | So when I come back and ask myself,
00:47:43.880 | what is it that we want to do with a new university?
00:47:47.840 | We wanna make sure that we don't fall into that German pit
00:47:52.800 | where very high levels of technical
00:47:54.960 | and scientific expertise are decoupled
00:47:57.840 | from the fundamental foundations of a free society.
00:48:02.840 | So liberal arts are there, I think,
00:48:05.840 | to stop the scientists making Faustian pacts.
00:48:09.760 | And that's why it's really important
00:48:11.440 | that people working on AI read Shakespeare.
00:48:15.520 | - I think you said that academics are unsung heroes
00:48:19.600 | of the 20th century.
00:48:21.720 | I think there's kind of an intellectual,
00:48:25.200 | a lazy intellectual desire to kind of destroy the academics,
00:48:30.200 | that the academics are the source
00:48:32.720 | of all problems in the world.
00:48:34.840 | And I personally believe that exactly as you said,
00:48:37.520 | we need to recognize that the university
00:48:40.400 | is probably where the ideas that will protect us
00:48:45.080 | from the catastrophes that are looming ahead of us,
00:48:50.080 | that's where those ideas are going to come from.
00:48:52.800 | - People who work on economics
00:48:55.080 | can argue back and forth about John Maynard Keynes.
00:48:58.280 | But I think it's pretty clear
00:49:00.720 | that he was the most important economist
00:49:03.200 | and certainly the most influential economist
00:49:05.520 | of the 20th century.
00:49:06.600 | And I think his ideas are looking better today
00:49:11.320 | in the wake of the financial crisis
00:49:12.960 | than they have at any time since the 1970s.
00:49:15.840 | But imagine John Maynard Keynes without Cambridge,
00:49:19.920 | you can't, because someone like that
00:49:22.800 | doesn't actually exist without the incredible hothouse
00:49:27.800 | that a place like Cambridge was in Keynes' life.
00:49:31.880 | He was a product
00:49:32.720 | of a kind of hereditary intellectual elite
00:49:36.400 | that had its vices,
00:49:37.920 | but you can't help but admire the sheer power of the mind.
00:49:41.840 | I've spent a lot of my career reading Keynes
00:49:44.200 | and I revere that intellect.
00:49:46.680 | It's so, so powerful,
00:49:49.480 | but you can't have people like that
00:49:51.600 | if you're not prepared to have King's College Cambridge.
00:49:55.720 | And it comes with redundancy.
00:49:57.040 | I think that's the point.
00:49:57.880 | There are lots and lots of things
00:49:59.480 | that are very annoying about academic life
00:50:02.320 | that you just have to deal with.
00:50:05.280 | They're made fun of in that recent Netflix series,
00:50:08.040 | "The Chair,"
00:50:09.160 | and it is easy to make fun of academic life.
00:50:11.360 | Tom Sharpe's "Porterhouse Blue" did it.
00:50:14.760 | It's an inherently comical subject.
00:50:18.880 | Professors at least used to be amusingly eccentric,
00:50:22.320 | but we've sort of killed off that side of academia
00:50:27.080 | by turning it into an increasingly doctrinaire place
00:50:33.320 | where eccentricity is not tolerated.
00:50:35.880 | I'll give you an illustration of this.
00:50:37.200 | I had a call this morning from a British academic
00:50:40.400 | who said, "Can you give me some advice?"
00:50:43.640 | Because they're trying to decolonize the curriculum.
00:50:47.760 | This is coming from the diversity,
00:50:51.400 | equity, and inclusion officers.
00:50:53.680 | And it seems to me that what they're requiring of us
00:50:57.000 | is a fundamental violation of academic freedom
00:50:59.560 | because it is determining ex ante
00:51:02.760 | what we should study and teach.
00:51:04.600 | That's what's going on.
00:51:07.080 | And that's the thing that we really, really have to resist
00:51:11.520 | because that kills the university.
00:51:13.000 | That's the moment that it stops being the magical place
00:51:17.320 | of intellectual creativity
00:51:20.240 | and simply becomes an adjunct
00:51:22.040 | of the ministry of propaganda.
00:51:24.000 | - I've loved the time we've spent talking about this
00:51:27.240 | because it's such a hopeful message
00:51:28.920 | for the future of the university
00:51:30.240 | that I still share with you
00:51:33.400 | the love of the ideal of the university.
00:51:37.840 | So very practical question.
00:51:39.400 | You mentioned summer.
00:51:41.320 | Which summer are we talking about?
00:51:44.600 | So when, I know we don't wanna put hard dates here,
00:51:48.360 | but what year are we thinking about?
00:51:51.120 | When is this thing launching?
00:51:52.560 | What are your thoughts on this?
00:51:53.760 | - We are moving as fast as our resources allow.
00:51:57.560 | The goal is to offer the first of the forbidden courses
00:52:01.760 | next summer, summer of 2022.
00:52:03.800 | And we hope to be able to launch an initial,
00:52:06.680 | albeit relatively small scale master's program
00:52:12.240 | in the fall of next year.
00:52:14.440 | That's as fast as is humanly possible.
00:52:17.160 | So yeah, we're really keen to get going.
00:52:20.960 | And I think the approach we're taking
00:52:22.880 | is somewhat imported from Silicon Valley.
00:52:27.040 | Think of this as a startup.
00:52:29.040 | Don't think of this as something that has to exist
00:52:31.040 | as a full service university on day one.
00:52:33.760 | We don't have the resources for that.
00:52:35.120 | You'd need billions and billions of dollars
00:52:36.680 | to build a university sort of as a facsimile
00:52:39.960 | of an existing university,
00:52:41.040 | but that's not what we want to do.
00:52:42.200 | I mean, copying and pasting Harvard or Yale or Stanford
00:52:45.560 | would be a futile thing to do.
00:52:46.760 | They would probably, you very quickly end up
00:52:49.120 | with the same pathologies.
00:52:50.360 | So we do have to come up with a different design.
00:52:52.160 | And one way of doing that is to grow it organically
00:52:54.600 | from something quite small.
00:52:56.800 | Elon Musk mentioned in his usual humorous way on Twitter
00:53:01.800 | that he wants to launch
00:53:03.560 | the Texas Institute of Technology and Science, TITS.
00:53:07.560 | Some people thought this was sexist
00:53:11.320 | because of the acronym TITS.
00:53:13.880 | So first of all, I understand their viewpoint,
00:53:16.520 | but I also think there needs to be a place for humor
00:53:19.160 | on the internet, even from CEOs.
00:53:21.480 | So on this podcast, I've gotten a chance
00:53:23.760 | to talk to quite a few CEOs.
00:53:26.400 | And what I love to see is authenticity.
00:53:29.520 | And humor is often a sign of authenticity.
00:53:33.160 | The quirkiness that you mentioned,
00:53:36.840 | such a beautiful characteristic of professors
00:53:40.600 | and faculty at great universities,
00:53:42.520 | is also beautiful to see in CEOs, especially founding CEOs.
00:53:46.280 | So anyway, the deeper point he was making
00:53:51.280 | is showing an excitement for the university
00:53:54.360 | as a place for big ideas
00:53:57.080 | in science, technology, engineering.
00:53:59.600 | So to me, if there's some kind of way,
00:54:02.960 | if there is a serious thought that he had behind this tweet,
00:54:07.520 | not to analyze Elon Musk's Twitter like it's Shakespeare,
00:54:10.920 | but if there's a serious thought,
00:54:12.820 | I would love to see him supporting the flourishing of Austin
00:54:18.320 | as a place for science, technology,
00:54:20.360 | for these kinds of intellectual developments
00:54:22.280 | that we're talking about,
00:54:25.920 | like make a place for free inquiry, civil disagreements,
00:54:30.920 | coupled with great education and conversations
00:54:35.880 | about artificial intelligence,
00:54:37.120 | about technology, about engineering.
00:54:39.280 | So I'm actually gonna,
00:54:41.800 | I hope there's a serious idea behind that tweet
00:54:43.840 | and I'm gonna chat with him about it.
00:54:46.320 | - I do too, I do too.
00:54:48.760 | Most of the biggest storms in teacups
00:54:53.760 | of my academic career have been caused by bad jokes
00:54:59.180 | that I've made.
00:55:00.020 | These days, if you wanna make bad jokes,
00:55:03.960 | being a billionaire is a great idea.
00:55:06.900 | I'm not here to defend Elon's Twitter style
00:55:12.400 | or sense of humor.
00:55:14.280 | He's not gonna be remembered for his tweets, I think.
00:55:18.320 | He's gonna be remembered for the astonishing companies
00:55:21.440 | that he's built and his contributions
00:55:24.520 | in a whole range of fields from SpaceX to Tesla
00:55:29.400 | and solar energy.
00:55:32.020 | And I very much hope that we can interest Elon
00:55:35.760 | in this project.
00:55:36.580 | We need not only Elon, but a whole range of his peers
00:55:40.840 | because this takes resources.
00:55:45.440 | Universities are not cheap things to run,
00:55:47.680 | especially if, as I hope, we can get as much
00:55:51.400 | of the tuition as possible covered by scholarships and bursaries.
00:55:56.400 | We want to attract the best intellectual talent
00:56:02.600 | to this institution.
00:56:03.940 | The best intellectual talent is somewhat randomly
00:56:07.260 | distributed through society.
00:56:08.720 | And some of it is in the bottom quintile
00:56:10.340 | of the income distribution.
00:56:12.040 | And that makes it hard to get access to elite education.
00:56:14.160 | So this will take resources.
00:56:17.480 | The last generation of super wealthy plutocrats,
00:56:22.480 | the generation of the Gilded Age of the late 19th century,
00:56:26.240 | did a pretty good job of founding universities.
00:56:28.960 | Now, Chicago wouldn't exist but for the money of that era.
00:56:33.400 | And so my message not only to Elon,
00:56:35.760 | but to all of the peers, all of those people
00:56:38.820 | who made their billions out of technology
00:56:41.520 | over the last couple of decades is: this is your time.
00:56:44.720 | I mean, this is your opportunity
00:56:46.120 | to create something new.
00:56:47.480 | I can't really understand why the wealthy of our time
00:56:51.480 | are content to hand their money.
00:56:54.080 | I mean, think of the vast sums Mike Bloomberg
00:56:56.920 | recently gave to Johns Hopkins,
00:56:58.880 | to these established institutions.
00:57:01.160 | When on close inspection, those institutions
00:57:06.040 | don't seem to spend the money terribly well.
00:57:10.560 | And in fact, one of the mysteries of our time
00:57:13.240 | is the lack of due diligence that hard-nosed billionaires
00:57:16.800 | seem to do when it comes to philanthropy.
00:57:19.600 | So I think there's an opportunity here
00:57:22.080 | for this generation of very talented, wealthy people
00:57:25.520 | to do what their counterparts did in the late 19th
00:57:29.360 | and early 20th century and create some new institutions.
00:57:32.960 | And they don't need to put their names on the buildings.
00:57:35.200 | They just need to do what the founders
00:57:37.760 | of University of Chicago did,
00:57:40.800 | create something new that will endure.
00:57:43.640 | - Yeah, MIT is launching a college of computing
00:57:49.480 | and Stephen Schwarzman has given quite a large sum of money,
00:57:54.480 | I think in total a billion dollars.
00:57:57.160 | And as somebody who loves computing
00:57:59.640 | and somebody who loves MIT, I want some accountability
00:58:02.520 | for MIT becoming a better institution.
00:58:07.360 | And this is once again, why I'm excited
00:58:09.760 | about University of Austin, 'cause it serves as a beacon.
00:58:12.560 | Look, you can create something new
00:58:14.720 | and this is what the great institutions
00:58:16.760 | of the future should look like.
00:58:18.600 | - And Steve Schwarzman is also an innovator.
00:58:22.720 | The idea of creating a college on the Tsinghua campus
00:58:27.160 | and creating a kind of Rhodes program for students
00:58:29.680 | from the Western world to come study in China
00:58:31.520 | was Steve's idea.
00:58:33.680 | And I was somewhat involved,
00:58:35.760 | did some visiting professing there.
00:58:38.860 | It taught me that you can create something new
00:58:41.760 | in that area of graduate education
00:58:45.360 | and quite quickly attract really strong applicants.
00:58:50.080 | Because the people who finished their four years
00:58:52.640 | at Harvard or Stanford know that they don't know a lot.
00:58:57.640 | And I, having taught a lot of people in that group,
00:59:02.640 | know how intellectually dissatisfied they often are
00:59:06.520 | at the end of four years.
00:59:08.280 | I mean, they may have beautifully gamed the system
00:59:10.540 | to graduate summa or magna cum laude,
00:59:12.780 | but they kind of know,
00:59:14.580 | they'll confess it after a drink or two.
00:59:17.060 | They know that they gamed the system
00:59:18.600 | and that intellectually it wasn't
00:59:20.700 | the fulfilling experience they wanted.
00:59:22.620 | And they also know that an MBA
00:59:24.740 | from a comparable institution
00:59:26.580 | would not be a massive intellectual step forward.
00:59:29.620 | So I think what we want to say
00:59:32.060 | is here's something really novel, exciting,
00:59:35.500 | that will be intellectually very challenging.
00:59:38.140 | I do think the University of Austin has to be difficult.
00:59:41.220 | I'd like it to feel a little bit like
00:59:44.280 | surviving Navy SEAL training to come through this program
00:59:47.240 | because it will be intellectually demanding.
00:59:49.640 | And that I think should be a magnet.
00:59:51.560 | So yeah, Steve, if you're listening,
00:59:54.520 | please join Elon in supporting this.
00:59:57.100 | And Peter Thiel, if you're listening,
01:00:00.000 | I know how skeptical you are
01:00:01.520 | about the idea of creating a new university
01:00:03.560 | 'cause heaven knows,
01:00:04.940 | Peter and I have been discussing this idea for years
01:00:06.840 | and he's always said, "Well, no, we thought about this
01:00:08.680 | "and it just isn't gonna work."
01:00:09.680 | But I really think we've got a responsibility to do this.
01:00:14.680 | - Well, Steve's been on this podcast before.
01:00:17.340 | We've spoken a few times, so I'll send this to him.
01:00:20.160 | I hope he does actually get behind it as well.
01:00:22.520 | So I'm super excited by the ideas
01:00:26.520 | that we've been talking about that this effort represents
01:00:29.920 | and what ripple effect it has on the rest of society.
01:00:33.140 | So thank you.
01:00:33.980 | That was time beautifully spent.
01:00:36.640 | And I'm really grateful for the fortune
01:00:41.640 | of getting a chance to talk to you
01:00:43.920 | at this moment in history
01:00:45.640 | because I've been a big fan of your work.
01:00:48.120 | And the reason I wanted to talk to you today
01:00:50.800 | is about all the excellent books you've written
01:00:54.280 | about various aspects of history through money,
01:00:57.400 | war, power, pandemics, all of that.
01:01:00.920 | But I'm glad that we got a chance to talk about this,
01:01:04.720 | which is not looking at history, it's looking at the future.
01:01:08.320 | This is a beautiful little fortuitous moment.
01:01:12.760 | I appreciate you talking about it.
01:01:15.600 | In the book "The Ascent of Money,"
01:01:18.160 | you give a history of the world through the lens of money.
01:01:21.920 | If the financial system is evolutionary in nature,
01:01:25.040 | much like life on earth,
01:01:26.560 | what is the origin of money on earth?
01:01:30.800 | - The origin of money predates coins.
01:01:34.620 | Most people kind of assume I'll talk about coins,
01:01:37.800 | but coins are relatively late developments.
01:01:41.180 | Back in ancient Mesopotamia,
01:01:44.000 | so I don't know, 5,000 years ago,
01:01:45.900 | there were relations between creditors and debtors.
01:01:54.080 | There are, even in the simplest economy,
01:01:54.080 | because of the way in which agriculture works.
01:01:57.960 | Hey, I need to plant these seeds,
01:02:00.920 | but I'm not gonna have crops for X months.
01:02:03.600 | So we have clay tablets
01:02:05.520 | in which simple debt transactions are inscribed.
01:02:09.560 | I remember looking at great numbers of these
01:02:11.560 | in the British Museum when I was writing "The Ascent of Money,"
01:02:14.920 | and that's really the beginning of money.
01:02:18.120 | The minute you start recording a relationship
01:02:20.920 | between a creditor and a debtor,
01:02:22.160 | you have something that is quasi money.
01:02:24.720 | And that is probably what these clay tablets mostly denoted.
01:02:29.720 | From that point on,
01:02:33.560 | there's a great evolutionary experiment
01:02:35.920 | to see what the most convenient way is
01:02:39.760 | to record relations between creditors and debtors.
01:02:44.760 | And what emerges in the time of the ancient Greeks
01:02:53.320 | are coins, metal, tokens,
01:02:55.920 | sometimes a valuable metal, sometimes not,
01:03:00.480 | usually bearing the imprint of a state or a monarch.
01:03:04.480 | And that's the sort of more familiar form of money
01:03:08.260 | that we still use today for very, very small transactions.
01:03:12.580 | I expect coins will all be gone
01:03:14.800 | by the time my youngest son is my age,
01:03:17.800 | but they're a last remnant of a very, very old way
01:03:21.240 | of doing simple transactions.
01:03:24.440 | - By the way, when you say coins, you mean physical coins.
01:03:28.000 | - I'm talking about- - Because the term coins
01:03:29.280 | has been rebranded into digital space as well.
01:03:31.720 | - Yeah, not Coinbase coins, actual coin coins,
01:03:34.280 | you know, the ones that jangle in your pocket
01:03:36.560 | and you kind of don't know quite what to do with
01:03:38.680 | once you have some.
01:03:39.960 | So that became an incredibly pervasive form
01:03:44.960 | of paying for things.
01:03:47.400 | Money's just a, it's just a crystallization
01:03:50.040 | of a relationship between a debtor and a creditor.
01:03:52.040 | And coins are just very fungible.
01:03:54.140 | You know, whereas a clay tablet relates
01:03:56.800 | to a specific transaction, coins are generic and fungible.
01:04:00.360 | They can be used in any transaction.
01:04:02.280 | So that was an important evolutionary advance.
01:04:05.080 | If you think of financial history,
01:04:06.440 | and this was the point of "The Ascent of Money"
01:04:08.320 | as an evolutionary story, there are punctuated equilibria.
01:04:12.560 | People get by with coins for a long time,
01:04:15.480 | despite their defects as a means of payment,
01:04:19.120 | such as that they can be debased, they can be clipped.
01:04:23.160 | It's very hard to avoid a fake
01:04:25.800 | or debased money entering the system.
01:04:28.760 | But coinage is still kind of the basis of payments
01:04:31.800 | all the way through the Roman Empire,
01:04:34.200 | out the other end into the so-called dark ages.
01:04:36.960 | It's still how most things are settled
01:04:39.400 | in cash transactions in the early 1300s.
01:04:43.880 | You don't get a big shift until after the Black Death,
01:04:47.800 | when there is such a need to monetize the economy
01:04:50.920 | because of chronic labor shortages
01:04:52.440 | and feudalism begins to unravel,
01:04:54.720 | that you just don't have a sufficient amount of coinage.
01:04:58.880 | And so you get bills of exchange.
01:05:00.240 | And I'm really into bills of exchange because,
01:05:04.640 | and this I hope will capture your listeners
01:05:07.960 | and viewers' imaginations,
01:05:09.360 | when they start using bills of exchange,
01:05:13.260 | which are really just pieces of paper saying,
01:05:16.400 | you know, I owe you over a three-month period
01:05:19.360 | while goods are in transit from Florence to London,
01:05:23.920 | you get the first peer-to-peer payment system,
01:05:28.000 | which is network verified.
01:05:29.640 | 'Cause they're not coins,
01:05:31.320 | they don't have a king's head on them.
01:05:33.480 | They're just pieces of paper.
01:05:35.320 | And the verification comes in the form of signatures.
01:05:38.320 | And you need ultimately some kind of guarantee
01:05:42.680 | if I write an IOU to you, bills of exchange,
01:05:46.560 | I mean, you don't really know me that well,
01:05:48.320 | we only just met.
01:05:49.440 | So you might wanna get endorsed by, I don't know,
01:05:52.080 | somebody really credit-worthy like Elon.
01:05:54.920 | And so we actually can see in the late 14th century
01:05:58.200 | in Northern Italy and in England and elsewhere,
01:06:00.960 | the evolution of a peer-to-peer network system of payment.
01:06:05.960 | And that's actually how world trade grows.
01:06:09.280 | 'Cause you just couldn't settle
01:06:11.200 | long oceanic transactions with coinage.
01:06:14.080 | It just wasn't practical.
01:06:15.760 | All those treasure chests full of doubloons,
01:06:18.320 | which were part of the way
01:06:19.800 | in which the Spanish empire worked, really inefficient.
01:06:22.680 | So bills of exchange are an exciting part of the story.
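[Editor's aside: the endorsement mechanism Ferguson describes — a paper IOU gaining acceptability as credit-worthy parties sign it — can be sketched as a toy model. This is illustrative only, not a historical reconstruction; all names and fields below are invented.]

```python
from dataclasses import dataclass, field

@dataclass
class BillOfExchange:
    """Toy model of a medieval bill of exchange: a paper IOU whose
    acceptability grows with each endorsement (signature) it collects."""
    drawer: str          # who writes the IOU
    payee: str           # who is to be paid
    amount: float        # face value
    due_months: int      # e.g. goods in transit from Florence to London
    endorsements: list = field(default_factory=list)

    def endorse(self, guarantor: str) -> None:
        # Each signature adds a party who vouches for (and is liable for) payment.
        self.endorsements.append(guarantor)

    def is_acceptable(self, trusted: set) -> bool:
        # A merchant accepts the bill if the drawer or any endorser
        # is someone they already trust -- verification by network, not by state.
        return self.drawer in trusted or any(g in trusted for g in self.endorsements)

# An unknown merchant's IOU becomes acceptable once a well-known house signs it:
bill = BillOfExchange(drawer="Florentine merchant", payee="London wool trader",
                      amount=100.0, due_months=3)
bill.endorse("Medici bank")
print(bill.is_acceptable(trusted={"Medici bank"}))  # True
```

The point of the sketch is that trust travels along the chain of signatures rather than residing in a king's head stamped on metal.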
01:06:26.240 | And they illustrate something I should have made clearer
01:06:29.520 | in "The Ascent of Money."
01:06:31.320 | That not everything used in payment needs to be money.
01:06:34.580 | Classically, economists will tell you,
01:06:37.960 | oh, well, money, money has three different functions.
01:06:41.360 | It's, you've heard this a zillion times, right?
01:06:43.360 | It's a unit of account, it's a store of value,
01:06:46.320 | and it's a medium of exchange.
01:06:47.880 | Now there are three or four things
01:06:51.600 | that are worth saying about this, and I'll just say two.
01:06:53.880 | One, it may be that those three things are a trilemma,
01:06:57.160 | and it's very difficult for anything to be all of them.
01:06:59.520 | This point was made by my Hoover colleague,
01:07:01.960 | Manny Rincon Cruz last year,
01:07:03.480 | and I still wish he would write this up as a paper
01:07:05.720 | because it's a great insight.
01:07:07.720 | The second thing that's really interesting to me
01:07:09.800 | is that payments don't need to be money.
01:07:12.920 | And if we go around as economists love to do saying,
01:07:16.480 | well, Bitcoin's not money
01:07:17.920 | because it doesn't fulfill these criteria,
01:07:20.720 | we're missing the point
01:07:21.760 | that you could build a system of payments,
01:07:24.560 | which I think is how we should think about crypto,
01:07:26.880 | that isn't money, doesn't need to be money.
01:07:29.160 | It's like bills of exchange.
01:07:30.620 | It's network-based verification, peer-to-peer transactions
01:07:34.840 | without third-party verification.
01:07:38.060 | When it hit me the other day
01:07:39.300 | that we actually have this precedent for crypto,
01:07:41.560 | I got quite excited and thought,
01:07:43.620 | I wish I had written that in "The Ascent of Money."
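[Editor's aside: the parallel Ferguson draws — network-based verification of peer-to-peer payments rather than a trusted third party — is the core idea behind crypto ledgers. Below is a minimal hash-chain sketch; it is not Bitcoin's actual protocol, which adds digital signatures, proof-of-work, and consensus on top of this idea.]

```python
import hashlib

def entry_hash(prev_hash: str, record: str) -> str:
    # Each ledger entry commits to the previous one, so any peer can
    # recompute the chain and detect tampering without a central verifier.
    return hashlib.sha256((prev_hash + record).encode()).hexdigest()

def build_ledger(records):
    """Chain a list of payment records into (record, hash) pairs."""
    ledger, prev = [], "genesis"
    for r in records:
        prev = entry_hash(prev, r)
        ledger.append((r, prev))
    return ledger

def verify(ledger) -> bool:
    """Any participant can independently check the whole history."""
    prev = "genesis"
    for record, h in ledger:
        if entry_hash(prev, record) != h:
            return False
        prev = h
    return True

ledger = build_ledger(["Alice pays Bob 5", "Bob pays Carol 2"])
print(verify(ledger))                              # True
ledger[0] = ("Alice pays Bob 500", ledger[0][1])   # tamper with one record
print(verify(ledger))                              # False
```

Altering any record invalidates every hash from that point on, which is what lets the network itself play the verifying role that a bank or clearing house played in the legacy system.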
01:07:46.600 | - Can you sort of from a first principles,
01:07:48.800 | like almost like a physics perspective
01:07:50.720 | or maybe a human perspective,
01:07:54.220 | describe where does the value of money come from?
01:07:58.800 | Like, where is it actually, where is it?
01:08:01.560 | So it's a sheet of paper or it's coins,
01:08:04.760 | but it feels like in a platonic sense,
01:08:08.120 | there's some kind of thing that's actually storing the value
01:08:11.200 | as us a bunch of ants are dancing around and so on.
01:08:13.800 | - I come from a family of physicists.
01:08:17.400 | I'm the black sheep of the family.
01:08:18.600 | My mother's a physicist, my sister is.
01:08:21.160 | And so when you asked me to explain something
01:08:23.700 | in physics terms, I get a kind of,
01:08:26.400 | little part of me dies 'cause I know I'll fail.
01:08:29.020 | But in truth, it doesn't really matter
01:08:33.040 | what we decide money is going to be.
01:08:36.440 | And anything can record, crystallize the relationship
01:08:41.440 | between the creditor and the debtor.
01:08:44.680 | It can be a piece of paper, it can be a piece of metal,
01:08:48.400 | it can be nothing, can just be a digital entry.
01:08:51.800 | It's trust that we're really talking about here.
01:08:56.480 | We are not just trusting one another, we may not,
01:09:00.440 | but we are trusting the money.
01:09:02.920 | So whatever we use to represent
01:09:07.520 | the creditor-debtor relationship,
01:09:10.240 | whether it's a banknote or a coin or whatever,
01:09:13.200 | it does depend on us both trusting it.
01:09:16.580 | And that doesn't always pertain.
01:09:21.640 | What we see in episodes of inflation,
01:09:25.120 | especially episodes of hyperinflation,
01:09:26.900 | is a crisis of trust, a crisis of confidence
01:09:29.980 | in the means of payment.
01:09:33.040 | And this is very traumatic for the societies
01:09:34.960 | to which it happens.
01:09:36.900 | By and large, human beings,
01:09:40.040 | particularly once you have a rule of law system of the sort
01:09:43.600 | that evolved in the West and then became generalized,
01:09:46.560 | are predisposed to trust one another,
01:09:49.680 | and the default setting is to trust money.
01:09:52.600 | Even when it depreciates at a quite steady rate,
01:09:55.000 | as the US dollar has done pretty much uninterruptedly
01:10:00.000 | since the 1960s,
01:10:02.760 | it takes quite a big disruption
01:10:05.380 | for money to lose that trust.
01:10:07.540 | But I think essentially what money should be thought of as
01:10:11.760 | is a series of tokens that can take any form we like
01:10:15.920 | and can be purely digital,
01:10:18.220 | which represent our transactions as creditors and debtors.
01:10:23.300 | And the whole thing depends on our collective trust to work.
01:10:26.720 | I had to explain this to Stephen Colbert once
01:10:29.980 | on the Colbert Show, the old show that was actually funny.
01:10:33.640 | And it was a great moment when he said,
01:10:37.900 | "So Neil, could I be money?"
01:10:41.980 | And I said, "Yes, we could settle a debt
01:10:46.980 | "with a human being."
01:10:48.260 | That was quite common in much of history,
01:10:50.720 | but it's not the most convenient
01:10:53.360 | form of money.
01:10:54.300 | Money has to be convenient.
01:10:57.080 | That's why when they worked out how to make payments
01:11:00.000 | with cell phones, the Chinese simply went straight there
01:11:03.920 | from bank accounts, they skipped out credit cards.
01:11:06.280 | You won't see credit cards in China,
01:11:08.200 | except in the hands of naive tourists.
01:11:10.880 | - How much can this trust bear
01:11:14.360 | in terms of us humans with our human nature testing it?
01:11:18.160 | It seemed, I guess the surprising thing is the thing works.
01:11:22.080 | A bunch of self-interested ants running around,
01:11:25.640 | trading in trust, and it seems to work,
01:11:28.800 | except for a bunch of moments in human history
01:11:32.080 | when there's hyperinflation, like you mentioned.
01:11:34.480 | And it's just kind of amazing.
01:11:38.600 | It's kind of amazing that us humans,
01:11:40.560 | if I were to be optimistic and sort of hopeful
01:11:42.600 | about human nature, it gives me a sense
01:11:45.100 | that people want to lean on each other.
01:11:49.320 | They want to trust.
01:11:51.320 | That's certainly, I would say probably now
01:11:54.960 | a widely shared view amongst evolutionary psychologists,
01:11:59.800 | network scientists.
01:12:01.080 | It's one of Nicholas Christakis's arguments in a recent book.
01:12:05.880 | I know that economic history broadly bears this out,
01:12:09.040 | but you have to be cautious.
01:12:12.260 | The cases where the system works are familiar to us
01:12:18.600 | because those are the states and the eras
01:12:23.600 | that produce a lot of written records.
01:12:25.680 | But when the system of trust collapses
01:12:30.320 | and the monetary system collapses with it,
01:12:32.540 | there's generally quite a paucity of records.
01:12:35.080 | I found that when I was writing "Doom."
01:12:38.280 | And so we are slightly biased in favor of the periods
01:12:42.180 | when trust prevailed and the system functioned.
01:12:47.120 | It's very easy to point to a great many episodes
01:12:50.920 | of very, very intense monetary chaos,
01:12:53.560 | even in the relatively recent past.
01:12:56.040 | In the wake of the First World War,
01:12:58.840 | multiple currencies, not just the German currency,
01:13:01.360 | multiple currencies were completely destroyed.
01:13:03.640 | The Russian currency, the Polish currency,
01:13:05.960 | there were currency disasters all over
01:13:08.800 | Central and Eastern Europe in the early 1920s.
01:13:12.560 | And that was partly because over the course
01:13:16.560 | of the 19th century, a system had evolved
01:13:18.840 | in which trust was based on gold
01:13:22.760 | and rules that were supposedly applied by central banks.
01:13:27.140 | That system, which produced relative price stability
01:13:32.280 | over the 19th century, fell apart
01:13:35.280 | as a result of the First World War.
01:13:36.800 | And as soon as it was gone,
01:13:38.720 | as soon as there was no longer a clear link
01:13:41.520 | between those banknotes and coins and gold,
01:13:45.000 | the whole thing went completely haywire.
01:13:47.320 | And I think we should remember
01:13:48.680 | the extent of the monetary chaos
01:13:51.280 | from certainly 1918 all the way through to the late 1940s.
01:13:56.280 | I mean, the German currency was destroyed
01:13:57.960 | not once but twice in that period.
01:14:00.120 | And that was one of the most advanced economies in the world.
01:14:03.120 | In the United States,
01:14:06.080 | there were periods of intensely deep deflation.
01:14:10.640 | Prices fell by a third in the Great Depression
01:14:13.960 | and then very serious price volatility
01:14:16.360 | in the immediate post-World War II period.
01:14:18.280 | So it's a bit of an illusion.
01:14:21.040 | Maybe it's an illusion for people
01:14:23.960 | who've spent most of their lives in the last 20 years.
01:14:27.600 | We've had a period of exceptional price stability
01:14:30.400 | since the century began,
01:14:33.640 | in which a regime of central bank independence
01:14:38.160 | and inflation targeting appeared to generate
01:14:42.080 | steady below 2% inflation in much of the developed world.
01:14:45.600 | It was a bit too low for the central bankers' liking.
01:14:48.400 | And that became a problem in the financial crisis.
01:14:51.040 | But we've avoided major price instability
01:14:54.960 | for the better part of 20 years.
01:14:56.800 | In most of the world,
01:14:57.640 | there haven't really been that many
01:14:59.600 | very high inflation episodes
01:15:01.040 | and hardly any hyperinflationary episodes.
01:15:02.800 | Venezuela is one of the very few, Zimbabwe's another.
01:15:05.880 | But if you take a hundred year view or a 200 year view,
01:15:08.440 | or if you want to take a 500 year view,
01:15:10.560 | you realize that quite often the system doesn't work.
01:15:14.640 | If you go back to the 17th century,
01:15:16.640 | there were multiple competing systems of coinage.
01:15:19.600 | There had been a great inflation
01:15:20.920 | that had begun the previous century:
01:15:23.960 | the price revolution, caused mainly
01:15:25.520 | by the arrival of New World silver.
01:15:28.960 | I think financial history is a bit messier
01:15:31.800 | than one might think.
01:15:33.760 | And the more one studies it,
01:15:36.520 | the more one realizes the need for the evolution.
01:15:39.360 | The reason bills of exchange came along
01:15:41.960 | was because the coinage systems had stopped working.
01:15:45.000 | The reason that banknotes started to become used
01:15:47.720 | more generally first in the American colonies
01:15:50.080 | in the 17th century,
01:15:50.960 | then more widely in the 18th century
01:15:52.320 | was just that they were more convenient
01:15:54.440 | than any other way of paying for things.
01:15:57.320 | We had to invent the bond market in the 18th century
01:16:00.480 | to cope with the problem of public debt,
01:16:02.200 | which up until that point
01:16:03.520 | had been a recurrent source of instability.
01:16:06.880 | And then we invented equity finance
01:16:10.000 | because bonds were not enough.
01:16:12.840 | So I would prefer to think of the financial history
01:16:16.400 | as a series of crises really
01:16:18.040 | that are resolved by innovations.
01:16:20.760 | And in the most recent episode,
01:16:22.680 | very exciting episode of financial history,
01:16:25.080 | something called Bitcoin initiated a new financial
01:16:29.280 | or monetary revolution in response, I think,
01:16:31.880 | to the growing crisis of the fiat money system.
01:16:36.080 | - Can you speak to that?
01:16:37.640 | So what do you think about Bitcoin?
01:16:41.680 | What do you think it is a response to?
01:16:43.480 | What are the growing problems of the fiat system?
01:16:46.400 | What is this moment in human history
01:16:49.400 | that is full of challenges that Bitcoin
01:16:52.160 | and cryptocurrency is trying to overcome?
01:16:54.200 | - I don't think Bitcoin was devised by Satoshi,
01:17:00.800 | whoever he was, for fear of a breakdown
01:17:05.800 | of the fiat currencies.
01:17:07.400 | If it was, it was a very far-sighted enterprise
01:17:11.200 | because certainly in 2008,
01:17:12.560 | when the first Bitcoin paper appeared,
01:17:14.320 | it wasn't very likely that a wave of inflation was coming.
01:17:18.080 | If anything, there was more reason
01:17:20.360 | to fear deflation at that point.
01:17:22.240 | I think it would be more accurate to say
01:17:26.360 | that with the advent of the internet,
01:17:28.800 | there was a need for a means of payment
01:17:31.160 | native to the internet.
01:17:33.240 | Typing your credit card number into random websites,
01:17:36.360 | is not the way to pay for things on the internet.
01:17:40.240 | And I'd rather think of Bitcoin as the first iteration,
01:17:43.560 | the first attempt to solve the problem
01:17:45.160 | of how do we pay for things
01:17:46.320 | in what we must learn to call the metaverse,
01:17:48.760 | but let's just call it the internet for old times' sake.
01:17:52.480 | And ever since that initial innovation,
01:17:56.000 | the realization that you could use computing power
01:17:58.560 | and cryptography to create peer-to-peer payments
01:18:01.720 | without third-party verification,
01:18:04.160 | a revolution has been gathering momentum
01:18:07.160 | that poses a very profound threat
01:18:08.920 | to the existing legacy system of banks and fiat currencies.
01:18:13.040 | Most money in the world today is made by banks,
01:18:15.600 | not central banks, banks.
01:18:17.760 | That's what most money is, it's entries in bank accounts.
01:18:21.840 | And what Bitcoin represents is an alternative mode
01:18:26.400 | of payment that really ought to render banks obsolete.
01:18:30.280 | I think this financial revolution
01:18:33.520 | has got past the point at which it can be killed.
01:18:37.440 | It was vulnerable in the early years,
01:18:39.720 | but it now has sufficient adoption
01:18:42.360 | and has generated sufficient additional layers.
01:18:45.440 | I mean, Ethereum was in many ways
01:18:47.360 | the more important innovation
01:18:48.840 | because you can build a whole system of payments
01:18:52.240 | and ultimately smart contracts on top of Ether.
01:18:54.960 | I think we've now reached the point
01:18:56.240 | that it's pretty hard to imagine it all being killed.
01:18:59.760 | And it's just survived an amazing thing,
01:19:01.600 | which was the Chinese shutting down mining
01:19:03.600 | and shutting down everything.
01:19:04.520 | And still here we are.
01:19:05.840 | In fact, crypto's thriving.
01:19:09.160 | What we don't know is how much damage
01:19:14.040 | ill-judged regulatory interventions
01:19:16.840 | are going to do to this financial revolution.
01:19:19.760 | Left to its own devices,
01:19:21.400 | I think decentralized finance
01:19:23.560 | provides the native monetary
01:19:27.360 | and financial system for the internet.
01:19:29.800 | And the more time we spend in the metaverse,
01:19:33.040 | the more use we will make of it.
01:19:35.120 | The next things that will happen,
01:19:36.680 | I think will be that tokens in game spaces
01:19:40.240 | like Roblox will become fungible.
01:19:42.040 | As my nine-year-old spends a lot more time
01:19:45.480 | playing on computer games than I ever did,
01:19:48.200 | I can see the entertainment
01:19:50.080 | is becoming a game-driven phenomenon.
01:19:53.000 | And in the game space, you need skins for your avatar.
01:19:57.160 | The economics of the internet, it's evolving very fast.
01:20:01.560 | And in parallel,
01:20:02.560 | you can see this payments revolution happening.
01:20:05.440 | I think that all goes naturally very well
01:20:09.240 | and generates an enormous amount of wealth in the process.
01:20:13.560 | The problem is there are people in Washington
01:20:17.640 | with an overwhelming urge to intervene
01:20:20.880 | and disrupt this evolutionary process.
01:20:24.560 | Partly, I think out of a muddled sense
01:20:28.760 | that there must be a lot of nefarious things going on.
01:20:32.160 | If we don't step in, many more will go on.
01:20:34.760 | This, I think, greatly exaggerates
01:20:36.360 | how much criminal activity is in fact going on in the space.
01:20:40.120 | But there's also the vested interests at work.
01:20:43.000 | It was odd to me, maybe not odd,
01:20:46.440 | perhaps it wasn't surprising,
01:20:47.560 | that the Bank for International Settlements
01:20:49.280 | earlier this year published a report,
01:20:52.480 | one chapter of which said, "This must all go.
01:20:54.760 | It must all stop.
01:20:56.120 | It's all got to be shut down.
01:20:58.000 | And it's got to be replaced by a central bank digital currency."
01:21:01.360 | And Martin Wolf in the Financial Times read this
01:21:03.240 | and said, "I agree with this."
01:21:04.800 | And one suddenly realized that the banks are clever.
01:21:07.720 | They'd achieved the intellectual counter-attack
01:21:12.320 | with almost no fingerprints on the weapon.
01:21:16.080 | I think central bank digital currency is a terrible idea.
01:21:19.400 | I can't imagine why we would want to copy a Chinese model
01:21:23.040 | that essentially takes all transactions
01:21:25.640 | and puts them directly under the surveillance
01:21:27.400 | of a central government institution.
01:21:28.760 | But that suddenly is a serious counter-proposal.
01:21:33.760 | So on the one side, we have a relatively decentralized,
01:21:39.760 | technologically innovative, internet-native system
01:21:44.480 | of payments that has the possibility to evolve,
01:21:47.320 | to produce a full set of smart contracts,
01:21:51.600 | reducing enormously the transaction costs
01:21:54.560 | that we currently encounter in the financial world
01:21:56.560 | because it gets rid of all those middlemen
01:21:58.240 | who take their cut every time you take out a mortgage
01:22:01.080 | or whatever it is.
01:22:02.200 | That's one alternative.
01:22:04.040 | But on the other side, we have a highly centralized system
01:22:06.920 | in which transactions will by default
01:22:08.880 | be under the surveillance of the central bank.
01:22:11.280 | Seems like an easy choice to me,
01:22:12.840 | but hey, I have this thing about personal liberty.
01:22:15.400 | So that's where we are.
01:22:18.240 | I don't think that the regulators can kill Web3.
01:22:23.160 | I think we're supposed to call it Web3
01:22:24.560 | 'cause crypto is now an obsolescent term.
01:22:27.200 | They can't kill it,
01:22:28.120 | but they can definitely make it difficult
01:22:30.320 | and throw a lot of sand into the machine.
01:22:33.440 | And I think worst of all,
01:22:35.160 | they can spoil the evolutionary story
01:22:37.640 | by creating a central bank digital currency
01:22:40.640 | that I don't think we really need,
01:22:42.920 | or we certainly don't need it in the Chinese form.
01:22:47.440 | - So do you think Bitcoin has a strong chance
01:22:51.680 | to take over the world?
01:22:53.280 | So become the primary,
01:22:57.000 | you mentioned the three things that make money money,
01:23:00.440 | become the primary methodology
01:23:02.680 | by which we store wealth, we exchange.
01:23:05.840 | - No, no, I think what Bitcoin is,
01:23:09.600 | this was a phrase that I got from my friend,
01:23:12.120 | Matt McLennan at First Eagle,
01:23:13.840 | an option on digital gold.
01:23:15.760 | So it's the gold of the system,
01:23:18.480 | but currently behaves like an option.
01:23:20.680 | That's why it's quite volatile
01:23:22.320 | because we don't really know
01:23:23.760 | if this brave new world of crypto is gonna work.
01:23:28.760 | But if it does work, then Bitcoin is the gold
01:23:31.200 | because of the finite supply.
01:23:33.400 | What role we need gold to play in the metaverse
01:23:37.240 | isn't quite clear.
01:23:38.840 | - I love that you're using the term metaverse.
01:23:40.680 | This is great.
01:23:41.560 | - Well, I just like the metaversity
01:23:43.720 | as a kind of, as the antithesis
01:23:46.280 | of what we're trying to do in Austin.
01:23:48.480 | - I love it.
01:23:50.640 | - But can you imagine I'm using it sarcastically?
01:23:52.280 | I come from Glasgow where all novel words
01:23:54.440 | have to be used sarcastically.
01:23:55.720 | So the metaverse, sarcastic.
01:23:57.480 | - But see the beauty about humor and sarcasm
01:24:00.280 | is that the joke becomes reality.
01:24:03.440 | I mean, it's like using the word Big Bang
01:24:05.720 | to describe the origins of the universe.
01:24:07.480 | It becomes like, it will-
01:24:10.120 | - After a while it's in the textbooks
01:24:11.760 | and nobody's laughing.
01:24:12.800 | Yeah, well, that's exactly right.
01:24:14.280 | - So sticky.
01:24:15.280 | - Yeah, I'm on the side of humor,
01:24:18.680 | but it is a dangerous activity these days.
01:24:21.320 | Anyway, I think Bitcoin is the option of digital gold.
01:24:25.280 | The role it plays is probably not so much store of value.
01:24:30.280 | Right now, it's just nicely,
01:24:32.440 | not very correlated asset in your portfolio.
01:24:35.160 | When I updated "The Ascent of Money,"
01:24:36.760 | which was in 2018, 10 years after it came out,
01:24:40.040 | I wrote a new chapter in which I said,
01:24:42.200 | Bitcoin, which had just sold off after its 2017 bubble,
01:24:48.120 | will rise again through adoption.
01:24:51.240 | Because if every millionaire in the world
01:24:54.160 | has 0.2% of his or her wealth in Bitcoin,
01:24:57.800 | the price should be $15,000.
01:24:59.880 | And if it's 1%, it's $75,000.
01:25:03.280 | And it might not even stay at 1%,
01:25:06.000 | because I mean, look at its recent performance.
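Ferguson's adoption arithmetic can be sanity-checked in two lines. The inputs below, total millionaire wealth of roughly $140 trillion and a circulating supply of about 18.6 million coins, are assumptions of mine chosen to match the era's estimates; he does not state them in the conversation.

```python
# Back-of-envelope implied Bitcoin price under a millionaire-adoption scenario.
# Assumed inputs (mine, not Ferguson's): total millionaire wealth of roughly
# $140 trillion and a circulating supply of about 18.6 million coins.
MILLIONAIRE_WEALTH = 140e12   # dollars (assumption)
BTC_SUPPLY = 18.6e6           # coins (assumption)

def implied_price(allocation):
    """Price if millionaires collectively put `allocation` of wealth in BTC."""
    return allocation * MILLIONAIRE_WEALTH / BTC_SUPPLY

print(round(implied_price(0.002)))  # 0.2% allocation -> about $15,000
print(round(implied_price(0.01)))   # 1% allocation   -> about $75,000
```

With these assumed inputs, the 0.2% and 1% allocations land near the $15,000 and $75,000 figures he quotes; the 5x ratio between the two scenarios holds regardless of the wealth estimate.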
01:25:08.280 | If your exposure to stock,
01:25:11.560 | global stocks had been hedged
01:25:12.960 | with a significant crypto holding,
01:25:16.200 | you would have aced the last few months.
01:25:19.160 | So I think the non-correlation property
01:25:24.000 | is very, very important in driving adoption.
01:25:27.160 | And the volatility also drives adoption
01:25:29.200 | if you're a sophisticated investor.
01:25:31.840 | So I think the adoption drives Bitcoin up
01:25:36.400 | because it's the option of digital gold,
01:25:38.120 | but it's also just this nicely,
01:25:39.680 | not very correlated asset that you wanna hold.
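The value of a not-very-correlated asset can be made concrete with the standard two-asset portfolio volatility formula. The weights and volatilities below are purely illustrative assumptions, not market data.

```python
import math

def portfolio_vol(w1, vol1, vol2, rho):
    """Annualized volatility of a two-asset portfolio with weights w1 and 1 - w1."""
    w2 = 1.0 - w1
    var = (w1 * vol1) ** 2 + (w2 * vol2) ** 2 + 2 * w1 * w2 * rho * vol1 * vol2
    return math.sqrt(var)

# Illustrative assumption: 95% in stocks at 15% volatility, plus 5% in a far
# more volatile asset at 80% volatility. If the volatile asset is uncorrelated
# (rho = 0), total portfolio volatility comes out *below* the 15% of an
# all-stock portfolio; if perfectly correlated (rho = 1), it comes out above.
print(portfolio_vol(0.95, 0.15, 0.80, rho=0.0))
print(portfolio_vol(0.95, 0.15, 0.80, rho=1.0))
```

The counterintuitive part is the rho = 0 case: a small slice of a highly volatile but uncorrelated asset can reduce total portfolio risk, which is the diversification argument being made here.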
01:25:42.560 | In a world where, what the hell?
01:25:44.800 | I mean, the central bank's gonna tighten.
01:25:47.280 | We've come through this massively disruptive episode
01:25:49.480 | of the pandemic.
01:25:50.680 | Public debt soared.
01:25:52.520 | Money printing soared.
01:25:54.480 | You could hang around with your bonds
01:25:57.520 | and wait for the euthanasia of the rentier.
01:26:00.600 | You can hang on to your tech stocks
01:26:02.640 | and just hope there isn't a massive correction
01:26:04.480 | or dot, dot, dot.
01:26:05.880 | Well, and it seems like a fairly obvious strategy
01:26:08.640 | to make sure that you have at least some crypto
01:26:11.480 | for the coming year,
01:26:13.400 | given what we likely have to face.
01:26:16.520 | I think what's really interesting
01:26:17.840 | is that on top of Ethereum,
01:26:20.640 | a more elaborate financial system is being built.
01:26:27.160 | Stable coins are the interesting puzzle for me
01:26:32.160 | 'cause we need off-ramps.
01:26:34.920 | Ultimately, you and I have to pay taxes in US dollars.
01:26:39.000 | And there's no getting away from that.
01:26:43.960 | The IRS is gonna let us hold crypto
01:26:46.120 | as long as we pay our taxes.
01:26:48.560 | And the only question in my mind is
01:26:50.920 | what's the optimal off-ramp to make those taxes,
01:26:55.280 | make those tax payments?
01:26:56.680 | Probably it shouldn't be a currency invented by Facebook.
01:27:01.760 | Never struck me as the best solution to this problem.
01:27:05.720 | Maybe it's some kind of Fed coin.
01:27:08.000 | Or maybe one of the existing algorithmic stable coins
01:27:13.280 | does the job, but we clearly need some stable off-ramp.
01:27:16.760 | - So you don't think it's possible for the IRS
01:27:18.880 | within the next decade to be accepting Bitcoin
01:27:21.200 | as tax payments?
01:27:22.800 | - I doubt that.
01:27:24.280 | Having dealt with the IRS now
01:27:26.160 | since when did I first come here, 2002.
01:27:28.840 | It's hard to think of an institution less likely
01:27:32.440 | to leap into the 21st century when it comes to payments.
01:27:37.440 | No, I think we'll be tolerated.
01:27:41.920 | Crypto world will be tolerated as long as we pay our taxes.
01:27:46.480 | And it's important that we're already at that point.
01:27:49.280 | And then the next question becomes,
01:27:50.560 | well, does Gary Gensler define everything as a security
01:27:53.880 | and do we then have to go through
01:27:56.200 | endless regulatory contortions to satisfy the SEC?
01:28:00.560 | There's a whole bunch of uncertainties
01:28:03.720 | that the administrative state excels at creating
01:28:06.440 | 'cause that's just how the administrative state works.
01:28:09.560 | You'll do something new.
01:28:10.800 | I'll decide whether that's a security,
01:28:13.920 | but don't expect me to define it for you.
01:28:16.080 | I'll decide in an arbitrary way and then you'll owe me money.
01:28:19.360 | So all of this is going to be very annoying.
01:28:21.480 | And for people who are trying to run exchanges
01:28:25.560 | or innovate in the space,
01:28:27.600 | these regulations will be annoying.
01:28:29.640 | But the problem with FinTech is it's different from tech,
01:28:32.360 | broadly defined.
01:28:34.160 | When tech got into e-commerce with Amazon,
01:28:36.960 | when it got into social networking with Facebook,
01:28:39.400 | there wasn't a huge regulatory jungle to navigate,
01:28:43.680 | but welcome to the world of finance,
01:28:45.680 | which has always been a jungle of regulation
01:28:48.840 | because the regulation is there to basically
01:28:52.960 | entrench the incumbents.
01:28:54.280 | That's what it's for.
01:28:56.080 | So it'll be a much tougher fight
01:28:59.120 | than the fights we've seen of other aspects
01:29:03.040 | of the tech revolution
01:29:04.840 | 'cause the incumbents are there and they see the threat.
01:29:09.600 | And in the end, Satoshi said it very explicitly,
01:29:13.040 | it's peer-to-peer payment without third-party verification.
01:29:15.720 | And all the third parties are going, "Wait, what?
01:29:18.120 | "We're the third parties."
01:29:19.560 | - So there is a connection between power and money.
01:29:24.000 | You've mentioned World War I from the perspective of money.
01:29:29.120 | So power, money, war, authoritarian regimes.
01:29:34.120 | From the perspective of money,
01:29:37.320 | do you have hope that cryptocurrency can help resist war,
01:29:42.120 | can help resist the negative effects
01:29:46.240 | of authoritarian regimes?
01:29:48.940 | Or is that a silly hope?
01:29:50.380 | - Wars happen because the people
01:29:57.580 | who have the power to command armed forces miscalculate.
01:30:03.340 | That's generally what happens.
01:30:07.380 | And we will have a big war in the near future
01:30:12.140 | if both the Chinese government
01:30:14.220 | and the US government miscalculate
01:30:17.020 | and they unleash lethal force on one another.
01:30:20.300 | And there's nothing that any financial institution
01:30:23.180 | can do to stop that
01:30:25.180 | any more than the Rothschilds could stop World War I.
01:30:29.340 | And they were then the biggest bank in the world by far
01:30:31.780 | with massive international financial influence.
01:30:35.580 | So let's accept that war is in a different domain.
01:30:39.380 | War would impact the financial world massively
01:30:45.100 | if it were a war between the United States and China
01:30:47.620 | because there's still a huge China trade on.
01:30:52.380 | Wall Street is long China, Europe is long China.
01:30:55.940 | So the conflict that I can foresee in the future
01:30:59.520 | is one that's highly financially disruptive.
01:31:02.900 | Where does crypto fit in?
01:31:04.220 | Crypto's obvious utility in the short run
01:31:09.900 | is as a store of wealth, of transferable wealth
01:31:13.820 | for people who live in dangerous places
01:31:17.060 | with failing, not just failing money,
01:31:19.540 | but failing rule of law.
01:31:21.440 | That's why in Latin America
01:31:22.740 | there's so much interest in crypto
01:31:24.460 | because Latin Americans have a lot of monetary history
01:31:26.580 | to look back on and not much of it is good.
01:31:28.740 | So I think that the short run problem that crypto solves is,
01:31:34.940 | and this goes back to the digital gold point,
01:31:39.060 | if you are in a dangerous place with weak rule of law
01:31:43.020 | and weak property rights,
01:31:44.700 | here is a new and better way to have portable wealth.
01:31:49.300 | I think the next question to ask is,
01:31:56.060 | would you want to be long crypto
01:31:58.860 | in the event of World War III?
01:32:00.820 | What's interesting about that question
01:32:03.560 | is that World War III would likely have
01:32:05.260 | a significant cyber dimension to it.
01:32:07.480 | And I don't wanna be 100% in crypto
01:32:10.900 | if they crash the internet,
01:32:14.020 | which between them, China and Russia might be able to do.
01:32:18.420 | That's a fascinating question
01:32:19.940 | whether you want to be holding physical gold
01:32:22.620 | or digital gold in the event of World War III.
01:32:25.980 | The smart person who studied history
01:32:27.840 | definitely wants both.
01:32:30.420 | And so let's imagine World War III
01:32:34.020 | has a very, very severe cyber component to it
01:32:36.900 | with high levels of disruption.
01:32:39.260 | Yeah, you'd be glad of the old shiny stuff at that point.
01:32:44.100 | So diversification still seems like the most important truth
01:32:49.100 | of financial history.
01:32:52.620 | And what is crypto?
01:32:54.060 | It's just this wonderful new source of diversification,
01:32:57.200 | but you'd be nuts to be 100% in Bitcoin.
01:33:00.380 | I mean, I have some friends
01:33:02.580 | who are probably quite close to that.
01:33:04.580 | - Close to 100%, yeah.
01:33:06.420 | - I admire the balls of steel.
01:33:10.380 | - Yeah, in whatever way that balls of steel takes form.
01:33:16.860 | You mentioned smart contracts.
01:33:20.140 | What are your thoughts about,
01:33:21.780 | in the context of the history of money,
01:33:23.620 | about Ethereum, about smart contracts,
01:33:25.620 | about kind of more systematic at scale formalization
01:33:30.620 | of agreements between humans?
01:33:34.900 | - I think it must be the case
01:33:39.500 | that a lot of the complexity in a mortgage is redundant.
01:33:45.540 | That when we are confronted with pages and pages and pages
01:33:52.300 | and pages of small prints,
01:33:53.820 | we're seeing some manifestation
01:33:58.060 | of the late stage regulatory state.
01:34:01.180 | The transaction itself is quite simple.
01:34:04.700 | And most of the verbiage is just ass covering by regulators.
01:34:08.460 | So I think the smart contract,
01:34:11.340 | although I'm sure lawyers will email me
01:34:16.020 | and tell me I'm wrong,
01:34:17.700 | can deal with a lot of the plain vanilla
01:34:20.380 | and maybe not so plain transactions that we want to do
01:34:24.060 | and eliminate yet more intermediaries.
01:34:28.580 | That's my kind of working assumption.
01:34:31.940 | And given that a lot of financial transactions
01:34:36.940 | have the potential at least to be simplified, automated,
01:34:42.540 | turned into smart contracts,
01:34:45.380 | that's probably where the future goes.
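The "plain vanilla" case can be sketched as a toy escrow that releases funds automatically once agreed conditions hold, with no human intermediary. This is illustrative Python, not an actual on-chain contract language like Solidity, and the class and condition names are hypothetical.

```python
# Toy sketch of a plain-vanilla smart contract: an escrow that settles
# automatically once every agreed condition is met. Illustrative only;
# real smart contracts run on-chain (e.g. Solidity on Ethereum).
class Escrow:
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.deposited = 0
        self.conditions = {"title_verified": False, "funds_deposited": False}

    def deposit(self, amount):
        self.deposited += amount
        self.conditions["funds_deposited"] = self.deposited >= self.amount

    def verify_title(self):
        self.conditions["title_verified"] = True

    def settle(self):
        """Release funds to the seller only if every condition holds."""
        if all(self.conditions.values()):
            return {"to": self.seller, "amount": self.deposited}
        return None  # conditions not yet met, nothing moves

deal = Escrow("alice", "bob", 250_000)
deal.deposit(250_000)
assert deal.settle() is None        # title not yet verified, no settlement
deal.verify_title()
print(deal.settle())                # {'to': 'bob', 'amount': 250000}
```

The point of the sketch is that the settlement rule is a few lines of logic; the pages of small print in a real mortgage are, on this argument, mostly regulatory overhead rather than essential complexity.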
01:34:48.380 | I can't see an obvious reason why my range
01:34:51.900 | of different financial needs,
01:34:54.100 | let's think about insurance, for example,
01:34:56.540 | will continue to be met with instruments
01:35:00.940 | that in some ways are a hundred years old.
01:35:04.260 | So I think we're still at an early stage
01:35:08.620 | of a financial revolution that will greatly streamline
01:35:12.420 | how we take care of all those financial needs that we have,
01:35:17.100 | mortgages and insurance leap to mind.
01:35:19.860 | You know, most households are penalized
01:35:23.380 | for being financially poorly educated
01:35:27.100 | and confronted with oligopolistic
01:35:29.660 | financial services providers.
01:35:31.620 | So you kind of leave college already in debt.
01:35:35.740 | So you start in debt servitude,
01:35:40.340 | and then you got to somehow lever up
01:35:43.380 | to buy a home if you can,
01:35:45.260 | 'cause everybody's kind of telling you you should do that.
01:35:47.500 | So you and your spouse,
01:35:50.380 | you are getting even more leveraged
01:35:53.100 | and you're long one asset class called real estate,
01:35:57.380 | which is super illiquid.
01:35:59.540 | I mean, already I'm crying inside at the thought
01:36:03.980 | of describing so many households' financial predicament
01:36:07.540 | in that way, and I'm not done with them yet
01:36:09.060 | because, oh, by the way,
01:36:10.660 | there's all this insurance you have to take out.
01:36:13.180 | And here are the providers that are willing to insure you,
01:36:15.380 | and here are the premiums you're gonna be paying,
01:36:17.740 | which are kind of presented to you.
01:36:19.300 | That's your car insurance, that's your home insurance,
01:36:22.220 | and if you're here, it's the earthquake insurance.
01:36:23.820 | And pretty soon you're just bleeding money
01:36:26.460 | in a bunch of monthly payments to the mortgage lender,
01:36:30.940 | to the insurer, to all the other people that lent you money.
01:36:35.620 | And let's look at your balance sheet.
01:36:37.420 | It sucks.
01:36:39.340 | There's this great big chunk of real estate,
01:36:41.220 | and what else have you really got on there?
01:36:43.660 | And the other side is a bunch of debt,
01:36:45.340 | which is probably paying too high interest.
01:36:48.340 | The typical household in the median kind of range
01:36:52.100 | is at the mercy of oligopolistic financial services providers.
01:36:57.100 | Go down further in the social scale,
01:37:01.220 | and people are outside the financial system altogether.
01:37:03.780 | And those poor folks have to rely on bank notes
01:37:07.540 | and informal lending with huge punitive rates.
01:37:10.940 | We have to do better.
01:37:12.100 | This has to be improved upon.
01:37:15.340 | And I think what's exciting about our time
01:37:17.380 | is that technology now exists
01:37:19.540 | that didn't exist when I wrote "The Ascent of Money"
01:37:21.700 | to solve these problems.
01:37:22.660 | When I wrote "The Ascent of Money," which was in 2008,
01:37:25.340 | you couldn't really solve the problem I've just described.
01:37:30.380 | Certainly you couldn't solve it
01:37:31.620 | with something like microfinance.
01:37:32.980 | That was obviously not viable.
01:37:35.780 | The interest rates were high,
01:37:38.100 | the transaction costs were crazy,
01:37:40.700 | but now we have solutions,
01:37:42.020 | and the solutions are extremely exciting.
01:37:43.980 | So fintech is this great force for good
01:37:46.140 | that brings people into the financial system
01:37:48.820 | and reduces transaction costs.
01:37:50.860 | Crypto is part of it, but it's just part of it.
01:37:53.220 | There's a much broader story of fintech going on here
01:37:55.860 | where you get, suddenly you get financial services
01:37:59.620 | on your phone, don't cost nearly as much as they did
01:38:03.460 | when there had to be a bricks and mortar building
01:38:05.060 | on Main Street that you kind of went humbly
01:38:08.180 | and beseeched to lend you money.
01:38:10.380 | I'm excited about that
01:38:11.580 | because it seems to me very socially transformative.
01:38:14.700 | I'll give you one other example of what's great.
01:38:17.620 | The people who really get scalped in our financial system
01:38:21.540 | are senders and receivers of remittances,
01:38:25.780 | which are often amongst the poorest families in the world.
01:38:29.020 | The people who are like my wife's family in East Africa,
01:38:32.340 | really kind of hand to mouth.
01:38:34.460 | And if you send money to East Africa
01:38:36.420 | or the Philippines or Central America,
01:38:39.780 | the transaction costs are awful.
01:38:41.820 | I'm talking to you, Western Union.
01:38:46.900 | (laughs)
01:38:48.220 | We're gonna solve that problem.
01:38:50.220 | So 10 years from now,
01:38:51.180 | the transaction costs will just be negligible
01:38:53.500 | and the money will go to the people who need it
01:38:55.500 | rather than to rent seeking financial institutions.
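The scale of the remittance problem is easy to put in numbers. The 6.5% fee rate below is an assumption of mine, roughly in line with the World Bank's global average cost of sending $200; it is not a figure quoted in the conversation.

```python
# What remittance fees cost a family over a year of monthly transfers.
# The 6.5% fee rate is an assumption, roughly the World Bank's reported
# global average for a $200 transfer; the 0.5% rate is a hypothetical
# near-negligible fintech alternative.
def annual_fees(amount_per_month, fee_rate):
    """Total fees paid over twelve monthly transfers."""
    return 12 * amount_per_month * fee_rate

print(annual_fees(200, 0.065))   # roughly $156 a year lost to intermediaries
print(annual_fees(200, 0.005))   # roughly $12 a year if fees became negligible
```

For a hand-to-mouth household, the difference between those two numbers is close to a month's remittance, which is the "scalping" being described.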
01:38:58.140 | So I'm on the side of the revolution with this
01:38:59.900 | because I think the incumbent financial institutions
01:39:02.700 | globally are doing a pretty terrible job
01:39:05.820 | and middle-class and lower-class families lose out.
01:39:10.340 | And thankfully, technology allows us to fix this.
01:39:13.900 | - Yeah, so FinTech can remove a lot of inefficiencies
01:39:16.180 | in the system.
01:39:17.100 | I'm super excited myself,
01:39:18.940 | maybe as a machine learning person in the data oracles.
01:39:22.060 | So converting a lot of our physical world into data
01:39:26.140 | and have smart contracts on top of that
01:39:29.220 | so that no longer is there's this fuzziness
01:39:32.660 | about what is the concrete nature of the agreements.
01:39:36.640 | You can tie your agreement to weather.
01:39:39.860 | You can tie your agreement to the behavior
01:39:42.780 | of certain kinds of financial systems.
01:39:46.300 | You can tie your behavior to, I don't know,
01:39:49.740 | I mean, all kinds of things.
01:39:50.780 | You can connect it to the body
01:39:52.300 | in terms of human sensory information.
01:39:55.980 | Like you can make an agreement
01:39:58.420 | that if you don't lose five pounds in the next month,
01:40:03.220 | you're going to pay me $1,000 or something like that.
01:40:05.620 | I don't know.
01:40:06.460 | It's a stupid example, but it's not
01:40:08.180 | because you can create all kinds of services on top of that.
01:40:12.120 | You can just create all kinds of interesting applications
01:40:15.140 | that completely revolutionize how humans transact.
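The weight-loss wager is exactly the shape of an oracle-conditioned agreement: a contract that settles on real-world data fed in by a trusted source. Everything in this toy sketch, the function name, the oracle reading, the stake, is hypothetical illustration.

```python
# Toy sketch of an oracle-conditioned agreement: the contract settles on a
# reading supplied by an external data source (an "oracle"), such as a
# connected scale. All names, readings, and stakes here are hypothetical.
def settle_wager(start_weight, oracle_weight, target_loss=5.0, stake=1000):
    """Return who owes the stake once the oracle reports the final weight."""
    if start_weight - oracle_weight >= target_loss:
        return {"payer": None, "amount": 0}           # goal met, nothing owed
    return {"payer": "participant", "amount": stake}  # goal missed, stake due

print(settle_wager(180.0, 176.0))   # lost 4 lb  -> stake owed
print(settle_wager(180.0, 174.5))   # lost 5.5 lb -> nothing owed
```

The agreement's terms are unambiguous because the condition is a comparison on data rather than a clause open to interpretation, which is the "no fuzziness" point being made.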
01:40:18.800 | - I think, of course, we don't want to create a world
01:40:24.780 | of Chinese style social credit
01:40:29.620 | in which our behavior becomes so transparent
01:40:34.460 | to providers of financial services, particularly insurers,
01:40:38.140 | that when I try to go into the pub,
01:40:42.100 | I'm stopped from doing so.
01:40:44.920 | - Every time you take a drink, your insurance goes up.
01:40:47.220 | (laughing)
01:40:48.220 | - Right, or my credit card won't work
01:40:51.380 | in certain restaurants because they serve rib eye steak.
01:40:56.140 | I fear that world 'cause I see it being built in China.
01:40:59.140 | And we must, at all costs, make sure
01:41:02.940 | that the Western world has something distinctive to offer.
01:41:07.540 | It can't just be, oh, it's the same as in China,
01:41:09.900 | only the data go to five tech companies
01:41:13.620 | rather than to Xi Jinping.
01:41:16.920 | So I think that the way we need to steer this world
01:41:21.740 | is in the way that our data
01:41:26.740 | are by default vaulted on our devices,
01:41:31.100 | and we choose when to release the data
01:41:35.380 | rather than the default setting
01:41:38.020 | being that the data are available.
01:41:40.100 | That's important, I think,
01:41:41.140 | because it was one of the biggest mistakes
01:41:43.060 | of the evolution of the internet
01:41:45.460 | that in a way the default was to let our data be plundered.
01:41:50.060 | It's hard to undo that, but I think we can at least
01:41:53.740 | create a new regime that in future makes privacy default
01:41:58.740 | rather than open access default.
01:42:02.120 | - In the book "Doom, the Politics of Catastrophe,"
01:42:07.980 | your newest book, you describe wars, pandemics,
01:42:11.900 | and the terrible disasters in human history.
01:42:14.300 | Which stands out to you as the worst
01:42:17.900 | in terms of how much it shook the world
01:42:21.020 | and the human spirit?
01:42:22.280 | - I am glad I was not around in the mid 14th century
01:42:28.700 | when the bubonic plague swept across Eurasia.
01:42:33.900 | As far as we can see, that was history's worst pandemic.
01:42:38.340 | Maybe there was a comparably bad one
01:42:40.300 | in the reign of the Emperor Justinian,
01:42:43.660 | but there's some reason to think it wasn't as bad.
01:42:47.860 | And the more we learn about the 14th century,
01:42:51.500 | the more we realize that it really was across Eurasia
01:42:54.420 | and the mortality was 30%, in some places 50%,
01:42:59.420 | in some places higher.
01:43:01.500 | There were whole towns that were just emptied.
01:43:03.800 | And when one reads about the Black Death,
01:43:07.460 | it's an unimaginable nightmare of death
01:43:12.460 | and madness in the face of death, with flagellant orders
01:43:17.740 | wandering from town to town,
01:43:20.460 | seeking to ward off divine retribution
01:43:22.760 | by flogging themselves,
01:43:24.860 | people turning on the local Jewish communities
01:43:26.980 | as if it's somehow their fault.
01:43:28.940 | That must've been a nightmarish time.
01:43:32.420 | If you ask me for an also-ran, a runner-up,
01:43:36.740 | it would be World War II in Eastern Europe.
01:43:42.160 | And in many ways, it might have been worse
01:43:46.260 | because for a medieval peasant,
01:43:50.380 | the sense of being on the wrong side of divine retribution
01:43:53.760 | must've been overpowering.
01:43:56.780 | In the mid 20th century,
01:43:59.380 | you knew that this was man-made murder
01:44:04.180 | on a massive industrial scale.
01:44:06.340 | If one reads Grossman's "Life and Fate,"
01:44:10.560 | just to take one example,
01:44:12.900 | one enters a hellscape that it's extremely hard
01:44:17.900 | to imagine oneself in.
01:44:22.140 | So these are two of the great disasters of human history.
01:44:25.540 | And if we did have a time machine,
01:44:28.100 | if one really were able to transport people back
01:44:31.020 | and give them a glimpse of these times,
01:44:35.540 | I think the post-traumatic stress would be enormous.
01:44:38.140 | People would come back from those trips,
01:44:40.360 | even if it was a one-day excursion with guaranteed survival
01:44:45.100 | in a state of utter shock.
01:44:47.160 | - You often explore counterfactual and hypothetical history,
01:44:51.180 | which is a fascinating thing to do,
01:44:53.700 | sometimes to a controversial degree.
01:44:55.880 | And again, you walk through that fire gracefully.
01:44:59.780 | So let me ask maybe about World War II or in general,
01:45:06.660 | what key moments in history of the 20th century
01:45:12.380 | do you think if something else happened at those moments,
01:45:15.860 | we could have avoided some of the big atrocities,
01:45:18.260 | Stalin's Holodomor, Hitler's Holocaust,
01:45:21.180 | Mao's Great Chinese Famine?
01:45:23.440 | - The great turning point in world history
01:45:28.440 | is August the 2nd, 1914,
01:45:32.880 | when the British cabinet decides to intervene
01:45:38.560 | and what would have been a European war
01:45:43.640 | becomes a world war.
01:45:45.880 | And with British intervention,
01:45:47.360 | it becomes a massively larger and more protracted conflict.
01:45:51.800 | So very early in my career,
01:45:53.280 | I became very preoccupied with the deliberations on that day
01:45:56.720 | and the surprising decision that a liberal cabinet took
01:46:02.160 | to go to war,
01:46:04.560 | which you might not have bet on that morning
01:46:06.500 | because there seemed to be a majority of cabinet members
01:46:10.680 | who would be disinclined and only a minority,
01:46:12.960 | including Winston Churchill, who wanted to go to war.
01:46:15.280 | So that's one turning point.
01:46:16.760 | I often wish I could get my time machine working
01:46:20.160 | and go back and say, wait, stop,
01:46:22.640 | just think about what you're going to do.
01:46:24.300 | And by the way, let me show you a video of Europe in 1918.
01:46:28.920 | So that's one.
01:46:29.760 | - Can we linger on that one?
01:46:31.480 | That one, a lot of people push back on you on,
01:46:36.480 | because it's so difficult.
01:46:39.360 | So the idea is, if I could try to summarize,
01:46:43.720 | and you're the first person that made me think
01:46:46.400 | about this very uncomfortable thought,
01:46:50.360 | which is the idea is in World War I,
01:46:54.320 | it would be a better world if Britain stayed out of the war
01:46:57.560 | and Germany won.
01:47:00.480 | - Right.
01:47:01.320 | - Thinking now in retrospect
01:47:05.920 | at the whole story of the 20th century,
01:47:07.960 | thinking about Stalin's rule of 30 years,
01:47:11.920 | thinking about Hitler's rise to power
01:47:14.520 | and the atrocities of the Holocaust,
01:47:18.360 | but also, like you said, on the Eastern front,
01:47:21.320 | the death of tens of millions of people throughout the war,
01:47:25.960 | and also sort of the political prisoners
01:47:28.600 | and the suffering connected to communism,
01:47:30.680 | connected to fascism, all those kinds of things.
01:47:33.080 | Well, that's one heck of an example
01:47:37.880 | of why you're just like fearless in this particular style
01:47:42.000 | of exploring counterfactual history.
01:47:44.120 | So can you elaborate on that idea
01:47:47.400 | and maybe why this was such an important day
01:47:50.200 | in human history?
01:47:52.200 | - This argument was central to my book, "The Pity of War."
01:47:55.120 | I also did an essay in "Virtual History" about this,
01:47:58.360 | and it's always amused me that from around that time,
01:48:01.200 | I began to be called a conservative historian
01:48:03.240 | because it's actually a very left-wing argument.
01:48:05.560 | The people in 1914 who thought Britain should stay
01:48:07.920 | at the war were the left of the Labor Party,
01:48:10.560 | who split to become the independent Labor Party.
01:48:13.800 | What would have happened?
01:48:16.120 | Well, first of all, Britain was not ready for war in 1914.
01:48:19.960 | There had not been conscription.
01:48:21.120 | The army was tiny.
01:48:23.120 | So Britain had failed to deter Germany.
01:48:25.640 | The Germans took the decision that they could risk
01:48:28.880 | going through Belgium, using the Schlieffen plan
01:48:32.040 | to fight their two-front war.
01:48:33.720 | They calculated that Britain's intervention
01:48:37.560 | would either not happen or not matter.
01:48:41.120 | If Britain had been strategically committed
01:48:46.120 | to preventing Germany winning a war in Europe,
01:48:48.960 | they should have introduced conscription 10 years before,
01:48:51.280 | had a meaningful land army,
01:48:53.440 | and that would have deterred the Germans.
01:48:55.800 | So the liberal government provided the worst of both worlds,
01:48:59.320 | a commitment that was more or less secret to intervene
01:49:03.360 | that the public didn't know about,
01:49:05.480 | in fact, much of the Liberal Party didn't know about,
01:49:07.640 | but without really the means
01:49:09.560 | to make that intervention effective,
01:49:11.320 | a tiny army with just a few divisions.
01:49:13.400 | So it was perfectly reasonable to argue,
01:49:16.040 | as a number of people did on August the 2nd, 1914,
01:49:19.720 | that Britain should not intervene.
01:49:21.560 | After all, Britain had not immediately intervened
01:49:23.560 | against the French Revolutionary armies back in the 1790s.
01:49:27.160 | It had played an offshore role, ultimately intervening,
01:49:30.480 | but not immediately intervening.
01:49:32.840 | If Britain had stayed out,
01:49:34.240 | I don't think that France would have collapsed immediately
01:49:38.440 | as it had in 1870.
01:49:40.400 | The French held up remarkably well to catastrophic casualties
01:49:44.440 | in the first six months of the First World War.
01:49:47.800 | But by 1916, I don't see how France could have kept going
01:49:52.400 | if Britain had not joined the war.
01:49:54.840 | And I think the war would have been over
01:49:56.200 | perhaps at some point in 1916.
01:49:59.200 | We know that Germany's aims
01:50:00.520 | would have been significantly limited
01:50:02.320 | because they would have needed to keep Britain out.
01:50:04.480 | If they'd succeeded in keeping Britain out,
01:50:06.160 | they'd have had to keep Britain out.
01:50:07.680 | And the way to keep Britain out
01:50:08.720 | was obviously not to make any annexation of Belgium,
01:50:11.880 | to limit German war aims,
01:50:14.080 | particularly to limit them to Eastern Europe.
01:50:16.520 | And from Britain's point of view, what was not to like?
01:50:19.120 | So the Russian Empire is defeated along with France.
01:50:23.280 | What does that really change?
01:50:27.520 | If the Germans are sensible
01:50:29.800 | and we can see what this might've looked like,
01:50:33.840 | they focus on Eastern Europe,
01:50:35.880 | they take chunks of the Russian Empire,
01:50:38.440 | perhaps they create as they did
01:50:40.600 | in the peace of Brest-Litovsk,
01:50:44.440 | an independent or quasi-independent Poland.
01:50:47.160 | In no way does that pose a threat to the British Empire.
01:50:49.440 | In fact, it's a good thing.
01:50:51.160 | Britain never had had a particularly good relationship
01:50:54.440 | with the Russian Empire after all.
01:50:56.120 | The key point here is that the Germany
01:50:59.440 | that emerges from victory in 1916
01:51:02.240 | has a kind of European Union.
01:51:05.000 | It's the dominant power of an enlarged Germany
01:51:09.080 | with a significant Mitteleuropa,
01:51:12.880 | whatever you want to call it,
01:51:13.800 | customs union type arrangement with neighboring countries,
01:51:17.360 | including one suspects Austria-Hungary.
01:51:20.240 | That is a very different world from the world of 1917-18.
01:51:27.160 | The protraction of the war for a further two years,
01:51:31.080 | its globalization,
01:51:33.200 | which Britain's intervention made inevitable.
01:51:35.600 | As Philip Zelikow showed in his recent book
01:51:39.400 | on the failure to make peace in 1916,
01:51:43.480 | Woodrow Wilson tried and failed to intervene
01:51:46.040 | and broker a peace in 1916.
01:51:47.680 | So I'm not the only counterfactualist here.
01:51:50.760 | The extension of the war for a further two years
01:51:53.360 | with escalating slaughter,
01:51:55.040 | the death toll rose because the industrial capacity
01:51:58.560 | of the armies grew greater.
01:52:00.440 | That's what condemns us to the Bolshevik revolution.
01:52:04.400 | And it's what condemns us ultimately to Nazism
01:52:08.040 | because it's out of the experience of defeat in 1918,
01:52:13.760 | as Hitler makes clear in "Mein Kampf"
01:52:15.600 | that he becomes radicalized and enters the political realm.
01:52:19.320 | Take out those additional years of war
01:52:23.920 | and Hitler's just a failed artist.
01:52:26.160 | It's the end of the war that turns him into the demagogue.
01:52:31.160 | You asked what are the things
01:52:34.880 | that avoid the totalitarian states.
01:52:38.880 | As I've said, British non-intervention for me
01:52:41.360 | is the most plausible
01:52:42.840 | and it takes out all of that malignant history
01:52:45.760 | that follows from the Bolshevik revolution.
01:52:48.280 | It's very hard for me to see how Lenin gets anywhere
01:52:51.000 | if the war is over.
01:52:53.800 | That looks like the opportunity
01:52:55.280 | for the constitutional elements,
01:52:59.800 | the liberal elements in Russia.
01:53:02.600 | There are other moments at which you can imagine
01:53:05.360 | history taking a different path.
01:53:07.440 | If the provisional government in Russia
01:53:12.560 | had been more ruthless,
01:53:14.040 | it was very lenient towards the Bolsheviks.
01:53:17.480 | But if it had just rounded them up
01:53:19.080 | and shot the Bolshevik leadership,
01:53:20.760 | that would have certainly cut the Bolshevik revolution off.
01:53:25.000 | One looks back on the conduct of the Russian liberals
01:53:29.160 | with the kind of despair at their failure
01:53:32.120 | to see the scale of the threat that they face
01:53:34.720 | and the ruthlessness
01:53:35.640 | that the Bolshevik leadership would evince.
01:53:38.520 | There's a counterfactual in Germany, which is interesting.
01:53:42.120 | I think the Weimar Republic destroyed itself
01:53:44.960 | in two disastrous economic calamities,
01:53:49.760 | the inflation and then the deflation.
01:53:51.800 | It's difficult for me to imagine Hitler
01:53:54.720 | getting to be Reich Chancellor
01:53:58.160 | without those huge economic disasters.
01:54:01.000 | So another part of my early work
01:54:02.920 | explored alternative policy options
01:54:05.520 | that the German Republic,
01:54:07.760 | the Weimar Republic might've pursued.
01:54:09.920 | There are other contingencies that spring to mind.
01:54:14.160 | In 1936 or '38, I think more plausibly '38,
01:54:18.840 | Britain should have gone to war.
01:54:21.200 | The great mistake was Munich.
01:54:22.920 | Hitler was in an extremely vulnerable position in 1938,
01:54:29.080 | because remember, he didn't have Russia squared away
01:54:31.520 | as he would in 1939.
01:54:33.840 | Chamberlain's mistake was to fold
01:54:36.360 | instead of going for war, as Churchill rightly saw.
01:54:41.120 | And there was a magical opportunity there
01:54:44.320 | that would have played into the hands
01:54:45.480 | of the German military opposition and conservatives
01:54:48.120 | to snuff Hitler out over Czechoslovakia.
01:54:52.920 | I could go on.
01:54:53.760 | The point is that history is not some inexorable narrative
01:54:58.760 | which can only end one way.
01:55:01.160 | It's a garden of forking paths.
01:55:03.720 | And at many, many junctions in the road,
01:55:08.280 | there were choices that could have averted
01:55:11.880 | the calamities of the mid 20th century.
01:55:14.360 | - I have to ask you about this moment
01:55:16.400 | before you said I could go on.
01:55:18.080 | This moment of Chamberlain and Hitler,
01:55:20.560 | snuff Hitler out in terms of Czechoslovakia.
01:55:24.280 | And we'll return to the book "Doom" on this point.
01:55:29.040 | What does it take to be a great leader
01:55:32.240 | in the room with Hitler or in the same time and space
01:55:37.240 | as Hitler to snuff him out, to make the right decisions?
01:55:43.960 | So it sounds like you put quite a bit of the blame
01:55:46.720 | on the man, Chamberlain, and give credit
01:55:51.200 | to somebody like a Churchill.
01:55:53.320 | So what is the difference?
01:55:54.240 | Where's that line?
01:55:55.360 | You've also written a book about Henry Kissinger
01:55:58.880 | who's an interesting sort of person
01:56:01.160 | that's been throughout many difficult decisions
01:56:04.780 | in the games of power.
01:56:06.160 | So what does it take to be a great leader in that moment?
01:56:08.480 | That particular moment, sorry to keep talking,
01:56:10.600 | is fascinating to me.
01:56:12.240 | 'Cause it feels like it's man on man conversations
01:56:15.760 | that define history.
01:56:17.640 | - Well, Hitler was bluffing.
01:56:19.600 | He really wasn't ready for war in 1938.
01:56:21.720 | The German economy was clearly not ready for war in 1938.
01:56:25.720 | And Chamberlain made a fundamental miscalculation
01:56:30.720 | along with his advisors, 'cause it wasn't all Chamberlain.
01:56:33.680 | He was in many ways articulating the establishment view.
01:56:38.680 | And I tried to show in a book called "War of the World"
01:56:41.420 | how that establishment worked.
01:56:42.720 | It extended through the BBC
01:56:45.040 | into the aristocracy to Oxford.
01:56:46.820 | There was an establishment view.
01:56:48.000 | Chamberlain personified it.
01:56:49.760 | Churchill was seen as a warmonger.
01:56:53.020 | He was at his lowest point of popularity in 1938.
01:56:56.280 | But what is it that Chamberlain gets wrong?
01:56:59.080 | 'Cause it's conceptual.
01:57:00.760 | Chamberlain is persuaded that Britain has to play for time
01:57:03.680 | because Britain is not ready for war in 1938.
01:57:06.800 | He fails to see that the time that he gets,
01:57:09.800 | that he buys at Munich is also available to Hitler.
01:57:13.520 | Everybody gets the time.
01:57:15.200 | And Hitler's able to do much more with it
01:57:17.000 | because Hitler strikes the pact with Stalin
01:57:19.880 | that guarantees that Germany can fight a war
01:57:23.680 | on one front in 1939.
01:57:25.960 | What does Chamberlain do?
01:57:26.880 | Build some more aircraft.
01:57:28.760 | So the great mistake of the strategy of appeasement
01:57:31.360 | was to play for time.
01:57:32.820 | I mean, they knew war was coming,
01:57:34.160 | but they were playing for time,
01:57:35.120 | not realizing that Hitler got the time too.
01:57:39.200 | And after he partitioned Czechoslovakia,
01:57:42.040 | he was in a much stronger position,
01:57:43.800 | not least because of all the resources
01:57:45.420 | that they were able to plunder from Czechoslovakia.
01:57:50.420 | So that was the conceptual mistake.
01:57:52.760 | Churchill played an heroic role in pointing out this mistake
01:57:57.760 | and predicting accurately
01:58:00.960 | that it would lead to war on worse terms.
01:58:04.100 | What does it take?
01:58:05.700 | It takes a distinct courage to be unpopular.
01:58:10.700 | And Churchill was deeply unpopular at that point.
01:58:13.680 | People would listen to him in the House of Commons
01:58:16.520 | in silence.
01:58:17.760 | On one occasion, Lady Astor shouted, "Rubbish!"
01:58:22.020 | So he went through a period of being hated on.
01:58:25.300 | The other thing that made Churchill a formidable leader
01:58:29.100 | was that he always applied history to the problem.
01:58:31.820 | And that's why he gets it right.
01:58:35.140 | He sees the historical problem
01:58:37.420 | much more clearly than Chamberlain.
01:58:39.900 | So I think if you go back to 1938,
01:58:44.340 | there's no realistic counterfactual
01:58:45.900 | in which Churchill's in government in 1938.
01:58:48.000 | You have to have France collapse
01:58:49.820 | for Churchill to come into government.
01:58:51.900 | But you can certainly imagine a Tory elite
01:58:56.900 | that's thinking more clearly about the likely dynamics.
01:59:02.360 | They haven't seen this, I guess, problem of conjecture
01:59:06.340 | to take a phrase from Kissinger,
01:59:08.900 | which is that whatever they're doing in postponing the war
01:59:13.820 | has the potential to create a worse starting point
01:59:17.940 | for the war.
01:59:18.780 | It would have been risky in 1938,
01:59:21.900 | but it was a way better situation
01:59:23.640 | than they ended up with in 1939, a year later.
01:59:27.280 | You asked about Kissinger, and I've learned a lot
01:59:28.940 | from reading Kissinger and talking to Kissinger
01:59:33.180 | since I embarked on writing his biography
01:59:35.780 | a great many years ago.
01:59:37.140 | I think one of the most important things I've learned
01:59:40.740 | is that you can apply history to contemporary problems.
01:59:46.600 | It may be the most important tool that we have
01:59:49.720 | in that kind of decision-making.
01:59:52.820 | You have to do it quite ruthlessly and rigorously.
01:59:58.660 | And in the moment of crisis, you have to take risk.
02:00:03.380 | So Kissinger often says in his early work,
02:00:08.380 | the temptation of the bureaucrat is to wait for more data,
02:00:12.260 | but ultimately the decision-making
02:00:14.500 | that we do under uncertainty can't be based on data.
02:00:18.460 | The problem of conjecture is that you could take an action
02:00:21.260 | now and incur some cost, avert a disaster,
02:00:27.380 | but you'll get no thanks for it
02:00:28.700 | 'cause nobody is grateful for an averted disaster.
02:00:31.200 | And nobody goes around saying,
02:00:33.340 | "Wasn't it wonderful how we didn't have another 9/11?"
02:00:36.340 | On the other hand, you can do nothing,
02:00:40.780 | incur no upfront costs and hope for the best,
02:00:44.140 | and you might get lucky, the disaster might not happen.
02:00:47.460 | That's in a democratic system, the much easier path to take.
02:00:54.300 | And I think the essence of leadership is to be ready
02:00:59.300 | to take that upfront cost, avert the disaster,
02:01:02.020 | and accept that you won't get gratitude.
02:01:04.020 | - If I may make a comment, an aside about Henry Kissinger.
02:01:10.340 | So he, I think at 98 years old currently, has still got it.
02:01:15.400 | - Yeah. - He's brilliant.
02:01:17.700 | - It's very, very impressive.
02:01:18.980 | I can only hope that my brain has the same durability
02:01:23.260 | that his does because it's a formidable intellect
02:01:26.060 | and it's still in as sharp form as it was 50 years ago.
02:01:31.060 | - So you mentioned Eric Schmidt and his book,
02:01:34.260 | and they reached out to me, they wanna do this podcast.
02:01:37.020 | And I know Eric Schmidt, I've spoken to him before,
02:01:41.420 | I like him a lot, obviously.
02:01:44.100 | So they said, "We could do a podcast
02:01:47.340 | "for 40 minutes with Eric,
02:01:49.940 | "40 minutes with Eric and Henry together,
02:01:52.640 | "and 40 minutes with Henry."
02:01:55.020 | So those are three different conversations.
02:01:58.120 | And I had to do some soul searching.
02:02:00.980 | 'Cause I said, "Fine, 40 minutes with Eric,
02:02:02.780 | "we'll probably talk many times again.
02:02:04.740 | "Fine, let's talk about this AI book
02:02:07.060 | "together for 40 minutes."
02:02:09.260 | But I said, what I wrote to them, I said,
02:02:11.180 | "I would hate myself if I only have 40 minutes
02:02:14.320 | "to talk to Henry Kissinger."
02:02:16.660 | And so I had to hold my ground, went back and forth,
02:02:18.860 | and in the end decided to part ways over this.
02:02:21.180 | And I sometimes think about this kind of difficult decision
02:02:25.860 | in the podcasting space of when do you walk away?
02:02:32.900 | Because there's a particular world leader
02:02:38.940 | that I've mentioned in the past
02:02:40.540 | where the conversation is very likely to happen.
02:02:43.940 | And as it happens, those conversations could often be,
02:02:50.740 | unfortunately, this person only has 30 minutes now.
02:02:53.380 | I know we agreed for three hours, but unfortunately.
02:02:56.060 | And you have to decide, do I stand my ground on this point?
02:03:01.060 | I suppose that's the thing that journalists
02:03:03.220 | have to think about, right?
02:03:04.580 | Like, do I hold onto my integrity
02:03:09.580 | in whatever form that takes?
02:03:11.420 | And do I stay my ground
02:03:12.540 | even if I lose a fascinating opportunity?
02:03:16.180 | Anyway, it's something I thought about
02:03:17.980 | and something I think about.
02:03:19.780 | And with Henry Kissinger, I mean,
02:03:21.700 | he's had a million amazing conversations in your biography,
02:03:25.780 | so it's not like something is lost,
02:03:27.300 | but it was still nevertheless, to me,
02:03:28.940 | some soul searching that I had to do
02:03:30.820 | as a kind of practice for what to me
02:03:34.620 | is a higher stakes conversation.
02:03:36.860 | I'll just mention it as Vladimir Putin.
02:03:38.940 | I can have a conversation with him
02:03:41.780 | unlike any conversation he's ever had,
02:03:44.460 | partially because I'm a fluent Russian speaker,
02:03:47.860 | partially because I'm messed up in the head
02:03:49.660 | in certain kind of ways that make for an interesting dynamic
02:03:52.780 | because we're both judo people,
02:03:54.420 | we both are certain kinds of human beings
02:03:58.580 | that can have a much deeper apolitical conversation.
02:04:01.300 | I have to ask to stay on the topic of leadership.
02:04:05.900 | You've, in your book "Doom,"
02:04:08.380 | have talked about wars, pandemics throughout human history,
02:04:12.360 | and in some sense saying
02:04:16.380 | that all of these disasters are man-made.
02:04:19.440 | So humans have a role in terms of the magnitude
02:04:23.580 | of the effect that they have on human civilization.
02:04:26.140 | Without taking cheap political shots,
02:04:30.240 | can we talk about COVID-19?
02:04:32.140 | How will history remember the COVID-19 pandemic?
02:04:38.060 | What were the successes?
02:04:39.500 | What were the failures of leadership, of man, of humans?
02:04:45.500 | - "Doom" was a book that I was planning to write
02:04:50.500 | before the pandemic struck as a history of the future
02:04:56.540 | based in large measure on science fiction.
02:04:59.660 | It had occurred to me in 2019
02:05:01.540 | that I had spent too long not reading science fiction.
02:05:04.380 | And so I decided I would liven up my intake
02:05:09.380 | by getting off history for a bit and reading science fiction.
02:05:15.020 | 'Cause history is great at telling you
02:05:16.260 | about the perennial problems of power.
02:05:18.540 | Putin is always interesting on history.
02:05:20.900 | He's become something of a historian recently
02:05:23.100 | with his essays and lectures.
02:05:25.780 | But what history is bad at telling you is,
02:05:27.460 | well, what will the effects
02:05:28.520 | of discontinuity of technology be?
02:05:31.740 | And so I thought, I need some science fiction
02:05:33.340 | to think more about this,
02:05:34.420 | 'cause I'm tending to miss the importance
02:05:38.340 | of technological discontinuity.
02:05:40.100 | If you read a lot of science fiction,
02:05:43.020 | you read a lot of plague books,
02:05:45.580 | 'cause science fiction writers are really quite fond
02:05:48.700 | of the plague scenario.
02:05:50.260 | So the world ends in many ways in science fiction,
02:05:52.460 | but one of the most popular is "The Lethal Pandemic."
02:05:54.640 | So when the first email came to me,
02:05:58.500 | I think it was on January the 3rd
02:05:59.900 | from my medical friend, Justin Stebbing,
02:06:01.860 | "Funny pneumonia in Wuhan," my antennae began to tingle,
02:06:06.860 | because it was just like one of those science fiction books
02:06:10.260 | that begins in just that way.
02:06:14.600 | In a pandemic, as Larry Brilliant,
02:06:18.800 | the epidemiologist said many years ago,
02:06:20.800 | the key is early detection and early action.
02:06:25.400 | That's how you deal with a novel pathogen.
02:06:27.500 | And almost no Western country did that.
02:06:30.720 | We know it was doable because the Taiwanese
02:06:33.800 | and the South Koreans did it, and they did it very well.
02:06:36.740 | But really no Western country got this right.
02:06:41.400 | Some were unlucky because super spreader events
02:06:43.920 | happened earlier than in other countries.
02:06:46.200 | Italy was hit very hard very early.
02:06:48.520 | For other countries, the real disaster came quite late,
02:06:51.040 | Russia, which has only relatively recently
02:06:54.000 | had a really bad experience.
02:06:56.020 | The lesson for me is quite different
02:06:59.960 | from the one that most journalists
02:07:02.000 | thought they were learning last year.
02:07:03.880 | Most journalists last year thought,
02:07:06.360 | Trump is a terrible president.
02:07:08.320 | He's saying a lot of crazy things.
02:07:10.700 | It's his fault that we have high excess mortality
02:07:13.220 | in the United States.
02:07:15.060 | The same argument was being made by journalists in Britain,
02:07:18.000 | Boris Johnson, dot, dot, dot, Brazil,
02:07:20.560 | Jair Bolsonaro, dot, dot, dot,
02:07:22.240 | even India, Narendra Modi, the same argument.
02:07:25.540 | And I think this argument is wrong in a few ways.
02:07:30.680 | It's true that the populist leaders said many crazy things
02:07:35.160 | and broadly speaking, gave poor guidance
02:07:37.520 | to their populations.
02:07:40.680 | But I don't think it's true to say
02:07:43.480 | that with different leaders,
02:07:44.620 | these countries would have done significantly better
02:07:46.920 | if Joe Biden had magically been president a year earlier.
02:07:50.400 | I don't think the US would have done much better
02:07:52.360 | because the things that caused excess mortality last year
02:07:55.560 | weren't presidential decisions.
02:07:57.020 | They were utter failure of CDC to provide testing.
02:08:00.920 | That definitely wasn't Trump's fault.
02:08:02.960 | Scott Gottlieb's book makes that very clear.
02:08:04.880 | It's just been published recently.
02:08:06.840 | We utterly failed to use technology for contact tracing,
02:08:10.160 | which the Koreans did very well.
02:08:12.560 | We didn't really quarantine anybody seriously.
02:08:16.600 | There was no enforcement of quarantine.
02:08:19.120 | And we exposed the elderly to the virus
02:08:21.080 | as quickly as possible in elderly care homes.
02:08:23.440 | And these things had very little to do
02:08:25.700 | with presidential incompetence.
02:08:28.080 | So I think leadership is of somewhat marginal importance
02:08:33.080 | in a crisis like this,
02:08:34.680 | 'cause what you really need is your public health bureaucracy
02:08:37.160 | to get it right.
02:08:38.080 | And very few Western public health bureaucracies
02:08:40.560 | got it right.
02:08:42.160 | Could the president have given better leadership?
02:08:47.080 | His correct strategy, however,
02:08:49.360 | was to learn from Barack Obama's playbook
02:08:52.740 | with the opioid epidemic.
02:08:55.200 | The opioid epidemic killed as many people on Obama's watch
02:08:59.400 | as COVID did on Trump's watch.
02:09:01.840 | And it was worse in the sense
02:09:03.160 | because it only happened in the US,
02:09:05.040 | and each year it killed more people than the year before
02:09:07.820 | over eight years.
02:09:09.200 | Nobody to my knowledge has ever seriously blamed Obama
02:09:12.680 | for the opioid epidemic.
02:09:14.800 | Trump's mistake was to put himself front and center
02:09:17.600 | of the response,
02:09:18.440 | to claim that he had some unique insight into the pandemic
02:09:22.160 | and to say with every passing week
02:09:24.840 | more and more foolish things
02:09:26.080 | until even a significant portion of people
02:09:29.160 | who'd voted for him in 2016 realized that he'd blown it,
02:09:32.080 | which was why he lost the election.
02:09:34.160 | The correct strategy was actually to make Mike Pence
02:09:37.800 | the pandemic czar and get the hell out of the way.
02:09:40.920 | That's what my advice to Trump would have been.
02:09:42.680 | In fact, it was in February of last year.
02:09:45.640 | So the mistake was to try to lead,
02:09:48.960 | but actually leadership in a pandemic
02:09:52.520 | is almost a contradiction in terms.
02:09:54.000 | What you really need is your public health bureaucracy
02:09:56.600 | not to fuck it up.
02:09:58.320 | And they really, really fucked it up.
02:10:00.400 | And that was then all blamed on Trump.
02:10:02.120 | - Yes.
02:10:02.960 | - Jim Fallows writes a piece in "The Atlantic" that says,
02:10:05.280 | "Well, being the president's like flying a light aircraft,
02:10:07.680 | "it's pilot error."
02:10:09.000 | And I read that piece and I thought,
02:10:10.520 | does he really, after all the years he spent writing,
02:10:13.160 | think that being president is like flying a light aircraft?
02:10:16.160 | I mean, it's really nothing like flying a light aircraft.
02:10:19.160 | Being president is you sit on top of a vast bureaucracy
02:10:22.360 | with how many different agencies, 60, 70,
02:10:24.400 | we've all lost count.
02:10:25.880 | And you're surrounded by advisors,
02:10:27.720 | at least a quarter of whom are saying,
02:10:29.680 | "This is a disaster, we have to close the borders."
02:10:31.760 | And the others are saying,
02:10:33.080 | "No, no, we have to keep the economy going.
02:10:34.600 | "That's what you're running on in November."
02:10:37.120 | So being the president in a pandemic
02:10:39.160 | is a very unenviable position
02:10:40.960 | because you can't really determine
02:10:45.440 | whether your public health bureaucracy
02:10:47.240 | will get it right or not.
02:10:48.320 | - You don't think to push back on that,
02:10:50.560 | just like being Churchill in a war is difficult.
02:10:53.120 | So leaving Trump or Biden aside,
02:10:57.040 | what I would love to see from a president
02:10:58.920 | is somebody who makes great speeches
02:11:03.240 | and arouses the public to push the bureaucracy,
02:11:06.420 | the public health bureaucracy,
02:11:08.200 | to get their shit together,
02:11:09.760 | to fire certain kinds of people.
02:11:11.600 | I mean, I'm sorry, but I'm a big fan of powerful speeches,
02:11:15.160 | especially in the modern age with the internet.
02:11:17.380 | It can really move people.
02:11:19.520 | Instead, the lack of speeches
02:11:23.080 | resulted in certain kinds of forces amplifying division
02:11:28.080 | over whether to wear masks or not.
02:11:31.480 | It's almost like the public picked some random topic
02:11:35.840 | over which to divide themselves.
02:11:37.640 | And there was a complete indecision,
02:11:39.800 | which is really what it was,
02:11:41.360 | fear of uncertainty materializing itself
02:11:45.200 | in some kind of division.
02:11:46.440 | And then you almost busied yourself
02:11:48.960 | with the red versus blue politics
02:11:50.960 | as opposed to some, I don't know,
02:11:52.480 | FDR type character just stands and say,
02:11:55.900 | "Fuck all this bullshit that we're hearing.
02:11:59.840 | We're going to manufacture 5 billion tests.
02:12:02.660 | This is what America is great at.
02:12:04.280 | We're going to build the greatest testing infrastructure
02:12:07.520 | ever built or something,
02:12:10.460 | or even with the vaccine development."
02:12:12.440 | - But that was what I was about to interject.
02:12:15.240 | In a pandemic, the most important thing is the vaccine.
02:12:18.380 | If you get that right,
02:12:19.520 | then you should be forgiven for much else.
02:12:21.560 | And that was the one thing
02:12:22.400 | the Trump administration got right,
02:12:23.840 | because they went around the bureaucracy
02:12:27.480 | with Operation Warp Speed
02:12:28.840 | and achieved a really major success.
02:12:33.260 | So I think the paradox of the 2020 story
02:12:38.260 | in the United States is that the one thing
02:12:42.180 | that mattered most, the Trump administration got right.
02:12:45.680 | And it got so much else wrong that was sort of marginal
02:12:49.080 | that we were left with the impression
02:12:50.980 | that Trump had been to blame for the whole disaster,
02:12:53.920 | which wasn't really quite right.
02:12:56.360 | Sure, it would have been great
02:12:57.480 | if we'd had Operation Warp Speed for testing,
02:13:00.000 | but ultimately vaccines are more important than tests.
02:13:02.920 | And this brings me to the question
02:13:06.960 | that you raised there of polarization
02:13:09.360 | and why that happened.
02:13:11.740 | Now, in a book called "The Square and the Tower,"
02:13:13.960 | I argued that it would be very costly for the United States
02:13:17.140 | to allow the public sphere to continue to be dominated
02:13:20.720 | by a handful of big tech companies,
02:13:22.440 | that this ultimately would have more adverse effects
02:13:25.260 | than simply contested elections.
02:13:27.400 | And I think we saw over the past 18 months
02:13:31.280 | just how bad this could be,
02:13:32.800 | because the odd thing about this country
02:13:37.720 | is that we came up with vaccines with 90 plus percent
02:13:40.960 | efficacy and about 20% of people refused to get them
02:13:44.920 | and still do refuse for reasons that seem best explained
02:13:49.920 | in terms of the anti-vax network,
02:13:54.200 | which has been embedded on the internet for a long time,
02:13:56.920 | predating the pandemic.
02:13:58.620 | Renée DiResta wrote about this pre-2020.
02:14:02.480 | And this anti-vax network has turned out
02:14:04.720 | to kill maybe 200,000 Americans
02:14:06.960 | who could have been vaccinated,
02:14:08.200 | but were persuaded through magical thinking
02:14:11.080 | that the vaccine was riskier than the virus.
02:14:14.000 | Whereas you don't need to be an epidemiologist,
02:14:17.000 | you don't need to be a medical scientist
02:14:18.400 | to know that the virus is about two orders of magnitude
02:14:21.220 | riskier than the vaccine.
02:14:22.760 | So again, leadership could definitely have been better,
02:14:30.000 | but the politicization of everything
02:14:32.520 | was not Trump's doing alone.
02:14:35.440 | It happened because our public sphere has been dominated
02:14:39.760 | by a handful of platforms whose business model
02:14:43.920 | inherently promotes polarization,
02:14:46.040 | inherently promotes fake news and extreme views,
02:14:49.080 | because those are the things that get the eyeballs
02:14:51.400 | on the screens and sell the ads.
02:14:53.180 | I mean, this is now a commonplace,
02:14:55.120 | but when one thinks about the cost of allowing
02:14:58.160 | this kind of thing to happen,
02:15:01.840 | it's now a very high human cost.
02:15:04.120 | And we were foolish to leave uncorrected
02:15:06.960 | these structural problems in the public sphere
02:15:09.280 | that were already very clearly visible in 2016.
02:15:12.800 | - And you described that,
02:15:14.040 | like you mentioned that there's these networks
02:15:17.440 | that are almost like laying dormant,
02:15:19.880 | waiting for their time in the sun.
02:15:22.820 | And they stepped forward in this case.
02:15:25.360 | And that those network effects just
02:15:27.840 | serve as a catalyst for whatever
02:15:31.680 | the bad parts of human nature are.
02:15:34.440 | I do hope that there's kinds of networks
02:15:36.360 | that emphasize the better angels of our nature
02:15:38.720 | to quote Steven Pinker.
02:15:40.800 | - It's just clearly, and we know this
02:15:43.520 | from all the revelations of the Facebook whistleblower,
02:15:46.360 | there is clearly a very clear tension
02:15:49.760 | between the business model of a company like Facebook
02:15:54.120 | and the public good.
02:15:56.360 | And they know that.
02:15:57.680 | - I just talked to the founder of Instagram.
02:16:00.280 | Yes, that's the case, but it's not
02:16:03.840 | from a technology perspective,
02:16:06.000 | like absolutely true of any kind of social network.
02:16:08.520 | I think it's possible to build,
02:16:09.840 | actually I think it's not just possible.
02:16:12.080 | I think it's pretty easy if you set that as the goal
02:16:15.000 | to build social networks
02:16:16.840 | that don't have these negative effects.
02:16:20.680 | - Right.
02:16:21.600 | But if the business model is we sell ads
02:16:26.120 | and the way you sell ads is to maximize user engagement,
02:16:30.200 | then the algorithm is biased
02:16:31.680 | in favor of fake news and extreme views.
02:16:33.320 | - But it's not, so it's not the ads.
02:16:34.800 | A lot of people blame the ads.
02:16:36.560 | The problem I think is the engagement
02:16:40.360 | and the engagement is just the easiest,
02:16:42.040 | the dumbest way to sell the ads.
02:16:43.880 | I think there's much different metrics
02:16:46.120 | that could be used to make a lot more money
02:16:48.760 | than the engagement in the longterm.
02:16:51.000 | It has more to do with planning for the longterm.
02:16:53.440 | So optimizing the selling of ads
02:16:57.560 | to make people happy with themselves in the longterm
02:17:02.560 | as opposed to some kind of addicted like dopamine feeling.
02:17:07.360 | And so that's, to me, that has to do with metrics
02:17:09.960 | and measuring things correctly
02:17:11.320 | and sort of also creating a culture
02:17:13.520 | with what's valued to have difficult conversations
02:17:16.760 | about what we're doing with society,
02:17:18.400 | all those kinds of things.
02:17:19.560 | And I think once you have those conversations,
02:17:21.960 | this takes us back to the University of Austin,
02:17:23.800 | kind of once you have those difficult human conversations,
02:17:27.080 | you can design the technology that will actually
02:17:30.000 | help people grow, become the best version of themselves,
02:17:34.240 | help them be happy in the longterm.
02:17:36.000 | What gives you hope about the future?
02:17:41.840 | As somebody who studied some of the darker moments
02:17:44.520 | of human history, what gives you hope?
02:17:49.200 | A couple of things.
02:17:50.240 | First of all, the United States
02:17:56.520 | has a very unique operating system,
02:17:58.880 | which was very well designed by the founders
02:18:02.160 | who'd thought a lot about history
02:18:03.600 | and realized it would take quite a novel design
02:18:07.840 | to prevent the republic going the way of all republics
02:18:10.880 | 'cause republics tend to end up as tyrannies
02:18:12.920 | for reasons that were well-established
02:18:14.960 | by the time of the Renaissance.
02:18:16.920 | And it gives me hope that this design has worked very well
02:18:20.680 | and withstood an enormous stress test in the last year.
02:18:25.560 | I became an American in 2018.
02:18:27.560 | I think one of the most important features
02:18:32.880 | of this operating system is that it is the magnet for talent.
02:18:37.640 | Here we sit, part of the immigration story
02:18:45.640 | in a darkened room with funny accents.
02:18:50.040 | - A Scot and a Russian walk into a-
02:18:50.880 | - A Scot and a Russian walk into a recording studio
02:18:54.480 | and talk about America.
02:18:56.000 | It's very much like a joke.
02:18:58.000 | And Elon's a South African and so on,
02:18:59.760 | and Thiel is a German.
02:19:00.760 | And we're extraordinarily fortunate
02:19:03.800 | that the natives let us come and play
02:19:06.000 | and play in a way that we could not
02:19:09.760 | in our countries of birth.
02:19:12.040 | And as long as the United States continues
02:19:14.360 | to exploit that superpower, that it is the talent magnet,
02:19:18.760 | then it should out-innovate
02:19:20.760 | the totalitarian competition every time.
02:19:24.280 | So that's one reason for being an optimist.
02:19:29.280 | Another reason, and it's quite a historical reason,
02:19:33.240 | as you would expect from me.
02:19:34.760 | Another reason that I'm optimistic
02:19:39.040 | is that my kids give me a great deal of credit
02:19:44.040 | and a great deal of hope.
02:19:45.080 | They range in age from 27 down to four,
02:19:48.720 | but each of them in their different way
02:19:52.760 | seems to be finding a way through this crazy time of ours
02:19:57.760 | without losing contact with that culture
02:20:03.800 | and civilization that I hold dear.
02:20:08.080 | I don't want to live in the metaverse
02:20:10.040 | as Mark Zuckerberg imagines it.
02:20:12.680 | To me, that's a kind of ghastly hell.
02:20:14.960 | I think Western civilization is the best civilization.
02:20:20.920 | And I think that almost all the truths
02:20:23.920 | about the human condition can be found
02:20:26.480 | in Western literature, art, and music.
02:20:32.520 | And I think also that the civilization
02:20:37.040 | that produced the scientific revolution
02:20:39.040 | has produced the great problem-solving
02:20:42.520 | tool that eluded the other civilizations
02:20:44.720 | that never really cracked science.
02:20:47.560 | And what gives me hope is that despite all the temptations
02:20:52.720 | and distractions that their generation had to contend with,
02:20:57.440 | my children in their different ways
02:20:58.920 | have found their way to literature and to art and to music.
02:21:03.920 | And they are civilized.
02:21:09.280 | And I don't claim much of the credit for that.
02:21:14.040 | I've done my best, but I think it's deeply encouraging
02:21:17.880 | that they found their way to the things
02:21:21.600 | that I think are indispensable for a happy life,
02:21:25.200 | a fulfilled life.
02:21:26.560 | Nobody, I think, can be truly fulfilled
02:21:29.120 | if they're cut off from the great body
02:21:32.120 | of Western literature, for example.
02:21:34.120 | I've thought a lot about Elon's argument
02:21:38.240 | that we might be in a simulation.
02:21:40.240 | No, no, there is a simulation.
02:21:43.200 | It's called literature.
02:21:44.960 | And we just have to decide whether or not to enter it.
02:21:49.120 | I'm currently in the midst of the later stages
02:21:53.320 | of Proust's great "À la recherche du temps perdu."
02:21:57.320 | And Proust's observation of human relationships
02:22:01.480 | is perhaps more meticulous than that of any other writer.
02:22:05.680 | And it's impossible not to find yourself
02:22:08.480 | identifying with Marcel and his obsessive,
02:22:13.040 | jealous relationships, particularly with Albertine.
02:22:16.040 | It's the simulation.
02:22:18.440 | And you decide, I think, as a sentient being,
02:22:23.440 | how far to, in your own life,
02:22:26.920 | reenact these more profound experiences
02:22:30.440 | that others have written down.
02:22:31.920 | One of my earliest literary simulations
02:22:34.280 | was to reenact Jack Kerouac's trip in "On the Road"
02:22:37.440 | when I was 17, culminating in getting very wasted
02:22:40.640 | in the hanging gardens of Xochimilco, not to be missed.
02:22:44.800 | And it hit me just as I was reading Proust
02:22:48.680 | that that's really how to live a rich life,
02:22:50.920 | that one lives life, but one lives it
02:22:53.720 | juxtaposing one's own experience
02:22:56.240 | against the more refined experiences of the great writers.
02:23:00.400 | So it gives me hope that my children do that a bit.
02:23:03.720 | (laughs)
02:23:04.560 | - Do you include the Russian authors in the canon?
02:23:09.360 | - Yes, I don't read Russian,
02:23:11.760 | but I was entirely obsessed with Russian literature
02:23:15.720 | as a school boy.
02:23:17.160 | I read my way through Dostoevsky, Tolstoy,
02:23:20.640 | Turgenev, Chekhov.
02:23:24.840 | I think of all of those writers,
02:23:29.840 | Tolstoy had the biggest impact
02:23:31.600 | because at the end of "War and Peace"
02:23:33.320 | there's this great essay on historical determinism,
02:23:36.000 | which I think was the reason I became a historian.
02:23:38.500 | But I'm really temperamentally a kind of Turgenev person,
02:23:44.680 | oddly enough.
02:23:48.120 | I think if you haven't read those novelists,
02:23:51.040 | I mean, you can't really be a complete human being
02:23:54.160 | if you haven't read the "Brothers Karamazov."
02:23:57.760 | You're not really, you're not grown up.
02:24:00.960 | And so I think in many ways,
02:24:02.840 | those are the greatest novels.
02:24:06.000 | Raskolnikov's, remember Raskolnikov's nightmare
02:24:09.440 | at the end of "Crime and Punishment,"
02:24:12.240 | in which he imagines in his dream,
02:24:15.320 | a world in which a terrible virus spreads.
02:24:19.400 | Do you remember this?
02:24:20.640 | And this virus has the effect of making every individual
02:24:23.560 | think that what he believes is right.
02:24:27.640 | And in this self-righteousness,
02:24:31.960 | people fall on one another and commit appalling violence.
02:24:36.920 | That's Raskolnikov's nightmare.
02:24:38.360 | And it's a prophecy.
02:24:39.560 | It's a terrible prophecy of Russia's future.
02:24:43.100 | - Yeah, it's, and coupled with that is probably the,
02:24:47.760 | I also like the French, the existentialist, all that.
02:24:50.840 | The full spectrum and Germans, Herman Hesse,
02:24:53.800 | and just that range of human thought
02:24:57.080 | as expressed in the literature is fascinating.
02:24:58.680 | I really love your idea that the simulation,
02:25:02.460 | like one way to live life is to kind of explore
02:25:09.520 | these other worlds and borrow from them wisdom
02:25:14.920 | that you then just map onto your own life.
02:25:17.480 | So you almost like stitch together your life
02:25:20.120 | with these kind of pieces from literature.
02:25:22.360 | - The highly educated person is constantly struck
02:25:26.160 | by allusion.
02:25:27.360 | Everything is an allusion to something that one has read.
02:25:32.400 | And that is the simulation.
02:25:34.860 | That's what the real metaverse is.
02:25:38.200 | It's the imaginary world that we enter when we read,
02:25:42.040 | empathize, and then recognize in our daily lives
02:25:45.780 | some scrap of the shared experience that literature gives us.
02:25:50.080 | - Yeah, I think of aspiring to be the idiot,
02:25:54.320 | Prince Myshkin, from Dostoevsky,
02:25:57.240 | and in aspiring to be that, I have become the idiot,
02:26:02.240 | I feel, at least in part.
02:26:04.360 | You mentioned the human condition,
02:26:09.200 | what does love have to do with it?
02:26:12.360 | What role does it play in the human condition?
02:26:15.040 | Friendship, love?
02:26:20.280 | - Love is the drug.
02:26:24.160 | Love is, this was the great Roxy Music line
02:26:31.040 | that Bryan Ferry wrote,
02:26:34.000 | and love is the most powerful and dangerous
02:26:38.520 | of all the drugs.
02:26:39.820 | The driving force that overrides our reason,
02:26:49.800 | and of course, it is the primal, it's the primal urge.
02:26:54.800 | So what a civilized society has to do
02:27:00.560 | is to prevent that drug, that primal force,
02:27:04.200 | from creating mayhem.
02:27:07.160 | So there have to be rules like monogamy
02:27:11.440 | and rituals like marriage that rein love in
02:27:16.280 | and make the addict at least more or less under control.
02:27:21.280 | And I think that's part of why I'm a romantic
02:27:29.400 | rather than a Steve Pinker enlightenment rationalist,
02:27:35.360 | because the romantics realized that love was the drug.
02:27:40.800 | It's like the difference in sensibility
02:27:45.680 | between Handel and Wagner.
02:27:48.700 | And I had a Wagnerian phase when I was an undergraduate.
02:27:53.000 | I still remember thinking that in
02:27:56.740 | Isolde's Liebestod, Wagner had got the closest to sex
02:28:02.800 | that anybody had ever got in music, or perhaps to love.
02:28:07.800 | I'm lucky that I love my wife
02:28:12.160 | and that we were, by the time we met,
02:28:17.160 | smart enough to understand that love is a drug
02:28:23.360 | that you have to kind of take in certain careful ways,
02:28:28.360 | and that it works best in the context of a stable family.
02:28:34.400 | That's the key thing, that one has to sort of take the drug
02:28:40.520 | and then submit to the conventions
02:28:44.360 | of marriage and family life.
02:28:47.000 | I think in that respect, I'm a kind of tamed romantic.
02:28:55.000 | - Tamed romantic.
02:28:56.680 | - That's how I would like to think of myself.
02:28:57.520 | - And the degree to which your romanticism is tamed
02:29:01.080 | can be then channeled into productive work.
02:29:03.320 | That's why you are a historian and a writer,
02:29:05.560 | is the rest of that love is channeled through the writing.
02:29:08.040 | - So if you're going to be addicted to anything,
02:29:09.760 | be addicted to work.
02:29:11.160 | I mean, we're all addictive,
02:29:13.600 | but the thing about workaholism
02:29:15.680 | is that it is the most productive addiction,
02:29:17.880 | and rather that than drugs or booze.
02:29:22.880 | So yes, I'm always trying to channel my anxieties into work.
02:29:27.880 | I learned that at a relatively early age,
02:29:31.320 | it's a sort of massively productive way
02:29:33.640 | of coping with the inner demons.
02:29:36.600 | And again, we should teach kids that,
02:29:39.360 | because let's come back to our earlier conversation
02:29:43.000 | about universities.
02:29:43.840 | Part of what happens at university
02:29:45.120 | is that adolescents have to overcome all the inner demons.
02:29:49.400 | And these include deep insecurity about one's appearance,
02:29:54.160 | about one's intellect, and then madly raging hormones
02:29:58.280 | that cause you to behave like a complete fool
02:30:00.560 | with the people to whom you're sexually attracted.
02:30:03.160 | All of this is going on in the university.
02:30:05.560 | How can it be a safe space?
02:30:07.120 | It's a completely dangerous space by definition.
02:30:09.920 | (laughing)
02:30:11.840 | So yeah, teaching young people how to manage these storms,
02:30:15.560 | that's part of the job,
02:30:18.120 | and we're really not allowed to do that anymore
02:30:20.800 | 'cause we can't talk about these things
02:30:22.120 | for fear of the Title IX officers kicking down the door
02:30:24.680 | and dragging us off in chains.
02:30:26.880 | - And like you said, hard work
02:30:28.760 | and something you call work ethic in civilization
02:30:35.760 | is a pretty effective way to achieve, I think,
02:30:39.360 | a kind of happiness in a world that's full of anxiety.
02:30:42.480 | - Or at least exhaustion, so that you sleep well.
02:30:45.200 | (laughing)
02:30:46.720 | - Well, there is beauty to the exhaustion too.
02:30:49.320 | There's why running, there's manual work,
02:30:52.600 | that some part of us is built for that.
02:30:55.320 | - Right, I mean, we are products of evolution,
02:30:59.560 | and our adaptation to a technological world
02:31:03.520 | is a very imperfect one.
02:31:04.760 | So hence the kind of masochistic urge to run.
02:31:09.200 | I'd like outdoor exercise.
02:31:14.200 | I don't really like gyms.
02:31:16.320 | So I'll go for long punishing runs in woodland,
02:31:21.320 | hike up hills.
02:31:24.280 | I like swimming in lakes and in the sea
02:31:27.920 | because there just has to be that physical activity
02:31:32.720 | in order to do the good mental work.
02:31:34.680 | And so it's all about trying to do the best work.
02:31:38.880 | That's my sense, that we have
02:31:42.480 | some random allocation of talent.
02:31:45.760 | You kind of figure out what it is
02:31:47.320 | that you're relatively good at,
02:31:48.960 | and you try to do that well.
02:31:51.280 | I think my father encouraged me to think that way.
02:31:55.160 | And you don't mind about being average at the other stuff.
02:31:58.560 | The kind of sick thing is to try to be brilliant
02:32:00.760 | at everything, I hate those people.
02:32:02.800 | You should really not worry too much
02:32:05.000 | if you're just an average double bass player,
02:32:07.840 | which I am, or kind of average skier,
02:32:10.120 | which I definitely am.
02:32:12.200 | Doing those things okay is part of leading a rich
02:32:15.360 | and fulfilling life.
02:32:16.680 | I was not a good actor,
02:32:19.040 | but I got a lot out of acting as an undergraduate.
02:32:22.800 | It turned out after three years of experimentation
02:32:25.280 | at Oxford that I was broadly speaking better
02:32:28.240 | at writing history essays than my peers.
02:32:32.920 | And that was my edge, that was my comparative advantage.
02:32:35.800 | And so I've just tried to make a living
02:32:37.880 | from that slight edge.
02:32:40.280 | - Yeah, that's a beautiful way to describe a life.
02:32:43.680 | Is there a meaning to this thing?
02:32:46.440 | Is there a meaning to life?
02:32:47.920 | What is the meaning of life?
02:32:49.880 | - I was brought up by a physicist and a physician.
02:32:53.480 | They were more or less committed atheists
02:32:56.560 | who had left the Church of Scotland
02:32:58.360 | as a protest against sectarianism in Glasgow.
02:33:02.240 | And so my sister and I were told from an early age,
02:33:05.600 | life was a cosmic accident.
02:33:07.520 | And that was it.
02:33:10.720 | There was no great meaning to it.
02:33:13.640 | And I can't really get past that.
02:33:19.840 | - Isn't there a beauty to being an accident
02:33:22.800 | at a cosmic scale?
02:33:24.440 | - Yes, I wasn't taught to feel negative about that.
02:33:27.800 | And if anything, it was a frivolous insight
02:33:32.360 | that the whole thing was a kind of joke.
02:33:34.840 | And I think that atheism isn't really a basis
02:33:40.440 | for ordering a society, but it's been all right for me.
02:33:45.440 | I don't have a kind of sense of a missing religious faith.
02:33:53.440 | For me, however, there's clearly some embedded
02:33:58.440 | Christian ethics in the way my parents lived.
02:34:03.800 | And so we were kind of atheist Calvinists
02:34:08.400 | who had kind of deposed God, but carried on behaving
02:34:11.560 | as if we were members of the elect in a moral universe.
02:34:14.280 | So that's kind of the state of mind that I was left in.
02:34:21.200 | And I think that we aren't really around long enough
02:34:26.200 | to claim that our individual lives have meaning.
02:34:32.000 | But what Edmund Burke said is true.
02:34:35.760 | The real social contract is between the generations,
02:34:38.680 | between the dead, the living and the unborn.
02:34:41.120 | And the meaning of life is, for me at least,
02:34:45.720 | to live in a way that honors the dead,
02:34:48.160 | seeks to learn from their accumulated wisdom,
02:34:50.560 | 'cause they do still outnumber us.
02:34:52.160 | They outnumber the living by quite a significant margin.
02:34:56.000 | And then to be mindful of the unborn
02:34:58.720 | and our responsibility to them.
02:35:03.560 | Writing books is a way of communicating with the unborn.
02:35:07.800 | It may or may not succeed, and probably won't succeed
02:35:10.720 | if my books are never assigned
02:35:12.160 | by woke professors in the future.
02:35:14.520 | So what we have to do is more than just write books
02:35:16.760 | and record podcasts.
02:35:18.320 | There have to be institutions.
02:35:20.560 | I'm 57 now.
02:35:22.280 | I realized recently that succession planning
02:35:25.200 | had to be the main focus of the next 20 years,
02:35:29.000 | because there are things that I really care about
02:35:33.000 | that I want future generations to have access to.
02:35:36.760 | And so the meaning of life I do regard
02:35:41.080 | as being intergenerational transfer of wisdom.
02:35:46.360 | Ultimately, the species will go extinct at some point.
02:35:50.400 | Even if we do colonize Mars,
02:35:52.240 | one senses that physics will catch up
02:35:55.040 | with this particular organism,
02:35:56.800 | but it's in the pretty far distant future.
02:35:59.800 | And so the meaning of life is to make sure
02:36:01.720 | that for as long as there are human beings,
02:36:04.920 | they are able to live the kind of fulfilled lives,
02:36:09.920 | ethically fulfilled, intellectually fulfilled,
02:36:14.280 | emotionally fulfilled lives,
02:36:16.240 | that civilization has made possible.
02:36:19.040 | It would be easy for us to revert to the uncivilized world.
02:36:24.040 | There's a fantastic book that I'm going to misremember,
02:36:29.520 | "Milosz's The Captive Mind,"
02:36:34.160 | which has a fantastic passage.
02:36:37.120 | He was a Polish intellectual who says,
02:36:43.000 | "Americans can never imagine what it's like
02:36:46.040 | for civilization to be completely destroyed
02:36:49.320 | as it was in Poland by the end of World War II,
02:36:52.560 | to have no rule of law,
02:36:54.280 | to have no security of even person,
02:36:56.760 | nevermind property rights.
02:36:58.200 | They can't imagine what that's like
02:36:59.840 | and what it will lead you to do."
02:37:01.560 | So one reason for teaching history
02:37:04.560 | is to remind the lucky Generation Z members of California
02:37:09.560 | that civilization's a thin film.
02:37:15.840 | And it can be destroyed remarkably easily.
02:37:18.680 | And to preserve civilization
02:37:20.240 | is a tremendous responsibility that we have.
02:37:23.600 | It's a huge responsibility.
02:37:25.880 | And we must not destroy ourselves,
02:37:28.320 | whether it's in the name of wokeism
02:37:31.000 | or the pursuit of the metaverse.
02:37:33.720 | Preserving civilization and making it available,
02:37:36.120 | not just to our kids, but to people we'll never know,
02:37:39.080 | generations ahead, that's the meaning.
02:37:42.800 | - And do so by studying the lessons of history.
02:37:47.080 | - Right.
02:37:47.920 | Not only studying them, but then acting on them.
02:37:50.560 | For me, the biggest problem is
02:37:52.120 | how do we apply history more effectively?
02:37:55.140 | It seems as if our institutions, including government,
02:37:58.440 | are very, very bad at applying history.
02:38:01.640 | Lessons of history are learned poorly, if at all.
02:38:04.440 | Analogies are drawn crudely.
02:38:06.200 | Often the wrong inferences are drawn.
02:38:08.440 | One of the big intellectual challenges for me
02:38:10.240 | is how to make history more useful.
02:38:14.080 | And this was the kind of thing that professors used to hate,
02:38:17.000 | but really practically useful,
02:38:18.700 | so that policymakers and citizens
02:38:21.800 | can think about the decisions that they face
02:38:23.920 | with a more historically informed body of knowledge.
02:38:28.920 | Whether it's a pandemic, the challenge of climate change,
02:38:31.720 | what to do about Taiwan.
02:38:33.520 | I can't think of a better set of things to know
02:38:38.720 | before you make decisions about those things
02:38:41.040 | than the things that history has to offer.
02:38:43.400 | - Well, I love the discipline of applied history.
02:38:45.520 | Basically going to history and saying,
02:38:47.560 | "What are the key principles here
02:38:52.000 | "that are applicable to the problems of today,
02:38:54.600 | "and how can we solve them?"
02:38:55.440 | - The great philosopher of history, R.G. Collingwood,
02:38:59.280 | said in his autobiography, which was published in 1939,
02:39:02.960 | that the purpose of history
02:39:05.040 | was to reconstitute past thought
02:39:08.640 | from whatever surviving remnants there were,
02:39:12.080 | and then to juxtapose it with our own predicament.
02:39:16.120 | And that's that juxtaposition of past experience
02:39:18.960 | with present experience that is so important.
02:39:21.360 | We don't do that well.
02:39:23.720 | And indeed, we've flipped it
02:39:25.720 | so that academic historians now think their mission
02:39:28.160 | is to travel back to the past
02:39:30.440 | with the value system of 2021,
02:39:33.080 | and castigate the dead for their racism,
02:39:37.200 | and sexism, and transphobia, and whatnot.
02:39:40.080 | And that's exactly wrong.
02:39:42.120 | Our mission is to go back and try to understand
02:39:43.920 | what it was like to live in the 18th century,
02:39:46.560 | not to go back and condescend to the people of the past.
02:39:50.720 | And once we've had a better understanding,
02:39:52.840 | once we've seen into their lives, read their words,
02:39:55.280 | tried to reconstitute their experience,
02:39:57.120 | to come back and understand our own time better,
02:40:00.380 | that's what we should really be doing.
02:40:01.960 | But academic history's gone completely haywire,
02:40:04.160 | and it does almost the exact opposite
02:40:05.760 | of what I think it should do.
02:40:07.560 | - And by studying history, walk beautifully,
02:40:11.240 | gracefully through this simulation, as you described,
02:40:14.440 | by mapping the lessons of history into the world of today.
02:40:17.800 | - We have virtual reality already in our heads.
02:40:20.600 | We do not need Oculus and the metaverse.
02:40:24.480 | - This was an incredible, hopeful conversation.
02:40:27.120 | In many ways that I did not expect,
02:40:29.560 | I thought our conversation would be much more about history
02:40:31.920 | than about the future, and it turned out to be the opposite.
02:40:35.120 | Thank you so much for talking today.
02:40:36.360 | It's a huge honor to finally meet you, to talk to you.
02:40:38.760 | Thank you for your valuable time.
02:40:40.680 | - Thank you, Lex, and good luck with Putin.
02:40:43.520 | - Thanks for listening to this conversation
02:40:45.240 | with Neil Ferguson.
02:40:46.540 | To support this podcast,
02:40:47.920 | please check out our sponsors in the description.
02:40:50.720 | And now, let me leave you with some words
02:40:52.720 | from Neil Ferguson himself.
02:40:54.720 | "No civilization, no matter how mighty it may appear
02:40:57.880 | "to itself, is indestructible."
02:41:01.440 | Thank you for listening, and hope to see you next time.
02:41:04.380 | (upbeat music)