Norman Naimark: Genocide, Stalin, Hitler, Mao, and Absolute Power | Lex Fridman Podcast #248
Chapters
0:00 Introduction
0:20 Stalin and absolute power
14:17 Dictators and genocide
38:43 What is genocide
48:50 Human nature and suffering
78:35 Mao's Great Leap Forward
85:49 North Korea
89:42 Our role in fighting against atrocities
98:38 China
102:47 Hopes for the future and technology
117:40 Advice for young people
120:27 Love and tragedy
00:00:00.000 |
The following is a conversation with Norman Naimark, 00:00:03.360 |
a historian at Stanford specializing in genocide, 00:00:15.040 |
And now, here's my conversation with Norman Naimark. 00:00:22.500 |
not just for him, but for the people of the Soviet Union 00:00:27.840 |
I mean, Stalin believed that socialism was the be all 00:00:38.440 |
and in Lenin's tradition, this was what he believed. 00:00:46.320 |
other kinds of things he believed or thought or did. 00:00:55.880 |
no, he absolutely thought it was in the interest 00:00:59.480 |
And in fact, that the world was one day going to go socialist 00:01:05.840 |
And eventually, in the International Revolution. 00:01:08.240 |
- So given the genocide in the 1930s that you described, 00:01:25.500 |
you know, which has a lot of kind of religious 00:01:30.680 |
And in that sense, yes, I think he was an evil man. 00:01:42.080 |
He was completely indifferent to the suffering of others. 00:02:02.780 |
It's, you know, it's a word for moral philosophers, 00:02:14.040 |
And there is a wonderful historian at Princeton, 00:02:17.980 |
a political scientist actually named Robert Tucker, 00:02:20.620 |
who said he suffered from a paranoid delusional system. 00:02:25.620 |
And I always remember that of Tucker's writing 00:02:30.320 |
because what Tucker meant is that he was not just paranoid, 00:02:39.920 |
But that he constructed whole plots of people, 00:02:47.440 |
whole systems of people who were out to get him. 00:02:51.340 |
So in other words, his delusions were that there were 00:02:56.540 |
who were out to diminish his power and remove him 00:03:01.500 |
from his position and undermine the Soviet Union in his view. 00:03:07.360 |
So yes, I think he did suffer from delusions. 00:03:12.360 |
And this had a huge effect because whole groups then 00:03:20.920 |
which he would construct based on these delusions. 00:03:29.340 |
I mean, I think most of the research that's gone on, 00:03:32.700 |
especially since the Stalin Archive was opened 00:03:38.820 |
and I think almost every historian who goes in that archive 00:03:51.900 |
a really excellent sense of political rhetoric, 00:03:56.900 |
a fantastic editor, you know, in a kind of agitational sense. 00:04:09.280 |
I mean, somebody who works from morning till night, 00:04:14.700 |
So his competence, I think, was really extreme. 00:04:20.440 |
you know, times in the '30s, times in the '20s, 00:04:23.540 |
times during the war, where he made mistakes. 00:04:28.940 |
But I think, you know, you look at his stuff, 00:04:31.220 |
you know, you look at his archives, you look what he did. 00:04:36.700 |
who in many, many different areas of enterprise, 00:04:43.540 |
that he should know everything and did know everything. 00:05:05.060 |
how many, you know, barrels they should produce 00:05:13.780 |
in other words, what people were putting down there. 00:05:16.700 |
So he was, you know, his competence ranged very wide, 00:05:20.860 |
or at least he thought his competence ranged very wide. 00:05:25.760 |
- If we look at this paranoid, delusional system, 00:05:29.900 |
He is, many argue, one of the most powerful men in history. 00:05:45.540 |
of the early 1930s, this paranoid, delusional system, 00:05:59.620 |
was that always inevitable, essentially, in this man, 00:06:14.500 |
I mean, the man without his position and without his power, 00:06:18.740 |
you know, wouldn't have been able to accomplish 00:06:21.060 |
what he eventually did in the way of murdering people, 00:06:28.860 |
So, you know, I don't, it wasn't sort of in him. 00:06:42.380 |
People used to say, you know, the father beat him up, 00:06:45.780 |
and it turns out, actually, it wasn't the father, 00:06:49.340 |
But basically, you know, he was not an unusual, 00:06:56.980 |
And, you know, it was the growth of the Soviet system, 00:07:04.720 |
I mean, his own development within the Soviet system, 00:07:07.780 |
I think that led, you know, to the kind of mass killing 00:07:15.340 |
You know, he essentially achieved complete power 00:07:19.680 |
by the early 1930s, and then as he rolled with it, 00:07:24.680 |
as you would say, you know, or people would say, 00:07:32.740 |
and there was no, you know, there were no checks 00:07:36.660 |
and balances, obviously, on that murderous system. 00:07:39.820 |
And not only that, you know, people supported it 00:07:46.900 |
I mean, he was a superb, you know, political manipulator 00:07:54.700 |
And, you know, we've got new transcripts, for example, 00:08:01.700 |
of, you know, Politburo meetings in the early 1930s. 00:08:05.940 |
And you read those things, and you read, you know, 00:08:07.660 |
he uses humor, and he uses sarcasm, especially. 00:08:12.660 |
He uses verbal ways to undermine people, you know, 00:08:27.380 |
and he does it with, you know, a kind of skill 00:08:36.060 |
and on the other hand, of course, is terrible, 00:08:41.740 |
creating the system of terror that he creates. 00:08:50.620 |
I just wonder how much of it is a slippery slope 00:08:59.460 |
even a single person, but thousands and millions. 00:09:44.980 |
to go on this slippery slope that ends in genocide. 00:09:49.020 |
- There are a lot of questions in those questions, 00:09:59.160 |
you know, Stalin wasn't the most likely successor of Lenin. 00:10:12.660 |
that made it possible for Stalin to seize power. 00:10:19.540 |
you know, if you would just know him in 1925, 00:10:22.820 |
I don't think anybody would say, much less himself, 00:10:30.980 |
and thought he was, you know, a mindless bureaucrat. 00:10:35.980 |
You know, others were less mistrustful of him, 00:10:43.580 |
and political maneuvering that was very successful. 00:10:51.940 |
doesn't really begin until the 1930s, in my view. 00:11:05.860 |
of the Five-Year Plan and collectivization go through, 00:11:10.060 |
once he reverses himself and is able to reverse himself 00:11:17.140 |
to give various nationalities their, you know, 00:11:38.060 |
in the political system, you have the suicide of his wife, 00:11:41.140 |
you have all these things come together in '32, '33, 00:11:50.540 |
in other words, that bad things are gonna happen. 00:11:53.440 |
And people start seeing that, too, around him. 00:11:57.820 |
They start seeing that it's not a slippery slope, 00:12:00.940 |
it's a dangerous, it's a dangerous situation, 00:12:05.940 |
which is emerging, and some people really understand that. 00:12:10.100 |
So I don't, I really do see a differentiation, then, 00:12:14.700 |
I mean, it's true that Stalin, during the Civil War, 00:12:17.420 |
there's a lot of, you know, good research on that, 00:12:36.620 |
But I don't really see that as kind of the necessary stage 00:12:40.800 |
for the next thing that came, which was the '30s, 00:12:46.780 |
you know, where everybody's afraid for their lives 00:12:54.420 |
and that sort of thing becomes a common part, 00:13:04.820 |
They'll argue this kind of Lenin-Stalin continuity debate, 00:13:24.380 |
because of Marxism-Leninism, because of the ideology, 00:13:40.100 |
that what Stalin became had nothing to do with Lenin 00:13:47.380 |
But, you know, he takes it one major step further. 00:13:51.580 |
And again, that's why I don't like the slippery slope, 00:13:58.300 |
And we call, you know, I mean, historians talk 00:14:01.300 |
about the Stalin revolution, you know, in '28 and '29, 00:14:11.540 |
through the Five-Year Plan, collectivization, 00:14:21.660 |
the Soviet terror famine in Ukraine in '32 and '33? 00:14:29.720 |
but let me try to be as succinct as I can be. 00:14:42.060 |
an all-union famine that is the result of collectivization. 00:14:47.900 |
You know, collectivization was a catastrophe. 00:14:52.180 |
You know, the more or less, the so-called kulaks, 00:15:00.200 |
Anybody with a tin roof and a cow was considered a kulak, 00:15:09.180 |
So these kulaks, we're talking millions of them, right? 00:15:17.220 |
heavily agricultural area, and Ukrainian peasants, 00:15:27.700 |
than even the Russian peasants resisted collectivization, 00:15:32.060 |
suffered during this collectivization program. 00:15:35.480 |
And they, you know, burned sometimes their own houses, 00:15:40.540 |
They were shot, you know, sometimes on the spot. 00:15:45.120 |
Tens of thousands and others were sent into exile. 00:15:50.020 |
So there was a conflagration in the countryside. 00:15:53.500 |
And the result of that conflagration in Ukraine 00:15:58.140 |
And again, there was famine all over the Soviet Union, 00:16:27.500 |
and it's going to be a very impressive set of exhibits, 00:16:32.060 |
and talk with historians all the time about it. 00:16:34.300 |
So what happens in '32, '33, a couple of things. 00:16:37.440 |
First of all, Stalin develops an even stronger, 00:18:46.280 |
'cause they already had an antipathy for the Ukrainians, 00:16:49.280 |
an even stronger antipathy for the Ukrainians in general. 00:16:55.960 |
Second of all, he's not getting all the grain he wants 00:17:02.040 |
And so he sends in, then, people to expropriate the grain, 00:17:09.500 |
These teams of people, you know, some policemen, 00:17:19.320 |
go into the villages, and forcibly seize grain, 00:17:30.960 |
this is all over the Soviet Union, in '32 especially. 00:18:01.520 |
That they're basically not loyal to the Soviet Union, 00:18:09.840 |
you know, I think it's Kaganovich, he says it too, 00:18:21.960 |
essentially, break the back of this peasantry. 00:18:26.960 |
is by going through another expropriation program, 00:18:30.580 |
which is not done in the rest of the Soviet Union. 00:18:42.780 |
including kids, with death if they steal any grain, 00:18:58.820 |
which are not introduced into the rest of the Soviet Union. 00:19:05.260 |
are not allowed to leave their villages anymore. 00:19:07.960 |
They can't go to the city to try to find some things. 00:19:15.940 |
in Kharkiv and in Kiev and in places like that 00:19:21.300 |
and get to the cities, but now they can't leave. 00:19:28.180 |
Belarus today, or to Russia, you know, to get any food. 00:19:32.860 |
There's no, he won't allow any relief to Ukraine. 00:19:36.820 |
A number of people offer relief, including the Poles, 00:19:44.500 |
He won't admit that there's a famine in Ukraine. 00:19:47.340 |
And instead what happens is that Ukraine turns 00:20:06.100 |
of bodies just lying everywhere, you know, people dead 00:20:10.500 |
and dying, you know, of hunger, which is, by the way, 00:20:15.500 |
I mean, as you know, I've spent a lot of time 00:20:20.800 |
I don't think there's anything worse than dying of hunger 00:20:24.540 |
I mean, you see terrible ways that people die, right? 00:20:27.620 |
But dying of hunger is just such a horrible, horrible thing. 00:20:31.660 |
And so, for example, we know there were many cases 00:20:43.620 |
And again, you know, we started with this question 00:21:01.680 |
- What about the opposite of joy for teaching them a lesson? 00:21:47.600 |
or was this strategy based on emotion and anger? 00:21:51.580 |
- No, well, I think actually the right answer 00:21:59.100 |
- A little of both? - No, you can't, you can't. 00:22:12.080 |
And by the way, Russians still don't like it, right? 00:22:20.900 |
As I said, you know, Stalin writes, you know, 00:22:24.760 |
we'll lose Ukraine, you know, if these guys win. 00:22:29.440 |
You know, so there's a kind of long-term determination, 00:22:33.300 |
as I said, you know, to kind of break the back 00:22:37.260 |
of Ukrainian national identity and Ukrainian nationalism 00:22:58.560 |
with the Ukrainians for resisting collectivization 00:23:05.340 |
not conforming, you know, to the way he thinks 00:23:20.160 |
to control Ukraine and not be able to control the situation. 00:23:27.040 |
Well, how come the breadbasket isn't turning over to me 00:23:39.880 |
when things fail, and this is absolutely typical of Stalin, 00:23:43.760 |
when things fail, he blames it on other people 00:23:50.020 |
So a little bit of both, I think, is the right answer. 00:23:53.720 |
- This blame, it feels like there's a playbook 00:23:59.580 |
I just wonder if it comes naturally or just kind of evolves. 00:24:04.000 |
There's blaming others and then telling these narratives 00:24:14.720 |
for it not to be a naturally emergent strategy 00:24:20.680 |
- I mean, that's a good, it's a very good point. 00:24:23.080 |
And I think it's one, you know, that has its merits. 00:24:30.400 |
that there's certain kinds of strategies by dictators 00:24:40.040 |
I've written about Mao and Pol Pot and Hitler. 00:24:46.840 |
a kind of playbook for political dictatorship. 00:24:51.800 |
Also for a kind of communist totalitarian way of functioning. 00:24:56.800 |
And that way of functioning was described already 00:25:12.720 |
The real question, it seems to me, is to what extent, 00:25:25.000 |
I mean, Fidel Castro was not a nice man, right? 00:25:27.480 |
He was a dictator, he was a terrible dictator. 00:25:33.280 |
Ho Chi Minh was a dictator, a communist dictator 00:25:39.800 |
went to Moscow, spent time in Moscow in the '30s 00:25:43.360 |
and went on to found the Vietnamese Communist Party. 00:25:57.920 |
I mean, I would even argue, others will disagree, 00:26:14.960 |
And, you know, he's still responsible for a gulag 00:26:21.320 |
and imprisonment and torture and that sort of thing. 00:26:25.400 |
So there are some, you know, like Stalin, like Mao, 00:26:29.800 |
like Pol Pot, you know, who commit these horrible, 00:26:44.680 |
Well, you know, the difference is partly in personality, 00:26:54.200 |
- How much do you connect the ideas of communism 00:26:57.040 |
or Marxism or socialism to Holodomor, to Stalin's rule? 00:27:14.400 |
by saying it doesn't always lead to genocide. 00:27:26.800 |
You know, even North Korea, as awful as it is, 00:27:32.480 |
and people's rights are totally destroyed, right? 00:27:45.120 |
or whether if they took over South Korea, you know, 00:27:47.480 |
mass murder wouldn't take place and that kind of thing. 00:27:59.360 |
that makes genocide sometimes too easily possible, 00:28:04.280 |
given, you know, the way it thinks through history 00:28:09.200 |
as being, you know, you're on the right side of history, 00:28:11.600 |
and some people are on the wrong side of history, 00:28:22.480 |
has that kind of language and that kind of thinking. 00:28:34.080 |
named Martin Malia, who has written, you know, 00:28:39.600 |
and he was very, very, he was convinced that the, 00:28:44.600 |
you know, that the ideology itself, you know, 00:28:58.600 |
I think there are, you could argue it different ways, 00:29:01.640 |
equally valid, you know, with equally valid arguments. 00:29:05.640 |
- I mean, there's something about the ideology of communism 00:29:10.640 |
that allows you to decrease the value of human life, 00:29:16.440 |
if it's okay to crack a few eggs to make an omelet. 00:29:19.880 |
- So maybe that, if you can reason like that, 00:29:26.240 |
for the good of the country, for the good of the people, 00:29:28.460 |
for the good of the world, it's okay to kill a few people. 00:29:31.640 |
And then that's where, I wonder about the slippery slope. 00:29:48.640 |
I don't like Marxism-Leninism any better than the next guy. 00:30:06.500 |
But they're not necessarily murderous systems. 00:30:11.500 |
They are systems that contain people's autonomy, 00:30:15.620 |
that force people into work and labor and lifestyles 00:30:34.200 |
where people lived lives they didn't want to live, 00:30:37.160 |
you know, and didn't have the freedom to choose. 00:30:46.560 |
And they, you know, subscribe to Marxism-Leninism. 00:30:51.520 |
- So I suppose it's important to draw the line 00:30:54.200 |
between mass murder and genocide and mass murder 00:31:04.020 |
- And the leap to mass murder, you're saying, 00:31:08.800 |
may be easier in some ideologies than others, 00:31:21.080 |
how much of it is a single charismatic leader? 00:31:30.640 |
How much of it is just dumb luck, or the opposite of luck? 00:31:35.060 |
Do you have a sense where if you look at a moment 00:31:45.680 |
whether something bad's going to happen here. 00:31:49.520 |
When you look at Iraq, when Saddam Hussein first took power, 00:32:00.240 |
So you said, you already kind of answered that 00:32:02.200 |
with Stalin saying there's no way you could have 00:32:11.080 |
In other words, I mean, history is a wonderful, 00:32:14.960 |
you know, discipline and way of looking at life 00:32:17.520 |
and the world in retrospect, meaning it happened. 00:32:25.000 |
And it's too easy to say sometimes it happened 00:32:43.820 |
contingency and choice and difference and different paths 00:33:02.900 |
I mean, you can think, well, something's going to happen. 00:33:09.620 |
I mean, I'm thinking about an example right now, 00:33:15.020 |
and eventually eventuated in genocide in Bosnia. 00:33:18.400 |
And, you know, I remember very clearly, you know, 00:33:25.860 |
and people would say, you know, there's trouble here. 00:33:33.920 |
thought that there would be outright war between them all. 00:33:36.900 |
Then the outright war happened, genocide happened, 00:33:39.300 |
and afterwards people would say, I saw it coming. 00:33:44.620 |
especially with pundits and journalists and that. 00:33:51.060 |
You know, well, I mean, what happens in the human mind 00:33:53.620 |
and it happens in your mind too is, you know, 00:33:58.180 |
I mean, think about January 6th, you know, in this country 00:34:05.020 |
or before January 6th, you know, after the lost election. 00:34:10.020 |
You know, things could have gone in lots of different ways 00:34:16.300 |
but nobody really knew how it was gonna turn out. 00:34:22.000 |
that there'd be this cockamamie uprising on January 6th, 00:34:25.420 |
you know, that almost, you know, caused this enormous grief. 00:34:28.980 |
So all of these kinds of things in history, you know, 00:34:33.620 |
They depend on, you know, factors that we cannot predict 00:34:37.300 |
and, you know, and it's the joy of history that it's open. 00:34:41.380 |
You know, you think about how people are now, 00:34:46.220 |
but, you know, there's the environmental example. 00:34:51.020 |
We know it's coming, we know there's trouble, right? 00:34:54.440 |
We know there's gonna be a catastrophe some point, 00:35:00.980 |
- Yeah, what's the nature of the catastrophe? 00:35:02.380 |
Everyone says catastrophe. - And what's the nature of it? 00:35:07.380 |
Is it gonna be like mass migration of different kinds 00:35:10.780 |
that leads to some kind of conflict and immigration? 00:35:22.220 |
- That's my point, that's my point, that's my point. 00:35:29.700 |
I mean, the warming business and all this kind of stuff, 00:35:39.660 |
and then you get somewhere in 50 years or 60 years, 00:35:56.360 |
You simply cannot predict what's going to happen. 00:35:59.020 |
It's kind of when you just look at Hitler in the '30s, 00:36:02.260 |
for me, oftentimes when I kind of read different accounts, 00:36:09.100 |
but in general, me just reading about Hitler, 00:36:21.300 |
No, no, no, with Stalin, you don't get a sense he's a clown. 00:36:25.940 |
You don't think it would lead to mass murder, 00:36:28.540 |
but you think he's going to build a giant bureaucracy, 00:36:42.380 |
you certainly don't think about the atrocities, 00:37:00.420 |
and all of a sudden he gains more and more power, 00:37:04.980 |
he makes strategic decisions in terms of cooperating 00:37:17.340 |
I mean, this clown is one of the most impactful 00:37:28.020 |
and are being screamed at and discriminated against, 00:37:30.580 |
and there's a series of measures taken against them 00:37:33.860 |
incrementally during the course of the 1930s, 00:37:58.540 |
who should have been very sensitive to what was going on, 00:38:01.540 |
didn't really understand the extent of the danger. 00:38:31.080 |
escalates, escalates, and then war breaks out 00:38:44.880 |
What are the defining characteristics of genocide? 00:38:52.300 |
There is a definition, the December 1948 UN Convention 00:39:02.240 |
is considered the sort of major document of definition 00:39:16.440 |
of an ethnic, national, racial, or religious group. 00:39:21.440 |
Those are the four groups again, comma as such. 00:39:31.920 |
In other words, there's a kind of beauty in human diversity 00:39:39.600 |
Estonians, you know, a tribe of Native Americans, 00:39:43.520 |
South African tribes, you know, the Rohingya in Myanmar. 00:39:54.600 |
You know, this was a notion that emerges really 00:40:00.960 |
in the beginning of the 19th century with Herder mostly. 00:40:04.520 |
And this beauty of these groups then, you know, 00:40:15.920 |
You know, the idea is that it's intentional destruction. 00:40:19.880 |
So this is a kind of, you know, analogy to first degree, 00:40:24.880 |
second degree, and third degree murder, right? 00:40:28.320 |
you're out to kill this person and you plan it 00:40:37.840 |
You end up doing the same thing, but it's different. 00:40:41.000 |
So, you know, the major person behind the definition 00:40:50.400 |
but he was a Polish Jewish jurist who came, you know, 00:40:55.200 |
from Poland, came to the United States during the war 00:40:58.360 |
and had been a kind of crusader for recognizing genocide. 00:41:11.300 |
and then published it in 1944 for the first time. 00:41:14.900 |
"Genos" meaning people and "-cide" meaning killing, right? 00:41:30.500 |
the problems with the definition are several. 00:41:33.900 |
You know, one of them is, is it just these four groups, 00:41:38.460 |
you know, racial, religious, ethnic, or national? 00:41:44.620 |
And what's in people's minds in 1948 are Jews, Poles, 00:41:49.500 |
Russians, Yugoslavs sometimes who were killed by the Nazis. 00:41:54.580 |
But there are other groups too, if you think about it, 00:41:56.980 |
you know, who are killed, social groups or political groups. 00:42:12.860 |
Let's say Kulaks, for example, to be considered. 00:42:16.660 |
That's a social group or peasants, which is a social group. 00:42:30.340 |
but non-communists also killed groups of people. 00:42:32.500 |
In Indonesia in 1965, '66, they killed, you know, 00:42:42.500 |
You know, and my point of view, it is genocide, 00:42:45.260 |
although it's Indonesians killing Indonesians. 00:42:48.380 |
And we have the same problem with the Cambodian genocide. 00:42:53.860 |
but most of the people killed in the Cambodian genocide 00:43:04.340 |
meaning the Vietnamese, the Cham people, who are, you know, 00:43:08.780 |
Muslim, smaller Muslim people in the area, and a few others. 00:43:17.460 |
well, does it have to be a different nationality 00:43:20.260 |
or ethnic group or religious group for it to be genocide? 00:43:26.420 |
It's a little bit like with our constitution. 00:43:29.860 |
but we don't live in the end of the 18th century, right? 00:43:38.140 |
And similarly, the genocide convention needs updating too. 00:43:53.500 |
and the group is united by some unique characteristics? 00:43:57.660 |
That was an invention in human history, this idea? 00:44:15.860 |
And what the reality is always much more complicated 00:44:18.540 |
than the construction or the invention of a term 00:44:21.940 |
or a concept or a way of thinking about a nation, right? 00:44:26.420 |
And this way of thinking of nations, you know, 00:44:29.740 |
as again, you know, groups of religious, linguistic, 00:44:50.380 |
There are no Italians in the 17th century, right? 00:44:54.860 |
the invention of the nation, which comes again, 00:45:03.700 |
a man named Johann Gottfried von Herder, right? 00:45:10.380 |
who sort of went around, collected people's languages 00:45:17.740 |
"Isn't this cool, you know, that they're Estonians 00:45:20.620 |
and that they're Latvians and that they're these other, 00:45:35.020 |
then people start to say, "Hey, we're nations too." 00:45:38.660 |
You know, and the Germans decide they're a nation 00:45:40.820 |
and they unify and the Italians discover they're a nation 00:45:46.260 |
Florentines and Romans and, you know, Sicilians. 00:45:51.100 |
- But then beyond nations, there's political affiliations, 00:45:58.260 |
you start, you look at the early Homo sapiens 00:46:10.700 |
And then you have warring tribes probably connected 00:46:16.140 |
But it's fascinating to think that that is then taken 00:46:29.980 |
based on maybe even, so political philosophical ideas, 00:46:44.740 |
- And that comes more towards the end of the 19th century, 00:46:58.100 |
the natural competition, the weak ones fall away, 00:47:03.220 |
You know, you get this sort of combination also 00:47:09.980 |
the racial thinking at the end of the 19th century 00:47:14.060 |
So now, you know, at the end of the 19th century 00:47:27.060 |
- As long as you speak the language and you, you know, 00:47:28.900 |
you dress and think and act and share the culture. 00:47:32.420 |
By the end of the 19th century, people are saying, 00:47:37.940 |
They have different blood, they have different, 00:47:45.420 |
there's this sense of superiority too and inferiority. 00:47:55.380 |
and we have to, you know, and Hitler, by the way, 00:48:03.020 |
at the end of the 19th and beginning of 20th century 00:48:09.300 |
that, you know, basically pervades this racial thinking. 00:48:19.820 |
I mean, it's not bad, you know, to share culture 00:48:32.740 |
But then what happens is it becomes, you know, 00:48:35.500 |
frequently is used and becomes, especially on fascism, 00:48:44.540 |
when the two conflicting groups share geographic location. 00:48:49.780 |
- So like with Jews, you know, I come, you know, 00:48:54.220 |
I'm a Russian Jew and it's always interesting. 00:48:58.740 |
I take pride in, you know, I love the tradition 00:49:10.140 |
They have a beautiful tradition in literature and science 00:49:19.300 |
but sometimes correct me that I'm not Russian. 00:49:37.320 |
And then when they're living in the same place 00:49:55.600 |
- You know, that's a big question in and of itself. 00:50:02.640 |
You know, the human heart is full of everything, right? 00:50:06.720 |
it's full of indifference, it's full of apathy, 00:50:10.760 |
So, I mean, hate is something, you know, that, 00:50:14.980 |
I mean, I think, and, you know, along with hate, 00:50:22.720 |
you know, the ability to really hurt and injure people 00:50:30.400 |
And it's just something that's part of who we are 00:50:40.040 |
and our society can do with us often what it wishes. 00:50:55.120 |
I mean, because, you know, you have more or less 00:51:04.440 |
in which you would have to then do nasty things to other people. 00:51:08.660 |
Some societies, as we talked about, you know, 00:51:12.760 |
are more, have proclivities towards, you know, 00:51:17.000 |
asking of its people to do things they don't want to do 00:51:27.880 |
to be able to choose not to do evil is a great thing. 00:51:30.920 |
You know, whereas in some societies, you know, 00:51:33.940 |
you feel in some ways for, not so much for the NKVD bosses, 00:51:38.940 |
but for the guys on the ground, you know, in the 1930s 00:51:44.880 |
but for the guys, you know, in the police battalion 00:51:49.120 |
that were told go shoot those Jews, you know? 00:51:59.680 |
but because your social, you know, your social situation, 00:52:12.560 |
rereading Viktor Frankl's "Man's Search for Meaning," 00:52:22.360 |
"The mere knowledge that a man was either a camp guard 00:52:31.160 |
even those which as a whole, it would be easy to condemn." 00:52:36.160 |
So that's speaking to, you feel for those people 00:52:53.400 |
what will happen if you were given those same orders? 00:52:59.040 |
You know, what kind of reaction would you have 00:53:10.360 |
while fighting for almost any country that I was born in. 00:53:23.160 |
Just love of community, and countries want such community. 00:53:42.360 |
or you're in the police force and you're ordered, 00:54:03.480 |
And over time, it's again, that slippery slope. 00:54:05.840 |
'Cause you could see all the people who protest, 00:54:12.600 |
So like, if you actually want to practically help somehow, 00:54:18.560 |
that you can't, one person can't possibly help. 00:54:21.060 |
And then you have a family, so you want to make, 00:54:25.480 |
You tell all of these stories, and over time, 00:54:28.000 |
you naturally convince yourself to dehumanize the other. 00:54:35.200 |
mostly because I worry that I wouldn't be a good German. 00:54:44.600 |
And one of the, you know, one of my tasks as a teacher, 00:54:48.440 |
right, of our students, and I have, you know, 00:55:01.120 |
one of my tasks is to try to get the students 00:55:17.080 |
that this is in all of us, you know, it's in all of us. 00:55:22.760 |
and you just try to gird yourself up, you know, 00:55:25.600 |
to try to figure out ways that maybe you won't be complicit, 00:55:29.420 |
and that you learn how to stand by your principles, 00:55:33.600 |
but it's very hard, it's extremely difficult. 00:55:36.160 |
And you can't, the other interesting thing about it 00:55:40.320 |
Now, they've done a lot of studies of Poles, for example, 00:55:51.780 |
You know, sometimes it's the worst anti-Semites 00:55:57.520 |
And sometimes, you know, it's not predictable. 00:56:01.160 |
It's not as if the humanists among us, you know, 00:56:04.280 |
are the ones who, you know, consistently show up, you know, 00:56:15.720 |
to defend, you know, your fellow human beings. 00:56:21.560 |
and sometimes they do it for really simple reasons. 00:56:24.800 |
And sometimes people you would expect to do it don't, 00:56:32.580 |
- One thing I've learned in this age of social media 00:56:43.240 |
In terms of humanists, in terms of activists, 00:56:49.040 |
of declaring that you would do the right thing. 00:57:18.920 |
who kind of proclaim that they would do the right thing. 00:57:21.720 |
- And there are different kinds of integrity, too. 00:57:34.960 |
member of the underground who was in Majdanek, 00:57:53.700 |
On the other hand, he was something of an anti-Semite. 00:57:59.700 |
sometimes if Jews were his friends, he would help them. 00:58:02.840 |
And if they weren't, sometimes he was really mean to them. 00:58:06.360 |
You know, and you could, in their various levels, 00:58:11.480 |
a terrible social experiment in some ways, right? 00:58:24.040 |
in some situations and extraordinarily poorly in others. 00:58:33.480 |
It's, you know, I think we claim too much consistency 00:58:40.400 |
You know, they're not consistent anymore than we are, 00:58:43.620 |
- Well, let me ask you about human nature here 00:58:48.040 |
So first, what have you learned about human nature 00:58:56.840 |
What lessons, first of all, why is a difficult question, 00:59:04.840 |
that genocide is something that happens in the world? 00:59:07.800 |
- That's a really big and difficult question, right? 00:59:18.760 |
You know, which the answer there is frequently political, 00:59:24.380 |
meaning, you know, why Hitler ended up killing the Jews. 00:59:28.960 |
Well, it had a lot to do with the political history 00:59:32.020 |
of Germany and wartime history of Germany, right? 00:59:46.040 |
turning into anything of the kind of dictator he ended up 01:00:01.280 |
I mean, these are all essentially political movements 01:00:12.960 |
single party movement, and then is moved in directions 01:00:18.520 |
The other question, let's separate that question out. 01:00:22.620 |
The other question is why do ordinary people participate? 01:00:33.780 |
Just saying, you know, go get them is not enough. 01:00:47.060 |
You know, people who have to pull the triggers 01:00:52.220 |
And grab people, you know, in Kiev in September, 1941, 01:01:03.900 |
You know, and those are all sorts of different questions. 01:01:06.740 |
The question of, you know, especially the lower level, 01:01:13.820 |
is a question which I think we've been talking about, 01:01:33.380 |
I mean, one of the shocking things that I learned 01:01:35.700 |
just a few years ago, studying the Holocaust, 01:01:41.860 |
In other words, if they order a police battalion 01:01:49.660 |
They weren't gonna, they never killed anybody. 01:02:01.980 |
You know, they give them booze to try to, you know, 01:02:04.620 |
numb the pain of murder, 'cause they know there is pain. 01:02:09.100 |
I mean, people experience pain when they murder people. 01:02:14.700 |
And so it's the character of who we are in society, 01:02:30.020 |
within that group, and who is the head of the group. 01:02:36.460 |
whether it's in the academy, right, at Stanford, 01:02:42.340 |
or whether it's in a church group in Tennessee, 01:02:45.020 |
or wherever, you know, people pay attention to each other, 01:02:56.260 |
Even though all of you think it's right, it's wrong. 01:03:00.900 |
especially in societies that are authoritarian, 01:03:08.100 |
Because it's harder, 'cause there's a backup to it, right? 01:03:10.940 |
There's the NKVD there, or there's the Gestapo there, 01:03:14.780 |
So you just, you know, they may not be forcing you to do it, 01:03:18.900 |
but your social being, plus this danger in the distance, 01:03:33.460 |
presumably, you know, you go to like middle management, 01:03:54.620 |
I mean, it happens in the case of the Armenian genocide, 01:04:03.660 |
They're removed, and you find a replacement very easily. 01:04:07.900 |
So, you know, you do see people who stand up, 01:04:10.580 |
and again, it's not really predictable who it will be. 01:04:13.700 |
I would maintain, I mean, I haven't done the study 01:04:24.500 |
there are people who do step aside every once in a while, 01:04:32.520 |
"Wait a minute, what is this business in Poland 01:04:34.340 |
"when they start to kill Jews, or in Belarusia?" 01:04:40.420 |
You know, if they don't do their job, they're pushed aside. 01:04:55.340 |
the human mind, from the victims of genocide? 01:05:02.340 |
to discover meaning and beauty, even in suffering. 01:05:05.980 |
Is there something to be said about, you know, 01:05:21.140 |
with a very pessimistic view of human nature, 01:05:28.940 |
I mean, the victims will eat their children, right? 01:05:38.740 |
The victims will form hierarchies within victimhood. 01:05:49.940 |
And there's, in Majdanek, at a certain point in '42, 01:05:55.860 |
a group of Slovak Jews were arrested and sent to Majdanek. 01:06:16.460 |
And for a variety of different reasons within the camp, 01:06:19.900 |
and again, this shows you the diversity of the camps, 01:06:22.340 |
and also, you know, these images of black and white 01:06:29.060 |
I mean, they basically had all the important jobs 01:06:31.160 |
in the camp, including jobs like beating other Jews. 01:06:35.780 |
And persecuting other Jews, and persecuting other peoples, 01:06:48.740 |
because of what they were doing to the Poles, right? 01:06:57.520 |
because aren't these supposed to be the intervention, 01:07:08.780 |
So, you know, in this kind of work on Majdanek, 01:07:11.440 |
there's certainly parts of it that, you know, 01:07:24.920 |
You know, there's some very heroic Polish women 01:07:30.400 |
who end up having a radio show called Radio Majdanek, 01:07:33.760 |
which they put on every night in the women's camp. 01:07:36.440 |
Which is, you know, to raise people's spirits. 01:07:47.940 |
the horrors that they're experiencing around them. 01:07:51.080 |
And so you do see that, and you do see, you know, 01:07:54.680 |
human beings acting in support of each other. 01:08:00.440 |
But, you know, I mean, Primo Levi is one of my 01:08:10.360 |
And, you know, I don't think Primo Levi saw anything. 01:08:20.080 |
I mean, but he describes this kind of, you know, 01:08:37.480 |
but probably because of his sense of what happened 01:08:44.600 |
So I don't, especially in the concentration camps, 01:08:49.360 |
it's really hard to find people like Viktor Frankl, 01:09:04.280 |
People hung together, they tried to help each other, 01:09:06.360 |
but, you know, they were totally, totally caught 01:09:37.280 |
that ultimately, en masse, overpowers everything else. 01:09:41.040 |
If you just look at the story of human history, 01:09:43.400 |
the resistance to violence and mass murder and genocide 01:09:52.900 |
And it feels like a force that's more powerful 01:09:57.200 |
than whatever the dark momentum that leads to genocide is. 01:10:08.720 |
It's hard to tell the story of that little flame 01:10:14.060 |
that longing to connect to other human beings. 01:10:16.560 |
And there's something also about human nature, 01:10:18.920 |
and us as storytellers, that we're not very good 01:10:24.040 |
We're much better at telling the stories of atrocities. 01:10:27.040 |
- No, I think maybe I fundamentally disagree with you. 01:10:38.920 |
And I think it too easily goes out in a lot of people. 01:10:43.360 |
And I mean, like I say, I come away from this work 01:10:50.840 |
You know, there is this work by a Harvard psychologist, 01:10:56.560 |
- Yes, yes, Steven Pinker that shows over time, 01:11:00.280 |
and initially I was quite skeptical of the work, 01:11:03.420 |
but in the end I thought he was quite convincing 01:11:05.820 |
that over time the incidence of homicide goes down, 01:11:14.880 |
the incidence of genocide, except for the big blip, 01:11:18.880 |
you know, in the middle of the 20th century goes down. 01:11:30.120 |
I thought he was pretty convincing about that. 01:11:33.080 |
But think about, you know, we're modern people. 01:11:37.240 |
I mean, we've advanced so fast in so many different areas. 01:11:42.000 |
I mean, we should have eliminated this a long time ago, 01:11:50.880 |
we're still facing this business of genocide in Myanmar, 01:11:54.400 |
in Xinjiang, in, you know, Tigray, in Ethiopia, 01:12:04.280 |
we still have this thing that we cannot handle, 01:12:10.240 |
And, you know, again, you know, electric cars and planes 01:12:16.840 |
Think about the differences between 250 years ago 01:12:23.960 |
But the differences in genocide are not all that great. 01:12:30.440 |
I mean, there are problems with his methodology, 01:12:48.960 |
And, you know, I once, someone once said to me, 01:12:52.880 |
when I posed a similar kind of question to a seminar, 01:13:00.640 |
Well, I don't, you know, that's very Catholic, 01:13:02.660 |
and I don't really think in terms of original sin. 01:13:41.000 |
from the muck to see a possible sort of way out through love. 01:13:46.000 |
But it's not obvious that that's not the case. 01:13:50.500 |
You mentioned electric cars and rockets and airplanes. 01:13:54.800 |
To me, the more powerful thing is Wikipedia, the internet. 01:13:58.840 |
Only 50% of the world currently has access to the internet, 01:14:02.400 |
but that's growing in information and knowledge and wisdom, 01:14:08.200 |
As that grows, I think it becomes a lot more difficult 01:14:13.840 |
It becomes a lot more difficult for somebody like Hitler 01:14:18.400 |
because people think, and the masses, I think, 01:14:22.360 |
the people have power when they're able to think, 01:14:34.040 |
they can know about the fact that genocide happens, 01:14:36.340 |
how it occurs, how the promises of great charismatic leaders 01:14:44.820 |
And just even studying the fact that the Holocaust happened 01:15:01.580 |
that normal human beings, leaders that give big promises 01:15:19.800 |
and you learn to just, in your small, local way, 01:15:35.300 |
that in the end, I think in the end, love wins. 01:15:46.620 |
is more and more going to be an artifact of the past. 01:16:10.260 |
to other human beings is repeatedly demonstrated. 01:16:29.420 |
I mean, Syria used to be a country, you know? 01:16:46.580 |
I mean, the Turks have done nice things for the Syrians, 01:16:53.340 |
I mean, I'm not saying bad things only happen in the world. 01:17:17.300 |
But I think otherwise, it's just an article of faith. 01:17:29.980 |
you know, things should get bad or things like that. 01:17:33.940 |
It's the only way if you want to build a better future. 01:17:56.020 |
Because without that, they're not going to be able 01:18:12.820 |
So that's kind of the tension that we're living with. 01:18:19.540 |
But one of the ways to do that is to study history. 01:18:31.020 |
- Well, basically a lot of disciplines in science 01:18:35.220 |
Can you tell the story of China from 1958 to 1962, 01:18:47.400 |
that led to the deaths of tens of millions of people, 01:18:49.760 |
making it arguably the largest famine in human history. 01:18:54.380 |
- Yes, I mean, it was a terrible set of events 01:19:04.040 |
15 million, 17 million, 14 million, 20 million people died 01:19:21.000 |
Essentially, Mao and the Communist Party leadership, 01:19:28.480 |
decided he wanted to move the country into communism. 01:19:37.920 |
Mao was a good Stalinist, or at least felt like Stalin 01:19:43.240 |
was the right kind of communist leader to have, 01:19:48.080 |
and he didn't like what he thought were Khrushchev's reforms 01:19:57.140 |
So Khrushchev started talking about giving more power 01:20:01.720 |
and if you give more power to the party versus the state, 01:20:07.240 |
So what Mao decided to do was to engage in this vast program 01:20:11.400 |
of building what were called people's communes. 01:20:16.960 |
And these communes were enormous conglomerations 01:20:31.560 |
and there would be places for the kids to be raised 01:20:46.760 |
Their metal pots and pans to be melted to then make steel. 01:20:57.320 |
and the whole countryside would be transformed. 01:21:18.760 |
It was incredibly dysfunctional for Chinese agriculture 01:21:23.760 |
and ended up creating, as you mentioned, a terrible famine 01:21:30.840 |
that everybody understood was a famine as a result of this. 01:21:36.440 |
I mean, there were also some problems of nature 01:21:40.400 |
at the same time and some flooding and bad weather 01:21:45.760 |
And Mao said at one point, "Who cares if millions die? 01:21:55.360 |
I mean, he would periodically say things like this 01:21:57.800 |
that showed that like Stalin, he had total indifference 01:22:02.800 |
to the fact that people were dying in large numbers. 01:22:06.320 |
It led again to cannibalism and to terrible wastage 01:22:11.560 |
all over the country and millions of people died. 01:22:17.160 |
There were people in the party who began to kind of edge 01:22:24.160 |
and that he should back off, but he wouldn't back off. 01:22:28.480 |
And the result was catastrophe in the countryside 01:22:39.840 |
if peasants would object or if certain people would say, 01:22:49.160 |
So it was really, really a horrific set of events 01:23:05.960 |
of what was going on and eventually wrote books about it. 01:23:09.080 |
So we have, I mean, we have pretty good documentation, 01:23:19.120 |
this is a little bit separate with the Holodomor 01:23:23.120 |
where Ukrainians are now claiming 11.5 million people died 01:23:29.760 |
in the neighborhood of 4 million, 4.5 million maybe. 01:23:33.200 |
So you have wildly different numbers that come out 01:23:38.000 |
as you mentioned too, with the Great Leap Forward. 01:24:36.360 |
death is just one of the consequences of suffering. 01:24:43.320 |
months and then years of not having anything to eat 01:25:02.560 |
their ability to work, all those kinds of things. 01:25:07.720 |
there are people working on this subject now, 01:25:09.280 |
you know, the long-term effect of the hunger famine on them. 01:25:13.000 |
And I'm sure there's a similar kind of long-term effect 01:25:23.880 |
- And, you know, it's a really, you're absolutely right. 01:26:08.120 |
many people describe mass starvation going on. 01:26:11.320 |
Now in North Korea, when you think about genocide, 01:26:14.360 |
when you think about atrocities going on in the world today, 01:26:37.360 |
by the international community, UN basically, 01:26:43.120 |
which would then try genocide in the more modern period 01:26:54.840 |
The genocide, crimes against humanity, and war crimes. 01:27:23.040 |
And there are other kinds of mass rape and stuff like that. 01:27:33.480 |
And that's sort of where I think about North Korea 01:27:36.400 |
as committing crimes against humanity, not genocide. 01:27:39.200 |
And again, remember, genocide is meant to be, 01:27:48.840 |
Some people think of genocide as the crime of crimes, 01:27:51.920 |
the worst of the three that I just mentioned. 01:27:57.000 |
And the ICC, the International Criminal Court, 01:28:00.480 |
is dealing with them more or less as co-equal, 01:28:03.320 |
even though we tend to think of genocide as the worst. 01:28:11.680 |
I think it's sort of morally and ethically unseemly, 01:28:15.480 |
you know, the split hairs about what is genocide 01:28:20.720 |
You know, this is for lawyers, not for historians. 01:28:24.120 |
- Yeah, yeah, you know, you don't wanna get into that. 01:28:28.360 |
Because it, I mean, it happened with Darfur a little bit, 01:28:40.800 |
it wasn't genocide, it was a crime against humanity. 01:28:47.320 |
I mean, we damn well knew what was happening. 01:28:55.240 |
I think the whole concept and the way of thinking 01:28:58.360 |
about history using genocide as an important part 01:29:08.040 |
On the other hand, I don't like to, you know, 01:29:15.320 |
So that, you know, North Korea, I tend to think of, 01:29:18.600 |
like I said, as committing crimes against humanity 01:29:22.640 |
and, you know, forcibly incarcerating people, 01:29:30.160 |
depriving them of certain kinds of human rights 01:29:42.800 |
- What in this, if we think about, if it's okay, 01:30:21.400 |
what role do I have when I think about North Korea, 01:30:27.320 |
when I think about maybe the Uyghur population in China? 01:30:31.440 |
- Well, I mean, the role is the same role I have, 01:30:38.360 |
and to get the message out that this is happening, 01:30:45.760 |
the more likely it is that the United States government 01:30:50.680 |
within the context of who we are and where we live, right? 01:31:11.440 |
I guess that's not the, yes, so certainly this is true, 01:31:19.440 |
a rare example of somebody that has powerful reach 01:31:24.280 |
with words, but I was also referring to actions. 01:31:27.220 |
The United States government, what are the options here? 01:31:44.100 |
If there's anything that challenges my hope for the future, 01:31:48.500 |
is the fact that sometimes we're not powerless to help, 01:32:21.540 |
but it's also true, I think, that we can be more forceful. 01:32:25.260 |
I think we can be more forceful without necessarily war. 01:32:35.220 |
and this was an idea that came up after Kosovo, 01:32:47.520 |
they were gonna engage in a genocidal program in Kosovo, 01:32:50.560 |
and you know, it was basically a program of ethnic cleansing 01:32:56.240 |
not just driving people out, but beginning to kill them, 01:32:59.540 |
and the United States and Britain and others intervened, 01:33:08.840 |
and I think correctly, people have analyzed this as a case 01:33:19.920 |
In other words, the Serbs were stopped in their tracks. 01:33:25.640 |
and things like that, but you know, it was stopped, 01:33:32.320 |
then there was a kind of international consensus 01:34:06.480 |
and what it says is there's a hierarchy of measures 01:34:11.480 |
that should be, well, let me take a step back. 01:34:14.960 |
It starts with the principle that sovereignty of a country 01:34:19.960 |
is not, you don't earn it just by being there 01:34:27.300 |
You have to earn it by protecting your people. 01:34:32.840 |
with all the nations of the UN agreed, you know, 01:34:38.320 |
sovereignty is there because you protect your people 01:34:52.840 |
If you violate that justification for your sovereignty, 01:35:03.240 |
the international community has the obligation 01:35:11.840 |
of things you can do, you know, starting with, 01:35:17.020 |
but, you know, starting with kind of push and pull, 01:35:19.360 |
you know, trying to convince people, don't do that. 01:35:29.480 |
and you get to sanctions, or threatening sanctions, 01:35:32.300 |
and then sanctions, you know, like we have against Russia, 01:35:41.300 |
you get to military intervention at the bottom, 01:35:52.040 |
but it, just as you said, just as you pointed out, 01:35:58.200 |
And we'll do everything we can short, you know, 01:36:02.000 |
of military intervention, but, you know, if necessary, 01:36:09.080 |
And so the responsibility to protect, I think, is, 01:36:16.280 |
One of the things it says in this last category, right, 01:36:20.600 |
the military intervention, is that the intervention 01:36:23.500 |
cannot create more damage than it relieves, right? 01:36:34.020 |
you know, that, I mean, the international community, 01:36:39.680 |
even though the Russians were there, obviously, 01:36:41.500 |
we ended up being there, and that sort of thing, 01:36:43.260 |
but the international community basically said, 01:36:45.460 |
you know, there's no way you can intervene in Syria. 01:36:48.260 |
You know, there's just no way without causing more damage, 01:36:56.840 |
that's what the international community is saying about, 01:37:04.740 |
what hell would break loose if there was some kind 01:37:07.900 |
of military trouble, you know, to threaten the Chinese with. 01:37:12.740 |
But you can go down that list with, you know, 01:37:33.900 |
and, you know, you can go down that list and start pushing. 01:37:44.380 |
and in the, you know, right at the turn of the century, 01:37:56.300 |
we're not gonna do any of this kind of stuff. 01:38:08.700 |
I mean, here, I tend to be maybe more of an optimist than you. 01:38:14.580 |
that the international community can, you know, 01:38:25.660 |
to get the Chinese, and to get the, again, Myanmar, 01:38:40.060 |
That's the space of politicians, and UN, and so on. 01:38:52.080 |
that not many voices with power and with money speak up, 01:39:06.760 |
because it costs a lot for an individual to speak up, 01:39:17.040 |
Like, if you have a product, if you have a company, 01:39:20.040 |
and you say something negative, China just says, 01:39:22.120 |
"Okay, well, then they knock you out of the market." 01:39:29.800 |
It's a huge cost, sometimes millions or billions of dollars. 01:39:33.200 |
And so what happens is everybody of consequences, 01:39:36.720 |
sort of financially, everybody with a giant platform 01:39:41.240 |
It's a very, it's a different kind of hesitation 01:39:48.760 |
It seems like in history, people were quiet because of fear, 01:39:55.840 |
Here, there's almost like a self-interested preservation 01:40:06.240 |
I mean, I don't know if you can say something there, 01:40:14.440 |
because people are financially self-interested. 01:40:17.400 |
- Yeah, no, I think, I mean, I think the analysis is correct. 01:40:22.060 |
And it's not only, but it's not only corporations, 01:40:26.160 |
but it's, you know, it's the American government 01:40:34.040 |
not to challenge the Chinese on human rights issues. 01:40:39.400 |
- But the interesting thing is it's not just, 01:40:54.600 |
but more because they're going to also lose financially. 01:41:02.560 |
in fact, the Chinese government and the country 01:41:08.520 |
because it has a lot of elements that enable capitalism 01:41:13.880 |
So you have a lot of very successful companies, 01:41:19.960 |
many of whom have either been on this podcast, 01:41:25.960 |
they really don't want to say anything negative 01:41:32.840 |
is not the kind of fear you would have in Nazi Germany. 01:41:45.680 |
on my family, in terms of finance, strictly financially? 01:42:16.040 |
they kind of say, okay, take the monetary system, 01:42:19.720 |
the power to control money away from governments. 01:42:23.480 |
like, allow technology to help you with that. 01:42:28.760 |
In fact, a lot of people argue that kind of Bitcoin, 01:42:38.500 |
that lead to violations of basic human rights. 01:42:41.880 |
If you can control, if you can give the power 01:42:49.740 |
where technology might be able to do something good. 01:42:52.760 |
That's something different about the 21st century 01:42:59.680 |
- I mean, I have to say, I think you're a naive optimist. 01:43:04.640 |
I mean, I don't, I'm not someone who understands technology. 01:43:11.640 |
because I don't really spend much time with it. 01:43:15.960 |
And I'm not, I'm neither a fan nor a connoisseur. 01:43:37.240 |
I mean, that's the perfect example of technology. 01:43:39.520 |
You know, what happens when you discover new things. 01:43:42.120 |
It's a perfect example, what's going on with Facebook now. 01:43:53.680 |
and about all the things that were going to happen 01:43:55.760 |
and all these wonderful things like, you know, 01:43:57.540 |
you wouldn't have to translate yourself anything. 01:44:04.160 |
You don't have to do this, you don't have to do that. 01:44:07.360 |
So, you know, my view of technology is it's subsumed, 01:44:12.360 |
you know, to the political, social, and moral needs 01:44:17.480 |
of our day and should be subsumed to that day. 01:44:22.400 |
It's going to be you and me that solve things. 01:44:34.240 |
they're talking now about how artificial intelligence, 01:44:37.040 |
you know, is going to do this and is going to do that. 01:44:39.800 |
I'm not so sure there's anything necessarily positive 01:44:48.680 |
I mean, I, you know, I like email and I like, you know, 01:44:52.160 |
word processing and that sort of, all that stuff is great. 01:44:56.220 |
But actually solving human relations in and of itself, 01:45:02.220 |
relations in and of itself, or international relations, 01:45:21.860 |
- The question is, so like you said, technology is neutral. 01:45:30.460 |
that enables humans to have wider reach and more power. 01:45:34.980 |
The printing press, the rare reason I can read your books 01:45:39.980 |
is I would argue, so first of all, the printing press, 01:45:45.540 |
Wikipedia, I think, has immeasurable effect on humanity. 01:46:04.460 |
the capacity for good outweighs the capacity of bad. 01:46:12.380 |
I would say you're naively cynical about technology. 01:46:16.060 |
Here we have one overdressed, naive optimist, 01:46:23.820 |
technologically naive cynic, and we don't know. 01:46:30.340 |
or the capacity for evil wins out in the end. 01:46:37.060 |
the trajectory of human history seems to pivot 01:46:43.060 |
So we don't know, but as a builder of technology, 01:47:01.040 |
And I'm not sure what to make of that small effect. 01:47:11.660 |
are cynical about the world somehow sound more intelligent. 01:47:17.580 |
- No, no, the issue is how can you be realistic 01:47:23.820 |
It's not optimistic or pessimistic, it's not cynical. 01:47:27.260 |
The question is how can you be a realist, right? 01:47:31.140 |
- Realism depends on a combination of knowledge 01:47:36.140 |
and wisdom and good instincts and that sort of thing. 01:47:42.680 |
And that's what we strive for, is a kind of realism. 01:47:49.420 |
But I mean, here's an example I would give you. 01:47:53.420 |
What about, again, we've got this environmental issue, 01:48:04.180 |
I mean, we all like to be heated well in our homes, 01:48:12.620 |
I mean, we're all consumers and we all profit from this. 01:48:23.180 |
And technology has provided us with a comfortable life. 01:48:25.620 |
And it's also provided us with this incredible danger, 01:48:33.740 |
but it's only, my view is, you know what's gonna happen? 01:48:59.060 |
Greta goes blah, blah, blah, something like that 01:49:01.620 |
in her last talk about the environmental summit 01:49:15.940 |
unless we're hit upside the head really, really hard. 01:49:19.460 |
And then maybe, you know, the business with nuclear weapons, 01:49:24.460 |
you know, I think somehow we got hit upside the head 01:49:32.940 |
And so we started, you know, serious arms control stuff. 01:49:36.660 |
And, you know, but up to that point, you know, 01:49:41.460 |
I mean, it was just something about, you know, 01:49:43.500 |
Khrushchev's big bomb, his big hydrogen bomb, 01:49:48.360 |
I think it was the anniversary or something like that. 01:49:50.420 |
You know, I mean, just think what we could have done 01:49:53.580 |
- Well, that's the double-edged sword of technology. 01:49:58.420 |
there's a lot of people that argue that nuclear weapons 01:50:01.080 |
is the reason we haven't had a World War III. 01:50:03.780 |
So nuclear weapons, the mutually assured destruction 01:50:08.540 |
we've reached a certain level of destructiveness 01:50:11.220 |
with our weapons where we were able to catch ourselves, 01:50:15.660 |
not to create, like you said, hit really hard. 01:50:20.320 |
This is the interesting question about kind of hard, 01:50:36.160 |
there's already, because of this created urgency, 01:50:44.600 |
that sometimes is unrelated to the environment, 01:50:49.980 |
It's been humongous, including the work of Elon Musk, 01:51:02.600 |
is kind of sparked by this environmental, like, urgency. 01:51:07.200 |
and everything they're doing with electric vehicles 01:51:09.000 |
and so on, there's a huge amount of innovation 01:51:19.760 |
that improves the quality of life across the world 01:51:22.700 |
than the actual catastrophic events that we're describing, 01:51:29.640 |
there's going to be more extreme weather events. 01:51:38.440 |
What does that even mean in terms of catastrophic events? 01:51:47.520 |
there's going to be a huge amount of innovators born today 01:51:52.440 |
that have dreams and that will build devices and inventions 01:51:56.840 |
and from space to vehicles to the software world 01:52:03.720 |
all those kinds of things that will, en masse, on average, 01:52:07.760 |
increase the quality of life on average across the world. 01:52:16.240 |
that are creating climate change, global warming 01:52:19.920 |
are going to have a negative, net negative effect. 01:52:24.760 |
And I'm kind of inspired by the dreamers, the engineers, 01:52:29.760 |
the innovators and the entrepreneurs that build, 01:52:33.600 |
that wake up in the morning, see problems in the world 01:52:52.080 |
and the worst of human history is then we can say, 01:52:59.360 |
It's a wake up call that if you get complacent, 01:53:04.960 |
And that, listen, there's a lot of really intelligent people, 01:53:17.240 |
So I think there's reason to be optimistic about technology, 01:53:21.840 |
There's an argument to be made in a realistic way 01:53:25.460 |
that like with technology, we can build a better future. 01:53:37.280 |
And that lesson serves as a guide of how to do it better, 01:53:42.280 |
how to do it right, how to do it in a positive way. 01:53:46.640 |
And the same, every single sort of failed technology 01:53:50.040 |
contains within it the lessons of how to do it better. 01:53:55.460 |
what's the source of hope for human civilization? 01:54:06.760 |
you have truly studied some of the darkest moments 01:54:32.500 |
which is in the persistence of this civilization over time, 01:54:43.340 |
through two enormous world wars, the nuclear standoff, 01:54:52.740 |
with climate change and migration and stuff like that. 01:55:04.580 |
and we're continuing to try to solve these problems. 01:55:07.700 |
And we're continuing to love as well as hate. 01:55:14.620 |
I'm basically, I mean, I have children and grandchildren 01:55:27.700 |
I'm not a Cassandra saying the world is coming to an end. 01:55:38.260 |
Another, by the way, source of tremendous optimism 01:55:44.580 |
I teach some unbelievably fantastic young people 01:55:49.540 |
who are sort of like you say, they're dreamers 01:56:09.620 |
this has probably been true all the way along, 01:56:34.060 |
but certainly these young people are talented, 01:56:41.140 |
they're energetic, they work hard, they're focused. 01:56:50.620 |
You have young people who really wanna contribute 01:57:03.740 |
But the percentages are actually rather small, 01:57:12.380 |
financial well-being will be more important to them. 01:57:16.300 |
But right now, you catch this young generation 01:57:25.820 |
as being kind of silly and naive and knee-jerk leftists 01:57:40.420 |
- What advice would you give to those young people today, 01:57:44.020 |
maybe in high school, in college, at Stanford, 01:57:47.140 |
maybe to your grandchildren about how to have a career 01:57:52.140 |
they can be proud of, have a life they can be proud of? 01:57:55.540 |
- Pursue careers that are in the public interest 01:57:58.860 |
in one fashion or another and not just in their interests. 01:58:02.380 |
And that would be, I mean, it's not bad to pursue a career 01:58:08.060 |
I mean, as long as it's something that's useful 01:58:22.220 |
find who they want to be and what they want to be 01:58:30.420 |
and a lot of young people are kind of throwing themselves 01:58:34.580 |
into it, and Human Rights Watch and that kind of stuff. 01:58:45.460 |
- I tend to think that even if you're not working 01:58:50.820 |
in human rights, there's a certain way in which 01:59:09.500 |
Maybe it'll never be written about or talked about. 01:59:21.300 |
It may require a sacrifice, but it's the choice 01:59:23.900 |
that the best version of that person would make. 01:59:36.380 |
It sounds dramatic to say, but those little actions. 01:59:40.460 |
And I feel like the best you can do to avoid genocide 01:59:45.420 |
on scale is for all of us to live in that way, 01:59:57.420 |
Like I believe that all of us know the right thing to do. 02:00:09.860 |
to live with truth, which is what Václav Havel 02:00:29.140 |
What role does love play in this whole thing, 02:00:41.260 |
and the best in human nature is expressed through love. 02:01:26.180 |
And also, as I said, among those who are suffering, 02:01:40.980 |
who beat the hell out of the political prisoners, 02:01:45.980 |
so everybody's beating the hell out of everybody else. 02:01:48.420 |
So I would not idealize in any way suffering as a, 02:02:13.960 |
you know, to foster loving relations between people. 02:02:28.520 |
you know, worse records when it comes to crime 02:02:40.800 |
I mean, you just, you don't wanna be poor and indigent, 02:02:45.800 |
and not have a roof over your head, be homeless. 02:03:02.440 |
I mean, I'm very critical of the Chinese in a lot of ways, 02:03:08.320 |
they pulled that country out of horrible poverty, right? 02:03:11.080 |
And I mean, there's still poor people in the countryside, 02:03:21.640 |
but, you know, there were millions and millions of Chinese 02:03:33.820 |
So I wanna be clear, I don't speak for history, right? 02:03:38.360 |
I'm giving you, I mean, there used to be historians, 02:03:42.320 |
who really thought they were speaking for history, you know? 02:03:47.020 |
I mean, I understand I'm a subjective human being 02:03:49.520 |
with my own points of view and my own opinions, but-- 02:03:54.020 |
- I'm trying to remember in this conversation 02:03:56.120 |
that you're, despite the fact that you're brilliant 02:03:58.880 |
and you've written brilliant books, that you're just human. 02:04:24.920 |
but, you know, I'm an individual with my points of view, 02:04:29.440 |
and one of them, that I've developed over time, 02:04:34.080 |
is that, you know, human want is a real tragedy for people, 02:04:39.080 |
and it hurts people, and it also causes upheavals 02:04:49.440 |
I feel for people in, you know, in Ethiopia, in Tigray, 02:04:54.440 |
you know, when they don't have enough to eat, 02:04:58.280 |
I mean, it doesn't mean they don't love each other, right? 02:05:03.040 |
but it does mean that it's harder, you know, to do that. 02:05:13.280 |
but the numbers, we've been talking about deaths, 02:05:20.000 |
The history that you haven't perhaps been looking at 02:05:22.720 |
is all the times that people have fallen in love deeply 02:05:31.480 |
and I'm not so sure that amidst the suffering, 02:05:34.320 |
those moments of beauty and love can't be discovered, 02:05:45.780 |
with Viktor Frankl, I may too, maybe, depending on the day. 02:05:49.520 |
I mean, he says that if there's meaning to this life at all, then there must be meaning in suffering. 02:05:56.240 |
There's something about accepting the ups and downs 02:06:03.880 |
and within all of it, finding a source of meaning. 02:06:07.800 |
I mean, he's arguing from the perspective of psychology, 02:06:21.120 |
if we just escape the suffering, it'll all be better, 02:06:37.680 |
it's just horrible, it is horrible suffering, 02:06:41.320 |
but I also just want to say that there's love amidst it, 02:06:47.360 |
- No, no, I don't forget it, I don't forget it, but-- 02:06:52.880 |
Now, I don't want to make that compromise or that trade, 02:07:04.560 |
so I'm not sure what to make of these calculations, 02:07:15.760 |
is people I've gone through difficult times with. 02:07:20.040 |
rather than a society or a group where things are easy. 02:07:24.880 |
The intensity of the connection between human beings 02:07:32.800 |
I want to have as little suffering in the world as possible, 02:07:42.920 |
- No, there's something to what you're saying. 02:07:46.840 |
There's clearly something to what you're saying. 02:07:50.720 |
when I lived there, and people on the streets 02:07:53.480 |
were so mean to one another, and they never smiled. 02:08:04.680 |
and just how hard people were on each other on the streets, 02:08:11.240 |
the friendships were so intense and so wonderful. 02:08:14.100 |
So in that sense, I mean, they did live a hard life, 02:08:43.480 |
in the refugee camps for Jews in Germany after the war. 02:08:48.260 |
So these were Jews who had come mostly from Poland, 02:08:51.280 |
and some survived the camps, came from awful circumstances, 02:09:00.840 |
I mean, they were guarded sometimes by Germans even, 02:09:03.080 |
but they're basically under the British control, 02:09:08.120 |
trying to get to Palestine right after the war, 02:09:11.920 |
and how many pairs there were, how many people coupled up. 02:09:16.060 |
But remember, this is after being in the concentration camp. 02:09:20.600 |
and it's also being free, or more or less free. 02:09:28.160 |
and to be human beings after this horrible thing 02:09:57.480 |
but it was very inspiring to read about these couples 02:10:01.840 |
who met and started to couple up, you know, and got married, 02:10:11.520 |
- When did you live in Russia, in the Soviet Union? 02:10:18.480 |
So I went there, I first went there in '69, '70. 02:10:33.700 |
but it was also a time of political uncertainty 02:10:39.480 |
and also hardship, you know, for Russians themselves. 02:10:47.320 |
for food and for getting anything was almost impossible. 02:10:51.800 |
It was a time when Jews were trying to get out. 02:10:55.440 |
In fact, I just talked to a friend of mine from those days 02:11:01.720 |
and lovely people who had managed to have a good life 02:11:13.920 |
and, you know, their daughter was persecuted in school, 02:11:17.920 |
you know, once they declared that they wanted to immigrate, 02:11:22.400 |
So it was a very, it was a lot of anti-Semitism. 02:11:28.600 |
Dissidents, you know, hung out with some dissidents, 02:11:34.220 |
We think by the KGB, nobody knows exactly, 02:11:38.560 |
but his art studio was, he had a separate studio 02:11:44.780 |
You know, just a small studio where he did his art, 02:11:57.920 |
So, you know, it was not a, it was a tough time. 02:12:04.680 |
you knew you were being reported on as a foreign scholar, 02:12:09.180 |
There was a formal exchange between the United States 02:12:16.680 |
but then, you know, Ivanov got to work in the- 02:12:24.200 |
You know, so it was an exchange which sent historians 02:12:29.200 |
and literary people and some social scientists to Russia, 02:12:33.600 |
and they sent all scientists here to, you know, 02:12:35.800 |
grab what they could from MIT and those places. 02:12:40.640 |
Do you have any knowledge of Russian language 02:12:58.640 |
I have Russian friends who speak just Russian. 02:13:00.480 |
So, you know, when I'm there, I then, you know, 02:13:13.040 |
- What's your fondest memory of the Soviet Union 02:13:20.920 |
- Was there vodka involved, or is it just vodka involved? 02:13:28.520 |
they'd just make fun of me, and I'd make fun of myself. 02:13:32.960 |
I don't really like, you know, a heavy drink. 02:13:37.880 |
I've done some of that, but I never really enjoyed it, 02:13:45.360 |
You know, one friend I made in the dormitory, 02:14:00.360 |
and one in particular from Omsk became a wonderful friend. 02:14:12.060 |
And he would say, well, this is, he was an historian, 02:14:16.840 |
And he'd say, well, this was the case, wasn't it? 02:14:18.560 |
I said, no, I'm sorry, Sasha, it wasn't the case. 02:14:26.440 |
I mean, we're not sure, but, you know, he said, no. 02:14:29.920 |
You know, so, you know, we had these conversations, 02:14:36.600 |
I don't know if he would agree with me or not. 02:14:48.160 |
And he thought I was, you know, I was, you know, 02:14:58.200 |
kind of reason and facts and accurate stories 02:15:20.960 |
because he became a kind of local official in Omsk. 02:15:25.640 |
And he sort of migrated more and more to being a democrat. 02:15:34.880 |
and, you know, in the Congress of People's Deputies, 02:15:45.200 |
and had a political career through the Yeltsin period. 02:15:49.480 |
And once Putin came along, you know, it was over. 02:15:56.200 |
and Putin didn't like the Yeltsin people, right, 02:15:59.160 |
some of whom tried to be democrats. 02:16:04.200 |
He just published his memoirs in Russian, by the way, 02:16:19.440 |
But I translated a few parts of it once for him. 02:16:23.040 |
Like, do you find that the translation is a problem or no? 02:16:31.320 |
it's the only language I know deeply, except English. 02:16:35.640 |
And it seems like so much is lost of the pain, 02:16:45.160 |
I mean, those who do the translations, you know, 02:16:52.080 |
whether it's Russian or Polish or German or anything, 02:16:57.480 |
to talk to the famous translators of Dostoevsky and Tolstoy. 02:17:01.260 |
And I'm just gonna do several conversations with them 02:17:07.880 |
and just talk about the translation of a single sentence. 02:17:28.340 |
When you talk about the way World War II was perceived 02:17:30.600 |
and all those kinds of things, it's fascinating. 02:17:35.240 |
History also has in it opinion and perspective. 02:17:39.500 |
And so sometimes stripping that away is really difficult. 02:17:43.060 |
And at its highest form, that is what you do as a historian. 02:18:00.280 |
through some of the worst parts of human history, 02:18:15.540 |
please check out our sponsors in the description. 02:18:17.940 |
And now, let me leave you with some words from Stalin. 02:18:26.340 |
Thank you for listening, and hope to see you next time.