
Norman Naimark: Genocide, Stalin, Hitler, Mao, and Absolute Power | Lex Fridman Podcast #248


Chapters

0:00 Introduction
0:20 Stalin and absolute power
14:17 Dictators and genocide
38:43 What is genocide
48:50 Human nature and suffering
78:35 Mao's Great Leap Forward
85:49 North Korea
89:42 Our role in fighting against atrocities
98:38 China
102:47 Hopes for the future and technology
117:40 Advice for young people
120:27 Love and tragedy

Whisper Transcript

00:00:00.000 | The following is a conversation with Norman Naimark,
00:00:03.360 | a historian at Stanford specializing in genocide,
00:00:06.960 | war, and empire.
00:00:09.400 | This is the Lex Fridman Podcast.
00:00:11.600 | To support it, please check out our sponsors
00:00:13.860 | in the description.
00:00:15.040 | And now, here's my conversation with Norman Naimark.
00:00:18.960 | Did Stalin believe that communism is good,
00:00:22.500 | not just for him, but for the people of the Soviet Union
00:00:25.400 | and the people of the world?
00:00:26.960 | - Oh, absolutely.
00:00:27.840 | I mean, Stalin believed that socialism was the be all
00:00:32.720 | and end all of human existence.
00:00:36.760 | I mean, he was a true Leninist,
00:00:38.440 | and in Lenin's tradition, this was what he believed.
00:00:43.160 | I mean, that set of beliefs didn't exclude
00:00:46.320 | other kinds of things he believed or thought or did.
00:00:49.880 | But no, the way he defined socialism,
00:00:53.920 | the way he thought about socialism,
00:00:55.880 | no, he absolutely thought it was in the interest
00:00:57.720 | of the Soviet Union and of the world.
00:00:59.480 | And in fact, that the world was one day going to go socialist,
00:01:03.320 | in other words, I think he believed in that.
00:01:05.840 | And eventually, in the International Revolution.
00:01:08.240 | - So given the genocide in the 1930s that you described,
00:01:13.840 | was Stalin evil, delusional, or incompetent?
00:01:17.400 | - Evil, delusional, or incompetent.
00:01:22.480 | Well, you know, evil is one of those words,
00:01:25.500 | you know, which has a lot of kind of religious
00:01:27.900 | and moral connotations.
00:01:30.680 | And in that sense, yes, I think he was an evil man.
00:01:33.280 | I mean, he, you know, eliminated people
00:01:36.880 | absolutely unnecessarily.
00:01:38.720 | He tortured people, had people tortured.
00:01:42.080 | He was completely indifferent to the suffering of others.
00:01:48.000 | He couldn't have cared a whit,
00:01:50.760 | you know, that millions were suffering.
00:01:54.840 | And so yes, I consider him an evil man.
00:01:58.540 | I mean, you know, historians don't like to--
00:02:01.140 | - Use the word evil. - Use the word evil.
00:02:02.780 | It's, you know, it's a word for moral philosophers,
00:02:05.340 | but I think it certainly fits who he is.
00:02:10.340 | I think he was delusional.
00:02:14.040 | And there is a wonderful historian at Princeton,
00:02:17.980 | a political scientist actually named Robert Tucker,
00:02:20.620 | who said he suffered from a paranoid delusional system.
00:02:25.620 | And I always remember that of Tucker's writing
00:02:30.320 | because what Tucker meant is that he was not just paranoid,
00:02:35.240 | meaning, you know, I'm paranoid,
00:02:37.600 | I'm worried you're out to get me, right?
00:02:39.920 | But that he constructed whole plots of people,
00:02:47.440 | whole systems of people who were out to get him.
00:02:51.340 | So in other words, his delusions were that there were
00:02:54.220 | all of these groups of people out there
00:02:56.540 | who were out to diminish his power and remove him
00:03:01.500 | from his position and undermine the Soviet Union in his view.
00:03:07.360 | So yes, I think he did suffer from delusions.
00:03:12.360 | And this had a huge effect because whole groups then
00:03:16.940 | were destroyed by his activities,
00:03:20.920 | which he would construct based on these delusions.
00:03:25.920 | He was not incompetent.
00:03:27.380 | He was an extremely competent man.
00:03:29.340 | I mean, I think most of the research that's gone on,
00:03:32.700 | especially since the Stalin Archive was opened
00:03:36.840 | at the beginning of the century,
00:03:38.820 | and I think almost every historian who goes in that archive
00:03:41.720 | comes away from it with the feeling of a man
00:03:45.260 | who is enormously hardworking, intelligent,
00:03:48.860 | you know, with an acute sense of politics,
00:03:51.900 | a really excellent sense of political rhetoric,
00:03:56.900 | a fantastic editor, you know, in a kind of agitational sense.
00:04:02.180 | I mean, he's a real agitator, right?
00:04:04.460 | And, you know, a really hard worker.
00:04:09.280 | I mean, somebody who works from morning till night,
00:04:11.900 | a micromanager in some ways.
00:04:14.700 | So his competence, I think, was really extreme.
00:04:17.820 | Now, there were times when that fell down,
00:04:20.440 | you know, times in the '30s, times in the '20s,
00:04:23.540 | times during the war, where he made mistakes.
00:04:26.220 | It's not as if he didn't make any mistakes.
00:04:28.940 | But I think, you know, you look at his stuff,
00:04:31.220 | you know, you look at his archives, you look what he did.
00:04:34.120 | I mean, this is an enormously competent man
00:04:36.700 | who in many, many different areas of enterprise,
00:04:41.320 | because he, you know, he had this notion
00:04:43.540 | that he should know everything and did know everything.
00:04:46.620 | I remember one archival "delo," as it's called,
00:04:50.620 | you know, a kind of folder that I looked at
00:04:52.340 | where he actually went through the wines
00:04:56.260 | that were produced in his native Georgia
00:04:59.460 | and wrote down how much they should make
00:05:03.340 | of each of these wines, you know,
00:05:05.060 | how many, you know, barrels they should produce
00:05:09.200 | of these wines, which grapes were better
00:05:11.360 | than the other grapes, sort of correcting,
00:05:13.780 | in other words, what people were putting down there.
00:05:16.700 | So he was, you know, his competence ranged very wide,
00:05:20.860 | or at least he thought his competence ranged very wide.
00:05:23.500 | I mean, both things, I think, are the case.
00:05:25.760 | - If we look at this paranoid, delusional system,
00:05:27.860 | Stalin was in power for 30 years.
00:05:29.900 | He is, many argue, one of the most powerful men in history.
00:05:35.000 | Did, in his case, absolute power corrupt him
00:05:38.620 | or did it reveal the true nature of the man?
00:05:40.940 | And maybe just in your sense,
00:05:43.500 | as we kind of build around this genocide
00:05:45.540 | of the early 1930s, this paranoid, delusional system,
00:05:50.300 | did it get built up over time?
00:05:52.780 | Was it always there?
00:05:54.620 | It's kind of a question of did the genocide,
00:05:59.620 | was that always inevitable, essentially, in this man,
00:06:03.240 | or did power create that?
00:06:05.900 | - I mean, it's a great question,
00:06:07.060 | and I don't think you can,
00:06:08.460 | I don't think you can say that it was always
00:06:11.100 | kind of inherent in the man.
00:06:14.500 | I mean, the man without his position and without his power,
00:06:18.740 | you know, wouldn't have been able to accomplish
00:06:21.060 | what he eventually did in the way of murdering people,
00:06:25.060 | you know, and murdering groups of people,
00:06:26.580 | which is what genocide is.
00:06:28.860 | So, you know, I don't, it wasn't sort of in him.
00:06:32.780 | I mean, there were, and again, you know,
00:06:34.180 | the new research has shown that, you know,
00:06:37.020 | he had, his childhood was, you know,
00:06:39.540 | not a particularly nasty one.
00:06:42.380 | People used to say, you know, the father beat him up,
00:06:45.780 | and it turns out, actually, it wasn't the father,
00:06:47.540 | it was the mother once in a while.
00:06:49.340 | But basically, you know, he was not an unusual,
00:06:52.900 | young Georgian kid, or student, even.
00:06:56.980 | And, you know, it was the growth of the Soviet system,
00:07:01.060 | and him within the Soviet system,
00:07:04.720 | I mean, his own development within the Soviet system,
00:07:07.780 | I think that led, you know, to the kind of mass killing
00:07:12.780 | that occurred in the 1930s.
00:07:15.340 | You know, he essentially achieved complete power
00:07:19.680 | by the early 1930s, and then as he rolled with it,
00:07:24.680 | as you would say, you know, or people would say,
00:07:28.160 | you know, it increasingly became murderous,
00:07:32.740 | and there was no, you know, there were no checks
00:07:36.660 | and balances, obviously, on that murderous system.
00:07:39.820 | And not only that, you know, people supported it
00:07:43.180 | in the NKVD and elsewhere,
00:07:44.980 | and he learned how to manipulate people.
00:07:46.900 | I mean, he was a superb, you know, political manipulator
00:07:51.500 | of those people around him.
00:07:54.700 | And, you know, we've got new transcripts, for example,
00:08:01.700 | of, you know, Politburo meetings in the early 1930s.
00:08:05.940 | And you read those things, and you read, you know,
00:08:07.660 | he uses humor, and he uses sarcasm, especially.
00:08:12.660 | He uses verbal ways to undermine people, you know,
00:08:17.140 | to control their behavior and what they do.
00:08:20.780 | And he's a really, you know, he's a real,
00:08:25.460 | I guess, manipulator is the right word,
00:08:27.380 | and he does it with, you know, a kind of skill
00:08:32.380 | that on the one hand is admirable,
00:08:36.060 | and on the other hand, of course, is terrible,
00:08:39.860 | because it ends up, you know,
00:08:41.740 | creating the system of terror that he creates.
00:08:46.460 | - I mean, I guess just to linger on it,
00:08:50.620 | I just wonder how much of it is a slippery slope
00:08:54.220 | in the early '20s, 1920s,
00:08:57.500 | did he think he was going to be murdering
00:08:59.460 | even a single person, let alone thousands and millions?
00:09:04.460 | I just wonder,
00:09:07.340 | maybe the murder of a single human being,
00:09:14.060 | just to get them, you know,
00:09:16.620 | because you're paranoid about them
00:09:18.460 | potentially threatening your power,
00:09:19.920 | does that murder then open a door?
00:09:22.260 | And once you open the door,
00:09:23.980 | you become a different human being.
00:09:25.940 | A deeper question here is the Solzhenitsyn,
00:09:29.060 | you know, the line between good and evil
00:09:30.740 | runs in every man, are all of us,
00:09:33.180 | once we commit one murder in this situation,
00:09:35.780 | does that open a door for all of us?
00:09:38.060 | And I guess even the further deeper question
00:09:41.620 | is how easy it is for human nature
00:09:44.980 | to go on this slippery slope that ends in genocide.
00:09:49.020 | - There are a lot of questions in those questions,
00:09:52.380 | and you know, the slippery slope question,
00:09:55.900 | I would answer, I suppose, by saying,
00:09:59.160 | you know, Stalin wasn't the most likely successor of Lenin.
00:10:04.100 | There were plenty of others.
00:10:06.020 | There were a lot of political contingencies
00:10:09.620 | that emerged in the 1920s
00:10:12.660 | that made it possible for Stalin to seize power.
00:10:16.620 | I don't think of him as a,
00:10:19.540 | you know, if you would just know him in 1925,
00:10:22.820 | I don't think anybody would say, much less himself,
00:10:25.900 | that this was a future mass murderer.
00:10:28.780 | I mean, Trotsky mistrusted him
00:10:30.980 | and thought he was, you know, a mindless bureaucrat.
00:10:35.980 | You know, others were less mistrustful of him,
00:10:39.500 | but, you know, he managed to gain power
00:10:41.740 | in the way he did through this bureaucratic
00:10:43.580 | and political maneuvering that was very successful.
00:10:48.980 | You know, the slippery slope, as it were,
00:10:51.940 | doesn't really begin until the 1930s, in my view.
00:10:55.580 | In other words, once he gains complete power
00:10:58.820 | and control of the Politburo,
00:11:01.300 | once the programs that he institutes
00:11:05.860 | of the Five-Year Plan and collectivization go through,
00:11:10.060 | once he reverses himself and is able to reverse himself
00:11:14.020 | or reverse the Soviet path, you know,
00:11:17.140 | to give various nationalities their, you know,
00:11:20.540 | their ability to develop their own cultures
00:11:22.980 | and sort of internal politics,
00:11:26.660 | once he reverses all that, you know,
00:11:29.000 | you have the Ukrainian famine in '32, '33,
00:11:32.140 | you have the murder of Kirov,
00:11:34.460 | who is one of the leading figures, you know,
00:11:38.060 | in the political system, you have the suicide of his wife,
00:11:41.140 | you have all these things come together in '32, '33,
00:11:45.540 | that then, you know, make it more likely,
00:11:50.540 | in other words, that bad things are gonna happen.
00:11:53.440 | And people start seeing that, too, around him.
00:11:57.820 | They start seeing that it's not a slippery slope,
00:12:00.940 | it's a dangerous, it's a dangerous situation,
00:12:05.940 | which is emerging, and some people really understand that.
00:12:10.100 | So I don't, I really do see a differentiation, then,
00:12:13.460 | between the '20s.
00:12:14.700 | I mean, it's true that Stalin, during the Civil War,
00:12:17.420 | there's a lot of, you know, good research on that,
00:12:20.140 | you know, shows that he already had
00:12:23.060 | some of these characteristics of being,
00:12:26.420 | as it were, murderous and being, you know,
00:12:29.620 | being dictatorial and pushing people around
00:12:33.980 | and that sort of thing.
00:12:34.820 | That was all there.
00:12:36.620 | But I don't really see that as kind of the necessary stage
00:12:40.800 | for the next thing that came, which was the '30s,
00:12:43.500 | which was really terror of the worst sort,
00:12:46.780 | you know, where everybody's afraid for their lives
00:12:49.340 | and most people are afraid for their lives
00:12:51.580 | and their families' lives, and where torture
00:12:54.420 | and that sort of thing becomes a common part,
00:12:57.100 | you know, of what people had to face.
00:13:00.300 | So it's a different world.
00:13:03.100 | And, you know, people will argue.
00:13:04.820 | They'll argue this kind of Lenin-Stalin continuity debate,
00:13:09.820 | you know, that's been going on
00:13:11.220 | since I was an undergraduate, right?
00:13:13.100 | That argument, you know, was Stalin
00:13:15.460 | the natural sort of next step from Lenin,
00:13:19.100 | or was he something completely different?
00:13:21.220 | Many people will argue, you know,
00:13:24.380 | because of Marxism-Leninism, because of the ideology,
00:13:27.760 | that, you know, it was the natural,
00:13:30.860 | it was a kind of natural next step.
00:13:32.700 | I don't think so, you know?
00:13:34.300 | And I would tend to lean the other way.
00:13:36.300 | Not absolutely.
00:13:37.220 | I mean, I won't make an absolute argument
00:13:40.100 | that what Stalin became had nothing to do with Lenin
00:13:43.540 | and nothing to do with Marxism-Leninism.
00:13:45.540 | It had a lot to do with it.
00:13:47.380 | But, you know, he takes it one major step further.
00:13:51.580 | And again, that's why I don't like the slippery slope,
00:13:53.780 | you know, metaphor, because that means
00:13:55.540 | it's kind of slow and easy.
00:13:57.220 | It's a leap.
00:13:58.300 | And we call, you know, I mean, historians talk
00:14:01.300 | about the Stalin revolution, you know, in '28 and '29,
00:14:05.380 | you know, that he, in some senses,
00:14:08.860 | creates a whole new system, you know,
00:14:11.540 | through the Five-Year Plan, collectivization,
00:14:14.300 | and seizing political power the way he does.
00:14:16.720 | - Can you talk about the 1930s?
00:14:19.460 | Can you describe what happened in Holodomor,
00:14:21.660 | the Soviet terror famine in Ukraine in '32 and '33?
00:14:25.220 | - Yes.
00:14:26.060 | - That killed millions of Ukrainians?
00:14:27.220 | - Right.
00:14:28.040 | It's a long story, you know,
00:14:29.720 | but let me try to be as succinct as I can be.
00:14:34.120 | I mean, the Holodomor, the terror famine
00:14:37.060 | of '32, '33, comes out of, in part,
00:14:42.060 | an all-union famine that is the result of collectivization.
00:14:47.900 | You know, collectivization was a catastrophe.
00:14:52.180 | You know, the more or less, the so-called kulaks,
00:14:56.020 | the more or less richer farmers,
00:14:58.000 | I mean, they weren't really rich, right?
00:15:00.200 | Anybody with a tin roof and a cow was considered a kulak,
00:15:03.180 | you know, and other people who had nothing
00:15:05.080 | were also considered kulaks.
00:15:06.420 | If they opposed collectivization.
00:15:09.180 | So these kulaks, we're talking millions of them, right?
00:15:12.380 | And Ukraine, it's worth recalling,
00:15:15.060 | and I'm sure you know this, was a, you know,
00:15:17.220 | heavily agricultural area, and Ukrainian peasants,
00:15:20.820 | you know, were in the countryside
00:15:24.180 | and resisted collectivization more
00:15:27.700 | than even the Russian peasants resisted collectivization,
00:15:32.060 | suffered during this collectivization program.
00:15:35.480 | And they, you know, burned sometimes their own houses,
00:15:38.620 | they killed their own animals.
00:15:40.540 | They were shot, you know, sometimes on the spot.
00:15:45.120 | Tens of thousands and others were sent into exile.
00:15:50.020 | So there was a conflagration in the countryside.
00:15:53.500 | And the result of that conflagration in Ukraine
00:15:56.260 | was terrible famine.
00:15:58.140 | And again, there was famine all over the Soviet Union,
00:16:01.020 | but it was especially bad in Ukraine,
00:16:04.840 | in part because Ukrainian peasants resisted.
00:16:07.940 | Now in '32, '33, a couple of things happen.
00:16:10.700 | I mean, I've argued this in my writing,
00:16:14.520 | and, you know, I've also worked on this.
00:16:18.200 | I continue to work on it, by the way,
00:16:19.980 | with a museum in Kiev,
00:16:23.340 | that's going to be about the Holodomor.
00:16:25.900 | They're building the museum now,
00:16:27.500 | and it's going to be a very impressive set of exhibits,
00:16:32.060 | and talk with historians all the time about it.
00:16:34.300 | So what happens in '32, '33, a couple of things.
00:16:37.440 | First of all, Stalin develops an even stronger,
00:16:42.440 | I say even stronger,
00:16:46.280 | 'cause they already had an antipathy for the Ukrainians,
00:16:49.280 | an even stronger antipathy for the Ukrainians in general.
00:16:53.160 | First of all, they resist collectivization.
00:16:55.960 | Second of all, he's not getting all the grain he wants
00:16:59.160 | out of them, and which he needs.
00:17:02.040 | And so he sends in, then, people to expropriate the grain,
00:17:07.040 | and take the grain away from the peasants.
00:17:09.500 | These teams of people, you know, some policemen,
00:17:12.840 | some urban thugs, some party people,
00:17:16.580 | some poor peasants, you know, take part too,
00:17:19.320 | go into the villages, and forcibly seize grain,
00:17:23.920 | and animals from the Ukrainian peasantry.
00:17:28.500 | They're seizing it all over.
00:17:29.920 | I mean, let's remember, again,
00:17:30.960 | this is all over the Soviet Union, in '32 especially.
00:17:34.760 | Then, you know, in December of 1932,
00:17:40.800 | January of '33, February of '33,
00:17:44.800 | Stalin is convinced the Ukrainian peasantry
00:17:47.760 | needs to be shown who's boss,
00:17:51.740 | that they're not turning over their grain,
00:17:55.620 | that they're resisting the expropriators,
00:17:57.900 | that they're hiding the grain,
00:17:59.140 | which they do sometimes, right?
00:18:01.520 | That they're basically not loyal to the Soviet Union,
00:18:05.640 | that they're acting like traitors,
00:18:07.560 | that they're ready, and he says this,
00:18:09.840 | you know, I think it's Kaganovich, he says it too,
00:18:12.640 | you know, they're ready to kind of pull out
00:18:14.240 | of the Soviet Union and join Poland.
00:18:15.920 | I mean, he thinks Poland is, you know,
00:18:17.520 | out to get Ukraine, and so he's gonna then,
00:18:21.960 | essentially, break the back of this peasantry.
00:18:24.040 | And the way he breaks their back
00:18:26.960 | is by going through another expropriation program,
00:18:30.580 | which is not done in the rest of the Soviet Union.
00:18:33.420 | So he's taking away everything they have,
00:18:36.300 | everything they have.
00:18:37.500 | There are new laws introduced
00:18:40.180 | where they will actually punish people,
00:18:42.780 | including kids, with death if they steal any grain,
00:18:47.660 | you know, if they take anything from the,
00:18:49.700 | you know, from the fields.
00:18:51.220 | So, you know, you can shoot anybody,
00:18:53.740 | you know, who is looking for food.
00:18:55.740 | And then he introduces measures in Ukraine
00:18:58.820 | which are not introduced into the rest of the Soviet Union.
00:19:02.200 | For example, Ukrainian peasantry
00:19:05.260 | are not allowed to leave their villages anymore.
00:19:07.960 | They can't go to the city to try to find some things.
00:19:11.180 | I mean, we've got pictures of, you know,
00:19:13.060 | Ukrainian peasants dying on the sidewalks
00:19:15.940 | in Kharkiv and in Kiev and in places like that
00:19:19.620 | who've managed to get out of the village
00:19:21.300 | and get to the cities, but now they can't leave.
00:19:24.100 | They can't leave Ukraine to go to Belorussia,
00:19:28.180 | Belarus today, or to Russia, you know, to get any food.
00:19:32.860 | There's no, he won't allow any relief to Ukraine.
00:19:36.820 | A number of people offer relief, including the Poles,
00:19:39.660 | but also the Vatican offers relief.
00:19:42.180 | He won't allow any relief to Ukraine.
00:19:44.500 | He won't admit that there's a famine in Ukraine.
00:19:47.340 | And instead what happens is that Ukraine,
00:19:52.260 | the Ukrainian countryside, turns
00:19:54.540 | into what my now past colleague
00:19:59.220 | who died several years ago, Robert Conquest,
00:20:01.580 | called a vast Belsen.
00:20:04.500 | And by that, you know, the images
00:20:06.100 | of bodies just lying everywhere, you know, people dead
00:20:10.500 | and dying, you know, of hunger, which is, by the way,
00:20:15.500 | I mean, as you know, I've spent a lot of time
00:20:19.960 | studying genocide.
00:20:20.800 | I don't think there's anything worse than dying of hunger
00:20:23.300 | from what I have read.
00:20:24.540 | I mean, you see terrible ways that people die, right?
00:20:27.620 | But dying of hunger is just such a horrible, horrible thing.
00:20:31.660 | And so, for example, we know there were many cases
00:20:36.180 | of cannibalism in the countryside
00:20:37.740 | 'cause there wasn't anything to eat.
00:20:39.260 | People were eating their own kids, right?
00:20:42.300 | And Stalin knew about this.
00:20:43.620 | And again, you know, we started with this question
00:20:46.340 | a little bit earlier.
00:20:47.180 | He doesn't, there's not a sign of remorse,
00:20:51.640 | not a sign of pity, right?
00:20:54.360 | Not a sign of any kind of human emotion
00:20:58.980 | that normal people would have.
00:21:01.680 | - What about the opposite, joy for teaching them a lesson?
00:21:06.680 | - I don't think there's joy.
00:21:09.560 | I'm not sure Stalin really understood
00:21:12.880 | what joy was. - Emotion of intermingling.
00:21:14.640 | - You know, I think he felt it was necessary
00:21:19.440 | to get those SOBs, right?
00:21:21.920 | That they deserved it.
00:21:23.660 | He says that several times.
00:21:24.940 | This is their own fault, right?
00:21:26.700 | This is their own fault.
00:21:27.900 | And as their own fault, you know,
00:21:32.380 | they get what they deserve, basically.
00:21:36.020 | - How much was the calculation?
00:21:37.400 | How much was it reason versus emotion?
00:21:41.500 | In terms of, you said he was competent.
00:21:44.800 | Was there a long-term strategy,
00:21:47.600 | or was this strategy based on emotion and anger?
00:21:51.580 | - No, well, I think actually the right answer
00:21:54.760 | is a little of both.
00:21:56.440 | I mean, usually the right answer in history
00:21:58.260 | is something like that.
00:21:59.100 | - A little of both? - No, you can't, you can't.
00:22:01.400 | It wasn't just, I mean, first of all,
00:22:03.960 | you know, the Soviets had it in for Ukraine
00:22:08.640 | and Ukrainian nationalism,
00:22:09.940 | which they really didn't like.
00:22:12.080 | And by the way, Russians still don't like it, right?
00:22:14.680 | So they had it in for Ukrainian nationalism.
00:22:18.080 | They feared Ukrainian nationalism.
00:22:20.900 | As I said, you know, Stalin writes, you know,
00:22:24.760 | we'll lose Ukraine, you know, if these guys win.
00:22:29.440 | You know, so there's a kind of long-term determination,
00:22:33.300 | as I said, you know, to kind of break the back
00:22:37.260 | of Ukrainian national identity and Ukrainian nationalism
00:22:42.260 | as any kind of separatist force whatsoever.
00:22:46.840 | And so there's that rational calculation.
00:22:50.040 | At the same time, I think Stalin is annoyed
00:22:52.640 | and peeved and angry on one level
00:22:58.560 | with the Ukrainians for resisting collectivization
00:23:02.880 | and for being difficult and for, you know,
00:23:05.340 | not conforming, you know, to the way he thinks
00:23:10.340 | peasants should act in this situation.
00:23:13.240 | So you have both things.
00:23:14.480 | He's also very angry at the Ukrainian party
00:23:17.600 | and eventually purges it for not being able
00:23:20.160 | to control Ukraine and not be able to control the situation.
00:23:23.900 | You know, Ukraine is in theory,
00:23:25.160 | the breadbasket, right, of Europe.
00:23:27.040 | Well, how come the breadbasket isn't turning over to me
00:23:30.280 | all this grain so I can sell it abroad
00:23:32.200 | and, you know, build new factories?
00:23:35.160 | And support the workers in the cities.
00:23:37.540 | So there's a kind of annoyance, you know,
00:23:39.880 | when things fail, and this is absolutely typical of Stalin,
00:23:43.760 | when things fail, he blames it on other people
00:23:45.960 | and usually groups of people, right?
00:23:47.760 | Not individuals, but groups, again.
00:23:50.020 | So a little bit of both, I think, is the right answer.
00:23:53.720 | - This blame, it feels like there's a playbook
00:23:57.760 | that dictators follow.
00:23:59.580 | I just wonder if it comes naturally or just kind of evolves.
00:24:04.000 | There's blaming others and then telling these narratives
00:24:06.960 | and then creating the other and then somehow
00:24:09.080 | that leads to hatred and genocide.
00:24:10.700 | It feels like there's too many commonalities
00:24:14.720 | for it not to be a naturally emergent strategy
00:24:18.960 | that works for dictatorships.
00:24:20.680 | - I mean, that's a good, it's a very good point.
00:24:23.080 | And I think it's one, you know, that has its merits.
00:24:27.680 | In other words, I think you're right,
00:24:30.400 | that there's certain kinds of strategies by dictators
00:24:33.020 | that are common to them.
00:24:35.560 | A lot of them do killing, not all of them,
00:24:37.880 | of that sort that Stalin did.
00:24:40.040 | I've written about Mao and Pol Pot and Hitler.
00:24:43.840 | And there is a sort of, as you say,
00:24:46.840 | a kind of playbook for political dictatorship.
00:24:51.800 | Also for a kind of communist totalitarian way of functioning.
00:24:56.800 | And that way of functioning was described already
00:25:01.500 | by Hannah Arendt early on when she wrote
00:25:03.480 | The Origins of Totalitarianism.
00:25:05.400 | And she more or less writes the playbook
00:25:10.280 | and Stalin does follow it.
00:25:12.720 | The real question, it seems to me, is to what extent,
00:25:17.180 | you know, and how deep does this go
00:25:19.740 | and how often does it go in that direction?
00:25:22.180 | I mean, you can argue, for example,
00:25:25.000 | I mean, Fidel Castro was not a nice man, right?
00:25:27.480 | He was a dictator, he was a terrible dictator.
00:25:30.920 | But he did not engage in mass murder.
00:25:33.280 | Ho Chi Minh was a dictator, a communist dictator
00:25:36.760 | who grew up in the communist movement,
00:25:39.800 | went to Moscow, spent time in Moscow in the '30s
00:25:43.360 | and went on to found the Vietnamese Communist Party.
00:25:47.960 | He was a horrible dictator.
00:25:49.240 | I'm sure he was responsible
00:25:50.460 | for a lot of death and destruction.
00:25:52.560 | But he wasn't a mass murderer.
00:25:55.120 | And so you get those.
00:25:57.920 | I mean, I would even argue, others will disagree,
00:26:02.000 | that Lenin wasn't a mass murderer.
00:26:04.120 | You know, that he didn't kill the same way
00:26:06.960 | that Stalin killed.
00:26:08.200 | Or people after him.
00:26:09.400 | They're communist dictators too, after all.
00:26:11.280 | Khrushchev was a communist dictator,
00:26:13.560 | but he stopped this killing.
00:26:14.960 | And, you know, he's still responsible for a gulag
00:26:19.080 | and people sent off into a gulag
00:26:21.320 | and imprisonment and torture and that sort of thing.
00:26:23.800 | But it's not at all the same thing.
00:26:25.400 | So there are some, you know, like Stalin, like Mao,
00:26:29.800 | like Pol Pot, you know, who commit these horrible,
00:26:32.680 | horrible atrocities, extensively engaging,
00:26:36.960 | in my view, in genocide.
00:26:39.800 | And there's some who don't.
00:26:41.240 | And, you know, what's the difference?
00:26:44.680 | Well, you know, the difference is partly in personality,
00:26:47.680 | partly in historical circumstance, you know,
00:26:50.400 | partly in, you know, who is it
00:26:52.240 | that controls the reins of power.
00:26:54.200 | - How much do you connect the ideas of communism
00:26:57.040 | or Marxism or socialism to Holodomor, to Stalin's rule?
00:27:02.040 | So how naturally, as you kind of alluded to,
00:27:05.600 | does it lead to genocide?
00:27:08.440 | - It's also, I mean, in some ways,
00:27:13.200 | I've just addressed that question
00:27:14.400 | by saying it doesn't always lead to genocide.
00:27:17.080 | You know, in the case, again, you know,
00:27:18.960 | Cuba is not pretty, but it didn't have,
00:27:23.040 | there was no genocide in Cuba.
00:27:25.000 | And the same thing in North Vietnam.
00:27:26.800 | You know, even North Korea, as awful as it is,
00:27:30.720 | as terrible dictatorship, right,
00:27:32.480 | and people's rights are totally destroyed, right?
00:27:37.480 | They have no freedom whatsoever, you know,
00:27:40.160 | is not, as far as we know, genocidal.
00:27:43.560 | Who knows whether it could be,
00:27:45.120 | or whether if they took over South Korea, you know,
00:27:47.480 | mass murder wouldn't take place and that kind of thing.
00:27:50.200 | But my point is, is that the ideology
00:27:53.240 | doesn't necessarily dictate genocide.
00:27:57.120 | In other words, it's an ideology, I think,
00:27:59.360 | that makes genocide sometimes too easily possible,
00:28:04.280 | given, you know, the way it thinks through history
00:28:09.200 | as being, you know, you're on the right side of history,
00:28:11.600 | and some people are on the wrong side of history,
00:28:13.760 | and you have to destroy those people
00:28:15.640 | who are on the wrong side of history.
00:28:17.040 | I mean, there is something in, you know,
00:28:19.760 | Marxism-Leninism, which, you know,
00:28:22.480 | has that kind of language and that kind of thinking.
00:28:25.480 | But I don't think it's necessarily that way.
00:28:30.480 | There's a wonderful historian at Berkeley
00:28:34.080 | named Martin Malia, who has written, you know,
00:28:37.640 | wrote a number of books on this subject,
00:28:39.600 | and he was very, very, he was convinced that the,
00:28:44.600 | you know, that the ideology itself, you know,
00:28:49.680 | played a crucial role in the murderousness
00:28:53.080 | of the Soviet regime.
00:28:54.640 | I'm not completely convinced, you know.
00:28:57.080 | When I say not completely convinced,
00:28:58.600 | I think there are, you could argue it different ways,
00:29:01.640 | equally valid, you know, with equally valid arguments.
00:29:05.640 | - I mean, there's something about the ideology of communism
00:29:10.640 | that allows you to decrease the value of human life,
00:29:15.200 | almost like this philosophy,
00:29:16.440 | if it's okay to crack a few eggs to make an omelet.
00:29:19.040 | - Right.
00:29:19.880 | - So maybe that, if you can reason like that,
00:29:23.040 | then it's easier to take the leap of,
00:29:26.240 | for the good of the country, for the good of the people,
00:29:28.460 | for the good of the world, it's okay to kill a few people.
00:29:31.640 | And then that's where, I wonder about the slippery slope.
00:29:36.640 | - Yeah, no, no, again, you know,
00:29:38.640 | I don't think it's a slippery slope.
00:29:40.240 | I think it's, I think it's dangerous.
00:29:44.080 | In other words, I think it's dangerous.
00:29:45.720 | But I don't consider, you know,
00:29:48.640 | I don't like Marxism-Leninism any better than the next guy.
00:29:51.680 | And I've lived in plenty of those systems
00:29:53.520 | to know how they can beat people down
00:29:56.680 | and how they can, you know,
00:29:59.500 | destroy human aspirations
00:30:03.280 | and human interaction between people.
00:30:06.500 | But they're not necessarily murderous systems.
00:30:11.500 | They are systems that contain people's autonomy,
00:30:15.620 | that force people into work and labor and lifestyles
00:30:20.000 | that they don't want to live.
00:30:21.840 | I spent a lot of time, you know,
00:30:23.680 | with East Germans and Poles, you know,
00:30:27.640 | who lived in, and even in the Soviet Union,
00:30:30.880 | you know, in the post-Stalin period,
00:30:34.200 | where people lived lives they didn't want to live,
00:30:37.160 | you know, and didn't have the freedom to choose.
00:30:40.660 | And that was terrifying in and of itself.
00:30:44.040 | But these were not murderous systems.
00:30:46.560 | And they, you know, subscribe to Marxism-Leninism.
00:30:51.520 | - So I suppose it's important to draw the line
00:30:54.200 | between mass murder and genocide and mass murder
00:30:58.480 | versus just mass violation of human rights.
00:31:02.720 | - Right, right.
00:31:04.020 | - And the leap to mass murder, you're saying,
00:31:08.800 | may be easier in some ideologies than others,
00:31:13.240 | but it's not clear that somehow one ideology
00:31:15.960 | definitely leads to mass murder and not.
00:31:17.520 | - Exactly.
00:31:18.360 | - I wonder how many factors, what factors,
00:31:21.080 | how much of it is a single charismatic leader?
00:31:24.000 | How much of it is the conflagration
00:31:28.000 | of multiple historical events?
00:31:30.640 | How much of it is just dumb luck, or the opposite of luck?
00:31:35.060 | Do you have a sense whether, if you look at a moment
00:31:40.680 | in history, looking at the factors, you could predict
00:31:45.680 | whether something bad's going to happen here?
00:31:49.520 | When you look at Iraq, when Saddam Hussein first took power,
00:31:54.080 | well, you could, or you can, you know,
00:31:57.120 | go even farther back in history,
00:31:58.680 | would you be able to predict?
00:32:00.240 | So you said, you already kind of answered that
00:32:02.200 | with Stalin saying there's no way you could have
00:32:04.120 | predicted that in the early 20s.
00:32:07.000 | Is that always the case?
00:32:08.000 | You basically can't predict.
00:32:09.280 | - It's pretty much always the case.
00:32:11.080 | In other words, I mean, history is a wonderful,
00:32:14.960 | you know, discipline and way of looking at life
00:32:17.520 | and the world in retrospect, meaning it happened.
00:32:21.920 | It happened and we know it happened.
00:32:25.000 | And it's too easy to say sometimes it happened
00:32:28.640 | because it had to happen that way.
00:32:30.720 | It almost never has to happen that way.
00:32:33.840 | And, you know, things, so I very much,
00:32:39.660 | I'm of the school that emphasizes, you know,
00:32:43.820 | contingency and choice and difference and different paths
00:32:48.820 | and not, you know, not necessarily a path
00:32:52.060 | that has to be followed.
00:32:54.880 | And those, you know, and, you know,
00:32:59.900 | sometimes you can warn about things.
00:33:02.900 | I mean, you can think, well, something's going to happen.
00:33:06.100 | And usually the way it works,
00:33:08.420 | let me just give you one example.
00:33:09.620 | I mean, I'm thinking about an example right now,
00:33:11.580 | which was the war in Yugoslavia, you know,
00:33:13.300 | which came in the 1990s
00:33:15.020 | and eventually eventuated in genocide in Bosnia.
00:33:18.400 | And, you know, I remember very clearly, you know,
00:33:23.220 | the 1970s and 1980s in Yugoslavia,
00:33:25.860 | and people would say, you know, there's trouble here.
00:33:28.300 | And, you know, something could go wrong,
00:33:31.280 | but no one in their wildest imagination
00:33:33.920 | thought that there would be outright war between them all.
00:33:36.900 | Then the outright war happened, genocide happened,
00:33:39.300 | and afterwards people would say, I saw it coming.
00:33:41.960 | You know, so you get a lot of that,
00:33:44.620 | especially with pundits and journalists and that.
00:33:49.180 | I saw it coming, I knew it was happening.
00:33:51.060 | You know, well, I mean, what happens in the human mind
00:33:53.620 | and it happens in your mind too is, you know,
00:33:55.660 | you go through a lot of alternatives.
00:33:58.180 | I mean, think about January 6th, you know, in this country
00:34:00.460 | and all the different alternatives
00:34:02.180 | which people had in their mind
00:34:05.020 | or before January 6th, you know, after the lost election.
00:34:10.020 | You know, things could have gone in lots of different ways
00:34:13.180 | and there were all kinds of people
00:34:14.580 | choosing different ways it could have gone,
00:34:16.300 | but nobody really knew how it was gonna turn out.
00:34:20.020 | It wasn't that smart people really understood
00:34:22.000 | that there'd be this cockamamie uprising on January 6th,
00:34:25.420 | you know, that almost, you know, caused this enormous grief.
00:34:28.980 | So all of these kinds of things in history, you know,
00:34:32.020 | are deeply contingent.
00:34:33.620 | They depend on, you know, factors that we cannot predict
00:34:37.300 | and, you know, and it's the joy of history that it's open.
00:34:41.380 | You know, you think about how people are now,
00:34:43.500 | I mean, let me give you one more example
00:34:45.020 | and then I'll shut up,
00:34:46.220 | but, you know, there's the environmental example.
00:34:49.700 | You know, we're all threatened, right?
00:34:51.020 | We know it's coming, we know there's trouble, right?
00:34:54.440 | We know there's gonna be a catastrophe some point,
00:34:58.260 | but when?
00:34:59.260 | What's the catastrophe?
00:35:00.980 | - Yeah, what's the nature of the catastrophe?
00:35:02.380 | Everyone says catastrophe. - And what's the nature of it?
00:35:03.700 | Right, right, right. - Is it gonna be wars
00:35:05.160 | because of resource constraint?
00:35:06.380 | Is it going to be hunger?
00:35:07.380 | Is it gonna be like mass migration of different kinds
00:35:10.780 | that leads to some kind of conflict and immigration?
00:35:13.260 | And maybe it won't be that big of a deal
00:35:16.320 | and a total other catastrophic event
00:35:19.020 | will completely challenge the entirety
00:35:21.220 | of the human civilization.
00:35:22.220 | - That's my point, that's my point, that's my point.
00:35:25.120 | You know, we really don't know.
00:35:28.100 | I mean, there's a lot we do know.
00:35:29.700 | I mean, the warming business and all this kind of stuff,
00:35:32.060 | you know, it's scientifically there,
00:35:34.260 | but how it's going to play out,
00:35:36.540 | and everybody's saying different things,
00:35:39.660 | and then you get somewhere in 50 years or 60 years,
00:35:42.980 | which I won't see, and people say,
00:35:44.820 | "Aha, I told you it was gonna be X
00:35:46.980 | "or it was gonna be Y or it was gonna be Z."
00:35:49.420 | So I just don't think in history you can,
00:35:52.080 | well, you can't predict.
00:35:56.360 | You simply cannot predict what's going to happen.
00:35:59.020 | It's kind of when you just look at Hitler in the '30s,
00:36:02.260 | for me, oftentimes when I kind of read different accounts,
00:36:07.140 | it is so often, certainly in the press,
00:36:09.100 | but in general, me just reading about Hitler,
00:36:11.880 | I get the sense like, this is a clown.
00:36:15.020 | There's no way this person will gain power.
00:36:18.420 | - Which one, Hitler or Stalin?
00:36:19.980 | - Hitler, Hitler, Hitler.
00:36:21.300 | No, no, no, with Stalin, you don't get a sense he's a clown.
00:36:24.260 | He's a really good executive.
00:36:25.940 | You don't think it would lead to mass murder,
00:36:28.540 | but you think he's going to build a giant bureaucracy,
00:36:32.440 | at least.
00:36:33.340 | With Hitler, it's like a failed artist
00:36:37.260 | who keeps screaming about stuff.
00:36:39.500 | There's no way he's gonna, I mean,
00:36:42.380 | you certainly don't think about the atrocities,
00:36:44.820 | but there's no way he's going to gain power,
00:36:47.580 | especially against communism.
00:36:48.860 | There's so many other competing forces
00:36:50.620 | that could have easily beat him.
00:36:54.260 | But then you realize event after event
00:36:58.940 | where this clown keeps dancing,
00:37:00.420 | and all of a sudden he gains more and more power,
00:37:02.420 | and just certain moments in time,
00:37:04.980 | he makes strategic decisions in terms of cooperating
00:37:09.980 | or gaining power over the military,
00:37:13.220 | all those kinds of things,
00:37:14.500 | that eventually give him the power.
00:37:17.340 | I mean, this clown is one of the most impactful,
00:37:21.580 | in a negative sense, human beings in history.
00:37:25.260 | Right?
00:37:26.180 | And even the Jews who are there
00:37:28.020 | and are being screamed at and discriminated against,
00:37:30.580 | and there's a series of measures taken against them
00:37:33.860 | incrementally during the course of the 1930s,
00:37:36.460 | and very few who leave.
00:37:39.060 | Yeah.
00:37:39.900 | I mean, some pick up and go and say,
00:37:40.740 | I'm getting the hell out of here.
00:37:42.340 | And some try to leave too
00:37:45.380 | and go to the United States and stuff,
00:37:46.860 | and some Zionists go to Israel, Palestine at the time,
00:37:51.660 | or to Britain or France.
00:37:55.580 | But in general, even the Jews
00:37:58.540 | who should have been very sensitive to what was going on,
00:38:01.540 | didn't really understand the extent of the danger.
00:38:06.140 | And it's really hard for people to do that.
00:38:08.940 | It's almost impossible, in fact, I think.
00:38:12.100 | So most of the time in that exact situation,
00:38:16.080 | nothing would have happened,
00:38:18.360 | or there'd be some drama and so on,
00:38:20.240 | and he'd be there, some bureaucrat.
00:38:22.240 | But every once in a while in human history,
00:38:23.840 | there's a kind of turn,
00:38:25.880 | and maybe something catalyzes something else
00:38:28.720 | and just it accelerates, it accelerates,
00:38:31.080 | escalates, escalates, and then war breaks out
00:38:34.040 | or totally, you know, revolutions break out.
00:38:37.800 | Right.
00:38:38.640 | Can we go to the big question of genocide?
00:38:43.560 | What is genocide?
00:38:44.880 | What are the defining characteristics of genocide?
00:38:47.440 | Dealing with genocide is a difficult thing
00:38:50.560 | when it comes to the definition.
00:38:52.300 | There is a definition, the December 1948 UN Convention
00:38:58.520 | on the Prevention and Punishment of Genocide
00:39:02.240 | is considered the sort of major document of definition
00:39:07.240 | and the definitional sense of genocide.
00:39:09.680 | And it emphasizes, you know,
00:39:12.040 | the intentional destruction, you know,
00:39:16.440 | of an ethnic, national, racial, or religious group.
00:39:21.440 | Those are the four groups again, comma as such.
00:39:26.680 | And what that means basically
00:39:27.920 | is destroying the group as a group.
00:39:31.920 | In other words, there's a kind of beauty in human diversity
00:39:36.760 | and different groups of people, you know,
00:39:39.600 | Estonians, you know, a tribe of Native Americans,
00:39:43.520 | South African tribes, you know, the Rohingya in Myanmar.
00:39:48.520 | There's a kind of beauty humanity recognizes
00:39:52.220 | in the distinctiveness of those groups.
00:39:54.600 | You know, this was a notion that emerges really
00:39:58.100 | with Romanticism after the French Revolution
00:40:00.960 | in the beginning of the 19th century with Herder mostly.
00:40:04.520 | And this beauty of these groups then, you know,
00:40:08.920 | is what is under attack in genocide.
00:40:12.560 | And it's with intent.
00:40:15.920 | You know, the idea is that it's intentional destruction.
00:40:19.880 | So this is a kind of, you know, analogy to first degree,
00:40:24.880 | second degree, and third degree murder, right?
00:40:27.160 | First degree murder, you know,
00:40:28.320 | you're out to kill this person and you plan it
00:40:31.240 | and you go out and you do it, right?
00:40:34.240 | That's intent, right?
00:40:36.120 | Manslaughter is not intent.
00:40:37.840 | You end up doing the same thing, but it's different.
00:40:41.000 | So, you know, the major person behind the definition
00:40:46.000 | is a man named Raphael Lemkin.
00:40:48.080 | I don't know if you heard his name or not,
00:40:50.400 | but he was a Polish Jewish jurist who came, you know,
00:40:55.200 | from Poland, came to the United States during the war
00:40:58.360 | and had been a kind of crusader for recognizing genocide.
00:41:05.780 | It's a word that he created by the way.
00:41:08.620 | And he coined the term in 1943
00:41:11.300 | and then published it in 1944 for the first time.
00:41:14.900 | "Genos" meaning people and "-cide" meaning killing, right?
00:41:18.060 | And so Lemkin then had this term
00:41:20.900 | and he pushed hard to have it recognized
00:41:23.380 | and it was in the UN convention.
00:41:24.940 | So that's the rough definition.
00:41:27.340 | The problem with it is the definition,
00:41:30.500 | the problems with the definition are several.
00:41:33.900 | You know, one of them is, is it just these four groups,
00:41:38.460 | you know, racial, religious, ethnic, or national?
00:41:42.340 | See, this comes right out of the war.
00:41:44.620 | And what's in people's minds in 1948 are Jews, Poles,
00:41:49.500 | Russians, Yugoslavs sometimes who were killed by the Nazis.
00:41:53.060 | That's what's in their mind.
00:41:54.580 | But there are other groups too, if you think about it,
00:41:56.980 | you know, who are killed, social groups or political groups.
00:42:01.420 | And that was not allowed in the convention,
00:42:05.260 | meaning for a lot of different reasons,
00:42:07.940 | the Soviets were primary among them.
00:42:10.780 | They didn't want other kinds of groups.
00:42:12.860 | Let's say Kulaks, for example, to be considered.
00:42:16.660 | That's a social group or peasants, which is a social group.
00:42:21.500 | So, or a political group.
00:42:23.140 | I mean, let's take a group, you know,
00:42:26.620 | communists killed groups of people,
00:42:30.340 | but non-communists also killed groups of people.
00:42:32.500 | In Indonesia in 1965, '66, they killed, you know,
00:42:36.260 | I don't know exactly,
00:42:37.100 | but roughly 600,000 Indonesian communists.
00:42:40.100 | Well, is that genocide or not?
00:42:42.500 | You know, and my point of view, it is genocide,
00:42:45.260 | although it's Indonesians killing Indonesians.
00:42:48.380 | And we have the same problem with the Cambodian genocide.
00:42:51.060 | I mean, we talk about a Cambodian genocide,
00:42:53.860 | but most of the people killed in the Cambodian genocide
00:42:56.940 | were other Cambodians.
00:42:58.620 | They give it the name,
00:42:59.660 | they're ready to recognize this genocide
00:43:02.060 | because they also killed some other peoples,
00:43:04.340 | meaning the Vietnamese, the Cham people, who are, you know,
00:43:08.780 | Muslim, a smaller Muslim people in the area, and a few others.
00:43:13.780 | So the question then becomes,
00:43:17.460 | well, does it have to be a different nationality
00:43:20.260 | or ethnic group or religious group for it to be genocide?
00:43:23.100 | And my answer is no.
00:43:24.740 | You know, you need to expand the definition.
00:43:26.420 | It's a little bit like with our constitution.
00:43:28.060 | We got a constitution,
00:43:29.860 | but we don't live in the end of the 18th century, right?
00:43:32.020 | We live in the 21st century.
00:43:33.620 | And so you have to update the constitution
00:43:36.620 | over the centuries.
00:43:38.140 | And similarly, the genocide convention needs updating too.
00:43:42.660 | So that's how I work with the definition.
00:43:45.100 | - So this is this invention.
00:43:47.180 | Was it an invention, this beautiful idea,
00:43:51.060 | romantic idea that there's groups of people
00:43:53.500 | and the group is united by some unique characteristics?
00:43:57.660 | That was an invention in human history, this idea?
00:44:01.580 | Not to see us as individuals?
00:44:02.420 | - Yes, in some senses it was.
00:44:05.180 | I mean, it's not, you know,
00:44:06.740 | there are things that are always constructed
00:44:09.380 | in one fashion or another,
00:44:10.660 | and the construction, you know,
00:44:13.220 | more or less represents the reality.
00:44:15.860 | And the reality is always much more complicated
00:44:18.540 | than the construction or the invention of a term
00:44:21.940 | or a concept or a way of thinking about a nation, right?
00:44:26.420 | And this way of thinking of nations, you know,
00:44:29.740 | as again, you know, groups of religious, linguistic,
00:44:34.740 | not political necessarily,
00:44:40.140 | but cultural entities is something
00:44:43.340 | that was essentially invented, yes.
00:44:45.500 | - Yes, I mean, you know, if you look at--
00:44:47.460 | - There are no Germans in the 17th century.
00:44:50.380 | There are no Italians in the 17th century, right?
00:44:52.740 | They're only there after, you know,
00:44:54.860 | the invention of the nation, which comes again,
00:44:59.340 | mostly out of the French Revolution
00:45:01.900 | and in the Romantic movement,
00:45:03.700 | a man named Johann Gottfried von Herder, right?
00:45:08.420 | Who was the, really the first one
00:45:10.380 | who sort of went around, collected people's languages
00:45:13.100 | and collected their sayings and their dances
00:45:15.660 | and their folkways and stuff and said,
00:45:17.740 | "Isn't this cool, you know, that they're Estonians
00:45:20.620 | and that they're Latvians and that they're these other,
00:45:23.140 | these interesting different peoples
00:45:25.900 | who don't even know necessarily
00:45:28.420 | that they're different peoples, right?"
00:45:30.620 | That comes a little bit later, right?
00:45:33.100 | Once the concept is invented,
00:45:35.020 | then people start to say, "Hey, we're nations too."
00:45:38.660 | You know, and the Germans decide they're a nation
00:45:40.820 | and they unify and the Italians discover they're a nation
00:45:43.900 | and they unify instead of being, you know,
00:45:46.260 | Florentines and Romans and, you know, Sicilians.
00:45:51.100 | - But then beyond nations, there's political affiliations,
00:45:55.340 | all those kinds of things.
00:45:56.180 | It's fascinating that, you know,
00:45:58.260 | you start, you look at the early Homo sapiens
00:46:01.580 | and then there's obviously tribes, right?
00:46:04.220 | And then that's very concrete,
00:46:06.660 | that's a geographic location,
00:46:08.500 | and it's a small group of people.
00:46:10.700 | And then you have warring tribes probably connected
00:46:12.940 | to just limited resources.
00:46:16.140 | But it's fascinating to think that that is then taken
00:46:18.580 | to the space of ideas,
00:46:20.780 | to where you can create a group at first
00:46:23.380 | to appreciate its beauty.
00:46:26.980 | You create a group based on language,
00:46:29.980 | based on maybe even, so political philosophical ideas,
00:46:34.300 | religious ideas, all those kinds of things.
00:46:36.340 | And then that naturally then leads
00:46:38.540 | to getting angry at groups.
00:46:41.460 | - Right.
00:46:42.300 | - And making them the other and then hatred.
00:46:43.900 | - Right.
00:46:44.740 | - And that comes more towards the end of the 19th century,
00:46:47.660 | you know, with the influence of Darwin.
00:46:50.100 | I mean, you can't blame Darwin for it,
00:46:51.940 | but Neo-Darwin, Darwinians, you know,
00:46:54.300 | who start to talk about, you know,
00:46:55.860 | the competition between nations,
00:46:58.100 | the natural competition, the weak ones fall away,
00:47:01.260 | the strong ones get ahead.
00:47:03.220 | You know, you get this sort of combination also
00:47:05.500 | with, you know, modern antisemitism
00:47:08.420 | and with racial thinking, you know,
00:47:09.980 | the racial thinking at the end of the 19th century
00:47:12.420 | is very powerful.
00:47:14.060 | So now, you know, at the end of the 19th century
00:47:16.820 | versus the beginning of the 19th, you know,
00:47:19.780 | the middle of the 19th century, you know,
00:47:22.580 | you can be a German and be a Jew
00:47:24.020 | and there's no contradiction.
00:47:26.220 | - Yeah.
00:47:27.060 | - As long as you speak the language and you, you know,
00:47:28.900 | you dress and think and act and share the culture.
00:47:32.420 | By the end of the 19th century, people are saying,
00:47:34.140 | no, no, you know, they're not Germans.
00:47:37.100 | They're Jews, they're different.
00:47:37.940 | They have different blood, they have different,
00:47:39.260 | they don't say genes yet, but you know,
00:47:41.220 | that sort of a sense of people.
00:47:43.740 | And that's when, you know,
00:47:45.420 | there's this sense of superiority too and inferiority.
00:47:49.220 | - Yeah.
00:47:50.060 | - You know, that they're inferior to us.
00:47:51.700 | - Yeah.
00:47:52.540 | - You know, and that we're the strong ones
00:47:55.380 | and we have to, you know, and Hitler, by the way,
00:47:57.460 | just adopts this hook, line, and sinker.
00:48:00.260 | I mean, there are a whole series of thinkers
00:48:03.020 | at the end of the 19th and beginning of 20th century
00:48:05.100 | who he cites in Mein Kampf, you know,
00:48:06.900 | which is written in the early 1920s,
00:48:09.300 | that, you know, basically pervades this racial thinking.
00:48:14.300 | So nationalism changes.
00:48:16.740 | So nationalism in and of itself is not bad.
00:48:19.820 | I mean, it's not bad, you know, to share culture
00:48:22.620 | and language and, you know, folkways
00:48:26.140 | and a sense of common belonging.
00:48:29.900 | There's nothing bad about it inherently.
00:48:32.740 | But then what happens is it becomes, you know,
00:48:35.500 | frequently is used and becomes, especially on fascism,
00:48:39.580 | becomes dangerous.
00:48:42.340 | - And then it's especially dangerous
00:48:44.540 | when the two conflicting groups share geographic location.
00:48:48.940 | - That's right.
00:48:49.780 | - So like with Jews, you know, I come, you know,
00:48:54.220 | I'm a Russian Jew and it's always interesting.
00:48:58.740 | I take pride in, you know, I love the tradition
00:49:04.740 | of the Soviet Union, of Russia.
00:49:07.500 | I love America.
00:49:08.540 | So I love these countries.
00:49:10.140 | They have a beautiful tradition in literature and science
00:49:13.340 | and art and all those kinds of things.
00:49:15.380 | But it's funny that people, not often,
00:49:19.300 | but sometimes correct me that I'm not Russian.
00:49:24.060 | I'm a Jew.
00:49:24.900 | And it's a nice reminder.
00:49:30.060 | - Yes.
00:49:31.060 | - That that is always there.
00:49:34.280 | That desire to create these groups.
00:49:37.320 | And then when they're living in the same place
00:49:39.840 | that division between groups,
00:49:42.160 | that hate between groups can explode.
00:49:45.080 | And I just, I wonder why is that there?
00:49:49.600 | Why does the human heart tend so easily
00:49:53.480 | towards this kind of hate?
00:49:55.600 | - You know, that's a big question in and of itself.
00:50:02.640 | You know, the human heart is full of everything, right?
00:50:04.880 | It's full of hate, it's full of love,
00:50:06.720 | it's full of indifference, it's full of apathy,
00:50:09.760 | it's full of energy.
00:50:10.760 | So, I mean, hate is something, you know, that,
00:50:14.980 | I mean, I think, and, you know, along with hate,
00:50:22.720 | you know, the ability to really hurt and injure people
00:50:26.320 | is something that's within all of us.
00:50:28.320 | You know, it's within all of us.
00:50:30.400 | And it's just something that's part of who we are
00:50:33.480 | and part of our society.
00:50:37.920 | So, you know, we're shaped by our society
00:50:40.040 | and our society can do with us often what it wishes.
00:50:43.420 | You know, that's why it's so much nicer
00:50:46.440 | to live in a more or less beneficent society
00:50:49.840 | like that of a democracy in the West
00:50:52.560 | than to live in the Soviet Union, right?
00:50:55.120 | I mean, because, you know, you have more or less
00:50:59.080 | the freedom to do what you wish
00:51:01.840 | and not to be forced into situations
00:51:04.440 | in which you would then have to do nasty things to other people.
00:51:08.660 | Some societies, as we talked about, you know,
00:51:12.760 | are more, have proclivities towards, you know,
00:51:17.000 | asking their people to do things they don't want to do
00:51:20.960 | and forcing them to do so.
00:51:24.440 | So, you know, freedom is a wonderful thing;
00:51:27.880 | to be able to choose not to do evil is a great thing.
00:51:30.920 | You know, whereas in some societies, you know,
00:51:33.940 | you feel in some ways for, not so much for the NKVD bosses,
00:51:38.940 | but for the guys on the ground, you know, in the 1930s
00:51:42.000 | or not so much for the Nazi bosses,
00:51:44.880 | but for the guys, you know, in the police battalion
00:51:49.120 | that were told go shoot those Jews, you know?
00:51:53.800 | And you do it, not necessarily
00:51:57.280 | because they force you to do it,
00:51:59.680 | but because your social, you know, your social situation,
00:52:04.100 | you know, encourages you to,
00:52:07.640 | and you don't have the courage not to.
00:52:10.000 | - Yeah, I was just, as I often do,
00:52:12.560 | rereading Viktor Frankl's "Man's Search for Meaning,"
00:52:15.960 | and he said something, I just,
00:52:19.160 | I often pull out sort of lines.
00:52:22.360 | "The mere knowledge that a man was either a camp guard
00:52:25.160 | or a prisoner tells us almost nothing.
00:52:28.320 | Human kindness can be found in all groups,
00:52:31.160 | even those which as a whole, it would be easy to condemn."
00:52:36.160 | So that's speaking to, you feel for those people
00:52:41.200 | at the lowest level implementing the orders
00:52:45.840 | of those above.
00:52:49.800 | - Right.
00:52:51.760 | - And also you worry yourself,
00:52:53.400 | what will happen if you were given those same orders?
00:52:56.320 | You know, I mean, what would you do?
00:52:59.040 | You know, what kind of reaction would you have
00:53:01.720 | in this similar situation?
00:53:03.360 | And, you know, you don't know.
00:53:05.900 | - I could see myself in World War II
00:53:10.360 | fighting for almost any country that I was born in.
00:53:15.360 | There's a love of community.
00:53:18.800 | There's a love of country that's just,
00:53:21.000 | at least to me, it comes naturally.
00:53:23.160 | Just love of community, and a country is one such community.
00:53:27.240 | And I could see fighting for that country,
00:53:29.480 | especially when you're sold a story
00:53:32.320 | that you're fighting evil,
00:53:33.640 | and I'm sure every single country
00:53:35.320 | was sold that story effectively.
00:53:38.520 | And then when you're in the military
00:53:41.280 | and you have a gun in your hand
00:53:42.360 | or you're in the police force and you're ordered,
00:53:47.600 | go to this place and commit violence,
00:53:52.600 | it's hard to know what you would do.
00:53:55.400 | It's a mix of fear.
00:53:56.520 | It's a mix of, maybe you convince yourself,
00:54:00.700 | you know, what can one person really do?
00:54:03.480 | And over time, it's again, that slippery slope.
00:54:05.840 | 'Cause you could see all the people who protest,
00:54:09.000 | who revolt, they're ineffective.
00:54:12.600 | So like, if you actually want to practically help somehow,
00:54:17.440 | you're going to convince yourself
00:54:18.560 | that you can't, one person can't possibly help.
00:54:21.060 | And then you have a family, so you want to make,
00:54:24.120 | you know, you want to protect your family.
00:54:25.480 | You tell all of these stories, and over time,
00:54:28.000 | you naturally convince yourself to dehumanize the other.
00:54:32.560 | Yeah, I think about this a lot,
00:54:35.200 | mostly because I worry that I wouldn't be a good German.
00:54:41.520 | - Yeah, no, no, that's right, that's right.
00:54:44.600 | And one of the, you know, one of my tasks as a teacher,
00:54:48.440 | right, of our students, and I have, you know,
00:54:52.080 | classes on genocide, I have one now,
00:54:54.420 | and another one, by the way, on Stalin.
00:54:57.820 | But the one on genocide, you know,
00:55:01.120 | one of my tasks is to try to get the students
00:55:05.320 | to understand this is not about weird people
00:55:07.200 | who live far away in time and in place,
00:55:10.940 | but it's about them, you know?
00:55:12.600 | And that, you know, that's a hard lesson,
00:55:15.280 | but it's an important one, you know,
00:55:17.080 | that this is in all of us, you know, it's in all of us.
00:55:20.520 | And there's nothing, you know,
00:55:22.760 | and you just try to gird yourself up, you know,
00:55:25.600 | to try to figure out ways that maybe you won't be complicit,
00:55:29.420 | and that you learn how to stand by your principles,
00:55:33.600 | but it's very hard, it's extremely difficult.
00:55:36.160 | And you can't, the other interesting thing about it
00:55:38.680 | is it's not predictable.
00:55:40.320 | Now, they've done a lot of studies of Poles, for example,
00:55:43.360 | who during the war saved Jews, you know?
00:55:45.840 | Well, who are the Poles who saved Jews
00:55:47.800 | versus those who turned them in?
00:55:50.060 | It's completely unpredictable.
00:55:51.780 | You know, sometimes it's the worst anti-Semites
00:55:53.820 | who protect them because they don't believe
00:55:55.400 | they should be killed, right?
00:55:57.520 | And sometimes, you know, it's not predictable.
00:56:01.160 | It's not as if the humanists among us, you know,
00:56:04.280 | are the ones who, you know, consistently show up, you know,
00:56:10.280 | and experience danger, in other words,
00:56:14.040 | and are ready to take on danger
00:56:15.720 | to defend, you know, your fellow human beings.
00:56:18.920 | Not necessarily.
00:56:19.840 | I mean, sometimes simple people do it,
00:56:21.560 | and sometimes they do it for really simple reasons.
00:56:24.800 | And sometimes people you would expect to do it don't,
00:56:28.540 | you know, and you've got that mix,
00:56:31.080 | and it's just not predictable.
00:56:32.580 | - One thing I've learned in this age of social media
00:56:37.280 | is it feels like the people with integrity
00:56:39.840 | and the ones who would do the right thing
00:56:41.940 | are the quiet ones.
00:56:43.240 | In terms of humanists, in terms of activists,
00:56:46.880 | there are so many points to be gained
00:56:49.040 | by declaring that you would do the right thing.
00:56:52.700 | It's the simple, quiet folks.
00:57:01.320 | Because I've seen, on an
00:57:03.520 | obviously much smaller scale,
00:57:03.520 | just shows of integrity and character
00:57:05.920 | when there were sacrifices to be made
00:57:07.360 | and it was done quietly.
00:57:09.400 | Now, these sorts of small heroes,
00:57:11.640 | those are, you're right, it's surprising,
00:57:15.640 | but they're often quiet.
00:57:17.240 | That's why I'm distrustful of people
00:57:18.920 | who kind of proclaim that they would do the right thing.
00:57:21.720 | - And there are different kinds of integrity, too.
00:57:25.520 | I mean, I edited a memoir of a Polish,
00:57:30.460 | you know, underground fighter,
00:57:34.960 | member of the underground who was in Majdanek,
00:57:37.200 | in the concentration camp of Majdanek,
00:57:38.920 | you know, and it was just an interesting mix
00:57:41.200 | of different kinds of integrity.
00:57:43.560 | You know, on the one hand,
00:57:45.200 | you know, it really bothered him deeply
00:57:50.360 | when Jews were killed or sent to camp
00:57:52.860 | or that sort of thing.
00:57:53.700 | On the other hand, he was something of an anti-Semite.
00:57:56.840 | You know, he would, you know,
00:57:59.700 | sometimes if Jews were his friends, he would help them.
00:58:02.840 | And if they weren't, sometimes he was really mean to them.
00:58:06.360 | You know, and you could, in their various levels,
00:58:08.200 | you know, a concentration camp is, you know,
00:58:11.480 | a terrible social experiment in some ways, right?
00:58:15.540 | But you learn a lot from how people behave.
00:58:19.960 | And what you see is that, you know,
00:58:21.320 | people behave sometimes extraordinarily well
00:58:24.040 | in some situations and extraordinarily poorly in others.
00:58:26.700 | And it's mixed and you can't predict it.
00:58:28.760 | And it's hard to find consistency.
00:58:32.420 | I mean, that's the other thing.
00:58:33.480 | It's, you know, I think we claim too much consistency
00:58:37.160 | for the people we study
00:58:38.440 | and the people we think about in the past.
00:58:40.400 | You know, they're not consistent any more than we are,
00:58:42.640 | consistent, right?
00:58:43.620 | - Well, let me ask you about human nature here
00:58:46.920 | on both sides.
00:58:48.040 | So first, what have you learned about human nature
00:58:53.040 | from studying genocide?
00:58:54.720 | Why do humans commit genocide?
00:58:56.840 | What lessons, first of all, why is a difficult question,
00:59:01.560 | but what insights do you have into humans
00:59:04.840 | that genocide is something that happens in the world?
00:59:07.800 | - That's a really big and difficult question, right?
00:59:10.360 | And it has to be parsed, I think,
00:59:13.880 | into different kinds of questions.
00:59:16.400 | You know, why does genocide happen?
00:59:18.760 | You know, which the answer there is frequently political,
00:59:24.380 | meaning, you know, why Hitler ended up killing the Jews.
00:59:28.960 | Well, it had a lot to do with the political history
00:59:32.020 | of Germany and wartime history of Germany, right?
00:59:35.360 | In the '30s.
00:59:37.640 | And, you know, it's traceable to then.
00:59:40.480 | No, like you mentioned it yourself.
00:59:42.960 | You can't imagine Hitler in the mid '20s
00:59:46.040 | turning into anything of the kind of dictator he ended up
00:59:50.440 | being and the kind of murderer,
00:59:53.160 | mass murderer he ended up being.
00:59:55.520 | So, and the same thing goes, by the way,
00:59:58.880 | for Stalin and Soviet Union and Pol Pot.
01:00:01.280 | I mean, these are all essentially political movements
01:00:04.800 | where the polity, state is seized, you know,
01:00:08.160 | by an ideological or, you know, party,
01:00:12.960 | single party movement, and then is moved in directions
01:00:15.800 | where mass killing takes place.
01:00:18.520 | The other question, let's separate that question out.
01:00:22.620 | The other question is why do ordinary people participate?
01:00:26.780 | Because the fact of the matter is,
01:00:31.060 | just ordering genocide is not enough.
01:00:33.780 | Just saying, you know, go get them is not enough.
01:00:36.620 | There have to be people who will cooperate
01:00:39.100 | and who will do their jobs, you know,
01:00:41.420 | both at the kind of mezzo level,
01:00:43.060 | the middle level of a bureaucracy,
01:00:45.340 | but also at the everyday level.
01:00:47.060 | You know, people who have to pull the triggers
01:00:48.700 | and that kind of thing, and, you know,
01:00:50.380 | force people into the gas chamber.
01:00:52.220 | And grab people, you know, in Kiev in September, 1941,
01:00:56.740 | at Babi Yar and push them, you know,
01:00:59.100 | towards the ravine where the machine gunners
01:01:02.020 | are gonna shoot them down.
01:01:03.900 | You know, and those are all sorts of different questions.
01:01:06.740 | The question of, you know, especially the lower level,
01:01:11.580 | people who actually do the killing,
01:01:13.820 | is a question which I think we've been talking about,
01:01:15.980 | which is that within all of us, you know,
01:01:19.620 | is the capability of being murderers.
01:01:21.980 | And mass murderers, I mean,
01:01:23.140 | to participate in mass murder.
01:01:25.460 | I won't call them laws of social psychology,
01:01:27.860 | but the character of social psychology.
01:01:31.220 | You know, we will do it in most cases.
01:01:33.380 | I mean, one of the shocking things that I learned
01:01:35.700 | just a few years ago, studying the Holocaust,
01:01:38.980 | is that you could pull out.
01:01:41.860 | In other words, if they order a police battalion
01:01:44.340 | to go shoot Jews, you didn't have to do it.
01:01:47.660 | You could pull out.
01:01:49.660 | They weren't gonna, they never killed anybody.
01:01:51.500 | They never executed anybody.
01:01:53.260 | They never even punished people for saying,
01:01:54.860 | no, I'm not gonna do that.
01:01:56.420 | So people are doing it voluntarily.
01:01:59.540 | They may not want to do it.
01:02:01.980 | You know, they give them booze to try to, you know,
01:02:04.620 | numb the pain of murder, 'cause they know there is pain.
01:02:09.100 | I mean, people experience pain when they murder people.
01:02:12.500 | But they don't pull out.
01:02:14.700 | And so it's the character of who we are in society,
01:02:17.620 | in groups, and we're very, very influenced.
01:02:21.060 | I mean, we're highly influenced
01:02:22.700 | by the groups in which we operate,
01:02:25.620 | and who we talk to, and who our friends are
01:02:30.020 | within that group, and who is the head of the group.
01:02:32.860 | And I mean, you see this even,
01:02:34.740 | I mean, you see it in any group,
01:02:36.460 | whether it's in the academy, right, at Stanford,
01:02:39.680 | or whether it's in a labor union,
01:02:42.340 | or whether it's in a church group in Tennessee,
01:02:45.020 | or wherever, you know, people pay attention to each other,
01:02:49.220 | and they are unwilling frequently to say,
01:02:53.380 | "No, this is wrong."
01:02:56.260 | Even though all of you think it's right, it's wrong.
01:02:58.260 | I mean, they just don't do that usually,
01:03:00.900 | especially in societies that are authoritarian,
01:03:05.900 | or totalitarian, right?
01:03:08.100 | Because it's harder, 'cause there's a backup to it, right?
01:03:10.940 | There's the NKVD there, or there's the Gestapo there,
01:03:13.380 | and there are other people there.
01:03:14.780 | So you just, you know, they may not be forcing you to do it,
01:03:18.900 | but your social being, plus this danger in the distance,
01:03:23.900 | you know, you do it.
01:03:28.460 | - But then if you go up the hierarchy,
01:03:31.260 | at the very top, there's a dictator,
01:03:33.460 | presumably, you know, you go to like middle management,
01:03:36.340 | to the bureaucracy.
01:03:37.520 | The higher you get up there,
01:03:41.140 | the more power you have to change
01:03:43.740 | the direction of the Titanic.
01:03:45.540 | - Right, right, right.
01:03:46.960 | - But nobody seems to do it, right?
01:03:49.580 | - Or what happens, and it does happen,
01:03:52.580 | it happens in the German army,
01:03:54.620 | I mean, it happens in the case of the Armenian genocide,
01:03:57.660 | where we know there are governors who said,
01:03:59.160 | "No, I'm not gonna kill Armenians,
01:04:01.380 | "what kind of business is this?"
01:04:02.460 | They're just removed.
01:04:03.660 | They're removed, and you find a replacement very easily.
01:04:07.900 | So, you know, you do see people who stand up,
01:04:10.580 | and again, it's not really predictable who it will be.
01:04:13.700 | I would maintain, I mean, I haven't done the study
01:04:15.980 | of the Armenian governors who said no,
01:04:19.220 | I mean, the Turkish governors who said no
01:04:21.140 | to the Armenian genocide, but you know,
01:04:24.500 | there are people who do step aside every once in a while,
01:04:30.040 | in the middle level, and again,
01:04:31.420 | there are German generals who say,
01:04:32.520 | "Wait a minute, what is this business in Poland
01:04:34.340 | "when they start to kill Jews, or in Belarusia?"
01:04:37.920 | And you know, they're just pushed aside.
01:04:40.420 | You know, if they don't do their job, they're pushed aside.
01:04:42.980 | Or they end up doing it,
01:04:44.060 | and they usually do end up doing it.
01:04:45.900 | - What about on the victim side?
01:04:49.700 | So I mentioned "Man's Search for Meaning."
01:04:51.700 | What can we learn about human nature,
01:04:55.340 | the human mind, from the victims of genocide?
01:04:59.620 | So Viktor Frankl talked about the ability
01:05:02.340 | to discover meaning and beauty, even in suffering.
01:05:05.980 | Is there something to be said about, you know,
01:05:10.140 | in your studying of genocide
01:05:11.700 | that you've learned about human nature?
01:05:14.660 | - Well, again, I don't, I have to say,
01:05:19.420 | I come out of the study of genocide
01:05:21.140 | with a very pessimistic view of human nature,
01:05:24.340 | a very pessimistic view.
01:05:25.740 | - Even on the victim side?
01:05:26.900 | - Even on the victim side.
01:05:28.940 | I mean, the victims will eat their children, right?
01:05:33.620 | In the Ukrainian case, they have no choice.
01:05:36.080 | You know, the victims will rob each other.
01:05:38.740 | The victims will form hierarchies within victimhood.
01:05:43.740 | So you see, let me give you an example.
01:05:46.740 | Again, I told you I was working on Majdanek.
01:05:49.940 | And there's, in Majdanek, at a certain point in '42,
01:05:55.860 | a group of Slovak Jews were arrested and sent to Majdanek.
01:06:05.980 | Those Slovak Jews were a group,
01:06:08.420 | somehow, I mean, they stuck together.
01:06:10.260 | They were very competent.
01:06:11.460 | Many of them were businessmen.
01:06:14.900 | They knew each other.
01:06:16.460 | And for a variety of different reasons within the camp,
01:06:19.900 | and again, this shows you the diversity of the camps,
01:06:22.340 | and also, you know, these images of black and white
01:06:24.580 | in the camps are not very useful.
01:06:26.780 | They ruled the camp.
01:06:29.060 | I mean, they basically had all the important jobs
01:06:31.160 | in the camp, including jobs like beating other Jews.
01:06:35.780 | And persecuting other Jews, and persecuting other peoples,
01:06:40.620 | which they did.
01:06:43.540 | And this Polish guy who I mentioned to you,
01:06:46.340 | who wrote this memoir, hated them
01:06:48.740 | because of what they were doing to the Poles, right?
01:06:53.740 | And he's incensed,
01:06:57.520 | because aren't these supposed to be the intervention,
01:07:01.180 | he says, and look what they're doing.
01:07:02.680 | They're treating us, you know, like dirt.
01:07:06.200 | And they do, they treat them like dirt.
01:07:08.780 | So, you know, in this kind of work on Majdanek,
01:07:11.440 | there's certainly parts of it that, you know,
01:07:16.000 | were inspiring.
01:07:17.040 | You know, people helping each other.
01:07:20.840 | People trying to feed each other.
01:07:23.080 | People giving warmth to each other.
01:07:24.920 | You know, there's some very heroic Polish women
01:07:30.400 | who end up having a radio show called Radio Majdanek,
01:07:33.760 | which they put on every night in the women's camp.
01:07:36.440 | Which is, you know, to raise people's spirits.
01:07:39.400 | And they, you know, sing songs,
01:07:41.400 | and do all this kind of stuff, you know,
01:07:43.040 | to try to keep themselves from, you know,
01:07:47.940 | the horrors that they're experiencing around them.
01:07:51.080 | And so you do see that, and you do see, you know,
01:07:54.680 | human beings acting in support of each other.
01:08:00.440 | But, you know, I mean, Primo Levi is one of my
01:08:05.440 | favorite writers about the Holocaust,
01:08:08.800 | and about the camps.
01:08:10.360 | And, you know, I don't think Primo Levi saw anything.
01:08:14.320 | You know, I mean, he had pals, you know,
01:08:17.400 | who he helped, and who helped him.
01:08:20.080 | I mean, but he describes this kind of, you know,
01:08:25.080 | terrible inhuman environment,
01:08:27.520 | which no one can escape, really.
01:08:29.440 | No one can escape.
01:08:30.440 | He ends up committing suicide, too,
01:08:31.920 | I think because of his sense of,
01:08:35.960 | we don't know exactly why,
01:08:37.480 | but probably because of his sense of what happened
01:08:40.820 | in the camp.
01:08:41.660 | I mean, later he goes back to Italy,
01:08:42.880 | becomes a writer, that sort of thing.
01:08:44.600 | So I don't, especially in the concentration camps,
01:08:49.360 | it's really hard to find places, like Viktor Frankl did,
01:08:52.960 | where you can say, you know,
01:08:54.440 | I am moved in a positive way,
01:08:59.440 | you know, by what happened.
01:09:02.640 | There were cases, there's no question.
01:09:04.280 | People hung together, they tried to help each other,
01:09:06.360 | but, you know, they were totally, totally caught
01:09:11.360 | in this web of genocide.
01:09:15.760 | - See, so there are stories,
01:09:17.520 | but the thing is, I have this sense,
01:09:19.960 | maybe it's a hope, that within most,
01:09:22.900 | if not every human heart,
01:09:24.440 | there's a kind of like flame of compassion
01:09:29.440 | and kindness and love that waits,
01:09:34.840 | that longs to connect with others,
01:09:37.280 | that ultimately, en masse, overpowers everything else.
01:09:41.040 | If you just look at the story of human history,
01:09:43.400 | the resistance to violence and mass murder and genocide
01:09:50.500 | feels like a force that's there.
01:09:52.900 | And it feels like a force that's more powerful
01:09:57.200 | than whatever the dark momentum that leads to genocide is.
01:10:02.200 | It feels like that's more powerful.
01:10:07.720 | It's just quiet.
01:10:08.720 | It's hard to tell the story of that little flame
01:10:10.720 | that burns within all of our hearts,
01:10:14.060 | that longing to connect to other human beings.
01:10:16.560 | And there's something also about human nature,
01:10:18.920 | and us as storytellers, that we're not very good
01:10:21.600 | at telling the stories of that little flame.
01:10:24.040 | We're much better at telling the stories of atrocities.
01:10:27.040 | - No, I think maybe I fundamentally disagree with you.
01:10:31.320 | I think maybe I fundamentally,
01:10:32.600 | I don't disagree that there is that flame.
01:10:35.260 | I just think it's just too easily doused.
01:10:38.920 | And I think it too easily goes out in a lot of people.
01:10:43.360 | And I mean, like I say, I come away from this work
01:10:49.080 | a pessimist.
01:10:50.840 | You know, there is this work by a Harvard psychologist,
01:10:54.640 | now I'm forgetting his name.
01:10:55.480 | - Steven Pinker.
01:10:56.560 | - Yes, yes, Steven Pinker that shows over time,
01:11:00.280 | and initially I was quite skeptical of the work,
01:11:03.420 | but in the end I thought he was quite convincing
01:11:05.820 | that over time the incidence of homicide goes down,
01:11:12.200 | the incidence of rape goes down,
01:11:14.880 | the incidence of genocide, except for the big blip,
01:11:18.880 | you know, in the middle of the 20th century goes down.
01:11:22.280 | Not markedly, but it goes down generally,
01:11:24.840 | that norms, international norms are changing
01:11:28.380 | how we think about this and stuff like that.
01:11:30.120 | I thought he was pretty convincing about that.
01:11:33.080 | But think about, you know, we're modern people.
01:11:37.240 | I mean, we've advanced so fast in so many different areas.
01:11:42.000 | I mean, we should have eliminated this a long time ago,
01:11:45.200 | a long time ago.
01:11:46.420 | You know, how is it that, you know,
01:11:50.880 | we're still facing this business of genocide in Myanmar,
01:11:54.400 | in Xinjiang, in, you know, Tigray, in Ethiopia,
01:11:59.400 | you know, the potentials of genocide there,
01:12:02.560 | and all over the world, you know,
01:12:04.280 | we still have this thing that we cannot handle,
01:12:08.320 | that we can't deal with.
01:12:10.240 | And, you know, again, you know, electric cars and planes
01:12:14.400 | that fly from here to, you know, Beijing.
01:12:16.840 | Think about the differences between 250 years ago
01:12:21.280 | or 300 years ago and today.
01:12:23.960 | But the differences in genocide are not all that great.
01:12:26.800 | I mean, the incidence has gone down.
01:12:28.720 | I think Pinker has demonstrated,
01:12:30.440 | I mean, there are problems with his methodology,
01:12:32.860 | but on the whole, I'm with him on that book.
01:12:35.460 | I thought in the end it was quite well done.
01:12:38.960 | So, you know, I do not,
01:12:43.640 | I have to say I'm not an optimist
01:12:46.720 | about what this human flame can do.
01:12:48.960 | And, you know, I once, someone once said to me,
01:12:52.880 | when I posed a similar kind of question to a seminar,
01:12:56.300 | a friend of mine at Berkeley once said,
01:12:57.960 | "Remember original sin, Norman?"
01:13:00.640 | Well, I don't, you know, that's very Catholic,
01:13:02.660 | and I don't really think in terms of original sin.
01:13:07.320 | But in some ways, you know,
01:13:09.080 | her point is we carry this with us.
01:13:11.600 | You know, we carry with us a really
01:13:15.000 | potentially nasty mean streak
01:13:20.220 | that can do harm to other people.
01:13:22.360 | - But we carry the capacity to love too.
01:13:24.640 | - Yes, we do.
01:13:25.560 | Yes, we do.
01:13:26.400 | That's part of the deal.
01:13:28.520 | - You have a bias in that you have studied
01:13:32.120 | some of the darker aspects of human nature
01:13:34.880 | and human history.
01:13:36.560 | So it is difficult from the trenches,
01:13:41.000 | from the muck to see a possible sort of way out through love.
01:13:46.000 | But it's not obvious that that's not the case.
01:13:50.500 | You mentioned electric cars and rockets and airplanes.
01:13:54.800 | To me, the more powerful thing is Wikipedia, the internet.
01:13:58.840 | Only 50% of the world currently has access to the internet,
01:14:02.400 | but that's growing in information and knowledge and wisdom,
01:14:05.900 | especially among women in the world.
01:14:08.200 | As that grows, I think, if love wins,
01:14:12.680 | it becomes a lot more difficult.
01:14:13.840 | It becomes a lot more difficult for somebody like Hitler
01:14:16.120 | to take power, for genocide to occur,
01:14:18.400 | because people think, and the masses, I think,
01:14:22.360 | the people have power when they're able to think,
01:14:26.160 | when they can see the full kind of...
01:14:28.160 | First of all, when they can study your work,
01:14:34.040 | they can know about the fact that genocide happens,
01:14:36.340 | how it occurs, how the promises of great charismatic leaders
01:14:40.500 | lead to great destructive mass genocide.
01:14:44.820 | And just even studying the fact that the Holocaust happened
01:14:48.940 | for a large number of people
01:14:50.860 | is a powerful preventer of future genocide.
01:14:55.100 | Like one of the lessons of history
01:14:57.620 | is just knowing that this can happen,
01:14:59.980 | learning how it happens,
01:15:01.580 | that normal human beings, leaders that give big promises
01:15:06.580 | can also become evil and destructive.
01:15:09.560 | The fact, knowing that that can happen
01:15:12.020 | is a powerful preventer of that.
01:15:13.900 | And then you kind of wake up from this haze
01:15:16.860 | of believing everything you hear,
01:15:19.800 | and you learn to just, in your small, local way,
01:15:24.800 | to put more love out there in the world.
01:15:28.960 | I believe it's not too good.
01:15:31.460 | So to push back, it's not so obvious to me
01:15:35.300 | that you're right. I think, in the end, love wins.
01:15:40.040 | That's my intuition.
01:15:40.880 | If I had to bet money on it,
01:15:42.660 | I have a sense that this genocide thing
01:15:46.620 | is more and more going to be an artifact of the past.
01:15:51.240 | - Well, I certainly hope you're right.
01:15:53.100 | I mean, I certainly hope you're right.
01:15:54.500 | And it could be you are, we don't know.
01:16:00.140 | But the evidence is different.
01:16:04.020 | The evidence is different.
01:16:05.260 | And the capacity of human beings to do evil
01:16:10.260 | to other human beings is repeatedly demonstrated.
01:16:14.920 | Whether it's in massacres in Mexico,
01:16:20.140 | or ISIS and the Yazidi Kurds,
01:16:25.140 | or you can just go on and on.
01:16:27.660 | Syria, I mean, look what,
01:16:29.420 | I mean, Syria used to be a country, you know?
01:16:31.740 | And now it's been a mass grave,
01:16:36.180 | and people then have left in the millions,
01:16:38.940 | you know, for other places.
01:16:41.100 | And I'm not saying,
01:16:43.140 | you know, I'm not saying,
01:16:46.580 | I mean, the Turks have done nice things for the Syrians,
01:16:48.980 | and the Germans welcomed in a million or so,
01:16:51.100 | and actually reasonably absorbed them.
01:16:53.340 | I mean, I'm not saying bad things only happen in the world.
01:16:57.660 | There are good and bad things that happen.
01:16:59.100 | You're absolutely right.
01:17:00.300 | But I don't think we're on the path
01:17:04.860 | to eliminating these bad things,
01:17:08.380 | really bad things from happening.
01:17:10.140 | I just don't think we are.
01:17:11.180 | And I don't think there's any,
01:17:12.980 | I don't think the facts demonstrate it.
01:17:15.500 | I mean, I hope, I hope you're right.
01:17:17.300 | But I think otherwise, it's just an article of faith.
01:17:20.980 | - Well.
01:17:23.580 | - You know, which is perfectly fine.
01:17:25.500 | It's better to have that article of faith
01:17:27.260 | than to have a article of faith which says,
01:17:29.980 | you know, things should get bad or things like that.
01:17:32.180 | - Well, it's not just fine.
01:17:33.940 | It's the only way if you want to build a better future.
01:17:36.900 | So optimism is a prerequisite
01:17:38.780 | for engineering a better future.
01:17:40.580 | So like, okay, so a historian
01:17:43.440 | has to see clearly into the past.
01:17:46.140 | An engineer has to imagine a future
01:17:51.140 | that's different from the past,
01:17:54.260 | that's better than the past.
01:17:56.020 | Because without that, they're not going to be able
01:17:58.340 | to build a better future.
01:17:59.520 | So there's a kind of saying,
01:18:01.260 | like you have to consider the facts.
01:18:02.660 | Well, at every single moment in history,
01:18:05.600 | if you allow yourself to be too grounded
01:18:10.160 | by the facts of the past,
01:18:11.340 | you're not going to create the future.
01:18:12.820 | So that's kind of the tension that we're living with.
01:18:15.160 | To have a chance, we have to imagine
01:18:16.860 | that that better future is possible.
01:18:19.540 | But one of the ways to do that is to study history.
01:18:24.000 | - Which engineers don't do enough of.
01:18:26.040 | - They do not.
01:18:26.880 | - Which is a real problem.
01:18:29.280 | It's a real problem.
01:18:31.020 | - Well, basically a lot of disciplines in science
01:18:33.180 | and so on don't do enough of.
01:18:35.220 | Can you tell the story of China from 1958 to 1962,
01:18:41.340 | what was called the Great Leap Forward,
01:18:44.960 | orchestrated by Chairman Mao Zedong
01:18:47.400 | that led to the deaths of tens of millions of people,
01:18:49.760 | making it arguably the largest famine in human history.
01:18:54.380 | - Yes, I mean, it was a terrible set of events
01:18:59.380 | that led to the death.
01:19:02.200 | People will dispute the numbers.
01:19:04.040 | 15 million, 17 million, 14 million, 20 million people died
01:19:12.040 | in the Great Leap. - Many people say
01:19:16.000 | 30, 40, 50 million.
01:19:17.360 | - Some people will go that high too.
01:19:18.940 | That's right, that's right.
01:19:21.000 | Essentially, Mao and the Communist Party leadership,
01:19:25.220 | but it was mostly Mao's doing,
01:19:28.480 | decided he wanted to move the country into communism.
01:19:33.200 | And part of the idea of that
01:19:35.840 | was rivalry with the Soviet Union.
01:19:37.920 | Mao was a good Stalinist, or at least felt like Stalin
01:19:43.240 | was the right kind of communist leader to have,
01:19:46.340 | and he didn't like Khrushchev at all,
01:19:48.080 | and he didn't like what he thought were Khrushchev's reforms
01:19:51.200 | and also Khrushchev's pretensions
01:19:54.160 | to moving the Soviet Union into communism.
01:19:57.140 | So Khrushchev started talking about giving more power
01:19:59.680 | to the party, less power to the state,
01:20:01.720 | and if you give more power to the party versus the state,
01:20:04.480 | then you're moving into communism quicker.
01:20:07.240 | So what Mao decided to do was to engage in this vast program
01:20:11.400 | of building what were called people's communes.
01:20:16.960 | And these communes were enormous conglomerations
01:20:21.960 | of essentially collective farms.
01:20:25.520 | And what would happen on those communes
01:20:28.360 | is there would be places for people to eat,
01:20:31.560 | and there would be places for the kids to be raised
01:20:34.920 | in essentially kind of separate homes,
01:20:38.040 | and they would be schooled.
01:20:39.960 | Everybody would turn over their metal,
01:20:42.680 | which was one of the, actually turned out
01:20:44.160 | to be a terribly negative phenomenon.
01:20:46.760 | Their metal pots and pans to be melted to then make steel.
01:20:51.720 | Every one of these big communes
01:20:53.640 | would have little steel plants,
01:20:55.880 | and they would build steel,
01:20:57.320 | and the whole countryside would be transformed.
01:21:01.360 | Well, like many of these sort of,
01:21:03.400 | I mean, a true megalomaniac project,
01:21:07.680 | like some of Stalin's projects too.
01:21:10.240 | And this particular project then,
01:21:13.000 | the people had no choice.
01:21:15.400 | They were forced to do this.
01:21:18.760 | It was incredibly dysfunctional for Chinese agriculture
01:21:23.760 | and ended up creating, as you mentioned, a terrible famine
01:21:30.840 | that everybody understood was a famine as a result of this.
01:21:36.440 | I mean, there were also some problems of nature
01:21:40.400 | at the same time and some flooding and bad weather
01:21:42.720 | and that sort of thing.
01:21:43.800 | But it was really a man-made famine.
01:21:45.760 | And Mao said at one point, "Who cares if millions die?
01:21:52.240 | "It just doesn't matter.
01:21:53.840 | "We've got millions more left."
01:21:55.360 | I mean, he would periodically say things like this
01:21:57.800 | that showed that like Stalin, he had total indifference
01:22:02.800 | to the fact that people were dying in large numbers.
01:22:06.320 | It led again to cannibalism and to terrible wastage
01:22:11.560 | all over the country and millions of people died.
01:22:15.080 | And there was just no stopping it.
01:22:17.160 | There were people in the party who began to kind of edge
01:22:21.680 | towards telling Mao this wasn't a great idea
01:22:24.160 | and that he should back off, but he wouldn't back off.
01:22:28.480 | And the result was catastrophe in the countryside
01:22:32.080 | and all these people dying.
01:22:33.240 | And then they, compounding the problem
01:22:35.840 | was the political elite, which then,
01:22:39.840 | if peasants would object or if certain people would say,
01:22:43.320 | "No, they'd beat the hell out of them."
01:22:45.040 | They would beat people who didn't do
01:22:47.880 | what they wanted them to do.
01:22:49.160 | So it was really, really a horrific set of events
01:22:54.160 | on the Chinese countryside.
01:22:59.240 | I mean, and people wrote about it.
01:23:02.520 | I mean, we learned about it.
01:23:04.480 | There were people who were keeping track
01:23:05.960 | of what was going on and eventually wrote books about it.
01:23:09.080 | So we have, I mean, we have pretty good documentation,
01:23:13.040 | not so much on the numbers.
01:23:14.160 | Numbers are always a difficult problem.
01:23:17.720 | I'm facing this problem, by the way,
01:23:19.120 | this is a little bit separate with the Holodomor
01:23:23.120 | where Ukrainians are now claiming 11.5 million people died
01:23:26.680 | in Holodomor.
01:23:28.000 | And most people assume it's somewhere
01:23:29.760 | in the neighborhood of 4 million, 4.5 million maybe.
01:23:33.200 | So you have wildly different numbers that come out
01:23:36.440 | and we have different kinds of numbers,
01:23:38.000 | as you mentioned too, with the Great Leap Forward.
01:23:41.640 | So it was a huge catastrophe for China
01:23:45.320 | and now only backed off when he had to.
01:23:47.920 | And then revived a little bit
01:23:50.720 | with the Red Guards movement later on
01:23:53.920 | when he was upset that the bureaucracy
01:23:58.080 | was resisting him a little bit
01:24:00.320 | when it came to the Great Leap,
01:24:01.720 | but he had to back off.
01:24:03.320 | It was such a terrible catastrophe.
01:24:06.000 | - So one of the things about numbers
01:24:07.680 | is that you usually talk about deaths,
01:24:10.440 | but with the famine, with starvation,
01:24:13.380 | the thing I often think about
01:24:16.920 | that's impossible to put into numbers
01:24:18.680 | is the number of people
01:24:20.160 | and the degree to which they were suffering.
01:24:23.440 | The number of days spent in suffering.
01:24:28.840 | - Oh yeah.
01:24:30.120 | - And so, I mean, death is,
01:24:36.360 | death is just one of the consequences of suffering.
01:24:39.760 | To me, it feels like one, two, three years,
01:24:43.320 | months and then years of not having anything to eat
01:24:48.320 | is worse and it's sort of those
01:24:53.600 | aren't put into numbers often.
01:24:55.320 | - That's right.
01:24:56.160 | And the effect on people long-term,
01:24:58.240 | in terms of their mental health,
01:24:59.560 | in terms of their physical health,
01:25:02.560 | their ability to work, all those kinds of things.
01:25:05.240 | I mean, Ukrainians are working on,
01:25:07.720 | there are people working on this subject now,
01:25:09.280 | you know, the long-term effect of the hunger famine on them.
01:25:13.000 | And I'm sure there's a similar kind of long-term effect
01:25:16.220 | on Chinese peasantry of what happened.
01:25:18.080 | You know, I mean, you're destroying--
01:25:20.040 | - Multi-generational.
01:25:21.040 | - Yes, multi-generational, that's right.
01:25:23.040 | That's right. - Wow.
01:25:23.880 | - And, you know, it's a really, you're absolutely right.
01:25:26.560 | This is a terrible, terrible way to die.
01:25:29.160 | And it lasts a long time.
01:25:31.520 | And sometimes you don't die, you survive,
01:25:34.320 | but, you know, in the kind of shape
01:25:37.240 | where you can't do anything.
01:25:39.800 | I mean, you can't function.
01:25:42.080 | Now, your brain's been injured, you know.
01:25:44.920 | I don't know, it's a really,
01:25:46.960 | these famines are really horrible.
01:25:49.240 | - You're right.
01:25:50.080 | So when you talk about genocide,
01:25:50.960 | it's often talking about murder.
01:25:52.440 | - Yeah. - Where do you place
01:25:53.720 | North Korea in this discussion?
01:25:55.160 | We kind of mentioned it.
01:25:56.960 | So in the, what is it, the arduous march
01:26:03.120 | of the 1990s, where it was mass starvation,
01:26:08.120 | many people describe mass starvation going on.
01:26:11.320 | Now in North Korea, when you think about genocide,
01:26:14.360 | when you think about atrocities going on in the world today,
01:26:18.760 | where do you place North Korea?
01:26:20.760 | - So take a step back.
01:26:22.280 | When the, there were all these courts
01:26:24.800 | that were set up for Bosnia and for Rwanda
01:26:29.080 | and for other genocides in the 1990s.
01:26:34.040 | And then the decision was made
01:26:37.360 | by the international community, UN basically,
01:26:39.880 | to set up the International Criminal Court,
01:26:43.120 | which would then try genocide in the more modern period
01:26:47.520 | and the more contemporary period.
01:26:49.520 | And the ICC lists three crimes, basically.
01:26:54.840 | Genocide, crimes against humanity, and war crimes.
01:26:59.840 | And subsumed under crimes against humanity
01:27:07.200 | are a lot of the kinds of things
01:27:08.680 | you're talking about with North Korea.
01:27:10.800 | I mean, it's torture, it's sometimes
01:27:13.760 | artificial famine, famine
01:27:16.160 | that is not necessary, right?
01:27:21.040 | Not necessary to have it.
01:27:23.040 | And there are other kinds of mass rape and stuff like that.
01:27:28.040 | There are other kinds of things that fit
01:27:31.160 | into the crimes against humanity.
01:27:33.480 | And that's sort of where I think about North Korea
01:27:36.400 | as committing crimes against humanity, not genocide.
01:27:39.200 | And again, remember, genocide is meant to be,
01:27:43.200 | I mean, some people, there's a disagreement
01:27:46.880 | among scholars and jurists about this.
01:27:48.840 | Some people think of genocide as the crime of crimes,
01:27:51.920 | the worst of the three that I just mentioned.
01:27:55.160 | But some think of them as co-equal.
01:27:57.000 | And the ICC, the International Criminal Court,
01:28:00.480 | is dealing with them more or less as co-equal,
01:28:03.320 | even though we tend to think of genocide as the worst.
01:28:06.520 | So, I mean, what I'm trying to say is that,
01:28:08.960 | you know, I don't wanna split hairs.
01:28:11.680 | I think it's sort of morally and ethically unseemly,
01:28:15.480 | you know, to split hairs about what is genocide
01:28:18.000 | and what is a crime against humanity.
01:28:20.720 | You know, this is for lawyers, not for historians.
01:28:22.960 | - Oh, terminology-wise.
01:28:24.120 | - Yeah, yeah, you know, you don't wanna get into that.
01:28:28.360 | Because it, I mean, it happened with Darfur a little bit,
01:28:32.200 | where the Bush administration had declared
01:28:35.680 | that Darfur was a genocide.
01:28:37.840 | And the UN said, no, no, it's, you know,
01:28:40.800 | it wasn't genocide, it was a crime against humanity.
01:28:43.400 | And that, you know, that confused things
01:28:45.280 | versus clarified them.
01:28:47.320 | I mean, we damn well knew what was happening.
01:28:49.320 | People were being killed and being attacked.
01:28:51.520 | And so, you know, on the one hand,
01:28:55.240 | I think the whole concept and the way of thinking
01:28:58.360 | about history using genocide as an important part
01:29:04.040 | of human history is crucial.
01:29:08.040 | On the other hand, I don't like to, you know,
01:29:12.400 | get involved in the hair splitting,
01:29:13.880 | what's genocide and what's not.
01:29:15.320 | So that, you know, North Korea, I tend to think of,
01:29:18.600 | like I said, as committing crimes against humanity
01:29:22.640 | and, you know, forcibly incarcerating people,
01:29:25.560 | torturing them, that kind of thing.
01:29:28.120 | You know, routinely incarcerating,
01:29:30.160 | depriving them of certain kinds of human rights
01:29:33.640 | can be considered a crime against humanity.
01:29:35.720 | But I don't think of it in the same way
01:29:38.000 | I think about genocide,
01:29:39.080 | which is an attack on a group of people.
01:29:40.800 | Let me just leave it at that.
01:29:42.800 | - What in this, if we think about, if it's okay,
01:29:45.840 | can we loosely use the term genocide here?
01:29:48.160 | Just let's not play games with terminology.
01:29:50.880 | Just bad crimes against humanity.
01:29:54.720 | Of particular interest are the ones
01:29:58.640 | that are going on today still,
01:30:01.640 | because it raises the question to us,
01:30:04.440 | what do people outside of this,
01:30:07.920 | what role do they have to play?
01:30:09.320 | So what role does the United States,
01:30:12.360 | or what role do I as a human being
01:30:16.640 | who has food today, who has shelter,
01:30:19.800 | who has a comfortable life,
01:30:21.400 | what role do I have when I think about North Korea,
01:30:26.120 | when I think about Syria,
01:30:27.320 | when I think about maybe the Uyghur population in China?
01:30:31.440 | - Well, I mean, the role is the same role I have,
01:30:36.080 | which is to teach and to learn
01:30:38.360 | and to get the message out that this is happening,
01:30:43.360 | because the more people who understand it,
01:30:45.760 | the more likely it is that the United States government
01:30:48.600 | will try to do something about it,
01:30:50.680 | within the context of who we are and where we live, right?
01:30:56.640 | And so I write books, you do shows,
01:31:00.800 | maybe you write books too, I don't know.
01:31:04.560 | - No, I do not write books, but I tweet.
01:31:07.840 | - You tweet, okay, that's good too.
01:31:09.720 | - Ineloquently, but that's not,
01:31:11.440 | I guess that's not the, yes, so certainly this is true,
01:31:14.360 | and in terms of a voice, in terms of words,
01:31:17.040 | in terms of books, you are, I would say,
01:31:19.440 | a rare example of somebody that has powerful reach
01:31:24.280 | with words, but I was also referring to actions.
01:31:27.220 | The United States government, what are the options here?
01:31:31.800 | So war has costs, and war seems to be,
01:31:36.800 | as you have described, sort of potentially
01:31:40.280 | increase the atrocity, not decrease it.
01:31:44.100 | If there's anything that challenges my hope for the future,
01:31:48.500 | is the fact that sometimes we're not powerless to help,
01:31:52.980 | but very close to powerless to help,
01:31:56.000 | because trying to help can often lead to,
01:32:00.300 | in the near term, more negative effects
01:32:03.420 | than positive effects.
01:32:04.700 | - That's exactly right, I mean, you know,
01:32:06.980 | the unintended consequences of what we do
01:32:10.220 | can frequently be as bad as, if not worse than,
01:32:13.580 | the difficulties that people are having
01:32:17.100 | that we are trying to relieve.
01:32:17.940 | So I think you're caught a little bit,
01:32:21.540 | but it's also true, I think, that we can be more forceful.
01:32:25.260 | I think we can be more forceful without necessarily war.
01:32:29.420 | You know, there is this idea
01:32:31.900 | of the so-called responsibility to protect,
01:32:35.220 | and this was an idea that came up after Kosovo,
01:32:39.520 | which was what, 1999,
01:32:42.840 | and when, you know, the Serbs looked like
01:32:47.520 | they were gonna engage in a genocidal program in Kosovo,
01:32:50.560 | and you know, it was basically a program of ethnic cleansing
01:32:53.200 | but it could have gone bad and gotten worse,
01:32:56.240 | not just driving people out, but beginning to kill them,
01:32:59.540 | and the United States and Britain and others intervened,
01:33:04.540 | you know, and Russians were there too,
01:33:06.640 | as you probably recall,
01:33:08.840 | and I think correctly, people have analyzed this as a case
01:33:13.840 | in which genocide was prevented or stopped.
01:33:19.920 | In other words, the Serbs were stopped in their tracks.
01:33:22.020 | I mean, some bad things did happen.
01:33:23.440 | We bombed Belgrade and the Chinese embassy
01:33:25.640 | and things like that, but you know, it was stopped,
01:33:30.640 | and following upon that,
01:33:32.320 | then there was a kind of international consensus
01:33:35.620 | that we needed to do something.
01:33:36.880 | I mean, because of Rwanda, Bosnia,
01:33:39.520 | and the positive example of Kosovo, right?
01:33:42.600 | That genocide did not happen in Kosovo,
01:33:46.200 | and I think that argument, you know,
01:33:49.360 | has been substantiated.
01:33:50.720 | Anyway, and this notion of the,
01:33:55.440 | or this, you know, doctrine or whatever
01:33:58.440 | of the responsibility to protect
01:34:00.840 | was adopted by the UN in 2005, unanimously,
01:34:06.480 | and what it says is there's a hierarchy of measures
01:34:11.480 | that should be, well, let me take a step back.
01:34:14.960 | It starts with the principle that sovereignty of a country
01:34:19.960 | is not, you don't earn it just by being there
01:34:24.820 | and being your own country.
01:34:27.300 | You have to earn it by protecting your people.
01:34:29.920 | So every, this was all agreed
01:34:32.840 | with all the nations of the UN agreed, you know,
01:34:35.160 | Chinese and Russians too, that, you know,
01:34:38.320 | sovereignty is there because you protect your people
01:34:43.320 | against various depredations, right?
01:34:46.380 | Including genocide, crimes against humanity,
01:34:49.560 | you know, forced imprisonment, torture,
01:34:51.200 | and that sort of thing.
01:34:52.840 | If you violate that justification for your sovereignty,
01:34:57.840 | that you're protecting your people,
01:35:01.480 | that you're not protecting them,
01:35:03.240 | the international community has the obligation
01:35:06.000 | to do something about it, all right?
01:35:09.000 | Now, then they have a kind of hierarchy
01:35:11.840 | of things you can do, you know, starting with,
01:35:14.200 | I mean, I'm not quoting exactly,
01:35:17.020 | but, you know, starting with kind of push and pull,
01:35:19.360 | you know, trying to convince people, don't do that.
01:35:22.360 | You know, to Myanmar, don't do that
01:35:24.480 | to the Rohingya people, right?
01:35:26.000 | Then it goes down the list, you know,
01:35:29.480 | and you get to sanctions, or threatening sanctions,
01:35:32.300 | and then sanctions, you know, like we have against Russia,
01:35:36.720 | but you go down the list, right?
01:35:38.520 | You go down the list, and eventually,
01:35:41.300 | you get to military intervention at the bottom,
01:35:44.400 | which they say is the last thing, you know,
01:35:46.880 | and you really don't wanna do that.
01:35:50.120 | And not only do you not wanna do it,
01:35:52.040 | but it, just as you said, just as you pointed out,
01:35:54.760 | it can have unintended consequences, right?
01:35:58.200 | And we'll do everything we can short, you know,
01:36:02.000 | of military intervention, but, you know, if necessary,
01:36:06.320 | that can be undertaken as well.
01:36:09.080 | And so the responsibility to protect, I think, is,
01:36:11.580 | you know, it was not implementable.
01:36:16.280 | One of the things it says in this last category, right,
01:36:20.600 | the military intervention, is that the intervention
01:36:23.500 | cannot create more damage than it relieves, right?
01:36:29.020 | And so for Syria, we came to the conclusion,
01:36:34.020 | you know, that, I mean, the international community,
01:36:36.780 | in some ways, said this in so many words,
01:36:39.680 | even though the Russians were there, obviously,
01:36:41.500 | we ended up being there, and that sort of thing,
01:36:43.260 | but the international community basically said,
01:36:45.460 | you know, there's no way you can intervene in Syria.
01:36:48.260 | You know, there's just no way without causing more damage,
01:36:52.140 | you know, than you would relieve.
01:36:54.260 | So, you know, in some senses,
01:36:56.840 | that's what the international community is saying about,
01:36:58.740 | you know, Xinjiang and the Uyghurs, too.
01:37:02.060 | You know, I mean, you can't even imagine
01:37:04.740 | what hell would break loose if there was some kind
01:37:07.900 | of military trouble, you know, to threaten the Chinese with.
01:37:12.740 | But you can go down that list with, you know,
01:37:17.460 | the military leadership of Myanmar,
01:37:19.740 | and you can go down that list
01:37:21.160 | with the Chinese Communist Party,
01:37:23.460 | and you can go down the list, you know,
01:37:25.460 | with others who are threatening, you know,
01:37:29.780 | with Ethiopia and what it's doing in Tigray,
01:37:33.900 | and, you know, you can go down that list and start pushing.
01:37:37.940 | I think what happened,
01:37:39.260 | there was more of a willingness in the '90s,
01:37:44.380 | and in the, you know, right at the turn of the century,
01:37:47.680 | you know, to do these kinds of things,
01:37:50.820 | and then, you know, when Trump got elected,
01:37:52.880 | and, you know, he basically said, you know,
01:37:54.660 | America first, and out of the world,
01:37:56.300 | we're not gonna do any of this kind of stuff.
01:37:58.620 | And now Biden has the problem of trying
01:38:01.100 | to rebuild consensus on how you deal
01:38:04.940 | with these kinds of things.
01:38:06.780 | I think it's not impossible.
01:38:08.700 | I mean, here, I tend to be maybe more of an optimist than you.
01:38:11.700 | (laughter)
01:38:12.540 | You know, I think it's not impossible
01:38:14.580 | that the international community can, you know,
01:38:16.740 | muster some internal fortitude,
01:38:20.660 | and push harder, short of war, you know,
01:38:25.660 | to get the Chinese, and to get the, again, Myanmar,
01:38:30.700 | and to get others to kind of back off
01:38:34.140 | of violations of people's rights
01:38:36.060 | the way they are routinely doing it.
01:38:38.360 | - So that's in the space of geopolitics.
01:38:40.060 | That's the space of politicians, and UN, and so on.
01:38:42.500 | - Yes, yes.
01:38:43.340 | - The interesting thing about China,
01:38:44.500 | and this is a difficult topic,
01:38:47.080 | but there's so many financial interests
01:38:52.080 | that not many voices with power and with money speak up,
01:38:58.600 | speak out against China,
01:39:02.460 | because it's a very interesting effect,
01:39:06.760 | because it costs a lot for an individual to speak up,
01:39:11.260 | because you're going to suffer.
01:39:13.840 | I mean, China just cuts off the market.
01:39:17.040 | Like, if you have a product, if you have a company,
01:39:20.040 | and you say something negative, China just says,
01:39:22.120 | "Okay, well, then they knock you out of the market."
01:39:25.120 | And so any person that speaks up,
01:39:27.440 | they get shut down immediately, financially.
01:39:29.800 | It's a huge cost, sometimes millions or billions of dollars.
01:39:33.200 | And so what happens is everybody of consequence,
01:39:36.720 | sort of financially, everybody with a giant platform
01:39:39.540 | is extremely hesitant to speak out.
01:39:41.240 | It's a very, it's a different kind of hesitation
01:39:45.280 | that's financial in nature.
01:39:46.840 | I don't know if that was always the case.
01:39:48.760 | It seems like in history, people were quiet because of fear,
01:39:53.760 | because of threat of violence.
01:39:55.840 | Here, there's almost like a self-interested preservation
01:40:00.720 | of finances, of wealth.
01:40:04.320 | And I don't know what to do about that.
01:40:06.240 | I mean, I don't know if you can say something there,
01:40:09.440 | like, (chuckles)
01:40:12.520 | the genocide going on
01:40:14.440 | because people are financially self-interested.
01:40:17.400 | - Yeah, no, I think, I mean, I think the analysis is correct.
01:40:22.060 | And it's not only corporations,
01:40:26.160 | it's, you know, it's the American government
01:40:28.240 | that represents the American people
01:40:30.380 | that also feels compelled
01:40:34.040 | not to challenge the Chinese on human rights issues.
01:40:39.400 | - But the interesting thing is it's not just,
01:40:42.520 | I know a lot of people from China,
01:40:44.440 | and first of all, amazing human beings,
01:40:47.160 | and a lot of brilliant people in China.
01:40:49.100 | They also don't want to speak out,
01:40:50.920 | and not because they're sort of,
01:40:52.440 | quote unquote, like, silenced,
01:40:54.600 | but more because they're going to also lose financially.
01:40:57.760 | They have a lot of businesses in China.
01:41:00.640 | They, you know, they're running,
01:41:02.560 | in fact, the Chinese government and the country
01:41:06.780 | has a very interesting structure
01:41:08.520 | because it has a lot of elements that enable capitalism
01:41:11.400 | within a certain framework.
01:41:13.880 | So you have a lot of very successful companies,
01:41:16.640 | and they operate successfully.
01:41:18.240 | And then the leaders of those companies,
01:41:19.960 | many of whom have either been on this podcast,
01:41:24.680 | or want to be on this podcast,
01:41:25.960 | they really don't want to say anything negative
01:41:28.240 | about the government.
01:41:29.460 | And the nature of the fear I sense
01:41:32.840 | is not the kind of fear you would have in Nazi Germany.
01:41:37.400 | It's a very kind of, it's a mellow,
01:41:40.280 | like, why would I speak out
01:41:43.000 | when it has a negative effect on my company,
01:41:45.680 | on my family, in terms of finance, strictly financially?
01:41:48.740 | And that's difficult.
01:41:53.960 | That's a different problem to solve.
01:41:56.120 | That feels solvable.
01:41:57.720 | Because it feels like it's a money problem.
01:42:00.160 | If you can control the flow of money,
01:42:02.460 | where the government has less power
01:42:05.680 | to control the flow of money,
01:42:06.880 | it feels like that's solvable.
01:42:08.840 | And that's where capitalism is good.
01:42:10.360 | That's where the free market is good.
01:42:11.580 | So it's like, that's where a lot of people
01:42:13.240 | in the cryptocurrency space,
01:42:14.440 | I don't know if you follow them,
01:42:16.040 | they kind of say, okay, take the monetary system,
01:42:19.720 | the power to control money away from governments.
01:42:22.560 | Make it a distributed,
01:42:23.480 | like, allow technology to help you with that.
01:42:26.360 | That's a hopeful message there.
01:42:28.760 | In fact, a lot of people argue that kind of Bitcoin,
01:42:31.040 | these cryptocurrencies can help deal
01:42:35.040 | with some of these authoritarian regimes
01:42:38.500 | that lead to violations of basic human rights.
01:42:41.880 | If you can control, if you can give the power
01:42:44.320 | to control the money to the people,
01:42:46.120 | you can take that away from governments.
01:42:47.960 | That's another source of hope,
01:42:49.740 | where technology might be able to do something good.
01:42:52.760 | That's something different about the 21st century
01:42:54.840 | than the 20th, is there's technology
01:42:57.120 | in the hands of billions of people.
01:42:59.680 | - I mean, I have to say, I think you're naive
01:43:03.240 | when it comes to technology.
01:43:04.640 | I mean, I don't, I'm not someone who understands technology.
01:43:07.920 | So it's wrong of me to argue with you
01:43:11.640 | because I don't really spend much time with it.
01:43:13.840 | I don't really like it very much.
01:43:15.960 | And I'm not, I'm neither a fan nor a connoisseur.
01:43:20.960 | So I just don't really know.
01:43:23.920 | But what human history has shown basically,
01:43:27.800 | and that's a big statement.
01:43:29.040 | I don't want to pretend I can tell you
01:43:31.080 | what human history has shown.
01:43:32.920 | But technology, atom bomb,
01:43:37.240 | I mean, that's the perfect example of technology.
01:43:39.520 | You know, what happens when you discover new things.
01:43:42.120 | It's a perfect example, what's going on with Facebook now.
01:43:45.080 | It's an absolutely perfect example.
01:43:47.600 | You know, and I once went to a lecture
01:43:51.480 | by Eric Schmidt about the future, you know,
01:43:53.680 | and about all the things that were going to happen
01:43:55.760 | and all these wonderful things like, you know,
01:43:57.540 | you wouldn't have to translate yourself anything.
01:43:59.880 | You wouldn't have to read a book, you know.
01:44:02.820 | You wouldn't have to drive a car.
01:44:04.160 | You don't have to do this, you don't have to do that.
01:44:05.680 | What kind of life is that?
01:44:07.360 | So, you know, my view of technology is it's subsumed,
01:44:12.360 | you know, to the political, social, and moral needs
01:44:17.480 | of our day and should be subsumed to them.
01:44:20.360 | It's not going to solve anything by itself.
01:44:22.400 | It's going to be you and me that solve things.
01:44:25.520 | If they're solved,
01:44:26.360 | or our political system that solves things.
01:44:28.440 | Technology is neutral on one level.
01:44:31.460 | It is simply a human, I mean,
01:44:34.240 | they're talking now about how artificial intelligence,
01:44:37.040 | you know, is going to do this and is going to do that.
01:44:39.800 | I'm not so sure there's anything necessarily positive
01:44:43.680 | or negative about it,
01:44:44.800 | except it does obviously make work easier
01:44:47.560 | and things like that.
01:44:48.680 | I mean, I, you know, I like email and I like, you know,
01:44:52.160 | word processing and that sort of, all that stuff is great.
01:44:56.220 | But actually solving human relations in and of itself,
01:45:02.220 | or international relations,
01:45:06.100 | or conflict among human beings.
01:45:11.100 | I mean, I see technology as, you know,
01:45:13.600 | causing as many problems as it solves,
01:45:15.940 | and maybe even more, you know, the kind--
01:45:18.940 | - Maybe.
01:45:19.780 | - Maybe even more. - Maybe.
01:45:21.020 | - Yeah.
01:45:21.860 | - The question is, so like you said, technology is neutral.
01:45:25.260 | I agree with this.
01:45:26.940 | Technology is a toolkit, is a tool set
01:45:30.460 | that enables humans to have wider reach and more power.
01:45:34.980 | The printing press, the real reason I can read your books
01:45:39.980 | is, I would argue, so first of all, the printing press,
01:45:42.740 | and then the internet.
01:45:45.540 | Wikipedia, I think, has immeasurable effect on humanity.
01:45:50.540 | Technology is a double-edged sword.
01:45:53.900 | It allows bad people to do bad things
01:45:57.260 | and good people to do good things.
01:45:58.780 | - Exactly.
01:45:59.620 | - It boils down to--
01:46:01.020 | - Right, the people.
01:46:01.860 | - The people and whether you believe
01:46:04.460 | the capacity for good outweighs the capacity of bad.
01:46:07.500 | And so you said that I'm naive, it is true.
01:46:10.540 | I'm naively optimistic.
01:46:12.380 | I would say you're naively cynical about technology.
01:46:16.060 | Here we have one overdressed, naive optimist,
01:46:20.500 | and one brilliant, but nevertheless,
01:46:23.820 | technologically naive cynic, and we don't know.
01:46:27.100 | We don't know whether the capacity for good
01:46:30.340 | or the capacity for evil wins out in the end.
01:46:34.860 | And like we've been talking about,
01:46:37.060 | the trajectory of human history seems to pivot
01:46:39.940 | on a lot of random seeming moments.
01:46:43.060 | So we don't know, but as a builder of technology,
01:46:48.060 | I remain optimistic.
01:46:50.980 | And I should say, when you are optimistic,
01:46:56.220 | it is often easy to sound naive.
01:47:01.040 | And I'm not sure what to make of that small effect.
01:47:06.300 | Not to linger on specific words,
01:47:07.780 | but I've noticed that people who kind of
01:47:11.660 | are cynical about the world somehow sound more intelligent.
01:47:17.580 | - No, no, the issue is how can you be realistic
01:47:22.560 | about the world?
01:47:23.820 | It's not optimistic or pessimistic, it's not cynical.
01:47:27.260 | The question is how can you be a realist, right?
01:47:29.540 | - Yes, that's a good question.
01:47:31.140 | - Realism depends on a combination of knowledge
01:47:36.140 | and wisdom and good instincts and that sort of thing.
01:47:42.680 | And that's what we strive for, is a kind of realism.
01:47:47.320 | We both strive for that kind of realism.
01:47:49.420 | But I mean, here's an example I would give you.
01:47:53.420 | What about, again, we've got this environmental issue,
01:47:56.580 | and technology has created it.
01:47:59.280 | It's created it.
01:48:01.820 | I mean, the growth of technology,
01:48:04.180 | I mean, we all like to be heated well in our homes,
01:48:07.060 | and we want to have cars that run quickly
01:48:09.820 | and fast on gas, and that sort of,
01:48:12.620 | I mean, we're all consumers and we all profit from this.
01:48:17.040 | I don't, not everybody profits from it,
01:48:20.140 | but we want to be comfortable.
01:48:23.180 | And technology has provided us with a comfortable life.
01:48:25.620 | And it's also provided us with this incredible danger,
01:48:29.660 | which it's not solving, at least not now.
01:48:31.660 | - Okay, but-- - And it may solve,
01:48:33.740 | but it's only, my view is, you know what's gonna happen?
01:48:37.740 | A horrible catastrophe.
01:48:39.380 | It's the only way, it's the only way
01:48:42.300 | we will direct ourselves to actually trying
01:48:45.100 | to do something about it.
01:48:46.520 | We don't have the wisdom and the,
01:48:53.380 | realism and the sense of purpose.
01:48:56.820 | You know, what's her name?
01:48:59.060 | Greta goes blah, blah, blah, something like that
01:49:01.620 | in her last talk about the environmental summit
01:49:06.620 | in Glasgow or wherever it was.
01:49:09.780 | And, you know, we just don't have it
01:49:15.940 | unless we're hit upside the head really, really hard.
01:49:19.460 | And then maybe, you know, the business with nuclear weapons,
01:49:24.460 | you know, I think somehow we got hit upside the head
01:49:27.860 | and we realized, oh man, you know,
01:49:30.380 | this could really do it to the whole world.
01:49:32.940 | And so we started, you know, serious arms control stuff.
01:49:36.660 | And, you know, but up to that point, you know,
01:49:41.460 | I mean, it was just something about, you know,
01:49:43.500 | Khrushchev's big bomb, his big hydrogen bomb,
01:49:46.080 | which he exploded, I think it was
01:49:48.360 | on an anniversary or something like that.
01:49:50.420 | You know, I mean, just think what we could have done
01:49:52.460 | to each other.
01:49:53.580 | - Well, that's the double-edged sword of technology.
01:49:55.620 | - Yeah, I agree it's a double-edged sword.
01:49:57.300 | - There's a lot of people that argue that nuclear weapons
01:50:01.080 | is the reason we haven't had a World War III.
01:50:03.780 | So nuclear weapons, the mutually assured destruction
01:50:06.860 | leads to a kind of like,
01:50:08.540 | we've reached a certain level of destructiveness
01:50:11.220 | with our weapons where we were able to catch ourselves,
01:50:15.660 | not to create, like you said, hit really hard.
01:50:20.320 | This is the interesting question about being hit kind of
01:50:25.320 | hard versus really hard upside the head.
01:50:28.640 | With the environment, I would argue,
01:50:31.620 | see, we can't know the future,
01:50:33.860 | but I would argue as the pressure builds,
01:50:36.160 | there's already, because of this created urgency,
01:50:41.160 | the amount of innovation that I've seen
01:50:44.600 | that sometimes is unrelated to the environment,
01:50:47.200 | but kind of sparked by this urgency.
01:50:49.980 | It's been humongous, including the work of Elon Musk,
01:50:52.980 | including the work of just,
01:50:55.120 | you could argue that the SpaceX
01:51:00.120 | and the new exploration of space
01:51:02.600 | is kind of sparked by this environmental like urgency.
01:51:06.040 | I mean, connected to Tesla
01:51:07.200 | and everything they're doing with electric vehicles
01:51:09.000 | and so on, there's a huge amount of innovation
01:51:11.280 | in the space that's happening.
01:51:12.680 | I could see the effect of climate change
01:51:16.640 | resulting in more positive innovation
01:51:19.760 | that improves the quality of life across the world
01:51:22.700 | than the actual catastrophic events that we're describing,
01:51:26.340 | which we cannot even currently predict.
01:51:28.440 | It's not like there's going to be,
01:51:29.640 | there's going to be more extreme weather events.
01:51:31.560 | What does that even mean?
01:51:32.800 | There's going to be a gradual increase
01:51:35.280 | of the level of water.
01:51:38.440 | What does that even mean in terms of catastrophic events?
01:51:40.720 | It's going to be pretty gradual.
01:51:42.280 | There's going to be migration of people.
01:51:43.520 | We can't predict what that means.
01:51:45.120 | And in response to that,
01:51:47.520 | there's going to be a huge amount of innovators born today
01:51:52.440 | that have dreams and that will build devices and inventions
01:51:56.840 | and from space to vehicles to in the software world
01:52:01.720 | that enable education across the world,
01:52:03.720 | all those kinds of things that will, en masse, on average,
01:52:07.760 | increase the quality of life on average across the world.
01:52:11.040 | So it's not at all obvious that these,
01:52:14.480 | the things that the technologies
01:52:16.240 | that are creating climate change, global warming
01:52:19.920 | are going to have a negative, net negative effect.
01:52:23.520 | We don't know this.
01:52:24.760 | And I'm kind of inspired by the dreamers, the engineers,
01:52:29.760 | the innovators and the entrepreneurs that build,
01:52:33.600 | that wake up in the morning, see problems in the world
01:52:37.880 | and dream that they're going to be the ones
01:52:39.500 | who solve those problems.
01:52:40.760 | That's the human spirit.
01:52:43.000 | And that I'm not exactly,
01:52:44.520 | it is true that we need those deadlines.
01:52:46.720 | We need to be freaking out about stuff.
01:52:48.920 | And the reason we need to study history
01:52:52.080 | and the worst of human history is then we can say,
01:52:55.080 | oh shit, this too can happen.
01:52:57.020 | It's a slap in the face.
01:52:59.360 | It's a wake up call that if you get complacent,
01:53:02.700 | if you get lazy, this is going to happen.
01:53:04.960 | And that, listen, there's a lot of really intelligent people,
01:53:08.400 | ambitious people, dreamers, skilled dreamers
01:53:13.400 | that build solutions that make sure
01:53:15.520 | this stuff doesn't happen anymore.
01:53:17.240 | So I think there's reason to be optimistic about technology,
01:53:20.520 | not in a naive way.
01:53:21.840 | There's an argument to be made in a realistic way
01:53:25.460 | that like with technology, we can build a better future.
01:53:29.280 | And then Facebook is a lesson
01:53:32.640 | in the way Facebook has been done
01:53:34.320 | is a lesson how not to do it.
01:53:37.280 | And that lesson serves as a guide of how to do it better,
01:53:42.280 | how to do it right, how to do it in a positive way.
01:53:46.640 | And the same, every single sort of failed technology
01:53:50.040 | contains within it the lessons of how to do it better.
01:53:53.360 | I mean, without that,
01:53:55.460 | what's the source of hope for human civilization?
01:54:00.760 | (laughs)
01:54:02.840 | I mean, by way of question,
01:54:06.760 | you have truly studied some of the darkest moments
01:54:09.320 | in human history.
01:54:10.340 | Put on your optimist hat.
01:54:13.260 | Where-- - That one.
01:54:15.760 | - Yes.
01:54:16.600 | - The glimmers of it.
01:54:19.720 | - Yes, what is your source of hope
01:54:22.320 | for the future of human civilization?
01:54:24.620 | - Well, I think it resides in
01:54:28.840 | some of what you've been saying,
01:54:32.500 | which is in the persistence of this civilization over time,
01:54:37.500 | despite the incredible setbacks,
01:54:43.340 | two enormous world wars, the nuclear standoff,
01:54:47.540 | the horrible things we're experiencing now
01:54:52.740 | with climate change and migration and stuff like that.
01:54:56.060 | That despite these things, we are persisting
01:55:01.180 | and we are continuing, and like you say,
01:55:03.140 | we're continuing to invent
01:55:04.580 | and we're continuing to try to solve these problems.
01:55:07.700 | And we're continuing to love as well as hate.
01:55:11.300 | And that,
01:55:14.620 | I'm basically, I mean, I have children and grandchildren
01:55:20.380 | and I think they're gonna be just fine.
01:55:23.260 | I'm not a doom and gloomer.
01:55:27.700 | I'm not a Cassandra saying the world is coming to an end.
01:55:30.300 | I'm not like that at all.
01:55:31.620 | I think that things will persist.
01:55:38.260 | Another, by the way, source of tremendous optimism
01:55:41.380 | on my part, the kids I teach.
01:55:44.580 | I teach some unbelievably fantastic young people
01:55:49.540 | who are sort of like you say, they're dreamers
01:55:52.380 | and they're problem solvers.
01:55:57.140 | I mean, they have enormously humane values
01:56:01.460 | and ways of thinking about the world
01:56:02.980 | and they wanna do good.
01:56:05.060 | If you take the kind of, I mean,
01:56:09.620 | this has probably been true all the way along,
01:56:11.940 | but I mean, the percentage of do-gooders
01:56:15.140 | is really enormously large.
01:56:16.780 | Now, whether they end up working
01:56:18.140 | for some kind of shark law firm or something
01:56:20.780 | or that kind of thing,
01:56:24.940 | or whether they end up human rights lawyers
01:56:26.820 | as they all wanna be, right?
01:56:28.460 | Is a different kind of question,
01:56:34.060 | but certainly these young people are talented,
01:56:38.220 | they're smart, they have wonderful values,
01:56:41.140 | they're energetic, they work hard, they're focused.
01:56:45.260 | And of course, it's not just Stanford.
01:56:48.900 | I mean, it's all over the country.
01:56:50.620 | You have young people who really wanna contribute.
01:56:55.980 | I mean, it's true some of them end up
01:56:58.980 | working to get rich.
01:57:02.140 | I mean, that's inevitable, right?
01:57:03.740 | But the percentages are actually rather small,
01:57:07.500 | at least at this age.
01:57:08.820 | Maybe when they get a mortgage and a family
01:57:10.700 | and that sort of thing,
01:57:12.380 | financial well-being will be more important to them.
01:57:16.300 | But right now, you catch this young generation
01:57:20.020 | and they're fantastic.
01:57:21.220 | They're fantastic.
01:57:22.100 | And they're not what they're often portrayed
01:57:25.820 | as being kind of silly and naive and knee-jerk leftists
01:57:30.820 | and that, they're not at all like that.
01:57:32.980 | They're really fine young people.
01:57:36.740 | So that's a source of optimism to me too.
01:57:40.420 | - What advice would you give to those young people today,
01:57:44.020 | maybe in high school, in college, at Stanford,
01:57:47.140 | maybe to your grandchildren about how to have a career
01:57:52.140 | they can be proud of, have a life they can be proud of?
01:57:55.540 | - Pursue careers that are in the public interest
01:57:58.860 | in one fashion or another and not just in their interests.
01:58:02.380 | And that would be, I mean, it's not bad to pursue a career
01:58:07.140 | in your own interests.
01:58:08.060 | I mean, as long as it's something that's useful
01:58:11.100 | and positive for their families or whatever.
01:58:16.100 | But yeah, so I mean, I try to advise kids
01:58:20.020 | to find themselves somehow,
01:58:22.220 | find who they want to be and what they want to be
01:58:25.540 | and try to pursue it.
01:58:27.260 | And the NGO world is growing, as you know,
01:58:30.420 | and a lot of young people are kind of throwing themselves
01:58:34.580 | into it, and Human Rights Watch and that kind of stuff.
01:58:39.580 | And they want to do that kind of work.
01:58:43.460 | And it's very admirable.
01:58:45.460 | - I tend to think that even if you're not working
01:58:50.820 | in human rights, there's a certain way in which
01:58:53.740 | if you live with integrity,
01:58:55.400 | I believe that all of us or many of us
01:59:02.820 | have a bunch of moments in our lives
01:59:06.020 | when we're posed with a decision.
01:59:08.160 | It's a quiet one.
01:59:09.500 | Maybe it'll never be written about or talked about.
01:59:12.500 | Well, you get to choose.
01:59:13.940 | There's a choice that is difficult to make.
01:59:21.300 | It may require a sacrifice, but it's the choice
01:59:23.900 | that the best version of that person would make.
01:59:28.980 | That's the best way I can sort of say
01:59:31.580 | how to act with integrity.
01:59:32.740 | It's the very thing that would resist
01:59:34.700 | the early days in Nazi Germany.
01:59:36.380 | It sounds dramatic to say, but those little actions.
01:59:40.460 | And I feel like the best you can do to avoid genocide
01:59:45.420 | on scale is for all of us to live in that way,
01:59:50.540 | within those moments, unrelated potentially
01:59:52.940 | to human rights, to anything else,
01:59:55.460 | is to take those actions.
01:59:57.420 | Like I believe that all of us know the right thing to do.
02:00:00.420 | - I know, that's right.
02:00:01.500 | I think that's right.
02:00:02.700 | You put it very well.
02:00:04.020 | I couldn't have done it better myself.
02:00:06.660 | No, no, I agree.
02:00:07.940 | I agree completely that there are,
02:00:09.860 | to live with truth, which is what Václav Havel
02:00:16.780 | used to say, this famous Czech dissident,
02:00:19.620 | talked about living in truth,
02:00:21.140 | but also to live with integrity.
02:00:23.100 | And that's really super important.
02:00:26.900 | - Well, let me ask you about love.
02:00:29.140 | What role does love play in this whole thing,
02:00:31.340 | in the human condition?
02:00:33.260 | In all the study of genocide,
02:00:35.340 | it does seem that hardship in moments
02:00:39.220 | brings out the best in human nature,
02:00:41.260 | and the best in human nature is expressed through love.
02:00:44.380 | - Well, as I already mentioned to you,
02:00:46.180 | I think hardship is not a good thing.
02:00:51.180 | It's not the best thing for love.
02:00:53.940 | I mean, it's better to not have to suffer,
02:00:57.700 | and not have to-- - You think so?
02:00:59.180 | - Yes, I think it is.
02:01:00.900 | I think it's, as I mentioned to you,
02:01:05.900 | studying concentration camps,
02:01:08.660 | this is not a place for love.
02:01:10.460 | It happens, it happens,
02:01:13.740 | but it's not really a place for love.
02:01:15.220 | It's a place for rape.
02:01:16.620 | It's a place for torture.
02:01:19.340 | It's a place for killing,
02:01:20.420 | and it's a place for inhuman action,
02:01:24.060 | one to another, you know?
02:01:26.180 | And also, as I said, among those who are suffering,
02:01:30.460 | not just between those who are,
02:01:33.060 | and then there are whole gradations,
02:01:34.780 | you know, the same thing in the gulag,
02:01:36.300 | you know, there are gradations,
02:01:37.660 | all the way from the criminal prisoners
02:01:40.980 | who beat the hell out of the political prisoners,
02:01:42.900 | you know, who then have others below them
02:01:44.860 | who they beat down, you know,
02:01:45.980 | so everybody's beating the hell out of everybody else.
02:01:48.420 | So I would not idealize in any way suffering as a,
02:01:53.340 | you know-- - A source of beauty.
02:01:55.540 | - A source of beauty and love.
02:01:57.220 | I wouldn't do that.
02:01:58.060 | I think it's a whole lot better
02:02:00.700 | for people to be relatively prosperous,
02:02:03.200 | I'm not saying super prosperous,
02:02:04.780 | but to be able to feed themselves,
02:02:06.340 | and to be able to feed their families,
02:02:08.220 | and house their families,
02:02:09.820 | and take care of themselves,
02:02:13.960 | you know, to foster loving relations between people.
02:02:18.960 | And, you know, I think it's no accident
02:02:25.000 | that, you know, poor families have much,
02:02:28.520 | you know, worse records when it comes to crime
02:02:33.200 | and things like that, you know,
02:02:34.600 | and also to wife beating,
02:02:37.760 | and to child abuse, and stuff like that.
02:02:40.800 | I mean, you just, you don't wanna be poor and indigent,
02:02:45.800 | and not have a roof over your head, be homeless.
02:02:49.440 | I mean, it doesn't mean, again,
02:02:51.160 | you know, homeless people are mean people,
02:02:53.300 | that's not what I'm trying to say.
02:02:54.600 | What I'm trying to say is that, you know,
02:02:56.960 | what we wanna try to foster in this country
02:02:59.080 | and around the world,
02:03:00.800 | and one of the reasons, you know,
02:03:02.440 | I mean, I'm very critical of the Chinese in a lot of ways,
02:03:07.180 | but I mean, we have to remember
02:03:08.320 | they pulled that country out of horrible poverty, right?
02:03:11.080 | And I mean, there's still poor people in the countryside,
02:03:14.560 | there's still problems, you know,
02:03:16.640 | with want and need among the Chinese people,
02:03:21.640 | but, you know, there were millions and millions of Chinese
02:03:24.380 | who were living at the bare minimum of life,
02:03:26.520 | which is no way to live, you know,
02:03:28.260 | and no way, again, to foster love,
02:03:30.520 | and compassion, and getting along.
02:03:33.820 | So I wanna be clear, I don't speak for history, right?
02:03:37.520 | (Lex laughing)
02:03:38.360 | I'm giving you, I mean, there used to be historians,
02:03:41.120 | you know, in the 19th century
02:03:42.320 | who really thought they were speaking for history, you know?
02:03:45.480 | I don't think that way at all.
02:03:47.020 | I mean, I understand I'm a subjective human being
02:03:49.520 | with my own points of view and my own opinions, but--
02:03:54.020 | - I'm trying to remember in this conversation
02:03:56.120 | that you're, despite the fact that you're brilliant
02:03:58.880 | and you've written brilliant books, that you're just human.
02:04:02.600 | - Well, I am. - With an opinion.
02:04:04.280 | - That's it, yeah.
02:04:05.480 | No, no, that's absolutely true,
02:04:07.080 | and I tell my students that too.
02:04:09.680 | I mean, I make sure they understand
02:04:11.240 | this is not history speaking, you know?
02:04:12.920 | This is me and Norman, and I'm, you know,
02:04:15.800 | and this is what it's about.
02:04:18.440 | I mean, I spent a long time studying history
02:04:21.120 | and have enjoyed it enormously,
02:04:24.920 | but, you know, I'm an individual with my points of view,
02:04:29.440 | and one of them is that I've developed over time
02:04:34.080 | is that, you know, human want is a real tragedy for people,
02:04:39.080 | and it hurts people, and it also causes upheavals
02:04:44.200 | and difficulties and stuff.
02:04:45.660 | So I feel for people, you know?
02:04:47.960 | I feel for people in Syria.
02:04:49.440 | I feel for people in, you know, in Ethiopia, in Tigray,
02:04:54.440 | you know, when they don't have enough to eat,
02:04:56.240 | and, you know, what that does,
02:04:58.280 | I mean, it doesn't mean they don't love each other, right?
02:05:00.520 | It doesn't mean they don't love their kids,
02:05:03.040 | but it does mean that it's harder, you know, to do that.
02:05:06.720 | - I'm not so sure.
02:05:09.320 | It's obvious to me that it's harder.
02:05:11.520 | There's suffering, there's suffering,
02:05:13.280 | but the numbers, we've been talking about deaths,
02:05:16.560 | been talking about suffering,
02:05:17.840 | but the numbers we're not quantifying.
02:05:20.000 | The history that you haven't perhaps been looking at
02:05:22.720 | is all the times that people have fallen in love deeply
02:05:25.000 | with friends, with romantic love,
02:05:27.640 | the positive emotion that people have felt,
02:05:31.480 | and I'm not so sure that amidst the suffering,
02:05:34.320 | those moments of beauty and love can't be discovered,
02:05:36.680 | and if we look at the numbers,
02:05:38.320 | I'm not so sure the story's obvious.
02:05:40.520 | I mean, again, I suppose you may disagree
02:05:45.780 | with Viktor Frankl, I may too, maybe, depending on the day.
02:05:49.520 | I mean, he says that if there's meaning to this life at all,
02:05:52.520 | there's meaning to the suffering too,
02:05:53.920 | because suffering is part of life.
02:05:56.240 | There's something about accepting the ups and downs
02:06:01.680 | even when the downs go very low,
02:06:03.880 | and within all of it, finding a source of meaning.
02:06:07.800 | I mean, he's arguing from the perspective of psychology,
02:06:09.920 | but just this life is an incredible gift,
02:06:13.440 | almost no matter what, and I'm not,
02:06:16.680 | it's easy to look at suffering and think,
02:06:21.120 | if we just escape the suffering, it'll all be better,
02:06:24.400 | but we all die.
02:06:27.460 | There's beauty in the whole thing,
02:06:30.400 | and it is true that it's just,
02:06:34.080 | from all the stories I've read,
02:06:35.360 | especially about famine and starvation,
02:06:37.680 | it's just horrible, it is horrible suffering,
02:06:41.320 | but I also just want to say that there's love amidst it,
02:06:46.000 | and we can't forget that.
02:06:47.360 | - No, no, I don't forget it, I don't forget it, but--
02:06:50.320 | - And I think it's from the stories.
02:06:52.880 | Now, I don't want to make that compromise or that trade,
02:06:56.800 | but the intensity of friendship in war,
02:06:59.640 | the intensity of love in war is very high,
02:07:04.560 | so I'm not sure what to make of these calculations,
02:07:07.240 | but if you look at the stories,
02:07:08.860 | some of the people I'm closest with,
02:07:10.400 | and I've never experienced anything
02:07:12.360 | even close to any of this,
02:07:14.280 | but some of the people I'm closest with
02:07:15.760 | are people I've gone through difficult times with.
02:07:18.320 | There's something about that.
02:07:20.040 | In a society or a group where things are easy,
02:07:24.880 | the intensity of the connection between human beings
02:07:27.600 | is not as strong.
02:07:29.040 | I don't know what to do with that calculus,
02:07:31.440 | because I, too, agree with you.
02:07:32.800 | I want to have as little suffering in the world as possible,
02:07:37.600 | but we have to remember about the love
02:07:39.400 | and the depth of human connection
02:07:41.000 | and find the right balance there.
02:07:42.920 | - No, there's something to what you're saying.
02:07:46.840 | There's clearly something to what you're saying.
02:07:48.480 | I was just thinking about the Soviet Union
02:07:50.720 | when I lived there, and people on the streets
02:07:53.480 | were so mean to one another, and they never smiled.
02:07:56.760 | You grew up there?
02:07:57.720 | No, but you were too young to be...
02:07:58.920 | - No, no, I remember well.
02:08:00.320 | I came here when I was 13, yeah.
02:08:01.840 | - Okay, so anyway, I remember living there
02:08:04.680 | and just how hard people were on each other on the streets,
02:08:07.080 | and when you got inside people's apartments,
02:08:09.000 | when they started to trust you,
02:08:11.240 | the friendships were so intense and so wonderful.
02:08:14.100 | So in that sense, I mean, they did live a hard life,
02:08:19.720 | but there was enough food on the table,
02:08:21.480 | and there was a roof over their heads.
02:08:23.000 | - There's a certain line.
02:08:24.040 | - There's a certain, there are lines.
02:08:26.000 | I don't think there's one line,
02:08:27.200 | but it's kind of a shading.
02:08:29.960 | And the other story I was thinking of
02:08:31.280 | as you were talking was, it's not a story,
02:08:33.520 | it's a history, a book by a friend of mine
02:08:38.480 | who wrote about love in the camps,
02:08:43.480 | in the refugee camps for Jews in Germany after the war.
02:08:48.260 | So these were Jews who had come mostly from Poland,
02:08:51.280 | and some survived the camps, came from awful circumstances,
02:08:56.360 | and then they were put in these camps,
02:08:58.840 | which were not joyful places.
02:09:00.840 | I mean, they were guarded sometimes by Germans even,
02:09:03.080 | but they're basically under the British control,
02:09:06.280 | and they were trying to get to Israel,
02:09:08.120 | trying to get to Palestine right after the war,
02:09:11.920 | and how many pairs there were, how many people coupled up.
02:09:16.060 | But remember, this is after being in the concentration camp.
02:09:18.640 | It's not being in the concentration camp,
02:09:20.600 | and it's also being free, or more or less free.
02:09:26.480 | You know, to express their emotions,
02:09:28.160 | and to be human beings after this horrible thing
02:09:32.960 | which they suffered.
02:09:34.360 | So I wonder whether there's, you know,
02:09:36.400 | as you say, some kind of calculus there
02:09:38.240 | where the level of suffering is such
02:09:42.680 | that it's just too much for humans to bear.
02:09:47.200 | And which I would suggest,
02:09:51.120 | I mean, I haven't studied this myself,
02:09:52.640 | I'm just giving you my point of view,
02:09:54.440 | you know, my off-the-cuff remarks here,
02:09:57.480 | but it was very inspiring to read about these couples
02:10:00.040 | who had met right in these camps
02:10:01.840 | and started to couple up, you know, and get married,
02:10:06.160 | and try to find their way to Palestine,
02:10:09.400 | which was a difficult thing to do then.
02:10:11.520 | - When did you live in Russia, in the Soviet Union?
02:10:14.440 | What's your memory of that time?
02:10:16.080 | - Well, so a number of different times.
02:10:18.480 | So I went there, I first went there in '69, '70.
02:10:22.200 | - Wow. - A long time ago.
02:10:23.760 | And then I lived in Leningrad mostly,
02:10:28.720 | but also in Moscow in 1975.
02:10:31.600 | So it was detente time,
02:10:33.700 | but it was also a time of political uncertainty
02:10:39.480 | and also hardship, you know, for Russians themselves.
02:10:44.480 | You know, standing in long lines,
02:10:46.040 | I mean, you must remember this,
02:10:47.320 | for food and for getting anything was almost impossible.
02:10:51.800 | It was a time when Jews were trying to get out.
02:10:55.440 | In fact, I just talked to a friend of mine from those days
02:10:59.880 | who I helped get out and get to Boston,
02:11:01.720 | and lovely people who had managed to have a good life
02:11:05.360 | in the United States after they left.
02:11:08.760 | But it wasn't an easy time.
02:11:10.000 | It wasn't an easy time at all.
02:11:11.720 | I remember people set fire to their doors,
02:11:13.920 | and, you know, their daughter was persecuted in school,
02:11:17.920 | you know, once they declared that they wanted to immigrate,
02:11:21.480 | and that sort of thing.
02:11:22.400 | So it was a very, it was a lot of anti-Semitism.
02:11:25.280 | So it was a tough time.
02:11:28.600 | Dissidents, you know, hung out with some dissidents,
02:11:32.360 | and one guy was actually killed.
02:11:34.220 | We think by the KGB, nobody knows exactly,
02:11:38.560 | but his art studio, he had a separate studio
02:11:42.320 | in Leningrad, St. Petersburg today.
02:11:44.780 | You know, just a small studio where he did his art,
02:11:49.720 | and somebody set it on fire.
02:11:51.720 | And we think it was KGB, but, you know,
02:11:53.320 | you never really know.
02:11:54.440 | And he died in that fire.
02:11:57.920 | So, you know, it was not a, it was a tough time.
02:12:01.480 | And, you know, you knew you were followed,
02:12:04.680 | you knew you were being reported on as a foreign scholar,
02:12:08.360 | as I was.
02:12:09.180 | There was a formal exchange between the United States
02:12:12.400 | and the Soviet Union, and, you know,
02:12:15.060 | they let me work in the archives,
02:12:16.680 | but then, you know, Ivanov got to work in the-
02:12:19.920 | - Right. - In the physics lab
02:12:22.320 | at Rochester or something like that.
02:12:24.200 | You know, so it was an exchange which sent historians
02:12:29.200 | and literary people and some social scientists to Russia,
02:12:33.600 | and they sent all scientists here to, you know,
02:12:35.800 | grab what they could from MIT and those places.
02:12:38.320 | (Lex laughing)
02:12:39.400 | - How is your Russian?
02:12:40.640 | Do you have any knowledge of Russian language
02:12:44.560 | that has helped you to understand?
02:12:46.320 | - Oh, yeah, yeah.
02:12:47.480 | I mean, I can read it fine.
02:12:50.160 | And the speaking, you know, comes and goes,
02:12:52.920 | depending on whether I'm there,
02:12:54.640 | or whether I've been there recently,
02:12:55.880 | or if I spend some time there,
02:12:57.480 | because I really need, you know,
02:12:58.640 | I have Russian friends who speak just Russian.
02:13:00.480 | So, you know, when I'm there, I then, you know,
02:13:04.320 | I can communicate pretty well.
02:13:05.800 | I can't really write it, unfortunately.
02:13:08.120 | I mean, I can, but it's not very good.
02:13:11.900 | But I get along fine.
02:13:13.040 | - What's your fondest memory of the Soviet Union
02:13:17.440 | or Russia? - It's friends.
02:13:18.840 | - Friends. - It's friends.
02:13:20.920 | - Was it vodka involved, or is it just vodka involved?
02:13:24.640 | Is it-- - A little bit.
02:13:25.720 | You know, I'm not much of a drinker.
02:13:27.040 | - Yeah. - So I would, you know,
02:13:28.520 | they'd just make fun of me, and I'd make fun of myself.
02:13:31.800 | That was easy enough.
02:13:32.960 | I don't really like, you know, heavy drinking.
02:13:35.720 | I've done a lot of that.
02:13:37.000 | - Yeah. - Not a lot.
02:13:37.880 | I've done some of that, but I never really enjoyed it,
02:13:40.240 | and would get sick and stuff.
02:13:42.600 | But no, it's friends.
02:13:45.360 | You know, one friend I made in the dormitory,
02:13:49.280 | you know, it was a dormitory for foreigners,
02:13:51.480 | but also Siberians who had come, you know,
02:13:56.240 | to Leningrad to study.
02:13:58.840 | And so I met a couple of guys,
02:14:00.360 | and one in particular from Omsk became a wonderful friend.
02:14:04.600 | And we talked and talked and talked.
02:14:06.320 | Outside, you know, we would go walk outside,
02:14:08.320 | 'cause we both knew they were, you know,
02:14:10.520 | people were listening and stuff.
02:14:12.060 | And he would say, well, this is, he was an historian,
02:14:15.040 | you know, and so we would talk history.
02:14:16.840 | And he'd say, well, this was the case, wasn't it?
02:14:18.560 | I said, no, I'm sorry, Sasha, it wasn't the case.
02:14:21.120 | It was, you know, we think Stalin
02:14:23.960 | actually had a role in killing Kirov.
02:14:26.440 | I mean, we're not sure, but, you know, he said, no.
02:14:29.000 | I said, yeah.
02:14:29.920 | You know, so, you know, we had these conversations,
02:14:32.120 | and he was, he was, what I would,
02:14:36.600 | I don't know if he would agree with me or not.
02:14:38.240 | I mean, we're still friends, so he was--
02:14:41.080 | - He's gonna check in with you after this.
02:14:42.560 | - Maybe he'll listen to the blog,
02:14:43.960 | or I'll send it to him or something.
02:14:45.240 | He was a kind of naive Marxist-Leninist.
02:14:48.160 | And he thought I was, you know, I was, you know,
02:14:51.280 | I had this capitalist ideology.
02:14:52.800 | He'd say, what ideology do you have?
02:14:54.280 | And I said, I don't have an ideology.
02:14:56.400 | You know, I try to just put together
02:14:58.200 | kind of reason and facts and accurate stories
02:15:02.640 | and try to tell them in that way.
02:15:03.860 | No, no, no, no, you must, you know,
02:15:05.480 | you're a bourgeois, you know, this or that.
02:15:07.560 | I said, no, I'm really not.
02:15:09.160 | And so we would have these talks
02:15:11.940 | and these kind of arguments,
02:15:13.160 | and then, I mean, sure enough, you know,
02:15:16.720 | we corresponded for a while,
02:15:19.240 | and then he had to stop corresponding
02:15:20.960 | because he became a kind of local official in Omsk.
02:15:25.640 | And he sort of migrated more and more to being a Democrat.
02:15:29.560 | And he was then in the, you know,
02:15:31.600 | Democratic movement under Gorbachev,
02:15:34.880 | and, you know, in the Congress of People's Deputies,
02:15:38.500 | which they set up, which was, you know,
02:15:41.680 | elected as a Democrat from Omsk
02:15:45.200 | and had a political career through the Yeltsin period.
02:15:49.480 | And once Putin came along, you know, it was over.
02:15:52.780 | He didn't like Putin, and, you know,
02:15:56.200 | and Putin didn't like the Yeltsin people, right,
02:15:59.160 | who were, tried to be, some of them tried to be Democrats.
02:16:02.640 | And Sasha was one who really did.
02:16:04.200 | He just published his memoirs in Russian, by the way,
02:16:06.600 | which are very good, I think.
02:16:08.040 | - Yeah, they're really good.
02:16:11.240 | (speaking in foreign language)
02:16:14.120 | That's what it's called.
02:16:16.040 | It's hard to translate in English.
02:16:18.040 | (speaking in foreign language)
02:16:19.440 | But I translated it four points once for him.
02:16:22.040 | - This is so beautiful.
02:16:23.040 | Like, do you find that the translation is a problem or no?
02:16:27.360 | It's such a different language.
02:16:28.200 | - Yes, translation is very difficult.
02:16:30.160 | - With the Russian language, I mean,
02:16:31.320 | it's the only language I know deeply, except English.
02:16:35.640 | And it seems like so much is lost of the pain,
02:16:38.800 | the poetry, the beauty of the people.
02:16:40.700 | - And translators are to be treasured,
02:16:43.320 | and good ones to be treasured.
02:16:45.160 | I mean, those who do the translations, you know,
02:16:47.780 | when you read things in translation,
02:16:50.600 | sometimes they're quite beautiful, you know,
02:16:52.080 | whether it's Russian or Polish or German
02:16:54.440 | or French, anything.
02:16:55.360 | - Yeah, I'm actually traveling to Paris
02:16:57.480 | to talk to the famous translators, the Dostoevsky Tolstoy.
02:17:01.260 | And I'm just gonna do several conversations with them
02:17:04.560 | about, like, you could just sometimes
02:17:06.400 | just grab a single sentence
02:17:07.880 | and just talk about the translation in that sentence.
02:17:10.480 | That's, and also, as you said,
02:17:14.800 | I would love to be a fly on the wall
02:17:16.140 | with some of those friends that you had,
02:17:17.540 | because the perspective on history,
02:17:20.420 | non-academic, sort of without,
02:17:22.840 | just as human beings, is so different
02:17:26.020 | from the United States versus Russia.
02:17:28.340 | When you talk about the way the World War II was perceived
02:17:30.600 | and all those kinds of things, it's fascinating.
02:17:35.240 | History also has in it opinion and perspective.
02:17:39.500 | And so sometimes stripping that away is really difficult.
02:17:41.860 | And then I guess that is your job,
02:17:43.060 | and at its highest form, that is what you do as a historian.
02:17:47.540 | Well, Norman, (speaking in foreign language)
02:17:52.540 | I really appreciate your valuable time.
02:17:54.540 | It's truly an honor to talk to you,
02:17:56.060 | and thank you for taking us through a trip
02:18:00.280 | through some of the worst parts of human history,
02:18:02.780 | and talking about hope and love at the end.
02:18:06.340 | So I really appreciate your time today.
02:18:08.020 | - Okay, thank you. - Thank you.
02:18:09.100 | - Thank you for having me.
02:18:10.400 | - Thanks for listening to this conversation
02:18:12.860 | with Norman Naimark.
02:18:14.260 | To support this podcast,
02:18:15.540 | please check out our sponsors in the description.
02:18:17.940 | And now, let me leave you with some words from Stalin.
02:18:21.180 | "A single death is a tragedy.
02:18:23.700 | "A million deaths is a statistic."
02:18:26.340 | Thank you for listening, and hope to see you next time.
02:18:30.500 | (logo whooshes)