
Dan Carlin: Hardcore History | Lex Fridman Podcast #136


Chapters

0:00 Introduction
2:36 Nature of evil
9:33 Is violence and force fundamental to human civilization?
14:41 Will we always have war?
24:21 The Russian front in World War II
32:15 Ideologies of the US, the Soviet Union, and China
44:58 Putin
57:33 Journalism is broken
64:58 Genghis Khan
79:19 Greatest leader in history
87:04 Could Hitler have been stopped?
104:03 Hitler's Antisemitism
109:54 Destructive power of evil
119:09 Will human civilization destroy itself?
131:14 Elon Musk, Tesla, SpaceX
139:36 Steering around the iceberg - how do we avoid collapse of society?
161:43 Advice on podcasting
164:55 Joe Rogan, Spotify, and the future of podcasting
180:02 Future episodes of Hardcore History podcast
195:04 Is Ben real?
195:48 Meaning of life


00:00:00.000 | The following is a conversation with Dan Carlin,
00:00:03.800 | host of Hardcore History and Common Sense Podcasts.
00:00:08.760 | To me, Hardcore History is one of,
00:00:11.680 | if not the greatest podcast ever made.
00:00:15.960 | Dan and Joe Rogan are probably the two main people
00:00:19.240 | who got me to fall in love with the medium of podcasting
00:00:22.880 | as a fan and eventually as a podcaster myself.
00:00:27.400 | Meeting Dan was surreal.
00:00:29.960 | To me, he was not just a mere human like the rest of us,
00:00:33.400 | since his voice has been a guide
00:00:35.960 | through some of the darkest moments of human history for me.
00:00:39.480 | Meeting him was like meeting Genghis Khan,
00:00:42.120 | Stalin, Hitler, Alexander the Great,
00:00:45.160 | and all of the most powerful leaders in history
00:00:47.360 | all at once in a crappy hotel room in the middle of Oregon.
00:00:51.880 | It turns out that he is in fact just a human
00:00:55.880 | and truly one of the good ones.
00:00:58.720 | This was a pleasure and an honor for me.
00:01:01.440 | Quick mention of each sponsor,
00:01:04.280 | followed by some thoughts related to the episode.
00:01:07.360 | First is Athletic Greens, the all-in-one drink
00:01:10.280 | that I start every day with
00:01:11.720 | to cover all my nutritional bases.
00:01:14.160 | Second is SimpliSafe, a home security company I use
00:01:17.640 | to monitor and protect my apartment.
00:01:20.240 | Third is Magic Spoon, low carb, keto-friendly cereal
00:01:24.480 | that I think is delicious.
00:01:26.440 | And finally, Cash App,
00:01:27.840 | the app I use to send money to friends for food and drinks.
00:01:31.360 | Please check out these sponsors in the description
00:01:33.640 | to get a discount and to support this podcast.
00:01:36.760 | As a side note, let me say that I think we're living through
00:01:40.960 | one of the most challenging moments in American history.
00:01:44.720 | To me, the way out is through reason and love.
00:01:49.040 | Both require a deep understanding of human nature
00:01:52.080 | and of human history.
00:01:54.160 | This conversation is about both.
00:01:56.720 | I am, perhaps hopelessly, optimistic about our future.
00:02:01.720 | But if indeed we stand at the precipice of the great filter,
00:02:06.760 | watching our world consumed by fire,
00:02:09.600 | think of this little podcast conversation
00:02:12.320 | as the appetizer to the final meal before the apocalypse.
00:02:16.900 | If you enjoy this thing, subscribe on YouTube,
00:02:20.440 | review it with five stars on Apple Podcasts,
00:02:22.680 | follow on Spotify, support it on Patreon,
00:02:25.400 | or connect with me on Twitter @LexFriedman.
00:02:28.760 | And now, finally, here's my conversation
00:02:32.680 | with the great Dan Carlin.
00:02:35.160 | Let's start with the highest philosophical question.
00:02:39.040 | Do you think human beings are fundamentally good,
00:02:41.840 | or are all of us capable of both good and evil,
00:02:46.640 | and it's the environment that molds
00:02:50.880 | the trajectory that we take through life?
00:02:53.480 | How do we define evil?
00:02:55.640 | Evil seems to be a situational
00:02:57.920 | eye of the beholder kind of question.
00:03:00.560 | So if we define evil, maybe I can get a better idea of,
00:03:04.880 | and that could be a whole show, couldn't it,
00:03:06.720 | defining evil. (Dan laughing)
00:03:08.240 | But when we say evil, what do we mean?
00:03:10.680 | - That's a slippery one, but I think there's some way
00:03:13.760 | in which your existence, your presence in the world,
00:03:17.600 | leads to pain and suffering and destruction
00:03:22.280 | for many others in the rest of the world.
00:03:25.000 | So you steal the resources and you use them
00:03:28.800 | to create more suffering than there was before in the world.
00:03:33.800 | So I suppose it's somehow deeply connected
00:03:35.920 | to this other slippery word, which is suffering.
00:03:39.360 | As you create suffering in the world,
00:03:41.800 | you bring suffering to the world.
00:03:43.620 | - But here's the problem, I think, with it,
00:03:45.120 | 'cause I fully see where you're going with that,
00:03:46.920 | and I understand it.
00:03:48.480 | The problem is the question of the reason
00:03:52.520 | for inflicting suffering.
00:03:54.220 | So sometimes one might inflict suffering
00:03:58.080 | upon one group of individuals in order to maximize
00:04:03.080 | a lack of suffering for another group of individuals,
00:04:05.520 | or one who might not be considered evil at all
00:04:08.840 | might make the rational, seemingly rational choice,
00:04:11.600 | of inflicting pain and suffering
00:04:13.280 | on a smaller group of people in order to maximize
00:04:16.420 | the opposite of that for a larger group of people.
00:04:19.080 | - Yeah, that's one of the dark things about it.
00:04:20.920 | I've spoken with and read the work of Stephen Kotkin,
00:04:23.200 | I'm not sure if you're familiar with the historian,
00:04:25.380 | and he's basically a Stalin, a Joseph Stalin scholar.
00:04:30.080 | And one of the things I realized,
00:04:32.600 | I'm not sure where to put Hitler, but with Stalin,
00:04:36.060 | it really seems that he was sane
00:04:41.680 | and he thought he was doing good for the world.
00:04:44.640 | I really believe from everything I've read about Stalin
00:04:48.000 | that he believed that communism is good for the world,
00:04:52.880 | and if you have to kill a few people along the way,
00:04:56.360 | it's like you said, the small groups,
00:04:57.920 | if you have to sort of remove the people
00:05:01.320 | that stand in the way of this utopian system of communism,
00:05:06.320 | then that's actually good for the world.
00:05:08.680 | And it didn't seem to me that he could even consider
00:05:13.720 | the possibility that he was evil.
00:05:16.320 | He really thought he was doing good for the world.
00:05:18.720 | And that stuck with me because,
00:05:22.000 | by our definition of evil,
00:05:24.560 | he seems to have brought more evil into this world
00:05:28.040 | than almost any human in history.
00:05:31.700 | And I don't know what to do with that.
00:05:35.260 | - Well, I'm fascinated with the concept,
00:05:37.000 | so fascinated by it that the very first
00:05:39.160 | Hardcore History Show we ever did,
00:05:40.560 | which was a full 15 or 16 minutes,
00:05:43.700 | was called Alexander versus Hitler.
00:05:46.220 | And the entire question about it was the motivations.
00:05:51.060 | So if you go to a court of law because you killed somebody,
00:05:55.060 | one of the things they're going to consider
00:05:56.660 | is why did you kill them?
00:05:58.100 | And if you killed somebody, for example, in self-defense,
00:06:02.340 | you're going to be treated differently
00:06:03.740 | than if you killed somebody maliciously
00:06:06.220 | to take their wallet.
00:06:08.060 | And in the show, we wondered,
00:06:10.700 | 'cause I don't really make pronouncements,
00:06:12.840 | but we wondered about if you believe Hitler's writings,
00:06:17.340 | for example, Mein Kampf,
00:06:18.900 | which is written by a guy who's a political figure
00:06:21.820 | who wants to get, so I mean, it's about as believable
00:06:24.520 | as any other political tract would be.
00:06:26.880 | But in his mind, the things that he said that he had to do
00:06:31.260 | were designed for the betterment of the German people.
00:06:35.060 | Whereas Alexander the Great, once again,
00:06:37.240 | this is somebody from more than 2000 years ago,
00:06:39.380 | so with lots of propaganda in the intervening years,
00:06:42.260 | but one of the views of Alexander the Great
00:06:44.880 | is that the reason he did what he did was to,
00:06:47.900 | for lack of a better word,
00:06:49.360 | write his name in a more permanent graffiti
00:06:51.860 | on the pages of history, right?
00:06:53.240 | In other words, to glorify himself.
00:06:54.940 | And if that's the case,
00:06:57.420 | does that make Alexander a worse person than Hitler
00:07:00.060 | because Hitler thought he was doing good,
00:07:02.660 | whereas Alexander, if you believe the interpretation,
00:07:05.820 | was simply trying to exalt Alexander?
00:07:08.140 | So the motivations of the people doing these things,
00:07:11.700 | it seems to me, matter.
00:07:14.180 | I don't think you can just sit there and go,
00:07:15.740 | the only thing that matters is the end result
00:07:17.780 | because that might have been an unintentional byproduct.
00:07:20.820 | In which case, that person,
00:07:22.740 | had you been able to show them the future,
00:07:25.660 | might have changed what they were doing.
00:07:27.220 | So were they evil or misguided or wrong?
00:07:30.860 | And I hate to do that
00:07:32.340 | because there's certain people like Hitler
00:07:33.620 | that I don't feel deserve the benefit of the doubt.
00:07:36.380 | At the same time, if you're fascinated
00:07:38.320 | by the concept of evil and you delve into it deeply enough,
00:07:42.060 | you're going to want to understand
00:07:43.540 | why these evil people did what they did.
00:07:46.620 | And sometimes it can confuse the hell out of you.
00:07:49.660 | Who wants to sit there and try to see things
00:07:51.380 | from Hitler's point of view to get a better understanding
00:07:53.140 | and sort of commiserate with?
00:07:54.740 | But obviously, it was our first history show,
00:07:57.180 | I'm fascinated with the concept.
00:07:58.740 | - So do you think it's possible
00:08:01.500 | if we put ourselves in the mindset
00:08:03.180 | of some of the people that have
00:08:05.540 | created so much suffering in the world,
00:08:08.060 | that all of them had motivations
00:08:12.520 | with good intentions underlying them?
00:08:15.180 | - No, I don't, simply 'cause there's so many,
00:08:17.920 | I mean, the law of averages would suggest
00:08:20.680 | that that's not true.
00:08:21.520 | - I guess, is pure evil possible,
00:08:24.840 | meaning, again, it's slippery,
00:08:28.000 | but the suffering is the goal?
00:08:31.360 | - Suffering, intentional suffering.
00:08:33.280 | - Yeah. - Yes.
00:08:34.360 | I think that there's historical figures
00:08:36.580 | that one could point to, but that gets to the deeper question
00:08:39.740 | of are these people sane?
00:08:42.340 | Do they have something wrong with them?
00:08:43.460 | Are they twisted from something in their youth?
00:08:45.760 | You know, these are the kinds of things
00:08:50.140 | where you start to delve into the psychological makeup
00:08:53.060 | of these people.
00:08:53.980 | In other words, is anybody born evil?
00:08:56.260 | And I actually believe that some people are.
00:08:58.860 | I think the DNA can get scrambled up in ways.
00:09:01.500 | I think the question of evil is important too,
00:09:03.820 | because I think it's an eye of the beholder thing.
00:09:05.740 | I mean, if Hitler, for example, had been successful,
00:09:09.020 | and we were today on the sixth or seventh leader
00:09:12.660 | of the Third Reich, then I think his entire history
00:09:16.960 | would be viewed through a different lens,
00:09:18.480 | 'cause that's the way we do things, right?
00:09:20.700 | Genghis Khan looks different to the Mongolians
00:09:23.140 | than he does to the residents of Baghdad, right?
00:09:25.980 | And I think, so an eye of the beholder question,
00:09:28.900 | I think, comes into all these sorts of things.
00:09:30.580 | As you said, it's a very slippery question.
00:09:32.740 | - Where do you put, as somebody who's fascinated
00:09:35.320 | by military history, where do you put violence?
00:09:38.540 | In terms of the human condition,
00:09:43.420 | is it core to being human, or is it just a little tool
00:09:47.080 | that we use every once in a while?
00:09:49.340 | - So I'm gonna respond to your question with a question.
00:09:52.020 | What do you see the difference being
00:09:54.380 | between violence and force?
00:09:56.840 | Let me go farther.
00:09:58.820 | I'm not sure that violence is something
00:10:02.780 | that we have to put up with as human beings forever,
00:10:05.600 | that we must resign ourselves to violence forever.
00:10:09.560 | But I have a much harder time seeing us
00:10:12.800 | able to abolish force.
00:10:15.720 | And there's going to be some ground
00:10:19.080 | where if those two things are not the same,
00:10:21.720 | and I don't know, maybe they are,
00:10:23.440 | there's certainly some crossover.
00:10:25.480 | And I think force, you're an engineer,
00:10:28.920 | you'll understand this better than I do,
00:10:30.080 | but think about it as a physical law.
00:10:32.940 | You can't stop something from moving
00:10:35.020 | in a certain direction without pushing back
00:10:36.980 | against it, and I'm not sure
00:10:40.420 | that you can have a society or a civilization
00:10:44.460 | without the ability to use a counterforce
00:10:48.640 | when things are going wrong,
00:10:50.540 | whether it's on an individual level, right?
00:10:53.420 | Person attacks another person,
00:10:55.100 | so you step in to save that person,
00:10:57.760 | or even at the highest levels of politics or anything else,
00:11:01.580 | a counterforce to stop the inertia
00:11:04.540 | or the impetus of another movement.
00:11:07.320 | So I think that force is a simple,
00:11:11.160 | almost law of physics in human interaction,
00:11:13.980 | especially at the civilizational level.
00:11:15.680 | I think civilization requires a certain amount of,
00:11:18.620 | if not violence, then force.
00:11:20.920 | So, and again, they've talked,
00:11:23.740 | I mean, it goes back to St. Augustine,
00:11:25.700 | all kinds of Christian beliefs
00:11:26.700 | about the proper use of force,
00:11:28.840 | and people have philosophically tried to decide
00:11:31.920 | between can you have a sort of an ahimsa Buddhist,
00:11:35.900 | sort of we will be nonviolent toward everything
00:11:38.620 | and exert no force, or there's a reason to have force
00:11:42.100 | in order to create the space for good.
00:11:44.860 | I think force is inevitable.
00:11:47.460 | Now, we can talk, and I've not come up
00:11:50.260 | to the conclusion myself, if there is a distinction
00:11:52.660 | to be made between force and violence.
00:11:54.740 | I mean, is a nonviolent force enough,
00:11:58.400 | or is violence when done for the cause of good
00:12:02.100 | a different thing than violence done
00:12:03.900 | either for the cause of evil, as you would say,
00:12:05.660 | or simply for random reasons?
00:12:08.800 | I mean, we humans lack control sometimes.
00:12:10.800 | We can be violent for no apparent reason or goal.
00:12:14.340 | And that's, I mean, you look at the criminal justice system
00:12:17.660 | alone and the way we interact with people
00:12:20.820 | who are acting out in ways that we as a society
00:12:23.820 | have decided is intolerable.
00:12:25.940 | Can you deal with that without force?
00:12:27.980 | And at some level, violence?
00:12:29.320 | I don't know.
00:12:30.160 | Can you maintain peacefulness without force?
00:12:32.960 | I don't know.
00:12:33.800 | - Just to be a little bit more specific
00:12:36.900 | about the idea of force, do you put force
00:12:40.500 | as general enough to include force in the space of ideas?
00:12:45.500 | So you mentioned Buddhism or religion or just Twitter.
00:12:51.240 | - I can think of no things farther apart than that.
00:12:56.340 | - Okay.
00:12:58.020 | The battles we do in the space of ideas,
00:13:03.020 | the great debates throughout history,
00:13:07.460 | do you put force into that?
00:13:09.580 | Or do you, in this conversation,
00:13:12.700 | are we trying to right now keep it to just physical force?
00:13:16.100 | In saying that you have an intuition
00:13:20.660 | that force might be with us much longer than violence.
00:13:25.140 | - I think the two bleed together.
00:13:27.780 | So take, because it's always my go-to example.
00:13:32.780 | I'm afraid, and I'm sure that the listeners all hate it,
00:13:35.520 | but take Germany during the 1920s, early 1930s
00:13:40.380 | before the Nazis came to power.
00:13:42.500 | And they were always involved in some level of force,
00:13:45.180 | beating up in the streets or whatever it might be.
00:13:47.000 | But think about it more like an intellectual discussion
00:13:49.980 | until a certain point.
00:13:53.740 | It would be difficult, I imagine,
00:13:55.820 | to keep the intellectual counterforce of ideas
00:14:00.300 | from at some point degenerating
00:14:02.060 | into something that's more coercion,
00:14:05.200 | counterforce, if we wanna use the phrases
00:14:08.060 | we were just talking about.
00:14:09.420 | So I think the two are intimately connected.
00:14:11.740 | I mean, actions follow thought, right?
00:14:14.100 | And at a certain point, I think especially
00:14:18.020 | when one is not achieving the goals
00:14:20.660 | that they want to achieve through peaceful discussion
00:14:24.360 | or argumentation or trying to convince the other side
00:14:27.780 | that sometimes the next level of operations
00:14:30.400 | is something a little bit more physically imposing,
00:14:33.260 | if that makes sense.
00:14:34.100 | We go from the intellectual to the physical.
00:14:36.340 | - Yeah, so it too easily spills over into violence.
00:14:39.260 | - Yes, and one leads to the other often.
00:14:41.380 | - So you kind of implied perhaps a hopeful message.
00:14:45.420 | Let me ask it in the form of a question.
00:14:47.300 | Do you think we'll always have war?
00:14:50.120 | - I think it goes to the first question too.
00:14:52.840 | So for example, what do you do?
00:14:56.680 | I mean, let's play with nation states now,
00:14:59.760 | although I don't know that nation states
00:15:01.960 | are something we should think of
00:15:03.160 | as a permanent construct forever.
00:15:06.080 | But how is one nation state supposed to prevent
00:15:09.160 | another nation state from acting in ways
00:15:11.840 | that it would see as either detrimental
00:15:13.560 | to the global community or detrimental
00:15:15.480 | to the interest of their own nation state?
00:15:18.940 | You know, and I think we've had this question
00:15:22.200 | going back to ancient times,
00:15:25.260 | but certainly in the 20th century,
00:15:26.740 | this has come up quite a bit.
00:15:27.800 | I mean, the whole Second World War argument
00:15:30.320 | sometimes revolves around the idea
00:15:32.040 | of what the proper counterforce should be.
00:15:34.720 | Can you create an entity, a League of Nations,
00:15:37.040 | a United Nations, a one world entity maybe even,
00:15:41.240 | that alleviates the need for counterforce
00:15:44.280 | involving mass violence and armies and navies
00:15:46.540 | and those things?
00:15:47.680 | I think that's an open discussion we're still having.
00:15:50.320 | - It's good to think through that
00:15:53.760 | because having a United Nations,
00:15:57.100 | there's usually centralized control.
00:15:59.020 | So there's humans at the top,
00:16:01.220 | there's committees and usually like leaders emerge
00:16:05.900 | as singular figures that then can become corrupted by power.
00:16:10.740 | And it's just a really important,
00:16:12.500 | it feels like a really important thought experiment
00:16:15.160 | and something to really rigorously think through.
00:16:18.400 | How can you construct systems of government
00:16:20.980 | that are stable enough to push us towards
00:16:26.200 | less and less war, and less and less unstable
00:16:30.200 | and, another tough word,
00:16:35.100 | unfair application of force.
00:16:39.240 | That's really at the core of the question
00:16:42.420 | that we're trying to figure out as humans,
00:16:44.760 | as our weapons get better and better
00:16:46.720 | at destroying ourselves,
00:16:48.360 | it feels like it's important to think about
00:16:50.880 | how we minimize the overapplication
00:16:54.960 | or unfair application of force.
00:16:56.940 | - There's other elements that come into play too.
00:16:59.440 | You and I are discussing this
00:17:00.440 | at the very high intellectual level of things,
00:17:02.720 | but there's also a tail wagging the dog element to this.
00:17:05.420 | So think of a society of warriors,
00:17:08.200 | a tribal society from a long time ago.
00:17:11.520 | How much does the fact that you have warriors in your society,
00:17:15.600 | and that their reason for existing,
00:17:17.600 | what they take pride in, what they train for,
00:17:20.100 | what their status is in their own civilization,
00:17:23.120 | how much does that itself drive
00:17:25.880 | the responses of that society?
00:17:28.560 | How much do you need war to legitimize warriors?
00:17:32.120 | That's the old argument that you get to
00:17:34.680 | and we've had this in the 20th century too,
00:17:36.260 | that the creation of arms and armies
00:17:39.360 | creates an incentive to use them, right?
00:17:42.920 | And that they themselves can drive that incentive
00:17:45.400 | as a justification for their reasons for existence.
00:17:48.900 | That's where we start to talk about the interactivity
00:17:52.760 | of all these different elements of society upon one another.
00:17:55.920 | So when we talk about governments and war,
00:17:58.560 | we need to take into account the various things
00:18:01.060 | those governments have put into place
00:18:02.300 | in terms of systems and armies and things like that
00:18:05.120 | to protect themselves, right?
00:18:06.440 | For reasons we can all understand,
00:18:08.120 | but they exert a force on your range of choices, don't they?
00:18:13.120 | - It's true, you're making me realize that in my upbringing,
00:18:17.080 | and I think the upbringing of many, warriors are heroes.
00:18:21.480 | To me, I don't know where that feeling comes from,
00:18:25.160 | but to sort of die fighting is an honorable way to die.
00:18:30.160 | It feels like that.
00:18:34.440 | - I've always had a problem with this
00:18:35.680 | because as a person interested in military history,
00:18:38.240 | the distinction is important.
00:18:40.020 | And I try to make it at different levels.
00:18:42.680 | So at base level, the people who are out there
00:18:45.820 | on the front lines doing the fighting,
00:18:48.820 | to me, those people can be compared with police officers
00:18:52.520 | and firemen, fire persons,
00:18:55.720 | I mean, people that are involved in an ethical attempt
00:19:03.320 | to perform a task, which ultimately one can see
00:19:07.440 | in many situations as being a saving sort of task, right?
00:19:12.440 | Or if nothing else, a self-sacrifice
00:19:15.720 | for what they see as the greater good.
00:19:17.320 | Now, I draw a distinction between the individuals
00:19:20.880 | and the entity that they're a part of, a military.
00:19:23.640 | And I certainly draw a distinction between the military
00:19:26.720 | and then the entire, for lack of a better word,
00:19:28.760 | military industrial complex that that service is a part of.
00:19:32.840 | I feel a lot less moral attachment to those upper echelons
00:19:37.840 | than I do the people on the ground.
00:19:40.560 | The people on the ground could be any of us
00:19:42.200 | and have been in a lot of, you know,
00:19:43.800 | we have a very professional sort of military now
00:19:46.680 | where it's a subset of the population,
00:19:50.160 | but in other periods of time,
00:19:52.260 | we've had conscription and drafts
00:19:54.480 | and it hasn't been a subset of the population.
00:19:56.680 | It's been the population, right?
00:19:58.520 | And so it is the society oftentimes going to war.
00:20:01.720 | And I make a distinction between those warriors
00:20:04.280 | and the entities, either in the system
00:20:06.960 | that they're a part of the military
00:20:08.240 | or the people that control the military
00:20:10.720 | at the highest political levels.
00:20:12.280 | I feel a lot less moral attachment to them.
00:20:15.880 | And I feel much harsher about them.
00:20:19.600 | I do not consider the military itself to be heroic.
00:20:24.600 | And I do not consider the military industrial complex
00:20:27.540 | to be heroic.
00:20:28.840 | I do think that is a tail wagging the dog situation.
00:20:31.840 | I do think that draws us into looking at military endeavors
00:20:36.840 | as a solution to the problem much more quickly
00:20:39.600 | than we otherwise might.
00:20:41.180 | And to be honest, to tie it all together,
00:20:42.900 | I actually look at the victims of this
00:20:45.540 | as the soldiers we were talking about.
00:20:47.560 | If you set a fire to send firemen into to fight,
00:20:52.560 | then I feel bad for the firemen.
00:20:55.720 | I feel like you've abused the trust
00:20:57.660 | that you give those people, right?
00:20:58.900 | So when people talk about war,
00:21:01.200 | I always think that the people that we have to make sure
00:21:03.940 | that a war is really necessary in order to protect
00:21:07.840 | are the people that you're gonna send over there
00:21:09.260 | to fight that.
00:21:10.100 | The greatest victims in our society of war
00:21:12.640 | are often the warriors.
00:21:14.240 | So in my mind, when we see these people coming home
00:21:18.160 | from places like Iraq,
00:21:19.480 | a place where I would have made the argument
00:21:21.760 | and did at the time that we didn't belong,
00:21:24.120 | to me, those people are victims.
00:21:26.560 | And I know they don't like to think about themselves
00:21:28.080 | that way 'cause it runs totally counter to the ethos.
00:21:31.360 | But if you're sending people to protect this country's shores,
00:21:35.840 | those are heroes.
00:21:37.480 | If you're sending people to go do something
00:21:39.960 | that they otherwise probably don't need to do,
00:21:42.080 | but they're there for political reasons
00:21:43.600 | or anything else you wanna put in
00:21:44.800 | that's not defense related,
00:21:46.440 | well, then you've made victims of our heroes.
00:21:48.700 | And so I feel like we do a lot of talk
00:21:52.580 | about our troops and our soldiers and stuff,
00:21:54.560 | but we don't treat them as valuable
00:21:57.000 | as the rhetoric makes them sound.
00:21:59.680 | Otherwise we would be much more careful
00:22:03.720 | about where we put them.
00:22:05.120 | If you're gonna send my son,
00:22:06.840 | and I don't have a son, I have daughters,
00:22:08.000 | but if you're gonna send my son into harm's way,
00:22:11.800 | I'm going to demand that you really need
00:22:14.480 | to be sending him into harm's way.
00:22:15.720 | And I'm going to be angry at you
00:22:17.360 | if you put him into harm's way if it doesn't warrant it.
00:22:21.280 | And so I have much more suspicion about the system
00:22:23.840 | that sends these people into these situations
00:22:25.880 | where they're required to be heroic
00:22:28.360 | than I do the people on the ground
00:22:29.600 | that I look at as either the people that are defending us
00:22:34.160 | in situations like the Second World War, for example,
00:22:37.160 | or the people that turn out to be the individual victims
00:22:41.400 | of a system where they're just a cog in the machine
00:22:44.000 | and the machine doesn't really care as much about them
00:22:46.920 | as the rhetoric and the propaganda would insinuate.
00:22:51.920 | - Yeah, and with my own family history,
00:22:54.840 | it would be nice if we could talk about,
00:22:57.560 | there's a gray area in the places
00:23:00.080 | that you're talking about.
00:23:01.480 | - There's a gray area in everything.
00:23:03.120 | - In everything.
00:23:03.960 | But when that gray area is part of your own blood,
00:23:08.880 | as it is for me, it's worth shining a light on somehow.
00:23:13.880 | - Sure, give me an example of what you mean.
00:23:17.720 | - So you did a program of four episodes
00:23:20.560 | of "Ghosts of the Ostfront."
00:23:22.840 | So I was born in the Soviet Union.
00:23:26.640 | I was raised in Moscow.
00:23:27.880 | My dad was born and raised in Kiev.
00:23:30.680 | My grandmother, who just recently passed away,
00:23:33.200 | was raised in Ukraine.
00:23:38.200 | She-- - City?
00:23:39.680 | - It's a small city on the border
00:23:42.600 | between Russia and Ukraine.
00:23:44.320 | - I have a grandfather born in Kiev.
00:23:45.760 | - In Kiev.
00:23:46.760 | The interesting thing about the timing of everything,
00:23:49.680 | as you might be able to connect,
00:23:51.400 | is she survived, she's the most badass woman
00:23:55.280 | I've ever encountered in my life,
00:23:57.120 | and most of the warrior spirit I carry is probably from her.
00:24:01.680 | She survived Holodomor, the Ukrainian starvation of the '30s.
00:24:05.880 | She was a beautiful teenage girl
00:24:08.640 | during the Nazi occupation.
00:24:10.360 | So she survived all of that.
00:24:14.360 | And of course, family, that everybody,
00:24:18.800 | so many people died through that whole process.
00:24:21.320 | And one of the things you talk about in your program
00:24:25.280 | is that the gray area is, even with the warriors,
00:24:30.280 | it happened to them, just like as you're saying now,
00:24:34.000 | they didn't have a choice.
00:24:35.560 | So my grandfather on the other side,
00:24:38.440 | he was a machine gunner that was in Ukraine that--
00:24:43.440 | - In the Red Army?
00:24:46.880 | - In the Red Army, yeah.
00:24:48.120 | And, like, the statement was,
00:24:53.120 | I don't know if it's obvious or not,
00:24:55.560 | but the rule was there's no surrender,
00:24:57.800 | so you better die.
00:24:59.920 | So you, I mean, basically the goal was,
00:25:03.480 | when he was fighting, and he was lucky enough,
00:25:06.480 | one of the only ones to survive, by being wounded early on,
00:25:10.960 | there was a march of Nazis towards, I guess, Moscow.
00:25:16.640 | And the whole goal in Ukraine was to slow,
00:25:19.960 | to slow them into the winter.
00:25:23.920 | I mean, I view him as such a hero.
00:25:26.880 | And he believed that he's indestructible,
00:25:31.720 | which is survivor bias,
00:25:33.840 | and that bullets can't hurt him.
00:25:37.920 | And that's what everybody believed.
00:25:39.680 | And of course, basically everyone that,
00:25:43.520 | he quickly rose through the ranks, let's just put it this way,
00:25:46.360 | because everybody died.
00:25:48.120 | It was just bodies dragging these heavy machine guns,
00:25:53.680 | like always slowly retreating,
00:25:56.960 | shooting and retreating, shooting and retreating.
00:25:59.840 | And I don't know, he was a hero to me.
00:26:03.360 | Like, I grew up thinking that he was the one
00:26:08.720 | that sort of defeated the Nazis, right?
00:26:11.840 | But the reality, there could be another perspective,
00:26:14.280 | which is all of this happened to him
00:26:16.960 | by the incompetence of Stalin,
00:26:21.360 | and men of the Soviet Union being used like pawns
00:26:26.360 | in a shittily played game of chess, right?
00:26:33.480 | So like, one narrative is of him as a victim,
00:26:38.480 | as you're kind of describing.
00:26:40.880 | And somehow that's more paralyzing,
00:26:44.200 | and that's more, I don't know,
00:26:47.960 | it feels better to think of him as a hero
00:26:51.880 | and as Russia, Soviet Union saving the world.
00:26:55.480 | I mean, that narrative also is in the United States,
00:26:58.040 | that United States was key
00:27:00.440 | in saving the world from the Nazis.
00:27:02.760 | It feels like that narrative is powerful for people.
00:27:05.800 | I'm not sure, and I carry it still with me,
00:27:09.440 | but when I think about the right way
00:27:12.160 | to think about that war,
00:27:14.040 | I'm not sure if that's the correct narrative.
00:27:16.340 | - Let me suggest something.
00:27:18.600 | There's a line that a Marine named Eugene Sledge
00:27:23.600 | had said once, and I keep it on my phone
00:27:26.120 | because it makes a real distinction.
00:27:28.900 | And he said, "The front line is really where the war is.
00:27:33.360 | "And anybody even a hundred yards behind the front line
00:27:36.800 | "doesn't know what it's really like."
00:27:39.480 | Now, the difference is that there are lots of people
00:27:42.240 | miles behind the front line that are in danger, right?
00:27:45.120 | You can be in a medical unit in the rear
00:27:47.240 | and artillery could strike you, planes could strike,
00:27:49.840 | I mean, you could be in danger.
00:27:51.780 | But at the front line, there are two different things.
00:27:53.780 | One is that, at least,
00:27:56.960 | I'm doing a lot of reading on this right now
00:27:58.640 | and reading a lot of veterans' accounts.
00:28:00.760 | James Jones, who wrote books like "From Here to Eternity,"
00:28:04.800 | fictional accounts of the Second World War,
00:28:06.680 | but he based them on his own service.
00:28:08.680 | He was at Guadalcanal, for example, in 1942.
00:28:12.080 | And Jones had said that the evolution of a soldier
00:28:15.560 | in front line action requires an almost surrendering
00:28:20.200 | of the idea that you're going to live,
00:28:22.440 | that you become accustomed to the idea
00:28:24.780 | that you're going to die.
00:28:26.320 | And he said, "You're a different person
00:28:28.440 | "simply for considering that thought seriously,"
00:28:31.360 | because most of us don't.
00:28:32.980 | But what that allows you to do
00:28:34.360 | is to do that job at the front line, right?
00:28:36.720 | If you're too concerned about your own life,
00:28:40.320 | you become less good at your job, right?
00:28:44.480 | The other thing that the people in the 100 yards
00:28:47.200 | at the front line do that the people
00:28:49.240 | in the rear medical unit really don't
00:28:51.600 | is you kill and you kill a lot, right?
00:28:54.100 | You don't just, "Oh, there's a sniper back here,
00:28:55.740 | "so I shot him."
00:28:56.760 | It's we go from one position to another
00:28:58.900 | and we kill lots of people.
00:29:01.040 | Those things will change you.
00:29:02.600 | And what that tends to do, not universally,
00:29:05.400 | 'cause I've read accounts from Red Army soldiers
00:29:08.440 | and they're very patriotic, right?
00:29:10.680 | But a lot of that patriotism comes through years later
00:29:13.620 | as part of the nostalgia and the remembering.
00:29:16.620 | When you're down at that front 100 yards,
00:29:19.280 | it is often boiled down to a very small world.
00:29:22.140 | So your grandfather, was it your grandfather?
00:29:24.560 | - Grandfather. - At the machine gun,
00:29:26.560 | he's concerned about his position and his comrades
00:29:30.120 | and the people who he owes a responsibility to.
00:29:32.760 | And it's a very small world at that point.
00:29:35.380 | And to me, that's where the heroism is, right?
00:29:37.400 | He's not fighting for some giant world,
00:29:39.840 | civilizational thing.
00:29:41.000 | He's fighting to save the people next to him
00:29:43.560 | and his own life at the same time
00:29:45.000 | because they're saving him too.
00:29:46.920 | And there is a huge amount of heroism to that.
00:29:49.920 | And that gets to our question about force earlier.
00:29:52.240 | Why would you use force?
00:29:53.760 | Well, how about to protect these people
00:29:55.680 | on either side of me, right?
00:29:56.760 | Their lives.
00:29:57.740 | Now, is there hatred?
00:30:01.040 | Yeah, I hated the Germans for what they were doing.
00:30:03.300 | As a matter of fact, I got a note from a Pole
00:30:06.200 | not that long ago,
00:30:07.400 | and I have this tendency to refer to the Nazis, right?
00:30:10.760 | The regime that was,
00:30:11.760 | and he said, "Why do you keep calling them Nazis?"
00:30:14.040 | He says, "Say what they were, they were Germans."
00:30:17.000 | And this guy wanted me to not absolve Germany
00:30:21.240 | by saying, "Oh, it was this awful group of people
00:30:23.400 | "that took over your country."
00:30:24.540 | He said, "The Germans did this."
00:30:26.680 | And there's that bitterness where he says,
00:30:28.480 | "Let's not forget what they did to us
00:30:30.760 | "and what we had to do back," right?
00:30:33.740 | So for me, when we talk about these combat situations,
00:30:37.140 | the reason I call these people heroic
00:30:39.040 | is because they're fighting to defend things
00:30:41.920 | we could all understand.
00:30:42.880 | I mean, if you come after my brother
00:30:45.340 | and I take a machine gun and shoot you
00:30:47.700 | because you're gonna overrun me,
00:30:49.600 | I mean,
00:30:50.700 | that becomes the situation
00:30:52.020 | we talked about with counterforce earlier.
00:30:54.120 | Much easier to call yourself a hero
00:30:56.960 | when you're saving people,
00:30:58.020 | or you're saving this town right behind you,
00:30:59.900 | and you know if they get through your machine gun,
00:31:02.340 | they're gonna burn these villages.
00:31:03.540 | They're gonna throw these people out
00:31:04.620 | in the middle of winter, these families.
00:31:06.700 | That to me is a very different sort of heroism
00:31:09.460 | than this amorphous idea of patriotism.
00:31:13.300 | Patriotism is a thing that we often get used with, right?
00:31:17.500 | People manipulate us through love of country and all this,
00:31:21.460 | because they understand that this is something
00:31:23.020 | we feel very strongly,
00:31:24.060 | but they use it against us sometimes
00:31:26.500 | in order to whip up a war fever or to get people,
00:31:29.500 | I mean, there's a great line,
00:31:30.980 | and I wish I could remember it in its entirety,
00:31:32.660 | that Herman Goering had said about how easy it was
00:31:35.460 | to get the people into a war.
00:31:37.220 | He says, you know, you just appeal to their patriotism.
00:31:39.620 | I mean, there's buttons that you can push,
00:31:41.500 | and they take advantage of things like love of country
00:31:44.460 | and the way we have a loyalty and admiration
00:31:48.060 | to the warriors who put their lives on the line.
00:31:50.140 | These are manipulatable things in the human species
00:31:53.820 | that reliably can be counted on to move us
00:31:57.700 | in directions that in a more sober,
00:32:01.780 | reflective state of mind, we would consider differently.
00:32:05.100 | It gets the, I mean, you get this war fever up,
00:32:06.980 | and people wave flags,
00:32:08.740 | and they start denouncing the enemy,
00:32:09.900 | and they start, I mean, you know,
00:32:11.020 | we've seen it over and over and over again.
00:32:12.940 | In ancient times, this happened.
00:32:14.660 | - But the love of country is also beautiful.
00:32:17.340 | So I haven't seen it in America as much.
00:32:19.900 | So people in America love their country.
00:32:22.180 | Like, there's patriotism strong in America,
00:32:24.820 | but it's not as strong as I remember,
00:32:27.500 | even with my sort of being younger,
00:32:30.420 | the love of the Soviet Union.
00:32:32.860 | - Now, was it the Soviet Union, and this requires a distinction,
00:32:36.700 | or was it Mother Russia?
00:32:39.140 | - What it really was was the Communist Party.
00:32:41.300 | - Okay, so it was the system in place, okay.
00:32:43.660 | - The system in place, like loving,
00:32:46.340 | I haven't quite deeply psychoanalyzed exactly what you love.
00:32:49.980 | I think you love that populist message of the worker,
00:32:56.540 | of the common man, the common person.
00:32:58.700 | - So let me draw the comparison then.
00:33:01.420 | And I often say this,
00:33:02.460 | that the United States, like the Soviet Union,
00:33:06.200 | is an ideological-based society, right?
00:33:09.940 | So you take a country like France.
00:33:13.380 | It doesn't matter which French government you're in now.
00:33:16.100 | The French have been the French for a long time, right?
00:33:19.060 | It's not based on an ideology, right?
00:33:22.700 | Whereas what unites the United States is an ideology,
00:33:26.260 | freedom, liberty, the Constitution.
00:33:28.460 | This is what draws, you know,
00:33:29.300 | it's the e pluribus unum kind of the idea, right?
00:33:32.100 | That out of many, one.
00:33:33.340 | Well, what binds all these unique, different people?
00:33:37.020 | These shared beliefs, this ideology.
00:33:39.380 | The Soviet Union was the same way,
00:33:40.780 | 'cause as you know, the Soviet Union,
00:33:42.180 | Russia was merely one part of the Soviet Union.
00:33:45.860 | And if you believe the rhetoric until Stalin's time,
00:33:49.180 | everybody was going to be united
00:33:52.000 | under this ideological banner someday, right?
00:33:54.300 | It was a global revolution.
00:33:56.220 | So ideological societies are different.
00:33:59.100 | And to be a fan of the ideological framework and goal,
00:34:03.820 | I mean, I'm a liberty person, right?
00:34:05.820 | I would like to see everybody in the world
00:34:08.460 | have my system of government,
00:34:09.740 | which is part of a bias, right?
00:34:12.420 | Because they might not want that.
00:34:14.180 | But I think it's better for everyone,
00:34:15.980 | 'cause I think it's better for me.
00:34:17.820 | At the same time, the ideology,
00:34:20.860 | if you consider it, you know,
00:34:22.340 | stems from ideas of the Enlightenment,
00:34:25.240 | and there's a bias there.
00:34:26.580 | So my biases are toward that, and you feel,
00:34:28.820 | and this is why you say,
00:34:29.660 | we're gonna bring freedom to Iraq.
00:34:30.740 | We're gonna bring freedom to here.
00:34:31.580 | We're gonna bring freedom,
00:34:32.420 | because we think we're spreading to you
00:34:34.300 | something that is just undeniably positive.
00:34:37.900 | We're going to free you and give you this.
00:34:40.860 | It's hard for me to wipe my own bias away from there, right?
00:34:47.740 | 'Cause if I were in Iraq, for example,
00:34:50.180 | I would want freedom, right?
00:34:51.820 | But if you then leave and let the Iraqis vote
00:34:54.740 | for whomever they want,
00:34:56.300 | are they gonna vote for somebody that will,
00:34:58.660 | I mean, you know, you look at Russia now,
00:35:01.520 | and I hear from Russians quite a bit,
00:35:03.120 | because so much of my views on Russia and the Soviet Union
00:35:07.580 | were formed in my formative years.
00:35:09.820 | And, you know, we were not hearing from many people
00:35:12.700 | in the Soviet Union back then, but now you do.
00:35:14.860 | You hear from Russians today who will say,
00:35:16.500 | your views on Stalin are archaic and cold.
00:35:19.300 | So you try to reorient your beliefs a little bit,
00:35:21.800 | but it goes to this idea of,
00:35:23.260 | if you gave the people in Russia a free and fair vote,
00:35:27.060 | will they vote for somebody who promises them
00:35:29.260 | a free and open society
00:35:30.660 | based on enlightenment democratic principles?
00:35:33.060 | Or will they vote for somebody we in the US would go,
00:35:35.600 | what are they doing?
00:35:36.440 | They're voting for some strongman who's just good.
00:35:38.500 | You know, so I think it's very hard
00:35:41.220 | to throw away our own biases and preconceptions.
00:35:45.220 | And, you know, it's an all eye of the beholder
00:35:47.660 | kind of thing.
00:35:48.780 | But when you're talking about ideological societies,
00:35:52.060 | it is very difficult to throw off
00:35:55.100 | all the years of indoctrination
00:35:57.140 | into the superiority of your system.
00:35:59.680 | I mean, listen, in the Soviet Union,
00:36:01.500 | Marxism one way or another was part of every classroom.
00:36:04.780 | You know, you could be studying geometry
00:36:06.640 | and they'll throw Marxism in there somehow,
00:36:09.080 | because that's what united the society.
00:36:11.580 | And that's what gave it a higher purpose.
00:36:13.540 | And that's what made it,
00:36:15.040 | in the minds of the people who were its defenders,
00:36:17.820 | a superior, morally superior system.
00:36:20.500 | And we do the same thing here.
00:36:21.680 | In fact, most people do, but see, you're still French,
00:36:24.800 | no matter what the ideology or the government might be.
00:36:28.100 | So in that sense, it's funny that there would be a Cold War
00:36:31.720 | with these two systems,
00:36:32.800 | because they're both ideologically based systems
00:36:35.600 | involving peoples of many different backgrounds
00:36:38.300 | who are united under the umbrella of the ideology.
00:36:42.060 | - First of all, that's brilliantly put.
00:36:45.120 | I'm in a funny position that in my formative years,
00:36:48.580 | I came here when I was 13, which is when, you know,
00:36:53.220 | as a teenager, you have your first love or whatever,
00:36:55.820 | and I fell in love with the American set of ideas
00:36:59.660 | of freedom and individualism.
00:37:00.860 | - They're compelling, aren't they?
00:37:01.700 | They're compelling, yes.
00:37:02.780 | - But I also remember, it's like, you remember,
00:37:05.060 | like maybe an ex-girlfriend or something like that.
00:37:07.460 | I also remember loving, as a very different human,
00:37:12.460 | the Soviet idea.
00:37:15.900 | Like we had the national anthem,
00:37:17.740 | which is still, I think, the most badass national anthem,
00:37:20.860 | which is the Soviet Union.
00:37:22.580 | Like saying, "We're the indestructible nation."
00:37:24.740 | I mean, just the words are so, like,
00:37:27.300 | American words are like, "Oh, we're nice.
00:37:29.700 | "Like, we're freedom."
00:37:31.300 | But like a Russian Soviet Union national anthem was like,
00:37:34.460 | "We're bad motherfuckers.
00:37:35.880 | "Nobody will destroy us."
00:37:38.060 | I just remember feeling pride in a nation as a kid,
00:37:41.820 | like dumb, not knowing anything,
00:37:43.260 | 'cause we all had to recite the stuff.
00:37:45.340 | There was uniformity to everything.
00:37:48.500 | There was pride underlying everything.
00:37:50.460 | I didn't think about all the destructive nature
00:37:52.940 | of the bureaucracy, the incompetence,
00:37:56.420 | all the things that come with the implementation
00:38:01.000 | of communism, especially around the '80s and '90s.
00:38:05.120 | But I remember what it's like to love that set of ideas.
00:38:09.620 | So I'm in a funny place of, like,
00:38:12.280 | remembering, like, switching the love,
00:38:14.580 | 'cause I kind of joke around about being Russian,
00:38:18.180 | but my long-term monogamous relationship
00:38:21.300 | is now with the idea, the American ideal.
00:38:23.780 | Like I'm stuck with it in my mind.
00:38:25.900 | But I remember what it was like to love it.
00:38:28.940 | And I think about that too when people criticize China
00:38:31.900 | or they criticize the current state of affairs
00:38:34.660 | with how Stalin is remembered and how Putin is,
00:38:40.420 | to know that you can't always wear the American ideal
00:38:45.420 | of individualism, radical individualism,
00:38:49.580 | and freedom in analyzing the ways of the world elsewhere,
00:38:54.580 | like in China, in Russia, that it does,
00:38:58.260 | if you don't take yourself too seriously,
00:39:01.380 | as Americans all do, as I do,
00:39:03.340 | it's kind of a beautiful love to have for your government,
00:39:09.260 | to believe in the nation, to let go of yourself
00:39:13.140 | and your rights and your freedoms,
00:39:15.380 | to believe in something bigger than yourself.
00:39:18.060 | That's actually, that's a kind of freedom.
00:39:22.380 | You're actually liberating yourself.
00:39:24.700 | If you think like life is suffering,
00:39:26.740 | you're giving into the flow of the water,
00:39:30.220 | the flow, the way of the world,
00:39:32.260 | by giving away more power from yourself
00:39:35.060 | and giving it to what you would conceive as,
00:39:37.400 | "The power of the people," together.
00:39:40.100 | Together, we'll do great things.
00:39:41.500 | And really believing in the ideals of,
00:39:44.820 | in this case, I don't even know what you would call Russia,
00:39:50.300 | but whatever the heck that is, authoritarian,
00:39:53.580 | powerful state, powerful leader,
00:39:56.460 | believing that can be as beautiful
00:40:00.120 | as believing in the American ideal.
00:40:02.280 | - Not just that, let me add to what you're saying.
00:40:06.440 | I spend a lot of time trying to get out of my own biases.
00:40:11.220 | It is a fruitless endeavor long-term,
00:40:14.300 | but you try to be better than you normally are.
00:40:16.260 | One of the critiques that China makes,
00:40:19.140 | and I always, as an American,
00:40:20.580 | tend to think about this as their government, right?
00:40:22.640 | This is a rationale that their government puts forward.
00:40:25.200 | But what you just said,
00:40:27.900 | if you can make that viewpoint beautiful,
00:40:30.420 | is actually kind of a beautiful way of approaching it.
00:40:32.220 | The Chinese would say that what we call human rights
00:40:36.020 | in the United States,
00:40:36.860 | and what we consider to be everybody's birthright
00:40:39.140 | around the world, is instead Western rights.
00:40:42.860 | That's the words they use, Western rights.
00:40:44.460 | It's a fundamentally Western-oriented,
00:40:47.940 | and I'll go back to the enlightenment-based ideas
00:40:51.180 | on what constitutes the rights of man.
00:40:55.500 | And they would suggest that that's not internationally
00:40:58.500 | and always applicable, right?
00:41:00.340 | That you can make a case, and again, I don't believe this.
00:41:03.420 | This runs against my own personal views,
00:41:05.300 | but that you could make a case
00:41:07.060 | that the collective wellbeing
00:41:09.260 | of a very large group of people
00:41:11.540 | outweighs the individual needs of any single person,
00:41:15.500 | especially if those things are in conflict
00:41:18.140 | with each other, right?
00:41:18.980 | If you cannot provide for the greater good
00:41:21.640 | because everyone's so individualistic,
00:41:24.220 | well then, really, what is the better thing to do, right?
00:41:27.020 | To suppress individualism so everybody's better off?
00:41:29.940 | I think trying to recognize how someone else
00:41:33.860 | might see that is important if we wanna,
00:41:36.020 | you know, you had talked about eliminating war.
00:41:37.860 | We talked about eliminating conflict.
00:41:39.980 | The first need to do that is to try to understand
00:41:42.700 | how someone else might view something
00:41:44.420 | differently than yourself.
00:41:45.740 | I'm famously one of those people who buys in
00:41:50.480 | to the ideas of traditional Americanism, right?
00:41:53.780 | And look, a lot of people who live today,
00:41:56.540 | I mean, they would seem to think that things like patriotism
00:42:00.540 | require a belief in a strong military
00:42:04.100 | and all these things we have today,
00:42:05.040 | but that is a corruption of traditional Americanism,
00:42:07.480 | which viewed all those things with suspicion
00:42:10.100 | in the first hundred years of the Republic,
00:42:12.020 | because they saw it as an enemy to the very things
00:42:15.260 | that Americans celebrated, right?
00:42:17.140 | How could you have freedom and liberty
00:42:20.100 | and individualistic expression
00:42:23.100 | if you had an overriding military
00:42:24.900 | that was always fighting wars?
00:42:26.320 | And the founders of this country looked to other examples,
00:42:29.320 | like Europe, for example,
00:42:30.620 | and saw that standing militaries, for example,
00:42:33.380 | standing armies were the enemy of liberty.
00:42:36.700 | Well, we have a standing army now.
00:42:39.220 | And one that is totally interwoven in our entire society.
00:42:43.300 | If you could go back in time and talk to John Quincy Adams,
00:42:47.700 | right, early president of the United States,
00:42:49.480 | and show him what we have now,
00:42:51.500 | he would think it was awful and horrible.
00:42:54.660 | And that somewhere along the line,
00:42:56.940 | the Americans had lost their way
00:42:59.800 | and forgotten what they were all about.
00:43:01.760 | But we have so successfully interwoven
00:43:04.840 | this modern military industrial complex
00:43:08.300 | with the traditional benefits
00:43:11.780 | of the American system and ideology,
00:43:14.000 | so that they've become intertwined in our thinking.
00:43:16.060 | Whereas 150 years ago,
00:43:18.080 | they were actually considered to be at opposite polarities
00:43:21.360 | and a threat to one another.
00:43:23.840 | So when you talk about the love of the nation,
00:43:27.040 | I tend to be suspicious of those things.
00:43:29.260 | I tend to be suspicious of government.
00:43:31.000 | I tend to try very hard to not be manipulated.
00:43:34.480 | And I feel like a large part of what they do
00:43:37.640 | is manipulation and propaganda.
00:43:40.200 | And so I think a healthy skepticism of the nation state
00:43:45.160 | is actually 100% Americanism
00:43:47.920 | in the traditional sense of the word.
00:43:49.960 | But I also have to recognize, as you so eloquently stated,
00:43:53.980 | Americanism is not necessarily universal at all.
00:43:58.660 | And so I think we have to try to be more understanding.
00:44:02.580 | See, the traditional American viewpoint
00:44:05.940 | is that if a place like China
00:44:07.420 | does not allow their people individual human rights,
00:44:10.240 | then they're being denied something.
00:44:12.220 | They're being denied, and 100 years ago,
00:44:14.680 | they would have said they're God-given rights.
00:44:17.140 | Man is born free, and if he's not free,
00:44:19.360 | it's because of something done to him.
00:44:22.060 | The government has taken away his God-given rights.
00:44:25.500 | - I'm getting excited just listening to that.
00:44:27.060 | (laughs)
00:44:27.900 | - Well, but I mean, I think the idea that this is universal
00:44:31.260 | is in and of itself a bias.
00:44:33.980 | Now, do I want freedom for everybody else?
00:44:36.140 | I sure do.
00:44:36.980 | But the people in the Soviet Union
00:44:38.220 | who really bought into that
00:44:40.220 | wanted the workers of the world to unite
00:44:42.500 | and not be exploited by the greedy, blood-sucking people
00:44:47.060 | who worked them to death
00:44:48.300 | and pocketed all of the fruits of their labor.
00:44:51.480 | If you frame it that way,
00:44:52.820 | that sounds like justice as well, you know?
00:44:55.300 | So it is an eye of the beholder sort of thing.
00:44:58.400 | - I'd love to talk to you about Vladimir Putin.
00:45:03.400 | Sort of while we're in this feeling and wave of empathy
00:45:08.380 | and trying to understand others that are not like us,
00:45:11.220 | one of the reasons I started this podcast
00:45:15.140 | is because I believe that there's a few people
00:45:17.340 | I could talk to, some of it is ego,
00:45:21.100 | some of it is stupidity.
00:45:24.060 | Are there some people I could talk to
00:45:26.540 | that not many others can talk to?
00:45:28.880 | The one person I was thinking about was Vladimir Putin.
00:45:33.700 | - You still speak the language?
00:45:35.060 | - I speak the language very well.
00:45:36.460 | - That makes it even easier.
00:45:37.980 | I mean, you might be appointed for that job.
00:45:41.180 | - That's the context in which I'm asking you this question.
00:45:43.660 | What are your thoughts about Vladimir Putin
00:45:48.700 | from a historical context?
00:45:50.700 | Have you studied him, have you thought about him?
00:45:54.260 | - Yes, studied is a loaded word.
00:45:57.580 | And again, I find it hard sometimes
00:46:02.060 | to not filter things through an American lens.
00:46:05.260 | So as an American, I would say that the Russians
00:46:09.360 | should be allowed to have any leader that they wanna have.
00:46:12.740 | But what an American would say is,
00:46:15.060 | but there should be elections, right?
00:46:17.140 | So if the Russians choose Vladimir Putin
00:46:20.100 | and they keep choosing him, that's their business.
00:46:23.780 | Where, as an American, I would have a problem
00:46:26.760 | is when that leader stops letting the Russians
00:46:29.920 | make that decision.
00:46:30.920 | And we would say, well, now you're no longer ruling
00:46:34.420 | by the consent of the governed.
00:46:36.460 | You've become the equivalent of a person
00:46:38.460 | who may be oppressing your people.
00:46:40.660 | You might as well be a dictator, right?
00:46:42.800 | Now, there's a difference between a freely elected
00:46:45.900 | and reelected and reelected and reelected dictator, right?
00:46:49.580 | If that's what they want.
00:46:50.420 | And look, it would be silly to broad brush the Russians
00:46:54.600 | like it would be silly to broad brush anyone, right?
00:46:56.760 | Millions and millions of people
00:46:57.880 | with different opinions amongst them all.
00:46:59.900 | But they seem to like a strong person at the helm.
00:47:03.260 | And listen, there's a giant chunk of Americans who do too
00:47:06.940 | in their own country.
00:47:08.220 | But an American would say, as long as the freedom of choice
00:47:12.340 | is given to the Russians to decide this
00:47:15.180 | and not taken away from them, right?
00:47:16.980 | It's one thing to say he was freely elected,
00:47:18.580 | but "a long time ago,
00:47:19.500 | and we've done away with elections since then"
00:47:22.020 | is a different story too.
00:47:23.300 | So my attitude on Vladimir Putin
00:47:25.580 | is if that's who the Russian people want
00:47:27.500 | and you give them the choice, right?
00:47:29.340 | If he's only there because they keep electing him,
00:47:31.620 | that's a very different story.
00:47:32.980 | When he stops offering them the option of choosing him
00:47:37.060 | or not choosing him,
00:47:38.220 | that's when it begins to look nefarious
00:47:40.060 | to someone born and raised with the mindset
00:47:42.900 | and the ideology that is an integral part of yours truly.
00:47:46.260 | And that I can't, you can see gray areas
00:47:48.700 | and nuance all you like, but it's hard to escape.
00:47:51.000 | Try as you wish, and you alluded to this too,
00:47:52.960 | it's hard to escape what was indoctrinated into your bones
00:47:57.260 | in your formative years.
00:47:59.300 | It's like, your bones are growing, right?
00:48:02.220 | And you can't go back.
00:48:03.700 | So to me, this is so much a part of who I am
00:48:06.360 | that I have a hard time jettisoning that and saying,
00:48:09.260 | "Oh no, Vladimir Putin not being elected anymore.
00:48:11.660 | "It's just fine.
00:48:12.860 | "I'm too much of a product of my upbringing to go there."
00:48:16.780 | Does that make sense?
00:48:17.620 | - Yeah, absolutely.
00:48:18.440 | But of course, there's, like we were saying,
00:48:20.180 | there's gray areas, which is, I believe,
00:48:23.820 | I have to think through this,
00:48:25.020 | but I think there is a point at which Adolf Hitler
00:48:28.420 | became the popular choice in Nazi Germany in the '30s.
00:48:32.940 | In the same way, from an American perspective,
00:48:37.660 | you can start to criticize some in a shallow way,
00:48:42.660 | some in a deep way.
00:48:43.940 | The way that Putin has maintained power
00:48:47.340 | is by controlling the press.
00:48:48.620 | So limiting one other freedom that we Americans value,
00:48:51.300 | which is the freedom of the press or freedom of speech.
00:48:55.060 | And it is very possible,
00:48:57.700 | though things are changing now,
00:49:00.340 | that for most of his presidency,
00:49:03.260 | he was the popular choice, and sometimes by far.
00:49:06.760 | And I actually don't have real family in Russia
00:49:11.760 | who don't love Putin.
00:49:14.100 | The only people who write to me about Putin
00:49:17.820 | and not liking him are like sort of activists
00:49:22.420 | who are young, right?
00:49:24.740 | But to me, they're strangers.
00:49:26.900 | I don't know anything about them.
00:49:28.300 | The people I do know, who have a big family in Russia,
00:49:31.140 | they love Putin.
00:49:35.500 | - Do they miss elections?
00:49:37.140 | Would they want the choice to prove it at the ballot box?
00:49:45.620 | Or are they so in love with him
00:49:49.960 | that they wouldn't wanna take a chance
00:49:52.480 | that someone might vote him out?
00:49:54.580 | - No, they don't think of it this way.
00:49:56.280 | And they are aware of the incredible bureaucracy
00:50:00.360 | and corruption that is lurking in the shadows,
00:50:03.560 | which is true in Russia.
00:50:05.040 | - Right, everywhere.
00:50:06.680 | - Everywhere, but there's something about the Russian system,
00:50:09.560 | it's the remnants: corruption is so deeply part
00:50:14.040 | of the Russian, the Soviet system,
00:50:16.400 | that even with the overthrow,
00:50:18.600 | the breaking apart of the Soviet Union,
00:50:21.940 | and Putin coming in and reforming a lot of the system,
00:50:26.940 | it's still deeply in there.
00:50:29.000 | And they're aware of that.
00:50:31.040 | That's part of the, like the love for Putin
00:50:33.320 | is partially grounded in the fear of what happens
00:50:38.080 | when the corrupt take over, the greedy take over.
00:50:42.160 | And they see Putin as the stabilizer,
00:50:44.960 | as like a hard, like force that says-
00:50:49.440 | - A counter force.
00:50:50.400 | - A counter force that gets your shit together.
00:50:53.060 | Like basically, from the Western perspective,
00:50:56.220 | Putin is terrible, but from the Russian perspective,
00:51:01.880 | Putin is the only thing holding this thing together
00:51:04.600 | before it goes, if it collapses.
00:51:07.400 | Now, from the, like Garry Kasparov has been loud on this,
00:51:11.840 | a lot of people from the Western perspective say,
00:51:14.520 | "Well, if it has to collapse, let it collapse."
00:51:17.360 | - That's easier said than done
00:51:19.040 | when you don't have to live through that.
00:51:20.280 | - Exactly.
00:51:21.120 | And so anyone worrying about their family,
00:51:23.960 | and they also remember the inflation
00:51:28.880 | and the economic instability and the suffering
00:51:31.080 | and the starvation that happened in the '90s
00:51:33.040 | with the collapse of the Soviet Union.
00:51:35.080 | And they saw the kind of reform and the economic vibrancy
00:51:38.440 | that happened when Putin took power,
00:51:40.520 | that they think like, "This guy's holding it together."
00:51:43.480 | And they see elections as potentially being mechanisms
00:51:48.480 | by which the corrupt people can manipulate the system
00:51:54.520 | unfairly, as opposed to letting the people
00:51:56.660 | speak with their voice.
00:51:58.180 | They somehow figure out a way to manipulate the elections,
00:52:03.100 | to elect somebody like one of them Western revolutionaries.
00:52:08.100 | And so I think one of the beliefs
00:52:11.560 | that's important to the American system
00:52:13.540 | is the belief in the electoral system
00:52:17.840 | that the voice of the people can be heard
00:52:20.360 | in the various systems of government,
00:52:22.440 | whether it's judicial, whether it's,
00:52:25.900 | I mean, basically the assumption is
00:52:29.060 | that the system works well enough
00:52:31.580 | for you to be able to elect the popular choice.
00:52:36.580 | - Okay, so there's a couple of things
00:52:38.580 | that come to mind on that.
00:52:40.620 | The first one has to do with the idea of oligarchs.
00:52:45.340 | There's a belief in political science,
00:52:47.600 | it's not the overall belief,
00:52:50.420 | but that every society is sort of an oligarchy, really,
00:52:53.500 | if you break it down, right?
00:52:55.300 | So what you're talking about are some of the people
00:52:57.760 | who would form an oligarchic class in Russia,
00:53:02.680 | and that Putin is the guy who can harness
00:53:06.200 | the power of the state to keep those people in check.
00:53:10.000 | The problem, of course, in a system like that,
00:53:12.320 | a strongman system, right,
00:53:13.880 | where you have somebody who can hold the reins
00:53:16.480 | and steer the ship when the ship is violently in a storm,
00:53:20.240 | is the succession.
00:53:21.820 | So if you're not creating a system
00:53:25.160 | that can operate without you,
00:53:27.600 | then that terrible instability and that terrible future
00:53:31.680 | that you justify the strongman for
00:53:35.160 | is just awaiting you, right?
00:53:37.440 | I mean, unless he's actively building the system
00:53:41.340 | that will outlive him and allow successors
00:53:44.680 | to do what he's doing,
00:53:46.440 | then what you've done here is create a temporary,
00:53:49.000 | I would think, a temporary stability here,
00:53:51.000 | because it's the same problem you have in a monarchy, right?
00:53:54.080 | Where you have this one king and he's particularly good,
00:53:57.480 | or you think he's particularly good,
00:53:59.320 | but he's gonna turn that job over
00:54:00.760 | to somebody else down the road,
00:54:02.680 | and the system doesn't guarantee anything,
00:54:04.320 | because no one's really worked on it.
00:54:07.480 | And again, you would tell me,
00:54:08.920 | if Putin is putting into place,
00:54:11.120 | I know he's talked about it over the years,
00:54:13.080 | putting into place a system that can outlive him
00:54:15.800 | and that will create the stability
00:54:17.320 | that the people in Russia like him for when he's gone.
00:54:21.640 | Because if the oligarchs just take over afterwards,
00:54:24.000 | then one might argue, well, we had 20 good years
00:54:27.400 | of stability, but I mean, I would say that
00:54:30.440 | if we're talking about a ship of state here,
00:54:32.560 | the guy steering the ship, maybe,
00:54:34.100 | if you wanted to look at it from the Russian point of view,
00:54:35.760 | has done a great job, maybe, just saying.
00:54:38.320 | But the rocks are still out there,
00:54:39.920 | and he's not going to be at the helm forever.
00:54:42.520 | So one would think that his job is to make sure
00:54:45.000 | that there's going to be someone
00:54:47.460 | who can continue to steer the ship
00:54:49.040 | for the people of Russia after he's gone.
00:54:51.340 | Now, let me ask, 'cause I'm curious
00:54:53.700 | and ignorant, so is he doing that, do you think?
00:54:58.700 | Is he setting it up so that when there is no Putin,
00:55:02.260 | the state is safe?
00:55:04.000 | - From the beginning, that was the idea.
00:55:06.960 | Now, I've read every biography,
00:55:09.240 | every English-language biography on Putin,
00:55:10.680 | so I haven't, I need to think more deeply,
00:55:12.760 | but one of the fascinating things
00:55:16.400 | is how did power change Vladimir Putin?
00:55:20.920 | He was a different man when he took power than he is today.
00:55:24.400 | I actually, in many ways, admire the man that took power.
00:55:29.400 | I think he's very different than Stalin and Hitler
00:55:33.160 | at the moment they took power.
00:55:34.900 | I think Hitler and Stalin were both,
00:55:39.140 | as in our previous discussion, already on the trajectory
00:55:42.500 | of evil.
00:55:44.020 | I think Putin was a humble, loyal, honest man
00:55:49.020 | when he took power.
00:55:50.760 | The man he is today is worth thinking about and studying.
00:55:54.580 | I'm not sure.
00:55:55.520 | - That's an old line, though,
00:55:57.540 | about absolute power corrupting absolutely.
00:55:59.780 | - But it's kind of a line, it's a beautiful quote,
00:56:04.780 | but you have to really think about it.
00:56:07.380 | What does that actually mean?
00:56:10.820 | One of the things I still have to do,
00:56:13.500 | I've been focusing on securing the conversation,
00:56:15.740 | so I haven't gone through a dark place yet
00:56:18.980 | 'cause I feel like I can't do the dark thing for too long.
00:56:21.840 | So I really have to put myself in the mind of Putin
00:56:25.220 | leading up to the conversation.
00:56:27.920 | But for now, my sense is he took power
00:56:32.820 | when Yeltsin gave it to him. One of the big sort of acts
00:56:36.540 | of the new Russia was that, for the first time in its history,
00:56:41.540 | a leader who could have continued being in power
00:56:45.960 | chose to give away power.
00:56:48.120 | That was the George Washington moment.
00:56:49.760 | - Right, we in the United States would look at that
00:56:51.220 | as an absolute positive, yeah, a sign of good things.
00:56:53.800 | - Yes, and so that was a huge act.
00:56:56.120 | And Putin said that that act was the thing
00:57:01.000 | that will define Russia for the 21st century,
00:57:04.640 | and that he will carry that flag forward.
00:57:07.260 | That's why, in rhetoric at least, after two terms,
00:57:12.160 | he gave away power.
00:57:13.200 | - To Medvedev, but he was a puppet, right?
00:57:15.320 | - Yeah, yes.
00:57:16.760 | But it was, but like still the story was being told.
00:57:20.760 | I think he believed it early on.
00:57:23.480 | I think he, I believe he still believes it,
00:57:28.300 | but I think he's deeply suspicious of the corruption
00:57:31.440 | that lurks in the shadows.
00:57:33.180 | And I do believe that like as somebody who thinks
00:57:36.040 | clickbait journalism is broken,
00:57:38.240 | journalists annoy the hell out of me.
00:57:40.200 | - Clickbait journalism's working perfectly.
00:57:42.400 | Journalism's broken.
00:57:43.440 | - Journalism.
00:57:44.280 | - Clickbait things work great.
00:57:45.120 | - Exactly.
00:57:46.760 | So I understand from Putin's perspective
00:57:49.840 | that journalism, journalists can be seen
00:57:52.880 | as the enemy of the state
00:57:53.980 | because people think journalists write these deep,
00:57:57.840 | beautiful philosophical pieces
00:57:59.700 | about criticizing the structure of government
00:58:02.320 | and the proper policy, you know,
00:58:04.840 | the steps that we need to take to make a greater nation.
00:58:07.880 | No, they unfairly take stuff out of context.
00:58:11.360 | They're critical in ways that are, like, shallow
00:58:15.360 | and not interesting.
00:58:16.640 | They call you a racist or sexist,
00:58:19.760 | or they make up stuff all the time.
00:58:22.560 | So I can put myself in the mindset of a person
00:58:25.440 | that thinks that it is okay to remove
00:58:28.480 | that kind of shallow, fake news voice from the system.
00:58:33.480 | The problem is of course, that is a slippery slope
00:58:36.940 | to then you remove all the annoying people from the system.
00:58:41.120 | And then you change what annoying means,
00:58:45.280 | where annoying starts becoming a thing
00:58:48.840 | that covers anyone who opposes the system.
00:58:48.840 | I mean, I get the slippery,
00:58:53.080 | it's obvious that it becomes a slippery slope,
00:58:55.360 | but I can also put myself in the mindset of the people
00:58:57.840 | that see it's okay to remove the liars from the system,
00:59:02.300 | as long as it's good for Russia.
00:59:04.320 | - And okay, so herein lies, and this again is
00:59:06.480 | the traditional American perspective,
00:59:08.480 | because we've had yellow, so-called yellow journalism
00:59:11.280 | since the founding of the Republic, that's nothing new.
00:59:14.240 | But the problem then comes into play
00:59:16.920 | when you remove journalists, even,
00:59:20.120 | it's a broad brush thing,
00:59:21.240 | 'cause but you remove both the crappy ones who are lying
00:59:24.760 | and the ones who are telling the truth too,
00:59:26.520 | you're left with simply
00:59:28.600 | the approved government journalists, right?
00:59:31.300 | The ones who are towing the government's line,
00:59:34.040 | in which case the truth as you see it
00:59:37.200 | is a different kind of fake news, right?
00:59:39.240 | It's the fake news from the government
00:59:41.400 | instead of the clickbait news, and oh yeah,
00:59:43.960 | maybe truth mixed into all that too in some of the outlets.
00:59:48.000 | The problem I always have with our system
00:59:49.720 | here in the United States right now
00:59:50.920 | is trying to tease the truth out from all the falsehoods.
00:59:55.560 | And look, I've got 30 years in journalism.
01:00:01.000 | My job, before the internet, used to be to go through
01:00:01.000 | all the newspapers and find the,
01:00:03.200 | I used to know all the journalists by name
01:00:05.120 | and I could pick out, you know, who they were
01:00:06.880 | and I have a hard time picking out the truth
01:00:11.040 | from the falsehood.
01:00:11.880 | So I think constantly,
01:00:13.400 | how are people who don't have all this background,
01:00:16.480 | who have lives or who are trained in other specialties,
01:00:19.120 | how do they do it?
01:00:20.520 | But if the government is the only approved outlet for truth,
01:00:24.560 | a traditional American
01:00:26.600 | and a lot of other traditional societies
01:00:28.240 | based on these ideas of the enlightenment
01:00:29.940 | that I talked about earlier,
01:00:31.220 | would see that as a disaster waiting to happen
01:00:33.440 | or a tyranny in progress.
01:00:35.120 | Does that make sense?
01:00:35.960 | - Oh, it totally makes sense.
01:00:37.480 | And I would agree with you, I still agree with you,
01:00:40.160 | but it is clear that something about the freedom of the press
01:00:45.160 | and freedom of speech today,
01:00:47.480 | like literally the last few years with the internet,
01:00:51.000 | is changing. And the argument, you know,
01:00:53.920 | you could say that the American system
01:00:56.000 | of freedom of speech is broken
01:00:58.480 | because here's the belief I grew up on
01:01:04.200 | and I still hold, but I'm starting to sort of
01:01:07.800 | try to see multiple views on it.
01:01:09.720 | My belief was that freedom of speech
01:01:13.480 | results in a stable trajectory towards truth always.
01:01:18.240 | So like truth will emerge.
01:01:19.960 | That was my sort of faith and belief that,
01:01:22.640 | that yeah, there's going to be lies all over the place,
01:01:24.880 | but there'll be like a stable thing that is true,
01:01:27.840 | that's carried forward to the public.
01:01:31.680 | Now it feels like it's possible to go towards a world
01:01:36.680 | where nothing is true, where truth is something
01:01:41.840 | that groups of people convince themselves of
01:01:45.880 | and there's multiple groups of people.
01:01:48.040 | And the idea of some universal truth,
01:01:51.200 | which I suppose is the better thing,
01:01:53.200 | is something that we can no longer exist under.
01:01:57.320 | Like some people believe that the Green Bay Packers
01:02:00.440 | are the best football team
01:02:02.600 | and some people can think the Patriots
01:02:04.840 | and they deeply believe it
01:02:07.960 | to where they call the other groups liars.
01:02:10.280 | Now that's fun for sports,
01:02:11.540 | that's fun for favorite flavors of ice cream,
01:02:14.680 | but they might believe that about science,
01:02:17.240 | about various aspects of politics,
01:02:22.240 | various aspects of sort of different policies
01:02:28.200 | within the function of our government.
01:02:30.840 | And like, that's not just like some weird thing
01:02:33.480 | we complain about, but that'll be the nature of things.
01:02:35.640 | Like truth is something we can no longer have.
01:02:38.760 | - Well, let's, and let me de-romanticize
01:02:41.360 | the American history of this too,
01:02:43.960 | because the American press was often just as biased,
01:02:48.960 | just as, I mean, I always looked to the 1970s
01:02:52.520 | as the high watermark of the American journalistic,
01:02:55.760 | you know, the post Watergate era,
01:02:57.880 | where it was actively going after the abuses of the
01:03:02.880 | government and all these things.
01:03:04.320 | But there was a famous speech, very quiet though,
01:03:06.560 | very quiet, given by Katharine Graham,
01:03:08.760 | who was the Washington Post publisher, I believe.
01:03:11.480 | And I actually, somebody sent it to me,
01:03:13.480 | we had to get it off of a journalism archive,
01:03:15.260 | like a JSTOR kind of thing.
01:03:16.980 | And she, at a luncheon,
01:03:19.900 | assured the government people there,
01:03:23.800 | don't worry, this is not gonna be something
01:03:25.760 | that we make a trend.
01:03:27.720 | Because the position of the government
01:03:31.760 | was still something that was carried,
01:03:34.560 | you know, the newspapers were the water carriers,
01:03:36.400 | and the newspapers were the big thing up until certainly
01:03:39.000 | the late '60s, early '70s.
01:03:40.480 | The newspapers were still the water carriers
01:03:42.760 | of the government, right?
01:03:44.080 | And they were the water carriers
01:03:45.680 | of the owners of the newspaper.
01:03:47.720 | So let's not pretend there was some angelic,
01:03:50.080 | wonderful time, and I'm saying this to myself,
01:03:51.960 | 'cause I was the one who brought it up,
01:03:53.660 | let's not pretend there was any super age
01:03:56.840 | of truthful journalism and all that.
01:03:58.560 | And I mean, you go to the revolutionary period
01:04:00.960 | in American history, and it looks every bit
01:04:03.200 | as bad as today, right?
01:04:04.960 | - That's a hopeful message, actually.
01:04:06.400 | So things may not be as bad as they look.
01:04:09.080 | - Well, let's look at it more like a stock market,
01:04:11.140 | and that you have fluctuations in the truthfulness
01:04:14.040 | or believability of the press.
01:04:16.040 | And there are periods where it was higher
01:04:18.480 | than other periods.
01:04:19.880 | The funny thing about the so-called clickbait era,
01:04:22.200 | and I do think it's terrible,
01:04:24.220 | but I mean, it resembles earlier eras to me.
01:04:27.820 | So I always compare it to when I was a kid growing up,
01:04:30.280 | when I thought journalism was as good as it's ever gotten.
01:04:33.820 | It was never perfect.
01:04:34.960 | But it's also something that you see very rarely
01:04:39.380 | in other governments around the world.
01:04:41.260 | And there's a reason that journalists are often killed
01:04:44.500 | regularly in a lot of countries,
01:04:46.540 | and it's because they report on things
01:04:48.260 | that the authorities do not want reported on.
01:04:50.100 | And I've always thought that that was what journalism
01:04:51.940 | should do, but it's gotta be truthful.
01:04:55.060 | Otherwise, it's just a different kind of propaganda, right?
01:04:58.180 | - Can we talk about Genghis Khan?
01:05:01.340 | Genghis Khan? - Sure, sure.
01:05:02.500 | - By the way, is it Genghis Khan or Jenghis Khan?
01:05:05.660 | - It's not Genghis Khan.
01:05:06.660 | It's either Jenghis Khan or Chinggis Khan.
01:05:09.140 | - So let's go with Jenghis Khan.
01:05:11.220 | - That's the only thing I'll be able to say
01:05:12.600 | with any certainty, the last certain thing I'll say about it.
01:05:15.620 | - It's like, I don't know, GIF versus JIF.
01:05:18.880 | I don't know if you know about those things.
01:05:20.860 | - Although I don't know how it ever got started
01:05:22.900 | the wrong way, but yeah.
01:05:24.860 | - So first of all, your episodes on Genghis Khan,
01:05:28.860 | for many people, are the favorite.
01:05:31.020 | It's fascinating to think about events that had so much,
01:05:34.780 | like in their ripples, had so much impact
01:05:37.100 | on so much of human civilization.
01:05:40.460 | In your view, was he an evil man?
01:05:45.160 | This goes to our discussion of evil.
01:05:48.380 | Another way to put it is I've read,
01:05:51.900 | he's much loved in many parts of the world, like Mongolia.
01:05:55.780 | And I've also read arguments that say
01:05:59.260 | that he was quite a progressive for the time.
01:06:02.180 | So where do you put him?
01:06:04.260 | Is he a progressive or is he an evil destroyer of humans?
01:06:08.500 | - As I often say, I'm not a historian,
01:06:10.460 | which is why what I try to bring
01:06:13.620 | to the Hardcore History podcasts are these sub-themes.
01:06:17.740 | So each show has one,
01:06:19.420 | and I try to kind of soft-pedal them.
01:06:21.060 | So they're not always like really right
01:06:22.700 | in front of your face.
01:06:23.860 | In that episode, the soft-pedaling sub-theme had to do
01:06:28.620 | with what we refer to as a historical arsonist.
01:06:32.380 | And it's because some historians have taken the position
01:06:36.200 | that sometimes, and most of this is earlier stuff.
01:06:38.620 | Historians don't do this very much anymore,
01:06:40.460 | but these were the wonderful questions I grew up with
01:06:43.180 | that blend, it's almost the intersection
01:06:45.660 | between history and philosophy.
01:06:48.220 | And the idea was that sometimes the world has become
01:06:52.460 | so overwhelmed with bureaucracy or corruption
01:06:56.820 | or just stagnation that somebody has to come in
01:07:00.820 | or some group of people or some force has to come in
01:07:03.980 | and do the equivalent of a forest fire
01:07:06.300 | to clear out all the dead wood
01:07:08.420 | so that the forest itself can be rejuvenated
01:07:11.080 | and society can then move forward.
01:07:13.180 | And there's a lot of these periods
01:07:14.420 | where the historians of the past will portray these figures
01:07:18.380 | who come in and do horrific things
01:07:20.980 | as creating an almost service for mankind, right?
01:07:25.440 | Creating the foundations for a new world
01:07:28.520 | that will be better than the old one.
01:07:29.900 | And it's a recurring theme.
01:07:31.020 | And so this was the sub-theme of the Khan's podcast,
01:07:34.460 | because otherwise you don't need me
01:07:35.420 | to tell you the story of the Mongols,
01:07:36.900 | but I'm gonna bring up the historical arsonist element.
01:07:40.020 | But this gets to how the Khan has been portrayed, right?
01:07:44.260 | If you wanna say, oh yes,
01:07:45.660 | he cleared out the dead wood and made way for new growth,
01:07:47.900 | well, then it's a positive thing.
01:07:49.420 | If you say my family was in the forest fire that he set,
01:07:52.900 | you're not gonna see it that way.
01:07:54.540 | Much of what Genghis Khan is credited with
01:07:58.300 | on the upside, right?
01:07:59.860 | So things like religious toleration,
01:08:02.820 | and you'll say, well, he was a religiously,
01:08:05.200 | the Mongols were religiously tolerant.
01:08:08.060 | And so this makes them almost
01:08:09.500 | like a liberal reformer kind of thing.
01:08:11.540 | But this needs to be seen within the context
01:08:14.340 | of their empire, which was very much
01:08:17.620 | like the Roman viewpoint, which is the Romans didn't care
01:08:20.020 | a lot of the time what your local people worshiped.
01:08:22.740 | They wanted stability.
01:08:24.300 | And if that kept stability and kept you paying taxes
01:08:26.900 | and didn't require the legionaries to come in,
01:08:29.500 | then they didn't care, right?
01:08:31.340 | And the Khans were the same way.
01:08:33.020 | Like they don't care what you're practicing
01:08:34.640 | as long as it doesn't disrupt their empire
01:08:36.460 | and cause them trouble.
01:08:38.020 | But what I always like to point out is yes,
01:08:39.860 | but the Khan could still come in with his representatives
01:08:42.380 | to your town, decide your daughter was a beautiful woman
01:08:45.280 | that they wanted as the Khan's concubine,
01:08:47.340 | and they would take her.
01:08:48.840 | So how liberal an empire is this, right?
01:08:52.500 | So many of the things that they get credit for
01:08:54.740 | as if they're some kind of nice guys,
01:08:56.900 | may, in another way of looking at it,
01:08:58.880 | just be a simple mechanism of control, right?
01:09:01.780 | A way to keep the empire stable.
01:09:04.820 | They're not doing it out of the goodness of their heart.
01:09:07.180 | They have decided that this is the best.
01:09:09.140 | And I love because the Mongols were what we would call
01:09:13.420 | a pagan people now.
01:09:15.260 | I love the fact that they,
01:09:17.100 | I think we call it, I forgot the term we used,
01:09:18.940 | had to do with like they were hedging their bets
01:09:21.260 | religiously, right?
01:09:22.420 | They didn't know which God was the right one.
01:09:24.460 | So as long as you're all praying for the health of the Khan,
01:09:27.420 | we're maximizing the chances that whoever the gods are,
01:09:30.460 | they get the message, right?
01:09:32.480 | So I think it's been portrayed as something
01:09:34.980 | like a liberal empire.
01:09:37.340 | The idea of Mongol universality
01:09:40.740 | is more about conquering the world.
01:09:43.700 | And it's like saying, you know,
01:09:44.640 | we're gonna bring stability to the world by conquering it.
01:09:46.700 | Well, what if that's Hitler, right?
01:09:48.740 | He could make the same case.
01:09:50.180 | Or maybe Hitler wasn't really a world conqueror like that,
01:09:52.260 | 'cause he wouldn't have been trying
01:09:54.060 | to make it equal for all peoples.
01:09:55.660 | But my point being that it kind of takes
01:09:58.340 | the positive moral slant out of it
01:10:01.540 | if their motivation wasn't a positive moral one,
01:10:05.140 | and the Mongols didn't see it that way.
01:10:09.740 | And I think the way that it's portrayed is like,
01:10:13.180 | and I always like to use this analogy,
01:10:15.220 | but it's like shooting an arrow
01:10:17.540 | and painting a bullseye around it afterwards, right?
01:10:20.340 | How do we justify and make them look good in a way
01:10:24.740 | that they themselves probably,
01:10:26.140 | and listen, we don't have the Mongol point of view per se.
01:10:29.820 | I mean, there's something called
01:10:30.660 | the secret history of the Mongols
01:10:32.020 | and there's things written down by Mongolian overlords
01:10:35.700 | through people like Persian and Chinese scribes later.
01:10:38.580 | We don't have their point of view,
01:10:41.100 | but it sure doesn't look like this was an attempt
01:10:44.020 | to create some wonderful place
01:10:45.820 | where everybody was living a better life
01:10:47.580 | than they were before.
01:10:48.580 | I think that's later people putting a nice rosy spin on it.
01:10:53.580 | - But there's an aspect to it, maybe you can correct me
01:10:57.860 | 'cause I'm projecting sort of my idea
01:10:59.740 | of what it would take to conquer so much land
01:11:08.940 | is that the ideology is emergent.
01:11:08.940 | So if I were to guess,
01:11:11.980 | the Mongols started out as exceptionally,
01:11:21.420 | as warriors who valued excellence in the skill of killing,
01:11:21.420 | not even killing, but like the actual practice of war.
01:11:27.300 | And it can start out small
01:11:28.660 | and it can grow and grow and grow.
01:11:30.340 | And then in order to maintain the stability
01:11:32.780 | of the things over which of the conquered lands,
01:11:36.780 | you developed a set of ideas with which you can,
01:11:40.300 | like you said, establish control, but it was emergent.
01:11:43.940 | And it seems like the core first principle idea
01:11:48.780 | of the Mongols is just to be excellent warriors.
01:11:52.820 | That felt to me like the starting point.
01:11:55.460 | It wasn't some ideology.
01:11:57.220 | Like with Hitler and Stalin, with Hitler,
01:12:00.860 | there was an ideology that didn't have anything to do
01:12:04.260 | with war underneath it.
01:12:06.980 | It was more about conquering.
01:12:08.260 | It feels like the Mongols started out more organically,
01:12:12.940 | I would say, like this phenomenon started emergently
01:12:16.260 | and they were just like similar to the Native Americans
01:12:19.260 | with the Comanches, like the different warrior tribes
01:12:21.940 | that Joe Rogan's currently obsessed with
01:12:24.900 | that led me to look into it more.
01:12:28.100 | They seem to just start out just valuing the skill
01:12:32.180 | of fighting, whatever the tools of war they had,
01:12:35.260 | which were pretty primitive,
01:12:36.460 | but just to be the best warriors they could possibly be,
01:12:39.140 | make a science out of it.
01:12:40.540 | Is that crazy to think that there was no ideology
01:12:44.060 | behind it in the beginning?
01:12:45.380 | - I'm gonna back up a second.
01:12:47.260 | I'm reminded of the line said about the Romans
01:12:49.700 | that they create a wasteland and call it peace.
01:12:52.900 | - That is-- - Wow.
01:12:54.500 | - But there's a lot of conquerors like that, right?
01:12:56.980 | Where you will sit there, and listen,
01:12:59.540 | historians forever have, it's the famous trade-offs
01:13:03.180 | of empire, and they'll say, well, look at the trade
01:13:05.900 | that they facilitated, and look at the religion,
01:13:08.460 | all those kinds of things, but they come at the cost
01:13:11.660 | of all those peoples that they conquered forcibly
01:13:14.700 | and by force integrated into their empire.
01:13:18.580 | The one thing we need to remember about the Mongols
01:13:20.700 | that makes them different than, say, the Romans,
01:13:22.780 | and this is complex stuff and way above my pay grade,
01:13:25.980 | but I'm fascinated with it, and it's more like
01:13:28.300 | the Comanches that you just brought up,
01:13:30.220 | is that the Mongols are not a settled society.
01:13:33.300 | They come from a nomadic tradition.
01:13:36.820 | Now, several generations later, when you have Kublai Khan
01:13:41.820 | as the emperor of China, it's beginning
01:13:45.140 | to be a different thing, right?
01:13:46.540 | And the Mongols, when their empire broke up,
01:13:48.780 | the ones that were in the so-called settled societies,
01:13:52.420 | right, Iran, places like that, they will become, over time,
01:13:56.020 | more like what the rulers of those places were traditionally,
01:13:59.260 | and the Mongols in, say, like the Khaganate
01:14:02.620 | of the Golden Horde, which is still
01:14:04.340 | in their traditional nomadic territories,
01:14:06.740 | will remain traditionally more Mongol,
01:14:09.580 | but when you start talking about who the Mongols were,
01:14:12.520 | I try to make a distinction.
01:14:15.380 | They're not some really super special people.
01:14:19.500 | They're just the latest confederacy in an area
01:14:23.780 | that saw nomadic confederacies going back
01:14:27.700 | to the beginning of recorded history,
01:14:30.140 | the Scythians, the Sarmatians, the Avars,
01:14:33.620 | the Huns, the Magyars, I mean, these are all the nomadic,
01:14:37.100 | you know, the nomads of the Eurasian steppe
01:14:39.620 | were huge, huge players in the history of the world
01:14:42.860 | until gunpowder nullified their traditional weapon system,
01:14:49.140 | which I've been fascinated with
01:14:50.420 | because their traditional weapon system
01:14:52.540 | is not one you could copy,
01:14:54.260 | because you were talking about being the greatest warriors
01:14:56.340 | you could be.
01:14:57.780 | Every warrior society I've ever seen values that.
01:15:02.020 | What the nomads had of the Eurasian steppe
01:15:05.140 | was this relationship between human beings and animals
01:15:10.140 | that changed the equation.
01:15:12.700 | It was how they rode horses,
01:15:15.660 | and societies like the Byzantines,
01:15:18.060 | which would form one flank of the steppe,
01:15:20.300 | and then all the way on the other side,
01:15:21.700 | you had China, and below that, you had Persia.
01:15:24.900 | These societies would all attempt to create mounted horsemen
01:15:29.300 | who used archery, and they did a good job,
01:15:32.140 | but they were never the equals of the nomads
01:15:34.940 | because those people were literally raised in the saddle.
01:15:38.720 | They compared them to centaurs.
01:15:41.500 | The Comanches, great example,
01:15:42.860 | considered to be the best horse-riding warriors
01:15:47.100 | in North America.
01:15:48.620 | The Comanches, I always loved watching,
01:15:50.980 | there's paintings, George Catlin,
01:15:52.660 | the famous painter who painted the Comanches,
01:15:57.060 | illustrated it, but the Mongols, and the Scythians,
01:15:59.980 | and the Avars,
01:16:01.220 | and all these people did it too,
01:16:02.840 | where they would shoot from underneath the horse's neck,
01:16:06.700 | hiding behind the horse the whole way.
01:16:09.640 | You look at a picture of somebody doing that,
01:16:12.100 | and it's insane.
01:16:13.440 | This is what the Byzantines couldn't do,
01:16:15.840 | and the Chinese couldn't do,
01:16:17.020 | and it was a different level
01:16:19.140 | of harnessing a human-animal relationship
01:16:23.740 | that gave them a military advantage
01:16:26.260 | that could not be copied.
01:16:28.600 | It could be emulated, but they were never as good.
01:16:31.580 | That's why they always hired these people.
01:16:33.500 | They hired mercenaries from these areas
01:16:35.460 | because they were incomparable.
01:16:38.380 | It's the combination of people
01:16:39.600 | who were shooting bows and arrows
01:16:41.500 | from the time they were toddlers,
01:16:43.220 | who were riding from the time they were toddlers,
01:16:45.020 | who rode all the time.
01:16:46.580 | I mean, the Huns were bow-legged, the Romans said,
01:16:49.540 | because they were never off their horses.
01:16:50.820 | They ate, slept, everything in the saddle.
01:16:54.340 | That creates something that is difficult to copy,
01:16:57.500 | and it gave them a military advantage.
01:17:00.460 | I enjoy reading, actually,
01:17:01.860 | about when that military advantage ended,
01:17:04.820 | so 17th and 18th century,
01:17:07.260 | when the Chinese on one flank
01:17:08.980 | and the Russians on the other
01:17:10.220 | are beginning to use firearms and stuff
01:17:12.500 | to break this military power of these various Khans.
01:17:17.500 | The Mongols were simply the most dominating
01:17:20.500 | and most successful of the confederacies,
01:17:23.060 | but if you break it down,
01:17:24.540 | they really formed the nucleus
01:17:27.180 | at the top of the pyramid,
01:17:28.660 | of the apex of the food chain,
01:17:30.460 | and a lot of the people that were known as Mongols
01:17:33.300 | were really lots of other tribes, non-Mongolian tribes,
01:17:37.140 | that when the Mongols conquer you,
01:17:38.740 | after they killed a lot of you,
01:17:40.340 | they incorporated you into their confederacy
01:17:43.420 | and often made you go first.
01:17:44.980 | You know, you're gonna fight somebody,
01:17:46.260 | we're gonna make these people go out in front
01:17:48.100 | and suck up all the arrows
01:17:49.580 | before we go in and finish the job.
01:17:51.540 | So to me, and I guess a fan of the Mongols would say
01:17:56.020 | that the difference,
01:17:57.780 | and what made the Mongols different
01:17:59.100 | wasn't the weapon system or the fighting
01:18:01.220 | or the warriors or the armor or anything,
01:18:03.340 | it was Genghis Khan.
01:18:04.980 | And if you go look at the other really dangerous,
01:18:07.520 | from the outside world's perspective,
01:18:08.980 | dangerous steppe nomadic confederacies from past history
01:18:12.640 | was always when some great leader emerged
01:18:16.180 | that could unite the tribes.
01:18:17.400 | And you see the same thing
01:18:18.300 | in Native American history to a degree too.
01:18:21.140 | You had people like Attila, right?
01:18:24.260 | Or there was one called Tuman.
01:18:26.300 | You go back in history,
01:18:27.460 | and these people make the history books
01:18:29.180 | because they caused an enormous amount of trouble
01:18:31.900 | for their settled neighbors that normally,
01:18:34.180 | I mean, Chinese, Byzantine, and Persian approaches
01:18:37.460 | to the steppe people were always the same.
01:18:39.660 | They would pick out tribes to be friendly with.
01:18:42.020 | They would give them money, gifts, hire them,
01:18:43.940 | and they would use them against the other tribes.
01:18:46.460 | And generally Byzantine,
01:18:48.060 | especially in Chinese diplomatic history
01:18:50.880 | was all about keeping these tribes separated.
01:18:54.020 | Don't let them form confederations of large numbers of them
01:18:57.480 | because then they're unstoppable.
01:18:59.300 | Attila was a perfect example.
01:19:01.060 | The Huns were another large,
01:19:02.820 | the Turks, another large confederacy of these people.
01:19:05.540 | And they were devastating when they could unite.
01:19:08.260 | So the diplomatic policy was don't let them.
01:19:10.640 | That's what made the Mongols different
01:19:12.060 | is Genghis Khan united them.
01:19:14.040 | And then unlike most of the tribal confederacies,
01:19:16.140 | he was able, they were able to hold it together
01:19:18.060 | for a few generations.
01:19:19.920 | - To linger on the little thread
01:19:23.460 | that you started pulling, on this man, Genghis Khan,
01:19:27.700 | who was a leader. - Temujin, yeah.
01:19:29.980 | - What do you think makes a great leader?
01:19:32.340 | Maybe if you have other examples throughout history.
01:19:35.120 | And great, again, let's use that term loosely.
01:19:40.980 | - Now I was gonna ask for a definition.
01:19:42.940 | - Great uniter, whether it's evil or good,
01:19:46.980 | it doesn't matter.
01:19:48.060 | Is there somebody who stands out to you,
01:19:52.020 | Alexander the Great,
01:19:53.020 | so we're talking about military or ideologies.
01:19:57.060 | Some people bring up FDR,
01:19:59.580 | or I mean, it could be the founding fathers of this country,
01:20:03.740 | or we can go to the men of the century up there,
01:20:08.740 | Hitler of the 20th century, and Stalin.
01:20:14.660 | And these people had really amassed the amount of power
01:20:19.340 | that probably has never been seen
01:20:20.900 | in the history of the world.
01:20:22.460 | Is there somebody who stands out to you
01:20:24.740 | by way of trying to define what makes a great uniter,
01:20:28.740 | great leader in one man or woman, maybe in the future?
01:20:33.100 | - It's an interesting question,
01:20:35.340 | and one I've thought a lot about,
01:20:36.700 | because let's take Alexander the Great as an example,
01:20:39.100 | because Alexander fascinated the world of his time,
01:20:41.740 | fascinated ever since.
01:20:43.180 | People have been fascinated with the guy.
01:20:45.220 | But Alexander was a hereditary monarch, right?
01:20:49.660 | He was handed the kingdom.
01:20:52.220 | - Which is fascinating.
01:20:53.180 | - But he did not need to rise from nothing to get that job.
01:20:57.780 | In fact, he reminds me of a lot of other leaders,
01:21:00.340 | Frederick the Great, for example, in Prussia.
01:21:03.460 | These are people who inherited
01:21:06.100 | the greatest army of their day.
01:21:08.660 | Alexander, unless he was an imbecile,
01:21:12.000 | was going to be great no matter what,
01:21:14.780 | because if you inherit the Wehrmacht,
01:21:17.340 | you're gonna be able to do something with it, right?
01:21:19.980 | Alexander's father may have been greater.
01:21:22.500 | Philip, Philip II was the guy who literally did create
01:21:27.500 | a strong kingdom from a disjointed group of people
01:21:32.660 | that were continually beset by their neighbors.
01:21:34.940 | He's the one that reformed that army,
01:21:37.700 | took things that he had learned from other Greek leaders,
01:21:41.000 | like the Theban leader at Pamananda's,
01:21:42.900 | and then laboriously over his lifetime,
01:21:48.400 | stabilized the frontiers, built this system.
01:21:51.020 | He lost an eye doing it.
01:21:52.440 | His leg was made lame.
01:21:55.660 | This was a man who looked like he built the empire
01:21:58.420 | and led from the front ranks,
01:22:00.620 | and then, who may have been killed by his son,
01:22:04.700 | we don't know who assassinated Philip,
01:22:06.780 | but then handed the greatest army
01:22:08.480 | the world had ever seen to his son,
01:22:10.420 | who then did great things with it.
01:22:12.000 | You see this pattern many times.
01:22:13.900 | So in my mind, I'm not sure Alexander
01:22:17.620 | really can be that great when you compare him
01:22:20.340 | to people who arose from nothing.
01:22:22.300 | So the difference between what we would call
01:22:24.200 | in the United States the self-made man
01:22:27.060 | or the one who inherits a fortune.
01:22:29.380 | There's an old line that, you know, it's a slur,
01:22:31.860 | but it's about rich people,
01:22:33.500 | and it's like he was born on third base
01:22:37.100 | and thought he hit a triple, right?
01:22:39.500 | Philip was born at home plate, and he had to hit;
01:22:42.900 | Alexander started on third base.
01:22:45.220 | And so I try to draw a distinction between them.
01:22:48.140 | Genghis Khan is tough because there's two traditions.
01:22:51.740 | The tradition that we grew up with here
01:22:53.600 | in the United States and that I grew up learning
01:22:55.380 | was that he was a self-made man.
01:22:57.860 | But there is a tradition, and it may be one of those things
01:23:00.620 | that's put after the fact, because a long time ago,
01:23:04.020 | whether or not you had blue blood in your veins
01:23:06.700 | was an important distinction.
01:23:08.860 | And so the distinction that you'll often hear
01:23:10.660 | from Mongolian history is that this was a nobleman
01:23:14.620 | who had been deprived of his inheritance,
01:23:16.700 | so he was a blue blood anyway.
01:23:18.740 | I don't know which is true.
01:23:20.700 | There's certainly, I mean, when you look at a Genghis Khan,
01:23:22.980 | though, you have to go,
01:23:24.580 | that is a wicked amount of things to have achieved.
01:23:28.380 | He's very impressive as a figure.
01:23:30.220 | Attila's very impressive as a figure.
01:23:32.240 | Hitler's an interesting figure.
01:23:35.180 | He's one of those people,
01:23:36.860 | you know, the more you study about Hitler,
01:23:38.660 | the more you wonder where the defining moment was,
01:23:43.260 | because if you look at his life,
01:23:46.080 | I mean, Hitler was a relatively common soldier
01:23:50.300 | in the First World War.
01:23:51.260 | I mean, he was brave, he got some decorations.
01:23:54.300 | In fact, the highest decoration he got
01:23:56.060 | in the First World War was given to him by a Jewish officer.
01:23:59.660 | And it was, he often didn't talk about that decoration,
01:24:03.060 | even though it was the more prestigious one,
01:24:04.780 | because it would open up a whole can of worms
01:24:06.320 | he didn't want to get into.
01:24:07.740 | But Hitler's, I mean, if you said, who was Hitler today,
01:24:11.240 | one of the top things you're gonna say
01:24:12.660 | is he was an anti-Semite.
01:24:14.700 | Well, then you have to draw a distinction
01:24:16.060 | between general, regular anti-Semitism
01:24:19.820 | that was pretty common in the era
01:24:21.620 | and something that was a rabid level of anti-Semitism.
01:24:24.440 | But Hitler didn't seem to show a rabid level
01:24:27.380 | of anti-Semitism until after
01:24:29.500 | or at the very end of the First World War.
01:24:31.560 | So if this is a defining part of this person's character,
01:24:34.860 | and much of what we consider to be his evil
01:24:38.340 | stems from that, what happened to this guy
01:24:41.440 | when he's an adult, right?
01:24:43.220 | He's already fought in the war.
01:24:44.860 | To change him so, I mean, it's almost like the old,
01:24:47.220 | there was always a movie theme.
01:24:48.220 | Somebody gets hit by something on the head
01:24:50.740 | and their whole personality changes, right?
01:24:52.820 | I mean, it almost seems something like that.
01:24:55.340 | So I don't think I'd call that necessarily a great leader.
01:24:58.820 | To me, the interesting thing about Hitler
01:25:00.180 | is what the hell happened to a nondescript person
01:25:03.580 | who didn't really impress anybody with his skills?
01:25:06.940 | And then in the 1920s, he's all of a sudden,
01:25:10.460 | as you said, sort of the man of the hour, right?
01:25:12.780 | So that to me is kind of, I have this feeling
01:25:15.400 | that Genghis Khan, and we don't really know,
01:25:17.980 | was an impressive human being from the get-go.
01:25:20.540 | And then he was raised in this environment
01:25:22.060 | with pressure on all sides.
01:25:23.460 | So you start with this diamond and then you polish it
01:25:25.980 | and you harden it his whole life.
01:25:27.700 | Hitler seems to be a very unimpressive gemstone
01:25:31.260 | most of his life, and then all of a sudden.
01:25:33.340 | So I mean, I don't think I can label great leaders.
01:25:36.820 | And I'm always fascinated by that idea that,
01:25:39.380 | and I'm trying to remember who the quote was by,
01:25:41.180 | that great men, oh, Lord Acton.
01:25:43.420 | So great men are often not good men.
01:25:47.240 | And that in order to be great,
01:25:49.120 | you would have to jettison many of the moral qualities
01:25:51.780 | that we normally would consider a Jesus or a Gandhi,
01:25:55.180 | or, you know, these qualities that one looks at
01:25:58.620 | as the good upstanding moral qualities
01:26:01.160 | that we should all aspire to as examples, right?
01:26:03.460 | The Buddha, whatever it might be.
01:26:05.620 | Those people wouldn't make good leaders
01:26:07.240 | because what you need to be a good leader
01:26:08.620 | often requires the kind of choices
01:26:10.480 | that a true philosophical Diogenes moral man wouldn't make.
01:26:15.840 | So I don't have an answer to your question.
01:26:17.380 | How about that?
01:26:18.220 | That's a very long way of saying, I don't know.
01:26:20.160 | - Just to linger a little bit,
01:26:22.280 | it does feel like from my study of Hitler
01:26:25.000 | that the time molded the man versus Genghis Khan
01:26:28.880 | where it feels like he, the man molded his time.
01:26:33.640 | - Yes, and I feel that way about a lot of those
01:26:35.300 | nomadic confederacy builders,
01:26:37.880 | that they really seem to be these figures
01:26:41.160 | that stand out as extraordinary in one way or another.
01:26:45.800 | Remembering, by the way, that almost all the histories of them
01:26:48.200 | were written by the enemies that they so mistreated
01:26:50.600 | that they were probably never gonna get any good press.
01:26:52.760 | They didn't write themselves.
01:26:54.000 | - That's a caveat we should always add to basically--
01:26:56.080 | - Nomadic or Native American peoples
01:26:58.080 | or tribal peoples anywhere generally do not get
01:27:00.840 | the advantage of being able to write
01:27:02.580 | the history of their heroes.
01:27:04.760 | - Okay, I'm almost done with
01:27:09.120 | The Rise and Fall of the Third Reich.
01:27:11.400 | It's one of the historical descriptions
01:27:16.400 | of Hitler's rise to power, the Nazis' rise to power.
01:27:20.660 | There's a few philosophical things I'd like to
01:27:24.440 | ask you to see if you can help.
01:27:27.960 | Like one of the things I think about is
01:27:33.280 | how does one be a hero in 1930s Nazi Germany?
01:27:38.040 | What does it mean to be a hero?
01:27:41.480 | What do heroic actions look like?
01:27:45.000 | I think about that because I think about
01:27:50.000 | how I move about in this world today.
01:27:52.820 | That we live in really chaotic and tense times
01:28:00.560 | where I don't think you wanna draw any parallels
01:28:04.400 | between Nazi Germany and modern day in any of the nations
01:28:08.160 | we can think about, but it's not out of the realm
01:28:11.320 | of possibility that authoritarian governments take hold,
01:28:16.320 | authoritarian companies take hold.
01:28:21.040 | And I'd like to think that I could,
01:28:25.000 | in my little small way, inspire others
01:28:27.700 | to take heroic action before things get bad.
01:28:33.640 | And I kind of try to place myself
01:28:36.640 | in what would 1930s Germany look like?
01:28:40.000 | Is it possible to stop a Hitler?
01:28:43.600 | Is it even the right way to think about it?
01:28:47.800 | And how does one be a hero in it?
01:28:51.520 | I mean, you often talk about that living through a moment
01:28:54.400 | in history is very different than looking at that history,
01:28:57.520 | looking when you look back.
01:29:00.080 | I also think about, would it be possible
01:29:03.960 | to understand what's happening
01:29:07.000 | that the bells of war are ringing?
01:29:12.000 | It seems that most people didn't seem to understand
01:29:15.520 | late into the 30s that war is coming.
01:29:19.800 | That's fascinating.
01:29:22.320 | On the United States side, inside Germany,
01:29:25.760 | like the opposing figures, the German military
01:29:29.080 | didn't seem to understand this, maybe.
01:29:32.120 | Of the other countries, certainly France and England
01:29:35.760 | didn't seem to understand this.
01:29:37.520 | So I kind of tried to put myself into 1930s Germany,
01:29:41.320 | and I'm Jewish, which is another little twist on the whole thing.
01:29:45.320 | Like, what would I do?
01:29:48.080 | What should one do?
01:29:49.360 | Do you have interesting answers?
01:29:54.400 | - So earlier we had talked about Putin
01:29:56.120 | and we had talked about patriotism
01:29:57.880 | and love of country and those sorts of things.
01:30:00.640 | In order to be a hero in Nazi Germany,
01:30:05.640 | by our views here, you would have had to have been
01:30:10.680 | anti-patriotic to the average German's viewpoint
01:30:15.680 | in the 1930s, right?
01:30:16.880 | You would have to have opposed your own government
01:30:19.960 | and your own country.
01:30:21.200 | And that's a very, it would be a very weird thing
01:30:24.120 | to go to people in Germany and say,
01:30:25.960 | "Listen, the only way you're gonna be seen
01:30:27.920 | "as a good German and a hero to the country
01:30:31.920 | "that will be your enemies is we think
01:30:34.920 | "you should oppose your own government."
01:30:36.640 | It's a strange position to put the people of a country in,
01:30:41.160 | saying, "You need to be against your leader.
01:30:43.080 | "You need to oppose your government's policies.
01:30:45.400 | "You need to oppose your government.
01:30:46.920 | "You need to hope and work for its downfall."
01:30:49.560 | That doesn't sound patriotic.
01:30:51.120 | It wouldn't sound patriotic here in this country
01:30:53.280 | if you made a similar argument.
01:30:55.880 | I will go away from the 1930s and go to the 1940s
01:31:00.480 | to answer your question.
01:31:01.520 | So there were movements like the White Rose Movement
01:31:05.320 | in Germany, which involved young people, really,
01:31:09.000 | from various backgrounds, religious backgrounds often,
01:31:13.200 | who worked openly against the Nazi government
01:31:17.320 | at a time when power was already consolidated,
01:31:19.880 | the Gestapo was in full force, and they executed people
01:31:23.280 | who were against the government.
01:31:24.640 | And these young people would go out
01:31:26.400 | and distribute pamphlets, and many of them
01:31:29.240 | got their heads cut off with guillotines for their trouble.
01:31:32.680 | And they knew that that was gonna be the penalty.
01:31:35.960 | That is a remarkable amount of bravery and sacrifice
01:31:40.960 | and willingness to die, and almost not even willingness,
01:31:44.720 | because they were so open about it,
01:31:46.040 | it's almost a certainty, right?
01:31:48.120 | That's incredibly moving to me.
01:31:51.600 | So when we talk, and we had talked earlier
01:31:53.160 | about sort of the human spirit and all that kind of thing,
01:31:56.120 | there are people in the German military
01:32:01.160 | who opposed and worked against Hitler, for example.
01:32:04.480 | But to me, that's almost cowardly compared
01:32:08.120 | to what these young people did in the White Rose Movement,
01:32:10.600 | because those people in the Wehrmacht, for example,
01:32:13.960 | who were secretly trying to undermine Hitler,
01:32:16.320 | they're not really putting their lives on the line
01:32:18.920 | to the same degree.
01:32:21.320 | And so I think when I look at heroes,
01:32:23.800 | and listen, I remember once saying
01:32:25.400 | there were no conscientious objectors in Germany
01:32:28.760 | as a way to point out to people
01:32:30.720 | that you didn't have a choice,
01:32:31.640 | you know, you were gonna serve in the military,
01:32:32.880 | and I got letters from Jehovah's Witnesses
01:32:35.040 | who said, "Yes, there were,
01:32:36.960 | and we got sent to the concentration camps."
01:32:39.440 | Those are remarkably brave things.
01:32:42.120 | It's one thing to have your own set of standards and values.
01:32:47.120 | It's another thing to say, "Oh no,
01:32:50.440 | I'm going to display them in a way
01:32:52.600 | that, with this regime, is a death sentence."
01:32:54.600 | And not just for me, for my family, right?
01:32:57.600 | In these regimes, there was not a lot of distinction made
01:33:00.480 | between father and son and wives.
01:33:02.500 | That's a remarkable sacrifice to make,
01:33:05.560 | and far beyond what I think I would even be capable of.
01:33:08.680 | And so the admiration comes from seeing people
01:33:12.760 | who appear to be more morally profound than you are yourself.
01:33:18.080 | So when I look at this, I look at that kind of thing,
01:33:20.520 | and I just say, "Wow."
01:33:22.640 | And the funny thing is,
01:33:23.740 | if you'd have gone to most average Germans on the street
01:33:27.420 | in 1942 and said, "What do you think of these people?"
01:33:31.720 | They're gonna think of them as traitors
01:33:33.360 | who probably got what they deserved.
01:33:35.840 | So that's the eye of the beholder thing.
01:33:37.680 | It's the power of the state
01:33:39.680 | to so propagandize values and morality
01:33:44.200 | in a way that favors the state,
01:33:46.760 | that you can turn people who today we look at
01:33:49.240 | as unbelievably brave and moral
01:33:51.880 | and crusading for righteousness
01:33:54.320 | and turn them into enemies of the people.
01:33:56.820 | So, I mean, in my mind, it would be people like that.
01:34:00.900 | - See, I think, so hero is a funny word,
01:34:05.560 | and we romanticize the notion,
01:34:07.800 | but if I could drag you back to 1930s Germany from the 1940s.
01:34:11.760 | - Sure.
01:34:13.640 | I feel like the heroic action that doesn't accomplish much
01:34:18.640 | is not what I'm referring to.
01:34:21.780 | So there are many heroes I look up to,
01:34:26.780 | like David Goggins, for example,
01:34:29.240 | the guy who runs crazy distances.
01:34:31.400 | He runs for no purpose except for the suffering in itself.
01:34:35.320 | And I think his willingness to challenge
01:34:38.480 | the limits of his mind is heroic.
01:34:42.140 | I guess I'm looking for a different term,
01:34:44.520 | which is how could Hitler have been stopped?
01:34:47.880 | My sense is that he could have been stopped
01:34:52.120 | in the battle of ideas.
01:34:54.000 | Millions of people were suffering economically,
01:34:59.720 | were suffering because of the betrayal of World War I
01:35:02.920 | in terms of the love of country
01:35:04.760 | and how they felt they were being treated,
01:35:07.460 | and a charismatic leader that inspired love
01:35:11.960 | and unity that's not destructive could have emerged,
01:35:15.580 | and that's where the battle should have been fought.
01:35:18.360 | - I would suggest that we need to take into account
01:35:22.780 | the context of the times that led to Hitler's rise of power
01:35:26.660 | and created the conditions where his message resonated.
01:35:31.280 | That is not a message that resonates at all times, right?
01:35:34.780 | It is impossible to understand the rise of Hitler
01:35:40.180 | without dealing with the First World War
01:35:42.060 | and the aftermath of the First World War
01:35:44.060 | and the inflationary, terrible depression in Germany
01:35:46.580 | and all these things,
01:35:47.420 | and the dissatisfaction with the Weimar Republic's government,
01:35:52.140 | which was often seen as, and in fact was,
01:35:55.460 | put into place by the victorious powers.
01:35:59.300 | Hitler referred to the people that signed those agreements,
01:36:02.380 | that signed the armistice as the November criminals.
01:36:06.300 | And he used that as a phrase
01:36:08.300 | which resonated with the population.
01:36:10.100 | This was a population that was embittered.
01:36:12.820 | And even if they weren't embittered,
01:36:14.340 | the times were so terrible,
01:36:15.960 | and the options for operating within the system
01:36:19.620 | in a non-radical way seemed totally discredited, right?
01:36:24.500 | You could work through the Weimar Republic,
01:36:25.900 | but they tried and it wasn't working anyway.
01:36:27.660 | And then the alternative to the Nazis
01:36:29.900 | who were bully boys in the street were communist agitators
01:36:33.500 | that to the average conservative German seemed no better.
01:36:36.820 | So you have three options
01:36:38.060 | if you're an average German person.
01:36:39.780 | You can go with the discredited government
01:36:42.380 | put in power by your enemies that wasn't working anyway.
01:36:46.800 | You could go with the Nazis
01:36:48.140 | who seemed like a bunch of super patriots
01:36:49.980 | calling for the restoration of German authority,
01:36:54.140 | or you could go with the communists.
01:36:55.820 | And the entire thing seemed like
01:36:57.820 | a litany of poor options, right?
01:37:00.300 | And in this realm,
01:37:01.460 | Hitler was able to triangulate, if you will.
01:37:07.020 | He came off as a person
01:37:08.860 | who was going to restore German greatness
01:37:11.100 | at a time when this was a powerful message.
01:37:13.660 | But if you don't need German greatness restored,
01:37:16.580 | it doesn't resonate, right?
01:37:18.860 | So the reason that your love idea and all this stuff
01:37:23.580 | I don't think would have worked in the time period
01:37:25.600 | is because that was not a commodity
01:37:27.700 | that the average German was in search of then.
01:37:30.980 | - Well, it's interesting to think about
01:37:33.740 | whether greatness can be restored
01:37:36.220 | through mechanisms, through ideas that are not,
01:37:40.620 | from our perspective today, so evil.
01:37:46.740 | I don't know what the right term is.
01:37:48.700 | - But the war continued in a way.
01:37:50.500 | So remember that when Germany,
01:37:52.460 | when Hitler is rising to power,
01:37:54.860 | the French are in control of parts of Germany, right?
01:37:58.460 | The Ruhr, one of the main industrial heartlands of Germany
01:38:01.660 | was occupied by the French.
01:38:02.940 | So there's never this point
01:38:04.500 | where you're allowed to let the hate dissipate, right?
01:38:07.940 | Every time maybe things were calming down,
01:38:10.420 | something else would happen to stick the knife in
01:38:13.100 | and twist it a little bit more,
01:38:14.280 | from the average German's perspective, right?
01:38:16.980 | The reparations, right?
01:38:18.380 | So if you say, okay, well, we're gonna get back on our feet.
01:38:20.620 | The reparations were crushing.
01:38:22.580 | These things prevented the idea of love or brotherhood
01:38:27.220 | and all these things from taking hold.
01:38:29.300 | And even if there were Germans who felt that way,
01:38:31.500 | and there most certainly were,
01:38:33.660 | it is hard to overcome the power of everyone else.
01:38:38.660 | You know, what I always say
01:38:39.620 | when people talk to me about humanity
01:38:41.940 | is I believe on an individual level,
01:38:44.540 | we're capable of everything and anything,
01:38:46.940 | good, bad, or indifferent,
01:38:48.580 | but collectively it's different, right?
01:38:51.580 | And in the time period that we're talking about here,
01:38:55.340 | messages of peace on earth and love your enemies
01:38:58.060 | and all these sorts of things
01:39:00.580 | were absolutely deluged and overwhelmed
01:39:03.380 | and drowned out by the bitterness, the hatred.
01:39:06.420 | And let's be honest,
01:39:07.340 | the sense that you were continually being abused
01:39:10.220 | by your former enemies.
01:39:11.820 | There were a lot of people in the Allied side
01:39:14.260 | that realized this and said,
01:39:15.860 | we're setting up the next war.
01:39:17.620 | This is, I mean, they understood
01:39:18.980 | that you can only do certain things
01:39:20.860 | to collective human populations
01:39:23.060 | for a certain period of time
01:39:24.340 | before it is natural for them to want to strike back.
01:39:26.700 | And you can see German posters from the region,
01:39:29.060 | Nazi propaganda posters
01:39:31.060 | that show them breaking off the chains of their enemies.
01:39:33.660 | And I mean, Germany awake, right?
01:39:35.180 | That was the great slogan.
01:39:37.460 | So I think love is always a difficult option.
01:39:42.460 | And in the context of those times,
01:39:44.980 | it was even more disempowered than normal.
01:39:48.860 | - Well, this goes to the,
01:39:51.700 | just to linger on it for a little longer,
01:39:54.820 | the question of the inevitability of history.
01:39:59.660 | Do you think Hitler could have been stopped?
01:40:02.500 | Do you think this kind of force
01:40:04.260 | that you're saying that there was a pain
01:40:06.420 | and it was building,
01:40:07.820 | there's a hatred that was building,
01:40:10.420 | do you think there was a way to avert?
01:40:13.120 | I mean, there's two questions.
01:40:16.980 | It could have been a lot worse,
01:40:18.660 | and it could have been better,
01:40:22.260 | in the trajectory of history in the '30s and '40s.
01:40:25.340 | - The most logical, see, this brings a wonderful bow tie
01:40:28.060 | into the discussion, back to where we started this conversation,
01:40:30.340 | and buttons it up nicely.
01:40:32.380 | We had talked about force and counterforce earlier.
01:40:35.680 | The most obvious and much discussed way
01:40:38.580 | that Hitler could have been stopped
01:40:39.700 | has nothing to do with Germans.
01:40:42.180 | When he re-militarized the Rhineland,
01:40:45.420 | everyone talks about what a couple of French divisions
01:40:49.780 | would have done had they simply gone in and contested it.
01:40:52.380 | And this was something Hitler was extremely,
01:40:54.300 | I mean, it might've been the most nervous time
01:40:56.140 | in his entire career because he was afraid
01:40:58.740 | that they would have responded with force.
01:41:01.020 | And he was in no position to do anything about it
01:41:03.460 | if they did.
01:41:04.660 | So this is where you get the people who say,
01:41:07.280 | you know, and Churchill's one of these people too,
01:41:10.260 | where they talk about that, you know,
01:41:12.500 | he should have been stopped militarily
01:41:14.280 | right at the very beginning when he was weak.
01:41:16.740 | I don't think,
01:41:17.940 | listen, there were candidates in the Catholic Center Party
01:41:23.500 | and others in the Weimar Republic
01:41:25.540 | that maybe could have done things.
01:41:26.780 | And it's beyond my understanding of specific German history
01:41:30.700 | to talk about it intelligently.
01:41:32.700 | But I do think that had the French responded militarily
01:41:35.860 | to Hitler's initial moves into that area,
01:41:38.780 | that he would have been thwarted.
01:41:40.220 | And I think he himself believed,
01:41:42.160 | if I'm remembering my reading,
01:41:44.580 | that this would have led to his downfall.
01:41:46.820 | So the potential, see, what I don't like about this
01:41:49.720 | is that it almost legitimizes military intervention
01:41:52.740 | at a very early stage to prevent worse things
01:41:55.260 | from happening.
01:41:56.340 | But it might be a pretty clear cut case.
01:41:58.780 | But it should also be pointed out
01:42:00.620 | that there was a lot of sympathy on the part of the allies
01:42:03.260 | for the fact that, you know,
01:42:04.660 | the Germans probably should have the Rhineland back
01:42:06.920 | and this is traditional German land.
01:42:08.980 | I mean, they were trying, in a funny way,
01:42:11.000 | it's almost like the love and the sense of justice
01:42:14.900 | on the allies part may have actually stayed their hand
01:42:18.480 | in a way that would have prevented
01:42:20.500 | much, much, much worse things later.
01:42:22.180 | But if the times were such
01:42:26.100 | that the message of a Hitler resonated,
01:42:28.540 | then simply removing Hitler from the equation
01:42:30.940 | would not have removed the context of the times.
01:42:34.220 | And that means one of two things,
01:42:36.700 | either you could have had another one
01:42:39.140 | or you could have ended up in a situation equally bad
01:42:43.260 | in a different direction.
01:42:44.860 | I don't know what that means
01:42:46.280 | because it's hard to imagine anything could be worse
01:42:49.100 | than what actually occurred.
01:42:51.040 | But history is funny that way.
01:42:52.460 | And Hitler's always everyone's favorite example
01:42:55.580 | of the difference between the great man theory of history
01:42:58.500 | and the trends and forces theories of history, right?
01:43:01.420 | The times made a Hitler possible
01:43:03.880 | and maybe even desirable to some.
01:43:06.180 | If you took him out of the equation,
01:43:08.660 | those trends and forces are still in place, right?
01:43:12.100 | So what does that mean?
01:43:13.860 | If you take him out and the door is still open,
01:43:16.620 | does somebody else walk through it?
01:43:19.080 | - Yeah, mathematically speaking,
01:43:21.860 | the probability that charismatic leaders emerge.
01:43:28.140 | I'm so torn on that at this point.
01:43:32.540 | - Here's another way to look at it.
01:43:33.920 | The institutional stability of Germany in that time period
01:43:38.920 | was not enough to push back.
01:43:41.340 | And there are other periods in German history.
01:43:43.220 | I mean, had Hitler arisen in 1913,
01:43:47.740 | he doesn't get anywhere.
01:43:49.120 | 'Cause Germany's institutional power
01:43:51.520 | is enough to simply quash that.
01:43:54.400 | It's the fact that Germany was unstable anyway
01:43:57.560 | that prevented a united front
01:43:59.540 | that would have kept radicalism from getting out of hand.
01:44:02.440 | Does that make sense?
01:44:03.280 | - Yes, absolutely.
01:44:04.260 | A tricky question on this,
01:44:06.080 | just to stay in this a little longer,
01:44:09.520 | 'cause I'm not sure how to think about it,
01:44:11.380 | is World War II versus the Holocaust.
01:44:16.140 | And we were talking just now
01:44:20.580 | about the way that history unrolls itself
01:44:23.620 | and could Hitler have been stopped?
01:44:26.180 | And I don't quite know what to think about Hitler
01:44:30.820 | without the Holocaust.
01:44:33.060 | And perhaps in his thinking,
01:44:36.400 | how essential the antisemitism and the hatred of Jews was.
01:44:44.360 | It feels to me that,
01:44:47.000 | I mean, I don't, we were just talking about
01:44:50.240 | where did he pick up his hatred of the Jewish people?
01:44:54.480 | There's stories in Vienna and so on
01:44:57.840 | that it almost is picking up the idea of antisemitism
01:45:02.840 | as a really useful tool,
01:45:05.700 | as opposed to actually believing it in its core.
01:45:10.340 | Do you think World War II, as it turned out,
01:45:13.120 | and Hitler as he turned out,
01:45:15.560 | would be possible without antisemitism?
01:45:18.560 | Could we have avoided the Holocaust?
01:45:21.420 | Or was it an integral part of the ideology
01:45:26.200 | of fascism and the Nazis?
01:45:29.240 | - Not an integral part of fascism,
01:45:30.760 | 'cause Mussolini really, I mean,
01:45:32.800 | Mussolini did it to please Hitler,
01:45:34.720 | but it wasn't an integral part.
01:45:36.800 | What's interesting to me is that that's the big anomaly
01:45:40.400 | in the whole question,
01:45:41.280 | because antisemitism didn't need to be a part of this
01:45:44.320 | at all, right?
01:45:45.640 | Hitler had a conspiratorial view of the world.
01:45:50.200 | He was a believer that the Jews controlled things, right?
01:45:53.760 | The Jews were responsible for both Bolshevism on one side
01:45:57.560 | and capitalism on the other.
01:45:59.120 | They ruled the banks.
01:46:00.320 | I mean, the United States was a Jewified country, right?
01:46:03.680 | Bolshevism was a Jewified sort of political ideology.
01:46:09.200 | In other words, he saw Jews everywhere,
01:46:11.280 | and he had that line about it,
01:46:12.360 | if the Jews of Europe force another war on Germany,
01:46:15.920 | they'll pay the price, or whatever.
01:46:17.360 | But then you have to believe that they're capable of that.
01:46:20.280 | The Holocaust is a weird, weird sidebar to the whole thing.
01:46:24.120 | And here's what I've always found interesting.
01:46:25.960 | It's a sidebar that weakened Germany.
01:46:28.400 | 'Cause look at the first world war.
01:46:29.720 | Jews fought for Germany, right?
01:46:31.920 | Who was the most important?
01:46:33.620 | And this is a very arguable point,
01:46:36.080 | but it's just the first one that pops into my head.
01:46:38.240 | Who was the most important Jewish figure
01:46:41.760 | that would have maybe been on the German side
01:46:45.080 | had the Germans had a non-anti-Semitic government?
01:46:48.320 | Well, listen, that whole part.
01:46:49.840 | - Einstein. - Yeah, yes, it was Einstein.
01:46:51.480 | But I should point out that to say Germany
01:46:54.500 | or Europe or Russia or any of those things
01:46:56.800 | were not anti-Semitic is to do injustice to history, right?
01:47:00.040 | Pogroms everywhere.
01:47:00.880 | I mean, it's standard operating procedure.
01:47:04.660 | What you see in the Hitlerian era
01:47:06.720 | is an absolute huge spike, right?
01:47:09.040 | 'Cause the government has a conspiracy theory
01:47:11.160 | about the Jews.
01:47:12.040 | It's funny because Hitler both thought of them as weak
01:47:15.080 | and super powerful at the same time, right?
01:47:17.200 | And as an outsider people that weakened Germany,
01:47:20.360 | the whole idea of the blood
01:47:21.780 | and how that connects to Darwinism
01:47:23.560 | and all that sort of stuff is just weird, right?
01:47:26.660 | A real outlier.
01:47:28.160 | But Einstein, let's just play with Einstein.
01:47:30.660 | If there's no anti-Semitism in Germany
01:47:34.720 | or none above the normal level, right?
01:47:38.640 | The baseline level, does Einstein leave
01:47:41.640 | along with all the other Jewish scientists?
01:47:44.800 | And what increased technological
01:47:49.200 | and intellectual capacity does Germany have if they stay, right?
01:47:52.800 | It's something that actually weakened that state.
01:47:55.420 | It's a tragic flaw in the Hitlerian worldview.
01:47:59.880 | But let me address something:
01:48:02.320 | you had mentioned earlier
01:48:03.480 | that maybe it was not integral to his character.
01:48:06.480 | Maybe it was a wonderful tool for power.
01:48:09.320 | I don't think so.
01:48:10.480 | Somewhere along the line and really not at the beginning,
01:48:13.920 | this guy became absolutely obsessed with this.
01:48:17.800 | - With the conspiracy theory.
01:48:19.000 | - And Jews.
01:48:19.840 | And he surrounded himself with people and theorists.
01:48:23.880 | I'm gonna use that word really, really sort of loosely,
01:48:27.560 | who believed this too.
01:48:28.840 | And so you have a cabal of people
01:48:30.960 | who are reinforcing this idea
01:48:33.240 | that the Jews control the world,
01:48:34.800 | that he called it international Jewry
01:48:37.560 | was a huge part of the problem.
01:48:39.200 | And that because of that, they deserved to be punished.
01:48:41.000 | They were an enemy within, all these kinds of things.
01:48:43.720 | It's a nutty conspiracy theory
01:48:46.440 | for the government of one of the most cultured nations to adopt.
01:48:49.120 | I mean, the big thing with Germany was culture, right?
01:48:51.440 | They were a leading figure in culture and philosophy
01:48:55.760 | and all these kinds of things.
01:48:56.640 | And that they could be overtaken
01:48:59.540 | with this wildly, wickedly weird conspiracy theory
01:49:03.080 | and that it would actually determine things.
01:49:05.160 | I mean, Hitler was taking vast amounts of German resources
01:49:08.320 | and using it to wipe out this race
01:49:10.680 | when he needed them for all kinds of other things
01:49:12.920 | to fight a war of annihilation.
01:49:14.680 | So that is the weirdest part of the whole Nazi phenomenon.
01:49:19.680 | - The darkest possible silver lining to think about
01:49:24.960 | is that the Holocaust
01:49:27.880 | and the hatred of the Jewish people
01:49:29.940 | may have been the thing that prevented
01:49:32.160 | Germany from getting nuclear weapons first.
01:49:34.580 | And- - Potentially.
01:49:36.640 | - And- (laughs)
01:49:38.560 | - Isn't that a wonderful historical ironic twist
01:49:41.280 | that if it weren't so overlaid with tragedy,
01:49:43.920 | a thousand years from now,
01:49:45.040 | we'd see it as something really kind of funny.
01:49:46.760 | - Well, that's true.
01:49:47.800 | It's fascinating to think as you've talked-
01:49:50.640 | - So the seeds of his own destruction, right?
01:49:52.480 | The tragic flaw.
01:49:55.400 | And my hope, this is a discussion I have with my dad
01:50:00.360 | who's a physicist,
01:50:01.580 | is that evil inherently contains within it
01:50:09.400 | that kind of incompetence.
01:50:11.440 | So my dad's discussion, so he's a physicist and engineer,
01:50:17.400 | his belief is that at this time in our history,
01:50:22.320 | the reason we haven't had nuclear, like terrorist
01:50:26.400 | blow up a nuclear weapon somewhere in the world
01:50:31.200 | is that the kind of people that would be terrorists
01:50:34.760 | are simply not competent enough
01:50:36.840 | at their job of being destructive.
01:50:41.220 | So like there's a kind of, if you plot it,
01:50:43.960 | the more evil you are, the less able you are.
01:50:47.920 | And by evil, I mean, purely just like we said,
01:50:52.400 | if we were to consider the hatred of Jewish people as evil,
01:50:56.360 | because it's sort of detached from reality.
01:50:58.320 | It's like just this pure hatred of something
01:51:02.460 | that's grounded on conspiracy theories.
01:51:07.460 | If that's evil, then the more you sell yourself on it,
01:51:11.120 | the more you give into these conspiracy theories,
01:51:13.840 | the less capable you are at actually engineering,
01:51:16.880 | which is very difficult, engineering nuclear weapons
01:51:19.160 | and effectively deploying them.
01:51:20.920 | So that's a hopeful message that the destructive people
01:51:25.000 | in this world are by their worldview,
01:51:29.400 | incompetent in creating the ultimate destruction.
01:51:33.820 | - I don't agree with that.
01:51:35.120 | - Oh boy.
01:51:35.960 | - I straight up don't agree with that.
01:51:37.660 | - So why are we still here?
01:51:39.220 | Why haven't we destroyed ourselves?
01:51:41.560 | Why haven't the terrorists done it?
01:51:43.580 | It's been many decades.
01:51:45.160 | Why haven't we destroyed ourselves to this point?
01:51:49.080 | - Well, when you say it's been many decades,
01:51:51.840 | many decades, that's like saying in the life
01:51:54.240 | of a 150-year-old person, we've been doing well for a year.
01:51:58.920 | The problem with all these kinds of equations,
01:52:01.560 | and it was Bertrand Russell, right,
01:52:03.000 | the philosopher, who said so.
01:52:05.000 | He said it's unreasonable to expect a man
01:52:08.240 | to walk on a tightrope for 50 years.
01:52:12.060 | I mean, the problem is that this is a long game.
01:52:15.600 | And let's remember that up until relatively recently,
01:52:18.360 | what would you say, 30 years ago,
01:52:20.520 | the nuclear weapons in the world
01:52:22.660 | were really tightly controlled.
01:52:24.400 | That was one of the real dangers
01:52:25.560 | in the fall of the Soviet Union.
01:52:26.800 | Remember the worry that all of a sudden
01:52:29.680 | you were gonna have bankrupt former Soviet Republic
01:52:32.640 | selling nuclear weapons to terrorists and whatnot.
01:52:35.160 | I would suggest, and here's another problem,
01:52:37.280 | is that when we call these terrorists evil,
01:52:39.560 | it's easy for an American, for example,
01:52:42.100 | to say that Osama bin Laden is evil.
01:52:44.980 | Easy for me to say that.
01:52:46.380 | But one man's terrorist is another man's freedom fighter,
01:52:49.240 | as the saying goes.
01:52:50.200 | And to other people, he's not.
01:52:52.100 | What Osama bin Laden
01:52:54.460 | and the people that worked with him did,
01:52:56.820 | we would call evil genius.
01:52:58.700 | The idea of hijacking planes
01:53:00.780 | and flying them into the buildings like that,
01:53:02.660 | and that he could pull that off.
01:53:04.740 | And that still boggles my mind.
01:53:07.460 | It's funny, I'm still stunned by that.
01:53:10.680 | And yet, the idea, here's the funny part,
01:53:14.080 | and I hesitate to talk about this
01:53:16.560 | 'cause I don't wanna give anyone ideas.
01:53:19.560 | But you don't need nuclear weapons
01:53:22.720 | to do incredibly grave amounts of damage.
01:53:25.840 | Really, I mean, what one can of gasoline
01:53:29.180 | and a Bic lighter can do in the right place
01:53:31.600 | and the right time,
01:53:33.440 | and over and over and over again
01:53:36.640 | can bring down societies.
01:53:38.420 | This is the argument behind the importance
01:53:41.260 | of the stability that a nation state provides.
01:53:44.240 | So when we went in and took out Saddam Hussein,
01:53:48.620 | one of the great counter arguments
01:53:50.420 | from some of the people who said,
01:53:51.460 | "This is a really stupid thing to do,"
01:53:53.600 | is that Saddam Hussein was the greatest anti-terror weapon
01:53:57.300 | in that region that you could have
01:53:59.020 | because they were a threat to him.
01:54:01.420 | So he took that, and he did it in a way
01:54:03.640 | that was much more repressive than we would ever be.
01:54:07.140 | And this is the old line about why we supported
01:54:09.540 | right-wing death squad countries,
01:54:12.500 | because they were taking out people
01:54:14.740 | that would inevitably be a problem for us if they didn't.
01:54:18.340 | And they were able to do it in a way
01:54:20.080 | we would never be able to do, supposedly.
01:54:21.940 | We're pretty good at that stuff,
01:54:23.600 | just like the Soviet Union was,
01:54:24.860 | behind the scenes and underneath the radar.
01:54:27.060 | But the idea that the stability created
01:54:29.980 | by powerful and strong centralized leadership
01:54:32.780 | allowed them, it's almost like outsourcing
01:54:35.900 | anti-terror activities, allowed them to,
01:54:38.500 | for their own reasons, I mean,
01:54:40.220 | you see the same thing in the Syria situation
01:54:42.100 | with the Assads.
01:54:42.940 | I mean, you can't have an ISIS in that area
01:54:46.180 | because that's a threat to the Assad government
01:54:48.020 | who will take care of that for you,
01:54:49.500 | and then that helps us by not having an ISIS.
01:54:51.820 | So I would suggest, one, that the game is still on
01:54:56.500 | as to whether or not these people get nuclear weapons
01:54:58.960 | in their hands.
01:55:00.280 | I would suggest they don't need them
01:55:02.060 | to achieve their goals, really.
01:55:03.700 | The crazy thing is if you start thinking
01:55:06.820 | like the Joker in Batman, the terrorist ideas,
01:55:10.220 | it's funny, I guess I would be a great terrorist
01:55:11.980 | 'cause I'm just full of those ideas.
01:55:13.460 | Oh, you could do this, you could,
01:55:14.920 | it's scary to think of how vulnerable we are.
01:55:17.460 | - But the whole point is that you, as the Joker,
01:55:22.300 | wouldn't do the terrorist actions.
01:55:24.740 | That's the theory from my dad that's so hopeful to me:
01:55:29.280 | that all the ideas, your ability to generate good ideas,
01:55:33.660 | forget nuclear weapons, how you can disrupt the power grid,
01:55:37.180 | how you can attack our psychology,
01:55:41.020 | attack with a can of gasoline, like you said,
01:55:44.960 | somehow disrupt the American system of ideas,
01:55:48.060 | that coming up with good ideas there is something evil people can't do.
01:55:53.740 | - Are we saying evil people can't come up
01:55:55.600 | with evil genius ideas?
01:55:57.500 | - That's what I'm saying.
01:55:58.580 | We have this Hollywood story.
01:56:00.060 | - I don't think history backs that up.
01:56:02.340 | I mean, I think you can say with the nuclear weapons it does
01:56:04.260 | but only because they're so recent.
01:56:06.300 | But I mean, evil genius, I mean, that's almost proverbial.
01:56:10.060 | - But that's, okay, so to push back for the fun of it.
01:56:12.820 | - I don't want you to leave this in a terrible mood
01:56:16.540 | because I pushed back on every hopeful idea you had
01:56:20.060 | but I tend to be a little cynical about that stuff.
01:56:22.560 | - But that goes to the definition of evil, I think,
01:56:26.680 | because I'm not so sure human history
01:56:29.820 | has a lot of evil people being competent.
01:56:33.420 | I do believe that mostly,
01:56:36.600 | in order to be good at doing
01:56:38.920 | what may be perceived as evil,
01:56:40.660 | you have to be able to construct an ideology
01:56:43.840 | such that you truly believe, when you look in the mirror
01:56:47.740 | by yourself, that you're doing good for the world.
01:56:51.220 | And it's difficult to construct an ideology
01:56:56.140 | where destroying the lives of millions
01:56:59.060 | or disrupting the American system,
01:57:01.180 | I'm already contradicting myself as I'm saying it.
01:57:03.220 | - I was just gonna say, people have done this already, yes.
01:57:05.780 | - So, but then it's like the question about aliens
01:57:10.780 | with the idea that if the aliens are all out there,
01:57:15.780 | why haven't they visited us?
01:57:20.380 | The same question, if it's so easy to be evil,
01:57:25.020 | it's not easy, if it's possible to be evil,
01:57:27.260 | why haven't we destroyed ourselves?
01:57:29.300 | And your statement is from the context of history,
01:57:32.420 | the game is still on.
01:57:34.940 | And it's just been a few years
01:57:37.460 | since we found the tools to destroy ourselves.
01:57:40.460 | And one of the challenges of our modern time,
01:57:44.260 | which we don't often think about but this pandemic kind of revealed,
01:57:48.060 | is how soft we've gotten in terms of our deep dependence
01:57:52.700 | on the system.
01:57:53.780 | So somebody mentioned to me,
01:57:56.820 | what happens if power goes out for a day?
01:57:59.340 | What happens if power goes out for a month?
01:58:04.060 | Oh, the person that mentioned this was a Berkeley faculty member
01:58:08.180 | that I was talking with, he's an astronomer
01:58:11.180 | who's observing solar flares.
01:58:13.500 | And it's very possible that a solar flare,
01:58:17.020 | they happen all the time to different degrees.
01:58:19.580 | - Knock out your cell phones.
01:58:20.740 | - Yeah, to knock out the power grid.
01:58:22.980 | - For months.
01:58:24.860 | So like, just as a thought experiment,
01:58:29.220 | what happens if just power goes out for a week
01:58:32.860 | in this country?
01:58:33.700 | - This is like the electromagnetic pulses
01:58:36.700 | in the nuclear weapons and all those kinds of things.
01:58:38.940 | - But maybe that's an act of nature.
01:58:41.340 | - Yes.
01:58:42.180 | - And even just the act of nature will reveal
01:58:44.900 | like a little--
01:58:47.060 | - Fragility.
01:58:47.900 | - The fragility of it all.
01:58:48.720 | And then the evil can emerge.
01:58:50.100 | I mean, the kind of things that might happen
01:58:51.620 | when power goes out, especially during a divisive time.
01:58:56.220 | - Well, you won't have food.
01:58:57.300 | At baseline level, that would mean
01:59:00.460 | that the entire supply chain begins to break down.
01:59:04.660 | And then you have desperation,
01:59:06.040 | and desperation opens the door to everything.
01:59:08.300 | - Can I ask a dark question?
01:59:11.020 | - As opposed to the other things we've been talking about?
01:59:14.340 | - There's always a thread, a hopeful message.
01:59:16.420 | I think there'll be a hopeful message on this one too.
01:59:18.420 | - You may have the wrong guest.
01:59:20.220 | - I'm just saying.
01:59:21.560 | - If you were to bet money on the way
01:59:26.820 | that human civilization destroys itself,
01:59:30.980 | or collapses in some way where the result
01:59:35.980 | would be unrecognizable to us as anything akin to progress,
01:59:41.140 | what would you say?
01:59:42.980 | Is it nuclear weapons?
01:59:46.460 | Is it some societal breakdown
01:59:48.820 | through just more traditional kinds of war?
01:59:51.540 | Is it engineered pandemics, nanotechnologies,
01:59:54.660 | is it artificial intelligence?
01:59:56.620 | Is it something we can't even expect yet?
01:59:59.060 | Do you have a sense of how we humans will destroy ourselves?
02:00:02.700 | Or might we live forever?
02:00:04.820 | - I think what governs my view of this thing
02:00:08.700 | is the ability for us to focus ourselves collectively.
02:00:13.220 | And that gives me the choice of looking at this and saying,
02:00:16.360 | what are the odds we will do X versus Y, right?
02:00:19.680 | So go look at the '62 Cuban Missile Crisis,
02:00:24.020 | where we looked at the potential of nuclear war
02:00:28.020 | and we stared right in the face of that.
02:00:30.620 | To me, I consider that to be,
02:00:32.460 | you wanna talk about a hopeful moment?
02:00:34.820 | That's one of the rare times in our history
02:00:36.940 | where I think the odds were overwhelmingly
02:00:41.220 | that there would be a nuclear war.
02:00:43.260 | And I'm not a super Kennedy worshiper.
02:00:46.300 | I grew up in an era where,
02:00:48.340 | especially amongst people in the Democratic Party,
02:00:50.780 | he was almost worshipped, and I was never that guy,
02:00:53.060 | but I will say something.
02:00:54.860 | John F. Kennedy by himself probably made decisions
02:00:59.860 | that saved 100 million or more lives
02:01:02.740 | because everyone around him thought he should be
02:01:06.380 | taking the road that would have led to those deaths.
02:01:08.780 | And to push back against that is when you look at it now,
02:01:12.380 | I mean, again, if you were a betting person,
02:01:13.900 | you would have bet against that.
02:01:15.220 | And that's rare, right?
02:01:16.640 | So when we talk about how the world will end,
02:01:22.460 | the fact that one person actually had that in their hands
02:01:26.400 | meant that it wasn't a collective decision.
02:01:29.120 | Remember I said
02:01:30.220 | I trust people on an individual level,
02:01:32.060 | but when we get together, we're more like a herd
02:01:34.100 | and we devolve down to the lowest common denominator.
02:01:36.660 | That was something where the higher ethical ideas
02:01:40.500 | of a single human being could come into play
02:01:42.700 | and make the decisions that influence the events.
02:01:46.300 | But when we have to act collectively,
02:01:48.720 | I get a lot more pessimistic.
02:01:50.260 | So take what we're doing to the planet.
02:01:53.660 | And we talk about it always now in terms of climate change,
02:01:56.500 | which I think is far too narrow.
02:01:59.200 | You know, I always get very frustrated
02:02:02.660 | when we talk about these arguments about,
02:02:04.020 | is it happening?
02:02:04.840 | Is it human?
02:02:05.680 | Just look at the trash.
02:02:07.180 | Forget climate for a second.
02:02:09.900 | We're destroying the planet
02:02:11.540 | because we're not taking care of it.
02:02:13.020 | And because what it would do to take care of it
02:02:14.940 | would require collective sacrifices
02:02:17.580 | that would require enough of us to say, okay.
02:02:21.460 | And we can't get enough of us to say, okay,
02:02:24.580 | because too many people have to be on board.
02:02:26.980 | It's not John F. Kennedy making one decision from one man.
02:02:30.600 | We have to have 85% of us or something around the world.
02:02:34.660 | Not just, you can't say we're gonna stop doing damage
02:02:37.660 | to the world here in the United States if China does it.
02:02:41.020 | Right?
02:02:41.860 | So the amount of people that have to get on board
02:02:43.820 | that train is hard.
02:02:46.580 | You get pessimistic hoping for those kinds of shifts
02:02:49.900 | unless it's right there
02:02:51.900 | and, you know, Krypton's about to explode.
02:02:54.140 | And so I think if you're talking
02:02:57.460 | about a gambling man's view of this,
02:03:00.420 | that that's gotta be the odds on favorite
02:03:02.340 | because it requires so much.
02:03:04.700 | I mean, the systems maybe aren't even in place, right?
02:03:09.140 | The fact that we would need intergovernmental bodies
02:03:12.060 | that are completely discredited now on board,
02:03:14.460 | and you would have to subvert the national interests
02:03:17.740 | of nation states.
02:03:18.620 | I mean, the amount of things that have to go right
02:03:22.580 | in a short period of time,
02:03:23.900 | and we don't have 600 years to figure this out, right?
02:03:27.500 | So to me, that looks like the most likely
02:03:30.100 | just because the things we would have to do
02:03:31.700 | to avoid it seem the most unlikely.
02:03:33.580 | Does that make sense?
02:03:34.420 | - Yes, absolutely.
02:03:35.540 | I believe, call me naive,
02:03:38.460 | in just like you said with the individual,
02:03:41.460 | I believe that charismatic leaders,
02:03:43.860 | individual leaders will save us.
02:03:46.180 | Like this- - What if you don't get them all
02:03:47.660 | at the same time?
02:03:48.540 | What if you get a charismatic leader in one country,
02:03:50.580 | or what if you get a charismatic leader
02:03:52.460 | in a country that doesn't really matter that much?
02:03:54.500 | - Well, it's a ripple effect.
02:03:55.940 | So it starts with one leader
02:03:57.460 | and their charisma inspires other leaders.
02:04:00.940 | Like, so it's like one ant queen steps up
02:04:05.180 | and then the rest of the ants start behaving.
02:04:07.300 | And then there's like little other spikes
02:04:09.260 | of leaders that emerge.
02:04:11.140 | And then that's where collaboration emerges.
02:04:13.100 | I tend to believe that like when you heat up the system
02:04:16.280 | and shit starts getting really chaotic,
02:04:21.060 | then the leader,
02:04:23.020 | from whatever this collective intelligence is that we've developed,
02:04:26.180 | the leader will emerge.
02:04:28.020 | Like there- - Don't you think
02:04:29.020 | there's just as much of a chance though
02:04:30.440 | that the leader would emerge and say,
02:04:31.820 | the Jews are the people who did all this.
02:04:33.580 | - That's right. - You know what I'm saying
02:04:34.580 | is that the idea that they would come up,
02:04:36.940 | you have a charismatic leader
02:04:37.900 | and he or she is going to come up
02:04:39.460 | with the right solution
02:04:41.700 | as opposed to totally coming up with the wrong solution.
02:04:45.420 | I mean, I guess what I'm saying is you could be right,
02:04:47.620 | but a lot of things have to go the right way.
02:04:50.120 | - But my intuition is that the evolutionary process
02:04:52.900 | that led to the creation of human intelligence
02:04:55.540 | and consciousness on earth results in,
02:05:00.540 | if we think of it as
02:05:02.420 | just the love in the system versus the hate in the system,
02:05:05.500 | the love being greater.
02:05:07.460 | The human kindness potential in the system
02:05:12.460 | is greater than the human hatred potential.
02:05:18.180 | And so the leader that is in the time when it's needed,
02:05:21.800 | the leader that inspires love and kindness
02:05:25.940 | is more likely to emerge and will have more power.
02:05:30.540 | So you have the Hitlers of the world that emerge,
02:05:34.140 | but actually, in the grand scheme of history,
02:05:37.820 | they're not that impactful.
02:05:40.660 | So it's weird to say,
02:05:42.840 | but not that many people died in World War II.
02:05:45.860 | If you look at the full range of human history,
02:05:50.860 | you know, it's up to a hundred million, whatever that is.
02:05:55.700 | With natural pandemics too,
02:05:57.020 | you can have those kinds of numbers,
02:05:58.420 | but it's still a percentage.
02:06:00.140 | I forget what the percentage is,
02:06:01.220 | maybe 3 to 5% of the human population on earth.
02:06:04.620 | Maybe it's a little bit focused on a different region,
02:06:07.180 | but it's not destructive
02:06:09.020 | to the entirety of human civilization.
02:06:11.860 | So I believe that the charismatic leaders,
02:06:16.860 | when time is needed, that do good for the world
02:06:22.180 | in the broader sense of good are more likely to emerge
02:06:26.000 | than the ones that say, kill all the Jews.
02:06:29.460 | - It's possible though, and this is just, you know,
02:06:32.020 | I've thought about this all of 30 seconds, but I mean,
02:06:35.060 | it's-
02:06:35.900 | - We're betting money here on the 21st century.
02:06:38.460 | Who's gonna win?
02:06:39.300 | - I think maybe you've divided this
02:06:42.580 | into too much of a black and white dichotomy,
02:06:45.740 | this love and good on one side and this evil on another.
02:06:48.660 | Let me throw something that might be more in the center
02:06:51.820 | of that linear balancing act, self-interest,
02:06:56.420 | which may or may not be good.
02:06:59.100 | You know, the good version of it
02:07:00.580 | we call enlightened self-interest, right?
02:07:02.980 | The bad version of it we call selfishness.
02:07:05.940 | But self-interest to me seems like something more likely
02:07:09.620 | to impact the outcome than either love on one side
02:07:13.180 | or evil on the other.
02:07:14.840 | Simply a question of what's good for me
02:07:17.460 | or what's good for my country
02:07:19.260 | or what's good from my point of view
02:07:21.140 | or what's good for my business.
02:07:22.980 | I mean, if you tell me, and maybe I'm a coal miner
02:07:27.980 | or maybe I own a coal mine,
02:07:29.780 | if you say to me, we have to stop using coal
02:07:32.660 | 'cause it's hurting the earth,
02:07:34.100 | I have a hard time disentangling that greater good question
02:07:43.740 | from my good-right-now, feeding-my-family question, right?
02:07:43.740 | So I think maybe it's gonna be a much more banal thing
02:07:48.280 | than good and evil, much more a question of,
02:07:51.100 | we're not all going to decide at the same time
02:07:54.420 | that the interests that we have are aligned.
02:07:57.520 | Does that make sense?
02:07:58.360 | - Yeah, totally.
02:07:59.180 | But I mean, I've looked at Ayn Rand and objectivism
02:08:01.540 | and kind of really thought like,
02:08:02.780 | how bad or good can things go
02:08:04.500 | when everybody's acting selfishly?
02:08:06.640 | But I think we're just two ants here
02:08:08.900 | with microphones talking about--
02:08:10.740 | - Two ants here with microphones.
02:08:13.300 | - But like, the question is when this spreads,
02:08:17.780 | so what do I mean by love and kindness?
02:08:22.780 | I think it's human flourishing on earth
02:08:26.700 | and throughout the cosmos.
02:08:28.940 | It feels like whatever engine drives human beings
02:08:33.940 | is more likely to result in human flourishing.
02:08:37.280 | And people like Hitler are not good for human flourishing.
02:08:41.260 | So that's what I mean by good is there's a,
02:08:45.300 | I mean, maybe it's an intuition
02:08:46.900 | that kindness is an evolutionary advantage.
02:08:50.760 | I hate those terms, I hate to reduce stuff
02:08:53.500 | to evolutionary biology always,
02:08:55.900 | but it just seems like for us to multiply
02:08:58.580 | throughout the universe, it's good to be kind to each other.
02:09:03.100 | And those leaders will always emerge to save us
02:09:06.900 | from the Hitlers of the world
02:09:08.420 | that want to kind of burn the thing down
02:09:10.180 | with a flamethrower.
02:09:11.380 | That's the intuition.
02:09:12.220 | - But let's talk about,
02:09:13.060 | you brought up evolution several times.
02:09:14.900 | So let me play with that for a minute.
02:09:17.100 | I think going back to animal times,
02:09:20.460 | we are conditioned to deal with overwhelming threats
02:09:24.380 | right in front of us.
02:09:25.220 | So I have quite a bit of faith in humanity
02:09:28.620 | when it comes to impending doom right outside our door.
02:09:33.620 | If Krypton's about to explode,
02:09:35.940 | I think humanity can rouse themselves to greatness,
02:09:39.180 | and would give power to the people who needed it
02:09:42.380 | and be willing to make the sacrifices.
02:09:44.420 | But that's what makes, I think,
02:09:45.700 | the pollution/climate change/screwing up your environment
02:09:49.980 | threat so particularly insidious,
02:09:53.200 | is it happens slowly, right?
02:09:55.260 | It defies fight-or-flight mechanisms.
02:09:58.120 | It defies the natural ability we have
02:10:00.780 | to deal with the threat that's right on top of us.
02:10:03.900 | And it requires an amount of foresight
02:10:06.340 | that while some people would be fine with that,
02:10:09.220 | most people are too worried,
02:10:10.860 | and understandably, I think,
02:10:12.400 | too worried about today's threat
02:10:14.900 | rather than next generation's threat,
02:10:17.040 | or whatever it might be.
02:10:18.580 | So, I mean, when we talk about,
02:10:20.020 | when you had said,
02:10:20.860 | what do you think the greatest threat is?
02:10:23.060 | I think with nuclear weapons,
02:10:24.420 | I think, could we have a nuclear war?
02:10:26.400 | We darn right could.
02:10:27.360 | But I think that there's enough of inertia.
02:10:31.020 | We're against that because people understand instinctively,
02:10:34.800 | if I decide to launch this attack against China,
02:10:37.680 | and I'm India,
02:10:38.960 | we're gonna have 50 million dead people tomorrow.
02:10:41.580 | Whereas if you say,
02:10:42.800 | we're gonna have a whole planet of dead people
02:10:44.800 | in three generations if we don't start now,
02:10:47.520 | I think the evolutionary way
02:10:51.260 | that we have evolved militates maybe against that.
02:10:55.520 | In other words, I think I would be pleasantly surprised
02:10:58.440 | if we could pull that off.
02:10:59.720 | Does that make sense?
02:11:01.320 | - Totally.
02:11:02.160 | - I don't mean to be like,
02:11:02.980 | I'm the scientist predicting doom.
02:11:04.920 | - Well, it's fun that way.
02:11:06.080 | I think we're both,
02:11:07.480 | maybe I'm over the top on the love thing.
02:11:09.240 | - Maybe I'm over the top on the doom.
02:11:11.440 | - So it makes for a fun chat, I think.
02:11:14.540 | So one guy that I've talked to several times
02:11:17.440 | who's slowly becoming a friend
02:11:19.440 | is a guy named Elon Musk.
02:11:22.080 | He's a big fan of hardcore history,
02:11:24.740 | especially Genghis Khan series of episodes,
02:11:29.560 | but really all of it,
02:11:31.240 | him and his girlfriend Grimes listen to it.
02:11:34.720 | - I know Elon.
02:11:35.840 | - Yeah, you know Elon?
02:11:36.760 | Okay, awesome.
02:11:37.580 | So that's like relationship goals,
02:11:40.360 | like listen to Hardcore History on the weekend
02:11:42.500 | with your loved one.
02:11:43.620 | Okay, so let me,
02:11:45.880 | if I were to look at the guy
02:11:48.380 | from a perspective of human history,
02:11:51.360 | it feels like he will be a little speck that's remembered.
02:11:56.000 | - Oh, absolutely.
02:11:56.960 | - You think about like the people,
02:11:58.360 | what will we remember from our time?
02:12:01.000 | Who are the people we'll remember,
02:12:03.800 | whether it's the Hitlers or the Einsteins,
02:12:07.880 | who's going to be,
02:12:09.280 | it's hard to predict when you're in it,
02:12:11.560 | but it seems like Elon
02:12:13.320 | will be one of those people remembered.
02:12:14.880 | And if I were to guess what he's remembered for,
02:12:17.920 | it's the work he's doing with SpaceX
02:12:20.740 | and potentially being the person,
02:12:23.820 | now we don't know,
02:12:24.780 | but being the person who launched
02:12:29.080 | a new era of space exploration.
02:12:31.840 | If we look centuries from now,
02:12:34.300 | if we are successful as human beings surviving long enough
02:12:37.260 | to venture out
02:12:41.500 | toward the stars.
02:12:43.180 | It's weird to ask you this.
02:12:44.620 | I don't know what your opinions are,
02:12:46.540 | but do you think humans will be a multi-planetary species
02:12:51.480 | in the arc, long arc of history?
02:12:53.800 | Do you think Elon will be successful in his dream?
02:12:56.280 | And he doesn't shy away from saying it this way, right?
02:12:59.880 | He really wants us to colonize Mars first
02:13:04.720 | and then colonize other earth-like planets
02:13:08.360 | in other solar systems throughout the galaxy.
02:13:11.440 | Do you have a hope that we humans
02:13:12.900 | will venture out towards the stars?
02:13:15.480 | - So here's the thing.
02:13:16.480 | And this actually, again,
02:13:17.400 | dovetails to what we were talking about earlier.
02:13:19.920 | I actually, first of all, I toured SpaceX.
02:13:24.120 | And when you see it,
02:13:25.800 | it's hard to get your mind around,
02:13:27.360 | because he's doing what it took governments to do before.
02:13:30.080 | - Yes. - Okay?
02:13:30.900 | So it's incredible that we're watching
02:13:32.520 | individual companies and stuff doing this.
02:13:34.880 | - Doing it faster and cheaper.
02:13:35.960 | - Yeah, well, and pushing the envelope, right?
02:13:38.720 | Faster than the governments of the time were moving.
02:13:40.760 | It really is.
02:13:42.440 | I mean, there's a lot of people, I think,
02:13:45.240 | who think Elon is overrated and you have no idea, right?
02:13:49.560 | When you go see it, you have no idea.
02:13:51.800 | But that's actually not what I'm most impressed with.
02:13:54.440 | It's Tesla I'm most impressed with.
02:13:57.720 | And the reason why is because in my mind,
02:14:00.580 | we just talked about what I think is the greatest threat,
02:14:02.920 | the environmental stuff.
02:14:04.160 | And I talked about our inability,
02:14:06.520 | maybe all at the same time,
02:14:08.280 | to be willing to sacrifice our self-interest
02:14:11.620 | in order to reach the goal.
02:14:14.480 | And I don't wanna put words in Elon's mouth,
02:14:16.400 | so you can talk to him if you want to.
02:14:18.200 | But in my mind, what he's done is recognize that problem.
02:14:22.880 | And instead of building a car that's a piece of crap,
02:14:25.440 | but it's good for the environment, so you should drive it,
02:14:27.760 | he's trying to create a car
02:14:30.320 | that if you're only motivated by your self-interest,
02:14:33.680 | you'll buy it anyway.
02:14:35.240 | And it will help the environment
02:14:36.720 | and help us transition away
02:14:38.080 | from one of the main causes of damage.
02:14:40.760 | I mean, one of the things this pandemic
02:14:43.000 | and the shutdown around the world has done
02:14:45.640 | is show us how amazingly quickly
02:14:48.060 | the earth can actually rejuvenate.
02:14:49.720 | We're seeing clear skies in places, species coming back.
02:14:52.080 | And you would have thought it would have taken decades
02:14:54.400 | for some of this stuff.
02:14:55.720 | So what if, to name just one major pollution source,
02:14:59.640 | we didn't have the pollution caused by automobiles, right?
02:15:03.800 | And if you had said to me,
02:15:05.520 | "Dan, what do you think the odds
02:15:06.920 | "of us transitioning away from that were 10 years ago?"
02:15:09.480 | I would have said, "Well, people aren't gonna do it
02:15:10.760 | "'cause it's inefficient, it's this, it's that.
02:15:12.240 | "Nobody wants to," blah, blah.
02:15:13.720 | But what if you created a vehicle
02:15:15.240 | that was superior in every way
02:15:16.440 | so that if you were just a self-oriented consumer,
02:15:19.420 | you'd buy it 'cause you wanted that car?
02:15:21.740 | That's the best way to get around that problem
02:15:24.320 | of people not wanting to, you know.
02:15:26.500 | I think he's identified that.
02:15:28.520 | And as he's told me before,
02:15:30.460 | "You know, when the last time a car company was created,
02:15:33.140 | "that actually, you know, blah, blah, blah," he's right.
02:15:36.560 | And so I happen to feel that even though he's pushing
02:15:39.320 | the envelope on the space thing,
02:15:40.800 | I think somebody else would have done that someday.
02:15:43.680 | I'm not sure because of the various things he's mentioned,
02:15:46.640 | how difficult it is to start there.
02:15:48.240 | I'm not sure that the industries
02:15:50.080 | that create vehicles for us would have gone
02:15:52.820 | where he's going to lead them
02:15:55.200 | if he didn't force them there through consumer demand
02:15:58.000 | by making a better car that people wanted anyway.
02:16:00.580 | They'll follow, they'll copy, they'll do all those things.
02:16:03.880 | And yet who was gonna do that?
02:16:06.680 | So I hope he doesn't hate me for saying this,
02:16:08.800 | but I happen to think the Tesla idea
02:16:10.880 | may alleviate some of the need to get off this planet
02:16:15.120 | 'cause the planet's being destroyed, right?
02:16:17.080 | And we're gonna colonize Mars probably anyway
02:16:19.280 | if we live long enough.
02:16:20.480 | And I think the Tesla idea, not just Elon's version,
02:16:23.840 | but ones that follow from other people,
02:16:25.840 | is the best chance of making sure we're around long enough
02:16:28.420 | to see Mars colonized.
02:16:29.560 | Does that make sense?
02:16:30.400 | - Yeah, totally.
02:16:31.220 | And one other thing from my perspective,
02:16:33.280 | 'cause I'm now starting a company,
02:16:35.360 | I think the interesting thing about Elon
02:16:38.800 | is he serves as a beacon of hope,
02:16:41.520 | like pragmatically speaking,
02:16:43.680 | for people that sort of to push back
02:16:45.680 | on our doom conversation from earlier,
02:16:48.620 | that a single individual could build something
02:16:53.620 | that allows us as self-interested individuals
02:16:58.560 | to gather together in a collective way
02:17:00.880 | to actually alleviate some of the dangers
02:17:03.300 | that face our world.
02:17:05.480 | So it gives me hope as an individual
02:17:09.200 | that I can build something that can actually have impact
02:17:14.200 | that counteracts the Stalins and the Hitlers
02:17:19.440 | and all the threats
02:17:22.800 | that human civilization faces,
02:17:25.100 | that an individual has that power.
02:17:27.120 | I didn't believe that the individual has that power
02:17:30.280 | in the halls of government.
02:17:33.160 | Like I don't feel like any one presidential candidate
02:17:36.120 | can rise up and help the world, unite the world.
02:17:39.560 | It feels like from everything I've seen,
02:17:42.760 | and you're right with Tesla,
02:17:45.120 | it can bring the world together to do good.
02:17:49.360 | That's a really powerful mechanism.
02:17:50.840 | Whatever you say about capitalism,
02:17:53.200 | you can build companies, and
02:17:56.120 | it starts with a single individual.
02:17:59.360 | Of course, there's a collective that grows around that,
02:18:02.680 | but the leadership of a single individual,
02:18:05.080 | their ideas, their dreams, their vision
02:18:08.160 | can catalyze something that takes over the world
02:18:12.600 | and does good for the entire world.
02:18:14.600 | - But again,
02:18:15.800 | I think the genius of the idea
02:18:18.280 | is that it doesn't require us
02:18:20.240 | to go head to head with human nature, right?
02:18:23.200 | He's actually built human nature into the idea
02:18:27.640 | by basically saying,
02:18:28.600 | I'm not asking you to be an environmental activist.
02:18:31.720 | I'm not asking you to sacrifice to make it.
02:18:33.960 | I'm gonna sell you a car you're going to like better,
02:18:36.560 | and by buying it, you'll help the environment.
02:18:38.760 | That takes into account our foibles as a species
02:18:42.440 | and actually leverages that to work for the greater good.
02:18:46.440 | And that's the sort of thing that does turn off
02:18:49.320 | my little doomcaster cynicism thing a little bit,
02:18:51.920 | because you're actually hitting us where we live, right?
02:18:54.960 | You can take somebody who doesn't even believe
02:18:58.280 | the environment's a problem, but they want a Tesla.
02:19:00.500 | So they're inadvertently helping anyway.
02:19:03.160 | I think that's the genius of the idea.
02:19:05.840 | - Yeah, and I'm telling you,
02:19:07.000 | that's one way to make love
02:19:09.520 | a much more efficient mechanism of change than hate.
02:19:13.560 | - Making it in your self-interest to love somebody?
02:19:15.160 | - It makes it your self-interest, creating a product
02:19:17.640 | that leads to more love than hate.
02:19:21.120 | - You're gonna wanna love your neighbor
02:19:22.400 | 'cause you're gonna make a fortune.
02:19:23.520 | - Exactly. - Right, I get it.
02:19:25.040 | There you go. - That's why he said--
02:19:25.880 | - All right, I'm on board.
02:19:27.520 | - That's why Elon said love is the answer.
02:19:29.800 | That's, I think, exactly what he meant.
02:19:33.120 | Okay, let's try something difficult.
02:19:35.700 | You've recorded an episode called "Steering Into the Iceberg"
02:19:41.720 | on your Common Sense program.
02:19:43.800 | - Yeah.
02:19:44.640 | - That has started a lot of conversations.
02:19:47.160 | It's quite moving, it was quite haunting.
02:19:51.160 | - Got me a lot of angry emails.
02:19:53.000 | - Really? - Of course.
02:19:54.200 | I did something I haven't done in 30 years.
02:19:57.360 | I endorsed a political candidate
02:19:58.740 | from one of the two main parties,
02:19:59.880 | and there were a lot of disillusioned people 'cause of that.
02:20:02.640 | - I guess I didn't hear it as an endorsement.
02:20:05.560 | I just heard it as a,
02:20:08.120 | the same flavor of conversation
02:20:11.880 | as you have in Hardcore History,
02:20:14.660 | almost speaking about modern times
02:20:19.520 | in the same voice
02:20:21.520 | as when you talk about history.
02:20:23.040 | So it was just a little bit of a haunting view
02:20:27.200 | of the world today.
02:20:29.320 | - I know, we were just taking off our Doomcaster hats.
02:20:33.880 | - You're gonna have me put that right back on, are you?
02:20:36.000 | - No, I like the term Doomcaster.
02:20:40.280 | Is there, how do we get love to win?
02:20:46.440 | What's the way out of this?
02:20:49.200 | Is there some hopeful line that we can walk
02:20:56.080 | to avoid something, and I hate to use the terminology,
02:21:01.080 | but something that looks like a civil war,
02:21:05.240 | not necessarily a war of force,
02:21:08.620 | but a division to a level where it doesn't any longer feel
02:21:13.620 | like a United States of America with an emphasis on united.
02:21:20.020 | Is there a way out?
02:21:21.960 | - I read a book a while back.
02:21:24.940 | I wanna say George Friedman, the Stratfor guy wrote it.
02:21:28.260 | It was something called The Next Hundred Years,
02:21:30.080 | I think it was called.
02:21:30.980 | And I remember thinking, I didn't agree with any of it.
02:21:35.380 | And one of the things I think he said in the book
02:21:37.200 | was that the United States was gonna break up.
02:21:39.260 | I'm going from memory here.
02:21:40.100 | He might not have said that at all,
02:21:41.060 | but something was stuck in my memory about that.
02:21:42.580 | And I remember thinking,
02:21:43.780 | but I think some of the arguments were connected
02:21:49.140 | to the differences that we had
02:21:53.020 | and the fact that those differences are being exploited.
02:21:55.780 | So we talked about media earlier
02:21:57.420 | and the lack of truth and everything.
02:21:58.940 | We have a media climate that is incentivized
02:22:03.340 | to take the wedges in our society and make them wider.
02:22:08.180 | And there's no countervailing force to do the opposite
02:22:11.940 | or to help.
02:22:13.340 | So there was a famous memo from a group called
02:22:18.340 | Project for a New American Century,
02:22:21.020 | and they took it down, but the Wayback Machine online
02:22:23.540 | still has it.
02:22:24.360 | And it happened before 9/11,
02:22:25.820 | spawned all kinds of conspiracy theories
02:22:27.680 | 'cause it was saying something to the effect of,
02:22:30.860 | and I'm really paraphrasing here,
02:22:32.360 | but you know that the United States
02:22:33.820 | needs another Pearl Harbor type event
02:22:36.340 | because those galvanize a country
02:22:39.220 | that without those kinds of events periodically
02:22:41.780 | is naturally geared towards pulling itself apart.
02:22:45.460 | And it's those periodic events
02:22:47.180 | that act as the countervailing force
02:22:49.060 | that otherwise is not there.
02:22:51.520 | If that's true, then we are naturally inclined
02:22:55.920 | towards pulling ourselves apart.
02:22:57.960 | So to have a media environment that makes money
02:23:02.960 | off widening those divisions, which we do.
02:23:07.220 | I mean, I was in talk radio and it has those people,
02:23:11.160 | the people that used to scream at me
02:23:12.680 | 'cause I wouldn't do it,
02:23:13.640 | but I mean, we would have these terrible conversations
02:23:15.640 | after every broadcast where I'd be in there
02:23:17.600 | with the program director and they're yelling at me
02:23:20.000 | about heat, heat was the word, create more heat.
02:23:23.000 | Well, what is heat, right?
02:23:24.400 | Heat is division, right?
02:23:25.840 | And they want the heat, not because they're political,
02:23:28.360 | they're not Republicans or Democrats either.
02:23:32.520 | We want listeners and we want engagement and involvement.
02:23:36.040 | And because of the constructs of the format,
02:23:39.080 | you don't have a lot of time to get it.
02:23:40.360 | So you can't have me giving you like on a podcast
02:23:43.160 | an hour and a half or two hours
02:23:44.960 | where we build a logical argument
02:23:47.040 | and you're with me the whole way,
02:23:48.720 | your audience is changing every 15 minutes.
02:23:51.280 | So whatever points you make to create interest
02:23:53.720 | and intrigue and engagement have to be knee jerk right now.
02:23:58.720 | They told me once that the audience has to know
02:24:01.760 | where you stand on every single issue
02:24:05.000 | within five minutes of turning on your show.
02:24:07.680 | In other words, you have to be part
02:24:09.240 | of a linear set of political beliefs
02:24:12.720 | so that if you feel A about subject A,
02:24:15.980 | then you must feel D about subject D.
02:24:18.080 | And I don't even need to hear your opinion on it
02:24:19.560 | 'cause if you feel that way about A,
02:24:20.680 | you're gonna feel that way about D.
02:24:22.360 | This is a system that is designed
02:24:24.360 | to pull us apart for profit,
02:24:26.640 | but not because they wanna pull us apart, right?
02:24:29.400 | It's a by-product of the profit.
02:24:31.540 | That's one little example of 50 examples in our society
02:24:37.600 | that work in that same fashion.
02:24:39.920 | So what that Project for the New American Century document
02:24:42.560 | was saying is that we're naturally inclined
02:24:45.320 | towards disunity and without things
02:24:47.800 | to occasionally ratchet the unity back up again
02:24:51.240 | so that we can start from the baseline again
02:24:53.200 | and then pull ourselves apart 'til the next Pearl Harbor,
02:24:55.960 | that you'll pull yourself apart,
02:24:57.240 | which I think that's what the George Friedman book
02:25:00.720 | was saying that I disagreed with so much at the time.
02:25:03.360 | So in answer to your question about civil wars,
02:25:07.960 | we can't have the same kind of civil war
02:25:09.740 | because we don't have a geographical division
02:25:12.000 | that's as clear cut as the one we had before, right?
02:25:13.960 | You had a basically North-South line and some border states.
02:25:16.760 | It was set up for that kind of a split.
02:25:18.920 | Now we're divided within communities, within families,
02:25:22.460 | within gerrymandered voting districts and precincts, right?
02:25:26.000 | So you can't disengage.
02:25:28.600 | We're stuck with each other.
02:25:30.040 | So if there's a civil war now, for lack of a better word,
02:25:34.800 | what it might seem like is the late 1960s, early 1970s,
02:25:39.560 | where you had the bombings
02:25:42.040 | and let's call it domestic terrorism and things like that,
02:25:45.840 | because that would seem to be something that once again,
02:25:49.300 | you don't even need a large chunk
02:25:50.880 | of the country pulling apart.
02:25:52.240 | 10% of people who think it's the end times
02:25:56.080 | can do the damage,
02:25:57.040 | just like we talked about terrorism before
02:25:59.200 | and a can of gas and a Bic lighter.
02:26:01.400 | I've lived in a bunch of places
02:26:02.960 | and I won't give anybody ideas
02:26:04.680 | where a can of gas and a Bic lighter
02:26:06.680 | would take a thousand houses down
02:26:08.760 | before you could blink, right?
02:26:11.760 | That terrorist doesn't have to be from the Middle East,
02:26:15.640 | doesn't have to have some sort of a fundamentalist
02:26:17.640 | religious agenda.
02:26:18.800 | It could just be somebody really pissed off
02:26:20.780 | about the election results.
02:26:22.280 | So once again, if we're playing an odds game here,
02:26:25.440 | everybody has to behave for this to work right.
02:26:28.480 | Only a few people have to misbehave
02:26:30.400 | for this thing to go sideways.
02:26:31.640 | And remember, for every action,
02:26:33.760 | there is an equal and opposite reaction.
02:26:36.280 | So you don't even have to have those people
02:26:38.240 | doing all these things.
02:26:39.120 | All they have to do is start a tit for tat
02:26:41.920 | retribution cycle.
02:26:43.240 | - And there's an escalation.
02:26:44.280 | - Yes, and it creates a momentum of its own,
02:26:48.840 | which leads fundamentally,
02:26:50.000 | if you follow the chain of events down there,
02:26:51.840 | to some form of dictatorial government
02:26:54.200 | as the only way to create stability, right?
02:26:57.440 | You wanna destroy the Republic and have a dictator?
02:26:59.520 | That's how you do it.
02:27:00.360 | And there are parallels to Nazi Germany,
02:27:02.460 | the burning of the Reichstag, blah, blah, blah.
02:27:05.800 | I'm the Doomcaster again, aren't I?
02:27:07.880 | - And some of it could be manufactured
02:27:09.880 | by those seeking authoritarian power.
02:27:11.960 | - Absolutely, like the Reichstag fire was,
02:27:14.840 | or the "Polish" soldiers that fired over the border
02:27:17.800 | before the invasion in 1939.
02:27:20.560 | - To fight the devil's advocate with an angel's advocate,
02:27:24.520 | I would say, just as our conversation about Elon,
02:27:27.720 | it feels like individuals have power to unite us,
02:27:31.080 | to be that force of unity.
02:27:33.400 | So you mentioned the media.
02:27:35.920 | I think you're one of the great podcasters in history.
02:27:40.840 | Joe Rogan is a, like a long form, whatever.
02:27:44.000 | It's not podcasting, it's actually whatever the--
02:27:47.640 | - Very infrequent is what it is, no matter what it is.
02:27:50.920 | - But the basic process of it is you go deep
02:27:53.600 | and you stay deep,
02:27:54.800 | and the listener stays with you for a long time.
02:27:57.680 | So I'm just looking at the numbers,
02:28:01.000 | like we're almost three hours in,
02:28:05.340 | and from previous episodes,
02:28:08.240 | I can tell you that about 300,000 people
02:28:11.320 | are still listening to the sound of our voice three hours in.
02:28:15.180 | So usually it's 300,000 to 500,000 people who listen,
02:28:18.520 | and they tune out. - Congratulations,
02:28:19.680 | by the way, that's wonderful.
02:28:20.840 | - And Joe Rogan is, what, like 10 times that.
02:28:23.760 | And so he has power to unite.
02:28:28.760 | You have power to unite.
02:28:31.640 | There's a few people with voices
02:28:34.160 | that it feels like they have power to unite.
02:28:36.420 | Even if you, quote unquote, endorse a candidate and so on,
02:28:41.080 | there's still, it feels to me that speaking of,
02:28:46.080 | I don't wanna keep saying love,
02:28:49.460 | but it's love and maybe unity more practically speaking,
02:28:53.800 | that like sanity,
02:28:56.600 | that like respect for those you don't agree with
02:29:00.880 | or don't understand.
02:29:04.080 | So empathy, just a few voices of those
02:29:08.560 | can help us avoid the, really importantly,
02:29:12.240 | not avoid the singular events, like you said,
02:29:15.140 | of somebody starting a fire and so on,
02:29:17.360 | but avoid the escalation of it,
02:29:21.200 | the preparedness of the populace to escalate those events,
02:29:26.200 | to turn a singular event in a single riot or a shooting
02:29:32.800 | or even something much more dramatic than that,
02:29:35.760 | to turn that into something that creates ripples that grow
02:29:40.760 | as opposed to ripples that fade away.
02:29:43.240 | And so I would like to put responsibility
02:29:46.400 | on somebody like you and on me in some small way.
02:29:51.080 | And Joe, being cognizant of the fact
02:29:55.760 | that a lot of very destructive things
02:29:58.560 | might happen in November.
02:30:01.280 | And a few voices can save us is the feeling I have.
02:30:05.520 | Not by saying who you should vote for
02:30:07.640 | or any of that kind of stuff,
02:30:09.320 | but really by being the voice of calm
02:30:13.760 | that like calms the seas from,
02:30:18.760 | or whatever the analogy is from boiling up.
02:30:22.360 | Because I truly am worried about,
02:30:24.840 | this is the first time this year when I,
02:30:30.400 | I sometimes, I somehow have felt
02:30:33.040 | that the American project will go on forever.
02:30:36.040 | When I came to this country, I just believed,
02:30:39.600 | and I still think I'm young, but like,
02:30:42.400 | I have a dream of creating a company
02:30:45.120 | that will do a lot of good for the world.
02:30:47.480 | And I thought that America is the beacon of hope
02:30:51.320 | for the world and the ideas of freedom,
02:30:54.340 | but also the idea of empowering companies
02:30:56.840 | that can do some good for the world.
02:30:58.880 | And I'm just worried about this America
02:31:02.000 | that filled me, a kid that came from,
02:31:05.640 | our family came from nothing,
02:31:08.200 | and from Russia as it was, Soviet Union as it was,
02:31:11.880 | to be able to do anything in this new country.
02:31:15.400 | I'm just worried about it.
02:31:16.600 | And it feels like a few people
02:31:18.740 | can still keep this project going.
02:31:21.100 | Like people like Elon, people like Joe.
02:31:26.900 | - Is there, do you have a bit of that hope?
02:31:31.900 | - I'm watching this experiment with social media right now.
02:31:38.020 | And I don't even mean social media,
02:31:38.900 | really expand that out to, I mean,
02:31:41.340 | I feel like we're all guinea pigs right now watching,
02:31:43.820 | you know, I have two kids and just watching,
02:31:45.900 | and there's a three year space between the two of them.
02:31:48.500 | One's 18, the other's 15.
02:31:50.620 | And just, you know, when I was a kid,
02:31:52.940 | people who were 18 and 15 would not be that different,
02:31:56.540 | just three years difference, more maturity.
02:31:58.660 | But their life experiences,
02:32:00.120 | you would easily classify those two people
02:32:02.240 | as being in the same generation.
02:32:04.760 | Now, because of the speed of technological change,
02:32:08.280 | there is a vast difference between my 18 year old
02:32:11.140 | and my 15 year old, and not in the maturity question,
02:32:13.500 | just in what apps they use, how they relate to each other,
02:32:17.240 | how they deal with their peers, their social skills,
02:32:20.100 | all those kinds of things where you turn around and go,
02:32:22.460 | this is uncharted territory, we've never been here,
02:32:25.180 | so it's gonna be interesting to see what effect
02:32:26.580 | that has on society.
02:32:27.420 | Now, as that relates to your question,
02:32:29.900 | the most upsetting part about all that
02:32:33.580 | is reading how people treat each other online.
02:32:36.100 | And you know, there's lots of theories about this.
02:32:37.580 | The fact that some of it is just for trolling laughs,
02:32:40.180 | that some of it is just people are not interacting
02:32:42.380 | face-to-face, so they feel free to treat each other
02:32:44.620 | that way.
02:32:45.460 | And I, of course, I'm trying to figure out
02:32:49.620 | whether this is how we have always been as people, right?
02:32:54.620 | We've always been this way, but we've never had the means
02:32:57.180 | to post our feelings publicly about it,
02:32:59.840 | or if the environment and the social media
02:33:02.700 | and everything else has provided a change
02:33:05.820 | and changed us into something else.
02:33:08.100 | Either way, when one reads how we treat one another
02:33:13.740 | and the horrible things we say about one another online,
02:33:17.660 | which seems like it shouldn't be that big of a deal,
02:33:20.180 | they're just words, but they have a cumulative effect.
02:33:23.080 | I mean, I was reading about Meghan Markle,
02:33:27.380 | who I don't know a lot about 'cause it's too much
02:33:29.380 | of the pop side of culture for me to pay attention to,
02:33:31.300 | but I read a story the other day where she was talking
02:33:33.180 | about the abuse she took online
02:33:34.940 | and how incredibly overwhelming it was
02:33:37.740 | and how many people were doing it.
02:33:40.140 | And you think to yourself, okay, this is something
02:33:43.040 | that people who were in positions
02:33:44.860 | of what you were discussing earlier never had to deal with.
02:33:48.420 | Let me ask you something, and boy,
02:33:49.940 | this is the ultimate Doomcaster thing of all time to say.
02:33:53.860 | When you think of historical figures
02:33:56.460 | that push things like love and peace
02:34:01.260 | and creating bridges between enemies,
02:34:05.700 | when you think of what happened to those people,
02:34:09.180 | first of all, they're very dangerous.
02:34:10.820 | Every society in the world has a better time,
02:34:13.000 | easier time dealing with violence and things like that
02:34:15.540 | than they do nonviolence.
02:34:16.940 | Nonviolence is really difficult for governments
02:34:19.380 | to deal with, for example.
02:34:20.940 | What happens to Gandhi and Jesus and Martin Luther King?
02:34:25.260 | And you think about all those people, right?
02:34:27.660 | It's ironic, isn't it,
02:34:30.380 | that these people who push for peaceful solutions
02:34:32.660 | are so often killed, but it's because they're effective.
02:34:36.900 | And when they're killed, the effectiveness is diminished.
02:34:40.220 | Why are they killed?
02:34:41.300 | Because they're effective.
02:34:42.540 | And the only way to stop them is to eliminate them
02:34:45.120 | 'cause they're charismatic leaders
02:34:47.880 | who don't come around every day.
02:34:49.560 | And if you eliminate them from the scene,
02:34:51.800 | the odds are you're not gonna get another one for a while.
02:34:54.720 | I guess what I'm saying is the very things
02:34:56.320 | you're talking about,
02:34:57.160 | which would have the effect you think it would, right?
02:34:58.880 | They would destabilize systems
02:35:01.120 | in a way that most of us would consider positive.
02:35:03.840 | But those systems have a way of protecting themselves, right?
02:35:07.600 | And so I feel like history shows,
02:35:10.960 | see, history is pretty pessimistic, I think, by and large,
02:35:14.100 | if only because we can find so many examples
02:35:16.420 | that just sound pessimistic.
02:35:17.520 | But I feel like people who are dangerous
02:35:19.760 | to the way things are tend to be removed.
02:35:24.300 | - Yes, but there's two things to say.
02:35:26.860 | I feel like you're right that history,
02:35:29.220 | I feel like the ripples that love leaves in history
02:35:35.340 | are less obvious to detect,
02:35:37.060 | but are actually more transformational.
02:35:39.580 | - Well, one could make a case about,
02:35:41.640 | I mean, if you wanna talk about the long-term value
02:35:44.040 | of a Jesus, a Gandhi.
02:35:45.560 | But yes, those people's ripples
02:35:46.840 | are still affecting people today, I agree with that.
02:35:49.360 | - You feel those ripples through the general improvement
02:35:52.480 | of the quality of life that we see
02:35:55.280 | throughout the generations.
02:35:57.200 | You feel the ripples through the growth.
02:35:58.040 | - Yeah, I'll go along with you on that.
02:35:59.920 | - But even if that's not true,
02:36:02.420 | I tend to believe that,
02:36:06.800 | and by the way, the company that I'm working on
02:36:10.420 | is exactly attacking this,
02:36:13.520 | which is a competitor to Twitter.
02:36:15.740 | I think I can build a better Twitter as a first step.
02:36:17.860 | There's a long story in there.
02:36:18.980 | - I think a three-year-old child could build a better,
02:36:21.540 | and this is not to denigrate you,
02:36:23.020 | I'm sure yours would be better than a three-year-old's,
02:36:24.620 | but Twitter is so, and listen, Facebook too,
02:36:27.220 | they're really awful platforms for intellectual discussion
02:36:30.620 | and meaningful discussion, and I'm on it.
02:36:33.060 | So let me just say I'm part of the problem.
02:36:34.620 | - We're new to this.
02:36:35.460 | - So it wasn't obvious at the time how to do it.
02:36:37.480 | Now a three-year-old can do it.
02:36:41.080 | I tend to believe that we live in a time
02:36:44.600 | where the tools that people that are interested
02:36:47.880 | in providing love, the weapons of love
02:36:52.160 | are much more powerful.
02:36:53.840 | So the one nice thing about technology
02:36:58.380 | is it allows anyone to build a company
02:37:02.060 | that's more powerful than any government.
02:37:04.620 | So that could be very destructive,
02:37:06.640 | but it could be also very positive.
02:37:09.680 | And that's, I tend to believe that somebody like Elon
02:37:12.480 | that wants to do good for the world,
02:37:14.160 | somebody like me and many like me
02:37:16.600 | could have more power than any one government.
02:37:19.380 | And by power, I mean the power to effect change,
02:37:24.480 | which is different from Gandhi.
02:37:25.320 | - What do you do with government,
02:37:26.160 | and I don't mean to interrupt you,
02:37:26.980 | but I'll forget my train of thought, I'm getting old.
02:37:28.760 | But I mean, how do you deal with the fact
02:37:30.360 | that already governments who are afraid of this
02:37:33.720 | are walling off their own internet systems
02:37:36.280 | as a way to create firewalls simply to prevent you
02:37:40.360 | from doing what you're talking about?
02:37:42.320 | In other words, there's an old line
02:37:43.940 | that if voting really changed anything,
02:37:45.360 | they'd never allow it.
02:37:47.020 | If love through a modern day successor to Twitter
02:37:51.920 | would really do what you want it to do,
02:37:53.820 | and this would destabilize governments,
02:37:56.540 | do you think that governments would take countermeasures
02:38:00.580 | to squash that love before it got too dangerous?
02:38:03.640 | - There's several answers.
02:38:04.840 | One, first of all, I don't, actually,
02:38:06.920 | to push back on something you said earlier,
02:38:08.800 | I don't think love is as much of an enemy of the state
02:38:12.760 | as one would think.
02:38:14.960 | - Different states have different views.
02:38:16.960 | (Lex laughs)
02:38:17.780 | - I think the states want power,
02:38:22.560 | and I don't always think that love is in tension with power.
02:38:27.500 | Like, I think, and I think it's a good thing,
02:38:33.280 | it's not just about love,
02:38:34.680 | it's about rationality, it's reason, it's empathy,
02:38:37.960 | all of those things.
02:38:39.080 | I don't necessarily think they always have to be,
02:38:42.760 | by definition, in conflict with each other.
02:38:45.680 | So that's one sense is I feel like,
02:38:49.480 | basically, you can Trojan horse love in,
02:38:52.160 | but you have to be good at it.
02:38:56.120 | This is the thing,
02:38:57.080 | is you have to be conscious of the way these states think.
02:39:01.520 | So the fact that China banned certain services and so on,
02:39:05.760 | that means the companies weren't eloquent,
02:39:09.360 | whoever the companies are,
02:39:10.920 | weren't actually good at infiltrating.
02:39:15.040 | Like, I think, isn't that a song,
02:39:18.120 | like "Love is a Battlefield"?
02:39:19.800 | I think it's all a--
02:39:21.040 | - Pat Benatar.
02:39:22.040 | - Yeah, it's all a game,
02:39:23.880 | and you have to be good at the game.
02:39:25.720 | And just like Elon, we said,
02:39:28.160 | with Tesla and saving the environment,
02:39:32.400 | I mean, that's not just by getting on a stage
02:39:35.280 | and saying it's important to save the environment,
02:39:37.720 | it's by building a product that people can't help but love,
02:39:42.720 | and then convincing Hollywood stars to love it.
02:39:46.120 | Like, there's a game to be played.
02:39:48.760 | - Okay, so let me build on that,
02:39:50.600 | because I think there's a way to see this.
02:39:52.440 | I think you're right.
02:39:53.280 | And so it has to do with a story about the 1960s.
02:39:57.040 | In the vast scheme of things,
02:39:58.200 | the 1960s looks like a revival of neo-romantic ideas, right?
02:40:03.200 | I had a buddy of mine several years,
02:40:05.680 | well, two decades older than I was,
02:40:07.520 | who was in the '60s, went to the protests,
02:40:09.880 | did all those kind of things.
02:40:10.920 | And we were talking about it, and I was romanticizing it.
02:40:14.000 | And he said, "Don't romanticize it."
02:40:15.320 | He goes, "Let me tell you,
02:40:16.160 | "most of the people that went to those protests
02:40:17.960 | "and did all those things,
02:40:19.480 | "all they were there for was to meet girls and have a good time."
02:40:21.960 | And it wasn't so,
02:40:23.800 | but it became in vogue to have all,
02:40:28.800 | in other words, let's talk about your empathy and love.
02:40:32.120 | You're never gonna, in my opinion,
02:40:33.920 | grab that great mass of people
02:40:35.720 | that are only in it for their interest in whatever,
02:40:38.320 | but if meeting girls for a young teenage guy
02:40:42.520 | requires you to feign empathy,
02:40:45.920 | requires you to read deeper subjects
02:40:49.320 | because that's what people are into,
02:40:52.040 | you can almost, as a silly way to be trendy,
02:40:55.560 | you could maybe make empathy trendy, love trendy,
02:40:59.480 | and make solutions that are the opposite of that
02:41:02.360 | the kind of things that people inherently
02:41:05.640 | will not put up with.
02:41:07.160 | In other words, the possibility exists
02:41:09.560 | to change the zeitgeist and reorient it in a way
02:41:13.820 | that even if most of the people aren't serious about it,
02:41:17.160 | the results are the same.
02:41:19.040 | Does that make sense?
02:41:19.880 | - Absolutely. - Okay.
02:41:21.080 | Okay, so we've found a meeting of the minds.
02:41:22.720 | - Yeah, exactly.
02:41:24.240 | Creating incentives that encourage the best
02:41:29.240 | and the most beautiful aspects of human nature.
02:41:32.000 | - Even against our will.
02:41:33.800 | - It all boils down to meeting girls and boys.
02:41:36.360 | - Once again, you're getting to the bottom
02:41:39.200 | of the evolutionary motivations
02:41:40.960 | and you're always on safe ground when you do that.
02:41:42.960 | - Yeah, it's a little difficult for me.
02:41:45.340 | And I'm sure it's actually difficult for you
02:41:48.640 | to listen to me complimenting you, but...
02:41:51.840 | - I'm not good with that.
02:41:55.280 | - It's difficult for both of us, okay?
02:41:57.200 | But you and I, as I mentioned to you,
02:42:01.160 | I think off mic, we've been friends for a long time,
02:42:03.400 | it's just been one way.
02:42:04.560 | - It's two way now.
02:42:06.800 | - It's two way now.
02:42:08.120 | So that's the beauty of podcasting.
02:42:10.800 | Now, I've just been fortunate enough with this particular podcast
02:42:14.240 | that I see it in people's eyes when they meet me,
02:42:17.320 | that they've been friends with me for a few years now.
02:42:21.000 | And we become fast friends actually,
02:42:23.800 | after we start talking.
02:42:25.160 | But it's one way in that first moment.
02:42:28.600 | There's something about Hardcore History especially.
02:42:33.600 | I do some crazy challenges and running and stuff.
02:42:37.960 | I remember in particular, probably don't have time,
02:42:40.820 | one of my favorite episodes, the Painfotainment one.
02:42:44.360 | - Some people hate that episode.
02:42:46.200 | - 'Cause it's too real.
02:42:47.320 | - Yeah, they can't listen to it.
02:42:49.160 | It's my darkest one.
02:42:50.160 | We wanted to set a baseline, that's the baseline.
02:42:53.120 | - But I remember listening to that
02:42:55.040 | when I ran 22 miles. For me, that was a long distance.
02:42:58.360 | - Holy cow, that's painfotainment right there.
02:43:00.800 | - Yeah, and it just pulls you in.
02:43:03.640 | There's something so powerful
02:43:06.680 | about this particular creation
02:43:09.120 | that's bigger than you actually, that you've created.
02:43:12.120 | It's kind of interesting.
02:43:13.200 | - I think anything that is successful like that,
02:43:15.020 | like Elon's stuff too, it becomes bigger than you
02:43:17.160 | and that's what you're hoping for, right?
02:43:18.920 | - Yeah, absolutely.
02:43:19.760 | - Didn't mean to interrupt you, I apologize.
02:43:20.960 | - I guess a question I have, if you look in the mirror,
02:43:25.960 | but you also look at me,
02:43:27.500 | what advice would you give to yourself and to me
02:43:34.240 | and to other podcasters, maybe to Joe Rogan
02:43:37.520 | about this journey that we're on?
02:43:40.060 | I feel like it's something special.
02:43:41.900 | I'm not sure exactly what's happening,
02:43:44.080 | but it feels like podcasting is special.
02:43:46.340 | What advice, and I'm relatively new to it,
02:43:51.580 | what advice do you have for people
02:43:55.360 | that are carrying this flame and traveling this journey?
02:43:58.540 | - Well, I'm often asked for advice by new podcasters,
02:44:03.680 | people just starting out.
02:44:05.000 | And so I have sort of a tried and true list
02:44:08.960 | of do's and don'ts, but I don't have advice
02:44:14.240 | or suggestions for you or for Joe.
02:44:18.120 | Joe doesn't need anything from me,
02:44:19.760 | Joe's figured it out, right?
02:44:21.280 | - He hasn't yet, he's still a confused kid,
02:44:23.720 | curious about the world.
02:44:25.240 | - But that's the genius of it, that's what makes it work.
02:44:28.560 | That's what Joe's brand is, right?
02:44:31.720 | I guess what I'm saying is,
02:44:32.880 | by the time you reach the stage that you're at
02:44:35.300 | or Joe's at, you've figured this out.
02:44:39.480 | The people that sometimes need help
02:44:40.920 | are brand new people trying to figure out
02:44:42.440 | what do I do with my first show
02:44:43.600 | and how do I talk to them?
02:44:44.880 | And I have standard answers for that.
02:44:47.040 | But you found your niche.
02:44:48.480 | I mean, you don't need me to tell you what to do.
02:44:51.220 | As a matter of fact, I might ask you questions
02:44:53.000 | about how you do what you do, right?
02:44:55.360 | - Well, I guess there's specific things
02:44:58.600 | like we were talking offline about monetization,
02:45:01.720 | that's a fascinating one.
02:45:03.280 | - Very difficult as an independent, yeah.
02:45:06.000 | - And one of the things that Joe is facing,
02:45:09.880 | I don't know if you're paying attention,
02:45:11.700 | but he joined Spotify with a $100 million deal
02:45:16.000 | before going exclusive on their platform.
02:45:18.260 | The idea of exclusivity,
02:45:19.660 | that one, I don't give a damn about money personally,
02:45:22.380 | but I'm single and I like living in a shitty place.
02:45:26.220 | (laughing)
02:45:28.580 | I enjoy, so I guess it makes it easy.
02:45:30.660 | - You get the freedom, right?
02:45:31.780 | To not care, yeah.
02:45:32.620 | - Freedom, freedom from material things.
02:45:34.380 | - Not saving for anybody's college.
02:45:35.940 | - Exactly.
02:45:37.140 | Okay, so on that point, but I also,
02:45:40.460 | okay, maybe it's romanticization,
02:45:42.020 | but I feel like podcasting is pirate radio.
02:45:46.160 | And when I first heard about Spotify,
02:45:49.780 | partnering up with Joe, I was like, fuck the man.
02:45:53.420 | I said, I even, I drafted a few tweets and so on,
02:45:57.340 | just like attacking Spotify, then I calmed myself down,
02:46:01.020 | that you can't lock up this special thing we have.
02:46:04.720 | - But then I realized that maybe these are vehicles
02:46:09.900 | for just reaching more people
02:46:11.980 | and actually respecting podcasters more and so on.
02:46:15.180 | So that's what I mean by it's unclear what the journey is,
02:46:18.780 | because you also serve as beacon for,
02:46:22.300 | now there's like millions, 1 million plus podcasters.
02:46:26.460 | I wonder what the journey is.
02:46:31.620 | Do you have a sense,
02:46:32.860 | are you romantic in the same kind of way
02:46:37.420 | in feeling that, 'cause you have a roots in radio too,
02:46:41.100 | do you feel that podcasting is pirate radio,
02:46:43.960 | or is the Spotify thing one possible avenue?
02:46:48.060 | Are you nervous about Joe as a fan, as a friend of Joe,
02:46:51.160 | or is this a good thing for us?
02:46:55.980 | - So my history of how I got involved
02:46:58.420 | in podcasting is interesting.
02:47:00.300 | I was in radio, and then I started a company
02:47:04.700 | back in the era where the dot-com boom was happening
02:47:08.140 | and everybody was being bought up,
02:47:09.320 | and it just seemed like a great idea, right?
02:47:11.420 | I did it with six other people,
02:47:14.820 | and the whole goal of the company was,
02:47:17.620 | we had to invent the term, I'm sure everybody,
02:47:20.140 | there's other places that invented it at the same time,
02:47:22.220 | but what we were pitching to investors
02:47:25.140 | was something called amateur content.
02:47:26.940 | So this is before YouTube, before podcasting,
02:47:29.240 | before all this stuff.
02:47:31.020 | And my job was to be the evangelist.
02:47:34.540 | And I would go to these people and sing the praises
02:47:38.060 | of all the ways that amateur content was gonna be great.
02:47:42.460 | And I never got a bite.
02:47:45.380 | And they all told me the same thing.
02:47:46.900 | This isn't gonna take off 'cause anybody who's good
02:47:49.140 | is already gonna be making money at this.
02:47:50.660 | And I kept saying, forget that,
02:47:53.140 | we're talking about scale here.
02:47:55.300 | If you have millions of pieces of content
02:47:57.600 | being made every week, a small percentage
02:48:00.100 | is gonna be good no matter what, right?
02:48:01.580 | 16-year-olds will know what other 16-year-olds like.
02:48:04.000 | I kept pushing this and nobody bit.
02:48:06.140 | But the podcast grew out of that
02:48:07.580 | because if you're talking about amateur content in 1999,
02:48:12.580 | well, then you're already, you're ahead of the game
02:48:16.960 | in terms of not seeing where it's gonna go financially,
02:48:20.420 | but seeing where it's going to go technologically.
02:48:23.060 | And so when we started the podcast in 2005,
02:48:25.540 | and it was the political one, not hardcore history,
02:48:27.820 | which was an outgrowth of the old radio show,
02:48:30.980 | we didn't have any financial ideas.
02:48:34.600 | We were simply trying to get our handle on the technology
02:48:37.520 | and how you distribute it to people and all that.
02:48:39.020 | And it was years later that we tried to figure out,
02:48:42.000 | okay, how can we get enough money to just support us
02:48:43.960 | while we're doing this?
02:48:44.800 | And the cheap and the easy way was just to ask listeners
02:48:47.800 | to donate like a PBS kind of model.
02:48:49.680 | And that was the original model.
02:48:51.540 | So then once we started down that,
02:48:55.200 | we figured out other models,
02:48:56.400 | and there's the advertising thing,
02:48:57.560 | and we sell the old shows.
02:48:58.800 | And so all these became ways for us to support ourselves.
02:49:01.820 | But as podcasting matured,
02:49:06.920 | and as more operating systems developed,
02:49:09.840 | and phones were developed,
02:49:12.040 | and all these kinds of things,
02:49:13.520 | every one of those developments,
02:49:15.320 | which made it easier for people to get the podcast,
02:49:18.600 | actually made it more complex to make money off of them.
02:49:22.060 | So while our audience was building,
02:49:24.160 | the amount of time and effort we had to put
02:49:26.020 | into the monetization side began to skyrocket.
02:49:29.940 | So to get back to your Spotify question,
02:49:31.820 | to use just one example,
02:49:32.820 | there's a lot of people who are doing similar things.
02:49:35.480 | In this day and age, we used to just sell MP3 files,
02:49:39.660 | and all you had to have was an MP3 player,
02:49:41.060 | it was cheap and dirty.
02:49:42.660 | Now, every time there's an OS upgrade,
02:49:45.140 | something breaks for us.
02:49:46.740 | So we're having, I mean, my choices are at this point
02:49:49.340 | to start hiring staff, more staff,
02:49:51.580 | people to, and then be a human resources manager.
02:49:54.260 | I mean, the pirate radio side of this
02:49:55.980 | was exactly that,
02:49:57.300 | 'cause you didn't need anybody but you,
02:49:59.660 | or you and another, I mean,
02:50:00.500 | you could just do this lean and mean,
02:50:02.140 | and it's becoming hard to do it lean and mean now.
02:50:05.280 | So if somebody like a Spotify comes in and says,
02:50:07.380 | "Hey, we'll handle that stuff for you,"
02:50:10.780 | in the past, I would just say,
02:50:12.560 | "F off, we don't need you, I don't mind."
02:50:15.300 | And I definitely am not making what we could make on this,
02:50:18.300 | but what we would have to do to make that is onerous to me.
02:50:22.400 | But it's becoming onerous to me day to day anyway.
02:50:25.660 | And so if somebody were to come in and say,
02:50:28.280 | "Hey, we'll pick that up for you,
02:50:30.300 | we will not interfere with your content at all,
02:50:32.580 | we won't," and in my case, you can't say,
02:50:34.160 | "We need a show a month," 'cause that ain't happening,
02:50:36.200 | right?
02:50:37.040 | So I mean, everybody's design is different, right?
02:50:40.500 | So it doesn't, you know, there's not one size fits all,
02:50:43.260 | but I guess as a long time pirate podcaster,
02:50:47.020 | there are, you know, we've been looking to partner
02:50:49.700 | with people, but nobody's right for us to partner with.
02:50:51.900 | I mean, so I'm always looking for ways
02:50:55.900 | to take that side of it off my plate,
02:50:58.260 | 'cause I'm not interested in that side.
02:50:59.540 | All I wanna do is the shows and the, you know,
02:51:02.740 | it's really at this point,
02:51:04.500 | you shouldn't call yourself an artist because some,
02:51:06.900 | you know, that's something that's decided by other people.
02:51:08.780 | But I mean, we're trying to do art
02:51:11.700 | and there's something very satisfying in that.
02:51:14.980 | But the part that I can't stand is the increasing amount
02:51:18.780 | of time the monetization question takes upon us.
02:51:21.780 | And so there's a case to be made, I guess is what I'm saying,
02:51:26.460 | that if a partnership with some outside firm
02:51:29.700 | enhances your ability to do the art
02:51:32.500 | without disenhancing your ability to do the art,
02:51:36.100 | it's, the word I'm looking for here is it's enticing.
02:51:41.100 | - Yes.
02:51:42.260 | - I don't like big companies.
02:51:44.500 | So I'm afraid of whatever strings might come with that.
02:51:49.500 | And if I'm Joe Rogan and I'm talking about subjects
02:51:52.220 | that can make public companies, you know, a little nervous,
02:51:55.860 | I would certainly be careful.
02:51:57.300 | But at the same time, people who are not in this game
02:52:00.540 | don't understand the problems that literally,
02:52:04.580 | I mean, just all the operating systems, all the pod catchers,
02:52:07.580 | every time some new pod catcher comes up,
02:52:09.420 | makes it easier to get the podcast,
02:52:10.980 | that's something we have to account for on the backend.
02:52:14.220 | And I'm not exactly the technological wizard of all time.
02:52:17.740 | So I think it is maybe, maybe the short answer is,
02:52:21.900 | is that as the medium develops,
02:52:24.300 | it's becoming something that you have to consider,
02:52:26.860 | not because you wanna sell out,
02:52:28.900 | but because you wanna keep going.
02:52:30.420 | And it's becoming harder and harder
02:52:32.300 | to be pirate-like in this environment.
02:52:35.420 | - The thing that convinced me, especially inside Spotify,
02:52:39.060 | is that they understand.
02:52:42.780 | So if you walk into this whole thing with some skepticism,
02:52:46.020 | as you're saying, of big companies, then it works.
02:52:50.060 | Because Spotify understands the magic that makes podcasting,
02:52:54.380 | or they appear to, in part.
02:52:56.580 | At least they understand enough to respect Joe Rogan.
02:53:00.300 | And despite what, I don't know if you,
02:53:02.980 | so there's the internet
02:53:03.940 | and there's people with opinions on the internet.
02:53:05.860 | - Really? - Yes.
02:53:06.700 | - I've got to hear about that.
02:53:07.940 | - And they have opinions about Joe and Spotify.
02:53:11.940 | But the reality is, there's two things.
02:53:14.820 | In private conversation with Joe,
02:53:16.780 | and in general, there's two important things.
02:53:19.540 | One, Spotify literally doesn't tell Joe anything.
02:53:22.820 | Like all the people that think that Spotify's somehow
02:53:26.260 | pushing Joe in this direction or that.
02:53:28.500 | - It's contractual.
02:53:29.340 | Didn't he insist upon that?
02:53:30.580 | - It's in the contract.
02:53:31.420 | But also, companies have a way of,
02:53:34.020 | even with the contract. - They sure do.
02:53:36.020 | - To be marketing people.
02:53:37.780 | Hey, I know we're not forcing you to do stuff.
02:53:39.180 | - Yeah, yeah, yeah.
02:53:40.020 | I hate that.
02:53:41.060 | Yeah, I'm with you.
02:53:42.820 | - You and Joe are the same.
02:53:44.380 | And Spotify's smart enough
02:53:45.940 | not to send a single email of that kind.
02:53:48.300 | - That's really smart.
02:53:49.580 | - And they leave them be.
02:53:51.300 | There is meetings inside Spotify that people--
02:53:53.940 | - I've read about those, yeah.
02:53:54.780 | - People complain.
02:53:55.820 | But those meetings never reach Joe.
02:53:57.580 | That's like company stuff.
02:54:00.420 | And the idea that Spotify is different than pirate radio,
02:54:04.740 | the difficult thing about podcasting
02:54:07.300 | is nobody gives a damn about your podcast.
02:54:10.140 | You're alone in this.
02:54:11.780 | I mean, there's fans and stuff, but nobody--
02:54:14.020 | - Nobody's looking out for you.
02:54:15.060 | - Yeah. - Yeah.
02:54:15.900 | - And the nice thing about Spotify
02:54:17.380 | is they want Joe's podcast to succeed even more.
02:54:21.780 | What Joe talked about is that's the difference
02:54:24.820 | between YouTube and Spotify.
02:54:27.100 | Spotify wants to be the Netflix of podcasting.
02:54:30.020 | And what Netflix does is they don't wanna control you
02:54:35.020 | in any way, but they want to create a platform
02:54:39.220 | where you can flourish even more.
02:54:41.020 | - 'Cause your interests are aligned.
02:54:42.220 | - Interests are aligned.
02:54:43.180 | - So let me bring up something that,
02:54:45.580 | let's make a distinction,
02:54:46.660 | because not all companies who do this are the same.
02:54:50.580 | And you brought up YouTube and Spotify,
02:54:52.340 | but to me, YouTube is at least more like Spotify
02:54:55.020 | than some of these smaller, the term is walled garden,
02:54:58.340 | right, you've heard the term walled garden?
02:54:59.860 | Okay, so I've been around podcasting so long now
02:55:04.100 | that I've seen rounds of consolidation over the years,
02:55:07.220 | and they come in waves.
02:55:08.460 | And all of a sudden, so you'll get,
02:55:10.620 | and I'm not gonna mention any names,
02:55:12.140 | but up until recently, the consolidation was happening
02:55:15.380 | with relatively small firms compared to people like Spotify.
02:55:18.900 | And the problem was is that by deciding
02:55:21.180 | to consolidate your materials in a walled garden,
02:55:25.020 | you are walling yourself off from audience, right?
02:55:28.380 | So your choice is I'm gonna accept this amount of money
02:55:30.420 | from this company, but the loss is going to be
02:55:32.620 | a large chunk of my audience.
02:55:33.980 | And that's a catch-22 because your negotiating power
02:55:36.860 | with that company is based on your audience size.
02:55:39.260 | So if signing up with them diminishes your audience size,
02:55:41.340 | you lose negotiating power.
02:55:43.420 | But when you get to the level of the Spotify
02:55:46.820 | to just pick them out, there's other players,
02:55:49.460 | but you brought up Spotify specifically,
02:55:51.460 | these are people who can potentially,
02:55:54.300 | potentially enhance your audience over time.
02:55:57.860 | And so the risk to you is lower
02:56:00.740 | because if you decide in a year or two,
02:56:03.380 | whatever the licensing agreements term is,
02:56:05.860 | that you're done with them and you wanna leave,
02:56:07.820 | instead of how you would have been
02:56:09.580 | with some of these smaller walled gardens
02:56:11.540 | where you're walking away with a fraction
02:56:13.700 | of the audience you walked in with,
02:56:15.500 | you have the potential to walk out
02:56:17.620 | with whatever you got in the original deal,
02:56:19.700 | plus a larger audience 'cause their algorithms
02:56:22.340 | and everything are designed to push people
02:56:25.180 | to your content if they think you'd like it.
02:56:27.460 | So it takes away some of the downside risk,
02:56:31.260 | which alleviates, and if you can write an agreement
02:56:34.420 | like Joe Rogan, I mean, where you've protected
02:56:36.460 | your freedom to put the content out the way you want.
02:56:39.300 | So, and if some of the downside risk is mitigated,
02:56:42.260 | and if you eliminate the problem of trying to monetize
02:56:45.820 | and stay up with the latest tech, then it might be worth it.
02:56:49.220 | You know, I'm scared of things like that,
02:56:51.020 | but at the same time, I'm trying to not be an idiot
02:56:53.980 | about it, and I can be an idiot about it.
02:56:56.740 | And when you've been doing it as independently
02:56:59.300 | for as long as I have, the inertia of that
02:57:02.780 | has a force all its own, but I'm inhibited enough
02:57:07.780 | in what I'm trying to do on this other end
02:57:10.780 | that it's opened me at least to listening to people.
02:57:13.620 | But listen, at the same time, I love my audience,
02:57:18.580 | and it sounds like a cliche, but they're literally
02:57:21.820 | the reason I'm here, so I wanna make sure
02:57:24.940 | that whatever I do, if I can, is in keeping
02:57:27.860 | with a relationship that I've developed
02:57:30.460 | with these people over 15 years.
02:57:32.860 | But like you said, no matter what you do, you are,
02:57:37.100 | because see, here's the thing, if you don't sign up
02:57:39.020 | with one of those companies to make it easier
02:57:40.620 | for them to get your stuff on this hand,
02:57:42.620 | they might yell at you for how difficult it is,
02:57:45.020 | 'cause the new operating system just updated,
02:57:48.020 | and you just, I can't get your, so either way,
02:57:50.340 | you're opening yourself up to ridicule at this point.
02:57:52.660 | All of that makes it easier to go, well,
02:57:55.540 | if the right deal came along, and they weren't screwing me,
02:57:57.580 | and they weren't screwing my audience, and blah, blah,
02:58:00.100 | you know, I mean, again, in this business,
02:58:01.820 | when you're talking about cutting-edge technology
02:58:03.740 | that is ever-changing, and as you said,
02:58:05.700 | a million podcasts and growing, I think you have
02:58:08.700 | to try to maintain flexibility, and especially
02:58:11.260 | if they can mitigate the downside risk,
02:58:14.040 | I think you have to, I think you'd be an idiot
02:58:17.100 | to not at least try to stay up on the current trends.
02:58:19.980 | And look, I'm watching Joe.
02:58:22.100 | I'm going, okay, let's see how it goes for Joe.
02:58:24.020 | You know, I mean, if he's like, ah, this is terrible,
02:58:26.700 | I'm getting out of this, you go, okay,
02:58:27.980 | those people are out of this.
02:58:29.620 | So Joe's put himself out as a guinea pig,
02:58:31.860 | and the rest of us guinea pigs appreciate it.
02:58:34.020 | - As a huge, as a fan of your show,
02:58:36.540 | and as a fan of Netflix, the people there,
02:58:40.380 | I think I can speak for like millions of people
02:58:44.000 | in hope that Hardcore History comes to Netflix,
02:58:46.940 | or if Spotify becomes the Netflix of podcasting,
02:58:49.500 | then to Spotify.
02:58:50.700 | There's something about them at their best,
02:58:54.880 | you said artists, so I can say it,
02:58:57.620 | they bring out the best in the artists.
02:58:59.740 | They remove some of the headache,
02:59:02.860 | and somehow like they put,
02:59:05.060 | at their best, Netflix, for example,
02:59:09.860 | is able to find and reinforce the beauty
02:59:13.620 | and the power in the creations that you make,
02:59:16.060 | even better than you.
02:59:17.820 | Like they don't interfere with the creations,
02:59:20.620 | but they somehow, it's a branding thing probably too.
02:59:24.180 | - Yeah, but interfering would be,
02:59:25.280 | that would be a no-go for me.
02:59:26.540 | - That's right, absolutely.
02:59:27.380 | - That can't happen.
02:59:28.620 | - But that's why Netflix is masterful.
02:59:30.700 | They seem to not interfere with the talent,
02:59:34.100 | as opposed to I could throw other people under the bus.
02:59:36.540 | - There's a lot of places under the bus
02:59:38.500 | that could be thrown, absolutely.
02:59:39.860 | - So I would love, I know there's probably people screaming
02:59:42.580 | yes right now, in terms of Hardcore History on Netflix,
02:59:45.580 | would be awesome.
02:59:46.420 | And I don't love asking this question,
02:59:51.180 | but it's probably the most popular question asked,
02:59:55.660 | that's unanswerable, so let me try to ask it
02:59:59.020 | in a way that you would actually answer it.
03:00:01.540 | Which is of course, you said you don't release shows
03:00:03.780 | very often, and the question is,
03:00:07.060 | or the requests and the questions is,
03:00:08.780 | well can you tell Dan to do one on the Civil War?
03:00:11.780 | Can you tell Dan to do one on Napoleon Bonaparte?
03:00:14.260 | Can you tell him to do one, every topic?
03:00:17.700 | You've spoken to this, actually your answer
03:00:19.340 | about the Civil War is quite interesting.
03:00:21.060 | (laughing)
03:00:21.900 | - I didn't know you knew what my answer
03:00:22.900 | about the Civil War was.
03:00:24.020 | - That you don't, as a military historian,
03:00:26.140 | you enjoy in particular when there is differences
03:00:29.420 | in the armies as opposed to--
03:00:30.260 | - Contrasts.
03:00:31.100 | - Contrasts, with the Civil War, which blew my mind
03:00:34.720 | when I heard you say, there's not an interesting,
03:00:38.900 | a deep, intricate contrast between the two opposing sides.
03:00:42.220 | - It's like the Roman Civil Wars,
03:00:43.140 | where it's legionary against legionary, yeah.
03:00:45.540 | - And you've also said that, the shows you work on
03:00:49.100 | are ones where you have some roots
03:00:51.340 | of fundamental understanding about that period.
03:00:53.980 | And so, when you work on a show,
03:00:57.380 | it's basically pulling at those strings further
03:01:00.340 | and refreshing your mind and learning--
03:01:02.940 | - You have definitely done the research.
03:01:04.700 | Wow, these are words out of my mouth.
03:01:06.260 | Yeah, you're right.
03:01:07.100 | - And so, but, is there something,
03:01:10.380 | like shower thoughts on Reddit?
03:01:13.820 | Is there some ideas that are lingering in your head
03:01:16.940 | about possible future episodes?
03:01:19.620 | Is there things that, whether you,
03:01:22.980 | not committing to anything,
03:01:24.620 | but whether you're gonna do it or not,
03:01:27.860 | is there something that's like, makes you think,
03:01:29.980 | hmm, that would be interesting
03:01:31.900 | to pull at that thread a little bit?
03:01:34.780 | - Oh yeah, we have things we keep
03:01:37.580 | in our back pocket for later.
03:01:38.980 | So, Blueprint for Armageddon,
03:01:41.180 | the first World War series we did,
03:01:42.620 | that was in my back pocket the whole time.
03:01:44.880 | And when the centennial of the war happened,
03:01:47.380 | it just seemed to be the likely time
03:01:49.040 | to bring out what was--
03:01:50.460 | - That was a hell of a series.
03:01:51.860 | That's probably one of my favorite series.
03:01:52.700 | - Kicked my rear end, man, I have to tell you.
03:01:55.260 | - Psychologically, you mean?
03:01:56.100 | - Well, just, when you get to these,
03:01:58.700 | I think, I'm guessing here,
03:02:00.100 | I think it's 26 hours, all pieces together.
03:02:02.860 | Think about, and we don't do scripts, it's improvised.
03:02:06.620 | So, think about what 20,
03:02:08.620 | I had somebody write on Twitter just yesterday
03:02:11.020 | saying, he said something like,
03:02:13.260 | I'm not seeing the dedication here.
03:02:14.820 | You're only getting 2.5 shows out a year.
03:02:16.780 | And I wanted to say, man, you have no idea what,
03:02:20.400 | the only people who understand, really,
03:02:22.520 | are other history podcasters.
03:02:24.400 | And even they don't generally do 26 hour,
03:02:27.480 | you know, that was a two year endeavor.
03:02:29.960 | As I said, the first show we ever did was like 15 minutes.
03:02:32.020 | I could crank out one of those a month.
03:02:33.880 | But when you're doing, I mean,
03:02:35.240 | the last show we did on the fall of the Roman Republic
03:02:37.600 | was five and a half hours.
03:02:39.080 | That's a book, right?
03:02:41.360 | And it was part six or something.
03:02:43.060 | So, I mean, you just do the math.
03:02:45.600 | - And it felt like you were, sorry to interrupt,
03:02:47.540 | on World War I,
03:02:49.780 | it felt like you were emotionally pulled into it.
03:02:53.820 | Like, it felt taxing.
03:02:55.300 | - I was gonna say, that's a good thing though,
03:02:56.500 | because that, you know, and I think we said during the show,
03:02:58.820 | that was the feeling that the people at the time had.
03:03:01.900 | And I think at one point we said,
03:03:03.540 | this is starting to seem gruesomely repetitive.
03:03:06.980 | Now you know how the people at the time felt.
03:03:10.200 | So, in other words, that had, sort of inadvertently,
03:03:13.540 | 'cause when you improvise a show,
03:03:14.660 | some of these things are inadvertent,
03:03:16.380 | but it had inadvertently created the right climate
03:03:20.180 | for having a sense of empathy with the storyline.
03:03:24.460 | And to me, those are the serendipitous moments
03:03:26.900 | that make this art and not some sort of
03:03:29.900 | paint by the numbers kind of endeavor, you know?
03:03:32.300 | And that's, to me, that wouldn't have happened
03:03:35.200 | had we scripted it out.
03:03:36.960 | - So it's mostly, you just bring the tools of knowledge
03:03:40.700 | to the table and then, in large part, improvise.
03:03:44.300 | Like the actual wording?
03:03:45.580 | - I always say we make it like they made things
03:03:47.280 | like Spinal Tap and some of those other things
03:03:49.380 | where the material, so I do have notes about things,
03:03:52.860 | like on page 427 of this book, you have this quote,
03:03:55.580 | so that I know, aha, I'm at the point
03:03:57.300 | where I can drop that in.
03:03:58.780 | And sometimes I'll write notes saying,
03:04:00.660 | here's where you left off yesterday, so I remember.
03:04:04.020 | But in the improvisation, you end up throwing a lot out.
03:04:07.460 | And so, but it allows us to go off on tangents.
03:04:10.580 | Like, we'll try things.
03:04:11.460 | Like, I'll sit there and go,
03:04:12.420 | mm, I wonder what this would sound like.
03:04:14.120 | And I'll spend two days going down that road
03:04:16.940 | and then I'll listen to it and go, mm-hmm, it doesn't work.
03:04:18.620 | But that's, you know, like writers do this all the time.
03:04:20.860 | It's called killing your babies, right?
03:04:22.300 | You go, pfft, can't, you know, get, no.
03:04:24.100 | But people go, so this guy goes,
03:04:25.260 | I'm not seeing the dedication.
03:04:26.460 | He has no idea how many things we're throwing out.
03:04:28.960 | I did an hour and a half, I had an hour and a half
03:04:31.700 | into the current show about two months ago.
03:04:34.460 | And I listened to it and I just went,
03:04:35.860 | you know what, it's not right.
03:04:37.020 | Boom, out the window, there goes six weeks of work, right?
03:04:41.060 | But here's the problem.
03:04:42.540 | - You trust your, sorry to interrupt.
03:04:43.820 | Do you trust your judgment on that?
03:04:45.440 | - No.
(laughing)
03:04:47.280 | No, but here's the thing.
03:04:50.300 | Our show is a little different than other people's.
03:04:54.120 | Joe Rogan called it evergreen content.
03:04:56.420 | In other words, my political show is like a car you buy.
03:04:59.840 | And the minute you drive it off the lot,
03:05:01.540 | it loses half its value, right?
03:05:03.040 | 'Cause it's not current anymore.
03:05:04.600 | These shows are just as good or just as bad
03:05:07.880 | five years from now as they are when we,
03:05:09.680 | although the standards on the internet change.
03:05:11.400 | So when I listen to my old shows, I cringe sometimes
03:05:13.360 | 'cause the standards are so much higher now.
03:05:15.300 | But when you're creating evergreen content,
03:05:18.100 | you have two audiences to worry about.
03:05:20.360 | You have the audience that's waiting for the next show
03:05:22.600 | and they've already heard the other ones
03:05:23.760 | and they're impatient and they're telling you on Twitter,
03:05:25.400 | where is it?
03:05:26.280 | But the show's also for people five years
03:05:28.800 | from now who haven't discovered it yet
03:05:30.480 | and who don't care a whit for how long it took
03:05:32.920 | 'cause they're gonna be able to download the whole thing
03:05:34.920 | and all they care about is quality.
03:05:36.960 | And so what I always tell new podcasters is,
03:05:40.240 | they always say, I read all these things,
03:05:41.680 | it's very important you have a release schedule.
03:05:44.440 | Well, it's not more important
03:05:45.480 | than putting out a good piece of work.
03:05:47.280 | And the audience will forgive me if it takes too long,
03:05:52.040 | but it's really good when you get it.
03:05:53.880 | They will not forgive me if I rush it to get it out on time
03:05:56.860 | and it's a piece of crap.
03:05:58.320 | So for us, and this is why when you brought up
03:06:00.760 | a Spotify deal or anything else,
03:06:02.480 | they can't interfere with this at all.
03:06:04.160 | Because my job here, as far as I'm concerned, is quality.
03:06:07.920 | And everything else goes by the wayside.
03:06:10.240 | Because the only thing people care about long-term,
03:06:12.480 | the only thing that gives you longevity
03:06:14.520 | is how good is it, right?
03:06:15.740 | How good is that book?
03:06:16.720 | If you read J.R.R. Tolkien's work tomorrow,
03:06:19.320 | you don't care how long it took him to write it,
03:06:21.220 | all you care is how good it is today.
03:06:22.760 | And that's what we try to think too.
03:06:24.120 | And I feel like if it's good, if it's really good,
03:06:27.120 | everything else falls into place and takes care of itself.
03:06:30.520 | - Although sometimes to push back, sorry to interrupt.
03:06:33.440 | - I've done it to you a thousand times,
03:06:34.720 | so you can get me back, please.
03:06:36.160 | - Sometimes the deadline,
03:06:38.440 | some of the greatest movies and books have been,
03:06:42.200 | you think about like Dostoevsky,
03:06:43.720 | I forget which one,
03:06:44.560 | Notes from Underground or something.
03:06:46.040 | He needed the money, so he had to write it real quick.
03:06:49.040 | Sometimes the deadline creates,
03:06:51.500 | is powerful at taking a creative mind of an artist
03:06:57.320 | and just slapping it around
03:06:59.500 | to force some of the good stuff out.
03:07:01.200 | Now the problem with history, of course,
03:07:02.760 | is there's different definitions of good
03:07:06.520 | that it's not just about what you talk about,
03:07:09.160 | which is the storytelling,
03:07:10.320 | the richness of the storytelling.
03:07:12.000 | And I'm sure you're, again, not to compliment you too much,
03:07:15.560 | but you're one of the great storytellers of our time,
03:07:18.200 | that I'm sure if you were put in a jail cell
03:07:21.040 | and somebody pointed a gun at you,
03:07:24.320 | you could tell one hell of a good story.
03:07:26.260 | But you still need the facts of history,
03:07:29.560 | or not necessarily the facts,
03:07:31.240 | but making sure you're painting the right full picture,
03:07:35.960 | not perfectly right.
03:07:37.240 | - That's what I meant about how the audience
03:07:38.440 | doesn't understand what a history podcast takes,
03:07:39.840 | you can't just riff and be wrong.
03:07:41.640 | So let me both oppose what you just said
03:07:45.560 | and back up what you just said.
03:07:47.280 | So I have a book that I wrote, right?
03:07:49.320 | And in a book, you have a hard deadline, right?
03:07:52.240 | So HarperCollins had a hard deadline on that book.
03:07:54.920 | So when I released it, I was mad
03:07:57.520 | because I would have worked on it a lot longer,
03:07:59.480 | which is my style, right?
03:08:00.760 | Get it right.
03:08:02.280 | But we had a chapter in that book
03:08:04.560 | entitled "Pandemic Prologue?"
03:08:07.920 | And it was the part of the book about the Black Death
03:08:10.640 | and the 1918 flu and all that kind of stuff.
03:08:13.680 | And I was just doing an interview
03:08:15.480 | with a Spanish journalist this morning who said,
03:08:18.000 | did you ever think how lucky you got on that?
03:08:21.000 | And first of all, lucky on a pandemic, it strikes you.
03:08:24.240 | But had I had my druthers,
03:08:27.400 | I would have kept that book working
03:08:29.440 | in my study for months more.
03:08:32.200 | And the pandemic would have happened.
03:08:34.540 | And that would have looked like a chapter
03:08:36.560 | I wrote after the fact.
03:08:37.880 | I would have had to rewrite the whole thing.
03:08:39.680 | So that argues for what you said.
03:08:42.680 | At the same time, I would have spent months more
03:08:45.920 | working on it because to me,
03:08:47.080 | it didn't look the way I wanted it to look yet.
03:08:49.760 | - Can you drop a hint of the things
03:08:52.000 | that you're keeping on the shelves?
03:08:53.480 | - Oh, the Alexander the Great podcast.
03:08:55.120 | I've talked around it.
03:08:56.920 | I talked to somebody the other day.
03:08:57.760 | He said, do you know that the very first word
03:09:00.600 | in your very first podcast, in the title,
03:09:02.920 | the very first thing that anybody ever saw
03:09:04.440 | with Hardcore History is the word Alexander.
03:09:07.660 | And because the show's entitled Alexander versus Hitler.
03:09:10.100 | I have talked around the career.
03:09:12.140 | I've done show after, I talked about his mother
03:09:14.140 | in one episode.
03:09:14.980 | I talked about the funeral games after his death.
03:09:18.260 | I've talked around this.
03:09:19.420 | I've specifically left this giant Alexandrian size hole
03:09:22.860 | in the middle, 'cause we're gonna do that show one day
03:09:24.940 | and I'm going to lovingly enjoy talking
03:09:27.400 | about this crazily interesting figure
03:09:29.700 | of Alexander the Great.
03:09:30.580 | So that's one of the ones that's on the back pocket list.
03:09:33.300 | And what we try to do is whenever this,
03:09:37.200 | we're doing the Second World War in Asia and the Pacific now,
03:09:40.600 | I'm on part five.
03:09:41.840 | Whenever the heck we finish this,
03:09:43.500 | the tendency is to then pick a very different period
03:09:46.240 | because we've had it and the audience has had it.
03:09:49.160 | So it's time.
03:09:50.000 | So I will eventually get to the Alexander saga.
03:09:53.240 | - What about just one last kind of little part of this is,
03:09:57.960 | what about the other half of that first 10 minute,
03:10:00.640 | 15 minute episode, which is,
03:10:02.920 | so you've done quite a bit about the world war.
03:10:05.040 | You've done quite a bit about Germany.
03:10:07.720 | Will you ever think about doing Hitler and the man?
03:10:12.400 | - It's funny because I talked earlier
03:10:14.800 | about how I don't like to go back to the old shows
03:10:16.440 | 'cause our standards have changed so much.
03:10:18.040 | Well, a long time ago, one of my standards
03:10:21.380 | for not getting five-hour podcasts done
03:10:24.480 | or not getting too deeply into them
03:10:27.160 | was to flip around the interesting points.
03:10:29.960 | We didn't realize we were gonna get an audience
03:10:32.280 | that wanted the actual history.
03:10:34.520 | We thought we could just go with,
03:10:36.400 | assume the audience knew the details
03:10:38.160 | and just talk about the weird stuff
03:10:39.480 | that only makes up one part of the show now.
03:10:41.760 | So we did a show called "Nazi Tidbits"
03:10:44.640 | and it was just little things about,
03:10:46.960 | it's totally out of date now.
03:10:48.000 | Like, you can still buy them, but they're out of date.
03:10:50.640 | Where we dealt a little with it.
03:10:53.080 | It would be interesting, but I'll give you another example.
03:10:56.280 | I mean, history is not stagnant, as you know.
03:10:59.440 | And we had talked about Stalin earlier
03:11:02.160 | and "Ghosts of the Ostfront" was done years ago.
03:11:04.640 | And people will write me from Russia now and say,
03:11:06.680 | well, your portrayal of Stalin is totally out of,
03:11:09.480 | it's outdated because there's all this new stuff
03:11:13.520 | from the former Soviet Union.
03:11:15.120 | And you do, you turn around and you go, okay, they're right.
03:11:18.320 | And so when you talk about Hitler,
03:11:21.640 | it's very interesting to think about
03:11:23.200 | how I would do a Hitler show today
03:11:24.840 | versus how I did one 10 years ago.
03:11:27.640 | And you would think, well, what's new?
03:11:28.720 | I mean, it happened so long, but there's lots of new stuff
03:11:30.640 | and there's lots of new scholarship.
03:11:31.920 | And so, yeah, I would think that would be
03:11:34.680 | an interesting one to do someday.
03:11:36.760 | I haven't thought about that.
03:11:37.880 | That's not in the back pocket, but yeah,
03:11:40.240 | that'd be interesting.
03:11:41.080 | - I have a disproportionate amount of power
03:11:42.920 | because I trapped you somehow in a room and thereby--
03:11:46.920 | - During a pandemic.
03:11:47.960 | - During a pandemic.
03:11:49.160 | So like my hope will be stuck in your head,
03:11:51.920 | but after Alexander the Great,
03:11:53.320 | which would be an amazing podcast,
03:11:56.000 | I hope you do give a return to Hitler,
03:12:01.720 | The Rise and Fall of the Third Reich,
03:12:01.720 | which to me--
03:12:04.400 | - It's a contemporary book, basically.
03:12:06.720 | - Yeah, exactly.
03:12:08.880 | It's by a person who was there.
03:12:11.400 | - Shirer, yeah.
03:12:11.400 | - I really loved that study of the man of Hitler.
03:12:16.280 | And I would love to hear your study
03:12:20.320 | of certain aspects of it.
03:12:21.800 | Perhaps even an episode that's like more focused
03:12:23.880 | on a very particular period.
03:12:25.700 | I just feel like you can tell a story that,
03:12:30.400 | it's funny, Hitler's one of the most studied people
03:12:32.640 | and I still feel like all the stories
03:12:35.920 | or most of the stories haven't been told.
03:12:38.040 | - Oh, and there's, listen, I've got three books at home.
03:12:40.440 | I'm on all the publishers' lists now.
03:12:42.080 | And they just, there's young Hitler,
03:12:43.880 | there's this Hitler, there's that.
03:12:45.080 | I mean, I've been reading these books
03:12:46.200 | and I've read about Hitler.
03:12:47.280 | I read The Rise and Fall of the Third Reich.
03:12:48.840 | My mother thought I needed to go to a psychologist
03:12:51.240 | 'cause I read it when I was six.
03:12:53.440 | And she said, "There's something wrong with the boy."
03:12:55.640 | And but--
03:12:56.960 | - She was right.
03:12:57.800 | - She was absolutely right.
03:12:59.600 | But you would think that something like that
03:13:02.560 | is pretty established fact.
03:13:04.240 | And yet there's new stuff coming out all the time.
03:13:06.320 | And needless to say,
03:13:07.680 | Germany's been investigating this guy forever.
03:13:10.320 | And sometimes it takes years to get the translations.
03:13:13.160 | I took five years of German in school.
03:13:15.240 | I can't read any of it.
03:13:16.960 | So, I mean, and he is,
03:13:20.120 | when you talk about fascinating figures,
03:13:22.160 | he's so, the whole thing is so twistedly weird.
03:13:26.080 | There was a, it came out a couple of years ago,
03:13:28.240 | somebody found a tape of him talking to,
03:13:31.480 | I wanna say it was general,
03:13:34.480 | the Finnish general, Mannerheim, right?
03:13:36.600 | And he's just in a very normal conversation
03:13:39.560 | of the sort we're having now.
03:13:41.160 | And the Hitler tapes, when you hear him normally,
03:13:43.160 | he's ranting and raving,
03:13:44.400 | but this was very sedate.
03:13:46.040 | And I wish I'd understood the German well enough
03:13:47.840 | to really get a feel because I was reading
03:13:50.160 | what Germans said and they said,
03:13:51.440 | "Wow, you can really hear the Southern accent."
03:13:55.600 | Little things that only a native speaker would hear.
03:13:58.640 | And I remember thinking,
03:13:59.560 | this is such a different side of this twisted character.
03:14:02.320 | And you would think you would always,
03:14:04.560 | you would think that this was information
03:14:06.000 | that was out in The Rise and Fall of the Third Reich era,
03:14:08.760 | but it wasn't.
03:14:09.760 | And so this goes along with that stuff
03:14:13.000 | about new stuff coming out all the time.
03:14:14.880 | Alexander, new stuff coming out all the time.
03:14:16.840 | - Really?
03:14:17.680 | - Well, at least interpretations rather than factual data.
03:14:20.200 | - And those color your,
03:14:21.720 | those give depth to your understanding.
03:14:23.400 | - Yes, and you want that
03:14:24.480 | because of the historiography, people love that.
03:14:27.720 | And that was a by-product of my lack of credentials
03:14:31.000 | where we thought we're gonna bring in the historians
03:14:34.040 | and we call them audio footnotes,
03:14:36.120 | right away for me to say,
03:14:37.400 | "Listen, I'm not a historian,
03:14:38.400 | but I'll quote this guy who is, so you can trust him."
03:14:41.060 | But then we would quote other people
03:14:42.680 | who had different views.
03:14:44.360 | And people didn't realize that,
03:14:46.400 | if they're not history majors,
03:14:47.900 | that historians don't always agree on this stuff
03:14:49.880 | and that they have disagreements and they loved that.
03:14:52.280 | So I love the fact that there's more stuff out there
03:14:55.680 | because it allows us to then bring in other points of view
03:14:58.840 | and sort of maybe three-dimensionalize
03:15:00.840 | or flesh out the story a little bit more.
03:15:03.280 | - Two last questions, one really simple,
03:15:06.020 | one absurdly ridiculous and perhaps also simple.
03:15:09.400 | First, who is Ben and is he real?
03:15:13.480 | - I don't even know what you're talking about.
03:15:17.300 | - Very well.
03:15:18.440 | - How's that for an answer?
03:15:20.360 | Like asking me, is Harvey the White Rabbit real?
03:15:22.400 | I don't know.
03:15:23.600 | There's carrots all around the production room,
03:15:25.360 | but I don't know what that means.
03:15:27.000 | - Well, a lot of people demanded that I prove,
03:15:30.180 | I somehow figure out a way to prove the existence.
03:15:32.520 | - If I said he was real, people would say, "No, he's not."
03:15:35.560 | And if I said he wasn't real, they would say, "Yes, he is."
03:15:38.520 | So it's a Santa Claus, Easter bunny kind of vibe there.
03:15:42.240 | - Yeah.
03:15:43.240 | I mean, what is real anyway?
03:15:44.640 | - That's exactly what I told him if he exists.
03:15:48.600 | - Okay, the most absurd question, I'm very sorry,
03:15:51.720 | very sorry, but then again, I'm not.
03:15:53.480 | What's the meaning of it all?
03:15:56.000 | You study history, human history.
03:16:00.680 | Have you been able to make sense of why the hell we're here
03:16:06.560 | on this spinning rock?
03:16:08.520 | Does any of it even make sense?
03:16:09.960 | What's the meaning of life?
03:16:11.800 | - What I look at sometimes that I find interesting
03:16:16.480 | is certain consistencies that we have over time.
03:16:20.800 | History doesn't repeat, but it has a constant,
03:16:24.720 | and the constant is us.
03:16:26.680 | Now we change.
03:16:27.840 | I mentioned earlier the wickedly weird time we live in
03:16:31.460 | with what social media is doing to us as guinea pigs,
03:16:33.680 | and that's a new element,
03:16:35.600 | but we're still people who are motivated by love, hate,
03:16:40.520 | greed, envy, sex.
03:16:41.840 | I mean, all these things that would have connected us
03:16:43.600 | with the ancients, right?
03:16:45.520 | That's the part that always makes history sound
03:16:48.040 | like it rhymes, you know?
03:16:49.880 | And when you put the constant, the human element,
03:16:54.400 | and you mix it with systems that are similar,
03:16:57.320 | so one of the reasons that the ancient Roman Republic
03:16:59.600 | is something that people point to all the time
03:17:02.120 | as something that seems like we're repeating history
03:17:05.160 | is because you have humans, just like you had then,
03:17:08.840 | and you have a system that resembles the one we have here.
03:17:12.360 | So you throw the constant in with a system
03:17:14.440 | that is somewhat similar,
03:17:15.920 | and you begin to see things
03:17:17.380 | that look like they rhyme a little.
03:17:19.600 | So for me, I'm always trying to figure out more about us,
03:17:23.440 | and when you show us 500 years ago in Asia,
03:17:28.440 | and 800 years ago in Africa,
03:17:30.800 | and you look at all these different places
03:17:32.840 | that you put the guinea pig in,
03:17:35.000 | and you watch how the guinea pig responds
03:17:37.200 | to the different stimuli and challenges,
03:17:40.040 | I feel like it helps me flesh out a little bit more
03:17:45.000 | who we are in the long timeline,
03:17:47.520 | not who we are today specifically,
03:17:49.640 | but who we've always been.
03:17:51.440 | It's a personal quest.
03:17:53.380 | It's not meant to educate anybody else.
03:17:55.640 | It's something that fascinates me.
03:17:57.640 | - Do you think there's, in that common humanity
03:18:00.560 | throughout history of the guinea pig,
03:18:03.320 | is there a why underneath it all?
03:18:06.120 | Or is it somehow, like,
03:18:07.400 | it feels like it's an experiment of some sort?
03:18:10.480 | - Oh, now you're into it.
03:18:11.880 | Elon Musk and I talked about this,
03:18:13.280 | the simulation thing, right?
03:18:14.480 | Nick Bostrom's-- - Sure.
03:18:15.680 | - Yeah, the idea that there's some kid,
03:18:17.840 | and we're the equivalent of an alien's ant farm, you know?
03:18:21.120 | And we hope he doesn't throw a tarantula in
03:18:23.100 | just to see what happens.
03:18:24.360 | I think the whys elude us,
03:18:29.440 | and I think that what makes philosophy and religion
03:18:32.900 | and those sorts of things so interesting
03:18:34.920 | is that they grapple with the whys.
03:18:38.060 | But I'm not wise enough to propose a theory myself,
03:18:44.280 | but I'm interested enough to read
03:18:47.200 | all the other ones out there.
03:18:48.520 | So let's put it this way.
03:18:51.500 | I don't think there's any definitive why
03:18:53.520 | that's been agreed upon,
03:18:54.640 | but the various theories are fascinating.
03:18:56.720 | - Yeah, whatever it is,
03:18:58.360 | whoever the kid is that created this thing,
03:19:01.160 | the ant farm is kind of interesting.
03:19:05.140 | - Well, so far.
03:19:06.920 | A little bit twisted and perverted and sadistic, maybe.
03:19:09.800 | - That's what makes it fun, I think.
03:19:12.120 | But then again, that's the Russian perspective.
03:19:13.960 | - I was just gonna say, it is the Russian perspective.
03:19:18.960 | - A little bit of-- - That's what makes
03:19:20.920 | the Russian, so Russian history,
03:19:22.780 | one day I'll do some Russian history.
03:19:24.160 | I took it in college. - Yeah, you should.
03:19:25.840 | - Oh, that's the ant farm, baby.
03:19:28.640 | That's an ant farm with a very, very frustrated,
03:19:31.200 | young teenage alien kid.
03:19:34.880 | - Dan, I can't say,
03:19:37.420 | I've already complimented you way too much.
03:19:39.420 | I'm a huge fan.
03:19:40.920 | This has been an incredible conversation.
03:19:42.920 | It's a huge gift.
03:19:43.960 | Your gift of humanity, I hope you--
03:19:47.720 | - Let me cut you off and just say,
03:19:49.160 | you've done a wonderful job.
03:19:50.920 | This has been fun for me.
03:19:52.800 | The questions, and more importantly,
03:19:54.340 | the questions can come from anybody.
03:19:56.040 | The counter statements, your responses have been wonderful.
03:19:58.620 | You made this a very fun intellectual discussion for me.
03:20:01.280 | Thank you.
03:20:02.120 | - Well, let me have the last word and say,
03:20:03.680 | I agree with Elon, and despite what the doomcaster says,
03:20:08.280 | I think we've concluded definitively,
03:20:10.760 | and you don't get a chance to respond,
03:20:12.160 | that love is in fact the answer and the way forward.
03:20:16.920 | So, thanks so much, Dan.
03:20:18.400 | - Thank you for having me.
03:20:20.280 | - Thanks for listening to this conversation
03:20:22.080 | with Dan Carlin, and thank you to our sponsors.
03:20:25.200 | Athletic Greens, the all-in-one drink
03:20:27.480 | that I start every day with
03:20:29.340 | to cover all my nutritional bases.
03:20:31.880 | Simply Safe, a home security company I use
03:20:34.400 | to monitor and protect my apartment.
03:20:36.700 | Magic Spoon, low-carb, keto-friendly cereal
03:20:39.640 | that I think is delicious.
03:20:41.720 | And finally, Cash App, the app I use
03:20:43.800 | to send money to friends for food and drinks.
03:20:47.040 | Please check out these sponsors in the description
03:20:49.660 | to get a discount and to support this podcast.
03:20:52.860 | If you enjoy this thing, subscribe on YouTube,
03:20:56.180 | review it with Five Stars on Apple Podcasts,
03:20:58.560 | follow on Spotify, support on Patreon,
03:21:01.200 | or connect with me on Twitter, @LexFridman.
03:21:04.720 | And now, let me leave you with some words from Dan Carlin.
03:21:09.080 | "Wisdom requires a flexible mind."
03:21:12.600 | Thank you for listening, and hope to see you next time.
03:21:16.440 | (upbeat music)
03:21:19.020 | (upbeat music)