
Ben Shapiro: Politics, Kanye, Trump, Biden, Hitler, Extremism, and War | Lex Fridman Podcast #336


Chapters

0:00 Introduction
2:01 Kanye 'Ye' West
9:41 Hitler and the nature of evil
17:47 Political attacks on the left and the right
23:31 Quebec mosque shooting
33:26 Elon Musk buying Twitter
46:29 Trump and Biden
51:03 Hunter Biden's laptop
62:36 Candace Owens
66:15 War in Ukraine
76:24 Rhetoric vs truth
81:19 Infamous BBC interview
84:35 Day in the life
99:31 Abortion
112:26 Climate change
119:48 God and faith
130:58 Tribalism
135:34 Advice for young people
139:20 Andrew Breitbart
141:50 Self-doubt
143:52 Love


00:00:00.000 | The great lie we tell ourselves is that people who are evil are not like us.
00:00:03.120 | They're a class apart.
00:00:04.160 | Everybody in history who has sinned is a person who's very different from me.
00:00:08.040 | Robert George, the philosopher over at Princeton, he's fond of doing a sort of thought experiment
00:00:12.760 | in his classes where he asks people to raise their hand if they had lived in Alabama in
00:00:16.760 | 1861, how many of you would be abolitionists?
00:00:19.360 | And everybody raises their hand.
00:00:20.600 | He says, of course, that's not true.
00:00:22.480 | Of course, that's not true.
00:00:24.080 | The best protection against evil is recognizing that it lies in every human heart and the
00:00:29.600 | possibility that it takes you over.
00:00:32.520 | Do you ever sit back, you know, in the quiet of your own mind and think, am I participating
00:00:38.080 | in evil?
00:00:41.200 | The following is a conversation with Ben Shapiro, a conservative political commentator, host
00:00:46.480 | of The Ben Shapiro Show, co-founder of The Daily Wire, and author of several books, including
00:00:53.040 | The Authoritarian Moment, The Right Side of History, and Facts Don't Care About Your Feelings.
00:00:59.920 | Whatever your political leanings, I humbly ask that you try to put those aside and listen
00:01:05.640 | with an open mind, trying to give the most charitable interpretation of the words we say.
00:01:11.600 | This is true in general for this podcast, whether the guest is Ben Shapiro or Alexandria
00:01:17.960 | Ocasio-Cortez, Donald Trump, or Barack Obama.
00:01:22.960 | I will talk to everyone, from every side, from the far left to the far right, from presidents
00:01:29.320 | to prisoners, from artists to scientists, from the powerful to the powerless, because
00:01:35.200 | we are all human, all capable of good and evil, all with fascinating stories and ideas
00:01:42.920 | to explore.
00:01:43.920 | I seek only to understand, and in so doing, hopefully, add a bit of love to the world.
00:01:51.200 | This is the Lex Fridman Podcast.
00:01:53.000 | To support it, please check out our sponsors in the description.
00:01:56.520 | And now, dear friends, here's Ben Shapiro.
00:02:01.560 | Let's start with a difficult topic.
00:02:04.200 | What do you think about the comments made by Ye, formerly known as Kanye West, about
00:02:08.560 | Jewish people?
00:02:09.560 | They're awful and anti-Semitic, and they seem to get worse over time.
00:02:14.640 | They started off with the bizarre DEF CON 3 tweet, and then they went into even more
00:02:21.400 | stereotypical garbage about Jews, and Jews being sexual manipulators.
00:02:26.280 | I think that was the Pete Davidson, Kim Kardashian stuff, and then Jews running all of the media,
00:02:31.440 | Jews being in charge of the financial sector, Jewish people.
00:02:35.760 | There's no...
00:02:36.760 | I called it on my show, "Der Stürmer Nazism," and it is.
00:02:39.640 | It's like, right from Protocols of the Elders of Zion type stuff.
00:02:43.560 | Do you think those words come from pain?
00:02:46.040 | Where do they come from?
00:02:47.040 | It's always hard to try and read somebody's mind.
00:02:50.080 | What he looks like to me, just having experience in my own family of people who are bipolar,
00:02:54.120 | he seems like a bipolar personality.
00:02:56.160 | He seems like somebody who is in the middle of a manic episode, and when you're manic,
00:03:01.920 | you tend to say a lot of things that you shouldn't say, and you tend to believe that they're
00:03:06.800 | the most brilliant things ever said.
00:03:08.880 | The Washington Post did an entire piece speculating about how bipolarism played into the kind
00:03:13.320 | of stuff that Ye was saying, and it's hard for me to think that it's not playing into
00:03:19.600 | it, especially because even if he is an anti-Semite, and I have no reason to suspect he's not,
00:03:25.840 | given all of his comments, if he had an ounce of common sense, he would stop at a certain
00:03:30.040 | point, and bipolarism tends to drive you well past the point where common sense applies.
00:03:35.880 | I would imagine it's coming from that.
00:03:41.720 | From his comments, I would also imagine that he's doing the logical mistake that a lot
00:03:46.840 | of anti-Semites or racists or bigots do, which is, "Somebody hurt me.
00:03:51.200 | That person is a Jew.
00:03:52.720 | Therefore, all Jews are bad."
00:03:54.920 | That jump from, "A person did something to me I don't like who's a member of a particular
00:03:59.360 | race or class, and therefore, everybody of that race or class is bad," that's textbook
00:04:04.600 | bigotry, and that's pretty obviously what Ye's engaging in here.
00:04:07.840 | So, jumping from the individual to the group.
00:04:09.480 | That's the way he's been expressing it, right?
00:04:10.600 | He keeps talking about his Jewish agents, and I watched your interview with him, and
00:04:13.280 | you kept saying, "So, just name the agents, right?
00:04:14.960 | Just name the people who are screwing you," and he wouldn't do it.
00:04:18.800 | Instead, he just kept going back to the general, the group, the Jews in general.
00:04:22.880 | I mean, that's textbook bigotry, and if it were put in any other context, he would probably
00:04:27.720 | recognize it as such.
00:04:30.760 | To the degree as words fuel hate in the world, what's the way to reverse that process?
00:04:35.600 | What's the way to alleviate the hate?
00:04:37.440 | I mean, when it comes to alleviating the kind of stuff that he's saying, obviously, debunking
00:04:43.320 | it, making clear that what he's saying is garbage, but the reality is that I think that
00:04:50.440 | for most people who are in any way engaged with these issues, I don't think they're being
00:04:57.400 | convinced to be anti-Semitic by Ye.
00:05:00.040 | I mean, I think that there's a group of people who may be swayed that anti-Semitism is acceptable
00:05:04.680 | because Ye is saying what he's saying, and he's saying so very loudly, and he's saying
00:05:09.240 | it over and over.
00:05:10.840 | But yeah, I think that, for example, there are these signs that were popping up in Los
00:05:14.840 | Angeles saying Ye is right.
00:05:16.320 | Well, that group's been out there posting anti-Semitic signs on the freeways for years,
00:05:20.120 | and there are groups like that posting anti-Semitic signs where I live in Florida.
00:05:23.800 | They've been doing that for years, well before Ye was saying this sort of stuff.
00:05:26.480 | It's just the latest opportunity to kind of jump on that particular bandwagon, but
00:05:30.040 | listen, I think that people do have a moral duty to call that stuff out.
00:05:34.800 | So there is a degree to which it normalizes that kind of idea that Jews control the media,
00:05:41.400 | Jews control X institution.
00:05:45.480 | Is there a way to talk about a high representation of a group like Jewish people in a certain
00:05:55.840 | institution like the media or Hollywood and so on without it being a hateful conversation?
00:06:01.200 | Sure, of course.
00:06:03.040 | A higher than statistically represented percentage
00:06:08.480 | of Hollywood agents are probably Jewish.
00:06:11.440 | A higher percentage of lawyers generally are probably Jewish.
00:06:13.880 | A high percentage of accountants are probably Jewish.
00:06:16.800 | Also a higher percentage of engineers are probably Asian.
00:06:21.080 | Statistical truths are statistical truths.
00:06:22.800 | It doesn't necessarily mean anything about the nature of the people who are being talked
00:06:29.200 | about.
00:06:30.200 | There are a myriad of reasons why people might be disproportionately in one arena or another
00:06:34.200 | ranging from the cultural to sometimes the genetic.
00:06:36.880 | I mean there are certain areas of the world where people are better long distance runners
00:06:41.200 | because of their genetic adaptations in those particular areas of the world.
00:06:45.080 | That's not racist, that's just fact.
00:06:47.320 | What starts to get racist is when you are attributing a bad characteristic to an entire
00:06:50.840 | population based on the notion that some members of that population are doing bad things.
00:06:58.240 | Yeah, there's a jump between.
00:07:00.240 | It's also possible that record label owners as a group have a kind of culture that Fs
00:07:08.160 | over artists.
00:07:09.160 | Sure.
00:07:10.160 | Doesn't treat artists fairly and it's also possible that there's a high representation
00:07:13.720 | of Jews in the group of people that own record labels, but it's a very big
00:07:21.480 | leap that people take from the group that owns record labels to all Jews.
00:07:27.960 | For sure.
00:07:28.960 | I think that one of the other issues also is that anti-Semitism is fascinating because
00:07:34.960 | it breaks down into so many different parts.
00:07:36.960 | Meaning that if you look at sort of different types of anti-Semitism, if you're a racist
00:07:40.920 | against black people, it's typically because you're racist based on the color of their
00:07:43.760 | skin.
00:07:44.760 | If you're racist against the Jews, you're anti-Semitic, then there are actually a few
00:07:48.280 | different ways that breaks down.
00:07:49.600 | You have anti-Semitism in terms of ethnicity which is like Nazi-esque anti-Semitism.
00:07:54.520 | You have Jewish parentage, you have a Jewish grandparent, therefore your blood is corrupt
00:07:59.360 | and you are inherently going to have bad properties.
00:08:01.720 | Then there's sort of old school religious anti-Semitism which is that the Jews are the
00:08:06.100 | killers of Christ or the Jews are the sons of pigs and monkeys and therefore Judaism
00:08:10.820 | is bad and therefore Jews are bad.
00:08:13.380 | The way that you get out of that anti-Semitism, historically speaking, is mass conversion.
00:08:17.380 | Most anti-Semitism for a couple thousand years actually was not ethnic.
00:08:20.660 | It was much more rooted in this sort of stuff.
00:08:23.220 | If a Jew converted out of the faith, then the anti-Semitism was "alleviated."
00:08:29.180 | Then there's a sort of bizarre anti-Semitism that's political anti-Semitism and that is
00:08:34.700 | members of a group that I don't like are disproportionately Jewish, therefore all Jews are members of
00:08:43.700 | this group or are predominantly represented in this group.
00:08:46.780 | So you'll see Nazis saying the communists are Jews.
00:08:49.020 | You'll see communists saying the Nazis are Jews or you'll see communists saying that
00:08:52.540 | the capitalists rather are Jews.
00:08:54.900 | So that's the weird thing about anti-Semitism.
00:08:56.940 | It's kind of like the Jews behind every corner.
00:08:58.740 | It's basically a big conspiracy theory.
00:09:00.500 | Unlike a lot of other forms of racism which are not really conspiracy theory, anti-Semitism
00:09:03.700 | tends to be a conspiracy theory about levers of power being controlled by a shadowy cadre
00:09:09.380 | of people who are getting together behind closed doors to control things.
00:09:12.140 | Yeah, the most absurd illustration of anti-Semitism just like you said is Stalin versus Hitler
00:09:18.740 | over Poland that every bad guy was a Jew.
00:09:25.100 | So every enemy, there's a lot of different enemy groups, intellectuals, political and
00:09:29.420 | so on, military and behind any movement that is considered the enemy for the Nazis and
00:09:35.140 | any movement that's considered the enemy for the Soviet army are the Jews.
00:09:41.700 | What does the fact that Hitler took power teach you about human nature?
00:09:48.140 | When you look back at the history of the 20th century, what do you learn from that time?
00:09:52.100 | I mean there are a bunch of lessons to Hitler taking power.
00:09:55.500 | The first thing I think people ought to recognize about Hitler taking power is that the power
00:10:00.540 | had been centralized in the government before Hitler took it.
00:10:03.700 | So if you actually look at the history of Nazi Germany, the Weimar Republic had effectively
00:10:07.300 | collapsed.
00:10:08.460 | The power had been centralized in the chancellery and really under Hindenburg for a couple of
00:10:15.300 | years before that.
00:10:17.100 | So it was only a matter of time until someone who was bad grabbed the power.
00:10:21.820 | The struggle between the Reds and the Browns in pre-Nazi Germany led to this
00:10:28.220 | kind of upward spiral of radical sentiment that allowed Hitler in through the front door,
00:10:33.420 | not through the back door.
00:10:34.420 | He was elected.
00:10:35.420 | So you think communists could have also taken power?
00:10:38.020 | There's no question communists could have taken power.
00:10:39.380 | They were a serious force in pre-Nazi Germany.
00:10:42.020 | Do you think there was an underlying current that would have led to an atrocity if the
00:10:45.460 | communists had taken power?
00:10:46.780 | It wouldn't have been quite the same atrocity, but obviously the communists in Soviet Russia
00:10:50.340 | at exactly this time were committing the Holodomor.
00:10:54.020 | So there were very few good guys in terms of good parties.
00:10:57.580 | The moderate parties were being dragged by the radicals into alliance with them to prevent
00:11:03.300 | the worst case scenario from the other guy.
00:11:05.700 | So if you look at, I'm sort of fascinated by the history of this period because it really
00:11:10.660 | does speak to how does a democracy break down.
00:11:13.540 | I mean, the '20s Weimar Republic was a very liberal democracy.
00:11:16.100 | How does a liberal democracy break down into complete fascism and then into genocide?
00:11:21.620 | And there's a character who was very prominent in the history of that time named Franz von
00:11:25.780 | Papen who was actually the second to last chancellor of the republic before Hitler.
00:11:30.500 | So he was the chancellor and then he handed over to Schleicher and then Schleicher ended
00:11:34.620 | up collapsing and that ended up handing power over to Hitler.
00:11:37.540 | It was Papen who had stumped for Hitler to become chancellor.
00:11:41.340 | Papen was a Catholic Democrat.
00:11:44.620 | He didn't like Hitler.
00:11:45.780 | He thought that Hitler was a radical and a nut job, but he also thought that Hitler being
00:11:51.060 | a buffoon as he saw it was going to essentially be usable by the right forces in order
00:11:56.860 | to prevent the communists from taking power, maybe in order to restore
00:12:01.820 | some sort of legitimacy to the regime because he was popular in order for Papen to retain
00:12:07.340 | power himself.
00:12:08.860 | And then immediately after Hitler taking power, Hitler basically kills all of Papen's friends.
00:12:13.100 | Papen out of "loyalty" stays on.
00:12:15.620 | He ends up helping the Anschluss in Austria.
00:12:17.860 | Now all this stuff is really interesting mainly because what it speaks to is the great lie
00:12:21.980 | we tell ourselves that people who are evil are not like us.
00:12:24.980 | They're a class apart.
00:12:25.980 | People who do evil things, people who support evil people, they're not like us.
00:12:30.660 | That's an easy call.
00:12:32.940 | Everybody in history who has sinned is a person who's very different from me.
00:12:36.340 | Robert George, the philosopher over at Princeton, he's fond of doing a sort of thought experiment
00:12:40.860 | in his classes where he asks people to raise their hand if they had lived in Alabama in
00:12:44.820 | 1861, how many of you would be abolitionists?
00:12:47.620 | And everybody raises their hand.
00:12:48.620 | He says, "Of course that's not true.
00:12:50.700 | Of course that's not true."
00:12:52.620 | The best protection against evil is recognizing that it lies in every human heart and the
00:12:57.700 | possibility that it takes you over.
00:13:01.200 | And so you have to be very cautious in how you approach these issues and the back and
00:13:06.200 | forth of politics, the sort of bipolarity of politics or the polarization in politics,
00:13:11.420 | might be a better way to put it, makes it very easy to kind of fall into the rock 'em
00:13:17.300 | sock 'em robots that eventually could theoretically allow you to support somebody who's truly
00:13:22.700 | frightening and hideous in order to stop somebody who you think is more frightening and hideous.
00:13:27.020 | And you see this kind of language, by the way, now predominating almost all over the
00:13:30.520 | Western world, right?
00:13:31.660 | My political enemy is an enemy of democracy.
00:13:33.480 | My political enemy is going to end the republic.
00:13:35.200 | My political enemy is going to be the person who destroys the country we live in.
00:13:39.100 | And so that person has to be stopped by any means necessary.
00:13:44.440 | And that's dangerous stuff.
00:13:46.540 | So the communists had to be stopped in Nazi Germany and so they're the devil.
00:13:51.220 | So any useful buffoon, as long as they're effective against the communists, would do.
00:13:58.660 | Do you ever wonder, because the people that are participating in evil may not understand
00:14:02.280 | that they're doing evil.
00:14:04.260 | Do you ever sit back, you know, in the quiet of your mind and think, "Am I participating
00:14:09.340 | in evil?"
00:14:10.340 | I mean, so my business partner and I, one of our favorite memes is from, there's a British
00:14:17.100 | comedy show, the name escapes me, of these two guys who are members of the SS and they're
00:14:21.980 | dressed in the SS uniforms and the black uniforms, they put the skulls on them and they're saying
00:14:25.440 | to each other, one says to the other guy, "You notice the British, their symbol is
00:14:29.620 | something nice and it's like an eagle.
00:14:33.220 | But ours is a skull and crossbones.
00:14:34.780 | You see the Americans, you see their blue uniforms, very nice and pretty, ours are jet
00:14:38.940 | black.
00:14:39.940 | Are we the baddies?"
00:14:40.940 | And, you know, that's it.
00:14:43.660 | And the truth is we look back at the Nazis and we say, "Well, of course they were the
00:14:47.100 | baddies.
00:14:48.100 | They wore black uniforms and they had jackboots and they had this and that."
00:14:50.700 | And of course they were the bad guys, but evil rarely presents its face so clearly.
00:14:55.660 | So yeah, I mean, I think that you have to constantly be thinking along those lines and
00:15:00.180 | hopefully you try to avoid it.
00:15:02.580 | You can only do the best that a human being can do.
00:15:04.780 | But yeah, I mean, the answer is yes.
00:15:07.660 | I would say that I spend an inordinate amount of time reflecting on whether I'm doing the
00:15:14.440 | right thing.
00:15:15.440 | And I may not always do the right thing.
00:15:16.740 | I'm sure a lot of people think that I'm doing the wrong thing on a daily basis, but it's
00:15:20.220 | definitely a question that has to enter your mind as a historically aware and hopefully
00:15:26.020 | morally decent person.
00:15:27.020 | Do you think you're mentally strong enough if you realize that you're on the wrong side
00:15:32.660 | of history to switch sides?
00:15:35.020 | Very few people in history seem to be strong enough to do that.
00:15:37.500 | I mean, I think that the answer I hope would be yes.
00:15:41.260 | You never know until the time comes and you have to do it.
00:15:43.740 | I will say that having heterodox opinions in a wide variety of areas is something that
00:15:50.060 | I have done before.
00:15:51.060 | I'm the only person I've ever heard of in public life who actually has a list on their
00:15:57.020 | website of all the dumb, stupid things I've ever said.
00:16:00.500 | So where I go through and I either say, "This is why I still believe this," or, "This is
00:16:04.580 | why what I said was terrible and stupid."
00:16:07.820 | And I'm sure that list will get a lot longer as the years go on.
00:16:09.780 | Yeah, I look forward to new additions to that list.
00:16:12.380 | Yeah, exactly.
00:16:13.380 | It actually is a super, super long list.
00:16:15.300 | People should check it out.
00:16:16.300 | And it's quite honest and raw.
00:16:19.100 | What do you think about, it's interesting to ask you given how pro-life you are, about
00:16:25.940 | Ye's comments about comparing the Holocaust to the 900,000 abortions in the United States
00:16:32.860 | a year.
00:16:33.860 | So I'll take this from two angles.
00:16:35.860 | As a pro-life person, I actually didn't find it offensive because if you believe, as I
00:16:40.140 | do, that unborn and preborn lives deserve protection, then the slaughter of just under
00:16:46.500 | a million of them every year for the last almost 50 years is a historic tragedy on par
00:16:51.340 | with a Holocaust.
00:16:53.500 | From the outside perspective, I get why people would say there's a difference in how people
00:16:56.860 | view the preborn as to how people view, say, a seven-year-old who's being killed in the
00:17:00.700 | Holocaust.
00:17:01.700 | Like the visceral power and evil of the Nazis shoving full-grown human beings and small
00:17:06.660 | children into gas chambers can't be compared to a person who, even from a pro-life perspective,
00:17:12.420 | may not fully understand the consequences of their own decisions or from a pro-choice
00:17:15.780 | perspective, fully understands the consequences but just doesn't think that that person is
00:17:18.820 | a person, that that's actually different.
00:17:21.300 | I understand both sides of it.
00:17:23.020 | I wasn't offended by Ye's comments in that way, though, because if you're a pro-life
00:17:27.940 | human being, then you do think that what's happening is a great tragedy on scale that
00:17:31.620 | involves the dehumanization of an entire class of people, the preborn.
00:17:36.980 | So the philosophical, you understand the comparison?
00:17:39.180 | I do.
00:17:40.180 | Sure.
00:17:41.580 | So in his comments, in the jumping from the individual to the group, I'd like to ask you,
00:17:47.620 | you're one of the most effective people in the world at attacking the left, and sometimes
00:17:52.660 | it can slip into attacking the group.
00:17:56.220 | Do you worry that that's the same kind of oversimplification that Ye is doing about
00:18:00.660 | Jewish people that you can sometimes do with the left as a group?
00:18:05.340 | So when I speak about the left, I'm speaking about a philosophy.
00:18:09.260 | And I'm not really speaking about individual human beings as the leftists-like group and
00:18:14.500 | then try to name who the members of this individual group are.
00:18:16.900 | I also make a distinction between the left and liberals.
00:18:19.300 | There are a lot of people who are liberal who disagree with me on taxes, disagree with
00:18:23.140 | me on foreign policy, disagree with me on a lot of things.
00:18:25.820 | The people who I'm talking about generally, and I talk about the left in the United States,
00:18:29.380 | are people who believe that alternative points of view ought to be silenced because they
00:18:33.460 | are damaging and harmful simply based on the disagreement.
00:18:36.900 | So that's one distinction.
00:18:38.740 | The other distinction, again, is when I talk about the right versus the left, typically
00:18:41.100 | I'm talking about a battle of competing philosophies.
00:18:44.220 | And so I'm not speaking about typically – it would be hard to – if you put a person in
00:18:48.820 | front of me and said, "Is this person of the left or of the right?"
00:18:51.540 | Having just met them, I wouldn't be able to label them in the same way that if you
00:18:54.980 | met somebody with the name of Greenstein, you'd immediately go "Jew," or you met a black person,
00:18:58.740 | it's "black person."
00:18:59.740 | And the adherence to a philosophy makes you a member of a group.
00:19:05.220 | If I think the philosophy is bad, that doesn't necessarily mean that you as a person are
00:19:09.180 | bad, but it does mean that I think your philosophy is bad.
00:19:11.260 | Yeah, so the grouping is based on the philosophy versus something like a race, like the color
00:19:17.660 | of your skin, or race as in the case of the Jewish people.
00:19:20.700 | So it's a different thing.
00:19:23.000 | You can be a little bit more nonchalant and careless in attacking a group because it's
00:19:27.820 | ultimately attacking a set of ideas.
00:19:29.500 | Well, I mean, it really is attacking the set of ideas, and I don't
00:19:33.060 | know that nonchalant would be the way I'd put it.
00:19:34.980 | I try to be exact when you're, you know, you don't always hit, but you know, if I
00:19:39.340 | say that I oppose the communists, right, and then presumably I'm speaking of people who
00:19:44.940 | believe in the communist philosophy.
00:19:47.340 | Now the question is whether I'm mislabeling, right, whether I'm taking somebody who's
00:19:49.860 | not actually a communist and then shoving them in that group of communists, right, that
00:19:52.660 | would be inaccurate.
00:19:54.540 | The dangerous thing is it expands the group as opposed to you talking about the philosophy,
00:19:59.940 | you're throwing everybody who's ever said, "I'm curious about communism, I'm curious
00:20:04.020 | about socialism," because there's like a gradient.
00:20:06.820 | You know, it's like to throw something at you, I think Joe Biden said, "MAGA Republicans,"
00:20:13.100 | right?
00:20:14.100 | Right.
00:20:15.100 | You know, I think that's a very careless statement because the thing you jump to immediately
00:20:18.960 | is like all Republicans.
00:20:20.580 | Everyone who voted for Trump.
00:20:21.820 | For Trump.
00:20:22.820 | Right.
00:20:23.820 | Versus I think in the charitable interpretation, that means a set of ideas.
00:20:27.980 | Yeah, my actual problem with the MAGA Republicans line from Biden is that he went on in the
00:20:33.380 | speech that he made in front of Independence Hall to actually try and define what it meant
00:20:37.660 | to be a MAGA Republican who was a threat to the republic was the kind of language that
00:20:40.780 | he was using.
00:20:42.100 | And later on in the speech, he actually suggested, "Well, you know, there are moderate Republicans
00:20:45.780 | and the moderate Republicans are people who agree with me on like the Inflation Reduction
00:20:48.620 | Act."
00:20:49.620 | It's like, "Well, that can't be the dividing line between a MAGA Republican and a moderate
00:20:54.460 | Republican, like a moderate Republican is somebody who agrees with you.
00:20:56.260 | You got to name me like a Republican who disagrees with you fairly strenuously but is not in
00:21:01.420 | this group of threats to the republic."
00:21:03.340 | To make that distinction, we can have a fair discussion about whether the idea of election
00:21:06.620 | denial, for example, makes somebody a threat to institutions.
00:21:11.900 | That's a conversation that we can have and then we'll have to discuss how much power
00:21:14.420 | they have, what the actual perspective is, delve into it.
00:21:18.180 | But I think that he was being overbroad and sort of labeling all of his political enemies
00:21:21.900 | under one rubric.
00:21:22.900 | Now, again, in politics, this stuff sort of happens all the time.
00:21:25.100 | I'm not going to plead clean hands here because I'm sure that I've been inexact.
00:21:29.500 | But somebody, what would be good in that particular situation is for somebody to sort of read
00:21:33.860 | me back the quote and I'll let you know where I've been inaccurate.
00:21:36.380 | I'll try to do that.
00:21:37.380 | - And also you don't shy away from humor and occasional trolling and mockery and all that
00:21:41.980 | kind of stuff for the fun, for the chaos, all that kind of stuff.
00:21:44.860 | - I mean, I try not to do trollery for trollery's sake, but if the show's not entertaining and
00:21:50.460 | not fun, people aren't going to listen.
00:21:52.260 | And so if you can't have fun with politics, the truth about politics is we all take it
00:21:55.540 | very seriously because it has some serious ramifications.
00:21:58.380 | Politics is Veep, it is not House of Cards.
00:22:00.940 | The general rule of politics is that everyone is a moron unless proven otherwise, that virtually
00:22:06.240 | everything is done out of stupidity rather than malice, and that if you actually watch
00:22:10.100 | politics as a comedy, you'll have a lot more fun.
00:22:12.060 | And so the difficulty for me is I take politics seriously, but also I have the ability to
00:22:15.900 | sort of flip the switch and suddenly it all becomes incredibly funny because it really is.
00:22:21.060 | Like if you just watch it from a pure entertainment perspective and you put aside the fact that
00:22:24.060 | it affects hundreds of millions of people, then watching President Trump being president,
00:22:31.260 | I mean, he's one of the funniest humans who's ever lived.
00:22:33.860 | Watching Kamala Harris be Kamala Harris and talking about how much she loves Venn diagrams
00:22:37.880 | or electric buses, I mean, that's funny stuff.
00:22:40.060 | So if I can't make fun of that, then my job becomes pretty morose pretty quickly.
00:22:43.500 | - Yeah, it's funny to figure out what is the perfect balance between seeing the humor and
00:22:49.500 | the absurdity of the game of it versus taking it seriously enough because it does affect
00:22:55.340 | hundreds of millions of people.
00:22:57.000 | It's a weird balance to strike.
00:22:58.740 | It's like, I am afraid with the internet that everything becomes a joke.
00:23:03.140 | - I totally agree with this.
00:23:04.260 | I will say this.
00:23:05.260 | I try to make less jokes about the ideas and more jokes about the people in the same way
00:23:09.860 | that I make jokes about myself.
00:23:11.540 | I'm pretty self-effacing in terms of my humor.
00:23:14.020 | I would say at least half the jokes on my show are about me.
00:23:17.220 | When I'm reading ads for Tommy John and they're talking about their no wedgie guarantee, I'll
00:23:20.540 | say things like, "That would help me in high school," because it would have, I mean, just
00:23:24.540 | factually speaking.
00:23:25.540 | So if I can speak that way about myself, I feel like everybody else can take it as well.
00:23:31.500 | - Difficult question.
00:23:32.500 | In 2017, there was a mosque shooting in Quebec City.
00:23:36.420 | Six people died, five others seriously injured.
00:23:39.220 | The 27-year-old gunman consumed a lot of content online and checked Twitter accounts of a lot
00:23:45.660 | of people, but one of the people he checked quite a lot of is you, 93 times in the month
00:23:51.180 | leading up to the shooting.
00:23:53.220 | If you could talk to that young man, what would you tell him?
00:23:56.380 | And maybe other young men listening to this that have hate in their heart in that same
00:24:02.020 | way, what would you tell them?
00:24:03.620 | - You're getting it wrong.
00:24:05.540 | If anything that I or anyone else in mainstream politics says drives you to violence, you're
00:24:10.740 | getting it wrong.
00:24:12.020 | You're getting it wrong.
00:24:13.020 | Now, again, when it comes to stuff like this, I have a hard and fast rule that I've applied
00:24:17.180 | evenly across the spectrum, and that is I never blame people's politics for other people
00:24:22.620 | committing acts of violence unless they're actively advocating violence.
00:24:25.900 | So when a fan of Bernie Sanders shoots up a congressional baseball game, that is not
00:24:29.700 | Bernie Sanders' fault.
00:24:30.700 | I may not like his rhetoric, I may disagree with him on everything, Bernie Sanders did
00:24:33.340 | not tell somebody to go shoot up a congressional baseball game.
00:24:36.260 | When a nutcase in San Francisco goes and hits Paul Pelosi with a hammer, I'm not gonna blame
00:24:41.500 | Kevin McCarthy, the House Minority Leader, for that.
00:24:43.860 | When somebody threatens Brett Kavanaugh, I'm not gonna suggest that that was Joe Biden's
00:24:48.020 | fault because it's not Joe Biden's fault.
00:24:49.860 | We can play this game all day long, and I find that the people who are most intensely
00:24:53.880 | focused on playing this game are people who tend to oppose the politics of the person
00:24:58.260 | as opposed to actually believing sincerely that this has driven somebody into the arms
00:25:03.080 | of the god of violence.
00:25:05.940 | But I have 4.7 million Twitter followers, I have 8 million Facebook followers, I have
00:25:12.060 | 5 million YouTube followers, I would imagine that some of them are people who are violent,
00:25:18.060 | I would imagine that some of them are people who do evil things or want to do evil things,
00:25:22.380 | and I wish that there were a wand that we could wave that would prevent those people
00:25:26.500 | from deliberately or mistakenly misinterpreting things as a call to violence.
00:25:31.780 | It's just a negative byproduct of the fact that you can reach a lot of people, and so
00:25:36.860 | if somebody could point me to the comment that supposedly "drove somebody to go and literally
00:25:42.340 | murder human beings," then I would appreciate it so I could talk about the comment, but
00:25:49.020 | I don't, mainly because I just think that if we remove agency from individuals, and
00:25:57.500 | if we blame broad-scale political rhetoric for every act of violence, we're not gonna,
00:26:02.540 | the people who are gonna pay the price are actually the general population because free
00:26:05.980 | speech will go away.
00:26:07.260 | If the idea is that things that we say could drive somebody who is unbalanced to go do
00:26:11.980 | something evil, the necessary byproduct is that speech is a form of hate, hate
00:26:18.620 | is a form of violence, speech is a form of violence, speech needs to be curbed.
00:26:22.740 | And that to me is deeply disturbing.
00:26:25.060 | So, definitely he, that man, that 27-year-old man is the only one responsible for the evil
00:26:32.340 | he did, but what if he and others like him are not evil in those cases?
00:26:38.460 | What if they're people with pain, with anger in their heart?
00:26:42.900 | What would you say to them?
00:26:44.540 | You are exceptionally influential and other people like you that speak passionately about
00:26:50.020 | ideas, what do you think is your opportunity to alleviate the hate in their heart?
00:26:55.380 | If we're speaking about people who aren't mentally ill, and people who are just misguided,
00:27:00.340 | I'd say to him, the thing that I said to every other young man in the country, you need to
00:27:04.500 | find meaning and purpose in forming connections that actually matter in a belief system that
00:27:10.700 | actually promotes general prosperity and promotes helping other people.
00:27:16.580 | And this is why, you know, the message that I most commonly say to young men is it's time
00:27:21.020 | for you to grow up, mature, get a job, get married, have a family, take care of the people
00:27:24.900 | around you, become a useful part of your community.
00:27:28.020 | I've never at any point in my entire career suggested violence as a resort to political
00:27:34.420 | issues.
00:27:35.420 | And the whole point of having a political conversation is that it's a conversation.
00:27:40.300 | If I didn't think that it were worth trying to convince people, from my point of view,
00:27:43.660 | I wouldn't do what I do for a living.
00:27:45.740 | So violence doesn't solve anything?
00:27:47.300 | No, it doesn't.
00:27:49.820 | As if this wasn't already a difficult conversation.
00:27:55.380 | Let me ask about Ilhan Omar.
00:27:58.980 | You've called out her criticism of Israel policies as anti-Semitic.
00:28:03.840 | Is there a difference between criticizing a race of people, like the Jews, and criticizing
00:28:10.820 | the policies of a nation like Israel?
00:28:12.420 | Of course.
00:28:13.420 | Of course.
00:28:14.420 | I criticize the policies of Israel on a fairly regular basis.
00:28:15.780 | I would assume from a different angle than Ilhan Omar does.
00:28:18.580 | But yeah, I mean, I criticize the policies of a wide variety of states.
00:28:22.660 | And to take an example, I mean, I've criticized Israel's policy in giving control of the Temple
00:28:27.140 | Mount to the Islamic Waqf, which effectively prevents anybody except for Muslims from praying
00:28:31.020 | up there.
00:28:32.020 | I've also criticized the Israeli government for their COVID crackdown.
00:28:34.220 | I mean, you can criticize the policies of any government, but that's not what Ilhan
00:28:37.260 | Omar does.
00:28:38.260 | Ilhan Omar doesn't actually believe that there should be a state of Israel.
00:28:40.380 | She believes that Zionism is racism and that the existence of a Jewish state in Israel
00:28:45.820 | is in and of itself the great sin.
00:28:48.460 | That is a statement she would make about no other people in no other land.
00:28:51.580 | She would not say that the French don't deserve a state for the French.
00:28:54.260 | She wouldn't say that Somalis don't deserve a state in Somalia.
00:28:57.540 | She wouldn't say that Germans don't deserve a state in Germany.
00:29:00.740 | She wouldn't say for the 50 plus Islamic states that exist across the world that they don't
00:29:05.020 | deserve states of their own.
00:29:06.020 | It is only the Jewish state that has fallen under her significant scrutiny.
00:29:09.940 | And she also promulgates lies about one specific state in the form of suggesting, for example,
00:29:16.140 | that Israel is an apartheid state, which it is most eminently not considering that the
00:29:19.620 | last unity government in Israel included an Arab party, that there are Arabs who sit on
00:29:23.700 | the Israeli Supreme Court and all the rest.
00:29:25.740 | And then beyond that, obviously, she's engaged in some of the same sort of anti-Semitic tropes
00:29:28.940 | that you heard from Ye, right?
00:29:29.940 | The stuff about it's all about the Benjamins, that American support for Israel is all about
00:29:33.500 | the Benjamins.
00:29:34.500 | And she's had to be chided by members of her own party about this sort of stuff before.
00:29:37.820 | Can you empathize with the plight of Palestinian people?
00:29:41.220 | Absolutely.
00:29:42.220 | I mean, I, you know, some of the uglier things that I've ever said in my career are things
00:29:45.420 | that I said very early on when I was 17, 18, 19. I started writing a syndicated column when
00:29:48.860 | I was 17, I'm now 38.
00:29:50.380 | So virtually all the dumb things, I don't say virtually all, many of the dumb things,
00:29:53.660 | the plurality of the dumb things that I've said came from the ages of I'd say 17 to maybe
00:29:59.060 | And they are rooted, again, in sloppy thinking.
00:30:02.260 | I feel terrible for people who have lived under the thumb and currently live under the
00:30:06.460 | thumb of Hamas, which is a national terrorist group, or the Palestinian Authority, which
00:30:10.060 | is a corrupt oligarchy that steals money from its people and leaves them in misery, or Islamic
00:30:15.900 | Jihad, which is an actual terrorist group.
00:30:18.660 | The basic rule for the region, in my view, is if these groups were willing to make peace
00:30:24.940 | with Israel, they would have a state literally tomorrow.
00:30:27.420 | And if they are not, then there will be no peace.
00:30:29.700 | And it really is that simple.
00:30:30.700 | The formula that's typically used has become a bit of a bumper sticker, but
00:30:35.220 | it happens to be factually correct: if the Palestinians put down their guns tomorrow,
00:30:38.660 | there would be a state.
00:30:39.660 | If the Israelis put down their guns, there would be no Israel.
00:30:45.140 | You get attacked a lot on the internet.
00:30:47.340 | Oh, yeah.
00:30:48.340 | You noticed.
00:30:49.340 | I got to ask you about your own psychology.
00:30:52.900 | How do you not let that break you mentally?
00:30:56.340 | And how do you avoid letting that lead to a resentment of the groups that attack you?
00:31:02.020 | I mean, so there are a few sort of practical things that I've done.
00:31:04.900 | So for example, I would say that four years ago, Twitter was all consuming.
00:31:09.620 | Twitter is an ego machine, especially the notifications button, right?
00:31:12.220 | The notifications button is just people talking about you all the time.
00:31:14.980 | And the normal human tendency is, wow, people talking about me.
00:31:17.220 | I got to see what they're saying about me, which is a recipe for insanity.
00:31:20.420 | So my wife actually said, Twitter is making your life miserable.
00:31:23.860 | You need to take it off your phone.
00:31:24.860 | So Twitter is not on my phone.
00:31:26.420 | If I want to log on to Twitter, I have to go onto my computer and I have to make the
00:31:30.040 | conscious decision to go onto Twitter and then take a look at what's going on.
00:31:33.620 | I could just imagine you, like there's a computer in the basement. You descend into
00:31:36.100 | the darkness to check Twitter.
00:31:38.460 | - That's pretty much it.
00:31:39.460 | If you look at when I actually tweet, it's generally like in the run up to recording
00:31:42.980 | my show or when I'm prepping for my show later in the afternoon.
00:31:46.060 | For example.
00:31:47.340 | That doesn't affect you negatively mentally, like put you in a bad mental space?
00:31:51.060 | Not particularly if it's restricted to sort of what's being watched.
00:31:54.180 | Now I will say that I think the most important thing is you have to surround yourself with
00:31:59.220 | a group of people who you trust enough to make serious critiques of you when you're
00:32:03.580 | doing something wrong.
00:32:04.940 | But also you know that they have your best interests at heart because the Internet is
00:32:07.580 | filled with people who don't have your best interests at heart and who hate your guts.
00:32:10.540 | And so you can't really take those critiques seriously or it does wreck you.
00:32:14.340 | And the world is also filled with sycophants, right?
00:32:16.740 | Then the more successful you become, there are a lot of people who will tell you you're
00:32:20.120 | always doing the right thing.
00:32:21.420 | I'm very lucky.
00:32:22.420 | I got married when I was 24.
00:32:23.420 | My wife was 20.
00:32:24.420 | So she's known me long before I was famous or wealthy or anything.
00:32:27.940 | And so she's a good sounding board.
00:32:30.100 | I have a family that's willing to call me out on my bullshit as you talked to Ye about.
00:32:35.860 | I have friends who are able to do that.
00:32:38.460 | I try to have open lines of communications with people who I believe have my best interests
00:32:42.520 | at heart.
00:32:43.520 | But one of the sort of conditions of being friends is that when you see me do something
00:32:46.500 | wrong, I'd like for you to let me know that so I can correct it.
00:32:49.660 | I don't want to leave bad impressions out there.
00:32:51.300 | The sad thing about the Internet, just looking at the critiques you get, I see very few critiques
00:32:56.100 | from people that actually want you to succeed, want you to grow.
00:32:58.900 | I mean, they're very, they're not sophisticated.
00:33:01.340 | They're just, they're, I don't know, they're cruel.
00:33:05.180 | The critiques are just, it's not actual critiques.
00:33:07.500 | It's just cruelty.
00:33:08.500 | - And that's most of Twitter.
00:33:09.860 | I mean, as I said, Twitter is a place to smack and be smacked.
00:33:14.980 | I mean, anybody who uses Twitter for an intellectual conversation, I think, is engaging in a category
00:33:20.860 | error.
00:33:21.860 | - I use it to spread love.
00:33:22.860 | I think it's possible.
00:33:23.860 | - You're the only one.
00:33:24.860 | - It's you and no one else, my friend.
00:33:26.580 | - All right.
00:33:27.580 | Well, on that topic, what do you think about Elon buying Twitter?
00:33:30.300 | What do you like?
00:33:31.300 | What are you hopeful on that front?
00:33:34.740 | What would you like to see Twitter improve?
00:33:36.620 | - So I'm very hopeful about Elon buying Twitter.
00:33:39.780 | I mean, I think that Elon is significantly more transparent than what has taken place
00:33:44.920 | up till now.
00:33:46.380 | He seems committed to the idea that he's going to broaden the Overton window to allow for
00:33:49.340 | conversations that simply were banned before, everything ranging from efficacy of masks
00:33:54.340 | with regard to COVID to whether men can become women and all the rest.
00:33:58.300 | A lot of things that would get you banned on Twitter before without any sort of real
00:34:01.340 | explanation.
00:34:02.860 | It seems like he's dedicated to at least explaining what the standards are going to be and being
00:34:07.300 | broader and allowing a variety of perspectives on the outlet, which I think is wonderful.
00:34:12.060 | I think that's also why people are freaking out.
00:34:14.100 | I think the kind of wailing and gnashing of teeth and wearing of sackcloth and ash by
00:34:18.740 | so many members of the legacy media.
00:34:20.700 | I think a lot of that is because Twitter essentially was an oligarchy in which certain perspectives
00:34:27.180 | were allowed and certain perspectives just were not.
00:34:30.140 | And that was part of a broader social media reimposed oligarchy in the aftermath of 2017.
00:34:36.420 | So in order for, just to really understand, I think, what it means for Elon to take over
00:34:41.820 | Twitter, I think that we have to take a look at sort of the history of media in the United
00:34:45.340 | States in two minutes or less.
00:34:47.220 | In the United States, the media for most of its existence, at least from
00:34:51.860 | about the 1930s until the 1990s, was three major television networks,
00:34:56.500 | a couple of major newspapers and the wire services.
00:34:58.180 | Everybody had a local newspaper, and the wire services basically did all the foreign
00:35:01.740 | policy and national coverage: McClatchy, Reuters, AP, AFP, et cetera.
00:35:06.100 | So that monopoly or oligopoly existed until the rise of the internet.
00:35:10.420 | There were sort of pokes at it and talk radio and Fox News, but there certainly was not
00:35:14.420 | this plethora of sources.
00:35:15.660 | Then the internet explodes and all of a sudden you can get news everywhere.
00:35:19.260 | And the way that people are accessing that news is you're, I believe, significantly younger
00:35:23.500 | than I am, but we used to do this thing called bookmarking where you would bookmark a series
00:35:27.780 | of websites and then you would visit them every morning.
00:35:30.460 | And then social media came up.
00:35:32.780 | Is this on AOL or?
00:35:34.260 | Yeah, exactly.
00:35:35.260 | You had the dial up and it was actually a can connected to a string and you would actually
00:35:38.340 | just go, "Ehh, ehh."
00:35:41.740 | And then there came a point where social media arose and social media was sort of a boon
00:35:46.540 | for everybody because you no longer had to bookmark anything.
00:35:48.580 | You just followed your favorite accounts and all of them would pop up and you'd follow
00:35:51.580 | everything on Facebook and it would all pop up and it was all centralized.
00:35:54.140 | And for a while, everybody was super happy because this was the brand new wave of the
00:35:57.020 | future.
00:35:58.020 | It made everything super easy.
00:35:59.020 | Suddenly, outlets like mine were able to see new eyeballs because it was all centralized
00:36:03.180 | in one place, right?
00:36:04.180 | You didn't have to do it through Google optimization.
00:36:05.980 | You could now just put it on Facebook and so many eyeballs were on Facebook, you'd
00:36:08.740 | get more traffic.
00:36:09.780 | And everybody seemed pretty happy with this arrangement until precisely the moment Donald
00:36:13.100 | Trump became president.
00:36:14.100 | At that point, the sort of pre-existing supposition of a lot of the powers that be,
00:36:20.300 | which was that Democrats are going to continue winning from here on out, so we can sort of
00:36:24.420 | use these social media platforms as ways to push our information and still allow for there
00:36:29.500 | to be other information out there, fell apart.
00:36:31.260 | The immediate response was, "We need to reestablish this siphoning of information."
00:36:36.820 | It was misinformation and disinformation that won Donald Trump the election.
00:36:39.900 | We need to pressure the social media companies to start cracking down on misinformation and
00:36:43.020 | disinformation. And you can actually see this in the historical record.
00:36:46.420 | I mean, you can see how Jack Dorsey's talk about free speech shifted from about 2015
00:36:49.860 | to about 2018.
00:36:50.860 | You can see Mark Zuckerberg gave a speech at Georgetown in 2018 in which he talked about
00:36:54.700 | free speech and its value.
00:36:55.700 | And by 2019, he was going in front of Congress talking about how he was responsible for the
00:36:59.700 | stuff that was on Facebook, which is not true.
00:37:01.580 | He's not responsible for the stuff on Facebook, right?
00:37:03.620 | It's a platform.
00:37:04.620 | Is AT&T responsible for the stuff you say on your phone?
00:37:06.500 | The answer is typically no.
00:37:07.940 | So when that happened, all of these, because all the eyeballs had now been centralized
00:37:11.820 | in these social media sites, they were able to suddenly control what you could see and
00:37:15.900 | what you could not see.
00:37:17.380 | And the most obvious example was obviously leading up to 2020, the election, the killing
00:37:22.540 | of the Hunter Biden story is a great example of this.
00:37:25.200 | And so Elon coming in and taking over one of the social media services and saying, "I'm
00:37:28.980 | not playing by your rules, right?
00:37:30.820 | There's not going to be this sort of group of people in the halls of power who are going
00:37:35.260 | to decide what we can see and hear.
00:37:37.420 | Instead, I'm going to let a thousand flowers bloom.
00:37:39.260 | There'll be limits, but it's going to be on a more case-by-case basis.
00:37:42.220 | We're going to allow perspectives that are mainstream, but maybe not mainstream in the
00:37:47.940 | halls of academia or in the halls of media.
00:37:50.900 | Let those be said."
00:37:51.900 | I think it's a really good thing.
00:37:54.100 | Now that comes with some responsibilities on Elon's personal part, which would be,
00:37:59.340 | for example, I think, to be more responsible in the dissemination of information himself sometimes,
00:38:03.660 | right?
00:38:04.660 | I think he got himself in trouble the other day for tweeting out that story about Paul
00:38:07.940 | Pelosi that was speculative and untrue.
00:38:11.220 | And I don't think what he did is horrific.
00:38:14.460 | He deleted it when he found out that it was false.
00:38:17.140 | And that's actually free speech working, right?
00:38:18.540 | He said something wrong, people ripped into him.
00:38:20.620 | He realized he was wrong and he deleted it, which seems to me a better solution than preemptively
00:38:24.180 | banning content, which only raises more questions than it actually stops.
00:38:28.760 | With that said, as the face of responsible free speech, and that's sort of what he's
00:38:34.300 | pitching at Twitter, he, I think, should enact that himself and be a little more careful
00:38:38.260 | in the stuff that he tweets out.
00:38:39.500 | Well, that's a tricky balance.
00:38:41.540 | The reason a lot of people are freaking out is because, one, he's putting his thumb on
00:38:44.900 | the scale by saying he is more likely to vote Republican.
00:38:49.940 | He's showing himself to be center-right and sort of just having a political opinion versus
00:38:54.220 | being this amorphous thing that doesn't have a political opinion.
00:38:58.540 | I think if I were to guess, I haven't talked to him about it, but if I were to guess, he's
00:39:01.980 | sending a kind of signal that's important for Twitter, the company itself, because
00:39:06.900 | if we're being honest, most of the employees are left-leaning.
00:39:08.900 | So you have to kind of send a signal, like a resisting mechanism, to say: since
00:39:15.660 | most of the employees are left, it's good for Elon to be more right, to balance out the
00:39:22.540 | way the actual engineering is done, to say, we're not going to do any kind of activism
00:39:27.140 | inside the engineering.
00:39:28.340 | If I were to guess, that's kind of the effective aspect of that mechanism.
00:39:34.100 | The other one, by posting the Pelosi thing, is probably to expand the Overton window,
00:39:39.820 | like saying we can play, we can post stuff, we can post conspiracy theories, and then
00:39:46.020 | through discourse figure out what is and isn't true.
00:39:48.340 | Yeah, again, like I say, I mean, I think that that is a better mechanism in action than
00:39:53.540 | what it was before.
00:39:54.540 | It gave people who hate his guts the opening to kind of slap him for no reason, but I can
00:39:59.660 | see the strategy of it for sure.
00:40:01.740 | And I think that the general idea that he's kind of pushing right where the company had
00:40:07.740 | pushed left before, I think that there is actually unilateral polarization right now
00:40:12.460 | in politics, at least with regard to social media, in which one side basically says the
00:40:16.260 | solution to disinformation is to shut down free speech from the other side, and the other
00:40:23.260 | side is basically like people like me are saying the solution to disinformation is to
00:40:27.420 | let a thousand, like I'd rather have people on the left also being able to put out stuff
00:40:31.300 | that I disagree with than for there to be anybody who's sort of in charge of these social
00:40:34.860 | media platforms and using them as editorial sites.
00:40:37.500 | I mean, I'm not criticizing MSNBC for not putting on right wing opinions.
00:40:40.540 | I mean, that's fine.
00:40:41.540 | I run a conservative site.
00:40:42.540 | I'm, you know, we're not going to put up left wing opinions on a wide variety of issues
00:40:46.580 | because we are a conservative site.
00:40:47.860 | But if you pitch yourself as a platform, that's a different thing.
00:40:51.020 | If you pitch yourself as the town square, as Elon likes to call it, then I think Elon
00:40:55.580 | has a better idea of that than many of the former employees did, especially now that
00:40:59.340 | we have that report from The Intercept suggesting that there are people from Twitter working
00:41:03.040 | with DHS to monitor quote unquote disinformation and being rather vague about what disinformation
00:41:08.180 | meant.
00:41:09.180 | Yeah, I don't think activism has a place in what is fundamentally an engineering company
00:41:14.420 | that's building a platform.
00:41:17.700 | Like the people inside the company should not be putting a thumb on the scale of what
00:41:20.460 | is and isn't allowed.
00:41:21.460 | You should create a mechanism for the people to decide what is and isn't allowed.
00:41:25.900 | Do you think Trump should have been removed from Twitter?
00:41:31.340 | Should his account be restored?
00:41:33.100 | His account should be restored.
00:41:34.380 | And this is coming from somebody who really dislikes an enormous number of Donald Trump's
00:41:37.780 | tweets.
00:41:38.780 | Again, he's a very important political personage.
00:41:43.900 | Even if he weren't, I don't think that he should be banned from Twitter or Facebook
00:41:47.760 | in coordinated fashion.
00:41:48.760 | By the way, I hold that opinion about people who I think are far worse than Donald Trump.
00:41:56.060 | Everyone knows I'm not an Alex Jones guy.
00:41:57.340 | I don't like Alex Jones.
00:41:58.340 | I think Alex Jones purveys...
00:41:59.340 | Uh-oh, you think Alex should be back on Twitter?
00:42:01.540 | I do actually because I think that there are plenty of people who are willing to say that
00:42:06.260 | what he's saying is wrong.
00:42:08.260 | I'm not a big fan of this idea that because people I disagree with and people who have
00:42:13.420 | personally targeted me, by the way, I mean Alex Jones has said some things about me personally
00:42:17.900 | that I'm not real fond of.
00:42:18.900 | You guys not...
00:42:19.900 | Well, we're not besties.
00:42:20.900 | No, it turns out, yeah.
00:42:21.900 | All I've said is I don't really enjoy his show.
00:42:24.700 | He said some other stuff about the anti-Christ and such, but that's a bit of a different
00:42:29.020 | thing I suppose.
00:42:31.100 | Even so, I'm just not a big fan of this idea.
00:42:34.100 | I've defended people who have really gone after me on a personal level, who have targeted
00:42:38.300 | me. The town square is online.
00:42:43.420 | Banning people from the town square is unpersoning them.
00:42:46.300 | Unless you've violated a criminal statute, you should not be unpersoned in American society
00:42:50.940 | as a general rule.
00:42:52.540 | That doesn't mean that companies that are not platforms don't have the ability to respond
00:42:56.940 | to you.
00:42:57.940 | I think Adidas is right to terminate its contract with Kanye, for example. But Twitter
00:43:06.420 | ain't Adidas.
00:43:09.300 | So your stance on free speech, to the degree it's possible to achieve on a platform
00:43:14.820 | like Twitter, is you fight bad speech with more speech, with better speech.
00:43:23.740 | If Alex Jones and Trump were allowed back on in the coming months and years leading up
00:43:30.780 | to the 2024 election, you think that's going to make for a better world in the long term?
00:43:35.980 | I think that on the principle that people should be allowed to do this and the alternative
00:43:39.440 | being a group of thought bosses telling us what we can and cannot see, yes.
00:43:43.940 | I think in the short term it's going to mean a lot of things that I don't like very much.
00:43:47.140 | Sure.
00:43:48.140 | I mean, that's, them's the cost of doing business, you know?
00:43:50.980 | I think that one of the costs of freedom is people doing things that I don't particularly
00:43:55.020 | like.
00:43:56.020 | And I would prefer the freedom with all the stuff I don't like than not the freedom.
00:44:01.740 | Let me linger on the love a little bit.
00:44:04.420 | You and a lot of people are pretty snarky on Twitter, sometimes to the point of mockery,
00:44:09.900 | derision.
00:44:10.900 | A bit of, if I were to say, bad faith in the kind of mockery.
00:44:17.060 | And you see it as a war.
00:44:18.420 | I disagree with both you and Elon on this.
00:44:20.820 | Elon sees Twitter as a war zone, or at least has seen it that way in the past.
00:44:26.340 | Have you ever considered being nicer on Twitter?
00:44:30.420 | As a voice that a lot of people look up to?
00:44:33.180 | That if Ben Shapiro becomes a little bit more about love, that's going to inspire a lot
00:44:38.940 | of people?
00:44:39.940 | Or no, is it just too fun for you?
00:44:42.380 | The answer is yes.
00:44:43.380 | Sure, it's occurred to me.
00:44:44.380 | Let's put it this way.
00:44:45.380 | There are a lot of tweets that actually don't go out that I delete.
00:44:47.540 | I'll say that Twitter's new function, that 30-second function, is a friend of mine.
00:44:52.980 | Every so often I'll tweet something and I'll think about it a second.
00:44:55.860 | Like, do I need to say this?
00:44:57.180 | Probably not.
00:44:58.180 | Can you make a book published after you pass away of all the tweets that you didn't send?
00:45:06.180 | Oh no, my kids are still going to be around, I hope.
00:45:08.420 | That's true.
00:45:09.420 | That's the legacy.
00:45:10.420 | But yeah, I mean, sure.
00:45:12.340 | The answer is yes, and this is a good piece of what we would call in Orthodox Judaism,
00:45:16.020 | "mussar."
00:45:17.020 | This is like, he's giving you a mussar schmooze right now.
00:45:18.420 | This is like the kind of, be a better person stuff.
00:45:21.180 | I agree with you, I agree with you.
00:45:23.380 | And yeah, I will say that Twitter is sometimes too much fun.
00:45:26.700 | I try to be at least, if not even-handed, then equal opportunity in my derision.
00:45:35.380 | I remember that during the 2016 primaries I used to post rather snarky tweets about
00:45:40.700 | virtually all of the candidates, Republican and Democrat.
00:45:44.420 | And every so often I'll still do some of that.
00:45:47.700 | I do think actually the amount of snark on my Twitter feed has gone down fairly significantly.
00:45:50.860 | I think if you go back a couple of years it was probably a little more snarky.
00:45:53.740 | Today I'm trying to use it a little bit more in terms of strategy to get out information.
00:45:58.060 | Now that doesn't mean I'm not going to make jokes about, for example, you know, Joe Biden.
00:46:04.020 | I will make jokes about Joe Biden.
00:46:05.580 | He's the President of the United States.
00:46:07.540 | Nobody else will mock him.
00:46:08.540 | So the entire comedic establishment has decided they actually work for him.
00:46:12.140 | So the President of the United States, no matter who they are, gets the snark from your
00:46:16.500 | Twitter feed.
00:46:17.500 | Yes, yes.
00:46:18.500 | And President Trump, I think, is fairly aware that he got the snark from me as well.
00:46:21.700 | When it comes to snarking the President, I'm not going to stop that.
00:46:23.340 | I think the President deserves to be snarked.
00:46:24.700 | So you're not afraid of attacking Trump?
00:46:26.300 | No, I mean, I've done it before.
00:46:30.100 | Can you say what your favorite and least favorite things are about President Trump and President
00:46:36.300 | Biden one at a time?
00:46:38.020 | So maybe one thing that you can say is super positive about Trump and one thing super negative
00:46:43.300 | about Trump.
00:46:44.300 | Okay, so the super positive thing about Trump is that because he has no preconceived views
00:46:48.800 | that are establishmentarian, he's sometimes willing to go out of the box and do things
00:46:52.980 | that haven't been tried before.
00:46:54.580 | And sometimes that works.
00:46:55.580 | I mean, the best example being the entire foreign policy establishment telling him that
00:46:59.660 | he couldn't get a Middle Eastern deal done unless he centered the Palestinian-Israeli
00:47:03.340 | conflict and instead he just went right around that and ended up cutting a bunch of peace
00:47:06.660 | deals in the Middle East or moving the embassy to Jerusalem.
00:47:09.580 | Sometimes he does stuff and it's really out of the box and it actually works.
00:47:12.740 | And that's kind of awesome in politics and neat to see.
00:47:16.500 | The downside of Trump is that he has no capacity to use any sort of – there's no filter
00:47:24.780 | between brain and mouth.
00:47:26.300 | Whatever happens in his brain is the thing that comes out of his mouth.
00:47:28.620 | I know a lot of people find that charming and wonderful and it is very funny.
00:47:33.060 | But I don't think that it is a particularly excellent personal quality in a person who
00:47:37.540 | has as much responsibility as President Trump has.
00:47:39.580 | I think he says a lot of damaging and bad things on Twitter.
00:47:43.420 | I think that he seems consumed in some ways by his own grievances, which is why you've
00:47:50.220 | seen him focusing in on election 2020 so much.
00:47:52.800 | And I think that that is very negative about President Trump.
00:47:54.900 | So I'm very grateful to President Trump as a conservative for many of the things that
00:47:57.620 | he did.
00:47:58.620 | I think that a lot of his personality issues are pretty severe.
00:48:03.020 | What about Joe Biden?
00:48:05.780 | So I think that the thing that I like most about Joe Biden, I will say that Biden – two
00:48:13.220 | things.
00:48:14.220 | One, Biden seems to be a very good father by all available evidence.
00:48:20.900 | There are a lot of people who have put out kind of tape of him talking to Hunter and
00:48:24.260 | Hunter's having trouble with drugs or whatever.
00:48:26.060 | And I keep listening to that tape and thinking he seems like a really good dad.
00:48:30.700 | Like the stuff that he's saying to his son is stuff that God forbid if that were happening
00:48:33.500 | with my kid, I'd be saying to my kid.
00:48:35.860 | And so you can't help but feel for the guy.
00:48:37.820 | He's had an incredibly difficult go of it with his first wife and the death of members
00:48:43.340 | of his family and then Beau dying.
00:48:45.300 | I mean like that kind of stuff obviously is deeply sympathetic and he seems like a deeply
00:48:49.860 | sympathetic father.
00:48:51.780 | As far as his politics, he seems like a slap on the back kind of guy and I don't mind
00:48:57.940 | that.
00:48:58.940 | I think that's nice so far as it goes.
00:48:59.940 | It's sort of an old school politics where things are done with handshakes and personal
00:49:03.900 | relationships.
00:49:04.900 | The thing I don't like about him is I think sometimes that's really not genuine.
00:49:07.580 | I think that sometimes – I think that's his personal tendency but I think sometimes
00:49:12.500 | he allows the prevailing winds of his party to carry him to incredibly radical places
00:49:17.020 | and then he just doubles down on the radicalism in some pretty disingenuous ways.
00:49:23.860 | And there I would cite the Independence Day speech or the Independence Hall speech which
00:49:27.340 | I thought was truly one of the worst speeches I've seen a president give.
00:49:29.900 | So you don't think he's trying to be a unifier in general?
00:49:32.500 | Not at all.
00:49:33.500 | I mean that's what he was elected to do.
00:49:35.460 | He was elected to do two things, not be alive and be a unifier.
00:49:38.460 | Those were the two things.
00:49:39.460 | And when I say not be alive, I don't mean like physically dead.
00:49:42.260 | This is where the snark comes in.
00:49:44.100 | But what I do mean is that he is – he was elected to not be particularly activist.
00:49:50.660 | Basically the mandate was don't be Trump.
00:49:52.100 | Be sane.
00:49:53.100 | Don't be Trump.
00:49:54.100 | Calm everything down.
00:49:55.100 | And instead he got in and he's like, "What if we spend $7 trillion?
00:49:57.660 | What if we pull out of Afghanistan without any sort of plan?
00:50:00.860 | What if I start labeling all of my political enemies, enemies of the republic?
00:50:04.620 | What if I start bringing Dylan Mulvaney to the White House and talking about how it is
00:50:09.980 | a moral sin to prevent the genital mutilation of minors?"
00:50:13.140 | And this kind of stuff is very radical stuff.
00:50:15.340 | And this is not a president who has pursued a unifying agenda, which is why his approval
00:50:18.820 | rating sank from 60% when he entered office to low 40s or high 30s today.
00:50:23.580 | Unlike President Trump, who never had a high approval rating, right?
00:50:25.940 | Trump came into office and he had like a 45% approval rating.
00:50:28.340 | And when he left office, he had about a 43% approval rating.
00:50:31.100 | It bounced around between 45 and 37 pretty much his entire presidency.
00:50:34.180 | Biden went from being a very popular guy coming in to a very unpopular guy right now.
00:50:38.140 | And if you're Joe Biden, you should be looking in the mirror and wondering exactly why.
00:50:41.660 | Do you think that pulling out from Afghanistan could be flipped as a pro for Biden in terms
00:50:46.020 | of he actually did it?
00:50:47.380 | I think it's going to be almost impossible.
00:50:49.060 | I think the American people are incredibly inconsistent about their own views on foreign
00:50:53.660 | policy.
00:50:55.180 | In other words, we like to be isolationist until it comes time for us to be defeated
00:50:58.700 | and humiliated.
00:51:00.540 | When that happens, we tend not to like it very much.
00:51:03.460 | You mentioned Biden being a good father.
00:51:06.140 | Can you make the case for and against the Hunter Biden laptop story for it being a big
00:51:12.580 | deal and against it being a big deal?
00:51:15.180 | Sure.
00:51:16.180 | So the case for it being a big deal is basically twofold.
00:51:18.580 | One is that it is clearly relevant if the president's son is running around to foreign
00:51:23.660 | countries picking up bags of cash because his last name is Biden while his father is
00:51:28.580 | vice president of the United States.
00:51:31.220 | And it raises questions as to influence peddling for either the vice president or the former
00:51:35.900 | vice president using political connections.
00:51:37.740 | Did he make any money?
00:51:38.780 | Who was the big guy?
00:51:39.780 | Right.
00:51:40.780 | All these open questions that obviously implicate the questions to be asked.
00:51:44.580 | And then the secondary reason that the story is big is actually because the reaction to
00:51:47.500 | the story, the banning of the story, is in and of itself a major story.
00:51:51.100 | If there's any story that implicates a presidential candidate in the last month of an election
00:51:55.620 | and there is a media blackout, including a social media blackout, that obviously raises
00:51:59.820 | some very serious questions about informational flow and dissemination in the United States.
00:52:03.540 | So no matter how big of a deal the story is, it is a big deal that there's censorship
00:52:08.300 | of any relevant story.
00:52:10.220 | When there's a coordinated collusive blackout, yeah, that's a serious and major problem.
00:52:14.580 | So those are the two reasons why it would be a big story.
00:52:17.020 | The two reasons, a reason why it would not be a big story perhaps is if it turns out,
00:52:23.580 | and we don't really know this yet, but let's say that Hunter Biden was basically off on
00:52:27.620 | his own doing what he was doing, being a derelict or a drug addict or acting badly, and his
00:52:33.780 | dad had nothing to do with it and Joe was telling the truth.
00:52:36.300 | But the problem is we never actually got those questions answered.
00:52:38.580 | So if it had turned out to be a nothing of a story, the nice thing about stories that
00:52:41.500 | turn out to be nothing is that after they turn out to be nothing, they're nothing.
00:52:45.060 | The biggest problem with this story is that it wasn't allowed to take the normal life
00:52:48.940 | cycle of a story, which is original story breaks, follow-on questions are asked, follow-on
00:52:53.340 | questions are answered.
00:52:55.100 | Story is either now a big story or it's nothing.
00:52:58.020 | When the life cycle of a story is cut off right at the very beginning, right when it's
00:53:02.060 | born, then that allows you to speculate in any direction you want.
00:53:04.660 | You can speculate, it means nothing.
00:53:06.460 | It's nonsense.
00:53:07.460 | It's Russian.
00:53:08.460 | It's a Russian laptop.
00:53:09.460 | It's disinformation.
00:53:10.460 | Or on the other hand, this means that Joe Biden was personally calling Hunter and telling
00:53:13.700 | him to pick up a sack of cash over in Beijing, and then he became president and he's influence
00:53:17.620 | peddling.
00:53:18.620 | So this is why it's important to allow these stories to go forward.
00:53:21.320 | So this is why actually the bigger story for the moment is not the laptop.
00:53:24.800 | It's the reaction to the laptop because it cut off that life cycle of the story.
00:53:28.540 | And then, you know, at some point, I would assume that there will be some follow-on questions
00:53:32.300 | that are actually answered.
00:53:33.300 | I mean, the house is pledging if it goes Republican to investigate all of this.
00:53:36.580 | Again, I wouldn't be supremely surprised if it turns out that there was no direct
00:53:41.460 | involvement of Joe in this sort of stuff, because it turns out, as I said before, that
00:53:45.320 | all of politics is Veep.
00:53:47.060 | And this is always the story with half the scandals that you see is that everybody assumes
00:53:50.260 | that there's some sort of deep and abiding, clever plan that some politician is implementing.
00:53:56.620 | And then you look at it and it turns out, no, it's just something dumb.
00:53:58.900 | Right.
00:53:59.900 | The sort of perfect example of this, you know, is President Trump with the classified
00:54:02.540 | documents in Mar-a-Lago.
00:54:03.900 | So people on the left are like, it's probably nuclear codes.
00:54:06.300 | Probably he's taking secret documents and selling them to the Russians or the Chinese.
00:54:09.660 | And the real most obvious explanation is Trump looked at the papers and he said, I like these
00:54:13.820 | papers and then he just decided to keep them.
00:54:15.700 | Right.
00:54:16.700 | And then people came to him and said, Mr. President, you're not allowed to keep those
00:54:18.220 | papers.
00:54:19.220 | Who are those people?
00:54:20.220 | And he said, I'm putting them in the other room in a box.
00:54:24.060 | Which is that it is highly likely that that is what happened.
00:54:27.820 | And it's very disappointing to people, I think, when they realize the human brain, I mean,
00:54:32.020 | you know this better than I do, but the human brain is built to find patterns.
00:54:34.500 | Right.
00:54:35.500 | It's what we like to do.
00:54:36.500 | We like to find plans and patterns because this is how we survived in the wild: you
00:54:38.260 | found a plan, you found a pattern, you cracked the code of the universe.
00:54:41.460 | When it comes to politics, the conspiracy theories that we see so often, it's largely
00:54:45.700 | because we're seeing inexplicable events.
00:54:48.140 | Unless you just assume everyone's a moron.
00:54:49.620 | If you assume that there's a lot of stupidity going on, everything becomes quickly explicable.
00:54:53.420 | If you assume that there must be some rationale behind it, you have to come up with increasingly
00:54:57.820 | convoluted conspiracy theories to explain just why people are acting the way that they're
00:55:01.100 | acting.
00:55:02.380 | And I find that, I won't say 100% of the time, but 94% of the time, the conspiracy theory
00:55:09.180 | turns out just to be people being dumb and then other people reacting in dumb ways to
00:55:14.220 | the original people being dumb.
00:55:15.660 | But it's also, to me in that same way, very possible, very likely that the Hunter Biden,
00:55:22.380 | Hunter Biden getting money in Ukraine, I guess, for consulting and all that kind of stuff
00:55:26.500 | is a nothing burger.
00:55:28.780 | He's qualified, he's getting money as he should.
00:55:31.060 | There's a lot of influence peddling in general that's not corrupt.
00:55:34.260 | I think the most obvious explanation there probably is that he was fake influence peddling,
00:55:39.780 | meaning he went to Ukraine and he's like, "Guess what?
00:55:41.340 | My dad's Joe."
00:55:42.540 | And they're like, "Well, you don't have any qualifications in oil and natural gas and
00:55:46.100 | you don't really have a great resume, but your dad is Joe."
00:55:48.700 | And then that was kind of the end of it.
00:55:49.700 | They gave him a bag of cash hoping he would do something.
00:55:51.100 | He never did anything.
00:55:52.100 | I think you're making it sound worse than it is.
00:55:53.540 | I think that in general, consulting is done in that way.
00:55:56.060 | Your name, it's not like you're- I agree with you.
00:55:58.820 | It's not like he is some rare case and this is an illustration of corruption.
00:56:03.560 | If you can criticize consulting, which I would- That's fair.
00:56:06.380 | Which they're basically not providing.
00:56:10.060 | You look at a resume and who's who.
00:56:12.020 | Like if you went to Harvard, I can criticize the same thing.
00:56:15.580 | If you have Harvard on your resume, you're more likely to be hired as a consultant.
00:56:20.420 | Maybe there's a network there of people that you know and you hire them in that same way.
00:56:24.620 | If your last name is Biden, if your last- There's a lot of last names that sound pretty
00:56:28.340 | good on a resume.
00:56:29.340 | For sure.
00:56:30.340 | For sure.
00:56:31.340 | And it's not like- Biden admitted that much, by the way, right?
00:56:32.540 | In an open interview, he was like, "If your last name weren't Biden, wouldn't you have
00:56:34.900 | got that job?"
00:56:35.900 | And he's like, "Probably not."
00:56:36.900 | And you're right.
00:56:37.900 | Yeah, but that's an honest- I agree with you.
00:56:39.740 | It's not like he's getting a ridiculous amount of money.
00:56:42.340 | He was getting a pretty standard consulting kind of money, which also would criticize
00:56:47.020 | because they get a ridiculous amount of money.
00:56:49.340 | But I sort of even to push back on the life cycle or to steel man the side that was concerned
00:56:56.420 | about the Hunter Biden laptop story, I don't know if there is a natural life cycle of a
00:57:01.140 | story because there's something about the virality of the internet that we can't predict
00:57:05.860 | that a story can just take hold and the conspiracy around it builds, especially around politics,
00:57:13.140 | where the interpretation, some popular sexy interpretation of a story that might not be
00:57:18.140 | connected to reality at all will become viral.
00:57:21.300 | And that, from Facebook's perspective, is probably what they're worried about: an organized
00:57:26.980 | misinformation campaign that makes up a sexy story or sexy interpretation of the vague
00:57:34.700 | story that we have.
00:57:36.580 | And that has an influence on the populace.
00:57:38.980 | I mean, I think that's true, but I think the question becomes who's the great adjudicator
00:57:42.100 | there, right?
00:57:43.100 | Who adjudicates when the story ought to be allowed to go through even a bad life cycle
00:57:47.580 | or allowed to go viral as opposed to not.
00:57:50.300 | Now, it's one thing if you want to say, "Okay, we can spot the Russian accounts that are
00:57:52.700 | actually promoting this stuff.
00:57:53.700 | They belong to the Russian government.
00:57:54.700 | Got to shut that down."
00:57:55.700 | I think everybody agrees.
00:57:56.920 | This is actually one of the slides that's happened linguistically that I really object
00:58:00.100 | to is the slide between disinformation and misinformation.
00:58:03.420 | Now, you notice there is this evolution.
00:58:05.220 | In 2017, there was a lot of talk about disinformation.
00:58:06.900 | It was Russian disinformation.
00:58:08.260 | The Russians were putting out deliberately false information in order to skew election
00:58:11.340 | results was the accusation.
00:58:12.980 | And then people started using disinformation or misinformation.
00:58:16.140 | And misinformation is either mistaken information or information that is "out of context."
00:58:20.880 | That becomes very subjective very quickly as to what out of context means.
00:58:24.740 | And it doesn't necessarily have to be from a foreign source.
00:58:26.780 | It can be from a domestic source, right?
00:58:27.900 | It could be somebody misinterpreting something here.
00:58:30.020 | It could be somebody interpreting something correctly, but PolitiFact thinks that it's
00:58:32.580 | out of context.
00:58:34.020 | That sort of stuff gets very murky very quickly.
00:58:36.300 | And so I'm deeply uncomfortable with the idea that Facebook, I mean, Zuckerberg was on with
00:58:41.060 | Rogan and talking about how the FBI had basically said to look out for Russian interference in the
00:58:46.540 | election.
00:58:47.540 | And then all of these people were out there saying that the laptop was Russian disinformation,
00:58:51.100 | so he basically shut it down.
00:58:53.100 | That sort of stuff is frightening, especially because it wasn't Russian disinformation.
00:58:55.420 | I mean, the laptop was real.
00:58:57.160 | And so the fact that you have people who seem to, let's put it this way, it seems as though,
00:59:04.240 | maybe this is wrong, it seems as though when a story gets killed preemptively like this,
00:59:07.840 | it is almost universally a story that negatively affects one side of the political aisle.
00:59:11.800 | I can't remember the last time there was a story on the right that was disinformation
00:59:16.840 | or misinformation where social media stepped in and they went, "We cannot have this.
00:59:20.680 | This cannot be distributed.
00:59:21.680 | We're all going to collude so that this information is not distributed."
00:59:25.520 | Maybe in response to the story being proved false, it gets taken down.
00:59:28.280 | But what made the Hunter Biden thing so amazing is that it wasn't really even a response to
00:59:31.880 | anything.
00:59:32.880 | It was like the story got posted, there were no actual doubts expressed as to the verified
00:59:37.240 | falsity of the story.
00:59:38.520 | It was just supposition that it had to be false and everybody jumped in.
00:59:41.120 | So I think that confirmed a lot of the conspiracy theories people had about social media and
00:59:45.280 | how it works.
00:59:46.280 | Yeah.
00:59:47.280 | So if the reason you want to slow down the viral spread of a thing is at all grounded
00:59:53.160 | in partisanship, that's a problem.
00:59:56.960 | You should be very honest with yourself and ask yourself that question.
01:00:00.280 | Is it because I'm on the left or on the right that I want to slow this down versus is it
01:00:04.960 | hate speech, bipartisan hate speech?
01:00:09.880 | Right.
01:00:11.520 | But it's really tricky.
01:00:14.040 | But like you, I'm very uncomfortable in general with any kind of slowing down, with any kind
01:00:17.820 | of censorship.
01:00:18.880 | But if there is something like a conspiracy theory that spreads hate, that becomes viral,
01:00:26.440 | I still lean toward letting that conspiracy theory spread because the alternative is dangerous,
01:00:32.960 | more dangerous.
01:00:33.960 | It's sort of like the ring of power, right?
01:00:35.480 | Like everybody wants the ring because with the ring you can stop the bad guys from going
01:00:38.680 | forward.
01:00:39.680 | But it turns out that the ring gives you enormous power and that power can be used in the wrong
01:00:42.840 | ways too.
01:00:44.660 | You head the Daily Wire, which I'm a member of.
01:00:49.120 | I appreciate that.
01:00:50.960 | Thank you.
01:00:51.960 | I recommend everybody sign up to it.
01:00:52.960 | It should be part of your regular diet, whether you're on the left or the right, the far
01:00:55.880 | left or the far right.
01:00:59.160 | Okay.
01:01:00.160 | That said, do you worry about the audience capture aspect of it?
01:01:04.800 | Because it is a platform for conservatives and you have a powerful voice on there.
01:01:12.540 | It might be difficult for you to go against the talking points or against the stream of
01:01:19.080 | ideas that is usually connected to conservative thought.
01:01:23.760 | Do you worry about that?
01:01:24.760 | I mean, the audience would obviously be upset with me and would have a right to be upset
01:01:28.240 | with me if I suddenly flipped all of my positions on a dime.
01:01:31.560 | I have enough faith in my audience that I can say things that I think are true and that
01:01:35.180 | may disagree with the audience, you know, on a fairly regular basis, I would say.
01:01:39.660 | But they understand that on the deeper principle, we're on the same side of the aisle, at least
01:01:43.600 | I hope that much from the audience.
01:01:45.440 | It's also why we provide a number of different views on the platforms, many of which I disagree
01:01:50.000 | with but are sort of within the generalized range of conservative thought.
01:01:55.360 | It's something I do have to think about every day though, yeah.
01:01:57.520 | I mean, you have to think about like, am I saying this because I'm afraid of ticking off
01:02:00.600 | my audience or am I saying this because I actually believe this?
01:02:04.960 | And that's a delicate dance a little bit.
01:02:07.760 | You have to be sort of honest with yourself.
01:02:09.620 | Yeah, somebody like Sam Harris is pretty good at this, at fighting it, at saying the most outrageous
01:02:17.640 | thing that he knows.
01:02:19.200 | He almost leans into it.
01:02:21.080 | He knows he'll piss off a lot of his audience.
01:02:24.600 | Sometimes you almost have to test the system.
01:02:27.520 | It's like you almost exaggerate your feelings just to make sure to send a
01:02:32.520 | signal to the audience that you're not captured by them.
01:02:36.520 | So speaking of people you disagree with, what is your favorite thing about Candace Owens
01:02:43.000 | and what is one thing you disagree with her on?
01:02:45.920 | Well, my favorite thing about Candace is that she will say things that nobody else will say.
01:02:50.760 | My least favorite thing about Candace is that she will say things that nobody else will say.
01:02:54.680 | I mean, listen, she says things that are audacious and I think need to be said sometimes.
01:03:00.100 | Sometimes I think that she is morally wrong.
01:03:02.200 | I think the way she responded to Kanye, I've said this clearly, was dead wrong and morally
01:03:06.680 | wrong.
01:03:07.680 | What was her response?
01:03:08.680 | Her original response was that she proffered confusion of what Ye was actually talking
01:03:13.280 | about.
01:03:15.080 | And then she was defending her friend.
01:03:17.200 | I wish that the way that she had responded was by saying, "He's my friend," and also
01:03:21.040 | he said something bad and anti-Semitic.
01:03:23.440 | I wish that she had said that.
01:03:25.800 | Right away.
01:03:26.800 | Right away.
01:03:27.800 | Yeah.
01:03:28.800 | I think you can also...
01:03:29.800 | This is the interesting human thing.
01:03:31.960 | You can be friends with people that you disagree with and you can be friends with people that
01:03:36.200 | actually say hateful stuff.
01:03:38.320 | And one of the ways to help alleviate hate is being friends with people that say hateful
01:03:43.800 | things.
01:03:44.800 | Yeah, and then calling them out on a personal level when they do say wrong or hateful things.
01:03:50.160 | Yeah, from a place of love and respect and privately.
01:03:53.440 | Privately is also a big thing, right?
01:03:54.440 | I mean, the public demand for denunciation from friends to friends is difficult.
01:04:02.080 | And I certainly have compassion for Candace given the fact that she's so close with Ye.
01:04:07.000 | Yeah, it breaks my heart sometimes, the public fights between friends and broken friendships.
01:04:12.360 | I've seen quite a few friendships publicly break over COVID.
01:04:17.080 | COVID made people behave their worst in many cases, which yeah, it breaks my heart a little
01:04:25.040 | bit because the human connection is a prerequisite for effective debate and discussion and battles
01:04:33.600 | over ideas.
01:04:35.880 | Has there been any argument from the opposite political aisle that has made you change your
01:04:39.440 | mind about something?
01:04:42.280 | If you look back?
01:04:45.120 | So I will say that the, I'm thinking it through because I think that my views probably on
01:04:54.120 | foreign policy have morphed somewhat.
01:04:56.240 | I would say that I was much more interventionist when I was younger.
01:04:58.960 | I'm significantly less interventionist now.
01:05:00.680 | I'd probably give myself-
01:05:01.680 | Can you give an example?
01:05:02.680 | Sure.
01:05:03.680 | I was a big backer of the Iraq war.
01:05:04.840 | I think now in retrospect, I might not be a backer of the Iraq war if the same situation
01:05:09.040 | arose again.
01:05:10.920 | Based on the amount of evidence that had been presented or based on the sort of willingness
01:05:16.840 | of the American public to go along with it.
01:05:19.200 | If you're going to get involved in a war, you have to know what the end point looks
01:05:21.280 | like and you have to know what the American people really are willing to bear.
01:05:24.120 | The American people are not willing to bear open-ended occupations.
01:05:29.520 | Knowing that, you have to consider that going in.
01:05:33.040 | On foreign policy, I've become a lot more of a, let's say almost Henry Kissinger realist
01:05:37.320 | in some ways.
01:05:40.120 | When it comes to social policy, I would say that I'm fairly strong where I was.
01:05:47.320 | I may have become slightly convinced actually by more of the conservative side of the aisle
01:05:52.160 | on things like drug legalization.
01:05:53.160 | I think when I was younger, I was much more pro-drug legalization than I am now, at least
01:05:57.280 | on the local level.
01:05:58.280 | On a federal level, I think the federal government can't really do much other than close the
01:06:02.520 | borders with regard to fentanyl trafficking, for example.
01:06:04.800 | When it comes to how drugs wreck local communities, you can see how drugs wreck local communities
01:06:08.180 | pretty easily.
01:06:09.180 | - Which is weird because I saw you smoke a joint right before this conversation.
01:06:13.320 | - It's my biggest thing.
01:06:14.320 | I mean, I try to keep that secret.
01:06:15.320 | - All right.
01:06:16.320 | Well, that's interesting about intervention.
01:06:20.000 | Can you comment about the war in Ukraine?
01:06:21.520 | So for me, it's a deeply personal thing, but I think you're able to look at it from a geopolitics
01:06:28.440 | perspective.
01:06:29.440 | What is the role of the United States in this conflict, before the conflict, during the
01:06:32.840 | conflict, and right now in helping achieve peace?
01:06:38.040 | - I think before the conflict, the big problem is that the West took almost the worst possible
01:06:43.540 | view, which was encourage Ukraine to keep trying to join NATO and the EU, but don't
01:06:47.940 | let them in.
01:06:49.140 | And so what that does is it achieves the purpose of getting Russia really, really, really ticked
01:06:53.020 | off and feeling threatened, but also does not give any of the protections of NATO or
01:06:57.820 | the EU to Ukraine.
01:06:59.020 | I mean, Zelensky is on film when he was a comedy actor making that exact joke, right?
01:07:04.020 | He has Merkel on the other line, and she's like, "Oh, welcome to NATO."
01:07:07.780 | And he's like, "Great."
01:07:08.780 | She's like, "Wait, is this Ukraine on the line?"
01:07:10.660 | Oops.
01:07:11.660 | But so, you know, that sort of policy is sort of nonsensical.
01:07:15.620 | If you're gonna offer alliance to somebody, offer alliance to them.
01:07:18.220 | And if you're going to guarantee their security, guarantee their security.
01:07:21.000 | And the West failed signally to do that.
01:07:23.640 | So that was mistakes in the run-up to the war.
01:07:26.240 | Once the war began, then the responsibility of the West began and became to give Ukraine
01:07:32.020 | as much materiel as is necessary to repel the invasion.
01:07:36.740 | And the West did really well with that.
01:07:39.100 | I think we were late on the ball in the United States.
01:07:41.100 | It seemed like Europe led the way a little bit more than the United States did there.
01:07:44.300 | But in terms of effectuating American interests in the region, which being an American is
01:07:49.980 | what I'm chiefly concerned about, you know, the American interests were several fold.
01:07:54.580 | One is preserve borders.
01:07:56.340 | Two is degrade the Russian aggressive military because Russia's military has been aggressive.
01:08:02.080 | And they are a geopolitical rival of the United States.
01:08:04.380 | Three, recalibrate the European balance with China.
01:08:07.860 | Europe was sort of balancing with Russia and China.
01:08:09.860 | And then because of the war, they sort of rebalanced away from China and Russia, which
01:08:13.580 | is a real geostrategic opportunity for the United States.
01:08:18.020 | It seemed like most of those goals have already been achieved at this point for the United
01:08:20.980 | States.
01:08:21.980 | And so then the question becomes, what's the off ramp here?
01:08:24.140 | And what is the thing you're trying to prevent?
01:08:25.500 | So what's the best opportunity?
01:08:27.100 | What's the best case scenario?
01:08:28.100 | What's the worst case scenario?
01:08:29.100 | And then what's realistic?
01:08:30.100 | The best case scenario is Ukraine forces Russia entirely out of Ukraine, including Luhansk,
01:08:34.420 | Donetsk and Crimea, right?
01:08:35.420 | That's the best case scenario.
01:08:36.940 | Virtually no one thinks that's accomplishable, including the United States, right?
01:08:39.860 | The White House has basically said as much.
01:08:41.580 | It's difficult to imagine, particularly Crimea, the Russians being forced out of Crimea.
01:08:46.700 | The Ukrainians have been successful in pushing the Russians out of certain parts of Luhansk
01:08:50.220 | and Donetsk.
01:08:51.220 | But the idea they're going to be able to push the entire Russian army completely back to
01:08:54.740 | the Russian borders, that would be at best a very, very long and difficult slog.
01:08:59.420 | In the middle of a collapsing Ukrainian economy, which is a point that Zelensky has made.
01:09:02.980 | It's like, it's not enough for you guys to give us military aid.
01:09:04.980 | We're in the middle of a war.
01:09:05.980 | We're going to need economic aid as well.
01:09:06.980 | So it's a pretty open-ended and strong commitment.
01:09:08.980 | - Can I take a small tangent on that?
01:09:10.460 | - Sure.
01:09:11.460 | - Best case scenario, if that does militarily happen, including Crimea, do you think there's
01:09:17.060 | a world in which Vladimir Putin would be able to convince the Russian people that this was
01:09:25.340 | a good conclusion to the war?
01:09:26.940 | - Right, so the problem is that the best case scenario might also be the worst case scenario,
01:09:30.740 | meaning that there are a couple of scenarios that are sort of the worst case scenario.
01:09:33.900 | And this is sort of the puzzlement of the situation.
01:09:36.620 | One is that Putin feels so boxed in, so unable to go back to his own people and say, "We
01:09:40.740 | just wasted tens of thousands of lives here for no reason," that he unleashed a tactical
01:09:45.820 | nuclear weapon on the battlefield.
01:09:47.460 | Nobody knows what happens after that.
01:09:48.540 | So we put NATO planes in the air to take out Russian assets.
01:09:51.860 | Do Russians start shooting down planes?
01:09:54.380 | Does Russia then threaten to escalate even further by attacking an actual NATO civilian
01:09:58.180 | center or even a Ukrainian civilian center with nuclear weapons?
01:10:01.740 | Where it goes from there, nobody knows because nuclear weapons haven't been used since 1945.
01:10:05.700 | So that is a worst case scenario.
01:10:08.060 | It's an unpredictable scenario that could devolve into really, really significant problems.
01:10:12.660 | The other worst case scenario, could be a best case scenario, could be a worst, we just
01:10:15.500 | don't know, is Putin falls.
01:10:17.820 | What happens after that?
01:10:19.280 | Who takes over for Putin?
01:10:20.780 | Is that person more moderate than Putin?
01:10:22.880 | Is that person a liberalizer?
01:10:24.460 | It probably won't be Navalny.
01:10:26.140 | If he's going to be ousted, it'll probably be somebody who's a top member of Putin's brass
01:10:30.340 | right now and has the capacity to control the military.
01:10:33.820 | Or it's possible the entire regime breaks down.
01:10:35.620 | What you end up with is Syria in Russia, where you just have an entirely out of control
01:10:40.100 | region with no centralizing power, which is also a disaster area.
01:10:44.100 | And so in the nature of risk mitigation, in sort of an attempt at risk mitigation, what
01:10:49.660 | actually should be happening right now is some off ramp has to be offered to Putin.
01:10:55.100 | The off ramp likely is going to be him maintaining Crimea and parts of Luhansk and Donetsk.
01:10:58.780 | It's probably gonna be a commitment by Ukraine not to join NATO formally, but a guarantee
01:11:05.940 | by the West to defend Ukraine in case of an invasion of its borders again by Russia, like
01:11:11.100 | an actual treaty obligation.
01:11:12.100 | Not like the BS treaty obligation when Ukraine gave up its nuclear weapons in the '90s.
01:11:18.860 | And that is likely how this is going to have to go.
01:11:21.660 | The problem is that requires political courage, not from Zelensky.
01:11:25.700 | It requires courage from probably Biden.
01:11:28.180 | Because Zelensky is not in a political position where he can go back to his own people who
01:11:32.020 | have made unbelievable sacrifices on behalf of their nation and freedom and say to them,
01:11:35.860 | "Guys, now I'm calling it quits.
01:11:37.580 | We're going to have to give them Luhansk, Donetsk, and give Putin an off ramp."
01:11:39.660 | I don't think that's an acceptable answer to most Ukrainians at this point in time from
01:11:42.340 | the polling data and from the available data we have on the ground.
01:11:45.420 | It's going to actually take Biden biting the bullet and being the bad guy and saying to
01:11:49.500 | Zelensky, "Listen, we've made a commitment of material aid.
01:11:53.340 | We're offering you all these things, including essentially a defense pact.
01:11:57.500 | We're offering you all this stuff.
01:11:58.540 | But if you don't come to the table, then we're going to have to start weaning you off.
01:12:02.580 | Like there will have to be a stick there.
01:12:03.700 | It can't just be a carrot."
01:12:05.460 | And so that will allow Zelensky, if Biden were to do that, it would allow Zelensky to
01:12:08.820 | blame Biden for the solution everybody knows has to happen.
01:12:11.100 | Zelensky can go back to his own people and he can say, "Listen, this is the way it has
01:12:15.260 | to go.
01:12:16.260 | I don't want it to go this way, but it's not my...
01:12:17.740 | I'm signing other people's checks.
01:12:18.940 | I mean, it's not my money."
01:12:22.300 | And Biden would take the hit because he wouldn't then be able to blame Ukraine for whatever
01:12:26.420 | happens next, which has been the easy road off, I think, for a lot of politicians in
01:12:29.500 | the West is for them to just say, "Well, this is up to the Ukrainians to decide.
01:12:32.620 | It's up to the Ukrainians to decide."
01:12:34.060 | Well, is it totally up to the Ukrainians to decide?
01:12:36.900 | Because it seems like the West is signing an awful lot of checks and all of Europe is
01:12:39.860 | going to freeze this winter.
01:12:41.620 | So...
01:12:42.620 | This is the importance of great leadership, by the way.
01:12:45.140 | That's why the people we elect is very important.
01:12:48.900 | Do you think there's power to just one-on-one conversation where Biden sits down with Zelensky
01:12:56.740 | and Biden sits down with Putin almost in person?
01:12:59.900 | Because I...
01:13:00.900 | Or maybe I'm romanticizing the notion, but having done these podcasts in person, I think
01:13:05.180 | there's something fundamentally different than through a remote call and also like a
01:13:10.300 | distant kind of recorded political type speak versus like man-to-man.
01:13:16.860 | So I'm deeply afraid that Putin outplays people in the one-on-one scenarios because he's done
01:13:22.020 | it to multiple presidents already.
01:13:23.980 | He gets in one-on-one scenarios with Bush, with Obama, with Trump, with Biden, and he
01:13:28.140 | seems to be a very canny operator and a very sort of hard-nosed operator in those situations.
01:13:33.060 | I think that if you were going to do something like that, like an actual political face-to-face
01:13:36.260 | summit, what you would need is for Biden to first have a conversation with Zelensky, where
01:13:39.780 | Zelensky knows what's going on, so he's aware.
01:13:43.100 | And then Biden walks in and he says to Putin on camera, "Here's the offer.
01:13:49.220 | Let's get it together.
01:13:50.720 | Let's make peace.
01:13:52.460 | You get to keep this stuff."
01:13:54.360 | And then let Putin respond how Putin is going to respond.
01:13:59.340 | But the big problem for Putin, I think, and the problem with public-facing fora, maybe
01:14:03.300 | it's a private meeting.
01:14:04.300 | If it's a private meeting, maybe that's the best thing.
01:14:05.780 | If it's a public-facing forum, I think it's a problem because Putin's afraid of being
01:14:08.180 | humiliated at this point.
01:14:09.740 | If it's a private meeting, then sure, except that, again, I wonder whether when it comes
01:14:17.920 | to a person as canny as Putin and to a politician that I really don't think is a particularly
01:14:25.300 | sophisticated player in Joe Biden.
01:14:27.180 | And again, this is not unique to Biden.
01:14:29.540 | I think that most of our presidents for the last 30, 40 years have not been particularly
01:14:34.660 | sophisticated players.
01:14:36.300 | I think that that's a risky scenario.
01:14:38.860 | Yeah, I still believe in the power of that because otherwise, I don't know, I don't think
01:14:46.500 | stuff on paper and political speak will solve these kinds of problems because from Zelensky's
01:14:51.340 | perspective, nothing but complete victory will do.
01:14:56.820 | As a nation, his people sacrificed way too much and they're all in.
01:15:00.740 | And if you look at, because I traveled to Ukraine, I spent time there, I'll be going
01:15:04.740 | back there, hopefully also going back to Russia.
01:15:07.540 | Just speaking to Ukrainians, they're all in, they're all in.
01:15:14.660 | Nothing but complete victory.
01:15:15.660 | Yeah, I'm with him.
01:15:16.660 | Yep, that's right.
01:15:17.660 | And so for that, the only way to achieve peace is through honest human-to-human conversation,
01:15:26.300 | giving both people a way to off-ramp, to walk away victorious.
01:15:31.380 | And some of that requires speaking honestly as a human being, but also for America to,
01:15:39.140 | actually not even America, honestly, just the president, be able to eat their own ego
01:15:43.740 | a bit and be the punching bag, just enough for both presidents to be able to walk away
01:15:49.700 | and say, "Listen, we got the American president to come to us."
01:15:54.940 | And I think that makes the president look strong, not weak.
01:15:58.860 | I mean, I agree with you.
01:15:59.980 | I think it would also require some people on the right, people like me, if it's Joe
01:16:03.260 | Biden, to say, "If Biden does that, I see what he's doing and it's the right move."
01:16:07.020 | I think one of the things that he's afraid of, to steel man him, I think one of the things
01:16:10.420 | he's afraid of is he goes and he makes that sort of deal and the right says, "You just
01:16:13.340 | cowered in front of Russia.
01:16:14.580 | You just gave away Ukraine," whatever it is.
01:16:17.740 | But it's going to require some people on the right to say that that move is the right move
01:16:21.780 | and then hold by it if Biden actually performs that move.
01:16:25.100 | - You're exceptionally good at debate.
01:16:28.820 | You wrote "How to Debate Leftists and Destroy Them."
01:16:32.060 | You're kind of known for this kind of stuff, just exceptionally skilled at conversation,
01:16:35.140 | at debate, at getting to the facts of the matter and using logic to get to the conclusion
01:16:42.380 | in the debate.
01:16:43.380 | Do you ever worry that this power, talk about the ring, this power you were given has corrupted
01:16:51.620 | you and your ability to pursue the truth versus just winning debates?
01:16:58.580 | - I hope not.
01:16:59.580 | I mean, so I think one of the things that's kind of funny about the branding versus the
01:17:02.660 | reality is that most of the things that get characterized as destroying in debates with
01:17:07.020 | facts and logic, most of those things are basically me having a conversation with somebody
01:17:11.500 | on a college campus.
01:17:13.120 | It actually isn't like a formal debate where we sit there and we critique each other's
01:17:16.620 | positions or it's not me insulting anybody.
01:17:19.420 | A lot of the clips that have gone very viral is me making an argument and then there not
01:17:22.260 | being like an amazing counter argument.
01:17:24.780 | Many of the debates that I've held have been extremely cordial.
01:17:27.380 | Let's take the latest example, like about a year ago, I debated Anna Kasparian from
01:17:30.660 | Young Turks.
01:17:31.660 | It was very cordial.
01:17:32.660 | It was very nice, right?
01:17:35.060 | That's sort of the way that I like to debate.
01:17:37.060 | My rule when it comes to debate and/or discussion is that my opponent actually gets to pick
01:17:42.340 | the mode in which we work.
01:17:43.820 | So if it's going to be a debate of ideas and we're just going to discuss and critique and
01:17:48.300 | clarify, then we can do that.
01:17:50.260 | If somebody comes loaded for bear, then I will respond in kind because one of the big
01:17:56.860 | problems I think in sort of the debate/discussion sphere is very often misdiagnosis of what
01:18:02.560 | exactly is going on.
01:18:03.740 | People who think that a discussion is a debate and vice versa.
01:18:07.340 | That can be a real problem and there are people who will treat what ought to be a discussion
01:18:14.060 | as, for example, an exercise in performance art.
01:18:17.780 | What that is is mugging or trolling or saying trolly things in order to just get to the
01:18:21.660 | – like that's something I actually don't do during debate.
01:18:23.660 | I mean if you actually watch me talk to people, I don't actually do the trolling thing.
01:18:26.900 | The trolling thing is almost solely relegated to Twitter and me making jokes on my show.
01:18:30.220 | When it comes to actually debating people, that sounds actually a lot like what we're
01:18:34.740 | doing right now.
01:18:35.740 | It's just the person maybe taking just an obverse position to mine.
01:18:39.380 | And so that's fine.
01:18:40.900 | Usually half of the debate or discussion is me just asking for clarification of terms.
01:18:44.620 | Like what exactly do you mean by this so I can drill down on where the actual disagreement
01:18:49.060 | may lie because some of the time people think they're disagreeing and they're actually
01:18:51.700 | not disagreeing.
01:18:52.700 | When I'm talking with Anna Kasparian and she's talking about how corporate and government
01:18:56.860 | have too much power together, I'm like, "Well, you sound like a tea party.
01:18:59.300 | You and I are on the same page about that."
01:19:00.740 | That sort of stuff does tend to happen a lot in discussion.
01:19:03.820 | I think that when discussion gets termed debate, it's a problem.
01:19:07.020 | When debate gets termed discussion, it's even more problematic because debate is a
01:19:10.380 | different thing.
01:19:11.380 | I find that your debate and your conversation is often good faith.
01:19:14.260 | You're able to steel man the other side.
01:19:16.700 | You're actually listening.
01:19:17.740 | You're considering the other side.
01:19:19.180 | The times when I see that Ben Shapiro destroys leftists, it's usually just like you said,
01:19:24.940 | the other side is doing the trolling.
01:19:27.780 | Because the people that do criticize you for those interactions say that the people that usually
01:19:35.300 | get destroyed are like 20 years old.
01:19:38.180 | They're usually not sophisticated in any kind of degree in terms of being able to use logic
01:19:44.220 | and reason and facts and so on.
01:19:45.900 | That's totally fine by the way.
01:19:47.220 | If people want to criticize me for speaking on college campuses where a lot of political
01:19:50.460 | conversation happens both right and left, that's fine.
01:19:53.700 | I've had lots of conversations with people on the other side of the aisle too.
01:19:56.140 | I've done podcasts with Sam Harris and we've talked about atheism or I've done debates
01:20:00.060 | with Anna Kasparian or I've done a debate with Cenk Uygur or I've had conversations
01:20:04.140 | with lots of people on the other side of the aisle.
01:20:05.500 | In fact, I believe I'm the only person on the right who recommends that people listen
01:20:08.660 | to his shows on the other side of the aisle.
01:20:10.340 | I say on my show on a fairly regular basis that people should listen to Pod Save America.
01:20:14.020 | Now, no one on Pod Save America will ever say that somebody should listen to my show.
01:20:17.220 | That is verboten.
01:20:18.220 | That is not something that can be had.
01:20:20.700 | It's one of the strangenesses of our politics.
01:20:22.300 | It's what I've called the happy birthday problem, which is I have a lot of friends who are of
01:20:26.300 | the left and are publicly of the left.
01:20:28.580 | On my birthday, they'll send you a text message, "Happy birthday," but they will never tweet
01:20:31.780 | happy birthday lest they be acknowledging that you were born of woman and that this
01:20:35.940 | can't be allowed.
01:20:36.940 | On the Sunday special, I've had a bevy of people who are on the other side of the aisle,
01:20:41.660 | a lot of them ranging from people in Hollywood like Jason Blum to Larry Wilmore to Sam to
01:20:47.940 | just a lot of people on the left.
01:20:50.380 | I think we're in the near future probably going to do a Sunday special with Ro Khanna
01:20:53.980 | up in California, the California congressperson, very nice guy.
01:20:56.540 | I had him on the show.
01:20:57.900 | That kind of stuff is fun and interesting.
01:21:01.420 | I think that the easy way out for a clip that people don't like is to either immediately
01:21:05.420 | clip the clip.
01:21:06.420 | They'll take a two-minute clip and clip it down to 15 seconds where somebody insults me and
01:21:09.140 | then that goes viral, which is welcome to the internet, or to say, "Well, you're only
01:21:14.660 | debating college students.
01:21:15.660 | You're only talking to 20-year-olds."
01:21:16.660 | I mean, I talked to a lot more people than that.
01:21:18.380 | That's just not the stuff you're watching.
01:21:19.780 | You lost your cool in an interview with BBC's Andrew Neil, and you were really honest about
01:21:25.340 | it after, which was refreshing and enjoyable.
01:21:29.940 | Because the internet said they've never seen anyone lose an interview.
01:21:36.180 | To me, honestly, it was like seeing Floyd Mayweather Jr. or somebody knocked down.
01:21:42.740 | What was the ... Can you take me through that experience?
01:21:44.860 | Here's that day.
01:21:45.860 | That day is I have a book release, didn't get a lot of sleep the night before, and this
01:21:49.180 | is the last interview of the day.
01:21:50.180 | It's an interview with BBC.
01:21:51.180 | I don't know anything about BBC.
01:21:52.180 | I don't watch BBC.
01:21:53.180 | I don't know any of the hosts.
01:21:55.180 | We get on the interview, and it's supposed to be about the book.
01:21:59.700 | The host, Andrew Neil, doesn't ask virtually a single question about the book.
01:22:04.180 | He just starts reading me bad old tweets, which I hate.
01:22:07.820 | It's annoying, and it's stupid, and it's the worst form of interview when somebody just
01:22:10.580 | reads you bad old tweets, especially when I've acknowledged bad old tweets before.
01:22:14.700 | I'm going through the list with him.
01:22:16.260 | This interview was solidly 20 minutes.
01:22:17.860 | It was a long interview.
01:22:21.380 | I make a couple of particularly annoyed mistakes in the interview.
01:22:25.660 | Annoyed mistake number one is the ego play.
01:22:29.420 | There's a point in the middle of the interview where I say, "I don't even know who you are,"
01:22:32.340 | which was true.
01:22:33.340 | I didn't know who he was, but it turns out he's a very famous person in Britain.
01:22:36.340 | You can't make that ego play.
01:22:37.340 | Even if he's not famous, that's not a good thing.
01:22:39.060 | It doesn't matter.
01:22:40.060 | It's a dumb thing to do, and it's an ass thing to do.
01:22:42.180 | Saying that was more just kind of pique and silliness.
01:22:47.500 | That was mistake number one.
01:22:48.500 | I enjoyed watching that.
01:22:49.500 | I was like, "Oh, Ben is human."
01:22:51.500 | I'm glad somebody enjoyed it.
01:22:55.020 | There's that.
01:22:56.020 | Then the other mistake was that I just don't watch enough British TV, so the way that interviews
01:23:00.700 | are done there are much more adversarial than American TV.
01:23:03.500 | In American TV, if somebody is adversarial with you, you assume that they're a member
01:23:06.300 | of the other side.
01:23:07.300 | That's typically how it is.
01:23:08.860 | I'm critiquing some of his questions at the beginning, and I thought that the critique
01:23:11.640 | of some of his questions is actually fair.
01:23:13.180 | He was asking me about abortion, and I thought he was asking it from a way of framing the
01:23:17.020 | question that wasn't accurate.
01:23:18.020 | I assumed that he was on the left, because again, I'd never heard of him.
01:23:22.500 | I mischaracterized him, and I apologized later for mischaracterizing him.
01:23:26.860 | We finally go through the interview.
01:23:28.100 | It's 20 minutes.
01:23:29.100 | He just keeps going with the bad old tweets.
01:23:31.140 | Finally, I got up, and I took off the microphone, and I walked out.
01:23:34.940 | Immediately I knew it was a mistake.
01:23:36.140 | Within 30 seconds of the end of the interview, I knew it was a mistake.
01:23:40.300 | That's why even before the interview came out, I believe I corrected the record that
01:23:43.980 | Andrew Neil is not on the left.
01:23:45.780 | That's a mistake by me.
01:23:46.780 | Then, I took the hit for a bad interview.
01:23:53.300 | As far as what I wish I had done differently, I wish I had known who he was.
01:23:56.940 | I wish I had done my research.
01:23:58.700 | I wish that I had treated it as though there was a possibility that it was going to be
01:24:01.780 | more adversarial than it was.
01:24:02.780 | I think I was incautious about the interview, because it was pitched as, "It's just another
01:24:07.100 | book interview."
01:24:08.100 | It wasn't just another book interview.
01:24:09.100 | It was treated much more adversarially than that.
01:24:11.860 | I wish that... That's on me.
01:24:13.100 | I got to research the people who are talking to me, and watch their shows, and learn about
01:24:17.860 | that.
01:24:18.860 | Then, obviously, the gut level appeal to ego or arrogance like that, that's a bad look.
01:24:24.340 | I shouldn't have done that.
01:24:25.900 | Losing your cool is always a bad look.
01:24:28.740 | The fact that that became somewhat viral and stood out just shows that it happens so rarely
01:24:34.460 | to you.
01:24:36.860 | Just to look at the day in the life of Ben Shapiro, you speak a lot, very eloquently
01:24:44.140 | about difficult topics.
01:24:46.940 | What goes into the research, the mental part?
01:24:49.020 | You always look pretty energetic.
01:24:53.420 | You're not exhausted by the burden, the heaviness of the topics you're covering day after day
01:24:58.700 | after day after day.
01:25:00.580 | What goes through the preparation, mentally, diet-wise, anything like that?
01:25:06.380 | When do you wake up?
01:25:07.380 | I wake up when my kids wake me up.
01:25:09.860 | Usually that's my baby daughter who's two and a half.
01:25:12.220 | She'll be here on the monitor usually about 6:15, 6:20 AM.
01:25:16.700 | I get up.
01:25:17.700 | My wife sleeps in a little bit.
01:25:18.700 | I go get the baby.
01:25:20.180 | Then, my son gets up.
01:25:22.180 | My oldest daughter gets up.
01:25:23.580 | I have eight, six, and two.
01:25:24.940 | The boy's the middle child.
01:25:26.540 | Is that both a source of stress and happiness?
01:25:28.580 | Oh, yeah.
01:25:29.580 | It's the height of both.
01:25:31.340 | It's the source of the greatest happiness.
01:25:33.060 | The way that I characterize it is this when it comes to kids in life.
01:25:36.360 | When you're single, your boundaries of happiness and unhappiness, you can be a zero in terms
01:25:40.180 | of happiness.
01:25:41.180 | You can be like a 10 in terms of happiness.
01:25:42.180 | Then, you get married and it goes up to like a 20 and a negative 20 because the happiest
01:25:45.740 | stuff is with your wife.
01:25:46.740 | Then, the most unhappy stuff is when something happens to your spouse.
01:25:48.780 | It's the worst thing in the entire world.
01:25:49.780 | Then, you have kids and all limits are removed.
01:25:51.980 | The best things that have ever happened to me are things where I'm watching my kids and
01:25:55.380 | they're playing together and they're being wonderful and sweet and cute and I love them
01:25:57.980 | so much.
01:25:58.980 | The worst things are when my son is screaming at me for no reason because he's being insane
01:26:04.100 | and I have to deal with that.
01:26:05.780 | Or something bad happens to my daughter at school or something like that.
01:26:08.280 | That stuff is really bad.
01:26:09.280 | Yes, the source of my greatest happiness is the source of my greatest stress.
01:26:11.500 | They get me up at about 6:15 in the morning.
01:26:13.140 | I feed them breakfast.
01:26:14.420 | I'm kind of scrolling the news while I'm making the eggs and just updating myself on anything
01:26:20.780 | that may have happened overnight.
01:26:22.260 | I go into the office, put on the makeup and the wardrobe or whatever.
01:26:27.060 | Then, I sit down and do the show.
01:26:29.580 | A lot of the prep is actually done the night before because the news cycle doesn't change
01:26:33.100 | all that much between late at night and in the morning so I can supplement in the morning.
01:26:38.340 | I do the show.
01:26:39.340 | A lot of the preparation, like thinking through what are the big issues in the world is done
01:26:43.540 | the night before.
01:26:45.540 | That's reading pretty much all the legacy media.
01:26:47.540 | I rip on legacy media a lot but that's because a lot of what they do is really good and a
01:26:50.660 | lot of what they do is really bad.
01:26:51.660 | I cover a lot of legacy media.
01:26:53.980 | That's probably covering Wall Street Journal, New York Times, Washington Post, Boston Globe,
01:26:57.180 | Daily Mail.
01:26:58.180 | Then, I'll look over at some of the alternative media.
01:27:00.700 | I'll look at my own website, Daily Wire.
01:27:01.980 | I'll look at Breitbart.
01:27:02.980 | I'll look at The Blaze.
01:27:03.980 | I'll look at maybe The Intercept.
01:27:06.260 | I'll look at a bunch of different sources.
01:27:08.380 | Then, I will look at different clips online.
01:27:12.620 | Mediaite comes in handy here.
01:27:13.620 | Grabien comes in handy here.
01:27:16.140 | That sort of stuff because my show relies very heavily on being able to play people
01:27:18.700 | so you can hear them in their own words.
01:27:22.140 | That's sort of the media diet.
01:27:23.140 | I sit down.
01:27:24.300 | I do the show.
01:27:26.780 | Once I'm done with the show, I usually have between ... Now, it's like 11:15 in the morning
01:27:32.140 | maybe because sometimes I'll pre-record the show.
01:27:34.820 | It's 11:15 in the morning.
01:27:36.140 | I'll go home.
01:27:38.020 | If my wife's available, I'll grab lunch with her.
01:27:40.160 | If not, then I will go and work out.
01:27:42.740 | I try to work out five times a week with a trainer, something like that.
01:27:48.380 | Then, I will-
01:27:49.380 | - Just regular gym stuff?
01:27:51.100 | Just start at the gym?
01:27:52.500 | - Yeah, weights and plyometrics and some CrossFit kind of stuff.
01:27:59.260 | Beneath this mild exterior lies a hulking monster.
01:28:03.860 | And so, I'll do that.
01:28:06.380 | Then I will do reading and writing.
01:28:11.140 | I'm usually working on a book at any given time.
01:28:14.580 | - You shut off the rest of the world for that?
01:28:16.580 | - Yes.
01:28:17.580 | I put some music in my ears, usually Brahms or Bach, sometimes Beethoven or Mozart.
01:28:21.420 | Those four.
01:28:22.420 | Those are on rotation.
01:28:23.420 | - No rap?
01:28:24.420 | - No rap.
01:28:25.420 | No rap.
01:28:26.420 | Despite my extraordinary rendition of WAP, I am not in fact a rapper.
01:28:29.220 | - Do you still hate WAP, the song?
01:28:32.460 | - I will say I do not think that it is the peak of Western civilized art.
01:28:36.060 | I don't think that 100 years from now, people will be gluing their faces to a WAP in protest
01:28:40.100 | at the environment.
01:28:41.100 | - But Brahms and the rest will be still around?
01:28:44.300 | - Yes.
01:28:45.300 | I would assume if people still have a functioning prefrontal cortex and any sort of taste.
01:28:48.620 | - Strong words from Ben Shapiro.
01:28:50.140 | All right, so you got some classical music in your ears and you're focusing.
01:28:54.460 | Are you at the computer when you're writing?
01:28:56.300 | - Yeah, I'm at the computer.
01:28:58.860 | We have a kind of a room that has some sun coming in, so it's nice in there or I'll go
01:29:02.700 | up to a library that we just completed for me.
01:29:05.340 | So I'll go up there and I'll write and read.
01:29:07.180 | - Like with physical books?
01:29:08.180 | - Yeah, I love physical books.
01:29:10.260 | Because I keep Sabbath, I don't use Kindle because when I'm reading a book and I hit
01:29:15.460 | Sabbath I have to turn off the Kindle.
01:29:16.940 | So that means that I have tons and tons and tons of physical books.
01:29:19.780 | When we moved from Los Angeles to Florida, I had about 7,000 volumes.
01:29:23.220 | I had to discard probably 4,000 of them.
01:29:26.340 | And then I've built that back up now.
01:29:27.740 | I'm probably gonna have to go through another round where I put them somewhere else.
01:29:30.220 | I tend to tab books rather than highlighting them because I can't highlight on Sabbath.
01:29:34.420 | So I have like the little stickers and I put them in the book.
01:29:37.340 | So a typical book from me, you can see it on the book club, will be like filled with
01:29:40.060 | tabs on the side, things that I want to take.
01:29:42.340 | Now actually, I got a person who I pay to go through and write down in files the quotes
01:29:50.100 | that I like from the books.
01:29:51.540 | I have those handy.
01:29:53.660 | Which is a good way for me to remember what it is that I've read, because I read probably
01:29:59.620 | somewhere between three and five books a week.
01:30:01.980 | And then, in a good week, five.
01:30:04.980 | And then I write, I read, and then I go pick up my kids from school at 3:30.
01:30:10.580 | So according to my kids, I have no job.
01:30:12.500 | I'm there in the mornings until they leave for school.
01:30:14.260 | I pick them up from school.
01:30:15.740 | I hang out with them until they go to bed, which is usually 7:30 or so.
01:30:19.700 | So I'm helping them with their homework and I'm playing with them and I'm taking them
01:30:22.960 | on rides in the brand new Tesla, which my son is obsessed with.
01:30:27.940 | And then I put them to bed and then I sit back down.
01:30:29.820 | I prep for the next day, go through all those media sources I was talking about, compile
01:30:32.740 | kind of a schedule for what I want the show to look like and run a show.
01:30:36.540 | It's very detail oriented.
01:30:37.540 | Nobody writes anything for me.
01:30:38.540 | I write all my own stuff.
01:30:39.540 | So every word that comes out of my mouth is my fault.
01:30:43.260 | And then, you know, hopefully I have a couple hours to, or an hour to hang out with my wife
01:30:47.740 | before we go to bed.
01:30:48.740 | - The words you write, do you edit a lot?
01:30:51.340 | Or does it just come out?
01:30:52.580 | You're thinking like, what are the key ideas I want to express?
01:30:54.780 | - No, I don't tend to edit a lot.
01:30:56.540 | So I, thank God I'm able to write extraordinarily quickly.
01:30:59.780 | So I write very, very fast.
01:31:00.780 | In fact, in a previous life I was--
01:31:02.780 | - You also speak fast, so it's similar.
01:31:04.420 | - Yeah, exactly.
01:31:05.420 | And I speak in paragraphs.
01:31:06.420 | So it's exactly the same thing.
01:31:08.500 | In a previous life, I was a ghost writer.
01:31:10.420 | So I used to be sort of known as a turnaround specialist in the publishing industry.
01:31:13.660 | There'd be somebody who came to the publisher and says, "I have three weeks to get this
01:31:18.140 | book done.
01:31:19.140 | I don't have a word done."
01:31:20.140 | And they'd call me up and be like, "This person needs a book written."
01:31:23.220 | And so in three weeks I'd knock out 60,000 words or so.
01:31:25.820 | - Is there something you can say to the process that you follow to think?
01:31:29.780 | Like how you think about ideas?
01:31:32.820 | Stuff is going on in the world, and trying to understand what is happening, what are
01:31:37.180 | the explanations, what are the forces behind this?
01:31:39.340 | Do you have a process, or just you wait for the muse to give you the interpretation of
01:31:44.940 | the book?
01:31:45.940 | - Well, I mean, I think that, I don't think it's a formal process, but because I read,
01:31:49.900 | so there's two ways to do it.
01:31:51.420 | One is sometimes, you know, sometimes the daily grind of the news is going to refer
01:31:57.540 | back to core principles that are broader and deeper.
01:32:00.900 | So I thank God, because I've read so much on so many different things from a lot of different
01:32:05.700 | points of view.
01:32:07.300 | Then if something breaks and a piece of news breaks, I can immediately sort of channel
01:32:11.460 | that into, in the mental Rolodex, these three big ideas that I think are really important.
01:32:17.260 | And then I can talk at length about what those ideas are, and I can explicate those.
01:32:21.340 | And so, for example, when we were talking about Musk taking over Twitter before, and
01:32:25.620 | I immediately go to the history of media, right, that's me tying it into a broader theme.
01:32:32.540 | And I do that, I would say, fairly frequently.
01:32:34.860 | We're talking about, say, subsidization of industry, and I can immediately tie that into,
01:32:40.300 | okay, what's the history of subsidization in the United States, going all the way back
01:32:43.500 | to Woodrow Wilson and forward through FDR's industrial policy, and how does that tie into
01:32:46.940 | sort of broader economic policy internationally.
01:32:49.860 | So it allows me to tie into bigger themes, because what I tend to read is mostly not
01:32:54.180 | news.
01:32:55.180 | What I tend to read is mostly books.
01:32:56.180 | I would say most of my media diet is actually not the stuff, like that's the icing on the
01:33:00.340 | cake, but the actual cake is the hundreds of pages of history, econ, geography,
01:33:07.580 | social science, that I'm reading every week.
01:33:11.260 | And so that sort of stuff allows me to think more deeply about these things.
01:33:15.780 | So that's one way of doing it.
01:33:16.780 | The other way of doing it is Russia breaks in the news.
01:33:18.580 | I don't know anything about Russia.
01:33:19.580 | I immediately go and I purchase five books about Russia and I read all of them.
01:33:23.380 | And so, well, the fortunate thing for me and
01:33:28.260 | the unfortunate thing about the world is that
01:33:31.500 | if you read two books on a subject, you are now considered by the media an expert
01:33:34.380 | on the subject.
01:33:35.840 | So that's sad and shallow, but that is the way that it is.
01:33:39.060 | The good news for me is that my job isn't to be a full expert on any of these subjects,
01:33:42.260 | and I don't claim to be.
01:33:43.260 | I'm not a Russia expert.
01:33:44.540 | I know enough on Russia to be able to understand when people talk about Russia, what the system
01:33:48.740 | looks like, how it works and all of that.
01:33:51.100 | And then to explicate that for the common man, which a lot of people who are infused
01:33:55.180 | with the expertise can't really do.
01:33:56.500 | If you're so deep in the weeds that you're like a full on academic expert on a thing,
01:33:59.620 | sometimes it's hard to translate that over to a mass audience, which is really my job.
01:34:02.700 | - Well, I think it can actually, it's funny with the two books, you can actually get a
01:34:06.900 | pretty deep understanding if you read and also think deeply about it.
01:34:10.900 | It allows you to approach a thing from first principles.
01:34:13.660 | A lot of times if you're a quote unquote expert, you get carried away by the momentum of what
01:34:21.340 | the field has been thinking about versus like stepping back, all right, what is really going on.
01:34:27.220 | The challenge is to pick the right two books.
01:34:29.540 | - Right.
01:34:30.540 | So usually what I'll try to find is somebody who knows the topic pretty well, or a couple
01:34:33.180 | of people, and have them recommend books.
01:34:35.540 | So a couple of years ago, I knew nothing about Bitcoin.
01:34:37.340 | I was at a conference and a couple of people who you've had on your show actually were
01:34:42.420 | there and I asked them, give me your top three books on Bitcoin.
01:34:45.700 | And so then I went and I read like nine books on Bitcoin.
01:34:49.340 | And so if you read nine books on Bitcoin, you at least know enough to get by.
01:34:52.340 | And so I can actually explain what Bitcoin is and why it works or why it doesn't work
01:34:56.000 | in some cases and what's happening in the markets that way.
01:34:59.940 | So that's very, very helpful.
01:35:02.220 | - Well, Putin is an example.
01:35:04.780 | That's a difficult one to find the right books on.
01:35:07.540 | I think The New Tsar is the one I read where it was the most objective.
01:35:11.940 | - The one I read about Putin, I think, was one called Strongman.
01:35:14.500 | It was very highly critical of Putin, but it gave like a good background on him.
01:35:19.900 | - Yeah, so I'm very skeptical of things that are critical of Putin, because
01:35:26.340 | it feels like there's activism injected into the history.
01:35:29.820 | Like the way The Rise and Fall of the Third Reich is written about Hitler, I like because
01:35:34.380 | there's almost not a criticism of Hitler.
01:35:36.700 | It's a description of Hitler, which is easier to do about a historical figure,
01:35:42.740 | though with William Shirer, with The Rise and Fall of the Third Reich, it's impressive because
01:35:46.540 | he lived through it.
01:35:48.340 | But it's very tough to find objective descriptions about the history of the man and a country
01:35:53.820 | of Putin, of Zelensky, of any difficult, Trump is the same.
01:35:59.100 | And I feel like-
01:36:00.100 | - Everybody gets the hero-villain archetype, right?
01:36:01.500 | And it's like either somebody is completely a hero or completely a villain.
01:36:05.420 | And the truth is pretty much no one is completely a hero or completely a villain.
01:36:10.340 | In fact, I'm not sure that I love descriptions of people as heroes or villains
01:36:14.060 | generally.
01:36:15.060 | I think that people tend to do heroic things or do villainous things.
01:36:17.060 | In the same way that I'm not sure I love descriptions of people as a genius.
01:36:19.660 | My dad used to say this when I was growing up.
01:36:20.820 | He used to say he didn't believe that there were geniuses.
01:36:23.060 | He said he believed that there were people with a genius for something because people,
01:36:27.420 | you know, yes, there are people who are very high IQ and we call them geniuses, but does
01:36:30.060 | that mean that they're good at EQ stuff?
01:36:32.460 | Not necessarily, but there are people who are geniuses at EQ stuff.
01:36:34.660 | In other words, it would be more specific to say that somebody is a genius at engineering
01:36:38.940 | than to say just broad spectrum, they're a genius.
01:36:40.940 | And that does avoid the problem of thinking that they're good at something that they're
01:36:43.740 | not good at, right?
01:36:44.740 | It's a little more specific.
01:36:46.140 | - So because you read a lot of books, are there, can you look back?
01:36:48.740 | And it's always a tough question 'cause so many, it's like your favorite song, but are
01:36:52.300 | there books that have been influential on your life that are impacting your thinking
01:36:56.500 | or maybe ones you go back to that still carry insight for you?
01:37:02.860 | - The Federalist Papers are a big one in terms of sort of how American politics works.
01:37:07.140 | The first econ book that I thought was really great 'cause it was written for teenagers
01:37:10.500 | essentially is one called Economics in One Lesson by Henry Hazlitt.
01:37:12.980 | It's like 150 pages, I recommend it to everybody sort of 15 and up.
01:37:17.180 | It's easier than, say, Thomas Sowell's Basic Economics, which is four or five hundred pages.
01:37:21.060 | - And it's looking at like macroeconomics, microeconomics, that kind of stuff.
01:37:26.020 | - And then in terms of, there's a great book by Carl Trueman called The Rise and Triumph of
01:37:31.060 | the Modern Self, which I think is the best book in the last 10 years.
01:37:34.260 | That's been sort of impactful on some of the thoughts I've been having lately.
01:37:37.340 | - What's the key idea in there that's impactful?
01:37:38.340 | - The key idea is that we've shifted the nature of how identity is done in the West from how
01:37:43.020 | it was historically done.
01:37:44.020 | That basically for nearly all of human history, the way that we identify as human beings is
01:37:49.660 | as a mix of our biological drives and then how that interacts with the social institutions
01:37:53.420 | around us.
01:37:54.420 | And so when you're a child, you're a bunch of unfettered biological drives and it's your
01:37:57.540 | parents' job to civilize you and civilize you literally means bring you into civilization,
01:38:01.540 | right?
01:38:02.540 | You learn the rules of the road.
01:38:03.540 | You learn how to integrate into institutions that already exist and are designed to shape you.
01:38:08.380 | And it's how you interact with those institutions that makes you you.
01:38:10.900 | It's not just a set of biological drives.
01:38:12.260 | And then in the modern world, we've really driven toward the idea that what we are is
01:38:17.140 | how we feel on the inside without reference to the outside world.
01:38:19.820 | And it's the job of the outside world to celebrate and reflect what we think about ourselves
01:38:23.340 | on the inside.
01:38:24.740 | And so what that means is that we are driven now toward fighting institutions because institutions
01:38:30.180 | are impositions.
01:38:31.180 | So everything around us, societal institutions, these are things that are crimping our style.
01:38:34.900 | They're making us not feel the way that we want to feel.
01:38:36.580 | And if we just destroy those things, then we'll be freer and more liberated.
01:38:40.700 | It's, I think, a much deeper model of how to think about why our social politics in particular
01:38:45.740 | are moving in a particular direction: a ground shift has happened in how people
01:38:49.280 | think about themselves.
01:38:51.100 | And this has had some kind of shocking effects in terms of social politics.
01:38:56.620 | - So there's negative consequences in your view of that, but is there also a positive consequence
01:39:01.900 | of more power, more agency to the individual?
01:39:05.780 | - I think that you can make the argument that institutions were weighing too heavily in
01:39:08.580 | how people form their identity.
01:39:09.820 | But I think that what we've done is gone significantly too far on the other side.
01:39:13.500 | We basically decided to blow up the institutions in favor of unfettered feeling/identity.
01:39:20.380 | And I think that that is not only a large mistake, I think it's gonna have dire ramifications
01:39:23.780 | for everything from suicidal ideation to institutional longevity in politics and in society more
01:39:30.100 | broadly.
01:39:31.100 | - So speaking about the nature of self, you've been an outspoken proponent of pro-life.
01:39:38.500 | Can we start by you trying to steelman the case for pro-choice, that abortion
01:39:44.940 | is not murder and a woman's right to choose is a fundamental human right, freedom?
01:39:51.340 | - So I think that the only way to steelman the pro-choice case, and be ideologically
01:39:57.620 | consistent, is to suggest that there is no interest in the life of the unborn that counterweighs
01:40:06.440 | freedom of choice at all.
01:40:09.340 | So what that means is, we can take the full example or we can take sort of the partial
01:40:15.220 | example.
01:40:16.220 | So if we take the full example, what that would mean is that up until point of birth,
01:40:18.060 | which is sort of the Democratic Party platform position, that a woman's right
01:40:23.220 | to choose ought to extend for any reason whatsoever up to point of birth.
01:40:26.100 | The only way to argue that is that bodily autonomy is the only factor.
01:40:28.660 | There is no countervailing factor that would ever outweigh bodily autonomy.
01:40:33.340 | That would be the strongest version of the argument.
01:40:36.160 | The other version of that argument would be that the reason that bodily autonomy ought
01:40:39.240 | to weigh so heavily is because women can't be the equals of men if the vicissitudes of
01:40:47.240 | biology are allowed to decide their futures.
01:40:50.620 | If pregnancy changes women in a way that it doesn't change men, it's a form of sex discrimination
01:40:54.460 | for women to ever have to go through with pregnancy, which is an argument that was made
01:40:57.640 | by Ruth Bader Ginsburg kind of.
01:41:00.060 | Those are the arguments.
01:41:01.060 | The kind of softer version is the more, I would say, emotionally resonant version of
01:41:06.800 | the argument, which is that bodily autonomy ought to outweigh the interests of the fetus
01:41:12.320 | up till point X.
01:41:14.000 | And then people have different feelings about what point X looks like.
01:41:16.000 | Is it up to the point of viability?
01:41:17.120 | Is it up to the point of the heartbeat?
01:41:18.120 | Is it up to 12 weeks or 15 weeks?
01:41:20.080 | And that really is where the American public is, broadly
01:41:23.160 | speaking, not state by state, where there are really varied opinions.
01:41:26.960 | But like broadly speaking, it seems like the American public by polling data wants somewhere
01:41:30.160 | between a 12 and 15 week abortion restriction.
01:41:32.880 | And they believe that up until 12 or 15 weeks, there's not enough there for, to not be specific,
01:41:38.200 | but to be kind of how people feel about it, to outweigh a woman's bodily autonomy.
01:41:42.840 | And then beyond that point, then there's enough of an interest in the life of the preborn
01:41:47.720 | child.
01:41:48.920 | It's developed enough.
01:41:49.920 | Then now we care about it enough that it outweighs a woman's bodily autonomy.
01:41:53.160 | - What's the strongest case for pro-life in your mind?
01:41:56.800 | - I mean, the strongest case for pro-life is that from conception, a human life has been
01:42:02.000 | created.
01:42:03.000 | It is a human life with potential.
01:42:04.000 | That human life with potential now has an independent interest in its own existence.
01:42:09.960 | - If I may just ask a quick question.
01:42:12.280 | So conception is when a sperm fertilizes an egg?
01:42:17.520 | Okay.
01:42:18.520 | Just to clarify the biological beginning of what conception means.
01:42:20.920 | - I mean, because that is the beginning of human life.
01:42:23.280 | Now there are other standards that people have drawn, right?
01:42:25.620 | Some people will say implantation in the uterus.
01:42:28.520 | Some people will suggest viability, some people brain development or heart development, but
01:42:33.120 | the clear dividing line between a human life exists and a human life does not exist is
01:42:37.400 | the biological creation of an independent human life with its own DNA strands and et
01:42:41.080 | cetera, which happens at conception.
01:42:43.960 | Once you acknowledge that there is that independent human life with potential, and I keep calling
01:42:48.680 | it that because people sometimes say potential human life.
01:42:50.720 | It's not a potential human life.
01:42:51.720 | It's a human life that is not developed yet to the full extent that it will develop.
01:42:56.180 | Once you say that, and once you say that it has its own interest, now
01:42:58.960 | the burden of proof is to explain why bodily autonomy ought to allow for the snuffing out
01:43:05.880 | of that human life if we believe that human life ought not to be killed for quote unquote
01:43:11.020 | no good reason.
01:43:12.020 | You have to come up with a good reason, right?
01:43:13.380 | The burden of proof has now shifted.
01:43:14.940 | Now you will find people who will say, well, the good reason is that it's not sufficiently
01:43:19.260 | developed to outweigh the mental trauma or emotional trauma that a woman goes through
01:43:22.380 | if, for example, she was raped or the victim of incest.
01:43:25.900 | And that is a fairly emotionally resonant argument, but it's not necessarily dispositive.
01:43:30.220 | You can make the argument that just because something horrific and horrible happened to
01:43:35.400 | a woman does not rob the human life of its interest in life.
01:43:40.920 | One of the big problems in trying to draw any line for the self-interest of
01:43:45.500 | the human life is that it's very difficult to draw any other line that doesn't seem somewhat
01:43:51.220 | arbitrary.
01:43:52.220 | If you say independent heartbeat, well, people have pacemakers.
01:43:57.060 | If you say brain function, people have various levels of brain function as adults.
01:44:01.140 | If you say viability, babies are not viable after they are born.
01:44:03.700 | If I left a newborn baby on a table and did not take care of it, it would be dead in two
01:44:07.460 | days.
01:44:08.460 | So once you start getting into sort of these lines, it starts to get very fuzzy very quickly.
01:44:13.100 | And so if you're looking for sort of a bright line moral rule, that would be the bright
01:44:17.420 | line moral rule.
01:44:18.420 | That's sort of the pro-life case.
01:44:19.420 | - Well, there's still mysterious, difficult scientific questions of things like consciousness.
01:44:25.660 | So how does the question of consciousness come into play in this debate?
01:44:33.460 | - So I don't believe that consciousness is the sole criterion by which we judge the self-interest
01:44:39.700 | in human life.
01:44:41.100 | So we are unconscious a good deal of our lives, right?
01:44:45.380 | But we will be conscious again, right?
01:44:48.020 | When you're unconscious, when you're asleep, for example, presumably your life is still
01:44:51.780 | worth living.
01:44:52.780 | If somebody came in and killed you, that'd be a serious moral quandary at the very least.
01:44:56.760 | - But the birth of consciousness, the lighting up of the flame, the initial lighting up the
01:45:02.060 | flame, there does seem to be something special about that.
01:45:05.380 | And it's a mystery of when that happens.
01:45:08.300 | - Well, I mean, Peter Singer makes the case that basically self-consciousness doesn't
01:45:11.580 | exist until you're two and a half.
01:45:13.260 | So he says that even infanticide should be okay, right?
01:45:15.740 | He's the bioethicist over at Princeton.
01:45:17.660 | So you get into some real dicey territory once you get into consciousness.
01:45:21.020 | Also the truth is that consciousness is more of a spectrum than it is a dividing line,
01:45:26.260 | meaning that there are people with various degrees of brain function.
01:45:29.940 | We don't actually know how conscious they are.
01:45:32.220 | And you can get into eugenic territory pretty quickly when we start dividing between lives
01:45:36.540 | that are worth living based on levels of consciousness and lives that are not worth living based
01:45:39.700 | on levels of consciousness.
01:45:41.380 | - Do you find it, the aspect of women's freedom, do you feel the tension between that ability
01:45:50.100 | to choose the trajectory of your own life versus the rights of the unborn child?
01:45:59.420 | - In one situation, yes.
01:46:00.420 | In one situation, no.
01:46:01.460 | If you've had sex with a person voluntarily and as a product of that, you are now pregnant,
01:46:08.340 | you've taken an action with a perfectly predictable result.
01:46:10.300 | Even if you took birth control, this is the way that human beings procreated for literally
01:46:13.860 | all of human existence.
01:46:14.900 | And by the way, also how all mammals procreate.
01:46:17.220 | So the idea that this was an entirely unforeseen consequence of your activity, I find I have
01:46:22.060 | less sympathy for you in that particular situation because you could have made decisions that
01:46:27.100 | would not lead you to this particular impasse.
01:46:29.020 | In fact, this used to be the basis of marriage, right?
01:46:31.300 | Because when we were an apparently more terrible society, we used to say that people should
01:46:36.300 | wait until they get married to have sex, a position that I still hold.
01:46:40.460 | And the reason for that was because then if you have sex and you produce a child, then
01:46:43.660 | the child will grow up in a two-parent family with stability.
01:46:46.540 | So not a ton of sympathy there.
01:46:49.340 | When it comes to rape and incest, obviously heavy, heavy sympathy.
01:46:51.960 | And so that's why I think you see statistically speaking a huge percentage of Americans, including
01:46:56.160 | many pro-life Americans, people who consider themselves pro-life, would consider exceptions
01:46:59.960 | for rape and incest.
01:47:01.340 | One of the sort of dishonest things that I think happens in abortion debates is arguing
01:47:05.260 | from the fringes.
01:47:06.260 | What tends to happen a lot is pro-choice activists will argue from rape and incest to the other
01:47:10.900 | 99.8% of abortions.
01:47:13.260 | Or you'll see people on the pro-life side argue from partial birth abortion to all of
01:47:16.460 | abortion.
01:47:17.460 | You actually have to take on sort of the mainstream case and then decide whether or not that's
01:47:21.540 | acceptable or not.
01:47:22.540 | - But to you, the exception, just ethically, without generalizing it, that is a valid ethical
01:47:28.900 | exception?
01:47:29.900 | - I don't hold that there should be an exception for rape or incest because, again, I hold by
01:47:33.580 | the bright line rule that once a human life with potential exists, then it has its own
01:47:37.180 | interest in life that cannot be curbed by your self-interest.
01:47:42.980 | The only exception that I hold by is the same exception that literally all pro-lifers hold
01:47:46.580 | by, which is the life of the mother is put in danger.
01:47:48.500 | - Such a tough, tough topic.
01:47:50.140 | 'Cause if you believe that that's the line, then we're committing mass murder.
01:47:54.700 | - Well, or at least mass killing.
01:47:56.820 | So I would say that murder typically requires a level of mens rea that may be absent in
01:48:02.300 | many cases of abortion.
01:48:03.300 | 'Cause the usual follow on question is, well, if it's murder, why not prosecute the woman?
01:48:06.820 | And the answer is because the vast majority of people who are having abortions don't actually
01:48:10.860 | believe that they're killing a person.
01:48:13.420 | They have a very different view of what is exactly happening.
01:48:17.260 | So I would say that there are all sorts of interesting hypotheticals that come in to
01:48:21.500 | play when it comes to abortion.
01:48:23.580 | And you can play them any which way.
01:48:26.380 | But there are levels, let's put it this way.
01:48:29.060 | There are gradations of wrongs.
01:48:30.300 | I don't think that all abortions are equally blameworthy, even if I would ban virtually
01:48:37.780 | all of them.
01:48:38.780 | I think that there are mitigating circumstances that make, while being wrong, some abortions
01:48:44.860 | less morally blameworthy than others.
01:48:46.860 | I can admit a difference between killing a two-week-old embryo in the
01:48:55.020 | womb and stabbing a seven-year-old in the face.
01:48:57.220 | I can recognize all that while still saying I think that it would be wrong to terminate
01:48:59.860 | a pregnancy.
01:49:00.860 | - Do you think the question of when life begins, which I think is a fascinating question, is
01:49:06.180 | a question of science or a question of religion?
01:49:08.540 | - When life begins is a question of science.
01:49:10.700 | When that life becomes valuable enough for people to want to protect it is gonna be a
01:49:15.380 | question that is beyond science.
01:49:17.380 | Science doesn't have moral judgments to make about the value of human life.
01:49:20.020 | This is one of the problems. Sam Harris and I have had this argument many times and
01:49:22.980 | it's always kind of interesting.
01:49:24.860 | Sam is of the opinion that you can get to ought from is.
01:49:28.220 | That science says is, therefore we can learn ought.
01:49:30.620 | Human flourishing is the goal of life.
01:49:33.100 | I always say to him, I don't see where you get that from evolutionary biology.
01:49:36.780 | You can assume it, just say you're assuming it, but don't pretend that that is a conclusion
01:49:41.460 | that you can draw straight from biological reality itself because obviously that doesn't
01:49:46.780 | exist in the animal world, for example.
01:49:48.660 | Nobody assumes the innate value of every ant.
01:49:51.460 | - I think I know your answer to this, but let's test it 'cause I think you're going to be
01:49:55.540 | wrong.
01:49:56.540 | So there's a robot behind you.
01:49:59.140 | Do you think there will be a time in the future when it will be unethical and illegal to kill
01:50:04.620 | a robot because they will have sentience?
01:50:08.460 | My guess is you would say, "No, Lex, 'cause there's a fundamental difference between humans
01:50:13.820 | and robots," and I just wanna get you on record 'cause I think you'll be wrong.
01:50:18.380 | - I mean, it depends on the level of development, I would assume, of the robots.
01:50:21.940 | I mean you're assuming a complexity in the robots that eventually imitates what we in
01:50:27.660 | the religious life would call the human soul.
01:50:29.660 | The ability to choose freely, for example, which I believe is sort of the capacity for
01:50:34.820 | human beings.
01:50:35.820 | The ability to suffer.
01:50:37.220 | Yeah.
01:50:38.220 | If all of that could be proved and not programmed, meaning the freely willed capacity of a machine
01:50:47.740 | to do X, Y, or Z.
01:50:50.860 | You could not pinpoint exactly where it happens in the program.
01:50:54.380 | Right.
01:50:55.380 | It's not deterministic.
01:50:56.380 | Yeah.
01:50:57.380 | Then it would raise serious moral issues for sure.
01:51:00.660 | I'm not sure I know the answer to that question.
01:51:02.060 | - Are you afraid of that time?
01:51:03.060 | - I'm not sure I'm afraid of that time.
01:51:04.980 | I mean, any more than I'd be afraid if aliens arrived in the world and had these
01:51:10.420 | characteristics.
01:51:11.420 | - Well, there's just a lot of moral complexities, and they don't necessarily have to be in the
01:51:14.100 | physical space, they can be in the digital space.
01:51:16.900 | There's an increased sophistication and number of bots on the internet, including on Twitter.
01:51:22.780 | As they become more and more intelligent, there's going to be serious questions about
01:51:26.660 | what is our moral duty to protect ones that have or claim to have an identity.
01:51:32.700 | And that'll be really interesting.
01:51:33.700 | - Actually, what I'm afraid of is the opposite happening.
01:51:35.620 | Meaning that the worst that should happen is that we develop robots so sophisticated
01:51:40.420 | that they appear to have free will and then we treat them with human dignity.
01:51:43.820 | That should be the worst that happens.
01:51:45.180 | What I'm afraid of is the opposite, is that if we're talking about this particular hypothetical,
01:51:50.180 | that we develop robots that have all of these apparent abilities and then we dehumanize
01:51:53.900 | them which leads us to also dehumanize the other humans around us.
01:51:57.100 | Which you could easily see happening.
01:51:59.540 | The devaluation of life to the point where it doesn't really matter.
01:52:02.780 | I mean people have always treated, unfortunately, newly discovered other humans this way.
01:52:07.980 | So I don't think there's actually a new problem.
01:52:10.140 | I think it's a pretty old problem.
01:52:11.860 | It'll just be interesting when it's made by human hands.
01:52:14.300 | - Yeah, it's an opportunity to celebrate humanity or to bring out the worst in humanity.
01:52:21.300 | So the derision that naturally happens, like you said, with pointing out the other.
01:52:26.420 | Let me ask you about climate change.
01:52:30.060 | Let's go from the meme to the profound philosophy.
01:52:33.660 | Okay, the meme was there's a clip of you talking about climate change and saying that--
01:52:37.980 | - Ah, the Aquaman meme.
01:52:38.980 | (laughs)
01:52:39.980 | - You said that for the sake of argument, if the water level rises five to 10 feet in
01:52:44.140 | the next hundred years, people will just sell their homes and move.
01:52:48.820 | And then the meme was sell to who?
01:52:51.260 | Can you argue both sides of that?
01:52:52.980 | - The argument that they're making is a straw man.
01:52:54.900 | The argument that I'm making is over time.
01:52:56.180 | I don't mean that if a tsunami's about to hit your house, you can list it on eBay.
01:52:59.140 | That's not what I mean, obviously.
01:53:00.140 | What I mean is that human beings have an extraordinary ability to adapt.
01:53:03.700 | It's actually our best quality.
01:53:05.580 | And that as water levels rise, real estate prices in those areas tend to fall.
01:53:09.900 | That over time, people tend to abandon those areas.
01:53:13.280 | They tend to leave.
01:53:14.740 | They tend to, right now, sell their houses, and then they tend to move.
01:53:17.820 | And eventually, those houses will be worthless, and you won't have anybody to sell to, but
01:53:20.700 | presumably not that many people will be living there by that point, which is one of the reasons
01:53:24.340 | why the price would be low, because there's no demand.
01:53:26.140 | - So it's over a hundred years, so all of these price dynamics are very gradual, relative
01:53:31.620 | to the other price dynamics.
01:53:32.940 | - Correct.
01:53:33.940 | That's why the joke of it, of course, is that I'm saying that tomorrow, there's a tsunami
01:53:36.820 | on your doorstep, and you're like, "Oh, Bob will buy my house."
01:53:40.060 | Bob ain't gonna buy your house.
01:53:41.580 | We all get that.
01:53:42.580 | - Yeah, but the tsunami may.
01:53:43.580 | I'll admit I laughed at it.
01:53:44.580 | - How is your view on climate change?
01:53:46.740 | The human contribution to climate change, what we should do in terms of policy to respond
01:53:51.940 | to climate change, how has that changed over the years?
01:53:54.000 | - I would say, the truth is, for years and years, I've believed that climate change was
01:53:59.200 | a reality, and that anthropogenic climate change is a reality.
01:54:03.620 | I don't argue with the IPCC estimates.
01:54:05.580 | I know climatologists at places like MIT or Caltech, and they know this stuff better than
01:54:09.780 | I do.
01:54:10.780 | But the notion that climate change is just not happening, or that human beings have not
01:54:14.500 | contributed to climate change, I find doubtful.
01:54:16.780 | The question is to what extent human beings are contributing to climate change. Is that 50%,
01:54:20.540 | is it 70%, is it 90%?
01:54:22.260 | I think there's a little bit more play in the joints there, so it's not totally clear.
01:54:25.220 | The one thing I do know, and this I know with factual accuracy, is that all of the measures
01:54:29.540 | that are currently being proposed are unworkable and will not happen.
01:54:32.580 | So when people say Paris Climate Accords, even if those were imposed, you're talking
01:54:37.060 | about lowering the potential trajectory of climate change by a fraction of a degree.
01:54:41.940 | If you're talking about the Green New Deal, net zero by 2050, the
01:54:48.820 | carbon is up there in the air, and the climate change is going to happen.
01:54:51.660 | Also you're assuming that geopolitical dynamics don't exist, so everybody's gonna magically
01:54:56.060 | get on the same page, and we're all gonna be imposing massive carbon taxes to get to
01:55:01.220 | net zero by 2050.
01:55:03.060 | I mean like hundreds of times higher than they currently are.
01:55:05.620 | And that's not me saying that, it's Klaus Schwab saying this, of the World Economic
01:55:08.300 | Forum, who's a big advocate of exactly this sort of policy.
01:55:11.500 | And the reality is that we're gonna have to accept that at least 1.5 degrees Celsius of
01:55:14.580 | climate change is baked into the cake by the end of the century.
01:55:16.540 | Again, not me talking, William Nordhaus, the economist, who just won the Nobel Prize in
01:55:19.500 | this stuff, talking.
01:55:20.660 | And so what that suggests to me is what we've always known: human beings are crap at mitigation
01:55:24.820 | and excellent at adaptation.
01:55:26.860 | We are very bad at mitigating our own faults, we are very, very good at adapting to the
01:55:30.260 | problems as they exist.
01:55:32.020 | Which means that all of the estimates that billions will die, that there will be mass
01:55:35.620 | starvation, that we will see the migration in just a few years of hundreds of millions
01:55:40.820 | of people, those are wrong.
01:55:42.460 | What you'll see is a gradual change of living, people will move away from areas that are
01:55:45.820 | inundated on the coast, you will see people building seawalls, you will see people adapting
01:55:49.860 | new technologies to suck carbon out of the air, you will see geoengineering.
01:55:54.820 | This is the sort of stuff that we should be focused on.
01:55:56.860 | And the sort of bizarre focus on, what if we just keep tossing hundreds of billions
01:56:01.220 | of dollars at the same three technologies over and over in the hopes that if we subsidize
01:56:05.700 | it, this will magically make it more efficient.
01:56:07.500 | I've seen no evidence whatsoever that that is going to be the way that we get ourselves
01:56:12.260 | out of this.
01:56:13.260 | Necessity being the mother of invention, I think human beings will adapt because we have
01:56:15.580 | adapted and we will continue to adapt.
01:56:17.340 | - So to the degree we invest in the threat of this, it should be into the policies that
01:56:23.100 | help with the adaptation versus the mitigation.
01:56:25.220 | - Right, seawalls, geoengineering, developing technologies that suck carbon out of the air.
01:56:29.620 | Again, if I thought that there was more sort of hope for the green technologies currently
01:56:33.220 | in play, then subsidization of those technologies, I might be a little bit more for, but I haven't
01:56:36.700 | seen tremendous progress over the course of the last 30 years in the reliability of, for
01:56:41.020 | example, wind energy or the ability to store solar energy to the extent necessary to actually
01:56:47.620 | power a grid.
01:56:48.820 | - What's your thoughts on nuclear energy?
01:56:50.340 | - Nuclear energy is great.
01:56:51.340 | Nuclear energy is a proven source of energy and we should be radically extending the use
01:56:57.220 | of nuclear energy.
01:56:58.220 | I mean, to me, honestly, this is like a litmus test question as to whether you take climate
01:57:02.020 | change seriously.
01:57:03.020 | If you're on right or left and you take climate change seriously, you should be in favor of
01:57:05.540 | nuclear energy.
01:57:06.540 | If you're not, I know that you just have other priorities.
01:57:09.180 | - Yeah, the fascinating thing about the climate change debate is the dynamics of the fear
01:57:13.100 | mongering over the past few decades, 'cause somehow nuclear energy got tied up into
01:57:17.980 | that.
01:57:18.980 | There's a lot of fear about nuclear energy.
01:57:21.180 | It seems like there's a lot of social phenomena, social dynamics involved versus dealing with
01:57:27.420 | just science.
01:57:29.020 | It's interesting to watch.
01:57:31.220 | On my darker days, it makes me cynical about our ability to use reason and science
01:57:36.260 | to deal with the threats of the world.
01:57:39.660 | - I think that our ability to use reason and science to deal with threats of the world
01:57:42.860 | is almost a time frame question.
01:57:44.660 | So I think that, again, we're very bad at looking down the road and saying, you know,
01:57:49.020 | because people can't handle, for example, even things like compound interest.
01:57:52.300 | The idea that if I put a dollar in the bank today that 15 years from now, that's gonna
01:57:55.660 | be worth a lot more than a dollar.
01:57:57.220 | People can't actually see that.
01:57:58.620 | And so the idea of let's foresee a problem, then we'll deal with it right now as opposed
01:58:01.980 | to 30 years down the road.
01:58:02.980 | Typically, we let the problem happen and then we solve it.
01:58:05.580 | And it's bloodier and worse than it would have been if we had solved it 30 years ago.
01:58:09.060 | But it is, in fact, effective.
01:58:11.060 | And sometimes it turns out the solution that we're proposing 30 years in advance is not
01:58:14.380 | effective.
01:58:15.380 | And that can be a major problem as well.
01:58:17.300 | - Well then, to steelman the case for fear mongering, for irrational fear mongering:
01:58:22.680 | we need to be scared shitless in order for us to do anything.
01:58:26.260 | So, you know, I'm generally against that, but maybe on a population scale, maybe
01:58:32.340 | some of that is necessary for us to respond appropriately to long-term threats.
01:58:39.140 | We should be scared shitless.
01:58:40.140 | - I don't think that we can actually do that though.
01:58:42.820 | Like, first of all, I think that Platonic lies are generally bad.
01:58:47.280 | And then second of all, I don't think that we actually have the capacity to do this.
01:58:49.820 | I think that the people who are, you know, the sort of elites of our society who get
01:58:53.820 | together in rooms and talk about this sort of stuff, and I've been in some of those meetings
01:58:56.860 | at my synagogue Friday nights, actually.
01:58:58.860 | - I was gonna make the joke, but I'm glad you did.
01:59:02.780 | - Yeah, you know, I've been in rooms, Davos-like rooms.
01:59:06.540 | And when people discuss these sorts of topics, and they're like, what if we just tell people
01:59:10.420 | that it's gonna be a disaster with tsunamis, like The Day After Tomorrow?
01:59:12.860 | It's like, you guys don't have that power.
01:59:14.780 | You don't.
01:59:15.780 | And by the way, you dramatically undercut your own power because of COVID to do this
01:59:18.780 | sort of stuff.
01:59:19.780 | Because a lot of the sort of, what if we scare the living hell out of you to the point where
01:59:23.220 | you stay in your own house for two years, and we tell you you can't send your kids to
01:59:26.940 | school, and then we tell you that the vaccine is gonna prevent transmission, and then we
01:59:31.820 | also tell you that we need to spend $7 trillion in one year, and it won't have any inflationary
01:59:36.340 | effect, and it turns out you're wrong on literally all of those things.
01:59:40.140 | The last few years have done more to undermine institutional trust than any time in probably
01:59:43.880 | American history.
01:59:44.880 | It's pretty amazing.
01:59:45.880 | - Yeah, I tend to agree with that.
01:59:46.880 | The only thing we have to fear is fear itself.
01:59:49.620 | Let me ask you, back to the question of God, a big ridiculous question: who's God?
01:59:57.260 | - Who is God?
01:59:58.260 | So I'm going to use sort of the Aquinas formulation of what God is.
02:00:06.100 | That if there is a cause of all things, not physical things, if there is a cause underlying
02:00:12.420 | the reason of the universe, then that is the thing we call God.
02:00:16.940 | So not a big guy in the sky with a beard.
02:00:20.900 | He is the force underlying the logic of the universe, if there is a logic to the universe,
02:00:26.980 | and he is the creator in the Judaic view of that universe, and he does have an interest
02:00:35.260 | in us living in accordance with the laws of the universe that if you're a religious Jew
02:00:40.980 | are encoded in the Torah, but if you're not a religious Jew, it would be encoded in the
02:00:44.660 | natural law by sort of Catholic theology.
02:00:48.020 | - Why do you think God created the universe, or as popularly asked, what do you think is
02:00:52.860 | the meaning behind it?
02:00:55.300 | What's the meaning of life?
02:00:56.500 | - What's the meaning of life?
02:00:58.080 | So I think that the meaning of life is to fulfill what God made you to do, and that
02:01:04.340 | is a series of roles.
02:01:06.380 | I think that human beings, and here you have to look to sort of human nature, rather than
02:01:09.840 | looking kind of to big questions.
02:01:13.540 | I've evolved something that I've really been working on, I'm writing a book about this
02:01:17.100 | actually that I call colloquially role theory, and basically the idea is that the way that
02:01:24.260 | we interact with the world is through a series of roles, and those are also the things we
02:01:27.060 | find most important and most implementable.
02:01:31.380 | There's sort of virtue ethics, which suggests that if we act in accordance with virtue,
02:01:36.620 | like Aristotle, then we will be living the most fulfilled and meaningful life, and then
02:01:41.580 | you have sort of deontological ethics, like Kantian ethics, that it's a rule-based ethic.
02:01:45.780 | If you follow the rules, then you'll find the meaning of life, and then what I'm proposing
02:01:51.300 | is that there's something that I would call role ethics, which is there are a series of
02:01:54.620 | roles that we play across our lives, which are also the things that we tend to put on
02:01:57.420 | our tombstones and find the most meaningful.
02:01:59.420 | So when you go to a cemetery, you can see what people found the most meaningful, because
02:02:03.180 | it's the stuff they put on the stone that has like four words on it, right?
02:02:05.940 | They're like beloved father, beloved mother, sister, brother, and you might have a job
02:02:11.420 | once in a while, a creator, a religious person, right?
02:02:15.900 | These are all roles that have existed across societies and across humanity, and those are
02:02:19.300 | the things where we actually find meaning, and the way that we navigate those roles brings
02:02:23.660 | us meaning, and I think that God created us in order to fulfill those roles for purposes
02:02:30.100 | that I can't begin to understand because I ain't him, and the more we recognize those
02:02:35.980 | roles and the more we live those roles, and then we can express freedom within those roles.
02:02:40.420 | I think that liberty exists inside each of those roles, and that's what makes all of
02:02:43.620 | our lives different and fun.
02:02:44.940 | We all parent in different ways, but being a parent is a meaningful role.
02:02:48.060 | We all have spouses, but how you interact in that relationship is what makes your life
02:02:52.780 | meaningful and interesting.
02:02:55.180 | That is what we were put on earth to do, and if we perform those roles properly, and those
02:02:59.140 | roles do include things like being a creator, like we have a creative instinct as human
02:03:02.260 | beings, being a creator, being an innovator, being a defender of your family, being a social
02:03:08.980 | member of your community, which is something that we're built to do.
02:03:11.340 | If we fulfill those roles properly, then we will have made the world a better place than
02:03:14.500 | we inherited it, and we will also have had the joy of experiencing the sort of flow they
02:03:21.000 | talk about in psychology, where when you engage in these roles, you actually do feel a flow.
02:03:26.340 | - So these roles are a fundamental part of the human condition?
02:03:30.220 | So the book you're working on is constructing a system to help us understand these roles?
02:03:37.820 | - It's looking at, let's assume that all that's true.
02:03:40.420 | The real question of the book is how do you construct a flourishing and useful society
02:03:45.620 | and politics?
02:03:46.620 | - Ah, so a society level.
02:03:49.140 | If this is our understanding of a human being, how do we construct a good society?
02:03:52.700 | - Right, exactly, because I think that a lot of political theory is right now based in
02:03:56.580 | either J.S. Mill kind of thought, which is all that a good politics does is allow you
02:04:01.980 | to wave your hand around until you hit somebody in the face, or Rawlsian thought, which is
02:04:05.580 | what if we constructed society in order to achieve the most for the least, essentially?
02:04:10.620 | What if we constructed society around what actually makes humans the most fulfilled,
02:04:15.640 | and that is the fulfillment of these particular roles?
02:04:20.700 | And where does liberty come into that, right?
02:04:21.900 | How do you avoid the idea of a tyranny in that?
02:04:24.500 | You have to be a mother.
02:04:25.500 | You must be a father.
02:04:26.500 | You must be a ... Where does freedom come into that?
02:04:28.340 | Can you reject those roles totally as a society and be okay?
02:04:31.220 | The answer probably is not.
02:04:32.220 | So you need a society that actually promotes and protects those roles, but also protects
02:04:38.320 | the freedom inside those roles.
02:04:39.980 | And that raises a more fundamental question of what exactly liberty is for, and I think
02:04:43.420 | that both the right and the left actually tend to make a mistake when they discuss liberty.
02:04:47.860 | The left tends to think that liberty is an ultimate good, that simple choice makes a
02:04:52.240 | bad thing good, which is not true.
02:04:54.220 | I think the right talks about liberty in almost the same terms sometimes, and I think that's
02:04:58.060 | not true either.
02:04:59.060 | The question is whether liberty is of inherent value or instrumental value.
02:05:03.940 | Is liberty good in and of itself, or is liberty good because it allows you to achieve X, Y,
02:05:08.620 | or Z?
02:05:09.620 | I've thought about this one a lot, and I tend to come down on the latter side of the aisle.
02:05:13.380 | You asked me areas where I've moved.
02:05:14.500 | This may be an area where I've moved. I think when you think more shallowly about
02:05:17.380 | politics, or maybe more quickly, because this is how we talk in America, about liberties
02:05:21.500 | and rights, we tend to think that rights, not like the political right,
02:05:25.500 | rights make things good, liberties make things good.
02:05:28.140 | The question really is what are those rights and liberties for?
02:05:30.620 | Now you have to be careful so that that doesn't shade into tyranny, right?
02:05:34.740 | You can only have liberty to do the thing that I say that you can do.
02:05:37.780 | But there have to be spheres of liberty that are roiling and interesting and filled with
02:05:42.540 | debate, but without threatening the chief institutions that surround those liberties,
02:05:47.340 | because if you destroy the institutions, the liberties will go too.
02:05:50.300 | If you knock down the pillars of the society, the liberties that are on top of those pillars
02:05:53.420 | are going to collapse, and I think that that's, if people are feeling as though we're on the
02:05:57.060 | verge of tyranny, I think that's why.
02:05:59.700 | - This is fascinating, by the way.
02:06:01.780 | It's an instrumental perspective on liberty.
02:06:04.540 | That's going to have to give me a lot to think about.
02:06:07.580 | Let me ask a personal question.
02:06:09.940 | Was there ever a time that you had a crisis of faith, where you questioned your belief
02:06:13.620 | in God?
02:06:14.620 | - Sure.
02:06:15.620 | I mean, I would less call it a crisis of faith than an ongoing question of faith, which
02:06:19.700 | I think is true of, I hope, most religious people.
02:06:23.020 | The word Israel, right, in Hebrew, Yisrael, means to struggle with God.
02:06:28.380 | That's literally what the word means.
02:06:29.740 | And so the idea of struggling with God, right, if you're Jewish, you're b'nai Yisrael, right?
02:06:35.500 | The idea of struggling with God, I think, is endemic to the human condition.
02:06:39.100 | If you understand what God's doing, then I think you're wrong.
02:06:42.740 | And if you think that that question doesn't matter, then I think you're also wrong.
02:06:46.860 | I think that God is a very necessary hypothesis.
02:06:50.500 | - The struggle with God is life.
02:06:52.740 | That is the process of life.
02:06:53.940 | - That's right, because you're never going to get to that answer.
02:06:56.060 | Otherwise, you'd be God, and you aren't.
02:06:57.900 | - Why does God allow cruelty and suffering in the world?
02:07:01.020 | One of the tough questions.
02:07:02.460 | So we're going deep here.
02:07:04.940 | There's two types of cruelty and suffering.
02:07:07.280 | So if we're talking about human cruelty and suffering, because God does not intervene
02:07:10.580 | to prevent people from exercising their free will, because to do so would be to deprive
02:07:15.500 | human beings of the choice that makes them human.
02:07:18.420 | This is the sin of the Garden of Eden, basically, is that God could make you an angel, in which
02:07:23.340 | case you wouldn't have the choice to do the wrong thing.
02:07:26.140 | But so long as we are going to allow for cause and effect in a universe shaped by your choice,
02:07:31.460 | cruelty and evil are going to exist.
02:07:33.820 | And then there's the question of just the natural cruelty and vicissitudes of life.
02:07:38.380 | And the answer there is I think that God obscures himself.
02:07:40.780 | I think that if God were to appear in all of his glory to people on a regular basis,
02:07:45.100 | I think that would make faith, you wouldn't need it.
02:07:48.540 | There'd be no such thing as faith.
02:07:50.540 | It would just be reality.
02:07:52.740 | Nobody has to prove to you that the sun rises every day.
02:07:55.680 | But if God is to allow us the choice to believe in him, which is the ultimate choice from
02:07:59.960 | a religious point of view, then he's going to have to obscure himself behind tragedy
02:08:03.280 | and horror and all those other things.
02:08:05.460 | And this is a fairly well-known Kabbalistic concept called Tzimtzum in Judaism, which
02:08:09.940 | is the idea that when God created the universe, he sort of withdrew in order to make space
02:08:14.020 | for all of these things to happen.
02:08:15.620 | - So God doesn't have an instrumental perspective on liberty?
02:08:19.100 | - In a chief sense, he does, because the best use of liberty is going to be belief in him.
02:08:26.060 | And you can misuse your liberty, right?
02:08:28.500 | There will be consequences if you believe in an afterlife, or if you believe in sort
02:08:32.360 | of a generalized better version of life led by faith, then liberty does have a purpose.
02:08:38.040 | But he also believes that you have to give people from a cosmic perspective the liberty
02:08:41.740 | to do wrong without threatening all the institutions of society.
02:08:46.140 | I mean, that's why it does say in the Bible that if man sheds blood, by man shall his
02:08:49.900 | blood be shed, right?
02:08:50.900 | There are punishments that are in biblical thought for doing things that are wrong.
02:08:56.160 | - So for a human being who lacks the faith in God, so if you're an atheist, can you still
02:09:01.900 | be a good person?
02:09:02.900 | - Of course, 100%.
02:09:03.900 | And there are a lot of religious people who are crappy people.
02:09:06.580 | - How do we understand that tension?
02:09:08.180 | - Well, from a religious perspective, what you would say is that it is perfectly plausible
02:09:12.020 | to live in accordance with a set of rules that don't damage other people without believing
02:09:16.900 | in God.
02:09:17.900 | You just might be understanding the reason for doing that wrong, is what a religious
02:09:20.740 | person would say.
02:09:21.740 | This is the conversation, again, that I had with Sam, basically, is you and I agree, I
02:09:26.660 | said this to Sam, you and I agree on nearly everything when it comes to morality.
02:09:29.060 | Like we probably disagree on 15 to 20% of things.
02:09:31.660 | The other 80% is because you grew up in a Judeo-Christian society and so do I, and we
02:09:34.860 | grew up 10 miles from each other, you know, around the turn of the millennium.
02:09:37.980 | So there's that.
02:09:40.060 | So you can perfectly well be an atheist living a good, moral, decent life, because you can
02:09:46.220 | live a good, moral, decent life with regard to other people without believing in God.
02:09:48.900 | I don't think you can build a society on that, because I think that, you know, that relies
02:09:52.400 | on the sort of goodness of mankind, natural goodness of mankind.
02:09:56.940 | I don't believe in the natural goodness of mankind.
02:09:58.460 | - You don't?
02:09:59.460 | - No, I believe that man is created with both the capacity
02:10:02.820 | for sin and the capacity for good.
02:10:04.940 | - But if you let them be on their own, doesn't it lead-
02:10:09.220 | - Without social institutions to shape them, I think that's very likely to go poorly.
02:10:13.300 | - Oh, interesting.
02:10:14.300 | Well, we came to something we disagree on.
02:10:16.900 | But that may be, that might reflect itself in our approach to Twitter as well.
02:10:22.220 | I think if humans are left on their own, they tend towards good.
02:10:28.580 | They definitely have the capacity for good and evil, but I, when left on their own, they're,
02:10:33.820 | I tend to believe they're good.
02:10:35.860 | >>I think they might be good with limits.
02:10:37.940 | What I mean by that is that, what the evidence I think tends to show is that human beings
02:10:41.540 | are quite tribal.
02:10:42.860 | So what you'll end up with is people who are good with their immediate family and maybe
02:10:46.580 | their immediate neighbors, and then when they're threatened by an outside tribe, then they
02:10:49.580 | kill everyone.
02:10:51.220 | Which is sort of the history of civilization in the pre-civilizational era, which was a
02:10:54.740 | very violent time.
02:10:55.740 | - Pre-civilizational era was quite violent.
02:10:58.180 | Do you think, on the topic of tribalism in our modern world, what are the pros and cons
02:11:03.620 | of tribes?
02:11:05.180 | Is that something we should try to outgrow as a civilization?
02:11:08.660 | >>I don't think it's ever going to be possible to fully outgrow tribalism.
02:11:12.860 | I think it's a natural human condition to want to be with people who think like you
02:11:17.500 | or have a common set of beliefs.
02:11:19.980 | And I think trying to obliterate that in the name of a universalism likely leads to utopian
02:11:23.700 | results that have devastating consequences.
02:11:26.300 | Utopian sort of universalism has been failing every time it's tried.
02:11:30.740 | Whether you're talking about, now it seems to be, sort of a liberal universalism, which
02:11:34.500 | is being rejected by a huge number of people around the world in various different cultures.
02:11:37.780 | Or you're talking about religious universalism, which typically comes with religious tyranny.
02:11:44.380 | Or you're talking about communistic or Nazi-esque sort of universalism, which comes with mass
02:11:49.180 | slaughter.
02:11:50.180 | So this is, you know, universalism, I'm not a believer in.
02:11:52.180 | I think that you have some values that are fairly limited that all human beings should
02:11:58.220 | hold in common and that's pretty much it.
02:12:01.100 | I think that everybody should have the ability to join with their own culture.
02:12:06.420 | I think how we define tribes is a different thing.
02:12:08.740 | So I think that tribes should not be defined by innate physical characteristics, for example.
02:12:13.820 | Because I think that, thank God as a civilization we've outgrown that.
02:12:17.180 | And I think that that is a childish way to view the world.
02:12:22.700 | All the tall people aren't a tribe.
02:12:24.140 | All the black people aren't a tribe.
02:12:25.140 | All the white people aren't a tribe.
02:12:26.140 | - So the tribes should be formed over ideas versus physical characteristics.
02:12:29.620 | - That's right.
02:12:30.620 | Which is why, actually to go back to sort of the beginning of the conversation when
02:12:32.580 | it comes to Jews, you know, I'm not a big believer in ethnic Judaism.
02:12:39.020 | As a person who takes Judaism seriously, Judaism is more to me than you were born with a last
02:12:43.060 | name like Berg or Stein.
02:12:45.020 | And so I've made a controversial-
02:12:46.020 | - I would disagree with you.
02:12:47.020 | - He would disagree with me, but that's because he was a tribalist, right?
02:12:49.540 | Who thought in racial terms.
02:12:51.540 | - So maybe robots will help us see humans as one tribe.
02:12:56.020 | Maybe that-
02:12:57.020 | - This is Reagan's idea, right?
02:12:58.020 | Reagan said, "Well, if there's an alien invasion, then we'll all be on the same side."
02:13:00.620 | So I'll go over to the Soviets and we'll talk about it.
02:13:02.020 | - There's some deep truth to that.
02:13:05.460 | What does it mean to be a good man?
02:13:08.380 | The various role that a human being takes on in this role theory that you've spoken
02:13:14.380 | about, what does it mean to be good?
02:13:16.540 | - It means to perform, now I will do Aristotle, it means to perform the function
02:13:22.420 | well.
02:13:23.420 | And what Aristotle says is the good is not like moral good, moral evil in the way that
02:13:27.100 | we tend to think about it.
02:13:28.700 | He meant that a good cup holds liquid, a good spoon holds soup.
02:13:33.180 | It means that a thing that is broken can't hold those things, right?
02:13:36.540 | So the idea of being a good person means that you are fulfilling the function for which
02:13:40.060 | you were made.
02:13:41.060 | It's a teleological view of humanity.
02:13:44.460 | So if you're a good father, this means that you are bringing up your child with durable
02:13:47.660 | values that are going to bring them up healthy, capable of protecting themselves and passing
02:13:52.180 | on the traditional wisdom of the ages to future generations while allowing for the capacity
02:13:55.500 | for innovation.
02:13:56.500 | That'd be being a good father.
02:13:57.500 | Being a good spouse would mean protecting and unifying with your spouse and building
02:14:03.340 | a safe family and a place to raise children.
02:14:07.940 | Being a good citizen of your community means protecting the fellow citizens of your community
02:14:11.500 | while incentivizing them to build for themselves.
02:14:15.180 | And it becomes actually much easier to think about. This is why I like role theory,
02:14:19.340 | because it's very hard in sort of virtue theory to say, be generous.
02:14:23.260 | Okay, how does that manifest?
02:14:24.620 | I don't know what that looks like.
02:14:26.260 | Sometimes being generous might be being not generous to other people, right?
02:14:31.260 | When Aristotle says that you should be benevolent, like what does that mean?
02:14:33.980 | This is very vague.
02:14:34.980 | But when you say be a good dad, most people sort of have a gut level understanding of
02:14:38.060 | what it means to be a good dad.
02:14:39.060 | And mostly they have a gut level understanding of what it means to really be a really bad dad.
02:14:42.660 | And so what it means to be a good man is to fulfill those roles, as many of them as you
02:14:46.980 | can properly and at full function.
02:14:50.060 | And that's a very hard job.
02:14:51.060 | I've said before that, you know, because I engage a lot with the public and all of this,
02:14:54.980 | you know, the word great comes up a lot.
02:14:56.220 | What does it take to be a great leader?
02:14:57.220 | What does it take to be a great person?
02:14:58.580 | And I've always said to people, it's actually fairly easy to be great.
02:15:01.260 | It's very difficult to be good.
02:15:02.500 | There are a lot of very great people who are not very good.
02:15:05.380 | And there are not a lot of good people.
02:15:06.700 | And most of them, you know, frankly, most good people die mourned by their family and
02:15:12.820 | friends and two generations later, they're forgotten.
02:15:14.540 | But those are the people who incrementally move the ball forward in the world, sometimes
02:15:17.980 | much more than the people who are considered great.
02:15:21.260 | - Understand the role in your life that involves being a cup and be damn good at it.
02:15:25.940 | - Exactly, that's right.
02:15:27.180 | - Hold the soup.
02:15:28.180 | - It's very Jordan Peterson.
02:15:30.180 | - It's very like lobster with Jordan Peterson.
02:15:32.740 | I think people will quote you for years and years to come on that.
02:15:37.420 | A lot of young people look up to you. What advice would you give them?
02:15:40.220 | Despite their better judgment?
02:15:43.620 | No, I'm just kidding.
02:15:44.620 | - Maybe.
02:15:45.620 | - Only kidding, only kidding.
02:15:47.660 | They seriously look up to you and draw inspiration from your ideas, from your bold thinking.
02:15:52.900 | What advice would you give to them?
02:15:54.700 | How to live a life worth living, how to have a career they can be proud of and everything
02:16:02.020 | like that.
02:16:03.020 | - So live out the values that you think are really important and seek those values in
02:16:07.780 | others would be the first piece of advice.
02:16:10.700 | Second piece of advice, don't go on Twitter until you're 26.
02:16:13.940 | - Why 26?
02:16:15.180 | - Because your brain is fully developed at that point.
02:16:18.380 | As I said early on, I was on social media and writing columns from the time I was 17.
02:16:23.020 | It was a great opportunity and as it turns out, a great temptation to say enormous numbers
02:16:26.980 | of stupid things when you're young.
02:16:29.380 | You're kind of trying out ideas and you're putting them on, you're taking them off and
02:16:31.680 | social media permanentizes those things and engraves them in stone and then that's used
02:16:35.960 | against you for the rest of your life.
02:16:37.700 | So I tell young people this all the time.
02:16:39.060 | If you want to be on social media, be on social media, but don't post. Watch, if you want to
02:16:42.940 | take in information. And more importantly, you should read books.
02:16:46.160 | As far as other advice, I'd say engage in your community.
02:16:50.140 | There's no substitute for engaging in your community and engage in interpersonal action
02:16:53.740 | because that will soften you and make you a better person.
02:16:57.380 | I've become a better person since I got married.
02:16:59.220 | I've become an even better person since I've had kids.
02:17:01.060 | So you can imagine how terrible I was before all these things.
02:17:06.220 | Engaging your community does allow you to build the things that matter on the most local
02:17:10.740 | possible level.
02:17:11.740 | I mean, the outcome, by the way, of the sort of politics of fulfillment
02:17:14.660 | that I was talking about earlier is a lot of localism, because the roles that I'm talking
02:17:17.900 | about are largely local roles.
02:17:19.460 | That stuff has to be protected locally.
02:17:21.180 | I think we focus way too much in this country and others on, like, world-beating solutions,
02:17:25.140 | national solutions, solutions that apply to hundreds of millions of people.
02:17:28.140 | How do we get to the solutions that apply for like five?
02:17:31.140 | And then we get to the solutions that apply to like 20.
02:17:32.820 | And then we get to the solutions that involve 200 people or a thousand people.
02:17:36.780 | Let's solve that stuff.
02:17:37.860 | And I think the solutions at the higher level flow bottom up, not top down.
02:17:42.020 | - What about mentors and maybe role models?
02:17:45.220 | Have you had a mentor or maybe people you look up to either you interacted on a local
02:17:50.780 | scale like you actually knew them or somebody you really looked up to?
02:17:53.220 | - For me, I'm very lucky.
02:17:54.220 | I grew up in a very solid two parent household.
02:17:56.020 | I'm extremely close to my parents.
02:17:58.100 | I've lived near my parents literally my entire life with the exception of three years of
02:18:01.420 | law school.
02:18:03.220 | And like right now they live a mile and a half from us.
02:18:06.260 | That's so- - What'd you learn about life from
02:18:10.340 | your parents and your father?
02:18:12.740 | - So, man, so many things from my parents.
02:18:16.220 | - Good and bad.
02:18:17.220 | - That's a hard one.
02:18:18.220 | I mean, I think the good stuff from my dad is that you should hold true to your values.
02:18:22.620 | He's very big on you have values, those values are important, hold true to them.
02:18:26.020 | - Did you understand what your values are, what your principles are early on?
02:18:29.340 | - Fairly quickly, yeah.
02:18:31.180 | And so he was very big on that, which is why, for example, I get asked a lot in the Jewish
02:18:35.900 | community why I wear a kippah.
02:18:37.420 | And the answer is it never occurred to me to take off the kippah.
02:18:39.940 | I always wore it.
02:18:40.940 | Why would I take it off at any point?
02:18:42.720 | That's the life that I want to live and that's the way it is.
02:18:46.420 | So that was a big one from my dad.
02:18:48.140 | From my mom, practicality.
02:18:49.380 | My dad is more of a dreamer, my mom is much more practical.
02:18:51.940 | And so the sort of lesson that I learned from my dad, and this is
02:18:57.460 | sort of the counter-lesson, is that you can have a good idea, but if you don't have a
02:19:00.100 | plan for implementation, then it doesn't end up as reality.
02:19:03.240 | And I think actually he's learned that better over the course of his life too.
02:19:06.460 | But my dad, from the time I was very young, he wanted me to engage with other adults and
02:19:11.020 | he wanted me to learn from other people.
02:19:13.420 | And one of his rules was if he didn't know something, he would find somebody who he thought
02:19:17.020 | did know the thing for me to talk to.
02:19:18.860 | That was a big thing.
02:19:20.460 | So I'm very lucky.
02:19:21.460 | I have wonderful parents.
02:19:22.460 | As far as sort of other mentors, in terms of media, Andrew Breitbart was a mentor.
02:19:28.140 | Andrew obviously, he was kind of known in his latter days, I think more for the militancy
02:19:31.620 | than when I was very close with him.
02:19:34.780 | - So for somebody like me who knows more about the militancy, can you tell me...
02:19:40.780 | what makes him a great man?
02:19:42.820 | - What made Andrew great is that he engaged with everyone.
02:19:45.500 | I mean everyone.
02:19:46.740 | So there are videos of him rollerblading down the boulevard and people would be protesting
02:19:51.300 | and he would literally like rollerblade up to them and he would say, "Let's go to lunch
02:19:54.220 | together."
02:19:55.220 | And he would just do this.
02:19:56.220 | That's actually who Andrew was.
02:19:57.220 | - What was the thinking behind that?
02:19:58.940 | - He was, he was just garrulous.
02:20:00.700 | He was much more outgoing than I am actually.
02:20:02.660 | He was very warm with people.
02:20:04.620 | For me, I would say that with Andrew, I knew Andrew from, I remember, when I was 16, and he passed
02:20:12.260 | away when I would have been 28.
02:20:14.860 | So I knew Andrew for 10, 12 years.
02:20:17.540 | And people who met Andrew for about 10 minutes knew Andrew 99% as well as I knew Andrew.
02:20:25.480 | Because he was just all out front.
02:20:27.020 | Everything was out here and he loved talking to people.
02:20:28.980 | He loved engaging with people.
02:20:30.640 | And so this made him a lot of fun and unpredictable and fun to watch and all of that.
02:20:34.460 | And then I think Twitter got to him.
02:20:35.500 | I think, you know, one of the lessons I learned from Andrew is the counter
02:20:39.420 | lesson, which is that Twitter can poison you.
02:20:41.060 | Twitter can really wreck you.
02:20:42.260 | If you spend all day on Twitter reading the comments and getting angry at people who are
02:20:45.100 | talking about you, it becomes a very difficult life.
02:20:47.740 | And I think that, you know, in the last year of his life, Andrew got very caught up in
02:20:51.580 | that because of a series of sort of circumstances.
02:20:54.140 | - It can actually affect your mind.
02:20:55.140 | It can actually make you resentful, all that kind of stuff.
02:20:57.500 | - I tend to agree with that.
02:20:58.900 | So, but the lesson that I learned from Andrew is engage with everybody, take joy in sort
02:21:03.500 | of the mission that you're given.
02:21:06.460 | And you can't always fulfill that.
02:21:07.700 | You know, sometimes it's really rough and difficult.
02:21:09.100 | I'm not going to pretend that it's all fun and rainbows all the time because it isn't.
02:21:13.620 | And some of the stuff that I have to cover, I don't like.
02:21:15.460 | And some of the things I have to say, I don't particularly like, you know, like that happens.
02:21:19.700 | But that's what I learned from Andrew.
02:21:22.280 | As far as sort of other mentors, I had some teachers when I was a kid who, you know, said
02:21:27.100 | things that stuck with me.
02:21:28.100 | I had a fourth grade teacher named Miss Janetti who said, "Don't let potential be written
02:21:31.580 | on your tombstone," which was a pretty--
02:21:34.060 | - That's a good line.
02:21:35.060 | - It's a great line, particularly to a fourth grader.
02:21:37.660 | But it was, you know... I also had an 11th grade English teacher named Anthony Miller
02:21:42.340 | who was terrific, a really good writer.
02:21:43.860 | He'd studied James Joyce at Trinity College in Dublin.
02:21:46.620 | And so he and I really got along and he helped my writing a lot.
02:21:51.260 | - Did you ever have doubt in yourself?
02:21:53.020 | I mean, especially as you've gotten into the public eye with all the attacks, did you ever
02:21:57.180 | doubt your ability to stay strong, to be able to be a voice of the ideas that you represent?
02:22:03.020 | - Definitely. I don't doubt my ability to say what I want to say.
02:22:06.060 | I doubt my ability to handle the emotional blowback of saying it, meaning that that's
02:22:09.820 | difficult.
02:22:10.820 | I mean, again, to take just one example, in 2016, the ADL measured that I was the number
02:22:16.860 | one target of antisemitism on planet Earth.
02:22:19.660 | You know, that's not fun.
02:22:21.300 | You know, it's unpleasant.
02:22:22.740 | And when you take critiques, not from antisemites, but when you take critiques from people generally,
02:22:26.940 | we talked about near the beginning how you surround yourself with people who are gonna
02:22:31.780 | give you good feedback.
02:22:32.780 | Sometimes it's hard to tell.
02:22:33.780 | Sometimes people are giving you feedback and you don't know whether it's well motivated
02:22:36.860 | or poorly motivated.
02:22:38.720 | And if you are trying to be a decent person, you can't cut off the mechanism of feedback.
02:22:42.380 | And so what that means is sometimes you take to heart the wrong thing or you take it to
02:22:46.580 | heart too much.
02:22:48.420 | You're not light enough about it.
02:22:49.420 | You take it very, very seriously.
02:22:51.220 | You lose sleep over it.
02:22:52.220 | I mean, I can't tell you the number of nights where I've just not slept because of some
02:22:54.980 | critique somebody's made of me and I've thought to myself, maybe that's right.
02:22:58.060 | And sometimes it is right.
02:22:59.660 | And you know, that's-
02:23:00.660 | - Some of that is good, stewing in that criticism, but some of it can destroy you.
02:23:04.940 | Do you have a shortcut?
02:23:05.940 | So Rogan has talked about taking a lot of mushrooms.
02:23:09.260 | Since you're not into the mushroom thing, what's your escape from that?
02:23:14.820 | Like when you get low, when you can't sleep.
02:23:17.900 | - Usually writing is a big one for me.
02:23:19.460 | So writing for me is cathartic.
02:23:20.820 | I love writing.
02:23:22.900 | That is a huge one.
02:23:24.900 | Spending time with my family.
02:23:25.900 | Again, I usually have a close circle of friends who I will talk with in order to sort of bounce
02:23:31.500 | ideas off of them.
02:23:33.340 | And then once I've kind of talked it through, I tend to feel a little bit better.
02:23:38.220 | Exercise is also a big one.
02:23:39.220 | I mean, if I go a few days without exercise, I tend to get pretty grumpy pretty quickly.
02:23:43.020 | I mean, I gotta keep the six pack going somehow, man.
02:23:46.460 | - There you and Rogan agree.
02:23:49.620 | We haven't, aside from Twitter, mentioned love.
02:23:52.780 | What's the role of love in the human condition?
02:23:55.580 | Ben Shapiro.
02:23:56.580 | - Man, I don't get asked about love too much.
02:23:59.100 | In fact, I was...
02:24:02.620 | - You don't get that question on college campuses?
02:24:04.020 | - No, I typically don't actually.
02:24:05.660 | In fact, we were at an event recently, it was a Daily Wire event.
02:24:10.180 | And in the middle of this event, it was a meet-and-greet with some of the audience.
02:24:13.540 | And in the middle of this event, this guy walks by with this girl, they're talking and
02:24:17.020 | they're talking to me and their time kind of runs out, the security's moving them along.
02:24:19.660 | He says, "No, no, no, wait, hold on a minute."
02:24:21.060 | And he gets down on one knee and he proposes to the girl in front of me.
02:24:23.820 | And I said to him, "This is the weirdest proposal in human history.
02:24:27.860 | What is happening right now?
02:24:29.260 | Why was I your choice of Cupid here?"
02:24:31.180 | I said, "Well, you know, we actually got together 'cause we listened to your show."
02:24:35.340 | And I said, "Well, I can perform it like a Jewish marriage right now.
02:24:37.260 | We're gonna need a glass, we're gonna need some wine, it's gonna get weird real fast."
02:24:41.620 | But yeah, so love is a topic I'm typically not asked too much about.
02:24:46.020 | The role of love is important in binding together human beings who ought to be bound together
02:24:55.300 | and the role of respect is even more important in binding together broader groups of people.
02:25:00.100 | I think one of the mistakes that we make in politics is trying to substitute love for
02:25:02.660 | respect or respect for love and I think that's a big mistake.
02:25:05.180 | So I do not bear tremendous love for random strangers in the same sense that I do for
02:25:11.740 | my family.
02:25:12.740 | I don't.
02:25:13.740 | I love my family, I love my kids.
02:25:14.740 | Anybody who tells you they love your kid as much as you love your kid is lying to you,
02:25:17.300 | it's not true.
02:25:18.300 | I love my community more than I love other communities.
02:25:22.140 | I love my state more than I love other states.
02:25:24.140 | I love my country more than I love other countries, right?
02:25:26.380 | Like that's all normal and that's all good.
02:25:29.500 | The problem of empathy can be when that becomes so tight-knit that you're not outward looking,
02:25:34.700 | that you don't actually have respect for other people.
02:25:36.780 | So in the local level, you need love in order to protect you and shield you and give you
02:25:40.540 | the strength to go forward and then beyond that, you need a lot of respect for people
02:25:43.780 | who are not in the circle of love.
02:25:46.260 | And I think trying to extend love to people who either are not gonna love you back or
02:25:52.740 | are gonna slap you in the face for it, or who you're just not that close to, it either
02:25:57.220 | runs the risk of being ersatz and fake, or it can actually be counterproductive in
02:26:03.220 | some senses.
02:26:04.220 | - Well, there's some sense in which you could have love for other human beings just based
02:26:11.540 | on the humanity that connects everybody, right?
02:26:14.940 | So you love this whole project that we're a part of.
02:26:19.500 | And actually, another thing we disagree on, so loving a stranger, like having that basic
02:26:27.500 | empathy and compassion towards a stranger, even if it can hurt you, I think is ultimately
02:26:33.980 | a, that to me is what it means to be a good man, to live a good life, is to have that
02:26:42.460 | compassion towards strangers.
02:26:43.460 | 'Cause to me, it's easy and natural and obvious to love people close to you, but to step outside
02:26:49.980 | yourself and to love others, I think that's the fabric of a good society.
02:26:55.060 | You don't think there's value to that?
02:26:56.820 | - I think there can be, but I think we're also discussing love almost in two different
02:26:59.420 | senses, meaning that when I talk about love, what I think of immediately is the love I
02:27:03.700 | bear for my wife and kids or my parents or my siblings.
02:27:08.060 | - Or the love of friendship.
02:27:09.060 | - Or the love of my close friends.
02:27:10.660 | - Yeah.
02:27:11.660 | - Okay, but using that same term to describe how I feel about strangers,
02:27:17.020 | I think, would just be inaccurate.
02:27:18.740 | And so that's why I'm suggesting that respect might be a more solid and realistic foundation
02:27:24.460 | for the way that we treat people far away from us or people who are strangers, respect
02:27:28.020 | for their dignity, respect for their priorities, respect for their role in life.
02:27:33.380 | It might be too much of an ask, in other words.
02:27:36.060 | There might be the rare human being who's capable of literally loving a homeless man
02:27:39.100 | on the street the way that he loves his own family.
02:27:41.180 | But if you respect the homeless man on the street the way that you respect your own family,
02:27:46.420 | because everyone deserves that respect, I think that you get to the
02:27:50.020 | same end without forcing people into a position of unrealistically expecting themselves to
02:27:56.940 | feel a thing they don't feel.
02:27:58.940 | One of the big questions that comes up in religion is that God makes certain requests that
02:28:02.860 | you feel certain ways.
02:28:03.860 | You're supposed to be (speaking in foreign language)
02:28:05.660 | You're supposed to be happy about certain things.
02:28:08.260 | You're supposed to love thy neighbor as thyself.
02:28:10.380 | You'll notice that in that statement, it's thy neighbor.
02:28:13.100 | It's not just generally anyone.
02:28:15.300 | It's love thy neighbor as thyself.
02:28:16.300 | In any case, the--
02:28:17.300 | - I think that extends to anyone that follows you on Twitter.
02:28:20.700 | Thy neighbor, 'cause God anticipated the social network aspect that is not constrained by
02:28:25.580 | geography.
02:28:27.220 | - Yeah, I'm gonna differ with you on the interpretation of that.
02:28:29.460 | But in any case, the sort of, the kind of extension of love outwards might be too big
02:28:37.260 | an ask.
02:28:38.260 | So maybe we can start with respect and then hopefully out of that respect can grow something
02:28:42.260 | more if people earn their way in.
02:28:45.100 | 'Cause I think that one of the big problems when we're talking about universalism is when
02:28:47.660 | people say like, "I'm a world citizen.
02:28:49.660 | I love people of the other country as much as I love myself or as much as I love my country."
02:28:54.620 | It tends to actually lead to an almost cram-down utopianism that I think can be kind of
02:29:01.100 | difficult because with love comes a certain expectation of solidarity.
02:29:06.780 | And I think, right, I mean, when you love your family, you love your wife, like there's
02:29:09.660 | a certain level of solidarity that is required inside the home in order to preserve the most
02:29:13.300 | loving kind of home.
02:29:14.760 | And so if you love everybody, then that sort of implies a certain level of solidarity that
02:29:18.100 | may not exist.
02:29:19.100 | So maybe the idea is for me, start with respect and then maybe as people respect each other
02:29:23.900 | more, then love is an outgrowth of that as opposed to starting with love and then hoping
02:29:26.980 | that respect develops.
02:29:27.980 | - Yeah, there's a danger that that word becomes empty and instead is used for a dogmatic kind
02:29:32.940 | of utopianism versus actual-
02:29:36.340 | - I mean, this is the way that, for example, religious theocracies very often work.
02:29:40.420 | We love you so much, we have to convert you.
02:29:43.020 | So let's start with respect.
02:29:44.740 | - What I would love to see after our conversation today is a Ben Shapiro that continues
02:29:50.700 | the growth on Twitter of being even more respectful than you've already been and maybe one day
02:29:57.340 | converting that into love on Twitter.
02:29:59.860 | That would, if I could see that in this world, that would make me die a happy man.
02:30:04.500 | - Wow, that's-
02:30:05.500 | - A little bit more-
02:30:06.500 | - If I can make that happen for you-
02:30:07.500 | - Love in the world for me, as a gift for me.
02:30:09.900 | - I'll try to make that happen.
02:30:10.900 | I do have one question.
02:30:11.900 | I'm gonna need you to tell me, like, which jokes are okay?
02:30:15.100 | Are jokes still okay?
02:30:16.560 | - So yeah, can I just run your Twitter from now on?
02:30:19.460 | You just send it to me.
02:30:20.460 | I will pre-screen the jokes for you and you can tell me if this is a loving joke or if this
02:30:24.940 | is a hate-filled-
02:30:25.940 | People will be very surprised by all the heart emojis that start popping up on your Twitter.
02:30:31.500 | Ben, thank you so much for being bold and fearless and exploring ideas.
02:30:36.020 | And your Twitter aside, thank you for being good faith in all the arguments and
02:30:40.620 | all the conversations you're having with people.
02:30:42.260 | It's a huge honor.
02:30:43.260 | Thank you for talking to me.
02:30:44.260 | - Thanks for having me.
02:30:45.260 | I really appreciate it.
02:30:46.260 | Thanks for listening to this conversation with Ben Shapiro.
02:30:49.380 | To support this podcast, please check out our sponsors in the description.
02:30:53.260 | And now let me leave you with some words from Ben Shapiro himself.
02:30:56.380 | "Freedom of speech and thought matters, especially when it is speech and thought with which we
02:31:02.660 | disagree.
02:31:03.660 | The moment the majority decides to destroy people for engaging in thought it dislikes,
02:31:10.100 | thought crime becomes a reality."
02:31:13.580 | Thank you for listening.
02:31:14.580 | I hope to see you next time.