
Michael Malice: Anarchy, Democracy, Libertarianism, Love, and Trolling | Lex Fridman Podcast #128


Chapters

0:00 Introduction
4:07 Putin and the Russian soul
11:10 Love and trolling
21:40 Problem with government
27:12 Anarchism
49:16 Politics
51:08 Are most people capable of thinking deeply?
58:17 Willy Wonka and Albert Camus' view of life
64:16 Trolling
68:35 Conspiracy theories
85:52 Donald Trump and the Election
93:06 Trump Biden presidential debates
97:24 Journalism is broken
104:04 Communism
110:11 Presidential candidates
120:42 Libertarian party
131:04 Objectivism
140:26 Trolling
151:30 The New Right
159:38 Cancel culture
189:25 Book recommendations
194:17 Fear of mortality
197:17 Meaning of life


00:00:00.000 | The following is a conversation with Michael Malice,
00:00:02.740 | an anarchist, political thinker, author,
00:00:05.840 | and a proud, part-time, Andy Kaufman-like troll,
00:00:10.160 | in the best sense of that word,
00:00:11.940 | on both Twitter and in real life.
00:00:15.000 | He's a host of a great podcast called
00:00:17.600 | You're Welcome, spelled Y-O-U-R.
00:00:20.940 | I think that gives a sense of his sense of humor.
00:00:23.560 | He is the author of Dear Reader,
00:00:25.960 | the unauthorized autobiography of Kim Jong Il,
00:00:29.600 | and The New Right,
00:00:31.060 | a journey to the fringe of American politics.
00:00:33.880 | This latter book, when I read it,
00:00:36.040 | or rather listened to it last year,
00:00:38.520 | helped me start learning about
00:00:40.200 | the various disparate movements
00:00:41.720 | that I was undereducated about,
00:00:43.600 | from the internet trolls, to Alex Jones,
00:00:46.920 | to white nationalists, and to techno-anarchists.
00:00:51.440 | The book is funny and brilliant, and so is Michael.
00:00:54.720 | Unfortunately, because of a self-imposed deadline,
00:00:58.160 | I actually pulled an all-nighter before this conversation.
00:01:01.320 | So I was not exactly all there mentally,
00:01:03.920 | even more so than usual,
00:01:05.680 | which is tough because Michael is
00:01:07.460 | really quick-witted and brilliant.
00:01:09.920 | But he was kind, patient,
00:01:11.720 | and understanding in this conversation,
00:01:13.680 | and I hope you will be as well.
00:01:16.120 | Today, I'm trying something a little new,
00:01:18.120 | looking to establish a regular structure
00:01:20.200 | for these intros.
00:01:22.000 | First, doing the guest intro, like I just did.
00:01:25.040 | Second, quick one or two sentence mention
00:01:27.640 | of each sponsor.
00:01:29.000 | Third, my side comments related to the episode.
00:01:32.320 | And finally, fourth, full ad reads
00:01:35.040 | on the audio side of things,
00:01:36.800 | and on YouTube, going straight to the conversation.
00:01:39.580 | So not doing the full ad reads.
00:01:41.560 | And as always, no ads in the middle,
00:01:43.600 | because to me, they get in the way of the conversation.
00:01:46.640 | So, quick mention of the sponsors.
00:01:48.560 | First, SEMrush, the most advanced SEO optimization tool
00:01:52.920 | I've ever come across.
00:01:54.360 | I don't like looking at numbers,
00:01:56.100 | but someone probably should.
00:01:57.560 | It helps you make good decisions.
00:02:00.040 | Second sponsor is DoorDash,
00:02:01.920 | food delivery service that I've used for many years
00:02:04.720 | to fuel long, uninterrupted sessions of deep work
00:02:07.680 | at Google, MIT, and I still use it a lot today.
00:02:11.880 | Third sponsor is Masterclass,
00:02:14.000 | online courses from the best people in the world
00:02:16.760 | on each of the topics covered,
00:02:18.580 | from rockets, to game design, to poker,
00:02:21.080 | to writing, and to guitar, with Carlos Santana.
00:02:25.360 | Please check out these sponsors in the description
00:02:27.300 | to get a discount and to support this podcast.
00:02:30.800 | As a side note, let me say that I hope to have
00:02:33.040 | some conversations with political thinkers,
00:02:35.700 | including liberals and conservatives,
00:02:38.180 | anarchists, libertarians, objectivists,
00:02:41.120 | and everything in between.
00:02:42.920 | I'm as allergic to Trump bashing and Trump worship
00:02:46.480 | as you probably are.
00:02:48.200 | I have none of that in me.
00:02:49.820 | I really work hard to be open-minded
00:02:52.080 | and let my curiosity drive the conversation.
00:02:54.600 | I do plead with you to be patient on two counts.
00:02:57.780 | First, I have an intense, busy life
00:03:00.580 | outside of these podcasts.
00:03:02.400 | Like, it's 4 a.m. right now as I'm recording this.
00:03:05.540 | So sometimes, life affects these conversations.
00:03:08.400 | Like in this case, I pulled an all-nighter beforehand.
00:03:11.280 | So please be patient with me if I say something
00:03:13.260 | ineloquent, confusing, dumb, or just plain wrong.
00:03:17.660 | I'll try to correct myself on social media
00:03:19.680 | or in future conversations as much as I can.
00:03:22.380 | I really am always learning and working hard
00:03:24.540 | to improve.
00:03:26.040 | Second, if I or the guest says something about,
00:03:29.900 | for example, our current president, Donald Trump,
00:03:33.060 | that's over-the-top negative or over-the-top positive,
00:03:36.900 | please don't let your brain go into the partisan mode.
00:03:39.740 | Try to hear our words in an open-minded, nuanced way.
00:03:43.220 | And if we say stuff from a place of emotion,
00:03:45.820 | please give us a pass.
00:03:47.420 | Nuanced conversation can only happen
00:03:49.740 | if we're patient with each other.
00:03:51.380 | If you enjoy this thing, subscribe on YouTube,
00:03:54.220 | review it with five stars on Apple Podcasts,
00:03:56.580 | follow on Spotify, support on Patreon,
00:03:58.940 | or connect with me on Twitter, @lexfridman.
00:04:02.460 | And now, here's my conversation with Michael Malice.
00:04:06.300 | - There was a Simpsons episode where he starts mixing
00:04:10.100 | sleeping pills with pet pills, and he's driving his truck,
00:04:13.580 | and I'm like, "I wanna see what happens
00:04:14.700 | "if he makes Red Bull and Nitro cold brew."
00:04:16.900 | (laughs)
00:04:19.260 | - It's a lineup of drugs.
00:04:20.720 | - This is gonna be so fun.
00:04:22.860 | - This is, yeah.
00:04:24.620 | Let's start with love.
00:04:25.820 | - Yes.
00:04:26.660 | (laughs)
00:04:27.480 | (speaking in foreign language)
00:04:28.900 | - Yeah, so one thing we'll eventually somehow talk about,
00:04:31.700 | it'll be a theme throughout, is that you're also Russian.
00:04:34.300 | - Yes.
00:04:35.140 | - A little bit less than me, but--
00:04:36.940 | - How, at my, 'cause I'm from Ukraine.
00:04:39.660 | - Oh, you're from Ukraine?
00:04:41.500 | - From Lvov.
00:04:41.500 | - Okay, wow.
00:04:42.380 | No, because you came here a little bit
00:04:43.980 | when you were younger.
00:04:44.900 | - Yeah.
00:04:45.740 | - I came here when I was 13,
00:04:47.740 | so I saturated a little bit of the Russian soul.
00:04:49.980 | I marinated in the Russian soul a little deeper.
00:04:52.540 | I haven't told anyone this,
00:04:53.660 | but I'll be glad to tell you, Davidish.
00:04:55.740 | I haven't been back since I was two.
00:04:59.180 | And next summer, it looks like me and my buddy,
00:05:01.460 | Chris Williamson, who's also a podcaster,
00:05:03.260 | he's British, Modern Wisdom.
00:05:04.460 | He looks like Apollo.
00:05:06.340 | Looks like we got a videographer.
00:05:07.620 | - Which Apollo?
00:05:08.460 | Apollo Creed? - The God.
00:05:09.280 | He looks like the God, Apollo.
00:05:10.260 | Yeah, he's like a model.
00:05:11.100 | - I thought you were talking about Rocky.
00:05:12.700 | (laughs)
00:05:13.780 | - So we're gonna go for the first time
00:05:16.660 | to see where I came from.
00:05:18.500 | - Which is in Ukraine. - And film it.
00:05:19.940 | We're gonna go to Lvov,
00:05:20.820 | and either St. Petersburg or Moscow,
00:05:22.340 | probably St. Petersburg, or both.
00:05:24.020 | It's gonna be intense.
00:05:25.780 | It's gonna be a lot of panic attacks, I feel.
00:05:28.480 | - And your Russian is okay?
00:05:29.860 | (speaking in foreign language)
00:05:34.460 | - No, you can't talk Russian in Ukraine,
00:05:35.900 | or they get offended.
00:05:37.460 | - Yeah, but then you also wanna go to Russia.
00:05:39.940 | - Yeah.
00:05:41.340 | - I don't know.
00:05:42.940 | For me, there's several people in Russia
00:05:44.860 | I wanna interview on a podcast.
00:05:47.060 | So one of them is Grigori Perelman,
00:05:49.700 | who's a mathematician,
00:05:51.260 | and the other person is Putin.
00:05:53.740 | - You know what my favorite Putin story is?
00:05:55.700 | Do you know this? - No.
00:05:56.660 | - When he had Merkel with him, do you know this story?
00:05:58.580 | - No.
00:05:59.400 | - Merkel's scared of dogs, like petrified of dogs.
00:06:02.180 | So he brings in his like black lab.
00:06:06.220 | It's a Labrador, it's like the sweetest animal,
00:06:08.260 | and it's all over her, and there's pictures,
00:06:09.780 | and she's sitting like this, and she's terrified,
00:06:12.060 | and he's like, "What's wrong, Angela?"
00:06:13.740 | It's just completely trolling her.
00:06:15.500 | - Yeah, he's aware of the sort of,
00:06:19.460 | the narrative around him.
00:06:20.860 | - Yeah.
00:06:21.700 | - And then he plays with it. - Yes.
00:06:22.940 | - He enjoys it.
00:06:23.780 | - It's a very Russian thing.
00:06:24.780 | My friend wanted to do a film about me.
00:06:26.300 | He goes, "I realized you guys aren't like us at all.
00:06:28.580 | "You just like look at us."
00:06:30.100 | And then I started telling him stories about the upbringing,
00:06:32.580 | and he's like, "Oh my God."
00:06:33.580 | And as I'm telling them, I'm like,
00:06:34.520 | "Wow, this stuff is really crazy, like how we are wired."
00:06:38.460 | - Who's the we?
00:06:39.300 | Your friend is-- - The Russian.
00:06:40.260 | The friend is American.
00:06:41.100 | I'm saying the way Russians are brought up,
00:06:42.940 | and the way maybe, I don't think it was just my family.
00:06:45.180 | I bet you had similar things.
00:06:46.620 | Here's an example.
00:06:47.660 | I had a buddy staying with me.
00:06:50.420 | He had a problem with his roommate,
00:06:51.380 | so he crashed at my place.
00:06:52.300 | Fine.
00:06:53.140 | I went to the gym, and I come back, and he goes,
00:06:57.780 | "Oh, there was," and my apartment building
00:06:59.300 | is four-floor apartment, so it's not like a huge thing.
00:07:02.020 | He goes, "Oh, there was someone knocking at your door."
00:07:04.460 | So I told him, blah, blah, blah.
00:07:06.580 | And for me, and I wonder if you're the same way,
00:07:10.940 | if I'm at someone's house that's not my own,
00:07:12.900 | and someone knocks on the door,
00:07:14.660 | I wouldn't even think to answer it.
00:07:16.620 | Like if I had an apple here, maybe I'd eat it,
00:07:18.900 | I'd cut it, whatever.
00:07:20.360 | I'm not gonna, it just doesn't enter my head
00:07:21.820 | to smash into my face.
00:07:23.140 | The thought of answering the door if it's not my house,
00:07:26.380 | it would never enter my head.
00:07:27.420 | Would it enter your head?
00:07:28.620 | - No, but why?
00:07:30.020 | - But he's an American, so someone's at the door.
00:07:31.740 | He goes and opens it, even though it's not his house.
00:07:33.620 | I would never do that.
00:07:34.700 | I would never think to do that.
00:07:36.820 | - That is so strange that you pick some very obscure thing
00:07:39.880 | to delineate Americans and Russians.
00:07:42.060 | - I don't think that's obscure,
00:07:43.020 | 'cause I think it speaks to how we perceive strangers.
00:07:46.540 | With Americans, everyone's friendly,
00:07:48.260 | and with us, it's like, no, no, you have that moat.
00:07:52.000 | And I think that percolates into many different aspects
00:07:55.200 | of how we relate to people, and I had to undo a lot of that.
00:07:58.040 | - That's true.
00:07:58.860 | You're right, there's the relationship I formed there
00:08:00.880 | where in Russia, we're very deep and close,
00:08:04.240 | and then there's the strangers, the other,
00:08:06.320 | that you don't trust by default.
00:08:08.920 | It takes a long time to go over the moat of trust.
00:08:11.280 | - For a long time, until recently,
00:08:13.880 | whenever I said anything to anyone,
00:08:15.440 | my brain ran a scan that said,
00:08:18.520 | "If this person turns on you, can they use this against you?"
00:08:22.680 | And I would do this with everything I said with strangers.
00:08:24.800 | And after a while, it's like, you know what?
00:08:26.340 | Maybe they will, but I'm strong enough to take it,
00:08:28.240 | but this is not how Americans think.
00:08:30.600 | Or here's another one.
00:08:31.440 | Let me ask you this.
00:08:32.260 | Sorry, I'm taking over the interview.
00:08:33.100 | People ask about advice for work.
00:08:35.300 | There was this party I went to,
00:08:37.940 | and basically everyone had their own problems,
00:08:39.560 | and everyone else gave their advice, right?
00:08:41.300 | And someone's having a problem with the coworker,
00:08:43.400 | and the advice these two poor Americans gave them is,
00:08:46.760 | "Oh, sit down and have a talk with them."
00:08:48.840 | And to me, this is like the last resort.
00:08:52.960 | Like, first you have to see what you can
00:08:54.560 | without showing your hand, showing your vulnerability,
00:08:57.040 | only when everything hasn't worked out,
00:08:58.740 | and you're like, "All right, let me sit down with you
00:09:00.200 | and try to have it out with you," probably.
00:09:02.160 | But for them, the first thing is like,
00:09:03.520 | sit down and be like,
00:09:04.640 | "Oh, you're causing me problems," and blah, blah, blah.
00:09:07.000 | So I perceive that right away as a threat,
00:09:09.400 | that this person sees an antagonism between us,
00:09:11.520 | and also as a weakness that I'm getting to them.
00:09:13.720 | So my reaction isn't, "How do I make it better?"
00:09:17.280 | My reaction is to reinforce my position
00:09:20.240 | and see what I can to marginalize them, usually.
00:09:23.680 | I haven't worked in a corporate setting in a long time.
00:09:25.720 | But it's not, I don't approach it the way an American would,
00:09:27.800 | like, "I'm glad you came and talked to me."
00:09:29.080 | Now I probably would, 'cause it's gonna be a friend.
00:09:31.520 | - So you attribute that to the Russian upbringing,
00:09:34.640 | as opposed to you have deep psychological issues.
00:09:38.040 | - I think they're synonymous, Daniel.
00:09:39.600 | (both laughing)
00:09:40.960 | - Would you think differently, maybe a few years ago?
00:09:44.440 | - I don't know.
00:09:47.480 | I think you lost me at the,
00:09:50.440 | 'cause you kinda said that,
00:09:52.320 | you're kinda implying you have a deep distrust of the world.
00:09:55.640 | Like, the world does--
00:09:57.320 | - I think the default setting would be distrust, yeah.
00:10:00.020 | - But I would put it differently,
00:10:05.560 | is I almost ignore the rest of the world.
00:10:09.400 | I don't even acknowledge it.
00:10:11.360 | I just savor, I save my love and trust
00:10:16.360 | for the small circle of people.
00:10:18.640 | - I agree, but when that person is being confrontational,
00:10:21.240 | or as they perceive it, as being open,
00:10:24.040 | now there's a situation.
00:10:25.520 | How would you handle that?
00:10:27.320 | - Like a cold wind blows?
00:10:30.320 | You just kinda like--
00:10:31.320 | - Yeah, but it's not like this is an opportunity
00:10:33.400 | for us to work out our differences.
00:10:35.160 | It's a cold wind.
00:10:36.400 | It's not a hug, that's my point.
00:10:38.560 | Americans think it's a hug.
00:10:40.080 | You're so suspicious.
00:10:43.040 | What it really is, is a cold wind.
00:10:45.200 | I'm so inhumane.
00:10:48.040 | It's not something to be scared of.
00:10:50.360 | It's a cold wind, it's not a good person.
00:10:52.560 | - But it's not, this is great.
00:10:54.840 | But it's not a source of,
00:10:56.320 | I'm not suspicious of,
00:10:59.160 | I'm not anxious, I would say,
00:11:01.520 | or living in fear of the rest of the world.
00:11:04.520 | - Oh, I agree, but you're not receptive to that person.
00:11:06.920 | - Right. - That's all I'm saying.
00:11:08.200 | And they are.
00:11:09.440 | - Got it.
00:11:10.280 | So speaking of which, let's talk about love.
00:11:12.720 | - Yes.
00:11:13.560 | (Lex laughing)
00:11:14.400 | - Which requires to be receptive of the world.
00:11:17.480 | - Yes. - Of strangers.
00:11:18.680 | - Agreed.
00:11:19.800 | - How do we put more love out there in the world,
00:11:23.040 | especially on the internet?
00:11:25.360 | - One mechanism I have found to increase love,
00:11:30.360 | and that's a word that has many meanings
00:11:31.920 | and is used in a very intense sense
00:11:33.920 | and is used in a very loose sense.
00:11:35.400 | - Can you try to define love?
00:11:36.840 | - Sure, love is a strong sense of attraction
00:11:41.840 | toward another person, entity, or place
00:11:46.920 | that causes one to tend to react
00:11:51.340 | in a disproportionately positive manner.
00:11:53.540 | That's off the top of my head.
00:11:54.440 | - Disproportionately?
00:11:55.480 | - Yes.
00:11:56.640 | So for example, if you--
00:11:57.680 | - Why not proportionally?
00:11:58.740 | - Because if someone's about to,
00:12:01.800 | who you love is about to get harmed,
00:12:04.100 | you're moving heaven and earth
00:12:05.640 | to make sure, or like a book you love.
00:12:08.440 | You know, like I love this book.
00:12:09.440 | Like you're going through the fire to try to save it.
00:12:11.640 | Whereas if it's a book you really like,
00:12:13.120 | it's like, oh, I'll get another one.
00:12:15.280 | I don't, you know, and a book's kind of a loose example,
00:12:18.120 | but--
00:12:18.940 | - So you're going with the love that's like,
00:12:20.160 | you're saving for just a few people,
00:12:22.240 | almost like romantic love, like love for a close family.
00:12:25.040 | - But it's also--
00:12:25.880 | - But what about just love to even the broader,
00:12:28.800 | like the kind of love you can put out
00:12:30.920 | to people on the internet, which is like just kindness?
00:12:33.840 | - Sure, I would say in that case,
00:12:35.280 | it's important to make them feel seen and validated.
00:12:40.280 | And I try to do this when people who I have come to know
00:12:44.840 | on the internet, and there's a lot,
00:12:46.600 | I try to do that as much as possible
00:12:48.760 | because I don't think it's valid how on social media,
00:12:52.720 | and I do this a lot myself, but not towards everyone,
00:12:55.160 | it's just there to be aggressive and antagonistic.
00:12:58.320 | You should be antagonistic towards bad people,
00:13:00.560 | and that's fine.
00:13:01.640 | But at the same time, there's lots of great people,
00:13:04.440 | and especially with my audience,
00:13:06.120 | and I would bet disproportionately with yours,
00:13:08.560 | there's lots of people who are,
00:13:10.320 | because of their psychology and intelligence,
00:13:13.320 | are going to be much more isolated socially than they should.
00:13:17.240 | And if I, and I've heard from many of them,
00:13:19.360 | and if I'm the person who makes them feel,
00:13:21.200 | oh, I'm not crazy, it's everyone else around me
00:13:24.160 | who is just basic, the fact that I can be that person,
00:13:27.680 | which I didn't have at their age,
00:13:29.520 | to me is incredibly reaffirming.
00:13:32.460 | - You mean that source of love?
00:13:34.100 | - But I mean love in the sense of like,
00:13:36.180 | you know, you care about this person,
00:13:37.700 | and you want good things for them,
00:13:38.780 | not in a kind of romantic way,
00:13:40.180 | but I mean, you're using a broad sense now.
00:13:42.500 | - Yeah, but you're also a person who kind of,
00:13:44.900 | I mean, attacks the power structures in the world
00:13:51.140 | by mocking them effectively.
00:13:54.820 | And love, I would say, requires you to be non-witty
00:13:59.820 | and simple and fragile,
00:14:04.680 | which I see it as like the opposite of what trolls do.
00:14:08.240 | - Trolls are, if there is someone coming after what I love,
00:14:13.240 | there's two mechanisms, right, at least two.
00:14:18.760 | I go up and I'm fighting them,
00:14:20.760 | and in which case, you bring in,
00:14:22.600 | if you are getting hurt in a knife fight,
00:14:24.680 | even if you win the knife fight,
00:14:26.420 | or if you disarm them,
00:14:29.220 | and you preclude the possibility of a fight,
00:14:31.460 | and you drive them off or render them powerless,
00:14:34.120 | you keep your person intact as yourself,
00:14:37.860 | and you also protect your values.
00:14:39.680 | - So how do you render them powerless?
00:14:41.680 | - As you just said, by mocking them.
00:14:43.240 | One of the most effective mechanisms for those in power,
00:14:46.340 | we're much closer to Brave New World than 1984.
00:14:49.000 | The people who are dominant and in power
00:14:51.100 | aren't there because of the threat of,
00:14:53.360 | you know, the gulag or prison.
00:14:54.740 | They're there because of social pressures.
00:14:56.440 | Look at the masks.
00:14:58.000 | I was on the subway not that long ago in New York City.
00:15:01.600 | No one cared who I was until I took off the mask.
00:15:04.080 | I put this on my Instagram,
00:15:06.120 | I've told this story before.
00:15:08.320 | There was an Asian dude in his early 30s.
00:15:10.120 | He was like in Western clothes.
00:15:11.420 | It's not like he had a rickshaw or something.
00:15:13.460 | An older man in his 50s stood up over him on the subway,
00:15:16.880 | screamed at him, said, "Go back where you came from.
00:15:20.420 | "You're disgusting, I'm gonna get sick.
00:15:22.200 | "If you think this guy's a vector of disease,
00:15:23.880 | "which is your prerogative, why are you coming close to him?
00:15:25.760 | "Why are you getting in his face?"
00:15:27.420 | And what-- - 'Cause that was the race,
00:15:29.000 | sorry, so it was because he was Asian?
00:15:32.140 | - It was both.
00:15:32.980 | It was the not having a mask gave him the permission
00:15:37.600 | to act like a despicable, aggressive person toward him.
00:15:41.200 | And the point being, a lot of these mechanisms
00:15:45.020 | for social control are outsourced to low-quality people
00:15:49.280 | because this is their one chance
00:15:50.640 | to assert dominance and status over somebody else.
00:15:53.280 | So the best way to diffuse that
00:15:54.980 | isn't with weaponry or fighting.
00:15:56.860 | It's through mockery 'cause all of a sudden,
00:15:59.000 | their claims to authority are effectively destroyed.
00:16:01.800 | - So let me push back on that.
00:16:03.000 | What about fighting that with love,
00:16:06.960 | with patience and kindness towards them?
00:16:10.920 | - I don't think kindness is,
00:16:12.840 | I think that would be a mismatch and inappropriate.
00:16:16.080 | There's Superman, there's Batman, okay?
00:16:18.160 | And Superman's job is to help the good people.
00:16:20.080 | Batman's job is to hurt the bad people.
00:16:22.120 | And I will always be on the Batman side
00:16:25.200 | than the Superman side.
00:16:26.360 | - Both wear silly tight costumes.
00:16:29.880 | One has pointy ears, both are ridiculous.
00:16:32.560 | So it's-- - And one's a billionaire
00:16:34.160 | who gets, he's swimming in trim.
00:16:36.240 | - Which one is a billionaire? - Batman.
00:16:37.920 | - Okay, I'm undereducated on the superhero movies.
00:16:42.920 | I apologize.
00:16:44.440 | Okay, but you're just saying you,
00:16:47.280 | your predisposition is to be on the Batman side
00:16:50.560 | is to fighting the bad guys.
00:16:54.680 | - Yeah, and it's what I'm good at.
00:16:57.320 | - That's what you're good at.
00:16:58.560 | But just to play devil's advocate,
00:17:02.520 | or actually in this case, I am the devil
00:17:04.600 | 'cause it's what I usually do.
00:17:06.360 | - Well, I'm the devil, you're the angel's advocate.
00:17:08.440 | - Exactly, I'm to be the angel's advocate, yeah.
00:17:12.280 | It's like, I feel like mockery
00:17:15.880 | is a path towards escalation of conflict.
00:17:20.360 | - Yes, in many ways, yes.
00:17:22.080 | - So you're not, I mean,
00:17:25.000 | it's kind of like guerrilla warfare.
00:17:28.560 | I mean, you're not going to win.
00:17:31.480 | - I am winning, we're all winning.
00:17:32.800 | We're winning on a daily.
00:17:33.920 | This is my next book, we're winning.
00:17:35.720 | We've won before, I'm not joking.
00:17:37.480 | - The topic of the next book-- - Yes, is the white pill.
00:17:40.640 | - The white pill. - Is that we're gonna,
00:17:42.200 | we are winning, the most horrible people
00:17:45.520 | are being rendered into laughing stocks
00:17:47.480 | on a daily basis on social media.
00:17:49.120 | This is a glorious thing. - I so disagree with you.
00:17:52.160 | I disagree with you because there's side effects
00:17:54.160 | that are very destructive.
00:17:56.000 | It feels like you're winning,
00:17:57.560 | but we're completely destroying the possibility
00:18:00.400 | of having a cohesive society.
00:18:04.680 | - That's called oncology.
00:18:06.760 | - What's that mean?
00:18:07.880 | - Curing cancer.
00:18:09.520 | Your concept of a cohesive society
00:18:11.800 | is in fact a society based on oppression
00:18:15.760 | and not allowing individuals to live their personal freedom.
00:18:20.160 | - Oh, so you're a utopian view of the world.
00:18:22.560 | - You're the utopian, you're saying cohesive society.
00:18:24.840 | I'm saying I don't need that.
00:18:25.880 | I'm saying there's gonna be conflict.
00:18:27.680 | Right, there's gonna be conflict.
00:18:29.160 | You and I are disagreeing right now, that's not cohesive.
00:18:31.400 | Doesn't mean we like each other less,
00:18:32.680 | doesn't mean we respect each other less.
00:18:34.560 | Cohesive, it's just a euphemism
00:18:37.080 | for everyone submitting to what I want.
00:18:40.000 | - No, I mean, cohesive could be that.
00:18:43.840 | It could be like enforced with violence,
00:18:48.200 | all that kind of stuff,
00:18:49.040 | sort of the libertarian view of the world.
00:18:53.240 | But it could just be being respectful
00:18:57.280 | and kind of each other,
00:18:59.240 | and kind towards each other,
00:19:00.440 | and loving towards each other.
00:19:02.720 | I mean, that's what I mean by cohesive.
00:19:04.660 | So when people say free, it's funny.
00:19:07.640 | Like freedom is a funny thing,
00:19:10.040 | because freedom can be painful to a lot of people.
00:19:14.280 | It all matters how you define it,
00:19:18.040 | how you implement it, how it actually looks like.
00:19:20.640 | And I'm just saying it feels like
00:19:23.640 | the mockery of the powerful
00:19:28.840 | leads to further and further divisions.
00:19:31.520 | It's like it's turning life into a game
00:19:36.400 | to where it's always,
00:19:38.600 | you're creating these different little tribes and groups,
00:19:43.080 | and you're constantly fighting the groups
00:19:48.080 | that become a little bit more powerful
00:19:50.320 | by undercutting them through guerrilla warfare
00:19:52.400 | kind of thing.
00:19:53.400 | And that's what the internet becomes,
00:19:54.920 | is everyone's just mocking each other.
00:19:56.880 | And then certain groups become more and more powerful,
00:19:59.280 | and then they start fighting each other.
00:20:01.960 | They form groups of ideologies,
00:20:05.600 | and they start fighting each other in the internet.
00:20:07.560 | Where the result is,
00:20:09.640 | it doesn't feel like the common humanity is highlighted.
00:20:14.560 | It doesn't feel like that's a path of progress.
00:20:19.360 | Now, when I say cohesive,
00:20:21.400 | I don't mean everybody has to be enforcing equality,
00:20:25.840 | all those kinds of ideas.
00:20:27.240 | I just mean not being so divisive.
00:20:30.620 | So it's going back to the original question of
00:20:34.440 | how do we put more love out in the world on the internet?
00:20:37.440 | - I want divisiveness.
00:20:39.800 | - Oh, so you think divisiveness is-
00:20:41.160 | - It's the goal. - That's very interesting.
00:20:42.160 | - It's the goal.
00:20:43.000 | So you started this conversation
00:20:44.920 | where you're talking about you have love
00:20:46.120 | for that small group.
00:20:47.440 | I think we both would agree
00:20:49.440 | to have a bigger group would be better,
00:20:50.720 | especially if that love comes from a sincere place.
00:20:53.560 | I think our country,
00:20:56.400 | I wrote an article about this four years ago,
00:20:57.680 | that it's time to disunite the states and to secede.
00:21:00.900 | This country has been held together
00:21:02.300 | with at least two separate cultures,
00:21:04.180 | with duct tape and string for over 20 years.
00:21:06.800 | There's an enormous amount of contempt
00:21:08.800 | from one group toward another.
00:21:10.800 | This contempt comes from sincere place.
00:21:12.880 | They do not share each other's values.
00:21:14.900 | There's absolutely no reason,
00:21:16.680 | just like any unhealthy relationship,
00:21:18.360 | where you can't say, you know what?
00:21:19.640 | It's not working out.
00:21:20.960 | I wanna go my own way and live my happiness.
00:21:24.400 | And I genuinely want you to go your way,
00:21:26.440 | live your happiness.
00:21:27.360 | If I'm wrong, prove me wrong.
00:21:28.820 | I'll learn from you and take lessons and vice versa.
00:21:31.860 | But the fact that we all have to be
00:21:33.320 | in the same house together is not coherent.
00:21:35.680 | And that's not love.
00:21:36.520 | That is the path towards friction and tension and conflict.
00:21:40.720 | - Do you think there is concrete groups?
00:21:43.960 | Like, is it as simple as the two groups of blue and red?
00:21:47.040 | - No, it's also very fluid,
00:21:49.920 | because you and I are allied as Jewish people,
00:21:53.560 | as Russians, as males, as podcasters.
00:21:57.040 | You're an academic, I'm not.
00:21:58.560 | So we're different, but we each are a Venn diagram,
00:22:02.800 | even within ourselves.
00:22:04.440 | And I can talk to you about politics,
00:22:07.000 | and then we can talk about Russia stuff,
00:22:08.640 | and then you could talk about your work,
00:22:10.840 | which I don't know anything about.
00:22:12.120 | So that would be where you're way up here
00:22:13.280 | and I'm way down here.
00:22:14.380 | So there's lots, every relationship
00:22:16.280 | just between individuals, is very dynamic.
00:22:18.920 | - So how do we secede?
00:22:20.660 | Like, how do we form individual states
00:22:22.640 | where there's a little bit more cohesion?
00:22:25.920 | - Sure, and voluntary cohesion.
00:22:27.720 | So the first step is to eliminate
00:22:31.620 | the concept of political authority as legitimate,
00:22:35.400 | and to denigrate and humiliate those
00:22:38.240 | who would put themselves in a position
00:22:40.840 | in which they are there to tell you
00:22:42.600 | how to live your life from any semblance of validity.
00:22:46.600 | And that's starting to happen.
00:22:48.440 | If you look at what they had with the lockdowns,
00:22:50.680 | Cuomo and de Blasio, New York,
00:22:52.320 | I was tired a couple of weeks ago,
00:22:56.000 | and I said to my friend,
00:22:56.920 | "Oh, just click, maybe I have COVID."
00:22:58.440 | And he goes, "It's not possible."
00:22:59.480 | I go, "What do you mean?"
00:23:00.640 | And he goes, "We haven't had any deaths in like two months.
00:23:04.760 | And there's only like 100 cases a day for like two months."
00:23:08.080 | And I go, "You're exaggerating."
00:23:09.320 | 'Cause everything was still closed.
00:23:11.040 | And I looked at the numbers and he wasn't exaggerating.
00:23:13.900 | And there's no greater American dream to me
00:23:16.880 | than an immigrant family comes to the States,
00:23:19.920 | forms their own little business.
00:23:21.840 | Maybe mom's a good cook,
00:23:22.760 | has a restaurant, dry cleaner, fruit stand.
00:23:25.280 | And those people aren't gonna have a lot of money.
00:23:28.000 | Those are the first ones who lost their companies
00:23:30.320 | because of these lockdowns.
00:23:32.320 | Cuomo, who's the governor of New York,
00:23:35.120 | opened up the gyms.
00:23:36.400 | He said, "You're clear to open up."
00:23:38.040 | De Blasio said, "And we don't have enough inspectors.
00:23:40.340 | You're gonna have to wait another couple of weeks."
00:23:43.080 | To regard that as anything other than literally criminal
00:23:45.880 | is something that I am having a harder and harder time
00:23:49.480 | wrapping my head around.
00:23:51.120 | - You said, I mean, that's something
00:23:52.600 | I'm deeply worried about as well,
00:23:54.280 | which is like thousands,
00:23:56.680 | it's actually millions of dreams being crushed
00:23:59.880 | that American dream of starting a business,
00:24:02.400 | of running a business.
00:24:03.760 | - What about all the young people
00:24:05.160 | who you and I have in our audiences
00:24:07.760 | who are socially isolated at best,
00:24:10.120 | and now they can't leave their homes?
00:24:11.960 | Isolation and ostracism are things
00:24:15.120 | that are very well studied in psychology.
00:24:17.280 | These have extreme consequences.
00:24:19.360 | I read a book called "Ostracism,"
00:24:21.200 | and this wasn't scientific,
00:24:22.680 | but basically the author was a psychiatrist,
00:24:24.440 | psychologist, whatever.
00:24:25.560 | And he had one of his colleagues, they did an experiment.
00:24:27.940 | Let's for a week, you ostracize me completely.
00:24:30.640 | We know it's an experiment, and he goes,
00:24:32.060 | even knowing it's the experiment,
00:24:34.040 | the fact that he wouldn't make eye contact with me
00:24:35.660 | and the fact that he ignored me
00:24:37.020 | had an extreme emotional impact on me.
00:24:40.900 | Knowing full well,
00:24:41.860 | this is purely for experimental purposes.
00:24:44.020 | Now you multiply that by all these, the suicide,
00:24:46.380 | the number of kids who were thinking about suicide
00:24:48.020 | was through the roof during all this.
00:24:50.340 | And my point is until these people,
00:24:53.300 | it's gonna, I would predict like 2024,
00:24:55.820 | that's where we're gonna have to start having conversations
00:24:57.860 | about what personal consequences
00:24:59.500 | have to be done for these people,
00:25:01.020 | because until then they're gonna do the same thing.
00:25:03.380 | - So you think there's going to be
00:25:04.580 | society-wide consequences of this
00:25:06.540 | that we're gonna see like ripple effects
00:25:08.840 | because of the social isolation?
00:25:10.460 | - I know, I mean, we also need to talk about consequences
00:25:13.340 | for Cuomo and de Blasio,
00:25:14.880 | because if politicians respond to incentives
00:25:17.700 | and the incentives are there for them
00:25:19.180 | to be extremely conservative,
00:25:20.820 | because if you have to choose,
00:25:21.980 | as Cuomo said in a press conference,
00:25:23.260 | between a thousand people dying
00:25:25.220 | and a thousand people losing their business,
00:25:26.500 | it's not a hard choice, and he's right.
00:25:28.380 | But at a certain point, it's like, all right,
00:25:30.900 | you're losing both, you're losing, not losing,
00:25:33.660 | you're making these decisions
00:25:35.140 | and not having consequences for it,
00:25:36.940 | and you're gonna do it again the next time,
00:25:38.780 | so we need to make sure you're a little scared.
00:25:41.220 | - Okay. - And I don't know
00:25:42.060 | what that would mean.
00:25:42.900 | - But you're laying this problem, this incompetence--
00:25:47.900 | - I don't think it's incompetence,
00:25:50.140 | I think it's very competent.
00:25:51.760 | I think their job is to be able--
00:25:52.600 | - It's malevolence. - Yes.
00:25:54.620 | - But you're laying it not at the hands of the individuals,
00:25:59.020 | but the structure of government.
00:26:01.420 | - It's both, yes.
00:26:02.360 | - How would we deal with it better
00:26:05.740 | without centralized control?
00:26:07.980 | - Well, we didn't really have centralized control,
00:26:09.460 | 'cause every country and every state
00:26:11.180 | handled it in a different mechanism.
00:26:12.820 | - But a city has centralized control.
00:26:14.940 | - Just, yeah. - Right.
00:26:16.260 | - No, that's not true.
00:26:17.100 | So Cuomo and de Blasio,
00:26:18.020 | they had a lot of disagreements over the months,
00:26:20.660 | and this was actually a source of great interest and tension.
00:26:23.520 | De Blasio wanted, at one point was talking about
00:26:26.180 | quarantining people in their homes.
00:26:27.860 | Cuomo was like, "You're crazy."
00:26:29.940 | Same thing with the schools, same thing with the gyms,
00:26:33.060 | and there were other such examples.
00:26:35.880 | But the point being, this was an emergency.
00:26:37.740 | World War I,
00:26:39.660 | and I talked about this on Tim Pool's show,
00:26:41.940 | was very dangerous, because it gave a lot of evil people
00:26:45.720 | some very useful information
00:26:47.100 | about what the country put up with
00:26:49.220 | and what they can get away with under wartime.
00:26:51.180 | And this set the model for things like the New Deal
00:26:53.700 | and the other things of that nature.
00:26:55.920 | It is undeniable, you're a scientist,
00:26:57.520 | so you understand this perfectly well,
00:27:00.240 | that this lockdown gave some very nefarious people
00:27:04.060 | some very valid data about how much people
00:27:07.400 | will put up with under pressures from the state.
00:27:11.580 | - So fundamentally, what is the problem with the state?
00:27:16.120 | - Its existence.
00:27:17.280 | (Lex laughing)
00:27:19.320 | - Okay, well, but to play Angel's advocate again,
00:27:23.740 | government is the people.
00:27:27.880 | - That's it, come on.
00:27:28.880 | Do you really think this?
00:27:31.880 | - At its best, I think it's possible to have representation.
00:27:35.560 | - Can you imagine if you have an attorney?
00:27:37.200 | You're like, "Oh, you can't have the attorney you want.
00:27:38.720 | "You're gonna have this guy who you absolutely hate,
00:27:40.180 | "who you share no values with."
00:27:42.320 | - Because he drives, I mean, leaders, political leaders,
00:27:45.280 | and political representation drive the discourse.
00:27:48.240 | Like the majority of people voted for him or whatever,
00:27:53.000 | however you define that.
00:27:55.040 | And now we get to have a discussion,
00:27:58.560 | well, was this the right choice?
00:28:01.120 | And then we get to make that choice again
00:28:02.720 | in four years and so on.
00:28:04.080 | - First of all, the fact that I have to be under the thumb
00:28:06.640 | of somebody for four years makes no sense.
00:28:09.160 | There's no other relationship that's like this,
00:28:10.680 | including a marriage.
00:28:12.200 | You can leave any other relationship at any time, number one.
00:28:15.320 | Number two is--
00:28:16.160 | - You could always impeach.
00:28:17.480 | - But they did that.
00:28:18.320 | - Part of it, I'm just saying that there's,
00:28:20.760 | yeah, the mechanisms are flawed in many ways, yeah.
00:28:24.400 | - Yeah, right.
00:28:25.480 | And so that's number one.
00:28:26.960 | Number two is it doesn't make sense
00:28:29.720 | that if I don't want someone to represent me,
00:28:32.880 | that because that person is popular,
00:28:35.240 | that they are now in a position to.
00:28:36.600 | So having representation and having citizenship
00:28:40.600 | based on geography is a pre-landline technology
00:28:44.320 | in a post-cellphone world.
00:28:45.840 | There's no reason why I have to,
00:28:48.080 | just 'cause we're physically between two oceans,
00:28:50.760 | we all have to be represented by the same people,
00:28:52.960 | whereas I can very easily have my security
00:28:55.520 | be under someone and switch it as easily
00:28:57.480 | as cell phone providers.
00:28:58.720 | - So, okay, but it doesn't have to be geographical.
00:29:01.080 | It can be ideas.
00:29:02.600 | - Sure.
00:29:03.440 | - I mean, this country represents a certain set of ideas.
00:29:05.280 | - Yes, it does.
00:29:06.120 | - It started out geographically.
00:29:07.720 | It still is geographic.
00:29:08.560 | - It was both.
00:29:09.380 | It started off as ideas as well.
00:29:10.220 | - But like, it was intricately,
00:29:12.520 | I mean, that's the way humans are.
00:29:14.680 | I mean, there was no internet.
00:29:17.160 | So it was, you were geographically in the same location,
00:29:19.800 | and you signed a bunch of documents,
00:29:21.200 | and then you kind of debated,
00:29:22.560 | and you wrote a bunch of stuff,
00:29:24.520 | and then you agreed on it.
00:29:26.520 | Okay, so--
00:29:27.360 | - You understand that no one signed these documents,
00:29:29.520 | and no one agreed to it,
00:29:30.720 | as Lysander Spooner pointed out over 150 years ago.
00:29:33.800 | The Constitution or the social contract, if anything,
00:29:36.800 | is only binding to the signatories,
00:29:38.980 | and even then, they're all long dead.
00:29:41.480 | So it's this fallacy that somehow,
00:29:43.720 | because I'm in a physical place,
00:29:45.880 | I have agreed, even though I'm screaming to you in the face
00:29:48.520 | that I don't agree, to be subordinate
00:29:50.880 | to some imaginary invisible monster
00:29:54.440 | that was created 250 years ago.
00:29:56.280 | And this idea of like, if you don't like it,
00:29:57.960 | you have to move, that's not what freedom means.
00:29:59.920 | Freedom means I do what I want, not what you want.
00:30:02.080 | So if you don't like it, you move.
00:30:04.560 | - Okay, just to put some, I don't like words and terms.
00:30:08.280 | - 111011101.
00:30:09.680 | - Yeah, exactly.
00:30:10.520 | - Is that what your language is?
00:30:11.600 | - It is.
00:30:12.440 | - I'm translating it all in real time.
00:30:14.520 | But would you call the kind of ideas
00:30:18.160 | that you're advocating for, and we're talking about, anarchy?
00:30:21.600 | - Yes, anarchism, yes.
00:30:23.360 | - Okay, so let's get into it.
00:30:24.600 | Can you try to paint the utopia
00:30:29.600 | that an anarchist worldview dreams about?
00:30:33.640 | - The only people who describe anarchism as utopia
00:30:36.280 | are its critics.
00:30:37.560 | If I told you right now,
00:30:39.640 | and I wish I could say this factually,
00:30:41.280 | that I have a cure for cancer,
00:30:42.880 | that would not make us a utopia.
00:30:45.220 | That would still probably be expensive.
00:30:47.280 | We would still have many other diseases.
00:30:49.040 | However, we would be fundamentally healthier,
00:30:51.900 | happier, and better off, all of us.
00:30:55.020 | - Than democracy.
00:30:56.100 | So, sorry, I jumped back from the cancer.
00:30:58.680 | - No, than democracy or government.
00:31:00.680 | So it's only curing one major,
00:31:03.680 | major life-threatening problem,
00:31:05.580 | but in no sense is it a utopia.
00:31:08.120 | - So what, can we try to answer this question,
00:31:12.160 | same question many times,
00:31:13.640 | which is what exactly is the problem with democracy?
00:31:17.680 | - The problem with democracy is that those who need leaders
00:31:20.000 | are not qualified to choose them.
00:31:21.640 | - Those who need leaders are not qualified to choose them.
00:31:29.120 | - That's the central problem with democracy.
00:31:30.560 | - Not all of us need leaders.
00:31:32.440 | - Right.
00:31:37.120 | What does it mean to need a leader?
00:31:38.640 | Are you saying like people who are actually
00:31:41.360 | like free thinkers don't need leaders kind of thing?
00:31:44.240 | - Sure.
00:31:45.500 | That's a good way of working.
00:31:47.560 | - Okay, so do you acknowledge that there's some value
00:31:51.920 | in authority in different subjects?
00:31:56.000 | So what that means is,
00:31:57.860 | I don't mean authority, somebody who's in control of you,
00:32:00.080 | but--
00:32:00.920 | - But you're doing the definition switch.
00:32:02.880 | - Yeah, I am, I am.
00:32:05.400 | You're right, you're right.
00:32:06.240 | It's unfair, okay, that was bad.
00:32:08.360 | - But that's what they do, that's their trick.
00:32:10.440 | And this is one of the useful things,
00:32:12.320 | by the way, let's just total sidebar.
00:32:14.180 | If people ask me for advice,
00:32:15.580 | I always tell them, if you're gonna raise your kids,
00:32:17.080 | raise them bilingual.
00:32:18.760 | Because I was trilingual by the time I was six,
00:32:20.880 | and that teaches you to think in concepts.
00:32:23.180 | Whereas if you only know one language,
00:32:24.780 | you fall for things like this,
00:32:26.240 | because using authority in the sense of a policeman
00:32:28.680 | and someone as authority in physics,
00:32:30.560 | it's the same word,
00:32:31.640 | conceptually they're extremely different.
00:32:33.600 | But if you're only thinking in one language,
00:32:35.520 | your brain is going to equate the two.
00:32:37.760 | And that's a trap
00:32:38.600 | that people who only speak one language have.
00:32:40.360 | - For sure.
00:32:41.200 | But even if you know multiple languages,
00:32:42.880 | you can still use the trick of using
00:32:44.520 | the word of your convenience--
00:32:45.760 | - Yeah, absolutely.
00:32:46.600 | - To manipulate the conversation.
00:32:48.160 | - But you weren't trying to do that,
00:32:49.440 | but you fell into that.
00:32:50.280 | - I accidentally did it, yeah, you're right.
00:32:51.840 | - We all tend to do that if you only speak one language
00:32:54.040 | and think in one language.
00:32:55.080 | - But if, I guess, let me rephrase it.
00:32:57.640 | Are you against,
00:32:59.800 | do you acknowledge the value of offloading
00:33:04.980 | your own effort about a particular thing to somebody else?
00:33:09.320 | - Absolutely.
00:33:10.160 | Like an accountant, a lawyer, a doctor,
00:33:12.920 | absolute, a chef, infinite.
00:33:15.120 | - Isn't that ultimately what a democracy is?
00:33:18.600 | - No. - Broadly defined.
00:33:19.800 | Like you're basically electing a bunch of authorities--
00:33:23.000 | - Using the word you in two senses.
00:33:25.120 | Using the word you meaning me as an individual,
00:33:26.880 | not using you as a mass.
00:33:28.240 | - Yeah, as a mass, not you as an individual.
00:33:31.560 | - Right, so I would absolutely want someone
00:33:34.280 | to provide for my security.
00:33:35.720 | I would absolutely want someone to negotiate with me
00:33:38.200 | for foreign power or something like that.
00:33:39.880 | That does not mean it has to be predicated
00:33:42.480 | on what lots of other people who I do not know,
00:33:44.680 | and if I do know them, probably would not respect,
00:33:47.320 | think about.
00:33:48.200 | It's of no moral relevance to me, nor I to them.
00:33:52.800 | - Did you think this kind of,
00:33:54.280 | there could be a bunch of humans
00:33:56.880 | that behave kind of like ants in a distributed way,
00:34:01.480 | there could be an emergent behavior in them
00:34:04.920 | that results in a stable society?
00:34:07.480 | Isn't that the hope with anarchy
00:34:09.080 | is without an overarching--
00:34:12.360 | - But ants, I mean, ants are the worst example here
00:34:15.620 | 'cause ants have a very firm authority.
00:34:17.760 | - The queen? - Yeah.
00:34:18.880 | And they're all drones, they're all clones of each other.
00:34:22.420 | - Yeah, but so if you forget the queen,
00:34:24.920 | their behavior, they're all,
00:34:26.480 | (laughs)
00:34:27.680 | well, from your perspective,
00:34:29.040 | from your human intelligent perspective,
00:34:30.840 | but from their perspective,
00:34:31.840 | they probably see each other as a bunch of individuals.
00:34:33.960 | - No, they don't.
00:34:34.800 | Ants are very big on altruism
00:34:36.800 | in the sense of self-sacrifice.
00:34:38.460 | They do not think the individual matters.
00:34:42.480 | They routinely kill themselves
00:34:44.840 | for the sake of the hive and the community.
00:34:46.880 | - But they, see, that's from the outside perspective.
00:34:49.320 | From the individual, perspective of the individual,
00:34:51.600 | they probably, they don't see it as altruism.
00:34:56.600 | - Right, but they view, and they're right,
00:35:00.420 | 'cause the ant's life is very ephemeral and cheap,
00:35:02.840 | that it's more important to continue this mass population
00:35:06.280 | that one individual ant live.
00:35:08.760 | Like bees are another, an even better example.
00:35:10.640 | The honeybee, when they sting,
00:35:12.080 | they only sting once and they die.
00:35:13.760 | And they do it gladly 'cause it's like, okay,
00:35:16.060 | this community is much more important than me,
00:35:18.200 | and they're right.
00:35:19.880 | - Yeah, okay, so fine, let's forget.
00:35:22.320 | - I'm being pedantic, but it's important, I think.
00:35:24.480 | I'm not just being pedantic for the sake of being pedantic.
00:35:26.520 | - But there is something beautiful
00:35:27.680 | that I won't argue about
00:35:29.400 | 'cause I do, there's an interesting point there
00:35:31.960 | about individualism of ants.
00:35:33.440 | I do think they're more individual.
00:35:35.200 | But let's give your view of ants that they're communists.
00:35:40.200 | Okay, let's go with the communist view of ants.
00:35:42.720 | - Okay, yeah.
00:35:43.560 | - But there's still a beautiful emergent thing,
00:35:46.920 | which is like they can function as a society
00:35:51.360 | and without, I would say, centralized control.
00:35:56.280 | - Yeah, I agree with you.
00:35:58.640 | - That's another argument.
00:35:59.520 | So is that the hope for anarchy?
00:36:01.080 | It's like you just throw a bunch of people
00:36:03.240 | that voluntarily wanna be in the same place
00:36:05.560 | under the same set of ideas,
00:36:06.920 | and they kind of, the doctors emerge,
00:36:10.440 | the police officers emerge,
00:36:12.640 | the different necessary structures
00:36:15.200 | of a functional society emerge.
00:36:17.040 | - Do you know what the most beautiful example
00:36:19.120 | of anarchism is that is just beyond beautiful
00:36:22.560 | when you stop to think about it?
00:36:23.400 | - I don't see it, Twitter.
00:36:24.280 | - I'm not being tongue-in-cheek.
00:36:25.760 | Language.
00:36:28.080 | There's infinite languages.
00:36:29.760 | Language, the things that language can be used for
00:36:33.080 | are bring tears to people's eyes, quite literally.
00:36:36.280 | It's also used for basic things.
00:36:39.000 | No one is forcing us.
00:36:40.240 | We speak two languages each, at least.
00:36:42.680 | No one's forcing us to use English.
00:36:44.480 | No one's forcing us to use this dialect of English.
00:36:47.840 | It's a way, and despite there being
00:36:50.960 | so many different languages,
00:36:52.760 | a lingua franca emerged,
00:36:54.240 | you know, the language that everyone uses, Latin.
00:36:56.640 | Even in North Korea, they refer to the fish
00:36:59.120 | and the different animals by the Latin scientific.
00:37:01.640 | No one decided this.
00:37:03.840 | Sure, there's an organization
00:37:05.280 | that sets a binomial nomenclature,
00:37:07.280 | but there's no gun to anyone's head
00:37:08.920 | referring to a sea moth as a Pegasus species.
00:37:13.480 | And when you think about how amazing language is,
00:37:16.880 | and in some other context, we say like,
00:37:18.840 | well, you need to have a world government,
00:37:21.320 | and they're deciding which is the verbs,
00:37:23.240 | and you have to have an official definition
00:37:25.120 | and an official dictionary,
00:37:26.720 | and none of that's happened.
00:37:28.120 | And I think anyone, even if they don't agree
00:37:30.560 | with my politics or my worldview,
00:37:32.840 | cannot deny that the creation of language
00:37:35.800 | is one of humanity's most miraculous, beautiful achievements.
00:37:40.800 | - Absolutely, so there you go.
00:37:42.280 | There's one system where a kind of anarchy
00:37:45.960 | can result in beauty, stability,
00:37:49.560 | like sufficient stability, and yet dynamic flexibility
00:37:53.560 | to adjust it and so on.
00:37:55.680 | And the internet helps it.
00:37:57.640 | You get something like Urban Dictionary,
00:38:00.400 | which starts creating absurd, both humor and wit.
00:38:05.080 | - But also language and syntax and jargon.
00:38:07.360 | Immediately you size people up.
00:38:09.200 | If you say vertebral, I know you're a doctor,
00:38:12.080 | 'cause that's how they pronounce it, the spinal column.
00:38:15.040 | I'm sure in your field, there's certain jargon,
00:38:17.160 | and right away you can know
00:38:18.000 | if this person's one of us or not.
00:38:19.720 | I mean, it's infinite.
00:38:20.840 | I mean, I don't need to tell you anyone.
00:38:22.520 | - And emojis, too. - Yes, there's so much there
00:38:24.720 | to study with language, it's fascinating.
00:38:26.360 | - But do you think this applies to human life?
00:38:30.280 | The meat space, the physical space?
00:38:32.160 | - Yes.
00:38:33.000 | - So that kind of beauty can emerge
00:38:35.480 | without writing stuff on paper, without laws?
00:38:40.040 | - You could have rules.
00:38:40.880 | You don't have to be laws.
00:38:42.720 | - Enforced by violence.
00:38:45.200 | What's a law?
00:38:48.200 | - A law is something that is unchosen.
00:38:50.200 | A rule is something chosen.
00:38:51.040 | If I go to my pool, I sign up to be a member of a pool,
00:38:55.040 | on the wall, it lists certain things.
00:38:56.440 | It's like, you know, certain number of people in the pool,
00:38:59.040 | no peeing in here, good luck enforcing that one,
00:39:02.040 | and so on and so forth.
00:39:03.120 | - Well, that's the problem.
00:39:04.120 | Aren't you afraid that people are gonna pee in the pool?
00:39:06.960 | - That's not as big a concern of mine as mass incarceration,
00:39:10.960 | as the fact that the police can steal more money
00:39:13.280 | than burglars can, the fact that innocent people
00:39:16.040 | can be killed with no consequences,
00:39:18.280 | the fact that war can be waged,
00:39:20.360 | and with no consequences for those who waged it,
00:39:24.680 | the fact that so many men and women
00:39:26.160 | are being murdered overseas and here,
00:39:28.400 | and the people who are guiding these are regarded as heroic.
00:39:31.160 | - So you think that in an anarchist system,
00:39:34.520 | there's a possibility of having less wars and less,
00:39:39.520 | what would you say, corruption, and less abuse of power?
00:39:44.520 | - Let's talk, yes, and let's talk about corruption,
00:39:47.120 | because, and I made this point on Rogan,
00:39:49.920 | you and I, again, this is the Russian background,
00:39:52.000 | we realize that when it comes to corruption,
00:39:55.200 | America is very naive.
00:39:56.920 | Corruption, they think, is, oh, I got my brother a job
00:39:59.000 | and he's getting money under the table.
00:40:01.040 | That's not, when we're talking about state corruption,
00:40:04.200 | things that are done in totalitarian states,
00:40:06.080 | and even to some extent in America,
00:40:07.300 | like Jeffrey Epstein, Ghislaine Maxwell,
00:40:09.240 | things that Stalin did, things that Hitler did.
00:40:11.360 | You know, when the CIA was torturing people at Gitmo,
00:40:14.200 | they had to borrow KGB manuals
00:40:16.360 | 'cause they didn't know how to torture correctly,
00:40:17.740 | 'cause they never thought of these things.
00:40:19.320 | We, it's very hard for us to get into the mindset
00:40:23.360 | of someone who's like a child predator,
00:40:25.720 | someone who, let me give you an example
00:40:27.880 | from my forthcoming book.
00:40:28.840 | There was a guy who was the head of Ukraine in the '30s,
00:40:31.880 | I forget his name.
00:40:33.220 | Now, these old Soviets, they were tough.
00:40:34.880 | I mean, they pride, Stalin means steal,
00:40:36.680 | you know, they pride themselves in their cruelty
00:40:39.160 | and how strong they were, and this was the purge.
00:40:41.840 | You know, Stalin is trying to, you know,
00:40:43.160 | killing lots of people left and right,
00:40:44.840 | and his henchman Beria had the quote,
00:40:48.320 | "I'll find me the man and I'll find you the crime."
00:40:50.120 | You know, they would accuse someone
00:40:51.520 | and they would torture him until he talked and confessed,
00:40:54.640 | and then he had to turn people in.
00:40:56.480 | And they took this guy in, like, beginning of the year,
00:40:59.200 | I think it's '36, '38, he was head of Ukraine.
00:41:01.480 | By May, he's arrested.
00:41:02.960 | And they take him to the Lublanka,
00:41:04.200 | the basement in the Red Square,
00:41:05.480 | where they're torturing people,
00:41:06.680 | and they put, they did the works on him.
00:41:08.480 | And he was a good Soviet and he stood up,
00:41:10.200 | and who knows what they did to him?
00:41:12.440 | He didn't talk.
00:41:13.600 | So they said, "Okay, one moment."
00:41:16.320 | They brought his teenage daughter in,
00:41:18.240 | raped her in front of him, he talked.
00:41:20.720 | So when we talk about corruption,
00:41:23.200 | we would never in a million years think of this.
00:41:26.960 | That's not how our minds work.
00:41:28.480 | So when you're talking about states
00:41:32.000 | and people where you don't have ease of exit,
00:41:35.800 | where you are forced to be under the auspices
00:41:38.640 | of an organization creating a monopoly,
00:41:41.520 | that leads to, in extreme cases,
00:41:44.480 | but in not as extreme cases, really nefarious outcomes.
00:41:49.080 | Whereas if you have the option to leave
00:41:53.040 | as a client or customer,
00:41:54.920 | that would have a strongly limiting effect
00:42:01.480 | on how a business behaves and what it can get away with.
00:42:01.480 | - But don't you think maybe,
00:42:03.240 | I don't know who the right example is,
00:42:04.680 | whether it's Stalin,
00:42:05.760 | I think Hitler might be the better example of,
00:42:08.880 | don't you think, or Jeffrey Epstein perhaps,
00:42:12.720 | don't you think people who are evil
00:42:15.280 | will find ways to manipulate human nature
00:42:20.280 | to attain power, no matter the system?
00:42:23.720 | - Yes.
00:42:24.720 | - And the corollary question is,
00:42:27.920 | do you think those people can get more power
00:42:31.280 | when there's a government already in place?
00:42:37.920 | - Easily, they get more power.
00:42:39.320 | It's more dangerous to have a government in place.
00:42:40.760 | First of all, sociopaths are known for their charm
00:42:43.040 | and for their warmth.
00:42:44.720 | Here's the two situations.
00:42:46.200 | In a free society, I'm a sociopath, I'm an evil person,
00:42:51.000 | I'm the head of Macy's.
00:42:52.120 | In a state society, I'm an evil person, I'm a sociopath,
00:42:55.640 | I'm the head of the US government.
00:42:57.320 | Which of these are you more concerned with?
00:42:59.240 | It's like night and day.
00:43:00.560 | So you would have far more decentralized military,
00:43:03.320 | you would have far more decentralized security forces,
00:43:08.440 | and they would be much more subject to feedback
00:43:10.840 | from the market.
00:43:11.760 | If you have an issue with Macy's
00:43:14.800 | or any store with a sweater, look at that transaction.
00:43:18.320 | If you have an issue with the state,
00:43:20.880 | hiring a lawyer costs more than a surgeon.
00:43:22.880 | To even access the mechanism for dispute
00:43:25.640 | is going to be exorbitant and price poor people
00:43:27.760 | out of the market for a conflict resolution immediately.
00:43:30.960 | So right away, you have something
00:43:32.240 | that's extremely regressive.
00:43:34.680 | And even though this is touted as some great equalizer,
00:43:37.520 | it's quite the opposite.
00:43:38.800 | - So in current society,
00:43:39.880 | there's a deep suspicion of governments and states.
00:43:43.280 | - Not really.
00:43:45.240 | - Just your example of Macy's,
00:43:46.640 | I mean, don't you think a Hitler could rise
00:43:49.400 | to be at the top of a social network
00:43:51.600 | like Twitter and Facebook?
00:43:53.200 | - Okay, let's suppose Hitler ran Twitter, okay?
00:43:56.320 | Let's take this thought experiment seriously.
00:43:58.440 | Literally, what could he do?
00:43:59.760 | So the only tweets are gonna be about
00:44:01.360 | how much the Jews suck, right?
00:44:02.760 | Okay, fine.
00:44:04.440 | Okay, all the cool people are leaving.
00:44:07.040 | There could be some compelling, like you said,
00:44:10.440 | evil people are charming.
00:44:12.920 | There could be some compelling narratives
00:44:14.640 | that could be with conspiracy theories,
00:44:17.400 | untruths that could be spread like propaganda.
00:44:22.400 | - Every criticism of anarchism is in fact a description.
00:44:25.520 | Well, the strongest criticisms of anarchism
00:44:27.400 | are in fact descriptions of the status quo.
00:44:29.360 | Your concern is under anarchism,
00:44:32.760 | propaganda would spread
00:44:34.120 | and people would be taught the wrong ideas
00:44:36.040 | unlike the status quo?
00:44:37.640 | - That's not even a criticism of anarchism.
00:44:39.920 | I'm not actually criticizing.
00:44:41.520 | It's an open question of,
00:44:43.640 | it's an open question of in which system
00:44:48.320 | will human nature thrive,
00:44:52.560 | be able to thrive more,
00:44:54.240 | and in which system would the evils
00:44:57.440 | that arise in human nature
00:44:59.200 | would be more easily suppressible?
00:45:01.320 | That's the open question.
00:45:03.240 | It's a scientific experiment
00:45:04.960 | and I'm asking only from a perspective
00:45:07.280 | of the fact that we've tried democracy
00:45:10.400 | quite a bit recently
00:45:12.200 | and maybe you can correct me,
00:45:14.280 | we haven't yet seriously tried anarchy in a large scale.
00:45:18.000 | - Well, we don't need to try to.
00:45:19.680 | So, anarchy isn't like a country, right?
00:45:21.760 | It's like saying, well, if anarchy works,
00:45:25.320 | how come we've never had an anarchist government, right?
00:45:27.280 | So, anarchism is a relationship
00:45:29.640 | and language is an example of this.
00:45:31.160 | It's a worldwide anarchic system.
00:45:32.720 | You and I have an anarchist relationship.
00:45:34.000 | There's almost no circumstances
00:45:35.400 | we'd be calling the police on each other.
00:45:37.400 | - I'm asking the same question
00:45:39.880 | in a bunch of different directions,
00:45:42.320 | born out of my curiosity,
00:45:44.360 | is why is anarchy going to be better
00:45:49.280 | at preventing the darker sides of human nature,
00:45:52.000 | which is presumably a criticism of government.
00:45:54.720 | - Because of decentralization.
00:45:56.840 | So, the darker side of human nature is an extreme concern.
00:46:00.160 | Anyone who says it's gonna go away
00:46:01.800 | is absurd and fallacious.
00:46:04.080 | I think that's a non-starter
00:46:05.240 | when people say that everyone's gonna be good.
00:46:07.200 | Human beings are basically animals.
00:46:08.640 | We're capable of great beauty and kindness.
00:46:10.800 | We're capable of just complete, cruel,
00:46:13.640 | and what we would call inhumanity.
00:46:15.280 | But we see it on a daily basis even today.
00:46:17.880 | And what's interesting is the corporate press
00:46:21.320 | won't even tell you the darkest aspects
00:46:23.680 | because that's too upsetting to people.
00:46:25.280 | So, they'll tell you about atrocities and horrors,
00:46:27.680 | but only to a point.
00:46:29.280 | And then when you actually do the homework,
00:46:31.000 | you're like, "Oh, it's so much worse than that."
00:46:32.600 | Like that thing about Stalin, right?
00:46:34.120 | So, we know in a broad sense that Stalin was a dictator.
00:46:38.040 | We know that he killed a lot of people,
00:46:41.080 | but it takes work to learn about the Holodomor.
00:46:44.040 | It takes work to learn
00:46:45.240 | about what those literal tortures were,
00:46:47.640 | and that this is the person who later,
00:46:49.680 | FDR and Harry Truman were shaking hands with
00:46:52.320 | and taking photos with,
00:46:53.360 | and was being sold to us as Uncle Joe.
00:46:55.360 | You know, he's just like you and me.
00:46:57.800 | So, when you have a decentralized information network,
00:47:02.800 | as opposed to having three media networks,
00:47:06.040 | it is a lot easier for information that doesn't fit
00:47:09.440 | what would be the corporate American narrative
00:47:11.640 | to reach the population.
00:47:13.840 | And it would be more effective for democracy
00:47:16.360 | 'cause they're in a much better position to be informed.
00:47:18.400 | Now, you're right.
00:47:20.240 | It also means, well, if everyone has a mic,
00:47:22.240 | that means every crazy person with their wacky views.
00:47:25.560 | And at a certain point, yeah, it has to become,
00:47:28.480 | then there's another level,
00:47:29.660 | which is then the people have to be self-enforcing.
00:47:32.120 | And you see that in social media all the time
00:47:34.040 | where someone says this, the other person jumps in.
00:47:37.240 | - You think, but isn't social media a good example of this?
00:47:40.640 | So, you think ultimately without centralized control,
00:47:44.980 | you can have stability?
00:47:47.320 | What about the mob outrage and the mob rule,
00:47:52.320 | the power of the mobs that emerge?
00:47:54.840 | - Power of the mob is a very serious concern.
00:47:58.760 | Gustave Le Bon wrote a book in the 1890s called "The Crowd."
00:48:01.520 | And this was one of the most important books ever written
00:48:03.480 | 'cause it influenced Mussolini and Hitler and Stalin,
00:48:06.000 | and they all talked about it.
00:48:07.400 | And he made the point that under crowd psychology,
00:48:11.120 | lynchings are an example of this.
00:48:13.540 | None of those individuals or very few
00:48:16.160 | would ever dream of doing these acts.
00:48:19.560 | But when they're all together
00:48:21.200 | and you lose that sense of self, you become the ant
00:48:23.740 | and you lose that sense of individuality,
00:48:25.760 | you're capable of doing things that like in another context,
00:48:29.120 | you'd be like, I should kill myself, I'm a monster.
00:48:32.280 | - So, you're worried about that,
00:48:33.400 | but doesn't the mob have more power under anarchy?
00:48:37.580 | - No, the mob has much less power in anarchy
00:48:39.840 | 'cause under anarchism, every individual is fully empowered.
00:48:43.380 | You wouldn't have gun restrictions.
00:48:47.960 | You would have people creating communities
00:48:50.060 | based on shared values.
00:48:51.840 | They would be much more collegial.
00:48:53.560 | They'd be much more kind as opposed to
00:48:56.280 | when you're forcing people to be together in a polity
00:48:59.400 | when they don't have things in common.
00:49:01.380 | That is like having a bad roommate.
00:49:03.740 | If you're forced to, like jails,
00:49:06.140 | if you're forced to be locked in a room with someone,
00:49:09.940 | even if you at first like them,
00:49:11.520 | after a while, you're going to start to hate them
00:49:13.520 | and that leads to very nefarious consequences.
00:49:16.820 | - So, as an anarchist,
00:49:18.240 | what do you do in a society like this?
00:49:21.160 | - Thrive.
00:49:22.760 | I think I'm doing okay.
00:49:23.920 | - I mean, there's an election coming up.
00:49:30.420 | As you talk, "You're Welcome"
00:49:36.080 | is one of the 15 shows that you host.
00:49:39.940 | (laughing)
00:49:42.200 | - It's down to one.
00:49:44.160 | - Okay, it's down to one.
00:49:46.020 | But I'm a big fan.
00:49:49.440 | - You talk about libertarianism a little bit.
00:49:52.480 | I mean, is there some practical political direction
00:49:57.480 | like in terms of ways the society should go?
00:50:01.120 | I don't mean we as a nation.
00:50:03.000 | I mean, we as a collective of people should go
00:50:05.720 | to make a better world from an anarchist point of view.
00:50:09.200 | - Sure, I think politics is the enemy.
00:50:12.220 | And anything--
00:50:14.000 | - How do you define politics?
00:50:14.820 | - The state, the government.
00:50:16.340 | So anything that lessens its sway on people,
00:50:19.960 | anything that delegitimizes it is good.
00:50:23.440 | I wrote an article a few years ago
00:50:25.400 | about how wonderful it is
00:50:26.960 | that Trump is regarded as such a buffoon
00:50:29.880 | because it's very, very useful
00:50:32.080 | to have a commander-in-chief who's regarded as a clown
00:50:35.160 | because it's gonna take a lot
00:50:37.280 | to get him to convince your kids to go overseas
00:50:39.920 | and start killing people and making widows and orphans,
00:50:42.320 | as well as those kids coming home in caskets.
00:50:44.440 | Whereas if someone is regarded with prestige
00:50:47.220 | and they're like, "Oh, we need to send your kid overseas."
00:50:49.700 | Oh, absolutely.
00:50:50.900 | I mean, this guy's great.
00:50:52.340 | So that is a very healthy thing
00:50:54.460 | where people are skeptical of the state.
00:50:57.120 | - But there's a lot of people that regard him
00:51:01.100 | as one of the greatest leaders we've ever had.
00:51:04.660 | - Yeah, Dinesh D'Souza, he's another Lincoln.
00:51:07.180 | - When you talk shit about Trump or talk shit about Biden,
00:51:13.580 | I'm trying to find a line to walk
00:51:16.840 | where they don't immediately put you into the,
00:51:19.720 | this person has Trump derangement syndrome
00:51:21.640 | or they have the alternative to that.
00:51:25.280 | - I'm more than happy
00:51:27.120 | when people are preemptively dismissing me
00:51:29.500 | because then I don't have to waste time engaging with them
00:51:31.600 | 'cause those people will be of no use to me.
00:51:33.640 | When I was on Tim Pool recently, Tim Pool's show,
00:51:36.540 | Tim Pool's known for his little hat.
00:51:38.760 | I got a propeller beanie motorized
00:51:40.520 | and it was just spinning the whole two hours.
00:51:42.560 | - I know, like a 1950s thing.
00:51:44.320 | The point being I wore it
00:51:46.040 | because there's lots of people who would say,
00:51:48.640 | I can't take seriously someone who wears a hat like that.
00:51:51.440 | And my point being, if you are the kind of person
00:51:53.680 | who takes your cues based on someone's wardrobe
00:51:56.720 | as opposed to the content of their ideas,
00:51:58.600 | you're of no use to me as an ally.
00:52:01.240 | So I'd be more than happy you preemptively abort
00:52:04.640 | rather than waste our breath trying to engage.
00:52:07.280 | - This is a very, very deep thing
00:52:09.000 | that you and I disagree on,
00:52:10.240 | which is this goes through the trolling
00:52:12.420 | versus the love is I believe that person
00:52:16.800 | instinctually dismisses you on the very basic surface level.
00:52:21.360 | But deep down, there actually,
00:52:23.840 | there's a wealth of a human being that seeks the connection,
00:52:28.840 | that seeks to understand deeply,
00:52:32.360 | to connect with other humans that we should speak to.
00:52:36.040 | - Yeah, you and I completely disagree.
00:52:38.240 | - So you're saying--
00:52:40.040 | - I'm saying there's no mind there literally.
00:52:42.400 | - Okay, so let's, I naturally think the majority--
00:52:46.200 | So I naturally think the majority of people
00:52:48.640 | have the capacity to be thoughtful, intelligent,
00:52:54.840 | and learn about ideas,
00:52:59.920 | ideas that they instinctually,
00:53:02.000 | based on their own current inner circle, disagree with,
00:53:07.000 | and learn to understand, to empathize with the other.
00:53:11.120 | And in the current climate,
00:53:13.520 | there's a divisiveness that discourages that.
00:53:15.720 | And that's where I see the value of love,
00:53:18.360 | of encouraging people to strip away
00:53:24.760 | that surface instinctual response
00:53:27.320 | based on the thing they've been taught,
00:53:29.520 | based on the things they listen to,
00:53:31.160 | to actually think deeply.
00:53:32.640 | - Have you ever had, gone to CVS or Duane Reade,
00:53:36.920 | and your bill, how much you owe them is $6,
00:53:40.280 | and you give them a $10 bill and a single,
00:53:42.280 | and watch the look on their face?
00:53:44.560 | You watch them void their bowels and panic
00:53:46.800 | because you've given them $11 on a $6 bill?
00:53:49.640 | This is not a mind capable or interested
00:53:52.600 | in thoughts and ideas and learning.
00:53:54.300 | - No, you're talking about the first moment
00:53:57.040 | of a first moment where there's an opportunity to think.
00:54:02.040 | - They are desperate to avoid it.
00:54:04.180 | - No, they're just--
00:54:07.120 | - And incapable of it.
00:54:10.200 | - They have the same exact experience
00:54:12.400 | as I have every single day
00:54:14.140 | when I know it's time for me to go out on a run
00:54:17.000 | of five miles or six miles or 10 miles.
00:54:20.360 | I'm desperate to avoid it,
00:54:22.200 | and at the same time, I know I have the capacity to do it,
00:54:26.800 | and I'm deeply fulfilled when I do do it,
00:54:29.000 | when I do overcome that challenge.
00:54:30.400 | - You are one of the great minds of our generation.
00:54:33.240 | You are telling me that any of these people
00:54:35.100 | can do anything close to the work you do?
00:54:38.240 | - Not in artificial intelligence,
00:54:39.820 | but in the ability to be compassionate
00:54:44.820 | towards other people's ideas,
00:54:48.360 | like understand them enough to be able--
00:54:50.480 | - Compassion requires a certain baseline of intelligence,
00:54:53.400 | 'cause you have to perceive other people
00:54:54.720 | as being different but of value.
00:54:56.080 | - Yeah, exactly.
00:54:56.920 | - That's a sophisticated mindset.
00:54:59.560 | - I think most people are capable of it.
00:55:03.280 | You don't think so?
00:55:04.120 | - No, and nor are they interested in it.
00:55:06.760 | But in that kind of, if you don't believe
00:55:09.900 | they're capable of it, how can anarchy be stable?
00:55:14.060 | - If you have a farm, there's one farmer and 50 cows.
00:55:18.260 | It's very stable.
00:55:19.540 | You're not asking the cows where to farm things.
00:55:23.620 | - Yeah, but the cows aren't intelligent enough to do damage.
00:55:29.620 | - Cows certainly can, bulls, 'cause they could do a lot of damage.
00:55:32.460 | They could trample things, they could attack you.
00:55:34.860 | Cows are like, how much do they weigh, like 4,000 pounds?
00:55:37.000 | - Can you connect the analogy then?
00:55:38.520 | Because like--
00:55:39.360 | - Sure, you can't expect it.
00:55:41.640 | Saying a cow's a cow isn't a slur.
00:55:44.680 | It's not saying you hate cows.
00:55:46.120 | Let's take, the example I always use
00:55:48.880 | with good reason is dogs, okay?
00:55:50.580 | I always say to study how human beings operate,
00:55:53.720 | watch Cesar Millan, 'cause human beings
00:55:55.960 | and dogs have co-evolved.
00:55:57.120 | Our minds have both evolved in parallel tracks
00:55:59.520 | to communicate with each other.
00:56:00.920 | Dogs can be vicious.
00:56:03.540 | Dogs, for the most part, are great, wonderful,
00:56:06.360 | but you can't expect the dog to understand certain concepts.
00:56:11.360 | And now most people are offended,
00:56:13.040 | are you saying I'm like a dog?
00:56:14.160 | If you're a dog person like I am,
00:56:15.880 | this is actually a huge compliment.
00:56:17.080 | Most dogs are better than most people.
00:56:19.800 | But to get the idea that this is something
00:56:22.940 | that is basically your peer is nonsensical.
00:56:26.640 | Now, of course, this sounds arrogant and elitist
00:56:28.400 | and so on and so forth, and I'm perfectly happy with that,
00:56:31.320 | but it is very hard to persuade me or anyone
00:56:34.460 | that if you walk, George Carlin has that joke,
00:56:36.460 | think how smart the average person is,
00:56:37.940 | then realize 50% of people are dumber than that.
00:56:40.380 | If you walk around and see who's out there,
00:56:42.180 | these people are very kind, they are of value,
00:56:45.060 | they deserve to be treated with respect,
00:56:47.820 | they deserve to be secure in their person,
00:56:49.860 | they deserve to feel safe and to have love,
00:56:52.860 | but the expectation that they should have
00:56:56.380 | any sort of semblance of power over me or my life
00:56:59.220 | is as nonsensical as asking Lassie to be my accountant.
00:57:03.120 | - So, but that goes to power,
00:57:05.980 | that not to the ability, the capacity
00:57:09.900 | to be empathetic, compassionate, intelligent.
00:57:12.900 | What, if I were to try to prove you wrong--
00:57:16.200 | - That's a good question, okay.
00:57:17.900 | - What would you be impressed by about society?
00:57:22.260 | How would I show it to you?
00:57:25.780 | - That's a good question, how would you show it to me?
00:57:27.220 | 'Cause I think something has to be falsifiable
00:57:28.980 | if you're gonna make a claim, right?
00:57:30.700 | So, what would it, what would it--
00:57:32.980 | - 'Cause we both made claims that are a kind of
00:57:37.420 | our own interpretation based on our interaction.
00:57:40.740 | Like when I open Twitter, everyone seems to say--
00:57:43.660 | - Why do you only follow one person?
00:57:44.820 | Who do you follow?
00:57:45.640 | Who's the one person you follow?
00:57:46.780 | - Stoic Emperor, I follow a lot of people.
00:57:49.660 | I have a script that-- - Oh, secretly?
00:57:50.660 | - I have a script that-- - Of course you do.
00:57:52.940 | I'm a robot.
00:57:53.780 | - I have an entire interface.
00:57:55.880 | So I think Twitter's really--
00:57:58.780 | - This is real love, it's not ironic love.
00:58:00.940 | I love watching it, and I'm sure you do too.
00:58:04.220 | I love watching a quality mind at work.
00:58:06.400 | Because when someone has a quality mind,
00:58:08.300 | they're often not self-aware.
00:58:10.380 | I catch this on myself of how it operates.
00:58:12.820 | And then when other people see it,
00:58:13.940 | they're like, oh my God, this is so beautiful.
00:58:15.460 | 'Cause there's such an innocence to it.
00:58:17.380 | - But like when I open Twitter, I'm energized.
00:58:21.780 | There's a lot of love on Twitter.
00:58:23.200 | People say like-- - I love, I agree.
00:58:25.420 | You don't think I have a lot of love on Twitter?
00:58:27.100 | My fans pay my rent.
00:58:29.020 | - I mean, I don't know your experience of Twitter,
00:58:32.420 | but when I look at your,
00:58:33.460 | which is a fundamentally different thing.
00:58:35.580 | I'm saying my experience from the,
00:58:37.620 | so maybe you can tell me what your experience is like
00:58:39.580 | as a human.
00:58:40.420 | So when I observe your Twitter, I think,
00:58:44.380 | I wouldn't call it love.
00:58:48.060 | I would call it fun.
00:58:50.460 | - Yes. - And because of that,
00:58:53.300 | that's a different kind of, like love emerges from that.
00:58:56.880 | Because people kind of learn that we're having,
00:59:00.540 | this is like game night.
00:59:01.820 | Like-- - Yes.
00:59:03.020 | - We can talk shit a little bit.
00:59:06.380 | - Yes. - We can,
00:59:07.660 | and you can even like pull in, you can make fun of people.
00:59:11.620 | You can have the crazy uncle come over
00:59:13.820 | that is a huge Trump supporter,
00:59:16.740 | somebody who hates Trump, and you can have a little fun.
00:59:19.220 | - Yes. - I get it.
00:59:20.060 | It's a different kind of thing.
00:59:21.420 | I wouldn't be able to be the,
00:59:25.380 | you're the host of game night?
00:59:26.860 | - Yes, yes.
00:59:27.700 | - So I wouldn't be able to host that kind of game night?
00:59:30.580 | - I'm imagining you programming your robots,
00:59:32.860 | and you're asking, "What is fun?"
00:59:34.340 | And it just starts sparking.
00:59:35.900 | - Exactly.
00:59:36.740 | (laughing)
00:59:37.580 | - What is fun?
00:59:38.420 | (laughing)
00:59:40.260 | - So the robots in my life that survive
00:59:42.500 | are the ones that don't,
00:59:45.740 | that like survive that whole programming process.
00:59:49.860 | So they're kind of like The Idiot from Dostoevsky.
00:59:52.940 | They're very like simple-minded robots.
00:59:56.340 | - It's just--
00:59:57.180 | - Fun is moving a can from one table to another.
01:00:03.460 | - Yeah, that's game night for your kin.
01:00:03.460 | - You know what one of my quotes is,
01:00:04.740 | and I think about this every day,
01:00:06.460 | and I mean it with every fiber of my being.
01:00:09.220 | We're born knowing that life is a magical adventure,
01:00:11.560 | and it takes them years to train us to think otherwise.
01:00:14.820 | And I think that Willy Wonka approach,
01:00:16.600 | it's a very Camus approach.
01:00:17.860 | It's something I believe with every fiber of my being.
01:00:20.700 | I try to spread that as much as possible.
01:00:22.820 | I think it is very sad.
01:00:24.620 | I'm not being sarcastic.
01:00:25.700 | It comes off as condescending.
01:00:27.500 | I mean it at face value.
01:00:28.980 | It's very sad how many people are not receptive to that.
01:00:31.740 | And I think a lot of those functions, how they were raised.
01:00:34.060 | And I could have very easily with my upbringing
01:00:37.020 | have not maintained that perspective.
01:00:40.340 | And there's a lot of,
01:00:42.700 | I have a lot of friends in recovery, like AA,
01:00:45.180 | and they have an expression,
01:00:46.900 | "Not my circus, not my monkeys," right?
01:00:49.220 | That you can't really take on other people's problems
01:00:51.940 | on your own at a certain point.
01:00:53.100 | They have to do the work themselves
01:00:54.580 | 'cause you can only do so much externally.
01:00:56.860 | And there are a lot of very damaged people out there.
01:01:00.580 | And there are damaged people who revel in being damaged.
01:01:04.460 | And there are damaged people who desperately,
01:01:07.020 | desperately, desperately wanna be well,
01:01:08.980 | who desperately wanna be happy,
01:01:10.300 | who desperately wanna find joy.
01:01:12.180 | So if I can be the one,
01:01:14.260 | and as arrogant as this sounds, I'll own it,
01:01:16.580 | who does give them that fun,
01:01:18.420 | and to tell them it doesn't have to be like you thought.
01:01:21.100 | Like it could be, it's gonna hurt, it's gonna suck,
01:01:24.140 | but it's still a magical adventure,
01:01:25.820 | and you're gonna be okay 'cause you've been through worse.
01:01:28.100 | Like that, if that could be my message,
01:01:30.060 | I would own it all day long.
01:01:32.020 | - And so what does adventure look like for you?
01:01:35.580 | 'Cause I mean, it actually boils down to,
01:01:37.220 | I still disagree with you.
01:01:38.620 | I think trolling can be,
01:01:41.860 | and very often is destructive for society.
01:01:45.020 | - Yes, I want to destroy society.
01:01:46.740 | That is the goal.
01:01:47.580 | I want to help many people.
01:01:50.780 | - Unironically, okay.
01:01:51.820 | - Unironically, yes.
01:01:53.940 | - Ah, what do I do with that?
01:01:55.860 | Okay, so--
01:01:56.700 | - Whatever you want, do what thou wilt,
01:01:58.020 | is the whole of the law.
01:01:59.180 | - Like I just wanna, so you're hosting game night,
01:02:02.420 | and I just wanna play Monopoly.
01:02:04.140 | I wanna play, what's it, Risk.
01:02:07.540 | Okay, I wanna play these games, and you're saying--
01:02:09.380 | - Those are aggressive games.
01:02:10.540 | - Yeah, I was trying to think of a friendlier game,
01:02:12.780 | but they're all kind of aggressive.
01:02:14.540 | - Battleship.
01:02:16.460 | (both laughing)
01:02:17.700 | Axis and allies, you know, fun stuff.
01:02:20.780 | But like, so that's an adventure,
01:02:24.260 | but you're saying that we wanna destroy everything,
01:02:27.820 | even like the rules of those games are not--
01:02:31.220 | - No, you voluntarily agree to those rules.
01:02:33.500 | The point is, if someone comes in
01:02:35.260 | who no one invited to game night,
01:02:37.460 | and are telling you, "No, when you play Monopoly,
01:02:40.060 | "you have to get money when you land in free parking,"
01:02:42.780 | or you don't, it's like, who are you?
01:02:45.380 | We're having our own fun, and you smell.
01:02:49.740 | - I don't know, but there's an aggressive--
01:02:52.980 | - There's an aggression, let me speak to that,
01:02:54.700 | which I think you're picking up on.
01:02:56.500 | I had a friend named Martha, Marsha, excuse me,
01:02:58.460 | she ran something called cuddle parties,
01:02:59.780 | which people laughed at a lot back in the day.
01:03:02.060 | And the premise of the cuddle party
01:03:03.220 | is everyone got together and cuddled, right?
01:03:04.820 | And it's like, "Ah ha ha."
01:03:05.740 | Then you stop to think about it, and you realize,
01:03:07.780 | physical contact's extremely important,
01:03:09.660 | and a lot of people don't have it.
01:03:11.060 | And if this is a mechanism of people getting that,
01:03:12.740 | it actually is going to have
01:03:13.580 | profound positive psychological consequences.
01:03:15.420 | So after she explained it, I'm like,
01:03:16.540 | "Okay, we laughed at this 'cause it's weird,
01:03:18.540 | "and now that I think about it, this is wonderful."
01:03:20.580 | And I asked her about the tough question,
01:03:23.580 | I go, "What if guys get turned on?"
01:03:25.820 | And on their website, it even has a rule,
01:03:27.660 | like, "Do not fear the erection," right?
01:03:28.860 | 'Cause it's gonna be a natural consequence
01:03:30.220 | of physical proximity.
01:03:31.460 | And the point she goes, she said this,
01:03:33.020 | and I think about this all the time,
01:03:34.460 | "People will take as much space as you let them.
01:03:38.520 | "It is incumbent on each of us to set our own boundaries.
01:03:42.780 | "We all have to learn when to say no,
01:03:44.700 | "you're making me uncomfortable.
01:03:46.200 | "If someone doesn't respect your right
01:03:48.420 | "to have your boundary to be uncomfortable,
01:03:50.140 | "this person is not your friend."
01:03:51.820 | Now, they can say, "I don't understand.
01:03:54.260 | "Like, why is this okay?
01:03:55.420 | "Why is that not?
01:03:56.320 | "Let me know you better so I'm respectful of you."
01:03:59.300 | But if they roll their eyes and they're like,
01:04:00.960 | "Get over it, I'm gonna do what I want,"
01:04:02.340 | this person is not interested in knowing
01:04:04.260 | you as a human being.
01:04:05.260 | - Okay. - And that is the aggression.
01:04:08.820 | You have to draw those lines.
01:04:10.260 | - I mean, but that's a very positive way
01:04:13.420 | of phrasing that aggression.
01:04:14.900 | - I'm a very positive person.
01:04:17.020 | But the trolling, there's a destructive thing to it
01:04:20.420 | that hurts others.
01:04:22.740 | But it's not bad people.
01:04:25.380 | - I only troll as a reaction or towards those in power.
01:04:28.500 | - Okay, so maybe let's talk about trolling a little bit.
01:04:31.100 | Because trolling, when it can, maybe you can correct me,
01:04:36.020 | but I've seen it become a game for people
01:04:38.020 | that's enjoyable in itself.
01:04:41.120 | - I disagree with that.
01:04:44.320 | That's not a good thing.
01:04:45.540 | If you are there just to hurt innocent people,
01:04:48.740 | you are a horrible human being.
01:04:50.860 | - But doesn't trolling too easily become that?
01:04:53.940 | - I don't know about easily.
01:04:55.940 | Let me give you an example of where trolling came from.
01:04:58.760 | The original troll was Andy Kaufman.
01:05:01.180 | He was on the show "Taxi."
01:05:02.740 | He was a performance artist, not a standard comedian.
01:05:05.500 | And this is a quintessential example of trolling.
01:05:08.020 | He had a character where he was basically
01:05:10.740 | like a lounge singer.
01:05:11.580 | He had these glasses on and just a terrible singer
01:05:14.980 | and so on and so forth.
01:05:15.820 | And he denied it was him.
01:05:17.040 | And he came out and I'm blanking on the guy's name.
01:05:21.380 | I can't believe it.
01:05:22.220 | Tony Clifton.
01:05:23.040 | - Wow. - Yeah.
01:05:23.980 | - He came out in the audience and he goes,
01:05:25.660 | "You know, my wife died a few years ago.
01:05:28.180 | Every time I look at my daughter Sarah's eyes,
01:05:29.900 | I can see my wife.
01:05:30.780 | Sarah, come out here.
01:05:31.620 | Let's do a duet."
01:05:32.580 | And Sarah was like 11, sits on his lap.
01:05:35.060 | They start singing duet.
01:05:36.460 | Her voice cracks.
01:05:37.620 | He smacks her across the face.
01:05:38.980 | "What the hell are you doing?
01:05:39.940 | You're making an ass out of yourself in front of these people."
01:05:42.260 | She starts crying.
01:05:43.600 | The audience is booing and he goes,
01:05:45.500 | "Don't boo her.
01:05:46.340 | You're just gonna make her cry more."
01:05:47.700 | Now it ends.
01:05:50.220 | This wasn't his daughter.
01:05:51.380 | It wasn't even a child.
01:05:52.220 | It was an actress.
01:05:53.040 | This was all set up.
01:05:53.880 | He's exploiting their love of children
01:05:57.340 | in order to force them to be performers.
01:05:59.380 | That is trolling.
01:06:01.060 | No one is actually getting hurt.
01:06:02.820 | It's a humorous though twisted exchange.
01:06:05.780 | If you go online looking for weak people
01:06:10.340 | and you are there to denigrate them
01:06:13.080 | just for them being weak or in some way inferior to you,
01:06:16.520 | that is the wrong approach.
01:06:18.840 | I am best on the counterpunch.
01:06:21.400 | A lot of times people come to me
01:06:23.520 | and they'll be like, "I hope you die.
01:06:24.960 | You're ugly.
01:06:25.800 | You're disgusting."
01:06:26.620 | And there's this great quote from Billy Idol,
01:06:28.160 | which I'm gonna mangle.
01:06:29.000 | He said something to the effect of,
01:06:30.200 | "I love it when people are rude to me,
01:06:31.720 | then I can stop pretending to be nice."
01:06:33.640 | Then they start fights.
01:06:35.480 | Now it's a chance for me to finish it
01:06:37.280 | and make an example of this person.
01:06:39.000 | So that's very, very different from,
01:06:40.940 | I'm gonna go around and humiliate people
01:06:43.020 | for the sake of doing it, in my view.
01:06:45.160 | And I can see how one would lead to the other.
01:06:48.540 | - Yeah, but that's my fundamental concern with it.
01:06:50.980 | So my dream is to put, use technology,
01:06:54.860 | create platforms that increase
01:06:59.020 | the amount of love in the world.
01:07:00.580 | And to me, trolling is doing the opposite.
01:07:06.060 | So like Andy Kaufman is brilliant.
01:07:09.840 | So I love, obviously, it sounds like I'm a robot saying,
01:07:13.000 | I love humor, okay?
01:07:14.260 | Humor is good.
01:07:16.600 | (both laughing)
01:07:18.400 | - One, one, one, zero, one, one, one, one.
01:07:21.080 | - But like, it's, I just see like 4chan.
01:07:25.640 | I see that you can often see that humor quickly turn.
01:07:29.520 | - Yeah, because what happens is a lot of low status people,
01:07:31.980 | this is their one mechanism through sadism,
01:07:34.980 | to feel empowered, and then they can hide behind,
01:07:38.480 | well, I'm just joking.
01:07:40.120 | - Yeah, like there's this dark thing.
01:07:41.440 | - Yeah, that's not acceptable.
01:07:42.560 | - There's a dark LOL that people do,
01:07:45.840 | which is like, they'll say like the shittiest thing.
01:07:49.080 | - Right, 'cause they feel-- - And then do LOL after.
01:07:51.080 | Like, as if, I don't even know,
01:07:54.040 | like what is happening in the dark mind of yours.
01:07:56.960 | - Because they are feeling powerless in their lives.
01:08:00.000 | - Right.
01:08:00.820 | - And they see someone who they perceive
01:08:02.200 | as higher status or more powerful than them,
01:08:03.600 | or even not appear, and they, through their words,
01:08:07.040 | cause a reaction in this person,
01:08:09.500 | so they feel like they are, in a very literal sense,
01:08:11.680 | making a difference on Earth,
01:08:12.920 | and they matter in a very dark way.
01:08:15.220 | It's disturbing.
01:08:16.400 | This is not, I mean, it's unfortunate
01:08:18.240 | that that term trolling is used for that,
01:08:21.200 | as opposed to what Andy Kaufman does,
01:08:22.960 | as opposed to what I do.
01:08:24.420 | It really is a sinister thing,
01:08:29.060 | and it's something I'm not at all a fan of.
01:08:31.640 | - How do we fight that?
01:08:33.840 | So, like, a neighboring concept of that
01:08:37.240 | is conspiracy theories, which is--
01:08:39.760 | - I don't think they're neighboring at all.
01:08:41.640 | - Well, let me give a sort of naive perspective.
01:08:45.280 | Maybe you can educate me on this.
01:08:46.600 | From my perspective, conspiracy theories
01:08:49.760 | are these constructs of ideas
01:08:52.880 | that go deeper and deeper and deeper
01:08:55.440 | into creating worlds where, you know,
01:09:00.440 | worlds where there's powerful pedophiles
01:09:05.120 | controlling things, like these very sophisticated models
01:09:10.120 | of the world that, you know, in part might be true,
01:09:14.160 | but in large part, I would say,
01:09:15.960 | are figments of imagination
01:09:18.560 | that become really useful constructs--
01:09:22.080 | - And self-reinforcing.
01:09:23.160 | - Self-reinforcing, for then feeding,
01:09:26.640 | like empowering the trolls to attack
01:09:31.440 | the powerful, the conventionally powerful.
01:09:34.920 | - I don't think that that's a function
01:09:36.400 | of conspiracy theories.
01:09:37.240 | Now, let's talk about conspiracy theories,
01:09:38.720 | 'cause one of my quotes is, "You take one red pill,
01:09:40.560 | "not the whole bottle."
01:09:41.960 | This concept that everything in life
01:09:45.600 | is a function of a small cadre of individuals
01:09:50.520 | would be, for many people, reassuring,
01:09:53.220 | because as bad as it looks, you know they,
01:09:55.920 | whoever they are, it's usually the Jews,
01:09:57.960 | aren't gonna let it get that bad, that they will pull back.
01:10:00.880 | Or the black pill is that they aren't intentionally
01:10:05.880 | trying to destroy everything,
01:10:07.320 | and there's nothing we can do when we're doomed.
01:10:08.920 | And there's an amazing book by Arthur Herman
01:10:10.520 | called "The Idea of Decline in Western History."
01:10:12.880 | It's one of my top 10 books,
01:10:14.360 | where he goes through every 20 years
01:10:16.400 | how there's a different population that says,
01:10:18.240 | "It's the end of the world, here's the proof."
01:10:20.680 | And very often, the proof is something
01:10:22.160 | that is kind of self-fulfilling,
01:10:24.120 | where it's not falsifiable.
01:10:26.600 | And we both have to think of ways
01:10:27.800 | to falsify our claims from earlier.
01:10:29.720 | So it is a big danger.
01:10:32.560 | It's a big danger online, because very quickly,
01:10:35.880 | if someone who you thought was good,
01:10:38.620 | but now is bad on one aspect,
01:10:40.320 | well, they're controlled opposition,
01:10:41.840 | or they've been taken over,
01:10:44.140 | or they've been kind of appropriated by the bad people,
01:10:47.800 | whoever those bad people would be.
01:10:49.960 | I don't know that I have a good answer for this.
01:10:52.920 | I don't think it's as pervasive as people think.
01:10:56.040 | - The number of people who believe conspiracy theory?
01:10:57.840 | - Right, I mean, and also conspiracy theory
01:11:00.200 | is a term used to dismiss ideas that have some currency.
01:11:04.260 | The Constitutional Convention was a conspiracy.
01:11:06.820 | The Founding Fathers got together secretly,
01:11:08.480 | under a vow of secrecy in Philadelphia,
01:11:10.080 | said, "We're throwing out the Articles of Confederation,
01:11:11.520 | "we're making a new government," right, yeah, yeah, yeah.
01:11:13.000 | And Luther Martin left, and he told everyone,
01:11:15.280 | "This is a conspiracy," and they're like,
01:11:16.540 | "Yeah, whatever, Luther Martin."
01:11:17.880 | So, and Jeffrey Epstein was a conspiracy,
01:11:20.160 | Harvey Weinstein was a conspiracy,
01:11:21.720 | Bill Cosby was a conspiracy.
01:11:22.720 | They all knew, they didn't care.
01:11:24.880 | Communist infiltration in America,
01:11:26.600 | there's a great book by Eugene Lyons
01:11:28.760 | called "The Red Decade."
01:11:30.160 | They all knew every atrocity that was done under Stalinism
01:11:35.160 | was excused in the West, and if you didn't believe it,
01:11:37.960 | oh, you've got this crazy anti-Russia conspiracy.
01:11:40.280 | So, it's a term that is weaponized in a negative sense,
01:11:43.700 | but that does not at all imply
01:11:45.640 | that it does not have very negative real-life consequences,
01:11:49.200 | because it's kind of a cult of one, right?
01:11:52.280 | Like, I'm at home on my computer,
01:11:54.080 | I buy into this ideology, anyone who doesn't agree with me,
01:11:58.200 | they are blind, they're oblivious, mom and dad, my friends,
01:12:01.760 | you don't get it, we were warned about people like you,
01:12:05.280 | and I think there's a very heavy correlation,
01:12:08.440 | and I'm not a psychiatrist, of course,
01:12:10.040 | between that and certain types of mild mental illness,
01:12:12.480 | like some kind of paranoid schizophrenia,
01:12:14.640 | things like that, because after a certain point,
01:12:17.160 | if everything is a function of this conspiracy,
01:12:20.240 | there's no randomness or beauty in life.
01:12:22.920 | - Yeah, I mean, I don't know if you can say
01:12:25.200 | anything interesting about it in the way of advice
01:12:28.160 | of how to take a step into conspiracy theory world
01:12:33.160 | without completely going, like diving deep,
01:12:37.540 | because it seems like that's what happens.
01:12:39.720 | People can't look at Jeffrey Epstein--
01:12:42.520 | - I can tell you the advice I'd have to--
01:12:44.160 | - Seriously and rigorously, without going,
01:12:48.040 | 'cause you can look at Jeffrey Epstein
01:12:49.760 | and say there's a deeper thing, you can always go deeper.
01:12:53.840 | It's like Jeffrey Epstein was just a tool
01:12:55.840 | of the lizard people, and the lizard people are the tool--
01:12:59.760 | - Well, they say Satanists, in this case.
01:13:02.920 | - And somehow, recently, very popular,
01:13:06.480 | pedophiles somehow always involved,
01:13:08.680 | I'm not understanding any of that.
01:13:11.800 | Legitimately, I say this, both humorously and seriously,
01:13:15.160 | I need to look into it.
01:13:16.920 | And I guess the bigger question I'm asking,
01:13:19.480 | how does a serious human being,
01:13:21.920 | somebody with a position at a respectable university,
01:13:25.600 | like look at a conspiracy theory and look into it?
01:13:28.160 | When I look at somebody like Jeffrey Epstein,
01:13:30.600 | who had a role at MIT, and I think,
01:13:35.600 | I'm not happy, personally,
01:13:38.940 | I wasn't there when Jeffrey Epstein was there,
01:13:42.760 | I'm not happy with the behavior of people now
01:13:45.880 | about Jeffrey Epstein, about the bureaucracy
01:13:49.360 | and everybody's trying to keep quiet, hoping it blows over,
01:13:53.120 | without really looking into any,
01:13:55.160 | like looking in a deep philosophical way of like,
01:14:00.160 | how do we let this human being be among us?
01:14:03.760 | - Can I give you a better example?
01:14:04.920 | - Sure.
01:14:05.760 | - That is kind of conspiratorial.
01:14:07.960 | The Speaker of the House,
01:14:08.960 | the longest serving Republican Speaker of the House,
01:14:10.920 | Dennis Hastert, was a pedophile.
01:14:12.920 | He went to jail.
01:14:14.460 | The Democrats don't throw this
01:14:16.000 | in the Republicans' faces every five minutes,
01:14:18.280 | not even Democratic activists.
01:14:19.600 | I find that very, very odd and not what I would predict.
01:14:23.560 | Now, I'm not saying there's some kind of conspiracy,
01:14:26.160 | but when it comes to things like sexual predation,
01:14:28.940 | which is something that I'm very, very concerned about.
01:14:31.640 | I'm an uncle now,
01:14:32.480 | my sister just had her second kid recently, he's adorable.
01:14:35.240 | It's something that I don't understand.
01:14:40.420 | It feels as if there's a lot of people
01:14:44.080 | who want this to all go away.
01:14:45.740 | Now, I think it's also because we don't have
01:14:47.760 | the vocabulary and framework to discuss it.
01:14:50.160 | Because when you start talking about things like children
01:14:52.200 | and these kinds of issues,
01:14:53.480 | we want to believe it's all crap.
01:14:55.560 | Because it's for those of us
01:14:56.840 | who aren't in this kind of mindset,
01:14:58.400 | the idea that this happens to kids and happens frequently
01:15:01.120 | is something so horrible that we,
01:15:03.160 | it's just like, I don't even want to hear it.
01:15:04.800 | And that does these children and adult survivors
01:15:07.320 | an enormous disservice.
01:15:08.720 | So I don't know that I have any particular insight on this.
01:15:11.340 | But see, how do you, the Catholic Church,
01:15:14.780 | again, there's all these topics that--
01:15:17.100 | - Public school teachers are far more proportionately
01:15:19.980 | predators of children than the Catholic Church.
01:15:21.940 | - I mean, I don't know what, you're right, you're right.
01:15:24.620 | Perhaps I've been reading a lot about Stalin and Hitler.
01:15:29.620 | Somehow it's more comforting to be able to--
01:15:32.780 | - Yeah, 'cause it's there, and then.
01:15:34.300 | - And then, and then the atrocities that are happening now,
01:15:38.500 | it's a little bit more difficult because--
01:15:39.860 | - There was a New York Times article,
01:15:41.020 | sorry to interrupt you, where they had people
01:15:43.380 | tracking down child pornography.
01:15:45.300 | And I think the article said they didn't have enough people
01:15:47.560 | just to cover the videotapes of infants being raped.
01:15:51.400 | And we can even wrap our heads around reading Lolita,
01:15:54.520 | like, okay, she's 14, 12, okay, it's still a female.
01:15:57.460 | An infant, it's something that,
01:15:59.820 | again, like with the Stalin example,
01:16:01.520 | we sat down here for 100 years,
01:16:02.780 | we would never think of something like this,
01:16:03.860 | think of it in a sexual context, it makes no sense.
01:16:06.660 | So, and the fact that this is international,
01:16:08.860 | okay, we eliminated completely in America.
01:16:11.300 | Well, then they're gonna go find,
01:16:13.180 | there's infants all over the world,
01:16:14.260 | there's video cameras all over the world.
01:16:15.860 | So, then it has to become a conspiracy
01:16:18.460 | because someone has to film it, I'm filming it,
01:16:20.820 | you're buying it, your kid.
01:16:23.140 | It is literally a conspiratorial,
01:16:25.580 | not in the sense of like a mafia conspiracy
01:16:27.420 | or some government Illuminati,
01:15:29.380 | but there are networks designed to produce this product.
01:16:34.020 | - See, but like what I'm trying to do now,
01:16:38.500 | I mean, one of the nice things with like a podcast
01:16:41.420 | and other things I'm involved with
01:16:43.020 | is removing myself from having any kind of boss
01:16:46.740 | so I can do whatever that helps.
01:16:48.220 | - Oh, it's so wonderful, that just happened to me.
01:16:50.300 | It's the most wonderful thing ever.
01:16:52.500 | - So, I can actually, in moderation,
01:16:55.540 | consider like look into stuff.
01:16:57.780 | - Careful though, I was gonna write a book about this
01:16:59.900 | and people pointed out, you sure wanna do this research?
01:17:02.980 | 'Cause if you start Googling around for this kind of stuff,
01:17:04.780 | it's on your computer.
01:17:06.660 | - Oh, in that sense. - Yeah.
01:17:08.300 | - I'm more concerned about, you know,
01:17:10.060 | it's the Nietzsche thing, looking into the abyss.
01:17:12.180 | Like you wanna be very,
01:17:13.700 | I believe I can do this kind of thing in moderation
01:17:16.660 | without slipping into the depths.
01:17:19.300 | I think that's intelligence, that's like,
01:17:23.500 | I recently quote unquote looked into like the UFO community,
01:17:28.260 | the extraterrestrial, whatever community.
01:17:32.060 | I think it always frustrated me
01:17:34.580 | that the scientific community like rolled their eyes
01:17:37.220 | at all the UFO sightings, all that kind of stuff.
01:17:40.020 | Even though there could be fascinating,
01:17:41.980 | beautiful, physical, like, first of all,
01:17:44.700 | there could legit-- - Like ball lightning.
01:17:46.300 | - Like ball lightning, right?
01:17:47.580 | That's at the very basic level is a fascinating thing.
01:17:51.860 | And also, it could be something like,
01:17:56.860 | I mean, I don't know,
01:17:57.900 | but it could be something interesting,
01:17:59.500 | like worth looking into.
01:18:02.100 | - My grandfather was an air traffic controller
01:18:04.260 | back in the Soviet Union.
01:18:05.820 | And he said, we saw this stuff all the time.
01:18:08.300 | These are planes that were not moving
01:18:09.780 | or whatever things that were not moving
01:18:11.420 | according to anything we knew about.
01:18:13.420 | So it's absolutely real.
01:18:14.660 | He's not some jerk with an iPhone in his backyard.
01:18:18.060 | This is a military professional who understood technology,
01:18:22.660 | who knew where the secret bases were.
01:18:24.140 | So if he's telling me, it's some,
01:18:26.260 | doesn't mean it's Martians,
01:18:27.700 | but he's telling me there's something there.
01:18:29.180 | And there are many examples of these like military people.
01:18:31.580 | These aren't some layman who sees a story.
01:18:33.420 | - These are legit people.
01:18:34.260 | And so you can dismiss,
01:18:37.580 | when you're talking about professionals
01:18:38.700 | who are around aircraft all the time,
01:18:40.260 | who are familiar with aircraft at the highest levels,
01:18:42.460 | and they're seeing things that they can't explain,
01:18:45.020 | they're clearly not stupid
01:18:46.260 | and they're clearly not uninformed.
01:18:47.380 | - So there's different ways to dismiss it.
01:18:49.820 | For example, you were saying that trolling
01:18:53.860 | is a good mechanism.
01:18:55.500 | I'm against that,
01:18:56.740 | but I'm not dismissing it by like rolling my eyes.
01:19:00.460 | I'm considering legitimately
01:19:02.500 | that you're way smarter than me
01:19:04.220 | and you understand the world better than me.
01:19:05.780 | Like I'm allowing myself to consider that possibility
01:19:08.180 | and thinking about it.
01:19:09.020 | Like maybe that's true,
01:19:10.980 | like seriously considering it.
01:19:13.060 | I feel the way people should approach intelligent people,
01:19:18.060 | serious quote unquote people,
01:19:20.020 | scientists should approach conspiracy theories.
01:19:22.380 | Like look at it carefully.
01:19:26.660 | First of all, is it possible that the earth is flat?
01:19:29.380 | - It's not trivial to show that the earth is not flat.
01:19:32.580 | It's a very good exercise.
01:19:33.740 | You should go through it.
01:19:35.020 | But once you go through it,
01:19:36.420 | you realize that based on a lot of data
01:19:41.100 | and a lot of evidence,
01:19:42.340 | and there's a lot of different experiments
01:19:43.700 | you can do yourself actually
01:19:45.620 | to show that the earth is not flat.
01:19:47.220 | Okay, the same kind of process can be taken
01:19:50.860 | for a lot of different conspiracy theories,
01:19:52.940 | and it's helpful.
01:19:54.140 | And without slipping into the depths
01:19:56.300 | of lizard people running everything.
01:20:00.540 | That's where I've now listened to two episodes
01:20:04.140 | of Alex Jones's show,
01:20:08.620 | because he goes crazy deep
01:20:12.140 | into different kind of worldviews
01:20:16.900 | that I was not familiar with.
01:20:19.100 | And I don't know what to make of it.
01:20:20.580 | I mean, the reason I've been listening to it
01:20:22.020 | is because there's been a lot of discussions
01:20:25.500 | about platforming of different people.
01:20:27.820 | And I've been thinking about what does censorship mean.
01:20:30.580 | I've been thinking about whether,
01:20:34.300 | because Joe Rogan said he's gonna have Alex on again.
01:20:39.300 | And then I enjoyed it as a fan,
01:20:43.620 | just the entertainment of it.
01:20:45.260 | But then I actually listened to Alex,
01:20:47.580 | and I was thinking,
01:20:49.700 | is this human being dangerous for the world?
01:20:53.100 | Like is the ideas he's saying dangerous for the world?
01:20:55.500 | - I'm more concerned with the Russian conspiracy
01:20:57.420 | that we had for three years.
01:20:58.900 | And the claim that our election was not legitimate,
01:21:01.540 | and that everyone in the Trump White House
01:21:02.900 | is a stooge of Putin.
01:21:04.300 | And the people who said this had no consequences for this.
01:21:07.180 | Alex Jones doesn't have the respect that they do.
01:21:10.060 | These are both areas of concern for me.
01:21:12.060 | - But he might if he's given more platforms.
01:21:15.380 | So like the people who've,
01:21:18.260 | and I'd be curious,
01:21:19.780 | I'm also a little bit,
01:21:22.300 | I don't know what to think about the idea
01:21:24.100 | that Russians hacked the election.
01:21:26.660 | It seems too easily accepted in the mainstream media.
01:21:30.460 | - Hillary Clinton said that how they did it
01:21:33.980 | was they had ads on the dark web.
01:21:37.660 | Now you and I both know what the dark web is.
01:21:40.400 | So the possibility of ads on the dark web
01:21:43.060 | having an influence,
01:21:44.420 | a proportional influence on the election is literally zero.
01:21:48.420 | - Perhaps I should look into it more carefully,
01:21:50.100 | but I've found very little good data
01:21:53.900 | on exactly what did the Russians do to hack elections.
01:21:58.900 | Technically speaking, what are we talking about here?
01:22:01.980 | As opposed to these kind of weird,
01:22:03.820 | like the best thing is a couple of books
01:22:05.660 | and reporting on troll farms.
01:22:09.380 | - Troll farms, yeah.
01:22:10.220 | - Troll farms.
01:22:11.260 | But let's see the data.
01:22:14.740 | Like how many exactly?
01:22:16.780 | What are we talking about?
01:22:17.900 | What were they doing?
01:22:19.940 | Not just like some anecdotal discussions,
01:22:22.980 | but like relative to the bigger, the size of Facebook.
01:22:27.980 | Like if there's a few people, several hundred say,
01:22:32.780 | that are posting different political things on Facebook,
01:22:36.660 | relative to the full size of Facebook,
01:22:39.700 | let's look at the full size.
01:22:41.660 | - Right, you're thinking like a scientist.
01:22:42.820 | - The actual impact.
01:22:43.740 | Like 'cause it's fascinating the social dynamics
01:22:47.500 | of viral information of videos.
01:22:50.420 | When Donald Trump retweets something,
01:22:53.780 | I think that's understudied the effect of that.
01:22:56.520 | Like he retweeted a clip with Joe Rogan
01:23:01.860 | and Mike Tyson, where Mike Tyson says
01:23:05.460 | that he finds fighting orgasmic.
01:23:08.000 | I don't understand that,
01:23:09.860 | but it'd be fascinating to think like,
01:23:11.340 | what is the ripple effect on the social dynamic
01:23:15.500 | of our society from retweeting a clip about Mike Tyson?
01:23:19.620 | - What's your favorite Trump tweet?
01:23:22.300 | - I tuned him out a long time ago, unfortunately.
01:23:26.840 | You and I have a different relationship with Donald Trump.
01:23:35.340 | You appreciate the art form of trolling.
01:23:37.460 | - Non-sexual.
01:23:38.300 | - Non-sexual, yeah.
01:23:39.500 | So I tend to prefer Bill Clinton.
01:23:44.300 | He's more my type.
01:23:45.140 | I'm just kidding.
01:23:45.980 | I don't know.
01:23:47.260 | - You don't like that consent stuff.
01:23:48.780 | - No, the consent, no.
01:23:50.060 | No, you appreciate the art form of trolling
01:23:53.500 | and Donald Trump is a master.
01:23:57.100 | He's the Da Vinci of trolling.
01:24:00.180 | So I tend to think that trolling
01:24:03.620 | is ultimately destructive for society.
01:24:05.900 | And then Donald Trump takes nothing seriously.
01:24:07.980 | He's playing a game.
01:24:09.140 | He's making a game out of everything.
01:24:10.580 | - He takes a lot of things seriously.
01:24:11.460 | I think he's very committed to international peace.
01:24:14.620 | - I'm sorry, I shouldn't speak so strong.
01:24:16.820 | I think he takes, actually, yes, a lot of things seriously.
01:24:20.540 | I meant on Twitter and the game of politics.
01:24:25.340 | - Yeah.
01:24:26.180 | - He is, he only takes--
01:24:29.540 | - Irreverently.
01:24:30.500 | - Yeah.
01:24:31.340 | - Yeah.
01:24:32.180 | - And I appreciate it.
01:24:35.140 | I just would like to focus on like genuine,
01:24:40.180 | real expressions of humanity, especially positive.
01:24:44.700 | - Well, this is my favorite tweet.
01:24:46.620 | My fans got it laser etched
01:24:48.820 | and put it in a block of Lucite for me.
01:24:50.940 | And he said, "Every time I speak of the losers and haters,
01:24:54.940 | I do so with great affection.
01:24:56.780 | They cannot help the fact that they were born fucked up!"
01:24:59.500 | That's an actual Trump tweet.
01:25:00.780 | It's my favorite one.
01:25:01.860 | - And that's kind of nice.
01:25:04.460 | - And that's love.
01:25:05.380 | - That's love.
01:25:06.220 | That's kind of nice.
01:25:07.060 | - Great affection.
01:25:08.540 | - That, I mean, exclamation point.
01:25:12.420 | Even.
01:25:14.420 | I broke Lex.
01:25:19.220 | What is love?
01:25:20.300 | - Yeah, the sparks are flying.
01:25:22.900 | But I have to kind of analyze that
01:25:25.860 | from like a literary perspective,
01:25:27.500 | but it seems like there's love in there,
01:25:29.860 | like a little bit.
01:25:30.700 | It's a little bit lighthearted.
01:25:32.940 | - 'Cause he's saying, "Even when I'm going after them,
01:25:34.900 | don't take it so seriously."
01:25:36.260 | - Yeah.
01:25:37.100 | - That's nice.
01:25:37.940 | - It is nice.
01:25:38.780 | - That's acknowledging the game of it.
01:25:40.060 | - Yes.
01:25:40.900 | - That's nice.
01:25:42.020 | He's not always nice.
01:25:43.620 | - Sometimes he's very, very vicious.
01:25:45.060 | - Yeah.
01:25:45.900 | - Very vicious.
01:25:46.740 | He's done things that I can tell you about
01:25:48.580 | that I'm like, "This is a bad person."
01:25:51.220 | - What do you think about one of the...
01:25:53.140 | Okay, listen, I'm not...
01:25:55.380 | For people listening,
01:25:56.460 | I do not have Trump derangement syndrome.
01:25:59.380 | I try to look for the good and the bad in everybody.
01:26:05.620 | One thing, perhaps it's irrational,
01:26:07.940 | but perhaps because I've been reading history,
01:26:10.400 | the one triggering thing for me is the delaying of elections.
01:26:16.540 | I believe in elections,
01:26:20.100 | and this is the part that you probably disagree with.
01:26:24.820 | But I believe in the value of people voting,
01:26:29.420 | and I just see too many dictators,
01:26:32.340 | the place where they finally, the big switch happens.
01:26:36.300 | When you question the legitimacy of elections.
01:26:39.740 | - Who's been questioning the legitimacy of elections
01:26:41.500 | for the last three years?
01:26:43.060 | - I've only heard Donald Trump do it last year,
01:26:45.620 | but the last three years you're saying somebody else?
01:26:48.500 | - You don't think, "Not my president, illegitimate.
01:26:51.220 | "We're not gonna normalize him as president.
01:26:52.860 | "Russia hacked this election.
01:26:54.940 | "Impeached, you're not a real president."
01:26:56.340 | You don't think that's questioning legitimacy of 2016?
01:26:59.100 | - No, it's a good...
01:27:00.260 | I haven't been paying attention enough,
01:27:02.060 | but I would imagine that argument has been that...
01:27:06.980 | I haven't actually heard too many people,
01:27:08.860 | but I imagine that's been a popular thing to say.
01:27:10.380 | - Oh, very much so, yeah.
01:27:12.300 | - Okay.
01:27:13.140 | But nevertheless, that's a part that didn't...
01:27:18.300 | That's not a statement that gained power enough to say
01:27:21.540 | that Barack Obama will keep being president,
01:27:27.060 | or Hillary Clinton should be president.
01:27:30.340 | - Newsweek had that article,
01:27:31.300 | how Hillary Clinton could still be president, Newsweek.
01:27:34.060 | - No, but she's not.
01:27:35.740 | That's what I'm saying.
01:27:36.580 | My worry isn't saying that the election was illegitimate,
01:27:41.580 | and people whining at mass scale,
01:27:45.620 | and then Fox News or CNN reporting for years,
01:27:49.340 | or books being written for years.
01:27:51.060 | My worry is legitimately martial law,
01:27:54.900 | a person stays president.
01:27:57.860 | - So here's the issue.
01:27:59.140 | Like there's a phase shift that happens in a dictatorship.
01:28:02.500 | - I did a book on North Korea.
01:28:04.260 | I'm not someone who thinks
01:28:05.660 | dictatorship should be taken lightly.
01:28:07.700 | I'm not someone who thinks it can't happen here.
01:28:09.900 | I think a lot of times people are desperate for dictatorship.
01:28:13.060 | So I am with you.
01:28:14.100 | And I think this is something,
01:28:15.440 | if you're gonna hand wave it away,
01:28:16.980 | everyone else hand waved it away.
01:28:18.340 | Hitler's never gonna be chancellor.
01:28:19.460 | He's a lunatic.
01:28:20.300 | - He's a joke. - He's a joke.
01:28:22.060 | They couldn't find a publisher for Mein Kampf in English,
01:28:24.860 | because this is a guy from some random minor party
01:28:28.100 | in Germany spouting nonsense,
01:28:29.460 | who's gonna read this crap?
01:28:31.180 | So I completely agree with you in that regard.
01:28:33.300 | - You don't think we're there.
01:28:34.300 | - My point is Donald Trump this year
01:28:37.860 | had every pathway open to him to declare martial law.
01:28:42.580 | The cities are being burnt down.
01:28:44.460 | He could have very easily sent in the tanks,
01:28:47.260 | and people would have been applauding him from his side.
01:28:49.500 | - You're making me feel so good right now.
01:28:50.540 | - But am I wrong though?
01:28:51.620 | - No, I--
01:28:52.460 | - What he did, he tweeted out to Mayor Wheeler of Portland.
01:28:56.020 | He said, "Call me.
01:28:58.380 | "We will solve this in minutes, but you have to call."
01:29:03.220 | And he sat on his hands, and they said,
01:29:05.180 | "Oh, it's his fault.
01:29:06.440 | "The city's burning down.
01:29:07.420 | "He's not doing anything."
01:29:08.380 | And he goes, "I'm not doing anything
01:29:10.040 | "until you ask me to do it."
01:29:12.260 | So I think that is,
01:29:14.020 | even if you think he's an aspiring dictator,
01:29:16.940 | that is at least a sign that there is some restraint
01:29:20.740 | on his aspirations.
01:29:23.420 | - Can I just take that in as a beautiful moment of hope?
01:29:28.420 | So I'm gonna remember this moment--
01:29:30.260 | - As beautiful as Ted Cruz.
01:29:31.820 | Beautiful Ted.
01:29:33.380 | - I'm gonna remember that.
01:29:34.420 | I mean, I should say that perhaps I'm irrationally,
01:29:39.420 | this is the one moment where I feel myself
01:29:41.620 | being a little unhealthy.
01:29:43.020 | - I don't think you're being irrational.
01:29:44.180 | I think there's an asymmetry,
01:29:46.360 | because it's kind of like, okay,
01:29:48.580 | either if I leave the house, it's like Russian roulette.
01:29:51.180 | Yeah, maybe it's like a one in six shot,
01:29:52.760 | I'm pulling the trigger, I'm killing myself,
01:29:54.420 | but that's one in six.
01:29:56.220 | That's not, and the consequences are so dire
01:29:59.400 | that a little paranoia would go a long way.
01:30:01.700 | - There's something that--
01:30:02.540 | - 'Cause you can't go back.
01:30:04.180 | - Yeah, you can't.
01:30:05.340 | - It's an asymmetry, yeah.
01:30:06.660 | - The thing is, the thing that makes Donald Trump new to me,
01:30:11.660 | and again, I'm a little naive in these things,
01:30:14.700 | but he surprised me in how many ways
01:30:21.460 | he just didn't play by the rules.
01:30:24.060 | And he's made me, a little ant in this ant colony,
01:30:30.240 | think like, well, do you have to play by the rules at all?
01:30:34.440 | Why are we having elections?
01:30:38.180 | Why did you say, it's coronavirus time,
01:30:41.720 | it's not healthy to have elections,
01:30:44.880 | we shouldn't be, I could, if I put my dictator hat on--
01:30:48.760 | - Nancy Pelosi said that Joe Biden shouldn't debate.
01:30:52.200 | - Yeah, did she?
01:30:53.980 | - Yes, she says she shouldn't dignify Trump with a debate.
01:30:57.000 | He's the president, he could be the worst president on Earth,
01:30:59.200 | evil, despicable monster, I'll take that as an argument.
01:31:02.120 | - So she's playing politics, but she's--
01:31:03.680 | - I don't think that's playing politics.
01:31:04.800 | I think when, there's a certain point when things get,
01:31:08.120 | when you start attacking institutions
01:31:10.240 | for the emergencies of the moment,
01:31:12.720 | and acting arbitrarily,
01:31:14.040 | that is when things are the slippery slope.
01:31:15.960 | - Yeah, so you're saying debates
01:31:17.900 | is one of the institutions,
01:31:18.920 | like that's one of the traditions to have the debates.
01:31:21.080 | - I think the debates are extremely important.
01:31:23.640 | Now, I don't think that someone's a good debater
01:31:25.440 | is gonna make a good president.
01:31:26.520 | I mean, that's a big problem.
01:31:27.800 | - But you're just saying this is attacking
01:31:29.120 | just yet another tradition, yet another--
01:31:32.320 | - You know like how if you're dating,
01:31:33.320 | if you're married to someone,
01:31:34.240 | and someone throws out the word divorce,
01:31:35.680 | you can't unring that bell, you threw it out there?
01:31:38.040 | I'm saying you don't throw things out like that
01:31:40.240 | unless you really are ready to go down this road.
01:31:43.640 | And I think that is,
01:31:44.640 | there's nothing in the Constitution about debates,
01:31:46.560 | we've only had them since 1980,
01:31:48.240 | but still, I think they are extremely important.
01:31:51.060 | It's also a great chance for Joe Biden
01:31:52.760 | to tell him to his face, you're full of crap,
01:31:55.160 | here's what you did, here's what you did,
01:31:56.240 | here's what you did.
01:31:57.080 | - So fascinating that you're both,
01:31:58.680 | you acknowledge that,
01:32:01.300 | and yet you also see the value
01:32:03.820 | of tearing down the entire thing.
01:32:06.280 | So you're both worried about no debates,
01:32:09.500 | or at least in your voice, in your tone.
01:32:11.260 | - There's a great quote by Chesterton.
01:32:13.900 | I'm not a fan of his at all,
01:32:15.520 | but he says, "Before you tear down a fence,
01:32:17.120 | "make sure you know why they put it up first."
01:32:19.280 | So I am for tearing it all down,
01:32:22.580 | but there's something called a controlled demolition,
01:32:24.280 | like Building 7, or there's--
01:32:26.440 | - Allegedly.
01:32:27.280 | (both laughing)
01:32:29.480 | - We knew we were in Tel Aviv.
01:32:30.980 | And--
01:32:32.760 | - Hashtag Building 7.
01:32:33.600 | (Lex laughing)
01:32:34.440 | We knew we were in Tel Aviv.
01:32:36.080 | Wow, you're faster than me.
01:32:37.440 | You're operating in a different level.
01:32:40.560 | I need to upgrade my operating system.
01:32:42.360 | (Lex laughing)
01:32:43.200 | - I'm in primary. - I told you Windows 95.
01:32:45.280 | (Lex laughing)
01:32:46.120 | - You're trying, yeah.
01:32:46.960 | Building 7.
01:32:49.680 | - If you're gonna, it's like Indiana Jones, right?
01:32:52.320 | If you're gonna pull something away,
01:32:54.240 | make sure you have something in place first,
01:32:55.840 | as opposed to just breaking it,
01:32:57.000 | and then just, especially in politics,
01:32:59.560 | 'cause it escalates, and when things escalate
01:33:02.060 | without any kind of response, it can go in a very bad,
01:33:05.640 | that's when Napoleon comes in.
01:33:07.060 | - So what's your prediction about the Biden-Trump debates?
01:33:14.320 | - Again, I just have this weird,
01:33:16.720 | maybe we'll return to it, maybe not,
01:33:18.960 | in this, how do we put more love into the world?
01:33:22.380 | And one of the things that worries me about the debates
01:33:25.440 | is it'll be the world's greatest troll
01:33:31.520 | against the grandpa on the porch.
01:33:36.480 | - Who crapped his pants.
01:33:37.520 | - Yeah. - Yeah.
01:33:38.640 | - And it'll not put more love into the world.
01:33:42.360 | It will create more mockery.
01:33:45.420 | - Joe Biden did a great job against Paul Ryan in 2012.
01:33:50.680 | Paul Ryan was no lightweight,
01:33:51.840 | no one thought he was a lightweight.
01:33:53.200 | Joe Biden handed Sarah Palin her ass in 2008,
01:33:56.720 | which isn't as easy to do as you think,
01:33:58.120 | 'cause she's a female, so you're gonna come off as bullying,
01:34:00.360 | that's something you have to worry about.
01:34:01.740 | So the guy isn't, I think he is in the stages
01:34:06.640 | of cognitive decline, so I think it's going to be interesting.
01:34:12.000 | I want it to be like Mike Tyson beating up a child,
01:34:17.000 | 'cause it'll be a source of amusement to me,
01:34:20.920 | but I don't know how it's gonna go.
01:34:22.280 | - Is it possible that Joe Biden
01:34:23.680 | will be the Mike Tyson in this?
01:34:24.840 | - Yes, because in his last debate with Bernie,
01:34:27.080 | he was perfectly fine.
01:34:28.540 | And again, the guy was a senator for decades,
01:34:30.400 | and I don't think anyone, if you looked at Joe Biden in 2010
01:34:35.080 | would have thought this guy is gonna be,
01:34:36.640 | have his ass handed to him in a debate,
01:34:38.000 | you wouldn't think that at all.
01:34:39.280 | So I don't know who we're going to see.
01:34:41.500 | Plus he's got a lot of room to attack Trump.
01:34:45.140 | So I'm sure he's gonna come strapped and ready,
01:34:47.540 | and he's gonna have his talking points,
01:34:49.500 | and watch Trump dance, try to tap dance around him.
01:34:52.000 | And if he's in a position,
01:34:52.880 | I don't know what the rules of the debate are,
01:34:54.280 | to actually nail him to the wall,
01:34:56.840 | it might actually, I'm sure he's gonna have
01:34:58.160 | a lot of lines too.
01:34:59.440 | The problem is Trump is the master counter puncher.
01:35:01.880 | So like when Hillary's had her line,
01:35:04.200 | she's like, "Well, it's a good thing that Donald Trump
01:35:06.800 | "isn't in charge of our legal system."
01:35:08.480 | And he's like, "Yeah, you'd be in jail."
01:35:10.320 | It's like, "Oh, lady, you set him up."
01:35:13.980 | - That's painful to watch those debates.
01:35:16.720 | I mean, there's something,
01:35:18.280 | I think it's actually analogous,
01:35:20.520 | I've come to think of it,
01:35:22.040 | your conversation with me right now.
01:35:24.640 | It's sleepy Joe, I'm playing the role of sleepy Joe.
01:35:28.920 | I actually connect to Joe because--
01:35:32.840 | - I'm also incontinent.
01:35:34.120 | (laughing)
01:35:35.200 | - There's like these weird pauses that he does.
01:35:37.560 | - Yes, he does.
01:35:38.720 | - I do the same thing and it annoys the shit out of me
01:35:42.280 | that like in mid sentence,
01:35:44.880 | I'll start saying a different thing and take a tangent.
01:35:49.320 | I'm not as slow and drunk as I sound always.
01:35:54.320 | I swear I'm more intelligent underneath it.
01:35:56.400 | - I'm slower but less drunk.
01:35:57.800 | (laughing)
01:35:59.320 | - Exactly, but the result, one of those is true,
01:36:02.280 | but not both, yeah.
01:36:06.480 | And Trump, just like you are a master counterpuncher,
01:36:09.160 | so it's getting me messy.
01:36:11.920 | - Here's the other thing in all seriousness,
01:36:13.440 | Chris Wallace is the moderator.
01:36:15.440 | Chris Wallace has interviewed Trump several times
01:36:17.340 | and he was a tough, tough questioner.
01:36:21.400 | So I don't think he's gonna come in there
01:36:23.740 | with softball questions.
01:36:25.200 | I think he's really going to try to nail Trump down,
01:36:28.280 | which is tough to do.
01:36:29.680 | - I like him a lot.
01:36:30.680 | - Yeah, and he's like, "Mr. President, sir,
01:36:32.920 | that's not accurate, blah, blah, blah."
01:36:33.960 | He's done it and Trump gets very frustrated
01:36:36.400 | 'cause he doesn't just let him say whatever he wants
01:36:38.120 | and he hits him with the follow-up.
01:36:40.200 | - I guess he's on Fox News
01:36:43.720 | and I listen to his Sunday program every once in a while.
01:36:47.520 | I don't know, he gives me hope that,
01:36:50.960 | I don't know, there's something in the voice
01:36:52.440 | like that he's not bought.
01:36:54.220 | - There's no question he's gonna take this seriously,
01:36:57.280 | which I think is the best you could hope for in a moderator.
01:37:00.800 | - It feels like there's people
01:37:02.000 | that might actually take the mainstream media
01:37:04.520 | into a place that's going to be better in the future.
01:37:08.280 | And we need people like him.
01:37:09.760 | - You mean like Robespierre?
01:37:11.200 | - What do you mean?
01:37:12.040 | - Like taking the mainstream media to a better future,
01:37:14.360 | like bring out the guillotines.
01:37:15.920 | - Okay, see, you put your anarchist hat back on.
01:37:21.080 | - I don't think Robespierre's much of an anarchist,
01:37:22.560 | but yeah, I get what you're saying.
01:37:24.240 | - You don't think there should be
01:37:25.120 | a centralized place for news?
01:37:27.000 | - There isn't now.
01:37:29.440 | - Well, that's what mainstream media
01:37:31.760 | is supposed to represent.
01:37:32.600 | - It's not central. - And it's broken.
01:37:33.480 | Well, it's not whatever, what do you call that?
01:37:36.880 | A place where people traditionally said
01:37:39.720 | was the legitimate source of truth.
01:37:44.360 | - No. - That's what the media
01:37:45.240 | was supposed to represent, no?
01:37:46.600 | - So that's their big branding accomplishment.
01:37:50.280 | It's a completely-- - That was never true?
01:37:51.880 | - Yeah, because here's what happens.
01:37:53.560 | We remember the Spanish-American War, remember the Maine,
01:37:56.640 | we have to take Cuba, yellow journalism,
01:37:59.280 | William Randolph Hearst, right?
01:38:00.880 | Then record scratch, and then we're all objective.
01:38:04.240 | Like when did this transition happen, according to people?
01:38:06.520 | When you were saying that the Kaiser
01:38:08.200 | is the worst human being on earth?
01:38:10.320 | When you were downplaying Stalin
01:38:12.320 | and downplaying Hitler's atrocities?
01:38:13.960 | When you were saying we had to be in Vietnam?
01:38:16.320 | At what point, WMDs, when did it change?
01:38:19.120 | It never changed.
01:38:19.960 | You just are better con artists at a certain point,
01:38:22.400 | and now the mask is dropping.
01:38:23.680 | - Yeah, but don't you think there's,
01:38:26.560 | at its best, investigative journalism
01:38:29.960 | can uncover truth in a way that Reddit,
01:38:33.960 | subreddits can't?
01:38:39.120 | - You know, Reddit, sure, I agree.
01:38:41.600 | At its best, absolutely.
01:38:42.520 | That's not even a dispute.
01:38:43.880 | - But don't you think fake it until you make it
01:38:48.760 | is the right way to do it?
01:38:49.920 | Meaning like-- - Fake the news?
01:38:52.360 | - No, no, no.
01:38:53.440 | I meant the news saying like,
01:38:55.760 | we dream of doing, of arriving at the truth
01:38:59.960 | and reporting the truth.
01:39:01.240 | - They don't say that.
01:39:02.080 | CNN had an advertisement that said,
01:39:03.760 | "This is an apple.
01:39:04.600 | "We only report facts.
01:39:05.960 | "That's a lie."
01:39:07.120 | - No, that's now.
01:39:08.400 | And now it's clear things have changed.
01:39:10.920 | - They haven't changed.
01:39:11.800 | You're just more, you're more aware of it.
01:39:13.840 | - Aware of it. - Chicanery.
01:39:15.960 | - But, okay, so the--
01:39:17.840 | - How many people died in Iraq?
01:39:19.320 | 'Cause Saddam Hussein was about to launch WMDs.
01:39:22.800 | Who had consequences for this?
01:39:24.040 | No one.
01:39:25.440 | This isn't a minor thing.
01:39:26.520 | This is lots of dead people.
01:39:28.800 | - Yeah.
01:39:30.080 | And also, I mean, dead people,
01:39:32.800 | it's horrible, but also the money,
01:39:35.280 | which has, like we said, economic effects.
01:39:37.720 | - Marianne Williamson, I think it was,
01:39:39.280 | had the, or Trump, both of them,
01:39:40.720 | had the great point that goes,
01:39:41.800 | "That's like a trillion dollars.
01:39:42.640 | "How many schools would that build?
01:39:43.600 | "How many roads would that build?
01:39:45.200 | "Why are we building hospitals in Iraq
01:39:46.840 | "that we destroyed when we could build hospitals here?"
01:39:49.040 | It's horrifying.
01:39:51.760 | - So who's responsible for that?
01:39:53.200 | Like who--
01:39:54.720 | - Alex Jones.
01:39:55.640 | - No, I meant for, well,
01:39:58.360 | so who's responsible for arriving at the truth of that,
01:40:04.360 | of speaking to the money spent on the wars in Iraq?
01:40:09.560 | - This is one of the great things about social media.
01:40:11.800 | - Twitter, you have faith in Twitter.
01:40:13.240 | - Not specifically Twitter,
01:40:14.480 | but yeah, social media is the whole,
01:40:15.440 | what anyone can, here's another great example.
01:40:18.240 | Before, if you were talking about police brutality
01:40:22.160 | or these riots, you would have to perceive it
01:40:24.880 | in the way it was framed and presented to you.
01:40:26.800 | Nicholas Sandman is another example.
01:40:29.120 | Breonna Taylor, all these things.
01:40:30.280 | Well, we don't have footage of her.
01:40:32.160 | You would have to perceive in the way
01:40:33.480 | that it's edited and presented to you by the corporate press.
01:40:35.920 | Now everyone has a video camera,
01:40:38.320 | everyone has their perspective,
01:40:39.920 | and it's very useful when these incidents happen
01:40:42.840 | where you could see the same incident from several angles
01:40:45.360 | and you don't need Don Lemon or Chris Wallace
01:40:47.960 | to tell me what this means.
01:40:49.000 | I can see with my own eyes.
01:40:50.840 | - Yeah, I've been very pleasantly surprised about the power.
01:40:55.520 | See, like people, the mob, again, gets in the way.
01:40:58.240 | They get emotional and they destroy
01:41:01.080 | the ability for people to reason,
01:41:05.000 | but you're right that truth is unobstructed on social media.
01:41:10.000 | If you're careful and patient, you can see the truth.
01:41:14.480 | Like for example, data on COVID,
01:41:16.600 | some of the best sources are doctors.
01:41:19.920 | Like if you wanna know the truth about the coronavirus
01:41:22.480 | and what's happening, just follow people on Twitter.
01:41:27.400 | There's certain people that are just like
01:41:28.720 | sources for me versus the CDC and the WHO.
01:41:31.680 | That's fast.
01:41:33.680 | I mean, well, it's kind of anarchy, right?
01:41:35.960 | - Yes, it is.
01:41:36.880 | It's not kind of, it is anarchy, yes.
01:41:38.920 | - I mean, well, there's some censorship
01:41:41.720 | and all that kind of stuff.
01:41:42.680 | - You have censorship under anarchy
01:41:43.920 | in the sense that you're talking about.
01:41:45.000 | Like people get kicked off of Twitter.
01:41:47.520 | That's drawing boundaries.
01:41:48.360 | - How do you kick somebody?
01:41:49.200 | Okay, so I mean, it's a--
01:41:50.960 | - Private company.
01:41:52.400 | - Private company.
01:41:53.440 | Most people wouldn't say Twitter is working,
01:41:56.720 | but that's probably because they take for granted
01:41:59.480 | how well it's working and they're just complaining
01:42:01.360 | about the small part of it that's broken.
01:42:03.960 | Yeah.
01:42:04.800 | Okay, another question about--
01:42:09.360 | - You feel better?
01:42:11.360 | - No, by the way, I mean, I had a personal gripe
01:42:15.120 | with the situation about the,
01:42:17.600 | not a personal gripe, but I felt overly emotional
01:42:22.920 | about the possibility that there will be some
01:42:27.400 | of Donald Trump messing with the election process,
01:42:31.820 | but you made me feel better.
01:42:33.040 | - Good.
01:42:33.880 | - Like saying like, if he had a bunch of opportunities
01:42:36.640 | to do what, like to do what I would have done
01:42:41.320 | if I was a dictator, I would,
01:42:43.520 | the first time those riots over George Floyd,
01:42:47.160 | I would have instituted martial law.
01:42:49.960 | - Do you know what I remember very vividly?
01:42:52.160 | Is after 9/11 and everyone was waiting for George Bush
01:42:55.240 | to give his speech and he had a 98% approval rating.
01:42:58.000 | And I remember very vividly, 'cause if he had said,
01:43:00.680 | we're suspending the constitution,
01:43:03.080 | everyone would have cheered for him.
01:43:04.200 | Like he couldn't get enough support at that time.
01:43:06.480 | And he didn't do it.
01:43:07.840 | And I can't say anything really good about George W. Bush.
01:43:10.760 | I'm not a fan of his to say the least.
01:43:12.480 | So I think you and I, and other people who are familiar
01:43:15.880 | with totalitarian regimes to some extent from our ancestry
01:43:19.520 | or whatever, from research,
01:43:21.320 | should always be the ones freaking out and warning,
01:43:24.680 | but we should also be aware of,
01:43:27.760 | we got a ways to go before it's Hitler.
01:43:30.600 | And thankfully, there are a lot of dominoes
01:43:34.400 | that have to fall into place before Hitler,
01:43:36.360 | it's like the game Secret Hitler, it's a board game,
01:43:38.400 | before Hitler becomes Hitler.
01:43:40.120 | Like it's not, especially in America,
01:43:43.080 | there's lots of things that have to happen
01:43:45.600 | before you really get to that point.
01:43:47.600 | I mean, FDR was for all intents and purposes a dictator,
01:43:50.480 | but even then the worst you could say,
01:43:52.400 | and this is not something that you should take lightly,
01:43:54.600 | was internment of Japanese citizens,
01:43:56.560 | but they weren't murdered.
01:43:58.040 | They weren't under lock and key in the sense of like in cells.
01:44:01.800 | So things could have gotten a lot worse under him.
01:44:04.120 | - We have to, I mean, Hitler is such a horrible person
01:44:06.600 | to bring up because-- - Use Mussolini.
01:44:08.480 | - Yeah, Mussolini is better because Hitler
01:44:11.760 | is so closely connected to the atrocities of the Holocaust.
01:44:15.600 | There's all this stuff that led up to the war
01:44:17.400 | and the war itself, say that there was no Holocaust,
01:44:20.760 | Hitler would probably be viewed differently.
01:44:23.440 | - Yes, I should think so.
01:44:24.680 | - Well, I mean, but--
01:44:26.920 | - You think, that's a very controversial stance.
01:44:29.000 | You think Hitler would be viewed differently
01:44:30.200 | if it wasn't for the Holocaust?
01:44:31.600 | - Well, I mean, but it's a funny thing that the,
01:44:36.600 | I would say the death of how many, 40, 50 million,
01:44:43.840 | I mean, I don't know how you calculate it,
01:44:46.880 | is not seen as bad as the 6 million.
01:44:50.920 | - Oh yeah, 'cause of Mao and Stalin.
01:44:52.800 | - Yeah, but it's interesting.
01:44:56.240 | - I'm working on it.
01:44:57.320 | - You're working on it.
01:44:58.160 | - Yeah, the next book I'm talking about.
01:44:59.000 | - Reminding, well, it's good.
01:45:01.040 | I'm glad a good writer is,
01:45:02.440 | 'cause the world's not reminded.
01:45:03.760 | - My last book, "The New Right,"
01:45:05.080 | I had to deal with some of the Nazis.
01:45:06.400 | And one of the points they make is,
01:45:07.560 | how come everyone knows about the Holocaust,
01:45:09.000 | but no one knows about the Holodomor?
01:45:10.640 | And they're right, we should know about this,
01:45:12.680 | because it is a great example of both
01:45:14.640 | how the Western media were depraved,
01:45:17.360 | but also what human beings are capable of.
01:45:19.680 | And those scars are still,
01:45:22.320 | many Americans think Russia and Ukraine are the same thing.
01:45:25.560 | Oh, Trump's in bed with the Ukrainians,
01:45:27.120 | Trump's about the Russians, they think it's the same thing.
01:45:29.440 | For us, it's complete lunacy.
01:45:31.720 | But this is the kind of thing where Pol Pot
01:45:33.760 | is another example, where people have no clue
01:45:37.320 | of what has been done to their fellow man
01:45:39.160 | on the face of this earth, and they should know.
01:45:41.440 | - How much of that do you lay at the hands of communism?
01:45:43.920 | How much are you with like a Jordan Peterson,
01:45:46.480 | who is intricately connecting the atrocities,
01:45:51.480 | like you're saying, 1930s Ukraine, where people were starved?
01:45:55.200 | I recently, my grandmother recently passed away,
01:45:58.400 | and she survived that as a kid, which is,
01:46:02.760 | those people, I mean, just, they're tough.
01:46:08.080 | They're tough, like that whole region is tough,
01:46:12.000 | 'cause they survived that, and then right after,
01:46:15.120 | occupation of Nazis, of Germans.
01:46:18.720 | How much do you lay that at communism as an ideology
01:46:24.480 | versus Stalin, the man?
01:46:27.960 | - I think, Lenin was building concentration camps
01:46:31.080 | while he was around, and slave labor.
01:46:33.200 | I don't, I think it's clearly both.
01:46:36.840 | There are certain variants of communism
01:46:38.600 | that were far, like Khrushchev and Gorbachev.
01:46:41.760 | The reason the Soviet Union fell apart,
01:46:44.640 | and this is kind of, I'm gonna spoil the end of the book.
01:46:47.000 | There's an amazing book called "Revolution 1989."
01:46:48.920 | It's one of the best books I've ever read,
01:46:50.640 | by Victor Sebestyen, he's a Hungarian author.
01:46:53.320 | And basically what happens in 1989,
01:46:55.360 | Poland has their elections, and then in 1990,
01:46:57.800 | they kind of let in the labor people into the government.
01:47:00.360 | And people start crossing borders in the Eastern Bloc.
01:47:04.480 | And you had Honecker from East Germany
01:47:06.280 | and Ceausescu from Romania calling Gorbachev,
01:47:09.360 | because those are the two toughest ones
01:47:11.240 | by communist standards.
01:47:12.280 | They go, they're just escaping.
01:47:14.520 | We're gonna lose everything.
01:47:16.280 | You gotta send in the tanks like you did in Hungary,
01:47:18.040 | like you did in Czechoslovakia in '68.
01:47:20.560 | And Gorbachev goes, "I'm not sending the tanks."
01:47:22.280 | And they go, "Dude, if you don't send in the tanks,
01:47:24.880 | "it's all done."
01:47:25.760 | And he goes, "Nope, I'm not that kind of guy."
01:47:28.360 | And they were right.
01:47:29.200 | I mean, Ceausescu was personally shot
01:47:31.920 | with his wife up against the wall.
01:47:33.520 | Honecker, I forget what happened to him.
01:47:35.240 | But they all self-liberated.
01:47:37.480 | My friend who was born in Czechoslovakia,
01:47:39.840 | his mom was pregnant under communism,
01:47:42.240 | and she never even imagined he'd be free,
01:47:43.840 | and he was born under free.
01:47:45.560 | And they were all looking around all these countries
01:47:47.560 | that self-liberated, 'cause they're like,
01:47:49.440 | this is a trick, right?
01:47:50.280 | They're trying to figure out who's not good
01:47:52.520 | so that they can arrest us en masse, and they didn't.
01:47:54.840 | So even within communism,
01:47:58.880 | there are bad guys and better guys.
01:48:02.480 | - But we talked about anarchy, we talked about democracy.
01:48:06.120 | Do you see, like, there's democratic socialism
01:48:09.240 | conversations going on in the popular culture?
01:48:13.400 | Socialism is seen as like evil, or for some people, great.
01:48:18.400 | - Sure.
01:48:19.440 | - What are your thoughts about it
01:48:22.560 | as in a political ideology?
01:48:24.160 | - Evil.
01:48:25.360 | - So you're on the evil side, fundamentally.
01:48:27.680 | - Yes.
01:48:28.520 | - What is it?
01:48:30.600 | You know, what makes it evil?
01:48:34.880 | What's like structurally, if you were to try to analyze?
01:48:38.040 | - Sure, I say three ways.
01:48:39.960 | Morally, no person has the right to tell another person
01:48:42.560 | how to live their life.
01:48:43.720 | Economically, it's not possible
01:48:46.520 | to make calculations under socialism.
01:48:48.440 | It's only prices that are information that tells me,
01:48:51.760 | oh, we need to produce more of this,
01:48:53.400 | we need to produce less of this.
01:48:54.880 | Without prices being able to adjust
01:48:56.520 | and give information to producers and consumers,
01:48:59.960 | you have no way of being able to produce
01:49:02.040 | effectively or efficiently.
01:49:03.520 | And also, it is, it turns people against each other.
01:49:08.120 | When you force people to interact,
01:49:09.920 | when you force them into relationships,
01:49:11.320 | when you force them into jobs,
01:49:13.040 | and you don't give them any choice when there's a monopoly,
01:49:15.840 | the consequence of monopoly,
01:49:17.160 | everyone's familiar with ostensibly under capitalism,
01:49:19.760 | but somehow when it's a government monopoly,
01:49:21.520 | all those economic principles don't work,
01:49:23.040 | it doesn't make any sense.
01:49:24.320 | - But there's force in democracy too.
01:49:26.200 | It's just you're saying there's a bit more force
01:49:29.080 | in socialism. - Yeah.
01:49:32.280 | - But that's interesting that you say
01:49:33.600 | that there's not enough information.
01:49:34.920 | I mean, that's ultimately,
01:49:36.720 | you need to have really good data
01:49:40.400 | to achieve the goals of the system,
01:49:42.240 | even if there's no corruption.
01:49:45.000 | - Right.
01:49:45.840 | - You just need to have the information.
01:49:46.960 | - Right, which you can't.
01:49:48.240 | And capitalism provides you really strong--
01:49:53.240 | - Real time. - Real time information.
01:49:56.000 | And if capitalism at its best and cleanest,
01:50:02.680 | which is like perfect information is available,
01:50:05.600 | there's no manipulation of information.
01:50:07.640 | That's one of the problems.
01:50:10.080 | Okay.
01:50:10.920 | Can we talk about some candidates,
01:50:13.800 | the ones we got and possible alternatives?
01:50:17.240 | So one question I have is,
01:50:19.680 | why do we have within this system,
01:50:23.480 | why do we have the candidates we have?
01:50:25.480 | It seems, maybe you can correct me,
01:50:31.040 | highly unsatisfactory.
01:50:33.200 | Is anyone actually excited about our current candidates?
01:50:39.600 | - I'm kind of excited because no matter who wins,
01:50:42.840 | the elections can be hilarious.
01:50:44.480 | So that is something that I'm excited about.
01:50:46.840 | - From a humor perspective.
01:50:47.840 | - Yeah.
01:50:48.680 | - Is that what the whole system is about?
01:50:50.080 | So that's one theory of the case,
01:50:52.160 | is the entire thing is optimized for viewership.
01:50:56.000 | - Yeah.
01:50:56.840 | - And excitement by definitions
01:50:59.280 | of like the reality show kind of excitement.
01:51:01.560 | - I think it is,
01:51:04.360 | if you look at what happened with Brett Kavanaugh,
01:51:06.840 | this is not a career that would draw people who are,
01:51:11.840 | you might say quality.
01:51:14.640 | Because no matter who they are,
01:51:16.000 | there would be a huge incentive from the other team
01:51:18.840 | to denigrate them and humiliate them
01:51:21.000 | in the worst possible ways.
01:51:22.000 | Because as the two teams lose their legitimacy among GenPOP,
01:51:26.760 | it's gonna get harder and harder for them
01:51:28.160 | to maintain any kind of claims to authority,
01:51:30.680 | which is something I like,
01:51:32.120 | but which does kind of play out in certain nefarious ways.
01:51:36.120 | - So people, the best of the best
01:51:38.000 | are not gonna wanna be politicians.
01:51:40.120 | - Yeah, because I could have a job,
01:51:42.560 | where I have a job interview
01:51:43.440 | and I'm running Yahoo or whatever,
01:51:45.720 | or I could for 18 months have to eat corn dogs
01:51:49.880 | looking like I'm going down on someone and shake hands
01:51:52.080 | and have all this, my family,
01:51:54.760 | and on social media daily called the worst things for what?
01:51:58.880 | And then I'm still not guaranteed the position?
01:52:02.000 | - But the flip side of that,
01:52:04.080 | like from my perspective is the competition is weak.
01:52:07.160 | Meaning like you need a minimum amount of eloquence clearly
01:52:14.680 | that I don't, the bar which I did not pass.
01:52:17.880 | - I don't think either of them
01:52:18.880 | would be considered particularly eloquent, Biden or Trump.
01:52:21.280 | - No, I know, but that's what I'm saying.
01:52:22.800 | The competition, like if you were,
01:52:26.520 | wanted to become a politician,
01:52:28.240 | if you wanted to run for president,
01:52:30.280 | the opportunity is there, like if you were at all competent.
01:52:33.800 | Like if you had, so like Andrew Yang is an example
01:52:36.320 | of somebody who has a bunch of ideas,
01:52:37.840 | is somewhat eloquent, like young, energetic.
01:52:43.760 | It feels like there should be thousands of Andrew Yangs
01:52:47.160 | like that would enter the domain.
01:52:48.960 | - And he went nowhere.
01:52:50.400 | - Well, I wouldn't say he went nowhere.
01:52:54.200 | He generated quite a bit of excitement.
01:52:55.760 | He just didn't go very far.
01:52:57.480 | That's, okay.
01:52:59.820 | - You don't have to run for president
01:53:01.040 | to generate excitement with your ideas.
01:53:02.360 | You could be a podcast host.
01:53:03.440 | I'm not even joking.
01:53:04.280 | - That's right, that's right.
01:53:05.640 | That's right.
01:53:07.360 | And he's both, Andrew Yang.
01:53:09.880 | - Oh, he's a podcast?
01:53:10.840 | - Yeah, he has a podcast called Yang Speaks.
01:53:13.560 | - Oh, okay, cool.
01:53:14.620 | (laughing)
01:53:18.360 | - Oh, wow, the music of the way you said, yeah, cool,
01:53:24.540 | is the way my mom talks to me when I tell her
01:53:26.440 | there's something exciting going on in my life.
01:53:28.800 | Oh, that's nice, honey.
01:53:32.840 | - Oh, you made a robot, that's cool.
01:53:34.640 | (laughing)
01:53:35.960 | It'll make coffee.
01:53:36.800 | - Oh, you're still single though, aren't you?
01:53:38.760 | (laughing)
01:53:40.720 | Huh, I wonder why, I wonder why.
01:53:43.160 | Make yourself a robot wife?
01:53:44.800 | Give me some robot grandchildren?
01:53:46.440 | - Okay.
01:53:50.040 | But first of all, okay, let me ask you about Andrew Yang
01:53:55.120 | 'cause he represents fresh energy.
01:53:58.960 | You don't find him fresh or energetic?
01:54:02.200 | Like, is there any candidate you wish was in the mix
01:54:07.200 | that was in the mix you wish was
01:54:09.360 | one of the last two remaining?
01:54:11.280 | - Yeah, people like Marianne Williamson I thought was great.
01:54:14.880 | Tulsi I thought was great.
01:54:16.560 | Amy Klobuchar got a bad rap.
01:54:18.840 | I think she held her own.
01:54:21.280 | Smart, she wasn't particularly funny, that's okay.
01:54:24.120 | I think she was non-threatening to a lot of people.
01:54:26.280 | - What did you like about them?
01:54:28.200 | - I guess I named all women, that's interesting.
01:54:29.680 | It wasn't even intentional.
01:54:31.560 | Tulsi I like that she was aggressive, has a good resume,
01:54:34.600 | and is not staying the course for the establishment.
01:54:39.800 | Marianne Williamson I like 'cause she comes from a place
01:54:42.760 | from what it seems of genuine compassion.
01:54:45.040 | Maybe she's a sociopath, I don't know.
01:54:47.200 | I read her book and it actually affected me profoundly
01:54:50.640 | 'cause it's very rare when you read a book
01:54:53.320 | and there's even that one idea that blows your mind
01:54:55.520 | and that you kind of think about all the time.
01:54:56.880 | And there was one of that such idea in her book
01:54:59.040 | about she was teaching something called
01:55:01.280 | A Course in Miracles in Hollywood.
01:55:02.600 | I think she still teaches it.
01:55:04.160 | And this was during the '80s, the height of the AIDS crisis.
01:55:07.600 | And all these young men in the prime of their life
01:55:09.880 | were dropping like flies.
01:55:11.800 | And she's trying to give them hope.
01:55:13.240 | Well, good luck, they're dying, no one cares.
01:55:16.280 | And they're like, you can't tell us
01:55:17.800 | that they're gonna cure this.
01:55:20.120 | That's a lie.
01:55:21.520 | And she goes, what if I told you they're not gonna cure it?
01:55:25.460 | What if I told you it's gonna be like diabetes?
01:55:28.240 | They cut off your foot and you're gonna go blind.
01:55:30.600 | Would that be something that you can hope for?
01:55:33.240 | And when you put it like that, it's like, yeah.
01:55:34.960 | Like if you're talking to someone who's like
01:55:35.800 | a homeless junkie and you're like,
01:55:37.140 | you could be a doctor or a lawyer,
01:55:38.680 | like cool story.
01:55:39.880 | Like you could have a studio apartment
01:55:43.440 | with a terrible roommate and a shitty job.
01:55:45.920 | But when you're on the street, you know,
01:55:48.440 | cooking breakfast in a teaspoon,
01:55:50.680 | and you hear that, you're like, wait,
01:55:52.240 | would that really be so bad?
01:55:53.360 | Is that really so much worse than this?
01:55:54.840 | No, and it becomes something.
01:55:56.280 | So when she put it in those terms, I'm like, wow,
01:55:59.080 | this woman that really did a number on me
01:56:01.720 | in terms of teaching people how to be hopeful.
01:56:04.800 | - Small steps.
01:56:05.940 | - But it's also, then it becomes less of,
01:56:08.180 | I need a miracle to be like, oh, this is really manageable.
01:56:11.620 | And it's absurd to think it's impossible.
01:56:15.060 | - What about, what's your take on Unity 2020
01:56:17.980 | that Bret Weinstein pushed forward?
01:56:22.980 | - It was DOA.
01:56:24.060 | He couldn't even stand up to Twitter.
01:56:26.340 | - Dead on arrival.
01:56:27.180 | - Dead on arrival.
01:56:28.000 | He couldn't even stand up to Twitter,
01:56:29.060 | let alone to Facebook.
01:56:30.380 | They got blocked.
01:56:31.220 | - Was it not hugely problematic, by the way,
01:56:33.660 | that Twitter would block that?
01:56:35.640 | - Not at all.
01:56:36.580 | I don't know why they blocked it,
01:56:39.000 | but I don't know what problematic means.
01:56:40.440 | That's a word that does a lot of work
01:56:42.560 | that people want it to do conceptually.
01:56:45.620 | The idea that like unity is like taking the rejects
01:56:48.640 | from each party and we're gonna like have something
01:56:51.160 | that no one likes and therefore it's gonna be a compromise
01:56:53.160 | is absurd.
01:56:54.480 | The last time we had this kind of unity ticket
01:56:56.720 | was the Civil War, when you had Andrew Johnson
01:56:59.440 | from the Democrats and Lincoln from the Republicans.
01:57:01.640 | This was not something that ended well,
01:57:03.700 | particularly nicely for both halves of the country.
01:57:05.940 | - So that's the way you see it is,
01:57:08.300 | like the way I saw it, I guess I haven't looked carefully
01:57:11.260 | at it.
01:57:12.100 | - I haven't either, to be fair.
01:57:13.580 | - The way I saw it is emphasizing centrists, which is--
01:57:17.140 | - How is Tulsi a centrist?
01:57:19.180 | - Tulsi was involved?
01:57:20.060 | - Yes, he's trying to push Tulsi
01:57:21.460 | and like Jesse Ventura or something.
01:57:23.260 | - Oh, so, okay, I don't know.
01:57:26.340 | I don't know the specifics.
01:57:27.180 | - As a scientist, you also know centrism
01:57:28.900 | is not a coherent term in politics.
01:57:30.580 | But see, now you're like, what is it?
01:57:34.280 | Appealing to authority and my ego.
01:57:37.600 | - No, no, I'm appealing to how you approach data.
01:57:40.240 | If someone is saying the mean is accurate,
01:57:42.680 | that only means, I mean, the mean could be anywhere.
01:57:45.020 | It's a function of what's around it.
01:57:46.200 | It doesn't mean it's true.
01:57:47.080 | - I don't even know what centrist is supposed to mean,
01:57:49.640 | but what it means to me, there's not really a pure centrist,
01:57:53.280 | there's more of a center-right or center-left.
01:57:58.620 | To me, what that means is somebody who is a liberal
01:58:02.680 | or a conservative, but is open-minded
01:58:07.680 | and empathetic to the other side.
01:58:13.200 | - Joe Biden had the crime bill.
01:58:15.600 | Joe Biden voted for Republican Supreme Court justices.
01:58:18.360 | Joe Biden voted for a balanced budget.
01:58:19.960 | Joe Biden voted for Bush's war.
01:58:21.960 | And I'm sure, though I haven't looked this up,
01:58:23.120 | the Patriot Act.
01:58:24.340 | If you want a centrist, you have Joe Biden.
01:58:26.040 | - Yeah, okay.
01:58:26.920 | - He's worked very well with the Republican--
01:58:28.440 | - That argument could be made.
01:58:29.380 | Of course, everybody will always resist that argument.
01:58:33.460 | - It's undeniable.
01:58:34.300 | In fact, during the campaign,
01:58:35.940 | some activists started yelling at him at a town hall.
01:58:40.860 | Not yelling, just saying, "Hey, we need open borders."
01:58:43.580 | Joe Biden says, "I'm not for open borders.
01:58:45.500 | Go vote for Trump," and literally turned his back on the man.
01:58:48.820 | And this is during the primaries,
01:58:50.180 | where it would behoove you to try to appeal to the base.
01:58:53.940 | - And of course, you can probably also make the argument
01:58:55.860 | that Donald Trump is center-right, if not center-left.
01:58:59.240 | - Well, I mean, he's very unique as a personality.
01:59:04.240 | But if you look at his record,
01:59:05.480 | and first of all, his rhetoric,
01:59:06.680 | you can say is not centrist at all.
01:59:08.480 | But in terms of how he governs,
01:59:10.480 | the budgeting has been very moderate.
01:59:13.480 | It certainly hasn't been like draconian budget cuts.
01:59:16.080 | The Supreme Court, you could say, okay, he's hard right.
01:59:18.320 | Immigration, you could say in certain capacities,
01:59:20.280 | he's hard right.
01:59:21.160 | But in terms of pro-life, what has he done there?
01:59:24.520 | So in many other aspects,
01:59:27.420 | he's been very much this kind of me-too Republican.
01:59:30.700 | But certainly with the rhetoric,
01:59:31.580 | it's very hard to make the case that he's a centrist.
01:59:33.740 | - So you don't like,
01:59:35.140 | is there any other idea you find compelling?
01:59:37.820 | What I like about Unity 2020 is it's an idea
01:59:41.700 | for a different way, for a different party,
01:59:46.260 | a different path forward.
01:59:47.700 | So ideas, just like anarchy, is an interesting idea
01:59:51.340 | that leads to discourse, that leads to--
01:59:53.580 | - I don't think it's interesting at all.
01:59:54.640 | And here's why I don't think it's interesting.
01:59:56.600 | Sweden has eight parties in its parliament.
01:59:59.960 | Iceland's population is like 150,000.
02:00:02.040 | They've got nine, I think it was.
02:00:03.320 | Czech Republic has nine, Britain has five.
02:00:06.120 | So the claim that two parties is censorious of speech,
02:00:11.120 | but three, oh, now all of a sudden it's fine, makes no sense.
02:00:15.280 | It doesn't comport with the data, number one.
02:00:16.640 | Number two is Donald Trump demonstrated
02:00:18.820 | that you can be basically a third-party candidate,
02:00:21.000 | seize the machinery of an existing party,
02:00:23.860 | and appropriate it to your own ends.
02:00:25.380 | As Bernie Sanders almost did.
02:00:27.020 | Bernie Sanders has never been a Democrat.
02:00:29.300 | Major credit to him; it's not easy to be elected
02:00:31.980 | as senator as an independent.
02:00:33.000 | He's done it repeatedly.
02:00:34.140 | So these are two examples of ossified elites
02:00:37.400 | ripe for the picking.
02:00:38.240 | So to have a third party makes no real sense.
02:00:42.860 | - Speaking of which, a party you talk about quite a bit.
02:00:48.740 | And this is a personal challenge to you.
02:00:53.120 | Let me bring up the Libertarian Party.
02:00:55.480 | And the personal challenge is to go five minutes
02:00:58.800 | without mocking them in discussing this idea.
02:01:03.120 | So first of all--
02:01:04.280 | - I'm being trolled.
02:01:07.160 | Okay, I'm being trolled.
02:01:09.520 | Okay, I'm being trolled.
02:01:10.720 | I'm being trolled, okay, this is good.
02:01:13.320 | Do you remember Friends?
02:01:14.720 | There was an episode where Chandler
02:01:16.000 | had to not make fun of people.
02:01:17.560 | Like can you go one day Chandler?
02:01:19.220 | And Phoebe starts telling him about this UFO she saw.
02:01:22.820 | And he's like that's very interesting and nice for you.
02:01:26.480 | - This is exactly that.
02:01:29.300 | So a true master would be able to play the game
02:01:31.960 | within the constraints.
02:01:33.060 | No, I'm pretty sure you'll still mock them.
02:01:36.340 | - No, no, I'll stick to the rules.
02:01:38.500 | Five minutes, easy.
02:01:39.620 | - So first of all, speaking broadly about libertarianism,
02:01:42.700 | can you speak to that, how you feel about it?
02:01:44.860 | And then also to the Libertarian Party
02:01:46.980 | which is the implementation of it in our current system.
02:01:49.700 | - So I think libertarianism is a great idea.
02:01:53.740 | And I think there's many libertarian ideas
02:01:56.060 | that have become much more mainstream
02:01:58.140 | which I'm very, very happy about.
02:01:59.640 | I remember there was an article in either New York
02:02:02.300 | magazine or The New Yorker in the early '90s
02:02:05.180 | where they talked about the Cato Institute
02:02:06.740 | which is a libertarian think tank.
02:02:08.500 | And they refer to the fact that Cato was against war
02:02:12.020 | and against like regulation with a wacky consistency
02:02:16.220 | 'cause they didn't know how to reconcile these two things.
02:02:18.580 | I don't remember what the two things were
02:02:19.540 | but I remember that expression, wacky consistency.
02:02:22.140 | And we were all taught,
02:02:25.220 | and this is very much before the internet,
02:02:27.200 | that there's two tribes and if you're pro-life,
02:02:30.780 | you have to hate gays.
02:02:32.500 | And if you're for socialized medicine,
02:02:34.980 | that also means you have to be for free speech.
02:02:39.980 | And like there's a whole menu
02:02:42.100 | and you got to sign into all of them.
02:02:43.460 | And that menu is terrible.
02:02:45.740 | They hate America, they want to destroy it.
02:02:47.500 | Oh my God, those are horrible.
02:02:48.940 | This is the menu you want.
02:02:50.300 | And the libertarian party to some extent
02:02:52.620 | and just libertarians as a whole said,
02:02:54.300 | you know, you can do the Chinese buffet
02:02:57.100 | and take a little from column A, a little from column B
02:02:59.500 | and have an ideology that is coherent and consistent
02:03:03.860 | and ideology of peace and non-aggression
02:03:07.620 | and things like that.
02:03:08.780 | The libertarian party takes its model
02:03:12.620 | from like the early progressive and populist parties
02:03:15.260 | from the early 20th century,
02:03:16.700 | which were not very effective
02:03:18.420 | in terms of getting people elected,
02:03:20.860 | but were extremely effective
02:03:23.260 | in terms of getting the two major parties to appropriate
02:03:26.100 | and adopt their ideas and implement them.
02:03:27.980 | And in Britain as well, the Liberal Party got destroyed
02:03:30.460 | and was supplanted by Labour
02:03:32.180 | as the alternative party to the Tories,
02:03:34.860 | and its ideas basically became mainstream.
02:03:38.660 | So I think that, and the Libertarian Party,
02:03:40.460 | my friend Eric, who passed away, I miss him dearly,
02:03:43.540 | was their webmaster.
02:03:44.580 | And his whole point is,
02:03:46.020 | if you don't think of it in terms of a party,
02:03:47.660 | in terms of getting people elected,
02:03:49.100 | but if you think of it as a party
02:03:50.300 | in terms of getting people educated about alternatives,
02:03:53.900 | then there's enormous use for that.
02:03:55.820 | That was his perspective.
02:03:56.740 | And I don't think that's an absurd perspective,
02:03:58.740 | but here's some libertarian ideas
02:04:00.180 | that have become extremely mainstream.
02:04:02.380 | War should be a last resort.
02:04:04.700 | This is something we were taught as kids and we all say,
02:04:07.100 | but for many years, it's been like,
02:04:09.260 | they don't think of it as a last resort.
02:04:10.420 | It's like, something's bad.
02:04:11.260 | Well, it's like the first instinct.
02:04:12.940 | Now it's like, let's really give it a week, just a week.
02:04:15.460 | Like what's going on in Syria.
02:04:16.780 | Is there really gonna be a genocide of the Kurds,
02:04:18.860 | you know, things like that.
02:04:19.700 | So that's one.
02:04:20.520 | Another thing is drug legalization.
02:04:22.860 | This was, you know, when you and I were kids,
02:04:24.980 | oh, it's crazy.
02:04:25.820 | It's only hippies who wanna smoke pot.
02:04:27.260 | Now it's like, I was on a grand jury
02:04:30.020 | and the point people make is,
02:04:32.460 | are you sure that this 16 year old who's selling weed,
02:04:35.980 | let's say selling, should his life be ruined?
02:04:39.660 | Should he be imprisoned with rapists and murderers?
02:04:41.960 | Like if you say yes, say yes,
02:04:44.860 | but you have to acknowledge that that's what you mean.
02:04:48.940 | And then a lot of people are like, wait a minute,
02:04:50.780 | there's gotta be a third option.
02:04:52.100 | Then he has no consequences
02:04:53.940 | or he's in prison with a rapist.
02:04:55.300 | Like I'm not comfortable with either of these.
02:04:57.680 | And I think the other one is an increasing skepticism
02:05:01.220 | of the police. Libertarians were on top of this first,
02:05:02.820 | along with the hard left.
02:05:05.300 | As of now, asset forfeiture steals more from people
02:05:08.380 | than burglaries do, but people don't know
02:05:09.740 | what asset forfeiture is.
02:05:11.340 | If the cops come to your house and they suspect you,
02:05:13.860 | you haven't been convicted, of using your car or your house
02:05:17.720 | or whatever for selling drugs,
02:05:20.000 | they can take whatever they want.
02:05:22.460 | And then you have to sue to prove your innocence
02:05:25.260 | and get your property back.
02:05:26.340 | It's a complete violation of due process.
02:05:28.260 | People don't realize this is going on.
02:05:29.540 | It's a great way for the cops to increase their budgets
02:05:31.900 | and it's legal.
02:05:32.980 | And libertarians were like the first big ones saying,
02:05:35.660 | guys, this is not American and this is crazy.
02:05:37.940 | And now increasingly people, conservatives and leftists
02:05:40.740 | are like, wait a minute, this is,
02:05:42.700 | even if you are selling drugs, like they take your house,
02:05:44.620 | what are you talking about?
02:05:45.780 | So I think those are some mechanisms by which libertarianism,
02:05:49.260 | though not by name, has become far more popular.
02:05:52.540 | - Yeah, it's interesting.
02:05:53.380 | So the idea, yeah, a coherent set of ideas
02:05:56.820 | that eventually get integrated into a two-party system.
02:06:01.660 | The war, that's an interesting one.
02:06:03.220 | You're right.
02:06:04.060 | I wonder what the thread there is.
02:06:07.380 | I wonder how it connects to 9/11 and so on.
02:06:11.020 | - I think the Patriot Act.
02:06:13.820 | - Patriot Act, okay.
02:06:14.660 | - For people who are politically savvy,
02:06:16.820 | we're like, oh, okay, this is not a joke.
02:06:20.740 | This is really a crazy infringement of our freedoms.
02:06:24.140 | And both parties are falling over each other
02:06:27.020 | to sign it into law. And the Orwellian name,
02:06:29.940 | How can you be against patriotism?
02:06:32.380 | What kind of person?
02:06:33.220 | You know what I mean?
02:06:34.060 | So I think for a lot of people,
02:06:34.880 | especially both civil libertarians on the left
02:06:37.220 | and a lot of conservatives who are constitutionalists
02:06:39.380 | are like, wait a minute, this isn't,
02:06:40.980 | I'm not comfortable with this.
02:06:42.300 | And I'm also not comfortable
02:06:43.340 | with how comfortable everyone in Washington is with it.
02:06:45.700 | - You're right.
02:06:46.540 | Probably libertarians and libertarianism
02:06:50.540 | is a place of ideas, which is why I have a connection to it.
02:06:53.740 | Every time I listen to those folks, I like 'em.
02:07:00.420 | I feel connected to 'em.
02:07:01.540 | I would even sometimes, depending on the day,
02:07:03.420 | call myself a libertarian.
02:07:05.180 | - But we're on the spectrum, so that's why.
02:07:06.620 | - We're on the spectrum, yeah.
02:07:08.300 | But when I look at the people
02:07:10.020 | that actually rise to the top
02:07:12.480 | in terms of the people who represent the party,
02:07:15.580 | this is where five minutes ran out, right?
02:07:18.100 | - I could go, I'm allowed.
02:07:19.740 | - You can go, why are they so weird?
02:07:22.340 | Why aren't strong candidates emerging
02:07:26.800 | that represent as political representatives
02:07:31.900 | or as famous speakers that represent the ideology?
02:07:36.900 | - I think libertarians tend to,
02:07:40.340 | I think Jonathan Haidt in his book, in his research,
02:07:43.060 | he's a political scientist,
02:07:44.180 | and he does a lot of things
02:07:45.020 | about how people come to their political conclusions
02:07:46.740 | and what factors force people to reach conclusions.
02:07:50.860 | And he found that libertarians are the least empathetic
02:07:53.660 | and most rationalistic of all the groups.
02:07:56.100 | And by that, he means they think in terms of logic
02:07:58.140 | as opposed to people's feelings.
02:07:59.660 | And that has positives and has negatives.
02:08:02.060 | And we have the A/B testing with Ron Paul.
02:08:07.100 | Ron Paul ran for president as a libertarian nominee.
02:08:10.580 | He was the nominee.
02:08:11.780 | He got pretty much nowhere in 1988.
02:08:14.040 | Then he returned to the Republican Party
02:08:16.100 | as a congressman for many years from Texas.
02:08:18.020 | He ran for the presidency in 2008 and 2012.
02:08:22.480 | And in 2008, he stood on stage with Rudy Giuliani
02:08:26.540 | and told him that they came here on 9/11
02:08:29.060 | because we're over there,
02:08:30.340 | which would have been a shocking,
02:08:32.820 | horrifying taboo a few years earlier.
02:08:35.100 | Many people were like, "Holy crap, this is amazing.
02:08:37.260 | "Giuliani was all offended."
02:08:38.460 | And Ron Paul's like--
02:08:39.300 | - That took some guts, by the way.
02:08:40.580 | - Yeah, it did.
02:08:41.940 | - When I heard that, it was so refreshing.
02:08:44.180 | Not what he said,
02:08:45.620 | but the fact that he said something that took guts.
02:08:48.260 | It made me realize how rare it is for politicians,
02:08:53.260 | but even people, to say something that takes guts.
02:08:56.020 | - Well, it's also the idea that you can't,
02:08:58.700 | even if you think America has a right
02:09:01.140 | to invade any country on Earth as much as it wants
02:09:04.020 | and kill people as a consequence of war
02:09:06.580 | and blow up their buildings and destroy their country,
02:09:09.420 | you can't, with a straight face,
02:09:11.080 | not expect there to be consequences,
02:09:13.820 | even if they're consequences from evil people.
02:09:15.980 | Even if we're 100% the good guys
02:09:17.660 | and they're 100% the bad guys,
02:09:19.100 | those bad guys, some of them are still gonna try
02:09:21.120 | to do something, what happens next?
02:09:23.500 | You know what I mean?
02:09:24.340 | So that kind of concept,
02:09:25.940 | that there's any American culpability
02:09:28.580 | was unthinkable: we're America, we're the good guys.
02:09:30.980 | By definition, we're not culpable.
02:09:32.420 | To have people start thinking about,
02:09:34.140 | what if there's another way?
02:09:35.540 | You know, what if we're not there
02:09:37.140 | and then they're not here
02:09:38.100 | and we're kind of doing a backdoor,
02:09:39.620 | we're talking, so different scenarios.
02:09:41.860 | So the fact that he got so much more traction
02:09:44.560 | as a Republican, the fact that Donald Trump,
02:09:47.580 | who came out of nowhere,
02:09:48.940 | became not only the candidate,
02:09:50.660 | but the president, tells people,
02:09:52.980 | it's like getting a book deal, right?
02:09:54.860 | You can either go, there's three choices.
02:09:57.460 | You can either self-publish, mainstream publisher,
02:10:00.540 | or independent publisher.
02:10:03.180 | The independent publisher is the worst of all choices
02:10:05.940 | 'cause you're not getting a big advance,
02:10:07.960 | they're not gonna be able to promote you a lot
02:10:10.580 | and they don't get the distribution.
02:10:12.900 | Mainstream, I've done mainstream myself, right?
02:10:15.460 | With self-publishing, I don't have the cred,
02:10:18.280 | the respectability, or the cachet of a mainstream publisher.
02:10:20.580 | - Can't be a New York Times bestseller.
02:10:22.620 | - Right, it takes a lot of work,
02:10:24.340 | but I get a lot more of the profit,
02:10:26.980 | and it looks good on the shelf;
02:10:28.140 | on Amazon, it looks identical, so on and so forth.
02:10:30.420 | With the mainstream, the benefits and costs
02:10:32.420 | are pretty much obvious to most people.
02:10:34.300 | So the same thing, it's like,
02:10:35.340 | you can either be an independent like Ross Perot,
02:10:37.840 | or you could just seize one of the party apparatuses,
02:10:41.180 | where the benefits are enormous.
02:10:42.900 | But in terms of going third party,
02:10:45.140 | I don't know that the Libertarian Party apparatus
02:10:47.060 | other than maybe some ballot access
02:10:48.860 | is really that efficacious.
02:10:50.720 | And then you're gonna have a lot of baggage
02:10:52.580 | 'cause if you hear independent, Jesse Ventura, Ross Perot,
02:10:56.080 | you think of the person.
02:10:57.400 | Now you have to define yourself
02:10:59.340 | and you have to defend the party,
02:11:01.220 | that's two bridges for most people.
02:11:03.160 | - Brilliantly put.
02:11:05.260 | - Okay, thank you.
02:11:06.300 | (laughing)
02:11:08.180 | - Let me speak to, 'cause I'm speaking to Yaron Brook.
02:11:11.580 | - Oh gosh. - Yeah.
02:11:12.740 | - I like him.
02:11:15.060 | - Yeah, so, but that, another example, I was--
02:11:19.180 | - Ask him to tell you a joke about Ayn Rand,
02:11:21.500 | if he can do it.
02:11:23.860 | - So there, that's one criticism I've heard you say,
02:11:26.920 | which is they're unable to speak to any weaknesses
02:11:31.000 | in either Ayn Rand's or objectivist worldview.
02:11:34.320 | - Yes.
02:11:35.300 | - That's really, you put it, I know you're half joking,
02:11:39.640 | but that's actually a legitimate discussion to have.
02:11:43.360 | - I'm not joking at all.
02:11:44.480 | - Because that's, to me, one of the criticisms
02:11:47.560 | and one of the explanations why the world
02:11:49.320 | seems to disrespect Ayn Rand, the people that do,
02:11:53.680 | is she kind of implies that her ideas are flawless.
02:11:58.680 | - No, she says they correspond to reality.
02:12:01.420 | - Yeah, right.
02:12:02.320 | - That's the term she uses.
02:12:03.720 | - That, I mean, objective, it's in the name,
02:12:08.320 | it's just facts.
02:12:11.000 | Like, it's impossible to basically argue against,
02:12:14.160 | 'cause it's pretty simple, it's just all facts.
02:12:16.800 | - Well, it's possible to argue against,
02:12:18.380 | but she would say she's never met a good critic
02:12:20.860 | who can argue the facts of that misrepresentation.
02:12:22.960 | - And she's not entirely wrong.
02:12:24.120 | She's often caricatured, 'cause she has
02:12:25.680 | a very extreme personality and extreme worldview.
02:12:28.200 | - But that, to me, I mean, some people,
02:12:30.080 | there's a guy in the physics and mathematics community
02:12:32.720 | called Stephen Wolfram, I don't know if you've heard of him.
02:12:35.200 | - Wolfram Alpha?
02:12:36.040 | - Yeah. - Okay.
02:12:36.880 | - He has a similar style of speaking sometimes,
02:12:39.720 | which is like, I've created a science,
02:12:44.200 | but that turns a lot of people off,
02:12:46.560 | like this kind of weird confidence,
02:12:48.840 | but he's one of my favorite people,
02:12:50.440 | I think one of the most brilliant people.
02:12:52.560 | If you just ignore that little bit of ego,
02:12:56.080 | or whatever you call that,
02:12:57.960 | that there's some beautiful ideas in there.
02:12:59.960 | - He's an amazing person.
02:13:01.240 | - And that, for me, objectivism,
02:13:03.440 | I'm undereducated about it.
02:13:06.040 | I hope to be more educated,
02:13:08.400 | but there's some interesting ideas that,
02:13:10.480 | again, just like with UFOs,
02:13:12.360 | not that there's a connection between the two.
02:13:15.200 | - Don't bring that up with Yaron, he won't like it.
02:13:17.640 | - He won't. - "Ayn Rand and UFOs?
02:13:19.120 | Oh, no, no, no, this interview is over."
02:13:21.240 | (Lex laughing)
02:13:23.320 | That's a good Yaron, okay.
02:13:25.240 | But you have to be a little bit open-minded,
02:13:27.560 | but what's your sense of objectivism?
02:13:31.560 | Are there interesting ideas
02:13:32.840 | that are useful to you to think about?
02:13:35.640 | - I own her copy of the first printing of The Fountainhead,
02:13:38.480 | so that should tell you a little bit
02:13:39.680 | about my affection for Miss Rand,
02:13:41.600 | how deep that goes.
02:13:42.800 | Ayn Rand does not have all the answers,
02:13:46.040 | but she has all the questions.
02:13:47.680 | So if you study Rand,
02:13:49.120 | you are going to be forced to think
02:13:50.640 | through some very basic things,
02:13:52.160 | and you're gonna have your eyes open very, very heavily.
02:13:54.960 | She was not perfect.
02:13:56.120 | She never claimed to be perfect.
02:13:57.400 | She was asked on Donahue,
02:13:59.400 | "Is it true that according to your philosophy,
02:14:01.320 | you are a perfect being?"
02:14:02.560 | She said, "I never think of myself that way."
02:14:05.120 | And she said, "But if you ask me,
02:14:06.480 | do I practice what I preach?"
02:14:07.600 | The answer is yes, resoundingly.
02:14:10.240 | She's a fascinating woman.
02:14:12.320 | What is really interesting about her,
02:14:15.520 | and this is something you'd appreciate personally,
02:14:17.880 | is when you read her essays,
02:14:19.400 | she'll have these weird asides.
02:14:21.800 | And she would talk about art,
02:14:23.040 | and she'd be like, "And this is why the US
02:14:24.240 | should be the only country with nuclear weapons."
02:14:25.960 | And when you follow a brilliant mind
02:14:28.200 | making these seemingly disparate connections,
02:14:31.360 | it's something I find to be just absolutely inspiring
02:14:33.640 | and awesome and entertaining.
02:14:35.160 | I think there's lots of things about her
02:14:38.480 | that would make people like Yaron uncomfortable.
02:14:40.880 | Well, so objectivism,
02:14:46.480 | like any other philosophy,
02:14:47.720 | gives us all these techniques
02:14:49.040 | to kind of hand-wave away things
02:14:50.880 | you don't wanna talk about.
02:14:52.800 | So they talk about things like having
02:14:54.360 | no metaphysical significance, right?
02:14:57.000 | So what that means is like, "Well, what about this?
02:14:58.640 | Ah, I don't wanna talk about it.
02:14:59.480 | Like, it doesn't matter."
02:15:00.640 | It's literally a fancy
02:15:02.120 | philosophical term for "it doesn't matter."
02:15:03.840 | Or they will say correctly
02:15:06.120 | that it's very twisted in our culture
02:15:09.560 | that when we have heroes, we look for their flaws
02:15:12.520 | instead of looking for their virtues.
02:15:13.920 | That's a 100% valid perspective.
02:15:16.400 | However, if I'm sitting here telling you
02:15:19.520 | that I think this woman is a badass and she's amazing
02:15:23.120 | and she should be studied,
02:15:24.640 | but there's also these idiosyncrasies,
02:15:26.600 | they don't wanna hear it.
02:15:27.880 | Because they, and I think it's very convenient for them
02:15:30.280 | 'cause there's a lot of things she did that were,
02:15:32.200 | here's an example.
02:15:33.320 | Rand was very, very pro-happiness and pro-pleasure.
02:15:36.680 | She was very pro sex, which is kind of surprising
02:15:39.360 | looking at her and how she talked and how strident she was.
02:15:41.800 | As a result of this, she never got her cats fixed
02:15:44.960 | to deny them the pleasure of orgasm.
02:15:46.920 | So her male cats are spraying up her entire house.
02:15:49.960 | Like that is, I mean, that's her putting her philosophy
02:15:53.400 | into practice, but it's still gross.
02:15:55.880 | So that's the kind of thing where I don't think he'd be comfortable.
02:15:58.160 | Another thing is Rand had an article on a woman president
02:16:01.920 | and she said a woman should never be president, right?
02:16:04.440 | Now, when Rand says things that are too goofy for them,
02:16:06.960 | they say, "Oh, that's not objectivism.
02:16:09.560 | "That's her personal preference."
02:16:11.120 | It's like she did not have these lines.
02:16:14.320 | Objectivism was always defined as Ayn Rand's writings,
02:16:18.200 | plus the additional essays in her books.
02:16:20.200 | So if this was in part of those books,
02:16:22.240 | this counts as official objectivism,
02:16:24.040 | but they pretend otherwise.
02:16:25.240 | So that's another example.
02:16:26.440 | Plus, she was, and I bet you she was on the spectrum
02:16:30.120 | to some extent, I'm not joking.
02:16:31.300 | I'm not using that derisively.
02:16:32.720 | She was of the belief, and not inaccurately,
02:16:36.520 | that humor is used to denigrate and humiliate.
02:16:40.880 | And she was thinking about the Jon Stewart type
02:16:42.680 | before there was a Jon Stewart.
02:16:43.880 | And a lot of times, like how I use mocking,
02:16:46.560 | but she was resentful, correctly,
02:16:49.320 | that a lot of times, with people who are great and accomplished,
02:16:51.880 | little nobodies will make them a punchline
02:16:54.960 | just to bring them down.
02:16:56.720 | Here's an example I just thought of.
02:16:58.120 | I remember when it was,
02:16:59.960 | must've been the '90s,
02:17:01.160 | they had a segment on MTV of all these musicians
02:17:04.760 | who were making their own perfumes, right?
02:17:07.480 | And this girl grabbed Prince's perfume.
02:17:09.360 | And before she even smelled it, she had the joke ready.
02:17:11.080 | She just, "Oh, this smells almost as bad
02:17:13.160 | "as his music lately."
02:17:14.360 | It's like, first of all, I'm sure the perfume's fine.
02:17:16.640 | And second of all, this is Prince.
02:17:18.160 | He's one of the all-time greats
02:17:19.720 | and you can't wait to denigrate him.
02:17:22.800 | And part of me wants to be,
02:17:24.400 | like Rand, "How dare you?"
02:17:25.680 | Like as if this perfume in any way,
02:17:29.400 | in any way mitigates his amazing accomplishments
02:17:32.040 | and achievements, you horrible person.
02:17:34.080 | But I do have some great Ayn Rand jokes
02:17:36.600 | and he would not be happy about them.
02:17:38.520 | - The perfume thing, the problem with it is it's just not funny.
02:17:40.960 | Not that- - Oh, he sucks.
02:17:42.360 | Okay, great.
02:17:43.480 | - Not that they dared to try to be humorous.
02:17:47.400 | 'Cause I don't know why you mentioned Jon Stewart,
02:17:48.840 | 'cause Jon Stewart can be funny.
02:17:50.480 | - Right, but he taught a generation,
02:17:53.320 | you still see this on Twitter,
02:17:55.080 | where things have to be inherently sarcastic and snide.
02:17:58.920 | - But isn't that, I mean, aren't you practicing that?
02:18:00.920 | - No, I use irony, not sarcasm.
02:18:02.800 | Here's an example.
02:18:03.640 | When people, like you say something
02:18:05.240 | and someone replied, it'd be like,
02:18:06.320 | "Um, last I checked, blah, blah, blah, blah, blah."
02:18:08.760 | And I'll say that.
02:18:09.600 | I go, "What do you think saying last I checked
02:18:11.240 | "added to your point?
02:18:12.080 | "You're giving me valuable information and data,
02:18:14.440 | "but you are trained to believe
02:18:16.400 | "that it has to be couched in this sneering.
02:18:19.420 | "It doesn't, just give me the information.
02:18:21.080 | "This is useful information."
02:18:22.520 | - Yeah, that's true.
02:18:24.080 | - It's a knee jerk.
02:18:25.000 | - But see, Jon Stewart did it masterfully, I thought.
02:18:27.160 | - Correct, and they don't.
02:18:28.120 | - And they don't.
02:18:29.080 | It's like people who copy certain comedians,
02:18:32.040 | you try to copy them and you lose everything
02:18:35.800 | in the process of copying.
02:18:37.040 | Yeah, yeah, okay.
02:18:41.180 | But in terms of the philosophy of selfishness,
02:18:44.940 | this kind of individual focused idea,
02:18:47.940 | I imagine that connects with you.
02:18:50.900 | - Yes, and I think it would connect with more people
02:18:53.020 | if they understood what she meant by it.
02:18:54.340 | Nathaniel Branden, who was her heir
02:18:55.860 | until she kind of broke with him
02:18:57.180 | and he was a co-dedicatee of Atlas Shrugged,
02:19:00.420 | said, "No one will state Ayn Rand's views
02:19:03.380 | "with a straight face.
02:19:04.220 | "They won't say, 'I believe that my happiness matters
02:19:08.140 | "and is important and is worth fighting for,
02:19:10.400 | "and Ayn Rand says this, therefore she's dangerous.'"
02:19:13.060 | Now, it's very easy to say this could have
02:19:15.140 | dangerous consequences if you're a sociopath,
02:19:17.600 | but to put it in those terms, I think is extremely healthy.
02:19:21.220 | I think more people should wanna be happy.
02:19:23.020 | And I think a lot of us are raised to be apologetic,
02:19:26.620 | especially in this cynical media culture,
02:19:29.380 | that if you say, I wanna be happy, I wanna love my life,
02:19:32.380 | that it's just like, okay, sweetheart.
02:19:34.260 | And the eye rolling, and I think that's so pernicious,
02:19:37.180 | it's so horrifying, and this is why I'm a Camus person,
02:19:39.540 | 'cause Camus thought the arch enemy was cynicism
02:19:41.420 | and I could not agree more.
02:19:42.860 | Like if you are the kind of person,
02:19:44.100 | if someone likes a band and you're like,
02:19:45.420 | oh, she likes them, blah, blah, blah,
02:19:46.620 | it's like, this gives them happiness.
02:19:48.980 | - Yeah. - Now, there's certain
02:19:49.800 | exceptions, but if it gives you happiness,
02:19:51.540 | it's not for you, that's cool.
02:19:53.260 | - Okay, this is beautiful.
02:19:55.860 | I so agree with you on the eye rolling,
02:19:59.380 | but you see the best of trolling as not the eye roll.
02:20:04.140 | - Correct, of course not.
02:20:05.360 | - The best of trolling is taking down the eye rollers.
02:20:08.940 | - I'm gonna have to think about that.
02:20:10.180 | - Okay.
02:20:11.020 | - 'Cause I-- - Have another Red Bull.
02:20:12.420 | - Yeah, I was, yeah, 'cause I put 'em all--
02:20:16.340 | - My blood type is Red Bull.
02:20:17.700 | (laughing)
02:20:19.940 | - I kind of put 'em all in the same bin.
02:20:22.840 | - Okay. - And they're not.
02:20:23.760 | - They're not. - They're not, okay.
02:20:25.760 | All right.
02:20:26.920 | - Here's another example of trolling.
02:20:29.180 | I was making jokes about Ron Paul, he just had a stroke.
02:20:32.260 | Right, and someone came at me and they're like,
02:20:35.100 | oh, blah, blah, blah, you're ugly,
02:20:37.480 | I hope you have a stroke, I hope you're in the hospital.
02:20:39.480 | And I just go, I just did have a stroke on your mom's face.
02:20:42.320 | So they came at me and now they got put in their place.
02:20:46.020 | - With a subpar, I mean--
02:20:50.360 | - I wasn't clever.
02:20:51.320 | - You weren't clever.
02:20:52.440 | - Not particularly, no.
02:20:53.680 | - Well, one of your things you do, which is interesting,
02:20:57.040 | I mean, I give you props in a sense,
02:20:59.320 | is you're willing to go farther than people expect you to.
02:21:03.880 | - Yes, that's fun.
02:21:05.220 | - Yeah, in fact, I'll probably edit out
02:21:07.940 | like half of this podcast because the thing you did
02:21:11.300 | which she kept in, I should mention,
02:21:14.180 | Mikhaila Peterson now has a podcast, which is nice.
02:21:16.900 | I guess, was it on her podcast?
02:21:18.420 | - She was on mine.
02:21:19.260 | - She was on yours.
02:21:20.080 | - We did both, but this is when you're referring
02:21:21.420 | to when she was on mine.
02:21:22.260 | - She was on, yeah, right.
02:21:23.620 | And you went right for the--
02:21:26.940 | - So I'll tell you what it was, you don't have to paraphrase.
02:21:29.740 | So I opened up, I say, she's Jordan Peterson's dad,
02:21:32.580 | and as many people know, Jordan's--
02:21:33.900 | - Daughter, yeah.
02:21:34.740 | - Sorry, he's her dad, yeah.
02:21:36.260 | She's had a long issue with substance addiction.
02:21:39.920 | And I said to her, you're most famous
02:21:42.820 | for being Jordan Peterson's daughter.
02:21:45.400 | Many people, he's changed so many lives around the world,
02:21:48.180 | and he's been such an enormous influence to me personally
02:21:51.740 | that I've started taking benzodiazepines recreationally.
02:21:54.780 | And she's like, oh my God, Michael is so horrible.
02:21:58.580 | - Yeah, 'cause you pulled me in with this,
02:22:01.180 | 'cause you're talking, I mean, you know,
02:22:02.940 | 'cause he's going through a rough time now,
02:22:04.860 | she's going through, just everything was just,
02:22:07.420 | you pulled me in emotionally.
02:22:08.940 | I was like, this is gonna be the sweet,
02:22:11.620 | Mike is gonna be just this wonderful, and then just bam.
02:22:16.140 | So that was props to you on that.
02:22:21.140 | Whatever that is, that is an art form.
02:22:25.140 | When done well, it can be taken too far.
02:22:27.860 | My criticism is that that feels too good for some people.
02:22:32.860 | - What do you mean?
02:22:34.580 | Oh, they're too happy being irreverent
02:22:36.260 | to show that they don't care about anything?
02:22:37.380 | That's another form of cynicism though.
02:22:38.980 | - Right, 'cause you think it's possible to be a troll
02:22:42.500 | and still live life to its highest ideal in the Camus sense?
02:22:47.500 | - I try, that's kind of my ideal.
02:22:51.140 | - I believe it's not.
02:22:54.300 | It becomes a drug.
02:22:55.900 | I feel like that takes you,
02:22:57.140 | like I think love ultimately is the way to experience
02:23:01.900 | like every moment of every day.
02:23:04.300 | - You don't think that was an expression of,
02:23:06.860 | I honestly think, let's split hairs here
02:23:09.740 | 'cause I think there's something of use here.
02:23:12.100 | I do think that me being able to make her laugh
02:23:17.100 | about this year of hell she was in
02:23:21.300 | does create an element of love
02:23:23.420 | and connection between me and her.
02:23:25.180 | - Yeah, but--
02:23:26.020 | - I know she would say that.
02:23:28.140 | - Yes, it wasn't that.
02:23:30.860 | It was what you said in combination
02:23:33.660 | with the sweetness everywhere else, the kindness.
02:23:37.500 | It's a very subtle thing,
02:23:39.020 | but it's like some of the deepest connections
02:23:42.020 | we have with others is when we mock them lovingly.
02:23:46.060 | - Yes, correct.
02:23:46.980 | - But there is stuff, there's kindness around that.
02:23:52.260 | Not in words, but in like subtle things.
02:23:55.220 | - 'Cause it creates an air of being familial.
02:23:58.780 | Like we're through this together.
02:24:00.860 | - Yeah, that's missing.
02:24:02.540 | That's very difficult to do on the internet.
02:24:04.420 | - I agree with you.
02:24:05.780 | I agree with you.
02:24:06.620 | - That's why my general approach on the internet
02:24:10.740 | is to be more like simple, less witty,
02:24:15.300 | and more like dumbly loving.
02:24:19.380 | - But that's not your core competency, being witty.
02:24:22.900 | (laughs)
02:24:24.820 | - Me? - Yeah.
02:24:25.900 | - I can be witty.
02:24:27.860 | - You can be, but I'm saying that's not your core competency.
02:24:29.780 | I'm not saying you're bad at it,
02:24:30.900 | but I'm saying that's not where you go organically,
02:24:33.760 | especially with strangers.
02:24:36.020 | - I just feel like nobody's core competence on the internet
02:24:40.260 | is, I guess, if you want to bring love to the world,
02:24:43.340 | nobody's core competence is,
02:24:45.260 | given the current platforms,
02:24:47.860 | nobody's core competence is wit.
02:24:50.420 | It's very difficult to be witty.
02:24:51.660 | - Yeah. - On the internet
02:24:53.460 | while still communicating kindness.
02:24:55.660 | - I'll give you another example.
02:24:57.900 | - In the same way that you can in physical space.
02:24:59.540 | - I'll give you another example.
02:25:00.420 | Someone came at me and they were like,
02:25:03.940 | they gave me a donation, people do this all the time,
02:25:06.220 | and they go, "Oh, I started reading your books
02:25:09.780 | "'cause of my wife, and now we watch your shows together.
02:25:14.040 | "Keep up the good work."
02:25:15.020 | And I go, "What does her boyfriend think?"
02:25:17.460 | So that is an example of wit and love
02:25:20.500 | because that person feels seen.
02:25:23.060 | I'm acknowledging them.
02:25:24.340 | I'm also making a joke at their expense.
02:25:25.940 | We know it's a joke.
02:25:27.300 | So I think-- - Good point, good point.
02:25:29.300 | - Language is often used in non-literal ways
02:25:32.000 | to cue emotion and connectivity.
02:25:34.140 | - It's difficult. - It's very difficult.
02:25:36.180 | - What you've done is difficult to accomplish,
02:25:38.900 | but you've done it well.
02:25:39.740 | I mean, you do, like you've been doing these live streams,
02:25:43.360 | which are nice that people give you a bunch of money
02:25:45.220 | and donations and stuff,
02:25:46.580 | and then you'll often make fun of certain aspects
02:25:49.900 | of their questions and so on, but it's always loving.
02:25:52.220 | - That's not from love.
02:25:53.060 | That is genuine annoyance
02:25:53.900 | 'cause they ask me some really dumb questions.
02:25:54.720 | - But there's still underlying, it's not even,
02:25:57.900 | like there's a kind person under that
02:26:00.460 | that's being communicated.
02:26:02.320 | That's interesting,
02:26:03.160 | but I don't know if I get that from your Twitter.
02:26:05.700 | I know I get that from the video.
02:26:07.580 | Something about the face, something about like--
02:26:09.620 | - Yeah, of course, it's much harder.
02:26:10.900 | The more data, the more easy it is
02:26:13.500 | to convey emotion and subtlety, absolutely.
02:26:15.660 | If you only have literally black and white letters,
02:26:17.820 | it's gonna be, or whatever, white and black,
02:26:19.540 | if you have night mode, it's gonna be a very different,
02:26:21.740 | it's much more limited information.
02:26:23.700 | - Yeah, but this is the fundamental thing is like--
02:26:26.700 | - Here's another example.
02:26:28.620 | Like if they had access to my face,
02:26:30.220 | like a lot of times some people don't know who I am
02:26:32.260 | and they come at me, call me a Nazi anti-Semite, right?
02:26:34.860 | And I start talking about the Jews
02:26:36.260 | and just how terrible the Jews are.
02:26:37.860 | Now, all my audience knows I'm Jewish,
02:26:39.380 | that I went to Yeshiva,
02:26:40.220 | so they're sitting there laughing
02:26:41.500 | 'cause this person's making an ass of themselves.
02:26:43.460 | That person has no idea, but if there was video,
02:26:46.700 | then they would be like, okay, wait a minute,
02:26:48.180 | something's off.
02:26:49.020 | Something's up.
02:26:49.900 | I don't know.
02:26:51.860 | I think it's entertaining, I think it's fun,
02:26:54.660 | but I just, I don't think it's scalable.
02:26:57.420 | And ultimately, I'm trying to figure out
02:27:00.260 | this whole trolling thing
02:27:01.980 | 'cause I think it's really destructive.
02:27:04.780 | I've been, the outrage mob, the outrage mobs,
02:27:08.900 | just the dynamics of Twitter has been really bothering me.
02:27:12.620 | - Okay.
02:27:13.460 | - And I've been trying to figure out
02:27:15.100 | if we can try to build an alternative to Twitter perhaps
02:27:19.820 | or try to encourage Twitter to be better,
02:27:22.140 | how to have nuanced, healthy conversations.
02:27:26.060 | The reason I talk about love isn't just for love's sake,
02:27:28.860 | it's just a good base
02:27:30.100 | from which to have difficult conversations.
02:27:33.140 | That's a good starting point
02:27:34.540 | because if you start, I would argue
02:27:37.420 | that the kind of conversation you have on Twitter is fun,
02:27:42.420 | but it might not be a good starting point
02:27:44.660 | for a difficult, nuanced conversation.
02:27:46.660 | - Well, I'm not interested
02:27:47.860 | in having those conversations with most people.
02:27:49.820 | - No, I know, but--
02:27:50.900 | - So I agree with you.
02:27:52.220 | Your point is valid.
02:27:53.100 | - Yes, but like I was saying,
02:27:54.340 | so if we were trying to have a difficult, nuanced conversation
02:27:58.980 | about, say, race in America or policing,
02:28:02.340 | is there racism, institutional racism of policing?
02:28:06.380 | Okay, there's the only conversations
02:28:09.420 | that have been nuanced about it
02:28:11.340 | that I've heard is in the podcasting medium.
02:28:13.700 | - I agree with you.
02:28:14.540 | - There's the magic of podcasting, which is great,
02:28:17.020 | but that's, the downside of podcasting
02:28:21.180 | is it's a very small number of people.
02:28:24.260 | Even if it's in the thousands, it's still small.
02:28:27.740 | And then there's millions of people on social media
02:28:30.380 | and they're not having nuanced conversation at all.
02:28:32.940 | - They're not capable of it.
02:28:34.060 | That's the difference in your mind.
02:28:35.740 | - I believe they are.
02:28:36.780 | So that's the--
02:28:37.620 | - There's no data that shows this.
02:28:38.440 | - And then both of us aren't being, not scientific.
02:28:41.580 | You don't have data to support your world either.
02:28:43.420 | - We're making the claims.
02:28:44.720 | - Well, you are too.
02:28:46.700 | - No, I'm not.
02:28:47.520 | If I'm looking at an object, the claim is that it has a mind.
02:28:51.060 | - Well, no, what, no, your claim is
02:28:55.900 | that people are fundamentally stupid.
02:28:58.180 | - Aren't you a martial artist?
02:28:59.740 | - Yes.
02:29:00.980 | - How's it feel?
02:29:02.160 | I did judo on you.
02:29:04.620 | - But you really don't think people are deep down
02:29:10.140 | like capable of being intelligent.
02:29:13.500 | - No, not at all.
02:29:14.900 | Not deep down, not surface.
02:29:16.020 | I'm not joking.
02:29:16.860 | I'm not being tongue in cheek.
02:29:18.020 | I'm not being cynical.
02:29:18.900 | I do not at all, at all think they have this capacity.
02:29:22.340 | - I'm gonna think, 'cause you're being so clear about it.
02:29:24.620 | You're not even, I'm gonna have to think about that.
02:29:28.260 | - Here's evidence for my position, not proof.
02:29:31.420 | And this is of course data that is of little use,
02:29:33.780 | but it's of interest.
02:29:35.020 | A lot of times when you have an audience as big as mine
02:29:37.260 | and people come at you,
02:29:38.800 | not only will people say the same thing, the same concept,
02:29:42.100 | they'll say the same concept in the same way.
02:29:44.420 | That is not a mind.
02:29:45.620 | - Yeah, that's surface evidence.
02:29:49.300 | You're saying this iceberg looks like this from the surface.
02:29:52.620 | I'm saying there's an iceberg there that if challenged
02:29:56.620 | can rise to the occasion of deep thinking.
02:30:03.220 | And you're saying, nope.
02:30:04.860 | - Nope, it's just frozen water.
02:30:08.660 | (speaking in foreign language)
02:30:09.500 | - Isn't that the Russian expression?
02:30:11.640 | - That's ice cream.
02:30:12.880 | - No, not (speaking in foreign language)
02:30:14.640 | Doesn't it mean like no one's there?
02:30:17.400 | - Actually, I don't know.
02:30:18.240 | - Yeah, it means like, yeah.
02:30:19.840 | Yeah, it's like thought.
02:30:20.920 | It means (speaking in foreign language)
02:30:25.520 | - Okay, well, so you're challenging me
02:30:28.400 | to be a little bit more rigorous.
02:30:30.000 | I think I'll try to prove--
02:30:30.840 | - I'm not challenging you anything.
02:30:31.960 | I'm just saying--
02:30:32.780 | - No, not challenging me,
02:30:33.620 | but I'm challenging myself based on what you're saying
02:30:35.760 | because I'd like to prove you wrong.
02:30:37.920 | And find actual data to show you wrong.
02:30:42.420 | And I think I can, but I would need to get that data.
02:30:46.440 | - That's funny you said, I think I can
02:30:47.760 | when they were working on my biography, "Ego and Hubris",
02:30:50.740 | the title I had suggested was
02:30:52.240 | "The Little Engine That Could But Shouldn't".
02:30:54.440 | (both laughing)
02:30:56.240 | - They didn't like it.
02:30:57.080 | - I think that's a great title.
02:30:58.600 | - That's pretty good, yeah.
02:31:00.200 | Speaking of biographies, I mean,
02:31:02.040 | I read your book or listened to your book.
02:31:04.120 | Listened to, there's an audio book from you, right?
02:31:05.960 | - Yeah, I did the audio, yeah.
02:31:07.000 | - Yeah, you read it?
02:31:08.960 | - My Golas, yes.
02:31:10.200 | - Okay, so this was--
02:31:12.400 | - I didn't do your own Brooks voice in the book.
02:31:14.160 | I did all the different voices 'cause he has a lisp
02:31:15.840 | and I didn't wanna sound like I was making fun of him.
02:31:18.100 | - Yeah, I don't remember you reading it,
02:31:21.580 | but I really enjoyed it.
02:31:23.160 | - I promise you.
02:31:24.000 | - No, okay, it was good.
02:31:24.840 | It was like a year, year and a half ago.
02:31:26.360 | - This I can prove.
02:31:27.440 | (both laughing)
02:31:30.780 | - Well, let me at a high level,
02:31:32.260 | see if you can pull this off.
02:31:33.680 | If I ask you, what's the book you write about?
02:31:38.680 | - It's about a group of people
02:31:42.760 | who are united solely by their opposition to progressivism,
02:31:46.720 | who have little else in common,
02:31:49.080 | but who are all frequently caricatured and dismissed
02:31:53.400 | by the larger establishment media.
02:31:56.380 | - But you give this kind of story of how it came to be.
02:32:00.240 | - Sure.
02:32:01.080 | - And to me, like we're talking about trolls,
02:32:03.520 | but the internet side of things is quite interesting.
02:32:06.840 | So first of all, how does alt-right connect?
02:32:10.000 | - So the alt-right is the subset of the new right,
02:32:13.920 | which feels that race, not racism,
02:32:16.760 | is the most or one of the most important
02:32:19.600 | sociopolitical issues.
02:32:21.140 | - Are any of those folks like part of the mainstream
02:32:27.160 | or worth paying attention to?
02:32:29.200 | - Not in part of the mainstream, the alt-right?
02:32:31.160 | - Yeah.
02:32:32.000 | - I would not say in any position
02:32:32.820 | they would be part of the mainstream.
02:32:33.800 | - They would not be part of that.
02:32:34.960 | - No, they would not.
02:32:36.160 | I don't know that any of them,
02:32:38.640 | well, worth is not a position,
02:32:40.000 | I'm not in a position to say worth.
02:32:41.840 | I would say that it is of use
02:32:45.360 | to be familiar with their arguments
02:32:48.200 | because to dismiss any school of thought,
02:32:51.320 | especially one that has historically gained leverage,
02:32:57.040 | in very dark ways, especially in America,
02:32:59.520 | in Europe and other places,
02:33:01.200 | just to say, "Oh, they're racist,
02:33:03.000 | "I don't need to think about them,"
02:33:04.320 | it doesn't behoove you.
02:33:07.200 | - So what lessons do we draw from the 4chan side of things,
02:33:12.200 | like the internet side of the movement?
02:33:16.160 | - Tits or get the fuck out.
02:33:17.500 | - Can you define every single word in that sentence?
02:33:21.120 | - Tits or breasts or get the fuck out.
02:33:24.040 | That's from 4chan.
02:33:25.480 | - Okay, what's it mean?
02:33:28.120 | - Oh, sometimes a woman will appear on 4chan
02:33:29.920 | and it'll just reply, "Tits or get the fuck out."
02:33:32.320 | - I'm trying to understand what the,
02:33:35.640 | oh, oh, that's the way.
02:33:37.240 | I just, very slow.
02:33:44.520 | So that's, okay, so that's very disrespectful
02:33:47.680 | towards female members of the community.
02:33:52.520 | I don't understand.
02:33:53.360 | So there's rules to this community
02:33:55.080 | and one of them is we're not very good with women.
02:33:58.760 | Is that, that's one of the rules?
02:34:00.240 | - It's more of a principle than a rule.
02:34:02.280 | - It's a principle?
02:34:04.080 | We're not going to ever get laid.
02:34:06.280 | That's fundamental principle.
02:34:07.880 | Is there other-- - But we are gonna get pics.
02:34:10.240 | - Pics. - Sometimes.
02:34:11.400 | - Sometimes on the internet. - Sometimes they GTFO.
02:34:15.160 | - Okay, so is there other actual principles of,
02:34:19.120 | so like, from my maybe naive perspective
02:34:24.360 | is they have like the darkest aspects of trolling,
02:34:26.660 | which is like take nothing serious,
02:34:28.800 | make a game out of everything true.
02:34:30.520 | - That's not 4chan per se.
02:34:31.880 | One of the things that you will learn in 4chan,
02:34:34.200 | which I think is very healthy,
02:34:35.880 | is if you have an idiosyncratic or unique worldview
02:34:39.480 | or focus on an aspect of history or culture,
02:34:42.020 | you'll be able to find like-minded people
02:34:43.760 | who will engage with you and discuss it
02:34:45.400 | without being preemptively dismissive.
02:34:48.120 | - That's an ideal that they--
02:34:51.000 | - Well, it's not ideal.
02:34:51.840 | It's something that happens a lot.
02:34:52.840 | Now, 4chan's not really like,
02:34:54.080 | /pol/ is their board for politics,
02:34:56.480 | but they will get into some,
02:34:59.360 | the people there are much more erudite than you'd think.
02:35:01.880 | - So they do take,
02:35:03.100 | my perception was they take nothing seriously.
02:35:05.260 | So there's things that they take seriously,
02:35:07.360 | like discussing ideas.
02:35:08.640 | - I'll give you one example.
02:35:09.640 | There was a video someone posted of a girl
02:35:11.840 | who put kittens in a bag and threw it in a river,
02:35:14.420 | and they found out where she was within a day
02:35:16.040 | and got her arrested.
02:35:17.500 | So yeah, they do take some things very seriously.
02:35:19.940 | - Okay, but that's like an extreme.
02:35:23.280 | I mean, that's good.
02:35:25.080 | First of all, that's heartwarming
02:35:26.200 | that they wouldn't somehow turn that into a thing.
02:35:29.440 | That feels like more of a, what is it?
02:35:31.800 | What's the other one?
02:35:32.620 | 8chan?
02:35:33.460 | - 8chan's twice as good as 4chan, yeah.
02:35:36.040 | That's their slogan.
02:35:37.000 | - But it feels like they're the kind of community
02:35:40.520 | that would take that kitten situation and make a mockery.
02:35:47.520 | - Yeah, they're darker than 4chan, yeah.
02:35:47.520 | And don't even, I'm not allowed to talk about 16chan.
02:35:50.200 | - I'm already overwhelmed, clearly, by 4chan lingo.
02:35:56.120 | I have, I literally wrote down in my notes,
02:35:58.940 | like, in doing research for this conversation,
02:36:04.440 | I learned the word pleb,
02:36:05.940 | and I wanted to ask you what does pleb mean?
02:36:09.440 | - You know what pleb means?
02:36:10.480 | - No.
02:36:11.320 | - I don't, what do you--
02:36:13.520 | - I saw, I mean, actually, no, I don't.
02:36:16.280 | - You know what a pleb is?
02:36:17.720 | - I just, I don't know what a pleb is.
02:36:20.100 | - Like a plebiscite, or a plebeian.
02:36:22.320 | - Okay, but does it mean something more sophisticated?
02:36:26.720 | - No, it's a very unsophisticated mechanism
02:36:29.040 | of being dismissive.
02:36:30.680 | - Of, like, the regular people?
02:36:32.720 | - Yeah, or someone who comes at me on Twitter.
02:36:35.280 | - Okay, all right, so back to the 4chan alt-right.
02:36:39.400 | Wasn't the--
02:36:40.400 | - Those are very different concepts, don't conflate them.
02:36:43.520 | - But which internet culture was the alt-right born out of?
02:36:48.520 | - Well, alt-right was more born of blogs,
02:36:51.160 | and people had different blogs,
02:36:52.640 | and they were posting what they called, like,
02:36:54.080 | racial realism, scientific, which is scientific racism,
02:36:56.560 | so-called, and, you know, breaking down issues
02:36:59.080 | from a racialist perspective.
02:37:00.400 | So that wasn't, 4chan is much more dynamic.
02:37:03.720 | It's a message board, it's very fluid.
02:37:06.560 | So it doesn't lend itself to these kind of
02:37:08.640 | in-depth analysis of ideas or history.
02:37:11.520 | - But it spreads them, like it--
02:37:13.560 | - It spreads them as memes, yeah, and it, you know, but--
02:37:16.360 | - But it's not an essential mechanism
02:37:18.480 | of the alt-right, historically?
02:37:20.840 | - No, no, no, no, no, no.
02:37:22.320 | So it was mostly about blogs.
02:37:24.160 | Okay, so what do you make of the psychology
02:37:28.040 | of this kind of worldview?
02:37:29.640 | - When you have, and this goes to your conspiracy theory
02:37:32.600 | subject earlier, when you have a little bit of knowledge
02:37:35.200 | about something, about history, that no one's talking about,
02:37:39.020 | and there's only one group that is talking about it,
02:37:42.160 | and you have no alternative answers,
02:37:45.360 | you're going to be drawn to that group.
02:37:47.300 | So because issues about race, antisemitism, homophobia
02:37:51.600 | are so taboo in our culture,
02:37:53.880 | understandably, there's good reasons.
02:37:55.320 | If you start putting things like,
02:37:56.560 | how old should you be to have sex with kids
02:37:58.680 | and just have regular conversations,
02:38:00.080 | eventually some people are gonna start taking
02:38:01.520 | some positions you don't like,
02:38:02.520 | so some things have to be sanctified to some extent.
02:38:04.600 | They're the only ones talking about it.
02:38:06.440 | You're gonna be drawn to that subculture.
02:38:09.260 | - And where does the alt-right stand now?
02:38:13.080 | I mean, I hear that term used--
02:38:15.400 | - So the term has been weaponized by the corporate press
02:38:18.320 | for people that they want to read out of society.
02:38:22.440 | So it's used both on individual levels,
02:38:24.320 | like people like Gavin McInnes, Milo Yiannopoulos,
02:38:27.360 | some others, I mean, I think they refer to Trump as alt-right
02:38:30.840 | and it's become a slur, just like incel or bot,
02:38:36.620 | that has become largely removed from its original meaning.
02:38:39.720 | - Do you have a sense that there's still a movement
02:38:41.840 | that's alt-right, or like--
02:38:43.000 | - Yeah, they call themselves now, okay,
02:38:44.760 | so there's something called the dissident right,
02:38:47.460 | and they say we're completely not like the alt-right
02:38:49.720 | because the alt-right's A, B, and C, and we're B, C, D.
02:38:53.200 | There's a huge overlap, it's very much the same people.
02:38:56.080 | - Is there intellectuals that still represent
02:38:59.200 | some aspect of the movement?
02:39:01.360 | - I mean, sure-- - Are you tracking this?
02:39:03.280 | - Not that much anymore.
02:39:05.200 | I think they've, I don't find it particularly as,
02:39:09.020 | now that the book's done,
02:39:12.200 | I'm looking more into history for my next book.
02:39:15.840 | - You mentioned communism?
02:39:16.920 | - I'm gonna talk a lot about the Cold War,
02:39:19.600 | so this kind of stuff has largely fallen away
02:39:22.000 | from my radar to some extent.
02:39:23.960 | And they've also been, it's been a very effective movement
02:39:27.040 | to get them marginalized and silenced.
02:39:29.680 | - So they're not as deep of a concern
02:39:32.920 | in terms of concern or not, just their impact on society.
02:39:36.040 | - Yes, it's much lessened, yeah.
02:39:38.400 | - So as a troll on Twitter, in the best sense of the word,
02:39:42.200 | what do you make of cancel culture?
02:39:45.920 | - I think it's Maoism.
02:39:47.440 | I mean, corporate America has done a far better job
02:39:50.200 | of implementing Maoism than the Communist Party ever could.
02:39:52.680 | You had this meeting not that long ago
02:39:54.360 | from I think it was Northwestern University Law School
02:39:56.560 | where everyone on the call got up
02:39:57.940 | and said that they were racist.
02:39:59.440 | I mean, this is something that legally
02:40:01.240 | you should be very averse to saying, even if it were true.
02:40:04.560 | And it's this kind of concept of getting up
02:40:06.560 | and confessing your sins before the collective
02:40:08.840 | is something completely--
02:40:10.480 | - Oh, sorry, they admitted this of themselves?
02:40:14.200 | - Yeah, they were like, 'cause they're saying
02:40:15.640 | because they're white, they're inherently racist.
02:40:17.040 | So my name's John, I'm a racist.
02:40:18.600 | My name's this, I'm a racist.
02:40:20.400 | You hear it and you're like, okay, this is Looney Tunes.
02:40:22.760 | - So you're saying that, wow, that's so much,
02:40:25.280 | you took a step further.
02:40:26.480 | So you're saying there's like a deep underlying force,
02:40:30.120 | in a sense, cancel culture.
02:40:31.760 | It's not just some kind of mob symptom.
02:40:33.600 | - No, it's not a mob at all.
02:40:35.120 | - It's a--
02:40:36.920 | - It's a systemic organized movement
02:40:39.880 | being used for very nefarious purposes
02:40:42.360 | and to dominate an entire nation.
02:40:45.200 | - How do we fight it?
02:40:46.160 | 'Cause I sense it inside.
02:40:47.760 | I used to defend academia more because,
02:40:53.760 | I still do to some extent.
02:40:58.520 | It's a nuanced discussion because folks like Jordan Peterson
02:41:03.520 | and a lot of people that kind of attack academia,
02:41:06.800 | they refer, they really are talking about gender studies
02:41:10.240 | in certain departments.
02:41:11.320 | And me from MIT, it's the University of Science
02:41:15.120 | and Engineering and the faculty there
02:41:19.120 | really don't think about these issues
02:41:22.600 | or haven't traditionally thought about it.
02:41:24.240 | It's beginning to even infiltrate there.
02:41:26.360 | It's starting to infiltrate engineering and sciences
02:41:31.240 | outside of biology.
02:41:33.160 | Let's put biology with the gender studies.
02:41:35.920 | I'm talking about sciences that really don't have anything
02:41:37.960 | to do with gender.
02:41:38.900 | It's starting to infiltrate.
02:41:43.160 | It worries me.
02:41:44.680 | I don't know exactly why.
02:41:47.080 | I don't know exactly what the negative effect there would be
02:41:51.560 | except it feels like it's anti-intellectual.
02:41:55.800 | - Oh yes, of course.
02:41:57.360 | - And I'm not sure what to,
02:41:59.000 | 'cause on the surface,
02:42:02.240 | it feels like a path towards progress at first
02:42:05.880 | when I'm zoomed out, just squinting my eyes,
02:42:11.680 | not even in detail looking at things.
02:42:13.960 | But when I actually join the conversation to listen in,
02:42:17.260 | the conversation on quote unquote diversity,
02:42:20.300 | it quickly makes me realize that there's no interest
02:42:25.300 | in making a better world.
02:42:29.840 | - No, no, it's about domination.
02:42:31.320 | - It's about getting, yeah.
02:42:33.460 | - It's a way for, if you are a lowest status white person,
02:42:37.740 | using anti-racism is the only mechanism you will have
02:42:41.040 | to feel superior to another human being.
02:42:43.360 | So it's very useful for them.
02:42:44.820 | In terms of fighting it,
02:42:48.120 | one of my suggestions has been
02:42:49.760 | to seize all university endowments,
02:42:51.960 | which are the crystallization of privilege,
02:42:54.080 | and distribute that money as reparations.
02:42:56.640 | So it'd be very effective
02:42:57.760 | by turning two populations against each other
02:42:59.920 | and strongly diminishing
02:43:01.320 | the university's intellectual hegemony.
02:43:04.220 | The universities are absolutely
02:43:06.160 | the real villains in the picture.
02:43:07.960 | Thankfully, they're also the least prepared
02:43:10.160 | to be aggressed upon.
02:43:12.120 | And after the government and the corporate press,
02:43:15.400 | they are the last leg of the stool,
02:43:17.280 | and they don't know what's coming,
02:43:18.440 | and it's gonna get ugly, and I cannot wait.
02:43:21.360 | - So this is where you and I disagree.
02:43:23.080 | One, we disagree in a sense
02:43:25.720 | that you want to dismantle broken institutions.
02:43:30.720 | - I don't think they're broken.
02:43:31.600 | I think they're working by design.
02:43:33.280 | I think for over 100 years,
02:43:34.920 | they have been talking about bringing
02:43:36.800 | the next generation of American leaders,
02:43:38.660 | which is code for promulgating an ideology
02:43:42.320 | based on egalitarian principles and world domination.
02:43:46.700 | - Let me try to express my lived experience.
02:43:50.160 | - Okay, sure.
02:43:51.000 | - My experience at MIT is that
02:43:55.840 | there's a bunch of administrators
02:43:58.200 | that are the bureaucracy that,
02:44:00.820 | I can say, this is the nice thing
02:44:03.960 | about having a podcast, I don't give a damn,
02:44:06.080 | is they're pretty useless.
02:44:07.560 | In fact, they get in the way.
02:44:09.360 | But there's faculty, there's professors
02:44:12.480 | that are incredible.
02:44:15.120 | They're incredible human beings
02:44:16.560 | that all they do all day, they're too busy,
02:44:20.600 | but for the most part, what they do all day
02:44:23.660 | is just continually pursue different little trajectories
02:44:28.560 | of curiosities in the various avenues of science
02:44:31.720 | that they work on.
02:44:33.080 | And as a side effect of that,
02:44:36.480 | they mentor a group of students,
02:44:39.280 | sometimes a large group of students,
02:44:40.720 | and also teach courses.
02:44:42.320 | And they're constantly sharing their passion with others.
02:44:45.920 | And my experience is it's just a bunch of people
02:44:49.240 | who are curious about engineering and math and science,
02:44:53.380 | chemistry, artificial intelligence, computer science,
02:44:55.720 | what I'm most familiar with.
02:44:57.720 | And there's never this feeling of MIT being broken somehow,
02:45:02.480 | like this kind of feeling,
02:45:04.680 | like if I talk to you just now, or like Eric Weinstein,
02:45:08.280 | there's a feeling like stuff is on fire.
02:45:11.280 | Right, there's something deeply broken.
02:45:13.880 | But when I'm in the system,
02:45:16.200 | especially before the COVID, before this kind of tension,
02:45:20.920 | everything was great.
02:45:22.640 | There was no discussion of even diversity,
02:45:25.320 | all that kind of stuff, the toxic stuff
02:45:28.040 | that we might be talking about right now,
02:45:30.000 | none of that was happening.
02:45:30.920 | It was a bunch of people just in love
02:45:32.600 | with cool ideas, exploring ideas,
02:45:35.920 | being curious and learning and all that kind of stuff.
02:45:38.520 | So I don't, my sense of academia was this is the place
02:45:43.360 | where kids in their 20s, 30s, and 40s
02:45:46.320 | can continue the playground of science and having fun.
02:45:50.680 | It's, if you destroy academia,
02:45:52.920 | if you destroy universities,
02:45:54.080 | like you're suggesting kind of lessening their power,
02:45:58.080 | you take away the playground from these kids to play.
02:46:01.840 | - It's gonna be hard for you to tell me
02:46:03.480 | that I'm anti-playground.
02:46:05.000 | - Yeah, well, I guess I'm saying you're anti
02:46:07.080 | certain kinds of playgrounds, which is--
02:46:08.960 | - Yeah, the ones that have the broken glass on the floor.
02:46:10.800 | Yeah, I am against those kinds of playgrounds.
02:46:13.100 | - No, you're--
02:46:16.160 | - You're-- - Yes.
02:46:18.280 | - Nope, see-- - Yes.
02:46:19.120 | - Yeah. - That you say,
02:46:20.520 | that you listen.
02:46:21.360 | - No, you wait, yeah.
02:46:23.200 | I would say you're being the watchful mother
02:46:26.840 | who the one kid who hurt themselves in the glass--
02:46:29.840 | - One kid, it's an entire generation,
02:46:30.960 | it's generation after generation.
02:46:32.440 | I'm not a watchful mother,
02:46:33.280 | I'm the guy with the flamethrower.
02:46:34.640 | - No, I understand that.
02:46:36.880 | But you're using the one kid who was always kind of weird,
02:46:42.480 | AKA gender studies department,
02:46:45.000 | that hurt themselves in the glass,
02:46:47.340 | as opposed to the people who are obviously having fun
02:46:50.440 | in the playground and not playing by the glass,
02:46:54.720 | the broken glass, and they're just,
02:46:56.820 | I mean, to me, some of the best innovations in science
02:47:00.160 | happen in universities.
02:47:02.080 | - Okay. - You can't forget
02:47:04.000 | that universities don't have this liberal,
02:47:07.680 | like politics literally,
02:47:09.800 | every conversation until this year,
02:47:14.160 | until this year, there's something happening,
02:47:16.520 | but every conversation I've ever had
02:47:18.800 | had nothing to do with politics.
02:47:20.280 | We never, Trump never came up,
02:47:22.240 | none of that ever came up, nothing.
02:47:24.440 | Like all this kind of idea that there's liberal,
02:47:26.880 | all that, that's in the humanities.
02:47:29.840 | - Yeah, but do you think MIT,
02:47:31.520 | Massachusetts Institute of Technology,
02:47:33.480 | might be a little bit of an outlier?
02:47:34.760 | - Yeah, there probably is, yeah.
02:47:36.600 | But I honestly don't think,
02:47:40.080 | when people criticize academia,
02:47:41.480 | they're looking at, they're in fact also picking outliers,
02:47:46.480 | which is they're picking some of the, quote unquote,
02:47:49.480 | strongest gender studies departments.
02:47:51.280 | - This is nonsensical.
02:47:52.120 | When I was at Bucknell, I was a college student,
02:47:55.040 | we had to take, we had a bunch of electives,
02:47:57.600 | and I wanted to take a class
02:47:58.600 | on individual, American individualism.
02:48:01.160 | One of the texts of the five that we had to read
02:48:04.440 | was "Birth of a Nation," the movie about the Klan.
02:48:07.440 | So there's no department where these people
02:48:12.720 | are not thoroughgoing, hardcore ideologues.
02:48:17.400 | This is not a gender-- - That's the humanities,
02:48:18.640 | that's the humanities.
02:48:19.480 | - Fine, all the humanities, not just gender studies.
02:48:21.920 | - Okay, fine, I can give you--
02:48:23.360 | - History, English.
02:48:25.080 | - Yes. - All of them.
02:48:26.200 | Every university, as you know,
02:48:28.280 | has it mandatory in their curriculum,
02:48:31.640 | they have to take a bunch of these propaganda classes.
02:48:34.520 | - I look forward to your YouTube comments
02:48:37.640 | because you're being more eloquent
02:48:39.720 | and you're speaking to the thing
02:48:40.960 | that a lot of people agree with,
02:48:42.400 | and I'm being my usual slow self,
02:48:44.240 | and people are going to say not very nice things about me.
02:48:47.040 | - Don't say anything that nice about Lex, please.
02:48:50.120 | - Let me try to just-- - Just shoot up a school.
02:48:53.000 | That would be preferable.
02:48:54.400 | - There he goes again.
02:48:55.760 | - Only the teachers.
02:48:56.600 | - The darkest possible place.
02:48:58.280 | - Schools? That's sunshine, baby,
02:48:59.720 | that's where everyone goes to be happy, playgrounds.
02:49:01.640 | - There he goes, going dark.
02:49:03.400 | Just dives right in, just goes dark,
02:49:07.480 | and then just comes back up to the surface.
02:49:09.680 | - I don't have to feel this way anymore.
02:49:13.200 | Just one day and I'll feel it.
02:49:15.620 | - You're probably a figment of my imagination,
02:49:19.920 | I'm not even having this broadcast.
02:49:21.280 | - Well, after 18 Red Bulls,
02:49:22.600 | I'm surprised you could see anything.
02:49:24.560 | - This is like Fight Club,
02:49:25.600 | Red Bull gives you delirium.
02:49:27.000 | I got into it with Ed Norton yesterday on Twitter.
02:49:32.360 | - Oh, really?
02:49:33.200 | - Yeah.
02:49:34.040 | - Is he like the rest of the celebrities?
02:49:35.920 | - Yeah, he's like, "Oh, this is an existential threat
02:49:38.000 | "to America, Trump's a fascist,
02:49:39.480 | "he's delegitimizing the Oval Office."
02:49:41.360 | I said, "What an odd endorsement of Trump."
02:49:44.240 | - Well, you should have gone with Brad Pitt,
02:49:45.680 | he might have a different opinion.
02:49:47.000 | - That's true. - Fight Club reference.
02:49:48.560 | - Yeah.
02:49:50.000 | This conversation is over.
02:49:52.600 | - It's interesting, I'd like to draw a line
02:49:54.320 | between science and engineering,
02:49:56.000 | and science not including the biological aspect,
02:50:00.840 | the parts of biology that touch,
02:50:03.600 | and humanities and biology.
02:50:05.080 | I feel because humanities,
02:50:08.040 | if you just look at the percentage of universities,
02:50:10.680 | it's still a minority percentage.
02:50:13.480 | And I would actually draw a different,
02:50:16.240 | I think they serve very different purposes.
02:50:18.320 | - Sure.
02:50:19.140 | - And that's actually a broken part about universities,
02:50:22.320 | about why is some of the best research in the world
02:50:27.400 | done at universities?
02:50:28.440 | There might be a different,
02:50:31.320 | like MIT, it feels weird that a faculty--
02:50:34.520 | - Yeah, these are conceptually different things.
02:50:35.880 | Like we do research and we teach,
02:50:37.160 | why is this the same endeavor?
02:50:38.000 | - Yeah, it feels weird.
02:50:38.920 | But that's just, but I'm also,
02:50:40.840 | I'm coming to the defense of the engineers
02:50:44.680 | that never talk about, I'm not like,
02:50:46.880 | my mind isn't, I'm not deluded or something,
02:50:51.000 | where I'm not seeing the house on fire.
02:50:53.960 | I'm just saying, I am seeing the house
02:50:55.800 | 'cause I also lived in Harvard Square.
02:50:57.360 | I'm seeing Harvard.
02:50:58.720 | But--
02:50:59.560 | - Can you see the tanks coming?
02:51:00.800 | They're coming, Lex.
02:51:01.720 | That's gonna be so beautiful.
02:51:03.720 | It'll be like American Beauty, the plastic bag.
02:51:06.160 | I just won't be able to stop crying
02:51:07.320 | 'cause it'll be so beautiful.
02:51:08.200 | - Yeah. - The tanks.
02:51:09.040 | - I can already see it.
02:51:11.280 | But the engineering departments,
02:51:13.840 | where I believe that the Elon Musks of the world,
02:51:18.140 | that the innovation that will make a better world
02:51:22.080 | is happening, and let's not burn that down
02:51:25.240 | 'cause that has nothing to do with any,
02:51:27.320 | like they're all sitting quietly
02:51:29.360 | while the humanities
02:51:32.760 | and all these kind of diversity programs,
02:51:34.480 | they're not having any of these discussions.
02:51:36.720 | - Listen, my Soviet brother,
02:51:38.280 | we both know that ice water runs in our veins.
02:51:41.480 | So if you're calling for mercy, that is not how I'm wired,
02:51:44.440 | but I'm not closing the door.
02:51:46.520 | - Yeah, I'm actually realizing now,
02:51:48.540 | so for people listening to this,
02:51:50.660 | I'll probably prepend this, saying that
02:51:52.660 | I'm even slower than usual.
02:51:54.940 | I didn't sleep last night,
02:51:56.540 | but I feel I'm actually realizing just how slow I am
02:52:00.820 | and how much preparation I need to do.
02:52:03.100 | And if I would like to defend aspects of academia,
02:52:06.740 | I better come prepared.
02:52:08.500 | - I don't think you need to defend them.
02:52:09.680 | I think I'm granting you your premise freely.
02:52:12.620 | - No, you might be. - Okay.
02:52:14.540 | - I don't think the world is.
02:52:16.300 | - But actually, you just defeat your own argument
02:52:18.400 | 'cause it is not at all have to be the way
02:52:21.520 | that a phenomenal research institution like MIT,
02:52:24.560 | which no one disputes,
02:52:25.920 | has to also be an educational establishment.
02:52:28.720 | These two things are not at all necessarily interconnected.
02:52:32.120 | - But then you have to offer a way to separate them.
02:52:34.600 | - Correct.
02:52:35.440 | - But I'm not a big fan, everybody's different,
02:52:38.920 | but I'm not a fan of criticizing institutions
02:52:41.080 | without offering a way to change.
02:52:43.360 | And especially when I have ability to change,
02:52:46.060 | I'd like to offer a path.
02:52:49.720 | - What if they weren't students, they were all mentor?
02:52:52.320 | What's the opposite of a mentor?
02:52:55.960 | - Mentee.
02:52:56.800 | - Protege?
02:52:57.620 | What's the term when you work at a place?
02:53:00.720 | Interns, not an intern, it's not the one I'm thinking of.
02:53:02.880 | But anyway, basically they're working there
02:53:04.920 | instead of going to college there.
02:53:06.680 | - It's possible, but it's going against tradition.
02:53:08.840 | And so you have to build new institutions.
02:53:10.880 | - You can't have these engineers building new things.
02:53:13.600 | That's crazy.
02:53:15.120 | These research engineers,
02:53:16.460 | where they gonna be building things?
02:53:19.000 | - Well, one of the things, 'cause you're kind of a--
02:53:21.320 | - Apprentice, that's the word I was looking for.
02:53:22.640 | - Apprentice.
02:53:23.480 | - Which is ironic, we're talking about Trump
02:53:24.640 | and we couldn't think of the word apprentice.
02:53:26.800 | - Yeah, well done.
02:53:28.280 | We should both be fired.
02:53:29.120 | - You're fired.
02:53:29.960 | - Yeah, there you go.
02:53:30.920 | These Russian Jews, so quick with their wit.
02:53:34.520 | - Okay.
02:53:35.360 | - But the thing is you're a fan of freedom.
02:53:37.720 | - I am.
02:53:39.000 | - And there is intellectual freedom.
02:53:43.280 | This is what I was trying to articulate.
02:53:45.200 | I'm failing to articulate,
02:53:47.120 | but there truly is complete intellectual freedom
02:53:50.280 | within universities on topics of science and engineering.
02:53:55.280 | - I believe you.
02:53:57.200 | I agree with you.
02:53:58.040 | I don't think it's gonna take much persuasion,
02:53:59.560 | but I'll give you an example.
02:54:01.000 | I'm sure you know more details about this than I do.
02:54:05.560 | When that scientist engineered that probe
02:54:09.600 | to land on that comet,
02:54:11.760 | and the articles were written
02:54:13.000 | 'cause this Hawaiian shirt he was wearing
02:54:15.240 | had pinup girls on it,
02:54:16.360 | which I think a female student
02:54:17.840 | sewed for him or something, or his ex-girlfriend.
02:54:19.840 | And he had to apologize.
02:54:21.600 | This is what Rand was talking about.
02:54:23.600 | That the great accomplishments of men
02:54:26.440 | have to say I'm sorry to the lowest,
02:54:29.280 | most despicable, disgusting people.
02:54:31.240 | - Yeah, I don't know.
02:54:34.240 | Let me bring this case up 'cause I think about this.
02:54:36.880 | This might not mean much to you,
02:54:39.040 | but it means a lot to a certain aspect
02:54:41.120 | of the computer science community.
02:54:42.560 | There's a guy named Richard Stallman.
02:54:44.760 | I don't know if you know who that is.
02:54:46.960 | He's the founder of the Free Software Foundation.
02:54:51.280 | He's a big Linux, he's one of the key people
02:54:53.360 | in the history of computer science,
02:54:54.920 | one of those open source people.
02:54:56.800 | But he is, I believe, he's one of the hardcore ones
02:55:00.280 | which is like all software should be free.
02:55:03.200 | Okay, so very interesting personality,
02:55:05.360 | very key person in the GNU,
02:55:07.240 | just like Linus Torvalds, key person.
02:55:10.880 | But he also kind of speaks his mind.
02:55:13.520 | And on a certain chain of conversations at MIT
02:55:18.520 | that was leaked to the New York Times,
02:55:22.280 | then was published, led him to be fired
02:55:26.460 | or pushed out of MIT recently, maybe a year ago.
02:55:30.680 | And it always sat weird with me.
02:55:32.720 | So what happened is there's a few undergraduate students
02:55:38.840 | that called Marvin Minsky,
02:55:41.240 | not sure if you're familiar with who that is.
02:55:42.720 | - I've heard the name.
02:55:43.540 | - He's one of the seminal people in artificial intelligence.
02:55:46.440 | They said that they called him a rapist
02:55:49.960 | because he met with Jeffrey Epstein.
02:55:53.200 | And Jeffrey Epstein solicited,
02:55:56.880 | these are the best facts known to me that I'm aware of,
02:56:00.700 | that's what was stated on the chain,
02:56:02.840 | is he solicited a 17, but it might've been
02:56:05.360 | an 18-year-old girl to come up to Marvin Minsky
02:56:09.200 | and ask him if he wanted to have sex with her.
02:56:12.300 | So Jeffrey Epstein told the girl.
02:56:15.880 | She came up to Marvin Minsky,
02:56:18.200 | who was at that time, I think, in his seventies.
02:56:20.720 | And his wife was there too, Marvin Minsky's wife.
02:56:23.440 | And he said no, or like, you know, awkwardly saying.
02:56:27.120 | - No, thank you.
02:56:27.960 | - No, thanks.
02:56:29.440 | And that was stated in the email thread
02:56:35.440 | as Marvin participating in sexual assault
02:56:40.440 | and rape of this, unwilling sexual assault.
02:56:44.560 | And it was called rape of this person, right?
02:56:48.720 | Of this woman that propositioned him.
02:56:51.920 | And then Richard Stallman, he's kind of known for this.
02:56:56.100 | You make fun of me being a robot,
02:56:59.280 | but he's kind of like a debugger.
02:57:01.200 | He's like, well, that sentence is not,
02:57:03.160 | what you said is not correct.
02:57:05.320 | So he like corrected the person,
02:57:07.300 | basically made it seem like the use of the word rape
02:57:13.600 | is not correct, 'cause that's not the definition of rape.
02:57:16.760 | And then he was attacked for saying,
02:57:19.160 | oh, now you're playing with definitions of rape.
02:57:21.140 | Rape is rape, is the answer, right?
02:57:23.800 | And then that was leaked in him defending,
02:57:27.240 | so the way it was leaked, it was reported
02:57:30.680 | as him defending rape.
02:57:35.320 | That's the way it was reported.
02:57:36.960 | And he was pushed out and he didn't really give a damn.
02:57:40.560 | He doesn't seem to make a big deal out of it.
02:57:44.440 | He just left. - But they made
02:57:45.280 | an example of him.
02:57:46.520 | - They made an example.
02:57:47.480 | And that everyone was afraid to defend him.
02:57:51.720 | So like there's a bunch of faculty, one--
02:57:53.520 | - Dude, you're from the Soviet Union.
02:57:54.840 | Doesn't this hit close to home for you?
02:57:57.200 | - I don't know what to think of it.
02:57:59.600 | It hits close to home, but it was basically,
02:58:03.120 | at least at MIT, now MIT is such a light place with this.
02:58:06.700 | It's not common at MIT, but it was like 18, 19 year old kids,
02:58:11.120 | undergraduate kids with this kind of fire in them.
02:58:14.400 | There's just very few of them,
02:58:15.960 | but they're the ones that raise all this kind of fuss.
02:58:19.180 | And the entirety of the administration,
02:58:23.100 | all the faculty are afraid to stand up to them.
02:58:25.760 | It's so interesting to me.
02:58:28.360 | Like, I don't know if I should be afraid of that.
02:58:31.880 | - You don't think you should be afraid
02:58:33.000 | that someone who's trying to be specific
02:58:34.920 | when it comes to charges of violent assault
02:58:37.280 | is looking for that clarity,
02:58:38.640 | can get their life out of the search--
02:58:40.680 | - Let me give you more context.
02:58:42.440 | There's a little bit more context to Richard Stallman,
02:58:46.040 | which is-- - He was also a rapist.
02:58:47.840 | (both laughing)
02:58:48.920 | - No. - I left out that part.
02:58:50.040 | He liked raping people.
02:58:51.160 | - But he had a history through his life
02:58:55.920 | of every once in a while wearing the Hawaiian shirt.
02:58:59.000 | He's a fat, sorry, but he's a fat, unattractive,
02:59:06.400 | like what Trump referred to, the hacker--
02:59:09.120 | - Yeah, the 700 pound guy in the basement.
02:59:11.240 | - That's Richard.
02:59:12.080 | He is what he is.
02:59:15.600 | He would eat his own,
02:59:18.560 | he would pick skin from his feet in lectures
02:59:20.800 | and just eat it.
02:59:21.640 | - No. - Okay, yeah.
02:59:23.000 | There's videos of him doing that.
02:59:24.680 | - If you're not joking, he must really be high
02:59:25.880 | on the spectrum then.
02:59:26.920 | - Yeah. - Okay.
02:59:27.760 | - Oh yeah, yeah.
02:59:28.600 | I think in his office door,
02:59:36.600 | he wrote something like,
02:59:38.040 | hacker plus lover of ladies or something like that.
02:59:46.120 | Something kinda-- - Yeah, yeah, yeah.
02:59:48.920 | - So-- - Unprofessional.
02:59:50.160 | - Yeah, unprofessional and a little creepy.
02:59:51.960 | - Yeah, yeah, no, that's fair.
02:59:53.320 | - So he was also--
02:59:54.440 | - So they were looking for an excuse to get rid of him,
02:59:56.600 | it sounds like.
02:59:58.040 | - No, he was just, who's they?
03:00:00.720 | - The administration.
03:00:01.800 | - Yeah, probably, probably.
03:00:04.080 | - A lot of times what people don't realize,
03:00:06.040 | and this would be my defense of cancel culture,
03:00:08.600 | a lot of times when someone gets fired
03:00:10.160 | over something like this, this isn't why.
03:00:12.280 | This is just giving them cover to get rid of them
03:00:14.560 | without getting a lawsuit.
03:00:15.680 | - Yeah, but it's still--
03:00:18.160 | - Gross.
03:00:19.000 | - So I think, I guess what I'm trying to communicate
03:00:21.520 | is he was a little weird and creepy
03:00:23.560 | and he may not be the best for the community,
03:00:27.400 | but that's not necessarily the message
03:00:30.080 | it sent to the rest of the community.
03:00:31.800 | The message sent to the rest of the community
03:00:34.360 | that being clear about words or the usage of the word rape
03:00:39.200 | is like, you should call everything rape.
03:00:41.960 | That's basically the message that was sent.
03:00:44.520 | - Or you should call that, we say rape, rape.
03:00:46.920 | It's about submission.
03:00:48.400 | I think you'd be very happy to know
03:00:50.600 | that there's a lot of people,
03:00:52.120 | and she's been very much crucified for this, like Betsy DeVos,
03:00:54.440 | the head of the Department of Education,
03:00:56.120 | who are aware of this.
03:00:58.320 | They are aware that this completely contradicts
03:00:59.920 | due process.
03:01:01.560 | They're aware of how a rape accusation
03:01:03.800 | is something not to be taken lightly,
03:01:05.600 | but because it's not to be taken lightly,
03:01:07.200 | it has to be also taken seriously in other contexts
03:01:09.520 | that once that word is around a male,
03:01:12.120 | this can ruin his entire life.
03:01:13.760 | - That's the sticky thing of the word.
03:01:17.440 | Like I think about this a lot that,
03:01:22.440 | like how would I defend it if somebody,
03:01:26.560 | like I've never, I can honestly say
03:01:28.280 | I've never done anything close to creepy in my life,
03:01:32.520 | like with women.
03:01:34.880 | - But you wouldn't know it if you had, right?
03:01:36.520 | That's the thing.
03:01:37.360 | A lot of these creepy guys don't think they're creepy.
03:01:39.160 | They think they're being cute.
03:01:40.960 | - Yeah, but I'm just telling you, even like, fine.
03:01:44.760 | Let's say, right, let's say I'm not aware of it,
03:01:47.280 | but the point that I am aware of
03:01:50.120 | is that somebody could just completely make something up.
03:01:52.600 | - Correct, yeah, yeah, yeah, yeah, yeah, okay.
03:01:54.240 | - And like, what would I--
03:01:55.920 | - No, he denied the charges.
03:01:57.400 | There's an article around everything he did, supposedly,
03:01:59.600 | and it goes, "Mr. Fridman denied the charges," yeah.
03:02:02.640 | - But what creeps me out--
03:02:04.360 | - That happened, can I interrupt?
03:02:05.800 | Zora Neale Hurston's one of my favorite writers.
03:02:07.440 | She's from the Harlem Renaissance.
03:02:09.800 | She wrote "Their Eyes Were Watching God,"
03:02:11.480 | a couple of other books.
03:02:12.480 | She was just an amazing, amazing figure.
03:02:14.360 | Her biography's called "Wrapped in Rainbows."
03:02:16.720 | It's just a masterpiece.
03:02:17.640 | I think I read it one day.
03:02:18.840 | Can't recommend her enough.
03:02:19.840 | Fascinating, fascinating woman.
03:02:21.680 | During the '30s, I think it was, or 1940,
03:02:24.840 | she was out of the country.
03:02:26.360 | She was accused of molesting a teenage boy.
03:02:29.880 | She wasn't in America.
03:02:31.920 | This could be proven.
03:02:33.240 | So it's absolutely false, not even a question.
03:02:36.120 | She was indicted, and she wanted to kill herself
03:02:39.960 | because she's like, people are gonna see these things,
03:02:43.840 | and they're gonna think maybe there's some truth to it.
03:02:45.520 | Maybe it was voluntary.
03:02:46.680 | And you could understand why she'd be suicidal over this.
03:02:51.000 | So yeah, this is something that's been going on
03:02:52.960 | for a long time, and the fact that it's becoming,
03:02:55.600 | I do agree it's important.
03:02:57.240 | I know a lot of women who have been sexually assaulted,
03:02:59.640 | more than I am happy that I know.
03:03:01.600 | And if I know that many, that means there's more.
03:03:03.840 | So I think it's a good idea that they feel seen,
03:03:08.560 | that they don't feel wounded, they don't feel damaged,
03:03:10.560 | that they could talk to their friends.
03:03:12.080 | And I'm like, man, this sucks this is happening to you.
03:03:13.960 | And I don't think you're a slut.
03:03:15.760 | I don't think you're asking for it.
03:03:16.880 | I think you feel violated, I think it's gross.
03:03:19.240 | Talk to me, I do think that that's important.
03:03:21.960 | And I also think it's important though,
03:03:23.640 | like when things get kind of in a frenzy,
03:03:26.600 | that a lot of people are like,
03:03:27.880 | yeah, I also had something happen.
03:03:29.560 | And very quickly the line between he grabbed my boob
03:03:33.240 | and he violently raped me,
03:03:34.960 | I don't think these two things are the same at all.
03:03:37.120 | I think they're both sexual assault.
03:03:39.040 | But in terms of what someone can deal with the next day,
03:03:41.880 | the next month, 10 years later,
03:03:43.680 | I don't think they're similar scenarios.
03:03:45.960 | I had Juanita Broaddrick on my show
03:03:49.280 | and hearing her talk about her alleged rape by Bill Clinton
03:03:53.880 | was very disturbing for me, very disturbing to hear
03:03:57.000 | because it was like half an hour.
03:03:58.840 | So, when we think of these things and think,
03:04:00.360 | okay, hold her down, blah, blah, blah,
03:04:02.000 | and then it's done, half an hour when,
03:04:03.680 | just even someone physically holding you down
03:04:05.520 | for half an hour, like not even a sexual assault.
03:04:08.320 | Like that's traumatic.
03:04:10.000 | You think, your brain's gonna think, am I gonna die?
03:04:12.480 | When I zoom out, I think that ultimately
03:04:16.000 | this is gonna lead to a better world.
03:04:18.720 | Like empowering women to speak
03:04:21.440 | to those kinds of experiences,
03:04:23.580 | the benefit of it outweighs the--
03:04:27.480 | - The issue is whenever people are given a weapon,
03:04:30.280 | some are going to use it in nefarious ways.
03:04:33.280 | And that's the lesson of history.
03:04:34.560 | Males, females, whites, blacks, children, adults.
03:04:38.600 | When people are given a mechanism
03:04:40.200 | to execute power over others, some are gonna use it.
03:04:43.480 | - Can I ask you for a therapy thing?
03:04:46.480 | - Sure.
03:04:47.320 | - On trolling, in a sense.
03:04:50.440 | 'Cause I mentioned somebody making up something about me.
03:04:55.080 | I feel, 'cause I wear my heart on my sleeve,
03:04:57.440 | I'm not good with these attacks.
03:05:01.960 | Like I've been attacked recently,
03:05:03.800 | just being called a fraud and all that kind of stuff.
03:05:06.400 | Just light stuff.
03:05:07.320 | Like I haven't, you know, it was like, it hurt.
03:05:11.880 | - Okay, well, let me help you.
03:05:13.480 | Maybe it's 'cause I'm a New Yorker.
03:05:15.840 | No, I'm serious, here's why.
03:05:17.840 | In New York, a lot of times you'll be walking
03:05:21.160 | with your friend and a homeless person will come up to you
03:05:24.680 | and start yelling things at you.
03:05:26.720 | Your reaction isn't in those circumstances.
03:05:29.740 | Let me hear this out.
03:05:31.360 | Your reaction is physical safety and getting away.
03:05:35.040 | Now, it's not impossible that that homeless person
03:05:38.520 | is actually saying the truth.
03:05:39.880 | This happened to a friend of mine.
03:05:42.040 | This guy wasn't homeless and he's walking down the street
03:05:45.880 | on Smith Street and he's just talking out loud
03:05:48.640 | and he goes, "Why they call them hipsters?
03:05:50.560 | "What are they hip to?"
03:05:52.040 | And she chuckles and he goes,
03:05:53.820 | "What are you laughing at, fatso?
03:05:55.100 | "You start something, I'll finish it."
03:05:57.240 | And she just couldn't move.
03:05:59.040 | And it's like, is my weight a problem?
03:06:02.480 | 'Cause that's the first thing he went to.
03:06:04.380 | And I don't know that I have any advice,
03:06:08.040 | but when you hear something like this,
03:06:11.040 | I think you need to be better in terms of boundaries.
03:06:13.620 | I think you should not perceive this as a fellow human,
03:06:16.260 | but as a crazy homeless person,
03:06:18.800 | because if this fellow human,
03:06:21.520 | if I thought that you were a fraud in some context,
03:06:24.680 | that's a very weird word to use,
03:06:25.920 | 'cause fraudulent podcaster, these are real mics.
03:06:29.080 | But if I thought-- - Well, scientist or human.
03:06:31.960 | - Sure, but I would ask myself,
03:06:33.760 | is this person in a position to make this judgment,
03:06:36.440 | or are they backing it up?
03:06:37.920 | Are they saying, "Here, your conclusions were wrong.
03:06:40.600 | "Here's some mistakes in your data,
03:06:42.560 | "and you can engage with them on ideas."
03:06:44.720 | But whenever someone uses a word
03:06:46.200 | to entirely dismiss your life
03:06:48.040 | without having the knowledge of your life,
03:06:50.680 | you do not have to take that seriously.
03:06:53.680 | - I appreciate that kind of idea,
03:06:55.300 | but some things aren't about data.
03:06:58.360 | Like, I see myself as a fraud often,
03:07:02.120 | and it's more psychology of it.
03:07:05.520 | If I can reduce something to reason,
03:07:09.160 | I can probably be fine.
03:07:11.200 | My worry is the same as the worry of teenage girls
03:07:14.640 | that get bullied online.
03:07:16.280 | It's like, when I'm being open and fragile on the internet,
03:07:19.240 | it affects me in a way where I can't,
03:07:22.480 | the reason doesn't help.
03:07:24.040 | So it helps me, but-- - You don't block people enough.
03:07:26.240 | I'm very heavy with the blocking.
03:07:27.480 | - No, so yeah, I block-- - Very heavy.
03:07:30.120 | - I block, it's helped a lot. - Any aggressive banality,
03:07:32.880 | I block immediately.
03:07:34.440 | I also think time is gonna help.
03:07:36.400 | I don't think you're,
03:07:37.560 | like, you didn't grow up wanting to be a podcaster, right?
03:07:40.260 | That wasn't your aspiration.
03:07:41.740 | So in some sense, you are gonna feel like a fraud,
03:07:43.680 | 'cause you're like, "I don't have any training for this.
03:07:45.080 | "I have training for a scientist.
03:07:46.400 | "I can talk to you about artificial intelligence
03:07:47.960 | "for literally hours, but in terms of this,
03:07:49.900 | "I don't know what I'm doing, I'm kind of..."
03:07:51.680 | So when they call you a fake, it's like,
03:07:54.240 | "Yeah, you're kind of right,
03:07:55.080 | "'cause I did kind of stumble into this,
03:07:57.880 | "and this is not my pedigree."
03:07:59.760 | So I think that kind of probably speaks to you
03:08:02.000 | on some level.
03:08:02.840 | - Well, but they're attacking not the podcast, I think,
03:08:05.280 | but more like the same,
03:08:07.280 | people call Elon Musk a fraud too,
03:08:08.880 | which that's the way I rationalize it.
03:08:11.160 | Like, well, if they're calling him a fraud,
03:08:14.000 | and they're calling me a fraud,
03:08:15.740 | like, even if you have rockets that go into,
03:08:20.440 | like, if you successfully have rockets
03:08:22.240 | landing back on Earth, reusable rockets,
03:08:26.320 | you're still being called a fraud, then it's okay.
03:08:29.800 | - Not necessarily, it could be that he's not a fraud,
03:08:32.560 | and you really are.
03:08:33.500 | (Lex laughing)
03:08:35.640 | But it's not resonating with you,
03:08:36.960 | 'cause your brain knows the logic,
03:08:38.160 | so you can't trick yourself.
03:08:39.280 | - But, yeah, yeah.
03:08:42.640 | But I don't know, this whole trolling thing,
03:08:44.940 | you seem to be much better at seeing it as a game.
03:08:49.940 | - You know why?
03:08:51.800 | 'Cause you are under the delusion
03:08:54.400 | that every human being is capable
03:08:56.200 | of intelligent reasoned decision.
03:08:57.920 | - Still think I'm right.
03:08:58.760 | - And I perceive them as literally animals,
03:09:01.480 | so when a dog starts barking,
03:09:03.780 | all it's saying is that the dog is agitated,
03:09:05.840 | and this is not going to change my life one iota
03:09:08.240 | other than crossing the street, perhaps.
03:09:09.960 | - Yeah, I'm gonna prove you wrong one day.
03:09:12.680 | - If you're gonna kill yourself,
03:09:13.920 | 'cause they can drive you to it.
03:09:15.520 | Or first shoot up a school.
03:09:17.520 | - But if I don't, I'll prove you wrong.
03:09:19.360 | I'll bring the data.
03:09:20.640 | And they'd be like, "You're right, Lex."
03:09:22.320 | - I have the receipts.
03:09:23.160 | - I have the receipts.
03:09:25.000 | Okay, so when we mentioned Camus.
03:09:27.680 | - Oh yeah, I love him.
03:09:29.240 | - Is there, this is a question that people love when I ask.
03:09:34.240 | I have really smart people.
03:09:37.360 | Oh, it is love.
03:09:38.380 | (laughing)
03:09:39.640 | No, what books, let's say three books,
03:09:44.640 | if you can think of them, technical, fiction, philosophical,
03:09:49.640 | would you, had a big impact on you,
03:09:52.040 | or would you recommend to others?
03:09:54.400 | - The Machiavellians by James Burnham.
03:09:56.960 | This is a book about how politics works in reality
03:09:59.840 | as opposed to how people imagine it working.
03:10:02.500 | Mencius Moldbug, who's a figure in these circles,
03:10:05.680 | who's respected by a lot of people.
03:10:07.480 | I was giving a talk and there was a bunch of panelists
03:10:10.440 | and we were asked, "What book would you recommend?"
03:10:12.640 | I said, "The Machiavellians."
03:10:14.080 | Independently of me, that was the book he had recommended.
03:10:17.120 | It's out of print, it's hard to find,
03:10:18.600 | but that would be one.
03:10:19.880 | - Is that his book or no?
03:10:20.920 | - James Burnham, it came out in 1941, I think.
03:10:23.640 | - So, can you pause on the Mencius, what's his--
03:10:26.760 | - Mencius Moldbug.
03:10:28.200 | - That's a code name, right?
03:10:29.720 | That guy's-- - Pen name.
03:10:30.560 | - That guy's pen name. - Curtis Yarvin.
03:10:32.160 | It's his real name.
03:10:33.440 | He swims in your circles.
03:10:35.640 | - Which circles? - He does some kind
03:10:36.480 | of programming. - Oh, he's originally
03:10:38.280 | a programmer.
03:10:39.400 | He comes up as a person that I should talk with
03:10:43.720 | or I should know about, but then I read a few of his things
03:10:46.800 | and they seem quite dangerous.
03:10:49.400 | They're very long and verbose,
03:10:50.800 | but I think he's an amazing thinker.
03:10:53.160 | - Yeah, but-- - But he's the one
03:10:54.000 | who had the idea of sending the tanks to Harvard Yard.
03:10:56.480 | - But doesn't he have like,
03:10:58.080 | he has some radical views.
03:11:01.320 | I forget what they are.
03:11:02.320 | - Very radical views.
03:11:03.200 | Yeah, he wants the military to coup.
03:11:05.000 | - But you're saying he's a serious thinker
03:11:08.440 | that is worthy of, not worthy.
03:11:11.840 | - I don't know that you would enjoy
03:11:13.240 | having a conversation with him.
03:11:14.360 | I think a lot of people enjoy seeing it happen,
03:11:16.160 | but I think it would be a lot of talking past each other
03:11:18.240 | and it would be interesting.
03:11:20.680 | - Would you agree-- - I did a stream
03:11:21.880 | with him, you can watch. - And would you disagree?
03:11:22.920 | Okay, what do you agree, what do you disagree?
03:11:25.400 | - I agree with him that politics has to be looked at
03:11:28.880 | objectively and without kind of an emotional connection
03:11:33.200 | to different schools.
03:11:34.600 | I talk about him a lot in my book on the New Right.
03:11:37.080 | Disagree, I don't think a military coup is a good idea.
03:11:42.160 | He doesn't think anarchism is stable, I disagree.
03:11:47.080 | I mean, me and him, I did a live stream with him,
03:11:49.160 | we just dorked out a lot about history
03:11:50.440 | and people who've fallen in the memory hole.
03:11:52.800 | So, I mean, he's got a lot of writing.
03:11:56.200 | - So, you know, the sense I got from him
03:11:58.960 | was that if I talk with him,
03:12:01.600 | a lot of people would be upset with me
03:12:03.720 | for giving him a platform.
03:12:05.200 | - Yeah, I think he's on that edge
03:12:07.200 | where they want to read him out of what is
03:12:10.280 | acceptable discourse.
03:12:11.520 | - What's his most controversial,
03:12:13.440 | I mean, you keep mentioning the tanks,
03:12:15.200 | is that the most controversial viewpoint?
03:12:17.120 | Does he have a race thing?
03:12:18.360 | - No, the alt-right doesn't particularly like him
03:08:21.440 | in many ways because he's not big on the race thing.
03:12:24.120 | I don't know what would be his most controversial view,
03:12:27.560 | to be honest.
03:12:28.600 | I think because he is radical in terms of his analysis
03:12:33.240 | of culture, anytime someone's a radical,
03:12:34.960 | that is dangerous.
03:12:35.800 | - Yeah, it's dangerous.
03:12:36.960 | Okay, book, so that's one.
03:12:39.120 | - The Fountainhead.
03:12:39.960 | - The Machiavellians.
03:12:40.800 | - The Fountainhead, which is, I would say--
03:12:43.400 | - Not Atlas Shrugged?
03:12:44.320 | - No, and if you read Atlas Shrugged
03:12:46.160 | before reading The Fountainhead,
03:12:47.120 | you're doing yourself an enormous disservice.
03:12:48.640 | Don't you dare do it.
03:12:49.960 | - On the philosophical, because the novel--
03:12:52.040 | - On every level.
03:12:53.080 | Fountainhead's a better novel.
03:12:54.560 | Fountainhead's superfluous if you read Atlas Shrugged first.
03:12:57.120 | Fountainhead's about psychology and ethics.
03:13:00.440 | It does not have to do with her politics
03:13:02.240 | other than its implications.
03:13:03.800 | So it's by far the superior book.
03:13:05.960 | The third one, ooh, this is a good question.
03:13:09.480 | Let me see.
03:13:10.480 | There's so many good books out there that I love.
03:13:13.160 | I'm going to, this is not really my third choice,
03:13:16.320 | but I'll throw it out there
03:13:17.200 | because this is such an important worldview,
03:13:21.160 | especially for people on the right.
03:13:22.800 | - Are you virtue signaling?
03:13:24.160 | - No, this is counter signaling.
03:13:25.760 | Thaddeus Russell's book,
03:13:27.840 | A Renegade History of the United States.
03:13:29.640 | His thesis is that it's the degenerates
03:13:33.240 | that give us all freedom.
03:13:35.060 | And things like prostitutes, things like madams,
03:13:38.560 | things like slaves, things like immigrants,
03:13:41.560 | because they were so low status,
03:13:44.400 | they could get away with things
03:13:46.000 | that then people who are higher status demanded
03:13:47.960 | and so on and so forth.
03:13:48.920 | So I think that thesis,
03:13:50.520 | and it really has extreme consequences in thinking.
03:13:55.420 | And no, Jonathan Haidt, The Righteous Mind.
03:13:59.920 | That's, those are the four.
03:14:01.320 | - Is that his best?
03:14:03.560 | I haven't read any of his stuff.
03:14:04.680 | - The Righteous Mind is the only one you want.
03:14:06.080 | - Okay.
03:14:06.920 | That was four, but of course.
03:14:10.360 | Forget Thaddeus Russell, we'll put Haidt in there.
03:14:12.560 | - Of course you would.
03:14:14.080 | - No, forget Thaddeus, those are the three.
03:14:16.240 | - So we talked about love.
03:14:18.840 | Let me ask you the other question I'm obsessed with.
03:14:21.740 | Are you, do you ponder your own mortality?
03:14:26.280 | - I do, a lot, especially now that I'm an uncle,
03:14:29.800 | especially now that I have like these younger people
03:14:31.860 | that I mentor.
03:14:33.640 | I was just yesterday, my friend John Gerges,
03:14:36.080 | who did my theme song for my podcast,
03:14:37.640 | who did the book cover for Dear Reader,
03:14:40.880 | who's like the most talented person I know.
03:14:42.920 | His song came on the iPod at the gym,
03:14:45.440 | and I almost messaged him, I go,
03:14:46.720 | you know, one day one of us is gonna bury the other,
03:14:48.840 | and it's gonna be really sad.
03:14:50.360 | And I thought about that, and it was kind of just like,
03:14:52.040 | oh man, that's really gonna suck.
03:14:53.960 | And you know, I don't know which scenario would be better.
03:14:56.360 | Like I will be very sad if he's gone.
03:14:58.600 | I'm sure he'll be very sad if I'm gone.
03:15:00.560 | - I mean, what do you, are you afraid of it?
03:15:03.400 | - No, you know, Rand had this quote
03:15:07.000 | about how I won't die, the world will end.
03:15:13.560 | So I've had enough experiences that,
03:15:16.960 | really, at this point,
03:15:18.800 | everything's icing on the cake.
03:15:18.800 | - So if I were to kill you at the end of this podcast,
03:15:22.720 | it would feel painless, that would be okay?
03:15:25.240 | - Yeah, you know why?
03:15:26.600 | - Does anyone know you're here by the way?
03:15:28.280 | (both laughing)
03:15:29.920 | - You know why?
03:15:30.760 | - Just asking for a friend.
03:15:32.120 | - Here's why.
03:15:33.240 | There's that wit.
03:15:34.600 | Save that for Twitter, Lex.
03:15:37.440 | (both laughing)
03:15:37.440 | Do they call you Sasha?
03:15:38.960 | - No, I'm Lyosha.
03:15:40.280 | - Lyosha, oh, that's my sister's husband, okay.
03:15:42.440 | So here's why.
03:15:45.560 | I strongly believe,
03:15:47.240 | and this is a very kind of Jewish perspective,
03:15:49.520 | that you just have to leave the world
03:15:51.000 | a little bit better than you found it.
03:15:52.360 | That all you could do is move the needle a little.
03:15:54.400 | And one of the things I set out to do
03:15:56.520 | with "Dear Reader," my book on North Korea,
03:15:58.880 | I said, I was at a point in my career
03:16:00.560 | where I could do something to make a difference
03:16:02.080 | instead of just writing like co-authoring books
03:16:03.960 | for celebrities, which I'm very proud of,
03:16:05.360 | but are neither here nor there.
03:16:07.520 | And I thought, all right, I know how to tell stories.
03:16:09.800 | I know how to inform people.
03:16:10.800 | I know how to entertain people.
03:16:12.160 | If I move the needle in America, who cares?
03:16:14.440 | We got it really good here.
03:16:15.760 | If I move the needle in North Korea a little bit,
03:16:18.360 | the cost benefits through the roof.
03:16:20.280 | - I never thought of that, actually.
03:16:22.400 | I never thought of "Dear Reader" from that perspective.
03:16:24.800 | - So when I set out to write it, I'm like, okay,
03:16:28.000 | what can I do?
03:16:28.840 | I'm not gonna be able to liberate the North Korean regime.
03:16:30.720 | What I can do is the camera right now
03:16:33.920 | is focused on at the time Kim Jong-il, now Kim Jong-un.
03:16:37.040 | And I can do just this, just this a little bit.
03:16:39.000 | And I go, behind that guy who you think is funny clown,
03:16:42.720 | there's millions of dead people.
03:16:44.520 | There's children being starved.
03:16:46.240 | There's people who are performing
03:16:47.880 | 'cause they have a gun to their kid's head.
03:16:49.640 | And if someone put a gun to your kid's head,
03:16:51.080 | you'd put on those dancing shoes real quick.
03:16:53.200 | And I and others have managed to change the conversation
03:16:57.440 | about North Korea in terms of,
03:16:59.160 | from "look at those silly buffoons" to "those poor people."
03:17:02.920 | So the fact that that little thing
03:17:05.000 | I can say with a straight face, I did,
03:17:07.720 | doesn't make me a great person,
03:17:09.400 | but it does make me someone who,
03:17:11.040 | if I have to go tomorrow, I can say I did a little bit
03:17:15.600 | to make the world a better place.
03:17:17.800 | - What do you think is the meaning of life?
03:17:20.200 | - I think the meaning of life is...
03:17:23.840 | - Why are we here?
03:17:27.600 | - Oh, well, I'm a Camus person.
03:17:29.560 | So I'll give the Camus answer.
03:17:31.320 | So there's two types of people.
03:17:33.720 | Those who know how to use binary, no.
03:17:35.520 | (both laughing)
03:17:38.800 | - Thanks for relating to the audience.
03:17:40.720 | - 111001, two, two.
03:17:43.600 | (both laughing)
03:17:45.280 | Down vote.
03:17:46.120 | (both laughing)
03:17:47.520 | What kind of radical freak is this, Lex?
03:17:50.080 | So, and I use this example in my forthcoming book.
03:17:53.340 | You go into a countryside, a mountainside,
03:17:56.040 | and you see a blank canvas on an easel.
03:17:59.000 | And one kind of mentality goes,
03:18:00.600 | this is, it's just a blank canvas.
03:18:02.840 | This is stupid.
03:18:03.680 | This is, what am I looking at?
03:18:05.000 | And the other type goes, what a great opportunity.
03:18:08.280 | I'm in this beautiful space.
03:18:09.760 | I have this entire canvas to paint.
03:18:11.440 | I could do anything I want with it.
03:18:13.280 | So I am very much of that type two person.
03:18:16.640 | And I hope others start to think of life in that way.
03:18:21.640 | You and I have both been more successful
03:18:23.720 | than we expected to, especially growing up,
03:18:25.880 | and in ways we did not expect.
03:18:28.640 | And when you're young,
03:18:30.000 | you are so intent on driving the car.
03:18:33.000 | And after a certain point,
03:18:34.200 | you realize it's not about driving the car,
03:18:35.920 | you're being a surfer.
03:18:37.280 | That you can only control this little board,
03:18:39.280 | and you have no idea where the waves will take you.
03:18:41.080 | And sometimes you're gonna fall down,
03:18:42.160 | and sometimes you're really gonna suck,
03:18:43.160 | and you're gonna swallow some salt water.
03:18:44.800 | But at a certain point, you stop trying to drive,
03:18:47.440 | and you're like, this is freaking awesome,
03:18:48.840 | and I have no idea where it's gonna go.
03:18:50.680 | - Beautifully put.
03:18:53.400 | I know I speak for a lot of people.
03:18:54.980 | First of all, everyone loves the game you play
03:18:58.820 | on the internet, it's fun.
03:19:00.780 | You make the world, not everyone.
03:19:02.260 | - Today, oof, they came for me hard.
03:19:05.240 | - But it makes the world seem fun,
03:19:08.580 | and especially in this dark time, it's much appreciated.
03:19:12.620 | And we can't wait 'til the next book,
03:19:14.620 | and the many to come,
03:19:15.620 | and to hopefully many more Joe Rogan appearances.
03:19:18.620 | You guys do some great magic together.
03:19:20.380 | - It's fun.
03:19:21.220 | - Yeah, it's, you, yeah.
03:19:24.300 | You're one of my favorite guests on this show,
03:19:26.540 | so I can't wait,
03:19:27.540 | especially if you can make it before the election.
03:19:30.740 | Thanks so much for making today happen.
03:19:32.300 | I'm glad you came down.
03:19:33.780 | You're awesome.
03:19:34.620 | - Thank you so much, man, great compliment.
03:19:37.180 | - Thanks for listening to this conversation
03:19:38.820 | with Michael Malice, and thank you to our sponsors.
03:19:41.860 | SEMrush, which is an SEO optimization tool,
03:19:46.220 | DoorDash, which is my go-to food delivery service,
03:19:49.660 | and Masterclass, which is online courses
03:19:53.020 | from world experts.
03:19:55.020 | Please check out these sponsors in the description
03:19:57.140 | to get a discount and to support this podcast.
03:20:00.620 | If you enjoy this thing, subscribe on YouTube,
03:20:02.820 | review it with Five Stars on Apple Podcast,
03:20:04.980 | follow on Spotify, support on Patreon,
03:20:07.660 | or connect with me on Twitter @LexFriedman.
03:20:11.340 | And now, let me leave you with some words
03:20:13.420 | from Michael Malice.
03:20:15.260 | Conservatism is progressivism driving the speed limit.
03:20:18.980 | Thank you for listening, and hope to see you next time.
03:20:22.780 | (upbeat music)