
Sam Harris: Trump, Pandemic, Twitter, Elon, Bret, IDW, Kanye, AI & UFOs | Lex Fridman Podcast #365


Chapters

0:00 Introduction
3:38 Empathy and reason
11:30 Donald Trump
54:24 Military industrial complex
58:58 Twitter
83:05 COVID
126:48 Kanye West
143:24 Platforming
161:21 Joe Rogan
178:13 Bret Weinstein
191:51 Elon Musk
203:59 Artificial Intelligence
220:01 UFOs
233:16 Free will
260:31 Hope for the future


00:00:00.000 | The following is a conversation with Sam Harris,
00:00:02.600 | his second time on the podcast.
00:00:04.680 | As I said two years ago,
00:00:06.280 | when I first met and spoke with Sam,
00:00:08.480 | he's one of the most influential,
00:00:09.960 | pioneering thinkers of our time.
00:00:12.040 | As the host of the Making Sense podcast,
00:00:14.380 | creator of the Waking Up app,
00:00:16.360 | and the author of many seminal books
00:00:19.040 | on human nature and the human mind,
00:00:21.520 | including The End of Faith,
00:00:23.240 | The Moral Landscape,
00:00:24.520 | Lying,
00:00:25.360 | Free Will,
00:00:26.200 | and Waking Up.
00:00:27.560 | In this conversation,
00:00:29.840 | besides our mutual fascination with AGI and free will,
00:00:33.120 | we do also go deep into controversial,
00:00:35.880 | challenging topics of Donald Trump,
00:00:38.520 | Hunter Biden,
00:00:39.560 | January 6th,
00:00:40.800 | vaccines,
00:00:41.640 | lab leak,
00:00:42.460 | Kanye West,
00:00:43.300 | and several key figures at the center of public discourse,
00:00:46.880 | including Joe Rogan and Elon Musk,
00:00:49.720 | both of whom have been friends of Sam
00:00:52.360 | and have become friends of mine.
00:00:54.760 | Somehow,
00:00:55.600 | in an amazing life trajectory
00:00:57.400 | that I do not deserve in any way,
00:00:59.480 | and in fact believe is probably a figment of my imagination.
00:01:03.560 | And if it's all right,
00:01:05.600 | please allow me to say a few words
00:01:07.020 | about this personal aspect of the conversation,
00:01:09.760 | of discussing Joe,
00:01:11.080 | Elon,
00:01:11.900 | and others.
00:01:13.040 | What's been weighing heavy on my heart
00:01:15.080 | since the beginning of the pandemic,
00:01:16.960 | now three years ago,
00:01:18.720 | is that many people I look to for wisdom in public discourse
00:01:22.200 | stop talking to each other as often,
00:01:24.200 | with respect,
00:01:25.040 | humility,
00:01:25.880 | and love,
00:01:26.700 | when the world needed those kinds of conversations the most.
00:01:30.180 | My hope is that they start talking again.
00:01:33.180 | They start being friends again.
00:01:34.780 | They start noticing the humanity that connects them
00:01:37.420 | that is much deeper than the disagreements that divide them.
00:01:41.080 | So let me take this moment to say,
00:01:43.420 | with humility and honesty,
00:01:45.780 | why I look up to and I'm inspired by Joe, Elon, and Sam.
00:01:49.780 | I think Joe Rogan is important to the world
00:01:53.660 | as a voice of compassionate curiosity
00:01:55.620 | and open-mindedness to ideas,
00:01:57.980 | both radical and mainstream,
00:01:59.500 | sometimes with humor,
00:02:00.720 | sometimes with brutal honesty,
00:02:02.540 | always pushing for more kindness in the world.
00:02:05.200 | I think Elon Musk is important to the world
00:02:08.760 | as an engineer,
00:02:09.740 | leader,
00:02:10.580 | entrepreneur,
00:02:11.420 | and human being who takes on the hardest problems
00:02:13.440 | that face humanity
00:02:14.500 | and refuses to accept the constraints
00:02:16.940 | of conventional thinking
00:02:18.240 | that made the solutions to these problems seem impossible.
00:02:21.480 | I think Sam Harris is important to the world
00:02:25.340 | as a fearless voice
00:02:26.860 | who fights for the pursuit of truth
00:02:28.920 | against growing forces of echo chambers
00:02:31.340 | and audience capture,
00:02:33.060 | taking unpopular perspectives
00:02:34.980 | and defending them with rigor and resilience.
00:02:38.940 | I both celebrate and criticize all three privately,
00:02:42.660 | and they criticize me,
00:02:44.240 | usually more effectively,
00:02:45.780 | for which I always learn a lot and always appreciate.
00:02:48.540 | Most importantly,
00:02:49.540 | there is respect and love for each other as human beings.
00:02:52.980 | The very thing that I think the world needs
00:02:54.980 | most now in a time of division and chaos.
00:02:58.760 | I will continue to try to mend divisions,
00:03:01.480 | to try to understand,
00:03:02.740 | not deride,
00:03:04.060 | to turn the other cheek if needed,
00:03:06.080 | to return hate with love.
00:03:07.960 | Sometimes people criticize me for being naive,
00:03:11.500 | cheesy,
00:03:12.540 | simplistic,
00:03:13.520 | all of that.
00:03:14.900 | I know,
00:03:16.020 | I agree,
00:03:17.260 | but I really am speaking from the heart,
00:03:19.560 | and I'm trying.
00:03:21.020 | This world is too fucking beautiful not to try.
00:03:24.340 | In whatever way I know how.
00:03:25.960 | I love you all.
00:03:27.820 | This is the Lex Fridman Podcast.
00:03:30.780 | To support it,
00:03:31.620 | please check out our sponsors in the description.
00:03:33.940 | And now, dear friends,
00:03:35.900 | here's Sam Harris.
00:03:37.760 | What is more effective
00:03:40.180 | at making a net positive impact on the world,
00:03:42.780 | empathy or reason?
00:03:45.740 | - It depends on what you mean by empathy.
00:03:47.300 | There are two,
00:03:48.520 | at least two kinds of empathy.
00:03:50.180 | There's the cognitive form,
00:03:53.620 | which is,
00:03:54.720 | I would argue even a species of reason.
00:03:57.900 | It's just understanding another person's point of view.
00:04:01.740 | You understand why they're suffering,
00:04:04.820 | or why they're happy,
00:04:05.780 | or what,
00:04:07.060 | you have a theory of mind
00:04:09.060 | about another human being that is accurate.
00:04:11.980 | And so you can navigate
00:04:15.300 | in relationship to them more effectively.
00:04:17.840 | And then there's another layer entirely,
00:04:21.460 | not incompatible with that,
00:04:22.620 | but just distinct,
00:04:23.660 | which is what people often mean by empathy,
00:04:27.100 | which is more a kind of emotional contagion.
00:04:31.540 | Like you feel depressed,
00:04:33.660 | and I begin to feel depressed along with you
00:04:37.100 | because it's contagious.
00:04:39.500 | We're so close,
00:04:41.300 | and I'm so concerned about you,
00:04:42.560 | and your problems become my problems,
00:04:44.780 | and it bleeds through.
00:04:47.300 | Now, I think both of those capacities are very important,
00:04:51.720 | but the emotional contagion piece,
00:04:56.720 | and this is not really my thesis,
00:05:00.580 | this is something I have more or less learned
00:05:02.540 | from Paul Bloom,
00:05:04.120 | the psychologist who wrote a book on this topic
00:05:08.820 | titled "Against Empathy."
00:05:10.220 | The emotional social contagion piece
00:05:13.300 | is a bad guide rather often for ethical behavior
00:05:18.300 | and ethical intuitions.
00:05:20.820 | - Oh boy.
00:05:21.860 | - And I'll give you the clear example of this,
00:05:24.480 | which is we find stories
00:05:29.480 | with a single identifiable protagonist
00:05:34.640 | who we can effortlessly empathize with
00:05:37.800 | far more compelling than data.
00:05:40.760 | So if I tell you,
00:05:42.400 | this is the classic case of the little girl
00:05:44.840 | who falls down a well,
00:05:46.560 | this is somebody's daughter,
00:05:49.920 | you see the parents distraught on television,
00:05:53.600 | you hear her cries from the bottom of the well,
00:05:56.040 | the whole country stops.
00:05:57.360 | I mean, there was an example of this 20, 25 years ago,
00:06:00.520 | I think, where it was just wall to wall on CNN,
00:06:02.840 | this is just the perfect use of CNN.
00:06:05.120 | It was 72 hours or whatever it was of continuous coverage
00:06:08.560 | of just extracting this girl from a well.
00:06:11.120 | So we effortlessly pay attention to that,
00:06:14.480 | we care about it,
00:06:15.520 | we will donate money toward it.
00:06:17.460 | I mean, it's just, it marshals 100% of our compassion
00:06:20.160 | and altruistic impulse.
00:06:21.540 | Whereas if you hear that there's a genocide raging
00:06:25.720 | in some country you've never been to
00:06:27.400 | and never intended to go to,
00:06:29.160 | the numbers don't make a dent
00:06:31.080 | and we find the story boring,
00:06:35.320 | we'll change the channel in the face of a genocide.
00:06:37.760 | It doesn't matter.
00:06:38.600 | And it literally, perversely,
00:06:41.080 | it could be 500,000 little girls
00:06:43.640 | have fallen down wells in that country
00:06:45.600 | and we still don't care.
00:06:47.320 | So it's, you know, many of us have come to believe
00:06:51.880 | that this is a bug rather than a feature
00:06:53.720 | of our moral psychology.
00:06:55.040 | And so empathy plays an unhelpful role there.
00:06:58.640 | So ultimately, I think,
00:07:00.880 | when we're making big decisions about what we should do
00:07:03.360 | and how to mitigate human suffering
00:07:05.840 | and what's worth valuing
00:07:07.760 | and how we should protect those values,
00:07:09.720 | I think reason is the better tool.
00:07:13.920 | But it's not that I would want to dispense
00:07:15.360 | with any part of empathy either.
00:07:17.440 | - Well, there's a lot of dangers to go on there.
00:07:18.960 | But briefly to mention,
00:07:20.400 | you've recently talked about effective altruism
00:07:23.920 | on your podcast.
00:07:25.080 | I think you mentioned some interesting statement,
00:07:28.800 | I'm going to horribly misquote you,
00:07:30.840 | but that you'd rather live in a world,
00:07:33.940 | like it doesn't really make sense,
00:07:35.480 | but you'd rather live in a world
00:07:36.760 | where you care about maybe your daughter and son
00:07:40.000 | more than 100 people that live across the world,
00:07:43.280 | something like this.
00:07:44.120 | Like where the calculus is not always perfect,
00:07:46.840 | but somehow it makes sense to live in a world
00:07:49.440 | where it's irrational in this way,
00:07:51.840 | and yet empathetic in the way you've been discussing.
00:07:54.560 | - Right, I'm not sure what the right answer is there,
00:07:57.240 | or even whether there is one right answer.
00:07:59.260 | There could be multiple peaks
00:08:01.600 | on this part of the moral landscape.
00:08:02.880 | But so the opposition is between an ethic
00:08:07.200 | that's articulated by someone like the Dalai Lama,
00:08:10.680 | or really any exponent of classic Buddhism,
00:08:15.560 | would say that the ultimate enlightened ethic
00:08:19.040 | is true dispassion with respect to friends and strangers.
00:08:23.200 | The mind of the Buddha would be truly dispassionate,
00:08:28.160 | you would love and care about all people equally.
00:08:31.680 | And by that light, it seems some kind of ethical failing,
00:08:37.240 | or at least a failure to fully actualize
00:08:41.320 | compassion in the limit, or enlightened wisdom in the limit,
00:08:44.600 | to care more, or even much more, about your kids
00:08:49.960 | than the kids of other people,
00:08:51.320 | and to prioritize your energy in that way.
00:08:55.360 | So you spend all this time trying to figure out
00:08:56.800 | how to keep your kids healthy and happy,
00:08:58.660 | and you'll attend to their minutest concerns,
00:09:02.320 | and however superficial.
00:09:03.920 | And again, there's a genocide raging
00:09:07.000 | in Sudan or wherever, and it takes up
00:09:11.080 | less than 1% of your bandwidth.
00:09:13.360 | I'm not sure it would be a better world
00:09:15.080 | if everyone was running the Dalai Lama program there.
00:09:18.640 | I think some prioritization of one's nearest and dearest
00:09:23.640 | ethically might be optimal,
00:09:28.080 | because we'll all be doing that,
00:09:30.000 | and we'll all be doing that in a circumstance
00:09:33.640 | where we have certain norms, and laws,
00:09:38.520 | and other structures that force us
00:09:42.480 | to be dispassionate where that matters.
00:09:45.000 | So when I go to, when my daughter gets sick,
00:09:48.120 | and I have to take her to a hospital,
00:09:50.660 | I really want her to get attention,
00:09:54.280 | and I'm worried about her more than I'm worried
00:09:56.160 | about everyone else in the lobby.
00:09:57.840 | But the truth is, I actually don't want
00:09:59.700 | a totally corrupt hospital.
00:10:01.400 | I don't want a hospital that treats my daughter
00:10:04.240 | better than anyone else in the lobby
00:10:05.840 | because she's my daughter, and I've bribed
00:10:08.480 | the guy at the door, or whatever,
00:10:10.240 | or the guy's a fan of my podcast,
00:10:11.540 | or whatever the thing is.
00:10:12.920 | You don't want starkly corrupt, unfair situations.
00:10:17.300 | And when you sort of get pressed down
00:10:22.480 | the hierarchy of Maslow's needs,
00:10:24.400 | individually and societally,
00:10:26.720 | a bunch of those variables change,
00:10:30.100 | and they change for the worse, understandably.
00:10:32.400 | But yeah, when everyone's corrupt,
00:10:34.960 | and you're in a state of collective emergency,
00:10:39.960 | you've got a lifeboat problem,
00:10:42.240 | you're scrambling to get into the lifeboat,
00:10:44.800 | yeah, then fairness, and norms,
00:10:47.920 | and the other vestiges of civilization
00:10:52.920 | begin to get stripped off.
00:10:55.760 | We can't reason from those emergencies
00:10:59.260 | to normal life.
00:11:00.640 | I mean, in normal life, we want justice,
00:11:03.160 | we want fairness, we're all better off for it,
00:11:06.000 | even when the spotlight of our concern
00:11:09.240 | is focused on the people we know,
00:11:11.840 | the people who are our friends,
00:11:12.880 | the people who are family,
00:11:14.000 | people we have good reason to care about.
00:11:16.920 | We still, by default, want a system
00:11:19.800 | that protects the interests of strangers, too.
00:11:22.320 | And we know that, generally speaking,
00:11:24.120 | and just in game theoretic terms,
00:11:26.320 | we're all gonna tend to be better off
00:11:27.880 | in a fairer system than a corrupt one.
00:11:30.340 | - One of the failure modes of empathy
00:11:32.860 | is our susceptibility to anecdotal data.
00:11:37.860 | Just a good story will get us to not think clearly.
00:11:43.280 | But what about empathy in the context
00:11:45.520 | of just discussing ideas with other people,
00:11:47.960 | and then there's a large number of people,
00:11:49.500 | like in this country, you know, red and blue,
00:11:52.740 | half the population believes certain things
00:11:55.480 | on immigration or on the response to the pandemic
00:11:59.400 | or any kind of controversial issue,
00:12:01.580 | or whether the election was fairly executed.
00:12:04.580 | Having an empathy for their worldview,
00:12:10.440 | trying to understand where they're coming from,
00:12:12.340 | not just in the explicit statement of their idea,
00:12:14.800 | but the entirety of like the roots
00:12:17.000 | from which their idea stems,
00:12:19.120 | that kind of empathy while you're discussing ideas.
00:12:22.040 | What is, in your pursuit of truth,
00:12:24.520 | having empathy for the perspective
00:12:27.800 | of a large number of other people
00:12:29.520 | versus raw mathematical reason?
00:12:33.880 | - I think it's important,
00:12:35.000 | but it only takes you so far, right?
00:12:37.520 | It doesn't get you to truth, right?
00:12:40.180 | It's not, truth is not decided by democratic principles,
00:12:45.180 | and certain people believe things
00:12:50.600 | for understandable reasons,
00:12:52.160 | but those reasons are nonetheless bad reasons, right?
00:12:54.640 | They don't scale, they don't generalize,
00:12:56.520 | they're not reasons anyone should adopt for themselves
00:12:59.680 | or respect epistemologically,
00:13:03.260 | and yet their circumstance is understandable
00:13:07.000 | and it's something you can care about, right?
00:13:08.960 | And so, yeah, like, let me just take,
00:13:11.520 | I think there's many examples of this
00:13:13.480 | that you might be thinking of,
00:13:14.760 | but one that comes to mind
00:13:17.160 | is I've been super critical of Trump, obviously,
00:13:20.720 | and I've been super critical of certain people
00:13:25.280 | for endorsing him or not criticizing him
00:13:28.880 | when he really made it patently obvious who he was,
00:13:33.440 | you know, if there had been any doubt initially.
00:13:36.640 | There was no doubt when we have a sitting president
00:13:38.560 | who's not agreeing to a peaceful transfer of power, right?
00:13:43.560 | So I'm critical of all of that,
00:13:49.280 | and yet the fact that many millions of Americans
00:13:54.200 | didn't see what was wrong with Trump
00:13:57.280 | or bought into the, didn't see through his con, right?
00:14:02.200 | I mean, they bought into the idea
00:14:03.400 | that he was a brilliant businessman
00:14:05.640 | who might just be able to change things
00:14:07.280 | because he's so unconventional
00:14:08.680 | and so, you know, his heart is in the right place.
00:14:11.760 | You know, he's really a man of the people,
00:14:13.080 | even though he's a, you know,
00:14:14.000 | gold-plated everything in his life.
00:14:18.240 | They bought the myth somehow of, you know,
00:14:22.360 | largely because they had seen him on television
00:14:24.040 | for almost a decade and a half
00:14:28.480 | pretending to be this genius businessman
00:14:30.680 | who could get things done.
00:14:31.980 | It's understandable to me that many very frustrated people
00:14:37.040 | who have not had their hopes and dreams actualized,
00:14:41.100 | who have been the victims of globalism
00:14:45.600 | and many other, you know, current trends,
00:14:49.640 | it's understandable that they would be confused
00:14:55.120 | and not see the liability of electing
00:14:59.480 | a grossly incompetent, morbidly narcissistic person
00:15:03.800 | into the presidency.
00:15:06.420 | So I don't, so which is to say that I don't blame,
00:15:10.720 | there are many, many millions of people
00:15:12.040 | who I don't necessarily blame for the Trump phenomenon,
00:15:15.040 | but I can nonetheless bemoan the phenomenon
00:15:18.720 | as indicative of, you know,
00:15:20.400 | very bad state of affairs in our society, right?
00:15:24.380 | So there's two levels to it.
00:15:25.840 | I mean, one is I think you have to call a spade a spade
00:15:28.520 | when you're talking about how things actually work
00:15:31.080 | and what things are likely to happen or not,
00:15:34.280 | but then you can recognize that people
00:15:37.360 | have very different life experiences.
00:15:39.680 | And yeah, I mean, I think empathy and, you know,
00:15:43.200 | probably the better word for what I would hope to embody
00:15:47.000 | there is compassion, right?
00:15:48.360 | Like really, you know, to really wish people well,
00:15:53.360 | you know, and to really wish, you know,
00:15:55.520 | strangers well, effortlessly wish them well.
00:15:58.080 | I mean, to realize that there is no opposition between,
00:16:01.000 | at bottom, there's no real opposition
00:16:02.960 | between selfishness and selflessness
00:16:05.540 | because wise selfishness really takes into account
00:16:10.540 | other people's happiness?
00:16:12.180 | I mean, you know, which do you,
00:16:13.480 | do you want to live in a society where you have everything,
00:16:16.280 | but most other people have nothing?
00:16:18.680 | Or do you want to live in a society where you're surrounded
00:16:21.680 | by happy, creative, self-actualized people
00:16:25.360 | who are having their hopes and dreams realized?
00:16:27.720 | I think it's obvious that the second society is much better,
00:16:31.080 | however much you can guard your good luck.
00:16:35.040 | - But what about having empathy for certain principles
00:16:40.920 | that people believe, for example, the pushback,
00:16:44.760 | the other perspective on this, 'cause you said,
00:16:47.040 | bought the myth of Trump as the great businessman.
00:16:50.040 | There could be a lot of people that are supporters of Trump
00:16:53.040 | who could say that Sam Harris bought the myth
00:16:57.020 | that we have this government of the people, by the people,
00:17:00.840 | that actually represents the people,
00:17:02.480 | as opposed to a bunch of elites
00:17:05.200 | who are running a giant bureaucracy that is corrupt,
00:17:07.840 | that is feeding themselves,
00:17:09.560 | and they're actually not representing the people.
00:17:11.840 | And then here's this chaos agent,
00:17:14.120 | Trump, who speaks off the top of his head.
00:17:16.600 | Yeah, he's flawed in all this number of ways.
00:17:18.960 | He's a more comedian
00:17:21.280 | than he is a presidential type of figure.
00:17:24.160 | And he's actually creating the kind of chaos
00:17:26.760 | that's going to shake up this bureaucracy,
00:17:29.440 | shake up the elites that are so uncomfortable,
00:17:32.000 | 'cause they don't want the world to know
00:17:34.080 | about the game they got running on everybody else.
00:17:37.000 | So that's, you know. - Yeah, yeah.
00:17:38.440 | - That's the kind of perspective that they would take
00:17:40.760 | and say, yeah, yeah, there's these flaws that Trump has,
00:17:43.960 | but this is necessary.
00:17:45.680 | - I agree with the first part.
00:17:46.960 | So I haven't bought the myth that it's,
00:17:50.400 | you know, a truly representative democracy
00:17:54.720 | in the way that we might idealize.
00:17:57.040 | And, you know, on some level,
00:18:01.120 | I mean, this is a different conversation,
00:18:02.440 | but on some level, I'm not even sure
00:18:04.080 | how much I think it should be, right?
00:18:05.920 | I'm not sure we want, in the end,
00:18:10.920 | everyone's opinion given equal weight
00:18:14.880 | about just what we should do about anything.
00:18:17.000 | And I include myself in that.
00:18:17.960 | I mean, there are many topics around which
00:18:20.480 | I don't deserve to have a strong opinion
00:18:22.600 | because I don't know what I'm talking about, right?
00:18:25.320 | Or what I would be talking about if I had a strong opinion.
00:18:27.480 | So, and I think we'll probably get to that,
00:18:32.480 | to some of those topics,
00:18:34.760 | because I've declined to have certain conversations
00:18:36.640 | on my podcast just because I think I'm the wrong person
00:18:38.680 | to have that conversation, right?
00:18:40.720 | And I think it's important to see those bright lines
00:18:45.480 | in one's life and in the moment,
00:18:47.840 | politically and ethically.
00:18:50.440 | So yeah, I think,
00:18:51.760 | so leave aside the viability of democracy.
00:18:57.120 | I'm under no illusions that all of our institutions
00:19:02.600 | are worth preserving precisely as they have been
00:19:06.160 | up until the moment this great orange wrecking ball
00:19:09.120 | came swinging through our lives.
00:19:11.200 | But I just, it was a very bad bet to elect someone
00:19:15.520 | who is grossly incompetent and worse than incompetent,
00:19:20.520 | genuinely malevolent in his selfishness, right?
00:19:27.480 | And this is something we know based on literally decades
00:19:31.960 | of him being in the public eye, right?
00:19:33.600 | He's not a public servant in any normal sense of that term.
00:19:38.600 | And he couldn't possibly give an honest or sane answer
00:19:44.240 | to the question you asked me about empathy and reason
00:19:47.320 | and like, how should we, what should guide us?
00:19:50.360 | I genuinely think he is missing some necessary moral
00:19:55.800 | and psychological tools, right?
00:19:58.840 | And this is, I can feel compassion for him
00:20:01.920 | as a human being 'cause I think having those things
00:20:05.040 | is incredibly important and genuinely loving other people
00:20:07.720 | is incredibly important and knowing what all that's about
00:20:10.640 | is that's really the good stuff in life.
00:20:12.800 | And I think he's missing a lot of that.
00:20:17.200 | But I think we don't wanna promote people
00:20:20.320 | to the highest positions of power in our society
00:20:23.640 | who are far outliers in pathological terms, right?
00:20:30.840 | We want them to be far outliers in the best case
00:20:34.640 | in wisdom and compassion and some of the things you've,
00:20:37.920 | some of the topics you've brought up.
00:20:39.120 | I mean, we want someone to be deeply informed.
00:20:41.080 | We want someone to be unusually curious,
00:20:46.080 | unusually alert to how they may be wrong
00:20:49.500 | or getting things wrong consequentially.
00:20:52.020 | He's none of those things.
00:20:53.060 | And insofar as we're gonna get normal mediocrities
00:20:56.440 | in that role, which I think is often the best
00:20:59.360 | we could expect, let's get normal mediocrities in that role,
00:21:02.640 | not once in a generation narcissists and frauds.
00:21:07.640 | I mean, it's like,
00:21:13.360 | we just take honesty as a single variable, right?
00:21:15.560 | I think you want, yes, it's possible that most politicians
00:21:20.560 | lie at least some of the time.
00:21:22.560 | I don't think that's a good thing.
00:21:24.800 | I think people should be generally honest
00:21:28.640 | even to a fault.
00:21:29.700 | Yes, there are certain circumstances where lying
00:21:33.120 | I think is necessary.
00:21:34.560 | It's kind of on a continuum of self-defense and violence.
00:21:38.160 | So it's like, if you're gonna,
00:21:39.840 | if the Nazis come to your door and ask you
00:21:41.600 | if you've got Anne Frank in the attic,
00:21:43.480 | I think it's okay to lie to them.
00:21:45.120 | But Trump, arguably there's never been a person
00:21:51.620 | that anyone could name in human history
00:21:55.560 | who's lied with that kind of velocity.
00:21:59.080 | I mean, it's just, he was just a blizzard of lies,
00:22:03.360 | great and small, pointless and effective.
00:22:08.360 | But it's just, it says something fairly alarming
00:22:14.880 | about our society that a person of that character
00:22:22.800 | got promoted.
00:22:23.680 | And so, yes, I have compassion and concern
00:22:26.600 | for half of the society who didn't see it that way.
00:22:29.520 | And that's gonna sound elitist and smug or something
00:22:34.520 | for anyone who's on that side listening to me.
00:22:36.800 | But it's genuine.
00:22:39.200 | I mean, I understand that like, I barely have the,
00:22:42.400 | I'm like one of the luckiest people in the world
00:22:44.200 | and I barely have the bandwidth to pay attention
00:22:47.000 | to half the things I should pay attention to
00:22:48.640 | in order to have an opinion about half the things
00:22:50.880 | we're gonna talk about, right?
00:22:52.360 | So how much less bandwidth is somebody
00:22:54.840 | who's working two jobs or a single mom
00:22:57.760 | who's raising multiple kids, even a single kid?
00:23:02.760 | It's just, it's unimaginable to me
00:23:06.320 | that people have the bandwidth to really track this stuff.
00:23:10.480 | And so then they jump on social media
00:23:12.280 | and they see, they get inundated by misinformation
00:23:15.080 | and they see what their favorite influencer just said.
00:23:18.120 | And now they're worried about vaccines.
00:23:21.200 | And it's just, we're living in an environment
00:23:25.400 | where the information space has become so corrupted
00:23:29.520 | and we've built machines to further corrupt it.
00:23:32.800 | You know, I mean, we've built a business model
00:23:34.400 | for the internet that it further corrupts it.
00:23:37.180 | So it is just, it's chaos in informational terms.
00:23:42.180 | And I don't fault people for being confused
00:23:46.720 | and impatient and at their wits end.
00:23:51.080 | And yes, Trump was an enormous fuck you
00:23:56.080 | to the establishment.
00:23:57.680 | And that was understandable for many reasons.
00:24:01.320 | - To me, Sam Harris, the great Sam Harris
00:24:04.240 | is somebody I've looked up to for a long time
00:24:07.520 | as a beacon of voice of reason.
00:24:10.240 | And there's this meme on the internet,
00:24:12.400 | and I would love you to steel man the case for it
00:24:14.560 | and against, that Trump broke Sam Harris's brain.
00:24:18.880 | That there's something disproportionate
00:24:21.600 | to the actual impact that Trump had on our society.
00:24:24.600 | He had an impact on the ability of balanced,
00:24:29.600 | calm, rational minds to see the world clearly,
00:24:35.920 | to think clearly.
00:24:37.600 | You being one of the beacons of that.
00:24:40.280 | Is there a degree to which he broke your brain?
00:24:43.160 | Otherwise known as Trump derangement syndrome.
00:24:47.520 | - Yeah, yeah, yeah.
00:24:48.360 | - As a medical condition.
00:24:49.920 | - Yeah, I think Trump derangement syndrome
00:24:51.960 | is a very clever meme because it just throws
00:24:56.800 | the problem back on the person who's criticizing Trump.
00:24:59.620 | But in truth, the true Trump derangement syndrome
00:25:04.240 | was not to have seen how dangerous and divisive
00:25:08.400 | it would be to promote someone like Trump
00:25:10.600 | to that position of power.
00:25:12.240 | And in the final moment, not to see how untenable
00:25:17.320 | it was to still support someone,
00:25:22.320 | a sitting president, who was not committing
00:25:26.000 | to a peaceful transfer of power.
00:25:27.280 | I mean, if that wasn't a bright line for you,
00:25:30.600 | you have been deranged by something.
00:25:33.120 | Because that was one minute to midnight
00:25:37.120 | for our democracy as far as I'm concerned.
00:25:39.720 | And I think it really was but for the integrity
00:25:45.600 | of a few people that we didn't suffer
00:25:49.320 | some real constitutional crisis and real emergency
00:25:54.320 | after January 6th.
00:25:55.400 | I mean, if Mike Pence had caved in
00:25:58.600 | and decided to not certify the election, right?
00:26:01.200 | Literally you can count on two hands
00:26:04.100 | the number of people who held things together
00:26:06.800 | at that moment.
00:26:08.200 | And so it wasn't for want of trying on Trump's part
00:26:11.880 | that we didn't succumb to some real,
00:26:16.880 | truly uncharted catastrophe with our democracy.
00:26:22.040 | So the fact that that didn't happen is not a sign
00:26:25.480 | that those of us who were worried that it was
00:26:29.560 | so close to happening were exaggerating the problem.
00:26:32.200 | I mean, it's like you almost got run over by a car
00:26:35.560 | but you didn't.
00:26:36.680 | And so the fact that you're adrenalized
00:26:39.360 | and you're thinking, "Boy, that was dangerous.
00:26:41.240 | "I probably shouldn't wander in the middle of the street
00:26:44.880 | "with my eyes closed."
00:26:46.000 | You weren't wrong to feel that you really had a problem.
00:26:49.840 | And came very close to something truly terrible.
00:26:56.200 | So I think that's where we were
00:26:57.800 | and I think we shouldn't do that again.
00:26:59.720 | So the fact that he's still, he's coming back around
00:27:02.160 | as potentially a viable candidate.
00:27:04.640 | I'm not spending much time thinking about it, frankly,
00:27:06.460 | because I'm waiting for the moment
00:27:08.760 | where it requires some thought.
00:27:10.880 | I mean, it took up,
00:27:15.080 | I don't know how many podcasts I devoted to the topic.
00:27:20.520 | It wasn't that, I mean, it wasn't that many in the end
00:27:24.040 | against the number of podcasts I devoted to other topics.
00:27:27.760 | But there are people who look at Trump
00:27:30.880 | and just find him funny, entertaining,
00:27:34.580 | not especially threatening.
00:27:36.880 | He's like not a, it's just good fun to see somebody
00:27:40.240 | who's like, who's just not taking anything seriously.
00:27:43.640 | And it's just putting a stick in the wheel
00:27:46.600 | of business as usual again and again and again and again.
00:27:50.500 | And they don't really see anything much at stake, right?
00:27:56.480 | It doesn't really matter if we don't support NATO.
00:27:59.480 | It doesn't really matter if he says he trusts Putin
00:28:02.040 | more than our intelligence services.
00:28:04.920 | I mean, none of this, it doesn't matter if he's,
00:28:06.880 | on the one hand, saying that he loves
00:28:09.440 | the leader of North Korea, and on the other,
00:28:13.320 | threatening, threatens to bomb them back to the Stone Age,
00:28:17.200 | right, on Twitter.
00:28:18.640 | It all can be taken in the spirit
00:28:20.760 | of kind of reality television.
00:28:22.080 | It's like, this is the part of the movie
00:28:23.540 | that's just fun to watch, right?
00:28:25.120 | And I understand that.
00:28:28.360 | I can even inhabit that space for a few minutes at a time.
00:28:32.240 | But there's a deeper concern that we're in the process
00:28:36.520 | of entertaining ourselves to death, right,
00:28:39.200 | that we're just not taking things seriously.
00:28:41.280 | And this is a problem I've had with several other people
00:28:44.320 | we might name who just appear to me
00:28:47.060 | to be goofing around at scale.
00:28:50.540 | And they lack a kind of moral seriousness.
00:28:52.920 | I mean, they're touching big problems
00:28:55.500 | where lives hang in the balance,
00:28:57.540 | but they're just fucking around.
00:28:58.880 | And I think there are really important problems
00:29:02.040 | that we have to get our head straight around.
00:29:05.040 | And we need, you know, it's not to say
00:29:07.640 | that institutions don't become corrupt.
00:29:09.600 | I think they do.
00:29:10.480 | And I think, and I'm quite worried that,
00:29:13.160 | you know, both about the loss of trust in our institutions
00:29:16.600 | and the fact that trust has eroded for good reason, right,
00:29:21.400 | that they have become less trustworthy.
00:29:22.920 | I think, you know, they've become infected by,
00:29:25.840 | you know, political ideologies that are not truth tracking.
00:29:28.480 | I mean, I worry about all of that.
00:29:31.180 | But I just think we need institutions.
00:29:35.220 | We need to rebuild them.
00:29:36.220 | We need experts who are real experts.
00:29:40.060 | We need to value expertise over, you know,
00:29:43.660 | amateurish speculation and conspiracy thinking
00:29:46.900 | and just, you know, and bullshit.
00:29:49.460 | - What kind of amateur speculation
00:29:51.020 | are we doing on this very podcast?
00:29:52.720 | - I'm usually alert to the moments
00:29:56.260 | where I'm just guessing
00:29:58.880 | or where I actually feel like I'm talking
00:30:01.640 | from within my wheelhouse.
00:30:02.900 | And I try to telegraph that a fair amount with people.
00:30:06.100 | So yeah, I mean, but it's not, it's different.
00:30:12.580 | Like, I mean, you can invite someone onto your podcast
00:30:15.660 | who's an expert about something
00:30:17.340 | that you're not an expert about.
00:30:21.060 | And then you, in the process
00:30:23.580 | of getting more informed yourself,
00:30:25.740 | your audience is getting more informed.
00:30:27.380 | You're asking smart questions
00:30:29.200 | and you might be pushing back at the margins,
00:30:32.260 | but you know that when push comes to shove on that topic,
00:30:35.900 | you really don't have a basis to have a strong opinion.
00:30:39.860 | And if you were gonna form a strong opinion
00:30:44.020 | that was counter to the expert
00:30:45.700 | you have in front of you,
00:30:46.900 | it's gonna be by deference to some other expert
00:30:49.540 | who you've brought in or who you've heard about
00:30:51.620 | or whose work you've read or whatever.
00:30:53.980 | But there's a paradox to how we value authority in science
00:30:58.980 | that most people don't understand.
00:31:00.620 | And I think we should at some point unravel that
00:31:03.380 | because it's the basis for a lot of public confusion.
00:31:07.380 | And frankly, it's the basis for a lot of criticism
00:31:10.180 | I've received on these topics,
00:31:11.700 | where people think that I'm against free speech
00:31:16.700 | or I'm an establishment shill,
00:31:19.740 | or it's like, I just think I'm a credentialist.
00:31:23.400 | I just think people with PhDs from Ivy League universities
00:31:26.660 | should run everything.
00:31:28.660 | It's not true, but there's a ton of,
00:31:31.220 | there's a lot to cut through to get to daylight there
00:31:33.820 | because people are very confused
00:31:37.740 | about how we value authority
00:31:39.940 | in the service of rationality generally.
00:31:42.020 | - You've talked about it, but it's just interesting,
00:31:44.140 | the intensity of feeling you have.
00:31:45.820 | You've had this famous phrase about Hunter Biden
00:31:50.300 | and children in the basement.
00:31:52.980 | Can you just revisit this case?
00:31:55.180 | So let me give another perspective
00:31:58.640 | on the situation of January 6th and Trump in general.
00:32:01.860 | It's possible that January 6th and things of that nature
00:32:07.940 | revealed that our democracy is actually pretty fragile.
00:32:12.020 | And that Trump is not a malevolent
00:32:14.340 | and ultra-competent malevolent figure,
00:32:16.860 | but is simply a jokester.
00:32:19.820 | And he just, by creating the chaos,
00:32:22.540 | revealed that it's all pretty fragile.
00:32:24.900 | Because you're a student of history
00:32:26.500 | and there's a lot of people like Vladimir Lenin, Hitler,
00:32:29.660 | who are exceptionally competent at controlling power,
00:32:34.320 | at being executives and taking that power,
00:32:37.260 | controlling the generals,
00:32:38.340 | controlling all the figures involved,
00:32:40.460 | and certainly not tweeting,
00:32:42.180 | but working in the shadows behind the scenes to gain power.
00:32:46.340 | And they did so extremely competently,
00:32:49.300 | and that is how they were able to gain power.
00:32:51.700 | The pushback with Trump, he was doing none of that.
00:32:54.860 | He was creating what he's very good at, creating drama,
00:32:59.860 | sometimes for humor's sake, sometimes for drama's sake,
00:33:03.260 | and simply revealed that our democracy is fragile.
00:33:06.700 | And so he's not this once in a generation horrible figure.
00:33:11.700 | - Once in a generation narcissist.
00:33:13.860 | No, I don't think he's a truly scary,
00:33:19.580 | sinister, Putin-like, or Hitler,
00:33:23.260 | much less Hitler-like figure, not at all.
00:33:25.300 | I mean, he's not ideological.
00:33:26.700 | He doesn't care about anything beyond himself.
00:33:29.060 | So it's not, no, no, he's much less scary
00:33:34.060 | than any really scary totalitarian, right?
00:33:38.780 | I mean, and he's--
00:33:40.540 | - He's more Brave New World than 1984.
00:33:42.820 | - This is what Eric Weinstein never stops badgering me about,
00:33:47.780 | but he's still wrong, Eric.
00:33:50.420 | My analogy for Trump was that he's an evil Chauncey Gardiner.
00:33:56.420 | I don't know if you remember the book or the film
00:33:59.540 | being there with Peter Sellers,
00:34:02.340 | but Peter Sellers is this gardener
00:34:05.340 | who really doesn't know anything,
00:34:07.820 | but he gets recognized as this wise man
00:34:09.820 | and gets promoted to immense power in Washington
00:34:13.260 | because he's speaking in these kind of,
00:34:16.420 | in a semblance of wisdom,
00:34:18.020 | he's got these very simple aphorisms,
00:34:19.700 | or what seem to be aphorisms.
00:34:21.300 | He's just talking, all he cares about is gardening.
00:34:23.540 | He's just talking about his garden all the time,
00:34:25.940 | but he'll say something, but yeah, in the spring,
00:34:28.580 | the new shoots will bloom,
00:34:30.660 | and people read into that some kind of genius insight
00:34:34.460 | politically, and so he gets promoted,
00:34:36.060 | and so that's the joke of the film.
00:34:38.000 | For me, Trump has always been someone
00:34:40.140 | like an evil Chauncey Gardiner.
00:34:42.060 | He's, it's not to say he's totally,
00:34:45.780 | yes, he has a certain kind of genius.
00:34:47.420 | He's got a genius for creating a spectacle around himself.
00:34:51.580 | Right, he's got a genius for getting the eye of the media
00:34:54.780 | always coming back to him,
00:34:56.380 | but it's only, it's a kind of,
00:35:00.220 | it's a kind of self-promotion that only works
00:35:03.900 | if you actually are truly shameless
00:35:06.220 | and don't care about having a reputation for anything
00:35:09.420 | that I or you would wanna have a reputation for, right?
00:35:12.420 | It's like it's pure, the pure pornography of attention,
00:35:15.940 | right, and he just wants more of it.
00:35:17.800 | I think the truly depressing and genuinely scary thing
00:35:22.420 | was that we have a country that,
00:35:26.260 | at least half of the country,
00:35:28.100 | given how broken our society is in many ways,
00:35:33.100 | we have a country that didn't see anything wrong with that,
00:35:38.380 | bringing someone who obviously doesn't know
00:35:41.780 | what he should know to be president
00:35:44.180 | and who's obviously not a good person, right,
00:35:47.100 | obviously doesn't care about people,
00:35:49.220 | can't even pretend to care about people really, right,
00:35:52.380 | in a credible way.
00:35:53.300 | And so, I mean, if there's a silver lining to this,
00:35:58.820 | it's along the lines you just sketched.
00:36:01.820 | It shows us how vulnerable our system is
00:36:05.200 | to a truly brilliant and sinister figure, right?
00:36:09.180 | I mean, like I think we really dodged a bullet.
00:36:13.900 | Yeah, someone far more competent and conniving
00:36:18.900 | and ideological could have exploited our system
00:36:23.700 | in a way that Trump didn't.
00:36:25.060 | And that's, yeah, so if we plug those holes eventually,
00:36:30.060 | that would be a good thing
00:36:33.460 | and he would have done a good thing for our society, right?
00:36:35.780 | I mean, one of the things we realized,
00:36:38.340 | and I think nobody knew, I mean, I certainly didn't know it
00:36:40.820 | and I didn't hear anyone talk about it,
00:36:43.940 | is how much our system relies on norms rather than laws.
00:36:48.820 | - Yeah, civility almost.
00:36:50.180 | - Yeah, it's just like it's quite possible
00:36:53.140 | that he never did anything illegal, truly illegal.
00:36:57.780 | I mean, I think he probably did a few illegal things,
00:37:00.120 | but like illegal such that he really should be
00:37:03.100 | thrown in jail for it.
00:37:04.220 | At least that remains to be seen.
00:37:08.900 | So all of the chaos, all of the diminishment
00:37:13.900 | of our stature in the world,
00:37:16.740 | all of the just the opportunity costs
00:37:19.700 | of spending years focused on nonsense,
00:37:23.000 | all of that was just norm violations.
00:37:27.460 | All that was just all a matter
00:37:29.660 | of not saying the thing you should say,
00:37:32.180 | but that doesn't mean they're insignificant, right?
00:37:34.380 | It's not that it's like, it's not illegal
00:37:37.380 | for a sitting president to say,
00:37:39.200 | no, I'm not gonna commit to a peaceful transfer of power,
00:37:43.600 | right, we'll wait and see whether I win.
00:37:45.420 | If I win, the election was valid,
00:37:50.420 | if I lose, it was fraudulent, right?
00:37:53.140 | - But aren't those humorous perturbations
00:37:57.180 | to our system of civility such that we know
00:37:59.540 | what the limits are and now we start to think that
00:38:02.460 | and have these kinds of discussions?
00:38:04.260 | - That wasn't a humorous perturbation
00:38:05.980 | because he did everything he could,
00:38:09.300 | granted he wasn't very competent,
00:38:11.180 | but he did everything he could to try to steal the election.
00:38:16.060 | I mean, the irony is he claimed to have an election stolen
00:38:18.940 | from him all the while doing everything he could to steal it,
00:38:22.580 | declaring it fraudulent in advance,
00:38:24.660 | trying to get the votes to not be counted
00:38:28.280 | as the evening wore on,
00:38:29.460 | knowing that they were gonna be disproportionately
00:38:31.740 | Democrat votes because of the position he took
00:38:36.740 | on mail-in ballots.
00:38:38.740 | I mean, all of it was fairly calculated.
00:38:41.420 | The whole circus of the clown car
00:38:46.260 | that crashed into Four Seasons Landscaping, right,
00:38:49.940 | and you got Rudy Giuliani with his hair dyed
00:38:52.660 | and you got Sidney Powell
00:38:53.700 | and all these grossly incompetent people
00:38:56.860 | lying as freely as they could breathe
00:38:59.860 | about election fraud, right?
00:39:01.900 | And all of these things are getting thrown out
00:39:03.740 | by largely Republican election officials
00:39:06.940 | and Republican judges.
00:39:08.240 | It wasn't for want of trying
00:39:11.680 | that he didn't maintain his power in this country.
00:39:14.640 | He really tried to steal the presidency.
00:39:17.260 | He just was not competent
00:39:18.760 | and the people around him weren't competent.
00:39:21.060 | So that's a good thing
00:39:22.340 | and it's worth not letting that happen again.
00:39:25.360 | - But he wasn't competent
00:39:26.460 | so he didn't do everything he could.
00:39:28.620 | - Well, no, he did everything he could.
00:39:29.900 | He didn't do everything that could have been done
00:39:32.100 | by someone more competent.
00:39:33.400 | - Right, but the tools you have as a president,
00:39:37.900 | you could do a lot of things.
00:39:38.820 | You can declare emergencies, especially during COVID.
00:39:41.220 | You could postpone the election.
00:39:42.880 | You can create military conflict,
00:39:44.980 | that, you know, any kind of reason
00:39:46.660 | to postpone the election.
00:39:47.740 | There's a lot of weird stuff.
00:39:48.940 | - But he tried to do things
00:39:50.180 | and he would have to have done those things
00:39:51.980 | through other people
00:39:52.960 | and there are people who refuse to do those things.
00:39:55.820 | There are people who said they would quit.
00:39:57.260 | They would quit publicly, right?
00:39:59.260 | I mean, you start,
00:40:01.180 | again, there are multiple books written about
00:40:03.460 | the last hours of this presidency
00:40:08.180 | and the details are shocking
00:40:10.820 | in what he tried to do and tried to get others to do
00:40:13.420 | and it's awful, right?
00:40:15.540 | I mean, it's just awful that we were that close
00:40:18.900 | to something,
00:40:20.340 | to a true unraveling of our political process.
00:40:27.060 | I mean, it's the only time in our lifetime
00:40:28.460 | that anything like this has happened
00:40:29.980 | and it's deeply embarrassing, right?
00:40:34.300 | On the world stage.
00:40:36.060 | It's just like, we looked like a banana republic there
00:40:38.620 | for a while and we're the lone superpower.
00:40:43.620 | It's not good, right?
00:40:47.380 | And so we shouldn't,
00:40:48.860 | there's no, the people who thought,
00:40:51.700 | well, we just need to shake things up
00:40:53.100 | and this is a great way to shake things up
00:40:55.620 | and having people storm our Capitol
00:40:58.540 | and smear shit on the walls,
00:41:00.300 | that's just more shaking things up, right?
00:41:02.740 | It's all just for the lulz.
00:41:04.340 | There's a nihilism and cynicism to all of that,
00:41:09.220 | which again, in certain people, it's understandable.
00:41:12.540 | Frankly, it's not understandable
00:41:13.980 | if you've got a billion dollars
00:41:15.380 | and you have a compound in Menlo Park or wherever.
00:41:19.020 | It's like, there are people who are cheerleading this stuff
00:41:21.620 | who shouldn't be cheerleading this stuff
00:41:23.380 | who know that they can get on their Gulf Stream
00:41:26.140 | and fly to their compound in New Zealand
00:41:28.140 | if everything goes to shit, right?
00:41:30.100 | So there's a cynicism to all of that
00:41:32.460 | that I think we should be deeply critical of.
00:41:35.540 | - What I'm trying to understand is not,
00:41:38.740 | and analyze, is not the behavior
00:41:40.620 | of this particular human being,
00:41:42.140 | but the effect it had in part on the division between people.
00:41:47.140 | And to me, the degree, the meme of Sam Harris's brain
00:41:52.500 | being broken by Trump represents,
00:41:54.660 | you're like the person I would look to
00:41:58.220 | to bridge the division.
00:42:00.300 | - Well, I don't think there is something profitably
00:42:05.060 | to be said to someone who's truly captivated
00:42:09.980 | by the personality cult of Trumpism, right?
00:42:13.380 | There's nothing that I'm gonna say to,
00:42:15.940 | there's no conversation I'm gonna have
00:42:16.820 | with Candace Owens, say, about Trump
00:42:19.440 | that's gonna converge on something reasonable, right?
00:42:21.940 | - You don't think so?
00:42:22.900 | - No, I mean, I haven't tried with Candace,
00:42:24.540 | but I've tried with many people
00:42:26.060 | who are in that particular orbit.
00:42:29.340 | I mean, I've had conversations with people
00:42:31.700 | who won't admit that there's anything wrong with Trump,
00:42:36.700 | anything.
00:42:37.700 | - So I'd like to push for the empathy versus reason.
00:42:40.580 | 'Cause when you operate in the space of reason, yes,
00:42:43.660 | but I think there's a lot of power in you showing,
00:42:47.660 | in you, Sam Harris, showing that you're willing
00:42:50.500 | to see the good qualities of Trump, publicly showing that.
00:42:54.620 | I think that's the way to win over Candace Owens.
00:42:56.580 | - But he has so few of them.
00:42:58.100 | He has fewer good qualities than virtually anyone
00:43:00.900 | I can name, right?
00:43:02.420 | So he's funny.
00:43:04.180 | I'll grant you that he's funny.
00:43:05.780 | He's a good entertainer.
00:43:07.980 | - There's others.
00:43:08.820 | Look at just policies and actual impacts he had.
00:43:11.500 | - I've admitted that.
00:43:12.460 | No, no, so I've admitted that many of his policies
00:43:15.580 | I agree with, many, many of his policies.
00:43:18.100 | So probably more often than not, at least on balance,
00:43:22.820 | I agreed with his policy that we should take China
00:43:27.420 | seriously as an adversary, right?
00:43:29.700 | And I think, I mean, again, you have to,
00:43:34.580 | there's a lot of fine print to a lot of this
00:43:36.140 | 'cause the way he talks about these things
00:43:38.540 | and many of his motives that are obvious
00:43:41.180 | are things that I don't support.
00:43:43.420 | But I mean, take immigration.
00:43:45.100 | I think there's, it's obvious that we should have control
00:43:48.660 | of our borders, right?
00:43:50.460 | Like I don't see the argument for not having control
00:43:53.700 | of our borders.
00:43:54.540 | We should let in who we wanna let in
00:43:56.580 | and we should keep out who we wanna keep out
00:43:58.700 | and we should have a sane immigration policy.
00:44:00.620 | So I didn't necessarily think it was a priority
00:44:04.260 | to build the wall, but I didn't,
00:44:06.100 | I never criticized the impulse to build the wall
00:44:09.140 | because if tens of thousands, hundreds of thousands
00:44:11.900 | of people are coming across that border
00:44:13.180 | and we are not in a position to know who's coming,
00:44:16.500 | that seems untenable to me.
00:44:17.700 | So, and I can recognize that many people in our society
00:44:22.300 | are on balance the victims of immigration
00:44:26.140 | and there is in many cases, a zero sum contest
00:44:30.260 | between the interests of actual citizens
00:44:32.740 | and the interests of immigrants, right?
00:44:34.380 | So I think we should have control of our borders.
00:44:37.780 | We should have a sane and compassionate immigration policy.
00:44:40.700 | We should let in refugees, right?
00:44:43.980 | So I did, Trump on refugees was terrible,
00:44:46.200 | but no, like I would say 80% of the policy concerns
00:44:52.380 | people celebrated in him are concerns
00:44:58.960 | that I either share entirely or certainly sympathize with.
00:45:06.080 | Right, so like that's not the issue.
00:45:09.100 | The issue is--
00:45:10.820 | - A threat to democracy and some fundamental--
00:45:13.020 | - The issue is largely what you said it was.
00:45:14.820 | It's not so much the person,
00:45:16.540 | it's the effect on everything he touches, right?
00:45:19.780 | He just, he has this superpower of deranging
00:45:24.100 | and destabilizing almost everything he touches
00:45:29.740 | and sullying and compromising the integrity
00:45:32.500 | of almost anyone who comes into his orbit.
00:45:34.100 | I mean, so you looked at these people who served
00:45:36.500 | as chief of staff or in various cabinet positions,
00:45:40.660 | people had real reputations for probity
00:45:43.740 | and level-headedness, whether you share their politics
00:45:48.440 | or not, I mean, these were real people.
00:45:49.940 | These were not, some of them were goofballs,
00:45:54.320 | but many people who just got totally trashed
00:46:02.780 | by proximity to him and then trashed by him
00:46:06.900 | when they finally parted company with him.
00:46:09.000 | Yeah, I mean, it's just people bent over backwards
00:46:14.740 | to accommodate his norm violations
00:46:17.220 | and it was bad for them and it was bad for our system.
00:46:21.900 | But none of that discounts the fact that we have a system
00:46:31.900 | that really needs proper house cleaning.
00:46:35.340 | Yes, there are bad incentives and entrenched interests.
00:46:40.340 | And I'm not a fan of the concept of the deep state,
00:46:45.380 | but 'cause it has been so propagandized,
00:46:47.420 | but yes, there's something like that,
00:46:50.340 | that is not flexible enough to respond intelligently
00:46:55.340 | to the needs of the moment, right?
00:47:00.760 | So there's a lot of rethinking of government
00:47:02.620 | and of institutions in general that I think we should do,
00:47:07.620 | but we need smart, well-informed, well-intentioned people
00:47:11.760 | to do that job.
00:47:13.140 | And the well-intentioned part is hugely important, right?
00:47:18.140 | Just give me someone who is not the most selfish person
00:47:23.140 | anyone has ever heard about in their lifetime, right?
00:47:29.920 | And what we got with Trump was literally
00:47:31.980 | the one most selfish person I think anyone could name.
00:47:35.700 | I mean, and again, there's so much known about this man.
00:47:38.780 | That's the thing, it was like, it predates his presidency.
00:47:41.960 | We knew this guy 30 years ago.
00:47:44.060 | And this is why to come back to those inflammatory comments
00:47:49.060 | about Hunter Biden's laptop,
00:47:51.120 | the reason why I can say with confidence
00:47:53.220 | that I don't care what was on his laptop
00:47:55.580 | is that there is, and that includes any evidence
00:47:59.960 | of corruption on the part of his father, right?
00:48:02.240 | Now, there's been precious little of that
00:48:04.760 | that's actually emerged.
00:48:05.900 | So it's like, there is no, as far as I can tell,
00:48:08.860 | there's not a big story associated with that laptop
00:48:10.960 | as much as people bang on about a few emails.
00:48:14.200 | But even if there were just obvious corruption, right?
00:48:19.200 | Like Joe Biden was at this meeting and he took
00:48:22.240 | this amount of money from this shady guy for bad reasons.
00:48:26.340 | Given how visible the lives of these two men have been,
00:48:32.480 | I mean, given how much we know about Joe Biden
00:48:34.220 | and how much we know about Donald Trump
00:48:35.680 | and how they have lived in public
00:48:37.460 | for almost as long as I've been alive, both of them,
00:48:40.160 | the scale of corruption can't possibly balance out
00:48:46.480 | between the two of them.
00:48:47.640 | If you show me that Joe Biden has this secret life
00:48:51.880 | where he's driving a Bugatti
00:48:53.200 | and he's living like Andrew Tate, right?
00:48:55.320 | And he's doing all these things I didn't know about.
00:48:58.680 | Okay, then I'm gonna start getting a sense that,
00:49:01.640 | all right, maybe this guy is way more corrupt
00:49:04.640 | than I realized.
00:49:05.460 | Maybe there is some deal in Ukraine or with China
00:49:08.040 | that is just, like this guy is not who he seems,
00:49:10.680 | he's not the public servant he's been pretending to be.
00:49:13.440 | He's been on the take for decades and decades
00:49:15.940 | and he's just, he's as dirty as can be.
00:49:18.240 | He's all mobbed up and it's a nightmare.
00:49:21.760 | And he can't be trusted, right?
00:49:23.360 | That's possible if you show me that his life
00:49:26.800 | is not at all what it seems.
00:49:28.000 | But on the assumption that I,
00:49:30.120 | having looked at this guy for literally decades, right?
00:49:33.880 | And knowing that every journalist
00:49:36.060 | has looked at him for decades,
00:49:38.000 | just how many affairs is he having?
00:49:39.820 | Just how much, how many drugs is he doing?
00:49:43.420 | How many houses does he have?
00:49:45.000 | What are the obvious conflicts of interest?
00:49:50.320 | You hold that against what we know about Trump, right?
00:49:53.080 | And I mean, the litany of indiscretions
00:49:57.640 | you can put on Trump's side
00:49:59.000 | that testify to his personal corruption,
00:50:02.440 | that testify to the fact that he has no ethical compass,
00:50:05.920 | there's simply no comparison, right?
00:50:07.440 | So that's why I don't care about what's on the laptop.
00:50:10.840 | Now, if you tell me Trump is no longer running
00:50:13.700 | for president in 2024 and we can put Trumpism behind us
00:50:18.560 | and now you're saying,
00:50:20.360 | listen, there's a lot of stuff on that laptop
00:50:21.920 | that makes Joe Biden look like a total asshole.
00:50:25.560 | Okay, I'm all ears, right?
00:50:27.200 | I mean, it was a forced, in 2020,
00:50:29.240 | it was a forced choice between a sitting president
00:50:32.540 | who wouldn't commit to a peaceful transfer of power
00:50:35.200 | and a guy who's obviously too old to be president
00:50:39.060 | who has a crack addicted son who lost his laptop.
00:50:46.280 | And I just knew that I was gonna take Biden
00:50:50.400 | in spite of whatever litany of horrors
00:50:52.560 | was gonna come tumbling out of that laptop.
00:50:54.240 | - And that might involve sort of,
00:50:56.280 | so the actual quote is,
00:50:57.120 | "Hunter Biden literally could have had
00:50:59.120 | "the corpses of children in the basement."
00:51:02.120 | There's a dark humor to it, right?
00:51:03.620 | Which is, I think you speak to,
00:51:04.960 | "I would not have cared.
00:51:06.480 | "There's nothing, it's Hunter Biden, it's not Joe Biden.
00:51:08.980 | "Whatever the scope of Joe Biden's corruption is,
00:51:10.880 | "it is infinitesimal compared to the corruption
00:51:13.760 | "we know Trump was involved in.
00:51:15.540 | "It's like a firefly to the sun,"
00:51:17.360 | is what you're speaking to.
00:51:18.360 | But let me make the case that you're really focused
00:51:22.280 | on the surface stuff,
00:51:24.480 | that it's possible to have corruption
00:51:27.840 | that masquerades in the thing we mentioned,
00:51:30.200 | which is civility.
00:51:31.480 | You can spend hundreds of billions of dollars
00:51:34.140 | or trillions towards a war in the Middle East, for example,
00:51:38.840 | something that you've changed your mind on
00:51:40.600 | in terms of the negative impact it has on the world.
00:51:44.000 | And that, the military industrial complex,
00:51:47.840 | everybody's very nice, everybody's very civil,
00:51:50.420 | just very upfront.
00:51:51.620 | Here's how we're spending the money.
00:51:53.340 | Yeah, sometimes it somehow disappears in different places,
00:51:56.940 | but that's the way war is complicated.
00:51:59.540 | And everyone is very polite.
00:52:00.780 | There's no Coke and strippers or whatever is on the laptop.
00:52:04.740 | It's very nice and polite.
00:52:07.200 | In the meanwhile, hundreds of thousands of civilians die.
00:52:10.860 | Hate, just an incredible amount of hate is created
00:52:15.820 | because people lose their family members,
00:52:17.400 | all that kind of stuff,
00:52:18.240 | but there's no strippers and Coke on a laptop.
00:52:21.200 | - Yeah, but it's not just superficial.
00:52:24.800 | It is, when someone only wants wealth and power and fame,
00:52:29.800 | that is their objective function, right?
00:52:35.120 | They're like a robot that is calibrated
00:52:37.800 | just to those variables, right?
00:52:40.720 | And they don't care about the risks we run
00:52:44.700 | on any other front.
00:52:45.700 | They don't care about environmental risk,
00:52:49.620 | pandemic risk, nuclear proliferation risk, none of it, right?
00:52:53.940 | They're just tracking fame and money
00:52:57.340 | and whatever can personally redound to their self-interest
00:53:02.340 | along those lines.
00:53:04.260 | And they're not informed about the other risks
00:53:06.860 | we're running really.
00:53:08.080 | In Trump, you had a president
00:53:10.440 | who was repeatedly asking his generals,
00:53:12.460 | "Why couldn't we use our nuclear weapons?
00:53:14.420 | Why can't we have more of them?
00:53:16.100 | Why do I have fewer nuclear weapons than JFK?"
00:53:18.960 | Right, as though that were a sign
00:53:20.020 | of anything other than progress, right?
00:53:23.620 | And this is the guy who's got the button, right?
00:53:29.900 | I mean, somebody's following him around with a bag
00:53:32.620 | waiting to take his order to launch, right?
00:53:37.320 | That is a, it's just, it's a risk we should never run.
00:53:42.320 | One thing Trump has going for him, I think,
00:53:46.020 | is that he doesn't drink or do drugs, right?
00:53:49.020 | I don't know, people allege that he does speed,
00:53:51.620 | but let's take him at his word.
00:53:54.100 | He's not deranging himself with pharmaceuticals, at least,
00:53:59.100 | but apart from Diet Coke, but--
00:54:03.120 | - There's nothing wrong, just for the record,
00:54:06.580 | let me push back on that.
00:54:07.620 | There's nothing wrong with Diet Coke.
00:54:08.460 | - Nothing wrong with Diet Coke, yeah.
00:54:09.300 | - I mean, it's consumed in a very large amount.
00:54:10.860 | - I occasionally have some myself.
00:54:12.180 | - There's no medical, there's no scientific evidence
00:54:14.220 | that I've observed of the negatives, you know,
00:54:16.700 | all those studies about aspartame and all that.
00:54:19.100 | I don't know.
00:54:21.220 | - I hope you're right.
00:54:22.620 | Yeah, I mean, everything you said
00:54:25.260 | about the military-industrial complex is true, right?
00:54:28.300 | And we've been worrying about that
00:54:30.640 | on both sides of the aisle for a very long time.
00:54:33.400 | I mean, that phrase came from Eisenhower.
00:54:37.360 | It's, I mean, so much of what ails us
00:54:44.760 | is a story of bad incentives, right?
00:54:48.760 | And bad incentives are so powerful
00:54:52.520 | that they corrupt even good people, right?
00:54:55.660 | How much more do they corrupt bad people, right?
00:54:58.440 | Like, so it's like, you want,
00:55:00.560 | at minimum, you want reasonably good people,
00:55:03.280 | at least non-pathological people,
00:55:05.680 | in the system trying to navigate
00:55:09.840 | against the grain of bad incentives.
00:55:11.800 | And better still, all of us can get together
00:55:15.640 | and try to diagnose those incentives and change them, right?
00:55:19.720 | And we will really succeed
00:55:22.040 | when we have a system of incentives
00:55:23.600 | where the good incentives are so strong
00:55:28.760 | that even bad people are effortlessly behaving
00:55:33.260 | as though they're good people
00:55:34.340 | because they're so successfully incentivized
00:55:36.480 | to behave that way, right?
00:55:38.320 | That's, and so it's almost the inversion
00:55:41.520 | of our current situation.
00:55:42.500 | So yes, and you say I changed my mind about the war.
00:55:45.880 | Not quite.
00:55:48.620 | I mean, I was never a supporter of the war in Iraq.
00:55:52.400 | I was always worried that it was a distraction
00:55:55.220 | from the war in Afghanistan.
00:55:56.260 | I was a supporter of the war in Afghanistan,
00:55:58.520 | and I will admit in hindsight,
00:56:00.040 | that looks like, you know,
00:56:03.520 | at best, a highly ambiguous and painful exercise,
00:56:07.080 | but more likely a fool's errand, right?
00:56:10.180 | It's like that, you know, it did not turn out well.
00:56:12.760 | It wasn't for want of trying.
00:56:15.800 | I don't, you know, I have not done a deep dive
00:56:18.220 | on all of the failures there,
00:56:21.520 | and maybe all of these failures are failures in principle.
00:56:24.080 | I mean, maybe it's just,
00:56:25.040 | maybe that's not the kind of thing
00:56:26.400 | that can be done well by anybody, whatever our intentions.
00:56:30.200 | But yeah, the move to Iraq always seemed questionable to me,
00:56:35.800 | and when we knew the problem,
00:56:39.480 | the immediate problem at that moment, you know,
00:56:41.120 | al-Qaeda was in Afghanistan,
00:56:45.480 | and, you know, and then bouncing to Pakistan.
00:56:48.640 | Anyway, you know, so yes,
00:56:52.560 | but my sense of the possibility of nation building,
00:56:57.560 | my sense of, you know,
00:57:00.920 | insofar as the neocon spirit of, you know,
00:57:05.920 | responsibility and idealism,
00:57:10.840 | that, you know, America was the kind of nation
00:57:12.960 | that should be functioning in this way as the world's cop,
00:57:16.320 | and we have to get in there
00:57:18.200 | and untangle some of these knots by force,
00:57:22.240 | rather often because, you know,
00:57:25.000 | if we don't do it over there,
00:57:25.960 | we're gonna have to do it over here kind of thing.
00:57:28.920 | Yeah, some of that has definitely changed
00:57:31.320 | for me in my thinking.
00:57:32.520 | And there are obviously cultural reasons
00:57:35.000 | why it failed in Afghanistan,
00:57:36.600 | and if you can't change the culture,
00:57:38.400 | you're not gonna force a change at gunpoint in the culture,
00:57:45.960 | or it certainly seems that that's not gonna happen.
00:57:48.360 | And it took us, you know, over 20 years
00:57:51.720 | to apparently to realize that.
00:57:53.520 | - That's one of the things you realize with the wars,
00:57:55.600 | there's not going to be a strong signal
00:57:57.500 | that things are not working.
00:57:59.680 | You can just keep pouring money into a thing,
00:58:01.480 | a military effort.
00:58:02.640 | - Well, also there are signs of it working too.
00:58:04.680 | You have all the stories of girls now going to school,
00:58:08.400 | right, you know, the girls are getting battery acid
00:58:10.240 | thrown in their faces by religious maniacs,
00:58:12.920 | and then we come in there and we stop that,
00:58:15.360 | and now girls are getting educated,
00:58:17.000 | and that's all good, and our intentions are good there.
00:58:20.280 | And I mean, we're on the right side of history there.
00:58:22.560 | Girls should be going to school.
00:58:24.520 | You know, Malala Yousafzai should have the Nobel Prize,
00:58:27.900 | and she shouldn't have been shot in the face
00:58:29.680 | by the Taliban, right?
00:58:31.520 | We know what the right answers are there.
00:58:35.840 | The question is, what do you do when there are enough,
00:58:39.140 | in this particular case, religious maniacs,
00:58:41.760 | who are willing to die and let their children die
00:58:44.480 | in defense of crazy ideas?
00:58:46.780 | And moral norms that belong in the seventh century.
00:58:49.840 | And it's a problem we couldn't solve,
00:58:53.400 | and we couldn't solve it even though we spent,
00:58:56.120 | you know, trillions of dollars to solve it.
00:58:58.480 | - This reminded me of the thing that you and Jack Dorsey
00:59:03.480 | jokingly had for a while, the discussion about banning
00:59:08.160 | Donald Trump from Twitter.
00:59:10.600 | But does any of it bother you,
00:59:12.040 | now that Twitter files came out, that,
00:59:15.560 | I mean, this has to do with sort of the Hunter laptop,
00:59:18.900 | Hunter Biden laptop story.
00:59:21.100 | Does it bother you that there could be a collection
00:59:24.160 | of people that make decisions about who to ban and not?
00:59:27.460 | And that could be susceptible to bias
00:59:31.780 | and to ideological influence?
00:59:34.260 | - Well, I think it always will be,
00:59:37.940 | or in the absence of perfect AI, it always will be.
00:59:41.860 | - And this becomes relevant with AI as well.
00:59:43.940 | - Yeah, yeah, yeah.
00:59:44.780 | - There's some censorship on AI happening.
00:59:46.980 | And it's an interesting question there as well.
00:59:48.580 | - I don't think Twitter is as important
00:59:50.100 | as people think it is, right?
00:59:51.860 | And I used to think it was more important when I was on it,
00:59:54.860 | and now that I'm off of it, I think it's,
00:59:56.860 | I mean, first let me say it's just
01:00:00.820 | an unambiguously good thing, in my experience,
01:00:05.020 | to delete your Twitter account, right?
01:00:07.180 | It's like, it is just, even the good parts of Twitter
01:00:10.500 | that I miss were bad in the aggregate,
01:00:14.580 | in the degree to which it was fragmenting my attention,
01:00:19.060 | the degree to which my life was getting doled out to me
01:00:22.620 | in periods between those moments where I checked Twitter,
01:00:27.620 | and had my attention diverted.
01:00:29.260 | And I was not a crazy Twitter addict.
01:00:33.740 | I mean, I was probably a pretty normal user.
01:00:35.980 | I mean, I was not someone who was tweeting
01:00:38.500 | multiple times a day or even every day, right?
01:00:40.740 | I mean, I probably, I think I probably averaged
01:00:44.660 | something like one tweet a day, I think I averaged.
01:00:47.980 | But in reality, it was like,
01:00:49.380 | there'd be like four tweets one day,
01:00:50.820 | and then I wouldn't tweet for the better part of a week.
01:00:54.140 | But I was looking a lot because it was my newsfeed.
01:00:58.360 | I was just following 200 very smart people,
01:01:01.140 | and I just wanted to see what they were paying attention to.
01:01:03.780 | They would recommend articles,
01:01:04.940 | and I would read those articles.
01:01:06.060 | And then when I would read an article
01:01:08.220 | that I thought I should signal boost,
01:01:10.300 | I would tweet, and so all of that seemed good.
01:01:12.620 | And like, that's all separable
01:01:15.320 | from all of the odious bullshit that came back at me
01:01:18.100 | in response to this,
01:01:19.340 | largely in response to this Hunter Biden thing.
01:01:21.700 | But even the good stuff has a downside.
01:01:27.500 | And it comes at just this point of,
01:01:32.500 | your phone is this perpetual stimulus of,
01:01:39.420 | which is intrinsically fragmenting of time and attention.
01:01:42.220 | And now my phone is much less of a presence in my life.
01:01:45.900 | And it's not that I don't check Slack or check email.
01:01:48.940 | I mean, I use it to work,
01:01:51.660 | but my sense of just what the world is
01:01:56.660 | and my sense of my place in the world,
01:01:58.620 | the sense of where I exist as a person
01:02:01.640 | has changed a lot by deleting my Twitter account.
01:02:04.460 | I mean, I had a, and it's just,
01:02:07.660 | and the things that I think,
01:02:09.340 | I mean, we all know this phenomenon.
01:02:10.780 | I mean, we say of someone, that person's too online, right?
01:02:14.140 | Like, what does it mean to be too online?
01:02:16.460 | And where do you draw that boundary?
01:02:20.740 | How do you know?
01:02:21.860 | What constitutes being too online?
01:02:23.180 | Well, in some sense, just being,
01:02:26.480 | I think being on social media at all is to be too online.
01:02:30.180 | I mean, given what it does to,
01:02:33.740 | given the kinds of information it,
01:02:36.560 | it signal boosts.
01:02:39.400 | And given the impulse it kindles in each of us
01:02:44.400 | to reach out to our audience in specific moments
01:02:50.840 | and in specific ways, right?
01:02:52.440 | It's like, there are lots of moments now
01:02:55.160 | where I have an opinion about something,
01:02:57.340 | but there's nothing for me to do with that opinion, right?
01:02:59.880 | Like there's no Twitter, right?
01:03:01.320 | So like there are lots of things
01:03:02.600 | that I would have tweeted in the last months
01:03:05.280 | that are not the kind of thing
01:03:08.080 | I'm gonna do a podcast about.
01:03:09.720 | I'm not gonna roll out 10 minutes
01:03:11.480 | on that topic on my podcast.
01:03:12.920 | I'm not gonna take the time to really think about it.
01:03:15.040 | But had I been on Twitter,
01:03:16.640 | I would have reacted to this thing in the news
01:03:19.320 | or this thing that somebody did, right?
01:03:21.760 | - What do you do with that thought now?
01:03:23.480 | - I just let go of it.
01:03:24.640 | - Like chocolate ice cream is the most delicious thing ever.
01:03:27.240 | - Yeah, it's usually not that sort of thing,
01:03:29.040 | but it's just,
01:03:31.260 | but then you look at the kinds of problems
01:03:33.240 | people create for themselves.
01:03:34.360 | You look at the life deranging
01:03:36.920 | and reputation destroying things that people do.
01:03:40.220 | And I look at the things that have,
01:03:44.320 | the analogous things that have happened to me.
01:03:45.800 | I mean, the things that have really bent my life around
01:03:48.160 | professionally over the past decade.
01:03:51.240 | So much of it is Twitter.
01:03:54.320 | I mean, honestly, in my case,
01:03:56.340 | almost 100% of it was Twitter.
01:03:58.320 | The controversies I would get into,
01:04:00.560 | the things I would think I would have to respond to,
01:04:03.240 | like I would release a podcast on a certain topic.
01:04:05.480 | I would see some blowback on Twitter.
01:04:07.680 | You know, it would give me the sense
01:04:11.000 | that there was some signal that I really had to respond to.
01:04:14.480 | Now that I'm off Twitter,
01:04:15.560 | I recognize that most of that was just,
01:04:17.780 | it was totally specious, right?
01:04:20.560 | It was not something I had to respond to.
01:04:22.680 | But yet I would then do a cycle of podcasts
01:04:26.040 | responding to that thing that like,
01:04:28.080 | taking my foot out of my mouth
01:04:29.480 | or taking someone else's foot out of my mouth.
01:04:31.560 | And it became this self-perpetuating cycle,
01:04:36.160 | which, I mean, if you're having fun, great.
01:04:42.940 | I mean, if it's generative of useful information
01:04:48.360 | and engagement professionally and psychologically, great.
01:04:53.480 | And there was some of that on Twitter.
01:04:58.360 | I mean, there were people who I've connected with
01:05:01.400 | because I just, you know,
01:05:02.840 | one of us DMed the other on Twitter
01:05:04.440 | and it was hard to see how that was gonna happen otherwise.
01:05:06.960 | But it was largely just a machine
01:05:11.960 | for manufacturing unnecessary controversy.
01:05:16.080 | - Do you think it's possible to avoid the drug of that?
01:05:18.240 | So now that you've achieved this zen state,
01:05:20.820 | is it possible for somebody like you
01:05:23.600 | to use it in a way that doesn't pull you into the whirlpool?
01:05:26.680 | And so anytime there's attacks, you just,
01:05:29.120 | I mean, that's how I tried to use it.
01:05:31.440 | - Yeah, but it's not the way I wanted to use it.
01:05:33.960 | It's not the way it promises itself as a--
01:05:37.680 | - You want it to have debate.
01:05:38.680 | - I wanted to actually communicate with people.
01:05:40.840 | I wanted to hear from the person because,
01:05:44.880 | again, it's like being in Afghanistan, right?
01:05:46.880 | It's like there are the potted cases
01:05:50.760 | where it's obviously good, right?
01:05:52.680 | It's like in Afghanistan,
01:05:53.800 | the girl who's getting an education,
01:05:55.300 | that is just here, that's why we're here.
01:05:57.480 | That's obviously good.
01:05:59.300 | I have those moments on Twitter where it's like,
01:06:01.040 | okay, I'm hearing from a smart person
01:06:03.160 | who's detected an error I made in my podcast or in a book,
01:06:07.320 | or they've just got some great idea
01:06:09.440 | about something that I should spend time on.
01:06:12.480 | And I would never have heard from this person
01:06:14.520 | in any other format.
01:06:15.800 | And now I'm actually in dialogue with them.
01:06:17.440 | And it's fantastic.
01:06:19.200 | That's the promise of it, to actually talk to people.
01:06:21.720 | And so I kept getting lured back into that.
01:06:24.780 | No, the sane or sanity-preserving way of using it
01:06:29.780 | is just as a marketing channel.
01:06:33.840 | You just put your stuff out there
01:06:35.140 | and you don't look at what's coming back at you.
01:06:37.660 | And that's, I'm on other social media platforms
01:06:41.960 | that I don't even touch.
01:06:42.960 | I mean, my team posts stuff on Facebook and on Instagram.
01:06:46.420 | I never even see what's on there.
01:06:48.220 | - So you don't think it's possible to see something
01:06:51.060 | and not let it affect your mind?
01:06:52.660 | - No, that's definitely possible.
01:06:54.100 | But the question is,
01:06:56.220 | and I did that for vast stretches of time, right?
01:06:59.300 | And, but then the promise of the platform
01:07:04.300 | is dialogue and feedback, right?
01:07:07.100 | So like, so why am I, if I know for whatever reason,
01:07:11.900 | I'm gonna see like 99 to one awful feedback,
01:07:16.420 | bad faith feedback, malicious feedback.
01:07:19.200 | Some of it's probably even bots,
01:07:20.740 | and I'm not even aware of who's a person,
01:07:22.520 | who's a bot, right?
01:07:23.660 | But I'm just gonna stare into this funhouse mirror
01:07:26.460 | of acrimony and dishonesty that is going to,
01:07:31.460 | I mean, the reason why I got off
01:07:33.420 | is not because I couldn't recalibrate
01:07:37.700 | and find equanimity again with all the nastiness
01:07:42.100 | that was coming back at me,
01:07:43.100 | and not that I couldn't ignore it for vast stretches of time,
01:07:46.220 | but I could see that I kept coming back to it
01:07:51.060 | hoping that it would be something that I could use,
01:07:53.800 | a real tool for communication.
01:07:55.720 | And I was noticing that it was insidiously changing
01:08:00.720 | the way I felt about people,
01:08:02.980 | both people I know and people I don't know, right?
01:08:04.960 | Like people I, you know, mutual friends of ours
01:08:07.520 | who are behaving in certain ways on Twitter,
01:08:09.280 | which just seemed insane to me.
01:08:12.160 | And then that became a signal I felt like
01:08:14.280 | I had to take into account somehow, right?
01:08:16.600 | You're seeing people at their worst,
01:08:18.120 | both friends and strangers.
01:08:20.960 | And I felt that it was as much as I could sort of
01:08:24.320 | try to recalibrate for it,
01:08:25.700 | I felt that I was losing touch
01:08:29.400 | with what was real information
01:08:31.760 | because people are performing, people are faking,
01:08:34.080 | people are not themselves,
01:08:35.680 | or you're seeing people at their worst.
01:08:38.160 | And so I felt like, all right,
01:08:40.560 | what's being advertised to me here on a,
01:08:44.760 | not just a daily basis, you know, an hourly basis,
01:08:48.500 | or, you know, an increment sometimes of, you know,
01:08:50.800 | multiple times an hour.
01:08:51.920 | I mean, I probably checked Twitter, you know,
01:08:56.280 | at minimum 10 times a day,
01:08:57.920 | and maybe I was checking it 100 times a day on some days,
01:09:00.920 | right, where things were really active
01:09:02.520 | and I was really engaged with something.
01:09:04.480 | What was being delivered into my brain there
01:09:10.120 | was subtly false information
01:09:15.200 | about how dishonest and,
01:09:19.920 | you know, just generally unethical,
01:09:25.360 | totally normal people are capable of being, right?
01:09:30.160 | It was like, it is a funhouse mirror.
01:09:33.020 | I was seeing the most grotesque versions
01:09:35.880 | of people who I know, right?
01:09:37.200 | People who I know I could sit down at dinner with
01:09:40.400 | and they would never behave this way,
01:09:42.360 | and yet they were coming at me on Twitter.
01:09:44.900 | It was essentially turning ordinary people
01:09:49.360 | into sociopaths, right?
01:09:51.080 | It's like people are just,
01:09:52.620 | you know, and there are analogies that many of us have made.
01:09:57.560 | It's like, one analogy is road rage, right?
01:10:00.080 | Like people behave in the confines of a car
01:10:02.620 | in ways that they never would
01:10:04.680 | if they didn't have this metal box around them,
01:10:06.600 | you know, moving at speed.
01:10:07.700 | And it's, you know, all of that becomes quite hilarious
01:10:11.160 | and, you know, obviously dysfunctional
01:10:14.520 | when they actually have to stop at the light
01:10:16.200 | next to the person they just flipped off,
01:10:17.660 | and they realize they didn't realize,
01:10:19.640 | they didn't understand that the person
01:10:20.800 | coming out of that car next to them with cauliflower ear
01:10:23.880 | is someone who they never would have,
01:10:25.920 | you know, rolled their eyes at in public
01:10:27.840 | because they would have taken one look at this person
01:10:29.680 | and realized this is the last person you wanna fight with.
01:10:32.760 | - That's one of the heartbreaking things is to see,
01:10:35.480 | see people who I know, who I admire,
01:10:37.360 | who I know are friends, be everything from snarky
01:10:41.800 | to downright mean, derisive towards each other.
01:10:46.800 | It doesn't make any sense.
01:10:50.360 | Like this, this is the only place where I've seen
01:10:53.100 | people I really admire who have had a calm head
01:10:56.720 | about most things, like really be shitty to other people.
01:11:00.080 | It's probably the only place I've seen that.
01:11:02.040 | And I don't, I tend, I choose to maybe believe
01:11:05.400 | that that's not really them.
01:11:06.600 | There's something about the system.
01:11:08.300 | Like if you go paintballing, if you,
01:11:11.920 | Jordan Peterson and whoever go paintballing.
01:11:14.120 | - You're gonna shoot your friends, yeah.
01:11:15.120 | - Yeah, you're gonna shoot your friends,
01:11:16.200 | but you kind of accept that that's kind of what you're doing
01:11:18.320 | in this little game that you're playing.
01:11:20.360 | But it's sometimes hard to remind yourself of that.
01:11:23.640 | - Well, and I think I was guilty of that, definitely.
01:11:27.240 | You know, I don't think, there's nothing,
01:11:31.240 | I don't think I ever did anything
01:11:32.920 | that I really feel bad about.
01:11:35.720 | But yeah, it was always pushing me
01:11:37.520 | to the edge of snideness somehow.
01:11:40.640 | And it's just not healthy.
01:11:44.080 | It's not, it's not, so the reason why I deleted
01:11:49.080 | my Twitter account in the end was that
01:11:51.640 | it was obviously making me a worse person.
01:11:54.440 | And so, and yeah, is there some way to be on there
01:11:58.220 | where it's not making you a worse person?
01:12:00.000 | I'm sure there is, but it's,
01:12:02.080 | given the nature of the platform
01:12:03.980 | and given what was coming back at me on it,
01:12:07.480 | the way to do that is just to basically
01:12:09.800 | use it as a one-way channel of communication,
01:12:13.160 | just marketing.
01:12:14.560 | You know, it's like, here's what I'm paying attention to.
01:12:17.940 | Look at it if you want to, and you just push it out,
01:12:20.360 | and then you don't look at what's coming back at you.
01:12:22.800 | - I put out a call for questions on Twitter,
01:12:24.880 | and actually, quite surprisingly, there's a lot of good,
01:12:28.840 | I mean, they're like, even if they're critical,
01:12:31.600 | they're like being thoughtful, which is nice.
01:12:34.060 | - I used it that way too, and that was what kept me hooked.
01:12:37.100 | - But then there's also TouchBalls69, who wrote a question.
01:12:42.100 | Ask why-- - I can't imagine.
01:12:44.280 | This is part of it, but one way to solve this is,
01:12:47.420 | you know, we gotta get rid of anonymity for this.
01:12:50.060 | - Let me ask the question.
01:12:50.940 | Ask Sam why he sucks was the question.
01:12:53.300 | - Yeah, that's good, well,
01:12:55.140 | one reason why I sucked was Twitter.
01:12:57.460 | That was, and I've since solved that problem, so.
01:13:01.020 | TouchBalls-- - 69.
01:13:02.820 | - 69? - Yeah.
01:13:03.660 | - TouchBalls69 should be happy that I suck
01:13:06.340 | a little bit less now that I'm off Twitter.
01:13:08.660 | I don't have to hear from TouchBalls69 on the regular.
01:13:11.780 | - The fact that you have to see that,
01:13:16.600 | it probably can have a negative effect,
01:13:18.680 | just even in moderation, just to see that there is,
01:13:22.540 | like for me, the negative effect is slightly losing faith
01:13:27.060 | in the underlying kindness of humanity.
01:13:29.540 | - Yeah, that was for me, yeah.
01:13:31.220 | - You can also just reason your way out of it,
01:13:32.980 | saying that this is anonymity and this is kind of fun
01:13:35.100 | and this kind of, just the shit show of Twitter, it's okay,
01:13:39.180 | but it does mentally affect you a little bit.
01:13:41.540 | - Like I don't read too much into that kind of comment.
01:13:44.980 | It's like, it's just, that's just trolling,
01:13:49.980 | and it's, you know, I get what's, I get,
01:13:53.180 | I understand the fun the person is having
01:13:55.420 | on the other side of that.
01:13:56.460 | It's like-- - Do you, though?
01:13:57.860 | - I do, well, I do, I don't, I mean,
01:13:59.940 | I don't behave that way, but I do,
01:14:02.420 | and for all I know, that person could be 16 years old,
01:14:05.220 | right, so it's like-- - It could be also
01:14:07.020 | an alt account for Elon, I don't know.
01:14:09.540 | - Well, yeah, that's right, yeah, yeah, yeah.
01:14:12.300 | No, I'm pretty sure Elon would just tweet that
01:14:14.940 | under his own name at this point.
01:14:17.100 | - Oh, man, you love each other.
01:14:19.620 | Okay, so the, do you think, so speaking of which,
01:14:22.060 | now that Elon has taken over Twitter,
01:14:26.420 | is there something that he could do
01:14:28.340 | to make this platform better?
01:14:30.780 | This Twitter and just social media in general,
01:14:33.220 | but because of the aggressive nature of his innovation
01:14:36.620 | that he's pushing, is there any way to make Twitter
01:14:40.260 | a pleasant place for Sam Harris?
01:14:42.160 | - Maybe. - Like in the next five years?
01:14:46.180 | - I don't know, I think I'm agnostic as to whether or not
01:14:48.800 | he or anyone could make a social media platform
01:14:51.620 | that really was healthy.
01:14:53.140 | - So you were just observing yourself week by week,
01:14:56.040 | seeing the effect it has on your mind
01:14:58.220 | and on how much you're actually learning
01:14:59.780 | and growing as a person, and it was negative.
01:15:02.100 | - Yeah, and I'd also seen the negativity
01:15:03.500 | in other people's lives.
01:15:04.580 | I mean, it's obviously, I mean, he's not gonna admit it,
01:15:07.960 | but I think it's obviously negative for Elon, right?
01:15:11.380 | I mean, it's just not, it's,
01:15:12.880 | and that was one of the things that, you know,
01:15:16.620 | when I was looking into the funhouse mirror,
01:15:18.980 | I was also seeing the funhouse mirror on his side of Twitter,
01:15:22.100 | and it was just even more exaggerated.
01:15:23.940 | It's like, when I was asking myself,
01:15:26.340 | why is he spending his time this way?
01:15:29.000 | I then reflected on why, you know,
01:15:31.920 | why was I spending my time this way, to a lesser degree,
01:15:34.940 | right, and at lesser scale and at lesser risk, frankly,
01:15:39.300 | right, and so, and it was just so,
01:15:44.300 | it's not just Twitter, I mean,
01:15:46.880 | this is in part an internet phenomenon.
01:15:49.880 | It's like the whole Hunter Biden mess
01:15:52.080 | that you-- - Explored.
01:15:55.040 | - Explored.
01:15:56.520 | That was based, I mean, I was on somebody's podcast,
01:15:59.280 | but that was based on a clip taken from that podcast,
01:16:02.320 | which was highly misleading as to the general shape
01:16:06.240 | of my remarks on that podcast.
01:16:08.560 | Even, you know, I had to then do my own podcast
01:16:12.360 | untangling all of that and admitting that,
01:16:15.040 | even in the full context,
01:16:16.840 | I was not speaking especially well
01:16:18.360 | and didn't say exactly what I thought in a way
01:16:21.080 | that would have been recognizable to anyone,
01:16:24.200 | you know, even someone with,
01:16:25.840 | not functioning by a spirit of charity,
01:16:29.360 | but the clip was quite distinct from the podcast itself.
01:16:33.460 | The reality is that we're living in an environment now
01:16:36.200 | where people are so lazy and their attention
01:16:41.200 | is so fragmented that they only have time for clips.
01:16:45.880 | 99% of people will see a clip and will assume
01:16:50.560 | there's no relevant context I need to understand
01:16:52.800 | what happened in that clip, right?
01:16:54.680 | And obviously the people who make those clips know that,
01:16:58.320 | right, and they're doing it quite maliciously.
01:17:00.800 | And in this case, the person who made that clip
01:17:02.680 | and subsequent clips of other podcasts
01:17:04.840 | was quite maliciously trying to engineer, you know,
01:17:08.680 | some reputational immolation for me.
01:17:12.020 | And being signal boosted by Elon and other prominent people
01:17:19.320 | who can't take the time to watch anything other than a clip,
01:17:23.920 | even when it's their friend
01:17:25.680 | or someone who's ostensibly their friend in that clip,
01:17:28.200 | right, so it's a total failure,
01:17:30.280 | an understandable failure of ethics
01:17:32.520 | that everyone is so short on time
01:17:35.400 | and they're so fucking lazy that,
01:17:38.000 | and we now have these contexts in which we react
01:17:41.840 | so quickly to things, right?
01:17:43.280 | Like Twitter is inviting an instantaneous reaction
01:17:47.040 | to this clip that it's just too tempting
01:17:52.040 | to just say something
01:17:55.920 | and not know what you're even commenting on.
01:17:58.320 | And most of the people who saw that clip
01:18:01.400 | don't understand what I actually think
01:18:05.080 | about any of these issues.
01:18:06.400 | And the irony is people are gonna find clips
01:18:09.080 | from this conversation that are just as misleading
01:18:12.120 | and they're gonna export those
01:18:13.240 | and then people are gonna be dunking on those clips.
01:18:15.240 | And we're all living and dying by clips now
01:18:18.080 | and it's dysfunctional.
01:18:21.840 | - See, I think it's possible to create a platform.
01:18:25.120 | I think we will keep living on clips,
01:18:27.640 | but when I saw that clip of you talking about children
01:18:30.360 | and so on, just knowing that you have a sense of humor,
01:18:33.480 | you just went to a dark place in terms of humor.
01:18:36.320 | So I didn't even bother.
01:18:38.320 | And then I knew that the way clips work
01:18:40.760 | is that people will use it for virality's sake,
01:18:43.900 | but giving a person benefit of the doubt,
01:18:48.000 | that's not even the right term.
01:18:49.320 | It's not like I was, it's really like interpreting it
01:18:53.520 | in the context of knowing your past.
01:18:57.120 | - The truth is you even need,
01:18:58.160 | like I even give Trump the benefit of the doubt
01:19:02.360 | when I see a clip of Trump.
01:19:04.280 | 'Cause there are famous clips of Trump
01:19:06.160 | that are very misleading
01:19:07.240 | as to what he was saying in context.
01:19:09.460 | And I've been honest about that.
01:19:10.720 | Like the whole, there were good people on both sides
01:19:14.620 | scandal around his remarks after Charlottesville.
01:19:17.580 | The clip that got exported and got promoted by everyone
01:19:23.220 | left of center, from Biden on down,
01:19:26.860 | the New York Times, CNN, there's nobody that I'm aware of
01:19:31.180 | who has honestly apologized
01:19:35.540 | for what they did with that clip.
01:19:38.660 | He did not say what he seemed to be saying in that clip
01:19:41.240 | about the Nazis at Charlottesville, right?
01:19:44.080 | And I have always been very clear about that.
01:19:46.340 | So it's just, even people who I think should be marginalized
01:19:52.320 | and people who should be defenestrated
01:20:00.180 | because they really are terrible people
01:20:02.880 | who are doing dangerous things and for bad reasons,
01:20:06.960 | I think we should be honest about
01:20:08.340 | what they actually meant in context, right?
01:20:11.640 | And this goes to anyone else we might talk about
01:20:15.180 | who's more, where the case is much more confusing.
01:20:18.620 | But yeah, so everyone's, it's just so,
01:20:23.620 | and then I'm sure we're gonna get to AI,
01:20:26.460 | but the prospect of being able to manufacture clips
01:20:31.160 | with AI and deep fakes
01:20:34.040 | and that where it's gonna be hard for most people
01:20:36.500 | most of the time to even figure out
01:20:38.700 | whether they're in the presence of something real,
01:20:41.000 | forget about being divorced from context,
01:20:44.700 | there was no context.
01:20:45.800 | I mean, that's a misinformation apocalypse
01:20:51.140 | that we are right on the cusp of and it's terrifying.
01:20:55.820 | - Or it could be just a new world
01:20:57.120 | like Alice going to Wonderland
01:20:59.780 | where humor is the only thing we have and it will save us.
01:21:02.960 | Maybe in the end, Trump's approach to social media
01:21:06.680 | was the right one after all,
01:21:07.760 | nothing is true and everything's absurd.
01:21:09.840 | - Yeah, but we can't live that way.
01:21:11.960 | People function on the basis of what they assume is true.
01:21:15.440 | Right, they think-- - People have functioned.
01:21:18.200 | - To do anything, it's like, I mean,
01:21:19.480 | you have to know what you think is gonna happen
01:21:22.600 | or you have to at least give a probabilistic weighting
01:21:26.680 | over the future, otherwise you're gonna be incapacitated
01:21:30.720 | by, you're not gonna, like people want certain things
01:21:33.800 | and they have to have a rational plan
01:21:35.160 | to get those desires gratified.
01:21:37.560 | And they don't wanna die, they don't want their kids to die,
01:21:40.780 | you tell them that there's a comet hurtling toward Earth
01:21:43.360 | and they should get outside and look up, right?
01:21:46.360 | They're gonna do it and if it turns out it's misinformation,
01:21:49.360 | it's gonna matter because it comes down to like,
01:21:54.680 | what medicines do you give your children, right?
01:21:57.720 | Like we're gonna be manufacturing fake journal articles.
01:22:01.920 | I mean, I'm sure someone's using ChatGPT
01:22:04.720 | for this reader as we speak.
01:22:07.640 | And if it's not credible,
01:22:10.160 | if it's not persuasive now to most people,
01:22:14.520 | I mean, honestly, I don't think we're gonna,
01:22:16.720 | I'll be amazed if it's a year before
01:22:20.920 | we can actually create journal articles
01:22:23.760 | that it would take a PhD to debunk
01:22:27.160 | that are completely fake.
01:22:29.480 | And there are people who are celebrating this kind of,
01:22:34.780 | coming cataclysm, but it's just,
01:22:41.280 | there are the people who don't have anything to lose
01:22:44.720 | who are celebrating it or just are so confused
01:22:46.960 | that they just don't even know what's at stake.
01:22:48.600 | And then there are the people,
01:22:50.000 | the few people who we could count on a few hands
01:22:52.960 | who have managed to insulate themselves,
01:22:54.760 | or at least imagine they've insulated themselves
01:22:57.320 | from the downside here enough
01:22:59.440 | that they're not implicated in the great unraveling
01:23:02.800 | we are witnessing or could witness.
01:23:05.480 | - The shaking up of what is true.
01:23:07.440 | So actually that returns us to experts.
01:23:09.240 | Do you think experts can save us?
01:23:11.520 | Is there such thing as expertise and experts in something?
01:23:14.600 | How do you know if you've achieved it?
01:23:16.560 | - I think it's important to acknowledge upfront
01:23:19.760 | that there's something paradoxical
01:23:21.800 | about how we relate to authority,
01:23:26.040 | especially within science.
01:23:28.040 | And I don't think that paradox is going away
01:23:31.320 | and it doesn't have to be confusing.
01:23:33.000 | It's just, and it's not truly a paradox.
01:23:35.160 | It's just like there are different moments in time.
01:23:38.020 | So it is true to say that within science
01:23:43.020 | or within rationality generally,
01:23:47.960 | I mean, whenever you're having a fact-based discussion
01:23:51.600 | about anything, it is true to say that the truth
01:23:56.120 | or falsity of a statement does not even slightly depend
01:24:01.120 | on the credentials of the person making the statement.
01:24:05.400 | So it doesn't matter if you're a Nobel laureate,
01:24:07.920 | you can be wrong.
01:24:08.960 | The thing you could, the last sentence you spoke
01:24:11.660 | could be total bullshit.
01:24:12.860 | And it's also possible for someone who's deeply uninformed
01:24:17.640 | to be right about something,
01:24:19.240 | or to be right for the wrong reasons,
01:24:21.440 | or someone just gets lucky, or someone,
01:24:23.800 | or, and there are middling cases
01:24:26.100 | where you have like a backyard astronomer
01:24:29.560 | who's got no credentials, but he just loves astronomy
01:24:32.320 | and he's got a telescope and he's spent a lot of time
01:24:35.260 | looking at the night sky, and he discovers a comet
01:24:38.760 | that no one else has seen,
01:24:40.240 | not even the professional expert astronomers.
01:24:44.080 | And I gotta think that happens less and less now,
01:24:47.500 | but some version of that keeps happening
01:24:50.440 | and it may always keep happening
01:24:52.080 | in every area of expertise, right?
01:24:54.480 | So it's true that truth is orthogonal
01:25:02.080 | to the reputational concerns we have among apes
01:25:06.280 | who are talking about the truth.
01:25:08.000 | But it is also true that most of the time,
01:25:12.420 | real experts are much more reliable than frauds
01:25:18.640 | or people who are not experts, right?
01:25:21.200 | So, and expertise really is a thing, right?
01:25:24.200 | And when you're flying an airplane in a storm,
01:25:28.760 | you don't want just randos come into the cockpit saying,
01:25:32.080 | "Listen, I've got a new idea about how to,
01:25:34.120 | how we should tweak these controls," right?
01:25:36.520 | You want someone who's a trained pilot
01:25:38.440 | and that training gave them something, right?
01:25:40.920 | It gave them a set of competences and intuitions
01:25:43.760 | and they know what all those dials and switches do, right?
01:25:48.140 | And I don't, right?
01:25:49.800 | I shouldn't be flying that plane.
01:25:51.440 | So when things really matter,
01:25:55.120 | and putting this at 30,000 feet in a storm
01:25:59.600 | sharpens this up, we want real experts to be in charge,
01:26:04.600 | right?
01:26:06.200 | And we are at 30,000 feet a lot of the time
01:26:10.280 | on a lot of issues, right?
01:26:11.760 | And whether they're public health issues,
01:26:14.160 | whether it's a geopolitical emergency like Ukraine,
01:26:18.500 | I mean, climate change, I mean, just pick your topic.
01:26:23.500 | There are real problems and the clock
01:26:29.100 | is rather often ticking and their solutions
01:26:32.020 | are non-obvious, right?
01:26:33.780 | And so expertise is a thing and deferring to experts
01:26:38.780 | much of the time makes a lot of sense.
01:26:42.140 | At minimum, it prevents
01:26:47.060 | the spectacular errors of incompetence
01:26:49.700 | and just foolhardiness.
01:26:54.700 | But even in the case of some,
01:26:57.340 | where you're talking about someone,
01:26:59.320 | I mean, people like ourselves who are like,
01:27:00.740 | we're well-educated, we're not the worst possible candidates
01:27:05.680 | for the Dunning-Kruger effect.
01:27:07.900 | When we're going into a new area where we're not experts,
01:27:11.340 | we're fairly alert to the possibility that we don't,
01:27:14.260 | it's not as simple as things seem at first
01:27:16.080 | and we don't know how our tools translate to this new area.
01:27:20.980 | We can be fairly circumspect,
01:27:22.820 | but we're also, because we're well-educated,
01:27:25.060 | and we're pretty quick studies,
01:27:28.540 | we can learn a lot of things pretty fast
01:27:31.100 | and we can begin to play a language game
01:27:33.900 | that sounds fairly expert, right?
01:27:37.020 | And in that case, the invitation to do your own research,
01:27:44.340 | is when times are good,
01:27:48.480 | I view as an invitation to waste your time pointlessly,
01:27:52.400 | right, when times are good.
01:27:54.700 | Now, the truth is times are not all that good, right?
01:27:57.280 | And we have the ongoing public display
01:28:02.120 | of failures of expertise.
01:28:03.860 | We have experts who are obviously corrupted
01:28:05.920 | by bad incentives.
01:28:07.240 | We've got experts who perversely won't admit they were wrong
01:28:12.040 | when they in fact are demonstrated to be wrong.
01:28:14.720 | We've got institutions that have been captured
01:28:16.880 | by political ideology that's not truth tracking.
01:28:20.320 | I mean, this whole woke encroachment
01:28:25.080 | into really every place,
01:28:27.460 | whether it's universities or science journals
01:28:30.220 | or government, I mean, it's just like,
01:28:31.960 | that has been genuinely deranging.
01:28:34.380 | So there's a lot going on where experts
01:28:39.720 | and the very concept of expertise
01:28:41.640 | has seemed to discredit itself.
01:28:43.160 | But the reality is that there is a massive difference
01:28:45.840 | when anything matters,
01:28:46.880 | when there's anything to know about anything,
01:28:49.160 | there is a massive difference most of the time
01:28:51.960 | between someone who has really done the work
01:28:54.520 | to understand that domain and someone who hasn't.
01:28:57.880 | And if I get sick or someone close to me gets sick,
01:29:02.780 | you know, I have a PhD in neuroscience, right?
01:29:07.160 | So I can read a medical journal article
01:29:09.680 | and understand a lot of it, right?
01:29:11.920 | And I, you know, so I'm just fairly conversant
01:29:14.320 | with medical terminology.
01:29:16.880 | And I understand its methods
01:29:18.320 | and I'm alert to the difference because I've,
01:29:21.040 | because in neuroscience,
01:29:22.180 | I've spent hours and hours in journal clubs,
01:29:24.560 | diagnosing and analyzing the difference
01:29:28.320 | between good and bad studies.
01:29:30.060 | I'm alert to the difference between good and bad studies
01:29:33.640 | in medical journals, right?
01:29:34.980 | And I understand that bad studies can get published
01:29:37.240 | and, you know, et cetera.
01:29:39.180 | And experiments can be poorly designed.
01:29:42.560 | I'm alert to all of those things.
01:29:44.720 | But when I get sick or when someone close to me gets sick,
01:29:47.680 | I don't pretend to be a doctor, right?
01:29:50.360 | I've got no clinical experience.
01:29:52.200 | I don't go down the rabbit hole on Google
01:29:54.600 | for days at a stretch trying to become a doctor,
01:29:58.100 | much less a specialist in the domain of problem
01:30:01.440 | that has been visited upon me or my family, right?
01:30:04.060 | So if someone close to me gets cancer,
01:30:06.680 | I don't pretend to be an oncologist.
01:30:08.580 | I don't go out and start,
01:30:09.800 | I don't start reading, you know, in journals of oncology
01:30:14.140 | and try to really get up to speed as an oncologist
01:30:17.280 | because it's,
01:30:18.120 | one, it's a bad and potentially
01:30:25.360 | very misleading use of my time, right?
01:30:30.360 | And it's,
01:30:32.880 | if I decide, if I had, if I had a lot of runway,
01:30:36.360 | if I decided, okay, it's really important for me
01:30:39.320 | to know everything I can.
01:30:40.680 | At this point, I wanna, I know someone's gonna get cancer.
01:30:43.600 | I may not go back to school and become an oncologist,
01:30:46.440 | but what I wanna do is I wanna know everything
01:30:48.280 | I can know about cancer, right?
01:30:49.520 | So I'm gonna take the next four years
01:30:52.120 | and spend most of my time on cancer.
01:30:54.260 | Okay, I could do that, right?
01:30:56.560 | I still think that's a waste of my time.
01:30:58.560 | I still think at the end of,
01:31:00.560 | even at the end of those four years,
01:31:02.480 | I'm not gonna be the best person to form intuitions
01:31:05.820 | about what to do in the face of the next cancer
01:31:08.640 | that I have to confront.
01:31:09.840 | I'm still gonna want a better oncologist than I've become
01:31:15.640 | to tell me what he or she would do if they were in my shoes
01:31:18.920 | or in the shoes of my family member.
01:31:21.040 | What I'm not advocating,
01:31:24.720 | I'm not advocating a blind trust and authority.
01:31:29.360 | Like if you get cancer
01:31:30.680 | and you're talking to one oncologist
01:31:33.240 | and they're recommending some course of treatment,
01:31:34.680 | by all means, get a second opinion, get a third opinion.
01:31:38.460 | But it matters that those opinions
01:31:40.740 | are coming from real experts and not from
01:31:43.060 | Robert Kennedy Jr. who's telling you that you got it
01:31:49.680 | because you got a vaccine, right?
01:31:51.640 | It's like, it's just,
01:31:52.760 | we're swimming in a sea of misinformation
01:31:56.320 | where you've got people who are moving the opinions
01:32:00.140 | of millions of others who should not have an opinion
01:32:05.140 | on these topics.
01:32:07.020 | Like there is no scenario in which
01:32:11.180 | you should be getting your opinion about vaccine safety
01:32:14.300 | or climate change or the war in Ukraine
01:32:19.300 | or anything else that we might wanna talk about
01:32:21.460 | from Candace Owens, right?
01:32:24.020 | It's just like she's not a relevant expert
01:32:28.300 | on any of those topics.
01:32:29.860 | And what's more, she doesn't seem to care, right?
01:32:32.340 | And she's living in a culture
01:32:34.980 | that has amplified that not caring into a business model
01:32:39.980 | and an effective business model, right?
01:32:42.100 | So it's just,
01:32:42.940 | and there's something very Trumpian about all that.
01:32:45.940 | Like that's the problem, the problem is the culture.
01:32:49.300 | It's not these specific individuals.
01:32:51.740 | So the paradox here is that expertise is a real thing
01:32:58.580 | and we defer to it a lot as a labor-saving device
01:33:02.460 | and just based on the reality
01:33:07.460 | that it's very hard to be a polymath, right?
01:33:10.320 | And specialization is a thing, right?
01:33:12.100 | And so there are people who specialize
01:33:13.460 | in a very narrow topic.
01:33:14.800 | They know more about that topic than the next guy,
01:33:16.700 | no matter how smart that guy or gal is.
01:33:19.400 | And that those differences matter.
01:33:24.620 | But it's also true that when you're talking about facts,
01:33:29.540 | sometimes the best experts are wrong.
01:33:34.540 | The scientific consensus is wrong.
01:33:37.140 | You get a sea change in the thinking of a whole field
01:33:40.560 | because one person who's an outlier for whatever reason
01:33:43.740 | decides, okay, I'm gonna prove this point
01:33:48.620 | and they prove it, right?
01:33:49.740 | So somebody like the doctor who believed
01:33:54.540 | that stomach ulcers were not due to stress
01:33:56.780 | but were due to H. pylori infections, right?
01:34:00.020 | So he just drank a vial of H. pylori bacteria
01:34:02.620 | and proved it, quickly got an ulcer
01:34:05.380 | and convinced the field that at minimum,
01:34:08.500 | H. pylori was involved in that process.
01:34:11.480 | Okay, so yes, everyone was wrong.
01:34:14.180 | That doesn't disprove the reality of expertise.
01:34:19.140 | It doesn't disprove the utility of relying on experts
01:34:22.260 | most of the time, especially in an emergency,
01:34:25.140 | especially when the clock is ticking,
01:34:27.060 | especially when you're in this particular cockpit
01:34:30.980 | and you only have one chance to land this plane, right?
01:34:34.700 | You want the real pilot at the controls.
01:34:38.180 | - But there's just a few things to say.
01:34:40.180 | So one, you mentioned this example with cancer
01:34:44.460 | and doing your own research.
01:34:45.980 | There are several things that are different
01:34:47.860 | about our particular time in history.
01:34:50.740 | One, doing your own research
01:34:52.780 | has become more and more effective
01:34:54.880 | because you can read, the internet made information
01:34:58.860 | a lot more accessible,
01:35:00.100 | so you can read a lot of different meta-analyses.
01:35:03.440 | You can read blog posts that describe to you
01:35:07.300 | exactly the flaws in the different papers
01:35:09.400 | that make up the meta-analyses.
01:35:11.500 | And you can read a lot of those blog posts
01:35:14.380 | that are conflicting with each other
01:35:15.740 | and you can take that information in
01:35:17.420 | and in a short amount of time,
01:35:19.260 | you can start to make good faith interpretations.
01:35:24.260 | For example, I don't know, I don't wanna overstate things,
01:35:26.980 | but if you suffer from depression, for example,
01:35:31.540 | then you could go to an expert and a doctor
01:35:34.740 | that prescribes you some medication,
01:35:36.540 | but you could also challenge some of those ideas
01:35:39.660 | and seeing like, what are the different medications,
01:35:41.420 | what are the different side effects,
01:35:42.820 | what are the different solutions to depression,
01:35:44.740 | all that kind of stuff.
01:35:45.660 | And I think depression is just a really difficult problem
01:35:48.660 | that's very, I don't wanna, again, state incorrect things,
01:35:52.820 | but I think there's a lot of variability
01:35:56.820 | of what depression really means.
01:35:58.020 | So being introspective about the type of depression you have
01:36:02.260 | and the different possible solutions you have,
01:36:04.460 | just doing your own research as a first step
01:36:07.380 | before approaching a doctor
01:36:08.760 | or as you have multiple opinions
01:36:11.060 | could be very beneficial in that case.
01:36:13.580 | Now, that's depression,
01:36:14.740 | that's something that's been studied for a very long time.
01:36:17.100 | With a new pandemic that's affecting everybody,
01:36:20.720 | with the airplane equated to like 9/11 or something,
01:36:26.660 | did a new emergency just happen?
01:36:30.060 | And everybody, every expert in the world
01:36:34.180 | is publishing on it and talking about it.
01:36:36.780 | So doing your own research there
01:36:38.380 | could be exceptionally effective in asking questions.
01:36:42.180 | And then there's a difference between experts,
01:36:45.300 | virologists, and it's actually a good question,
01:36:47.580 | who is exactly the expert in a pandemic?
01:36:51.780 | But there's the actual experts
01:36:55.140 | doing the research and publishing stuff,
01:36:57.980 | and then there's the communicators of that expertise.
01:37:01.300 | And the question is whether the communicators are flawed
01:37:05.860 | to a degree where doing your own research
01:37:10.360 | is actually the more effective way
01:37:11.980 | to figure out policies and solutions,
01:37:14.260 | because you're not competing with experts,
01:37:16.420 | you're competing with the communicators of expertise.
01:37:18.940 | That could be WHO, CDC in the case of the pandemic,
01:37:21.860 | or politicians, or political type of science figures
01:37:25.420 | like Anthony Fauci.
01:37:27.200 | There's a question there
01:37:28.380 | of the effectiveness of doing your own research
01:37:33.660 | in that context.
01:37:34.680 | And the competing forces there,
01:37:39.460 | incentives that you've mentioned,
01:37:41.180 | is you can become quite popular by being contrarian,
01:37:44.860 | by saying everybody's lying to you,
01:37:46.140 | all the authorities are lying to you,
01:37:47.220 | all the institutions are lying to you.
01:37:49.180 | So those are the waters you're swimming in.
01:37:51.860 | But I think doing your own research
01:37:54.100 | in that kind of context could be quite effective.
01:37:57.740 | - Let me be clear.
01:37:58.700 | I'm not saying you shouldn't do any research.
01:38:02.300 | I'm not saying that you shouldn't be informed
01:38:04.220 | about an issue.
01:38:05.040 | I'm not saying you shouldn't read articles
01:38:06.780 | on whatever the topic is.
01:38:08.280 | And yes, if I got cancer or someone close to me got cancer,
01:38:11.660 | I probably would read more about cancer
01:38:14.300 | than I've read thus far about cancer.
01:38:16.660 | And I've read some.
01:38:17.680 | So I'm not making a virtue of ignorance
01:38:24.320 | and a blind obedience to authority.
01:38:26.880 | And again, I recognize that authorities
01:38:29.560 | can discredit themselves or they can be wrong.
01:38:31.880 | They can be wrong even when there's no discredit.
01:38:35.040 | There's just, there's a lot we don't understand
01:38:37.080 | about the nature of the world.
01:38:39.140 | But still this vast gulf between truly informed opinion
01:38:46.680 | and bullshit exists.
01:38:50.000 | It always exists.
01:38:51.240 | And conspiracy thinking is rather often,
01:38:56.240 | most of the time, a species of bullshit,
01:39:02.160 | but it's not always wrong.
01:39:03.480 | There are real conspiracies
01:39:04.640 | and there really are just awful corruptions
01:39:09.640 | born of bad incentives within our scientific processes,
01:39:16.000 | within institutions.
01:39:17.640 | And again, we've mentioned a lot of these things in passing,
01:39:20.800 | but what woke political ideology did
01:39:24.920 | to scientific communication during the pandemic was awful.
01:39:28.960 | And it was really corrosive of public trust,
01:39:31.200 | especially on the right.
01:39:34.480 | For understandable reasons.
01:39:35.760 | I mean, it was just, it was crazy,
01:39:37.080 | some of the things that were being said, and still are.
01:39:40.840 | And these cases are all different.
01:39:41.960 | So you take depression.
01:39:43.400 | We just don't know enough about depression
01:39:45.320 | for anyone to be that confident about anything.
01:39:49.600 | And there are many different modalities
01:39:52.040 | in which to interact with it as a problem.
01:39:54.440 | So there's, yes, pharmaceuticals
01:39:56.360 | have whatever promise they have,
01:39:57.720 | but there's certainly reason to be concerned
01:40:00.240 | that they don't work well for everybody.
01:40:02.360 | And I mean, it's obvious they don't work well for everybody,
01:40:07.360 | but they do work for some people.
01:40:09.460 | But again, depression is a multifactorial problem
01:40:16.560 | and there are different levels at which to influence it.
01:40:20.360 | And there are things like meditation,
01:40:22.240 | there are things like just life changes.
01:40:24.320 | And one of the perverse things about depression
01:40:29.200 | is that when you're depressed,
01:40:30.900 | all of the things that would be good for you to do
01:40:32.640 | are precisely the things you don't wanna do.
01:40:34.360 | You don't have any energy to socialize,
01:40:36.080 | you don't wanna get things done,
01:40:37.480 | you don't wanna exercise.
01:40:39.200 | And all of those things, if you got those up and running,
01:40:43.120 | they do make you feel better in the aggregate.
01:40:46.560 | But the reality is that there are clinical level depressions
01:40:51.100 | that are so bad that it's just,
01:40:53.120 | we just don't have good tools for them.
01:40:55.200 | And it's not enough to tell,
01:40:57.440 | there's no life change someone's gonna embrace
01:41:01.020 | that is going to be an obvious remedy for that.
01:41:03.420 | The pandemic, I mean, pandemics are obviously
01:41:08.300 | a complicated problem,
01:41:09.580 | but I would consider it much simpler than depression
01:41:14.020 | in terms of what's on the menu to be chosen
01:41:18.860 | among the various choices.
01:41:20.580 | - Just less multifactorial.
01:41:21.900 | - The logic by which you would make those choices, yeah.
01:41:23.900 | So it's like, we have a virus, we have a new virus.
01:41:27.600 | It's some version of bad, it's human transmissible.
01:41:32.320 | We're still catching up,
01:41:33.480 | we're catching up to every aspect of it.
01:41:34.880 | - We don't know how it spreads.
01:41:36.240 | - We don't know how-
01:41:37.280 | - How effective masks are.
01:41:38.880 | - Well, at a certain point, we knew it was respiratory,
01:41:40.480 | but we knew-
01:41:41.680 | - But how respiratory, what that means.
01:41:42.520 | - Yeah, and whether it's spread by fomites,
01:41:44.640 | and like all that, we were confused about a lot of things.
01:41:46.920 | And we're still confused.
01:41:48.200 | It's been a moving target this whole time,
01:41:49.920 | and it's been changing this whole time.
01:41:51.280 | And our responses to it have been,
01:41:54.480 | we ramped up the vaccines as quickly as we could,
01:41:59.480 | but too quick for some, not quick enough for others.
01:42:03.540 | We could have done human challenge trials
01:42:05.520 | and got them out more quickly with better data.
01:42:08.700 | And I think that's something we should probably look at
01:42:11.420 | in the future, because to my eye,
01:42:14.120 | that would make ethical sense to do challenge trials.
01:42:20.400 | - And so much of my concern about COVID,
01:42:22.840 | I mean, many people are confused
01:42:24.000 | about my concern about COVID.
01:42:25.480 | My concern about COVID has, for much of the time,
01:42:29.400 | not been narrowly focused on COVID itself,
01:42:32.520 | how dangerous I perceive COVID to be as an illness.
01:42:36.860 | It has been, for the longest time,
01:42:40.820 | even more a concern about our ability to respond
01:42:45.440 | to a truly scary pathogen next time.
01:42:49.600 | Outside those initial months,
01:42:53.540 | give me the first six months to be quite worried
01:42:58.160 | about COVID and the unraveling of society.
01:43:01.180 | - And the supply of toilet paper.
01:43:02.860 | - You wanna secure a steady supply of toilet paper.
01:43:05.320 | But beyond that initial period,
01:43:08.020 | when we had a sense of what we were dealing with
01:43:11.120 | and we had every hope that the vaccines
01:43:13.000 | are actually gonna work,
01:43:13.880 | and we knew we were getting those vaccines
01:43:15.880 | in short order, right?
01:43:17.440 | Beyond that, and we knew just how dangerous
01:43:21.000 | the illness was and how dangerous it wasn't.
01:43:23.480 | For years now, I've just been worrying about this
01:43:28.600 | as a failed dress rehearsal for something much worse.
01:43:31.800 | Right, I think what we prove to ourselves
01:43:34.180 | at this moment in history is that
01:43:36.440 | we have built informational tools
01:43:38.680 | that we do not know how to use,
01:43:40.240 | and we have made ourselves,
01:43:42.680 | we've basically enrolled all of human society
01:43:45.960 | into a psychological experiment that is deranging us
01:43:50.960 | and making it virtually impossible
01:43:55.000 | to solve coordination problems
01:43:57.140 | that we absolutely have to solve next time
01:43:59.680 | when things are worse.
01:44:00.560 | - Do you understand who's at fault
01:44:02.600 | for the way this unraveled?
01:44:05.420 | The way the distrust in institutions
01:44:11.600 | and the institution of science that grew
01:44:14.360 | like seemingly exponentially,
01:44:16.160 | or got revealed through this process.
01:44:18.680 | Who's at fault here?
01:44:20.100 | And what's to fix?
01:44:22.400 | - So much blame to go around,
01:44:23.840 | but so much of it is not a matter of
01:44:27.400 | bad people conspiring to do bad things.
01:44:30.520 | It's a matter of incompetence and misaligned incentives
01:44:35.520 | and just ordinary, you know,
01:44:40.440 | plain vanilla dysfunction.
01:44:42.500 | But my problem was that people like you,
01:44:45.240 | people like Brett Weinstein,
01:44:47.000 | people that I look to for reasonable,
01:44:49.900 | difficult conversations on difficult topics
01:44:52.560 | have a little bit lost their mind,
01:44:54.620 | became emotional and dogmatic in style of conversation,
01:44:57.640 | perhaps not in the depth of actual ideas,
01:45:01.180 | but, you know,
01:45:02.920 | I tweeted something of that nature,
01:45:04.240 | and not about you,
01:45:05.080 | but just it feels like the pandemic
01:45:07.520 | made people really more emotional than before.
01:45:11.960 | - And then Kimbal Musk responded,
01:45:14.160 | I think something I think you probably would agree with,
01:45:16.960 | maybe not.
01:45:17.800 | I think it was the combo of Trump and the pandemic.
01:45:20.200 | Trump triggered the far left to be way more active
01:45:23.160 | than they could have been without him.
01:45:24.940 | And then the pandemic handed big government,
01:45:27.400 | nanny state, lefties a huge platform on a silver platter,
01:45:30.480 | a one-two punch, and here we are.
01:45:32.520 | - Well, I would agree with some of that.
01:45:33.760 | I'm not sure how much to read into the nanny state concept,
01:45:38.020 | but yeah, like basically got people on the far left
01:45:42.100 | really activated and then gave control to,
01:45:45.820 | I don't know if you say nanny state,
01:45:47.500 | but just control to government
01:45:50.080 | that when executed poorly
01:45:53.100 | has created a complete distrust in government.
01:45:56.180 | - My fear is that there was going to be
01:45:57.500 | that complete distrust anyway,
01:45:59.800 | given the nature of the information space,
01:46:01.980 | given the level of conspiracy thinking,
01:46:04.220 | given the gaming of these tools by an anti-vax cult.
01:46:09.220 | I mean, there really is an anti-vax cult
01:46:13.140 | that just ramped up its energy during this moment.
01:46:17.560 | - But it's a small one.
01:46:19.820 | - It's not to say that everything,
01:46:21.540 | every concern about vaccines is a species of,
01:46:25.660 | it was born of misinformation or born of this cult,
01:46:27.920 | but there is a cult that is just,
01:46:32.060 | and the core of Trumpism is a cult.
01:46:34.540 | I mean, QAnon is a cult.
01:46:36.280 | And so there's a lot of lying
01:46:39.160 | and there's a lot of confusion.
01:46:40.720 | It's almost impossible to exaggerate
01:46:47.020 | how confused some people are
01:46:49.100 | and how fully their lives are organized
01:46:52.020 | around that confusion.
01:46:52.860 | I mean, there are people who think
01:46:54.020 | that the world's being run by pedophile cannibals
01:46:56.900 | and that Tom Hanks and Oprah Winfrey
01:46:59.300 | and Michelle Obama are among those cannibals.
01:47:01.580 | I mean, adjacent to the pure crazy,
01:47:05.820 | there's the semi-crazy,
01:47:07.020 | and adjacent to the semi-crazy,
01:47:08.660 | there's the grifting opportunist asshole.
01:47:12.180 | And the layers of bad faith are hard to fully diagnose,
01:47:17.180 | but the problem is all of this is getting signal boosted
01:47:26.100 | by an outrage machine
01:47:28.820 | that is preferentially spreading misinformation.
01:47:31.260 | It has a business model that guarantees
01:47:34.380 | that it is preferentially sharing misinformation.
01:47:37.060 | - Can I actually just on a small tangent,
01:47:38.900 | how do you defend yourself against the claim
01:47:42.180 | that you're a pedophile cannibal?
01:47:43.820 | - It's difficult.
01:47:45.660 | - Here's the case I would make,
01:47:47.620 | because I don't think you can use reason.
01:47:49.780 | I think you have to use empathy.
01:47:51.700 | You have to understand--
01:47:52.540 | - But what, like, part of it, I mean,
01:47:54.540 | I find it very difficult to believe
01:47:56.600 | that anyone believes these things.
01:47:58.220 | I mean, I think that there's,
01:47:59.420 | and I'm sure there's some number of people
01:48:02.900 | who are just pretending to believe these things,
01:48:05.320 | because it's just, again,
01:48:07.180 | this is sort of like the 4chanification of everything.
01:48:09.660 | It's just Pepe the Frog, right?
01:48:12.460 | Like, none of this is what it seems.
01:48:14.580 | They're not signaling an alliance
01:48:17.260 | with white supremacy or neo-Nazism,
01:48:19.980 | but they're not not doing it.
01:48:21.260 | Like, they just don't fucking care.
01:48:22.460 | It's just cynicism overflowing its banks, right?
01:48:26.420 | It's just fun to wind up the normies, right?
01:48:29.940 | Like, look at all the normies who don't understand
01:48:31.580 | that a green frog is just a green frog,
01:48:33.460 | even when it isn't just a green frog, right?
01:48:35.020 | It's like, it's just gumming up
01:48:37.940 | everyone's cognitive bandwidth with bullshit, right?
01:48:41.420 | I get that that's fun if you're a teenager
01:48:43.420 | and you just wanna vandalize our newsphere,
01:48:47.620 | but at a certain point, we have to recognize
01:48:51.620 | that real questions of human welfare are in play, right?
01:48:55.900 | It's like, there are wars getting fought or not fought,
01:49:00.220 | and there's a pandemic raging,
01:49:01.700 | and there's medicine to take or not take.
01:49:04.940 | But I mean, to come back to this issue of COVID,
01:49:07.980 | I don't think I got so out of balance around COVID.
01:49:12.380 | I think people are quite confused
01:49:14.100 | about what I was concerned about.
01:49:17.020 | I mean, yes, there was a period where I was crazy
01:49:21.540 | because anyone who was taking it seriously was crazy
01:49:23.820 | 'cause they had no idea what was going on.
01:49:25.340 | And so it's like, yes, I was wiping down packages
01:49:27.900 | with alcohol wipes, right?
01:49:30.420 | Because people thought it was transmissible by touch, right?
01:49:35.420 | And then when we realized that was no longer the case,
01:49:38.540 | I stopped doing that.
01:49:39.380 | But so again, it was a moving target,
01:49:42.940 | and a lot of things we did in hindsight
01:49:45.420 | around masking and school closures
01:49:48.260 | looked fairly dysfunctional, right?
01:49:52.260 | But unnecessary.
01:49:53.300 | - I think the criticism that people would say
01:49:56.780 | about your talking about COVID,
01:49:59.580 | and maybe you can correct me,
01:50:00.940 | but you were against skepticism
01:50:05.940 | of the safety and efficacy of the vaccine.
01:50:09.940 | So people who get nervous about the vaccine,
01:50:14.940 | but don't fall into the usual anti-vax camp,
01:50:19.180 | which I think there was a significant enough number,
01:50:22.660 | they're getting nervous.
01:50:24.340 | I mean, especially after the war in Afghanistan and Iraq,
01:50:29.340 | I too was nervous about anything
01:50:32.580 | where a lot of money could be made.
01:50:34.340 | And you just see how the people who are greedy,
01:50:39.620 | they come to the surface all of a sudden.
01:50:41.420 | And a lot of them that run institutions,
01:50:43.980 | actually really good human beings,
01:50:45.260 | I know a lot of them,
01:50:46.340 | but it's hard to know how those two combine together
01:50:49.740 | when there's hundreds of billions,
01:50:51.300 | trillions of dollars to be made.
01:50:53.300 | And so that skepticism,
01:50:55.100 | I guess the sense was that you weren't open enough
01:50:58.860 | to the skepticism.
01:50:59.780 | - I understand that people have that sense.
01:51:01.380 | I'll tell you how I thought about it and think about it.
01:51:04.700 | One, again, it was a moving target.
01:51:06.260 | So there was a point in the timeline
01:51:08.420 | where it was totally rational to expect
01:51:13.020 | that the vaccines were working,
01:51:19.860 | that they were reasonably safe,
01:51:23.740 | and that COVID was reasonably dangerous.
01:51:26.860 | And that the trade-off for basically everyone
01:51:29.180 | was it was rational to get vaccinated,
01:51:31.140 | given the level of testing
01:51:33.540 | and how many people had been vaccinated before you,
01:51:35.740 | given what we were seeing with COVID,
01:51:37.580 | that that was a forced choice.
01:51:40.580 | You're eventually gonna get COVID,
01:51:42.220 | and the question is,
01:51:43.060 | do you wanna be vaccinated when you do?
01:51:44.780 | There was a period where that forced choice,
01:51:47.460 | where it was just obviously reasonable to get vaccinated,
01:51:51.460 | especially because there was every reason to expect
01:51:56.100 | that while it wasn't a perfectly sterilizing vaccine,
01:51:59.660 | it was going to knock down transmission a lot,
01:52:03.060 | and that matters.
01:52:03.900 | And so it wasn't just a personal choice.
01:52:05.940 | You were actually being a good citizen
01:52:08.820 | when you decided to run whatever risk
01:52:10.780 | you were gonna run to get vaccinated,
01:52:13.740 | because there are people in our society
01:52:16.100 | who actually can't get vaccinated.
01:52:17.860 | I mean, I know people who can't take any vaccines.
01:52:19.660 | They're so allergic to, I mean,
01:52:22.100 | they in their own person seem to justify
01:52:25.380 | all of the fears of the anti-vax cult.
01:52:27.620 | I mean, it's like they're the kind of person
01:52:28.740 | who Robert Kennedy Jr. can point to and say,
01:52:30.940 | "See, vaccines will fucking kill you," right?
01:52:34.620 | Because of the experience,
01:52:36.500 | and we're still, I know people who have kids
01:52:39.140 | who fit that description, right?
01:52:40.300 | So we should all feel a civic responsibility
01:52:44.820 | to be vaccinated against egregiously awful
01:52:49.460 | and transmissible diseases
01:52:51.540 | for which we have relatively safe vaccines
01:52:54.900 | to keep those sorts of people safe.
01:52:56.860 | - And there was a period of time
01:52:57.900 | when it was thought that the vaccine
01:52:59.020 | could stop transmission.
01:53:00.140 | - Yes, and so again, all of this has begun to shift.
01:53:04.100 | I don't think it has shifted as much
01:53:06.580 | as Brett Weinstein thinks it's shifted,
01:53:08.900 | but yes, there are safety concerns
01:53:12.420 | around the mRNA vaccines, especially for young men, right?
01:53:17.380 | As far as I know, that's the purview
01:53:20.300 | of actual heightened concern.
01:53:23.420 | But also, there's now a lot of natural immunity out there.
01:53:29.940 | Basically, everyone who was gonna get vaccinated
01:53:32.900 | has gotten vaccinated.
01:53:34.420 | The virus has evolved to the point in this context
01:53:38.260 | where it seems less dangerous.
01:53:43.260 | Again, I'm going more on the seemings
01:53:47.100 | than on research that I've done at this point,
01:53:50.300 | but I'm certainly less worried about getting COVID.
01:53:52.420 | I've had it once.
01:53:53.340 | I've been vaccinated.
01:53:54.460 | So you ask me now,
01:53:56.740 | how do I feel about getting the next booster?
01:53:59.160 | I don't know that I'm going to get the next booster, right?
01:54:02.940 | So I was somebody who was waiting in line
01:54:06.340 | at four in the morning,
01:54:07.940 | hoping to get some overflow vaccine
01:54:11.060 | when it was first available.
01:54:12.860 | And that was, at that point, given what we knew,
01:54:16.740 | or given what I thought I knew
01:54:18.580 | based on the best sources I could consult
01:54:20.620 | and based on anecdotes that were too vivid to ignore,
01:54:25.620 | both data and personal experience,
01:54:28.580 | it was totally rational for me to want to get that vaccine
01:54:33.860 | as soon as I could.
01:54:35.620 | And now, I think it's totally rational
01:54:37.580 | for me to do a different kind of cost-benefit analysis
01:54:41.700 | and wonder, listen, do I really need to get a booster?
01:54:45.540 | How many of these boosters am I going to get
01:54:48.820 | for the rest of my life, really?
01:54:51.000 | And how safe is the mRNA vaccine for a man of my age?
01:54:56.000 | And do I need to be worried about myocarditis?
01:54:58.580 | All of that is completely rational to talk about now.
01:55:02.680 | My concern is that at every point along the way,
01:55:07.120 | I was the wrong person,
01:55:10.320 | and Brett Weinstein was the wrong person,
01:55:13.360 | and there's many other people I could add to this list,
01:55:15.620 | to have strong opinions about any of this stuff.
01:55:19.080 | - I just disagree with that.
01:55:20.160 | I think, yes, in theory, I agree 100%,
01:55:24.460 | but I feel like experts failed at communicating.
01:55:27.760 | Not at doing-- - They did.
01:55:29.600 | - And I just feel like you and Brett Weinstein
01:55:33.040 | actually have the tools with the internet,
01:55:35.560 | given the engine you have in your brain
01:55:38.200 | of thinking for months at a time
01:55:40.680 | deeply about the problems that face our world,
01:55:44.660 | that you actually have the tools
01:55:45.780 | to do pretty good thinking here.
01:55:47.680 | The problem I have with experts--
01:55:49.360 | - But there would be deference to experts
01:55:51.240 | and pseudo-experts behind all of that.
01:55:53.240 | - Well, the papers, you would stand
01:55:54.300 | on the shoulders of giants,
01:55:55.360 | but you can surf those shoulders
01:55:56.760 | better than the giants themselves, it seems.
01:55:58.200 | - But I knew we were gonna disagree about that.
01:56:00.640 | I saw his podcast where he brought on these experts
01:56:03.640 | who had, many of them, had the right credentials,
01:56:07.780 | but for a variety of reasons,
01:56:10.300 | they didn't pass the smell test for me.
01:56:13.200 | One larger problem, and this goes back to the problem
01:56:15.560 | of how we rely on authority in science,
01:56:19.100 | is that you can always find a PhD or an MD
01:56:23.080 | to champion any crackpot idea.
01:56:26.240 | I mean, it is amazing, but you could find PhDs and MDs
01:56:30.000 | who would sit up there in front of Congress
01:56:32.400 | and say that they thought smoking was not addictive,
01:56:35.200 | or that it was not harmful to,
01:56:37.520 | there was no direct link between smoking and lung cancer.
01:56:40.920 | You could always find those people.
01:56:42.640 | But some of the people Brett found
01:56:46.840 | were people who had obvious tells, to my point of view,
01:56:49.640 | to my eye, and I saw them on,
01:56:52.520 | some of the same people were on Rogan's podcast, right?
01:56:55.040 | And it's hard, because if a person
01:57:00.040 | does have the right credentials,
01:57:03.160 | and they're not saying something floridly mistaken,
01:57:07.720 | and we're talking about something where
01:57:09.760 | they're genuine unknowns, right?
01:57:12.360 | Like how much do we know
01:57:14.240 | about the safety of these vaccines, right?
01:57:15.920 | It's, at that point, not a whole hell of a lot.
01:57:18.760 | I mean, we have no long-term data on mRNA vaccines,
01:57:22.640 | but to confidently say that millions of people
01:57:25.520 | are gonna die because of these vaccines,
01:57:27.760 | and to confidently say that ivermectin is a panacea,
01:57:31.280 | right, ivermectin is the thing that prevents COVID, right?
01:57:34.600 | There was no good reason to say
01:57:36.400 | either of those things at that moment.
01:57:38.280 | And so, given that that's where Brett was,
01:57:42.600 | I felt like there was just no, there was nothing to debate.
01:57:45.480 | We're both the wrong people
01:57:46.840 | to be getting into the weeds on this.
01:57:48.960 | We're both gonna defer to our chosen experts.
01:57:53.160 | His experts look like crackpots to me,
01:57:55.800 | and, or at least the ones who are most vociferous
01:57:59.880 | on those edgiest points, it seemed most--
01:58:02.640 | - And your experts seem like, what is the term,
01:58:04.680 | mass hysteria, I forgot the term.
01:58:06.960 | - Well, no, but it's like with climate science.
01:58:10.160 | I mean, this old, it's received as a canard
01:58:14.680 | in half of our society now,
01:58:15.720 | but the claim that 97% of climate scientists
01:58:18.880 | agree that human-caused climate change is a thing, right?
01:58:23.200 | So do you go with the 97% most of the time,
01:58:26.240 | or do you go with the 3% most of the time?
01:58:29.140 | It's obvious you go with the 97% most of the time
01:58:32.240 | for anything that matters.
01:58:33.760 | It's not to say that the 3% are always wrong.
01:58:36.840 | Again, there are, things get overturned.
01:58:39.580 | And yes, as you say,
01:58:41.320 | and I've spent much more time worrying about this
01:58:43.520 | on my podcast than I've spent worrying about COVID,
01:58:46.120 | our institutions have lost trust for good reason, right?
01:58:51.120 | And it's an open question whether
01:58:56.560 | we can actually get things done
01:59:00.560 | with this level of transparency and pseudo-transparency,
01:59:05.360 | given our information ecosystems.
01:59:07.440 | Like, can we fight a war, really fight a war
01:59:10.780 | that we may have to fight, like the next Nazis?
01:59:13.360 | Can we fight that war when everyone with an iPhone
01:59:17.040 | is showing just how awful it is
01:59:19.080 | that little girls get blown up when we drop our bombs, right?
01:59:22.680 | Like, could we as a society do what we might have to do
01:59:27.440 | to actually get necessary things done
01:59:31.000 | when we're living in this panopticon
01:59:34.200 | of just, you know, everyone's a journalist, right?
01:59:36.920 | Everyone's a scientist, everyone's an expert,
01:59:39.000 | everyone's got direct contact with the facts,
01:59:43.000 | or a semblance of the facts, I don't know.
01:59:46.040 | - I think yes, and I think voices like yours
01:59:48.400 | are exceptionally important,
01:59:49.560 | and I think there's certain signals you send
01:59:52.560 | in your ability to steelman the other side,
01:59:54.720 | in your empathy, essentially.
01:59:57.400 | So that's the fight, that's the mechanism
02:00:01.060 | by which you resist the dogmatism of this binary thinking.
02:00:06.060 | And then if you become a trusted person
02:00:11.920 | that's able to consider the other side,
02:00:14.040 | then people will listen to you as the aggregator,
02:00:17.520 | as the communicator of expertise.
02:00:19.360 | 'Cause the virologists haven't been able
02:00:21.440 | to be good communicators.
02:00:22.680 | I still, to this day, don't really know
02:00:27.000 | what is the, what am I supposed to think
02:00:29.840 | about the safety and efficacy of the vaccines today?
02:00:33.960 | As it stands today, what are we supposed to think?
02:00:36.200 | What are we supposed to think about testing?
02:00:38.400 | What are we supposed to think about
02:00:39.480 | the effectiveness of masks or lockdowns?
02:00:42.400 | Where's the great communicators on this topic
02:00:45.600 | that consider all the other conspiracy theories,
02:00:48.520 | all the communication that's out there,
02:00:51.740 | and actually aggregate it together
02:00:53.960 | and be able to say this is actually
02:00:55.860 | what's most likely the truth.
02:00:58.400 | And also some of that has to do with humility,
02:01:02.200 | epistemic humility, knowing that you can't
02:01:04.160 | really know for sure.
02:01:05.260 | Just like with depression, you can't really know for sure.
02:01:08.720 | Where's the, I'm not seeing those communications
02:01:11.640 | being effectively done, even still today.
02:01:14.020 | - Well, the jury is still out on some of it.
02:01:17.240 | And again, it's a moving target.
02:01:19.720 | And some of it, I mean, it's complicated.
02:01:21.800 | Some of it's a self-fulfilling dynamic where,
02:01:26.800 | so like lockdowns, in theory, lockdowns,
02:01:30.320 | a lockdown would work if we could only do it.
02:01:33.800 | But we can't really do it.
02:01:35.020 | And there's a lot of people who won't do it
02:01:36.480 | because they're convinced that it's,
02:01:37.740 | this is the totalitarian boot,
02:01:40.240 | finally on the neck of the good people
02:01:44.400 | who are always having their interests
02:01:49.400 | traduced by the elites, right?
02:01:51.160 | So like this is, if you have enough people
02:01:53.600 | who think the lockdown, for any reason,
02:01:56.160 | in the face of any conceivable illness, right,
02:01:59.660 | is just code for the new world order
02:02:02.760 | coming to fuck you over and take your guns, right?
02:02:05.920 | Okay, you have a society that is now immune to reason, right?
02:02:09.740 | 'Cause there are absolutely certain pathogens
02:02:12.880 | that we should lock down for next time, right?
02:02:15.680 | And it was completely rational in the beginning
02:02:20.440 | of this thing to lock down, given,
02:02:23.540 | to attempt to lock down, we never really locked down,
02:02:26.300 | to attempt some semblance of a lockdown
02:02:29.020 | just to, quote, bend the curve,
02:02:30.880 | to spare our healthcare system,
02:02:33.240 | given what we were seeing happening in Italy, right?
02:02:35.760 | Like that moment was not hard to navigate,
02:02:38.480 | at least in my view.
02:02:39.920 | It was obvious at the time.
02:02:41.800 | In retrospect, my views on that haven't changed,
02:02:44.900 | except for the fact that I recognize
02:02:48.140 | maybe it's just impossible,
02:02:50.760 | given the nature of people's response
02:02:53.440 | to that kind of demand, right?
02:02:55.840 | We live in a society that's just not gonna lock down.
02:02:58.120 | - Unless the pandemic is much more deadly.
02:03:01.000 | - Right, so that's a point I made,
02:03:02.600 | which was maliciously clipped out from some other podcast
02:03:05.600 | when someone's trying to make it look like,
02:03:07.720 | I wanna see children die.
02:03:09.440 | Look, it's a pity more children didn't die from COVID, right?
02:03:12.440 | This is actually the same person who,
02:03:16.400 | and that's the other thing that got so poisoned here.
02:03:20.560 | It's like that person, this psychopath,
02:03:22.720 | or effective psychopath,
02:03:23.840 | who's creating these clips of me on podcasts,
02:03:26.480 | this second clip of me seeming to say
02:03:29.960 | that I wish more children died during COVID,
02:03:32.240 | but it was so clear in context what I was saying
02:03:36.760 | that even the clip betrayed the context,
02:03:38.520 | so it didn't actually work.
02:03:39.840 | This psycho, and again,
02:03:41.920 | I don't know whether he actually is a psychopath,
02:03:43.620 | but he's behaving like one
02:03:44.960 | because of the incentives of Twitter.
02:03:47.020 | This is somebody who Brett signal boosted
02:03:49.640 | as a very reliable source of information, right?
02:03:54.640 | He kept retweeting this guy at me, against me, right?
02:03:59.320 | And this guy, at one glance,
02:04:01.520 | I knew how unreliable this guy was, right?
02:04:03.920 | But I think I'm not at all,
02:04:07.920 | one thing I think I did wrong,
02:04:09.800 | one thing that I do regret,
02:04:11.760 | one thing I have not sorted out for myself
02:04:14.320 | is how to navigate the professional and personal pressure
02:04:19.320 | that gets applied at this moment
02:04:27.380 | where you have a friend or an acquaintance
02:04:29.660 | or someone you know who's behaving badly in public
02:04:33.920 | or behaving badly,
02:04:36.360 | behaving in a way that you think is bad in public,
02:04:39.920 | and they have a public platform
02:04:42.380 | where they're influencing a lot of people,
02:04:44.080 | and you have your own public platform
02:04:45.680 | where you're constantly getting asked to comment
02:04:49.040 | on what this friend or acquaintance or colleague is doing.
02:04:54.560 | - I haven't known what I think is ethically right
02:04:58.020 | about the choices that seem forced on us
02:05:01.820 | at moments like this.
02:05:02.660 | So I've criticized you in public
02:05:05.220 | about your interview with Kanye.
02:05:06.900 | Now, in that case, I reached out to you in private first
02:05:11.440 | and told you exactly what I thought,
02:05:12.900 | and then when I was gonna get asked in public
02:05:15.220 | or when I was touching that topic on my podcast,
02:05:18.440 | I more or less said the same thing
02:05:19.620 | that I said to you in private, right?
02:05:20.600 | Now, that was how I navigated that moment.
02:05:24.420 | I did the same thing with Elon,
02:05:27.760 | at least at the beginning.
02:05:29.620 | We have maintained good vibes,
02:05:37.880 | which is not what I can say about Elon.
02:05:40.080 | - I don't think, I disagree with you,
02:05:41.880 | 'cause good vibes in the moment,
02:05:44.200 | there's a deep core of good vibes that persists through time
02:05:47.360 | between you and Elon, and I would argue probably
02:05:49.720 | between some of the other folks you mentioned.
02:05:51.880 | - I think with Brett, I failed to reach out in private
02:05:55.860 | to the degree that I should have,
02:05:58.740 | and we never really had,
02:06:00.240 | we had tried to set up a conversation in private
02:06:04.180 | that never happened, but there was some communication,
02:06:07.260 | but it would have been much better for me
02:06:11.740 | to have made more of an effort in private than I did
02:06:15.180 | before it spilled out into public,
02:06:16.880 | and I would say that's true with other people as well.
02:06:19.780 | - What kind of interaction in private
02:06:21.560 | do you think you should have with Brett?
02:06:23.040 | Because my case would be beforehand, and now still.
02:06:26.680 | The case I would like, and this part of the criticism
02:06:31.120 | you sent my way, maybe it's useful to go to that direction.
02:06:35.640 | Actually, let's go to that direction,
02:06:37.240 | because I think I disagree with your criticism
02:06:40.400 | as you stated publicly, but this is very--
02:06:41.880 | - You're talking about your interview with Kanye?
02:06:43.680 | - Yeah, yeah, yeah.
02:06:44.520 | The thing you criticized me for
02:06:45.600 | is actually the right thing to do with Brett.
02:06:47.360 | Okay, you said, "Lex could have spoken with Kanye
02:06:50.540 | "in such a way as to have produced a useful document.
02:06:54.180 | "He didn't do that because he has a fairly naive philosophy
02:06:57.400 | "about the power of love."
02:06:59.180 | - Let's see if you can maintain that philosophy
02:07:03.120 | in the presence of this criticism.
02:07:03.960 | - Let's go.
02:07:05.140 | No, it's beautiful.
02:07:07.000 | He seemed to think that if he just got through
02:07:09.760 | the minefield to the end of the conversation,
02:07:12.660 | where the two of them still were feeling good
02:07:15.240 | about one another and they can hug it out,
02:07:17.640 | that would be by definition a success.
02:07:20.320 | So let me make the case for this power of love philosophy.
02:07:25.080 | And first of all, I love you, Sam.
02:07:27.280 | You're still an inspiration and somebody I deeply admire.
02:07:30.600 | Okay. - Back at you.
02:07:31.720 | - To me, in the case of Kanye,
02:07:35.680 | it's not only that you get through the conversation
02:07:40.280 | and have hugs, it's that the display
02:07:43.920 | that you're willing to do that has power.
02:07:47.120 | So even if it doesn't end in hugging,
02:07:49.440 | the actual, the turning the other cheek,
02:07:52.360 | the act of turning the other cheek itself communicates
02:07:56.040 | both to Kanye later and to the rest of the world
02:07:59.840 | that we should have empathy and compassion towards each other.
02:08:04.160 | There is power to that.
02:08:05.520 | Maybe that is naive, but I believe in the power of that.
02:08:09.840 | So it's not that I'm trying to convince Kanye
02:08:12.040 | that some of his ideas are wrong,
02:08:14.040 | but I'm trying to illustrate that just the act of listening
02:08:17.840 | and truly trying to understand the human being,
02:08:20.240 | that opens people's minds to actually questioning
02:08:26.260 | their own beliefs more.
02:08:27.880 | It takes them out of the dogmatism,
02:08:29.440 | deescalates the kind of dogmatism that I've been seeing.
02:08:33.640 | So in that sense, I would say the power of love
02:08:36.840 | is the philosophy you might apply to Bret
02:08:40.940 | because the right conversation you have in private
02:08:43.740 | is not about, hey, listen, the experts you're talking to,
02:08:48.740 | they seem credentialed, but they're not actually
02:08:52.500 | as credentialed as they're illustrating,
02:08:54.400 | they're not grounding their findings
02:08:55.960 | in actual meta-analyses and papers and so on.
02:08:58.400 | Like making a strong case, like what are you doing?
02:09:00.880 | This is gonna get a lot of people in trouble.
02:09:02.460 | But instead just saying, like being a friend
02:09:05.320 | in the dumbest of ways, being like respectful,
02:09:10.640 | sending love their way, and just having a conversation
02:09:14.120 | outside of all of this.
02:09:15.540 | Like basically showing that like,
02:09:19.280 | removing the emotional attachment to this debate,
02:09:25.640 | even though you are very emotionally attached
02:09:27.600 | because in the case of COVID specifically,
02:09:29.740 | there is a very large number of lives at stake.
02:09:33.300 | But removing all of that and remembering
02:09:35.700 | that you have a friendship.
02:09:38.500 | - Yeah, well, so I think these are highly
02:09:40.600 | non-analogous cases, right?
02:09:42.500 | So your conversation with Kanye misfired
02:09:45.940 | from my point of view for a very different reason.
02:09:48.440 | It has to do with Kanye.
02:09:51.660 | I mean, so Kanye, I don't know, I've never met Kanye,
02:09:55.380 | so obviously I don't know him.
02:09:56.880 | But I think he's either obviously
02:10:02.780 | in the midst of a mental health crisis
02:10:05.660 | or he's a colossal asshole.
02:10:08.700 | Or both, I mean, those aren't mutually exclusive.
02:10:10.700 | So one of three possibilities,
02:10:12.300 | he's either mentally ill, he's an asshole,
02:10:15.460 | or he's mentally ill and an asshole.
02:10:17.780 | - I think all three of those possibilities
02:10:19.460 | are possible for the both of us as well.
02:10:21.380 | - No, I would argue none of those are likely
02:10:24.660 | for either of us, but--
02:10:26.780 | - Possible.
02:10:27.620 | - Not to say we don't have our moments, but.
02:10:29.780 | So the reason not to talk to Kanye,
02:10:32.780 | so I think you should have had the conversation
02:10:35.540 | you had with him in private, that's great.
02:10:37.660 | And I've got no criticism of what you said
02:10:41.900 | had it been in private.
02:10:43.340 | In public, I just thought you're not doing him a favor.
02:10:47.740 | If he's mentally ill, right,
02:10:50.420 | he's in the middle of a manic episode,
02:10:53.660 | or I'm not a clinician, but I've heard it said of him
02:10:57.040 | that he is bipolar, you're not doing him a favor
02:11:01.840 | sticking a mic in front of him
02:11:03.180 | and letting him go off on the Jews or anything else.
02:11:06.260 | Right?
02:11:07.100 | We know what he thought about the Jews,
02:11:10.300 | we know that there's not much illumination
02:11:12.900 | that's gonna come from him on that topic.
02:11:15.740 | And if it is a symptom of his mental illness
02:11:19.140 | that he thinks these things, well then,
02:11:21.260 | you're not doing him a favor making that even more public.
02:11:24.140 | If he's just an asshole and he's just an antisemite,
02:11:28.820 | an ordinary garden variety antisemite,
02:11:31.500 | well then, there's also not much to say
02:11:34.940 | unless you're really gonna dig in
02:11:36.740 | and kick the shit out of him in public.
02:11:39.780 | And I'm saying you can do that with love.
02:11:41.940 | I mean, that's the other thing here is that
02:11:44.620 | I don't agree that compassion and love always have
02:11:48.640 | this patient, embracing, acquiescent face, right?
02:11:53.640 | They don't always feel good to the recipient, right?
02:11:59.220 | There is a sort of wisdom that you can wield
02:12:02.820 | compassionately in moments like that
02:12:05.100 | where someone's full of shit and you just make it
02:12:07.700 | absolutely clear to them and to your audience
02:12:10.220 | that they're full of shit.
02:12:11.100 | And there's no hatred being communicated.
02:12:13.740 | In fact, you could just, it's like,
02:12:14.900 | listen, I'm gonna do everyone a favor right now
02:12:16.500 | and just take your foot out of your mouth.
02:12:20.620 | And the truth is, I just wouldn't have aired
02:12:25.620 | the conversation.
02:12:26.860 | Like, I just don't think it was a document
02:12:28.020 | that had to get out there, right?
02:12:29.180 | I get that many people,
02:12:31.820 | this is not a signal you're likely to get
02:12:33.660 | from your audience, right?
02:12:34.500 | Like, I get that many people in your audience thought,
02:12:36.780 | oh my God, that's awesome.
02:12:37.820 | You're talking to Kanye and you're doing it in Lex style,
02:12:40.660 | where it's just love and you're not treating him
02:12:42.920 | like a pariah.
02:12:44.180 | And you're holding this tension between
02:12:46.820 | he's this creative genius who does work we love,
02:12:49.300 | and yet he's having this moment that's so painful.
02:12:51.780 | And what a tightrope walk.
02:12:53.680 | And I get that maybe 90% of your audience saw it that way.
02:12:57.820 | They're still wrong.
02:12:59.340 | And I still think that was on balance
02:13:02.180 | not a good thing to put out into the world.
02:13:03.660 | - You don't think it opens up the mind and heart
02:13:05.500 | of people that listen to that?
02:13:06.740 | Just seeing a person--
02:13:08.300 | - If it does, it's opening up in the wrong direction
02:13:12.700 | where just gale force nonsense is coming in, right?
02:13:15.980 | I think we should have an open mind and an open heart,
02:13:19.180 | but there's some clear things here
02:13:21.940 | that we have to keep in view.
02:13:24.200 | One is the mental illness component is its own thing.
02:13:27.300 | I don't pretend to understand what's going on with him,
02:13:29.200 | but insofar as that's the reason he's saying what he's saying,
02:13:33.780 | do not put this guy on camera; let no one see it.
02:13:37.420 | - Sorry, on that point, real quick,
02:13:38.580 | I had a bunch of conversation with him offline
02:13:40.580 | and I didn't get a sense of mental illness.
02:13:42.820 | That's why I chose to sit down.
02:13:45.500 | And I didn't get it.
02:13:46.460 | I mean, mental illness is such a...
02:13:48.260 | - But when he shows up in a gimp hood
02:13:51.920 | on Alex Jones's podcast,
02:13:54.180 | either that's more genius performance in his world
02:13:58.700 | or he's unraveling further.
02:14:00.940 | - I wouldn't put that under mental illness.
02:14:03.100 | I think there's another conversation to be had
02:14:06.820 | about how we treat artists.
02:14:10.380 | - Right.
02:14:11.220 | - Because they're weirdos.
02:14:14.160 | They're very, I mean,
02:14:15.520 | taking words from Kanye as if he's like Christopher Hitchens
02:14:22.020 | or something like that,
02:14:23.060 | like very eloquent, researched,
02:14:27.540 | written many books on history and politics
02:14:30.900 | and geopolitics, on psychology.
02:14:33.300 | Kanye didn't do any of that.
02:14:34.860 | He's an artist just spouting off.
02:14:36.820 | And so there's a different style of conversation
02:14:39.340 | and a different way to treat the words
02:14:42.860 | that are coming out of his mouth.
02:14:43.700 | - Let's leave the mental illness aside.
02:14:44.940 | So if we're gonna say that there's no reason
02:14:46.780 | to think he's mentally ill,
02:14:47.780 | and this is just him being creative
02:14:49.220 | and brilliant and opinionated,
02:14:51.780 | well, then that falls into the asshole bucket for me.
02:14:54.220 | It's like, then he's someone...
02:14:55.820 | And honestly, the most offensive thing about him
02:14:58.660 | in that interview, from my point of view,
02:15:00.100 | is not the antisemitism, which we can talk about,
02:15:03.460 | 'cause I think there are problems just letting him
02:15:06.020 | spread those memes as well.
02:15:09.380 | But the most offensive thing
02:15:11.140 | is just how delusionally egocentric he is
02:15:15.800 | or was coming off in that interview and in others.
02:15:18.340 | Like he has an estimation of himself
02:15:21.580 | as this omnibus genius
02:15:25.180 | not only to rival Shakespeare, to exceed Shakespeare.
02:15:28.540 | I mean, he's like, he is the greatest mind
02:15:30.300 | that has ever walked among us.
02:15:32.140 | And he's more or less explicit on that point.
02:15:34.500 | And yet he manages to talk for hours
02:15:36.380 | without saying anything actually interesting or insightful
02:15:39.100 | or factually illuminating.
02:15:41.620 | So it's complete delusion of a very Trumpian sort.
02:15:46.020 | It's like when Trump says he's a genius
02:15:48.660 | who understands everything,
02:15:49.900 | but nobody takes him seriously,
02:15:52.020 | and one wonders whether Trump takes himself seriously,
02:15:54.420 | Kanye seems to believe, he seems to believe his own press.
02:15:58.100 | He actually thinks he's just a colossus.
02:16:03.100 | And he may be a great musician.
02:16:07.260 | It's certainly not my wheelhouse
02:16:10.880 | to compare him to any other musicians.
02:16:12.340 | But one thing that's patently obvious from your conversation
02:16:17.340 | is he's not who he thinks he is intellectually or ethically
02:16:22.820 | or in any other relevant way.
02:16:25.260 | And so when you couple that
02:16:27.460 | to the antisemitism he was spreading,
02:16:29.980 | which was genuinely noxious and ill-considered
02:16:33.580 | and has potential knock-on effects in the black community.
02:16:38.580 | I mean, there's an ambient level of antisemitism
02:16:41.860 | in the black community that is worth worrying about
02:16:44.380 | and talking about anyway.
02:16:45.980 | There's a bunch of guys playing the knockout game
02:16:48.220 | in Brooklyn, just punching Orthodox Jews in the face.
02:16:51.460 | And I think letting Kanye air his antisemitism
02:16:55.020 | that publicly only raises the likelihood of that
02:16:59.180 | rather than diminishes it.
02:17:00.180 | - I don't know.
02:17:01.020 | So let me say just a couple of things.
02:17:02.460 | So one, my belief at the time was it doesn't,
02:17:06.020 | it decreases it.
02:17:06.860 | Showing empathy while pushing back
02:17:09.020 | decreases the likelihood of that.
02:17:10.660 | It does, it might on the surface look
02:17:13.700 | like it's increasing it,
02:17:14.940 | but that's simply because the antisemitism
02:17:17.300 | or the hatred in general is brought to the surface.
02:17:19.780 | And then people talk about it.
02:17:22.260 | But I should also say that you're one of the only people
02:17:25.140 | that wrote to me privately criticizing me.
02:17:27.340 | And like out of the people I really respect and admire,
02:17:31.940 | and that was really valuable.
02:17:33.020 | That like I had to, painful,
02:17:35.220 | 'cause I had to think through it for a while.
02:17:37.460 | And it still haunts me because the other kind of criticism
02:17:41.580 | I got a lot of, people basically said,
02:17:46.100 | things towards me based on who I am that they hate me.
02:17:51.100 | Just--
02:17:52.020 | - You mean antisemitic things?
02:17:52.860 | Or that you-- - Yeah, antisemitic things.
02:17:53.900 | I just hate the word antisemitic.
02:17:55.700 | It's like racist.
02:17:58.220 | - But here's the reality.
02:18:00.020 | So I'm someone, so I'm Jewish,
02:18:02.980 | although obviously not religious.
02:18:04.620 | I have never taken,
02:18:08.100 | I've been a student of the Holocaust, obviously.
02:18:10.940 | I know a lot about that.
02:18:12.540 | And there's reason to be a student of the Holocaust.
02:18:17.060 | But in my lifetime and in my experience,
02:18:20.700 | I have never taken antisemitism very seriously.
02:18:24.220 | I have not worried about it.
02:18:26.020 | I have not made a thing of it.
02:18:28.260 | I've done exactly one podcast on it.
02:18:30.460 | I had Barry Weiss on my podcast when her book came out.
02:18:34.380 | But it really is a thing.
02:18:39.980 | And it's something we have to keep an eye on societally
02:18:44.980 | because it's a unique kind of hatred.
02:18:50.140 | It's unique in that it seems,
02:18:53.820 | it's knit together with, it's not just ordinary racism.
02:18:56.460 | It's knit together with lots of conspiracy theories
02:18:58.980 | that never seem to die out.
02:19:00.420 | It can by turns equally animate the left
02:19:05.860 | and the right politically.
02:19:07.020 | I mean, what's so perverse about antisemitism,
02:19:09.340 | look in the American context,
02:19:11.020 | with the far right, with white supremacists,
02:19:14.020 | Jews aren't considered white.
02:19:15.380 | So they hate us in the same spirit
02:19:17.820 | in which they hate black people or brown people
02:19:20.300 | or anyone who's not white.
02:19:22.100 | But on the left, Jews are considered extra white.
02:19:25.460 | I mean, we're the extra beneficiaries
02:19:27.760 | of white privilege, right?
02:19:29.500 | And in the black community, that is often the case, right?
02:19:32.540 | We're a minority that has thrived.
02:19:34.620 | And it seems to stand as a counterpoint
02:19:38.700 | to all of the problems that other minorities suffer,
02:19:42.780 | in particular, African-Americans in the American context.
02:19:45.880 | And yeah, Asians are now getting a little bit of this,
02:19:49.820 | like the model minority issue.
02:19:53.580 | But Jews have had this going on for centuries and millennia,
02:19:57.660 | and it never seems to go away.
02:20:00.060 | And again, this is something that I've never focused on,
02:20:03.360 | but this has been at a slow boil
02:20:07.500 | for as long as we've been alive.
02:20:10.180 | And there's no guarantee it can't suddenly become
02:20:13.460 | much, much uglier than we have any reason
02:20:16.220 | to expect it to become, even in our society.
02:20:19.420 | And so there's kind of a special concern
02:20:23.380 | at moments like that,
02:20:24.220 | where you have an immensely influential person
02:20:27.640 | in a community who already has a checkered history
02:20:31.680 | with respect to their own beliefs about the Jews
02:20:34.140 | and the conspiracies and all the rest.
02:20:37.420 | And he's just messaging, not especially fully opposed
02:20:42.420 | by you or anyone else who's giving him the microphone
02:20:45.980 | at that moment, to the world.
02:20:48.700 | And so that made my spidey sense tingle.
02:20:53.260 | - Yeah, it's complicated.
02:20:54.900 | The stakes are very high.
02:20:56.360 | And as somebody who's been, obviously, family
02:20:59.340 | and also reading a lot about World War II,
02:21:01.820 | and just this whole period,
02:21:03.140 | it was a very difficult conversation.
02:21:05.900 | I believe in the power, especially given who I am,
02:21:09.780 | not always, but sometimes, often turning the other cheek.
02:21:16.460 | - Oh, yeah, and again, things change
02:21:19.060 | when they're for public consumption.
02:21:23.500 | The cut for me that has just,
02:21:28.580 | the use case I keep stumbling upon
02:21:30.660 | is the kinds of things that I will say
02:21:32.820 | on a podcast like this,
02:21:34.020 | or if I'm giving a public lecture,
02:21:36.020 | versus the kinds of things I will say at dinner
02:21:39.640 | with strangers or with friends.
02:21:41.940 | Like if you're in an elevator,
02:21:43.500 | like if I'm in an elevator with strangers,
02:21:45.580 | I do not feel, and I hear someone say something stupid,
02:21:48.300 | I don't feel an intellectual responsibility
02:21:51.780 | to turn around in the confines of that space with them
02:21:56.780 | and say, "Listen, that thing you just said
02:21:58.460 | "about X, Y, or Z is completely false,
02:22:00.680 | "and here's why," right?
02:22:02.140 | But if somebody says it in front of me
02:22:05.060 | on some public dais where I'm actually talking about ideas,
02:22:08.500 | that's when there's a different responsibility
02:22:11.580 | that comes online.
02:22:12.420 | - The question is how you say it, how you say it.
02:22:15.060 | - Or even whether you say anything in those.
02:22:17.500 | I mean, there are moments,
02:22:18.620 | there are definitely moments to privilege civility
02:22:21.940 | or just to pick your battles.
02:22:23.140 | I mean, sometimes it's just not worth it
02:22:24.740 | to get into it with somebody out in real life.
02:22:28.620 | - I just believe in the power of empathy,
02:22:31.180 | both in the elevator and when a bunch of people
02:22:35.940 | are listening, that when they see you
02:22:40.060 | willing to consider another human being's perspective,
02:22:44.360 | it just gives more power to your words after.
02:22:50.300 | - Well, yeah, but until it doesn't.
02:22:52.460 | 'Cause you can extend charity too far, right?
02:22:59.100 | It can be absolutely obvious
02:23:00.300 | what someone's motives really are,
02:23:02.820 | and they're dissembling about that, right?
02:23:05.460 | And so then you're taking at face value
02:23:08.020 | their representations, and it begins to look like
02:23:10.300 | you're just being duped and you're not actually
02:23:13.260 | doing the work of putting pressure on a bad actor.
02:23:16.820 | And again, the mental illness component here
02:23:20.060 | makes it very difficult to think about
02:23:22.580 | what you should or shouldn't have said to Kanye.
02:23:24.860 | - So I think the topic of platforming is pretty interesting.
02:23:28.460 | What's your view on platforming controversial people?
02:23:32.460 | Let's start with the old, would you interview Hitler
02:23:36.980 | on your podcast, and how would you talk to him?
02:23:40.220 | Oh, and follow-up question.
02:23:42.380 | Would you interview him in 1935, '41, and then '45?
02:23:47.380 | - Well, I think we have an uncanny valley problem
02:23:53.780 | with respect to this issue of whether or not
02:23:58.780 | to speak to bad people, right?
02:24:00.820 | So if a person's sufficiently bad, right?
02:24:03.300 | If they're all the way out of the valley,
02:24:05.860 | then you can talk to them, and it's just,
02:24:07.780 | it's totally unproblematic to talk to them.
02:24:10.340 | Because you don't have to spend any time
02:24:12.660 | signaling to your audience that you don't agree with them.
02:24:14.620 | And if you're interviewing Hitler,
02:24:16.100 | you don't have to say, "Listen, I just gotta say,
02:24:17.700 | "before we start, I don't agree with the whole
02:24:20.140 | "genocide thing, and I just think you're killing
02:24:23.740 | "mental patients in vans, and that was all bad,
02:24:28.020 | "that was a bad look, Adolf."
02:24:30.340 | So you just, it can go without saying
02:24:33.660 | that you don't agree with this person,
02:24:35.620 | and you're not platforming them to signal boost
02:24:38.100 | their views, you're just trying to,
02:24:41.900 | if they're sufficiently evil, you can go into it
02:24:44.980 | very much as an anthropologist would.
02:24:47.440 | You just wanna understand the nature of evil, right?
02:24:51.660 | You just wanna understand this phenomenon,
02:24:53.380 | like how is this person who they are, right?
02:24:56.780 | And that strikes me as an intellectually interesting
02:25:01.860 | and morally necessary thing to do, right?
02:25:06.020 | So yes, I think you always interview Hitler.
02:25:09.420 | - Wait, wait, wait, wait, wait, wait, wait, wait, wait.
02:25:11.180 | - Well, when you know, once he's Hitler--
02:25:13.220 | - But when do you know it?
02:25:14.300 | - Once he's legitimately Hitler.
02:25:15.140 | - But when do you know it?
02:25:16.700 | Is genocide really happening?
02:25:19.180 | - Yeah, yeah, yeah.
02:25:20.020 | - It's not '42, '43?
02:25:20.860 | - No, no, no, if you're on the cusp of it
02:25:22.500 | where it's just he's someone who's gaining power
02:25:25.300 | and you don't wanna help facilitate that,
02:25:27.740 | then there's a question of whether you can undermine him
02:25:32.620 | while pushing back against him in that interview, right?
02:25:35.020 | So there are people I wouldn't talk to
02:25:37.420 | just because I don't wanna give them oxygen
02:25:39.260 | and I don't think that in the context of my interviewing them
02:25:44.060 | I'm gonna be able to take the wind out of their sails
02:25:46.780 | at all, right?
02:25:47.620 | So it's like, for whatever,
02:25:49.140 | either because of an asymmetric advantage,
02:25:51.980 | because I just know that they can do something
02:25:54.940 | within the span of an hour that I can't correct for.
02:26:01.300 | It's like they can light many small fires
02:26:03.860 | and it just takes too much time to put them out.
02:26:05.140 | - That's more like on the topic of vaccines, for example,
02:26:07.140 | having a debate on the efficacy of vaccines.
02:26:09.220 | - Yeah.
02:26:10.060 | It's not that I don't think sunlight
02:26:11.380 | is usually the best disinfectant, I think it is.
02:26:14.980 | Even these asymmetries aside,
02:26:17.060 | I mean, it is true that a person
02:26:21.020 | can always make a mess faster than you can clean it up,
02:26:23.940 | right, but still there are debates worth having
02:26:25.980 | even given that limitation.
02:26:27.740 | And there are the right people to have those specific debates.
02:26:30.980 | And there's certain topics where, you know,
02:26:33.300 | I'll debate someone just because
02:26:36.140 | I'm the right person for the job
02:26:37.620 | and it doesn't matter how messy they're gonna be.
02:26:40.900 | It's just worth it because I can make my points land
02:26:44.820 | at least to the right part of the audience.
02:26:47.380 | - So some of it is just your own skill and competence
02:26:49.980 | and also interest in preparing correctly?
02:26:52.620 | - Well, yeah, yeah, and the nature of the subject matter.
02:26:55.420 | But there are other people who just by default,
02:26:58.940 | I would say, well, there's no reason
02:27:00.780 | to give this guy a platform.
02:27:02.100 | And there are also people who are so confabulatory
02:27:04.780 | that they're making such a mess with every sentence
02:27:09.780 | that insofar as you're even trying to interact
02:27:15.020 | with what they're saying,
02:27:16.420 | you're by definition going to fail
02:27:19.460 | and you're going to seem to fail
02:27:22.140 | to a sufficiently large uninformed audience
02:27:26.220 | where it's gonna be a net negative for the cause of truth
02:27:29.780 | no matter how good you are.
02:27:30.740 | So like, for instance, I think talking to Alex Jones
02:27:35.740 | on any topic for any reason is probably a bad idea
02:27:39.380 | because I just think he's just neurologically wired
02:27:44.140 | to just utter a string of sentences.
02:27:46.880 | He'll get 20 sentences out,
02:27:49.020 | each of which contains more lies than the last.
02:27:54.020 | And there's not time enough in the world
02:27:59.380 | to run down, and certainly not time enough
02:28:01.380 | in the span of a conversation,
02:28:02.980 | to run down each of those leads to bedrock
02:28:06.700 | so as to falsify it.
02:28:07.940 | I mean, he'll just make shit up.
02:28:09.580 | Or make shit up and then weave it in with half-truths
02:28:15.540 | and micro-truths that give some semblance of credibility
02:28:20.540 | to somebody out there.
02:28:22.860 | I mean, apparently millions of people out there.
02:28:25.220 | And there's just no way to untangle that
02:28:29.100 | in real time with him.
02:28:30.180 | - I have noticed that you have an allergic reaction
02:28:33.100 | to confabularization.
02:28:36.820 | - Confabulation, yeah.
02:28:38.780 | - Confabulation.
02:28:40.340 | That if somebody says something a little micro-untruth,
02:28:45.780 | it really stops your brain.
02:28:48.060 | - Here I'm not talking about micro-untruths,
02:28:49.580 | I'm just talking about making up things out of whole cloth.
02:28:52.260 | Just like, if someone says something,
02:28:54.660 | well, what about, and then the thing they put
02:28:57.940 | at the end of that sentence is just a set of pseudofacts,
02:29:03.940 | right, that you can't possibly authenticate or not
02:29:07.100 | in the span of that conversation.
02:29:09.300 | They will, you know, whether it's about UFOs
02:29:11.260 | or anything else, right,
02:29:13.580 | they will seem to make you look like an ignoramus
02:29:17.140 | when in fact everything they're saying is specious,
02:29:21.900 | right, whether they know it or not.
02:29:22.940 | I mean, there's some people who are just crazy,
02:29:24.780 | there's some people who are just bullshitting
02:29:27.860 | and they're not even tracking whether it's true,
02:29:29.580 | it just feels good, and then some people
02:29:30.940 | are consciously lying about things.
02:29:32.980 | - But don't you think there's just a kind of jazz
02:29:36.180 | masterpiece of untruth that you should be able
02:29:38.740 | to just wave off by saying like,
02:29:42.660 | well, none of that is backed up by any evidence
02:29:44.780 | and just almost like take it to the humor place?
02:29:47.020 | - Well, yeah, but the thing is,
02:29:48.340 | I mean, the place I'm familiar with doing this
02:29:51.500 | and not doing this is on specific conspiracies
02:29:55.480 | like 9/11 truth, right?
02:29:57.780 | Like, the 9/11, so because of my,
02:30:00.940 | because of what 9/11 did to my intellectual life,
02:30:05.620 | I mean, it really just, you know,
02:30:07.220 | it sent me down a path for the better part of a decade.
02:30:10.500 | Like I became a critic of religion.
02:30:12.720 | I don't know if I was ever gonna be a critic of religion,
02:30:16.140 | right, but it happened to be in my wheelhouse
02:30:19.060 | 'cause I had spent so much time studying religion on my own
02:30:23.500 | and I was also very interested
02:30:26.820 | in the underlying spiritual concerns of every religion
02:30:30.500 | and so I was, you know,
02:30:33.800 | I devoted more than a full decade of my life
02:30:39.360 | to just, you know, what is real here?
02:30:41.980 | What is possible?
02:30:42.820 | What is the nature of subjective reality
02:30:45.500 | and how does it relate to reality at large
02:30:47.260 | and is there anything to, you know,
02:30:49.820 | who was someone like Jesus or Buddha
02:30:51.540 | and are these people frauds or are they,
02:30:53.900 | are these just myths or is there really a continuum
02:30:58.900 | of insight to be had here that is interesting?
02:31:02.420 | So I spent a lot of time on that question
02:31:06.260 | through my 20, the full decade of my 20s.
02:31:08.220 | - And that was launched in part by 9/11?
02:31:10.520 | - No, but then when 9/11 happened,
02:31:13.240 | I had spent all this time reading religious books,
02:31:15.880 | understanding, empathically understanding
02:31:18.720 | the motivations of religious people, right,
02:31:21.160 | knowing just how fully certain people believe
02:31:24.480 | what they say they believe, right?
02:31:25.640 | So I took religious convictions very seriously
02:31:28.360 | and then people started flying planes into our buildings
02:31:31.280 | and so I knew that there was something to be said about--
02:31:35.000 | - Allegedly.
02:31:36.180 | - The core doctrines of Islam, yeah, exactly.
02:31:38.900 | So I went down, so that became my wheelhouse for a time,
02:31:42.740 | you know, terrorism and jihadism and related topics
02:31:48.860 | and so the 9/11 truth conspiracy thing
02:31:51.900 | kept, you know, getting aimed at me
02:31:55.580 | and the question was, well, do I wanna debate these people?
02:32:00.580 | - Yeah, Alex Jones, perhaps.
02:32:02.380 | - Yeah, so Alex Jones, I think, was an early purveyor
02:32:04.980 | of it, although I don't think I knew who he was
02:32:06.360 | at that point.
02:32:07.200 | And so, and privately, I had some very long debates
02:32:13.120 | with people who, you know, one person in my family
02:32:15.360 | went way down that rabbit hole and I just, you know,
02:32:17.720 | every six months or so, I'd literally write
02:32:20.040 | the two-hour email, you know, that would try to,
02:32:23.040 | try to deprogram 'em, you know, however ineffectually
02:32:26.240 | and so I went back and forth for years on that topic
02:32:30.760 | with, in private, with people, but I could see
02:32:33.880 | the structure of the conspiracy, I could see the nature
02:32:36.140 | of how impossible it was to play whack-a-mole
02:32:41.140 | sufficiently well so as to convince anyone of anything
02:32:48.420 | who was not seeing the problematic structure
02:32:53.420 | of that way of thinking.
02:32:55.860 | I mean, it's not actually a thesis,
02:32:57.540 | it's a proliferation of anomalies that don't,
02:33:01.980 | you can't actually connect all the dots
02:33:03.920 | that are being pointed to, they don't connect
02:33:05.840 | in a coherent way, they're incompatible theses
02:33:09.200 | that are not, and their incompatibility
02:33:10.640 | is not being acknowledged.
02:33:11.940 | But they're running this algorithm of things are,
02:33:17.520 | things are never what they seem, there's always
02:33:19.560 | malicious conspirators doing things perfectly.
02:33:22.920 | We see all, we see evidence of human incompetence
02:33:25.520 | everywhere else, no one can tie their shoes,
02:33:29.000 | you know, expertly anywhere else, but over here,
02:33:33.940 | people are perfectly competent,
02:33:35.780 | they're perfectly concealing things,
02:33:37.460 | like thousands of people are collaborating,
02:33:40.460 | you know, inexplicably, I mean, incentivized by what,
02:33:43.780 | who knows, they're collaborating to murder
02:33:46.580 | thousands of their neighbors and no one is breathing
02:33:49.020 | a peep about it, no one's getting caught on camera,
02:33:51.700 | no one's breathed a word of it to a journalist,
02:33:57.460 | and so I've dealt with that style of thinking
02:34:02.460 | and I know what it's like to be in the weeds
02:34:07.080 | of a conversation like that and the person will say,
02:34:11.200 | okay, well, but what do you make of the fact
02:34:13.360 | that all those F-16s were flown 800 miles out to sea
02:34:18.360 | on the morning of 9/11 doing an exercise
02:34:21.500 | that hadn't even been scheduled for that day,
02:34:23.160 | but it was, and now all of these,
02:34:25.440 | I dimly recall some thesis of that kind,
02:34:28.440 | but I'm just making these things up now, right,
02:34:30.460 | so like that detail, hadn't even been scheduled
02:34:33.040 | for that day, it was inexplicably run that day,
02:34:34.840 | so how long would it take to track that down, right,
02:34:39.220 | the idea that this is anomalous,
02:34:41.360 | like there was an F-16 exercise run on,
02:34:46.360 | and it wasn't even supposed to be run that day, right?
02:34:50.880 | Someone like Alex Jones, their speech pattern
02:34:54.640 | is to pack as much of that stuff in as possible
02:34:58.640 | at the highest velocity that the person can speak
02:35:02.000 | and unless you're knocking down each one of those things
02:35:05.800 | to that audience, you appear to just be uninformed,
02:35:09.560 | you appear to just not be,
02:35:11.000 | wait a minute, he didn't know about the F-16s?
02:35:12.800 | - Yeah, sure.
02:35:14.720 | - He doesn't know about Project Mockingbird?
02:35:16.640 | You haven't heard about Project Mockingbird?
02:35:18.080 | I just made up Project Mockingbird, I don't know what it is,
02:35:19.860 | but that's the kind of thing that comes tumbling out
02:35:23.920 | in a conversation like that,
02:35:26.040 | that's the kind of thing, frankly,
02:35:27.120 | I was worried about in the COVID conversation
02:35:30.280 | because not that someone like Brett would do it consciously,
02:35:34.280 | but someone like Brett is swimming
02:35:36.880 | in a sea of misinformation on social,
02:35:39.600 | living on Twitter, getting people,
02:35:42.160 | sending the blog post and the study from the Philippines
02:35:47.160 | that showed that in this cohort, Ivermectin did X, right,
02:35:52.480 | and to actually run anything to ground, right,
02:35:57.480 | you have to actually do the work
02:36:01.000 | journalistically and scientifically
02:36:02.840 | and run it to ground, right?
02:36:05.800 | So for some of these questions,
02:36:07.920 | you actually have to be a statistician to say,
02:36:11.080 | okay, they used the wrong statistics in this experiment.
02:36:16.080 | Now, yes, we could take all the time to do that
02:36:21.160 | or we could at every stage along the way
02:36:23.320 | in a context where we have experts we can trust
02:36:28.080 | go with what 97% of the experts are saying about X,
02:36:32.480 | about the safety of mRNA,
02:36:33.800 | about the transmissibility of COVID,
02:36:36.160 | about whether to wear masks or not wear masks.
02:36:38.680 | And I completely agree that that broke down
02:36:41.920 | unacceptably over the last few years,
02:36:49.200 | but I think that's largely,
02:36:51.680 | social media and blogs and the efforts of podcasters
02:36:57.120 | and Substack writers were not just a response to that.
02:37:02.120 | It was, I think it was a symptom of that
02:37:06.080 | and a cause of that, right?
02:37:08.160 | And I think we're living in an environment where
02:37:11.120 | people, we've basically, we have trained ourselves
02:37:17.920 | not to be able to agree about facts on any topic,
02:37:22.200 | no matter how urgent, right?
02:37:24.080 | What's flying in our sky?
02:37:26.080 | What's happening in Ukraine?
02:37:30.040 | Is Putin just denazifying Ukraine?
02:37:33.600 | I mean, like, there are people who we respect
02:37:37.520 | who are spending time down that particular rabbit hole.
02:37:41.720 | Like this is, maybe there are a lot of Nazis in Ukraine
02:37:45.920 | and that's the real problem.
02:37:47.960 | Right, maybe Putin's,
02:37:49.400 | maybe Putin's not the bad actor here, right?
02:37:52.960 | How much time do I have to spend
02:37:54.760 | empathizing with Putin to the point of thinking,
02:37:58.240 | well, maybe Putin's got a point and it's like,
02:38:01.680 | what about the polonium and the nerve agents
02:38:04.520 | and the killing of journalists and the Navalny?
02:38:07.200 | And like, does that count?
02:38:09.480 | Well, no, listen, I'm not paying so much attention to that
02:38:11.880 | because I'm following all these interesting people
02:38:13.400 | on Twitter and they're giving me
02:38:15.440 | some pro-Putin material here.
02:38:18.440 | And there is a, there are some Nazis in Ukraine.
02:38:21.120 | It's not like there are no Nazis in Ukraine.
02:38:23.440 | How am I gonna weight these things?
02:38:25.680 | I think people are being driven crazy by Twitter.
02:38:28.560 | - Yeah.
02:38:29.840 | But you're kind of speaking to conspiracy theories
02:38:32.480 | that pollute everything.
02:38:33.440 | And then, but every example you gave
02:38:35.880 | is kind of a bad faith style of conversation.
02:38:40.080 | - But it's not necessarily knowingly bad faith by,
02:38:42.640 | I mean, the people who are worried about Ukrainian Nazis,
02:38:47.240 | to my, I mean, they're some of the same people.
02:38:49.960 | They're the same people who are worried that
02:38:52.160 | ivermectin got suppressed.
02:38:55.480 | Like ivermectin is really a panacea,
02:38:57.880 | but it got suppressed for,
02:38:59.720 | because no one could make billions on it.
02:39:01.760 | It's the same, it's literally,
02:39:06.840 | in many cases, the same people and the same efforts
02:39:11.400 | to unearth those facts.
02:39:12.840 | - And you're saying it's very difficult
02:39:13.880 | to have conversations with those kinds of people.
02:39:15.720 | What about a conversation with Trump himself?
02:39:19.600 | Would you do a podcast with Trump?
02:39:22.240 | - No, I don't think so.
02:39:24.600 | I don't think I'd be learning anything about him.
02:39:27.360 | It's like with Hitler,
02:39:29.400 | and I'm not comparing Trump to Hitler.
02:39:31.600 | - But clips guy, here's your chance.
02:39:33.880 | You got this one.
02:39:34.920 | - With certain world historical figures,
02:39:39.240 | I would just feel like, okay,
02:39:40.400 | this is an opportunity to learn something
02:39:42.680 | that I'm not gonna learn.
02:39:43.600 | I think Trump is among the most superficial people
02:39:47.760 | we have ever laid eyes on.
02:39:49.200 | Like he is in public view, right?
02:39:52.920 | And I'm sure there's some distance
02:39:55.960 | between who he is in private and who he is in public,
02:39:57.800 | but it's not gonna be the kind of distance
02:40:00.560 | that's gonna blow my mind.
02:40:02.600 | And I think,
02:40:04.240 | so I think the liability of,
02:40:08.000 | for instance, I think Joe Rogan was very wise
02:40:12.760 | not to have Trump on his podcast.
02:40:14.640 | I think all he would have been doing is,
02:40:17.320 | he would have put himself in a situation
02:40:19.080 | where he couldn't adequately contain
02:40:20.920 | the damage Trump was doing,
02:40:23.000 | and he was just gonna make Trump seem cool
02:40:24.840 | to a whole new,
02:40:26.860 | a potentially new cohort of his massive audience, right?
02:40:30.960 | I mean, they would have had a lot of laughs.
02:40:34.880 | Trump's funny.
02:40:37.920 | The entertainment value of things is so influential.
02:40:42.920 | I mean, there was that one debate where Trump
02:40:47.440 | got a massive laugh on his line,
02:40:51.920 | "Only Rosie O'Donnell," right?
02:40:54.680 | The truth is we're living in a political system
02:40:57.240 | where if you can get a big laugh during a political debate,
02:41:02.000 | you win.
02:41:03.440 | It doesn't matter who you are.
02:41:05.080 | That's the level of,
02:41:07.200 | it doesn't matter how uninformed you are.
02:41:08.560 | It doesn't matter that half the debate was about
02:41:10.840 | what the hell we should do about
02:41:12.440 | the threat of nuclear war or anything else.
02:41:16.080 | We're monkeys, right?
02:41:19.760 | And we like to laugh.
02:41:21.600 | - Well, 'cause you brought up Joe.
02:41:22.920 | He's somebody like you I look up to.
02:41:25.060 | I've learned a lot from him
02:41:28.080 | because I think who he is privately as a human being,
02:41:32.440 | also he's kind of the voice of curiosity to me.
02:41:36.680 | He inspired me with that,
02:41:37.760 | such unending open-minded curiosity,
02:41:40.300 | much like you are the voice of reason.
02:41:43.580 | Joe recently had a podcast
02:41:46.400 | with Jordan Peterson,
02:41:48.440 | and he brought you up, saying they still have hope for you.
02:41:53.440 | - Yeah, I saw that clip, yeah.
02:41:55.560 | - Any chance you talk to Joe again
02:41:57.320 | and reinvigorate your friendship?
02:41:59.620 | - Yeah, well, I reached out to him privately
02:42:03.040 | when I saw that clip.
02:42:04.360 | - Did you use the power of love?
02:42:06.440 | - Joe knows I love him and consider him a friend, right?
02:42:08.680 | So there's no issue there.
02:42:10.220 | He also knows I'll be happy to do his podcast
02:42:14.360 | when we get that together.
02:42:17.680 | So I've got no policy of not talking to Joe
02:42:20.920 | or not doing his podcast.
02:42:22.120 | I mean, I think we got a little sideways
02:42:26.600 | along these same lines where we've talked about
02:42:29.360 | Brett and Elon and other people.
02:42:31.000 | It was never to that degree with Joe
02:42:35.600 | because Joe's in a very different lane, right?
02:42:40.600 | He's, and consciously so.
02:42:42.240 | I mean, Joe is a standup comic who interviews,
02:42:46.480 | who just is interested in everything,
02:42:50.040 | interviews the widest conceivable variety of people
02:42:53.920 | and just lets his interests collide with their expertise
02:42:57.440 | or lack of expertise.
02:42:59.000 | I mean, he's, again, it's a super wide variety of people.
02:43:03.360 | He'll talk about anything
02:43:04.960 | and he can always pull the rip cord saying,
02:43:09.080 | I don't know what the fuck I'm saying.
02:43:10.120 | I'm a comic, I'm stoned.
02:43:11.720 | We just drank too much, right?
02:43:13.160 | Like it's very entertaining.
02:43:15.400 | It's all in, to my eye, it's all in good faith.
02:43:18.520 | I think Joe is an extraordinarily ethical, good person.
02:43:21.520 | - Also doesn't use Twitter.
02:43:22.840 | Doesn't really use Twitter.
02:43:23.960 | - Yeah, yeah.
02:43:25.240 | The crucial difference though is that because he
02:43:27.640 | is an entertainer first.
02:43:33.160 | I mean, I'm not saying he's not smart
02:43:35.080 | and he doesn't understand things.
02:43:36.160 | He, I mean, what's potentially confusing
02:43:38.520 | is he's very smart and he's also very informed.
02:43:41.240 | His full-time job is, you know,
02:43:44.440 | when he's not doing standup
02:43:45.960 | or doing color commentary for the UFC,
02:43:47.920 | his full-time job is talking to lots of very smart people
02:43:53.940 | at great length.
02:43:54.780 | So he's created a, you know,
02:43:56.600 | the Joe Rogan University for himself
02:43:58.160 | and he's gotten a lot of information crammed into his head.
02:44:02.320 | So it's not that he's uninformed,
02:44:04.440 | but he can always, when he feels that he's uninformed
02:44:08.360 | or when it turns out he was wrong about something,
02:44:10.800 | he can always pull the ripcord and say,
02:44:12.920 | I'm just a comic, we were stoned, it was fun.
02:44:16.400 | You know, don't take medical advice from me.
02:44:18.960 | I don't play a doctor on the internet, right?
02:44:21.120 | I can't quite do that, right?
02:44:25.680 | You can't quite do that.
02:44:26.840 | We're in different lanes.
02:44:28.120 | I'm not saying you and I are in exactly the same lane,
02:44:30.280 | but for much of Joe's audience,
02:44:32.400 | I'm just this establishment shill
02:44:34.120 | who's just banging on about, you know,
02:44:35.480 | the universities and medical journals.
02:44:37.320 | It's not true, but that would be the perception.
02:44:41.920 | And as a counterpoint to a lot of what's being said
02:44:44.040 | on Joe's podcast or, you know,
02:44:46.880 | certainly Brett's podcast on these topics,
02:44:49.240 | I can see how they would form that opinion.
02:44:51.760 | But in reality, if you listen to me long enough,
02:44:56.200 | you hear that I've said as much
02:44:59.760 | against the woke nonsense as anyone,
02:45:01.880 | even any lunatic on the right
02:45:03.520 | who can only keep that bright,
02:45:06.320 | that bright, shining object in view, right?
02:45:09.240 | So there's nothing that Candace Owens has said
02:45:11.480 | about wokeness that I haven't said about wokeness
02:45:13.400 | as far, insofar as she's speaking rationally about wokeness.
02:45:16.560 | But we have to be able to keep multiple things in view,
02:45:23.320 | right?
02:45:24.320 | If you could only look at the problem of wokeness
02:45:26.480 | and you couldn't acknowledge the problem of Trump
02:45:28.920 | and Trumpism and QAnon
02:45:30.880 | and the explosion of irrationality
02:45:33.840 | that was happening on the right
02:45:35.120 | and bigotry that was happening on the right,
02:45:37.320 | you were just disregarding half of the landscape.
02:45:42.880 | And many people took half of the problem in recent years.
02:45:47.880 | The last five years is a story of many people
02:45:50.460 | taking half of the problem
02:45:52.080 | and monetizing that half of the problem.
02:45:54.040 | And getting captured by an audience
02:45:58.200 | that only wanted that half of the problem
02:46:00.400 | talked about in that way.
02:46:02.440 | And this is the larger issue of audience capture,
02:46:07.440 | which is very, I'm sure it's an ancient problem,
02:46:12.460 | but it's a very helpful phrase
02:46:14.680 | that I think comes to us courtesy
02:46:15.920 | of our mutual friend, Eric Weinstein.
02:46:17.800 | And audience capture is a thing.
02:46:22.400 | And I believe I've witnessed many casualties of it.
02:46:26.100 | And if there's anything I've been on guard against
02:46:29.040 | in my life, professionally, it's been that.
02:46:31.960 | And when I noticed that I had a lot of people in my audience
02:46:36.600 | who didn't like my criticizing Trump,
02:46:39.020 | I really leaned into it.
02:46:41.600 | And when I noticed that a lot of the other cohort
02:46:43.760 | in my audience didn't like me criticizing the far left
02:46:47.660 | and wokeness, they thought I was exaggerating that problem.
02:46:51.340 | I leaned into it because I thought those parts
02:46:53.800 | of my audience were absolutely wrong.
02:46:56.600 | And I didn't care about whether I was gonna lose
02:46:59.760 | those parts of my audience.
02:47:01.160 | There are people who have created, knowingly or not,
02:47:06.440 | there are people who've created
02:47:07.280 | different incentives for themselves
02:47:09.520 | because of how they've monetized their podcast
02:47:11.720 | and because of the kind of signal
02:47:13.820 | they've responded to in their audience.
02:47:15.760 | And I worry about, Brett would consider this
02:47:20.280 | a totally invidious ad hominem thing to say,
02:47:24.240 | but I really do worry that that's happened to Brett.
02:47:26.900 | I think I cannot explain how you do 100,
02:47:30.880 | with all the things in the universe to be interested in,
02:47:35.660 | and of all the things he's competent
02:47:37.060 | to speak intelligently about,
02:47:38.940 | I don't know how you do 100 podcasts in a row on COVID.
02:47:42.160 | It's just, it makes no sense.
02:47:44.620 | - Do you think in part audience capture can explain that?
02:47:48.200 | - I absolutely think it can, yeah.
02:47:49.660 | - What about, do you, like for example,
02:47:53.140 | do you feel pressure to not admit
02:47:56.020 | that you made a mistake on COVID
02:47:57.500 | or made a mistake on Trump?
02:47:59.660 | I'm not saying you feel that way,
02:48:01.940 | but do you feel this pressure?
02:48:04.240 | So you've attacked audience capture
02:48:06.460 | within the way you do stuff,
02:48:08.920 | so you don't feel as much pressure from the audience,
02:48:11.660 | but within your own ego?
02:48:13.820 | - I mean, again, the people who think I'm wrong
02:48:16.440 | about any of these topics are gonna think,
02:48:18.940 | okay, you're just not admitting that you're wrong,
02:48:23.940 | but now we're having a dispute about specific facts.
02:48:28.460 | There are things that I believed about COVID
02:48:35.140 | or worried might be true about COVID two years ago
02:48:39.300 | that I no longer believe or I'm not so worried about now,
02:48:42.300 | and vice versa.
02:48:44.340 | I mean, things have flipped.
02:48:45.740 | Certain things have flipped upside down.
02:48:48.600 | The question is, was I wrong?
02:48:51.760 | So here's a cartoon version of it,
02:48:54.400 | but this is something I said probably 18 months ago,
02:48:56.840 | and it's still true.
02:48:58.560 | You know, when I saw what Brett was doing on COVID,
02:49:01.080 | let's call it two years ago,
02:49:04.240 | I said, even if he is right,
02:49:09.360 | even if it turns out that ivermectin is a panacea
02:49:14.000 | and the mRNA vaccines kill millions of people,
02:49:16.720 | he's still wrong right now.
02:49:20.360 | His reasoning is still flawed right now.
02:49:23.120 | His facts still suck right now,
02:49:25.520 | and his confidence is unjustified now.
02:49:31.480 | That was true then.
02:49:32.520 | That will always be true then,
02:49:34.120 | and not much has changed for me to revisit
02:49:41.240 | any of my time points along the way.
02:49:43.360 | Again, I will totally concede that if I had teenage boys
02:49:48.360 | and their schools were demanding that they be vaccinated
02:49:52.040 | with the mRNA vaccine,
02:49:53.520 | I would be powerfully annoyed, right?
02:49:58.320 | Like I wouldn't know what I was gonna do,
02:50:00.400 | and I would be doing more research about myocarditis,
02:50:05.400 | and I'd be badgering our doctors,
02:50:07.920 | and I would be worried that we have a medical system
02:50:11.580 | and a pharmaceutical system and a healthcare system
02:50:14.600 | and a public health system that's not incentivized
02:50:17.640 | to look at any of this in a fine-grained way,
02:50:20.120 | and they just want one blanket admonition
02:50:24.220 | to the entire population,
02:50:25.820 | just take the shot, you idiots.
02:50:28.000 | I view that largely as a result, a panicked response
02:50:34.200 | to the misinformation explosion that happened
02:50:36.640 | and the populist resistance animated by misinformation
02:50:40.660 | that just made it impossible to get anyone to cooperate.
02:50:43.840 | So it's just, part of it is, again,
02:50:46.160 | a pendulum swing in the wrong direction,
02:50:48.160 | somewhat analogous to the woke response to Trump
02:50:51.400 | and the Trumpist response to woke, right?
02:50:53.120 | So a lot of people have just gotten pushed around
02:50:56.120 | for bad reasons, but understandable reasons.
02:50:58.840 | But yes, there are caveats to my,
02:51:04.720 | things have changed about my view of COVID,
02:51:07.920 | but the question is, if you roll back the clock 18 months,
02:51:11.900 | was I wrong to want to platform Eric Topol,
02:51:16.580 | a very well-respected cardiologist on this topic,
02:51:24.100 | or Nicholas Christakis to talk about the network effects
02:51:29.100 | of whether we should close schools, right?
02:51:32.600 | He's written a book on COVID,
02:51:34.180 | network effects are his wheelhouse,
02:51:37.660 | both as an MD and as a sociologist.
02:51:39.820 | There was a lot that we believed we knew
02:51:44.500 | about the efficacy of closing schools during pandemics,
02:51:48.160 | right, during the Spanish flu pandemic and others, right?
02:51:53.160 | But there's a lot we didn't know about COVID.
02:51:56.620 | We didn't know how negligible the effects would be on kids
02:52:01.620 | compared to older people.
02:52:04.580 | We didn't know, like the-
02:52:06.380 | - My problem, I really enjoyed your conversation
02:52:08.300 | with Eric Topol, but also didn't.
02:52:10.500 | So he's one of the great communicators in many ways
02:52:14.380 | on Twitter, like distillation of the current data,
02:52:18.420 | but he, I hope I'm not overstating it,
02:52:21.620 | but there is a bit of an arrogance from him
02:52:26.620 | that I think could be explained by him being exhausted
02:52:30.900 | by being constantly attacked by conspiracy theory,
02:52:33.300 | like anti-vaxxers.
02:52:36.060 | To me, the same thing happens with people
02:52:37.700 | that start drifting to being right-wing,
02:52:42.700 | is they get attacked so much by the left,
02:52:45.520 | they become almost irrational and arrogant
02:52:47.460 | in their beliefs.
02:52:48.740 | And I felt your conversation with Eric Topol
02:52:52.100 | did not sufficiently empathize with people
02:52:55.660 | that have skepticism, but also did not sufficiently
02:52:58.660 | communicate uncertainty we have.
02:53:00.820 | So like many of the decisions you made,
02:53:03.900 | many of the things you were talking about,
02:53:06.460 | were kind of saying there's a lot of uncertainty,
02:53:08.260 | but this is the best thing we could do now.
02:53:10.220 | - Well, it was a forced choice.
02:53:11.260 | You're gonna get COVID, do you wanna be vaccinated
02:53:14.020 | when you get it?
02:53:15.140 | That was always, in my view, an easy choice.
02:53:20.020 | And it's up until you start breaking apart the cohorts
02:53:23.620 | and you start saying, okay, wait a minute,
02:53:25.520 | there is this myocarditis issue in young men.
02:53:29.800 | Let's talk about that.
02:53:33.100 | Before that story emerged, it was just clear that
02:53:36.700 | if it's not knocking down transmission
02:53:42.860 | as much as we had hoped,
02:53:44.540 | it is still mitigating severe illness and death.
02:53:47.460 | And I still believe that it is the current view
02:53:54.100 | of most people competent to analyze the data,
02:53:59.980 | that we lost something like 300,000 people
02:54:02.760 | unnecessarily in the US because of vaccine hesitancy.
02:54:07.480 | - But I think there's a way to communicate with humility
02:54:10.500 | about the uncertainty of things
02:54:12.200 | that would increase the vaccination rate.
02:54:14.060 | - I do believe that it is rational and sometimes effective
02:54:19.060 | to signal impatience with certain bad ideas, right?
02:54:25.420 | And certain conspiracy theories
02:54:27.860 | and certain forms of misinformation.
02:54:29.900 | - I think so.
02:54:31.820 | I just think it makes you look like a douchebag most times.
02:54:34.520 | - Well, I mean, certain people are persuadable,
02:54:36.760 | certain people are not persuadable,
02:54:38.360 | but it's, no, 'cause there's not enough,
02:54:42.160 | it's the opportunity cost.
02:54:43.680 | Not everything can be given a patient hearing.
02:54:47.600 | So you can't have a physics conference
02:54:49.980 | and then let people in to just trumpet their pet theories
02:54:53.880 | about the grand unified vision of physics
02:54:56.640 | when they're obviously crazy
02:54:59.800 | or they're obviously half crazy
02:55:01.080 | or they're just not, you know,
02:55:02.640 | the people, like, you begin to get a sense for this
02:55:07.640 | when it is your wheelhouse,
02:55:09.400 | but there are people who kind of declare
02:55:13.320 | their irrelevance to the conversation fairly quickly
02:55:18.320 | without knowing that they have done it, right?
02:55:22.120 | And the truth is, I think I'm one of those people
02:55:27.120 | on the topic of COVID, right?
02:55:30.280 | Like, it's never that I felt,
02:55:33.840 | "Listen, I know exactly what's going on here.
02:55:36.520 | "I know these mRNA vaccines are safe.
02:55:38.980 | "I know exactly how to run a lockdown."
02:55:43.560 | No, this is a situation where you want the actual pilots
02:55:47.880 | to fly the plane, right?
02:55:49.320 | We needed experts who we could trust.
02:55:51.800 | And insofar as our experts got captured
02:55:54.560 | by all manner of thing,
02:55:57.440 | I mean, some of them got captured by Trump,
02:55:59.120 | some of them were made to look ridiculous
02:56:00.660 | just standing next to Trump
02:56:02.520 | while he was bloviating about, you know, whatever,
02:56:05.480 | that it's just gonna go away, there's just 15 people,
02:56:09.060 | you know, there's 15 people in a cruise ship
02:56:10.560 | and it's just gonna go away, there's gonna be no problem.
02:56:12.840 | Or it's like when he said,
02:56:15.280 | "Many of these doctors think I understand this
02:56:16.920 | "better than them.
02:56:17.760 | "They're just amazed at how I understand this."
02:56:19.040 | And you've got doctors, real doctors,
02:56:21.480 | the heads of the CDC and NIH standing around
02:56:25.680 | just ashen faced while he's talking, you know.
02:56:29.760 | All of this was deeply corrupting
02:56:32.760 | of the public communication of science.
02:56:34.800 | And then again, I've banged on
02:56:36.800 | about the depredations of wokeness.
02:56:39.440 | The woke thing was a disaster, right?
02:56:41.760 | Still is a disaster.
02:56:42.900 | But it doesn't mean that,
02:56:47.240 | but the thing is there's a big difference
02:56:49.940 | between me and Brett in this case.
02:56:52.120 | I didn't do 100 podcasts on COVID.
02:56:54.000 | I did like two podcasts on COVID.
02:56:56.360 | The measure of my concern about COVID
02:56:58.280 | can be measured in how many podcasts I did on it, right?
02:57:01.400 | It's like once we had a sense of how to live with COVID,
02:57:05.360 | I was just living with COVID, right?
02:57:07.040 | Like, okay, get vaxxed or don't get vaxxed,
02:57:09.960 | wear a mask or don't wear a mask,
02:57:11.440 | travel or don't travel.
02:57:12.560 | Like you've got a few things to decide,
02:57:13.980 | but my kids were stuck at home on iPads for too long.
02:57:18.820 | I didn't agree with that.
02:57:20.360 | You know, it was obviously not functional.
02:57:22.920 | Like I criticized that on the margins,
02:57:25.600 | but there was not much to do about it.
02:57:27.080 | But the thing I didn't do is make this my life
02:57:30.760 | and just browbeat people with one message or another.
02:57:34.820 | We need a public health regime
02:57:38.360 | where we can trust what the competent people
02:57:40.520 | are saying to us about, you know,
02:57:42.360 | what medicines are safe to take.
02:57:44.600 | And in the absence of that,
02:57:45.960 | craziness is gonna,
02:57:47.960 | even in the presence of that craziness
02:57:49.680 | is gonna proliferate given the tools we've built.
02:57:51.560 | But in the absence of that,
02:57:52.720 | it's gonna proliferate for understandable reasons.
02:57:55.340 | And that's gonna,
02:57:56.180 | it's not gonna be good next time
02:57:59.440 | when something orders of magnitude more dangerous hits us.
02:58:03.760 | And that's, I spent, you know,
02:58:07.080 | insofar as I think about this issue,
02:58:08.700 | I think much more about next time than this time.
02:58:12.320 | - Before this COVID thing,
02:58:15.560 | you and Brett had some good conversations.
02:58:17.860 | I would say we're friends.
02:58:19.300 | What's your, what do you admire most about Brett
02:58:21.620 | outside of all the criticism we've had
02:58:23.940 | about this COVID topic?
02:58:25.320 | - Well, I think Brett is very smart
02:58:29.420 | and he's a very ethical person
02:58:32.220 | who wants good things for the world.
02:58:34.260 | I mean, I have no reason to doubt that.
02:58:36.940 | So the fact that we're on, you know,
02:58:39.100 | we're crosswise on this issue is not,
02:58:42.180 | does not mean that I think he's a bad person.
02:58:45.580 | I mean, the thing that worried me about what he was doing,
02:58:50.300 | and this was true of Joe and this was true of Elon,
02:58:52.260 | this was true of many other people,
02:58:53.940 | is that once you're messaging at scale to a vast audience,
02:58:58.940 | you incur a certain kind of responsibility
02:59:02.260 | not to get people killed.
02:59:04.660 | And I do, I did worry that,
02:59:06.940 | yeah, people were making decisions
02:59:10.380 | on the basis of the information
02:59:12.020 | that was getting shared there.
02:59:12.860 | And that's why I was, I think, fairly circumspect.
02:59:17.620 | I just said, okay, give me the center
02:59:21.980 | of the fairway expert opinion at this time point
02:59:24.660 | and at this time point and at this time point,
02:59:27.740 | and then I'm out, right?
02:59:29.220 | I don't have any more to say about this.
02:59:30.880 | I'm not an expert on COVID.
02:59:32.140 | I'm not an expert on the safety of mRNA vaccines.
02:59:34.860 | If something changes so as to become newsworthy,
02:59:40.300 | then maybe I'll do a podcast.
02:59:41.460 | So I just did a podcast on the lab leak, right?
02:59:45.740 | I was never skeptical of the lab leak hypothesis.
02:59:50.100 | Brett was very early on saying this is a lab leak, right,
02:59:55.100 | at a point where my only position was,
02:59:57.540 | who cares if it's a lab leak, right?
02:59:59.140 | Like, the thing we have to get straight is,
03:00:01.940 | what do we do given the nature of this pandemic?
03:00:04.180 | - But also we should say that you've actually stated
03:00:07.260 | that it is a possibility.
03:00:08.900 | - Oh, yeah.
03:00:09.740 | - Which doesn't quite matter.
03:00:12.300 | - The time to figure that out, now, I've actually,
03:00:14.580 | I have had my podcast guests on this topic
03:00:18.540 | change my view of this because one of the guests,
03:00:22.660 | Alina Chan, made the point that, no,
03:00:25.500 | actually the best time to figure out the origin of this
03:00:28.820 | is immediately, right?
03:00:30.620 | Because you lose touch with the evidence.
03:00:32.460 | I hadn't really been thinking about that.
03:00:33.780 | Like, if you come back after a year,
03:00:35.960 | there are certain facts you might not be able
03:00:39.500 | to get in hand, but I've always felt that
03:00:43.580 | it didn't matter for two reasons.
03:00:46.300 | One is we had the genome of the virus
03:00:49.300 | and we could design, we very quickly design,
03:00:52.540 | immediately designing vaccines against that genome,
03:00:55.220 | and that's what we had to do.
03:00:56.100 | And then we had to figure out how to vaccinate
03:00:58.540 | and to mitigate and to develop treatments and all that.
03:01:03.540 | So the origin story didn't matter.
03:01:06.700 | Generically speaking, either origin story
03:01:11.340 | was politically inflammatory and made the Chinese look bad.
03:01:16.340 | And the Chinese response to this looked bad,
03:01:18.860 | whatever the origin story, right?
03:01:19.980 | They're not cooperating.
03:01:21.900 | They're letting, they're stopping their domestic flights,
03:01:24.620 | but letting their international flights go.
03:01:26.940 | I mean, it's just, they were bad actors
03:01:29.200 | and they should be treated as such
03:01:30.620 | regardless of the origin, right?
03:01:32.340 | And I would argue that the wet market origin
03:01:36.900 | is even more politically invidious
03:01:39.260 | than the lab leak origin.
03:01:40.900 | - Why do you think?
03:01:41.740 | - Because for the lab leak, to my eye,
03:01:43.980 | the lab leak could happen to anyone, right?
03:01:46.860 | We're all running, all these advanced countries
03:01:49.320 | are running these dangerous labs.
03:01:51.700 | That's a practice that we should be worried about
03:01:54.200 | in general.
03:01:56.900 | We know lab leaks are a problem.
03:01:58.700 | There've been multiple lab leaks of even worse things
03:02:01.580 | that haven't gotten out of hand in this way,
03:02:03.620 | but worse pathogens.
03:02:06.300 | We're wise to be worried about this.
03:02:12.140 | And on some level, it could happen to anyone, right?
03:02:15.680 | The wet market makes them look like barbarians
03:02:19.260 | living in another century.
03:02:20.500 | Like you gotta clean up those wet markets.
03:02:22.260 | Like what are you doing putting a bat on top of a pangolin,
03:02:25.540 | on top of a duck?
03:02:26.780 | It's like, get your shit together.
03:02:28.900 | So if anything, the wet market makes them look worse,
03:02:33.020 | in my view.
03:02:33.860 | Now, I'm sure that what they actually did
03:02:36.660 | to conceal a lab leak, if it was a lab leak,
03:02:39.780 | all of that's gonna look odious.
03:02:42.040 | - Do you think we'll ever get to the bottom of that?
03:02:44.500 | I mean, one of the big negative,
03:02:47.300 | I would say failures of Anthony Fauci and so on
03:02:52.500 | is to be transparent and clear,
03:02:54.340 | and just a good communicator
03:02:55.460 | about gain-of-function research,
03:02:56.780 | the dangers of that,
03:02:57.940 | why it's a useful way of research,
03:03:03.580 | but it's also dangerous.
03:03:05.020 | Just being transparent about that,
03:03:07.420 | as opposed to just coming off really shady.
03:03:09.660 | Of course, the conspiracy theorists
03:03:11.260 | and the politicians are not helping,
03:03:14.260 | but this just created a giant mess.
03:03:18.300 | - Yeah, no, I would agree.
03:03:19.140 | So that exchange with Fauci and Rand Paul,
03:03:23.140 | that went viral,
03:03:24.600 | yeah, I would agree that Fauci looked like
03:03:27.820 | he was taking refuge in a very lawyered language,
03:03:32.820 | and not giving a straightforward account
03:03:37.300 | of what we do and why we do it.
03:03:39.180 | So yeah, I think it looked shady, it played shady,
03:03:42.380 | and it probably was shady.
03:03:44.300 | I mean, I don't know how personally entangled
03:03:46.380 | he is with any of this,
03:03:47.260 | but yeah, the gain-of-function research
03:03:51.100 | is something that I think we're wise to be worried about.
03:03:56.000 | And insofar as I judge myself adequate
03:03:59.600 | to have an opinion on this,
03:04:01.200 | I think it should be banned, right?
03:04:04.680 | Like probably a podcast I'll do,
03:04:09.040 | if you or somebody else doesn't do it in the meantime,
03:04:11.740 | I would like a virologist to defend it
03:04:18.360 | against a virologist who would criticize it.
03:04:22.420 | Forget about just the gain-of-function research.
03:04:25.380 | I don't even understand virus hunting at this point.
03:04:28.140 | It's like, I don't know,
03:04:28.980 | I don't even know why you need to go into a cave
03:04:30.980 | to find this next virus that could be circulating
03:04:34.260 | among bats that may jump zoonotically to us.
03:04:37.980 | Why do that when we can sequence in a day
03:04:42.740 | and make vaccines in a weekend?
03:04:45.780 | I mean, like what kind of headstart
03:04:48.100 | do you think you're getting?
03:04:48.940 | - That's a surprising new thing,
03:04:49.900 | how quickly you can develop a vaccine.
03:04:51.460 | - Exactly.
03:04:52.300 | - That's, yeah, that's really interesting.
03:04:55.620 | But the shadiness around the lab leak.
03:04:58.300 | - I think the point I didn't make
03:05:00.360 | about Brett's style of engaging in this issue
03:05:03.060 | is people are using the fact that he was early on lab leak
03:05:06.320 | to suggest that he was right about ivermectin
03:05:09.100 | and about mRNA vaccines and all the rest.
03:05:12.220 | Like, no, none of that connects.
03:05:17.140 | And it was possible to be falsely confident.
03:05:20.700 | Like, you shouldn't have been confident about lab leak.
03:05:22.580 | No one should have been confident about lab leak early,
03:05:25.260 | even if it turns out to be lab leak, right?
03:05:27.500 | It was always plausible.
03:05:28.860 | It was never definite.
03:05:29.920 | It still isn't definite.
03:05:31.420 | Zoonotic is also quite plausible.
03:05:35.220 | It certainly was super plausible then.
03:05:37.220 | Both are politically uncomfortable.
03:05:42.260 | Both at the time were inflammatory to be banging on about
03:05:46.420 | when we were trying to secure some kind of cooperation
03:05:48.700 | from the Chinese, right?
03:05:50.220 | So there's a time for these things.
03:05:51.820 | And it's possible to be right by accident, right?
03:05:55.900 | The style of reasoning matters whether you're right or not.
03:06:06.900 | You know, it's like, because your style of reasoning
03:06:09.120 | is dictating what you're gonna do on the next topic.
03:06:15.020 | - Sure, but this multivariate situation here,
03:06:20.020 | it's really difficult to know what's right on COVID,
03:06:23.500 | given all the uncertainty, all the chaos,
03:06:25.020 | especially when you step outside
03:06:26.980 | the pure biology, virology of it,
03:06:29.540 | and you start to get into policy.
03:06:31.180 | It's really-- - Yeah, it's just trade-offs.
03:06:34.980 | - Like transmissibility of the virus.
03:06:36.860 | Just knowing if 65% of the population gets vaccinated,
03:06:44.020 | what effect would that have?
03:06:45.460 | Just even knowing those things,
03:06:47.920 | just modeling all those things.
03:06:49.520 | Given all the other incentives, I mean, Pfizer,
03:06:54.820 | I don't know what to think. - But you had the CEO
03:06:56.740 | of Pfizer on your podcast.
03:06:57.900 | Did you leave that conversation feeling like
03:07:01.160 | this is a person who is consciously
03:07:03.860 | reaping windfall profits on a dangerous vaccine
03:07:12.980 | and putting everyone at intolerable risk?
03:07:17.700 | Or do you think this person,
03:07:19.060 | did you think this person was making a good faith attempt
03:07:21.820 | to save lives and had no,
03:07:25.280 | no bad, no taint of bad incentives or something?
03:07:32.020 | - The thing I sensed, and I felt in part,
03:07:36.280 | it was a failure on my part,
03:07:40.020 | but I sensed that I was talking to a politician.
03:07:43.180 | So it's not that I was thinking
03:07:45.180 | there was malevolence there or benevolence.
03:07:47.580 | There was-- - He just had a job to do.
03:07:50.660 | - He put on a suit and I was talking to a suit,
03:07:53.340 | not a human being.
03:07:55.140 | Now, he said that his son was a big fan of the podcast,
03:07:58.020 | which is why he wanted to do it.
03:07:59.620 | So I thought I would be talking to a human being.
03:08:01.940 | And I asked challenging questions,
03:08:04.620 | or what I thought were challenging; the internet thinks otherwise.
03:08:06.980 | Every single question in that interview
03:08:09.740 | was a challenging one,
03:08:12.040 | but it wasn't grilling,
03:08:14.860 | which is what people seem to want to do
03:08:17.340 | with pharmaceutical companies.
03:08:18.920 | There's a deep distrust of pharmaceutical companies.
03:08:21.180 | - Well, what's the alternative?
03:08:22.340 | I mean, I totally get that windfall profits
03:08:25.940 | at a time of a public health emergency looks bad.
03:08:29.660 | It is a bad look, right?
03:08:31.540 | But how do we reward and return cash
03:08:36.540 | and return capital to risk takers
03:08:39.820 | who will spend a billion dollars
03:08:42.140 | to design a new drug for a disease
03:08:45.140 | that maybe only harms a single digit percentage
03:08:49.940 | of the population?
03:08:50.780 | It's like, well, what do we want to encourage?
03:08:53.020 | And who do we want to get rich?
03:08:54.660 | I mean, so like the person who cures cancer,
03:08:57.240 | do we want that person to get rich or not?
03:08:59.060 | We want the person who gave us the iPhone to get rich,
03:09:03.340 | but we don't want the person who cures cancer to get rich?
03:09:06.300 | I mean, what are we trying to do?
03:09:07.140 | - I think it's a very gray area.
03:09:09.260 | So what we want is the person who declares
03:09:11.300 | that they have a cure for cancer
03:09:12.980 | to have authenticity and transparency.
03:09:15.080 | I think we're good now as a population smelling bullshit.
03:09:20.820 | And there is something about the Pfizer CEO, for example,
03:09:23.820 | just CEOs of pharmaceutical companies in general,
03:09:26.180 | just because they're so lawyered up,
03:09:29.780 | so much marketing and PR people,
03:09:32.600 | that they are, you just smell bullshit.
03:09:35.180 | You're not talking to a real human.
03:09:37.140 | That it just feels like none of it is transparent
03:09:41.180 | to us as a public.
03:09:42.300 | So like this whole talking point
03:09:45.980 | that Pfizer's only interested in helping people
03:09:48.820 | just doesn't ring true,
03:09:50.700 | even though it very well could be true.
03:09:53.300 | It's the same thing with Bill Gates,
03:09:55.500 | who seems to be at scale helping a huge amount
03:09:58.580 | of people in the world.
03:09:59.820 | And yet there's something about the way
03:10:01.700 | he delivers that message,
03:10:03.380 | where people are like, this seems suspicious.
03:10:07.020 | What's happening underneath this?
03:10:08.700 | There's certain kinds of communication styles
03:10:10.660 | that seem to be more,
03:10:12.860 | serve as better catalysts for conspiracy theories.
03:10:17.260 | And I'm not sure what that is,
03:10:19.300 | because I don't think there's an alternative
03:10:21.540 | for capitalism in delivering drugs that help people.
03:10:26.540 | But also at the same time,
03:10:28.460 | there seems to need to be more transparency.
03:10:30.860 | And plus, like regulation that actually makes sense
03:10:33.080 | versus, it seems like pharmaceutical companies
03:10:37.500 | are susceptible to corruption.
03:10:39.060 | - Yeah, I worry about all that.
03:10:43.060 | But I also do think that most of the people
03:10:46.740 | going into those fields,
03:10:48.060 | and most of the people going into government,
03:10:49.900 | - They wanna do good. - Are doing it for good.
03:10:51.580 | They're non-psychopaths trying to get good things done
03:10:55.060 | and trying to solve hard problems.
03:10:56.860 | And they're not trying to get rich.
03:10:58.620 | I mean, many of the people are, it's like,
03:11:01.440 | there,
03:11:02.280 | bad incentives are something,
03:11:06.120 | again, I've uttered that phrase 30 times on this podcast,
03:11:10.600 | but it's just almost everywhere it explains
03:11:15.520 | normal people creating terrible harm, right?
03:11:19.280 | It's not that there are that many bad people.
03:11:22.040 | And yes, it makes the truly bad people
03:11:27.040 | that much more remarkable and worth paying attention to,
03:11:29.960 | but the bad incentives and the power of bad ideas
03:11:34.960 | do much more harm.
03:11:40.080 | Because I mean, that's what gets good people
03:11:42.740 | running in the wrong direction,
03:11:44.020 | or doing things that are clearly creating
03:11:49.020 | unnecessary suffering.
03:11:51.440 | - You've had, and I hope still have,
03:11:55.220 | a friendship with Elon Musk,
03:11:57.700 | especially over the topic of AI.
03:11:59.900 | You have a lot of interesting ideas that you both share,
03:12:02.060 | concerns that you both share.
03:12:03.540 | Well, let me first ask, what do you admire most about Elon?
03:12:08.520 | - Well, I had a lot of fun with Elon.
03:12:14.460 | I like Elon a lot.
03:12:15.740 | I mean, Elon, I knew as a friend, I like a lot.
03:12:19.020 | And it's not gonna surprise anyone.
03:12:24.020 | I mean, he's done and he's continuing to do amazing things.
03:12:29.740 | And I think he's,
03:12:30.900 | I think if many of his aspirations are realized,
03:12:37.740 | the world will be a much better place.
03:12:39.420 | I think it's just, it's amazing to see what he's built
03:12:42.260 | and what he's attempted to build
03:12:43.940 | and what he may yet build.
03:12:45.760 | - So with Tesla, with SpaceX, with--
03:12:47.580 | - Yeah, I'm a fan of almost all of that.
03:12:51.020 | I mean, there are wrinkles to a lot of that,
03:12:54.100 | or some of that.
03:12:54.940 | - All humans are full of wrinkles.
03:12:58.340 | - There's something very Trumpian about how he's acting
03:13:00.660 | on Twitter, right?
03:13:01.860 | I mean, Twitter, I think Twitter's,
03:13:03.500 | he thinks Twitter's great.
03:13:04.500 | He bought the place because he thinks it's so great.
03:13:06.940 | I think Twitter's driving him crazy, right?
03:13:08.860 | I think he's needlessly complicating his life
03:13:11.500 | and harming his reputation and creating a lot of noise
03:13:15.380 | and harming a lot of other people.
03:13:17.460 | I mean, so like he, the thing that I objected to
03:13:20.060 | with him on Twitter is not that he bought it
03:13:23.260 | and made changes to it.
03:13:24.460 | I mean, that was not,
03:13:26.660 | again, I remain agnostic as to whether or not
03:13:28.900 | he can improve the platform.
03:13:30.380 | It was how he was personally behaving on Twitter,
03:13:33.940 | not just toward me, but toward the world.
03:13:36.620 | I think when you forward an article
03:13:40.420 | about Nancy Pelosi's husband being attacked,
03:13:44.340 | not as he was by some lunatic,
03:13:46.500 | but that it's just some gay tryst gone awry, right?
03:13:51.500 | That's not what it seems.
03:13:52.800 | And you link to a website that previously claimed
03:13:57.800 | that Hillary Clinton was dead
03:14:00.240 | and that a body double was campaigning in her place.
03:14:03.440 | That thing was exploding in Trumpistan
03:14:06.120 | as a conspiracy theory, right?
03:14:07.820 | And it was having its effect.
03:14:08.920 | And it matters that he was signal boosting it
03:14:11.840 | in front of 130 million people.
03:14:13.680 | And so it is with saying that your former employee,
03:14:18.680 | Yoel Roth is a pedophile, right?
03:14:21.200 | I mean, that has real consequences.
03:14:23.480 | It appeared to be complete bullshit.
03:14:25.800 | And now this guy's getting inundated with death threats,
03:14:28.960 | right, and Elon, all of that's totally predictable, right?
03:14:32.880 | And so he's behaving quite recklessly.
03:14:36.160 | And there's a long list of things like that
03:14:38.520 | that he's done on Twitter.
03:14:40.640 | It's not ethical, it's not good for him,
03:14:43.040 | it's not good for the world, it's not serious.
03:14:48.840 | It's a very adolescent relationship
03:14:51.560 | to real problems in our society.
03:14:53.720 | And so my problem with how he's behaved
03:14:57.120 | is that he's purported to touch real issues by turns.
03:15:01.320 | Like, okay, do I give the satellites to Ukraine or not?
03:15:04.280 | Do I minimize their use of them or not?
03:15:08.680 | Should I publicly worry about World War III or not, right?
03:15:12.240 | He's doing this shit on Twitter, right?
03:15:14.640 | And at the same moment, he's doing these other
03:15:19.640 | very impulsive, ill-considered things,
03:15:24.200 | and he's not showing any willingness
03:15:26.080 | to really clean up the mess he makes.
03:15:28.220 | He brings Kanye on, knowing he's an anti-Semite
03:15:32.520 | who's got mental health problems,
03:15:34.240 | and then kicks him off for a swastika,
03:15:36.520 | which I probably wouldn't have kicked him off
03:15:38.520 | for a swastika.
03:15:39.360 | It's like, that's even, like,
03:15:41.080 | can you really kick people off for swastikas?
03:15:42.600 | Is that something that you get banned for?
03:15:45.880 | I mean, are you a free speech absolutist
03:15:47.800 | if you can't let a swastika show up?
03:15:50.400 | I'm not even sure that's an enforceable terms of service.
03:15:53.080 | Right, there are moments to use swastikas
03:15:55.640 | that are not conveying hate and not raising
03:15:58.120 | the risk of violence.
03:15:59.040 | - Clip that.
03:15:59.920 | - Yeah.
03:16:00.760 | But so much of what he's doing, given that he's,
03:16:04.160 | again, scale matters.
03:16:05.680 | He's doing this in front of 130 million people.
03:16:07.600 | That's very different than a million people,
03:16:09.320 | and that's very different than 100,000 people.
03:16:11.680 | And so when I went off the tracks with Elon,
03:16:15.160 | he was doing this about COVID.
03:16:16.840 | And again, this was a situation where I tried
03:16:20.360 | to privately mitigate a friend's behavior,
03:16:25.360 | and it didn't work out very well.
03:16:27.160 | - Did you try to correct him,
03:16:28.960 | sort of highlighting things he might be wrong on?
03:16:32.720 | - Yeah.
03:16:33.560 | - Or did you use the Lex Power love method?
03:16:36.040 | I should write, like, a pamphlet for Sam Harris.
03:16:38.640 | - Well, no, but it was totally coming from a place
03:16:41.240 | of love because I was concerned about his reputation.
03:16:44.120 | I was concerned about what he,
03:16:45.040 | I mean, there was a twofold concern.
03:16:46.920 | I could see what was happening with the tweet.
03:16:49.320 | I mean, he'd had this original tweet that was,
03:16:51.720 | I think it was, "Panic over COVID is dumb,"
03:16:55.640 | or something like that, right?
03:16:56.640 | This is way, this is in March.
03:16:58.880 | This is early March, 2020.
03:17:01.040 | - Oh, super early days.
03:17:02.320 | - Super early.
03:17:03.160 | So when nobody knew anything,
03:17:05.000 | but we knew, we saw what was happening in Italy, right?
03:17:07.440 | It was totally kicking off.
03:17:08.800 | - God, that was a wild time.
03:17:11.160 | That's when the toilet paper.
03:17:12.000 | - It was wild, it was totally wild.
03:17:13.400 | But that became the most influential tweet
03:17:16.720 | on Twitter for that week.
03:17:18.240 | I mean, it had more engagement than any other tweet,
03:17:20.540 | more than any crazy thing Trump was tweeting.
03:17:22.420 | I mean, it was, it went off, again,
03:17:25.760 | it was just a nuclear bomb of information.
03:17:30.520 | And I could see that people were responding to it like,
03:17:35.520 | "Wait a minute, okay, here's this genius technologist
03:17:38.400 | who must have inside information about everything, right?
03:17:41.800 | Surely he knows something that is not on the surface
03:17:44.440 | about this pandemic."
03:17:46.000 | And they're reading, they were reading into it
03:17:48.440 | a lot of information that I knew wasn't there, right?
03:17:52.000 | And at the time, I didn't even,
03:17:54.380 | I didn't think he had any reason to be suggesting that.
03:17:57.080 | I think he was just firing off a tweet, right?
03:17:59.380 | So I reached out to him in private,
03:18:02.520 | and I mean, because it was a private text conversation,
03:18:07.360 | I won't talk about the details,
03:18:09.800 | but I'm just saying, that's a case,
03:18:12.520 | among the many cases of friends who have public platforms
03:18:15.880 | and who did something that I thought was dangerous
03:18:18.520 | and ill-considered, this was a case where I reached out
03:18:21.800 | in private and tried to help, genuinely help,
03:18:26.800 | because it was just, I thought it was harmful
03:18:31.840 | in every sense, because it was being misinterpreted.
03:18:35.440 | And it was like, okay, you can say that panic
03:18:37.360 | over anything is dumb, fine,
03:18:39.080 | but this was not how this was landing.
03:18:41.320 | This was like non-issue conspiracy,
03:18:45.120 | there's gonna be no COVID in the US,
03:18:47.480 | it's gonna peter out, it's just gonna become a cold.
03:18:49.360 | I mean, that's how this was getting received.
03:18:51.960 | Whereas at that moment, it was absolutely obvious
03:18:54.040 | how big a deal this was gonna be,
03:18:56.320 | or that it was, at minimum, going to be a big deal.
03:18:58.680 | - I don't know if it was obvious,
03:18:59.680 | but it was obvious there was a significant probability
03:19:03.440 | that it could be a big deal.
03:19:04.440 | - I remember in March, it was unclear
03:19:06.440 | how big, 'cause there's still stories of it,
03:19:10.200 | it's probably going to, the big concern,
03:19:12.640 | the hospitals might overfill,
03:19:13.720 | but it's gonna die out in two months or something.
03:19:15.760 | - Yeah, we didn't know, but there was no way
03:19:18.640 | we weren't going to have tens of thousands of deaths
03:19:23.580 | at a minimum at that point.
03:19:24.980 | And it was totally rational to be worried
03:19:29.400 | about hundreds of thousands.
03:19:30.600 | And when Nicholas Christakis came on my podcast very early,
03:19:34.800 | he predicted quite confidently
03:19:36.400 | that we would have about a million people dead in the US.
03:19:39.760 | And that didn't seem, it was, I think, appropriately hedged,
03:19:44.760 | but it was still, it was just like, okay,
03:19:47.520 | it's just gonna, you just look at the,
03:19:49.200 | we're just kind of riding this exponential,
03:19:51.320 | and it'd be very surprising not to have
03:19:56.800 | that order of magnitude and not something much, much less.
03:20:00.880 | And so anyway, I mean, again, to close the story on Elon,
03:20:05.880 | I could see how this was being received,
03:20:13.680 | and I tried to get him to walk that back.
03:20:18.160 | And then we had a fairly long and detailed exchange
03:20:24.260 | on this issue, and that, so that intervention didn't work.
03:20:29.260 | And it was not done, I was not an asshole.
03:20:33.260 | I was not, I was just concerned for him, for the world,
03:20:38.260 | and then there are other relationships where I didn't take,
03:20:43.540 | but again, that's an example where taking the time
03:20:49.620 | didn't work, right, privately.
03:20:52.020 | There are other relationships where I thought,
03:20:53.180 | okay, there's just gonna be more trouble than it's worth,
03:20:54.740 | and I just ignored it, and there's a lot of that.
03:20:58.940 | And again, I'm not comfortable with how this is all
03:21:03.300 | netted out, because I don't know if,
03:21:07.580 | frankly, I'm not comfortable with how much time
03:21:10.780 | in this conversation we've spent talking
03:21:12.420 | about these specific people.
03:21:14.000 | Like, what good is it for me to talk about Elon or Bret
03:21:19.000 | or any of these people in public?
03:21:20.140 | - I think there's a lot of good,
03:21:21.260 | because those friendships, listen, as a fan,
03:21:24.380 | these are the conversations that I loved,
03:21:29.380 | love as a fan, and it feels like COVID
03:21:32.500 | has robbed the world of these conversations.
03:21:35.220 | Because you are exchanging back and forth on Twitter,
03:21:38.300 | but that's not what I mean by conversations,
03:21:39.740 | like long-form discussions, like a debate about COVID,
03:21:43.860 | like a normal debate.
03:21:45.100 | - But there's no, there is no,
03:21:47.580 | Elon and I shouldn't be debating COVID.
03:21:49.500 | - You should be.
03:21:50.500 | Here's the thing, with humility,
03:21:52.260 | like basically saying, we don't really know,
03:21:54.100 | like the Rogan method, we're just a bunch of idiots.
03:21:58.260 | Like, one is an engineer, you're a neuroscientist.
03:22:00.900 | It just kind of, okay, here's the evidence,
03:22:04.060 | and be like normal people.
03:22:05.060 | That's what everybody was doing.
03:22:07.060 | The whole world was trying to figure out,
03:22:08.660 | what the hell, what?
03:22:09.860 | - Yeah, but the issue was that at that,
03:22:11.900 | so at the moment I had this collision with Elon,
03:22:15.100 | certain things were not debatable.
03:22:18.140 | It was just, it was absolutely clear where this was going.
03:22:22.500 | It wasn't clear how far it was gonna go,
03:22:24.980 | or how quickly we would mitigate it,
03:22:26.240 | but it was absolutely clear
03:22:28.140 | that it was gonna be an issue, right?
03:22:30.260 | The train had come off the tracks in Italy.
03:22:34.340 | We knew we weren't gonna seal our borders.
03:22:37.280 | There were already people, you know,
03:22:40.700 | who, there are already cases known to many of us personally
03:22:44.140 | in the US at that point.
03:22:48.000 | And he was operating by a very different logic
03:22:52.420 | that I couldn't engage with.
03:22:54.460 | - Sure, but that logic represents a part of the population,
03:22:58.340 | and there's a lot of interesting topics
03:23:00.220 | that have a lot of uncertainty around them,
03:23:02.100 | like the effectiveness of masks, like--
03:23:04.260 | - Yeah, but no, but where things broke down
03:23:06.100 | was not at the point of, oh, there's a lot to talk about,
03:23:09.340 | a lot to debate, this is all very interesting,
03:23:11.380 | and who knows what's what.
03:23:13.060 | It broke down very early at, this is, you know,
03:23:18.000 | there's nothing to talk about here.
03:23:20.400 | Like, it's like either there's a water bottle on the table
03:23:24.080 | or there isn't, right?
03:23:25.240 | - Technically, there's only 1/4 of a water bottle.
03:23:30.640 | So what defines a water bottle?
03:23:33.080 | Is it the water inside the water bottle,
03:23:34.600 | or is it the water bottle?
03:23:35.880 | What I'm giving you is an example of,
03:23:37.280 | it's worth a conversation.
03:23:39.200 | - This is difficult because this is,
03:23:40.600 | we had an exchange in private, and I want to honor,
03:23:44.120 | not exposing the details of it,
03:23:48.460 | but, you know, the details convinced me
03:23:53.380 | that there was not a follow-up conversation on that topic.
03:23:57.500 | - On this topic.
03:23:58.500 | That said, I hope, and I hope to be part
03:24:02.860 | of helping that happen, that the friendship is rekindled
03:24:05.500 | because one of the topics I care a lot about,
03:24:07.900 | artificial intelligence, you've had great public
03:24:11.940 | and private conversations about this topic.
03:24:14.280 | And it seems like-- - Yeah, and Elon
03:24:15.400 | was very formative in my taking that issue seriously.
03:24:19.040 | I mean, he and I went to that initial conference
03:24:22.920 | in Puerto Rico together, and it was only because
03:24:25.960 | he was going and I found out about it through him,
03:24:28.000 | and I just rode his coattails to it, you know,
03:24:31.040 | that I got dropped in that side of the pool
03:24:35.280 | to hear about these concerns at that point.
03:24:38.960 | - It would be interesting to hear
03:24:40.560 | how has your concern evolved
03:24:45.560 | with the coming out of ChatGPT
03:24:48.340 | and these new large language models that are fine-tuned
03:24:52.140 | with reinforcement learning and seemingly to be able
03:24:54.920 | to do some incredible human-like things.
03:24:57.880 | There's two questions.
03:24:58.720 | One, how has your concern in terms of AGI
03:25:01.320 | and superintelligence evolved,
03:25:03.160 | and how impressed are you with ChatGPT
03:25:05.420 | as a student of the human mind and mind in general?
03:25:10.180 | - Well, my concern about AGI is unchanged.
03:25:13.780 | So I did a, I've spoken about it a bunch on my podcast,
03:25:18.180 | but I did a TED Talk in 2016, which was the kind of summary
03:25:23.180 | of what that conference and various conversations I had
03:25:28.180 | after that did to my brain on this topic.
03:25:32.520 | - Basically, that once superintelligence is achieved,
03:25:37.980 | there's a takeoff, it becomes exponentially smarter,
03:25:41.820 | and in a matter of time,
03:25:43.660 | there's just, we're ants and they're gods.
03:25:46.780 | - Well, yeah, unless we find some way
03:25:48.940 | of permanently tethering a superintelligent,
03:25:53.940 | self-improving AI to our value system.
03:26:01.220 | And I don't believe anyone has figured out how to do that
03:26:04.580 | or whether that's even possible in principle.
03:26:06.180 | I mean, I know people like Stuart Russell
03:26:07.540 | who I just had on my podcast are--
03:26:10.900 | - Oh, really?
03:26:12.260 | Have you released it yet?
03:26:13.100 | - I haven't released it yet.
03:26:13.920 | - Oh, great.
03:26:14.760 | - He's been on previous podcasts,
03:26:15.600 | but we just recorded this week.
03:26:17.660 | - 'Cause you haven't done an AI podcast in a while,
03:26:19.540 | so it's great.
03:26:20.380 | - Yeah, yeah.
03:26:21.220 | - He's a good person to talk about alignment with.
03:26:23.180 | - Yeah, so Stuart, I mean, Stuart has been probably
03:26:26.980 | more than anyone my guru on this topic.
03:26:29.460 | I mean, like, just reading his book and doing,
03:26:31.540 | I think I've done two podcasts with him at this point.
03:26:34.420 | - I think it's called "The Control Problem"
03:26:36.060 | or something like that.
03:26:36.900 | - His book is Human Compatible.
03:26:39.180 | - Human Compatible.
03:26:40.020 | - He talks about the control problem.
03:26:42.340 | And yeah, so I just think the idea
03:26:45.420 | that we can define a value function in advance
03:26:49.380 | that permanently tethers a self-improving
03:26:53.540 | super intelligent AI to our values
03:26:57.700 | as we continue to discover them, refine them,
03:27:02.020 | extrapolate them in an open-ended way.
03:27:06.680 | I think that's a tall order.
03:27:08.360 | And I think there are many more ways,
03:27:09.920 | there must be many more ways of designing super intelligence
03:27:13.520 | that is not aligned in that way
03:27:15.560 | and is not ever approximating our values in that way.
03:27:19.360 | So I mean, Stuart's idea to put it in a very simple way
03:27:24.200 | is that he thinks you don't want to specify
03:27:27.000 | the value function up front.
03:27:28.160 | You don't want to imagine you could ever write the code
03:27:32.140 | in such a way as to admit of no loophole.
03:27:35.580 | You want to make the AI uncertain
03:27:40.120 | as to what human values are and perpetually uncertain
03:27:43.680 | and always trying to ameliorate that uncertainty
03:27:47.160 | by hewing more and more closely
03:27:49.240 | to what our professed values are.
03:27:50.800 | So like it's always interested in us saying,
03:27:55.080 | oh no, no, that's not what we want,
03:27:56.360 | that's not what we intend, stop doing that.
03:27:58.280 | Or like, no matter how smart it gets,
03:28:00.320 | all it wants to do
03:28:01.160 | is more perfectly approximate human values.
03:28:04.120 | I think there are a lot of problems with that
03:28:07.340 | at a high level, I'm not a computer scientist,
03:28:09.080 | so I'm sure there are many problems at a low level
03:28:10.920 | that I don't understand or can't understand.
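(A minimal sketch of the idea attributed to Stuart Russell above, purely for illustration: an agent that keeps a probability distribution over candidate human reward functions, defers to the human whenever its uncertainty is high, and otherwise acts on its current best estimate. The topic names, threshold, and update rule are all invented for the example; this is a toy rendering of the general shape of the idea, not Russell's actual formulation.)

```python
import random

# Toy "value uncertainty" agent: it does not know which candidate reward
# function the human actually has, so it maintains a belief over them,
# defers to the human when uncertain, and acts on expected reward otherwise.
CANDIDATE_REWARDS = {
    "maximize_engagement": lambda action: 1.0 if action == "show_more_feed" else 0.0,
    "protect_wellbeing":   lambda action: 1.0 if action == "suggest_break" else 0.0,
}

ACTIONS = ["show_more_feed", "suggest_break"]

# Uniform prior: no idea yet which reward function is the human's.
belief = {name: 1.0 / len(CANDIDATE_REWARDS) for name in CANDIDATE_REWARDS}

def expected_reward(action):
    return sum(p * CANDIDATE_REWARDS[name](action) for name, p in belief.items())

def uncertainty():
    # Crude measure of doubt: distance from full confidence in one hypothesis.
    return 1.0 - max(belief.values())

def step(human_feedback=None):
    """Defer to the human while uncertain, then act on expected reward."""
    if uncertainty() > 0.3:
        preferred = human_feedback or random.choice(ACTIONS)
        for name in belief:
            # Boost hypotheses under which the human's stated preference
            # scores well, then renormalize (a crude Bayesian-style update).
            belief[name] *= 1.0 + CANDIDATE_REWARDS[name](preferred)
        total = sum(belief.values())
        for name in belief:
            belief[name] /= total
    return max(ACTIONS, key=expected_reward)

if __name__ == "__main__":
    print(step(human_feedback="suggest_break"))  # acts on the corrected belief
    print(belief)                                # now favors "protect_wellbeing"
```

The structural point is the one described in the conversation: because the agent can only do well by narrowing its uncertainty about what the human actually wants, the human's correction always remains something it cares about.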
03:28:12.840 | - Like how to force a human into the loop always,
03:28:15.800 | no matter what.
03:28:16.640 | - There's that and like what humans get a vote
03:28:18.960 | and just what do humans value
03:28:22.760 | and what is the difference between what we say we value
03:28:26.600 | and our revealed preferences, which,
03:28:29.300 | if you were a super intelligent AI
03:28:32.780 | that could look at humanity now,
03:28:36.680 | I think you could be forgiven for concluding
03:28:39.320 | that what we value is driving ourselves crazy with Twitter
03:28:43.720 | and living perpetually on the brink of nuclear war
03:28:46.760 | and just watching hot girls in yoga pants
03:28:51.220 | on TikTok again and again and again.
03:28:52.880 | It's like what-- - And you're saying
03:28:53.920 | that is not?
03:28:54.840 | - This is all revealed preference
03:28:57.440 | and it's what is an AI to make of that,
03:29:00.440 | and what should it optimize?
03:29:01.800 | Like so, this is also Stuart's observation
03:29:06.560 | that one of the insidious things
03:29:08.720 | about like the YouTube algorithm
03:29:10.400 | is it's not that it just caters to our preferences,
03:29:15.200 | it actually begins to change us in ways
03:29:18.580 | so as to make us more predictable.
03:29:20.240 | It's like it finds ways to make us a better reporter
03:29:24.040 | of our preferences and to trim our preferences down
03:29:27.940 | so that it can further train to that signal.
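(A toy simulation of that dynamic, with made-up numbers: a recommender that always serves the topic it predicts the user likes most, where each exposure slightly reinforces that habit. The topic names, reinforcement rate, and round count are arbitrary, chosen only to make the narrowing effect visible.)

```python
import random

# Toy model of the feedback loop described above: greedy recommendation
# plus a small habituation effect collapses a roughly uniform set of
# interests onto a single topic, i.e. maximally predictable behavior.
TOPICS = ["news", "yoga", "chess", "cooking", "music"]

def narrowing(rounds=500, reinforcement=0.05, seed=0):
    rng = random.Random(seed)
    # Start with near-uniform interest (tiny noise just to break ties).
    interest = {t: 1.0 + rng.random() * 0.01 for t in TOPICS}
    for _ in range(rounds):
        # The recommender serves whatever it predicts the user likes most...
        shown = max(interest, key=interest.get)
        # ...and each exposure nudges the user's habit toward that topic.
        interest[shown] *= 1.0 + reinforcement
    total = sum(interest.values())
    return {t: round(v / total, 3) for t, v in interest.items()}

if __name__ == "__main__":
    # After a few hundred rounds essentially all of the normalized
    # "interest" sits on one topic.
    print(narrowing())
```

Real systems are vastly more complicated, but the shape is the same: optimizing engagement against a user whose habits are themselves trainable tends to trim the very distribution the system is predicting.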
03:29:32.140 | So the main concern is that most of the people in the field
03:29:37.140 | seem not to be taking intelligence seriously.
03:29:41.740 | Like as they design more and more intelligent machines
03:29:46.420 | and as they profess to want to design true AGI,
03:29:50.720 | they're not, again, they're not spending the time
03:29:56.300 | that Stuart is spending trying to figure out
03:29:58.000 | how to do this safely above all.
03:30:00.260 | They're just assuming that these problems
03:30:04.220 | are gonna solve themselves
03:30:05.260 | as we make that final stride into the end zone,
03:30:08.540 | or they're saying very, you know,
03:30:12.700 | Pollyanna-ish things like, you know,
03:30:15.140 | an AI would never form a motive to harm humans,
03:30:18.660 | like why would it ever form a motive
03:30:20.660 | to be malicious toward humanity, right,
03:30:24.820 | unless we put that motive in there, right?
03:30:26.500 | And that's not the concern.
03:30:28.020 | The concern is that in the presence
03:30:29.900 | of vast disparities in competence,
03:30:34.820 | and certainly in a condition
03:30:38.100 | where the machines are improving themselves,
03:30:40.060 | they're improving their own code,
03:30:41.780 | they could be developing goal,
03:30:44.580 | instrumental goals that are antithetical to our wellbeing
03:30:48.660 | without any intent to harm us, right?
03:30:51.260 | It's analogous to what we do
03:30:53.140 | to every other species on earth.
03:30:55.900 | I mean, you and I don't consciously form the intention
03:31:00.540 | to harm insects on a daily basis,
03:31:03.420 | but there are many things we could intend to do
03:31:05.700 | that would, in fact, harm insects
03:31:09.140 | because, you know, you decide to repave your driveway
03:31:11.700 | or whatever you're doing,
03:31:13.140 | like you're just not taking the interest of insects
03:31:17.020 | into account because they're so far beneath you
03:31:20.260 | in terms of your cognitive horizons.
03:31:23.460 | And so the real challenge here is that
03:31:26.860 | if you believe that intelligence, you know,
03:31:30.780 | scales up on a continuum toward heights
03:31:34.140 | that we can only dimly imagine,
03:31:37.020 | and I think there's every reason to believe that,
03:31:38.700 | there's no reason to believe
03:31:39.860 | that we're near the summit of intelligence.
03:31:42.020 | And you can define, you know, define,
03:31:44.980 | maybe there's some forms of intelligence
03:31:48.740 | for which this is not true,
03:31:49.780 | but for many relevant forms, you know,
03:31:53.540 | like the top 100 things we care about cognitively,
03:31:57.380 | I think there's every reason to believe
03:31:58.820 | that many of those things, most of those things
03:32:01.700 | are a lot like chess or Go,
03:32:03.660 | where once the machines get better than we are,
03:32:06.220 | they're gonna stay better than we are,
03:32:07.500 | although they're, I don't know if you caught
03:32:09.260 | the recent thing with Go,
03:32:10.340 | where this actually came out of Stuart's lab.
03:32:12.500 | - One. - Yeah, yeah.
03:32:13.340 | - Yeah, one time a human beat a machine in Go.
03:32:17.980 | - They found a hack for that.
03:32:18.900 | But anyway, ultimately, there's gonna be no looking back,
03:32:23.820 | and then the question is, what do we do in relationship
03:32:28.820 | to these systems that are more competent
03:32:33.540 | than we are in every relevant respect?
03:32:36.240 | Because it will be a relationship.
03:32:37.820 | It's not like, the people who think
03:32:41.180 | we're just gonna figure this all out,
03:32:43.020 | you know, without thinking about it in advance,
03:32:45.940 | it's just gonna, the solutions
03:32:47.220 | are just gonna find themselves,
03:32:48.860 | seem not to be taking the prospect
03:32:54.380 | of really creating autonomous superintelligence seriously.
03:32:59.380 | Like, what does that mean?
03:33:00.980 | It's every bit as independent and ungovernable, ultimately,
03:33:05.980 | as us having created, I mean, just imagine
03:33:10.260 | if we created a race of people
03:33:12.380 | that were 10 times smarter than all of us.
03:33:14.620 | Like, how would we live with those people?
03:33:16.820 | They're 10 times smarter than us, right?
03:33:18.460 | Like, they begin to talk about things we don't understand.
03:33:21.240 | They begin to want things we don't understand.
03:33:23.420 | They begin to view us as obstacles to them,
03:33:25.860 | so they're solving those problems,
03:33:27.960 | or gratifying those desires.
03:33:29.420 | We become the chickens or the monkeys in their presence.
03:33:33.660 | And I think that it's, but for some amazing solution
03:33:38.660 | of the sort that Stuart is imagining,
03:33:44.020 | that we could somehow anchor their reward function
03:43:46.740 | permanently, no matter how intelligence scales,
03:33:49.980 | I think it's really worth worrying about this.
03:33:54.340 | I do buy the sci-fi notion that this is an existential risk
03:33:59.340 | if we don't do it well.
03:34:03.700 | - I worry that we don't notice it.
03:34:05.940 | I'm deeply impressed with ChatGPT,
03:34:08.220 | and I'm worried that it will become superintelligent,
03:34:13.060 | these language models will become superintelligent
03:34:15.180 | because they're basically trained
03:34:16.300 | in the collective intelligence of the human species,
03:34:19.740 | and then it'll start controlling our behavior
03:34:21.660 | if they're integrated into our algorithms,
03:34:24.180 | the recommender systems, and then we just won't notice
03:34:28.540 | that there's a superintelligent system
03:34:31.900 | that's controlling our behavior.
03:34:34.100 | - Well, I think that's true even before,
03:34:37.340 | far before superintelligence,
03:34:38.700 | even before general intelligence.
03:34:40.100 | I mean, I think just the narrow intelligence
03:34:43.540 | of these algorithms and of what something like,
03:34:48.540 | ChatGPT can do,
03:34:51.220 | I mean, it's just far short of it developing its own goals
03:35:00.980 | and that is, that are at cross purposes with ours,
03:35:06.820 | just the unintended consequences of using it
03:35:11.500 | in the ways we're going to be incentivized to use it
03:35:14.100 | and the money to be made from scaling this thing
03:35:18.500 | and what it does to our information space
03:35:22.140 | and our sense of just being able to get to ground truth
03:35:25.340 | of any facts, it's, yeah, it's super scary,
03:35:30.340 | and it was, it's--
03:35:34.580 | - Do you think it's a giant leap
03:35:35.900 | in terms of the development towards AGI, ChatGPT,
03:35:38.580 | or we still, is this just an impressive little toolbox?
03:35:43.580 | So like, when do you think the singularity's coming?
03:35:48.120 | Or is it T, it doesn't matter, it's eventually?
03:35:51.700 | - I have no intuitions on that front apart from the fact
03:35:54.220 | that if we continue to make progress, it will come, right?
03:35:58.420 | So it's just, you just have to assume
03:36:00.500 | we continue to make progress.
03:36:02.060 | There's only two assumptions.
03:36:03.180 | You have to assume substrate independence.
03:36:06.500 | So there's no reason why this can't be done in silico.
03:36:09.420 | It's just, we can build arbitrarily intelligent machines.
03:36:13.860 | There's nothing magical about having this done
03:36:18.260 | in the wetware of our own brains.
03:36:20.380 | I think that is true, and I think that's, you know,
03:36:22.940 | scientifically parsimonious to think that that's true.
03:36:26.340 | And then you just have to assume
03:36:27.740 | we're going to keep making progress.
03:36:29.380 | It doesn't have to be any special rate of progress.
03:36:31.300 | It doesn't have to be Moore's law.
03:36:33.180 | It can just be, we just keep going.
03:36:34.620 | At a certain point, we're going to be in relationship
03:36:37.340 | to minds, leaving consciousness aside.
03:36:41.900 | I don't have any reason to believe
03:36:44.460 | that they'll necessarily be conscious
03:36:47.380 | by virtue of being super intelligent.
03:36:49.220 | And that's its own interesting ethical question.
03:36:52.660 | But leaving consciousness aside,
03:36:56.660 | they're going to be more competent than we are.
03:37:00.500 | And then that's like, you know, the aliens have landed.
03:37:04.900 | You know, that's literally, that's an encounter with,
03:37:08.220 | again, leaving aside the possibility
03:37:10.620 | that something like Stuart's path
03:37:13.980 | is actually available to us.
03:37:18.380 | But it is hard to picture if what we mean by intelligence,
03:37:23.300 | all things considered, and it's truly general,
03:37:27.500 | if that scales and, you know,
03:37:32.500 | begins to build upon itself,
03:37:36.420 | how you maintain that perfect,
03:37:41.420 | slavish devotion until the end of time in those systems.
03:37:46.660 | - The tether to humans?
03:37:47.540 | - Yeah.
03:37:48.380 | - I think my gut says that that tether is not,
03:37:52.620 | there's a lot of ways to do it.
03:37:54.300 | So it's not this increasingly impossible problem.
03:37:58.080 | - Right, so I have no, you know,
03:38:01.220 | as you know, I'm not a computer scientist,
03:38:02.460 | so I have no intuitions about, just algorithmically,
03:38:05.780 | how you would approach that and what's possible.
03:38:08.540 | - My main intuition is maybe deeply flawed,
03:38:12.020 | but the main intuition is based on the fact
03:38:13.740 | that most of the learning is currently happening
03:38:17.340 | on human knowledge.
03:38:18.640 | So even ChatGPT is just trained on human data.
03:38:22.460 | - Right.
03:38:23.620 | - I don't see where the takeoff happens
03:38:26.300 | where you completely go above human wisdom.
03:38:29.580 | The current impressive aspect of ChatGPT
03:38:31.940 | is that's using collective intelligence of all of us.
03:38:35.740 | - From what I glean, again,
03:38:38.660 | from people who know much more about this than I do,
03:38:41.300 | I think we have reason to be skeptical
03:38:45.060 | that these techniques of deep learning
03:38:49.380 | are actually going to be sufficient to push us into AGI.
03:38:52.980 | Right, so it's just, they're not generalizing
03:38:56.200 | in the way they need to.
03:38:57.340 | They're certainly not learning like human children,
03:39:00.500 | and so they're brittle in strange ways.
03:39:04.760 | It's not to say that the human path is the only path,
03:39:09.980 | you know, and maybe we might learn better lessons
03:39:13.920 | by ignoring the way brains work,
03:39:15.780 | but we know that they don't generalize
03:39:19.980 | and use abstraction the way we do.
03:39:23.820 | And so they have strange holes in their competence.
03:39:28.820 | - But the size of the holes is shrinking every time.
03:39:31.820 | And that's, so the intuition starts to slowly fall apart.
03:39:35.480 | The intuition is like, surely it can't be this simple
03:39:39.860 | to achieve super intelligence.
03:39:41.300 | - Yeah, yeah.
03:39:42.140 | - But it's becoming simpler and simpler.
03:39:44.620 | So I don't know.
03:39:46.180 | The progress is quite incredible.
03:39:47.700 | I've been extremely impressed with ChatGPT
03:39:49.980 | and the new models, and there's a lot of financial incentive
03:39:52.700 | to make progress in this regard.
03:39:54.500 | So it's, we're going to be living
03:39:57.100 | through some very interesting times.
03:39:58.900 | In raising a question that I'm going to be talking to you,
03:40:05.260 | a lot of people brought up this topic,
03:40:07.020 | probably because Eric Weinstein talked to Joe Rogan recently
03:40:09.980 | and said that he and you were contacted by folks
03:40:13.100 | about UFOs.
03:40:15.300 | Can you clarify the nature of this contact?
03:40:18.620 | - Yeah, yeah.
03:40:19.460 | - That you were contacted by?
03:40:20.740 | - I've got very little to say on this.
03:40:22.420 | I mean, he has much more to say.
03:40:23.780 | I think he went down this rabbit hole further than I did,
03:40:28.700 | which wouldn't surprise anyone.
03:40:31.360 | He's got much more of a taste
03:40:33.780 | for this sort of thing than I do.
03:40:35.140 | But I think we were contacted by the same person.
03:40:38.420 | It wasn't clear to me who this person was
03:40:40.660 | or how this person got my cell phone number.
03:40:45.220 | They didn't seem,
03:40:46.360 | it didn't seem like we were getting punked.
03:40:49.740 | I mean, the person seemed credible to me.
03:40:52.300 | - And they were talking to you
03:40:53.140 | about the release of different videos on UFOs.
03:40:55.060 | - Yeah, and this is when there was a flurry
03:40:57.340 | of activity around this.
03:40:58.180 | So there was like, there was a big New Yorker article
03:41:01.180 | on UFOs and there was rumors of congressional hearings,
03:41:06.180 | I think, coming and the videos
03:41:11.800 | that were being debunked or not.
03:41:14.900 | And so this person contacted both of us,
03:41:18.420 | I think around the same time.
03:41:19.500 | And I think he might've contacted Rogan or other,
03:41:22.020 | Eric is just the only person I've spoken to about it,
03:41:24.580 | I think, who I know was contacted.
03:41:28.140 | And what happened is the person kept writing a check
03:41:33.140 | that he didn't cash.
03:41:38.340 | Like he kept saying, okay, next week,
03:41:40.980 | I'm gonna, you know, I understand this is sounding spooky
03:41:43.840 | and you have no reason to really trust me,
03:41:46.100 | but next week, I'm gonna put you on a Zoom call
03:41:49.900 | with people who you will recognize.
03:41:51.520 | And they're gonna be former heads of the CIA
03:41:54.180 | and people who just, you're gonna,
03:41:56.660 | within five seconds of being on the Zoom call,
03:41:58.540 | you'll know this is not a hoax.
03:42:01.380 | And I said, great, just let me know,
03:42:03.100 | send me the Zoom link, right?
03:42:04.620 | And I went, that happened maybe three times.
03:42:08.100 | There was just one phone conversation
03:42:11.520 | and then it was just texts, you know,
03:42:13.700 | there's just a bunch of texts.
03:42:15.380 | And I think Eric spent more time with this person
03:42:20.380 | and I haven't spoken to him about it.
03:42:22.340 | I know he's spoke about it publicly, but.
03:42:24.380 | So I, you know, it's not that my bullshit detector
03:42:29.400 | ever really went off in a big way,
03:42:31.740 | it's just the thing never happened.
03:42:33.540 | And so I lost interest.
03:42:35.520 | - So you made a comment, which is interesting,
03:42:37.980 | that you ran the, which I really appreciate,
03:42:41.020 | you ran a thought experiment of saying,
03:42:43.820 | okay, maybe we do have alien spacecraft,
03:42:47.040 | or just the thought experiment the aliens did visit.
03:42:49.980 | - Yeah. - And then,
03:42:51.620 | this is very kind of nihilistic, sad thought
03:42:54.220 | that it wouldn't matter, it wouldn't affect your life.
03:42:58.140 | Can you explain that?
03:43:00.000 | - Well, no, I was, I think many people noticed this.
03:43:04.120 | I mean, this was a sign of how crazy
03:43:07.020 | the news cycle was at that point, right?
03:43:08.900 | Like we had COVID and we had Trump
03:43:10.580 | and I forget when the UFO thing was really kicking off,
03:43:13.240 | but it just seemed like no one had the bandwidth
03:43:17.900 | to even be interested in this.
03:43:19.100 | It's like, I was amazed to notice in myself
03:43:23.620 | that I wasn't more interested
03:43:26.220 | in figuring out what was going on.
03:43:28.500 | It's like, and I considered, okay, wait a minute.
03:43:31.820 | This is, if this is true,
03:43:35.860 | this is the biggest story in anyone's lifetime.
03:43:38.260 | I mean, contact with alien intelligence is by definition
03:43:42.860 | the biggest story in anyone's lifetime in human history.
03:43:46.240 | Why isn't this just totally captivating?
03:43:52.420 | And not only was it not totally captivating,
03:43:55.300 | it was just barely rising to the level
03:43:57.460 | of my being able to pay attention to it.
03:44:00.260 | And I view that, I mean, one, as a,
03:44:06.380 | to some degree, an understandable defense mechanism
03:44:09.820 | against the bogus claims that have been made
03:44:14.780 | about this kind of thing in the past.
03:44:16.620 | The general sense is probably bullshit
03:44:20.860 | or it probably has some explanation
03:44:22.800 | that is purely terrestrial and not surprising.
03:44:26.020 | And there is somebody who, what's his name?
03:44:29.460 | Is it Mick West?
03:44:30.740 | I forget, is it a YouTuber?
03:44:32.060 | - Yeah, Mick West, yeah.
03:44:33.020 | He debunks stuff.
03:44:34.420 | I mean, I have since seen some of those videos.
03:44:38.340 | I mean, now this is going back still at least a year,
03:44:41.020 | but some of those videos seem like fairly credible
03:44:44.140 | debunkings of some of the optical evidence.
03:44:47.340 | And I'm surprised we haven't seen more of that.
03:44:51.620 | Like there was a fairly credulous 60 Minutes piece
03:44:55.480 | that came out around that time,
03:44:56.620 | looking at some of that video.
03:44:58.020 | And it was the very video that he was debunking on YouTube.
03:45:00.900 | And his video only had like 50,000 views on it or whatever.
03:45:05.360 | But again, it seemed like a fairly credible debunking.
03:45:09.500 | I haven't seen debunkings of his debunkings, but--
03:45:12.740 | - I think there is, but he's basically saying
03:45:14.580 | that there is possible explanations for it.
03:45:18.180 | And usually in these kinds of contexts,
03:45:19.860 | if there's a possible explanation,
03:45:21.300 | even if it seems unlikely,
03:45:23.700 | is going to be more likely
03:45:25.720 | than an alien civilization visiting us.
03:45:28.980 | - Yeah, so the extraordinary claims
03:45:30.420 | require extraordinary evidence principle,
03:45:32.180 | which I think is generally true.
03:45:34.740 | - Well, with aliens, I think generally,
03:45:37.580 | I think there should be some humility
03:45:39.100 | about what they would look like when they show up.
03:45:42.180 | But I tend to think they're already here.
03:45:44.220 | - The amazing thing about this AI conversation though,
03:45:46.140 | is that we're talking about a circumstance
03:45:47.940 | where we would be designing the aliens.
03:45:52.420 | And there's every reason to believe
03:45:55.420 | that eventually this is gonna happen.
03:45:57.320 | Like I said, I'm not at all skeptical
03:45:58.960 | about the coming reality of the aliens,
03:46:02.600 | that we're gonna build them.
03:46:03.640 | - Now, here's the thing.
03:46:04.680 | Does this apply to when super intelligence shows up?
03:46:08.260 | Will this be trending on Twitter for a day?
03:46:11.360 | And then we'll go on to complain
03:46:12.880 | about something Sam Harris once again said
03:46:15.760 | in his podcast the next day?
03:46:17.400 | You tend to trend on Twitter,
03:46:20.520 | even though you're not on Twitter, which is great.
03:46:22.520 | - Yeah, I haven't noticed.
03:46:25.400 | I mean, I did notice when I was on, but.
03:46:28.880 | - You have this concern about AGI basically,
03:46:34.600 | same kind of thing, that we would just look the other way.
03:46:37.280 | Is there something about this time
03:46:38.480 | where even like World War III,
03:46:41.560 | which has been throwing around very casually,
03:46:44.560 | concerningly so, even that, the news cycle wipes that away?
03:46:48.880 | - Yeah, well, I think we have this general problem
03:46:54.880 | that we can't make certain information,
03:46:59.880 | even unequivocally certain information,
03:47:06.840 | emotionally salient.
03:47:11.120 | Like we respond quite readily to certain things.
03:47:14.480 | I mean, as we talked about,
03:47:15.920 | we respond to the little girl who fell down a well.
03:47:20.000 | I mean, that just, that gets 100%
03:47:21.920 | of our emotional resources.
03:47:24.080 | But the abstract probability of nuclear war,
03:47:29.080 | right, even a high probability,
03:47:30.920 | even just, even an intolerable probability,
03:47:32.720 | even if we put it at 30%, right?
03:47:37.160 | You know, like, it's just like,
03:47:38.960 | that's a Russian roulette with a gun with three chambers.
03:47:43.280 | And, you know, it's aimed at the heads,
03:47:45.280 | not only your head, but your kid's head
03:47:46.840 | and everyone's kid's head.
03:47:47.800 | And it's just 24 hours a day.
03:47:50.080 | And I mean, I think people who, this pre-Ukraine,
03:47:55.080 | I think the people who have made it their business
03:47:57.880 | to, you know, professionally to think about
03:48:01.440 | the risk of nuclear war and to mitigate it,
03:48:03.720 | you know, people like Graham Allison or William Perry,
03:48:07.040 | I mean, I think they were putting like the ongoing risk,
03:48:12.680 | I mean, just the risk that we're gonna have
03:48:15.240 | a proper nuclear war at some point in the, you know,
03:48:19.040 | the next generation, people were putting it at,
03:48:23.040 | you know, something like 50%, right?
03:48:25.000 | They were living with this sword of Damocles over our heads.
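(To make a generation-scale figure like that concrete, assuming, purely for illustration, a constant per-year probability p of a nuclear exchange, the cumulative risk compounds as below. The 2% annual figure and 30-year horizon are illustrative placeholders, not anyone's actual estimate.)

$$
P(\text{at least one exchange within } n \text{ years}) = 1 - (1 - p)^{n},
\qquad p = 0.02,\; n = 30 \;\Rightarrow\; 1 - 0.98^{30} \approx 0.45.
$$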
03:48:29.040 | Now, you might wonder whether anyone
03:48:32.720 | could have reliable intuitions about the probability
03:48:34.760 | of that kind of thing, but the status quo is truly alarming.
03:48:39.760 | I mean, we've got, you know, we've got ICBMs on,
03:48:43.640 | I mean, leave aside smaller exchanges and, you know,
03:48:47.000 | tactical nukes and how we could have a world war,
03:48:50.720 | you know, based on, you know, incremental changes.
03:48:54.080 | We've got the biggest bombs aimed at the biggest cities
03:48:59.080 | in both directions and it's old technology, right?
03:49:05.200 | And it's, you know, and it's vulnerable to some lunatic
03:49:10.360 | deciding to launch or misreading, you know, bad data.
03:49:14.840 | And we know we've been saved from nuclear war,
03:49:17.720 | I think at least twice by, you know,
03:49:24.800 | Soviet submarine commanders deciding,
03:49:27.200 | I'm not gonna pass this up the chain of command, right?
03:49:29.880 | It's like, this is almost certainly an error
03:49:34.400 | and it turns out it was an error.
03:49:35.680 | And it's like, and we need people to,
03:49:40.680 | I mean, in that particular case, like he saw,
03:49:42.720 | I think it was five, what seemed like five missiles
03:49:46.520 | launched from the US to Russia.
03:49:48.280 | And he reasoned if America was gonna engage
03:49:52.000 | in a first strike, they'd launch more than five missiles,
03:49:54.600 | right, so this has to be fictional.
03:49:57.480 | And then he waited long enough to decide
03:49:59.480 | that it was fictional, but the probability
03:50:02.040 | of a nuclear war happening by mistake
03:50:06.560 | or some other species of inadvertence, you know,
03:50:10.600 | misunderstanding, technical malfunction,
03:50:15.440 | that's intolerable.
03:50:17.880 | Forget about the intentional use of it
03:50:20.560 | by people who are, you know, driven crazy by some ideology.
03:50:25.560 | - And more and more technologies are enabled
03:50:29.720 | to kind of scale destruction.
03:50:31.520 | - And misinformation plays into this picture
03:50:35.120 | in a way that is especially scary.
03:50:37.680 | I mean, once you can get a deep fake of, you know,
03:50:42.320 | any current president of the United States
03:50:44.680 | claiming to have launched a first strike, you know,
03:50:47.240 | and just, you know, send that everywhere.
03:50:51.400 | - But that could change the nature of truth
03:50:52.960 | and then we, that might change the engine
03:50:57.120 | we have for skepticism, sharpen it.
03:51:00.420 | The more you have deep fakes.
03:51:02.120 | - Yeah, and we might have AI and digital watermarks
03:51:05.120 | that help us, maybe we'll not trust any information
03:51:09.360 | that hasn't come through specific channels, right?
03:51:13.920 | I mean, so in my world, it's like,
03:51:18.520 | I no longer feel the need to respond to anything
03:51:24.480 | other than what I put out in my channels of information.
03:51:28.520 | It's like, there's so many people
03:51:30.640 | who have clipped stuff of me that shows the opposite
03:51:34.480 | of what I was actually saying in context.
03:51:36.120 | I mean, the people have like re-edited my podcast audio
03:51:38.560 | to make it seem like I said the opposite
03:51:40.640 | of what I was saying.
03:51:41.800 | It's like, unless I put it out, you know,
03:51:44.500 | you can't be sure that I actually said it, you know?
03:51:47.000 | I mean, it's just, but I don't know what it's like
03:51:52.000 | to live like that for all forms of information.
03:51:56.080 | And I mean, strangely, I think it may require
03:52:00.400 | a greater siloing of information in the end.
03:52:05.400 | You know, it's like, we're living through
03:52:09.800 | this sort of Wild West period where everyone's got
03:52:12.240 | a newsletter and everyone's got a blog
03:52:13.640 | and everyone's got an opinion.
03:52:15.440 | But once you can fake everything--
03:52:17.720 | - There might be a greater value for expertise.
03:52:19.800 | - Yeah. - For experts,
03:52:20.800 | but a more rigorous system for identifying
03:52:24.000 | who the experts are.
03:52:24.840 | - Yeah, or just knowing that, you know,
03:52:26.680 | it's gonna be an arms race to authenticate information.
03:52:31.440 | So it's like, if you can never trust a photograph
03:52:35.720 | unless it has been vetted by Getty Images,
03:52:39.800 | because only Getty Images has the resources
03:52:42.400 | to authenticate the provenance of that photograph
03:52:46.000 | and attest that it hasn't been meddled with by AI.
03:52:49.420 | And again, I don't even know if that's technically possible.
03:52:52.840 | And maybe whatever the tools available for this
03:52:56.680 | will be commodified and the cost will be driven to zero
03:53:01.680 | so quickly that everyone will be able to do it.
03:53:03.720 | You know, it could be like encryption, but--
03:53:05.840 | - And it would be proven and tested most effectively first,
03:53:09.340 | of course, as always in porn.
03:53:11.680 | - Yeah, right. - Which is where
03:53:12.600 | most of human innovation technology happens first.
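(The kind of authentication being gestured at here can be sketched with ordinary public-key signatures. The example below uses the Python `cryptography` package, assuming it is installed; it is a generic illustration of signing and verifying the bytes of a file, not a description of any provenance or watermarking system actually in use.)

```python
# Generic sketch of provenance-by-signature: a publisher signs the exact
# bytes of a photo (or audio clip) with a private key; anyone holding the
# matching public key can later check that those bytes are untouched.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The publisher generates a keypair once and distributes the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

original = b"...raw bytes of the published photo..."
signature = private_key.sign(original)

def is_authentic(data: bytes, sig: bytes) -> bool:
    """True only if `data` is byte-for-byte what the publisher signed."""
    try:
        public_key.verify(sig, data)
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    print(is_authentic(original, signature))               # True
    print(is_authentic(original + b" edited", signature))  # False
```

Real provenance schemes add key distribution, edit histories, and more, but the basic primitive is this: if the bytes change by even one bit, verification fails.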
03:53:15.380 | Well, I have to ask because Ron Howard,
03:53:19.080 | the director, asked this on Twitter.
03:53:21.280 | Since we're talking about the threat of nuclear war
03:53:23.840 | and otherwise, he asked,
03:53:25.480 | "I'd be interested in both your expectations
03:53:27.480 | for human society if/when we move beyond Mars.
03:53:30.700 | Will those societies be industrial-based?
03:53:34.300 | How will it be governed?
03:53:35.560 | How will criminal infractions be dealt with?
03:53:39.200 | When you read or watch sci-fi,
03:53:41.080 | what comes closest to sounding logical?"
03:53:43.440 | Do you think about our society beyond Earth?
03:53:46.640 | If we colonize Mars, if we colonize space?
03:53:48.640 | - Yeah, well, I think I have a pretty--
03:53:52.120 | - Uh-oh. - Humbling picture of that.
03:53:54.480 | 'Cause we're still gonna be the apes that we are.
03:53:57.200 | So when you imagine colonizing Mars,
03:54:01.000 | you have to imagine a first fist fight on Mars.
03:54:04.340 | You have to imagine a first murder on Mars.
03:54:06.480 | - Also infidelity. - Yeah.
03:54:07.920 | - Somebody-- - Extramarital affairs on Mars.
03:54:10.460 | Right, so it's gonna get really homely and boring
03:54:15.200 | really fast, I think.
03:54:16.680 | It's like only the spacesuits or the other exigencies
03:54:21.240 | of just living in that atmosphere or lack thereof
03:54:26.400 | will limit how badly we can behave on Mars.
03:54:30.280 | - But do you think most of the interaction
03:54:31.800 | will be still in meat space versus digital?
03:54:34.520 | Do you think there'll be, do you think we're like living
03:54:36.580 | through a transformation of a kind
03:54:40.200 | where we're going to be doing more and more interaction
03:54:42.680 | in digital space?
03:54:44.700 | Like everything we've been complaining about Twitter,
03:54:47.320 | is it possible that Twitter's just the early days
03:54:49.660 | of a broken system that's actually giving birth
03:54:52.940 | to a better working system that's ultimately digital?
03:54:55.620 | - I think we're gonna experience a pendulum swing
03:55:02.060 | back into the real world.
03:55:04.820 | I mean, I think many of us are experiencing that now anyway.
03:55:07.340 | I mean, just wanting to have face-to-face encounters
03:55:12.340 | and spend less time on our phones
03:55:14.460 | and less time online.
03:55:15.380 | I mean, I think maybe everyone isn't going
03:55:18.380 | in that direction, but I do notice it myself.
03:55:23.060 | And I notice, I mean, once I got off Twitter,
03:55:26.260 | then I noticed the people who were never on Twitter, right?
03:55:28.940 | And the people who were never, basically, I mean, I know,
03:55:32.140 | I have a lot of friends who were never on Twitter.
03:55:34.900 | And they actually never understood
03:55:36.500 | what I was doing on Twitter.
03:55:37.580 | It's like, they just like, it wasn't that they were seeing it
03:55:40.540 | and then reacting to it.
03:55:42.580 | They just didn't know, it's like,
03:55:44.500 | it's like being on, it's like I'm not on Reddit either,
03:55:48.700 | but I don't spend any time thinking
03:55:49.900 | about not being on Reddit, right?
03:55:51.380 | It's like, I'm just not on Reddit.
03:55:53.080 | - So you think the pursuit of human happiness
03:55:55.980 | is better achieved, more effectively achieved
03:55:58.220 | outside of Twitter world?
03:56:00.860 | - Well, I think all we have is our attention in the end.
03:56:06.100 | And we just have to notice what these various tools
03:56:09.780 | are doing to it.
03:56:11.220 | And it's just, it became very clear to me
03:56:15.020 | that it was an unrewarding use of my attention.
03:56:19.420 | Now, it's not to say there isn't some digital platform
03:56:22.540 | that's conceivable that would be useful and rewarding,
03:56:27.300 | but yeah, I mean, we just have,
03:56:32.300 | you know, our life is doled out to us in moments.
03:56:35.700 | And we have, and we're continually solving this riddle
03:56:39.540 | of what is gonna suffice to make this moment engaging
03:56:44.540 | and meaningful and aligned with who I wanna be now
03:56:50.420 | and how I want the future to look, right?
03:56:53.580 | We're all, I mean, we have this tension
03:56:55.300 | between being in the present and becoming in the future.
03:57:00.300 | And, you know, it's a seeming paradox.
03:57:04.340 | Again, it's not really a paradox, but it can seem like,
03:57:07.180 | I do think the ground truth for personal wellbeing
03:57:12.020 | is to find a mode of being where you can pay attention
03:57:16.580 | to the present moment.
03:57:18.900 | And this is, you know, meditation by another name.
03:57:21.860 | You can pay attention to the present moment
03:57:24.240 | with sufficient gravity that you recognize
03:57:29.100 | that just consciousness itself in the present moment,
03:57:32.740 | no matter what's happening,
03:57:34.460 | is already a circumstance of freedom
03:57:37.540 | and contentment and tranquility.
03:57:40.940 | Like you can be happy now before anything happens,
03:57:44.940 | before this next desire gets gratified,
03:57:47.180 | before this next problem gets solved.
03:57:49.220 | There's this kind of ground truth that you're free,
03:57:52.340 | that consciousness is free and open and unencumbered
03:57:56.020 | by really any problem until you get lost in thought
03:57:59.980 | about all the problems that may yet be real for you.
03:58:03.740 | So the ability to catch and observe consciousness,
03:58:07.400 | that in itself is a source of happiness.
03:58:09.500 | - Yeah, without being lost in thought.
03:58:11.060 | And so this happens haphazardly for people
03:58:15.540 | who don't meditate because they find something
03:58:17.380 | in their life that's so captivating,
03:58:19.820 | it's so pleasurable, it's so thrilling.
03:58:22.540 | It can even be scary, but it can be,
03:58:25.460 | even being scared is captivating.
03:58:26.980 | Like it gets their attention, right, whatever it is.
03:58:30.300 | If you like, Sebastian Junger wrote a great book
03:58:35.100 | about people's experience in war here.
03:58:37.900 | It's like strangely it can be the best experience
03:58:41.260 | anyone's ever had because everything,
03:58:43.100 | it's like only the moment matters, right?
03:58:44.900 | Like the bullet is whizzing by your head.
03:58:47.900 | You're not thinking about your 401k
03:58:51.220 | or that thing that you didn't say last week
03:58:53.580 | to the person you shouldn't have been talking about.
03:58:55.440 | You're not thinking about Twitter.
03:58:56.500 | It's like you're just fully immersed
03:58:59.260 | in the present moment.
03:59:00.360 | Meditation is the only way, I mean,
03:59:05.820 | that word can mean many things to many people,
03:59:07.620 | but what I mean by meditation is simply the discovery
03:59:10.260 | that there is a way to engage the present moment directly
03:59:15.260 | regardless of what's happening.
03:59:18.980 | You don't need to be in a war.
03:59:20.020 | You don't need to be having sex.
03:59:21.260 | You don't need to be on drugs.
03:59:22.800 | You don't need to be surfing.
03:59:24.060 | Nothing, it doesn't have to be a peak experience.
03:59:26.940 | It can be completely ordinary,
03:59:28.620 | but you can recognize that in some basic sense,
03:59:31.540 | there's only this and everything else
03:59:35.940 | is something you're thinking.
03:59:37.580 | You're thinking about the past.
03:59:38.860 | You're thinking about the future
03:59:40.300 | and thoughts themselves have no substance, right?
03:59:43.700 | It's fundamentally mysterious that any thought
03:59:47.240 | ever really commandeers your sense of who you are
03:59:50.700 | and makes you anxious or afraid or angry or whatever it is.
03:59:55.160 | And the more you discover that,
03:59:58.400 | the half-life of all these negative emotions
04:00:00.420 | that blow all of us around get much, much shorter, right?
04:00:04.100 | And you can literally just,
04:00:06.160 | the anger that would have kept you angry for hours or days
04:00:12.100 | lasts four seconds because you just,
04:00:15.420 | the moment it arises, you recognize it
04:00:17.660 | and you can get off that.
04:00:18.500 | You can decide, at minimum, you can decide
04:00:20.460 | whether it's useful to stay angry at that moment.
04:00:23.660 | And obviously it usually isn't.
04:00:25.700 | - And the illusion of free will is one of those thoughts.
04:00:28.360 | - Yeah, it's all just happening, right?
04:00:30.500 | Like even the mindful and meditative response to this
04:00:34.020 | is just happening.
04:00:35.460 | It's just like even the moments where you recognize
04:00:39.140 | or not recognize is just happening.
04:00:40.540 | It's not that,
04:00:41.380 | this does open up a degree of freedom for a person,
04:00:45.780 | but it's not a freedom that gives any motivation
04:00:48.340 | to the notion of free will.
04:00:49.420 | It's just a new way of being in the world.
04:00:53.300 | - Is there a difference between intellectually knowing
04:00:55.980 | free will is an illusion and really experiencing it?
04:00:59.820 | What's the longest you've been able to experience
04:01:03.220 | the escape, the illusion of free will?
04:01:06.380 | - Well, it's always obvious to me when I pay attention.
04:01:10.420 | Whenever I'm mindful, the term of jargon in the Buddhist
04:01:17.260 | and increasingly outside the Buddhist context
04:01:19.980 | is mindfulness, right?
04:01:21.220 | But there are sort of different levels of mindfulness
04:01:23.940 | and there's different degrees of insight into this.
04:01:28.260 | But yes, what I'm calling evidence of lack of free will
04:01:32.460 | and lack of the self, I got two sides of the same coin.
04:01:36.820 | There's a sense of being a subject
04:01:40.000 | in the middle of experience to whom all experience refers,
04:01:44.140 | the sense of I, the sense of me.
04:01:45.820 | And that's almost everybody's starting point
04:01:49.500 | when they start to meditate.
04:01:51.100 | And that's almost always the place people live
04:01:54.140 | most of their lives from.
04:01:55.920 | I do think that gets interrupted
04:01:58.780 | in ways that go unrecognized.
04:01:58.780 | I think people are constantly losing the sense of I,
04:02:01.620 | they're losing the sense of subject, object, distance,
04:02:04.540 | but they're not recognizing it.
04:02:05.780 | And meditation is the mode in which you can recognize,
04:02:10.780 | you can both consciously precipitate it,
04:02:14.060 | you can look for the self and fail to find it
04:02:16.340 | and then recognize its absence.
04:02:19.500 | And that's just the flip side of the coin of free will.
04:02:23.260 | I mean, the feeling of having free will
04:02:26.540 | is what it feels like to feel like a self
04:02:30.220 | who's thinking his thoughts and doing his actions
04:02:33.140 | and intending his intentions.
04:02:34.600 | And the man in the middle of the boat who's rowing,
04:02:38.640 | that's the false starting point.
04:02:43.500 | When you find that there's no one
04:02:44.900 | in the middle of the boat, right?
04:02:46.140 | Or in fact, there's no boat, there's just the river,
04:02:48.580 | there's just the flow of experience
04:02:51.060 | and there's no center to it.
04:02:53.420 | And there's no place from which you would control it.
04:02:55.980 | Again, even when you're doing things,
04:02:58.220 | this does not negate the difference
04:02:59.620 | between voluntary and involuntary behavior.
04:03:01.620 | It's like, I can voluntarily reach for this,
04:03:05.020 | but when I'm paying attention,
04:03:07.340 | I'm aware that everything is just happening.
04:03:10.420 | Like just the intention to move is just arising, right?
04:03:14.260 | And I'm in no position to know why it didn't arise
04:03:18.300 | a moment before or a moment later
04:03:20.780 | or 50% stronger or weaker,
04:03:24.980 | so as to be ineffective or to be doubly effective,
04:03:28.860 | where I lurch for it versus moving slowly.
04:03:31.420 | I mean, I can never run the counterfactuals.
04:03:36.420 | I mean, all of this opens the door
04:03:40.140 | to an even more disconcerting picture along the same lines
04:03:45.120 | which subsumes this conversation about free will.
04:03:48.580 | And it's the question of whether
04:03:50.820 | anything is ever possible.
04:03:56.780 | Like, what if? This is a question
04:03:58.860 | I haven't thought a lot about,
04:04:01.060 | but it's been a few years
04:04:03.020 | that I've been kicking this question around.
04:04:04.980 | So I mean, what if only the actual is possible?
04:04:12.400 | What if there was, what if,
04:04:14.740 | so we live with this feeling of possibility.
04:04:16.620 | We live with the sense that,
04:04:18.160 | let me take, so I have two daughters.
04:04:25.500 | I could have had a third child, right?
04:04:28.060 | So what does it mean to say
04:04:29.680 | that I could have had a third child?
04:04:31.220 | Or is it, you don't have kids, I don't think?
04:04:33.780 | So-- - Not that I know of.
04:04:35.140 | - Yes, so-- - So the possibility
04:04:36.460 | might be there. - Right.
04:04:37.360 | So what do we mean when we say
04:04:40.380 | you could have had a child
04:04:43.320 | or you might have a child in the future?
04:04:48.320 | What is the space in reality?
04:04:51.280 | What's the relationship between possibility
04:04:53.400 | and actuality and reality?
04:04:55.920 | Is there a reality in which
04:04:57.720 | non-actual things are nonetheless real?
04:05:03.840 | And so we have other categories of non-concrete things.
04:05:09.120 | We have things that don't have spatial temporal dimension,
04:05:12.560 | but they're nonetheless, they nonetheless exist.
04:05:15.060 | So like, you know, the integers, right?
04:05:18.280 | So numbers.
04:05:19.140 | There's a reality, there's an abstract reality to numbers.
04:05:25.280 | And this is, it's philosophically interesting
04:05:26.860 | to think about these things.
04:05:27.700 | So they're not like, in some sense,
04:05:30.040 | they're real and they're not merely invented by us.
04:05:35.040 | They're discovered because they have structure
04:05:38.440 | that we can't impose upon them, right?
04:05:39.920 | It's not like, they're not fictional characters like,
04:05:42.720 | you know, I mean, Hamlet and Superman
04:05:45.320 | also exist in some sense,
04:05:46.960 | but they exist at a level of our own fiction
04:05:50.520 | and abstraction, but it's like,
04:05:53.040 | there are true and false statements
04:05:54.940 | you can make about Hamlet.
04:05:56.560 | There are true and false statements
04:05:57.580 | you can make about Superman
04:05:59.280 | because our fiction, the fictional worlds we've created
04:06:02.760 | have a certain kind of structure.
04:06:03.880 | But again, this is all abstract.
04:06:05.960 | It's all abstractable from any of its concrete
04:06:08.920 | instantiations, it's not just in the comic books
04:06:11.280 | and just in the movies.
04:06:12.760 | It's in our, you know, ongoing ideas about these characters.
04:06:17.760 | But natural numbers or the integers
04:06:21.500 | don't function quite that way.
04:06:24.640 | I mean, they're similar, but they also have a structure
04:06:26.760 | that's purely a matter of discovery.
04:06:28.600 | It's not, you can't just make up whether numbers are prime.
04:06:32.700 | You know, if you give me two integers, you know,
04:06:34.880 | of a certain size, you mentioned two enormous integers.
04:06:39.880 | If I were to say, okay, well, between those two integers,
04:06:43.160 | there are exactly 11 prime numbers, right?
04:06:46.460 | That's a very specific claim
04:06:48.240 | about which I can be right or wrong,
04:06:49.840 | and whether or not anyone knows I'm right or wrong.
04:06:51.840 | It's like, that's just, there's a domain of facts there,
04:06:54.320 | but these are abstract, it's an abstract reality
04:06:57.080 | that relates in some way that's philosophically interesting,
04:07:00.200 | you know, metaphysically interesting
04:07:01.480 | to what we call real reality.
04:07:04.080 | You know, the spatial temporal order, the physics of things.
04:07:09.080 | But possibility, at least in my view,
04:07:13.320 | occupies a different space.
04:07:15.320 | And this is something, again,
04:07:17.120 | my thoughts on this are pretty inchoate.
04:07:19.240 | I think I need to talk to a philosopher of physics
04:07:23.440 | and/or a physicist about how this may interact
04:07:26.000 | with things like the many worlds interpretation
04:07:28.120 | of quantum mechanics. - Yeah, that's an interesting,
04:07:29.600 | right, exactly.
04:07:30.760 | So I wonder if discoveries in physics,
04:07:35.080 | like further proof or more concrete proof
04:07:37.320 | that many worlds interpretation of quantum mechanics
04:07:39.440 | has some validity,
04:07:41.640 | if that completely starts to change things.
04:07:43.920 | - But even, that's just more actuality.
04:07:47.800 | So if I took that seriously-- - Ah, sure.
04:07:50.600 | - That's a case of more actuality. And the truth is, that happens even if
04:07:55.600 | the many worlds interpretation isn't true,
04:07:58.040 | but we just imagine we have a physically infinite universe,
04:08:02.680 | the implication of infinity is such that
04:08:05.440 | things will begin to repeat themselves
04:08:08.000 | the farther you go in space, right?
04:08:09.400 | So if you just head out in one direction,
04:08:12.960 | eventually you're gonna meet two people just like us
04:08:15.120 | having a conversation just like this,
04:08:17.560 | and you're gonna meet them an infinite number of times
04:08:19.800 | in every infinite variety of permutations
04:08:23.520 | slightly different from this conversation, right?
04:08:25.440 | So, I mean, infinity is just so big
04:08:28.100 | that our intuitions of probability completely break down.
04:08:31.200 | But what I'm suggesting is,
04:08:32.560 | maybe probability isn't a thing, right?
04:08:36.400 | Maybe there's only actuality.
04:08:39.200 | If there's, maybe there's only what happens,
04:08:41.960 | and at every point along the way,
04:08:44.880 | our notion of what could have happened
04:08:46.600 | or what might have happened is just that,
04:08:48.840 | it's just a thought about what could have happened
04:08:51.680 | or might have happened. - So there's no,
04:08:52.680 | so it's a fundamentally different thing.
04:08:54.240 | If you can imagine a thing, that doesn't make it real.
04:08:57.800 | 'Cause that's where that possibility exists,
04:09:00.840 | is in your imagination, right?
04:09:03.080 | - Yeah, and possibility itself is a kind of spooky idea
04:09:06.760 | because it too has a sort of structure, right?
04:09:10.520 | So like if I'm gonna say,
04:09:12.520 | you know, you could have had a daughter, right, last year.
04:09:21.620 | So we're saying that's possible, but not actual, right?
04:09:26.620 | That is a claim, there are things that are true
04:09:35.560 | and not true about that daughter, right?
04:09:38.400 | Like it has a kind of structure, it's like.
04:09:40.800 | - I feel like there's a lot of fog around that,
04:09:44.560 | the possibility.
04:09:45.760 | It feels like almost like a useful narrative.
04:09:48.280 | - But what does it mean?
04:09:49.120 | So like, what does it mean if we say, you know,
04:09:54.120 | I just did that, but it's conceivable
04:09:56.600 | that I wouldn't have done that, right?
04:09:57.880 | Like it's possible that I just threw this cap,
04:10:00.560 | but I might not have done that.
04:10:02.800 | - So you're taking it very temporally close
04:10:05.440 | to the original, like what would appear as a decision.
04:10:08.720 | - Whenever we're saying something's possible,
04:10:11.040 | but not actual, right?
04:10:12.560 | Like this thing just happened, but it's conceivable,
04:10:15.880 | it's possible that it wouldn't have happened
04:10:17.640 | or that it would have happened differently.
04:10:19.600 | In what does that possibility consist?
04:10:24.640 | Like where is that?
04:10:26.480 | For that to be real, for the possibility to be real,
04:10:30.120 | what claim are we making about the universe?
04:10:35.440 | - Well, isn't that an extension of the idea
04:10:37.880 | that free will is an illusion,
04:10:39.160 | that all we have is actuality,
04:10:40.800 | that the possibility is an illusion?
04:10:41.960 | - Right, yeah, I'm just extending it beyond human action.
04:10:47.160 | This goes to the physics of things, this is just everything.
04:10:49.360 | Like we're always telling ourselves a story
04:10:52.520 | that includes possibility.
04:10:54.320 | - Possibility is really compelling for some reason.
04:10:56.920 | - Well, yeah, because it's, I mean, so this, yeah,
04:11:01.600 | I mean, this could sound just academic,
04:11:03.760 | but every backward-looking regret or disappointment
04:11:08.760 | and every forward-looking worry
04:11:11.680 | is completely dependent on this notion of possibility.
04:11:17.040 | Like every regret is based on the sense
04:11:18.880 | that something else, I could have done something else,
04:11:21.200 | something else could have happened.
04:11:22.840 | Every disposition to worry about the future
04:11:26.600 | is based on the feeling
04:11:29.000 | that there's this range of possibilities.
04:11:30.840 | It could go either way.
04:11:32.600 | And, you know, I mean,
04:11:36.680 | whether or not there's such a thing as possibility,
04:11:38.560 | you know, I'm convinced that worry
04:11:39.920 | is almost never psychologically appropriate
04:11:44.420 | because the reality is that in any given moment,
04:11:47.400 | either you can do something to solve the problem
04:11:49.280 | you're worried about or not.
04:11:50.480 | So if you can do something, just do it.
04:11:52.560 | You know, and if you can't,
04:11:53.720 | your worrying is just causing you to suffer twice over,
04:11:56.440 | right, you're gonna, you know,
04:11:58.080 | you're gonna get the medical procedure next week anyway.
04:12:01.320 | How much time between now and next week
04:12:03.280 | do you wanna spend worrying about it, right?
04:12:04.880 | It's gonna, the worry doesn't accomplish anything.
04:12:08.080 | - How much do physicists think about possibility?
04:12:10.880 | - Well, they think about it in terms of probability
04:12:13.800 | more often, but probability just describes,
04:12:16.320 | and again, this is a place where I might be out of my depth
04:12:20.740 | and need to talk to somebody to debunk this, but the-
04:12:25.740 | - Do therapy with a physicist.
04:12:27.520 | - Yeah, but probability, it seems, just describes
04:12:30.380 | a pattern of actuality that we've observed, right?
04:12:33.760 | I mean, we have, there are certain things we observe
04:12:36.380 | and those are the actual things that have happened.
04:12:38.920 | And we have this additional story about probability.
04:12:42.960 | I mean, we have the frequency with which things happen,
04:12:45.480 | have happened in the past.
04:12:46.780 | You know, I can flip a fair coin and know,
04:12:52.520 | I know in the abstract that I have a belief that
04:12:54.760 | in the limit those flips,
04:12:57.920 | those tosses should converge on 50% heads and 50% tails.
04:13:01.920 | I know I have a story as to why
04:13:03.980 | it's not gonna be exactly 50%
04:13:05.840 | within any arbitrary timeframe.
04:13:12.560 | But in reality, all we ever have are the observed tosses.
04:13:16.500 | Right, and then we have an additional story that,
04:13:19.400 | oh, it came up heads, but it could have come up tails.
04:13:22.240 | Why do we think that, about that last toss?
04:13:27.660 | And what are we claiming is true
04:13:33.040 | about the physics of things
04:13:34.560 | if we say it could have been otherwise?
04:13:39.320 | - I think we're claiming that probability is true.
04:13:42.820 | That it just, it allows us to have a nice model
04:13:47.060 | about the world, gives us hope about the world.
04:13:49.420 | - Yeah, it seems that possibility
04:13:51.760 | has to be somewhere to be effective.
04:13:54.940 | It's a little bit like what's happening with the laws of nature.
04:13:58.180 | There's something metaphysically interesting
04:13:59.940 | about the laws of nature too, because the laws of nature
04:14:02.820 | impose their work on the world,
04:14:05.980 | right, we see their evidence.
04:14:08.720 | But they're not reducible
04:14:10.940 | to any specific set of instances, right?
04:14:14.060 | So there's some structure there,
04:14:16.220 | but the structure isn't just a matter of the actual things.
04:14:21.220 | We have the actual billiard balls
04:14:22.720 | that are banging into each other.
04:14:24.820 | All of that actuality can be explained
04:14:26.620 | by what actual things are actually doing.
04:14:29.300 | But then we have this notion that in addition to that,
04:14:32.120 | we have the laws of nature that are explaining this actuality,
04:14:35.940 | but how are the laws of nature an additional thing
04:14:39.140 | in addition to just the actual things
04:14:40.460 | that are actually causally effective?
04:14:42.020 | And if they are an additional thing,
04:14:45.180 | how are they effective if they're not among
04:14:48.360 | the actual things that are just actually banging around?
04:14:50.700 | - Yeah.
04:14:51.580 | - And so to some degree--
04:14:52.980 | - For that, possibility has to be hiding somewhere
04:14:56.100 | for the laws of nature to be--
04:14:57.260 | - To be possible.
04:14:58.100 | For anything to be possible, it has to be,
04:15:01.180 | it has to have--
04:15:02.020 | - It's a closet somewhere, I'm sure,
04:15:03.260 | is where all the possibility goes.
04:15:05.300 | - It has to be attached to something.
04:15:08.460 | - You don't think many worlds is that?
04:15:10.620 | 'Cause many worlds still exist.
04:15:13.780 | - 'Cause we're in this strand of that multiverse.
04:15:17.100 | - Yeah.
04:15:17.920 | - Right, so it's still,
04:15:18.940 | still you have just a local instance of what is actual.
04:15:22.400 | And then if it proliferates elsewhere
04:15:24.000 | where you can't be affected by it,
04:15:26.260 | there's more actuality. - Many worlds,
04:15:27.460 | you can't really connect with the other.
04:15:29.740 | - Yeah. - Yeah.
04:15:31.020 | - So many worlds is just a statement that
04:15:34.340 | basically everything that can happen, happens somewhere.
04:15:37.340 | - Yeah. - And that's,
04:15:40.460 | I mean, maybe that's not an entirely
04:15:42.660 | kosher formulation of it, but it seems pretty close.
04:15:45.260 | But there's whatever happens, right?
04:15:49.340 | In fact, relativistically,
04:15:51.860 | you know, Einstein's original notion
04:15:56.020 | of a block universe seems to suggest this.
04:15:58.900 | And it's been a while since I've been in a conversation
04:16:01.020 | with a physicist where I've gotten a chance to ask
04:16:02.680 | about the standing of this concept in physics currently.
04:16:04.980 | I don't hear it discussed much,
04:16:06.300 | but the idea of a block universe is that, you know,
04:16:09.540 | space-time exists as a totality.
04:16:13.620 | And our sense that we are traveling through space-time
04:16:17.940 | where there's a real difference
04:16:20.420 | between the past and the future,
04:16:22.360 | that that's an illusion of just, you know,
04:16:24.860 | the weird slice we're taking
04:16:28.940 | of this larger object.
04:16:30.660 | But on some level, it's like, you know,
04:16:34.440 | you're reading a novel, the last page of the novel exists
04:16:37.360 | just as much as the first page
04:16:38.820 | when you're in the middle of it.
04:16:40.880 | And, you know, if we're living
04:16:43.000 | in anything like that,
04:16:44.520 | then there's no such thing as possibility.
04:16:48.160 | It would seem there's just what is actual.
04:16:52.360 | So as a matter of our experience, moment to moment,
04:16:56.520 | I think it's totally compatible with that being true,
04:17:00.180 | that there is only what is actual.
04:17:03.280 | And that sounds to the naive ear,
04:17:07.960 | that sounds like it would be depressing
04:17:09.520 | and disempowering and confining, but it's anything but.
04:17:12.880 | It's actually a circumstance of pure discovery.
04:17:17.440 | Like you have no idea what's gonna happen next, right?
04:17:21.160 | You don't know who you're gonna be tomorrow.
04:17:23.560 | You're only by tendency seeming to resemble yourself
04:17:27.400 | from yesterday.
04:17:28.240 | There's way more freedom in all of that
04:17:30.180 | than seems true to many people.
04:17:34.180 | And yet the basic insight is that
04:17:39.180 | the real freedom is the recognition
04:17:44.120 | that you're not in control of anything.
04:17:46.380 | Everything is just happening,
04:17:47.820 | including your thoughts and intentions and moods.
04:17:50.580 | - So life is a process of continuous discovery.
04:17:54.100 | - You're part of the universe.
04:17:55.300 | Yeah, you are just this, I mean,
04:17:58.480 | it's the miracle that the universe is illuminated
04:18:03.000 | to itself as itself where you sit.
04:18:06.200 | And you're continually discovering what your life is.
04:18:11.200 | And then you have this layer
04:18:13.520 | at which you're telling yourself a story
04:18:15.860 | that you already know what your life is.
04:18:17.740 | And you know exactly who you should be
04:18:20.000 | and what's about to happen,
04:18:22.400 | or you're struggling to form a confident opinion
04:18:25.280 | about all of that.
04:18:26.120 | And yet there is this fundamental mystery to everything,
04:18:30.040 | even the most familiar experience.
04:18:33.160 | - We're all NPCs in a most marvelous video game.
04:18:37.920 | - Maybe, although my sense of gaming
04:18:41.600 | does not run as deep as to know what I'm committing to
04:18:44.020 | as a non-playing character.
04:18:46.040 | - You're more, yeah, oh wow.
04:18:48.380 | Yes, you're more of a Mario Kart guy.
04:18:51.420 | - I went back, I was an original video gamer,
04:18:53.620 | but it's been a long time since I,
04:18:55.520 | I mean, I was there for Pong.
04:18:58.040 | I remember when I saw the first Pong in a restaurant
04:19:00.840 | in, I think it was like Benihana's or something,
04:19:04.540 | they had a Pong table.
04:19:06.240 | And that was just an amazing moment when you--
04:19:08.800 | - You, Sam Harris, might live from Pong
04:19:11.800 | to the invention and deployment
04:19:13.640 | of a super intelligent system.
04:19:16.080 | - Yeah, well, that happened fast
04:19:17.480 | if it happens any time in my lifetime.
04:19:19.520 | - From Pong to AGI.
04:19:22.480 | What kind of things do you do purely for fun
04:19:24.620 | that others might consider a waste of time?
04:19:26.780 | - Purely for fun?
04:19:30.060 | - 'Cause meditation doesn't count,
04:19:32.340 | 'cause most people would say that's not a waste of time.
04:19:34.980 | Is there something like Pong
04:19:36.660 | that's a deeply embarrassing thing you would never admit?
04:19:40.980 | - I don't think, well, I mean, once or twice a year
04:19:45.680 | I will play a round of golf,
04:19:47.300 | which many people would find embarrassing.
04:19:49.900 | They might even find my play embarrassing.
04:19:51.500 | But it's fun.
04:19:52.740 | - Do you find it embarrassing?
04:19:53.900 | - No, I mean, I love, golf just takes way too much time,
04:19:57.000 | so I can only squander a certain amount of time on it.
04:19:59.060 | I do love it, it's a lot of fun.
04:20:01.420 | - But you have no control over your actual performance.
04:20:03.900 | You're ever discovering--
04:20:05.940 | - I do have control over my mediocre performance,
04:20:09.620 | but I don't have enough control as to make it really good.
04:20:13.560 | But happily, I'm in the perfect spot
04:20:16.140 | because I don't invest enough time in it
04:20:17.820 | to care how I play, so I just have fun when I play.
04:20:20.460 | - Well, I hope there'll be a day
04:20:21.900 | where you play a round of golf
04:20:23.380 | with the former president, Donald Trump,
04:20:26.060 | and I would love to be--
04:20:27.460 | - I would bet on him if we played golf.
04:20:29.240 | I'm sure he's a better golfer.
04:20:30.740 | - Amidst the chaos of human civilization in modern times,
04:20:35.180 | as we've talked about,
04:20:36.060 | what gives you hope about this world in the coming year,
04:20:41.060 | in the coming decade, in the coming hundred years,
04:20:44.020 | maybe a thousand years?
04:20:45.180 | What's the source of hope for you?
04:20:48.480 | - Well, it comes back to a few of the things
04:20:52.360 | we've talked about.
04:20:53.560 | I think I'm hopeful.
04:20:57.480 | I know that most people are good
04:20:59.320 | and are mostly converging on the same core values, right?
04:21:04.320 | It's like we're not surrounded by psychopaths.
04:21:09.200 | And the thing that finally convinced me
04:21:14.320 | to get off Twitter was how different life was seeming
04:21:18.720 | through the lens of Twitter.
04:21:19.800 | It's like I just got the sense
04:21:21.000 | that there are way more psychopaths,
04:21:23.060 | or effective psychopaths, than I realized.
04:21:25.880 | And then I thought, okay, this isn't real.
04:21:28.680 | This is either a strange context
04:21:31.000 | in which actually decent people are behaving like psychopaths
04:21:35.080 | or it's a bot army or something
04:21:38.840 | that I don't have to take seriously.
04:21:40.440 | So yeah, I just think most people,
04:21:44.240 | if we can get the incentives right,
04:21:48.360 | I think there's no reason
04:21:52.200 | why we can't really thrive collectively.
04:21:55.880 | Like there's enough wealth to go around.
04:21:58.240 | There's enough, you know, there's no effective limit,
04:22:03.240 | you know, I mean, again,
04:22:04.240 | within the limits of what's physically possible,
04:22:06.800 | but we're nowhere near the limit on abundance.
04:22:11.400 | You know, on this, forget about going to Mars,
04:22:13.160 | on this one rock, right?
04:22:16.040 | It's like we could make this place incredibly beautiful
04:22:21.040 | and stable if we just did enough work
04:22:26.400 | to solve some, you know,
04:22:31.920 | rather longstanding political problems.
04:22:36.640 | - The problem of incentives.
04:22:37.960 | So to you, the basic characteristics of human nature
04:22:41.440 | are such that we'll be okay if the incentives are okay.
04:22:44.880 | We'll do pretty good.
04:22:47.600 | - I'm worried about the asymmetries
04:22:50.120 | that it's easier to break things than to fix them.
04:22:52.040 | It's easier to light a fire than to put it out.
04:22:56.920 | And I do worry that, you know,
04:23:01.280 | as technology gets more and more powerful,
04:23:03.600 | it becomes easier for the minority
04:23:06.760 | who wants to screw things up
04:23:08.520 | to effectively screw things up for everybody, right?
04:23:11.080 | So it's easier, it's like a thousand years ago,
04:23:14.160 | it was simply impossible for one person
04:23:17.000 | to derange the lives of millions, much less billions.
04:23:22.000 | Now that's getting to be possible.
04:23:26.120 | So on the assumption that we're always gonna have
04:23:28.160 | a sufficient number of crazy individuals
04:23:31.400 | or malevolent individuals,
04:23:36.400 | we have to figure out that asymmetry somehow.
04:23:41.880 | And so there's some cautious exploration
04:23:45.240 | of emergent technology that we need to get
04:23:48.800 | our head screwed on straight about.
04:23:51.920 | And so like, so gain-of-function research,
04:23:53.880 | like just how much do we wanna democratize,
04:23:56.320 | you know, all the relevant technologies there?
04:24:00.000 | You know, do you really wanna
04:24:01.800 | give everyone the ability to order nucleotides in the mail
04:24:06.360 | and give them the blueprints for viruses online
04:24:09.800 | because, you know, you're a free speech absolutist
04:24:11.920 | and you think all PDFs need to be, you know,
04:24:15.880 | exportable everywhere?
04:24:17.200 | So I'm much more, so this is where, yeah,
04:24:21.600 | so there are limits to,
04:24:23.160 | many people are confused about my take on free speech
04:24:26.760 | because I've come down on the unpopular side
04:24:30.680 | of some of these questions. But
04:24:34.240 | my overriding concern is that in many cases,
04:24:36.360 | I'm worried about the free speech of the individual
04:24:40.080 | businesses or individual platforms or individual,
04:24:42.560 | you know, media people to decide that they don't wanna
04:24:45.880 | be associated with certain things, right?
04:24:47.600 | So like, if you own Twitter,
04:24:49.400 | I think you should be able to kick off the Nazi
04:24:51.880 | you don't wanna be associated with
04:24:53.400 | because it's your platform, you own it, right?
04:24:55.200 | That's your free speech, right?
04:24:58.040 | That's the side of my free speech concern for Twitter,
04:25:01.000 | right?
04:25:01.840 | It's not that every Nazi has the right
04:25:04.720 | to algorithmic speech on Twitter.
04:25:06.840 | I think if you own Twitter,
04:25:08.840 | you know, whether it's just Elon or, you know,
04:25:10.800 | in the world where it wasn't Elon,
04:25:12.200 | just the people who own Twitter,
04:25:14.240 | and the board and the shareholders and the employees,
04:25:17.960 | these people should be free to decide
04:25:21.120 | what they wanna promote or not.
04:25:23.000 | I view them as publishers more, you know,
04:25:25.240 | more than as platforms in the end,
04:25:27.840 | and that has other implications.
04:25:31.600 | But I do worry about this problem of misinformation
04:25:35.320 | and algorithmically and otherwise, you know,
04:25:38.920 | supercharged misinformation.
04:25:41.360 | And I think, I do think we have,
04:25:46.360 | we're at a bottleneck now.
04:25:48.000 | I mean, I guess it
04:25:49.560 | could be the hubris of every present generation
04:25:51.640 | to think that their moment is especially important,
04:25:55.160 | but I do think with the emergence of these technologies,
04:25:59.120 | we're at some kind of bottleneck
04:26:01.200 | where we really have to figure out how to get this right.
04:26:04.400 | And if we do get this right,
04:26:05.800 | if we figure out how to not drive ourselves crazy
04:26:08.880 | by giving people access to all possible information,
04:26:12.400 | misinformation at all times,
04:26:14.000 | I think, yeah, we could,
04:26:16.720 | there's no limit to how happily we could collaborate
04:26:21.160 | with billions of creative, fulfilled people.
04:26:25.600 | You know, it's just--
04:26:26.680 | - And trillions of robots, some of them sex robots,
04:26:30.640 | but that's another topic.
04:26:31.800 | - Robots that are running the right algorithm,
04:26:34.280 | whatever that algorithm is.
04:26:35.880 | - Whatever you need in your life to make you happy.
04:26:38.240 | Sam, the first time we talked is one of the huge honors
04:26:43.160 | of my life, I've been a fan of yours for a long time.
04:26:45.720 | The few times you were respectful but critical of me
04:26:49.360 | mean the world, and thank you so much for helping me
04:26:53.680 | and caring enough about the world
04:26:56.040 | and for everything you do.
04:26:57.720 | But I should say that the few of us
04:27:00.640 | that try to put love in the world on Twitter
04:27:02.440 | miss you on Twitter, but--
04:27:04.160 | - Well, enjoy yourselves.
04:27:05.840 | - Don't break anything, kids.
04:27:09.080 | - Have a good party without me.
04:27:10.840 | - Thanks so much.
04:27:11.680 | - But very happy to do this, thanks for the invitation.
04:27:14.080 | - Thank you. - Great to see you again.
04:27:16.040 | - Thanks for listening to this conversation with Sam Harris.
04:27:18.800 | To support this podcast, please check out our sponsors
04:27:21.240 | in the description.
04:27:22.560 | And now, let me leave you with some words
04:27:24.680 | from Martin Luther King Jr.
04:27:27.080 | "Love is the only force capable of transforming an enemy
04:27:31.360 | into a friend."
04:27:32.440 | Thank you for listening, I hope to see you next time.
04:27:36.600 | (upbeat music)