Sam Harris: Trump, Pandemic, Twitter, Elon, Bret, IDW, Kanye, AI & UFOs | Lex Fridman Podcast #365
Chapters
0:00 Introduction
3:38 Empathy and reason
11:30 Donald Trump
54:24 Military industrial complex
58:58 Twitter
83:05 COVID
126:48 Kanye West
143:24 Platforming
161:21 Joe Rogan
178:13 Bret Weinstein
191:51 Elon Musk
203:59 Artificial Intelligence
220:01 UFOs
233:16 Free will
260:31 Hope for the future
The following is a conversation with Sam Harris, 00:00:29.840 |
besides our mutual fascination with AGI and free will, 00:00:43.300 |
and several key figures at the center of public discourse, 00:00:59.480 |
and in fact believe is probably a figment of my imagination. 00:01:07.020 |
about this personal aspect of the conversation, 00:01:18.720 |
is that many people I look to for wisdom in public discourse 00:01:26.700 |
when the world needed those kinds of conversations the most. 00:01:34.780 |
They start noticing the humanity that connects them 00:01:37.420 |
that is much deeper than the disagreements that divide them. 00:01:45.780 |
why I look up to and I'm inspired by Joe, Elon, and Sam. 00:02:02.540 |
always pushing for more kindness in the world. 00:02:11.420 |
and human being who takes on the hardest problems 00:02:18.240 |
that made the solutions to these problems seem impossible. 00:02:34.980 |
and defending them with rigor and resilience. 00:02:38.940 |
I both celebrate and criticize all three privately, 00:02:45.780 |
for which I always learn a lot and always appreciate. 00:02:49.540 |
there is respect and love for each other as human beings. 00:03:07.960 |
Sometimes people criticize me for being naive, 00:03:21.020 |
This world is too fucking beautiful not to try. 00:03:31.620 |
please check out our sponsors in the description. 00:03:40.180 |
at making a net positive impact on the world, 00:03:57.900 |
It's just understanding another person's point of view. 00:04:47.300 |
Now, I think both of those capacities are very important, 00:05:00.580 |
this is something I have more or less learned 00:05:04.120 |
the psychologist who wrote a book on this topic 00:05:13.300 |
is a bad guide rather often for ethical behavior 00:05:21.860 |
- And I'll give you the clear example of this, 00:05:49.920 |
you see the parents distraught on television, 00:05:53.600 |
you hear her cries from the bottom of the well, 00:05:57.360 |
I mean, there was an example of this 20, 25 years ago, 00:06:00.520 |
I think, where it was just wall to wall on CNN, 00:06:05.120 |
It was 72 hours or whatever it was of continuous coverage 00:06:17.460 |
I mean, it's just, it marshals 100% of our compassion 00:06:21.540 |
Whereas if you hear that there's a genocide raging 00:06:35.320 |
we'll change the channel in the face of a genocide. 00:06:47.320 |
So it's, you know, many of us have come to believe 00:06:55.040 |
And so empathy plays an unhelpful role there. 00:07:00.880 |
when we're making big decisions about what we should do 00:07:17.440 |
- Well, there's a lot of dangers to go on there. 00:07:20.400 |
you've recently talked about effective altruism 00:07:25.080 |
I think you mentioned some interesting statement, 00:07:36.760 |
where you care about maybe your daughter and son 00:07:40.000 |
more than 100 people that live across the world, 00:07:44.120 |
Like where the calculus is not always perfect, 00:07:46.840 |
but somehow it makes sense to live in a world 00:07:51.840 |
and yet empathetic in the way you've been discussing. 00:07:54.560 |
- Right, I'm not sure what the right answer is there, 00:08:07.200 |
that's articulated by someone like the Dalai Lama, 00:08:15.560 |
would say that the ultimate enlightened ethic 00:08:19.040 |
is true dispassion with respect to friends and strangers. 00:08:23.200 |
The mind of the Buddha would be truly dispassionate, 00:08:28.160 |
you would love and care about all people equally. 00:08:31.680 |
And by that light, it seems some kind of ethical failing, 00:08:41.320 |
compassion in the limit, or enlightened wisdom in the limit, 00:08:44.600 |
to care more, or even much more, about your kids 00:08:55.360 |
So you spend all this time trying to figure out 00:08:58.660 |
and you'll attend to their minutest concerns, 00:09:15.080 |
if everyone was running the Dalai Lama program there. 00:09:18.640 |
I think some prioritization of one's nearest and dearest 00:09:30.000 |
and we'll all be doing that in a circumstance 00:09:54.280 |
and I'm worried about her more than I'm worried 00:10:01.400 |
I don't want a hospital that treats my daughter 00:10:12.920 |
You don't want starkly corrupt, unfair situations. 00:10:30.100 |
and they change for the worse, understandably. 00:10:34.960 |
and you're in a state of collective emergency, 00:11:03.160 |
we want fairness, we're all better off for it, 00:11:19.800 |
that protects the interests of strangers, too. 00:11:37.860 |
Just a good story will get us to not think clearly. 00:11:49.500 |
like in this country, you know, red and blue, 00:11:55.480 |
on immigration or on the response to the pandemic 00:12:10.440 |
trying to understand where they're coming from, 00:12:12.340 |
not just in the explicit statement of their idea, 00:12:19.120 |
that kind of empathy while you're discussing ideas. 00:12:40.180 |
It's not, truth is not decided by democratic principles, 00:12:52.160 |
but those reasons are nonetheless bad reasons, right? 00:12:56.520 |
they're not reasons anyone should adopt for themselves 00:13:07.000 |
and it's something you can care about, right? 00:13:17.160 |
is I've been super critical of Trump, obviously, 00:13:20.720 |
and I've been super critical of certain people 00:13:28.880 |
when he really made it patently obvious who he was, 00:13:33.440 |
you know, if there had been any doubt initially. 00:13:36.640 |
There was no doubt when we have a sitting president 00:13:38.560 |
who's not agreeing to a peaceful transfer of power, right? 00:13:49.280 |
and yet the fact that many millions of Americans 00:13:57.280 |
or bought into the, didn't see through his con, right? 00:14:08.680 |
and so, you know, his heart is in the right place. 00:14:22.360 |
largely because they had seen him on television 00:14:31.980 |
It's understandable to me that many very frustrated people 00:14:37.040 |
who have not had their hopes and dreams actualized, 00:14:49.640 |
it's understandable that they would be confused 00:14:59.480 |
a grossly incompetent, morbidly narcissistic person 00:15:06.420 |
So I don't, so which is to say that I don't blame, 00:15:12.040 |
who I don't necessarily blame for the Trump phenomenon, 00:15:20.400 |
very bad state of affairs in our society, right? 00:15:25.840 |
I mean, one is I think you have to call a spade a spade 00:15:28.520 |
when you're talking about how things actually work 00:15:39.680 |
And yeah, I mean, I think empathy and, you know, 00:15:43.200 |
probably the better word for what I would hope to embody 00:15:48.360 |
Like really, you know, to really wish people well, 00:15:58.080 |
I mean, to realize that there is no opposition between, 00:16:05.540 |
because wise selfishness really takes into account 00:16:13.480 |
do you want to live in a society where you have everything, 00:16:18.680 |
Or do you want to live in a society where you're surrounded 00:16:25.360 |
who are having their hopes and dreams realized? 00:16:27.720 |
I think it's obvious that the second society is much better, 00:16:35.040 |
- But what about having empathy for certain principles 00:16:40.920 |
that people believe, for example, the pushback, 00:16:44.760 |
the other perspective on this, 'cause you said, 00:16:47.040 |
bought the myth of Trump as the great businessman. 00:16:50.040 |
There could be a lot of people that are supporters of Trump 00:16:53.040 |
who could say that Sam Harris bought the myth 00:16:57.020 |
that we have this government of the people, by the people, 00:17:05.200 |
who are running a giant bureaucracy that is corrupt, 00:17:09.560 |
and they're actually not representing the people. 00:17:16.600 |
Yeah, he's flawed in all these ways. 00:17:29.440 |
shake up the elites that are so uncomfortable, 00:17:34.080 |
about the game they got running on everybody else. 00:17:38.440 |
- That's the kind of perspective that they would take 00:17:40.760 |
and say, yeah, yeah, there's these flaws that Trump has, 00:18:22.600 |
because I don't know what I'm talking about, right? 00:18:25.320 |
Or what I would be talking about if I had a strong opinion. 00:18:34.760 |
because I've declined to have certain conversations 00:18:36.640 |
on my podcast just because I think I'm the wrong person 00:18:40.720 |
And I think it's important to see those bright lines 00:18:57.120 |
I'm under no illusions that all of our institutions 00:19:02.600 |
are worth preserving precisely as they have been 00:19:06.160 |
up until the moment this great orange wrecking ball 00:19:11.200 |
But I just, it was a very bad bet to elect someone 00:19:15.520 |
who is grossly incompetent and worse than incompetent, 00:19:20.520 |
genuinely malevolent in his selfishness, right? 00:19:27.480 |
And this is something we know based on literally decades 00:19:33.600 |
He's not a public servant in any normal sense of that term. 00:19:38.600 |
And he couldn't possibly give an honest or sane answer 00:19:44.240 |
to the question you asked me about empathy and reason 00:19:47.320 |
and like, how should we, what should guide us? 00:19:50.360 |
I genuinely think he is missing some necessary moral 00:20:01.920 |
as a human being 'cause I think having those things 00:20:05.040 |
is incredibly important and genuinely loving other people 00:20:07.720 |
is incredibly important and knowing what all that's about 00:20:20.320 |
to the highest positions of power in our society 00:20:23.640 |
who are far outliers in pathological terms, right? 00:20:30.840 |
We want them to be far outliers in the best case 00:20:34.640 |
in wisdom and compassion and some of the things you've, 00:20:39.120 |
I mean, we want someone to be deeply informed. 00:20:53.060 |
And insofar as we're gonna get normal mediocrities 00:20:56.440 |
in that role, which I think is often the best 00:20:59.360 |
we could expect, let's get normal mediocrities in that role, 00:21:02.640 |
not once in a generation narcissists and frauds. 00:21:13.360 |
we just take honesty as a single variable, right? 00:21:15.560 |
I think you want, yes, it's possible that most politicians 00:21:29.700 |
Yes, there are certain circumstances where lying 00:21:34.560 |
It's kind of on a continuum of self-defense and violence. 00:21:45.120 |
But Trump, arguably there's never been a person 00:21:59.080 |
I mean, it's just, he was just a blizzard of lies, 00:22:08.360 |
But it's just, it says something fairly alarming 00:22:14.880 |
about our society that a person of that character 00:22:26.600 |
for half of the society who didn't see it that way. 00:22:29.520 |
And that's gonna sound elitist and smug or something 00:22:34.520 |
for anyone who's on that side listening to me. 00:22:39.200 |
I mean, I understand that like, I barely have the, 00:22:42.400 |
I'm like one of the luckiest people in the world 00:22:44.200 |
and I barely have the bandwidth to pay attention 00:22:48.640 |
in order to have an opinion about half the things 00:22:57.760 |
who's raising multiple kids, even a single kid? 00:23:06.320 |
that people have the bandwidth to really track this stuff. 00:23:12.280 |
and they see, they get inundated by misinformation 00:23:15.080 |
and they see what their favorite influencer just said. 00:23:21.200 |
And it's just, we're living in an environment 00:23:25.400 |
where the information space has become so corrupted 00:23:29.520 |
and we've built machines to further corrupt it. 00:23:32.800 |
You know, I mean, we've built a business model 00:23:34.400 |
for the internet that it further corrupts it. 00:23:37.180 |
So it is just, it's chaos in informational terms. 00:23:57.680 |
And that was understandable for many reasons. 00:24:04.240 |
is somebody I've looked up to for a long time 00:24:12.400 |
and I would love you to steel man the case for it 00:24:14.560 |
and against, that Trump broke Sam Harris's brain. 00:24:21.600 |
to the actual impact that Trump had on our society. 00:24:29.600 |
calm, rational minds to see the world clearly, 00:24:40.280 |
Is there a degree to which he broke your brain? 00:24:56.800 |
the problem back on the person who's criticizing Trump. 00:24:59.620 |
But in truth, the true Trump derangement syndrome 00:25:04.240 |
was not to have seen how dangerous and divisive 00:25:12.240 |
And in the final moment, not to see how Trump 00:25:17.320 |
was so untenable it was to still support someone 00:25:22.320 |
a sitting president who was not committing 00:25:27.280 |
I mean, if that wasn't a bright line for you, 00:25:39.720 |
And I think it really was but for the integrity 00:25:49.320 |
some real constitutional crisis and real emergency 00:25:58.600 |
and decided to not certify the election, right? 00:26:04.100 |
the number of people who held things together 00:26:08.200 |
And so it wasn't for want of trying on Trump's part 00:26:16.880 |
truly uncharted catastrophe with our democracy. 00:26:22.040 |
So the fact that that didn't happen is not a sign 00:26:25.480 |
that those of us who were worried that it was 00:26:29.560 |
so close to happening were exaggerating the problem. 00:26:32.200 |
I mean, it's like you almost got run over by a car 00:26:39.360 |
and you're thinking, "Boy, that was dangerous. 00:26:41.240 |
"I probably shouldn't wander in the middle of the street 00:26:46.000 |
You weren't wrong to feel that you really had a problem. 00:26:49.840 |
And came very close to something truly terrible. 00:26:59.720 |
So the fact that he's still, he's coming back around 00:27:04.640 |
I'm not spending much time thinking about it, frankly, 00:27:15.080 |
I don't know how many podcasts I devoted to the topic. 00:27:20.520 |
It wasn't that, I mean, it wasn't that many in the end 00:27:24.040 |
against the number of podcasts I devoted to other topics. 00:27:36.880 |
He's like not a, it's just good fun to see somebody 00:27:40.240 |
who's like, who's just not taking anything seriously. 00:27:46.600 |
of business as usual again and again and again and again. 00:27:50.500 |
And they don't really see anything much at stake, right? 00:27:56.480 |
It doesn't really matter if we don't support NATO. 00:27:59.480 |
It doesn't really matter if he says he trusts Putin 00:28:04.920 |
I mean, none of this, it doesn't matter if he's, 00:28:13.320 |
threatening to bomb them back to the Stone Age, 00:28:28.360 |
I can even inhabit that space for a few minutes at a time. 00:28:32.240 |
But there's a deeper concern that we're in the process 00:28:41.280 |
And this is a problem I've had with several other people 00:28:58.880 |
And I think there are really important problems 00:29:02.040 |
that we have to get our head straight around. 00:29:13.160 |
you know, both about the loss of trust in our institutions 00:29:16.600 |
and the fact that trust has eroded for good reason, right, 00:29:22.920 |
I think, you know, they've become infected by, 00:29:25.840 |
you know, political ideologies that are not truth tracking. 00:29:43.660 |
amateurish speculation and conspiracy thinking 00:30:02.900 |
And I try to telegraph that a fair amount with people. 00:30:06.100 |
So yeah, I mean, but it's not, it's different. 00:30:12.580 |
Like, I mean, you can invite someone onto your podcast 00:30:29.200 |
and you might be pushing back at the margins, 00:30:32.260 |
but you know that when push comes to shove on that topic, 00:30:35.900 |
you really don't have a basis to have a strong opinion. 00:30:46.900 |
it's gonna be by deference to some other expert 00:30:49.540 |
who you've brought in or who you've heard about 00:30:53.980 |
But there's a paradox to how we value authority in science 00:31:00.620 |
And I think we should at some point unravel that 00:31:03.380 |
because it's the basis for a lot of public confusion. 00:31:07.380 |
And frankly, it's the basis for a lot of criticism 00:31:11.700 |
where people think that I'm against free speech 00:31:19.740 |
or it's like, they just think I'm a credentialist. 00:31:23.400 |
I just think people with PhDs from Ivy League universities 00:31:31.220 |
there's a lot to cut through to get to daylight there 00:31:42.020 |
- You've talked about it, but it's just interesting, 00:31:45.820 |
You've had this famous phrase about Hunter Biden 00:31:58.640 |
on the situation of January 6th and Trump in general. 00:32:01.860 |
It's possible that January 6th and things of that nature 00:32:07.940 |
revealed that our democracy is actually pretty fragile. 00:32:26.500 |
and there's a lot of people like Vladimir Lenin, Hitler, 00:32:29.660 |
who are exceptionally competent at controlling power, 00:32:42.180 |
but working in the shadows behind the scenes to gain power. 00:32:49.300 |
and that is how they were able to gain power. 00:32:51.700 |
The pushback with Trump, he was doing none of that. 00:32:54.860 |
He was creating what he's very good at, creating drama, 00:32:59.860 |
sometimes for humor's sake, sometimes for drama's sake, 00:33:03.260 |
and simply revealed that our democracy is fragile. 00:33:06.700 |
And so he's not this once in a generation horrible figure. 00:33:26.700 |
He doesn't care about anything beyond himself. 00:33:42.820 |
- This is what Eric Weinstein never stops badgering me about, 00:33:50.420 |
My analogy for Trump was that he's an evil Chauncey Gardiner. 00:33:56.420 |
I don't know if you remember the book or the film 00:34:09.820 |
and gets promoted to immense power in Washington 00:34:21.300 |
He's just talking, all he cares about is gardening. 00:34:23.540 |
He's just talking about his garden all the time, 00:34:25.940 |
but he'll say something, but yeah, in the spring, 00:34:30.660 |
and people read into that some kind of genius insight 00:34:47.420 |
He's got a genius for creating a spectacle around himself. 00:34:51.580 |
Right, he's got a genius for getting the eye of the media 00:35:00.220 |
it's a kind of self-promotion that only works 00:35:06.220 |
and don't care about having a reputation for anything 00:35:09.420 |
that I or you would wanna have a reputation for, right? 00:35:12.420 |
It's like it's pure, the pure pornography of attention, 00:35:17.800 |
I think the truly depressing and genuinely scary thing 00:35:28.100 |
given how broken our society is in many ways, 00:35:33.100 |
we have a country that didn't see anything wrong with that, 00:35:44.180 |
and who's obviously not a good person, right, 00:35:49.220 |
can't even pretend to care about people really, right, 00:35:53.300 |
And so, I mean, if there's a silver lining to this, 00:36:05.200 |
to a truly brilliant and sinister figure, right? 00:36:09.180 |
I mean, like I think we really dodged a bullet. 00:36:13.900 |
Yeah, someone far more competent and conniving 00:36:18.900 |
and ideological could have exploited our system 00:36:25.060 |
And that's, yeah, so if we plug those holes eventually, 00:36:33.460 |
and he would have done a good thing for our society, right? 00:36:38.340 |
and I think nobody knew, I mean, I certainly didn't know it 00:36:43.940 |
is how much our system relies on norms rather than laws. 00:36:53.140 |
that he never did anything illegal, truly illegal. 00:36:57.780 |
I mean, I think he probably did a few illegal things, 00:37:00.120 |
but like illegal such that he really should be 00:37:32.180 |
but that doesn't mean they're insignificant, right? 00:37:39.200 |
no, I'm not gonna commit to a peaceful transfer of power, 00:37:59.540 |
what the limits are and now we start to think that 00:38:11.180 |
but he did everything he could to try to steal the election. 00:38:16.060 |
I mean, the irony is he claimed to have an election stolen 00:38:18.940 |
from him all the while doing everything he could to steal it, 00:38:29.460 |
knowing that they were gonna be disproportionately 00:38:31.740 |
Democrat votes because of the position he took 00:38:46.260 |
that crashed into Four Seasons Landscaping, right, 00:39:01.900 |
And all of these things are getting thrown out 00:39:11.680 |
that he didn't maintain his power in this country. 00:39:22.340 |
and it's worth not letting that happen again. 00:39:29.900 |
He didn't do everything that could have been done 00:39:33.400 |
- Right, but the tools you have as a president, 00:39:38.820 |
You can declare emergencies, especially during COVID. 00:39:52.960 |
and there are people who refuse to do those things. 00:40:01.180 |
again, there are multiple books written about 00:40:10.820 |
in what he tried to do and tried to get others to do 00:40:15.540 |
I mean, it's just awful that we were that close 00:40:20.340 |
to a true unraveling of our political process. 00:40:36.060 |
It's just like, we looked like a banana republic there 00:41:04.340 |
There's a nihilism and cynicism to all of that, 00:41:09.220 |
which again, in certain people, it's understandable. 00:41:15.380 |
and you have a compound in Menlo Park or wherever. 00:41:19.020 |
It's like, there are people who are cheerleading this stuff 00:41:23.380 |
who know that they can get on their Gulf Stream 00:41:32.460 |
that I think we should be deeply critical of. 00:41:42.140 |
but the effect it had in part on the division between people. 00:41:47.140 |
And to me, the degree, the meme of Sam Harris's brain 00:42:00.300 |
- Well, I don't think there is something profitably 00:42:19.440 |
that's gonna converge on something reasonable, right? 00:42:31.700 |
who won't admit that there's anything wrong with Trump, 00:42:37.700 |
- So I'd like to push for the empathy versus reason. 00:42:40.580 |
'Cause when you operate in the space of reason, yes, 00:42:43.660 |
but I think there's a lot of power in you showing, 00:42:47.660 |
in you, Sam Harris, showing that you're willing 00:42:50.500 |
to see the good qualities of Trump, publicly showing that. 00:42:54.620 |
I think that's the way to win over Candace Owens. 00:42:58.100 |
He has fewer good qualities than virtually anyone 00:43:08.820 |
Look at just policies and actual impacts he had. 00:43:12.460 |
No, no, so I've admitted that many of his policies 00:43:18.100 |
So probably more often than not, at least on balance, 00:43:22.820 |
I agreed with his policy that we should take China 00:43:45.100 |
I think there's, it's obvious that we should have control 00:43:50.460 |
Like I don't see the argument for not having control 00:43:58.700 |
and we should have a sane immigration policy. 00:44:00.620 |
So I didn't necessarily think it was a priority 00:44:06.100 |
I never criticized the impulse to build the wall 00:44:09.140 |
because if tens of thousands, hundreds of thousands 00:44:13.180 |
and we are not in a position to know who's coming, 00:44:17.700 |
So, and I can recognize that many people in our society 00:44:26.140 |
and there is in many cases, a zero sum contest 00:44:34.380 |
So I think we should have control of our borders. 00:44:37.780 |
We should have a sane and compassionate immigration policy. 00:44:46.200 |
but no, like I would say 80% of the policy concerns 00:44:58.960 |
that I either share entirely or certainly sympathize with. 00:45:10.820 |
- A threat to democracy and some fundamental-- 00:45:16.540 |
it's the effect on everything he touches, right? 00:45:24.100 |
and destabilizing almost everything he touches 00:45:34.100 |
I mean, so you looked at these people who served 00:45:36.500 |
as chief of staff or in various cabinet positions, 00:45:43.740 |
and level-headedness, whether you share their politics 00:46:09.000 |
Yeah, I mean, it's just people bent over backwards 00:46:17.220 |
and it was bad for them and it was bad for our system. 00:46:21.900 |
But none of that discounts the fact that we have a system 00:46:35.340 |
Yes, there are bad incentives and entrenched interests. 00:46:40.340 |
And I'm not a fan of the concept of the deep state, 00:46:50.340 |
that is not flexible enough to respond intelligently 00:47:02.620 |
and of institutions in general that I think we should do, 00:47:07.620 |
but we need smart, well-informed, well-intentioned people 00:47:13.140 |
And the well-intentioned part is hugely important, right? 00:47:18.140 |
Just give me someone who is not the most selfish person 00:47:23.140 |
anyone has ever heard about in their lifetime, right? 00:47:31.980 |
the one most selfish person I think anyone could name. 00:47:35.700 |
I mean, and again, there's so much known about this man. 00:47:38.780 |
That's the thing, it was like, it predates his presidency. 00:47:44.060 |
And this is why to come back to those inflammatory comments 00:47:55.580 |
is that there is, and that includes any evidence 00:47:59.960 |
of corruption on the part of his father, right? 00:48:05.900 |
So it's like, there is no, as far as I can tell, 00:48:08.860 |
there's not a big story associated with that laptop 00:48:10.960 |
as much as people bang on about a few emails. 00:48:14.200 |
But even if there were just obvious corruption, right? 00:48:19.200 |
Like Joe Biden was at this meeting and he took 00:48:22.240 |
this amount of money from this shady guy for bad reasons. 00:48:26.340 |
Given how visible the lives of these two men have been, 00:48:32.480 |
I mean, given how much we know about Joe Biden 00:48:37.460 |
for almost as long as I've been alive, both of them, 00:48:40.160 |
the scale of corruption can't possibly balance out 00:48:47.640 |
If you show me that Joe Biden has this secret life 00:48:55.320 |
And he's doing all these things I didn't know about. 00:48:58.680 |
Okay, then I'm gonna start getting a sense that, 00:49:01.640 |
all right, maybe this guy is way more corrupt 00:49:05.460 |
Maybe there is some deal in Ukraine or with China 00:49:08.040 |
that is just, like this guy is not who he seems, 00:49:10.680 |
he's not the public servant he's been pretending to be. 00:49:13.440 |
He's been on the take for decades and decades 00:49:30.120 |
having looked at this guy for literally decades, right? 00:49:50.320 |
You hold that against what we know about Trump, right? 00:50:02.440 |
to testify to the fact that he has no ethical compass, 00:50:07.440 |
So that's why I don't care about what's on the laptop. 00:50:10.840 |
Now, if you tell me Trump is no longer running 00:50:13.700 |
for president in 2024 and we can put Trumpism behind us 00:50:20.360 |
listen, there's a lot of stuff on that laptop 00:50:21.920 |
that makes Joe Biden look like a total asshole. 00:50:29.240 |
it was a forced choice between a sitting president 00:50:32.540 |
who wouldn't commit to a peaceful transfer of power 00:50:35.200 |
and a guy who's obviously too old to be president 00:50:39.060 |
who has a crack-addicted son who lost his laptop 00:51:06.480 |
"There's nothing, it's Hunter Biden, it's not Joe Biden. 00:51:08.980 |
"Whatever the scope of Joe Biden's corruption is, 00:51:10.880 |
"it is infinitesimally compared to the corruption 00:51:18.360 |
But let me make the case that you're really focused 00:51:31.480 |
You can spend hundreds of billions of dollars 00:51:34.140 |
or trillions towards a war in the Middle East, for example, 00:51:40.600 |
in terms of the negative impact it has on the world. 00:51:47.840 |
everybody's very nice, everybody's very civil, 00:51:53.340 |
Yeah, sometimes somehow disappears in different places, 00:52:00.780 |
There's no coke and strippers or whatever is on the laptop. 00:52:07.200 |
In the meanwhile, hundreds of thousands of civilians die. 00:52:10.860 |
Hate, just an incredible amount of hate is created 00:52:18.240 |
but there's no strippers and coke on a laptop. 00:52:24.800 |
It is, when someone only wants wealth and power and fame, 00:52:49.620 |
pandemic risk, nuclear proliferation risk, none of it, right? 00:52:57.340 |
and whatever can personally redound to their self-interest 00:53:04.260 |
And they're not informed about the other risks 00:53:16.100 |
Why do I have fewer nuclear weapons than JFK?" 00:53:23.620 |
And this is the guy who's got the button, right? 00:53:29.900 |
I mean, somebody's following him around with a bag 00:53:37.320 |
That is a, it's just, it's a risk we should never run. 00:53:49.020 |
I don't know, people allege that he does speed, 00:53:54.100 |
He's not deranging himself with pharmaceuticals, at least, 00:54:03.120 |
- There's nothing wrong, just for the record, 00:54:09.300 |
- I mean, it's consumed in a very large amount. 00:54:12.180 |
- There's no medical, there's no scientific evidence 00:54:16.700 |
all those studies about aspartame and all that. 00:54:25.260 |
about the military-industrial complex is true, right? 00:54:30.640 |
on both sides of the aisle for a very long time. 00:54:55.660 |
How much more do they corrupt bad people, right? 00:55:15.640 |
and try to diagnose those incentives and change them, right? 00:55:28.760 |
that even bad people are effortlessly behaving 00:55:42.500 |
So yes, and you say I changed my mind about the war. 00:55:48.620 |
I mean, I was never a supporter of the war in Iraq. 00:55:52.400 |
I was always worried that it was a distraction 00:56:03.520 |
at best, a highly ambiguous and painful exercise, 00:56:10.180 |
It's like that, you know, it did not turn out well. 00:56:15.800 |
I don't, you know, I have not done a deep dive 00:56:21.520 |
and maybe all of these failures are failures in principle. 00:56:26.400 |
that can be done well by anybody, whatever our intentions. 00:56:30.200 |
But yeah, the move to Iraq always seemed questionable to me, 00:56:39.480 |
the immediate problem at that moment, you know, 00:56:45.480 |
and, you know, and then bouncing to Pakistan. 00:56:52.560 |
but my sense of the possibility of nation building, 00:57:10.840 |
that, you know, America was the kind of nation 00:57:12.960 |
that should be functioning in this way as the world's cop, 00:57:25.960 |
we're gonna have to do it over here kind of thing. 00:57:38.400 |
you're not gonna force a change at gunpoint in the culture, 00:57:45.960 |
or it certainly seems that that's not gonna happen. 00:57:53.520 |
- That's one of the things you realize with the wars, 00:57:59.680 |
You can just keep pouring money into a thing, 00:58:02.640 |
- Well, also there are signs of it working too. 00:58:04.680 |
You have all the stories of girls now going to school, 00:58:08.400 |
right, you know, the girls are getting battery acid 00:58:17.000 |
and that's all good, and our intentions are good there. 00:58:20.280 |
And I mean, we're on the right side of history there. 00:58:24.520 |
You know, Malala Yousafzai should have the Nobel Prize, 00:58:35.840 |
The question is, what do you do when there are enough, 00:58:41.760 |
who are willing to die and let their children die 00:58:46.780 |
And moral norms that belong in the seventh century. 00:58:53.400 |
and we couldn't solve it even though we spent, 00:58:58.480 |
- This reminded me of the thing that you and Jack Dorsey 00:59:03.480 |
jokingly had for a while, the discussion about banning 00:59:15.560 |
I mean, this has to do with sort of the Hunter laptop, 00:59:21.100 |
Does it bother you that there could be a collection 00:59:24.160 |
of people that make decisions about who to ban and not? 00:59:37.940 |
or in the absence of perfect AI, it always will be. 00:59:46.980 |
And it's an interesting question there as well. 00:59:51.860 |
And I used to think it was more important when I was on it, 01:00:00.820 |
an unambiguously good thing, in my experience, 01:00:07.180 |
It's like, it is just, even the good parts of Twitter 01:00:14.580 |
in the degree to which it was fragmenting my attention, 01:00:19.060 |
the degree to which my life was getting doled out to me 01:00:22.620 |
in periods between those moments where I checked Twitter, 01:00:38.500 |
multiple times a day or even every day, right? 01:00:40.740 |
I mean, I probably, I think I probably averaged 01:00:44.660 |
something like one tweet a day, I think I averaged. 01:00:50.820 |
and then I wouldn't tweet for the better part of a week. 01:00:54.140 |
But I was looking a lot because it was my newsfeed. 01:01:01.140 |
and I just wanted to see what they were paying attention to. 01:01:10.300 |
I would tweet, and so all of that seemed good. 01:01:15.320 |
from all of the odious bullshit that came back at me 01:01:19.340 |
largely in response to this Hunter Biden thing. 01:01:39.420 |
which is intrinsically fragmenting of time and attention. 01:01:42.220 |
And now my phone is much less of a presence in my life. 01:01:45.900 |
And it's not that I don't check Slack or check email. 01:02:01.640 |
has changed a lot by deleting my Twitter account. 01:02:10.780 |
I mean, we say of someone, that person's too online, right? 01:02:26.480 |
I think being on social media at all is to be too online. 01:02:39.400 |
And given the impulse it kindles in each of us 01:02:44.400 |
to reach out to our audience in specific moments 01:02:57.340 |
but there's nothing for me to do with that opinion, right? 01:03:12.920 |
I'm not gonna take the time to really think about it. 01:03:16.640 |
I would have reacted to this thing in the news 01:03:24.640 |
- Like chocolate ice cream is the most delicious thing ever. 01:03:36.920 |
and reputation destroying things that people do. 01:03:44.320 |
the analogous things that have happened to me. 01:03:45.800 |
I mean, the things that have really bent my life around 01:04:00.560 |
the things I would think I would have to respond to, 01:04:03.240 |
like I would release a podcast on a certain topic. 01:04:11.000 |
that there was some signal that I really had to respond to. 01:04:29.480 |
or taking someone else's foot out of my mouth. 01:04:42.940 |
I mean, if it's generative of useful information 01:04:48.360 |
and engagement professionally and psychologically, great. 01:04:58.360 |
I mean, there were people who I've connected with 01:05:04.440 |
and it was hard to see how that was gonna happen otherwise. 01:05:16.080 |
- Do you think it's possible to avoid the drug of that? 01:05:23.600 |
to use it in a way that doesn't pull you into the whirlpool? 01:05:31.440 |
- Yeah, but it's not the way I wanted to use it. 01:05:38.680 |
- I wanted to actually communicate with people. 01:05:44.880 |
again, it's like being in Afghanistan, right? 01:05:59.300 |
I have those moments on Twitter where it's like, 01:06:03.160 |
who's detected an error I made in my podcast or in a book, 01:06:12.480 |
And I would never have heard from this person 01:06:19.200 |
That's the promise of it, to actually talk to people. 01:06:24.780 |
No, the sane or sanity-preserving way of using it 01:06:35.140 |
and you don't look at what's coming back at you. 01:06:37.660 |
And that's, I'm on other social media platforms 01:06:42.960 |
I mean, my team posts stuff on Facebook and on Instagram. 01:06:48.220 |
- So you don't think it's possible to see something 01:06:56.220 |
and I did that for vast stretches of time, right? 01:07:07.100 |
So like, so why am I, if I know for whatever reason, 01:07:23.660 |
But I'm just gonna stare into this funhouse mirror 01:07:37.700 |
and find equanimity again with all the nastiness 01:07:43.100 |
and not that I couldn't ignore it for vast stretches of time, 01:07:46.220 |
but I could see that I kept coming back to it 01:07:51.060 |
hoping that it would be something that I could use, 01:07:55.720 |
And I was noticing that it was insidiously changing 01:08:02.980 |
both people I know and people I don't know, right? 01:08:04.960 |
Like people I, you know, mutual friends of ours 01:08:20.960 |
And I felt that it was as much as I could sort of 01:08:31.760 |
because people are performing, people are faking, 01:08:44.760 |
not just a daily basis, you know, an hourly basis, 01:08:48.500 |
or, you know, an increment sometimes of, you know, 01:08:57.920 |
and maybe I was checking it 100 times a day on some days, 01:09:25.360 |
totally normal people are capable of being, right? 01:09:37.200 |
People who I know I could sit down at dinner with 01:09:52.620 |
you know, and there are analogies that many of us have made. 01:10:04.680 |
if they didn't have this metal box around them, 01:10:07.700 |
And it's, you know, all of that becomes quite hilarious 01:10:20.800 |
coming out of that car next to them with cauliflower ear 01:10:27.840 |
because they would have taken one look at this person 01:10:29.680 |
and realized this is the last person you wanna fight with. 01:10:32.760 |
- That's one of the heartbreaking things is to see, 01:10:37.360 |
who I know are friends, be everything from snarky 01:10:41.800 |
to downright mean, derisive towards each other. 01:10:50.360 |
Like this, this is the only place where I've seen 01:10:53.100 |
people I really admire who have had a calm head 01:10:56.720 |
about most things, like really be shitty to other people. 01:11:02.040 |
And I don't, I tend, I choose to maybe believe 01:11:16.200 |
but you kind of accept that that's kind of what you're doing 01:11:20.360 |
But it's sometimes hard to remind yourself of that. 01:11:23.640 |
- Well, and I think I was guilty of that, definitely. 01:11:44.080 |
It's not, it's not, so the reason why I deleted 01:11:54.440 |
And so, and yeah, is there some way to be on there 01:12:09.800 |
use it as a one-way channel of communication, 01:12:14.560 |
You know, it's like, here's what I'm paying attention to. 01:12:17.940 |
Look at it if you want to, and you just push it out, 01:12:20.360 |
and then you don't look at what's coming back at you. 01:12:24.880 |
and actually, quite surprisingly, there's a lot of good, 01:12:28.840 |
I mean, they're like, even if they're critical, 01:12:31.600 |
they're like being thoughtful, which is nice. 01:12:34.060 |
- I used it that way too, and that was what kept me hooked. 01:12:37.100 |
- But then there's also TouchBalls69, who wrote a question. 01:12:44.280 |
This is part of it, but one way to solve this is, 01:12:47.420 |
you know, we gotta get rid of anonymity for this. 01:12:57.460 |
That was, and I've since solved that problem, so. 01:13:08.660 |
I don't have to hear from TouchBalls69 on the regular. 01:13:18.680 |
just even in moderation, just to see that there is, 01:13:22.540 |
like for me, the negative effect is slightly losing faith 01:13:31.220 |
- You can also just reason your way out of it, 01:13:32.980 |
saying that this is anonymity and this is kind of fun 01:13:35.100 |
and this kind of, just the shit show of Twitter, it's okay, 01:13:39.180 |
but it does mentally affect you a little bit. 01:13:41.540 |
- Like I don't read too much into that kind of comment. 01:14:02.420 |
and for all I know, that person could be 16 years old, 01:14:09.540 |
- Well, yeah, that's right, yeah, yeah, yeah. 01:14:12.300 |
No, I'm pretty sure Elon would just tweet that 01:14:19.620 |
Okay, so the, do you think, so speaking of which, 01:14:30.780 |
This Twitter and just social media in general, 01:14:33.220 |
but because of the aggressive nature of his innovation 01:14:36.620 |
that he's pushing, is there any way to make Twitter 01:14:46.180 |
- I don't know, I think I'm agnostic as to whether or not 01:14:48.800 |
he or anyone could make a social media platform 01:14:53.140 |
- So you were just observing yourself week by week, 01:14:59.780 |
and growing as a person, and it was negative. 01:15:04.580 |
I mean, it's obviously, I mean, he's not gonna admit it, 01:15:07.960 |
but I think it's obviously negative for Elon, right? 01:15:12.880 |
and that was one of the things that, you know, 01:15:18.980 |
I was also seeing the funhouse mirror on his side of Twitter, 01:15:31.920 |
why was I spending my time this way, to a lesser degree, 01:15:34.940 |
right, and at lesser scale and at lesser risk, frankly, 01:15:56.520 |
That was based, I mean, I was on somebody's podcast, 01:15:59.280 |
but that was based on a clip taken from that podcast, 01:16:02.320 |
which was highly misleading as to the general shape 01:16:08.560 |
Even, you know, I had to then do my own podcast 01:16:18.360 |
and didn't say exactly what I thought in a way 01:16:29.360 |
but the clip was quite distinct from the podcast itself. 01:16:33.460 |
The reality is that we're living in an environment now 01:16:33.460 |
is so fragmented that they only have time for clips. 01:16:45.880 |
99% of people will see a clip and will assume 01:16:50.560 |
there's no relevant context I need to understand 01:16:54.680 |
And obviously the people who make those clips know that, 01:16:58.320 |
right, and they're doing it quite maliciously. 01:17:00.800 |
And in this case, the person who made that clip 01:17:04.840 |
was quite maliciously trying to engineer, you know, 01:17:12.020 |
And being signal boosted by Elon and other prominent people 01:17:19.320 |
who can't take the time to watch anything other than a clip, 01:17:25.680 |
or someone who's ostensibly their friend in that clip, 01:17:38.000 |
and we now have these contexts in which we react 01:17:43.280 |
Like Twitter is inviting an instantaneous reaction 01:18:09.080 |
from this conversation that are just as misleading 01:18:13.240 |
and then people are gonna be dunking on those clips. 01:18:21.840 |
- See, I think it's possible to create a platform. 01:18:27.640 |
but when I saw that clip of you talking about children 01:18:30.360 |
and so on, just knowing that you have a sense of humor, 01:18:33.480 |
you just went to a dark place in terms of humor. 01:18:40.760 |
is that people will use it for virality's sake, 01:18:49.320 |
It's not like I was, it's really like interpreting it 01:18:58.160 |
like I even give Trump the benefit of the doubt 01:19:10.720 |
Like the whole, there were good people on both sides 01:19:14.620 |
scandal around his remarks after Charlottesville. 01:19:17.580 |
The clip that got exported and got promoted by everyone 01:19:26.860 |
the New York Times, CNN, there's nobody that I'm aware of 01:19:38.660 |
He did not say what he seemed to be saying in that clip 01:19:44.080 |
And I have always been very clear about that. 01:19:46.340 |
So it's just, even people who I think should be marginalized 01:20:02.880 |
who are doing dangerous things and for bad reasons, 01:20:11.640 |
And this goes to anyone else we might talk about 01:20:15.180 |
who's more, where the case is much more confusing. 01:20:26.460 |
but the prospect of being able to manufacture clips 01:20:34.040 |
and that where it's gonna be hard for most people 01:20:38.700 |
whether they're in the presence of something real, 01:20:51.140 |
that we are right on the cusp of and it's terrifying. 01:20:59.780 |
where humor is the only thing we have and it will save us. 01:21:02.960 |
Maybe in the end, Trump's approach to social media 01:21:11.960 |
People function on the basis of what they assume is true. 01:21:15.440 |
Right, they think-- - People have functioned. 01:21:19.480 |
you have to know what you think is gonna happen 01:21:22.600 |
or you have to at least give a probabilistic weighting 01:21:26.680 |
over the future, otherwise you're gonna be incapacitated 01:21:30.720 |
by, you're not gonna, like people want certain things 01:21:37.560 |
And they don't wanna die, they don't want their kids to die, 01:21:40.780 |
you tell them that there's a comet hurtling toward Earth 01:21:43.360 |
and they should get outside and look up, right? 01:21:46.360 |
They're gonna do it and if it turns out it's misinformation, 01:21:49.360 |
it's gonna matter because it comes down to like, 01:21:54.680 |
what medicines do you give your children, right? 01:21:57.720 |
Like we're gonna be manufacturing fake journal articles. 01:22:29.480 |
And there are people who are celebrating this kind of, 01:22:41.280 |
there are the people who don't have anything to lose 01:22:44.720 |
who are celebrating it or just are so confused 01:22:46.960 |
that they just don't even know what's at stake. 01:22:50.000 |
the few people who we could count on a few hands 01:22:54.760 |
or at least imagine they've insulated themselves 01:22:59.440 |
that they're not implicated in the great unraveling 01:23:11.520 |
Is there such thing as expertise and experts in something? 01:23:16.560 |
- I think it's important to acknowledge upfront 01:23:35.160 |
It's just like there are different moments in time. 01:23:47.960 |
I mean, whenever you're having a fact-based discussion 01:23:51.600 |
about anything, it is true to say that the truth 01:23:56.120 |
or falsity of a statement does not even slightly depend 01:24:01.120 |
on the credentials of the person making the statement. 01:24:05.400 |
So it doesn't matter if you're a Nobel laureate, 01:24:08.960 |
The thing you could, the last sentence you spoke 01:24:12.860 |
And it's also possible for someone who's deeply uninformed 01:24:29.560 |
who's got no credentials, but he just loves astronomy 01:24:32.320 |
and he's got a telescope and he's spent a lot of time 01:24:35.260 |
looking at the night sky, and he discovers a comet 01:24:40.240 |
not even the professional expert astronomers. 01:24:44.080 |
And I gotta think that happens less and less now, 01:25:02.080 |
to the reputational concerns we have among apes 01:25:12.420 |
real experts are much more reliable than frauds 01:25:24.200 |
And when you're flying an airplane in a storm, 01:25:28.760 |
you don't want just randos coming into the cockpit saying, 01:25:38.440 |
and that training gave them something, right? 01:25:40.920 |
It gave them a set of competences and intuitions 01:25:43.760 |
and they know what all those dials and switches do, right? 01:25:59.600 |
sharpens this up, we want real experts to be in charge, 01:26:14.160 |
whether it's a geopolitical emergency like Ukraine, 01:26:18.500 |
I mean, climate change, I mean, just pick your topic. 01:26:33.780 |
And so expertise is a thing and deferring to experts 01:27:00.740 |
we're well-educated, we're not the worst possible candidates 01:27:07.900 |
When we're going into a new area where we're not experts, 01:27:11.340 |
we're fairly alert to the possibility that we don't, 01:27:16.080 |
and we don't know how our tools translate to this new area. 01:27:37.020 |
And in that case, the invitation to do your own research, 01:27:48.480 |
I view as an invitation to waste your time pointlessly, 01:27:54.700 |
Now, the truth is times are not all that good, right? 01:28:07.240 |
We've got experts who perversely won't admit they were wrong 01:28:12.040 |
when they in fact are demonstrated to be wrong. 01:28:14.720 |
We've got institutions that have been captured 01:28:16.880 |
by political ideology that's not truth tracking. 01:28:27.460 |
whether it's universities or science journals 01:28:43.160 |
But the reality is that there is a massive difference 01:28:46.880 |
when there's anything to know about anything, 01:28:49.160 |
there is a massive difference most of the time 01:28:54.520 |
to understand that domain and someone who hasn't. 01:28:57.880 |
And if I get sick or someone close to me gets sick, 01:29:02.780 |
you know, I have a PhD in neuroscience, right? 01:29:11.920 |
And I, you know, so I'm just fairly conversant 01:29:18.320 |
and I'm alert to the difference because I've, 01:29:30.060 |
I'm alert to the difference between good and bad studies 01:29:34.980 |
And I understand that bad studies can get published 01:29:44.720 |
But when I get sick or when someone close to me gets sick, 01:29:54.600 |
for days at a stretch trying to become a doctor, 01:29:58.100 |
much less a specialist in the domain of problem 01:30:01.440 |
that has been visited upon me or my family, right? 01:30:09.800 |
I don't start reading, you know, in journals of oncology 01:30:14.140 |
and try to really get up to speed as an oncologist 01:30:25.360 |
and very likely misleading use of my time, right? 01:30:32.880 |
if I decide, if I had, if I had a lot of runway, 01:30:36.360 |
if I decided, okay, it's really important for me 01:30:40.680 |
At this point, I wanna, I know someone's gonna get cancer. 01:30:43.600 |
I may not go back to school and become an oncologist, 01:30:46.440 |
but what I wanna do is I wanna know everything 01:31:02.480 |
I'm not gonna be the best person to form intuitions 01:31:05.820 |
about what to do in the face of the next cancer 01:31:09.840 |
I'm still gonna want a better oncologist than I've become 01:31:15.640 |
to tell me what he or she would do if they were in my shoes 01:31:24.720 |
I'm not advocating a blind trust and authority. 01:31:33.240 |
and they're recommending some course of treatment, 01:31:34.680 |
by all means, get a second opinion, get a third opinion. 01:31:43.060 |
Robert Kennedy Jr. who's telling you that you got it 01:31:56.320 |
where you've got people who are moving the opinions 01:32:00.140 |
of millions of others who should not have an opinion 01:32:11.180 |
you should be getting your opinion about vaccine safety 01:32:19.300 |
or anything else that we might wanna talk about 01:32:29.860 |
And what's more, she doesn't seem to care, right? 01:32:34.980 |
that has amplified that not caring into a business model 01:32:42.940 |
and there's something very Trumpian about all that. 01:32:45.940 |
Like that's the problem, the problem is the culture. 01:32:51.740 |
So the paradox here is that expertise is a real thing 01:32:58.580 |
and we defer to it a lot as a labor-saving device 01:33:14.800 |
They know more about that topic than the next guy, 01:33:24.620 |
But it's also true that when you're talking about facts, 01:33:37.140 |
You get a sea change in the thinking of a whole field 01:33:40.560 |
because one person who's an outlier for whatever reason 01:34:00.020 |
So he just drank a vial of H. pylori bacteria 01:34:14.180 |
That doesn't disprove the reality of expertise. 01:34:19.140 |
It doesn't disprove the utility of relying on experts 01:34:22.260 |
most of the time, especially in an emergency, 01:34:27.060 |
especially when you're in this particular cockpit 01:34:30.980 |
and you only have one chance to land this plane, right? 01:34:40.180 |
So one, you mentioned this example with cancer 01:34:54.880 |
because you can read, the internet made information 01:35:00.100 |
so you can read a lot of different meta-analyses. 01:35:19.260 |
you can start to make good faith interpretations. 01:35:24.260 |
For example, I don't know, I don't wanna overstate things, 01:35:26.980 |
but if you suffer from depression, for example, 01:35:36.540 |
but you could also challenge some of those ideas 01:35:39.660 |
and seeing like, what are the different medications, 01:35:42.820 |
what are the different solutions to depression, 01:35:45.660 |
And I think depression is just a really difficult problem 01:35:48.660 |
that's very, I don't wanna, again, state incorrect things, 01:35:58.020 |
So being introspective about the type of depression you have 01:36:02.260 |
and the different possible solutions you have, 01:36:14.740 |
that's something that's been studied for a very long time. 01:36:17.100 |
With a new pandemic that's affecting everybody, 01:36:20.720 |
with the airplane equated to like 9/11 or something, 01:36:38.380 |
could be exceptionally effective in asking questions. 01:36:42.180 |
And then there's a difference between experts, 01:36:45.300 |
virologists, and it's actually a good question, 01:36:57.980 |
and then there's the communicators of that expertise. 01:37:01.300 |
And the question is, if the communicators are flawed 01:37:16.420 |
you're competing with the communicators of expertise. 01:37:18.940 |
That could be WHO, CDC in the case of the pandemic, 01:37:21.860 |
or politicians, or political type of science figures 01:37:28.380 |
of the effectiveness of doing your own research 01:37:41.180 |
is you can become quite popular by being contrarian, 01:37:54.100 |
in that kind of context could be quite effective. 01:37:58.700 |
I'm not saying you shouldn't do any research. 01:38:02.300 |
I'm not saying that you shouldn't be informed 01:38:08.280 |
And yes, if I got cancer or someone close to me got cancer, 01:38:29.560 |
can discredit themselves or they can be wrong. 01:38:31.880 |
They can be wrong even when there's no discredit. 01:38:35.040 |
There's just, there's a lot we don't understand 01:38:39.140 |
But still this vast gulf between truly informed opinion 01:39:09.640 |
of born of bad incentives within our scientific processes, 01:39:17.640 |
And again, we've mentioned a lot of these things in passing, 01:39:24.920 |
to scientific communication during the pandemic was awful. 01:39:37.080 |
some of the things that were being said and still is. 01:39:45.320 |
for anyone to be that confident about anything. 01:40:02.360 |
And I mean, it's obvious they don't work well for everybody, 01:40:09.460 |
But again, depression is a multifactorial problem 01:40:16.560 |
and there are different levels at which to influence it. 01:40:24.320 |
And one of the perverse things about depression 01:40:30.900 |
all of the things that would be good for you to do 01:40:39.200 |
And all of those things, if you got those up and running, 01:40:43.120 |
they do make you feel better in the aggregate. 01:40:46.560 |
But the reality is that there are clinical level depressions 01:40:57.440 |
there's no life change someone's gonna embrace 01:41:01.020 |
that is going to be an obvious remedy for that. 01:41:03.420 |
The pandemic, I mean, pandemics are obviously 01:41:09.580 |
but I would consider it much simpler than depression 01:41:21.900 |
- The logic by which you would make those choices, yeah. 01:41:23.900 |
So it's like, we have a virus, we have a new virus. 01:41:27.600 |
It's some version of bad, it's human transmissible. 01:41:38.880 |
- Well, at a certain point, we knew it was respiratory, 01:41:44.640 |
and like all that, we were confused about a lot of things. 01:41:54.480 |
we ramped up the vaccines as quickly as we could, 01:41:59.480 |
but too quick for some, not quick enough for others. 01:42:05.520 |
and got them out more quickly with better data. 01:42:08.700 |
And I think that's something we should probably look at 01:42:14.120 |
that would make ethical sense to do challenge trials. 01:42:25.480 |
My concern about COVID has, for much of the time, 01:42:32.520 |
how dangerous I perceive COVID to be as an illness. 01:42:40.820 |
even more a concern about our ability to respond 01:42:53.540 |
give me the first six months to be quite worried 01:43:02.860 |
- You wanna secure a steady supply of toilet paper. 01:43:08.020 |
when we had a sense of what we were dealing with 01:43:23.480 |
For years now, I've just been worrying about this 01:43:28.600 |
as a failed dress rehearsal for something much worse. 01:43:42.680 |
we've basically enrolled all of human society 01:43:45.960 |
into a psychological experiment that is deranging us 01:44:05.420 |
The way we didn't seem to have the distrust in institutions 01:44:30.520 |
It's a matter of incompetence and misaligned incentives 01:44:54.620 |
became emotional and dogmatic in style of conversation, 01:45:07.520 |
made people really more emotional than before. 01:45:14.160 |
I think something I think you probably would agree with, 01:45:17.800 |
I think it was the combo of Trump and the pandemic. 01:45:20.200 |
Trump triggered the far left to be way more active 01:45:27.400 |
nanny-state lefties a huge platform on a silver platter, 01:45:33.760 |
I'm not sure how much to read into the nanny state concept, 01:45:38.020 |
but yeah, like basically got people on the far left 01:45:53.100 |
has created a complete distrust in government. 01:46:04.220 |
given the gaming of these tools by an anti-vax cult. 01:46:13.140 |
that just ramped up its energy during this moment. 01:46:21.540 |
every concern about vaccines is a species of, 01:46:25.660 |
it was born of misinformation or born of this cult, 01:46:54.020 |
that the world's being run by pedophile cannibals 01:46:59.300 |
and Michelle Obama are among those cannibals. 01:47:12.180 |
And the layers of bad faith are hard to fully diagnose, 01:47:17.180 |
but the problem is all of this is getting signal boosted 01:47:28.820 |
that is preferentially spreading misinformation. 01:47:34.380 |
that is preferentially sharing misinformation. 01:48:02.900 |
who are just pretending to believe these things, 01:48:07.180 |
this is sort of like the 4chanification of everything. 01:48:22.460 |
It's just cynicism overflowing its banks, right? 01:48:29.940 |
Like, look at all the normies who don't understand 01:48:37.940 |
everyone's cognitive bandwidth with bullshit, right? 01:48:51.620 |
that real questions of human welfare are in play, right? 01:48:55.900 |
It's like, there are wars getting fought or not fought, 01:49:04.940 |
But I mean, to come back to this issue of COVID, 01:49:07.980 |
I don't think I got so out of balance around COVID. 01:49:17.020 |
I mean, yes, there was a period where I was crazy 01:49:21.540 |
because anyone who was taking it seriously was crazy 01:49:25.340 |
And so it's like, yes, I was wiping down packages 01:49:30.420 |
Because people thought it was transmissible by touch, right? 01:49:35.420 |
And then when we realized that was no longer the case, 01:49:53.300 |
- I think the criticism that people would say 01:50:19.180 |
which I think there was a significant enough number, 01:50:24.340 |
I mean, especially after the wars in Afghanistan and Iraq, 01:50:34.340 |
And you just see how the people who are greedy, 01:50:46.340 |
but it's hard to know how those two combine together 01:50:55.100 |
I guess the sense was that you weren't open enough 01:51:01.380 |
I'll tell you how I thought about it and think about it. 01:51:26.860 |
And that the trade-off for basically everyone 01:51:33.540 |
and how many people had been vaccinated before you, 01:51:47.460 |
where it was just obviously reasonable to get vaccinated, 01:51:51.460 |
especially because there was every reason to expect 01:51:56.100 |
that while it wasn't a perfectly sterilizing vaccine, 01:51:59.660 |
it was going to knock down transmission a lot, 01:52:17.860 |
I mean, I know people who can't take any vaccines. 01:52:30.940 |
"See, vaccines will fucking kill you," right? 01:53:00.140 |
- Yes, and so again, all of this has begun to shift. 01:53:12.420 |
around the mRNA vaccines, especially for young men, right? 01:53:23.420 |
But also, there's now a lot of natural immunity out there. 01:53:29.940 |
Basically, everyone who was gonna get vaccinated 01:53:34.420 |
The virus has evolved to the point in this context 01:53:47.100 |
than on research that I've done at this point, 01:53:50.300 |
but I'm certainly less worried about getting COVID. 01:53:56.740 |
how do I feel about getting the next booster? 01:53:59.160 |
I don't know that I'm going to get the next booster, right? 01:54:12.860 |
And that was, at that point, given what we knew, 01:54:20.620 |
and based on anecdotes that were too vivid to ignore, 01:54:28.580 |
it was totally rational for me to want to get that vaccine 01:54:37.580 |
for me to do a different kind of cost-benefit analysis 01:54:41.700 |
and wonder, listen, do I really need to get a booster? 01:54:51.000 |
And how safe is the mRNA vaccine for a man of my age? 01:54:56.000 |
And do I need to be worried about myocarditis? 01:54:58.580 |
All of that is completely rational to talk about now. 01:55:02.680 |
My concern is that at every point along the way, 01:55:13.360 |
and there's many other people I could add to this list, 01:55:15.620 |
to have strong opinions about any of this stuff. 01:55:24.460 |
but I feel like experts failed at communicating. 01:55:29.600 |
- And I just feel like you and Bret Weinstein 01:55:40.680 |
deeply about the problems that face our world, 01:55:58.200 |
- But I knew we were gonna disagree about that. 01:56:00.640 |
I saw his podcast where he brought on these experts 01:56:03.640 |
who, many of them, had the right credentials, 01:56:13.200 |
One larger problem, and this goes back to the problem 01:56:26.240 |
I mean, it is amazing, but you could find PhDs and MDs 01:56:32.400 |
and say that they thought smoking was not addictive, 01:56:37.520 |
there was no direct link between smoking and lung cancer. 01:56:46.840 |
were people who had obvious tells, to my point of view, 01:56:52.520 |
some of the same people were on Rogan's podcast, right? 01:57:03.160 |
and they're not saying something floridly mistaken, 01:57:15.920 |
It's, at that point, not a whole hell of a lot. 01:57:18.760 |
I mean, we have no long-term data on mRNA vaccines, 01:57:22.640 |
but to confidently say that millions of people 01:57:27.760 |
and to confidently say that ivermectin is a panacea, 01:57:31.280 |
right, ivermectin is the thing that prevents COVID, right? 01:57:42.600 |
I felt like there was just no, there was nothing to debate. 01:57:48.960 |
We're both gonna defer to our chosen experts. 01:57:55.800 |
and, or at least the ones who are most vociferous 01:58:02.640 |
- And your experts seem like, what is the term, 01:58:06.960 |
- Well, no, but it's like with climate science. 01:58:18.880 |
agree that human-caused climate change is a thing, right? 01:58:29.140 |
It's obvious you go with the 97% most of the time 01:58:33.760 |
It's not to say that the 3% are always wrong. 01:58:41.320 |
and I've spent much more time worrying about this 01:58:43.520 |
on my podcast than I've spent worrying about COVID, 01:58:46.120 |
our institutions have lost trust for good reason, right? 01:59:00.560 |
with this level of transparency and pseudo-transparency, 01:59:10.780 |
that we may have to fight, like the next Nazis? 01:59:13.360 |
Can we fight that war when everyone with an iPhone 01:59:19.080 |
that little girls get blown up when we drop our bombs, right? 01:59:22.680 |
Like, could we as a society do what we might have to do 01:59:34.200 |
of just, you know, everyone's a journalist, right? 01:59:36.920 |
Everyone's a scientist, everyone's an expert, 01:59:39.000 |
everyone's got direct contact with the facts, 01:59:52.560 |
in your ability to steelman the other side, 02:00:01.060 |
by which you resist the dogmatism of this binary thinking. 02:00:14.040 |
then people will listen to you as the aggregator, 02:00:29.840 |
about the safety and efficacy of the vaccines today? 02:00:33.960 |
As it stands today, what are we supposed to think? 02:00:42.400 |
Where's the great communicators on this topic 02:00:45.600 |
that consider all the other conspiracy theories, 02:00:58.400 |
And also some of that has to do with humility, 02:01:05.260 |
Just like with depression, you can't really know for sure. 02:01:08.720 |
Where's the, I'm not seeing those communications 02:01:21.800 |
Some of it's a self-fulfilling dynamic where, 02:01:30.320 |
a lockdown would work if we could only do it. 02:01:56.160 |
in the face of any conceivable illness, right, 02:02:02.760 |
coming to fuck you over and take your guns, right? 02:02:05.920 |
Okay, you have a society that is now immune to reason, right? 02:02:09.740 |
'Cause there are absolutely certain pathogens 02:02:12.880 |
that we should lock down for next time, right? 02:02:15.680 |
And it was completely rational in the beginning 02:02:23.540 |
to attempt to lock down, we never really locked down, 02:02:33.240 |
given what we were seeing happening in Italy, right? 02:02:41.800 |
In retrospect, my views on that haven't changed, 02:02:55.840 |
We live in a society that's just not gonna lock down. 02:03:02.600 |
which was maliciously clipped out from some other podcast 02:03:09.440 |
Look, it's a pity more children didn't die from COVID, right? 02:03:16.400 |
and that's the other thing that got so poisoned here. 02:03:23.840 |
who's creating these clips of me on podcasts, 02:03:32.240 |
but it was so clear in context what I was saying 02:03:41.920 |
I don't know whether he actually is a psychopath, 02:03:49.640 |
as a very reliable source of information, right? 02:03:54.640 |
He kept retweeting this guy at me, against me, right? 02:04:14.320 |
is how to navigate the professional and personal pressure 02:04:29.660 |
or someone you know who's behaving badly in public 02:04:36.360 |
behaving in a way that you think is bad in public, 02:04:45.680 |
where you're constantly getting asked to comment 02:04:49.040 |
on what this friend or acquaintance or colleague is doing. 02:04:54.560 |
- I haven't known what I think is ethically right 02:05:06.900 |
Now, in that case, I reached out to you in private first 02:05:12.900 |
and then when I was gonna get asked in public 02:05:15.220 |
or when I was touching that topic on my podcast, 02:05:44.200 |
there's a deep core of good vibes that persists through time 02:05:47.360 |
between you and Elon, and I would argue probably 02:05:49.720 |
between some of the other folks you mentioned. 02:05:51.880 |
- I think with Bret, I failed to reach out in private 02:06:00.240 |
we had tried to set up a conversation in private 02:06:04.180 |
that never happened, but there was some communication, 02:06:11.740 |
to have made more of an effort in private than I did 02:06:16.880 |
and I would say that's true with other people as well. 02:06:23.040 |
Because my case would be beforehand, and now still. 02:06:26.680 |
The case I would like, and this part of the criticism 02:06:31.120 |
you sent my way, maybe it's useful to go to that direction. 02:06:37.240 |
because I think I disagree with your criticism 02:06:41.880 |
- You're talking about your interview with Kanye? 02:06:45.600 |
is actually the right thing to do with Brett. 02:06:47.360 |
Okay, you said, "Lex could have spoken with Kanye 02:06:50.540 |
"in such a way as to have produced a useful document. 02:06:54.180 |
"He didn't do that because he has a fairly naive philosophy 02:06:59.180 |
- Let's see if you can maintain that philosophy 02:07:07.000 |
He seemed to think that if he just got through 02:07:09.760 |
the minefield to the end of the conversation, 02:07:12.660 |
where the two of them still were feeling good 02:07:20.320 |
So let me make the case for this power of love philosophy. 02:07:27.280 |
You're still an inspiration and somebody I deeply admire. 02:07:35.680 |
it's not only that you get through the conversation 02:07:52.360 |
the act of turning the other cheek itself communicates 02:07:56.040 |
both to Kanye later and to the rest of the world 02:07:59.840 |
that we should have empathy and compassion towards each other. 02:08:05.520 |
Maybe that is naive, but I believe in the power of that. 02:08:09.840 |
So it's not that I'm trying to convince Kanye 02:08:14.040 |
but I'm trying to illustrate that just the act of listening 02:08:17.840 |
and truly trying to understand the human being, 02:08:20.240 |
that opens people's minds to actually questioning 02:08:29.440 |
deescalates the kind of dogmatism that I've been seeing. 02:08:33.640 |
So in that sense, I would say the power of love 02:08:40.940 |
because the right conversation you have in private 02:08:43.740 |
is not about, hey, listen, the experts you're talking to, 02:08:48.740 |
they seem credentialed, but they're not actually 02:08:55.960 |
in actual meta-analyses and papers and so on. 02:08:58.400 |
Like making a strong case, like what are you doing? 02:09:00.880 |
This is gonna get a lot of people in trouble. 02:09:05.320 |
in the dumbest of ways, being like respectful, 02:09:10.640 |
sending love their way, and just having a conversation 02:09:19.280 |
removing the emotional attachment to this debate, 02:09:25.640 |
even though you are very emotionally attached 02:09:29.740 |
there is a very large number of lives at stake. 02:09:45.940 |
from my point of view for a very different reason. 02:09:51.660 |
I mean, so Kanye, I don't know, I've never met Kanye, 02:10:08.700 |
Or both, I mean, those aren't mutually exclusive. 02:10:32.780 |
so I think you should have had the conversation 02:10:43.340 |
In public, I just thought you're not doing him a favor. 02:10:53.660 |
or I'm not a clinician, but I've heard it said of him 02:10:57.040 |
that he is bipolar, you're not doing him a favor 02:11:03.180 |
and letting him go off on the Jews or anything else. 02:11:21.260 |
you're not doing him a favor making that even more public. 02:11:24.140 |
If he's just an asshole and he's just an antisemite, 02:11:44.620 |
I don't agree that compassion and love always have 02:11:48.640 |
this patient, embracing, acquiescent face, right? 02:11:53.640 |
They don't always feel good to the recipient, right? 02:12:05.100 |
where someone's full of shit and you just make it 02:12:07.700 |
absolutely clear to them and to your audience 02:12:14.900 |
listen, I'm gonna do everyone a favor right now 02:12:34.500 |
Like, I get that many people in your audience thought, 02:12:37.820 |
You're talking to Kanye and you're doing it in Lex style, 02:12:40.660 |
where it's just love and you're not treating him 02:12:46.820 |
he's this creative genius who does work we love, 02:12:49.300 |
and yet he's having this moment that's so painful. 02:12:53.680 |
And I get that maybe 90% of your audience saw it that way. 02:13:03.660 |
- You don't think it opens up the mind and heart 02:13:08.300 |
- If it does, it's opening up in the wrong direction 02:13:12.700 |
where just gale force nonsense is coming in, right? 02:13:15.980 |
I think we should have an open mind and an open heart, 02:13:24.200 |
One is the mental illness component is its own thing. 02:13:27.300 |
I don't pretend to understand what's going on with him, 02:13:29.200 |
but insofar as that's the reason he's saying what he's saying 02:13:33.780 |
do not put this guy on camera and let no one see it. 02:13:38.580 |
I had a bunch of conversations with him offline 02:13:54.180 |
either that's more genius performance in his world 02:14:03.100 |
I think there's another conversation to be had 02:14:15.520 |
taking words from Kanye as if he's like Christopher Hitchens 02:14:36.820 |
And so there's a different style of conversation 02:14:51.780 |
well, then that falls into the asshole bucket for me. 02:14:55.820 |
And honestly, the most offensive thing about him 02:15:00.100 |
is not the antisemitism, which we can talk about, 02:15:03.460 |
'cause I think there are problems just letting him 02:15:15.800 |
or was coming off in that interview and in others. 02:15:25.180 |
not only to rival Shakespeare, to exceed Shakespeare. 02:15:32.140 |
And he's more or less explicit on that point. 02:15:36.380 |
without saying anything actually interesting or insightful 02:15:41.620 |
So it's complete delusion of a very Trumpian sort. 02:15:52.020 |
and one wonders whether Trump takes himself seriously, 02:15:54.420 |
Kanye seems to believe, he seems to believe his own press. 02:16:12.340 |
But one thing that's patently obvious from your conversation 02:16:17.340 |
is he's not who he thinks he is intellectually or ethically 02:16:29.980 |
which was genuinely noxious and ill-considered 02:16:33.580 |
and has potential knock-on effects in the black community. 02:16:38.580 |
I mean, there's an ambient level of antisemitism 02:16:41.860 |
in the black community that is worth worrying about 02:16:45.980 |
There's a bunch of guys playing the knockout game 02:16:48.220 |
in Brooklyn, just punching Orthodox Jews in the face. 02:16:51.460 |
And I think letting Kanye air his antisemitism 02:16:55.020 |
that publicly only raises the likelihood of that 02:17:02.460 |
So one, my belief at the time was it doesn't, 02:17:17.300 |
or the hatred in general is brought to the surface. 02:17:22.260 |
But I should also say that you're one of the only people 02:17:27.340 |
And like out of the people I really respect and admire, 02:17:35.220 |
'cause I had to think through it for a while. 02:17:37.460 |
And it still haunts me because the other kind of criticism 02:17:46.100 |
things towards me based on who I am that they hate me. 02:18:08.100 |
I've been a student of the Holocaust, obviously. 02:18:12.540 |
And there's reason to be a student of the Holocaust. 02:18:20.700 |
I have never taken antisemitism very seriously. 02:18:30.460 |
I had Bari Weiss on my podcast when her book came out. 02:18:39.980 |
And it's something we have to keep an eye on societally 02:18:53.820 |
it's knit together with, it's not just ordinary racism. 02:18:56.460 |
It's knit together with lots of conspiracy theories 02:19:07.020 |
I mean, what's so perverse about antisemitism, 02:19:17.820 |
in which they hate black people or brown people 02:19:22.100 |
But on the left, Jews are considered extra white. 02:19:29.500 |
And in the black community, that is often the case, right? 02:19:38.700 |
to all of the problems that other minorities suffer, 02:19:42.780 |
in particular, African-Americans in the American context. 02:19:45.880 |
And yeah, Asians are now getting a little bit of this, 02:19:53.580 |
But Jews have had this going on for centuries and millennia, 02:20:00.060 |
And again, this is something that I've never focused on, 02:20:10.180 |
And there's no guarantee it can't suddenly become 02:20:24.220 |
where you have an immensely influential person 02:20:27.640 |
in a community who already has a checkered history 02:20:31.680 |
with respect to their own beliefs about the Jews 02:20:37.420 |
And he's just messaging, not especially forcefully opposed 02:20:42.420 |
by you and anyone else who's giving him the microphone 02:20:56.360 |
And as somebody who's been, obviously, family 02:21:05.900 |
I believe in the power, especially given who I am, 02:21:09.780 |
not always, but sometimes, often turning the other cheek. 02:21:36.020 |
versus the kinds of things I will say at dinner 02:21:45.580 |
I do not feel, when I hear someone say something stupid, 02:21:51.780 |
to turn around in the confines of that space with them 02:22:05.060 |
on some public dais where I'm actually talking about ideas, 02:22:08.500 |
that's when there's a different responsibility 02:22:12.420 |
- The question is how you say it, how you say it. 02:22:18.620 |
there are definitely moments to privilege civility 02:22:24.740 |
to get into it with somebody out in real life. 02:22:31.180 |
both in the elevator and when a bunch of people 02:22:40.060 |
willing to consider another human being's perspective, 02:22:44.360 |
it just gives more power to your words after. 02:22:52.460 |
'Cause you can extend charity too far, right? 02:23:10.300 |
you're just being duped and you're not actually 02:23:13.260 |
doing the work of putting pressure on a bad actor. 02:23:22.580 |
what you should or shouldn't have said to Kanye. 02:23:24.860 |
- So I think the topic of platforming is pretty interesting. 02:23:28.460 |
What's your view on platforming controversial people? 02:23:32.460 |
Let's start with the old, would you interview Hitler 02:23:36.980 |
on your podcast, and how would you talk to him? 02:23:42.380 |
Would you interview him in 1935, '41, and then '45? 02:23:47.380 |
- Well, I think we have an uncanny valley problem 02:24:12.660 |
signaling to your audience that you don't agree with them. 02:24:16.100 |
you don't have to say, "Listen, I just gotta say, 02:24:17.700 |
"before we start, I don't agree with the whole 02:24:20.140 |
"genocide thing, and I just think you're killing 02:24:23.740 |
"mental patients in vans, and that was all bad, 02:24:35.620 |
and you're not platforming them to signal boost 02:24:41.900 |
if they're sufficiently evil, you can go into it 02:24:47.440 |
You just wanna understand the nature of evil, right? 02:24:56.780 |
And that strikes me as an intellectually interesting 02:25:09.420 |
- Wait, wait, wait, wait, wait, wait, wait, wait, wait. 02:25:22.500 |
where it's just he's someone who's gaining power 02:25:27.740 |
then there's a question of whether you can undermine him 02:25:32.620 |
while pushing back against him in that interview, right? 02:25:39.260 |
and I don't think that in the context of my interviewing them 02:25:44.060 |
I'm gonna be able to take the wind out of their sails 02:25:51.980 |
because I just know that they can do something 02:25:54.940 |
within the span of an hour that I can't correct for. 02:26:03.860 |
and it just takes too much time to put them out. 02:26:05.140 |
- That's more like on the topic of vaccines, for example, 02:26:11.380 |
is usually the best disinfectant, I think it is. 02:26:21.020 |
can always make a mess faster than you can clean it up, 02:26:23.940 |
right, but still there are debates worth having 02:26:27.740 |
And they're the right people to have those specific debates. 02:26:37.620 |
and it doesn't matter how messy they're gonna be. 02:26:40.900 |
It's just worth it because I can make my points land 02:26:47.380 |
- So some of it is just your own skill and competence 02:26:52.620 |
- Well, yeah, yeah, and the nature of the subject matter. 02:26:55.420 |
But there are other people who just by default, 02:27:02.100 |
And there are also people who are so confabulatory 02:27:04.780 |
that they're making such a mess with every sentence 02:27:09.780 |
that insofar as you're even trying to interact 02:27:26.220 |
where it's gonna be a net negative for the cause of truth 02:27:30.740 |
So like, for instance, I think talking to Alex Jones 02:27:35.740 |
on any topic for any reason is probably a bad idea 02:27:39.380 |
because I just think he's just neurologically wired 02:27:49.020 |
each of which contains more lies than the last. 02:28:09.580 |
Or make shit up and then weave it in with half-truths 02:28:15.540 |
and micro-truths that give some semblance of credibility 02:28:22.860 |
I mean, apparently millions of people out there. 02:28:30.180 |
- I have noticed that you have an allergic reaction 02:28:40.340 |
That if somebody says something that's a little micro-untruth, 02:28:49.580 |
I'm just talking about making up things out of whole cloth. 02:28:54.660 |
well, what about, and then the thing they put 02:28:57.940 |
at the end of that sentence is just a set of pseudofacts, 02:29:03.940 |
right, that you can't possibly authenticate or not 02:29:13.580 |
they will seem to make you look like an ignoramus 02:29:17.140 |
when in fact everything they're saying is specious, 02:29:22.940 |
I mean, there's some people who are just crazy, 02:29:24.780 |
there's some people who are just bullshitting 02:29:27.860 |
and they're not even tracking whether it's true, 02:29:32.980 |
- But don't you think there's just a kind of jazz 02:29:36.180 |
masterpiece of untruth that you should be able 02:29:42.660 |
well, none of that is backed up by any evidence 02:29:44.780 |
and just almost like take it to the humor place? 02:29:48.340 |
I mean, the place I'm familiar with doing this 02:29:51.500 |
and not doing this is on specific conspiracies 02:30:00.940 |
because of what 9/11 did to my intellectual life, 02:30:07.220 |
it sent me down a path for the better part of a decade. 02:30:12.720 |
I don't know if I was ever gonna be a critic of religion, 02:30:16.140 |
right, but it happened to be in my wheelhouse 02:30:19.060 |
'cause I had spent so much time studying religion on my own 02:30:26.820 |
in the underlying spiritual concerns of every religion 02:30:53.900 |
are these just myths or is there really a continuum 02:30:58.900 |
of insight to be had here that is interesting? 02:31:08.220 |
- And that was launched in part by the 9/11 Truth movement? 02:31:13.240 |
I had spent all this time reading religious books, 02:31:21.160 |
knowing just how fully certain people believe 02:31:25.640 |
So I took religious convictions very seriously 02:31:28.360 |
and then people started flying planes into our buildings 02:31:31.280 |
and so I knew that there was something to be said about-- 02:31:36.180 |
- The core doctrines of Islam, yeah, exactly. 02:31:38.900 |
So I went down, so that became my wheelhouse for a time, 02:31:42.740 |
you know, terrorism and jihadism and related topics 02:31:55.580 |
and the question was, well, do I wanna debate these people? 02:32:02.380 |
- Yeah, so Alex Jones, I think, was an early purveyor 02:32:04.980 |
of it, although I don't think I knew who he was 02:32:07.200 |
And so, and privately, I had some very long debates 02:32:13.120 |
with people who, you know, one person in my family 02:32:15.360 |
went way down that rabbit hole and I just, you know, 02:32:20.040 |
the two-hour email, you know, that would try to, 02:32:23.040 |
try to deprogram 'em, you know, however ineffectually 02:32:26.240 |
and so I went back and forth for years on that topic 02:32:30.760 |
with, in private, with people, but I could see 02:32:33.880 |
the structure of the conspiracy, I could see the nature 02:32:36.140 |
of how impossible it was to play whack-a-mole 02:32:41.140 |
sufficiently well so as to convince anyone of anything 02:32:57.540 |
it's a proliferation of anomalies that don't, 02:33:03.920 |
that are being pointed to, they don't connect 02:33:05.840 |
in a coherent way, they're incompatible theses 02:33:11.940 |
But they're running this algorithm of things are, 02:33:17.520 |
things are never what they seem, there's always 02:33:19.560 |
malicious conspirators doing things perfectly. 02:33:22.920 |
We see all, we see evidence of human incompetence 02:33:29.000 |
you know, expertly anywhere else, but over here, 02:33:40.460 |
you know, inexplicably, I mean, incentivized by what, 02:33:46.580 |
thousands of their neighbors and no one is breathing 02:33:49.020 |
a peep about it, no one's getting caught on camera, 02:33:51.700 |
no one's breathed a word of it to a journalist, 02:33:57.460 |
and so I've dealt with that style of thinking 02:34:07.080 |
of a conversation like that and the person will say, 02:34:13.360 |
that all those F-16s were flown 800 miles out to sea 02:34:21.500 |
that hadn't even been scheduled for that day, 02:34:28.440 |
but I'm just making these things up now, right, 02:34:30.460 |
so like that detail, hadn't even been scheduled 02:34:33.040 |
for that day, it was inexplicably run that day, 02:34:34.840 |
so how long would it take to track that down, right, 02:34:46.360 |
and it wasn't even supposed to be run that day, right? 02:34:50.880 |
Someone like Alex Jones, their speech pattern 02:34:54.640 |
is to pack as much of that stuff in as possible 02:34:58.640 |
at the highest velocity that the person can speak 02:35:02.000 |
and unless you're knocking down each one of those things 02:35:05.800 |
to that audience, you appear to just be uninformed, 02:35:11.000 |
wait a minute, he didn't know about the F-16s? 02:35:18.080 |
I just made up Project Mockingbird, I don't know what it is, 02:35:19.860 |
but that's the kind of thing that comes tumbling out 02:35:27.120 |
I was worried about in the COVID conversation 02:35:30.280 |
because not that someone like Bret would do it consciously, 02:35:42.160 |
sending the blog post and the study from the Philippines 02:35:47.160 |
that showed that in this cohort, ivermectin did X, right, 02:35:52.480 |
and to actually run anything to ground, right, 02:36:07.920 |
you actually have to be a statistician to say, 02:36:11.080 |
okay, they used the wrong statistics in this experiment. 02:36:16.080 |
Now, yes, we could take all the time to do that 02:36:23.320 |
in a context where we have experts we can trust 02:36:28.080 |
go with what 97% of the experts are saying about X, 02:36:36.160 |
about whether to wear masks or not wear masks. 02:36:51.680 |
social media and blogs and the efforts of podcasters 02:36:57.120 |
and Substack writers were not just a response to that. 02:37:08.160 |
And I think we're living in an environment where 02:37:11.120 |
people, we've basically, we have trained ourselves 02:37:17.920 |
not to be able to agree about facts on any topic, 02:37:33.600 |
I mean, like, there are people who we respect 02:37:37.520 |
who are spending time down that particular rabbit hole. 02:37:41.720 |
Like this is, maybe there are a lot of Nazis in Ukraine 02:37:54.760 |
empathizing with Putin to the point of thinking, 02:37:58.240 |
well, maybe Putin's got a point and it's like, 02:38:04.520 |
and the killing of journalists, and Navalny? 02:38:09.480 |
Well, no, listen, I'm not paying so much attention to that 02:38:11.880 |
because I'm following all these interesting people 02:38:18.440 |
And there is a, there are some Nazis in Ukraine. 02:38:25.680 |
I think people are being driven crazy by Twitter. 02:38:29.840 |
But you're kind of speaking to conspiracy theories 02:38:35.880 |
is kind of a bad faith style of conversation. 02:38:40.080 |
- But it's not necessarily knowingly bad faith by, 02:38:42.640 |
I mean, the people who are worried about Ukrainian Nazis, 02:38:47.240 |
to my, I mean, they're some of the same people. 02:38:49.960 |
They're the same people who are worried about 02:39:06.840 |
in many cases, the same people and the same efforts 02:39:13.880 |
to have conversations with those kinds of people. 02:39:15.720 |
What about a conversation with Trump himself? 02:39:24.600 |
I don't think I'd be learning anything about him. 02:39:43.600 |
I think Trump is among the most superficial people 02:39:55.960 |
between who he is in private and who he is in public, 02:40:08.000 |
for instance, I think Joe Rogan was very wise 02:40:26.860 |
a potentially new cohort of his massive audience, right? 02:40:37.920 |
The entertainment value of things is so influential. 02:40:42.920 |
I mean, there was that one debate where Trump 02:40:54.680 |
The truth is we're living in a political system 02:40:57.240 |
where if you can get a big laugh during a political debate, 02:41:08.560 |
It doesn't matter that half the debate was about 02:41:28.080 |
because I think who he is privately as a human being, 02:41:32.440 |
also he's kind of the voice of curiosity to me. 02:41:46.400 |
Joe had recently had a podcast with Jordan Peterson, 02:41:48.440 |
and he brought you up saying they still have hope for you. 02:42:06.440 |
- Joe knows I love him and consider him a friend, right? 02:42:10.220 |
He also knows I'll be happy to do his podcast 02:42:26.600 |
along these same lines where we've talked about 02:42:35.600 |
because Joe's in a very different lane, right? 02:42:42.240 |
I mean, Joe is a standup comic who interviews, 02:42:50.040 |
interviews the widest conceivable variety of people 02:42:53.920 |
and just lets his interests collide with their expertise 02:42:59.000 |
I mean, he's, again, it's a super wide variety of people. 02:43:15.400 |
It's all in, to my eye, it's all in good faith. 02:43:18.520 |
I think Joe is an extraordinarily ethical, good person. 02:43:25.240 |
The crucial difference though is that because he 02:43:38.520 |
is he's very smart and he's also very informed. 02:43:47.920 |
his full-time job is talking to lots of very smart people 02:43:58.160 |
and he's gotten a lot of information crammed into his head. 02:44:04.440 |
but he can always, when he feels that he's uninformed 02:44:08.360 |
or when it turns out he was wrong about something, 02:44:12.920 |
I'm just a comic, we were stoned, it was fun. 02:44:18.960 |
I don't play a doctor on the internet, right? 02:44:28.120 |
I'm not saying you and I are in exactly the same lane, 02:44:37.320 |
It's not true, but that would be the perception. 02:44:41.920 |
And as a counterpoint to a lot of what's being said 02:44:51.760 |
But in reality, if you listen to me long enough, 02:45:09.240 |
So there's nothing that Candace Owens has said 02:45:11.480 |
about wokeness that I haven't said about wokeness 02:45:13.400 |
insofar as she's speaking rationally about wokeness. 02:45:16.560 |
But we have to be able to keep multiple things in view, 02:45:24.320 |
If you could only look at the problem of wokeness 02:45:26.480 |
and you couldn't acknowledge the problem of Trump 02:45:37.320 |
you were just disregarding half of the landscape. 02:45:42.880 |
And many people took half of the problem in recent years. 02:45:47.880 |
The last five years is a story of many people 02:46:02.440 |
And this is the larger issue of audience capture, 02:46:07.440 |
which is very, I'm sure it's an ancient problem, 02:46:22.400 |
And I believe I've witnessed many casualties of it. 02:46:26.100 |
And if there's anything I've been on guard against 02:46:31.960 |
And when I noticed that I had a lot of people in my audience 02:46:41.600 |
And when I noticed that a lot of the other cohort 02:46:43.760 |
in my audience didn't like me criticizing the far left 02:46:47.660 |
and wokeness, they thought I was exaggerating that problem. 02:46:51.340 |
I leaned into it because I thought those parts 02:46:56.600 |
And I didn't care about whether I was gonna lose 02:47:01.160 |
There are people who have created, knowingly or not, 02:47:09.520 |
because of how they've monetized their podcast 02:47:24.240 |
but I really do worry that that's happened to Bret. 02:47:30.880 |
with all the things in the universe to be interested in, 02:47:38.940 |
I don't know how you do 100 podcasts in a row on COVID. 02:47:44.620 |
- Do you think in part audience capture can explain that? 02:48:08.920 |
so you don't feel as much pressure from the audience, 02:48:13.820 |
- I mean, again, the people who think I'm wrong 02:48:18.940 |
okay, you're just not admitting that you're wrong, 02:48:23.940 |
but now we're having a dispute about specific facts. 02:48:35.140 |
or worried might be true about COVID two years ago 02:48:39.300 |
that I no longer believe or I'm not so worried about now, 02:48:54.400 |
but this is something I said probably 18 months ago, 02:48:58.560 |
You know, when I saw what Bret was doing on COVID, 02:49:09.360 |
even if it turns out that ivermectin is a panacea 02:49:14.000 |
and the mRNA vaccines kill millions of people, 02:49:43.360 |
Again, I will totally concede that if I had teenage boys 02:49:48.360 |
and their schools were demanding that they be vaccinated 02:50:00.400 |
and I would be doing more research about myocarditis, 02:50:07.920 |
and I would be worried that we have a medical system 02:50:11.580 |
and a pharmaceutical system and a healthcare system 02:50:14.600 |
and a public health system that's not incentivized 02:50:17.640 |
to look at any of this in a fine-grained way, 02:50:28.000 |
I view that largely as a result, a panicked response 02:50:34.200 |
to the misinformation explosion that happened 02:50:36.640 |
and the populist resistance animated by misinformation 02:50:40.660 |
that just made it impossible to get anyone to cooperate. 02:50:48.160 |
somewhat analogous to the woke response to Trump 02:50:53.120 |
So a lot of people have just gotten pushed around 02:51:07.920 |
but the question is, if you roll back the clock 18 months, 02:51:16.580 |
a very well-respected cardiologist on this topic, 02:51:24.100 |
or Nicholas Christakis to talk about the network effects 02:51:44.500 |
about the efficacy of closing schools during pandemics, 02:51:48.160 |
right, during the Spanish flu pandemic and others, right? 02:51:53.160 |
But there's a lot we didn't know about COVID. 02:51:56.620 |
We didn't know how negligible the effects would be on kids 02:52:06.380 |
- My problem, I really enjoyed your conversation 02:52:10.500 |
So he's one of the great communicators in many ways 02:52:14.380 |
on Twitter, like distillation of the current data, 02:52:26.620 |
that I think could be explained by him being exhausted 02:52:30.900 |
by being constantly attacked by conspiracy theorists, 02:52:55.660 |
that have skepticism, but also did not sufficiently 02:53:06.460 |
were kind of saying there's a lot of uncertainty, 02:53:11.260 |
You're gonna get COVID, do you wanna be vaccinated 02:53:20.020 |
And it's up until you start breaking apart the cohorts 02:53:25.520 |
there is this myocarditis issue in young men. 02:53:33.100 |
Before that story emerged, it was just clear that 02:53:44.540 |
it is still mitigating severe illness and death. 02:53:47.460 |
And I still believe that it is the current view 02:53:54.100 |
of most people competent to analyze the data, 02:54:02.760 |
unnecessarily in the US because of vaccine hesitancy. 02:54:07.480 |
- But I think there's a way to communicate with humility 02:54:14.060 |
- I do believe that it is rational and sometimes effective 02:54:19.060 |
to signal impatience with certain bad ideas, right? 02:54:31.820 |
I just think it makes you look like a douchebag most times. 02:54:34.520 |
- Well, I mean, certain people are persuadable, 02:54:43.680 |
Not everything can be given a patient hearing. 02:54:49.980 |
and then let people in to just trumpet their pet theories 02:55:02.640 |
the people, like, you begin to get a sense for this 02:55:13.320 |
their irrelevance to the conversation fairly quickly 02:55:18.320 |
without knowing that they have done it, right? 02:55:22.120 |
And the truth is, I think I'm one of those people 02:55:33.840 |
"Listen, I know exactly what's going on here. 02:55:43.560 |
No, this is a situation where you want the actual pilots 02:56:02.520 |
while he was bloviating about, you know, whatever, 02:56:05.480 |
that it's just gonna go away, there's just 15 people, 02:56:10.560 |
and it's just gonna go away, there's gonna be no problem. 02:56:15.280 |
"Many of these doctors think I understand this 02:56:17.760 |
"They're just amazed at how I understand this." 02:56:25.680 |
just ashen faced while he's talking, you know. 02:56:58.280 |
can be measured in how many podcasts I did on it, right? 02:57:01.400 |
It's like once we had a sense of how to live with COVID, 02:57:13.980 |
but my kids were stuck at home on iPads for too long. 02:57:27.080 |
But the thing I didn't do is make this my life 02:57:30.760 |
and just browbeat people with one message or another. 02:57:49.680 |
is gonna proliferate given the tools we've built. 02:57:52.720 |
it's gonna proliferate for understandable reasons. 02:57:59.440 |
when something orders of magnitude more dangerous hits us. 02:58:08.700 |
I think much more about next time than this time. 02:58:19.300 |
What's your, what do you admire most about Bret 02:58:19.300 |
does not mean that I think he's a bad person. 02:58:45.580 |
I mean, the thing that worried me about what he was doing, 02:58:50.300 |
and this was true of Joe and this was true of Elon, 02:58:53.940 |
is that once you're messaging at scale to a vast audience, 02:59:12.860 |
And that's why I was, I think, fairly circumspect. 02:59:21.980 |
of the fairway expert opinion at this time point 02:59:24.660 |
and at this time point and at this time point, 02:59:32.140 |
I'm not an expert on the safety of mRNA vaccines. 02:59:34.860 |
If something changes so as to become newsworthy, 02:59:41.460 |
So I just did a podcast on the lab leak, right? 02:59:45.740 |
I was never skeptical of the lab leak hypothesis. 02:59:50.100 |
Bret was very early on saying this is a lab leak, right, 02:59:50.100 |
what do we do given the nature of this pandemic? 03:00:04.180 |
- But also we should say that you've actually stated 03:00:12.300 |
- The time to figure that out, now, I've actually, 03:00:18.540 |
change my view of this because one of the guests, 03:00:25.500 |
actually the best time to figure out the origin of this 03:00:35.960 |
there are certain facts you might not be able 03:00:52.540 |
immediately designing vaccines against that genome, 03:00:56.100 |
And then we had to figure out how to vaccinate 03:00:58.540 |
and to mitigate and to develop treatments and all that. 03:01:11.340 |
was politically inflammatory and made the Chinese look bad. 03:01:21.900 |
They're letting, they're stopping their domestic flights, 03:01:46.860 |
We're all running, all these advanced countries 03:01:51.700 |
That's a practice that we should be worried about 03:01:58.700 |
There've been multiple lab leaks of even worse things 03:02:12.140 |
And on some level, it could happen to anyone, right? 03:02:15.680 |
The wet market makes them look like barbarians 03:02:22.260 |
Like what are you doing putting a bat on top of a pangolin, 03:02:28.900 |
So if anything, the wet market makes them look worse, 03:02:42.040 |
- Do you think we'll ever get to the bottom of that? 03:02:47.300 |
I would say failures of Anthony Fauci and so on 03:03:27.820 |
he was taking refuge in very lawyered language, 03:03:27.820 |
So yeah, I think it looked shady, it played shady, 03:03:44.300 |
I mean, I don't know how personally entangled 03:03:51.100 |
is something that I think we're wise to be worried about. 03:04:09.040 |
if you or somebody else doesn't do it in the meantime, 03:04:22.420 |
Forget about just the gain-of-function research. 03:04:25.380 |
I don't even understand virus hunting at this point. 03:04:28.980 |
I don't even know why you need to go into a cave 03:04:30.980 |
to find this next virus that could be circulating 03:05:00.360 |
about Brett's style of engaging in this issue 03:05:03.060 |
is people are using the fact that he was early on the lab leak 03:05:06.320 |
to suggest that he was right about ivermectin 03:05:20.700 |
Like, you shouldn't have been confident about the lab leak. 03:05:20.700 |
No one should have been confident about the lab leak early, 03:05:22.580 |
Both at the time were inflammatory to be banging on about 03:05:46.420 |
when we were trying to secure some kind of cooperation 03:05:51.820 |
And it's possible to be right by accident, right? 03:05:55.900 |
The style of reasoning matters whether you're right or not. 03:06:06.900 |
You know, it's like, because your style of reasoning 03:06:09.120 |
is dictating what you're gonna do on the next topic. 03:06:15.020 |
- Sure, but this multivariate situation here, 03:06:20.020 |
it's really difficult to know what's right on COVID, 03:06:36.860 |
Just knowing if 65% of the population gets vaccinated, 03:06:49.520 |
Given all the other incentives, I mean, Pfizer, 03:06:54.820 |
I don't know what to think. - But you had the CEO 03:07:03.860 |
reaping windfall profits on a dangerous vaccine? 03:07:12.980 |
Vaccine and putting everyone at intolerable risk? 03:07:19.060 |
did you think this person was making a good faith attempt 03:07:25.280 |
no bad, no taint of bad incentives or something? 03:07:40.020 |
but I sensed that I was talking to a politician. 03:07:50.660 |
- He put on a suit and I was talking to a suit, 03:07:55.140 |
Now, he said that his son was a big fan of the podcast, 03:07:59.620 |
So I thought I would be talking to a human being. 03:08:04.620 |
That's what I thought, but the internet thinks otherwise. 03:08:04.620 |
There's a deep distrust of pharmaceutical companies. 03:08:25.940 |
at a time of a public health emergency looks bad. 03:08:45.140 |
that maybe only harms a single digit percentage 03:08:50.780 |
It's like, well, what do we want to encourage? 03:08:59.060 |
We want the person who gave us the iPhone to get rich, 03:09:03.340 |
but we don't want the person who cures cancer to get rich? 03:09:15.080 |
I think we're good now as a population at smelling bullshit. 03:09:15.080 |
And there is something about the Pfizer CEO, for example, 03:09:23.820 |
just CEOs of pharmaceutical companies in general, 03:09:37.140 |
That it just feels like none of it is transparent 03:09:45.980 |
that Pfizer's only interested in helping people 03:09:55.500 |
who seems to be at scale helping a huge amount 03:10:03.380 |
where people are like, this seems suspicious. 03:10:08.700 |
There's certain kinds of communication styles 03:10:12.860 |
serve as better catalysts for conspiracy theories. 03:10:21.540 |
for capitalism in delivering drugs that help people. 03:10:30.860 |
And plus, like regulation that actually makes sense 03:10:33.080 |
versus, it seems like pharmaceutical companies 03:10:48.060 |
and most of the people going into government, 03:10:49.900 |
- They wanna do good. - Are doing it for good. 03:10:51.580 |
They're non-psychopaths trying to get good things done 03:11:06.120 |
again, I've uttered that phrase 30 times on this podcast, 03:11:19.280 |
It's not that there are that many bad people. 03:11:27.040 |
that much more remarkable and worth paying attention to, 03:11:29.960 |
but the bad incentives and the power of bad ideas 03:11:59.900 |
You have a lot of interesting ideas that you both share, 03:12:03.540 |
Well, let me first ask, what do you admire most about Elon? 03:12:15.740 |
I mean, Elon, I knew as a friend, I like a lot. 03:12:24.020 |
I mean, he's done and he's continuing to do amazing things. 03:12:30.900 |
I think many of his aspirations are realized, 03:12:39.420 |
I think it's just, it's amazing to see what he's built 03:12:58.340 |
- There's something very Trumpian about how he's acting 03:13:04.500 |
He bought the place because he thinks it's so great. 03:13:08.860 |
I think he's needlessly complicating his life 03:13:11.500 |
and harming his reputation and creating a lot of noise 03:13:17.460 |
I mean, so like he, the thing that I objected to 03:13:26.660 |
again, I remain agnostic as to whether or not 03:13:30.380 |
It was how he was personally behaving on Twitter, 03:13:46.500 |
but that it's just some gay tryst gone awry, right? 03:13:52.800 |
And you link to a website that previously claimed 03:14:00.240 |
and that a body double was campaigning in her place. 03:14:08.920 |
And it matters that he was signal boosting it 03:14:13.680 |
And so it is with saying that your former employee, 03:14:25.800 |
And now this guy's getting inundated with death threats, 03:14:28.960 |
right, and Elon, all of that's totally predictable, right? 03:14:43.040 |
it's not good for the world, it's not serious. 03:14:57.120 |
is that he's purported to touch real issues by turns. 03:15:01.320 |
Like, okay, do I give the satellites to Ukraine or not? 03:15:08.680 |
Should I publicly worry about World War III or not, right? 03:15:14.640 |
And at the same moment, he's doing these other 03:15:28.220 |
He brings Kanye on, knowing he's an anti-Semite 03:15:36.520 |
which I probably wouldn't have kicked him off 03:15:41.080 |
can you really kick people off for swastikas? 03:15:50.400 |
I'm not even sure that's an enforceable terms of service. 03:16:00.760 |
But so much of what he's doing, given that he's, 03:16:05.680 |
He's doing this in front of 130 million people. 03:16:09.320 |
and that's very different than 100,000 people. 03:16:16.840 |
And again, this was a situation where I tried 03:16:28.960 |
sort of highlighting things he might be wrong on? 03:16:36.040 |
I should write, like, a pamphlet for Sam Harris. 03:16:38.640 |
- Well, no, but it was totally coming from a place 03:16:41.240 |
of love because I was concerned about his reputation. 03:16:46.920 |
I could see what was happening with the tweet. 03:16:49.320 |
I mean, he'd had this original tweet that was, 03:17:05.000 |
but we knew, we saw what was happening in Italy, right? 03:17:18.240 |
I mean, it had more engagement than any other tweet, 03:17:20.540 |
more than any crazy thing Trump was tweeting. 03:17:30.520 |
And I could see that people were responding to it like, 03:17:35.520 |
"Wait a minute, okay, here's this genius technologist 03:17:38.400 |
who must have inside information about everything, right? 03:17:41.800 |
Surely he knows something that is not on the surface 03:17:46.000 |
And they're reading, they were reading into it 03:17:48.440 |
a lot of information that I knew wasn't there, right? 03:17:54.380 |
I didn't think he had any reason to be suggesting that. 03:17:57.080 |
I think he was just firing off a tweet, right? 03:18:02.520 |
and I mean, because it was a private text conversation, 03:18:12.520 |
among the many cases of friends who have public platforms 03:18:15.880 |
and who did something that I thought was dangerous 03:18:18.520 |
and ill-considered, this was a case where I reached out 03:18:21.800 |
in private and tried to help, genuinely help, 03:18:26.800 |
because it was just, I thought it was harmful 03:18:31.840 |
in every sense, because it was being misinterpreted. 03:18:35.440 |
And it was like, okay, you can say that panic 03:18:47.480 |
it's gonna peter out, it's just gonna become a cold. 03:18:49.360 |
I mean, that's how this was getting received. 03:18:51.960 |
Whereas at that moment, it was absolutely obvious 03:18:56.320 |
or that it was, at minimum, going to be a big deal. 03:18:56.320 |
but it was obvious there was a significant probability 03:19:13.720 |
but it's gonna die out in two months or something. 03:19:18.640 |
we weren't going to have tens of thousands of deaths 03:19:30.600 |
And when Nicholas Christakis came on my podcast very early, 03:19:36.400 |
that we would have about a million people dead in the US. 03:19:39.760 |
And that didn't seem, it was, I think, appropriately hedged, 03:19:56.800 |
that order of magnitude and not something much, much less. 03:20:00.880 |
And so anyway, I mean, again, to close the story on Elon, 03:20:18.160 |
And then we had a fairly long and detailed exchange 03:20:24.260 |
on this issue, and that, so that intervention didn't work. 03:20:33.260 |
I was not, I was just concerned for him, for the world, 03:20:38.260 |
and then there are other relationships where I didn't take, 03:20:43.540 |
but again, that's an example where taking the time 03:20:52.020 |
There are other relationships where I thought, 03:20:53.180 |
okay, there's just gonna be more trouble than it's worth, 03:20:54.740 |
and I just ignored it, and there's a lot of that. 03:20:58.940 |
And again, I'm not comfortable with how this is all 03:21:07.580 |
frankly, I'm not comfortable with how much time 03:21:14.000 |
Like, what good is it for me to talk about Elon or Brett 03:21:35.220 |
Because you are exchanging back and forth on Twitter, 03:21:39.740 |
like long-form discussions, like a debate about COVID, 03:21:54.100 |
like the Rogan method, we're just a bunch of idiots. 03:21:58.260 |
Like, one is an engineer, you're a neuroscientist. 03:22:11.900 |
so at the moment I had this collision with Elon, 03:22:18.140 |
It was just, it was absolutely clear where this was going. 03:22:40.700 |
who, there are already cases known to many of us personally 03:22:48.000 |
And he was operating by a very different logic 03:22:54.460 |
- Sure, but that logic represents a part of the population, 03:23:06.100 |
was not at the point of, oh, there's a lot to talk about, 03:23:09.340 |
a lot to debate, this is all very interesting, 03:23:13.060 |
It broke down very early at, this is, you know, 03:23:20.400 |
Like, it's like either there's a water bottle on the table 03:23:25.240 |
- Technically, there's only 1/4 of a water bottle. 03:23:40.600 |
we had an exchange in private, and I want to honor, 03:23:53.380 |
that there was not a follow-up conversation on that topic. 03:24:02.860 |
of helping that happen, that the friendship is rekindled 03:24:05.500 |
because one of the topics I care a lot about, 03:24:07.900 |
artificial intelligence, you've had great public 03:24:15.400 |
was very formative in my taking that issue seriously. 03:24:19.040 |
I mean, he and I went to that initial conference 03:24:22.920 |
in Puerto Rico together, and it was only because 03:24:25.960 |
he was going and I found out about it through him, 03:24:28.000 |
and I just rode his coattails to it, you know, 03:24:28.000 |
and these new large language models that are fine-tuned 03:24:52.140 |
with reinforcement learning and seemingly able 03:24:52.140 |
as a student of the human mind and mind in general? 03:25:13.780 |
So I did a, I've spoken about it a bunch on my podcast, 03:25:18.180 |
but I did a TED Talk in 2016, which was the kind of summary 03:25:23.180 |
of what that conference and various conversations I had 03:25:32.520 |
- Basically, that once superintelligence is achieved, 03:25:37.980 |
there's a takeoff, it becomes exponentially smarter, 03:25:53.940 |
superintelligent, self-improving AI to our value system. 03:26:01.220 |
And I don't believe anyone has figured out how to do that 03:26:04.580 |
or whether that's even possible in principle. 03:26:17.660 |
- 'Cause you haven't done an AI podcast in a while, 03:26:21.220 |
- He's a good person to talk about alignment with. 03:26:23.180 |
- Yeah, so Stuart, I mean, Stuart has been probably 03:26:29.460 |
I mean, like, just reading his book and doing, 03:26:31.540 |
I think I've done two podcasts with him at this point. 03:26:45.420 |
that we can define a value function in advance 03:26:57.700 |
as we continue to discover them, refine them, 03:27:09.920 |
there must be many more ways of designing super intelligence 03:27:15.560 |
and is not ever approximating our values in that way. 03:27:19.360 |
So I mean, Stuart's idea to put it in a very simple way 03:27:28.160 |
You don't want to imagine you could ever write the code 03:27:40.120 |
as to what human values are and perpetually uncertain 03:27:43.680 |
and always trying to ameliorate that uncertainty 03:28:04.120 |
I think there are a lot of problems with that 03:28:07.340 |
at a high level, I'm not a computer scientist, 03:28:09.080 |
so I'm sure there are many problems at a low level 03:28:12.840 |
- Like how to force a human into the loop always, 03:28:16.640 |
- There's that and like what humans get a vote 03:28:22.760 |
and what is the difference between what we say we value 03:28:39.320 |
that what we value is driving ourselves crazy with Twitter 03:28:43.720 |
and living perpetually on the brink of nuclear war 03:29:10.400 |
is it's not that it just caters to our preferences, 03:29:20.240 |
It's like it finds ways to make us a better reporter 03:29:24.040 |
of our preferences and to trim our preferences down 03:29:32.140 |
So the main concern is that most of the people in the field 03:29:37.140 |
seem not to be taking intelligence seriously. 03:29:41.740 |
Like as they design more and more intelligent machines 03:29:46.420 |
and as they profess to want to design true AGI, 03:29:50.720 |
they're not, again, they're not spending the time 03:30:05.260 |
as we make that final stride into the end zone, 03:30:15.140 |
an AI would never form a motive to harm humans, 03:30:15.140 |
instrumental goals that are antithetical to our wellbeing 03:30:55.900 |
I mean, you and I don't consciously form the intention 03:31:03.420 |
but there are many things we could intend to do 03:31:09.140 |
because, you know, you decide to repave your driveway 03:31:13.140 |
like you're just not taking the interest of insects 03:31:17.020 |
into account because they're so far beneath you 03:31:37.020 |
and I think there's every reason to believe that, 03:31:53.540 |
like the top 100 things we care about cognitively, 03:31:58.820 |
that many of those things, most of those things 03:32:03.660 |
where once the machines get better than we are, 03:32:10.340 |
where this actually came out of Stuart's lab. 03:32:13.340 |
- Yeah, one time a human beat a machine in Go. 03:32:18.900 |
But anyway, ultimately, there's gonna be no looking back, 03:32:23.820 |
and then the question is, what do we do in relationship 03:32:43.020 |
you know, without thinking about it in advance, 03:32:54.380 |
of really creating autonomous superintelligence seriously. 03:33:00.980 |
It's every bit as independent and ungovernable, ultimately, 03:33:18.460 |
Like, they begin to talk about things we don't understand. 03:33:21.240 |
They begin to want things we don't understand. 03:33:29.420 |
We become the chickens or the monkeys in their presence. 03:33:33.660 |
And I think that it's, but for some amazing solution 03:33:44.020 |
that we could somehow anchor their reward function 03:33:46.740 |
permanently, no matter how intelligent scales, 03:33:49.980 |
I think it's really worth worrying about this. 03:33:54.340 |
I do buy the sci-fi notion that this is an existential risk 03:34:08.220 |
and I'm worried that it will become superintelligent, 03:34:13.060 |
these language models will become superintelligent 03:34:16.300 |
in the collective intelligence of the human species, 03:34:19.740 |
and then it'll start controlling our behavior 03:34:24.180 |
the recommender systems, and then we just won't notice 03:34:43.540 |
of these algorithms and of what something like, 03:34:51.220 |
I mean, it's just far short of it developing its own goals 03:35:00.980 |
and that is, that are at cross purposes with ours, 03:35:11.500 |
in the ways we're going to be incentivized to use it 03:35:14.100 |
and the money to be made from scaling this thing 03:35:22.140 |
and our sense of just being able to get to ground truth 03:35:35.900 |
in terms of the development towards AGI, ChatGPT, 03:35:38.580 |
or we still, is this just an impressive little toolbox? 03:35:43.580 |
So like, when do you think the singularity's coming? 03:35:48.120 |
Or is it that the timing doesn't matter, it's eventually? 03:35:51.700 |
- I have no intuitions on that front apart from the fact 03:35:54.220 |
that if we continue to make progress, it will come, right? 03:36:06.500 |
So there's no reason why this can't be done in silico. 03:36:09.420 |
It's just, we can build arbitrarily intelligent machines. 03:36:13.860 |
There's nothing magical about having this done 03:36:20.380 |
I think that is true, and I think that's, you know, 03:36:22.940 |
scientifically parsimonious to think that that's true. 03:36:29.380 |
It doesn't have to be any special rate of progress. 03:36:34.620 |
At a certain point, we're going to be in relationship 03:36:49.220 |
And that's its own interesting ethical question. 03:36:56.660 |
they're going to be more competent than we are. 03:37:00.500 |
And then that's like, you know, the aliens have landed. 03:37:04.900 |
You know, that's literally, that's an encounter with, 03:37:18.380 |
But it is hard to picture, if what we mean by intelligence 03:37:23.300 |
is, all things considered, truly general, 03:37:41.420 |
how we could expect slavish devotion until the end of time in those systems. 03:37:48.380 |
- I think my gut says that that tether is not impossible to maintain. 03:37:54.300 |
So it's not this increasingly impossible problem. 03:38:02.460 |
so I have no intuitions about, just algorithmically, 03:38:05.780 |
how you would approach that and what's possible. 03:38:13.740 |
It's worth noting that most of the learning is currently happening on human data. 03:38:18.640 |
So even ChatGPT is just trained on human data. 03:38:29.580 |
The current impressive aspect of ChatGPT 03:38:31.940 |
is that it's using the collective intelligence of all of us. 03:38:38.660 |
I've heard doubts, from people who know much more about this than I do, 03:38:49.380 |
about whether these models are actually going to be sufficient to push us into AGI. 03:38:52.980 |
Right, so it's just, they're not generalizing 03:38:57.340 |
They're certainly not learning like human children, 03:39:04.760 |
It's not to say that the human path is the only path, 03:39:09.980 |
you know, and maybe we might learn better lessons 03:39:23.820 |
And so they have strange holes in their competence. 03:39:28.820 |
- But the size of the holes is shrinking every time. 03:39:31.820 |
And that's, so the intuition starts to slowly fall apart. 03:39:35.480 |
The intuition is like, surely it can't be this simple 03:39:47.700 |
I've been extremely impressed with ChatGPT 03:39:49.980 |
and the new models, and there's a lot of financial incentive 03:39:58.900 |
When I mentioned that I was going to be talking to you, a lot of people brought up UFOs, 03:40:07.020 |
probably because Eric Weinstein talked to Joe Rogan recently 03:40:09.980 |
and said that he and you were contacted by folks 03:40:23.780 |
I think he went down this rabbit hole further than I did, 03:40:35.140 |
But I think we were contacted by the same person. 03:40:53.140 |
about the release of different videos on UFOs. 03:40:58.180 |
So there was like, there was a big New Yorker article 03:41:01.180 |
on UFOs and there were rumors of congressional hearings, 03:41:19.500 |
And I think he might've contacted Rogan or others, 03:41:22.020 |
Eric is just the only person I've spoken to about it, 03:41:28.140 |
And what happened is the person kept writing a check they couldn't cash. 03:41:40.980 |
I'm gonna, you know, I understand this is sounding spooky 03:41:46.100 |
but next week, I'm gonna put you on a Zoom call 03:41:56.660 |
within five seconds of being on the Zoom call, 03:42:15.380 |
And I think Eric spent more time with this person 03:42:24.380 |
So I, you know, it's not that my bullshit detector 03:42:35.520 |
- So you made a comment, which is interesting, 03:42:47.040 |
about the thought experiment that the aliens did visit: 03:42:54.220 |
that it wouldn't matter, it wouldn't affect your life. 03:43:00.000 |
- Well, no, I was, I think many people noticed this. 03:43:10.580 |
and I forget when the UFO thing was really kicking off, 03:43:13.240 |
but it just seemed like no one had the bandwidth 03:43:28.500 |
It's like, and I considered, okay, wait a minute. 03:43:35.860 |
this is the biggest story in anyone's lifetime. 03:43:38.260 |
I mean, contact with alien intelligence is by definition 03:43:42.860 |
the biggest story in anyone's lifetime in human history. 03:44:06.380 |
to some degree, an understandable defense mechanism 03:44:22.800 |
that is purely terrestrial and not surprising. 03:44:34.420 |
I mean, I have since seen some of those videos. 03:44:38.340 |
I mean, now this is going back still at least a year, 03:44:41.020 |
but some of those videos have gotten what seem like fairly credible debunkings. 03:44:47.340 |
And I'm surprised we haven't seen more of that. 03:44:51.620 |
Like there was a fairly credulous 60 Minutes piece 03:44:58.020 |
And it was the very video that he was debunking on YouTube. 03:45:00.900 |
And his video only had like 50,000 views on it or whatever. 03:45:05.360 |
But again, it seemed like a fairly credible debunking. 03:45:09.500 |
I haven't seen debunkings of his debunkings, but-- 03:45:12.740 |
- I think there is, but he's basically saying 03:45:39.100 |
about what they would look like when they show up. 03:45:44.220 |
- The amazing thing about this AI conversation though, 03:46:04.680 |
Does this apply to when superintelligence shows up? 03:46:20.520 |
even though you're not on Twitter, which is great. 03:46:34.600 |
same kind of thing, that we would just look the other way. 03:46:41.560 |
which has been thrown around very casually, 03:46:44.560 |
concerningly so, even that, the news cycle wipes away? 03:46:48.880 |
- Yeah, well, I think we have this general problem 03:47:11.120 |
Like we respond quite readily to certain things. 03:47:15.920 |
we respond to the little girl who fell down a well. 03:47:38.960 |
that's Russian roulette with three chambers loaded. 03:47:50.080 |
And I mean, this was pre-Ukraine, 03:47:55.080 |
I think the people who have made it their business 03:48:03.720 |
you know, people like Graham Allison or William Perry, 03:48:07.040 |
I mean, I think they were putting the ongoing risk 03:48:15.240 |
of a proper nuclear war at some point in the, you know, 03:48:19.040 |
the next generation, people were putting it at, 03:48:25.000 |
We were living with this sort of sword of Damocles over our heads. 03:48:32.720 |
could have reliable intuitions about the probability 03:48:34.760 |
of that kind of thing, but the status quo is truly alarming. 03:48:39.760 |
I mean, we've got, you know, we've got ICBMs on hair-trigger alert. 03:48:43.640 |
I mean, leave aside smaller exchanges and, you know, 03:48:47.000 |
tactical nukes and how we could have a world war, 03:48:50.720 |
you know, based on, you know, incremental changes. 03:48:54.080 |
We've got the biggest bombs aimed at the biggest cities 03:48:59.080 |
in both directions and it's old technology, right? 03:49:05.200 |
And it's, you know, and it's vulnerable to some lunatic 03:49:10.360 |
deciding to launch or misreading, you know, bad data. 03:49:14.840 |
And we know we've been saved from nuclear war, at least once, by a single officer who decided, 03:49:27.200 |
I'm not gonna pass this up the chain of command, right? 03:49:40.680 |
I mean, in that particular case, like he saw, 03:49:42.720 |
I think it was five, what seemed like five missiles, 03:49:52.000 |
and he reasoned that in a first strike, they'd launch more than five missiles, 03:50:06.560 |
or some other species of inadvertence, you know, 03:50:20.560 |
by people who are, you know, driven crazy by some ideology. 03:50:37.680 |
I mean, once you can get a deepfake of, you know, 03:50:44.680 |
a head of state claiming to have launched a first strike, you know, 03:51:02.120 |
- Yeah, and we might have AI and digital watermarks 03:51:05.120 |
that help us, maybe we'll not trust any information 03:51:09.360 |
that hasn't come through specific channels, right? 03:51:18.520 |
I no longer feel the need to respond to anything 03:51:24.480 |
other than what I put out in my channels of information. 03:51:30.640 |
There are people who have clipped stuff of me that shows the opposite, 03:51:36.120 |
I mean, people have, like, re-edited my podcast audio so that 03:51:44.500 |
you can't be sure that I actually said it, you know? 03:51:47.000 |
I mean, it's just, but I don't know what it's like 03:51:52.000 |
to live like that for all forms of information. 03:51:56.080 |
And I mean, strangely, I think it may require getting through 03:52:09.800 |
this sort of Wild West period where everyone's got these tools. 03:52:17.720 |
- There might be a greater value for expertise. 03:52:26.680 |
it's gonna be an arms race to authenticate information. 03:52:31.440 |
So it's like, if you can never trust a photograph 03:52:42.400 |
to authenticate the provenance of that photograph 03:52:46.000 |
and a test that hasn't been meddled with by AI. 03:52:49.420 |
And again, I don't even know if that's technically possible. 03:52:52.840 |
And maybe whatever the tools available for this 03:52:56.680 |
will be commodified and the cost will be driven to zero 03:53:01.680 |
so quickly that everyone will be able to do it. 03:53:05.840 |
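To make the idea concrete, here's a toy sketch of provenance checking (an editorial illustration, not anything described in the conversation; real efforts along these lines, such as C2PA, use asymmetric signatures and signed metadata): a capture device signs a hash of the image bytes, and any later edit breaks verification. The key name and functions below are hypothetical.

```python
# Toy provenance sketch (illustration only): a device signs the hash of
# an image at capture time with a secret key; altering the bytes later
# makes verification fail. A real system would use asymmetric signatures
# so that verifiers don't need the device's secret.
import hashlib
import hmac

DEVICE_KEY = b"hypothetical-camera-secret"  # assumed key, for the sketch only

def sign_at_capture(image_bytes: bytes) -> str:
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, signature: str) -> bool:
    return hmac.compare_digest(sign_at_capture(image_bytes), signature)

photo = b"...raw sensor bytes..."
tag = sign_at_capture(photo)
print(verify(photo, tag))               # True: untouched
print(verify(photo + b" edited", tag))  # False: meddled with
```

Whether anything like this survives the commodification just described is exactly the open question.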
- And it would be proven and tested most effectively first 03:53:12.600 |
where most human innovation in technology happens first. 03:53:21.280 |
Since we're talking about the threat of nuclear war, 03:53:27.480 |
what do you think happens for human society if, or when, we move beyond Earth to Mars? 03:53:54.480 |
'Cause we're still gonna be the apes that we are. 03:54:01.000 |
you have to imagine the first fistfight on Mars. 03:54:10.460 |
Right, so it's gonna get really homely and boring 03:54:16.680 |
It's like only the spacesuits or the other exigencies 03:54:21.240 |
of just living in that atmosphere or lack thereof 03:54:34.520 |
Do you think there'll be, do you think we're, like, living through a transition 03:54:40.200 |
where we're going to be doing more and more interaction digitally? 03:54:44.700 |
Like everything we've been complaining about with Twitter, 03:54:47.320 |
is it possible that Twitter's just the early days 03:54:49.660 |
of a broken system that's actually giving birth 03:54:52.940 |
to a better working system that's ultimately digital? 03:54:55.620 |
- I think we're gonna experience a pendulum swing 03:55:04.820 |
I mean, I think many of us are experiencing that now anyway. 03:55:07.340 |
I mean, just wanting to have face-to-face encounters 03:55:18.380 |
in that direction, but I do notice it myself. 03:55:23.060 |
And I notice, I mean, once I got off Twitter, 03:55:26.260 |
then I noticed the people who were never on Twitter, right? 03:55:28.940 |
And the people who were never, basically, I mean, I know, 03:55:32.140 |
I have a lot of friends who were never on Twitter. 03:55:37.580 |
It's like, they just like, it wasn't that they were seeing it 03:55:44.500 |
it's like being on, it's like I'm not on Reddit either, 03:55:53.080 |
- So you think the pursuit of human happiness 03:55:55.980 |
is better achieved, more effectively achieved 03:56:00.860 |
- Well, I think all we have is our attention in the end. 03:56:06.100 |
And we just have to notice what these various tools are doing to it. 03:56:15.020 |
I noticed, with Twitter, that it was an unrewarding use of my attention. 03:56:19.420 |
Now, it's not to say there isn't some digital platform 03:56:22.540 |
that's conceivable that would be useful and rewarding, 03:56:32.300 |
you know, our life is doled out to us in moments. 03:56:35.700 |
And we have, and we're continually solving this riddle 03:56:39.540 |
of what is gonna suffice to make this moment engaging 03:56:44.540 |
and meaningful and aligned with who I wanna be now 03:56:55.300 |
between being in the present and becoming in the future. 03:57:04.340 |
Again, it's not really a paradox, but it can seem like one. 03:57:07.180 |
I do think the ground truth for personal wellbeing 03:57:12.020 |
is to find a mode of being where you can pay attention 03:57:18.900 |
And this is, you know, meditation by another name. 03:57:29.100 |
that just consciousness itself in the present moment, 03:57:40.940 |
Like you can be happy now before anything happens, 03:57:49.220 |
There's this kind of ground truth that you're free, 03:57:52.340 |
that consciousness is free and open and unencumbered 03:57:56.020 |
by really any problem until you get lost in thought 03:57:59.980 |
about all the problems that may yet be real for you. 03:58:03.740 |
So the ability to catch and observe consciousness, 03:58:15.540 |
who don't meditate because they find something 03:58:26.980 |
Like it gets their attention, right, whatever it is. 03:58:30.300 |
If you like, Sebastian Junger wrote a great book 03:58:37.900 |
It's like strangely it can be the best experience 03:58:53.580 |
to the person you shouldn't have been talking about. 03:59:05.820 |
that word can mean many things to many people, 03:59:07.620 |
but what I mean by meditation is simply the discovery 03:59:10.260 |
that there is a way to engage the present moment directly 03:59:24.060 |
Nothing, it doesn't have to be a peak experience. 03:59:28.620 |
but you can recognize that in some basic sense, 03:59:40.300 |
and thoughts themselves have no substance, right? 03:59:43.700 |
It's fundamentally mysterious that any thought 03:59:47.240 |
ever really commandeers your sense of who you are 03:59:50.700 |
and makes you anxious or afraid or angry or whatever it is. 04:00:00.420 |
that blow all of us around get much, much shorter, right? 04:00:06.160 |
the anger that would have kept you angry for hours or days 04:00:20.460 |
now lasts moments, and you get to decide whether it's useful to stay angry at that moment. 04:00:25.700 |
- And the illusion of free will is one of those thoughts. 04:00:30.500 |
Like even the mindful and meditative response to this 04:00:35.460 |
It's just like even the moments where you recognize 04:00:41.380 |
this does open up a degree of freedom for a person, 04:00:45.780 |
but it's not a freedom that gives any motivation 04:00:53.300 |
- Is there a difference between intellectually knowing 04:00:55.980 |
free will is an illusion and really experiencing it? 04:00:59.820 |
What's the longest you've been able to experience 04:01:06.380 |
- Well, it's always obvious to me when I pay attention. 04:01:10.420 |
Whenever I'm mindful, that's the term of jargon in the Buddhist context, 04:01:17.260 |
and increasingly outside the Buddhist context. 04:01:21.220 |
But there are sort of different levels of mindfulness 04:01:23.940 |
and there's different degrees of insight into this. 04:01:28.260 |
But yes, what I'm calling evidence of lack of free will 04:01:32.460 |
and lack of the self are two sides of the same coin. 04:01:40.000 |
It's the feeling that there's a self in the middle of experience to whom all experience refers, 04:01:51.100 |
And that's almost always the place people live 04:01:58.780 |
I think people are constantly losing the sense of I, 04:02:01.620 |
they're losing the sense of subject, object, distance, 04:02:05.780 |
And meditation is the mode in which you can recognize, 04:02:14.060 |
you can look for the self and fail to find it 04:02:19.500 |
And that's just the flip side of the coin of free will. 04:02:30.220 |
who's thinking his thoughts and doing his actions 04:02:34.600 |
And the man in the middle of the boat who's rowing, 04:02:46.140 |
Or in fact, there's no boat, there's just the river, 04:02:53.420 |
And there's no place from which you would control it. 04:03:10.420 |
Like just the intention to move is just arising, right? 04:03:14.260 |
And I'm in no position to know why it didn't arise 04:03:24.980 |
or so as to be ineffective or to be doubly effective 04:03:40.140 |
to an even more disconcerting picture along the same lines 04:03:45.120 |
which subsumes this conversation about free will. 04:04:04.980 |
So I mean, what if only the actual is possible? 04:04:31.220 |
Or is it, you don't have kids, I don't think? 04:05:03.840 |
And so we have other categories of non-concrete things. 04:05:09.120 |
We have things that don't have spatial temporal dimension, 04:05:12.560 |
but they nonetheless exist. 04:05:19.140 |
There's a reality, there's an abstract reality to numbers. 04:05:25.280 |
And this is, it's philosophically interesting that 04:05:30.040 |
they're real and they're not merely invented by us. 04:05:35.040 |
They're discovered because they have structure 04:05:39.920 |
It's not like, they're not fictional characters like, 04:05:59.280 |
because our fiction, the fictional worlds we've created 04:06:05.960 |
It's all abstractable from any of its concrete 04:06:08.920 |
instantiations, it's not just in the comic books 04:06:12.760 |
It's in our, you know, ongoing ideas about these characters. 04:06:24.640 |
I mean, they're similar, but they also have a structure 04:06:28.600 |
It's not, you can't just make up whether numbers are prime. 04:06:32.700 |
You know, if you give me two integers, you know, 04:06:34.880 |
of a certain size, you mention two enormous integers. 04:06:39.880 |
If I were to say, okay, well, between those two integers there's a certain number of primes, 04:06:49.840 |
then I'm right or wrong about that, whether or not anyone knows I'm right or wrong. 04:06:51.840 |
It's like, that's just, there's a domain of facts there, 04:06:54.320 |
but these are abstract, it's an abstract reality 04:06:57.080 |
that relates in some way that's philosophically interesting to the concrete world. 04:07:04.080 |
You know, the spatial temporal order, the physics of things. 04:07:19.240 |
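The prime example is easy to make concrete. Here's a minimal sketch (an illustration, not anything from the conversation; the function names are mine): the count of primes between two integers is a fixed fact, true or false whether or not anyone ever computes it.

```python
# Minimal sketch: the number of primes between two integers is a
# determinate fact, independent of anyone checking it.

def is_prime(n: int) -> bool:
    """Trial-division primality test; fine for modestly sized n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def count_primes_between(lo: int, hi: int) -> int:
    """Count primes strictly between lo and hi."""
    return sum(1 for n in range(lo + 1, hi) if is_prime(n))

print(count_primes_between(100, 200))  # 21 -- and it was 21 before anyone counted
```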
I think I need to talk to a philosopher of physics 04:07:23.440 |
and/or a physicist about how this may interact 04:07:26.000 |
with things like the many worlds interpretation 04:07:28.120 |
of quantum mechanics. - Yeah, that's an interesting, 04:07:37.320 |
that many worlds interpretation of quantum mechanics 04:07:50.600 |
- That's a case of, and the truth is, that happens even if 04:07:58.040 |
many worlds isn't true, if we just imagine we have a physically infinite universe, 04:08:12.960 |
eventually you're gonna meet two people just like us 04:08:17.560 |
and you're gonna meet them an infinite number of times 04:08:23.520 |
slightly different from this conversation, right? 04:08:28.100 |
that our intuitions of probability completely break down. 04:08:48.840 |
it's just a thought about what could have happened 04:08:54.240 |
Just because you can imagine a thing, that doesn't make it real. 04:08:57.800 |
So, 'cause that's where that possibility exists, 04:09:03.080 |
- Yeah, and possibility itself is a kind of spooky idea 04:09:06.760 |
because it too has a sort of structure, right? 04:09:12.520 |
you know, you could have had a daughter, right, last year. 04:09:21.620 |
So we're saying that's possible, but not actual, right? 04:09:26.620 |
That is a claim, there are things that are true 04:09:40.800 |
- I feel like there's a lot of fog around that, 04:09:45.760 |
It feels like almost like a useful narrative. 04:09:49.120 |
So like, what does it mean if we say, you know, 04:09:57.880 |
Like it's possible that I just threw this cap differently 04:10:05.440 |
compared to the original, like what would appear as a decision. 04:10:08.720 |
- Whenever we're saying something's possible, 04:10:12.560 |
Like this thing just happened, but it's conceivable, 04:10:26.480 |
For that to be real, for the possibility to be real, 04:10:41.960 |
- Right, yeah, I'm just extending it beyond human action. 04:10:47.160 |
This goes to the physics of things, this is just everything. 04:10:54.320 |
- Possibility is really compelling for some reason. 04:10:56.920 |
- Well, yeah, because it's, I mean, so this, yeah, 04:11:03.760 |
but every backward-looking regret or disappointment 04:11:11.680 |
is completely dependent on this notion of possibility. 04:11:18.880 |
the sense that something else was possible, that I could have done something else, 04:11:36.680 |
whether or not there's such a thing as possibility, 04:11:44.420 |
because the reality is that in any given moment, 04:11:47.400 |
either you can do something to solve the problem 04:11:53.720 |
your worrying is just causing you to suffer twice over, 04:11:58.080 |
you're gonna get the medical procedure next week anyway. 04:12:04.880 |
It's gonna, the worry doesn't accomplish anything. 04:12:08.080 |
- How much do physicists think about possibility? 04:12:10.880 |
- Well, they think about it in terms of probability 04:12:16.320 |
and again, this is a place where I might be out of my depth 04:12:20.740 |
and need to talk to somebody to debunk this, but the- 04:12:30.380 |
probability is just a pattern of actuality that we've observed, right? 04:12:33.760 |
I mean, we have, there are certain things we observe 04:12:36.380 |
and those are the actual things that have happened. 04:12:38.920 |
And we have this additional story about probability. 04:12:42.960 |
I mean, we have the frequency with which things happen, 04:12:52.520 |
I know in the abstract that I have a belief that 04:12:57.920 |
those tosses should converge on 50% heads and 50% tails. 04:13:12.560 |
But in reality, all we ever have are the observed tosses. 04:13:16.500 |
Right, and then we have an additional story that, 04:13:19.400 |
oh, it came up heads, but it could have come up tails. 04:13:39.320 |
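The convergence in question is easy to watch in simulation. A small sketch (an illustration of the law of large numbers, not anything from the conversation): the observed frequency of heads drifts toward 50% as tosses accumulate, even though all we ever record are the individual actual tosses.

```python
# Illustrative sketch: the observed frequency of heads approaches 0.5
# as tosses accumulate (law of large numbers), even though each toss
# is simply what actually happened.
import random

random.seed(0)  # arbitrary seed so the run is reproducible
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9,} tosses -> frequency of heads = {heads / n:.4f}")
```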
- I think we're claiming that probability is true. 04:13:42.820 |
That it just, it allows us to have a nice model 04:13:47.060 |
about the world, gives us hope about the world. 04:13:54.940 |
It's a little bit like what's happening 04:13:59.940 |
with the laws of nature too, because the laws of nature, 04:14:02.820 |
the laws of nature impose structure on the world, 04:14:16.220 |
but the structure isn't just a matter of the actual things. 04:14:29.300 |
But then we have this notion that in addition to that, 04:14:32.120 |
we have the laws of nature that are explaining this actuality, 04:14:35.940 |
but how are the laws of nature an additional thing, 04:14:48.360 |
over and above the actual things that are just actually banging around? 04:14:52.980 |
- For that, possibly has to be hiding somewhere 04:15:13.780 |
- 'Cause we're in this strand of that multiverse. 04:15:18.940 |
Still, you have just a local instance of what is actual. 04:15:34.340 |
The idea is that basically everything that can happen, happens somewhere. 04:15:42.660 |
I don't know if that's the kosher formulation of it, but it seems pretty close. 04:15:49.340 |
In fact, relativistically, there's, 04:15:51.860 |
you know, Einstein's original notion of the block universe. 04:15:58.900 |
And it's been a while since I've been in a conversation 04:16:01.020 |
with a physicist where I've gotten a chance to ask 04:16:02.680 |
about the standing of this concept in physics currently. 04:16:06.300 |
but the idea of a block universe is that, you know, past, present, and future all already exist. 04:16:13.620 |
And our sense that we are traveling through space-time 04:16:22.360 |
that that's an illusion of just our, you know, 04:16:34.440 |
It's like you're reading a novel: the last page of the novel already exists, even though you haven't gotten there yet. 04:16:52.360 |
So as a matter of our experience, moment to moment, 04:16:56.520 |
I think it's totally compatible with that being true, 04:17:09.520 |
and it needn't feel disempowering and confining, or anything like that, 04:17:12.880 |
because it's actually, it's a circumstance of pure discovery. 04:17:17.440 |
Like you have no idea what's gonna happen next, right? 04:17:23.560 |
You're only by tendency seeming to resemble yourself 04:17:34.180 |
And yet the basic insight is that you're not, 04:17:39.180 |
you're not in control; the real freedom is the recognition 04:17:47.820 |
that everything is simply arising, including your thoughts and intentions and moods. 04:17:50.580 |
- So life is a process of continuous discovery. 04:17:58.480 |
It's the miracle that the universe is illuminated to itself. 04:18:06.200 |
And you're continually discovering what your life is. 04:18:22.400 |
or you're struggling to form a confident opinion 04:18:26.120 |
And yet there is this fundamental mystery to everything, 04:18:33.160 |
- We're all NPCs in a most marvelous video game. 04:18:37.920 |
- Maybe, although my game, my sense of gaming 04:18:41.600 |
does not run as deep as to know what I'm committing to. 04:18:51.420 |
- I went back, I was an original video gamer, 04:18:58.040 |
I remember when I saw the first Pong in a restaurant 04:19:00.840 |
in, I think it was like Benihana's or something, 04:19:06.240 |
And that was just an amazing moment when you-- 04:19:32.340 |
'cause most people would say that's not a waste of time. 04:19:36.660 |
that's a deeply embarrassing thing you would never admit? 04:19:40.980 |
- I don't think, well, I mean, once or twice a year I play golf. 04:19:53.900 |
- No, I mean, I love, golf just takes way too much time, 04:19:57.000 |
so I can only squander a certain amount of time on it. 04:20:01.420 |
- But you have no control over your actual performance. 04:20:05.940 |
- I do have control over my mediocre performance, 04:20:09.620 |
but I don't have enough control as to make it really good. 04:20:17.820 |
I don't play enough to care how I play, so I just have fun when I play. 04:20:30.740 |
- Amidst the chaos of human civilization in modern times, 04:20:36.060 |
what gives you hope about this world in the coming year, 04:20:41.060 |
in the coming decade, in the coming hundred years, 04:20:59.320 |
- Well, I think most people are good, and are mostly converging on the same core values, right? 04:21:04.320 |
It's like we're not surrounded by psychopaths. 04:21:14.320 |
Part of what made it easy to get off Twitter was how different life was seeming 04:21:31.000 |
compared to that place, in which actually decent people are behaving like psychopaths. 04:21:58.240 |
There's enough, you know, there's no effective limit, 04:22:04.240 |
within the limits of what's physically possible, 04:22:06.800 |
but we're nowhere near the limit on abundance. 04:22:11.400 |
You know, on this planet, forget about going to Mars, 04:22:16.040 |
It's like we could make this place incredibly beautiful 04:22:37.960 |
So to you, the basic characteristics of human nature 04:22:41.440 |
are such that we'll be okay if the incentives are okay. 04:22:50.120 |
that it's easier to break things than to fix them. 04:22:52.040 |
It's easier to light a fire than to put it out. 04:23:08.520 |
to effectively screw things up for everybody, right? 04:23:11.080 |
So it's easier, it's like a thousand years ago, 04:23:17.000 |
no one person could derange the lives of millions, much less billions. 04:23:26.120 |
So on the assumption that we're always gonna have some destructive people, 04:23:36.400 |
we have to figure out that asymmetry somehow. 04:23:56.320 |
Do we really want to democratize, you know, all the relevant technologies there? 04:24:00.000 |
You know, do we want, really, you really wanna 04:24:01.800 |
give everyone the ability to order nucleotides in the mail 04:24:06.360 |
and give them the blueprints for viruses online 04:24:09.800 |
because of, you know, you're a free speech absolutist 04:24:23.160 |
many people are confused about my take on free speech 04:24:36.360 |
I'm worried about the free speech of the individual 04:24:40.080 |
businesses or individual platforms or individual, 04:24:42.560 |
you know, media people to decide that they don't wanna platform someone. 04:24:49.400 |
I think you should be able to kick off the Nazi 04:24:53.400 |
because it's your platform, you own it, right? 04:24:58.040 |
That's the side of my free speech concern for Twitter, 04:25:01.840 |
It's not that every Nazi has the right to be on the platform. 04:25:06.840 |
I think if you own Twitter, you or the, 04:25:08.840 |
you know, whether it's just Elon or, you know, 04:25:14.240 |
the board and the shareholders and the employees, 04:25:17.960 |
these people should be free to decide 04:25:31.600 |
But I do worry about this problem of misinformation 04:25:49.560 |
It could be the hubris of every present generation 04:25:51.640 |
to think that their moment is especially important, 04:25:55.160 |
but I do think, with the emergence of these technologies, we're at a moment 04:26:01.200 |
where we really have to figure out how to get this right. 04:26:05.800 |
if we figure out how to not drive ourselves crazy 04:26:08.880 |
by giving people access to all possible information, 04:26:16.720 |
there's no limit to how happily we could collaborate 04:26:26.680 |
- And trillions of robots, some of them sex robots, 04:26:31.800 |
- Robots that are running the right algorithm, 04:26:35.880 |
- Whatever you need in your life to make you happy. 04:26:38.240 |
Sam, the first time we talked is one of the huge honors 04:26:43.160 |
of my life, I've been a fan of yours for a long time. 04:26:45.720 |
The few times you were respectful but critical of me 04:26:49.360 |
meant the world, and thank you so much for helping me 04:27:11.680 |
- But very happy to do this, thanks for the invitation. 04:27:16.040 |
- Thanks for listening to this conversation with Sam Harris. 04:27:18.800 |
To support this podcast, please check out our sponsors 04:27:27.080 |
"Love is the only force capable of transforming an enemy 04:27:32.440 |
Thank you for listening, I hope to see you next time.