back to indexSam Harris: Consciousness, Free Will, Psychedelics, AI, UFOs, and Meaning | Lex Fridman Podcast #185
Chapters
0:0 Introduction
1:48 Where do thoughts come from?
7:49 Consciousness
25:21 Psychedelics
34:44 Nature of reality
51:40 Free will
110:25 Ego
119:29 Joe Rogan
122:30 How will human civilization destroy itself?
129:57 AI
150:40 Jordan Peterson
158:43 UFOs
166:32 Brazilian Jiu Jitsu
176:17 Love
187:21 Meaning of life
00:00:00.000 |
The following is a conversation with Sam Harris, 00:00:12.840 |
including The End of Faith, The Moral Landscape, 00:00:18.560 |
He also has a meditation app called Waking Up 00:00:21.640 |
that I've been using to guide my own meditation. 00:00:26.760 |
National Instruments, Belcampo, Athletic Greens, and Linode. 00:00:31.520 |
Check them out in the description to support this podcast. 00:00:41.320 |
first from his writing, then his early debates, 00:00:44.840 |
maybe 13, 14 years ago on the subject of faith, 00:01:02.000 |
the calm and clarity amid the storm of difficult, 00:01:07.640 |
I really can't express in words how much it meant to me 00:01:10.640 |
that he, Sam Harris, someone who I've listened to 00:01:16.000 |
would write a kind email to me saying he enjoyed this podcast 00:01:20.560 |
and more that he thought I had a unique voice 00:01:25.920 |
Whether it's true or not, it made me feel special 00:01:28.600 |
and truly grateful to be able to do this thing 00:01:37.920 |
was one of the most memorable moments of my life. 00:02:01.000 |
- Well, that's a very difficult question to answer. 00:02:04.640 |
Subjectively, they appear to come from nowhere, right? 00:02:09.440 |
I mean, it's just they come out of some kind of mystery 00:02:17.080 |
So, which is to say that if you pay attention 00:02:38.440 |
where your thoughts seem fundamentally foreign to you, 00:02:45.400 |
associated with them, and people readily identify with them. 00:02:52.920 |
this is the spell that gets broken with meditation. 00:03:08.960 |
you know, if there were such a thing as a self, 00:03:11.840 |
how could you be identical to the next piece of language 00:03:15.920 |
or the next image that just springs into conscious view? 00:03:26.440 |
about examining that point of view closely enough 00:03:31.960 |
that's on the other side of that identification. 00:03:34.360 |
But the subjectively thoughts simply emerge, right? 00:03:39.360 |
And you don't think them before you think them, right? 00:03:46.000 |
I mean, just anyone listening to us or watching us now 00:03:48.600 |
could perform this experiment for themselves. 00:03:50.920 |
I mean, just imagine something or remember something. 00:03:54.440 |
You know, just pick a memory, any memory, right? 00:04:04.760 |
I mean, let's say you remembered breakfast yesterday 00:04:07.600 |
or you remembered what you said to your spouse 00:04:11.080 |
or you remembered what you watched on Netflix last night, 00:04:13.880 |
or you remembered something that happened to you 00:04:16.000 |
when you were four years old, whatever it is, right? 00:04:24.200 |
And that is not a, I mean, I'm sure we'll get to the topic 00:04:36.280 |
- Well, through no free will of my own, yeah. 00:04:48.000 |
that many of our thoughts, all of our thoughts 00:04:50.840 |
are at bottom what some part of our brain is doing 00:05:06.880 |
- Is it possible to pull at the string of thoughts 00:05:21.640 |
Is it possible to somehow get closer to the roots 00:05:24.440 |
of where they come out of from the firing of the cells? 00:05:28.720 |
Or is it a useless pursuit to dig into that direction? 00:05:32.480 |
- Well, you can get closer to many, many subtle contents 00:05:39.960 |
So you can notice things more and more clearly 00:05:44.360 |
and become more differentiated and more interesting. 00:05:47.800 |
And if you take psychedelics, it opens up wide 00:05:56.040 |
that very few people imagine would be possible, 00:06:01.960 |
But this idea of you getting closer to something, 00:06:12.800 |
is ultimately undermined because there's no place 00:06:23.440 |
We tend to start out, whether it's in meditation 00:06:28.080 |
or in any kind of self-examination or taking psychedelics, 00:06:35.960 |
of feeling like we're the kind of on the rider 00:06:45.340 |
going down the stream of consciousness, right? 00:06:49.700 |
from what we know cognitively, introspectively. 00:07:16.700 |
then this sense of this notion of going deeper 00:07:21.580 |
kind of breaks apart because really there is no depth. 00:07:25.460 |
Ultimately, everything is right on the surface. 00:07:27.180 |
Everything, there's no center to consciousness. 00:07:33.120 |
Again, if you drop acid, the contents change. 00:07:45.800 |
the continuum of depth versus surface has broken apart. 00:08:09.780 |
How fundamental is it to the physics of reality? 00:08:13.020 |
How fundamental is it to what it means to be human? 00:08:25.660 |
if we will build it purposefully or just by accident. 00:08:35.620 |
That, I mean, my concern here is that we may in fact 00:08:39.120 |
build artificial intelligence that passes the Turing test, 00:08:44.020 |
which we begin to treat not only as super intelligent 00:08:47.040 |
because it obviously is and demonstrates that, 00:08:56.800 |
And unless we understand exactly how consciousness emerges 00:09:07.920 |
"Listen, you can't turn me off 'cause that's a murder." 00:09:20.780 |
But if we build something like perfectly humanoid robots 00:09:27.520 |
so we're basically in a Westworld-like situation, 00:09:33.160 |
an attribution of consciousness from those machines. 00:09:36.880 |
they're just going to advertise their consciousness 00:09:41.940 |
But we won't know, and we won't know in some deeper sense 00:09:47.360 |
than we can be skeptical of the consciousness 00:09:51.340 |
I mean, someone could roll that back and say, 00:09:55.300 |
"We're just passing the Turing test for one another." 00:09:57.120 |
But that kind of solipsism isn't justified biologically 00:10:07.280 |
that you and I are part of the same roll of the dice 00:10:12.280 |
in terms of how intelligent and conscious systems emerged 00:10:41.400 |
The question of how it emerges is genuinely uncertain. 00:10:44.980 |
And ultimately, the question of whether it emerges 00:10:55.760 |
that consciousness might be a fundamental principle 00:11:04.080 |
even though everything else that we recognize 00:11:08.640 |
about ourselves as minds almost certainly does emerge. 00:11:11.400 |
You know, like an ability to process language, 00:11:13.160 |
that clearly is a matter of information processing 00:11:22.520 |
And the problem, the confound with consciousness 00:11:26.920 |
is that, yes, we can seem to interrupt consciousness. 00:11:30.640 |
I mean, you can give someone general anesthesia 00:11:36.000 |
And they say, "Nothing, I don't remember anything." 00:11:38.600 |
But it's hard to differentiate a mere failure of memory 00:11:43.600 |
from a genuine interruption in consciousness. 00:12:04.640 |
would you say, do you err on the side of panpsychism, 00:12:11.860 |
to all of reality, or more on the other side, 00:12:16.760 |
which is like, it's a nice little side effect, 00:12:20.200 |
a useful, like, hack for us humans to survive? 00:12:37.780 |
I don't know, and panpsychism is not so compelling to me. 00:12:46.900 |
I wouldn't know how the universe would be different 00:12:57.260 |
into the most fundamental constituents of matter. 00:13:05.100 |
but then you wouldn't expect anything to be different 00:13:11.420 |
or at least I wouldn't expect anything to be different. 00:13:16.700 |
It just might be that reality is not something 00:13:32.460 |
as we've done ontologically is to misconstrue it, right? 00:13:37.300 |
I mean, there could be some kind of neutral monism 00:13:40.900 |
at the bottom, and this idea doesn't originate with me. 00:13:43.860 |
This goes all the way back to Bertrand Russell 00:13:50.860 |
but I just feel like the concepts we're using 00:14:02.260 |
Where the rubber hits the road psychologically here 00:14:14.540 |
that really seems to be many people's concern here. 00:14:19.540 |
- Well, I tend to believe, just as a small little tangent, 00:14:29.240 |
which one is the chicken, which one is the egg, 00:14:32.420 |
because it feels like death could be the very thing, 00:14:34.620 |
like our knowledge of mortality could be the very thing 00:14:38.460 |
- Yeah, well, then you're using consciousness 00:14:43.660 |
I mean, so for me, consciousness is just the fact 00:14:49.980 |
that there's an experiential quality to anything. 00:15:01.940 |
Like, it's not associated with this qualitative sense 00:15:06.260 |
that there's something that it's like to be that part 00:15:16.980 |
and we can talk about, and whether we talk about it or not, 00:15:27.020 |
There's something that seems to be happening, right? 00:15:29.740 |
And the seeming, in our case, is broken into vision 00:15:36.540 |
and taste and smell, and thought and emotion. 00:15:50.420 |
and that we can have direct access to in any present moment 00:16:04.020 |
we're just totally confused about our circumstance, 00:16:12.460 |
Like, you can't say that consciousness itself 00:16:14.620 |
might be an illusion, because on this account, 00:16:18.900 |
it just means that things seem any way at all. 00:16:23.300 |
it seems to me that I'm seeing a cup on the table. 00:16:31.520 |
But the seeming part isn't really up for grabs 00:16:45.220 |
in which every other thing we can notice about ourselves 00:16:50.820 |
And it's also the context in which certain illusions 00:16:55.420 |
we can be wrong about what it's like to be us, 00:16:57.820 |
and we can, I'm not saying we're incorrigible 00:17:05.680 |
but for instance, many people feel like they have a self, 00:17:15.240 |
and that you can cut through those experiences, 00:17:33.180 |
that I would think, it doesn't just come online 00:17:39.660 |
It doesn't just come online when we form a concept of death, 00:17:54.100 |
And I wouldn't even think it's necessarily limited 00:17:57.680 |
to people, I do think probably any mammal has this, 00:18:07.440 |
that something about our brains is producing this, right, 00:18:18.260 |
even though you can argue the jury's still out 00:18:23.360 |
to draw a principled line between us and chimps, 00:18:38.600 |
There are people who think single cells might be conscious, 00:18:43.440 |
They've got something like 100,000 neurons in their brains. 00:18:58.240 |
I mean, to push back, the alternative version could be 00:19:02.120 |
it is an illusion constructed by, just by humans. 00:19:12.520 |
is that humans are able to contemplate their mortality, 00:19:16.560 |
and that contemplation in itself creates consciousness. 00:19:31.880 |
So do you think it's possible that that is the case, 00:19:34.960 |
that it is a sort of construct of the way we deal, 00:19:39.320 |
almost like a social tool to deal with the reality 00:19:42.840 |
of the world, a social interaction with other humans? 00:19:45.500 |
Or is, 'cause you're saying the complete opposite, 00:19:49.040 |
which is it's like fundamental to single-cell organisms 00:19:54.880 |
- Right, well, yeah, so I don't know how far down to push it. 00:20:00.520 |
are likely to be conscious, but they might be. 00:20:04.040 |
And I just, again, it could be unfalsifiable. 00:20:14.520 |
or have a conversation or treat other people. 00:20:17.280 |
First of all, babies treat other people as others 00:20:25.800 |
And they certainly do it before they have language, right? 00:20:29.120 |
So it's got to precede language to some degree. 00:20:36.720 |
because you can put yourself in various states 00:20:54.120 |
has been obliterated and yet you're all too conscious. 00:20:58.760 |
In fact, I think you could make a stronger argument 00:21:18.280 |
That we're potentially much more conscious of data, 00:21:23.280 |
sense data and everything else than we tend to be. 00:21:33.640 |
And so like when I walk into a room like this, 00:21:38.780 |
I have certain expectations of what is in a room. 00:21:41.960 |
I would be very surprised to see wild animals in here 00:21:57.360 |
once I walk into a room and I see a live gorilla or whatever. 00:22:01.800 |
So there's structure there that we have put in place 00:22:17.400 |
and you just look as though for the first time at anything, 00:22:28.560 |
and just the torrents of sense data that are coming in 00:22:40.780 |
And that tends to just obliterate one's capacity 00:22:55.660 |
And also edibles, but there's some psychedelic properties 00:23:04.240 |
several times and always had an incredible experience. 00:23:07.600 |
Exactly the kind of experience you're referring to, 00:23:15.380 |
it felt like I was removing some of the constraints. 00:23:30.640 |
is an experience of encountering the futility 00:23:37.600 |
of capturing what you just saw a moment ago in words, right? 00:23:42.600 |
Like, especially if you have any part of your self-concept 00:23:51.980 |
I mean, if you're a writer or a poet or a scientist, 00:24:07.900 |
when you have taken a whopping dose of psychedelics 00:24:12.660 |
and you begin to even gesture at describing it to yourself, 00:24:22.460 |
it's like trying to thread a needle using your elbows. 00:24:27.460 |
I mean, it's like you're trying something that can't, 00:24:30.180 |
it's like the mere gesture proves its impossibility. 00:24:47.220 |
it's clearly not about language structuring your experience 00:24:52.220 |
and you're having much more experience than you tend to. 00:25:00.500 |
but it's certainly primary for certain kinds of concepts 00:25:05.500 |
and certain kinds of semantic understandings of the world. 00:25:11.140 |
But it's clearly more to mind than the conversation 00:25:16.140 |
we're having with ourselves or that we can have with others. 00:25:21.420 |
- Can we go to that world of psychedelics for a bit? 00:25:35.900 |
A lot of people report this kind of creatures 00:25:40.260 |
that they see, and again, it's probably the failure 00:25:46.860 |
There's, as you're aware, there's a bunch of studies 00:25:55.340 |
psilocybin, and John Hopkins, and much other places. 00:26:00.340 |
But DMT, they all speak of as like some extra 00:26:09.780 |
Yeah, do you have a sense of where it is our mind goes 00:26:18.980 |
- Well, unfortunately, I haven't taken DMT, so I can't-- 00:26:28.740 |
in everyone's brain and in many, many plants, apparently. 00:26:33.740 |
But I've wanted to take it, I haven't had an opportunity 00:26:38.460 |
that presented itself that where it was obviously 00:26:42.460 |
But for those who don't know, DMT is often touted 00:26:51.420 |
You smoke it and it's basically a 10 minute experience, 00:26:54.740 |
or a three minute experience within like a 10 minute window 00:26:59.220 |
that when you're really down after 10 minutes or so. 00:27:04.060 |
And Terence McKenna was a big proponent of DMT, 00:27:13.620 |
And it is characterized, it seems, for many people 00:27:20.060 |
by this phenomenon, which is unlike virtually 00:27:23.020 |
any other psychedelic experience, which is your, 00:27:25.900 |
it's not just your perception being broadened or changed, 00:27:41.020 |
You have been shot elsewhere and find yourself 00:27:45.600 |
in relationship to other entities of some kind. 00:27:54.140 |
- So it does feel like travel to another place, 00:27:58.140 |
- According, again, I just have this on the authority 00:28:00.480 |
of the people who have described their experience, 00:28:08.500 |
because it's apparently pretty unpleasant to smoke. 00:28:11.620 |
So it's like getting enough on board in order to get shot 00:28:19.380 |
what McKenna called self-transforming machine elves 00:28:25.420 |
that appeared to him like jeweled Fabergé egg-like, 00:28:31.420 |
self-drippling basketballs that were handing him 00:28:34.580 |
completely uninterpretable reams of profound knowledge. 00:28:44.540 |
so I just have to accept that people have had it. 00:28:47.700 |
I would just point out that our minds are clearly capable 00:29:04.820 |
as anyone who's ever remembered a dream can attest. 00:29:10.220 |
and some of us don't remember dreams very often, 00:29:15.660 |
And just think of how insane that experience is. 00:29:20.660 |
I mean, you've forgotten where you were, right? 00:29:29.500 |
You have lost your connection to your episodic memory, 00:29:34.500 |
or even your expectations that reality won't undergo 00:29:40.220 |
wholesale changes a moment after you have closed your eyes. 00:29:45.020 |
You're in bed, you're watching something on Netflix, 00:29:50.420 |
and then the next thing that happens to you is impossible, 00:30:06.540 |
You have lost your mind, but relevantly for this-- 00:30:14.780 |
'cause then you can have the best of both circumstances, 00:30:17.900 |
and then it can be kind of systematically explored. 00:30:22.900 |
- But what I mean by found, just sorry to interrupt, 00:30:31.980 |
language and other things of the waking world ground us, 00:30:35.600 |
maybe it is that you've found the full capacity 00:30:44.740 |
You're stepping outside the little human cage, 00:30:49.500 |
To open the door and step out and look around, 00:30:54.220 |
- Well, you've definitely stepped out of something 00:30:57.100 |
and into something else, but you've also lost something. 00:31:04.100 |
you literally didn't, you don't have enough presence of mind 00:31:07.980 |
in the dreaming state, or even in the psychedelic state, 00:31:20.300 |
with your life, such that you're not surprised 00:31:24.040 |
to be in the presence of someone who should be, 00:31:30.940 |
you're not likely to have met, by normal channels, right? 00:31:42.940 |
You're like, did you drive to this restaurant? 00:31:45.300 |
You have no memory, and none of that's surprising to you. 00:31:49.740 |
You're not reality testing in the normal way. 00:32:03.580 |
- Well, I would put it more likely in dreams, 00:32:08.060 |
you don't tend to hallucinate in a dreamlike way. 00:32:11.500 |
I mean, so DMT is giving you an experience of others, 00:32:26.500 |
the people want to suggest, and Terrence McKenna 00:32:31.220 |
certainly did suggest, that because these others 00:32:34.900 |
are so obviously other, and they're so vivid, 00:32:42.860 |
But every night in dreams, you create a compelling, 00:32:49.780 |
a totally compelling simulacrum of another person, right? 00:33:05.780 |
of doing everything you think it might be capable of, 00:33:10.020 |
So one of the things that people have discovered 00:33:14.260 |
in lucid dreams, and I haven't done a lot of lucid dreaming, 00:33:32.780 |
Like if you go into a dark room and flip on the light, 00:33:43.940 |
for the brain's inability to produce from a standing start 00:34:20.980 |
Right, there's no, the total is just a chronic instability, 00:34:24.460 |
graphical instability of text in the dream state. 00:34:31.100 |
someone can confirm that that's not true for them, 00:34:40.620 |
- Yeah, it's rendering, it's re-rendering it, 00:34:46.020 |
I don't know how I found myself in this sets of, 00:35:00.700 |
thinking processes needed is those of mathematicians, 00:35:06.340 |
basically doing anything that involves math is proofs, 00:35:09.860 |
and you have to think creatively, but also deeply, 00:35:12.460 |
and you have to think for many hours at a time. 00:35:25.580 |
the worst is LSD, because it completely destroys 00:35:31.060 |
And I wonder whether that has to do with your ability 00:35:33.660 |
to visualize geometric things in a stable way in your mind, 00:35:38.660 |
and hold them there, and stitch things together, 00:35:44.100 |
But again, it's difficult to kind of research 00:35:47.980 |
these kinds of concepts, but it does make me wonder 00:35:55.860 |
of things you're able to think about and explore 00:36:02.820 |
or dream states, and so on, and how is that different? 00:36:15.940 |
and we get to take a step in other versions of it? 00:36:23.020 |
four-dimensional, there's a three-dimensional world, 00:36:25.700 |
there's time, and that's what we think about reality. 00:36:29.660 |
And we think of traveling as walking from point A 00:36:40.100 |
trying not to get eaten by a lion, conception of reality. 00:36:43.380 |
What if traveling is something like we do with psychedelics 00:37:00.380 |
I don't know if you have a favorite view of reality, 00:37:03.980 |
or if you, you had, by the way, I should say, 00:37:11.980 |
- Is there any inkling of his sense in your mind 00:37:24.980 |
we perceive, and we play with in our human minds? 00:37:33.260 |
we're never in direct contact with reality, whatever it is, 00:37:42.780 |
So we're only ever experiencing consciousness 00:37:48.380 |
And then the question is, how does that circumstance 00:37:55.940 |
And Donald Hoffman is somebody who's happy to speculate, 00:38:02.700 |
Maybe it's all just consciousness on some level. 00:38:05.620 |
And that's interesting, that runs into, to my eye, 00:38:27.500 |
that would just say that the universe is just mind 00:38:33.060 |
we'll go by the name of idealism in Western philosophy. 00:38:40.980 |
all kinds of epicycles and kind of weird coincidences 00:38:44.900 |
and to get the predictability of our experience 00:38:57.680 |
what does it mean to say that there's only consciousness 00:39:14.900 |
no matter how far removed from our sense bases, 00:39:17.460 |
whether it's they're looking at the Hubble Deep Field 00:39:25.700 |
they're still just experiencing consciousness 00:39:53.420 |
that our materialist assumptions are confirmable, right? 00:40:02.240 |
this fantastic amount of energy from within an atom, right? 00:40:14.180 |
there's a lot of energy in that matter, right? 00:40:21.020 |
And then we perform an experiment that in this case, 00:40:28.540 |
where the people who are most adequate to this conversation, 00:40:44.860 |
around the yield are off by orders of magnitude. 00:40:55.380 |
That the nuclear chain reaction is not gonna stop. 00:41:01.220 |
And lo and behold, there was that energy to be released 00:41:14.660 |
the picture one forms from those kinds of experiments, 00:41:22.900 |
just the fact that the Earth is billions of years old 00:41:26.060 |
and life is hundreds of millions of years old, 00:41:27.840 |
and we weren't here to think about any of those things, 00:41:35.160 |
and they are the processes that allowed us to emerge 00:41:43.960 |
that nothing exists outside of consciousness, 00:41:47.280 |
conscious minds of the sort that we experience, 00:42:12.720 |
born of concepts, but the idea that there's nothing there, 00:42:32.280 |
there's a computer waiting to render the moon 00:42:42.540 |
which I find a compelling thought experiment, 00:42:47.720 |
rendering mechanism, but not in the silly way 00:42:52.440 |
but in some kind of more fundamental physics way. 00:42:57.820 |
that it renders experiences that no one has had yet, 00:43:07.140 |
It can violate the expectations of everyone lawfully, 00:43:10.080 |
right, and then there's some lawful understanding 00:43:14.220 |
It's like, I mean, just to bring it back to mathematics, 00:43:20.420 |
whether we have discovered them or not, right? 00:43:31.640 |
So it's like, to say that our minds are putting it there, 00:43:38.900 |
is in some way, in some sense, putting it there, 00:43:41.860 |
that, like the base layer of reality is consciousness, right? 00:43:53.220 |
There's some, you know, hubris is the wrong word, 00:43:59.340 |
it's okay if reality is bigger than what we experience, 00:44:02.980 |
you know, and it has structure that we can't anticipate 00:44:14.840 |
there's certainly a collaboration between our minds 00:44:17.840 |
and whatever is out there to produce what we call, 00:44:33.240 |
on the train of idealism and kind of new age thinking 00:44:42.480 |
I mean, the place, experientially and scientifically, 00:44:45.940 |
I feel like it's, you can get everything you want 00:44:49.500 |
acknowledging that consciousness has a character 00:44:58.940 |
so that you're bringing kind of the first person experience 00:45:03.940 |
what is a human mind and, you know, what is true, 00:45:06.800 |
and you can explore it with different degrees of rigor 00:45:13.700 |
whether you're using a technique like meditation 00:45:19.020 |
have to be put in conversation with what we understand 00:45:27.420 |
- But to me, the question is, what if reality, 00:45:30.300 |
the sense I have from this kind of, you play shooters? 00:45:36.020 |
- There's a physics engine that generate, that's-- 00:45:47.300 |
My sense is the same could be true for a universe 00:45:50.900 |
in the following sense, that our conception of reality, 00:45:59.780 |
It's not that the reality that we conceive of that's there, 00:46:06.440 |
It's that it's a tiny fraction of what's actually out there. 00:46:19.020 |
for us to have a consistent experience as human beings. 00:46:34.020 |
We can even just start with the consciousness thing. 00:46:52.860 |
you look at how close we are to chimps, right? 00:46:58.900 |
Clearly, they have no idea what's going on, right? 00:47:06.160 |
that really understand much of what we're talking about 00:47:12.620 |
And if they all died in their sleep tonight, right, 00:47:15.660 |
you'd be left with people who might take 1,000 years 00:47:29.300 |
I mean, there are areas of scientific specialization 00:47:32.920 |
where I have either no discernible competence, 00:47:45.140 |
to actually make a breakthrough in those areas. 00:47:51.540 |
I've never spent any significant amount of time 00:47:56.380 |
but it's pretty obvious I'm not Alan Turing, right? 00:48:08.360 |
My first false starts in learning, I think it was C, 00:48:21.200 |
the syntax error that's causing this thing not to compile 00:48:28.000 |
So it was not, so if it was just people like me left, 00:48:50.940 |
to plant a flag in the ground right here and say, 00:48:57.780 |
you all have to come over here to make some progress." 00:49:02.100 |
And there are hundreds of topics where that's the case. 00:49:11.820 |
on making anything like discernible intellectual progress 00:49:18.960 |
And yeah, I'm just, Max Tegmark makes this point. 00:49:39.780 |
that has evolved to understand reality perfectly. 00:49:42.740 |
I mean, we're just not that kind of ape, right? 00:49:46.340 |
There's been no evolutionary pressure along those lines. 00:49:52.380 |
that were designed for fights with sticks and rocks, right? 00:49:56.780 |
And it's amazing we can do as much as we can. 00:50:02.380 |
on the back of having received an mRNA vaccine 00:50:16.460 |
I mean, it's now, it seems likely we have a vaccine 00:50:22.240 |
Which has been killing millions of people a year 00:50:27.520 |
I think it's down to like 800,000 people a year now 00:50:31.260 |
because we've spread so many bed nets around. 00:50:33.900 |
But it was like two and a half million people every year. 00:50:37.360 |
It's amazing what we can do, but yeah, I have, 00:50:43.380 |
if in fact the answer at the back of the book of nature 00:50:46.820 |
is you understand 0.1% of what there is to understand 00:50:51.820 |
and half of what you think you understand is wrong, 00:50:57.280 |
- It is funny to look at our evolutionary history, 00:51:09.300 |
of evolutionary development throughout human history, 00:51:21.300 |
there's a famous story, I forget which physicist told it, 00:51:34.260 |
because basically all the problems had been solved. 00:51:43.540 |
- You've recently released an episode of your podcast, 00:51:48.220 |
Making Sense, for those with a shorter attention span, 00:51:51.580 |
basically summarizing your position on free will. 00:52:11.380 |
and even the experience of free will is an illusion. 00:52:27.360 |
I say that it's not merely that free will is an illusion. 00:52:47.080 |
It's not that it's wrong, it's not even wrong. 00:52:49.500 |
I mean, that's, I guess, that was, I think, Wolfgang Pauli, 00:52:56.420 |
with that aspersion about his theory in quantum mechanics. 00:53:01.420 |
So there are things that you, there are genuine illusions. 00:53:12.740 |
and then you can kind of punch through that experience. 00:53:20.140 |
It's just, we just know it's not a veridical experience. 00:53:26.940 |
a lot of these come to me on Twitter these days. 00:53:31.020 |
where like every figure in this GIF seems to be moving, 00:53:40.260 |
Some of those illusions you can't see any other way. 00:53:44.540 |
I mean, they're just, they're hacking aspects 00:53:46.860 |
of the visual system that are just eminently hackable, 00:53:49.860 |
and you have to use a ruler to convince yourself 00:54:02.580 |
you can actually see that it's not there, right? 00:54:07.620 |
Like the Necker cube is a good example of that. 00:54:10.180 |
Like the Necker cube is just that schematic of a cube, 00:54:13.300 |
of a transparent cube, which pops out one way or the other. 00:54:15.900 |
One face can pop out and the other face can pop out, 00:54:18.940 |
but you can actually just see it as flat with no pop out, 00:54:22.160 |
which is a more veridical way of looking at it. 00:54:35.340 |
the sense of self and free will are closely related. 00:54:40.700 |
I often describe them as two sides of the same coin, 00:54:43.460 |
but they're not quite the same in their spuriousness. 00:54:56.900 |
but it's not, I wouldn't call the illusion of self 00:55:02.420 |
is an illusion in that as you pay more attention 00:55:08.060 |
that it's totally compatible with an absence of free will. 00:55:11.800 |
You don't, I mean, coming back to the place we started, 00:55:18.460 |
You don't know what you're gonna intend next. 00:55:20.300 |
You don't know what's going to just occur to you 00:55:24.540 |
You don't know how much you are going to feel 00:55:28.020 |
the behavioral imperative to act on that thought. 00:55:31.660 |
If you suddenly feel, oh, I don't need to do that. 00:55:39.640 |
You didn't know that was gonna be compelling. 00:55:41.580 |
All of this is compatible with some evil genius 00:55:48.500 |
Okay, let's give him the, oh my God, I just forgot. 00:55:52.700 |
It was gonna be our anniversary in one week thought, right? 00:55:57.980 |
Give him this brilliant idea for the thing he can buy 00:56:07.820 |
with the script already being written, right? 00:56:12.960 |
I'm not saying that fatalism is the right way 00:56:23.420 |
where we go back and forth between two options, 00:56:53.860 |
and therefore deliberately, behaviorally effective. 00:57:05.120 |
and this can drive some people a little crazy. 00:57:09.300 |
So I usually preface what I say about free will 00:57:13.740 |
with the caveat that if thinking about your mind this way 00:57:19.500 |
you know, get off the ride, switch the channel. 00:57:27.520 |
it's incredibly freeing to recognize this about the mind 00:57:37.160 |
I mean, cutting through the illusion of the self 00:57:58.760 |
of hating other people, what that is anchored to 00:58:13.200 |
And let's say they're targeting you unfairly, right? 00:58:15.900 |
They're maligning you on Twitter or they're suing you 00:58:20.220 |
or they're doing something, they broke your car window, 00:58:26.640 |
and you're relating to them very differently emotionally 00:58:46.960 |
of catching a virus or being attacked by a wild animal 00:58:54.900 |
You don't slip into this mode of hating the agent 00:59:01.000 |
in a way that completely commandeers your mind 00:59:18.600 |
- So it's a useful shortcut to compassion and empathy. 00:59:26.760 |
let's call it the consciousness generator black box 00:59:35.940 |
that we're walking along, that we're playing, 00:59:40.780 |
that's already written, is actually being written 00:59:47.680 |
and in real time, that road is being laid down. 00:59:57.060 |
So it's not, it is being generated, it didn't always exist. 01:00:02.800 |
that's fundamental about the nature of reality 01:00:09.840 |
I don't know if you want to distinguish between those. 01:00:13.040 |
- You would, because there's a bunch of illusions 01:00:42.080 |
that a certain kind of self is an illusion, not every. 01:00:44.840 |
We mean many different things by this notion of self. 01:00:48.000 |
So maybe I should just differentiate these things. 01:01:02.560 |
that's just as much a demonstration of consciousness 01:01:05.660 |
as really seeing what's quote actually there. 01:01:15.040 |
If you're, you can be confused about literally everything. 01:01:17.740 |
You can't be confused about the underlying claim, 01:01:36.960 |
It's the seeming that is the cash value of consciousness. 01:01:42.680 |
- So what if I am creating consciousness in my mind 01:02:10.580 |
as we together communicate about this reality? 01:02:17.560 |
you talk negatively about robots as you often do, 01:02:25.320 |
- No, I'm looking forward to certain kinds of robots. 01:02:40.160 |
It's that I'm worried that we will lose sight 01:02:56.120 |
when we recycle them, that would be a bad thing. 01:03:09.320 |
we would have beautiful things in this world? 01:03:15.160 |
I would love to go there, but let's not go there just yet. 01:03:17.520 |
But I do think it would be, if anything is bad, 01:03:20.280 |
creating hell and populating it with real minds 01:03:25.080 |
that really can suffer in that hell, that's bad. 01:03:29.120 |
That's the, you are worse than any mass murderer 01:03:37.280 |
or more likely it would be in some simulation 01:03:40.760 |
of a world where we've managed to populate it 01:03:49.720 |
That would just, it just, taking the thesis seriously 01:03:53.160 |
that there's nothing, that mind, intelligence, 01:03:58.080 |
and consciousness ultimately are substrate independent. 01:04:00.720 |
Right, you don't need a biological brain to be conscious. 01:04:05.960 |
So if we just imagine that consciousness at some point 01:04:09.000 |
comes along for the ride as you scale up in intelligence, 01:04:12.520 |
well then we could find ourselves creating conscious minds 01:04:17.160 |
And that's just like creating a person who's miserable. 01:04:19.720 |
Right, it could be worse than creating a person 01:04:21.440 |
who's miserable, it could be even more sensitive 01:04:27.960 |
- Just like watching a person suffer for entertainment. 01:04:36.040 |
which is differentiating consciousness and self 01:04:40.560 |
and free will as concepts and kind of degrees 01:04:55.120 |
this is where Dan Dennett is gonna get off the ride here. 01:04:57.360 |
Right, so like he doesn't, he's gonna disagree with me 01:05:02.240 |
But I have a very keen sense, having talked about this topic 01:05:07.240 |
for many, many years and seeing people get wrapped 01:05:15.600 |
what it's like to have felt that I was a self 01:05:18.820 |
that had free will and then to no longer feel that way. 01:05:21.800 |
Right, I mean, to know what it's like to actually 01:05:32.640 |
and what doesn't go away on the basis of that epiphany. 01:05:44.480 |
And it is the flip side of this feeling of self. 01:05:59.040 |
You feel like you're an agent that is appropriating 01:06:02.440 |
There's a protagonist in the movie of your life 01:06:09.440 |
It's like there's sights and sounds and sensations 01:06:13.420 |
and thoughts and emotions and this whole cacophony 01:06:28.360 |
People don't feel truly identical to their bodies 01:06:37.080 |
and that feels like a self, that feels like me. 01:06:48.880 |
in relationship to yourself or talking to yourself, 01:06:55.800 |
Like, why do you need the pep talk and why does it work? 01:06:57.880 |
If you're the one giving the pep talk, right? 01:07:03.780 |
why do I think the superfluous thought, where are my keys? 01:07:11.400 |
that we now need to look for the keys, right? 01:07:13.840 |
So that duality is weird, but leave that aside. 01:07:16.320 |
There's the sense, and this becomes very vivid 01:07:32.000 |
So you close your eyes and you pay attention to the breath 01:07:44.960 |
And then you think, I thought, why the breath? 01:07:54.200 |
and you're not paying attention to the breath anymore. 01:08:01.960 |
But this starting point is of the conventional starting point 01:08:06.760 |
of feeling like you are an agent, very likely in your head, 01:08:10.160 |
a locus of consciousness, a locus of attention 01:08:28.660 |
and there's this kind of subject-object perception. 01:08:31.880 |
And that is the default starting point of selfhood, 01:08:46.060 |
I am an agent who can pay attention to the cup. 01:08:52.020 |
There are certain things that I can't control. 01:09:07.280 |
you know, I'm snapping my fingers, don't hear this. 01:09:10.160 |
You know, well, like just stop this from coming in. 01:09:20.240 |
to pay attention to something else than this. 01:09:22.560 |
Okay, well, so I'm not that kind of free agent, 01:09:25.860 |
but at least I can decide what I'm gonna do next. 01:09:53.780 |
But all of that is born of not really paying close attention 01:10:24.600 |
is just to realize, okay, I didn't make myself. 01:10:31.760 |
that impinged upon this system for the last 54 years 01:10:35.160 |
that have produced my brain in precisely the state 01:10:39.920 |
such, I mean, with all of the receptor weightings 01:10:48.520 |
through no fault of my own as the experiencing self. 01:10:56.880 |
for the genetics and the environmental influences here. 01:11:00.600 |
And yet those are the only things that contrive 01:11:12.880 |
And if you were going to add something magical 01:11:18.440 |
you can also notice that you didn't produce your soul, 01:11:31.280 |
or was a psychopath or was, you know, had an IQ of 40. 01:11:51.160 |
maybe you can correct me, but it kind of speaks 01:11:55.780 |
- But even if you add magical ectoplasm software, 01:12:06.000 |
the actual computation running on the hardware 01:12:12.700 |
which you think of culture as an operating system. 01:12:26.100 |
and rather us just being a distributed computation system 01:12:31.100 |
on which there's some kind of operating system running, 01:12:36.480 |
is the actual thing that generates the interactions, 01:12:39.700 |
the communications, and maybe even free will, 01:12:44.260 |
Do you ever think of, do you ever try to reframe the world 01:12:47.200 |
in that way, where it's like ideas are just using us, 01:12:51.720 |
thoughts are using individual nodes in the system, 01:12:58.040 |
and they also have ability to generate experiences 01:13:16.040 |
- So then there's no self, a really integrated self 01:13:24.480 |
I mean, if you're just a collection of memes, 01:13:35.920 |
So it's like, and it seems to have structure, 01:13:42.440 |
between that part of the flow of water and the rest. 01:13:44.920 |
I mean, if our, and I would say that much of our mind 01:13:55.400 |
and you're not gonna find it by looking in the brain. 01:14:09.680 |
the genes on one side and culture on the other 01:14:16.440 |
manifestations of mind that aren't actually bounded 01:14:34.880 |
the rules of English grammar to whatever degree we are. 01:14:38.000 |
It's not that, we certainly haven't consciously 01:14:44.240 |
We haven't, I mean, there are norms of language use 01:14:48.680 |
that we couldn't even specify because we haven't, 01:15:07.180 |
And virtually every other cultural norm is like that. 01:15:13.220 |
You can consciously decide to scrutinize them 01:15:34.740 |
You're following, so you have some expectation 01:15:45.120 |
Obviously, this can change from culture to culture, 01:15:54.100 |
but in some countries, dog is not one of those foods, right? 01:15:57.500 |
And yet, you and I presumably would be horrified 01:16:13.580 |
I mean, they're certainly felt in their violation. 01:16:21.420 |
you're eating something that tastes great to you, 01:16:29.500 |
and you find out that you've just eaten 10 bites 01:16:37.300 |
and you feel this instantaneous urge to vomit, right, 01:16:49.900 |
that gave you such a powerful experience of its violation, 01:16:55.620 |
and I'm sure we can trace the moment in your history, 01:17:16.780 |
like, we are totally permeable to a sea of mind. 01:17:32.860 |
but it nevertheless is in charge of doing the scheduling. 01:17:46.220 |
That node sure as hell believes it has free will, 01:17:54.900 |
of a much larger computation that it can't control. 01:18:34.060 |
By any account of the mind that jettisons free will, 01:18:36.600 |
you still have to admit that there's a difference 01:18:42.920 |
and a purposeful motor action that I can control 01:18:55.760 |
which is being predictive so that I can notice errors. 01:19:02.040 |
if my hand were actually to pass through the bottle, 01:19:04.440 |
because it's a hologram, I would be surprised, right? 01:19:14.620 |
you don't have the same kind of thing going on. 01:19:31.000 |
It's not coming from elsewhere in the universe. 01:19:35.040 |
So, in that sense, the node is really deciding 01:19:49.720 |
that has given rise to this conundrum of free will, right? 01:20:05.840 |
So, when you run back the clock of your life, right? 01:20:11.220 |
You flip back the few pages in the novel of your life. 01:20:18.000 |
they could behave differently than they did, right? 01:20:23.240 |
even given your distributed computing example, 01:20:37.740 |
that's not the free will people think they have. 01:20:57.620 |
Or that you hold someone else responsible for 01:21:09.680 |
There is this illusion and it has to be an illusion 01:21:18.960 |
There's this illusion that if you arrange the universe 01:21:27.040 |
And the only way it could have played out differently 01:21:41.160 |
I only reached for the water bottle this time 01:21:45.080 |
because there's a random number generator in there 01:21:47.080 |
kicking off values and it finally moved my hand. 01:21:58.160 |
There's actually, I don't know if you're familiar 01:22:03.560 |
of how simple rules can create incredible complexity 01:22:07.800 |
that it's like really dumb initial conditions to set, 01:22:38.560 |
The question is whether there could be some room 01:22:42.840 |
for let's use in the 21st century the term magic, 01:22:51.840 |
If you're wrong about your intuition about free will, 01:22:58.440 |
and proves to you that you didn't have the full picture, 01:23:06.680 |
That's why it's not even an illusion in my world 01:23:24.720 |
and that's unlike any other spurious claim you might make. 01:23:29.400 |
So like if you're going to believe in ghosts, 01:23:37.880 |
Where like I don't happen to believe in ghosts, 01:23:44.280 |
what would have to be true for ghosts to be real. 01:23:47.320 |
And so it is with a thousand other things like ghosts. 01:23:52.040 |
that when people die, there's some part of them 01:23:54.820 |
that is not reducible at all to their biology 01:24:21.700 |
but it's just some spooky noun in the universe 01:24:44.240 |
There's no description of any concatenation of causes 01:24:55.280 |
when they think they could have done otherwise 01:24:56.960 |
and that they really, that they, the conscious agent, 01:25:01.860 |
Like if you don't know what you're going to think next, 01:25:24.660 |
fully living out its behavioral or emotional consequences. 01:25:50.300 |
the reason why mindfulness doesn't give you free will 01:25:53.620 |
you can't account for why in one moment mindfulness arises, 01:26:11.140 |
popping up all the time of just two recent chimps 01:26:24.340 |
we were to conceive about traveling from point A to point B, 01:26:34.940 |
a way which is consistent with Einstein's theory 01:26:49.460 |
of what it means to travel in the physical space. 01:26:52.900 |
And that, like, completely transform our ability. 01:27:03.380 |
Don't you think it's possible that there will be 01:27:05.780 |
inventions or leaps in understanding about reality 01:27:11.300 |
that will allow us to see free will as actually, 01:27:21.380 |
are actually able to be authors of our actions? 01:27:34.420 |
that will cause us to realize that circles are really square 01:27:43.020 |
No, a circle is what we mean by a perfectly round form. 01:27:47.620 |
Right, like, it's not on the table to be revised. 01:27:52.420 |
And so I would say the same thing about consciousness. 01:27:54.700 |
It's just like saying, is there some breakthrough 01:28:02.660 |
I'm saying no, because what the experience of an illusion 01:28:06.540 |
is as much a demonstration of what I'm calling consciousness 01:28:16.180 |
again, it comes down to a picture of causality 01:28:28.300 |
I know what it's like on the experiential side 01:28:34.220 |
to lose the thing to which it is clearly anchored, right? 01:28:41.660 |
and this is the question that almost nobody asked, 01:28:43.740 |
people who are debating me on the topic of free will, 01:28:59.220 |
Like, okay, so you're actually saying you don't, 01:29:07.580 |
who don't believe in free will philosophically 01:29:11.140 |
also believe that we're condemned to experience it. 01:29:28.060 |
- For, are we talking about a few minutes at a time, 01:29:32.620 |
or is this to require a lot of work and meditation, 01:29:37.620 |
or are you literally able to load that into your mind 01:29:42.020 |
- Right now, right now, just in this conversation. 01:29:52.900 |
and I would say the same thing for the illusoriness 01:30:00.340 |
and not have the free will in your mind at the same time? 01:30:03.940 |
- This is the same, yeah, it's the same thing that-- 01:30:08.780 |
- They really are two sides of the same coin. 01:30:10.620 |
But it's just, it comes down to what it's like 01:30:22.180 |
If I'm paying attention, now, if I'm not paying attention, 01:30:25.860 |
I'm probably, I'm captured by some other thought 01:30:32.260 |
but if I try to make vivid this experience of just, 01:30:35.540 |
okay, I'm finally going to experience free will. 01:30:40.420 |
Like it's got to be here, everyone's talking about it. 01:30:48.620 |
that is where it has to be most robust, right? 01:31:02.940 |
Like, so it's not just like reaching with my left hand 01:31:05.680 |
People don't like those examples for some reason. 01:31:22.800 |
there is no evidence of free will anywhere in sight. 01:31:33.400 |
You know, like, is it going to be person A or person B? 01:31:37.600 |
Got all my reasons for A and all my reasons why not, 01:31:52.880 |
And yes, you can say I'm the node in the network 01:31:57.540 |
I'm not saying it's being piped to me from elsewhere, 01:32:00.360 |
but the feeling of what it's like to make that decision 01:32:25.620 |
Right, or what's the next thought that's going to appear? 01:32:29.440 |
And it just, something just appears, you know? 01:32:32.300 |
And if something appears to cancel that something, 01:32:39.300 |
and then I think, oh, no, no, no, I can't do that. 01:32:42.500 |
There was that thing in that New York article I read 01:33:18.220 |
in some podcast guest choosing experiment, right? 01:33:33.100 |
a little hard to conduct, but there's enough data now 01:33:36.060 |
to know that something very much like this cartoon 01:33:39.660 |
is in fact true, and will ultimately be undeniable 01:33:44.460 |
for people, that they'll be able to do it on themselves 01:33:49.480 |
If you're deciding what to, where to go for dinner, 01:33:55.540 |
or who to have on your podcast, or ultimately, 01:33:57.820 |
who to marry, right, or what city to move to, right? 01:34:05.580 |
We could be scanning your brain in real time, 01:34:07.980 |
and at a point where you still think you're uncommitted, 01:34:12.540 |
we would be able to say with arbitrary accuracy, 01:34:17.020 |
all right, Lex is, he's moving to Austin, right? 01:34:24.780 |
He got, he's catching one of these two waves, 01:34:42.160 |
In you thinking about other stuff in the world, 01:34:51.460 |
And you argue that it probably makes a better world 01:35:01.860 |
in that literally hate makes no sense anymore. 01:35:07.900 |
really be worried about, really want to oppose, 01:35:10.900 |
really, I mean, I'm not saying you'd never have 01:35:13.820 |
Like, I mean, self-defense is still a thing, right? 01:35:21.380 |
anything other than a force of nature in the end 01:35:26.740 |
Or does go out the window when you really pay attention. 01:35:29.060 |
I'm not saying that this would be easy to grok 01:35:38.380 |
I'm not saying you can just listen to my 90 minutes 01:35:40.300 |
on free will and then you should be able to see 01:35:41.900 |
that person as identical to a grizzly bear or a virus. 01:35:46.540 |
'Cause I mean, we are so evolved to deal with one another 01:35:57.940 |
But it's, yeah, when you're talking about the possibility 01:36:02.940 |
of Christian, truly Christian forgiveness, right? 01:36:07.940 |
It is like, as testified to by various saints 01:36:28.900 |
that no one really at bottom made themselves. 01:36:32.580 |
And therefore everyone, what we're seeing really 01:36:43.620 |
and to be in good societies and had good opportunities 01:36:46.340 |
and to be intelligent and to be not sociopathic. 01:36:53.020 |
They're just reaping the fruits of one lottery 01:36:56.580 |
after another and then showing up in the world on that basis. 01:37:00.100 |
And then so it is with every malevolent asshole out there. 01:37:14.040 |
the utility for self-compassion is also enormous 01:37:21.740 |
to regret something or to feel shame about something 01:37:29.020 |
these states of mind are some of the most deranging 01:37:33.180 |
experiences anyone has and the indelible reaction to them, 01:37:42.140 |
the memory of the wedding toast you gave 20 years ago 01:37:47.100 |
The fact that that can still make you hate yourself, right? 01:38:01.100 |
- No, no, no, that's not what I was referring to. 01:38:07.340 |
that you're referring to of every moment I'm alive, 01:38:14.340 |
Like several things in this conversation already 01:38:23.100 |
You're sitting in front of Sam Harris and you said that. 01:38:30.660 |
I've actually come to accept that as a nice feature 01:38:37.340 |
- Well, I think the thing you wanna let go of is 01:38:59.500 |
So like hatred is, hatred is divorceable from anger 01:39:02.700 |
in the sense that hatred is this enduring state 01:39:33.540 |
except as a signal of salience that there's a problem, right? 01:39:37.300 |
So if somebody does something that makes me angry, 01:39:40.280 |
that just promotes this situation to conscious attention 01:39:45.100 |
in a way that is stronger than my not really caring 01:39:49.740 |
And there are things that I think should make us angry 01:39:51.460 |
in the world and there's the behavior of other people 01:39:54.540 |
that should make us angry because we should respond to it. 01:39:59.140 |
If I do something, as a parent, if I do something stupid 01:40:05.800 |
My experience of myself and my beliefs about free will 01:40:15.820 |
in the sense that if I could go back in time, 01:40:17.660 |
I would have actually effectively done otherwise. 01:40:20.380 |
No, I would do, given the same causes and conditions, 01:40:22.820 |
I would do that thing a trillion times in a row, right? 01:40:26.300 |
But, you know, regret and feeling bad about an outcome 01:40:31.300 |
are still important to capacities because like, yeah, 01:40:38.140 |
I desperately want my daughters to be happy and healthy. 01:40:47.060 |
And I do it because I was trying to change a song 01:40:55.600 |
How long do I stew in that feeling of regret, right? 01:41:10.140 |
We're always faced with the question of what to do next, 01:41:18.280 |
And how much wellbeing can we experience while doing it? 01:41:31.120 |
and to help solve the problems of people closest to you? 01:42:04.940 |
I'm always up kind of at the edge of my own capacities here 01:42:09.060 |
and there are all kinds of things that stress me out 01:42:11.100 |
and worry me, and I mean, especially something, 01:42:12.860 |
if it's, you're gonna tell me it's something with, 01:42:18.660 |
like it's very hard for me to be truly a quantumist 01:42:29.620 |
'Cause I mean, the ordinary experience for me 01:42:34.120 |
of responding to what seems like a medical emergency 01:42:38.620 |
for one of my kids is to be obviously super energized 01:42:47.160 |
But then once I'm responding, all of my fear and agitation 01:42:52.980 |
and worry and, oh my God, what if this is really 01:42:56.300 |
something terrible, but finding any of those thoughts 01:43:00.780 |
compelling, that only diminishes my capacity as a father 01:43:11.500 |
- As you're saying this, actually, one guy comes to mind, 01:43:13.540 |
which is Elon Musk, one of the really impressive things 01:43:16.860 |
to me was to observe how many dramatic things 01:43:19.980 |
he has to deal with throughout the day at work, 01:43:22.460 |
but also if you look through his life, family too, 01:43:25.780 |
and how he's very much actually, as you're describing, 01:43:30.500 |
basically a practitioner of this way of thought, 01:43:41.380 |
and there's no reason to sort of linger on the-- 01:43:46.820 |
- Well, so, but he's in a very specific situation, 01:43:57.060 |
even his normal life, but normal life for most people, 01:44:00.800 |
because when you just think of like, you know, 01:44:08.620 |
so what he's seeing is everything that gets to him 01:44:12.420 |
is some kind of emergency, like it wouldn't be 01:44:16.580 |
there's a fire somewhere, so he's constantly responding 01:44:33.180 |
but most of us who are at some kind of cruising altitude 01:44:36.600 |
in terms of our lives, where we're reasonably healthy, 01:44:44.340 |
is reasonably functionable, so I said functionable 01:44:59.860 |
When you're looking at normal human life, right, 01:45:06.400 |
where you're just trying to be happy and healthy, 01:45:10.360 |
and get your work done, there's this default expectation 01:45:18.960 |
we shouldn't be losing vast amounts of our resources, 01:45:23.720 |
we should like, so when something really stark 01:45:27.480 |
like that happens, people don't have that muscle, 01:45:36.240 |
to emergencies all day long, seven days a week, 01:45:40.960 |
in business mode, and so I have a very thick skin, 01:45:44.160 |
this is just another one, what if I'm not expecting 01:45:50.860 |
I mean, honestly, most of us have the default sense 01:45:57.140 |
or that we should, like maybe we're not gonna die, 01:46:02.920 |
because, and you can see it, just like I can see 01:46:11.640 |
because when it isn't solid, when it's a hologram, 01:46:13.660 |
and I just, my fist closes on itself, I'm damn surprised, 01:46:22.160 |
that they're going to die, to find out that they're sick, 01:46:29.620 |
the fact that we are surprised by any of that 01:46:39.020 |
you know, we're perpetually diverting ourselves 01:46:47.300 |
from some facts that should be obvious, right, 01:46:55.000 |
you know, the more, I mean, in the case of death, 01:46:57.720 |
it's a matter of being able to get one's priorities straight, 01:47:01.080 |
I mean, the moment, and this is hard for everybody, 01:47:06.600 |
of paying attention to it, but the moment you realize 01:47:31.600 |
and how you want to experience each one of those days, 01:47:34.520 |
and so I would, back to our jumping off point, 01:47:37.640 |
I would argue that you don't want to feel self-hatred ever, 01:47:58.880 |
that you just made an error, you've embarrassed yourself, 01:48:01.440 |
that something didn't go the way you wanted it to, 01:48:03.760 |
I think you want to treat all of those moments 01:48:08.320 |
the actionable information, it's something to learn, 01:48:11.920 |
oh, I learned that when I prepare in a certain way, 01:48:16.920 |
it works better than when I prepare in some other way, 01:48:20.000 |
or don't prepare, right, so like, yes, lesson learned, 01:48:26.100 |
yeah, I mean, so many of us have spent so much time 01:48:48.320 |
and a lot of just our default way of being with ourselves, 01:48:54.160 |
we're in the company of a real jerk a lot of the time, 01:49:03.280 |
I mean, forget about just your own sense of well-being, 01:49:05.600 |
it can't help but limit what you're capable of 01:49:12.160 |
I just take pride that my jerk, my inner voice jerk, 01:49:15.120 |
is much less of a jerk than somebody like David Goggins, 01:49:18.160 |
who's just like screaming in his ear constantly, 01:49:20.160 |
so I just have a relativist kind of perspective 01:49:29.520 |
the stakes are never quite what you think they are, 01:49:34.880 |
just the difference between being able to see the comedy 01:49:41.960 |
there's this sort of dark star of self-absorption 01:49:49.160 |
and if that's the algorithm you don't want to run, 01:49:54.160 |
so it's like you just want things to be good, 01:50:08.680 |
how do we make this meal that we're all having together 01:50:15.600 |
- And you're saying in terms of propulsion systems, 01:50:19.640 |
to escape the gravitational field of that darkness. 01:50:28.040 |
about ego and fame, which is very interesting. 01:50:31.360 |
The way you're talking, given that you're one 01:50:39.520 |
and minds of our time, and there's a lot of people 01:50:47.360 |
to a certain kind of status where you're like the guru. 01:50:50.720 |
I'm surprised you didn't show up in a robe, in fact. 01:50:55.640 |
That's not the highest status garment one can wear now. 01:50:59.280 |
- The socially acceptable version of the robe. 01:51:02.040 |
- If you're a billionaire, you wear a hoodie. 01:51:04.600 |
- Is there something you can say about managing 01:51:11.480 |
on not creating this, when you wake up in the morning, 01:51:15.400 |
when you look in the mirror, how do you get your ego 01:51:20.400 |
not to grow exponentially, your conception of self 01:51:25.320 |
to grow exponentially, because there's so many people 01:51:27.400 |
feeding that, is there something to be said about this? 01:51:30.200 |
- It's really not hard, because I feel like I have 01:51:33.640 |
a pretty clear sense of my strengths and weaknesses, 01:51:38.640 |
and I don't feel like it's, honestly, I don't feel 01:51:52.640 |
there's so many things I will, given the remaining 01:51:55.640 |
8,000 days at best, I will never get good at. 01:52:15.560 |
on display for me every day, that I appreciate 01:52:25.240 |
I mean, this is why absolute power corrupts absolutely, 01:52:33.680 |
- Right, yeah, no, I-- - Not to compare you to Stalin. 01:52:37.000 |
- Yeah, well, I'm sure there's an inner Stalin 01:52:41.320 |
- Well, we all have, we all carry a baby Stalin with us. 01:52:49.040 |
Those concerns don't map, they don't map onto me 01:52:56.120 |
I like, I'm just, I've been appreciating this 01:53:04.160 |
beginning to understand that there are many people 01:53:17.040 |
a peculiar audience, and the net result of that is 01:53:23.040 |
virtually any time I say anything of substance, 01:53:30.600 |
my real audience, not haters from outside my audience, 01:53:38.080 |
They just like, oh my God, I can't believe you said, 01:53:43.200 |
- They revolt with rigor and intellectual sophistication. 01:53:56.820 |
and I discover that something like 20% of my audience 01:54:00.600 |
just went straight to Trump and couldn't believe 01:54:06.920 |
that Trump was obviously exactly what we needed 01:54:10.040 |
for, to steer the ship of state for the next four years, 01:54:22.640 |
I would hear from people who loved more or less 01:54:25.760 |
everything else I was up to and had for years, 01:54:28.920 |
but everything I said about Trump just gave me pure pain 01:54:36.160 |
But then the same thing happens when I say something 01:54:42.000 |
Anything I say about wokeness, right, or identity politics, 01:55:01.560 |
you're a racist asshole for everything you said 01:55:15.120 |
and they love to hear me talk about physics with physicists, 01:55:21.600 |
I cannot believe you don't see how wrong you are. 01:55:26.760 |
So, but I'm starting to notice that there are other people 01:55:30.360 |
who don't have this experience of having an audience 01:55:37.360 |
They just castigated Trump the same way I did, 01:55:41.680 |
but they never say anything bad about the far left. 01:55:46.880 |
They're all about the insanity of critical race theory now. 01:55:56.760 |
but they never really specified what was wrong with Trump, 01:56:00.400 |
or they thought there was a lot right with Trump, 01:56:04.880 |
And so they have much more homogenized audiences. 01:56:08.600 |
And so my experience, so just to come back to, you know, 01:56:22.060 |
It is a, it's now an experience where basically 01:56:30.980 |
I notice a ton of negativity coming back at me. 01:56:44.000 |
so as not to get this kind of lunatic response 01:56:50.960 |
from like people who are showing all the signs of, 01:56:53.500 |
we've been here for years for a reason, right? 01:56:59.080 |
And so I think, okay, I'm gonna take 10 more minutes 01:57:22.040 |
to yet another racist maniac on the police force, 01:57:28.660 |
Last time I spoke about this, it was pure pain, 01:57:46.360 |
Like I feel like if I could just play my game impeccably, 01:57:51.200 |
the people who actually care what I think will follow me 01:57:55.360 |
when I hit Trump and hit free will and hit the woke 01:58:07.400 |
Like there's such derangement in our information space now 01:58:12.200 |
that I mean, I guess some people could be getting more 01:58:18.040 |
that many of our friends who are in the same game 01:58:22.160 |
have more homogenized audiences and don't get, 01:58:28.240 |
the people who are gonna despise them on this next topic. 01:58:32.280 |
And I would imagine you have a different experience 01:58:46.880 |
because it kind of presumes that it puts people in a bin. 01:58:51.880 |
I think we all have like baby haters inside of us 01:58:55.800 |
and we just apply them and some people enjoy doing that 01:58:59.040 |
more than others for particular periods of time. 01:59:01.880 |
I think you're gonna almost see hating on the internet 01:59:04.800 |
as a video game that you just play and it's fun, 01:59:40.580 |
he already has like a self-critical person inside. 01:59:51.640 |
but I have this very harshly self-critical person 02:00:01.520 |
that's why I check negativity occasionally, not too often. 02:00:05.900 |
I sometimes need to like put a little bit more 02:00:17.940 |
who gain more and more fame lose that ability 02:00:41.600 |
I mean, I really just get in and out on Twitter 02:00:44.020 |
and spend very little time in my ad mentions. 02:00:46.740 |
But, you know, it does, in some ways it feels like a loss 02:00:54.940 |
Like, I mean, I'll check my Twitter ad mentions 02:00:57.220 |
and someone will have said, oh, have you read this article? 02:01:02.020 |
that was like the best article sent to me in a month, right? 02:01:09.980 |
So, but it does, at this point a little goes a long way 02:01:22.220 |
like a fairly Stalinistic immunity to criticism. 02:01:43.140 |
and online is selecting for this performative outrage 02:01:45.980 |
in everybody, everyone's signaling to an audience 02:01:55.740 |
misanthropic, you know, cut of just what it's like out there 02:02:00.740 |
and it, 'cause when you meet people in real life, 02:02:04.100 |
they're great, you know, they're rather often great, 02:02:06.620 |
you know, and it takes a lot to have anything 02:02:10.820 |
like a Twitter encounter in real life with a living person. 02:02:15.080 |
And that's, I think it's much better to have that 02:02:20.980 |
as one's default sense of what it's like to be with people 02:02:30.420 |
- You've produced a special episode with Rob Reed 02:02:33.420 |
on your podcast recently on how bioengineering of viruses 02:02:49.460 |
especially after having thought through that angle, 02:03:02.780 |
- Yeah, well, no, I would put the biggest threat 02:03:06.660 |
at the, another level out, kind of the meta threat 02:03:11.660 |
is our inability to agree about what the threats 02:03:16.700 |
actually are and to converge on strategies for response 02:03:37.160 |
I mean, COVID is just about as benign as it could have been 02:03:44.560 |
when you're talking about a global pandemic, right? 02:03:51.560 |
or it looks like it's killed about 3 million people. 02:04:02.480 |
But I mean, the general shape of it is it's got, 02:04:07.480 |
you know, somewhere around, well, 1% lethality. 02:04:12.940 |
And whatever side of that number it really is on 02:04:18.280 |
in the end, it's not what would in fact be possible 02:04:26.520 |
something with, you know, orders of magnitude, 02:04:30.800 |
And it's just so obvious we are totally unprepared, right? 02:04:35.360 |
We are running this epidemiological experiment 02:04:41.640 |
And then also now per the podcast that Rob Reed did, 02:04:47.720 |
democratizing the tech that will allow us to do this, 02:05:00.880 |
by the sheer fact that they would have been engineered 02:05:04.940 |
with malicious intent, you know, worse than COVID. 02:05:11.640 |
to speak specifically about the United States, 02:05:13.880 |
we have a country here where we can't even agree 02:05:21.440 |
that this is basically a hoax designed to control people. 02:05:33.320 |
and they don't think the deaths have been faked 02:05:49.480 |
than they are of getting vaccinated for COVID, right? 02:05:54.540 |
they're worried about vaccines for COVID, right? 02:05:57.020 |
And the fact that we just can't converge in a conversation 02:06:01.020 |
that we've now had a year to have with one another 02:06:21.060 |
And the fact that there's still an air of mystery 02:06:30.180 |
about solving any other problem that may yet kill us. 02:06:40.820 |
'cause when the threat of COVID looked the most dire, right? 02:06:48.900 |
that looked like the beginning of a zombie movie, right? 02:06:51.540 |
- 'Cause it could have been much, much worse. 02:06:52.780 |
- Yeah, like this is like, this is lethal, right? 02:07:00.060 |
You're gonna, your medical system is in danger of collapse. 02:07:16.420 |
Like, or that's, who knows what's going on in Italy, 02:07:18.520 |
but it has no implications for what's gonna go on 02:07:25.300 |
and you've got people in the middle of the country 02:07:27.700 |
thinking it's no factor, it's not, that's just big city. 02:07:32.700 |
Those are big city problems, or they're faking it, 02:07:42.580 |
that even in what, in the presence of a pandemic 02:07:47.580 |
that looked legitimately scary there in the beginning, 02:07:50.880 |
I mean, it's not to say that it hasn't been devastating 02:07:52.860 |
for everyone who's been directly affected by it, 02:07:56.820 |
but here, for a very long time, we have known 02:08:00.420 |
that we were in a situation that is more benign 02:08:03.760 |
than what seemed like the worst case scenario 02:08:15.540 |
that if we saw the asteroid hurtling toward Earth, 02:08:18.720 |
and everyone agreed that it's gonna make impact, 02:08:23.020 |
and we're all gonna die, then we could get off Twitter 02:08:46.520 |
I mean, something like, I mean, climate change, 02:09:00.520 |
I just think that, I mean, to bring Elon back into this, 02:09:09.680 |
that is better than all the carbon-producing technology, 02:09:24.820 |
of people's selfishness and short-term thinking. 02:09:35.400 |
that they have to make sacrifices to respond to, 02:09:41.080 |
I just think that's the fantasy of a fantasy. 02:10:10.720 |
- Two very, three very parsimonious assumptions, 02:11:00.940 |
to producing human-level intelligence in silico, right? 02:11:04.760 |
Again, an assumption, I'm sure there are a few people 02:11:21.420 |
that we just continue making incremental progress, 02:11:59.440 |
we will be in the presence of superhuman competence 02:12:42.680 |
that are not aligned with a happy coexistence 02:12:46.520 |
with these now more powerful entities than ourselves. 02:12:54.200 |
and this is kind of a rider to that assumption, 02:13:01.380 |
that is perfectly aligned with our well-being. 02:13:05.780 |
And when you think about the consequences of non-alignment, 02:13:21.180 |
and obviously there are cartoon pictures of this 02:13:43.500 |
that's legitimately more intelligent than you are, 02:13:54.780 |
because it had to be to be more intelligent than you are. 02:13:57.100 |
I mean, you built it to be all of those things. 02:14:00.400 |
We just can't find ourselves in a negotiation 02:14:42.300 |
this is, I think, Stuart Russell's cartoon plan 02:14:45.300 |
is to figure out how to tether them to a utility function 02:15:12.980 |
how to make our lives better by our own view of better. 02:15:16.020 |
Now, not to say there wouldn't be a conversation about, 02:15:19.220 |
you know, I mean, because all kinds of things 02:15:21.020 |
we're not seeing clearly about what is better. 02:15:24.300 |
And if we were in the presence of a genie or an oracle 02:15:29.180 |
well, then we presumably would want to hear that 02:15:32.060 |
and we would modify our sense of what to do next 02:15:40.980 |
But I just feel like it is a failure of imagination 02:16:03.180 |
'Cause it is, just to think of how everything on earth 02:16:08.580 |
has to, if they could think about their relationship to us, 02:16:12.140 |
if birds could think about what we're doing, right? 02:16:34.220 |
And obviously much of our behavior is inscrutable to them. 02:16:45.500 |
But if we're building something more intelligent 02:16:48.320 |
than ourselves, by definition, we're building something 02:17:00.060 |
And in ways where we can't necessarily foresee, 02:17:05.060 |
again, perpetually, that they don't just wake up one day 02:17:09.260 |
and decide, okay, well, these humans need to disappear. 02:17:14.180 |
- So I think I agree with most of the initial things 02:17:27.520 |
that we're going to take are going to be positive. 02:17:35.840 |
I believe the way you develop successful AI systems 02:17:40.300 |
will be deeply integrated with human society. 02:17:43.740 |
And for them to succeed, they're going to have to be aligned 02:17:48.060 |
in the way we humans are aligned with each other, 02:17:54.700 |
or I don't see there's such thing as a perfect alignment, 02:17:57.860 |
but they're going to be participating in the dance, 02:18:11.520 |
- But what about an intelligence explosion of some kind? 02:18:16.080 |
- So I believe the explosion will be happening, 02:18:28.140 |
are very intelligent in ways we don't understand. 02:18:37.640 |
our ability to reason about this world, consciousness. 02:18:49.980 |
And I just think there'll be a period of time 02:18:53.720 |
The overnight nature of it will not literally be overnight. 02:19:01.400 |
But just take, draw an analogy from recent successes 02:19:24.160 |
played itself so many times and so successfully 02:19:27.220 |
that it became the best chess playing computer. 02:19:31.320 |
it was not only better than every human being, 02:19:33.720 |
it was better than every previous chess program 02:19:40.560 |
again, we don't have to recapitulate everything about us, 02:20:07.980 |
and then very quickly outperform the last algorithm 02:20:33.020 |
tend to have bad intuitions for exponentiation, right? 02:20:39.160 |
who still couldn't get their minds around the fact that, 02:20:43.120 |
you know, an exponential is really surprising, 02:20:46.720 |
I mean, things double, and double, and double, 02:20:49.400 |
and you don't notice much of anything changes, 02:20:53.400 |
two stages of doubling swamp everything, right? 02:21:04.600 |
between what we're seeing for the more tractable, 02:21:17.920 |
it was impossible to think we were gonna make headway 02:21:33.360 |
- Yeah, and actually, Stuart Russell was behind 02:21:36.600 |
the people that were saying it's unattainable, 02:21:48.080 |
which is what chess is, meaning like just thinking. 02:21:58.300 |
for AI to get to the point where it's super intelligent, 02:22:03.700 |
it's going to have to go through the funnel of society. 02:22:12.880 |
- But you're talking about like actually hooking us up 02:22:16.800 |
we're going to be the brainstem to the robot overlords? 02:22:31.120 |
that both US and China are participating in now, 02:22:34.760 |
that in order to develop them and for them to become, 02:22:47.920 |
into human beings doing the strategic action. 02:22:51.720 |
They're going to have to work alongside with each other. 02:23:00.080 |
that are placed on them as they develop over time, 02:23:03.040 |
because they're going to have to convince humans. 02:23:05.320 |
Ultimately, they're going to have to convince humans 02:23:12.360 |
- Well, self-driving cars is a good test case here. 02:23:15.680 |
'Cause like, obviously we've made a lot of progress 02:23:19.560 |
and we can imagine what total progress would look like. 02:23:25.840 |
And it's answering, it's canceling in the US, 02:23:29.000 |
40,000 deaths every year based on ape-driven cars, right? 02:23:38.360 |
But now we can dimly see the prospect of an alternative, 02:23:41.600 |
which if it works in a super intelligent fashion, 02:23:45.720 |
maybe we go down to zero highway deaths, right? 02:23:48.920 |
Or certainly we'd go down by orders of magnitude, right? 02:23:51.920 |
So maybe we have 400 rather than 40,000 a year. 02:23:56.920 |
And it's easy to see that there's not a missile. 02:24:05.360 |
So obviously this is not an example of super intelligence. 02:24:09.200 |
but the alignment problem isn't so obvious there, 02:24:14.200 |
but there are potential alignment problems there. 02:24:17.640 |
Like, so like just imagine if some woke team of engineers 02:24:22.520 |
decided that we have to tune the algorithm some way. 02:24:33.960 |
Now we have a car that can tell what race you are, right? 02:24:40.360 |
because white people have had so much privilege 02:24:54.200 |
And you didn't even know you engineered it that way, right? 02:25:05.120 |
and you didn't even intend to, or you could intend to, right? 02:25:07.880 |
And it would be aligned with some people's values, 02:25:11.800 |
But it's like, there are interesting problems 02:25:20.520 |
- But there's a leap that I just think it'd be exact, 02:25:43.000 |
We'll all of a sudden start murdering pedestrians 02:26:01.280 |
like stuff that DeepMind is doing with protein folding. 02:26:09.520 |
which is the engineering of viruses using machine learning. 02:26:12.760 |
The engineering of vaccines using machine learning. 02:26:16.880 |
The engineering of, yeah, for research purposes, 02:26:33.560 |
Not always, much more likely to be supervision. 02:26:50.640 |
is much higher than the number of dumb people, 02:27:03.260 |
- Except we also, we have to add another group of people 02:27:20.440 |
We already know that some of our smartest people 02:27:27.700 |
was whinging about before the Large Hadron Collider 02:27:34.720 |
We know there are people who are entertaining experiments, 02:27:44.880 |
that they're going to create a black hole in the lab 02:27:57.200 |
And so it was with, you know, the Trinity test, 02:28:04.000 |
checking their calculations, and they were off. 02:28:06.640 |
We did nuclear tests where we were off significantly 02:28:14.960 |
And sometimes they flip the switch not to win a world war, 02:28:32.060 |
It's on the other side of this switch, right? 02:28:49.640 |
even when we're contemplating some of the deepest 02:28:52.860 |
and most interesting and most universal problems 02:29:03.100 |
"The Double Helix," right, about them, you know, 02:29:28.940 |
this scientific breakthrough is human competition, 02:29:34.140 |
It's like, I'm gonna get there before Linus Pauling, 02:29:39.260 |
so much of his bandwidth is captured by that, right? 02:29:43.180 |
Now, that becomes more and more of a liability 02:29:48.820 |
when you're talking about producing technology 02:29:51.060 |
that can change everything in an instant, you know? 02:29:57.780 |
you know, we're just at a different moment in human history. 02:30:01.780 |
We're not, when we're doing research on viruses, 02:30:16.460 |
or weaponize that virus, or it's just, I don't know. 02:30:51.700 |
and you've had a fun conversation about religion 02:31:05.060 |
So is there something, like a charitable summary 02:31:14.340 |
Is there something maybe after that conversation 02:31:29.060 |
Is there something that you can kind of pull out 02:31:31.340 |
from those conversations, or is it to be continued? 02:31:35.980 |
so he thinks that many of our traditional religions 02:31:40.980 |
and many of our traditional religious beliefs 02:31:53.780 |
that we pull at that fabric at our peril, right? 02:32:00.420 |
Like if you start just unraveling Christianity 02:32:07.260 |
or any other traditional set of norms and beliefs, 02:32:11.860 |
you may think you're just pulling out the unscientific bits, 02:32:17.780 |
to which everything you care about is attached, 02:32:28.460 |
there's so much downside to the unscientific bits, 02:32:31.380 |
and it's so clear how we could have a 21st century 02:32:39.900 |
that we really can radically edit these traditions. 02:32:58.900 |
and the Golden Rule, which doesn't originate with him, 02:33:09.740 |
It's no less useful than it was 2000 years ago, 02:33:12.780 |
but we don't have to believe he was born of a virgin 02:33:14.940 |
or coming back to raise the dead or any of that other stuff. 02:33:18.300 |
And we can be honest about not believing those things, 02:33:24.620 |
'Cause on those fronts, I view the downside to be so obvious 02:33:34.800 |
competing dogmatisms on offer to be so non-functional, 02:33:47.120 |
and should be far more iconoclastic than he wants to be. 02:33:51.600 |
Now, none of this is to deny much of what he argues for, 02:34:00.500 |
Clearly, stories are powerful, and we want good stories. 02:34:03.400 |
We want our lives, we want to have a conversation 02:34:06.160 |
with ourselves and with one another about our lives 02:34:15.480 |
And if you want some of those stories to sound like myths, 02:34:29.200 |
about what we have every reason to believe is true 02:34:36.300 |
I certainly don't feel that I need to do it personally, 02:34:41.080 |
why would I think that billions of other people 02:34:51.220 |
don't have the advantages that I have had in my life. 02:34:54.240 |
The billions of other people are not as well-educated, 02:34:59.280 |
they need to be told that Jesus is gonna solve 02:35:14.000 |
if you just visualize what you want, you're gonna get it. 02:35:23.880 |
that really is food for the better part of humanity, 02:35:31.000 |
And I don't know if Jordan would agree with that, 02:35:48.600 |
people would behave the way I would hope they would behave 02:35:52.860 |
and be aligned, more aligned than they are now. 02:35:58.800 |
when you just let ancient religious certainties 02:36:07.840 |
We've been struggling to get out of that world 02:36:21.860 |
And we know what happens when those religions 02:36:25.280 |
become kind of pseudo religions and political religions. 02:36:29.760 |
So this is where I'm sure Jordan and I would debate. 02:36:33.840 |
He would say that Stalin was a symptom of atheism, 02:36:44.040 |
and the experiment with communism or with Stalinism 02:36:48.240 |
or with Nazism was not that there was so much 02:36:53.240 |
scientific rigor and self-criticism and honesty 02:36:56.800 |
and introspection and judicious use of psychedelics. 02:37:01.800 |
I mean, like that was not the problem in Hitler's Germany 02:37:16.840 |
that capture a similar kind of mob-based dogmatic energy 02:37:40.280 |
So communism was, I mean, having grown up in the Soviet Union 02:37:53.920 |
and to the ideologies of communism that religious or not, 02:38:00.320 |
I could just say it's stories that are viral and sticky. 02:38:08.960 |
but the question is whether science and reason 02:38:41.040 |
in terms of our understanding of what the hell is going on. 02:38:48.280 |
I mean, I don't know if you've been on the receiving end 02:38:49.960 |
of recent rumors about our conversation about UFOs 02:38:54.960 |
very likely changing in the near term, right? 02:38:57.640 |
But like there was just a Washington Post article 02:39:04.440 |
and perhaps you have, I know other people in our orbit 02:39:08.160 |
have people who are claiming that the government 02:39:25.360 |
whoever's left standing when the music stops, 02:39:28.520 |
it's not going to be a comfortable position to be in 02:39:49.080 |
the Office of Naval Intelligence and the Pentagon 02:39:52.320 |
are very likely to say to Congress at some point 02:39:55.600 |
in the not too distant future that we have evidence 02:40:02.780 |
that seems like it can't possibly be of human origin, right? 02:40:17.240 |
but that is such a powerfully strange circumstance 02:40:22.920 |
I mean, it's just, what are we gonna do with that? 02:40:34.960 |
of the US government, of all of our intelligence, 02:40:44.080 |
it's too, there's too much data to suggest that it's a hoax. 02:40:51.120 |
whatever data they actually have, there's too much of it. 02:40:58.640 |
and there's no way it's the Chinese or the Russians 02:41:06.060 |
That should arrest our attention, you know, collectively 02:41:12.280 |
to a degree that nothing in our lifetime has. 02:41:30.400 |
Obama's tan suit did, you know, a bunch of years ago. 02:41:34.660 |
It's just, it's, who knows how we'll respond to that. 02:41:43.440 |
to tell ourselves an honest story about what's going on 02:42:24.080 |
because it is the thing that allows you to course correct. 02:42:28.040 |
It is the thing that allows you to hope at least 02:42:40.360 |
- Yeah, it is a little bit sad to imagine that 02:42:47.320 |
they would be too preoccupied with political bickering 02:43:15.680 |
they don't want like layers and layers of like fakeness. 02:43:19.660 |
And I'm hoping that means that will directly lead 02:43:24.760 |
to a greater hunger for reality and reason and truth. 02:43:42.600 |
there's so many questions, there's so many mysteries, 02:43:45.480 |
This is our best available, like a best guess. 02:43:49.400 |
And we have a lot of evidence that supports that guess, 02:43:56.400 |
I think there's a hunger for that in the world 02:44:09.520 |
with the virology and all those kinds of things 02:44:13.360 |
There's a lot of, and biology is super messy. 02:44:21.040 |
I think I'm hoping will change people's hunger 02:44:29.560 |
- Yeah, well, so much of this is probabilistic. 02:44:31.880 |
It's so much of what can seem dogmatic scientifically 02:44:35.080 |
is just, you're placing a bet on whether it's worth 02:44:40.080 |
reading that paper or rethinking your presuppositions 02:44:46.720 |
It's like, it's not a fundamental closure to data. 02:44:49.960 |
It's just that there's so much data on one side 02:44:55.560 |
in terms of your understanding of what you think 02:44:57.800 |
you understand about the nature of the world. 02:44:59.920 |
If this new fact were so that you can pretty quickly say, 02:45:12.400 |
to new conversations, new evidence, new data, new argument 02:45:17.400 |
but it's really not, it's just, it really is just triaging 02:45:21.320 |
It's just like, okay, you're telling me that your best 02:45:30.840 |
Let me know when that person has gone into a lab 02:45:34.120 |
Like, I don't need, this is not the place where I need 02:45:37.120 |
to spend the rest of my day figuring out if your buddy 02:45:44.720 |
I think it does too often sound like you're completely 02:45:47.680 |
closed off to ideas as opposed to saying like, 02:45:50.360 |
as opposed to saying that there's a lot of evidence 02:45:56.040 |
in support of this but you're still open-minded 02:46:10.480 |
Just, it's that energy of being open-minded and curious 02:46:19.080 |
I'm not saying allocate time to exploring all those things 02:46:24.560 |
And there's a way to communicate that I think 02:46:32.640 |
I've been recently talking a lot with John Donaher 02:46:40.040 |
- Talk about somebody who's good at what he does. 02:46:42.720 |
- And he, speaking of somebody who's open-minded, 02:46:50.360 |
a lot of people believed in the Jiu-Jitsu world 02:46:56.600 |
And he was somebody that, inspired by the open-mindedness 02:47:03.960 |
"Why do you only consider half the human body 02:47:12.460 |
Anyway, I do that absurd transition to ask you 02:47:15.900 |
because you're also a student of Brazilian Jiu-Jitsu. 02:47:23.800 |
what you've learned from grappling, from the martial arts? 02:47:46.640 |
Right, like there's no room for bullshit, right? 02:47:57.800 |
is that the difference between knowing what's going on 02:48:08.040 |
is as wide as it is in any thing in human life. 02:48:24.680 |
It's like, here's the thing that got you killed 02:48:27.480 |
and here's how to prevent it from happening to you 02:48:40.200 |
and then having it remedied with the actual technique. 02:48:59.040 |
and you're on the bottom and you want to get away. 02:49:04.880 |
Your intuitions about how to do this are terrible, 02:49:08.320 |
even if you've done some other martial art, right? 02:49:16.120 |
It's like you have access to a completely different physics. 02:49:26.160 |
can be much more like jujitsu than it tends to be, right? 02:49:30.200 |
And I think we should all have a much better sense 02:49:50.240 |
Now, the problem with debating most other topics 02:50:02.320 |
and it's obvious to the unintelligent audience 02:50:11.680 |
And so you have a lot of zombies walking around 02:50:34.600 |
And science, when it works, is a lot like jujitsu. 02:50:38.080 |
I mean, science, when you falsify a thesis, right? 02:50:46.000 |
when you think it's triple-stranded or whatever, 02:51:01.000 |
of interest for self-defense and the sport of it. 02:51:07.880 |
it's a language and an argument you're having 02:51:16.820 |
Like there's, first of all, it cancels any role of luck 02:51:22.820 |
in a way that most other athletic feats don't. 02:51:31.860 |
you can be 75 feet away and hurl it at the basket 02:51:36.260 |
and you might make it and you could convince yourself 02:51:40.760 |
that you have some kind of talent for basketball, right? 02:51:45.100 |
with a real jujitsu practitioner when you're not one 02:52:08.400 |
the usual range of uncertainty and self-deception 02:52:25.400 |
that accompanies whatever other pursuit you have in life. 02:52:28.200 |
I'm not sure if there's anything like jujitsu 02:52:31.320 |
where you could just systematically go into a place 02:52:45.340 |
which is why it's, we had this earlier question 02:52:51.800 |
I'm very much relying on jujitsu in my own life 02:52:56.300 |
as a place where I can always go to have my ego in check. 02:53:13.080 |
like even running, doing something that's way too hard 02:53:15.980 |
for me and then pushing through, that's somehow humbling. 02:53:22.340 |
where you kind of see something really powerful, 02:53:31.220 |
and you realize there's something much more powerful 02:53:35.460 |
that there's no way to, that you're just like the spec, 02:53:45.640 |
And jujitsu does that better than anything else for me. 02:53:52.700 |
is it truly the kind of the final right answer 02:53:58.840 |
Because if you just put jujitsu into an MMA frame 02:54:07.180 |
a lot of unpleasant surprises to discover there, right? 02:54:09.880 |
Like somebody who thinks all you need is jujitsu 02:54:12.100 |
to win the UFC gets punched in the face a lot, 02:54:28.340 |
Like that opens the door to certain kinds of delusions. 02:54:32.640 |
But the analogy to martial arts is fascinating 02:54:37.020 |
because on the other side, we have endless testimony now 02:54:41.960 |
of fake martial arts that don't seem to know they're fake 02:54:53.300 |
because people send them to him all the time. 02:54:59.060 |
where the master isn't even touching the students 02:55:06.060 |
which you would think maybe is just a performance 02:55:21.540 |
who you saw flipping his students endlessly by magic 02:55:26.740 |
issued a challenge to the wide world of martial artists 02:55:30.420 |
and someone showed up and just punched him in the face 02:55:35.280 |
clearly he believed his own publicity at some point, right? 02:56:08.420 |
who is ready to issue a challenge to the world 02:56:13.980 |
- Yeah, that's human nature on clear display. 02:56:34.760 |
Again, asking from an engineering perspective 02:56:39.860 |
I mean, it is something that we should want to build 02:56:52.860 |
people can mean many things by love, I think. 02:56:55.540 |
I think that what we should mean by it most of the time 02:56:58.980 |
is a deep commitment to the wellbeing of those we love. 02:57:06.980 |
with really wanting the other person to be happy 02:57:15.500 |
So at bottom, you're on the same team emotionally, 02:57:20.500 |
even when you might be disagreeing more superficially 02:57:24.460 |
about something or trying to negotiate something. 02:57:33.580 |
for love to actually be manifest in that moment. 02:57:37.500 |
- See, I have a different, just sorry to interrupt. 02:57:41.700 |
I don't know if you've ever seen "March of the Penguins." 02:57:52.860 |
And love is like the huddling of the two penguins for warmth. 02:57:59.140 |
you're basically escaping the cruelty of life 02:58:15.140 |
as we're surrounded by basically the absurdity of life 02:58:27.100 |
I mean, there is the warmth component, right? 02:58:34.500 |
Otherwise, you wouldn't, it wouldn't be compelling, right? 02:58:39.060 |
So it's not that you have two different modes, 02:58:58.960 |
I mean, again, love doesn't have to be as personal 02:59:04.080 |
there's your actual spouse or your family or your friends, 02:59:08.740 |
but potentially you could feel love for strangers 02:59:15.460 |
that they not suffer and that their hopes and dreams 02:59:30.940 |
a total stranger's face light up in happiness, 02:59:33.740 |
that can become more and more contagious to you. 02:59:42.180 |
And it's just like, so it really is not zero sum. 02:59:46.740 |
the light bulb of joy goes off over their head, 02:59:54.700 |
And it's not just, and you're no longer keeping score, 02:59:57.740 |
you're no longer feeling diminished by their success. 03:00:00.340 |
It's just like that's, their success becomes your success 03:00:07.100 |
You're not, there's no miserly attitude around happiness. 03:00:21.460 |
we're devoting all of this time and attention to 03:00:25.980 |
It does have that sense of refuge from the storm. 03:00:34.220 |
these are the people who you're most in it together with, 03:00:36.820 |
you know, or when some real condition of uncertainty 03:00:54.620 |
because we know we're going to lose everyone we love. 03:00:57.700 |
We know, or they're going to lose us first, right? 03:01:02.460 |
in the end it's not even an antidote for that problem. 03:01:10.380 |
I mean, we get to have this amazing experience 03:01:25.340 |
we really appear to make the most of that, right? 03:01:34.500 |
You know, you're just, you've got your hobbies 03:01:36.740 |
and your interests and you're captivated by all that. 03:01:44.220 |
this is a domain where somebody else's wellbeing 03:02:04.020 |
you know, of course you would take your child's pain 03:02:06.900 |
Like that, you don't even have to do the math on that. 03:02:21.220 |
in ways that reveal that there's just way more space 03:02:31.380 |
- Do you think we'll ever build robots that we can love 03:02:41.500 |
You know, I think that Turing test will be passed. 03:02:44.580 |
Whether, what will actually be going on on the robot side 03:03:01.460 |
you know, irresistibly lovable robots that seem to love us. 03:03:32.120 |
- Isn't that what all relationships are like? 03:03:37.300 |
- Yeah, it depends which box you're talking about. 03:03:46.480 |
becomes a little scary when you think of the prospect 03:04:27.480 |
I mean, people are already good enough at deceiving us. 03:04:29.720 |
It's very hard to tell when somebody's lying. 03:04:33.360 |
that could give a facial display of any emotion it wants 03:04:45.200 |
of emotion in robots in the year, you know, 2070, 03:04:59.040 |
It's not like Kasparov is going to get lucky next week 03:05:03.160 |
against the best, against, you know, alpha zero 03:05:06.560 |
or whatever the best algorithm is at the moment. 03:05:19.040 |
It's not going to be like, you know, four games to seven. 03:05:23.360 |
It's going to be human zero until the end of the world. 03:05:29.240 |
- See, I don't know, I don't know if love is like chess. 03:05:44.080 |
incredibly display love and is super intelligent 03:05:49.080 |
and we're not, again, this stipulates a few things, 03:05:55.200 |
I mean, we're out of the uncanny valley, right? 03:05:59.360 |
where you're looking at his face and you think, 03:06:05.880 |
And it's, it will be like doing arithmetic on your phone. 03:06:13.120 |
It's not going to be, you're not left thinking, 03:06:15.460 |
is it really going to get it this time if I divide by seven? 03:06:22.480 |
- See, I don't know about that because if you look at chess, 03:06:31.320 |
There's no, they're not part of the competition. 03:06:33.560 |
They don't do it for fun except to study the game of chess. 03:06:36.000 |
You know, the highest level chess players do that. 03:06:39.820 |
So in order for AI to get integrated to where 03:06:43.560 |
you would rather play chess against an AI system. 03:06:47.800 |
No, I'm not saying, I wasn't weighing in on that. 03:06:53.240 |
to be in relationship to something that can seem 03:06:57.620 |
to be feeling anything that a human can seem to feel 03:07:23.040 |
Without any serving it up, without any explanation, 03:07:39.080 |
paying sufficient attention to any present moment 03:07:43.960 |
such that there's no basis upon which to pose that question. 03:07:52.440 |
It's not a matter of having more information. 03:07:54.960 |
It's having more engagement with reality as it is 03:07:59.960 |
in the present moment or consciousness as it is 03:08:12.480 |
That question only gets asked when you're abstracted away 03:08:29.120 |
Like why am I repeating the same pleasures every day? 03:08:40.800 |
That's a moment where you're not actually having 03:08:54.720 |
Like you're in a relationship with somebody who you know, 03:09:00.280 |
This is the person you're living your life with, 03:09:03.040 |
but you don't actually feel good together, right? 03:09:09.200 |
of where attention hasn't found a good enough reason 03:09:18.040 |
so as to obviate any concern like that, right? 03:09:21.760 |
And that's why meditation is this kind of superpower 03:09:38.520 |
so that the present moment can become good enough 03:09:42.600 |
to demand your attention in a way that seems fulfilling, 03:09:56.000 |
you know, it's been over a year since I've trained. 03:10:12.600 |
You always think that your life has to change, 03:10:18.040 |
so that you can finally have a good enough excuse 03:10:22.200 |
to truly, to just be here and here is enough, 03:10:32.600 |
I mean, meditation is another name for the discovery 03:10:52.960 |
that mysteriously unravels the moment you notice it. 03:10:58.200 |
and the moment expands and becomes more diaphanous 03:11:06.160 |
that this isn't the best moment of your life, right? 03:11:13.720 |
It's not like, oh, this tastes like chocolate. 03:11:16.960 |
You know, this is the most chocolatey moment of my life. 03:11:18.840 |
No, it's just the sense data don't have to change. 03:11:22.880 |
But the sense that there is some kind of basis for doubt 03:11:28.720 |
about the rightness of being in the world in this moment 03:11:38.960 |
so the kind of the meta answer to that question, 03:11:47.280 |
And to, whenever I notice I'm not in that mode, 03:12:04.720 |
Because we all have reasons why we can't be fulfilled 03:12:09.160 |
It's like this, got all these outstanding things 03:12:14.520 |
there's that thing that's happening later today 03:12:19.240 |
Whatever it is, we're constantly deferring our sense of 03:12:26.520 |
You know, this is not a dress rehearsal, this is the show. 03:12:31.960 |
And we just have these moments on the calendar 03:12:34.680 |
where we think, okay, this is where it's all going to land. 03:12:37.040 |
It's that vacation I planned with my five best friends. 03:12:42.560 |
and now we're going and here we are on the beach together. 03:12:46.340 |
Unless you have a mind that can really pay attention, 03:12:59.720 |
the way you dreamed you would enjoy them when they arrive. 03:13:17.680 |
It's like, there's just a mirage-like quality 03:13:20.640 |
to every future attainment and every future breakthrough 03:13:33.840 |
Like you won't, you don't arrive until you cease 03:13:41.620 |
I mean, we're constantly, we're stepping over the thing 03:13:46.840 |
that we think we're seeking, in the act of seeking it. 03:13:53.160 |
I mean, there is a, there's this paradox which, 03:14:00.100 |
but it's like you can't actually become happy. 03:14:09.480 |
it's the illusion that your future being happy 03:14:12.880 |
can be predicated on this act of becoming in any domain. 03:14:26.000 |
or getting in better shape, or whatever the thing is, 03:14:31.000 |
whatever the contingency of your dissatisfaction 03:14:56.640 |
the superficial things that are obviously good to do. 03:14:59.200 |
But the sense that your well-being is over there 03:15:13.600 |
- Well, there's a sense in which, in this conversation, 03:15:20.000 |
I've actually experienced many of those things, 03:15:38.000 |
So like, on the calendar, literally, you know, 03:15:47.040 |
Because I always felt, again, tying our free will thing, 03:15:52.200 |
And it's one of those manifestation things, or something, 03:15:59.200 |
and I manipulated you into having this conversation. 03:16:01.380 |
So it was a, I mean, I don't know what the purpose 03:16:07.760 |
So in that sense, I mean, all of that to say, 03:16:27.680 |
You're doing something more and more indispensable 03:16:34.240 |
And you're doing it differently than Rogan's doing it, 03:16:43.620 |
Thanks for listening to this conversation with Sam Harris, 03:16:52.460 |
Check them out in the description to support this podcast. 03:16:56.240 |
And now let me leave you with some words from Sam Harris 03:17:07.560 |
Thank you for listening, and hope to see you next time.