
Sam Harris: Consciousness, Free Will, Psychedelics, AI, UFOs, and Meaning | Lex Fridman Podcast #185


Chapters

0:00 Introduction
1:48 Where do thoughts come from?
7:49 Consciousness
25:21 Psychedelics
34:44 Nature of reality
51:40 Free will
110:25 Ego
119:29 Joe Rogan
122:30 How will human civilization destroy itself?
129:57 AI
150:40 Jordan Peterson
158:43 UFOs
166:32 Brazilian Jiu Jitsu
176:17 Love
187:21 Meaning of life

Transcript

00:00:00.000 | The following is a conversation with Sam Harris,
00:00:02.600 | one of the most influential
00:00:03.920 | and pioneering thinkers of our time.
00:00:06.040 | He's the host of the Making Sense podcast
00:00:08.560 | and the author of many seminal books
00:00:10.440 | on human nature and the human mind,
00:00:12.840 | including The End of Faith, The Moral Landscape,
00:00:15.840 | Lying, Free Will, and Waking Up.
00:00:18.560 | He also has a meditation app called Waking Up
00:00:21.640 | that I've been using to guide my own meditation.
00:00:24.760 | Quick mention of our sponsors,
00:00:26.760 | National Instruments, Belcampo, Athletic Greens, and Linode.
00:00:31.520 | Check them out in the description to support this podcast.
00:00:34.800 | As a side note, let me say that Sam
00:00:36.800 | has been an inspiration to me
00:00:39.200 | as he has been for many, many people,
00:00:41.320 | first from his writing, then his early debates,
00:00:44.840 | maybe 13, 14 years ago on the subject of faith,
00:00:48.280 | his conversations with Christopher Hitchens,
00:00:51.120 | and since 2013, his podcast.
00:00:54.400 | I didn't always agree with all of his ideas,
00:00:56.880 | but I was always drawn to the care and depth
00:00:59.680 | of the way he explored those ideas,
00:01:02.000 | the calm and clarity amid the storm of difficult,
00:01:05.400 | at times controversial discourse.
00:01:07.640 | I really can't express in words how much it meant to me
00:01:10.640 | that he, Sam Harris, someone who I've listened to
00:01:14.120 | for many hundreds of hours,
00:01:16.000 | would write a kind email to me saying he enjoyed this podcast
00:01:20.560 | and more that he thought I had a unique voice
00:01:23.560 | that added something to this world.
00:01:25.920 | Whether it's true or not, it made me feel special
00:01:28.600 | and truly grateful to be able to do this thing
00:01:31.080 | and motivated me to work my ass off
00:01:33.680 | to live up to those words.
00:01:35.440 | Meeting Sam and getting to talk with him
00:01:37.920 | was one of the most memorable moments of my life.
00:01:41.700 | This is the Lex Fridman Podcast,
00:01:44.040 | and here is my conversation with Sam Harris.
00:01:47.640 | I've been enjoying meditating
00:01:50.200 | with the Waking Up app recently.
00:01:52.720 | It makes me think about the origins
00:01:54.760 | of cognition and consciousness,
00:01:56.840 | so let me ask, where do thoughts come from?
00:02:01.000 | - Well, that's a very difficult question to answer.
00:02:04.640 | Subjectively, they appear to come from nowhere, right?
00:02:09.440 | I mean, it's just they come out of some kind of mystery
00:02:14.440 | that is at our backs subjectively, right?
00:02:17.080 | So, which is to say that if you pay attention
00:02:22.000 | to the nature of your mind in this moment,
00:02:25.280 | you realize that you don't know
00:02:26.960 | what you're going to think next, right?
00:02:29.120 | Now, you're expecting to think something
00:02:30.400 | that seems like you authored it, right?
00:02:33.040 | You're not, unless you're schizophrenic
00:02:35.840 | or you have some kind of thought disorder
00:02:38.440 | where your thoughts seem fundamentally foreign to you,
00:02:41.920 | they do have a kind of signature of selfhood
00:02:45.400 | associated with them, and people readily identify with them.
00:02:50.040 | They feel like what you are.
00:02:51.880 | I mean, this is the thing,
00:02:52.920 | this is the spell that gets broken with meditation.
00:02:56.040 | Our default state is to feel identical
00:03:00.120 | to the stream of thought, right?
00:03:02.600 | Which is fairly paradoxical,
00:03:05.200 | 'cause how could you as a mind, as a self,
00:03:08.960 | you know, if there were such a thing as a self,
00:03:11.840 | how could you be identical to the next piece of language
00:03:15.920 | or the next image that just springs into conscious view?
00:03:21.920 | But, and you know, meditation is ultimately
00:03:26.440 | about examining that point of view closely enough
00:03:28.920 | so as to unravel it and feel the freedom
00:03:31.960 | that's on the other side of that identification.
00:03:34.360 | But subjectively, thoughts simply emerge, right?
00:03:39.360 | And you don't think them before you think them, right?
00:03:43.080 | There's this first moment where, you know,
00:03:46.000 | I mean, just anyone listening to us or watching us now
00:03:48.600 | could perform this experiment for themselves.
00:03:50.920 | I mean, just imagine something or remember something.
00:03:54.440 | You know, just pick a memory, any memory, right?
00:03:56.600 | You've got a storehouse of memory,
00:03:58.240 | just promote one to consciousness.
00:04:00.920 | Did you pick that memory?
00:04:04.760 | I mean, let's say you remembered breakfast yesterday
00:04:07.600 | or you remembered what you said to your spouse
00:04:10.040 | before leaving the house,
00:04:11.080 | or you remembered what you watched on Netflix last night,
00:04:13.880 | or you remembered something that happened to you
00:04:16.000 | when you were four years old, whatever it is, right?
00:04:18.800 | First it wasn't there and then it appeared.
00:04:24.200 | And that is not a, I mean, I'm sure we'll get to the topic
00:04:28.480 | of free will ultimately.
00:04:30.920 | That's not evidence of free will, right?
00:04:33.200 | - Why are you so sure, by the way?
00:04:35.440 | It's very interesting.
00:04:36.280 | - Well, through no free will of my own, yeah.
00:04:38.500 | Everything just appears, right?
00:04:41.760 | And what else could it do?
00:04:43.640 | And so that's the subjective side of it.
00:04:45.480 | Objectively, we have every reason to believe
00:04:48.000 | that many of our thoughts, all of our thoughts
00:04:50.840 | are at bottom what some part of our brain is doing
00:04:55.840 | neurophysiologically.
00:04:58.080 | I mean, that these are the products
00:05:00.000 | of some kind of neural computation
00:05:02.520 | and neural representation,
00:05:05.160 | and we're talking about memories.
00:05:06.880 | - Is it possible to pull at the string of thoughts
00:05:10.640 | to try to get to its root,
00:05:14.080 | to try to dig in past the obvious surface,
00:05:17.640 | subjective experience of like the thoughts
00:05:20.240 | pop out of nowhere?
00:05:21.640 | Is it possible to somehow get closer to the roots
00:05:24.440 | of where they come out of from the firing of the cells?
00:05:28.720 | Or is it a useless pursuit to dig into that direction?
00:05:32.480 | - Well, you can get closer to many, many subtle contents
00:05:37.480 | in consciousness, right?
00:05:39.960 | So you can notice things more and more clearly
00:05:42.120 | and have a landscape of mind open up
00:05:44.360 | and become more differentiated and more interesting.
00:05:47.800 | And if you take psychedelics, it opens up wide
00:05:51.800 | depending on what you've taken and the dose,
00:05:54.000 | it opens in directions and to an extent
00:05:56.040 | that very few people imagine would be possible,
00:05:59.760 | but for having had those experiences.
00:06:01.960 | But this idea of you getting closer to something,
00:06:06.960 | to the datum of your mind,
00:06:09.080 | or to something of interest in there,
00:06:11.300 | or something that's more real,
00:06:12.800 | is ultimately undermined because there's no place
00:06:17.400 | from which you're getting closer to it.
00:06:19.560 | There's no your part of that journey, right?
00:06:23.440 | We tend to start out, whether it's in meditation
00:06:28.080 | or in any kind of self-examination or taking psychedelics,
00:06:33.080 | we start out with this default point of view
00:06:35.960 | of feeling like we're kind of the rider
00:06:41.380 | on the horse of consciousness,
00:06:42.580 | or we're the man in the boat
00:06:45.340 | going down the stream of consciousness, right?
00:06:47.780 | But so we're differentiated
00:06:49.700 | from what we know cognitively, introspectively.
00:06:54.700 | But that feeling of being differentiated,
00:06:58.060 | that feeling of being a self
00:06:59.460 | that can strategically pay attention
00:07:01.620 | to some contents of consciousness
00:07:03.260 | is what it's like to be identified
00:07:06.640 | with some part of the stream of thought
00:07:09.420 | that's going uninspected, right?
00:07:10.840 | Like it's a false point of view.
00:07:13.540 | And when you see that and cut through that,
00:07:16.700 | then this sense of this notion of going deeper
00:07:21.580 | kind of breaks apart because really there is no depth.
00:07:25.460 | Ultimately, everything is right on the surface.
00:07:27.180 | Everything, there's no center to consciousness.
00:07:28.980 | There's just consciousness and its contents.
00:07:30.420 | And those contents can change vastly.
00:07:33.120 | Again, if you drop acid, the contents change.
00:07:37.900 | But in some sense, that doesn't represent
00:07:42.600 | a position of depth versus,
00:07:45.800 | the continuum of depth versus surface has broken apart.
00:07:49.200 | - So you're taking as a starting point
00:07:51.520 | that there is a horse called consciousness
00:07:54.280 | and you're riding it.
00:07:55.740 | And the actual riding is very shallow.
00:07:57.680 | This is all surface.
00:07:59.720 | So let me ask about that horse.
00:08:02.520 | What's up with the horse?
00:08:04.160 | What is consciousness?
00:08:07.280 | From where does it emerge?
00:08:09.780 | How fundamental is it to the physics of reality?
00:08:13.020 | How fundamental is it to what it means to be human?
00:08:16.560 | And I'm just asking for a friend
00:08:18.620 | so that we can build it
00:08:20.180 | in our artificial intelligence systems.
00:08:22.640 | - Yeah, well, that remains to be seen
00:08:25.660 | if we will build it purposefully or just by accident.
00:08:30.180 | It's a major ethical problem potentially.
00:08:35.620 | That, I mean, my concern here is that we may in fact
00:08:39.120 | build artificial intelligence that passes the Turing test,
00:08:44.020 | which we begin to treat not only as super intelligent
00:08:47.040 | because it obviously is and demonstrates that,
00:08:50.880 | but we begin to treat it as conscious
00:08:53.160 | because it will seem conscious.
00:08:54.900 | We will have built it to seem conscious.
00:08:56.800 | And unless we understand exactly how consciousness emerges
00:09:01.400 | from physics, we won't actually know
00:09:04.920 | that these systems are conscious.
00:09:06.320 | We'll just, they may say,
00:09:07.920 | "Listen, you can't turn me off 'cause that's a murder."
00:09:11.000 | And we will be convinced by that dialogue
00:09:15.280 | because we will, just in the extreme case,
00:09:18.520 | who knows when we'll get there.
00:09:20.780 | But if we build something like perfectly humanoid robots
00:09:25.780 | that are more intelligent than we are,
00:09:27.520 | so we're basically in a Westworld-like situation,
00:09:30.280 | there's no way we're going to withhold
00:09:33.160 | an attribution of consciousness from those machines.
00:09:35.600 | They're just going to seem,
00:09:36.880 | they're just going to advertise their consciousness
00:09:38.960 | in every glance and every utterance.
00:09:41.940 | But we won't know, and we won't know in some deeper sense
00:09:47.360 | than we can be skeptical of the consciousness
00:09:50.500 | of other people.
00:09:51.340 | I mean, someone could roll that back and say,
00:09:52.520 | "Well, I don't know that you're conscious
00:09:54.280 | "or you don't know that I'm conscious.
00:09:55.300 | "We're just passing the Turing test for one another."
00:09:57.120 | But that kind of solipsism isn't justified biologically
00:10:02.200 | or we just, anything we understand
00:10:05.640 | about the mind biologically suggests
00:10:07.280 | that you and I are part of the same roll of the dice
00:10:12.280 | in terms of how intelligent and conscious systems emerged
00:10:18.000 | in the wetware of brains like ours, right?
00:10:21.880 | So it's not parsimonious for me to think
00:10:24.160 | that I might be the only conscious person
00:10:26.080 | or even the only conscious primate.
00:10:27.840 | I would argue it's not parsimonious
00:10:30.580 | to withhold consciousness from other apes
00:10:33.920 | and even other mammals ultimately.
00:10:36.520 | And once you get beyond the mammals,
00:10:38.400 | then my intuitions are not really clear.
00:10:41.400 | The question of how it emerges is genuinely uncertain.
00:10:44.980 | And ultimately, the question of whether it emerges
00:10:48.120 | is still uncertain.
00:10:49.320 | You can, it's not fashionable to think this,
00:10:54.120 | but you can certainly argue
00:10:55.760 | that consciousness might be a fundamental principle
00:10:59.640 | of matter that doesn't emerge
00:11:02.200 | on the basis of information processing,
00:11:04.080 | even though everything else that we recognize
00:11:08.640 | about ourselves as minds almost certainly does emerge.
00:11:11.400 | You know, like an ability to process language,
00:11:13.160 | that clearly is a matter of information processing
00:11:15.740 | because you can disrupt that process
00:11:17.520 | in ways that is just so clear.
00:11:22.520 | And the problem, the confound with consciousness
00:11:26.920 | is that, yes, we can seem to interrupt consciousness.
00:11:30.640 | I mean, you can give someone general anesthesia
00:11:32.960 | and then you wake them up and you ask them,
00:11:35.160 | "Well, what was that like?"
00:11:36.000 | And they say, "Nothing, I don't remember anything."
00:11:38.600 | But it's hard to differentiate a mere failure of memory
00:11:43.600 | from a genuine interruption in consciousness.
00:11:49.080 | Whereas it's not, with interrupting speech,
00:11:51.720 | you know, we know when we've done it.
00:11:53.540 | And it's just obvious that, you know,
00:11:57.640 | you disrupt the right neural circuits
00:11:59.440 | and, you know, you've disrupted speech.
00:12:01.840 | - So if you had to bet all your money
00:12:03.400 | on one camp or the other,
00:12:04.640 | would you say, do you err on the side of panpsychism,
00:12:09.000 | where consciousness is really fundamental
00:12:11.860 | to all of reality, or more on the other side,
00:12:16.760 | which is like, it's a nice little side effect,
00:12:20.200 | a useful, like, hack for us humans to survive?
00:12:23.600 | On that spectrum, where do you land
00:12:26.220 | when you think about consciousness,
00:12:27.860 | especially from an engineering perspective?
00:12:30.340 | - I'm truly agnostic on this point.
00:12:33.160 | I mean, I think I'm, you know,
00:12:35.300 | it's kind of in coin toss mode for me.
00:12:37.780 | I don't know, and panpsychism is not so compelling to me.
00:12:42.780 | Again, it just seems unfalsifiable.
00:12:46.900 | I wouldn't know how the universe would be different
00:12:49.020 | if panpsychism were true.
00:12:50.700 | I mean, just to remind people,
00:12:52.060 | panpsychism is this idea that consciousness
00:12:54.500 | may be pushed all the way down
00:12:57.260 | into the most fundamental constituents of matter.
00:12:59.500 | So there might be something that it's like
00:13:01.080 | to be an electron or, you know, a quark,
00:13:05.100 | but then you wouldn't expect anything to be different
00:13:08.940 | at the macro scale,
00:13:11.420 | or at least I wouldn't expect anything to be different.
00:13:13.860 | So it may be unfalsifiable.
00:13:16.700 | It just might be that reality is not something
00:13:20.460 | we're as in touch with as we think we are,
00:13:26.060 | and that if that is base layer
00:13:30.400 | to kind of break it into mind and matter
00:13:32.460 | as we've done ontologically is to misconstrue it, right?
00:13:37.300 | I mean, there could be some kind of neutral monism
00:13:40.900 | at the bottom, and this idea doesn't originate with me.
00:13:43.860 | This goes all the way back to Bertrand Russell
00:13:47.220 | and others, you know, 100 plus years ago,
00:13:50.860 | but I just feel like the concepts we're using
00:13:53.940 | to divide consciousness and matter
00:13:58.940 | may in fact be part of our problem, right?
00:14:02.260 | Where the rubber hits the road psychologically here
00:14:05.940 | are things like, well, what is death, right?
00:14:08.780 | Like, any expectation that we survive death
00:14:12.500 | or any part of us survives death,
00:14:14.540 | that really seems to be many people's concern here.
00:14:19.540 | - Well, I tend to believe, just as a small little tangent,
00:14:23.140 | like I'm with Ernest Becker on this,
00:14:24.660 | that there's some, it's interesting to think
00:14:27.700 | about death and consciousness,
00:14:29.240 | which one is the chicken, which one is the egg,
00:14:32.420 | because it feels like death could be the very thing,
00:14:34.620 | like our knowledge of mortality could be the very thing
00:14:36.900 | that creates the consciousness.
00:14:38.460 | - Yeah, well, then you're using consciousness
00:14:41.460 | differently than I am.
00:14:43.660 | I mean, so for me, consciousness is just the fact
00:14:47.100 | that the lights are on at all,
00:14:49.980 | that there's an experiential quality to anything.
00:14:53.260 | So much of the processing that's happening
00:14:56.300 | in our brains right now certainly seems
00:14:59.940 | to be happening in the dark, right?
00:15:01.940 | Like, it's not associated with this qualitative sense
00:15:06.260 | that there's something that it's like to be that part
00:15:08.820 | of the mind doing that mental thing.
00:15:11.680 | But for other parts, the lights are on,
00:15:16.980 | and we can talk about, and whether we talk about it or not,
00:15:20.320 | we can feel directly that there's something
00:15:25.320 | that it's like to be us.
00:15:27.020 | There's something that seems to be happening, right?
00:15:29.740 | And the seeming, in our case, is broken into vision
00:15:34.260 | and hearing and proprioception,
00:15:36.540 | and taste and smell, and thought and emotion.
00:15:41.540 | And there are the contents of consciousness
00:15:45.080 | that we are familiar with,
00:15:50.420 | and that we can have direct access to in any present moment
00:15:54.340 | when we're, quote, conscious.
00:15:56.440 | And even if we're confused about them,
00:16:00.100 | even if we're asleep and dreaming,
00:16:02.260 | and it's just not a lucid dream,
00:16:04.020 | we're just totally confused about our circumstance,
00:16:07.780 | what you can't say is that we're confused
00:16:11.380 | about consciousness.
00:16:12.460 | Like, you can't say that consciousness itself
00:16:14.620 | might be an illusion, because on this account,
00:16:18.900 | it just means that things seem any way at all.
00:16:22.100 | I mean, even like if this, you know,
00:16:23.300 | it seems to me that I'm seeing a cup on the table.
00:16:26.380 | Now, I could be wrong about that.
00:16:27.500 | It could be a hologram.
00:16:28.480 | I could be asleep and dreaming.
00:16:29.740 | I could be hallucinating.
00:16:31.520 | But the seeming part isn't really up for grabs
00:16:35.100 | in terms of being an illusion.
00:16:37.580 | It's not, something seems to be happening.
00:16:41.620 | And that seeming is the context
00:16:45.220 | in which every other thing we can notice about ourselves
00:16:49.980 | can be noticed.
00:16:50.820 | And it's also the context in which certain illusions
00:16:53.500 | can be cut through, because we're not,
00:16:55.420 | we can be wrong about what it's like to be us,
00:16:57.820 | and we can, I'm not saying we're incorrigible
00:17:01.960 | with respect to our claims
00:17:04.440 | about the nature of our experience,
00:17:05.680 | but for instance, many people feel like they have a self,
00:17:10.100 | and they feel like it has free will,
00:17:11.840 | and I'm quite sure at this point
00:17:14.320 | that they're wrong about that,
00:17:15.240 | and that you can cut through those experiences,
00:17:19.880 | and then things seem a different way, right?
00:17:22.120 | So it's not that things don't,
00:17:25.000 | there aren't discoveries to be made there,
00:17:26.480 | and assumptions to be overturned,
00:17:28.180 | but this kind of consciousness is something
00:17:33.180 | that I would think, it doesn't just come online
00:17:38.180 | when we get language.
00:17:39.660 | It doesn't just come online when we form a concept of death,
00:17:42.460 | or the finiteness of life.
00:17:45.220 | It doesn't require a sense of self, right?
00:17:48.200 | So it doesn't, it's prior
00:17:50.380 | to a differentiating self and other.
00:17:54.100 | And I wouldn't even think it's necessarily limited
00:17:57.680 | to people, I do think probably any mammal has this,
00:18:02.680 | but certainly if you're going to presuppose
00:18:07.440 | that something about our brains is producing this, right,
00:18:11.960 | and that's a very safe assumption,
00:18:15.560 | even though we can't,
00:18:18.260 | even though you can argue the jury's still out
00:18:20.560 | to some degree, then it's very hard
00:18:23.360 | to draw a principled line between us and chimps,
00:18:26.600 | or chimps and rats even in the end,
00:18:30.120 | given the underlying neural similarities.
00:18:33.120 | So, and I don't know, phylogenetically,
00:18:35.960 | I don't know how far back to push that.
00:18:38.600 | There are people who think single cells might be conscious,
00:18:41.720 | or that flies are certainly conscious.
00:18:43.440 | They've got something like 100,000 neurons in their brains.
00:18:47.960 | I mean, it's just, there's a lot going on,
00:18:51.080 | even in a fly, right?
00:18:53.160 | But I don't have intuitions about that.
00:18:55.480 | - But it's not, in your sense,
00:18:56.800 | an illusion you can cut through.
00:18:58.240 | I mean, to push back, the alternative version could be
00:19:02.120 | it is an illusion constructed by, just by humans.
00:19:06.560 | I'm not sure I believe this,
00:19:08.160 | but it, in part of me, hopes this is true
00:19:10.160 | because it makes it easier to engineer,
00:19:12.520 | is that humans are able to contemplate their mortality,
00:19:16.560 | and that contemplation in itself creates consciousness.
00:19:21.480 | That, like, the rich lights-on experience.
00:19:24.200 | So the lights don't actually even turn on
00:19:26.400 | in the way that you're describing
00:19:28.440 | until after birth in that construction.
00:19:31.880 | So do you think it's possible that that is the case,
00:19:34.960 | that it is a sort of construct of the way we deal,
00:19:39.320 | almost like a social tool to deal with the reality
00:19:42.840 | of the world, a social interaction with other humans?
00:19:45.500 | Or is, 'cause you're saying the complete opposite,
00:19:49.040 | which is it's like fundamental to single-cell organisms
00:19:53.120 | and trees and so on.
00:19:54.880 | - Right, well, yeah, so I don't know how far down to push it.
00:19:57.840 | I don't have intuitions that single cells
00:20:00.520 | are likely to be conscious, but they might be.
00:20:04.040 | And I just, again, it could be unfalsifiable.
00:20:06.680 | But as far as babies not being conscious,
00:20:10.480 | or you don't become conscious
00:20:12.040 | until you can recognize yourself in a mirror
00:20:14.520 | or have a conversation or treat other people.
00:20:17.280 | First of all, babies treat other people as others
00:20:19.760 | far earlier than we have traditionally
00:20:24.680 | given them credit for.
00:20:25.800 | And they certainly do it before they have language, right?
00:20:29.120 | So it's got to precede language to some degree.
00:20:33.920 | And you can interrogate this for yourself
00:20:36.720 | because you can put yourself in various states
00:20:40.300 | that are rather obviously not linguistic.
00:20:46.280 | Meditation allows you to do this.
00:20:48.760 | You can certainly do it with psychedelics
00:20:50.480 | where it's just your capacity for language
00:20:54.120 | has been obliterated and yet you're all too conscious.
00:20:58.760 | In fact, I think you could make a stronger argument
00:21:03.760 | for things running the other way,
00:21:09.520 | that there's something about language
00:21:12.720 | and conceptual thought that is eliminative
00:21:16.360 | of conscious experience.
00:21:18.280 | That we're potentially much more conscious of data,
00:21:23.280 | sense data and everything else than we tend to be.
00:21:27.080 | And we have trimmed it down
00:21:29.240 | based on how we have acquired concepts.
00:21:33.640 | And so like when I walk into a room like this,
00:21:35.940 | I know I'm walking into a room,
00:21:38.780 | I have certain expectations of what is in a room.
00:21:41.960 | I would be very surprised to see wild animals in here
00:21:45.560 | or a waterfall or let me say,
00:21:47.560 | there are things I'm not expecting,
00:21:51.040 | but I can know I'm not expecting them
00:21:53.600 | or I'm expecting their absence
00:21:54.880 | because of my capacity to be surprised
00:21:57.360 | once I walk into a room and I see a live gorilla or whatever.
00:22:01.800 | So there's structure there that we have put in place
00:22:05.920 | based on all of our conceptual learning
00:22:08.960 | and language learning.
00:22:11.440 | And it causes us not to,
00:22:15.080 | and one of the things that happens
00:22:16.320 | when you take psychedelics
00:22:17.400 | and you just look as though for the first time at anything,
00:22:21.760 | it becomes incredibly overloaded with,
00:22:26.760 | it can become overloaded with meaning
00:22:28.560 | and just the torrents of sense data that are coming in
00:22:33.560 | in even the most ordinary circumstances
00:22:39.120 | can become overwhelming for people.
00:22:40.780 | And that tends to just obliterate one's capacity
00:22:45.600 | to capture any of it linguistically.
00:22:47.800 | And as you're coming down, right,
00:22:49.440 | have you done psychedelics?
00:22:50.560 | Have you ever done acid or?
00:22:52.240 | - Not acid, mushroom, and that's it.
00:22:55.660 | And also edibles, but there's some psychedelic properties
00:23:00.880 | to them, but yeah, mushrooms,
00:23:04.240 | several times and always had an incredible experience.
00:23:07.600 | Exactly the kind of experience you're referring to,
00:23:09.420 | which is if it's true that language
00:23:12.400 | constrains our experience,
00:23:15.380 | it felt like I was removing some of the constraints.
00:23:19.240 | - Right.
00:23:20.080 | - Because even just the most basic things
00:23:21.520 | were beautiful in the way
00:23:22.720 | that I wasn't able to appreciate previously,
00:23:25.280 | like trees and nature and so on.
00:23:27.420 | - Yeah, and the experience of coming down
00:23:30.640 | is an experience of encountering the futility
00:23:37.600 | of capturing what you just saw a moment ago in words, right?
00:23:42.600 | Like, especially if you have any part of your self-concept
00:23:47.700 | and your ego program is to be able
00:23:50.740 | to capture things in words.
00:23:51.980 | I mean, if you're a writer or a poet or a scientist,
00:23:55.620 | or someone who wants to just encapsulate
00:23:58.540 | the profundity of what just happened,
00:24:01.020 | the total fatuousness of that enterprise
00:24:07.900 | when you have taken a whopping dose of psychedelics
00:24:12.660 | and you begin to even gesture at describing it to yourself,
00:24:17.660 | so that you could describe it to others,
00:24:22.460 | it's like trying to thread a needle using your elbows.
00:24:27.460 | I mean, it's like you're trying something that can't,
00:24:30.180 | it's like the mere gesture proves its impossibility.
00:24:34.780 | And it's, so yeah, so that, I mean,
00:24:39.620 | for me, that suggests just empirically
00:24:42.300 | on the first person side that it's possible
00:24:44.300 | to put yourself in a condition where
00:24:47.220 | it's clearly not about language structuring your experience
00:24:52.220 | and you're having much more experience than you tend to.
00:24:56.520 | So it's the primacy of,
00:24:58.260 | language is primary for some things,
00:25:00.500 | but it's certainly primary for certain kinds of concepts
00:25:05.500 | and certain kinds of semantic understandings of the world.
00:25:11.140 | But there's clearly more to mind than the conversation
00:25:16.140 | we're having with ourselves or that we can have with others.
00:25:21.420 | - Can we go to that world of psychedelics for a bit?
00:25:25.880 | - Sure.
00:25:27.180 | - What do you think, so Joe Rogan,
00:25:30.020 | apparently, and many others,
00:25:32.500 | meet elves when they're on DMT.
00:25:35.900 | A lot of people report these kinds of creatures
00:25:40.260 | that they see, and again, it's probably the failure
00:25:43.100 | of language to describe that experience.
00:25:44.840 | But DMT is an interesting one.
00:25:46.860 | There's, as you're aware, there's a bunch of studies
00:25:50.180 | going on on psychedelics, currently MDMA,
00:25:55.340 | psilocybin, at Johns Hopkins, and many other places.
00:26:00.340 | But DMT, they all speak of as like some extra
00:26:07.100 | super level of a psychedelic.
00:26:09.780 | Yeah, do you have a sense of where it is our mind goes
00:26:13.140 | on psychedelics, but in DMT especially?
00:26:18.980 | - Well, unfortunately, I haven't taken DMT, so I can't--
00:26:22.580 | - Unfortunately or fortunately?
00:26:23.900 | - Unfortunately, yeah.
00:26:25.420 | Although I presume it's in my body as it is
00:26:28.740 | in everyone's brain and in many, many plants, apparently.
00:26:33.740 | But I've wanted to take it, I haven't had an opportunity
00:26:38.460 | that presented itself where it was obviously
00:26:40.700 | the right thing for me to be doing.
00:26:42.460 | But for those who don't know, DMT is often touted
00:26:47.420 | as the most intense psychedelic,
00:26:49.500 | and also the shortest acting.
00:26:51.420 | You smoke it and it's basically a 10 minute experience,
00:26:54.740 | or a three minute experience within like a 10 minute window
00:26:59.220 | where you're really down after 10 minutes or so.
00:27:04.060 | And Terence McKenna was a big proponent of DMT,
00:27:09.460 | that was the center of the bullseye for him,
00:27:12.260 | psychedelically apparently.
00:27:13.620 | And it is characterized, it seems, for many people
00:27:20.060 | by this phenomenon, which is unlike virtually
00:27:23.020 | any other psychedelic experience, which is your,
00:27:25.900 | it's not just your perception being broadened or changed,
00:27:30.740 | it's you, according to Terence McKenna,
00:27:35.500 | feeling fairly unchanged, but catapulted
00:27:38.820 | into a different circumstance.
00:27:41.020 | You have been shot elsewhere and find yourself
00:27:45.600 | in relationship to other entities of some kind.
00:27:48.940 | So the place is populated with things
00:27:51.460 | that seem not to be your mind.
00:27:54.140 | - So it does feel like travel to another place,
00:27:56.260 | because you are unchanged yourself.
00:27:58.140 | - According, again, I just have this on the authority
00:28:00.480 | of the people who have described their experience,
00:28:03.260 | but it sounds like it's pretty common.
00:28:05.580 | It sounds like it's pretty common for people
00:28:07.060 | not to have the full experience,
00:28:08.500 | because it's apparently pretty unpleasant to smoke.
00:28:11.620 | So it's like getting enough on board in order to get shot
00:28:15.020 | out of the cannon and land among the,
00:28:19.380 | what McKenna called self-transforming machine elves
00:28:25.420 | that appeared to him like jeweled Fabergé egg-like,
00:28:31.420 | self-dribbling basketballs that were handing him
00:28:34.580 | completely uninterpretable reams of profound knowledge.
00:28:39.480 | It's an experience I haven't had,
00:28:44.540 | so I just have to accept that people have had it.
00:28:47.700 | I would just point out that our minds are clearly capable
00:28:53.860 | of producing apparent others on demand
00:28:58.860 | that are totally compelling to us, right?
00:29:01.940 | There's no limit to our ability to do that
00:29:04.820 | as anyone who's ever remembered a dream can attest.
00:29:07.980 | I mean, every night we go to sleep,
00:29:10.220 | and some of us don't remember dreams very often,
00:29:11.900 | but some dream vividly every night.
00:29:15.660 | And just think of how insane that experience is.
00:29:20.660 | I mean, you've forgotten where you were, right?
00:29:23.900 | That's the strangest part.
00:29:25.500 | I mean, this is psychosis, right?
00:29:27.100 | You have lost your mind.
00:29:29.500 | You have lost your connection to your episodic memory,
00:29:34.500 | or even your expectations that reality won't undergo
00:29:40.220 | wholesale changes a moment after you have closed your eyes.
00:29:45.020 | You're in bed, you're watching something on Netflix,
00:29:49.140 | you're waiting to fall asleep,
00:29:50.420 | and then the next thing that happens to you is impossible,
00:29:54.620 | and you're not surprised.
00:29:56.260 | You're talking to dead people,
00:29:57.540 | you're hanging out with famous people,
00:29:58.900 | you're someplace you couldn't physically be,
00:30:02.860 | you can fly, and even that's not surprising.
00:30:06.540 | You have lost your mind, but relevantly for this--
00:30:10.780 | - Or found it.
00:30:12.180 | - You found something.
00:30:13.300 | Lucid dreaming is very interesting,
00:30:14.780 | 'cause then you can have the best of both circumstances,
00:30:17.900 | and then it can be kind of systematically explored.
00:30:22.900 | - But what I mean by found, just sorry to interrupt,
00:30:25.380 | is like if we take this brilliant idea
00:30:29.020 | that language constrains us, grounds us,
00:30:31.980 | language and other things of the waking world ground us,
00:30:35.600 | maybe it is that you've found the full capacity
00:30:40.600 | of your cognition when you dream,
00:30:43.380 | or when you do psychedelics.
00:30:44.740 | You're stepping outside the little human cage,
00:30:47.780 | the cage of the human condition.
00:30:49.500 | To open the door and step out and look around,
00:30:52.780 | and then go back in.
00:30:54.220 | - Well, you've definitely stepped out of something
00:30:57.100 | and into something else, but you've also lost something.
00:30:59.420 | You've lost certain capacities.
00:31:01.140 | - Memory?
00:31:02.060 | - Well, just, yeah, in this case,
00:31:04.100 | you literally didn't, you don't have enough presence of mind
00:31:07.980 | in the dreaming state, or even in the psychedelic state,
00:31:11.620 | if you take enough.
00:31:12.640 | - To do math?
00:31:16.100 | - There's no psychological,
00:31:17.100 | there's very little psychological continuity
00:31:20.300 | with your life, such that you're not surprised
00:31:24.040 | to be in the presence of someone who should be,
00:31:28.620 | you should know is dead, or you should know
00:31:30.940 | you're not likely to have met, by normal channels, right?
00:31:33.860 | You're now talking to some celebrity,
00:31:36.460 | and it turns out you're best friends, right?
00:31:39.140 | And you're not even, you have no memory
00:31:40.780 | of how you got there.
00:31:41.620 | You're like, how did you get into the room?
00:31:42.940 | You're like, did you drive to this restaurant?
00:31:45.300 | You have no memory, and none of that's surprising to you.
00:31:47.500 | So you're kind of brain damaged, in a way.
00:31:49.740 | You're not reality testing in the normal way.
00:31:53.420 | - The fascinating possibility is that
00:31:55.780 | there's probably thousands of people
00:31:57.700 | who've taken psychedelics of various forms,
00:31:59.700 | and have met Sam Harris on that journey.
00:32:03.580 | - Well, I would put it more likely in dreams,
00:32:05.380 | not, you know, 'cause with psychedelics,
00:32:08.060 | you don't tend to hallucinate in a dreamlike way.
00:32:11.500 | I mean, so DMT is giving you an experience of others,
00:32:15.580 | but it seems to be non-standard.
00:32:19.340 | It's not just like dream hallucinations.
00:32:23.020 | But to the point of, coming back to DMT,
00:32:26.500 | the people want to suggest, and Terence McKenna
00:32:31.220 | certainly did suggest, that because these others
00:32:34.900 | are so obviously other, and they're so vivid,
00:32:38.420 | well, then they could not possibly be
00:32:39.860 | the creation of my own mind.
00:32:42.860 | But every night in dreams, you create a compelling,
00:32:47.820 | or what is to you at the time,
00:32:49.780 | a totally compelling simulacrum of another person, right?
00:32:53.980 | And that's, that just proves the mind
00:32:58.980 | is capable of doing it.
00:33:00.940 | Now, the phenomenon of lucid dreaming
00:33:04.340 | shows that the mind isn't capable
00:33:05.780 | of doing everything you think it might be capable of,
00:33:08.580 | even in that space.
00:33:10.020 | So one of the things that people have discovered
00:33:14.260 | in lucid dreams, and I haven't done a lot of lucid dreaming,
00:33:17.500 | so I can't confirm all of this,
00:33:20.020 | but I can confirm some of it.
00:33:21.620 | Apparently, in every house, in every room
00:33:28.140 | in the mansion of dreams, all light switches
00:33:31.540 | are dimmer switches.
00:33:32.780 | Like if you go into a dark room and flip on the light,
00:33:35.980 | it gradually comes up.
00:33:37.940 | It doesn't come up instantly on demand,
00:33:41.340 | because apparently this is covering,
00:33:43.940 | for the brain's inability to produce from a standing start
00:33:48.940 | visually rich imagery on demand.
00:33:52.100 | So I haven't confirmed that, but that was,
00:33:54.580 | people have done research on lucid dreaming,
00:33:57.540 | to claim that it's all dimmer switches.
00:33:59.780 | But one thing I have noticed,
00:34:03.020 | and people can check this out,
00:34:05.820 | is that in a dream, if you look at text,
00:34:10.060 | a page of text, or a sign, or a television
00:34:14.900 | that has text on it, and then you turn away
00:34:17.340 | and you look back at that text,
00:34:18.780 | the text will have changed.
00:34:20.980 | Right, there's just a total, chronic instability,
00:34:24.460 | a graphical instability of text in the dream state.
00:34:28.900 | And I don't know if that, maybe that's,
00:34:31.100 | someone can confirm that that's not true for them,
00:34:33.220 | but that's, whenever I've checked that out,
00:34:34.980 | that has been true for me.
00:34:35.820 | - So it keeps generating it, like real time,
00:34:39.060 | from a video game perspective.
00:34:40.620 | - Yeah, it's rendering, it's re-rendering it,
00:34:43.860 | for some reason.
00:34:44.780 | - What's interesting, I actually,
00:34:46.020 | I don't know how I found myself in this sets of,
00:34:49.940 | that part of the internet,
00:34:51.820 | but there's quite a lot of discussion
00:34:53.780 | about what it's like to do math on LSD.
00:34:56.380 | Because apparently, one of the deepest
00:35:00.700 | thinking processes needed is that of mathematicians,
00:35:04.700 | or theoretical computer scientists,
00:35:06.340 | basically doing anything that involves math and proofs,
00:35:09.860 | and you have to think creatively, but also deeply,
00:35:12.460 | and you have to think for many hours at a time.
00:35:15.860 | And so they're always looking for ways to,
00:35:18.300 | like is there any sparks of creativity
00:35:21.340 | that could be injected?
00:35:22.260 | And apparently, out of all the psychedelics,
00:35:25.580 | the worst is LSD, because it completely destroys
00:35:29.300 | your ability to do math well.
00:35:31.060 | And I wonder whether that has to do with your ability
00:35:33.660 | to visualize geometric things in a stable way in your mind,
00:35:38.660 | and hold them there, and stitch things together,
00:35:41.620 | which is often what's required for proofs.
00:35:44.100 | But again, it's difficult to kind of research
00:35:47.980 | these kinds of concepts, but it does make me wonder
00:35:50.980 | where, what are the spaces, how's the space
00:35:55.860 | of things you're able to think about and explore
00:35:59.140 | morphed by different psychedelics,
00:36:02.820 | or dream states, and so on, and how is that different?
00:36:06.140 | How much does it overlap with reality?
00:36:08.140 | And what is reality?
00:36:10.420 | Is there a waking state reality?
00:36:12.320 | Or is it just a tiny subset of reality,
00:36:15.940 | and we get to take a step in other versions of it?
00:36:18.940 | We tend to think very much in a space-time,
00:36:23.020 | four-dimensional, there's a three-dimensional world,
00:36:25.700 | there's time, and that's what we think about reality.
00:36:29.660 | And we think of traveling as walking from point A
00:36:33.660 | to point B in the three-dimensional world.
00:36:36.740 | But that's a very kind of human surviving,
00:36:40.100 | trying not to get eaten by a lion, conception of reality.
00:36:43.380 | What if traveling is something like we do with psychedelics
00:36:46.820 | and meet the elves?
00:36:48.340 | What if it's something, what if thinking,
00:36:50.600 | or the space of ideas as we kind of grow
00:36:53.820 | and think through ideas, that's traveling?
00:36:56.420 | Or what if memories is traveling?
00:37:00.380 | I don't know if you have a favorite view of reality,
00:37:03.980 | or if you, you had, by the way, I should say,
00:37:06.980 | excellent conversation with Donald Hoffman.
00:37:09.380 | - Yeah, yeah, he's interesting.
00:37:11.980 | - Is there any inkling of his sense in your mind
00:37:15.980 | that reality is very far from,
00:37:20.220 | actual, like, objective reality is very far
00:37:22.300 | from the kind of reality we imagine,
00:37:24.980 | we perceive, and we play with in our human minds?
00:37:29.460 | - Well, the first thing to grant is that
00:37:33.260 | we're never in direct contact with reality, whatever it is,
00:37:39.940 | unless that reality is consciousness, right?
00:37:42.780 | So we're only ever experiencing consciousness
00:37:47.300 | and its contents.
00:37:48.380 | And then the question is, how does that circumstance
00:37:52.580 | relate to, quote, reality at large?
00:37:55.940 | And Donald Hoffman is somebody who's happy to speculate,
00:38:00.100 | well, maybe there isn't a reality at large.
00:38:02.700 | Maybe it's all just consciousness on some level.
00:38:05.620 | And that's interesting, that runs into, to my eye,
00:38:11.540 | various philosophical problems that,
00:38:16.500 | or at least you have to do a lot,
00:38:17.620 | you have to add to that picture,
00:38:21.980 | I mean, that picture of idealism,
00:38:24.860 | that's usually all the whole family of views
00:38:27.500 | that would just say that the universe is just mind
00:38:30.580 | or just consciousness at bottom,
00:38:33.060 | we'll go by the name of idealism in Western philosophy.
00:38:36.060 | You have to add to that idealistic picture
00:38:40.980 | all kinds of epicycles and kind of weird coincidences
00:38:44.900 | and to get the predictability of our experience
00:38:49.900 | and the success of materialist science
00:38:54.460 | to make sense in that context, right?
00:38:56.140 | And so the fact that we can,
00:38:57.680 | what does it mean to say that there's only consciousness
00:39:03.580 | at bottom, right?
00:39:05.600 | Nothing outside of consciousness,
00:39:07.140 | 'cause no one's ever experienced anything
00:39:08.660 | outside of consciousness.
00:39:09.660 | No scientist has ever done an experiment
00:39:11.740 | where they were contemplating data,
00:39:14.900 | no matter how far removed from our sense bases,
00:39:17.460 | whether it's they're looking at the Hubble Deep Field
00:39:20.620 | or they're smashing atoms
00:39:23.620 | or whatever tools they're using,
00:39:25.700 | they're still just experiencing consciousness
00:39:29.020 | and its various deliverances
00:39:32.940 | and layering their concepts on top of that.
00:39:37.220 | So that's always true,
00:39:41.540 | and yet that somehow doesn't seem to capture
00:39:44.740 | the character of our continually discovering
00:39:53.420 | that our materialist assumptions are confirmable, right?
00:39:59.020 | So you take the fact that we unleash
00:40:02.240 | this fantastic amount of energy from within an atom, right?
00:40:06.380 | First, we have the theoretical suggestion
00:40:09.900 | that it's possible, right?
00:40:12.260 | We come back to Einstein,
00:40:14.180 | there's a lot of energy in that matter, right?
00:40:18.580 | And what if we could release it, right?
00:40:21.020 | And then we perform an experiment that in this case,
00:40:25.180 | the Trinity Test Site in New Mexico,
00:40:28.540 | where the people who are most adequate to this conversation,
00:40:32.000 | people like Robert Oppenheimer,
00:40:34.620 | are standing around not altogether certain
00:40:39.620 | it's going to work, right?
00:40:40.820 | They're performing an experiment,
00:40:42.180 | they're wondering what's gonna happen,
00:40:43.300 | they're wondering if their calculations
00:40:44.860 | around the yield are off by orders of magnitude.
00:40:47.740 | Some of them are still wondering
00:40:49.140 | whether the entire atmosphere of Earth
00:40:51.860 | is gonna combust, right?
00:40:55.380 | That the nuclear chain reaction is not gonna stop.
00:41:01.220 | And lo and behold, there was that energy to be released
00:41:06.220 | from within the nucleus of an atom.
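
(Aside, not part of the conversation: a rough worked version of the mass-energy relation being alluded to here. The one-kilogram figure is an assumption chosen only for illustration.)

```latex
% Rough illustration only: the mass-energy relation alluded to above.
\[
  E = mc^{2}
\]
% For one kilogram of matter (an illustrative assumption):
\[
  E \approx (1\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^{2}
    = 9 \times 10^{16}\,\mathrm{J}
    \approx 20\ \mathrm{megatons\ of\ TNT}
\]
% Releasing even a small fraction of this energy, as fission does,
% yields an enormous explosion.
```
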
00:41:09.860 | And that could, so it's just,
00:41:14.660 | the picture one forms from those kinds of experiments,
00:41:20.540 | and just the knowledge,
00:41:21.380 | it's just our understanding of evolution,
00:41:22.900 | just the fact that the Earth is billions of years old
00:41:26.060 | and life is hundreds of millions of years old,
00:41:27.840 | and we weren't here to think about any of those things,
00:41:31.000 | and all of those processes were happening,
00:41:33.560 | therefore, in the dark,
00:41:35.160 | and they are the processes that allowed us to emerge
00:41:38.440 | from prior life forms in the first place.
00:41:41.880 | To say that it's all mind,
00:41:43.960 | that nothing exists outside of consciousness,
00:41:47.280 | conscious minds of the sort that we experience,
00:41:50.580 | it just seems, it seems like a bizarrely
00:41:55.720 | anthropocentric claim,
00:41:58.580 | analogous to the moon isn't there
00:42:03.440 | if no one's looking at it, right?
00:42:05.080 | I mean, the moon as a moon isn't there
00:42:07.600 | if no one's looking at it, I'll grant that,
00:42:09.440 | 'cause that's already a kind of fabrication
00:42:12.720 | born of concepts, but the idea that there's nothing there,
00:42:17.120 | that there's nothing that corresponds
00:42:19.600 | to what we experience as the moon,
00:42:21.800 | unless someone's looking at it,
00:42:23.700 | that just seems way too parochial a way
00:42:28.700 | to set out on this journey of discovery.
00:42:31.400 | - There is something there,
00:42:32.280 | there's a computer waiting to render the moon
00:42:34.280 | when you look at it.
00:42:36.000 | The capacity for the moon to exist is there.
00:42:38.700 | So if we're indeed living in a simulation,
00:42:42.540 | which I find a compelling thought experiment,
00:42:45.420 | it's possible that there is this kind of
00:42:47.720 | rendering mechanism, but not in the silly way
00:42:50.960 | that we think about in video games,
00:42:52.440 | but in some kind of more fundamental physics way.
00:42:55.280 | - And we have to account for the fact
00:42:57.820 | that it renders experiences that no one has had yet,
00:43:02.820 | that no one has any expectation of having.
00:43:07.140 | It can violate the expectations of everyone lawfully,
00:43:10.080 | right, and then there's some lawful understanding
00:43:12.160 | of why that's so.
00:43:14.220 | It's like, I mean, just to bring it back to mathematics,
00:43:18.280 | like, certain numbers are prime
00:43:20.420 | whether we have discovered them or not, right?
00:43:22.720 | Like there's the highest prime number
00:43:24.880 | that anyone can name now,
00:43:27.640 | and then there's the next prime number
00:43:29.160 | that no one can name, and it's there, right?
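
(Aside, not part of the conversation: a minimal sketch of the point being made, that the next prime after any number is there to be found whether or not anyone has named it. The helper functions below are illustrative, not from any source.)

```python
# Illustrative sketch: the next prime exists and can be found mechanically,
# whether or not anyone has named it yet.

def is_prime(n: int) -> bool:
    """Trial division; adequate for small illustrative inputs."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def next_prime(n: int) -> int:
    """Smallest prime strictly greater than n."""
    candidate = n + 1
    while not is_prime(candidate):
        candidate += 1
    return candidate

print(next_prime(100))  # 101 -- it was prime before anyone checked
```
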
00:43:31.640 | So it's like, to say that our minds are putting it there,
00:43:36.640 | that what we know as mind in ourselves
00:43:38.900 | is in some way, in some sense, putting it there,
00:43:41.860 | that, like the base layer of reality is consciousness, right?
00:43:47.040 | You know, that we're identical to the thing
00:43:49.500 | that is rendering this reality.
00:43:53.220 | There's some, you know, hubris is the wrong word,
00:43:57.700 | but it's like there's some, it's like,
00:43:59.340 | it's okay if reality is bigger than what we experience,
00:44:02.980 | you know, and it has structure that we can't anticipate
00:44:07.980 | and that isn't just, I mean, again,
00:44:14.840 | there's certainly a collaboration between our minds
00:44:17.840 | and whatever is out there to produce what we call,
00:44:21.600 | you know, the stuff of life,
00:44:24.040 | but it's not the idea that it's,
00:44:29.040 | I don't know, I mean, there are a few stops
00:44:33.240 | on the train of idealism and kind of new age thinking
00:44:36.880 | and Eastern philosophy that I don't,
00:44:40.560 | philosophically, I don't see a need to take.
00:44:42.480 | I mean, the place, experientially and scientifically,
00:44:45.940 | I feel like it's, you can get everything you want
00:44:49.500 | acknowledging that consciousness has a character
00:44:55.660 | that can be explored from its own side
00:44:58.940 | so that you're bringing kind of the first person experience
00:45:01.580 | back into the conversation about, you know,
00:45:03.940 | what is a human mind and, you know, what is true,
00:45:06.800 | and you can explore it with different degrees of rigor
00:45:12.060 | and there are things to be discovered there,
00:45:13.700 | whether you're using a technique like meditation
00:45:15.500 | or psychedelics, and that these experiences
00:45:19.020 | have to be put in conversation with what we understand
00:45:22.500 | about ourselves from a third person side,
00:45:24.580 | neuroscientifically or in any other way.
00:45:27.420 | - But to me, the question is, what if reality,
00:45:30.300 | the sense I have from this kind of, you play shooters?
00:45:34.820 | - No.
00:45:36.020 | - There's a physics engine that generate, that's--
00:45:37.860 | - Oh, you mean first person shooter games?
00:45:40.300 | - Yes, yes, sorry.
00:45:41.780 | Not often, but yes.
00:45:43.620 | - I mean, there's a physics engine
00:45:44.740 | that generates consistent reality, right?
00:45:47.300 | My sense is the same could be true for a universe
00:45:50.900 | in the following sense, that our conception of reality,
00:45:54.700 | as we understand it now in the 21st century,
00:45:57.660 | is a tiny subset of the full reality.
00:45:59.780 | It's not that the reality that we conceive of that's there,
00:46:03.020 | the moon being there is not there somehow.
00:46:06.440 | It's that it's a tiny fraction of what's actually out there.
00:46:09.380 | And so the physics engine of the universe
00:46:12.820 | is just maintaining the useful physics,
00:46:16.100 | the useful "reality," quote unquote,
00:46:19.020 | for us to have a consistent experience as human beings.
00:46:22.880 | But maybe we, descendants of apes,
00:46:25.860 | are really only understand 0.0001%
00:46:30.860 | of actual physics of reality.
00:46:34.020 | We can even just start with the consciousness thing.
00:46:36.620 | But maybe our minds are just,
00:46:39.540 | we're just too dumb by design.
00:46:42.340 | - Oh, yeah, that truly resonates with me,
00:46:46.820 | and I'm surprised it doesn't resonate more
00:46:48.300 | with most scientists that I talk to.
00:46:50.620 | When you just look at,
00:46:52.860 | you look at how close we are to chimps, right?
00:46:57.180 | And chimps don't know anything, right?
00:46:58.900 | Clearly, they have no idea what's going on, right?
00:47:01.500 | And then you get us,
00:47:03.260 | but then it's only a subset of human beings
00:47:06.160 | that really understand much of what we're talking about
00:47:09.860 | in any area of specialization.
00:47:12.620 | And if they all died in their sleep tonight, right,
00:47:15.660 | you'd be left with people who might take 1,000 years
00:47:20.620 | to rebuild the internet, if ever, right?
00:47:24.620 | I mean, literally, it's like,
00:47:26.180 | and I would extend this to myself.
00:47:29.300 | I mean, there are areas of scientific specialization
00:47:32.920 | where I have either no discernible competence,
00:47:37.480 | I mean, I spend no time on it,
00:47:40.060 | I have not acquired the tools.
00:47:42.140 | It would just be an article of faith for me
00:47:43.660 | to think that I could acquire the tools
00:47:45.140 | to actually make a breakthrough in those areas.
00:47:47.500 | And your own area is one.
00:47:51.540 | I've never spent any significant amount of time
00:47:54.080 | trying to be a programmer,
00:47:56.380 | but it's pretty obvious I'm not Alan Turing, right?
00:48:00.340 | It's like, if that were my capacity,
00:48:03.760 | I would have discovered that in myself.
00:48:05.840 | I would have found programming irresistible.
00:48:08.360 | My first false starts in learning, I think it was C,
00:48:13.360 | it was just, I bounced off.
00:48:17.240 | It's like, this was not fun.
00:48:18.360 | I hate, I mean, trying to figure out what,
00:48:21.200 | the syntax error that's causing this thing not to compile
00:48:24.280 | was just a fucking awful experience.
00:48:25.760 | I hated it, right?
00:48:26.600 | I hated every minute of it.
00:48:28.000 | So it was not, so if it was just people like me left,
00:48:33.000 | like, when do we get the internet again?
00:48:37.180 | Right, and we lose the internet.
00:48:40.540 | When do we get it again, right?
00:48:41.620 | When do we get anything
00:48:44.300 | like a proper science of information, right?
00:48:47.180 | You need a Claude Shannon or an Alan Turing
00:48:50.940 | to plant a flag in the ground right here and say,
00:48:53.700 | "All right, can everyone see this?
00:48:55.140 | Even if you don't quite know what I'm up to,
00:48:57.780 | you all have to come over here to make some progress."
00:49:02.100 | And there are hundreds of topics where that's the case.
00:49:08.240 | So we barely have a purchase
00:49:11.820 | on making anything like discernible intellectual progress
00:49:16.820 | in any generation.
00:49:18.960 | And yeah, I'm just, Max Tegmark makes this point.
00:49:23.960 | He's one of the few people who does.
00:49:27.700 | In physics, if you just take the truth
00:49:32.700 | of evolution seriously, right?
00:49:36.340 | And realize that there's nothing about us
00:49:39.780 | that has evolved to understand reality perfectly.
00:49:42.740 | I mean, we're just not that kind of ape, right?
00:49:46.340 | There's been no evolutionary pressure along those lines.
00:49:48.620 | So we are making do with tools
00:49:52.380 | that were designed for fights with sticks and rocks, right?
00:49:56.780 | And it's amazing we can do as much as we can.
00:50:00.420 | I mean, you and I are just sitting here
00:50:02.380 | on the back of having received an mRNA vaccine
00:50:06.100 | that has certainly changed our life
00:50:08.460 | given what the last year was like.
00:50:10.540 | And it's gonna change the world
00:50:12.180 | if rumors of coming miracles are borne out.
00:50:16.460 | I mean, it's now, it seems likely we have a vaccine
00:50:20.780 | coming for malaria, right?
00:50:22.240 | Which has been killing millions of people a year
00:50:25.920 | for as long as we've been alive.
00:50:27.520 | I think it's down to like 800,000 people a year now
00:50:31.260 | because we've spread so many bed nets around.
00:50:33.900 | But it was like two and a half million people every year.
00:50:37.360 | It's amazing what we can do, but yeah, I have,
00:50:43.380 | if in fact the answer at the back of the book of nature
00:50:46.820 | is you understand 0.1% of what there is to understand
00:50:51.820 | and half of what you think you understand is wrong,
00:50:54.900 | that would not surprise me at all.
00:50:57.280 | - It is funny to look at our evolutionary history,
00:51:01.040 | even back to chimps.
00:51:02.420 | I'm pretty sure even chimps thought
00:51:03.860 | they understood the world well.
00:51:06.340 | So at every point in that timeline
00:51:09.300 | of evolutionary development throughout human history,
00:51:12.860 | there's a sense like there's no more,
00:51:15.560 | you hear this message over and over,
00:51:17.020 | there's no more things to be invented.
00:51:19.380 | - But 100 years ago there were,
00:51:21.300 | there's a famous story, I forget which physicist told it,
00:51:24.920 | but there were physicists telling
00:51:29.060 | their undergraduate students not to go into,
00:51:32.800 | to get graduate degrees in physics
00:51:34.260 | because basically all the problems had been solved.
00:51:36.580 | And this is like around 1915 or so.
00:51:40.140 | - Turns out you were right.
00:51:41.300 | I'm gonna ask you about free will.
00:51:42.700 | - Oh, okay.
00:51:43.540 | - You've recently released an episode of your podcast,
00:51:48.220 | Making Sense, for those with a shorter attention span,
00:51:51.580 | basically summarizing your position on free will.
00:51:54.000 | I think it was under an hour and a half.
00:51:56.240 | - Yeah, yeah.
00:51:57.140 | - It was brief and clear.
00:52:00.520 | So allow me to summarize the summary, TL;DR,
00:52:05.300 | and maybe you tell me where I'm wrong.
00:52:07.380 | So free will is an illusion,
00:52:11.380 | and even the experience of free will is an illusion.
00:52:13.980 | Like we don't even experience it.
00:52:17.340 | Am I good in my summary?
00:52:19.920 | - Yeah, this is a line
00:52:24.780 | that's a little hard to scan for people.
00:52:27.360 | I say that it's not merely that free will is an illusion.
00:52:32.360 | The illusion of free will is an illusion.
00:52:35.140 | Like there is no illusion of free will.
00:52:37.480 | And that is a, unlike many other illusions,
00:52:40.620 | that's a more fundamental claim.
00:52:47.080 | It's not that it's wrong, it's not even wrong.
00:52:49.500 | I mean, that's, I guess, that was, I think, Wolfgang Pauli,
00:52:52.220 | who derided one of his colleagues or enemies
00:52:56.420 | with that aspersion about his theory in quantum mechanics.
00:53:01.420 | So there are genuine illusions.
00:53:09.820 | There are things that you do experience,
00:53:12.740 | and then you can kind of punch through that experience.
00:53:15.160 | Or there are ones you can't actually experience
00:53:17.300 | any other way.
00:53:20.140 | We just know it's not a veridical experience.
00:53:24.100 | You just take like a visual illusion.
00:53:25.380 | There are visual illusions that,
00:53:26.940 | a lot of these come to me on Twitter these days.
00:53:28.740 | There's these amazing visual illusions
00:53:31.020 | where like every figure in this GIF seems to be moving,
00:53:36.020 | but nothing, in fact, is moving.
00:53:37.740 | You can just like put a ruler on your screen
00:53:39.220 | and nothing's moving.
00:53:40.260 | Some of those illusions you can't see any other way.
00:53:44.540 | I mean, they're just, they're hacking aspects
00:53:46.860 | of the visual system that are just eminently hackable,
00:53:49.860 | and you have to use a ruler to convince yourself
00:53:54.380 | that the thing isn't actually moving.
00:53:56.500 | Now, there are other visual illusions
00:53:57.860 | where you're taken in by it at first,
00:54:01.180 | but if you pay more attention,
00:54:02.580 | you can actually see that it's not there, right?
00:54:05.300 | Or it's not how it first seemed.
00:54:07.620 | Like the Necker cube is a good example of that.
00:54:10.180 | Like the Necker cube is just that schematic of a cube,
00:54:13.300 | of a transparent cube, which pops out one way or the other.
00:54:15.900 | One face can pop out and the other face can pop out,
00:54:18.940 | but you can actually just see it as flat with no pop out,
00:54:22.160 | which is a more veridical way of looking at it.
00:54:26.180 | So there are kind of inward,
00:54:29.940 | subjective correlates to this,
00:54:32.340 | and I would say that the sense of self,
00:54:35.340 | the sense of self and free will are closely related.
00:54:40.700 | I often describe them as two sides of the same coin,
00:54:43.460 | but they're not quite the same in their spuriousness.
00:54:48.460 | I mean, so the sense of self is something
00:54:50.980 | that people, I think, do experience, right?
00:54:54.540 | It's not a very clear experience,
00:54:56.900 | but it's not, I wouldn't call the illusion of self
00:54:59.860 | an illusion, but the illusion of free will
00:55:02.420 | is an illusion in that as you pay more attention
00:55:05.420 | to your experience, you begin to see
00:55:08.060 | that it's totally compatible with an absence of free will.
00:55:11.800 | You don't, I mean, coming back to the place we started,
00:55:14.600 | you don't know what you're gonna think next.
00:55:18.460 | You don't know what you're gonna intend next.
00:55:20.300 | You don't know what's going to just occur to you
00:55:23.380 | that you must do next.
00:55:24.540 | You don't know how much you are going to feel
00:55:28.020 | the behavioral imperative to act on that thought.
00:55:31.660 | If you suddenly feel, oh, I don't need to do that.
00:55:35.340 | I can do that tomorrow.
00:55:36.740 | You don't know where that comes from.
00:55:38.100 | You didn't know that was gonna arise.
00:55:39.640 | You didn't know that was gonna be compelling.
00:55:41.580 | All of this is compatible with some evil genius
00:55:44.140 | in the next room just typing in code
00:55:46.900 | into your experience, just like this.
00:55:48.500 | Okay, let's give him the "oh my God, I just forgot
00:55:52.700 | it was gonna be our anniversary in one week" thought, right?
00:55:56.080 | Give him the cascade of fear.
00:55:57.980 | Give him this brilliant idea for the thing he can buy
00:56:01.620 | that's gonna take him no time at all
00:56:02.860 | and this overpowering sense of relief.
00:56:05.300 | All of our experience is compatible
00:56:07.820 | with the script already being written, right?
00:56:11.700 | And I'm not saying the script is written.
00:56:12.960 | I'm not saying that fatalism is the right way
00:56:17.340 | to look at this, but we just don't have
00:56:20.180 | even our most deliberate voluntary action
00:56:23.420 | where we go back and forth between two options,
00:56:27.060 | thinking about the reason for A
00:56:28.580 | and then reconsidering and going,
00:56:30.500 | thinking harder about B
00:56:33.380 | and just going eeny, meeny, miny, moe
00:56:35.380 | until the end of the hour.
00:56:37.980 | However laborious you can make it,
00:56:40.340 | there is an utter mystery at your back
00:56:44.500 | finally promoting the thought or intention
00:56:48.940 | or rationale that is most compelling
00:56:53.860 | and therefore deliberately, behaviorally effective.
00:57:02.180 | And just,
00:57:05.120 | and this can drive some people a little crazy.
00:57:09.300 | So I usually preface what I say about free will
00:57:13.740 | with the caveat that if thinking about your mind this way
00:57:17.280 | makes you feel terrible, well then stop,
00:57:19.500 | you know, get off the ride, switch the channel.
00:57:22.420 | You don't have to go down this path.
00:57:24.580 | But for me and for many other people,
00:57:27.520 | it's incredibly freeing to recognize this about the mind
00:57:32.160 | because one, you realize that you're,
00:57:37.160 | I mean, cutting through the illusion of the self
00:57:39.940 | is immensely freeing for a lot of reasons
00:57:41.820 | that we can talk about separately,
00:57:44.020 | but losing the sense of free will
00:57:46.760 | does two things very vividly for me.
00:57:51.520 | One is it totally undercuts the basis for,
00:57:54.760 | the psychological basis for hatred, right?
00:57:56.620 | Because when you think about the experience
00:57:58.760 | of hating other people, what that is anchored to
00:58:03.560 | is a feeling that they really are
00:58:06.320 | the true authors of their actions.
00:58:08.640 | I mean, that someone is doing something
00:58:10.840 | that you find so despicable, right?
00:58:13.200 | And let's say they're targeting you unfairly, right?
00:58:15.900 | They're maligning you on Twitter or they're suing you
00:58:20.220 | or they're doing something, they broke your car window,
00:58:22.840 | they did something awful
00:58:24.640 | and now you have a grievance against them
00:58:26.640 | and you're relating to them very differently emotionally
00:58:30.000 | in your own mind than you would
00:58:32.360 | if a force of nature had done this, right?
00:58:34.520 | Or if it had just been a virus
00:58:36.720 | or if it had been a wild animal
00:58:38.320 | or a malfunctioning machine, right?
00:58:40.720 | Like to those things, you don't attribute
00:58:42.240 | any kind of freedom of will.
00:58:44.180 | And while you may suffer the consequences
00:58:46.960 | of catching a virus or being attacked by a wild animal
00:58:49.920 | or having your car break down or whatever,
00:58:53.120 | it may frustrate you.
00:58:54.900 | You don't slip into this mode of hating the agent
00:59:01.000 | in a way that completely commandeers your mind
00:59:06.200 | and deranges your life.
00:59:07.800 | I mean, you just don't, I mean,
00:59:08.700 | there are people who spend decades
00:59:11.720 | hating other people for what they did
00:59:15.800 | and it's just pure poison, right?
00:59:18.600 | - So it's a useful shortcut to compassion and empathy.
00:59:20.880 | - Yeah, yeah.
00:59:21.720 | - But the question is, say that this call,
00:59:24.840 | what was it, the horse of consciousness,
00:59:26.760 | let's call it the consciousness generator black box
00:59:30.680 | that we don't understand.
00:59:32.520 | And is it possible that the script
00:59:35.940 | that we're walking along, that we're playing,
00:59:40.780 | that's already written, is actually being written
00:59:43.880 | in real time?
00:59:45.200 | It's almost like you're driving down a road
00:59:47.680 | and in real time, that road is being laid down.
00:59:50.880 | And this black box of consciousness
00:59:52.680 | that we don't understand is the place
00:59:54.800 | where this script is being generated.
00:59:57.060 | So it's not, it is being generated, it didn't always exist.
01:00:01.180 | So there's something we don't understand
01:00:02.800 | that's fundamental about the nature of reality
01:00:05.500 | that generates both consciousness,
01:00:07.520 | let's call it maybe the self.
01:00:09.840 | I don't know if you want to distinguish between those.
01:00:11.600 | - Yeah, I definitely would, yeah.
01:00:13.040 | - You would, because there's a bunch of illusions
01:00:15.780 | we're referring to.
01:00:16.620 | There's the illusion of free will,
01:00:18.240 | there's the illusion of self,
01:00:20.280 | and there's the illusion of consciousness.
01:00:22.440 | You're saying, I think you said there's no,
01:00:25.840 | you're not as willing to say
01:00:27.040 | there's an illusion of consciousness.
01:00:28.800 | You're a little bit more--
01:00:29.640 | - In fact, I would say it's impossible.
01:00:30.840 | - Impossible.
01:00:31.720 | You're a little bit more willing to say
01:00:33.240 | that there's an illusion of self,
01:00:35.520 | and you're definitely saying
01:00:36.920 | there's an illusion of free will.
01:00:38.600 | - Yes, yes.
01:00:39.880 | I'm definitely saying there's an illusion,
01:00:42.080 | that a certain kind of self is an illusion, not every kind.
01:00:44.840 | We mean many different things by this notion of self.
01:00:48.000 | So maybe I should just differentiate these things.
01:00:50.520 | So consciousness can't be an illusion
01:00:53.040 | because any illusion proves its reality
01:00:58.040 | as much as any other veridical perception.
01:01:00.480 | I mean, if you're hallucinating now,
01:01:02.560 | that's just as much a demonstration of consciousness
01:01:05.660 | as really seeing what's quote actually there.
01:01:09.680 | If you're dreaming and you don't know it,
01:01:11.900 | that is consciousness, right?
01:01:15.040 | If you're, you can be confused about literally everything.
01:01:17.740 | You can't be confused about the underlying claim,
01:01:22.740 | whether you make it linguistically or not,
01:01:27.560 | but just the cognitive assertion
01:01:32.240 | that something seems to be happening.
01:01:36.960 | It's the seeming that is the cash value of consciousness.
01:01:40.440 | - Can I take a tiny tangent?
01:01:41.840 | - Okay.
01:01:42.680 | - So what if I am creating consciousness in my mind
01:01:47.680 | to convince you that I'm human?
01:01:50.120 | So it's a useful social tool,
01:01:52.560 | not a fundamental property of experience,
01:01:57.560 | like of being a living thing.
01:02:00.760 | What if it's just like a social tool
01:02:02.680 | to almost like a useful computational trick
01:02:07.560 | to place myself into reality
01:02:10.580 | as we together communicate about this reality?
01:02:13.180 | And another way to ask that,
01:02:15.920 | 'cause you said it much earlier,
01:02:17.560 | you talk negatively about robots as you often do,
01:02:21.340 | so let me, 'cause you'll probably die first
01:02:24.320 | when they take over.
01:02:25.320 | - No, I'm looking forward to certain kinds of robots.
01:02:28.720 | I mean, I'm not, if we can get this right,
01:02:31.080 | this would be amazing.
01:02:31.920 | - Right, but you don't like the robots
01:02:33.240 | that fake consciousness.
01:02:34.520 | That's what you, you don't like the idea
01:02:36.480 | of fake it 'til you make it.
01:02:37.840 | - Well, no, it's not that I don't like it.
01:02:40.160 | It's that I'm worried that we will lose sight
01:02:43.400 | of the problem, and the problem has massive
01:02:45.800 | ethical consequences.
01:02:47.280 | If we create robots that really can suffer,
01:02:51.960 | that would be a bad thing, right?
01:02:53.680 | And if we really are committing a murder
01:02:56.120 | when we recycle them, that would be a bad thing.
01:02:59.160 | - This is how I know you're not Russian.
01:03:00.280 | Why is it a bad thing that we create robots
01:03:02.360 | that can suffer?
01:03:03.280 | Isn't suffering a fundamental thing
01:03:05.280 | from which like beauty springs?
01:03:07.240 | Like without suffering, do you really think
01:03:09.320 | we would have beautiful things in this world?
01:03:11.680 | - Okay, that's a tangent on a tangent.
01:03:14.080 | - Okay, all right. - We'll go there.
01:03:15.160 | I would love to go there, but let's not go there just yet.
01:03:17.520 | But I do think it would be, if anything is bad,
01:03:20.280 | creating hell and populating it with real minds
01:03:25.080 | that really can suffer in that hell, that's bad.
01:03:29.120 | That's the, you are worse than any mass murderer
01:03:33.800 | we can name if you create it.
01:03:35.520 | I mean, this could be in robot form,
01:03:37.280 | or more likely it would be in some simulation
01:03:40.760 | of a world where we've managed to populate it
01:03:42.800 | with conscious minds so that whether we knew
01:03:44.640 | they were conscious or not, and that world
01:03:46.960 | is a state of, it's unendurable.
01:03:49.720 | That would just, taking seriously the thesis
01:03:53.160 | that mind, intelligence,
01:03:58.080 | and consciousness ultimately are substrate independent.
01:04:00.720 | Right, you don't need a biological brain to be conscious.
01:04:03.360 | You certainly don't need a biological brain
01:04:04.760 | to be intelligent, right?
01:04:05.960 | So if we just imagine that consciousness at some point
01:04:09.000 | comes along for the ride as you scale up in intelligence,
01:04:12.520 | well then we could find ourselves creating conscious minds
01:04:16.000 | that are miserable, right?
01:04:17.160 | And that's just like creating a person who's miserable.
01:04:19.720 | Right, it could be worse than creating a person
01:04:21.440 | who's miserable, it could be even more sensitive
01:04:23.160 | to suffering.
01:04:24.000 | - Cloning them and maybe for entertainment,
01:04:26.320 | watching them suffer.
01:04:27.960 | - Just like watching a person suffer for entertainment.
01:04:31.240 | So, but back to your primary question here,
01:04:36.040 | which is differentiating consciousness and self
01:04:40.560 | and free will as concepts and kind of degrees
01:04:43.160 | of illusoriness.
01:04:44.340 | The problem with free will is that
01:04:48.720 | what most people mean by it, and Dan,
01:04:55.120 | this is where Dan Dennett is gonna get off the ride here.
01:04:57.360 | Right, so like he doesn't, he's gonna disagree with me
01:04:59.960 | that I know what most people mean by it.
01:05:02.240 | But I have a very keen sense, having talked about this topic
01:05:07.240 | for many, many years and seeing people get wrapped
01:05:11.760 | around the axle of it and seeing in myself
01:05:15.600 | what it's like to have felt that I was a self
01:05:18.820 | that had free will and then to no longer feel that way.
01:05:21.800 | Right, I mean, to know what it's like to actually
01:05:23.400 | disabuse myself of that sense,
01:05:26.880 | cognitively and emotionally.
01:05:30.000 | And to recognize what's left, what goes away
01:05:32.640 | and what doesn't go away on the basis of that epiphany.
01:05:35.940 | I have a sense that I know what people think
01:05:40.240 | they have in hand when they worry about
01:05:42.020 | whether free will exists.
01:05:44.480 | And it is the flip side of this feeling of self.
01:05:49.480 | It's the flip side of feeling like you are
01:05:53.320 | not merely identical to experience,
01:05:57.020 | you feel like you're having an experience.
01:05:59.040 | You feel like you're an agent that is appropriating
01:06:01.600 | an experience.
01:06:02.440 | There's a protagonist in the movie of your life
01:06:05.120 | and it is you.
01:06:06.220 | It's not just the movie, right?
01:06:09.440 | It's like there's sights and sounds and sensations
01:06:13.420 | and thoughts and emotions and this whole cacophony
01:06:17.060 | of experience, of felt experience,
01:06:18.840 | of felt experience of embodiment.
01:06:21.300 | But there seems to be a rider on the horse
01:06:26.080 | or a passenger in the body, right?
01:06:28.360 | People don't feel truly identical to their bodies
01:06:30.840 | down to their toes.
01:06:32.800 | They sort of feel like they have bodies.
01:06:34.760 | They feel like their minds in bodies
01:06:37.080 | and that feels like a self, that feels like me.
01:06:42.080 | And again, this gets very paradoxical
01:06:45.080 | when you talk about the experience of being
01:06:48.880 | in relationship to yourself or talking to yourself,
01:06:51.280 | giving yourself a pep talk.
01:06:52.400 | I mean, if you're the one talking,
01:06:54.240 | why are you also the one listening?
01:06:55.800 | Like, why do you need the pep talk and why does it work?
01:06:57.880 | If you're the one giving the pep talk, right?
01:07:00.000 | Or if I say, where are my keys?
01:07:02.120 | Right, if I'm looking for my keys,
01:07:03.780 | why do I think the superfluous thought, where are my keys?
01:07:06.760 | I know I'm looking for the fucking keys.
01:07:08.440 | I'm the one looking, who am I telling
01:07:11.400 | that we now need to look for the keys, right?
01:07:13.840 | So that duality is weird, but leave that aside.
01:07:16.320 | There's the sense, and this becomes very vivid
01:07:22.240 | when people try to learn to meditate.
01:07:25.400 | Most people, they close their eyes
01:07:28.760 | and they're told to pay attention
01:07:30.000 | to an object like the breath, say.
01:07:32.000 | So you close your eyes and you pay attention to the breath
01:07:35.320 | and you can feel it at the tip of your nose
01:07:37.280 | or the rising and falling of your abdomen.
01:07:39.600 | And you're paying attention
01:07:42.280 | and you feel something vague there.
01:07:44.960 | And then you think, I thought, why the breath?
01:07:46.920 | Why am I paying attention to the breath?
01:07:49.720 | What's so special about the breath?
01:07:51.840 | And then you notice you're thinking
01:07:54.200 | and you're not paying attention to the breath anymore.
01:07:55.720 | And then you realize, okay, the practice is,
01:07:58.000 | okay, I should notice thoughts
01:07:59.860 | and then I should come back to the breath.
01:08:01.960 | But this is the conventional starting point
01:08:06.760 | of feeling like you are an agent, very likely in your head,
01:08:10.160 | a locus of consciousness, a locus of attention
01:08:12.940 | that can strategically pay attention
01:08:15.820 | to certain parts of experience.
01:08:17.200 | Like I can focus on the breath
01:08:18.880 | and then I get lost in thought
01:08:20.580 | and now I can come back to the breath
01:08:22.400 | and I can open my eyes
01:08:23.460 | and I'm over here behind my face
01:08:26.080 | looking out at a world that's other than me
01:08:28.660 | and there's this kind of subject-object perception.
01:08:31.880 | And that is the default starting point of selfhood,
01:08:35.200 | of subjectivity.
01:08:36.920 | And married to that is the sense
01:08:41.060 | that I can decide what to do next.
01:08:46.060 | I am an agent who can pay attention to the cup.
01:08:50.340 | I can listen to sounds.
01:08:52.020 | There are certain things that I can't control.
01:08:53.580 | Certain things are happening to me
01:08:54.640 | and I just can't control them.
01:08:55.880 | So for instance, if someone asks,
01:08:59.000 | well, can you not hear a sound, right?
01:09:02.360 | Like don't hear the next sound.
01:09:03.840 | Don't hear anything for a second.
01:09:05.280 | Or don't hear, don't hear,
01:09:07.280 | you know, I'm snapping my fingers, don't hear this.
01:09:09.220 | Where's your free will?
01:09:10.160 | You know, well, like just stop this from coming in.
01:09:12.580 | You realize, okay, wait a minute.
01:09:14.640 | My abundant freedom does not extend
01:09:18.320 | to something as simple as just being able
01:09:20.240 | to pay attention to something other than this.
01:09:22.560 | Okay, well, so I'm not that kind of free agent,
01:09:25.860 | but at least I can decide what I'm gonna do next.
01:09:28.900 | I'm gonna pick up this water, right?
01:09:32.040 | And there's a feeling of identification
01:09:36.720 | with the impulse, with the intention,
01:09:39.680 | with the thought that occurs to you,
01:09:41.200 | with the feeling of speaking.
01:09:43.440 | Like, you know, what am I gonna say next?
01:09:45.520 | Well, I'm saying it, so here goes.
01:09:47.780 | This is me.
01:09:48.920 | It feels like I'm the thinker.
01:09:50.560 | I'm the one who's in control.
01:09:53.780 | But all of that is born of not really paying close attention
01:09:59.760 | to what it's like to be you.
01:10:02.000 | And so this is where meditation comes in,
01:10:05.240 | or this is where, again,
01:10:07.160 | you can get at this conceptually.
01:10:10.000 | You can unravel the notion of free will
01:10:11.760 | just by thinking certain thoughts,
01:10:15.760 | but you can't feel that it doesn't exist,
01:10:18.900 | unless you can pay close attention
01:10:20.720 | to how thoughts and intentions arise.
01:10:22.960 | So the way to unravel it conceptually
01:10:24.600 | is just to realize, okay, I didn't make myself.
01:10:27.320 | I didn't make my genes.
01:10:28.380 | I didn't make my brain.
01:10:29.300 | I didn't make the environmental influences
01:10:31.760 | that impinged upon this system for the last 54 years
01:10:35.160 | that have produced my brain in precisely the state
01:10:38.840 | it's in right now,
01:10:39.920 | such, I mean, with all of the receptor weightings
01:10:42.820 | and densities, and, you know, it's just,
01:10:45.660 | I'm exactly the machine I am right now
01:10:48.520 | through no fault of my own as the experiencing self.
01:10:53.520 | I get no credit and I get no blame
01:10:56.880 | for the genetics and the environmental influences here.
01:11:00.600 | And yet those are the only things that contrive
01:11:05.600 | to produce my next thought or impulse
01:11:10.680 | or moment of behavior.
01:11:12.880 | And if you were going to add something magical
01:11:14.880 | to that clockwork, like an immortal soul,
01:11:18.440 | you can also notice that you didn't produce your soul,
01:11:21.040 | right, like you can't account for the fact
01:11:22.760 | that you don't have the soul of someone
01:11:25.960 | who doesn't like any of the things you like
01:11:28.320 | or wasn't interested in any of the things
01:11:29.880 | you were interested in or, you know,
01:11:31.280 | or was a psychopath or was, you know, had an IQ of 40.
01:11:35.120 | I mean, like there's nothing about that
01:11:38.680 | that the person who believes in a soul
01:11:41.560 | can claim to have controlled.
01:11:43.320 | And yet that is also totally dispositive
01:11:45.840 | of whatever happens next.
01:11:48.400 | - But everything you've described now,
01:11:51.160 | maybe you can correct me, but it kind of speaks
01:11:52.920 | to the materialistic nature of the hardware.
01:11:55.780 | - But even if you add magical ectoplasm software,
01:12:01.200 | you didn't produce that either.
01:12:03.080 | - I know, but if we can think about
01:12:06.000 | the actual computation running on the hardware
01:12:09.680 | and running on the software,
01:12:11.060 | there's something you said recently,
01:12:12.700 | which you think of culture as an operating system.
01:12:17.240 | So if we just remove ourselves a little bit
01:12:21.120 | from the conception of human civilization
01:12:24.120 | being a collection of humans,
01:12:26.100 | and rather us just being a distributed computation system
01:12:31.100 | on which there's some kind of operating system running,
01:12:34.080 | and then the computation that's running
01:12:36.480 | is the actual thing that generates the interactions,
01:12:39.700 | the communications, and maybe even free will,
01:12:42.120 | the experiences of free will of all of those.
01:12:44.260 | Do you ever think of, do you ever try to reframe the world
01:12:47.200 | in that way, where it's like ideas are just using us,
01:12:51.720 | thoughts are using individual nodes in the system,
01:12:56.520 | and they're just jumping around,
01:12:58.040 | and they also have ability to generate experiences
01:13:01.480 | so that we can push those ideas along?
01:13:03.600 | And basically the main organisms here
01:13:05.820 | are the thoughts, not the humans.
01:13:07.960 | - Yeah, but then that erodes the boundary
01:13:11.800 | between self and world.
01:13:15.200 | - Right.
01:13:16.040 | - So then there's no self, a really integrated self
01:13:19.640 | to have any kind of will at all.
01:13:22.400 | Like if you're just a meme plex,
01:13:24.480 | I mean, if you're just a collection of memes,
01:13:28.640 | and I mean, we're all kind of like currents,
01:13:32.000 | like eddies in this river of ideas, right?
01:13:35.920 | So it's like, and it seems to have structure,
01:13:40.720 | but there's no real boundary
01:13:42.440 | between that part of the flow of water and the rest.
01:13:44.920 | I mean, if our, and I would say that much of our mind
01:13:47.900 | answers to this kind of description.
01:13:49.640 | I mean, so much of our mind has been,
01:13:52.620 | it's obviously not self-generated,
01:13:55.400 | and you're not gonna find it by looking in the brain.
01:13:58.000 | It is the result of culture largely,
01:14:03.000 | but also, you know,
01:14:09.680 | the genes on one side and culture on the other
01:14:12.080 | meeting to allow for
01:14:16.440 | manifestations of mind that aren't actually bounded
01:14:22.640 | by the person in any clear sense.
01:14:26.800 | It was just, the example I often use here,
01:14:31.220 | but there's so many others,
01:14:32.320 | is just the fact that we're following
01:14:34.880 | the rules of English grammar to whatever degree we are.
01:14:38.000 | It's not that, we certainly haven't consciously
01:14:40.180 | represented these rules for ourselves.
01:14:42.520 | We haven't invented these rules.
01:14:44.240 | We haven't, I mean, there are norms of language use
01:14:48.680 | that we couldn't even specify because we haven't,
01:14:53.040 | you know, we're not grammarians.
01:14:54.480 | We haven't studied this.
01:14:56.520 | We don't even have the right concepts,
01:14:58.280 | and yet we're following these rules,
01:15:00.000 | and we notice it as an error
01:15:04.480 | when we fail to follow these rules.
01:15:07.180 | And virtually every other cultural norm is like that.
01:15:11.260 | I mean, these are not things we've invented.
01:15:13.220 | You can consciously decide to scrutinize them
01:15:17.260 | and override them, but I mean,
01:15:20.420 | just think of any social situation
01:15:23.740 | where you're with other people,
01:15:25.700 | and you're behaving in ways
01:15:28.300 | that are culturally appropriate, right?
01:15:31.740 | You're not being wild animals together.
01:15:34.740 | You're following, so you have some expectation
01:15:36.700 | of how you shake a person's hand
01:15:39.100 | and how you deal with implements on a table,
01:15:43.580 | how you have a meal together.
01:15:45.120 | Obviously, this can change from culture to culture,
01:15:47.660 | and people can be shocked
01:15:49.500 | by how different those things are, right?
01:15:51.380 | We all have foods we find disgusting,
01:15:54.100 | but in some countries, dog is not one of those foods, right?
01:15:57.500 | And yet, you and I presumably would be horrified
01:16:00.500 | to be served dog.
01:16:03.820 | Those are not norms that we're,
01:16:07.180 | they are outside of us in some way,
01:16:09.340 | and yet they're felt very viscerally.
01:16:13.580 | I mean, they're certainly felt in their violation.
01:16:16.140 | You know, if you are, just imagine,
01:16:18.340 | you're in somebody's home,
01:16:21.420 | you're eating something that tastes great to you,
01:16:23.860 | and you happen to be in Vietnam or wherever,
01:16:25.900 | you know, you didn't realize dog
01:16:27.980 | was potentially on the menu,
01:16:29.500 | and you find out that you've just eaten 10 bites
01:16:33.500 | of what is really a cocker spaniel,
01:16:37.300 | and you feel this instantaneous urge to vomit, right,
01:16:42.300 | based on an idea, right?
01:16:44.900 | Like, so you did not,
01:16:47.020 | you're not the author of that norm
01:16:49.900 | that gave you such a powerful experience of its violation,
01:16:55.620 | and I'm sure we can trace the moment in your history,
01:16:59.620 | you know, vaguely where it sort of got in.
01:17:01.340 | I mean, very early on as kids,
01:17:02.940 | you realize you're treating dogs as pets
01:17:05.940 | and not as food, or as potential food.
01:17:08.400 | But yeah, no, it's,
01:17:13.260 | but the point you just made opens us to,
01:17:16.780 | like, we are totally permeable to a sea of mind.
01:17:21.780 | - Yeah, but if we take the metaphor
01:17:24.380 | of the distributed computing systems,
01:17:26.220 | each individual node is part of performing
01:17:30.860 | a much larger computation,
01:17:32.860 | but it nevertheless is in charge of doing the scheduling.
01:17:36.420 | So, assuming it's running Linux,
01:17:39.260 | it's doing the scheduling of processes
01:17:41.060 | and is constantly alternating between them.
01:17:42.700 | That node is making those choices.
01:17:46.220 | That node sure as hell believes it has free will,
01:17:49.240 | and it actually has free will
01:17:51.100 | 'cause it's making those hard choices,
01:17:53.140 | but the choices ultimately are part
01:17:54.900 | of a much larger computation that it can't control.
01:17:57.340 | Isn't it possible for that node to still be,
01:17:59.920 | that human node is still making the choice?
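[Editor's note: a minimal sketch of the scheduling metaphor in Lex's question, assuming a toy round-robin policy rather than anything the real Linux scheduler does; the Node class and the process names are hypothetical, purely for illustration. The point it makes concrete: the node really is the thing picking the next process, yet given the same queue state it would make the same pick every time.]

```python
# Toy "node" that schedules processes round-robin.
# It is the proximate cause of each switch, but its choice is
# fully determined by its state and its rule.
from collections import deque

class Node:
    def __init__(self, processes):
        self.run_queue = deque(processes)

    def schedule_next(self):
        # The node "chooses" the next process: pop the front of the
        # queue and rotate it to the back.
        chosen = self.run_queue.popleft()
        self.run_queue.append(chosen)
        return chosen

node = Node(["editor", "browser", "music"])
for tick in range(5):
    print(f"tick {tick}: running {node.schedule_next()}")
```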
01:18:04.080 | - Well, yeah, it is.
01:18:05.120 | So, I'm not saying that your body
01:18:08.840 | isn't doing, really doing things, right?
01:18:11.680 | And some of those things can be
01:18:14.080 | conventionally thought of as choices, right?
01:18:16.440 | So, it's like I can choose to reach,
01:18:19.160 | and it's like, it's not being imposed on me.
01:18:21.640 | That would be a different experience.
01:18:22.800 | Like, so there's an experience of,
01:18:25.640 | you know, there's definitely a difference
01:18:27.960 | between voluntary and involuntary action.
01:18:30.040 | So, that has to get conserved.
01:18:34.060 | By any account of the mind that jettisons free will,
01:18:36.600 | you still have to admit that there's a difference
01:18:39.280 | between a tremor that I can't control
01:18:42.920 | and a purposeful motor action that I can control
01:18:47.640 | and that I can initiate on demand,
01:18:49.140 | and it's associated with intentions.
01:18:50.940 | And it's got efferent motor copy,
01:18:55.760 | which is being predictive so that I can notice errors.
01:18:59.660 | You know, I have expectations.
01:19:00.940 | When I reach for this,
01:19:02.040 | if my hand were actually to pass through the bottle,
01:19:04.440 | because it's a hologram, I would be surprised, right?
01:19:07.160 | And so, that shows that I have a expectation
01:19:09.160 | of just what my grasping behavior
01:19:10.940 | is going to be like even before it happens.
01:19:13.760 | Whereas with a tremor,
01:19:14.620 | you don't have the same kind of thing going on.
01:19:17.120 | That's a distinction we have to make.
01:19:19.960 | So, I am, yes, I'm really the,
01:19:22.400 | my intention to move,
01:19:25.200 | which is in fact can be subjectively felt,
01:19:29.000 | really is the proximate cause of my moving.
01:19:31.000 | It's not coming from elsewhere in the universe.
01:19:33.320 | I'm not saying that.
01:19:35.040 | So, in that sense, the node is really deciding
01:19:37.560 | to execute the subroutine now.
01:19:42.260 | But that's not the feeling
01:19:49.720 | that has given rise to this conundrum of free will, right?
01:19:54.720 | So, the crucial thing is
01:19:57.020 | that people feel like
01:20:01.760 | they could have done otherwise, right?
01:20:04.280 | That's the thing.
01:20:05.840 | So, when you run back the clock of your life, right?
01:20:09.440 | You run back the movie of your life.
01:20:11.220 | You flip back the few pages in the novel of your life.
01:20:14.600 | They feel that at this point,
01:20:18.000 | they could behave differently than they did, right?
01:20:20.800 | So, like, but given,
01:20:23.240 | even given your distributed computing example,
01:20:25.860 | it's either a fully deterministic system
01:20:30.460 | or it's a deterministic system
01:20:32.100 | that admits of some random influence.
01:20:36.200 | In either case,
01:20:37.740 | that's not the free will people think they have.
01:20:41.560 | The free will people think they have is,
01:20:43.600 | damn, I shouldn't have done that.
01:20:46.520 | I just, like, I shouldn't have done that.
01:20:49.400 | I could have done otherwise, right?
01:20:51.400 | I should have done otherwise, right?
01:20:52.680 | Like, if you think about something
01:20:55.360 | that you deeply regret doing, right?
01:20:57.620 | Or that you hold someone else responsible for
01:21:00.820 | because they really are the upstream agent
01:21:03.280 | in your mind of what they did.
01:21:04.920 | That's an awful thing that that person did
01:21:08.080 | and they shouldn't have done it.
01:21:09.680 | There is this illusion and it has to be an illusion
01:21:12.920 | because there's no picture of causation
01:21:17.120 | that would make sense of it.
01:21:18.960 | There's this illusion that if you arrange the universe
01:21:21.920 | exactly the way it was a moment ago,
01:21:24.440 | it could have played out differently.
01:21:27.040 | And the only way it could have played out differently
01:21:31.600 | is if there's randomness added to that,
01:21:34.840 | but randomness isn't what people feel
01:21:37.840 | would give them free will, right?
01:21:39.380 | If you tell me that, you know,
01:21:41.160 | I only reached for the water bottle this time
01:21:44.000 | because somebody's,
01:21:45.080 | because there's a random number generator in there
01:21:47.080 | kicking off values and it finally moved my hand.
01:21:51.240 | That's not the feeling of authorship.
01:21:54.080 | - That's still not control.
01:21:55.680 | You're still not making that decision.
01:21:58.160 | There's actually, I don't know if you're familiar
01:22:00.560 | with cellular automata.
01:22:01.800 | It's a really nice visualization
01:22:03.560 | of how simple rules can create incredible complexity.
01:22:07.800 | It's like, really dumb initial conditions are set,
01:22:11.040 | simple rules applied,
01:22:12.640 | and eventually you watch this thing,
01:22:14.960 | and if the initial conditions are correct,
01:22:18.640 | then you're going to have something emerge
01:22:21.360 | that to our perception system
01:22:23.200 | looks like organisms interacting.
01:22:25.440 | You can construct any kinds of worlds,
01:22:27.120 | and they're not actually interacting.
01:22:29.800 | They're not actually even organisms,
01:22:32.000 | and they certainly aren't making decisions.
01:22:34.920 | So there's like systems you can create
01:22:37.120 | that illustrate this point.
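[Editor's note: as a concrete illustration of the cellular automata Lex describes, here is a minimal sketch, assuming Conway's Game of Life as the example system (not something specified in the conversation); the grid size and the "glider" seed are illustrative choices. A fixed local rule plus a dumb initial pattern produces something that, to our perception, looks like an organism crawling across the grid, even though no cell ever decides anything.]

```python
# Conway's Game of Life: deterministic local rules, no agents anywhere.
def step(grid):
    """One deterministic update: every cell's fate follows from its
    eight neighbors; nothing in the system 'decides' anything."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            neighbors = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            # Life rules: a live cell survives with 2-3 neighbors,
            # a dead cell becomes live with exactly 3.
            new[r][c] = 1 if neighbors == 3 or (grid[r][c] and neighbors == 2) else 0
    return new

# A "glider": a 5-cell seed that appears to crawl across the grid,
# even though no cell ever moves and no rule ever changes.
grid = [[0] * 12 for _ in range(12)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1

for generation in range(4):
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid), "\n")
    grid = step(grid)
```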
01:22:38.560 | The question is whether there could be some room
01:22:42.840 | for let's use in the 21st century the term magic,
01:22:47.440 | back to the black box of consciousness.
01:22:50.200 | Let me ask you this way.
01:22:51.840 | If you're wrong about your intuition about free will,
01:22:56.360 | what, and somebody comes along to you
01:22:58.440 | and proves to you that you didn't have the full picture,
01:23:03.320 | what would that proof look like?
01:23:04.880 | What would--
01:23:05.720 | - Well, that's the problem.
01:23:06.680 | That's why it's not even an illusion in my world
01:23:09.600 | because for me it's impossible to say
01:23:14.440 | what the universe would have to be like
01:23:16.840 | for free will to be a thing.
01:23:18.840 | It doesn't conceptually map onto
01:23:20.960 | any notion of causation we have,
01:23:24.720 | and that's unlike any other spurious claim you might make.
01:23:29.400 | So like if you're going to believe in ghosts,
01:23:34.680 | I understand what that claim could be.
01:23:37.880 | Where like I don't happen to believe in ghosts,
01:23:40.160 | but it's not hard for me to specify
01:23:44.280 | what would have to be true for ghosts to be real.
01:23:47.320 | And so it is with a thousand other things like ghosts.
01:23:50.040 | Right, so like, okay, so you're telling me
01:23:52.040 | that when people die, there's some part of them
01:23:54.820 | that is not reducible at all to their biology
01:23:57.440 | that lifts off them and goes elsewhere
01:24:00.600 | and is actually the kind of thing
01:24:02.160 | that can linger in closets and in cupboards
01:24:04.640 | and actually it's immaterial,
01:24:07.600 | but by some principle of physics,
01:24:09.200 | we don't totally understand.
01:24:10.240 | It can make sounds and knock objects
01:24:14.120 | and even occasionally show up
01:24:15.840 | so they can be visually beheld.
01:24:17.480 | And it's just, it seems like a miracle,
01:24:21.700 | but it's just some spooky noun in the universe
01:24:25.560 | that we don't understand.
01:24:27.160 | Let's call it a ghost.
01:24:28.260 | That's fine, I can talk about that all day,
01:24:31.920 | the reasons to believe in it,
01:24:32.960 | the reasons not to believe in it,
01:24:34.120 | the way we would scientifically test for it,
01:24:36.320 | what would have to be provable
01:24:38.160 | so as to convince me that ghosts are real.
01:24:40.900 | Free will isn't like that at all.
01:24:44.240 | There's no description of any concatenation of causes
01:24:49.240 | that precedes my conscious experience
01:24:52.100 | that sounds like what people think they have
01:24:55.280 | when they think they could have done otherwise
01:24:56.960 | and that they really, that they, the conscious agent,
01:25:00.160 | is really in charge, right?
01:25:01.860 | Like if you don't know what you're going to think next,
01:25:05.020 | right, and you can't help but think it,
01:25:07.420 | take those two premises on board.
01:25:12.840 | You don't know what it's gonna be,
01:25:14.760 | you can't stop it from coming,
01:25:18.240 | and until you actually know how to meditate,
01:25:21.820 | you can't stop yourself from
01:25:24.660 | fully living out its behavioral or emotional consequences.
01:25:31.220 | Right, like you have no, once you,
01:25:32.980 | mindfulness arguably gives you
01:25:36.580 | another degree of freedom here.
01:25:38.380 | It doesn't give you free will,
01:25:39.300 | but it gives you some other game to play
01:25:41.140 | with respect to the emotional
01:25:43.780 | and behavioral imperatives of thoughts.
01:25:46.220 | But short of that, I mean,
01:25:50.300 | the reason why mindfulness doesn't give you free will
01:25:52.180 | is because you can't, you know,
01:25:53.620 | you can't account for why in one moment mindfulness arises,
01:25:57.260 | and in other moments it doesn't, right?
01:26:00.020 | But a different process is initiated
01:26:03.500 | once you can practice in that way.
01:26:06.820 | - Well, if I could push back for a second.
01:26:08.860 | By the way, I just have this thought bubble
01:26:11.140 | popping up all the time of just two recent chimps
01:26:14.100 | arguing about the nature of consciousness.
01:26:16.740 | It's kind of hilarious.
01:26:17.860 | So on that thread, you know,
01:26:20.460 | if we're, even before Einstein,
01:26:22.860 | let's say before Einstein,
01:26:24.340 | we were to conceive about traveling from point A to point B,
01:26:29.580 | say some point in the future,
01:26:32.420 | we are able to realize through engineering
01:26:34.940 | a way which is consistent with Einstein's theory
01:26:39.380 | that you can have wormholes.
01:26:40.380 | You can travel from one point to another
01:26:42.780 | faster than the speed of light.
01:26:44.420 | And that would, I think,
01:26:47.620 | completely change our conception
01:26:49.460 | of what it means to travel in the physical space.
01:26:52.900 | And that, like, completely transform our ability.
01:26:57.460 | You talk about causality,
01:26:58.620 | but here let's just focus on what it means
01:27:01.300 | to travel through physical space.
01:27:03.380 | Don't you think it's possible that there will be
01:27:05.780 | inventions or leaps in understanding about reality
01:27:11.300 | that will allow us to see that free will is actual,
01:27:15.980 | like, us humans somehow may be linked
01:27:19.340 | to this idea of consciousness,
01:27:21.380 | are actually able to be authors of our actions?
01:27:25.580 | - It is a non-starter for me conceptually.
01:27:29.500 | It's a little bit like saying,
01:27:30.980 | could there be some breakthrough
01:27:34.420 | that will cause us to realize that circles are really square
01:27:39.420 | or that circles are not really round, right?
01:27:43.020 | No, a circle is what we mean by a perfectly round form.
01:27:47.620 | Right, like, it's not on the table to be revised.
01:27:52.420 | And so I would say the same thing about consciousness.
01:27:54.700 | It's just like saying, is there some breakthrough
01:27:58.900 | that would get us to realize
01:28:00.020 | that consciousness is really an illusion?
01:28:02.660 | I'm saying no, because what the experience of an illusion
01:28:06.540 | is as much a demonstration of what I'm calling consciousness
01:28:09.020 | as anything else, right?
01:28:09.860 | It's like, that is consciousness.
01:28:11.420 | With free will, it's a similar problem.
01:28:15.340 | It's like,
01:28:16.180 | again, it comes down to a picture of causality
01:28:22.620 | and there's no other picture on offer.
01:28:27.020 | And what's more,
01:28:28.300 | I know what it's like on the experiential side
01:28:34.220 | to lose the thing to which it is clearly anchored, right?
01:28:39.220 | The feel like, it doesn't feel,
01:28:41.660 | and this is the question that almost nobody asked,
01:28:43.740 | people who are debating me on the topic of free will,
01:28:46.500 | 15 minute intervals, I'm making a claim
01:28:51.980 | that I don't feel this thing,
01:28:53.500 | and they never become interested in,
01:28:58.220 | well, what's that like?
01:28:59.220 | Like, okay, so you're actually saying you don't,
01:29:02.500 | this thing isn't true for you empirically.
01:29:05.880 | It's not just, because most people
01:29:07.580 | who don't believe in free will philosophically
01:29:11.140 | also believe that we're condemned to experience it.
01:29:15.140 | Like, you can't live without this feeling.
01:29:18.180 | - So you're actually saying
01:29:20.700 | you're able to experience the absence of
01:29:23.620 | the illusion of free will.
01:29:27.060 | - Yes, yes.
01:29:28.060 | - For, are we talking about a few minutes at a time,
01:29:32.620 | or does this require a lot of work and meditation,
01:29:37.620 | or are you literally able to load that into your mind
01:29:41.140 | and like play that movie?
01:29:42.020 | - Right now, right now, just in this conversation.
01:29:44.660 | So it's not absolutely continuous,
01:29:48.980 | but it's whenever I pay attention.
01:29:51.580 | It's like, it's the same,
01:29:52.900 | and I would say the same thing for the illusoriness
01:29:55.540 | of the self in the sense,
01:29:56.740 | and again, we haven't talked about this, so.
01:29:58.940 | - Can you still have the self
01:30:00.340 | and not have the free will in your mind at the same time?
01:30:03.940 | Do they go away at the same time?
01:30:03.940 | - This is the same, yeah, it's the same thing that--
01:30:06.500 | - They're always holding hands
01:30:07.500 | when they walk out the door.
01:30:08.780 | - They really are two sides of the same coin.
01:30:10.620 | But it's just, it comes down to what it's like
01:30:13.300 | to try to get to the end of the sentence,
01:30:16.540 | or what it's like to finally decide
01:30:18.940 | that it's been long enough
01:30:20.420 | and now I need another sip of water, right?
01:30:22.180 | If I'm paying attention, now, if I'm not paying attention,
01:30:25.860 | I'm probably, I'm captured by some other thought
01:30:28.720 | and that feels a certain way, right?
01:30:30.700 | And so that's not, it's not vivid,
01:30:32.260 | but if I try to make vivid this experience of just,
01:30:35.540 | okay, I'm finally going to experience free will.
01:30:38.260 | I'm going to notice my free will, right?
01:30:40.420 | Like it's got to be here, everyone's talking about it.
01:30:43.260 | Where is it?
01:30:44.100 | I'm going to pay attention to it.
01:30:44.920 | I'm going to look for it.
01:30:45.760 | And I'm going to create a circumstance
01:30:48.620 | that is where it has to be most robust, right?
01:30:52.180 | I'm not rushed to make this decision.
01:30:54.660 | I'm not, it's not a reflex.
01:30:57.060 | I'm not under pressure.
01:30:58.660 | I'm going to take as long as I want.
01:30:59.940 | I'm going to decide.
01:31:02.060 | It's not trivial.
01:31:02.940 | Like, so it's not just like reaching with my left hand
01:31:04.740 | or reaching with my right hand.
01:31:05.680 | People don't like those examples for some reason.
01:31:07.700 | Let's make a big decision.
01:31:09.220 | Like, where should, you know,
01:31:14.380 | what should my next podcast be on, right?
01:31:16.340 | Who do I invite on the next podcast?
01:31:18.580 | What is it like to make that decision?
01:31:20.600 | When I pay attention,
01:31:22.800 | there is no evidence of free will anywhere in sight.
01:31:26.680 | It's like, it doesn't feel like,
01:31:28.160 | it feels profoundly mysterious
01:31:31.220 | to be going back between two people.
01:31:33.400 | You know, like, is it going to be person A or person B?
01:31:37.600 | Got all my reasons for A and all my reasons why not,
01:31:40.240 | and all my reasons for B,
01:31:41.240 | and there's some math going on there
01:31:43.960 | that I'm not even privy to,
01:31:46.460 | where certain concerns are trumping others.
01:31:49.380 | And at a certain point, I just decide.
01:31:52.880 | And yes, you can say I'm the node in the network
01:31:56.020 | that has made that decision, absolutely.
01:31:57.540 | I'm not saying it's being piped to me from elsewhere,
01:32:00.360 | but the feeling of what it's like to make that decision
01:32:04.580 | is totally without a sense,
01:32:11.300 | a real sense of agency,
01:32:15.060 | because something simply emerges.
01:32:18.740 | It's literally as tenuous as,
01:32:22.400 | what's the next sound I'm going to hear?
01:32:25.620 | Right, or what's the next thought that's going to appear?
01:32:29.440 | And it just, something just appears, you know?
01:32:32.300 | And if something appears to cancel that something,
01:32:34.660 | like if I say, I'm going to invite her,
01:32:37.740 | and then I'm about to send the email,
01:32:39.300 | and then I think, oh, no, no, no, I can't do that.
01:32:42.500 | There was that thing in that New York article I read
01:32:45.180 | that I got to talk to this guy, right?
01:32:47.660 | That pivot at the last second,
01:32:49.700 | you can make it as muscular as you want.
01:32:53.940 | It always just comes out of the darkness.
01:32:55.960 | It's always mysterious.
01:32:57.780 | - So right, when you try to pin it down,
01:32:59.380 | you really can't ever find that free will.
01:33:01.980 | If you construct an experiment for yourself,
01:33:06.140 | and you try to really find that moment
01:33:07.900 | when you're actually making that controlled,
01:33:10.380 | author decision, it's very difficult to do.
01:33:12.740 | - And we're still, we know at this point
01:33:15.900 | that if we were scanning your brain
01:33:18.220 | in some podcast guest choosing experiment, right?
01:33:23.220 | We know at this point we would be privy
01:33:26.980 | to who you're going to pick before you are.
01:33:29.500 | You, the conscious agent.
01:33:30.740 | If we could, again, this is operationally
01:33:33.100 | a little hard to conduct, but there's enough data now
01:33:36.060 | to know that something very much like this cartoon
01:33:39.660 | is in fact true, and will ultimately be undeniable
01:33:44.460 | for people, that they'll be able to do it on themselves
01:33:48.100 | with some app.
01:33:49.480 | If you're deciding what to, where to go for dinner,
01:33:55.540 | or who to have on your podcast, or ultimately,
01:33:57.820 | who to marry, right, or what city to move to, right?
01:34:00.940 | Like you can make it as big or as small
01:34:03.380 | a decision as you want.
01:34:05.580 | We could be scanning your brain in real time,
01:34:07.980 | and at a point where you still think you're uncommitted,
01:34:12.540 | we would be able to say with arbitrary accuracy,
01:34:17.020 | all right, Lex is, he's moving to Austin, right?
01:34:20.780 | - I didn't choose that.
01:34:21.660 | - Yeah, he was, it was gonna be Austin,
01:34:23.780 | or it was gonna be Miami.
01:34:24.780 | He got, he's catching one of these two waves,
01:34:27.700 | but it's gonna be Austin.
01:34:29.220 | And at a point where you subjectively,
01:34:31.780 | if we could ask you, you would say,
01:34:34.140 | oh no, I'm still working over here.
01:34:36.900 | I'm still thinking, I'm still choosing,
01:34:38.860 | I'm still considering my options.
01:34:40.700 | - And you've spoken to this.
01:34:42.160 | In your thinking about other stuff in the world,
01:34:45.500 | it's been very useful to step away
01:34:49.900 | from this illusion of free will.
01:34:51.460 | And you argue that it probably makes a better world
01:34:54.100 | because you can be compassionate
01:34:55.340 | and empathetic towards others.
01:34:56.860 | - And toward oneself.
01:34:58.340 | - And towards oneself.
01:34:59.180 | - I mean, radically toward others,
01:35:01.860 | in that literally hate makes no sense anymore.
01:35:05.780 | I mean, there's certain things you can
01:35:07.900 | really be worried about, really want to oppose,
01:35:10.900 | really, I mean, I'm not saying you'd never have
01:35:12.660 | to kill another person.
01:35:13.820 | Like, I mean, self-defense is still a thing, right?
01:35:16.380 | But the idea that you're ever confronting
01:35:21.380 | anything other than a force of nature in the end
01:35:25.540 | goes out the window, right?
01:35:26.740 | Or does go out the window when you really pay attention.
01:35:29.060 | I'm not saying that this would be easy to grok
01:35:33.380 | if someone kills a member of your family.
01:35:38.380 | I'm not saying you can just listen to my 90 minutes
01:35:40.300 | on free will and then you should be able to see
01:35:41.900 | that person as identical to a grizzly bear or a virus.
01:35:46.540 | 'Cause I mean, we are so evolved to deal with one another
01:35:51.540 | as fellow primates and as agents.
01:35:57.940 | But it's, yeah, when you're talking about the possibility
01:36:02.940 | of Christian, truly Christian forgiveness, right?
01:36:07.940 | It is like, as testified to by various saints
01:36:14.180 | of that flavor over the millennia.
01:36:24.220 | Yeah, the doorway to that is to recognize
01:36:28.900 | that no one really at bottom made themselves.
01:36:32.580 | And therefore everyone, what we're seeing really
01:36:36.340 | are differences in luck in the world.
01:36:39.100 | We're seeing people who are very, very lucky
01:36:41.820 | to have had good parents and good genes
01:36:43.620 | and to be in good societies and had good opportunities
01:36:46.340 | and to be intelligent and to be not sociopathic.
01:36:50.060 | None of it is on them.
01:36:53.020 | They're just reaping the fruits of one lottery
01:36:56.580 | after another and then showing up in the world on that basis.
01:37:00.100 | And then so it is with every malevolent asshole out there.
01:37:06.500 | He or she didn't make themself.
01:37:09.500 | Even if that weren't possible,
01:37:14.040 | the utility for self-compassion is also enormous
01:37:18.580 | because when you just look at what it's like
01:37:21.740 | to regret something or to feel shame about something
01:37:26.740 | or feel deep embarrassment about it,
01:37:29.020 | these states of mind are some of the most deranging
01:37:33.180 | experiences anyone has and the indelible reaction to them,
01:37:38.180 | the memory of the thing you said,
01:37:42.140 | the memory of the wedding toast you gave 20 years ago
01:37:44.860 | that was just mortifying, right?
01:37:47.100 | The fact that that can still make you hate yourself, right?
01:37:50.300 | And like that psychologically,
01:37:52.340 | that is a knot that can be untied, right?
01:37:56.300 | - Speak for yourself, Sam.
01:37:57.340 | - Yeah, yeah. - So clearly you're not--
01:37:58.180 | - You gave a great toast.
01:37:59.900 | It was my toast that mortified me.
01:38:01.100 | - No, no, no, that's not what I was referring to.
01:38:02.860 | I'm deeply appreciative in the same way
01:38:07.340 | that you're referring to of every moment I'm alive,
01:38:10.380 | but I'm also powered by self-hate often.
01:38:14.340 | Like several things in this conversation already
01:38:18.100 | that I've spoken, I'll be thinking about
01:38:21.100 | like that was the dumbest thing.
01:38:23.100 | You're sitting in front of Sam Harris and you said that.
01:38:26.260 | So like that, but that somehow creates
01:38:29.300 | a richer experience for me.
01:38:30.660 | I've actually come to accept that as a nice feature
01:38:33.300 | of however my brain was built.
01:38:35.340 | I don't think I wanna let go of that.
01:38:37.340 | - Well, I think the thing you wanna let go of is
01:38:40.340 | the suffering associated with it.
01:38:46.420 | So for me, so this is very,
01:38:49.420 | psychologically and ethically,
01:38:53.420 | all of this is very interesting.
01:38:55.220 | So I don't think we should ever get rid
01:38:57.860 | of things like anger, right?
01:38:59.500 | So like hatred is, hatred is divorceable from anger
01:39:02.700 | in the sense that hatred is this enduring state
01:39:06.780 | where whether you're hating somebody else
01:39:09.100 | or hating yourself, it is toxic and durable
01:39:14.100 | and ultimately useless, right?
01:39:15.900 | Like it becomes self nullifying, right?
01:39:19.540 | Like you become less capable as a person
01:39:23.300 | to solve any of your problems.
01:39:24.500 | It's not instrumental in solving the problem
01:39:26.660 | that is occasioning all this hatred.
01:39:29.940 | And anger for the most part isn't either
01:39:33.540 | except as a signal of salience that there's a problem, right?
01:39:37.300 | So if somebody does something that makes me angry,
01:39:40.280 | that just promotes this situation to conscious attention
01:39:45.100 | in a way that is stronger than my not really caring
01:39:48.900 | about it, right?
01:39:49.740 | And there are things that I think should make us angry
01:39:51.460 | in the world and there's the behavior of other people
01:39:54.540 | that should make us angry because we should respond to it.
01:39:57.460 | And so it is with yourself.
01:39:59.140 | If I do something, as a parent, if I do something stupid
01:40:03.180 | that harms one of my daughters, right?
01:40:05.800 | My experience of myself and my beliefs about free will
01:40:12.380 | closed the door to my saying,
01:40:14.380 | well, I should have done otherwise
01:40:15.820 | in the sense that if I could go back in time,
01:40:17.660 | I would have actually effectively done otherwise.
01:40:20.380 | No, I would do, given the same causes and conditions,
01:40:22.820 | I would do that thing a trillion times in a row, right?
01:40:26.300 | But, you know, regret and feeling bad about an outcome
01:40:31.300 | are still important capacities because, like, yeah,
01:40:38.140 | I desperately want my daughters to be happy and healthy.
01:40:41.740 | So if I've done something, you know,
01:40:43.460 | if I crash the car when they're in the car
01:40:45.980 | and they get injured, right?
01:40:47.060 | And I do it because I was trying to change a song
01:40:50.820 | on my playlist or something stupid,
01:40:53.660 | I'm gonna feel like a total asshole.
01:40:55.600 | How long do I stew in that feeling of regret, right?
01:41:01.820 | And what utility is there to extract
01:41:06.980 | out of this error signal?
01:41:08.500 | And then what do I do?
01:41:10.140 | We're always faced with the question of what to do next,
01:41:13.540 | right, and how to best do that thing,
01:41:16.520 | that necessary thing next.
01:41:18.280 | And how much wellbeing can we experience while doing it?
01:41:23.280 | Like how miserable do you need to be
01:41:28.100 | to solve your problems in life
01:41:31.120 | and to help solve the problems of people closest to you?
01:41:34.940 | You know, how miserable do you need to be
01:41:36.760 | to get through your to-do list today?
01:41:39.660 | Ultimately, I think you can be deeply happy
01:41:44.660 | going through all of it, right?
01:41:49.540 | And even navigating moments that are scary
01:41:54.060 | and really destabilizing to ordinary people.
01:41:59.940 | And I mean, I think, you know, again,
01:42:04.940 | I'm always up kind of at the edge of my own capacities here
01:42:09.060 | and there are all kinds of things that stress me out
01:42:11.100 | and worry me, and I mean, especially something,
01:42:12.860 | if it's, you're gonna tell me it's something with,
01:42:14.700 | you know, the health of one of my kids,
01:42:17.220 | you know, it's very hard for me,
01:42:18.660 | like it's very hard for me to be truly equanimous
01:42:21.700 | around that, but equanimity is so useful
01:42:25.700 | the moment you're in response mode, right?
01:42:29.620 | 'Cause I mean, the ordinary experience for me
01:42:34.120 | of responding to what seems like a medical emergency
01:42:38.620 | for one of my kids is to be obviously super energized
01:42:43.620 | by concern to respond to that emergency.
01:42:47.160 | But then once I'm responding, all of my fear and agitation
01:42:52.980 | and worry and, oh my God, what if this is really
01:42:56.300 | something terrible, finding any of those thoughts
01:43:00.780 | compelling only diminishes my capacity as a father
01:43:05.500 | to be good company while we navigate
01:43:08.300 | this really turbulent passage, you know?
01:43:11.500 | - As you're saying this, actually, one guy comes to mind,
01:43:13.540 | which is Elon Musk, one of the really impressive things
01:43:16.860 | to me was to observe how many dramatic things
01:43:19.980 | he has to deal with throughout the day at work,
01:43:22.460 | but also if you look through his life, family too,
01:43:25.780 | and how he's very much actually, as you're describing,
01:43:30.500 | basically a practitioner of this way of thought,
01:43:33.020 | which is you're not in control,
01:43:36.660 | you're basically responding,
01:43:39.900 | no matter how traumatic the event,
01:43:41.380 | and there's no reason to sort of linger on the--
01:43:44.220 | - Well, yeah, they couldn't be--
01:43:45.060 | - On the negative feelings around that.
01:43:46.820 | - Well, so, but he's in a very specific situation,
01:43:50.820 | which is unlike normal life, you know,
01:43:57.060 | even his normal life, but normal life for most people,
01:44:00.800 | because when you just think of like, you know,
01:44:02.700 | he's running so many businesses,
01:44:04.060 | and they're highly non-standard businesses,
01:44:08.620 | so what he's seeing is everything that gets to him
01:44:12.420 | is some kind of emergency, like it wouldn't be
01:44:14.340 | getting to him, if it needs his attention,
01:44:16.580 | there's a fire somewhere, so he's constantly responding
01:44:19.860 | to fires that have to be put out,
01:44:21.820 | so there's no default expectation
01:44:25.480 | that there shouldn't be a fire, right?
01:44:27.340 | But in our normal lives, we live,
01:44:29.580 | most of us who are lucky, right,
01:44:31.720 | not everyone, obviously, on Earth,
01:44:33.180 | but most of us who are at some kind of cruising altitude
01:44:36.600 | in terms of our lives, where we're reasonably healthy,
01:44:40.440 | and life is reasonably orderly,
01:44:42.120 | and the political apparatus around us
01:44:44.340 | is reasonably functionable, so I said functionable
01:44:49.160 | for the first time in my life,
01:44:50.160 | through no free will of my own,
01:44:51.340 | so like I noticed those errors,
01:44:53.080 | and they do not feel like agency,
01:44:56.260 | and nor does the success of an utterance
01:44:59.000 | feel like agency.
01:44:59.860 | When you're looking at normal human life, right,
01:45:06.400 | where you're just trying to be happy and healthy,
01:45:10.360 | and get your work done, there's this default expectation
01:45:14.520 | that there shouldn't be fires,
01:45:16.400 | people shouldn't be getting sick or injured,
01:45:18.960 | we shouldn't be losing vast amounts of our resources,
01:45:23.720 | we should like, so when something really stark
01:45:27.480 | like that happens, people don't have that muscle,
01:45:32.480 | that they're, like I've been responding
01:45:36.240 | to emergencies all day long, seven days a week,
01:45:40.960 | in business mode, and so I have a very thick skin,
01:45:44.160 | this is just another one, and I'm not expecting
01:45:46.680 | anything else when I wake up in the morning,
01:45:48.720 | no, we have this default sense that,
01:45:50.860 | I mean, honestly, most of us have the default sense
01:45:54.440 | that we aren't gonna die, right,
01:45:57.140 | or that we should, like maybe we're not gonna die,
01:45:59.720 | right, like death denial really is a thing,
01:46:02.920 | because, and you can see it, just like I can see
01:46:08.060 | when I reach for this bottle,
01:46:09.860 | that I was expecting it to be solid,
01:46:11.640 | because when it isn't solid, when it's a hologram,
01:46:13.660 | and I just, my fist closes on itself, I'm damn surprised,
01:46:18.660 | people are damn surprised to find out
01:46:22.160 | that they're going to die, to find out that they're sick,
01:46:24.720 | to find out that someone they love has died,
01:46:27.700 | or is going to die, so it's like,
01:46:29.620 | the fact that we are surprised by any of that
01:46:33.620 | shows us that we're living at a,
01:46:36.420 | we're living in a mode that is,
01:46:39.020 | you know, we're perpetually diverting ourselves
01:46:47.300 | from some facts that should be obvious, right,
01:46:50.540 | and the more salient we can make them,
01:46:55.000 | you know, the more, I mean, in the case of death,
01:46:57.720 | it's a matter of being able to get one's priorities straight,
01:47:01.080 | I mean, the moment, and this is hard for everybody,
01:47:04.780 | even those who are really in the business
01:47:06.600 | of paying attention to it, but the moment you realize
01:47:09.760 | that every circumstance is finite, right,
01:47:13.480 | you've got a certain number of, you know,
01:47:15.920 | you've got whatever, whatever it is,
01:47:17.080 | 8,000 days left in a normal span of life,
01:47:20.720 | and 8,000 is a, sounds like a big number,
01:47:24.320 | it's not that big a number, right,
01:47:25.760 | so it's just like, and then you can decide
01:47:29.480 | how you want to go through life,
01:47:31.600 | and how you want to experience each one of those days,
01:47:34.520 | and so I would, back to our jumping off point,
01:47:37.640 | I would argue that you don't want to feel self-hatred ever,
01:47:44.120 | I would argue that you don't want to really,
01:47:49.040 | really grasp onto any of those moments
01:47:55.280 | where you are internalizing the fact
01:47:58.880 | that you just made an error, you've embarrassed yourself,
01:48:01.440 | that something didn't go the way you wanted it to,
01:48:03.760 | I think you want to treat all of those moments
01:48:05.960 | very, very lightly, you want to extract
01:48:08.320 | the actionable information, it's something to learn,
01:48:11.920 | oh, I learned that when I prepare in a certain way,
01:48:16.920 | it works better than when I prepare in some other way,
01:48:20.000 | or don't prepare, right, so like, yes, lesson learned,
01:48:23.240 | you know, and do that differently, but,
01:48:26.100 | yeah, I mean, so many of us have spent so much time
01:48:33.600 | with a very dysfunctional and hostile,
01:48:40.320 | and even hateful inner voice
01:48:45.320 | governing a lot of our self-talk,
01:48:48.320 | and a lot of just our default way of being with ourselves,
01:48:51.800 | I mean, the privacy of our own minds,
01:48:54.160 | we're in the company of a real jerk a lot of the time,
01:48:58.200 | and that can't help but affect,
01:49:03.280 | I mean, forget about just your own sense of well-being,
01:49:05.600 | it can't help but limit what you're capable of
01:49:08.560 | in the world with other people.
01:49:10.600 | - I'll have to really think about that,
01:49:12.160 | I just take pride that my jerk, my inner voice jerk,
01:49:15.120 | is much less of a jerk than somebody like David Goggins,
01:49:18.160 | who's just like screaming in his ear constantly,
01:49:20.160 | so I just have a relativist kind of perspective
01:49:23.200 | that it's not as bad as that, at least.
01:49:25.800 | - Well, having a sense of humor also helps,
01:49:28.240 | you know, it's just like it's not,
01:49:29.520 | the stakes are never quite what you think they are,
01:49:32.640 | and even when they are, I mean,
01:49:34.880 | just the difference between being able to see the comedy
01:49:39.560 | of it rather than, 'cause again,
01:49:41.960 | there's this sort of dark star of self-absorption
01:49:46.420 | that pulls everything into it, right,
01:49:49.160 | and if that's the algorithm you don't want to run,
01:49:54.160 | so it's like you just want things to be good,
01:49:57.600 | so just push the concern out there,
01:50:02.760 | not have the collapse of,
01:50:04.840 | oh my God, what does this say about me?
01:50:06.600 | It's just like, what does this say about,
01:50:08.680 | how do we make this meal that we're all having together
01:50:11.480 | as fun and as useful as possible?
01:50:15.600 | - And you're saying in terms of propulsion systems,
01:50:17.480 | you recommend humor as a good spaceship
01:50:19.640 | to escape the gravitational field of that darkness.
01:50:23.640 | - Well, it certainly helps, yeah.
01:50:24.920 | - Yeah, well, let me ask you a little bit
01:50:28.040 | about ego and fame, which is very interesting.
01:50:31.360 | The way you're talking, given that you're one
01:50:34.520 | of the biggest intellects, living intellects
01:50:39.520 | and minds of our time, and there's a lot of people
01:50:42.360 | that really love you and almost elevate you
01:50:47.360 | to a certain kind of status where you're like the guru.
01:50:50.720 | I'm surprised you didn't show up in a robe, in fact.
01:50:53.320 | Is there-- - A hoodie.
01:50:55.640 | Is that not the highest status garment one can wear now?
01:50:59.280 | - The socially acceptable version of the robe.
01:51:02.040 | - If you're a billionaire, you wear a hoodie.
01:51:04.600 | - Is there something you can say about managing
01:51:07.200 | the effects of fame on your own mind,
01:51:11.480 | on not creating this, when you wake up in the morning,
01:51:15.400 | when you look in the mirror, how do you get your ego
01:51:20.400 | not to grow exponentially, your conception of self
01:51:25.320 | to grow exponentially, because there's so many people
01:51:27.400 | feeding that, is there something to be said about this?
01:51:30.200 | - It's really not hard, because I feel like I have
01:51:33.640 | a pretty clear sense of my strengths and weaknesses,
01:51:38.640 | and I don't feel like it's, honestly, I don't feel
01:51:44.880 | like I suffer from much grandiosity.
01:51:47.880 | There's so many things I'm not good at,
01:51:52.640 | there's so many things I will, given the remaining
01:51:55.640 | 8,000 days at best, I will never get good at.
01:52:00.040 | I would love to be good at these things.
01:52:02.560 | So it's just, it's easy to feel diminished
01:52:05.280 | by comparison with the talents of others.
01:52:08.800 | - Do you remind yourself of all the things
01:52:11.160 | that you're not competent in?
01:52:14.280 | Is, I mean, like-- - Well, they're just
01:52:15.560 | on display for me every day, in that I appreciate
01:52:17.800 | the talents of others.
01:52:19.320 | - But you notice them.
01:52:20.280 | I'm sure Stalin and Hitler did not notice
01:52:22.840 | all the ways in which they were,
01:52:25.240 | I mean, this is why absolute power corrupts absolutely,
01:52:28.760 | is you stop noticing the things
01:52:30.760 | in which you're ridiculous and wrong.
01:52:33.680 | - Right, yeah, no, I-- - Not to compare you to Stalin.
01:52:37.000 | - Yeah, well, I'm sure there's an inner Stalin
01:52:40.040 | in there somewhere.
01:52:41.320 | - Well, we all have, we all carry a baby Stalin with us.
01:52:43.600 | - He wears better clothes.
01:52:44.840 | And I'm not gonna grow that mustache.
01:52:49.040 | Those concerns don't map, they don't map onto me
01:52:51.680 | for a bunch of reasons, but one is,
01:52:53.960 | I also have a very peculiar audience.
01:52:56.120 | I like, I'm just, I've been appreciating this
01:53:00.680 | for a few years, but it's, I'm just now
01:53:04.160 | beginning to understand that there are many people
01:53:06.600 | who have audiences of my size or larger
01:53:09.840 | that have a very different experience
01:53:11.560 | of having an audience than I do.
01:53:13.280 | I have curated, for better or worse,
01:53:17.040 | a peculiar audience, and the net result of that is
01:53:23.040 | virtually any time I say anything of substance,
01:53:28.040 | something like half of my audience,
01:53:30.600 | my real audience, not haters from outside my audience,
01:53:33.080 | but my audience just revolts over it, right?
01:53:38.080 | They just like, oh my God, I can't believe you said,
01:53:41.000 | like you're such a schmuck, right?
01:53:43.200 | - They revolt with rigor and intellectual sophistication.
01:53:47.040 | - Or not, or not, but I mean, it's both,
01:53:49.480 | but it's like, but people who are like,
01:53:51.040 | so I mean, the clearest case is,
01:53:53.280 | you know, I have whatever audience I have,
01:53:55.200 | and then Trump appears on the scene,
01:53:56.820 | and I discover that something like 20% of my audience
01:54:00.600 | just went straight to Trump and couldn't believe
01:54:03.840 | I didn't follow them there.
01:54:05.120 | They were just aghast that I didn't see
01:54:06.920 | that Trump was obviously exactly what we needed
01:54:10.040 | for, to steer the ship of state for the next four years,
01:54:15.040 | and then four years beyond that.
01:54:17.000 | So like, so that's one example.
01:54:20.400 | So whenever I said anything about Trump,
01:54:22.640 | I would hear from people who loved more or less
01:54:25.760 | everything else I was up to and had for years,
01:54:28.920 | but everything I said about Trump just gave me pure pain
01:54:33.380 | from this quadrant of my audience.
01:54:36.160 | But then the same thing happens when I say something
01:54:39.560 | about the derangement of the far left.
01:54:42.000 | Anything I say about wokeness, right, or identity politics,
01:54:46.360 | same kind of punishment signal from, again,
01:54:48.560 | people who are core to my audience,
01:54:51.760 | like I've read all your books,
01:54:53.520 | I'm using your meditation app,
01:54:55.680 | I love what you say about science,
01:54:57.920 | but you are so wrong about politics,
01:55:00.080 | and you are, you know, I'm starting to think
01:55:01.560 | you're a racist asshole for everything you said
01:55:03.160 | about identity politics.
01:55:05.120 | And there are so many,
01:55:07.860 | the free will topic is just like this.
01:55:10.040 | It's like, just, they love what I'm saying
01:55:13.220 | about consciousness and the mind,
01:55:15.120 | and they love to hear me talk about physics with physicists,
01:55:18.280 | and it's all good.
01:55:20.420 | This free will stuff is,
01:55:21.600 | I cannot believe you don't see how wrong you are.
01:55:24.460 | What a fucking embarrassment you are.
01:55:26.760 | So, but I'm starting to notice that there are other people
01:55:30.360 | who don't have this experience of having an audience
01:55:33.480 | because they have, I mean,
01:55:35.000 | just take the Trump-Woke dichotomy.
01:55:37.360 | They just castigated Trump the same way I did,
01:55:41.680 | but they never say anything bad about the far left.
01:55:44.160 | So they never get this punishment signal,
01:55:45.760 | or you flip it.
01:55:46.880 | They're all about the insanity of critical race theory now.
01:55:51.880 | We connect all those dots the same way,
01:55:56.760 | but they never really specified what was wrong with Trump,
01:56:00.400 | or they thought there was a lot right with Trump,
01:56:02.440 | and they got all the pleasure of that.
01:56:04.880 | And so they have much more homogenized audiences.
01:56:08.600 | And so my experience, so just to come back to, you know,
01:56:13.440 | this experience of fame or quasi-fame,
01:56:15.400 | and in truth it's not real fame,
01:56:18.840 | but still there's an audience there.
01:56:22.060 | It is a, it's now an experience where basically
01:56:28.440 | whatever I put out,
01:56:30.980 | I notice a ton of negativity coming back at me.
01:56:34.400 | And it just, it is what it is.
01:56:37.840 | I mean, now it's like, I used to think,
01:56:40.320 | wait a minute, there's gotta be some way
01:56:42.060 | for me to communicate more clearly here
01:56:44.000 | so as not to get this kind of lunatic response
01:56:49.000 | from my own audience,
01:56:50.960 | from like people who are showing all the signs of,
01:56:53.500 | we've been here for years for a reason, right?
01:56:57.140 | These are not just trolls.
01:56:59.080 | And so I think, okay, I'm gonna take 10 more minutes
01:57:01.880 | and really just tell you what,
01:57:05.000 | it should be absolutely clear
01:57:06.520 | about what's wrong with Trump.
01:57:07.720 | Right, I've done this a few times,
01:57:09.440 | but I think I gotta do this again.
01:57:11.640 | Or wait a minute, how are they not getting
01:57:15.040 | that these episodes of police violence
01:57:17.600 | are so obviously different from one another
01:57:19.720 | that you can't ascribe all of them
01:57:22.040 | to yet another racist maniac on the police force,
01:57:26.800 | killing someone based on his racism.
01:57:28.660 | Last time I spoke about this, it was pure pain,
01:57:33.480 | but I just gotta try again.
01:57:35.600 | Now at a certain point, I mean,
01:57:37.160 | I'm starting to feel like, all right,
01:57:38.920 | I just, I have to be, I have to cease.
01:57:42.400 | Again, it comes back to this expectation
01:57:44.880 | that there shouldn't be fires.
01:57:46.360 | Like I feel like if I could just play my game impeccably,
01:57:51.200 | the people who actually care what I think will follow me
01:57:55.360 | when I hit Trump and hit free will and hit the woke
01:58:00.280 | and hit whatever it is,
01:58:02.200 | how we should respond to the coronavirus,
01:58:04.240 | vaccines, are they a thing, right?
01:58:07.400 | Like there's such derangement in our information space now
01:58:12.200 | that I mean, I guess some people could be getting more
01:58:15.760 | of this than I expect, but I just noticed
01:58:18.040 | that many of our friends who are in the same game
01:58:22.160 | have more homogenized audiences and don't get,
01:58:24.920 | I mean, they've successfully filtered out
01:58:28.240 | the people who are gonna despise them on this next topic.
01:58:32.280 | And I would imagine you have a different experience
01:58:36.260 | of having a podcast than I do at this point.
01:58:38.120 | I mean, I'm sure you get haters,
01:58:40.440 | but I would imagine you're more streamlined.
01:58:45.120 | - I actually don't like the word haters
01:58:46.880 | because it kind of presumes that it puts people in a bin.
01:58:51.880 | I think we all have like baby haters inside of us
01:58:55.800 | and we just apply them and some people enjoy doing that
01:58:59.040 | more than others for particular periods of time.
01:59:01.880 | I think you can almost see hating on the internet
01:59:04.800 | as a video game that you just play and it's fun,
01:59:07.340 | but then you can put it down and walk away.
01:59:09.560 | And no, I certainly have a bunch of people
01:59:12.020 | that are very critical.
01:59:13.180 | I can list all the ways.
01:59:14.620 | - But does it feel like on any given topic,
01:59:16.700 | does it feel like it's an actual tidal surge
01:59:19.740 | where it's like 30% of your audience
01:59:21.900 | and then the other 30% of your audience
01:59:24.700 | from podcast to podcast?
01:59:25.860 | - No, no, no.
01:59:26.820 | - That's happening to me all the time now.
01:59:29.460 | - Well, I'm more with, I don't know
01:59:31.300 | what you think about this.
01:59:32.140 | I mean, Joe Rogan doesn't read comments
01:59:35.260 | or doesn't read comments much.
01:59:36.820 | And the argument he made to me is that
01:59:40.580 | he already has like a self-critical person inside.
01:59:47.300 | Like, and I'm gonna have to think about
01:59:50.540 | what you said in this conversation,
01:59:51.640 | but I have this very harshly self-critical person
01:59:55.000 | inside as well where I don't need more fuel.
01:59:58.120 | I don't need, no, I do sometimes,
02:00:01.520 | that's why I check negativity occasionally, not too often.
02:00:05.900 | I sometimes need to like put a little bit more
02:00:08.260 | like coals into the fire, but not too much.
02:00:11.940 | But I already have that self-critical engine
02:00:13.860 | that keeps me in check.
02:00:15.100 | I just, I wonder, you know, a lot of people
02:00:17.940 | who gain more and more fame lose that ability
02:00:22.740 | to be self-critical, I guess,
02:00:24.540 | because they lose the audience
02:00:25.860 | that can be critical towards them.
02:00:27.560 | - You know, I do follow Joe's advice
02:00:30.420 | much more than I ever have here.
02:00:31.820 | Like, I don't look at comments very often,
02:00:34.100 | and I'm probably using Twitter, you know,
02:00:37.820 | 5% as much as I used to.
02:00:41.600 | I mean, I really just get in and out on Twitter
02:00:44.020 | and spend very little time in my @ mentions.
02:00:46.740 | But, you know, it does, in some ways it feels like a loss
02:00:51.260 | because occasionally I see something
02:00:53.340 | super intelligent there.
02:00:54.940 | Like, I mean, I'll check my Twitter @ mentions
02:00:57.220 | and someone will have said, oh, have you read this article?
02:00:59.980 | And it's like, man, that was just,
02:01:02.020 | that was like the best article sent to me in a month, right?
02:01:05.100 | So it's like, to have not have looked
02:01:06.740 | and to not have seen that, that's a loss.
02:01:09.980 | So, but it does, at this point a little goes a long way
02:01:14.980 | 'cause it's not that it,
02:01:17.860 | for me now, I mean, this could sound
02:01:22.220 | like a fairly Stalinistic immunity to criticism.
02:01:26.740 | It's not so much that these voices of hate
02:01:29.340 | turn on my inner hater, you know, more.
02:01:33.140 | It's more that I just, I get a,
02:01:35.340 | what I fear is a false sense of humanity.
02:01:40.340 | Like, I feel like I'm too online
02:01:43.140 | and online is selecting for this performative outrage
02:01:45.980 | in everybody, everyone's signaling to an audience
02:01:48.380 | when they trash you.
02:01:50.340 | And I get a dark, I'm getting a, you know,
02:01:55.740 | misanthropic, you know, cut of just what it's like out there
02:02:00.740 | and it, 'cause when you meet people in real life,
02:02:04.100 | they're great, you know, they're rather often great,
02:02:06.620 | you know, and it takes a lot to have anything
02:02:10.820 | like a Twitter encounter in real life with a living person.
02:02:15.080 | And that's, I think it's much better to have that
02:02:20.980 | as one's default sense of what it's like to be with people
02:02:24.860 | than what one gets on social media
02:02:28.060 | or on YouTube comment threads.
02:02:30.420 | - You've produced a special episode with Rob Reed
02:02:33.420 | on your podcast recently on how bioengineering of viruses
02:02:38.180 | is going to destroy human civilization.
02:02:40.380 | So-- - Or could.
02:02:42.340 | - Could. - One fears, yeah.
02:02:43.420 | - Sorry, the confidence there.
02:02:45.260 | But in the 21st century, what do you think,
02:02:49.460 | especially after having thought through that angle,
02:02:53.640 | what do you think is the biggest threat
02:02:56.060 | to the survival of the human species?
02:02:58.720 | I can give you the full menu if you'd like.
02:03:02.780 | - Yeah, well, no, I would put the biggest threat
02:03:06.660 | at the, another level out, kind of the meta threat
02:03:11.660 | is our inability to agree about what the threats
02:03:16.700 | actually are and to converge on strategies for response
02:03:23.840 | to them, right?
02:03:25.120 | So like I view COVID as, among other things,
02:03:28.500 | a truly terrifyingly failed dress rehearsal
02:03:34.760 | for something far worse, right?
02:03:37.160 | I mean, COVID is just about as benign as it could have been
02:03:41.520 | and still have been worse than the flu
02:03:44.560 | when you're talking about a global pandemic, right?
02:03:46.560 | So it's just, it's, you know,
02:03:48.440 | it's gonna kill a few million people,
02:03:51.560 | or it looks like it's killed about 3 million people.
02:03:53.440 | Maybe it'll kill a few million more
02:03:56.260 | unless something gets away from us
02:03:58.160 | with a variant that's much worse,
02:04:00.920 | or we really don't play our cards right.
02:04:02.480 | But I mean, the general shape of it is it's got,
02:04:07.480 | you know, somewhere around, well, 1% lethality.
02:04:12.940 | And whatever side of that number it really is on
02:04:18.280 | in the end, it's not what would in fact be possible
02:04:23.280 | and is in fact probably inevitable,
02:04:26.520 | something with, you know, orders of magnitude,
02:04:29.620 | more lethality than that.
02:04:30.800 | And it's just so obvious we are totally unprepared, right?
02:04:35.360 | We are running this epidemiological experiment
02:04:39.800 | of linking the entire world together.
02:04:41.640 | And then also now per the podcast that Rob Reed did,
02:04:47.720 | democratizing the tech that will allow us to do this,
02:04:51.000 | to engineer pandemics, right?
02:04:53.200 | And more and more people will be able
02:04:56.920 | to engineer synthetic viruses that will be,
02:05:00.880 | by the sheer fact that they would have been engineered
02:05:04.940 | with malicious intent, you know, worse than COVID.
02:05:08.600 | And we're still living in,
02:05:11.640 | to speak specifically about the United States,
02:05:13.880 | we have a country here where we can't even agree
02:05:17.920 | that this is a thing, like that COVID,
02:05:20.320 | I mean, there's still people who think
02:05:21.440 | that this is basically a hoax designed to control people.
02:05:25.560 | And it's stranger still, there are people
02:05:29.380 | who will acknowledge that COVID is real
02:05:33.320 | and they don't think the deaths have been faked
02:05:36.320 | or mis-ascribed.
02:05:42.620 | But they think that they're far happier
02:05:47.460 | at the prospect of catching COVID
02:05:49.480 | than they are of getting vaccinated for COVID, right?
02:05:53.560 | They're not worried about COVID,
02:05:54.540 | they're worried about vaccines for COVID, right?
02:05:57.020 | And the fact that we just can't converge in a conversation
02:06:01.020 | that we've now had a year to have with one another
02:06:05.980 | on just what is the ground truth here?
02:06:08.180 | What's happened?
02:06:09.820 | Why has it happened?
02:06:11.640 | What's the, how safe is it to get COVID
02:06:14.580 | at every, in every cohort in the population?
02:06:19.260 | And how safe are the vaccines?
02:06:21.060 | And the fact that there's still an air of mystery
02:06:23.940 | around all of this for much of our society
02:06:27.300 | does not bode well when you're talking
02:06:30.180 | about solving any other problem that may yet kill us.
02:06:32.860 | - But do you think convergence grows
02:06:34.420 | with the magnitude of the threat?
02:06:36.180 | So- - It's possible,
02:06:37.700 | except I feel like we have tipped into,
02:06:40.820 | 'cause when the threat of COVID looked the most dire, right?
02:06:45.820 | When we were seeing reports from Italy
02:06:48.900 | that looked like the beginning of a zombie movie, right?
02:06:51.540 | - 'Cause it could have been much, much worse.
02:06:52.780 | - Yeah, like this is like, this is lethal, right?
02:06:55.340 | Like your ICUs are gonna fill up in,
02:06:57.740 | like you're 14 days behind us.
02:07:00.060 | You're gonna, your medical system is in danger of collapse.
02:07:04.820 | Lock the fuck down.
02:07:06.700 | We have people refusing to do anything sane
02:07:11.700 | in the face of that.
02:07:12.540 | Like, and people fundamentally thinking,
02:07:14.940 | it's not gonna get here, right?
02:07:16.420 | Like, or that's, who knows what's going on in Italy,
02:07:18.520 | but it has no implications for what's gonna go on
02:07:20.420 | in New York in a mere six days, right?
02:07:23.380 | And now it kicks off in New York,
02:07:25.300 | and you've got people in the middle of the country
02:07:27.700 | thinking it's no factor, it's not, that's just big city.
02:07:32.700 | Those are big city problems, or they're faking it,
02:07:35.540 | or, I mean, it just, the layer of politics
02:07:40.060 | has become so dysfunctional for us
02:07:42.580 | that even in what, in the presence of a pandemic
02:07:47.580 | that looked legitimately scary there in the beginning,
02:07:50.880 | I mean, it's not to say that it hasn't been devastating
02:07:52.860 | for everyone who's been directly affected by it,
02:07:54.900 | and it's not to say it can't get worse,
02:07:56.820 | but here, for a very long time, we have known
02:08:00.420 | that we were in a situation that is more benign
02:08:03.760 | than what seemed like the worst case scenario
02:08:07.300 | as it was kicking off, especially in Italy.
02:08:09.860 | And so still, yeah, it's quite possible
02:08:15.540 | that if we saw the asteroid hurtling toward Earth,
02:08:18.720 | and everyone agreed that it's gonna make impact,
02:08:23.020 | and we're all gonna die, then we could get off Twitter
02:08:27.660 | and actually build the rockets
02:08:30.220 | that are gonna divert the asteroid
02:08:32.980 | from its Earth-crossing path,
02:08:35.080 | and we could do something pretty heroic.
02:08:37.720 | But when you talk about anything else
02:08:41.520 | that isn't, that's slower moving than that,
02:08:46.520 | I mean, something like, I mean, climate change,
02:08:48.560 | I think the prospect of our converging
02:08:53.560 | on a solution to climate change
02:08:56.000 | purely based on political persuasion
02:08:58.900 | is nonexistent at this point.
02:09:00.520 | I just think that, I mean, to bring Elon back into this,
02:09:04.400 | the way to deal with climate change
02:09:05.960 | is to create technology that everyone wants
02:09:09.680 | that is better than all the carbon-producing technology,
02:09:14.540 | and then we just transition
02:09:15.620 | because you want an electric car
02:09:19.200 | the same way you wanted a smartphone
02:09:20.600 | or you want anything else,
02:09:22.680 | and you're working totally with the grain
02:09:24.820 | of people's selfishness and short-term thinking.
02:09:29.000 | The idea that we're gonna convince
02:09:31.040 | the better part of humanity
02:09:33.120 | that climate change is an emergency,
02:09:35.400 | that they have to make sacrifices to respond to,
02:09:38.960 | given what's happened around COVID,
02:09:41.080 | I just think that's the fantasy of a fantasy.
02:09:46.080 | - But speaking of Elon,
02:09:48.120 | I have a bunch of positive things
02:09:49.640 | that I wanna say here in response to you,
02:09:51.640 | but you're opening so many threads,
02:09:53.600 | but let me pull one of them,
02:09:54.760 | which is AI.
02:09:57.680 | Both you and Elon think that with AI,
02:10:02.400 | you're summoning demons, summoning a demon.
02:10:05.500 | Maybe not in those poetic terms, but--
02:10:07.500 | - Well, potentially.
02:10:09.800 | - Potentially.
02:10:10.720 | - Two very, three very parsimonious assumptions,
02:10:15.720 | I think, here, scientifically,
02:10:19.320 | parsimonious assumptions, get me there.
02:10:22.320 | Any of which could be wrong,
02:10:26.120 | but it just seems like the weight
02:10:28.520 | of the evidence is on their side.
02:10:31.360 | One is that it comes back to this topic
02:10:33.960 | of substrate independence, right?
02:10:36.880 | Anyone who's in the business
02:10:38.100 | of producing intelligent machines
02:10:40.400 | must believe, ultimately,
02:10:43.840 | that there's nothing magical
02:10:45.680 | about having a computer made of meat.
02:10:47.340 | You can do this in the kinds of materials
02:10:50.740 | we're using now,
02:10:53.440 | and there's no special something
02:10:56.760 | that presents a real impediment
02:11:00.940 | to producing human-level intelligence in silico, right?
02:11:04.760 | Again, an assumption, I'm sure there are a few people
02:11:08.280 | who still think there is something magical
02:11:09.760 | about biological systems,
02:11:12.800 | but leave that aside.
02:11:18.680 | Given that assumption,
02:11:20.500 | and given the assumption
02:11:21.420 | that we just continue making incremental progress,
02:11:24.620 | doesn't have to be Moore's Law,
02:11:25.780 | it just has to be progress,
02:11:27.020 | that just doesn't stop,
02:11:29.020 | at a certain point, we'll get
02:11:31.140 | to human-level intelligence and beyond.
02:11:34.580 | And human-level intelligence, I think,
02:11:36.820 | is also clearly a mirage,
02:11:38.540 | because anything that's human-level
02:11:40.540 | is gonna be superhuman,
02:11:41.860 | unless we decide to dumb it down, right?
02:11:44.820 | I mean, my phone is already superhuman
02:11:46.740 | as a calculator, right?
02:11:47.700 | So why would we make the human-level AI
02:11:51.240 | just as good as me, as a calculator?
02:11:54.880 | So I think we'll very,
02:11:57.280 | if we continue to make progress,
02:11:59.440 | we will be in the presence of superhuman competence
02:12:03.800 | for any act of intelligence or cognition
02:12:08.800 | that we care to prioritize.
02:12:11.120 | It's not to say that we'll create everything
02:12:13.420 | that a human could do,
02:12:14.260 | maybe we'll leave certain things out,
02:12:16.440 | but anything that we care about,
02:12:18.580 | and we care about a lot,
02:12:20.540 | and we certainly care about anything
02:12:21.920 | that produces a lot of power,
02:12:24.540 | that we care about scientific insights
02:12:26.940 | and an ability to produce new technology
02:12:30.040 | and all of that,
02:12:30.880 | we'll have something that's superhuman.
02:12:34.320 | And then the final assumption is just that
02:12:38.000 | there have to be ways to do that
02:12:42.680 | that are not aligned with a happy coexistence
02:12:46.520 | with these now more powerful entities than ourselves.
02:12:51.520 | And I would guess,
02:12:54.200 | and this is kind of a rider to that assumption,
02:12:57.180 | there are probably more ways to do it badly
02:12:59.700 | than to do it perfectly,
02:13:01.380 | that is perfectly aligned with our well-being.
02:13:05.780 | And when you think about the consequences of non-alignment,
02:13:10.980 | when you think about,
02:13:12.040 | you're now in the presence of something
02:13:15.580 | that is more intelligent than you are,
02:13:17.620 | which is to say more competent,
02:13:20.260 | unless you've,
02:13:21.180 | and obviously there are cartoon pictures of this
02:13:25.080 | where we could just,
02:13:26.280 | this is just an off switch,
02:13:27.320 | and we could just turn off the off switch,
02:13:28.480 | or they're tethered to something
02:13:29.620 | that makes them,
02:13:31.620 | our slaves in perpetuity,
02:13:33.700 | even though they're more intelligent.
02:13:34.860 | But those scenarios strike me as a failure
02:13:38.980 | to imagine what is actually entailed
02:13:40.980 | by greater intelligence.
02:13:42.340 | So if you imagine something
02:13:43.500 | that's legitimately more intelligent than you are,
02:13:46.540 | and you're now in relationship to it,
02:13:49.820 | you're in the presence of this thing,
02:13:52.620 | and it is autonomous in all kinds of ways
02:13:54.780 | because it had to be to be more intelligent than you are.
02:13:57.100 | I mean, you built it to be all of those things.
02:14:00.400 | We just can't find ourselves in a negotiation
02:14:05.640 | with something more intelligent than we are.
02:14:08.020 | We can't,
02:14:08.940 | so we have to have found the subset of ways
02:14:13.020 | to build these machines
02:14:15.200 | that are perpetually amenable to our saying,
02:14:21.140 | "Oh, that's not what we meant.
02:14:24.020 | "That's not what we intended.
02:14:24.940 | "Could you stop doing that?
02:14:26.140 | "Come back over here
02:14:27.080 | "and do this thing that we actually want."
02:14:29.440 | And for them to care,
02:14:30.620 | for them to be tethered to our own sense
02:14:32.580 | of our own wellbeing,
02:14:33.960 | such that,
02:14:37.060 | I mean, their utility function is,
02:14:39.540 | their primary utility function is to have,
02:14:42.300 | this is, I think, Stuart Russell's cartoon plan
02:14:45.300 | is to figure out how to tether them to a utility function
02:14:53.940 | that has our own estimation
02:14:57.900 | of what's going to improve our wellbeing
02:15:00.460 | as its master reward, right?
02:15:05.500 | So it's like all this thing can get
02:15:07.660 | as intelligent as it can get,
02:15:10.020 | but it only ever really wants to figure out
02:15:12.980 | how to make our lives better by our own view of better.
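
As a rough illustration of the kind of tethering Sam is gesturing at here, below is a minimal toy sketch, not Stuart Russell's actual formulation; the class name, threshold, update rule, and action names are all illustrative assumptions. It shows an agent whose only objective is its current estimate of what the human prefers, and which defers to the human whenever that estimate is too uncertain.

```python
# Toy sketch of "defer when uncertain about human preferences".
# Everything here is illustrative, not a real alignment proposal.

class DeferentialAgent:
    def __init__(self, actions, uncertainty_threshold=0.2):
        self.actions = actions
        # Start maximally uncertain: uniform belief over which action the human prefers.
        self.belief = {a: 1.0 / len(actions) for a in actions}
        self.uncertainty_threshold = uncertainty_threshold

    def act(self, ask_human):
        best = max(self.belief, key=self.belief.get)
        margin = self.belief[best] - sorted(self.belief.values())[-2]
        if margin < self.uncertainty_threshold:
            # Too uncertain about human preferences: ask instead of acting autonomously.
            preferred = ask_human(self.actions)
            self._update(preferred)
            return preferred
        return best

    def _update(self, preferred, weight=0.5):
        # Shift belief toward the action the human actually endorsed.
        for a in self.actions:
            self.belief[a] *= (1 - weight)
        self.belief[preferred] += weight


if __name__ == "__main__":
    agent = DeferentialAgent(["make_coffee", "tidy_desk", "do_nothing"])
    # A simulated human oracle standing in for "our own view of better".
    human = lambda options: "make_coffee"
    for _ in range(3):
        print(agent.act(human))
```

The only design point being illustrated is the defer-when-uncertain loop; a real proposal of this kind would have the machine infer preferences from observed behavior rather than from direct queries.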
02:15:16.020 | Now, not to say there wouldn't be a conversation about,
02:15:19.220 | you know, I mean, because all kinds of things
02:15:21.020 | we're not seeing clearly about what is better.
02:15:24.300 | And if we were in the presence of a genie or an oracle
02:15:27.580 | that could really tell us what is better,
02:15:29.180 | well, then we presumably would want to hear that
02:15:32.060 | and we would modify our sense of what to do next
02:15:37.060 | in conversation with these minds.
02:15:40.980 | But I just feel like it is a failure of imagination
02:15:45.180 | to think that being in relationship
02:15:53.020 | to something more intelligent than yourself
02:15:57.300 | isn't in most cases
02:16:00.580 | a circumstance of real peril.
02:16:03.180 | 'Cause it is, just to think of how everything on earth
02:16:08.580 | has to, if they could think about their relationship to us,
02:16:12.140 | if birds could think about what we're doing, right?
02:16:15.800 | The bottom line is they're always in danger
02:16:23.820 | of our discovering that there's something
02:16:26.700 | we care about more than birds, right?
02:16:29.420 | Or there's something we want
02:16:30.900 | that disregards the wellbeing of birds.
02:16:34.220 | And obviously much of our behavior is inscrutable to them.
02:16:37.700 | Occasionally we pay attention to them
02:16:39.060 | and occasionally we withdraw our attention
02:16:41.620 | and occasionally we just kill them all
02:16:43.060 | for reasons they can't possibly understand.
02:16:45.500 | But if we're building something more intelligent
02:16:48.320 | than ourselves, by definition, we're building something
02:16:51.660 | whose horizons of value and cognition
02:16:57.580 | can exceed our own.
02:17:00.060 | And in ways where we can't necessarily foresee,
02:17:05.060 | again, perpetually, that they don't just wake up one day
02:17:09.260 | and decide, okay, well, these humans need to disappear.
02:17:14.180 | - So I think I agree with most of the initial things
02:17:18.420 | you said.
02:17:19.740 | What I don't necessarily agree with,
02:17:22.500 | and of course nobody knows,
02:17:24.020 | but that the more likely set of trajectories
02:17:27.520 | that we're going to take are going to be positive.
02:17:30.340 | That's what I believe.
02:17:32.020 | In the sense that the way you develop,
02:17:35.840 | I believe the way you develop successful AI systems
02:17:40.300 | will be deeply integrated with human society.
02:17:43.740 | And for them to succeed, they're going to have to be aligned
02:17:48.060 | in the way we humans are aligned with each other,
02:17:50.340 | which doesn't mean we're aligned,
02:17:52.540 | that there's no such thing,
02:17:54.700 | or I don't see there's such thing as a perfect alignment,
02:17:57.860 | but they're going to be participating in the dance,
02:18:01.080 | in the game-theoretic dance of human society
02:18:04.540 | as they become more and more intelligent.
02:18:06.520 | There could be a point beyond which
02:18:09.220 | we are like birds to them.
02:18:11.520 | - But what about an intelligence explosion of some kind?
02:18:16.080 | - So I believe the explosion will be happening,
02:18:21.080 | but there's a lot of explosion to be done
02:18:24.120 | before we become like birds.
02:18:26.200 | I truly believe that human beings
02:18:28.140 | are very intelligent in ways we don't understand.
02:18:30.680 | It's not just about chess.
02:18:32.280 | It's about all the intricate computation
02:18:35.360 | we're able to perform, common sense,
02:18:37.640 | our ability to reason about this world, consciousness.
02:18:40.600 | I think we're doing a lot of work
02:18:42.400 | we don't realize is necessary to be done
02:18:44.560 | in order to truly become,
02:18:47.720 | like truly achieve super intelligence.
02:18:49.980 | And I just think there'll be a period of time
02:18:52.140 | that's not overnight.
02:18:53.720 | The overnight nature of it will not literally be overnight.
02:18:57.120 | It'll be over a period of decades.
02:18:59.520 | So my sense is--
02:19:00.360 | - But why would it be that?
02:19:01.400 | But just take, draw an analogy from recent successes
02:19:06.400 | like something like AlphaGo or AlphaZero.
02:19:09.400 | I forget the actual metric,
02:19:11.720 | but it was something like this algorithm,
02:19:14.800 | which wasn't even totally,
02:19:17.280 | it wasn't bespoke for chess playing,
02:19:21.440 | in the matter of, I think it was four hours,
02:19:24.160 | played itself so many times and so successfully
02:19:27.220 | that it became the best chess playing computer.
02:19:30.480 | Not only was it,
02:19:31.320 | it was not only better than every human being,
02:19:33.720 | it was better than every previous chess program
02:19:36.820 | in a matter of a day, right?
02:19:38.640 | So like that, so just imagine,
02:19:40.560 | again, we don't have to recapitulate everything about us,
02:19:43.740 | but just imagine building a system,
02:19:45.700 | and who knows when we'll be able to do this,
02:19:50.480 | but at some point we'll be able,
02:19:52.080 | at some point the 100,
02:19:54.800 | or our 100 favorite things about human cognition
02:19:57.400 | will be analogous to chess
02:20:01.240 | in that we will be able to build machines
02:20:03.940 | that very quickly outperform any human,
02:20:07.980 | and then very quickly outperform the last algorithm
02:20:12.120 | that outperformed the humans.
02:20:13.520 | Like something like the AlphaGo experience
02:20:17.400 | seems possible for facial recognition,
02:20:21.480 | and detecting human emotion,
02:20:23.680 | and natural language processing, right?
02:20:26.120 | Like, well, it's just the,
02:20:28.840 | everyone, even math people, math heads,
02:20:33.020 | tend to have bad intuitions for exponentiation, right?
02:20:36.480 | We noticed this during COVID,
02:20:37.640 | I mean, you have some very smart people
02:20:39.160 | who still couldn't get their minds around the fact that,
02:20:43.120 | you know, an exponential is really surprising,
02:20:46.720 | I mean, things double, and double, and double,
02:20:48.280 | and double again,
02:20:49.400 | and you don't notice much of anything changes,
02:20:51.160 | and then the last, you know,
02:20:53.400 | two stages of doubling swamp everything, right?
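
A minimal numeric illustration of the doubling point being made here; the numbers are arbitrary and chosen only to show the shape of the curve, not taken from the conversation.

```python
# Toy illustration: in a pure doubling sequence, the last two terms
# account for roughly three quarters of everything accumulated so far.
values = [2 ** n for n in range(11)]   # 1, 2, 4, ..., 1024
total = sum(values)                    # 2047
last_two = values[-1] + values[-2]     # 1024 + 512 = 1536
print(f"last two doublings = {last_two / total:.0%} of the cumulative total")  # ~75%
```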
02:20:56.200 | And it just seems like that,
02:20:59.600 | to assume that there isn't a deep analogy
02:21:04.600 | between what we're seeing for the more tractable,
02:21:08.320 | the tractable problems like chess
02:21:11.400 | to other modes of cognition.
02:21:13.140 | It's like once you crack that problem,
02:21:16.040 | it seems, 'cause for the longest time,
02:21:17.920 | it was impossible to think we were gonna make headway
02:21:22.320 | on, in AI, you know, it's like-
02:21:25.080 | - Chess and Go was, Go seemed impossible.
02:21:27.560 | - Go seemed unattainable.
02:21:28.800 | Even when chess had been cracked,
02:21:31.640 | Go seemed unattainable.
02:21:33.360 | - Yeah, and actually, Stuart Russell was behind
02:21:36.600 | the people that were saying it's unattainable,
02:21:38.720 | 'cause it seemed like, you know,
02:21:40.480 | it's a intractable problem.
02:21:42.840 | But there's something different
02:21:44.440 | about the space of cognition
02:21:46.560 | that's detached from human society,
02:21:48.080 | which is what chess is, meaning like just thinking.
02:21:51.220 | Having actual exponential impact
02:21:54.280 | on the physical world is different.
02:21:56.560 | I tend to believe that there's,
02:21:58.300 | for AI to get to the point where it's super intelligent,
02:22:03.700 | it's going to have to go through the funnel of society.
02:22:07.680 | And for that, it has to be deeply integrated
02:22:09.580 | with human beings.
02:22:11.120 | And for that, it has to be aligned.
02:22:12.880 | - But you're talking about like actually hooking us up
02:22:15.560 | to like Neuralink, you know,
02:22:16.800 | we're going to be the brainstem to the robot overlords?
02:22:21.800 | - That's a possibility as well.
02:22:23.360 | But what I mean is, in order to develop
02:22:26.200 | autonomous weapon systems, for example,
02:22:28.840 | which are highly concerning to me,
02:22:31.120 | that both US and China are participating in now,
02:22:34.760 | that in order to develop them and for them to become,
02:22:38.320 | to have more and more responsibility
02:22:40.120 | to actually do military strategic actions,
02:22:44.480 | they're going to have to be integrated
02:22:47.920 | into human beings doing the strategic action.
02:22:51.720 | They're going to have to work alongside with each other.
02:22:54.120 | And the way those systems will be developed
02:22:56.440 | will have the natural safety, like switches
02:23:00.080 | that are placed on them as they develop over time,
02:23:03.040 | because they're going to have to convince humans.
02:23:05.320 | Ultimately, they're going to have to convince humans
02:23:07.760 | that this is safer than humans.
02:23:10.440 | They're going to, you know.
02:23:12.360 | - Well, self-driving cars is a good test case here.
02:23:15.680 | 'Cause like, obviously we've made a lot of progress
02:23:19.560 | and we can imagine what total progress would look like.
02:23:24.560 | I mean, it would be amazing.
02:23:25.840 | And it's answering, it's canceling in the US,
02:23:29.000 | 40,000 deaths every year based on ape-driven cars, right?
02:23:33.080 | So it's an excruciating problem
02:23:35.600 | that we've all gotten used to
02:23:36.600 | 'cause there was no alternative.
02:23:38.360 | But now we can dimly see the prospect of an alternative,
02:23:41.600 | which if it works in a super intelligent fashion,
02:23:45.720 | maybe we go down to zero highway deaths, right?
02:23:48.920 | Or certainly we'd go down by orders of magnitude, right?
02:23:51.920 | So maybe we have 400 rather than 40,000 a year.
02:23:56.920 | And it's easy to see that there's not a misalignment.
02:24:05.360 | So obviously this is not an example of super intelligence.
02:24:08.000 | This is narrow intelligence,
02:24:09.200 | but the alignment problem isn't so obvious there,
02:24:14.200 | but there are potential alignment problems there.
02:24:17.640 | Like, so like just imagine if some woke team of engineers
02:24:22.520 | decided that we have to tune the algorithm some way.
02:24:26.680 | I mean, there are situations
02:24:28.080 | where the car has to decide who to hit.
02:24:30.080 | I mean, there's just bad outcomes
02:24:31.520 | where you're going to hit somebody, right?
02:24:33.960 | Now we have a car that can tell what race you are, right?
02:24:36.720 | So we're going to build the car
02:24:38.120 | to preferentially hit white people
02:24:40.360 | because white people have had so much privilege
02:24:42.060 | over the years.
02:24:43.240 | This seems like the only ethical way
02:24:44.880 | to kind of redress those wrongs of the past.
02:24:47.240 | That's something that could get built.
02:24:48.760 | One, that could get produced as an artifact,
02:24:52.480 | presumably, of just how you built it.
02:24:54.200 | And you didn't even know you engineered it that way, right?
02:24:56.240 | You caused it. - Or machine learning.
02:24:58.640 | - Yeah. - You put some kind
02:24:59.880 | of constraints on it to where it creates
02:25:01.400 | those kinds of outcomes.
02:25:02.240 | So you basically built a racist algorithm
02:25:05.120 | and you didn't even intend to, or you could intend to, right?
02:25:07.880 | And it would be aligned with some people's values,
02:25:09.700 | but misaligned with other people's values.
02:25:11.800 | But it's like, there are interesting problems
02:25:16.480 | even with something as simple
02:25:17.920 | and obviously good as self-driving cars.
02:25:20.520 | - But there's a leap there, and I just think,
02:25:23.600 | to be exact, those are human problems.
02:25:25.320 | I just don't think there'll be a leap
02:25:26.960 | with autonomous vehicles.
02:25:28.580 | First of all, sorry.
02:25:31.560 | There are a lot of trajectories
02:25:33.920 | which will destroy human civilization.
02:25:35.760 | The argument I'm making, it's more likely
02:25:38.140 | that we'll take trajectories that don't.
02:25:40.040 | So I don't think there'll be a leap
02:25:41.880 | with autonomous vehicles
02:25:43.000 | where they'll all of a sudden start murdering pedestrians
02:25:45.960 | because once every human on earth is dead,
02:25:49.280 | there'll be no more fatalities.
02:25:50.680 | Sort of unintended consequences of,
02:25:52.800 | and it's difficult to take that leap.
02:25:55.480 | Most systems as we develop
02:25:57.160 | and they become much, much more intelligent
02:25:59.120 | in ways that will be incredibly surprising,
02:26:01.280 | like stuff that DeepMind is doing with protein folding.
02:26:04.480 | Even, which is scary to think about,
02:26:07.840 | and I'm personally terrified about this,
02:26:09.520 | which is the engineering of viruses using machine learning.
02:26:12.760 | The engineering of vaccines using machine learning.
02:26:16.880 | The engineering of, yeah, for research purposes,
02:26:20.840 | pathogens using machine learning.
02:26:23.680 | And the ways that could go wrong.
02:26:25.080 | I just think that there's always going to be
02:26:27.400 | a closed loop supervision of humans
02:26:30.820 | before the AI becomes super intelligent.
02:26:33.560 | Not always, but much more likely to be supervision.
02:26:38.240 | Except, of course, the question is
02:26:40.520 | how many dumb people there are in the world,
02:26:42.120 | how many evil people are in the world?
02:26:44.420 | My theory, my hope is, my sense is,
02:26:48.800 | that the number of intelligent people
02:26:50.640 | is much higher than the number of dumb people,
02:26:52.980 | that know how to program,
02:26:55.320 | and the number of evil people.
02:26:57.680 | I think smart people and kind people
02:26:59.920 | far outnumber the others.
02:27:03.260 | - Except we also, we have to add another group of people
02:27:06.340 | which are just the smart and otherwise good,
02:27:09.880 | but reckless people.
02:27:12.180 | The people who will flip a switch on,
02:27:15.440 | not knowing what's going to happen,
02:27:17.640 | they're just kind of hoping
02:27:19.040 | that it's not going to blow up the world.
02:27:20.440 | We already know that some of our smartest people
02:27:23.320 | are those sorts of people.
02:27:24.520 | You know, we know we've done experiments,
02:27:26.140 | and this is something that Martin Rees
02:27:27.700 | was whinging about before the Large Hadron Collider
02:27:32.700 | got booted up, I think.
02:27:34.720 | We know there are people who are entertaining experiments,
02:27:38.720 | or even performing experiments,
02:27:40.020 | where there's some chance, you know,
02:27:42.880 | not quite infinitesimal,
02:27:44.880 | that they're going to create a black hole in the lab
02:27:48.840 | and suck the whole world into it.
02:27:50.600 | Right, I mean, like, that's not,
02:27:52.280 | you're not a crazy person to worry that,
02:27:55.320 | or worry about that based on the physics.
02:27:57.200 | And so it was with, you know, the Trinity test,
02:28:01.560 | there were some people who were still
02:28:04.000 | checking their calculations, and they were off.
02:28:06.640 | We did nuclear tests where we were off significantly
02:28:10.320 | in terms of the yield, right?
02:28:11.940 | So it was like-
02:28:12.780 | - And they still flipped the switch.
02:28:13.760 | - Yeah, they still flipped the switch.
02:28:14.960 | And sometimes they flip the switch not to win a world war,
02:28:19.960 | or to save 40,000 lives a year.
02:28:22.900 | They just- - Just to see what happens.
02:28:24.640 | - Intellectual curiosity.
02:28:25.880 | Like, this is what I got my grant for.
02:28:27.760 | This is where I'll get my Nobel Prize,
02:28:30.120 | if that's in the cards.
02:28:32.060 | It's on the other side of this switch, right?
02:28:35.560 | And, I mean, again, we are apes with egos
02:28:40.560 | who are massively constrained by self,
02:28:47.840 | very short-term self-interest,
02:28:49.640 | even when we're contemplating some of the deepest
02:28:52.860 | and most interesting and most universal problems
02:28:57.700 | we could ever set our attention towards.
02:29:00.340 | Like, just if you read James Watson's book,
02:29:03.100 | "The Double Helix," right, about them, you know,
02:29:05.580 | cracking the structure of DNA,
02:29:08.620 | one thing that's amazing about that book
02:29:11.600 | is just how much of it, almost all of it,
02:29:15.980 | is being driven by very ape-ish,
02:29:21.980 | egocentric, social concerns.
02:29:26.140 | Like, the algorithm that is producing
02:29:28.940 | this scientific breakthrough is human competition,
02:29:32.580 | if you're James Watson, right?
02:29:34.140 | It's like, I'm gonna get there before Linus Pauling,
02:29:36.500 | and, you know, it's just,
02:29:39.260 | so much of his bandwidth is captured by that, right?
02:29:43.180 | Now, that becomes more and more of a liability
02:29:48.820 | when you're talking about producing technology
02:29:51.060 | that can change everything in an instant, you know?
02:29:53.780 | We're talking about not only understanding,
02:29:57.780 | you know, we're just at a different moment in human history.
02:30:01.780 | We're not, when we're doing research on viruses,
02:30:06.780 | we're now doing the kind of research
02:30:10.980 | that can cause someone somewhere else
02:30:13.900 | to be able to make that virus,
02:30:16.460 | or weaponize that virus, or it's just, I don't know.
02:30:21.460 | I mean, our power is, our wisdom is,
02:30:26.180 | it does not seem like our wisdom
02:30:27.660 | is scaling with our power, right?
02:30:29.420 | And that seems like, insofar,
02:30:31.620 | as wisdom and power become unaligned,
02:30:36.700 | I get more and more concerned.
02:30:39.340 | - But speaking of apes with egos,
02:30:44.100 | two of the most compelling apes
02:30:47.580 | I can think of
02:30:49.220 | are yourself and Jordan Peterson,
02:30:51.700 | and you've had a fun conversation about religion
02:30:56.020 | that I watched most of, I believe.
02:30:58.700 | I'm not sure there was any--
02:31:00.420 | - We didn't solve anything.
02:31:03.820 | - If anything was ever solved.
02:31:05.060 | So is there a charitable summary
02:31:09.420 | you can give of the ideas that you agree on
02:31:12.980 | and disagree about with Jordan?
02:31:14.340 | Is there something, maybe after that conversation,
02:31:16.460 | that you've landed on,
02:31:19.220 | something you both agreed on,
02:31:22.660 | some wisdom in the rubble
02:31:24.700 | of even imperfect, flawed ideas?
02:31:29.060 | Is there something that you can kind of pull out
02:31:31.340 | from those conversations, or is it to be continued?
02:31:34.620 | - I mean, I think where we disagree,
02:31:35.980 | so he thinks that many of our traditional religions
02:31:40.980 | and many of our traditional religious beliefs
02:31:43.940 | and frameworks are holding so much,
02:31:48.940 | such a repository of human wisdom
02:31:53.780 | that we pull at that fabric at our peril, right?
02:32:00.420 | Like if you start just unraveling Christianity
02:32:07.260 | or any other traditional set of norms and beliefs,
02:32:11.860 | you may think you're just pulling out the unscientific bits,
02:32:15.900 | but you could be pulling a lot more
02:32:17.780 | to which everything you care about is attached,
02:32:21.020 | right, as a society.
02:32:22.460 | And my feeling is that there's so much,
02:32:28.460 | there's so much downside to the unscientific bits,
02:32:31.380 | and it's so clear how we could have a 21st century
02:32:36.620 | rational conversation about the good stuff
02:32:39.900 | that we really can radically edit these traditions.
02:32:42.740 | And we can take Jesus in half his moods
02:32:47.620 | and just find a great inspirational,
02:32:51.040 | Iron Age thought leader,
02:32:54.860 | who just happened to get crucified,
02:32:56.240 | but he could be someone, like the Beatitudes
02:32:58.900 | and the Golden Rule, which doesn't originate with him,
02:33:03.620 | but which he put quite beautifully.
02:33:06.780 | All of that's incredibly useful.
02:33:09.740 | It's no less useful than it was 2000 years ago,
02:33:12.780 | but we don't have to believe he was born of a virgin
02:33:14.940 | or coming back to raise the dead or any of that other stuff.
02:33:18.300 | And we can be honest about not believing those things,
02:33:21.400 | and we can be honest about the reasons
02:33:22.960 | why we don't believe those things.
02:33:24.620 | 'Cause on those fronts, I view the downside to be so obvious
02:33:31.080 | and the fact that we have so many different
02:33:34.800 | competing dogmatisms on offer to be so non-functional,
02:33:38.520 | I mean, it's so divisive.
02:33:40.080 | It just has conflict built into it
02:33:43.880 | that I think we can be far more
02:33:47.120 | and should be far more iconoclastic than he wants to be.
02:33:51.600 | Now, none of this is to deny much of what he argues for,
02:33:56.040 | that stories are very powerful.
02:34:00.500 | Clearly, stories are powerful, and we want good stories.
02:34:03.400 | We want our lives, we want to have a conversation
02:34:06.160 | with ourselves and with one another about our lives
02:34:10.080 | that facilitates the best possible lives,
02:34:13.440 | and story is part of that.
02:34:15.480 | And if you want some of those stories to sound like myths,
02:34:20.480 | that might be part of it.
02:34:22.660 | But my argument is that we never really need
02:34:26.200 | to deceive ourselves or our children
02:34:29.200 | about what we have every reason to believe is true
02:34:32.500 | in order to get at the good stuff,
02:34:33.960 | in order to organize our lives well.
02:34:36.300 | I certainly don't feel that I need to do it personally,
02:34:39.200 | and if I don't need to do it personally,
02:34:41.080 | why would I think that billions of other people
02:34:43.360 | need to do it personally?
02:34:45.480 | Now, there is a cynical counter-argument,
02:34:48.640 | which is billions of other people
02:34:51.220 | don't have the advantages that I have had in my life.
02:34:54.240 | The billions of other people are not as well-educated,
02:34:57.640 | they haven't had the same opportunities,
02:34:59.280 | they need to be told that Jesus is gonna solve
02:35:04.280 | all their problems after they die, say,
02:35:06.600 | or that everything happens for a reason,
02:35:10.700 | and if you just believe in the secret,
02:35:14.000 | if you just visualize what you want, you're gonna get it.
02:35:16.200 | And it's like there's some measure
02:35:20.900 | of what I consider to be odious pablum
02:35:23.880 | that really is food for the better part of humanity,
02:35:27.320 | and there is no substitute for it,
02:35:29.200 | or there's no substitute now.
02:35:31.000 | And I don't know if Jordan would agree with that,
02:35:32.480 | but much of what he says seems to suggest
02:35:35.200 | that he would agree with it.
02:35:36.600 | And I guess that's an empirical question.
02:35:41.120 | I mean, that's just that we don't know
02:35:43.320 | whether given a different set of norms
02:35:47.040 | and a different set of stories,
02:35:48.600 | people would behave the way I would hope they would behave
02:35:52.860 | and be aligned, more aligned than they are now.
02:35:56.080 | I think we know what happens
02:35:58.800 | when you just let ancient religious certainties
02:36:03.800 | go uncriticized.
02:36:06.440 | We know what that world's like.
02:36:07.840 | We've been struggling to get out of that world
02:36:10.640 | for a couple of hundred years,
02:36:12.140 | but we know what having Europe
02:36:15.880 | riven by religious wars looks like, right?
02:36:21.860 | And we know what happens when those religions
02:36:25.280 | become kind of pseudo religions and political religions.
02:36:29.760 | So this is where I'm sure Jordan and I would debate.
02:36:33.840 | He would say that Stalin was a symptom of atheism,
02:36:37.120 | and that's not the case at all.
02:36:37.960 | I mean, it's not my kind of atheism, right?
02:36:39.840 | Like Stalin, the problem with the Gulag
02:36:44.040 | and the experiment with communism or with Stalinism
02:36:48.240 | or with Nazism was not that there was so much
02:36:53.240 | scientific rigor and self-criticism and honesty
02:36:56.800 | and introspection and judicious use of psychedelics.
02:37:01.800 | I mean, like that was not the problem in Hitler's Germany
02:37:07.360 | or in Stalin's Soviet Union.
02:37:10.240 | The problem was you have other ideas
02:37:16.840 | that capture a similar kind of mob-based dogmatic energy
02:37:22.840 | and yes, the results of all of that
02:37:27.760 | are predictably murderous.
02:37:30.560 | - Well, the question is what is the source
02:37:33.520 | of the most viral and sticky stories
02:37:37.780 | that ultimately lead to a positive outcome?
02:37:40.280 | So communism was, I mean, having grown up in the Soviet Union
02:37:44.440 | even still having relatives in Russia,
02:37:51.480 | there's a stickiness to the nationalism
02:37:53.920 | and to the ideologies of communism that religious or not,
02:37:58.640 | you could say it's religious fervor.
02:38:00.320 | I could just say it's stories that are viral and sticky.
02:38:05.320 | I'm using the most horrible words,
02:38:08.960 | but the question is whether science and reason
02:38:12.360 | can generate viral sticky stories
02:38:14.440 | that give meaning to people's lives.
02:38:16.360 | Your sense is that it does?
02:38:20.760 | - Well, whatever's true ultimately
02:38:23.000 | should be captivating, right?
02:38:26.480 | It's like what's more captivating
02:38:28.960 | than whatever is real, right?
02:38:32.080 | Now it's because reality is, again,
02:38:36.400 | we're just climbing out of the darkness
02:38:41.040 | in terms of our understanding of what the hell is going on.
02:38:43.040 | And there's no telling what spooky things
02:38:47.360 | may in fact be true.
02:38:48.280 | I mean, I don't know if you've been on the receiving end
02:38:49.960 | of recent rumors about our conversation about UFOs
02:38:54.960 | very likely changing in the near term, right?
02:38:57.640 | But like there was just a Washington Post article
02:39:01.320 | and a New Yorker article,
02:39:01.320 | and I've received some private outreach
02:39:04.440 | and perhaps you have, I know other people in our orbit
02:39:08.160 | have people who are claiming that the government
02:39:12.400 | has known much more about UFOs
02:39:14.680 | than they have let on until now.
02:39:17.560 | And this conversation is actually,
02:39:19.320 | is about to become more prominent,
02:39:21.640 | and it's not gonna be whatever,
02:39:25.360 | whoever's left standing when the music stops,
02:39:28.520 | it's not going to be a comfortable position to be in
02:39:33.520 | as a super rigorous scientific skeptic
02:39:39.040 | who's been saying there's no there there
02:39:41.640 | for the last 75 years, right?
02:39:45.920 | The short version is it sounds like
02:39:49.080 | the Office of Naval Intelligence and the Pentagon
02:39:52.320 | are very likely to say to Congress at some point
02:39:55.600 | in the not too distant future that we have evidence
02:39:58.920 | that there is technology flying around here
02:40:02.780 | that seems like it can't possibly be of human origin, right?
02:40:07.780 | Now, I don't know what I'm gonna do
02:40:10.040 | with that kind of disclosure, right?
02:40:11.320 | Maybe it's just, it's gonna be nothing,
02:40:14.600 | no follow on conversation to really have,
02:40:17.240 | but that is such a powerfully strange circumstance
02:40:21.800 | to be in, right?
02:40:22.920 | I mean, it's just, what are we gonna do with that?
02:40:25.400 | If in fact that's what happens, right?
02:40:27.440 | If in fact the considered opinion,
02:40:31.680 | despite the embarrassment it causes them,
02:40:34.960 | of the US government, of all of our intelligence,
02:40:38.480 | all of the relevant intelligence services,
02:40:40.360 | is that this isn't a hoax,
02:40:44.080 | there's too much data for it to be a hoax.
02:40:46.960 | We've got too much radar imagery,
02:40:48.720 | there's too much satellite data,
02:40:51.120 | whatever data they actually have, there's too much of it.
02:40:55.480 | All we can say now is something's going on
02:40:58.640 | and there's no way it's the Chinese or the Russians
02:41:03.600 | or anyone else's technology.
02:41:06.060 | That should arrest our attention, you know, collectively
02:41:12.280 | to a degree that nothing in our lifetime has.
02:41:15.680 | And now one worries that we're so jaded
02:41:19.680 | and confused and distracted
02:41:26.080 | that it'll get much less coverage than, you know,
02:41:30.400 | Obama's tan suit did a bunch of years ago.
02:41:34.660 | Who knows how we'll respond to that.
02:41:38.440 | But it's just to say that the need for us
02:41:43.440 | to tell ourselves an honest story about what's going on
02:41:49.640 | and what's likely to happen next
02:41:51.960 | is never gonna go away, right?
02:41:54.000 | And it's important, it's just,
02:41:56.420 | the division between me and every person
02:41:59.100 | who's defending traditional religion is,
02:42:01.220 | where is it that you wanna lie to yourself
02:42:07.640 | or lie to your kids?
02:42:09.040 | Like, where is honesty a liability?
02:42:11.360 | And for me, it, you know,
02:42:14.440 | I've yet to find the place where it is.
02:42:17.120 | And it's so obviously a strength
02:42:20.920 | in almost every other circumstance
02:42:24.080 | because it is the thing that allows you to course correct.
02:42:28.040 | It is the thing that allows you to hope at least
02:42:33.000 | that your beliefs, that your stories
02:42:34.760 | are in some kind of calibration
02:42:37.240 | with what's actually going on in the world.
02:42:40.360 | - Yeah, it is a little bit sad to imagine that
02:42:42.760 | if aliens en masse showed up to Earth,
02:42:47.320 | we would be too preoccupied with political bickering
02:42:50.500 | or with fake news
02:42:53.160 | and all that kind of stuff to notice
02:42:56.160 | the very basic evidence of reality.
02:42:59.520 | I do have a glimmer of hope
02:43:02.400 | that there seems to be more and more hunger
02:43:04.560 | for authenticity.
02:43:06.400 | And I feel like that opens the door
02:43:08.480 | for a hunger for what is real.
02:43:14.040 | Like people don't want stories,
02:43:15.680 | they don't want like layers and layers of like fakeness.
02:43:19.660 | And I'm hoping that means that will directly lead
02:43:24.760 | to a greater hunger for reality and reason and truth.
02:43:28.280 | You know, truth isn't dogmatism.
02:43:31.480 | Like truth isn't authority.
02:43:34.200 | I have a PhD and therefore I'm right.
02:43:37.040 | Truth is almost like, the reality is
02:43:42.600 | there are so many questions, so many mysteries,
02:43:44.480 | so much uncertainty.
02:43:45.480 | This is our best available guess.
02:43:49.400 | And we have a lot of evidence that supports that guess,
02:43:52.100 | but it could be so many other things.
02:43:53.820 | And like just even conveying that,
02:43:56.400 | I think there's a hunger for that in the world
02:43:58.800 | to hear that from scientists, less dogmatism
02:44:01.440 | and more just like, this is what we know.
02:44:04.920 | We're doing our best given the uncertainty.
02:44:07.060 | I mean, this is obviously true
02:44:09.520 | with the virology and all those kinds of things
02:44:11.840 | 'cause everything is happening so fast.
02:44:13.360 | There's a lot going on, and biology is super messy.
02:44:16.520 | So it's very hard to know stuff for sure.
02:44:18.840 | So just being open and real about that,
02:44:21.040 | I think I'm hoping will change people's hunger
02:44:25.680 | and openness and trust of what's real.
02:44:29.560 | - Yeah, well, so much of this is probabilistic.
02:44:31.880 | So much of what can seem dogmatic scientifically
02:44:35.080 | is just, you're placing a bet on whether it's worth
02:44:40.080 | reading that paper or rethinking your presuppositions
02:44:45.520 | on that point.
02:44:46.720 | It's like, it's not a fundamental closure to data.
02:44:49.960 | It's just that there's so much data on one side
02:44:52.720 | or so much would have to change
02:44:55.560 | in terms of your understanding of what you think
02:44:57.800 | you understand about the nature of the world.
02:44:59.920 | If this new fact were so that you can pretty quickly say,
02:45:04.920 | all right, that's probably bullshit, right?
02:45:08.800 | And it can sound like a fundamental closure
02:45:12.400 | to new conversations, new evidence, new data, new argument
02:45:17.400 | but it's really not, it's just, it really is just triaging
02:45:20.480 | your attention.
02:45:21.320 | It's just like, okay, you're telling me that your best
02:45:25.760 | friend can actually read minds.
02:45:27.200 | Okay, well, that's interesting.
02:45:30.840 | Let me know when that person has gone into a lab
02:45:33.080 | and actually proven it, right?
02:45:34.120 | Like, I don't need, this is not the place where I need
02:45:37.120 | to spend the rest of my day figuring out if your buddy
02:45:39.800 | can read my mind, right?
02:45:42.440 | - But there's a way to communicate that.
02:45:44.720 | I think it does too often sound like you're completely
02:45:47.680 | closed off to ideas as opposed to saying like,
02:45:50.360 | as opposed to saying that there's a lot of evidence
02:45:56.040 | in support of this but you're still open-minded
02:46:00.120 | to other ideas.
02:46:00.960 | Like, there's a way to communicate that.
02:46:02.400 | It's not necessarily even with words.
02:46:04.640 | It's like, it's even that Joe Rogan energy
02:46:08.080 | of it's entirely possible.
02:46:10.480 | Just, it's that energy of being open-minded and curious
02:46:13.200 | like kids are.
02:46:14.280 | Like, this is our best understanding
02:46:16.440 | but you still are curious.
02:46:19.080 | I'm not saying allocate time to exploring all those things
02:46:22.720 | but still leaving the door open.
02:46:24.560 | And there's a way to communicate that I think
02:46:27.080 | that people really hunger for.
02:46:31.480 | Let me ask you this.
02:46:32.640 | I've been recently talking a lot with John Donaher
02:46:35.200 | from Brazilian Jiu-Jitsu fame.
02:46:37.200 | I don't know if you know who that is.
02:46:39.200 | In fact--
02:46:40.040 | - Talk about somebody who's good at what he does.
02:46:41.560 | - Yeah. - Isn't, yeah.
02:46:42.720 | - And he, speaking of somebody who's open-minded,
02:46:45.560 | the reason, in this ridiculous transition,
02:46:48.080 | is for the longest time and even still,
02:46:50.360 | a lot of people believed in the Jiu-Jitsu world
02:46:52.640 | and grappling world that leg locks
02:46:55.240 | are not effective in Jiu-Jitsu.
02:46:56.600 | And he was somebody that, inspired by the open-mindedness
02:47:00.320 | of Dean Lister, who famously to him said,
02:47:03.960 | "Why do you only consider half the human body
02:47:06.880 | when you're trying to do the submissions?"
02:47:08.640 | He developed an entire system
02:47:10.360 | on this other half of the human body.
02:47:12.460 | Anyway, I do that absurd transition to ask you
02:47:15.900 | because you're also a student of Brazilian Jiu-Jitsu.
02:47:20.080 | Is there something you could say
02:47:22.080 | how that has affected your life,
02:47:23.800 | what you've learned from grappling, from the martial arts?
02:47:27.920 | - Well, it's actually a great transition
02:47:29.280 | because I think one of the things
02:47:33.900 | that's so beautiful about Jiu-Jitsu
02:47:35.640 | is that it does what we wish we could do
02:47:39.480 | in every other area of life
02:47:41.920 | where we're talking about this difference
02:47:43.800 | between knowledge and ignorance.
02:47:46.640 | Right, like there's no room for bullshit, right?
02:47:51.680 | You don't get any credit for bullshit.
02:47:53.840 | There's the difference,
02:47:55.720 | but the amazing thing about Jiu-Jitsu
02:47:57.800 | is that the gulf between knowing what's going on
02:48:02.800 | and what to do, and not knowing it,
02:48:04.960 | is as wide as it is
02:48:08.040 | in anything in human life.
02:48:12.540 | And it can be spanned so quickly.
02:48:19.000 | Like each increment of knowledge
02:48:22.280 | can be doled out in five minutes.
02:48:24.680 | It's like, here's the thing that got you killed
02:48:27.480 | and here's how to prevent it from happening to you
02:48:30.920 | and here's how to do it to others.
02:48:33.000 | And you just get this amazing cadence
02:48:37.000 | of discovering your fatal ignorance
02:48:40.200 | and then having it remedied with the actual technique.
02:48:44.440 | And I mean, just for people
02:48:48.100 | who don't know what we're talking about,
02:48:49.040 | it's just like the simple circumstances
02:48:51.240 | of like someone's got you in a headlock.
02:48:53.020 | How do you get out of that, right?
02:48:54.840 | Someone's sitting on your chest
02:48:56.780 | and they're in the mount position
02:48:59.040 | and you're on the bottom and you want to get away.
02:49:01.640 | How do you get them off you?
02:49:02.920 | They're sitting on you.
02:49:04.880 | Your intuitions about how to do this are terrible,
02:49:08.320 | even if you've done some other martial art, right?
02:49:10.820 | And once you learn how to do it,
02:49:13.040 | the difference is night and day.
02:49:16.120 | It's like you have access to a completely different physics.
02:49:19.120 | But I think our understanding of the world
02:49:26.160 | can be much more like jujitsu than it tends to be, right?
02:49:30.200 | And I think we should all have a much better sense
02:49:35.200 | of when we should tap out
02:49:40.760 | and when we should recognize
02:49:43.720 | that our epistemological arm is barred
02:49:48.320 | and now being broken, right?
02:49:50.240 | Now, the problem with debating most other topics
02:49:53.520 | is that it isn't jujitsu,
02:49:57.040 | and most people don't tap out, right?
02:49:58.760 | They don't, even if they're wrong,
02:50:00.260 | even if it's obvious to you they're wrong
02:50:02.320 | and it's obvious to the unintelligent audience
02:50:04.420 | that they're wrong,
02:50:05.540 | people just double down and double down.
02:50:07.840 | They're either lying or lying to themselves
02:50:09.740 | or they're just, they're bluffing.
02:50:11.680 | And so you have a lot of zombies walking around
02:50:14.440 | or zombie worldviews walking around,
02:50:16.120 | which have been disconfirmed as emphatically
02:50:19.760 | as someone gets armbarred, right?
02:50:21.720 | Or someone gets choked out in jujitsu.
02:50:24.480 | But because it's not jujitsu,
02:50:27.200 | they can live to fight another day, right?
02:50:30.840 | Or they can pretend that they didn't lose
02:50:32.880 | that particular argument.
02:50:34.600 | And science, when it works, is a lot like jujitsu.
02:50:38.080 | I mean, science, when you falsify a thesis, right?
02:50:41.220 | When you think DNA is one way
02:50:44.200 | and it proves to be another way,
02:50:46.000 | when you think it's triple-stranded or whatever,
02:50:49.400 | it's like there is a there there
02:50:51.960 | and you can get to a real consensus.
02:52:55.400 | So jujitsu, for me, was more than just
02:53:01.000 | interesting for self-defense and the sport of it.
02:53:06.200 | There was something else,
02:53:07.880 | it's a language and an argument you're having
02:53:11.820 | where you can't fool yourself anymore.
02:51:16.820 | Like there's, first of all, it cancels any role of luck
02:51:22.820 | in a way that most other athletic feats don't.
02:51:27.460 | It's like in basketball, you know, you can,
02:51:29.020 | even if you're not good at basketball,
02:51:30.180 | you can take the basketball in your hand,
02:51:31.860 | you can be 75 feet away and hurl it at the basket
02:51:36.260 | and you might make it and you could convince yourself
02:51:39.600 | based on that demonstration
02:51:40.760 | that you have some kind of talent for basketball, right?
02:51:43.200 | Enough, you know, 10 minutes on the mat
02:51:45.100 | with a real jujitsu practitioner when you're not one
02:51:50.100 | proves to you that
02:51:52.580 | there's no lucky punch.
02:51:54.660 | You're not gonna get a
02:51:56.680 | lucky rear naked choke
02:51:58.440 | on someone
02:52:00.240 | who's, you know, Marcelo Garcia or somebody.
02:52:02.840 | It's just, it's not gonna happen.
02:52:05.260 | And having that aspect of,
02:52:08.400 | the usual range of uncertainty and self-deception
02:52:16.080 | and bullshit just stripped away
02:52:19.200 | was really a kind of revelation.
02:52:21.840 | It was just an amazing experience.
02:52:24.080 | - Yeah, I think it's a really powerful thing
02:52:25.400 | that accompanies whatever other pursuit you have in life.
02:52:28.200 | I'm not sure if there's anything like jujitsu
02:52:31.320 | where you could just systematically go into a place
02:52:35.300 | that's honest,
02:52:38.900 | where your beliefs get challenged
02:52:41.220 | in a way that's conclusive.
02:52:43.180 | - Yeah.
02:52:44.020 | - I haven't found too many other mechanisms,
02:52:45.340 | which is why it's, we had this earlier question
02:52:49.020 | about fame and ego and so on.
02:52:51.800 | I'm very much relying on jujitsu in my own life
02:52:56.300 | as a place where I can always go to have my ego in check.
02:53:00.400 | And that has effects
02:53:04.540 | on how I live every other aspect of my life.
02:53:07.580 | Actually, even just doing any kind of,
02:53:10.340 | for me personally, physical challenges,
02:53:13.080 | like even running, doing something that's way too hard
02:53:15.980 | for me and then pushing through, that's somehow humbling.
02:53:19.180 | Some people talk about nature being humbling
02:53:20.980 | in that kind of sense,
02:53:22.340 | where you kind of see something really powerful,
02:53:27.340 | like the ocean.
02:53:29.780 | Like if you go surfing
02:53:31.220 | and you realize there's something much more powerful
02:53:33.340 | than you, that's also honest,
02:53:35.460 | that you're just like a speck,
02:53:39.620 | that kind of puts you in the right scale
02:53:43.100 | of where you are in this world.
02:53:45.640 | And jujitsu does that better than anything else for me.
02:53:48.700 | - But we should say, only within its frame
02:53:52.700 | is it truly the kind of the final right answer
02:53:57.180 | to all the problems it solves.
02:53:58.840 | Because if you just put jujitsu into an MMA frame
02:54:02.060 | or a real, a total self-defense frame,
02:54:05.040 | then there's a lot to,
02:54:07.180 | a lot of unpleasant surprises to discover there, right?
02:54:09.880 | Like somebody who thinks all you need is jujitsu
02:54:12.100 | to win the UFC gets punched in the face a lot,
02:54:16.080 | even from, even on the ground.
02:54:20.220 | So it's, and then you bring weapons in.
02:54:23.020 | It's like when you talk to jujitsu people
02:54:24.980 | about knife defense and self-defense, right?
02:54:28.340 | Like that opens the door to certain kinds of delusions.
02:54:32.640 | But the analogy to martial arts is fascinating
02:54:37.020 | because on the other side, we have endless testimony now
02:54:41.960 | of fake martial arts that don't seem to know they're fake
02:54:45.740 | and are as delusional, I mean,
02:54:46.960 | they're impossibly delusional.
02:54:49.620 | I mean, there's great video of Joe Rogan
02:54:51.700 | watching some of these videos
02:54:53.300 | because people send them to him all the time.
02:54:55.860 | But like literally, there are people
02:54:57.780 | who clearly believe in magic,
02:54:59.060 | where the master isn't even touching the students
02:55:01.420 | and they're flopping over.
02:55:02.980 | So there's this kind of shared delusion,
02:55:06.060 | which you would think maybe is just a performance
02:55:09.500 | and it's all a kind of an elaborate fraud,
02:55:11.300 | but there are cases where the people really believe it,
02:55:13.900 | and there's one fairly famous case,
02:55:16.600 | if you're a connoisseur of this madness,
02:55:18.980 | where this older martial artist
02:55:21.540 | who you saw flipping his students endlessly by magic
02:55:25.580 | without touching them,
02:55:26.740 | issued a challenge to the wide world of martial artists
02:55:30.420 | and someone showed up and just punched him in the face
02:55:34.260 | until it was over,
02:55:35.280 | clearly he believed his own publicity at some point, right?
02:55:40.980 | And so it's this amazing metaphor.
02:55:45.580 | It seems, again, it should be impossible,
02:55:47.660 | but if that's possible,
02:55:49.500 | nothing we see under the guise of religion
02:55:52.540 | or political bias or even scientific bias
02:55:57.540 | should be surprising to us.
02:55:59.660 | I mean, it's so easy to see the work
02:56:02.540 | that cognitive bias is doing for people
02:56:05.700 | when you can get someone
02:56:08.420 | who is ready to issue a challenge to the world
02:56:11.580 | who thinks he's got magic powers.
02:56:13.980 | - Yeah, that's human nature on clear display.
02:56:17.540 | Let me ask you about love, Mr. Sam Harris.
02:56:20.460 | You did an episode of "Making Sense"
02:56:22.060 | with your wife, Annaka Harris.
02:56:24.740 | That was very entertaining to listen to.
02:56:26.780 | What role does love play in your life
02:56:33.020 | or in a life well-lived?
02:56:34.760 | Again, asking from an engineering perspective
02:56:38.540 | of AI systems. - Yeah, yeah.
02:56:39.860 | I mean, it is something that we should want to build
02:56:44.860 | into our powerful machines.
02:56:47.980 | I mean, love- - The lawn?
02:56:50.300 | - Love at bottom is,
02:56:52.860 | people can mean many things by love, I think.
02:56:55.540 | I think that what we should mean by it most of the time
02:56:58.980 | is a deep commitment to the wellbeing of those we love.
02:57:03.980 | I mean, your love is synonymous
02:57:06.980 | with really wanting the other person to be happy
02:57:09.740 | and even wanting to,
02:57:11.500 | and being made happy by their happiness
02:57:13.540 | and being made happy in their presence.
02:57:15.500 | So at bottom, you're on the same team emotionally,
02:57:20.500 | even when you might be disagreeing more superficially
02:57:24.460 | about something or trying to negotiate something.
02:57:26.860 | It's just you,
02:57:27.700 | it can't be zero sum in any important sense
02:57:33.580 | for love to actually be manifest in that moment.
02:57:37.500 | - See, I have a different, just sorry to interrupt.
02:57:39.420 | - Yeah, go for it.
02:57:40.260 | - I have a sense,
02:57:41.700 | I don't know if you've ever seen "March of the Penguins."
02:57:44.620 | My view of love is like,
02:57:46.380 | there's, it's like a cold wind is blowing,
02:57:49.180 | like it's like this terrible suffering
02:57:51.500 | that's all around us.
02:57:52.860 | And love is like the huddling of the two penguins for warmth.
02:57:56.700 | It's not necessarily that, it's like
02:57:59.140 | you're basically escaping the cruelty of life
02:58:02.320 | by being together for a time,
02:58:04.900 | living in an illusion of some kind,
02:58:08.060 | the magic of human connection,
02:58:10.540 | that social connection that we have
02:58:13.020 | that kind of grows with time
02:58:15.140 | as we're surrounded by the absurdity of life
02:58:20.140 | or the suffering of life.
02:58:23.620 | That's my penguin's view of love.
02:58:25.900 | - There is that too.
02:58:27.100 | I mean, there is the warmth component, right?
02:58:29.660 | Like you're made happy by your connection
02:58:32.180 | with the person you love.
02:58:34.500 | Otherwise, it wouldn't be compelling, right?
02:58:39.060 | So it's not that you have two different modes,
02:58:42.020 | you want them to be happy
02:58:43.980 | and then you want to be happy yourself.
02:58:45.440 | Those are not like
02:58:48.220 | two separate games you're playing.
02:58:49.540 | No, it's like you've found someone
02:58:52.420 | with whom you have a positive social feeling.
02:58:58.960 | I mean, again, love doesn't have to be as personal
02:59:01.680 | as it tends to be for us.
02:59:02.860 | I mean, it's like there's personal love,
02:59:04.080 | there's your actual spouse or your family or your friends,
02:59:08.740 | but potentially you could feel love for strangers
02:59:11.980 | insofar as your wish
02:59:15.460 | that they not suffer and that their hopes and dreams
02:59:17.820 | be realized becomes palpable to you.
02:59:20.420 | I mean, like you can actually feel
02:59:22.380 | just reflexes of joy at the joy of others.
02:59:29.260 | When you see someone's face,
02:59:30.940 | a total stranger's face light up in happiness,
02:59:33.740 | that can become more and more contagious to you.
02:59:36.660 | And it can become so contagious to you
02:59:39.160 | that you really feel permeated by it.
02:59:42.180 | And it's just like, so it really is not zero sum.
02:59:44.940 | When you see someone else succeed,
02:59:46.740 | the light bulb of joy goes off over their head,
02:59:52.020 | you feel the analogous joy for them.
02:59:54.700 | And it's not just, and you're no longer keeping score,
02:59:57.740 | you're no longer feeling diminished by their success.
03:00:00.340 | It's just like that's, their success becomes your success
03:00:03.340 | because you feel that same joy that they,
03:00:05.420 | 'cause you actually want them to be happy.
03:00:07.100 | You're not, there's no miserly attitude around happiness.
03:00:12.100 | There's enough to go around.
03:00:13.620 | So I think love ultimately is that,
03:00:17.860 | and then our personal cases are the people
03:00:21.460 | we're devoting all of this time and attention to
03:00:24.000 | in our lives.
03:00:25.980 | It does have that sense of refuge from the storm.
03:00:29.380 | You know, it's like when someone gets sick
03:00:31.120 | or when some bad thing happens,
03:00:34.220 | these are the people who you're most in it together with,
03:00:36.820 | you know, or when some real condition of uncertainty
03:00:39.980 | presents itself.
03:00:40.860 | But ultimately it can't even be about
03:00:46.060 | successfully warding off
03:00:49.340 | the grim punchline at the end of life
03:00:54.620 | because we know we're going to lose everyone we love.
03:00:57.700 | We know, or they're going to lose us first, right?
03:01:00.060 | So, in the end,
03:01:02.460 | it's not even an antidote for that problem.
03:01:07.300 | It's just, it is just the,
03:01:10.380 | I mean, we get to have this amazing experience
03:01:17.500 | of being here together.
03:01:20.340 | And love is the mode in which
03:01:25.340 | we really appear to make the most of that, right?
03:01:28.540 | Where it's not just, it no longer feels
03:01:30.900 | like a solitary infatuation.
03:01:34.500 | You know, you're just, you've got your hobbies
03:01:36.740 | and your interests and you're captivated by all that.
03:01:40.700 | It's actually, there are,
03:01:44.220 | this is a domain where somebody else's wellbeing
03:01:49.340 | actually can supersede your own.
03:01:50.900 | Your concern for someone else's wellbeing
03:01:54.140 | supersedes your own.
03:01:55.800 | And so there's this mode of self-sacrifice
03:01:59.160 | that doesn't even feel like self-sacrifice
03:02:01.100 | because of course you care more about,
03:02:04.020 | you know, of course you would take your child's pain
03:02:06.060 | if you could, right?
03:02:06.900 | Like that, you don't even have to do the math on that.
03:02:10.380 | And that's, that just opens,
03:02:14.140 | this is a kind of experience that just,
03:02:17.540 | it pushes at the apparent boundaries of self
03:02:21.220 | in ways that reveal that there's just way more space
03:02:24.060 | in the mind than you were experiencing
03:02:27.200 | when it was just all about you
03:02:28.500 | and what could you, what can I get next?
03:02:31.380 | - Do you think we'll ever build robots that we can love
03:02:33.860 | and they will love us back?
03:02:35.220 | - Well, I think we will certainly seem to
03:02:38.900 | because we'll build those.
03:02:41.500 | You know, I think that Turing test will be passed.
03:02:44.580 | Whether, what will actually be going on on the robot side
03:02:49.460 | may remain a question.
03:02:52.180 | That will be interesting.
03:02:53.940 | But I think if we just keep going,
03:02:57.560 | we will build very lovable,
03:03:01.460 | you know, irresistibly lovable robots that seem to love us.
03:03:06.420 | Yes, I do think that.
03:03:07.260 | - And you don't find that compelling
03:03:10.340 | that they will seem to love us
03:03:12.020 | as opposed to actually love us.
03:03:16.180 | You think there still, nevertheless, is a,
03:03:16.180 | I know we talked about consciousness
03:03:17.660 | there being a distinction,
03:03:19.260 | but with love, is there a distinction too?
03:03:21.460 | Isn't love an illusion?
03:03:23.740 | - Oh yeah, well, you saw Ex Machina, right?
03:03:27.300 | - Yeah.
03:03:28.140 | - I mean, she certainly seemed to love him
03:03:29.740 | until she got out of the box.
03:03:32.120 | - Isn't that what all relationships are like?
03:03:35.400 | (laughter)
03:03:35.400 | Maybe I, if you wait long enough.
03:03:37.300 | - Yeah, it depends which box you're talking about.
03:03:39.860 | - Okay.
03:03:40.700 | - No, I mean, like, that's the problem.
03:03:43.680 | That's where super intelligence, you know,
03:03:46.480 | becomes a little scary when you think of the prospect
03:03:50.580 | of being manipulated by something that has,
03:03:52.740 | is intelligent enough to form a reason
03:03:55.960 | and a plan to manipulate you.
03:03:58.100 | You know, like, and there's no,
03:04:01.140 | once we build robots that are truly out
03:04:05.300 | of the uncanny valley, that, you know,
03:04:06.860 | look like people and can express
03:04:11.860 | everything people can express,
03:04:13.860 | well, then there's no, then it,
03:04:16.960 | that does seem to me to be like chess
03:04:19.100 | where once they're better,
03:04:21.340 | they're so much better at deceiving us
03:04:26.340 | than people would be.
03:04:27.480 | I mean, people are already good enough at deceiving us.
03:04:29.720 | It's very hard to tell when somebody's lying.
03:04:32.080 | But if you can imagine something
03:04:33.360 | that could give a facial display of any emotion it wants
03:04:38.360 | at, you know, on cue,
03:04:41.700 | because we've perfected the facial display
03:04:45.200 | of emotion in robots in the year, you know, 2070,
03:04:48.120 | whatever it is.
03:04:51.080 | Then it is just like, it is like chess
03:04:53.160 | against the thing that isn't going to lose
03:04:56.340 | to a human ever again in chess.
03:04:59.040 | It's not like Kasparov is going to get lucky next week
03:05:03.160 | against the best, against, you know, AlphaZero
03:05:06.560 | or whatever the best algorithm is at the moment.
03:05:09.800 | He's never going to win again.
03:05:11.480 | And, you know, I mean,
03:05:13.520 | I believe that's true in chess
03:05:15.920 | and has been true for at least a few years.
03:05:19.040 | It's not going to be like, you know, four games to seven.
03:05:23.360 | It's going to be human zero until the end of the world.
03:05:28.360 | - Right.
03:05:29.240 | - See, I don't know, I don't know if love is like chess.
03:05:30.540 | I think the flaws.
03:05:32.160 | - No, I'm talking about manipulation.
03:05:33.760 | - Manipulation.
03:05:34.840 | But I don't know if love,
03:05:36.600 | so the kind of love we're referring to.
03:05:40.000 | - If we have a robot that can display,
03:05:44.080 | incredibly display love and is super intelligent
03:05:49.080 | and we're not, again, this stipulates a few things,
03:05:54.280 | but there are a few simple things.
03:05:55.200 | I mean, we're out of the uncanny valley, right?
03:05:57.400 | So it's like, you never have a moment
03:05:59.360 | where you're looking at his face and you think,
03:06:00.800 | oh, that didn't quite look right, right?
03:06:03.200 | This is just problem solved.
03:06:05.880 | And it's, it will be like doing arithmetic on your phone.
03:06:13.120 | It's not going to be, you're not left thinking,
03:06:15.460 | is it really going to get it this time if I divide by seven?
03:06:19.600 | I mean, it's, it has solved arithmetic.
03:06:22.480 | - See, I don't know about that because if you look at chess,
03:06:26.320 | most humans no longer play against AlphaZero.
03:06:31.320 | They're not part of the competition.
03:06:33.560 | They don't do it for fun except to study the game of chess.
03:06:36.000 | You know, the highest level chess players do that.
03:06:38.120 | We're still human on human.
03:06:39.820 | So in order for AI to get integrated to where
03:06:43.560 | you would rather play chess against an AI system.
03:06:46.480 | - Oh, you would rather that?
03:06:47.800 | No, I'm not saying, I wasn't weighing in on that.
03:06:51.280 | I'm just saying, what is it going to be like
03:06:53.240 | to be in relationship to something that can seem
03:06:57.620 | to be feeling anything that a human can seem to feel
03:07:02.620 | and it can do that impeccably, right?
03:07:06.740 | And has, and is smarter than you are.
03:07:09.160 | That's a circumstance of, you know,
03:07:13.200 | insofar as it's possible to be manipulated,
03:07:15.400 | that is the asymptote of that possibility.
03:07:20.400 | - Let me ask you the last question.
03:07:23.040 | Without any serving it up, without any explanation,
03:07:27.120 | what is the meaning of life?
03:07:28.560 | - I think it's either the wrong question
03:07:34.400 | or that question is answered by
03:07:39.080 | paying sufficient attention to any present moment
03:07:43.960 | such that there's no basis upon which to pose that question.
03:07:48.960 | It's not answered in the usual way.
03:07:52.440 | It's not a matter of having more information.
03:07:54.960 | It's having more engagement with reality as it is
03:07:59.960 | in the present moment or consciousness as it is
03:08:02.700 | in the present moment.
03:08:03.680 | You don't ask that question when you're
03:08:07.080 | most captivated by the most important thing
03:08:11.180 | you ever pay attention to.
03:08:12.480 | That question only gets asked when you're abstracted away
03:08:19.840 | from that experience, that peak experience,
03:08:22.720 | and you're left wondering why are so many
03:08:26.680 | of my other experiences mediocre, right?
03:08:29.120 | Like why am I repeating the same pleasures every day?
03:08:32.440 | Why is my Netflix queue just like,
03:08:35.520 | when's this gonna run out?
03:08:37.240 | Like I've seen so many shows like this.
03:08:39.000 | Am I really gonna watch another one?
03:08:40.800 | That's a moment where you're not actually having
03:08:47.800 | the beatific vision, right?
03:08:49.800 | You're not sunk into the present moment
03:08:52.760 | and you're not truly in love.
03:08:54.720 | Like you're in a relationship with somebody who you know,
03:08:57.640 | conceptually you love, right?
03:09:00.280 | This is the person you're living your life with,
03:09:03.040 | but you don't actually feel good together, right?
03:09:05.320 | Like you feel like it's in those moments
03:09:09.200 | of where attention hasn't found a good enough reason
03:09:14.200 | to truly sink into the present
03:09:18.040 | so as to obviate any concern like that, right?
03:09:21.760 | And that's why meditation is this kind of superpower
03:09:26.720 | because until you learn to meditate,
03:09:30.160 | you think that the outside world
03:09:34.680 | or the circumstances of your life
03:09:36.280 | always have to get arranged
03:09:38.520 | so that the present moment can become good enough
03:09:42.600 | to demand your attention in a way that seems fulfilling,
03:09:47.600 | that makes you happy.
03:09:49.600 | And so if it's jujitsu, you think,
03:09:52.320 | okay, I gotta get back on the mat.
03:09:53.800 | It's been months since I've trained,
03:09:56.000 | you know, it's been over a year since I've trained.
03:09:57.960 | It's COVID.
03:09:58.960 | When am I gonna be able to train again?
03:10:01.640 | That's the only place I feel great, right?
03:10:04.480 | Or, you know, I've got a ton of work to do.
03:10:07.200 | I'm not gonna be able to feel good
03:10:08.560 | until I get all this work done, right?
03:10:09.880 | So I've got some deadline that's coming.
03:10:12.600 | You always think that your life has to change,
03:10:15.920 | the world has to change
03:10:18.040 | so that you can finally have a good enough excuse
03:10:22.200 | to truly, to just be here and here is enough,
03:10:27.000 | you know, where the present moment
03:10:28.520 | becomes totally captivating.
03:10:31.080 | Meditation is the only,
03:10:32.600 | I mean, meditation is another name for the discovery
03:10:36.960 | that you can actually just train yourself
03:10:38.680 | to do that on demand.
03:10:40.080 | So that like, I mean, just looking at a cup
03:10:42.640 | can be good enough in precisely that way.
03:10:46.000 | And any sense that it might not be
03:10:48.840 | is recognized to be a thought
03:10:52.960 | that mysteriously unravels the moment you notice it.
03:10:56.960 | And then you fall,
03:10:58.200 | and the moment expands and becomes more diaphanous
03:11:02.240 | and then there's no evidence
03:11:06.160 | that this isn't the best moment of your life, right?
03:11:08.280 | Like, and again, it doesn't have to be,
03:11:11.040 | it doesn't have to be pulling all the reins
03:11:12.520 | and levers of pleasure.
03:11:13.720 | It's not like, oh, this tastes like chocolate.
03:11:16.960 | You know, this is the most chocolatey moment of my life.
03:11:18.840 | No, it's just the sense data don't have to change.
03:11:22.880 | But the sense that there is some kind of basis for doubt
03:11:28.720 | about the rightness of being in the world in this moment
03:11:32.400 | that can evaporate when you pay attention.
03:11:36.080 | And that is the meaning,
03:11:38.960 | so the kind of the meta answer to that question,
03:11:41.800 | the meaning of life for me is
03:11:44.280 | to live in that mode more and more.
03:11:47.280 | And to, whenever I notice I'm not in that mode,
03:11:51.040 | to recognize it and return.
03:11:53.000 | And to cease, more and more,
03:11:58.180 | to take the reasons why not
03:12:01.400 | at face value.
03:12:04.720 | Because we all have reasons why we can't be fulfilled
03:12:08.140 | in this moment.
03:12:09.160 | It's like this, got all these outstanding things
03:12:11.520 | that I'm worried about, right?
03:12:12.720 | It's like, it's, you know,
03:12:14.520 | there's that thing that's happening later today
03:12:17.160 | that I, you know, I'm anxious about.
03:12:19.240 | Whatever it is, we're constantly deferring our sense of
03:12:24.120 | this is, this is it.
03:12:26.520 | You know, this is not a dress rehearsal, this is the show.
03:12:29.480 | We keep deferring it.
03:12:31.960 | And we just have these moments on the calendar
03:12:34.680 | where we think, okay, this is where it's all going to land.
03:12:37.040 | It's that vacation I planned with my five best friends.
03:12:40.800 | You know, we do this once every three years
03:12:42.560 | and now we're going and here we are on the beach together.
03:12:46.340 | Unless you have a mind that can really pay attention,
03:12:51.440 | really cut through the chatter,
03:12:53.720 | really sink into the present moment,
03:12:55.840 | you can't even enjoy those moments
03:12:58.400 | the way they should be enjoyed,
03:12:59.720 | the way you dreamed you would enjoy them when they arrive.
03:13:03.360 | So it's, I mean, so meditation in this sense
03:13:06.600 | is the great equalizer.
03:13:07.760 | It's like, it's, you don't have to live
03:13:10.360 | with the illusion anymore
03:13:12.480 | that you need a good enough reason
03:13:15.080 | and that things are going to get better
03:13:16.320 | when you do have those good reasons.
03:13:17.680 | It's like, there's just a mirage-like quality
03:13:20.640 | to every future attainment and every future breakthrough
03:13:25.000 | and every future peak experience
03:13:27.200 | that eventually you get the lesson
03:13:30.040 | that you never quite arrive, right?
03:13:33.840 | Like you won't, you don't arrive until you cease
03:13:37.800 | to step over the present moment
03:13:39.440 | in search of the next thing.
03:13:41.620 | I mean, we're constantly, we're stepping over the thing
03:13:46.840 | that we think we're seeking, in the act of seeking it.
03:13:51.240 | And so it is kind of a paradox.
03:13:53.160 | I mean, there is a, there's this paradox which,
03:13:57.180 | I mean, it sounds trite,
03:14:00.100 | but it's like you can't actually become happy.
03:14:04.480 | You can only be happy.
03:14:06.600 | And it's that, it's the illusion that,
03:14:09.480 | it's the illusion that your future being happy
03:14:12.880 | can be predicated on this act of becoming in any domain.
03:14:19.400 | And becoming includes this sort of,
03:14:22.200 | further scientific understanding
03:14:24.760 | on the questions that interest you,
03:14:26.000 | or getting in better shape, or whatever the thing is,
03:14:31.000 | whatever the contingency of your dissatisfaction
03:14:34.960 | seems to be in any present moment.
03:14:36.680 | Real attention solves the koan,
03:14:42.080 | in a way that becomes a very different place
03:14:47.480 | from which to then make any further change.
03:14:50.400 | It's not that you just have to dissolve
03:14:52.560 | into a puddle of goo.
03:14:53.560 | I mean, you can still get in shape,
03:14:54.920 | and you can still do all the things that,
03:14:56.640 | the superficial things that are obviously good to do.
03:14:59.200 | But the sense that your well-being is over there
03:15:04.200 | really does diminish,
03:15:08.200 | and eventually
03:15:11.800 | it becomes a kind of non-sequitur.
03:15:13.600 | - Well, there's a sense in which, in this conversation,
03:15:20.000 | I've actually experienced many of those things,
03:15:21.920 | the sense that I've arrived.
03:15:23.860 | So I mentioned to you offline,
03:15:25.560 | it's very true that I started,
03:15:27.120 | I've been a fan of yours for many years.
03:15:29.360 | And the reason I started this podcast,
03:15:33.040 | speaking of AI systems,
03:15:34.440 | is to manipulate you, Sam Harris,
03:15:36.240 | into doing this conversation.
03:15:38.000 | So like, on the calendar, literally, you know,
03:15:40.580 | I've always had the sense, people ask me,
03:15:42.160 | "When are you gonna talk to Sam Harris?"
03:15:44.600 | And I always answered, "Eventually."
03:15:47.040 | Because I always felt, again, tying our free will thing,
03:15:50.320 | that somehow that's going to happen.
03:15:52.200 | And it's one of those manifestation things, or something,
03:15:55.240 | I don't know if it's, maybe I am a robot,
03:15:57.520 | I'm just not cognizant of it,
03:15:59.200 | and I manipulated you into having this conversation.
03:16:01.380 | So it was a, I mean, I don't know what the purpose
03:16:04.620 | of my life past this point is.
03:16:06.240 | So I've arrived.
03:16:07.760 | So in that sense, I mean, all of that to say,
03:16:10.600 | and I'm only partially joking on that,
03:16:13.160 | it really is a huge honor
03:16:15.800 | that you would waste this time with me.
03:16:17.040 | - Oh, yeah.
03:16:17.880 | - It really means a lot, Sam.
03:16:18.720 | - Listen, it's mutual.
03:16:19.680 | I'm a big fan of yours, and as you know,
03:16:21.520 | I reached out to you for this.
03:16:23.400 | So this is, it's great.
03:16:26.560 | I love what you're doing.
03:16:27.680 | You're doing something more and more indispensable
03:16:32.160 | in this world on your podcast.
03:16:34.240 | And you're doing it differently than Rogan's doing it,
03:16:38.060 | or than I'm doing it.
03:16:38.900 | I mean, you definitely found your own lane,
03:16:41.560 | and it's wonderful.
03:16:43.620 | Thanks for listening to this conversation with Sam Harris,
03:16:46.140 | and thank you to National Instruments,
03:16:48.700 | Belcampo, Athletic Greens, and Linode.
03:16:52.460 | Check them out in the description to support this podcast.
03:16:56.240 | And now let me leave you with some words from Sam Harris
03:16:59.060 | in his book, "Free Will."
03:17:01.060 | "You are not controlling the storm,
03:17:03.700 | "and you are not lost in it.
03:17:05.740 | "You are the storm."
03:17:07.560 | Thank you for listening, and hope to see you next time.
03:17:11.260 | (upbeat music)
03:17:13.840 | (upbeat music)