back to index

Annaka Harris: Free Will, Consciousness, and the Nature of Reality | Lex Fridman Podcast #326


Chapters

0:0 Introduction
0:53 Free will
54:10 Consciousness
84:42 Depression
97:59 Psychedelics
105:58 Meditation
110:22 Ideas
134:8 AI sentience
151:30 Suffering
154:26 Meaning of life

Whisper Transcript | Transcript Only Page

00:00:00.000 | When we use the term free will,
00:00:01.560 | we're talking about this feeling that consciousness,
00:00:04.600 | that we have a self, that there's this concrete thing
00:00:08.200 | that's separate from brain processing
00:00:11.340 | that somehow swoops in and is the cause of our decision
00:00:15.060 | or the cause of our next action.
00:00:17.040 | And that is, in large part, if not in its entirety,
00:00:21.280 | an illusion.
00:00:22.120 | - The following is a conversation with Anika Harris,
00:00:27.000 | author of "Conscious,"
00:00:28.680 | a brief guide to the fundamental mystery of the mind.
00:00:32.600 | And is someone who writes and thinks a lot
00:00:35.240 | about the nature of consciousness and of reality,
00:00:39.080 | especially from the perspectives
00:00:40.640 | of physics and neuroscience.
00:00:43.360 | This is the Lux Freedom Podcast.
00:00:45.360 | To support it, please check out our sponsors
00:00:47.600 | in the description.
00:00:48.880 | And now, dear friends, here's Anika Harris.
00:00:52.460 | In your book, "Conscious," you described evidence
00:00:55.940 | that free will is an illusion
00:00:57.960 | and that consciousness is used to construct this illusion
00:01:00.760 | and convince ourselves
00:01:02.440 | that we are, in fact, deciding our actions.
00:01:05.160 | Can you explain this?
00:01:06.280 | I think this is chapter three.
00:01:07.720 | - First of all, I really think it's important
00:01:09.960 | to make a distinction between free will and conscious will.
00:01:13.560 | And we'll get into that in a moment.
00:01:15.600 | So free will, in terms of our brain as a system in nature,
00:01:20.000 | making complex decisions
00:01:21.920 | and doing all of the complex processing it does,
00:01:25.160 | there is a decision-making process in nature
00:01:28.040 | that our brains undergo that we can call free will.
00:01:32.240 | That's fine to use that shorthand for that.
00:01:34.880 | Although, once we get into the details,
00:01:36.820 | I might convince you that it's not so free.
00:01:40.360 | But the decision-making process is a process in nature.
00:01:44.240 | The feeling, our conscious experience
00:01:47.040 | of feeling like consciousness is the thing
00:01:50.300 | that is driving the behavior,
00:01:53.780 | that is, I would say, in most cases, an illusion.
00:01:57.960 | And usually, when we talk about free will,
00:02:01.680 | that's the thing we're talking about.
00:02:02.920 | I mean, sometimes it's in conjunction
00:02:04.760 | with the decision-making process,
00:02:06.320 | but for the most part, when we use the term free will,
00:02:09.200 | we're talking about this feeling that consciousness,
00:02:12.240 | that we have a self, that there's this concrete thing
00:02:15.840 | that's separate from brain processing
00:02:18.980 | that somehow swoops in
00:02:20.640 | and is the cause of our decision
00:02:22.700 | or the cause of our next action.
00:02:24.680 | And that is, in large part,
00:02:27.400 | if not in its entirety, an illusion.
00:02:29.800 | - So conscious will is an illusion,
00:02:32.000 | and then we can try to figure out--
00:02:33.200 | - Free will, I would say, is good shorthand
00:02:35.760 | for a process in nature,
00:02:37.240 | which is a decision-making process of the brain.
00:02:40.200 | - But decisions are still being made.
00:02:42.040 | So there's,
00:02:42.880 | if you ran the universe over again,
00:02:49.560 | is there, would it turn out the same way?
00:02:52.780 | I mean, maybe, I'm trying to sneak up to,
00:02:55.460 | what does it mean to make a decision
00:02:57.300 | in a way that's almost, that means something?
00:03:01.140 | - So right, so this is where our intuitions get challenged.
00:03:04.020 | I've been thinking about some new examples for this
00:03:06.540 | just because I talk about it a lot.
00:03:08.740 | And the truth is, most of the things I write about
00:03:11.860 | and talk about and think about are so counterintuitive.
00:03:14.640 | I mean, that's really what my game is,
00:03:16.860 | is breaking intuitions, shaking up intuitions
00:03:19.860 | in order to get a deeper understanding of reality.
00:03:23.180 | I'm often, even though I've thought about this for 20 years
00:03:27.680 | and think about it all the time,
00:03:28.840 | it's an obsession of mine, really,
00:03:30.920 | I have to get back into that mind frame
00:03:34.600 | to be able to think clearly about it
00:03:37.160 | because it is so counterintuitive.
00:03:38.760 | - How long does that take?
00:03:40.400 | How hard is that?
00:03:41.240 | - Depends on if there are kids around
00:03:42.720 | or if I'm alone or if I've been meditating.
00:03:44.860 | But what I was going to say, actually,
00:03:46.580 | I felt like we needed to just take one step back
00:03:49.000 | and talk a little bit, just because I think
00:03:52.200 | the importance of shaking up intuitions
00:03:55.000 | for scientific advancement is such an important piece
00:04:00.000 | of the scientific process.
00:04:02.480 | And I think we've reached a point in consciousness studies
00:04:05.840 | where it's very difficult to move forward.
00:04:10.640 | And usually that's a sign that we need
00:04:13.340 | to start shaking up our intuitions.
00:04:14.960 | So throughout history, the huge breakthroughs,
00:04:18.240 | the things that have really shifted our view of the universe
00:04:21.440 | and our place in the universe and all of that,
00:04:24.040 | those almost always, if not always,
00:04:26.120 | require that we, at the very least,
00:04:29.840 | shift our intuitions, update our intuitions.
00:04:33.580 | But many of them we just have to let go of intuitions
00:04:36.080 | that are feeding us false information
00:04:37.960 | about the way the world works.
00:04:39.920 | - Well, the weirdest thing here is that
00:04:41.600 | here we're looking at our own mind.
00:04:44.760 | So you have to let go of your intuitions
00:04:48.720 | about your own intuitions.
00:04:51.920 | - Yeah, right, exactly.
00:04:53.560 | It's very meta and makes it hard.
00:04:55.760 | And it's part of the reason why doing interviews for me
00:04:59.760 | feels so difficult, aside from the fact
00:05:01.880 | that I just have social anxiety in general.
00:05:04.840 | - Well, it's good 'cause I took mushrooms
00:05:06.120 | just before we started.
00:05:07.200 | - Perfect, that's what I should have done.
00:05:09.520 | - We're on this journey together, let's go.
00:05:12.080 | So where do we take a step backwards to us?
00:05:14.280 | - Well, I was going to say,
00:05:15.560 | I mean, this leads into the point I was going to make,
00:05:17.840 | but what I was going to say is,
00:05:19.800 | I mean, also just for me, I feel like
00:05:21.920 | I'm not as good at speaking as I am at writing,
00:05:26.760 | that I'm clearer in my writing.
00:05:28.360 | And because these topics are so difficult
00:05:30.480 | to get our minds around,
00:05:31.900 | it's hard to kind of get to any real conclusion in real time.
00:05:38.420 | It's actually how I started writing my book,
00:05:42.400 | was just writing for myself.
00:05:43.640 | I decided that I needed to spend some time
00:05:45.800 | writing down all of my thoughts
00:05:47.160 | in order to get clear about how I think about them.
00:05:50.920 | - So you write down a sentence and you think,
00:05:53.160 | you're in the silence, quiet. - Paragraphs.
00:05:55.480 | - Paragraphs, and you just--
00:05:58.000 | - And then I see if that makes sense,
00:05:59.620 | and then I check it with my intuitions,
00:06:01.080 | which is really the scientific process.
00:06:02.680 | And I really, in many ways,
00:06:04.360 | I feel like I'm a physicist at heart.
00:06:06.600 | All of my inquiry, all of my career,
00:06:10.320 | everything I'm interested in,
00:06:11.520 | actually going back to being a child,
00:06:14.680 | is just deep curiosity about how the world works,
00:06:19.200 | what this place is, what it's made of,
00:06:22.280 | how we got here, just being amazed at the fact
00:06:26.220 | that I'm having an experience over here
00:06:28.240 | and you're having one over there,
00:06:29.600 | and we're in this moment of time,
00:06:31.780 | and what does that all mean?
00:06:34.080 | My interest in consciousness really came out of
00:06:37.200 | originally an interest in physics.
00:06:38.920 | And I guess that the two were always side by side,
00:06:41.280 | and I didn't really connect them until I was older,
00:06:43.520 | but I've always been really interested
00:06:47.040 | in just understanding the nature of reality
00:06:51.820 | before I even had language to describe it.
00:06:54.640 | - You talked about laying down and looking up at the stars,
00:06:59.120 | sort of trying to let go of the intuition
00:07:02.460 | that there's a ground below us,
00:07:05.200 | which is a really interesting exercise,
00:07:06.880 | and there's many exercises of this sort you can do,
00:07:09.120 | but that's a really good one.
00:07:10.480 | - Well, and I think scientists and children
00:07:14.280 | who will become scientists,
00:07:15.920 | who are just kind of scientists at heart,
00:07:18.180 | really enjoy that feeling
00:07:20.160 | of breaking through their intuitions.
00:07:22.680 | And I remember the first time it happened, actually,
00:07:25.400 | I was playing with marbles,
00:07:28.960 | and marbles have all these different shapes.
00:07:32.440 | Each one is unique, and they're all these,
00:07:34.000 | it looks like there's liquid inside them.
00:07:36.920 | And I remember asking my father
00:07:39.560 | how they got the liquid inside the glass ball,
00:07:41.920 | and he said, "Actually, it's solid all the way through.
00:07:43.920 | "It's all glass."
00:07:45.220 | And I had such a hard time imagining,
00:07:47.680 | it just didn't seem right to me.
00:07:48.520 | I was very young when I,
00:07:50.000 | but he's a complicated person,
00:07:52.040 | but he was wonderful in this way,
00:07:53.760 | in that he would kind of entertain my curiosity.
00:07:57.160 | And so he said, "Let's open them up."
00:07:59.920 | And he got a towel, and we put the marbles on the towel,
00:08:02.400 | and got a hammer, and he smashed them all,
00:08:04.100 | and lo and behold, it was all glass.
00:08:06.260 | And I remember, it's like the first time I had that feeling
00:08:10.840 | of realizing, wow, the truth was so different
00:08:14.660 | from what I expected.
00:08:16.540 | And I like that feeling.
00:08:18.600 | And of course, we need to be able to do that
00:08:20.520 | to understand that the Earth is flat,
00:08:23.220 | to understand the germ theory of disease,
00:08:25.280 | to understand long processes in nature like evolution.
00:08:29.060 | I mean, we just can never really intuit
00:08:31.300 | that we share genes with ants.
00:08:33.900 | - Did you just say the Earth is flat?
00:08:35.580 | - I mean, the Earth is not flat.
00:08:37.020 | - Did I say that? - Yeah, this is great.
00:08:38.980 | Which I actually like to think about--
00:08:39.820 | - Exactly, see, this is why I need to write and not speak.
00:08:43.140 | - Well, I actually really like conspiracy theories and so on.
00:08:46.660 | I really like flat Earth,
00:08:48.540 | people that believe the Earth is flat,
00:08:49.960 | or not believe, but argue for the Earth is flat.
00:08:52.980 | - Well, it's interesting, because you can see,
00:08:54.540 | I mean, the intuition is so strong, I just said it.
00:08:56.740 | - The thing I love about folks who argue for flat Earth
00:08:59.940 | is they are thinking deeply.
00:09:03.620 | They're questioning, actually,
00:09:05.260 | what has now become intuition.
00:09:07.040 | It's become the mainstream narrative
00:09:09.820 | that the Earth is round, where people actually don't,
00:09:14.100 | yeah, don't think, actually,
00:09:18.060 | how crazy it is that the Earth is round.
00:09:20.980 | We're in a ball.
00:09:22.700 | And that's exactly what you're doing.
00:09:24.260 | You're looking out at the space.
00:09:26.140 | It's really humbling, 'cause I think the basic intuition,
00:09:29.900 | when you're walking on the ground,
00:09:32.060 | you kind of, there's a underlying belief
00:09:35.940 | that Earth is the center of the universe.
00:09:37.780 | There's a kind of feeling
00:09:39.580 | like this is the only world that exists.
00:09:42.500 | And you kind of know that there's a huge universe out there,
00:09:45.820 | but you don't really load that information in.
00:09:47.860 | And I think flat Earthers are really contending
00:09:51.380 | with those big ideas.
00:09:53.300 | - Yeah, no, and I think, I mean, the truth is that
00:09:55.860 | when those observations were first made,
00:09:58.660 | when the celestial observations were made
00:10:00.780 | that revealed this fact to us,
00:10:03.300 | I can't remember how long it took,
00:10:04.740 | but I think it was close to 100 years
00:10:06.660 | before it was actually accepted as common knowledge
00:10:11.620 | that we're no longer the center of the universe,
00:10:13.820 | or of course we never were.
00:10:15.900 | And that's true almost every time
00:10:19.060 | we have a breakthrough like that,
00:10:20.980 | that challenges our intuitions.
00:10:23.140 | There's usually a period of time where we have to,
00:10:28.140 | and this is an important part of the process,
00:10:30.020 | because often our intuitions give us good information.
00:10:32.820 | And so when the science goes against,
00:10:35.780 | when our scientific observations go against our intuitions,
00:10:39.020 | it's important for us to let that in
00:10:43.340 | and to see which side is gonna win.
00:10:46.140 | And once it's clear that the evidence is winning,
00:10:49.660 | then there's this period of time
00:10:50.900 | where we have to grapple with our intuitions
00:10:53.340 | and shift the way we frame our worldview
00:10:57.020 | and go through that process.
00:10:59.100 | - So free will.
00:11:00.460 | - Free will's a hard one.
00:11:01.380 | - It's a hard one.
00:11:02.860 | - So here we are, still in consciousness studies,
00:11:07.460 | pretty stuck, at least in terms of the neuroscience.
00:11:10.620 | And so that's why I started thinking more deeply about that.
00:11:14.380 | That's why a lot of scientists right now
00:11:16.180 | are actually interested in studying consciousness,
00:11:19.300 | where it was very taboo before.
00:11:20.900 | And so we're at this really interesting turning point,
00:11:23.460 | and it's wonderful, but it will require
00:11:26.980 | that we shake up our intuitions a bit
00:11:29.940 | and reframe some things
00:11:31.820 | and look at what the neuroscience is telling us.
00:11:35.540 | And there are a lot of questions.
00:11:37.100 | We have more questions than answers.
00:11:39.940 | But I think it's time.
00:11:41.980 | I think if we're going to make progress
00:11:43.780 | in consciousness studies,
00:11:45.140 | we need to start really looking at the illusions
00:11:50.140 | and false intuitions that are getting in our way.
00:11:52.700 | - Do you think studying the brain
00:11:53.860 | can give us clues about free will,
00:11:55.540 | like some of these questions?
00:11:56.500 | - Yeah, absolutely.
00:11:57.340 | I think it already has.
00:11:58.300 | And I think many facts that have come out of neuroscience
00:12:02.420 | are still barely seeping into the culture.
00:12:05.380 | I mean, I think this is going to be a long process.
00:12:08.620 | So part of my work is really just looking at areas
00:12:11.860 | where we already know some of our intuitions are wrong
00:12:15.780 | and starting to accept them and starting to let them in
00:12:19.580 | and starting to ask questions about,
00:12:20.820 | well, what does this mean then
00:12:22.540 | about the nature of consciousness?
00:12:24.940 | - Let's try to actually get at this question
00:12:27.980 | of free will and conscious will.
00:12:30.140 | I have, my intuitions here are,
00:12:32.940 | I mean, I'm a human being.
00:12:34.220 | It's really, I mean, I approach it from two aspects.
00:12:38.740 | One is a human being, and two, from a robotics perspective.
00:12:42.620 | And I wonder how big the gap between the two is.
00:12:46.260 | And that's a useful, from an engineering perspective,
00:12:50.300 | is another perspective that's useful
00:12:52.940 | and helpful to take on this.
00:12:54.820 | It's like, are we really so different, you and I,
00:12:57.700 | the robot and the human?
00:12:59.820 | You'd like to believe so,
00:13:02.180 | but you don't exactly see where the difference is.
00:13:05.580 | - Research into AI and just the fact
00:13:07.540 | that it's entered our consciousness
00:13:09.780 | at the level of stories and film
00:13:12.060 | and all of these questions that it's raising
00:13:15.220 | is facing us with that.
00:13:19.540 | It's almost like the zombie experiment
00:13:22.380 | is coming to life for us.
00:13:23.860 | You know, we're more and more looking at human-like systems
00:13:28.860 | and wondering, is there an experience in there
00:13:32.860 | and how can we figure that out?
00:13:35.220 | When you were talking about your experience
00:13:37.220 | of looking at robots, it reminds me of how I,
00:13:40.660 | for many years, have been looking at plants.
00:13:42.860 | Because the plant behavior,
00:13:45.100 | and actually, this is the example.
00:13:46.460 | Maybe we'll just try it out.
00:13:47.660 | It may not work.
00:13:48.580 | This is an example I was thinking of recently
00:13:50.620 | because I was reading back on the work of Mark Jaffe,
00:13:54.980 | who did this research with pea tendrils.
00:13:57.820 | I'm sure he did many other plant studies,
00:13:59.820 | but this is the one I was reading about.
00:14:01.660 | And I'm hoping this analogy, I'll just set it up.
00:14:04.100 | I'm hoping that this analogy will be something
00:14:06.420 | that we can keep coming back to as we move forward
00:14:08.380 | because as we shake up our intuitions and get confused
00:14:12.100 | and then we come back to our intuitions
00:14:13.940 | and say, no, that just can't be.
00:14:15.060 | I think this analogy might be helpful.
00:14:17.380 | - What kind of plant was he working on?
00:14:19.260 | - Pea tendrils.
00:14:20.220 | So a pea plant has these tendrils.
00:14:22.420 | You can picture them.
00:14:23.620 | They coil.
00:14:25.500 | So I don't know what year this research was done.
00:14:31.260 | I'm guessing in the '80s, but some--
00:14:34.860 | - But pea tendrils have been around long before that.
00:14:37.500 | - Yes, of course.
00:14:38.340 | (laughs)
00:14:39.180 | And the research may have happened long before the '80s.
00:14:41.500 | - In fact, they might be doing the research on the humans,
00:14:43.100 | but that's another story.
00:14:43.940 | - Yeah, right.
00:14:46.100 | Pea tendrils, as a system, generally,
00:14:50.620 | there are a few more things they can do,
00:14:52.140 | but generally they can behave in two ways.
00:14:55.060 | They can grow in a straight line slowly
00:14:57.300 | or they can grow in this coil form more quickly.
00:15:02.900 | And what happens is when they are growing
00:15:06.740 | in a straight manner and they encounter a branch
00:15:10.740 | or a pole or something else that it can wrap itself around
00:15:14.040 | to gain more stability, when it senses a branch there,
00:15:17.040 | that gives it the cue to start growing at a more rapid pace
00:15:20.300 | and to start coiling instead of growing straight.
00:15:23.400 | So it has these two behaviors.
00:15:24.800 | As a system, it's capable of growing straight
00:15:26.840 | and it's capable of coiling.
00:15:28.380 | One interesting thing, actually, I'll just add this.
00:15:31.640 | It's not totally relevant,
00:15:32.720 | but one interesting thing is Mark Jaffe's work.
00:15:35.920 | So he cut a pea tendril.
00:15:38.240 | He was curious to see if it could do this on its own,
00:15:40.940 | separate from the rest of the plant.
00:15:42.640 | So he cut a pea tendril off the plant.
00:15:45.700 | If you keep it in a moist, warm environment,
00:15:47.960 | it will continue to behave in these ways.
00:15:50.480 | So it will continue to coil.
00:15:52.600 | He noticed that if he touched one end of it,
00:15:55.040 | if he rubbed one side of it, that gave it enough of a cue
00:15:58.200 | that it would start to coil.
00:16:00.000 | And then he noticed that it needed light
00:16:02.840 | to perform this action.
00:16:04.380 | So in the dark, when he rubbed the edge of the tendril,
00:16:08.800 | it did not coil.
00:16:10.920 | In the light, it would, and then he recognized
00:16:14.160 | this further fact, which was that the pea tendril
00:16:17.240 | that he rubbed in the dark, that was still straight,
00:16:20.320 | if he brought it out into the light,
00:16:22.000 | and this could be hours later, it would start to coil.
00:16:25.400 | It has a primitive form of memory
00:16:28.480 | where it has the sensation
00:16:31.880 | and then it holds onto that information.
00:16:34.080 | And as soon as there's light, it acts on that information.
00:16:36.840 | - But also in a kind of distributed intelligence,
00:16:39.720 | because you can separate it from the main part.
00:16:43.600 | Like if you chop off a human arm,
00:16:45.240 | it's not gonna keep growing.
00:16:48.680 | - Even if you keep it in a moist, warm environment,
00:16:50.800 | it's not gonna reach out for the cup of coffee
00:16:52.960 | when you come in with Starbucks.
00:16:54.120 | - Maybe in the correct environment.
00:16:56.540 | Maybe we just haven't found the environment.
00:16:58.440 | But anyway, that's pretty amazing.
00:16:59.960 | - So that's a separate fact.
00:17:01.120 | But anyway, so if you just use the analogy of a pea tendril,
00:17:05.080 | and if you imagine, which is something I like to do a lot,
00:17:07.880 | if you imagine this plant
00:17:10.280 | has some kind of conscious experience,
00:17:12.280 | of course it doesn't have complex thought,
00:17:13.960 | it doesn't have anything like a human experience,
00:17:16.840 | but if it were possible for a plant
00:17:19.480 | to have some felt experience,
00:17:21.960 | you can imagine that when it comes into contact
00:17:26.660 | with a branch and starts to coil,
00:17:29.200 | that that feeling could be one of deciding to do that,
00:17:33.400 | or that it feels good to do that, or kind of wanting.
00:17:36.600 | I mean, that's too complex, that's anthropomorphizing,
00:17:39.720 | but there's a way in which you could imagine
00:17:42.360 | this pea tendril under those circumstances
00:17:44.280 | suddenly wants to start coiling.
00:17:45.960 | - So you're saying you try to meditate
00:17:48.320 | on what it's like to be a pea tendril, a plant.
00:17:52.200 | Like that's what's required here.
00:17:54.680 | So you have to empathize with a plant,
00:17:56.640 | or with another organism that's not human.
00:17:58.760 | - Yeah, and you don't actually need that for this analogy,
00:18:00.880 | the larger analogy that I'm getting at,
00:18:02.760 | but I think that's an interesting piece to keep in mind,
00:18:04.760 | that you could imagine that in nature,
00:18:07.320 | if there's a conscious experience
00:18:09.040 | associated with a pea tendril,
00:18:10.800 | that at that moment, what that feels like is
00:18:13.440 | a want to start moving in a different way.
00:18:17.720 | - So you wanna imagine that without anthropomorphizing,
00:18:21.160 | so without projecting the human experience,
00:18:23.640 | but rather sort of humbling yourself
00:18:25.800 | that we're just another plant with more complexity.
00:18:29.800 | - Yes, in a way. - Like trying to see where--
00:18:31.600 | - Exactly, so that's where I'm going with this.
00:18:34.280 | So, and when you start making that connection,
00:18:37.280 | you can see where there are a few points
00:18:40.480 | at which there's room for an illusion to come in,
00:18:43.400 | for our own feelings of will.
00:18:45.200 | So when we move from a pea tendril
00:18:47.320 | to human decision-making,
00:18:49.200 | obviously, human decision-making,
00:18:52.320 | human brains are many, many, many times more complex
00:18:56.240 | than whatever's going on in a pea tendril.
00:18:57.960 | I mean, it is, the brain is actually
00:18:59.640 | the most complex thing we know of in the universe thus far.
00:19:03.640 | So there is the genes that help develop the brain
00:19:08.440 | into any particular brain into what it is,
00:19:10.200 | there are all the inputs,
00:19:11.280 | there are countless factors that we could never,
00:19:14.160 | I mean, it may as well be an infinite number of factors.
00:19:17.960 | And then in that particular moment,
00:19:19.960 | whatever the inputs are to a brain,
00:19:22.360 | the brain is capable of almost
00:19:25.600 | an infinite number of outputs, right?
00:19:27.360 | So if I walked in here this morning and you said,
00:19:31.320 | "Would you like water or tea?"
00:19:33.360 | And that's a simple decision for me to make.
00:19:36.520 | - I think that's a passive-aggressive way of telling me
00:19:37.960 | I should have offered you some tea.
00:19:40.880 | But yes, go on.
00:19:41.720 | - No, I wanted water.
00:19:42.840 | - Okay, all right.
00:19:44.200 | - I actually asked for water.
00:19:45.640 | - Okay, all right, great.
00:19:47.360 | - And you didn't have any free will anyway,
00:19:48.720 | so it doesn't matter.
00:19:49.560 | I don't hold you responsible for any of it.
00:19:51.520 | - Exactly, I was just running an algorithm,
00:19:54.240 | deterministically.
00:19:55.720 | - You give me this decision, right, to make water or tea.
00:19:59.160 | Go back to the pea tendril for a second.
00:20:00.760 | A pea tendril is capable of growing
00:20:02.560 | in a straight line slowly or in a coil quickly.
00:20:05.500 | My brain is capable of all kinds of responses
00:20:09.560 | to that question, even though you've given me two options.
00:20:13.900 | You could offer me water or tea,
00:20:15.920 | and I could just run out of the room screaming
00:20:18.040 | if I wanted to, right?
00:20:18.880 | - Happens to me all the time, I'm a date.
00:20:20.960 | - Nevermind, I don't wanna do this.
00:20:22.960 | - Yes.
00:20:23.800 | - The fact that the brain is capable,
00:20:27.960 | that there's so many inputs,
00:20:29.640 | and then the brain is capable of so many outputs,
00:20:32.840 | as a system, what it's hard for us to get our minds around
00:20:36.860 | is that it may not be capable of any behavior
00:20:41.760 | in every moment in time.
00:20:44.080 | So as a system, it's capable of doing all kinds of things.
00:20:47.320 | And the point I'm making is that
00:20:51.160 | if we could see all of the factors
00:20:55.440 | leading up to the moment where I chose water
00:20:57.880 | or where I ran screaming from the room,
00:21:00.020 | we could in fact see that there was no other behavior
00:21:06.200 | I was going to or could have exhibited in that moment,
00:21:10.200 | in the same way that when the pea tendril hits the branch,
00:21:12.840 | it starts coiling.
00:21:13.920 | - There's a parallel, which is very interesting in robotics,
00:21:17.760 | with fish and water.
00:21:20.120 | So you could see, they've experienced with dead fish
00:21:23.520 | and they keep swimming.
00:21:25.360 | So the fish is capable of all kinds of complicated movements
00:21:30.200 | as a system.
00:21:31.780 | But in any one moment, the river, the full complexity
00:21:36.440 | of the river defines the actual movement of the fish.
00:21:40.120 | - Right. - And that's sufficient.
00:21:41.200 | - Well, and I should also, I mean,
00:21:42.840 | this brings up another point, which is that
00:21:44.600 | there is a difference between voluntary
00:21:47.040 | and involuntary behavior.
00:21:48.440 | So of course, we have reflexes.
00:21:50.640 | And it is a different, there's different brain processing
00:21:55.320 | in action when I make a decision about water or tea,
00:22:00.080 | then there is, if my behavior is forced from the outside,
00:22:05.080 | or if I have a brain tumor that's causing me
00:22:08.720 | to make certain decisions or feel certain feelings.
00:22:11.620 | And so the point is at bottom,
00:22:14.760 | it's all brain processing and behavior.
00:22:17.520 | But the reason why certain actions feel will,
00:22:22.440 | there's a good reason why it feels that way.
00:22:25.280 | And it's to distinguish our own self-generated behavior
00:22:29.880 | based on thinking and possibly weighing
00:22:33.760 | the different results of different things.
00:22:35.260 | I already had caffeine today, I don't want more.
00:22:37.320 | You know, there are all these processes,
00:22:40.400 | things that we can point to and things that we can't,
00:22:43.280 | things I'm affected by at a subconscious level.
00:22:46.180 | And that is very different from an unwilled action
00:22:52.720 | or reflex or something like that.
00:22:54.360 | And so some people I can imagine,
00:22:56.120 | I haven't used the pee tendril example,
00:22:57.520 | but I can imagine they wouldn't like that
00:22:58.920 | because the pee tendril sounds more to them like a reflex.
00:23:03.400 | And that doesn't address the question
00:23:06.520 | of a much more complex decision-making process.
00:23:10.520 | But I think at bottom, that is what it is.
00:23:14.760 | And that's really where the illusion of free will
00:23:17.080 | and the illusion of self, which I think is,
00:23:18.940 | they're kind of two sides of the same coin, come from.
00:23:21.840 | So even when we intellectually understand
00:23:25.000 | that everything we're feeling, everything we're doing
00:23:27.260 | is based on our brain processing and brain behavior,
00:23:31.360 | if you're a physicalist, you've bought into that.
00:23:33.800 | Even when you intellectually understand that,
00:23:37.680 | we, and I include myself in this,
00:23:39.960 | we still have this feeling that there's something
00:23:43.160 | that stands outside of the brain processing
00:23:45.920 | that can intervene.
00:23:47.120 | And that's the illusion.
00:23:49.360 | I was tweeting with someone recently,
00:23:51.240 | which I almost never do, but we're working
00:23:53.280 | in the TED documentary that I'm making right now,
00:23:55.840 | we're working on the episode on free will.
00:23:58.440 | So I was allowing myself to go back and forth
00:24:01.360 | in a way that I don't usually on Twitter.
00:24:03.600 | - Like arguing? - About free will.
00:24:05.120 | It was a friendly debate.
00:24:06.960 | Gonna go into the reasons why I'm not crazy about Twitter,
00:24:09.320 | but let's leave that for another time.
00:24:11.280 | I mean, talk about how hard it is to have this conversation
00:24:15.420 | when we have as many hours as we like,
00:24:18.380 | trying to do it in sound bites over Twitter.
00:24:20.400 | - See, I like how you made the decision now
00:24:22.440 | not to talk about Twitter.
00:24:23.860 | (Anika laughing)
00:24:24.700 | It's a road less traveled. - Well, my brain,
00:24:27.040 | that was one of the things I said to this person was,
00:24:30.440 | 'cause someone chimed in and said,
00:24:33.120 | "You said I, what do you mean by I?"
00:24:35.080 | And so, actually, that's another point I could make,
00:24:37.360 | which is, first, my response to that was,
00:24:41.960 | well, people tend to get creeped out when I say
00:24:44.400 | the system that is my brain and body
00:24:47.020 | that we call Anika recommends.
00:24:50.360 | - Why do they get freaked out?
00:24:51.920 | (Anika laughing)
00:24:52.760 | Oh, you mean like in your personal life?
00:24:54.520 | - Instead of I. - Oh, instead of I.
00:24:55.360 | - Instead of like never saying I, yeah.
00:24:56.880 | Always, you know, but I always refer to you
00:24:59.440 | as the brain and body we call Lex.
00:25:01.920 | - Yes.
00:25:02.760 | (laughing)
00:25:03.600 | Well, I don't know.
00:25:04.840 | That's kind of charming in a way.
00:25:07.000 | Alleged brain.
00:25:07.880 | - So I and you are very useful shorthand,
00:25:10.960 | even though at some level they're illusions.
00:25:13.260 | They're very useful shorthand for the system of my brain,
00:25:18.440 | really, and my body, the whole system.
00:25:21.720 | That I is useful for that, but the illusion
00:25:24.120 | is when we feel like there's something outside
00:25:26.640 | of that system that can intervene, that is free,
00:25:29.400 | that's somehow free from the physical world.
00:25:32.420 | I can have the thought, yeah, I'm really not crazy
00:25:37.200 | about having intellectual back and forth on Twitter,
00:25:40.600 | and then feel like I decide to not follow that thought,
00:25:44.840 | right, and the feeling, that's the feeling
00:25:47.880 | where the illusion comes in, because it really feels
00:25:51.240 | as if, sure, my brain had that original thought,
00:25:55.120 | and then I came in and made a different decision.
00:25:58.840 | But of course, the truth is, it was just further
00:26:01.280 | brain processing that got me to decide
00:26:03.880 | not to go down that path.
00:26:05.600 | - How much is that feeling of conscious will
00:26:08.160 | is culturally constructed shorthand?
00:26:11.760 | So like, I and you is, you could say,
00:26:16.360 | a culturally constructed shorthand.
00:26:19.300 | How much of that affects how we think?
00:26:22.920 | So our parents say I and you, I and you,
00:26:26.520 | and then we start to believe in I and you,
00:26:29.180 | and is that, or is that fundamental
00:26:32.320 | to the human brain machine that we--
00:26:37.120 | - I think it goes very deep.
00:26:38.520 | I think it's fundamental, and I think it probably,
00:26:40.880 | some form of feeling like a self goes as deep
00:26:45.240 | as cats and dogs, and it's possible,
00:26:48.600 | I mean, if consciousness does go down to the level of cells,
00:26:52.720 | or however far down you wanna take it, worms,
00:26:54.940 | or I think any system that's navigating itself,
00:26:58.680 | that kind of has boundaries and is navigating
00:27:01.040 | itself in the world, my guess is that
00:27:06.040 | it's an intrinsic part of, that's why I imagined
00:27:09.800 | that the pea tendril would have this feeling.
00:27:12.440 | And so, we use the word I,
00:27:18.440 | I think you're right, first of all,
00:27:19.720 | that the way we talk about things affects
00:27:22.520 | our intuitions about them and how we feel about them,
00:27:24.860 | and so there are other cultures who are more open
00:27:27.600 | to breaking through these illusions than others, for sure,
00:27:32.360 | just because of their belief sets, the way they talk.
00:27:36.560 | I mean, I'm sure, I'm not a linguist,
00:27:38.800 | and I don't even speak a second language,
00:27:40.920 | so I can't speak to it, but if there were a language
00:27:45.160 | that framed who we are differently in everyday language,
00:27:50.160 | I mean, in our everyday communication,
00:27:55.720 | I would think that would have an effect.
00:27:57.560 | - Yeah, language does affect things,
00:27:58.960 | and just knowing Russian and the history
00:28:02.720 | of the Soviet Union in the 20th century,
00:28:05.820 | obviously it lived under communism for a long time,
00:28:08.800 | so your conception of individualism is different,
00:28:12.300 | and that reflects itself in the language.
00:28:14.940 | You could probably have a similar kind of thing
00:28:18.140 | within the language in terms of how we talk about I
00:28:21.940 | and we and so on, and I'm sure there's certain countries,
00:28:26.940 | or maybe even villages with certain dialects
00:28:29.380 | that let go of the individualism that's inherent in I.
00:28:34.420 | - Yeah, I mean, there must be a range,
00:28:36.260 | but I do think that it's pretty deep,
00:28:38.740 | and I think there's also a difference
00:28:43.740 | between the autobiographical me
00:28:46.980 | and then this more fundamental me that we're talking about,
00:28:50.420 | or that I'm pointing to as the illusion.
00:28:52.420 | So in my book, I talk about if someone wakes up with amnesia,
00:28:57.420 | if they have brain injury and suddenly have amnesia
00:29:01.220 | and can't remember anything about their lives,
00:29:04.060 | can't remember their name,
00:29:05.100 | don't recognize people they're related to,
00:29:07.780 | they would have lost their autobiographical self,
00:29:14.280 | but they would still feel like an I.
00:29:16.380 | They would still have that basic sense of I'm a person.
00:29:20.500 | I mean, they'd be speaking that way.
00:29:22.140 | I don't remember my name.
00:29:23.460 | I don't know where I live.
00:29:25.040 | It goes very deep, this feeling that I am a single entity
00:29:33.860 | that is somehow not completely reliant
00:29:38.740 | upon the cause and effect of the physical world.
00:29:41.260 | - Can I ask you a pothead question?
00:29:42.780 | - Yeah.
00:29:43.620 | - Would you rather lose all your memories
00:29:51.260 | or not be able to make new ones?
00:29:53.100 | Now I'm asking you as a human
00:29:57.540 | in terms of happiness and preference.
00:30:00.220 | - I can't answer that.
00:30:02.220 | - You like both?
00:30:03.060 | You like both features of the organism that you embody?
00:30:06.540 | - Well, one is intellectual and one is psychological,
00:30:09.900 | really.
00:30:10.740 | I mean, I would have to choose the memories
00:30:12.700 | only because, I mean, memories of the past,
00:30:15.500 | only because I have children and a family
00:30:18.180 | and it would just be, it wouldn't just be affecting me,
00:30:21.340 | it would be affecting them.
00:30:22.180 | It would just be too horrible.
00:30:24.060 | - No, but you would,
00:30:24.900 | there's a lot of people who are like,
00:30:26.140 | "Oh, I'm gonna be a psychologist."
00:30:27.500 | - It's horrible.
00:30:28.620 | - No, but you would make new ones, right?
00:30:31.420 | - If I lost my memory of the 13 years of my children's life?
00:30:36.420 | - You think you would lose, this is a dark question.
00:30:39.660 | - Oh, wait, wasn't that the question?
00:30:40.780 | Maybe I misunderstood.
00:30:41.620 | - No, no, no, no, you understood it perfectly, but.
00:30:43.180 | - Yeah.
00:30:44.020 | - Sorry for the dark question,
00:30:47.020 | but the people you love in your life,
00:30:49.540 | if you lost all your memory of everything,
00:30:52.780 | do you think you would still love them?
00:30:55.020 | Like you show up, you don't know.
00:30:56.900 | - I don't know.
00:30:57.740 | - It's a roll of the dice.
00:30:58.580 | - I mean, not in the way that I do.
00:30:59.940 | - Right.
00:31:00.780 | So some deep aspect of love is the history you have together.
00:31:04.700 | - Oh, absolutely.
00:31:05.540 | Well, and this gets to an interesting point, actually,
00:31:07.580 | which I think a lot about, which is memory.
00:31:10.040 | And we won't go into this yet,
00:31:12.620 | but I'll just plant a flag here that--
00:31:14.180 | - I love the song.
00:31:15.020 | - Memory is, yeah, memory is obviously related to time
00:31:19.260 | and time is something that I'm fascinated with
00:31:21.180 | and for this project I'm working on now,
00:31:23.780 | I've mostly been speaking to physicists
00:31:25.700 | who are interested in consciousness.
00:31:28.300 | And it's partly because of this link
00:31:30.940 | between memory and time and all of these
00:31:35.940 | new fascinating theories and thoughts
00:31:40.740 | around the different interpretations of quantum mechanics
00:31:44.060 | and looking at the thing that I've always been looking for
00:31:47.980 | is really the fundamental nature of reality
00:31:50.460 | and why my questions about consciousness
00:31:55.540 | lead me to wonder if consciousness
00:31:57.860 | is a more fundamental aspect of the universe
00:32:01.140 | than we previously thought
00:32:02.740 | and certainly I previously thought.
00:32:05.180 | And so memory, but memory is tied to so many things.
00:32:11.180 | I mean, even basic functions in nature.
00:32:14.300 | Actually, so the P-tendril, as I mentioned,
00:32:17.460 | memory comes into play there and that's so fascinating.
00:32:21.740 | And there is no sense of self without memory,
00:32:25.700 | even if you're starting from scratch,
00:32:27.940 | as you said with amnesia.
00:32:29.620 | If you truly couldn't lay down any new memories,
00:32:35.300 | I think you would, then that sense of self
00:32:37.380 | would begin to disintegrate
00:32:38.740 | because the sense of self is one of a concrete entity
00:32:43.740 | through time.
00:32:45.580 | And if each moment, if you really were stuck
00:32:49.580 | in the present moment eternally,
00:32:51.620 | you'd basically be meditating.
00:32:53.660 | (laughs)
00:32:54.820 | And in meditation, this is a very common experience
00:32:57.780 | is losing that sense of self, that sense of free will,
00:33:01.260 | that those illusions more easily drop away in meditation.
00:33:06.260 | And I would say for most people who meditate long enough,
00:33:10.180 | they do drop away.
00:33:11.060 | And there's actually an explanation
00:33:12.300 | at the level of the brain as well.
00:33:14.300 | The default mode network is circuitry in the brain
00:33:17.980 | that neuroscientists don't completely understand,
00:33:20.500 | but know is largely responsible
00:33:23.340 | for this feeling of being a self.
00:33:25.580 | And when that circuit gets quieted down,
00:33:29.660 | which it does in meditation
00:33:31.020 | and also does with the use of psychedelic drugs,
00:33:34.860 | and there are other ways to quiet down
00:33:37.700 | the default mode network,
00:33:39.020 | people have this experience of losing
00:33:44.140 | this illusion of being a self.
00:33:45.980 | They no longer feel that they're a self
00:33:47.580 | in the way that they usually do.
00:33:49.540 | - So there's the autobiographical self
00:33:52.460 | is connected to the sense of self.
00:33:56.700 | - Oh, absolutely, yeah. - Through the memory.
00:33:58.540 | And then you're thinking that the solution to that
00:34:01.660 | lies in physics, not just neuroscience.
00:34:03.940 | Like ultimately consciousness and the experience,
00:34:08.620 | the conscious will is a question of physics.
00:34:13.620 | - I may have said something misleading
00:34:16.260 | because I was connecting too many dots.
00:34:18.220 | - Half the things I say are misleading.
00:34:22.300 | Let us mislead each other.
00:34:24.020 | - I just got, I got excited when memory came up
00:34:26.300 | because I love talking about time.
00:34:29.020 | - So you mentioned a project
00:34:30.020 | you're working on a couple of times.
00:34:31.220 | What's that about?
00:34:32.900 | I think you said Ted is involved.
00:34:34.580 | You're interviewing a bunch of people.
00:34:35.900 | What's going on?
00:34:36.740 | What's the topic?
00:34:37.580 | - So I'm working on an audio documentary
00:34:41.460 | about consciousness,
00:34:43.100 | and it picks up where my book left off.
00:34:46.380 | So all of the questions that were still lingering for me
00:34:49.500 | and research that I still wanted to do,
00:34:52.460 | I just started conducting.
00:34:53.620 | So I've done about 30 interviews so far.
00:34:56.500 | And it's not totally clear what the end result will be.
00:35:00.100 | I'm currently collaborating with Ted
00:35:02.700 | and I'm having a lot of fun creating a pilot with them.
00:35:05.980 | And so we'll see where it goes.
00:35:08.580 | But the idea is that it's a narrated documentary.
00:35:11.780 | - And it's like a series.
00:35:12.820 | - A series, it'll be a 10 part series.
00:35:14.620 | - It's an unclear,
00:35:15.580 | oh, you already know the number of parts.
00:35:17.580 | - Sorry, in my mind, it's a 10 part series.
00:35:19.420 | It may end up being eight or 11 or 12.
00:35:21.420 | I don't know why.
00:35:22.260 | - Listen, I am very comforted by the number zero
00:35:24.540 | and one as well.
00:35:25.780 | - About 10.
00:35:26.620 | - I like the confidence of 10.
00:35:28.060 | So, and you're not sure what the title,
00:35:32.020 | like not the title, but the topic,
00:35:33.940 | will there be consciousness or something bigger
00:35:36.020 | or something smaller?
00:35:37.020 | - Yeah, I mean, it's my,
00:35:37.980 | so at the end of my book,
00:35:39.100 | I kind of get to the place where I've convinced myself,
00:35:43.180 | at least, that this question
00:35:44.900 | about whether consciousness is fundamental
00:35:46.900 | is a legitimate one.
00:35:48.300 | And then I just start spending a lot of time
00:35:51.780 | thinking about what that would mean,
00:35:53.700 | if it's even possible to study scientifically.
00:35:57.180 | So I mostly talk to physicists, actually,
00:35:59.980 | because I really think, ultimately,
00:36:02.180 | this is a question for physics,
00:36:04.900 | if consciousness is fundamental.
00:36:06.220 | I think it needs to be strongly informed by neuroscience,
00:36:09.020 | but it's, yeah, if it's part of the fabric of reality,
00:36:13.900 | it is a question for a physicist.
00:36:15.420 | So I speak to different physicists
00:36:17.660 | about different interpretations of quantum mechanics,
00:36:20.220 | so getting at the fundamentals.
00:36:21.940 | So string theory in many worlds.
00:36:24.100 | I spoke to Sean Carroll,
00:36:25.340 | had a great conversation with Sean Carroll.
00:36:26.980 | He's so generous,
00:36:28.020 | because he clearly doesn't agree with me about many things.
00:36:32.000 | But he has a curious mind,
00:36:34.220 | and he's willing to have these conversations.
00:36:35.940 | And I was really interested
00:36:37.740 | in understanding many worlds better,
00:36:39.540 | and if consciousness is fundamental,
00:36:42.020 | what the implications are.
00:36:43.600 | So that was where I started, actually,
00:36:46.340 | was with many worlds.
00:36:47.380 | And then we had conversations about string theory
00:36:51.100 | and the holographic principle.
00:36:54.080 | I spoke to Lee Smolin and Brian Green and Jan Eleven
00:36:59.080 | and Carlo Rovelli, actually.
00:37:00.860 | Have you had Carlo on?
00:37:01.820 | - No, no.
00:37:03.220 | - He's great also and fun to talk to,
00:37:05.100 | because he's just endlessly curious.
00:37:07.340 | - Yeah.
00:37:08.180 | - And you're doing audio.
00:37:09.140 | - It's all audio, yeah.
00:37:10.220 | But it's in the format of a documentary,
00:37:11.700 | so I'm narrating it.
00:37:13.300 | I'm kind of telling the story
00:37:14.480 | of what questions came up for me,
00:37:16.420 | what I was interested in exploring,
00:37:17.940 | and then why I talk to each person I talk to.
00:37:21.460 | - By the way, I highly recommend Sean Carroll's
00:37:23.940 | Mindscape podcast, I think it's called.
00:37:26.100 | - Yeah.
00:37:27.420 | - It's amazing.
00:37:28.260 | One of my favorite things,
00:37:29.100 | well, when he interviews physicists, it's great,
00:37:30.840 | but any topic, his aim is.
00:37:32.500 | But one of my favorite things
00:37:34.900 | is how frustrated he gets with panpsychism.
00:37:37.220 | (laughing)
00:37:38.540 | But he's still like, it's like a fly towards the light.
00:37:41.220 | - Yeah.
00:37:42.060 | - For some reason, he can't make sense of it,
00:37:44.460 | but he still struggles with it,
00:37:46.220 | and I think that's the sign of a good scientist,
00:37:50.700 | really struggling with these ideas.
00:37:53.180 | - I totally agree, and yes,
00:37:55.140 | that's what I appreciate in him
00:37:56.820 | and many scientists like that.
00:37:59.380 | - Who has the craziest, most radical ideas
00:38:02.060 | that you talk with currently?
00:38:03.940 | - So you can go either direction.
00:38:06.060 | You can go like panpsychism,
00:38:07.780 | consciousness permeates everything.
00:38:10.260 | - Yeah.
00:38:11.620 | - I don't know how far you can go down that direction.
00:38:14.540 | Or you could say that, you know,
00:38:17.500 | what would be the other direction?
00:38:19.780 | That there's a--
00:38:20.620 | - Well, there isn't really,
00:38:21.440 | the problem is they're all crazy.
00:38:22.700 | - They're all crazy.
00:38:23.540 | - And each one is crazier than the next.
00:38:24.380 | - All of us are crazy.
00:38:25.500 | - And my own, I mean, my own thoughts now,
00:38:27.780 | I have to be very careful about the words I choose
00:38:33.700 | because, I mean, it's just like talking about
00:38:37.820 | the different interpretations of quantum mechanics.
00:38:41.060 | It's once you get deep enough,
00:38:44.840 | it's so counterintuitive
00:38:46.340 | and it's so beyond anything we understand
00:38:49.020 | that they all sound crazy.
00:38:51.080 | Many worlds sounds crazy.
00:38:52.340 | String theory, I mean,
00:38:53.300 | these are things we just cannot get our minds around, really.
00:38:57.060 | And so that's kind of,
00:38:58.660 | that's the realm I love to live in and love to explore in.
00:39:03.380 | And the realm that, to my surprise,
00:39:06.020 | my interest in consciousness has taken me back to.
00:39:08.700 | - Can I ask you a question on that?
00:39:09.980 | - Yeah.
00:39:10.820 | - Just a side tangent.
00:39:12.580 | How do you prevent,
00:39:13.740 | when you're imagining yourself to be a petandroid,
00:39:16.820 | how do you prevent from going crazy?
00:39:18.660 | I mean, this is kind of the Nietzsche question of like,
00:39:21.540 | you have to be very careful
00:39:23.460 | thinking outside the norms of society
00:39:27.580 | because you might fall off.
00:39:28.900 | Like mentally, you're so connected as a human
00:39:31.780 | to the collective intelligence
00:39:34.100 | that in order to question intuitions,
00:39:36.060 | you have to step outside of it for a brief moment.
00:39:40.060 | How do you prevent yourself from going crazy?
00:39:42.860 | - I think I used to think that was a concern.
00:39:47.860 | - And then you became crazy.
00:39:48.900 | - I've learned so much about the brain.
00:39:50.460 | No, and I've had experiences of deep depression
00:39:54.060 | and I struggled with anxiety my whole life.
00:39:57.340 | I think in order to be a good scientist
00:40:01.900 | and in order to be a truthfully,
00:40:06.660 | let's say, to allow yourself to be curious
00:40:11.380 | and honest in your curiosity,
00:40:13.820 | I think it's inevitable that lots of ideas
00:40:17.660 | and theories and hypotheses will just sound crazy.
00:40:22.660 | And that is always how we've advanced science.
00:40:26.700 | And maybe nine out of 10 ideas are crazy
00:40:29.660 | and crazy meaning they're actually not correct.
00:40:33.700 | But all of, I mean, as I said,
00:40:37.500 | all of the big scientific breakthroughs,
00:40:39.740 | all of the truths we've uncovered
00:40:41.860 | that are the earth-shattering truths that we uncover,
00:40:46.860 | they really do sound crazy at first.
00:40:49.860 | So I don't think one necessarily leads
00:40:52.340 | to a type of mental illness.
00:40:54.460 | I see mental illness in a very different category.
00:40:57.580 | And I think some people are more susceptible
00:41:02.380 | to being destabilized by this type of thinking.
00:41:06.060 | And that might be a legitimate concern for some people,
00:41:09.140 | that kind of being grounded in everyday life
00:41:12.060 | is important for my psychological health.
00:41:15.300 | The more time I spend thinking about the bigger picture
00:41:19.180 | and outside of everyday life,
00:41:22.820 | the more happy I am, the more expansive I feel.
00:41:27.820 | I mean, it feels nourishing to me.
00:41:31.660 | It feels like it makes me more sane, not less.
00:41:34.340 | - Well, that's a happiness,
00:41:35.900 | but in terms of your ability to see the truth,
00:41:38.560 | you can be happy and completely--
00:41:42.260 | - I guess I don't see mental illness
00:41:43.740 | necessarily being linked to truth or not truth.
00:41:47.580 | - So we were talking about minimizing mental illness,
00:41:51.060 | but also truth is a different dimension.
00:41:54.620 | So you can go crazy in both directions.
00:41:57.380 | You could be extremely happy, and they are, flat earthers.
00:42:05.060 | You can believe the earth is flat.
00:42:07.100 | Because, I'm sure there's good books on this,
00:42:10.540 | but it's somehow really comforting.
00:42:12.780 | It's fun and comforting to believe you've figured out
00:42:16.780 | the thing that everybody else hasn't figured out.
00:42:19.460 | - I think that's what conspiracy theories
00:42:21.820 | always provide people.
00:42:23.180 | - Why is it so fun?
00:42:24.580 | It's so fun.
00:42:25.540 | It's, except when it's dangerous.
00:42:28.140 | But even then, it's probably fun.
00:42:31.220 | But then you shouldn't do it because it's unethical.
00:42:33.780 | Anyway, so--
00:42:35.620 | - It's not true, I'm not a fan of following.
00:42:37.660 | - Well, that makes one of us.
00:42:40.340 | I don't know.
00:42:42.220 | There is probably a fascinating story
00:42:44.040 | to why conspiracy theories are so compelling
00:42:48.540 | to us human beings as deeper than just fun internet stuff.
00:42:53.340 | - Yeah, I'm very interested in why they're so compelling
00:42:56.020 | to some people and not others.
00:42:57.540 | I feel like there must be some difference
00:42:59.380 | that at some point we'll be able to discover.
00:43:03.300 | - Yeah, yeah.
00:43:05.820 | - Because some people are just not susceptible to them,
00:43:08.180 | and some people are really drawn to them.
00:43:11.740 | - 'Cause I feel like the kind of thinking that allows
00:43:15.820 | for you to be open to conspiracy theories
00:43:18.700 | is also the kind of thinking that leads
00:43:20.820 | to brilliant breakthroughs in science.
00:43:23.420 | Sort of willingness to go to crazy land.
00:43:27.780 | Something that seems like crazy land.
00:43:28.620 | - That's interesting, I see it the opposite way.
00:43:30.780 | - Really?
00:43:31.620 | - Yeah.
00:43:32.440 | - So you don't see the connection
00:43:35.820 | between thinking the Earth is flat
00:43:38.580 | and coming up with special relativity?
00:43:41.060 | - Thinking the Earth is flat is following your intuitions
00:43:44.420 | and not being open to counterintuitive ideas.
00:43:49.420 | It's a very closed way of viewing things.
00:43:53.060 | Saying it's actually, it's not the way you feel.
00:43:54.860 | There's information that tells us
00:43:58.020 | there's something else going on.
00:43:59.340 | And that type of person will say,
00:44:01.660 | no, it's the way it feels to me.
00:44:04.660 | - No, no, no, but wait a minute.
00:44:06.080 | There's a mainstream narrative of science
00:44:09.740 | that says the Earth is round.
00:44:11.740 | - Right.
00:44:12.580 | - And I think a flat Earth,
00:44:16.500 | see I admire the very first step of a flat Earth.
00:44:19.220 | (Bridget laughs)
00:44:20.140 | I don't admire the full journey.
00:44:22.300 | But the first step is--
00:44:23.740 | - I think if you're open to evidence,
00:44:25.380 | then the evidence clearly takes you in one direction.
00:44:28.340 | - Right, but you have to ask the question.
00:44:30.580 | You have to ask, to me,
00:44:33.260 | this is like first principles thinking.
00:44:35.180 | - Yeah.
00:44:36.020 | - The Earth looks flat, so I'm gonna look around here.
00:44:40.220 | And I, like, how crazy is it
00:44:45.220 | that the Earth is round and there's a thing called gravity
00:44:49.020 | that operates between objects
00:44:51.380 | that's related to the mass of the object?
00:44:54.740 | - Right.
00:44:55.580 | - That's crazy.
00:44:56.420 | - Yes, the truth is often crazier
00:44:58.060 | than what the situation feels to be.
00:45:00.420 | - A good step is to question what everyone is saying
00:45:02.940 | and then you learn a little bit.
00:45:03.780 | - I know what you mean, to be skeptical about the,
00:45:06.380 | it's the authority factor.
00:45:07.500 | - Yeah, but I think that, and the authority
00:45:10.060 | in not in some kind of weird current
00:45:13.820 | where everyone questions institutions,
00:45:15.380 | but more like the authority of the senior scientists,
00:45:18.700 | the junior scientists coming up, wait a minute,
00:45:20.860 | why have we been doing things this way?
00:45:23.060 | And that first step, I feel like that rebelliousness
00:45:27.740 | or that open-mindedness or maybe like resistance to,
00:45:32.260 | maybe curiosity that is not a fact,
00:45:37.260 | not affected by whatever the mainstream science says
00:45:40.660 | of today.
00:45:41.500 | - Cass, I feel like mainstream science
00:45:44.500 | has never been mainstream and it's always a struggle
00:45:47.100 | for science to become mainstream.
00:45:48.900 | It's part of the reason why I started doing the work
00:45:50.700 | I did actually, helping scientists make their work
00:45:53.420 | more accessible is that it's usually not.
00:45:57.540 | - Yeah.
00:45:58.380 | - It's usually not.
00:45:59.220 | - Here's advice for scientists, be more interesting
00:46:02.060 | and much more important, be less arrogant.
00:46:05.300 | - So arrogance, there's very little money in science
00:46:09.860 | and so everyone is fighting for that money
00:46:11.540 | and they become more and more arrogant and siloed.
00:46:13.500 | I don't know why.
00:46:14.540 | - I will say that the scientists I know
00:46:17.380 | and some of them are very well-known,
00:46:19.100 | very famous scientists are the least arrogant people
00:46:21.580 | I've ever met.
00:46:22.820 | That scientists in general, their personalities
00:46:27.260 | are more open, more humble, more likely to say
00:46:33.020 | they just don't know because I've been involved a lot
00:46:36.220 | in the science writing and how the media portrays.
00:46:40.820 | So one of scientists, the scientific community's
00:46:45.180 | greatest frustration is how their work gets presented
00:46:49.300 | in the media and a lot of the time that is,
00:46:52.980 | I would say that's the main frustration
00:46:54.940 | is there's some new breakthrough, there's something
00:46:57.700 | and the scientists will be saying, we're not sure,
00:47:01.100 | it's gonna take five years and no one likes to write
00:47:04.220 | a story about something that may or may not be true,
00:47:06.220 | they think it's true, they're gonna take five years
00:47:08.300 | testing it and so the headline will be,
00:47:11.780 | neuroscientists discovered, they want this sensational
00:47:15.660 | and so I think the public often gets the false impression
00:47:20.660 | that the scientists are arrogant and I really don't find
00:47:25.940 | that to be the case and I've worked with all kinds
00:47:29.300 | of people, artists and my life path has taken a strange--
00:47:34.300 | - You've met some incredible people,
00:47:38.140 | you work with some incredible people.
00:47:40.400 | So let's, the crazy topic of free will, I mean,
00:47:44.500 | I just, we have to link on this 'cause I can't.
00:47:47.240 | So the plant, all right, can you try to steel man the case
00:47:52.800 | that there's something really special about humans,
00:47:56.480 | that there's a fundamental difference between us
00:47:59.520 | and the petandral?
00:48:01.680 | You know, humans are clearly very special
00:48:04.920 | in the evolution of organisms on Earth.
00:48:08.880 | Could that have been the magic leap?
00:48:11.920 | Could consciousness been like the invention
00:48:15.520 | of the eukaryotic cell or something like this?
00:48:18.920 | - Well then, I mean, so I have to get clear
00:48:20.600 | on what you're asking.
00:48:21.440 | So are you coming from a place of wondering
00:48:26.440 | if we are the only conscious mammals?
00:48:29.240 | - Yes.
00:48:30.640 | - Do you really think that's a question?
00:48:33.560 | - Can you make a case for it?
00:48:34.880 | (laughing)
00:48:35.800 | - Do you really think that's a question?
00:48:38.240 | Take one step back.
00:48:39.160 | We look out at the universe.
00:48:40.720 | At this point in our scientific understanding,
00:48:46.440 | we know that essentially we're all made
00:48:48.740 | of the same ingredients, right?
00:48:50.140 | There are atoms in the universe doing their thing,
00:48:54.200 | they find themselves in different configurations
00:48:57.360 | based on the laws of physics.
00:48:58.820 | And then the question is, if we look out
00:49:03.120 | at all of the configurations of atoms in the universe
00:49:05.680 | and ask which of these entail conscious experiences,
00:49:10.240 | which of these have a felt experience
00:49:12.240 | of being the matter they are?
00:49:15.080 | And there are really only two, broadly speaking,
00:49:18.760 | there are really only two assumptions to make here.
00:49:22.960 | And the first one is the one that science has taken
00:49:27.160 | and that I have for most of my career as well
00:49:29.680 | and that in many ways makes the most sense,
00:49:32.040 | which is electrons aren't conscious,
00:49:35.080 | tables aren't, there's no felt experience there,
00:49:37.240 | but at a certain point in complex processing,
00:49:39.920 | that processing entails an experience
00:49:43.240 | of being that processing.
00:49:44.640 | Now that's just a fascinating fact all on its own
00:49:47.400 | and I love to spend time thinking about that.
00:49:50.280 | So the question is, does consciousness arise at some point?
00:49:53.520 | Are some of these collections of atoms conscious
00:49:57.040 | or are all of them?
00:49:59.480 | Because we know the answer isn't none.
00:50:01.280 | I know that I'm at least having a conscious experience,
00:50:04.320 | I know that conscious experiences exist in the universe.
00:50:07.380 | And so the answer isn't none,
00:50:11.040 | so the answer has to be all or some.
00:50:14.640 | And this is a starting assumption
00:50:17.960 | that you're really kind of forced to make
00:50:20.600 | and that it's all or some.
00:50:23.160 | - All or some or one.
00:50:24.200 | - I would say one is some also.
00:50:26.160 | We either need an explanation
00:50:27.480 | for why there's non-conscious matter in the universe
00:50:30.100 | and then something happens for consciousness
00:50:32.280 | to come into being,
00:50:33.720 | or it's part of the fundamental nature of reality.
00:50:37.400 | - It's also if consciousness is a fundamental property
00:50:42.240 | of reality, it could also choose to not reveal itself
00:50:45.720 | until a certain complexity of organism.
00:50:48.020 | - I'm not sure what that means.
00:50:51.320 | - I'm not sure what that means either.
00:50:52.880 | Like the flame of consciousness does not start burning
00:50:57.400 | until a certain complexity of organism is able to reveal it.
00:51:02.400 | - So I don't think we can look at consciousness that way.
00:51:04.240 | I don't think, I mean, many people like to try
00:51:07.640 | to make that argument that it's a spectrum.
00:51:09.400 | Why do we have to say all or nothing maybe?
00:51:12.260 | And I agree that I actually think it is a spectrum,
00:51:15.940 | but it's a spectrum of content,
00:51:17.980 | not of consciousness itself.
00:51:19.260 | So if a worm has some level of conscious experience,
00:51:23.940 | it is extremely minimal,
00:51:25.540 | something we could never imagine
00:51:27.420 | being having the complex experience you and I have.
00:51:31.300 | Maybe some felt sensation of pressure or heat
00:51:35.180 | or something super basic, right?
00:51:37.020 | So there's this range,
00:51:38.540 | or even if you just think of an infant,
00:51:40.140 | like the first, the moment an infant becomes conscious,
00:51:42.880 | what that, there's a very, very minimal experience
00:51:45.760 | of inputs of sound and light and whatever it is.
00:51:48.900 | And so there's a spectrum of content.
00:51:51.500 | There's a spectrum of how much a system
00:51:55.640 | is consciously experiencing,
00:51:57.700 | but there's a moment at which you get on the spectrum.
00:52:01.700 | And that's, and I truly believe that,
00:52:03.880 | that piece of it is binary.
00:52:05.340 | So if there's no conscious experience,
00:52:08.880 | there is no consciousness.
00:52:10.180 | You can't say consciousness is there,
00:52:11.420 | it just hasn't lit its flame yet.
00:52:13.020 | If consciousness is there,
00:52:14.040 | there's an experience there by definition.
00:52:16.020 | It has to arise at some point,
00:52:17.700 | or it has to always be there.
00:52:18.940 | - Is it possible to make the case
00:52:21.660 | that that arising happens first,
00:52:26.660 | for the first time ever with Homo sapiens?
00:52:30.500 | - I think that is extremely unlikely.
00:52:33.220 | What I think is more possible
00:52:37.060 | based on what we understand about the brain
00:52:39.760 | is that it arises in brains or nervous systems.
00:52:44.760 | And so then we're talking about flies and bees
00:52:52.260 | and all kinds of things that kind of fall out of our,
00:52:55.440 | our intuitions for whether they could be conscious or not.
00:52:59.100 | But I think, especially once you talk about
00:53:04.500 | more complex brains with many, many more neurons,
00:53:07.760 | when you're talking about cats and dogs and dolphins,
00:53:10.360 | it's very hard to see how there would be a difference
00:53:16.640 | between humans and other mammals in terms of consciousness.
00:53:21.200 | - Was there a difference in terms of intelligence
00:53:24.120 | between humans and other mammals?
00:53:26.160 | - Sure.
00:53:27.000 | - Not like a fundamental leap in intelligence?
00:53:29.120 | - It's hard to say definitively.
00:53:30.960 | I mean, it depends on how you define intelligence
00:53:35.960 | and all kinds of things,
00:53:37.120 | but obviously humans are unique and capable
00:53:39.720 | of all kinds of things that no other mammals are capable of.
00:53:43.280 | And there are important differences.
00:53:45.640 | And I don't think you need any magical intervention
00:53:50.180 | of something outside of the physical world to explain it.
00:53:53.640 | And the way I think about consciousness,
00:53:56.560 | I actually think it's part of the reason
00:53:58.920 | we're mistaken about consciousness,
00:54:01.680 | is because we are special in the ways that we're special.
00:54:05.560 | And because we're complex creatures,
00:54:08.520 | we have these complex brains.
00:54:10.440 | So I think we should probably get into some of the details
00:54:12.620 | of why I think we're confused about what consciousness is.
00:54:16.200 | But just to finish this point,
00:54:18.240 | I think that we don't actually have any evidence
00:54:23.160 | that consciousness is complex,
00:54:26.720 | that it comes out of complex processing,
00:54:28.840 | that it's required for complex processing.
00:54:31.600 | And I think we've made this anthropomorphic mistake
00:54:36.320 | because we are conscious and it's very hard to get evidence.
00:54:41.320 | It's one of the things that makes consciousness
00:54:43.820 | unique and mysterious and why I'm fascinated with it
00:54:46.160 | is it's the one thing in nature
00:54:48.520 | that we can't get conclusive evidence of from the outside.
00:54:52.360 | We can by analogy, you know,
00:54:53.880 | you're behaving basically the same way I behave,
00:54:57.200 | more or less.
00:54:58.840 | You talk about your conscious experiences
00:55:00.680 | and therefore I just extrapolate from that,
00:55:03.400 | that you're having a felt experience in the way I am
00:55:06.320 | and we can do that throughout nature.
00:55:08.820 | Well, there's no physical evidence.
00:55:09.920 | There's nothing we can observe from the outside
00:55:11.560 | that will give us conclusive proof
00:55:13.800 | that consciousness is there.
00:55:15.360 | And so I think we've made this leap to,
00:55:19.480 | because we're conscious and because we're unique and special
00:55:23.560 | and complex and intelligent in the way that we are
00:55:28.560 | and because we don't have an intuition
00:55:32.640 | that anything else is conscious
00:55:34.720 | or we have no feedback about it,
00:55:37.400 | we've made this assumption that consciousness,
00:55:41.120 | that those things aren't conscious
00:55:42.960 | and felt experience does not exist out there
00:55:45.760 | in other atoms and forms of life even,
00:55:49.280 | but especially not inanimate objects.
00:55:52.680 | And therefore consciousness is somehow tied
00:55:57.680 | to these other things that make us unique.
00:56:01.400 | That consciousness arises
00:56:05.800 | when there is this complex processing.
00:56:08.000 | When there is, and there's,
00:56:09.560 | we can talk about the evolution argument too,
00:56:11.160 | which I think is super interesting to get into
00:56:12.840 | and I'm hoping to talk to Richard Dawkins
00:56:15.480 | about this for my series.
00:56:16.320 | - What's he think about consciousness?
00:56:18.040 | - He's not interested.
00:56:19.320 | (laughing)
00:56:20.640 | He's not interested in actually the conversation
00:56:22.400 | I would have with him would be very brief
00:56:23.840 | because he's just not that interested in this topic.
00:56:26.760 | But let's go back to the Richard Dawkins piece
00:56:29.120 | 'cause I feel like there's a lot to talk about here
00:56:31.000 | in terms of our intuitions about consciousness,
00:56:33.320 | what it's doing, why in my book
00:56:35.920 | and everywhere I talk about consciousness,
00:56:38.560 | I bring it back to these two questions
00:56:40.760 | that I think are at the heart of our intuitions
00:56:42.720 | about consciousness.
00:56:43.560 | And so your questions about whether human beings
00:56:46.280 | are unique and special and all of that
00:56:48.760 | I think are interesting questions
00:56:50.160 | and something we could talk about.
00:56:51.520 | I see them as separate questions
00:56:53.880 | from the consciousness questions.
00:56:55.680 | - So you see consciousness as giving a felt experience
00:56:59.480 | to our uniqueness as opposed to the uniqueness
00:57:02.080 | giving birth to consciousness.
00:57:03.200 | - Yes, and that potentially there is felt experience
00:57:06.760 | even though it sounds crazy even to me
00:57:08.360 | that there is felt experience in all matter.
00:57:12.320 | And at this point in my thinking
00:57:14.360 | and after a few conversations with some physicists
00:57:16.680 | I think if consciousness is fundamental
00:57:19.320 | the only thing that actually makes sense
00:57:21.200 | is that it is part of the most fundamental
00:57:25.120 | that space, time and everything else emerges out of.
00:57:30.040 | - Out of consciousness.
00:57:30.880 | - Felt experience is just part of the fabric of reality.
00:57:35.000 | - So is it possible to intuit this?
00:57:37.600 | Can we start by thinking about dogs and cats,
00:57:40.320 | go to the plants and then going all the way to matter
00:57:43.240 | or is this going to be like modern physics
00:57:45.160 | where it's just going to be impossible to even,
00:57:47.520 | through our reason alone.
00:57:50.800 | We're gonna have to have tools of some kind.
00:57:52.880 | - I think it'll be a little bit of both.
00:57:54.200 | I mean I think the science has a very long way to go
00:57:57.200 | and the truth is I don't even think we can get
00:57:59.240 | to the science yet because we have to do this work
00:58:02.080 | and this is why I'm so passionate about this work.
00:58:04.600 | And it's really taking hold.
00:58:09.080 | I mean there are scientists, neuroscientists and physicists
00:58:12.640 | interested in consciousness and kind of having gotten
00:58:16.280 | over the initial obstacle of wrestling with these intuitions
00:58:21.280 | so that it's now being talked about in a serious way
00:58:26.160 | which was the first huge hurdle.
00:58:28.360 | But I think a lot more of that has to happen,
00:58:31.040 | a lot more of the intuition breaking
00:58:35.760 | from the science we already have.
00:58:38.000 | I mean I think we almost need to catch our intuitions up
00:58:40.520 | to what we already know and then continue to break through
00:58:46.080 | these intuitions systematically so that we can really think
00:58:49.640 | more clearly about consciousness.
00:58:51.520 | There are a couple of scientists now working on theories
00:58:56.840 | of consciousness which do go, they don't quite go
00:59:00.020 | to the fundamental level but they go extremely deep
00:59:03.880 | so that something like an electron might be conscious
00:59:06.420 | under their theory.
00:59:07.360 | This is integrated information theory, IIT
00:59:11.200 | with Christophe Koch and Giulio Tononi.
00:59:15.380 | I've spoken to both of them.
00:59:17.340 | I spoke to Christophe Koch once or twice
00:59:21.700 | for this project I'm working on now.
00:59:23.620 | What they're working on is incredibly interesting to me
00:59:27.300 | and I think very important work.
00:59:29.040 | However, I think they are also really led
00:59:35.660 | by some false intuitions about self and free will.
00:59:40.080 | And I think that will be a limit to their work
00:59:42.080 | so we can get into that but--
00:59:44.500 | - Let's go.
00:59:45.320 | - I'll tell you what.
00:59:46.160 | - Christophe Koch is awesome.
00:59:48.380 | - Which is that what they're working on I think
00:59:52.620 | is the most important next step forward
00:59:54.780 | which is just even being open to the fact
00:59:57.420 | that consciousness goes as deep as particles.
01:00:00.300 | - And being rigorous.
01:00:01.140 | - But even their theory isn't going as deep
01:00:04.900 | as I think we need to go.
01:00:07.300 | And it's hard to say how we could actually study
01:00:09.660 | this scientifically but that's part of the reason
01:00:11.260 | why I'm such a supporter of IIT and why I'm so interested
01:00:13.940 | in what they're doing even though I think they're wrong
01:00:16.300 | is because they're opening this path
01:00:20.760 | and I think they're getting more people interested
01:00:22.560 | and I think, yeah, it's hard for me to imagine
01:00:27.560 | what the science will actually look like.
01:00:30.080 | - Okay, so your intuition or at least the direction
01:00:33.260 | which you're pushing is that consciousness
01:00:36.720 | is the only fundamental thing in the universe
01:00:39.400 | that everything else, time, all those kinds of things
01:00:44.400 | emerge from that.
01:00:45.700 | - I will say that what I believe at this point,
01:00:51.640 | I've been saying 50/50 for a long time,
01:00:54.440 | I'd say now it's like 51/49 in terms of consciousness
01:00:58.960 | being emergent versus fundamental.
01:01:01.000 | So I am not convinced of this at all.
01:01:05.120 | I'm not convinced that consciousness is fundamental.
01:01:07.080 | What I think is there are very good reasons
01:01:10.200 | to think it could be and essentially all of science
01:01:15.200 | up to this point has been led by the other assumption,
01:01:20.800 | by the first assumption that consciousness arises
01:01:23.800 | at some point, namely in brains,
01:01:26.120 | and that's where all the science has gone
01:01:27.720 | and I think that's wonderful and I think it should
01:01:29.320 | keep on going and I actually think that was
01:01:30.920 | a more important place to start.
01:01:33.000 | But I think there's a possibility that the correct assumption
01:01:36.480 | is that it's fundamental and so that's the science I support,
01:01:40.080 | that's the thing I spend a lot of my time thinking about
01:01:42.860 | and talking to scientists and philosophers about
01:01:45.480 | and so I shouldn't give the idea that I actually
01:01:50.320 | have crossed over into believing this is the case,
01:01:52.520 | but it's the assumption I follow in my work at this point.
01:01:56.800 | - It's a possibility, an understudied possibility,
01:01:59.960 | so it deserves serious, rigorous attention.
01:02:02.520 | - And there are good reasons to start with that assumption
01:02:05.800 | versus the other that I think we're just now
01:02:08.520 | starting to realize.
01:02:10.040 | - So just to clarify, when we're talking about consciousness
01:02:13.600 | we're talking about the heart problem of consciousness,
01:02:15.920 | that it feels like something to,
01:02:17.520 | there's a subjective experience.
01:02:20.340 | Do we, if consciousness permeates all matter,
01:02:27.360 | it's fundamental, is that going to be somehow,
01:02:31.640 | is our current intuition about consciousness,
01:02:34.040 | like the very tiny subset of what consciousness
01:02:39.040 | actually is, so like we have our intuitions
01:02:42.440 | about personal experiences, like what it feels like,
01:02:44.920 | what it tastes to eat a cookie or something like that.
01:02:47.660 | But that seems like a very specific implementation
01:02:54.480 | of consciousness in an organism,
01:02:57.740 | so how can we even reason about something that's,
01:03:00.600 | if consciousness is fundamental,
01:03:02.800 | how can we reason about that?
01:03:04.300 | - I'm not sure I'm understanding the connection
01:03:07.160 | between those two things, but--
01:03:09.360 | - When you think about what it's like to be a plant
01:03:11.520 | to experience a thing, okay, we can kind of get that,
01:03:14.280 | we can kind of understand that.
01:03:15.480 | - There are a lot of places we could go with this.
01:03:16.840 | One is, there is actually work being done
01:03:19.360 | by people like David Eagleman, he's a neuroscientist,
01:03:21.740 | I don't know if you know him.
01:03:22.840 | You should talk to him for your podcast if you haven't,
01:03:24.880 | he's wonderful, great science communicator.
01:03:26.820 | He's someone I interviewed for my current project too.
01:03:29.880 | So he's done, this actually, okay,
01:03:32.880 | there are many places we can go.
01:03:34.040 | One is, he does work with sensory addition,
01:03:37.760 | sensory substitution, and this is going
01:03:41.480 | in some very interesting directions
01:03:43.760 | and maybe partly answers your question,
01:03:46.320 | which is giving humans qualia, sensory experiences
01:03:51.320 | that we're not wired for,
01:03:53.360 | that human beings have never had before.
01:03:55.980 | You let me know what you're most interested
01:03:57.880 | in hearing about.
01:03:58.720 | We can talk about things like the brain port.
01:04:01.320 | There was actually a study done,
01:04:03.520 | I just talked to one of the participants in the study
01:04:05.800 | where they were seeing if they could give human beings
01:04:08.880 | an experience of magnetic north.
01:04:10.840 | So other animals have this sense that we don't have
01:04:15.620 | where they can feel intuitively the way that our eyes work
01:04:19.000 | to give us an intuitive sense of our environment.
01:04:21.040 | We don't have to translate the information coming in
01:04:25.060 | through our eyes, we just have a map of the external world
01:04:28.120 | and we can navigate it.
01:04:30.200 | So many animals use a sense of magnetic north
01:04:33.960 | to get around and it's an intuitive sense.
01:04:37.320 | So I spoke to someone who was in this part
01:04:39.480 | of this experiment and it was fascinating
01:04:43.320 | to hear him acquire a sense, not only that he had never had,
01:04:48.320 | but that no human being had ever had.
01:04:50.240 | So when I asked him to describe the experience,
01:04:52.660 | it was challenging for him and understandably so,
01:04:56.820 | because it would be like you describing sight
01:04:59.360 | to someone who's never seen.
01:05:00.760 | But this is clearly possible and scientists
01:05:05.080 | like David Eagleman and others are working on these.
01:05:09.240 | And so I do think it's possible that this line,
01:05:13.160 | that these scientific advancements may actually start
01:05:18.160 | to dovetail with the consciousness research
01:05:23.960 | in terms of being able to experience things
01:05:28.960 | we've never experienced before.
01:05:32.040 | But I do think that at some level, yes,
01:05:34.820 | we're limited as human beings.
01:05:37.560 | We may be able to find some proof or enough proof
01:05:41.360 | to at least assume that consciousness is fundamental
01:05:45.400 | or who knows, one day actually believe
01:05:48.160 | that that's the correct scientific view of things
01:05:51.360 | and not really be able to get our minds around that
01:05:54.920 | or to understand what it means
01:05:56.200 | and certainly not to know what it feels like.
01:05:59.600 | I mean, we don't even know what it feels like
01:06:01.480 | to be other creatures.
01:06:03.480 | - Maybe we'll be--
01:06:04.320 | - I don't know what it's like to be you.
01:06:05.600 | (laughing)
01:06:07.840 | - Yeah, I mean, I guess that's what empathy is about.
01:06:13.080 | That's why I tried to exercise,
01:06:15.040 | try to imagine what it's like to be other people.
01:06:17.320 | And then you're doing that even farther--
01:06:19.520 | - P tendrils. - With P tendrils.
01:06:21.340 | But perhaps we can do that thing more rigorously
01:06:25.360 | by connecting different sensory mechanisms to the brain
01:06:29.360 | to do that for all kinds of organisms on Earth.
01:06:32.360 | But they're similar to us in scale
01:06:34.520 | and the time at which they function,
01:06:36.880 | the time scale and the spatial scale.
01:06:39.320 | Perhaps it's much more difficult to do
01:06:40.800 | for electrons and so on.
01:06:42.400 | - Some of the intuitions I talked about,
01:06:44.600 | I mean, I just kind of, I'm taking them for granted
01:06:47.480 | that you and everyone knows what I'm talking about.
01:06:49.580 | But in terms of the science, in terms of the studies,
01:06:54.580 | understanding things like binding processes,
01:06:57.360 | understanding just a little bit about how the brain works
01:07:01.580 | and as far as we understand.
01:07:03.860 | And there's just a ton of evidence now to support
01:07:06.480 | that our conscious experience is at the tail end
01:07:10.340 | of a lot of brain processing.
01:07:12.980 | - So she tells a story.
01:07:15.600 | - Yeah, so just a little bit.
01:07:17.200 | I mean, I give in the example in my book,
01:07:19.360 | I talk about tennis and the binding of the sights
01:07:23.400 | and sounds and felt experience of hitting a tennis ball,
01:07:27.320 | which in the world are happening at different times.
01:07:29.720 | The rates it takes, the sound waves and the light waves
01:07:34.720 | and the felt sensation to travel to my brain are different.
01:07:39.040 | There are these binding processes that happen
01:07:43.220 | prior to the conscious experience
01:07:47.140 | that were essentially delivered to us by the brain.
01:07:50.940 | And so we can get back into this.
01:07:55.060 | I can answer your bigger question first,
01:07:57.180 | but I feel like for a lot of people to understand
01:08:01.500 | some of the science that already is shattering
01:08:05.700 | some of our intuitions about the role consciousness plays,
01:08:08.620 | I think is helpful in terms of being able to be open
01:08:12.740 | to thinking about these other ideas.
01:08:14.980 | - Let's go there.
01:08:15.820 | Where the heck does consciousness happen
01:08:18.020 | in what we understand about the brain timing-wise?
01:08:20.560 | I mean, this connects to conscious will, too,
01:08:23.820 | our experience of free will.
01:08:25.260 | - Yeah, there is this period of time,
01:08:28.300 | and it's depending on the situation and the behavior,
01:08:33.140 | it can be anywhere from, it's essentially half a second.
01:08:36.820 | There's 200 milliseconds.
01:08:38.700 | I actually don't know, I was gonna compare it
01:08:40.460 | to the timing of syncing film and sound.
01:08:42.980 | I don't know if you know this data.
01:08:44.700 | - Unfortunately, I know this very well.
01:08:46.620 | - You do?
01:08:47.460 | - The film and sound?
01:08:48.860 | - Yeah, like how the timing has to work
01:08:52.140 | so that we conscious, so that our experiences of it
01:08:54.660 | happening at the same time.
01:08:56.620 | - Let me just sit in the silence of it.
01:09:00.180 | There's been so much pain on this one point.
01:09:03.740 | - Sorry, I had no idea. - So much suffering.
01:09:06.420 | So, I mean, yeah, I did a lot of algorithms
01:09:10.380 | on automatic synchronization of audio and video
01:09:13.100 | and all these kinds of things.
01:09:14.220 | So I know this well.
01:09:15.380 | There's a lot of science and there's a lot of differences,
01:09:18.500 | but it's about, and people claim it's about 100 milliseconds
01:09:23.220 | and you can't tell the difference,
01:09:24.100 | but it's much more like 30 to 50 milliseconds.
01:09:27.780 | And you can go nuts trying to see
01:09:31.420 | if something is in sync or not.
01:09:33.380 | Is it in sync or not?
01:09:34.660 | - Well, also, you know--
01:09:35.500 | - Am I out of sync right now?
01:09:36.340 | - Your brain is constantly making adjustments,
01:09:38.700 | and so it can shift for you while you're doing that,
01:09:42.180 | which is probably part of the thing
01:09:43.300 | that's driving you crazy.
01:09:44.540 | Okay, so I'll start with binding processes
01:09:47.900 | and then I'll just give a couple examples.
01:09:50.500 | So yes, there's this window where your brain
01:09:54.420 | is essentially putting all of the information together
01:09:59.420 | to deliver you a present moment experience
01:10:02.620 | that is most useful for you to navigate the world.
01:10:05.200 | So as I said, I use this example of tennis in my book.
01:10:11.140 | So the sights and sounds are coming at us
01:10:14.980 | at different rates.
01:10:16.220 | It takes longer for a sensation in my hand
01:10:20.940 | when I hit the ball with the racket to travel to my brain
01:10:25.040 | than it does for the light waves to hit my retina
01:10:28.780 | and get processed by the brain.
01:10:30.540 | So all these signals are coming in at different times.
01:10:33.800 | Our brains go through this process of binding
01:10:36.860 | to basically weave it all together
01:10:38.340 | so that our conscious experience of that
01:10:41.620 | is of seeing, hearing, and feeling the ball
01:10:44.340 | hit the racket all at the same time.
01:10:45.980 | That's obviously most useful to us.
01:10:49.260 | Binding is mostly about timing.
01:10:51.220 | It can be about other things,
01:10:52.460 | but I was just talking to David Eagleman
01:10:54.380 | who was talking about a very simple experiment, actually,
01:10:57.140 | and this kind of shows how your brain
01:11:00.000 | is basically always interacting with the outside world
01:11:03.420 | and always making adjustments to make its best guess
01:11:06.980 | about the most useful present moment experience to deliver.
01:11:11.980 | So this is a very simple experiment.
01:11:14.500 | This is from many, many years ago,
01:11:16.900 | and David Eagleman was involved in this research
01:11:20.700 | where they had participants hit a button
01:11:23.220 | and that button caused a flash of light.
01:11:25.220 | And so our brains, through binding,
01:11:33.000 | the brain, it notices, is able to kind of calibrate
01:11:38.000 | the experience you have because the brain is aware
01:11:41.280 | that it is its own hand that is causing the light to flash,
01:11:44.960 | that there's this cause and effect going on,
01:11:46.720 | and so you have this experience of pushing the button
01:11:51.320 | that causes the flash of light, which is true,
01:11:53.200 | and the light flashes.
01:11:55.360 | You can start to introduce longer pauses,
01:11:59.000 | starting with 20 milliseconds, 30 milliseconds,
01:12:01.880 | going up to, I think, 100, maybe even 200 milliseconds,
01:12:05.560 | where if you do it gradually,
01:12:07.720 | since your brain is making the adjustment,
01:12:09.820 | you can introduce a delay.
01:12:12.480 | I think it's up to 200 milliseconds.
01:12:14.760 | If you do it gradually, you will still have the experience,
01:12:18.880 | even though there's now a delay
01:12:20.260 | between when you hit the button and the light flashes,
01:12:22.920 | you will still have the exact same experience you had
01:12:25.320 | initially, which is that the light flashes
01:12:27.120 | right when you push the button.
01:12:28.160 | In your experience, nothing is changing.
01:12:31.400 | But then, so they gradually give a delay.
01:12:35.920 | You've acclimated to that because it was done gradually.
01:12:39.560 | If they then go back to the original instantaneous flash,
01:12:43.960 | your brain doesn't have time to make the adjustment,
01:12:47.580 | and you have the experience that the light flashed
01:12:51.360 | before you hit the button.
01:12:52.560 | And that is your true experience.
01:12:54.200 | It's not like you're confused, but that is,
01:12:56.960 | your brain didn't have time to make that adjustment.
01:12:59.920 | You think you're in the same environment.
01:13:01.440 | You're pushing the button, it makes the light flash.
01:13:03.240 | It's kind of calibrating all the time.
01:13:06.640 | But then the participants are suddenly saying,
01:13:09.600 | "Oh wait, that was so weird.
01:13:10.640 | "The light flashed before I hit the button."
01:13:12.840 | And so these-- - That's crazy.
01:13:16.680 | - They built a Rochambeau,
01:13:21.040 | rock, paper, scissors computer game
01:13:24.000 | that was unbeatable based on this glitch
01:13:27.480 | that you can present in binding by training someone.
01:13:30.520 | If you introduce a delay slowly enough,
01:13:33.640 | then the computer can get the information
01:13:36.980 | before it responds, but you still have the experience
01:13:40.880 | that you're both throwing out your rock or paper,
01:13:45.040 | scissors at the same time.
01:13:46.740 | But in actuality, the computer saw your choice
01:13:51.740 | before it makes its choice.
01:13:53.960 | And it's in this window of milliseconds
01:13:55.640 | where you don't notice it.
01:13:57.520 | - So that starts to help you build up an intuition
01:14:00.680 | that this conscious experience is an illusion
01:14:04.000 | constructed by the brain after--
01:14:06.360 | - Conscious will. - Conscious will.
01:14:07.960 | - Yeah, and just in general that consciousness
01:14:11.520 | is not the thing that we feel it is,
01:14:14.280 | which is driving the behavior
01:14:17.120 | that is actually at the tail end of it.
01:14:19.600 | And so a lot of decision-making processes,
01:14:21.480 | and there are studies that are more controversial,
01:14:23.360 | and I don't usually like to cite them,
01:14:26.000 | although if you wanna talk about them, we can.
01:14:27.400 | They're super interesting and intuition-shattering.
01:14:29.980 | But there are now studies specifically about free will
01:14:33.160 | to see if there are markers at the level of the brain
01:14:36.480 | that can see what decision you're going to make
01:14:39.160 | and when you make that decision.
01:14:40.560 | And I think the neuroscience inevitably
01:14:44.280 | is just going to get better.
01:14:45.880 | And so part of the reason I'm so passionate about this,
01:14:49.640 | I mean, there's the science and there's just the curiosity
01:14:52.600 | that drives me of wanting to understand
01:14:55.480 | how the universe works.
01:14:56.640 | But I actually see a lot of the neuroscience
01:15:00.760 | presenting us with truths that are going to be difficult
01:15:05.200 | for us to accept.
01:15:07.400 | And I actually think there are really positive ways
01:15:11.080 | to view these truths that we're uncovering.
01:15:15.040 | And even though they can be initially kind of jarring
01:15:18.440 | and even destabilizing and creepy,
01:15:21.840 | I think ultimately there's actually a lot,
01:15:26.460 | it can have a positive effect on human psychology
01:15:30.420 | and a whole range of things
01:15:32.600 | that I and others have experienced.
01:15:34.920 | And that I think it's important for us to talk about
01:15:37.340 | because you can't hide from the truth,
01:15:40.360 | especially in science, right?
01:15:41.760 | Like it just, it will reveal itself.
01:15:44.760 | And if this is true, I think not only
01:15:48.680 | for better understanding the universe and nature,
01:15:51.580 | which is kind of my primary passion,
01:15:54.280 | it's important for us to absorb these facts
01:15:59.500 | and realize that they don't,
01:16:04.940 | it doesn't necessarily take away the things
01:16:07.560 | from us that we fear.
01:16:08.700 | I've heard people say, as we talked about,
01:16:12.800 | common point to make or question to ask a scientist,
01:16:15.800 | can you still enjoy chocolate
01:16:18.740 | if you're a molecular biologist?
01:16:20.580 | And is it a molecular biologist
01:16:23.740 | that would be the one who would understand
01:16:25.060 | how we experience chocolate?
01:16:26.660 | (laughing)
01:16:27.500 | I may have the wrong sign.
01:16:28.500 | - Yeah.
01:16:29.500 | - But anyway, if you-- - But the point stands.
01:16:31.260 | - If you're focused on the details
01:16:33.500 | of the underlying nature of reality,
01:16:36.460 | does that take the joy and the pleasure
01:16:39.940 | and for lack of a better word, spirituality,
01:16:44.140 | out of our experience as human beings?
01:16:47.140 | And I actually think for these illusions
01:16:50.820 | like free will and self, the reverse is true.
01:16:53.140 | I actually think they can give us,
01:16:55.420 | they are reasons and bases for feeling more connected
01:17:00.420 | to each other and to the universe,
01:17:03.860 | for spiritual experiences,
01:17:06.820 | for even just on a more basic level,
01:17:11.820 | for increasing our wellbeing,
01:17:14.100 | just in terms of our psychology
01:17:16.540 | of lowering rates of depression and anxiety.
01:17:19.580 | And actually I think these realizations
01:17:21.980 | can be extremely helpful to people.
01:17:25.220 | - Well, it's like realizing that the universe
01:17:29.220 | doesn't rotate around Earth,
01:17:31.020 | that the Earth is not the center of the universe
01:17:33.220 | is a really challenging thought.
01:17:35.300 | - Well, and people were worried
01:17:36.260 | about how that would affect society.
01:17:38.700 | - Well, yes, that's like long-term,
01:17:40.380 | but short-term, I bet you the number of people
01:17:43.980 | who had an existential crisis
01:17:45.980 | as it got integrated into society,
01:17:48.060 | that thought is huge.
01:17:49.340 | It's like, it's a hard one.
01:17:51.820 | And you're saying--
01:17:52.900 | - But it can't, but it's also a source of awe.
01:17:55.340 | And I mean, so many people now use that fact
01:18:00.340 | to inspire a positive response,
01:18:06.140 | to inspire creativity and curiosity and awe
01:18:13.340 | and all of these things that are so useful
01:18:18.060 | for human wellbeing.
01:18:19.020 | - Where's the source of meaning
01:18:20.740 | when you're not the center of the universe,
01:18:24.140 | when the you doesn't even exist?
01:18:27.740 | That even you, the sense of self
01:18:30.660 | and the sense of decision-making is an illusion.
01:18:35.340 | - The truth is that for the most part,
01:18:37.820 | the sense of self is kind of at the core of human suffering
01:18:43.300 | because it feels as if we are separate
01:18:47.020 | from the rest of nature.
01:18:48.140 | We're separate from each other.
01:18:49.980 | We're separate from the illusion that I referenced
01:18:54.980 | of feeling like we have these thoughts
01:18:58.260 | that are brain-based thoughts,
01:18:59.940 | but then the I swoops in to make a decision.
01:19:03.580 | In some sense, it goes so deep
01:19:05.660 | that it's as if the I is separate from the physical world.
01:19:11.060 | That separation plays a part in depression,
01:19:14.580 | plays a part in anxiety, even plays a part in addiction.
01:19:18.340 | So at the level of the brain,
01:19:20.380 | I think, stop me if I'm repeating myself,
01:19:22.340 | but we started talking about the default mode network.
01:19:25.020 | And so we actually know that when the default mode network
01:19:30.900 | is quieted down, when people lose a sense of self
01:19:33.940 | in meditation and on psychedelic drugs in therapy,
01:19:38.340 | there is a feeling that people describe
01:19:41.060 | of an extremely positive feeling
01:19:44.820 | of being connected to the rest of nature.
01:19:48.060 | And so that's a piece of it that I think
01:19:50.100 | if you haven't had the experience,
01:19:52.060 | you wouldn't necessarily know that would be a part of it.
01:19:56.260 | But truly having that insight
01:19:59.340 | that you're not the self you feel you are,
01:20:01.900 | immediately your experiences are embedded in the universe
01:20:07.860 | and you are a piece of everything
01:20:11.220 | and you see that everything is interconnected.
01:20:14.180 | And so rather than feeling like a lonely I
01:20:17.500 | in this bigger universe,
01:20:18.940 | there's a sense of being a part
01:20:23.180 | of something larger than yourself.
01:20:25.060 | And this is intrinsically positive for human beings.
01:20:30.020 | And even just in our everyday lives and choices
01:20:32.380 | and what we do for work,
01:20:34.260 | feeling part of something larger than yourself
01:20:37.900 | is the way people describe spiritual experiences
01:20:41.060 | and the way many positive psychological states are framed.
01:20:46.060 | And so there's that piece of it.
01:20:50.220 | - There is something,
01:20:51.060 | so there's one giant hug with the universe,
01:20:53.460 | everything in it.
01:20:56.220 | But there is some sense in which
01:21:00.140 | we attach the search for meaning
01:21:04.100 | with the I, with the ego.
01:21:08.520 | And it could almost seem like life is meaningless.
01:21:14.740 | Our existence, our I, my existence is meaningless.
01:21:18.740 | - I think you can kind of go there under any worldview.
01:21:21.420 | - Right. - Really.
01:21:22.260 | - Right. - And the truth is
01:21:25.300 | we want to find a truth
01:21:29.340 | out of that downward spiral
01:21:32.460 | and not a story that we have to tell ourselves
01:21:37.460 | that isn't true.
01:21:40.140 | And the fact is we have these facts available to us
01:21:44.060 | that with the right framing and the right context,
01:21:49.060 | looking at the truth actually provides us
01:21:54.220 | with that psychological feeling we're searching for.
01:21:57.020 | And I think that's important to point out.
01:21:59.540 | - I think humans are fascinatingly good
01:22:03.380 | at finding beauty in truth
01:22:06.140 | no matter how painful the truth is.
01:22:07.700 | So yes. - Yes, I totally agree.
01:22:10.940 | But in this case, I think there are,
01:22:14.840 | the concerns are legitimate concerns
01:22:22.700 | and I have them myself for how people respond.
01:22:25.780 | I've actually had people tell me
01:22:27.100 | they had to stop reading my book halfway through
01:22:29.700 | because the parts on free will were so upsetting to them.
01:22:34.540 | And this is something I think about a lot
01:22:38.020 | because that kind of breaks my heart.
01:22:41.460 | I don't, because I see this potential
01:22:48.140 | for these realizations bringing
01:22:52.380 | levels of wellbeing that many people don't have access to.
01:22:57.180 | I think it's important to talk about them
01:23:01.580 | in ways that override what can be an initial fear
01:23:06.140 | or kind of spooky quality
01:23:11.300 | that can come out of these realizations.
01:23:13.460 | - So at the end of that journey,
01:23:14.780 | there's a clarity and an appreciation of beauty
01:23:18.820 | that if you just write it out.
01:23:20.660 | By the way, if you wanna read upsetting,
01:23:22.140 | I just gotten through the boy,
01:23:26.140 | the four books if you wanna read upsetting.
01:23:29.260 | So my audible is hilarious.
01:23:31.260 | So there's conscious in it.
01:23:33.180 | And then, so your book.
01:23:35.180 | And then it has "The Rise and Fall of the Third Reich",
01:23:38.460 | "Bloodlands" by Timothy Snyder,
01:23:43.180 | probably the most upsetting book I've ever read.
01:23:45.780 | If you wanna, 'cause it's not just Stalin or Hitler,
01:23:48.820 | it's Stalin and Hitler.
01:23:50.100 | It's the worst hits, the opposite of the best hits.
01:23:53.740 | It's really, really, really well written, really difficult.
01:23:57.180 | I read Solzhenitsyn's "Gulag Archipelago"
01:24:03.300 | and what else, "Red Famine", which is,
01:24:07.820 | and "Apple Bomb", does that hurt?
01:24:11.900 | Yeah, anyway, so those are truly upsetting.
01:24:14.860 | And those are a lot of times the results
01:24:19.260 | of hiding the truth versus pursuing the truth.
01:24:23.820 | So truth in the short term might hurt,
01:24:26.060 | but it did ultimately set us free.
01:24:29.380 | - I believe that.
01:24:30.340 | And I also think whatever the truth is,
01:24:34.180 | we have to find a way to maintain civil society
01:24:39.180 | and love and all the things that are important to us.
01:24:42.060 | - If we can jump around a little bit,
01:24:43.540 | can I just ask on a personal note,
01:24:45.420 | because you said you've suffered from depression
01:24:47.700 | and there's a lot of people that see guidance
01:24:50.460 | on this topic 'cause it's such a difficult one.
01:24:52.740 | How were you able to, when it has struck you,
01:24:57.740 | how were you able to overcome it?
01:24:59.500 | - Yeah, I mean, this is maybe too long an answer.
01:25:02.920 | So I've experienced it in different forms.
01:25:05.260 | So it was my, I would say my depression
01:25:08.860 | has almost always mostly taken the form of anxiety.
01:25:13.780 | I didn't realize how anxious I was,
01:25:18.380 | I think until I was an adult.
01:25:19.860 | So I was always very functional.
01:25:21.600 | I think all the positive sides of suffering in that way.
01:25:27.040 | I think I'm a little OCD as you can tell.
01:25:31.580 | - And this whole conversation is hilarious
01:25:34.020 | 'cause we're both suffered to some level of anxiety.
01:25:37.420 | - Your psychology is just laid out in front of us here.
01:25:39.740 | - It's a giant mess.
01:25:41.140 | - We're the same kind of human.
01:25:42.420 | - It's great.
01:25:44.020 | - Just trying to organize.
01:25:46.540 | - Hold on like the Tom Waits song.
01:25:48.360 | - But then I suffered from postpartum depression
01:25:53.100 | after both of my daughters, after both pregnancies.
01:25:56.520 | That was a very different experience
01:25:59.820 | from anything I've ever experienced,
01:26:01.500 | but clearly I had a predisposition
01:26:03.740 | towards suffering from something like that.
01:26:07.380 | Anyway, it really wasn't until I fully recovered
01:26:11.740 | from the second experience of postpartum depression
01:26:15.380 | that I realized
01:26:16.900 | that I had been suffering on some level my whole life.
01:26:23.440 | And I think I always knew,
01:26:25.060 | I thought of myself as a very sensitive person,
01:26:27.580 | an empathic person.
01:26:29.140 | I mean, I've been in therapy for 10 years.
01:26:32.780 | I knew I had a lot of anxiety.
01:26:37.820 | I would never have denied that I had a lot of anxiety.
01:26:40.660 | I just didn't realize it crossed over into
01:26:44.140 | a disorder really until I was an adult
01:26:48.660 | and ended up taking Prozac.
01:26:51.080 | I took an SSRI for postpartum
01:26:54.380 | and it was fascinating to me.
01:26:58.060 | I ended up interviewing my psychiatrist
01:27:00.740 | because I was so fascinated in the whole thing
01:27:02.740 | once I was on the other side of it,
01:27:04.540 | just what I had been through,
01:27:05.820 | how different I felt during that period of time,
01:27:08.780 | and then how quickly the medication
01:27:10.700 | made me feel like myself again.
01:27:13.660 | I had come out the other side
01:27:14.900 | of the experience of postpartum
01:27:16.660 | and was going to start tapering off the medication.
01:27:21.820 | And in this window
01:27:24.340 | where I no longer had postpartum depression
01:27:26.820 | and hadn't yet gone off the SSRI,
01:27:32.380 | I realized that life was not only a lot easier
01:27:37.880 | than when I had postpartum,
01:27:39.740 | but it was easier than it had ever been.
01:27:42.740 | And it took taking all of that anxiety away
01:27:47.740 | to recognize how much I had been grappling with it
01:27:53.460 | my entire life.
01:27:55.020 | And it first started coming in the form of realizations
01:27:58.060 | like, oh, is this how other people,
01:28:01.700 | is this how other people feel?
01:28:03.900 | Is this how that,
01:28:04.740 | like the things that I just always thought of myself,
01:28:06.980 | I'm really sensitive,
01:28:08.500 | I'm an introvert, I need a lot of time to myself.
01:28:12.140 | And all of these things that I felt like,
01:28:14.900 | I mean, it's always very high functioning.
01:28:18.060 | And in some ways, I was a professional dancer
01:28:20.300 | and I think that was the type of therapy for me.
01:28:22.200 | There was the obsessing over
01:28:25.220 | the training and dancing nine hours a day.
01:28:31.220 | And all of that, I now look back on
01:28:33.900 | and see how much that was therapeutic for me.
01:28:38.180 | And that I was kind of treating something,
01:28:41.220 | but yeah, it was just this experience
01:28:44.020 | of treating an anxiety disorder
01:28:48.860 | that caused me to realize that I had one.
01:28:51.740 | I didn't know I could feel the way I felt
01:28:54.300 | after taking Prozac.
01:28:56.300 | And I became very interested in,
01:28:57.540 | I mean, I was already working with neuroscientists,
01:29:00.340 | I was always already interested in consciousness
01:29:02.140 | in the brain and it just,
01:29:05.940 | this kind of rattled other intuitions for me
01:29:08.940 | in terms of how our childhoods shape who we become.
01:29:13.940 | Because I had been convinced,
01:29:18.260 | my father was--
01:29:20.100 | - He's a complicated person, as you said before.
01:29:23.740 | - That's what I was just gonna say again.
01:29:25.980 | But I think, I mean, so he was not diagnosed,
01:29:28.740 | I think he had borderline personality disorder
01:29:32.380 | and was emotionally abusive.
01:29:35.140 | And I thought that all of the ways I experienced the world
01:29:39.380 | and all of my anxiety and my sensitivities,
01:29:41.780 | I thought almost all of that, if not all of that
01:29:46.140 | was because of these experiences I had growing up
01:29:48.900 | and trauma that I experienced as a child.
01:29:51.260 | And obviously those things play a part,
01:29:53.580 | but what I realized after going through postpartum
01:29:57.980 | and then the thing that was extremely informative to me
01:30:00.700 | was having my own children.
01:30:02.380 | Because they were basically living my dream childhood.
01:30:08.540 | They had none of the things that I thought were the cause
01:30:11.660 | of the psychological suffering that I experienced.
01:30:14.820 | There was none of that and they have a lot of the same,
01:30:20.660 | they struggle with a lot of the same anxiety
01:30:23.420 | and panic attacks.
01:30:24.540 | And what I realized was how much we're kind of born
01:30:29.540 | into the world with these things that we struggle with
01:30:34.900 | and with our strengths and with all of that.
01:30:36.620 | And of course, then if you have an abusive childhood,
01:30:39.620 | if you're someone who tends to be anxious and sensitive
01:30:44.300 | and empathic, and then you're born into an abusive situation
01:30:48.820 | that's obviously a terrible combination.
01:30:51.660 | But I never acknowledged or realized how strong
01:30:55.660 | just the genetics and the wiring played.
01:31:00.660 | - Where's the line between you kind of accepting
01:31:04.540 | the challenges you're born with,
01:31:06.340 | and this is what life will be,
01:31:08.500 | versus then figuring out that life can be somehow different?
01:31:13.060 | - I think they're part of the same process.
01:31:16.020 | And I think it's kind of necessary to be aware
01:31:21.020 | and kind of necessary to accept what you're experiencing
01:31:26.020 | and what the situation is and how you feel
01:31:30.180 | and the types of thoughts and patterns you tend toward
01:31:35.180 | in order to make whatever changes can be made.
01:31:38.860 | So I do think it's kind of part of the same process.
01:31:41.500 | - Could life have been any different?
01:31:43.340 | I mean, do you regret certain aspects of the decisions made?
01:31:48.340 | Not by you.
01:31:49.540 | - I mean, it depends on what level we're talking.
01:31:52.020 | I think at a fundamental level,
01:31:54.040 | I don't believe anything could be different.
01:31:56.900 | - Are you able to think at that level about your own life?
01:32:00.180 | - Sure, and that's actually, that's part of what I was,
01:32:02.180 | when I wanted to kind of talk a little bit
01:32:04.420 | about the levels of usefulness of being aware
01:32:09.420 | of these different illusions,
01:32:11.160 | because I would say most of the time in our daily lives,
01:32:15.900 | the types of illusions that I'm interested in shaking up
01:32:18.860 | are not useful to remind ourselves of most of the time.
01:32:23.780 | I really think there are different levels of usefulness
01:32:27.060 | to thinking about and reminding ourselves
01:32:31.900 | of the places where we have false intuitions.
01:32:34.740 | And so I often use the analogy of living on a sphere.
01:32:39.740 | So it still feels to most of us most of the time,
01:32:47.100 | I mean, our intuitive sense,
01:32:48.140 | we're not thinking about whether the earth is flat
01:32:50.860 | or a sphere, but we behave as if it's flat.
01:32:54.500 | And that makes the most sense.
01:32:56.060 | And it would be exhausting to keep reminding ourselves
01:33:01.060 | as we walk down the street, like it feels flat,
01:33:03.420 | but it's not flat.
01:33:04.940 | Like there's just no reason to do,
01:33:07.020 | it's not useful in that moment.
01:33:09.340 | If you're building a house,
01:33:10.260 | you can build it as if the world is flat.
01:33:12.340 | And, but you know, of course,
01:33:15.900 | so there are psychological reasons to bring it into view
01:33:19.060 | and maybe even spiritual reasons to bring it into view.
01:33:21.580 | And then there's just like usefulness.
01:33:23.940 | So if you're building a rocket to the moon,
01:33:26.780 | you better understand the geometry of the earth.
01:33:30.620 | Even if you're flying an airplane,
01:33:32.740 | if you're an airplane pilot,
01:33:33.940 | you have to be aware of the truth of our situation.
01:33:37.060 | And then I think there are other places
01:33:38.820 | where it's interesting to remind ourselves
01:33:41.460 | is where I start out my book.
01:33:44.220 | Just as a way to inspire awe
01:33:48.740 | and to get yourself out of your everyday life
01:33:51.700 | and see the big picture, which can be just a relief,
01:33:56.540 | but also helps you feel more connected to the universe
01:34:00.940 | and to something larger than ourselves.
01:34:03.300 | And so I see these intuitions,
01:34:07.580 | reminding ourselves that these intuitions
01:34:10.060 | are illusions in the same way,
01:34:11.740 | that most of the time they're not useful.
01:34:14.340 | They are useful if we want to think
01:34:16.860 | about a science of consciousness.
01:34:19.180 | They're useful for a whole range
01:34:20.820 | of neuroscientific studies.
01:34:23.500 | And I think they can be incredibly useful
01:34:26.620 | in the same way that lying on the ground
01:34:29.940 | and feeling the gravity pushing you against a sphere
01:34:33.620 | and realizing you're floating in the middle of outer space,
01:34:36.860 | it gives me the same feeling to realize.
01:34:40.340 | And so I have, I mean, there's so many levels to it,
01:34:44.740 | but if I'm thinking about difficult things
01:34:48.100 | that I've experienced, different traumas in my life,
01:34:52.660 | when I take a step back and kind of get this bird's eye view
01:34:55.940 | of kind of the mystery of this unfolding of the universe
01:35:00.100 | and the fact that it happened the way it happened
01:35:04.540 | and whether it could have happened another way,
01:35:06.700 | there's no going back, that's the way it unfolded.
01:35:10.060 | And being able to surrender to that,
01:35:13.020 | I think is very psychologically healthy
01:35:17.420 | and prevents us from, I mean, I think regret
01:35:20.580 | is one of the most toxic loops we can get into.
01:35:25.020 | - So this is a path to acceptance.
01:35:28.220 | - Oh, absolutely, yeah.
01:35:30.100 | Because free will, I mean, I think part of what,
01:35:33.460 | the function of the experience of it is learning.
01:35:38.020 | I mean, I think we can still learn
01:35:39.460 | without being under the illusion that we have free will.
01:35:42.540 | - So for some people, depression can destroy them.
01:35:46.940 | So how can you think about avoiding that?
01:35:51.380 | - Yeah, so I didn't totally answer your question.
01:35:56.820 | First is therapy, ways that I have worked
01:36:01.820 | through anxiety and depression.
01:36:06.620 | - So you're an introvert and a deeply intellectual person,
01:36:11.620 | therapy works for you?
01:36:14.300 | - To a point.
01:36:15.820 | It was very helpful.
01:36:18.260 | I mean, I think talk therapy is one tool
01:36:21.220 | and can be helpful for, I mean, it depends on the therapist,
01:36:26.020 | depends on the type of therapy,
01:36:27.080 | but I found it to be one piece
01:36:31.540 | and probably not the biggest piece, actually.
01:36:35.380 | But I think, I wish I had discovered medication sooner.
01:36:40.300 | That would have made a big difference in my life.
01:36:43.800 | - Even just intellectually to realize that,
01:36:45.940 | oh, like I'm not--
01:36:48.260 | - Life was a lot harder than it needed to be.
01:36:50.460 | And it wasn't about keeping everything just so.
01:36:53.940 | You know, there's another state my brain can be in
01:36:57.620 | where I don't have to work so hard to be okay.
01:37:02.860 | Meditation was probably the most,
01:37:05.340 | meditation and psychedelic experiences
01:37:08.680 | were probably the most transformative.
01:37:10.880 | But a lot of these things, I'm lucky that I didn't,
01:37:18.220 | my anxiety and depression never really got in the way
01:37:23.700 | of my living my life, of enjoying my life.
01:37:27.580 | I mean, there were struggles that made life harder for me.
01:37:31.460 | But something like treatment-resistant depression
01:37:36.460 | or severe PTSD, these are things that,
01:37:42.220 | at this point in time, based on my understanding,
01:37:47.700 | I think once you've tried,
01:37:50.980 | and the truth is that meditation is often not helpful
01:37:54.580 | for those things, it can actually exacerbate them.
01:37:59.020 | And the most promising thing that I have seen
01:38:01.740 | is this research into psychedelic,
01:38:04.580 | therapy-assisted psychedelic.
01:38:06.580 | - Does that make sense to you,
01:38:07.420 | that psychedelics work so well for such difficult cases?
01:38:10.300 | What is it about psychedelics?
01:38:12.580 | - And I've been following this research from the beginning.
01:38:15.060 | When they were doing end-of-life,
01:38:16.860 | yeah, they started with end-of-life patients,
01:38:18.900 | I don't know, maybe 20 years ago.
01:38:20.300 | I met, at a TED conference, I met one of the doctors
01:38:23.380 | who was doing this research.
01:38:25.820 | It was the first time I became aware
01:38:27.480 | that the research was happening,
01:38:29.420 | and I'd already had my own experiences before that.
01:38:32.980 | And so it made perfect sense to me that this would work.
01:38:36.100 | It was still astonishing to see the results,
01:38:38.720 | to see how successful the work is so much of the time,
01:38:43.720 | but it doesn't surprise me it makes sense.
01:38:46.260 | And it's actually in line with all of these other things.
01:38:48.220 | So quieting down the default mode network.
01:38:52.600 | One of the things that's so transformative
01:38:55.860 | about taking something like psilocybin,
01:38:59.220 | and everyone's experience is different,
01:39:00.980 | it can vary each time you take it,
01:39:02.660 | even in a single person.
01:39:04.300 | But the experience I had,
01:39:06.580 | and the experience that many people have
01:39:07.980 | that is so transformative,
01:39:09.260 | is this feeling that's very hard to describe,
01:39:11.660 | but it's a feeling of being one with the universe.
01:39:15.900 | And that comes with, it's kind of all one feeling
01:39:19.180 | that is, again, hard to put into words,
01:39:21.540 | but there's this feeling that everything is okay.
01:39:25.820 | And I'd never had that feeling before in my life.
01:39:28.980 | And when I took psychedelics,
01:39:31.180 | that feeling would stay with me for months.
01:39:33.940 | And I never understood why,
01:39:35.180 | and it was always fascinating to me,
01:39:36.460 | but it was as if I was glimpsing a deeper truth
01:39:39.760 | of the world that it's all one thing, we're all connected.
01:39:44.760 | There's no sense that there could even be
01:39:51.660 | a feeling of loneliness.
01:39:52.980 | There's just this visceral sense
01:39:55.220 | of being one with everything,
01:39:56.980 | and that everything was okay.
01:39:58.580 | That all the things I was afraid of, even death,
01:40:00.980 | that the universe, in a sense,
01:40:04.660 | is an endless recycling, and I don't know.
01:40:09.660 | It's hard to describe.
01:40:11.620 | But we also know, on the other side,
01:40:13.900 | that depression and anxiety,
01:40:15.660 | when people are experiencing those things,
01:40:18.680 | the default mode network is more active.
01:40:23.680 | And so it's this cycling and this obsessive cycles
01:40:28.680 | of thinking about one's self
01:40:31.500 | that is a huge part of the suffering in the first place.
01:40:34.200 | And so the one thing that's surprising to me
01:40:38.360 | about the research is that,
01:40:39.860 | I may be fudging the data,
01:40:44.200 | but it's something like 80% of people
01:40:47.800 | who are treated for PTSD after one,
01:40:51.760 | only one session are cured of their PTSD.
01:40:56.600 | - Yeah, the effect stays for prolonged periods of time.
01:41:00.000 | - Yeah. - That's really interesting.
01:41:01.400 | - And addiction as well, which is interesting.
01:41:03.320 | That's not something I'm personally familiar with,
01:41:07.120 | so that was a surprise to me.
01:41:08.440 | But yeah, I mean, it's just wonderful that we--
01:41:11.960 | - Yeah, it's incredible.
01:41:13.160 | I mean, of course, it's also incredible
01:41:14.840 | for people who don't suffer
01:41:17.600 | to see what psychedelics can do with the mind,
01:41:19.760 | which is that kind of appreciation.
01:41:21.680 | - Well, and I think it's actually important for this work.
01:41:24.760 | It's one of the questions I ask everyone I talk to
01:41:28.480 | for this series.
01:41:30.800 | Many of them, I won't be able to use that audio.
01:41:34.280 | - Oh, ask them if they've done psychedelics or not?
01:41:35.920 | - Yes, and what their experience was,
01:41:37.400 | and if that's informed.
01:41:38.400 | Actually, initially in the '50s,
01:41:40.840 | I want to do more research on this and look into it,
01:41:42.960 | but in the '50s, there were some studies
01:41:45.160 | that were being done with scientists who,
01:41:48.640 | there were hundreds of scientists they put into the study
01:41:51.520 | where they were on the brink of some kind of discovery,
01:41:56.360 | where they were stuck.
01:41:57.200 | So they had been doing research, and they were stuck,
01:41:59.720 | and they used psychedelics to come up with an answer
01:42:03.400 | to find a path forward,
01:42:04.720 | and it was extremely useful for that.
01:42:07.480 | - Yeah, I mean, it's fascinating.
01:42:08.720 | And the nice thing about psychedelics, from my perception,
01:42:11.560 | is that they don't currently suffer from the taboo
01:42:14.880 | that weed does.
01:42:16.680 | - Oh, that's interesting.
01:42:18.440 | - I don't think.
01:42:19.640 | So like, for example, there's some kind of cultural construct
01:42:24.640 | about a pod head that makes it so that,
01:42:29.360 | you know, like Elon got in trouble for smoking weed.
01:42:31.960 | - Right, but he would have gotten in trouble
01:42:33.160 | for taking mushrooms, too. (laughs)
01:42:35.120 | - I don't know. - Really?
01:42:36.360 | - I don't think so.
01:42:37.280 | - Oh, that's interesting.
01:42:38.120 | - I don't think so, 'cause-- - That's a surprise to me.
01:42:39.360 | - 'Cause mushrooms-- - That's great.
01:42:40.280 | - Mushrooms, to me, seem like a journey.
01:42:43.280 | Like, there's a perception
01:42:44.760 | that you don't take mushrooms all the time.
01:42:46.800 | It's not an addictive substance.
01:42:48.040 | It's not a lifestyle.
01:42:49.520 | It's like going to Burning Man.
01:42:51.200 | It's like an experience that stays with you for a long time.
01:42:54.080 | - Didn't realize that understanding
01:42:55.520 | had permeated into the culture.
01:42:57.400 | - Yeah, that's a good question, if it has or not,
01:43:00.800 | because maybe I have a very narrow perspective
01:43:03.400 | of these kinds of things.
01:43:04.240 | But I think what has permeated is, through Hollywood,
01:43:08.480 | ideas of what it means to be a person who smokes weed a lot.
01:43:11.240 | - Yeah. - And that has, like,
01:43:13.760 | has had its effect, which is hilarious,
01:43:17.560 | given the effects of weed versus alcohol.
01:43:22.400 | But that's a whole 'nother story.
01:43:24.480 | - Have you taken psychedelics?
01:43:25.800 | - Yes. - And you've spoken
01:43:26.640 | about it on your podcast?
01:43:27.680 | - Yeah, yeah, so, not a lot.
01:43:30.360 | I really want to do a lot more.
01:43:32.280 | I've taken mushrooms. - Vaspirations. (laughs)
01:43:35.400 | - Yeah, I mean, 'cause it was such a--
01:43:36.240 | - Oh, I know, so do I.
01:43:37.640 | - And I didn't have, I have a very addictive personality,
01:43:40.240 | so I'm very nervous about substances.
01:43:43.000 | But I didn't have any addictive relationship
01:43:45.920 | with that thing.
01:43:47.080 | Well, every time I--
01:43:48.200 | - It is a treatment for addiction, so.
01:43:50.880 | - Interesting. (Bridget laughs)
01:43:52.280 | But I, you know, I'm almost nervous
01:43:56.680 | because every time I've taken mushrooms,
01:43:58.840 | I've had a really pleasant experience.
01:44:01.120 | I mean, it was, it's already the thing I feel anyway.
01:44:06.760 | But I feel it more intensely.
01:44:08.200 | The thing I feel anyway is like appreciation of the moment,
01:44:10.640 | how beautiful life is.
01:44:12.000 | The weird thing that I feel, not throughout the day,
01:44:16.320 | but certain moments of the day, especially early on,
01:44:18.560 | that's like life is intensely beautiful.
01:44:21.920 | Like, that's usually when I'll tweet.
01:44:26.120 | - Yeah. (laughs)
01:44:26.960 | - It's like everything is awesome.
01:44:29.000 | And I remember those feelings
01:44:31.240 | because sometimes when I feel really down
01:44:32.920 | and all those kinds of things,
01:44:34.520 | you remember that it's a rollercoaster
01:44:37.520 | and you just, and then you find the good feelings
01:44:39.960 | and it's cool.
01:44:40.800 | And it does make me a little bit sad
01:44:43.080 | that they kind of fade.
01:44:44.400 | But then as I get older, you get to use those moments.
01:44:48.400 | You realize that, use them well, you know?
01:44:50.920 | When you feel great, when you're focused,
01:44:52.960 | all that kind of stuff, use them well.
01:44:54.320 | - Yeah.
01:44:55.160 | - 'Cause the mind is a rollercoaster.
01:44:56.000 | - Yes, it's true.
01:44:57.000 | That's partly why I do this work.
01:44:58.080 | I feel like my work is therapy.
01:45:00.400 | I don't know if you feel that way.
01:45:02.120 | - Work is therapy.
01:45:03.280 | This work, not work in general.
01:45:04.920 | (laughs)
01:45:06.120 | Thinking about the deep questions,
01:45:08.720 | thinking about the nature of the universe,
01:45:11.680 | thinking about consciousness, even meditation.
01:45:14.560 | I mean, I got into meditation.
01:45:16.640 | To me, it's interesting.
01:45:19.040 | To me, I think a lot of meditators feel this way about it,
01:45:24.040 | but I think just, I'm thinking about it
01:45:25.920 | from the perspective of someone who hasn't meditated before,
01:45:30.280 | but it feels like a scientific experiment.
01:45:32.920 | It feels like it's the same physicist in me
01:45:36.560 | who was drawn to meditation
01:45:39.360 | because the experience is one of getting closer
01:45:44.360 | to your experience and asking similarly deep questions,
01:45:48.960 | like what is time?
01:45:50.640 | What does that even mean?
01:45:51.640 | What do I mean by time?
01:45:53.320 | What does it feel like?
01:45:54.520 | What is a thought is one of the most
01:45:57.040 | interesting questions to me.
01:45:58.960 | - I mean, how do you meditate?
01:45:59.880 | Let's talk about this.
01:46:00.840 | What you let go of time.
01:46:03.120 | - Well, I'm not really doing anything.
01:46:05.600 | I mean, the exercise is really so simple.
01:46:08.680 | It's just paying attention
01:46:10.240 | to your present moment experience,
01:46:12.280 | and it's an extremely challenging thing to do.
01:46:15.760 | It's not the natural state of the brain.
01:46:18.480 | It's an exercise in concentration,
01:46:22.840 | which is why athletes and other people
01:46:26.720 | who spend a lot of time needing to focus intensely
01:46:31.280 | find it so useful.
01:46:32.320 | I mean, it's really a focus, a concentration practice,
01:46:36.840 | but all it is really, I mean, there are different ways,
01:46:41.440 | there are different methods,
01:46:42.880 | but it really is quite simple at its core,
01:46:47.840 | which is just paying close attention
01:46:50.480 | to your present moment experience.
01:46:52.480 | And so in Vipassana,
01:46:56.120 | which is what I've mostly been trained in,
01:46:58.960 | you're usually paying attention to the breath,
01:47:01.640 | but there's always some focus of concentration.
01:47:04.360 | And the focus can even be just an open awareness,
01:47:07.320 | just watching your mind go,
01:47:09.920 | just what comes into your experience.
01:47:12.120 | And part of that is the mind,
01:47:13.000 | part of it is the external world.
01:47:14.640 | So you hear a sound, you think a thought,
01:47:18.560 | you feel a feeling, your cheek is itching.
01:47:22.280 | Am I gonna scratch it?
01:47:23.480 | Am I not gonna scratch it?
01:47:25.040 | Just like, sounds like the most boring thing in the world.
01:47:28.560 | And what's interesting is,
01:47:30.760 | paying close attention to the most boring thing in the world
01:47:35.760 | is incredibly fascinating.
01:47:38.160 | Noticing that each breath, no two breaths are the same,
01:47:41.600 | that time keeps moving, that your thoughts keep appearing.
01:47:46.600 | It's there, yeah, I mean, it's a spiritual practice
01:47:51.200 | for a reason. - You notice more
01:47:52.640 | and more beautiful things about the simpler, simpler things.
01:47:55.520 | Yeah, it's great.
01:47:56.960 | I like to do that, I don't meditate.
01:47:59.080 | I've tried a few times, and I will.
01:48:03.400 | But I meditate, I do meditate, but not,
01:48:06.720 | I meditate by thinking about a thing.
01:48:08.880 | And like holding onto that thing.
01:48:13.120 | And just like, it's not really, I guess,
01:48:16.680 | technically meditation, but it's keeping a focus on an idea,
01:48:19.800 | and then you walk with it,
01:48:22.400 | and you solve the little puzzle of it.
01:48:25.040 | - Yeah, no, it's-- - Especially any kind
01:48:26.280 | of programming or math stuff,
01:48:28.240 | you're holding stuff in your head,
01:48:29.760 | but don't look stuff up, don't take notes.
01:48:34.760 | You're only allowed to have your mind, and that's it.
01:48:39.000 | - You would really enjoy a meditation retreat.
01:48:41.600 | I mean, you would also not enjoy it.
01:48:43.560 | (laughs)
01:48:44.400 | It would be hard-- - Probably go nuts.
01:48:45.240 | - Because it's all, you wouldn't go nuts.
01:48:46.800 | It would be hard, but you would get a ton out of it.
01:48:49.920 | - What's a meditation retreat?
01:48:51.000 | Is it usually silent, or?
01:48:52.520 | - It is always silent, or actually,
01:48:54.680 | at least the one I would recommend you do
01:48:56.600 | is a silent meditation retreat.
01:48:59.120 | Five days. - How long?
01:49:00.440 | Five days, okay.
01:49:02.320 | - We'll talk later, but you might be my next victim.
01:49:05.040 | I have-- (laughs)
01:49:07.440 | - Five days is a long time.
01:49:09.360 | - It's a long time.
01:49:10.800 | - To just sit.
01:49:11.800 | - It changes your brain.
01:49:12.880 | It's the type of experience
01:49:14.320 | that will change your brain permanently.
01:49:17.160 | - There's been like two, three, four hour sessions
01:49:19.400 | of thinking that break in. - And you don't have children.
01:49:20.840 | You should do it. - I don't have children.
01:49:22.840 | Oh, the children--
01:49:24.040 | - Leaving them for five days and not speaking,
01:49:27.120 | and impossible.
01:49:28.760 | I've only done one retreat since I had kids.
01:49:31.440 | I'm doing another one soon, but only two nights.
01:49:35.280 | - Maybe that's what the thoughts will be coming in my head.
01:49:37.320 | You should be getting married.
01:49:39.200 | You should have kids. - That's okay.
01:49:40.240 | Whatever, let the thoughts be.
01:49:42.240 | - I think kids always-- - You'll get really good
01:49:43.840 | at letting things just be,
01:49:45.840 | and focusing on the present moment.
01:49:48.520 | And you might come out with some epiphany
01:49:50.000 | about what you should do next.
01:49:52.280 | - Yeah.
01:49:53.200 | No, I love the idea, obviously.
01:49:56.360 | I love the idea.
01:49:57.200 | I love the idea, you know, I fast for three days.
01:49:59.360 | I wanna fast for longer.
01:50:00.760 | That's also in a different way, perhaps,
01:50:03.880 | but it brings you, makes you more sensitive
01:50:06.520 | to the world around you somehow.
01:50:07.800 | I'm not exactly sure what the chemistry of that is,
01:50:10.880 | but obviously, actually, it's not obvious,
01:50:13.880 | 'cause you're not always that hungry.
01:50:16.200 | But you're more, time slows down, and you feel things.
01:50:19.780 | You feel a breeze, all this kind of stuff.
01:50:22.040 | It's very interesting.
01:50:23.560 | You've, I think, tweeted something about ideas
01:50:26.280 | coming out from, sometimes, feeling about
01:50:30.080 | coming from outside of you sometimes.
01:50:32.600 | So you mentioned as you meditate,
01:50:34.320 | you notice these ideas come in.
01:50:37.320 | So thoughts, ideas, how did that connect to consciousness?
01:50:42.320 | - So the thing I was responding to that you wrote,
01:50:48.700 | I think I was partly picking up on the part of you
01:50:56.720 | that would really get a lot out of a meditation retreat.
01:51:02.120 | (laughs)
01:51:02.960 | That was my way of beginning that conversation.
01:51:07.240 | That experience you had of a thought
01:51:12.240 | coming from somewhere else,
01:51:14.320 | when you spend an extended period of time
01:51:19.320 | paying close attention to your moment-to-moment experience,
01:51:22.440 | that's how all of your thoughts appear to you.
01:51:26.220 | And it's really beautiful, - Why is that?
01:51:29.600 | - because you're letting go.
01:51:31.640 | Just through the practice of meditation,
01:51:34.400 | you're quieting down your default mode network.
01:51:38.960 | And without necessarily intellectually thinking yourself
01:51:43.960 | out of free will, it naturally kind of drops away.
01:51:49.240 | And so when you're under the spell of this illusion
01:51:53.600 | that you are the author of your thoughts
01:51:55.440 | and your conscious experience is driving
01:51:58.000 | all of your behavior, and there's this eye
01:51:59.720 | that stands somewhere near your brain
01:52:04.000 | but is not your brain that stands free
01:52:06.440 | of the physical world, is the thing generating the thoughts,
01:52:09.800 | when you're meditating, that quiets down
01:52:16.520 | and can kind of quiet down completely
01:52:18.940 | so that your experience is just of the next thing arising
01:52:23.360 | in your conscious awareness.
01:52:24.920 | - But the source of that is still this brain.
01:52:29.000 | - What you realize is the source of it
01:52:30.880 | is not your conscious experience.
01:52:33.400 | - And that's the important insight.
01:52:36.520 | And that's the insight, and so there are many insights
01:52:39.760 | you can have in meditation that align with the science,
01:52:42.440 | which is what's really fascinating,
01:52:43.960 | because it doesn't have to be that way.
01:52:45.240 | Like I can imagine finding meditation
01:52:47.240 | to be extremely useful and helping me with anxiety
01:52:49.960 | and all the rest and having all kinds of insights
01:52:52.000 | that turn out to not be true.
01:52:53.960 | But the interesting thing is that these insights
01:52:56.400 | actually turn out to be true.
01:52:57.920 | And so that is one of them,
01:53:00.720 | is the, when you're just watching
01:53:05.720 | what your conscious experience actually is,
01:53:09.640 | you realize that it's not doing all of the things
01:53:12.320 | you usually feel like it's doing.
01:53:14.120 | And so the thoughts really just arise
01:53:18.640 | in much the same way that a sound or a sight or a feeling,
01:53:23.640 | you know, maybe your leg starts to hurt,
01:53:27.560 | when you're just watching moment by moment by moment,
01:53:30.800 | pain arises, a bird chirping arises,
01:53:34.440 | a thought arises, a feeling arises.
01:53:37.560 | You're just kind of watching it all unfold.
01:53:41.480 | And there's something really beautiful about that.
01:53:44.480 | - Yeah, it's the perspective you could take on
01:53:49.480 | is there's a connectedness to the entirety of the universe,
01:53:53.280 | like to nature in general.
01:53:55.120 | - But there's something so beautiful about consciousness,
01:53:58.000 | about the fact that it's not just a dead universe
01:54:02.320 | with atoms doing their thing,
01:54:04.200 | that at least in this one instance,
01:54:07.560 | there is a felt experience of the universe.
01:54:12.520 | - Of the universe, it's not even individual.
01:54:14.240 | - And I'm part of the universe, yeah.
01:54:16.120 | There's a, right here in this little point in space and time,
01:54:19.520 | there is an experience of the universe.
01:54:21.760 | - But it's still interesting to think about
01:54:23.480 | where those ideas,
01:54:25.920 | if those ideas are solely a construction of the brain,
01:54:28.680 | or is there some kind of mechanism
01:54:30.880 | of joint collective intelligence of humans
01:54:33.560 | as social organisms, where those,
01:54:36.080 | like how much of it is me training my neural network
01:54:42.840 | and the ideas of tens of thousands of other people?
01:54:46.080 | And how much is it myself?
01:54:46.920 | - You're talking like in terms of psychic phenomena,
01:54:49.520 | or you're talking in terms of just absorbing
01:54:52.120 | the information of the past and education
01:54:55.360 | and just kind of our collective human project
01:54:59.800 | that gets in throughout our lives.
01:55:01.480 | - I don't know much about psychic phenomena,
01:55:03.680 | but I also wanna be open-minded
01:55:05.560 | in the way we speak about collective intelligence,
01:55:07.800 | 'cause it's very easy to simplify it to,
01:55:10.040 | it's a neural network trained on knowledge
01:55:13.120 | developed over generations and so on.
01:55:16.040 | It does feel like intelligence is stored
01:55:20.480 | in some kind of distributed fashion across humans.
01:55:24.680 | Like if you take one out,
01:55:26.080 | I think that intelligence quickly goes down.
01:55:30.840 | - I don't know how quickly it goes down
01:55:32.000 | if you just take one out, it depends on which one.
01:55:35.160 | I think I half agree and half disagree
01:55:37.080 | with what you're saying.
01:55:38.280 | But yeah, I mean the other thing you notice
01:55:41.240 | when you spend a lot of time in meditation,
01:55:43.600 | and when you spend a lot of time kind of shaking up
01:55:45.480 | these intuitions that I think get in the way
01:55:47.760 | of clearly thinking about what consciousness is,
01:55:52.760 | is that we are these systems in nature
01:55:58.720 | that are not at all isolated.
01:56:01.200 | And there are the obvious ways,
01:56:02.640 | like if I just stop drinking water,
01:56:04.720 | that's gonna change the system very drastically, right?
01:56:09.520 | So there's just the energy consumption,
01:56:12.720 | but the fact that we exchange ideas
01:56:16.320 | is part of who I am is everyone I've interacted with.
01:56:21.320 | And of course the people I interact more with
01:56:24.800 | have sculpted me more, but our brains are sculpted
01:56:28.520 | through our interactions with each other as well.
01:56:32.160 | - Yes, but I wonder if it's a more correct
01:56:35.760 | and useful perspective to take
01:56:38.320 | that those interactions are the organisms.
01:56:44.160 | You're saying you're still making the brain the primary.
01:56:48.080 | There could be like,
01:56:49.360 | that the brain is what it is
01:56:54.480 | because of the social interactions,
01:56:58.280 | and the social interactions are the living organism.
01:57:00.960 | That's a weird perspective 'cause it's so much more--
01:57:05.920 | - I don't actually think it's one or the other.
01:57:08.960 | - They're both? - Yeah.
01:57:09.800 | - They're both living? - Yeah.
01:57:11.360 | - It's like cats and dogs?
01:57:13.840 | (laughs)
01:57:14.680 | - Yeah, I mean, it's a little bit like,
01:57:16.640 | I have two children and a lot of people
01:57:18.720 | with two children will say,
01:57:20.040 | like when you're preparing to have the second one
01:57:22.800 | and soon after you've had the second one,
01:57:25.280 | that having two is kind of like having three
01:57:28.320 | because you are nourishing and protecting
01:57:33.320 | and overseeing each individual life,
01:57:36.520 | but then there's the sibling relationship,
01:57:39.520 | which is almost another thing.
01:57:43.000 | - Yeah, it's weird.
01:57:44.000 | So you've spoken with Don Hoffman a few times.
01:57:48.000 | - Yes.
01:57:48.840 | - In his book, "Case Against Reality."
01:57:49.680 | - Many more than a few, yeah.
01:57:51.000 | - Many more than a few.
01:57:52.040 | There's a lot of fun ones.
01:57:53.560 | Was there one where Sam was involved?
01:57:55.960 | - Well, Sam and I interviewed him.
01:57:57.080 | Yeah, sorry, most of the conversations I've had with him
01:57:59.280 | are private, they're not public,
01:58:00.600 | but we used to meet, before the pandemic,
01:58:02.280 | we were meeting about monthly to discuss ideas.
01:58:06.480 | - I would love to be a fly on the wall of those discussions,
01:58:09.120 | but he wrote a book, "Case Against Reality."
01:58:11.800 | - Yeah.
01:58:12.640 | - Makes the case that our perception
01:58:14.640 | is completely detached from objective reality.
01:58:18.640 | Can you explain his perspective and let us know?
01:58:23.440 | Well, no, no, maybe not fully,
01:58:25.200 | but to which degree you agree and don't.
01:58:29.720 | So this is much more focused.
01:58:31.920 | I guess you guys have an agreement
01:58:33.640 | that consciousness is somehow fundamental.
01:58:36.200 | - Yeah, I mean, I think we both think we might be wrong.
01:58:40.800 | - About the consciousness or about reality?
01:58:42.200 | - About it being fundamental.
01:58:43.680 | I think we're both just,
01:58:45.200 | we both agree that this is a legitimate question to ask
01:58:50.200 | at this point in science, is consciousness fundamental?
01:58:53.480 | And I really see it as a question, and I think he does too.
01:58:57.440 | - But he goes hard on reality.
01:58:59.420 | - Yes, and it's interesting because I, you know,
01:59:03.440 | especially, so I actually now have recorded
01:59:05.460 | three conversations with him for this project
01:59:08.140 | I'm working on, yeah.
01:59:09.720 | And in every conversation we have,
01:59:13.160 | we seem to land on the same place,
01:59:14.480 | but this last conversation we had,
01:59:16.160 | it seemed to be even more clear
01:59:18.120 | that the semantics really get in the way.
01:59:21.520 | When you get into the weeds in these conversations,
01:59:24.880 | it's almost like we need some new terminology
01:59:29.240 | because it's hard to know sometimes
01:59:32.420 | whether we're talking about the same thing.
01:59:34.560 | I have issues with his terminology
01:59:38.880 | that when we talk about what his terminology represents,
01:59:43.880 | it seems like we completely agree.
01:59:46.640 | - But the conclusions you don't?
01:59:49.800 | - It's possible we have a very similar view of the universe
01:59:54.680 | if consciousness is fundamental.
01:59:56.240 | It may be an identical view.
01:59:57.960 | It's hard for me to know because I disagree
02:00:00.880 | with a lot of his terminology.
02:00:02.800 | - Okay, but our four-dimensional reality,
02:00:05.160 | he says that's like a complete space-time.
02:00:09.680 | It's a complete weird construction that--
02:00:12.160 | - Yeah, well, I mean, the truth is that,
02:00:14.760 | I mean, if you talk to a neuroscientist like Anil Seth,
02:00:17.680 | and I would say most neuroscientists,
02:00:19.920 | but he's really good on this subject,
02:00:21.920 | and his expertise and his area of focus is in perception.
02:00:26.560 | So he talks a lot about how our perceptions
02:00:29.920 | give us an experience of the world,
02:00:31.920 | and he calls it a controlled hallucination.
02:00:34.600 | I'm sorry, he probably got,
02:00:37.200 | I think he says that he got that term from someone else,
02:00:39.840 | but that's the term he uses.
02:00:41.360 | - We got every term from somewhere else.
02:00:43.560 | - Right, that's true.
02:00:44.400 | - Everything, there's no new ideas.
02:00:46.000 | - Right.
02:00:47.080 | There's a sense in which what Hoffman is saying
02:00:49.960 | is already, we already know to be the case.
02:00:52.720 | So our brains are creating this conscious experience
02:00:57.160 | based on these interactions with the outside world.
02:01:01.200 | It is in some sense all a controlled hallucination.
02:01:05.800 | And someone like Anil Seth,
02:01:07.160 | so from the neuroscientific point of view,
02:01:09.120 | I actually have a quote here somewhere
02:01:10.800 | if you have any interest in hearing the quote,
02:01:13.880 | but he's essentially saying,
02:01:16.400 | everything we experience is a perception,
02:01:20.080 | including our experience of time and space.
02:01:23.560 | So we still don't really know what our experience of space
02:01:29.200 | represents out there in the world.
02:01:32.320 | And then of course, when you talk to physicists
02:01:34.280 | about the different interpretations of quantum mechanics,
02:01:36.480 | I mean, where physics seems to be headed
02:01:39.800 | across the board at this point
02:01:41.280 | is that space and time are emergent,
02:01:44.640 | that they're not part of the fundamental fabric of reality.
02:01:48.720 | And so there's some ways in which Don is saying things that--
02:01:53.720 | - Is he being too poetic about it?
02:01:58.600 | Is that the right way to phrase it?
02:02:00.560 | 'Cause like-- - No, go ahead.
02:02:02.280 | - He says like, it's not that our perception
02:02:06.040 | is just a controlled hallucination.
02:02:08.760 | Well-- - No, it's not.
02:02:10.240 | He's saying something more than that.
02:02:11.520 | - More than that. - That's true.
02:02:13.200 | But my point is that a lot of what he's already saying,
02:02:15.920 | on some level, science is already there
02:02:18.800 | and could agree with.
02:02:20.240 | - Yeah, but not all the way.
02:02:23.480 | 'Cause he's saying like,
02:02:26.760 | that we don't even,
02:02:28.600 | the evolutionary process has constructed
02:02:33.680 | our brain mechanisms in such a way
02:02:35.560 | that we're really far from having access
02:02:38.360 | to objective reality. - Yes.
02:02:41.120 | Although I think we already know that as well.
02:02:43.120 | I mean, if any version of string theory is correct,
02:02:46.320 | and of course, we don't know yet, it's all up for grabs,
02:02:48.760 | but the truth is each theory is weirder than the last.
02:02:53.800 | If there are 15 dimensions of space,
02:02:57.280 | we are just not,
02:02:59.000 | we're not wired to be able to understand
02:03:04.360 | the fundamental reality.
02:03:06.480 | - But I think we have a consistent abstraction
02:03:10.560 | that seems to be reliable.
02:03:13.280 | - Yes. - Like a blockchain.
02:03:15.240 | - Yes, and he's not just saying
02:03:16.680 | that we really only have this tiny window onto reality.
02:03:19.560 | He's saying that that window onto reality
02:03:22.400 | is giving us a lot of false information.
02:03:24.440 | It's not true. - Yeah, it's false.
02:03:25.480 | It's not just an abstraction, it's false.
02:03:27.680 | Because he's saying there's no reason it needs to be true.
02:03:30.780 | It's not required to be true.
02:03:37.360 | And in fact, there's, through natural selection,
02:03:40.720 | it's very possible to imagine,
02:03:43.560 | or it's likely to imagine that organisms
02:03:45.720 | will evolve in such a way that you're going to just
02:03:49.360 | be lying to yourself completely.
02:03:52.760 | But the question there is, if that's the case,
02:03:55.160 | it's a really interesting thing to think about.
02:03:57.000 | I think the regular way in which he approaches it
02:03:59.640 | is really admirable.
02:04:02.440 | I do think it's scientific.
02:04:04.880 | But it's,
02:04:07.700 | the question for me is why is it so consistent
02:04:11.280 | across all of these organisms?
02:04:13.600 | We all seem to see the table,
02:04:15.680 | and feel, and run into the table.
02:04:18.320 | - So what he will agree, so what I would say to that,
02:04:20.480 | and when I pose this to him,
02:04:21.680 | I really don't wanna speak for him.
02:04:23.480 | (laughing)
02:04:24.440 | But I'll answer it myself and say that I believe he agrees
02:04:28.440 | with what I'm about to say.
02:04:30.480 | Which is that the things we perceive
02:04:34.640 | are connected to the structure of reality.
02:04:39.440 | It's just that the structure of reality
02:04:41.600 | is made of something completely different
02:04:43.880 | than the thing we're experiencing.
02:04:46.000 | So imagine, if you just go with the holographic principle.
02:04:49.560 | Loosely, and actually, the holographic principle
02:04:54.560 | applies to black holes only.
02:04:56.480 | So there's ADS-CFT duality, anti-de Sitter space,
02:05:00.960 | and conformal field theory.
02:05:02.920 | Am I getting all these terms right?
02:05:04.680 | - The terms are right, but I can't believe
02:05:06.600 | we're going there.
02:05:07.580 | - Well, I mean, this is where I've gone
02:05:09.280 | in all of my conversations with physicists,
02:05:10.820 | because the idea is, so if we just have the basic principle
02:05:14.880 | that reality and all of the information can be contained,
02:05:19.560 | or is actually in a two-dimensional space
02:05:23.680 | that gets projected, this is something that you don't buy,
02:05:26.160 | based on the look on your face.
02:05:28.040 | - No, no, no, I'm actually freaking out,
02:05:30.080 | because yes, any theory of modern physics
02:05:33.640 | gives inkling that reality's very weird.
02:05:36.840 | - Right, and completely different from how we experience it.
02:05:39.520 | - That's one example.
02:05:40.720 | - So this is an intuition that, for whatever reason,
02:05:43.760 | has always felt true to me.
02:05:46.000 | This is the way I thought about things as a child.
02:05:48.480 | I've met other people that felt this way
02:05:50.200 | when I've had experiences in psychedelics,
02:05:52.320 | and this is where I start to sound crazy, too.
02:05:54.120 | (laughs)
02:05:55.520 | - Nope.
02:05:56.360 | Everybody else is crazy, except us.
02:06:00.120 | - But that has always seemed right to me,
02:06:04.440 | and that's always the thing
02:06:05.960 | that I feel like I'm looking for.
02:06:08.600 | It's funny, recently I was thinking
02:06:10.200 | that it's as if I feel like I'm,
02:06:13.840 | and this is more how I was thinking
02:06:15.400 | of how I felt as a child,
02:06:16.520 | but I feel this way a lot as an adult, too,
02:06:19.720 | that the image is one of a snow globe,
02:06:23.400 | that I'm confined to this snow globe
02:06:26.800 | based on my human perceptions,
02:06:29.160 | and the truth of reality is out there,
02:06:32.120 | and it's actually why I'm so drawn to shaking intuitions.
02:06:34.600 | I feel like every time we shake up an intuition,
02:06:36.400 | it's like an opportunity to leave the snow globe
02:06:38.680 | for a moment.
02:06:40.760 | It's like smashing the marbles and seeing,
02:06:42.720 | oh, it's not liquid in there like I thought.
02:06:44.600 | It's getting this glimpse of something truer
02:06:48.200 | than what we typically experience.
02:06:50.720 | - I feel like it's for a long time gonna be snow globes
02:06:53.840 | inside snow globes inside snow globes.
02:06:55.960 | - But the larger point is that, yes,
02:06:58.280 | whatever is true about the fundamental nature of reality
02:07:00.640 | is not something we're experiencing.
02:07:03.160 | However, it is linked and gives us clues to it.
02:07:07.880 | So one image I came up with recently,
02:07:10.360 | I actually wrote about this.
02:07:11.520 | I have an article in Nautilus about time,
02:07:14.640 | because I was, as I spend time thinking about
02:07:17.520 | what it would mean for consciousness to be fundamental,
02:07:20.960 | and at the same time, I'm talking to physicists
02:07:23.680 | about different interpretations of quantum mechanics
02:07:27.200 | and the fact that the ones I'm talking to
02:07:30.440 | believe that space and time are emergent
02:07:33.400 | and are not part of the fundamental story.
02:07:36.520 | I was thinking about what is it,
02:07:39.400 | what could time be if it's not the way we experience it?
02:07:46.040 | What could it be pointing to?
02:07:48.160 | And I'm not the first person to think like this.
02:07:51.720 | Many people have developed
02:07:55.120 | different thought experiments around this,
02:07:56.680 | but, and this is, I'm not saying this is the way things are,
02:07:59.760 | but this is just one solution is that time and causality
02:08:05.760 | appear to us the way they do,
02:08:09.120 | because for whatever reason,
02:08:10.760 | we're only perceiving one moment at a time.
02:08:13.920 | And these connections between events
02:08:19.840 | that we perceive as time are actually
02:08:24.320 | just part of the fabric of reality.
02:08:25.880 | There's some structure to reality at a deeper level
02:08:29.800 | where it's like shining a flashlight
02:08:33.200 | on the structure of reality
02:08:34.360 | where for us, for whatever reason,
02:08:36.840 | everything else disappears.
02:08:38.120 | And the only thing that exists
02:08:39.360 | is that single pinprick of light
02:08:44.360 | that we happen to be inhabiting or that we can perceive,
02:08:48.000 | but the rest of it is there.
02:08:49.480 | And so that even though time would be an illusion
02:08:52.800 | and the causality in the way we experience it,
02:08:55.880 | it is an illusion,
02:08:57.280 | or it doesn't mean what we think it means,
02:09:00.640 | it's still pointing to a deeper structure.
02:09:02.520 | There's something that it corresponds to
02:09:04.680 | in the fundamental nature of reality.
02:09:06.360 | And I've had enough conversations with Don, I think,
02:09:10.040 | to know that he would agree with that,
02:09:15.840 | that our perceptions map onto something.
02:09:20.840 | It's just not the experience of it that we're having.
02:09:26.480 | So to go back to the idea that our perception
02:09:31.480 | of the idea that all of reality
02:09:33.440 | could be contained in two dimensions,
02:09:36.400 | and there's something about the interaction
02:09:38.440 | between different points that caused this holograph
02:09:43.440 | so that it seems like there's a three-dimensional world
02:09:45.840 | when in fact it's a projection
02:09:48.000 | of this two-dimensional surface.
02:09:49.680 | What we experience as space still references something
02:09:59.400 | at the fundamental level.
02:10:03.200 | It's just that it's not space.
02:10:05.680 | And that is something that makes a lot of sense to me.
02:10:08.160 | I also, I posted an excerpt,
02:10:09.960 | George Musser wrote a great book,
02:10:12.400 | "Spooky Action at a Distance."
02:10:14.040 | "Spooky Action at a Distance."
02:10:15.860 | And he talks about, he's a great science writer,
02:10:20.720 | and he talks about ways to kind of absorb
02:10:25.280 | what this would mean, this ADS/CFT duality.
02:10:29.360 | And he talks about, he gives an example of music
02:10:33.120 | as an analogy, that two different notes can exist
02:10:37.200 | in three dimensions as if the other doesn't exist
02:10:40.700 | because of the frequency of the sound waves.
02:10:44.560 | And that in another way, you can think of the sound waves
02:10:47.300 | existing in different dimensions.
02:10:50.280 | I don't know if that's, I--
02:10:54.560 | - Yeah, that's really interesting.
02:10:56.000 | - I don't speak as well as I write.
02:10:57.760 | - You speak beautifully.
02:11:00.040 | - I've written about this in a way
02:11:00.860 | that I think is easier to absorb
02:11:03.160 | than the way I just described it.
02:11:04.280 | - But I think causality is the trickiest one,
02:11:07.640 | trickier one.
02:11:08.640 | Time is a tricky one to like,
02:11:10.880 | who, boy, this--
02:11:13.040 | - And there are physicists who think that space is emergent
02:11:15.680 | but time is still fundamental,
02:11:16.960 | and Lee Smolin is one of those scientists.
02:11:19.080 | And it's really interesting to talk to him about this.
02:11:22.320 | But yeah, so--
02:11:23.160 | - Time being emergent
02:11:24.720 | is a really trippy one to think about.
02:11:29.280 | Also, I wonder if it's possible,
02:11:32.320 | at which point does the experience of time
02:11:36.760 | start becoming a part of the conscious experience
02:11:41.000 | of living organisms?
02:11:42.600 | So is it something that evolved on Earth only, or is it--
02:11:47.000 | - It's also very hard to think about consciousness
02:11:49.040 | without time.
02:11:50.640 | And that's something that's really interesting
02:11:52.120 | for me to think about too.
02:11:54.040 | Although, not that this is scientific evidence of anything,
02:11:58.520 | but I and many others have had the experience,
02:12:01.680 | a timeless, spaceless experience
02:12:04.280 | in certain states of meditation
02:12:06.920 | and under the influence of psychedelics.
02:12:09.160 | - And that's still a conscious experience, would you say?
02:12:11.120 | - Yeah, absolutely.
02:12:12.480 | - But didn't you say that some aspect
02:12:14.840 | of conscious experience is memory?
02:12:17.320 | - It seems like that too.
02:12:18.720 | No, no, so I said an experience of being a self
02:12:22.560 | is due to memory.
02:12:23.800 | It seems that consciousness and time
02:12:27.880 | are inextricably linked,
02:12:29.320 | but I think that may be an illusion also.
02:12:32.600 | And when I think about consciousness being fundamental,
02:12:36.640 | and someone like Max Tegmark,
02:12:39.880 | I don't know if there are other mathematicians,
02:12:41.920 | I'm sure there are, he's the only one I know of
02:12:43.920 | who will talk about mathematical forms
02:12:48.400 | and shapes as not just being,
02:12:51.480 | he talks about them as being actual objects in nature
02:12:56.480 | that exist, that are not just mathematical structures
02:13:02.800 | that we can think about,
02:13:04.920 | but any mathematical structure that comes out of the math
02:13:09.280 | actually exists in reality.
02:13:12.040 | And so when I think about consciousness being fundamental,
02:13:17.720 | I think about physics and mathematics
02:13:21.520 | being a description of the structure of it.
02:13:26.160 | And that when mathematicians say things like that,
02:13:29.200 | or physicists say things like that,
02:13:30.960 | it makes sense if we're talking about
02:13:36.400 | a conscious experience of some sort.
02:13:40.640 | - Yeah, that's really interesting.
02:13:41.920 | First of all, Max is great.
02:13:43.320 | Man, this is really interesting to think about
02:13:47.320 | how, what is fundamental.
02:13:50.600 | It's a good exercise to do in general,
02:13:52.720 | to truly think through it.
02:13:54.760 | I mean, ultimately it's a very humbling process
02:13:56.680 | 'cause we're probably in the very early days of,
02:13:59.600 | well, we can't know currently, right?
02:14:01.120 | - Right, currently.
02:14:02.040 | I mean, maybe permanently, but I remain optimistic.
02:14:07.400 | - Right, well, to jump around a little bit,
02:14:11.320 | the Google AI engineer,
02:14:15.520 | using the terms from the press is kind of hilarious.
02:14:18.240 | - Why, is this a friend of yours?
02:14:22.160 | - No, it's not, no, no.
02:14:24.640 | But just, the term AI is really not used
02:14:27.800 | amongst machine learning people.
02:14:29.120 | - Oh, I see, okay.
02:14:30.120 | - So I'm using kind of Google AI engineer,
02:14:33.600 | and this is like sentience and chatbot,
02:14:36.280 | and none of those words are really used
02:14:39.720 | by the people that actually build them.
02:14:42.000 | You're much more likely to use language model
02:14:44.520 | versus chatbot or natural language dialogue
02:14:49.520 | versus chatbot or whatever, and certainly not sentience.
02:14:53.080 | But that's the point.
02:14:54.280 | I mean, sometimes the difference between
02:14:56.760 | the public discourse and the engineering
02:14:59.560 | is actually really important
02:15:00.400 | 'cause engineering tends to want to ignore the magic.
02:15:04.280 | They don't notice the magic.
02:15:06.780 | Anyway, the Google AI engineer
02:15:10.200 | believes that the Lambda 1
02:15:14.600 | natural language system achieved sentience.
02:15:20.600 | I don't know if you paid attention to that.
02:15:22.200 | - I didn't. - You didn't.
02:15:23.640 | But the general question is,
02:15:26.840 | do you think a chatbot, do you think a robot
02:15:29.480 | could be conscious?
02:15:30.960 | - So, I mean, this answer is slightly different
02:15:33.560 | or very different depending on whether
02:15:35.880 | I kind of follow the assumption
02:15:37.800 | that consciousness emerges at some point
02:15:40.680 | in physical processing or whether it's fundamental.
02:15:43.760 | Since I've just chosen to stay on the fundamental channel,
02:15:49.420 | I mean, then it's kind of a silly answer
02:15:54.320 | because if consciousness is fundamental
02:15:56.600 | in the way I currently think about it,
02:15:59.280 | the only way I imagine it working,
02:16:04.420 | every physical thing we perceive
02:16:08.460 | is a representation of a conscious experience.
02:16:11.060 | So, I mean, yes, that's true of everything in the world.
02:16:15.420 | However, I would say if that's the case,
02:16:17.920 | even though there's a way in which it's behaving
02:16:23.020 | in similar ways to a human being,
02:16:25.660 | the way it's constructed, what it is actually made of
02:16:29.780 | and the physics of it is so different
02:16:32.860 | that I would expect it to have
02:16:34.700 | an entirely, completely non-human conscious experience.
02:16:40.780 | And whether it even feels like a self,
02:16:44.220 | I think would be a big question mark.
02:16:47.900 | - Well, there's questions of ethics
02:16:49.340 | and is it capable of suffering?
02:16:52.860 | Is suffering connected to--
02:16:55.980 | - Consciousness.
02:16:56.820 | - Consciousness or--
02:16:57.660 | - I mean, obviously it is.
02:16:58.740 | It's the only way you can suffer is in a consciousness.
02:17:02.460 | - Maybe it's not, maybe it's not.
02:17:03.820 | Maybe it's more connected to self than consciousness.
02:17:08.580 | - I would say, I mean, just on my own use of these words,
02:17:12.920 | suffering is only something that can happen
02:17:15.420 | in a conscious experience.
02:17:17.220 | - Right, so can robots suffer?
02:17:18.880 | - If they have a, anything that has a conscious experience
02:17:23.900 | can experience suffering, yes.
02:17:25.700 | - But do plants suffer in the same?
02:17:29.580 | So is there some level where when we construct
02:17:34.220 | our morals and ethics that,
02:17:38.500 | is there a class of conscious experiences
02:17:42.300 | or organisms that are capable of conscious experience
02:17:45.660 | that we can anthropomorphize sufficiently
02:17:48.340 | such that we give them rights?
02:17:50.080 | - Yeah, I mean, this is not an area that--
02:17:55.420 | - For physics. - I have spent, for me.
02:17:59.020 | I have not spent a lot of time thinking about this.
02:18:01.940 | Most people expect that I have.
02:18:03.820 | It's interesting, these types of questions
02:18:06.220 | are much less interesting to me than the other questions,
02:18:09.020 | and I think it's because I'm interested
02:18:11.340 | in the physics of things.
02:18:13.100 | I'm somewhat interested, I'm definitely interested
02:18:15.460 | in ethical questions for human beings,
02:18:18.660 | but I have spent very little time thinking
02:18:21.100 | about the implications for other types of intelligence.
02:18:25.380 | I will say that I think the capacity for suffering
02:18:30.060 | of a conscious system goes up with memory
02:18:39.460 | and with a sense of self.
02:18:42.900 | So if you take, if anesthesia only erased your memory
02:18:48.860 | and it didn't actually make you unconscious,
02:18:53.060 | you actually experienced, horrifically experienced
02:18:56.740 | some surgical procedure, but we could completely wipe out
02:19:00.060 | your memory of it, as nightmarish as that scenario is,
02:19:05.060 | and I'm not suggesting we should ever do this,
02:19:08.380 | I would say if our only option were to erase
02:19:12.140 | your memory of it, that would be the more ethical thing
02:19:15.940 | to do than to have you maintain that memory
02:19:20.140 | because the suffering is then carried
02:19:22.540 | across a longer distance through time.
02:19:26.980 | - That's presuming that suffering is unethical.
02:19:30.300 | - Well, isn't that what ethics is all about?
02:19:33.060 | It's about suffering.
02:19:34.100 | I mean, I think, to me, ethics is all about suffering
02:19:38.540 | and well-being, and I don't know what ethics is
02:19:40.980 | without that.
02:19:42.340 | - There's different measures of suffering,
02:19:43.700 | so having one traumatic event, if you erase
02:19:47.760 | that one traumatic event, that potentially might have
02:19:50.420 | negative, unmet consequences for the growth of a human being.
02:19:55.500 | - Yeah, so then it's a different question,
02:19:57.700 | but I would say that memory increases suffering globally
02:20:02.700 | so that if any moment of suffering only existed for itself
02:20:11.480 | in the present moment, that is a lesser kind of suffering
02:20:18.600 | than a suffering that is drawn out over time through memory.
02:20:23.600 | - So hard to think about, yeah.
02:20:25.560 | - And so, yeah, I mean, in terms of AI,
02:20:28.000 | if they're conscious and there's a sense of self
02:20:31.480 | and memory, which I actually think you need memory
02:20:34.680 | to have a sense of self.
02:20:36.000 | Actually, sorry, I take that back.
02:20:39.120 | I actually think you can have a really primitive
02:20:41.600 | sense of self without memory.
02:20:44.480 | But an AI that is conscious, that has memory
02:20:48.520 | and a sense of self, yeah, that's capable of suffering.
02:20:53.520 | Absolutely.
02:20:54.680 | - Well, one of the things, 'cause you said
02:20:57.040 | you haven't really looked into this area
02:20:59.600 | because there's so many interesting things to look into,
02:21:03.560 | and you're really focused on the physics side.
02:21:05.120 | To me, the neuroscience experiments that you mentioned
02:21:09.400 | where there's a difference between the timing of things
02:21:11.800 | that kind of reveal there's something here,
02:21:13.960 | to me, working with robots.
02:21:15.760 | I have robots that are moving around my home in Austin.
02:21:20.640 | It's a very good embodied thought experiment.
02:21:27.000 | Like, here's a thing that looks like it has a free will.
02:21:32.000 | It looks like it has conscious experiences.
02:21:37.760 | And then I know how it's programmed.
02:21:40.680 | And so I have to go back and forth.
02:21:43.840 | - I know.
02:21:44.680 | - And it's, this is what I do.
02:21:46.520 | You lay on the ground looking up at the stars
02:21:49.680 | thinking about plants, and I look at a robot like--
02:21:53.920 | - Well, you can do this with plants too.
02:21:55.320 | I mean, there's some complex enough behavior
02:21:58.840 | that looks like free will from a certain angle.
02:22:01.560 | And it makes you wonder two things.
02:22:04.680 | One, is there consciousness associated with that processing?
02:22:08.620 | And two, if there isn't,
02:22:13.160 | what does that say about our experience?
02:22:17.720 | And--
02:22:18.560 | - Both are complex thoughts.
02:22:22.000 | - Our circumstance in nature, what does that say?
02:22:24.120 | Yeah.
02:22:24.960 | But yeah, I do that with plants all the time.
02:22:27.840 | I go back and forth.
02:22:28.680 | But the zombie thought experiment now,
02:22:32.640 | at least for me, is often presented as AI
02:22:35.640 | because now that's easier, as a robot,
02:22:37.680 | because that's easier.
02:22:41.960 | I don't know if it's just because it's in pop culture now
02:22:44.880 | in the form of films and television shows,
02:22:47.000 | but it's easier to get to that point of contemplation,
02:22:52.000 | I think, by imagining a robot.
02:22:58.440 | - I don't know why exactly I'm bothered
02:23:00.500 | by philosophers talking about zombies
02:23:03.360 | because it feels like they're missing,
02:23:06.720 | it's like talking about,
02:23:09.820 | it's reducing a joyful experience.
02:23:14.820 | So that's like talking about,
02:23:18.300 | listen, when you fall in love with somebody,
02:23:20.460 | the other person is a zombie.
02:23:21.740 | You don't know if they're conscious or not.
02:23:23.700 | You're just making presumptions and so on.
02:23:25.540 | It's like, it says philosophers will do this kind of thing.
02:23:28.940 | They might as well be a zombie.
02:23:30.060 | Or there's no such thing as love,
02:23:32.020 | it's just a mutual, like economists will reduce love
02:23:35.860 | to some kind of mutual calculation
02:23:37.540 | that minimizes risk and stability over time.
02:23:40.580 | It's like, all right,
02:23:42.940 | what I wanna do with each of those people
02:23:45.700 | is I wanna find every one of those philosophers
02:23:49.060 | that talk about zombies
02:23:50.700 | and eventually give them one of those robots
02:23:52.620 | and watch them fall in love
02:23:54.300 | and see how their understanding of how humble they are
02:23:59.300 | by how little we understand.
02:24:02.380 | - That's the point of the zombie experiment.
02:24:04.060 | I mean, the zombie thought experiment.
02:24:06.380 | I mean, I can't speak for any of them.
02:24:08.660 | - Empathy for zombies?
02:24:09.500 | Is that the point? - No.
02:24:10.820 | So for me, I mean, I don't like spending much time on it.
02:24:13.620 | I think it has limited use for sure.
02:24:15.780 | And I understand your annoyance with it.
02:24:18.540 | But for me, what's so useful about it
02:24:20.940 | is it gets you to ask the same questions you're asking
02:24:24.700 | when you're looking at robots.
02:24:26.200 | If you just run the experiment and you say,
02:24:29.460 | okay, I'm sitting here with Lex.
02:24:31.540 | What if I try to trick myself?
02:24:34.180 | What's different about the world
02:24:36.220 | if someone tells me he's actually a robot?
02:24:40.060 | That's essentially what the zombie thought experiment is.
02:24:42.380 | He's over there.
02:24:43.340 | He has no conscious experience.
02:24:44.780 | He's acting all serious, but there's no experience there.
02:24:47.900 | So it gets you to ask some interesting questions.
02:24:49.800 | One is, okay, when it seems impossible,
02:24:52.760 | I just think, no, that makes no sense.
02:24:54.460 | I can't even imagine that.
02:24:56.620 | Okay, what do I think consciousness is responsible for?
02:25:00.480 | What is consciousness doing in that human over there
02:25:04.560 | that is Lex, such that I can't fathom all of your behavior
02:25:09.560 | and everything that you're doing
02:25:13.340 | coming about without consciousness?
02:25:14.840 | So it gets you to ask this question.
02:25:16.820 | These are the questions I begin my book with.
02:25:19.900 | What is consciousness doing?
02:25:21.780 | It gets you to ask that question in a deeper way.
02:25:24.440 | And then I kind of found this alternate use for it,
02:25:27.060 | I don't know if other people have done this,
02:25:31.320 | which is even more useful to me,
02:25:33.180 | which is that sometimes
02:25:36.080 | I'm able to just sit with someone
02:25:37.860 | and get my imagination going
02:25:41.500 | and imagine there really is no conscious experience there
02:25:45.740 | in that person.
02:25:47.380 | And what happened for me the first few times
02:25:49.540 | I was able to do this is it reminded me exactly
02:25:53.420 | of how I feel when I look at complex plant behavior
02:25:57.620 | and other behaviors in nature
02:25:59.340 | where I assume there's no conscious experience.
02:26:02.820 | And to me it just flips everything on its head.
02:26:07.380 | It just gets you to be able,
02:26:09.660 | it gets you to be open to possibilities
02:26:12.300 | that you were closed to before.
02:26:14.480 | And I think that's useful.
02:26:16.340 | - Does it enhance or dissipate your capacity
02:26:21.340 | for love of other human beings?
02:26:24.300 | What role does love play in the human condition, Annaka?
02:26:27.620 | - I mean, in so many ways it's the most important role.
02:26:32.020 | I don't think any of these realizations diminish it,
02:26:36.000 | I mean, if anything I think it enhances it.
02:26:41.920 | But I don't think they,
02:26:46.820 | I mean, it kind of goes back to the levels of usefulness.
02:26:50.740 | - Sometimes you wanna picture your friends as a plant.
(Annaka laughs)
02:26:54.380 | It's helpful.
02:26:55.220 | It's helpful to appreciate the beauty
02:26:58.140 | of what they are as an organism.
02:26:59.700 | - Yeah, I mean, I don't know.
02:27:01.300 | For me, the more time I spend practicing meditation,
02:27:06.300 | seeing through these illusions,
02:27:10.420 | the more poignant my conscious experience becomes.
02:27:16.500 | And love is obviously one of the most powerful
02:27:20.980 | and one of the most positive experiences we have.
02:27:24.220 | And I don't know, there's just,
02:27:26.900 | whatever its cause is,
02:27:28.460 | there's just something miraculous about it
02:27:31.380 | in and of itself and for itself.
02:27:34.340 | - I think love, romantic love, is a beautiful thing.
02:27:37.060 | Connection, friendship is a beautiful thing.
02:27:38.780 | And it's so interesting how people can grow together,
02:27:41.940 | how they interact together, disagree together,
02:27:44.660 | and make each other better.
02:27:46.580 | Like scientific collaborations are like this too.
02:27:49.400 | Daniel Kahneman and Amos Tversky, for example.
02:27:54.000 | And most people are not able to do that
02:27:56.980 | in the scientific realm.
02:27:58.200 | The more successful you become,
02:28:00.380 | the more solid-- - No, it's rare.
02:28:01.260 | And you recognize it when you have it,
02:28:02.760 | when you have a great collaboration.
02:28:04.060 | I mean, in science, but also in other areas.
02:28:06.540 | In this TED production I'm working on,
02:28:08.980 | I just happened to be working with this producer
02:28:12.500 | where we had this instant connection,
02:28:14.660 | and the chemistry's great,
02:28:15.840 | and I have so much fun recording with her.
02:28:19.220 | It's so great to have, I usually work alone,
02:28:21.620 | and it's been wonderful to have this partner.
02:28:24.100 | - It's like a chat, it's like a conversation type of thing?
02:28:26.380 | - Yeah, she's taking my conversation,
02:28:28.280 | we're playing around with it,
02:28:29.300 | we're just working on the pilot.
02:28:30.680 | - I love how you have no idea how it's gonna turn out.
02:28:33.160 | This is great.
02:28:34.000 | - Well, I just started working without a clear
02:28:39.900 | image of the end result.
02:28:43.880 | Although it started with an idea for a film.
02:28:47.000 | I don't know, I guess I have a feeling,
02:28:48.960 | I was just wondering if I'd talked about this
02:28:51.080 | with you before anywhere, but probably not.
02:28:53.680 | Yeah, 'cause you and I have never spoken before.
02:28:55.320 | - No, we just met, we just know each other.
02:28:57.560 | - You mean you didn't see me
02:28:58.400 | when I was listening to that podcast of yours?
02:29:01.360 | And have that thought, you didn't hear that thought?
02:29:03.920 | - I mean, we were mentioning this offline,
02:29:05.480 | as a small tangent, there's a cool dynamic
02:29:07.360 | in how we get to become really close friends
02:29:10.080 | without ever having met or talked.
02:29:14.000 | One-way friendships can form,
02:29:15.920 | and it's a beautiful thing.
02:29:17.680 | I think, I don't know, that makes me feel
02:29:20.920 | like we're all connected,
02:29:22.360 | and you're almost like plugging into some kind of weird
02:29:25.240 | thing in the space of ideas.
02:29:27.880 | - So many things I wanna say now,
02:29:28.920 | but here's one thing: the way I think about consciousness,
02:29:32.920 | if it's fundamental, is analogous to a pot of boiling water,
02:29:37.920 | where the water is the consciousness,
02:29:40.760 | and the bubbles are the conscious experiences.
02:29:43.520 | And so, it is all one thing,
02:29:46.840 | and then there are these shapes that take form.
02:29:49.280 | And there's a felt experience, right?
02:29:51.920 | It's all felt experience.
02:29:53.320 | So, when we're able to let go of this sense of self,
02:29:58.320 | or this illusion of self,
02:30:00.880 | the idea that experiences are happening to something,
02:30:05.120 | or to someone, drops out,
02:30:07.040 | and what you get is just experiences arising.
02:30:10.720 | So, there's the fundamental nature of the universe,
02:30:13.600 | which obviously has a structure and obeys laws,
02:30:16.720 | but what you get out of that are appearances
02:30:20.720 | of different conscious experiences.
02:30:22.440 | They're just coming into being, right?
02:30:26.080 | And so, there is, under that view,
02:30:29.920 | I mean, there are different ways to look at
02:30:31.440 | the fundamental nature of reality without consciousness,
02:30:33.600 | and kind of come up with a similar view,
02:30:35.080 | but in that view, it is just
02:30:39.120 | one thing with different experiences popping up.
02:30:43.160 | - And in that boiling pot is a lobster,
(Annaka laughs)
02:30:46.920 | which represents the human condition.
02:30:48.880 | - The devil.
02:30:50.240 | - Because life is suffering.
02:30:52.080 | I don't know if you've read the essay
02:30:53.360 | by David Foster Wallace, "Consider the Lobster."
02:30:55.240 | I mean, the stuff that we do to lobsters
02:30:58.680 | is fascinatingly horrible.
02:31:01.240 | - Oh, yeah, no.
02:31:02.280 | I mean, that was my first rejection of many worlds,
02:31:05.360 | just my psychological rejection of it,
02:31:07.880 | was just imagining the multiplication of all the suffering.
02:31:12.880 | I just, I mean, I spent a lot of time
02:31:16.440 | thinking about, consumed by,
02:31:20.400 | and trying not to be overwhelmed
02:31:22.200 | by the depth of human suffering.
02:31:26.120 | To imagine many worlds with that is just--
02:31:28.240 | - Infinite suffering?
02:31:29.240 | - Oh, my God, yeah.
02:31:30.320 | - What is it about humans,
02:31:31.840 | I think you've spent too much time on Twitter,
02:31:34.040 | that is so focused on the suffering?
02:31:35.840 | I mean, there's also the awesomeness.
02:31:37.520 | - Yes.
02:31:38.360 | - And I think the awesomeness outpowers
02:31:39.200 | the suffering over time.
02:31:40.400 | - That's so nice.
02:31:41.520 | I wish I believed that.
02:31:43.400 | - With memory, as you said, the suffering is multiplied.
02:31:48.800 | It's an interesting thought,
02:31:51.720 | but with memory, beauty is multiplied as well.
02:31:56.720 | So it's like--
02:31:58.720 | - Yes, where I stand with it,
02:32:01.000 | and I'm, for some reason, still optimistic
02:32:04.080 | that we can get ourselves to a different place,
02:32:07.960 | is that the way things currently are,
02:32:11.640 | or the way things have always been for animals and humans,
02:32:15.000 | and I think any conscious life form,
02:32:17.720 | the suffering seems to me so much more impactful
02:32:22.720 | and powerful than any happy,
02:32:31.040 | for lack of a better word, experience,
02:32:33.120 | that no happy experience is worth
02:32:37.040 | its equivalent experience of suffering.
02:32:42.640 | - That's certainly how I feel as well,
02:32:46.920 | but I've learned not to trust my feelings.
02:32:49.880 | - Yeah, well.
02:32:50.800 | - So the folks who are religious will ask the question,
02:32:56.040 | which I think applies whether you're religious or not,
02:32:58.320 | why is there suffering in the world?
02:33:00.360 | Why does a just God allow suffering?
02:33:03.520 | Those kinds of questions.
02:33:04.760 | I think...
02:33:14.640 | It does seem that suffering is a deep part of human history,
02:33:18.440 | and you have to really think about that.
02:33:22.520 | - It's a part of nature.
02:33:23.480 | I mean-- - It's a part of nature.
02:33:24.720 | - If feeling good is surviving and thriving,
02:33:28.880 | nothing survives and thrives forever,
02:33:31.040 | so you just encounter suffering.
02:33:33.920 | It's just built in.
02:33:35.160 | - Yeah, death meets us all in the end,
02:33:37.880 | and only...
02:33:38.800 | It's kind of hilarious to then think about most of nature,
02:33:43.040 | the cruelty and the poverty of nature,
02:33:44.920 | like how horrible the conditions are for animals.
02:33:48.440 | - And plants.
02:33:49.280 | - And plants. - It's just war.
02:33:50.600 | It's just war all over the place.
02:33:53.240 | - But it's mostly, yeah, it's war,
02:33:55.400 | but it's also just, it's like poverty.
02:33:59.080 | It's extreme poverty.
02:34:01.000 | When people criticize farms and so on,
02:34:03.680 | you also have to consider the suffering of the animals.
02:34:06.200 | We try to imagine that animals in the woods
02:34:09.840 | are all having a happy time.
02:34:09.840 | No, it's like, you have to really consider,
02:34:12.160 | if you really asked an animal,
02:34:13.800 | would they like to sit in a boring zoo
02:34:16.080 | and be fed, away from the wild and nature
02:34:20.120 | and the freedom and so on?
02:34:21.600 | I don't know how many of them would choose the zoo
02:34:23.480 | versus nature.
Anyway, but what's the meaning of life, Annaka?
02:34:29.520 | Let me ask the question. - Is that a question?
02:34:30.840 | - Why? - For me?
02:34:31.880 | - Yes.
(Annaka laughs)
02:34:33.960 | There's no you.
02:34:35.440 | It's a question for whatever you're plugged into.
02:34:37.600 | - Is that a question for the body and mind system
02:34:40.600 | we call Annaka?
(Annaka laughs)
02:34:42.440 | Call Annaka and let's see what that--
02:34:44.000 | - The meaning of life?
02:34:45.360 | - Yeah, the why, the why.
02:34:47.880 | Why, is there a why? - It's interesting.
02:34:49.600 | I've never been drawn to the why questions.
02:34:52.420 | I'm interested in the what and the how.
02:34:55.820 | What is life?
02:34:59.440 | What is this place?
02:35:00.580 | What are we doing?
02:35:01.560 | How are we here?
02:35:02.720 | How is this taking place?
02:35:06.440 | But I mean, if I had to answer,
02:35:10.440 | I guess I don't think there is a why really.
02:35:13.680 | It's funny, the quote,
02:35:16.280 | the thought that comes to mind is really
02:35:20.440 | kind of a cheesy quote that I'm sure is printed
02:35:22.720 | on a bunch of mugs and T-shirts,
02:35:25.720 | but it's Thich Nhat Hanh.
(Annaka laughs)
02:35:29.080 | I'm gonna get it wrong,
02:35:29.920 | but it's something like,
02:35:31.220 | we're here to awaken from our illusion of separateness.
02:35:37.840 | And I don't really see that as an answer to the why question,
02:35:42.120 | although that's how it's framed in his quote.
02:35:45.240 | We are here for that purpose.
02:35:47.360 | I think if there is a purpose worth being here for,
02:35:53.200 | that's kind of the ultimate, I think.
02:35:55.840 | - Let me ask you for advice.
02:35:59.240 | You had a complex and a beautiful journey through life.
02:36:06.160 | You're exceptionally successful.
02:36:08.400 | What advice would you give to young folks
02:36:10.880 | in high school or in college
02:36:13.000 | about how to live a life like yours
02:36:15.800 | or how to live a life they can be proud of
02:36:19.680 | or have a career they can be proud of?
02:36:21.520 | How to pave a path and journey
02:36:25.560 | they can be happy with and be proud of?
02:36:28.600 | - I haven't really had this conversation with my kids.
02:36:31.200 | I mean, we have lots of deep conversations
02:36:33.880 | and they're all kind of pertaining to each moment
02:36:38.440 | or whatever they're facing.
02:36:40.160 | I think career is difficult because in so many ways,
02:36:45.960 | I just feel like I'm lucky that I ended up being able to do
02:36:49.200 | for a living the thing I love to do.
02:36:52.840 | But the truth is-- - There's no such thing as luck.
02:36:55.540 | - Yeah, well-- - Or free will.
02:36:57.200 | Luck is an illusion.
02:36:58.140 | - There's no such thing as luck
02:37:00.640 | when you believe in free will, right?
02:37:02.440 | - Right. (Annaka laughs)
02:37:03.920 | That's true. - They're all illusions.
02:37:05.560 | I really, in retrospect, started working on my book
02:37:10.560 | 30 years ago and had no idea that I was working on a book.
02:37:17.200 | And this kind of ties into my advice,
02:37:19.560 | which is I think it's really important
02:37:22.120 | to follow your passion and
02:37:26.200 | to find things that you love
02:37:31.040 | and that you find inspiring and motivating and exciting,
02:37:36.040 | whether they relate to your career or not.
02:37:41.080 | And I think many times, if you persist
02:37:46.080 | just for the pure passion of the thing itself,
02:37:52.040 | it finds a way into your everyday life.
02:37:57.080 | - The career manifests itself.
02:37:59.800 | - I mean, that's what's happened to me.
02:38:01.640 | And I've had such an unconventional path,
02:38:06.480 | it's very hard for me to give advice based on that path.
02:38:11.480 | But I do believe that it's extraordinarily important
02:38:15.920 | to keep your passions alive, to keep your curiosity alive,
02:38:19.880 | to keep your wonder at life alive, however you do that.
02:38:24.680 | And it doesn't necessarily have to be in your career.
02:38:28.560 | And I think for a lot of people,
02:38:29.880 | their career gives them the time and the space
02:38:33.800 | to experience other things that maybe wouldn't be
02:38:38.800 | as enjoyable if they were their career.
02:38:42.320 | - Yeah, I mean, in general, a dogged pursuit
02:38:45.080 | of the stuff you love will create something beautiful.
02:38:48.720 | And if it's an unconventional path, those are the best kind.
02:38:52.560 | Those are the most beautiful kind.
02:38:53.840 | And it created, in this case,
02:38:56.040 | I think you're a beautiful person, Annaka,
02:38:58.080 | beautiful mind. - Thank you.
02:38:59.320 | - Thank you so much for doing everything you do
02:39:03.640 | and for sharing it with the world.
02:39:05.480 | And thank you so much for talking with me today.
02:39:07.000 | That was awesome.
02:39:07.840 | Good to finally meet you.
02:39:08.880 | - Great to finally meet you.
02:39:11.000 | - Thanks for listening to this conversation
02:39:13.760 | with Annaka Harris.
02:39:13.760 | To support this podcast,
02:39:15.160 | please check out our sponsors in the description.
02:39:17.960 | And now, let me leave you with some words
02:39:20.160 | from Mahatma Gandhi.
02:39:21.940 | I will not let anyone walk through my mind
02:39:26.500 | with their dirty feet.
02:39:27.900 | Thank you for listening and hope to see you next time.
02:39:32.040 | (upbeat music)
02:39:37.200 | Thanks for watching.