
Andrew Huberman: Neuroscience of Optimal Performance | Lex Fridman Podcast #139


Chapters

0:00 Introduction
2:29 Fear
10:41 Virtual reality
14:25 Claustrophobia
16:13 Skydiving
17:48 Overcoming fears
22:48 Optimal performance
26:02 Deep work
41:27 Psychedelics
45:13 Deep work
58:53 Everything in the brain is an abstraction
66:11 Human vision system
77:47 Neuralink
105:17 Science of consciousness
120:05 David Goggins
137:09 Science communication
144:41 Man's Search for Meaning


00:00:00.000 | The following is a conversation with Andrew Huberman,
00:00:03.120 | a neuroscientist at Stanford,
00:00:05.240 | working to understand how the brain works,
00:00:07.600 | how it can change through experience,
00:00:09.640 | and how to repair brain circuits
00:00:11.960 | damaged by injury or disease.
00:00:14.600 | He has a great Instagram account @hubermanlab,
00:00:19.120 | where he teaches the world
00:00:20.440 | about the brain and the human mind.
00:00:22.680 | Also, he's a friend and an inspiration
00:00:26.520 | in that he shows that you can be humble,
00:00:28.760 | giving, and still succeed in the science world.
00:00:32.640 | Quick mention of each sponsor,
00:00:34.160 | followed by some thoughts related to the episode.
00:00:37.760 | Eight Sleep, a mattress that cools itself
00:00:40.120 | and gives me yet another reason to enjoy sleep.
00:00:43.000 | SEMrush, the most advanced SEO optimization tool
00:00:46.480 | I've ever come across,
00:00:48.040 | and Cash App, the app I use to send money to friends.
00:00:52.080 | Please check out these sponsors in the description
00:00:54.480 | to get a discount and to support this podcast.
00:00:57.920 | As a side note, let me say that I heard from a lot of people
00:01:00.880 | about the previous conversation I had with Yaron Brook
00:01:04.960 | about objectivism.
00:01:07.000 | Some people loved it, some people hated it.
00:01:10.080 | I misspoke in some parts,
00:01:12.000 | was more critical on occasion than I meant to be,
00:01:14.680 | didn't push on certain points that I should have,
00:01:17.160 | was undereducated or completely unaware
00:01:19.800 | about some major things that happened in the past
00:01:22.400 | or major ideas out there.
00:01:25.160 | I bring all that up to say
00:01:27.200 | that if we are to have difficult conversations,
00:01:30.000 | we have to give each other space to make mistakes,
00:01:33.040 | to learn, to grow.
00:01:35.560 | Taking one or two statements from a three-hour podcast
00:01:38.720 | and suggesting that they encapsulate who I am,
00:01:42.440 | I was, or ever will be,
00:01:44.800 | is a standard that we can't hold each other to.
00:01:48.680 | I don't think anyone could live up to that kind of standard.
00:01:51.240 | At least I know I can't.
00:01:53.160 | The conversation with Yaron is mild
00:01:56.240 | relative to some conversations
00:01:57.760 | that I will likely have in the coming year.
00:02:00.520 | Please continue to challenge me,
00:02:02.320 | but please try to do so with love and with patience.
00:02:06.120 | I promise to work my ass off to improve.
00:02:10.560 | Whether I'm successful at that or not, we shall see.
00:02:14.080 | If you enjoy this thing, subscribe on YouTube,
00:02:16.320 | review it with 5 Stars on Apple Podcasts,
00:02:18.440 | follow on Spotify, support on Patreon,
00:02:21.000 | or connect with me on Twitter @lexfridman.
00:02:24.920 | And now, here's my conversation with Andrew Huberman.
00:02:28.960 | You've mentioned that in your lab at Stanford,
00:02:32.800 | you induce stress by putting people into a virtual reality
00:02:36.680 | and having them go through one of a set of experiences.
00:02:40.400 | I think you mentioned this on Rogan or with Whitney,
00:02:42.920 | that scare them.
00:02:44.760 | So just on a practical, psychological level,
00:02:49.240 | and maybe on a philosophical level,
00:02:51.280 | what are people afraid of?
00:02:54.240 | What are the fears?
00:02:55.160 | What are these fear experiences
00:02:57.440 | that you find to be effective?
00:02:59.360 | - Yeah, so it depends on the person, obviously.
00:03:02.040 | And we should probably define fear, right?
00:03:04.200 | 'Cause you can, without going too far down the rabbit hole
00:03:07.560 | of defining these things,
00:03:10.160 | you can't really have fear without stress,
00:03:13.080 | but you could have stress without fear.
00:03:15.400 | And you can't really have trauma without fear and stress,
00:03:18.120 | but you could have fear and stress without trauma.
00:03:20.320 | So, we can start playing the word game,
00:03:22.160 | and that actually is one of the motivations
00:03:24.240 | for even having a laboratory that studies these things,
00:03:26.520 | is that we really need better physiological,
00:03:30.400 | neuroscientific, and operational definitions
00:03:32.920 | of what these things are.
00:03:33.880 | I mean, the field of understanding emotions and states,
00:03:38.880 | which is mainly what I'm interested in, is very complicated.
00:03:42.480 | But we can do away with a lot of complicated debate
00:03:46.520 | and say, in our laboratory,
00:03:48.440 | what we're looking for to assign it a value of fear
00:03:52.720 | is a big inflection in autonomic arousal,
00:03:56.160 | so increases in heart rate, increases in breathing,
00:03:59.440 | perspiration, pupil dilation,
00:04:01.600 | all the hallmark signature features of the stress response.
00:04:05.500 | And in some cases,
00:04:07.900 | we have the benefit of getting neurosurgery patients
00:04:10.800 | where we've got electrodes in their amygdala
00:04:12.320 | and their insula and the orbital frontal cortex
00:04:15.360 | down beneath the skull.
00:04:16.600 | So these are chronically implanted electrodes,
00:04:18.700 | we're getting multi-unit signals,
00:04:20.440 | and we can start seeing some central features
00:04:24.120 | of meaning within the brain.
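As a rough illustration of the operational definition just described (a large inflection in autonomic arousal read out from heart rate, breathing, perspiration, and pupil dilation), here is a minimal sketch of how such an inflection could be flagged from those signals. The equal weighting, the baseline window, and the threshold are assumptions made up for illustration, not the lab's actual pipeline.

```python
import numpy as np

def zscore(x):
    """Standardize a 1-D signal against its own mean and spread."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / (x.std() + 1e-9)

def arousal_inflection(heart_rate, breathing_rate, skin_conductance,
                       pupil_diameter, baseline_samples=60, threshold=2.0):
    """Flag a 'big inflection in autonomic arousal': True if a composite
    index rises well above its baseline. All inputs are 1-D time series
    on a common clock; weights and threshold are illustrative guesses."""
    composite = (zscore(heart_rate) + zscore(breathing_rate) +
                 zscore(skin_conductance) + zscore(pupil_diameter)) / 4.0
    baseline = composite[:baseline_samples]
    rest = composite[baseline_samples:]
    # Inflection = any later sample exceeding the baseline mean by
    # `threshold` baseline standard deviations.
    rise = (rest - baseline.mean()) / (baseline.std() + 1e-9)
    return bool(np.any(rise > threshold))
```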
00:04:26.200 | And what's interesting is that,
00:04:29.180 | as trivial as it might seem in listening to it,
00:04:33.860 | almost everybody responds to heights
00:04:38.440 | and falling from a high virtual place
00:04:42.400 | with a very strong stress, if not fear response.
00:04:45.920 | And that's because the visual vestibular apparati, right?
00:04:49.800 | The optic flow and how it links
00:04:51.460 | to the balanced semicircular canals of the inner ears,
00:04:54.040 | all this technical stuff,
00:04:55.280 | but really all of that pulls all your physiology,
00:05:00.280 | the feeling that your stomach is dropping,
00:05:02.260 | the feeling that suddenly you're sweating
00:05:04.080 | even though you're not afraid
00:05:05.100 | of falling off this virtual platform,
00:05:06.920 | but you feel as if you're falling
00:05:09.400 | because of the optic flow, that one is universal.
00:05:12.780 | So we've got a dive with great white sharks experience
00:05:15.920 | where you actually exit the cage.
00:05:17.720 | We went out and did this in the real world
00:05:19.820 | and brought back 360 video that's built out pretty-
00:05:23.020 | - Oh, so this is actually 360 video.
00:05:24.700 | - 360 video. - That's awesome.
00:05:25.540 | - And this was important to us, right?
00:05:27.420 | So when we decided to set up this platform,
00:05:29.820 | a lot of the motivation was that a lot of the studies
00:05:33.020 | of these things in laboratories,
00:05:35.820 | I don't want to call them lame
00:05:36.820 | because I want to be respectful
00:05:37.940 | of the people that did this stuff before,
00:05:39.500 | but they'd study fear by showing subjects
00:05:42.020 | a picture of a bloody arm or a snake
00:05:44.820 | or something like that.
00:05:46.300 | And it just, unless you have a snake phobia,
00:05:48.340 | it just wasn't creating a real enough experience.
00:05:51.240 | So we need to do something where people
00:05:52.340 | aren't going to get injured,
00:05:53.700 | but where we can tap into the physiology
00:05:55.580 | and that thing of presence of people momentarily,
00:05:58.680 | not the whole time, but momentarily
00:06:00.940 | forgetting they're in a laboratory.
00:06:02.380 | And so heights will always do it.
00:06:04.940 | And if people want to challenge me on this,
00:06:07.280 | I like to point to that movie "Free Solo,"
00:06:09.820 | which was wild because it's an incredible movie,
00:06:13.100 | but I think a lot of its popularity
00:06:14.700 | can be explained by a puzzle,
00:06:17.080 | which is you knew he was going to live
00:06:19.960 | when you walked in the theater
00:06:21.380 | or you watched it at home.
00:06:23.020 | You knew before that he survived,
00:06:25.460 | and yet it was still scary that people somehow
00:06:28.540 | were able to put themselves into that experience
00:06:30.620 | or into Alex's experience enough
00:06:32.800 | that they were concerned or worried
00:06:36.020 | or afraid at some level.
00:06:37.280 | So heights always does it.
00:06:39.640 | If we get people who have generalized anxiety,
00:06:41.860 | these are people who wake up and move through life
00:06:45.420 | at a generally higher state of autonomic arousal
00:06:47.940 | and anxiety, then we can tip them
00:06:50.260 | a little bit more easily with things
00:06:51.700 | that don't necessarily get everyone afraid.
00:06:54.180 | Things like claustrophobia, public speaking,
00:06:57.940 | that's going to vary from person to person.
00:07:01.540 | And then if you're afraid of sharks,
00:07:03.460 | like my sister, for instance, is afraid of sharks,
00:07:04.980 | she won't even come to my laboratory
00:07:06.780 | because there's a thing about sharks in it.
00:07:09.540 | That's how terrified some people are
00:07:11.300 | of these specific stimuli.
00:07:13.400 | But heights gets them every time.
00:07:15.400 | - Yeah, I'm terrified of heights.
00:07:17.340 | - We have you step off a platform, virtual platform,
00:07:22.500 | and it's a flat floor in my lab, but you're up there.
00:07:26.780 | - Well, you actually allow them the possibility
00:07:28.920 | in the virtual world to actually take the leap of faith.
00:07:31.960 | - Yeah, maybe I should describe
00:07:33.060 | a little bit of the experiment.
00:07:34.000 | So without giving away too much
00:07:36.480 | in case someone wants to be a subject
00:07:37.760 | in one of these experiments,
00:07:39.360 | we have them playing a cognitive game.
00:07:40.920 | It's a simple lights out kind of game
00:07:43.020 | where you're pointing a cursor
00:07:44.480 | and turning out lights on a grid,
00:07:46.040 | but it gets increasingly complex and it speeds up on them.
00:07:49.400 | And there's a failure point for everybody
00:07:51.700 | where they just can't make the motor commands fast enough.
00:07:54.120 | And then we surprise people essentially
00:07:57.640 | by placing them virtually.
00:07:59.760 | All of a sudden, they're on a narrow platform
00:08:02.840 | between two buildings.
00:08:04.440 | And then we encourage them or we cue them
00:08:07.800 | by talking to them through a microphone
00:08:10.200 | to continue across that platform to continue the game.
00:08:13.700 | And some people, they actually will get down on the ground
00:08:18.700 | and hold onto a virtual beam
00:08:21.720 | that doesn't even exist on a flat floor.
00:08:24.080 | And so what this really tells us is the power of the brain
00:08:27.560 | to enter these virtual states as if they were real.
00:08:31.040 | And we really think that anchoring the visual
00:08:33.200 | and the vestibular, the balance components
00:08:35.800 | of the nervous system are what bring people
00:08:37.940 | into that presence so quickly.
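A minimal sketch of the protocol structure described above: a simple task that speeds up until the subject hits a failure point, at which moment the scene switches to the height exposure. The function names, speed-up factor, and failure criterion are hypothetical, chosen only to make the closed-loop structure concrete.

```python
import random

def respond_in_time(interval):
    """Placeholder for the real motor-response check: simulate a subject
    whose success rate falls as the allowed interval shrinks below ~0.5 s."""
    return random.random() < min(1.0, interval / 0.5)

def run_lights_out_until_failure(initial_interval=1.5, speedup=0.93,
                                 max_misses=3):
    """Speed up a simple 'lights out' task until the subject can no longer
    keep up, then return the interval at which they failed."""
    interval, misses = initial_interval, 0
    while misses < max_misses:
        hit = respond_in_time(interval)   # hypothetical response check
        misses = 0 if hit else misses + 1
        interval *= speedup               # the game speeds up each trial
    return interval                       # the subject's failure point

if __name__ == "__main__":
    cliff = run_lights_out_until_failure()
    print(f"Failure point at ~{cliff:.2f}s per move; "
          "now switch the VR scene to the narrow platform.")
```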
00:08:40.200 | There's also the potential, and we haven't done this yet,
00:08:42.280 | to bring in 360 sound.
00:08:44.520 | So the reason we did 360 video
00:08:46.160 | is that when we started all this back in 2016,
00:08:49.360 | a lot of the VR was pretty lame, frankly.
00:08:51.320 | It was CGI.
00:08:52.840 | It just wasn't real enough.
00:08:54.480 | But with 360 video,
00:08:56.680 | we knew that we could get people into this presence
00:08:59.040 | where they think they're in a real experience more quickly.
00:09:01.360 | And our friend, Michael Muller,
00:09:03.040 | who I was introduced to because of the project,
00:09:04.900 | I reached out to some friends.
00:09:05.960 | Michael Muller's a very famous
00:09:07.960 | portrait photographer in Hollywood,
00:09:09.760 | but he dives with great white sharks and he leaves the cage.
00:09:13.120 | And so we worked with him to build a 360 video apparatus
00:09:17.620 | that we could swim underwater with,
00:09:20.320 | went out to Guadalupe Island, Mexico,
00:09:22.120 | and actually got the experience.
00:09:24.320 | It was a lot of fun.
00:09:25.560 | There were some interesting moments out there of danger,
00:09:27.760 | but it came back with that video
00:09:29.520 | and built that for the sharks.
00:09:30.880 | And then we realized we need to do this for everything.
00:09:32.520 | We need to do it for heights.
00:09:33.520 | We need to do it for public speaking,
00:09:34.800 | for claustrophobia.
00:09:36.440 | And what's missing still is 360 sound
00:09:40.080 | where 360 sound would be, for instance,
00:09:43.960 | if I were to turn around
00:09:45.160 | and there was a giant attack dog there,
00:09:48.060 | the moment I would turn around and see it,
00:09:49.940 | the dog would growl.
00:09:50.960 | But if I turned back toward you, then it would be silent.
00:09:54.360 | So, and that brings a very real element
00:09:56.560 | to one's own behavior
00:09:58.560 | where you don't know what's gonna happen
00:10:00.340 | if you turn a corner.
00:10:01.180 | Whereas if there's a dog growling behind me
00:10:03.640 | and I turn around and then I turn back to you
00:10:05.800 | and it's still growling,
00:10:07.800 | that might seem like more of an impending threat,
00:10:10.320 | unsustained threat,
00:10:12.720 | but actually it's when you start linking
00:10:14.320 | your own body movements to the experience.
00:10:16.360 | So when it's closed loop,
00:10:18.240 | where my movements and choices
00:10:20.040 | are starting to influence things
00:10:21.560 | and they're getting scarier and scarier,
00:10:23.960 | that's when you can really drive people's nervous system
00:10:26.640 | down these paths of high states of stress and fear.
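The closed-loop element described here (a threat that growls only when you turn to face it) can be sketched as a simple gate on head orientation. The angle window and cosine ramp are arbitrary illustrative choices; a real VR audio engine would handle the actual spatialization.

```python
import math

def angular_difference(a_deg, b_deg):
    """Smallest absolute difference between two headings, in degrees."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def growl_gain(head_yaw_deg, threat_yaw_deg, window_deg=60.0):
    """Return a 0..1 loudness for the growl: loudest when the subject
    looks straight at the virtual dog, silent when facing away."""
    diff = angular_difference(head_yaw_deg, threat_yaw_deg)
    if diff > window_deg:
        return 0.0
    # Smooth ramp from silent at the edge of the window to full gain head-on.
    return 0.5 * (1.0 + math.cos(math.pi * diff / window_deg))

# Threat behind the subject (180 deg): facing forward is silent,
# turning around fades the growl in.
for yaw in (0, 90, 150, 180):
    print(yaw, round(growl_gain(yaw, 180.0), 2))
```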
00:10:30.040 | Now, we don't wanna traumatize people, obviously,
00:10:31.840 | but we also study a number of tools
00:10:34.800 | that allow them to calm themselves in these environments.
00:10:37.680 | So the short answer is heights.
00:10:39.520 | - Heights.
00:10:40.360 | - Yeah.
00:10:41.280 | - Well, from a psychology
00:10:43.160 | and from a neuroscience perspective,
00:10:44.480 | this whole construction that you've developed
00:10:46.040 | is fascinating.
00:10:46.900 | We did this a little bit with autonomous vehicles.
00:10:51.480 | So to try to understand the decision-making process
00:10:56.440 | of a pedestrian when they cross the road
00:10:58.440 | and trying to create an experience of a car
00:11:01.120 | that can run you over,
00:11:03.760 | so there's the danger there.
00:11:06.560 | I was so surprised how real that whole world was.
00:11:11.560 | And the graphics that we built
00:11:13.800 | wasn't ultra realistic or anything,
00:11:15.840 | but I was still afraid of being hit by a car.
00:11:18.960 | Everybody we tested was really afraid
00:11:20.920 | of being hit by that car.
00:11:22.040 | - Even though it was all a simulation?
00:11:23.240 | - It was all a simulation.
00:11:24.160 | It was kind of boxy, actually.
00:11:27.320 | I mean, it wasn't like ultra realistic simulation.
00:11:30.240 | It was fascinating.
00:11:31.600 | - Looms and heights.
00:11:33.480 | So any kind of depth,
00:11:35.680 | we're just programmed to not necessarily recoil,
00:11:40.680 | but to be cautious about that edge and that depth.
00:11:43.320 | And then looms, things coming at us that are getting larger.
00:11:46.160 | There are looming sensing neurons, even in the retina,
00:11:48.720 | at a very, very early stage of visual processing.
00:11:51.840 | And incidentally, the way Muller
00:11:56.280 | and folks learn how to not get eaten by great white sharks
00:11:59.800 | when you're swimming outside the cage
00:12:01.400 | is as they start lumbering in, you swim toward them.
00:12:04.520 | And they get very confused when you loom on them,
00:12:07.480 | because clearly you're smaller.
00:12:09.000 | Clearly they could eat you if they wanted to,
00:12:11.520 | but there's something about forward movement
00:12:13.920 | toward any creature that that creature questions
00:12:17.160 | whether or not it would be a good idea
00:12:18.800 | to generate forward movement toward you.
00:12:21.460 | And so that's actually the survival tool
00:12:23.640 | of these cage exit white shark divers.
00:12:26.040 | - Are you playing around with,
00:12:27.280 | like one of the critical things
00:12:28.400 | for the autonomous vehicle research
00:12:30.160 | is you couldn't do 360 video because there's a game
00:12:33.840 | theoretic, there's an interactive element
00:12:35.880 | that's really necessary.
00:12:37.200 | So maybe people realize this, maybe they don't,
00:12:40.720 | but 360 video, you obviously,
00:12:44.600 | well, it's actually not that obvious to people,
00:12:46.160 | but you can't change the reality that you're watching.
00:12:49.760 | - That's right.
00:12:50.600 | - So, but you find that that's,
00:12:54.280 | is there something fundamental about fear and stress
00:12:58.520 | that the interactive element is essential for?
00:13:00.800 | Or do you find you can arouse people with just the video?
00:13:05.120 | - Great question.
00:13:06.900 | It works best to use mixed reality.
00:13:09.520 | So we have a snake stimulus.
00:13:11.560 | I personally don't like snakes at all.
00:13:13.600 | I don't mind spiders.
00:13:14.520 | We also have a spider stimulus,
00:13:15.660 | but like snakes, I just don't like them.
00:13:17.820 | There's something about the slithering
00:13:19.960 | and it just creates a visceral response for me.
00:13:23.960 | Some people, not so much,
00:13:25.280 | and they have lower levels of stress and fear in there.
00:13:28.640 | But one way that we can get them to feel more of that
00:13:32.200 | is to use mixed reality,
00:13:33.880 | where we have an actual physical bat
00:13:37.080 | and they have to stomp out the snake,
00:13:39.340 | as opposed to just walk to a little safe corner,
00:13:42.320 | which then makes the snake disappear.
00:13:44.260 | That tends to be not as stressful
00:13:45.960 | as if they have a physical weapon.
00:13:48.040 | And so you've got people in there, you know,
00:13:49.520 | banging on the floor against this thing.
00:13:51.320 | And there's something about engaging
00:13:53.300 | that makes it more of a threat.
00:13:56.100 | Now, I should also mention,
00:13:57.480 | we always get the subjective report
00:14:00.040 | from the subject of what they experienced,
00:14:02.360 | because we never want to project our own ideas
00:14:05.520 | about what they were feeling.
00:14:06.600 | But that's the beauty of working with humans
00:14:08.260 | is you can ask them how they feel.
00:14:09.760 | - Exactly.
00:14:10.600 | - And humans aren't great at explaining how they feel,
00:14:13.640 | but it's a lot easier to understand what they're saying
00:14:16.320 | than a mouse or a macaque monkey is saying.
00:14:20.000 | So it's the best we can do is language,
00:14:22.680 | plus these physiological and neurophysiological signals.
00:14:25.480 | - Is there something you've learned about yourself,
00:14:27.280 | about your deepest fears?
00:14:28.880 | Like you said snakes, is there something that,
00:14:32.000 | like if I were to torture you, so I'm Russian,
00:14:34.800 | so, you know, I always kind of think,
00:14:37.960 | how can I murder this person that entered the room?
00:14:41.380 | But also how could I torture you
00:14:43.880 | to get some information out of you?
00:14:45.440 | What would I go with?
00:14:48.080 | - It's interesting you should say
00:14:48.920 | that I never considered myself claustrophobic,
00:14:51.800 | but, 'cause I don't mind small environments
00:14:55.040 | provided they're well ventilated.
00:14:58.000 | But I, before COVID, I started going to this Russian banya,
00:15:02.080 | you know, and then we jumped here.
00:15:03.800 | And I had never been to a banya.
00:15:05.200 | So, you know, the whole experience
00:15:06.520 | of really, really hot sauna.
00:15:08.640 | And what do they call it?
00:15:09.840 | The platza, they're hitting you with the leaves.
00:15:12.480 | And it gets really hot and humid in there.
00:15:15.120 | And there were a couple of times where I thought,
00:15:17.720 | okay, this thing is below ground.
00:15:20.440 | It's in a city where there are a lot of earthquakes.
00:15:23.640 | Like if this place crumbled and we were stuck in here,
00:15:27.140 | and I'd start getting a little panicky.
00:15:28.600 | And I realized, I'm like,
00:15:29.600 | I don't like small confined spaces with poor ventilation.
00:15:33.360 | So I realized, I think I have some claustrophobia.
00:15:35.600 | And I wasn't aware of that before.
00:15:37.020 | So I put myself into our own claustrophobia stimulus,
00:15:41.280 | which involves getting into an elevator
00:15:43.240 | and with a bunch of people, virtual people,
00:15:47.000 | and the elevator gets stalled.
00:15:49.440 | And at first you're fine, you feel fine.
00:15:52.440 | But then as we start modulating the environment
00:15:54.440 | and we actually can control levels of oxygen
00:15:57.040 | in the environment if we want to,
00:15:59.360 | it is really uncomfortable for me.
00:16:01.920 | And I never would have thought, you know,
00:16:03.200 | I fly, I'm comfortable in planes,
00:16:05.400 | but it is really uncomfortable.
00:16:07.280 | And so I think I've unhatched a bit of a claustrophobia.
00:16:11.520 | - Yeah. - Yeah.
00:16:12.360 | - Yeah, for me as well, probably.
00:16:14.720 | That one, that one is pretty bad.
00:16:16.280 | The heights I tried to overcome.
00:16:18.640 | So I went to skydiving to try to overcome
00:16:20.520 | the fear of heights, but that didn't help.
00:16:23.000 | - Did you jump out?
00:16:23.840 | - Yeah, I jumped out, but it was fundamentally
00:16:27.520 | different experience than, I guess there could be
00:16:31.240 | a lot of different flavors of fear of heights maybe.
00:16:33.960 | But the one I have didn't seem to be connected
00:16:38.760 | to jumping out of a plane.
00:16:40.480 | It's a very different, 'cause like once you accept it,
00:16:43.200 | you're going to jump, then it's a different thing.
00:16:47.040 | I think what I'm afraid of is the moments before it
00:16:52.040 | is the scariest part.
00:16:54.880 | - Absolutely.
00:16:55.720 | - And I don't think that's emphasized
00:16:57.720 | in the skydiving experience as much.
00:17:00.000 | And also just the acceptance of the fact
00:17:02.840 | that it's going to happen.
00:17:03.920 | So once you accept it is going to happen,
00:17:06.120 | it's not as scary.
00:17:07.560 | It's the fact that it's not supposed to happen
00:17:10.400 | and it might, that's the scary part.
00:17:12.160 | I guess I'm not being eloquent in this description,
00:17:15.640 | but there's something about skydiving
00:17:17.920 | that was actually philosophically liberating.
00:17:20.640 | I was like, wow, it was the possibility
00:17:25.360 | that you can walk on a surface.
00:17:27.680 | And then at a certain point, there's no surface anymore
00:17:30.640 | to walk on.
00:17:31.520 | And it's all of a sudden the world becomes three dimensional
00:17:35.080 | and there's this freedom of floating
00:17:37.240 | that the concept of like, of earth disappears
00:17:41.880 | for a brief few seconds.
00:17:43.240 | I don't know, that was--
00:17:45.120 | - Wild.
00:17:45.960 | - That was wild, but I'm still terrified of heights.
00:17:48.040 | So, I mean, one thing I want to ask just on fear,
00:17:52.400 | 'cause it's so fascinating is,
00:17:54.440 | have you learned anything about
00:17:56.920 | what it takes to overcome fears?
00:18:00.280 | - Yes.
00:18:01.120 | And that comes from two, from a research study standpoint,
00:18:05.560 | two parallel tracks of research.
00:18:07.000 | One was done actually in mice,
00:18:09.920 | 'cause we have a mouse lab also
00:18:11.320 | where we can probe around different brain areas
00:18:13.120 | and try and figure out what interesting brain areas
00:18:15.240 | we might want to probe around in humans.
00:18:17.440 | And a graduate student in my lab,
00:18:19.960 | she's now at Caltech, Lindsey Salay,
00:18:23.600 | published a paper back in 2018,
00:18:25.680 | showing that what at first might seem a little bit obvious,
00:18:29.680 | but the mechanisms are not,
00:18:31.640 | which is that there are really three responses to fear.
00:18:34.280 | You can pause, you can freeze essentially.
00:18:36.960 | You can retreat, you can back up, or you can go forward.
00:18:40.720 | And there's a single hub of neurons in the midbrain,
00:18:45.720 | it's actually not the midbrain,
00:18:46.840 | but it's in the middle of the thalamus,
00:18:49.000 | which is a forebrain structure.
00:18:50.600 | And depending on which neurons are active there,
00:18:54.760 | there's a much higher probability that a mouse,
00:18:56.720 | or, it turns out, a human will advance in the face of fear
00:19:00.280 | or will pause or will retreat.
00:19:03.000 | Now, that just assigns a neural structure
00:19:04.960 | to a behavioral phenomenon.
00:19:06.400 | But what's interesting is that it turns out
00:19:09.600 | that the lowest level of stress or autonomic arousal
00:19:13.560 | is actually associated with the pausing
00:19:15.360 | and freezing response.
00:19:16.800 | Then as the threat becomes more impending,
00:19:20.640 | and we used visual looms in this case,
00:19:22.720 | the retreat response has a slightly higher level
00:19:26.920 | of autonomic arousal and stress.
00:19:28.920 | So think about playing hide and go seek,
00:19:30.560 | and you're trying to stay quiet
00:19:32.120 | in a closet that you're hiding in.
00:19:34.560 | If you're very calm, it's easy to stay quiet and still.
00:19:37.560 | As your level of stress goes up,
00:19:39.280 | it's harder to maintain that level of quiet and stillness.
00:19:44.000 | You see this also in animals that are stalking,
00:19:46.080 | a cat will chatter its teeth.
00:19:47.400 | That's actually sort of top-down inhibition
00:19:49.680 | and trying to restrain behavior.
00:19:51.800 | So the freeze response is actually an active response,
00:19:55.000 | but it's fairly low stress.
00:19:56.760 | And what was interesting to us is that
00:19:58.720 | the highest level of autonomic arousal
00:20:01.000 | was associated with the forward movement toward the threat.
00:20:04.760 | So in your case, jumping out of the plane.
00:20:07.720 | However, the forward movement in the face of threat
00:20:11.520 | was linked to the activation of what we call collateral,
00:20:15.520 | which means just a side connection,
00:20:17.000 | literally a wire in the brain
00:20:18.800 | that connects to the dopamine circuits for reward.
00:20:21.360 | And so when one safely and adaptively,
00:20:25.520 | meaning you survive,
00:20:26.760 | moves through a threat or toward a threat,
00:20:29.840 | it's rewarded as a positive experience.
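To make the three-response picture concrete, here is a toy mapping from arousal level to freeze, retreat, or advance. The actual finding is probabilistic and depends on which neurons in this midline thalamic hub are active; the hard thresholds below are invented purely for illustration.

```python
def fear_response(arousal):
    """Toy mapping from autonomic arousal (0..1) to the three response
    modes described above; cutoffs are illustrative, not measured."""
    if arousal < 0.33:
        return "freeze"    # lowest arousal: pause, hold still
    if arousal < 0.66:
        return "retreat"   # intermediate arousal: back away
    return "advance"       # highest arousal: move toward the threat
                           # (tagged with dopamine-linked reward if survived)

print(fear_response(0.2), fear_response(0.5), fear_response(0.9))
```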
00:20:32.760 | And so the key,
00:20:34.120 | it actually maps very well to cognitive behavioral therapy
00:20:36.880 | and a lot of the existing treatments for trauma,
00:20:39.440 | is that you have to confront
00:20:41.840 | the thing that makes you afraid.
00:20:44.200 | So otherwise you exist in this very low level
00:20:46.800 | of reverberatory circuit activity
00:20:49.040 | where the circuits for autonomic arousal are humming
00:20:52.440 | and they're humming more and more and more.
00:20:54.240 | And we have to remember that stress and fear and threat
00:20:56.560 | were designed to agitate us so that we actually move.
00:21:00.680 | So the reason I mentioned this is I think
00:21:02.760 | a lot of times people think that
00:21:04.560 | the maximum stress response or fear response
00:21:07.800 | is to freeze and to lock up.
00:21:10.160 | But that's actually not the maximum stress response.
00:21:12.600 | The maximum stress response is to advance,
00:21:15.620 | but it's associated with reward.
00:21:17.320 | It has positive valence.
00:21:19.320 | So there's this kind of,
00:21:21.040 | everyone always thinks about the bell,
00:21:22.760 | the sort of hump shape curve for,
00:21:26.280 | at low levels of arousal performance is low
00:21:28.400 | and as it increases performance goes higher
00:21:30.040 | and then it drops off as you get really stressed.
00:21:31.920 | But there's another bump further out
00:21:34.520 | of the distribution where you perform very well
00:21:36.760 | under very high levels of stress.
00:21:38.440 | And so we've been spending a lot of time in humans
00:21:40.300 | and in animals exploring what it takes
00:21:42.640 | to get people comfortable to go to that place.
00:21:46.000 | And also to let them experience
00:21:48.960 | how there are heightened states of cognition there.
00:21:51.520 | There's changes in time perception that allow you
00:21:54.760 | to evaluate your environment
00:21:56.440 | at a faster frame rate, essentially.
00:21:59.520 | This is the matrix as a lot of people think of it.
00:22:03.240 | But we tend to think about fear as all the low level stuff
00:22:06.280 | where things aren't worked out.
00:22:07.900 | But there are many,
00:22:10.360 | there are a lot of different features to the fear response.
00:22:14.000 | And so we think about it quantitatively
00:22:15.980 | and we think about it from a circuit perspective
00:22:18.180 | in terms of outcomes.
00:22:20.680 | And we try and weigh that against the threat.
00:22:22.240 | So we never want people to put themselves
00:22:23.600 | in unnecessary risk, but that's where the VR is fun
00:22:25.960 | because you can push people hard
00:22:27.800 | without risk of physically injuring them.
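A toy model of the performance curve being discussed: the familiar inverted-U over arousal plus a smaller second bump at very high arousal. The Gaussian shapes and every parameter here are assumptions chosen only to visualize the claim, not anything fit to data.

```python
import numpy as np

def toy_performance(arousal, main_peak=0.4, main_width=0.15,
                    second_peak=0.9, second_width=0.05, second_height=0.6):
    """Illustrative performance vs. arousal: a broad inverted-U centered at
    moderate arousal plus a narrow secondary bump at very high arousal."""
    a = np.asarray(arousal, dtype=float)
    main = np.exp(-((a - main_peak) ** 2) / (2 * main_width ** 2))
    bump = second_height * np.exp(-((a - second_peak) ** 2) /
                                  (2 * second_width ** 2))
    return main + bump

arousal = np.linspace(0.0, 1.0, 11)
for a, p in zip(arousal, toy_performance(arousal)):
    print(f"arousal={a:.1f}  performance={p:.2f}")
```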
00:22:30.520 | - And that's, like you said, the little bump
00:22:33.120 | that seems to be a very small fraction
00:22:35.400 | of the human experience, right?
00:22:37.320 | So it's kind of fascinating to study it
00:22:39.680 | because most of us move through life
00:22:42.680 | without ever experiencing that kind of focus.
00:22:47.680 | - Well, everything's in a peak state there.
00:22:50.480 | I really think that's where optimal performance lies.
00:22:53.440 | - There's so many interesting words here,
00:22:55.120 | but what's performance and what's optimal performance?
00:22:59.600 | We're talking about mental ability to what?
00:23:04.080 | To perceive the environment quickly,
00:23:06.440 | to make actions quickly.
00:23:08.000 | What's optimal performance?
00:23:09.240 | - Yeah, well, it's very subjective
00:23:11.080 | and it varies depending on task and environment.
00:23:15.080 | So one way where we can make it a little bit more operational
00:23:18.000 | and concrete is to say there is a sweet spot, if you will,
00:23:23.000 | where the level of internal autonomic arousal,
00:23:27.600 | aka stress or alertness, whatever you want to call it,
00:23:30.920 | is ideally matched to the speed of whatever challenge
00:23:35.920 | you have to be facing in the outside world.
00:23:38.400 | So we all have perception of the outside world
00:23:41.360 | as exteroception and the perception
00:23:42.980 | of our internal real estate interoception.
00:23:45.200 | And when those two things,
00:23:46.320 | when interoception and exteroception are matched
00:23:49.040 | along a couple of dimensions,
00:23:50.960 | performance tends to increase
00:23:54.880 | or tends to be in an optimal range.
00:23:58.080 | So for instance, if you're, I don't play guitar,
00:24:00.520 | but I know you play guitar.
00:24:01.420 | So let's say you're trying to learn something new
00:24:02.840 | on the guitar.
00:24:03.760 | I'm not saying that being in these super high states
00:24:07.400 | of activation are the best place for you
00:24:09.120 | to be in order to learn.
00:24:10.960 | It may be that your internal arousal needs to be at a level
00:24:15.960 | where your analysis of space and time
00:24:19.040 | has to be well-matched to the information coming in
00:24:22.240 | and what you're trying to do in terms of performance,
00:24:25.440 | in terms of playing chords and notes and so forth.
00:24:28.180 | Now, in these cases of high threat
00:24:30.560 | where things are coming in quickly
00:24:32.000 | and animals and humans need to react very quickly,
00:24:35.000 | the higher your state of autonomic arousal, the better,
00:24:38.760 | because you're slicing time more finely,
00:24:40.960 | just because of the way the autonomic system works.
00:24:43.680 | The pupil dilation, for instance,
00:24:47.080 | and movement of the lens essentially changes your optics.
00:24:50.520 | And that's obvious, but with the change in optics
00:24:53.760 | is a change in how you bin time and slice time,
00:24:56.100 | which allows you to get more frames per second readout.
00:24:59.960 | With the guitar learning, for instance,
00:25:01.360 | it might actually be that you want to be almost sleepy,
00:25:05.240 | almost in a kind of drowsy state to be able to,
00:25:09.880 | and I don't play music, so I'm guessing here,
00:25:12.040 | but sense some of the nuance in the chords
00:25:14.320 | or the ways that you're to be relaxed enough
00:25:16.480 | that your fingers can follow an external cue.
00:25:18.340 | So matching the movement of your fingers
00:25:20.400 | to something that's pure exteroception.
00:25:23.160 | And so there is no perfect autonomic state for performance.
00:25:28.160 | This is why I don't favor terms like flow,
00:25:33.040 | because they're not well operationally defined enough,
00:25:38.040 | but I do believe that optimal or peak performance
00:25:41.140 | is going to arise when internal state is ideally matched
00:25:45.000 | to the space-time features of the external demands.
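One way to make "internal state matched to the space-time features of the external demands" concrete is to treat arousal as setting an effective perceptual sampling rate and ask whether that rate keeps up with how fast external events arrive. The arousal-to-frame-rate mapping below is a made-up toy, not a measured relationship.

```python
def effective_frame_rate(arousal, base_fps=10.0, extra_fps=50.0):
    """Toy mapping from arousal (0..1) to an effective perceptual
    sampling rate in frames per second; purely illustrative numbers."""
    arousal = max(0.0, min(1.0, arousal))
    return base_fps + extra_fps * arousal

def state_is_matched(arousal, external_event_rate_hz, margin=2.0):
    """A state counts as 'matched' if the effective sampling rate
    comfortably exceeds the rate of external events."""
    return effective_frame_rate(arousal) >= margin * external_event_rate_hz

# Learning a slow guitar passage (~2 events/s) vs. reacting to a fast
# threat (~15 events/s): low arousal suffices for the first, not the second.
print(state_is_matched(0.2, 2.0))   # True
print(state_is_matched(0.2, 15.0))  # False
print(state_is_matched(0.9, 15.0))  # True
```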
00:25:49.280 | - So there's some slicing of time that happens,
00:25:51.800 | and then you're able to adjust,
00:25:54.440 | so slice time more finely or less finely
00:25:58.240 | in order to adjust to the dynamics of the stimulus.
00:26:02.640 | What about the realm of ideas?
00:26:06.120 | So like, I'm a big believer,
00:26:10.080 | there's a guy named Cal Newport
00:26:11.600 | who wrote a book about deep work.
00:26:13.040 | - Oh yeah, I love that book.
00:26:14.600 | - Yeah, he's great.
00:26:18.520 | I mean, one of the nice things,
00:26:19.800 | I've always practiced deep work,
00:26:22.240 | but it's always nice to have words
00:26:24.680 | put to the concepts that you've practiced.
00:26:28.640 | It somehow makes them more concrete
00:26:30.880 | and allows you to get better.
00:26:33.640 | It turns it into a skill that you can get better at.
00:26:36.400 | But I also value deep thinking,
00:26:40.560 | where you think, it's almost meditative.
00:26:44.680 | You think about a particular concept
00:26:46.400 | for long periods of time.
00:26:47.840 | The programming you have to do that kind of thing for.
00:26:50.480 | You just have to hold this concept,
00:26:52.320 | like you hold it and then you take steps with it,
00:26:56.280 | you take further steps,
00:26:57.400 | and you're holding relatively complicated things
00:27:00.880 | in your mind as you're thinking about them.
00:27:03.240 | And there's a lot of,
00:27:04.320 | I mean, the hardest part is there's frustrating things,
00:27:08.720 | like you take a step
00:27:10.480 | and it turns out to be the wrong direction,
00:27:12.120 | so you have to calmly turn around and take a step back.
00:27:14.960 | And then it's, you're kind of like exploring
00:27:16.960 | through the space of ideas.
00:27:18.400 | Is there something about your study of optimal performance
00:27:22.840 | that could be applied to the act of thinking
00:27:26.000 | as opposed to action?
00:27:28.240 | - Well, we haven't done too much work there,
00:27:30.480 | but I think I can comment on it
00:27:33.120 | from a neuroscience perspective.
00:27:33.960 | - Yeah, what's your intuition?
00:27:34.800 | - Which is really all I do is,
00:27:36.360 | well, I mean, we do experiments in the lab,
00:27:38.240 | but looking at things through the lens of neuroscience.
00:27:41.080 | So what you're describing can be mapped fairly well
00:27:44.920 | to working memory,
00:27:46.140 | just keeping things online and updating them
00:27:49.480 | as they change in information
00:27:51.600 | and it's coming back into your brain.
00:27:53.640 | Jack Feldman, who I'm a huge fan of
00:27:57.100 | and fortunate to be friends with,
00:27:59.280 | is a professor at UCLA,
00:28:01.760 | works on respiration and breathing,
00:28:03.300 | but he has a physics background.
00:28:05.120 | And so he thinks about respiration and breathing
00:28:07.500 | in terms of ground states
00:28:09.700 | and how they modulate other states.
00:28:11.280 | Very, very interesting and I think important work.
00:28:15.320 | Jack has an answer to your question.
00:28:18.360 | So I'm not going to get this exactly right
00:28:20.240 | 'cause this is lifted from a coffee conversation
00:28:22.240 | that we had about a month ago.
00:28:23.760 | Apologies in advance,
00:28:27.560 | but I think I can get it mostly right.
00:28:29.480 | So we were talking about this,
00:28:31.080 | about how the brain updates cognitive states
00:28:34.480 | depending on demands and thinking in particular.
00:28:37.560 | And he used an interesting example,
00:28:39.260 | I'd be curious to know if you agree or disagree.
00:28:42.120 | He said, "Most great mathematics is done by people
00:28:46.280 | "in their late teens and 20s,
00:28:49.400 | "and even you could say early 20s,
00:28:51.580 | "sometimes into the late 20s,
00:28:52.960 | "but not much further on."
00:28:54.640 | Maybe I just insulted some mathematicians.
00:28:56.720 | - No, that's true.
00:28:58.520 | - And I think that it demands,
00:29:00.040 | his argument was there's a tremendous demand
00:29:02.720 | on working memory to work out theorems in math
00:29:06.000 | and to keep a number of plates spinning,
00:29:08.340 | so to speak, mentally,
00:29:09.400 | and run back and forth between them, updating them.
00:29:12.400 | In physics, Jack said,
00:29:15.760 | and I think this makes sense to me too,
00:29:18.160 | that there's a reliance on working memory,
00:29:21.400 | but an increased reliance on some sort of deep memory
00:29:26.400 | and deep memory stores,
00:29:28.040 | probably stuff that's moved out of the hippocampus
00:29:29.880 | and forebrain and into the cortex,
00:29:31.960 | and is more some episodic and declarative stuff,
00:29:36.460 | but really, so you're pulling from your library, basically.
00:29:39.280 | It's not all RAM, it's not all working memory.
00:29:41.200 | And then in biology,
00:29:43.240 | and physicists tend to have very active careers
00:29:46.000 | into their 30s and 40s and 50s and so forth,
00:29:49.560 | sometimes later.
00:29:50.520 | And then in biology,
00:29:52.340 | you see careers that have a much longer arc,
00:29:55.120 | kind of these protracted careers often,
00:29:57.480 | people still in their 60s and 70s
00:29:59.400 | doing really terrific work,
00:30:01.760 | not always doing it with their own hands
00:30:03.240 | 'cause the people in the labs are doing them, of course,
00:30:06.400 | and that work does tend to rely on insights gained
00:30:10.560 | from having a very deep knowledge base,
00:30:13.440 | where you can remember a paper,
00:30:16.080 | or maybe a figure in a paper,
00:30:17.440 | you could go look it up if you wanted to,
00:30:19.200 | but it's very different
00:30:20.360 | than the working memory of the mathematician.
00:30:22.720 | And so when you're talking about coding
00:30:25.400 | or being in that tunnel of thought
00:30:28.240 | and trying to iterate and keeping a lot of plates spinning,
00:30:31.920 | it speaks directly to working memory.
00:30:35.280 | My lab hasn't done too much of that.
00:30:37.160 | - With working memory?
00:30:38.000 | - But we are pushing working memory
00:30:40.040 | when we have people do things
00:30:41.800 | like these simple lights out tasks,
00:30:43.480 | while they're under,
00:30:44.880 | we can increase the cognitive load
00:30:46.680 | by increasing the level of autonomic arousal
00:30:48.960 | to the point where they start doing less well.
00:30:52.060 | And everyone has a cliff.
00:30:54.200 | This is what's kind of fun.
00:30:55.040 | We've had SEAL team operators come to the lab,
00:30:57.600 | we've had people from other units in the military,
00:31:01.400 | we've had a range of intellects and backgrounds
00:31:04.640 | and all sorts of things,
00:31:05.520 | and everyone has a cliff.
00:31:07.280 | And those cliffs sometimes show up
00:31:09.640 | as a function of the demands of speed of processing
00:31:14.640 | or how many things you need to keep online.
00:31:17.320 | I mean, we're all limited at some point
00:31:18.880 | in the number of things we can keep online.
00:31:20.200 | So what you're describing is very interesting
00:31:21.840 | because I think it has to do with how narrow
00:31:25.040 | or broad the information set is.
00:31:27.080 | And I'm not an active programmer,
00:31:30.640 | so this is a regime I don't really fully know,
00:31:33.440 | so I don't want to comment about it in any way.
00:31:36.580 | Doesn't suggest that,
00:31:39.280 | but I think that what you're talking about
00:31:41.440 | is top-down control.
00:31:43.800 | So this is prefrontal cortex
00:31:45.920 | keeping every bit of reflexive circuitry at bay.
00:31:49.180 | The one that makes you want to get up and use the restroom,
00:31:51.040 | the one that makes you want to check your phone,
00:31:52.800 | all of that,
00:31:53.640 | but also running these anterior thalamus
00:31:57.400 | to prefrontal cortex loops,
00:31:59.140 | which we know are very important for working memory.
00:32:02.320 | - Yeah, let me try to think through this a little bit.
00:32:06.040 | Reducing the process of thinking
00:32:09.680 | to working memory access is tricky.
00:32:14.240 | It's probably ultimately correct,
00:32:16.640 | but if I were to say some of the most challenging things
00:32:20.640 | that an engineer has to do,
00:32:24.920 | and a scientific thinker,
00:32:26.880 | I would say it's kind of depressing
00:32:28.200 | to think that we do that best in our 20s,
00:32:29.920 | but is this kind of first principles thinking step
00:32:34.360 | of saying,
00:32:36.700 | you're accessing the things that you know,
00:32:43.040 | and then saying,
00:32:44.840 | well, let me,
00:32:45.880 | how do I do this differently than I've done it before?
00:32:49.120 | This weird like,
00:32:50.960 | stepping back, like,
00:32:53.080 | is this right?
00:32:54.960 | Let's try it this other way.
00:32:57.400 | That's the most mentally taxing step,
00:33:00.440 | is like,
00:33:01.360 | you've gotten quite good at this particular pattern
00:33:05.440 | of how you solve this particular problem.
00:33:07.640 | So, there's a pattern recognition first.
00:33:10.160 | You're like, okay,
00:33:11.760 | I know how to build a thing
00:33:13.480 | that solves this particular problem in programming, say.
00:33:16.240 | And then,
00:33:17.080 | the question is,
00:33:20.640 | but can I do it much better?
00:33:23.200 | And I don't know if that's,
00:33:25.600 | I don't know what the hell that is.
00:33:27.000 | I don't know if that's accessing working memory.
00:33:29.800 | That's almost,
00:33:31.960 | maybe it is accessing memory in a sense
00:33:33.960 | that's trying to find similar patterns
00:33:36.760 | in a totally different place
00:33:37.920 | that it could be projected onto this.
00:33:40.760 | But you're not querying facts,
00:33:44.440 | you're querying like functional things, like.
00:33:48.880 | - Yeah, it's patterns.
00:33:49.720 | I mean, you're running out,
00:33:50.560 | you're testing algorithms.
00:33:52.040 | - Yeah.
00:33:52.880 | - Right, you're testing algorithms.
00:33:54.400 | So, I want to just,
00:33:55.560 | because I know some of the people listening to this,
00:33:59.720 | and you have basis in scientific training
00:34:03.360 | and have scientific training.
00:34:04.560 | So, I want to be clear.
00:34:05.500 | I think we can be correct about some things
00:34:07.960 | like the role of working memory
00:34:09.480 | in these kinds of processes without being exhaustive.
00:34:11.880 | We're not saying that the only thing,
00:34:13.480 | we're not, you know,
00:34:14.320 | we can be correct,
00:34:15.140 | but not assume that that's the only thing involved, right?
00:34:18.200 | And I mean, neuroscience, let's face it,
00:34:20.120 | is still in its infancy.
00:34:21.280 | I mean, we probably know 1% of what there is
00:34:23.240 | to know about the brain.
00:34:24.480 | You know, we've learned so much,
00:34:26.620 | and yet there may be global states that underlie this
00:34:29.840 | that make prefrontal circuitry work differently
00:34:33.700 | than it would in a different regime,
00:34:36.160 | or even time of day.
00:34:37.180 | I mean, there's a lot of mysteries about this.
00:34:39.360 | But, so I just want to make sure that we sort of are,
00:34:43.320 | we're aiming for precision and accuracy,
00:34:45.960 | but we're not going to be exhaustive.
00:34:48.200 | So, there's a difference there.
00:34:49.400 | And I think, you know,
00:34:50.640 | sometimes in the vastness of the internet,
00:34:53.040 | that gets forgotten.
00:34:54.280 | So, the other is that,
00:34:57.820 | you know, we think about,
00:35:01.680 | you know, we think about these operations
00:35:05.480 | as, you know, really focused,
00:35:07.660 | keeping a lot of things online.
00:35:09.200 | But what you were describing is actually,
00:35:12.120 | it speaks to the very real possibility,
00:35:15.960 | probably that with certainty,
00:35:18.160 | there's another element to all this,
00:35:20.060 | which is when you're trying out lots of things,
00:35:22.760 | in particular, lots of different algorithms,
00:35:25.240 | you don't want to be in a state
00:35:27.640 | of very high autonomic arousal.
00:35:29.720 | That's not what you want,
00:35:30.780 | because the higher level of autonomic arousal
00:35:32.680 | and stress in the system,
00:35:34.080 | the more rigidly you're going to analyze space and time.
00:35:37.600 | And what you're talking about
00:35:39.160 | is playing with space-time dimensionality.
00:35:41.400 | And I want to be very clear.
00:35:42.980 | I mean, I'm the son of a physicist.
00:35:44.440 | I am not a physicist.
00:35:45.400 | When I talk about space and time,
00:35:46.580 | I'm literally talking about visual space
00:35:50.320 | and how long it takes for my finger
00:35:52.180 | to move from this point to this point.
00:35:53.600 | - You are facing a tiger and trying to figure out
00:35:56.000 | how to avoid being eaten by the tiger.
00:35:58.120 | - And that's primarily going to be determined
00:35:59.940 | by the visual system in humans.
00:36:01.980 | We don't walk through space, for instance,
00:36:03.720 | like a scent hound would,
00:36:05.480 | and look at three-dimensional scent plumes.
00:36:08.800 | You know, when a scent hound goes out in the environment,
00:36:11.200 | they have depth to the odor trails they're following.
00:36:15.320 | And they don't think about them,
00:36:17.520 | we don't think about odor trails.
00:36:19.040 | You might say, oh, well, the smell's getting more intense.
00:36:21.160 | Aha, but they actually have three-dimensional odor trails.
00:36:24.520 | So they see a cone of odor.
00:36:26.520 | See, of course, with their nose,
00:36:28.040 | with their olfactory cortex.
00:36:29.600 | We do that with our visual system
00:36:31.400 | and we parse time often subconsciously,
00:36:35.160 | mainly with our visual system,
00:36:36.320 | also with our auditory system.
00:36:37.720 | And this shows up for the musicians out there,
00:36:39.560 | metronomes are a great way to play with this.
00:36:42.240 | You know, bass drumming,
00:36:43.200 | when the frequency of bass drumming changes,
00:36:44.920 | your perception of time changes quite a lot.
00:36:47.440 | So in any event, space and time are linked
00:36:50.040 | through the sensory apparati,
00:36:51.880 | through the eyes and ears and nose,
00:36:53.320 | and probably through taste too,
00:36:55.360 | and through touch for us, but mainly through vision.
00:36:58.840 | So when you drop into some coding
00:37:02.400 | or iterating through a creative process
00:37:05.400 | or trying to solve something hard,
00:37:07.840 | you can't really do that well
00:37:11.040 | if you're in a rigid high level of autonomic arousal,
00:37:15.180 | because you're plugging in algorithms
00:37:16.920 | that are in this space regime, this time regime matches.
00:37:20.940 | It's space time matched.
00:37:22.060 | Whereas creativity, I always think the lava lamp
00:37:24.680 | is actually a pretty good example,
00:37:25.980 | even though it has these counterculture
00:37:27.440 | new agey connotations,
00:37:28.840 | because you actually don't know
00:37:30.080 | which direction things are gonna change.
00:37:31.860 | And so in drowsy states, sleeping and drowsy states,
00:37:36.860 | space and time become dislodged from one another somewhat,
00:37:40.440 | and they're very fluid.
00:37:41.280 | And I think that's why a lot of solutions come to people
00:37:44.160 | after sleep and naps.
00:37:46.800 | And this could even take us into a discussion
00:37:48.740 | if you like about psychedelics.
00:37:50.040 | And what we now know, for instance,
00:37:53.120 | that people thought that psychedelics work
00:37:55.720 | by just creating a spontaneous bursting of neurons
00:37:58.360 | and hallucinations, but the 5-HT2C and 5-HT2A receptors,
00:38:03.360 | which are the main sites for things like LSD and psilocybin
00:38:07.440 | and some of the other, the ones that create hallucinations,
00:38:11.220 | the drugs that create hallucinations,
00:38:13.240 | the most of those receptors are actually
00:38:16.000 | in the collection of neurons that encase the thalamus,
00:38:20.700 | which is where all the sensory information goes into,
00:38:23.240 | a structure called the thalamic reticular nucleus.
00:38:25.840 | And it's an inhibitory structure that makes sure
00:38:30.640 | that when we're sitting here talking,
00:38:32.620 | that I'm mainly focused on whatever I'm seeing visually,
00:38:35.860 | that I'm essentially eliminating
00:38:37.560 | a lot of sensory information.
00:38:39.460 | Under conditions where people take psychedelics
00:38:41.600 | and these particular serotonin receptors are activated,
00:38:45.460 | that inhibitory shell, it's literally shaped like a shell,
00:38:49.140 | starts losing its ability to inhibit
00:38:52.160 | the passage of sensory information.
00:38:53.960 | But mostly the effects of psychedelics
00:38:56.860 | are because the lateral connectivity in layer five
00:39:00.180 | of cortex across cortical areas is increased.
00:39:04.740 | And what that does is that means
00:39:06.680 | that the space-time relationship for vision,
00:39:09.160 | like moving my finger from here to here,
00:39:10.520 | very rigid space-time relationship, right?
00:39:12.680 | If I slow it down, it's slower, obviously,
00:39:14.240 | but there's a prediction that can be made
00:39:15.640 | based on the neurons in the retina and the cortex.
00:39:17.920 | On psychedelics, this could be very strange experience.
00:39:21.040 | But the auditory system has one
00:39:22.680 | that's slightly different space-time
00:39:25.400 | and they're matched to one another
00:39:26.760 | in deeper circuits in the brain.
00:39:28.120 | The olfactory system has a different
00:39:29.480 | space-time relationship to it.
00:39:31.360 | So under conditions of these increased activation
00:39:36.360 | of these serotonin receptors,
00:39:38.600 | space and time across sensory areas starts being fluid.
00:39:42.760 | So I'm no longer running the algorithm
00:39:44.720 | for moving my finger from here to here
00:39:46.480 | and making a prediction based on vision alone.
00:39:49.200 | I'm now, this is where people talk about hearing sights,
00:39:54.200 | right?
00:39:55.200 | You start linking, this might actually make a sound
00:39:57.960 | in a psychedelic state.
00:39:59.320 | Now I'm not suggesting people run out and do psychedelics
00:40:01.200 | because it's very disorganized,
00:40:02.680 | but essentially what you're doing
00:40:03.640 | is you're mixing the algorithms.
00:40:05.400 | And so when you talk about being able
00:40:07.520 | to access new solutions,
00:40:09.120 | you don't need to rely on psychedelics.
00:40:10.880 | If people choose to do that, that's their business.
00:40:13.120 | But in drowsy states,
00:40:15.400 | this lateral connectivity is increased as well.
00:40:17.920 | The shell of the thalamus shuts down.
00:40:20.360 | And what's, these are through these so-called
00:40:22.200 | ponto-geniculo-occipital (PGO) waves.
00:40:24.000 | And what's happening is you're getting
00:40:25.440 | whole brain activation at a level
00:40:28.120 | that you start mixing algorithms.
00:40:30.560 | And so sometimes I think solutions come
00:40:32.560 | not from being in that narrow tunnel of space-time
00:40:36.080 | and strong activation of working memory
00:40:38.800 | and trying to, well, iterate if this, then this,
00:40:41.120 | very strong deductive and inductive thinking
00:40:43.960 | and working from first principles,
00:40:45.760 | but also from states where something
00:40:48.720 | that was an algorithm that you never had in existence before
00:40:52.560 | suddenly gets lumped with another algorithm.
00:40:55.360 | And all of a sudden a new possibility comes to mind.
00:40:59.360 | And so space and time need to be fluid
00:41:02.680 | and space and time need to be rigid
00:41:05.000 | in order to come up with something meaningful.
00:41:07.080 | And I realize I'm riffing long on this,
00:41:08.840 | but this is why I think, you know,
00:41:10.120 | there was so much interest a few years ago
00:41:11.760 | with Michael Pollan's book and other things happening
00:41:14.720 | about psychedelics as a pathway to exploration
00:41:17.800 | and all this kind of thing.
00:41:18.640 | But the real question is what you export back
00:41:20.840 | from those experiences.
00:41:22.080 | 'Cause dreams are amazing,
00:41:23.800 | but if you can't bring anything back from them,
00:41:25.760 | they're just amazing.
00:41:27.400 | - I wonder how to experiment with the mind
00:41:31.640 | without any medical assistance first.
00:41:35.320 | I push my mind in all kinds of directions.
00:41:38.840 | I definitely want to, I did shrooms a couple of times.
00:41:42.840 | I definitely want to figure out how I can experiment
00:41:47.320 | with psychedelics.
00:41:50.120 | I'm talking to Rick Doblin, I think.
00:41:53.360 | - Doblin. - Doblin.
00:41:54.680 | Soon, I went back and forth.
00:41:57.080 | So he does all these studies in psychedelics
00:41:59.360 | and he keeps ignoring the parts of my email
00:42:01.800 | that asks like, how do I participate in these studies?
00:42:04.560 | - Yeah, well, there are some legality issues.
00:42:06.000 | I mean, in this conversation, I want to be very clear.
00:42:07.480 | I'm not saying that anyone should run out
00:42:08.800 | and do psychedelics.
00:42:10.000 | I think that drowsy states and sleep states
00:42:12.360 | are super interesting for accessing
00:42:14.680 | some of these more creative states of mind.
00:42:16.760 | Hypnosis is something that my colleague, David Spiegel,
00:42:19.360 | associate chair of psychiatry at Stanford, works on,
00:42:21.640 | where also, again, it's a unique state
00:42:23.800 | because you have narrow context.
00:42:25.800 | So this is very kind of tunnel vision
00:42:28.480 | and yet deeply relaxed, where new algorithms,
00:42:32.960 | if you will, can start to surface.
00:42:34.680 | Strong state for inducing neuroplasticity.
00:42:38.320 | And I think, I'm part of a group
00:42:42.920 | that's called the Liminal Collective,
00:42:46.320 | a group of people that get together
00:42:47.600 | and talk about just wild ideas, but they try and implement them.
00:42:52.280 | And it's a really interesting group.
00:42:54.160 | Some people from military, from Logitech
00:42:57.560 | and some other backgrounds, academic backgrounds.
00:42:59.440 | And I was asked, what would be,
00:43:02.080 | if you could create a tool,
00:43:03.520 | if you just had a tool like your magic wand
00:43:05.720 | and wish for the day, what would it be?
00:43:07.240 | I thought it'd be really interesting
00:43:09.200 | if someone could develop psychedelics
00:43:12.040 | that have on-off switches.
00:43:15.040 | So you could go into a psychedelic state
00:43:18.560 | very deeply for 10 minutes,
00:43:21.120 | but you could launch yourself out of that state
00:43:23.520 | and place yourself into a linear real-world state
00:43:26.040 | very quickly so that you could extract
00:43:28.280 | whatever it was that happened in that experience
00:43:30.560 | and then go back in if you wanted.
00:43:32.360 | Because the problem with psychedelic states
00:43:34.960 | and dream states is that, first of all,
00:43:37.880 | a lot of the reason people do them is they're lying.
00:43:40.560 | They say they want plasticity and they want all this stuff.
00:43:43.200 | They want a peak experience
00:43:45.260 | inside of an amplified experience.
00:43:47.880 | So they're kind of seeking something unusual.
00:43:50.040 | I think we should just be honest about that
00:43:51.760 | because a lot of times
00:43:52.860 | they're not trying to make their brain better.
00:43:54.120 | They're just trying to experience something really amazing.
00:43:57.800 | But the problem is space and time are so unlocked
00:44:02.040 | in these states, just like they are in dreams,
00:44:04.540 | that you can really end up with a whole lot of nothing.
00:44:07.320 | You can have an amazing amplified experience
00:44:10.400 | housed in an amplified experience
00:44:12.480 | and come out of that thinking you had
00:44:14.760 | a meaningful experience when you didn't bring anything back.
00:44:18.920 | - You didn't bring anything back.
00:44:20.240 | All you have is a fuzzy memory
00:44:23.120 | of having a transformational experience,
00:44:25.720 | but you don't actually have tools to bring back.
00:44:29.080 | Or, I'm sorry, actually concrete ideas to bring back.
00:44:33.160 | Yeah, it's interesting.
00:44:34.680 | Yeah, I wonder if it's possible to do that with a mind
00:44:38.000 | to be able to hop back and forth.
00:44:40.400 | - Well, I think that's where the real power
00:44:42.360 | of adjusting states is gonna be.
00:44:45.520 | It probably will be with devices.
00:44:47.560 | I mean, maybe it'll be done through pharmacology.
00:44:49.240 | It's just that it's hard to do on/off switches
00:44:51.680 | in human pharmacology the way we have them for animals.
00:44:54.480 | I mean, we have, you know, Cre and Flp recombinases
00:44:57.880 | and we have, you know, channelrhodopsins and halorhodopsins
00:45:01.360 | and all these kinds of things.
00:45:03.240 | But to do that work in humans is tricky,
00:45:05.680 | but I think you could do it with virtual reality,
00:45:09.160 | augmented reality and other devices
00:45:11.000 | that bring more of the somatic experience into it.
00:45:13.680 | - You're of course a scientist who's studying humans
00:45:17.680 | as a collective.
00:45:18.880 | I tend to be just a one person scientist
00:45:21.480 | of just looking at myself.
00:45:23.240 | And, you know, I play, when these deep thinking,
00:45:27.240 | deep work sessions, I'm very cognizant,
00:45:30.320 | like in the morning, that there's times when my mind
00:45:34.160 | is so like eloquent at being able to jump around
00:45:39.160 | from ideas and hold them all together.
00:45:41.560 | And I'm almost like I step back
00:45:44.000 | from a third person perspective and enjoy that.
00:45:48.040 | Whatever that mind is doing,
00:45:49.680 | I do not waste those moments.
00:45:51.640 | And I'm very conscious of this like little creature
00:45:57.600 | that woke up that's only awake for,
00:46:01.080 | if we're being honest, maybe a couple hours a day.
00:46:03.960 | - Early part of the day for you.
00:46:05.080 | - Early part of the day.
00:46:06.360 | Not always, well, early part of the day for me
00:46:08.560 | is a very fluid concept.
00:46:11.160 | So. (laughs)
00:46:12.000 | - You're one of those.
00:46:12.960 | Yeah, you're one of those.
00:46:13.800 | - Being single, one of the problems,
00:46:15.480 | single and no meetings, I don't schedule any meetings.
00:46:19.160 | I've been living at like a 28 hour day.
00:46:22.800 | So I like, it drifts.
00:46:25.640 | So it's all over the place.
00:46:28.200 | But after a traditionally defined full night's sleep,
00:46:33.080 | whatever the heck that means,
00:46:35.720 | I find that like in those moments,
00:46:38.940 | there's a clarity of mind that's just,
00:46:41.800 | this everything is effortless
00:46:43.520 | and it's the deepest dives intellectually that I make.
00:46:47.120 | And I'm cognizant of it.
00:46:51.160 | And I try to bring that to the other parts of the day
00:46:53.280 | that don't have it and treasure them even more
00:46:56.360 | in those moments 'cause they only last
00:46:58.000 | like five or 10 minutes.
00:47:00.000 | 'Cause of course, in those moments,
00:47:01.520 | you wanna do all kinds of stupid stuff
00:47:03.080 | that are completely worthless,
00:47:05.320 | like check social media or something like that.
00:47:07.560 | But those are the most precious things
00:47:09.800 | in intellectual life is those mental moments of clarity.
00:47:16.080 | And I wonder, I'm learning how to control them.
00:47:18.920 | I think caffeine is somehow involved.
00:47:21.480 | I'm not sure exactly.
00:47:22.320 | - Sure.
00:47:23.140 | Well, because if you learn how to titrate caffeine,
00:47:25.960 | and everyone's slightly different with this,
00:47:27.480 | what they need.
00:47:28.320 | But if you learn to titrate caffeine with time of day
00:47:30.320 | and the kind of work that you're trying to do,
00:47:32.100 | you can bring that autonomic arousal state
00:47:33.820 | into a close to perfect place.
00:47:36.680 | And then you can tune it in with,
00:47:38.640 | sometimes people want a little bit of background music,
00:47:40.280 | sometimes they want less, these kinds of things.
00:47:43.160 | The early part of the day is interesting
00:47:44.720 | because the one thing that's not often discussed
00:47:46.680 | is this transition out of sleep.
00:47:49.040 | So there's a book,
00:47:50.760 | I think it's called "Winston Churchill's Nap."
00:47:52.880 | And it's about naps and the transition
00:47:55.260 | between wake and sleep as a valuable period.
00:47:58.860 | A long time ago,
00:48:01.220 | someone who I respect a lot and was mentoring me said,
00:48:05.960 | "Be very careful about bringing in
00:48:08.800 | someone else's sensory experience early in the day."
00:48:12.080 | So when I wake up, I'm very drowsy.
00:48:14.520 | I sleep well, but I don't emerge from that very quickly.
00:48:17.440 | I need a lot of caffeine to wake up and whatnot.
00:48:21.040 | But there's this concept of getting the download from sleep,
00:48:25.480 | which is, in sleep,
00:48:27.220 | you were essentially expunging the things
00:48:29.140 | that you don't need,
00:48:30.440 | the stuff that was meaningless from the previous day,
00:48:33.120 | but you were also running variations on these algorithms
00:48:36.060 | of whatever it is you're trying to work out in life
00:48:37.780 | on short time scales like the previous day
00:48:40.160 | and long time scales like your whole life.
00:48:42.480 | And those lateral connections in layer five
00:48:46.040 | of the neocortex are very robustly active
00:48:50.960 | across sensory areas.
00:48:52.400 | And you're running an algorithm or a,
00:48:55.080 | you know, it's a brain state that would be useless in waking
00:48:57.520 | you wouldn't get anything done.
00:48:59.200 | You'd be the person talking to yourself in the hallway
00:49:01.040 | or something about something that no one else can see.
00:49:03.320 | But in those states, you do,
00:49:06.280 | the theory is that you arrive at certain solutions
00:49:09.320 | and those solutions will reveal themselves
00:49:11.380 | in the early part of the day,
00:49:12.880 | unless you interfere with them by bringing in,
00:49:16.360 | social media is a good example, where you
00:49:17.840 | immediately enter somebody else's
00:49:19.640 | space-time sensory relationship.
00:49:22.520 | Someone is the conductor of your thoughts in that case.
00:49:25.080 | And so many people have written about this.
00:49:27.960 | What I'm saying isn't entirely new,
00:49:29.280 | but allowing the download to occur
00:49:32.480 | in the early part of the day and asking the question,
00:49:36.640 | am I more in my head or external,
00:49:39.160 | am I in more of an interoceptive or exteroceptive mode?
00:49:42.260 | And depending on the kind of work you need to do,
00:49:44.860 | if it's, it sounds like for you,
00:49:46.240 | it's very interoceptive in the end,
00:49:48.380 | very, you got a lot of thinking going on
00:49:50.540 | and a lot of computing going on,
00:49:52.140 | allowing yourself to transition out of that sleep state
00:49:54.880 | and arrive with those solutions from sleep
00:49:57.220 | and plug into the work really deeply.
00:49:59.580 | And then, and only then allowing things like music,
00:50:02.740 | news, social media,
00:50:04.260 | doesn't mean you shouldn't talk to loved ones
00:50:06.020 | and see faces and things like that.
00:50:07.300 | But some people have taken this to the extreme.
00:50:09.320 | When I was a graduate student at Berkeley,
00:50:10.680 | there was a guy, there was a professor,
00:50:13.560 | he's brilliant, odd, but brilliant,
00:50:16.120 | who was so fixated on this concept
00:50:19.720 | that he wouldn't look at faces in the early part of the day
00:50:23.160 | because he just didn't want anything else to impact him.
00:50:27.040 | Now, he didn't have the most rounded life, I suppose.
00:50:31.600 | But if you're talking about cognitive performance,
00:50:34.480 | this could actually be very beneficial.
00:50:36.560 | You said so many brilliant things.
00:50:37.780 | So one, if you read books
00:50:40.340 | that describe the habits of brilliant people,
00:50:45.300 | like writers, they do control that sensory experience
00:50:49.740 | in the hours after wake.
00:50:52.660 | Like many writers,
00:50:54.220 | they have a particular habit of several hours
00:50:58.140 | early in the morning of actual writing,
00:50:59.660 | which is all the writing they do, nothing else for the rest of the day,
00:51:02.460 | but they control,
00:51:03.940 | they're very sensitive to noises and so on.
00:51:05.900 | I think they make it very difficult to live with them.
00:51:08.600 | I try to, I'm definitely like that.
00:51:10.760 | I love to control the sensory,
00:51:16.240 | how much information is coming in.
00:51:18.360 | There's something about the peaceful,
00:51:20.200 | just everything being peaceful.
00:51:21.920 | At the same time, and we're talking,
00:51:25.360 | meet your friend, Whitney Cummings,
00:51:26.880 | who has a mansion, a castle on top of a cliff
00:51:31.880 | in the middle of nowhere.
00:51:33.280 | She actually purchased her own island.
00:51:36.140 | So she wants silence.
00:51:38.340 | She wants to control how much sound is coming in.
00:51:41.740 | - She's very sensitive to sound and environment.
00:51:44.220 | - Yeah. - Yeah.
00:51:45.060 | Beautiful home and environment,
00:51:46.140 | but like clearly puts a lot of attention into details.
00:51:50.060 | Yeah.
00:51:51.140 | And very creative.
00:51:52.060 | - Yeah.
00:51:52.900 | And that allows her creativity to flourish.
00:51:55.580 | I'm also, I don't like, that feels like a slippery slope.
00:51:59.780 | So I enjoy introducing
00:52:03.660 | noises and signals and training my mind
00:52:07.420 | to be able to tune them out.
00:52:09.620 | 'Cause I feel like you can't always control
00:52:11.500 | the environment so perfectly
00:52:13.140 | because your mind gets comfortable with that.
00:52:16.340 | I think it's a skill that you want to learn
00:52:18.060 | to be able to shut it off.
00:52:19.780 | Like I often go to like, back before COVID,
00:52:22.420 | to a coffee shop.
00:52:24.020 | It really annoys me when there's sounds and voices
00:52:26.580 | and so on, but I feel like I can train my mind
00:52:29.400 | to block them out.
00:52:31.180 | So it's a balance, I think.
00:52:32.700 | - Yeah.
00:52:33.540 | And I think, two things come to mind
00:52:35.660 | as you're saying this.
00:52:37.100 | First of all, yeah.
00:52:38.300 | I mean, we're talking about what's best for work
00:52:41.420 | is not always what's best for completeness of life.
00:52:44.260 | I mean, autism is probably many things.
00:52:46.740 | Like when you hear autism, just like fever,
00:52:48.420 | there are probably 50 ways to get a fever.
00:52:50.540 | There are probably 50 ways that the brain can create
00:52:53.500 | what looks like autism or what people call autism.
00:52:55.940 | But there's an interesting set of studies
00:52:58.300 | that have come out of David Ginty's lab at Harvard Med
00:53:02.200 | looking at, these are mouse mutants,
00:53:04.740 | where these are models for autism
00:53:07.020 | where nothing is disrupted in the brain proper
00:53:10.500 | and in the central nervous system,
00:53:12.120 | but the sensory neurons, the ones that innervate the skin
00:53:16.480 | and the ears and everything are hypersensitive.
00:53:19.160 | And this maps to a mutation
00:53:20.660 | in certain forms of human autism.
00:53:22.800 | So this means that the overload of sensory information
00:53:27.800 | and sensory experience that a lot of autistics feel,
00:53:30.860 | they're like, they can't tolerate things
00:53:32.560 | and then they get the stereotype behaviors,
00:53:34.100 | the rocking and the kind of the shouting.
00:53:36.620 | It, you know, we always thought of that as a brain problem.
00:53:39.880 | In some cases it might be, but in many cases,
00:53:42.800 | it's because they just can't, they seem to have a,
00:53:46.100 | it's like turning the volume up on every sense.
00:53:48.260 | And so they're overwhelmed
00:53:49.580 | and none of us want to become like that.
00:53:51.440 | I think it's very hard for them
00:53:52.640 | and it's hard for their parents and so forth.
00:53:54.580 | So I like the coffee shop example
00:53:56.900 | because the way I think about trying to build up resilience,
00:54:01.900 | you know, physically or mentally or otherwise is one of,
00:54:05.520 | I guess we could call it limb,
00:54:06.620 | I like to call it limbic friction.
00:54:08.120 | That's not a real scientific term and I acknowledge that.
00:54:10.280 | I'm making it up now
00:54:11.500 | because I think it captures the concept,
00:54:13.300 | which is that, you know, we always hear about resilience.
00:54:15.640 | It makes it sound like, oh, you know, under stress
00:54:17.680 | where everything's coming at you, you're gonna stay calm.
00:54:20.120 | But there's another, you know, so limbic,
00:54:22.400 | the limbic system wants to pull you in some direction.
00:54:26.500 | Typically in the direction of reflexive behavior.
00:54:29.280 | And the prefrontal cortex through top-down mechanisms
00:54:32.960 | has to suppress that and say, no,
00:54:34.740 | we're not gonna respond to the banging
00:54:36.800 | of the coffee cups behind me or I'm gonna keep focusing.
00:54:40.040 | That's pure top-down control.
00:54:42.200 | So limbic friction is high in that environment.
00:54:44.880 | You've put yourself
00:54:45.720 | into a high limbic friction environment.
00:54:47.040 | It means that the prefrontal cortex has to work really hard.
00:54:49.760 | But there's another side to limbic friction too,
00:54:52.360 | which is when you're very sleepy, there's nothing incoming.
00:54:55.600 | You can be completely silent and it's hard to engage
00:54:58.700 | and focus because you're drifting off
00:55:00.300 | and you're getting sleepy.
00:55:01.140 | So their limbic friction is high,
00:55:02.460 | but for the opposite reason, autonomic arousal is too low.
00:55:05.580 | So then turning on Netflix in the background
00:55:08.320 | or looping a song might boost your level of alertness
00:55:11.500 | that will allow top-down control to be
00:55:13.980 | in exactly the sweet spot you want it.
00:55:17.140 | So this is why earlier I was saying,
00:55:19.340 | it's all about how we feel inside
00:55:21.220 | relative to what's going on on the outside.
00:55:23.180 | We're constantly in this,
00:55:25.500 | I guess one way you could envision it spatially,
00:55:28.620 | especially if people are listening to this just on audio,
00:55:31.820 | is I like to think about it kind of like a glass barbell
00:55:35.360 | where one sphere of perception and attention
00:55:38.380 | can be on what's going on with me.
00:55:40.260 | And one sphere of attention can be on what's going on
00:55:42.600 | with you or something else in the room or in my environment.
00:55:46.040 | But this barbell isn't rigid.
00:55:47.960 | It's not really glass.
00:55:49.320 | Would plasma work here?
00:55:50.400 | I don't know anything about plasma.
00:55:51.700 | [laughing]
00:55:53.320 | Sorry, I don't know.
00:55:54.640 | Okay, but so imagine that this thing can contort.
00:55:57.180 | The size of the globes at the end of this barbell
00:56:00.040 | can get bigger or smaller.
00:56:01.520 | So let's say I close my eyes and I bring all my experience
00:56:04.360 | into what's going on through interoception internally.
00:56:08.680 | Now it's as if I've got two orbs of perception
00:56:11.760 | just on my internal state,
00:56:13.320 | but I can also do the opposite
00:56:14.560 | and bring both orbs of perception outside me.
00:56:17.220 | I'm not thinking about my heart rate or my breathing.
00:56:18.920 | I'm just thinking about something I see.
00:56:21.080 | And what you'll start to realize
00:56:22.980 | as you kind of use this spatial model is that two things.
00:56:27.060 | One is that it's very dynamic
00:56:30.620 | and that the more relaxed we are,
00:56:33.180 | the more these two orbs of attention,
00:56:35.120 | the two ends of the barbell can move around freely.
00:56:38.300 | The more alert we are,
00:56:40.460 | the more rigid they're going to be tethered in place.
00:56:43.580 | And that was designed so that if I have a threat
00:56:45.580 | in my environment, it's tethered to that threat.
00:56:48.700 | I'm not going to, if something's coming to attack me,
00:56:50.860 | I'm not going to be like,
00:56:51.700 | "Oh, my breathing cadence is a little bit quick."
00:56:53.440 | That's not how it works.
00:56:55.120 | Because both orbs are linked to that threat.
00:56:59.120 | And so my behavior is now actually being driven
00:57:01.680 | by something external, even though I think it's internal.
00:57:04.000 | And so I don't want to get too abstract here
00:57:05.960 | because I'm a neuroscientist, I'm not a theorist.
00:57:08.900 | But when you start thinking about models
00:57:11.240 | of how the brain works,
00:57:13.680 | there are only really three things that neurons do.
00:57:15.400 | They're either sensory neurons,
00:57:17.100 | they're motor neurons, or they're modulating things.
00:57:20.280 | And the models of attention and perception
00:57:24.060 | that we have now, 2020, tell us that we've got interoception
00:57:28.880 | and exteroception.
00:57:29.880 | They're strongly modulated by levels of autonomic arousal.
00:57:32.520 | And that if we want to form the optimal relationship
00:57:36.240 | to some task or some pressure or something,
00:57:39.840 | whether or not it's sleep, an impending threat, or coding,
00:57:43.720 | we need to adjust our internal space-time relationship
00:57:47.420 | with the external space-time relationship.
00:57:49.440 | And I realize I'm repeating what I said earlier,
00:57:51.720 | but we can actually assign circuitry to this stuff.
00:57:54.240 | It mostly has to do with how much limbic friction there is,
00:57:57.700 | how much you're being pulled to some source.
00:58:00.220 | That source could be internal.
00:58:01.200 | If I have pain, physical pain in my body,
00:58:04.880 | I'm going to be much more interoceptive
00:58:07.640 | than I am exteroceptive.
00:58:07.640 | You could be talking to me
00:58:08.460 | and I'm just going to be thinking about that pain.
00:58:09.680 | It's very hard.
00:58:10.720 | And the other thing that we can link it to
00:58:13.280 | is top-down control, meaning anything in our environment
00:58:17.960 | that has a lot of salience will tend to bring us
00:58:19.960 | into more exteroception than interoception.
00:58:22.880 | And again, I don't want to litter the conversation
00:58:24.880 | with just a bunch of terms,
00:58:26.140 | but what I think it can be useful for people
00:58:29.160 | is to do what essentially you've done, Lex,
00:58:31.720 | is to start developing an awareness.
00:58:33.920 | When I wake up, am I mostly in a mode of interoception
00:58:37.400 | or exteroception?
00:58:38.720 | When I work well, is that, what does working well look like
00:58:42.920 | from the perspective of autonomic arousal?
00:58:44.840 | How alert or calm am I?
00:58:46.920 | What kind of balance between internal focus
00:58:49.120 | and external focus is there?
00:58:50.800 | And to sort of watch this process throughout the day.
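(An illustrative sketch, not a model from the conversation: if limbic friction is high both when arousal is too low and when it is too high, effective top-down control follows an inverted U. The functional form and numbers below are assumptions made up for illustration.)

```python
import numpy as np

def limbic_friction(arousal, sweet_spot=0.5, width=0.2):
    """Toy inverted-U: friction is lowest near the sweet spot and rises
    when autonomic arousal is too low (sleepy) or too high (stressed)."""
    return 1.0 - np.exp(-((arousal - sweet_spot) ** 2) / (2 * width ** 2))

for label, arousal in [("drowsy, silent room", 0.15),
                       ("well-titrated caffeine, quiet focus", 0.50),
                       ("noisy coffee shop, looming deadline", 0.85)]:
    print(f"{label:36s} arousal={arousal:.2f}  limbic friction={limbic_friction(arousal):.2f}")
```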
00:58:53.520 | - Can you linger just briefly on,
00:58:55.700 | 'cause you use this term a lot,
00:58:57.840 | and it'd be nice to try to get a little more color to it,
00:59:00.520 | which is interoception and exteroception.
00:59:02.820 | What are we exactly talking about?
00:59:07.040 | So like what's included in each category
00:59:09.960 | and how much overlap is there?
00:59:12.080 | - Interoception would be an awareness of anything
00:59:16.000 | that's within the confines or on the surface of my skin
00:59:19.620 | that I'm sensing.
00:59:20.460 | - So literally physiological.
00:59:21.840 | - Physiologically, like within the boundaries of my skin
00:59:24.640 | and probably touch to the skin as well.
00:59:26.720 | Exteroception would be perception of anything
00:59:30.200 | that's beyond the reach of my skin.
00:59:33.320 | So that bottle of water, a scent, a sound,
00:59:38.320 | although, and this can change dramatically actually,
00:59:42.440 | if you have headphones in,
00:59:43.440 | you tend to hear things in your head,
00:59:44.680 | as opposed to a speaker in the room.
00:59:47.000 | This is actually the basis of ventriloquism.
00:59:49.520 | So there are beautiful experiments done by Gregg Recanzone
00:59:52.800 | up at UC Davis,
00:59:53.640 | looking at how auditory and visual cues are matched
00:59:56.760 | and we have an array of speakers,
00:59:58.600 | and you can, this will become obvious as I say it,
01:00:01.420 | but obviously the ventriloquist doesn't throw their voice.
01:00:05.040 | What they do is they direct your vision
01:00:06.600 | to a particular location,
01:00:07.820 | and you think the sound is coming from that location.
01:00:10.360 | And there are beautiful experiments
01:00:11.560 | that Greg and his colleagues have done
01:00:12.720 | where they suddenly introduce an auditory-visual mismatch
01:00:16.000 | and it freaks people out
01:00:17.480 | because you can actually make it seem
01:00:19.880 | from a perception standpoint,
01:00:21.400 | as if the sound arrived from the corner of the room
01:00:23.880 | and hit you, like physically, and people will recoil.
01:00:28.140 | And so sounds aren't getting thrown across the room,
01:00:31.220 | they're still coming from this defined location
01:00:33.460 | and array of speakers,
01:00:34.720 | but this is the way the brain creates
01:00:37.180 | these internal representations.
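(The ventriloquism effect is commonly modeled as reliability-weighted cue combination: the perceived location is a precision-weighted average of the visual and auditory estimates, so the sharper visual cue "captures" the sound. A minimal sketch of that standard model follows; the particular variances are arbitrary assumptions, not values from these experiments.)

```python
def combine(loc_visual, var_visual, loc_auditory, var_auditory):
    """Precision-weighted (maximum-likelihood) combination of a visual and an
    auditory estimate of where a sound came from, in degrees of azimuth."""
    w_visual = (1 / var_visual) / (1 / var_visual + 1 / var_auditory)
    w_auditory = 1 - w_visual
    location = w_visual * loc_visual + w_auditory * loc_auditory
    variance = 1 / (1 / var_visual + 1 / var_auditory)
    return location, variance

# Puppet's moving mouth at +10 degrees, the ventriloquist's actual voice at 0 degrees.
# Vision is far more precise than hearing here, so the sound seems "thrown."
perceived, uncertainty = combine(loc_visual=10.0, var_visual=1.0,
                                 loc_auditory=0.0, var_auditory=25.0)
print(f"perceived sound location: {perceived:.1f} degrees (variance {uncertainty:.1f})")
```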
01:00:38.720 | And again, not to, I don't wanna go down a rabbit hole,
01:00:41.880 | but I think as much as you're,
01:00:45.880 | I'm sure the listeners appreciate this,
01:00:47.560 | but everything in the brain is an abstraction, right?
01:00:50.800 | I mean, the sensory apparati,
01:00:53.920 | there are the eyes and ears and nose and skin
01:00:56.040 | and taste and all that are taking information
01:00:58.800 | and with interoception,
01:01:00.440 | it's taking information from sensors inside the body,
01:01:03.240 | the enteric nervous system for the gut,
01:01:05.640 | I've got sensory neurons that innervate my liver, et cetera.
01:01:10.540 | Taking all that and the brain is abstracting that
01:01:15.540 | in the same way that if I took a picture of your face
01:01:18.520 | and I handed it to you and I'd say, that's you,
01:01:20.400 | you'd say, yeah, that's me.
01:01:22.000 | But if I were an abstract artist,
01:01:24.240 | I'd be doing a little bit more of what the brain does,
01:01:26.180 | where if I took a pen, pad and paper,
01:01:28.080 | maybe I could do this 'cause I'm a terrible artist,
01:01:29.800 | and I could just mix it up and I'd,
01:01:31.440 | let's say I would make your eyes like water bottles,
01:01:33.860 | but I'd flip them upside down
01:01:34.880 | and I'd start assigning fruits and objects
01:01:37.180 | to the different features of your face.
01:01:38.360 | And I showed you, I say, Lex, that's you.
01:01:40.200 | Say, well, that's not me.
01:01:41.040 | And I'd say, no, but that's my abstraction of you.
01:01:42.800 | But that's what the brain does.
01:01:44.280 | The space time relationship of the neurons that fire,
01:01:47.520 | that encode your face, have no resemblance to your face.
01:01:51.080 | - Right.
01:01:51.920 | - And I think people don't really,
01:01:53.720 | I don't know if people have fully internalized that,
01:01:56.360 | but the day that I,
01:01:57.480 | and I'm not sure I fully internalized that
01:01:59.960 | because it's weird to think about,
01:02:01.400 | but all neurons can do is fire in space and in time,
01:02:05.720 | different neurons in different sequences,
01:02:07.380 | perhaps with different intensities.
01:02:08.840 | It's not clear the action potential is all or none,
01:02:11.160 | although neuroscientists don't like to talk about that,
01:02:13.320 | even though it's been published in "Nature" a couple times.
01:02:16.140 | The action potential for a given neuron
01:02:17.800 | doesn't always have the exact same waveform.
01:02:19.880 | People, it's in all the textbooks,
01:02:21.440 | but you can modify that waveform.
01:02:23.600 | - Well, I mean, there's a lot of fascinating stuff
01:02:26.800 | with neuroscience about the fuzziness
01:02:29.320 | of all the, of the transfer of information
01:02:32.920 | from neuron to neuron.
01:02:33.800 | I mean, we certainly touch upon it
01:02:36.400 | every time we at all try to think about the difference
01:02:38.720 | between artificial neural networks
01:02:40.280 | and biological neural networks,
01:02:42.320 | but can we maybe linger a little bit on this,
01:02:45.480 | on the circuitry that you're getting at?
01:02:48.360 | So the brain is just a bunch of stuff firing
01:02:51.120 | and it forms abstractions that are fascinating
01:02:54.360 | and beautiful, like layers upon layers
01:02:56.160 | upon layers of abstraction.
01:02:58.080 | And I think it, just like when you're programming,
01:03:01.060 | you know, I'm programming in Python,
01:03:02.980 | it's awe-inspiring to think that underneath it all,
01:03:07.720 | it ends up being zeros and ones.
01:03:10.240 | And the computer doesn't know about,
01:03:12.440 | no stupid Python or Windows or Linux.
01:03:14.880 | It only knows about the zeros and ones.
01:03:17.320 | In the same way with the brain,
01:03:18.960 | is there something interesting to you
01:03:25.000 | or fundamental to you about the circuitry of the brain
01:03:28.140 | that allows for the magic that's in our mind to emerge?
01:03:32.260 | How much do we understand?
01:03:36.080 | I mean, maybe even focusing on the vision system.
01:03:38.680 | Is there something specific about the structure
01:03:42.720 | of the vision system, the circuitry of it
01:03:45.400 | that allows for the complexity of the vision system
01:03:49.920 | to emerge or is it all just a complete chaotic mess
01:03:52.680 | that we don't understand?
01:03:53.760 | - It's definitely not all a chaotic mess
01:03:56.160 | that we don't understand if we're talking about vision.
01:03:59.400 | And that's not just 'cause I'm a vision scientist.
01:04:01.440 | - Let's stick to vision.
01:04:02.840 | - Well, because in the beauty of the visual system,
01:04:05.520 | the reason David Hubel and Torsten Wiesel
01:04:07.640 | won the Nobel Prize was because they were brilliant
01:04:10.080 | and forward thinking and adventurous
01:04:11.520 | and all that good stuff.
01:04:12.600 | But the reason that the visual system
01:04:14.380 | is such a great model for addressing
01:04:15.840 | these kinds of questions and other systems are hard
01:04:18.320 | is we can control the stimuli.
01:04:20.280 | We can adjust spatial frequency,
01:04:22.280 | how fine the gratings are, thick gratings, thin gratings.
01:04:25.420 | We can adjust temporal frequency,
01:04:26.840 | how fast things are moving.
01:04:28.160 | We can use cone isolating stimuli.
01:04:31.360 | There's so many things that you can do in a controlled way.
01:04:34.800 | Whereas if we were talking about cognitive encoding,
01:04:37.200 | like encoding the space of concepts or something.
01:04:41.040 | I, like you, if I may,
01:04:45.840 | am drawn to the big questions in neuroscience.
01:04:49.000 | But I confess in part because of some good advice
01:04:53.280 | I got early in my career,
01:04:55.240 | and in part because I'm not perhaps smart enough
01:04:59.560 | to go after the really high level stuff,
01:05:02.320 | I also like to address things that are tractable.
01:05:06.600 | And I want, you know, we need to address
01:05:09.280 | what we can stand to make some ground on at a given time.
01:05:13.040 | - They construct brilliant controlled experiments
01:05:17.160 | just to study, to really literally answer questions about.
01:05:20.280 | - Yeah, I mean, I'm happy to have a talk
01:05:21.980 | about consciousness, but it's a scary talk.
01:05:24.720 | And I think most people don't wanna hear what I have to say,
01:05:27.040 | which is, you know, which is,
01:05:28.920 | we can save that for later perhaps.
01:05:31.520 | - It's an interesting question of,
01:05:33.240 | we talk about psychedelics,
01:05:36.000 | we can talk about consciousness,
01:05:37.240 | we can talk about cognition.
01:05:39.100 | Can experiments in neuroscience be constructed
01:05:43.360 | to shed any kind of light on these questions?
01:05:46.160 | So I mean, it's cool that vision,
01:05:49.580 | I mean, to me, vision is probably
01:05:51.360 | one of the most beautiful things about human beings.
01:05:53.480 | Also from the AI side,
01:05:56.080 | computer vision has some of the most exciting applications
01:06:00.640 | of neural networks is in computer vision.
01:06:03.640 | But it feels like that's a neighbor
01:06:06.440 | of cognition and consciousness.
01:06:08.360 | It's just that we maybe haven't come up
01:06:09.920 | with experiments to study those yet.
01:06:11.800 | - Yeah, the visual system is amazing.
01:06:13.380 | We're mostly visual animals to navigate, survive.
01:06:16.280 | Humans mainly rely on vision, not smell or something else,
01:06:18.840 | but it's a filter for cognition
01:06:23.040 | and it's a strong driver of cognition.
01:06:26.440 | Maybe just 'cause it came up
01:06:28.000 | and then we're moving to higher level concepts.
01:06:30.440 | Just the way the visual system works
01:06:32.720 | can be summarized in a few relatively succinct statements,
01:06:37.000 | unlike most of what I've said,
01:06:38.000 | which has not been succinct at all.
01:06:39.360 | - Let's go there.
01:06:40.200 | - The retina. - What's involved.
01:06:42.640 | - Yeah, so the retina is this three layers
01:06:46.080 | of neuron structure at the back of your eyes,
01:06:47.960 | about as thick as a credit card.
01:06:49.780 | It is a piece of your brain.
01:06:51.600 | And sometimes people think I'm kind of wriggling
01:06:54.720 | out of a reality by saying that.
01:06:56.400 | It's absolutely a piece of the brain.
01:06:58.160 | It's a forebrain structure that in the first trimester,
01:07:00.760 | there's a genetic program that made sure
01:07:04.000 | that that neural retina,
01:07:05.520 | which is part of your central nervous system,
01:07:07.420 | was squeezed out into what's called the embryonic eye cups.
01:07:11.080 | And that the bone formed with a little hole
01:07:13.280 | where the optic nerve is gonna connect it
01:07:14.700 | to the rest of the brain.
01:07:15.540 | And that window into the world
01:07:19.160 | is the only window into the world for a mammal,
01:07:21.680 | which has a thick skull.
01:07:22.680 | Birds have a thin skull,
01:07:23.900 | so light can reach their pineal gland, and lizards too.
01:07:26.520 | And snakes actually have a hole
01:07:28.080 | so that light can make it down into the pineal directly
01:07:30.600 | and entrain melatonin rhythms
01:07:32.000 | for time of day and time of year.
01:07:33.640 | Humans have to do all that through the eyes.
01:07:36.280 | So three layers of neurons that are a piece of your brain,
01:07:39.100 | their central nervous system.
01:07:40.240 | And the optic nerve connects to the rest of the brain.
01:07:42.960 | The neurons in the eye,
01:07:44.840 | some just care about luminance,
01:07:46.680 | just how bright or dim it is.
01:07:48.200 | And they inform the brain about time of day.
01:07:50.300 | And then the central circadian clock
01:07:51.720 | informs every cell in your body about time of day
01:07:53.760 | and make sure that all sorts of good stuff happens
01:07:55.600 | if you're getting light in your eyes at the right times.
01:07:57.640 | And all sorts of bad things happen
01:07:59.040 | if you are getting light randomly
01:08:00.800 | throughout the 24 hour cycle.
01:08:02.400 | We could talk about all that,
01:08:03.600 | but this is a good incentive
01:08:04.600 | for keeping a relatively normal schedule,
01:08:07.200 | a consistent schedule of light exposure.
01:08:10.800 | Consistent schedule, try and keep a consistent schedule.
01:08:13.200 | When you're young, it's easy to go off schedule and recover.
01:08:17.040 | As you get older, it gets harder,
01:08:18.320 | but you see everything from outcomes in cancer patients
01:08:21.820 | to diabetes improves when people are getting light
01:08:26.820 | at a particular time of day
01:08:29.320 | and getting darkness at a particular phase
01:08:30.880 | of the 24 hour cycle.
01:08:32.360 | We were designed to get light and dark
01:08:36.880 | at different times of the circadian cycle.
01:08:39.880 | That's all being,
01:08:40.880 | all that information is coming in
01:08:42.040 | through specialized type of neuron in the retina
01:08:44.400 | called the melanopsin,
01:08:45.240 | intrinsically photosensitive ganglion cell
01:08:46.880 | discovered by David Berson at Brown University.
01:08:49.660 | That's not spatial information.
01:08:52.860 | It's subconscious.
01:08:53.860 | You don't think, oh, it's daytime.
01:08:55.440 | Even if you're looking at the sun, it doesn't matter.
01:08:57.080 | It's a photon counter.
01:08:58.360 | It's literally counting photons.
01:09:00.400 | And it's saying, oh, even though it's a cloudy day,
01:09:02.120 | lots of photons coming in.
01:09:03.360 | It's winter in Boston, fewer photons, so it must be winter.
01:09:05.760 | And your system is a little depressed.
01:09:07.100 | It's spring, you feel alert.
01:09:08.480 | That's not a coincidence.
01:09:09.520 | That's these melanopsin cells signaling the circadian clock.
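(A toy way to see the "photon counter" point: in this sketch a melanopsin-like cell just sums luminance over the whole image, regardless of spatial pattern, while the sum does change with overall light level. Image sizes and values are invented for illustration.)

```python
import numpy as np

rng = np.random.default_rng(1)

def melanopsin_like(image):
    """Intrinsically photosensitive cell as a pure photon counter:
    the response depends only on total light, not on spatial structure."""
    return float(image.sum())

overcast_day = rng.uniform(0.2, 0.6, size=(32, 32))                 # outdoors, cloudy
scrambled = rng.permutation(overcast_day.ravel()).reshape(32, 32)   # same photons, no structure
dim_indoors = rng.uniform(0.0, 0.1, size=(32, 32))                  # evening room light

print("overcast day:   ", round(melanopsin_like(overcast_day), 1))
print("scrambled image:", round(melanopsin_like(scrambled), 1))     # identical: layout is ignored
print("dim indoors:    ", round(melanopsin_like(dim_indoors), 1))
```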
01:09:12.840 | There are a bunch of other neurons in the eye
01:09:15.820 | that signal to the brain.
01:09:17.280 | And they mainly signal the presence of things
01:09:19.880 | that are lighter than background or darker than background.
01:09:22.440 | So a black object would be darker than background,
01:09:25.480 | a light object, lighter than background.
01:09:27.040 | And that all comes in, it's basically looking at pixels.
01:09:29.920 | Mainly, they look at circles
01:09:32.480 | and those neurons have receptive fields,
01:09:34.400 | which not everyone will understand,
01:09:35.720 | but those neurons respond best
01:09:37.640 | to little circles of dark light
01:09:39.000 | or little circles of bright light.
01:09:40.800 | Little circles of red light versus little circles
01:09:43.120 | of green light or blue light.
01:09:45.420 | And so it sounds very basic.
01:09:47.300 | It's like red, green, blue, and circles brighter or dimmer
01:09:51.560 | than what's next to it.
01:09:53.080 | But that's basically the only information
01:09:55.440 | that's sent down the optic nerve.
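(The "little circles of bright or dark light" correspond to the classic center-surround receptive field, often approximated as a difference of Gaussians. Here is a minimal sketch, with arbitrary sizes and widths, of a single ON-center unit: excited by a bright spot, indifferent to uniform light, suppressed by a dark spot.)

```python
import numpy as np

def gaussian(size, sigma):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def on_center_response(image, size=21, sigma_center=1.5, sigma_surround=4.0):
    """Difference-of-Gaussians receptive field: excitatory center minus
    inhibitory surround, evaluated at the middle of the image."""
    rf = gaussian(size, sigma_center) - gaussian(size, sigma_surround)
    return float((image * rf).sum())

size = 21
bright_spot = np.zeros((size, size)); bright_spot[size // 2, size // 2] = 1.0
uniform_gray = np.full((size, size), 0.5)
dark_spot = np.ones((size, size)); dark_spot[size // 2, size // 2] = 0.0

print("bright spot :", round(on_center_response(bright_spot), 3))   # strong positive response
print("uniform gray:", round(on_center_response(uniform_gray), 3))  # ~0: center and surround cancel
print("dark spot   :", round(on_center_response(dark_spot), 3))     # negative: suppressed
```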
01:09:57.500 | And when we say information, we can be very precise.
01:10:00.200 | I don't mean little bits of red
01:10:01.520 | traveling down the optic nerve.
01:10:02.680 | I mean spikes, neural action potentials in space and time,
01:10:06.960 | which for you is like makes total sense.
01:10:08.640 | But I think for a lot of people,
01:10:10.340 | it's actually beautiful to think about
01:10:14.200 | all that information in the outside world
01:10:15.880 | is converted into a language that's very simple.
01:10:18.440 | It's just like a few syllables, if you will.
01:10:21.000 | And those syllables are being shouted down the optic nerve,
01:10:24.280 | converted into a totally different language,
01:10:25.980 | like Morse code, beep, beep, beep, beep, beep, beep.
01:10:28.480 | Goes into the brain and then the thalamus
01:10:30.080 | essentially responds in the same way that the retina does.
01:10:32.640 | Except the thalamus is also weighting things.
01:10:36.480 | It's saying, you know what?
01:10:38.440 | That thing was moving faster than everything else,
01:10:42.360 | or it's brighter than everything else.
01:10:43.920 | So that signal I'm gonna get up,
01:10:45.360 | I'm gonna allow up to cortex.
01:10:47.660 | Or that signal is much redder than it is green.
01:10:52.040 | So I'm gonna let that signal go through.
01:10:53.320 | That signal is much, eh,
01:10:55.040 | it's kind of more like the red next to it.
01:10:57.800 | Throw that out.
01:10:58.740 | The information just doesn't get up into your cortex.
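(A toy sketch of that gating idea: the "thalamus" below relays only signals that stand out enough from the average of everything else coming in. The threshold and the example values are invented for illustration, not a circuit model.)

```python
import numpy as np

def thalamic_gate(signals, gain=1.5):
    """Relay a signal up to 'cortex' only if it differs from the background
    (the average of the other incoming signals) by more than gain * spread."""
    signals = np.asarray(signals, dtype=float)
    relayed = []
    for i, s in enumerate(signals):
        background = np.delete(signals, i).mean()
        if abs(s - background) > gain * signals.std():
            relayed.append((i, float(s)))
    return relayed

# Mostly similar, reddish-looking inputs plus one item that is much brighter/faster.
incoming = [0.51, 0.48, 0.50, 0.49, 2.30, 0.52]
print("relayed to cortex:", thalamic_gate(incoming))   # only the outlier gets through
```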
01:11:00.560 | And then in cortex, of course, is where perceptions happen.
01:11:02.880 | And in V1, if you will, visual area one,
01:11:06.320 | but also some neighboring areas,
01:11:08.080 | you start getting representations
01:11:10.160 | of things like oriented lines.
01:11:12.200 | So there's a neuron that responds to this angle of my hand
01:11:15.160 | versus vertical, right?
01:11:17.520 | This is the defining work of Hubel and Wiesel's Nobel.
01:11:20.680 | And it's a very systematic map of orientation,
01:11:23.540 | line orientation, direction of movement, and so forth.
01:11:27.440 | And that's pretty much, and color.
01:11:29.320 | And that's how the visual system is organized
01:11:31.080 | all the way up to the cortex.
01:11:33.120 | So it's hierarchical.
01:11:34.440 | You don't build, I wanna be clear,
01:11:35.980 | it's hierarchical because you don't build up that line
01:11:38.120 | by suddenly having a neuron that responds to lines
01:11:40.940 | in some random way.
01:11:42.440 | It responds to lines by taking all the dots
01:11:44.560 | that are aligned in a vertical stack,
01:11:46.900 | and they all converge on one neuron.
01:11:49.080 | And then that neuron responds to vertical lines.
01:11:51.080 | So it's not random.
01:11:52.840 | There's no abstraction at that point, in fact.
01:11:55.120 | In fact, if I showed you a black line,
01:11:57.720 | I could be sure that if I were imaging V1,
01:12:00.760 | that I would see a representation of that black line
01:12:03.160 | as a vertical line somewhere in your cortex.
01:12:05.600 | So at that point, it's absolutely concrete.
01:12:10.240 | It's not abstract.
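(The convergence just described, many center-surround units whose centers lie along a line all feeding one cortical cell, is the classic Hubel-and-Wiesel simple-cell scheme. A minimal sketch with made-up sizes: the summed unit responds much more strongly to a vertical bar than to a horizontal one.)

```python
import numpy as np

def gaussian(size, sigma):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return g / g.sum()

def dog(size=9, sigma_center=1.0, sigma_surround=2.5):
    return gaussian(size, sigma_center) - gaussian(size, sigma_surround)

def simple_cell_response(image, centers, rf_size=9):
    """Sum the outputs of ON-center subunits whose receptive-field centers
    are stacked vertically -- the convergence scheme described above."""
    rf = dog(rf_size)
    half = rf_size // 2
    total = 0.0
    for r, c in centers:
        patch = image[r - half:r + half + 1, c - half:c + half + 1]
        total += (patch * rf).sum()
    return float(total)

size, mid = 41, 20
stacked_centers = [(mid - 8, mid), (mid, mid), (mid + 8, mid)]   # a vertical column of subunits

vertical_bar = np.zeros((size, size));   vertical_bar[:, mid] = 1.0
horizontal_bar = np.zeros((size, size)); horizontal_bar[mid, :] = 1.0

print("vertical bar  :", round(simple_cell_response(vertical_bar, stacked_centers), 3))
print("horizontal bar:", round(simple_cell_response(horizontal_bar, stacked_centers), 3))
```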
01:12:11.600 | But then things get really mysterious.
01:12:14.360 | Some of that information travels further up
01:12:16.560 | into the cortex, so that,
01:12:17.920 | and goes from one visual area to the next,
01:12:19.720 | to the next, to the next,
01:12:21.000 | so that by time you get into an area
01:12:23.000 | that Nancy Kanwisher at MIT has studied
01:12:26.240 | much of her career, the fusiform face area,
01:12:29.200 | you start finding single neurons
01:12:32.040 | that respond only to your father's face
01:12:35.720 | or to Joe Rogan's face,
01:12:37.800 | regardless of the orientation of his face.
01:12:40.420 | I'm sure if you saw Joe, 'cause you know him well,
01:12:43.160 | from across the room and you just saw his profile,
01:12:45.240 | you'd be like, "Oh, that's Joe."
01:12:46.760 | Walk over and say hello.
01:12:47.960 | The orientation of his face isn't there.
01:12:50.640 | You wouldn't even see his eyes necessarily,
01:12:52.800 | but he's represented in some abstract way by a neuron
01:12:56.640 | that actually would be called the Joe Rogan neuron
01:12:58.960 | or black Joe neurons.
01:12:59.800 | - He might have limits.
01:13:01.160 | I might not recognize him if he was upside down
01:13:03.360 | or something like that.
01:13:04.200 | It'd be fascinating to see what the limits
01:13:06.480 | of that Joe Rogan concept is.
01:13:08.240 | - So Nancy's lab has done that
01:13:10.000 | because early on she was challenged by people that said,
01:13:12.440 | "There aren't face neurons.
01:13:14.240 | "There are neurons that they only respond to space and time,
01:13:17.640 | "shapes and things like that,
01:13:19.120 | "moving in particular directions and orientations."
01:13:21.320 | And it turns out Nancy was right.
01:13:23.880 | They use these stimuli called Greeble stimuli,
01:13:26.400 | which any computer programmer would appreciate,
01:13:28.720 | which kind of morphs a face into something gradually
01:13:31.440 | that eventually just looks like this alien thing
01:13:33.880 | they call the Greeble.
01:13:35.120 | And the neurons don't respond to Greebles.
01:13:37.760 | In most cases, they only respond to faces
01:13:40.320 | and familiar faces.
01:13:41.440 | Anyway, I'm summarizing a lot of literature
01:13:43.040 | and forgive me, Nancy, and the Greeble people,
01:13:45.640 | if there are any, they're like,
01:13:46.600 | "Don't come after me with pitchforks.
01:13:48.640 | "Actually, you know what?
01:13:49.480 | "Come after me with pitchforks.
01:13:50.320 | "I think you know what I'm trying to do here."
01:13:52.200 | So the point is that in the visual system,
01:13:54.620 | it's very concrete up until about visual area four,
01:13:58.680 | which has color pinwheels
01:14:00.320 | and seems to respond to pinwheels of colors.
01:14:02.360 | And so the stimuli become more and more elaborate,
01:14:06.120 | but at some point you depart that concrete representation
01:14:10.240 | and you start getting abstract representations
01:14:12.080 | that can't be explained by simple point-to-point wiring.
01:14:15.960 | And to take a leap out of the visual system
01:14:18.400 | to the higher level concepts,
01:14:20.800 | what we talked about in the visual system
01:14:22.600 | maps to the auditory system where you're encoding,
01:14:24.840 | what, frequency of tone, sweeps.
01:14:28.480 | So this is gonna sound weird to do, but you know,
01:14:30.720 | like a Doppler, like hearing something,
01:14:32.640 | a car passing by, for instance.
01:14:34.680 | But at some point you get into motifs of music
01:14:38.120 | that can't be mapped to just a,
01:14:40.280 | what they call a tonotopic map of frequency.
01:14:43.440 | You start abstracting.
01:14:44.600 | And if you start thinking about concepts of creativity
01:14:48.280 | and love and memory, like what is the map of memory space?
01:14:52.840 | Well, your memories are very different than mine,
01:14:55.800 | but presumably there's enough structure
01:14:58.180 | at the early stages of memory processing
01:15:00.400 | or at the early stages of emotional processing
01:15:03.040 | or at the earlier stages of creative processing
01:15:06.560 | that you have the building blocks,
01:15:08.420 | your zeros and ones, if you will,
01:15:10.920 | but you depart from that eventually.
01:15:14.440 | Now, the exception to this,
01:15:15.920 | and I wanna be really clear
01:15:17.120 | 'cause I was just mainly talking about neocortex,
01:15:20.320 | the six layered structure on the outside of the brain
01:15:22.440 | that explains a lot of human abilities,
01:15:25.640 | other animals have them too,
01:15:27.280 | is that subcortical structures are a lot more like machines.
01:15:31.500 | It's more plug and chug.
01:15:34.240 | And what I'm talking about is the machinery
01:15:35.940 | that controls heart rate and breathing and receptive fields,
01:15:39.640 | neurons that respond to things like temperature
01:15:43.500 | on the top of my left hand.
01:15:45.300 | And one of the,
01:15:46.340 | I came into neuroscience from a more of a perspective
01:15:49.740 | initially of psychology,
01:15:50.860 | but one of the reasons I forced upon myself
01:15:54.260 | to learn some electrophysiology,
01:15:56.660 | not a ton, but enough,
01:15:58.160 | and some molecular biology and about circuitry
01:16:00.620 | is that one of the most beautiful experiences
01:16:02.720 | you can have in life, I'm convinced,
01:16:05.440 | is to lower an electrode into the cortex
01:16:09.360 | and to show a person or an animal,
01:16:12.660 | you do this ethically, of course,
01:16:14.840 | stimulus like an oriented line or a face.
01:16:17.680 | And you can convert the recordings
01:16:20.160 | coming off of that electrode into an audio signal
01:16:22.520 | or an audio monitor,
01:16:23.360 | and you can hear what they call hash.
01:16:25.520 | It's not the hash you smoke,
01:16:26.600 | it's the hash you hear.
01:16:27.680 | And it sounds like,
01:16:30.000 | it just sounds like noise.
01:16:33.260 | And in the cortex, eventually you find a stimulus
01:16:36.000 | that gets the neuron to spike and fire action potentials
01:16:38.960 | that are converted into an auditory stimulus
01:16:41.160 | that are very concrete, crack, crack, crack,
01:16:43.700 | sounds like a bat cracking,
01:16:45.200 | like home runs or outfield balls.
01:16:48.020 | When you drop electrodes deeper into the thalamus
01:16:52.920 | or into the hypothalamus
01:16:54.640 | or into the brainstem areas that control breathing,
01:16:57.220 | it's like a machine.
01:16:58.300 | You never hear hash.
01:16:59.720 | You drop the electrode down.
01:17:00.960 | This could be like a grungy old tungsten electrode,
01:17:05.700 | not high fidelity electrode,
01:17:07.200 | as long as it's got a little bit of insulation on it.
01:17:08.840 | You plug it into an audio monitor,
01:17:10.120 | it's picking up electricity.
01:17:11.560 | And if it's a visual neuron
01:17:13.520 | and it's in the thalamus or the retina,
01:17:15.040 | and you walk in front of that animal or person,
01:17:17.520 | that neuron goes, [imitates electric noise]
01:17:20.920 | and then you walk away and it stops.
01:17:23.000 | And you put your hand in front of the eye again,
01:17:25.320 | and it goes, [imitates electric noise]
01:17:27.120 | and you could do that for two days.
01:17:29.720 | And that neuron will just,
01:17:30.960 | every time there's a stimulus, it fires.
01:17:33.000 | So whereas before, it's a question of how much information
01:17:35.540 | is getting up to cortex,
01:17:36.560 | and then these abstractions happening
01:17:38.720 | where you're creating these ideas.
01:17:40.600 | When you go subcortical, everything is-
01:17:44.760 | - There's no abstractions.
01:17:45.600 | - It's two plus two equals four.
01:17:46.960 | There's no abstractions.
01:17:48.240 | And this is why I know we have some common friends
01:17:51.140 | at Neuralink, and I love the demonstration
01:17:53.320 | they did recently.
01:17:54.140 | I'm a huge fan of what they're doing
01:17:55.680 | and where they're headed.
01:17:56.700 | And no, I don't get paid to say that.
01:17:58.960 | And I have no business relationship to them.
01:18:01.200 | I'm just a huge fan of the people and the mission.
01:18:03.540 | But my question was to some of them,
01:18:07.080 | when are you going to go subcortical?
01:18:08.600 | 'Cause if you want to control an animal,
01:18:10.880 | you don't do it in the cortex.
01:18:12.460 | The cortex is like the abstract painting
01:18:14.520 | I made of your face.
01:18:15.680 | Removing one piece or changing something
01:18:19.000 | may or may not matter for the abstraction.
01:18:21.300 | But when you are in the subcortical areas of the brain,
01:18:24.780 | a stimulating electrode can evoke an entire behavior
01:18:27.860 | or an entire state.
01:18:29.180 | And so the brain, if we're gonna have a discussion
01:18:31.580 | about the brain and how the brain works,
01:18:33.560 | we need to really be clear which brain.
01:18:36.740 | Because everyone loves neocortex.
01:18:39.020 | It's like, oh, canonical circuits in cortex,
01:18:41.200 | we can get the cortical connectome.
01:18:42.580 | And sure, necessary, but not sufficient.
01:18:45.620 | Not to be able to plug in patterns
01:18:48.360 | of electrical stimulation and get behavior.
01:18:50.260 | Eventually we'll get there.
01:18:51.780 | But if you're talking subcortical circuits,
01:18:54.240 | that's where the action is.
01:18:55.420 | That's where you could potentially cure Parkinson's
01:18:57.380 | by stimulating the subthalamic nucleus.
01:18:59.600 | Because we know that it gates motor activation patterns
01:19:02.740 | in very predictable ways.
01:19:04.440 | So I think for those that are interested in neuroscience,
01:19:07.180 | it pays to pay attention to like,
01:19:09.420 | is this a circuit that abstracts the sensory information?
01:19:13.000 | Or is it just one that builds up hierarchical models
01:19:16.700 | in a very predictable way?
01:19:18.300 | And there's a huge chasm in neuroscience right now,
01:19:21.000 | because there's no conceptual leadership.
01:19:24.060 | No one knows which way to go.
01:19:25.700 | And this is why I think Neuralink
01:19:27.500 | has captured an amazing opportunity, which was,
01:19:29.700 | okay, well, while all you academic research labs
01:19:31.920 | are figuring all this stuff out,
01:19:33.300 | we're gonna pick a very specific goal
01:19:35.180 | and make the goal the end point.
01:19:36.900 | And some academic laboratories do that.
01:19:39.180 | But I think that's a beautiful way
01:19:40.660 | to attack this whole thing about the brain,
01:19:43.980 | because it's very concrete.
01:19:45.100 | Let's restore motion to the Parkinsonian patient.
01:19:48.660 | Academic labs want to do that too, of course.
01:19:50.820 | Let's restore speech to the stroke patient.
01:19:55.140 | But there's nothing abstract about that.
01:19:57.180 | That's about figuring out the solution
01:19:59.060 | to a particular problem.
01:20:00.260 | So anyway, those are my,
01:20:01.140 | and I admit I've mixed in a lot of opinion there.
01:20:04.660 | But having spent some time, like 25 years,
01:20:07.300 | digging around in the brain and listening to neurons firing
01:20:09.620 | and looking at them anatomically,
01:20:11.620 | I think given it's 2020, we need to ask the right,
01:20:15.580 | you know, the way to get better answers,
01:20:16.740 | ask better questions.
01:20:18.180 | And the really high level stuff is fun.
01:20:21.700 | It makes for good conversation.
01:20:23.860 | And it has brought enormous interest.
01:20:27.580 | But I think the questions about consciousness
01:20:29.780 | and dreaming and stuff, they're fascinating.
01:20:32.100 | But I don't know that we're there yet.
01:20:34.460 | - So you're saying there might be a chasm
01:20:35.780 | in the two views of,
01:20:41.060 | the power of the brain arising from the circuitry
01:20:46.060 | that forms abstractions or the power of the brain
01:20:51.380 | arising from the majority of the circuitry
01:20:54.380 | that's just doing very brute force, dumb things
01:20:59.380 | that are like, that don't have any fancy
01:21:02.140 | kind of stuff going on.
01:21:04.140 | That's really interesting to think about.
01:21:05.860 | - And which one to go after first.
01:21:08.180 | And here I'm poaching badly from someone I've never met,
01:21:11.620 | but whose work I follow, which is,
01:21:13.820 | and it was actually on your podcast.
01:21:15.260 | I think Elon Musk said, you know, basically the brain is a,
01:21:18.740 | well, you say a monkey brain with a supercomputer on top.
01:21:21.100 | And I thought that's actually probably the best description
01:21:23.900 | of the brain I've ever heard
01:21:24.940 | because it captures a lot of important features
01:21:27.140 | like limbic friction, right?
01:21:29.220 | But we think of like, oh, you know,
01:21:30.780 | when we're making plans, we're using the prefrontal cortex
01:21:33.280 | and we're executive function and all this kind of stuff.
01:21:35.980 | But think about the drug addict
01:21:37.580 | who's driven to go pursue heroin or cocaine.
01:21:41.540 | They make plans.
01:21:42.880 | So clearly they use their frontal cortex.
01:21:44.740 | It's just that it's been hijacked by the limbic system
01:21:47.460 | and all the monkey brain, as you refer to.
01:21:49.540 | It's really not fair to monkeys though, Elon,
01:21:51.260 | because actually monkeys can make plans.
01:21:53.180 | They just don't make plans as sophisticated as us.
01:21:55.340 | I've spent a lot of time with monkeys,
01:21:56.620 | but I've also spent a lot of time with humans.
01:21:58.060 | Anyway, I'm-
01:21:58.900 | - But you're putting, you're saying like,
01:22:00.380 | there's a lot of value to focusing on the monkey brain
01:22:03.100 | or whatever the heck you call it.
01:22:04.780 | - I do, because let's say I had an ability
01:22:07.700 | to place a chip anywhere I wanted in the brain today
01:22:10.780 | and activate it or inhibit that area.
01:22:13.340 | I'm not sure I would put that chip in neocortex,
01:22:15.640 | except maybe to just kind of have some fun
01:22:17.380 | and see what happens.
01:22:19.100 | The reason is it's an abstraction machine.
01:22:21.180 | And especially if I wanted to make a mass production tool,
01:22:24.740 | a tool in mass production that I could give
01:22:26.500 | to a lot of people, because it's quite possible
01:22:28.640 | that your abstractions are different enough than mine
01:22:30.980 | that I wouldn't know what patterns of firing to induce.
01:22:34.400 | But if I want, let's say I want to increase my level
01:22:37.340 | of focus and creativity, well, then I would love
01:22:40.900 | to be able to, for instance,
01:22:43.140 | control my level of limbic friction.
01:22:45.140 | I would love to be able to wake up and go,
01:22:46.580 | oh, you know what?
01:22:47.420 | I have an eight o'clock appointment.
01:22:48.240 | I wake up slowly.
01:22:49.500 | So between seven, eight,
01:22:50.380 | but I want to do a lot of linear thinking.
01:22:52.020 | So you know what?
01:22:52.860 | I'm going to just, I'm going to turn down the limbic friction
01:22:56.220 | and, or ramp up prefrontal cortex's activation.
01:23:00.300 | So there's a lot of stuff that can happen in the thalamus
01:23:02.820 | with sensory gating.
01:23:04.720 | For instance, you could shut down that shell
01:23:06.860 | around the thalamus and allow more creative thinking
01:23:09.160 | by allowing more lateral connections.
01:23:11.080 | These would be some of the,
01:23:12.400 | those would be the experiments I'd want to do.
01:23:14.040 | So they're in the subcortical, quote unquote, monkey brain,
01:23:17.400 | but you could then look at what sorts of abstract thoughts
01:23:20.960 | and behaviors would arise from that, rather than,
01:23:24.840 | and here I'm not pointing the finger at Neuralink at all,
01:23:27.320 | but there's this obsession with neocortex.
01:23:30.000 | But I, I'm going to, well, I might lose a few friends,
01:23:32.940 | but I'll hopefully gain a few.
01:23:34.340 | And also, as for why people spend so much time
01:23:38.300 | in neocortex, I have a fact and an opinion.
01:23:42.340 | One fact is that you can image there
01:23:44.060 | and you can record there.
01:23:45.380 | Right now, the two photon and one photon microscopy methods
01:23:49.680 | that allow you to image deep into the brain
01:23:51.620 | still don't allow you to image down really deep
01:23:54.020 | unless you're jamming prisms in there and endoscopes.
01:23:56.660 | And then the endoscopes are very narrow.
01:23:58.300 | So you're getting very, you know,
01:23:59.380 | it's like looking at the bottom of the ocean
01:24:01.220 | through a spotlight.
01:24:03.500 | And so it's much easier to look at the waves up on top.
01:24:06.220 | Right?
01:24:07.060 | So let's face it, folks,
01:24:08.540 | a lot of the reasons why there's so many recordings
01:24:10.480 | in layer two, three of cortex
01:24:11.940 | with all this advanced microscopy
01:24:13.380 | is 'cause it's very hard to image deeper.
01:24:15.900 | Now the microscopes are getting better.
01:24:18.540 | And thanks to amazing work,
01:24:20.100 | mainly of engineers and chemists and physicists,
01:24:22.340 | let's face it, they're the ones who brought
01:24:23.820 | this revolution to neuroscience in the last 10 years or so.
01:24:27.220 | You can image deeper.
01:24:28.300 | But mostly we don't, and
01:24:30.440 | that's why you see so many reports on layers two and three.
01:24:33.460 | The other thing, which is purely opinion,
01:24:35.380 | and I'm not going after anybody here,
01:24:37.080 | but is that as long as there's no clear right answer,
01:24:40.900 | it becomes a little easier to do creative work
01:24:43.940 | in a structure where no one really knows how it works.
01:24:46.940 | So it's fun to probe around
01:24:48.220 | because anything you see is novel.
01:24:49.980 | If you're gonna work in the thalamus or the pulvinar
01:24:52.920 | or the hypothalamus,
01:24:54.460 | so these structures that have been known about
01:24:55.940 | since the sixties and seventies
01:24:57.340 | and really for centuries,
01:25:00.100 | you have to combat
01:25:01.940 | existing models.
01:25:04.860 | And whereas in cortex, no one knows how the thing works.
01:25:08.420 | Neocortex, six layer cortex.
01:25:10.300 | And so there's a lot more room for discovery.
01:25:13.500 | There's a lot more room for discovery.
01:25:14.620 | And I'm not calling anyone out.
01:25:15.700 | I love cortex.
01:25:16.540 | We've published some papers on cortex.
01:25:17.900 | It's super interesting.
01:25:19.260 | But I think with the tools that are available nowadays
01:25:23.260 | and where people are trying to head
01:25:25.060 | of not just reading from the brain,
01:25:27.160 | monitoring activity, but writing to the brain,
01:25:29.140 | I think we really have to be careful
01:25:30.880 | and we need to be thoughtful
01:25:32.380 | about what are we trying to write?
01:25:34.860 | What script are we trying to write?
01:25:36.300 | Because there are many brain structures
01:25:37.700 | for which we already know what scripts they write.
01:25:40.100 | And I think there's tremendous value there.
01:25:41.660 | I don't think it's boring.
01:25:43.140 | The fact that they act like machines makes them predictable.
01:25:45.920 | Those are your zeros and ones.
01:25:47.780 | Let's start there.
01:25:49.060 | But what's sort of happening in this field
01:25:51.700 | of writing to the brain is there's this idea.
01:25:54.180 | And again, I want to be clear,
01:25:55.060 | I'm not pointing at Neuralink.
01:25:56.060 | I'm mainly pointing at the neocortical jockeys out there
01:26:00.060 | that you go and you observe patterns
01:26:03.060 | and then you think replaying those patterns
01:26:04.540 | is going to give rise to something interesting.
01:26:07.700 | I should call out one experiment or two experiments
01:26:10.340 | which were done by Susumu Tonegawa,
01:26:12.260 | Nobel Prize winner from MIT,
01:26:14.500 | who's done important work in memory (immunology, of course,
01:26:17.220 | is where he got his Nobel),
01:26:18.460 | as well as Mark Mayford's lab at UC San Diego.
01:26:21.780 | They did an experiment where they monitored
01:26:23.560 | a bunch of neurons while an animal learned something.
01:26:26.700 | Then they captured those neurons
01:26:28.220 | through some molecular tricks
01:26:30.060 | so they could replay the neurons.
01:26:32.180 | So now there's like perfect case scenario.
01:26:34.320 | It's like, okay, you monitor the neurons in your brain.
01:26:37.740 | Then I say, okay, neurons one through 100
01:26:39.700 | were played in a particular sequence.
01:26:41.300 | So you know the space time,
01:26:42.380 | you know the keys on the piano that were played
01:26:43.900 | that gave rise to the song, which was the behavior.
01:26:46.660 | And then you go back and you reactivate those neurons
01:26:48.900 | except you reactivate them all at once,
01:26:51.500 | like slamming on all the keys once on the piano,
01:26:54.360 | and you get the exact same behavior.
01:26:57.500 | So the space time code may be meaningless
01:27:03.040 | for some structures.
01:27:05.000 | Now that's freaky.
01:27:06.840 | That's a scary thing because what that means
01:27:09.800 | is that all the space time firing in cortex,
01:27:13.180 | the space part may matter more than the time part.
01:27:17.620 | So, you know, rate codes and space time codes,
01:27:20.520 | we don't know.
01:27:22.200 | And, you know, I'd rather have,
01:27:23.440 | I'd rather deliver more answers
01:27:24.760 | in this discussion than questions,
01:27:26.220 | but I think it's an important consideration.
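(To make the space-time code idea concrete, here is a minimal toy sketch in Python. The neuron IDs and spike times are invented for illustration; this is not data or code from the experiments described above, just the distinction between the full firing sequence and the collapsed set of active neurons.)

```python
# Toy illustration only: made-up neuron IDs and spike times.
# A "space-time" code records which neurons fired AND when (the melody);
# collapsing the time axis leaves only which neurons fired (all keys slammed at once).
learned_pattern = [
    (12, 0), (47, 15), (3, 30), (88, 45), (47, 60),  # (neuron_id, time_ms) during learning
]

space_time_code = learned_pattern                          # full sequence: keys plus timing
space_only_code = sorted({n for n, _ in learned_pattern})  # just the set of keys

print("space-time code:", space_time_code)
print("space-only code:", space_only_code)
```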
01:27:29.080 | - You're saying some of the magic is in the early stages
01:27:31.280 | in what's closest to the raw information.
01:27:34.720 | - I believe so.
01:27:35.840 | I believe so.
01:27:36.900 | You know the stimulus,
01:27:38.640 | you know the neuron then codes that stimulus.
01:27:40.280 | So you know the transformation.
01:27:41.980 | When I say this for those of you
01:27:43.560 | that don't think about sensory transformations,
01:27:45.320 | it's like, I can show you a red circle.
01:27:48.440 | And then I look at how many times the neuron fires
01:27:51.360 | in response to that red circle.
01:27:53.000 | And then I could show the red circle a bunch of times,
01:27:54.760 | then a green circle, and see if it changes.
01:27:56.040 | And then essentially that number of spikes
01:27:57.720 | is the transformation.
01:27:59.580 | You've converted red circle into like three action
01:28:02.080 | potentials, you know, beep, beep, beep,
01:28:04.160 | or whatever you want to call it, you know,
01:28:05.400 | for those that think in sound space.
01:28:07.200 | So that's what you've created.
01:28:09.180 | You know the transformation and you march up
01:28:11.540 | what's called the neuraxis, as you go from the periphery
01:28:13.840 | up into the cortex.
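(As a rough sketch of what "knowing the transformation" means here, the toy Python below presents each stimulus many times and averages the spike counts. The stimuli, firing rates, and the Poisson-style spiking are all invented assumptions for illustration, not a model from the conversation.)

```python
# Toy illustration only: estimate a neuron's "transformation" as the average
# spike count it produces for each stimulus. Rates below are made up.
import random

random.seed(0)

true_rate = {"red_circle": 3.0, "green_circle": 0.5}  # hypothetical mean spikes per presentation

def present(stimulus):
    """Simulate one presentation: draw a Poisson spike count for a 1-second window."""
    lam = true_rate[stimulus]
    count, t = 0, random.expovariate(lam)
    while t < 1.0:
        count += 1
        t += random.expovariate(lam)
    return count

def estimate_transformation(stimulus, n_trials=200):
    """Show the stimulus many times and average the spike counts."""
    return sum(present(stimulus) for _ in range(n_trials)) / n_trials

for s in true_rate:
    print(s, "->", round(estimate_transformation(s), 2), "spikes per presentation")
```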
01:28:16.720 | And we know that, and I know Lisa Feldman Barrett,
01:28:21.720 | or is it Barrett Feldman?
01:28:23.520 | - Barrett Feldman. - Barrett Feldman,
01:28:24.560 | excuse me, Lisa, that talked a lot about this,
01:28:27.560 | that, you know, birds can do sophisticated things
01:28:29.560 | and whatnot as well.
01:28:30.480 | But humans, there's a strong, what we call cephalization.
01:28:34.000 | A lot of the processing has moved up into the cortex
01:28:36.480 | and out of these subcortical areas,
01:28:38.400 | but it happens nonetheless.
01:28:40.000 | And so as long as you know the transformations,
01:28:42.400 | you are in a perfect place to build machines
01:28:45.640 | or add machines to the brain that exactly mimic
01:28:49.400 | what the brain wants to do,
01:28:50.720 | which is take events in the environment
01:28:53.240 | and turn them into internal firing of neurons.
01:28:56.160 | - So the mastery of the brain can happen
01:28:57.760 | at the earlier levels.
01:28:59.000 | You know, another perspective on it is,
01:29:00.900 | you're saying this means that humans aren't that special.
01:29:05.560 | If we look at the evolutionary time scale,
01:29:08.200 | the leap to intelligence is not that special.
01:29:10.560 | So like the extra layers of abstraction
01:29:14.680 | aren't where most of the magic of intelligence happens,
01:29:18.520 | which gives me hope that maybe if that's true,
01:29:21.800 | that means the evolution of intelligence
01:29:23.520 | is not that rare of an event.
01:29:25.800 | - I certainly hope not.
01:29:27.080 | I hope there are other forms of intelligence.
01:29:31.440 | I mean, I think what humans are really good at,
01:29:35.080 | and here I want to be clear that this is not a formal model,
01:29:38.240 | but what humans are really good at
01:29:40.520 | is taking that plasma barbell
01:29:43.480 | that we were talking about earlier
01:29:45.280 | and not just using it for analysis of space,
01:29:48.020 | like the immediate environment,
01:29:49.680 | but also using historical information.
01:29:52.000 | Like I can read a book today about the history of medicine.
01:29:54.320 | I happen to be doing that lately
01:29:55.720 | for some stuff I'm researching.
01:29:57.160 | And I can take that information,
01:29:58.560 | and if I want, I can inject it into my plans for the future.
01:30:02.080 | Other animals don't seem to do that
01:30:05.000 | over the same time scales that we do.
01:30:07.920 | Now, it may be that the chipmunks
01:30:09.480 | are all hiding little like notebooks everywhere
01:30:11.600 | in the form of like little dirt castles
01:30:14.120 | or something that we don't understand.
01:30:15.560 | I mean, the waggle dance of the bee
01:30:16.960 | is in the most famous example.
01:30:18.840 | Bees come back to the hive,
01:30:20.280 | they orient relative to the honeycomb and they waggle.
01:30:23.880 | There's a guy down in Australia named Srinivasan
01:30:25.640 | who studied this, and it's really interesting.
01:30:28.120 | No one really understands it, except he understands it best.
01:30:32.160 | The bee waggles in a couple of ways
01:30:34.440 | relative to the orientation of the honeycomb,
01:30:36.160 | and then all the other bees see that, it's visual,
01:30:39.960 | and they go out and they know the exact coordinate system
01:30:42.680 | to get to that source of whatever it was,
01:30:45.360 | the food, and bring it back.
01:30:47.240 | And he's done it where they isolate the bees,
01:30:49.040 | he's changed the visual flight environment, all this stuff.
01:30:51.520 | They are communicating,
01:30:53.000 | and they're communicating something
01:30:54.280 | about something they saw recently,
01:30:56.000 | but it doesn't extend over very long periods of time.
01:30:59.920 | The same way that you and I can both read a book
01:31:02.520 | or you can recommend something to me,
01:31:03.720 | and then we could converge on a set of ideas later.
01:31:06.320 | And in fairness, 'cause she was the one that said it,
01:31:09.720 | and I didn't, and I hadn't even thought of it.
01:31:12.680 | When you talked to Lisa on your podcast,
01:31:14.960 | she brought up something beautiful,
01:31:17.920 | which is that it never really occurred to me,
01:31:20.160 | and I was sort of embarrassed that it hadn't,
01:31:21.920 | but it's really beautiful and brilliant,
01:31:24.880 | which is that we don't just encode senses
01:31:28.440 | in the form of like color and light
01:31:29.840 | and sound waves and taste,
01:31:30.920 | but ideas become a form of sensory mapping.
01:31:34.200 | And that's where the really, really cool
01:31:37.420 | and exciting stuff is,
01:31:38.480 | but we just don't understand what the receptive fields are
01:31:40.680 | for ideas.
01:31:41.520 | What's an idea receptive field?
01:31:43.880 | - And how they're communicated between humans,
01:31:47.420 | because we seem to be able to encode those ideas
01:31:50.840 | in some kind of way.
01:31:52.600 | Yes, it's taking all the raw information
01:31:54.440 | and the internal physical states,
01:31:57.000 | that sensory information put into this concept blob
01:32:00.920 | that we can then store,
01:32:01.920 | and then we're able to communicate that.
01:32:03.600 | - Yeah, your abstractions are different than mine.
01:32:05.520 | I actually think the comment section,
01:32:07.760 | on social media,
01:32:09.640 | is a beautiful example of where the abstractions
01:32:12.400 | are different for different people.
01:32:14.160 | So much of the misunderstanding of the world
01:32:16.720 | is because of these idea receptive fields.
01:32:20.320 | They're not the same.
01:32:21.600 | Whereas I can look at a photoreceptor neuron
01:32:23.680 | or olfactory neuron or a V1 neuron,
01:32:25.920 | and I am certain, I would bet my life
01:32:28.480 | that yours look and respond exactly the same way
01:32:32.000 | that Lisa's do and mine do.
01:32:34.200 | But once you get beyond there, it gets tricky.
01:32:35.960 | And so when you say something or I say something,
01:32:39.120 | and somebody gets upset about it or even happy about it,
01:32:42.640 | their concept of that might be quite a bit different.
01:32:45.720 | They don't really know what you mean.
01:32:47.840 | They only know what it means to them.
01:32:49.680 | - Yeah, so from a Neuralink perspective,
01:32:52.440 | it makes sense to optimize the control
01:32:54.760 | and the augmentation of the more primitive circuitry.
01:32:59.760 | So like the stuff that is closer
01:33:02.160 | to the raw sensory information.
01:33:03.720 | - Go deeper.
01:33:04.840 | - I think we should go deeper into the brain.
01:33:06.640 | And to be fair, so Matt MacDougall,
01:33:10.080 | who's a neurosurgeon at Neuralink
01:33:11.600 | and also a clinician, a great guy, brilliant.
01:33:14.360 | They have amazing people.
01:33:15.840 | I have to give it to them.
01:33:16.720 | They have been very cryptic in recent years.
01:33:18.640 | Their website was just like nothing there.
01:33:22.800 | They really know how to do things with style.
01:33:24.640 | And they've upset a lot of people, but that's good too.
01:33:27.900 | But Matt is there.
01:33:29.920 | I know Matt, he actually came up through my lab at Stanford,
01:33:32.200 | although he was a neurosurgery resident.
01:33:33.880 | He spent time in our lab.
01:33:34.960 | He actually came out on the shark dive
01:33:36.640 | and did great white shark diving with my lab
01:33:38.680 | to collect the VR that we use in our fear stuff.
01:33:40.880 | I've talked to Matt and I think, you know,
01:33:43.360 | he and other folks there are hungry
01:33:45.220 | for the deeper brain structures.
01:33:47.240 | The problem is that damn vasculature, all that blood supply.
01:33:50.640 | It's not trivial to get through and down into the brain
01:33:55.320 | without damaging the vasculature in the neocortex,
01:33:57.600 | which is on the outer crust.
01:33:59.120 | But once you start getting into the thalamus
01:34:00.760 | and closer to some of the main arterial sources,
01:34:03.660 | you really risk getting massive bleeds.
01:34:05.560 | And so it's an issue that can be worked out.
01:34:08.680 | It just is hard.
01:34:09.740 | - Maybe it'd be nice to educate, I'm showing my ignorance.
01:34:14.060 | So the smart stuff is on the surface.
01:34:17.800 | So I didn't quite realize, 'cause you keep saying deep.
01:34:22.480 | - Yeah, so--
01:34:23.880 | - So like the early stages are deep?
01:34:27.120 | - Yeah, so--
01:34:27.960 | - In actually physically in the brain.
01:34:29.560 | - Yeah, so the way that, you know,
01:34:32.360 | of course you've got your deep brain structures
01:34:34.580 | that are involved in breathing and heart rate
01:34:36.100 | and kind of lizard brain stuff.
01:34:37.300 | And then on top of that, this is the model of the brain
01:34:41.340 | that no one really subscribes to anymore,
01:34:43.100 | but anatomically it works.
01:34:44.620 | And then on top of that, in mammals,
01:34:46.940 | you have the limbic structures,
01:34:48.180 | which gate sensory information and decide
01:34:50.700 | whether or not you're gonna listen to something more,
01:34:52.180 | or you're gonna look at it,
01:34:53.060 | or you're gonna split your attention to both,
01:34:55.700 | kind of sensory allocation stuff.
01:34:58.380 | And then the neocortex is on the outside.
01:35:02.000 | And that is where you get a lot of this abstraction stuff.
01:35:05.700 | And now not all cortical areas are doing abstraction.
01:35:07.840 | Some like visual area one, auditory area one,
01:35:11.100 | they're just doing concrete representations.
01:35:13.860 | But as you get into the higher order stuff,
01:35:17.180 | that's when you start hearing names like intraparietal cortex,
01:35:20.460 | and you know, when you start hearing multiple names
01:35:22.140 | strung together like that, then you're talking about higher order areas.
01:35:24.920 | But actually there's an important experiment
01:35:27.580 | that drives a lot of what people wanna do
01:35:31.320 | with brain machine interface.
01:35:32.420 | And that's the work of Bill Newsome,
01:35:33.960 | who is at Stanford, and Tony Movshon,
01:35:35.520 | who runs the Center for Neural Science at NYU.
01:35:38.180 | This is a wild experiment.
01:35:39.720 | And I think it might freak a few people out
01:35:42.080 | if they really think about it too deeply,
01:35:43.720 | but anyway, here it goes.
01:35:45.660 | There's an area called MT in the cortex.
01:35:50.320 | And if I showed you a bunch of dots all moving up,
01:35:53.380 | and this is what Tony and Bill,
01:35:55.620 | and some of the other people in that lab did way back when,
01:35:59.060 | is they show a bunch of dots moving up.
01:36:01.380 | Somewhere in MT, there's some neurons that respond,
01:36:03.980 | they fire when the neurons move up.
01:36:05.940 | And then what they did is they started varying
01:36:07.340 | the coherence of that motion.
01:36:08.700 | So they made it so only 50% of the dots moved up
01:36:10.980 | and the rest move randomly.
01:36:12.460 | And then the neuron fires a little less.
01:36:14.300 | And eventually it's random and that neuron stops firing
01:36:16.700 | 'cause it's just kind of dots moving everywhere.
01:36:18.660 | It's awesome.
01:36:19.500 | And there's a systematic map so that other neurons
01:36:21.900 | are responding to things moving down,
01:36:23.220 | and other neurons are responding to things moving left,
01:36:24.460 | and others to things moving right.
01:36:25.460 | Okay, so there's a map of direction space.
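(A minimal toy sketch of that kind of tuning, with made-up baseline and gain numbers rather than the actual Newsome/Movshon data: an MT-like neuron whose firing rate climbs with the fraction of dots moving in its preferred direction and stays near baseline otherwise.)

```python
# Toy illustration only: firing rate of a direction-tuned, MT-like neuron
# as a function of motion coherence. Baseline and gain are invented numbers.
def mt_firing_rate(coherence, preferred=True, baseline=5.0, gain=40.0):
    """Spikes/sec for a given coherence (0.0 = all dots random, 1.0 = all dots coherent)."""
    if not preferred:
        return baseline                 # motion in a non-preferred direction: near baseline
    return baseline + gain * coherence  # preferred direction: rate grows with coherence

for c in (0.0, 0.25, 0.5, 1.0):
    print(f"coherence {c:.2f}: {mt_firing_rate(c):.1f} spikes/s (preferred), "
          f"{mt_firing_rate(c, preferred=False):.1f} spikes/s (non-preferred)")
```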
01:36:27.940 | Okay, well, that's great.
01:36:30.460 | You could lesion MT, animals lose the ability
01:36:32.860 | to do these kind of coherence discrimination
01:36:35.500 | or direction discrimination.
01:36:37.140 | But the amazing experiment,
01:36:38.700 | the one that just is kind of eerie,
01:36:41.900 | is that they lowered a stimulating electrode into MT,
01:36:45.700 | found a neuron that responds to when dots go up.
01:36:49.300 | But then they silenced that neuron.
01:36:52.540 | And sure enough, the animal doesn't recognize
01:36:56.020 | that the dots are going up.
01:36:58.340 | And then they move the dots down.
01:37:00.280 | They stimulate the neuron that responds to things moving up.
01:37:05.580 | And the animal responds, 'cause it can't speak,
01:37:09.420 | it responds by doing a lever press,
01:37:10.900 | which says the dots are moving up.
01:37:12.660 | So in other words, the sensory,
01:37:14.420 | the dots are moving down in reality on the computer screen.
01:37:17.780 | They're stimulating the neuron
01:37:20.460 | that responds to dots moving up.
01:37:22.540 | And the perception of the animal
01:37:23.980 | is that dots are moving up.
01:37:25.980 | Which tells you that your perception of external reality
01:37:30.980 | absolutely has to be a neuronal abstraction.
01:37:33.480 | It is not tacked to the movement of the dots
01:37:36.860 | in any absolute way.
01:37:38.900 | Your perception of the outside world depends entirely
01:37:42.660 | on the activation patterns of neurons in the brain.
01:37:45.780 | And you can hear that and say, well, duh,
01:37:48.820 | because if I stimulate the stretch reflex
01:37:51.380 | and you kick or something or whatever,
01:37:53.020 | the knee reflex and you kick,
01:37:54.500 | of course there's a neuron that triggers that,
01:37:56.380 | but it didn't have to be that way.
01:37:58.620 | Because A, the animal had prior experience,
01:38:00.820 | B, you're way up in these higher order cortical areas.
01:38:04.700 | What this means is that,
01:38:08.020 | and I generally try and avoid conversations
01:38:09.680 | about this kind of thing,
01:38:10.740 | but what this means is that we are constructing our reality
01:38:15.100 | with this space time firing the zeros and ones.
01:38:17.900 | And it doesn't have to have anything to do
01:38:20.340 | with the actual reality.
01:38:22.100 | And the animal or person can be absolutely convinced
01:38:25.420 | that that's what's happening.
01:38:27.660 | - Are you familiar with the work of Donald Hoffman?
01:38:30.740 | So he makes an evolutionary argument,
01:38:35.220 | that part's not important,
01:38:36.300 | that our brains
01:38:41.620 | are completely detached from reality
01:38:46.260 | in the sense that he makes a radical case
01:38:49.940 | that we have no idea what physical reality is.
01:38:53.340 | And in fact, it's drastically different
01:38:56.580 | than what we think it is.
01:38:58.500 | - Oh my.
01:38:59.420 | - So he goes further, that's scary.
01:39:01.900 | He doesn't just say, like
01:39:03.340 | you're kind of implying, that there's a gap.
01:39:06.200 | There might be a gap:
01:39:08.180 | We're constructing an illusion
01:39:09.540 | and then maybe using communication
01:39:12.860 | to maybe create a consistency
01:39:16.140 | that's sufficient for our human collaboration or whatever,
01:39:19.260 | or mammal, just maybe even just life forms
01:39:22.660 | are constructing a consistent reality
01:39:25.140 | that's maybe detached.
01:39:26.700 | I mean, that's really cool
01:39:27.540 | that neurons are constructing that,
01:39:28.860 | like that you can prove that, this is science
01:39:32.060 | at its best, vision science.
01:39:34.100 | But he says that like our brain has actually
01:39:38.580 | just lost its shit on the path of evolution
01:39:43.260 | to where we are now.
01:39:44.460 | We're just playing games with each other
01:39:46.900 | in constructing realities that allow our survival.
01:39:50.540 | But it's completely detached from physical reality.
01:39:54.020 | - Like we're missing a lot.
01:39:55.420 | - We're missing like most of it, if not all of it.
01:40:00.420 | - Well, this was, it's fascinating
01:40:03.640 | because I just saw the Oliver Sacks documentary.
01:40:06.280 | There's a new documentary out about his life.
01:40:08.780 | And there's this one part where he's like,
01:40:11.200 | I've spent part of my life trying to imagine
01:40:13.260 | what it would be like to be a bat or something,
01:40:16.980 | to see the world through the sensory apparati of a bat.
01:40:21.780 | And he did this with these patients
01:40:23.700 | that were locked into these horrible syndromes
01:40:25.860 | to pull out some of the beauty
01:40:28.340 | of their experience as well,
01:40:30.220 | not just communicate the suffering,
01:40:31.860 | although the suffering too.
01:40:33.420 | And as I was listening to him talk about this,
01:40:35.660 | I started to realize, it's like,
01:40:37.020 | well, what, you know, like there are these mantis shrimps
01:40:39.800 | that can see 60 shades of pink or something.
01:40:43.020 | And they see this stuff all the time
01:40:44.520 | and animals that can see UV light.
01:40:46.140 | Every time I learn about an animal
01:40:49.220 | that can sense other things in the environment
01:40:50.980 | that I can't like heat sensing,
01:40:52.740 | well, I don't crave that experience
01:40:55.180 | the same way Sacks talked about craving that experience,
01:40:57.900 | but it does throw another penny in the jar
01:41:00.580 | for what you're saying, which is that it could be that most,
01:41:04.540 | if not all of what I perceive and believe
01:41:07.180 | is just a neural fabrication.
01:41:11.380 | And that for better, for worse,
01:41:13.140 | we all agree on enough of the same neural fabrications
01:41:16.100 | in the same time and place that we're able to function.
01:41:18.660 | - Not only that, but we agree with the things
01:41:20.740 | that are trying to eat us enough
01:41:24.240 | to where they don't eat us.
01:41:26.060 | Meaning like that it's not just us humans, you know.
01:41:30.540 | - Oh, I see, because it's interactive.
01:41:32.140 | - It's interactive.
01:41:33.020 | So like, now I think it's a really nice thought experiment.
01:41:40.740 | I think because Donald really frames it in a scientific,
01:41:45.740 | like he makes a hard,
01:41:48.260 | like as hard as our discussion has been now,
01:41:50.780 | he makes a hard scientific case
01:41:52.960 | that we don't know shit about reality.
01:41:55.780 | I think that's a little bit hardcore,
01:41:59.340 | but I think it's-
01:42:01.420 | - It is hardcore.
01:42:03.100 | It is hardcore.
01:42:03.940 | - I think it's a good thought experiment
01:42:05.300 | that kind of cleanses the palate
01:42:07.600 | of the confidence we might have
01:42:09.920 | about, 'cause we are operating in this abstraction space,
01:42:14.920 | and the sensory spaces might be something very different.
01:42:20.900 | And it's kind of interesting to think about
01:42:25.140 | if you start to go into the realm of Neuralink
01:42:27.600 | or start to talk about just everything
01:42:29.880 | that you've been talking about with dream states
01:42:31.820 | and psychedelics and stuff like that,
01:42:33.820 | which part of the, which layer can we control
01:42:37.340 | and play around with
01:42:38.380 | and maybe look into a different slice of reality?
01:42:41.540 | - You just gotta do the experiment.
01:42:44.460 | The key is to just do the experiment
01:42:46.180 | in the most ethical way possible.
01:42:48.560 | I mean, that's the beauty of experiments.
01:42:50.540 | This is why, there's wonderful theoretical neuroscience
01:42:55.200 | happening now to make predictions,
01:42:57.860 | but that's why experimental science is so wonderful.
01:43:00.500 | You can go into the laboratory and poke around in there
01:43:03.380 | and be a brain explorer and listen to and write to neurons.
01:43:07.180 | And when you do that, you get answers.
01:43:09.500 | You don't always get the answers you want,
01:43:10.940 | but that's the beauty of it.
01:43:13.740 | I think when you were saying this thing about reality
01:43:16.580 | and the Donald Hoffman model,
01:43:19.300 | I was thinking about children.
01:43:20.800 | Like, I have an older sister, she's very sane.
01:43:24.760 | But when she was a kid, she had an imaginary friend
01:43:30.220 | and she would play with this imaginary friend.
01:43:32.500 | And there was this whole consistency to it.
01:43:34.900 | This friend was like, it was Larry,
01:43:36.540 | lived in a purple house.
01:43:37.660 | Larry was a girl.
01:43:38.540 | It was like all this stuff that a child,
01:43:40.300 | a young child wouldn't have any issue with.
01:43:42.540 | And then one day she announced that Larry had died.
01:43:45.140 | And it wasn't dramatic or traumatic.
01:43:46.740 | And that was it.
01:43:47.580 | And she just stopped.
01:43:48.540 | And I always wonder what that neurodevelopmental event was
01:43:53.240 | that, A, kept her out of a psychiatric ward,
01:43:57.660 | where she might have ended up had she kept that imaginary friend.
01:44:00.700 | But it's also, there was something kind of sad to it.
01:44:04.620 | I think the way it was told to me,
01:44:05.900 | 'cause I'm the younger brother,
01:44:06.820 | I wasn't around for that.
01:44:08.320 | But my dad told me that there was a kind of a sadness
01:44:11.300 | because it was this beautiful reality
01:44:13.180 | that had been constructed.
01:44:14.020 | And so I kind of wonder, as you're telling me this,
01:44:16.720 | whether or not, as adults,
01:44:19.220 | we try and create as much reality for children as we can
01:44:22.260 | so that they can make predictions and feel safe.
01:44:24.180 | Because the ability to make predictions
01:44:25.540 | is a lot of what keeps our autonomic arousal in check.
01:44:28.760 | I mean, we go to sleep every night
01:44:29.900 | and we give up total control.
01:44:31.940 | And that should frighten us deeply,
01:44:33.540 | but unfortunately autonomic arousal yanks us down under
01:44:37.300 | and we don't negotiate too much.
01:44:39.600 | So you sleep sooner or later.
01:44:41.160 | I don't know.
01:44:44.020 | I was a little worried we'd get into discussions
01:44:45.560 | about the nature of reality.
01:44:46.580 | 'Cause it's interesting in the laboratory,
01:44:48.660 | I'm very much like, what's the experiment?
01:44:51.300 | What's the analysis gonna look like?
01:44:53.660 | What mutant mouse are we gonna use?
01:44:55.380 | What experience are we gonna put someone through?
01:44:58.580 | But I think it's wonderful that in 2020,
01:45:00.500 | we can finally have discussions about this stuff
01:45:03.740 | and look, kind of peek around the corner and say,
01:45:06.020 | well, Neuralink and people,
01:45:08.820 | others who are doing similar things
01:45:11.260 | are gonna figure it out.
01:45:12.580 | They're gonna, the answers will show up
01:45:14.900 | and we just have to be open to interpretation.
01:45:17.580 | - Do you think there could be an experiment
01:45:20.500 | centered around consciousness?
01:45:21.820 | I mean, you're plugged into the neuroscience community.
01:45:24.460 | I think for the longest time,
01:45:26.020 | the quote unquote C word was totally off limits,
01:45:30.580 | almost anti-scientific.
01:45:32.060 | But now more and more people are talking about consciousness.
01:45:34.700 | Elon is talking about consciousness.
01:45:37.140 | AI folks are talking about consciousness.
01:45:39.520 | It's still, nobody knows anything,
01:45:42.860 | but it feels like a legitimate domain of inquiry
01:45:47.860 | that's hungry for a real experiment.
01:45:53.460 | - So I have fortunately three short answers to this.
01:45:57.640 | The first one is a-
01:46:00.020 | - Vlogs later.
01:46:00.920 | - I'm not particularly succinct, I agree.
01:46:05.380 | The joke I always tell is,
01:46:07.420 | there are two things you never wanna say to a scientist.
01:46:09.460 | One is, what do you do?
01:46:11.220 | And the second one is, take as much time as you need.
01:46:13.940 | And you definitely don't wanna say them
01:46:14.980 | in the same sentence.
01:46:16.020 | I have three short answers to it.
01:46:19.560 | So there's a cynical answer,
01:46:22.380 | kind of, and it's not one I enjoy giving,
01:46:24.860 | which is that if you look into the '70s
01:46:29.860 | back at the 1970s and 1980s,
01:46:33.060 | and even into the early 2000s,
01:46:35.300 | there were some very dynamic, very impressive speakers
01:46:39.540 | who were very smart in the field of neuroscience
01:46:42.100 | and related fields,
01:46:43.700 | who thought hard about the consciousness problem
01:46:46.380 | and fell in love with the problem,
01:46:49.620 | but overlooked the fact that the technology wasn't there.
01:46:54.620 | So I admire them for falling in love with the problem,
01:46:59.180 | but they gleaned tremendous taxpayer resources,
01:47:04.180 | essentially for nothing.
01:47:06.220 | And these people know who they are.
01:47:07.580 | Some of them are alive, some of them aren't.
01:47:09.020 | I'm not referring to Francis Crick,
01:47:10.380 | who was brilliant by the way,
01:47:11.460 | and thought the claustrum was involved in consciousness,
01:47:13.580 | which I think is a great idea.
01:47:14.620 | It's this obscure structure that no one's really studied.
01:47:17.460 | People are now starting to study it.
01:47:18.980 | So I think Francis was brilliant and wonderful,
01:47:20.980 | but there were books written about it.
01:47:23.860 | It makes for great television stuff
01:47:27.820 | and thought around the table
01:47:31.260 | or after a couple of glasses of wine or whatever.
01:47:33.700 | It's an important problem nonetheless.
01:47:36.580 | And so I do think with consciousness,
01:47:39.460 | the issue is it's not operationally defined, right?
01:47:42.240 | Psychologists are much smarter
01:47:43.980 | than a lot of hard scientists
01:47:47.700 | in that regard, for the following reason:
01:47:51.100 | they put operational definitions.
01:47:52.580 | They know that psychology,
01:47:54.020 | if we're talking about motivation, for instance,
01:47:55.900 | they know they need to put operational definitions on that
01:47:58.140 | so that two laboratories can know
01:47:59.780 | they're studying the same thing.
01:48:01.260 | The problem with consciousness
01:48:02.420 | is no one can agree on what that is.
01:48:03.940 | And this was a problem for attention when I was coming up.
01:48:08.220 | So in the early 2000s, people would argue,
01:48:09.980 | what is attention?
01:48:10.820 | Is it spatial attention, auditory attention?
01:48:12.660 | Is it, and finally people were like, you know what?
01:48:15.980 | We agree.
01:48:16.820 | - Did they agree on that one?
01:48:17.740 | - Sort of.
01:48:18.660 | - I remember sort of
01:48:19.500 | hearing people scream about attention.
01:48:21.660 | - Right, they couldn't even agree on attention.
01:48:23.100 | So I was coming up as a young graduate student.
01:48:25.020 | I'm thinking like,
01:48:26.060 | I'm definitely not gonna work on attention
01:48:27.860 | and I'm definitely not gonna work on consciousness.
01:48:30.100 | And I wanted something that I could solve or figure out.
01:48:33.500 | I wanna be able to see the circuit or the neurons.
01:48:36.300 | I wanna be able to hear it on the audio
01:48:37.980 | and I wanna record from it.
01:48:39.060 | And then I wanna do gain of function and loss of function.
01:48:41.460 | Take it away, see something change, put it back,
01:48:44.240 | see something change in a systematic way.
01:48:46.020 | And that takes you down into the depths of some stuff
01:48:48.780 | that's pretty plug and chug, you know?
01:48:51.980 | But, you know, I'll borrow from something in the military
01:48:55.060 | 'cause I'm fortunate to do some work
01:48:56.380 | with units from special operations
01:48:58.060 | and they have beautiful language around things
01:48:59.980 | 'cause their world is not abstract.
01:49:02.060 | And they talk about three meter targets,
01:49:03.700 | 10 meter targets and 100 meter targets.
01:49:05.740 | And it's not an issue of picking the 100 meter target
01:49:08.180 | 'cause it's more beautiful
01:49:09.500 | or because it's more interesting.
01:49:10.420 | If you don't take down the three meter targets
01:49:12.780 | and the 10 meter targets first, you're dead.
01:49:15.220 | So that's, I think it would pay for scientists to, you know,
01:49:19.660 | adopt a more kind of military thinking in that sense.
01:49:22.560 | The other thing that is really important
01:49:26.180 | is that just because somebody conceived of something
01:49:29.040 | and can talk about it beautifully
01:49:30.280 | and can glean a lot of resources for it,
01:49:33.380 | doesn't mean that it's led anywhere.
01:49:35.060 | So, but this isn't just true of the consciousness issue.
01:49:38.460 | And I don't wanna sound cynical,
01:49:39.520 | but I could pull up some names of molecules
01:49:42.100 | that occupied hundreds of articles
01:49:44.900 | in the very premier journals
01:49:47.140 | that then were later discovered
01:49:48.500 | to be totally moot for that process.
01:49:51.860 | And biotech companies folded, everyone in the lab pivots
01:49:55.100 | and starts doing something different with that molecule.
01:49:57.020 | And nobody talks about it
01:49:58.980 | because as long as you're in the game,
01:50:01.140 | we have this thing called anonymous peer review.
01:50:02.940 | You can't afford to piss off anybody too much
01:50:05.680 | unless you have some other funding stream.
01:50:07.660 | And I've avoided battles most of my career,
01:50:10.400 | but I pay attention to all of it.
01:50:12.480 | And I've watched this and I don't think it's ego driven.
01:50:15.320 | I think it's that people fall in love with an idea.
01:50:17.780 | I don't think there's any,
01:50:18.780 | there's not enough money in science
01:50:19.900 | for people to sit back there
01:50:21.180 | rubbing their hands together.
01:50:23.080 | The beauty of what Neuralink and Elon and team,
01:50:26.260 | 'cause obviously he's very impressive,
01:50:28.340 | but the team as a whole is really
01:50:31.140 | what gives me great confidence in their mission,
01:50:34.300 | is that he's already got enough money,
01:50:35.980 | so it can't be about that.
01:50:38.500 | He doesn't seem to need it at a level of,
01:50:41.700 | I don't know him,
01:50:42.520 | but he doesn't seem to need it
01:50:44.100 | at a kind of an ego level or something.
01:50:46.280 | I think it's driven by genuine curiosity.
01:50:48.840 | And the team that he's assembled
01:50:50.640 | includes people that are very kind of abstract,
01:50:53.960 | neocortex, space-time coding people.
01:50:57.760 | There are people like Matt, who's a neurosurgeon.
01:51:00.240 | You can't, I mean, you know,
01:51:02.440 | you can't BS neurosurgery.
01:51:04.840 | Failures in neurosurgery are not tolerated.
01:51:06.920 | So you have to be very good to exceptional
01:51:09.380 | to even get through the gate, and he's exceptional.
01:51:11.860 | And then they've got people like Dan Adams,
01:51:13.980 | who was at UCSF for a long time,
01:51:15.460 | is a good friend, I've known him for years,
01:51:18.460 | who is very concrete, studied the vasculature in the eye
01:51:21.180 | and how it maps to the vasculature in cortex.
01:51:23.020 | When you get a team like that together,
01:51:25.260 | you're gonna have dissenters,
01:51:26.860 | you're gonna have people that are high-level thinkers,
01:51:29.560 | people that are coders.
01:51:30.960 | When you get a team like that,
01:51:32.620 | it no longer looks like an academic laboratory
01:51:34.980 | or even a field in science.
01:51:36.460 | And so I think they're gonna solve
01:51:39.220 | some really hard problems.
01:51:40.860 | And again, I'm not here, they don't, you know,
01:51:44.160 | I have nothing at stake with them.
01:51:47.020 | But I think that's the solution.
01:51:48.820 | You need a bunch of people
01:51:50.400 | who don't need first author papers,
01:51:52.540 | who don't need to complete their PhD,
01:51:54.380 | who aren't relying on outside funding,
01:51:56.140 | who have a clear mission,
01:51:57.580 | and you have a bunch of people
01:51:59.700 | who are basically will adapt to solve the problem.
01:52:03.380 | - I like the analogy of the three-meter target
01:52:05.540 | and the hundred-meter target.
01:52:07.300 | So the folks at Neuralink are basically,
01:52:09.700 | many of them are some of the best people in the world
01:52:11.780 | at the three-meter target.
01:52:13.100 | Like you mentioned Matt and neurosurgery,
01:52:15.460 | like they're solving real problems.
01:52:17.420 | There's no BS, philosophical,
01:52:20.620 | smoke some weed and lie back and look at the stars.
01:52:23.980 | So, both for Elon and, because I think like this, for me,
01:52:29.720 | I think it's really important to think about the hundred-meter.
01:52:33.300 | And the hundred-meter is not even a hundred meters,
01:52:36.260 | but like the stuff behind the hill that's too far away,
01:52:41.260 | which is where I put consciousness.
01:52:44.620 | Maybe, I tend to believe that consciousness
01:52:53.500 | can be engineered.
01:52:55.340 | I think part of the reason,
01:52:56.820 | part of the business I wanna build leverages that idea.
01:53:03.380 | That consciousness is a lot simpler
01:53:05.020 | than we've been talking about.
01:53:07.900 | - Well, if someone can simplify the problem.
01:53:10.300 | - Right. - That will be wonderful.
01:53:11.700 | I mean, the reason we can talk about something as abstract
01:53:13.980 | as face representations in the fusiform face area,
01:53:17.300 | is because Nancy Kanwisher had the brilliance
01:53:19.900 | to tie it to the kind of lower level statistics
01:53:24.900 | of visual scenes.
01:53:26.020 | It wasn't 'cause she was like, "Oh, I bet it's there."
01:53:28.340 | That wouldn't have been interesting.
01:53:29.900 | So people like her understand how to bridge that gap
01:53:33.300 | and they put a tractable definition.
01:53:36.300 | So that's what I'm begging for in science,
01:53:40.380 | is a tractable definition.
01:53:41.780 | - But I want people to sit in it.
01:53:46.180 | I want people who are really uncomfortable
01:53:48.340 | with woo-woo stuff like consciousness, like high-level stuff,
01:53:51.700 | to sit in that topic and sit uncomfortably
01:53:55.080 | because it forces them to then try to ground
01:53:57.180 | and simplify it into something that's concrete.
01:53:59.560 | Because too many people are just uncomfortable
01:54:02.180 | to sit in the consciousness room
01:54:04.020 | because there's no definitions.
01:54:05.980 | It's like attention or intelligence
01:54:09.020 | in the artificial intelligence community.
01:54:10.600 | But the reality is it's easy to avoid that room altogether,
01:54:14.340 | which is what, I mean, there's analogies
01:54:16.500 | to everything you've said
01:54:17.420 | with the artificial intelligence community,
01:54:19.740 | with Minsky and even Alan Turing
01:54:22.820 | that talked about intelligence a lot,
01:54:24.740 | and then they drew a lot of funding and then it crashed
01:54:27.180 | because they really didn't do anything with it.
01:54:29.340 | And it was a lot of force of personality and so on,
01:54:31.560 | but that doesn't mean the topic of the Turing test
01:54:35.600 | and intelligence isn't something we should sit on
01:54:39.100 | and think like, think like what is,
01:54:42.120 | well, first of all, I mean,
01:54:43.140 | Turing actually attempted this with the Turing test.
01:54:45.200 | He tried to make concrete
01:54:47.060 | this very question of intelligence.
01:54:48.920 | It doesn't mean that we shouldn't linger on it
01:54:52.040 | and we shouldn't forget that ultimately
01:54:56.360 | that is what our efforts are all about.
01:54:58.880 | In the artificial intelligence community
01:55:00.780 | and in the people, whether it's neuroscience
01:55:04.500 | or whatever bigger umbrella you wanna use
01:55:07.180 | for understanding the mind,
01:55:08.620 | the goal is not just about understanding
01:55:13.420 | layers two or three of the visual system.
01:55:15.420 | It's to understand consciousness and intelligence
01:55:19.480 | and maybe create it
01:55:22.320 | or just all the possible biggest questions of our universe.
01:55:25.940 | That's ultimately the dream.
01:55:27.780 | - Absolutely, and I think what I really appreciate
01:55:30.980 | about what you're saying is that everybody,
01:55:33.380 | whether or not they're working on a kind of
01:55:35.400 | a low level synapse, that's like a reflex in musculature
01:55:38.520 | or something very high level abstract
01:55:41.220 | can benefit from looking at the others, because
01:55:45.240 | everyone's going after three meter, 10 meter
01:55:46.760 | and a hundred meter targets in some sense,
01:55:48.560 | but to be able to tolerate the discomfort
01:55:51.680 | of being in a conversation where there are real answers,
01:55:54.600 | where the zeros and ones are known, zeros and ones,
01:55:57.920 | are the equivalent of that in the nervous system.
01:56:00.560 | And also, as you said, for the people that are very much
01:56:03.640 | like, oh, I can only trust what I can see and touch,
01:56:06.280 | those people need to put themselves
01:56:08.200 | into the discomfort of the high level conversation
01:56:10.280 | because what's missing is conversation
01:56:15.280 | and conceptualization of things at multiple levels.
01:56:19.480 | I think one of the, this is, I don't gripe about,
01:56:22.560 | my lab's been fortunate.
01:56:23.540 | We've been funded from the start and we've been happy
01:56:26.180 | in that regard and lucky, and we're grateful for that.
01:56:30.200 | But I think one of the challenges of research
01:56:33.840 | being so expensive is that there isn't a lot of time,
01:56:38.360 | especially nowadays, for people to just convene
01:56:40.600 | around a topic because there's so much emphasis
01:56:44.020 | on productivity.
01:56:45.920 | And so there are actually, believe it or not,
01:56:48.160 | there aren't that many concepts,
01:56:49.520 | formal concepts in neuroscience right now.
01:56:52.240 | The last 10 years has been this huge influx of tools.
01:56:56.260 | And so people have been in neural circuits
01:56:57.760 | and probing around and connectomes, it's been wonderful.
01:57:00.720 | But 10, 20 years ago, when the consciousness stuff
01:57:04.540 | was more prominent, the C word, as you said,
01:57:07.800 | what was good about that time is that people
01:57:10.360 | would go to meetings and actually discuss ideas and models.
01:57:14.220 | Now it's sort of like demonstration day
01:57:18.120 | at the school science fair where everyone's got their thing
01:57:20.420 | and some stuff is cooler than others.
01:57:23.020 | But I think we're gonna see a shift.
01:57:26.320 | I'm grateful that we have so many computer scientists
01:57:29.680 | and theoreticians or theorists,
01:57:32.860 | I think they call themselves.
01:57:34.320 | Somebody tell me what the difference is someday.
01:57:38.080 | And psychology and even dare I say philosophy,
01:57:43.800 | these things are starting to converge.
01:57:45.560 | Neuroscience, the name neuroscience,
01:57:47.440 | there wasn't even such a thing
01:57:49.080 | when I started graduate school or as a postdoc,
01:57:51.000 | it was neurophysiology or you were a neuroanatomist.
01:57:53.940 | Now, it's sort of everybody's invited and that's beautiful.
01:57:58.340 | That means that something's useful
01:58:00.140 | is gonna come of all this.
01:58:01.420 | And there's also tremendous work, of course,
01:58:03.100 | happening on it for the treatment of disease.
01:58:04.940 | And we shouldn't overlook that.
01:58:06.000 | Eliminating and reducing suffering
01:58:09.760 | is also a huge initiative of neuroscience.
01:58:11.680 | So there's a lot of beauty in the field,
01:58:14.080 | but the consciousness thing continues to be a,
01:58:17.820 | it's like an exotic bird.
01:58:19.920 | It's like no one really quite knows how to handle it
01:58:22.580 | and it dies very easily.
01:58:24.920 | - Well, yeah.
01:58:25.760 | I think also from the AI perspective,
01:58:29.720 | so I view the brain as less sacred.
01:58:35.200 | I think from a neuroscience perspective,
01:58:39.280 | you're a little bit more sensitive to BS,
01:58:42.940 | like BS narratives about the brain or whatever.
01:58:46.600 | I'm a little bit more comfortable
01:58:49.520 | with just poetic BS about the brain,
01:58:51.600 | as long as it helps engineer intelligent systems.
01:58:54.640 | You know what I mean?
01:58:56.040 | - Well, and I have to, I confess ignorance
01:58:59.360 | when it comes to most things about coding
01:59:02.120 | and I have some quantitative ability,
01:59:04.120 | but I don't have strong quantitative leanings.
01:59:05.960 | And so I know my limitations too.
01:59:08.520 | And so I think the next generation coming up,
01:59:11.920 | a lot of the students at Stanford are really interested
01:59:14.000 | in quantitative models and theory and AI.
01:59:16.960 | And I remember when I was coming up,
01:59:20.320 | a lot of the people who were doing work ahead of me,
01:59:22.060 | I kind of rolled my eyes at some of the stuff
01:59:23.560 | they were doing, including some of their personalities,
01:59:25.960 | although I have many great senior colleagues everywhere.
01:59:29.720 | - Way of the world.
01:59:30.560 | - So it's the way of the world.
01:59:31.480 | So nobody knows what it's like to be a young graduate student
01:59:35.060 | in 2020, except the young graduate students.
01:59:36.960 | So I know there are a lot of things I don't know.
01:59:41.100 | And in addition to why I do a lot of public education,
01:59:44.300 | increasing scientific literacy
01:59:45.420 | and neuroscientific thinking, et cetera,
01:59:47.860 | a big goal of mine is to try and at least pave the way
01:59:50.940 | so that these really brilliant and forward thinking
01:59:54.400 | younger scientists can make the biggest possible dent
01:59:57.540 | and make what will eventually be all us old guys
02:00:00.540 | and gals look stupid.
02:00:01.640 | I mean, that's what we were all trying to do.
02:00:03.820 | That's what we were trying to do.
02:00:04.900 | So yeah.
02:00:06.020 | - So from the highest possible topic of consciousness
02:00:11.020 | to the lowest level topic of David Goggins.
02:00:16.020 | Let's go.
02:00:19.580 | - I don't know if it's low level, he's high performance.
02:00:22.540 | - High performance, but like low, like there's no,
02:00:26.380 | I don't think David has any time for philosophy.
02:00:30.640 | Let's just put it this way.
02:00:32.700 | - Well, I mean, I think we can tack it
02:00:34.740 | to what we were just saying in a meaningful way,
02:00:37.520 | which is whatever goes on in that abstraction
02:00:41.400 | part of the brain, he's figured out how to dig down
02:00:46.400 | into whatever the limbic friction is,
02:00:49.600 | he's figured out how to grab a hold of that,
02:00:52.920 | scruff it and send it in the direction
02:00:55.940 | that he's decided it needs to go.
02:00:57.840 | And what's wild is that he's,
02:00:59.920 | what we're talking about is him doing that to himself.
02:01:02.980 | Right?
02:01:03.820 | It's like he's scruffing himself and directing himself
02:01:06.520 | in a particular direction
02:01:08.560 | and sending himself down that trajectory.
02:01:12.040 | And what's beautiful is that he acknowledges
02:01:15.200 | that that process is not pretty, it doesn't feel good.
02:01:19.940 | It's kind of horrible at every level,
02:01:23.320 | but he's created this rewarding element to it.
02:01:27.020 | And I think that's what's so, it's so admirable.
02:01:31.380 | And it's what so many people crave,
02:01:32.920 | which is regulation of the self at that level.
02:01:36.980 | - And he practices, I mean, there's a ritual to it.
02:01:40.000 | There's a, every single day, like no exceptions.
02:01:44.420 | There's a practice aspect to the suffering
02:01:47.920 | that he goes through.
02:01:49.520 | - It's principled suffering.
02:01:50.960 | - Principled suffering.
02:01:52.320 | I mean, I just, I mean, I admire all aspects of it,
02:01:55.560 | including him and his girlfriend/wife, I'm not sure.
02:01:58.920 | She'll probably know this.
02:01:59.760 | - I don't know.
02:02:00.800 | - I'm not asking him.
02:02:02.160 | - No, no, we've only, I've only communicated with her
02:02:06.880 | by text about some stuff that I was asking David,
02:02:10.120 | but yeah, they clearly have formed a powerful team.
02:02:13.920 | - Yeah, good cop and bad cop.
02:02:15.760 | - And it's a beautiful thing to see people working
02:02:17.960 | in that kind of synergy.
02:02:19.400 | - It's inspiring to me, same as with Elon,
02:02:21.600 | that a guy like David Goggins can find love.
02:02:24.160 | (laughing)
02:02:25.920 | That you find a thing that works, which gives me hope
02:02:30.080 | that like whatever flavor of crazy I am,
02:02:33.760 | you can always find another thing that works with that.
02:02:37.540 | But I've had the, so maybe let's trade Goggins stories,
02:02:42.540 | you from a neuroscience perspective,
02:02:46.840 | me from a self-inflicted pain perspective.
02:02:50.880 | I somehow found myself in communication with David
02:02:56.500 | about some challenges that I was undergoing.
02:02:59.740 | One of which is we were communicating every single day,
02:03:05.140 | email, phone, about the particular 30 day challenge
02:03:08.940 | that I did that stretched for longer of pushups and pull-ups.
02:03:13.620 | - You made a call out on social media.
02:03:15.100 | - Yeah, social media.
02:03:15.940 | - Actually, I think that was the point.
02:03:18.940 | I knew of you before, but that's where I started tracking
02:03:21.180 | some of what you were doing with these physical challenges.
02:03:23.140 | - Yeah, what the hell's wrong with that guy?
02:03:25.580 | - Well, no, I think I actually,
02:03:26.900 | I don't often comment on people's stuff,
02:03:28.700 | but I think I commented something like,
02:03:30.780 | neuroplasticity loves a non-negotiable rule.
02:03:33.860 | But no, I said a non-negotiable contract.
02:03:36.140 | Because at the point where, yeah,
02:03:38.620 | neuroplasticity really loves a non-negotiable contract
02:03:41.800 | because, and I've said this before, so forgive me,
02:03:45.660 | but the brain is doing analysis of duration, path,
02:03:48.700 | and outcome, and that's a lot of work for the brain.
02:03:51.580 | And the more that it can pass off duration, path,
02:03:54.420 | and outcome to just reflex,
02:03:56.460 | the more energy it can allocate to other things.
02:04:00.380 | So if you decide there's no negotiation
02:04:03.860 | about how many pushups, how far I'm gonna run,
02:04:06.260 | how many days, how many pull-ups, et cetera,
02:04:08.220 | you actually have more energy for pushups,
02:04:10.540 | running, and pull-ups.
02:04:11.380 | - And when you say neuroplasticity,
02:04:12.660 | you mean like the brain, once the decision is made,
02:04:15.620 | it'll start rewiring stuff to make sure
02:04:18.580 | that we can actually make this happen.
02:04:20.740 | - That's right, I mean, so much of what we do
02:04:22.560 | is reflexive at the level of just core circuitry,
02:04:24.660 | breathing, heart rate, all that boring stuff, digestion,
02:04:27.500 | but then there's a lot of reflexive stuff
02:04:29.380 | like how you drink out of a mug of coffee
02:04:31.900 | that's reflexive too, but that you had to learn
02:04:34.560 | at some point in your life earlier
02:04:36.060 | when you were very little,
02:04:37.220 | analyzing duration, path, and outcome,
02:04:38.860 | and that involves a lot of top-down processing
02:04:40.900 | with the prefrontal cortex.
02:04:42.380 | But through plasticity mechanisms, you now do it.
02:04:45.140 | So when you take on a challenge,
02:04:47.300 | provided that you understand the core mechanics
02:04:49.380 | of how to run, do pushups and pull-ups,
02:04:51.920 | and whatever else you decided to do,
02:04:53.940 | once you set the number and the duration and all that,
02:04:57.660 | then all you have to do is just go.
02:05:00.200 | But people get caught in that tide pool of just,
02:05:03.220 | well, do I really have to do it?
02:05:04.620 | How do I not do that?
02:05:05.460 | What if I get injured?
02:05:06.820 | Can I sneak this?
02:05:08.580 | And that's work.
02:05:10.860 | And to some extent, look, I'm not David Goggins, obviously,
02:05:15.860 | nor do I claim to fully understand his process,
02:05:22.240 | but maybe a little bit,
02:05:23.760 | which is that it's clear that by making the decision,
02:05:27.000 | there's more resources to devote
02:05:28.440 | to the effort of the actual execution.
02:05:30.760 | - Well, that's a really,
02:05:32.380 | like what you're saying was not a lesson
02:05:34.320 | that was obvious to me, and it's still not obvious.
02:05:36.640 | It's something I really work at,
02:05:38.400 | which is there is always an option to quit.
02:05:40.720 | And I mean, that's something I really struggle with.
02:05:46.540 | I mean, I've quit some things in my life.
02:05:49.360 | It's like stupid stuff.
02:05:51.400 | And one lesson I've learned is if you quit once,
02:05:56.400 | it opens the door that,
02:06:01.260 | like it's really valuable to trick your brain
02:06:05.440 | into thinking that you're gonna have to die before you quit.
02:06:10.440 | Like it's actually really convenient.
02:06:13.080 | So actually what you're saying is very profound,
02:06:16.080 | but you shouldn't intellectualize it.
02:06:19.680 | Like it took me time to develop,
02:06:24.040 | like psychologically in ways
02:06:27.680 | that I think it would be another conversation
02:06:30.360 | 'cause I'm not sure how to put it into words,
02:06:31.840 | but it's really tough on me
02:06:34.680 | to do certain parts of that challenge.
02:06:37.440 | - Well, it's a huge, you know, it's a huge output.
02:06:39.880 | - The number, see,
02:06:41.760 | I thought the number would be hard,
02:06:43.280 | but it's not.
02:06:44.480 | It's the entirety of it,
02:06:48.080 | especially in the early days was just spending,
02:06:53.080 | I'm kind of embarrassed to say how many hours this took.
02:06:57.080 | So I didn't say publicly how many hours,
02:07:00.080 | 'cause I knew people would be like,
02:07:03.240 | aren't you supposed to do other stuff?
02:07:06.800 | Like, what the hell are you doing?
02:07:08.320 | - Again, I don't wanna speculate too much,
02:07:09.720 | but occasionally David has said this publicly
02:07:11.920 | where people will be like, don't you sleep or something?
02:07:14.440 | And his process used to just be
02:07:16.560 | that he would just block, delete, you know, like gone.
02:07:18.960 | But it's actually, it's a super interesting topic.
02:07:23.880 | And because self-control and directing our actions
02:07:28.880 | and the role of emotion and quitting,
02:07:31.280 | these are vital to the human experience
02:07:34.400 | and they're vital to performing well at anything.
02:07:37.240 | And obviously at a super high level,
02:07:39.760 | being able to understand this about the self is crucial.
02:07:44.880 | So I have a friend who was also in the teams.
02:07:47.160 | His name is Pat Dossett.
02:07:48.240 | He did nine years in the SEAL teams.
02:07:50.760 | And in a similar way,
02:07:53.320 | there's a lore about him among team guys
02:07:56.840 | because of a kind of funny challenge he gave himself,
02:07:59.800 | which was, so he and I swim together,
02:08:01.360 | although he swims further up front than I do
02:08:03.560 | and he's very patient.
02:08:04.880 | But, you know, he was on a,
02:08:09.320 | he was assigned when he was in the teams
02:08:11.200 | to a position that gave him a little more time
02:08:13.720 | behind a desk than he wanted.
02:08:14.840 | And there's not as much time out in deployments,
02:08:17.280 | although he did deployments.
02:08:19.240 | So he didn't know what to do at that time,
02:08:21.160 | but he thought about it and he asked himself,
02:08:22.840 | what does he hate the most?
02:08:25.280 | And it turns out the thing that he hated doing the most
02:08:27.360 | was bear crawls, you know, walking on your hands and feet.
02:08:30.080 | So he decided to bear crawl for a mile for time.
02:08:32.720 | So he was bear crawling a mile a day, right?
02:08:35.280 | And I thought that was an interesting example that he gave
02:08:37.960 | because, you know, like why pick the thing
02:08:39.880 | you hate the most?
02:08:41.480 | And I think it maps right back to limbic friction.
02:08:44.200 | It's the thing that creates the most limbic friction.
02:08:46.960 | And so if you can overcome that, then there's carry over.
02:08:50.160 | And I think the notion of carry over
02:08:51.960 | has been talked about psychologically
02:08:53.320 | and then kind of in the self-help space,
02:08:54.760 | like, oh, if you run a marathon,
02:08:56.040 | it's going to help you in other areas of life.
02:08:57.800 | But will it really?
02:08:59.000 | Will it?
02:08:59.840 | Well, I think it depends on whether or not
02:09:00.760 | there's a lot of limbic friction.
02:09:01.960 | 'Cause if there is, what you're exercising
02:09:05.200 | is not a circuit for bear crawls or a circuit for pull-ups.
02:09:08.600 | What you're doing is you're exercising a circuit
02:09:10.680 | for top-down control.
02:09:12.360 | And that circuit was not designed to be for bear crawls
02:09:15.540 | or pull-ups or coding or waking up in the middle of the
02:09:19.120 | night to do something hard.
02:09:20.820 | That circuit was designed to override limbic friction.
02:09:24.280 | And so neural circuits were designed to generalize, right?
02:09:28.340 | The stress response to an incoming threat
02:09:30.940 | that's a physical threat was designed to feel the same way
02:09:34.200 | and be the same response internally as the threat
02:09:37.120 | to an impending exam or divorce or marriage
02:09:40.520 | or whatever it is that's stressing somebody out.
02:09:43.320 | And so neural circuits are not designed to be
02:09:45.760 | for one particular action or purpose.
02:09:47.580 | So if you can, as you did, if you can train up
02:09:51.120 | top-down control under conditions
02:09:52.800 | of the highest limbic friction,
02:09:54.800 | when the desire to quit is at its utmost,
02:09:58.200 | either because of fatigue or hyperarousal,
02:10:00.840 | being too stressed or too tired,
02:10:03.660 | you're learning how to engage a circuit.
02:10:06.120 | And that circuit is forever with you.
02:10:08.760 | And if you don't engage it, it sits there,
02:10:12.120 | but it's atrophied.
02:10:13.400 | It's like a plant that doesn't get any water.
02:10:16.040 | And a lot of this has been discussed in self-help
02:10:18.520 | and growth mindset and all these kinds of ideas
02:10:20.680 | that circle the internet and social media.
02:10:23.200 | But when you start to think about how they map
02:10:25.040 | to neural circuits, I think there's some utility
02:10:26.920 | 'cause what it means is that the limbic friction
02:10:29.720 | that you'll experience in, I don't know,
02:10:31.760 | maybe some future relationship to something or someone,
02:10:36.520 | it's a category of neural processing
02:10:38.720 | that should immediately click into place.
02:10:40.160 | It's just like the limbic friction you experienced
02:10:42.820 | trying to engage in the God knows how many pushups,
02:10:46.760 | pull-ups and running, you know, runs you were doing.
02:10:49.680 | - 25,000, who's commenting?
02:10:52.080 | - So folks, if Lex does this again,
02:10:55.640 | more comments, more likes.
02:10:57.200 | (both laughing)
02:10:58.520 | - No, well-
02:10:59.360 | - This is the problem with you getting more followers
02:11:00.920 | is you're gonna- - Get more, yeah.
02:11:02.280 | - Actually, I should say that's the benefit.
02:11:04.160 | I don't know, maybe it's not politically correct for me
02:11:06.480 | to ask, but there is this stereotype
02:11:09.120 | about Russians being- - Politically correct.
02:11:11.880 | - No, like being really durable.
02:11:16.080 | And I started going to that Russian banya
02:11:19.680 | way back before COVID,
02:11:21.640 | and they could tolerate a lot of heat,
02:11:25.280 | and they would sit very stoic.
02:11:26.880 | No one was going, "Oh, it's hot in here."
02:11:28.440 | They would just kind of ease into it.
02:11:31.000 | So maybe there's something there, who knows?
02:11:32.440 | - Might be something there,
02:11:33.280 | but it could be also just personal.
02:11:34.760 | I just have some, I found myself, everyone's different,
02:11:39.520 | but I've found myself to be able to do something unpleasant
02:11:44.520 | for very long periods of time.
02:11:47.080 | Like I'm able to shut off the mind,
02:11:49.600 | and I don't think that's been fully tested.
02:11:53.240 | And I feel- - Monkey mind
02:11:54.160 | or the supercomputer?
02:11:55.480 | (both laughing)
02:11:56.560 | - Well, it's interesting.
02:11:57.520 | I mean, which mind tells you to quit exactly?
02:12:02.840 | Limbic friction tells you-
02:12:05.320 | - Well, limbic friction is the source of that,
02:12:07.120 | but who are you talking with exactly?
02:12:09.240 | - So there's a, we can put something very concrete to that.
02:12:13.120 | So there's a paper published in Cell,
02:12:15.720 | super top tier journal, two years ago,
02:12:19.040 | looking at effort.
02:12:22.080 | And this was in a visual environment
02:12:23.920 | of trying to swim forward toward a target and a reward.
02:12:27.200 | And it was a really cool experiment
02:12:28.480 | 'cause they manipulated virtually the visual environment.
02:12:31.760 | So the same amount of effort was being expended every time,
02:12:35.560 | but sometimes the perception
02:12:37.000 | was you're making forward progress.
02:12:38.280 | And sometimes the perception was you're making no progress
02:12:41.120 | because stuff wasn't drifting by, which meant no progress.
02:12:44.120 | So you can be swimming and swimming
02:12:45.520 | and not making progress.
02:12:46.560 | And it turns out that with each bout of effort,
02:12:50.640 | there's epinephrine and norepinephrine
02:12:54.040 | being released in the brainstem.
02:12:56.580 | And glia, what traditionally were thought of
02:12:59.200 | as support cells for the neurons,
02:13:00.620 | but they do a lot of things actively too,
02:13:02.660 | are measuring the amount of epinephrine
02:13:05.240 | and norepinephrine in that circuit.
02:13:06.840 | And when it exceeds a certain threshold,
02:13:08.720 | the glia send inhibitory signals
02:13:10.500 | that shut down top-down control.
02:13:12.840 | They literally, it's the quit, you stop.
02:13:15.680 | There's no more, you quit enduring.
02:13:18.440 | It can be rescued, endurance can be rescued with dopamine.
02:13:24.200 | So that's where the subjective part really comes into play.
02:13:30.380 | So you quit because you've learned how to turn that off,
02:13:34.660 | or you've learned how to,
02:13:36.340 | some people will reward the pain process so much
02:13:39.700 | that friction becomes the reward.
02:13:42.000 | And when you talk about people like Goggins
02:13:44.480 | and other people I know from special operations
02:13:46.680 | and people have gone through cancer treatments three times,
02:13:50.840 | you hear about, just when you hear about people,
02:13:53.640 | the Viktor Frankl stories,
02:13:55.040 | I mean, you hear about Nelson Mandela,
02:13:56.600 | you hear about these stories,
02:13:57.960 | I'm sure the same process is involved.
02:13:59.680 | Again, this speaks to the generalizability
02:14:02.040 | of these processes as opposed to a neural circuit
02:14:04.440 | for a particular action or cognitive function.
02:14:06.920 | So I think you have to learn to subjectively self-reward
02:14:11.260 | in a way that replenishes you.
02:14:13.080 | Goggins talks about eating souls.
02:14:16.480 | It's a very dramatic example.
02:14:18.740 | In his mind, apparently, that's a form of reward,
02:14:21.640 | but it's not just a form of reward where you're,
02:14:24.280 | it's like you're picking up a trophy or something.
02:14:27.440 | It's actually, it gives you energy.
02:14:31.520 | It's a reward that gives more neural energy,
02:14:33.920 | and I'm defining that as more dopamine
02:14:36.520 | to suppress the noradrenaline and adrenaline circuits
02:14:39.680 | in the brainstem.
02:14:40.520 | - So ultimately maps to that.
02:14:41.960 | Yeah, he creates enemies.
02:14:44.600 | He's always fighting enemies.
02:14:46.040 | I never, I think I have enemies,
02:14:48.340 | but they're usually just versions of me inside my head.
02:14:51.880 | So I thought about, through that 30-day challenge,
02:14:55.160 | I tried to come up with fake enemies.
02:14:58.520 | It wasn't working.
02:14:59.640 | The only enemy I came up with is David.
02:15:03.200 | - Well, now you have,
02:15:05.200 | you certainly have a formidable adversary in this one.
02:15:09.880 | - I don't care.
02:15:11.120 | David, I'm willing to die on this one.
02:15:13.040 | So let's go there.
02:15:14.320 | - Well, let's hope you both survive this one.
02:15:20.960 | - My problem is the physical.
02:15:23.120 | So everything we've been talking about in the mind,
02:15:25.760 | there's a physical aspect that's just practically difficult,
02:15:29.080 | which is like, I can't,
02:15:31.280 | like when you injure yourself at a certain point,
02:15:34.440 | like you just can't function.
02:15:36.440 | - Or you're doing more damage.
02:15:37.720 | - Yeah. - You're talking about
02:15:38.960 | taking yourself out of running for, yeah.
02:15:41.280 | For the rest of your life, potentially,
02:15:44.000 | or like, you know, or for years.
02:15:45.960 | So, you know, I'd love to avoid that, right?
02:15:50.560 | There's just like stupid physical stuff
02:15:52.400 | that you just want to avoid.
02:15:54.040 | You want to keep it purely in the mental.
02:15:56.360 | And if it's purely in the mental,
02:15:58.160 | that's when the race is interesting.
02:15:59.480 | But yeah, the problem with these physical challenges,
02:16:02.720 | as David has experienced, I mean,
02:16:05.320 | it has a toll on your body.
02:16:07.560 | I tend to think of the mind as limitless,
02:16:09.920 | and the body is kind of, unfortunately, quite limited.
02:16:13.640 | - Well, I think the key is to dynamically control
02:16:15.960 | your output, and that can be done by reducing effort,
02:16:19.440 | which doesn't work for throughput,
02:16:22.440 | but also by restoring through these
02:16:27.440 | subjective reward processes.
02:16:29.400 | And we don't want to go down the rabbit hole
02:16:31.840 | of why this all works, but these are ancient pathways
02:16:34.360 | that were designed to bring resources to an animal
02:16:37.560 | or to a person through foraging,
02:16:39.680 | for hunting or mates or water, all these things.
02:16:42.520 | And they work so well because they're down in those
02:16:45.600 | circuits where we know the zeros and ones.
02:16:48.800 | And that's great because it can be subjective
02:16:51.600 | at the level of, oh, I reached this one milestone,
02:16:55.320 | this one horizon, this one three-meter target.
02:16:57.760 | But if you don't reward it, it's just effort.
02:17:01.880 | If you do self-reward it, it's effort minus one
02:17:06.320 | in terms of the adrenaline output.
02:17:08.520 | - I have to ask you about this.
02:17:12.080 | You're one of the great communicators in science.
02:17:16.400 | I'm really a big fan of yours,
02:17:17.880 | and enjoying the educational stuff
02:17:20.800 | you're putting out on neuroscience.
02:17:22.840 | - Thank you.
02:17:23.680 | - What's the, do you have a philosophy behind it
02:17:27.160 | or is it just an instinct?
02:17:29.600 | (laughing)
02:17:30.640 | - Oh, my.
02:17:31.480 | - Unstoppable force?
02:17:32.320 | Do you have, what's your thinking?
02:17:34.000 | Because it's rare and it's exciting.
02:17:36.040 | I'm excited that somebody from Stanford,
02:17:41.040 | so I, okay, I'm in multiple places in the sense of
02:17:46.680 | like where my interests lie.
02:17:48.000 | And one, politically speaking,
02:17:51.560 | academic institutions are under fire,
02:17:54.200 | for many reasons we don't need to get into.
02:17:57.960 | I get into it in a lot of other places,
02:18:00.960 | but I believe in places like Stanford
02:18:05.960 | and places like MIT as some of the most magical institutions
02:18:13.720 | for inspiring people to dream, people to build the future.
02:18:18.520 | I mean, it's, I believe that it is a really special,
02:18:22.160 | these universities are really special places.
02:18:24.880 | And so it's always exciting to me
02:18:27.400 | when somebody as inspiring as you represents those places.
02:18:32.400 | So it makes me proud that somebody from Stanford is,
02:18:38.200 | like somebody like you is representing Stanford.
02:18:41.360 | So maybe you could speak to what's,
02:18:46.360 | how did you come to be who you are as a communicator?
02:18:52.520 | - Well, first of all, thanks for the kind words,
02:18:54.800 | especially coming from you.
02:18:56.320 | I think Stanford is an amazing place as is MIT.
02:19:00.040 | And it's such a--
02:19:01.120 | - MIT is better, by the way.
02:19:02.520 | (laughing)
02:19:03.360 | It's okay, I'll let it out, anything you say at this point.
02:19:05.440 | - You know, I got many friends at MIT.
02:19:07.520 | - Yeah.
02:19:08.360 | - You know, hi, Ed Boyden.
02:19:09.200 | - Smarter friends, yeah.
02:19:10.200 | - Ed Boyden is best in class, you know,
02:19:14.560 | among the best in class.
02:19:15.640 | There's some people--
02:19:16.480 | - Number one.
02:19:17.320 | - Not me that can hold a candle to him,
02:19:18.840 | but not many, maybe one or two.
02:19:20.680 | I think the great benefit of being in a place like MIT
02:19:23.720 | or Stanford is that when you look around,
02:19:26.800 | you know, that the average is very high, right?
02:19:29.640 | You have many best in class among the, you know,
02:19:33.520 | one or two or three best in the world at what they do.
02:19:35.920 | And it's a wonderful privilege to be there.
02:19:39.400 | And one thing that I think also makes them
02:19:42.920 | and other universities like them very special
02:19:44.800 | is that there's an emphasis on what gets exported
02:19:47.200 | out of the university.
02:19:48.520 | What, you know, not keeping it ivory tower
02:19:50.520 | and really trying to keep an eye on what's needed
02:19:53.280 | in the world and trying to do something useful.
02:19:55.600 | And I think the proximity to industry in Silicon Valley
02:19:59.000 | and in the Boston area and Cambridge
02:20:01.040 | also lends itself well to that.
02:20:02.520 | And there are other institutions too, of course.
02:20:05.360 | - So the reason I got involved in educating on social media
02:20:09.800 | was actually because of Pat Dossett,
02:20:13.280 | the mile bear crawl guy.
02:20:15.440 | It was at the turn of 2018 to 2019.
02:20:18.320 | We had formed a good friendship and we were,
02:20:21.720 | he talked me into doing these early morning cold water swims.
02:20:24.800 | I was learning a lot about pain and suffering,
02:20:26.940 | but also the beauty of cold water swims.
02:20:28.680 | And we were talking one morning and he said,
02:20:32.120 | "So what are you going to do to serve the world in 2019?"
02:20:34.720 | It's like, that's the way that like a Texan
02:20:36.960 | former SEAL talks.
02:20:37.920 | Like we're just literally,
02:20:39.000 | what are you gonna do to serve the world in 2019?
02:20:40.800 | Like, well, I run my lab.
02:20:41.960 | It's like, no, no, what are you gonna do that's new?
02:20:43.840 | And he wasn't forceful in it,
02:20:45.000 | but I was like, that's an interesting question.
02:20:46.280 | I said, well, if I had my way,
02:20:48.680 | I would just teach people, everyone about the brain
02:20:52.120 | 'cause I think it's amazing.
02:20:52.960 | He goes, "We'll do it."
02:20:54.240 | I go, "All right."
02:20:55.120 | He goes, "Shake on it."
02:20:56.120 | So we did it, you know?
02:20:57.200 | And so I started putting out these posts
02:20:59.240 | and it's grown to include a variety of things,
02:21:04.200 | but you asked about a governing philosophy.
02:21:06.040 | So I want to increase interest in the brain
02:21:09.780 | and in the nervous system and in biology generally.
02:21:12.040 | That's one major goal.
02:21:13.640 | I'd like to increase scientific literacy,
02:21:16.340 | which can't be rammed down people's throats
02:21:19.360 | of talking about how to look at a graph and statistics
02:21:22.500 | and Z-scores and P-values and genetics.
02:21:25.920 | It has to be done gradually in my opinion.
02:21:28.160 | I want to put valuable tools into the world,
02:21:32.160 | mainly tools that map to things that we're doing in our lab.
02:21:35.560 | So these will be tools centered around
02:21:37.760 | how to understand and direct one's states of mind and body.
02:21:41.560 | So reduce stress, raise one's stress threshold.
02:21:44.800 | So it's not always just about being calm.
02:21:46.200 | Sometimes it's about learning how to tolerate
02:21:47.880 | not being calm.
02:21:49.440 | Raise awareness for mental health.
02:21:52.920 | There's a ton of micro missions in this,
02:21:55.360 | but it all really maps back to, you know,
02:21:58.680 | like the eight and 10 year old version of me,
02:22:01.200 | which is I used to spend my weekends
02:22:02.780 | when I was a kid reading about weird animals.
02:22:04.840 | And I had this obsession with like medieval weapons
02:22:07.520 | and stuff like catapults.
02:22:08.920 | And then I used to come into school on Monday
02:22:11.120 | and I would ask if I could talk about it
02:22:13.200 | to the class and teach.
02:22:14.160 | And I just, it's really, I promise,
02:22:17.920 | and some people might not believe me,
02:22:18.960 | but it's really, I don't really like
02:22:20.880 | being the point of focus.
02:22:22.200 | I just get so excited about these gems
02:22:26.240 | that I find in the world in books and in experiments
02:22:29.560 | and in discussions with colleagues
02:22:31.400 | and discussions with people like you
02:22:32.920 | and around the universe.
02:22:35.080 | And I can't just compulsively,
02:22:37.120 | I got to tell people about it.
02:22:38.600 | So I try and package it into a form that people can access.
02:22:41.640 | You know, I think
02:22:43.160 | the reception has been really wonderful.
02:22:45.120 | Stanford has been very supportive, thankfully.
02:22:49.280 | I've even done some podcasts with them
02:22:51.520 | and they've reposted some stuff on social media.
02:22:54.080 | It's a precarious place to put yourself out there
02:22:56.240 | as a research academic.
02:22:57.340 | I think some of my colleagues,
02:22:59.220 | both locally and elsewhere probably wonder
02:23:01.080 | if I'm still serious about research, which I absolutely am.
02:23:05.060 | And I also acknowledge that, you know,
02:23:08.520 | their research and the research coming out of the field
02:23:11.840 | needs to be talked about.
02:23:13.320 | And not all scientists are good at translating that
02:23:16.620 | into a language that people can access.
02:23:19.080 | And I don't like the phrase, dumb it down.
02:23:21.080 | What I like to do is take a concept
02:23:25.000 | that I think people will find interesting and useful
02:23:27.660 | and offer it sort of like
02:23:30.600 | you would offer food to somebody visiting your home.
02:23:32.600 | You're not going to cram foie gras in their face.
02:23:34.900 | You're going to say, like, do you want a cracker?
02:23:37.460 | Like, and they say, yeah.
02:23:38.400 | And like, do you want something on that cracker?
02:23:39.900 | Like, do you like cheese?
02:23:40.780 | Like, yeah.
02:23:41.620 | Like, do you want Swiss cheese
02:23:43.000 | or you want that really like stinky, like French,
02:23:45.280 | I don't like cheese much, but,
02:23:47.480 | or do you want foie gras?
02:23:48.640 | Like, what's that?
02:23:49.460 | Like, so you're trying,
02:23:50.440 | the best information prompts more questions of interest,
02:23:54.580 | not questions of confusion, but questions of interest.
02:23:57.280 | And so I feel like one door opens,
02:23:59.020 | then another door opens, then another door opens.
02:24:01.120 | And pretty soon the image in my mind
02:24:04.280 | is you create a bunch of neuroscientists
02:24:05.800 | who are thinking about themselves neuroscientifically.
02:24:08.000 | And I don't begin to think
02:24:09.460 | that I have all the answers at all.
02:24:12.360 | I cast a neuroscience,
02:24:13.800 | sometimes a little bit of a psychology lens
02:24:15.920 | onto what I think are interesting topics.
02:24:18.620 | And, you know, I, you know,
02:24:21.720 | someday I'm going to go into the ground
02:24:23.680 | or the ocean or wherever it is I end up.
02:24:25.840 | And I'm very comfortable with the fact
02:24:30.840 | that not everyone's going to be happy
02:24:32.200 | with how I deliver the information,
02:24:33.560 | but I would hope that people would feel
02:24:36.440 | like some of it was useful and meaningful
02:24:38.280 | and got them to think a little bit harder.
02:24:41.160 | - Since you mentioned going into the ground
02:24:43.280 | and Viktor Frankl, "Man's Search for Meaning,"
02:24:49.280 | I reread that book quite often.
02:24:53.620 | What, let me ask the big ridiculous question about life.
02:24:59.520 | What do you think is the meaning of it all?
02:25:05.440 | Like, and maybe why do you,
02:25:07.400 | do you mention that book from a psychologist's perspective,
02:25:10.520 | which Viktor Frankl was,
02:25:12.320 | or do you ever think about the bigger philosophical questions
02:25:17.800 | that he raises about meaning?
02:25:19.560 | What's the meaning of it all?
02:25:21.380 | - One of the great challenges in assigning a good,
02:25:27.280 | you know, giving a good answer to the question of like,
02:25:29.040 | what's the meaning of life is I think illustrated best
02:25:32.840 | by the Viktor Frankl example,
02:25:35.200 | although there are other examples too,
02:25:37.120 | which is that our sense of meaning is very elastic
02:25:41.200 | in time and space.
02:25:43.120 | And I'm, we talked a little bit about this earlier,
02:25:46.200 | but it's amazing to me that somebody locked in a cell
02:25:50.480 | or a concentration camp can bring the horizon
02:25:54.240 | in close enough that they can then micro slice
02:25:57.380 | their environment so that they can find rewards
02:26:00.000 | and meaning and power and beauty,
02:26:03.480 | even in a little square box or a horrible situation.
02:26:08.240 | And I think this really speaks
02:26:09.840 | to one of the most important features of the human mind,
02:26:12.320 | which is we could do, let's take two opposite extremes.
02:26:15.980 | One would be, let's say the alarm went off right now
02:26:18.920 | in this building and the building started shaking.
02:26:21.760 | Our vision, our hearing, everything would be tuned
02:26:25.000 | to this space time bubble for those moments.
02:26:28.740 | And everything that we were processing,
02:26:31.440 | all that would matter, the only meaning would be
02:26:33.720 | get out of here safe, figure out what's going on,
02:26:35.820 | contact loved ones, et cetera.
02:26:37.360 | If we were to sit back, totally relaxed,
02:26:40.800 | we could do the, you know, what is it?
02:26:42.160 | I think it's called pale blue dot thing or whatever,
02:26:44.000 | where we could imagine ourselves in this room.
02:26:45.840 | And then where we are in the United States
02:26:47.260 | and this continent and the earth,
02:26:48.600 | and then it's peering down on us.
02:26:49.980 | And all of a sudden you get back,
02:26:51.460 | it can seem so big that all of a sudden
02:26:53.820 | it's meaningless, right?
02:26:55.740 | If you see yourself as just one brief glimmer
02:26:59.480 | in all of time and all of space, you go to, I don't matter.
02:27:03.920 | And if you go to, oh, every little thing that happens
02:27:07.460 | in this text thread or this, you know,
02:27:09.320 | comment section on YouTube or Instagram,
02:27:11.300 | your space time bubble is tiny.
02:27:13.940 | Then everything seems inflated and the brain
02:27:17.440 | will contract and dilate its space-time,
02:27:21.760 | vision and time, but also sense of meaning.
02:27:26.640 | And that's beautiful.
02:27:27.840 | And it's what allows us to be so dynamic
02:27:29.920 | in different environments.
02:27:31.000 | And we can pull from the past and the present and future.
02:27:34.240 | It's why I had to include examples like Nelson Mandela
02:27:36.400 | and Viktor Frankl.
02:27:39.240 | It makes sense that it wasn't just about grinding it out.
02:27:41.900 | They had to find those dopamine rewards,
02:27:43.700 | even in those little boxes they were forced into.
02:27:46.880 | So I'm not trying to dodge an answer,
02:27:50.840 | but for me personally, and I think about this a lot
02:27:54.480 | because I have this complicated history in science
02:27:59.000 | where my undergraduate, graduate,
02:28:01.500 | and postdoctoral advisors all died young.
02:28:04.280 | So, you know, and they were wonderful people
02:28:07.200 | and had immense importance in my life.
02:28:09.800 | But what I realized is that we can get so fixated
02:28:14.800 | on the thing that we're experiencing
02:28:17.500 | as holding tremendous meaning,
02:28:19.120 | but it only holds that meaning for as long
02:28:21.800 | as we're in that space time regime.
02:28:25.260 | And this is important because what really gives meaning
02:28:29.620 | is the understanding that you can move
02:28:32.020 | between these different space time dimensionalities.
02:28:35.100 | And I'm not trying to sound like a theoretical physicist
02:28:37.740 | or anyone that thinks about the cosmos in saying that.
02:28:41.920 | It's really the fact that sometimes we say
02:28:44.960 | and do and think things and it feels so important.
02:28:47.760 | And then two days later, we're like, what happened?
02:28:51.160 | Well, you had a different brain processing algorithm
02:28:54.720 | entirely, you were in a completely different state.
02:28:57.080 | And so what I want to do in this lifetime is I want to,
02:29:00.220 | I want to engage in as many different levels
02:29:05.640 | of contraction and dilation of meaning as possible.
02:29:09.540 | I want to go to the micro.
02:29:11.220 | I sometimes think about this.
02:29:12.400 | I'm like, if I just pulled over the side of the road,
02:29:14.100 | I bet you there's an ant hill there
02:29:15.620 | and their whole world is fascinating.
02:29:17.380 | You can't stay there.
02:29:19.260 | And you also can't stay staring up at the clouds
02:29:21.620 | and just think about how we're just these little beings
02:29:24.180 | and it doesn't matter.
02:29:25.340 | The key is the journey back and forth,
02:29:28.260 | up and down that staircase, back and forth
02:29:31.700 | and back and forth.
02:29:32.540 | And my goal is to get as many trips up and down
02:29:34.480 | that staircase as I can before the reaper comes for me.
02:29:37.720 | - Oh, beautiful.
02:29:38.560 | So the dance of dilation and contraction
02:29:42.020 | between the different spaces, zoom in, zoom out
02:29:45.380 | and get as many steps in on that staircase.
02:29:49.760 | - That's my goal anyway.
02:29:51.220 | And I've watched people die.
02:29:52.420 | I watched my postdoc advisor die, wither away.
02:29:55.260 | My graduate advisor, it was tragic,
02:29:57.100 | but they found beauty in these closing moments
02:30:00.180 | because their bubble was their kids in one case
02:30:03.900 | or like one of them was a Giants fan
02:30:06.460 | and like got to see a Giants game, you know,
02:30:08.340 | in her last moments.
02:30:09.420 | And like, and you just realize like it's a Giants game
02:30:12.540 | but not in that moment because time is closing.
02:30:15.020 | And so those time bins feel huge
02:30:16.740 | because she's slicing things so differently.
02:30:19.180 | So I think learning how to do that better
02:30:23.180 | and more fluidly, recognizing where one is
02:30:26.480 | and not getting too attached to the idea
02:30:29.300 | that there's one correct answer.
02:30:31.160 | Like that's what brings meaning.
02:30:33.220 | That's my goal anyway.
02:30:35.220 | - I don't think there's a better way to end it, Andrew.
02:30:37.760 | I really appreciate that you would come down
02:30:40.580 | and contract your space time
02:30:44.180 | and focus on this conversation for a few hours.
02:30:46.580 | It is a huge honor.
02:30:49.580 | I'm a huge fan of yours, as I told you.
02:30:51.060 | I hope you keep growing and educating the world
02:30:54.340 | about the human mind.
02:30:56.500 | Thanks for talking today.
02:30:58.100 | - Thank you.
02:30:59.040 | I really appreciate the invitation to be here.
02:31:01.500 | And people might think that I'm saying it
02:31:03.020 | just 'cause I'm here, but I'm a huge fan of yours.
02:31:04.940 | I send your podcasts to my colleagues and other people.
02:31:08.060 | And I think what you're doing isn't just amazing,
02:31:12.180 | it's important.
02:31:13.160 | And so thank you.
02:31:14.480 | - Thanks for listening to this conversation
02:31:17.220 | with Andrew Huberman.
02:31:18.440 | And thank you to our sponsors.
02:31:20.300 | 8Sleep, a mattress that cools itself
02:31:22.900 | and gives me yet another reason to enjoy sleep.
02:31:26.020 | SEMrush, the most advanced SEO optimization tool
02:31:29.460 | I've ever come across.
02:31:30.940 | And Cash App, the app I use to send money to friends.
02:31:35.060 | Please check out the sponsors in the description
02:31:37.460 | to get a discount and to support this podcast.
02:31:41.260 | If you enjoy this thing, subscribe on YouTube,
02:31:43.540 | review it with five stars on Apple Podcasts,
02:31:45.780 | follow on Spotify, support on Patreon,
02:31:48.300 | or connect with me on Twitter @LexFriedman.
02:31:51.620 | And now let me leave you with some words from Carl Jung.
02:31:55.560 | I am not what happened to me.
02:31:59.060 | I am what I choose to become.
02:32:01.700 | Thank you for listening and hope to see you next time.
02:32:05.620 | (upbeat music)
02:32:08.200 | (upbeat music)