Andrew Huberman: Neuroscience of Optimal Performance | Lex Fridman Podcast #139
Chapters
0:00 Introduction
2:29 Fear
10:41 Virtual reality
14:25 Claustrophobia
16:13 Skydiving
17:48 Overcoming fears
22:48 Optimal performance
26:02 Deep work
41:27 Psychedelics
45:13 Deep work
58:53 Everything in the brain is an abstraction
66:11 Human vision system
77:47 Neuralink
105:17 Science of consciousness
120:05 David Goggins
137:09 Science communication
144:41 Man's Search for Meaning
00:00:00.000 |
The following is a conversation with Andrew Huberman, 00:00:14.600 |
He has a great Instagram account @hubermanlab, 00:00:28.760 |
giving, and still succeed in the science world. 00:00:34.160 |
followed by some thoughts related to the episode. 00:00:40.120 |
and gives me yet another reason to enjoy sleep. 00:00:43.000 |
SEMrush, the most advanced SEO optimization tool 00:00:48.040 |
and Cash App, the app I use to send money to friends. 00:00:52.080 |
Please check out these sponsors in the description 00:00:54.480 |
to get a discount and to support this podcast. 00:00:57.920 |
As a side note, let me say that I heard from a lot of people 00:01:00.880 |
about the previous conversation I had with Yaron Brook 00:01:12.000 |
was more critical on occasion than I meant to be, 00:01:14.680 |
didn't push on certain points that I should have, 00:01:19.800 |
about some major things that happened in the past 00:01:27.200 |
that if we are to have difficult conversations, 00:01:30.000 |
we have to give each other space to make mistakes, 00:01:35.560 |
Taking one or two statements from a three-hour podcast 00:01:38.720 |
and suggesting that they encapsulate who I am, 00:01:44.800 |
is a standard that we can't hold each other to. 00:01:48.680 |
I don't think anyone could live up to that kind of standard. 00:02:02.320 |
but please try to do so with love and with patience. 00:02:10.560 |
Whether I'm successful at that or not, we shall see. 00:02:14.080 |
If you enjoy this thing, subscribe on YouTube, 00:02:24.920 |
And now, here's my conversation with Andrew Huberman. 00:02:28.960 |
You've mentioned that in your lab at Stanford, 00:02:32.800 |
you induce stress by putting people into a virtual reality 00:02:36.680 |
and having them go through one of a set of experiences. 00:02:40.400 |
I think you mentioned this on Rogan or with Whitney, 00:02:59.360 |
- Yeah, so it depends on the person, obviously. 00:03:04.200 |
'Cause you can, without going too far down the rabbit hole 00:03:15.400 |
And you can't really have trauma without fear and stress, 00:03:18.120 |
but you could have fear and stress without trauma. 00:03:24.240 |
for even having a laboratory that studies these things, 00:03:33.880 |
I mean, the field of understanding emotions and states, 00:03:38.880 |
which is mainly what I'm interested in, is very complicated. 00:03:42.480 |
But we can do away with a lot of complicated debate 00:03:48.440 |
what we're looking for to assign it a value of fear 00:03:56.160 |
so increases in heart rate, increases in breathing, 00:04:01.600 |
all the hallmark signature features of the stress response. 00:04:07.900 |
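The operational definition described here — call it fear/stress when the hallmark physiological markers rise together above baseline — can be sketched as a toy rule. The marker names, baselines, and thresholds below are illustrative assumptions, not the lab's actual criteria.

```python
# Toy sketch of the operational definition above: flag a "stress
# response" when several autonomic markers are elevated together over a
# personal baseline. All numbers here are illustrative assumptions.

BASELINE = {"heart_rate": 60.0, "breaths_per_min": 12.0, "skin_conductance": 2.0}

def stress_score(sample, baseline=BASELINE):
    """Fraction of markers elevated at least 20% over baseline."""
    elevated = sum(
        1 for key, base in baseline.items() if sample[key] >= 1.2 * base
    )
    return elevated / len(baseline)

def is_stress_response(sample, threshold=0.67):
    """Call it a stress response only when most markers rise together."""
    return stress_score(sample) >= threshold
```

A calm reading (all markers near baseline) scores 0 and is not flagged; a reading where heart rate, breathing, and skin conductance all jump is flagged. In practice one would use continuous signals and per-subject baselines rather than a single snapshot.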
we have the benefit of getting neurosurgery patients 00:04:12.320 |
and their insula and the orbital frontal cortex 00:04:16.600 |
So these are chronically implanted electrodes, 00:04:20.440 |
and we can start seeing some central features 00:04:29.180 |
as trivial as it might seem in listening to it, 00:04:42.400 |
with a very strong stress, if not fear response. 00:04:45.920 |
And that's because the visual vestibular apparati, right? 00:04:51.460 |
to the balanced semicircular canals of the inner ears, 00:04:55.280 |
but really all of that pulls all your physiology, 00:05:09.400 |
because of the optic flow, that one is universal. 00:05:12.780 |
So we've got a dive with great white sharks experience 00:05:19.820 |
and brought back 360 video that's built out pretty- 00:05:29.820 |
a lot of the motivation was that a lot of the studies 00:05:48.340 |
it just wasn't creating a real enough experience. 00:05:55.580 |
and that thing of presence of people momentarily, 00:06:09.820 |
which was wild because it's an incredible movie, 00:06:25.460 |
and yet it was still scary that people somehow 00:06:28.540 |
were able to put themselves into that experience 00:06:39.640 |
If we get people who have generalized anxiety, 00:06:41.860 |
these are people who wake up and move through life 00:06:45.420 |
at a generally higher state of autonomic arousal 00:07:03.460 |
like my sister, for instance, is afraid of sharks, 00:07:17.340 |
- We have you step off a platform, virtual platform, 00:07:22.500 |
and it's a flat floor in my lab, but you're up there. 00:07:26.780 |
- Well, you actually allow them the possibility 00:07:28.920 |
in the virtual world to actually take the leap of faith. 00:07:46.040 |
but it gets increasingly complex and it speeds up on them. 00:07:51.700 |
where they just can't make the motor commands fast enough. 00:07:59.760 |
All of a sudden, they're on a narrow platform 00:08:10.200 |
to continue across that platform to continue the game. 00:08:13.700 |
And some people, they actually will get down on the ground 00:08:24.080 |
And so what this really tells us is the power of the brain 00:08:27.560 |
to enter these virtual states as if they were real. 00:08:31.040 |
And we really think that anchoring the visual 00:08:40.200 |
There's also the potential, and we haven't done this yet, 00:08:46.160 |
is that when we started all this back in 2016, 00:08:56.680 |
we knew that we could get people into this presence 00:08:59.040 |
where they think they're in a real experience more quickly. 00:09:03.040 |
who I was introduced to because of the project, 00:09:09.760 |
but he dives with great white sharks and he leaves the cage. 00:09:13.120 |
And so we worked with him to build a 360 video apparatus 00:09:25.560 |
There were some interesting moments out there of danger, 00:09:30.880 |
And then we realized we need to do this for everything. 00:09:50.960 |
But if I turned back toward you, then it would be silent. 00:10:03.640 |
and I turn around and then I turn back to you 00:10:07.800 |
that might seem like more of an impending threat, 00:10:23.960 |
that's when you can really drive people's nervous system 00:10:26.640 |
down these paths of high states of stress and fear. 00:10:30.040 |
Now, we don't wanna traumatize people, obviously, 00:10:34.800 |
that allow them to calm themselves in these environments. 00:10:44.480 |
this whole construction that you've developed 00:10:46.900 |
We did this a little bit with autonomous vehicles. 00:10:51.480 |
So to try to understand the decision-making process 00:11:06.560 |
I was so surprised how real that whole world was. 00:11:15.840 |
but I was still afraid of being hit by a car. 00:11:27.320 |
I mean, it wasn't like ultra realistic simulation. 00:11:35.680 |
we're just programmed to not necessarily recoil, 00:11:40.680 |
but to be cautious about that edge and that depth. 00:11:43.320 |
And then looms, things coming at us that are getting larger. 00:11:46.160 |
There are looming sensing neurons, even in the retina, 00:11:48.720 |
at a very, very early stage of visual processing. 00:11:56.280 |
and folks learn how to not get eaten by great white sharks 00:12:01.400 |
is as they start lumbering in, you swim toward them. 00:12:04.520 |
And they get very confused when you loom on them, 00:12:09.000 |
Clearly they could eat you if they wanted to, 00:12:13.920 |
toward any creature that that creature questions 00:12:30.160 |
is you couldn't do 360 video because there's a game 00:12:37.200 |
So maybe people realize this, maybe they don't, 00:12:44.600 |
well, it's actually not that obvious to people, 00:12:46.160 |
but you can't change the reality that you're watching. 00:12:54.280 |
is there something fundamental about fear and stress 00:12:58.520 |
that the interactive element is essential for? 00:13:00.800 |
Or do you find you can arouse people with just the video? 00:13:19.960 |
and it just creates a visceral response for me. 00:13:25.280 |
and they have lower levels of stress and fear in there. 00:13:28.640 |
But one way that we can get them to feel more of that 00:13:39.340 |
as opposed to just walk to a little safe corner, 00:14:02.360 |
because we never want to project our own ideas 00:14:10.600 |
- And humans aren't great at explaining how they feel, 00:14:13.640 |
but it's a lot easier to understand what they're saying 00:14:22.680 |
plus these physiological and neurophysiological signals. 00:14:25.480 |
- Is there something you've learned about yourself, 00:14:28.880 |
Like you said snakes, is there something that, 00:14:32.000 |
like if I were to torture you, so I'm Russian, 00:14:37.960 |
how can I murder this person that entered the room? 00:14:48.920 |
that I never considered myself claustrophobic, 00:14:58.000 |
But I, before COVID, I started going to this Russian banya, 00:15:09.840 |
The platza, they're hitting you with the leaves. 00:15:15.120 |
And there were a couple of times where I thought, 00:15:20.440 |
It's in a city where there are a lot of earthquakes. 00:15:23.640 |
Like if this place crumbled and we were stuck in here, 00:15:29.600 |
I don't like small confined spaces with poor ventilation. 00:15:33.360 |
So I realized, I think I have some claustrophobia. 00:15:37.020 |
So I put myself into our own claustrophobia stimulus, 00:15:52.440 |
But then as we start modulating the environment 00:16:07.280 |
And so I think I've unhatched a bit of a claustrophobia. 00:16:23.840 |
- Yeah, I jumped out, but it was fundamentally 00:16:27.520 |
different experience than, I guess there could be 00:16:31.240 |
a lot of different flavors of fear of heights maybe. 00:16:33.960 |
But the one I have didn't seem to be connected 00:16:40.480 |
It's a very different, 'cause like once you accept it, 00:16:43.200 |
you're going to jump, then it's a different thing. 00:16:47.040 |
I think what I'm afraid of is the moments before it 00:17:07.560 |
It's the fact that it's not supposed to happen 00:17:12.160 |
I guess I'm not being eloquent in this description, 00:17:17.920 |
that was actually philosophically liberating. 00:17:27.680 |
And then at a certain point, there's no surface anymore 00:17:31.520 |
And it's all of a sudden the world becomes three dimensional 00:17:37.240 |
that the concept of like, of earth disappears 00:17:45.960 |
- That was wild, but I'm still terrified of heights. 00:17:48.040 |
So, I mean, one thing I want to ask just on fear, 00:18:01.120 |
And that comes from two, from a research study standpoint, 00:18:11.320 |
where we can probe around different brain areas 00:18:13.120 |
and try and figure out what interesting brain areas 00:18:25.680 |
showing that what at first might seem a little bit obvious, 00:18:31.640 |
which is that there are really three responses to fear. 00:18:36.960 |
You can freeze, you can retreat and back up, or you can go forward. 00:18:40.720 |
And there's a single hub of neurons in the midbrain, 00:18:50.600 |
And depending on which neurons are active there, 00:18:54.760 |
there's a much higher probability that a mouse, 00:18:56.720 |
or it turns out, or a human will advance in the face of fear 00:19:09.600 |
that the lowest level of stress or autonomic arousal 00:19:22.720 |
the retreat response has a slightly higher level 00:19:34.560 |
If you're very calm, it's easy to stay quiet and still. 00:19:39.280 |
it's harder to maintain that level of quiet and stillness. 00:19:44.000 |
You see this also in animals that are stalking, 00:19:51.800 |
So the freeze response is actually an active response, 00:20:01.000 |
was associated with the forward movement toward the threat. 00:20:07.720 |
However, the forward movement in the face of threat 00:20:11.520 |
was linked to the activation of what we call collateral, 00:20:18.800 |
that connects to the dopamine circuits for reward. 00:20:34.120 |
it actually maps very well to cognitive behavioral therapy 00:20:36.880 |
and a lot of the existing treatments for trauma, 00:20:44.200 |
So otherwise you exist in this very low level 00:20:49.040 |
where the circuits for autonomic arousal are humming 00:20:54.240 |
And we have to remember that stress and fear and threat 00:20:56.560 |
were designed to agitate us so that we actually move. 00:21:10.160 |
But that's actually not the maximum stress response. 00:21:30.040 |
and then it drops off as you get really stressed. 00:21:34.520 |
of the distribution where you perform very well 00:21:38.440 |
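The relationship described here — performance rising with arousal up to a point and then dropping off as stress gets too high — is the classic inverted-U (Yerkes-Dodson-style) curve. A minimal sketch of that shape, with a made-up sweet spot and width rather than fitted values:

```python
import math

def performance(arousal, sweet_spot=0.6, width=0.2):
    """Inverted-U curve: performance peaks at a moderate arousal level
    and falls off when arousal is too low (drowsy) or too high (panicked).
    sweet_spot and width are illustrative parameters, not measured values."""
    return math.exp(-((arousal - sweet_spot) ** 2) / (2 * width ** 2))
```

On this toy curve, `performance(0.6)` is maximal, while both very low and very high arousal score worse — which is the point being made: the far right tail of the stress distribution is a place you can learn to perform well, but past it performance collapses.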
And so we've been spending a lot of time in humans 00:21:42.640 |
to get people comfortable to go to that place. 00:21:48.960 |
how there are heightened states of cognition there. 00:21:51.520 |
There's changes in time perception that allow you 00:21:59.520 |
This is the matrix as a lot of people think of it. 00:22:03.240 |
But we tend to think about fear as all the low level stuff 00:22:10.360 |
there are a lot of different features to the fear response. 00:22:15.980 |
and we think about it from a circuit perspective 00:22:20.680 |
And we try and weigh that against the threat. 00:22:23.600 |
in unnecessary risk, but that's where the VR is fun 00:22:42.680 |
without ever experiencing that kind of focus. 00:22:50.480 |
I really think that's where optimal performance lies. 00:22:55.120 |
but what's performance and what's optimal performance? 00:23:11.080 |
and it varies depending on task and environment. 00:23:15.080 |
So one way where we can make it a little bit more operational 00:23:18.000 |
and concrete is to say there is a sweet spot, if you will, 00:23:23.000 |
where the level of internal autonomic arousal, 00:23:27.600 |
aka stress or alertness, whatever you want to call it, 00:23:30.920 |
is ideally matched to the speed of whatever challenge 00:23:38.400 |
So we all have perception of the outside world 00:23:46.320 |
when interoception and exteroception are matched 00:23:58.080 |
So for instance, if you're, I don't play guitar, 00:24:01.420 |
So let's say you're trying to learn something new 00:24:03.760 |
I'm not saying that being in these super high states 00:24:10.960 |
It may be that your internal arousal needs to be at a level 00:24:19.040 |
has to be well-matched to the information coming in 00:24:22.240 |
and what you're trying to do in terms of performance, 00:24:25.440 |
in terms of playing chords and notes and so forth. 00:24:32.000 |
and animals and humans need to react very quickly, 00:24:35.000 |
the higher your state of autonomic arousal, the better, 00:24:40.960 |
just because of the way the autonomic system works. 00:24:47.080 |
and movement of the lens essentially changes your optics. 00:24:50.520 |
And that's obvious, but with the change in optics 00:24:53.760 |
is a change in how you bin time and slice time, 00:24:56.100 |
which allows you to get more frames per second readout. 00:25:01.360 |
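The "binning time" idea — higher arousal buying a finer temporal readout of the same stream of events — can be illustrated with a toy resampler. The bin widths and signal below are arbitrary assumptions, just to show how coarser bins wash out a brief event.

```python
def bin_signal(samples, bin_width):
    """Average a stream of samples into bins of bin_width samples each.
    Smaller bin_width ~ higher effective "frames per second"."""
    return [
        sum(samples[i : i + bin_width]) / len(samples[i : i + bin_width])
        for i in range(0, len(samples), bin_width)
    ]

# A brief event in the stream survives fine binning but is diluted by
# coarse binning -- a toy analogue of arousal changing how finely the
# nervous system slices time.
signal = [0, 0, 0, 10, 0, 0, 0, 0]
fine = bin_signal(signal, 1)    # high "frame rate": event preserved at 10
coarse = bin_signal(signal, 4)  # low "frame rate": event averaged down
```

With fine bins the spike reads out at full amplitude; with four-sample bins it is averaged down to a quarter of its size, the toy version of missing a fast event when your temporal resolution is too coarse.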
it might actually be that you want to be almost sleepy, 00:25:05.240 |
almost in a kind of drowsy state to be able to, 00:25:09.880 |
and I don't play music, so I'm guessing here, 00:25:16.480 |
that your fingers can follow an external cue. 00:25:23.160 |
And so there is no perfect autonomic state for performance. 00:25:33.040 |
because they're not well operationally defined enough, 00:25:38.040 |
but I do believe that optimal or peak performance 00:25:41.140 |
is going to arise when internal state is ideally matched 00:25:45.000 |
to the space-time features of the external demands. 00:25:49.280 |
- So there's some slicing of time that happens, 00:25:58.240 |
in order to adjust to the dynamics of the stimulus. 00:26:33.640 |
It turns it into a skill that you can get better at. 00:26:47.840 |
The programming you have to do that kind of thing for. 00:26:52.320 |
like you hold it and then you take steps with it, 00:26:57.400 |
and you're holding relatively complicated things 00:27:04.320 |
I mean, the hardest part is there's frustrating things, 00:27:12.120 |
so you have to calmly turn around and take a step back. 00:27:18.400 |
Is there something about your study of optimal performance 00:27:38.240 |
but looking at things through the lens of neuroscience. 00:27:41.080 |
So what you're describing can be mapped fairly well 00:28:05.120 |
And so he thinks about respiration and breathing 00:28:11.280 |
Very, very interesting and I think important work. 00:28:20.240 |
'cause this is lifted from a coffee conversation 00:28:34.480 |
depending on demands and thinking in particular. 00:28:39.260 |
I'd be curious to know if you agree or disagree. 00:28:42.120 |
He said, "Most great mathematics is done by people 00:29:02.720 |
on working memory to work out theorems in math 00:29:09.400 |
and run back and forth between them, updating them. 00:29:21.400 |
but an increased reliance on some sort of deep memory 00:29:28.040 |
probably stuff that's moved out of the hippocampus 00:29:31.960 |
and is more some episodic and declarative stuff, 00:29:36.460 |
but really, so you're pulling from your library, basically. 00:29:39.280 |
It's not all RAM, it's not all working memory. 00:29:43.240 |
and physicists tend to have very active careers 00:30:03.240 |
'cause the people in the labs are doing them, of course, 00:30:06.400 |
and that work does tend to rely on insights gained 00:30:20.360 |
than the working memory of the mathematician. 00:30:28.240 |
and trying to iterate and keeping a lot of plates spinning, 00:30:48.960 |
to the point where they start doing less well. 00:30:55.040 |
We've had SEAL team operators come to the lab, 00:30:57.600 |
we've had people from other units in the military, 00:31:01.400 |
we've had a range of intellects and backgrounds 00:31:09.640 |
as a function of the demands of speed of processing 00:31:20.200 |
So what you're describing is very interesting 00:31:30.640 |
so this is a regime I don't really fully know, 00:31:33.440 |
so I don't want to comment about it in any way. 00:31:45.920 |
keeping every bit of reflexive circuitry at bay. 00:31:49.180 |
The one that makes you want to get up and use the restroom, 00:31:51.040 |
the one that makes you want to check your phone, 00:31:59.140 |
which we know are very important for working memory. 00:32:02.320 |
- Yeah, let me try to think through this a little bit. 00:32:16.640 |
but if I were to say some of the most challenging things 00:32:29.920 |
but is this kind of first principles thinking step 00:32:45.880 |
how do I do this differently than I've done it before? 00:33:01.360 |
you've gotten quite good at this particular pattern 00:33:13.480 |
that solves this particular problem in programming, say. 00:33:27.000 |
I don't know if that's accessing working memory. 00:33:44.440 |
you're querying like functional things, like. 00:33:55.560 |
because I know some of the people listening to this, 00:34:09.480 |
in these kinds of processes without being exhaustive. 00:34:15.140 |
but not assume that that's the only thing involved, right? 00:34:26.620 |
and yet there may be global states that underlie this 00:34:29.840 |
that make prefrontal circuitry work differently 00:34:37.180 |
I mean, there's a lot of mysteries about this. 00:34:39.360 |
But, so I just want to make sure that we sort of are, 00:35:20.060 |
which is when you're trying out lots of things, 00:35:30.780 |
because the higher level of autonomic arousal 00:35:34.080 |
the more rigidly you're going to analyze space and time. 00:35:53.600 |
- You are facing a tiger and trying to figure out 00:35:58.120 |
- And that's primarily going to be determined 00:36:08.800 |
You know, when a scent hound goes out in the environment, 00:36:11.200 |
they have depth to the odor trails they're following. 00:36:19.040 |
You might say, oh, well, the smell's getting more intense. 00:36:21.160 |
Aha, but they actually have three-dimensional odor trails. 00:36:37.720 |
And this shows up for the musicians out there, 00:36:39.560 |
metronomes are a great way to play with this. 00:36:55.360 |
and through touch for us, but mainly through vision. 00:37:11.040 |
if you're in a rigid high level of autonomic arousal, 00:37:16.920 |
that are in this space regime, this time regime matches. 00:37:22.060 |
Whereas creativity, I always think the lava lamp 00:37:31.860 |
And so in drowsy states, sleeping and drowsy states, 00:37:36.860 |
space and time become dislodged from one another somewhat, 00:37:41.280 |
And I think that's why a lot of solutions come to people 00:37:46.800 |
And this could even take us into a discussion 00:37:55.720 |
by just creating a spontaneous bursting of neurons 00:37:58.360 |
and hallucinations, but the 5-HT2C and 5-HT2A receptors, 00:38:03.360 |
which are the main sites for things like LSD and psilocybin 00:38:07.440 |
and some of the other, the ones that create hallucinations, 00:38:16.000 |
in the collection of neurons that encase the thalamus, 00:38:20.700 |
which is where all the sensory information goes into, 00:38:23.240 |
a structure called the thalamic reticular nucleus. 00:38:25.840 |
And it's an inhibitory structure that makes sure 00:38:32.620 |
that I'm mainly focused on whatever I'm seeing visually, 00:38:39.460 |
Under conditions where people take psychedelics 00:38:41.600 |
and these particular serotonin receptors are activated, 00:38:45.460 |
that inhibitory shell, it's literally shaped like a shell, 00:38:56.860 |
are because the lateral connectivity in layer five 00:39:00.180 |
of cortex across cortical areas is increased. 00:39:15.640 |
based on the neurons in the retina and the cortex. 00:39:17.920 |
On psychedelics, this could be very strange experience. 00:39:31.360 |
So under conditions of these increased activation 00:39:38.600 |
space and time across sensory areas starts being fluid. 00:39:46.480 |
and making a prediction based on vision alone. 00:39:49.200 |
I'm now, this is where people talk about hearing sites, 00:39:55.200 |
You start linking, this might actually make a sound 00:39:59.320 |
Now I'm not suggesting people run out and do psychedelics 00:40:10.880 |
If people choose to do that, that's their business. 00:40:15.400 |
this lateral connectivity is increased as well. 00:40:20.360 |
And what's, these are through these so-called 00:40:32.560 |
not from being in that narrow tunnel of space-time 00:40:38.800 |
and trying to, well, iterate if this, then this, 00:40:48.720 |
that was an algorithm that you never had in existence before 00:40:55.360 |
And all of a sudden a new possibility comes to mind. 00:41:05.000 |
in order to come up with something meaningful. 00:41:11.760 |
with Michael Pollan's book and other things happening 00:41:14.720 |
about psychedelics as a pathway to exploration 00:41:18.640 |
But the real question is what you export back 00:41:23.800 |
but if you can't bring anything back from them, 00:41:38.840 |
I definitely want to, I did shrooms a couple of times. 00:41:42.840 |
I definitely want to figure out how I can experiment 00:42:01.800 |
that asks like, how do I participate in these studies? 00:42:04.560 |
- Yeah, well, there are some legality issues. 00:42:16.760 |
Hypnosis is something that my colleague, David Spiegel, 00:42:19.360 |
associate chair of psychiatry at Stanford, works on, 00:42:28.480 |
and yet deeply relaxed, where new algorithms, 00:42:38.320 |
And I think, so if I had a, I'm part of a group 00:42:47.600 |
and talk about just wild ideas, but they try and implement. 00:42:57.560 |
and some other backgrounds, academic backgrounds. 00:43:21.120 |
but you could launch yourself out of that state 00:43:23.520 |
and place yourself into a linear real-world state 00:43:28.280 |
whatever it was that happened in that experience 00:43:37.880 |
a lot of the reason people do them is they're lying. 00:43:40.560 |
They say they want plasticity and they want all this stuff. 00:43:47.880 |
So they're kind of seeking something unusual. 00:43:52.860 |
they're not trying to make their brain better. 00:43:54.120 |
They're just trying to experience something really amazing. 00:43:57.800 |
But the problem is space and time are so unlocked 00:44:02.040 |
in these states, just like they are in dreams, 00:44:04.540 |
that you can really end up with a whole lot of nothing. 00:44:14.760 |
a meaningful experience when you didn't bring anything back. 00:44:25.720 |
but you don't actually have tools to bring back. 00:44:29.080 |
Or I'm just sorry, actually concrete ideas to bring back. 00:44:34.680 |
Yeah, I wonder if it's possible to do that with a mind 00:44:47.560 |
I mean, maybe it'll be done through pharmacology. 00:44:49.240 |
It's just that it's hard to do on/off switches 00:44:51.680 |
in human pharmacology the way we have them for animals. 00:44:54.480 |
I mean, we have, you know, Cre and Flp recombinases 00:44:57.880 |
and we have, you know, channelrhodopsins and halorhodopsins 00:45:05.680 |
but I think you could do it with virtual reality, 00:45:11.000 |
that bring more of the somatic experience into it. 00:45:13.680 |
- You're of course a scientist who's studying humans 00:45:23.240 |
And, you know, I play, when these deep thinking, 00:45:30.320 |
like in the morning, that there's times when my mind 00:45:34.160 |
is so like eloquent at being able to jump around 00:45:44.000 |
from a third person perspective and enjoy that. 00:45:51.640 |
And I'm very conscious of this like little creature 00:46:01.080 |
if we're being honest, maybe a couple hours a day. 00:46:06.360 |
Not always, well, early part of the day for me 00:46:15.480 |
single and no meetings, I don't schedule any meetings. 00:46:28.200 |
But after a traditionally defined full night's sleep, 00:46:43.520 |
and it's the deepest dives intellectually that I make. 00:46:51.160 |
And I try to bring that to the other parts of the day 00:46:53.280 |
that don't have it and treasure them even more 00:47:05.320 |
like check social media or something like that. 00:47:09.800 |
in intellectual life is those mental moments of clarity. 00:47:16.080 |
And I wonder, I'm learning how to control them. 00:47:23.140 |
Well, because if you learn how to titrate caffeine, 00:47:28.320 |
But if you learn to titrate caffeine with time of day 00:47:30.320 |
and the kind of work that you're trying to do, 00:47:38.640 |
sometimes people want a little bit of background music, 00:47:40.280 |
sometimes they want less, these kinds of things. 00:47:44.720 |
because the one thing that's not often discussed 00:47:50.760 |
I think it's called "Winston Churchill's Nap." 00:48:01.220 |
someone who I respect a lot was mentoring me said, 00:48:08.800 |
someone else's sensory experience early in the day." 00:48:14.520 |
I sleep well, but I don't emerge from that very quickly. 00:48:17.440 |
I need a lot of caffeine to wake up and whatnot. 00:48:21.040 |
But there's this concept of getting the download from sleep, 00:48:30.440 |
the stuff that was meaningless from the previous day, 00:48:33.120 |
but you were also running variations on these algorithms 00:48:36.060 |
of whatever it is you're trying to work out in life 00:48:55.080 |
you know, it's a brain state that would be useless in waking 00:48:59.200 |
You'd be the person talking to yourself in the hallway 00:49:01.040 |
or something about something that no one else can see. 00:49:06.280 |
the theory is that you arrive at certain solutions 00:49:12.880 |
unless you interfere with them by bringing in, 00:49:22.520 |
Someone is the conductor of your thoughts in that case. 00:49:32.480 |
in the early part of the day and asking the question, 00:49:39.160 |
am I in more of an interoceptive or exteroceptive mode? 00:49:42.260 |
And depending on the kind of work you need to do, 00:49:52.140 |
allowing yourself to transition out of that sleep state 00:49:59.580 |
And then, and only then allowing things like music, 00:50:04.260 |
doesn't mean you shouldn't talk to loved ones 00:50:07.300 |
But some people have taken this to the extreme. 00:50:19.720 |
that he wouldn't look at faces in the early part of the day 00:50:23.160 |
because he just didn't want anything else to impact him. 00:50:27.040 |
Now, he didn't have the most rounded life, I suppose. 00:50:31.600 |
But if you're talking about cognitive performance, 00:50:40.340 |
that describe the habits of brilliant people, 00:50:45.300 |
like writers, they do control that sensory experience 00:50:54.220 |
they have a particular habit of several hours 00:50:59.660 |
they do, not doing anything else for the rest of the day, 00:51:05.900 |
I think they make it very difficult to live with them. 00:51:26.880 |
who has a mansion, a castle on top of a cliff 00:51:38.340 |
She wants to control how much sound is coming in. 00:51:41.740 |
- She's very sensitive to sound and environment. 00:51:46.140 |
but like clearly puts a lot of attention into details. 00:51:55.580 |
I'm also, I don't like, that feels like a slippery slope. 00:52:13.140 |
because your mind gets comfortable with that. 00:52:24.020 |
It really annoys me when there's sounds and voices 00:52:26.580 |
and so on, but I feel like I can train my mind 00:52:38.300 |
I mean, we're talking about what's best for work 00:52:41.420 |
is not always what's best for completeness of life. 00:52:50.540 |
There are probably 50 ways that the brain can create 00:52:53.500 |
what looks like autism or what people call autism. 00:52:58.300 |
that have come out of David Ginty's lab at Harvard Med 00:53:07.020 |
where nothing is disrupted in the brain proper 00:53:12.120 |
but the sensory neurons, the ones that innervate the skin 00:53:16.480 |
and the ears and everything are hypersensitive. 00:53:22.800 |
So this means that the overload of sensory information 00:53:27.800 |
and sensory experience that a lot of autistics feel, 00:53:36.620 |
It, you know, we always thought of that as a brain problem. 00:53:39.880 |
In some cases it might be, but in many cases, 00:53:42.800 |
it's because they just can't, they seem to have a, 00:53:46.100 |
it's like turning the volume up on every sense. 00:53:52.640 |
and it's hard for their parents and so forth. 00:53:56.900 |
because the way I think about trying to build up resilience, 00:54:01.900 |
you know, physically or mentally or otherwise is one of, 00:54:08.120 |
That's not a real scientific term and I acknowledge that. 00:54:13.300 |
which is that, you know, we always hear about resilience. 00:54:15.640 |
It makes it sound like, oh, you know, under stress 00:54:17.680 |
where everything's coming at you, you're gonna stay calm. 00:54:22.400 |
the limbic system wants to pull you in some direction. 00:54:26.500 |
Typically in the direction of reflexive behavior. 00:54:29.280 |
And the prefrontal cortex through top-down mechanisms 00:54:36.800 |
of the coffee cups behind me or I'm gonna keep focusing. 00:54:42.200 |
So limbic friction is high in that environment. 00:54:47.040 |
It mean that the prefrontal cortex has to work really hard. 00:54:49.760 |
But there's another side to limbic friction too, 00:54:52.360 |
which is when you're very sleepy, there's nothing incoming. 00:54:55.600 |
You can be completely silent and it's hard to engage 00:55:02.460 |
but for the opposite reason, autonomic arousal is too low. 00:55:05.580 |
So they're turning on Netflix in the background 00:55:08.320 |
or looping a song might boost your level of alertness 00:55:25.500 |
I guess one way you could envision it spatially, 00:55:28.620 |
especially if people are listening to this just on audio, 00:55:31.820 |
is I like to think about it kind of like a glass barbell 00:55:40.260 |
And one sphere of attention can be on what's going on 00:55:42.600 |
with you or something else in the room or in my environment. 00:55:54.640 |
Okay, but so imagine that this thing can contort. 00:55:57.180 |
The size of the globes at the end of this barbell 00:56:01.520 |
So let's say I close my eyes and I bring all my experience 00:56:04.360 |
into what's going on through interoception internally. 00:56:08.680 |
Now it's as if I've got two orbs of perception 00:56:14.560 |
and bring both orbs of perception outside me. 00:56:17.220 |
I'm not thinking about my heart rate or my breathing. 00:56:22.980 |
as you kind of use this spatial model is that two things. 00:56:35.120 |
the two ends of the barbell can move around freely. 00:56:40.460 |
the more rigid they're going to be tethered in place. 00:56:43.580 |
And that was designed so that if I have a threat 00:56:45.580 |
in my environment, it's tethered to that threat. 00:56:48.700 |
I'm not going to, if something's coming to attack me, 00:56:51.700 |
"Oh, my breathing cadence is a little bit quick." 00:56:59.120 |
And so my behavior is now actually being driven 00:57:01.680 |
by something external, even though I think it's internal. 00:57:05.960 |
because I'm a neuroscientist, I'm not a theorist. 00:57:11.240 |
of how the brain works, I mean, brain works, excuse me, 00:57:13.680 |
there are only really three things that neurons do. 00:57:17.100 |
they're motor neurons, or they're modulating things. 00:57:24.060 |
that we have now, 2020, tell us that we've got interoception 00:57:29.880 |
They're strongly modulated by levels of autonomic arousal. 00:57:32.520 |
And that if we want to form the optimal relationship 00:57:39.840 |
whether or not it's sleep, an impending threat, or coding, 00:57:43.720 |
we need to adjust our internal space-time relationship 00:57:49.440 |
And I realize I'm repeating what I said earlier, 00:57:51.720 |
but we can actually assign circuitry to this stuff. 00:57:54.240 |
It mostly has to do with how much limbic friction there is, 00:58:08.460 |
and I'm just going to be thinking about that pain. 00:58:13.280 |
is top-down control, meaning anything in our environment 00:58:17.960 |
that has a lot of salience will tend to bring us 00:58:22.880 |
And again, I don't want to litter the conversation 00:58:33.920 |
When I wake up, am I mostly in a mode of interoception 00:58:38.720 |
When I work well, is that, what does working well look like 00:58:50.800 |
And to sort of watch this process throughout the day. 00:58:57.840 |
and it'd be nice to try to get a little more color to it, 00:59:12.080 |
- Interoception would be an awareness of anything 00:59:16.000 |
that's within the confines or on the surface of my skin 00:59:21.840 |
- Physiologically, like within the boundaries of my skin 00:59:26.720 |
Exteroception would be perception of anything 00:59:38.320 |
although, and this can change dramatically actually, 00:59:49.520 |
So there are beautiful experiments done by Gregg Recanzone 00:59:53.640 |
looking at how auditory and visual cues are matched 00:59:58.600 |
and you can, this will become obvious as I say it, 01:00:01.420 |
but obviously the ventriloquist doesn't throw their voice. 01:00:07.820 |
and you think the sound is coming from that location. 01:00:12.720 |
where they suddenly introduce an auditory-visual mismatch 01:00:12.720 |
as if the sound arrived from the corner of the room 01:00:23.880 |
and hit you, like physically, and people will recoil. 01:00:28.140 |
And so sounds aren't getting thrown across the room, 01:00:31.220 |
they're still coming from this defined location 01:00:38.720 |
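One standard way this "visual capture" of sound is modeled is reliability-weighted cue combination, where each sense contributes in proportion to its precision. The sketch below is a generic textbook model of that idea, not something stated in the conversation; the function name and the noise values are my own illustrative choices.

```python
import math

def fuse_cues(x_vis, sigma_vis, x_aud, sigma_aud):
    """Reliability-weighted fusion of a visual and an auditory location
    estimate (maximum-likelihood cue combination). Each cue is weighted
    by its inverse variance, so the more precise sense dominates."""
    w_vis = 1 / sigma_vis ** 2
    w_aud = 1 / sigma_aud ** 2
    fused = (w_vis * x_vis + w_aud * x_aud) / (w_vis + w_aud)
    fused_sigma = math.sqrt(1 / (w_vis + w_aud))
    return fused, fused_sigma

# Vision localizes precisely (sigma = 1 deg), audition coarsely
# (sigma = 10 deg): the fused percept sits almost exactly at the
# puppet's mouth, which is why the voice seems to come from there.
loc, sigma = fuse_cues(x_vis=0.0, sigma_vis=1.0, x_aud=15.0, sigma_aud=10.0)
```

On these illustrative numbers, the fused location lands within a fraction of a degree of the visual cue even though the sound source is 15 degrees away.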
And again, not to, I don't wanna go down a rabbit hole, 01:00:47.560 |
but everything in the brain is an abstraction, right? 01:00:53.920 |
there are the eyes and ears and nose and skin 01:00:56.040 |
and taste and all that are taking information 01:01:00.440 |
it's taking information from sensors inside the body, 01:01:05.640 |
I've got sensory neurons that innervate my liver, et cetera. 01:01:10.540 |
Taking all that and the brain is abstracting that 01:01:15.540 |
in the same way that if I took a picture of your face 01:01:18.520 |
and I handed it to you and I'd say, that's you, 01:01:24.240 |
I'd be doing a little bit more of what the brain does, 01:01:28.080 |
maybe I could do this 'cause I'm a terrible artist, 01:01:31.440 |
let's say I would make your eyes like water bottles, 01:01:41.040 |
And I'd say, no, but that's my abstraction of you. 01:01:44.280 |
The space time relationship of the neurons that fire, 01:01:47.520 |
that encode your face, have no resemblance to your face. 01:01:53.720 |
I don't know if people have fully internalized that, 01:02:01.400 |
but all neurons can do is fire in space and in time, 01:02:08.840 |
It's not clear the action potential is all or none, 01:02:11.160 |
although neuroscientists don't like to talk about that, 01:02:13.320 |
even though it's been published in "Nature" a couple times. 01:02:23.600 |
- Well, I mean, there's a lot of fascinating stuff 01:02:36.400 |
every time we at all try to think about the difference 01:02:42.320 |
but can we maybe linger a little bit on this, 01:02:51.120 |
and it forms abstractions that are fascinating 01:02:58.080 |
And I think it, just like when you're programming, 01:03:02.980 |
it's awe-inspiring to think that underneath it all, 01:03:25.000 |
or fundamental to you about the circuitry of the brain 01:03:28.140 |
that allows for the magic that's in our mind to emerge? 01:03:36.080 |
I mean, maybe even focusing on the vision system. 01:03:38.680 |
Is there something specific about the structure 01:03:45.400 |
that allows for the complexity of the vision system 01:03:49.920 |
to emerge or is it all just a complete chaotic mess 01:03:56.160 |
that we don't understand if we're talking about vision. 01:03:59.400 |
And that's not just 'cause I'm a vision scientist. 01:04:02.840 |
- Well, because in the beauty of the visual system, 01:04:07.640 |
won the Nobel Prize was because they were brilliant 01:04:15.840 |
these kinds of questions and other systems are hard 01:04:22.280 |
how fine the gratings are, thick gratings, thin gratings. 01:04:31.360 |
There's so many things that you can do in a controlled way. 01:04:34.800 |
Whereas if we were talking about cognitive encoding, 01:04:37.200 |
like encoding the space of concepts or something. 01:04:45.840 |
am drawn to the big questions in neuroscience. 01:04:49.000 |
But I confess in part because of some good advice 01:04:55.240 |
and in part because I'm not perhaps smart enough 01:05:02.320 |
I also like to address things that are tractable. 01:05:09.280 |
what we can stand to make some ground on at a given time. 01:05:13.040 |
- They construct brilliant controlled experiments 01:05:17.160 |
just to study, to really literally answer questions about. 01:05:24.720 |
And I think most people don't wanna hear what I have to say, 01:05:39.100 |
Can experiments in neuroscience be constructed 01:05:43.360 |
to shed any kind of light on these questions? 01:05:51.360 |
one of the most beautiful things about human beings. 01:05:56.080 |
computer vision has some of the most exciting applications 01:06:13.380 |
We're mostly visual animals to navigate, survive. 01:06:16.280 |
Humans mainly rely on vision, not smell or something else, 01:06:28.000 |
and then we're moving to higher level concepts. 01:06:32.720 |
can be summarized in a few relatively succinct statements, 01:06:46.080 |
of neuron structure at the back of your eyes, 01:06:51.600 |
And sometimes people think I'm kind of wriggling 01:06:58.160 |
It's a forebrain structure that in the first trimester, 01:06:58.160 |
which is part of your central nervous system, 01:07:07.420 |
was squeezed out into what's called the embryonic eye cups. 01:07:19.160 |
is the only window into the world for a mammal, 01:07:28.080 |
so that light can make it down into the pineal directly 01:07:36.280 |
So three layers of neurons that are a piece of your brain, 01:07:40.240 |
And the optic nerve connects to the rest of the brain. 01:07:51.720 |
informs every cell in your body about time of day 01:07:53.760 |
and make sure that all sorts of good stuff happens 01:07:55.600 |
if you're getting light in your eyes at the right times. 01:08:10.800 |
Consistent schedule, try and keep a consistent schedule. 01:08:13.200 |
When you're young, it's easy to go off schedule and recover. 01:08:18.320 |
but you see everything from outcomes in cancer patients 01:08:21.820 |
to diabetes improves when people are getting light 01:08:42.040 |
through specialized type of neuron in the retina 01:08:46.880 |
discovered by David Berson at Brown University. 01:08:46.880 |
Even if you're looking at the sun, it doesn't matter. 01:09:00.400 |
And it's saying, oh, even though it's a cloudy day, 01:09:09.520 |
That's these melanopsin cells signaling the circadian clock. 01:09:12.840 |
There are a bunch of other neurons in the eye 01:09:17.280 |
And they mainly signal the presence of things 01:09:19.880 |
that are lighter than background or darker than background. 01:09:22.440 |
So a black object would be darker than background, 01:09:27.040 |
And that all come, it's mainly, it's looking at pixels. 01:09:40.800 |
Little circles of red light versus little circles 01:09:47.300 |
It's like red, green, blue, and circles brighter or dimmer 01:09:57.500 |
And when we say information, we can be very precise. 01:10:02.680 |
I mean spikes, neural action potentials in space and time, 01:10:15.880 |
is converted into a language that's very simple. 01:10:21.000 |
And those syllables are being shouted down the optic nerve, 01:10:25.980 |
like Morse code, beep, beep, beep, beep, beep, beep. 01:10:30.080 |
essentially responds in the same way that the retina does. 01:10:38.440 |
That thing was moving faster than everything else, 01:10:47.660 |
Or that signal is much redder than it is green. 01:10:58.740 |
The information just doesn't get up into your cortex. 01:11:00.560 |
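The two steps described here, the retina rate-coding local contrast "pixels" and the thalamus passing along only what stands out from the noise, can be caricatured in a few lines. This is a deliberately toy sketch with made-up numbers, not a fitted model of any real cell.

```python
def retina_encode(contrast):
    """Toy rate code for a retinal ganglion cell: spike count grows with
    local contrast (how much lighter or darker a spot is than background).
    Baseline and gain are illustrative numbers only."""
    rate_hz = 5 + 40 * abs(contrast)     # baseline + contrast-driven firing
    return round(rate_hz * 0.1)          # expected spikes in a 100 ms window

def thalamic_gate(spike_count, noise_floor=2):
    """The thalamus as a signal-to-noise filter: only responses that
    clearly exceed baseline chatter get passed up to cortex;
    the rest never reach perception."""
    return spike_count if spike_count > noise_floor else 0

# A high-contrast edge makes it through; a faint one is filtered out.
strong = thalamic_gate(retina_encode(0.9))
weak = thalamic_gate(retina_encode(0.05))
```

The point of the sketch is just the architecture: a simple "syllable" (a spike count) is generated at the eye, and a threshold downstream decides whether cortex ever hears it.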
And then in cortex, of course, is where perceptions happen. 01:11:12.200 |
So there's a neuron that responds to this angle of my hand 01:11:17.520 |
This is the defining work of Hubel and Wiesel's Nobel. 01:11:20.680 |
And it's a very systematic map of orientation, 01:11:23.540 |
line orientation, direction of movement, and so forth. 01:11:29.320 |
And that's how the visual system is organized 01:11:35.980 |
it's hierarchical because you don't build up that line 01:11:38.120 |
by suddenly having a neuron that responds to lines 01:11:49.080 |
And then that neuron responds to vertical lines. 01:11:52.840 |
There's no abstraction at that point, in fact. 01:12:00.760 |
that I would see a representation of that black line 01:12:40.420 |
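The orientation selectivity from Hubel and Wiesel's work is often summarized with a bell-shaped tuning curve over bar angle. The sketch below is one conventional form of that curve (a von Mises-style tuning function); the preferred angle, width, and peak rate are assumed values for illustration, not measurements.

```python
import math

def orientation_response(theta_deg, preferred_deg=90.0, kappa=2.0, r_max=30.0):
    """Toy V1 simple-cell tuning: firing rate peaks at the preferred line
    orientation and falls off on either side. The factor of 2 makes the
    curve periodic over 180 degrees, since a bar at 0 and at 180 degrees
    is the same stimulus. Parameters are illustrative only."""
    d = math.radians(theta_deg - preferred_deg)
    return r_max * math.exp(kappa * (math.cos(2 * d) - 1))

# A "vertical line" neuron fires hardest for vertical (90 degree) bars
# and barely at all for horizontal (0 degree) ones.
```

Neighboring neurons with slightly shifted `preferred_deg` values give the systematic map of orientation mentioned above.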
I'm sure if you saw Joe, 'cause you know him well, 01:12:43.160 |
from across the room and you just saw his profile, 01:12:52.800 |
but he's represented in some abstract way by a neuron 01:12:56.640 |
that actually would be called the Joe Rogan neuron 01:13:01.160 |
I might not recognize him if he was upside down 01:13:10.000 |
because early on she was challenged by people that said, 01:13:14.240 |
"There are neurons that they only respond to space and time, 01:13:19.120 |
"moving in particular directions and orientations." 01:13:23.880 |
They use these stimuli called greeble stimuli, 01:13:23.880 |
which any computer programmer would appreciate, 01:13:28.720 |
which kind of morphs a face into something gradually 01:13:31.440 |
that eventually just looks like this alien thing 01:13:43.040 |
and forgive me, Nancy, and for those of the greeble people, 01:13:43.040 |
"I think you know what I'm trying to do here." 01:13:54.620 |
it's very concrete up until about visual area four, 01:14:02.360 |
And so the stimuli become more and more elaborate, 01:14:06.120 |
but at some point you depart that concrete representation 01:14:10.240 |
and you start getting abstract representations 01:14:12.080 |
that can't be explained by simple point-to-point wiring. 01:14:22.600 |
maps to the auditory system where you're encoding, 01:14:28.480 |
So this is gonna sound weird to do, but you know, 01:14:34.680 |
But at some point you get into motifs of music 01:14:44.600 |
And if you start thinking about concepts of creativity 01:14:48.280 |
and love and memory, like what is the map of memory space? 01:14:52.840 |
Well, your memories are very different than mine, 01:15:00.400 |
or at the early stages of emotional processing 01:15:03.040 |
or at the earlier stages of creative processing 01:15:17.120 |
'cause I was just mainly talking about neocortex, 01:15:20.320 |
the six layered structure on the outside of the brain 01:15:27.280 |
is that subcortical structures are a lot more like machines. 01:15:35.940 |
that controls heart rate and breathing and receptive fields, 01:15:39.640 |
neurons that respond to things like temperature 01:15:46.340 |
I came into neuroscience from a more of a perspective 01:15:58.160 |
and some molecular biology and about circuitry 01:16:00.620 |
is that one of the most beautiful experiences 01:16:20.160 |
coming off of that electrode into an audio signal 01:16:33.260 |
And in the cortex, eventually you find a stimulus 01:16:36.000 |
that gets the neuron to spike and fire action potentials 01:16:48.020 |
When you drop electrodes deeper into the thalamus 01:16:54.640 |
or into the brainstem areas that control breathing, 01:17:00.960 |
This could be like a grungy old tungsten electrode, 01:17:07.200 |
as long as it's got a little bit of insulation on it. 01:17:15.040 |
and you walk in front of that animal or person, 01:17:23.000 |
And you put your hand in front of the eye again, 01:17:33.000 |
So whereas before, it's a question of how much information 01:17:48.240 |
And this is why I know we have some common friends 01:18:01.200 |
I'm just a huge fan of the people and the mission. 01:18:21.300 |
But when you are in the subcortical areas of the brain, 01:18:24.780 |
a stimulating electrode can evoke an entire behavior 01:18:29.180 |
And so the brain, if we're gonna have a discussion 01:18:55.420 |
That's where you could potentially cure Parkinson's 01:18:59.600 |
Because we know that it gates motor activation patterns 01:19:04.440 |
So I think for those that are interested in neuroscience, 01:19:09.420 |
is this a circuit that abstracts the sensory information? 01:19:13.000 |
Or is it just one that builds up hierarchical models 01:19:18.300 |
And there's a huge chasm in neuroscience right now, 01:19:27.500 |
has captured an amazing opportunity, which was, 01:19:29.700 |
okay, well, while all you academic research labs 01:19:45.100 |
Let's restore motion to the Parkinsonian patient. 01:19:48.660 |
Academic labs want to do that too, of course. 01:20:01.140 |
and I admit I've mixed in a lot of opinion there. 01:20:07.300 |
digging around in the brain and listening to neurons firing 01:20:11.620 |
I think given it's 2020, we need to ask the right, 01:20:27.580 |
But I think the questions about consciousness 01:20:41.060 |
the power of the brain arising from the circuitry 01:20:46.060 |
that forms abstractions or the power of the brain 01:20:54.380 |
that's just doing very brute force, dumb things 01:21:08.180 |
And here I'm poaching badly from someone I've never met, 01:21:15.260 |
I think Elon Musk said, you know, basically the brain is a, 01:21:18.740 |
well, you say a monkey brain with a supercomputer on top. 01:21:21.100 |
And I thought that's actually probably the best description 01:21:24.940 |
because it captures a lot of important features 01:21:30.780 |
when we're making plans, we're using the prefrontal cortex 01:21:33.280 |
and we're executive function and all this kind of stuff. 01:21:44.740 |
It's just that it's been hijacked by the limbic system 01:21:49.540 |
It's really not fair to monkeys though, Elon, 01:21:53.180 |
They just don't make plans as sophisticated as us. 01:21:56.620 |
but I've also spent a lot of time with humans. 01:22:00.380 |
there's a lot of value to focusing on the monkey brain 01:22:07.700 |
to place a chip anywhere I wanted in the brain today 01:22:13.340 |
I'm not sure I would put that chip in neocortex, 01:22:21.180 |
And especially if I wanted to make a mass production tool, 01:22:26.500 |
to a lot of people, because it's quite possible 01:22:28.640 |
that your abstractions are different enough than mine 01:22:30.980 |
that I wouldn't know what patterns of firing to induce. 01:22:34.400 |
But if I want, let's say I want to increase my level 01:22:37.340 |
of focus and creativity, well, then I would love 01:22:52.860 |
I'm going to just, I'm going to turn down the limbic friction 01:22:56.220 |
and, or ramp up prefrontal cortex's activation. 01:23:00.300 |
So there's a lot of stuff that can happen in the thalamus 01:23:06.860 |
around the thalamus and allow more creative thinking 01:23:12.400 |
those would be the experiments I'd want to do. 01:23:14.040 |
So they're in the subcortical, quote unquote, monkey brain, 01:23:17.400 |
but you could then look at what sorts of abstract thoughts 01:23:20.960 |
and behaviors would arise from that, rather than, 01:23:24.840 |
and here I'm not pointing the finger at Neuralink at all, 01:23:30.000 |
But I, I'm going to, well, I might lose a few friends, 01:23:34.340 |
And also one of the reasons people spend so much time 01:23:45.380 |
Right now, the two photon and one photon microscopy methods 01:23:51.620 |
still don't allow you to image down really deep 01:23:54.020 |
unless you're jamming prisms in there and endoscopes. 01:24:03.500 |
And so you much easier to look at the waves up on top. 01:24:08.540 |
a lot of the reasons why there's so many recordings 01:24:20.100 |
mainly of engineers and chemists and physicists, 01:24:23.820 |
this revolution to neuroscience in the last 10 years or so. 01:24:30.440 |
that's why you see so many reports on layer two, three. 01:24:37.080 |
but is that as long as there's no clear right answer, 01:24:40.900 |
it becomes a little easier to do creative work 01:24:43.940 |
in a structure where no one really knows how it works. 01:24:49.980 |
If you're gonna work in the thalamus or the pulvinar 01:24:54.460 |
so these structures that have been known about 01:25:04.860 |
And whereas in cortex, no one knows how the thing works. 01:25:10.300 |
And so there's a lot more room for discovery. 01:25:19.260 |
But I think with the tools that are available nowadays 01:25:27.160 |
monitoring activity, but writing to the brain, 01:25:37.700 |
for which we already know what scripts they write. 01:25:43.140 |
The fact that they act like machines makes them predictable. 01:25:51.700 |
of writing to the brain is there's this idea. 01:25:56.060 |
I'm mainly pointing at the neocortical jockeys out there 01:26:04.540 |
is going to give rise to something interesting. 01:26:07.700 |
I should call out one experiment or two experiments 01:26:14.500 |
Done important work in memory and immunology, of course, 01:26:18.460 |
as well as Mark Mayford's lab at UC San Diego. 01:26:23.560 |
a bunch of neurons while an animal learned something. 01:26:34.320 |
It's like, okay, you monitor the neurons in your brain. 01:26:42.380 |
you know the keys on the piano that were played 01:26:43.900 |
that gave rise to the song, which was the behavior. 01:26:46.660 |
And then you go back and you reactivate those neurons 01:26:51.500 |
like slamming on all the keys once on the piano, 01:27:13.180 |
the space part may matter more than the time part. 01:27:17.620 |
So, you know, rate codes and space time codes, 01:27:29.080 |
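The "piano keys" experiment described here, tag the neurons active during learning, then drive them all at once and see the behavior come back, has a very simple logical skeleton. The sketch below is my own schematic of that logic, with made-up activity values; it is not code from those labs.

```python
def record_ensemble(activity, threshold=0.5):
    """Return the indices of neurons active during learning --
    the 'keys that were played' while the song was learned."""
    return {i for i, a in enumerate(activity) if a > threshold}

def reactivate(ensemble, n_neurons):
    """Drive all tagged neurons at once -- 'slamming the keys' --
    ignoring the original temporal order. In the experiments described,
    this coarse replay can still evoke the learned behavior, suggesting
    the spatial pattern can matter more than the precise timing."""
    return [1.0 if i in ensemble else 0.0 for i in range(n_neurons)]

# Which neurons fired during learning, and a simultaneous replay of them:
ensemble = record_ensemble([0.9, 0.1, 0.7, 0.0])
replay = reactivate(ensemble, n_neurons=4)
```

The surprise in the real experiments is exactly what the second function throws away: the replay preserves *which* neurons fire, not *when*, and the behavior can still emerge.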
- You're saying some of the magic is in the early stages 01:27:38.640 |
you know the neuron then codes that stimulus. 01:27:43.560 |
that don't think about sensory transformations, 01:27:48.440 |
And then I look at how many times the neuron fires 01:27:53.000 |
And then I could show the red circle a bunch of times, 01:27:59.580 |
You've converted red circle into like three action 01:28:09.180 |
You know the transformation and you march up the, 01:28:11.540 |
it's called the neuro axis as you go from the periphery 01:28:16.720 |
And we know that, and I know Lisa Feldman Barrett, 01:28:24.560 |
excuse me, Lisa, that talked a lot about this, 01:28:27.560 |
that, you know, birds can do sophisticated things 01:28:30.480 |
But humans, there's a strong, what we call cephalization. 01:28:34.000 |
A lot of the processing has moved up into the cortex 01:28:40.000 |
And so as long as you know the transformations, 01:28:45.640 |
or add machines to the brain that exactly mimic 01:28:53.240 |
and turn them into internal firing of neurons. 01:29:00.900 |
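The idea of a known, invertible transformation, "red circle in, three spikes out," so that a machine could write the pattern directly and bypass the eye, can be shown with a trivially small codebook. The stimuli and spike patterns below are entirely made up to illustrate the concept.

```python
# Hypothetical stimulus -> spike-pattern codebook. If the forward
# transformation is known and one-to-one, a device could inject the
# pattern directly, skipping the sensory organ altogether.
CODEBOOK = {
    "red_circle":   (1, 0, 1, 0, 1),   # e.g. 3 spikes across 5 time bins
    "green_circle": (1, 0, 0, 0, 0),
    "red_square":   (1, 1, 1, 1, 0),
}

def encode(stimulus):
    """Forward transformation: stimulus -> spike pattern."""
    return CODEBOOK[stimulus]

def decode(pattern):
    """Inverse transformation: spike pattern -> inferred stimulus.
    Works only because the toy codebook is one-to-one."""
    inverse = {v: k for k, v in CODEBOOK.items()}
    return inverse[pattern]
```

The caveat the conversation raises still applies: this is easy for early sensory stages, where the transformations are concrete, and hard once you reach the abstractions of higher cortex.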
you saying this means that humans aren't that special. 01:29:08.200 |
the leap to intelligence is not that special. 01:29:14.680 |
isn't where most of the magic happens of intelligence, 01:29:18.520 |
which gives me hope that maybe if that's true, 01:29:27.080 |
I hope there are other forms of intelligence. 01:29:31.440 |
I mean, I think what humans are really good at, 01:29:35.080 |
and here I want to be clear that this is not a formal model, 01:29:52.000 |
Like I can read a book today about the history of medicine. 01:29:58.560 |
and if I want, I can inject it into my plans for the future. 01:30:09.480 |
are all hiding little like notebooks everywhere 01:30:20.280 |
they orient relative to the honeycomb and they waggle. 01:30:23.880 |
There's a guy down in Australia named Srinivasan 01:30:25.640 |
who studied this, and it's really interesting. 01:30:28.120 |
No one really understands it, except he understands it best. 01:30:34.440 |
relative to the orientation of the honeycomb, 01:30:36.160 |
and then all the other bees see that, it's visual, 01:30:39.960 |
and they go out and they know the exact coordinate system 01:30:47.240 |
And he's done it where they isolate the bees, 01:30:49.040 |
he's changed the visual flight environment, all this stuff. 01:30:56.000 |
but it doesn't extend over very long periods of time. 01:30:59.920 |
The same way that you and I can both read a book 01:31:03.720 |
and then we could converge on a set of ideas later. 01:31:06.320 |
And in fairness, 'cause she was the one that said it, 01:31:09.720 |
and I didn't, and I hadn't even thought of it. 01:31:17.920 |
which is that it never really occurred to me, 01:31:20.160 |
and I was sort of embarrassed that it hadn't, 01:31:38.480 |
but we just don't understand what the receptive fields are 01:31:43.880 |
- And how they're communicated between humans, 01:31:47.420 |
because we seem to be able to encode those ideas 01:31:57.000 |
that sensory information put into this concept blob 01:32:03.600 |
- Yeah, your abstractions are different than mine. 01:32:09.640 |
is a beautiful example of where the abstractions 01:32:28.480 |
that yours look and respond exactly the same way 01:32:34.200 |
But once you get beyond there, it gets tricky. 01:32:35.960 |
And so when you say something or I say something, 01:32:39.120 |
and somebody gets upset about it or even happy about it, 01:32:42.640 |
their concept of that might be quite a bit different. 01:32:54.760 |
and the augmentation of the more primitive circuitry. 01:33:04.840 |
- I think we should go deeper into the brain. 01:33:11.600 |
and also a clinician, a great guy, brilliant. 01:33:11.600 |
They really know how to do things with style. 01:33:24.640 |
And they've upset a lot of people, but that's good too. 01:33:29.920 |
I know Matt, he actually came up through my lab at Stanford, 01:33:38.680 |
to collect the VR that we use in our fear stuff. 01:33:47.240 |
The problem is that damn vasculature, all that blood supply. 01:33:50.640 |
It's not trivial to get through and down into the brain 01:33:55.320 |
without damaging the vasculature in the neocortex, 01:34:00.760 |
and closer to some of the main arterial sources, 01:34:09.740 |
- Maybe it'd be nice to educate, I'm showing my ignorance. 01:34:17.800 |
So I didn't quite realize, 'cause you keep saying deep. 01:34:32.360 |
of course you've got your deep brain structures 01:34:34.580 |
that are involved in breathing and heart rate 01:34:37.300 |
And then on top of that, this is the model of the brain 01:34:44.620 |
And then on top in mammals, and then on top of that, 01:34:50.700 |
whether or not you're gonna listen to something more, 01:35:02.000 |
And that is where you get a lot of this abstraction stuff. 01:35:05.700 |
And now not all cortical areas are doing abstraction. 01:35:07.840 |
Some like visual area one, auditory area one, 01:35:17.180 |
that when you start hearing names like intraparietal cortex, 01:35:17.180 |
and you know, when you start hearing multiple names 01:35:22.140 |
in the same, then you're talking about higher order areas. 01:35:35.520 |
who runs the Center for Neural Science at NYU. 01:35:35.520 |
And if I showed you a bunch of dots all moving up, 01:35:55.620 |
and some of the other people in that lab did way back when, 01:36:01.380 |
Somewhere in MT, there's some neurons that respond, 01:36:05.940 |
And then what they did is they started varying 01:36:08.700 |
So they made it so only 50% of the dots moved up 01:36:14.300 |
And eventually it's random and that neuron stops firing 01:36:16.700 |
'cause it's just kind of dots moving everywhere. 01:36:19.500 |
And there's a systematic map so that other neurons 01:36:30.460 |
You could lesion MT, animals lose the ability 01:36:41.900 |
is that they lowered a stimulating electrode into MT, 01:36:45.700 |
found a neuron that responds to when dots go up. 01:36:52.540 |
And sure enough, the animal doesn't recognize 01:37:00.280 |
They stimulate the neuron that responds to things moving up. 01:37:05.580 |
And the animal responds, 'cause it can't speak, 01:37:14.420 |
the dots are moving down in reality on the computer screen. 01:37:25.980 |
Which tells you that your perception of external reality 01:37:38.900 |
Your perception of the outside world depends entirely 01:37:42.660 |
on the activation patterns of neurons in the brain. 01:37:54.500 |
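The MT experiment described here, dots at varying coherence, a neuron whose firing tracks its preferred direction, and microstimulation that overrides the actual stimulus, is commonly summarized by a roughly linear coherence-response relationship. The sketch below is a toy version of that summary; the baseline, gain, and the clean linearity are assumptions for illustration, not recorded data.

```python
def mt_response(coherence, direction, preferred="up",
                baseline=10.0, gain=40.0):
    """Toy MT neuron tuned to upward motion: firing rises with the
    fraction of dots moving coherently in the preferred direction,
    and drops below baseline for the opposite direction (a rough
    caricature of the linear coherence dependence reported for MT).
    Illustrative parameters only."""
    if direction == preferred:
        return baseline + gain * coherence
    return max(0.0, baseline - gain * coherence)

# 0% coherence (random dots): the neuron sits near baseline either way.
# 100% upward dots: strong response. Microstimulating this neuron
# biases the report toward "up" even while the screen shows "down".
```

The perceptual conclusion in the conversation follows directly: if driving this one node produces the "up" report regardless of the screen, the percept lives in the firing pattern, not in the stimulus.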
of course there's a neuron that triggers that, 01:38:00.820 |
B, you're way up in this higher order cortical areas. 01:38:10.740 |
but what this means is that we are constructing our reality 01:38:15.100 |
with this space time firing the zeros and ones. 01:38:22.100 |
And the animal or person can be absolutely convinced 01:38:27.660 |
- Are you familiar with the work of Donald Hoffman? 01:38:49.940 |
that we have no idea what physical reality is. 01:39:03.340 |
'cause you're kind of implying there's a gap. 01:39:16.140 |
that's sufficient for our human collaboration or whatever, 01:39:28.860 |
like that you can prove that this is when you're a science 01:39:46.900 |
in constructing realities that allow our survival. 01:39:50.540 |
But it's completely detached from physical reality. 01:39:55.420 |
- We're missing like most of it, if not all of it. 01:40:03.640 |
because I just saw the Oliver Sacks documentary. 01:40:06.280 |
There's a new documentary out about his life. 01:40:16.980 |
to see the world through the sensory apparati of a bat. 01:40:23.700 |
that were locked into these horrible syndromes 01:40:33.420 |
And as I was listening to him talk about this, 01:40:37.020 |
well, what, you know, like there are these mantis shrimps 01:40:49.220 |
that can sense other things in the environment 01:40:55.180 |
the same way Sacks talked about craving that experience, 01:41:00.580 |
for what you're saying, which is that it could be that most, 01:41:13.140 |
we all agree on enough of the same neural fabrications 01:41:16.100 |
in the same time and place that we're able to function. 01:41:18.660 |
- Not only that, but we agree with the things 01:41:26.060 |
Meaning like that it's not just us humans, you know. 01:41:33.020 |
So like, now I think it's a really nice thought experiment. 01:41:40.740 |
I think because Donald really frames it in a scientific, 01:42:09.920 |
about, 'cause we are operating in this abstraction space, 01:42:14.920 |
and the sensory spaces might be something very different. 01:42:25.140 |
if you start to go into the realm of Neuralink 01:42:29.880 |
that you've been talking about with dream states 01:42:33.820 |
which part of the, which layer can we control 01:42:38.380 |
and maybe look into a different slice of reality? 01:42:50.540 |
This is why, there's wonderful theoretical neuroscience 01:42:57.860 |
but that's why experimental science is so wonderful. 01:43:00.500 |
You can go into the laboratory and poke around in there 01:43:03.380 |
and be a brain explorer and listen to and write to neurons. 01:43:13.740 |
I think when you were saying this thing about reality 01:43:20.800 |
Like when I have an older sister, she's very sane. 01:43:24.760 |
But when she was a kid, she had an imaginary friend 01:43:30.220 |
and she would play with this imaginary friend. 01:43:32.500 |
And it had, there was this whole, there was a consistency. 01:43:42.540 |
And then one day she announced that Larry had died. 01:43:48.540 |
And I always wonder what that neurodevelopmental event was 01:44:00.700 |
But it's also, there was something kind of sad to it. 01:44:08.320 |
But my dad told me that there was a kind of a sadness 01:44:14.020 |
And so we kind of wonder, as you're telling me this, 01:44:19.220 |
we try and create as much reality for children as we can 01:44:22.260 |
so that they can make predictions and feel safe. 01:44:25.540 |
is a lot of what keeps our autonomic arousal in check. 01:44:33.540 |
but unfortunately autonomic arousal yanks us down under 01:44:44.020 |
I was a little worried we'd get into discussions 01:44:55.380 |
What experience are we gonna put someone through? 01:45:00.500 |
we can finally have discussions about this stuff 01:45:03.740 |
and look, kind of peek around the corner and say, 01:45:14.900 |
and we just have to be open to interpretation. 01:45:21.820 |
I mean, you're plugged into the neuroscience community. 01:45:32.060 |
But now more and more people are talking about consciousness. 01:45:42.860 |
but it feels like a legitimate domain of inquiry 01:45:53.460 |
- So I have fortunately three short answers to this. 01:46:07.420 |
there are two things you never wanna say to a scientist. 01:46:11.220 |
And the second one is, take as much time as you need. 01:46:35.300 |
there were some very dynamic, very impressive speakers 01:46:39.540 |
who were very smart in the field of neuroscience 01:46:43.700 |
who thought hard about the consciousness problem 01:46:49.620 |
but overlooked the fact that the technology wasn't there. 01:46:54.620 |
So I admire them for falling in love with the problem, 01:46:59.180 |
but they gleaned tremendous taxpayer resources, 01:47:11.460 |
and thought the claustrum was involved in consciousness, 01:47:11.460 |
It's this obscure structure that no one's really studied. 01:47:18.980 |
So I think Francis was brilliant and wonderful, 01:47:31.260 |
or after a couple of glasses of wine or whatever. 01:47:39.460 |
the issue is it's not operationally defined, right? 01:47:54.020 |
if we're talking about motivation, for instance, 01:47:55.900 |
they know they need to put operational definitions on that 01:48:03.940 |
And this was a problem for attention when I was coming up. 01:48:12.660 |
Is it, and finally people were like, you know what? 01:48:21.660 |
- Right, they couldn't even agree on attention. 01:48:23.100 |
So I was coming up as a young graduate student. 01:48:27.860 |
and I'm definitely not gonna work on consciousness. 01:48:30.100 |
And I wanted something that I could solve or figure out. 01:48:33.500 |
I wanna be able to see the circuit or the neurons. 01:48:39.060 |
And then I wanna do gain of function and loss of function. 01:48:41.460 |
Take it away, see something change, put it back, 01:48:46.020 |
And that takes you down into the depths of some stuff 01:48:51.980 |
But, you know, I'll borrow from something in the military 01:48:58.060 |
and they have beautiful language around things 01:49:05.740 |
And it's not an issue of picking the 100 meter target 01:49:10.420 |
If you don't take down the three meter targets 01:49:15.220 |
So that's, I think scientists could pay to, you know, 01:49:19.660 |
adopt a more kind of military thinking in that sense. 01:49:26.180 |
is that just because somebody conceived of something 01:49:35.060 |
So, but this isn't just true of the consciousness issue. 01:49:51.860 |
And biotech companies fold, and everyone in the lab pivots 01:49:55.100 |
and starts doing something different with that molecule. 01:50:01.140 |
we have this thing called anonymous peer review. 01:50:02.940 |
You can't afford to piss off anybody too much 01:50:12.480 |
And I've watched this and I don't think it's ego driven. 01:50:15.320 |
I think it's that people fall in love with an idea. 01:50:23.080 |
The beauty of what Neuralink and Elon and team, 01:50:31.140 |
what gives me great confidence in their mission, 01:50:50.640 |
include people that are very kind of abstract, 01:50:57.760 |
There are people like Matt, who's a neurosurgeon. 01:51:09.380 |
to even get through the gate, and he's exceptional. 01:51:18.460 |
who is very concrete, studied the vasculature in the eye 01:51:21.180 |
and how it maps to the vasculature in cortex. 01:51:26.860 |
you're gonna have people that are high-level thinkers, 01:51:32.620 |
it no longer looks like an academic laboratory 01:51:40.860 |
And again, I'm not here, they don't, you know, 01:51:59.700 |
who basically will adapt to solve the problem. 01:52:03.380 |
- I like the analogy of the three-meter target 01:52:09.700 |
many of them are some of the best people in the world 01:52:20.620 |
smoke some weed and look back and look at the stars. 01:52:23.980 |
so both on Elon and because I think like this, 01:52:29.720 |
I think it's really important to think about the hundred-meter 01:52:33.300 |
and the hundred-meter is not even a hundred-meter, 01:52:36.260 |
but like the stuff behind the hill that's too far away, 01:52:56.820 |
part of the business I wanna build leverages that idea. 01:53:11.700 |
I mean, the reason we can talk about something as abstract 01:53:13.980 |
as face representations in the fusiform face area, 01:53:17.300 |
is because Nancy Kanwisher had the brilliance 01:53:19.900 |
to tie it to the kind of lower level statistics 01:53:26.020 |
It wasn't 'cause she was like, "Oh, I bet it's there." 01:53:29.900 |
So people like her understand how to bridge that gap 01:53:41.780 |
- This is what, but I want people to sit in the room 01:53:48.340 |
with woo-woo like consciousness, like high-level stuff, 01:53:57.180 |
and simplify it into something that's concrete. 01:53:59.560 |
Because too many people are just uncomfortable 01:54:10.600 |
But the reality is it's easy to avoid that room altogether, 01:54:24.740 |
and then they drew a lot of funding and then it crashed 01:54:27.180 |
because they really didn't do anything with it. 01:54:29.340 |
And it was a lot of force of personality and so on, 01:54:31.560 |
but that doesn't mean the topic of the Turing test 01:54:35.600 |
and intelligence isn't something we should sit on 01:54:43.140 |
Turing actually attempted this with the Turing test. 01:54:48.920 |
It doesn't mean that we shouldn't linger on it 01:55:15.420 |
It's to understand consciousness and intelligence 01:55:22.320 |
or just all the possible biggest questions of our universe. 01:55:27.780 |
- Absolutely, and I think what I really appreciate 01:55:35.400 |
a low level synapse, that's like a reflex in musculature 01:55:41.220 |
can benefit from looking at those who prefer three, 01:55:51.680 |
of being in a conversation where there are real answers, 01:55:54.600 |
where the zeros and ones are known, or whatever 01:55:57.920 |
the equivalent of zeros and ones is in the nervous system. 01:56:00.560 |
And also, as you said, for the people that are very much 01:56:03.640 |
like, oh, I can only trust what I can see and touch, 01:56:08.200 |
into the discomfort of the high level conversation 01:56:15.280 |
and conceptualization of things at multiple levels. 01:56:19.480 |
I think one of the, this is, I don't gripe about, 01:56:23.540 |
We've been funded from the start and we've been happy 01:56:26.180 |
in that regard and lucky, and we're grateful for that. 01:56:30.200 |
But I think one of the challenges of research 01:56:33.840 |
being so expensive is that there isn't a lot of time, 01:56:38.360 |
especially nowadays, for people to just convene 01:56:40.600 |
around a topic because there's so much emphasis 01:56:45.920 |
And so there are actually, believe it or not, 01:56:52.240 |
The last 10 years has been this huge influx of tools. 01:56:57.760 |
and probing around and connectomes, it's been wonderful. 01:57:00.720 |
But 10, 20 years ago, when the consciousness stuff 01:57:10.360 |
would go to meetings and actually discuss ideas and models. 01:57:18.120 |
at the school science fair where everyone's got their thing 01:57:26.320 |
I'm grateful that we have so many computer scientists 01:57:34.320 |
Somebody tell me what the difference is someday. 01:57:38.080 |
And psychology and even dare I say philosophy, 01:57:49.080 |
when I started graduate school or as a postdoc, 01:57:51.000 |
you were a neurophysiologist or you were a neuroanatomist. 01:57:53.940 |
Now, it's sort of everybody's invited and that's beautiful. 01:58:03.100 |
happening on it for the treatment of disease. 01:58:14.080 |
but the consciousness thing continues to be a, 01:58:19.920 |
It's like no one really quite knows how to handle it 01:58:42.940 |
like BS narratives about the brain or whatever. 01:58:51.600 |
as long as it helps engineer intelligent systems. 01:59:04.120 |
but I don't have strong quantitative leanings. 01:59:08.520 |
And so I think the next generation coming up, 01:59:11.920 |
a lot of the students at Stanford are really interested 01:59:20.320 |
a lot of the people who were doing work ahead of me, 01:59:22.060 |
I kind of rolled my eyes at some of the stuff 01:59:23.560 |
they were doing, including some of their personalities, 01:59:25.960 |
although I have many great senior colleagues everywhere. 01:59:31.480 |
So nobody knows what it's like to be a young graduate student 01:59:36.960 |
So I know there are a lot of things I don't know. 01:59:41.100 |
And in addition to why I do a lot of public education, 01:59:47.860 |
a big goal of mine is to try and at least pave the way 01:59:50.940 |
so that these really brilliant and forward thinking 01:59:54.400 |
younger scientists can make the biggest possible dent 01:59:57.540 |
and make what will eventually be all us old guys 02:00:01.640 |
I mean, that's what we were all trying to do. 02:00:06.020 |
- So from the highest possible topic of consciousness 02:00:11.020 |
to the lowest level topic of David Goggins. 02:00:19.580 |
- I don't know if it's low level, he's high performance. 02:00:22.540 |
- High performance, but like low, like there's no, 02:00:26.380 |
I don't think David has any time for philosophy. 02:00:34.740 |
to what we were just saying in a meaningful way, 02:00:37.520 |
which is whatever goes on in that abstraction 02:00:41.400 |
part of the brain, he's figured out how to dig down 02:00:59.920 |
what we're talking about is him doing that to himself. 02:01:03.820 |
It's like he's scruffing himself and directing himself 02:01:15.200 |
that that process is not pretty, it doesn't feel good. 02:01:23.320 |
but he's created this rewarding element to it. 02:01:27.020 |
And I think that's what's so, it's so admirable. 02:01:32.920 |
which is regulation of the self at that level. 02:01:36.980 |
- And he practices, I mean, there's a ritual to it. 02:01:40.000 |
There's a, every single day, like no exceptions. 02:01:52.320 |
I mean, I just, I mean, I admire all aspects of it, 02:01:55.560 |
including him and his girlfriend/wife, I'm not sure. 02:02:02.160 |
- No, no, we've only, I've only communicated with her 02:02:06.880 |
by text about some stuff that I was asking David, 02:02:10.120 |
but yeah, they clearly have formed a powerful team. 02:02:15.760 |
- And it's a beautiful thing to see people working 02:02:25.920 |
That you find a thing that works, which gives me hope 02:02:33.760 |
you can always find another thing that works with that. 02:02:37.540 |
But I've had the, so maybe let's trade Goggins stories, 02:02:50.880 |
I somehow found myself in communication with David 02:02:59.740 |
One of which is we were communicating every single day, 02:03:05.140 |
email, phone, about the particular 30-day challenge 02:03:08.940 |
that I did, which stretched on for longer, of pushups and pull-ups. 02:03:18.940 |
I knew of you before, but that's where I started tracking 02:03:21.180 |
some of what you were doing with these physical challenges. 02:03:38.620 |
neuroplasticity really loves a non-negotiable contract 02:03:41.800 |
because, and I've said this before, so forgive me, 02:03:45.660 |
but the brain is doing analysis of duration, path, 02:03:48.700 |
and outcome, and that's a lot of work for the brain. 02:03:51.580 |
And the more that it can pass off duration, path, 02:03:56.460 |
the more energy it can allocate to other things. 02:04:03.860 |
about how many pushups, how far I'm gonna run, 02:04:12.660 |
you mean like the brain, once the decision is made, 02:04:20.740 |
- That's right, I mean, so much of what we do 02:04:22.560 |
is reflexive at the level of just core circuitry, 02:04:24.660 |
breathing, heart rate, all that boring stuff, digestion, 02:04:31.900 |
that's reflexive too, but that you had to learn 02:04:38.860 |
and that involves a lot of top-down processing 02:04:42.380 |
But through plasticity mechanisms, you now do it. 02:04:47.300 |
provided that you understand the core mechanics 02:04:53.940 |
once you set the number and the duration and all that, 02:05:00.200 |
But people get caught in that tide pool of just, 02:05:10.860 |
And to some extent, look, not David Goggins, obviously, 02:05:15.860 |
nor do I claim to understand his process even partially, 02:05:23.760 |
which is that it's clear that by making the decision, 02:05:34.320 |
that was obvious to me, and it's still not obvious. 02:05:40.720 |
And I mean, that's something I really struggle with. 02:05:51.400 |
And one lesson I've learned is if you quit once, 02:06:01.260 |
like it's really valuable to trick your brain 02:06:05.440 |
into thinking that you're gonna have to die before you quit. 02:06:13.080 |
So actually what you're saying is very profound, 02:06:27.680 |
that I think it would be another conversation 02:06:30.360 |
'cause I'm not sure how to put it into words, 02:06:37.440 |
- Well, it's a huge, you know, it's a huge output. 02:06:41.760 |
I thought it would be the number would be hard, 02:06:48.080 |
especially in the early days was just spending, 02:06:53.080 |
I'm kind of embarrassed to say how many hours this took. 02:07:09.720 |
but occasionally David has said this publicly 02:07:11.920 |
where people will be like, don't you sleep or something? 02:07:16.560 |
that he would just block, delete, you know, like gone. 02:07:18.960 |
But it's actually, it's a super interesting topic. 02:07:23.880 |
And because self-control and directing our actions 02:07:34.400 |
and they're vital to performing well at anything. 02:07:39.760 |
being able to understand this about the self is crucial. 02:07:44.880 |
So I have a friend who was also in the teams. 02:07:56.840 |
because of a kind of funny challenge he gave himself, 02:08:11.200 |
to a position that gave him a little more time 02:08:14.840 |
And there's not as much time out in deployments, 02:08:21.160 |
but he thought about it and he asked himself, 02:08:25.280 |
And it turns out the thing that he hated doing the most 02:08:27.360 |
was bear crawls, you know, walking on your hands and knees. 02:08:30.080 |
So he decided to bear crawl for a mile for time. 02:08:35.280 |
And I thought that was an interesting example that he gave 02:08:41.480 |
And I think it maps right back to limbic friction. 02:08:44.200 |
It's the thing that creates the most limbic friction. 02:08:46.960 |
And so if you can overcome that, then there's carry over. 02:08:56.040 |
it's going to help you in other areas of life. 02:09:05.200 |
is not a circuit for bear crawls or a circuit for pull-ups. 02:09:08.600 |
What you're doing is you're exercising a circuit 02:09:12.360 |
And that circuit was not designed to be for bear crawls 02:09:15.540 |
or pull-ups or coding or waking up in the middle of the night. 02:09:20.820 |
That circuit was designed to override limbic friction. 02:09:24.280 |
And so neural circuits were designed to generalize, right? 02:09:30.940 |
that's a physical threat was designed to feel the same way 02:09:34.200 |
and be the same response internally as the threat 02:09:40.520 |
or whatever it is that's stressing somebody out. 02:09:43.320 |
And so neural circuits are not designed to be 02:09:47.580 |
So if you can, as you did, if you can train up 02:09:54.800 |
that when the desire to quit is at its utmost, 02:10:13.400 |
It's like a plant that doesn't get any water. 02:10:16.040 |
And a lot of this has been discussed in self-help 02:10:18.520 |
and growth mindset and all these kinds of ideas 02:10:23.200 |
But when you start to think about how they map 02:10:25.040 |
to neural circuits, I think there's some utility 02:10:26.920 |
'cause what it means is that the limbic friction 02:10:31.760 |
maybe some future relationship to something or someone, 02:10:40.160 |
It's just like the limbic friction you experienced 02:10:42.820 |
trying to engage in the God knows how many pushups, 02:10:46.760 |
pull-ups and running, you know, runs you were doing. 02:10:59.360 |
- This is the problem with you getting more followers 02:11:04.160 |
I don't know, maybe it's not politically correct for me. 02:11:34.760 |
I just have some, I found myself, everyone's different, 02:11:39.520 |
but I've found myself to be able to do something unpleasant 02:11:57.520 |
I mean, which mind tells you to quit exactly? 02:12:05.320 |
- Well, limbic friction is the source of that, 02:12:09.240 |
- So there's a, we can put something very concrete to that. 02:12:23.920 |
of trying to swim forward toward a target and a reward. 02:12:28.480 |
'cause they manipulated virtually the visual environment. 02:12:31.760 |
So the same amount of effort was being expended every time, 02:12:38.280 |
And sometimes the perception was you're making no progress 02:12:41.120 |
because stuff wasn't drifting by, which meant no progress. 02:12:46.560 |
And it turns out that with each bout of effort, 02:13:18.440 |
It can be rescued, endurance can be rescued with dopamine. 02:13:24.200 |
So that's where the subjective part really comes into play. 02:13:30.380 |
So you quit because you've learned how to turn that off, 02:13:36.340 |
some people will reward the pain process so much 02:13:44.480 |
and other people I know from special operations 02:13:46.680 |
and people have gone through cancer treatments three times, 02:13:50.840 |
you hear about, just when you hear about people, 02:14:02.040 |
of these processes as opposed to a neural circuit 02:14:04.440 |
for a particular action or cognitive function. 02:14:06.920 |
So I think you have to learn to subjectively self-reward 02:14:18.740 |
In his mind, apparently, that's a form of reward, 02:14:21.640 |
but it's not just a form of reward where you're, 02:14:24.280 |
it's like you're picking up a trophy or something. 02:14:36.520 |
to suppress the noradrenaline and adrenaline circuits 02:14:48.340 |
but they're usually just versions of me inside my head. 02:14:51.880 |
So I thought about, through that 30-day challenge, 02:15:05.200 |
you certainly have a formidable adversary in this one. 02:15:14.320 |
- Well, let's hope you both survive this one. 02:15:23.120 |
So everything we've been talking about in the mind, 02:15:25.760 |
there's a physical aspect that's just practically difficult, 02:15:31.280 |
like when you injure yourself at a certain point, 02:15:59.480 |
But yeah, the problem with these physical challenges, 02:16:09.920 |
and the body is kind of, unfortunately, quite limited. 02:16:13.640 |
- Well, I think the key is to dynamically control 02:16:15.960 |
your output, and that can be done by reducing effort, 02:16:31.840 |
of why this all works, but these are ancient pathways 02:16:34.360 |
that were designed to bring resources to an animal 02:16:39.680 |
for hunting or mates or water, all these things. 02:16:42.520 |
And they work so well because they're down in those 02:16:48.800 |
And that's great because it can be subjective 02:16:51.600 |
at the level of, oh, I reached this one milestone, 02:16:55.320 |
this one horizon, this one three-meter target. 02:16:57.760 |
But if you don't reward it, it's just effort. 02:17:01.880 |
If you do self-reward it, it's effort minus one 02:17:12.080 |
You're one of the great communicators in science. 02:17:17.880 |
and enjoying in terms of the educational stuff 02:17:23.680 |
- What's the, do you have a philosophy behind it 02:17:41.040 |
so I, okay, I'm in multiple places in the sense of 02:18:05.960 |
and places like MIT as one of the most magical institutions 02:18:13.720 |
for inspiring people to dream, people to build the future. 02:18:18.520 |
I mean, it's, I believe that it is a really special, 02:18:22.160 |
these universities are really special places. 02:18:27.400 |
when somebody as inspiring as you represents those places. 02:18:32.400 |
So it makes me proud that somebody from Stanford is, 02:18:38.200 |
like somebody like you is representing Stanford. 02:18:46.360 |
how did you come to be who you are in being a communicator? 02:18:52.520 |
- Well, first of all, thanks for the kind words, 02:18:56.320 |
I think Stanford is an amazing place as is MIT. 02:19:03.360 |
It's okay, I'll edit out anything you say at this point. 02:19:20.680 |
I think the great benefit of being in a place like MIT 02:19:26.800 |
you know, that the average is very high, right? 02:19:29.640 |
You have many best in class among the, you know, 02:19:33.520 |
one or two or three best in the world at what they do. 02:19:42.920 |
and other universities like them very special 02:19:44.800 |
is that there's an emphasis on what gets exported 02:19:50.520 |
and really trying to keep an eye on what's needed 02:19:53.280 |
in the world and trying to do something useful. 02:19:55.600 |
And I think the proximity to industry in Silicon Valley 02:20:02.520 |
And there are other institutions too, of course. 02:20:05.360 |
- So the reason I got involved in educating on social media 02:20:21.720 |
he talked me into doing these early morning cold water swims. 02:20:24.800 |
I was learning a lot about pain and suffering, 02:20:32.120 |
"So what are you going to do to serve the world in 2019?" 02:20:39.000 |
what are you gonna do to serve the world in 2019? 02:20:41.960 |
It's like, no, no, what are you gonna do that's new? 02:20:48.680 |
I would just teach people, everyone about the brain 02:20:59.240 |
and it's grown to include a variety of things, 02:21:09.780 |
and in the nervous system and in biology generally. 02:21:19.360 |
of talking about how to look at a graph and statistics 02:21:32.160 |
mainly tools that map to things that we're doing in our lab. 02:21:37.760 |
how to understand and direct one's states of mind and body. 02:21:41.560 |
So reduce stress, raise one's stress threshold. 02:21:46.200 |
Sometimes it's about learning how to tolerate 02:21:58.680 |
like the eight and 10 year old version of me, 02:22:02.780 |
when I was a kid reading about weird animals. 02:22:04.840 |
And I had this obsession with like medieval weapons 02:22:08.920 |
And then I used to come into school on Monday 02:22:26.240 |
that I find in the world in books and in experiments 02:22:38.600 |
So I try and package it into a form that people can access. 02:22:43.160 |
I think the reception has been really wonderful. 02:22:45.120 |
Stanford has been very supportive, thankfully. 02:22:49.280 |
I've given, done some podcasts even with them 02:22:51.520 |
and they've reposted some stuff on social media. 02:22:54.080 |
It's a precarious place to put yourself out there 02:23:01.080 |
if I'm still serious about research, which I absolutely am. 02:23:08.520 |
their research and the research coming out of the field 02:23:13.320 |
And not all scientists are good at translating that 02:23:25.000 |
that I think people will find interesting and useful 02:23:30.600 |
you would offer food to somebody visiting your home. 02:23:32.600 |
You're not going to cram foie gras in their face. 02:23:34.900 |
You're going to say, like, do you want a cracker? 02:23:38.400 |
And like, do you want something on that cracker? 02:23:43.000 |
or you want that really like stinky, like French, 02:23:50.440 |
the best information prompts more questions of interest, 02:23:54.580 |
not questions of confusion, but questions of interest. 02:23:59.020 |
then another door opens, then another door opens. 02:24:05.800 |
who are thinking about themselves neuroscientifically. 02:24:43.280 |
and Viktor Frankl, "Man's Search for Meaning," 02:24:53.620 |
What, let me ask the big ridiculous question about life. 02:25:07.400 |
do you mention that book from a psychologist perspective, 02:25:12.320 |
or do you ever think about the bigger philosophical questions 02:25:21.380 |
- One of the great challenges in assigning a good, 02:25:27.280 |
you know, giving a good answer to the question of like, 02:25:29.040 |
what's the meaning of life is I think illustrated best 02:25:37.120 |
which is that our sense of meaning is very elastic 02:25:43.120 |
And I'm, we talked a little bit about this earlier, 02:25:46.200 |
but it's amazing to me that somebody locked in a cell 02:25:50.480 |
or a concentration camp can bring the horizon 02:25:54.240 |
in close enough that they can then micro slice 02:25:57.380 |
their environment so that they can find rewards 02:26:03.480 |
even in a little square box or a horrible situation. 02:26:09.840 |
to one of the most important features of the human mind, 02:26:12.320 |
which is we could do, let's take two opposite extremes. 02:26:15.980 |
One would be, let's say the alarm went off right now 02:26:18.920 |
in this building and the building started shaking. 02:26:21.760 |
Our vision, our hearing, everything would be tuned 02:26:31.440 |
all that would matter, the only meaning would be 02:26:33.720 |
get out of here safe, figure out what's going on, 02:26:42.160 |
I think it's called pale blue dot thing or whatever, 02:26:44.000 |
where we could imagine ourselves in this room. 02:26:55.740 |
If you see yourself as just one brief glimmer 02:26:59.480 |
in all of time and all of space, you go to, I don't matter. 02:27:03.920 |
And if you go to, oh, every little thing that happens 02:27:31.000 |
And we can pull from the past and the present and future. 02:27:39.240 |
It makes sense that it wasn't just about grinding it out. 02:27:43.700 |
even in those little boxes they were forced into. 02:27:50.840 |
but for me personally, and I think about this a lot 02:27:54.480 |
because I have this complicated history in science 02:28:09.800 |
But what I realized is that we can get so fixated 02:28:25.260 |
And this is important because what really gives meaning 02:28:32.020 |
between these different space time dimensionalities. 02:28:35.100 |
And I'm not trying to sound like a theoretical physicist 02:28:37.740 |
or anyone that thinks about the cosmos in saying that. 02:28:44.960 |
and do and think things and it feels so important. 02:28:47.760 |
And then two days later, we're like, what happened? 02:28:51.160 |
Well, you had a different brain processing algorithm 02:28:54.720 |
entirely, you were in a completely different state. 02:28:57.080 |
And so what I want to do in this lifetime is I want to, 02:29:05.640 |
of contraction and dilation of meaning as possible. 02:29:12.400 |
I'm like, if I just pulled over to the side of the road, 02:29:19.260 |
And you also can't stay staring up at the clouds 02:29:21.620 |
and just think about how we're just these little beings 02:29:32.540 |
And my goal is to get as many trips up and down 02:29:34.480 |
that staircase as I can before the reaper comes for me. 02:29:42.020 |
between the different spaces, zoom in, zoom out 02:29:52.420 |
I watched my postdoc advisor die, wither away. 02:29:57.100 |
but they found beauty in these closing moments 02:30:00.180 |
because their bubble was their kids in one case 02:30:09.420 |
And like, and you just realize like it's a Giants game 02:30:12.540 |
but not in that moment because time is closing. 02:30:35.220 |
- I don't think there's a better way to end it, Andrew. 02:30:44.180 |
and focus on this conversation for a few hours. 02:30:51.060 |
I hope you keep growing and educating the world 02:30:59.040 |
I really appreciate the invitation to be here. 02:31:03.020 |
just 'cause I'm here, but I'm a huge fan of yours. 02:31:04.940 |
I send your podcasts to my colleagues and other people. 02:31:08.060 |
And I think what you're doing isn't just amazing, 02:31:22.900 |
and gives me yet another reason to enjoy sleep. 02:31:26.020 |
SEMrush, the most advanced SEO optimization tool 02:31:30.940 |
And Cash App, the app I use to send money to friends. 02:31:35.060 |
Please check out the sponsors in the description 02:31:37.460 |
to get a discount and to support this podcast. 02:31:41.260 |
If you enjoy this thing, subscribe on YouTube, 02:31:51.620 |
And now let me leave you with some words from Carl Jung. 02:32:01.700 |
Thank you for listening and hope to see you next time.