Dr. Terry Sejnowski: How to Improve at Learning Using Neuroscience & AI
Chapters
0:00 Dr. Terry Sejnowski
2:32 Sponsors: BetterHelp & Helix Sleep
5:19 Brain Structure & Function, Algorithmic Level
11:49 Basal Ganglia; Learning & Value Function
15:23 Value Function, Reward & Punishment
19:14 Cognitive vs. Procedural Learning, Active Learning, AI
25:56 Learning & Brain Storage
30:08 Traveling Waves, Sleep Spindles, Memory
32:08 Sponsors: AG1 & David
34:57 Tool: Increase Sleep Spindles; Memory, Ambien; Prescription Drugs
42:02 Psilocybin, Brain Connectivity
45:58 Tool: ‘Learning How to Learn’ Course
49:36 Learning, Generational Differences, Technology, Social Media
58:37 Sponsors: LMNT & Joovv
61:06 Draining Experiences, AI & Social Media
66:52 Vigor & Aging, Continued Learning, Tool: Exercise & Mitochondrial Function
72:17 Tool: Cognitive Velocity; Quick Stressors, Mitochondria
76:58 AI, Imagined Futures, Possibilities
87:14 AI & Mapping Potential Options, Schizophrenia
90:56 Schizophrenia, Ketamine, Depression
96:15 AI, “Idea Pump,” Analyzing Research
102:11 AI, Medicine & Diagnostic Tool; Predicting Outcomes
110:04 Parkinson’s Disease; Cognitive Velocity & Variables; Amphetamines
119:49 Free Will; Large Language Model (LLM), Personalities & Learning
132:40 Tool: Idea Generation, Mind Wandering, Learning
138:18 Dreams, Unconscious, Types of Dreams
142:56 Future Projects, Brain & Self-Attention
151:39 Zero-Cost Support, YouTube, Spotify & Apple Follow & Reviews, Sponsors, YouTube Feedback, Protocols Book, Social Media, Neural Network Newsletter
00:00:10.240 |
and I'm a professor of neurobiology and ophthalmology 00:00:20.240 |
at the Salk Institute for Biological Studies, 00:00:22.600 |
where he directs the Computational Neurobiology Laboratory. 00:00:29.500 |
That is, he uses math as well as artificial intelligence 00:00:32.560 |
and computing methods to understand this overarching, 00:00:36.200 |
ultra-important question of how the brain works. 00:00:45.520 |
that it can be a bit overwhelming and even intimidating. 00:00:48.400 |
But I assure you that the purpose of Dr. Sejnowski's work, 00:00:57.400 |
and indeed, to simplify the answer to that question. 00:01:06.020 |
that all your motivation in all domains of life 00:01:09.260 |
is governed by a simple algorithm or equation. 00:01:16.360 |
drives all of our motivation-related behaviors. 00:01:19.380 |
And it, of course, relates to the neuromodulator dopamine. 00:01:22.020 |
And if you're familiar with dopamine as a term, 00:01:24.140 |
today you will really understand how dopamine works 00:01:51.060 |
the way that you in particular forage for information 00:02:03.120 |
that is his brain's ability to learn information 00:02:08.100 |
Today, we also discuss both the healthy brain 00:02:12.040 |
in conditions like Parkinson's and Alzheimer's, 00:02:18.020 |
can perhaps be used in order to treat various diseases, 00:02:22.960 |
I'm certain that by the end of today's episode, 00:02:25.100 |
you will have learned a tremendous amount of new knowledge 00:02:27.740 |
about how your brain works and practical tools 00:02:35.660 |
is separate from my teaching and research roles at Stanford. 00:02:47.020 |
I'd like to thank the sponsors of today's podcast. 00:02:53.900 |
with a licensed therapist carried out completely online. 00:02:57.060 |
I've been doing weekly therapy for well over 30 years. 00:03:00.980 |
It was a condition of being allowed to stay in school, 00:03:04.380 |
that therapy is an extremely important component 00:03:09.820 |
just as important as getting regular exercise, 00:03:12.140 |
including cardiovascular exercise and resistance training, 00:03:15.260 |
which, of course, I also do every single week. 00:03:24.880 |
about essentially all issues that you want to. 00:03:27.120 |
Second of all, great therapy provides support 00:03:29.540 |
in the form of emotional support or simply directed guidance, 00:03:32.940 |
what to do or what not to do in given areas of your life. 00:03:36.140 |
And third, expert therapy can provide you useful insights 00:03:39.400 |
that you would not have been able to arrive at on your own. 00:03:42.220 |
BetterHelp makes it very easy to find an expert therapist 00:03:46.060 |
and that can provide you the benefits I just mentioned 00:03:59.780 |
Today's episode is also brought to us by Helix Sleep. 00:04:04.820 |
that are customized to your unique sleep needs. 00:04:07.580 |
Now, I've spoken many times before on this and other podcasts 00:04:10.660 |
about the fact that getting a great night's sleep 00:04:16.020 |
Now, the mattress you sleep on makes a huge difference 00:04:18.340 |
in terms of the quality of sleep that you get each night, 00:04:20.740 |
how soft it is or how firm it is, how breathable it is, 00:04:24.980 |
and need to be tailored to your unique sleep needs. 00:04:32.580 |
do you sleep on your back, your side, or your stomach? 00:04:34.660 |
Do you tend to run hot or cold during the night? 00:04:37.500 |
Maybe you know the answers to those questions, 00:04:43.540 |
For me, that turned out to be the Dusk mattress, D-U-S-K. 00:05:00.140 |
that is customized for your unique sleep needs. 00:05:15.700 |
And now for my discussion with Dr. Terry Sejnowski. 00:05:26.500 |
because you've worked on a great many different things 00:05:31.580 |
You're considered by many a computational neuroscience, 00:05:35.820 |
to an understanding of the brain and neural networks. 00:05:40.540 |
and we're going to make it accessible for everybody, 00:05:45.940 |
To kick things off, I want to understand something. 00:05:49.380 |
I understand a bit about the parts list of the brain, 00:05:52.820 |
and most listeners of this podcast will understand 00:05:57.220 |
even if they've never heard an episode of this podcast before 00:06:02.300 |
those neurons connect to one another in very specific ways 00:06:05.060 |
that allow us to see, to hear, to think, et cetera. 00:06:13.460 |
it doesn't really inform us how the brain works, right? 00:06:33.020 |
Like what is this piece of meat in our heads? 00:06:46.260 |
from the mental conversation, if that's possible for you, 00:06:49.060 |
how do you think about quote unquote, how the brain works? 00:07:00.260 |
let's just say the time when we first wake up in the morning 00:07:03.660 |
till we make it to that first cup of coffee or water, 00:07:07.100 |
or maybe even just to urinate first thing in the morning, 00:07:17.500 |
Pat Churchland and I wrote a book, "The Computational Brain," 00:07:24.680 |
And it laid out levels of investigation at different spatial scales 00:07:31.480 |
to synapses and neurons, circuits, neural circuits, 00:07:49.800 |
So, you know, where is consciousness in all of that? 00:08:04.240 |
And the one you described, which is, you know, 00:08:07.500 |
let's look at all the parts, that's the bottom-up approach. 00:08:10.340 |
You know, take it apart into a reductionist approach. 00:08:15.420 |
You can figure out, you know, how things are connected 00:08:17.560 |
and understand how development works, how neurons connect. 00:08:21.480 |
But it's very difficult to really make progress 00:08:27.580 |
Now, the other approach, which has been successful, 00:08:32.000 |
but at the end, unsatisfying, is the top-down approach. 00:08:38.520 |
And this is the approach that psychologists have taken, 00:08:41.980 |
looking at behavior and trying to understand, 00:08:52.740 |
were trying to do a top-down, to write programs 00:08:55.500 |
that could replicate human behavior, intelligent behavior. 00:08:59.740 |
And I have to say that both of those approaches, 00:09:08.080 |
of answering any of those questions, the big questions. 00:09:11.480 |
But there's a whole new approach now that is emerging 00:09:15.360 |
in both neuroscience and AI at exactly the same time. 00:09:18.760 |
And at this moment in history, it's really quite remarkable. 00:09:23.980 |
between the implementation level at the bottom, 00:09:44.200 |
They're like, you know, when you bake a cake, 00:09:46.460 |
you have to have ingredients and you have to say 00:09:50.020 |
the order in which they're put together and how long. 00:09:52.680 |
And, you know, if you get it wrong, you know, 00:09:59.080 |
Now, it turns out that we're discovering algorithms. 00:10:03.000 |
We've made a lot of progress with understanding 00:10:06.720 |
the algorithms that are used in neural circuits. 00:10:18.920 |
But I'm gonna give you one example of an algorithm, 00:10:27.600 |
when Peter Dayan and Read Montague were postdocs in the lab. 00:10:38.240 |
which is responsible for learning sequences of actions 00:10:45.920 |
For example, if you wanna play tennis, you know, 00:10:48.960 |
you have to be able to coordinate many muscles 00:10:51.840 |
and a whole sequence of actions has to be made 00:10:57.120 |
And you have to practice, practice, practice. 00:10:59.240 |
Well, what's going on there is that the basal ganglia 00:11:06.240 |
and producing actions that get better and better 00:11:21.040 |
if you wanna become a good doctor or a neuroscientist, 00:11:25.400 |
right, you have to be practicing, practicing, practicing 00:11:41.840 |
interacts with the cortex, not just in the back, 00:11:48.880 |
- Can I ask you a question about this briefly? 00:12:08.840 |
or even a basic golf swing or a tennis racket swing 00:12:15.700 |
which is that the basal ganglia are also involved 00:12:26.520 |
in suppression of thoughts of particular kinds. 00:12:35.000 |
and just thinking about their motor behaviors, 00:12:38.080 |
They presumably need to think about what to think about, 00:12:43.960 |
thinking about how their kid was a brat that morning 00:12:46.520 |
and they're frustrated because the two things interact. 00:12:50.280 |
So is there go, no-go in terms of action and learning? 00:12:57.280 |
and that part, the loop with the basal ganglia, 00:12:59.960 |
that is one of the last to mature in early adulthood. 00:13:07.440 |
it's not the no-go part for planning and actions 00:13:21.760 |
But one of the things though is that learning is involved. 00:13:31.080 |
and then experimentally later by recording from neurons 00:13:46.440 |
And it's the simplest possible algorithm you can imagine. 00:13:50.960 |
It's simply to predict the next reward you're gonna get. 00:13:55.960 |
If I do an action, will it give me something of value? 00:14:06.560 |
whether you got the amount of reward you expected or less, 00:14:10.040 |
you use that to update the synapses, synaptic plasticity, 00:14:14.200 |
so that the next time you'll have a better chance 00:14:18.840 |
and you build up what's called a value function. 00:14:29.520 |
Like you go to a restaurant, you order something, 00:14:39.400 |
This is the same algorithm that was used by AlphaGo. 00:14:46.680 |
This is an AI program that beat the world Go champion. 00:14:53.480 |
that humans have ever played on a regular basis. 00:14:58.480 |
- Far more complex than chess, as I understand. 00:15:12.200 |
because you have to think in terms of battles 00:15:18.440 |
and the order in which you put the pieces down 00:15:20.500 |
are gonna affect what's gonna happen in the future. 00:15:23.040 |
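The algorithm Dr. Sejnowski describes — predict the next reward, compare it with what you actually got, and use that error to update a value function — is known in reinforcement learning as temporal-difference (TD) learning, the same family of algorithms behind AlphaGo. Here is a minimal sketch; the state names, rewards, and parameters are illustrative, not from the episode:

```python
# Minimal TD(0) value-function update. The "reward prediction error" --
# did I get more or less than I expected? -- is the quantity dopamine
# neurons are thought to signal, and it nudges the stored value of
# each state toward reward + discounted value of what comes next.

def td_update(value, state, next_state, reward, alpha=0.1, gamma=0.9):
    """Move value[state] toward reward + gamma * value[next_state]."""
    prediction_error = reward + gamma * value[next_state] - value[state]
    value[state] += alpha * prediction_error
    return prediction_error

# Illustrative two-step sequence that ends in a reward (winning a point).
value = {"serve": 0.0, "rally": 0.0, "point_won": 0.0}
for _ in range(100):                       # practice, practice, practice
    td_update(value, "serve", "rally", reward=0.0)
    td_update(value, "rally", "point_won", reward=1.0)

# Earlier states come to predict the discounted future reward,
# so actions that lead toward reward acquire value themselves.
print(value)
```

With repetition, the state right before the reward ("rally") approaches a value of 1, and the state before that ("serve") inherits a discounted share of it — which is how a sequence of actions, none individually rewarded, gets shaped toward the eventual payoff.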
- So this value function is super interesting 00:15:27.240 |
and I wonder whether, and I think you answered this, 00:15:55.920 |
are they able to tap into that same value function 00:16:00.380 |
even though there's been a lot of intervening time 00:16:08.260 |
is also being played out in more complex scenarios, 00:16:16.060 |
that for many people involves some trial and error, 00:16:27.900 |
We get some things right, we get some things wrong. 00:16:30.380 |
So is the same value function being implemented, 00:16:32.540 |
we're paying attention to what was rewarding, 00:16:34.940 |
but what I didn't hear you say also was what was punishing. 00:16:38.280 |
So are we only paying attention to what is rewarding 00:16:43.160 |
We don't get an electric shock when we get the serve wrong, 00:16:47.640 |
- What you identified is a very important feature, 00:16:59.000 |
you're updating this value function every time 00:17:04.720 |
the answer is that it's always gonna be there. 00:17:08.800 |
It's a very permanent part of your experience 00:17:14.680 |
And interestingly, and behaviorists knew this 00:17:22.480 |
that there are two ways to get there through trial and error. 00:17:25.860 |
Small rewards are good because you're constantly 00:17:31.080 |
coming closer and closer to getting what you're seeking, 00:17:36.080 |
better tennis player or being able to make a friend. 00:17:40.340 |
But the negative punishment is much more effective. 00:18:08.520 |
will have you learning certain things forever. 00:18:16.920 |
That can screw you up for the rest of your life. 00:18:21.840 |
and you pointed out something really important, 00:18:24.280 |
which is that a large part of the prefrontal cortex 00:18:32.240 |
you don't know what language you're gonna be speaking. 00:18:44.800 |
All of that has to become through experience, 00:18:49.400 |
And this is something we discovered in the 20th century. 00:19:13.440 |
- Can you give me an example of procedural learning 00:19:15.880 |
in the context of a comparison to cognitive learning? 00:19:32.980 |
And I can imagine one, but you're the true expert here. 00:19:38.600 |
but just since we've been talking about tennis, 00:19:42.200 |
can you imagine learning how to play tennis through a book, 00:19:53.640 |
maybe he was working on his pilot's license or something. 00:19:58.800 |
but notice these diagrams of the plane flying. 00:20:01.920 |
And I thought, I'm just so glad that this guy is a passenger 00:20:06.880 |
And then I thought about how the pilots learned. 00:20:20.640 |
and you learn why you have to wait between dives, et cetera, 00:20:30.560 |
put it back on, and then blow the water out of your mask. 00:20:32.800 |
Like that, you just have to do that in a pool. 00:20:35.160 |
And you actually have to do it when you need to 00:20:43.400 |
that have to be executed quickly and expertly 00:20:47.840 |
to get that really down pat so you don't have to think. 00:21:13.080 |
so I went through all of the classes in theoretical physics. 00:21:20.840 |
that really were the core of becoming a good physicist. 00:21:33.080 |
It would be, you know, like get some morning sunlight 00:21:34.720 |
in your eyes to stimulate your suprachiasmatic nucleus 00:21:39.480 |
Audiences of this podcast will recognize those terms. 00:21:41.640 |
It's basically get sunlight in your eyes in the morning 00:21:54.320 |
There are these things in your eye that, you know, 00:21:56.040 |
encode the sunrise qualities of light, et cetera, 00:21:58.840 |
and then send them to your brain, et cetera, et cetera. 00:22:00.880 |
But then once we link knowledge, pure knowledge, 00:22:05.080 |
to a practice, I do believe that the two things 00:22:11.240 |
reinforces both the knowledge and the practice. 00:22:14.480 |
So these things are not necessarily separate, they bridge. 00:22:17.000 |
In other words, doing your theoretical physics problem sets 00:22:20.400 |
reinforces the examples that you learned in lecture 00:22:26.240 |
- So this is a battle that's going on right now in schools. 00:22:30.660 |
You know, what you've just said is absolutely right. 00:22:37.720 |
We have a cognitive learning system, which is cortical. 00:22:56.440 |
in California at least, is that they're trying 00:23:03.860 |
because it's going to be, you know, you're stressing them. 00:23:08.720 |
You don't want them to be, to feel that, you know, 00:23:22.460 |
and then you're going to go out to the ocean someday 00:23:28.540 |
- And now you're expected to be able to survive, 00:23:38.480 |
and I have a MOOC, Massive Open Online Course, 00:23:46.560 |
We aimed it at students, but it actually has been taken 00:23:49.180 |
by 4 million people in 200 countries, ages 10 to 90. 00:24:12.660 |
is that we need to test ourselves on the material. 00:24:15.240 |
That testing is not just a form of evaluation. 00:24:27.220 |
It's not about just listening and regurgitating. 00:24:30.500 |
You're, you know, you've put your finger on it, 00:24:32.600 |
which is that, and this is what we teach the students, 00:24:35.040 |
is that you have to, the way the brain works, right, 00:24:40.040 |
is not, it doesn't memorize things like a computer, 00:24:44.280 |
but you have to, it has to be active learning. 00:24:49.800 |
In fact, when you're trying to solve a problem on your own, 00:24:55.440 |
by trial and error, and that's the procedural system. 00:24:58.320 |
But if someone tells you what the right answer is, 00:25:03.080 |
that is a fact that gets stored away somewhere, 00:25:10.760 |
that's not exactly the same problem but is similar. 00:25:27.940 |
They just, they're not, they just don't memorize 00:25:36.160 |
That means to be able to do well on new things 00:25:38.680 |
that come in that are similar to the old things 00:25:40.640 |
that you've seen, but allow you to solve new problems. 00:25:47.640 |
The brain is really, really good at generalizing. 00:25:55.680 |
Like going to a restaurant for the first time, 00:25:58.700 |
there are a number of new interactions, right? 00:26:03.920 |
You sit down at these tables you've never sat at. 00:26:14.620 |
doesn't matter what the genre of food happens to be 00:26:22.620 |
Sit at the counter, sit outside, sit at the table. 00:26:28.220 |
that I think pretty much translate to everywhere. 00:26:34.180 |
or some super low-end thing where it's a buffet 00:26:36.260 |
or whatever, you can start to fill in the blanks here. 00:26:39.240 |
If I understand correctly, there's an action function 00:26:42.340 |
that's learned from the knowledge and the experience. 00:26:47.340 |
- And then where is that action function stored? 00:27:02.600 |
In the past, it had been thought that, you know, 00:27:23.360 |
In other words, there's a visual cortex in the back 00:27:28.180 |
and then there's the auditory cortex here in the middle 00:27:31.300 |
and then the prefrontal cortex for social interaction. 00:27:33.860 |
And so it looked really clear cut that it's modular 00:27:38.860 |
and now what we're facing is we have a new way 00:27:45.340 |
Optically, we can record from tens of thousands, 00:27:56.440 |
you're engaging not just the area that you might think, 00:27:59.640 |
you know, has the input coming in, say the visual system, 00:28:10.080 |
coming from the motor system than from the eye. 00:28:13.560 |
- Yes, Anne Churchland at UCLA has shown that in the mouse. 00:28:13.560 |
and that's where real complex cognitive behaviors emerge. 00:28:34.300 |
and we're doing that now first on mice and monkeys, 00:28:46.060 |
at Mass General Hospital to record from people with epilepsy 00:28:56.220 |
find out where it starts in the cortex, you know, 00:28:59.300 |
and where it is initiated, where the seizure starts 00:29:04.980 |
and record simultaneously from a lot of parts of the cortex 00:29:12.040 |
and then you go in and you try to take it out 00:29:17.140 |
Very, very invasive, but for two weeks we have access 00:29:23.860 |
that are being, you know, recorded from constantly. 00:29:30.700 |
and I wanted to understand what happens in the cortex 00:29:40.260 |
people who have these debilitating problems with seizures, 00:29:47.580 |
So they just love the fact that scientists are interested 00:29:50.740 |
in helping them and, you know, teaching them things 00:29:54.340 |
and finding out where in the cortex things are happening 00:30:04.920 |
that I could have never gotten from any other species. 00:30:09.980 |
but there are other things in sleep that we've discovered. 00:30:15.300 |
there are circular traveling waves that go on during sleep, 00:30:25.180 |
- If you were to ascribe one or two major functions 00:30:28.760 |
what do you think they are accomplishing for us in sleep? 00:30:31.500 |
And by the way, are they associated with deep sleep, 00:30:33.820 |
slow wave sleep, or with rapid eye movement sleep, or both? 00:30:48.380 |
They've heard a lot about slow wave sleep from me 00:30:50.020 |
and about rapid eye movement sleep from Matt Walker. 00:30:53.420 |
- And so what do these traveling waves accomplish for us? 00:30:59.300 |
They last, the waves last for about a second or two, 00:31:04.220 |
and they travel, like I say, in a circle around the cortex. 00:31:07.940 |
And it's known that these spindles are important 00:31:10.740 |
for consolidating experiences you've had during the day 00:31:19.900 |
and if you take out, see, it's the hippocampus 00:31:30.460 |
If you don't have a hippocampus, you can't learn new things. 00:31:34.320 |
That is to say, you can't remember what you did yesterday, 00:31:41.360 |
But the hippocampus plays back your experiences, 00:31:43.840 |
causes the sleep spindles now to knead that into the cortex. 00:31:53.720 |
You just want to basically incorporate the new experience 00:31:56.720 |
into your existing knowledge base in an efficient way 00:32:00.980 |
that doesn't interfere with what you already know. 00:32:03.840 |
So that's an example of a very important function 00:32:22.480 |
In fact, I only had enough money to purchase one supplement, 00:32:25.160 |
and I'm so glad that I made that supplement, AG1. 00:32:31.720 |
it's very difficult to get enough vitamins and minerals, 00:32:33.960 |
micronutrients, and adaptogens from diet alone 00:32:38.680 |
meaning have enough energy for all the activities 00:32:42.800 |
sleeping well at night, and keeping my immune system strong. 00:33:06.480 |
that I have more mental clarity and more mental energy. 00:33:29.520 |
As I've discussed many times before on this podcast, 00:33:32.120 |
omega-3 fatty acids are critical for brain health, 00:33:41.360 |
Today's episode is also brought to us by David. 00:33:59.780 |
My favorite flavor is chocolate chip cookie dough, 00:34:02.060 |
but then again, I also like the chocolate fudge-flavored one 00:34:09.380 |
For me personally, I strive to eat mostly whole foods. 00:34:12.380 |
However, when I'm in a rush or I'm away from home, 00:34:15.240 |
or I'm just looking for a quick afternoon snack, 00:34:20.600 |
With David, I'm able to get 28 grams of protein 00:34:24.720 |
which makes it very easy to hit my protein goals 00:34:26.760 |
of one gram of protein per pound of body weight each day. 00:34:33.500 |
I typically eat a David bar in the early afternoon 00:34:35.780 |
or even mid-afternoon if I wanna bridge that gap 00:34:44.480 |
of very high-quality protein with just 150 calories. 00:34:52.620 |
Again, the link is davidprotein.com/huberman. 00:35:02.840 |
that one gets sufficient sleep spindles at night 00:35:09.480 |
This was from the episode that we did with Gina Poe 00:35:12.400 |
from UCLA, I believe, and others, including Matt Walker. 00:35:18.480 |
is to make sure you get enough sleep at night 00:35:22.120 |
And we're all familiar with the cognitive challenges, 00:35:25.040 |
including memory challenges and learning challenges 00:35:27.100 |
associated with lack of sleep, insufficient sleep. 00:35:30.600 |
But the other was that there was some interesting 00:35:54.760 |
I mean, obviously, you're refreshed when you wake up, 00:36:00.920 |
It's that it goes into a completely different state, 00:36:03.800 |
and memory consolidation is just one of those things 00:36:13.400 |
exactly how all the different sleep stages work together. 00:36:18.400 |
But exercise is a particularly important part 00:36:28.240 |
And it's thought that the REM, rapid eye movement sleep, 00:36:36.440 |
So that's yet another part of the sleep stages. 00:36:46.360 |
back and forth, back and forth during the night. 00:36:48.800 |
And then when you wake up, you're in the REM stage, 00:37:00.240 |
is perturb the system and see if you can maybe, 00:37:06.760 |
maybe you'd be able to remember things better. 00:37:08.720 |
So it turns out Sarah Mednick, who's at UC Irvine, 00:37:14.200 |
So it turns out there's a drug called zolpidem, 00:37:24.880 |
- I've never taken it, but I'm aware of what it is. 00:37:30.240 |
A lot of people take it in order to sleep, okay. 00:37:33.740 |
Well, it turns out that it causes more sleep spindles. 00:37:44.640 |
you take the drug after you've done the learning, right? 00:37:50.500 |
You do the learning at night and then you take the drug 00:37:57.740 |
you can remember twice as much from what you learned. 00:38:13.420 |
say if you're going to Europe and you take it 00:38:19.220 |
but often you find yourself in the hotel room 00:38:29.380 |
or any other drugs where I am very badly jet lagged. 00:38:36.500 |
but what feels like eternity, I have no idea where I am. 00:38:40.980 |
- Well, that's another problem that you have with jet lag. 00:38:46.140 |
But this is something where it could be an hour. 00:38:48.780 |
You know, you took the train or you took a taxi 00:38:56.460 |
How could it be a way to improve learning and recall 00:39:01.460 |
on one hand and then forgetfulness on the other hand? 00:39:14.360 |
In other words, it helps consolidate experiences 00:39:18.900 |
you've had in the past before you took the drug, 00:39:22.460 |
but it'll wipe out experiences you have in the future 00:39:35.300 |
and indeed some wonderfully useful pharmaceuticals 00:39:39.860 |
You know, some people may cringe to hear me say that, 00:39:41.780 |
but there are some very useful drugs out there 00:39:43.260 |
that save lives and help people deal with symptoms, et cetera. 00:39:47.740 |
but this particular drug profile, Ambien, that is, 00:39:52.580 |
seems to reveal something perhaps even more important 00:39:59.180 |
or even sleep, which is that you got to pay the piper 00:40:12.500 |
- That's a true, I think that this is something 00:40:17.540 |
that is true, not just of drugs for the brain, 00:40:24.980 |
Yeah, I mean, steroids, even low-dose testosterone therapy, 00:40:31.740 |
but it is introducing a sort of second puberty, 00:40:35.100 |
and puberty is perhaps the most rapid phase of aging 00:40:39.700 |
Same thing with people who take growth hormone 00:40:57.900 |
It's highly individual, but I completely agree with you. 00:41:00.540 |
I would also venture that with the growing interest 00:41:14.060 |
okay, maybe they can get away with doing that 00:41:15.580 |
every once in a while for a deadline task or something, 00:41:23.500 |
to achieve certain brain states pay in some other way. 00:41:27.340 |
- Whether or not stimulants or sedatives or sleep drugs, 00:41:35.860 |
- Yep, and one of the things about the way the body evolved 00:41:40.860 |
is that it really has to balance a lot of things, 00:41:46.260 |
and so with drugs, you're basically unbalancing it somehow, 00:42:02.260 |
- As long as we're talking about brain states 00:42:08.820 |
then I want to return to this issue about how best to learn, 00:42:17.340 |
and some guests on the topic of psychedelics. 00:42:23.300 |
because do you know why there aren't many studies of LSD? 00:42:31.940 |
and there are lots of studies going on about this. 00:42:49.060 |
for the subject to get through an LSD journey, 00:42:51.780 |
whereas psilocybin tends to be a shorter experience. 00:42:55.020 |
Okay, let's talk about psilocybin for a moment. 00:43:05.580 |
show pretty significant recovery from major depression. 00:43:08.420 |
It's pretty impressive, but if we just set that aside 00:43:11.200 |
and say, okay, more needs to be worked out for safety, 00:43:13.620 |
what is very clear from the brain imaging studies, 00:43:20.020 |
is that you get more resting state global connectivity, 00:43:27.140 |
than was the case prior to the use of the psychedelic. 00:43:31.180 |
And given the similarity of the psychedelic journey, 00:43:34.300 |
and here specifically talking about psilocybin, 00:44:04.760 |
as we go from childhood into the late stages of our life, 00:44:15.140 |
Is that what the human experience is really about? 00:44:19.700 |
We're getting more segregated in terms of this area, 00:44:24.860 |
Feel free to explore this in any way that feels meaningful, 00:44:35.340 |
but specifically with regard to connectivity, 00:44:38.480 |
if you look at what happens in an infant's brain 00:44:45.060 |
there's a tremendous amount of new synapses being formed. 00:44:54.820 |
Then the second phase is that you prune the overabundant synapses, 00:45:07.540 |
It takes a lot of energy to activate all of the neurons, 00:45:16.460 |
'cause there's the turnover of the neurotransmitter. 00:45:19.980 |
And so what you wanna do is to reduce the amount of energy 00:45:27.100 |
that have been proven to be the most important, right? 00:45:43.160 |
So I think it goes in the opposite direction. 00:45:45.460 |
I think that as you get older, you're losing connectivity. 00:45:50.040 |
But interestingly, you retain the old memories. 00:45:58.680 |
- The foundation upon which everything else is built. 00:46:07.240 |
in the sense that even as an adult, as you know, 00:46:10.920 |
you can learn new things, maybe not as quickly. 00:46:13.840 |
By the way, this is one of the things that surprised me. 00:46:25.520 |
It turns out that the peak of the demographic is 25 to 35. 00:46:33.560 |
She's a fabulous educator with a background in engineering. 00:46:58.800 |
are actually, you know, they weren't taking the course. 00:47:27.480 |
So it's not like, you know, filling in for college. 00:47:47.160 |
So you take a MOOC and you discover, you know, 00:47:49.920 |
I'm not quite as agile as I used to be in terms of learning, 00:48:20.440 |
And there's about 50 or 60 over a course of one month. 00:48:38.200 |
In fact, we have people in India, housewives, 00:48:46.520 |
And I wish I had known this when I was going to school. 00:48:54.640 |
if I get really excited about it or about anything, 00:49:03.440 |
We have like 98% approval, which is phenomenal. 00:49:17.880 |
We're trying to tell you how to acquire knowledge 00:49:25.920 |
or how to, you know, we all procrastinate, right? 00:49:37.280 |
Okay, I'm going to skip back a little bit now 00:49:44.640 |
You pointed out that in particular in California, 00:50:04.780 |
you had to do your times tables and your division, 00:50:06.960 |
and, you know, and then your fractions and your exponents, 00:50:09.680 |
and, you know, and they build on one another. 00:50:18.080 |
To some people, they can be like, what is this? 00:50:20.520 |
But the point being that there were a number of things 00:50:32.480 |
and for macromechanics and learning that stuff. 00:50:35.160 |
Okay, and learning from the chalkboard lectures. 00:50:50.320 |
But nowadays, you know, you'll hear the argument, 00:50:52.520 |
well, why should somebody learn how to read a paper map 00:51:00.000 |
they just put it into the top bar function on the internet 00:51:15.260 |
and activity and time and energy in particular 00:51:21.340 |
could be devoted to learning new forms of knowledge 00:51:32.900 |
I mean, I'm of the belief that the brain is doing math 00:51:43.220 |
But how are we to discern what we need to learn 00:51:49.140 |
in terms of building a brain that's capable of learning 00:51:52.420 |
the maximum number of things or even enough things 00:51:55.420 |
so that we can go into this very uncertain future? 00:51:59.760 |
and I know neither of us have a crystal ball. 00:52:05.620 |
And for those of us that didn't learn certain things 00:52:07.940 |
in our formal education, what should we learn how to learn? 00:52:38.900 |
It made it possible for you to do more things 00:52:53.540 |
It's clear that they didn't key in the calculator properly, 00:52:57.060 |
but they didn't recognize that it was a very far, 00:53:03.560 |
because they didn't have a good feeling for the numbers. 00:53:05.740 |
They don't have a good sense of exactly how big 00:53:15.260 |
the benefit is that you can do things faster, better, 00:53:20.300 |
but then you also lose some of your intuition 00:53:23.980 |
if you don't have the procedural system in place. 00:53:26.500 |
- And think about a kid that wants to be a musician 00:53:28.660 |
who uses AI to write a song about a bad breakup 00:53:33.660 |
that then is kind of recovered when they find new love. 00:53:39.140 |
And I'm guessing that you could do this today 00:53:44.360 |
but would you call that kid a songwriter or a musician? 00:53:51.180 |
And then you'd say, well, that's not the same 00:54:01.620 |
they were criticizing people on the acoustic guitar. 00:54:05.780 |
where we look back and say, that's not the real thing. 00:54:08.040 |
You need to get the, so what are the key fundamentals 00:54:14.620 |
because this is how, the way you put it at the beginning 00:54:21.300 |
how your brain is allocating resources, okay? 00:54:24.620 |
So when you're younger, you can take in things. 00:54:30.320 |
For example, how good are you on social media? 00:54:35.320 |
- Well, I do all my own Instagram and Twitter 00:54:42.460 |
in proportion to the amount of time I've been doing it. 00:54:45.600 |
I mean, I'm not the biggest account on social media, 00:54:48.500 |
but for a science health account, we're doing okay. 00:55:05.580 |
- That's a new phenomenon in human evolution. 00:55:09.340 |
I saw people doing that and now I can do it too. 00:55:11.740 |
But the thing is that if you learn how to do that early 00:55:22.100 |
Also, you can have many more, you know, tweets going 00:55:28.700 |
- So on X, I think they still call them tweets 00:55:30.220 |
because you can't, it's hard to verb the letter X. 00:55:37.260 |
It's kind of punk and it's got a black kind of format 00:55:51.020 |
But you know, I walk across campus and I see everybody, 00:55:55.260 |
like half the people are tweeting or, you know, 00:55:58.700 |
they're doing something with their cell phone. 00:56:02.220 |
- And you have beautiful sunsets at the Salk Institute. 00:56:05.580 |
I mean, it is truly spectacular, awe-inspiring 00:56:13.220 |
- And everyone's on their phones these days, sad. 00:56:15.500 |
- And you know, they're looking down at their phone 00:56:20.700 |
I mean, you know, it's amazing what a human being can do, 00:56:22.740 |
you know, when they learn, get into something. 00:56:31.180 |
And you can pick it up later, but you're not quite as agile, 00:56:40.180 |
that doing anything on my phone feels fatiguing 00:56:45.740 |
or even just writing on a laptop or a desktop computer 00:56:51.060 |
If I'm on social media for more than a few minutes, 00:56:53.140 |
I can literally feel the energy draining out of my body. 00:56:57.860 |
- I would, I could do sprints or deadlifts for hours 00:57:08.700 |
I'd like to know what's going on in your brain. 00:57:11.020 |
Why is it, and also I'd like to know from younger people 00:57:16.540 |
I think my guess is that they don't feel fatigued 00:57:26.480 |
I think that it has a lot to do with the foundation 00:57:39.580 |
and they make things easier, some things easier. 00:57:42.460 |
- Yeah, I spent a lot of time in my room as a kid, 00:57:47.420 |
or building fish tanks or reading about fish. 00:57:51.580 |
and then do a lot of procedural-based activities. 00:57:55.860 |
You know, I would read skateboard magazines and skateboard. 00:58:04.620 |
So social media, to me, feels like an energy sink. 00:58:14.900 |
I feel like I don't have a foundation for it. 00:58:17.500 |
It's like, I'm trying to like jerry-rig my cognition 00:58:21.940 |
into doing something that it wasn't designed to do. 00:58:24.720 |
And it's because you don't have the foundation. 00:58:28.180 |
And now you have to sort of use the cognitive powers 00:58:42.420 |
that has everything you need and nothing you don't. 00:58:50.700 |
We should all know that proper hydration is critical 00:58:56.860 |
can diminish your cognitive and physical performance 00:59:00.660 |
It's also important that you're not just hydrated, 00:59:02.600 |
but that you get adequate amounts of electrolytes 00:59:05.940 |
Drinking a packet of LMNT dissolved in water 00:59:09.680 |
that you're getting adequate amounts of hydration 00:59:13.060 |
To make sure that I'm getting proper amounts of both, 00:59:20.220 |
and I drink that basically first thing in the morning. 00:59:22.900 |
I'll also drink a packet of LMNT dissolved in water 00:59:25.040 |
during any kind of physical exercise that I'm doing, 00:59:27.460 |
especially on hot days when I'm sweating a lot 00:59:53.660 |
Today's episode is also brought to us by Joovv. 00:59:56.740 |
Joovv makes medical-grade red light therapy devices. 01:00:00.900 |
that I've consistently emphasized on this podcast 01:00:03.500 |
is the incredible impact that light can have on our biology. 01:00:11.020 |
on improving numerous aspects of cellular and organ health, 01:00:18.320 |
improvements in acne, reduced pain and inflammation, 01:00:27.540 |
and why they're my preferred red light therapy devices 01:00:30.140 |
is that they use clinically proven wavelengths, 01:00:34.260 |
of red light and near infrared light in combination 01:00:48.460 |
you can go to juve, spelled J-O-O-V-V.com/huberman. 01:00:54.980 |
of up to $1,300 now through December 2nd, 2024. 01:01:03.000 |
to get up to $1,300 off select Joovv products. 01:01:03.000 |
I went through and I looked at other people's experiences 01:01:18.500 |
I just wanted to know what people were thinking, 01:01:36.460 |
you know, at the end of the day, she was drained, 01:01:41.740 |
And it was like, you know, working on a machine, 01:01:45.620 |
You know, you're struggling, struggling, struggling 01:01:48.400 |
And then she started, said, "Well, wait a second. 01:01:55.700 |
"You know, what if I treat it like a human being? 01:01:59.440 |
"What if I'm polite instead of, you know, being curt?" 01:02:04.140 |
So she said, "Suddenly, I started getting better answers 01:02:08.300 |
"by being polite and, you know, back and forth 01:02:15.160 |
- So saying, "Could you please give me information 01:02:19.180 |
No, you know, that answer you gave me was fabulous, 01:02:23.260 |
And, you know, now I need you to go on to the next part 01:02:28.100 |
In other words, the way you talk to a human, right, 01:02:45.700 |
and therefore giving her the sorts of answers 01:02:47.500 |
that are more facile for her to integrate with? 01:02:51.620 |
First of all, ChatGPT is mirroring you; 01:02:56.620 |
the way you treat it, it will mirror that back. 01:03:06.540 |
Surprise is, she said, "Once I started treating it 01:03:26.600 |
And so by treating ChatGPT as if it were a human, 01:03:31.620 |
you're taking advantage of all the brain circuits 01:03:39.420 |
but many people really enjoy social media, learn from it. 01:03:51.460 |
according to whether or not we're filtering it 01:03:54.140 |
or whether or not we take a couple of minutes 01:03:58.340 |
Very interesting ideas about locus of self-perception 01:04:08.860 |
and social media could provide me both those things 01:04:12.580 |
And I was thinking to myself, this is crazy, right? 01:04:14.500 |
The raccoon is kind of trivial, but it delighted me, 01:04:22.560 |
Could it be that one of the detrimental aspects 01:04:27.740 |
of social media is that if we're complimenting one another, 01:04:32.160 |
or if we are giving hearts, or we're giving thumbs down, 01:04:36.760 |
or we're doing a clap back, or they're clapping back on us, 01:04:42.240 |
that it isn't necessarily the way that we learned to argue. 01:04:55.880 |
that certain online interactions feel really good, 01:04:59.060 |
and others feel like they kind of grate on me, 01:05:01.720 |
like because there's almost like an action step 01:05:03.800 |
that isn't allowed, like you can't fully explain yourself, 01:05:14.680 |
And I feel the same way about text messaging. 01:05:27.120 |
In fact, this whole text messaging thing is beneath me. 01:05:31.120 |
And over the years, of course, I became a text messenger. 01:05:35.320 |
be there in five minutes, running a few minutes late. 01:05:38.780 |
But I think this notion of what grates on us, 01:06:02.080 |
not woo, biology, woo, science, wellness, energy. 01:06:09.280 |
And years ago, the great Ben Barres, who sadly passed away, 01:06:16.960 |
came to me one day in the hallway and he stopped me 01:06:27.460 |
Why am I more tired today than I was 10 years ago? 01:06:30.660 |
I was like, I don't know, how are you sleeping? 01:06:39.280 |
Like, what is this energy thing that we're talking about? 01:06:46.800 |
that then you either find experiences invigorating 01:06:50.300 |
I wanna make sure we close the hatch on that, 01:06:51.760 |
but I wanna make sure that we relate it at some point 01:06:56.300 |
And why is it that with each passing year of our life, 01:07:05.860 |
- Well, so far you really do have great answers. 01:07:13.400 |
So there's a tremendous amount of learning for me today 01:07:20.140 |
versus 50 years old versus what should they do? 01:07:24.060 |
I mean, we need to integrate with the modern world. 01:07:30.260 |
- People aren't retiring as much, they're living longer. 01:07:34.180 |
but we have to get all get along as they say. 01:07:38.820 |
I think it's true that we all, as we get older, 01:07:44.660 |
if I could use a somewhat different word from energy, 01:08:00.320 |
I think most people won't know what a MOOC is, 01:08:03.240 |
- Okay, this is, they've been around for about, 01:08:05.680 |
actually started at Stanford, Andrew Ng and Daphne Koller. 01:08:05.680 |
to give lectures that are available to anybody in the world 01:08:30.300 |
Any specialty, history, science, music, you know, 01:08:35.220 |
you name it, there's somebody who's done, you know, 01:08:37.580 |
who's an expert on that and wants to tell you, 01:08:39.860 |
because they're excited about what they're doing. 01:08:49.340 |
And so part of the problem is that it gets more difficult. 01:08:58.340 |
if we're gonna stay with this language of energy and vigor. 01:09:03.620 |
As you know, in the cell, there is a physical power plant 01:09:06.700 |
called the mitochondrion, which is supplying us 01:09:11.580 |
with ATP, which is the coin of the realm for the cell 01:09:16.100 |
to be able to operate all of its machinery, right? 01:09:22.320 |
when you get older is that your mitochondria run down. 01:09:26.580 |
- You have fewer of them and they're less efficient. 01:09:40.580 |
but I know it's the case that there are a lot of drugs 01:10:00.980 |
That exercise is the best drug you could ever take. 01:10:14.400 |
It helps your brain, it rejuvenates your brain. 01:10:25.940 |
I run on the beach every day at the Salk Institute. 01:10:29.700 |
I can, and I also, it's on a mesa, 340 feet above. 01:10:34.360 |
So I go down every day and then I climb up the cliff. 01:10:41.980 |
- They are, they are, and so this is something 01:10:51.180 |
So this is, in September, so this is, I think, 01:10:54.660 |
something that people really ought to realize 01:10:57.920 |
is that it's like putting away reserves of energy 01:11:06.580 |
The more you put away, the better off you are. 01:11:10.380 |
Okay, now this is jumping now to Alzheimer's. 01:11:13.540 |
So a study that was done in China many, many years ago, 01:11:31.900 |
and they went and they had three populations. 01:11:35.220 |
They had peasants who had almost no education, 01:11:38.620 |
then they had another group that had high school education, 01:11:40.580 |
and then people who were, you know, advanced education. 01:11:43.940 |
So it turns out that the onset of Alzheimer's 01:11:47.320 |
was earlier for the people who had no education, 01:12:02.700 |
So one possibility, and obviously we don't really know why, 01:12:06.580 |
but one possibility is that the more you exercise your brain 01:12:11.580 |
with education, the more reserve you have later in life. 01:12:29.500 |
I'll read slowly, or I'll see where my default pace 01:12:43.580 |
And you can feel the energetic demand of that. 01:12:51.500 |
where I'm not reading at the pace that is reflexive, 01:13:02.020 |
And I learned this when I had a lot of catching up to do 01:13:11.780 |
and I have to go back and learn how to learn, you know? 01:13:33.500 |
of trying to get good at things like skateboarding 01:13:38.460 |
There's a certain thing that happens when skateboarding, 01:13:42.620 |
where it's actually easier to learn something going faster. 01:13:47.620 |
You know, most kids try and learn how to ollie and kickflip 01:13:53.120 |
It's all easier going a bit faster than you're comfortable. 01:13:57.460 |
It's also the case that if you're not paying attention, 01:14:08.740 |
I was able to translate into an understanding 01:14:11.580 |
of when I sit down to read a paper or a news article, 01:14:36.680 |
I don't know if it's incorporated into your learning 01:14:48.840 |
And this is why I think that social media is detrimental. 01:14:54.120 |
to be slow, passive, and multi-context cycling through. 01:15:05.680 |
forgive the language, but I'm going to be blunt here, 01:15:09.900 |
unless we make it a point to also engage learning. 01:15:12.720 |
And my guess is it's tapping into this mitochondrial system. 01:15:18.780 |
By the way, the way that you've adjusted the speed 01:15:23.320 |
is very interesting because it turns out that stress, 01:15:29.200 |
but no, it turns out stress that is transient, 01:15:45.800 |
In other words, I run like hell for about 10 seconds. 01:15:53.840 |
And it's pushing your body into that extra gear 01:16:07.760 |
not from just doing the same running pace every day. 01:16:16.800 |
You've always had a slight forward center of mass 01:16:22.800 |
And even the speed at which you walk, Terry, dare I say, 01:16:36.960 |
The reason to not slow down too much for too long 01:16:43.600 |
the energy of the brain and body, as you point out, 01:16:46.840 |
And I do think that below a certain threshold, 01:16:52.440 |
it's hard to exercise without getting very depleted 01:16:55.440 |
or even injured, that we need to maintain this. 01:16:58.160 |
So perhaps now would be a good time to close the hatch 01:17:06.460 |
Everyone should take this learning to learn course 01:17:14.560 |
do you think that young people and older people now, 01:17:20.600 |
I'm 49, so I'll put myself in the older bracket, 01:17:29.980 |
And again, it's just like new technology comes along, 01:17:38.000 |
You know, they're using it a lot more than I am. 01:17:49.440 |
It's a tool that you need to know how to use it. 01:18:00.820 |
This was suggested to me by somebody expert in AI 01:18:09.320 |
but I'll tell you, I really like the aesthetic of Claude AI. 01:18:18.500 |
I like the Apple brand and it gives me answers. 01:18:24.040 |
Maybe this goes back to the example you used earlier 01:18:25.880 |
where I like Claude AI and I'm a big fan of it. 01:18:33.780 |
except that it gives me answers in a bullet pointed format 01:18:39.480 |
to transfer that information into my brain or onto a page. 01:18:47.800 |
for sake of getting smarter, learning knowledge, 01:18:51.680 |
just for the sake of knowledge, having fun with it? 01:19:20.120 |
as the first word in my book, because it's iconic. 01:19:24.240 |
But some of them, I have to say that, for example, 01:19:29.080 |
there are some that are really much better at math 01:19:34.120 |
- Google's Gemini recently did some fine tuning 01:19:44.520 |
And when you reason, you go through a sequence of steps. 01:19:50.840 |
of first finding out what's missing and then adding that. 01:20:04.160 |
- And as people hear that, they probably think, 01:20:07.680 |
But could you imagine any human or panel of humans 01:20:11.180 |
behind a wall where if you asked it a question 01:20:14.240 |
and then another question and another question, 01:20:23.200 |
- So I think we are being perhaps a little bit unfair 01:20:31.240 |
to compare these large language models to the best humans 01:20:39.120 |
As you said, most people couldn't pass the LSAT, 01:20:45.560 |
or MCAT, the test to get into medical school. 01:20:50.760 |
- Is there a world now where we take the existing AI, 01:21:00.560 |
that can learn like a collection of human brains 01:21:03.160 |
and send that somehow into the future, right? 01:21:25.920 |
- I think that's perhaps the better question in some sense, 01:21:35.400 |
but we can perhaps travel into the future with AI 01:21:46.840 |
or space travel experts, or sea travel experts, 01:21:55.440 |
You're just gonna work for the next 48 hours. 01:21:58.440 |
In fact, you're gonna work for the next three weeks 01:22:04.320 |
You're not gonna pay attention to your health. 01:22:14.140 |
and then have that fleet of large language models come back 01:22:18.360 |
and give us the information like, I don't know, tomorrow. 01:22:23.960 |
Back in the 1980s, I was just starting my career 01:22:31.160 |
learning algorithms for neural network models. 01:22:37.040 |
and he actually won a Nobel Prize for this recently. 01:22:49.600 |
on machine learning and then back propagation and so forth. 01:22:54.280 |
But back then, Jeff and I had this view of the future. 01:22:59.280 |
AI was dominated by symbol processing, rules, logic, right? 01:23:06.520 |
For every problem, you need a different computer program, 01:23:08.640 |
and it was very human resource intensive to write programs 01:23:18.480 |
They never wrote a program for vision, for example, 01:23:27.480 |
We had this view that nature has solved these problems, 01:23:33.520 |
Look, every animal can see, even insects, right? 01:23:43.640 |
We can actually, again, going back to algorithms, 01:23:45.760 |
I was telling you, and so in the case of the brain, 01:23:49.760 |
what makes it different from a digital computer, 01:23:51.520 |
digital computers basically can run any program, 01:23:54.000 |
but a fly brain, for example, only runs the program 01:23:57.560 |
that a special purpose hardware allows it to run. 01:24:02.000 |
- There's enough there, just enough habituation and so forth 01:24:09.800 |
I'm not trying to be disparaging to the fly biologists, 01:24:16.040 |
of the human brain to customize to a world of experience. 01:24:21.320 |
I think about a really cool set of neural circuits 01:24:23.800 |
that work really well to avoid getting swatted, 01:24:28.120 |
to eating, and to reproducing, and not a whole lot else. 01:24:38.480 |
It's just sort of like, it's not that it doesn't matter. 01:24:41.560 |
It's just a question of the lack of plasticity 01:24:47.240 |
- Okay, I can see I've pressed your button here. 01:24:50.920 |
They taught us about algorithms for direction selectivity 01:24:56.500 |
I just think that the lack of neuroplasticity 01:25:02.760 |
and the reason we're the curators of the earth 01:25:09.960 |
one step at a time, nature first has to be able 01:25:17.320 |
as the environment gets more complex, and here we are, 01:25:21.840 |
but the key is that it turns out that certain algorithms 01:25:34.280 |
in terms of training it to, when you give it a reward, 01:25:48.440 |
that algorithm is in the fly brain, it's in your brain, 01:25:52.520 |
so we can learn about learning from many species. 01:25:58.600 |
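The reward-learning algorithm being described here, shared between the fly brain and ours, is essentially temporal-difference (TD) learning, where a dopamine-like prediction error nudges value estimates toward reality. A minimal sketch (not from the episode; the toy environment, states, and parameter values are invented for illustration):

```python
# Minimal sketch of temporal-difference (TD) learning, the kind of
# reward-prediction algorithm discussed above. All states, rewards,
# and parameter values here are invented for illustration only.

ALPHA = 0.1   # learning rate
GAMMA = 0.9   # discount factor for future reward

def td_update(value, state, next_state, reward):
    """One TD(0) step: move V(state) toward reward + GAMMA * V(next_state)."""
    # The TD error plays the role of the dopamine-like prediction-error signal.
    td_error = reward + GAMMA * value[next_state] - value[state]
    value[state] += ALPHA * td_error
    return td_error

# Toy 3-state chain: s0 -> s1 -> s2, with reward 1.0 delivered on reaching s2.
value = {0: 0.0, 1: 0.0, 2: 0.0}
for _ in range(200):  # repeated episodes let the estimates converge
    td_update(value, 0, 1, reward=0.0)
    td_update(value, 1, 2, reward=1.0)

# V(s1) approaches 1.0, and V(s0) approaches GAMMA * V(s1) = 0.9:
# the reward's value propagates backward to earlier predictive states.
print(round(value[0], 2), round(value[1], 2))
```

The key point of the sketch is that nothing in it is species-specific: the same three-line update rule works whether the "states" are odors for a fly or choices for a person.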
I actually think Drosophila has done a great deal, 01:26:12.080 |
because they actually like the feeling of being caffeinated. 01:26:19.300 |
- No, I fully absorb and agree with the value 01:26:23.800 |
of studying simpler organisms to find the algorithms. 01:26:31.960 |
Now, I'm telling the story about where we were. 01:26:36.880 |
We were saying, this is an alternative to traditional AI. 01:26:42.940 |
Everybody was, experts said, "No, no, write programs, 01:26:47.480 |
They were getting all the resources, the grants, the jobs, 01:26:51.240 |
and we were just like the little furry mammals 01:26:53.680 |
under the feet of these dinosaurs, right, in retrospect. 01:27:01.640 |
- But the point I'm making is that it's possible 01:27:13.780 |
- Yeah, I mean, the reason I'm excited about AI, 01:27:18.780 |
and increasingly so across the course of this conversation, 01:27:34.400 |
I mean, if there's one thing that we are truly a slave to 01:27:42.080 |
And even if you don't, your cognition really waxes 01:28:03.040 |
And the idea that they can provide a portal into the future 01:28:11.480 |
I'm not saying we have to implement their advice, 01:28:19.400 |
computationally diverse, experientially diverse 01:28:26.240 |
and bring us back a panel of potential routes to take, 01:28:51.200 |
that you get some improvement in the motor symptoms 01:28:58.540 |
"No, that's not really the basis of schizophrenia. 01:29:04.680 |
And we even have a department at Stanford now focusing, 01:29:10.300 |
focusing on what Chris really founded as a field, 01:29:29.040 |
And he's looked at ways that people can use ketogenic diet, 01:29:34.520 |
and in some cases, maybe even cure schizophrenia. 01:29:38.560 |
where we still don't have a "cure" for schizophrenia, 01:29:56.600 |
different positive and negative result clinical trials 01:30:01.080 |
10,000 subjects in Scandinavia who go on ketogenic diet, 01:30:05.680 |
who have a certain level of susceptibility to schizophrenia 01:30:12.000 |
things that never, ever, ever would be possible to do 01:30:26.760 |
But so are human brains coming up with these experiments. 01:30:39.740 |
into what might be happening or is likely to happen. 01:30:44.720 |
I'm pretty sure that if we had these large language models 01:30:51.680 |
that ketamine would have been a really good drug 01:31:10.360 |
- Okay, so one of the things now that we know, 01:31:14.040 |
see, the problem is that if you look at the end point, 01:31:17.020 |
that doesn't tell you what started the problem. 01:31:35.160 |
- So what is the concordance in identical twins? 01:31:41.520 |
and one is destined to be full-blown schizophrenic, 01:31:47.640 |
Okay, this has been replicated many, many times, 01:31:53.000 |
Oh no, actually, okay, let me start with a human. 01:32:03.220 |
- I've never taken it, but this is what I hear. 01:32:10.080 |
'cause I've talked to these people who've done this. 01:32:24.200 |
you take young adults, here's what they experience. 01:32:42.560 |
Now, if they just go and have one experience, 01:32:48.600 |
but if they have two, they party two days in a row, 01:33:15.680 |
We say that, my God, this person here is really, 01:33:19.080 |
has become a schizophrenic, and this is really, 01:33:27.080 |
However, if you isolate them for a couple days, 01:33:37.240 |
a form of schizophrenia, psychosis, temporarily, 01:33:45.760 |
Okay, and there's another literature on this. 01:33:47.400 |
It turns out that it binds to a form of receptor, 01:33:55.880 |
for learning and memory, but we know the target, 01:34:01.600 |
that it reduces the strength of the inhibitory circuit, 01:34:06.600 |
the interneurons that use inhibitory transmitters, 01:34:10.400 |
the enzyme that creates the inhibitory transmitter 01:34:18.700 |
and what does that mean, when there's more excitation? 01:34:20.460 |
It means that there's more activity in the cortex, 01:34:38.220 |
and now there's a whole field now in psychiatry 01:34:43.880 |
for the first, where the actual imbalance first occurs. 01:34:54.000 |
and inhibitory systems that are in the cortex 01:35:03.200 |
- Yes, they are glutamate. - They're one class. 01:35:10.620 |
for why ketamine might be good for depression. 01:35:15.460 |
People are taking it now who are depressed, right? 01:35:18.700 |
So here you have a drug that causes overexcitation, 01:35:23.140 |
and here you have a person who is underexcited. 01:35:26.060 |
Depression is associated with lower excitatory activity 01:35:36.580 |
So what you do is you fight depression with schizophrenia, 01:36:01.180 |
the better we are going to be at extrapolating 01:36:09.320 |
By the way, I'm pretty sure that the large language models 01:36:20.900 |
how would we have used these large language models long ago? 01:36:37.740 |
- At that time, it was like the dopamine hypothesis 01:36:41.380 |
There was a little bit about glutamate, perhaps, 01:36:47.340 |
So how would the large language models have discovered this? 01:36:53.020 |
Ketamine, by the way, is very similar to PCP, 01:36:57.140 |
phencyclidine, which also binds the NMDA receptor. 01:37:04.600 |
- Which is also, yeah, not one I recommend, nor ketamine. 01:37:07.960 |
Frankly, I don't recommend any recreational drugs, 01:37:12.180 |
But what would those large language models do if they, 01:37:16.420 |
so you've got 2024 technology placed into 1998. 01:37:26.660 |
Like, hey, this stuff is gonna turn out to be wrong, 01:37:30.300 |
- Okay, okay, you know, this is all very, very speculative. 01:37:35.300 |
And really, we can begin actually to see this happening now. 01:37:40.740 |
So I have a colleague at the Salk Institute, Rusty Gage, 01:38:01.100 |
- Yeah, that was around 1998 that Rusty did that. 01:38:09.100 |
actually, the effects of exercise on neurogenesis. 01:38:37.060 |
that they were able to, later in post-mortem, 01:38:39.580 |
to actually see that they were born in the adult. 01:38:57.180 |
happened to talk about this issue about, you know, 01:39:02.940 |
he's using these large language models now for his research. 01:39:16.780 |
Well, you know, we give it all of the experiments 01:39:22.700 |
the literature, it's access to the literature and so forth, 01:39:31.900 |
that works at Google, and he's one of the main people there 01:39:36.900 |
in terms of voice-to-text, and text-to-voice software. 01:39:42.220 |
And he showed me something, I'll provide a link to it, 01:39:45.140 |
'cause it's another one of these open resource things. 01:39:51.580 |
I don't get an F in technology, I don't get an A+. 01:39:54.220 |
I'm kind of in the middle, so I think I'm pretty representative 01:39:56.100 |
of the average listener for this podcast, presumably. 01:40:01.260 |
you open up this website, and you can take PDFs, 01:40:04.660 |
or you take URLs, so websites, website addresses, 01:40:35.720 |
in the two examples of the effects of a drug, 01:40:41.220 |
one being very strong, and one being very weak, 01:40:44.260 |
which of these papers do you think is more rigorous, 01:40:51.020 |
but also kind of the strength of the findings? 01:41:09.380 |
to understand strength of findings, and even that. 01:41:11.780 |
And what's amazing is, it starts giving back answers, 01:41:15.060 |
like, well, if you're concerned about number of subjects, 01:41:26.820 |
that they used in these papers in very sophisticated ways, 01:41:33.540 |
may not be interesting, and others are more interesting, 01:41:38.900 |
And then you say, well, with that weighted evidence, 01:41:46.540 |
where it starts trying to predict the future, 01:41:49.040 |
based on 10 papers that you gave it five minutes ago. 01:41:57.160 |
except in their very specific area of interest, 01:42:00.060 |
and if they were already familiar with the papers, 01:42:02.340 |
and it would take them many hours, if not days, 01:42:10.960 |
Yeah, so this is, so actually this is something 01:42:17.100 |
for doctors who are using AI as an assistant. 01:42:23.000 |
So, and this is dermatology, it was a paper in Nature, 01:42:28.660 |
2,000 skin lesions, and some of them are cancerous, 01:42:36.340 |
And so, in any case, they tested the expert doctors, 01:42:39.580 |
and then they tested an AI, and they were both doing 01:42:56.700 |
So it turns out that, although they got the same 90%, 01:43:06.140 |
and so it could look at the lesions that were rare, 01:43:14.140 |
of the most common ones that he's seen over and over again, 01:43:19.300 |
But so, putting them together, it makes so much sense 01:43:23.120 |
that they're gonna improve if they work together. 01:43:26.300 |
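The complementarity point here, that two 90% performers who err on different cases can beat either alone, can be made concrete with a toy calculation (all numbers are invented; the "union" combiner is an idealized upper bound, assuming each case can be routed to whichever expert handles that kind of lesion better):

```python
# Toy illustration of complementary experts: a doctor and an AI each score
# 90%, but miss DIFFERENT cases. Routing each case to the right expert can
# therefore beat either alone. All case counts below are invented.

cases = set(range(100))
doctor_correct = set(range(90))                      # misses the 10 rarest lesions
ai_correct = set(range(10)) | set(range(20, 100))    # gets the rare ones, misses 10 common ones

def accuracy(correct):
    """Fraction of all cases this expert (or combination) gets right."""
    return len(correct & cases) / len(cases)

# Idealized combination: a case counts as correct if EITHER expert gets it,
# i.e. an oracle that always defers to the better-suited expert.
combined = doctor_correct | ai_correct

print(accuracy(doctor_correct), accuracy(ai_correct), accuracy(combined))
```

In this toy setup both individual accuracies are 0.9 but the combination reaches 1.0; with real experts the gain is smaller, since no router picks the right expert every time, but the mechanism (non-overlapping error sets) is the same one described for the dermatology study.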
And I think that now, what you're saying is that 01:43:37.380 |
and looking at the arguments, the statistical arguments, 01:43:41.500 |
and also looking at the paper, maybe in a new way, 01:43:48.860 |
Everybody's worried about, oh, AI's gonna replace us. 01:43:52.700 |
It's gonna be much better than we are at everything, 01:44:01.700 |
and by working together, it's gonna strengthen both 01:44:16.980 |
- Would you say that's the case for things like 01:44:40.460 |
there's occasionally an accident on the freeway. 01:44:43.100 |
You have a lot of cameras over freeways nowadays. 01:44:48.380 |
You can imagine all of the data being sent in in real time, 01:44:51.300 |
and you could probably predict accidents pretty easily. 01:44:55.580 |
I mean, these are just moving objects, right, 01:44:57.140 |
at a specific rate, who's driving haphazardly, 01:45:00.340 |
but you could also potentially signal takeover of the brakes 01:45:05.140 |
or the steering wheel of a car and prevent accidents. 01:45:13.620 |
Well, let's do something even more important. 01:45:20.180 |
but could you predict physical events in the world 01:45:26.500 |
- Okay, this has already been done, not for traffic, 01:45:30.840 |
As you know, the weather is extremely difficult to predict, 01:45:44.220 |
But now what they've done is to feed a lot of previous data 01:45:49.220 |
from previous hurricanes and also simulations of hurricanes. 01:45:57.940 |
It takes days and weeks, so it's not very useful 01:46:06.520 |
But what they did was, after training up the AI 01:46:14.800 |
exactly where in Florida it's gonna make a landfall. 01:46:19.800 |
And it does that on your laptop in 10 minutes. 01:46:28.620 |
and it's probably obvious to you and to most people, 01:46:37.140 |
we were talking about the acquisition of knowledge 01:46:47.100 |
in the form of physical action or cognitive action, right? 01:46:50.180 |
Math problem is cognitive action, physical action. 01:46:52.760 |
AI can do both knowledge acquisition, it can learn facts, 01:46:58.240 |
long lists of facts and combinations of facts, 01:47:00.720 |
but presumably it can also run a lot of problem sets 01:47:05.960 |
I don't think, except with some crude, still to me, 01:47:10.040 |
examples of robotics, that it's very good at action yet, 01:47:13.340 |
but it will probably get there at some point. 01:47:20.680 |
But it seems to me that as long as they can acquire knowledge 01:47:29.960 |
different iterations of combinations of knowledge 01:47:35.480 |
to take any data about prior events or current events 01:47:40.160 |
and make pretty darn good predictions about the future 01:47:48.960 |
that they could play out the different iterations. 01:47:54.560 |
that seems to have really vexed neuroscientists 01:47:56.880 |
and the field of medicine and the general public 01:48:03.420 |
I've heard so many different hypotheses over the years. 01:48:07.800 |
I think we're still pretty much in the fog on this one. 01:48:10.560 |
Could AI start to come up with new and potential solutions 01:48:32.400 |
In other words, we will use those tools the best we can, 01:48:36.020 |
'cause obviously if you can make any progress at all 01:48:39.560 |
and jump into the future, wow, that would save lives. 01:48:45.280 |
I mean, I really think the promise here is so great 01:48:51.880 |
we just, we really, really have to really push. 01:49:11.440 |
that otherwise they would have had difficulty with 01:49:14.600 |
It's beginning to happen, but these are early days. 01:49:26.480 |
after the first flight of the Wright brothers. 01:49:31.920 |
- The achievement that the Wright brothers made 01:49:36.600 |
and to power forward with a human being 100 feet. 01:49:43.440 |
And it took an enormous amount of improvements. 01:49:46.340 |
The most difficult thing that had to be solved was control. 01:49:50.160 |
How do you make it go in the direction you want it to go? 01:50:01.200 |
but who knows where it will take us into the future. 01:50:10.160 |
that leads to difficulty in smooth movement generation 01:50:14.880 |
and also some cognitive and mood-based dysfunction. 01:50:32.120 |
It's very interesting because the dopamine cells 01:50:36.280 |
are at a particular part of the brain, the brainstem, 01:50:52.200 |
it's a global signal, it's called a neuromodulator 01:51:05.800 |
sequences of actions that produce survival, for survival. 01:51:10.800 |
But the problem is that with certain environmental insults, 01:51:29.820 |
And when they die, you get all of the symptoms 01:51:51.480 |
They were still alive, but they just didn't move at all. 01:52:00.920 |
So when the first trials of L-DOPA were given to them, 01:52:05.920 |
it was magical because suddenly they started talking again. 01:52:13.640 |
- I'm curious, when they started talking again, 01:52:17.680 |
during the locked-in phase was slow velocity? 01:52:24.980 |
or were they in there like screaming to get out? 01:52:27.720 |
Because their physical velocity obviously was zero. 01:52:33.300 |
And I've long wondered when coming back from a run 01:52:37.920 |
or from waking up from a great night's sleep, 01:52:47.280 |
- Okay, that's a wonderful observation or a question. 01:52:52.500 |
Okay, here's something that is really amazing. 01:53:03.900 |
but to them, cognitively, they think they're moving fast. 01:53:10.040 |
because you can say, well, can you move faster? 01:53:15.800 |
But to them, they think they're moving at super velocities. 01:53:24.340 |
And as the set point gets further and further down, 01:53:27.780 |
without moving at all, they think they're moving, right? 01:53:33.260 |
By the way, you can ask them, what was it like? 01:53:35.340 |
We were talking to you, and you didn't respond. 01:53:48.820 |
or they couldn't initiate, they couldn't initiate actions. 01:53:52.340 |
That's one of the things that they have trouble with, 01:54:00.220 |
And again, there may be a better or more accurate 01:54:24.880 |
as at least one metric that relates to brain state. 01:54:48.560 |
and computational neuroscientists in your case. 01:54:58.760 |
But I think the more that people think about this, 01:55:01.900 |
I'll venture to say that the more that they think 01:55:10.700 |
For me, it tends to be early to late mid morning. 01:55:38.060 |
that I need to meet the demands of that stress. 01:55:40.980 |
I just can't, I can't get to that faster pace 01:56:06.180 |
I think in the morning, I'm better at creative stuff. 01:56:16.920 |
Given the relationship between body temperature 01:56:19.780 |
and circadian rhythm, I would like to run an experiment 01:56:22.980 |
that relates core body temperature to cognitive velocity. 01:56:28.620 |
this is something that is just purely subjective, 01:56:39.940 |
But in the afternoon, I feel a little chilly. 01:56:48.480 |
- Sure, body temperature starts to come down. 01:56:52.920 |
And that may correspond to the loss of energy. 01:56:56.160 |
You know, the amount of the ability for the brain 01:57:10.600 |
And so, yeah, so if the body temperature is doing this, 01:57:13.120 |
then all the cells are doing this too, right? 01:57:19.500 |
- Yeah, Craig Heller, my colleague at Stanford 01:57:21.620 |
in the biology department has beautifully described 01:57:33.500 |
when people are trying to move some resistance, 01:57:43.760 |
that don't allow the muscles to contract the same way. 01:57:52.520 |
are so beautifully controlled by temperature. 01:57:55.560 |
And of course, his laboratory is focused on ways 01:57:57.320 |
to bypass those temperature or to change temperature locally 01:58:09.240 |
about what it would mean for cognitive velocity. 01:58:16.140 |
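The speakers don't give a formula here, but the standard way to quantify how temperature scales biochemical reaction rates is the Q10 coefficient (a rate roughly doubles per 10 °C rise when Q10 ≈ 2). This is a sketch of that rule of thumb, not anything from Heller's lab specifically; the function name and the Q10 = 2 default are my own illustrative assumptions.

```python
def rate_at_temp(rate_ref, temp_ref_c, temp_c, q10=2.0):
    """Q10 rule of thumb: a reaction's rate multiplies by q10
    for every 10 degrees C increase in temperature.
    (Illustrative only; q10 varies by enzyme and tissue.)"""
    return rate_ref * q10 ** ((temp_c - temp_ref_c) / 10.0)

# A 1 degree C drop from 37 C with Q10 = 2 slows the rate by about 7%:
print(rate_at_temp(1.0, 37.0, 36.0))  # ~0.933
```

On this rough model, even the small daily swing in core body temperature shifts every temperature-sensitive rate in the same direction at once, which is consistent with the "all the cells are doing this too" point above.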
as opposed to just thinking about like a drug. 01:58:18.700 |
You know, you increase dopamine and norepinephrine 01:58:20.540 |
and epinephrine, the so-called catecholamines, 01:58:23.260 |
and you're gonna increase energy, focus, and alertness, 01:59:26.580 |
You're gonna have a trough in energy, focus, and alertness 01:59:28.800 |
that's proportional to how much greater it was 01:58:41.200 |
Of course, you know, it's, I fully understand 01:58:46.720 |
And the reality is you don't actually accomplish 01:58:55.840 |
of what is going to be the consequence on cognition 01:59:11.820 |
And you know, I really would like to know the answer. 01:59:21.220 |
You know, the drug that causes the brain to be activated. 01:59:32.520 |
when it wears off, you have no energy, right? 01:59:41.160 |
And so, and, but that's why you take more of it. 01:59:43.760 |
You see, that's the problem is it's a spiral. 01:59:46.220 |
- I love how today you're making it so very clear 02:00:00.120 |
that we think about these biological problems, 02:00:08.800 |
I want to make sure that we talk about a couple of things 02:00:10.920 |
that I know are in the back of people's minds, 02:00:19.280 |
Normally, I don't like to talk about these things, 02:00:22.760 |
but because I find the discussions around them 02:00:25.500 |
typically to be more philosophical than neurobiological. 02:00:35.000 |
who is a real, I think he has a book about free will. 02:00:46.760 |
And is it even a discussion that we should be having? 02:00:52.320 |
it's the Middle Ages, the concept didn't exist, 02:01:00.040 |
Because everybody, it was the way that humans felt 02:01:09.360 |
and its impact on them was that it's all fate. 02:01:23.600 |
because of what's going on in the gods up above, 02:01:28.080 |
You attribute it to the physical forces around you 02:01:34.960 |
not to something that caused this to happen to you, right? 02:01:42.880 |
that we use, free will, consciousness, intelligence, 02:01:56.560 |
And it's tough to solve a problem, a scientific problem, 02:02:01.500 |
if you don't have a definition that you can agree on. 02:02:18.920 |
is we don't understand what understanding is. 02:02:22.320 |
Literally, we don't have a really good argument 02:02:25.900 |
or a measure that you could measure someone's understanding 02:02:29.040 |
and then apply it to the GPT and see whether it's the same. 02:03:06.740 |
Well, okay, now there's a big diversity amongst humans too. 02:03:11.880 |
Certain colleagues of ours at UCSD years ago, 02:03:29.120 |
would just kind of turn to me and start talking 02:03:46.280 |
and you kind of want to like, "Is he an alien?" 02:04:01.980 |
That's a problem that, I mean, in other words, 02:04:05.500 |
there are very high functioning autistic people out there. 02:04:11.160 |
There are people with autism who are brilliant, 02:04:24.040 |
to see what kind of information they forage for? 02:04:27.020 |
- It seemed like it would be a really important thing to do. 02:04:33.240 |
where they took the LLM and they fine-tuned it 02:04:36.620 |
with different data from people with different disorders, 02:04:54.660 |
just like those people who have these disorders. 02:05:05.060 |
- I haven't seen that, but it's pretty clear that, 02:05:07.860 |
to me at least, that if you can do sociopathy, 02:05:10.860 |
you can probably do any political belief, you know. 02:05:29.520 |
and find out what kind of information that person brings, 02:05:49.260 |
"did you see that interaction between so-and-so?" 02:05:55.760 |
"but I did not pick up on what you were picking up on." 02:05:58.300 |
And it was clear that there's two very different experiences 02:06:00.860 |
of the same content based purely on a difference 02:06:06.260 |
- Okay, there's a lot of information that, as you point out, 02:06:17.800 |
You know, there's a tremendous amount of information 02:06:23.300 |
but with all the other parts of the visual input 02:06:30.580 |
There's a tremendous variability between individuals. 02:06:34.060 |
And, you know, biology is all about diversity, 02:06:41.180 |
so that you can evolve and survive catastrophic changes 02:07:00.220 |
that could understand what those differences are? 02:07:13.780 |
- Yeah, so here's how, what you'd have to do. 02:07:15.620 |
What you'd have to do is to train it up on data 02:07:18.940 |
from a bunch of individuals, human individuals. 02:07:30.060 |
You have to tell it what you're expecting from it. 02:07:37.380 |
- If you, I once gave it an abstract from a paper, 02:07:47.020 |
"I want you to explain this abstract to a 10-year-old." 02:07:50.560 |
It did it in a way that I could never have done it. 02:07:59.340 |
but it explained, you know, what plasticity was 02:08:05.340 |
- Almost like a qualifying exam for a graduate student. 02:08:07.740 |
I saw something today on X, formerly known as Twitter, 02:08:11.500 |
that blew my mind that I wanted your thoughts on 02:08:14.140 |
that is very appropriate to what you're saying right now, 02:08:17.060 |
which is someone was asking questions of an LLM 02:08:39.660 |
and start looking at pictures of landscapes in Yosemite. 02:08:50.700 |
or what any kind of person online would do, 02:08:55.320 |
and look at a couple of pictures of something they, 02:09:03.780 |
that it can imagine things that aren't there, 02:09:45.940 |
or something that happened to you during the day, 02:09:48.180 |
right, your brain is always generating internally. 02:09:55.260 |
one of these large language models just goes blank. 02:10:07.700 |
and in particular brain activity during sleep, 02:10:16.380 |
shaping the knowledge that we experience during the day. 02:10:23.100 |
So these LLMs are not quite where we are at yet. 02:10:28.460 |
I mean, they can outperform us in certain things like Go, 02:10:44.780 |
And so this is something I'm working on myself, actually, 02:10:47.700 |
trying to understand how that's done in our own brains, 02:11:06.160 |
and you hear the words one after the next over an hour, 02:11:18.500 |
Somehow you're able to integrate all that information 02:11:21.860 |
over the hour and then use your long-term memory 02:11:29.140 |
How does your brain remember all that information? 02:11:37.940 |
that neuroscientists study is only for a few seconds, 02:11:40.380 |
right, or maybe a telephone number or something. 02:11:43.140 |
But we're talking about long-term working memory. 02:11:53.580 |
can do something, it's called in-context learning. 02:12:04.500 |
and then all it does after that is to inference, 02:12:07.780 |
you know, fast loop of activity one word after the next, 02:12:12.180 |
right, that's what happens with no learning, no learning. 02:12:16.540 |
But it's been noticed that as you continue your dialogue, 02:12:31.940 |
We don't know the answer to that question yet. 02:12:34.180 |
But we also don't know what the answer is, 02:12:41.900 |
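One mechanism that has been proposed for in-context learning is an "induction head": with the weights completely frozen, the model finds an earlier occurrence of the current token in the context window and predicts whatever followed it. The discussion above doesn't name this mechanism, so treat this as one hypothesis sketched as a toy; the function and examples are mine.

```python
def induction_predict(context):
    """Toy 'induction head': with no weight updates at all, predict the
    token that followed the most recent earlier occurrence of the
    current (last) token in the context window."""
    current = context[-1]
    # Scan backwards over earlier positions for the same token.
    for i in range(len(context) - 2, -1, -1):
        if context[i] == current:
            return context[i + 1]
    return None  # pattern never seen in this context

# The same frozen mechanism adapts to whatever pattern the dialogue contains:
print(induction_predict(["cat", "sat", "mat", "cat"]))  # sat
print(induction_predict(["go", "stone", "win", "go"]))  # stone
```

Nothing changes in the "weights" between the two calls; all of the apparent learning lives in the growing context, which is what makes in-context learning puzzling.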
and as it relates to science and your trajectory? 02:12:48.500 |
do you have a practice of meditation or eyes closed, 02:13:10.020 |
'cause I get my best ideas, not sprinting on the beach, 02:13:13.060 |
but you know, just either walking or jogging. 02:13:22.420 |
I think that that stimulates ideas and thoughts. 02:13:30.260 |
and I can't remember any of those great ideas. 02:13:48.460 |
You know, if you're running in a steady pace, 02:13:52.700 |
nothing distracting about, you know, the beach. 02:13:57.860 |
- No, I never listen to anything except my own thoughts. 02:14:01.500 |
- So there's a former guest on this podcast who, 02:14:05.100 |
she happens to be triple degreed from Harvard, 02:14:07.180 |
but she's more in the kind of like personal coach space, 02:14:10.700 |
but very, very high level and impressive mind, 02:14:18.860 |
that can be used to accomplish a number of different things, 02:14:30.740 |
or maybe once a day of very minimal sensory input, 02:14:35.460 |
no lecture, no podcast, no book, no music, nothing, 02:14:48.540 |
or paying attention to anyone else's thoughts 02:14:51.020 |
through those media venues in any kind of structured way 02:15:04.460 |
And it's often when you have an aha moment, right? 02:15:17.420 |
that is logical, you know, hopping from thing to thing. 02:15:42.060 |
I used to do experiments where I was, you know, 02:15:43.820 |
like pipetting and running immunohistochemistry. 02:15:57.380 |
I'm both working and relaxing and thinking of things. 02:16:47.020 |
nobody has told us how the brain works, right? 02:17:09.000 |
You know, you're struggling with some problem at night 02:17:13.640 |
and you go to bed and you wake up in the morning. 02:17:23.860 |
I wouldn't say insight and not always meaningful insight, 02:17:33.000 |
That's the thing that is so amazing about sleep. 02:17:36.260 |
And you can see people who know this can count on it. 02:17:46.540 |
Your brain works on it during the sleep period, right? 02:17:50.740 |
because then who knows what your brain's gonna work on. 02:17:53.440 |
You know, use the time before you fall asleep 02:17:57.660 |
to think about something that is bothering you 02:18:01.140 |
you're trying to understand, maybe, you know, 02:18:03.960 |
a paper that you've, you read the paper and say, 02:18:06.200 |
oh, you know, I'm tired, I'm gonna go to sleep. 02:18:14.280 |
once you know something about how the brain works, 02:18:40.680 |
or any good understanding, first of all, why we dream. 02:18:45.480 |
We still, I mean, it's still not completely clear. 02:18:52.720 |
Is this, does that have some significance for you? 02:19:01.500 |
is that, you know, the dreams are often very visual, 02:19:11.520 |
All the neuromodulators are downregulated during sleep 02:19:22.060 |
but it doesn't come up in the prefrontal cortex, 02:19:24.460 |
which means that the circuits in the prefrontal cortex 02:19:27.680 |
that are interpreting the sensory input coming in 02:19:34.360 |
So any of these, whatever happens in your visual cortex 02:19:42.240 |
that you start floating and, you know, things happen to you 02:19:48.200 |
And so, but that still doesn't explain why, right? 02:19:55.040 |
and there are some sleeping pills that do block it, 02:20:14.920 |
the days and weeks and months after cannabis. 02:20:21.120 |
- No, no, it's an imbalance that was caused, 02:20:31.320 |
And now it's gotta go back and then it takes time, 02:21:02.840 |
that dreams, in particular rapid eye movement dreams, 02:21:12.160 |
LSD (lysergic acid diethylamide), or psilocybin, 02:21:12.160 |
and that perhaps dreams are revealing the unconscious mind, 02:21:25.480 |
can't control thought and action in the same way, obviously, 02:21:29.360 |
and it's sort of a recession of the waterline, 02:21:32.240 |
so we're getting more of the unconscious processing revealed. 02:21:36.480 |
- You know, that's an interesting hypothesis. 02:21:39.920 |
- I'd probably have to put someone in a scanner, 02:21:43.600 |
have them go to sleep, put them in the scanner 02:21:54.980 |
of course we both know, are deficient in the sense 02:22:00.200 |
You'd like to get in there and tickle the neurons over here 02:22:04.240 |
and you'd love to get real-time subjective report. 02:22:11.040 |
but you can't really know what they're dreaming 02:22:17.680 |
By the way, you know, there are two kinds of dreams. 02:22:28.320 |
Dreams, they're always different and changing, 02:22:31.320 |
but if you wake someone up during slow-wave sleep, 02:22:35.680 |
but it's a kind of dream that keeps repeating 02:22:45.800 |
- 'Cause I've had a few dreams over and over and over 02:22:48.160 |
throughout my life, so this would be in slow-wave sleep. 02:22:53.840 |
As a neuroscientist who's computationally oriented, 02:22:59.400 |
but really you incorporate the biology so well 02:23:01.760 |
into your work, so that's one of the reasons you're you, 02:23:21.860 |
and then you had to hand the keys to your lab 02:23:24.660 |
over to someone else, what would you go all in on? 02:23:28.060 |
- Well, so the NIH has something called the Pioneer Award, 02:23:39.660 |
So I put one in recently, and here's the title, 02:23:44.660 |
is Temporal Context in Brains and Transformers. 02:24:03.380 |
feed-forward network, but it's called a transformer, 02:24:06.700 |
and it has certain parts in it that are unique. 02:24:12.120 |
and it's a way of doing what is called temporal context, 02:24:18.400 |
what it does is it connects words that are far apart, 02:24:27.220 |
and then you have to figure out in the last sentence 02:24:31.500 |
well, there's three or four nouns it could have referred to, 02:24:33.740 |
but from context, you can figure out which one it does, 02:24:46.260 |
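The "connects words that are far apart" idea can be sketched in a few lines. This is a deliberately stripped-down single attention head: real transformers use separate learned query, key, and value projection matrices and many heads, none of which appear here; the point is only that every word's output is a softmax-weighted mix of every other word's representation, regardless of distance.

```python
import numpy as np

def self_attention(X):
    """Minimal single-head self-attention (no learned projections).
    X has one embedding row per word; each output row is a weighted
    mix of ALL rows, so distant words can inform each other."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                   # word-to-word similarity
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ X, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))        # 6 "words", 8-dim embeddings (toy data)
out, w = self_attention(X)
print(out.shape)                   # (6, 8)
```

Resolving a pronoun from context amounts to its row of `w` putting most of its weight on the noun it refers to, however many words back that noun appeared.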
like if we were to say piano, you'd say keys, 02:24:51.020 |
and then it kind of builds out a word cloud of association. 02:24:56.460 |
I don't know, I'm thinking about the Salk Institute, 02:24:57.780 |
I'd say sunset, Stonehenge, anyone that looks up, 02:25:02.620 |
Then you start building out a word cloud over there. 02:25:06.780 |
except I've been to a classical music concert 02:25:15.820 |
and so you start getting associations at a distance, 02:25:24.140 |
but it turns out that every word is ambiguous, 02:25:29.020 |
and so you have to figure that out from context. 02:25:32.340 |
So in other words, there are words that live together, 02:25:37.980 |
and you can learn that from just by, you know, 02:25:46.460 |
and it keeps predicting the next word in a sentence. 02:25:48.780 |
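The training signal really is just next-word prediction. As a toy stand-in (nowhere near a transformer, but the same objective), a bigram counter learns which words "live together" purely from raw text; the corpus and names below are mine.

```python
from collections import Counter, defaultdict

def train_bigrams(tokens):
    """Count, for each word, which words follow it: the crudest
    possible next-word predictor, learned purely from raw text."""
    follows = defaultdict(Counter)
    for a, b in zip(tokens, tokens[1:]):
        follows[a][b] += 1
    return follows

def predict_next(follows, word):
    """Return the most frequent continuation seen in training."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

text = "the cat sat on the mat and the cat ran".split()
model = train_bigrams(text)
print(predict_next(model, "the"))  # cat
```

Scale that same objective up by many orders of magnitude of data and parameters and, as described below, a semantic representation emerges as a side effect of getting the predictions right.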
- Like in my email now, it tries to predict the next word. 02:25:54.380 |
- Okay, well, that's because it's a very primitive version 02:25:59.220 |
What happened is if you train it up on enough, 02:26:05.040 |
it internally builds up a semantic representation 02:26:17.500 |
It can figure that out, and it has representations 02:26:26.900 |
And those associations now form an internal model 02:26:37.140 |
Literally, it's been, this is something that now 02:26:47.220 |
And that means that it's forming an internal model 02:26:51.980 |
of the outside world, in this case, a bunch of words. 02:26:55.700 |
And that's how it's able to actually respond to you 02:27:04.360 |
And it's all for the self-attention I'm talking about. 02:27:07.860 |
So in any case, my pioneer proposal is to figure out 02:27:26.580 |
I mean, I'll be working with experimental people. 02:27:40.740 |
and there are other people that have looked at, in primates. 02:27:49.780 |
are also a part of the puzzle, pieces of the puzzle 02:28:08.140 |
each one of these parts of the brain independently, 02:28:12.900 |
putting the pieces of the puzzle together, right? 02:28:15.020 |
Trying to get all the things that we know about these areas 02:28:18.100 |
and see how they work together in a computational way. 02:28:25.460 |
And I do hope they decide to fund your Pioneer Award. 02:28:29.700 |
And should they make the bad decision not to, 02:28:45.700 |
and running and teaching and research schedule 02:28:51.260 |
And also for the incredible work that you're doing 02:29:02.460 |
So we will certainly provide links to learning how to learn 02:29:05.820 |
and your book and to these other incredible resources 02:29:10.720 |
And you've also given us a ton of practical tools today 02:29:16.820 |
which of course are just your versions of what you do, 02:29:18.720 |
but that certainly, certainly are going to be of value 02:29:18.720 |
I mean, this is not lost on me and those listening 02:29:32.420 |
that your vigor is, as I mentioned earlier, undeniable. 02:29:38.460 |
to just see the amount of focus and energy and enthusiasm 02:29:53.320 |
and many, many people listening and watching. 02:29:57.980 |
a real incredible experience to learn from you. 02:30:03.340 |
And I have to say that I've been blessed over the years 02:30:06.860 |
with wonderful students and wonderful colleagues. 02:30:18.200 |
science is a social activity and we learn from each other 02:30:23.200 |
and we all make mistakes, but we learn from our mistakes. 02:30:31.800 |
Now, you know, your career has been remarkable too, 02:30:34.860 |
because you have affected and influenced more people 02:30:38.720 |
than anybody else I know personally with the knowledge 02:30:43.120 |
that you are broadcasting through your interviews, 02:30:48.120 |
but also, you know, just in terms of your interests. 02:30:51.820 |
Really, I'm really impressed with what you've done 02:31:09.600 |
everything we do is behind closed doors, right? 02:31:16.160 |
in terms of being able to explain things in a clear way 02:31:20.280 |
that gets through to more people than anybody else I know. 02:31:25.720 |
It's a labor of love for me and I'll take those words in 02:31:31.080 |
It's an honor and a privilege to sit with you today 02:31:38.740 |
- Thank you for joining me for today's discussion 02:31:51.520 |
If you're learning from and or enjoying this podcast, 02:31:55.880 |
That's a terrific zero cost way to support us. 02:32:06.140 |
Please check out the sponsors mentioned at the beginning 02:32:12.280 |
If you have questions or comments about the podcast 02:32:14.720 |
or guests or topics that you'd like me to consider 02:32:18.160 |
please put those in the comment section on YouTube. 02:32:36.120 |
And it covers protocols for everything from sleep, 02:32:44.120 |
And of course, I provide the scientific substantiation 02:32:49.580 |
The book is now available by presale at protocolsbook.com. 02:33:02.120 |
If you're not already following me on social media, 02:33:04.080 |
I'm Huberman Lab on all social media platforms. 02:33:07.000 |
So that's Instagram, X, formerly known as Twitter, 02:33:17.980 |
but much of which is distinct from the content 02:33:21.360 |
Again, that's Huberman Lab on all social media platforms. 02:33:36.840 |
Those one to three page PDFs cover things like 02:33:39.120 |
deliberate heat exposure, deliberate cold exposure. 02:33:43.400 |
We also have protocols for optimizing your sleep, 02:33:52.520 |
scroll down to newsletter and provide your email. 02:33:58.680 |
for today's discussion with Dr. Terry Sejnowski.