
Dr. Terry Sejnowski: How to Improve at Learning Using Neuroscience & AI


Chapters

0:00 Dr. Terry Sejnowski
2:32 Sponsors: BetterHelp & Helix Sleep
5:19 Brain Structure & Function, Algorithmic Level
11:49 Basal Ganglia; Learning & Value Function
15:23 Value Function, Reward & Punishment
19:14 Cognitive vs. Procedural Learning, Active Learning, AI
25:56 Learning & Brain Storage
30:08 Traveling Waves, Sleep Spindles, Memory
32:08 Sponsors: AG1 & David
34:57 Tool: Increase Sleep Spindles; Memory, Ambien; Prescription Drugs
42:02 Psilocybin, Brain Connectivity
45:58 Tool: ‘Learning How to Learn’ Course
49:36 Learning, Generational Differences, Technology, Social Media
58:37 Sponsors: LMNT & Joovv
61:06 Draining Experiences, AI & Social Media
66:52 Vigor & Aging, Continued Learning, Tool: Exercise & Mitochondrial Function
72:17 Tool: Cognitive Velocity; Quick Stressors, Mitochondria
76:58 AI, Imagined Futures, Possibilities
87:14 AI & Mapping Potential Options, Schizophrenia
90:56 Schizophrenia, Ketamine, Depression
96:15 AI, “Idea Pump,” Analyzing Research
102:11 AI, Medicine & Diagnostic Tool; Predicting Outcomes
110:04 Parkinson’s Disease; Cognitive Velocity & Variables; Amphetamines
119:49 Free Will; Large Language Model (LLM), Personalities & Learning
132:40 Tool: Idea Generation, Mind Wandering, Learning
138:18 Dreams, Unconscious, Types of Dreams
142:56 Future Projects, Brain & Self-Attention
151:39 Zero-Cost Support, YouTube, Spotify & Apple Follow & Reviews, Sponsors, YouTube Feedback, Protocols Book, Social Media, Neural Network Newsletter


00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.240 | where we discuss science
00:00:03.720 | and science-based tools for everyday life.
00:00:05.920 | I'm Andrew Huberman,
00:00:10.240 | and I'm a professor of neurobiology and ophthalmology
00:00:13.480 | at Stanford School of Medicine.
00:00:15.540 | My guest today is Dr. Terry Sejnowski.
00:00:18.320 | Dr. Terry Sejnowski is a professor
00:00:20.240 | at the Salk Institute for Biological Studies,
00:00:22.600 | where he directs the Computational Neurobiology Laboratory.
00:00:25.800 | And as his title suggests,
00:00:27.320 | he is a computational neuroscientist.
00:00:29.500 | That is, he uses math as well as artificial intelligence
00:00:32.560 | and computing methods to understand this overarching,
00:00:36.200 | ultra-important question of how the brain works.
00:00:39.080 | Now, I realize that when people hear terms
00:00:40.840 | like computational neuroscience, algorithms,
00:00:43.120 | large language models, and AI,
00:00:45.520 | that it can be a bit overwhelming and even intimidating.
00:00:48.400 | But I assure you that the purpose of Dr. Sejnowski's work,
00:00:51.360 | and indeed, today's discussion,
00:00:53.160 | is all about using those methods
00:00:55.160 | to clarify how the brain works,
00:00:57.400 | and indeed, to simplify the answer to that question.
00:01:00.580 | So for instance, today you will learn
00:01:02.520 | that regardless of who you are,
00:01:04.380 | regardless of your experience,
00:01:06.020 | that all your motivation in all domains of life
00:01:09.260 | is governed by a simple algorithm or equation.
00:01:12.660 | Dr. Sejnowski explains how a single rule,
00:01:14.960 | a single learning rule,
00:01:16.360 | drives all of our motivation-related behaviors.
00:01:19.380 | And it, of course, relates to the neuromodulator dopamine.
00:01:22.020 | And if you're familiar with dopamine as a term,
00:01:24.140 | today you will really understand how dopamine works
00:01:26.940 | to drive your levels of motivation,
00:01:28.740 | or in some cases, lack of motivation,
00:01:31.140 | and how to overcome that lack of motivation.
00:01:34.040 | Today, we also discuss how best to learn.
00:01:36.820 | Dr. Sejnowski shares not just information
00:01:39.220 | about how the brain works,
00:01:40.380 | but also practical tools
00:01:41.780 | that he and colleagues have developed,
00:01:43.080 | including a zero-cost online portal
00:01:45.780 | that teaches you how to learn better
00:01:48.020 | based on your particular learning style,
00:01:51.060 | the way that you in particular forage for information
00:01:53.900 | and implement that information.
00:01:55.660 | Dr. Sejnowski also explains how he himself
00:01:58.440 | uses physical exercise of a particular type
00:02:01.380 | in order to enhance his cognition,
00:02:03.120 | that is his brain's ability to learn information
00:02:05.980 | and to come up with new ideas.
00:02:08.100 | Today, we also discuss both the healthy brain
00:02:10.420 | and the diseased brain
00:02:12.040 | in conditions like Parkinson's and Alzheimer's,
00:02:14.580 | and how particular tools
00:02:15.860 | that relate to mitochondrial function
00:02:18.020 | can perhaps be used in order to treat various diseases,
00:02:21.240 | including Alzheimer's dementia.
00:02:22.960 | I'm certain that by the end of today's episode,
00:02:25.100 | you will have learned a tremendous amount of new knowledge
00:02:27.740 | about how your brain works and practical tools
00:02:30.260 | that you can implement in your daily life.
00:02:32.760 | Before we begin,
00:02:33.860 | I'd like to emphasize that this podcast
00:02:35.660 | is separate from my teaching and research roles at Stanford.
00:02:38.420 | It is, however, part of my desire and effort
00:02:40.580 | to bring zero-cost-to-consumer information
00:02:42.460 | about science and science-related tools
00:02:44.580 | to the general public.
00:02:45.940 | In keeping with that theme,
00:02:47.020 | I'd like to thank the sponsors of today's podcast.
00:02:50.020 | Our first sponsor is BetterHelp.
00:02:52.100 | BetterHelp offers professional therapy
00:02:53.900 | with a licensed therapist carried out completely online.
00:02:57.060 | I've been doing weekly therapy for well over 30 years.
00:02:59.760 | Initially, I didn't have a choice.
00:03:00.980 | It was a condition of being allowed to stay in school,
00:03:03.380 | but pretty soon I realized
00:03:04.380 | that therapy is an extremely important component
00:03:06.620 | to one's overall health.
00:03:07.740 | In fact, I consider doing regular therapy
00:03:09.820 | just as important as getting regular exercise,
00:03:12.140 | including cardiovascular exercise and resistance training,
00:03:15.260 | which, of course, I also do every single week.
00:03:18.040 | Now, there are essentially three things
00:03:19.380 | that great therapy provides.
00:03:20.700 | First of all, it provides a good rapport
00:03:22.780 | with somebody that you can trust and talk to
00:03:24.880 | about essentially all issues that you want to.
00:03:27.120 | Second of all, great therapy provides support
00:03:29.540 | in the form of emotional support or simply directed guidance,
00:03:32.940 | what to do or what not to do in given areas of your life.
00:03:36.140 | And third, expert therapy can provide you useful insights
00:03:39.400 | that you would not have been able to arrive at on your own.
00:03:42.220 | BetterHelp makes it very easy to find an expert therapist
00:03:44.740 | who you really resonate with
00:03:46.060 | and that can provide you the benefits I just mentioned
00:03:48.300 | that come with effective therapy.
00:03:50.540 | If you'd like to try BetterHelp,
00:03:52.060 | go to betterhelp.com/huberman
00:03:54.540 | to get 10% off your first month.
00:03:56.620 | Again, that's betterhelp.com/huberman.
00:03:59.780 | Today's episode is also brought to us by Helix Sleep.
00:04:02.980 | Helix Sleep makes mattresses and pillows
00:04:04.820 | that are customized to your unique sleep needs.
00:04:07.580 | Now, I've spoken many times before on this and other podcasts
00:04:10.660 | about the fact that getting a great night's sleep
00:04:12.580 | is the foundation of mental health,
00:04:14.300 | physical health, and performance.
00:04:16.020 | Now, the mattress you sleep on makes a huge difference
00:04:18.340 | in terms of the quality of sleep that you get each night,
00:04:20.740 | how soft it is or how firm it is, how breathable it is,
00:04:23.700 | all play into your comfort
00:04:24.980 | and need to be tailored to your unique sleep needs.
00:04:27.780 | If you go to the Helix website,
00:04:29.240 | you can take a brief two-minute quiz
00:04:30.900 | and it asks you questions such as,
00:04:32.580 | do you sleep on your back, your side, or your stomach?
00:04:34.660 | Do you tend to run hot or cold during the night?
00:04:36.380 | Things of that sort.
00:04:37.500 | Maybe you know the answers to those questions,
00:04:39.340 | maybe you don't.
00:04:40.380 | Either way, Helix will match you
00:04:41.740 | to the ideal mattress for you.
00:04:43.540 | For me, that turned out to be the Dusk mattress, D-U-S-K.
00:04:46.460 | I started sleeping on a Dusk mattress
00:04:48.140 | about three and a half years ago,
00:04:49.580 | and it's been far and away the best sleep
00:04:51.620 | that I've ever had.
00:04:52.600 | If you'd like to try Helix,
00:04:53.740 | you can go to helixsleep.com/huberman.
00:04:57.140 | Take that two-minute sleep quiz
00:04:58.460 | and Helix will match you to a mattress
00:05:00.140 | that is customized for your unique sleep needs.
00:05:02.620 | For the month of November, 2024,
00:05:04.580 | Helix is giving up to 25% off
00:05:06.620 | on all mattress orders and two free pillows.
00:05:09.380 | Again, that's helixsleep.com/huberman
00:05:12.460 | to get up to 25% off and two free pillows.
00:05:15.700 | And now for my discussion with Dr. Terry Sejnowski.
00:05:19.100 | Dr. Terry Sejnowski, welcome.
00:05:21.940 | - Great to be here.
00:05:23.080 | - We go way back.
00:05:24.640 | And I'm a huge, huge fan of your work
00:05:26.500 | because you've worked on a great many different things
00:05:29.700 | in the field of neuroscience.
00:05:31.580 | You're considered by many a computational neuroscientist,
00:05:34.420 | so you bring mathematical models
00:05:35.820 | to an understanding of the brain and neural networks.
00:05:38.580 | And we're also going to talk about AI today,
00:05:40.540 | and we're going to make it accessible for everybody,
00:05:42.640 | biologist or no, math background or no.
00:05:45.940 | To kick things off, I want to understand something.
00:05:49.380 | I understand a bit about the parts list of the brain,
00:05:52.820 | and most listeners of this podcast will understand
00:05:55.620 | a little bit of the parts list of the brain,
00:05:57.220 | even if they've never heard an episode of this podcast before
00:05:59.660 | because they understand there are cells,
00:06:01.260 | those cells are neurons,
00:06:02.300 | those neurons connect to one another in very specific ways
00:06:05.060 | that allow us to see, to hear, to think, et cetera.
00:06:07.740 | But I've come to the belief
00:06:10.220 | that even if we know the parts list,
00:06:13.460 | it doesn't really inform us how the brain works, right?
00:06:17.380 | This is the big question.
00:06:18.220 | How does the brain work?
00:06:19.160 | What is consciousness?
00:06:20.260 | All of this stuff.
00:06:21.320 | So where and how does an understanding
00:06:26.440 | of how neurons talk to one another
00:06:28.740 | start to give us a real understanding
00:06:31.080 | about like how the brain works?
00:06:33.020 | Like what is this piece of meat in our heads?
00:06:36.000 | Because it can't just be,
00:06:37.380 | okay, the hippocampus remembers stuff,
00:06:39.440 | and the visual cortex perceives stuff.
00:06:43.100 | When you sit back and you remove the math
00:06:46.260 | from the mental conversation, if that's possible for you,
00:06:49.060 | how do you think about quote unquote, how the brain works?
00:06:53.440 | Like at a very basic level,
00:06:55.100 | what is this piece of meat in our heads
00:06:57.220 | really trying to accomplish from,
00:07:00.260 | let's just say the time when we first wake up in the morning
00:07:02.420 | and we're a little groggy
00:07:03.660 | till we make it to that first cup of coffee or water,
00:07:07.100 | or maybe even just to urinate first thing in the morning,
00:07:10.060 | what is going on in there?
00:07:11.760 | - What a great question.
00:07:12.820 | And you know, I have a,
00:07:17.500 | Pat Churchland and I wrote a book, "The Computational Brain,"
00:07:21.900 | and in it, there's this levels diagram.
00:07:24.680 | And it shows levels of investigation at different spatial scales
00:07:29.280 | from the molecular at the very bottom
00:07:31.480 | to synapses and neurons, circuits, neural circuits,
00:07:36.480 | how they're connected with each other,
00:07:38.520 | and then brain areas in the cortex,
00:07:41.380 | and then the whole central nervous system
00:07:44.040 | span 10 orders of magnitude,
00:07:46.440 | 10 to the 10th in spatial scale.
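[A rough worked check of that range, using assumed endpoints that are not given in the conversation: taking the molecular scale as roughly 10^-10 meters and the whole central nervous system as roughly 1 meter, the ratio is 1 / 10^-10 = 10^10, that is, ten orders of magnitude in spatial scale.]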
00:07:49.800 | So, you know, where is consciousness in all of that?
00:07:54.720 | So there are two approaches
00:07:57.200 | that neuroscientists have taken.
00:07:59.940 | I shouldn't say neuroscientists,
00:08:01.400 | I should say that scientists have taken.
00:08:04.240 | And the one you described, which is, you know,
00:08:07.500 | let's look at all the parts, that's the bottom-up approach.
00:08:10.340 | You know, take it apart into a reductionist approach.
00:08:13.920 | And you make a lot of progress.
00:08:15.420 | You can figure out, you know, how things are connected
00:08:17.560 | and understand how development works, how neurons connect.
00:08:21.480 | But it's very difficult to really make progress
00:08:24.200 | because quickly you get lost in the forest.
00:08:27.580 | Now, the other approach, which has been successful,
00:08:32.000 | but at the end, unsatisfying, is the top-down approach.
00:08:38.520 | And this is the approach that psychologists have taken,
00:08:41.980 | looking at behavior and trying to understand,
00:08:45.260 | you know, the laws of behavior.
00:08:47.740 | This is the behaviorists.
00:08:49.200 | But, you know, even people in AI
00:08:52.740 | were trying to do a top-down, to write programs
00:08:55.500 | that could replicate human behavior, intelligent behavior.
00:08:59.740 | And I have to say that both of those approaches,
00:09:02.740 | you know, bottom-up or top-down,
00:09:05.160 | have really not gotten to the core
00:09:08.080 | of answering any of those questions, the big questions.
00:09:11.480 | But there's a whole new approach now that is emerging
00:09:15.360 | in both neuroscience and AI at exactly the same time.
00:09:18.760 | And at this moment in history, it's really quite remarkable.
00:09:22.040 | So there's an intermediate level
00:09:23.980 | between the implementation level at the bottom,
00:09:27.280 | how you implement some particular mechanism
00:09:34.560 | and the actual behavior of the whole system.
00:09:37.400 | It's called the algorithmic level.
00:09:40.320 | It's in between.
00:09:41.960 | So algorithms are like recipes.
00:09:44.200 | They're like, you know, when you bake a cake,
00:09:46.460 | you have to have ingredients and you have to say
00:09:50.020 | the order in which they're put together and how long.
00:09:52.680 | And, you know, if you get it wrong, you know,
00:09:56.120 | it doesn't work, you know, it's just a mess.
00:09:59.080 | Now, it turns out that we're discovering algorithms.
00:10:03.000 | We've made a lot of progress with understanding
00:10:06.720 | the algorithms that are used in neural circuits.
00:10:11.040 | And this speaks to the computational level
00:10:14.520 | of how to understand, you know,
00:10:16.760 | the function of the neural circuit.
00:10:18.920 | But I'm gonna give you one example of an algorithm,
00:10:22.720 | which is one we worked on back in the 1990s
00:10:27.600 | when Peter Dayan and Read Montague were postdocs in the lab.
00:10:32.280 | And it had to do with a part of the brain
00:10:35.160 | below the cortex called the basal ganglia,
00:10:38.240 | which is responsible for learning sequences of actions
00:10:43.320 | in order to achieve some goal.
00:10:45.920 | For example, if you wanna play tennis, you know,
00:10:48.960 | you have to be able to coordinate many muscles
00:10:51.840 | and a whole sequence of actions has to be made
00:10:54.280 | if you wanna be able to serve accurately.
00:10:57.120 | And you have to practice, practice, practice.
00:10:59.240 | Well, what's going on there is that the basal ganglia
00:11:02.280 | basically is taking over from the cortex
00:11:06.240 | and producing actions that get better and better
00:11:09.280 | and better and better.
00:11:10.120 | And that's true not just of the muscles,
00:11:13.600 | but it's also true of thinking.
00:11:15.820 | If you wanna become good in any area,
00:11:17.960 | if you wanna become a good financier,
00:11:21.040 | if you wanna become a good doctor or a neuroscientist,
00:11:25.400 | right, you have to be practicing, practicing, practicing
00:11:29.560 | in terms of understanding what's the details
00:11:34.560 | of the profession and what works,
00:11:37.600 | what doesn't work and so forth.
00:11:38.960 | And it turns out that this basal ganglia
00:11:41.840 | interacts with the cortex, not just in the back,
00:11:44.200 | which is the action part,
00:11:46.000 | but also with the prefrontal cortex,
00:11:47.540 | which is the thinking part.
00:11:48.880 | - Can I ask you a question about this briefly?
00:11:51.280 | The basal ganglia, as I understand,
00:11:52.920 | are involved in the organization
00:11:55.960 | of two major types of behaviors.
00:11:57.520 | Go, meaning to actually perform a behavior,
00:12:01.040 | but the basal ganglia also instruct no-go.
00:12:03.760 | Don't engage in that behavior.
00:12:06.080 | And learning an expert golf swing
00:12:08.840 | or even a basic golf swing or a tennis racket swing
00:12:11.720 | involves both of those things, go and no-go.
00:12:14.460 | Given what you just said,
00:12:15.700 | which is that the basal ganglia are also involved
00:12:18.480 | in generating thoughts of particular kinds.
00:12:23.240 | I wonder therefore if it's also involved
00:12:26.520 | in suppression of thoughts of particular kinds.
00:12:29.240 | I mean, you don't want your surgeon
00:12:30.400 | cutting into a particular region
00:12:35.000 | and just thinking about their motor behaviors,
00:12:37.000 | what to do and what not to do.
00:12:38.080 | They presumably need to think about what to think about,
00:12:41.040 | but also what to not think about.
00:12:42.440 | You don't want that surgeon
00:12:43.960 | thinking about how their kid was a brat that morning
00:12:46.520 | and they're frustrated because the two things interact.
00:12:50.280 | So is there go, no-go in terms of action and learning?
00:12:53.180 | And is there go, no-go in terms of thinking?
00:12:54.760 | - Well, I mentioned the prefrontal cortex
00:12:57.280 | and that part, the loop with the basal ganglia,
00:12:59.960 | that is one of the last to mature in early adulthood.
00:13:04.540 | And the problem is that for adolescents,
00:13:07.440 | the no-go part for planning and actions
00:13:11.840 | isn't quite there yet.
00:13:13.120 | And so often it doesn't kick in
00:13:15.680 | to prevent you from doing things
00:13:17.400 | that are not in your best interest.
00:13:20.000 | So yes, absolutely right.
00:13:21.760 | But one of the things though is that learning is involved.
00:13:24.400 | And this is really a problem that we cracked
00:13:29.280 | first theoretically in the '90s
00:13:31.080 | and then experimentally later by recording from neurons
00:13:34.880 | and also brain imaging in humans.
00:13:37.840 | So it turns out we know the algorithm
00:13:40.200 | that is used in the brain for how to learn
00:13:44.280 | sequences of actions to achieve a goal.
00:13:46.440 | And it's the simplest possible algorithm you can imagine.
00:13:50.960 | It's simply to predict the next reward you're gonna get.
00:13:55.960 | If I do an action, will it give me something of value?
00:14:00.960 | And you learn every time you try something,
00:14:06.560 | whether you got the amount of reward you expected or less,
00:14:10.040 | you use that to update the synapses, synaptic plasticity,
00:14:14.200 | so that the next time you'll have a better chance
00:14:16.640 | of getting a better reward
00:14:18.840 | and you build up what's called a value function.
00:14:20.920 | And so the cortex now over your lifetime
00:14:23.640 | is building up a lot of knowledge
00:14:26.520 | about things that are good for you,
00:14:28.160 | things that are bad for you.
00:14:29.520 | Like you go to a restaurant, you order something,
00:14:31.440 | how do you know what's good for you, right?
00:14:32.960 | You've had lots of meals in a lot of places
00:14:35.880 | and now that is part of your value function.
00:14:39.400 | This is the same algorithm that was used by AlphaGo.
00:14:43.320 | This is the program that DeepMind built.
00:14:46.680 | This is an AI program that beat the world Go champion.
00:14:51.040 | And Go is the most complex game
00:14:53.480 | that humans have ever played on a regular basis.
00:14:58.480 | - Far more complex than chess, as I understand.
00:15:00.400 | - Yeah, that's right.
00:15:01.240 | So Go is to chess as chess is to
00:15:04.360 | something like checkers.
00:15:06.920 | In other words, the level of difficulty
00:15:08.680 | is way above it
00:15:12.200 | because you have to think in terms of battles
00:15:16.040 | going on all over the place at the same time
00:15:18.440 | and the order in which you put the pieces down
00:15:20.500 | are gonna affect what's gonna happen in the future.
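[To make the learning rule just described concrete, here is a minimal sketch of a temporal-difference-style value update, the family of reinforcement learning algorithm Dr. Sejnowski is referring to (and the kind used by AlphaGo): compare the reward you actually received, plus the discounted value of what comes next, against the value you expected, and nudge the stored estimate by a fraction of that prediction error. The function name, learning rate, and discount factor below are illustrative assumptions, not details from the episode.]

```python
# Minimal temporal-difference-style value update (illustrative sketch).
# The prediction error plays the role of the dopamine-like teaching signal:
# reward received plus discounted future value, minus the value expected.

def update_value(value, reward, next_value, learning_rate=0.1, discount=0.9):
    prediction_error = reward + discount * next_value - value
    return value + learning_rate * prediction_error

# Toy usage: an action repeatedly yields a reward of 1.0 and then the episode
# ends (next_value = 0). The stored value estimate climbs toward 1.0 over
# trials, building up the "value function" for that action.
value = 0.0
for trial in range(50):
    value = update_value(value, reward=1.0, next_value=0.0)
print(round(value, 3))  # approaches 1.0 after repeated trials
```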
00:15:23.040 | - So this value function is super interesting
00:15:27.240 | and I wonder whether, and I think you answered this,
00:15:29.660 | but I wonder whether this value function
00:15:32.120 | is implemented over long periods of time.
00:15:36.100 | So you talked about the value function
00:15:39.660 | in terms of learning a motor skill.
00:15:42.140 | Let's say swinging a tennis racket
00:15:43.780 | to do a perfect tennis serve
00:15:45.880 | or even just a decent tennis serve.
00:15:49.040 | When somebody goes back to the court,
00:15:50.780 | let's say on the weekend,
00:15:52.840 | once a month over the course of years,
00:15:55.920 | are they able to tap into that same value function
00:15:59.380 | every time they go back,
00:16:00.380 | even though there's been a lot of intervening time
00:16:02.600 | and learning?
00:16:04.000 | That's question number one.
00:16:05.320 | And then the other question is,
00:16:06.820 | do you think that this value function
00:16:08.260 | is also being played out in more complex scenarios,
00:16:12.060 | not just motor learning,
00:16:13.460 | such as let's say a domain of life
00:16:16.060 | that for many people involves some trial and error,
00:16:19.060 | it would be like human relationships.
00:16:20.980 | We learn how to be friends with people.
00:16:23.580 | We learn how to be a good sibling.
00:16:25.520 | We learn how to be a good romantic partner.
00:16:27.900 | We get some things right, we get some things wrong.
00:16:30.380 | So is the same value function being implemented,
00:16:32.540 | we're paying attention to what was rewarding,
00:16:34.940 | but what I didn't hear you say also was what was punishing.
00:16:38.280 | So are we only paying attention to what is rewarding
00:16:40.800 | or we're also integrating punishment?
00:16:43.160 | We don't get an electric shock when we get the serve wrong,
00:16:45.400 | but we can be frustrated.
00:16:47.640 | - What you identified is a very important feature,
00:16:52.640 | which is that rewards, by the way,
00:16:56.400 | every time you do something,
00:16:59.000 | you're updating this value function every time
00:17:01.760 | and it accumulates.
00:17:02.880 | And the answer to your first question,
00:17:04.720 | the answer is that it's always gonna be there.
00:17:07.720 | It doesn't matter.
00:17:08.800 | It's a very permanent part of your experience
00:17:13.360 | and who you are.
00:17:14.680 | And interestingly, and behaviorists knew this
00:17:20.760 | back in the 1950s,
00:17:22.480 | that you can get there two ways of trial and error.
00:17:25.860 | Small rewards are good because you're constantly
00:17:31.080 | coming closer and closer to getting what you're seeking,
00:17:36.080 | better tennis player or being able to make a friend.
00:17:40.340 | But the negative punishment is much more effective.
00:17:46.720 | One trial learning.
00:17:50.440 | You don't need to have 100 trials,
00:17:57.360 | which you need when you're training a rat
00:17:59.440 | to do some tasks with small food rewards.
00:18:02.840 | But if you just shock the rat,
00:18:04.760 | boy, that rat doesn't forget that.
00:18:06.680 | - Yeah, one really bad relationship
00:18:08.520 | will have you learning certain things forever.
00:18:11.760 | - And this is also PTSD.
00:18:13.600 | Post-traumatic stress disorder
00:18:15.080 | is another good example of that.
00:18:16.920 | That can screw you up for the rest of your life.
00:18:19.720 | But the other thing,
00:18:21.840 | and you pointed out something really important,
00:18:24.280 | which is that a large part of the prefrontal cortex
00:18:26.520 | is devoted to social interactions.
00:18:29.640 | And this is how humans,
00:18:31.400 | when you come into the world,
00:18:32.240 | you don't know what language you're gonna be speaking.
00:18:34.440 | You don't know what the cultural values are
00:18:37.240 | that you're going to have to be able
00:18:39.120 | to become a member of this society
00:18:43.020 | and as things that are expected of you.
00:18:44.800 | All of that has to become through experience,
00:18:47.440 | through building this value function.
00:18:49.400 | And this is something we discovered in the 20th century.
00:18:52.720 | And now it's going into AI.
00:18:55.400 | It's called reinforcement learning in AI.
00:18:57.480 | It's a form of procedural learning,
00:18:59.720 | as opposed to the cognitive level
00:19:01.720 | where you think and you do things.
00:19:03.880 | Cognitive thinking is much less efficient
00:19:06.320 | because you have to go step-by-step.
00:19:09.880 | With procedural learning, it's automatic.
00:19:13.440 | - Can you give me an example of procedural learning
00:19:15.880 | in the context of a comparison to cognitive learning?
00:19:20.080 | Like, is there an example of perhaps
00:19:22.840 | like how to make a decent cup of coffee
00:19:25.040 | using purely knowledge-based learning
00:19:29.360 | versus procedural learning?
00:19:30.640 | - Oh, okay, okay.
00:19:31.540 | - Where procedural learning wins.
00:19:32.980 | And I can imagine one, but you're the true expert here.
00:19:36.160 | - Well, you know a lot of examples,
00:19:38.600 | but just since we've been talking about tennis,
00:19:42.200 | can you imagine learning how to play tennis through a book,
00:19:44.720 | reading a book?
00:19:45.640 | - That's so funny.
00:19:46.480 | On the plane back from Nashville yesterday,
00:19:49.040 | the guy sitting across the aisle from me
00:19:51.920 | was reading a book about,
00:19:53.640 | maybe he was working on his pilot's license or something.
00:19:57.440 | And I looked over and couldn't help,
00:19:58.800 | but notice these diagrams of the plane flying.
00:20:01.920 | And I thought, I'm just so glad that this guy is a passenger
00:20:05.440 | and not a pilot.
00:20:06.880 | And then I thought about how the pilots learned.
00:20:08.520 | And presumably it was a combination
00:20:10.600 | of practical learning and textbook learning.
00:20:14.400 | I mean, when you scuba dive, this is true.
00:20:15.980 | I'm scuba dive certified.
00:20:17.500 | And when you get your certification,
00:20:18.920 | you learn your dive tables
00:20:20.640 | and you learn why you have to wait between dives, et cetera,
00:20:23.400 | and gas exchange, and a number of things.
00:20:25.280 | But there's really no way to simulate
00:20:28.440 | what it is to take your mask off underwater,
00:20:30.560 | put it back on, and then blow the water out of your mask.
00:20:32.800 | Like that, you just have to do that in a pool.
00:20:35.160 | And you actually have to do it when you need to
00:20:37.560 | for it to really get drilled in.
00:20:39.400 | - Yes, it's really essential for things
00:20:43.400 | that have to be executed quickly and expertly
00:20:47.840 | to get that really down pat so you don't have to think.
00:20:52.480 | And this happens in school, right?
00:20:58.800 | In other words, you have classroom lessons
00:21:01.360 | where you're given explicit instruction,
00:21:04.160 | but then you go do homework.
00:21:07.320 | That's procedural learning.
00:21:08.720 | You do problems, you solve problems.
00:21:10.800 | And I'm a PhD physicist,
00:21:13.080 | so I went through all of the classes in theoretical physics.
00:21:17.680 | And it was really the problems
00:21:20.840 | that really were the core of becoming a good physicist.
00:21:23.760 | You know, you can memorize the equations,
00:21:25.480 | but that doesn't mean you understand
00:21:26.960 | how to use the equations.
00:21:28.400 | - I think it's worth highlighting something.
00:21:29.640 | A lot of times on this podcast,
00:21:31.560 | we talk about what I call protocols.
00:21:33.080 | It would be, you know, like get some morning sunlight
00:21:34.720 | in your eyes to stimulate your suprachiasmatic nucleus
00:21:37.880 | by way of your retinal ganglion cells.
00:21:39.480 | Audiences of this podcast will recognize those terms.
00:21:41.640 | It's basically get sunlight in your eyes in the morning
00:21:43.280 | and set your circadian clock.
00:21:44.480 | - That's right.
00:21:45.320 | - And I can hear that a trillion times,
00:21:48.240 | but I do believe that there's some value
00:21:49.880 | to both knowing what the protocol is,
00:21:53.040 | the underlying mechanisms.
00:21:54.320 | There are these things in your eye that, you know,
00:21:56.040 | encode the sunrise qualities of light, et cetera,
00:21:58.840 | and then send them to your brain, et cetera, et cetera.
00:22:00.880 | But then once we link knowledge, pure knowledge,
00:22:05.080 | to a practice, I do believe that the two things
00:22:08.040 | merge someplace in a way that, let's say,
00:22:11.240 | reinforces both the knowledge and the practice.
00:22:14.480 | So these things are not necessarily separate, they bridge.
00:22:17.000 | In other words, doing your theoretical physics problem sets
00:22:20.400 | reinforces the examples that you learned in lecture
00:22:23.120 | and in your textbooks, and vice versa.
00:22:26.240 | - So this is a battle that's going on right now in schools.
00:22:30.660 | You know, what you've just said is absolutely right.
00:22:34.920 | You need both.
00:22:35.760 | We have two major learning systems.
00:22:37.720 | We have a cognitive learning system, which is cortical.
00:22:40.320 | We have a procedural learning system,
00:22:42.180 | which is subcortical, basal ganglia.
00:22:44.760 | And the two go hand in hand.
00:22:49.360 | If you want to become good at anything,
00:22:51.800 | the two are going to help each other.
00:22:54.560 | And what's going on right now in schools,
00:22:56.440 | in California at least, is that they're trying
00:22:59.880 | to get rid of the procedural.
00:23:01.640 | - That's ridiculous.
00:23:02.480 | - They don't want students to practice
00:23:03.860 | because it's going to be, you know, you're stressing them.
00:23:08.720 | You don't want them to be, to feel that, you know,
00:23:11.500 | that they're having difficulty.
00:23:13.180 | So, but we can, but it can do everything.
00:23:15.020 | - For those listening, I'm covering my eyes
00:23:16.620 | because I mean, this would be like saying,
00:23:19.100 | goodness, there's so many examples.
00:23:21.100 | Like here's a textbook on swimming,
00:23:22.460 | and then you're going to go out to the ocean someday
00:23:25.180 | and you will have never actually swum.
00:23:27.520 | - Right.
00:23:28.540 | - And now you're expected to be able to survive,
00:23:32.220 | let alone swim well.
00:23:33.060 | - It's crazy, it's crazy.
00:23:34.280 | And I'll tell you, Barbara Oakley has,
00:23:38.480 | and I have a MOOC, Massive Open Online Course,
00:23:43.000 | on learning how to learn.
00:23:44.160 | And it helps students.
00:23:46.560 | We aimed it at students, but it actually has been taken
00:23:49.180 | by 4 million people in 200 countries, ages 10 to 90.
00:23:53.000 | - What is this called?
00:23:53.880 | - Learning how to learn.
00:23:55.600 | - Is it, is there a paywall?
00:23:58.160 | - No, it's free, completely free.
00:23:59.880 | - Amazing.
00:24:01.300 | - And, you know, I get incredible feedback,
00:24:05.420 | you know, fan letters almost every day.
00:24:08.100 | - Well, you're about to get a few more.
00:24:09.340 | - Okay, well.
00:24:10.180 | - I did an episode on learning how to learn,
00:24:11.220 | and my understanding of the research
00:24:12.660 | is that we need to test ourselves on the material.
00:24:15.240 | That testing is not just a form of evaluation.
00:24:17.940 | It is a form of identifying the errors
00:24:22.160 | that help us then compensate for the errors.
00:24:24.460 | - Exactly.
00:24:25.300 | - But it's very procedural.
00:24:27.220 | It's not about just listening and regurgitating.
00:24:30.500 | You're, you know, you've put your finger on it,
00:24:32.600 | which is that, and this is what we teach the students,
00:24:35.040 | is that you have to, the way the brain works, right,
00:24:40.040 | is not, it doesn't memorize things like a computer,
00:24:44.280 | but you have to, it has to be active learning.
00:24:48.120 | You have to actively engage.
00:24:49.800 | In fact, when you're trying to solve a problem on your own,
00:24:53.760 | right, this is where you're really learning
00:24:55.440 | by trial and error, and that's the procedural system.
00:24:58.320 | But if someone tells you what the right answer is,
00:25:00.340 | you know, you know, that's just something
00:25:03.080 | that is a fact that gets stored away somewhere,
00:25:06.200 | but it's not gonna automatically come up
00:25:08.440 | if you actually are faced with something
00:25:10.760 | that's not exactly the same problem but is similar.
00:25:13.680 | And by the way, this is the key to AI,
00:25:15.960 | completely essential for the recent success
00:25:20.300 | of these large language models, you know,
00:25:23.360 | that the public now is beginning to use,
00:25:25.760 | is that they're not parrots.
00:25:27.940 | They just, they're not, they just don't memorize
00:25:30.680 | what the data that they've taken in.
00:25:33.760 | They have to generalize.
00:25:36.160 | That means to be able to do well on new things
00:25:38.680 | that come in that are similar to the old things
00:25:40.640 | that you've seen, but allow you to solve new problems.
00:25:44.720 | That's the key to the brain.
00:25:47.640 | The brain is really, really good at generalizing.
00:25:50.560 | In fact, in many cases,
00:25:52.520 | you only need one example to generalize.
00:25:55.680 | Like going to a restaurant for the first time,
00:25:58.700 | there are a number of new interactions, right?
00:26:01.860 | There might be a host or a hostess.
00:26:03.920 | You sit down at these tables you've never sat at.
00:26:05.760 | Somebody asks you questions, you read it.
00:26:07.340 | Okay, maybe it's a QR code these days,
00:26:09.340 | but forever after you understand the process
00:26:13.780 | of going into a restaurant,
00:26:14.620 | doesn't matter what the genre of food happens to be
00:26:18.180 | or what city, sitting inside or outside,
00:26:20.820 | you can pretty much work it out.
00:26:22.620 | Sit at the counter, sit outside, sit at the table.
00:26:24.720 | It's, there are a number of key action steps
00:26:28.220 | that I think pretty much translate to everywhere.
00:26:31.420 | Unless you go to some super high-end thing
00:26:34.180 | or some super low-end thing where it's a buffet
00:26:36.260 | or whatever, you can start to fill in the blanks here.
00:26:39.240 | If I understand correctly, there's an action function
00:26:42.340 | that's learned from the knowledge and the experience.
00:26:46.500 | - Exactly.
00:26:47.340 | - And then where is that action function stored?
00:26:50.020 | Is it in one location in the brain
00:26:51.580 | or is it kind of an emergent property
00:26:53.500 | of multiple brain areas?
00:26:55.200 | - So that, you're right at the cusp here
00:26:57.800 | of where we are in neuroscience right now.
00:27:00.200 | We don't know the answer to that question.
00:27:02.600 | In the past, it had been thought that, you know,
00:27:06.760 | the cortex was like countries,
00:27:11.600 | where each part of the cortex
00:27:15.120 | was dedicated to one function, right?
00:27:17.460 | You know, there's, and interestingly,
00:27:20.680 | you record for the neurons
00:27:21.680 | and it certainly looks that way, right?
00:27:23.360 | In other words, there's a visual cortex in the back
00:27:26.380 | and there's a whole series of areas
00:27:28.180 | and then there's the auditory cortex here in the middle
00:27:31.300 | and then the prefrontal cortex for social interaction.
00:27:33.860 | And so it looked really clear cut that it's modular
00:27:38.860 | and now what we're facing is we have a new way
00:27:43.580 | to record from neurons.
00:27:45.340 | Optically, we can record from tens of thousands,
00:27:49.140 | from dozens of areas simultaneously
00:27:51.520 | and what we're discovering is that
00:27:54.440 | if you want to do any task,
00:27:56.440 | you're engaging not just the area that you might think,
00:27:59.640 | you know, has the input coming in, say the visual system,
00:28:03.440 | but the visual system is getting input
00:28:05.920 | from the motor system, right?
00:28:08.480 | In fact, you know, there's more input
00:28:10.080 | coming from the motor system than from the eye.
00:28:12.720 | - Really?
00:28:13.560 | Yes, Anne Churchland at UCLA has shown that in the mouse.
00:28:19.100 | So now we're looking at global interactions
00:28:21.540 | between all these areas
00:28:22.660 | and that's where real complex cognitive behaviors emerge.
00:28:27.660 | It's from those interactions
00:28:30.420 | and now we have the tools for the first time
00:28:32.240 | to actually be able to see them in real time
00:28:34.300 | and we're doing that now first on mice and monkeys,
00:28:39.300 | but we now can do this in humans.
00:28:44.100 | So I've been collaborating with a group
00:28:46.060 | at Mass General Hospital to record from people with epilepsy
00:28:50.340 | and for people who are drug resistant,
00:28:52.540 | they have to have an operation.
00:28:54.700 | To be able to take it out,
00:28:56.220 | you have to find out where it starts in the cortex, you know,
00:28:59.300 | where it is initiated, where the seizure starts,
00:29:02.740 | and to do that, you have to go in
00:29:04.980 | and record simultaneously from a lot of parts of the cortex
00:29:08.780 | for weeks until you find out where it is
00:29:12.040 | and then you go in and you try to take it out
00:29:14.940 | and often that helps.
00:29:17.140 | Very, very invasive, but for two weeks we have access
00:29:21.020 | to all those neurons in that cortex
00:29:23.860 | that are being, you know, recorded from constantly.
00:29:27.220 | And so I've used, I started out
00:29:29.180 | because I was interested in sleep
00:29:30.700 | and I wanted to understand what happens in the cortex
00:29:33.580 | of a human during sleep,
00:29:35.280 | but then we realized that, you know,
00:29:37.580 | you can also figure, you know,
00:29:40.260 | people who have these debilitating problems with seizures,
00:29:45.340 | you know, they're there for two weeks
00:29:46.700 | and they have nothing to do.
00:29:47.580 | So they just love the fact that scientists are interested
00:29:50.740 | in helping them and, you know, teaching them things
00:29:54.340 | and finding out where in the cortex things are happening
00:29:58.060 | when they learn something.
00:29:59.860 | This is a goldmine, it's unbelievable.
00:30:02.380 | And I've learned things from humans
00:30:04.920 | that I could have never gotten from any other species.
00:30:07.900 | - Amazing.
00:30:08.720 | - Obviously language is one of them,
00:30:09.980 | but there are other things in sleep that we've discovered.
00:30:13.820 | Having to do with traveling waves,
00:30:15.300 | there are circular traveling waves that go on during sleep,
00:30:18.460 | which is astonishing.
00:30:20.180 | Nobody ever really saw that before, but now-
00:30:25.180 | - If you were to ascribe one or two major functions
00:30:27.760 | to these traveling waves,
00:30:28.760 | what do you think they are accomplishing for us in sleep?
00:30:31.500 | And by the way, are they associated with deep sleep,
00:30:33.820 | slow wave sleep, or with rapid eye movement sleep, or both?
00:30:36.900 | - This is non-REM sleep, this is a jargon,
00:30:39.700 | this is during intermediate-
00:30:43.700 | - Transition states.
00:30:45.580 | - Yeah, transition state.
00:30:46.580 | - Okay, our audience will probably keep up.
00:30:48.380 | They've heard a lot about slow wave sleep from me
00:30:50.020 | and Matt Walker from Rapid Eye Movement Sleep.
00:30:52.060 | - Light, slow wave sleep, yeah.
00:30:53.420 | - And so what do these traveling waves accomplish for us?
00:30:55.580 | - Oh, okay, so in the case of the,
00:30:57.580 | they're called sleep spindles.
00:30:59.300 | They last, the waves last for about a second or two,
00:31:04.220 | and they travel, like I say, in a circle around the cortex.
00:31:07.940 | And it's known that these spindles are important
00:31:10.740 | for consolidating experiences you've had during the day
00:31:14.620 | into your long-term memory storage.
00:31:16.800 | So it's a very important function,
00:31:19.900 | and if you take out, see, it's the hippocampus
00:31:23.580 | that is replaying the experiences.
00:31:28.140 | It's a part of the brain,
00:31:28.980 | it's very important for long-term memory.
00:31:30.460 | If you don't have a hippocampus, you can't learn new things.
00:31:34.320 | That is to say, you can't remember what you did yesterday,
00:31:38.680 | or for that matter, even an hour earlier.
00:31:41.360 | But the hippocampus plays back your experiences,
00:31:43.840 | causes the sleep spindles now to knead that into the cortex.
00:31:47.720 | And it's important you do that right,
00:31:50.160 | 'cause you don't want to overwrite
00:31:52.080 | the existing knowledge you have.
00:31:53.720 | You just want to basically incorporate the new experience
00:31:56.720 | into your existing knowledge base in an efficient way
00:32:00.980 | that doesn't interfere with what you already know.
00:32:03.840 | So that's an example of a very important function
00:32:06.280 | that these traveling waves have.
00:32:08.120 | - I'd like to take a quick break
00:32:09.280 | and acknowledge our sponsor, AG1.
00:32:11.800 | AG1 is a vitamin mineral probiotic drink
00:32:14.280 | that includes prebiotics and adaptogens.
00:32:16.840 | I've been drinking AG1 since 2012,
00:32:19.440 | and I started doing it at a time
00:32:20.800 | when my budget was really limited.
00:32:22.480 | In fact, I only had enough money to purchase one supplement,
00:32:25.160 | and I'm so glad that I made that supplement, AG1.
00:32:28.000 | The reason for that is even though I strive
00:32:29.840 | to eat whole foods and unprocessed foods,
00:32:31.720 | it's very difficult to get enough vitamins and minerals,
00:32:33.960 | micronutrients, and adaptogens from diet alone
00:32:37.120 | in order to make sure that I'm at my best,
00:32:38.680 | meaning have enough energy for all the activities
00:32:40.600 | I participate in from morning until night,
00:32:42.800 | sleeping well at night, and keeping my immune system strong.
00:32:46.520 | When I take AG1 daily,
00:32:47.880 | I find that all aspects of my health,
00:32:49.780 | my physical health, my mental health,
00:32:51.480 | my performance, recovery from exercise,
00:32:53.480 | all of those improve.
00:32:54.960 | And I know that because I've had lapses
00:32:56.600 | when I didn't take my AG1,
00:32:57.960 | and I certainly felt the difference.
00:32:59.760 | I also noticed, and this makes perfect sense
00:33:01.720 | given the relationship
00:33:02.560 | between the gut microbiome and the brain,
00:33:04.640 | that when I regularly take AG1,
00:33:06.480 | that I have more mental clarity and more mental energy.
00:33:09.480 | If you'd like to try AG1,
00:33:11.000 | you can go to drinkag1.com/huberman
00:33:14.360 | to claim a special offer.
00:33:15.920 | For this month only, November, 2024,
00:33:18.480 | AG1 is giving away a free one-month supply
00:33:20.960 | of omega-3 fatty acids from fish oil,
00:33:23.240 | in addition to their usual welcome kit
00:33:25.240 | of five free travel packs
00:33:26.600 | and a year supply of vitamin D3K2.
00:33:29.520 | As I've discussed many times before on this podcast,
00:33:32.120 | omega-3 fatty acids are critical for brain health,
00:33:34.600 | mood, cognition, and more.
00:33:36.440 | Again, go to drinkag1.com/huberman
00:33:39.680 | to claim this special offer.
00:33:41.360 | Today's episode is also brought to us by David.
00:33:44.560 | David makes a protein bar unlike any other.
00:33:47.240 | It has 28 grams of protein,
00:33:49.040 | only 150 calories and zero grams of sugar.
00:33:52.800 | That's right, 28 grams of protein
00:33:54.800 | and 75% of its calories come from protein.
00:33:57.680 | These bars from David also taste amazing.
00:33:59.780 | My favorite flavor is chocolate chip cookie dough,
00:34:02.060 | but then again, I also like the chocolate fudge-flavored one
00:34:04.520 | and I also like the cake-flavored one.
00:34:06.020 | Basically, I like all the flavors.
00:34:07.860 | They're incredibly delicious.
00:34:09.380 | For me personally, I strive to eat mostly whole foods.
00:34:12.380 | However, when I'm in a rush or I'm away from home,
00:34:15.240 | or I'm just looking for a quick afternoon snack,
00:34:17.260 | I often find that I'm looking
00:34:18.460 | for a high-quality protein source.
00:34:20.600 | With David, I'm able to get 28 grams of protein
00:34:23.020 | with the calories of a snack,
00:34:24.720 | which makes it very easy to hit my protein goals
00:34:26.760 | of one gram of protein per pound of body weight each day.
00:34:30.180 | And it allows me to do that
00:34:31.500 | without taking in excess calories.
00:34:33.500 | I typically eat a David bar in the early afternoon
00:34:35.780 | or even mid-afternoon if I wanna bridge that gap
00:34:38.220 | between lunch and dinner.
00:34:39.780 | I like that it's a little bit sweet,
00:34:41.040 | so it tastes like a tasty snack,
00:34:42.540 | but it's also given me that 28 grams
00:34:44.480 | of very high-quality protein with just 150 calories.
00:34:47.880 | If you would like to try David,
00:34:49.300 | you can go to davidprotein.com/huberman.
00:34:52.620 | Again, the link is davidprotein.com/huberman.
00:34:57.200 | As I recall, there are one or two things
00:35:00.520 | that one can do in order to ensure
00:35:02.840 | that one gets sufficient sleep spindles at night
00:35:06.400 | and thereby incorporate this new knowledge.
00:35:09.480 | This was from the episode that we did with Gina Poe
00:35:12.400 | from UCLA, I believe, and others, including Matt Walker.
00:35:16.040 | My recollection is that the number one thing
00:35:18.480 | is to make sure you get enough sleep at night
00:35:20.080 | so you experience enough of these spindles.
00:35:22.120 | And we're all familiar with the cognitive challenges,
00:35:25.040 | including memory challenges and learning challenges
00:35:27.100 | associated with lack of sleep, insufficient sleep.
00:35:30.600 | But the other was that there was some interesting
00:35:33.760 | relationship between daytime exercise
00:35:36.160 | and nighttime prevalence of sleep spindles.
00:35:38.980 | Are you familiar with that literature?
00:35:40.480 | - Yes, oh yes.
00:35:41.480 | No, this is a fascinating literature,
00:35:44.660 | and it's all pointing the same direction,
00:35:48.360 | which is that we always neglect
00:35:51.320 | to appreciate the importance of sleep.
00:35:54.760 | I mean, obviously, you're refreshed when you wake up,
00:35:57.620 | but there's a lot of things happen.
00:35:58.920 | It's not that your brain turns off.
00:36:00.920 | It's that it goes into a completely different state,
00:36:03.800 | and memory consolidation is just one of those things
00:36:06.420 | that happens when you fall asleep.
00:36:08.400 | And of course, there's dreams and so forth.
00:36:11.200 | We don't fully appreciate or understand
00:36:13.400 | exactly how all the different sleep stages work together.
00:36:18.400 | But exercise is a particularly important part
00:36:23.640 | of getting the motor system tuned up.
00:36:28.240 | And it's thought that the REM, rapid eye movement sleep,
00:36:35.200 | may be involved in that.
00:36:36.440 | So that's yet another part of the sleep stages.
00:36:41.440 | You go through, you go back and forth
00:36:43.000 | between dream sleep and the slow-wave sleep,
00:36:46.360 | back and forth, back and forth during the night.
00:36:48.800 | And then when you wake up, you're in the REM stage,
00:36:53.360 | more and more REM, more and more REM.
00:36:55.520 | But that's all observation.
00:36:58.200 | But as a scientist, what you want to do
00:37:00.240 | is perturb the system and see if you can maybe,
00:37:04.400 | if you had more sleep spindles,
00:37:06.760 | maybe you'd be able to remember things better.
00:37:08.720 | So it turns out Sara Mednick, who's at UC Irvine,
00:37:12.460 | did this fantastic experiment.
00:37:14.200 | So it turns out there's a drug called zolpidem,
00:37:16.700 | which goes by the name Ambien.
00:37:23.120 | You may have some experience with that.
00:37:24.880 | - I've never taken it, but I'm aware of what it is.
00:37:27.920 | People use it as a sleep aid.
00:37:29.400 | - That's right.
00:37:30.240 | A lot of people take it in order to sleep, okay.
00:37:33.740 | Well, it turns out that it causes more sleep spindles.
00:37:40.200 | - Really?
00:37:41.100 | - It doubles the number of sleep spindles.
00:37:43.140 | If you take the drug,
00:37:44.640 | you take the drug after you've done the learning, right?
00:37:50.500 | You do the learning at night and then you take the drug
00:37:54.580 | and you have twice as many spindles.
00:37:56.780 | You wake up in the morning,
00:37:57.740 | you can remember twice as much from what you learned.
00:38:00.460 | - And the memories are stable over time?
00:38:02.900 | - Yes.
00:38:03.740 | - It's in there.
00:38:04.560 | - Yes, yeah, no, it consolidates it.
00:38:06.860 | I mean, that's the point.
00:38:08.140 | - What's the downside of Ambien?
00:38:10.020 | - Okay, here's the downside.
00:38:11.020 | Okay, so people who take the drug,
00:38:13.420 | say if you're going to Europe and you take it
00:38:16.540 | and then you sleep really soundly,
00:38:19.220 | but often you find yourself in the hotel room
00:38:23.260 | and you completely have no clue,
00:38:25.020 | you have no memory of how you got there.
00:38:27.340 | - I've had that experience without Ambien
00:38:29.380 | or any other drugs where I am very badly jet lagged.
00:38:32.620 | - Yes.
00:38:33.460 | - And I wake up and for a few seconds,
00:38:36.500 | but what feels like eternity, I have no idea where I am.
00:38:39.900 | It's terrifying.
00:38:40.980 | - Well, that's another problem that you have with jet lag.
00:38:44.420 | Jet lag really screws things up.
00:38:46.140 | But this is something where it could be an hour.
00:38:48.780 | You know, you took the train or you took a taxi
00:38:52.340 | or something and you're...
00:38:54.780 | So here, now this seems crazy.
00:38:56.460 | How could it be a way to improve learning and recall
00:39:01.460 | on one hand and then forgetfulness on the other hand?
00:39:05.200 | Well, it turns out what's important is
00:39:08.500 | (laughs)
00:39:10.500 | that when you take the drug, right?
00:39:14.360 | In other words, it helps consolidate experiences
00:39:18.900 | you've had in the past before you took the drug,
00:39:22.460 | but it'll wipe out experiences you have in the future
00:39:25.660 | after you take the drug, right?
00:39:27.240 | (laughs)
00:39:28.080 | - Sorry, I'm not laughing.
00:39:29.060 | It must be a terrifying experience,
00:39:30.620 | but I'm laughing because, you know,
00:39:33.840 | there's some beautiful pharmacology
00:39:35.300 | and indeed some wonderfully useful pharmaceuticals
00:39:38.820 | out there.
00:39:39.860 | You know, some people may cringe to hear me say that,
00:39:41.780 | but there are some very useful drugs out there
00:39:43.260 | that save lives and help people deal with symptoms, et cetera.
00:39:46.540 | Side effects are always a concern,
00:39:47.740 | but this particular drug profile, Ambien, that is,
00:39:52.580 | seems to reveal something perhaps even more important
00:39:57.180 | than the discussion about spindles or Ambien
00:39:59.180 | or even sleep, which is that you got to pay the piper
00:40:03.300 | somehow, as they say.
00:40:04.540 | - That's right.
00:40:05.380 | - That you tweak one thing in the brain,
00:40:07.260 | something else, something else goes.
00:40:10.300 | You don't get anything for free.
00:40:12.500 | - That's true, I think that this is something
00:40:17.540 | that is true, not just of drugs for the brain,
00:40:21.220 | but steroids for the body.
00:40:24.140 | - Sure.
00:40:24.980 | Yeah, I mean, steroids, even low-dose testosterone therapy,
00:40:28.380 | which is very popular nowadays,
00:40:30.380 | will give people more vigor, et cetera,
00:40:31.740 | but it is introducing a sort of second puberty,
00:40:35.100 | and puberty is perhaps the most rapid phase of aging
00:40:38.460 | in the entire lifespan.
00:40:39.700 | Same thing with people who take growth hormone,
00:40:41.140 | which would probably be a better example,
00:40:43.020 | because certainly those therapies
00:40:44.460 | can be beneficial to people,
00:40:45.540 | but growth hormone gives people more vigor,
00:40:48.740 | but it accelerates aging.
00:40:50.580 | Look at the quality of skin that people have
00:40:52.460 | when they take growth hormone.
00:40:53.420 | It looks more aged.
00:40:54.740 | They physically change,
00:40:55.980 | and I'm not for or against these things.
00:40:57.900 | It's highly individual, but I completely agree with you.
00:41:00.540 | I would also venture that with the growing interest
00:41:04.620 | in so-called nootropics,
00:41:07.020 | and people taking things like modafinil,
00:41:09.460 | not just for narcolepsy, daytime sleepiness,
00:41:11.580 | but also to enhance cognitive function,
00:41:14.060 | okay, maybe they can get away with doing that
00:41:15.580 | every once in a while for a deadline task or something,
00:41:18.460 | but my experience is that people who obsess
00:41:22.220 | over the use of pharmacology
00:41:23.500 | to achieve certain brain states pay in some other way.
00:41:26.500 | - Absolutely.
00:41:27.340 | - Whether it's stimulants or sedatives or sleep drugs,
00:41:30.020 | and that behaviors will always prevail.
00:41:33.900 | Behaviors will always prevail as tools.
00:41:35.860 | - Yep, and one of the things about the way the body evolved
00:41:40.860 | is that it really has to balance a lot of things,
00:41:46.260 | and so with drugs, you're basically unbalancing it somehow,
00:41:50.180 | and the consequence, as you point out,
00:41:52.980 | is that in order to make one part better,
00:41:56.700 | one part of your body,
00:41:59.140 | you sacrifice something else somewhere else.
00:42:02.260 | - As long as we're talking about brain states
00:42:03.860 | and connectivity across areas,
00:42:06.340 | I want to ask a particular question,
00:42:08.820 | then I want to return to this issue about how best to learn,
00:42:11.660 | especially in kids, but also in adulthood.
00:42:14.460 | I've become very interested in
00:42:15.740 | and spent a lot of time with the literature
00:42:17.340 | and some guests on the topic of psychedelics.
00:42:19.720 | Let's leave the discussion about LSD aside,
00:42:23.300 | because do you know why there aren't many studies of LSD?
00:42:25.540 | This is kind of a fun one.
00:42:26.700 | No one is expected to know the answer.
00:42:28.100 | - Well, it's against the law, I think.
00:42:29.740 | - Oh, but so is psilocybin or MDMA,
00:42:31.940 | and there are lots of studies going on about this.
00:42:33.060 | - Oh, there are now, yeah, it's changed,
00:42:34.580 | but when I was growing up, as you know,
00:42:36.700 | it was against the law.
00:42:37.660 | - Right, so what I learned is that
00:42:39.500 | there are far fewer clinical trials
00:42:41.500 | exploring the use of LSD as a therapeutic,
00:42:44.220 | because with the exception of Switzerland,
00:42:46.140 | none of the researchers are willing to stay
00:42:47.540 | in the laboratory as long as it takes
00:42:49.060 | for the subject to get through an LSD journey,
00:42:51.780 | whereas psilocybin tends to be a shorter experience.
00:42:55.020 | Okay, let's talk about psilocybin for a moment.
00:42:58.060 | My read of the data on psilocybin
00:43:00.700 | is that it's still open to question,
00:43:03.180 | but that some of the clinical trials
00:43:05.580 | show pretty significant recovery from major depression.
00:43:08.420 | It's pretty impressive, but if we just set that aside
00:43:11.200 | and say, okay, more needs to be worked out for safety,
00:43:13.620 | what is very clear from the brain imaging studies,
00:43:16.180 | the sort of before and after,
00:43:17.460 | resting state, task-related, et cetera,
00:43:20.020 | is that you get more resting state global connectivity,
00:43:24.820 | more areas talking to more areas
00:43:27.140 | than was the case prior to the use of the psychedelic.
00:43:31.180 | And given the similarity of the psychedelic journey,
00:43:34.300 | and here specifically talking about psilocybin,
00:43:36.380 | to things like rapid eye movement,
00:43:37.780 | sleep, and things of that sort,
00:43:39.340 | I have a very simple question.
00:43:41.920 | Do you think that there's any real benefit
00:43:44.900 | to increasing brain-wide connectivity?
00:43:47.540 | To me, it seems a little bit haphazard,
00:43:49.580 | and yet the clinical data are promising,
00:43:51.700 | if nothing else, promising.
00:43:53.940 | And so is what we're seeking in life
00:43:56.140 | as we acquire new knowledge,
00:43:58.500 | as we learn tennis or golf,
00:44:01.340 | or take up singing or what have you,
00:44:04.760 | as we go from childhood into the late stages of our life,
00:44:08.320 | that whole transition is what we're doing,
00:44:10.920 | increasing connectivity and communication
00:44:14.060 | between different brain areas?
00:44:15.140 | Is that what the human experience is really about?
00:44:18.060 | Or is it that we're getting more modular?
00:44:19.700 | We're getting more segregated in terms of this area,
00:44:21.980 | talking to this area in this particular way.
00:44:24.860 | Feel free to explore this in any way that feels meaningful,
00:44:28.100 | or to say pass if it's not a good question.
00:44:30.380 | - No, it's a great question.
00:44:31.460 | I mean, you have all these great questions,
00:44:32.980 | and we don't have complete answers yet,
00:44:35.340 | but specifically with regard to connectivity,
00:44:38.480 | if you look at what happens in an infant's brain
00:44:43.740 | during the first two years,
00:44:45.060 | there's a tremendous amount of new synapses being formed.
00:44:48.280 | This is your area, by the way.
00:44:49.900 | You know more about this than I do.
00:44:51.940 | - That's true.
00:44:52.780 | - But then you prune them, right?
00:44:54.820 | Then the second phase is that you have overabundant synapses,
00:44:59.420 | and now what you wanna do is to prune them.
00:45:01.580 | Why would you wanna do that?
00:45:03.580 | Well, you know, synapses are expensive.
00:45:07.540 | It takes a lot of energy to activate all of the neurons,
00:45:13.980 | and the synapses especially,
00:45:16.460 | 'cause there's the turnover of the neurotransmitter.
00:45:19.980 | And so what you wanna do is to reduce the amount of energy
00:45:24.980 | and only use those synapses
00:45:27.100 | that have been proven to be the most important, right?
00:45:31.280 | Now, unfortunately, as you get older,
00:45:34.100 | the pruning slows down, but doesn't go away.
00:45:38.260 | And so the cortex thins and so forth.
00:45:43.160 | So I think it goes in the opposite direction.
00:45:45.460 | I think that as you get older, you're losing connectivity.
00:45:50.040 | But interestingly, you retain the old memories.
00:45:52.960 | The old memories are really rock solid,
00:45:55.200 | 'cause they were put in when you were young.
00:45:57.760 | - Yeah, the foundation.
00:45:58.680 | - The foundation upon which everything else is built.
00:46:01.440 | But it's not totally one way,
00:46:07.240 | in the sense that even as an adult, as you know,
00:46:10.920 | you can learn new things, maybe not as quickly.
00:46:13.840 | By the way, this is one of the things that surprised me.
00:46:17.880 | So Barbara and I have looked at the people
00:46:22.640 | who really benefited the most.
00:46:25.520 | It turns out that the peak of the demographic is 25 to 35.
00:46:29.360 | - Barbara?
00:46:30.300 | - Oakley, Oakley.
00:46:31.580 | Yeah, she's really the mastermind.
00:46:33.560 | She's a fabulous educator and background in engineering.
00:46:38.560 | But what's going on?
00:46:41.920 | So it turns out, we aimed our MOOC
00:46:45.880 | at kids in high school and college,
00:46:49.080 | because that's their business.
00:46:50.560 | They go every day and they go into work.
00:46:52.840 | They have to learn, right?
00:46:54.280 | That's their business.
00:46:56.080 | But in fact, very few of the students
00:46:58.800 | are actually, you know, they weren't taking the course.
00:47:02.640 | Why should they?
00:47:03.480 | They spent all day in the class, right?
00:47:05.280 | Why do they want to take another class?
00:47:06.720 | - So this is the learning to learn class.
00:47:10.040 | - Learning how to learn.
00:47:10.880 | - Okay, so you did this with Barbara.
00:47:12.480 | - So I did with Barbara, and now 25 to 35,
00:47:15.760 | we have this huge peak, huge.
00:47:18.560 | So what's going on?
00:47:19.640 | Here's what's going on.
00:47:20.720 | It's very interesting.
00:47:22.800 | So you're 25, you've gone to college.
00:47:25.000 | Half the people, by the way,
00:47:26.000 | who take the course went to college, right?
00:47:27.480 | So it's not like, you know, filling in for college.
00:47:29.920 | This is like topping it off.
00:47:31.400 | But you're in a workforce.
00:47:34.480 | You have to learn new skills.
00:47:35.680 | Maybe you have a mortgage.
00:47:37.360 | Maybe you have children, right?
00:47:38.440 | You can't afford to go off and take a course
00:47:43.440 | or get another degree.
00:47:47.160 | So you take a MOOC and you discover, you know,
00:47:49.920 | I'm not quite as agile as I used to be in terms of learning,
00:47:53.920 | but it turns out with our course,
00:47:56.240 | you can boost your learning.
00:47:58.680 | And so that even though you're not as,
00:48:01.400 | your brain isn't learning as quickly,
00:48:03.920 | you can do it more efficiently.
00:48:05.320 | - This is amazing.
00:48:06.200 | I want to take this course.
00:48:07.960 | I will take this course.
00:48:09.040 | What sort of time commitment is the course?
00:48:12.360 | You already pointed out that it's zero cost,
00:48:14.440 | which is amazing.
00:48:15.280 | - Okay, so it's bite-sized videos
00:48:18.440 | lasting about 10 minutes each.
00:48:20.440 | And there's about 50 or 60 over a course of one month.
00:48:24.280 | - And are you tested?
00:48:25.160 | Are you self-tested?
00:48:26.000 | - Yeah, there are tests.
00:48:26.840 | There are quizzes.
00:48:27.680 | There are tests at the end.
00:48:29.240 | And there are forums where you can go
00:48:32.040 | and talk to other students.
00:48:33.040 | You have questions.
00:48:33.880 | We have TAs.
00:48:35.440 | No, it's-
00:48:36.280 | - And anyone can do this?
00:48:37.120 | - Anyone in the world.
00:48:38.200 | In fact, we have people in India, housewives,
00:48:40.920 | who say, "Thank you, thank you, thank you,"
00:48:42.800 | because I could have never learned
00:48:44.160 | about how to be a better learner.
00:48:46.520 | And I wish I had known this when I was going to school.
00:48:49.300 | - Why do more people not know
00:48:51.520 | about this learning to learn course?
00:48:52.740 | Although, as people know,
00:48:54.640 | if I get really excited about it or about anything,
00:48:57.240 | I'm never going to shut up about it.
00:48:59.120 | But I'm going to take the course first
00:49:00.400 | because I want to understand the guts of it.
00:49:01.920 | - You'll enjoy it.
00:49:03.440 | We have like 98% approval, which is phenomenal.
00:49:07.480 | It's sticky.
00:49:08.760 | Is it math, vocabulary?
00:49:11.140 | - No, no math.
00:49:11.980 | We're not teaching anything specific.
00:49:16.600 | We're not trying to give you knowledge.
00:49:17.880 | We're trying to tell you how to acquire knowledge
00:49:21.000 | and how to do that,
00:49:22.520 | how to deal with exam anxiety, for example,
00:49:25.920 | or how to, you know, we all procrastinate, right?
00:49:29.400 | We put things off.
00:49:30.480 | - Nah, no, I'm kidding.
00:49:31.400 | We all procrastinate.
00:49:32.520 | - How to avoid that.
00:49:33.560 | We teach you how to avoid that.
00:49:35.840 | - Fantastic.
00:49:37.280 | Okay, I'm going to skip back a little bit now
00:49:39.840 | with the intention of double-clicking
00:49:42.520 | on this learning to learn thing.
00:49:44.640 | You pointed out that in particular in California,
00:49:48.640 | but elsewhere as well,
00:49:50.760 | there isn't as much procedural
00:49:52.840 | practice-based learning anymore.
00:49:55.100 | I'm going to play devil's advocate here.
00:49:59.640 | And I'm going to point out that this is not
00:50:01.240 | what I actually believe.
00:50:02.400 | But, you know, when I was growing up,
00:50:04.780 | you had to do your times tables and your division,
00:50:06.960 | and, you know, and then your fractions and your exponents,
00:50:09.680 | and, you know, and they build on one another.
00:50:12.080 | And then at some point, you know,
00:50:14.840 | you take courses where you might need something
00:50:16.400 | like a graphing calculator.
00:50:18.080 | To some people, they can be like, what is this?
00:50:20.520 | But the point being that there were a number of things
00:50:22.360 | that you had to learn to implement functions
00:50:24.280 | and you learn by doing, you learn by doing.
00:50:28.200 | Likewise, in physics class, we, you know,
00:50:31.040 | we were attaching things to strings
00:50:32.480 | and for macromechanics and learning that stuff.
00:50:35.160 | Okay, and learning from the chalkboard lectures.
00:50:38.680 | I can see the value of both, certainly.
00:50:42.900 | And you explained that the brain needs both
00:50:45.520 | to really understand knowledge
00:50:47.520 | and how to implement and back and forth.
00:50:50.320 | But nowadays, you know, you'll hear the argument,
00:50:52.520 | well, why should somebody learn how to read a paper map
00:50:55.440 | unless it's the only thing available
00:50:56.720 | because you have Google Maps?
00:50:58.560 | Or if they want to do a calculation,
00:51:00.000 | they just put it into the top bar function on the internet
00:51:02.660 | and boom, out comes the answer.
00:51:05.180 | So there is a world where certain skills
00:51:08.220 | are no longer required.
00:51:10.260 | And one could argue that the brain space
00:51:15.260 | and activity and time and energy in particular
00:51:21.340 | could be devoted to learning new forms of knowledge
00:51:24.940 | that are going to be more practical
00:51:27.020 | in the school and workforce going forward.
00:51:30.360 | - So how do we reconcile these things?
00:51:32.900 | I mean, I'm of the belief that the brain is doing math
00:51:35.460 | and you and I agree it's electrical signals
00:51:37.400 | and chemical signals and it's doing math
00:51:38.860 | and it's running algorithms.
00:51:40.260 | I think you convinced us of that, certainly.
00:51:43.220 | But how are we to discern what we need to learn
00:51:47.700 | versus what we don't need to learn
00:51:49.140 | in terms of building a brain that's capable of learning
00:51:52.420 | the maximum number of things or even enough things
00:51:55.420 | so that we can go into this very uncertain future?
00:51:58.220 | Because as far as you know,
00:51:59.760 | and I know neither of us have a crystal ball.
00:52:02.300 | So what is essential to learn?
00:52:05.620 | And for those of us that didn't learn certain things
00:52:07.940 | in our formal education, what should we learn how to learn?
00:52:11.740 | - Well, this is generational.
00:52:17.720 | Okay.
00:52:18.980 | So technologies provide us with tools.
00:52:25.260 | You mentioned the calculator, right?
00:52:29.460 | Well, a calculator didn't eliminate
00:52:32.980 | the education you need to get in math,
00:52:37.100 | but it made certain things easier.
00:52:38.900 | It made it possible for you to do more things
00:52:42.080 | and more accurately.
00:52:43.260 | However, interestingly, students in my class
00:52:46.700 | often come up with answers that are off
00:52:49.900 | by eight orders of magnitude.
00:52:51.900 | And that's a huge amount, right?
00:52:53.540 | It's clear that they didn't key in the calculator properly,
00:52:57.060 | but they didn't recognize that it was very far off,
00:53:01.340 | that it was completely off the beam,
00:53:03.560 | because they didn't have a good feeling for the numbers.
00:53:05.740 | They don't have a good sense of exactly how big
00:53:09.060 | it should have been, order of magnitude,
00:53:11.300 | basic understanding.
00:53:13.580 | So it's kind of a,
00:53:15.260 | the benefit is that you can do things faster, better,
00:53:20.300 | but then you also lose some of your intuition
00:53:23.980 | if you don't have the procedural system in place.
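A minimal sketch of the kind of order-of-magnitude sanity check described above, in Python. The example calculation, the back-of-envelope estimate, and the one-power-of-ten tolerance are illustrative assumptions, not anything from the conversation:

```python
import math

def same_order_of_magnitude(calculated, estimate, tolerance=1.0):
    """True if the two values are within `tolerance` powers of ten of each other."""
    return abs(math.log10(abs(calculated)) - math.log10(abs(estimate))) <= tolerance

# Example: meters light travels in one year.
calculated = 3.0e8 * 60 * 60 * 24 * 365   # what the calculator spits out (~9.5e15)
estimate = 1e16                            # rough "feel for the numbers" guess

# If these disagree by many orders of magnitude, suspect a keying error.
print(same_order_of_magnitude(calculated, estimate))  # True
```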
00:53:26.500 | - And think about a kid that wants to be a musician
00:53:28.660 | who uses AI to write a song about a bad breakup
00:53:33.660 | that then is kind of recovered when they find new love.
00:53:39.140 | And I'm guessing that you could do this today
00:53:42.540 | and get a pretty good song out of AI,
00:53:44.360 | but would you call that kid a songwriter or a musician?
00:53:48.060 | On the face of it, yeah, the AI is helping.
00:53:51.180 | And then you'd say, well, that's not the same
00:53:52.780 | as sitting down with a guitar
00:53:54.020 | and trying out different chords
00:53:56.100 | and feeling the intonation in their voice.
00:53:58.140 | But I'm guessing that for people
00:54:00.500 | that were on the electric guitar,
00:54:01.620 | they were criticizing people on the acoustic guitar.
00:54:04.060 | You know, so we have this generational thing
00:54:05.780 | where we look back and say, that's not the real thing.
00:54:08.040 | You need to get the, so what are the key fundamentals
00:54:11.060 | is really a critical question.
00:54:12.300 | - Okay, so I'm going to come back to that
00:54:14.620 | because this is how, the way you put it at the beginning
00:54:18.060 | had to do with whether your,
00:54:21.300 | how your brain is allocating resources, okay?
00:54:24.620 | So when you're younger, you can take in things.
00:54:29.100 | Your brain is more malleable.
00:54:30.320 | For example, how good are you on social media?
00:54:35.320 | - Well, I do all my own Instagram and Twitter
00:54:40.260 | and those accounts have grown
00:54:42.460 | in proportion to the amount of time I've been doing it.
00:54:44.540 | So yeah, I would say pretty good.
00:54:45.600 | I mean, I'm not the biggest account on social media,
00:54:48.500 | but for a science health account, we're doing okay.
00:54:51.140 | I'm thanks to the audience.
00:54:52.620 | - Well, this speaks well for the fact
00:54:54.300 | that you've managed to break, you know,
00:54:58.660 | to go beyond the generation gap because-
00:55:00.620 | - I can type with my thumbs, Terry.
00:55:02.140 | - Okay, there you go.
00:55:03.660 | That's a manual skill that you've learned.
00:55:05.580 | - That's a new phenomenon in human evolution.
00:55:08.340 | - I couldn't believe it.
00:55:09.340 | I saw people doing that and now I can do it too.
00:55:11.740 | But the thing is that if you learn how to do that early
00:55:16.400 | in life, you're much better at it.
00:55:18.940 | You can move your thumbs much more quickly.
00:55:22.100 | Also, you can have many more, you know, tweets going
00:55:26.620 | and we're not, what are they called now?
00:55:27.860 | They're not called tweets.
00:55:28.700 | - So on X, I think they still call them tweets
00:55:30.220 | because you can't, it's hard to verb the letter X.
00:55:34.860 | Elon didn't think of that one.
00:55:36.060 | I like X 'cause it's cool.
00:55:37.260 | It's kind of punk and it's got a black kind of format
00:55:40.900 | and it fits with kind of the, you know,
00:55:43.740 | the engineer, like black X, you know,
00:55:46.300 | and this kind of thing.
00:55:47.140 | But yeah, we'll still call them tweets.
00:55:48.700 | - Okay, we'll call them tweets.
00:55:49.700 | Okay, that's good.
00:55:51.020 | But you know, I walk across campus and I see everybody,
00:55:55.260 | like half the people are tweeting or, you know,
00:55:58.700 | they're doing something with their cell phone.
00:56:00.700 | They're, I mean, it's unbelievable.
00:56:02.220 | - And you have beautiful sunsets at the Salk Institute.
00:56:04.460 | We'll put a link to one of them.
00:56:05.580 | I mean, it is truly spectacular, awe-inspiring
00:56:10.140 | to see a sunset at the Salk Institute.
00:56:12.140 | - Every day is different.
00:56:13.220 | - And everyone's on their phones these days, sad.
00:56:15.500 | - And you know, they're looking down at their phone
00:56:17.180 | and they're walking along, even people
00:56:18.580 | who are skateboarding, unbelievable.
00:56:20.700 | I mean, you know, it's amazing what a human being can do,
00:56:22.740 | you know, when they learn, get into something.
00:56:24.980 | But what happens is the younger generation
00:56:27.140 | picks up whatever technology it is
00:56:28.700 | and the brain gets really good at it.
00:56:31.180 | And you can pick it up later, but you're not quite as agile,
00:56:34.940 | not quite as maybe obsessive.
00:56:37.660 | - It fatigues me, I will point this out,
00:56:40.180 | that doing anything on my phone feels fatiguing
00:56:43.780 | in a way that reading a paper book
00:56:45.740 | or even just writing on a laptop or a desktop computer
00:56:48.700 | is fundamentally different.
00:56:49.660 | I can do that for many hours.
00:56:51.060 | If I'm on social media for more than a few minutes,
00:56:53.140 | I can literally feel the energy draining out of my body.
00:56:57.020 | - Interesting.
00:56:57.860 | - I would, I could do sprints or deadlifts for hours
00:57:02.700 | and not feel the kind of fatigue that I feel
00:57:04.780 | from doing social media.
00:57:06.260 | - So, you know, this is fascinating.
00:57:08.700 | I'd like to know what's going on in your brain.
00:57:11.020 | Why is it, and also I'd like to know from younger people
00:57:14.500 | whether they have the same.
00:57:15.580 | I think not.
00:57:16.540 | I think my guess is that they don't feel fatigued
00:57:18.420 | because they got into this early enough.
00:57:20.820 | And this is actually a very, very,
00:57:26.480 | I think that it has a lot to do with the foundation
00:57:31.600 | you put into your brain.
00:57:32.900 | In other words, things that you learn
00:57:36.700 | when you're really young are foundational
00:57:39.580 | and they make things easier, some things easier.
00:57:42.460 | - Yeah, I spent a lot of time in my room as a kid,
00:57:45.260 | either playing with Legos or action figures
00:57:47.420 | or building fish tanks or reading about fish.
00:57:49.780 | I tended to read about things
00:57:51.580 | and then do a lot of procedural-based activities.
00:57:55.860 | You know, I would read skateboard magazines and skateboard.
00:57:58.760 | I was never one to really just watch a sport
00:58:00.900 | and not play it.
00:58:02.460 | So, you know, bridging across these things.
00:58:04.620 | So social media, to me, feels like an energy sink.
00:58:08.460 | But of course, I love the opportunity
00:58:10.040 | to be able to teach to people
00:58:11.380 | and learn from people at such scale.
00:58:13.420 | But at an energetic level,
00:58:14.900 | I feel like I don't have a foundation for it.
00:58:17.500 | It's like, I'm trying to like jerry-rig my cognition
00:58:21.940 | into doing something that it wasn't designed to do.
00:58:23.880 | - Well, there you go.
00:58:24.720 | And it's because you don't have the foundation.
00:58:26.780 | You didn't do it when you were younger.
00:58:28.180 | And now you have to sort of use the cognitive powers
00:58:32.380 | to do a lot of what was being done now
00:58:34.620 | in a younger person procedurally.
00:58:37.220 | - I'd like to take a quick break
00:58:38.340 | and thank one of our sponsors, Element.
00:58:40.900 | Element is an electrolyte drink
00:58:42.420 | that has everything you need and nothing you don't.
00:58:45.100 | That means the electrolytes,
00:58:46.340 | sodium, magnesium, and potassium,
00:58:48.380 | in the correct ratios, but no sugar.
00:58:50.700 | We should all know that proper hydration is critical
00:58:52.780 | for optimal brain and body function.
00:58:54.700 | In fact, even a slight degree of dehydration
00:58:56.860 | can diminish your cognitive and physical performance
00:58:59.260 | to a considerable degree.
00:59:00.660 | It's also important that you're not just hydrated,
00:59:02.600 | but that you get adequate amounts of electrolytes
00:59:04.660 | in the right ratios.
00:59:05.940 | Drinking a packet of Element dissolved in water
00:59:08.320 | makes it very easy to ensure
00:59:09.680 | that you're getting adequate amounts of hydration
00:59:11.780 | and electrolytes.
00:59:13.060 | To make sure that I'm getting proper amounts of both,
00:59:15.340 | I dissolve one packet of Element
00:59:16.900 | in about 16 to 32 ounces of water
00:59:18.900 | when I wake up in the morning,
00:59:20.220 | and I drink that basically first thing in the morning.
00:59:22.900 | I'll also drink a packet of Element dissolved in water
00:59:25.040 | during any kind of physical exercise that I'm doing,
00:59:27.460 | especially on hot days when I'm sweating a lot
00:59:29.580 | and losing water and electrolytes.
00:59:31.780 | There are a bunch of different
00:59:32.660 | great tasting flavors of Element.
00:59:34.100 | I like the watermelon, I like the raspberry,
00:59:35.860 | I like the citrus.
00:59:36.700 | Basically, I like all of them.
00:59:38.100 | If you'd like to try Element,
00:59:39.280 | you can go to drinkLMNT.com/huberman
00:59:42.460 | to claim an Element sample pack
00:59:43.960 | with the purchase of any Element drink mix.
00:59:46.180 | Again, that's drink LMNT, spelled L-M-N-T.
00:59:48.940 | So it's drinkLMNT.com/huberman
00:59:51.980 | to claim a free sample pack.
00:59:53.660 | Today's episode is also brought to us by Joovv.
00:59:56.740 | Joovv makes medical grade red light therapy devices.
01:00:00.060 | Now, if there's one thing
01:00:00.900 | that I've consistently emphasized on this podcast
01:00:03.500 | is the incredible impact that light can have on our biology.
01:00:06.700 | Now, in addition to sunlight,
01:00:08.020 | red light and near infrared light
01:00:09.580 | have been shown to have positive effects
01:00:11.020 | on improving numerous aspects of cellular and organ health,
01:00:14.300 | including faster muscle recovery,
01:00:16.260 | improved skin health and wound healing,
01:00:18.320 | improvements in acne, reduced pain and inflammation,
01:00:21.700 | improved mitochondrial function,
01:00:23.340 | and even improving vision itself.
01:00:25.940 | Now, what sets Joovv lights apart
01:00:27.540 | and why they're my preferred red light therapy devices
01:00:30.140 | is that they use clinically proven wavelengths,
01:00:32.260 | meaning they use specific wavelengths
01:00:34.260 | of red light and near infrared light in combination
01:00:37.060 | to trigger the optimal cellular adaptations.
01:00:39.820 | Personally, I use the Joovv whole body panel
01:00:41.900 | about three to four times a week,
01:00:43.380 | and I use the Joovv handheld light
01:00:45.220 | both at home and when I travel.
01:00:47.220 | If you'd like to try Joovv,
01:00:48.460 | you can go to Joovv, spelled J-O-O-V-V.com/huberman.
01:00:53.060 | Joovv is offering Black Friday discounts
01:00:54.980 | of up to $1,300 now through December 2nd, 2024.
01:00:58.780 | Again, that's Joovv, J-O-O-V-V.com/huberman
01:01:03.000 | to get up to $1,300 off select Joovv products.
01:01:06.700 | I'm gonna tell you something
01:01:08.300 | which is gonna help all of your listeners.
01:01:11.260 | My book, "ChatGPT and the Future of AI,"
01:01:14.500 | I went through and I looked at other people's experiences
01:01:17.660 | with ChatGPT.
01:01:18.500 | I just wanted to know what people were thinking,
01:01:20.820 | and I came across, it was an article,
01:01:24.260 | I think it was "The New York Times,"
01:01:25.800 | of a technical writer who decided
01:01:28.340 | she would spend one month using it
01:01:30.160 | to help her write things, her articles.
01:01:32.820 | And she said that when she started out,
01:01:36.460 | you know, at the end of the day, she was drained,
01:01:39.040 | completely drained.
01:01:41.740 | And it was like, you know, working on a machine,
01:01:43.740 | you know, like a tractor or something.
01:01:45.620 | You know, you're struggling, struggling, struggling
01:01:47.500 | to get it to work.
01:01:48.400 | And then she started, said, "Well, wait a second.
01:01:55.700 | "You know, what if I treat it like a human being?
01:01:59.440 | "What if I'm polite instead of, you know, being curt?"
01:02:04.140 | So she said, "Suddenly, I started getting better answers
01:02:08.300 | "by being polite and, you know, back and forth
01:02:13.600 | "the way you with a human, you know."
01:02:15.160 | - So saying, "Could you please give me information
01:02:17.360 | "about so-and-so?" - Yeah, "Please,
01:02:18.200 | "I'm really having trouble."
01:02:19.180 | No, you know, that answer you gave me was fabulous,
01:02:21.780 | is exactly what I was looking for.
01:02:23.260 | And, you know, now I need you to go on to the next part
01:02:26.900 | and help me with that, too.
01:02:28.100 | In other words, the way you talk to a human, right,
01:02:30.020 | if you're an assistant, that you're--
01:02:31.020 | - Or is it that she was talking to the AI
01:02:34.060 | to chat GPT, it sounds like in this case,
01:02:36.620 | in the way that her brain was familiar
01:02:38.980 | with asking questions to a human?
01:02:41.320 | In other words, so is the AI learning her
01:02:45.700 | and therefore giving her the sorts of answers
01:02:47.500 | that are more facile for her to integrate with?
01:02:50.340 | - I think it's both.
01:02:51.620 | First of all, ChatGPT is mirroring you:
01:02:56.620 | the way you treat it, it will mirror that back.
01:02:59.140 | You treat it like a machine,
01:03:00.860 | it will treat you like a machine, okay?
01:03:03.020 | 'Cause that's what it's good at.
01:03:04.980 | But here's the surprise.
01:03:06.540 | Surprise is, she said, "Once I started treating it
01:03:10.460 | "like a human, at the end of the day,
01:03:12.800 | "I wasn't fatigued anymore."
01:03:16.800 | Well, it turns out that all your life,
01:03:18.100 | you interact with humans in a certain way,
01:03:21.540 | and your brain is wired to do that,
01:03:24.300 | and it doesn't take any effort.
01:03:26.600 | And so by treating ChatGPT as if it were a human,
01:03:31.620 | you're taking advantage of all the brain circuits
01:03:33.360 | in your brain.
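To make the anecdote concrete, here is a rough sketch of the two prompting styles being contrasted. The wording of both prompts is invented for illustration, not taken from the article or the conversation:

```python
# Curt, machine-style prompt: tends to pull curt, machine-style answers back.
curt_prompt = "Summarize this report. 200 words."

# Conversational prompt in the register of ordinary human dialogue,
# which is the style the writer in the anecdote switched to.
conversational_prompt = (
    "That last summary was exactly what I was looking for, thank you. "
    "Could you please do the same for this report, in about 200 words, "
    "and keep the same friendly tone?"
)
```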
01:03:34.240 | - This is incredible, and I'll tell you why,
01:03:37.540 | because I think many people, not just me,
01:03:39.420 | but many people really enjoy social media, learn from it.
01:03:44.220 | I mean, yesterday I learned a few things
01:03:46.120 | that I thought were just fascinating
01:03:47.600 | about how we perceive our own identity,
01:03:51.460 | according to whether or not we're filtering it
01:03:52.800 | through the responses of others,
01:03:54.140 | or whether or not we take a couple of minutes
01:03:55.660 | and really just sit and think about
01:03:57.020 | how we actually feel about ourselves.
01:03:58.340 | Very interesting ideas about locus of self-perception
01:04:01.280 | and things like that.
01:04:02.160 | I also looked at a really cool video
01:04:04.100 | of a baby raccoon popping bubbles
01:04:06.160 | while standing on its hind limbs.
01:04:07.980 | And that was really cool,
01:04:08.860 | and social media could provide me both those things
01:04:10.900 | within a series of minutes.
01:04:12.580 | And I was thinking to myself, this is crazy, right?
01:04:14.500 | The raccoon is kind of trivial, but it delighted me,
01:04:17.380 | and that's not trivial.
01:04:18.220 | - There you go, yeah.
01:04:19.060 | - So, but here's the question.
01:04:22.560 | Could it be that one of the detrimental aspects
01:04:27.740 | of social media is that if we're complimenting one another,
01:04:32.160 | or if we are giving hearts, or we're giving thumbs down,
01:04:35.240 | or we're in an argument with somebody,
01:04:36.760 | or we're doing a clap back, or they're clapping back on us,
01:04:39.920 | or dunking, as it's called on X,
01:04:42.240 | that it isn't necessarily the way that we learned to argue.
01:04:48.480 | It's not necessarily the way that we learned
01:04:50.360 | to engage in healthy dispute.
01:04:52.520 | And so, as a consequence, it feels like,
01:04:54.800 | and this is my experience,
01:04:55.880 | that certain online interactions feel really good,
01:04:59.060 | and others feel like they kind of grate on me,
01:05:01.720 | like because there's almost like an action step
01:05:03.800 | that isn't allowed, like you can't fully explain yourself,
01:05:06.200 | or understand the other person.
01:05:08.320 | And I am somebody who believes in the power
01:05:11.480 | of real face-to-face dialogue,
01:05:13.120 | or at least on the phone dialogue.
01:05:14.680 | And I feel the same way about text messaging.
01:05:17.000 | I hate text messaging.
01:05:18.840 | When text messaging first came out,
01:05:20.300 | I remember thinking,
01:05:21.300 | I was not a kid that passed notes in class.
01:05:25.000 | This feels like passing notes in class.
01:05:27.120 | In fact, this whole text messaging thing is beneath me.
01:05:30.280 | That's how I felt.
01:05:31.120 | And over the years, of course, I became a text messenger.
01:05:33.900 | And it's very useful for certain things,
01:05:35.320 | be there in five minutes, running a few minutes late.
01:05:37.180 | In my case, that's a common one.
01:05:38.780 | But I think this notion of what grates on us,
01:05:44.600 | and as it relates to whether or not
01:05:46.120 | it matches our childhood developed template
01:05:50.120 | of how our brain works, is really key,
01:05:52.240 | because it touches on something
01:05:53.480 | that I definitely wanna talk about today
01:05:55.560 | that I know you've worked on quite a bit,
01:05:58.120 | which is this concept of energy.
01:06:00.640 | What we're talking about here is energy,
01:06:02.080 | not woo-biology, woo-science, wellness energy.
01:06:06.080 | We're talking about,
01:06:07.040 | we only have a finite amount of energy.
01:06:09.280 | And years ago, the great Ben Barris sadly passed away.
01:06:12.520 | Our former colleague and my postdoc advisor
01:06:16.960 | came to me one day in the hallway and he stopped me
01:06:19.000 | and he said, he called me Andy, like you do.
01:06:21.160 | And he said, Andy, how come we get
01:06:24.040 | such a rundown of energy as we get older?
01:06:27.460 | Why am I more tired today than I was 10 years ago?
01:06:30.660 | I was like, I don't know, how are you sleeping?
01:06:31.720 | He's like, I'm sleeping fine.
01:06:32.640 | Ben never slept much in the first place,
01:06:34.120 | but he had a ton of energy.
01:06:35.200 | And I thought to myself, I don't know.
01:06:39.280 | Like, what is this energy thing that we're talking about?
01:06:42.800 | I wanna make sure that we close the hatch
01:06:44.080 | on this notion of a template neural system
01:06:46.800 | that then you either find experiences invigorating
01:06:49.460 | or depleting.
01:06:50.300 | I wanna make sure we close the hatch on that,
01:06:51.760 | but I wanna make sure that we relate it at some point
01:06:54.640 | to this idea of energy.
01:06:56.300 | And why is it that with each passing year of our life,
01:06:58.800 | we seem to have less of it?
01:07:00.720 | - You know, you ask these great questions.
01:07:03.980 | I wish that I had great answers.
01:07:05.860 | - Well, so far you really do have great answers.
01:07:08.320 | They're certainly novel to me in the sense
01:07:10.040 | that I've not heard answers of this sort.
01:07:13.400 | So there's a tremendous amount of learning for me today
01:07:15.440 | and I know for the audience.
01:07:17.500 | - But let's say somebody is 20 years old
01:07:20.140 | versus 50 years old versus what should they do?
01:07:24.060 | I mean, we need to integrate with the modern world.
01:07:25.700 | We also need to relate across generations.
01:07:28.420 | - Oh yeah, no, this is true, this is true.
01:07:30.260 | - People aren't retiring as much, they're living longer.
01:07:33.020 | Birth rates are down,
01:07:34.180 | but we have to get all get along as they say.
01:07:36.700 | - So, you know, it is interesting.
01:07:38.820 | I think it's true that we all, as we get older,
01:07:41.820 | have less of the, you know, the vigor,
01:07:44.660 | if I could use a somewhat different word from energy,
01:07:47.560 | we'll come back to that.
01:07:48.880 | But I think there are some
01:07:51.680 | who manage to keep an active life.
01:07:54.360 | And here's something that, again,
01:07:57.080 | in our MOOC, we really emphasize.
01:07:59.400 | - Could you explain a MOOC?
01:08:00.320 | I think most people won't know what a MOOC is,
01:08:02.400 | just for their sake.
01:08:03.240 | - Okay, this is, they've been around for about,
01:08:05.680 | actually started at Stanford, Andrew Ng and Daphne Koller.
01:08:09.960 | So they have a company called Coursera.
01:08:12.460 | And what happens is that you get professors,
01:08:14.940 | and in fact, anybody who has knowledge
01:08:17.180 | or, you know, professional expertise,
01:08:19.660 | to give lectures that are available to anybody in the world
01:08:23.420 | who have access to the internet.
01:08:24.740 | And, you know, it could,
01:08:27.300 | this is like probably tens of thousands now.
01:08:30.300 | Any specialty, history, science, music, you know,
01:08:35.220 | you name it, there's somebody who's done, you know,
01:08:37.580 | who's an expert on that and wants to tell you,
01:08:39.860 | because they're excited about what they're doing.
01:08:42.040 | Okay, so, you know, what we wanted to do
01:08:47.040 | was to help people with learning.
01:08:49.340 | And so part of the problem is that it gets more difficult.
01:08:53.860 | It takes more effort as you get older.
01:08:56.620 | - It depletes your vigor more,
01:08:58.340 | if we're gonna stay with this language of energy and vigor.
01:09:00.500 | - Yeah, yeah, that's right.
01:09:01.340 | So let's actually use the word energy.
01:09:03.620 | As you know, in the cell, there is a physical power plant
01:09:06.700 | called the mitochondrion, which is supplying us
01:09:11.580 | with ATP, which is the coin of the realm for the cell
01:09:16.100 | to be able to operate all of its machinery, right?
01:09:18.820 | So, and so one of the things that happens
01:09:22.320 | when you get older is that your mitochondria run down.
01:09:26.580 | - You have fewer of them and they're less efficient.
01:09:29.260 | - That's right, they're less efficient.
01:09:30.920 | And actually drugs can do that to you too.
01:09:33.740 | They can harm mitochondria.
01:09:35.460 | - Or recreational drugs.
01:09:36.580 | - No, the drugs you take for illness.
01:09:38.620 | I'm not sure about recreational drugs,
01:09:40.580 | but I know it's the case that there are a lot of drugs
01:09:44.300 | that people take 'cause they have to,
01:09:46.460 | but the other thing, and this is something,
01:09:51.460 | that's the bad news, here's the good news.
01:09:54.260 | The good news is that you can replenish
01:09:58.120 | your energy by exercise.
01:10:00.980 | That exercise is the best drug you could ever take.
01:10:06.420 | It's the cheapest drug you could ever take.
01:10:08.720 | That can help every organ in your body.
01:10:12.040 | It helps obviously your heart.
01:10:14.400 | It helps your brain, it rejuvenates your brain.
01:10:19.900 | It helps your immune system.
01:10:21.640 | Every single organ system in the body
01:10:24.060 | benefits from regular exercise.
01:10:25.940 | I run on the beach every day at the Salk Institute.
01:10:29.700 | I can, and also, it's on a mesa, 340 feet above.
01:10:34.360 | So I go down every day and then I climb up the cliff.
01:10:38.360 | - Yeah, those steps down to Black's Beach,
01:10:40.180 | they're a good workout.
01:10:41.980 | - They are, they are, and so this is something
01:10:44.940 | that has kept me active, and I do hiking.
01:10:47.640 | I went hiking in the Alps last fall.
01:10:51.180 | So this is, in September, so this is, I think,
01:10:54.660 | something that people really ought to realize
01:10:57.920 | is that it's like putting away reserves of energy
01:11:02.920 | for when you get older.
01:11:06.580 | The more you put away, the better off you are.
01:11:09.360 | Here's something else.
01:11:10.380 | Okay, now this is jumping now to Alzheimer's.
01:11:13.540 | So a study that was done in China many, many years ago,
01:11:17.380 | when I first came to La Jolla, San Diego,
01:11:21.320 | I heard this from the, it was the head
01:11:25.140 | of the Alzheimer's program.
01:11:26.900 | He had done a study in China on onset,
01:11:31.900 | and they went and they had three populations.
01:11:35.220 | They had peasants who had almost no education,
01:11:38.620 | then they had another group that had high school education,
01:11:40.580 | and then people who were, you know, advanced education.
01:11:43.940 | So it turns out that the onset of Alzheimer's
01:11:47.320 | was earlier for the people who had no education,
01:11:49.740 | and it was the latest for the people
01:11:52.620 | who had the most education.
01:11:54.560 | Now this is interesting, isn't it?
01:11:56.580 | 'Cause it's, and presumably the genes
01:11:59.180 | aren't that different, right?
01:12:00.340 | I mean, they're all Chinese.
01:12:02.700 | So one possibility, and obviously we don't really know why,
01:12:06.580 | but one possibility is that the more you exercise your brain
01:12:11.580 | with education, the more reserve you have later in life.
01:12:16.780 | - I believe in the notion,
01:12:19.440 | and I don't have a better word for it,
01:12:20.980 | maybe you do, or a phrase for it,
01:12:22.920 | is of kind of a cognitive velocity.
01:12:27.920 | You know, I sometimes will play with this.
01:12:29.500 | I'll read slowly, or I'll see where my default pace
01:12:32.720 | of reading is at a given time of day,
01:12:34.300 | and then I'll intentionally try and read
01:12:35.980 | a little bit faster while also trying
01:12:38.300 | to retain the knowledge I'm reading.
01:12:40.220 | So I'm not just reading the words,
01:12:41.340 | I'm trying to absorb the information.
01:12:43.580 | And you can feel the energetic demand of that.
01:12:46.460 | And then I'll play with it.
01:12:47.340 | I'll kind of back off a little bit,
01:12:49.220 | and then I'll go forward.
01:12:50.060 | And I try and find the sweet spot
01:12:51.500 | where I'm not reading at the pace that is reflexive,
01:12:56.500 | but just a little bit quicker
01:12:58.340 | while also trying to retain the information.
01:13:02.020 | And I learned this when I had a lot of catching up to do
01:13:05.540 | at one phase of my educational career.
01:13:07.580 | Fortunately, it was pretty early,
01:13:08.740 | and I was able to catch up on most things.
01:13:10.500 | You know, occasionally things slip through,
01:13:11.780 | and I have to go back and learn how to learn, you know?
01:13:14.460 | And if I get anything wrong on the internet,
01:13:15.900 | they sure as heck point it out,
01:13:17.860 | and then we go back and learn.
01:13:18.700 | And guess what?
01:13:19.540 | I'd never forget that because punishment,
01:13:22.180 | social punishment is a great signal.
01:13:24.460 | So thank you all for keeping me learning.
01:13:29.460 | But I picked that up from my experience
01:13:33.500 | of trying to get good at things like skateboarding
01:13:37.220 | or soccer when I was younger.
01:13:38.460 | There's a certain thing that happens when skateboarding,
01:13:41.540 | that was my sport growing up,
01:13:42.620 | where it's actually easier to learn something going faster.
01:13:47.620 | You know, most kids try and learn how to ollie and kickflip
01:13:49.740 | standing in the living room on the carpet.
01:13:51.580 | That's the worst way to learn how to do it.
01:13:53.120 | It's all easier going a bit faster than you're comfortable.
01:13:57.460 | It's also the case that if you're not paying attention,
01:14:00.820 | you can get hurt.
01:14:01.900 | It's also the case that if you pay
01:14:03.420 | too much cognitive attention,
01:14:05.100 | you can't perform the motor movements.
01:14:06.940 | So there's this sweet spot that eventually
01:14:08.740 | I was able to translate into an understanding
01:14:11.580 | of when I sit down to read a paper or a news article,
01:14:14.860 | or even listen to a podcast,
01:14:16.180 | there's a pace of the person's voice,
01:14:17.720 | and then I'll adjust the rate of the audio,
01:14:20.620 | where I have to engage cognitively,
01:14:24.140 | and I know I'm in a mode of retaining
01:14:26.640 | the information and learning.
01:14:28.240 | Whereas if I just go with my reflexive pace,
01:14:30.160 | it's rare that I'm in that perfect zone.
01:14:33.400 | So I point this out because perhaps
01:14:35.600 | it will be useful to people.
01:14:36.680 | I don't know if it's incorporated into your learning
01:14:38.200 | how to learn course,
01:14:39.040 | but I do think that there is something,
01:14:41.540 | which I call kind of cognitive velocity,
01:14:45.320 | which is ideal for learning
01:14:46.880 | versus kind of leisurely scrolling.
01:14:48.840 | And this is why I think that social media is detrimental.
01:14:51.260 | I think that we train our brain basically
01:14:54.120 | to be slow, passive, and multi-context cycling through.
01:14:58.680 | And unless something is very high salience,
01:15:03.000 | it kind of makes us kind of fat and lazy,
01:15:05.680 | forgive the language, but I'm going to be blunt here,
01:15:07.680 | fat and lazy cognitively,
01:15:09.900 | unless we make it a point to also engage learning.
01:15:12.720 | And my guess is it's tapping into this mitochondrial system.
01:15:16.520 | Very likely, that's one part of it.
01:15:18.780 | By the way, the way that you've adjusted the speed
01:15:23.320 | is very interesting because it turns out that stress,
01:15:27.880 | everybody thinks stress is bad,
01:15:29.200 | but no, it turns out stress that is transient,
01:15:32.600 | that is only for a limited amount of time
01:15:34.880 | that you control is good for you.
01:15:37.080 | It's good for your brain.
01:15:38.200 | It's good for your body.
01:15:39.320 | I run intervals on the beach,
01:15:41.440 | just the way that you do cognitive intervals
01:15:44.740 | when you're reading.
01:15:45.800 | In other words, I run like hell for about 10 seconds.
01:15:49.440 | And then I go to a jog
01:15:51.640 | and I run like hell for another 10 seconds.
01:15:53.840 | And it's pushing your body into that extra gear
01:15:58.840 | that helps the muscles.
01:16:00.800 | The muscles need to know that
01:16:02.040 | this is what they've got to put out.
01:16:03.840 | And that's where you gain muscle mass,
01:16:07.760 | not from just doing the same running pace every day.
01:16:11.960 | - Well, your intellectual and physical vigor
01:16:13.840 | is undeniable.
01:16:15.960 | I've known you a long time.
01:16:16.800 | You've always had a slight forward center of mass
01:16:20.840 | in your intellect.
01:16:22.800 | And even the speed at which you walk, Terry, dare I say,
01:16:26.320 | for a Californian, you're a quick walker.
01:16:28.460 | - Okay.
01:16:29.300 | - Yeah, so that's a compliment, by the way.
01:16:31.480 | East coasters know what I'm talking about.
01:16:33.020 | And Californians would be like, you know,
01:16:35.880 | why not slow down?
01:16:36.960 | The reason to not slow down too much for too long
01:16:40.580 | is that these mitochondrial systems,
01:16:43.600 | the energy of the brain and body, as you point out,
01:16:45.720 | are very linked.
01:16:46.840 | And I do think that below a certain threshold,
01:16:49.680 | it makes it very hard to come back.
01:16:50.860 | Just like below a certain threshold,
01:16:52.440 | it's hard to exercise without getting very depleted
01:16:55.440 | or even injured, that we need to maintain this.
01:16:58.160 | So perhaps now would be a good time to close the hatch
01:17:01.460 | on this issue of how to teach young people.
01:17:06.460 | Everyone should take this learning to learn course
01:17:08.560 | as a free resource, amazing.
01:17:13.000 | As it relates to AI,
01:17:14.560 | do you think that young people and older people now,
01:17:20.600 | I'm 49, so I'll put myself in the older bracket,
01:17:23.960 | should be learning how to use AI?
01:17:26.860 | - They are already learning how to use AI.
01:17:29.980 | And again, it's just like new technology comes along,
01:17:33.480 | who picks it up first?
01:17:34.480 | It's the younger people.
01:17:36.080 | And it's astonishing.
01:17:38.000 | You know, they're using it a lot more than I am.
01:17:40.560 | You know, I use it almost every day,
01:17:42.280 | but I know a lot of students who basically,
01:17:46.080 | and by the way, it's like any other tool.
01:17:49.440 | It's a tool that you need to know how to use it.
01:17:54.440 | - Where do you suggest people start?
01:17:55.820 | So I have started using Claude AI.
01:18:00.820 | This was suggested to me by somebody expert in AI
01:18:05.440 | as an alternative to ChatGPT.
01:18:07.220 | I don't have anything against ChatGPT,
01:18:09.320 | but I'll tell you, I really like the aesthetic of Claude AI.
01:18:13.880 | It's a bit of a softer beige aesthetic.
01:18:16.720 | It feels kind of Apple-like.
01:18:18.500 | I like the Apple brand and it gives me answers.
01:18:22.040 | Maybe it's the font.
01:18:23.040 | Maybe it's the feel.
01:18:24.040 | Maybe this goes back to the example you used earlier
01:18:25.880 | where I like Claude AI and I'm a big fan of it.
01:18:30.220 | And they don't pay me to say this.
01:18:31.520 | I have never met them.
01:18:32.780 | I have no relationship to them,
01:18:33.780 | except that it gives me answers in a bullet pointed format
01:18:37.240 | that feels very aesthetically easy
01:18:39.480 | to transfer that information into my brain or onto a page.
01:18:43.320 | So I like Claude AI, use ChatGPT.
01:18:45.480 | How should people start to explore AI
01:18:47.800 | for sake of getting smarter, learning knowledge,
01:18:51.680 | just for the sake of knowledge, having fun with it?
01:18:53.680 | What's the best way to do that?
01:18:55.240 | - Well, I think exactly what you did,
01:18:58.800 | which is there's now dozens and dozens
01:19:01.720 | of different chatbots out there
01:19:04.760 | and different people will feel comfortable
01:19:08.000 | with one or the other.
01:19:08.900 | ChatGPT is the first,
01:19:10.360 | so that's why it's kind of taken over
01:19:12.120 | a lot of the cognitive space, right?
01:19:15.280 | It's become like Kleenex, right?
01:19:17.960 | That word, that was why I used it
01:19:20.120 | as the first word in my book, because it's iconic.
01:19:24.240 | But some of them, I have to say that, for example,
01:19:29.080 | there are some that are really much better at math
01:19:30.960 | than others.
01:19:31.800 | - Such as?
01:19:34.120 | - Google's Gemini recently did some fine tuning
01:19:39.120 | with what's called a chain of reasoning.
01:19:44.520 | And when you reason, you go through a sequence of steps.
01:19:47.480 | And when you solve a math problem,
01:19:48.680 | you go through a sequence of steps
01:19:50.840 | of first finding out what's missing and then adding that.
01:19:55.840 | And it went from 20% correct to 80, right?
01:20:01.960 | On those problems.
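A hypothetical sketch of the "chain of reasoning" idea mentioned here: the model is asked to work through intermediate steps before giving an answer, rather than answering in one shot. `ask_model` is a stand-in for whichever chatbot API you actually use, and the example problem and prompt wording are invented:

```python
def ask_model(prompt: str) -> str:
    # Stand-in: wire this up to whatever chatbot API you actually use.
    raise NotImplementedError

question = "A train leaves at 3:40 pm and arrives at 6:05 pm. How long is the trip?"

# One-shot prompt: the model jumps straight to an answer.
direct_prompt = question

# Chain-of-reasoning prompt: ask for the sequence of steps first.
step_by_step_prompt = (
    question + "\n"
    "Work through it in steps: first figure out what is missing, "
    "then do each small calculation, then state the final answer on its own line."
)

# The reported accuracy jump came from getting the model to produce the
# intermediate steps, not from changing the underlying question.
# answer = ask_model(step_by_step_prompt)
```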
01:20:04.160 | - And as people hear that, they probably think,
01:20:05.640 | "Well, that means 20% wrong still."
01:20:07.680 | But could you imagine any human or panel of humans
01:20:11.180 | behind a wall where if you asked it a question
01:20:14.240 | and then another question and another question,
01:20:16.200 | that it would give you back better than 80%
01:20:19.760 | accurate information in a matter of seconds?
01:20:23.200 | - So I think we are being perhaps a little bit unfair
01:20:31.240 | to compare these large language models to the best humans
01:20:36.240 | rather than the average human, right?
01:20:39.120 | As you said, most people couldn't pass the LSAT,
01:20:42.960 | the law test to get into law school,
01:20:45.560 | or MCAT, the test to get into medical school.
01:20:48.560 | And ChatGPT has.
01:20:50.760 | - Is there a world now where we take the existing AI,
01:20:57.920 | LLMs, these computers, basically,
01:21:00.560 | that can learn like a collection of human brains
01:21:03.160 | and send that somehow into the future, right?
01:21:08.080 | Give them an imagined future, okay?
01:21:11.000 | Could we give them outcome A and outcome B
01:21:14.120 | and let them forage into future states
01:21:17.960 | that we are not yet able to get to
01:21:20.760 | and then harness that knowledge
01:21:22.320 | and explore the two different outcomes?
01:21:25.920 | - I think that's perhaps the better question in some sense,
01:21:30.560 | because we can't travel back in time,
01:21:35.400 | but we can perhaps travel into the future with AI
01:21:38.840 | if you provide it different scenarios.
01:21:41.480 | And you say, unlike a panel of people,
01:21:44.760 | panel of experts, medical experts,
01:21:46.840 | or space travel experts, or sea travel experts,
01:21:51.840 | you can't say, hey, you know what?
01:21:53.560 | Don't sleep tonight.
01:21:55.440 | You're just gonna work for the next 48 hours.
01:21:58.440 | In fact, you're gonna work for the next three weeks
01:22:00.720 | or three months.
01:22:01.800 | And you know what?
01:22:02.960 | You're not gonna do anything else.
01:22:04.320 | You're not gonna pay attention to your health.
01:22:05.680 | You're not gonna do anything else,
01:22:06.520 | but you can take a large language model
01:22:09.320 | and you can say, just forage for knowledge
01:22:12.140 | under the following different scenarios,
01:22:14.140 | and then have that fleet of large language models come back
01:22:18.360 | and give us the information like, I don't know, tomorrow.
01:22:22.040 | - Okay, so I've lived through this myself.
01:22:23.960 | Back in the 1980s, I was just starting my career
01:22:28.960 | and I was one of the pioneers in developing
01:22:31.160 | learning algorithms for neural network models.
01:22:33.880 | Jeff Hinton and I collaborated together
01:22:35.520 | on something called the Boltzmann machine,
01:22:37.040 | and he actually won a Nobel Prize for this recently.
01:22:39.000 | - Yeah, just this year.
01:22:39.840 | - Yeah, he's one of my best friends.
01:22:41.880 | Brilliant, and he well-deserved it
01:22:46.280 | for not just the Boltzmann machine,
01:22:47.680 | but all the work he's done since then
01:22:49.600 | on machine learning and then back propagation and so forth.
01:22:54.280 | But back then, Jeff and I had this view of the future.
01:22:59.280 | AI was dominated by symbol processing, rules, logic, right?
01:23:04.800 | Writing computer programs.
01:23:06.520 | For every problem, you need a different computer program,
01:23:08.640 | and it was very human resource intensive to write programs
01:23:13.640 | so that it was very, very slow going.
01:23:17.280 | And they never actually got there.
01:23:18.480 | They never wrote a program for vision, for example,
01:23:20.960 | even though the computer vision community
01:23:23.440 | really worked hard for a long time.
01:23:25.480 | But we had this view of the future.
01:23:27.480 | We had this view that nature has solved these problems,
01:23:30.960 | and there's existence proof
01:23:32.000 | that you can solve the vision problem.
01:23:33.520 | Look, every animal can see, even insects, right?
01:23:36.200 | Come on, let's figure out how they did it.
01:23:39.880 | Maybe we can help by following up on nature.
01:23:43.640 | We can actually, again, going back to algorithms,
01:23:45.760 | I was telling you, and so in the case of the brain,
01:23:49.760 | what makes it different from a digital computer,
01:23:51.520 | digital computers basically can run any program,
01:23:54.000 | but a fly brain, for example, only runs the program
01:23:57.560 | that its special-purpose hardware allows it to run.
01:24:00.520 | - Not much neuroplasticity.
01:24:02.000 | - There's enough there, just enough habituation and so forth
01:24:05.880 | so that it can survive, and this is-
01:24:08.480 | - Survive 24 hours.
01:24:09.800 | I'm not trying to be disparaging to the fly biologists,
01:24:12.080 | but when I think of neuroplasticity,
01:24:13.920 | I think of the magnificent neuroplasticity
01:24:16.040 | of the human brain to customize to a world of experience.
01:24:19.160 | - I agree.
01:24:20.120 | - When I think about a fly,
01:24:21.320 | I think about a really cool set of neural circuits
01:24:23.800 | that work really well to avoid getting swatted,
01:24:28.120 | to eating, and to reproducing, and not a whole lot else.
01:24:32.500 | They don't really build technology.
01:24:34.820 | They might have interesting relationships,
01:24:36.640 | but who knows, who cares?
01:24:38.480 | It's just sort of like, it's not that it doesn't matter.
01:24:41.560 | It's just a question of the lack of plasticity
01:24:44.420 | makes them kind of a meh species.
01:24:47.240 | - Okay, I can see I've pressed your button here.
01:24:49.160 | - No, no, no, no, I love fly biology.
01:24:50.920 | They taught us about algorithms for direction selectivity
01:24:53.280 | in the visual system.
01:24:54.280 | Oh, no, no, I love the Drosophila biology.
01:24:56.500 | I just think that the lack of neuroplasticity
01:24:59.840 | reveals a certain key limitation,
01:25:02.760 | and the reason we're the curators of the earth
01:25:04.720 | is 'cause we have so much plasticity.
01:25:06.520 | - Of course, of course, but you have to,
01:25:09.960 | one step at a time, nature first has to be able
01:25:12.680 | to create creatures that can survive,
01:25:15.560 | and then their brains get bigger
01:25:17.320 | as the environment gets more complex, and here we are,
01:25:21.840 | but the key is that it turns out that certain algorithms
01:25:26.840 | in the fly brain are present in our brain,
01:25:30.200 | like conditioning, classical conditioning.
01:25:32.680 | You can classical condition a fly
01:25:34.280 | in terms of training it to, when you give it a reward,
01:25:38.560 | it will produce the same action, right?
01:25:41.040 | This is like conditioned behavior,
01:25:43.080 | and that algorithm that I told you about
01:25:45.360 | that is in your value function, right?
01:25:47.400 | Temporal difference learning,
01:25:48.440 | that algorithm is in the fly brain, it's in your brain,
01:25:52.520 | so we can learn about learning from many species.
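For readers who want to see what the temporal difference learning rule mentioned here looks like mechanically, here is a minimal TD(0) value-update sketch in Python; the toy states, reward, learning rate, and discount factor are all illustrative assumptions, not a model of any particular experiment:

```python
# Toy episode: a cue is followed by a delay, then a reward.
states = ["cue", "wait", "reward"]
V = {s: 0.0 for s in states}      # value estimates, start at zero
alpha, gamma = 0.1, 0.9           # learning rate, discount factor

def reward_at(state):
    return 1.0 if state == "reward" else 0.0

for episode in range(500):
    for i, s in enumerate(states[:-1]):
        s_next = states[i + 1]
        # TD error: actual (reward + discounted future value) minus prediction.
        delta = reward_at(s_next) + gamma * V[s_next] - V[s]
        V[s] += alpha * delta     # nudge the prediction toward what happened

print(V)  # earlier states come to predict the upcoming reward ("wait" ~1.0, "cue" ~0.9)
```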
01:25:56.040 | - I was just having a little fun
01:25:57.240 | poking at the fly biologists.
01:25:58.600 | I actually think Drosophila has done a great deal,
01:26:00.920 | as has honeybee biology.
01:26:03.320 | For instance, if you give caffeine to bees
01:26:07.040 | on particular flowers, they'll actually try
01:26:10.160 | and pollinate those flowers more,
01:26:12.080 | because they actually like the feeling of being caffeinated.
01:26:14.480 | There's a bad pun about a buzz here,
01:26:15.800 | but I'm not gonna make that pun,
01:26:16.800 | 'cause everyone's done it before.
01:26:18.460 | - Right, right.
01:26:19.300 | - No, I fully absorb and agree with the value
01:26:23.800 | of studying simpler organisms to find the algorithms.
01:26:27.160 | - Right, that's where we are right now.
01:26:29.260 | But now, just go into the future.
01:26:31.960 | Now, I'm telling the story about where we were.
01:26:34.640 | We were predicting the future.
01:26:36.880 | We were saying, this is an alternative to traditional AI.
01:26:41.640 | We were not taken seriously.
01:26:42.940 | Everybody was, experts said, "No, no, write programs,
01:26:46.640 | "write programs."
01:26:47.480 | They were getting all the resources, the grants, the jobs,
01:26:51.240 | and we were just like the little furry mammals
01:26:53.680 | under the feet of these dinosaurs, right, in retrospect.
01:26:56.680 | - I love the analogy.
01:26:58.660 | - But here's the point.
01:26:59.500 | - But the dinosaurs died off.
01:27:01.640 | - But the point I'm making is that it's possible
01:27:03.400 | for our brain to make these extrapolations
01:27:06.160 | into the future.
01:27:07.000 | Why not AI versions of brains?
01:27:10.520 | Why not?
01:27:11.360 | I think your idea is a great one.
01:27:13.780 | - Yeah, I mean, the reason I'm excited about AI,
01:27:18.780 | and increasingly so across the course of this conversation,
01:27:24.120 | is because there are very few opportunities
01:27:29.120 | to forage information at such large scale
01:27:32.920 | and around the circadian clock.
01:27:34.400 | I mean, if there's one thing that we are truly a slave to
01:27:37.200 | as humans is the circadian biology.
01:27:40.440 | You got to sleep sooner or later.
01:27:42.080 | And even if you don't, your cognition really waxes
01:27:44.480 | and wanes across the circadian cycle.
01:27:46.560 | And if you don't, you're going to die early.
01:27:48.320 | We know this.
01:27:49.640 | Computers can work, work, work.
01:27:51.160 | Sure, you got to power them.
01:27:53.680 | There's the cooling thing.
01:27:54.760 | There are a bunch of things related to that,
01:27:55.940 | but that's tractable.
01:28:00.400 | So computers can work, work, work.
01:28:03.040 | And the idea that they can provide a portal into the future
01:28:08.040 | and that they can just bring it back
01:28:09.880 | so we can take a look-see.
01:28:11.480 | I'm not saying we have to implement their advice,
01:28:15.160 | but to be able to send a panel of diverse,
01:28:19.400 | computationally diverse, experientially diverse
01:28:23.320 | AI experts into the future
01:28:26.240 | and bring us back a panel of potential routes to take,
01:28:30.160 | to me is so exciting.
01:28:32.660 | Maybe a good example would be
01:28:34.660 | like treatments for schizophrenia.
01:28:37.700 | This is an area that I want to make sure
01:28:39.580 | that we talk about.
01:28:40.620 | I grew up learning as a neuroscience student
01:28:43.220 | that schizophrenia was somehow a disruption
01:28:46.740 | of the dopamine system,
01:28:47.980 | because if you give neuroleptic drugs
01:28:49.700 | that block dopamine receptors,
01:28:51.200 | that you get some improvement in the motor symptoms
01:28:54.060 | and some of the hallucinations, et cetera.
01:28:56.400 | You now also have people who say,
01:28:58.540 | "No, that's not really the basis of schizophrenia.
01:29:00.620 | "I'd love your thoughts."
01:29:01.460 | And you have incredible work
01:29:02.500 | from people like Chris Palmer at Harvard.
01:29:04.680 | And we even have a department at Stanford now focusing,
01:29:08.620 | we even have people at Stanford now
01:29:10.300 | focusing on what Chris really founded as a field,
01:29:12.340 | which is metabolic psychiatry.
01:29:13.740 | The idea that, who could imagine,
01:29:16.140 | I'm being sarcastic here,
01:29:17.100 | what you eat impacts your mitochondria,
01:29:19.700 | how you exercise impacts your mitochondria,
01:29:21.540 | mitochondria impacts brain function.
01:29:23.160 | And lo and behold,
01:29:25.180 | metabolic health of the brain and body
01:29:27.060 | impacts schizophrenia symptoms.
01:29:29.040 | And he's looked at ways that people can use ketogenic diet,
01:29:32.320 | maybe not to cure, but to treat,
01:29:34.520 | and in some cases, maybe even cure schizophrenia.
01:29:36.480 | So here we are at this place
01:29:38.560 | where we still don't have a "cure" for schizophrenia,
01:29:41.320 | but you could send LLMs into the future
01:29:45.640 | and start to forage the most likely,
01:29:48.000 | all of the data in those fields,
01:29:50.120 | probably could do that in an hour,
01:29:52.300 | plus come up with a bunch of hypothesized
01:29:56.600 | different positive and negative result clinical trials
01:29:59.520 | that don't even exist yet.
01:30:01.080 | 10,000 subjects in Scandinavia who go on ketogenic diet,
01:30:05.680 | who have a certain level of susceptibility to schizophrenia
01:30:10.360 | based on what we know from twin studies,
01:30:12.000 | things that never, ever, ever would be possible to do
01:30:16.520 | in an afternoon, maybe even in a year,
01:30:19.160 | there isn't funding, there isn't.
01:30:20.620 | And boom, get the answers back
01:30:22.800 | and let them present us those answers.
01:30:24.600 | And then you say, "Well, it's artificial."
01:30:26.760 | But so are human brains coming up with these experiments.
01:30:30.040 | So to me, I'm starting to realize
01:30:31.680 | that it's not that we have to implement
01:30:34.200 | everything that AI tells us or offers us,
01:30:36.880 | but it sure as hell gives us a great window
01:30:39.740 | into what might be happening or is likely to happen.
01:30:43.080 | - Specifically for schizophrenia,
01:30:44.720 | I'm pretty sure that if we had these large language models
01:30:49.600 | 20 years ago, we would have known back then
01:30:51.680 | that ketamine would have been a really good drug
01:30:54.000 | to try to help these people.
01:30:55.920 | - Tell us about the relationship
01:30:57.000 | between ketamine and schizophrenia.
01:30:58.800 | - Okay.
01:30:59.640 | - Because I think a lot of people,
01:31:01.160 | and maybe you could define schizophrenia,
01:31:02.600 | even though most people think about people
01:31:03.880 | hearing voices and psychosis,
01:31:05.680 | like there's a bit more to it
01:31:08.040 | that maybe we just can't bring out.
01:31:10.360 | - Okay, so one of the things now that we know,
01:31:14.040 | see, the problem is that if you look at the end point,
01:31:17.020 | that doesn't tell you what started the problem.
01:31:19.880 | It started early in development.
01:31:22.600 | Schizophrenia is something that appears
01:29:26.960 | in late adolescence, early adulthood,
01:31:29.400 | but it actually is already a problem,
01:31:32.360 | a genetic problem from the get-go.
01:31:35.160 | - So what is the concordance in identical twins?
01:31:37.200 | Meaning if you have one identical twin,
01:31:39.240 | if you have identical twins in the womb,
01:31:41.520 | and one is destined to be full-blown schizophrenic,
01:31:45.080 | what's the probability the other will be?
01:31:45.920 | - So here's the experiment.
01:31:47.640 | Okay, this has been replicated many, many times,
01:31:51.360 | in mice, I should say.
01:31:53.000 | Oh no, actually, okay, let me start with a human.
01:31:56.480 | Okay, so ketamine is, for a long time,
01:32:00.880 | and it still is, a party drug, special K.
01:32:03.220 | - I've never taken it, but this is what I hear.
01:32:06.680 | - I haven't either.
01:32:07.520 | I don't know.
01:32:08.360 | - It's a dissociative anesthetic, right?
01:32:09.180 | - But I'll tell you what happens,
01:32:10.080 | 'cause I've talked to these people who've done this.
01:32:12.880 | You take ketamine, sub-anesthetic,
01:32:15.760 | by the way, it's an anesthetic.
01:32:17.000 | It's given to children.
01:32:18.700 | It's a pretty good anesthetic,
01:32:19.940 | and it's also used in veterinary medicine.
01:32:21.640 | But in any case, you give it to,
01:32:24.200 | you take young adults, here's what they experience.
01:32:29.080 | They experience out-of-body experience.
01:32:33.960 | They have this wonderful feeling of energy,
01:32:37.520 | and they're very, it's a high,
01:32:40.800 | but it's a very unusual high.
01:32:42.560 | Now, if they just go and have one experience,
01:32:48.600 | but if they have two, they party two days in a row,
01:32:52.420 | a lot of them come into the emergency room,
01:32:54.960 | and here's what the symptoms are.
01:32:58.340 | Full-blown psychosis, full-blown.
01:33:03.600 | We're talking about indistinguishable
01:33:06.600 | from a schizophrenic break.
01:33:08.400 | - So auditory hallucinations.
01:33:09.680 | - Yeah, auditory hallucinations, paranoia,
01:33:13.560 | very, very advanced.
01:33:15.680 | We say that, my God, this person here is really,
01:33:19.080 | has become a schizophrenic, and this is really,
01:33:25.580 | like you say, the symptoms are the same.
01:33:27.080 | However, if you isolate them for a couple days,
01:33:29.560 | they'll come back, right?
01:33:31.660 | So it means that schizophrenia can induce,
01:33:35.320 | I mean, sorry, ketamine can induce
01:33:37.240 | a form of schizophrenia, psychosis, temporarily,
01:33:42.040 | not permanently, fortunately.
01:33:44.260 | Okay, so what does it attack?
01:33:45.760 | Okay, and there's another literature on this.
01:33:47.400 | It turns out that it binds to a form of receptor,
01:33:51.700 | a glutamate receptor, called NMDA receptors,
01:33:54.740 | which are very important, by the way,
01:33:55.880 | for learning and memory, but we know the target,
01:33:58.520 | and we also know what the acute outcome is,
01:34:01.600 | that it reduces the strength of the inhibitory circuit,
01:34:06.600 | the interneurons that use inhibitory transmitters,
01:34:10.400 | the enzyme that creates the inhibitory transmitter
01:34:14.380 | is downregulated, and what does that do?
01:34:16.580 | It means that there's more excitation,
01:34:18.700 | and what does that mean, when there's more excitation?
01:34:20.460 | It means that there's more activity in the cortex,
01:34:22.980 | and there's actually much more vigor,
01:34:24.860 | and you start becoming crazy, right,
01:34:29.860 | if it's too much activity.
01:34:32.140 | So this is interesting.
01:34:33.140 | So this is telling us, I think,
01:34:35.860 | that we should be thinking about,
01:34:38.220 | and now there's a whole field now in psychiatry
01:34:40.560 | that has to do with the glutamate hypothesis
01:34:43.880 | for the first, where the actual imbalance first occurs.
01:34:48.880 | It's an imbalance between the excitatory
01:34:54.000 | and inhibitory systems that are in the cortex
01:34:56.720 | that keep you in balance.
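(A minimal sketch in Python of that excitation-inhibition balance idea; the toy two-population rate model and all of its weights are assumptions for illustration, with the inhibitory weight standing in for the downregulated interneuron transmitter described above.)

import math

def excitatory_rate(inhibitory_weight, recurrent_excitation=1.2, drive=1.0, steps=200):
    """Iterate a toy excitatory/inhibitory rate model and return the excitatory rate."""
    squash = lambda x: 1.0 / (1.0 + math.exp(-x))   # saturating firing-rate curve
    rE = rI = 0.0
    for _ in range(steps):
        rE = squash(recurrent_excitation * rE - inhibitory_weight * rI + drive)
        rI = squash(rE)                             # inhibition is driven by excitation
    return rE

print("intact inhibition  :", round(excitatory_rate(inhibitory_weight=2.0), 3))
print("weakened inhibition:", round(excitatory_rate(inhibitory_weight=0.5), 3))

(Weakening the inhibitory weight pushes the excitatory rate up, which is the direction of the imbalance being described.)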
01:34:58.840 | - And NMDA, N-methyl-D-aspartate receptors,
01:35:02.040 | are glutamate receptors.
01:35:03.200 | - Yes, they are glutamate. - They're one class.
01:35:04.520 | - That's one class, that's right.
01:35:06.240 | Okay, so now, here is a hypothesis
01:35:10.620 | for why ketamine might be good for depression.
01:35:15.460 | People are taking it now who are depressed, right?
01:35:18.700 | So here you have a drug that causes overexcitation,
01:35:23.140 | and here you have a person who is underexcited.
01:35:26.060 | Depression is associated with lower excitatory activity
01:35:29.580 | in some parts of the cortex.
01:35:32.740 | Well, if you titrate it,
01:35:33.920 | you can come back into balance, right?
01:35:36.580 | So what you do is you fight depression with schizophrenia,
01:35:40.940 | a touch of schizophrenia.
01:35:43.300 | Now, you have to keep giving.
01:35:45.380 | I think once every three weeks,
01:35:46.580 | they have to have a new dose of ketamine,
01:35:50.500 | but it's helped an enormous number of people
01:35:52.620 | with very, very severe clinical depression.
01:35:55.740 | So as we learn more about the mechanisms
01:35:58.740 | underlying some of these disorders,
01:36:01.180 | the better we are going to be at extrapolating
01:36:04.640 | and coming up with some solutions,
01:36:07.300 | at least to prevent it from getting worse.
01:36:09.320 | By the way, I'm pretty sure that the large language models
01:36:12.060 | could have figured this out long ago.
01:36:14.520 | - So in an attempt to understand
01:36:16.880 | how we might be able to leverage
01:36:18.360 | these large language models now,
01:36:20.900 | how would we have used these large language models long ago?
01:36:24.160 | Let's say you had 2024 AI technology in 19,
01:36:29.680 | to have fun here, 1998,
01:36:33.900 | the year that I started graduate school.
01:36:36.900 | - Right.
01:36:37.740 | - At that time, it was like the dopamine hypothesis
01:36:39.740 | of schizophrenia was in every textbook.
01:36:41.380 | There was a little bit about glutamate, perhaps,
01:36:44.340 | but it was all about dopamine.
01:36:47.340 | So how would the large language models have discovered this?
01:36:51.340 | Ketamine was known as a drug.
01:36:53.020 | Ketamine, by the way, is very similar to PCP,
01:36:57.140 | phencyclidine, which also binds the NMDA receptor.
01:37:00.060 | So how would-
01:37:03.280 | - Which is also a part of-
01:37:04.600 | - Which is also, yeah, not one I recommend, nor ketamine.
01:37:07.960 | Frankly, I don't recommend any recreational drugs,
01:37:10.400 | but I'm not a recreational drug guy.
01:37:12.180 | But what would those large language models do if they,
01:37:16.420 | so you've got 2024 technology placed into 1998.
01:37:21.060 | They're foraging for existing knowledge,
01:37:23.900 | but then are they able to make predictions?
01:37:26.660 | Like, hey, this stuff is gonna turn out to be wrong,
01:37:29.460 | or hey, this stuff-
01:37:30.300 | - Okay, okay, you know, this is all very, very speculative.
01:37:35.300 | And really, we can begin actually to see this happening now.
01:37:40.740 | So I have a colleague at the Salk Institute, Rusty Gage,
01:37:46.340 | very distinguished neuroscientist.
01:37:50.140 | And he discovered that there are new neurons
01:37:55.060 | being born in the hippocampus, right?
01:37:56.660 | Which is something, in adults,
01:37:58.480 | which is something that in a textbook says
01:38:00.260 | that doesn't happen, right?
01:38:01.100 | - Yeah, that was around 1998 that Rusty did that.
01:38:03.180 | - Yeah, yeah, right, that's right.
01:38:04.020 | And I actually have a paper with him
01:38:05.620 | where we tested LTP, long-term potentiation,
01:38:09.100 | actually, the effects of exercise on neurogenesis.
01:38:15.340 | - Exercise increases neurogenesis.
01:38:16.580 | - Yeah, it increases the cells,
01:38:19.900 | that increases neurogenesis,
01:38:21.100 | and also the cells that are active
01:38:24.740 | become part of the circuit.
01:38:26.220 | More cells become integrated.
01:38:28.380 | - And this is true in humans as well, right?
01:38:30.500 | - Yeah, and there was some cancer drug
01:38:33.380 | that was given that, you know,
01:38:34.780 | that they showed that there were new cells
01:38:37.060 | that they were able to, later in post-mortem,
01:38:39.580 | to actually see that they were born in the adult.
01:38:42.580 | Okay, so here we are, okay, in 1998.
01:38:47.420 | And the question is, can you jump?
01:38:51.020 | Can you jump into the future?
01:38:52.460 | Okay, so Rusty, we were at, you know,
01:38:57.180 | happened to talk about this issue about, you know,
01:39:02.940 | he's using these large language models now for his research.
01:39:07.940 | I said, "Oh, wow, how do you use it?"
01:39:12.100 | And he said, "We use it as an idea pump."
01:39:15.100 | What do you mean, idea pump?
01:39:16.780 | Well, you know, we give it all of the experiments
01:39:18.980 | that we've done, and we have it, you know,
01:39:22.700 | the literature, it's access to the literature and so forth,
01:39:25.580 | and we ask it for ideas for new experiments.
01:39:27.820 | - Oh, I love it, I love it.
01:39:29.980 | I was on a plane where I sat next to a guy
01:39:31.900 | that works at Google, and he's one of the main people there
01:39:36.900 | in terms of voice-to-text, and text-to-voice software.
01:39:42.220 | And he showed me something, I'll provide a link to it,
01:39:45.140 | 'cause it's another one of these open resource things.
01:39:48.540 | And I'm not super techie, I'm not like the,
01:39:51.580 | I don't get an F in technology, I don't get an A+.
01:39:54.220 | I'm kind of in the middle, so I think I'm pretty representative
01:39:56.100 | of the average listener for this podcast, presumably.
01:39:58.780 | What he showed me is that you can take,
01:40:01.260 | you open up this website, and you can take PDFs,
01:40:04.660 | or you take URLs, so websites, website addresses,
01:40:09.100 | and you just place them in the margin.
01:40:10.320 | You literally just drag and drop them there.
01:40:12.860 | And then you can ask questions,
01:40:16.300 | and the AI will generate answers
01:40:19.720 | that are based on the content
01:40:21.240 | of whatever you put into this margin,
01:40:23.940 | those PDFs, those websites.
01:40:26.180 | And the cool thing is, it references them,
01:40:28.180 | so you know which article it came from.
01:40:30.740 | And then you can start asking it
01:40:33.660 | more sophisticated questions, like,
01:40:35.720 | in the two examples of the effects of a drug,
01:40:41.220 | one being very strong, and one being very weak,
01:40:44.260 | which of these papers do you think is more rigorous,
01:40:47.560 | based on subject number,
01:40:51.020 | but also kind of the strength of the findings?
01:40:53.380 | Pretty vague thing.
01:40:54.340 | Strength of findings is pretty vague, right?
01:40:56.620 | Anyone that argues those are weak findings,
01:40:59.060 | those aren't enough subjects, well,
01:41:00.560 | we know a hell of a lot about human memory
01:41:02.300 | from one patient, HM.
01:41:03.980 | So strength of findings, when people,
01:41:06.140 | is a subjective thing.
01:41:07.980 | You really have to be an expert in a field
01:41:09.380 | to understand strength of findings, and even that.
01:41:11.780 | And what's amazing is, it starts giving back answers,
01:41:15.060 | like, well, if you're concerned about number of subjects,
01:41:19.500 | this paper, but that's a pretty obvious one,
01:41:21.780 | which one had more subjects.
01:41:23.180 | But it can start critiquing these statistics
01:41:26.820 | that they used in these papers in very sophisticated ways,
01:41:30.700 | and explain back to you why certain papers
01:41:33.540 | may not be interesting, and others are more interesting,
01:41:35.980 | and it starts to weight the evidence.
01:41:38.020 | Oh my God.
01:41:38.900 | And then you say, well, with that weighted evidence,
01:41:41.860 | can you hypothesize what would happen if,
01:41:45.500 | and so I've done a little bit of this,
01:41:46.540 | where it starts trying to predict the future,
01:41:49.040 | based on 10 papers that you gave it five minutes ago.
01:41:52.740 | Amazing.
01:41:53.580 | I don't think any professor could do that,
01:41:57.160 | except in their very specific area of interest,
01:42:00.060 | and if they were already familiar with the papers,
01:42:02.340 | and it would take them many hours, if not days,
01:42:04.380 | to read all those papers in detail.
01:42:06.380 | And they might not actually come up
01:42:08.460 | with the same answers, right?
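(A minimal sketch in Python of the general pattern behind a tool like the one described: build a prompt whose answer must stay grounded in the documents you dropped in and cite them back. The ask_llm stub is a hypothetical placeholder, not the actual product's API.)

def build_grounded_prompt(question, sources):
    """Assemble a prompt that forces the answer to come from, and cite, the sources.

    sources: list of (label, text) pairs, e.g. ("paper_1.pdf", "...full text..."),
    playing the role of the PDFs and URLs dragged into the margin.
    """
    packed = "\n\n".join(f"[{label}]\n{text}" for label, text in sources)
    return (
        "Answer the question using only the sources below, and cite the "
        "source label in brackets after each claim.\n\n"
        f"{packed}\n\nQuestion: {question}"
    )

def ask_llm(prompt):
    # Placeholder: swap in a real model call of your choice here.
    raise NotImplementedError("plug in the model of your choice")

prompt = build_grounded_prompt(
    "Which of these two drug studies is more rigorous, and why?",
    [("strong_effect.pdf", "..."), ("weak_effect.pdf", "...")],
)
print(prompt)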
01:42:10.140 | Right.
01:42:10.960 | Yeah, so this is, so actually this is something
01:42:14.180 | that is happening in medicine, by the way,
01:42:17.100 | for doctors who are using AI as an assistant.
01:42:20.940 | This is really interesting.
01:42:23.000 | So, and this is dermatology, it was a paper in Nature,
01:42:26.620 | you know, skin lesions, there's several,
01:42:28.660 | 2,000 skin lesions, and some of them are cancerous,
01:42:33.180 | and others are benign.
01:42:36.340 | And so, in any case, they tested the expert doctors,
01:42:39.580 | and then they tested an AI, and they were both doing
01:42:42.340 | about, you know, 90%, right?
01:42:47.100 | However, if you let the doctor use the AI,
01:42:50.020 | it boosts the doctor to 98%.
01:42:52.300 | 98% accuracy.
01:42:53.500 | Yes, and what's going on there?
01:42:55.780 | It's very interesting.
01:42:56.700 | So it turns out that, although they got the same 90%,
01:43:01.180 | they had different expertise,
01:43:03.100 | that the AI had access to more data,
01:43:06.140 | and so it could look at the lesions that were rare,
01:43:08.940 | that the doctor may never have seen, okay?
01:43:11.400 | But the doctor has more in-depth knowledge
01:43:14.140 | of the most common ones that he's seen over and over again,
01:43:17.220 | and knows the subtleties and so forth.
01:43:19.300 | But so, putting them together, it makes so much sense
01:43:23.120 | that they're gonna improve if they work together.
01:43:26.300 | And I think that now, what you're saying is that
01:43:28.980 | using AI as a tool for discovery,
01:43:33.620 | with the expert who's interpreting,
01:43:37.380 | and looking at the arguments, the statistical arguments,
01:43:41.500 | and also looking at the paper, maybe in a new way,
01:43:46.380 | maybe that's the future of science.
01:43:47.800 | Maybe that's what's gonna happen.
01:43:48.860 | Everybody's worried about, oh, AI's gonna replace us.
01:43:52.700 | It's gonna be much better than we are at everything,
01:43:55.620 | and humans are obsolete.
01:43:57.640 | Nothing could be further from the case.
01:43:59.500 | Our strengths and weaknesses are different,
01:44:01.700 | and by working together, it's gonna strengthen both
01:44:06.700 | what we do and what AI does,
01:44:11.580 | and it's gonna be a partnership.
01:44:13.780 | It's not gonna be adversarial.
01:44:15.260 | It's gonna be a partnership.
01:44:16.980 | - Would you say that's the case for things like
01:44:18.820 | understanding or discovering treatments
01:44:22.460 | for neurologic illness,
01:44:24.540 | for avoiding large-scale catastrophes,
01:44:31.540 | like can it predict macro movements?
01:44:36.140 | Let me give an example.
01:44:37.740 | Here in Los Angeles,
01:44:40.460 | there's occasionally an accident on the freeway.
01:44:43.100 | You have a lot of cameras over freeways nowadays.
01:44:47.020 | You have cameras in cars.
01:44:48.380 | You can imagine all of the data being sent in in real time,
01:44:51.300 | and you could probably predict accidents pretty easily.
01:44:55.580 | I mean, these are just moving objects, right,
01:44:57.140 | at a specific rate, who's driving haphazardly,
01:45:00.340 | but you could also potentially signal takeover of the brakes
01:45:05.140 | or the steering wheel of a car and prevent accidents.
01:45:07.180 | I mean, certain cars already do that,
01:45:10.060 | but could you essentially eliminate...
01:45:13.620 | Well, let's do something even more important.
01:45:15.180 | Let's eliminate traffic.
01:45:16.380 | (laughs)
01:45:17.220 | I don't know if you can do that,
01:45:19.100 | 'cause that's a funnel problem,
01:45:20.180 | but could you predict physical events in the world
01:45:25.180 | into the future?
01:45:26.500 | - Okay, this has already been done, not for traffic,
01:45:29.840 | but for hurricanes.
01:45:30.840 | As you know, the weather is extremely difficult to predict,
01:45:38.800 | except here in California,
01:45:41.940 | where it's always gonna be sunny, right?
01:45:43.220 | (laughs)
01:45:44.220 | But now what they've done is to feed a lot of previous data
01:45:49.220 | from previous hurricanes and also simulations of hurricanes.
01:45:55.660 | You can simulate them in a supercomputer.
01:45:57.940 | It takes days and weeks, so it's not very useful
01:46:01.720 | for actually accurately predicting
01:46:03.840 | where it's gonna hit Florida.
01:46:06.520 | But what they did was, after training up the AI
01:46:10.040 | on all of this data, it was able to predict,
01:46:13.400 | with much better accuracy,
01:46:14.800 | exactly where in Florida it's gonna make a landfall.
01:46:19.800 | And it does that on your laptop in 10 minutes.
01:46:24.480 | - Incredible.
01:46:26.280 | So something just clicked for me,
01:46:28.620 | and it's probably obvious to you and to most people,
01:46:31.980 | but I think this is true.
01:46:33.980 | I think what I'm about to say is true.
01:46:35.780 | At the beginning of our conversation,
01:46:37.140 | we were talking about the acquisition of knowledge
01:46:40.820 | versus the implementation of knowledge,
01:46:42.980 | just learning facts versus learning
01:46:45.820 | how to implement those facts
01:46:47.100 | in the form of physical action or cognitive action, right?
01:46:50.180 | Math problem is cognitive action, physical action.
01:46:51.920 | Okay.
01:46:52.760 | AI can do both knowledge acquisition, it can learn facts,
01:46:58.240 | long lists of facts and combinations of facts,
01:47:00.720 | but presumably it can also run a lot of problem sets
01:47:03.800 | and solve a lot of problem sets.
01:47:05.960 | I don't think, except with some crude, still to me,
01:47:10.040 | examples of robotics, that it's very good at action yet,
01:47:13.340 | but it will probably get there at some point.
01:47:15.840 | Robots are getting better,
01:47:16.740 | but they're not doing what we're doing yet.
01:47:20.680 | But it seems to me that as long as they can acquire knowledge
01:47:25.680 | and then solve different problem sets,
01:47:29.960 | different iterations of combinations of knowledge
01:47:33.180 | that basically they are in a position
01:47:35.480 | to take any data about prior events or current events
01:47:40.160 | and make pretty darn good predictions about the future
01:47:44.360 | and run those back to us quickly enough
01:47:46.400 | and to themselves quickly enough
01:47:48.960 | that they could play out the different iterations.
01:47:51.480 | And so I'm thinking one of the problems
01:47:54.560 | that seems to have really vexed neuroscientists
01:47:56.880 | and the field of medicine and the general public
01:47:58.920 | has been like the increase in the,
01:48:01.760 | at least diagnosis of autism.
01:48:03.420 | I've heard so many different hypotheses over the years.
01:48:07.800 | I think we're still pretty much in the fog on this one.
01:48:10.560 | Could AI start to come up with new and potential solutions
01:48:17.800 | and treatments if they're necessary,
01:48:20.120 | but maybe get to the heart of this problem?
01:48:22.200 | - It might.
01:48:23.040 | And it depends on the data you have.
01:48:25.520 | It depends on the complexity of the disease,
01:48:29.100 | but it will happen.
01:48:32.400 | In other words, we will use those tools the best we can,
01:48:36.020 | 'cause obviously if you can make any progress at all
01:48:39.560 | and jump into the future, wow, that would save lives.
01:48:42.880 | That would help so many people out there.
01:48:45.280 | I mean, I really think the promise here is so great
01:48:48.400 | that even though there are flaws
01:48:49.980 | and there are regulatory problems,
01:48:51.880 | we just, we really, really have to really push.
01:48:54.800 | And we have to do that in a way
01:48:57.680 | that is going to help people,
01:49:01.040 | in terms of making their jobs better
01:49:07.360 | and helping them solve problems
01:49:11.440 | that otherwise they would have had difficulty with
01:49:13.760 | and so forth.
01:49:14.600 | It's beginning to happen, but these are early days.
01:49:19.160 | So we're at a stage right now with AI
01:49:23.200 | that is similar to what happened
01:49:26.480 | after the first flight of the Wright brothers.
01:49:29.960 | In other words--
01:49:30.800 | - It's that significant.
01:49:31.920 | - The achievement that the Wright brothers made
01:49:34.440 | was to get off the ground 10 feet
01:49:36.600 | and to power forward with a human being 100 feet.
01:49:40.840 | That was it, that was the first flight.
01:49:43.440 | And it took an enormous amount of improvements.
01:49:46.340 | The most difficult thing that had to be solved was control.
01:49:49.220 | How do you control it?
01:49:50.160 | How do you make it go in the direction you want it to go?
01:49:53.560 | And shades of what's happening now in AI
01:49:56.400 | is that we are off the ground.
01:49:59.000 | We were not going very far yet,
01:50:01.200 | but who knows where it will take us into the future.
01:50:03.760 | - Let's talk about Parkinson's disease,
01:50:07.920 | a depletion of dopamine neurons
01:50:10.160 | that leads to difficulty in smooth movement generation
01:50:14.880 | and also some cognitive and mood-based dysfunction.
01:50:20.660 | Tell us about your work on Parkinson's
01:50:23.960 | and what did you learn?
01:50:25.380 | - So as you point out,
01:50:28.960 | Parkinson's is first a degenerative disease.
01:50:32.120 | It's very interesting because the dopamine cells
01:50:36.280 | are at a particular part of the brain, the brainstem,
01:50:39.860 | and they are the ones that are responsible
01:50:42.360 | for procedural learning.
01:50:44.080 | I told you before about temporal difference.
01:50:45.800 | It's dopamine cells.
01:50:47.000 | And it's a very powerful way for the,
01:50:52.200 | it's a global signal, it's called a neuromodulator
01:50:54.280 | because it modulates all the other signals
01:50:56.740 | taking place throughout the cortex.
01:50:59.120 | And also it's very important for learning
01:51:05.800 | sequences of actions that produce survival, for survival.
01:51:10.800 | But the problem is that with certain environmental insults,
01:51:19.740 | especially toxins like pesticides,
01:51:26.220 | those neurons are very vulnerable.
01:51:29.820 | And when they die, you get all of the symptoms
01:51:33.260 | that you just described.
01:51:35.400 | The people who have lost those cells,
01:51:39.300 | actually before the treatment, L-DOPA,
01:51:43.120 | which is a dopamine precursor,
01:51:45.740 | they actually were, became comatose, right?
01:51:49.920 | They didn't move.
01:51:51.480 | They were still alive, but they just didn't move at all.
01:51:54.380 | You know, they-- - It's tragic.
01:51:57.720 | - Yeah, it's locked in, it's called.
01:51:59.620 | Yeah, it's tragic, tragic.
01:52:00.920 | So when the first trials of L-DOPA were given to them,
01:52:05.920 | it was magical because suddenly they started talking again.
01:52:11.000 | So, I mean, this is amazing, amazing.
01:52:13.640 | - I'm curious, when they started talking again,
01:52:15.480 | did they report that their brain state
01:52:17.680 | during the locked in phase was slow velocity?
01:52:21.040 | Like, was it sort of like a dreamlike state
01:52:23.160 | or they felt like they were in a nap
01:52:24.980 | or were they in there like screaming to get out?
01:52:27.720 | Because their physical velocity obviously was zero.
01:52:31.540 | They're locked in after all.
01:52:33.300 | And I've long wondered when coming back from a run
01:52:37.920 | or from waking up from a great night's sleep,
01:52:40.780 | when I shift into my waking state,
01:52:43.740 | whether or not physical velocity
01:52:45.860 | and cognitive velocity are linked.
01:52:47.280 | - Okay, that's a wonderful observation or a question.
01:52:50.300 | I'll bet you know the answer.
01:52:52.500 | Okay, here's something that is really amazing.
01:52:55.900 | It was discovered, interestingly,
01:52:59.660 | when they tend to move slowly, as you said,
01:53:03.900 | but to them, cognitively, they think they're moving fast.
01:53:07.220 | Now, it's not because they can't move fast,
01:53:10.040 | because you can say, well, can you move faster?
01:53:12.180 | Sure.
01:53:13.440 | And they move normal, right?
01:53:15.800 | But to them, they think they're moving at super velocities.
01:53:18.940 | - So it's a set point issue.
01:53:19.780 | - So it's a set point issue.
01:53:21.100 | Yes, it's all about set points.
01:53:22.300 | That's what's really going on.
01:53:24.340 | And as the set point gets further and further down,
01:53:27.780 | without moving at all, they think they're moving, right?
01:53:32.100 | I mean, this is what's going on.
01:53:33.260 | By the way, you can ask them, what was it like?
01:53:35.340 | We were talking to you, and you didn't respond.
01:53:38.140 | Oh, I didn't feel like it.
01:53:39.440 | - The brain confabulates an answer.
01:53:42.180 | - They have, well, that they confabulated it
01:53:46.060 | because they didn't have enough energy,
01:53:48.820 | or they couldn't initiate, they couldn't initiate actions.
01:53:52.340 | That's one of the things that they have trouble with,
01:53:54.180 | with movements, starting a movement.
01:53:56.620 | - Yeah, as you can tell, I'm fascinated
01:53:58.020 | by this notion of cognitive velocity.
01:54:00.220 | And again, there may be a better or more accurate
01:54:02.420 | or official language for it,
01:54:06.140 | but I feel like it encompasses so much
01:54:08.780 | of what we try to do when we learn.
01:54:11.760 | And the fact that during sleep,
01:54:13.440 | you have these very vivid dreams
01:54:16.160 | during rapid eye movement sleep.
01:54:17.300 | So cognitive velocity is very fast.
01:54:19.080 | Time perception is different
01:54:20.300 | than in slow wave sleep dreams.
01:54:22.260 | And I really think there's something to it
01:54:24.880 | as at least one metric that relates to brain state.
01:54:29.380 | I've long thought that we know so much more
01:54:31.940 | about brain states during sleep
01:54:33.300 | than we do about wakeful brain states.
01:54:35.420 | We talk about focus, motivated, flow.
01:54:38.620 | I mean, these are not scientific terms.
01:54:40.940 | I'm not being disparaging of them.
01:54:42.700 | They're pretty much all we've got
01:54:44.860 | until we come up with something better.
01:54:46.160 | But we're biologists and neuroscientists
01:54:48.560 | and computational neuroscientists in your case.
01:54:50.540 | And we're like trying to figure out
01:54:52.420 | like what brain state are we in right now?
01:54:55.140 | Our cognitive velocity is a certain value.
01:54:58.760 | But I think the more that people think about this,
01:55:01.900 | I'll venture to say that the more that they think
01:55:05.060 | a little bit about their cognitive velocity
01:55:06.540 | at different times of day,
01:55:07.700 | we start to notice that there's a,
01:55:09.060 | tends to be a few times of day.
01:55:10.700 | For me, it tends to be early to late mid morning.
01:55:15.120 | And then again, in the evening,
01:55:17.980 | after a little bit of trough and energy
01:55:20.220 | that boy, that hour and a half each,
01:55:23.140 | like that's the time to get real work done.
01:55:25.380 | - I didn't have the same experience.
01:55:26.860 | - I can mentally sprint far at those times.
01:55:30.940 | But there are other times of day
01:55:32.980 | when I don't care how much caffeine I drink.
01:55:36.060 | I don't care, unless it's a stressful event
01:55:38.060 | that I need to meet the demands of that stress.
01:55:40.980 | I just can't, I can't get to that faster pace
01:55:44.140 | while I'm also engaging.
01:55:45.920 | You can read faster, you can listen,
01:55:48.020 | but you're not using the information.
01:55:50.400 | You're not storing the information.
01:55:52.140 | - That's right.
01:55:52.980 | - What times of day are they for you?
01:55:54.420 | - I get most done in the morning.
01:55:56.380 | And then you're right, later after dinner
01:56:00.500 | is also different though.
01:56:06.180 | I think in the morning, I'm better at creative stuff.
01:56:10.540 | And then I think that in the evening,
01:56:12.580 | I'm better at actually just cranking it out.
01:56:15.340 | - Interesting.
01:56:16.920 | Given the relationship between a body temperature
01:56:19.780 | and circadian rhythm, I would like to run an experiment
01:56:22.980 | that relates core body temperature to cognitive velocity.
01:56:27.140 | - I've actually noticed,
01:56:28.620 | this is something that is just purely subjective,
01:56:32.740 | but the temperature at the Salk,
01:56:35.380 | inside the building, is kept at 75.
01:56:37.940 | It's like, you know, it's rock solid.
01:56:39.940 | But in the afternoon, I feel a little chilly.
01:56:45.420 | It's probably my, you know, internal.
01:56:48.480 | - Sure, body temperature starts to come down.
01:56:51.040 | - Yeah, it's probably going down.
01:56:52.920 | And that may correspond to the loss of energy.
01:56:56.160 | You know, the amount of the ability for the brain
01:56:58.360 | and everything else.
01:56:59.640 | By the way, you know, this is Q10.
01:57:01.520 | This is a jargon.
01:57:03.200 | Every single enzyme in your, every cell
01:57:07.680 | can go at different rates
01:57:08.680 | depending on the temperature, right?
01:57:10.600 | And so, yeah, so if the body temperature is doing this,
01:57:13.120 | then all the cells are doing this too, right?
01:57:15.860 | So this is, it's an explanation.
01:57:17.980 | I'm not sure if it's the right one, but.
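(For reference, the standard textbook form of the Q10 relationship being invoked, not something derived in the conversation:)

Q_{10} = \left( \frac{R_2}{R_1} \right)^{10/(T_2 - T_1)}, \qquad R_2 = R_1 \, Q_{10}^{(T_2 - T_1)/10}

(Here R1 and R2 are reaction rates at temperatures T1 and T2 in degrees Celsius. With a typical enzymatic Q10 of 2 to 3, a one-degree dip in core temperature cuts every rate by roughly 7 to 10 percent, which is the flavor of explanation being offered for the afternoon slump.)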
01:57:19.500 | - Yeah, Craig Heller, my colleague at Stanford
01:57:21.620 | in the biology department has beautifully described
01:57:24.260 | how the enzymatic control over pyruvate,
01:57:28.220 | I believe it is, controls muscular failure.
01:57:31.600 | That local muscular failure, you know,
01:57:33.500 | when people are trying to move some resistance,
01:57:36.300 | has everything to do with the temperature,
01:57:38.500 | the local temperature that shuts down
01:57:42.180 | certain enzymatic processes
01:57:43.760 | that don't allow the muscles to contract the same way.
01:57:46.840 | You know, he knows the details
01:57:47.880 | and he covered them on this podcast.
01:57:49.000 | I'm forgetting the details.
01:57:50.480 | You start to go, wow, like these enzymes
01:57:52.520 | are so beautifully controlled by temperature.
01:57:55.560 | And of course, his laboratory is focused on ways
01:57:57.320 | to bypass those temperature or to change temperature locally
01:58:01.480 | in order to bypass those limitations
01:58:03.640 | and have shown them again and again.
01:58:05.440 | It's just incredible.
01:58:07.440 | Yeah, I know, here we're speculating
01:58:09.240 | about what it would mean for cognitive velocity.
01:58:11.160 | But I think it's such a different world
01:58:14.080 | to think about the underlying biology
01:58:16.140 | as opposed to just thinking about like a drug.
01:58:18.700 | You know, you increase dopamine and norepinephrine
01:58:20.540 | and epinephrine, the so-called catecholamines,
01:58:23.260 | and you're gonna increase energy focus and alertness,
01:58:25.660 | but you're gonna pay the price.
01:58:26.580 | You're gonna have a trough in energy focus and alertness
01:58:28.800 | that's proportional to how much greater it was
01:58:30.940 | when you took the drug.
01:58:31.980 | - Boy, amphetamines are a good example.
01:58:34.380 | Boy, you know, you're going a mile a minute
01:58:39.100 | when you're taking the drug.
01:58:41.200 | Of course, you know, it's, I fully understand
01:58:45.440 | that that's your impression.
01:58:46.720 | And the reality is you don't actually accomplish
01:58:48.400 | that much more.
01:58:49.880 | - Have any LLMs, so AI, been used
01:58:53.280 | to answer this really pressing question
01:58:55.840 | of what is going to be the consequence on cognition
01:58:58.500 | for these young brains that have been weaned
01:59:00.800 | while taking Ritalin, Adderall, Vyvanse,
01:59:03.960 | and other stimulants?
01:59:04.800 | 'Cause we have, you know, millions of kids
01:59:07.200 | that have been raised this way.
01:59:08.040 | - We did this experiment on our, you know,
01:59:09.820 | whole cadre, a whole generation.
01:59:11.820 | And you know, I really would like to know the answer.
01:59:15.100 | I wonder if anybody's studying that.
01:59:17.420 | That's really a great question.
01:59:18.820 | 'Cause we gave them speed, effectively.
01:59:21.220 | You know, the drug that causes the brain to be activated.
01:59:25.660 | But by the way, but you know,
01:59:27.880 | there's the consequences that, you know,
01:59:32.520 | when it wears off, you have no energy, right?
01:59:36.100 | You're just completely spent, that's it.
01:59:39.400 | - That's the pit.
01:59:40.320 | - That's the pit.
01:59:41.160 | And so, and, but that's why you take more of it.
01:59:43.760 | You see, that's the problem is it's a spiral.
01:59:46.220 | - I love how today you're making it so very clear
01:59:52.720 | how computation, how math and computers
01:59:56.640 | and AI now are really shaping the way
02:00:00.120 | that we think about these biological problems,
02:00:02.120 | which are also psychological problems,
02:00:03.620 | which are also daily challenges.
02:00:04.900 | I also love that we touched on mitochondria
02:00:06.820 | and how to replenish mitochondria.
02:00:08.800 | I want to make sure that we talk about a couple of things
02:00:10.920 | that I know are in the back of people's minds,
02:00:13.120 | no pun intended here,
02:00:14.340 | which are consciousness and free will.
02:00:19.280 | Normally, I don't like to talk about these things,
02:00:21.500 | not because they're sensitive,
02:00:22.760 | but because I find the discussions around them
02:00:25.500 | typically to be more philosophical than neurobiological.
02:00:29.500 | And they tend to be pretty circular.
02:00:31.880 | And so you get people like Kevin Mitchell,
02:00:35.000 | who is a real, I think he has a book about free will.
02:00:37.160 | He believes in free will.
02:00:38.420 | You've got people like Robert Sapolsky,
02:00:42.360 | who wrote the book "Determined."
02:00:43.300 | He doesn't believe in free will.
02:00:45.360 | How do you feel about free will?
02:00:46.760 | And is it even a discussion that we should be having?
02:00:49.920 | - Well, if you go back 500 years, you know,
02:00:52.320 | it's the middle ages, the concept didn't exist,
02:00:57.320 | or at least not in the way we use it.
02:01:00.040 | Because everybody, it was the way that humans felt
02:01:05.040 | about the world and how it worked
02:01:09.360 | and its impact on them was that it's all fate.
02:01:14.360 | They had this concept of fate,
02:01:16.560 | which is that there's nothing you can do
02:01:20.240 | that something is going to happen to you
02:01:23.600 | because of what's going on in the gods up above,
02:01:27.020 | or whatever it is, right?
02:01:28.080 | You attribute it to the physical forces around you
02:01:31.680 | that caused it, not to your own free will,
02:01:34.960 | not to something that caused this to happen to you, right?
02:01:39.700 | So I think that these words, by the way,
02:01:42.880 | that we use, free will, consciousness, intelligence,
02:01:46.880 | understanding, they're weasel words
02:01:50.240 | because you can't pin them down.
02:01:52.600 | There is no definition of consciousness
02:01:55.320 | that everybody agrees on.
02:01:56.560 | And it's tough to solve a problem, a scientific problem,
02:02:01.500 | if you don't have a definition that you can agree on.
02:02:04.320 | And, you know, there's this big controversy
02:02:08.000 | about whether these large language models
02:02:10.040 | understand language or not, right?
02:02:14.080 | The way we do.
02:02:15.060 | And what it really is revealing
02:02:18.920 | is we don't understand what understanding is.
02:02:22.320 | Literally, we don't have a really good argument
02:02:25.900 | or a measure that you could measure someone's understanding
02:02:29.040 | and then apply it to the GPT and see whether it's the same.
02:02:33.400 | It probably isn't exactly the same,
02:02:36.040 | but maybe there's some continuum here
02:02:37.840 | we're talking about, right?
02:02:39.200 | You know, the way I look at it,
02:02:43.460 | it's as if an alien suddenly landed on Earth
02:02:51.440 | and started talking to us in English, right?
02:02:56.260 | And the only thing we could be sure of,
02:02:57.820 | it was that it's not human, right?
02:03:00.260 | - I met some people that I wondered
02:03:01.940 | about their terrestrial origins.
02:03:05.900 | - Okay, okay.
02:03:06.740 | Well, okay, now there's a big diversity amongst humans too.
02:03:09.780 | You're right about that.
02:03:11.040 | - Yeah, yeah, yeah.
02:03:11.880 | Certain colleagues of ours at UCSD years ago,
02:03:14.660 | one in particular in the physics department
02:03:16.500 | who I absolutely adore as a human being,
02:03:19.540 | just had such an unusual pattern of speech,
02:03:23.580 | of behavior, totally appropriate behavior,
02:03:26.040 | but just unusual.
02:03:27.480 | In the middle of a faculty meeting,
02:03:29.120 | would just kind of turn to me and start talking
02:03:30.720 | while the other person was presenting.
02:03:32.640 | And I was like, "Maybe not now."
02:03:34.240 | And he would say, "Oh, okay."
02:03:37.740 | But in any other domain,
02:03:40.120 | you'd say he was very socially adept.
02:03:41.840 | And so, you know, there's certain people
02:03:43.400 | that just kind of discard with convention
02:03:46.280 | and you kind of want to like, "Is he an alien?"
02:03:47.960 | It's kind of cool, in a cool way.
02:03:49.520 | Like, you know, he's one of my, again,
02:03:50.920 | a friend and somebody I really delight in.
02:03:52.720 | - It's true, it's true.
02:03:53.860 | You know, not everybody has adopted
02:03:56.980 | the same social conventions.
02:03:58.660 | It could be a touch of autism.
02:04:01.980 | That's a problem that, I mean, in other words,
02:04:05.500 | there are very high functioning autistic people out there.
02:04:08.220 | - He's brilliant.
02:04:09.380 | - And often they are, you know.
02:04:11.160 | There are people who are brilliant with autism,
02:04:16.540 | but, you know.
02:04:18.220 | - Could you build an LLM that was more
02:04:22.140 | on one end of the spectrum versus the other
02:04:24.040 | to see what kind of information they forage for?
02:04:26.180 | - I reviewed a paper.
02:04:27.020 | - It seemed like it would be a really important thing to do.
02:04:30.340 | - That it's been done.
02:04:31.180 | Okay, there was a paper that I reviewed
02:04:33.240 | where they took the LLM and they fine-tuned it
02:04:36.620 | with different data from people with different disorders,
02:04:40.100 | you know, autism and so forth.
02:04:41.940 | And sociopaths, you know.
02:04:49.100 | - That's scary.
02:04:50.540 | But you want to know the answer.
02:04:51.820 | - No, and they got these LLMs to behave
02:04:54.660 | just like those people who have these disorders.
02:04:58.480 | You can get them to behave that way, yes.
02:05:00.860 | - Could you do political leaning and values?
02:05:05.060 | - I haven't seen that, but it's pretty clear that,
02:05:07.860 | to me at least, that if you can do sociopathy,
02:05:10.860 | you can probably do any political belief, you know.
02:05:15.100 | - But you could also view all this as,
02:05:17.780 | you could take benevolent tracks.
02:05:19.140 | You could also say hyper-creative,
02:05:21.940 | sensitive to emotional tone of voices
02:05:29.520 | and find out what kind of information that person brings,
02:05:32.660 | excuse me, that LLM brings back
02:05:35.020 | versus somebody who is very oriented
02:05:37.640 | towards just the content of people's words
02:05:39.700 | as opposed to what, you know.
02:05:41.140 | Because among people, you find this.
02:05:43.100 | You know, if you've ever left a party
02:05:44.340 | with a significant other,
02:05:45.740 | and sometimes someone will say,
02:05:47.980 | "I've had this experience with like,
02:05:49.260 | "did you see that interaction between so-and-so?"
02:05:51.500 | I'm like, "no, what are you talking about?
02:05:52.500 | "Like, did you hear that?"
02:05:53.340 | I'm like, "no, not at all.
02:05:54.300 | "I didn't hear, I heard the words,
02:05:55.760 | "but I did not pick up on what you were picking up on."
02:05:58.300 | And it was clear that there's two very different experiences
02:06:00.860 | of the same content based purely on a difference
02:06:04.220 | in interpretation of the tonality.
02:06:06.260 | - Okay, there's a lot of information that, as you point out,
02:06:10.380 | which has to do with the tone,
02:06:17.800 | the facial expressions.
02:06:17.800 | You know, there's a tremendous amount of information
02:06:20.960 | that is passed not just with words,
02:06:23.300 | but with all the other parts of the visual input
02:06:26.380 | and so forth.
02:06:27.580 | And some people are good at picking that up
02:06:29.320 | and others are not.
02:06:30.580 | There's a tremendous variability between individuals.
02:06:34.060 | And, you know, biology is all about diversity,
02:06:36.980 | and it's all about, you know,
02:06:39.180 | needing a gene pool that's very diverse
02:06:41.180 | so that you can evolve and survive catastrophic changes
02:06:46.180 | that occur in a climate, for example.
02:06:50.020 | But wouldn't it be wonderful
02:06:52.540 | if we could create an LLM
02:07:00.220 | that could understand what those differences are?
02:07:09.180 | Now just think about it, right?
02:07:10.700 | Like a truly diverse LLM
02:07:12.340 | that integrated all those differences.
02:07:13.780 | - Yeah, so here's how, what you'd have to do.
02:07:15.620 | What you'd have to do is to train it up on data
02:07:18.940 | from a bunch of individuals, human individuals.
02:07:22.360 | Now, one of the things about these LLMs
02:07:24.260 | is that they don't have a single persona.
02:07:26.560 | They can adopt any persona.
02:07:30.060 | You have to tell it what you're expecting from.
02:07:33.460 | - Or ask it in a way that works for you
02:07:35.520 | and you'll get back a certain persona.
02:07:37.380 | - If you, I once gave it an abstract from a paper,
02:07:41.740 | very technical, a computational paper.
02:07:43.640 | And I said, "You are a neuroscientist.
02:07:47.020 | "I want you to explain this abstract to a 10-year-old."
02:07:50.560 | It did it in a way that I could never have done it.
02:07:54.820 | It really simplified it. - Was it accurate?
02:07:56.860 | - Some of the subtleties were not in it,
02:07:59.340 | but it explained, you know, what plasticity it was
02:08:02.060 | and explained what a synapse is.
02:08:03.660 | And you know, it did that. - Amazing.
02:08:05.340 | - Almost like a qualifying exam for a graduate student.
02:08:07.740 | - I saw something today on X, formerly known as Twitter,
02:08:11.500 | that blew my mind that I wanted your thoughts on
02:08:14.140 | that is very appropriate to what you're saying right now,
02:08:17.060 | which is someone was asking questions of an LLM
02:08:20.700 | on ChatGPT or maybe one of these other,
02:08:23.420 | Anthropic or Claude or something like that.
02:08:26.100 | I probably misused those names.
02:08:27.620 | One of the AI online sites.
02:08:32.060 | And somewhere in the middle of its answers,
02:08:36.780 | the LLM decided to just take a break
02:08:39.660 | and start looking at pictures of landscapes in Yosemite.
02:08:43.640 | Like the LLM was doing what a maybe
02:08:47.580 | cognitively fatigued person
02:08:50.700 | or what any kind of online person online would do,
02:08:54.140 | which was to like take a break
02:08:55.320 | and look at a couple of pictures of something they,
02:08:57.020 | you know, maybe they're thinking about
02:08:57.900 | going camping there or something,
02:08:59.180 | and then get back to whatever task.
02:09:01.660 | We hear about hallucinations in AI,
02:09:03.780 | that it can imagine things that aren't there,
02:09:06.060 | just like a human brain.
02:09:07.460 | But that blew my mind.
02:09:11.020 | - I haven't encountered that,
02:09:12.580 | but, you know, isn't it fascinating?
02:09:14.500 | You know, that's a sign of a real generative
02:09:20.040 | internal model.
02:09:21.100 | See, here's the thing that,
02:09:24.200 | the thing that most distinguishes, I think,
02:09:27.060 | an LLM from a human is that,
02:09:29.660 | you know, if you go into a room,
02:09:34.280 | quiet room, and just sit there
02:09:36.220 | without any sensory stimulation,
02:09:38.900 | your brain keeps thinking, right?
02:09:41.100 | In other words, you think about
02:09:42.660 | what you wanna do, you know, planning ahead
02:09:45.940 | or something that happened to you during the day,
02:09:48.180 | right, your brain is always generating internally.
02:09:50.980 | You know, after talking to you,
02:09:55.260 | one of these large language models just goes blank.
02:09:59.460 | There is no self-continuous,
02:10:03.980 | self-generated thoughts.
02:10:05.860 | - And yet we know self-generated thought,
02:10:07.700 | and in particular brain activity during sleep,
02:10:10.240 | as you illustrated earlier,
02:10:12.040 | with the example of sleep spindles
02:10:13.660 | and rapid eye movement,
02:10:14.500 | sleep are absolutely critical for
02:10:16.380 | shaping the knowledge that we experience during the day.
02:10:23.100 | So these LLMs are not quite where we are at yet.
02:10:28.460 | I mean, they can outperform us in certain things like Go,
02:10:33.460 | but how soon will we have LLMs, AI that is,
02:10:38.480 | with self-generated internal activity?
02:10:41.820 | - We're getting closer.
02:10:44.780 | And so this is something I'm working on myself, actually,
02:10:47.700 | trying to understand how that's done in our own brains,
02:10:52.040 | was generating continual brain activity
02:10:56.380 | that leads to planning and things.
02:10:59.860 | We don't know what the answer to that is yet
02:11:01.900 | in neuroscience.
02:11:02.920 | And by the way, you go to a lecture
02:11:06.160 | and you hear the words one after the next over an hour,
02:11:11.380 | and you see the slides one after the next.
02:11:13.740 | At the end, you ask a question, right?
02:11:15.860 | Just let's think about what you just did.
02:11:18.500 | Somehow you're able to integrate all that information
02:11:21.860 | over the hour and then use your long-term memory
02:11:25.380 | then to come up with some insight
02:11:27.340 | or some issue that you want.
02:11:29.140 | How does your brain remember all that information?
02:11:35.020 | Working memory, traditional working memory
02:11:37.940 | that neuroscientists study is only for a few seconds,
02:11:40.380 | right, or maybe a telephone number or something.
02:11:43.140 | But we're talking about long-term working memory.
02:11:45.900 | We don't understand how that is done.
02:11:49.020 | And LLMs, actually, large language models,
02:11:53.580 | can do something, it's called in-context learning.
02:11:56.500 | And it's a really, it was a great surprise
02:11:59.360 | because there is no plasticity.
02:12:01.740 | The thing learns at the beginning,
02:12:03.300 | you train it up on data,
02:12:04.500 | and then all it does after that is to inference,
02:12:07.780 | you know, fast loop of activity one word after the next,
02:12:12.180 | right, that's what happens with no learning, no learning.
02:12:16.540 | But it's been noticed that as you continue your dialogue,
02:12:22.260 | it seems to get better at things.
02:12:24.740 | How could that be?
02:12:25.660 | How could it be in context learning,
02:12:28.700 | even though there's no plasticity?
02:12:31.100 | That's a mystery.
02:12:31.940 | We don't know the answer to that question yet.
02:12:34.180 | But we also don't know what the answer is
02:12:36.140 | for humans either.
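(A minimal illustration in Python of what in-context learning looks like from the outside; the translation task is the classic few-shot example and the exact wording is an assumption. Nothing about the model's weights changes anywhere, only the prompt grows.)

# In-context learning: a frozen model gets better at a task simply because
# worked examples accumulate in its context window; no weights are updated.
examples = [
    ("sea otter", "loutre de mer"),
    ("peppermint", "menthe poivrée"),
]
query = "cheese"

prompt = "Translate English to French.\n"
for english, french in examples:      # all of the "learning" lives in this text
    prompt += f"{english} -> {french}\n"
prompt += f"{query} -> "

print(prompt)   # a capable frozen model typically completes this with "fromage"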
02:12:38.420 | - Right.
02:12:39.260 | Could I ask you a few questions about you
02:12:41.900 | and as it relates to science and your trajectory?
02:12:44.720 | Building off of what you were just saying,
02:12:48.500 | do you have a practice of meditation or eyes closed,
02:12:53.500 | sensory input reduced or shut down
02:12:58.480 | to drive your thinking in a particular way?
02:13:02.020 | Or are you, you know, at your computer
02:13:04.280 | talking to your students and postdocs
02:13:05.960 | and sprinting on the beach?
02:13:07.300 | - You know, it's funny you mentioned that
02:13:10.020 | 'cause I get my best ideas, not sprinting on the beach,
02:13:13.060 | but you know, just either walking or jogging.
02:13:17.700 | And it's wonderful, I don't know.
02:13:19.460 | I think, you know, serotonin goes up.
02:13:21.100 | It's another neuromodulator.
02:13:22.420 | I think that that stimulates ideas and thoughts.
02:13:26.500 | And so inevitably, I come back to my office
02:13:30.260 | and I can't remember any of those great ideas.
02:13:35.260 | - What do you do about that?
02:13:36.860 | - Well, now I take notes.
02:13:38.180 | - Okay, voice memos?
02:13:39.580 | - Yeah.
02:13:40.540 | And some of them do pan out.
02:13:42.780 | You know, there's no doubt about it.
02:13:44.460 | You're put into a situation.
02:13:46.940 | It is a form of meditation.
02:13:48.460 | You know, if you're running at a steady pace,
02:13:52.700 | there's nothing distracting about, you know, the beach.
02:13:56.220 | - Or do you listen to music or podcasts?
02:13:57.860 | - No, I never listen to anything except my own thoughts.
02:14:01.500 | - So there's a former guest on this podcast who,
02:14:05.100 | she happens to be triple degreed from Harvard,
02:14:07.180 | but she's more in the kind of like personal coach space,
02:14:10.700 | but very, very high level and impressive mind,
02:14:13.260 | impressive human all around.
02:14:14.620 | And she has this concept of wordlessness
02:14:18.860 | that can be used to accomplish a number of different things,
02:14:22.860 | but this idea that allowing oneself
02:14:25.820 | or creating conditions for oneself
02:14:28.220 | to enter states throughout the day,
02:14:30.740 | or maybe once a day of very minimal sensory input,
02:14:35.460 | no lecture, no podcast, no book, no music, nothing,
02:14:38.860 | and allowing the brain to just kind of idle
02:14:43.140 | and go a little bit non-linear, if you will.
02:14:45.780 | - Right.
02:14:46.860 | - Where we're not constructing thoughts
02:14:48.540 | or paying attention to anyone else's thoughts
02:14:51.020 | through those media venues in any kind of structured way
02:14:55.300 | as a source of great ideas and creativity.
02:14:58.440 | - It's been studied.
02:14:59.280 | Psychologists call it mind-wandering.
02:15:00.980 | - Mind-wandering.
02:15:01.860 | - Yeah, there is a significant literature.
02:15:04.460 | And it's often when you have an aha moment, right?
02:15:09.260 | You know, your mind is wandering
02:15:10.860 | and it's thinking non-linearly
02:15:13.740 | in the sense of not following a sequence
02:15:17.420 | that is logical, you know, hopping from thing to thing.
02:15:22.300 | Often that's when you get a great idea,
02:15:25.040 | with just letting your mind wander.
02:15:27.500 | Yeah, and that happens to me.
02:15:29.540 | - I wonder whether social media
02:15:32.200 | and just texting and phones in general
02:15:33.940 | have eliminated a lot of the, you know,
02:15:36.100 | walks to the car after work
02:15:37.520 | where one would normally not be on a call
02:15:39.700 | or in communication with anyone or anything.
02:15:42.060 | I used to do experiments where I was, you know,
02:15:43.820 | like pipetting and running immunohistochemistry.
02:15:47.860 | And it was very relaxing.
02:15:49.580 | And I could think while I was doing it
02:15:51.180 | 'cause I knew the procedures.
02:15:52.360 | And then, you know, you had to pay attention
02:15:53.780 | to certain things, write them down.
02:15:54.860 | But I would often feel like, wow,
02:15:57.380 | I'm both working and relaxing and thinking of things.
02:16:00.340 | And then I would listen to music sometimes.
02:16:02.180 | - Okay, so we have a whole session,
02:16:06.700 | you know, a clip in "Learning How to Learn"
02:16:09.940 | about exactly this phenomenon.
02:16:13.500 | Here's what we tell our students, right?
02:16:16.540 | Is that, you know, if you're having trouble
02:16:19.100 | with some concept or, you know,
02:16:20.660 | you don't understand something,
02:16:21.680 | you're beating your head against the wall,
02:16:24.380 | don't, stop, stop.
02:16:27.480 | Just go off and do something.
02:16:29.360 | Go off and clean the dishes.
02:16:31.360 | Go off and, you know, walk around the block.
02:16:34.520 | And inevitably what happens is
02:16:36.480 | when you come back, your mind is clear
02:16:38.500 | and you figure out what to do.
02:16:40.940 | And that's one of the best pieces of advice
02:16:44.060 | that anybody could get.
02:16:45.180 | Because, you know,
02:16:47.020 | nobody has told us how the brain works, right?
02:16:49.820 | Some people are really good at intuiting it
02:16:53.180 | because they've experienced it,
02:16:55.180 | and you may be one of them.
02:16:58.580 | The other thing is, everybody I know
02:17:00.620 | who's really made important contributions,
02:17:06.560 | and I'll bet you're one of them, has had this experience.
02:17:09.000 | You know, you're struggling with some problem at night
02:17:13.640 | and you go to bed and you wake up in the morning.
02:17:15.080 | Ah, that's the solution.
02:17:17.680 | That's what I should do, right?
02:17:18.600 | - First thing in the morning when I wake up
02:17:20.280 | is when I'm almost bombarded with,
02:17:23.860 | I wouldn't say insight and not always meaningful insight,
02:17:27.500 | but certainly what was unclear
02:17:30.600 | becomes immediately clear on waking.
02:17:32.160 | - That's right.
02:17:33.000 | That's the thing that is so amazing about sleep.
02:17:36.260 | And you can see people who know this can count on it.
02:17:41.260 | In other words, the key is to think about it
02:17:43.340 | before you go to sleep, right?
02:17:46.540 | Your brain works on it during the sleep period, right?
02:17:49.020 | And so, you know, don't watch TV
02:17:50.740 | because then who knows what your brain's gonna work on.
02:17:53.440 | You know, use the time before you fall asleep
02:17:57.660 | to think about something that is bothering you
02:17:59.700 | or maybe something that, you know,
02:18:01.140 | you're trying to understand, maybe, you know,
02:18:03.960 | a paper that you've read, and you say,
02:18:06.200 | oh, you know, I'm tired, I'm gonna go to sleep.
02:18:08.920 | You wake up in the morning and say,
02:18:09.760 | oh, I know what's going on in that paper.
02:18:11.680 | Yeah, I mean, that's what happens.
02:18:13.080 | You can use, you know,
02:18:14.280 | once you know something about how the brain works,
02:18:15.920 | you can take advantage of that.
02:18:17.960 | - Do you pay attention to your dreams?
02:18:20.000 | Do you record them?
02:18:20.920 | - No, no.
02:18:22.040 | Okay, so here's the problem.
02:18:25.400 | Dreams seem so iconic
02:18:30.180 | and a lot of people, you know,
02:18:32.200 | somehow attribute things to them,
02:18:35.220 | but there has never been any good theory
02:18:40.680 | or any good understanding of, first of all, why we dream.
02:18:45.480 | I mean, it's still not completely clear.
02:18:47.360 | There are some ideas, but also,
02:18:49.440 | what triggers this particular dream, why this particular dream?
02:18:52.720 | Does that have some significance for you?
02:18:55.280 | And the only thing that I know
02:18:59.480 | that might explain a little bit
02:19:01.500 | is that, you know, the dreams are often very visual,
02:19:05.520 | you know, during rapid eye movement sleep,
02:19:08.080 | so that there's something happening there.
02:19:10.640 | Actually, it's interesting.
02:19:11.520 | All the neuromodulators are downregulated during sleep
02:19:14.000 | and then during REM sleep,
02:19:15.260 | the acetylcholine comes up, right?
02:19:16.660 | So that's a very powerful neuromodulator.
02:19:19.440 | It's important for attention, for example,
02:19:22.060 | but it doesn't come up in the prefrontal cortex,
02:19:24.460 | which means that the circuits in the prefrontal cortex
02:19:27.680 | that are interpreting the sensory input coming in
02:19:31.320 | are not turned on.
02:19:34.360 | So any of these, whatever happens in your visual cortex
02:19:37.880 | is not being monitored anymore.
02:19:40.800 | So you get bizarre things, you know,
02:19:42.240 | that you start floating and, you know, things happen to you
02:19:44.880 | and, you know, it's not anchored anymore.
02:19:48.200 | And so, but that still doesn't explain why, right?
02:19:51.400 | Why you have that period.
02:19:52.900 | It's important 'cause if you block it,
02:19:55.040 | and there are some sleeping pills that do block it,
02:19:57.660 | you know, it really does cause problems
02:20:00.360 | with, you know, normal cognitive function.
02:20:03.520 | - Cannabis as well.
02:20:04.960 | People who come off cannabis
02:20:07.120 | experience a tremendous REM rebound
02:20:11.080 | and lots of dreaming in the, you know,
02:20:14.920 | the days and weeks and months after cannabis.
02:20:18.560 | - Wow.
02:20:19.380 | - I don't wanna call it withdrawal
02:20:20.280 | 'cause that has a different meaning.
02:20:21.120 | - No, no, it's an imbalance that was caused,
02:20:25.680 | you know, because the brain adjusted
02:20:27.640 | to the endocannabinoid levels.
02:20:31.320 | And now it's gotta go back and then it takes time,
02:20:34.760 | but it's interesting.
02:20:35.600 | It's interesting that it affects dreams.
02:20:36.960 | I think that may be a clue.
02:20:38.680 | - Yeah, very, very common phenomenon.
02:20:41.160 | I'm told, I'm not a cannabis user,
02:20:44.100 | but no judgment there, I just am not.
02:20:47.800 | - There's actually a book I read years ago
02:20:51.320 | when I was in college, so a long time ago,
02:20:54.100 | by Allan Hobson, who was out at Harvard.
02:20:55.960 | - Oh yeah, I know him.
02:20:57.960 | - Oh, cool, so I never met him,
02:21:00.320 | but he had this interesting idea
02:21:02.840 | that dreams, in particular rapid eye movement dreams,
02:21:06.080 | were so very similar to the experience
02:21:09.340 | that one has on certain psychedelics,
02:21:12.160 | LSD, lysergic acid, diethylamide, or psilocybin,
02:21:16.720 | and that perhaps dreams are revealing the unconscious mind,
02:21:21.520 | not saying this in any psychological terms,
02:21:23.560 | that when we're asleep, our conscious mind
02:21:25.480 | can't control thought and action in the same way, obviously,
02:21:29.360 | and it's sort of a recession of the waterline,
02:21:32.240 | so we're getting more of the unconscious processing revealed.
02:21:36.480 | - You know, that's an interesting hypothesis.
02:21:38.820 | How would you test it?
02:21:39.920 | - I'd probably have to put someone in a scanner,
02:21:43.600 | have them go to sleep, put them in the scanner
02:21:46.000 | on a psilocybin journey, this kind of thing.
02:21:51.000 | You know, it's tough.
02:21:52.520 | I mean, any of these observational studies,
02:21:54.980 | of course we both know, are deficient in the sense
02:21:57.380 | that what you'd really like to do
02:21:58.320 | is control the neural activity.
02:22:00.200 | You'd like to get in there and tickle the neurons over here
02:22:02.660 | and see how the brain changes,
02:22:04.240 | and you'd love to get real-time subjective report.
02:22:06.420 | This is the problem with sleep and dreaming,
02:22:07.680 | is you can wake people up and ask them
02:22:09.240 | what they were just dreaming about,
02:22:11.040 | but you can't really know what they're dreaming
02:22:13.520 | about in real time.
02:22:15.920 | - It's true, yeah, it's true.
02:22:17.680 | By the way, you know, there are two kinds of dreams.
02:22:20.600 | Very interesting.
02:22:21.640 | So if you wake someone up during REM sleep,
02:22:24.840 | you get very vivid, changing dreams.
02:22:28.320 | They're always different and changing,
02:22:31.320 | but if you wake someone up during slow-wave sleep,
02:22:33.920 | you often get a dream report,
02:22:35.680 | but it's a kind of dream that keeps repeating
02:22:38.280 | over and over again every night,
02:22:39.960 | and it's a very heavy emotional content.
02:22:42.680 | - Interesting, that's in slow-wave sleep?
02:22:44.960 | - Yeah.
02:22:45.800 | - 'Cause I've had a few dreams over and over and over
02:22:48.160 | throughout my life, so this would be in slow-wave sleep.
02:22:50.640 | - Yeah, probably slow-wave sleep, yeah.
02:22:53.000 | - Fascinating.
02:22:53.840 | As a neuroscientist who's computationally oriented,
02:22:59.400 | but really you incorporate the biology so well
02:23:01.760 | into your work, so that's one of the reasons you're you,
02:23:03.920 | you're this luminary of your field,
02:23:06.320 | and who's also now really excited about AI,
02:23:10.940 | what are you most excited about now?
02:23:13.540 | Like if you had, and you know,
02:23:16.060 | of course this isn't the case,
02:23:17.100 | but if you had like 24 more months
02:23:20.260 | to just pour yourself into something,
02:23:21.860 | and then you had to hand the keys to your lab
02:23:24.660 | over to someone else, what would you go all in on?
02:23:28.060 | - Well, so the NIH has something called the Pioneer Award,
02:23:32.440 | and what they're looking for are big ideas
02:23:36.340 | that could have a huge impact, right?
02:23:39.660 | So I put one in recently, and here's the title:
02:23:44.660 | Temporal Context in Brains and Transformers.
02:23:50.140 | - And in brains and transforms?
02:23:54.100 | - Transformers.
02:23:55.300 | - Formers.
02:23:56.140 | - AI, right, the key to GPT is the fact
02:24:00.460 | there's this new architecture,
02:24:01.660 | it's a deep learning architecture,
02:24:03.380 | feed-forward network, but it's called a transformer,
02:24:06.700 | and it has certain parts in it that are unique.
02:24:10.420 | There's one called self-attention,
02:24:12.120 | and it's a way of doing what is called temporal context,
02:24:18.400 | what it does is it connects words that are far apart,
02:24:22.260 | you give it a sequence of words,
02:24:23.620 | and it can tell you the association,
02:24:25.480 | like if I use the word 'this',
02:24:27.220 | then you have to figure out what it referred to
02:24:30.660 | in the last sentence,
02:24:31.500 | well, there are three or four nouns it could have referred to,
02:24:33.740 | but from context, you can figure out which one it refers to,
02:24:37.700 | and you can learn that association.
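(A minimal sketch of the self-attention computation being described, assuming random stand-in embeddings. Real transformers use learned query, key, and value projections and typically a causal mask, which this sketch omits; it only shows how every token is compared with every other token, so that distant words, such as a pronoun and the noun it refers to, can be linked.)

```python
# A minimal sketch of scaled dot-product self-attention.
# Toy example with made-up 4-dimensional embeddings; real transformers use
# learned projections and hundreds or thousands of dimensions.
import numpy as np

def self_attention(X):
    """X: (num_tokens, dim) matrix of token embeddings.
    Returns context-mixed representations: each token becomes a weighted
    average of all tokens, so distant words can influence each other."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                  # similarity of every token to every other
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over the sequence
    return weights @ X, weights

# tokens: "the pianist lifted the lid and this surprised everyone"
# (embeddings here are random stand-ins, for illustration only)
X = np.random.default_rng(0).normal(size=(9, 4))
mixed, attn = self_attention(X)
print(attn[6].round(2))  # how strongly the word "this" attends to every word in the sentence
```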
02:24:40.000 | - Could I just play with another example
02:24:42.260 | to make sure I understand this correctly?
02:24:44.740 | I've seen these word bubble charts,
02:24:46.260 | like if we were to say piano, you'd say keys,
02:24:48.820 | you'd say music, you'd say seat,
02:24:51.020 | and then it kind of builds out a word cloud of association.
02:24:54.420 | And then over here, we'd say,
02:24:56.460 | I don't know, I'm thinking about the Salk Institute,
02:24:57.780 | I'd say sunset, Stonehenge, anyone that looks up,
02:25:00.340 | there's this phenomenon of Salkhenge.
02:25:02.620 | Then you start building out a word cloud over there.
02:25:05.060 | These are disparate things,
02:25:06.780 | except I've been to a classical music concert
02:25:09.900 | at the Salk Institute twice,
02:25:12.260 | so they're not completely non-overlapping,
02:25:15.820 | and so you start getting associations at a distance,
02:25:17.820 | and eventually they bridge together.
02:25:19.180 | Is this what you're referring to?
02:25:20.340 | - Yes, I think that that's an example,
02:25:24.140 | but it turns out that every word is ambiguous,
02:25:27.260 | it has like three, four meanings,
02:25:29.020 | and so you have to figure that out from context.
02:25:32.340 | So in other words, there are words that live together,
02:25:35.140 | and that come up often,
02:25:37.980 | and you can learn that just by, you know,
02:25:41.420 | predicting the next word in a sentence,
02:25:42.920 | that's how a transformer is trained.
02:25:45.220 | You give it a bunch of words,
02:25:46.460 | and it keeps predicting the next word in a sentence.
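(The training objective just described, predict the next word, can be illustrated in its simplest possible form with a bigram count model over a made-up corpus. A transformer optimizes the same kind of objective, just with a far richer model of context than the single preceding word.)

```python
# A minimal sketch of the "predict the next word" objective in its simplest form:
# a bigram count model built from a tiny invented corpus.
from collections import Counter, defaultdict

corpus = "the piano has keys the piano makes music the salk institute faces the sunset".split()

counts = defaultdict(Counter)
for prev_word, next_word in zip(corpus, corpus[1:]):
    counts[prev_word][next_word] += 1          # count which word follows which

def predict_next(word: str) -> str:
    """Return the most frequent continuation seen in training."""
    return counts[word].most_common(1)[0][0] if counts[word] else "<unknown>"

print(predict_next("piano"))   # 'has' or 'makes' -- whichever was counted first/more often
print(predict_next("salk"))    # 'institute'
```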
02:25:48.780 | - Like in my email now, it tries to predict the next word.
02:25:51.220 | - Exactly.
02:25:52.060 | - And it's mostly right part of the time.
02:25:54.380 | - Okay, well, that's because it's a very primitive version
02:25:57.420 | of this algorithm.
02:25:59.220 | What happens is if you train it up on enough data,
02:26:02.500 | not only can it predict the next word,
02:26:05.040 | it internally builds up a semantic representation
02:26:10.060 | in the same way you describe the words
02:26:11.940 | that are related to each other,
02:26:14.260 | having, you know, associations.
02:26:17.500 | It can figure that out, and it has representations
02:26:19.980 | inside this very large network
02:26:21.840 | with trillions of parameters,
02:26:23.900 | and it's unbelievable how big they've gotten.
02:26:26.900 | And those associations now form an internal model
02:26:31.900 | of the meaning of the sentence.
02:26:37.140 | Literally, this is something that now
02:26:41.500 | we've probed in these transformers,
02:26:43.300 | and so we are pretty confident.
02:26:47.220 | And that means that it's forming an internal model
02:26:51.980 | of the outside world, in this case, a bunch of words.
02:26:55.700 | And that's how it's able to actually respond to you
02:26:59.660 | in a way that is sensible, that makes sense,
02:27:02.260 | and actually is interesting, and so forth.
02:27:04.360 | And it's all from the self-attention I'm talking about.
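(One way researchers probe for this kind of internal semantic representation is to check whether related words end up near each other in the model's embedding space, for example with cosine similarity. The vectors below are invented purely for illustration; in practice they would be read out of a trained network.)

```python
# A minimal sketch of probing a learned representation: measure whether words that
# "live together" end up near each other in embedding space.
import numpy as np

embeddings = {                                   # invented 3-d vectors for illustration
    "piano": np.array([0.9, 0.8, 0.1]),
    "music": np.array([0.8, 0.9, 0.2]),
    "sunset": np.array([0.1, 0.2, 0.9]),
}

def cosine(a, b):
    """Cosine similarity: near 1.0 means closely related, near 0 means unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embeddings["piano"], embeddings["music"]))   # high -- related concepts
print(cosine(embeddings["piano"], embeddings["sunset"]))  # lower -- weak association
```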
02:27:07.860 | So in any case, my pioneer proposal is to figure out
02:27:11.460 | how does the brain do self-attention, right?
02:27:14.420 | It's gotta do it somehow.
02:27:16.100 | And I'll give you a little hint.
02:27:19.300 | Basal ganglia.
02:27:22.320 | - It's in the basal ganglia.
02:27:23.780 | - That's my hypothesis.
02:27:25.740 | Well, we'll see.
02:27:26.580 | I mean, I'll be working with experimental people.
02:27:30.920 | I've worked with John Reynolds, for example,
02:27:35.060 | who studies primate visual cortex,
02:27:38.220 | and we've looked at traveling waves there,
02:27:40.740 | and there are other people that have looked at them in primates as well.
02:27:45.100 | And so now, these traveling waves, I think,
02:27:49.780 | are also a part of the puzzle, pieces of the puzzle
02:27:54.780 | that are gonna give us a much better view
02:27:57.940 | of how the cortex is organized
02:28:00.180 | and how it interacts with the basal ganglia.
02:28:02.980 | We've already been there.
02:28:04.420 | But we still, neuroscientists have studied
02:28:08.140 | each one of these parts of the brain independently,
02:28:10.860 | and now we have to start thinking about
02:28:12.900 | putting the pieces of the puzzle together, right?
02:28:15.020 | Trying to get all the things that we know about these areas
02:28:18.100 | and see how they work together in a computational way.
02:28:21.420 | And that's really where I want to go.
02:28:23.420 | - I love it.
02:28:25.460 | And I do hope they decide to fund your Pioneer Award.
02:28:28.180 | - I do too. - Yeah.
02:28:29.700 | And should they make the bad decision not to,
02:28:33.180 | maybe we'll figure out another way
02:28:34.260 | to get the work done.
02:28:35.980 | Certainly you will.
02:28:36.940 | Terry, I want to thank you,
02:28:41.820 | first of all, for coming here today,
02:28:43.340 | taking time out of your busy cognitive
02:28:45.700 | and running and teaching and research schedule
02:28:49.140 | to share your knowledge with us.
02:28:51.260 | And also for the incredible work that you're doing
02:28:54.040 | on public education and teaching the public,
02:28:56.980 | I should say, giving the public resources
02:29:00.000 | to learn how to learn better at zero cost.
02:29:02.460 | So we will certainly provide links to learning how to learn
02:29:05.820 | and your book and to these other incredible resources
02:29:09.880 | that you've shared.
02:29:10.720 | And you've also given us a ton of practical tools today
02:29:13.620 | related to exercise, mitochondria,
02:29:15.660 | and some of the things that you do,
02:29:16.820 | which of course are just your versions of what you do,
02:29:18.720 | but that certainly, certainly are going to be a value
02:29:22.220 | to people, including me,
02:29:24.420 | in our cognitive and physical pursuits
02:29:26.900 | and frankly, just longevity.
02:29:28.940 | I mean, this is not lost on me and those listening
02:29:32.420 | that your vigor is, as I mentioned earlier, undeniable.
02:29:36.220 | And it's been such a pleasure over the years
02:29:38.460 | to just see the amount of focus and energy and enthusiasm
02:29:43.560 | that you bring to your work and to observe
02:29:45.420 | that it not only hasn't slowed,
02:29:47.500 | but you're picking up velocity.
02:29:48.920 | So thank you so much for educating us today.
02:29:51.420 | I know I speak on behalf of myself
02:29:53.320 | and many, many people listening and watching.
02:29:56.220 | This is a real gift,
02:29:57.980 | a real incredible experience to learn from you.
02:30:01.100 | So thank you so much.
02:30:02.420 | - Well, thank you.
02:30:03.340 | And I have to say that I've been blessed over the years
02:30:06.860 | with wonderful students and wonderful colleagues.
02:30:10.700 | And I count you among them,
02:30:13.600 | someone I've really learned a lot from.
02:30:15.120 | - Thank you.
02:30:15.960 | - But, you know, we're, you know,
02:30:18.200 | science is a social activity and we learn from each other
02:30:23.200 | and we all make mistakes, but we learn from our mistakes.
02:30:28.520 | And that's the beauty of science
02:30:29.720 | is that we can make progress.
02:30:31.800 | Now, you know, your career has been remarkable too,
02:30:34.860 | because you have affected and influenced more people
02:30:38.720 | than anybody else I know personally with the knowledge
02:30:43.120 | that you are broadcasting through your interviews,
02:30:48.120 | but also, you know, just in terms of your interests.
02:30:51.820 | Really, I'm really impressed with what you've done
02:30:54.400 | and I want you to keep, you know, at it
02:30:58.640 | because we need people like you.
02:31:01.060 | We need scientists who can actually express
02:31:06.060 | and reach the public.
02:31:08.000 | If we don't do that,
02:31:09.600 | everything we do is behind closed doors, right?
02:31:11.640 | Nothing gets out.
02:31:12.640 | And so you're one of the best of the breed
02:31:16.160 | in terms of being able to explain things in a clear way
02:31:20.280 | that gets through to more people than anybody else I know.
02:31:23.680 | - Well, thank you.
02:31:24.520 | I'm very honored to hear that.
02:31:25.720 | It's a labor of love for me and I'll take those words in
02:31:29.300 | and I really appreciate it.
02:31:31.080 | It's an honor and a privilege to sit with you today
02:31:32.920 | and please come back again.
02:31:34.440 | - I would love to, yeah.
02:31:35.960 | - All right, thank you, Terry.
02:31:37.520 | - You're welcome.
02:31:38.740 | - Thank you for joining me for today's discussion
02:31:40.800 | with Dr. Terry Sejnowski.
02:31:42.560 | To find links to his work,
02:31:44.080 | the Zero Cost Online Learning Portal
02:31:45.800 | that he and his colleagues have developed
02:31:47.720 | and to find links to his new book,
02:31:49.620 | please see the show note captions.
02:31:51.520 | If you're learning from and or enjoying this podcast,
02:31:54.040 | please subscribe to our YouTube channel.
02:31:55.880 | That's a terrific zero cost way to support us.
02:31:58.400 | In addition, please follow the podcast
02:32:00.520 | on both Spotify and Apple.
02:32:02.240 | And on both Spotify and Apple,
02:32:03.660 | you can leave us up to a five-star review.
02:32:06.140 | Please check out the sponsors mentioned at the beginning
02:32:08.320 | and throughout today's episode.
02:32:09.920 | That's the best way to support this podcast.
02:32:12.280 | If you have questions or comments about the podcast
02:32:14.720 | or guests or topics that you'd like me to consider
02:32:16.640 | for the Huberman Lab Podcast,
02:32:18.160 | please put those in the comment section on YouTube.
02:32:20.600 | I do read all the comments.
02:32:22.400 | For those of you that haven't heard,
02:32:23.560 | I have a new book coming out.
02:32:24.760 | It's my very first book.
02:32:26.360 | It's entitled "Protocols,
02:32:27.760 | An Operating Manual for the Human Body."
02:32:29.920 | This is a book that I've been working on
02:32:31.100 | for more than five years
02:32:32.240 | and that's based on more than 30 years
02:32:34.580 | of research and experience.
02:32:36.120 | And it covers protocols for everything from sleep,
02:32:39.200 | to exercise, to stress control,
02:32:41.680 | protocols related to focus and motivation.
02:32:44.120 | And of course, I provide the scientific substantiation
02:32:47.520 | for the protocols that are included.
02:32:49.580 | The book is now available by presale at protocolsbook.com.
02:32:53.480 | There you can find links to various vendors.
02:32:55.840 | You can pick the one that you like best.
02:32:57.620 | Again, the book is called "Protocols,
02:32:59.400 | An Operating Manual for the Human Body."
02:33:02.120 | If you're not already following me on social media,
02:33:04.080 | I'm Huberman Lab on all social media platforms.
02:33:07.000 | So that's Instagram, X, formerly known as Twitter,
02:33:09.960 | Threads, Facebook, and LinkedIn.
02:33:11.760 | And on all those platforms,
02:33:12.920 | I discuss science and science related tools,
02:33:15.040 | some of which overlaps with the content
02:33:16.560 | of the Huberman Lab podcast,
02:33:17.980 | but much of which is distinct from the content
02:33:20.080 | on the Huberman Lab podcast.
02:33:21.360 | Again, that's Huberman Lab on all social media platforms.
02:33:25.040 | If you haven't already subscribed
02:33:26.200 | to our Neural Network Newsletter,
02:33:27.760 | our Neural Network Newsletter
02:33:29.280 | is a zero cost monthly newsletter
02:33:31.240 | that includes podcast summaries,
02:33:32.660 | as well as protocols in the form of brief
02:33:34.840 | one to three page PDFs.
02:33:36.840 | Those one to three page PDFs cover things like
02:33:39.120 | deliberate heat exposure, deliberate cold exposure.
02:33:41.360 | We have a foundational fitness protocol.
02:33:43.400 | We also have protocols for optimizing your sleep,
02:33:45.760 | dopamine, and much more.
02:33:47.060 | Again, all available, completely zero cost.
02:33:49.440 | Simply go to HubermanLab.com,
02:33:51.600 | go to the menu tab,
02:33:52.520 | scroll down to newsletter and provide your email.
02:33:54.860 | We do not share your email with anybody.
02:33:57.360 | Thank you once again for joining me
02:33:58.680 | for today's discussion with Dr. Terry Sejnowski.
02:34:01.640 | And last, but certainly not least,
02:34:04.060 | thank you for your interest in science.
02:34:06.100 | [MUSIC PLAYING]