Enhance Your Learning Speed & Health Using Neuroscience Based Protocols | Dr. Poppy Crum


Chapters

0:00 Poppy Crum
2:22 Neuroplasticity & Limits; Homunculus
8:06 Technology; Environment & Hearing Thresholds; Absolute Pitch
13:12 Sponsors: David & Helix Sleep
15:33 Texting, Homunculus, Mapping & Brain; Smartphones
23:06 Technology, Data Compression, Communication, Smartphones & Acronyms
30:32 Sensory Data & Bayesian Priors; Video Games & Closed Loop Training
40:51 Improve Swim Stroke, Analytics & Enhancing Performance, Digital Twin
46:17 Sponsors: AGZ by AG1 & Rorra
49:08 Digital Twin; Tool: Learning, AI & Self-Testing
53:00 AI: Increase Efficacy or Replace Task?, AI & Germane Cognitive Load
62:07 Bread, Process & Appreciation; AI to Optimize Physical Environments
69:43 Awake States & AI; Measure & Modify
76:37 Wearables, Sensors & Measure Internal State; Pupil Size (Pupillometry)
83:58 Sponsor: Function
85:46 Integrative Systems, Body & Environment; Cognitive State & Decision-Making
92:11 Gamification, Developing Good Habits
98:17 Implications of AI, Diminishing Cognitive Skill
101:11 Digital Twins & Examples, Digital Representative; Feedback Loops
110:59 Customize AI; Situational Intelligence, Blind Spots, Work & Health, “Hearables”
121:08 Career Journey, Perception & Technology; Violin, Absolute Pitch
129:44 Incentives & Neuroplasticity; Technology & Performance
133:59 Acoustic Arms Race: Moths, Bats & Echolocation
141:17 Singing to Spiders, Spider Web & Environment Detection; Crickets; Marmosets
151:44 Acknowledgements
153:18 Zero-Cost Support, YouTube, Spotify & Apple Follow, Reviews & Feedback, Sponsors, Protocols Book, Social Media, Neural Network Newsletter

Whisper Transcript

00:00:00.360 | - Welcome to the Huberman Lab Podcast,
00:00:02.320 | where we discuss science
00:00:03.760 | and science-based tools for everyday life.
00:00:05.960 | I'm Andrew Huberman,
00:00:10.440 | and I'm a professor of neurobiology and ophthalmology
00:00:13.640 | at Stanford School of Medicine.
00:00:15.440 | My guest today is Dr. Poppy Crum.
00:00:17.760 | Dr. Poppy Crum is a neuroscientist,
00:00:19.760 | a professor at Stanford,
00:00:21.000 | and the former chief scientist at Dolby Laboratories.
00:00:24.160 | Her work focuses on how technology
00:00:25.920 | can accelerate neuroplasticity in learning
00:00:27.960 | and generally enrich our life experience.
00:00:30.200 | You've no doubt heard about,
00:00:31.320 | and perhaps use, wearables and sleep technologies
00:00:34.160 | that can monitor your sleep,
00:00:35.260 | tell you how much slow-wave sleep you're getting,
00:00:36.840 | how much REM sleep,
00:00:37.720 | and technologies that can control the temperature
00:00:40.020 | of your sleep environment and your room environment.
00:00:42.640 | Well, you can soon expect wearables
00:00:44.800 | and hearable technologies to be part of your life.
00:00:47.520 | Hearable technologies are, as the name suggests,
00:00:49.840 | technologies that can hear your voice
00:00:51.420 | and the voice of other people
00:00:52.880 | and deduce what is going to be best
00:00:54.700 | for your immediate health and your states of mind.
00:00:57.440 | Believe it or not,
00:00:58.280 | these technologies will understand your brain states,
00:01:00.240 | your goals,
00:01:01.120 | and they will make changes to your home,
00:01:02.680 | work, and other environments
00:01:04.160 | so that you can focus better,
00:01:05.880 | relax more thoroughly,
00:01:07.040 | and connect with other people on a deeper level.
00:01:09.320 | As Poppy explains,
00:01:10.360 | all of this might seem kind of space age
00:01:12.000 | and maybe even a little aversive or scary now,
00:01:14.640 | but she explains how it will vastly improve life
00:01:16.800 | for both kids and adults,
00:01:18.440 | and indeed increase human-human empathy.
00:01:20.640 | During today's episode,
00:01:21.520 | you'll realize that Poppy is a true,
00:01:23.120 | out-of-the-box thinker and scientist.
00:01:25.320 | She has a really unique story.
00:01:26.760 | She discovered she has perfect pitch at a young age.
00:01:28.960 | She explains what that is
00:01:30.120 | and how that shaped her worldview and her work.
00:01:32.640 | Poppy also graciously built
00:01:34.040 | a zero-cost step-by-step protocol for all of you.
00:01:37.240 | It allows you to build a custom AI tool
00:01:39.680 | to improve at any skill you want
00:01:41.320 | and to build better health protocols and routines.
00:01:43.960 | I should point out that you don't need to know how to program
00:01:46.160 | in order to use this tool that she's built.
00:01:48.080 | Anyone can use it,
00:01:49.000 | and as you'll see,
00:01:49.840 | it's extremely useful.
00:01:51.000 | We provide a link to it in the show note captions.
00:01:53.320 | Today's conversation is unlike any
00:01:54.800 | that we've previously had on the podcast.
00:01:57.080 | It's a true glimpse into the future,
00:01:58.640 | and it also points you to new tools
00:02:00.800 | that you can use now to improve your life.
00:02:03.480 | Before we begin,
00:02:04.320 | I'd like to emphasize that this podcast is separate
00:02:06.440 | from my teaching and research roles at Stanford.
00:02:08.680 | It is, however, part of my desire and effort
00:02:10.720 | to bring zero-cost-to-consumer information about science
00:02:13.240 | and science-related tools to the general public.
00:02:15.840 | In keeping with that theme,
00:02:16.960 | today's episode does include sponsors.
00:02:19.320 | And now for my conversation with Dr. Poppy Crum.
00:02:22.400 | Dr. Poppy Crum, welcome.
00:02:24.560 | - Thanks, Andy.
00:02:25.400 | It's great to be here.
00:02:26.240 | - Great to see you again.
00:02:27.680 | We should let people know now,
00:02:28.760 | we were graduate students together,
00:02:30.680 | but that's not why you're here.
00:02:31.760 | You're here because you do incredibly original work.
00:02:34.720 | You've worked in so many different domains
00:02:36.240 | of technology, neuroscience, et cetera.
00:02:39.080 | Today, I wanna talk about a lot of things,
00:02:41.040 | but I wanna start off by talking about neuroplasticity,
00:02:43.800 | this incredible ability of our nervous systems
00:02:45.920 | to change in response to experience.
00:02:48.440 | I know how I think about neuroplasticity,
00:02:50.800 | but I wanna know how you think about neuroplasticity.
00:02:52.880 | In particular, I wanna know,
00:02:55.320 | do you think our brains are much more plastic
00:02:57.320 | than most of us believe?
00:02:58.960 | Like, can we change much more than we think,
00:03:01.080 | and we just haven't accessed the ways to do that?
00:03:03.640 | Or do you think that our brains are pretty fixed,
00:03:06.760 | and in order to make progress as a species,
00:03:09.240 | we're gonna have to, I don't know, create robots or something
00:03:11.640 | to do the work that we're not able to do
00:03:13.760 | because our brains are fixed.
00:03:15.120 | But let's start off by just getting your take
00:03:18.440 | on what neuroplasticity is
00:03:21.040 | and what you think the limits on it are.
00:03:23.520 | I do think we're much more plastic than we talk about
00:03:28.120 | or we realize in our daily lives.
00:03:30.240 | And just to your point about creating robots,
00:03:32.320 | the more we create robots,
00:03:33.480 | there's neuroplasticity that comes with using robots as humans
00:03:37.720 | when we use them in partnerships
00:03:39.520 | or as tools to accelerate our capabilities.
00:03:43.240 | So neuroplasticity, the way that the,
00:03:45.840 | where I resonate with it a lot is trying to understand,
00:03:50.800 | and this is what I've done a lot of in my career,
00:03:53.640 | is thinking about building and developing technologies,
00:03:58.120 | both with an understanding of how they shape our brain.
00:04:01.280 | Everything we engage with in our daily lives,
00:04:04.480 | whether it's the statistics of our environments
00:04:07.200 | and our context or the technologies we use on a daily basis,
00:04:11.640 | is shaping our brains, through neuroplasticity.
00:04:16.120 | Some more than others, some we know as we age,
00:04:18.720 | are very dependent on how attentive and engaged we are,
00:04:21.560 | as opposed to passively just consuming and changing.
00:04:26.800 | But we are in a place where everyone, I believe,
00:04:31.280 | needs to be thinking more about how the technologies they're using,
00:04:34.760 | especially in the age of AI and immersive technologies,
00:04:38.280 | how they are shaping or architecting our brains as we move forward.
00:04:44.040 | You go to any Neuroscience 101 medical school textbook
00:04:47.600 | and there's something, you'll see a few pages
00:04:49.320 | on something called the homunculus.
00:04:51.120 | Now, what is the homunculus?
00:04:53.040 | It's a data representation,
00:04:54.680 | but it'll be this sort of funny-looking creature when you see it.
00:04:57.800 | But that picture of this sort of distorted human that you're looking at
00:05:02.560 | is really just a data representation of how many cells in your brain
00:05:08.720 | are helping, are coding and representing information for your sense of touch, right?
00:05:17.200 | And that image, though, and this is where things get kind of funny,
00:05:20.960 | that image comes from Wilder Penfield back in the '40s,
00:05:24.520 | because he recorded the somatosensory cells of patients just before they were to have surgery for epilepsy and such.
00:05:34.080 | And since we don't have pain receptors in our cortex, he could have this awake human and be able to touch different parts of their brain and ask them, you know, to report what sensation they felt on their bodies.
00:05:46.200 | And so he mapped that part of their cortex, and then that's how we ended up with the homunculus.
00:05:51.760 | And you'll see, you know, it'll have bigger lips, it'll have, you know, smaller parts of your back and the areas where you just don't have the same sensitivities.
00:06:01.320 | Well, fast forward to today.
00:06:03.320 | When you look at that homunculus, one of the things I always will ask people to think about is, you know, what's wrong with this image?
00:06:08.880 | You know, this is an image from 1940 that is still in every textbook.
00:06:12.880 | And, you know, any Stanford student will look at it and they'll immediately say, well, the thumb should be bigger because we do this all day long.
00:06:21.440 | And I've got more sensitivity in my fingers because I'm always typing on my mobile device, which is absolutely true.
00:06:27.560 | Or maybe they'll say something like, well, the ankles are the same size and we drive cars now a lot more than we did in the '40s.
00:06:35.440 | Or maybe if I live in a different part of the world, I drive on one side versus the other.
00:06:40.120 | And in a few years, you know, we probably won't be driving and those resources get optimized elsewhere.
00:06:45.960 | So what the homunculus is, is it's a representation of how our brain has allocated resources to help us be successful.
00:06:54.520 | And those resources are the limited cells we have that support whatever we need to flourish in our world.
00:07:02.600 | And the beauty of that is when you develop expertise, you develop more support, more resources,
00:07:09.840 | go to helping you do that thing, but they also get more specific.
00:07:14.800 | They develop more specificity so that, you know, I might have suddenly a lot more cells in my brain devoted to helping me.
00:07:22.720 | Say, you know, I'm a violinist and my, well, my left hand, my right hemisphere, my somatosensory cortex,
00:07:29.120 | I'm going to have a lot more cells that are helping me, you know, feel my fingers and the tips of everything so that I can, you know, be fluid and more virtuosic.
00:07:40.080 | But that means I have more cells, but they're more specified.
00:07:44.160 | They're giving me more sensitivity.
00:07:45.920 | They're giving me more data that's differentiated.
00:07:48.880 | And that's what my brain needs and that's what my brain's responding to.
00:07:50.880 | And so when we think about that, you know, my practice as a musician versus my practice playing video games,
00:07:59.840 | all of these things influence our brain and influence our plasticity.
00:08:04.800 | Now, where things get kind of interesting to me and sort of my obsession on that side is,
00:08:10.800 | every time we engage with a technology, it's going to shape our brain, right?
00:08:14.800 | It's both, you know, our environments, but our environments are changing.
00:08:20.800 | Those are shaping who we are.
00:08:22.160 | You know, I think you can look at people's hearing thresholds and predict what city they live in.
00:08:27.360 | Really?
00:08:28.080 | Absolutely.
00:08:29.600 | Can you just briefly explain hearing thresholds and why that would be?
00:08:33.680 | I mean, I was visiting the city of Chicago a couple of years ago.
00:08:36.560 | Beautiful city, amazing food, love the people, very loud city, wide downtown streets,
00:08:44.000 | not a ton of trees compared to what I'm used to.
00:08:46.960 | And I was like, wow, it's really loud here.
00:08:50.240 | And I grew up in the suburbs, got out as quickly as I could.
00:08:53.280 | Yeah.
00:08:53.520 | I don't like the suburbs, sorry.
00:08:55.120 | Suburb dwellers, not for me.
00:08:56.480 | I like the wilderness and I like cities.
00:09:00.000 | But you're telling me that you can actually predict people's hearing thresholds
00:09:06.480 | for loudness simply based on where they were raised or where they currently live.
00:09:11.760 | In part, it can be both, right?
00:09:13.280 | Because cities have sonic imprints, types of noise, things that are very, you know, very loud cities,
00:09:19.840 | but also what's creating that noise, right?
00:09:22.160 | That's often unique though.
00:09:23.520 | The inputs, the types of vehicles, the density of people, and even the construction in
00:09:31.920 | those environments, it is changing what noise exists.
00:09:35.040 | That's shaping, you know, people's hearing thresholds.
00:09:39.120 | At the lowest level, it's also shaping their sensitivities.
00:09:41.920 | If you're used to hearing, you know, certain animals in your environment and they come with,
00:09:45.840 | you know, you should be heightened to a certain response in that,
00:09:50.320 | you're going to develop increased sensitivity to that, right?
00:09:54.000 | Whereas if it's really abnormal, you know, I hear chickens.
00:09:57.360 | I have a neighbor who has chickens in the city.
00:09:59.440 | Roosters too?
00:10:00.320 | Yes, yes.
00:10:01.200 | I grew up near a rooster.
00:10:02.800 | In San Francisco, there you go.
00:10:03.360 | I can still hear that rooster.
00:10:06.080 | Yeah.
00:10:06.720 | Those sounds are embedded deeply in my mind.
00:10:09.680 | There's the semantic context and then just the sort of spectrum, right?
00:10:13.520 | And the intensity of that spectrum and meaning when I say spectrum,
00:10:16.720 | I mean the different frequency amplitudes and what that shaping's like.
00:10:20.560 | High pitch, low pitch.
00:10:21.520 | Yeah, yeah.
00:10:22.480 | And that affects how your neural system is changing even at the lowest level of what,
00:10:28.400 | you know, what it's your ears, your brain, your cochlea is getting exposed to.
00:10:34.960 | But then also where, you know, so that would be the lower level, you know, what sort of noise
00:10:42.640 | damage might exist, what exposures.
00:10:44.720 | But then also then there's the amplification of, you know, coming from your higher level
00:10:49.680 | areas that are helping you know that these are frequencies that are more important in your
00:10:54.400 | context and your environment.
00:10:56.240 | There is a funny, like this is kind of funny.
00:10:58.560 | There was a film called, I think it's The Sound of Silence.
00:11:01.520 | And it starred, I love Peter Sarsgaard.
00:11:03.120 | He was one of the actors in it.
00:11:05.200 | And it was sort of meant to be a bit fantastical.
00:11:08.560 | Or is that a word?
00:11:09.440 | Is that the right word?
00:11:11.920 | But in fact, to me, so the filmmakers had talked to me a lot to inform this sort of main
00:11:20.800 | character in the way he behaved because I have absolute pitch.
00:11:23.120 | And there were certain things that they were trying to emulate in this film.
00:11:27.680 | He ends up being this person who tunes people's lives.
00:11:31.040 | He'll walk into their environments and be like, oh, you know, things are going badly at work or your
00:11:35.920 | relationships because you're, you know, you've got this tritone.
00:11:39.040 | You're, or, you know, your water heater is making this, you know, pitch and your teapot is at this.
00:11:45.280 | Oh my God, this would go over so well in L.A.
00:11:47.280 | People would pay millions of dollars in Los Angeles.
00:11:49.120 | It's totally funny.
00:11:50.080 | Do you do this for people?
00:11:51.360 | Um, no.
00:11:52.240 | Okay, okay.
00:11:53.120 | But I, I will tell you, I, I will walk into hotel rooms and immediately if I hear something, I'm,
00:11:58.400 | I've moved.
00:11:58.960 | And so I, you know, that is, I.
00:12:00.880 | Because you have perfect pitch.
00:12:01.920 | Could you define perfect pitch?
00:12:03.360 | Does that mean that you can always hit a note perfectly with your voice?
00:12:06.000 | There is no such thing as perfect pitch.
00:12:08.160 | There's absolute pitch.
00:12:09.520 | And so I think only because, uh, the idea of, so like, ah, that would be A equal 440 hertz,
00:12:17.920 | right?
00:12:18.320 | But that's a standard that we use in modern time.
00:12:20.560 | And the, you know, different, what A is has actually changed throughout the, you know,
00:12:26.960 | our lives with aesthetic, with what people liked, with the tools we used to create music.
00:12:31.360 | And, you know, in the Baroque era, A was 415 hertz.
00:12:34.480 | Can you hit that?
00:12:35.440 | Ah, awesome.
00:12:37.440 | Awesome.
00:12:37.920 | And, um, in any case, so that's why it's, it's absolute because, you know, guess what?
00:12:42.800 | As my, uh, basilar membrane gets more rigid as I might age or my temporal processing slows down,
00:12:50.960 | my brain's going to still think I'm in, you know, I'm singing 440 hertz, but it might not be.
00:12:55.360 | The basilar membrane is a portion of the inner ear that, uh, converts sound waves into electrical
00:13:01.200 | signals, right?
00:13:01.920 | Yeah.
00:13:02.160 | Okay.
00:13:02.400 | Fair enough.
00:13:02.880 | Well, I'm talking to an auditory physiologist here.
00:13:05.440 | Mechanically.
00:13:06.080 | Right, right, right.
00:13:06.800 | I teach auditory physiology, but I want to just make sure because I'm sitting across from an expert.
00:13:11.680 | I'd like to take a quick break and acknowledge one of our sponsors, David.
00:13:15.760 | David makes a protein bar unlike any other.
00:13:18.080 | It has 28 grams of protein, only 150 calories and zero grams of sugar.
00:13:23.120 | That's right.
00:13:24.000 | 28 grams of protein and 75% of its calories come from protein.
00:13:28.000 | This is 50% higher than the next closest protein bar.
00:13:30.880 | David protein bars also taste amazing.
00:13:32.560 | Amazing.
00:13:33.280 | Even the texture is amazing.
00:13:34.960 | My favorite bar is the chocolate chip cookie dough.
00:13:37.120 | But then again, I also like the new chocolate peanut butter flavor and the chocolate brownie flavor.
00:13:41.440 | Basically, I like all the flavors a lot.
00:13:43.680 | They're all incredibly delicious.
00:13:44.960 | In fact, the toughest challenge is knowing which ones to eat on which days and how many times per day.
00:13:49.440 | I limit myself to two per day, but I absolutely love them.
00:13:52.880 | With David, I'm able to get 28 grams of protein in the calories of a snack,
00:13:56.640 | which makes it easy to hit my protein goals of one gram of protein per pound of body weight per day,
00:14:01.520 | and it allows me to do so without ingesting too many calories.
00:14:05.280 | I'll eat a David protein bar most afternoons as a snack,
00:14:08.400 | and I always keep one with me when I'm out of the house or traveling.
00:14:11.520 | They're incredibly delicious, and given that they have 28 grams of protein,
00:14:15.360 | they're really satisfying for having just 150 calories.
00:14:18.640 | If you'd like to try David, you can go to davidprotein.com/huberman.
00:14:22.960 | Again, that's davidprotein.com/huberman.
00:14:26.320 | Today's episode is also brought to us by Helix Sleep.
00:14:29.280 | Helix Sleep makes mattresses and pillows that are customized to your unique sleep needs.
00:14:33.920 | Now, I've spoken many times before on this and other podcasts about the fact that getting a great
00:14:37.920 | night's sleep is the foundation of mental health, physical health, and performance.
00:14:41.920 | Now, the mattress you sleep on makes a huge difference in the quality of sleep that you get each night.
00:14:46.000 | How soft it is or how firm it is all play into your comfort and need to be tailored to your unique
00:14:50.880 | sleep needs. If you go to the Helix website, you can take a brief two-minute quiz and it will ask you
00:14:55.600 | questions such as, "Do you sleep on your back, your side, or your stomach? Do you tend to run hot or cold
00:14:59.600 | during the night?" Things of that sort. Maybe you know the answers to those questions, maybe you don't.
00:15:04.320 | Either way, Helix will match you to the ideal mattress for you. For me, that turned out to be the Dusk
00:15:08.800 | mattress. I started sleeping on a Dusk mattress about three and a half years ago, and it's been far and away the
00:15:14.000 | best sleep that I've ever had. If you'd like to try Helix Sleep, you can go to helixsleep.com/huberman,
00:15:20.080 | take that two-minute sleep quiz, and Helix will match you to a mattress that's customized to you.
00:15:24.320 | Right now, Helix is giving up to 27% off all mattress orders. Again, that's helixsleep.com/huberman
00:15:30.880 | to get up to 27% off. Okay, so our brains are customized to our experience, especially our
00:15:39.120 | childhood experience, but also our adult experience. You mentioned the homunculus, this representation of
00:15:46.000 | the body surface, and you said something that I just have to pick up on and ask some questions about,
00:15:50.240 | which is that this hypothetical Stanford student, could be any student anywhere, says,
00:15:56.320 | "Wait. Nowadays, we spend a lot of time writing with our thumbs and thinking as we write with our
00:16:01.680 | thumbs and emoting." Right? I mean, when we text with our thumbs, we're sometimes involved in an emotional
00:16:07.360 | exchange. Yeah.
00:16:08.320 | So my question is this. The last 15 years or so have represented an unprecedented time of new technology
00:16:18.720 | integration, right? I mean, the smartphone, texting. And when I text, I realized that I'm hearing a voice
00:16:29.680 | in my head as I text, which is my voice, because if I'm texting outward, I'm sending a text.
00:16:36.960 | But then I'm also internalizing the voice of the person writing to me, if I know them.
00:16:44.560 | But it's coming through filtered by my brain, right? So it's like, I'm not trying to micro dissect
00:16:52.080 | something here for the sake of micro dissection, but the conversation that we have by text, it's all
00:16:57.440 | happening in our own head. But there are two or more players, group text was too complicated to even
00:17:04.000 | consider right now. But what is that transformation really about? Previously, I would write you a letter,
00:17:11.040 | I would send you a letter. I'd write you an email, I'd send you an email. And so the process was really
00:17:15.760 | slowed. Now you can be in a conversation with somebody that's really fast, back and forth, right?
00:17:20.880 | Some people can type fast, you can email fast, but nothing like what you can do with text, right?
00:17:25.840 | I can even know when you're thinking because it's dot, dot, dot, or you're writing, right? And so
00:17:31.920 | is it possible that we've now allocated an entire region of the homunculus, or some other region of
00:17:39.520 | cortex, brain, to conversation that prior to 2010 or so, the brain just was not involved in conversations of
00:17:50.480 | any sort. In other words, we now have the integration of writing with thumbs, that's new,
00:17:54.480 | hearing our own voice, hearing the hypothetical voice of the other person at the other end,
00:18:01.920 | and doing that all at rapid speed. Are we talking about like a new brain area? Or are we talking about
00:18:09.120 | using old brain areas and just trying to find and push the overlap in the Venn diagram? Because I remember
00:18:15.840 | all of this happening very quickly and very seamlessly. I remember like texting showed up
00:18:20.000 | and it was like, all right, well, it's a little slow, a little clunky. Pretty soon it was autofill.
00:18:25.840 | Pretty soon it was learning us. Now we can do voice recognition. And it's, you know, people pick this
00:18:31.680 | up very fast. So the question is, are we taking old brain areas and combining them in new ways? Or is it
00:18:39.360 | possible that we're actually changing the way that our brain works fundamentally in order to be able to
00:18:44.640 | carry out something as what seems to be nowadays trivial, but as basic to everyday life as texting?
00:18:51.760 | What's going on in our brain? We aren't developing new resources. We've got the same cells that are,
00:18:57.920 | or I mean, there's neurogenesis, of course, but it's how those are getting allocated. And, you know,
00:19:03.680 | just one quick comment from what we said before when we talk about the homunculus. The homunculus is an
00:19:09.600 | example of a map in the brain, a cortical map. And maps are important in the brain because they, you
00:19:14.560 | know, allow cells that need to interact to give us specificity, to make us fast, to have, you know,
00:19:20.400 | tight reaction times and things, you know, because you've got shorter distance and, you know, things
00:19:26.080 | that belong together. And also there's a lot of motility in terms of, you know, what those cells
00:19:30.080 | respond to potentially dependent on our input. So the homunculus might be one map, but there are maps all
00:19:34.800 | over our brain. And those maps still have a lot of cross input. So what you're talking about is,
00:19:40.400 | are you having areas where we didn't use to allocate and differentiate in, you know, the specificity of
00:19:48.160 | what those cells were doing that are now quite related to the different ways my brain is having
00:19:54.240 | to interpret a text message? And the subtlety and the nuance of that, that actually now I get faster at,
00:20:01.120 | I have faster reaction times, I also have faster interpretations. So am I allocating cells that used to
00:20:07.200 | do something else to allow me to have that? Probably. But I'm also building, you know, where, like,
00:20:12.320 | think about me as a multi-sensory object that has, you know, I have to integrate information across
00:20:17.840 | sight, sound, smell to form a holistic, you know, object experience. That same sort of, you know, integration
00:20:25.040 | and pattern is happening now when we communicate in ways that it didn't used to. So what does that mean?
00:20:31.360 | It means there's a lot more repeatability, a lot faster pattern matching, a lot more integration
00:20:36.560 | that is allowing us to go faster. I completely agree. I feel like there's an entire generation
00:20:41.680 | of people who grew up with smartphones for which it's just part of life. I think one of the most
00:20:47.840 | impactful statements I ever heard in this general domain was, I gave a talk down at Santa Clara University
00:20:54.080 | one evening to some students, and I made a comment about putting the phone away and how much easier
00:20:59.760 | it is to focus when you put the phone away and how much better life is when you take space from your
00:21:03.920 | smartphone and all of this kind of thing. And afterwards this young guy came up to me, he's
00:21:09.120 | probably in his early twenties, and he said, "Listen, you don't get it at all." I said, "What do you mean?"
00:21:14.000 | He said, "You adopted this technology into your life and after your brain had developed."
00:21:18.880 | He said when he's speaking for himself, he said, "When my phone runs out of charge, I feel the life
00:21:25.600 | drain out of my body and it is unbearable, or nearly unbearable until that phone pops back on.
00:21:34.640 | And then I feel life returned to my body. And it's because I can communicate with my friends again.
00:21:42.240 | I don't feel alone. I don't feel cut off from the rest of the world." And I was thinking to myself, wow,
00:21:48.000 | like his statements really stuck with me because I realized that his brain, as he was pointing out,
00:21:54.160 | is indeed fundamentally different than mine in terms of social context, communication, feelings of safety,
00:21:59.760 | and on and on. And I don't think he's alone. I think for some people it might not be quite as extreme.
00:22:04.960 | But for many of us to see that dot, dot, dot in the midst of a conversation where we really want the answer
00:22:16.000 | to something or it's an emotionally charged conversation can be a very intense human experience.
00:22:24.320 | That's interesting.
00:22:25.360 | So we've sped up the rate that we transfer information between one another. But even about trivial things,
00:22:31.520 | it doesn't have to be an argument or like, is it stage four cancer or is it benign, right?
00:22:35.680 | Like these are, those are extreme conditions, right? Are they alive? Are they dead? You know, did they find
00:22:40.400 | him or her or did they not? You know, those are extreme cases, but there's the everyday life of,
00:22:46.960 | and I noticed this, like if I go up the coast sometimes or I'll go to Big Sur and I will
00:22:52.800 | intentionally have time away from my phone. It takes about an hour or two or maybe even a half day to
00:22:58.160 | really drop into the local environment where you're not looking for stimulation coming in through the
00:23:03.040 | smartphone. And I don't think I'm unusual in that regard either. So I guess the question is, do you
00:23:08.400 | think that the technology is good, bad, neutral, or are you agnostic as to how the technologies are
00:23:18.320 | shaping our brain?
00:23:19.360 | It goes in lots of different directions. One thing I did want to say though, with what,
00:23:24.320 | with smartphones specifically and sort of everything, you know, in audio, you know, our ability to have,
00:23:30.880 | you know, carry our lifetime of music and content with us has been because of, you know, huge advances in
00:23:39.440 | the last 25, 30 years, and maybe slightly more around compression algorithms that have enabled us to have
00:23:47.600 | really effective what we call perceptual compression, lossy perceptual algorithms and things like MP3 and,
00:23:53.920 | and, you know, my past work with companies like Dolby. But whenever you're talking about what's the goal
00:24:00.160 | of content compression algorithms, it's to translate the entirety of the experience, the entirety of a
00:24:07.360 | signal in, you know, with a lot of the information removed, right? But in intelligent ways. When you look at
00:24:14.240 | the way someone is communicating with acronyms and the shorthand that the next generations use to
00:24:21.360 | communicate, it is such a rich communication, even though they might just say LOL. I mean, it's like,
00:24:27.440 | or they might, you know, it's, it's, it's actually a lossy compression that's triggering a huge cognitive
00:24:34.480 | experience, right? Can you explain lossy for people who might not be familiar with it?
00:24:38.800 | Lossy means that in your encoding and decoding of that information, there is actually information
00:24:44.080 | that's lost when you decode it. But hopefully that information is not impacting the perceptual
00:24:49.040 | experience. Imagine I have, you know, a song and I want to represent that song. I could take out to make my file
00:24:56.720 | smaller, I could take out every other, you know, every 500 milliseconds of that and it would sound
00:25:02.000 | really horrible, right? Or I could be a lot more intelligent and instead basically, you know, if you
00:25:07.360 | look at early models like MP3, they're, they're, they're kind of like computational models of the brain.
00:25:12.560 | They stop, you know, they might stop at like the auditory nerve, but they're trying to put a model of
00:25:18.880 | how our brain would deal with sound, what we would hear, what we wouldn't. If this sound is present and
00:25:24.000 | it's present at the same time as this sound, then this sound wouldn't be heard, but this sound would be.
00:25:29.280 | So we don't need to spend any of our, our bits coding this sound. Instead, we just need to code this one.
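As a sketch of the masking rule she's describing, assuming NumPy, here is a toy coder; the 20 dB masking margin, 16-bin neighborhood, and 96 dB audibility floor are invented numbers, and real codecs like MP3 use tuned psychoacoustic curves rather than a flat rule like this. It keeps a frequency bin only if it's audible and no louder nearby bin masks it, then spends bits only on what's kept.

```python
import numpy as np

def toy_perceptual_compress(signal, margin_db=20.0, neighborhood=16):
    """Toy lossy coder: drop FFT bins that a louder nearby bin masks.

    Crude illustration only -- real codecs like MP3 use tuned
    psychoacoustic masking curves, not a flat margin like this.
    """
    spectrum = np.fft.rfft(signal)
    mags_db = 20 * np.log10(np.abs(spectrum) + 1e-12)
    audible = mags_db > mags_db.max() - 96.0       # crude absolute hearing floor

    keep = np.zeros(len(spectrum), dtype=bool)
    for i in range(len(spectrum)):
        lo, hi = max(0, i - neighborhood), i + neighborhood + 1
        # Keep a bin only if it's audible and no nearby bin masks it.
        keep[i] = audible[i] and mags_db[i] >= mags_db[lo:hi].max() - margin_db

    coded = np.where(keep, spectrum, 0)            # spend bits only on kept bins
    return np.fft.irfft(coded, n=len(signal)), keep

# One second at 8 kHz: a loud 440 Hz tone plus a very quiet tone at 450 Hz.
rate = 8000
t = np.arange(rate) / rate
signal = np.sin(2 * np.pi * 440 * t) + 0.001 * np.sin(2 * np.pi * 450 * t)
decoded, keep = toy_perceptual_compress(signal)
print(f"kept {int(keep.sum())} of {keep.size} bins")  # the quiet neighbor is masked out
```

The quiet 450 Hz tone is "present at the same time" as the much louder 440 Hz tone, so the sketch spends no bits on it, yet the reconstructed signal sounds essentially the same.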
00:25:34.560 | And so it becomes an intelligent way for the model and the algorithm of deciding what information needs
00:25:39.680 | to be represented and what doesn't to create the same, you know, the best perceptual experience,
00:25:46.480 | which, "perceptual" meaning what we get to, you know, take home. I think one of the things that's
00:25:51.760 | important then, why I think, whenever I used to have to teach some of, you know, what it means to
00:25:57.920 | represent a rich experience with minimal data, you think with minimal information, some of the acronyms
00:26:06.640 | that exist in, in like mobile texting, they've taken on a very rich internal life.
00:26:12.080 | L-O-L, O-N-G.
00:26:14.000 | Yeah. Well, those are simplistic ones, but I think people can have communication now that we
00:26:17.920 | can't understand entirely.
00:26:19.280 | Is it because you have a 10-year-old daughter?
00:26:22.000 | Does she have communication by acronym that to you is cryptic?
00:26:25.120 | Sometimes, but I have to figure it out then, but yes. But the point is, that is an example of a lossy
00:26:31.040 | compression algorithm that actually has a much richer perceptual experience, right? And it often
00:26:36.960 | needs context, but it's still, you know, you're using few bits of information to try to represent a much
00:26:43.920 | richer feeling in a much richer state, right? And, you know, if you look at different people, they're going
00:26:49.520 | to have, you know, bigger physiological experience dependent on, you know, how they've grown up with
00:26:55.120 | that kind of context.
00:26:55.440 | I don't want to project here, but it sounds to me like you see the great opportunity of data
00:27:05.440 | compression. Like, let's just stay with the use of acronyms in texting. That's a vast data compression
00:27:11.200 | compared to the kind of speech and direct exchange that people engaged in 30 years ago. So there's less
00:27:19.200 | data being exchanged, but the experience is just as rich, if not more rich, is what you're saying, which
00:27:25.280 | implies to me that you look at it as generally neutral to benevolent. Like, it's good.
00:27:31.840 | It's just different.
00:27:32.560 | I'm coming up on 50 in a couple months. And as opposed to somebody saying, well, you know, when I was younger,
00:27:37.680 | we'd write our boyfriend or girlfriend a letter. You know, I would, I would actually write out a birthday card.
00:27:44.960 | I would go, you'd have a face-to-face conversation and you've got this younger generation that are
00:27:51.280 | saying, yeah, whatever. You know, this is like what we heard about. I used to trudge to school in the
00:27:56.640 | snow kind of thing. It's like, well, we have heated school buses now and we've got, you know, driverless
00:28:01.760 | cars. So I think this is important and useful for people of all ages to hear that the richness of an
00:28:10.880 | experience can be maintained, even though there are data or some elements of the exchange are being
00:28:17.280 | completely removed. Absolutely. But it's maintained because of the neural connections that are built in
00:28:22.960 | those individuals, right? And that generation. I always think of, okay, and the nervous system likes to code
00:28:28.560 | along a continuum, but like yum, yuck or meh. Like, do you think that, that a technology is kind of neutral? Like,
00:28:35.200 | yeah, you lose some things, you gain some things. Or you think like, this is bad. These days we hear
00:28:39.680 | a lot of AI fear. We'll talk about that. Or you hear also people who are super excited about what AI
00:28:46.160 | can do, what smartphones can do. I mean, some people like my sister and her daughter love smartphones
00:28:53.040 | because they can communicate. It gives a feeling of safety at a distance. Like quick communications are
00:28:57.440 | easier. It's hard to sit down and write, write a letter. She's going off to college soon. So the question
00:29:02.560 | is like, how often will you be in touch? It raises expectations about frequency of contact,
00:29:07.040 | but it reduces expectations of depth.
00:29:11.680 | about you this morning. And that can feel like a lot. But a letter, if I sent a letter home, you know,
00:29:16.560 | during college to my own, like, hey, I was thinking about you this morning. Love, Andrew.
00:29:20.160 | And be like, okay. Like, I don't know how that would be. They'd be like, well, that didn't take long.
00:29:25.040 | Right? So I think that there's a, it's a seesaw, you know? You can get more frequency
00:29:30.320 | and then it comes with different levels of, you know, expectation on those. My daughter's at camp
00:29:35.360 | right now and we were only allowed to write letters for two weeks. Handwritten letters.
00:29:39.440 | Handwritten letters. How did that go over? That is, that it's happening. I mean,
00:29:42.320 | we'd lost our home in a flood years ago. And one of the only things I saved out of the flood,
00:29:48.800 | which is this, and I just brought these back because I got them from my brother, is the,
00:29:54.640 | there, this communication between one of my ancestors, you know, during the Civil War,
00:29:58.880 | like they were courting. And that was all saved. These letters back and forth between the women.
00:30:04.800 | And, you know, and it's, you know, with these, it's like 1865.
00:30:08.400 | Do you have those letters?
00:30:09.680 | I do. I do. I had them in my computer back until I flew up here.
00:30:13.600 | And, but, you know, they were on parchment. And even though they went through a flood, they,
00:30:18.400 | they, you know, they didn't run. And it's this very different era of communication. And it's wonderful
00:30:25.360 | to have that preserved because that doesn't translate right through and without that history.
00:30:31.760 | In any case, I am a huge advocate for integration of technology, but it's, for me,
00:30:38.080 | the world is data. And, and I, I do think that way. It's, you know, and, and I, I look at what,
00:30:44.160 | the way my daughter behaves, I'm like, okay, well, what data is coming in? And why did she,
00:30:48.720 | you know, respond that way? And, you know, there's this, an example I, I can give, but, you know,
00:30:53.120 | you think we were talking about neuroplasticity. It's like, we are the creatures of sort of three things.
00:30:58.160 | One is, you know, our sensory systems, how they've evolved and be it by, you know, the intrinsic noise
00:31:05.600 | that is, you know, cause degrading our sensory receptors or the external strain, you know, I,
00:31:11.440 | my brain is going to have access to about the same amount of information as someone with hearing loss,
00:31:16.880 | if I'm in a very noisy environment. And so suddenly you've induced, you know, you've compromised the data
00:31:22.000 | I have access to. And then also our sort of experientially established priors, right?
00:31:27.440 | Our priors being, if you think about the brain as sort of a Bayesian model,
00:31:30.560 | things aren't always deterministic for us, like they are for some creatures. Our brain's having
00:31:35.840 | to take data and make decisions about it and respond to it.
00:31:38.720 | Which is Bayesian. We should just explain it for people.
00:31:40.640 | Deterministic would be input A leads to output B.
00:31:44.080 | Yeah.
00:31:44.720 | Bayesian is, it depends on the statistics of what's happening externally and internally.
00:31:49.040 | Yeah.
00:31:49.280 | These are probabilistic models.
00:31:50.800 | Like there's a likelihood of A becoming B, or there's a likelihood of A driving B,
00:31:56.720 | but there's also a probability that A will drive C, D, or F.
00:32:00.160 | Absolutely. And, you know, and we should get into, I mean, some of the things that make us the most
00:32:04.080 | most effective in our environments and just in interacting in the world is how fast and effective
00:32:10.240 | we are with dealing with those probabilistic, you know, situations. Those things where your brain,
00:32:15.360 | it's like probabilistic inference is a great indicator of success in an environment. And,
00:32:21.760 | yeah, be it a work environment, be it just, you know, walking down the street. And that's how do we deal
00:32:27.920 | with this like data that doesn't just tell us we have to go right or left, but there's a lot of
00:32:32.800 | different inputs. And it's our sort of situational intelligence in the world. And there, you know,
00:32:36.880 | we can break that down into a lot of different ways. In any case, we are the products of our,
00:32:41.440 | you know, our sensory systems, our priors, which are the statistics that and data we've had up until
00:32:48.320 | that moment that our brain's using to weight how it's going to behave and the decisions it makes,
00:32:52.720 | but also then our expectations, the context of that, you know, that have shaped where we are.
00:32:57.200 | And so there's this funny story, like my daughter, when she was two and a half, we're in the planetarium
00:33:01.120 | at the Smithsonian. And we're watching, I think, one typical film you might watch in a planetarium.
00:33:05.840 | We started in LA, zoom out on our way to the sun, and we passed that sort of, you know,
00:33:10.480 | quintessential NASA image of the earth. And it's totally dark and silent. And my daughter,
00:33:15.840 | as loud as she possibly could yells, "Minions." And I'm like, "What's going on?"
00:33:22.640 | And I'm like, "Oh, yes, of course. Her experiencedly established prior of that image is coming from
00:33:28.640 | the Universal logo." And, you know, she never, you know, that says "Universal."
00:33:33.200 | Yeah, no, I love it.
00:33:34.240 | It was totally valid. But it was this very, you know, honest and true part of what it is to be human.
00:33:42.480 | Like each of us is experiencing very different, you know, having very different experiences of
00:33:48.000 | the same physical information. And we need to recognize that. But it is driven by our exposures
00:33:54.880 | and our priors and our sensory systems. It's sort of that trifecta. And our expectations of the moment. And
00:34:00.800 | once you unpack that, you really start to understand and appreciate the influence of technology. Now,
00:34:07.440 | I am a huge advocate for technology, improving us as humans, but also improving the data we have to
00:34:14.480 | make better decisions and the sort of insights that drive us. At the same time, I think sometimes we're
00:34:21.600 | pennywise pound-foolish with how we use technology. And the quick things that make us faster can also make
00:34:28.080 | us dumber and take away our cognitive capabilities. And, you know, where you'll end up with those that
00:34:34.800 | are using the technologies might be to, you know, to write papers all the time or maybe, well, and we can
00:34:41.920 | talk about that more, are putting themselves in a place where they are going to be compromised trying to do
00:34:47.920 | anything without that technology. And also in terms of their learning of that data, that information.
00:34:53.680 | And so you start even ending up with bigger differentiations and cognitive capabilities
00:34:59.120 | by whether, how you use a tool, a technology tool to make you better or faster or not.
00:35:04.720 | One of my sort of things I've always done is teach at Stanford. That's, we also have that in common.
00:35:10.960 | I need to sit in on one of your lectures.
00:35:14.080 | Yeah. But my class there has been, is called Neuroplasticity and Video Gaming. And I'm a
00:35:19.280 | neurophysiologist, but I'm really a technologist. I like building. I like, you know, innovation across
00:35:24.160 | many domains. And while that class says video gaming, it's really more, well, video games are powerful in
00:35:31.760 | the sense that there's this sort of closed loop environment. You give feedback, you get data on your
00:35:36.160 | performance, but you get to control that and know what you randomize, how you build. And what our aim is in
00:35:41.600 | that class is to build technology and games with an understanding of the neural circuits you're
00:35:47.600 | impacting and how you want to, what you want to train. I'll have students that are musicians.
00:35:53.920 | I'll have students that are computer scientists. I'll have students that are, you know, some of
00:35:58.480 | Sanford's top athletes. I've had a number of their top athletes go through my course. And it's always
00:36:04.720 | focused on, okay, there's some aspect of human performance I want to dissect and I want to really
00:36:09.520 | amplify the sensitivity or the access to that type of learning in a closed loop way. Just for anyone that
00:36:17.040 | isn't familiar with the role or the history of gaming in the neuroscience space, you know, there's been some
00:36:23.360 | great papers in the past. Take a gamer versus a non-gamer, just to start with someone self-identified.
00:36:29.120 | A typical gamer actually has what we would call more sensitive, and this is your domain so you can
00:36:35.600 | counter me on this anytime, but, you know, contrast sensitivity functions. And like a contrast sensitivity
00:36:40.640 | function is your ability to see edges and differentiation in a visual landscape, okay?
00:36:49.120 | They can see faster and, you know, more, they're more sensitive to that sort of differentiation.
00:36:56.720 | So then someone who says I'm not a video game player or self-identifies that way.
00:37:02.880 | Because they've trained it, right? They've trained it.
00:37:05.120 | Like a first-person shooter game, which I've played occasionally in an arcade or something like that.
00:37:09.200 | I didn't play a lot of video games growing up. I don't these days either. But yeah, a lot of it is
00:37:16.640 | based on contrast sensitivity, knowing is that a friend or foe? Are you supposed to shoot them or
00:37:21.440 | not? You have to make these decisions very fast, like right on the threshold of what you would call
00:37:27.440 | like reflexive, like no thinking involved, but it's just rapid iteration and decision-making.
00:37:33.760 | And then the rules will switch, right? Like suddenly you're supposed to turn other things into targets
00:37:40.640 | and other things into friends. Well, you're spot on because then you take someone who that self-identified
00:37:46.080 | non-gamer, make them play 40 hours of Call of Duty. And now their contrast sensitivity looks like a video
00:37:52.320 | game player and it persists, you know, go back, measure them a year later. But you know, 40 hours
00:37:57.200 | of playing Call of Duty and I see the world differently, not just in my video game. I actually have
00:38:01.680 | foundational shifts in how I experience the world that give me greater sensitivity to my situational
00:38:06.880 | awareness, my situational intelligence. In real life.
00:38:08.720 | Yeah. Yeah. Yeah. Yeah. Because that's a low level processing capability. I love intersecting
00:38:13.360 | those when you can. But what's even, I think, more interesting is you also, and these were some,
00:38:19.040 | this was a great study by Alexandre Pouget and Daphne Bavelier, where it's not just the contrast sensitivity. It's,
00:38:27.760 | let's go to that next level where we were talking about Bayesian, like probabilistic decisions where
00:38:31.840 | things aren't deterministic. And for a video game player, and I can train this,
00:38:37.120 | they're going to make the same decisions as a non-video game player in those, you know, probabilistic
00:38:44.240 | environments, inferential situations. But they're going to do it a lot faster. And so that edge,
00:38:50.960 | that ability to get access to that information is phenomenal, I think. And when you can tap into
00:38:57.360 | that, that becomes a very powerful thing. So like probabilistic inference goes up when I've,
00:39:01.280 | you know, played 40 hours of Call of Duty. But then what I like to do is take it and say, okay, here's,
00:39:07.520 | you know, a training environment. You know, I had a couple of Stanford's top soccer players on my,
00:39:13.440 | in my course this year. And we got, our focus was, okay,
00:39:18.160 | what data do you not have? And how can we build a closed loop environment and make it something so
00:39:24.560 | that you're gaining better neurological access to your performance based on data like my acceleration,
00:39:32.320 | my velocity, not at the end of my, you know, two hour practice, but in real time and getting auditory
00:39:38.480 | feedback so that I am actually tapping into more neural training. So we had sensors, you know, like on,
00:39:45.760 | on their calves that were measuring acceleration velocity and give, able to give us feedback in
00:39:53.120 | real time as they were doing, you know, a sort of somewhat gamified training. I don't want to use
00:39:58.960 | a gamified, it's overused, but let's say it felt like fun environment, but it's also based on computation
00:40:07.200 | of that acceleration data and what their targets were. It's feeding them different sonic cues so that they're
00:40:12.240 | building, they're building that resolution. When I say resolution, what I mean is, especially as a
00:40:20.080 | novice, I can't tell the difference between whether I've accelerated successfully or not, but if you give
00:40:25.200 | me more gradation in the feedback that I get with that sort of, that closed loop behavior, I start to,
00:40:31.280 | my neural representation of that is going to start differentiating more. So with that, that's where the
00:40:37.680 | auditory feedback. So they're getting that in real time and we, you build that kind of closed loop
00:40:42.560 | environment that helps build that, you know, create greater resolution in the brain and greater
00:40:49.200 | sensitivity to differentiation.
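A rough sketch of that kind of closed loop: the sensor stub, the target acceleration, and the pitch mapping here are all invented for illustration, not the system her students built; a real setup would read the calf-mounted IMU and synthesize the tone with near-zero latency.

```python
import random
import time

TARGET_ACCEL = 3.5            # m/s^2 -- invented target for the drill
BASE_HZ, SPAN_HZ = 440.0, 220.0

def read_accel():
    """Stand-in for a real sensor read; a calf-mounted IMU would go here."""
    return random.uniform(1.5, 5.0)

def feedback_pitch(accel):
    """Map distance-from-target onto pitch: on target = a steady 440 Hz,
    under/over target slides the tone down/up. Finer gradations in this
    mapping give the learner the 'resolution' to differentiate attempts."""
    error = max(-1.0, min(1.0, (accel - TARGET_ACCEL) / TARGET_ACCEL))
    return BASE_HZ + SPAN_HZ * error

for _ in range(10):           # one feedback tick per sensor sample
    a = read_accel()
    print(f"accel={a:4.2f} m/s^2 -> cue at {feedback_pitch(a):5.1f} Hz")
    time.sleep(0.05)          # tight loop; actual audio synthesis omitted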
00:40:55.440 | I'd love for you to share the story about your daughter improving her swimming stroke, right? Because she's not a D1 athlete yet, maybe she will be someday, but she's a swimmer,
00:41:00.720 | right? Yes. And in the past, if you wanted to get better at swimming, you needed a swimming coach.
00:41:06.080 | And if you wanted to get really good at swimming, you'd have to find a really good swimming coach and
00:41:10.000 | you'd have to work with them repeatedly. You took a slightly different direction that really points
00:41:14.000 | to just how beneficial and inexpensive this technology can potentially be or relatively inexpensive.
00:41:19.760 | Well, first I'll say this. Number one is having good swimming coaches.
00:41:22.560 | Okay. Sure. I'm not trying to do away with swimming coaches. Parents who are data centric and really
00:41:29.680 | like building technologies are sometimes maybe can be red herring distractions, but hopefully not.
00:41:35.120 | Okay. All right. Well, yes. That's one of them. Let's keep the swimming coaches happy.
00:41:40.240 | Yeah. So for example, like you go and train with elite athletes and if you go to a lot of swimming
00:41:46.000 | camps or training programs, it's always about work with cameras, and they're recording
00:41:54.320 | you, they're assessing your strokes. But the point is, you can use, and I did this, knowing the things
00:42:02.640 | that the coaches, or frankly, you can go online and learn some of those things that matter to different
00:42:09.920 | strokes. You can use, you know, use Perplexity Labs, use Replit, use some of these. These are online resources?
00:42:17.680 | Yeah. Yeah. And you can build, quickly build a computer vision app that is giving you data analytics
00:42:24.080 | on your strokes and in real time. So how does that work? You're taking the phone underwater, analyzing the stroke?
00:42:29.760 | In this case, I'm using mobile phones, so I'm doing everything above, you know.
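A minimal sketch of that kind of above-water, phone-video stroke-analytics app, assuming the OpenCV and MediaPipe libraries and a hypothetical clip filename; the cadence heuristic here, counting wrist-height crossings of the mean, is a simplification for illustration rather than the app she actually built.

```python
import cv2                      # assumes: pip install opencv-python mediapipe
import mediapipe as mp

mp_pose = mp.solutions.pose

def stroke_cadence(video_path: str) -> float:
    """Estimate strokes per minute from above-water phone video.

    Heuristic sketch: track the right wrist with MediaPipe Pose and count
    each time its height crosses the clip's mean wrist height on the way
    up -- roughly once per stroke cycle. A real app would smooth the
    trace, handle occlusion, track both arms, and layer on roll, entry
    point, and velocity metrics.
    """
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    ys = []
    with mp_pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:
                wrist = result.pose_landmarks.landmark[mp_pose.PoseLandmark.RIGHT_WRIST]
                ys.append(wrist.y)           # normalized image coords; smaller = higher
    cap.release()
    if len(ys) < 3:
        return 0.0
    mean_y = sum(ys) / len(ys)
    upward_crossings = sum(1 for a, b in zip(ys, ys[1:]) if a > mean_y >= b)
    return 60.0 * upward_crossings / (len(ys) / fps)

print(stroke_cadence("freestyle_clip.mp4"))  # hypothetical clip filename
```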
00:42:33.280 | Okay. So you're, you're filming, if you could walk us through this. So you, you film your daughter
00:42:37.040 | doing freestyle stroke for, you know. Right, right. Or breaststroke or butterfly.
00:42:40.480 | Sure. There's a lot of core things that, you know, maybe you want to care about. Backstroke
00:42:43.920 | and freestyle. What's their, you know, and I am not, I was, we used to run, like I know you're a good
00:42:49.200 | runner, but I'm a runner, I'm a rock climber, less a swimmer. But, you know, things like the roll or how
00:42:56.240 | high they're coming above the water. What's your, you know, what, what's your velocity on a, you know,
00:43:00.320 | you can get actually very sophisticated once you have the data, right? And, you know, what's your velocity on
00:43:05.840 | entrance? How much, you know, where, how far in front of your, your head is your arm coming in?
00:43:10.880 | How, you know, what is, maybe there's, again, maybe there are things that you, you know, are obvious,
00:43:18.880 | which is you want to know, you know, how consistent are your strokes and your cadence across, you know,
00:43:24.720 | the pool. So you don't just have your speed, you suddenly have access to what I would call,
00:43:31.040 | and you'll hear me use this a lot, better resolution, but also a lot more analytics
00:43:36.080 | that can give you insight. Now, important thing here is, you know, my 10 year old is not going to,
00:43:42.160 | I'm not going to go tell my 10 year old that she needs to change her, her velocity on this
00:43:46.560 | header stroke, but it gives me information that I can at least understand and help her know how
00:43:54.080 | something is going and how consistent she is on certain things that her coaches have told her to
00:43:58.480 | do. You know, and, and what I love about the idea is, look, this isn't just for the ease of getting
00:44:06.560 | access to the type of data and information that would previously, and I mean, I do code in, in a lot of
00:44:13.600 | areas, but you don't have to do that anymore to build these apps. In fact, you shouldn't, you should leverage,
00:44:17.920 | you know, AI for development of these types of tools. You tell AI to write code so that it would
00:44:23.680 | analyze, you know, trajectory jumping into the pool, how that could be improved if the goal is to swim
00:44:29.280 | faster. You'd use AI to build an app that would allow you to do that so that you would have then
00:44:33.680 | access to that, whatever the data is that you want to do. Yeah. So in that case, you're trying to do better
00:44:39.040 | stroke analytics and, and understand things as you move forward. Um, you could do the same thing for
00:44:44.400 | running, for gait, for, uh, you could do, you know, in a work environment, you can understand a lot more
00:44:50.160 | about where vulnerabilities are, where weaknesses are. There are sort of two different places where I see
00:44:54.720 | this type of, um, AI acceleration and tool building really having major impact. It's on sort of democratizing
00:45:00.960 | data analytics and information that would normally be reserved for the elite to everyone that's really
00:45:07.200 | engaged. And that has a huge impact on improving performance because that kind of data is really,
00:45:13.120 | you know, useful in understanding, um, learning. It also has applications for, you know, when you're
00:45:19.760 | in a work environment and you're trying to better understand, um, success in that environment, in, in some
00:45:26.080 | process or skill of, you know, what you're doing, um, you, you can gain different analytics than you otherwise
00:45:32.880 | would in ways that are become much more, uh, successful, but also give you, uh, new data to
00:45:40.960 | think about with regard to what I would call a digital twin. And when I use the word digital twin, the goal of the digital twin is not to digitize and represent a physical system in its entirety. It's to use different interoperable data sets — meaning data coming from different sources — digitized data of a physical system or a physical environment, be it a hospital, be it airplanes, be it my body, be it my fish tank, to give me insights that are continuous and in real time, insights I otherwise wouldn't be able to gain access to.
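To make the kind of app being described concrete, here is a minimal sketch — assuming a poolside phone video and the open-source MediaPipe pose library; the file name, landmark choice, smoothing window, and cadence heuristic are illustrative, not details from the conversation:

```python
import cv2
import mediapipe as mp
import numpy as np

# Track the right wrist's vertical position across a poolside video,
# then estimate stroke cadence from the periodicity of that trace.
mp_pose = mp.solutions.pose
cap = cv2.VideoCapture("freestyle.mp4")        # hypothetical clip of one length
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0

wrist_y = []
with mp_pose.Pose(static_image_mode=False) as pose:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_landmarks:
            lm = result.pose_landmarks.landmark[mp_pose.PoseLandmark.RIGHT_WRIST]
            wrist_y.append(lm.y)               # normalized 0..1, top of frame = 0
cap.release()

y = np.convolve(np.array(wrist_y), np.ones(7) / 7, mode="valid")   # smooth jitter
crossings = np.where(np.diff(np.sign(y - y.mean())) > 0)[0]         # ~one per stroke cycle
duration_s = len(wrist_y) / fps                # approximate: counts only detected frames
print(f"~{60 * len(crossings) / duration_s:.1f} stroke cycles per minute")
# The spread of intervals between crossings is a rough stand-in for the
# stroke-to-stroke consistency discussed above.
print("cadence variability (frames):",
      np.diff(crossings).std() if len(crossings) > 2 else "n/a")
```

The same wrist trace could feed estimates of roll or hand-entry position; the point is that a phone plus off-the-shelf pose estimation already delivers the "better resolution" described here.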
00:46:16.800 | We've known for a long time that there are things that we can do to improve our sleep.
00:46:21.280 | And that includes things that we can take things like magnesium threonate, theanine, chamomile extract,
00:46:26.800 | and glycine, along with lesser known things like saffron and valerian root. These are all clinically
00:46:32.000 | supported ingredients that can help you fall asleep, stay asleep and wake up feeling more refreshed.
00:46:37.040 | I'm excited to share that our longtime sponsor, AG1, just created a new product called AGZ,
00:46:42.800 | a nightly drink designed to help you get better sleep and have you wake up feeling super refreshed.
00:46:47.280 | Over the past few years, I've worked with the team at AG1 to help create this new AGZ formula.
00:46:52.320 | It has the best sleep supporting compounds in exactly the right ratios in one easy to drink mix.
00:46:57.520 | This removes all the complexity of trying to forage the vast landscape of supplements focused on sleep
00:47:02.400 | and figuring out the right dosages and which ones to take for you.
00:47:06.080 | AGZ is to my knowledge, the most comprehensive sleep supplement on the market.
00:47:10.000 | I take it 30 to 60 minutes before sleep. It's delicious by the way.
00:47:13.840 | And it dramatically increases both the quality and the depth of my sleep.
00:47:17.280 | I know that both from my subjective experience of my sleep and because I track my sleep.
00:47:21.440 | I'm excited for everyone to try this new AGZ formulation and to enjoy the benefits of better sleep.
00:47:26.640 | AGZ is available in chocolate, chocolate mint, and mixed berry flavors.
00:47:30.400 | And as I mentioned before, they're all extremely delicious.
00:47:32.960 | My favorite of the three has to be, I think, chocolate mint, but I really like them all.
00:47:37.440 | If you'd like to try AGZ, go to drinkagz.com/huberman to get a special offer.
00:47:43.040 | Again, that's drinkagz.com/huberman.
00:47:46.400 | Today's episode is also brought to us by Rorra.
00:47:48.880 | Rorra makes what I believe are the best water filters on the market.
00:47:51.840 | It's an unfortunate reality, but tap water often contains contaminants that negatively impact our health.
00:47:58.160 | In fact, a 2020 study by the Environmental Working Group estimated that more than 200 million Americans
00:48:03.680 | are exposed to PFAS chemicals, also known as forever chemicals, through drinking of tap water.
00:48:09.200 | These forever chemicals are linked to serious health issues, such as hormone disruption,
00:48:13.600 | gut microbiome disruption, fertility issues, and many other health problems.
00:48:18.080 | The Environmental Working Group has also shown that over 122 million Americans drink tap water with
00:48:23.520 | high levels of chemicals known to cause cancer.
00:48:26.240 | It's for all these reasons that I'm thrilled to have Rorra as a sponsor of this podcast.
00:48:30.640 | Rorra makes what I believe are the best water filters on the market.
00:48:33.920 | I've been using the Rorra countertop system for almost a year now.
00:48:37.360 | Rorra's filtration technology removes harmful substances, including endocrine disruptors
00:48:41.760 | and disinfection byproducts, while preserving beneficial minerals like magnesium and calcium.
00:48:46.640 | It requires no installation or plumbing.
00:48:48.720 | It's built from medical grade stainless steel, and its sleek design fits beautifully on your countertop.
00:48:53.200 | In fact, I consider it a welcome addition to my kitchen.
00:48:55.680 | It looks great, and the water is delicious.
00:48:58.320 | If you'd like to try Rorra, you can go to rorra.com/huberman and get an exclusive discount.
00:49:04.080 | Again, that's Rorra, R-O-R-R-A.com/huberman.
00:49:08.560 | We will definitely talk more about digital twins.
00:49:10.640 | But what I'm hearing is that it can be very — as we say in nerd-speak — domain specific.
00:49:16.080 | I mean, like the lowest level example I can think of, which would actually be very useful to me,
00:49:20.640 | would be a digital twin of my refrigerator that would place an order for the things that I need,
00:49:28.800 | not for the things I don't need — eliminating the need for a shopping list.
00:49:32.960 | It would just keep track of things like, hey, you usually run out of strawberries on this day and this day. It would track that in the background, and the stuff would just arrive. It would just be there.
00:49:40.240 | And it would eliminate what some might defend with, well, gosh, isn't going to the store nice?
00:49:44.240 | Yeah, this morning I walked to the corner store and bought some produce. I had the time to do that — the eight minutes to do that. But really, I would like the fridge to be stocked with the things that I like and need. I could hire someone to do that, but that's expensive. This could be done trivially, and probably will be done trivially soon.
00:50:01.520 | And I don't necessarily need to even build an app into my phone.
00:50:04.560 | So I like to think in terms of kind of lowest level, but highly useful and easily available now
00:50:12.560 | type technologies.
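A toy version of that background tracking — a minimal sketch, assuming you log restock dates per item; the dates, lead time, and item names are made up for illustration:

```python
from datetime import date, timedelta

# Hypothetical restock log: item -> dates it was bought (oldest to newest).
restocks = {
    "strawberries": [date(2025, 5, 2), date(2025, 5, 9), date(2025, 5, 16)],
    "milk": [date(2025, 5, 1), date(2025, 5, 11)],
}

def next_order_date(dates, lead_days=1):
    """Predict when to reorder from the average gap between past restocks."""
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return dates[-1] + timedelta(days=round(avg_gap) - lead_days)

for item, dates in restocks.items():
    print(f"{item}: order again around {next_order_date(dates)}")
```

A real "refrigerator twin" would learn these rates from sensors or purchase history rather than a hand-typed log, but the loop — observe consumption, predict depletion, order ahead — is the same.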
00:50:13.520 | There are a couple of areas here. When it comes to students learning information, we've heard of AI generally as this really bad thing — like, oh, they're just going to use AI to write essays and things like that. But there's a use of AI for learning. I know this because I'm still learning; I teach and learn all the time for the podcast.
00:50:33.280 | I've been using AI to take large volumes of text from papers — so this isn't AI hallucinating; it takes large volumes of text verbatim from papers.
00:50:48.560 | I've read those papers, literally printed them out, taken notes, et cetera.
00:50:53.200 | And then I've been using AI to design tests for me of what's in those papers.
00:50:58.000 | Because I learned, you know, about eight months ago when researching a podcast on how to study and
00:51:03.520 | learn best, the data all point to the fact that when we self-test,
00:51:06.080 | especially when we self-test away from the material, like when we're thinking,
00:51:10.720 | oh yeah, like what is the cascade of hormones driving the cortisol negative feedback loop?
00:51:16.560 | When I have to think about that on a walk,
00:51:18.320 | as opposed to just looking it up, it's the self-testing that is really most impactful for
00:51:23.840 | memory because most of memory is anti-forgetting.
00:51:26.560 | This is kind of one way to think about it.
00:51:28.640 | So what I've been doing is having AI build tests for me and having it ask me questions.
00:51:33.920 | Like, you know, what is the, you know, the signal between the pituitary and the adrenals
00:51:40.560 | that drives the release of cortisol?
00:51:43.200 | And what layer of the adrenals does cortisol come from?
00:51:45.440 | And I love that.
00:51:46.720 | And so it's, I'm sure that the information it's drawing from is accurate,
00:51:51.200 | at least to the best of science and medicine's knowledge now.
00:51:53.520 | And it's just testing me and it's learning.
00:51:56.560 | This is what's so incredible about AI.
00:51:58.240 | And I don't consider myself like extreme on AI technology at all.
00:52:01.600 | It's learning where I'm weak and where I'm strong at remembering things.
00:52:05.440 | Because I'm asking it, where am I weak and where am I strong?
00:52:07.520 | And it'll say, oh, naming here, and these third-order conceptual links there, need a little bit of work.
00:52:13.840 | And I go, test me on it.
00:52:14.720 | And it starts testing me on it.
00:52:16.240 | It's amazing.
00:52:16.880 | Like I'm blown away that the technology can do this.
00:52:21.280 | And I'm not building apps with AI or anything.
00:52:23.360 | I'm just using it to try and learn better.
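A minimal sketch of that workflow, assuming the OpenAI Python client — any chat-capable LLM API would look similar; the model name, file name, and prompt wording are placeholders, not a recommendation:

```python
from openai import OpenAI  # assumes `pip install openai` and OPENAI_API_KEY set

client = OpenAI()
paper_text = open("paper_excerpts.txt").read()  # verbatim text you pulled yourself

# Ask for questions drawn strictly from the supplied text, so the model is
# quizzing you on the source rather than free-associating.
response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": (
            "Using ONLY the text the user provides, write five short-answer "
            "questions. After each of my answers, grade it and keep a running "
            "note of which concepts I'm weak on.")},
        {"role": "user", "content": paper_text},
    ],
)
print(response.choices[0].message.content)
```

Constraining the model to the pasted text is what keeps this a self-testing aid rather than a source of hallucinated answers, which is the distinction drawn above.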
00:52:25.120 | Whether you're building apps or not, you're using it as a tool that's helping you optimize your cognition and find your weaknesses, but also giving you feedback on your performance and accelerating your learning.
00:52:39.520 | Right.
00:52:39.840 | Because that's the goal. But you're still putting in the effort to learn. And I think even in the ways that I'm using it — computer vision with mobile devices — AI is a huge opportunity: using the cameras and the data you've collected to have much more sophisticated input is huge. But in both of those cases you're shaping cognition; you're using data to enrich what you can know, and AI is incredibly powerful and a great opportunity in those spaces.
00:53:13.520 | The place where I think it gets concerning — and I sort of separate it into literally just two categories; maybe that's too simplistic — is this, and it's true for any tool, not just AI: am I using the technology in a way that makes me smarter, that lets me have more information and makes me more effective — cognitively more effective, gaining different insights? Or am I using it to replace a cognitive skill I've done before, to be faster?
00:53:45.920 | And it doesn't mean you don't want to do those things.
00:53:48.000 | I mean, GPS in our car is a perfect example of a place where we're replacing a cognitive skill to make us faster and more effective.
00:53:55.280 | And frankly, you take away your GPS and have us drive around a city, and we're not very good.
00:54:00.400 | And I remember paper maps.
00:54:02.240 | I remember the early studies of the hippocampus were based on London taxi drivers that had mental maps of
00:54:07.440 | the city.
00:54:08.080 | Absolutely.
00:54:08.640 | With all due respect to London taxi drivers, up until GPS those mental maps were necessary; they're not anymore. And they had more gray matter in their hippocampus — we know that. You look at them today and they don't have to have that, because the people in their back seats have more data, more information, eyes from the sky. I mean, satellite data is so huge in our success going forward, and it can anticipate the things that locally you can't. And so it's been replaced.
00:54:41.040 | But it still means when you lose that data, you don't expect yourself to have the same spatial
00:54:48.560 | navigation of that environment without it, right?
00:54:51.360 | I love your two buckets, right?
00:54:53.840 | You're either using it to make you cognitively better or you're using it to speed you up,
00:54:57.520 | but you have to be...
00:54:58.480 | Here's where I think...
00:54:59.200 | Cognitively or physically.
00:55:00.400 | Cognitively or physically.
00:55:01.040 | But you're still trying to gain insight and data and information that's making you a more effective human.
00:55:06.560 | Right.
00:55:06.960 | And I think that the place where people are concerned, including myself, is when we use
00:55:13.520 | these technologies that eliminate steps, make things faster, but we fill in the additional time
00:55:22.080 | or mental space with things that are neutral to detrimental.
00:55:27.360 | It's sort of like saying, okay, I can get all the nutrients I need from a drink that's eight
00:55:33.200 | ounces.
00:55:33.520 | This is not true.
00:55:34.320 | But then the question is like, how do I make up the rest of my calories, right?
00:55:37.600 | Am I making it up with food that's also nutritious — let's just say food that keeps me at a neutral health status — or, because I need calories, am I eating stuff where I'm not necessarily gaining weight, but I'm bringing in a bunch of bad stuff with those calories?
00:55:52.320 | In the mental version of this, things are sped up, but people are filling the space with things that
00:55:58.560 | are making them dumber in some cases.
00:56:01.040 | There was a recent paper from MIT that I actually...
00:56:04.160 | It was...
00:56:05.440 | It is very much what I spend a lot of my time talking about and thinking about, but...
00:56:11.440 | Yeah.
00:56:12.320 | Could you describe that study?
00:56:13.680 | The upshot of the paper, first, was that there's a lot less mental or cognitive process that goes on for people when they use LLMs to write papers. They don't have the same transfer, and they don't really learn the information.
00:56:27.520 | Surprise, surprise.
00:56:28.240 | So just to briefly describe the study, even though it got a lot of popular press: it's MIT students writing papers using AI versus writing papers the old-fashioned way, where you think and write.
00:56:39.600 | So there were three different categories.
00:56:40.800 | People had to write the papers, you know, just with or using their brain only.
00:56:44.640 | That would be case one.
00:56:47.600 | Case two would be I get to use search engines, which would be sort of a middle ground.
00:56:51.760 | Again, these are, you know, rough categories.
00:56:54.160 | And then a third would be I use LLMs to write my paper.
00:56:58.320 | And they're looking at, you know, sort of what kind of transfer happened, what, you know,
00:57:03.520 | what kind of...
00:57:04.240 | They were measuring neural response.
00:57:05.760 | So they were using EEG to look at neural patterns of...
00:57:09.280 | across the brain to understand how much neural engagement happened during the writing of the
00:57:13.760 | papers and during the whole process.
00:57:16.240 | And then what they could do with that, what they knew about that information down the road.
00:57:21.040 | It's a really nice paper.
00:57:21.920 | So I don't want to diminish it in any way by summarizing it.
00:57:26.000 | But what I think is a really important upshot of that paper, and of how we talk about it: I talk a lot about cognitive load, always. You can measure cognitive load in the diameter of your pupil, in body posture, in how people are thinking. It's really: how hard is my brain working right now to solve a problem, or just in my context?
00:57:47.120 | And there are a lot of different cues we give off as humans that tell us when we're under different states of load, cognitively, whether we are aware of it or not.
00:57:55.840 | And there's something called cognitive load theory that breaks down sort of what happens
00:58:01.760 | when our brains are under states of, you know, load.
00:58:05.360 | And that load can come from sort of three different places.
00:58:09.040 | It might come from what you would call intrinsic load — and this is all during learning. Intrinsic cognitive load comes from the difficulty of the material
00:58:22.800 | I'm trying to understand.
00:58:24.000 | And how, you know, really, some things are easy to learn, some things are a lot harder.
00:58:28.560 | And that's intrinsic load.
00:58:30.560 | And extraneous load would be the load that comes from how the information is presented.
00:58:34.960 | Is it poorly taught?
00:58:36.800 | Is it poorly organized?
00:58:37.920 | Or also even the environment.
00:58:39.440 | If I'm trying to learn something auditorily and it's noisy, that's introducing extraneous
00:58:43.360 | cognitive load, right?
00:58:44.400 | It's not the information itself, but it's because of everything else happening with that data.
00:58:50.080 | And then the third is germane cognitive load.
00:58:52.480 | And that's the load that is used in my brain to build mental schemas, to organize that information,
00:58:59.200 | to really develop a representation of what that information is that I'm taking in.
00:59:05.680 | And that germane cognitive load, that's the work, right?
00:59:10.320 | And if you don't have germane cognitive load, you don't have learning, really.
00:59:13.680 | And what they found is basically the germane cognitive load is what gets impacted most by
00:59:18.720 | using LLMs, which is, I mean, it's a very obvious thing.
00:59:22.320 | Meaning you don't engage the same high levels of germane cognitive load.
00:59:27.520 | Using LLMs, you're not engaging the mental effort to build cognitive schemas, neural schemas — the mental representation of the information — such that you can interact with it later and have access to it later.
00:59:42.560 | And this is really important because without that, you won't be as intelligent on that topic,
00:59:47.360 | that's for sure, down the road.
00:59:49.120 | Let me give two examples. I have a doctor and a lawyer, and both of them use LLMs extensively — for searches, say, or for building information. In one case it's for aggregation of patient data; in the other, it's for the history of case files. And that is the GPS that's happening in those spaces, because those are the tools that are quickly adopted.
01:00:08.160 | Someone who came from a different world, who has learned that information and has gone and worked with the data in a different way — their representation of that information is going to be better at extrapolation, better at generalization, better at seeing patterns that exist.
01:00:26.720 | The brain that has done everything through LLMs will still get the answer for the relevant task using the tools it has, but it won't have the same level of richness and depth of information, or generalization, or extrapolation, for those topics as someone who has learned in a different way.
01:00:52.720 | There's a generational difference in understanding, not because they don't
01:00:57.840 | have the same information, but there is an acknowledgement that there's a gap,
01:01:02.160 | even though we're getting to the same place as fast.
01:01:04.880 | And that's because of the learning that's happened.
01:01:07.040 | Mm-hmm.
01:01:07.680 | That's germane cognitive load.
01:01:09.360 | Absolutely.
01:01:10.080 | The germane cognitive load.
01:01:11.040 | Like, you've got to do the work.
01:01:12.320 | Your brain has to do it. And what was beautiful about your description, Andy, of how you were using it — which I love — to test yourself and find your vulnerabilities: actually, in the MIT paper — and again, these are things that are somewhat obvious, but I think we have to talk about them more — people with higher competency on the topic use the tools in ways that still engage more germane cognitive load but help to accelerate their learning.
01:01:39.600 | Mm-hmm.
01:01:39.680 | Where is the biggest vulnerability and gap? It's especially in areas and topics where you're trying to learn a new domain fast, or you're under pressure, and you're not putting in the germane effort, or you're not using the tools that you have access to that AI can enable.
01:01:55.280 | Mm-hmm.
01:01:55.760 | You're not using them to amplify your cognitive gain, but instead to deliver something faster, more rapid — and then you walk away from it.
01:02:06.720 | I'm going to try and present two parallel scenarios in order to go further into this question of how
01:02:15.440 | to use AI to our best advantage to enrich our brains as opposed to diminish our brains.
01:02:19.680 | Mm-hmm.
01:02:20.480 | So I could imagine a world, because we already live in it, where there's this notion of slow
01:02:28.000 | food. Like, you cook your food, you get great ingredients from the farmer's market, like a peach
01:02:33.200 | that quote-unquote really tastes like a peach, this kind of thing. You make your own food, you cook it,
01:02:39.840 | and you taste it, and it's just delicious. And I can also imagine a world where you order a peach pie
01:02:46.320 | online, it shows up and you take a slice and you eat it. And you could take two different generations
01:02:51.200 | of people, maybe people that are currently now 50 or older and people that are 15 or younger,
01:02:57.280 | and the older generation would say, "Oh, isn't that the peach pie that you made so much better?
01:03:01.920 | Like, these peaches are amazing." And I could imagine a real scenario where the younger person,
01:03:07.840 | 15 to 30, let's say, would say, "I don't know, I actually really like the other pie. I like it just as
01:03:13.440 | well." And the older generation is like, "What are you talking about? Like, this is how it's done.
01:03:19.280 | What's different?" Well, sure, experience is different, et cetera. But from a neural standpoint,
01:03:26.320 | from a neuroscience standpoint, it very well could be that it tastes equally good to the two of them,
01:03:33.360 | just differs based on their experience, meaning that the person isn't lying. It's not like this kid,
01:03:39.120 | you know, isn't as fine-tuned to taste. It's that their neurons acclimated to, like,
01:03:46.000 | what sweetness is and what contrast between sweet and saltiness is and what a peach should taste like.
01:03:50.720 | Because, damn it, they had peach gummies and that tastes like a peach, you know? And so we can be
01:03:55.760 | disparaging of what we would call the lower-level or diminished sensory input.
01:04:02.640 | Yeah. But it depends a lot on what those neural circuits were weaned on.
01:04:07.200 | A couple of comments. I love the peach pie example. Making bread is another example of that. And in the 90s,
01:04:15.760 | everyone I knew when they graduated from high school got a bread maker that was shaped like a
01:04:20.560 | box and created this, like, loaf of bread with a giant, you know, rod through it. And it was just,
01:04:27.200 | it was the graduation gift for many years. And, you know, you don't see those anymore.
01:04:33.840 | And if you even look at what happened with the millennial generation in the last five years, especially during the pandemic: suddenly making sourdough bread became a thing. What's the difference? With the bread maker you've got bread, it's warm, it's fresh. And yet it's not at all desired relative to bread that takes a long period of time, that is tactile in the process and the making of it, and that is clearly much more onerous in its process of development.
01:05:02.000 | I think the key part is that it's in the
01:05:09.440 | appreciation of the bread: the process is part of it. And that process is the development of the germane knowledge, the commitment and connection to the humanness of the making — the tactile commitment. The work that went into it is appreciated in the same way that the peach pie, for one person, comes with a whole time series of data that wasn't just about taste, but was also smell, also physical, also visual — they saw the process evolve and built a different prior going into that experience. And that is, I think, part of the richness of human experience. Will it be
01:05:54.240 | part of the richness of how humans interact with AI? Absolutely. Or interact with robots? Absolutely. So the relationships we're building — how integrated these tools, these companions, whatever they may be, are in our existence — will shape us in different ways. What I am particularly,
01:06:17.200 | I guess, bullish on and excited for is the robot that optimizes my health, my comfort, my intent in my
01:06:24.480 | environment — be it the cabin of a car, be it my rooms, my spaces.
01:06:31.600 | So what would that look like? Could you give me the lowest-level example? Would it be an assistant that helps you travel today when you head back to the Bay Area? What is this non-physical robot?
01:06:42.320 | And I think we already have some of these. It's the point where HVAC systems actually get sexy, right? Not sexy in that sense, but they're actually really interesting, because they are the heart of the space. HVAC systems? Heating, ventilation, and air conditioning.
01:06:56.720 | But think about a thermostat. Right now, even an AI thermostat optimizing for my behavior is trying to save me resources, trying to save me money. It doesn't know if I'm hot or cold. It doesn't know, to your point, my intent — what I'm trying to do at that moment. And this speaks to a lot of the things you've studied in the past: it doesn't know what my optimal state is for my goal in that moment in time.
01:07:26.960 | But it can, very easily, frankly — it can talk to me, but it can also know the state of my body right now and what is going on. If it's 1 a.m. and I really need to work on a paper, my house should not get cold. Well, for me it shouldn't — I know for some people it should.
01:07:45.120 | Yeah, my Eight Sleep mattress, which I love,
01:07:49.760 | love, love, love — and yes, they're a podcast sponsor, but I would use one anyway — it knows what temperature adjustments need to be made across the course of the night. I put in what I think it
01:08:00.320 | is best, but it's updating all the time now, because it has updating sensors, like dynamically updating
01:08:06.000 | sensors. I'm getting close to two hours of REM sleep a night, which is outrageously good for me,
01:08:12.960 | much more deep sleep, and that's a little micro environment. You're talking about integrating
01:08:17.920 | that into an entire home environment. Home, vehicle, yes, because it needs to treat me as a dynamic time
01:08:23.840 | series. It needs to understand the context of everything that's driving my state internally.
01:08:29.040 | There's everything that's driving my state in my local environment, meaning my home or my car, and then
01:08:33.920 | there's what's driving my state externally, from my external environment. And we're in a place where those things are rarely treated as interacting together for the optimization, the dynamic interactions, that happen. But we can know these things. We can know so much about the human state from non-contact sensors.
01:08:52.960 | Yeah, and we're right at the point where the sensors can start to
01:08:57.920 | feed information to AI to deliver, effectively — again, a lower-level example would be the dynamically cooling or dynamically heating mattress. I discovered through the adjustments my mattress was applying, and was then told, that heating your sleep environment toward the end of the night increases your REM sleep dramatically, whereas cooling
01:09:18.800 | it at the beginning of the night increases your deep sleep. This has been immensely beneficial for me to be
01:09:23.360 | able to shorten my total sleep need, which is something that for me is like awesome. Because I, I like sleep
01:09:28.400 | a lot, but I don't want to need to sleep so much in order to feel great. Well, you want to have your own choice about how you sleep. Yeah. Given the data, it's helping you have that. Right. Sometimes I have six hours, sometimes I have eight hours, this kind of thing.
01:09:39.200 | Here's where I get stuck, and I've been
01:09:46.960 | wanting to have a conversation about this with someone, ideally a neuroscientist who's interested in building
01:09:52.640 | technologies for a very long time. So I feel like this moment is a moment I've been waiting for for a
01:09:58.320 | very long time, which is the following. I'm hoping you can solve this for all of us, Poppy.
01:10:03.440 | We're talking about sleep, and we know a lot about sleep. You've got slow wave sleep, deep sleep,
01:10:09.040 | growth hormone release at the beginning of the night. You have less metabolic need then. Then you have
01:10:12.960 | rapid eye movement sleep, which consolidates learning from the previous day. It removes the emotional load of
01:10:17.600 | previous day experiences. We can make temperature adjustments. We do all these things, avoid caffeine too late in
01:10:21.600 | the day. Lots of things to optimize these known states that occupy this thing that we call sleep.
01:10:26.960 | And AI and technology are, I would say, doing a really great job, as is pharmacology, of trying to enhance sleep. Sleep's getting better. We're getting better at sleeping, despite more forces potentially disrupting our sleep — smartphones, city noise, et cetera. Okay.
01:10:46.400 | Here's the big problem in my mind, is that we have very little understanding or even names for different
01:10:53.280 | awake states. We have names for the goal, like, I want to be able to work. Okay, what's work? What kind
01:11:01.840 | of work? I want to write a chapter of a book. What kind of book? A nonfiction book. Based on what? But like,
01:11:08.640 | we talk about alpha waves, beta waves, theta waves, but I feel like as neuroscientists we have done a pretty poor job, as a field, of defining different states of wakefulness. And so the technologies — AI and other technologies — don't really know what to shoot for. They don't know what to help us optimize for. Whereas with slow wave sleep and REM sleep, we've got it.
01:11:36.720 | I ask questions of myself all the time, like: is what my brain requires in the first three hours of the day anything like what it requires in the last three hours of the day, if I want to work in each one of those three-hour compartments? And so I think we don't really understand
01:11:50.000 | what to try and adjust to. So here's my question. Do you think AI could help us understand the different states
01:11:59.520 | that our brain and body go through during the daytime? Give us some understanding of what those are in terms of
01:12:05.840 | body temperature, focus ability, etc. And then help us optimize for those the same way that we optimize for
01:12:11.520 | sleep. Because whether it's a conversation with your therapist, whether or not it's a podcast, whether or
01:12:16.240 | not it's playing with your kids, whether or not it's Netflix and chill, whatever it is, the goal, and what
01:12:22.000 | people have spent so much time, energy, money, etc. on, whether or not they're drinking alcohol, caffeine,
01:12:27.280 | taking Ritalin, or Adderall, or running, or whatever. Like, humans have spent their entire
01:12:34.080 | existence trying to build technologies to get better at doing the things that they need to do, and yet we
01:12:40.640 | still don't really understand waking states. So can AI teach it to us? Can AI teach us a goal that we don't
01:12:47.600 | even know we have? Can AI teach it to us? I would say AI is part of the story, but before we get to AI, we need better, more data — and not just about me, right? Maybe I am very focused right now. My belief, and this is my perspective, is: imagine I'm very focused right now. I need to know the context of my environment that's
01:13:11.600 | driving that. Like, what's in that environment? Is it internal focus that's gotten me there? What is my
01:13:18.400 | environment? What is that external environment? So understanding my awake state, for me, is very
01:13:26.240 | dependent on the data and interactions that happen from these different environments. Let me give an
01:13:31.600 | example. If I'm in my home, or say I'm in a vehicle, and you are measuring information about me, and you know I'm under stress, or experiencing joy, or heightened attention right now — for some of those states, you may want to have my home or my system react to mitigate.
01:13:55.920 | Well, like, if you get sleepy in a self-driving, in a smart vehicle, it will make adjustments.
01:14:01.360 | Potentially, it will make adjustments, but not necessarily right for you. That's an important
01:14:05.920 | part: optimizing for personalization in how a system responds. A home, an HVAC system, or the internal state of a vehicle is going to adjust sound — background sound, music — haptic feedback, temperature, lighting, the position of your chair, the dynamics of what's in your space. All of these different systems in my home, or my vehicle, or some other system can react, right?
01:14:40.000 | But the important thing is that how you react is going to shift me. And the goal is not just to measure me, but to actually intersect with my state and move it in some direction, right?
01:14:58.080 | Yeah. I always think of devices as good at measurement or modification.
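A minimal sketch of that measure-then-modify loop — the state variables, setpoints, and thresholds here are illustrative placeholders, not values from the conversation:

```python
def adjust_environment(state: dict, goal: str) -> dict:
    """Map an estimated internal state plus a stated goal to environment changes."""
    actions = {}
    if goal == "focused_work":
        if state.get("arousal", 0.5) < 0.4:      # under-aroused for the task
            actions.update(temperature_c=20, light_lux=500)   # cool and bright
        elif state.get("arousal", 0.5) > 0.8:    # over-aroused / stressed
            actions.update(background_sound="low_noise", light_lux=300)
    elif goal == "wind_down":
        actions.update(temperature_c=22, light_lux=50)        # dim and warm
    return actions

# e.g. a 1 a.m. paper-writing session where the sensors read low arousal:
print(adjust_environment({"arousal": 0.3}, "focused_work"))
# -> {'temperature_c': 20, 'light_lux': 500}
```

A real system would learn these mappings per person rather than hard-coding them, which is exactly the personalization point being made.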
01:15:04.400 | Right. Measurement or modification. Measurement is critical — but measurement not just of me, but also of my environment, and an understanding of the external environment. This is where things like earth observation come in: we're getting to a place where we're getting really good image-quality data from satellites orbiting at much lower altitudes, so that you now have faster reaction times between technologies and the information they have to understand and be dynamic with, right?
01:15:44.960 | Can you give me an example where that impacts everyday life? Are we talking about
01:15:48.800 | weather analysis? Sure. Weather predictions, car environment, you know, things happening.
01:15:54.240 | And what about traffic? Why haven't they solved traffic yet, given all the knowledge of object flow and how to optimize for it? We've got satellites that can basically look at traffic and, I mean, open up roads dynamically — change the number of lanes. Why isn't that happening?
01:16:10.160 | The traffic problem gets resolved when you have autonomous vehicles, in ways that take out the human side of things. That gets resolved.
01:16:15.440 | It does? Autonomous vehicles can solve traffic?
01:16:20.320 | With fully autonomous vehicles, you probably don't have traffic in the ways that you do with human behavior. That's reason alone to shift to autonomous vehicles. It is the injection from the human system that is interrupting all the models.
01:16:33.840 | I think the world right now — we think about wearables a lot.
01:16:38.720 | Wearables track us. You have smart mattresses, which are wonderful — there's so much you can learn from a smart mattress, and ways of both measuring and intervening to optimize your sleep, which is the beauty of it. It's this incredible period of time where you can measure so many things.
01:17:04.480 | But in our home — I used the example of a thermostat, right? It's frankly pretty dumb about what my goals are or what I'm trying to do at that moment in time. But it doesn't have to be. And there's a company,
01:17:14.640 | Passive Logic. I love them. They actually have, I think, some of the smartest digital twin HVAC
01:17:20.000 | systems. Their sensors measure things like sound. They measure carbon dioxide.
01:17:25.440 | Carbon dioxide?
01:17:30.800 | Your CO2 levels. When we breathe, we give off CO2. So imagine: there's a dynamic mixture of acetone, isoprene, and carbon dioxide that's constantly changing when I get stressed, or when I'm feeling happiness or suspense. That dynamic cocktail in my breath is both an indicator of my state and a way for the spaces around me to have more information about how I'm feeling — and to be part of the solution — without my having to have things on my body, right?
01:18:08.320 | So I have sensors now
01:18:18.480 | when I was at Dolby and had, had them watching Free Solo, you know, the Alex Honnold movie where they're
01:18:23.840 | climbing El Cap. Stressful. So, carbon dioxide's heavier than air, so we can measure,
01:18:27.840 | we could measure carbon dioxide from, you know, just tubes on the ground, and you could get the real
01:18:32.320 | time differential of CO2 in there. Were they scared throughout? No. Well, but it's, I mean,
01:18:39.120 | I like to say we broadcast how we're feeling, right? And we do that wherever we are. And in this,
01:18:45.280 | you could look at the time series of carbon dioxide levels and be able to, you know, know what was
01:18:51.200 | happening in the film or in the movie without actually having it annotated. You could tell where
01:18:55.840 | he summited, where he had to abandon his climb, where he hurt his ankle. Absolutely. There's another study,
01:19:01.200 | I forget who the authors are. And they're, you know, they've got different audiences watching
01:19:04.560 | Hunger Games. And, you know, different days, different people. You can tell exactly where
01:19:09.680 | Katniss's dress catches on fire. And, you know, it's like we really are sort of, you know,
01:19:15.280 | it's like digital exhaust of how we're feeling. But, you know, and our thermals, we, you know,
01:19:19.920 | radiate the things we're feeling. I'm very bullish on the power of, you know, our eye in representing our
01:19:27.440 | cognitive load, our stressors. Our eye? Our eye. Yes.
01:19:30.800 | Like the diameter. Oh, our eye. Our, yes. Our eye. Sorry. Our,
01:19:34.640 | literally our eyes. Yes. Our pupil size. Yes. Yes. Yes. I, you know, back when I was a physiologist,
01:19:40.640 | I always, you know, I've worked with a lot of species on in, you know, understanding information
01:19:45.760 | processing internally in cells. But also then I would very often use pupilometry as an indicator of,
01:19:51.840 | you know, perceptual engagement and experience. Yeah. Bigger people mean more arousal, higher
01:19:57.360 | levels of alertness. Yeah. We just know this. More arousal, cognitive load, or, you know,
01:20:02.640 | obviously lighting changes. But the thing that's changing from, you know, 20 years ago, 15 years
01:20:09.200 | ago, it was very expensive to track the kind of resolution and data to, you know, leverage all of
01:20:15.120 | those autonomic nervous system, you know, deterministic responses. Because those ones
01:20:19.360 | are deterministic and not probabilistic, right? Those are the ones that it's like the body's reacting
01:20:24.000 | even if the brain doesn't say anything about it. Below conscious detection. Yeah. Yeah.
01:20:27.520 | And, but today we can do that with, I mean, do it, well, we can do it right now with, you know,
01:20:32.560 | open source software on our laptops or our mobile devices, right? And every pair of smart glasses will be
01:20:39.200 | tracking this information when we wear them. So it is, becomes a channel of data. And, you know,
01:20:45.440 | you, it may be an ambiguous signature in the sense that there's, you know, changes in lighting,
01:20:49.840 | there's changes, am I aroused or am I, you know. Those can be adjusted for, right? Like if you,
01:20:54.000 | you can, you can literally take a measurement, wear eyeglasses that are measuring pupil size.
01:20:58.720 | Mm-hmm. The eyeglasses could have a sensor
01:21:01.280 | that detects levels of illumination in the room. Mm-hmm.
01:21:03.920 | At the level of my eyes. Mm-hmm. It could measure how dynamic that is. And we just make
01:21:08.240 | that the denominator in a fraction, right? And then we just look at changes in pupil size
01:21:12.240 | as the numerator in that fraction, right? Yep. More or less. You just have to have other sensors.
01:21:16.720 | All you need to do is cancel. So as, as you walk from a shadowed area to a brighter area,
01:21:21.120 | sure, the pupil, the size changes, but then you can adjust for that change, right? Yep.
01:21:24.800 | You just like normalize for that. And you end up with an index of arousal.
01:21:29.840 | Right. Which is amazing. You could also use the index of,
01:21:32.560 | of illumination as a useful measure of like, are you getting, uh, compared to your vitamin D levels,
01:21:37.680 | uh, to your levels of, maybe you need more illumination in order to get more arousal.
01:21:41.280 | Like it could tell you all of this. It could literally say, hey, take a five minute walk
01:21:45.840 | outside in to the left after work, and you will, um, get your, your require, your photon requirement
01:21:52.960 | for the day. You know, this kind of thing, not just measuring steps. Mm-hmm.
01:21:56.160 | All this stuff is possible now. I just don't know why it's not being integrated into
01:22:01.040 | single devices more quickly. Cause you'd love to also know that person's blood sugar instead of like,
01:22:05.280 | drawing their blood, taking it down to the wall. Like you think about in the, with, for, with the
01:22:09.760 | resident that's been up for, for 13 hours, because that's the standard in the field and they're making
01:22:14.320 | mistakes on a, on a, on a chart. It's like, I think at some point we're just going to go,
01:22:18.400 | I can't believe we used to do it that way. It's crazy. Yeah. No. And it's a lot of the consumer
01:22:23.760 | devices and just computation. We can do from, you know, whether it's cameras or exhalant or,
01:22:31.200 | you know, other data in our environments that tell us about our physical state and some of these
01:22:35.920 | situations that you're talking about. A lot of the, I mean, why isn't it happening? A lot of the reasons
01:22:40.160 | are simply the regulatory process is antiquated and not up to keeping up with the acceleration of
01:22:47.040 | innovation that's happening, you know, getting things through the FDA, even if they're, you know,
01:22:51.360 | deemed, you know, in the same ballpark and supposed to move fast, you know, with the regulatory costs
01:23:00.480 | and processes is really high. And, you know, you end up many years, you know, down the road from when
01:23:07.920 | the capability and the data and technology actually, you know, should have arisen to be used in a hospital
01:23:13.680 | or to be used in a place where you actually have that kind of appreciation for the data,
01:23:19.120 | you know, and use. The consumer grade devices for tracking of data of our biological processes
01:23:27.040 | are on par and in many cases surpassed the medical grade devices. And that's because they just have,
01:23:33.600 | but then they will have to bill what they do and what they're tracking in some way that is consumer,
01:23:38.960 | you know, is not making the medical claims to allow them to be able to be,
01:23:42.560 | you know, continue to move forward in those spaces. But there's no question that that's a big part of
01:23:49.520 | what can, you know, holds back the availability of a lot of these devices and capabilities.
01:23:58.400 | I'd like to take a quick break and acknowledge one of our sponsors, Function. Last year, I became a
01:24:03.520 | Function member after searching for the most comprehensive approach to lab testing. Function
01:24:08.320 | provides over 100 advanced lab tests that give you a key snapshot of your entire bodily health.
01:24:14.240 | This snapshot offers you insights on your heart health, hormone health, immune functioning,
01:24:18.880 | nutrient levels, and much more. They've also recently added tests for toxins, such as BPA exposure from
01:24:24.480 | harmful plastics, and tests for PFASs or forever chemicals. Function not only provides testing of
01:24:30.240 | over 100 biomarkers key to your physical and mental health, but it also analyzes these results and
01:24:35.360 | provides insights from top doctors who are expert in the relevant areas. For example, in one of my first
01:24:41.120 | tests with Function, I learned that I had elevated levels of mercury in my blood. Function not only helped
01:24:46.160 | me detect that, but offered insights into how best to reduce my mercury levels, which included limiting my tuna
01:24:51.600 | consumption. I've been eating a lot of tuna while also making an effort to eat more leafy greens and
01:24:56.400 | supplementing with NAC, N-acetylcysteine, both of which can support glutathione production and
01:25:00.800 | detoxification. And I should say by taking a second Function test, that approach worked. Comprehensive
01:25:06.240 | blood testing is vitally important. There's so many things related to your mental and physical health
01:25:11.120 | that can only be detected in a blood test. The problem is blood testing has always been very expensive and
01:25:16.320 | complicated. In contrast, I've been super impressed by Function's simplicity and by the level of cost.
01:25:22.080 | It is very affordable. As a consequence, I decided to join their scientific advisory board and I'm
01:25:26.880 | thrilled that they're sponsoring the podcast. If you'd like to try Function, you can go to
01:25:31.120 | functionhealth.com/huberman. Function currently has a wait list of over 250,000 people, but they're
01:25:37.600 | offering early access to Huberman podcast listeners. Again, that's functionhealth.com/huberman to get early
01:25:44.320 | access to Function. Okay. So I agree that we need more data and that there are a lot of different
01:25:50.480 | sensors out there that can measure blood glucose and sleep and temperature and breathing, all sorts of
01:25:57.200 | things, which raises the question of, are we going to need tons of sensors? I mean, are we going to be
01:26:03.680 | just wrapped in sensors as clothing? Are we going to be wearing 12 watches? What's this going to look like?
01:26:11.440 | I'm an advocate for fewer things — for not having all this stuff on our bodies. There's so much we can get out of the computer vision side, from the cameras in our spaces and how they're supporting us in our rooms, and from the sensors in our homes — I brought up HVAC systems earlier. So now you've got, effectively, a digital twin: sensors that are tracking my metabolic rate just in my space. They're tracking carbon dioxide. They're tracking sound. You're getting context because of that. You're getting intelligence. And now you're able to start having more information about what's happening in my environment.
01:26:51.600 | The same is true in my vehicle. You can tell whether I'm stressed or how I'm feeling just by the posture I'm sitting in, in my car, right?
01:27:02.560 | And you need AI. This is AI interpretation of data. But what's driving that posture
01:27:09.680 | might also be coming from an understanding of what else is happening in that environment. So suddenly it's contextual intelligence: an AI-driven understanding of what's happening in that space that's driving my state. Maybe I keep leaning to the side because of what I'm talking or thinking about — the way I move and sit is a proxy for what's actually happening inside me.
01:27:35.280 | And then you've also got data around me coming from my environment: what's happening if I'm driving a car, what's happening in my home, in the weather — not just threats that might be outside, or noise happening outside the space, but things that give context, so the systems we have can have more intelligence.
01:28:01.120 | So I'm a huge believer that we aren't anywhere until we have integration of those systems
01:28:08.720 | between the body, the local environment, and the external environment. And we're finally at a place
01:28:14.320 | where AI can help us start integrating that data. In terms of wearables, though — obviously, from the big companies, the watch we have on our wrist has a lot of information that is very relevant to our bodies. And with the devices we put in our ears, you may not realize it, but from a dime-sized patch in your concha we can know heart rate, pulse, blood oxygen level.
01:28:41.200 | Because of the electrical signature that your eye produces when it moves back and forth, we can know what you're looking at just from measuring your electrooculogram in your ear. We can measure EEG, electroencephalograms — and you can get eye movements out of electroencephalograms, but you can also get attention. You can know what people are attending to based on signatures in their ear.
01:29:05.840 | So our earbuds become sort of a window to our state. And you've got a number of companies working on that right now.
01:29:16.800 | So do we need to wear lots of different sensors? No. Do we need the sensors and the data we have, whether on our bodies or off, to be able to work together — not proprietary to just one company, but able to integrate well with other companies' systems? That becomes really important. You need integrative systems, so that the data they have can interact with the systems that surround you, surround my spaces, or the mattress I'm sleeping on, right?
01:29:48.480 | Because a lot of specialty of design has come from different developers, and that's partly been a product of, again, the FDA and the regulatory pathways: because of the cost of development, companies tend to move towards specialization unless they're very large.
01:30:09.760 | Mm-hmm. But where we're at today: we're getting to a point where you're going to start seeing a lot of this data get integrated, I think. And by all means, hopefully we're not going to be wearing a lot of things on our bodies. I sure as heck won't. The more we put on our bodies, the more it affects our gait; it has ramifications in so many different ways.
01:30:28.080 | When I got here, I was talking to some of the people who work with you, and they asked, well, what wearables do you wear? I actually don't wear many at all. I have worn rings, I've worn watches at different times. But for me, what matters is the point at which I get insights; I am a big believer in as little on my body as possible when it comes to wearables.
01:30:50.160 | One interesting company that I think is worth mentioning is Pison. They've got a form factor like a Timex watch — they're partnered with Timex — but they're measuring... are you familiar with Pison? No. Okay. They're measuring psychomotor vigilance, really trying to understand — it's like an ENG, electroneurography — and they're trying to
01:31:17.600 | understand fatigue and neural attentiveness in a way that is continuous and useful for, say, high-risk operations or training, be it in sport or elsewhere. But what I like about it is that it's actually trying to get at a higher-level cognitive state from the biometrics you're measuring.
01:31:44.080 | And that, to me, is a really exciting direction: when you can make a decision about how you engage in your work, your training, or your life based on data about your cognitive state and how effective you're going to be. And then you can start associating that data with the other data to make better decisions, gain better insights, at a certain point in time. And that, really, is your digital twin.
01:32:10.720 | It's interesting. Earlier you said you don't like the word gamification. But one thing that I think
01:32:16.960 | has really been effective in the sleep space is this notion of a sleep score, where people aspire to get
01:32:24.080 | a high sleep score. And if they don't, they don't see that as a disparagement of them,
01:32:32.320 | but rather as a sign that they need to adjust their behavior. So it's not like, oh, I'm a terrible sleeper and
01:32:37.920 | I'll never be a good sleeper. It gives them something to aspire to on a night-by-night basis.
01:32:41.760 | Yes. And I feel like that's been pretty effective. When I say gamification, I don't necessarily mean
01:32:46.560 | competitive with others; I mean encouraging of oneself, right? So I could imagine
01:32:54.400 | this showing up in other domains too, for wakeful states. Like, I had
01:33:00.800 | very few highly distracted work bouts, or something like that. I'd love to know at the
01:33:07.440 | end of my day that I had three really solid work bouts of an hour each at least. That would feel
01:33:16.000 | good. It was a day well spent, even if I didn't accomplish what I wanted to in its
01:33:20.960 | entirety. Like, I put in some really good solid work. Right now, it's all very subjective. We know
01:33:27.520 | the gamification of steps was very effective as public messaging: 10,000 steps a day.
01:33:32.880 | We now know you want to get somewhere exceeding 7,000 as a threshold. But if you think about it,
01:33:38.640 | we could have just as easily said, hey, you want to walk at a reasonable pace for you for 30
01:33:44.560 | minutes per day. But somehow the counting-steps thing was more effective, because people I know who are
01:33:50.960 | not fanatic about exercise at all will tell me, I make sure I get my 11,000 steps per day. People
01:33:56.880 | tell me this. I'm like, oh, okay. So apparently it's a meaningful thing for people. So I think
01:34:02.080 | quantification of performance creates this aspirational state, and I think that can be very
01:34:10.240 | useful. Data and understanding the quantification that you're working towards is really important.
01:34:17.440 | Those are summary statistics, effectively, and maybe they're good on some level
01:34:25.120 | to aim for. If it means that people move more, I'm all for it. Right. And if I didn't
01:34:31.680 | move as much before, and I didn't get up and do something, and this is making me do
01:34:36.960 | it, that's awesome. That's great. But it's also great when, now, through something like a computer vision app, I can
01:34:42.800 | understand it's not just 10,000 steps; maybe there's a small battery of
01:34:49.040 | things I'm trying to perform against that are helping shape me neurally, with the feedback and the targets
01:34:56.480 | that I'm getting, so that there's more nuance towards achieving the goal I'm
01:35:00.560 | aiming for, which is what I'm all about from a neuroplasticity perspective. So I just don't like
01:35:05.280 | the word gamification. I believe everything should be fun. Training can be fun and gamified in
01:35:11.280 | some ways. You know, again, my life has been predominantly in industry, but I've always
01:35:15.200 | loved teaching and I've always been at Stanford. There, what I really try to get at
01:35:20.400 | is how do I use technology and merge it with the human system in a way that helps
01:35:26.960 | optimize learning and training from a sort of neural-circuit-first perspective.
01:35:35.280 | How do we think about the neural system and use this more enjoyable,
01:35:41.200 | understandable target to engage with it? One of my favorite examples, though: there was a period,
01:35:50.880 | right around 2018 to 2020, into the pandemic, where
01:35:59.360 | I noticed a shift in the students' projects. For their final project,
01:36:06.160 | they can build whatever they want. They've had to do projects where they build brain-
01:36:11.840 | computer interfaces, projects in VR, AR projects, and
01:36:16.160 | projects that use any sort of input device; they have to use different sensor-driven
01:36:23.440 | input devices, and that's all part of what they develop. And around 2018 to 2020, I started to see
01:36:30.000 | almost every project had a wellness component to it, which I loved. It was a very
01:36:36.160 | notable shift in the student body, and maybe you've seen that too. But one of my
01:36:41.840 | favorite games to date was this VR game where I'm in a morgue. I wake up, I've got to solve
01:36:48.960 | an escape room, and I've got zombies that are coming at me, climbing out of the morgue,
01:36:53.600 | getting closer, and there are people breathing on my neck, everything.
01:36:58.080 | And it's a wellness app. Go figure. It was their idea of, look, this is what I feel like.
01:37:06.480 | Because I'm also measuring my breath and heart rate, I've got to keep those biological signatures in check.
01:37:12.560 | While I'm solving my escape-room problems, the zombies are going to get closer
01:37:19.040 | to me if my breath rate goes up, if my heart rate goes up. I've got to keep them down. So it was about stress
01:37:23.920 | control, basically. Exactly. Yes. But it was in that environment, and it made real
01:37:29.520 | how they felt. And you can do it in much simpler ways, but I'm a huge fan of how
01:37:35.520 | we use the right quantification to develop the right habits, the right skills, the right acuity or
01:37:42.000 | resolution in a domain or an area where we might not be able to break it into the pieces we
01:37:48.080 | need. But it's going to help us get there, because my brain actually needs to learn to understand
01:37:54.880 | that sophistication.
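What the students built is a closed-loop biofeedback design: a physiological signal feeds back into the difficulty of the task in real time. Here is a toy sketch of that loop; the heart-rate stream is simulated, and the baseline, gain, and distances are invented for illustration (swap in a real sensor API if you have one).

```python
# Toy closed-loop biofeedback game loop (illustrative only).
import random

def read_heart_rate_bpm():
    # Hypothetical sensor read; replace with a real wearable/BLE stream.
    return random.gauss(75, 12)

def update_threat_distance(distance, hr, calm_hr=70, gain=0.05):
    """Zombies advance when heart rate exceeds the calm baseline,
    and retreat when the player stays below it."""
    distance -= gain * (hr - calm_hr)   # arousal pulls threats closer
    return max(0.0, min(20.0, distance))

distance = 10.0  # meters between player and zombies
for step in range(10):
    hr = read_heart_rate_bpm()
    distance = update_threat_distance(distance, hr)
    print(f"t={step}s  hr={hr:5.1f} bpm  zombies at {distance:4.1f} m")
    if distance == 0.0:
        print("Caught! Practice slowing your breathing.")
        break
```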
01:38:00.400 | Yeah. It's clear to me that in the health space, giving people information that scares them is great for getting them to not do things, but it's very
01:38:05.600 | difficult to scare people into doing the right things. You need to incentivize people to do the
01:38:10.000 | right things by making it engaging and fun and quantifiable. And I like the example of the
01:38:16.560 | zombie game. Okay. So, fortunately, we won't have to wear dozens of sensors. They'll be more integrated over
01:38:25.120 | time. I'm happy to walk through a cheat sheet later for building out a computer vision
01:38:31.440 | app, for quantifying some of these more personalized, domain-related things that
01:38:37.840 | people might want to do, if that's useful. That would be awesome. Yeah. And then we can post a link to it in the show
01:38:42.000 | note captions, because I think the example you gave of creating an app that can analyze
01:38:47.440 | swimming performance, running gait, focused work bouts, I think that's really
01:38:52.480 | intriguing to a lot of people. But at least for me, there's a gap between hearing
01:38:57.920 | about it, thinking it's really cool, and knowing how to implement it. So I'd certainly appreciate it. I know the
01:39:02.720 | audience would too. That's very generous of you. Thank you. Yes, absolutely.
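To give a flavor of what such a computer vision app can look like before any cheat sheet arrives, here is a minimal sketch, not the guest's method, that estimates a joint angle per video frame with MediaPipe Pose and OpenCV. The input file name is hypothetical, and the packages must be installed (`pip install mediapipe opencv-python`).

```python
# Hedged sketch: per-frame elbow angle from video via pose estimation.
import math
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def angle(a, b, c):
    """Interior angle at landmark b (degrees) formed by a-b-c."""
    ang = math.degrees(math.atan2(c.y - b.y, c.x - b.x)
                       - math.atan2(a.y - b.y, a.x - b.x))
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang

cap = cv2.VideoCapture("swim_stroke.mp4")  # hypothetical input video
with mp_pose.Pose(static_image_mode=False) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_landmarks:
            lm = result.pose_landmarks.landmark
            elbow = angle(lm[mp_pose.PoseLandmark.LEFT_SHOULDER],
                          lm[mp_pose.PoseLandmark.LEFT_ELBOW],
                          lm[mp_pose.PoseLandmark.LEFT_WRIST])
            print(f"left elbow angle: {elbow:.0f} deg")
cap.release()
```

From a stream of angles like this you can derive the "small battery of things" mentioned earlier: stroke counts, range of motion, cadence, and so on.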
01:39:07.040 | And, you know, we're in an era where all you hear about is AI and AI tools, and there are tools that
01:39:13.600 | absolutely accelerate our capabilities as humans. But we gave the examples earlier of
01:39:20.240 | some of the LLMs. I was at a
01:39:27.840 | film premiere, and I was sitting next to a few students who happened to be from Berkeley,
01:39:32.480 | computer science and engineering double majors. And one of
01:39:37.840 | them, once he knew what I talk and care about, said, you know, I'm really worried about my peer
01:39:42.720 | group; my peers can't start a paper without ChatGPT. And it was a truth, but it was also
01:39:50.880 | a concern. So they understand the implications of what's happening. And that's on one level.
01:39:57.680 | We're in an era of agents everywhere. I think Reid has said, and a
01:40:03.840 | number of people have said, that we'll be using AI agents for
01:40:09.360 | everything at work in the next five years. And some of those agents we need to use: they will
01:40:16.240 | accelerate capability, they will accelerate short-term revenue. But they also will
01:40:23.360 | diminish workforce cognitive skill. And as a user of agents in any
01:40:30.800 | environment, or as an owner of companies employing agents, you have to think hard about
01:40:36.320 | the near-term and long-term ramifications. It doesn't mean you don't use your agents in places
01:40:41.920 | where you need to, but without the germane cognitive load, there is a different
01:40:48.480 | dependence that you have to account for down the road. You also have to think about
01:40:53.360 | how you engage with the right competence to keep your humans engaged with
01:41:00.960 | developing their cognitive skills and their germane cognitive load, their mental schemas,
01:41:06.560 | to be able to support your systems down the road. Let's talk more about digital twins. Sure. I don't
01:41:15.920 | think this concept has really landed squarely in people's minds as a specific thing. I think
01:41:22.640 | people hear AI and know what AI is, more or less; they hear about a smartphone and obviously know what a
01:41:27.360 | smartphone is, everyone uses one, it seems. But what is a digital twin? I think when people hear the word twin,
01:41:35.120 | they think it's a twin of us. Earlier, you pointed out that's not necessarily the case. It can be
01:41:39.920 | a useful tool for some area of our life, but it's not a replica of us, correct?
01:41:47.280 | Not at all in the ways that I think are most relevant. Maybe there are some
01:41:51.440 | side cases that think about that. So first, two things to think about. One,
01:41:57.680 | when I talk about digital twins to companies and such, I like to frame it on how it's being used and
01:42:04.800 | the immediacy of the data from the digital twin. So let's go back 50 years to an example of a digital twin
01:42:13.600 | that we still use: air traffic control. When an air traffic controller sits down and looks at
01:42:18.960 | a screen, they're not looking at a spreadsheet. They're looking at a digitization
01:42:22.960 | of information about physical objects that is meant to give them fast reaction times and make them understand
01:42:29.840 | the landscape as effectively as possible. We would call that situational awareness. I've got to take in
01:42:34.720 | data about the environment around me, and I've got to be able to act on it as rapidly as
01:42:39.760 | possible to make the right decisions that mitigate anything
01:42:45.440 | determined to be a problem or a risk, right? And so that's how you're trying to engage a human system.
01:42:51.040 | The visualization of that data is important, or, it doesn't have to be visualization,
01:42:56.080 | the interpretation of it. And it's not the raw data. It's, again, how that data is
01:43:01.040 | represented. You want the key, most salient information,
01:43:07.760 | in this case about planes, to be able to be acted on by that human or even an autonomous system,
01:43:16.160 | right? Could you give me an example in a more typical home environment? We're both into
01:43:21.920 | reefing. I built an aquacultured reef in my kitchen, partly because I have a child and I
01:43:29.920 | wanted her to understand, I love it myself, so don't get that wrong, it wasn't just for her,
01:43:36.240 | but to understand sort of the fragility of the ecosystems in the ocean and the things we
01:43:41.280 | need to worry about and care about. And initially when I started, and maybe
01:43:49.520 | this is not something you encountered, but when you build a reef tank and keep saltwater fish,
01:43:56.880 | you're doing a couple of things. You're doing chemical measurements by hand, usually weekly or
01:44:04.240 | bi-weekly. There are something like 10 different chemicals that you're measuring, and I
01:44:10.320 | would have my daughter doing that so that she would do the science part of it. You
01:44:15.600 | know the ranges, the tolerances you have, and you're also observing this ecosystem and
01:44:22.880 | looking for problems. And by the time you see a problem, you're reacting to that problem. And I can
01:44:28.480 | tell you, it was very unsuccessful. There's lots of error and noise in human measurements.
01:44:34.320 | You don't have the right resolution of measurements; by resolution, I mean every
01:44:41.200 | few days is not enough to track a problem. You also have the issue that you're reactive instead
01:44:49.360 | of being proactive. You're just not sensing things, and the point at which something is
01:44:54.000 | visible to you is probably too late to do anything about it. So, if you look at my fish tank right now,
01:44:59.760 | or my reef tank right now, I have a number of digital sensors in it. I have dashboards. I can
01:45:05.840 | track a huge chemical assay in real time, so that I can go back and look at the data. I
01:45:11.360 | can see, oh, there was a water change there. Oh, the RO/DI reservoir did something.
01:45:16.240 | I can tell what's happening by looking at the data. And, you know this, you've got the same thing:
01:45:22.480 | the spectrum of your lights is on a cycle that's representative of the environment that the
01:45:28.560 | corals you're aquaculturing, their deterministic systems,
01:45:34.000 | are looking for, right? And so you've built this ecosystem where, when I look at my dashboards, I have
01:45:40.320 | a digital twin of that system. My tank is very stable. My tank knows what's wrong, what's
01:45:46.640 | happening. I can look at the data and understand that an event happened somewhere that could have been
01:45:52.160 | mitigated, or I can understand that something's wrong quickly, before it even shows up.
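The shift she describes, from weekly hand tests to continuous sensing, is what turns the tank from reactive to proactive: you can alert on a drifting trend before a value ever leaves tolerance. A minimal sketch of that idea, with invented alkalinity numbers and thresholds and no real reef-controller API:

```python
# Watch a sensor stream; flag a steady drift before it leaves tolerance.
from collections import deque

TOLERANCE = (7.5, 11.0)   # example alkalinity range, dKH

def drifting(window, slope_limit=0.05):
    """Least-squares slope over a recent window; flag a steady trend."""
    n = len(window)
    xbar, ybar = (n - 1) / 2, sum(window) / n
    slope = (sum((i - xbar) * (y - ybar) for i, y in enumerate(window))
             / sum((i - xbar) ** 2 for i in range(n)))
    return abs(slope) > slope_limit

readings = deque(maxlen=12)  # e.g., the last 12 hourly samples
for sample in [9.0, 8.9, 8.9, 8.8, 8.7, 8.6, 8.5, 8.4, 8.3, 8.2, 8.1, 8.0]:
    readings.append(sample)
    if not TOLERANCE[0] <= sample <= TOLERANCE[1]:
        print(f"ALERT: {sample} dKH out of range")
    elif len(readings) == readings.maxlen and drifting(readings):
        print(f"WARNING: alkalinity trending down at {sample} dKH; act early")
```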
01:45:57.280 | It's amazing. I mean, for people who aren't into reefing, consider a different example.
01:46:02.400 | Multiple people in my life are soon to have kids. Most everybody
01:46:08.640 | nowadays has a camera on the sleeping environment of their kids, so that if their kid
01:46:14.240 | wakes up in the middle of the night, they can see it and hear it. So, camera and microphone:
01:46:19.520 | do you think we either have now, or will soon have, AI tools that will help us better understand the
01:46:26.320 | health status of infants? Parents learn intuitively over time, based on diaper changes,
01:46:33.840 | cries, frequency of illnesses, et cetera, how well their
01:46:41.840 | kids are doing before the kids can communicate it. Do you think AI can help parents be better
01:46:47.040 | parents by giving real-time feedback on the health information of their kids? Not just whether they're
01:46:53.120 | awake or asleep or in some sort of trouble, but really help us adjust our care of our young;
01:47:00.000 | what's more important for our species than supporting the growth of our next generation?
01:47:05.680 | No, absolutely. But I'd go even further on the biological side. So, thinking about digital
01:47:11.600 | twins, and I'll get to babies in a moment: if you've ever bought a plane ticket, which
01:47:19.920 | any of us have today, that's a very sophisticated digital twin. Not the air traffic
01:47:26.320 | controller looking at planes, but the pricing models, the data that goes into driving that price in
01:47:33.200 | real time. You might be trying to buy a ticket, and you go back an hour or half an
01:47:38.720 | hour later and it's nearly double, or it's gone up. That's because it's using
01:47:42.880 | constant data from environments, from things happening in the world, from geopolitical issues, from things
01:47:48.000 | happening in the market, and that's driving that price. That is very much an AI-driven digital twin
01:47:54.720 | driving the value of that ticket. So there are places where we use
01:48:02.880 | digital twins. That's an example of something that's affecting our lives that we
01:48:06.880 | don't think about as a digital twin, but it is one. And then think about a different
01:48:12.080 | example, where you've got a whole sandbox model. The NFL might have a digital twin of every player
01:48:17.040 | in the NFL, right? They know the data; they're tracking that information.
01:48:22.080 | They know how people are going to perform in many situations. What do they care about? They want to
01:48:25.280 | anticipate that someone might be high-risk for an injury so that they can mitigate it.
01:48:29.920 | They're using those kinds of data. Absolutely. Yeah. Interesting. And I think the word twin is
01:48:33.920 | the misleading part. I feel like soon that nomenclature needs to be replaced,
01:48:39.280 | because people hear twin and they think a duplicate of yourself.
01:48:45.600 | Yes. Well, it's a duplicate of relevant data and information about yourself. But
01:48:53.360 | what's the purpose in emulating myself? It's to emulate the key things. So imagine me as a physical system.
01:49:00.640 | I'm going to digitize some of that data, right? And whatever data I have, it's how
01:49:07.680 | I interact with that data to make intelligent insights and feedback loops in the digital environment
01:49:14.080 | about how that physical system is going to behave. Right. Okay. So it's a digital representative.
01:49:18.720 | Yes. More than a digital twin. Yes. I'm not trying to split hairs. There are many digital
01:49:23.280 | twins. You live with lots of things
01:49:28.800 | the world's nomenclature would call a digital
01:49:34.880 | twin, but I like digital representative: it's informing some aspect of decision making, with
01:49:41.360 | many feedback loops. So I'm digitizing different things. In that situational awareness
01:49:46.480 | model, can I give a quick example? So imagine I can digitize an environment,
01:49:52.880 | right? I can digitize the space we're in right now. Would that be a digital twin? In
01:49:59.280 | situational awareness, first there's the state of: what are the sensor
01:50:06.000 | limitations, the acuity of the data I've actually brought in? That's like
01:50:10.720 | perception, same as with our sensory systems. Then there's comprehension. Comprehension would be:
01:50:16.400 | okay, that's a table, that's a chair, that's a person. Now I'm in the semantic units of
01:50:23.200 | relevance that the digitization takes. Then there's insight: what's happening in that environment?
01:50:29.120 | What do I do with that? And that's where things get interesting.
01:50:33.120 | That's where I think a lot of the future of AI products is, because then it's the
01:50:36.560 | feedback loops of what's happening with that input and that data. And it becomes
01:50:42.160 | interesting and important when you start having multiple layers of relevant data interacting
01:50:47.520 | that can give you the right insights about what's happening and what to anticipate in that
01:50:53.760 | space. But that's all about our situational awareness and intelligence in that environment.
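Her perception, comprehension, insight layering maps naturally onto a three-stage pipeline. A toy rendering with made-up detections for a room; nothing here comes from a real product:

```python
# Perception -> comprehension -> insight, as a minimal pipeline.
from dataclasses import dataclass

@dataclass
class Detection:          # perception: raw, sensor-limited data
    label: str
    confidence: float
    moving: bool

def comprehend(detections, min_conf=0.6):
    # comprehension: keep semantically meaningful units
    return [d for d in detections if d.confidence >= min_conf]

def insight(objects):
    # insight: what matters, what to anticipate, what to do
    movers = [o for o in objects if o.moving]
    if movers:
        return f"Track {', '.join(o.label for o in movers)}; anticipate contact."
    return "Scene stable; no action needed."

scene = [Detection("chair", 0.9, False),
         Detection("person", 0.95, True),
         Detection("shadow", 0.3, True)]   # low confidence: filtered out
print(insight(comprehend(scene)))
```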
01:50:58.320 | Yeah, I can see where these technologies could take us. I think for the general public right now,
01:51:05.120 | AI is super scary, because we hear most about AI developing its own forms of intelligence that turn
01:51:14.400 | on us. I think people are gradually getting on board with the idea that AI can be very useful. We have digital
01:51:21.120 | representatives already out there for us in these different domains.
01:51:24.320 | Absolutely. And I think being able to customize them for our unique challenges and our unique
01:51:29.760 | goals is really what's most exciting to me. I love that, because I think what I was trying
01:51:35.600 | to say is exactly what you said. Look, they are out there, and these are effectively digital twins.
01:51:40.080 | Every social media company you're interacting with effectively has a digital twin of you
01:51:45.760 | someplace. It's not to emulate your body; it's to emulate your behaviors in those
01:51:52.160 | spaces. Or you're using tools that have digital twins of you for things you
01:51:58.240 | do in your daily life. So the question is how we harness that for individual success,
01:52:05.040 | for understanding and agency over what that can mean for you. If the NFL is using it for a player, you can
01:52:12.640 | use it as an athlete, at any level, right? And it's that digitization of information
01:52:19.200 | that can feed you. For a baby, you can better understand a great deal about what's working well
01:52:25.360 | for them and what isn't. Not that your baby isn't always successful,
01:52:29.920 | I don't want to say that, but what maybe isn't working well for them.
01:52:36.160 | But I would tend to say the exciting places for digital twins really come once you start
01:52:45.200 | integrating data from different places that tell us about the success of our systems, and those are
01:52:53.600 | anchored with actual successes, right? I think you used an example of your mattress and sleep,
01:53:00.080 | or even, one I liked, I had three very focused work sessions. You may have used
01:53:05.200 | different words, Andy, but the idea is: okay, you've had those, but it's when you can correlate it with
01:53:11.920 | other systems and other outputs that it becomes powerful. That's the way a digital representative
01:53:16.560 | or a digital twin becomes more useful. It's thinking about the resolution of the data and
01:53:22.240 | where the data is coming from, meaning: is it biometric data, is it
01:53:26.720 | environmental data, is it the context of what else was happening during those
01:53:33.920 | work sessions? And how is that something I don't have to think about, but AI can help me
01:53:38.720 | understand where I'm successful and what else drove that success or that state? Because it's not
01:53:44.720 | just my success, it's intelligence. I like to call it situational intelligence. It's sort of the
01:53:49.840 | overarching goal that we want to have. And that involves my body and systems having
01:53:56.240 | situational awareness. But it's really a lot of integration of data, and AI is very
01:54:03.360 | powerful for optimizing that and giving us the insights. It doesn't just have to make
01:54:09.280 | systems behave; it can give us insights into how effectively we can act in those environments.
01:54:14.720 | Yeah. I think of AI as being able to see what we can't see.
01:54:19.280 | So for instance, suppose I had some sort of AI representative that paid attention to my work environment
01:54:26.560 | and to my ability to focus as I'm trying to do focused work. And suppose it turned out, obviously I'm making this up, that
01:54:34.800 | every time my air conditioner clicked over to silent or back on, it would break my focus
01:54:44.720 | for the next 10 minutes. Yes.
01:54:46.240 | And I wasn't aware of that. And by the way, for people listening, this is entirely plausible,
01:54:51.520 | because so many of our states of mind are triggered by cues that we're just fundamentally unaware of.
01:54:58.400 | Hmm. Or that it's always at the 35-minute mark that my eyes start to have to reread words or lines
01:55:06.960 | because somehow my attention is drifting. Or that it's paragraphs longer than a certain length.
01:55:15.280 | It's a near infinite space for us to explore on our own, but for AI to explore it, it's straightforward.
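Mechanically, finding that kind of hidden trigger is an event-correlation problem. A toy version with fabricated timestamps (minutes into a work session), comparing the lapse rate near an event with the background rate:

```python
# Do attention lapses cluster after an environmental event (HVAC clicks)?
hvac_clicks = [12, 47, 83, 120]                 # minutes into the session
lapses      = [14, 15, 49, 60, 84, 121, 122]    # moments focus broke

def lapse_rates(events, lapses, window=5, session_len=150):
    """Lapses per minute within `window` minutes after an event vs. elsewhere."""
    near = sum(any(0 <= lap - e <= window for e in events) for lap in lapses)
    covered = len(events) * window               # minutes "near" an event
    near_rate = near / covered
    base_rate = (len(lapses) - near) / (session_len - covered)
    return near_rate, base_rate

near_rate, base_rate = lapse_rates(hvac_clicks, lapses)
print(f"lapses/min within 5 min of a click: {near_rate:.2f} "
      f"vs elsewhere: {base_rate:.3f}")
```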
01:55:21.280 | Right. And so it can see through our literal, cognitive, and functional blind spots.
01:55:27.120 | And where people pay a lot of money right now to get information to get around their
01:55:31.200 | blind spots: things like when you have a pain and you don't know what it is, you go to this thing
01:55:35.440 | called a doctor. Or when you have a problem and you don't know how to sort it out, you might talk
01:55:42.880 | to a therapist, right? People pay a lot of money for that. I'm not saying AI should replace all of
01:55:47.520 | that, but I do think AI can see things that we can't see. Two examples to your point, which I love.
01:55:52.960 | The reading: potentially there's a point at which you're experiencing
01:55:57.760 | fatigue, and ideally, much like the fish tank, you want to be not reactive but
01:56:02.960 | proactive. You want to mitigate it, to stop. Your devices could have
01:56:08.080 | that integration of data and respond to give you feedback when your mental acuity,
01:56:13.360 | your vigilance, or just your effectiveness has waned, right? But also on the level of health,
01:56:19.440 | we know AI is huge for identifying a lot of different pathologies out of data that
01:56:29.120 | as humans we're just not that good at discerning. Our voice: in the last 10 years, we've become
01:56:35.200 | much more aware of the different pathologies that can be discerned from AI
01:56:42.960 | assessments of our speech: not what we say, but how we say it.
01:56:47.760 | Yeah. There's a lab up at the University of Washington, I think it's Sam Golden's lab, working on some
01:56:55.920 | really impressive algorithms to analyze speech patterns as a way to predict suicidality.
01:57:02.000 | Oh, interesting. And to great success, where people don't realize that they're drifting in that
01:57:06.880 | direction. Yeah. And phones can potentially warn people, warn them themselves, right, that they're
01:57:15.280 | drifting in a particular direction. People who have cycles of depression or mania can know whether or not
01:57:21.280 | they're drifting into that. That can be extremely useful. They can decide who else gets that
01:57:26.480 | information. And it's all based on tonality at different times of day, stuff that
01:57:33.680 | even in a close relationship with a therapist over many years, they might not be able to detect
01:57:39.120 | if the person becomes reclusive or something of that sort. Absolutely. I mean, neural degeneration,
01:57:47.120 | things like Alzheimer's, shows up in speech through linguistic cues,
01:57:56.320 | sometimes 10 years before a typical clinical symptom would show up;
01:58:00.160 | it shows up in a short assessment of how people speak.
01:58:04.320 | They've definitely been able to show potential likelihood of psychosis too, and that's with
01:58:08.320 | the same kinds of linguistic cues, sometimes identified years before a typical clinical
01:58:13.520 | symptom would show up. And what I think is important for people to realize is
01:58:21.360 | it's not someone saying, "I don't remember." It's nothing like that. It's not the cues that you think are
01:58:26.720 | actually relevant. It's more like an individual says something, something like that, what I just did,
01:58:35.040 | which was I purposely stuttered, I started a word again, right? It's what we might call
01:58:41.760 | a stutter in how we're speaking, or sometimes the duration of the spaces between starting one sentence and the next.
01:58:48.560 | These are things that as humans we've adapted to not pick up on, because attending to them
01:58:53.520 | would make us ineffective in communication, but an algorithm can do so very well.
01:58:58.880 | Diabetes and heart disease both show up in voice. Diabetes shows up because you can pick up on dehydration
01:59:06.960 | in the voice; again, I'm a sound person at heart and in my past. If you look at the
01:59:15.520 | spectrum of sound, you're going to see changes, very consistent things
01:59:20.400 | in a voice, that show up with dehydration in the spectral content. And with heart
01:59:25.840 | disease, you get a sort of flutter that shows up. It's a proxy for things happening inside your body,
01:59:31.440 | problems, cardiovascular issues, but you're going to see them as certain
01:59:35.840 | modulatory fluctuations in certain frequency bands. And again, we don't walk around as
01:59:42.320 | a partner or a spouse or a child caretaking our parents listening for
01:59:49.680 | the four-kilohertz modulation, but an algorithm can. All of these are
01:59:54.960 | places where you can identify something and potentially mitigate it proactively,
02:00:02.080 | before there's a problem. And especially with neural degeneration,
02:00:06.640 | we're really just getting to a place where there are pharmacological opportunities
02:00:11.520 | to slow something down, and you want to find it as quickly as possible. You want to
02:00:17.520 | have that input so that you can do something about it.
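What does a "modulatory fluctuation in a frequency band" look like as a measurement? A synthetic illustration: isolate a narrow band around 4 kHz and read off the modulation rate of its amplitude envelope. The signal here is fabricated, and this is a sketch of the kind of low-level feature an algorithm can track, not a diagnostic method.

```python
# Band-limited amplitude-modulation measurement on a synthetic "voice".
import numpy as np
from scipy.signal import butter, sosfilt, hilbert

fs = 16000
t = np.arange(fs) / fs
# Fabricated signal: a 4 kHz component whose amplitude flutters at 8 Hz.
voice = (1 + 0.3 * np.sin(2 * np.pi * 8 * t)) * np.sin(2 * np.pi * 4000 * t)

sos = butter(4, [3500, 4500], btype="bandpass", fs=fs, output="sos")
band = sosfilt(sos, voice)
envelope = np.abs(hilbert(band))            # amplitude envelope of the band
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
print(f"dominant envelope modulation: {freqs[spectrum.argmax()]:.1f} Hz")
```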
02:00:23.120 | You asked me about the babies. The types of coughs we have tell us a lot about different pathologies. So
02:00:30.880 | for a baby, their cry. If you ask where, thinking about a digital twin,
02:00:36.720 | I would be most interested in using that information if I had children,
02:00:41.280 | and I do have a child: in the lowest-touch, highest-opportunity way,
02:00:48.480 | it's to identify potential pathologies or issues early, based on the natural
02:00:55.680 | sounds, utterances, and calls that are happening, to understand if there is
02:01:00.560 | something that could be helped, where you could
02:01:04.960 | proactively make something much better. Let's talk about you. Oh boy. And how you got into all of this
02:01:14.160 | stuff, because you're highly unusual in the neuroscience space. I recall when we were graduate
02:01:18.960 | students, you were working on auditory perception and physiology. And then years later,
02:01:23.600 | now you're involved with AI and neuroplasticity; you were at Dolby. What is, to you, the most
02:01:32.400 | interesting question that's driving all of this? What guides your choices about what to work on?
02:01:38.880 | The human-technology intersection, and perception, is my core. I say perception, but the world is data.
02:01:47.120 | How our brains take in the data that we consume, to optimize how we experience the world,
02:01:54.800 | is what I care about across all of what I've spent my time doing. And for me, technology is such a huge part
02:02:01.680 | of that. I like to innovate, I like to build things, but I also like to think about how we improve human performance.
02:02:09.120 | Core to improving human performance is understanding how we're different, not just how we're similar:
02:02:14.880 | the nuances of how our brains are shaped and how they're influenced. That's why I've spent so much time in neuroplasticity; it is at the intersection of everything. How are we changing, and how do we harness that?
02:02:27.680 | How do we make it something that we have agency over, whether it's through the technologies we build
02:02:33.760 | and innovate, to the point of: I want to feel better. I want to be successful. I don't want that to be left to surprise me. Right.
02:02:41.520 | So you asked me how I got there. I was a violinist back in the day. I'm still a violinist and music's a part of my life, but I was studying music and engineering when I was an undergrad.
02:02:55.840 | And I think we alluded to the fact that I have absolute pitch. Absolute pitch, for anyone that doesn't know, isn't anything that means I always sing in tune. What it means is I hear the world,
02:03:13.920 | I hear sound, like people see color. Okay. And I can't turn it off, really. I can kind of push it back.
02:03:22.000 | Wait, sorry. Don't we all hear sound like we see? I mean, I hear sounds and I see colors. Could you clarify what you mean?
02:03:26.400 | Okay, so when you walk down the street, your brain is going, that's red, that's black, that's blue, that's green. My brain's going, that's an A, that's a B, that's a G, that's an F, right?
02:03:34.480 | I see. You're categorizing. There's a categorical perception about it. And because of the nature of my exposure to sound in my life, I think, I also know what frequency it is, right?
02:03:46.480 | So I can say that's 350 hertz, or that's 400 hertz, or that's 442 hertz. And it has different applications. I can transcribe a jazz solo when I listen to it. That's a great party trick, but it's not necessarily a good thing for a musician, right?
02:04:04.560 | You know as well as I do that we all have different forms of categorical perception, usually for speech and language, like units of vowels or phonetic units. Especially vowels: you can hear many different versions of an E and still hear it as an E.
02:04:22.640 | That's what we would call categorical perception. And my brain does the same thing for a set of frequencies, to hear it as an A. That can be good at times, but when you're actually a musician, there's a lot more subtlety that goes into how you play with other people, what key you're in, the details.
02:04:47.040 | Like, if you ask me to sing Happy Birthday, I'm always going to sing it in the key of G if I'm left to my own devices, and I will get you there somehow if we start somewhere else.
02:04:55.840 | So, what happened to me when I was in music school, in conservatory and also engineering school: two things happened.
02:05:05.040 | I knew that I had to override my brain, because it was not allowing me the subtlety I wanted to play my Shostakovich or my chamber music; I was having to work too hard to override these sorts of categories of sounds I was hearing.
02:05:25.040 | And so I started playing early music, Baroque music. For anyone, I think I said earlier, A is a social construct.
02:05:33.840 | Today, as a standard, A is typically set at 440 hertz.
02:05:38.840 | If you go back to the 1700s, A was 415 hertz in the Baroque era, and 415 hertz is effectively a G sharp.
02:05:49.840 | It's the difference between ah and ah, okay?
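A quick check of the numbers she gives: in equal temperament, each semitone multiplies frequency by 2^(1/12), so one semitone below A440 lands almost exactly on the Baroque A415, i.e., a modern G sharp.

```python
# Equal-temperament check: one semitone below A440 is ~415.3 Hz (G#).
A440 = 440.0
for steps, name in [(0, "A (modern)"), (-1, "G# / Baroque A")]:
    print(f"{name}: {A440 * 2 ** (steps / 12):.1f} Hz")
# A (modern): 440.0 Hz
# G# / Baroque A: 415.3 Hz
```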
02:05:53.640 | And what would happen when I was trying to override this: I was playing in an early music ensemble, and I would tune my violin up, and I would see A on the page and hear G sharp in my brain.
02:06:05.640 | And it was completely, I was terrible.
02:06:10.640 | It was really hard for my brain to override.
02:06:13.640 | And I mean, brass and wind players do this all the time.
02:06:17.440 | It's like transposition: they modulate to the key that they're in, and their brains have evolved, through their training and neuroplasticity, to not have the same sort of experience I had.
02:06:31.440 | Anyhow, long story long, I was also taking a neuroscience course.
02:06:37.440 | In this neuroscience course, we were reading papers about different kinds of map formation and neuroplasticity.
02:06:43.240 | And I read this paper by a professor at Stanford named Eric Knudsen.
02:06:48.240 | Eric Knudsen did a lot of seminal work for how we understand the auditory pathways, as well as how we form multisensory objects and the way the brain integrates data from across our modalities, meaning sight and sound.
02:07:05.240 | In this paper, he had identified cells in the brain that optimally responded, the receptive fields. A receptive field being: in all of that giant data set of the world, the set of data that optimally causes that cell to respond.
02:07:26.040 | And these cells cared about a particular location in auditory and visual space, which, frankly, for mammals, we don't have in the same way, because we can move our eyes back and forth in our sockets, unlike owls.
02:07:39.840 | And he studied owls.
02:07:40.840 | Owls have a very hardwired map of auditory-visual space.
02:07:44.840 | We, on the other hand: if I hear a click off to my right, I turn my head to the right.
02:07:47.840 | You turn your head, and it triggers a different, vestibulo-ocular, response that moves all of that, yes.
02:07:53.840 | But in this case, he had these beautiful hardwired maps of auditory-visual space.
02:07:58.240 | And then he would rear and raise these owls with prism glasses that effectively shifted their visual world by 15 degrees.
02:08:06.640 | And then, key to driving neuroplasticity, he would put them in, not stress, but let's say situations where they had to do something critical to their survival or their well-being.
02:08:22.640 | And so they would hunt and they would feed and do things like that with this 15-degree shift, and consequently he saw the auditory neurons,
02:08:34.040 | he saw their dendrites realign to the now 15-degree visually shifted map.
02:08:41.040 | And the realization that they developed a secondary map, aligned with the 15-degree shift of the prism glasses, while keeping their original map, was super interesting
02:08:52.240 | for understanding how our brains integrate data, feedback, and neuroplasticity.
02:08:57.640 | So, I go back to my Baroque violin, where I'm always out of tune, and I'm tuning it up, and I realized I had developed absolute pitch at A415.
02:09:10.640 | I had developed a secondary absolute pitch map, and then I would go play Shostakovich right after at A440, and I had that map.
02:09:18.640 | And I have nothing in between, but I could modulate between the two.
02:09:22.040 | And that's the point at which I said, I think my brain is a little weird, and I just did something that I need to go better understand.
02:09:30.040 | So that's how I ended up here as a neuroscientist.
02:09:33.040 | I know Eric's work really well.
02:09:35.040 | Our labs were next door.
02:09:36.040 | Our offices were next door.
02:09:37.440 | Yes, he's wonderful.
02:09:38.440 | He's retired now.
02:09:39.440 | He knows, I told him the story.
02:09:43.440 | I think one of my favorite things about those studies, which I think people will find interesting, is that if an animal, human or owl, has a displacement in the world, something's different, something changes, you need to adjust to it.
02:10:01.440 | It could be new information coming to you that you need to learn in order to perform your sport correctly, or to perform well in class, or an emotionally challenging situation that you need to adjust to.
02:10:13.840 | All of that can happen, but it happens much, much faster if your life depends on it.
02:10:22.240 | And we kind of intuitively know this, but one of my favorite things about his work is where he said, okay, yes, these owls can adjust to the prism shift.
02:10:30.840 | Their maps in the brain can change, but they sure as heck form much faster if you say, hey, in order to eat, in other words, in order to survive, these maps have to change.
02:10:42.840 | And I like that study so much because we hear all the time that it takes 29 days to form a new habit, or 50 days, or whatever it is.
02:10:53.240 | Actually, you can form a new habit as quickly as is necessary to form that new habit.
02:10:58.240 | And so the limits on neuroplasticity are really set by how critical it is.
02:11:01.840 | Yeah.
02:11:02.840 | And of course there are limits at the other end too: if you put a gun to my head right now and said, okay, remap your auditory world,
02:11:09.240 | I can't do that quickly.
02:11:13.240 | But it's a reminder to me, and thank you for bringing up Eric's work,
02:11:19.840 | that neuroplasticity is always in reach.
02:11:24.240 | If the incentives are high enough, we can do it.
02:11:27.640 | Yeah.
02:11:27.640 | And so I think with AI, or with technology generally, it's going to be very interesting. Our ability to form these new maps of experience, at least with smartphones, has been pretty gradual.
02:11:39.240 | I really see 2010 as kind of the beginning of the smartphone, and then now by 2025, we're in a place where most everyone, young and old, has integrated this new technology.
02:11:49.640 | I think AI is coming at us very fast, and it's unclear in what form it's coming at us, and where.
02:11:54.640 | And as you said, it's already here.
02:11:56.640 | And I think we will adapt for sure.
02:11:59.640 | We'll form the necessary maps.
02:12:01.640 | I think being very conscious of which maps are changing is so key.
02:12:05.640 | I mean, I think we're still doing a lot of cleanup of the detrimental aspects of smartphones.
02:12:12.040 | Short wavelength light late at night.
02:12:14.040 | Absolutely.
02:12:14.640 | Being in contact with so many people all the time, maybe not so good.
02:12:18.640 | I think what scares people, certainly me, is the idea that we're going to be doing a lot of error correction over the next 30 years, because we're going so fast with technology.
02:12:27.640 | Because maps can change really, really fast.
02:12:29.640 | Well, they do change.
02:12:30.640 | Sam Altman, I saw him say this, and it was actually a really good description.
02:12:38.040 | There's a group, Gen X, say, that is using AI as a tool that's sort of novel, interesting.
02:12:46.040 | Then you've got a different group: Millennials are using it as a search algorithm.
02:12:52.840 | Maybe that's even Gen X, but it's a little more deeply integrated.
02:12:56.240 | But then you go to younger generations, and it's an operating system.
02:13:00.840 | And it already is.
02:13:02.240 | And that has major consequences for neural structure, not just maps but also the neural processes for how we deal with information, how we learn.
02:13:13.240 | The idea that we are very plastic under pressure, absolutely.
02:13:17.640 | And that's where it gets interesting to talk about different species, too.
02:13:21.640 | We're talking about owls, and that was under pressure.
02:13:24.040 | But what is successful human performance, in training and all of these things?
02:13:29.240 | It's to make those probabilistic situations more deterministic, right?
02:13:33.640 | If you're training as an athlete, you're really trying to not have to think, and to have the fastest reaction time for very complex behaviors, given complex stimuli, situations, and contexts.
02:13:46.040 | That situational awareness, or physical behavior in those environments, you want it as fast as possible with as little cognitive load as possible.
02:13:56.040 | That execution is critical.
02:13:58.040 | You love looking across species, so do I, looking for the ways a brain is changing, or a species that can do something that is absolutely not what you would predict, something incredible:
02:14:15.440 | how it can evade a predator, how it can find a target, find a mate.
02:14:23.440 | It's doing things that are critical to its survival, much as you said.
02:14:27.840 | If I make something absolutely necessary for success, it's going to do it, you know?
02:14:35.840 | One of my favorite examples is a particular moth that echolocating bats predate on.
02:14:41.840 | Frankly, echolocating bats are sort of nature's engineered, amazing predatory species.
02:14:47.640 | Their brains, when you look at them, are just incredible.
02:14:51.640 | They have huge amounts of their brain dedicated to what's called a constant-frequency/FM sweep call.
02:15:00.440 | Some of the bats elicit a call that's sort of like, "Ooh, ooh," but really high,
02:15:07.040 | so we can't hear it.
02:15:09.040 | And what does that do for them?
02:15:10.040 | It's doing two things.
02:15:11.040 | One, that constant-frequency portion is allowing them to track the Doppler of a moving object.
02:15:17.040 | And it's so clever and sophisticated:
02:15:23.440 | they're subtly changing what frequencies they elicit the call at, so that the echo always comes back in the same frequency range, because that's where their heightened sensitivity is.
02:15:34.840 | So they're modifying their vocal cords to make sure the call comes back in the same range,
02:15:41.640 | and then they're tracking how much they've had to modify the call.
02:15:45.640 | Just so that people are on board.
02:15:47.440 | Yeah, bats echolocate.
02:15:49.040 | Yeah.
02:15:49.440 | They're sending out sound, and they can essentially see in their mind's eye.
02:15:55.840 | They can sense distance.
02:15:56.840 | They can sense the speed of objects.
02:15:58.640 | They can sense the shape of objects, by virtue of sounds being sent out and coming back.
02:16:02.440 | Absolutely.
02:16:02.840 | And they're shaping the sounds going out differently so that they can look at multiple objects simultaneously.
02:16:08.040 | They're shaping the sounds they send out so that whatever comes back is in their optimal neural range; they already have circuits that are really dedicated to these certain frequency ranges, so they don't have to go through more neural plasticity.
02:16:22.840 | So they send it out, and then they're keeping track of the deltas.
02:16:25.840 | They're keeping track of how much they've had to change it, and that's what tells them the speed.
02:16:30.440 | That constant frequency is a lot like the ambulance sound going by.
02:16:33.840 | That's the compression of sound waves that you hear as a "whoo" when things move past you at speed.
02:16:40.440 | That's the Doppler effect.
02:16:41.440 | And then the call usually also has a really fast FM, frequency-modulated, sweep. So one part is telling me the speed of the object,
02:16:51.640 | and another is telling me what the surface structure looks like, right?
02:16:56.040 | That FM sweep lets me get a sonic imprint of what's there, so I can tell topography.
02:17:02.440 | I can tell if there's a moth on a hard surface, right?
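The Doppler-compensation trick described above can be put in back-of-the-envelope form: choose the emission frequency so that the echo from a target closing at speed v comes back at the bat's preferred frequency. This is an idealized physics sketch; the reference frequency and speeds are illustrative, not species data.

```python
# Idealized two-way Doppler-shift compensation for an echolocating caller.
C = 343.0  # speed of sound in air, m/s

def echo_frequency(f_emit, v):
    """Echo from a target closing at speed v returns shifted upward."""
    return f_emit * (C + v) / (C - v)

def compensated_emission(f_ref, v):
    """Emit lower when closing, so the echo lands back on f_ref."""
    return f_ref * (C - v) / (C + v)

f_ref = 61_000.0           # illustrative "preferred" frequency, Hz
for v in (0.0, 2.0, 5.0):  # closing speeds, m/s
    f_e = compensated_emission(f_ref, v)
    print(f"v={v} m/s: emit {f_e:,.0f} Hz -> echo {echo_frequency(f_e, v):,.0f} Hz")
```

The delta between f_ref and the emitted frequency is exactly the quantity the bat "keeps track of" to read out the target's speed.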
02:17:07.040 | So what's beautiful about other species: you've got a little moth, and you've got nature's predatory marvel.
02:17:16.040 | And about 80 percent of the time, that moth gets away.
02:17:21.040 | Multiple things are going on.
02:17:22.040 | You can see it as almost an acoustic arms race happening between the two, and there's a lot of acoustic subterfuge from the moth.
02:17:28.640 | But there are also beautiful deterministic responses that they have.
02:17:32.640 | And deterministic behaviors, again, be it for an athlete, be it for effectiveness, being fast and quick in making good decisions that get you the right answer, are always important.
02:17:46.640 | Moths have just a few neurons for this. When that echolocating bat is flying, at a certain point, when those neurons start firing, the moth will start flying in more of a random pattern.
02:17:58.240 | You'll see the same thing with seals when there are great white sharks around, right?
02:18:01.240 | It's decreasing the probability that it's easy for the predator to continue to track you.
02:18:05.840 | So they'll fly in a random pattern.
02:18:07.840 | And then when their neurons saturate, when those calls get close enough, the moth will drop to the ground, with the idea that, assuming we don't live in cities, in the natural world the ground is wheat, grass.
02:18:23.840 | It's a difficult environment for an echolocating bat to locate you in, right?
02:18:29.440 | So that is just a deterministic behavior that will happen regardless.
02:18:33.440 | But then the interesting part is that their bodies act effectively as meta-reflectors, so that when the bat puts out its call, the energy of the call is deflected away from the moth's body.
02:18:48.040 | It's deflected away from critical areas.
02:18:51.640 | This is all happening at once. The changes in the physical body are interesting, but then it's the behavioral differences
02:19:03.640 | that are really key, right?
02:19:05.040 | It's how fast does that moth react?
02:19:07.040 | If it had to question, if it were cognitively responsive instead of deterministic in its behavior, it wouldn't escape, right?
02:19:15.640 | But it gets away.
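As described, the moth's escape logic is close to a hard-wired, two-threshold policy with no deliberation in the loop. A toy state machine makes the point; the decibel thresholds are invented (real moths rely on a couple of auditory neurons firing and then saturating):

```python
# Toy two-threshold escape policy, in the spirit of the moth's behavior.
def moth_policy(call_intensity_db):
    if call_intensity_db < 40:      # bat far away: neurons quiet
        return "fly straight"
    if call_intensity_db < 70:      # neurons firing: be unpredictable
        return "erratic, random flight"
    return "drop to the ground"     # neurons saturated: last resort

for db in (20, 55, 85):
    print(f"{db} dB bat call -> {moth_policy(db)}")
```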
02:19:17.640 | Yeah, I've never thought about bats and moths.
02:19:20.640 | I was about to say I never got the insect bug, no pun intended.
02:19:27.640 | I never got the insect bug because I don't think of things in the auditory domain.
02:19:35.640 | I think of things in the visual domain, and some insects are very visual, but it's good for me to think about that.
02:19:42.640 | You know, one of my favorite people, although I never met him, was Oliver Sacks, the neurologist and writer.
02:19:48.640 | And he claimed to have spent a lot of time just sitting in a chair and trying to imagine what life would be like as a bat, as a way to enhance his clinical abilities with patients suffering from different neurologic disorders.
02:20:02.640 | So when he would interact with somebody with Parkinson's, or with severe autism, or with locked-in syndrome, or any number of different deficits of the nervous system, he felt that he could go into their mind a bit to understand what their experience was like.
02:20:22.640 | And that would make him more effective at treating them.
02:20:26.640 | And he certainly was very effective at laying out their experience in ways that brought about a lot of compassion and understanding.
02:20:32.640 | He never presented a neural condition in a way that made you feel sorry for the person.
02:20:39.640 | It was always the opposite.
02:20:40.640 | And I should point out, not trying to be politically correct here, but when I say autistic: the patients he worked with were severely autistic, to the point of never being able to take care of themselves.
02:20:52.640 | We're not talking about just anywhere along the spectrum.
02:20:54.640 | We're talking about the far end of the spectrum: needing assisted living their entire lives, being extremely sensitive from a sensory standpoint, unable to go out in public, that kind of thing.
02:21:06.640 | We're not talking about people who are functioning with autism.
02:21:10.640 | So apparently thinking in the auditory domain was useful for him.
02:21:15.640 | So I should probably do that.
02:21:16.640 | So I have one final question for you, which is really two questions.
02:21:22.640 | First question, why did you sing to spiders?
02:21:26.640 | And second, what does that tell us about spider webs?
02:21:30.640 | Because I confess I know the answers to these questions, but I was absolutely blown away to learn what spider webs are actually for.
02:21:39.640 | And you singing to spiders reveals what they're for.
02:21:44.640 | So why did you sing to spiders?
02:21:46.640 | Two things.
02:21:47.640 | And you can watch me sing to a spider on a TED talk I gave a few years ago.
02:21:51.640 | We'll put a link to it.
02:21:53.640 | Okay.
02:21:54.640 | So maybe this comes back to the fact that I have absolute pitch, so I know what frequencies I'm singing.
02:22:01.640 | But I also recognize, by having absolute pitch, that my brain is just a little different.
02:22:05.640 | Again, you asked me what threads drive me.
02:22:07.640 | It's always been that we do experience the world differently, and I believe that everyone's success, and the success of our growth as humans, is partly dependent on how we use technology to improve and optimize each of us, with the different variables we each need, right?
02:22:26.640 | So different species and how they respond to sound is very interesting to me, much as, I know, Andy, you look at how different species respond to color and to information in the world, be it cuttlefish or such.
02:22:42.640 | I have jellyfish, too, and I can see how their pulsing rates change with different light colors, through their photoreceptors.
02:22:50.640 | It's very obvious when some are under stress versus in a more calm state.
02:22:57.640 | So it's about understanding the stimuli in our world that shape us.
02:23:01.640 | Those changes are a huge part of being human, in my perspective.
02:23:04.640 | In this case, the one I sing to happens to be an orb spider.
02:23:08.640 | And when I hit about 880 hertz, the A an octave above concert A, you will see the spider kind of dance.
02:23:13.640 | But this particular species, and not all spiders will do this, is predated on by echolocating bats and birds, so it makes sense that it tunes its web accordingly.
02:23:26.640 | And orb weavers are all over California.
02:23:28.640 | They show up a lot around Thanksgiving, October, November, for anyone out here on the West Coast.
02:23:35.640 | They're not bad spiders.
02:23:37.640 | They are not spiders you need to get rid of.
02:23:39.640 | They're totally happy spiders.
02:23:41.640 | There are some others that maybe you should worry about more.
02:23:44.640 | Anyhow, they tune their webs to resonate like a violin.
02:23:49.640 | And you'll see it: as I hit a certain frequency, it'll effectively tell me to go away.
02:23:56.640 | And it's a pretty interesting sort of deterministic response.
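For listeners who like to tinker: here is a minimal sketch, not from the episode, of the detection idea just described. It estimates the dominant frequency of a sound and checks it against a resonance band around the 880 Hz tone mentioned above; the band width and the threshold logic are illustrative assumptions, not measured spider-web acoustics.

```python
# A toy model of a web acting as a tuned detector: estimate the dominant
# frequency of an incoming sound and check it against an assumed resonance
# band centered on 880 Hz (the tone Poppy sings; A5, an octave above A440).
import numpy as np

RESONANCE_HZ = 880.0   # frequency of the sung tone from the episode
BANDWIDTH_HZ = 60.0    # assumed width of the web's resonant response

def dominant_frequency(samples: np.ndarray, sample_rate: int) -> float:
    """Return the peak frequency of a mono signal from its FFT magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def web_responds(samples: np.ndarray, sample_rate: int) -> bool:
    """True if the sound sits inside the web's (assumed) resonance band."""
    f0 = dominant_frequency(samples, sample_rate)
    return abs(f0 - RESONANCE_HZ) <= BANDWIDTH_HZ / 2

# Example: a pure 880 Hz tone, one second at 44.1 kHz, lands in the band.
sr = 44100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 880.0 * t)
print(web_responds(tone, sr))  # True
```

Swapping in a recorded vocal sweep instead of the synthetic tone would mimic the singing experiment.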
02:24:02.640 | Other insects do different things.
02:24:04.640 | The kind of funny thing was, when my daughter was, I think at the time, about two and a half or three,
02:24:12.640 | she kind of adopted asking me, whenever we would see spiders, if it was the kind we should sing to or the kind we shouldn't touch.
02:24:21.640 | And so, those were the two classes.
02:24:24.640 | So, amazing.
02:24:26.640 | So, if I understand correctly, these orb spiders use their web more or less as an instrument to detect certain sound frequencies in their environment.
02:24:36.640 | Resonances, absolutely.
02:24:37.640 | So that they can respond appropriately.
02:24:39.640 | Yeah.
02:24:40.640 | Either by raising their legs to protect themselves or to attack or whatever it is.
02:24:45.640 | That the spider web is a functional thing not just for catching prey.
02:24:50.640 | It's a detection device also.
02:24:52.640 | And we know that because when prey are caught in a spider web, they wiggle and then the spider goes over to it and wraps it and eats it.
02:24:58.640 | But the idea that it would be tuned to particular frequencies is really wild.
02:25:04.640 | Yeah, not just any vibration, right?
02:25:06.640 | You know, there's the idea that any vibration means, I know I've got food somewhere, I should go to that food source.
02:25:10.640 | But instead, it's: if I experience a threat or something, I'm going to behave accordingly.
02:25:16.640 | And that is a more selective response that I've tuned the web towards.
02:25:21.640 | It's so interesting because if I just transfer it to the visual domain, it's like, yeah, of course.
02:25:25.640 | Like if an animal, including us, sees something like a looming object coming at us, especially closer to dark, our immediate response is to either freeze or flee.
02:25:35.640 | Like that's just what we do.
02:25:36.640 | The looming response is one of the most fundamental responses, but that's in the visual domain.
02:25:40.640 | So the fact that there would be auditory cues that would bring about sort of deterministic responses seems very real.
02:25:47.640 | I feel like the wail of somebody in pain evokes a certain response.
02:25:53.640 | Yesterday, there was a lot of noise outside my window at night.
02:25:57.640 | And there was a moment where I couldn't tell were these shouts of glee or shouts of fear.
02:26:03.640 | And I kept listening.
02:26:04.640 | And then I heard this kind of high-pitched fluttering that came after the scream.
02:26:12.640 | And I realized these were kids playing in the alley outside my house.
02:26:15.640 | And I went and looked.
02:26:16.640 | I was like, oh, they're definitely playing.
02:26:18.640 | But I knew even before I went and looked, based on the kind of flutter of sound that came after the shriek.
02:26:27.640 | It was like... I can't reproduce the sound at that high a frequency.
02:26:32.640 | No, no, but that's super.
02:26:34.640 | But the idea that this would be true all the time is super interesting.
02:26:39.640 | We just don't tend to focus on our hearing unless, of course, somebody is blind, in which case they have to rely on it much more.
02:26:44.640 | So two interesting things to go with that.
02:26:46.640 | So, like crickets, for example: crickets have bimodal neurons that have peaks in two different frequency ranges for the same neuron.
02:26:55.640 | And each frequency range will elicit a completely different behavior.
02:27:00.640 | So you've got a peak at 6K and you've got a peak at 40K.
02:27:03.640 | And this is the same neuron.
02:27:06.640 | The cricket hears 6K from a speaker and runs over to it, because that's got to be my mate.
02:27:12.640 | And it hears 40K and it runs away.
02:27:14.640 | And, you know, it's very predictive behavior.
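As a rough sketch of that bimodal mapping, assuming Gaussian tuning curves with made-up widths rather than real cricket physiology, the same model "neuron" can drive two opposite behaviors depending on which peak a tone falls near:

```python
# A toy bimodal neuron: two sensitivity peaks, ~6 kHz (conspecific song ->
# approach) and ~40 kHz (bat-like ultrasound -> escape). The tuning widths
# and threshold are illustrative assumptions, not measured values.
import math

PEAKS = {
    "approach": (6_000.0, 1_500.0),   # (center Hz, assumed tuning width Hz)
    "escape":   (40_000.0, 5_000.0),
}

def response(freq_hz: float, center: float, width: float) -> float:
    """Gaussian-shaped firing-rate response of the model neuron to a pure tone."""
    return math.exp(-((freq_hz - center) ** 2) / (2 * width ** 2))

def behavior(freq_hz: float, threshold: float = 0.1) -> str:
    """Same neuron, two behaviors: whichever peak the tone drives wins."""
    scores = {name: response(freq_hz, c, w) for name, (c, w) in PEAKS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > threshold else "ignore"

print(behavior(6_000))   # approach: sounds like another cricket
print(behavior(40_000))  # escape: sounds like an echolocating bat
print(behavior(15_000))  # ignore: drives neither peak strongly
```

The point isn't the numbers; it's that a single tuned channel can implement two opposite, hard-wired behaviors.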
02:27:16.640 | I spent a good period of time working with a non-human primate species, marmosets.
02:27:25.640 | Marmosets are very interesting when you get to a more sophisticated neural system.
02:27:31.640 | But marmosets are very social.
02:27:35.640 | It's critical to their happiness.
02:27:37.640 | If you ever see a single marmoset in the zoo or something, that's a very unhappy animal.
02:27:41.640 | They're New World monkeys, native to Brazil, in the Amazon.
02:27:47.640 | But they're arboreal.
02:27:48.640 | They live in trees.
02:27:50.640 | And they're very social.
02:27:51.640 | So those traits can kind of be in conflict with each other, because you're in dense foliage, but yet you need to communicate.
02:27:59.640 | So they've evolved very interesting systems to be able to achieve what they needed to.
02:28:05.640 | For one, if you ever see marmosets, they're very stoic, unlike macaque monkeys, which often have a lot of visual expression of how they're feeling.
02:28:16.640 | Marmosets always look about the same, but their vocalizations are almost like birdsong.
02:28:24.640 | And they're very rich in the information that they're communicating.
02:28:29.640 | They also have a pheromonal system.
02:28:32.640 | Because you have to have ways, when one sense is compromised, for the other senses to rise up and help assure that what that species or system needs in order to thrive is going to happen.
02:28:52.640 | And in the case of marmosets, the dominant female effectively suppresses the ovulation of all the other females; she changes their biology.
02:29:00.640 | And you can take a female and put her in the same proximity, but now as part of a different group, and her biology will change.
02:29:10.640 | I mean, the pheromonal interactions that happen are very powerful, because those are signals that can travel even when I can't see you.
02:29:18.640 | One thing I noticed when I was working with them, and I never published it, because I like writing patents more than publishing papers.
02:29:26.640 | But these things are real, because I was studying pupillometry, understanding the power of their saccades.
02:29:33.640 | I could know what they were hearing based on their eye movements, right?
02:29:36.640 | So, marmosets have calls, and some of their calls are really antiphonal.
02:29:42.640 | They're to say, hey, are you out there?
02:29:44.640 | Am I alone?
02:29:45.640 | Who else is around?
02:29:46.640 | It's like texting for humans.
02:29:47.640 | Yeah, yeah.
02:29:48.640 | And sometimes it might be, oh, be careful, there's somebody around that we've got to watch out for.
02:29:56.640 | Maybe there's a leopard on the ground or something, right?
02:29:58.640 | And then sometimes it's like, you're in my face.
02:30:01.640 | Get out of here now, right?
02:30:03.640 | And those are three different things.
02:30:05.640 | And I can play that, and without hearing it myself, I can tell you exactly what's being heard, just from the eyes.
02:30:10.640 | In the case of the antiphonal, hey, are you out there?
02:30:12.640 | You see, like, the eye will just start scanning back and forth, right?
02:30:16.640 | Because that's the right movement.
02:30:17.640 | I'm looking for where's this coming from.
02:30:19.640 | Yeah, they paired the right eye movement with the right sound.
02:30:21.640 | Exactly.
02:30:22.640 | In the case of, look, there's something to be threatened by:
02:30:27.640 | You're going to see dilation, and you're also going to see some scanning, but it's not as slow.
02:30:31.640 | It's a lot faster, because there's a threat to me.
02:30:34.640 | And, you know, my autonomic system and my cognitive system are, like, reacting differently.
02:30:39.640 | And in the case of you're-in-my-face, even without seeing you, if I hear another sort of aggressive sound, I'm going to react.
02:30:49.640 | I'm not scanning anywhere, but my dilation is going to be fast, and I'm also going to be much more on top of things.
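A toy version of that readout, with made-up thresholds standing in for real eye-tracking data (the call names follow the three examples above; none of the numbers come from Dr. Crum's work), might look like this:

```python
# Infer which call type a marmoset heard from its oculomotor signature alone:
# antiphonal -> slow back-and-forth scanning; alarm -> dilation plus fast
# scanning; aggressive -> fast dilation with a locked gaze. Thresholds are
# illustrative assumptions, not measured pupillometry values.
from dataclasses import dataclass

@dataclass
class EyeState:
    scan_rate_hz: float     # how often gaze sweeps back and forth
    dilation_speed: float   # pupil dilation velocity, arbitrary units

def infer_call(eye: EyeState) -> str:
    """Map an oculomotor signature to the call most likely just heard."""
    if eye.dilation_speed > 0.8 and eye.scan_rate_hz < 0.5:
        return "aggressive"   # fast dilation, no scanning: 'you're in my face'
    if eye.dilation_speed > 0.4 and eye.scan_rate_hz >= 1.5:
        return "alarm"        # dilation plus rapid scanning: threat nearby
    if eye.scan_rate_hz > 0.2:
        return "antiphonal"   # slow searching: 'hey, are you out there?'
    return "unknown"

print(infer_call(EyeState(scan_rate_hz=0.4, dilation_speed=0.05)))  # antiphonal
print(infer_call(EyeState(scan_rate_hz=2.0, dilation_speed=0.6)))   # alarm
print(infer_call(EyeState(scan_rate_hz=0.1, dilation_speed=0.9)))   # aggressive
```

In practice this would be a classifier trained on real eye-tracking traces rather than hand-set thresholds.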
02:30:58.640 | But we do this as, you know, humans too, right?
02:31:01.640 | And you can walk into a business meeting, walk into a conference room, and there are these subtle cues that are constant.
02:31:08.640 | We don't always suppress them.
02:31:10.640 | We show them, whether we think we do or we don't.
02:31:12.640 | But when you look at species like that, there's a lot of sophistication in how their bodies are helping them be successful, even in a world or an environment that has a lot of things that could maybe come after them.
02:31:30.640 | So interesting to think about that in terms of our own human behavior and what we're optimizing for, especially as all these technologies come on board and are sure to come on board even more quickly.
02:31:44.640 | Poppy, thank you so much for coming here today to educate us about what you've done, what's here now, what's to come.
02:31:52.640 | We covered a lot of different territories, and I'm glad we did because you have expertise in a lot of areas, and I love that you are constantly thinking about technology development.
02:32:02.640 | And I drew a little diagram for myself that I'll just describe for people, because if I understood correctly, one of the reasons you got into neuroscience and research at all is this interface between inputs and us.
02:32:18.640 | And what sits in between those two things is this incredible feature of our nervous systems, which is neuroplasticity or what I sometimes like to refer to as self-directed plasticity because unlike other species, we can decide what we want to change and make the effort to adopt a second map of the auditory world or visual world or take on a new set of learnings in any domain.
02:32:44.640 | And we can do it if we put our mind to it, if the incentives are high enough, we can do it.
02:32:49.640 | And at the same time, neuroplasticity is always occurring based on the things we're bombarded with, like new technology.
02:32:54.640 | So we have to be aware of how we are changing and we need to intervene at times and leverage those things for our health.
02:33:02.640 | So thank you so much for doing the work that you do.
02:33:05.640 | Thank you for coming here to educate us about it, and keep us posted.
02:33:09.640 | We'll provide links to you singing to spiders and all the rest.
02:33:13.640 | My mind's blown.
02:33:14.640 | Thank you so much.
02:33:15.640 | Thank you, Andy.
02:33:16.640 | Great to be here.
02:33:17.640 | Thank you for joining me for today's discussion with Dr. Poppy Crum.
02:33:20.640 | To learn more about her work and to find links to the various resources we discussed, please see the show note captions.
02:33:26.640 | If you're learning from and or enjoying this podcast, please subscribe to our YouTube channel.
02:33:30.640 | That's a terrific zero cost way to support us.
02:33:32.640 | In addition, please follow the podcast by clicking the follow button on both Spotify and Apple.
02:33:37.640 | And on both Spotify and Apple, you can leave us up to a five-star review and you can now leave us comments at both Spotify and Apple.
02:33:44.640 | Please also check out the sponsors mentioned at the beginning and throughout today's episode.
02:33:48.640 | That's the best way to support this podcast.
02:33:50.640 | If you have questions for me or comments about the podcast or guests or topics that you'd like me to consider for the Huberman Lab podcast, please put those in the comments section on YouTube.
02:34:00.640 | I do read all the comments.
02:34:01.640 | For those of you that haven't heard, I have a new book coming out.
02:34:04.640 | It's my very first book.
02:34:05.640 | It's entitled Protocols, an Operating Manual for the Human Body.
02:34:09.640 | This is a book that I've been working on for more than five years and that's based on more than 30 years of research and experience.
02:34:15.640 | And it covers protocols for everything from sleep to exercise to stress control, protocols related to focus and motivation.
02:34:23.640 | And of course, I provide the scientific substantiation for the protocols that are included.
02:34:29.640 | The book is now available by presale at protocolsbook.com.
02:34:32.640 | There you can find links to various vendors.
02:34:35.640 | You can pick the one that you like best.
02:34:37.640 | Again, the book is called Protocols, an Operating Manual for the Human Body.
02:34:41.640 | And if you're not already following me on social media, I am Huberman Lab on all social media platforms.
02:34:46.640 | So that's Instagram, X, Threads, Facebook, and LinkedIn.
02:34:50.640 | And on all those platforms, I discuss science and science related tools, some of which overlaps with the content of the Huberman Lab podcast,
02:34:56.640 | but much of which is distinct from the information on the Huberman Lab podcast.
02:35:00.640 | Again, it's Huberman Lab on all social media platforms.
02:35:03.640 | And if you haven't already subscribed to our Neural Network newsletter, the Neural Network newsletter is a zero cost monthly newsletter that includes podcast summaries,
02:35:11.640 | as well as what we call protocols in the form of one to three page PDFs that cover everything from how to optimize your sleep,
02:35:17.640 | how to optimize dopamine, deliberate cold exposure.
02:35:20.640 | We have a foundational fitness protocol that covers cardiovascular training and resistance training.
02:35:24.640 | All of that is available completely zero cost.
02:35:27.640 | You simply go to Hubermanlab.com, go to the menu tab in the top right corner, scroll down to newsletter,
02:35:32.640 | and enter your email.
02:35:33.640 | And I should emphasize that we do not share your email with anybody.
02:35:37.640 | Thank you once again for joining me for today's discussion with Dr. Poppy Crum.
02:35:41.640 | And last, but certainly not least, thank you for your interest in science.
[Music]