Dr. Matthew MacDougall: Neuralink & Technologies to Enhance Human Brains | Huberman Lab Podcast


Chapters

0:00 Dr. Matthew MacDougall
4:05 Sponsors: HVMN, Levels, Thesis
7:38 Brain Function & Injury; Brain Tumor Treatment
13:52 Frontal Lobe Filter; Sleep Deprivation
19:00 Neuroplasticity, Pharmacology & Machines
22:10 Neuralink, Neural Implants & Injury, Robotics & Surgery
31:05 Sponsor: AG1 (Athletic Greens)
32:20 Neocortex vs. Deep Brain
36:45 Decoding Brain Signals
42:08 “Confidence Test” & Electrical Stimulation; RFID Implants
51:33 Bluetooth Headphones & Electromagnetic Fields; Heat
57:43 Brain Augmentation & Paralysis
60:51 Sponsor: InsideTracker
62:09 Brain Implants & Peripheral Devices
72:44 Brain Machine Interface (BMI), Neurofeedback; Video Games
82:13 Improving Animal Experimentation, Pigs
93:18 Skull & Injury, Traumatic Brain Injury (TBI)
99:14 Brain Health, Alcohol
103:34 Neuroplasticity, Brain Lesions & Redundancy
107:32 Car Accidents & Driver Alertness
110:00 Future Possibilities in Brain Augmentation & BMI; Neuralink
118:56 Zero-Cost Support, YouTube Feedback, Spotify & Apple Reviews, Sponsors, Momentous, Social Media, Neural Network Newsletter

Whisper Transcript

00:00:00.000 | - Welcome to the Huberman Lab Podcast,
00:00:02.280 | where we discuss science and science-based tools
00:00:04.880 | for everyday life.
00:00:05.900 | I'm Andrew Huberman,
00:00:10.280 | and I'm a professor of neurobiology and ophthalmology
00:00:13.380 | at Stanford School of Medicine.
00:00:15.320 | Today, my guest is Dr. Matthew MacDougall.
00:00:18.020 | Dr. Matthew MacDougall is the head neurosurgeon at Neuralink.
00:00:21.600 | Neuralink is a company whose goal is to develop technologies
00:00:24.800 | to overcome specific clinical challenges
00:00:26.920 | of the brain and nervous system,
00:00:28.700 | as well as to improve upon brain design,
00:00:31.120 | that is to improve the way that brains currently function
00:00:34.820 | by augmenting memory, by augmenting cognition,
00:00:37.700 | and by improving communication between humans
00:00:40.080 | and between machines and humans.
00:00:42.200 | These are all, of course, tremendous goals,
00:00:44.440 | and Neuralink is uniquely poised to accomplish these goals
00:00:48.240 | because they are approaching these challenges
00:00:50.400 | by combining both existing knowledge of brain function
00:00:53.780 | from the fields of neuroscience and neurosurgery
00:00:56.680 | with robotics, machine learning, computer science,
00:00:59.960 | and the development of novel devices
00:01:02.160 | in order to change the ways
00:01:03.660 | that human brains work for the better.
00:01:05.920 | Today's conversation with Dr. Matthew MacDougall
00:01:08.520 | is a truly special one
00:01:10.360 | because I and many others in science and medicine
00:01:13.460 | consider neurosurgeons the astronauts
00:01:16.120 | of neuroscience and the brain.
00:01:17.860 | That is, they go where others have simply not gone before
00:01:21.480 | and are in a position to discover incredibly novel things
00:01:24.880 | about how the human brain works
00:01:26.400 | because they are literally in there,
00:01:28.320 | probing and cutting, stimulating, et cetera,
00:01:31.080 | and able to monitor how people's cognition
00:01:34.120 | and behavior and speech changes
00:01:36.460 | as the brain itself is changed structurally and functionally.
00:01:39.920 | Today's discussion with Dr. MacDougall
00:01:41.760 | will teach you how the brain works
00:01:43.920 | through the lens of a neurosurgeon.
00:01:45.840 | It will also teach you about Neuralink's
00:01:47.760 | specific perspective about which challenges
00:01:50.000 | of brain function and disease are immediately tractable,
00:01:53.880 | which ones they are working on now, that is,
00:01:56.320 | as well as where they see the future
00:01:58.020 | of augmenting brain function for sake of treating disease
00:02:00.840 | and for simply making brains work better.
00:02:04.000 | Today's discussion also gets into the realm
00:02:06.040 | of devices in the peripheral nervous system.
00:02:08.200 | In fact, one thing that you'll learn
00:02:10.440 | is that Dr. MacDougall has a radio receiver implanted
00:02:14.040 | in the periphery of his own body.
00:02:16.240 | He did this not to overcome any specific clinical challenge,
00:02:19.340 | but to overcome a number of daily, everyday life challenges,
00:02:22.840 | and in some ways to demonstrate the powerful utility
00:02:25.820 | of combining novel machines, novel devices,
00:02:29.360 | with what we call our nervous system
00:02:31.520 | and different objects and technologies within the world.
00:02:34.640 | I know that might sound a little bit mysterious,
00:02:36.120 | but you'll soon learn exactly what I'm referring to.
00:02:38.780 | And by the way, he also implanted his family members
00:02:41.680 | with similar devices.
00:02:43.440 | So while all of this might sound
00:02:44.660 | a little bit like science fiction,
00:02:46.280 | this is truly science reality.
00:02:48.360 | These experiments, both the implantation
00:02:50.980 | of specific devices and the attempt
00:02:52.440 | to overcome specific movement disorders,
00:02:54.680 | such as Parkinson's and other disorders
00:02:57.560 | of deep brain function, as well as to augment the human brain
00:03:01.280 | and make it work far better than it ever has
00:03:03.720 | in the course of human evolution,
00:03:05.460 | are experiments and things that are happening now
00:03:07.860 | at Neuralink.
00:03:09.080 | Dr. MacDougall also generously takes us under the hood,
00:03:12.440 | so to speak, of what's happening at Neuralink,
00:03:14.740 | explaining exactly the sorts of experiments
00:03:16.600 | that they are doing and have planned,
00:03:18.320 | how they are approaching those experiments.
00:03:20.400 | We get into an extensive conversation
00:03:22.200 | about the utility of animal versus human research
00:03:24.980 | in probing brain function and in devising
00:03:27.560 | and improving the human brain and in overcoming disease
00:03:30.580 | in terms of neurosurgery and Neuralink's goals.
00:03:33.520 | By the end of today's episode,
00:03:34.720 | you will have a much clearer understanding
00:03:36.300 | of how human brains work and how they can be improved
00:03:39.420 | by robotics and engineering.
00:03:41.000 | And you'll have a very clear picture
00:03:42.800 | of what Neuralink is doing toward these goals.
00:03:45.360 | Dr. MacDougall did his medical training
00:03:47.160 | at the University of California, San Diego
00:03:49.420 | and at Stanford University School of Medicine.
00:03:51.940 | And of course is now at Neuralink.
00:03:54.020 | So he is in a unique stance to teach us
00:03:56.680 | about human brain function and dysfunction
00:03:59.660 | and to explain to us what the past, present and future
00:04:02.780 | of brain augmentation is really all about.
00:04:05.540 | Before we begin, I'd like to emphasize that this podcast
00:04:08.260 | is separate from my teaching and research roles at Stanford.
00:04:10.940 | It is, however, part of my desire and effort
00:04:13.220 | to bring zero-cost-to-consumer information about science
00:04:15.860 | and science-related tools to the general public.
00:04:18.500 | In keeping with that theme,
00:04:19.580 | I'd like to thank the sponsors of today's podcast.
00:04:22.420 | Our first sponsor is HVMN Ketone IQ.
00:04:25.740 | HVMN Ketone IQ increases blood ketones.
00:04:29.100 | I want to be very clear that I, like most people,
00:04:31.540 | have heard of the ketogenic diet,
00:04:32.980 | but I, like most people, do not follow a ketogenic diet.
00:04:36.060 | That is, I'm not in ketosis.
00:04:37.760 | However, most people don't realize that you can still benefit
00:04:40.080 | from increasing your blood ketones,
00:04:41.800 | which is what HVMN Ketone IQ does.
00:04:44.620 | I take Ketone IQ prior to doing really focused
00:04:47.780 | cognitive work, so I take it once in the afternoon,
00:04:51.500 | anytime I'm going to prepare for a podcast or do a podcast,
00:04:54.800 | or if I'm going to do some research or focus on a grant,
00:04:58.060 | anything that requires a high level of cognitive demand,
00:05:00.480 | and that's because ketones are the brain's preferred source
00:05:03.100 | of fuel, even if you're not following a ketogenic diet.
00:05:05.680 | If you'd like to try Ketone IQ,
00:05:07.340 | you can go to hvmn.com/huberman
00:05:10.720 | to save 20% off your order.
00:05:12.460 | Again, that's hvmn.com/huberman to save 20%.
00:05:16.620 | Today's episode is also brought to us by Levels.
00:05:19.380 | Levels is a program that lets you see how different foods
00:05:21.860 | and activities affect your health
00:05:23.640 | by giving you real-time feedback on your diet
00:05:25.680 | using a continuous glucose monitor.
00:05:28.100 | Nowadays, there's a lot of excitement
00:05:29.580 | about continuous glucose monitors,
00:05:31.500 | and Levels allows you to assess how what you eat
00:05:35.260 | and what combinations of foods you eat and exercise
00:05:38.260 | and sleep and things like alcohol,
00:05:40.660 | should you indulge in alcohol and things of that sort,
00:05:42.900 | how those impact your blood glucose.
00:05:45.200 | Now, it's very important that the cells of your body,
00:05:47.080 | and in particular, the cells of your nervous system,
00:05:49.860 | not experience levels of blood glucose
00:05:51.740 | that are too high or too low,
00:05:53.300 | so-called hyperglycemia or hypoglycemia.
00:05:56.040 | What Levels allows its users to do
00:05:58.200 | is to understand how their specific routines,
00:06:00.680 | food intake patterns, exercise, et cetera,
00:06:03.400 | impact their blood sugar levels.
00:06:04.980 | I, like most people who use Levels,
00:06:06.880 | find that there's a lot to learn and a lot to be gained
00:06:09.140 | by understanding these blood glucose patterns.
00:06:11.780 | If you're interested in learning more about Levels
00:06:13.520 | and trying a continuous glucose monitor yourself,
00:06:15.940 | you can go to levels.link/huberman.
00:06:18.640 | Right now, Levels is offering
00:06:19.980 | an additional two free months of membership.
00:06:22.260 | Again, that's levels.link, L-I-N-K/huberman
00:06:25.900 | to get two free months of membership.
00:06:27.780 | Today's episode is also brought to us by Thesis.
00:06:30.300 | Thesis makes custom nootropics.
00:06:32.580 | And as many of you have perhaps heard me say before,
00:06:34.820 | I am not a fan of the word nootropics
00:06:36.880 | because it literally means smart drugs.
00:06:39.380 | And the brain has neural circuits for focus.
00:06:42.900 | It has neural circuits for creativity,
00:06:44.480 | has neural circuits for task switching.
00:06:46.340 | It does not have neural circuits for quote unquote,
00:06:48.740 | being smart.
00:06:50.180 | Thesis understands this and has designed custom nootropics,
00:06:53.560 | each of which is designed to place your brain and body
00:06:55.700 | into a specific state,
00:06:56.980 | ideal for a particular type of work or physical effort,
00:07:00.240 | such as creativity or focus or clarity.
00:07:03.020 | If you'd like to try Thesis nootropics,
00:07:04.640 | you simply go to their website,
00:07:06.220 | you fill out a brief quiz,
00:07:07.620 | and they will design a custom starter pack
00:07:10.100 | so that you can assess which things work for you
00:07:12.080 | more or less well.
00:07:13.180 | And then they'll iterate with you over the course
00:07:15.140 | of the next few weeks or months
00:07:16.420 | to come up with the ideal nootropic kit for your needs.
00:07:19.680 | To get your own personalized nootropic starter kit,
00:07:21.600 | go online to takethesis.com/huberman.
00:07:24.660 | You can take that three minute quiz
00:07:25.940 | and they'll send you four different formulas
00:07:27.700 | to try in your first month.
00:07:28.780 | Again, that's takethesis.com/huberman
00:07:31.540 | and use the code Huberman at checkout
00:07:33.220 | to get 10% off your first box.
00:07:35.540 | And now for my discussion with Dr. Matthew MacDougall.
00:07:38.540 | Dr. MacDougall, welcome.
00:07:40.800 | - Good to be here.
00:07:41.640 | - Nice to see you, Andrew.
00:07:42.620 | - Great to see you again.
00:07:44.220 | We'll get into our history a little bit later,
00:07:46.620 | but just to kick things off,
00:07:48.340 | as a neurosurgeon and as a neuroscientist,
00:07:52.300 | could you share with us your vision of the brain
00:07:55.700 | as an organ as it relates to what's possible there?
00:08:00.300 | I mean, I think most everyone understands
00:08:02.460 | that the brain is, along with the body,
00:08:04.940 | the seat of our cognition, feelings,
00:08:06.620 | our ability to move, et cetera,
00:08:08.640 | and that damage there can limit our ability
00:08:10.780 | to feel the way we want to feel
00:08:12.480 | or move the way we want to move,
00:08:13.760 | but surgeons tend to view the world
00:08:16.600 | a little bit differently than most
00:08:18.060 | because as the not so funny joke goes,
00:08:21.620 | they like to cut and they like to fix
00:08:25.100 | and they like to mend and they, in your case,
00:08:27.800 | have the potential to add things into the brain
00:08:29.860 | that don't exist there already.
00:08:31.480 | So how do you think about and conceptualize the brain
00:08:34.440 | as an organ and what do you think is really possible
00:08:38.040 | with the brain that most of us don't already
00:08:41.500 | probably think about?
00:08:42.560 | - Yeah, that's a great question.
00:08:44.160 | Thinking about the brain as this three pound lump of meat
00:08:50.160 | trapped in a prison of the skull,
00:08:52.760 | it seems almost magical that it could create a human,
00:08:57.960 | a human set of behaviors and a life
00:09:01.000 | merely from electrical impulses.
00:09:04.760 | When you start to see patients and see, say,
00:09:07.160 | a small tumor eating away at a little part of the brain
00:09:10.620 | and see a very discrete function
00:09:14.000 | of that brain go down in isolation,
00:09:17.760 | you start to realize that the brain really is
00:09:20.840 | a collection of functional modules pinned together,
00:09:24.480 | duct taped together in this bone box attached to your head
00:09:29.480 | and sometimes you see very interesting failure modes.
00:09:37.040 | So one of the most memorable patients I ever had
00:09:41.100 | was very early on in my training.
00:09:43.280 | I was down at UC San Diego and saw a very young guy
00:09:48.120 | who had just been in a car accident.
00:09:49.720 | We had operated on him.
00:09:51.800 | And as is so often the case in neurosurgery,
00:09:55.400 | we had saved his life
00:09:56.800 | potentially at the cost of quality of life.
00:10:01.280 | When he woke from surgery
00:10:03.220 | with bilateral frontal lobe damage,
00:10:05.840 | he had essentially no impulse control left.
00:10:10.200 | And so we rounded on him after surgery,
00:10:15.120 | saw that he was doing okay to our first guess at his health.
00:10:20.120 | And we continued on to see our other patients
00:10:23.520 | and we were called back
00:10:25.040 | by his 80-year-old recovery room nurse
00:10:29.040 | saying, "You've got to come see your patient right away.
00:10:31.260 | Something's wrong."
00:10:32.800 | And we walked in to see him
00:10:33.960 | and he points at his elderly nurse
00:10:35.800 | and says, "She won't have sex with me."
00:10:39.520 | And it was apparent at that moment,
00:10:41.500 | his frontal lobes were gone
00:10:44.200 | and that person is never going to have
00:10:46.760 | reasonable human behavior again.
00:10:48.580 | And it's one of the most tragic ways
00:10:54.400 | to have a brain malfunction.
00:10:55.960 | But anything a brain does,
00:11:00.320 | anything from control of hormone levels in your body,
00:11:03.940 | to vision, to sensation,
00:11:06.280 | to the most obvious thing,
00:11:08.560 | which is muscle movement of any kind,
00:11:11.000 | from eye movement to moving your bicep,
00:11:14.600 | all that comes out of the brain.
00:11:16.260 | All of it can go wrong.
00:11:18.400 | Any of it, any part of it or all of it.
00:11:20.480 | So yeah, working with the brain
00:11:23.960 | is the substance of the brain as a surgeon,
00:11:27.160 | very high stakes,
00:11:29.160 | but once in a while you get a chance to really help.
00:11:31.980 | You get a chance to fix something
00:11:33.480 | that seems unfixable
00:11:35.820 | and you have Lazarus-like miracles,
00:11:39.060 | not too uncommonly.
00:11:41.520 | So it's extremely satisfying as a career.
00:11:44.540 | - Could you share with us
00:11:46.300 | one of the more satisfying experiences?
00:11:48.680 | - Sure.
00:11:49.520 | - Or perhaps the top contour
00:11:51.580 | of what qualifies as satisfying in neurosurgery?
00:11:56.060 | - Yeah.
00:11:56.900 | One of the relatively newer techniques that we do
00:12:01.260 | is if someone comes in with a reasonably small tumor
00:12:05.620 | somewhere deep in the brain that's hard to get to,
00:12:08.280 | the traditional approach to taking that out
00:12:11.220 | would involve cutting through a lot of good normal brain
00:12:14.060 | and disrupting a lot of neurons, a lot of white matter,
00:12:17.300 | that kind of the wires connecting neurons.
00:12:19.720 | Then the modern approach
00:12:22.860 | involves a two millimeter drill hole in the skull
00:12:26.180 | down which you can pass a little fiber optic cannula.
00:12:30.400 | And attach it to a laser
00:12:33.100 | and just heat the tumor up deep inside the brain
00:12:35.860 | under direct MRI visualization in real time.
00:12:39.580 | So this person is in the MRI scanner.
00:12:42.500 | You're taking pictures every second or so.
00:12:45.300 | As the tumor heats up, you can monitor the temperature
00:12:48.700 | and get it exactly where you want it,
00:12:50.420 | where it's gonna kill all those tumor cells,
00:12:52.720 | but not hurt hardly any of the brain surrounding it.
00:12:56.340 | And so not uncommonly nowadays,
00:12:58.920 | we have someone come in with a tumor
00:13:00.860 | that previously would have been catastrophic to operate on
00:13:04.020 | and we can eliminate that tumor
00:13:06.480 | leaving only a poke hole in their skin
00:13:10.100 | with almost no visible after-effects.
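
To make the real-time monitoring described above concrete: MRI thermometry returns a temperature estimate for the tissue every second or so, and the treatment software accumulates those readings into a thermal dose that predicts which cells will die. Below is a minimal sketch of that bookkeeping using the standard Sapareto-Dewey CEM43 model; the function, the example temperature trace, and the 240-minute threshold are illustrative assumptions for a generic laser ablation, not the actual software or parameters of the clinical systems being discussed.

```python
# Illustrative sketch: cumulative thermal dose (CEM43) from per-second
# MRI-thermometry readings, as used generally in laser thermal therapy.
# This is NOT the vendor software used clinically; the trace and threshold
# below are made-up numbers for demonstration only.

def cem43(temps_celsius, dt_seconds=1.0):
    """Cumulative equivalent minutes at 43 degC (Sapareto-Dewey model)."""
    total_minutes = 0.0
    for t in temps_celsius:
        r = 0.5 if t >= 43.0 else 0.25   # standard breakpoint constants
        total_minutes += (dt_seconds / 60.0) * r ** (43.0 - t)
    return total_minutes

# Hypothetical one-reading-per-second trace for a voxel inside the tumor.
trace = [37.0] * 10 + [45.0, 50.0, 55.0, 57.0, 57.0, 55.0, 50.0] + [40.0] * 10
dose = cem43(trace)

# A commonly cited ablation threshold is on the order of 240 CEM43;
# real systems apply their own validated criteria per tissue type.
print(f"dose = {dose:.1f} CEM43 -> {'ablated' if dose >= 240 else 'sub-lethal'}")
```
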
00:13:15.460 | - So that procedure that you just described
00:13:18.500 | translates into better clinical outcomes,
00:13:21.500 | meaning fewer, let's call them side effects
00:13:24.780 | or collateral damage.
00:13:26.020 | - Exactly right, yeah.
00:13:27.940 | We don't, even in cases that previously
00:13:31.380 | would have been considered totally inoperable,
00:13:33.580 | say a tumor in the brain stem
00:13:36.100 | or a tumor in a primary motor cortex
00:13:39.400 | or primary verbal areas, Broca's area,
00:13:43.660 | where we would have expected to either not operate
00:13:46.980 | or do catastrophic damage,
00:13:48.700 | those people sometimes now are coming out unscathed.
00:13:52.640 | - I'm very curious about the sorts of basic information
00:13:55.960 | about brain function that can be gleaned
00:13:58.140 | from these clinical approaches of lesions
00:14:01.840 | and strokes and maybe even stimulation.
00:14:06.700 | So for instance, in your example of this patient
00:14:08.860 | that had bilateral frontal damage,
00:14:10.680 | what do you think his lack of regulation reveals
00:14:14.980 | about the normal functioning of the frontal lobes?
00:14:17.720 | Because I think the obvious answer to most people
00:14:19.420 | is going to be, well, the frontal lobes
00:14:20.900 | are normally limiting impulsivity,
00:14:24.600 | but as we both know, because the brain has excitatory
00:14:27.940 | and inhibitory neurons as sort of accelerators
00:14:29.860 | and brakes on communication,
00:14:31.320 | that isn't necessarily the straightforward answer.
00:14:35.080 | It could be, for instance, that the frontal lobes
00:14:39.100 | are acting as conductors and are kind of important,
00:14:44.100 | but not the immediate players in determining impulsivity.
00:14:48.960 | So two questions really,
00:14:51.760 | what do you think the frontal lobes are doing?
00:14:53.360 | Because I'm very intrigued by this human
00:14:56.860 | expanded real estate.
00:14:58.460 | We have a lot of it compared to other animals.
00:15:00.460 | And more generally, what do you think damage
00:15:03.560 | of a given neural tissue means in terms of understanding
00:15:07.940 | the basic function of that tissue?
00:15:10.140 | - Yeah, it varies, I think, from tissue to tissue.
00:15:13.940 | But with respect to the frontal lobes,
00:15:16.060 | I think they act as sort of a filter.
00:15:18.840 | They selectively are saying shh backward
00:15:22.500 | to the rest of the brain behind them
00:15:25.080 | when part of your brain says that looks very attractive.
00:15:28.700 | I want to go grab it and take it
00:15:31.180 | out of the jewelry display case or whatever.
00:15:34.260 | The frontal lobes are saying you can
00:15:38.580 | if you go pay for it first, right?
00:15:40.420 | They're filtering the behavior.
00:15:41.620 | They're letting the impulse through maybe,
00:15:45.280 | but in a controlled way.
00:15:46.580 | This is very high level, very broad thinking
00:15:51.580 | about how the frontal lobes work.
00:15:53.260 | And that that patient I mentioned earlier
00:15:56.080 | is a great example of when they go wrong.
00:15:59.620 | He had this impulse, this sort of strange impulse
00:16:02.980 | to be attracted to his nurse
00:16:06.420 | that normally it would be easy for our frontal lobes
00:16:09.640 | to say this is completely inappropriate,
00:16:11.760 | wrong setting, wrong person, wrong time.
00:16:18.300 | In his case, he had nothing there.
00:16:19.980 | And so even the slightest inclination
00:16:22.940 | to want something came right out to the surface.
00:16:27.940 | So yeah, a filter calming the rest of the brain down
00:16:32.780 | from acting on every possible impulse.
00:16:35.500 | - When I was a graduate student,
00:16:36.620 | I was running what are called,
00:16:38.580 | you know what these are,
00:16:40.280 | but just to inform you of what are called acute,
00:16:41.980 | which are neurophysiological experiments
00:16:45.020 | that last several days
00:16:46.060 | because at the end you terminate the animal.
00:16:48.640 | This is, my apologies to those
00:16:51.000 | that are made uncomfortable by animal research.
00:16:53.140 | I now work on humans, so a different type of animal.
00:16:55.700 | But at the time we were running these acutes
00:16:57.660 | that would start one day
00:16:58.860 | and maybe end two or three days later.
00:17:01.020 | And so you get a lot of data,
00:17:02.220 | the animal's anesthetized and doesn't feel any pain
00:17:04.240 | the entire time of the surgery.
00:17:05.760 | But the one consequence of these experiments
00:17:08.400 | is that the experimenter, me and another individual
00:17:11.680 | are awake for several days
00:17:13.200 | with an hour of sleep here or an hour of sleep there.
00:17:15.900 | But you're basically awake for two, three days.
00:17:17.360 | Something that really I could only do in my teens and 20s.
00:17:20.700 | I was in my 20s at the time.
00:17:22.060 | And I recall going to eat at a diner
00:17:25.740 | after one of these experiments and I was very hungry.
00:17:29.240 | And the waitress walking by with a tray full of food
00:17:33.500 | for another table.
00:17:35.400 | And it took every bit of self-control
00:17:38.220 | to not get up and take the food off the tray.
00:17:40.700 | Something that of course is totally inappropriate
00:17:42.360 | and I would never do.
00:17:43.400 | And it must have been based on what you just said,
00:17:46.380 | that my forebrain was essentially going offline
00:17:49.660 | from the sleep deprivation.
00:17:51.660 | Because there was a moment there
00:17:53.300 | where I thought I might reach up and grab a plate of food
00:17:56.340 | passing by simply because I wanted it and I didn't.
00:18:01.020 | But I can relate to the experience of feeling
00:18:05.020 | like the shh response is a flickering in and out
00:18:09.340 | under conditions of sleep deprivation.
00:18:10.900 | So do we know whether or not sleep deprivation
00:18:12.500 | limits forebrain activity in a similar kind of way?
00:18:15.440 | - I don't know specifically if that effect
00:18:18.860 | is more pronounced in the forebrain
00:18:21.480 | as opposed to other brain regions.
00:18:22.900 | But it's clear that sleep deprivation
00:18:26.200 | has broad effects all over the brain.
00:18:29.000 | People start to see visual hallucinations.
00:18:31.020 | So the opposite end of the brain, as you know,
00:18:32.860 | the visual cortex in the far back of the brain is affected.
00:18:37.500 | People's motor coordination goes down
00:18:41.740 | after sleep deprivation.
00:18:43.060 | So I think if you force me to give a definitive answer
00:18:48.060 | on that question, I'd have to guess that the entire brain
00:18:53.200 | is affected by sleep deprivation.
00:18:55.760 | And it's not clear that one part of the brain
00:18:57.680 | is more affected than another.
00:18:59.240 | - So we've been talking about damage to the brain
00:19:02.460 | and inferring function from damage.
00:19:04.780 | We could talk a little bit about what I consider really
00:19:08.680 | the holy grail of the nervous system,
00:19:11.340 | which is neuroplasticity, this incredible capacity
00:19:14.220 | of the nervous system to change its wiring,
00:19:17.260 | strengthen connections, weaken connections,
00:19:19.460 | maybe new neurons, but probably more strengthening
00:19:21.560 | and weakening of connections.
00:19:23.420 | Nowadays, we hear a lot of excitement
00:19:25.820 | about so-called classical psychedelics like LSD
00:19:28.660 | and psilocybin, which do seem to quote unquote,
00:19:31.300 | open plasticity.
00:19:33.020 | They do a bunch of other things too,
00:19:34.140 | but through the release of neuromodulators like serotonin
00:19:37.340 | and so forth, how do you think about neuroplasticity
00:19:41.700 | and more specifically, what do you think the potential
00:19:44.940 | for neuroplasticity is in the adult?
00:19:47.740 | So let's say older than 25 year old brain
00:19:50.820 | with or without machines being involved?
00:19:55.460 | 'Cause in your role at Neuralink and as a neurosurgeon
00:19:58.860 | in other clinical settings, surely you are using machines
00:20:03.100 | and surely you've seen plasticity in the positive
00:20:06.520 | and negative direction.
00:20:07.620 | - Right.
00:20:08.460 | - What do you think about plasticity?
00:20:10.620 | What's possible there without machines?
00:20:13.220 | What's possible with machines?
00:20:15.700 | - So as you mentioned or alluded to,
00:20:18.340 | that plasticity definitely goes down in older brains.
00:20:21.760 | It is harder for older people to learn new things,
00:20:26.740 | to make radical changes in their behavior,
00:20:29.820 | to kick habits that they've had for years.
00:20:36.400 | Machines aren't the obvious answer.
00:20:38.820 | So implanted electrodes and computers
00:20:40.620 | aren't the obvious answer to increase plasticity necessarily
00:20:43.920 | compared to drugs.
00:20:46.140 | We already know that there are pharmacologics,
00:20:48.940 | some of the ones you mentioned, psychedelics,
00:20:51.320 | that have a broad impact on plasticity.
00:20:54.140 | Yeah, it's hard to know which area of the brain
00:20:56.060 | would be most potent as a stimulation target
00:20:59.300 | for an electrode to broadly juice plasticity
00:21:03.880 | compared to pharmacologic agents that we already know about.
00:21:08.880 | I think with plasticity in general,
00:21:13.380 | you're talking about the entire brain,
00:21:14.740 | you're talking about altering a trillion synapses
00:21:19.540 | all in a similar way in their tendency to be rewireable,
00:21:24.540 | their tendency to be up or down weighted
00:21:27.860 | and an electrical stimulation target in the brain
00:21:33.780 | necessarily has to be focused.
00:21:36.300 | With a device like potentially Neuralink's,
00:21:38.540 | there might be a more broad ability to steer current
00:21:42.820 | to multiple targets with some degree of control,
00:21:46.100 | but you're never going to get that broad target ability
00:21:51.100 | with any electrodes that I can see coming in our lifetimes.
00:21:56.800 | Let's just say that would be coding the entire surface
00:21:59.860 | and depth of the brain the way that a drug can.
00:22:03.020 | And so I think plasticity research will bear the most fruit
00:22:07.260 | when it focuses on pharmacologic agents.
00:22:10.620 | - I wasn't expecting that answer
00:22:12.100 | given that you're at Neuralink.
00:22:13.780 | And then again, I think that all of us, me included,
00:22:18.180 | need to take a step back and realize that
00:22:21.020 | while we may think we know what is going on at Neuralink
00:22:25.240 | in terms of the specific goals and the general goals,
00:22:28.200 | and I certainly have in mind,
00:22:30.160 | I think most people have in mind
00:22:31.500 | a chip implanted in the brain
00:22:33.420 | or maybe even the peripheral nervous system
00:22:35.280 | that can give people super memories
00:22:37.820 | or some other augmented capacity.
00:22:41.380 | We really don't know what you all are doing there.
00:22:44.340 | For all we know, you guys are taking
00:22:46.960 | or administering psilocybin
00:22:48.540 | and combining that with stimulation.
00:22:49.880 | I mean, we really don't know.
00:22:50.780 | And I say this with a tone of excitement
00:22:53.860 | because I think that one of the things that's so exciting
00:22:58.140 | about the different endeavors that Elon
00:23:01.540 | has really spearheaded, SpaceX, Tesla, et cetera,
00:23:04.860 | is that early on, there's a lot of mystique.
00:23:08.940 | Mystique is a quality that is not often talked about,
00:23:12.740 | but it's, I think, a very exciting time
00:23:17.220 | in which engineers are starting to toss up big problems
00:23:22.220 | and go for it.
00:23:23.460 | And obviously, Elon is certainly among the best,
00:23:26.480 | if not the best, in terms of going really big.
00:23:28.900 | I mean, Mars seems pretty far to me.
00:23:31.060 | Electric cars all over the road nowadays
00:23:33.360 | are very different than the picture a few years ago
00:23:35.660 | when you didn't see so many of them, rockets and so forth,
00:23:40.160 | and now the brain.
00:23:41.140 | So to the extent that you are allowed,
00:23:44.620 | could you share with us what your vision
00:23:47.380 | for the missions at Neuralink are
00:23:52.040 | and what the general scope of missions are?
00:23:54.240 | And then if possible, share with us
00:23:57.140 | some of the more specific goals.
00:23:58.840 | I can imagine basic goals of trying to understand the brain
00:24:01.620 | and augment the brain.
00:24:02.460 | I could imagine clinical goals of trying to repair things
00:24:05.340 | in humans that are suffering in some way,
00:24:07.620 | or animals for that matter.
00:24:09.020 | - Yeah, it's funny what you mentioned.
00:24:11.660 | Neuralink and I think Tesla and SpaceX before it
00:24:17.580 | end up being these blank canvases
00:24:19.760 | that people project their hopes and fears onto.
00:24:22.700 | And so we experience a lot of upside in this.
00:24:26.340 | People assume that we have superpowers
00:24:29.860 | in our ability to alter the way brains work
00:24:32.400 | and people have terrifying fears
00:24:34.600 | of the horrible things we're gonna do.
00:24:36.940 | For the most part, those extremes are not true.
00:24:39.920 | We are making a neural implant.
00:24:43.780 | We have a robotic insertion device
00:24:47.100 | that helps place tiny electrodes
00:24:50.880 | smaller than the size of a human hair
00:24:53.600 | all throughout a small region of the brain.
00:24:56.160 | In the first indication that we're aiming at,
00:25:00.680 | we are hoping to implant a series of these electrodes
00:25:04.960 | into the brains of people
00:25:07.220 | that have had a bad spinal cord injury.
00:25:09.440 | So people that are essentially quadriplegic,
00:25:11.460 | they have perfect brains,
00:25:13.420 | but they can't use them to move their body.
00:25:15.860 | They can't move their arms or legs.
00:25:17.900 | - Because of some high level spinal cord damage.
00:25:20.160 | - Exactly right.
00:25:21.060 | And so this pristine motor cortex up in their brain
00:25:25.740 | is completely capable of operating a human body.
00:25:28.140 | It's just not wired properly any longer
00:25:30.980 | to a human's arms or legs.
00:25:33.220 | And so our goal is to place this implant
00:25:38.220 | into a motor cortex and have that person
00:25:41.180 | be able to then control a computer.
00:25:44.820 | So a mouse and a keyboard as if they had their hands
00:25:48.480 | on a mouse and a keyboard,
00:25:50.020 | even though they aren't moving their hands,
00:25:52.100 | their motor intentions are coming directly out of the brain
00:25:55.860 | into the device.
00:25:57.940 | And so they're able to regain their digital freedom
00:26:01.380 | and connect with the world through the internet.
00:26:04.600 | - Why use robotics to insert these chips?
00:26:08.140 | And the reason I asked that is that sure,
00:26:10.520 | I can imagine that a robot could be more precise
00:26:14.460 | or less precise, but in theory,
00:26:16.820 | more precise than the human hand, no tremor, for instance,
00:26:20.360 | more precision in terms of maybe even
00:26:26.200 | a little micro detection device on the tip of the blade
00:26:31.200 | or something that could detect a capillary
00:26:34.220 | that you would want to avoid and swerve around
00:26:36.500 | that the human eye couldn't detect.
00:26:38.640 | And you and I both know, however, that no two brains,
00:26:42.840 | nor are the two sides of the same brain identical.
00:26:47.400 | So navigating through the brain
00:26:49.880 | is perhaps best carried out by a human.
00:26:53.800 | However, and here I'm going to interrupt myself again
00:26:56.320 | and say 10 years ago, face recognition
00:27:01.320 | was very clearly performed better by humans than machines.
00:27:06.860 | And I think now machines do it better.
00:27:10.700 | So is this the idea that eventually, or maybe even now,
00:27:14.360 | robots are better surgeons than humans are?
00:27:17.780 | - In this limited case, yes.
00:27:20.360 | These electrodes are so tiny and the blood vessels
00:27:23.820 | on the surface of the brain so numerous
00:27:25.600 | and so densely packed that a human physically can't do this.
00:27:30.480 | A human hand is not steady enough to grab
00:27:33.240 | this couple micron width loop
00:27:36.920 | at the end of our electrode thread
00:27:38.680 | and place it accurately, blindly, by the way,
00:27:43.140 | into the cortical surface accurately enough
00:27:46.100 | at the right depth to get through all the cortical layers
00:27:48.420 | that we want to reach.
00:27:49.540 | And I would love if human surgeons
00:27:54.600 | were essential to this process,
00:27:57.520 | but very soon humans run out of motor skills
00:28:02.980 | sufficient to do this job.
00:28:05.440 | And so we are required, in this case,
00:28:08.180 | to lean on robots to do this incredibly precise,
00:28:12.620 | incredibly fast, incredibly numerous placement
00:28:16.440 | of electrodes into the right area of the brain.
00:28:19.000 | - So in some ways, Neuralink is pioneering
00:28:21.240 | the development of robotic surgeons
00:28:23.380 | as much as it's pioneering the exploration
00:28:26.320 | and augmentation and treatment of human brain conditions.
00:28:29.460 | - Right, and as the device exists currently,
00:28:32.600 | as we're submitting it to the FDA,
00:28:34.880 | it is only for the placement of the electrodes,
00:28:38.860 | the robot's part of the surgery.
00:28:41.180 | I or another neurosurgeon still needs to do
00:28:44.680 | the more crude part of opening the skin and skull
00:28:47.160 | and presenting the robot a pristine brain surface
00:28:50.960 | to sew electrodes into.
00:28:53.180 | - Well, surely getting quadriplegics
00:28:55.600 | to be able to move again, or maybe even to walk again,
00:28:59.140 | is a heroic goal and one that I think everyone would agree
00:29:04.660 | would be wonderful to accomplish.
00:29:07.260 | Is that the first goal because it's hard but doable?
00:29:12.260 | - Right.
00:29:13.300 | - Or is that the first goal because you and Elon
00:29:16.660 | and other folks at Neuralink have a passion
00:29:19.880 | for getting paralyzed people to move again?
00:29:23.560 | - Yeah, broadly speaking, the mission of Neuralink
00:29:26.900 | is to reduce human suffering, at least in the near term.
00:29:30.300 | You know, there's hope that eventually there's a use here
00:29:33.600 | that makes sense for a brain interface to bring AI
00:29:38.600 | as a tool embedded in the brain that a human can use
00:29:43.900 | to augment their capabilities.
00:29:45.400 | I think that's pretty far down the road for us,
00:29:49.380 | but definitely on a desired roadmap.
00:29:51.940 | In the near term, we really are focused on people
00:29:54.580 | with terrible medical problems
00:29:56.200 | that have no options right now.
00:30:00.020 | With regard to motor control, you know, our mutual friend
00:30:05.020 | recently departed, Krishna Shenoy, was a giant
00:30:09.340 | in this field of motor prosthesis.
00:30:12.120 | It just so happens that his work was foundational
00:30:16.460 | for a lot of people that work in this area, including us,
00:30:19.280 | and he was an advisor to Neuralink.
00:30:21.340 | That work was farther along than most other work
00:30:26.700 | for addressing any function that lives on the surface
00:30:29.800 | of the brain.
00:30:31.140 | The physical constraints of our approach
00:30:33.420 | require us currently to focus on only surface features
00:30:36.940 | on the brain.
00:30:37.900 | So we can't, say, go to the really very compelling
00:30:42.900 | subsurface, deep functions that happen in the brain,
00:30:48.060 | like, you know, mood, appetite, addiction, pain, sleep.
00:30:53.060 | We'd love to get to that place eventually,
00:30:56.660 | but in the immediate future, our first indication
00:31:00.240 | or two or three will probably be brain surface functions
00:31:04.280 | like motor control.
00:31:06.020 | - I'd like to take a quick break and acknowledge
00:31:08.180 | one of our sponsors, Athletic Greens.
00:31:10.460 | Athletic Greens, now called AG1,
00:31:12.920 | is a vitamin mineral probiotic drink
00:31:15.280 | that covers all of your foundational nutritional needs.
00:31:18.180 | I've been taking Athletic Greens since 2012,
00:31:20.860 | so I'm delighted that they're sponsoring the podcast.
00:31:23.080 | The reason I started taking Athletic Greens
00:31:24.680 | and the reason I still take Athletic Greens
00:31:26.740 | once or usually twice a day
00:31:28.740 | is that it gets me the probiotics
00:31:30.660 | that I need for gut health.
00:31:32.340 | Our gut is very important.
00:31:33.460 | It's populated by gut microbiota
00:31:36.000 | that communicate with the brain, the immune system,
00:31:37.740 | and basically all the biological systems of our body
00:31:40.140 | to strongly impact our immediate and long-term health.
00:31:43.800 | And those probiotics in Athletic Greens
00:31:45.660 | are optimal and vital for microbiota health.
00:31:49.480 | In addition, Athletic Greens contains a number of adaptogens,
00:31:52.060 | vitamins, and minerals that make sure
00:31:53.460 | that all of my foundational nutritional needs are met,
00:31:56.420 | and it tastes great.
00:31:58.280 | If you'd like to try Athletic Greens,
00:31:59.740 | you can go to athleticgreens.com/huberman,
00:32:03.140 | and they'll give you five free travel packs
00:32:05.100 | that make it really easy to mix up Athletic Greens
00:32:07.420 | while you're on the road, in the car, on the plane, et cetera,
00:32:10.000 | and they'll give you a year's supply of vitamin D3 and K2.
00:32:13.420 | Again, that's athleticgreens.com/huberman
00:32:16.100 | to get the five free travel packs
00:32:17.480 | and the year's supply of vitamin D3 and K2.
00:32:20.780 | So for those listening, the outer portions of the brain
00:32:24.780 | are filled with, or consist of, rather, neocortex,
00:32:29.780 | so the bumpy stuff that looks like sea coral.
00:32:35.020 | Some forms of sea coral look like brains,
00:32:38.220 | or brains look like them.
00:32:39.540 | And then underneath reside a lot of the brain structures
00:32:43.440 | that control what Matt just referred to,
00:32:46.980 | things that control our mood, hormone output,
00:32:49.540 | how awake or asleep the brain is.
00:32:52.060 | And would you agree that those deeper regions of the brain
00:32:55.560 | have, in some ways, more predictable functions?
00:32:57.860 | I mean, that lesions there or stimulation there
00:32:59.900 | lead to more predictable outcomes
00:33:02.340 | in terms of deficits or improvements in function.
00:33:05.460 | - Yeah, in some way, yes.
00:33:07.580 | I mean, the deeper parts of the brain
00:33:09.140 | tend to be more stereotyped,
00:33:10.940 | as in more similar between species,
00:33:13.840 | than the outer surface of the brain.
00:33:16.840 | They're kind of the firmware or the housekeeping functions,
00:33:20.840 | to some degree, body temperature, blood pressure,
00:33:23.580 | sex motivation, hunger,
00:33:26.760 | things that you don't really need to vary dramatically
00:33:29.600 | between a fox and a human being.
00:33:31.520 | Whereas the outer, more reasoning functions
00:33:36.080 | of problem-solving functions between a fox and a human
00:33:40.100 | are vastly different, and so the physical requirements
00:33:43.160 | of those brain outputs are different.
00:33:46.180 | I think I heard Elon describe it as the human brain
00:33:49.200 | is essentially a monkey brain
00:33:51.160 | with a supercomputer placed on the outside,
00:33:53.760 | which sparked some interesting ideas
00:33:55.860 | about what neocortex is doing.
00:33:58.540 | We have all this brain real estate
00:34:00.140 | on top of all that more stereotyped function-type stuff
00:34:04.760 | in the deeper brain,
00:34:05.820 | and it's still unclear what neocortex is doing.
00:34:09.760 | In the case of frontal cortex, as you mentioned earlier,
00:34:12.960 | it's clear that it's providing some quieting
00:34:16.060 | of impulses, some context setting,
00:34:19.420 | rule setting, context switching.
00:34:22.040 | All of that makes good sense,
00:34:23.840 | but then there are a lot of cortical areas
00:34:25.600 | that sure are involved in vision or touch or hearing,
00:34:28.280 | but then there's also a lot of real estate
00:34:29.940 | that just feels unexplored.
00:34:32.800 | So I'm curious whether or not in your clinical work
00:34:35.940 | or work with Neuralink or both,
00:34:39.240 | whether or not you have ever encountered neurons
00:34:43.200 | that do something that's really peculiar and intriguing.
00:34:47.000 | And here I'm referring to examples
00:34:50.560 | that could be anywhere in the brain.
00:34:52.180 | Like where you go, wow, like these neurons,
00:34:54.040 | when I stimulate them or when they're taken away,
00:34:57.760 | lead to something kind of bizarre but interesting.
00:35:01.060 | - Yeah, yeah, the one that comes immediately to mind
00:35:05.560 | is unfortunately in a terrible case in kids
00:35:08.660 | that have a tumor in the hypothalamus
00:35:12.980 | that lead to what we call gelastic seizures,
00:35:16.160 | which is sort of an uncontrollable fit of laughter.
00:35:20.260 | There's been cases in the literature
00:35:21.920 | where this laughter is so uncontrollable
00:35:25.520 | and so pervasive that people suffocate
00:35:28.360 | from failing to breathe
00:35:30.140 | or they laugh until they pass out.
00:35:32.000 | And so you don't normally think of a deep structure
00:35:37.280 | in the brain like the hypothalamus
00:35:38.840 | as being involved in a function like humor.
00:35:43.640 | And certainly when we think about this kind of laughter
00:35:48.720 | in these kids with tumors,
00:35:50.920 | it's mirthless laughter is the kind of textbook phrase,
00:35:55.920 | humorless laughter.
00:35:58.240 | It's just a reflexive, almost zombie-like behavior.
00:36:05.540 | And it comes from a very small population of neurons
00:36:09.640 | deep in the brain.
00:36:11.360 | This is one of the other sort of strange loss of functions
00:36:16.360 | you might say is it's nice that you and I can sit here
00:36:20.660 | and not have constant disruptive fits of laughter
00:36:25.420 | coming out of our bodies, but that's a neuronal function.
00:36:29.680 | That's, thank goodness, due to neurons properly wired
00:36:33.760 | and properly functioning.
00:36:35.640 | And any neurons that do anything like this can be broken.
00:36:40.500 | And so we see this in horrifying cases like that
00:36:43.440 | from time to time.
00:36:44.440 | - So I'm starting to sense that there are two broad bins
00:36:49.260 | of approaches to augmenting the brain,
00:36:52.780 | either to treat disease or for sake of increasing memory,
00:36:56.240 | creating super brains, et cetera.
00:36:58.500 | One category you alluded to earlier, which is pharmacology.
00:37:02.160 | And you specifically mentioned the tremendous power
00:37:05.480 | that pharmacology holds, whether or not it's through
00:37:08.220 | psychedelics or through prescription drug
00:37:10.340 | or some other compound.
00:37:13.140 | The other approach are these little microelectrodes
00:37:16.200 | that are extremely strategically placed
00:37:20.480 | into multiple regions in order to play essentially
00:37:25.100 | a concert of electricity that is exactly right
00:37:28.240 | to get a quadriplegic moving.
00:37:31.960 | That sparks two questions.
00:37:33.860 | First of all, is there a role for and is Neuralink
00:37:37.580 | interested in combining pharmacology with stimulation?
00:37:41.340 | - So not immediately.
00:37:42.860 | Right now we're solely focused on the extremely hard,
00:37:46.360 | some might say the hardest problem facing humans right now
00:37:49.500 | of decoding the brain through electrical stimulation
00:37:52.680 | and recording.
00:37:54.100 | That's enough for us for now.
00:37:56.180 | - So to just give us a bit fuller picture of this,
00:37:59.660 | we were talking about a patient who can't move their limbs
00:38:01.900 | because they have spinal cord damage.
00:38:03.940 | The motor cortex that controls movement is in theory fine.
00:38:09.920 | Make a small hole in the skull and through that hole,
00:38:12.680 | a robot is going to place electrodes,
00:38:14.840 | obviously motor cortex, but then where, how?
00:38:18.580 | Is the idea that you're going to play a concert
00:38:20.400 | from different locations.
00:38:21.300 | You're going to hit all the keys on the piano
00:38:22.740 | in different combinations and then figure out
00:38:25.020 | what can move the limbs.
00:38:27.080 | What I'm alluding to here is I still don't understand
00:38:30.820 | how the signals are going to get out of motor cortex
00:38:32.700 | past the lesion and into and out to the limbs
00:38:35.540 | because the lesion hasn't been dealt with at all
00:38:37.580 | in this scenario.
00:38:38.420 | - So just to clarify there, I should emphasize
00:38:42.420 | we're not in the immediate future talking about
00:38:44.980 | reconnecting the brain to the patient's own limbs.
00:38:48.120 | That's on the roadmap,
00:38:49.340 | but it's way down the roadmap a few years.
00:38:52.500 | What we're talking about in the immediate future
00:38:55.580 | is having the person be able to control electronic devices
00:38:58.260 | around them with their motor intentions alone, right?
00:39:01.580 | - So prosthetic hand and arm or just mouse and keys on a--
00:39:05.460 | - Mouse and keys on a keyboard for starters.
00:39:07.740 | So you wouldn't see anything in the world move.
00:39:10.140 | As they have an intention, the patient might imagine,
00:39:14.440 | say flexing their fist or moving their wrist.
00:39:18.740 | And what would happen on the screen is the mouse would move
00:39:21.420 | down and left and click on an icon
00:39:23.460 | and bring up their word processor.
00:39:25.580 | And then a keyboard at the bottom of the screen
00:39:27.360 | would allow them to select letters in sequence
00:39:30.760 | and they could type.
00:39:32.420 | This is the easy place to start, easy in quotes.
00:39:37.420 | - I would say, because the transformation
00:39:41.000 | of electrical signals from motor cortex
00:39:43.460 | through the brainstem into the spinal cord
00:39:45.100 | and out to the muscles is somewhat known
00:39:48.580 | through a hundred years or more
00:39:49.700 | of incredible laboratory research.
00:39:52.480 | But the transformation,
00:39:53.820 | meaning how to take the electrical signals
00:39:55.540 | out of motor cortex and put it into a mouse or a robot arm,
00:40:00.540 | that's not a trivial problem.
00:40:02.560 | I mean, that's a whole other set of problems in fact.
00:40:05.520 | - Well, we're unloading some of that difficulty
00:40:08.720 | from the brain itself, from the brain of the patient
00:40:12.300 | and putting some of that into software.
00:40:15.940 | So we're using smarter algorithms
00:40:18.980 | to decode the motor intentions out of the brain.
00:40:22.840 | We have been able to do this in monkeys really well.
00:40:26.480 | So we have a small army of monkeys
00:40:29.320 | playing video games for smoothie rewards
00:40:33.960 | and they do really well.
00:40:35.120 | We actually have the world record of bit rate
00:40:39.040 | of information coming out of a monkey's brain
00:40:41.320 | to intelligently control a cursor on a screen.
00:40:45.160 | We're doing that better than anyone else.
00:40:47.160 | And again, thanks in no small part due to Krishna Shenoy
00:40:52.440 | and his lab and the people that have worked for him
00:40:55.720 | that have been helping Neuralink.
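
To make "decoding the motor intentions out of the brain" concrete: the approach pioneered in academic labs like Krishna Shenoy's is, roughly, to bin spike counts from each electrode into short windows and fit a model that maps those counts to intended cursor velocity, which is then integrated into a cursor position on the screen. The sketch below is a minimal, hypothetical version of that idea using ridge regression on synthetic data; it is not Neuralink's algorithm, and the channel counts, bin width, and variable names are made up for illustration.

```python
import numpy as np

# Synthetic stand-in for calibration data: during training, the participant
# (or monkey) attempts movements toward known targets, giving paired neural
# features and intended cursor velocities. All sizes here are made up.
rng = np.random.default_rng(0)
n_bins, n_channels = 2000, 96            # e.g. 50 ms bins, 96 recording channels
true_tuning = rng.normal(size=(n_channels, 2))    # each channel "prefers" a direction
intended_vel = rng.normal(size=(n_bins, 2))       # (vx, vy) the subject intends
spike_counts = np.clip(intended_vel @ true_tuning.T
                       + rng.normal(scale=2.0, size=(n_bins, n_channels)),
                       0, None)                   # non-negative noisy counts per bin

# Fit a ridge-regression decoder: intended velocity ~= spike_counts @ W
lam = 1.0
X, Y = spike_counts, intended_vel
W = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ Y)

# At run time, each new bin of counts becomes a velocity command that is
# integrated into a cursor position driving the on-screen mouse.
def decode_step(counts_bin, pos, dt=0.05):
    vel = counts_bin @ W
    return pos + vel * dt

pos = np.zeros(2)
for counts_bin in spike_counts[:20]:     # pretend these 20 bins arrive live
    pos = decode_step(counts_bin, pos)
print("decoded cursor position after ~1 s:", pos)
```

The cursor "bit rate" mentioned here is typically scored with a Fitts's-law-style throughput measure of how much target-selection information per second the decoded cursor conveys, which is why a better decoder translates directly into a higher number.
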
00:40:57.320 | But what you can't do with that monkey
00:41:01.060 | is ask him what he's thinking.
00:41:03.160 | You can't ask him.
00:41:04.600 | - We can ask him,
00:41:05.440 | but you won't get a very interesting answer.
00:41:07.320 | - Yeah.
00:41:08.580 | You can't tell him to try something different.
00:41:11.000 | You can't tell him to, hey, try the shoulder on this.
00:41:13.780 | Try the other hand and see if there's some cross body
00:41:18.040 | neuron firing that gives you a useful signal.
00:41:21.480 | Once we get the people,
00:41:22.940 | we expect to see what they've seen
00:41:25.540 | when they've done similar work in academic labs,
00:41:27.440 | which is the human can work with you
00:41:30.140 | to vastly accelerate this process
00:41:32.560 | and get much more interesting results.
00:41:35.020 | So one of the things out of Stanford recently
00:41:39.360 | is there was a lab that, with Krishna and Jamie Henderson
00:41:44.360 | and other people, decoded speech
00:41:47.200 | out of the hand movement area in the brain.
00:41:49.840 | So what we know is that there are multitudes
00:41:54.840 | of useful signals in each area of the brain
00:41:58.120 | that we've looked at so far.
00:41:59.840 | They just tend to be highly expressed
00:42:02.360 | for say hand movement in the hand area,
00:42:04.380 | but that doesn't mean only hand movement in the hand area.
00:42:07.740 | - Okay, so here's the confidence test.
00:42:11.200 | There's a long history dating back really prior
00:42:14.800 | to the 1950s of scientists doing experiments on themselves.
00:42:19.320 | - Sure.
00:42:20.920 | - Not because they are reckless,
00:42:22.360 | but because they want the exact sorts of information
00:42:25.700 | that you're talking about.
00:42:26.540 | The ability to really understand how intention
00:42:30.680 | and awareness of goals can shape outcomes in biology.
00:42:35.080 | If that is vague to people listening,
00:42:37.000 | what I mean here is that for many,
00:42:39.980 | probably hundreds of years, if not longer,
00:42:42.200 | scientists have taken the drugs they've studied
00:42:45.080 | or stimulated their own brain or done things
00:42:47.840 | to really try and get a sense of what the animals
00:42:50.480 | they work on or the patients they work on
00:42:52.040 | might be experiencing.
00:42:52.880 | Psychiatrists are sort of famous for this, by the way.
00:42:55.360 | I'm not pointing fingers at anybody,
00:42:56.560 | but psychiatrists are known to try the drugs
00:42:58.440 | that they administer.
00:42:59.640 | And some people would probably imagine that's a good thing
00:43:02.840 | just so that the clinicians could have empathy
00:43:06.180 | for the sorts of side effects and not so great effects
00:43:09.160 | of some of these drugs that they administer to the patients.
00:43:12.100 | But the confidence test I present you is,
00:43:17.500 | would you be willing or are you willing, if allowed,
00:43:22.500 | to have these electrodes implanted into your motor cortex?
00:43:27.520 | You're not a quadriplegic.
00:43:29.080 | You can move your limbs.
00:43:30.620 | But given the state of the technology at Neuralink now,
00:43:35.620 | would you do that?
00:43:36.760 | Or maybe in the next couple of years, if you were allowed,
00:43:40.160 | would you be willing to do that and be the person to say,
00:43:43.500 | hey, turn up the stimulation over there.
00:43:45.400 | I feel like I want to reach for the cup
00:43:47.840 | with that robotic arm, but I'm feeling some resistance.
00:43:50.920 | Because it's exactly that kind of experiment
00:43:54.040 | done on a person who can move their limbs
00:43:57.360 | and who deeply understands the technology
00:43:59.580 | and the goals of the experiment that I would argue
00:44:02.360 | actually stands to advance the technology fastest
00:44:05.360 | as opposed to putting the electrodes first into somebody
00:44:08.560 | who is impaired at a number of levels
00:44:11.920 | and then trying to think about why things aren't working.
00:44:14.840 | - And again, this is all with the goal
00:44:17.320 | of reversing paralysis in mind.
00:44:20.700 | But would you implant yourself with these microelectrodes?
00:44:24.120 | - Yeah, absolutely.
00:44:24.960 | I would be excited to do that.
00:44:27.440 | I think for the first iteration of the device,
00:44:30.300 | it probably wouldn't be very meaningful.
00:44:32.380 | It wouldn't be very useful
00:44:33.300 | because I can still move my limbs.
00:44:35.280 | And our first outputs from this are things
00:44:38.160 | that I can do just as easily with my hands.
00:44:40.920 | Moving a mouse, typing on a keyboard.
00:44:44.360 | We are necessarily making this device
00:44:47.520 | as a medical device for starters
00:44:50.040 | for people with bad medical problems and no good options.
00:44:53.900 | It wouldn't really make sense for an able-bodied person
00:44:58.080 | to get one in the near term.
00:44:59.680 | As the technology develops
00:45:02.840 | and we make devices specifically designed
00:45:05.900 | to perform functions that can't be done
00:45:09.160 | even by an able-bodied person,
00:45:11.560 | say eventually refine the technique
00:45:13.840 | to get to the point where you can type faster
00:45:16.960 | with your mind and one of these devices
00:45:20.000 | than you can with text to speech
00:45:22.200 | or speech to text and your fingers.
00:45:26.000 | That's a use case that makes sense
00:45:27.840 | for someone like me to get it.
00:45:29.400 | It doesn't really make sense for me to get one
00:45:32.020 | when it allows me to use a mouse slightly worse
00:45:35.320 | than I can with my hand currently.
00:45:37.840 | That said, the safety of the device,
00:45:39.680 | I would absolutely vouch for
00:45:41.880 | from the hundreds of surgeries
00:45:43.840 | that I've personally done with this.
00:45:45.960 | I think it's much safer
00:45:48.800 | than many of the industry-standard, FDA-approved surgeries
00:45:53.580 | that I routinely do on patients,
00:45:56.880 | surgeries that no one even thinks twice about because they're standard of care.
00:46:01.880 | Neuralink has already reached, in my mind,
00:46:05.200 | a safety threshold that is far beyond
00:46:08.760 | a commonly accepted safety threshold.
00:46:12.760 | - Along the lines of augmenting one's biological function
00:46:15.860 | or functions in the world,
00:46:16.840 | I think now's the appropriate time
00:46:18.200 | to talk about the small lump
00:46:21.640 | present in the top of your hand.
00:46:24.080 | For those listening, not watching,
00:46:25.320 | there's what looks like a small lump
00:46:27.920 | between Dr. McDougall's index finger and thumb,
00:46:32.920 | under the skin
00:46:37.720 | on the top of his hand.
00:46:40.400 | You've had this for some years now
00:46:41.760 | because we've known each other for,
00:46:43.340 | gosh, probably seven years now or so.
00:46:45.440 | And you've always had it in the time that I've known you.
00:46:47.640 | What is that lump and why did you put it in there?
00:46:52.280 | - Yeah, so it's a small writable RFID tag.
00:46:56.340 | - What's an RFID?
00:46:57.380 | What does RFID stand for?
00:46:58.740 | - Yeah, radio frequency identification.
00:47:00.800 | And so it's just a very small implantable chip
00:47:05.240 | that wireless devices can temporarily power.
00:47:09.960 | If you approach an antenna,
00:47:11.720 | they can power and send a small amount
00:47:13.760 | of data back and forth.
00:47:15.180 | So most phones have the capability
00:47:18.520 | of reading and writing to this chip.
00:47:21.280 | For years, it let me into my house.
00:47:25.060 | It unlocked a deadbolt on my front door.
00:47:27.360 | For some years, it unlocked the doors at Neuralink
00:47:31.560 | and let me through the various locked doors
00:47:35.640 | inside the building.
00:47:36.640 | It is writable.
00:47:40.080 | I can write a small amount of data to it.
00:47:41.700 | And so for some years in the early days of crypto,
00:47:46.700 | I had a crypto private key written on it
00:47:50.840 | to store a cryptocurrency that I thought was
00:47:53.660 | a dead offshoot of one of the main cryptocurrencies
00:47:57.680 | after it had forked.
00:47:59.480 | And so I put the private wallet key on there
00:48:01.940 | and forgot about it and remembered a few years later
00:48:05.220 | that it was there and went and checked
00:48:06.580 | and it was worth a few thousand dollars more
00:48:08.920 | than when I had left it on there.
00:48:11.200 | So that was a nice 21st-century version
00:48:13.480 | of finding change in the sofa.
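A minimal sketch of the storage arithmetic behind this, assuming a hypothetical tag with roughly 880 bytes of user-writable memory (real chip capacities vary) and a randomly generated stand-in for the wallet key:

```python
# Illustrative sketch only: checks whether a hex-encoded 32-byte private key
# fits in the user-writable memory of a small passive tag.
import os

TAG_USER_MEMORY_BYTES = 880          # assumed capacity; varies by chip

private_key = os.urandom(32)         # stand-in for a real wallet key
payload = private_key.hex().encode() # 64 ASCII characters = 64 bytes

print(f"Payload size: {len(payload)} bytes")
print(f"Fits on tag:  {len(payload) <= TAG_USER_MEMORY_BYTES}")
print(f"Space left:   {TAG_USER_MEMORY_BYTES - len(payload)} bytes")
```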
00:48:15.080 | - And then when you say you read it,
00:48:16.080 | you're essentially taking a phone or other device
00:48:18.560 | and scanning it over the lump in your hand, so to speak.
00:48:23.560 | And then it can read the data from there essentially.
00:48:27.160 | What other sorts of things could one put into these RFIDs
00:48:31.360 | in theory and how long can they stay in there
00:48:32.960 | before you need to take them out
00:48:34.120 | and recharge them or replace them?
00:48:37.440 | - Well, these are passive.
00:48:38.980 | They're coated in biocompatible glass.
00:48:42.560 | And as an extra, I'm a rock climber
00:48:44.680 | and so I was worried about that glass shattering
00:48:47.480 | during rock climbing.
00:48:49.720 | I additionally coated them in another ring of silicone
00:48:53.440 | before implanting that.
00:48:54.420 | So it's pretty safe.
00:48:55.880 | They're passive.
00:48:56.800 | There's no battery.
00:48:57.680 | There's no active electronics in them.
00:49:00.560 | So they could last the rest of my life.
00:49:02.380 | I don't think I'd ever have to remove it for any reason.
00:49:05.320 | At some point, the technology is always improving,
00:49:07.900 | so I might remove it and upgrade it.
00:49:10.020 | That's not inconceivable.
00:49:12.800 | Already there are versions with 10X more storage available
00:49:16.760 | that could be a drop-in replacement for this
00:49:19.360 | if I ever remove it.
00:49:20.480 | But it has a small niche use case
00:49:26.180 | and it's an interesting proof of concept,
00:49:28.740 | tiptoeing towards the concept that you mentioned
00:49:31.440 | of you have to be willing to go through the things
00:49:34.840 | that you're suggesting to your patients
00:49:37.700 | in order to say with a straight face
00:49:39.900 | that you think this is a reasonable thing to do.
00:49:42.440 | So a small subcutaneous implant in the hand,
00:49:46.880 | it's a little different than a brain implant.
00:49:48.700 | - Yeah, what's involved in getting that RFID chip
00:49:51.500 | into the hand?
00:49:53.120 | I'm assuming it's an outpatient procedure.
00:49:55.080 | Presumably you did it on yourself.
00:49:56.360 | - Yeah, yeah.
00:49:57.200 | This was a kitchen table kind of procedure.
00:49:59.520 | - Any anesthetic or no?
00:50:02.220 | - You know, I've seen people do this
00:50:05.520 | with a lidocaine injection.
00:50:07.720 | For my money, I think a lidocaine injection
00:50:09.800 | is probably as painful as just doing the procedure.
00:50:12.560 | - Just a little cut in that thin skin
00:50:14.000 | on the top of the hand.
00:50:15.180 | Some people are cringing right now.
00:50:16.320 | Other people are saying, "I want one,"
00:50:17.920 | 'cause you'll never have to worry about losing your keys
00:50:20.840 | or passwords.
00:50:21.680 | I actually would like it for passwords
00:50:23.520 | because I'm dreadfully bad at remembering passwords.
00:50:27.200 | I have to put them in places all over the place.
00:50:29.520 | And then it's like, I'm like that kid in,
00:50:31.680 | remember that movie "Stand By Me"
00:50:33.040 | where the kid hides the pennies under the porch
00:50:35.360 | and then loses the map?
00:50:36.440 | Yeah, spends all summer trying to find them.
00:50:38.720 | So I can relate.
00:50:40.400 | Yeah, so it was just a little slit and then put in there.
00:50:43.280 | No local immune response, no pus, no swelling.
00:50:47.160 | - All the materials that are on the surface,
00:50:49.880 | exposed to the body, are completely biocompatible.
00:50:51.680 | So no bad reaction, it healed up in days and it was fine.
00:50:56.640 | - Very cool.
00:50:57.520 | Since we're on video here,
00:50:58.680 | maybe can you just maybe raise it and show us?
00:51:01.200 | Yeah, so were you not to point out that little lump?
00:51:05.840 | I wouldn't have known to ask about it.
00:51:09.280 | And any other members of your family have these?
00:51:12.440 | - A few years after having this and seeing the convenience
00:51:16.160 | of me being able to open the door without keys,
00:51:18.520 | my wife insisted that I put one in her as well.
00:51:20.560 | So she's walking around with one.
00:51:22.600 | - Fantastic.
00:51:23.440 | - We consider them our version of wedding rings.
00:51:26.920 | - Love it.
00:51:27.760 | Well, it's certainly more permanent
00:51:29.440 | than wedding rings in some sense.
00:51:32.520 | I can't help but ask this question,
00:51:35.600 | even though it might seem a little bit off topic.
00:51:37.440 | As long as we're talking about implantable devices
00:51:39.320 | and Bluetooth and RFID chips in the body,
00:51:42.600 | I get asked a lot about the safety
00:51:45.800 | or lack thereof of Bluetooth headphones.
00:51:49.100 | You work on the brain, you're a brain surgeon.
00:51:51.620 | That's valuable real estate in there.
00:51:53.280 | And you understand about electromagnetic fields
00:51:55.920 | and any discussion about EMFs immediately puts us
00:51:59.080 | in the category of, uh-oh, like, get out the tinfoil hats.
00:52:01.880 | And yet I've been researching EMFs
00:52:04.380 | for a future episode of the podcast.
00:52:06.540 | And EMFs are a real thing.
00:52:10.220 | That's not a valuable statement.
00:52:11.920 | Everything's a real thing at some level, even an idea.
00:52:14.800 | But there does seem to be some evidence
00:52:18.080 | that electromagnetic fields of sufficient strength
00:52:21.800 | can alter the function of, maybe the health of,
00:52:24.820 | but the function of neural tissue,
00:52:26.160 | given that neural tissue is electrically
00:52:28.140 | signaling among itself.
00:52:30.900 | So I'll just ask this in a very straightforward way.
00:52:33.380 | Do you use Bluetooth headphones or wired headphones?
00:52:35.580 | - Yeah, Bluetooth.
00:52:36.420 | - And you're not worried about any kind of EMF fields
00:52:39.680 | across the skull?
00:52:40.600 | - No, I mean, I think the energy levels involved
00:52:43.520 | are so tiny. Ionizing radiation aside,
00:52:48.520 | we're way out of the realm of ionizing radiation,
00:52:51.520 | which is what people would worry about for tumor-causing EMF fields.
00:52:56.520 | And even for the electromagnetic field itself,
00:53:02.840 | which is very well characterized in the Bluetooth frequency range,
00:53:07.840 | the power levels in these devices are tiny.
00:53:11.800 | And so we are awash in these signals,
00:53:14.240 | whether you use Bluetooth headphones or not.
00:53:17.040 | For that matter, you're getting bombarded
00:53:18.940 | with ionizing radiation in a very tiny amount,
00:53:21.580 | no matter where you live on earth,
00:53:24.120 | unless you live under huge amounts of water.
00:53:27.040 | It's unavoidable.
00:53:30.360 | And so I think you just have to trust
00:53:33.240 | that your body has the DNA repair mechanisms
00:53:36.600 | that it needs to deal with the constant bath
00:53:39.200 | of ionizing radiation that you're in
00:53:41.920 | as a result of being in the universe
00:53:44.400 | and exposed to cosmic rays.
00:53:46.040 | In terms of electromagnetic fields,
00:53:49.480 | it's just the energy levels are way, way out of the range
00:53:54.480 | where I would be worried about this.
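To put rough numbers on why those energy levels are considered tiny, here is a back-of-the-envelope comparison; the physical constants are standard, while the ionization threshold and the Class 2 transmit-power figure are order-of-magnitude assumptions:

```python
# Rough numbers behind "the energy levels are tiny":
# a 2.4 GHz radio photon carries far too little energy to ionize anything.
PLANCK = 6.626e-34          # J*s
EV = 1.602e-19              # J per electron volt

bluetooth_freq_hz = 2.4e9   # Bluetooth operates around 2.4 GHz
photon_energy_ev = PLANCK * bluetooth_freq_hz / EV
print(f"Energy per Bluetooth photon: {photon_energy_ev:.2e} eV")  # ~1e-5 eV

typical_ionization_ev = 10.0  # assumed order of magnitude to ionize a molecule
print(f"Shortfall vs ionization: ~{typical_ionization_ev / photon_energy_ev:.0e}x")

# Transmit power is also tiny: a Class 2 Bluetooth radio is capped at ~2.5 mW.
class2_power_w = 2.5e-3       # assumed typical earbud transmit power
print(f"Transmit power: {class2_power_w * 1000:.1f} mW")
```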
00:53:58.160 | - What about heat?
00:53:59.080 | I don't use the earbuds any longer for a couple of reasons.
00:54:04.680 | As you know, I take a lot of supplements,
00:54:06.540 | and I reached into my left pocket once
00:54:07.980 | and swallowed a handful of supplements
00:54:09.220 | that included a Bluetooth earbud, an AirPod Pro.
00:54:14.000 | I knew I had swallowed it
00:54:14.840 | the moment after I gulped it down.
00:54:17.600 | By the way, folks, please don't do this.
00:54:19.220 | It was not a good idea.
00:54:20.700 | It wasn't an idea.
00:54:21.800 | It was a mistake.
00:54:22.640 | But I could see it on my phone as registering there.
00:54:24.820 | Never saw it again.
00:54:25.800 | So I'm assuming it's no longer in my body.
00:54:27.560 | But anyway, there's a bad joke there to be sure.
00:54:32.560 | But in any event, I tend to lose them or misplace them.
00:54:35.920 | So that's the main reason.
00:54:37.280 | But I did notice when I used them
00:54:40.060 | that there's some heat generated there.
00:54:43.040 | I also am not convinced that plugging your ears
00:54:45.060 | all day long is good.
00:54:46.260 | There's some ventilation through the sinus systems
00:54:49.060 | that include the ears.
00:54:50.200 | So it sounds to me like you're not concerned
00:54:52.080 | about the use of earbuds.
00:54:54.200 | But what about heat near the brain?
00:54:57.280 | I mean, the cochlea, the auditory mechanisms
00:55:01.260 | that sit pretty close to the surface there,
00:55:04.660 | heat and neural tissue are not friends.
00:55:08.140 | I'd much rather get my brain cold than hot
00:55:10.140 | in terms of keeping the cells healthy and alive.
00:55:14.080 | Should we be thinking about the heat effects
00:55:17.460 | of some of these devices or other things?
00:55:19.800 | Is there anything we're overlooking?
00:55:21.020 | - Well, think about it this way.
00:55:22.640 | I use cars as an analogy a lot
00:55:26.500 | and mostly internal combustion engine cars.
00:55:29.340 | So these analogies are going to start to be foreign
00:55:33.040 | and useless for another generation of people
00:55:35.120 | that grew up in the era of electric cars.
00:55:37.400 | But using cars as a platform
00:55:40.200 | to talk about fluid cooling systems,
00:55:44.340 | your body has a massive distributed fluid cooling system
00:55:48.740 | similar to a car's radiator.
00:55:50.300 | You're pumping blood all around your body all the time
00:55:53.560 | at a very strictly controlled temperature.
00:55:56.820 | That blood is mostly water,
00:55:59.480 | so it carries a huge amount of the heat
00:56:02.900 | or cold away from any area of the body
00:56:06.760 | that's getting focused heating or focused cooling.
00:56:10.680 | So you could put an ice cube on your skin
00:56:13.000 | until it completely melts away
00:56:14.620 | and the blood is going to bring heat back to that area.
00:56:17.180 | You can stand in the sun
00:56:18.920 | under much more scary heating rays
00:56:23.920 | from the sun itself that contain UV radiation
00:56:27.240 | that's definitely damaging your DNA.
00:56:30.320 | - If you're looking for things to be afraid of,
00:56:31.960 | the sun is a good one.
00:56:33.660 | - Now you're talking to the guy that tells everybody
00:56:35.360 | to get sunlight in their eyes every morning,
00:56:37.080 | but I don't want people to get burned
00:56:38.400 | or give themselves skin cancer.
00:56:40.300 | I encourage people to protect their skin accordingly.
00:56:43.160 | And different individuals require different levels
00:56:45.360 | of protection from the sun.
00:56:46.720 | Some people do very well in a lot of sunshine,
00:56:49.780 | never get basal cell or anything like that.
00:56:52.020 | Some people, and it's not just people with very fair skin,
00:56:55.240 | a minimum of sun exposure can cause some issues.
00:56:58.800 | And here I'm talking about sun exposure to the skin.
00:57:01.600 | Of course, staring at the sun is a bad idea.
00:57:03.520 | I never recommend that.
00:57:04.360 | - Thinking about the sun just as a heater for a moment
00:57:08.820 | to compare it with Bluetooth headphones,
00:57:11.680 | your body is very capable of carrying that heat away
00:57:15.520 | and dissipating it via sweat evaporation
00:57:19.440 | or temperature equalization.
00:57:22.220 | So any heat that's locally generated in the ear,
00:57:24.880 | one, there's a pretty large bony barrier there,
00:57:28.000 | but two, there's a ton of blood flow in the scalp
00:57:30.080 | and in the head in general and definitely in the brain
00:57:32.680 | that's gonna regulate that temperature.
00:57:34.320 | So I think certainly there can be
00:57:37.400 | a tiny temperature variation,
00:57:39.080 | but I doubt very seriously that it's enough
00:57:41.800 | to cause a significant problem.
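For a rough sense of scale on the heat question, a sketch comparing an earbud's average power draw against the roughly 20 watts of heat a resting brain already produces and the blood supply already carries away; the battery capacity and runtime figures are assumptions, not measurements:

```python
# Rough scale comparison: earbud heat vs. the heat the head already handles.
battery_wh = 1.5            # assumed earbud battery capacity (watt-hours)
runtime_h = 5.0             # assumed listening time per charge
earbud_avg_power_w = battery_wh / runtime_h   # ~0.3 W, and only a fraction
                                              # of that ends up as heat in the ear

brain_metabolic_w = 20.0    # commonly cited resting heat output of the brain

print(f"Earbud average power: ~{earbud_avg_power_w:.2f} W")
print(f"Brain resting heat:   ~{brain_metabolic_w:.0f} W")
print(f"The earbud draws ~{earbud_avg_power_w / brain_metabolic_w:.1%} of the heat "
      "the brain itself produces and dissipates continuously")
```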
00:57:43.640 | - I'd like to go back to brain augmentation.
00:57:47.640 | You've made very clear that one of the first goals
00:57:50.040 | for Neuralink is to get quadriplegics walking again.
00:57:53.680 | And again, what a marvelous goal that is.
00:57:57.600 | And I certainly hope you guys succeed.
00:57:59.880 | - Well, again, just to be very clear,
00:58:01.480 | the first step is we aren't reconnecting
00:58:04.320 | the patient's own muscle system to their motor cortex.
00:58:07.520 | - Allowing them, excuse me,
00:58:09.640 | agency over the movement of things in the world.
00:58:12.080 | - Yes.
00:58:13.080 | - And eventually their body.
00:58:14.200 | - And you're exactly right.
00:58:15.160 | Yeah, eventually their body.
00:58:16.280 | We would love to do that.
00:58:17.680 | And we've done a lot of work on developing a system
00:58:22.040 | for stimulating the spinal cord itself.
00:58:24.760 | And so that gets to the question that you asked
00:58:28.040 | a few minutes ago of how do you reconnect
00:58:30.360 | the motor cortex to the rest of the body?
00:58:32.680 | Well, if you can bypass the damaged area
00:58:34.960 | of the spinal cord and have an implant
00:58:36.680 | in the spinal cord itself connected to an implant
00:58:39.640 | in the brain and have them talking to each other,
00:58:42.000 | you can take the perfectly intact motor signals
00:58:44.720 | out of the motor cortex and send them to the spinal cord,
00:58:48.720 | where most of the wiring should be intact
00:58:51.040 | below the level of the injury
00:58:53.080 | caused by, say, a car accident
00:58:55.440 | or motorcycle accident or gunshot wound or whatever.
00:58:58.560 | And it should be possible to reconnect the brain
00:59:01.120 | to the body in that way.
00:59:02.680 | So it's not out of the realm of possibility that,
00:59:05.680 | in some small number of years,
00:59:08.320 | a Neuralink device will be able to reconnect
00:59:10.840 | somebody's own body to their brain.
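A purely conceptual sketch of the bypass loop described here, with every channel count, decoder, and stimulation limit made up for illustration; it is not Neuralink's actual software, only the shape of the idea: record firing rates above the injury, decode an intended movement, and emit stimulation on channels below the injury.

```python
# Conceptual sketch only: cortical activity in, spinal stimulation out,
# bypassing the injured segment. All names and numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N_CORTICAL_CHANNELS = 64      # hypothetical recording channels in motor cortex
N_SPINAL_CHANNELS = 16        # hypothetical stimulation channels below the injury

# A previously calibrated linear map from firing rates to muscle commands.
decoder = rng.normal(size=(N_SPINAL_CHANNELS, N_CORTICAL_CHANNELS)) * 0.1

def bypass_step(firing_rates: np.ndarray) -> np.ndarray:
    """One cycle of the loop: decode intent, clip to a safe stimulation range."""
    command = decoder @ firing_rates              # intended muscle drive
    max_amp_ma = 5.0                              # hypothetical safety limit
    return np.clip(command, 0.0, max_amp_ma)      # stimulation amplitudes (mA)

# Simulated firing rates for one time step (spikes per second per channel).
rates = rng.poisson(lam=20, size=N_CORTICAL_CHANNELS).astype(float)
print(bypass_step(rates))
```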
00:59:13.720 | - And here, I just want to flag the hundred years
00:59:17.920 | or more of incredible work by basic scientists.
00:59:21.520 | The names that I learned about in my textbooks
00:59:23.280 | as a graduate student were like Georgopoulos,
00:59:25.480 | and that won't mean anything to anyone
00:59:27.760 | unless you're a neuroscientist,
00:59:28.820 | but Georgopoulos performed some of the first
00:59:32.440 | sophisticated recordings out of motor cortex,
00:59:34.280 | just simply asking what sorts of electrical patterns
00:59:36.740 | are present in motor cortex as an animal or human,
00:59:39.120 | moves a limb.
00:59:40.840 | Krishna Shenoy being another major pioneer in this area
00:59:44.600 | and many others.
00:59:45.920 | And just really highlighting the fact that basic research
00:59:48.160 | where an exploration of neural tissue is carried out
00:59:52.500 | at the level of anatomy and physiology
00:59:54.480 | really sets down the pavement on the runway
00:59:57.320 | to do the sorts of big clinical expeditions
01:00:01.720 | that you all at Neuralink are doing.
01:00:03.640 | - Yeah, it can't be said enough that we,
01:00:07.060 | broadly speaking, in industry,
01:00:08.800 | sometimes are, and sometimes stand on the shoulders of,
01:00:12.040 | academic giants.
01:00:13.360 | They were the real pioneers; they were involved
01:00:17.800 | in the grind for years in an inglorious, unglamorous way.
01:00:22.720 | - No stock options.
01:00:23.920 | - No stock options.
01:00:25.240 | And the reward for all the hard work is a paper
01:00:30.380 | at the end of the day that is read by dozens of people.
01:00:34.520 | And so they were selfless academic researchers
01:00:39.080 | that made all this possible.
01:00:41.420 | And we all, humanity and Neuralink,
01:00:44.360 | owe them a massive debt of gratitude
01:00:47.320 | for all the hard work that they've done and continue to do.
01:00:50.320 | - I agree.
01:00:51.620 | I'd like to just take a brief moment
01:00:53.160 | and thank one of our podcast sponsors,
01:00:55.020 | which is InsideTracker.
01:00:56.320 | InsideTracker is a personalized nutrition platform
01:00:58.980 | that analyzes data from your blood and DNA
01:01:01.560 | to help you better understand your body
01:01:03.120 | and help you reach your health goals.
01:01:05.000 | I've long been a believer in getting regular blood work done
01:01:07.280 | for the simple reason that blood work is the only way
01:01:10.260 | that you can monitor the markers, such as hormone markers,
01:01:12.600 | lipids, metabolic factors, et cetera,
01:01:14.720 | that impact your immediate and long-term health.
01:01:17.440 | One major challenge with blood work, however,
01:01:19.560 | is that most of the time it does not come back
01:01:22.200 | with any information about what to do
01:01:23.800 | in order to move the values for hormones,
01:01:26.040 | metabolic factors, lipids, et cetera,
01:01:27.900 | into the ranges that you want.
01:01:29.340 | With InsideTracker, changing those values
01:01:31.800 | becomes very straightforward
01:01:32.960 | because it has a personalized dashboard that you can use
01:01:35.900 | to address the nutrition-based, behavior-based,
01:01:38.940 | supplement-based approaches that you can use
01:01:41.680 | in order to move those values into the ranges
01:01:43.820 | that are optimal for you, your vitality, and your longevity.
01:01:46.840 | InsideTracker now includes a measurement
01:01:48.580 | of apolipoprotein B, so-called ApoB, in their ultimate plan.
01:01:52.240 | ApoB is a key marker of cardiovascular health,
01:01:54.880 | and therefore there's extreme value
01:01:56.640 | to knowing your ApoB levels.
01:01:58.660 | If you'd like to try InsideTracker,
01:02:00.040 | you can go to insidetracker.com/huberman
01:02:02.780 | to get 20% off any of InsideTracker's plans.
01:02:05.100 | Again, that's insidetracker.com/huberman to get 20% off.
01:02:10.000 | Along the lines of augmentation,
01:02:11.500 | early on in some of the public discussions
01:02:13.580 | about Neuralink that I overheard between Elon
01:02:15.900 | and various podcast hosts, et cetera,
01:02:19.420 | there were some lofty ideas set out
01:02:22.180 | that I think are still very much in play in people's minds.
01:02:26.280 | Things like, for instance,
01:02:27.900 | electrical stimulation of the hippocampus
01:02:30.500 | that you so appropriately have worn on your shirt today.
01:02:33.580 | So for those, yeah, beautiful.
01:02:35.660 | It looks like either a Golgi
01:02:38.060 | or a Cajal rendition of the hippocampus.
01:02:40.340 | The name translates to seahorse, and it's an area of the brain
01:02:43.380 | that's involved in learning and memory among other things.
01:02:46.680 | There was this idea thrown out
01:02:49.980 | that a chip or chips could be implanted in the hippocampus
01:02:53.020 | that would allow greater than normal memory abilities,
01:02:57.740 | perhaps, that's one idea.
01:02:59.980 | Another idea that I heard about in these discussions
01:03:04.020 | was, for instance, that you would have some chips
01:03:06.980 | in your brain and I would have some chips in my brain
01:03:08.780 | and you and I could just sit here looking at each other
01:03:12.620 | or not, nodding or shaking our heads
01:03:14.660 | and essentially hear each other's thoughts,
01:03:17.140 | which sounds outrageous, but of course, why not?
01:03:20.460 | Why should we constrain ourselves to,
01:03:24.460 | as our good friend Eddie Chang, who's a neurosurgeon
01:03:27.540 | who was already on this podcast once before,
01:03:29.060 | said, "Speech is just the shaping of breath
01:03:31.220 | "as it exits our lungs."
01:03:32.940 | Incredible, really, when you think about it.
01:03:34.680 | But we don't necessarily need speech
01:03:37.900 | to hear and understand each other's thoughts
01:03:39.760 | because the neural signals that produce
01:03:41.180 | that shaping of breath come from some intention.
01:03:44.100 | I have some idea, although it might not seem like it,
01:03:46.220 | about what I'm going to say next.
01:03:48.440 | So is that possible that we could sit here
01:03:51.740 | and just hear each other's thoughts
01:03:54.180 | and also how would we restrict
01:03:56.040 | what the other person could hear?
01:03:57.740 | - Well, so absolutely.
01:03:58.900 | I mean, think about the fact that we could do this right now
01:04:03.340 | if you pulled out your phone and started texting me
01:04:05.980 | on my phone and I looked down and started texting you,
01:04:08.600 | we would be communicating without looking at each other
01:04:10.780 | or talking.
01:04:11.620 | Shifting that function from a phone to an implanted device,
01:04:17.200 | it requires no magic advance, no leap forward.
01:04:22.200 | It's technology we already know how to do
01:04:25.260 | if we, say, put a device in that allows you
01:04:28.740 | to control a keyboard and a mouse,
01:04:30.100 | which is our stated intention
01:04:32.020 | for our first human clinical trial.
01:04:34.020 | - Or, and again, I'm deliberately interrupting,
01:04:36.700 | or I can text an entire team of people simultaneously
01:04:41.420 | and they can text me, and in theory,
01:04:43.460 | I could have a bunch of thoughts
01:04:44.460 | and five, 10, 50 people could hear,
01:04:47.820 | or probably more to their preference,
01:04:51.820 | they could talk to me.
01:04:52.700 | - Yeah, and so texting each other with our brains
01:04:55.580 | is maybe an uninspiring rendition of this,
01:04:57.620 | but it's not very difficult to imagine the implementation
01:05:02.940 | of the same device in a more verbally focused area
01:05:07.420 | of the brain that allows you to more naturally
01:05:09.700 | speak the thoughts that you're thinking
01:05:12.220 | and have them rendered into speech that I can hear,
01:05:15.980 | maybe via a bone-conducting implant, so I hear it silently.
01:05:21.740 | - Or not silently, like I could,
01:05:23.820 | let's say I was getting off the plane
01:05:25.500 | and I wanted to let somebody at home know
01:05:26.940 | that I had arrived.
01:05:28.260 | I might be able to think in my mind,
01:05:30.380 | think their first name, which might cue up a device
01:05:33.620 | that would then play my voice to them and say,
01:05:36.540 | just got off the plane, I'm going to grab my bag
01:05:38.460 | and then I'll give you a call.
01:05:39.540 | - Right, on their home Alexa.
01:05:41.180 | - Right, so that's all possible, meaning we know the origin
01:05:46.180 | of the neural signals that gives rise to speech.
01:05:50.340 | We know the different mechanical and neural apparati,
01:05:55.020 | like the cochlea, eardrums, et cetera,
01:05:59.580 | that transduce sound waves into electrical signals.
01:06:04.260 | Essentially all the pieces are known,
01:06:05.660 | we're just really talking about like--
01:06:08.140 | - Refining it.
01:06:08.980 | - Yeah, refining it and reconfiguring it.
01:06:11.580 | I mean, it's not an easy problem,
01:06:13.140 | but it's really an engineering problem
01:06:15.100 | rather than a neuroscience problem.
01:06:17.420 | - For that use case, nonverbal communication, you might say,
01:06:22.420 | that's a solved problem in a very crude disjointed way.
01:06:28.540 | Some labs have solved part one of it,
01:06:31.140 | some labs have solved part two of it.
01:06:32.900 | There are products out there that solve,
01:06:35.200 | say the implanted bone conduction part of it
01:06:37.700 | for the deaf community.
01:06:39.240 | There are no implementations I'm aware of
01:06:45.620 | that are pulling all that together into one product.
01:06:48.920 | That's a streamlined package from end to end.
01:06:51.740 | I think that's a few years down the road.
01:06:54.140 | - And we, I think, have some hints of how easily
01:06:57.860 | or poorly people will adapt to these,
01:07:01.540 | let's call them novel transformations.
01:07:03.380 | A few years ago, I was on Instagram
01:07:06.000 | and I saw a post from a woman,
01:07:08.960 | her name is Kassar Jacobson,
01:07:10.700 | and she is deaf since birth and can sign
01:07:14.880 | and to some extent can read lips,
01:07:16.340 | but she was discussing a Neosensory device.
01:07:20.980 | So this is a device that translates sound in the environment
01:07:24.980 | into touch sensations on her hand or wrist.
01:07:28.640 | She's an admirer of birds and all things avian.
01:07:33.640 | And I reached out to her about this device
01:07:38.340 | 'cause I'm very curious
01:07:39.180 | because this is a very interesting use case
01:07:41.180 | of neuroplasticity in the sensory domain,
01:07:45.160 | which is a fascination of mine.
01:07:46.660 | And she said that, yes, indeed,
01:07:49.220 | it afforded her novel experiences.
01:07:51.300 | Now when walking past, say, pigeons in the park,
01:07:54.620 | if they were to make some goo goo goo goo,
01:07:56.380 | whatever sounds that pigeons make,
01:07:58.340 | that she would feel those sounds
01:08:00.140 | and that indeed it enriched her experience of those birds
01:08:04.200 | in ways that obviously it wouldn't otherwise.
01:08:07.180 | I haven't followed up with her recently
01:08:08.580 | to find out whether or not ongoing use of the Neosensory device
01:08:12.220 | has made for a better, worse,
01:08:16.020 | or kind of equivalent experience of avians in the world,
01:08:21.820 | which for her is a near obsession.
01:08:24.740 | So she delights in them.
01:08:26.240 | What are your thoughts about peripheral devices
01:08:30.100 | like that peripheral meaning outside of the skull,
01:08:34.340 | no requirement for a surgery?
01:08:37.740 | Do you think that there's a more immediate
01:08:39.980 | or even a just generally potent use case
01:08:44.980 | for peripheral devices?
01:08:47.140 | And do you think that those are going to be used
01:08:49.700 | more readily before the kind of brain surgery
01:08:52.420 | requiring devices are used?
01:08:54.900 | - Yeah, certainly the barrier to entry is lower.
01:08:58.140 | The barrier to adoption is low.
01:09:00.720 | If you're making a tactile glove,
01:09:02.660 | that's hard to say no to when you can slip it on
01:09:06.580 | and slip it off and not have to get your skin cut at all.
01:09:09.860 | Again, there's no perfect measure of the efficacy
01:09:16.540 | of a device, of one device compared to another,
01:09:19.140 | especially across modalities.
01:09:20.500 | But one way that you can start to compare apples to oranges
01:09:25.340 | is bit rate, useful information in or out of the brain
01:09:28.900 | as transformed into digital data.
01:09:33.940 | And so you can put a single number on that.
01:09:36.460 | And you have to ask when you look at a device like that is,
01:09:39.980 | what is the bit rate in?
01:09:41.460 | What is the bit rate out?
01:09:42.660 | How much information are you able to usefully convey
01:09:46.420 | into the system and get out of the system into the body,
01:09:50.060 | into the brain?
01:09:50.980 | And I think what we've seen in the early stabs at this
01:09:55.980 | is that there's a very low threshold for bit rate
01:10:00.480 | on some of the devices that are trying to avoid
01:10:03.940 | a direct brain surgery.
01:10:05.780 | - Could you perhaps say what you just said,
01:10:09.220 | but in a way that maybe people who aren't as familiar
01:10:12.700 | with thinking about bit rates might be able to digest.
01:10:17.700 | There, I'm referring to myself.
01:10:20.020 | I mean, I understand bit rate.
01:10:22.020 | I understand that adding a new channel of information
01:10:24.020 | is just that, adding information.
01:10:26.700 | Are you saying it's important to understand
01:10:28.580 | whether or not that new information provides
01:10:30.940 | for novel function or experience, and to what extent
01:10:35.480 | the newness of that is valid and adaptive?
01:10:39.500 | - Well, I'm saying more, it's hard to measure utility
01:10:44.300 | in this space.
01:10:45.140 | It's hard to put a single metric, single number
01:10:49.000 | on how useful a technology is.
01:10:51.620 | One crude way to try to get at that is bit rate.
01:10:56.220 | Think of it as back in the days of dial-up modems.
01:11:00.140 | The bit rate of your modem was 56K or 96K.
01:11:04.420 | - I can still hear the sound of the dial-up
01:11:06.380 | in the background.
01:11:07.220 | - That was a bit rate that thankfully kept steadily
01:11:11.100 | going up and up and up.
01:11:12.040 | Your internet service provider gives you a number
01:11:16.300 | that is the maximum usable data that you can transmit
01:11:19.060 | back and forth from the internet.
01:11:20.900 | That's a useful way to think about these assistive devices.
01:11:25.900 | How much information are you able to get into the brain
01:11:29.060 | and out of the brain usefully?
01:11:30.980 | And right now that number is very small,
01:11:34.300 | even compared to the old modems,
01:11:36.780 | but you have to ask yourself when you're looking
01:11:38.560 | at a technology, what's the ceiling?
01:11:40.060 | What's the theoretical maximum?
01:11:42.760 | And for a lot of these technologies,
01:11:44.140 | the theoretical maximum is very low, disappointingly low,
01:11:48.860 | even if it's perfectly executed and perfectly developed
01:11:53.820 | as a technology.
01:11:54.720 | And I think the thing that attracts a lot of us
01:11:56.900 | to a technology like Neuralink is that the ceiling
01:12:00.040 | is incredibly high.
01:12:01.660 | There's no obvious reason that you can't interface
01:12:04.260 | with millions of neurons as this technology is refined
01:12:08.800 | and developed further.
01:12:10.460 | So that's the kind of wide band,
01:12:15.060 | high bandwidth brain interface that you want to develop
01:12:18.340 | if you're talking about a semantic prosthetic
01:12:23.340 | and AI assistant to your cognitive abilities.
01:12:27.900 | You know, the more sci-fi things that we think about
01:12:30.260 | in the coming decades.
01:12:33.220 | So it's an important caveat
01:12:36.280 | when you're evaluating these technologies.
01:12:38.300 | You really want it to be something that you can expand
01:12:41.540 | off into the sci-fi.
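One common way to put that single number on a device is the Wolpaw information-transfer-rate formula, sketched below; the target counts, accuracies, and selection speeds are made-up illustrative values, not measurements from any real device.

```python
# Rough sketch of the bit-rate bookkeeping described above, using the
# standard Wolpaw information-transfer-rate formula.
import math

def bits_per_selection(n_targets: int, accuracy: float) -> float:
    """Information conveyed by one selection among n_targets at a given accuracy."""
    if accuracy >= 1.0:
        return math.log2(n_targets)
    return (math.log2(n_targets)
            + accuracy * math.log2(accuracy)
            + (1 - accuracy) * math.log2((1 - accuracy) / (n_targets - 1)))

def itr_bits_per_min(n_targets: int, accuracy: float, selections_per_min: float) -> float:
    return bits_per_selection(n_targets, accuracy) * selections_per_min

# Hypothetical comparison: a slow non-invasive speller vs. a faster cursor BCI.
print(f"Speller:    {itr_bits_per_min(26, 0.85, 5):.1f} bits/min")
print(f"Cursor BCI: {itr_bits_per_min(8, 0.95, 40):.1f} bits/min")
print(f"56k modem:  {56_000 * 60:,} bits/min, for scale")
```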
01:12:43.100 | - So let's take this a step further
01:12:47.400 | because as you're saying this,
01:12:48.900 | I'm realizing that people have been doing exactly
01:12:52.060 | what Neuralink is trying to do now for a very long time.
01:12:56.180 | Let me give you an example.
01:12:57.540 | People who are blind, who have no pattern vision,
01:13:01.540 | have used canes for a very long time.
01:13:04.000 | Now the cane is not a chip, it's not an electrode,
01:13:08.700 | it's not a Neosensory device, none of that stuff.
01:13:12.440 | What it is, essentially, is a stick that has
01:13:17.360 | an interface with a surface.
01:13:20.340 | It's swept back and forth across the ground,
01:13:22.820 | translating what would otherwise be visual cues
01:13:25.960 | into somatosensory cues.
01:13:28.940 | And we know that blind people are very good
01:13:31.900 | at understanding when they are approaching,
01:13:36.260 | say, a curb edge,
01:13:37.980 | because they are integrating that information
01:13:40.000 | from the tip of the cane up through
01:13:42.300 | their somatosensory cortex and their motor cortex
01:13:44.760 | with other things like the changes in the wind
01:13:48.220 | and the sound as they round a corner.
01:13:50.160 | And here I'm imagining like a corner
01:13:52.180 | in San Francisco downtown,
01:13:53.340 | where as you get to the corner,
01:13:54.900 | it's a completely different set of auditory cues.
01:13:58.860 | And very often we know,
01:14:00.100 | and this is because my laboratory worked on visual repair
01:14:02.100 | for a long time, I talked to a lot of blind people
01:14:03.740 | who use different devices to navigate the world,
01:14:05.460 | that they aren't aware of the fact
01:14:08.740 | that they're integrating these other cues,
01:14:10.340 | but they nonetheless do them subconsciously.
01:14:13.400 | - Right.
01:14:15.020 | - And in doing so get pretty good at navigating with a cane.
01:14:17.940 | Now a cane isn't perfect,
01:14:19.620 | but you can imagine the other form of navigating
01:14:23.240 | as a blind person,
01:14:24.820 | which is to just attach to yourself
01:14:29.140 | another nervous system,
01:14:30.580 | the best that we know of being a dog,
01:14:33.200 | a sighted guide dog that can cue you, again,
01:14:36.020 | with stopping at a curb's edge,
01:14:38.840 | or, if there are some individuals around
01:14:40.620 | who might seem a little sketchy,
01:14:41.640 | dogs are also very good at sensing
01:14:43.980 | different arousal states in others: threat, danger.
01:14:47.940 | I mean, they're exquisite at it, right?
01:14:49.860 | So here what we're really talking about
01:14:51.260 | is taking a cane or another biological system,
01:14:54.460 | essentially a whole nervous system
01:14:56.140 | and saying this other nervous system's job
01:14:58.580 | is to get you to navigate more safely through the world.
01:15:02.020 | In some sense what Neuralink is trying to do is that,
01:15:05.620 | but with robotics to insert them and chips,
01:15:09.740 | which raises the question,
01:15:11.820 | people are gonna say finally a question.
01:15:13.420 | The question is this,
01:15:14.740 | we hear about BMI, brain machine interface,
01:15:18.540 | which is really what Neuralink specializes in.
01:15:20.380 | We also hear about AI,
01:15:22.060 | another example where there's great promise and great fear.
01:15:25.000 | We hear about machine learning as well.
01:15:27.680 | To what extent can these brain machine interfaces
01:15:32.680 | learn the same way a seeing eye dog would learn,
01:15:36.700 | but unlike a seeing eye dog,
01:15:38.500 | continue to learn over time
01:15:41.380 | and get better and better and better
01:15:43.200 | because it's also listening to the nervous system
01:15:45.420 | that it's trying to support.
01:15:47.540 | Put simply, what is the role for AI and machine learning
01:15:50.300 | in the type of work that you're doing?
01:15:51.960 | - That's a great question.
01:15:52.800 | I think it goes both ways.
01:15:54.740 | Basically what you're doing is
01:15:56.340 | taking a very crude software intelligence.
01:15:58.660 | I would say not exactly a full-blown AI,
01:16:01.920 | but some well-designed software
01:16:04.900 | that can adapt to changes in firing of the brain
01:16:08.740 | and you're coupling it with another form of intelligence,
01:16:11.260 | a human intelligence,
01:16:12.460 | and you're allowing the two to learn each other.
01:16:16.200 | So undoubtedly the human that has a Neuralink device
01:16:20.060 | will get better at using it over time.
01:16:22.720 | Undoubtedly, the software that the Neuralink engineers
01:16:27.020 | have written will adapt to the firing patterns
01:16:31.460 | that the device is able to record
01:16:34.820 | and over time focus in on meaningful signals
01:16:39.820 | toward movement, right?
01:16:41.060 | So if a neuron has a high firing rate
01:16:45.420 | when you intend to move the mouse cursor up and to the right,
01:16:49.260 | it doesn't know that when it starts.
01:16:51.020 | When you first put this in,
01:16:52.120 | it's just a random series of signals
01:16:54.460 | as far as the chip knows,
01:16:56.120 | but you start correlating it with what the person,
01:16:58.700 | what you know the person wants to do
01:17:01.320 | as expressed in a series of games.
01:17:03.520 | So you assume that the person wants to move the mouse
01:17:08.520 | on the screen to the target that's shown
01:17:11.360 | because you tell them that's the goal.
01:17:12.760 | And so you start correlating the activity that you record
01:17:17.460 | when they're moving toward an up and right target
01:17:19.860 | on a screen with that firing pattern.
01:17:22.220 | And similarly for up and left, down and left,
01:17:24.820 | down and right.
01:17:25.660 | And so you develop a model semi-intelligently
01:17:30.520 | in the software for what the person is intending to do
01:17:33.700 | and let the person run wild with it for a while.
01:17:36.300 | And they start to get better at using the model presented
01:17:40.520 | to them by the software as expressed by the mouse moving
01:17:44.120 | or not moving properly on the screen, right?
01:17:46.760 | So it's, imagine a scenario where you're asking somebody
01:17:50.120 | to play piano, but the sound that comes out of each key
01:17:55.120 | randomly shifts over time, very difficult problem,
01:18:00.160 | but a human brain is good enough with the aid of software
01:18:03.600 | to solve that problem and map well enough
01:18:06.140 | to a semi-sable state that they're gonna know
01:18:09.100 | how to use that mouse even when they say,
01:18:12.120 | turn the device off for the night,
01:18:13.400 | come back to it the next day.
01:18:14.840 | And some of the signals have shifted.
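A minimal sketch of the calibration loop being described, with simulated data and a simple least-squares fit standing in for whatever Neuralink's software actually does: firing rates collected while aiming at known targets are fit to a linear map onto cursor velocity, and the fit is simply redone when the signals drift.

```python
# Minimal, hypothetical sketch of decoder calibration and recalibration.
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_samples = 32, 500

# Simulated "ground truth": each channel is tuned to some cursor direction.
true_tuning = rng.normal(size=(n_channels, 2))
intended_velocity = rng.normal(size=(n_samples, 2))          # toward known targets
firing_rates = (intended_velocity @ true_tuning.T
                + rng.normal(scale=0.5, size=(n_samples, n_channels)))

# Calibration: least-squares fit of a decoder that maps rates -> velocity.
decoder, *_ = np.linalg.lstsq(firing_rates, intended_velocity, rcond=None)

# Later session: signals have drifted (e.g., per-channel gains changed)...
drifted_rates = firing_rates * rng.uniform(0.6, 1.4, size=n_channels)
err_before = np.mean((drifted_rates @ decoder - intended_velocity) ** 2)

# ...and a quick recalibration on the drifted data restores performance.
decoder2, *_ = np.linalg.lstsq(drifted_rates, intended_velocity, rcond=None)
err_after = np.mean((drifted_rates @ decoder2 - intended_velocity) ** 2)
print(f"error before refit: {err_before:.3f}, after refit: {err_after:.3f}")
```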
01:18:17.320 | - So you're describing this,
01:18:18.160 | I'm recalling a recent experience.
01:18:20.020 | I got one of these rowers, you know, to exercise.
01:18:25.020 | And I am well aware that there's a proper row stroke
01:18:28.860 | and there's a improper row stroke.
01:18:31.860 | And most everybody, including me,
01:18:33.720 | who's never been coached in rowing,
01:18:35.900 | gets on this thing and pushes with their legs
01:18:37.560 | and pulls with their arms and back.
01:18:38.860 | And it's some mix of incorrect
01:18:40.620 | and maybe a smidgen of correct execution.
01:18:45.180 | There's a function within the rower
01:18:46.500 | that allows you, or in this case me,
01:18:48.980 | to play a game where you can actually,
01:18:51.460 | every row stroke you generate arrows toward a dartboard.
01:18:56.220 | And it knows whether or not you're generating
01:18:58.820 | the appropriate forces at the given segment of the row,
01:19:01.700 | the initial pull when you're leaning back, et cetera,
01:19:04.340 | and adjusts the trajectory of the arrow
01:19:06.540 | so that when you do a proper row stroke,
01:19:08.460 | it gets closer to a bullseye.
01:19:10.140 | And it's very satisfying
01:19:11.780 | because you now have a visual feedback
01:19:13.480 | that's unrelated to the kinds of instructions
01:19:16.700 | that one would expect, like, oh, you know,
01:19:18.080 | hinge your hip a bit more, or, you know,
01:19:19.660 | splay your knees a bit more, reach more with your arms,
01:19:21.820 | or pull first with your back.
01:19:22.940 | All the rowers are probably cringing as I say this
01:19:24.660 | 'cause they're realizing what is exactly the point,
01:19:27.180 | which is I don't know how to row,
01:19:28.440 | but over time, simply by paying attention
01:19:31.460 | to whether the arrow is hitting the bullseye
01:19:34.240 | more or less frequently,
01:19:36.460 | you can improve your row stroke and get,
01:19:39.940 | as I understand, pretty close to optimal row stroke
01:19:43.180 | in the same way that if you had a coach there
01:19:46.700 | telling you, hey, do this and do that.
01:19:48.140 | What we're really talking about here is neurobiofeedback.
01:19:50.420 | - Sure.
01:19:51.260 | - So is that analogy similar to what you're describing?
01:19:54.300 | - Yeah, that's a great analogy.
01:19:56.000 | You know, humans are really good at learning
01:19:59.400 | how to play games in software.
01:20:01.460 | So video games are an awesome platform
01:20:03.860 | for us to use as a training environment
01:20:05.740 | for people to get better at controlling these things.
01:20:08.680 | In fact, it's the default and the obvious way to do it
01:20:12.540 | is to have people and monkeys play video games.
01:20:15.860 | - Do you play video games?
01:20:16.920 | - Yeah, sure.
01:20:17.820 | - Which video games?
01:20:19.680 | - Let's see, I play old ones.
01:20:22.140 | I'm a little nostalgic.
01:20:23.460 | So I like the old Blizzard games,
01:20:26.700 | Starcraft and Warcraft.
01:20:28.340 | - Oh my, I don't even know those.
01:20:30.020 | I remember the first Apple computers.
01:20:31.620 | I mean, how old are you?
01:20:34.340 | - 43.
01:20:35.180 | - Okay, 44 now as of a few days ago.
01:20:37.300 | Happy birthday, so we're a little bit offset there.
01:20:39.120 | Yeah, I can recall Mike Tyson's Punch-Out,
01:20:41.460 | like the original Nintendo games.
01:20:43.240 | - It's a hard game.
01:20:44.080 | - Super Mario Brothers.
01:20:44.900 | It's a hard game.
01:20:45.740 | But the game, so the games you're describing,
01:20:48.200 | I don't recall that.
01:20:49.720 | My understanding is that the newer games
01:20:51.440 | are far more sophisticated.
01:20:53.740 | - In some respects, I did recently find time
01:20:56.180 | to play cyberpunk, which was really satisfying
01:20:59.600 | and maybe appropriate.
01:21:00.940 | It's a game where the characters are all fully
01:21:04.460 | modded out with cybernetic implants.
01:21:07.080 | - Oh, perfect.
01:21:08.120 | - But the root of the game is run around and shoot things.
01:21:11.240 | So maybe not so different from Duck Hunt
01:21:14.520 | or whatever from our childhoods.
01:21:16.120 | - Reason I ask about video games is there's been
01:21:19.240 | some controversy as to whether or not
01:21:20.520 | they are making young brains better or worse.
01:21:23.200 | And I think some of the work from Adam Gazzaley's lab
01:21:26.160 | at UCSF and other laboratories have shown
01:21:28.140 | that actually, provided that children in particular,
01:21:32.220 | and adults too, are also spending time in normal,
01:21:34.640 | let's call them more traditional,
01:21:37.480 | face-to-face interactions, video games
01:21:39.640 | can actually make nervous systems better.
01:21:41.580 | That is, people become much more proficient
01:21:43.720 | at learning and motor execution.
01:21:45.200 | - Sure.
01:21:46.120 | - Visual detection and on and on.
01:21:48.840 | - Yeah, there's some work showing that surgeons
01:21:51.420 | are better if they play video games.
01:21:54.200 | So I try to squeeze some in
01:21:55.600 | as a professional development activity.
01:21:58.760 | - Great, great.
01:21:59.800 | Well, I'm sure you're getting cheers
01:22:01.100 | from those that like video games out there
01:22:03.980 | and some of the parents who are trying to get their kids
01:22:06.180 | to play fewer video games are cringing, but that's okay.
01:22:08.680 | We'll let them settle their familial disputes
01:22:11.780 | among themselves.
01:22:12.760 | Let's talk about pigs.
01:22:15.460 | - Sure.
01:22:17.020 | - Neuralink has been quite generous, I would say,
01:22:21.200 | in announcing their discoveries and their goals.
01:22:25.780 | And I want to highlight this because I think
01:22:27.760 | it's quite unusual for a company to do this.
01:22:30.680 | I'm probably going to earn a few enemies by saying this.
01:22:33.460 | Despite the fact that I've always owned Apple devices
01:22:36.000 | and I'm from the South Bay, you know,
01:22:38.320 | the Apple design team is notoriously cryptic
01:22:40.640 | about what they're going to do next;
01:22:41.980 | when the next phone or computer is going to come out
01:22:44.360 | is vaulted to a serious extent.
01:22:49.360 | Neuralink has been pretty open about their goals.
01:22:53.800 | - Right.
01:22:54.640 | - With the understanding that goals change
01:22:56.580 | and have to change.
01:22:58.560 | And one of the things that they've done,
01:23:00.020 | which I think is marvelous,
01:23:00.980 | is they've held online symposia
01:23:04.660 | where you and some other colleagues of mine
01:23:07.220 | from the neuroscience community, Dan Adams,
01:23:08.920 | who I have tremendous respect for,
01:23:10.200 | and Elon and others there at Neuralink
01:23:13.880 | have shared some of the progress that they've made
01:23:16.540 | in experimental animals.
01:23:18.120 | I'm highlighting this because I think
01:23:19.960 | if one takes a step back, I mean,
01:23:21.600 | just for most people to know about and realize
01:23:26.600 | that there's experimentation on animals,
01:23:28.320 | implantation of electrodes and so on,
01:23:30.320 | is itself a pretty bold move
01:23:31.620 | because that understandably evokes
01:23:34.100 | some strong emotions in people.
01:23:36.840 | And in some people it evokes extremely strong emotions.
01:23:39.760 | - Sure.
01:23:40.600 | - Neuralink did one such symposium
01:23:44.720 | where they showed implant devices in pigs.
01:23:48.160 | - Right.
01:23:49.300 | - Then they did another one, you guys did another one
01:23:51.760 | where it was implant devices in monkeys.
01:23:54.760 | - Right.
01:23:55.580 | - I assume at some point there will be
01:23:57.400 | one of these public symposia
01:23:58.560 | where the implant devices will be in a human.
01:24:01.820 | What was the rationale for using pigs?
01:24:06.280 | I'm told pigs are very nice creatures.
01:24:08.280 | - Yeah.
01:24:09.120 | - I'm told that they are quite smart.
01:24:10.600 | - Right.
01:24:11.760 | - And for all my years as a neuroscientist
01:24:15.780 | and having worked admittedly on every species
01:24:18.800 | from mice to cuttlefish, to humans, to hamsters,
01:24:22.480 | to I confess various carnivore species,
01:24:27.040 | which I no longer do, I work on humans now
01:24:29.160 | for various reasons, I never in my life
01:24:33.240 | thought I would see an implant device
01:24:36.800 | in the cortex of a pig.
01:24:39.120 | - Sure.
01:24:39.960 | - Why work on pigs?
01:24:41.760 | - Yeah, well, let me say first,
01:24:44.120 | Neuralink is almost entirely composed
01:24:47.000 | of animal loving people.
01:24:48.880 | The people at Neuralink are obsessive animal lovers.
01:24:53.400 | There are signs up all around the office,
01:24:55.960 | spontaneously put up by people within the organization,
01:24:59.900 | talking about how we wanna save animals,
01:25:03.300 | we wanna protect animals.
01:25:05.200 | If there was any possible way to help people
01:25:09.080 | the way we wanna help people without using animals
01:25:12.360 | in our research, we would do it.
01:25:13.960 | It's just not known how to do that right now.
01:25:17.280 | And so we are completely restricted to making advances,
01:25:22.280 | to getting a device approval through the FDA
01:25:24.600 | by first showing that it's incredibly safe in animals.
01:25:28.920 | - As is the case for any medical advancement, essentially.
01:25:33.680 | I do wanna highlight this,
01:25:34.520 | that the FDA and the other governing bodies
01:25:37.920 | oversee these types of experiments
01:25:41.760 | and ensure that they're done
01:25:42.680 | with a minimum of discomfort to the animals, of course.
01:25:45.120 | But I think there's an inherent speciesism
01:25:49.080 | in most humans, not all.
01:25:53.840 | Some people truly see equivalence
01:25:55.800 | between a lizard and a human,
01:25:57.760 | lizard life being equivalent to human life.
01:25:59.480 | Most human beings, I think,
01:26:01.000 | in particular human beings who themselves
01:26:03.540 | or who have loved ones that are suffering from diseases
01:26:05.960 | that they hope could be cured at some point,
01:26:09.640 | view themselves as speciesist
01:26:11.520 | and feel that if you have to work on a biological system
01:26:15.400 | in order to solve the problem,
01:26:17.360 | working on non-human animals first makes sense
01:26:21.120 | to most people.
01:26:22.680 | But certainly there's a category of people
01:26:24.160 | that feels very strongly in the opposite direction.
01:26:26.880 | - Sure, and I think we would probably
01:26:29.120 | be having a very different conversation
01:26:30.960 | around animal research if we weren't,
01:26:34.480 | we as a species, we as a culture,
01:26:36.920 | weren't just casually slaughtering millions of animals
01:26:40.520 | to eat them every single day.
01:26:44.960 | And so that is a background against which
01:26:48.120 | the relatively minuscule number of animals
01:26:50.920 | used in research, it becomes almost impossible
01:26:54.960 | to understand why someone would point
01:26:56.760 | to that ridiculously small number of animals
01:26:56.760 | used in research when the vast,
01:27:00.920 | vast majority of animals whose lives humans use
01:27:03.640 | and end are used for food.
01:27:11.280 | - Or for fur.
01:27:12.240 | - Or for fur or these other reasons
01:27:14.280 | that people have historically used animals.
01:27:16.440 | So we, in that context, we do animal research
01:27:20.600 | because we have to, there's no other way around it.
01:27:23.040 | If tomorrow laws were changed and the FDA said,
01:27:27.560 | okay, you can do some of this early experimentation
01:27:30.680 | in willing human participants,
01:27:32.400 | that would be a very interesting option.
01:27:33.800 | I think there'd be a lot of people that would step up
01:27:36.280 | and say, yes, I'm willing to participate
01:27:38.600 | in early stage clinical research.
01:27:40.920 | - You already volunteered.
01:27:42.440 | - Yeah, and I wouldn't be alone.
01:27:45.280 | And that is a potential way that animals
01:27:47.800 | could maybe be spared being unwilling participants in this.
01:27:52.760 | On that note, to whatever extent possible,
01:27:56.880 | I think Neuralink goes really, really far,
01:28:01.480 | much, much farther than anyone I've ever heard of,
01:28:04.240 | any organization I've ever heard of,
01:28:06.560 | anything I've ever seen to give the animals agency
01:28:09.960 | in every aspect of the research.
01:28:11.880 | We have just an incredible team of people
01:28:16.560 | looking out for the animals
01:28:17.760 | and trying to design the experiments
01:28:19.720 | such that they're as purely opt-in as humanly possible.
01:28:23.920 | No animal is ever compelled to participate in experiments
01:28:28.480 | beyond the surgery itself.
01:28:30.640 | So if, say, on a given day, our star monkey, Pager,
01:28:35.480 | doesn't want to play video games for smoothie,
01:28:38.720 | no one forces them to, ever.
01:28:40.880 | - This is a very important point.
01:28:42.120 | And I want to cue people to really what Matt is saying here.
01:28:46.400 | Obviously the animals are being researched on
01:28:50.760 | for Neuralink so they don't get to opt into,
01:28:53.520 | opt out of the experiment.
01:28:55.040 | But what he's saying is that they play these games
01:29:00.140 | during which neural signals are measured from the brain
01:29:02.280 | because they have electrodes implanted in their brain
01:29:04.200 | through a surgery that, thankfully, is painless to the brain.
01:29:07.220 | No pain receptors in the brain.
01:29:09.740 | And are playing for reward.
01:29:12.300 | This is very different, very different
01:29:15.180 | than the typical scenario in laboratories around the world
01:29:18.780 | where people experiment on mice, monkeys,
01:29:23.380 | some cases pigs or other species
01:29:25.720 | in which the typical arrangement
01:29:27.960 | is to water deprive the animals.
01:29:30.940 | - We never do that.
01:29:31.940 | - And then have the animals work
01:29:33.740 | for their daily ration of water.
01:29:35.780 | - Right.
01:29:36.700 | - And some people are hearing this
01:29:38.020 | and probably think, wow, that's barbaric.
01:29:40.060 | And here I'm not trying to point fingers
01:29:42.160 | at the people doing that kind of work.
01:29:43.320 | I just think it's important that people understand
01:29:44.880 | how the work is done.
01:29:46.300 | - Right.
01:29:47.140 | - In order to motivate an animal to play a video game,
01:29:51.640 | depriving them of something that they yearn for
01:29:54.360 | is a very efficient way to do that.
01:29:56.380 | - We don't do that.
01:29:57.520 | They have free and full access to food this entire time.
01:30:00.540 | So they aren't hungry, they aren't thirsty.
01:30:02.720 | The only thing that would motivate them
01:30:04.280 | is if they want a treat extra to their normal rations.
01:30:08.360 | But there's never any deprivation.
01:30:10.900 | There's never any adverse negative stimuli
01:30:12.980 | that pushes them to do anything.
01:30:15.640 | - Must say I'm impressed by that decision
01:30:17.640 | because training animals to do tasks
01:30:23.500 | in laboratory settings is very hard.
01:30:25.940 | And the reason so many researchers
01:30:27.360 | have defaulted to water deprivation
01:30:29.960 | and having animals work for a ration of water
01:30:32.700 | is because frankly, it works.
01:30:35.400 | It allows people to finish their PhD
01:30:37.320 | or their postdoc more quickly than having to wait around
01:30:41.680 | and try and figure out why their monkey
01:30:45.620 | isn't working that day.
01:30:46.840 | In fact, having known a number of people
01:30:48.920 | who've done these kinds of experiments,
01:30:50.720 | although we've never done them in my lab,
01:30:53.120 | "my monkey isn't working today" is a common gripe
01:30:57.840 | among graduate students and postdocs
01:30:59.240 | who do this kind of work.
01:31:00.760 | And for people who work on mice.
01:31:02.560 | - Okay, so this is very important information to get across.
01:31:07.540 | And there's no public relations statement woven into this.
01:31:10.540 | This is just we're talking about the nature of the research,
01:31:12.860 | but I think it is important that people are aware of this.
01:31:15.400 | - Yeah, it's one of the underappreciated innovations
01:31:18.100 | out of Neuralink is how far the animal care team
01:31:20.300 | has been able to move in the direction
01:31:22.480 | of humane treatment of these guys.
01:31:24.180 | - Wonderful, as an animal lover myself,
01:31:26.680 | I can only say wonderful.
01:31:28.300 | Why pigs?
01:31:30.880 | - Yeah, pigs are, they're actually fairly commonly used
01:31:34.880 | in medical device research, more in the cardiac area,
01:31:39.880 | their hearts are somewhat similar to human hearts.
01:31:42.960 | - How big are these pigs?
01:31:43.920 | I've seen little pigs and I've seen big pigs.
01:31:45.640 | - Yeah, there's a range.
01:31:46.880 | There's a bunch of different varieties of pig.
01:31:49.060 | There's a bunch of different species
01:31:50.880 | that you can optimize for different characteristics.
01:31:55.600 | There's mini pigs, there's Yorkshires,
01:31:59.120 | there's a lot of different kinds of pigs
01:32:02.200 | that we use in different contexts
01:32:04.280 | when we're trying to optimize a certain characteristic.
01:32:08.720 | So yeah, the pigs are, we don't necessarily need them
01:32:12.340 | to be smart or task performers,
01:32:14.620 | although occasionally we have trained them
01:32:17.640 | to walk on a treadmill when we're studying
01:32:20.320 | how their limbs move for some of our spinal cord research,
01:32:24.120 | but we're not recording interesting,
01:32:28.180 | say, cognitive data out of their minds.
01:32:30.620 | They're really just a biological platform with a skull
01:32:34.160 | that's close enough in size and shape to humans
01:32:37.460 | to be a valid platform to study the safety of the device.
01:32:41.600 | - Unlike a monkey or a human, a pig, I don't think,
01:32:45.100 | can reach out and hit a button or a lever.
01:32:48.800 | How are they signaling that they saw or sensed something?
01:32:53.640 | - Yeah, so again, the pigs are really just a safety platform
01:32:56.600 | to say the device is safe to implant.
01:32:58.880 | It doesn't break down or cause any kind of toxic reaction.
01:33:02.880 | The monkeys are where we are really doing our heavy lifting
01:33:05.320 | in terms of ensuring that we're getting good signals
01:33:08.520 | out of the device, that what we expect to see in humans
01:33:12.220 | is validated on a functional level in monkeys first.
01:33:16.040 | - Let's talk about the skull.
01:33:19.700 | - Yeah.
01:33:20.540 | - Years ago, you and I were enjoying a conversation
01:33:22.840 | about these very sorts of things that we're discussing today
01:33:26.560 | and you said, you know, the skull is actually
01:33:28.440 | a pretty lousy biological adaptation.
01:33:31.520 | Far better would be a titanium plate, you know,
01:33:34.420 | spoken like a true neurosurgeon with a radio receiver
01:33:38.080 | implanted in his hand.
01:33:39.400 | But in all seriousness, drilling through the skull
01:33:44.280 | with a two millimeter hole,
01:33:46.260 | certainly don't do this at home, folks.
01:33:48.980 | Please don't do this.
01:33:50.400 | But yes, that's a small entry site,
01:33:55.400 | but I think most people cringe when they hear about that
01:33:58.480 | or think about that.
01:33:59.800 | And it obviously has to be done by a neurosurgeon
01:34:01.940 | with all the appropriate environmental conditions in place
01:34:06.940 | to limit infection.
01:34:08.660 | What did you mean when you said that the skull
01:34:11.880 | is a poor adaptation and a titanium plate will be better?
01:34:14.560 | And in particular, what does that mean in reference
01:34:18.720 | to things like traumatic brain injury?
01:34:20.360 | I mean, are human beings unnecessarily vulnerable
01:34:23.520 | at the level of traumatic brain injury
01:34:25.080 | because our skulls are just not hard enough?
01:34:28.840 | - You know, maybe I'm being too harsh about the skull.
01:34:34.080 | The skull is very good at what it does
01:34:37.880 | given the tools that we are working with
01:34:40.760 | as biological organisms that develop in our mother's uterus.
01:34:43.980 | The skull is, you know, usually the appropriate size.
01:34:48.960 | It's one of the hardest things in your body.
01:34:51.160 | That said, there are a couple of puzzling vulnerabilities.
01:34:55.320 | Some of the thinnest bone in the skull
01:34:58.480 | is in the temporal region.
01:34:59.960 | This is, you know, neurosurgeons will all know
01:35:02.440 | that I'm heading toward a feature
01:35:05.000 | that sometimes darkly is called God's little joke
01:35:08.520 | where the very thin bone of the temporal part of the skull
01:35:13.520 | has one of the largest arteries
01:35:16.440 | that goes to the lining of the brain right attached
01:35:19.560 | to the inside of it.
01:35:20.400 | And so this bone just to the side of your eye
01:35:24.040 | tends to fracture if you're struck there.
01:35:26.920 | And the sharp edges of that fractured bone
01:35:29.120 | very often cut an artery called the middle meningeal artery
01:35:32.720 | that leads to a big blood clot that crushes the brain.
01:36:36.920 | That's how a lot of people with what otherwise
01:36:39.360 | would be a relatively minor injury end up dying:
01:36:43.600 | this large blood clot developing from high-pressure
01:36:47.240 | arterial blood crushes the brain.
01:35:50.360 | And so why would you put the artery right on the inside
01:35:53.960 | of the very thin bone that's most likely to fracture?
01:35:57.360 | It's an enduring mystery,
01:35:58.520 | but this is probably the most obvious failure mode
01:36:02.560 | in, you know, the design of a human skull.
01:36:05.440 | Otherwise, you know, in terms of general impact resistance,
01:36:09.240 | I think the brain is a very hard thing to protect.
01:36:12.840 | And the architecture of human anatomy,
01:36:15.760 | probably given all other possible architectures
01:36:19.200 | that can arise from development, it's not that bad really.
01:36:23.340 | One of the interesting features in terms of shock absorption
01:36:27.360 | that hopefully prevents a lot of traumatic brain injury
01:36:30.000 | is the fluid sheath around the brain.
01:36:31.880 | The brain, you may know, is mostly fat.
01:36:36.880 | It floats in saltwater inside the skull.
01:36:40.080 | So our brains are all floating in saltwater.
01:36:42.360 | And so with rapid acceleration, deceleration,
01:36:45.560 | that sheath of saltwater adds a marvelous protective cushion
01:36:50.560 | against development of, you know, bruising of the brain,
01:36:55.920 | say, or bleeding in the brain.
01:36:57.980 | And so I think for any flaws in the design that do exist,
01:37:02.900 | you can imagine things being a lot worse
01:37:07.320 | and there's probably a lot fewer TBIs than would exist
01:37:10.880 | if a human designer was taking a first crack at it.
01:37:13.480 | - As you described the thinness of this temporal bone
01:37:17.080 | and the presence of a critical artery just beneath it,
01:37:20.860 | I'm thinking about most helmets.
01:37:24.240 | And here, I also want to cue up the fact that,
01:37:28.940 | well, whenever we hear about TBI or CTE or brain injury,
01:37:33.120 | people always think football, hockey,
01:37:35.520 | but most traumatic brain injuries are things like
01:37:38.120 | car accidents or construction work.
01:37:41.120 | It's not football and hockey.
01:37:43.160 | For some reason, football and hockey and boxing
01:37:45.560 | get all the attention.
01:37:47.560 | But my colleagues that work on traumatic brain injury
01:37:49.800 | tell me that most of the traumatic brain injury
01:37:52.860 | they see is somebody slips at a party and hits their head
01:37:55.720 | or, you know, is in a car accident
01:37:59.640 | or environmental accidents of various kinds.
01:38:03.480 | To my mind, most helmets don't actually cover this region
01:38:07.640 | close to the eyes.
01:38:08.820 | So is there also a failure of helmet engineering there?
01:38:13.820 | You know, I can understand why you'd want to have
01:38:16.360 | your peripheral vision out the sides of your eyes,
01:38:19.720 | the periphery of your eyes,
01:38:20.860 | but it seems to me, if this is such critical real estate,
01:38:22.980 | why isn't it being better protected?
01:38:25.840 | - You know, I'm no expert in helmets,
01:38:27.580 | but I don't think we see a lot of epidural hematomas
01:38:31.560 | in sports injuries.
01:38:33.180 | To get this kind of injury,
01:38:36.200 | you usually need a really focal blunt trauma,
01:38:38.640 | like the baseball bat to the head
01:38:40.440 | is a classic mechanism of injury
01:38:43.080 | that would lead to a temporal bone fracture
01:38:47.080 | and epidural hematoma.
01:38:48.320 | With sports injuries, you know, you don't often see,
01:38:53.100 | especially in football,
01:38:55.480 | a sharper object coming in contact with the head.
01:39:00.480 | It's usually another helmet, right,
01:39:03.520 | that is the mechanism of injury.
01:39:06.100 | So I can't think off the top of my head
01:39:09.480 | of an instance of this exact injury type in sports.
01:39:13.520 | - You spent a lot of time poking around in brains of humans.
01:39:17.780 | And while I realize this is not your area of expertise,
01:39:23.040 | you are somebody who I am aware, you know,
01:39:26.000 | cares about his health and the health of your family.
01:39:29.160 | And I think generally people's health.
01:39:32.160 | When you look out on the landscape of things
01:39:33.860 | that people can do and shouldn't do
01:39:37.860 | if their desire is to keep their brain healthy,
01:39:40.560 | do any data or any particular practices come to mind?
01:39:45.560 | I mean, I think we've all heard the obvious one.
01:39:47.680 | Don't get a head injury.
01:39:49.200 | If you do get a head injury, make sure it gets treated
01:39:51.960 | and don't get a second head injury.
01:39:53.840 | But those are sort of duh type answers
01:39:57.400 | that I'm able to give.
01:39:58.580 | So I'm curious about the answers
01:39:59.620 | that perhaps I'm not able to give.
01:40:01.320 | - Yeah, well, you know, the obvious one
01:40:03.080 | is one that you talk about a lot,
01:40:05.280 | and I see a lot of the smoldering wreckage of humanity,
01:40:10.120 | you know, in the operating room and in the emergency room
01:40:13.720 | for people that come in.
01:40:15.720 | You know, I work, my practice is in San Francisco
01:40:18.040 | right next to the Tenderloin.
01:40:19.240 | And so a lot of people that end up coming in
01:40:21.320 | from the Tenderloin have been drinking
01:40:23.920 | just spectacular amounts of alcohol for a long time.
01:40:27.420 | And their brains, you know, very often on the scans,
01:40:32.500 | look like small walnuts inside their empty skulls.
01:40:36.340 | There's so much atrophy that happens
01:40:38.420 | with an alcohol-soaked brain chronically,
01:40:42.180 | that I would say that's, you know,
01:40:44.800 | far and away the most common source of brain damage
01:40:48.240 | that many of us just volunteer for.
01:40:52.000 | And it's, you know, when you look at the morbidity,
01:40:54.980 | kind of the human harm in aggregate that's done,
01:40:58.280 | it's mystifying that it's not something that we
01:41:02.580 | are all paranoid about.
01:41:04.240 | - People will think that I don't drink at all.
01:41:07.000 | I'll occasionally have a drink.
01:41:08.260 | I could take it or leave it, frankly.
01:41:11.340 | If all the alcohol on the planet disappeared,
01:41:12.700 | I wouldn't notice, but I do occasionally have a drink,
01:41:14.660 | maybe one per year or something like that.
01:41:16.900 | But I am shocked at this current state of affairs
01:41:21.300 | around alcohol consumption and advertising, et cetera.
01:41:24.020 | When I look at the data, mainly out of the UK Brain Bank,
01:41:26.180 | which basically shows that for every drink that one has
01:41:29.700 | on a regular basis, when you go from zero to one drink
01:41:33.900 | per week, there's more brain atrophy,
01:41:36.280 | thinning of the gray matter cortex.
01:41:38.040 | You go from one to two, more thinning.
01:41:39.940 | You come from two to three.
01:41:41.100 | And there's a near linear relationship
01:41:43.300 | between the amount that people are drinking
01:41:44.720 | and the amount of brain atrophy.
01:41:46.480 | And to me, it's like, it's just sort of obvious
01:41:49.320 | from these large scale studies that, as you point out,
01:41:53.800 | alcohol atrophies the brain.
01:41:55.560 | It kills neurons.
01:41:57.360 | And I don't have any bias against alcohol
01:41:59.680 | or people that drink, I know many of them,
01:42:01.760 | but it does seem to me kind of shocking
01:42:05.040 | that we're talking about the resveratrol in red wine,
01:42:07.480 | which is present at infinitesimally small amounts.
01:42:10.420 | It's not even clear resveratrol is good for us anyway,
01:42:12.360 | by the way, a matter of debate, I should point out.
01:42:16.160 | But so alcohol, certainly alcohol in excess
01:42:19.680 | is bad for the brain.
01:42:22.720 | - In terms of, okay, so we have head hits,
01:42:25.960 | bad alcohol, bad.
01:42:28.240 | You're working, as you mentioned, right next to the Tenderloin.
01:42:32.480 | Is there any awareness that amphetamine use
01:42:35.260 | can disrupt brain structure or function?
01:42:38.320 | - You know, that's not an area
01:42:40.200 | that I spent a lot of time researching in.
01:42:42.320 | I, you know, I incidentally take care of people
01:42:45.260 | that have used every substance known to man
01:42:47.560 | in quantities that are, you know, spectacular,
01:42:50.720 | but I haven't specifically done research in that area.
01:42:54.120 | I'm not super well-versed on the literature.
01:42:56.880 | - Yeah, I ask in part because maybe you know a colleague
01:43:00.360 | or we'll come across a colleague who's working on this.
01:43:02.400 | There's just such an incredible increase
01:43:06.640 | in the use of things like Adderall, Ritalin,
01:43:08.680 | Modafinil, R-modafinil, which I think in small amounts
01:43:12.240 | in clinically prescribed situations can be very beneficial.
01:43:16.040 | But let's be honest, many people are using these
01:43:18.860 | on a chronic basis.
01:43:20.320 | I don't think we really know what it does to the brain
01:43:22.620 | aside from increasing addiction for those substances.
01:43:25.480 | That's very clear.
01:43:26.420 | - Well, for better or worse,
01:43:27.740 | we're generating a massive data set right now.
01:43:30.080 | - Well put.
01:43:31.980 | I'd like to briefly go back to our earlier discussion
01:43:37.160 | about neuroplasticity.
01:43:38.320 | - Sure.
01:43:39.160 | - You made an interesting statement,
01:43:40.120 | which is that we are not aware of any single brain area
01:43:43.480 | that one can stimulate in order to invoke plasticity.
01:43:47.200 | - Right.
01:43:48.040 | - This malleability of neural architecture.
01:43:50.280 | Years ago, Mike Merzenich and colleagues at UCSF
01:43:54.720 | did some experiments where they stimulated nucleus basalis
01:43:58.800 | and paired that stimulation with eight kilohertz tone.
01:44:02.640 | Or in some cases, they could also stimulate
01:44:06.260 | a different brain area, the ventral tegmental area,
01:44:08.760 | which causes release of dopamine and pair it with a tone.
01:44:11.680 | And it seemed in every one of these cases,
01:44:14.940 | they observed massive plasticity.
01:44:19.600 | Now I look at those data and I compare them
01:44:22.840 | to the kind of classic data.
01:44:25.120 | I think it was Karl Lashley who did these experiments
01:44:27.140 | where they would take animals
01:44:27.980 | and they'd scoop out a little bit of cortex,
01:44:30.440 | put the animal back into a learning environment,
01:44:32.600 | and the animal would do pretty well, if not perfectly.
01:44:35.840 | So they'd scoop out a different region of cortex
01:44:37.620 | and a different animal.
01:44:38.460 | And by the end of maybe three, four years
01:44:40.600 | of these kinds of lesion experiments,
01:44:42.520 | they referred to the equipotentiality of the cortex,
01:44:46.120 | meaning they concluded that it didn't matter
01:44:48.380 | which piece of the cortex you took out,
01:44:50.520 | that there was no one critical area.
01:44:52.760 | So on the one hand, you've got these experiments that say,
01:44:56.240 | "You know, you don't really need a lot of the brain."
01:44:59.700 | And every once in a while, a news story will come out
01:45:02.020 | where a person will go in for a brain scan
01:45:05.380 | for some other reason or an experiment,
01:45:07.380 | and the person seems perfectly normal,
01:45:09.120 | and they're like missing half their cortex.
01:45:12.100 | And then on the other hand, you have these experiments
01:45:14.120 | like the stimulation of basalis or VTA,
01:45:16.420 | where you get massive plasticity from stimulation of one area.
01:45:19.980 | I've never been able to reconcile
01:45:21.920 | these kinds of discrepant findings.
01:45:23.860 | And so I'd really like just your opinion on this.
01:45:27.040 | What is it about the brain as an organ
01:45:29.320 | that lets it be both so critical
01:45:31.480 | at the level of individual neurons and circuits,
01:45:33.660 | so, so critical, and yet at the same time,
01:45:37.060 | it's able to circumvent what would otherwise seem
01:45:41.280 | like massive lesions and holes in itself.
01:45:44.500 | - Yeah, I mean, a lot of it to reconcile those experiments,
01:45:48.300 | you first account for the fact
01:45:51.180 | that they're probably in different species, right?
01:45:54.180 | You take out a particular portion of a pig
01:45:57.020 | or a rabbit brain, a small amount,
01:45:59.180 | you might not see a difference,
01:46:00.400 | but a small portion of a human brain,
01:46:03.500 | say the part most interested in coordinating speech
01:46:07.660 | or finger movement, and you're gonna see profound losses
01:46:10.780 | or visual cortex, right?
01:46:13.100 | I take out a small portion of V1
01:46:14.860 | and you'll have a visual deficit.
01:46:16.960 | And so species matters, age matters.
01:46:23.060 | If you take out half of the brain in a very young baby,
01:46:27.400 | that baby has a reasonable chance
01:46:29.220 | of developing a high degree of function
01:46:32.880 | by having the remaining half subsume
01:46:35.900 | some of the functions lost on the other side
01:46:38.100 | because they're very, very young
01:46:40.860 | and their brain is still developing.
01:46:42.320 | It's to some degree a blank slate
01:46:45.040 | with extremely high plasticity over many years.
01:46:48.380 | So that can overcome a lot of deficits.
01:46:50.720 | Taking an adult animal's brain
01:46:55.300 | that isn't very well differentiated functionally
01:46:58.480 | to begin with, you might not see those deficits.
01:47:00.660 | So apparently there's a lot of redundancy as well, right?
01:47:03.600 | There's a lot of say cerebellar and spinal circuits
01:47:06.220 | in other animals that generate stereotyped behavior patterns
01:47:10.980 | and might not need the brain at all
01:47:13.340 | to perform say a walking movement
01:47:15.580 | or some other sequences of motor activities.
01:47:19.400 | So a lot of that depends on the experimental setup.
01:47:23.800 | I would say in general, adult humans are very vulnerable
01:47:26.820 | to losing small parts of their brains
01:47:29.160 | and losing discrete functions.
01:47:30.980 | - I'm gonna take the liberty of asking a question
01:47:34.580 | that merges across Neuralink and Tesla.
01:47:38.800 | I could imagine that cars,
01:47:42.440 | whether or not they're on autopilot mode
01:47:44.820 | or being driven by the human directly,
01:47:46.800 | and society generally would benefit
01:47:50.900 | from knowing whether or not a human
01:47:53.040 | is very alert or sleepy.
01:47:55.840 | - Sure.
01:47:57.340 | - I don't own a Tesla.
01:47:58.520 | Perhaps this technology already exists,
01:48:02.020 | but is there any idea that a simple sensor,
01:48:05.660 | maybe even of just eyelid position or pupil size
01:48:09.500 | or head position could be introduced to a car
01:48:14.500 | like the Tesla or another car for that matter
01:48:18.540 | and resolve a common problem,
01:48:21.300 | which is that when people are less alert,
01:48:22.940 | not just when people fall asleep,
01:48:24.860 | but the simple drop in alertness that occurs
01:48:28.220 | when people are sleepy is, by my read of the data,
01:48:31.940 | responsible for approximately a third,
01:48:35.740 | a third, it's incredible, of accidents between vehicles.
01:48:39.500 | And then of course, some percentage of those
01:48:41.040 | are gonna be lethal accidents.
01:48:42.340 | So in terms of preserving life,
01:48:44.060 | this might seem like a minor case,
01:48:45.300 | but it's actually a major case scenario.
01:48:47.180 | - Yeah, I have no special insight
01:48:50.080 | into how Tesla software works.
01:48:52.620 | I know they have brilliant engineers.
01:48:54.420 | When I have a Tesla, when I drive it,
01:48:59.440 | it seems to know when I'm looking at the road
01:49:01.860 | versus not, and it yells at me
01:49:04.080 | if I'm not looking at the road.
01:49:05.380 | - How does it do that?
01:49:06.220 | And what voice does it use?
01:49:07.140 | - There's a small camera up by the rear view mirror.
01:49:10.940 | And I think it's a simple eye tracker.
01:49:13.020 | My guess here is that it's a simple eye tracking program.
01:49:16.480 | And so it may already be the case that it's implemented,
01:49:20.700 | that it's detecting whether your eyes are open or not.
01:49:23.300 | Obviously, it's not strict.
01:49:27.740 | It's not stringent, because sunglasses can defeat it,
01:49:31.580 | and I've seen forums on the internet
01:49:34.720 | where people tape over that small camera.
01:49:36.940 | - So they can, goodness.
01:49:40.500 | - But I think they're definitely making efforts
01:49:43.660 | to try to save lives here.
01:49:45.820 | - Incredible.
01:49:47.920 | I say incredible just because I think
01:49:49.620 | I'm fortunate enough to live in a lifetime
01:49:51.860 | where there were no electric cars when I was growing up
01:49:54.540 | and now things are moving oh so fast, no pun intended.
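For readers curious what "a simple eye tracking program" could look like in principle, here is a minimal, hypothetical Python sketch of the kind of check a camera-based driver-monitoring system might run. This is not Tesla's implementation, and none of the names or numbers below come from the conversation: it assumes some face-landmark detector (not shown) already supplies six (x, y) points around one eye for each video frame, computes the standard eye aspect ratio, and flags drowsiness when that ratio stays below an arbitrary threshold for a run of consecutive frames.

```python
# Hypothetical sketch (not Tesla's actual system): threshold the "eye aspect
# ratio" (EAR) over six eye landmarks to guess whether the eyes have stayed
# closed for too many consecutive video frames.

from math import dist


def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered
    [outer corner, upper-left, upper-right, inner corner, lower-right, lower-left]."""
    vertical_1 = dist(eye[1], eye[5])   # lid-to-lid distance, left side
    vertical_2 = dist(eye[2], eye[4])   # lid-to-lid distance, right side
    horizontal = dist(eye[0], eye[3])   # eye width, corner to corner
    return (vertical_1 + vertical_2) / (2.0 * horizontal)


def is_drowsy(ear_history, threshold=0.2, min_closed_frames=15):
    """Flag drowsiness if the EAR stays below the threshold for the last
    min_closed_frames frames in a row."""
    recent = ear_history[-min_closed_frames:]
    return len(recent) == min_closed_frames and all(e < threshold for e in recent)


# Toy usage with made-up landmark coordinates for a nearly closed eye.
closed_eye = [(0, 5), (3, 5.5), (7, 5.5), (10, 5), (7, 4.5), (3, 4.5)]
ear = eye_aspect_ratio(closed_eye)      # about 0.1, well below the threshold
history = [ear] * 20                    # pretend the eye has looked like this for 20 frames
print(round(ear, 3), is_drowsy(history))  # -> 0.1 True
```

In a real system the landmarks would come from a face-tracking model, and the threshold and frame count would be tuned per camera and per driver; the made-up coordinates above just illustrate that a nearly closed eye yields a low ratio.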
01:49:59.780 | What is your wish for brain machine interface
01:50:03.900 | and brain augmentation?
01:50:04.980 | So let's assume that the clinical stuff can be worked out
01:50:08.700 | or maybe you have a pet clinical condition
01:50:12.220 | that you just are just yearning to see resolved.
01:50:17.020 | That would be fine too.
01:50:18.340 | But in addition to that, you really just expand out.
01:50:21.740 | Let's say we can extend your life 200 years
01:50:25.860 | or we're thinking about the kind of world
01:50:28.780 | that your children are going to live in
01:50:30.280 | and their grandchildren will live in.
01:50:32.340 | What do you think is really possible
01:50:34.800 | with brain augmentation and brain machine interface?
01:50:37.900 | And here, please feel no bias whatsoever
01:50:41.100 | to answer in a way that reveals to us
01:50:44.480 | your incredible empathy and consideration
01:50:48.620 | of clinical conditions 'cause that's how you spend your days
01:50:51.340 | is fixing patients and helping their lives be better.
01:50:55.840 | So if it lands in that category, great.
01:50:58.080 | But for sake of fun and for sake of delight
01:51:02.360 | and for sake of really getting us, the audience,
01:51:05.460 | to understand what's really possible here,
01:51:08.520 | please feel no shackles.
01:51:10.720 | - Yeah.
01:51:12.060 | Well, I love the idea down the road
01:51:17.060 | and we're talking a 10-year, maybe 20-year timeframe
01:51:23.620 | of humans just getting control over some of the horrible ways
01:51:28.620 | that their brains go wrong, right?
01:51:30.200 | So I think everybody at this point
01:51:34.400 | has either known someone or second order known someone,
01:51:37.420 | a friend of a friend who has been touched by addiction
01:51:40.980 | or depression, suicide, obesity.
01:51:45.020 | These functions of the brain or malfunctions of the brain
01:51:48.860 | are what drives me.
01:51:50.220 | These are the things that I want to tackle in my career.
01:51:54.380 | In terms of my kids' lifetime,
01:51:58.280 | I'm thinking full expansion of human cognition
01:52:03.280 | into AI, full immersion in the internet
01:52:09.080 | of your cognitive abilities,
01:52:11.200 | having what you think no longer
01:52:17.140 | bottlenecked by needing to read the Wikipedia article first
01:52:20.920 | to have the data to inform your thoughts,
01:52:23.480 | having communication with anyone that you want to
01:52:28.320 | unrestricted by this flapping air past meat on your face.
01:52:33.320 | It's a means of communication
01:52:39.160 | that's ridiculously prone to being misunderstood.
01:52:42.080 | It's also a tiny, narrow bottleneck of communication,
01:52:45.960 | like trying to send messages back and forth
01:52:48.600 | through a tiny straw.
01:52:49.700 | And there's no reason that needs to necessarily be true.
01:52:53.000 | It's the way things have always been,
01:52:54.820 | but it isn't the way things are going to be in the future.
01:52:58.000 | And I think there's a million very sci-fi possibilities
01:53:03.000 | in terms of banding human minds together
01:53:09.560 | to be even more potent as a multi-unit organism.
01:53:15.300 | As an opt-in multi-brain.
01:53:20.300 | These are things that are so far down the road
01:53:23.780 | I can't even directly see how they would be implemented.
01:53:27.060 | But the technology we're working on
01:53:28.680 | is a little crack in the door
01:53:30.420 | that allows some of this stuff to even be thought about
01:53:32.640 | in a realistic way.
01:53:33.800 | To that point, I encourage anyone who is excited
01:53:40.420 | about things like that,
01:53:41.900 | especially mechanical engineers, software engineers,
01:53:44.720 | robotics engineers, come to the Neuralink website
01:53:47.700 | and look at the jobs we've got.
01:53:48.780 | We need the brightest people on the planet
01:53:50.840 | working on these, the hardest problems in the world,
01:53:54.540 | in my opinion.
01:53:55.820 | And so if you want to work on this stuff, come help us.
01:53:59.340 | - I have several responses to what you just said.
01:54:04.020 | First off, I'll get the least important one out of the way,
01:54:07.740 | which is that years ago I applied for a job at Neuralink.
01:54:11.300 | The Neuralink website at that time was incredibly sparse.
01:54:14.500 | It was just said Neuralink and it said,
01:54:15.880 | if you're interested, give us your email.
01:54:17.780 | So I put my email there.
01:54:18.740 | I got no response.
01:54:19.740 | So they made a wise choice in--
01:54:24.740 | - A terrible error, terrible loss.
01:54:27.100 | - Now, fast forward several years.
01:54:29.760 | I am very grateful and I think very lucky
01:54:33.580 | that you, who passed through, fortunately for me,
01:54:37.980 | through my lab at one point,
01:54:39.820 | and we had some fun expeditions together in the wild,
01:54:43.540 | neural exploration, so we can talk about some other time,
01:54:46.660 | as well as learning from you
01:54:48.540 | as you pass through your time at Stanford,
01:54:50.700 | but have arrived there at Neuralink
01:54:54.260 | and I'll say that they're very lucky to have you.
01:54:57.060 | And folks like Dan Adams,
01:54:59.020 | who I've known for a very long time.
01:55:00.880 | So phenomenal neurosurgeons like yourself,
01:55:04.380 | neuroscientists and vision scientists like Dan and others.
01:55:09.500 | It's really an incredible mission.
01:55:11.140 | So I really want to start off by saying thank you to you
01:55:15.120 | and all your colleagues there.
01:55:16.980 | I know that Neuralink is really tip of the spear
01:55:19.320 | in being public facing with the kinds of things
01:55:21.260 | they're doing and being so forthcoming
01:55:23.220 | about how that work is done in animals
01:55:25.180 | and exactly what they're doing.
01:55:26.860 | And that's a very brave stance to take,
01:55:30.440 | especially given the nature of the work, but--
01:55:32.660 | - Well, that's classic Elon, right?
01:55:34.180 | He doesn't keep secrets in public too commonly.
01:55:37.900 | He tells you what he's going to do and then he does it.
01:55:40.380 | And people are always amazed by that.
01:55:42.620 | He releases the Tesla master plan
01:55:44.420 | and tells you exactly what the company intends to do
01:55:47.000 | for the next several years.
01:55:48.260 | And people assume that there's some subterfuge
01:55:51.060 | that he is misdirecting,
01:55:52.640 | but it's right out there in the open.
01:55:55.220 | And I think Neuralink follows in that path of,
01:55:58.500 | we want people to know what we're doing.
01:56:00.060 | We want the brightest people in the world to come help us.
01:56:03.700 | We want to be able to help patients.
01:56:06.300 | We want the most motivated patients with quadriplegia
01:56:11.300 | to visit our patient registry and sign up
01:56:15.860 | to be considered for clinical trials
01:56:17.940 | that will happen in the future.
01:56:19.500 | - We'll put a link to that, by the way.
01:56:21.120 | So maybe just the direct call could happen now.
01:56:24.340 | So this is for people who are quadriplegic
01:56:27.560 | or who know people who are quadriplegic,
01:56:29.420 | who are interested in being part of this clinical trial.
01:56:32.380 | - It's a patient registry right now
01:56:33.980 | that we're just collecting information
01:56:35.420 | to see who might be eligible for clinical trials
01:56:37.740 | that'll happen in the future.
01:56:39.620 | We're still working with the FDA to hammer out the details
01:56:42.760 | and get their final permission to proceed with the trial.
01:56:46.900 | - Great, so please see the note in the show note,
01:56:49.540 | the link, excuse me, in the show note captions for that.
01:56:52.660 | Yeah, I want to thank you guys for your stance
01:56:55.500 | in public facing and also doing the incredibly hard work.
01:56:57.900 | I also think the robotics aspect,
01:56:59.420 | which you've clarified for me today,
01:57:01.440 | is extremely forward thinking and absolutely critical.
01:57:05.700 | So a lot of critical engineering that no doubt will wick out
01:57:09.360 | into other domains of neurosurgery and medical technology,
01:57:12.300 | not just serving Neuralink's mission directly.
01:57:16.180 | And I really want to thank you, first of all,
01:57:19.580 | for coming here today and taking time
01:57:22.000 | out of your important schedule of seeing patients
01:57:24.280 | and doing brain surgery, literally.
01:57:25.920 | - Happy to do it.
01:57:26.760 | - Time away from your family and time away
01:57:28.360 | from your mission at Neuralink briefly
01:57:31.040 | to share with people what you guys are doing.
01:57:34.360 | As I mentioned before, there's a lot of mystique around it.
01:57:37.500 | And despite the fact that Neuralink has gone out
01:57:40.780 | of their way to try and erase some of that mystique,
01:57:43.300 | this to me is the clearest picture ever,
01:57:46.640 | to my knowledge, that has been given
01:57:48.900 | about what's going on there and the stated
01:57:51.380 | and the real mission and what's going on
01:57:53.820 | at the level of nuts and bolts and guts and brains
01:57:57.300 | and this kind of thing.
01:57:58.340 | And I really just want to thank you also for being you,
01:58:01.480 | which perhaps sounds like kind of an odd thing to hear,
01:58:05.180 | but I think as made apparent by the device implanted
01:58:08.920 | in your hand, you don't just do this for a job.
01:58:13.440 | You live and breathe and embody, truly embody this stuff
01:58:18.160 | around the nervous system and trying to figure out
01:58:20.960 | how to fix it, how to make it better.
01:58:22.880 | And you live and breathe it
01:58:23.880 | and I know your deep love for it.
01:58:25.280 | So I want to thank you for not just the brains
01:58:28.860 | that you put into it and the energy you put into it,
01:58:31.360 | but also for the heart that you put into it.
01:58:33.940 | - Thanks for that, Andrew.
01:58:34.780 | I appreciate that.
01:58:36.180 | We just want to help people.
01:58:37.680 | We want to make things better.
01:58:39.240 | - Well, I know that to be true knowing you
01:58:42.080 | and thank you again for coming here today.
01:58:44.620 | And I look forward to another round of discussion
01:58:47.360 | and whenever the time happens to be
01:58:49.620 | when these incredible technologies have spelled out
01:58:53.360 | to the next major milestone.
01:58:56.000 | - Thank you.
01:58:57.000 | - Thank you for joining me for today's discussion
01:58:58.920 | with Dr. Matthew McDougall, all about the human brain
01:59:02.160 | and how it functions, how it breaks down
01:59:04.520 | and the incredible efforts that are being carried out
01:59:06.800 | at Neuralink in order to overcome diseases
01:59:09.360 | of brain and nervous system function
01:59:11.120 | and to augment how the human brain works.
01:59:13.920 | If you'd like to learn more about Dr. McDougall's work
01:59:16.040 | and the specific work being done at Neuralink,
01:59:18.440 | please see the links that we've provided
01:59:20.000 | in the show note captions.
01:59:21.600 | If you're learning from and or enjoying this podcast,
01:59:23.960 | please subscribe to our YouTube channel.
01:59:25.760 | That's a terrific zero cost way to support us.
01:59:28.360 | In addition, please subscribe to the podcast
01:59:30.600 | on Spotify and Apple.
01:59:32.160 | And in addition on both Spotify and Apple,
01:59:34.360 | you can leave us up to a five-star review.
01:59:36.680 | If you have questions for me or topics you'd like me to cover
01:59:39.480 | on the Huberman Lab Podcast or guests that you'd like me
01:59:41.840 | to consider inviting on the Huberman Lab Podcast,
01:59:44.160 | please put that in the comments on YouTube.
01:59:46.320 | I do read all the comments.
01:59:48.380 | In addition, please check out the sponsors mentioned
01:59:50.680 | at the beginning and throughout today's episode.
01:59:52.520 | That's the best way to support this podcast.
01:59:54.760 | Not so much on today's episode,
01:59:56.240 | but on various previous episodes
01:59:57.620 | of the Huberman Lab Podcast, we discuss supplements.
02:00:00.320 | While supplements aren't necessary for everybody,
02:00:02.240 | many people derive tremendous benefit from them
02:00:04.280 | for things like enhancing sleep, focus, and hormone support.
02:00:07.320 | The Huberman Lab Podcast is proud to have partnered
02:00:09.200 | with Momentous Supplements.
02:00:10.580 | If you'd like to hear more about the supplements discussed
02:00:12.740 | on the Huberman Lab Podcast,
02:00:14.080 | please go to livemomentous, spelled O-U-S, .com/huberman.
02:00:18.080 | Again, that's livemomentous.com/huberman.
02:00:20.800 | If you're not already following the Huberman Lab Podcast
02:00:23.020 | on social media, we are Huberman Lab on Instagram,
02:00:26.440 | Twitter, Facebook, and LinkedIn.
02:00:28.280 | And on all those places, I focus on material
02:00:31.080 | that somewhat overlaps with content
02:00:32.800 | from the Huberman Lab Podcast,
02:00:34.000 | but often is distinct from the content covered
02:00:36.160 | on the Huberman Lab Podcast.
02:00:37.320 | So again, it's Huberman Lab on all social media channels.
02:00:40.340 | For those of you that haven't already subscribed
02:00:42.120 | to our so-called neural network newsletter,
02:00:44.180 | this is a completely zero cost monthly newsletter
02:00:47.280 | that has summaries of podcast episodes
02:00:49.480 | and so-called toolkits.
02:00:50.800 | Toolkits are lists of about a page to two pages long
02:00:54.720 | that give the critical tools, for instance,
02:00:57.160 | for optimizing sleep or for neuroplasticity
02:00:59.200 | or deliberate cold exposure or deliberate heat exposure,
02:01:01.520 | optimizing dopamine.
02:01:02.560 | Again, all available to you at zero cost.
02:01:04.940 | You simply go to HubermanLab.com,
02:01:06.960 | go to the menu tab in the corner, scroll down to newsletter.
02:01:10.200 | You provide us your email.
02:01:11.180 | We do not share your email with anybody.
02:01:13.360 | And in addition to that, there are samples of toolkits
02:01:16.620 | on the HubermanLab.com website, again, under newsletter,
02:01:19.940 | and you don't even have to sign up to access those.
02:01:22.120 | But I think most people do end up signing up
02:01:23.520 | for the newsletter because it's rich
02:01:25.020 | with useful information and again, completely zero cost.
02:01:28.000 | Thank you once again for joining me
02:01:29.380 | for today's discussion with Dr. Matthew McDougall.
02:01:31.760 | And last, but certainly not least,
02:01:34.040 | thank you for your interest in science.
02:01:36.020 | [upbeat music]