Dr. Matthew MacDougall: Neuralink & Technologies to Enhance Human Brains | Huberman Lab Podcast
Chapters
0:00 Dr. Matthew MacDougall
4:05 Sponsors: HVMN, Levels, Thesis
7:38 Brain Function & Injury; Brain Tumor Treatment
13:52 Frontal Lobe Filter; Sleep Deprivation
19:00 Neuroplasticity, Pharmacology & Machines
22:10 Neuralink, Neural Implants & Injury, Robotics & Surgery
31:05 Sponsor: AG1 (Athletic Greens)
32:20 Neocortex vs. Deep Brain
36:45 Decoding Brain Signals
42:08 “Confidence Test” & Electrical Stimulation; RFID Implants
51:33 Bluetooth Headphones & Electromagnetic Fields; Heat
57:43 Brain Augmentation & Paralysis
60:51 Sponsor: InsideTracker
62:09 Brain Implants & Peripheral Devices
72:44 Brain Machine Interface (BMI), Neurofeedback; Video Games
82:13 Improving Animal Experimentation, Pigs
93:18 Skull & Injury, Traumatic Brain Injury (TBI)
99:14 Brain Health, Alcohol
103:34 Neuroplasticity, Brain Lesions & Redundancy
107:32 Car Accidents & Driver Alertness
110:00 Future Possibilities in Brain Augmentation & BMI; Neuralink
118:56 Zero-Cost Support, YouTube Feedback, Spotify & Apple Reviews, Sponsors, Momentous, Social Media, Neural Network Newsletter
Welcome to the Huberman Lab Podcast, 00:00:02.280 |
where we discuss science and science-based tools 00:00:10.280 |
and I'm a professor of neurobiology and ophthalmology 00:00:18.020 |
Dr. Matthew MacDougall is the head neurosurgeon at Neuralink. 00:00:21.600 |
Neuralink is a company whose goal is to develop technologies 00:00:31.120 |
that is to improve the way that brains currently function 00:00:34.820 |
by augmenting memory, by augmenting cognition, 00:00:37.700 |
and by improving communication between humans 00:00:44.440 |
and Neuralink is uniquely poised to accomplish these goals 00:00:48.240 |
because they are approaching these challenges 00:00:50.400 |
by combining both existing knowledge of brain function 00:00:53.780 |
from the fields of neuroscience and neurosurgery 00:00:56.680 |
with robotics, machine learning, computer science, 00:01:05.920 |
Today's conversation with Dr. Matthew MacDougall 00:01:10.360 |
because I and many others in science and medicine 00:01:17.860 |
That is, they go where others have simply not gone before 00:01:21.480 |
and are in a position to discover incredibly novel things 00:01:36.460 |
as the brain itself is changed structurally and functionally. 00:01:50.000 |
of brain function and disease are immediately tractable, 00:01:58.020 |
of augmenting brain function for sake of treating disease 00:02:10.440 |
is that Dr. MacDougall has a radio receiver implanted 00:02:16.240 |
He did this not to overcome any specific clinical challenge, 00:02:19.340 |
but to overcome a number of daily, everyday life challenges, 00:02:22.840 |
and in some ways to demonstrate the powerful utility 00:02:31.520 |
and different objects and technologies within the world. 00:02:34.640 |
I know that might sound a little bit mysterious, 00:02:36.120 |
but you'll soon learn exactly what I'm referring to. 00:02:38.780 |
And by the way, he also implanted his family members 00:02:57.560 |
of deep brain function, as well as to augment the human brain 00:03:05.460 |
are experiments and things that are happening now 00:03:09.080 |
Dr. MacDougall also generously takes us under the hood, 00:03:12.440 |
so to speak, of what's happening at Neuralink, 00:03:22.200 |
about the utility of animal versus human research 00:03:27.560 |
and improving the human brain and in overcoming disease 00:03:30.580 |
in terms of neurosurgery and Neuralink's goals. 00:03:36.300 |
of how human brains work and how they can be improved 00:03:42.800 |
of what Neuralink is doing toward these goals. 00:03:49.420 |
and at Stanford University School of Medicine. 00:03:59.660 |
and to explain to us what the past, present and future 00:04:05.540 |
Before we begin, I'd like to emphasize that this podcast 00:04:08.260 |
is separate from my teaching and research roles at Stanford. 00:04:13.220 |
to bring zero cost to consumer information about science 00:04:15.860 |
and science-related tools to the general public. 00:04:19.580 |
I'd like to thank the sponsors of today's podcast. 00:04:29.100 |
I want to be very clear that I, like most people, 00:04:32.980 |
but I, like most people, do not follow a ketogenic diet. 00:04:37.760 |
However, most people don't realize that you can still benefit 00:04:44.620 |
I take Ketone-IQ prior to doing really focused 00:04:47.780 |
cognitive work, so I take it once in the afternoon, 00:04:51.500 |
anytime I'm going to prepare for a podcast or do a podcast, 00:04:54.800 |
or if I'm going to do some research or focus on a grant, 00:04:58.060 |
anything that requires a high level of cognitive demand, 00:05:00.480 |
and that's because ketones are the brain's preferred source 00:05:03.100 |
of fuel, even if you're not following a ketogenic diet. 00:05:16.620 |
Today's episode is also brought to us by Levels. 00:05:19.380 |
Levels is a program that lets you see how different foods 00:05:23.640 |
by giving you real-time feedback on your diet 00:05:31.500 |
and Levels allows you to assess how what you eat 00:05:35.260 |
and what combinations of foods you eat and exercise 00:05:40.660 |
should you indulge in alcohol and things of that sort, 00:05:45.200 |
Now, it's very important that the cells of your body, 00:05:47.080 |
and in particular, the cells of your nervous system, 00:05:58.200 |
is to understand how their specific routines, 00:06:06.880 |
find that there's a lot to learn and a lot to be gained 00:06:09.140 |
by understanding these blood glucose patterns. 00:06:11.780 |
If you're interested in learning more about Levels 00:06:13.520 |
and trying a continuous glucose monitor yourself, 00:06:27.780 |
Today's episode is also brought to us by Thesis. 00:06:32.580 |
And as many of you have perhaps heard me say before, 00:06:46.340 |
It does not have neural circuits for quote unquote, 00:06:50.180 |
Thesis understands this and has designed custom nootropics, 00:06:53.560 |
each of which is designed to place your brain and body 00:06:56.980 |
ideal for a particular type of work or physical effort, 00:07:10.100 |
so that you can assess which things work for you 00:07:13.180 |
And then they'll iterate with you over the course 00:07:16.420 |
to come up with the ideal nootropic kit for your needs. 00:07:19.680 |
To get your own personalized nootropic starter kit, 00:07:35.540 |
And now for my discussion with Dr. Matthew MacDougall. 00:07:44.220 |
We'll get into our history a little bit later, 00:07:52.300 |
could you share with us your vision of the brain 00:07:55.700 |
as an organ as it relates to what's possible there? 00:08:25.100 |
and they like to mend and they, in your case, 00:08:27.800 |
have the potential to add things into the brain 00:08:31.480 |
So how do you think about and conceptualize the brain 00:08:34.440 |
as an organ and what do you think is really possible 00:08:44.160 |
Thinking about the brain as this three pound lump of meat 00:08:52.760 |
it seems almost magical that it could create a human, 00:09:07.160 |
a small tumor eating away at a little part of the brain 00:09:17.760 |
you start to realize that the brain really is 00:09:20.840 |
a collection of functional modules pinned together, 00:09:24.480 |
duct taped together in this bone box attached to your head 00:09:29.480 |
and sometimes you see very interesting failure modes. 00:09:37.040 |
So one of the most memorable patients I ever had 00:09:43.280 |
I was down at UC San Diego and saw a very young guy 00:10:15.120 |
saw that he was doing okay to our first guess at his health. 00:10:20.120 |
And we continued on to see our other patients 00:10:29.040 |
saying, "You've got to come see your patient right away. 00:11:00.320 |
anything from control of hormone levels in your body, 00:11:29.160 |
but once in a while you get a chance to really help. 00:11:51.580 |
of what qualifies as satisfying in neurosurgery? 00:11:56.900 |
One of the relatively newer techniques that we do 00:12:01.260 |
is if someone comes in with a reasonably small tumor 00:12:05.620 |
somewhere deep in the brain that's hard to get to, 00:12:11.220 |
would involve cutting through a lot of good normal brain 00:12:14.060 |
and disrupting a lot of neurons, a lot of white matter, 00:12:22.860 |
involves a two millimeter drill hole in the skull 00:12:26.180 |
down which you can pass a little fiber optic cannula. 00:12:33.100 |
and just heat the tumor up deep inside the brain 00:12:45.300 |
As the tumor heats up, you can monitor the temperature 00:12:52.720 |
but not hurt hardly any of the brain surrounding it. 00:13:00.860 |
that previously would have been catastrophic to operate on 00:13:43.660 |
where we would have expected to either not operate 00:13:48.700 |
those people sometimes now are coming out unscathed. 00:13:52.640 |
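[Editor's illustration: the temperature monitoring described here can be sketched with the Sapareto-Dewey CEM43 thermal-dose model, one standard way to quantify how much heat tissue has accumulated. The model choice, temperatures, and durations below are assumptions for illustration, not the specific protocol used in these procedures.]

```python
# Illustrative only: CEM43 (cumulative equivalent minutes at 43 C) quantifies
# thermal dose from a temperature trace. Values below are invented.

def cem43(temps_c, dt_min):
    """Thermal dose for a temperature trace sampled every dt_min minutes."""
    dose = 0.0
    for t in temps_c:
        r = 0.5 if t >= 43.0 else 0.25  # standard breakpoint at 43 C
        dose += dt_min * (r ** (43.0 - t))
    return dose

dt = 1.0 / 60.0                 # one sample per second, expressed in minutes
tumor = [55.0] * 120            # tumor held near 55 C for two minutes
nearby_brain = [40.0] * 120     # surrounding brain stays near 40 C

print(cem43(tumor, dt))         # huge dose -> tissue ablated
print(cem43(nearby_brain, dt))  # negligible dose -> tissue spared
```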
- I'm very curious about the sorts of basic information 00:14:06.700 |
So for instance, in your example of this patient 00:14:10.680 |
what do you think his lack of regulation reveals 00:14:14.980 |
about the normal functioning of the frontal lobes? 00:14:17.720 |
Because I think the obvious answer to most people 00:14:24.600 |
but as we both know, because the brain has excitatory 00:14:27.940 |
and inhibitory neurons as sort of accelerators 00:14:31.320 |
that isn't necessarily the straightforward answer. 00:14:35.080 |
It could be, for instance, that the frontal lobes 00:14:39.100 |
are acting as conductors and are kind of important, 00:14:44.100 |
but not the immediate players in determining impulsivity. 00:14:51.760 |
what do you think the frontal lobes are doing? 00:14:58.460 |
We have a lot of it compared to other animals. 00:15:03.560 |
of a given neural tissue means in terms of understanding 00:15:10.140 |
- Yeah, it varies, I think, from tissue to tissue. 00:15:25.080 |
when part of your brain says that looks very attractive. 00:15:59.620 |
He had this impulse, this sort of strange impulse 00:16:06.420 |
that normally it would be easy for our frontal lobes 00:16:22.940 |
to want something came right out to the surface. 00:16:27.940 |
So yeah, a filter calming the rest of the brain down 00:16:40.280 |
but just to inform you of what are called acute, 00:16:51.000 |
that are made uncomfortable by animal research. 00:16:53.140 |
I now work on humans, so a different type of animal. 00:17:02.220 |
the animal's anesthetized and doesn't feel any pain 00:17:08.400 |
is that the experimenter, me and another individual 00:17:13.200 |
with an hour of sleep here or an hour of sleep there. 00:17:15.900 |
But you're basically awake for two, three days. 00:17:17.360 |
Something that really I could only do in my teens and 20s. 00:17:25.740 |
after one of these experiments and I was very hungry. 00:17:29.240 |
And the waitress walking by with a tray full of food 00:17:38.220 |
to not get up and take the food off the tray. 00:17:40.700 |
Something that of course is totally inappropriate 00:17:43.400 |
And it must have been based on what you just said, 00:17:46.380 |
that my forebrain was essentially going offline 00:17:53.300 |
where I thought I might reach up and grab a plate of food 00:17:56.340 |
passing by simply because I wanted it and I didn't. 00:18:01.020 |
But I can relate to the experience of feeling 00:18:05.020 |
like the shh response is a flickering in and out 00:18:10.900 |
So do we know whether or not sleep deprivation 00:18:12.500 |
limits forebrain activity in a similar kind of way? 00:18:31.020 |
So the opposite end of the brain, as you know, 00:18:32.860 |
the visual cortex in the far back of the brain is affected. 00:18:43.060 |
So I think if you force me to give a definitive answer 00:18:48.060 |
on that question, I'd have to guess that the entire brain 00:18:55.760 |
And it's not clear that one part of the brain 00:18:59.240 |
- So we've been talking about damage to the brain 00:19:04.780 |
We could talk a little bit about what I consider really 00:19:11.340 |
which is neuroplasticity, this incredible capacity 00:19:19.460 |
maybe new neurons, but probably more strengthening 00:19:25.820 |
about so-called classical psychedelics like LSD 00:19:28.660 |
and psilocybin, which do seem to quote unquote, 00:19:34.140 |
but through the release of neuromodulators like serotonin 00:19:37.340 |
and so forth, how do you think about neuroplasticity 00:19:41.700 |
and more specifically, what do you think the potential 00:19:55.460 |
'Cause in your role at Neuralink and as a neurosurgeon 00:19:58.860 |
in other clinical settings, surely you are using machines 00:20:03.100 |
and surely you've seen plasticity in the positive 00:20:18.340 |
that plasticity definitely goes down in older brains. 00:20:21.760 |
It is harder for older people to learn new things, 00:20:40.620 |
aren't the obvious answer to increase plasticity necessarily 00:20:46.140 |
We already know that there are pharmacologics, 00:20:48.940 |
some of the ones you mentioned, psychedelics, 00:20:54.140 |
Yeah, it's hard to know which area of the brain 00:21:03.880 |
compared to pharmacologic agents that we already know about. 00:21:14.740 |
you're talking about altering a trillion synapses 00:21:19.540 |
all in a similar way in their tendency to be rewireable, 00:21:27.860 |
and an electrical stimulation target in the brain 00:21:38.540 |
there might be a more broad ability to steer current 00:21:42.820 |
to multiple targets with some degree of control, 00:21:46.100 |
but you're never going to get that broad target ability 00:21:51.100 |
with any electrodes that I can see coming in our lifetimes. 00:21:56.800 |
Let's just say that would be coating the entire surface 00:21:59.860 |
and depth of the brain the way that a drug can. 00:22:03.020 |
And so I think plasticity research will bear the most fruit 00:22:13.780 |
And then again, I think that all of us, me included, 00:22:21.020 |
while we may think we know what is going on at Neuralink 00:22:25.240 |
in terms of the specific goals and the general goals, 00:22:41.380 |
We really don't know what you all are doing there. 00:22:53.860 |
because I think that one of the things that's so exciting 00:23:01.540 |
has really spearheaded, SpaceX, Tesla, et cetera, 00:23:08.940 |
Mystique is a quality that is not often talked about, 00:23:17.220 |
in which engineers are starting to toss up big problems 00:23:23.460 |
And obviously, Elon is certainly among the best, 00:23:26.480 |
if not the best, in terms of going really big. 00:23:33.360 |
are very different than the picture a few years ago 00:23:35.660 |
when you didn't see so many of them, rockets and so forth, 00:23:58.840 |
I can imagine basic goals of trying to understand the brain 00:24:02.460 |
I could imagine clinical goals of trying to repair things 00:24:11.660 |
Neuralink and I think Tesla and SpaceX before it 00:24:19.760 |
that people project their hopes and fears onto. 00:24:22.700 |
And so we experience a lot of upside in this. 00:24:36.940 |
For the most part, those extremes are not true. 00:24:56.160 |
In the first indication that we're aiming at, 00:25:00.680 |
we are hoping to implant a series of these electrodes 00:25:17.900 |
- Because of some high level spinal cord damage. 00:25:21.060 |
And so this pristine motor cortex up in their brain 00:25:25.740 |
is completely capable of operating a human body. 00:25:44.820 |
So a mouse and a keyboard as if they had their hands 00:25:52.100 |
their motor intentions are coming directly out of the brain 00:25:57.940 |
And so they're able to regain their digital freedom 00:26:01.380 |
and connect with the world through the internet. 00:26:10.520 |
I can imagine that a robot could be more precise 00:26:16.820 |
more precise than the human hand, no tremor, for instance, 00:26:26.200 |
a little micro detection device on the tip of the blade 00:26:34.220 |
that you would want to avoid and swerve around 00:26:38.640 |
And you and I both know, however, that no two brains, 00:26:42.840 |
nor are the two sides of the same brain identical. 00:26:53.800 |
However, and here I'm going to interrupt myself again 00:27:01.320 |
was very clearly performed better by humans than machines. 00:27:10.700 |
So is this the idea that eventually, or maybe even now, 00:27:20.360 |
These electrodes are so tiny and the blood vessels 00:27:25.600 |
and so densely packed that a human physically can't do this. 00:27:38.680 |
and place it accurately, blindly, by the way, 00:27:46.100 |
at the right depth to get through all the cortical layers 00:28:08.180 |
to lean on robots to do this incredibly precise, 00:28:12.620 |
incredibly fast, incredibly numerous placement 00:28:16.440 |
of electrodes into the right area of the brain. 00:28:26.320 |
and augmentation and treatment of human brain conditions. 00:28:34.880 |
it is only for the placement of the electrodes, 00:28:44.680 |
the more crude part of opening the skin and skull 00:28:47.160 |
and presenting the robot a pristine brain surface 00:28:55.600 |
to be able to move again, or maybe even to walk again, 00:28:59.140 |
is a heroic goal and one that I think everyone would agree 00:29:07.260 |
Is that the first goal because it's hard but doable? 00:29:13.300 |
- Or is that the first goal because you and Elon 00:29:23.560 |
- Yeah, broadly speaking, the mission of Neuralink 00:29:26.900 |
is to reduce human suffering, at least in the near term. 00:29:30.300 |
You know, there's hope that eventually there's a use here 00:29:33.600 |
that makes sense for a brain interface to bring AI 00:29:38.600 |
as a tool embedded in the brain that a human can use 00:29:45.400 |
I think that's pretty far down the road for us, 00:29:51.940 |
In the near term, we really are focused on people 00:30:00.020 |
With regard to motor control, you know, our mutual friend 00:30:05.020 |
recently departed, Krishna Shenoy, was a giant 00:30:12.120 |
It just so happens that his work was foundational 00:30:16.460 |
for a lot of people that work in this area, including us, 00:30:21.340 |
That work was farther along than most other work 00:30:26.700 |
for addressing any function that lives on the surface 00:30:33.420 |
require us currently to focus on only surface features 00:30:37.900 |
So we can't say go to the really very compelling surface, 00:30:42.900 |
deep depth functions that happen in the brain, 00:30:48.060 |
like, you know, mood, appetite, addiction, pain, sleep. 00:30:56.660 |
but in the immediate future, our first indication 00:31:00.240 |
or two or three will probably be brain surface functions 00:31:06.020 |
- I'd like to take a quick break and acknowledge 00:31:15.280 |
that covers all of your foundational nutritional needs. 00:31:20.860 |
so I'm delighted that they're sponsoring the podcast. 00:31:36.000 |
that communicate with the brain, the immune system, 00:31:37.740 |
and basically all the biological systems of our body 00:31:40.140 |
to strongly impact our immediate and long-term health. 00:31:49.480 |
In addition, Athletic Greens contains a number of adaptogens, 00:31:53.460 |
that all of my foundational nutritional needs are met, 00:32:05.100 |
that make it really easy to mix up Athletic Greens 00:32:07.420 |
while you're on the road, in the car, on the plane, et cetera, 00:32:10.000 |
and they'll give you a year's supply of vitamin D3K2. 00:32:20.780 |
So for those listening, the outer portions of the brain 00:32:24.780 |
are filled with, or consist of, rather, neocortex, 00:32:29.780 |
so the bumpy stuff that looks like sea coral. 00:32:39.540 |
And then underneath reside a lot of the brain structures 00:32:46.980 |
things that control our mood, hormone output, 00:32:52.060 |
And would you agree that those deeper regions of the brain 00:32:55.560 |
have, in some ways, more predictable functions? 00:32:57.860 |
I mean, that lesions there or stimulation there 00:33:02.340 |
in terms of deficits or improvements in function. 00:33:16.840 |
They're kind of the firmware or the housekeeping functions, 00:33:20.840 |
to some degree, body temperature, blood pressure, 00:33:26.760 |
things that you don't really need to vary dramatically 00:33:36.080 |
of problem-solving functions between a fox and a human 00:33:40.100 |
are vastly different, and so the physical requirements 00:33:46.180 |
I think I heard Elon describe it as the human brain 00:34:00.140 |
on top of all that more stereotyped function-type stuff 00:34:05.820 |
and it's still unclear what neocortex is doing. 00:34:09.760 |
In the case of frontal cortex, as you mentioned earlier, 00:34:25.600 |
that sure are involved in vision or touch or hearing, 00:34:32.800 |
So I'm curious whether or not in your clinical work 00:34:39.240 |
whether or not you have ever encountered neurons 00:34:43.200 |
that do something that's really peculiar and intriguing. 00:34:54.040 |
when I stimulate them or when they're taken away, 00:34:57.760 |
lead to something kind of bizarre but interesting. 00:35:01.060 |
- Yeah, yeah, the one that comes immediately to mind 00:35:16.160 |
which is sort of an uncontrollable fit of laughter. 00:35:32.000 |
And so you don't normally think of a deep structure 00:35:43.640 |
And certainly when we think about this kind of laughter 00:35:50.920 |
it's mirthless laughter is the kind of textbook phrase, 00:35:58.240 |
It's just a reflexive, almost zombie-like behavior. 00:36:05.540 |
And it comes from a very small population of neurons 00:36:11.360 |
This is one of the other sort of strange loss of functions 00:36:16.360 |
you might say is it's nice that you and I can sit here 00:36:20.660 |
and not have constant disruptive fits of laughter 00:36:25.420 |
coming out of our bodies, but that's a neuronal function. 00:36:29.680 |
That's, thank goodness, due to neurons properly wired 00:36:35.640 |
And any neurons that do anything like this can be broken. 00:36:40.500 |
And so we see this in horrifying cases like that 00:36:44.440 |
- So I'm starting to sense that there are two broad bins 00:36:52.780 |
either to treat disease or for sake of increasing memory, 00:36:58.500 |
One category you alluded to earlier, which is pharmacology. 00:37:02.160 |
And you specifically mentioned the tremendous power 00:37:05.480 |
that pharmacology holds, whether or not it's through 00:37:13.140 |
The other approach are these little microelectrodes 00:37:20.480 |
into multiple regions in order to play essentially 00:37:25.100 |
a concert of electricity that is exactly right 00:37:33.860 |
First of all, is there a role for and is Neuralink 00:37:37.580 |
interested in combining pharmacology with stimulation? 00:37:42.860 |
Right now we're solely focused on the extremely hard, 00:37:46.360 |
some might say the hardest problem facing humans right now 00:37:49.500 |
of decoding the brain through electrical stimulation 00:37:56.180 |
- So to just give us a bit fuller picture of this, 00:37:59.660 |
we were talking about a patient who can't move their limbs 00:38:03.940 |
The motor cortex that controls movement is in theory fine. 00:38:09.920 |
Make a small hole in the skull and through that hole, 00:38:18.580 |
Is the idea that you're going to play a concert 00:38:21.300 |
You're going to hit all the keys on the piano 00:38:22.740 |
in different combinations and then figure out 00:38:27.080 |
What I'm alluding to here is I still don't understand 00:38:30.820 |
how the signals are going to get out of motor cortex 00:38:32.700 |
past the lesion and into and out to the limbs 00:38:35.540 |
because the lesion hasn't been dealt with at all 00:38:38.420 |
- So just to clarify there, I should emphasize 00:38:42.420 |
we're not in the immediate future talking about 00:38:44.980 |
reconnecting the brain to the patient's own limbs. 00:38:52.500 |
What we're talking about in the immediate future 00:38:55.580 |
is having the person be able to control electronic devices 00:38:58.260 |
around them with their motor intentions alone, right? 00:39:01.580 |
- So prosthetic hand and arm or just mouse and keys on a-- 00:39:07.740 |
So you wouldn't see anything in the world move. 00:39:10.140 |
As they have an intention, the patient might imagine, 00:39:14.440 |
say flexing their fist or moving their wrist. 00:39:18.740 |
And what would happen on the screen is the mouse would move 00:39:25.580 |
And then a keyboard at the bottom of the screen 00:39:27.360 |
would allow them to select letters in sequence 00:39:32.420 |
This is the easy place to start, easy in quotes. 00:39:55.540 |
out of motor cortex and put it into a mouse or a robot arm, 00:40:02.560 |
I mean, that's a whole other set of problems in fact. 00:40:05.520 |
- Well, we're unloading some of that difficulty 00:40:08.720 |
from the brain itself, from the brain of the patient 00:40:18.980 |
to decode the motor intentions out of the brain. 00:40:22.840 |
We have been able to do this in monkeys really well. 00:40:35.120 |
We actually have the world record of bit rate 00:40:39.040 |
of information coming out of a monkey's brain 00:40:41.320 |
to intelligently control a cursor on a screen. 00:40:47.160 |
And again, thanks in no small part due to Krishna Shenoy 00:40:52.440 |
and his lab and the people that have worked for him 00:41:08.580 |
You can't tell him to try something different. 00:41:11.000 |
You can't tell him to, hey, try the shoulder on this. 00:41:13.780 |
Try the other hand and see if there's some cross body 00:41:18.040 |
neuron firing that gives you a useful signal. 00:41:25.540 |
when they've done similar work in academic labs, 00:41:35.020 |
So one of the things out of Stanford recently 00:41:39.360 |
is there was a lab that with Krishna and Jaimie Henderson 00:42:04.380 |
but that doesn't mean only hand movement in the hand area. 00:42:11.200 |
There's a long history dating back really prior 00:42:14.800 |
to the 1950s of scientists doing experiments on themselves. 00:42:22.360 |
but because they want the exact sorts of information 00:42:26.540 |
The ability to really understand how intention 00:42:30.680 |
and awareness of goals can shape outcomes in biology. 00:42:42.200 |
scientists have taken the drugs they've studied 00:42:47.840 |
to really try and get a sense of what the animals 00:42:52.880 |
Psychiatrists are sort of famous for this, by the way. 00:42:59.640 |
And some people would probably imagine that's a good thing 00:43:02.840 |
just so that the clinicians could have empathy 00:43:06.180 |
for the sorts of side effects and not so great effects 00:43:09.160 |
of some of these drugs that they administer to the patients. 00:43:17.500 |
would you be willing or are you willing, if allowed, 00:43:22.500 |
to have these electrodes implanted into your motor cortex? 00:43:30.620 |
But given the state of the technology at Neuralink now, 00:43:36.760 |
Or maybe in the next couple of years, if you were allowed, 00:43:40.160 |
would you be willing to do that and be the person to say, 00:43:47.840 |
with that robotic arm, but I'm feeling some resistance. 00:43:59.580 |
and the goals of the experiment that I would argue 00:44:02.360 |
actually stands to advance the technology fastest 00:44:05.360 |
as opposed to putting the electrodes first into somebody 00:44:11.920 |
and then trying to think about why things aren't working. 00:44:20.700 |
But would you implant yourself with these microelectrodes? 00:44:27.440 |
I think for the first iteration of the device, 00:44:50.040 |
for people with bad medical problems and no good options. 00:44:53.900 |
It wouldn't really make sense for an able-bodied person 00:45:13.840 |
to get to the point where you can type faster 00:45:29.400 |
It doesn't really make sense for me to get one 00:45:32.020 |
when it allows me to use a mouse slightly worse 00:45:48.800 |
than many of the industry standard FDA approved surgeries 00:45:56.880 |
that no one even thinks twice about their standard of care. 00:46:12.760 |
- Along the lines of augmenting one's biological function 00:46:45.440 |
And you've always had it in the time that I've known you. 00:46:47.640 |
What is that lump and why did you put it in there? 00:47:00.800 |
And so it's just a very small implantable chip 00:47:27.360 |
For some years, it unlocked the doors at Neuralink 00:47:41.700 |
And so for some years in the early days of crypto, 00:47:53.660 |
a dead offshoot of one of the main cryptocurrencies 00:48:01.940 |
and forgot about it and remembered a few years later 00:48:11.200 |
So that was a nice finding change in the sofa 00:48:16.080 |
you're essentially taking a phone or other device 00:48:18.560 |
and scanning it over the lump in your hand, so to speak. 00:48:23.560 |
And then it can read the data from there essentially. 00:48:27.160 |
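[Editor's illustration: the data side of such a tag is tiny. The sketch below packs a short string into an NDEF "Text" record, the kind of payload many NFC-readable tags expose to a phone. Whether this particular implant uses NDEF, and what it actually stores, is an assumption made purely for illustration.]

```python
# Minimal sketch: build a single short NDEF "Text" record. The record contents
# and the use of NDEF here are assumptions, not a description of the actual chip.

def ndef_text_record(text: str, lang: str = "en") -> bytes:
    payload = bytes([len(lang)]) + lang.encode("ascii") + text.encode("utf-8")
    header = 0xD1  # message begin | message end | short record | well-known type
    return bytes([header, 0x01, len(payload), ord("T")]) + payload

record = ndef_text_record("hypothetical-key-material")  # placeholder contents
print(record.hex())
print(len(record), "bytes")  # must fit the tag's small user memory
```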
What other sorts of things could one put into these RFIDs 00:48:31.360 |
in theory and how long can they stay in there 00:48:44.680 |
and so I was worried about that glass shattering 00:48:49.720 |
I additionally coated them in another ring of silicone 00:49:02.380 |
I don't think I'd ever have to remove it for any reason. 00:49:05.320 |
At some point, the technology is always improving, 00:49:28.740 |
tiptoeing towards the concept that you mentioned 00:49:31.440 |
of you have to be willing to go through the things 00:49:39.900 |
that you think this is a reasonable thing to do. 00:49:46.880 |
it's a little different than a brain implant. 00:49:48.700 |
- Yeah, what's involved in getting that RFID chip 00:50:09.800 |
is probably as painful as just doing the procedures. 00:50:17.920 |
'cause you'll never have to worry about losing your keys 00:50:23.520 |
because I'm dreadfully bad at remembering passwords. 00:50:27.200 |
I have to put them in places all over the place. 00:50:33.040 |
where the kid hides the pennies under the porch 00:50:40.400 |
Yeah, so it was just a little slit and then put in there. 00:50:43.280 |
No local immune response, no pus, no swelling. 00:50:47.160 |
- All the materials are completely biocompatible. 00:50:51.680 |
So no bad reaction, it healed up in days and it was fine. 00:50:58.680 |
maybe can you just maybe raise it and show us? 00:51:01.200 |
Yeah, so were you not to point out that little lump? 00:51:09.280 |
And any other members of your family have these? 00:51:12.440 |
- A few years after having this and seeing the convenience 00:51:16.160 |
of me being able to open the door without keys, 00:51:18.520 |
my wife insisted that I put one in her as well. 00:51:23.440 |
- We consider them our version of wedding rings. 00:51:35.600 |
even though it might seem a little bit off topic. 00:51:37.440 |
As long as we're talking about implantable devices 00:51:49.100 |
You work on the brain, you're a brain surgeon. 00:51:53.280 |
And you understand about electromagnetic fields 00:51:55.920 |
and any discussion about EMFs immediately puts us 00:51:59.080 |
in the category of, uh-oh, like get their tinfoil hats. 00:52:11.920 |
Everything's a real thing at some level, even an idea. 00:52:18.080 |
that electromagnetic fields of sufficient strength 00:52:21.800 |
can alter the function of, maybe the health of, 00:52:30.900 |
So I'll just ask this in a very straightforward way. 00:52:33.380 |
Do you use Bluetooth headphones or wired headphones? 00:52:36.420 |
- And you're not worried about any kind of EMF fields 00:52:40.600 |
- No, I mean, I think the energy levels involved 00:52:48.520 |
we're way out of the realm of ionizing radiation 00:52:51.520 |
that people would worry about tumor causing EMF fields. 00:53:02.840 |
as is very well described in a Bluetooth frequency range, 00:53:18.940 |
with ionizing radiation in a very tiny amount, 00:53:49.480 |
it's just the energy levels are way, way out of the range 00:53:59.080 |
I don't use the earbuds any longer for a couple of reasons. 00:54:04.680 |
Once, as you know, I take a lot of supplements 00:54:14.840 |
I swallowed it the moment after I gulped it down. 00:54:22.640 |
But I could see it on my phone as registering there. 00:54:27.560 |
But anyway, there's a bad joke there to be sure. 00:54:32.560 |
But in any event, I tend to lose them or misplace them. 00:54:43.040 |
I also am not convinced that plugging your ears 00:54:46.260 |
There's some ventilation through the sinus systems 00:55:10.140 |
in terms of keeping the cells healthy and alive. 00:55:29.340 |
So these analogies are going to start to be foreign 00:55:44.340 |
your body has a massive distributed fluid cooling system 00:55:50.300 |
You're pumping blood all around your body all the time 00:56:14.620 |
and the blood is going to bring heat back to that area. 00:56:23.920 |
from the sun itself that contain UV radiation 00:56:30.320 |
- If you're looking for things to be afraid of, 00:56:33.660 |
- Now you're talking to the guy that tells everybody 00:56:35.360 |
to get sunlight in their eyes every morning, 00:56:40.300 |
I encourage people to protect their skin accordingly. 00:56:43.160 |
And different individuals require different levels 00:56:46.720 |
Some people do very well in a lot of sunshine, 00:56:52.020 |
Some people, and it's not just people with very fair skin, 00:56:55.240 |
a minimum of sun exposure can cause some issues. 00:56:58.800 |
And here I'm talking about sun exposure to the skin. 00:57:04.360 |
- Thinking about the sun just as a heater for a moment 00:57:11.680 |
your body is very capable of carrying that heat away 00:57:22.220 |
So any heat that's locally generated in the ear, 00:57:24.880 |
one, there's a pretty large bony barrier there, 00:57:28.000 |
but two, there's a ton of blood flow in the scalp 00:57:30.080 |
and in the head in general and definitely in the brain 00:57:47.640 |
You've made very clear that one of the first goals 00:57:50.040 |
for Neuralink is to get quadriplegics walking again. 00:58:04.320 |
the patient's own muscle system to their motor cortex. 00:58:09.640 |
agency over the movement of things in the world. 00:58:17.680 |
And we've done a lot of work on developing a system 00:58:24.760 |
And so that gets to the question that you asked 00:58:36.680 |
in the spinal cord itself connected to an implant 00:58:39.640 |
in the brain and have them talking to each other, 00:58:42.000 |
you can take the perfectly intact motor signals 00:58:44.720 |
out of the motor cortex and send them to the spinal cord, 00:58:55.440 |
or motorcycle accident or gunshot wound or whatever. 00:58:58.560 |
And it should be possible to reconnect the brain 00:59:13.720 |
- And here, I just want to flag the hundred years 00:59:17.920 |
or more of incredible work by basic scientists. 00:59:21.520 |
The names that I learned about in my textbooks 00:59:23.280 |
as a graduate student were like Georgopoulos, 00:59:32.440 |
sophisticated recordings out of motor cortex, 00:59:34.280 |
just simply asking what sorts of electrical patterns 00:59:36.740 |
are present in motor cortex as an animal or human, 00:59:40.840 |
Krishna Shenoy being another major pioneer in this area 00:59:45.920 |
And just really highlighting the fact that basic research 00:59:48.160 |
where an exploration of neural tissue is carried out 01:00:08.800 |
sometimes are and sometimes stand on the shoulders 01:00:13.360 |
They were the real pioneers that they were involved 01:00:17.800 |
in the grind for years in an unglorious, unglamorous way. 01:00:25.240 |
And the reward for all the hard work is a paper 01:00:30.380 |
at the end of the day that is read by dozens of people. 01:00:34.520 |
And so they were selfless academic researchers 01:00:47.320 |
for all the hard work that they've done and continue to do. 01:00:56.320 |
InsideTracker is a personalized nutrition platform 01:01:05.000 |
I've long been a believer in getting regular blood work done 01:01:07.280 |
for the simple reason that blood work is the only way 01:01:10.260 |
that you can monitor the markers, such as hormone markers, 01:01:14.720 |
that impact your immediate and long-term health. 01:01:17.440 |
One major challenge with blood work, however, 01:01:19.560 |
is that most of the time it does not come back 01:01:32.960 |
because it has a personalized dashboard that you can use 01:01:35.900 |
to address the nutrition-based, behavior-based, 01:01:41.680 |
in order to move those values into the ranges 01:01:43.820 |
that are optimal for you, your vitality, and your longevity. 01:01:48.580 |
of apolipoprotein B, so-called ApoB, in their ultimate plan. 01:01:52.240 |
ApoB is a key marker of cardiovascular health, 01:02:05.100 |
Again, that's insidetracker.com/huberman to get 20% off. 01:02:13.580 |
about Neuralink that I overheard between Elon 01:02:22.180 |
that I think are still very much in play in people's minds. 01:02:30.500 |
that you so appropriately have worn on your shirt today. 01:02:40.340 |
Translates to seahorse, and it's an area of the brain 01:02:43.380 |
that's involved in learning and memory among other things. 01:02:49.980 |
that a chip or chips could be implanted in the hippocampus 01:02:53.020 |
that would allow greater than normal memory abilities, 01:02:59.980 |
Another idea that I heard about in these discussions 01:03:04.020 |
was, for instance, that you would have some chips 01:03:06.980 |
in your brain and I would have some chips in my brain 01:03:08.780 |
and you and I could just sit here looking at each other 01:03:17.140 |
which sounds outrageous, but of course, why not? 01:03:24.460 |
as our good friend Eddie Chang, who's a neurosurgeon 01:03:41.180 |
that shaping of the lungs come from some intention. 01:03:44.100 |
I have some idea, although it might not seem like it, 01:03:58.900 |
I mean, think about the fact that we could do this right now 01:04:03.340 |
if you pulled out your phone and started texting me 01:04:05.980 |
on my phone and I looked down and started texting you, 01:04:08.600 |
we would be communicating without looking at each other 01:04:11.620 |
Shifting that function from a phone to an implanted device, 01:04:17.200 |
it requires no magic advance, no leap forward. 01:04:34.020 |
- Or, and again, I'm deliberately interrupting, 01:04:36.700 |
or I can text an entire team of people simultaneously 01:04:52.700 |
- Yeah, and so texting each other with our brains 01:04:57.620 |
but it's not very difficult to imagine the implementation 01:05:02.940 |
of the same device in a more verbally focused area 01:05:07.420 |
of the brain that allows you to more naturally 01:05:12.220 |
and have them rendered into speech that I can hear, 01:05:15.980 |
maybe via a bone conducting implant, so silently here. 01:05:30.380 |
think their first name, which might cue up a device 01:05:33.620 |
that would then play my voice to them and say, 01:05:36.540 |
just got off the plane, I'm going to grab my bag 01:05:41.180 |
- Right, so that's all possible, meaning we know the origin 01:05:46.180 |
of the neural signals that gives rise to speech. 01:05:50.340 |
We know the different mechanical and neural apparati, 01:05:59.580 |
that transduce sound waves into electrical signals. 01:06:17.420 |
- For that use case, nonverbal communication, you might say, 01:06:22.420 |
that's a solved problem in a very crude disjointed way. 01:06:45.620 |
that are pulling all that together into one product. 01:06:48.920 |
That's a streamlined package from end to end. 01:06:54.140 |
- And we, I think, have some hints of how easily 01:07:20.980 |
So this is a device that translates sound in the environment 01:07:28.640 |
She's an admirer of birds and all things avian. 01:07:51.300 |
Now when walking past, say, pigeons in the park, 01:08:00.140 |
and that indeed it enriched her experience of those birds 01:08:04.200 |
in ways that obviously it wouldn't otherwise. 01:08:08.580 |
to find out whether or not ongoing use of the Neosensory 01:08:16.020 |
or kind of equivalent experience of avians in the world, 01:08:26.240 |
What are your thoughts about peripheral devices 01:08:30.100 |
like that peripheral meaning outside of the skull, 01:08:47.140 |
And do you think that those are going to be used 01:08:49.700 |
more readily before the kind of brain surgery 01:08:54.900 |
- Yeah, certainly the barrier to entry is lower. 01:09:02.660 |
that's hard to say no to when you can slip it on 01:09:06.580 |
and slip it off and not have to get your skin cut at all. 01:09:09.860 |
Again, there's no perfect measure of the efficacy 01:09:16.540 |
of a device, of one device compared to another, 01:09:20.500 |
But one way that you can start to compare apples to oranges 01:09:25.340 |
is bit rate, useful information in or out of the brain 01:09:36.460 |
And you have to ask when you look at a device like that is, 01:09:42.660 |
How much information are you able to usefully convey 01:09:46.420 |
into the system and get out of the system into the body, 01:09:50.980 |
And I think what we've seen in the early stabs at this 01:09:55.980 |
is that there's a very low threshold for bit rate 01:10:00.480 |
on some of the devices that are trying to avoid 01:10:09.220 |
but in a way that maybe people who aren't as familiar 01:10:12.700 |
with thinking about bit rates might be able to digest. 01:10:22.020 |
I understand that adding a new channel of information 01:10:30.940 |
for novel function or experience and to what extent 01:10:39.500 |
- Well, I'm saying more, it's hard to measure utility 01:10:45.140 |
It's hard to put a single metric, single number 01:10:51.620 |
One crude way to try to get at that is bit rate. 01:10:56.220 |
Think of it as back in the days of dial-up modems. 01:11:07.220 |
- That was a bit rate that thankfully kept steadily 01:11:12.040 |
Your internet service provider gives you a number 01:11:16.300 |
that is the maximum usable data that you can transmit 01:11:20.900 |
That's a useful way to think about these assistive devices. 01:11:25.900 |
How much information are you able to get into the brain 01:11:36.780 |
but you have to ask yourself when you're looking 01:11:44.140 |
the theoretical maximum is very low, disappointingly low, 01:11:48.860 |
even if it's perfectly executed and perfectly developed 01:11:54.720 |
And I think the thing that attracts a lot of us 01:11:56.900 |
to a technology like Neuralink is that the ceiling 01:12:01.660 |
There's no obvious reason that you can't interface 01:12:04.260 |
with millions of neurons as this technology is refined 01:12:15.060 |
high bandwidth brain interface that you want to develop 01:12:18.340 |
if you're talking about a semantic prosthetic 01:12:23.340 |
and AI assistant to your cognitive abilities. 01:12:27.900 |
You know, the more sci-fi things that we think about 01:12:38.300 |
They really want it to be something that you can expand 01:12:48.900 |
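[Editor's illustration: one common way researchers turn this bit-rate idea into a number is the Wolpaw information transfer rate, which combines how many targets you can choose among, how accurately you hit them, and how quickly you make selections. The numbers below are made up purely to show the calculation.]

```python
import math

def itr_bits_per_selection(n_targets: int, accuracy: float) -> float:
    """Wolpaw information transfer rate for a single selection."""
    n, p = n_targets, accuracy
    if p >= 1.0:
        return math.log2(n)
    return math.log2(n) + p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))

# Hypothetical numbers: a 30-key on-screen keyboard, 90% accuracy, 20 picks per minute.
bits_per_pick = itr_bits_per_selection(n_targets=30, accuracy=0.9)
print(round(bits_per_pick * 20, 1), "bits per minute")
```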
I'm realizing that people have been doing exactly 01:12:52.060 |
what Neuralink is trying to do now for a very long time. 01:12:57.540 |
People who are blind, who have no pattern vision, 01:13:04.000 |
Now the cane is not a chip, it's not an electrode, 01:13:22.820 |
and translating what would otherwise be visual cues 01:13:31.900 |
at understanding, even when they are approaching, 01:13:37.980 |
because they are integrating that information 01:13:42.300 |
their somatosensory cortex and their motor cortex 01:13:44.760 |
with other things like the changes in the wind 01:13:54.900 |
it's a completely different set of auditory cues. 01:14:00.100 |
and this is because my laboratory worked on visual repair 01:14:02.100 |
for a long time, I talked to a lot of blind people 01:14:03.740 |
who use different devices to navigate the world, 01:14:15.020 |
- And in doing so get pretty good at navigating with a cane. 01:14:19.620 |
but you can imagine the other form of navigating 01:14:24.820 |
which is to just attach yourself or attach to you 01:14:43.980 |
different arousal states and others, threat, danger. 01:14:51.260 |
is taking a cane or another biological system, 01:14:58.580 |
is to get you to navigate more safely through the world. 01:15:02.020 |
In some sense what Neuralink is trying to do is that, 01:15:18.540 |
which is really what Neuralink specializes in. 01:15:22.060 |
another example where there's great promise and great fear. 01:15:27.680 |
To what extent can these brain machine interfaces 01:15:32.680 |
learn the same way a seeing eye dog would learn, 01:15:43.200 |
because it's also listening to the nervous system 01:15:47.540 |
Put simply, what is the role for AI and machine learning 01:16:04.900 |
that can adapt to changes in firing of the brain 01:16:08.740 |
and you're coupling it with another form of intelligence, 01:16:12.460 |
and you're allowing the two to learn each other. 01:16:16.200 |
So undoubtedly the human that has a Neuralink device 01:16:22.720 |
Undoubtedly, the software that the Neuralink engineers 01:16:27.020 |
have written will adapt to the firing patterns 01:16:45.420 |
when you intend to move the mouse cursor up and to the right, 01:16:56.120 |
but you start correlating it with what the person, 01:17:03.520 |
So you assume that the person wants to move the mouse 01:17:12.760 |
And so you start correlating the activity that you record 01:17:17.460 |
when they're moving toward an up and right target 01:17:22.220 |
And similarly for up and left, down and left, 01:17:25.660 |
And so you develop a model semi-intelligently 01:17:30.520 |
in the software for what the person is intending to do 01:17:33.700 |
and let the person run wild with it for a while. 01:17:36.300 |
And they start to get better at using the model presented 01:17:40.520 |
to them by the software as expressed by the mouse moving 01:17:46.760 |
So it's, imagine a scenario where you're asking somebody 01:17:50.120 |
to play piano, but the sound that comes out of each key 01:17:55.120 |
randomly shifts over time, very difficult problem, 01:18:00.160 |
but a human brain is good enough with the aid of software 01:18:06.140 |
to a semi-stable state that they're gonna know 01:18:20.020 |
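[Editor's illustration: a stripped-down version of the calibration loop described here, on synthetic data: record activity while cueing known target directions, fit a linear map from firing rates to intended cursor velocity, then run fresh activity through that map. Real decoders, including whatever Neuralink actually uses, are far more elaborate; this only shows the correlate-then-model idea.]

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_bins = 64, 500

# Synthetic calibration data: cued 2-D velocities and noisy firing rates that
# mix those velocities through an unknown per-channel tuning.
tuning = rng.normal(size=(n_channels, 2))
cued_velocity = rng.normal(size=(n_bins, 2))
rates = cued_velocity @ tuning.T + 0.5 * rng.normal(size=(n_bins, n_channels))

# Fit the decoder: least-squares map from firing rates back to intended velocity.
decoder, *_ = np.linalg.lstsq(rates, cued_velocity, rcond=None)

# Online use: a fresh bin of activity in, a decoded cursor velocity out.
new_rates = rng.normal(size=(1, n_channels))
print(new_rates @ decoder)  # 2-D velocity estimate that would drive the cursor
```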
I got one of these rowers, you know, to exercise. 01:18:25.020 |
And I am well aware that there's a proper row stroke 01:18:35.900 |
gets on this thing and pushes with their legs 01:18:40.620 |
and maybe a smidgen of correct type execution. 01:18:51.460 |
every row stroke you generate arrows toward a dartboard. 01:18:56.220 |
And it knows whether or not you're generating 01:18:58.820 |
the appropriate forces at the given segment of the row, 01:19:01.700 |
the initial pull when you're leaning back, et cetera, 01:19:13.480 |
that's unrelated to the kinds of instructions 01:19:19.660 |
splay your knees a bit more, reach more with your arms, 01:19:22.940 |
All the rowers are probably cringing as I say this 01:19:24.660 |
'cause they're realizing what is exactly the point, 01:19:31.460 |
to whether or not the arrow is hitting the bullseye or not, 01:19:39.940 |
as I understand, pretty close to optimal row stroke 01:19:43.180 |
in the same way that if you had a coach there 01:19:48.140 |
What we're really talking about here is neurobiofeedback. 01:19:51.260 |
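[Editor's illustration: the rower's trick can be sketched generically: score each segment of the stroke against a reference force profile and collapse the errors into a single dart position, so the user only chases the bullseye. The segment names, reference values, and mapping below are invented; the actual product's algorithm is not described in this conversation.]

```python
# Invented reference profile and mapping, purely to illustrate the
# collapse-many-errors-into-one-target style of feedback described above.
REFERENCE = {"catch": 0.3, "drive": 1.0, "finish": 0.6}

def dart_offset(measured):
    """Map per-segment force errors to an (x, y) offset from the bullseye."""
    err = {k: measured[k] - REFERENCE[k] for k in REFERENCE}
    x = err["catch"] - err["drive"]   # early-stroke errors push horizontally
    y = err["finish"]                 # late-stroke errors push vertically
    return round(x, 2), round(y, 2)

print(dart_offset({"catch": 0.4, "drive": 0.8, "finish": 0.6}))  # off-center stroke
print(dart_offset({"catch": 0.3, "drive": 1.0, "finish": 0.6}))  # bullseye stroke
```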
- So is that analogy similar to what you're describing? 01:20:05.740 |
for people to get better at controlling these things. 01:20:08.680 |
In fact, it's the default and the obvious way to do it 01:20:12.540 |
is to have people and monkeys play video games. 01:20:37.300 |
Happy birthday, so we're a little bit offset there. 01:20:45.740 |
But the game, so the games you're describing, 01:20:56.180 |
to play Cyberpunk, which was really satisfying 01:21:00.940 |
It's a game where the characters are all fully 01:21:08.120 |
- But the root of the game is run around and shoot things. 01:21:16.120 |
- Reason I ask about video games is there's been 01:21:20.520 |
they are making young brains better or worse. 01:21:23.200 |
And I think some of the work from Adam Gazzaley's lab 01:21:28.140 |
that actually showed that children in particular 01:21:34.640 |
face-to-face, let's call them more traditional 01:21:48.840 |
- Yeah, there's some work showing that surgeons 01:22:03.980 |
and some of the parents who are trying to get their kids 01:22:06.180 |
to play fewer video games are cringing, but that's okay. 01:22:08.680 |
We'll let them settle their familial disputes 01:22:17.020 |
- Neuralink has been quite generous, I would say, 01:22:21.200 |
in announcing their discoveries and their goals. 01:22:30.680 |
I'm probably going to earn a few enemies by saying this. 01:22:33.460 |
Despite the fact that I've always owned Apple devices 01:22:41.980 |
or when the next phone or computer is going to come out 01:22:49.360 |
Neuralink has been pretty open about their goals. 01:23:13.880 |
have shared some of the progress that they've made 01:23:21.600 |
just for most people to know about and realize 01:23:36.840 |
And in some people it evokes extremely strong emotions. 01:23:49.300 |
- Then they did another one, you guys did another one 01:23:58.560 |
where the implant devices will be in a human. 01:24:15.780 |
and having worked admittedly on every species 01:24:18.800 |
from mice to cuttlefish, to humans, to hamsters, 01:24:48.880 |
The people at Neuralink are obsessive animal lovers. 01:24:55.960 |
spontaneously put up by people within the organization, 01:25:09.080 |
the way we wanna help people without using animals 01:25:13.960 |
It's just not known how to do that right now. 01:25:17.280 |
And so we are completely restricted to making advances, 01:25:24.600 |
by first showing that it's incredibly safe in animals. 01:25:28.920 |
- As is the case for any medical advancement, essentially. 01:25:42.680 |
with a minimum of discomfort to the animals, of course. 01:26:03.540 |
or who have loved ones that are suffering from diseases 01:26:11.520 |
and feel that if you have to work on a biological system 01:26:17.360 |
working on non-human animals first makes sense 01:26:24.160 |
that feels very strongly in the opposite direction. 01:26:36.920 |
weren't just casually slaughtering millions of animals 01:26:50.920 |
used in research, it becomes almost impossible 01:27:16.440 |
So we, in that context, we do animal research 01:27:20.600 |
because we have to, there's no other way around it. 01:27:23.040 |
If tomorrow laws were changed and the FDA said, 01:27:27.560 |
okay, you can do some of this early experimentation 01:27:33.800 |
I think there'd be a lot of people that would step up 01:27:47.800 |
could maybe be spared being unwilling participants in this. 01:28:01.480 |
much, much farther than anyone I've ever heard of, 01:28:06.560 |
anything I've ever seen to give the animals agency 01:28:19.720 |
such that they're as purely opt-in as humanly possible. 01:28:23.920 |
No animal is ever compelled to participate in experiments 01:28:30.640 |
So if say on a given day, our star monkey Pager 01:28:35.480 |
doesn't want to play video games for smoothie, 01:28:42.120 |
And I want to cue people to really what Matt is saying here. 01:28:46.400 |
Obviously the animals are being researched on 01:28:55.040 |
But what he's saying is that they play these games 01:29:00.140 |
during which neural signals are measured from the brain 01:29:02.280 |
because they have electrodes implanted in their brain 01:29:04.200 |
through a surgery that thankfully to the brain is painless. 01:29:15.180 |
than the typical scenario in laboratories around the world 01:29:43.320 |
I just think it's important that people understand 01:29:47.140 |
- In order to motivate an animal to play a video game, 01:29:51.640 |
depriving them of something that they yearn for 01:29:57.520 |
They have free and full access to food this entire time. 01:30:04.280 |
is if they want a treat extra to their normal rations. 01:30:29.960 |
and having animals work for a ration of water 01:30:37.320 |
or their postdoc more quickly than having to wait around 01:30:53.120 |
my monkey isn't working today is a common gripe 01:31:02.560 |
- Okay, so this is very important information to get across. 01:31:07.540 |
And there's no public relations statement woven into this. 01:31:10.540 |
This is just we're talking about the nature of the research, 01:31:12.860 |
but I think it is important that people are aware of this. 01:31:15.400 |
- Yeah, it's one of the underappreciated innovations 01:31:18.100 |
out of Neuralink is how far the animal care team 01:31:30.880 |
- Yeah, pigs are, they're actually fairly commonly used 01:31:34.880 |
in medical device research, more in the cardiac area, 01:31:39.880 |
their hearts are somewhat similar to human hearts. 01:31:43.920 |
I've seen little pigs and I've seen big pigs. 01:31:46.880 |
There's a bunch of different varieties of pig. 01:31:50.880 |
that you can optimize for different characteristics. 01:32:04.280 |
when we're trying to optimize a certain characteristic. 01:32:08.720 |
So yeah, the pigs are, we don't necessarily need them 01:32:20.320 |
how their limbs move for some of our spinal cord research, 01:32:30.620 |
They're really just a biological platform with a skull 01:32:34.160 |
that's close enough in size and shape to humans 01:32:37.460 |
to be a valid platform to study the safety of the device. 01:32:41.600 |
- Unlike a monkey or a human, a pig, I don't think, 01:32:48.800 |
How are they signaling that they saw or sensed something? 01:32:53.640 |
- Yeah, so again, the pigs are really just a safety platform 01:32:58.880 |
It doesn't break down or cause any kind of toxic reaction. 01:33:02.880 |
The monkeys are where we are really doing our heavy lifting 01:33:05.320 |
in terms of ensuring that we're getting good signals 01:33:08.520 |
out of the device, that what we expect to see in humans 01:33:12.220 |
is validated on a functional level in monkeys first. 01:33:20.540 |
- Years ago, you and I were enjoying a conversation 01:33:22.840 |
about these very sorts of things that we're discussing today 01:33:26.560 |
and you said, you know, the skull is actually 01:33:31.520 |
Far better would be a titanium plate, you know, 01:33:34.420 |
spoken like a true neurosurgeon with a radio receiver 01:33:39.400 |
But in all seriousness, drilling through the skull 01:33:55.400 |
but I think most people cringe when they hear about that 01:33:59.800 |
And it obviously has to be done by a neurosurgeon 01:34:01.940 |
with all the appropriate environmental conditions in place 01:34:08.660 |
What did you mean when you said that the skull 01:34:11.880 |
is a poor adaptation and a titanium plate will be better? 01:34:14.560 |
And in particular, what does that mean in reference 01:34:20.360 |
I mean, are human beings unnecessarily vulnerable 01:34:28.840 |
- You know, maybe I'm being too harsh about the skull. 01:34:40.760 |
as biological organisms that develop in our mother's uterus. 01:34:43.980 |
The skull is, you know, usually the appropriate size. 01:34:51.160 |
That said, there are a couple of puzzling vulnerabilities. 01:34:59.960 |
This is, you know, neurosurgeons will all know 01:35:05.000 |
that sometimes darkly is called God's little joke 01:35:08.520 |
where the very thin bone of the temporal part of the skull 01:35:16.440 |
that goes to the lining of the brain right attached 01:35:20.400 |
And so this bone just to the side of your eye 01:35:29.120 |
very often cut an artery called the middle meningeal artery 01:35:32.720 |
that leads to a big blood clot that crushes the brain. 01:35:39.360 |
otherwise would be a relatively minor injury end up dying 01:35:43.600 |
is this large blood clot developing from high pressured 01:35:50.360 |
And so why would you put the artery right on the inside 01:35:53.960 |
of the very thin bone that's most likely to fracture? 01:35:58.520 |
but this is probably the most obvious failure mode 01:36:05.440 |
Otherwise, you know, in terms of general impact resistance, 01:36:09.240 |
I think the brain is a very hard thing to protect. 01:36:15.760 |
probably given all other possible architectures 01:36:19.200 |
that can arise from development, it's not that bad really. 01:36:23.340 |
One of the interesting features in terms of shock absorption 01:36:27.360 |
that hopefully prevents a lot of traumatic brain injury 01:36:42.360 |
And so with rapid acceleration, deceleration, 01:36:45.560 |
that sheath of saltwater adds a marvelous protective cushion 01:36:50.560 |
against development of, you know, bruising of the brain, 01:36:57.980 |
And so I think for any flaws in the design that do exist, 01:37:07.320 |
and there's probably a lot fewer TBIs than would exist 01:37:10.880 |
if a human designer was taking a first crack at it. 01:37:13.480 |
- As you described the thinness of this temporal bone 01:37:17.080 |
and the presence of a critical artery just beneath it, 01:37:24.240 |
And here, I also want to cue up the fact that, 01:37:28.940 |
well, whenever we hear about TBI or CTE or brain injury, 01:37:35.520 |
but most traumatic brain injuries are things like 01:37:43.160 |
For some reason, football and hockey and boxing 01:37:47.560 |
But my colleagues that work on traumatic brain injury 01:37:49.800 |
tell me that most of the traumatic brain injury 01:37:52.860 |
they see is somebody slips at a party and hits their head 01:38:03.480 |
To my mind, most helmets don't actually cover this region 01:38:08.820 |
So is there also a failure of helmet engineering that, 01:38:13.820 |
you know, I can understand why you'd want to have 01:38:16.360 |
your peripheral vision out the sides of your eyes, 01:38:20.860 |
but it seems to me if this is such critical real estate, shouldn't it be better protected? 01:38:27.580 |
- but I don't think we see a lot of epidural hematomas in sports. 01:38:36.200 |
To fracture that bone, you usually need a really focal blunt trauma. 01:38:48.320 |
With sports injuries, you know, you don't often see that, 01:38:55.480 |
that kind of sharper object coming in contact with the head. 01:39:09.480 |
Honestly, I can't think of an instance of this exact injury type in sports. 01:39:13.520 |
- You've spent a lot of time poking around in the brains of humans. 01:39:17.780 |
And while I realize this is not your area of expertise, I know you're somebody who 01:39:26.000 |
cares about his health and the health of your family. 01:39:37.860 |
For people listening, if their desire is to keep their brain healthy, 01:39:40.560 |
do any data or any particular practices come to mind? 01:39:45.560 |
I mean, I think we've all heard the obvious one. 01:39:49.200 |
If you do get a head injury, make sure it gets treated 01:40:05.280 |
and I see a lot of the smoldering wreckage of humanity, 01:40:10.120 |
you know, in the operating room and in the emergency room 01:40:15.720 |
You know, I work, my practice is in San Francisco, 01:40:23.920 |
and I take care of a lot of people who have been drinking just spectacular amounts of alcohol for a long time. 01:40:27.420 |
And their brains are, you know, very often on the scans, 01:40:32.500 |
they look like small walnuts inside their empty skull. 01:40:44.800 |
Alcohol is far and away the most common source of brain damage that I see. 01:40:52.000 |
And it's, you know, when you look at the morbidity, 01:40:54.980 |
kind of the human harm in aggregate that's done, 01:40:58.280 |
it's mystifying that it's not something that we talk about more as a society. 01:41:04.240 |
- People will think that I don't drink at all. 01:41:12.700 |
I wouldn't notice, but I do occasionally have a drink, 01:41:16.900 |
But I am shocked at this current state of affairs 01:41:21.300 |
around alcohol consumption and advertising, et cetera. 01:41:24.020 |
When I look at the data, mainly out of the UK Biobank, 01:41:26.180 |
which basically shows that for every drink that one has 01:41:29.700 |
on a regular basis, when you go from zero to one drink per day, there are measurable reductions in gray and white matter volume. 01:41:46.480 |
And to me, it's like, it's just sort of obvious 01:41:49.320 |
from these large scale studies that, as you point out, alcohol just isn't good for the brain. 01:42:05.040 |
And I hope we're past the days that we're talking about the resveratrol in red wine as some kind of health benefit. 01:42:10.420 |
It's not even clear resveratrol is good for us anyway, 01:42:12.360 |
by the way, a matter of debate, I should point out. 01:42:28.240 |
You're working, as you mentioned, in the Tenderloin. 01:42:42.320 |
I, you know, I incidentally take care of people 01:42:47.560 |
in quantities that are, you know, spectacular, 01:42:50.720 |
but I haven't specifically done research in that area. 01:42:56.880 |
- Yeah, I ask in part because maybe you know a colleague 01:43:00.360 |
or will come across a colleague who's working on this. 01:43:08.680 |
Modafinil, R-modafinil, which I think in small amounts 01:43:12.240 |
in clinically prescribed situations can be very beneficial. 01:43:16.040 |
But let's be honest, many people are using these 01:43:20.320 |
I don't think we really know what it does to the brain 01:43:22.620 |
aside from increasing the potential for addiction to those substances. 01:43:27.740 |
we're generating a massive data set right now. 01:43:31.980 |
I'd like to briefly go back to our earlier discussion of neuroplasticity, and to a point you made, 01:43:40.120 |
which is that we are not aware of any single brain area 01:43:43.480 |
that one can stimulate in order to invoke plasticity. 01:43:50.280 |
Years ago, Mike Merzenich and colleagues at UCSF 01:43:54.720 |
did some experiments where they stimulated nucleus basalis 01:43:58.800 |
and paired that stimulation with an eight-kilohertz tone, and they saw a massive expansion of the representation of that tone in the auditory cortex. 01:44:06.260 |
You can also stimulate a different brain area, the ventral tegmental area, 01:44:08.760 |
which causes release of dopamine, and pair it with a tone, and get a similar expansion of that tone's representation. 01:44:25.120 |
I think it was Karl Lashley that did these experiments 01:44:30.440 |
where they would remove a region of cortex in a rodent, put the animal back into a learning environment, 01:44:32.600 |
and the animal would do pretty well, if not perfectly. 01:44:35.840 |
So they'd scoop out a different region of cortex, and again the animal would do pretty well. 01:44:42.520 |
Based on those experiments, they referred to the equipotentiality of the cortex, the idea that one region of cortex can support the functions of another. 01:44:52.760 |
So on the one hand, you've got these experiments that say, 01:44:56.240 |
"You know, you don't really need a lot of the brain." 01:44:59.700 |
And every once in a while, a news story will come out about somebody who's missing a huge portion of their brain and yet functions more or less normally. 01:45:12.100 |
And then on the other hand, you have these experiments 01:45:16.420 |
where you get massive plasticity from stimulation of one area. 01:45:23.860 |
And so I'd really like just your opinion on this. 01:45:31.480 |
Do you think that the brain, at the level of individual neurons and circuits, 01:45:37.060 |
is able to circumvent what would otherwise seem like devastating losses? 01:45:44.500 |
- Yeah, I mean, a lot of it, to reconcile those experiments, comes down to the fact 01:45:51.180 |
that they're probably in different species, right? 01:46:03.500 |
If you take out a piece of cortex in an adult human, say the part most involved in coordinating speech 01:46:07.660 |
or finger movement, you're gonna see profound losses of function. 01:46:23.060 |
If you take out half of the brain in a very young baby, 01:46:45.040 |
the remaining half can take over much of that function, because you have a young brain with extremely high plasticity over many years. 01:46:55.300 |
And if you take out a region of cortex that isn't very well differentiated functionally 01:46:58.480 |
to begin with, you might not see those deficits. 01:47:00.660 |
So apparently there's a lot of redundancy as well, right? 01:47:03.600 |
There are a lot of, say, cerebellar and spinal circuits 01:47:06.220 |
in other animals that generate stereotyped behavior patterns without much involvement of the cortex at all. 01:47:19.400 |
So a lot of that depends on the experimental setup. 01:47:23.800 |
I would say in general, adult humans are very vulnerable to permanent deficits from the loss of brain tissue. 01:47:30.980 |
- I'm gonna take the liberty of asking a question 01:48:05.660 |
about whether a detector, maybe even of just eyelid position or pupil size 01:48:09.500 |
or head position, could be introduced to a car 01:48:14.500 |
like the Tesla or another car for that matter, to detect when drivers are getting sleepy. 01:48:28.220 |
Because driving when people are sleepy is, by my read of the data, responsible for something like a 01:48:35.740 |
third, it's incredible, of accidents between vehicles. 01:48:59.440 |
it seems to know when I'm looking at the road 01:49:07.140 |
- There's a small camera up by the rear view mirror. 01:49:13.020 |
My guess here is that it's a simple eye tracking program. 01:49:16.480 |
And so it may already be the case that it's implemented, 01:49:20.700 |
that it's detecting whether your eyes are open or not. 01:49:40.500 |
- But I think they're definitely making efforts in that direction. 01:49:51.860 |
It's a remarkable time. There were no electric cars when I was growing up 01:49:54.540 |
and now things are moving oh so fast, no pun intended. 01:49:59.780 |
What is your wish for brain machine interface? 01:50:04.980 |
So let's assume that the clinical stuff can be worked out. Name a clinical condition or two 01:50:12.220 |
that you are just yearning to see resolved. 01:50:18.340 |
But in addition to that, you really just expand out. 01:50:34.800 |
What would you most like to see happen with brain augmentation and brain machine interface? 01:50:48.620 |
I know you think mostly in terms of the treatment of clinical conditions 'cause that's how you spend your days, 01:50:51.340 |
fixing patients and helping their lives be better. 01:51:02.360 |
But for the sake of this conversation, and for sake of really getting us, the audience, thinking big. 01:51:17.060 |
- I think we're talking a 10-year, maybe 20-year timeframe 01:51:23.620 |
of humans just getting control over some of the horrible ways that our brains can fail us. 01:51:34.400 |
I think everyone has either known someone or, second order, known someone, 01:51:37.420 |
a friend of a friend who has been touched by addiction 01:51:45.020 |
These functions of the brain, or malfunctions of the brain, cause tremendous suffering. 01:51:50.220 |
These are the things that I want to tackle in my career. 01:51:58.280 |
Beyond the clinical, I'm thinking full expansion of human cognition: 01:52:17.140 |
taking in knowledge without being as bottlenecked by needing to read the Wikipedia article first, 01:52:23.480 |
having communication with anyone that you want to 01:52:28.320 |
unrestricted by this flapping air past meat on your face. 01:52:39.160 |
It's a form of communication that's ridiculously prone to being misunderstood. 01:52:42.080 |
It's also a tiny narrow bottleneck of communication 01:52:49.700 |
And there's no reason that needs to necessarily be true. 01:52:54.820 |
That's the way things have always been, but it isn't the way things are going to be in the future. 01:52:58.000 |
And I think there's a million very sci-fi possibilities 01:53:09.560 |
for how humanity could use this technology to be even more potent as a multi-unit organism. 01:53:20.300 |
These are things that are so far down the road 01:53:23.780 |
I can't even directly see how they would be implemented. 01:53:30.420 |
that allows some of this stuff to even be thought about 01:53:33.800 |
To that point, I encourage anyone who is excited 01:53:41.900 |
especially mechanical engineers, software engineers, 01:53:44.720 |
robotics engineers, come to the Neuralink website 01:53:50.840 |
working on these, the hardest problems in the world, 01:53:55.820 |
And so if you want to work on this stuff, come help us. 01:53:59.340 |
- I have several responses to what you just said. 01:54:04.020 |
First off, I'll get the least important one out of the way, 01:54:07.740 |
which is that years ago I applied for a job at Neuralink. 01:54:11.300 |
The Neuralink website at that time was incredibly sparse. 01:54:33.580 |
that you, who passed through, fortunately for me, 01:54:39.820 |
and we had some fun expeditions together in the wild, 01:54:43.540 |
neural exploration, which we can talk about some other time, 01:54:54.260 |
and I'll say that they're very lucky to have you. 01:55:04.380 |
neuroscientists and vision scientists like Dan and others. 01:55:11.140 |
So I really want to start off by saying thank you to you 01:55:16.980 |
I know that Neuralink is really tip of the spear 01:55:19.320 |
in being public facing with the kinds of things that you're working on, which is not typical, 01:55:30.440 |
especially given the nature of the work, but-- 01:55:34.180 |
He doesn't keep secrets in public too commonly. 01:55:37.900 |
He tells you what he's going to do and then he does it. 01:55:44.420 |
and tells you exactly what the company intends to do 01:55:48.260 |
And people assume that there's some subterfuge 01:55:55.220 |
And I think Neuralink follows in that path of, 01:56:00.060 |
We want the brightest people in the world to come help us. 01:56:06.300 |
We want the most motivated patients with quadriplegia 01:56:21.120 |
- So maybe just the direct call could happen now, for people 01:56:29.420 |
who are interested in being part of this clinical trial. 01:56:35.420 |
to see who might be eligible for clinical trials 01:56:39.620 |
We're still working with the FDA to hammer out the details 01:56:42.760 |
and get their final permission to proceed with the trial. 01:56:46.900 |
- Great, so please see the link 01:56:49.540 |
in the show note captions for that. 01:56:52.660 |
Yeah, I want to thank you guys for your stance 01:56:55.500 |
on being public facing and also doing the incredibly hard work. 01:57:01.440 |
The work you're doing is extremely forward thinking and absolutely critical. 01:57:05.700 |
There's a lot of critical engineering there that no doubt will wick out 01:57:09.360 |
into other domains of neurosurgery and medical technology, 01:57:12.300 |
not just serving Neuralink's mission directly. 01:57:16.180 |
And I really want to thank you, first of all, 01:57:22.000 |
for taking time out of your important schedule of seeing patients and doing surgeries 01:57:31.040 |
to share with people what you guys are doing. 01:57:34.360 |
As I mentioned before, there's a lot of mystique around it. 01:57:37.500 |
And despite the fact that Neuralink has gone out 01:57:40.780 |
of their way to try and erase some of that mystique, 01:57:53.820 |
most people still don't really know what's happening there at the level of nuts and bolts and guts and brains. 01:57:58.340 |
And I really just want to thank you also for being you, 01:58:01.480 |
which perhaps sounds like kind of an odd thing to hear, 01:58:05.180 |
but I think as made apparent by the device implanted 01:58:08.920 |
in your hand, you don't just do this for a job. 01:58:13.440 |
You live and breathe and embody, truly embody this stuff 01:58:18.160 |
around the nervous system and trying to figure out how it works and how to make it work better. 01:58:25.280 |
So I want to thank you for not just the brains 01:58:28.860 |
that you put into it and the energy you put into it, 01:58:44.620 |
And I look forward to another round of discussion 01:58:49.620 |
when these incredible technologies have spelled out 01:58:57.000 |
- Thank you for joining me for today's discussion 01:58:58.920 |
with Dr. Matthew McDougall, all about the human brain 01:59:04.520 |
and the incredible efforts that are being carried out at Neuralink. 01:59:13.920 |
If you'd like to learn more about Dr. McDougall's work 01:59:16.040 |
and the specific work being done at Neuralink, please see the links in the show note captions. 01:59:21.600 |
If you're learning from and/or enjoying this podcast, please subscribe to our YouTube channel. 01:59:25.760 |
That's a terrific zero cost way to support us. 01:59:36.680 |
If you have questions for me or topics you'd like me to cover 01:59:39.480 |
on the Huberman Lab Podcast or guests that you'd like me 01:59:41.840 |
to consider inviting on the Huberman Lab Podcast, please put those in the comments section on YouTube. 01:59:48.380 |
In addition, please check out the sponsors mentioned 01:59:50.680 |
at the beginning and throughout today's episode. That's the best way to support this podcast. 01:59:57.620 |
Not so much on today's episode, but on many previous episodes of the Huberman Lab Podcast, we discuss supplements. 02:00:00.320 |
While supplements aren't necessary for everybody, 02:00:02.240 |
many people derive tremendous benefit from them 02:00:04.280 |
for things like enhancing sleep, focus, and hormone support. 02:00:07.320 |
The Huberman Lab Podcast is proud to have partnered with Momentous supplements. 02:00:10.580 |
If you'd like to hear more about the supplements discussed on the Huberman Lab Podcast, 02:00:14.080 |
please go to livemomentous, spelled O-U-S, .com/huberman. 02:00:20.800 |
If you're not already following the Huberman Lab Podcast 02:00:23.020 |
on social media, we are Huberman Lab on Instagram, Twitter, Facebook, and LinkedIn. And on all of those platforms, I discuss science and science-related tools, some of which overlaps with the content of the Huberman Lab Podcast, 02:00:34.000 |
but often is distinct from the content covered on the Huberman Lab Podcast. 02:00:37.320 |
So again, it's Huberman Lab on all social media channels. 02:00:40.340 |
For those of you that haven't already subscribed to our Neural Network Newsletter, 02:00:44.180 |
this is a completely zero cost monthly newsletter that includes podcast summaries, as well as toolkits. 02:00:50.800 |
Toolkits are lists of about a page to two pages long that cover the key tools for things like optimizing sleep 02:00:59.200 |
or deliberate cold exposure or deliberate heat exposure. To sign up, simply go to HubermanLab.com, 02:01:06.960 |
go to the menu tab in the corner, scroll down to newsletter, and provide your email. We do not share your email with anybody. 02:01:13.360 |
And in addition to that, there are samples of toolkits 02:01:16.620 |
on the HubermanLab.com website, again, under newsletter, 02:01:19.940 |
and you don't even have to sign up to access those. 02:01:25.020 |
The newsletter comes packed with useful information and, again, is completely zero cost. 02:01:29.380 |
Thank you once again for joining me for today's discussion with Dr. Matthew McDougall.