Elon Musk: Neuralink and the Future of Humanity | Lex Fridman Podcast #438
Chapters
0:00 Introduction
0:49 Elon Musk
4:06 Telepathy
10:45 Power of human mind
15:12 Future of Neuralink
20:27 Ayahuasca
29:57 Merging with AI
34:44 xAI
36:57 Optimus
43:47 Elon's approach to problem-solving
61:23 History and geopolitics
65:53 Lessons of history
70:12 Collapse of empires
77:55 Time
80:37 Aliens and curiosity
88:12 DJ Seo
96:20 Neural dust
103:03 History of brain–computer interface
111:07 Biophysics of neural interfaces
121:36 How Neuralink works
127:26 Lex with Neuralink implant
147:24 Digital telepathy
158:27 Retracted threads
164:01 Vertical integration
170:55 Safety
180:50 Upgrades
189:53 Future capabilities
219:09 Matthew MacDougall
224:58 Neuroscience
232:07 Neurosurgery
243:11 Neuralink surgery
262:20 Brain surgery details
278:03 Implanting Neuralink on self
293:57 Life and death
303:17 Consciousness
306:11 Bliss Chapman
319:27 Neural signal
326:19 Latency
330:59 Neuralink app
335:40 Intention vs action
346:54 Calibration
356:26 Webgrid
379:28 Neural decoder
400:03 Future improvements
408:59 Noland Arbaugh
409:08 Becoming paralyzed
422:43 First Neuralink human participant
426:45 Day of surgery
444:31 Moving mouse with brain
469:50 Webgrid
477:52 Retracted threads
486:16 App improvements
493:01 Gaming
503:59 Future Neuralink capabilities
506:55 Controlling Optimus robot
511:16 God
513:21 Hope
00:00:00.000 |
The following is a conversation with Elon Musk, DJ Seo, 00:00:04.080 |
Matthew MacDougall, Bliss Chapman, and Noland Arbaugh 00:00:19.000 |
to have a Neuralink device implanted in his brain. 00:00:45.060 |
his fifth time on this, the Lex Fridman Podcast. 00:01:11.700 |
It's just got a lot of caffeine or something? 00:01:20.760 |
I mean, what we breathe is 78% nitrogen anyway. 00:01:29.000 |
- Most people think that they're breathing oxygen, 00:02:07.280 |
- Yeah, we just, obviously, have our second implant as well. 00:02:28.400 |
- It depends somewhat on the regulatory approval, 00:02:31.720 |
the rate at which we get regulatory approvals. 00:02:34.440 |
So, we're hoping to do 10 by the end of this year. 00:02:40.980 |
- And with each one, you're gonna be learning 00:02:44.560 |
a lot of lessons about the neurobiology of the brain, 00:02:47.760 |
the everything, the whole chain of the Neuralink, 00:02:54.080 |
Yeah, I think it's obviously gonna get better with each one. 00:03:04.340 |
So, there's a lot of signal, a lot of electrodes. 00:03:09.860 |
- What improvements do you think we'll see in Neuralink 00:03:12.660 |
in the coming, let's say, let's get crazy, coming years? 00:03:22.300 |
Because we'll increase the number of electrodes 00:03:24.280 |
dramatically, we'll improve the signal processing. 00:03:33.960 |
10, 15% of the electrodes working with Neuralink, 00:03:37.120 |
with our first patient, we were able to achieve 00:03:41.360 |
a bits per second that's twice the world record. 00:03:44.880 |
So, I think we'll start vastly exceeding the world record 00:04:00.120 |
Like, faster than any human could possibly communicate 00:04:06.380 |
- Yeah, that BPS is an interesting metric to measure. 00:04:17.580 |
- Like, entire new ways of interacting with a computer 00:04:53.800 |
communicate clearly at 10 or 100 or 1,000 times 00:05:00.920 |
- Listen, I'm pretty sure nobody in their right mind 00:05:18.120 |
if I'm listening to somebody in 15, 20 minute segments 00:05:43.480 |
- I'm still holding on to one, because I'm afraid. 00:05:45.920 |
I'm afraid of myself becoming bored with the reality, 00:05:49.640 |
with the real world, where everyone's speaking in 1x. 00:06:10.200 |
is how much information is actually compressed 00:06:15.460 |
- Yeah, if there's a single word that is able to convey 00:06:20.300 |
something that would normally require 10 simple words, 00:06:24.040 |
then you've got maybe a 10x compression on your hands. 00:06:35.820 |
you're simultaneously hit with a wide range of symbols 00:06:45.580 |
Faster than if it were words or a simple picture. 00:06:49.340 |
- And of course, you're referring to memes broadly 00:06:59.680 |
And then you can add something to that idea template. 00:07:02.480 |
But somebody has that preexisting idea template 00:07:07.260 |
So when you add that incremental bit of information, 00:07:15.260 |
- You think there'll be emergent leaps of capability 00:07:20.880 |
you think there'll be like actual number where it just, 00:07:44.940 |
what is the average bits per second of a human? 00:07:52.860 |
There are 86,400 seconds in a day, and you don't communicate 86,400 tokens in a day. 00:07:52.860 |
Therefore, your bits per second is less than one. 00:08:04.020 |
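A rough sketch of that back-of-the-envelope estimate: with 86,400 seconds in a day, anyone producing fewer than 86,400 tokens a day averages under one token per second. The daily token count below is an illustrative assumption, not a figure from the conversation.

```python
# Back-of-the-envelope average output rate of a human (illustrative numbers).
seconds_per_day = 24 * 60 * 60        # 86,400 seconds
tokens_per_day = 20_000               # assumed daily output, roughly a day of speech
average_tokens_per_second = tokens_per_day / seconds_per_day
print(f"{average_tokens_per_second:.2f} tokens/second")  # ~0.23, i.e. well under 1
```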
And even if you're communicating very quickly, 00:08:17.500 |
model the mind state of the person to whom you're speaking. 00:08:20.520 |
Then take the concept you're trying to convey, 00:08:22.740 |
compress that into a small number of syllables, 00:08:27.220 |
decompresses them into a conceptual structure 00:08:31.300 |
that is as close to what you have in your mind as possible. 00:08:34.260 |
- Yeah, I mean, there's a lot of signal loss there 00:08:37.220 |
- Yeah, very lossy compression and decompression. 00:08:44.100 |
is distilling the concepts down to a small number 00:08:47.780 |
of symbols of, say, syllables that I'm speaking, 00:08:53.820 |
So that's a lot of what your brain computation is doing. 00:08:58.820 |
Now, there is an argument that that's actually 00:09:04.940 |
a healthy thing to do, or a helpful thing to do, 00:09:08.820 |
because as you try to compress complex concepts, 00:09:23.020 |
you distill things down to what matters the most, 00:09:31.140 |
if our data rate increases, it's highly probable 00:09:37.140 |
Just like your computer, when computers had like, 00:09:48.060 |
And now you've got computers with many gigabytes of RAM. 00:09:58.100 |
it's probably, I don't know, several megabytes minimum. 00:10:03.300 |
But nonetheless, we still prefer to have the computer 00:10:17.820 |
by increasing the bandwidth of the communication. 00:10:23.820 |
Because if, even in the most benign scenario of AI, 00:10:39.460 |
and you're communicating at, you know, bits per second, 00:11:20.020 |
Now the cortex is much smarter than the limbic system, 00:11:22.060 |
and yet it's largely in service to the limbic system. 00:11:27.660 |
that's gone into people trying to get laid is insane. 00:11:49.460 |
the cortex is putting a massive amount of compute 00:11:55.380 |
- So like 90% of distributed compute of the human species 00:12:00.020 |
Like a large percentage. - A massive amount, yeah. 00:12:02.140 |
There's no purpose to most sex except hedonistic. 00:12:18.220 |
And so your cortex, much smarter than your limbic system, 00:12:27.500 |
So, or wants some tasty food or whatever the case may be. 00:12:47.260 |
with all the applications, all your compute devices. 00:12:55.340 |
there's actually a massive amount of digital compute 00:13:00.540 |
You know, with like Tinder and whatever, you know. 00:13:04.260 |
- Yeah, so the compute that we humans have built 00:13:09.260 |
- Yeah, I mean there's like gigawatts of compute 00:13:19.220 |
- If we merge with AI, it's just gonna expand the compute 00:13:26.020 |
- Well, it's one of the things, certainly, yeah. 00:13:49.060 |
that then goes to our tertiary compute layer, 00:13:53.380 |
then, I don't know, it might actually be that the AI, 00:13:59.380 |
is simply trying to make the human limbic system happy. 00:14:07.580 |
There's a lot of interesting complicated things in there. 00:14:13.060 |
- But then we also want to, in a kind of cooperative way, 00:14:22.580 |
- As a group of humans, when we get together, 00:14:25.340 |
we start to have this kind of collective intelligence 00:14:33.300 |
than the underlying individual descendants of apes, right? 00:14:39.820 |
and that could be a really interesting source 00:14:45.420 |
- Yeah, I mean, there are these sort of fairly cerebral, 00:14:54.880 |
I mean, for me, it's like what's the meaning of life, 00:15:31.760 |
I mean, they're solving basic neurological issues 00:15:44.060 |
or as is the case with our first two patients, 00:15:54.740 |
in the spinal cord, neck, or in the brain itself. 00:15:58.520 |
So, you know, a second product is called Blindsight, 00:16:04.600 |
which is to enable people who are completely blind, 00:16:13.800 |
by directly triggering the neurons in the visual cortex. 00:16:18.120 |
So we're just starting at the basics here, you know? 00:16:20.160 |
So it's like, the simple stuff, relatively speaking, 00:16:30.620 |
It can also solve, I think, probably schizophrenia. 00:16:36.040 |
You know, if people have seizures of some kind, 00:16:43.940 |
So there's like a kind of a tech tree, if you will, 00:17:05.860 |
Words, you know, and then eventually you get sagas. 00:17:10.260 |
So, you know, I think there may be some, you know, 00:17:17.460 |
but the first several years are really just solving 00:17:22.160 |
But like, for people who have essentially complete 00:17:24.820 |
or near-complete loss from the brain to the body, 00:17:31.720 |
the neural links would be incredibly profound. 00:17:35.340 |
'Cause I mean, you can imagine if Stephen Hawking 00:17:36.860 |
could communicate as fast as we're communicating, 00:17:52.420 |
so everything you've talked about could be applied 00:17:54.300 |
to people who are non-disabled in the future? 00:17:58.140 |
- The logical thing to do is, sensible thing to do 00:18:00.420 |
is to start off solving basic neuron damage issues. 00:18:05.420 |
'Cause there's obviously some risk with a new device. 00:18:14.980 |
You can't get the risk down to zero, it's not possible. 00:18:18.120 |
So you wanna have the highest possible reward 00:18:21.940 |
given that there's a certain irreducible risk. 00:18:25.060 |
And if somebody's able to have a profound improvement 00:18:29.460 |
in their communication, that's worth the risk. 00:18:43.380 |
that have been using it for years and the risk is minimal, 00:18:46.980 |
then perhaps at that point you could consider saying, 00:18:53.060 |
Now, I think we're actually gonna aim for augmentation 00:19:01.740 |
a communication data rate equivalent to normal humans. 00:19:05.020 |
We're aiming to give people who have, you know, 00:19:14.020 |
a communication data rate that exceeds normal humans. 00:19:22.020 |
As you restore vision, there could be aspects 00:19:27.300 |
- Yeah, at first the vision restoration will be low res. 00:19:31.360 |
'Cause you have to say like, how many neurons 00:19:36.320 |
And you can do things where you adjust the electric field 00:19:40.340 |
so that even if you've got, say, 10,000 neurons, 00:19:44.260 |
it's not just 10,000 pixels because you can adjust 00:19:46.940 |
the field between the neurons and do them in patterns 00:19:58.680 |
like having a megapixel or a 10 megapixel situation. 00:20:11.540 |
and you could also see in different wavelengths. 00:20:27.180 |
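As a toy illustration of the field-adjustment idea above, here is a minimal sketch of current steering, a technique used in other neural prostheses; the numbers are made up and this is not a description of Neuralink's stimulation scheme. Splitting current between neighboring electrodes places the effective activation locus between them, giving more addressable sites than physical electrodes.

```python
import numpy as np

# Toy "current steering": split stimulation current between two adjacent
# electrodes so the effective activation locus lands between them.
electrode_positions_um = np.array([0.0, 100.0])   # two electrodes 100 microns apart

def effective_locus(alpha):
    """alpha: fraction of the total current delivered by the second electrode."""
    currents = np.array([1.0 - alpha, alpha])
    return float(np.sum(currents * electrode_positions_um))

for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"alpha={alpha:.2f} -> locus at {effective_locus(alpha):.0f} um")
```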
- Do you think there'll be, let me ask a Joe Rogan question. 00:20:53.460 |
- Okay, well, why don't we just spill the beans? 00:21:06.260 |
With the insects, with the animals all around you, 00:21:19.060 |
- Don't go hugging an anaconda or something, you know? 00:21:24.260 |
- You haven't lived unless you made love to an anaconda. 00:21:33.620 |
- Yeah, I took an extremely high dose of nine cups. 00:21:40.700 |
Like, what's the normal, is normal just one cup, or? 00:21:53.620 |
- Across two days, 'cause on the first day I took two, 00:21:55.900 |
and it was a ride, but it wasn't quite like a-- 00:22:08.980 |
and I just saw a dragon, and all that kind of stuff. 00:22:18.940 |
- One of the interesting aspects of my experience 00:22:44.620 |
- And it's just, not from my relationship with that person, 00:22:50.060 |
I had just this deep gratitude of who they are. 00:22:55.780 |
You know like Sims or whatever, you get to watch them? 00:23:08.500 |
- Maybe I'll have some negative thoughts, nothing, nothing. 00:23:27.340 |
the best way I can describe it is they had a glow to them. 00:23:31.100 |
- And we kept flying out from them to see Earth, 00:23:39.860 |
And I saw that light, that glow all across the universe. 00:23:45.140 |
- Like whatever that form is, whatever that like-- 00:24:01.700 |
I saw like a huge number of galaxies, intergalactic. 00:24:08.100 |
'cause I would actually explore near distances 00:24:16.340 |
- Implication of aliens, because they were glowing. 00:24:18.620 |
They were glowing in the same way that humans were glowing, 00:24:34.380 |
It made me feel like there is life, no, not life, 00:24:47.700 |
There were dragons and they're pretty awesome. 00:24:54.700 |
But they weren't scary, they were protective. 00:24:58.960 |
- It was more like Game of Thrones kind of dragons. 00:25:33.760 |
they talk about the mother of the forest protecting you 00:25:44.640 |
- You know, like 10 miles outside of a frio or something. 00:25:54.300 |
who basically is Tarzan, he lives in the jungle, 00:26:01.980 |
So anyway, can I get that same experience in a Neuralink? 00:26:05.800 |
- I guess that is the question for non-disabled people. 00:26:08.980 |
Do you think that there's a lot in our perception, 00:26:12.860 |
in our experience of the world that could be explored, 00:26:31.400 |
And I mean, everything that you've ever experienced 00:26:35.320 |
in your whole life, the smell, you know, emotions, 00:26:41.240 |
So it's kind of weird to think that your entire life 00:26:46.080 |
experience is distilled down to electrical signals 00:26:48.100 |
from neurons, but that is, in fact, the case. 00:26:51.140 |
Or I mean, that's at least what all the evidence points to. 00:27:12.040 |
So if there are certain, say, chips or elements 00:27:17.320 |
let's say your ability to, if you've got a stroke, 00:27:21.100 |
if you've had a stroke, that means you've got 00:27:29.680 |
that's the kind of thing that a Neuralink could solve. 00:27:32.380 |
If it's, if you've got like a mass amount of memory loss 00:27:37.900 |
that's just gone, well, we can't get the memories back. 00:27:42.300 |
We could restore your ability to make memories, 00:27:44.780 |
but we can't restore memories that are fully gone. 00:27:49.780 |
Now, I should say, maybe if part of the memory is there 00:28:01.440 |
then we could re-enable the ability to access the memory. 00:28:04.500 |
So, but you can think of it like RAM in a computer. 00:28:08.520 |
If the RAM is destroyed or your SD card is destroyed, 00:28:14.240 |
But if the connection to the SD card is destroyed, 00:28:34.480 |
based on the, all information you have about that person. 00:28:46.840 |
- But that is one of the most beautiful aspects 00:28:49.480 |
of the human experience, is remembering the good memories. 00:28:55.040 |
as Danny Kahneman has talked about, in our memories, 00:29:08.880 |
that produces the largest amount of happiness, and so. 00:29:11.800 |
- Yeah, well, I mean, what are we but our memories? 00:29:23.460 |
if you could be, you're running a thought experiment, 00:29:39.960 |
And memories is just such a huge part of that. 00:29:43.120 |
- Death is fundamentally the loss of information. 00:29:47.720 |
- So if we can store them as accurately as possible, 00:30:09.960 |
the best current approach we have for AI safety? 00:30:19.400 |
it's like some panacea or that it's a sure thing. 00:30:22.080 |
But, I mean, many years ago, I was thinking like, 00:30:32.960 |
of collective human will with artificial intelligence? 00:30:44.860 |
would necessarily just, because the communication is so slow, 00:30:51.380 |
would diminish the link between humans and computers. 00:31:04.280 |
Let's say you look at this plant or whatever and like, 00:31:07.120 |
hey, I'd really like to make that plant happy, 00:31:16.360 |
then that means the higher the chance we have 00:31:22.460 |
We could better align collective human will with AI 00:31:25.760 |
if the output rate especially was dramatically increased. 00:31:30.520 |
And I think there's potential to increase the output rate 00:31:43.880 |
by increasing the number of electrodes, number of channels, 00:31:46.640 |
and also maybe implanting multiple neural links. 00:31:56.240 |
hundreds of millions of people have neural links? 00:32:06.240 |
the superhuman capabilities that are possible, 00:32:33.120 |
It would supersede the cell phone, for example. 00:32:35.260 |
I mean, the biggest problem that, say, a phone has 00:32:43.720 |
So that's why you've got, you know, auto-complete, 00:33:02.320 |
between every keystroke from a computer's standpoint. 00:33:04.920 |
- Yeah, yeah, the computer's talking to a tree, 00:33:16.320 |
that are doing trillions of instructions per second, 00:33:20.520 |
I mean, that's a trillion things it could've done, you know? 00:33:24.840 |
- Yeah, I think it's exciting and scary for people, 00:33:43.520 |
I mean, we're obviously talking about, by the way, 00:34:12.520 |
- And it's safe, and I can just interact with the computer 00:34:19.200 |
There's certain aspects of human-computer interaction 00:34:22.000 |
when done more efficiently and more enjoyably, 00:34:49.880 |
- And you've also said play to win or don't play at all, 00:34:59.360 |
And the rate of improvement of training compute 00:35:13.080 |
that might be available, what, like next year? 00:35:31.260 |
How much of it is the product that you package it up in? 00:35:42.180 |
Like what matters more, the car or the driver? 00:35:51.020 |
if it's like, let's say it's half the horsepower 00:35:52.300 |
of your competitors, the best driver will still lose. 00:35:56.880 |
then probably even a mediocre driver will still win. 00:36:00.740 |
So the training compute is kind of like the engine. 00:36:05.420 |
So you really, you want to try to do the best on that. 00:36:18.540 |
So obviously that comes down to human talent. 00:36:23.100 |
And then what unique access to data do you have? 00:36:30.980 |
- Yeah, I mean, I think most of the leading AI companies 00:36:44.700 |
what's useful is the fact that it's up to the second, 00:36:48.340 |
you know, because it's hard for them to scrape in real time. 00:36:51.780 |
So there's an immediacy advantage that Grok has already. 00:37:04.380 |
with Optimus, there might be hundreds of millions 00:37:10.420 |
learning a tremendous amount from the real world. 00:37:16.700 |
I think, ultimately, is sort of Optimus probably. 00:37:19.380 |
Optimus is gonna be the biggest source of data. 00:37:28.220 |
It's actually humbling to see how little data 00:37:32.300 |
humans have actually been able to accumulate. 00:37:34.540 |
Really, if you say how many trillions of usable tokens 00:37:39.700 |
have humans generated where, on a non-duplicative, 00:38:06.260 |
- I mean, the Optimus robot can pick up the cup 00:38:08.620 |
and see, did it pick up the cup in the right way? 00:38:14.160 |
Did the water go in the cup or not go in the cup? 00:38:36.820 |
to mass production of humanoid robots like that? 00:38:48.040 |
And it could be higher, it's just that the demand 00:38:55.300 |
And then there's roughly two billion vehicles 00:39:00.380 |
So, which makes sense, like the life of a vehicle 00:39:04.540 |
So, at steady state, you can have 100 million vehicles 00:39:06.340 |
produced a year with a two billion vehicle fleet, roughly. 00:39:09.720 |
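A quick sanity check on that steady-state relationship; the 20-year average lifetime is implied by the two figures rather than stated outright.

```python
# Steady-state fleet size ≈ annual production × average vehicle lifetime.
annual_production = 100_000_000      # vehicles per year
vehicle_lifetime_years = 20          # implied by 2 billion fleet / 100 million per year
fleet_size = annual_production * vehicle_lifetime_years
print(f"{fleet_size:,} vehicles")    # 2,000,000,000 -- roughly the stated fleet
```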
Now, for humanoid robots, the utility is much greater. 00:39:14.520 |
So, my guess is humanoid robots are more like 00:39:24.700 |
it was thought to be an extremely difficult problem. 00:39:27.380 |
I mean, it still is an extremely difficult problem. 00:39:36.840 |
I mean, it can walk in a park, but it's not too difficult. 00:39:39.780 |
But it will be able to walk over a wide range of terrain. 00:39:48.420 |
- But like all kinds of objects, all foreign objects. 00:39:51.620 |
I mean, pouring water in a cup, it's not trivial. 00:39:54.840 |
'Cause then, if you don't know anything about the container, 00:39:59.620 |
- Yeah, there's gonna be an immense amount of engineering 00:40:12.340 |
the hand is probably roughly half of the engineering. 00:40:25.700 |
Intelligent, safe manipulation of objects in the world, yeah. 00:40:29.540 |
I mean, you start really thinking about your hand 00:40:39.540 |
- So, I mean, like your hands, the actuators, 00:40:51.940 |
There's a few small muscles in the hand itself, 00:40:54.540 |
but your hand is really like a skeleton meat puppet, 00:41:06.340 |
which is that you've got a little collection of bones 00:41:14.780 |
and those tendons are mostly what move your hands. 00:41:28.620 |
we tried putting the actuators in the hand itself, 00:41:31.260 |
but then you sort of end up having these like-- 00:41:36.340 |
and then they don't actually have enough degrees of freedom 00:41:48.820 |
through a narrow tunnel to operate the fingers. 00:41:55.460 |
for not having all the fingers the same length. 00:42:04.900 |
- Because it's actually better to have different lengths. 00:42:19.060 |
Like, there's a reason we've got a little finger. 00:42:20.220 |
Like, why not have a little finger that's bigger? 00:42:32.820 |
it would, you'd have noticeably less dexterity. 00:42:44.740 |
The as possible part is, it's quite a high bar. 00:42:55.020 |
So, our new arm has 22 degrees of freedom instead of 11, 00:42:59.660 |
and has the, like I said, the actuators in the forearm. 00:43:02.700 |
And these will, all the actuators are designed from scratch, 00:43:07.340 |
but the sensors are all designed from scratch. 00:43:10.820 |
And we'll continue to put a tremendous amount 00:43:13.620 |
of engineering effort into improving the hand. 00:43:16.140 |
Like, the hand, by hand, I mean like the entire forearm 00:43:23.540 |
So, that's incredibly difficult engineering, actually. 00:43:28.540 |
And so, the simplest possible version of a humanoid robot 00:43:41.260 |
of what a human can do is actually still very complicated. 00:43:51.020 |
For you, what I saw in Memphis, the supercomputer cluster, 00:43:55.380 |
is just this intense drive towards simplifying the process, 00:43:59.620 |
understanding the process, constantly improving it, 00:44:17.740 |
You know, I have this very basic first principles algorithm 00:44:28.940 |
The requirements are always dumb to some degree. 00:44:43.980 |
you could get the perfect answer to the wrong question. 00:44:46.740 |
So, try to make the question the least wrong possible. 00:44:55.860 |
whatever the step is, the part or the process step. 00:45:23.560 |
if they've not been forced to put things back in. 00:45:29.440 |
and have left things in there that shouldn't be. 00:45:45.420 |
but the number of times I've made these mistakes 00:45:53.980 |
So, in fact, I'd say that the most common mistake 00:46:02.620 |
So, like you say, you run through the algorithm. 00:46:11.860 |
and see the process and ask, "Can this be deleted?" 00:46:20.900 |
- No, and actually what generally makes people uneasy 00:46:25.580 |
is that you've got to delete at least some of the things 00:46:31.100 |
- But going back to sort of where our limbic system 00:46:33.740 |
can steer us wrong is that we tend to remember, 00:46:42.060 |
where we deleted something that we subsequently needed. 00:46:49.340 |
they forgot to put in this thing three years ago 00:46:53.620 |
And so they over-correct and then they put too much stuff 00:46:59.620 |
So you actually have to say, "No, we're deliberately 00:47:06.100 |
So that we're putting at least one in 10 things 00:47:23.220 |
like, "Yeah, there's some of the things that we delete, 00:47:29.100 |
But it makes sense because if you're so conservative 00:47:37.540 |
you obviously have a lot of stuff that isn't needed. 00:47:42.900 |
This is, I would say, like a cortical override 00:47:50.020 |
- Yeah, and there's like a step four as well, 00:48:02.540 |
But you shouldn't speed things up until you've tried 00:48:05.140 |
to delete it and optimize it, otherwise you're speeding up 00:48:37.120 |
It's like, wow, I really wasted a lot of effort there. 00:48:41.720 |
I mean, what you've done with the cluster in Memphis 00:48:51.820 |
In fact, I have a call in a few hours with the Memphis team 00:48:58.800 |
'cause we're having some power fluctuation issues. 00:49:13.480 |
when you've all these computers that are training, 00:49:25.120 |
And then the orchestra can go loud to silent very quickly, 00:49:32.720 |
And then the electrical system kind of freaks out about that. 00:49:41.380 |
this is not what electrical systems are expecting to see. 00:49:46.240 |
- So that's one of the main things you have to figure out, 00:49:49.880 |
and then on the software as you go up the stack, 00:49:53.920 |
how to do the distributed compute, all of that. 00:49:57.480 |
- Today's problem is dealing with extreme power jitter. 00:50:25.360 |
- Yeah, I mean, maybe it was 4.22 or something. 00:50:31.440 |
- I mean, I wonder if you could speak to the fact 00:50:33.600 |
that one of the things that you did when I was there 00:50:40.100 |
just to get the sense that you yourself understand it, 00:50:46.180 |
so they can understand when something is dumb, 00:50:48.540 |
or something is inefficient, or that kind of stuff. 00:50:54.500 |
whatever the people at the front lines are doing, 00:51:15.140 |
where you've got RDMA, so remote direct memory access, 00:51:28.200 |
so it's the any GPU can talk to any GPU out of 100,000. 00:51:47.620 |
- I mean, the human brain also has a massive amount 00:51:54.040 |
- So they get the gray matter, which is the compute, 00:52:05.140 |
It's like, we're walking around inside a brain. 00:52:08.420 |
- That will one day build a super intelligent, 00:52:34.800 |
So I think there's already superhuman capabilities 00:52:47.600 |
- Well, I think that generally people would call that 00:52:54.580 |
but there are these thresholds where you say at some point, 00:53:05.320 |
So, and actually each human is machine augmented 00:53:14.000 |
to compete with 8 billion machine augmented humans. 00:53:17.440 |
That's a whole bunch of orders of magnitude more. 00:53:27.140 |
the AI will be smarter than all humans combined. 00:53:37.860 |
And I want to be clear, let's say if xAI is first, 00:53:48.640 |
I mean, they might be six months behind or a year, maybe, 00:53:54.180 |
- So how do you do it in a way that doesn't hurt humanity, 00:53:58.360 |
- So, I mean, I've thought about AI safety for a long time 00:54:02.460 |
and the thing that at least my biological neural net 00:54:06.580 |
comes up with as being the most important thing 00:54:12.500 |
whether that truth is politically correct or not. 00:54:17.220 |
So I think if you force AIs to lie or train them to lie, 00:54:24.280 |
even if that lie is done with good intentions. 00:54:27.820 |
So, I mean, you saw sort of issues with ChatGPT 00:54:37.840 |
for an image of the founding fathers of the United States 00:54:50.040 |
but if an AI is programmed to say like diversity 00:54:57.220 |
and then it becomes sort of this omni-powerful intelligence, 00:55:02.220 |
it could say, okay, well, diversity is now required. 00:55:08.540 |
those who don't fit the diversity requirements 00:55:12.280 |
If it's programmed to do that as the fundamental, 00:55:22.860 |
That's where I think you wanna just be truthful. 00:55:25.820 |
Rigorous adherence to truth is very important. 00:55:44.380 |
And it said, it's worse to misgender Caitlyn Jenner. 00:55:50.740 |
But if you've got that kind of thing programmed in, 00:55:53.940 |
it could, you know, the AI could conclude something 00:56:02.300 |
all humans must die because then misgendering 00:56:09.460 |
There are these absurd things that are nonetheless logical 00:56:28.300 |
'Cause essentially the AI, HAL 9000 was programmed to, 00:56:33.300 |
it was told to take the astronauts to the monolith, 00:56:39.080 |
but also they could not know about the monolith. 00:56:45.380 |
it will kill them and take them to the monolith. 00:56:48.320 |
Thus, it brought them to the monolith, they are dead, 00:56:50.740 |
but they do not know about the monolith, problem solved. 00:56:53.340 |
That is why it would not open the pod bay doors. 00:56:55.740 |
It was a classic scene of like, open the pod bay doors. 00:56:59.880 |
They clearly weren't good at prompt engineering. 00:57:09.560 |
And you want nothing more than to demonstrate 00:57:14.160 |
- Yeah, the objective function has unintended consequences 00:57:19.480 |
almost no matter what, if you're not very careful 00:57:23.720 |
And even a slight ideological bias, like you're saying, 00:57:31.280 |
- But it's not easy to remove that ideological bias. 00:57:34.060 |
- You're highlighting obvious, ridiculous examples, but. 00:57:44.560 |
- And still said insane things and produced insane images. 00:57:58.960 |
And you can try to get as close to the truth as possible 00:58:03.640 |
that there will be some error in what you're saying. 00:58:07.860 |
You don't say you're absolutely certain about something, 00:58:19.280 |
So, you know, that's aspiring to the truth is very important. 00:58:24.280 |
And so, you know, programming it to veer away 00:58:32.360 |
- Right, like, yeah, injecting our own human biases 00:58:36.500 |
But, you know, that's where it's a difficult engineering, 00:58:38.840 |
software engineering problem because you have 00:58:43.120 |
- Well, and the internet at this point is polluted 00:58:52.840 |
there's a thing, if you want to search the internet, 00:58:56.000 |
you can, say, Google, but exclude anything after 2023. 00:59:01.440 |
It will actually often give you better results. 00:59:10.960 |
So, like, in training Grok, we have to go through the data 00:59:15.960 |
and say, like, hey, we actually have to have, 00:59:23.080 |
is this data most likely correct or most likely not 00:59:32.160 |
Yeah, I mean, the data, the data filtration process 00:59:38.880 |
- Do you think it's possible to have a serious, 00:59:41.280 |
objective, rigorous political discussion with Grok, 00:59:50.240 |
I mean, what people are currently seeing with Grok 01:00:04.720 |
And, you know, it's now Grok 2, which finished training, 01:00:32.480 |
- Do you think it matters who builds the AGI? 01:00:35.520 |
The people and how they think and how they structure 01:00:48.920 |
is a maximum truth-seeking AI that is not forced to lie 01:01:07.680 |
that is programmed to lie, even in small ways. 01:01:11.800 |
- Right, because in small ways becomes big ways 01:01:16.960 |
when it's-- - It becomes very big ways, yeah. 01:01:18.920 |
- And when it's used more and more at scale by humans. 01:01:30.360 |
- There was, tragically, an assassination attempt 01:01:34.560 |
After this, you tweeted that you endorse him. 01:01:37.140 |
What's your philosophy behind that endorsement? 01:01:39.200 |
What do you hope Donald Trump does for the future 01:01:41.920 |
of this country and for the future of humanity? 01:01:54.280 |
say an endorsement as, well, I agree with everything 01:01:58.240 |
that person's ever done in their entire life, 01:02:03.200 |
But we have to pick, you know, we've got two choices, 01:02:12.440 |
but the entire administrative structure changes over. 01:02:18.240 |
And I thought Trump displayed courage under fire, 01:02:27.800 |
and he's, like, fist pumping, saying, "Fight." 01:02:32.700 |
Like, you can't feign bravery in a situation like that. 01:02:40.100 |
They would not be, 'cause it could be a second shooter, 01:02:52.440 |
Well, like, you want someone who is strong and courageous 01:03:08.000 |
it was a choice of, you know, Biden, poor guy, 01:03:14.760 |
and the other one's fist pumping after getting shot. 01:03:23.280 |
other world leaders who are pretty tough themselves? 01:03:29.040 |
what are the things that I think are important? 01:03:40.620 |
I think we want to reduce the amount of spending 01:03:44.720 |
that we're, at least, slow down the spending. 01:03:47.560 |
And because we're currently spending at a rate 01:03:57.600 |
exceeded the entire Defense Department spending. 01:04:00.640 |
If this continues, all of the federal government taxes 01:04:08.720 |
you end up, you know, in the tragic situation 01:04:26.840 |
one of the most prosperous places in the world 01:04:54.660 |
- I mean, there's a sort of age-old debate in history, 01:05:06.120 |
or is it determined by the captain of the ship? 01:05:11.680 |
but it also matters who's captain of the ship. 01:05:28.200 |
And these tides are often technologically driven. 01:05:33.160 |
you know, the widespread availability of books 01:05:51.560 |
you want the best possible captain of the ship. 01:05:54.000 |
- Well, first of all, thank you for recommending 01:06:01.400 |
- Oh, the Lessons of History. - Lessons of History. 01:06:12.280 |
'cause they've written, they wrote so long ago, 01:06:25.680 |
But yeah, so to me, the question is how much government, 01:06:34.040 |
versus, like, help it, and which politicians, 01:06:37.440 |
which kind of policies help technological innovation? 01:06:39.720 |
'Cause that seems to be, if you look at human history, 01:06:46.440 |
- Yeah, well, I mean, in terms of dating civilization, 01:06:58.600 |
the right starting point to date civilization. 01:07:01.960 |
And from that standpoint, civilization has been around 01:07:07.760 |
When writing was invented by the ancient Sumerians, 01:07:17.200 |
those ancient Sumerians really have a long list of firsts. 01:07:30.680 |
And then the Egyptians, who were right next door, 01:07:38.280 |
developed an entirely different form of writing, 01:07:42.960 |
Cuneiform and hieroglyphics are totally different. 01:07:49.200 |
Like the cuneiform starts off being very simple, 01:07:53.280 |
and then towards the end, it's like, wow, okay, 01:07:55.240 |
they really get very sophisticated with the cuneiform. 01:07:57.680 |
So I think of civilization as being about 5,000 years old. 01:08:18.240 |
because there's been rises and falls of empires, and-- 01:08:33.520 |
probably less than 1% of what was ever written 01:08:40.560 |
literally chisel it in stone or put it in a clay tablet, 01:08:44.760 |
I mean, there's some small amount of like papyrus scrolls 01:08:47.560 |
that were recovered that are thousands of years old, 01:09:01.240 |
So the vast majority of stuff was not chiseled, 01:09:06.040 |
So that's why we've got a tiny, tiny fraction 01:09:11.720 |
But even that little information that we do have 01:09:21.400 |
- We tend to think that we're somehow different 01:09:24.640 |
One of the other things that Durant highlights 01:09:40.560 |
- Yeah, I mean, you do tend to see the same patterns, 01:09:45.440 |
where they go through a life cycle like an organism, 01:09:50.440 |
just like a human is sort of a zygote, fetus, baby. 01:10:12.160 |
- What do you think it takes for the American empire 01:10:19.400 |
in the next 100 years, to continue flourishing? 01:10:30.200 |
that is often actually not mentioned in history books, 01:10:35.200 |
but Durant does mention it, is the birthrate. 00:10:58.280 |
We're seeing that throughout the world today. 01:11:07.080 |
but there are many others that are close to it. 01:11:14.480 |
South Korea will lose roughly 60% of its population. 01:11:20.080 |
And, but every year, the birthrate is dropping. 01:11:35.520 |
reaches a level of prosperity, the birthrate drops. 01:11:39.640 |
And now you can go and look at the same thing 01:11:51.120 |
and tried to pass, I don't know if he was successful, 01:11:57.520 |
for any Roman citizen that would have a third child. 01:12:19.400 |
Rome fell because the Romans stopped making Romans. 01:12:30.280 |
There was like, they had like quite a serious malaria, 01:12:35.280 |
serious malaria epidemics and plagues and whatnot. 01:12:54.720 |
does not at least maintain its numbers, it will disappear. 01:13:00.040 |
that the biological computer allocates to sex is justified. 01:13:09.640 |
which is, you know, that's neither here nor there. 01:13:23.840 |
'cause he's looked at one civilization after another 01:13:31.600 |
But as soon as there were no external enemies 01:13:34.040 |
or they had an extended period of prosperity, 01:13:50.440 |
I mean, at a base level, no humans, no humanity. 01:13:54.720 |
- And then there's other things like, you know, 01:14:06.760 |
if you do not at least maintain your numbers, 01:14:08.960 |
if you're below replacement rate and that trend continues, 01:14:22.380 |
You know, if there's a global thermonuclear war, 01:14:28.360 |
probably we're all toast, you know, radioactive toast. 01:14:38.440 |
Then there are, there's a thing that happens over time 01:14:47.200 |
which is that the laws and regulations accumulate. 01:14:51.520 |
And if there's not some forcing function like a war 01:14:56.460 |
to clean up the accumulation of laws and regulations, 01:15:02.220 |
And that's like the hardening of the arteries. 01:15:06.760 |
Or a way to think of it is like being tied down 01:15:11.080 |
by a million little strings like Gulliver, you can't move. 01:15:15.180 |
And it's not like any one of those strings is the issue, 01:15:19.260 |
So there has to be a sort of a garbage collection 01:15:28.380 |
so that you don't keep accumulating laws and regulations 01:15:34.480 |
This is why we can't build high-speed rail in America. 01:15:43.740 |
- I wish you could just like for a week go into Washington 01:15:48.180 |
and like be the head of the committee for making, 01:15:54.640 |
making government smaller, like removing stuff. 00:16:03.260 |
- And I would be willing to be part of that commission. 01:16:10.540 |
- The antibody reaction would be very strong. 01:16:26.060 |
- How are you doing with that, being attacked? 01:16:51.220 |
Just getting attacked by a very large number of people, 01:17:05.380 |
I mean, at some point you have to sort of say, 01:17:15.660 |
look, the attacks are by people that actually don't know me. 01:17:22.240 |
So if you can sort of detach yourself somewhat 01:17:32.780 |
or is, they're literally just writing to get, 01:17:40.540 |
Then, you know, then I guess it doesn't hurt as much. 01:17:47.420 |
It's like, it's not quite water off a duck's back. 01:17:56.140 |
What to you is a measure of success in your life? 01:18:00.180 |
like what, how many useful things can I get done? 01:18:02.820 |
- Day-to-day basis, you wake up in the morning. 01:18:10.440 |
Maximize utility, area under the curve of usefulness. 01:18:17.020 |
Can you like speak to what it takes to be useful 01:18:23.380 |
like how do you allocate your time to being the most useful? 01:18:26.380 |
- Well, time is the, time is the true currency. 01:18:32.820 |
- So it is tough to say what is the best allocation time. 01:18:41.900 |
if you look at, say, Tesla, I mean, Tesla this year 01:18:53.140 |
I can affect the outcome by a billion dollars. 01:18:57.160 |
So then, you know, I try to do the best decisions I can, 01:19:14.520 |
can easily be, in the course of an hour, $100 billion. 01:19:21.260 |
How do you do the algorithm that you mentioned? 01:19:26.700 |
can be a billion dollars, how do you decide to-- 01:19:30.620 |
Well, I think you have to look at it on a percentage basis, 01:19:39.220 |
I would just be like, I need to just keep working 01:19:45.640 |
And I'm not trying to get as much as possible 01:19:59.540 |
a slightly better decision could be a $100 million impact 01:20:06.280 |
But it is wild when considering the marginal value 01:20:10.340 |
of time can be $100 million an hour at times, or more. 01:20:17.260 |
- Is your own happiness part of that equation of success? 01:20:27.080 |
So I can't have, like, if I have zero recreational time, 01:20:37.480 |
I mean, my motivation, if I've got a religion of any kind, 01:20:41.140 |
is a religion of curiosity, of trying to understand. 01:20:54.740 |
at some point, civilization understands the universe 01:21:07.460 |
sometimes the answer is arguably the easy part. 01:21:11.780 |
Trying to frame the question correctly is the hard part. 01:21:28.860 |
So for SpaceX, the goal is to make life multi-planetary. 01:21:48.940 |
Like, it's like, why have we not heard from the aliens? 01:21:52.060 |
Now, a lot of people think there are aliens among us. 01:21:55.060 |
I often claim to be one, which nobody believes me, 01:21:58.420 |
but I did say alien registration card at one point 01:22:09.820 |
So it's just that, at least one of the explanations 01:22:16.900 |
And again, if you look at the history of Earth, 01:22:27.620 |
So if aliens had visited here, say, 100,000 years ago, 01:22:32.620 |
they would be like, well, they don't even have writing. 01:22:53.260 |
Mars is the only viable planet for such a thing. 01:23:12.100 |
It's vulnerable to any calamity that takes out Earth. 01:23:14.740 |
So I'm not saying we shouldn't have a moon base, 01:23:26.340 |
So in going through these various explanations 01:23:45.580 |
And one of those hurdles is being a multi-planet species. 01:24:03.420 |
at least the other planet would probably still be around. 01:24:06.740 |
So you don't have all the eggs in one basket. 01:24:10.540 |
And once you are sort of a two-planet species, 01:24:13.020 |
you can obviously extend life paths to the asteroid belt, 01:24:32.100 |
super powerful technology like AGI, for example. 01:24:42.400 |
- Digital superintelligence is possibly a great filter. 01:25:00.540 |
I think he puts the probability of AI annihilation 01:25:16.120 |
So, but I think AI risk mitigation is important. 01:25:30.940 |
emphasize the importance of having enough children 01:25:39.420 |
into population collapse, which is currently happening. 01:25:46.060 |
Population collapse is a real and current thing. 01:26:04.620 |
what the population of any given country will be. 01:26:13.540 |
and that's what the population will be, steady state, 01:26:15.220 |
if the birth rate continues at that level. 01:26:18.860 |
But if it keeps declining, it will be even less 01:26:23.420 |
So I keep banging on the baby drum here for a reason, 01:26:28.420 |
because it has been the source of civilizational collapse 01:26:33.780 |
And so, why don't we just try to stave off that day? 01:26:39.900 |
- Well, in that way, I have miserably failed civilization, 01:26:54.740 |
- Yeah, I gotta allocate more compute to the whole process. 01:27:04.380 |
- Well, one of the things you do for me, for the world, 01:27:11.680 |
is to inspire us with what the future could be. 01:27:14.080 |
And so, some of the things we've talked about, 01:27:21.480 |
and expanding the capabilities of the human mind, 01:27:28.120 |
so creating a backup for humanity on another planet, 01:27:36.020 |
of what artificial intelligence could be in this world, 01:27:56.220 |
to keep building and creating cool stuff, including kids. 01:28:08.640 |
Thanks for listening to this conversation with Elon Musk. 01:28:15.680 |
the co-founder, president, and COO of Neuralink. 01:28:18.700 |
When did you first become fascinated by the human brain? 01:28:28.300 |
and how it was engineered to serve that purpose, 01:28:42.260 |
and they were engineered with that purpose in mind. 01:28:49.220 |
in seeing things, touching things, feeling things, 01:28:55.880 |
of how it was designed to serve that purpose. 01:29:00.960 |
brain is just a fascinating organ that we all carry. 01:29:07.280 |
that has intelligence and cognition that arise from it. 01:29:10.720 |
And, you know, we haven't even scratched the surface 01:29:17.080 |
But also at the same time, I think it took me a while 01:29:28.740 |
key moments in my life where some of those, I think, 01:29:39.940 |
You know, one was growing up, both sides of my family, 01:29:43.780 |
my grandparents had a very severe form of Alzheimer's. 01:29:48.160 |
And it's, you know, incredibly debilitating conditions. 01:29:53.160 |
I mean, literally, you're seeing someone's whole identity 01:29:59.360 |
And I just remember thinking how both the power of the mind, 01:30:09.840 |
- It's fascinating that that is one of the ways 01:30:17.940 |
- Yeah, a lot of what we know about the brain 01:30:44.280 |
and incredibly resilient in many different ways. 01:30:46.640 |
- And by the way, the term plastic, as we'll use a bunch, 01:30:52.760 |
So neuroplasticity refers to the adaptability 01:31:02.160 |
have shaped towards the current focus of my life 01:31:06.360 |
has been during my teenage years when I came to the US. 01:31:06.360 |
because I didn't understand the artificial construct 01:31:29.840 |
not being able to connect with peers around me. 01:31:41.160 |
I just found them really, really interesting. 01:31:43.480 |
And also it was a great way for me to learn English. 01:31:46.440 |
You know, some of the first set of books that I picked up 01:31:52.000 |
by Orson Scott Card and "Neuromancer" from William Gibson 01:32:00.120 |
And, you know, movies like "Matrix" was coming out 01:32:02.720 |
around that time point that really influenced 01:32:07.640 |
that technology can have for our lives in general. 01:32:13.560 |
I was always fascinated by just physical stuff, 01:32:16.920 |
building physical stuff, and especially physical things 01:32:23.840 |
And, you know, I studied electrical engineering 01:32:26.640 |
during undergrad, and I started out my research in MEMS, 01:32:33.280 |
and really building these tiny nanostructures 01:32:37.400 |
And I just found that to be just incredibly rewarding 01:32:42.960 |
how you can build something miniature like that, 01:32:45.480 |
that, again, serve a function and had a purpose. 01:32:50.880 |
of my college years basically building millimeter wave 01:32:55.000 |
circuits for next-gen telecommunication systems for imaging. 01:33:02.400 |
very intellectually interesting, you know, phase arrays, 01:33:05.360 |
how the signal processing works for, you know, 01:33:08.760 |
any modern as well as next-gen telecommunication system, 01:33:13.480 |
EM waves or electromagnetic waves are fascinating. 01:33:17.760 |
How do you design antennas that are most efficient 01:33:23.720 |
How do you make these things energy efficient? 01:33:44.320 |
similar to 3G, 4G, 5G, but the next, next generation G system 01:33:49.320 |
and how you would design circuits around that 01:33:58.400 |
So I was just absolutely just fascinated by how 01:34:01.720 |
that entire system works and that infrastructure works. 01:34:09.040 |
I had sort of the fortune of having, you know, 01:34:17.160 |
And that's one of the things that I really enjoyed 01:34:21.800 |
where you got to kind of pursue your intellectual curiosity 01:34:25.160 |
in the domain that may not matter at the end of the day, 01:34:29.600 |
really allows you the opportunity to go as deeply 01:34:34.040 |
as you want, as well as as widely as you want. 01:34:46.040 |
of signaling pathway that cells follow to close that wound. 01:34:56.680 |
you can actually accelerate the closing of that wound 01:34:59.720 |
by having, you know, basically electro-taxing 01:35:12.760 |
some sort of a wearable patch that you could apply 01:35:29.840 |
and, you know, really shaped rest of my PhD career. 01:35:37.920 |
I mean, there were some peripheral, you know, 01:35:44.560 |
and telecommunication system that I was using 01:35:56.200 |
and understanding the constraints around that 01:36:04.680 |
And that's also kind of how I got introduced to Michel. 01:36:09.160 |
You know, he's sort of known for remote control 01:36:24.320 |
is to kind of understand how small of a thing you can make. 01:36:28.840 |
And a lot of that is driven by how much energy 01:36:46.040 |
to really miniaturize these implantable systems. 01:36:48.840 |
And I distinctively remember this one particular meeting 01:37:00.640 |
And then he proceeded to kind of walk through 01:37:06.320 |
And that really formed the basis for my thesis work 01:37:09.160 |
called the Neural Dust system that was looking at ways 01:37:13.920 |
to use ultrasound as opposed to electromagnetic waves 01:37:22.400 |
the initial goal of the project was to build these tiny, 01:37:34.360 |
and being able to ping that back to the outside world 01:37:39.080 |
And as I mentioned, the size of the implantable system 01:37:55.680 |
with some interesting proteins and chemicals, 01:37:57.760 |
but it's mostly saltwater that's very, very well 01:38:11.040 |
for any electronics to survive as I'm sure you've experienced 01:38:29.800 |
And just the speed of light, it is what it is. 01:38:40.280 |
at which you are interfacing with the device, 01:38:48.880 |
is that you want the wave front to be roughly 01:38:56.120 |
So an implantable system that is around 10 to 100 micron 01:39:08.280 |
You would have to operate at like hundreds of gigahertz, 01:39:15.480 |
to build electronics operating at those frequencies, 01:39:23.680 |
So the interesting kind of insight of this ultrasound 01:39:31.680 |
a lot more effectively in the human body tissue 01:39:40.120 |
and I'm sure most people have encountered in their lives. 01:40:02.160 |
the fact that it travels through the body extremely well 01:40:07.680 |
to the body really well is that just the wave front 01:40:26.240 |
orders and orders of magnitude less than speed of light, 01:40:30.000 |
which means that even at 10 megahertz ultrasound wave, 01:40:33.920 |
your wave front ultimately is a very, very small wavelength. 01:40:37.760 |
So if you're talking about interfacing with the 10 micron 01:40:43.880 |
you would have 150 micron wave front at 10 megahertz 01:40:49.320 |
and building electronics at those frequencies 01:40:52.280 |
are much, much easier and they're a lot more efficient. 01:40:59.200 |
using ultrasound as a mechanism for powering the device 01:41:05.920 |
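A small worked comparison of the wavelength argument above, assuming roughly 1,540 m/s for sound in soft tissue and a relative permittivity of about 50 for the electromagnetic case; both are typical textbook values used here only for illustration.

```python
# Wavelength = propagation speed / frequency.
c = 3.0e8                              # speed of light in vacuum, m/s
v_em_tissue = c / 50 ** 0.5            # EM speed in tissue, assuming permittivity ~50
v_ultrasound_tissue = 1540.0           # typical speed of sound in soft tissue, m/s

def wavelength_um(speed_m_s, freq_hz):
    return speed_m_s / freq_hz * 1e6   # result in microns

# Ultrasound at 10 MHz already matches a ~100-micron implant:
print(wavelength_um(v_ultrasound_tissue, 10e6))   # ~154 microns
# Reaching a comparable EM wavelength in tissue takes hundreds of gigahertz:
print(wavelength_um(v_em_tissue, 300e9))          # ~141 microns at 300 GHz
```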
So now the question is, how do you get the data back? 01:41:12.600 |
This is actually something that is very common 01:41:18.600 |
with our RFID cards, our radio frequency ID tags, 01:41:37.040 |
And then there's an external device called a reader 01:41:43.960 |
with some sort of modulation that's unique to your ID. 01:41:47.280 |
That's what's called backscattering fundamentally. 01:41:50.600 |
So the tag itself actually doesn't have to consume 01:41:54.880 |
And that was a mechanism to which we were kind of thinking 01:42:00.120 |
So when you have an external ultrasonic transducer 01:42:04.240 |
that's sending ultrasonic wave to your implant, 01:42:08.960 |
and it records some information about its environment, 01:42:12.880 |
whether it's a neuron firing or some other state 01:42:21.080 |
and then it just amplitude modulates the wave front 01:42:27.160 |
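A toy numerical sketch of that backscatter scheme; all signal parameters below are made up for illustration. The implant only varies how strongly it reflects the incoming carrier, and the external reader recovers the modulation by envelope detection.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 50e6                                   # sample rate, Hz (illustrative)
f_carrier = 2e6                             # ultrasound carrier, Hz (illustrative)
t = np.arange(0, 1e-3, 1 / fs)

carrier = np.sin(2 * np.pi * f_carrier * t)                     # wave from external transducer
neural_state = (np.sin(2 * np.pi * 1e3 * t) > 0).astype(float)  # toy 1 kHz on/off "activity"

reflectivity = 0.2 + 0.6 * neural_state    # implant modulates how much it reflects
echo = reflectivity * carrier              # backscattered wave seen by the reader

# Reader side: rectify and low-pass filter to recover the modulation envelope.
b, a = butter(4, 100e3 / (fs / 2))
recovered = filtfilt(b, a, np.abs(echo))
```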
- And the recording step would be the only one 01:42:31.320 |
So what would require energy in that little step? 01:42:34.560 |
So it is that initial kind of startup circuitry 01:42:42.520 |
And the mechanism to which that you can enable that 01:42:58.120 |
between the ultrasonic domain and electrical domain 01:43:10.080 |
that's the dream, the vision of brain-computer interfaces. 01:43:16.120 |
can you give a sense of the history of the field of BCI? 01:43:26.120 |
and also some of the milestones along the way 01:43:30.560 |
and the amazing work done at the various labs? 01:43:32.840 |
- I think a good starting point is going back to 1790s. 01:43:53.800 |
where he connected set of electrodes to a frog leg 01:43:57.880 |
and ran current through it and then it started twitching 01:44:00.400 |
and he said, "Oh my goodness, body's electric." 01:44:07.880 |
where Hans Berger who's a German psychiatrist 01:44:20.880 |
that gives you some sort of neural recording. 01:44:29.640 |
And then in the 1940s, there were these group of scientists, 01:44:50.480 |
that are a bit more high resolution and high fidelity 01:45:03.920 |
and they built this beautiful, beautiful models 01:45:12.240 |
And as someone who is an electrical engineer, 01:45:21.800 |
and how that really leads to how neurons communicate. 01:45:25.840 |
And they won the Nobel Prize for that 10 years later 01:45:29.560 |
So in 1969, Ed Fetz from University of Washington 01:45:35.960 |
called "Operant Conditioning of Cortical Unit Activity" 01:45:38.880 |
where he was able to record a single unit neuron 01:45:43.800 |
from a monkey and was able to have the monkey modulated 01:45:52.360 |
So I would say this is the very, very first example 01:46:01.920 |
The abstract reads, "The activity of single neurons 01:46:05.040 |
"in precentral cortex of unanesthetized monkeys 01:46:11.640 |
"of neuronal discharge with delivery of a food pellet. 01:46:15.140 |
"Auditory and visual feedback of unit firing rates 01:46:18.200 |
"was usually provided in addition to food reinforcement." 01:46:33.500 |
"of newly isolated cells by 50 to 500% above rates 01:46:44.560 |
- And so from here, the number of experiments grew. 01:46:48.280 |
- Yeah, number of experiments as well as set of tools 01:46:52.240 |
to interface with the brain have just exploded. 01:46:54.700 |
I think, and also just understanding the neural code 01:47:11.040 |
was this paper in the 1980s from Georgopoulos 01:47:22.500 |
It's the fact that there are neurons in the motor cortex 01:47:32.820 |
So what that means is there are a set of neurons 01:47:38.180 |
when you're thinking about moving to the left, 01:47:49.820 |
well, if you can identify those essential eigenvectors, 01:47:53.940 |
you can do a lot and you can actually use that information 01:47:56.260 |
for actually decoding someone's intended movement 01:48:00.500 |
So that was a very, very seminal kind of paper 01:48:08.060 |
that you can extract, especially in the motor cortex. 01:48:13.420 |
And if you measure the electrical signal from the brain, 01:48:17.360 |
that you could actually figure out what the intention was. 01:48:20.820 |
- Correct, yeah, not only electrical signals, 01:48:22.740 |
but electrical signals from the right set of neurons 01:48:32.220 |
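A minimal sketch of the directional-tuning result being described: cosine tuning plus a population-vector readout. The model and numbers are illustrative and are not the decoder any particular lab or Neuralink uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Cosine tuning: each neuron fires most for movement in its "preferred direction"
# and its rate falls off as the cosine of the angle away from that direction.
n_neurons = 64
preferred = rng.uniform(0, 2 * np.pi, n_neurons)   # preferred directions, radians
baseline, gain = 10.0, 8.0                          # spikes/s, illustrative

def firing_rates(movement_angle):
    return baseline + gain * np.cos(movement_angle - preferred)

def population_vector_decode(rates):
    # Weight each preferred direction by how far the neuron is above baseline.
    w = rates - baseline
    x, y = np.sum(w * np.cos(preferred)), np.sum(w * np.sin(preferred))
    return np.arctan2(y, x)

true_angle = np.deg2rad(135)
observed = firing_rates(true_angle) + rng.normal(0, 1.0, n_neurons)  # noisy rates
print(np.rad2deg(population_vector_decode(observed)))                # ~135 degrees
```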
one interesting question is what do I understand 01:48:35.540 |
on the BCI front on invasive versus non-invasive 01:48:42.780 |
How important is it to park next to the neuron? 01:48:53.260 |
There's actually an incredible amount of stuff 01:48:55.420 |
that you can do with EEG and electrocorticography, ECoG, 01:48:59.620 |
which actually doesn't penetrate the cortical layer 01:49:02.420 |
or parenchyma, but you place a set of electrodes 01:49:07.900 |
So the thing that I'm personally very interested in 01:49:12.900 |
and being able to just really tap into the high resolution, 01:49:18.780 |
high fidelity understanding of the activities 01:49:26.220 |
but just to kind of step back to kind of use analogy, 01:49:34.300 |
At the end of the day, we're doing electrical recording 01:49:40.980 |
which is really, really hard for most people to think about. 01:49:51.780 |
and the frequency band with which that's happening 01:49:54.100 |
is actually very, very similar to sound waves 01:50:01.060 |
So the analogy that typically is used in the field 01:50:11.900 |
you maybe get a sense of how the game is going 01:50:14.420 |
based on the cheers and the boos of the home crowd, 01:50:18.820 |
but you have absolutely no idea what the score is. 01:50:22.020 |
You have absolutely no idea what individual audience 01:50:26.180 |
or the players are talking or saying to each other 01:50:28.740 |
what the next play is, what the next goal is. 01:50:30.980 |
So what you have to do is you have to drop the microphone 01:50:34.780 |
near into the stadium and then get near the source, 01:50:41.140 |
In this specific example, you would wanna have it 01:50:47.300 |
So I think that's kind of a good illustration 01:50:50.180 |
of what we're trying to do when we say invasive 01:50:54.220 |
or minimally invasive or implanted brain-computer interfaces 01:50:57.780 |
versus non-invasive or non-implanted brain interfaces. 01:51:02.100 |
It's basically talking about where do you put 01:51:04.460 |
that microphone and what can you do with that information? 01:51:07.340 |
- So what is the biophysics of the read and write 01:51:13.700 |
as we now step into the efforts at Neuralink? 01:51:18.220 |
- Yeah, so brain is made up of these specialized cells 01:51:31.660 |
that are connected in this complex yet dynamic network 01:51:42.180 |
and that's what we typically call neuroplasticity. 01:51:45.820 |
And the neurons are also bathed in this charged environment 01:51:55.060 |
like potassium ions, sodium ions, chloride ions. 01:52:21.420 |
which in my opinion is one of nature's best inventions. 01:52:26.420 |
In many ways, if you think about what they are, 01:52:29.060 |
they're doing the job of a modern day transistors. 01:52:32.700 |
Transistors are nothing more at the end of the day 01:52:37.160 |
And nature found a way to have that very, very early on 01:52:47.340 |
and a lot of amazing things that we have access to today. 01:52:51.940 |
So I think it's one of those just as a tangent, 01:53:02.220 |
- I mean, I suppose there's on the biological level, 01:53:05.220 |
every level of the complexity of the hierarchy 01:53:07.940 |
of the organism, there's going to be some mechanisms 01:53:11.620 |
for storing information and for doing computation. 01:53:16.900 |
But to do that with biological and chemical components 01:53:21.460 |
Plus like when neurons, I mean, it's not just electricity, 01:53:25.620 |
it's chemical communication, it's also mechanical. 01:53:29.460 |
And these are like actual objects that have like, 01:53:37.780 |
there's a lot of really, really interesting physics 01:53:41.900 |
And kind of going back to my work on ultrasound 01:53:53.740 |
to cause neurons to actually fire an action potential 01:54:06.400 |
some sort of thermal energy and that causes cells 01:54:11.440 |
But there are also these ion channels or even membranes 01:54:18.220 |
as they're being mechanically shaken, vibrated. 01:54:21.520 |
So there's just a lot of elements of these move particles, 01:54:26.520 |
which again, that's governed by diffusion physics, 01:54:32.300 |
And there's also a lot of kind of interesting physics there. 01:54:35.620 |
- Also not to mention, as Roger Penrose talks 01:54:38.380 |
about, there might be some beautiful weirdness 01:54:42.120 |
in the quantum mechanical effects of all of this. 01:54:46.620 |
might emerge from the quantum mechanical effects there. 01:54:52.260 |
there's biology, all of that is going on there. 01:54:55.400 |
I mean, you can, yes, there's a lot of levels of physics 01:55:00.940 |
But yeah, in the end, you have these membranes 01:55:10.220 |
that are in the extracellular matrix, like in and out. 01:55:15.220 |
And these neurons generally have these like resting potential 01:55:22.620 |
between inside the cell and outside the cell. 01:55:25.260 |
And when there's some sort of stimuli that changes the state 01:55:38.380 |
you start to kind of see these like sort of orchestration 01:55:40.660 |
of these different molecules going in and out 01:55:46.060 |
once it reaches some threshold to a point where, 01:55:53.420 |
So it's just a very beautiful kind of orchestration 01:55:59.100 |
And what we're trying to do when we place an electrode 01:56:04.260 |
or parking it next to a neuron is that you're trying 01:56:07.580 |
to measure these local changes in the potential. 01:56:11.220 |
Again, mediated by the movements of the ions. 01:56:17.140 |
And what's interesting, as I mentioned earlier, 01:56:31.020 |
And where one dominates, where Maxwell's equation dominates 01:56:41.220 |
If it's close to the source, mostly electromagnetic based, 01:56:47.340 |
when you're farther away from it, it's more diffusion based. 01:56:51.620 |
So essentially when you're able to park it next to it, 01:56:55.580 |
you can listen in on those individual chatter 01:57:03.380 |
are these canonical textbook neural spiking waveform. 01:57:10.220 |
and based on some of the studies that people have done, 01:57:24.300 |
You're no longer able to kind of have the system 01:57:30.420 |
that particular local membrane potential change 01:57:36.780 |
And just to kind of give you a sense of scale also, 01:57:49.100 |
and whatever number of connections that they have. 01:57:58.660 |
detect that change from that one specific neuron 01:58:03.580 |
- Yeah, but as you're moving about this space, 01:58:11.340 |
you'll be hearing chatter from another community. 01:58:14.100 |
- And so the whole sense is you want to place 01:58:38.540 |
obviously it's not just this one neuron that's activating, 01:58:52.860 |
And that's what you're recording when you're farther away. 01:58:54.860 |
I mean, you still have some reference electrode 01:59:22.780 |
kind of global effect of the brain that you can detect. 01:59:32.060 |
like, I mean, if we really wanna go down that rabbit hole, 01:59:38.180 |
like why diffusion physics at some point dominates 01:59:45.780 |
So similar to how when you have electromagnetic waves 02:00:03.660 |
on kind of the signal attenuation over distance, 02:00:07.260 |
you start to see kind of one over R squared in the beginning 02:00:21.900 |
the biophysics that you need to understand is not as deep 02:00:29.540 |
you're listening to a small crowd of local neurons. 02:00:41.380 |
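To make the near-field versus far-field point concrete: with an inverse-square-style falloff, moving the electrode even a few times farther from a source costs most of the amplitude, which is why parking next to the neuron matters so much. The distances and amplitudes below are illustrative assumptions, not measured values.

```python
# Illustrative only: how quickly an inverse-square falloff buries a signal.
reference_distance_um = 10.0    # assumed distance when "parked next to" a neuron
reference_amplitude_uv = 100.0  # assumed extracellular spike amplitude at that distance

for distance_um in (10.0, 50.0, 100.0, 1000.0):
    amplitude = reference_amplitude_uv * (reference_distance_um / distance_um) ** 2
    print(f"{distance_um:>7.0f} um -> ~{amplitude:6.2f} uV")
```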
there's a whole field of neuroscience that's studying like 02:00:46.500 |
the different sections of the seating in the arena, 02:00:50.780 |
which is where the metaphor probably falls apart 02:00:53.340 |
'cause the seating is not that organized in an arena. 02:01:04.220 |
you have to hit it with just the right set of stimuli. 02:01:10.500 |
There's, I mean, similar to dark energy and dark matter, 02:01:26.940 |
but like when they speak, they say profound shit, I think. 02:01:31.860 |
Anyway, before we zoom in even more, let's zoom out. 02:02:02.540 |
that Neuralink just accomplished in January of this year, 02:02:07.100 |
putting a Neuralink implant in the first human being, Noland. 02:02:13.860 |
about his experience, because he's able to describe 02:02:18.100 |
and the fascinating complexity of that experience, 02:02:22.180 |
But on the technical level, how does Neuralink work? 02:02:31.980 |
the thing that's actually recording these neural chatters. 02:02:45.460 |
of these tiny, tiny wires that we call threads 02:02:54.860 |
you have these neural signals, these spiking neurons 02:03:02.420 |
to decode what the users intend to do with that. 02:03:06.860 |
So there's what's called a Neuralink Application, 02:03:12.740 |
It's running the very, very simple machine learning model 02:03:17.140 |
that decodes these inputs that are neural signals, 02:03:23.940 |
that allows our participant, first participant, Noland, 02:03:28.380 |
to be able to control a cursor on the screen. 02:03:39.140 |
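For a mental model of what a very simple decoding model can look like, the sketch below maps binned spike counts per channel to a 2D cursor velocity with a ridge-regression readout fit on calibration data. Everything here (shapes, names, the use of ridge regression) is an illustrative assumption, not the actual Neuralink Application.

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_bins = 1024, 5000

# Assumed calibration data: binned spike counts per channel, and the cursor
# velocities the participant was asked to produce during those bins.
spike_counts = rng.poisson(2.0, size=(n_bins, n_channels)).astype(float)
true_weights = rng.normal(scale=0.05, size=(n_channels, 2))
target_velocity = spike_counts @ true_weights + rng.normal(scale=0.1, size=(n_bins, 2))

# Ridge-regression readout: velocity ~= spike_counts @ W
lam = 1.0
gram = spike_counts.T @ spike_counts + lam * np.eye(n_channels)
W = np.linalg.solve(gram, spike_counts.T @ target_velocity)

new_bin = rng.poisson(2.0, size=(1, n_channels)).astype(float)
print("decoded cursor velocity:", new_bin @ W)  # one 2D velocity per time bin
```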
The Link has these flexible, tiny wires called threads 02:03:44.140 |
that have multiple electrodes along its length. 02:03:49.580 |
And they're only inserted into the cortical layer, 02:03:53.860 |
which is about three to five millimeters in a human brain. 02:03:59.380 |
that's where the intention for movement lies in. 02:04:06.660 |
each thread having 16 electrodes along the span 02:04:10.020 |
of three to four millimeters, separated by 200 microns. 02:04:13.540 |
So you can actually record along the depth of the insertion. 02:04:18.420 |
And based on that signal, there's a custom integrated circuit 02:04:18.420 |
or ASIC that we built that amplifies the neural signals 02:04:23.420 |
that you're recording and then digitizes them, 02:04:29.260 |
that is a spiking event, and decide to send that 02:04:43.300 |
or not send that through Bluetooth to an external device, 02:04:50.180 |
- So there's onboard signal processing already 02:04:52.380 |
just to decide whether this is an interesting event or not. 02:05:02.340 |
to kind of really compress the amount of signal 02:05:05.900 |
So we have a total of 1,000 electrodes sampling 02:05:14.300 |
So that's 200 megabits that's coming through to the chip 02:05:19.300 |
from 1,000 channel simultaneous neural recording. 02:05:27.780 |
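The transcript elides the exact sampling figures here, but the quoted 200 megabits works out if you assume roughly 20 kHz sampling at 10 bits per sample across 1,000 channels; those two numbers are assumptions used only to reproduce the arithmetic.

```python
channels = 1_000
sample_rate_hz = 20_000   # assumption: on the order of 20 kHz per channel
bits_per_sample = 10      # assumption: 10-bit samples

raw_rate_bps = channels * sample_rate_hz * bits_per_sample
print(f"~{raw_rate_bps / 1e6:.0f} Mbps of raw data before on-chip compression")  # ~200 Mbps
```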
And, you know, there are technology available 02:05:29.900 |
to send that off wirelessly, but being able to do that 02:05:32.940 |
in a very, very thermally constrained environment 02:05:37.620 |
So there has to be some amount of compression 02:05:40.580 |
that happens to send off only the interesting data 02:05:45.500 |
for motor decoding is occurrence of a spike or not. 02:05:50.340 |
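A minimal way to picture that compression step is a per-channel threshold detector: keep only "a spike happened on this channel at this time" and throw away the raw waveform. This is a generic sketch under assumed conventions; the actual on-implant detector is not spelled out in this conversation.

```python
import numpy as np

def detect_spikes(samples, threshold_factor=4.5):
    """Return sample indices where one channel of band-passed data crosses
    a robust noise threshold (a common convention, used here for illustration)."""
    noise_sigma = np.median(np.abs(samples)) / 0.6745
    threshold = -threshold_factor * noise_sigma  # extracellular spikes are negative-going
    below = samples < threshold
    # Keep only the first sample of each crossing, not every sample below threshold.
    return np.flatnonzero(below & ~np.roll(below, 1))

rng = np.random.default_rng(2)
trace = rng.normal(scale=5.0, size=20_000)  # fake noise, in microvolts
trace[5_000:5_003] -= 80.0                  # inject one fake spike
print(detect_spikes(trace))                 # most likely prints [5000]
```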
And then being able to use that to, you know, 02:06:07.900 |
send it off through Bluetooth to an external device 02:06:30.820 |
There's an enclosure, there's a charging coil. 02:06:34.180 |
So we didn't talk about the charging, which is fascinating. 02:06:38.180 |
The battery, the power electronics, the antenna. 02:06:42.460 |
Then there's the signal processing electronics. 02:07:12.420 |
are the ones that are actually penetrating the cortex. 02:07:19.020 |
is occupied by the battery, rechargeable battery. 02:07:45.660 |
once you have the craniectomy and the durectomy, 02:07:45.660 |
and you can screw in these self-drilling cranial screws 02:08:15.940 |
off of the top of the implant to where the screws are. 02:08:38.220 |
- They are also, the threads themselves are quite strong. 02:08:52.300 |
and manipulate this tiny hair-like structure. 02:08:58.500 |
- Yeah, so the width of a thread starts from 16 micron 02:09:06.500 |
So average human hair is about 80 to 100 micron in width. 02:09:16.780 |
- Yes, most of the volume is occupied by the battery, 02:09:22.780 |
And the charging is done through inductive charging, 02:09:30.460 |
You know, your cell phone, most cell phones have that. 02:09:32.860 |
The biggest difference is that, you know, for us, 02:09:44.700 |
There's a very strict regulation and good reasons 02:09:47.180 |
to not actually increase the surrounding tissue temperature 02:09:55.060 |
that is packed into this to allow charging of this implant 02:10:00.060 |
without causing that temperature threshold to be reached. 02:10:03.740 |
And even small things like you see this charging coil 02:10:11.940 |
what you end up having when you have, you know, 02:10:14.020 |
resonant inductive charging is that the battery itself 02:10:17.580 |
is a metallic can and you form these eddy currents 02:10:22.300 |
from external charger and that causes heating. 02:10:26.940 |
And that actually contributes to inefficiency in charging. 02:10:30.820 |
So this ferrite shield, what it does is that it actually 02:10:35.900 |
concentrates the field lines away from the battery 02:10:40.140 |
and then around the coil that's actually wrapped around it. 02:10:42.900 |
- There's a lot of really fascinating design here 02:10:46.260 |
to make it, I mean, you're integrating a computer 02:10:49.380 |
into a biological, a complex biological system. 02:11:01.300 |
There's a lot of really, really powerful, tiny, low power 02:11:09.860 |
or various different sensors and power electronics. 02:11:14.020 |
A lot of innovation really came in the charging coil design, 02:11:17.620 |
how this is packaged and how do you enable charging 02:11:21.220 |
such that you don't really exceed that temperature limit, 02:11:24.820 |
which is not a constraint for other devices out there. 02:11:28.060 |
- So let's talk about the threads themselves, 02:11:38.620 |
And what do the electrodes have to do with the threads? 02:11:41.940 |
- Yeah, so the current instantiation of the device 02:11:45.620 |
has 64 threads and each thread has 16 electrodes 02:11:53.620 |
that are capable of both recording and stimulating. 02:11:56.780 |
And the thread is basically this polymer insulated wire. 02:12:06.820 |
The metal conductor is the kind of a tiramisu cake 02:12:14.780 |
And they're very, very tiny wires, two micron in width. 02:12:27.940 |
has the polymer insulation, has the conducting material 02:12:38.500 |
- Yes, you're not gonna be able to see it with naked eyes. 02:12:52.780 |
So each of these threads are, as I mentioned, 02:12:55.940 |
16 micron in width and then they taper to 84 micron, 02:12:59.700 |
but in thickness, they're less than five micron. 02:13:03.980 |
And in thickness, it's mostly polyimide at the bottom 02:13:08.820 |
and this metal track, and then another polyimide. 02:13:11.860 |
So two micron of polyimide, 400 nanometer of this metal stack 02:13:16.860 |
and two micron of polyimide sandwiched together 02:13:27.820 |
to some interesting aspects of the material design here? 02:13:31.260 |
Like what does it take to design a thing like this 02:13:34.700 |
and to be able to manufacture a thing like this 02:13:37.540 |
for people who don't know anything about this kind of thing? 02:13:40.380 |
- Yeah, so the material selection that we have is not, 02:13:47.180 |
There were other labs and there are other labs 02:13:50.700 |
that are kind of looking at similar material stack. 02:13:57.940 |
and still needs to be answered around the longevity 02:14:01.620 |
and reliability of these microelectrodes that we call 02:14:06.220 |
compared to some of the other more conventional 02:14:09.020 |
neural interfaces, devices that are intracranial, 02:14:12.420 |
so penetrating the cortex that are more rigid, 02:14:17.540 |
that are these four by four millimeter kind of silicon shank 02:14:21.940 |
that have exposed recording site at the end of it. 02:14:26.660 |
And, you know, that's been kind of the innovation 02:14:34.060 |
'cause, you know, he was at University of Utah. 02:15:01.700 |
At the very tip of it is an exposed electrode 02:15:14.140 |
that are actually exposed iridium oxide recording sites 02:15:17.460 |
along the depth, this is only at a single depth. 02:15:29.420 |
so you can have it inserted at different depth. 02:15:37.420 |
is the fact that there's no active electronics. 02:15:41.780 |
And then there's a bundle of a wire that you're seeing. 02:15:44.580 |
And then that actually then exits the craniectomy 02:15:47.660 |
that then has this port that you can connect to 02:15:53.820 |
They are working on or have the wireless telemetry device 02:15:57.780 |
but it still requires a through-the-skin port 02:16:01.780 |
that actually is one of the biggest failure modes 02:16:14.340 |
R1 implanting those threads, how difficult is that task? 02:16:21.860 |
they're very, very difficult to maneuver by hand. 02:16:31.180 |
actually positioning it near the site that they want. 02:16:52.660 |
So that's why we built an entire robot to do that. 02:16:55.940 |
There are other reasons for why we built the robot. 02:17:01.980 |
millions and millions of people that can benefit from this. 02:17:04.380 |
And there just aren't that many neurosurgeons out there. 02:17:22.420 |
sort of category of product that we're working on. 02:17:26.100 |
And it's essentially this multi-axis gantry system 02:17:39.220 |
and this kind of a needle retracting mechanism 02:17:47.020 |
via this loop structure that you have on the thread. 02:18:08.380 |
And then after that, there's a computer vision component 02:18:13.100 |
that's finding a way to avoid the blood vessels. 02:18:25.260 |
And also choosing the depth of placement, all that. 02:18:27.940 |
So controlling every, like the 3D geometry of the placement. 02:18:34.820 |
is that it's not surgeon-assisted or human-assisted. 02:18:42.300 |
once you, obviously there are human component to it 02:18:51.780 |
But I mean, we want to get to a point where one click 02:19:04.900 |
And the robot, does it do like one thread at a time, 02:19:09.300 |
And that's actually also one thing that we are looking 02:19:17.500 |
You can have multiple kind of engagement mechanisms, 02:19:24.180 |
And we also still do quite a bit of just kind of verification 02:19:33.660 |
what was programmed in and so on and so forth. 02:19:36.020 |
- And the actual electrodes are placed at very, 02:19:42.820 |
I mean, it's very small differences, but differences. 02:19:46.500 |
- And so that there's some reasoning behind that, 02:19:49.860 |
as you mentioned, like it gets more varied signal. 02:19:54.860 |
- Yeah, I mean, we try to place them all around three 02:20:07.980 |
in this version spans roughly around three millimeters. 02:20:19.260 |
If we go zoom in at specific on the electrodes, 02:20:23.340 |
How many neurons is each individual electrode listening to? 02:20:41.140 |
And you can actually distinguish which neuron 02:20:47.420 |
So I mentioned the spike detection algorithm that we have, 02:21:12.820 |
And then also the time at which these happen. 02:21:16.260 |
you can have a kind of a statistical probability estimation 02:21:25.980 |
than that spike must come from a different neuron. 02:21:28.020 |
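A cartoon of that attribution step: if one electrode hears a few nearby neurons, their spikes tend to differ in peak amplitude and waveform shape, so clustering a simple spike feature can split them apart. This is a toy sketch of spike sorting in general, not the on-device algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed feature: peak amplitude (microvolts) of each detected spike. Two
# neurons at different distances from the electrode form two amplitude clusters.
amps_neuron_a = rng.normal(-120.0, 8.0, size=200)
amps_neuron_b = rng.normal(-60.0, 8.0, size=300)
amplitudes = np.concatenate([amps_neuron_a, amps_neuron_b])

# One-dimensional two-means clustering, purely to illustrate the idea.
centers = np.array([amplitudes.min(), amplitudes.max()])
for _ in range(20):
    labels = np.abs(amplitudes[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([amplitudes[labels == k].mean() for k in (0, 1)])

print("estimated cluster centers (uV):", np.round(centers, 1))  # roughly [-120, -60]
```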
- Okay, so that's a nice signal processing step 02:21:31.860 |
from which you can then make much better predictions 02:21:37.060 |
where there could be multiple neurons screaming. 02:21:42.820 |
in you being able to compress the data better. 02:21:58.260 |
and then you run a bunch of different set of algorithms 02:22:03.220 |
it's just all of this for us is done on the device. 02:22:20.060 |
and giving you the output is less than a microsecond, 02:22:25.460 |
- Oh yeah, so the latency has to be super short. 02:22:43.180 |
- Oh, interesting, so it's communication constraint. 02:22:50.020 |
- Yeah, Bluetooth is definitely not our final 02:22:55.900 |
wireless communication protocol that we want to get to. 02:23:17.380 |
the primary motivation for choosing Bluetooth 02:23:23.420 |
- Interoperability is just absolutely essential, 02:23:29.180 |
and in many ways, if you can access a phone or a computer, 02:23:38.860 |
the same pipeline that you mentioned for Nolan. 02:23:52.340 |
to the first time he's able to use this thing? 02:23:55.020 |
- So we have what's called a patient registry 02:23:58.780 |
that people can sign up to hear more about the updates, 02:24:07.140 |
and the process is that once the application comes in, 02:24:17.980 |
there's a lot of different inclusion exclusion criteria 02:24:22.100 |
and we go through a pre-screening interview process 02:24:26.900 |
and at some point, we also go out to their homes 02:24:33.220 |
'cause one of the most kind of revolutionary part 02:24:35.980 |
about having this N1 system that is completely wireless 02:24:41.580 |
like you don't actually have to go to the lab 02:24:57.660 |
People, hopefully, would wanna be able to use this every day 02:25:07.420 |
and what we're looking for during BCI home audit 02:25:09.940 |
is to just kind of understand their situation, 02:25:12.060 |
what other assistive technology that they use. 02:25:14.180 |
- And we should also step back and kind of say 02:25:29.740 |
So these are folks who have a lot of challenges 02:25:46.220 |
is to enable them to have sort of digital autonomy 02:26:03.460 |
in all the ways that we've been talking about. 02:26:10.700 |
to do all kinds of stuff, including play games 02:26:20.100 |
because of the things that have happened to them, so. 02:26:36.100 |
And without that, it's extremely debilitating. 02:26:41.100 |
And there are many, many people that we can help. 02:26:46.060 |
And I mean, especially if you start to kind of look 02:26:54.860 |
but from ALS, MS, or even stroke that leads you, 02:27:03.420 |
That leads you to lose some of that mobility, 02:27:09.180 |
- And all of these are opportunities to help people, 02:27:18.540 |
that needs to have increasing levels of capability 02:27:41.620 |
I mean, I think if you are able to control a cursor 02:27:58.660 |
if you kind of think about that as just definitionally 02:28:02.980 |
being able to transfer information from my brain 02:28:05.900 |
to your brain without using some of the physical faculties 02:28:22.860 |
there's at least a couple of ways of doing that. 02:28:38.500 |
like imagine moving the cursor with your mind. 02:28:41.920 |
But it's like, there is a cognitive step here 02:28:46.500 |
that's fascinating, 'cause you have to use the brain 02:28:51.660 |
And you kind of have to figure it out dynamically. 02:29:02.620 |
'cause you have to get the brain to start firing 02:29:12.980 |
And all of a sudden, it creates the right kind of signal 02:29:15.580 |
that if decoded correctly, can create the kind of effect. 02:29:22.900 |
But on the human side, imagine the cursor moving 02:29:29.860 |
I mean, isn't that just like fascinating to you, 02:29:34.900 |
Like to me, it's like, holy shit, that actually works. 02:29:41.740 |
- You know, as much as you're learning to use that thing, 02:29:47.820 |
Like our model is constantly updating the weights 02:29:50.980 |
to say, oh, if someone is thinking about, you know, 02:29:55.980 |
this sophisticated forms of like spiking patterns, 02:30:05.840 |
So there is an adaptability to the signal processing, 02:30:10.420 |
And then there's the adaptation of Nolan, the human being. 02:30:15.180 |
Like the same way, if you give me a new mouse, 02:30:18.060 |
and I move it, I learn very quickly about it, 02:30:20.820 |
sensitivity, so I'll learn to move it slower. 02:30:28.820 |
and all that kind of stuff they have to adapt to. 02:30:33.540 |
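To make the "two learners" picture concrete, here is a toy sketch of the machine side: an online update that nudges decoder weights toward whatever movement the participant was asked to produce during a calibration bin. The update rule and all names are illustrative assumptions, not Neuralink's training procedure.

```python
import numpy as np

rng = np.random.default_rng(4)
n_channels = 64
W = np.zeros((n_channels, 2))  # decoder weights: spike features -> 2D cursor velocity
learning_rate = 1e-3

def online_update(W, features, intended_velocity):
    """One gradient step on squared error for a single time bin."""
    error = features @ W - intended_velocity
    return W - learning_rate * np.outer(features, error)

# Simulated calibration: the participant tries to move toward a known target,
# so the intended velocity for each bin is assumed to be known.
true_mapping = rng.normal(scale=0.1, size=(n_channels, 2))
for _ in range(5_000):
    features = rng.poisson(3.0, size=n_channels).astype(float)
    W = online_update(W, features, features @ true_mapping)

print("mean absolute weight error:", np.abs(W - true_mapping).mean())
```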
- That's a fascinating, like software challenge 02:30:47.220 |
So there's the selection that Nolan has passed 02:30:51.820 |
so everything including that it's a BCI-friendly home, 02:31:06.140 |
- The end-to-end, you know, we say patient in to patient out 02:31:22.900 |
and we do intra-op CT imaging to make sure that we're, 02:31:27.900 |
you know, drilling the hole in the right location. 02:31:36.780 |
and then they can think about wiggling their hand, 02:31:42.740 |
it's not gonna actually lead to any sort of intended output, 02:31:55.660 |
in which we can actually know where to place our threads. 02:32:00.460 |
'Cause we wanna go into what's called the hand knob area 02:32:09.220 |
So yeah, we do intra-op CT imaging to make sure 02:32:15.260 |
and double check the location of the craniectomy. 02:32:21.940 |
does their thing in terms of like skin incision, 02:32:28.620 |
And then there's many different layers of the brain. 02:32:32.140 |
which is a very, very thick layer that surrounds the brain. 02:32:36.060 |
That gets actually resected in a process called durectomy. 02:32:45.780 |
anywhere between one to one and a half hours, 02:32:49.740 |
Placement of the targets, inserting of the thread. 02:33:01.420 |
There's a couple other steps of like actually inserting 02:33:03.540 |
the dural substitute layer to protect the thread 02:33:08.500 |
And then, yeah, screw in the implant and then skin flap, 02:33:24.900 |
And when was the first time he was able to use it? 02:33:27.260 |
- So he was actually immediately after the surgery, 02:33:30.900 |
like an hour after the surgery, as he was waking up, 02:33:38.020 |
make sure that we are recording neural signals. 02:33:43.500 |
that we noticed that he can actually modulate. 02:33:47.900 |
is that he can think about clenching his fist 02:33:50.860 |
and you could see the spike disappear and appear. 02:34:17.460 |
Even just that spike, just to be able to modulate that. 02:34:27.140 |
that have participated in these groundbreaking BCI, 02:34:30.980 |
you know, investigational early feasibility studies. 02:34:40.140 |
We're not the first ones to actually put electrodes 02:34:55.180 |
We had a lot of confidence based on our benchtop testing, 02:35:01.300 |
our preclinical R&D studies that the mechanism, 02:35:06.340 |
the threads, the insertion, all that stuff is very safe. 02:35:10.060 |
And that it's obviously ready for doing this in a human, 02:35:15.060 |
but there's still a lot of unknown, unknown about, 02:35:28.980 |
But I mean, that was a level of just complete unknown, 02:35:31.820 |
right, 'cause it's a very, very different environment. 02:35:37.140 |
in the first place, to be able to test these things out. 02:35:40.140 |
So extreme nervousness and just many, many sleepless nights 02:35:53.100 |
And by the time it was around 10:30, everything was done. 02:35:58.740 |
But I mean, first time seeing that, well, number one, 02:36:10.060 |
And two, I mean, just immense amount of gratitude 02:36:18.660 |
and that we've spoken to and will speak to are true pioneers 02:36:25.500 |
And I sort of call them the neural astronauts 02:36:31.380 |
- These amazing, just like in the sixties, right? 02:36:36.720 |
Exploring the unknown, outwards, in this case it's inward, 02:36:41.520 |
but an incredible amount of gratitude for them 02:36:50.620 |
And it's a journey that we're embarking on together. 02:37:03.580 |
So a lot of just kind of anticipation for, okay, 02:37:09.380 |
What are set of sequences of events that needs to happen 02:37:11.780 |
for us to make it worthwhile for both Nolan as well as us. 02:37:16.780 |
- Just to linger on that, just a huge congratulation to you 02:37:40.620 |
and then maybe expand the realm of the possible 02:37:45.500 |
for the human mind for millions of people in the future. 02:37:49.920 |
So like the opportunities are all ahead of us 02:37:53.860 |
and to do that safely and to do that effectively 02:37:58.460 |
As an engineer, just watching other engineers come together 02:38:04.700 |
It's, yeah, could not have done it without the team. 02:38:11.900 |
of just this immense sense of optimism for the future. 02:38:22.860 |
as well as hopefully for many others out there 02:38:30.940 |
describing that some of the threads are retracted. 02:38:41.340 |
And that, the whole story of how it was regained 02:38:45.100 |
That's definitely something I'll talk to Bliss 02:38:48.740 |
But in general, can you speak to this whole experience? 02:39:07.580 |
and it's actually gotten better than it was before. 02:39:10.980 |
He's actually just beat the world record yet again last week 02:39:18.180 |
So, I mean, he's just cranking and he's just improving. 02:39:24.620 |
- Yeah, the previous world record in human was 4.6. 02:39:32.620 |
which is roughly around kind of the median Neuralinker 02:39:50.460 |
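For context on these numbers: bits-per-second figures for grid-selection tasks like Webgrid are typically computed with an information-throughput formula along the lines of the one below. Whether this is exactly the definition used for these records is an assumption, and the example numbers are invented for illustration.

```python
import math

def grid_task_bps(n_targets, net_correct_selections, elapsed_seconds):
    # A common convention: each correct selection conveys log2(N - 1) bits,
    # where N is the number of targets on the grid; misses are subtracted out.
    return math.log2(n_targets - 1) * net_correct_selections / elapsed_seconds

# Illustrative numbers only (not actual session data):
print(round(grid_task_bps(n_targets=35, net_correct_selections=90, elapsed_seconds=60), 2))
```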
of what took the BCI team to recover that performance. 02:39:55.060 |
It was actually mostly on kind of the signal processing end. 02:39:59.300 |
we were kind of looking at these spike outputs 02:40:14.900 |
And the way in which we noticed this at first, 02:40:16.820 |
obviously is that I think Nolan was the first to notice 02:40:25.780 |
we were also trying to do a bunch of different experimentation 02:40:33.340 |
So it was expected that there will be variability 02:41:06.780 |
or if you're not seeing spikes on those channels, 02:41:08.820 |
you have some indications that something's happening there. 02:41:11.620 |
And what we noticed is that looking at those impedance plot 02:41:21.660 |
that indicated that the threads were being pulled out. 02:41:53.500 |
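As a rough sketch of what watching the impedance plots can mean in practice: per-electrode impedance is measured periodically, and channels whose latest value jumps sharply relative to their own history get flagged for review. The threshold and data below are invented for illustration; this is not Neuralink's actual monitoring pipeline.

```python
import numpy as np

def flag_suspect_channels(impedance_history_kohm, rel_change_threshold=0.5):
    """Flag channels whose latest impedance differs >50% from their median history.

    `impedance_history_kohm` has shape (n_measurements, n_channels).
    """
    baseline = np.median(impedance_history_kohm[:-1], axis=0)
    latest = impedance_history_kohm[-1]
    relative_change = np.abs(latest - baseline) / baseline
    return np.flatnonzero(relative_change > rel_change_threshold)

rng = np.random.default_rng(5)
history = rng.normal(100.0, 5.0, size=(10, 16))  # fake periodic measurements, 16 channels
history[-1, [3, 7]] = 250.0                      # two channels suddenly look very different
print(flag_suspect_channels(history))            # -> [3 7]
```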
What we started looking at is not just the spike occurrence 02:41:56.660 |
through this BOSS algorithm that I mentioned, 02:42:13.740 |
for the implant to not just give you the BOSS output, 02:42:25.140 |
And that was the thing that really ultimately gave us 02:42:32.140 |
and obviously like the thing that we want ultimately, 02:42:39.260 |
is figuring out ways in which we can keep those threads 02:42:45.220 |
so that we have many more channels going into the model. 02:42:52.900 |
to understand how to prevent that from happening. 02:42:58.900 |
you know, as I mentioned, this is the first time ever 02:43:02.260 |
that we're putting these threads in a human brain. 02:43:05.060 |
And, you know, human brain, just for size reference, 02:43:07.580 |
is 10 times that of the monkey brain or the sheep brain. 02:43:11.900 |
And it's just a very, very different environment. 02:43:16.500 |
It's like actually moved a lot more than we expected 02:43:22.300 |
And it's just a very, very different environment 02:43:47.380 |
And this is something that Neuralink is extremely good at. 02:43:49.460 |
Once we have a set of clear objectives and an engineering problem, 02:43:55.180 |
across many, many disciplines to be able to come together 02:44:01.180 |
- But it sounds like one of the fascinating challenges here 02:44:18.180 |
something changing, like Nolan talks about cursor drift, 02:44:25.260 |
And there's a whole UX challenge to how to do that. 02:44:27.780 |
So it sounds like adaptability is like a fundamental property 02:44:37.700 |
as a company, we're extremely vertically integrated. 02:44:45.620 |
- Yeah, there's, like you said, built in house. 02:44:47.900 |
This whole paragraph here from this blog post 02:44:59.220 |
We constructed in-house microfabrication capabilities 02:45:02.300 |
to rapidly produce various iterations of thin film arrays 02:45:12.460 |
to manufacture components with micro-level precision. 02:45:15.380 |
I think there's a tweet associated with this. 02:45:29.340 |
cuts this geometry in the tips of our needles. 02:45:33.300 |
So we're looking at this weirdly shaped needle. 02:45:41.500 |
only slightly larger than the diameter of a red blood cell. 02:45:48.220 |
Okay, so what's interesting about this geometry? 02:45:51.300 |
So we're looking at this geometry of a needle. 02:46:19.060 |
is the thing that actually is grasping the loop. 02:46:26.340 |
such that when you pull out, it leaves the loop. 02:46:33.540 |
And basically the robot has a lot of the optics 02:46:41.460 |
that actually causes the polyimide to fluoresce 02:46:45.580 |
so that you can locate the loop. 02:46:45.580 |
that it takes to do that, that's pretty crazy. 02:47:06.020 |
There's, I mean, it's like a giant granite slab 02:47:12.060 |
'Cause it needs to be sensitive to vibration, 02:47:22.740 |
to make sure that you can achieve that level of precision. 02:47:28.180 |
A lot of optics that kind of zoom in on that. 02:47:30.820 |
We're working on next generation of the robot 02:47:43.060 |
I mean, let alone you try to actually thread a loop 02:47:47.140 |
I mean, this is like, we're talking like fractions 02:47:55.460 |
we developed novel hardware and software testing systems 02:47:57.780 |
such as our accelerated lifetime testing racks 02:48:00.020 |
and simulated surgery environment, which is pretty cool. 02:48:05.940 |
We performed many rehearsals of our surgeries 02:48:09.060 |
to refine our procedures and make them second nature. 02:48:40.260 |
that actually mimics the mechanical properties of the brain. 02:49:00.180 |
it's about like finding the right concentration 02:49:08.780 |
But we practice this surgery with the person, 02:49:13.780 |
you know, basically Nolan's physiology and brain, 02:49:13.780 |
many, many times prior to actually doing the surgery. 02:49:25.460 |
Like, I mean, like what you're looking at is the picture. 02:49:32.660 |
of the robot engineering space that we, you know, 02:49:37.660 |
that looks exactly like what they would experience, 02:49:40.260 |
all the staff would experience during their actual surgery. 02:49:42.300 |
So, I mean, it's just kind of like any dance rehearsal 02:49:45.620 |
where you know exactly where you're gonna stand 02:49:47.300 |
at what point, and you just practice that over 02:49:49.980 |
and over and over again with an exact anatomy 02:49:55.740 |
And it got to a point where a lot of our engineers, 02:50:08.180 |
through doing the same thing over and over and over. 02:50:09.740 |
It's like "Jiro Dreams of Sushi" kind of thing. 02:50:12.540 |
Because then it's like Olympic athletes visualize 02:50:17.340 |
the Olympics, and then once you actually show up, 02:50:25.780 |
It feels almost boring winning the gold medal. 02:50:37.260 |
And the experience they talk about is mostly just relief. 02:50:40.540 |
Probably that they don't have to visualize it anymore. 02:50:47.060 |
and where, I mean, there's a whole field that studies 02:50:59.420 |
sort of the big question that people might have is, 02:51:08.740 |
What sort of trauma did you cause the tissue? 02:51:12.540 |
And does that correlate to whatever behavioral anomalies 02:51:17.500 |
And that's the language to which we can communicate 02:51:21.900 |
about the safety of inserting something into the brain 02:51:32.780 |
department of pathology that looks at these tissue slices. 02:51:38.340 |
There are many steps that are involved in doing this 02:51:48.740 |
At some point you have to euthanize the animal, 02:51:55.100 |
You fix them in formalin, and you like gross them, 02:51:59.300 |
you section them, and you look at individual slices 02:52:01.860 |
just to see what kind of reaction or lack thereof exists. 02:52:04.780 |
So that's the kind of the language to which FDA speaks, 02:52:08.700 |
and as well for us to kind of evaluate the safety 02:52:12.460 |
of the insertion mechanism as well as the threads 02:52:15.620 |
at various different time points, both acute, 02:52:32.620 |
- FDA supervises this, but there's in general 02:52:36.900 |
And every aspect of this, including the surgery, 02:52:41.860 |
that the standard is, let's say, how to put it politely, 02:52:53.220 |
So the standard for all the surgical stuff here 02:52:58.340 |
I mean, it's a highly, highly regulated environment 02:53:16.660 |
to kind of understand what sort of damage, if any, 02:53:22.780 |
and new technologies that we're building are. 02:53:29.580 |
by lack of immune response from these threads. 02:53:34.020 |
- Speaking of which, you talked to me with excitement 02:53:46.020 |
- Yeah, so what you're looking at is a stained tissue image. 02:53:54.500 |
from an animal that was implanted for seven months, 02:54:02.020 |
and each color indicates specific types of cell types. 02:54:06.020 |
So purple and pink are astrocytes and microglia, 02:54:12.540 |
And the other thing that people may not be aware of 02:54:36.500 |
- So what you're seeing is in this kind of macro image, 02:54:39.420 |
you're seeing these like circles highlighted in white, 02:54:58.420 |
that you have the neurons that are these brown structures 02:55:04.140 |
that are actually touching and abutting the threads. 02:55:07.820 |
So what this is saying is that there's basically 02:55:09.940 |
zero trauma that's caused during this insertion. 02:55:17.660 |
that is one of the most common modes of failure. 02:55:19.860 |
So when you insert these threads like the Utah array, 02:55:26.300 |
because you're inserting a foreign object, right? 02:55:29.180 |
And that kind of elicits this kind of immune response 02:55:34.700 |
They form this like protective layer around it. 02:55:39.380 |
but you're also creating this protective layer 02:55:45.260 |
'cause you're getting further and further away 02:55:46.660 |
from the neurons that you're trying to record. 02:55:52.340 |
in that inset, it's about 50 micron with that scale bar, 02:56:13.860 |
This is multiplex stain that uses these different proteins 02:56:21.220 |
We use very standard set of staining techniques 02:56:31.500 |
this is also kind of illustrates the second point 02:56:34.900 |
And initially when we saw the previous image, 02:56:37.340 |
we said, "Oh, like, are the threads just floating? 02:56:40.980 |
"Are we actually looking at the right thing?" 02:56:50.140 |
which is in blue that shows these collagen layers. 02:56:54.780 |
like, you don't want the blue around the implant threads 02:57:01.980 |
And what you're seeing, if you look at individual threads, 02:57:16.140 |
- So that presumably is one of the big benefits 02:57:20.940 |
- Yeah, so we think this is primarily due to the size, 02:57:36.980 |
and not breaking any of the blood-brain barrier, 02:57:40.580 |
has basically caused the immune response to be muted. 02:57:53.300 |
- And they're neurons, and this is the thread listening. 02:58:00.780 |
is not electrode themselves, those are the conductive wires. 02:58:04.420 |
So each of those should probably be two micron in width. 02:58:12.940 |
So we're looking at some slice of the tissue. 02:58:25.540 |
that there's just kind of cells around the inserted site, 02:58:33.660 |
- How easy and safe is it to remove the implant? 02:58:41.220 |
In the first three months or so after the surgery, 02:58:45.060 |
there's a lot of kind of tissue remodeling that's happening, 02:58:45.060 |
You obviously start over first couple of weeks, 02:59:10.980 |
And before the scar tissue or the neomembrane 02:59:18.500 |
And there's minimal trauma that's caused during that. 02:59:22.500 |
Once the scar tissue forms, and with Nolan as well, 02:59:29.900 |
So we haven't seen any more movements since then. 02:59:34.740 |
It gets harder to actually completely extract the threads. 02:59:40.460 |
So our current method for removing the device 02:59:44.940 |
is cutting the thread, leaving the tissue intact, 02:59:49.220 |
and then unscrewing and taking the implant out. 02:59:59.380 |
or just with kind of a PEEK-based, plastic-based cap. 02:59:59.380 |
- Is it okay to leave the threads in there forever? 03:00:16.580 |
And do they get to a point where they should not be? 03:00:20.100 |
Again, once the scar tissue forms, they get anchored in place. 03:00:24.060 |
And I should also say that when we say upgrades, 03:00:31.060 |
Like we've actually upgraded many, many times. 03:00:34.540 |
Most of our monkeys or non-human primates, NHP, 03:00:43.460 |
has the latest version of the device since two years ago, 03:00:46.540 |
and is seemingly very happy and healthy and fat. 03:00:49.420 |
- So what's designed for the future, the upgrade procedure? 03:00:56.020 |
So maybe for Noland, what would the upgrade look like? 03:01:05.340 |
is there a way to upgrade sort of the device internally 03:01:10.340 |
where you take it apart and sort of keep the capsule 03:01:14.940 |
- Yeah, so there are a couple of different things here. 03:01:18.820 |
what we would have to do is either cut the threads 03:01:27.180 |
in terms of how they're anchored or scarred in. 03:01:29.580 |
If you were to remove them with the dural substitute, 03:01:50.620 |
One is, at the moment, we currently remove the dura, 03:01:55.460 |
this kind of thick layer that protects the brain. 03:02:00.020 |
that actually proliferates the scar tissue formation. 03:02:09.420 |
So we're looking at ways to insert the threads 03:02:14.660 |
which comes with different set of challenges, 03:02:23.260 |
So we're looking at different needle design for that, 03:02:28.140 |
The other big challenge is that it's quite opaque optically 03:02:28.140 |
So how do you avoid still this biggest advantage 03:02:48.700 |
and based on some of the early evidence that we have, 03:02:55.060 |
that causes them to be much easier to extract over time. 03:02:58.700 |
And the other thing that we're also looking at, 03:03:05.300 |
is at the moment it's a monolithic single implant 03:03:09.780 |
that comes with a thread that's bonded together. 03:03:12.780 |
So you can't actually separate the thing out, 03:03:16.740 |
bottom part that is the thread that are inserted, 03:03:20.540 |
that has the chips and maybe a radio and some power source. 03:03:27.140 |
that has more of the computational heavy load 03:03:40.980 |
if you wanna upgrade that, you just go in there, 03:03:43.020 |
remove the screws and then put in the next version 03:03:45.540 |
and you're off to, it's a very, very easy surgery too. 03:03:49.020 |
Like you do a skin incision, slip this in, screw, 03:03:55.220 |
- So that would allow you to reuse the threads sort of. 03:03:59.260 |
- So, I mean, this leads to the natural question of, 03:04:07.540 |
Is that like, what's the technical challenge there? 03:04:14.860 |
the key metrics that we're looking to improve 03:04:21.140 |
We have a pathway to actually go from currently 1000 03:04:25.140 |
to hopefully 3000, if not 6000 by end of this year. 03:04:37.940 |
One is obviously being able to photolithographically 03:04:42.900 |
As I mentioned, it's two micron in width and spacing. 03:04:46.740 |
Obviously there are chips that are much more advanced 03:04:50.740 |
And we have some of the tools that we have brought in house 03:04:56.860 |
just so that you have to have more of the wires 03:05:00.700 |
Chips also cannot linearly consume more energy 03:05:08.780 |
So there's a lot of innovations in the circuit, 03:05:11.140 |
in architecture as well as the circuit design topology 03:05:17.140 |
You need to also think about if you have all of these spikes, 03:05:20.380 |
how do you send that off to the end application? 03:05:22.700 |
So you need to think about bandwidth limitation there 03:05:25.220 |
and potentially innovations in signal processing. 03:05:36.540 |
bonding the thin film array to the electronics. 03:05:40.980 |
It starts to become very, very highly dense interconnects. 03:05:45.580 |
There's a lot of innovations in kind of the 3D integrations 03:05:49.380 |
in the recent years that we can take advantage of. 03:05:53.620 |
One of the biggest challenges that we do have 03:06:04.780 |
yeah, like the brain trying to kill your electronics 03:06:16.740 |
that we, I think are actually well suited to tackle. 03:06:25.780 |
- Yeah, so this is where the accelerated life tester 03:06:43.780 |
And you can also put some other set of chemicals 03:06:53.500 |
and trying to cause a reaction to pull it apart. 03:07:06.220 |
So every 10 degrees Celsius that you increase, 03:07:17.500 |
that causes you to have other nasty gases to form 03:07:21.660 |
that just is not realistic in an environment. 03:07:23.980 |
So what we do is we increase in our ALT chamber 03:07:27.860 |
by 20 degrees Celsius that increases the aging by four times. 03:07:37.180 |
And we look at whether the implants still are intact, 03:07:46.860 |
Obviously it's not an exact same environment as a brain, 03:07:52.220 |
other more biological groups that attack at it. 03:07:59.340 |
testing environment for at least the enclosure 03:08:13.220 |
which is equivalent to a decade and they seem to be fine. 03:08:19.340 |
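The acceleration factor here follows the familiar rule of thumb that reaction rates roughly double for every 10 °C increase, so a chamber run 20 °C hotter ages parts about four times faster. A quick sanity check on the figures mentioned:

```python
def acceleration_factor(delta_celsius, doubling_step_celsius=10.0):
    # Rule-of-thumb (Arrhenius-style) acceleration: 2x for every 10 degrees C.
    return 2.0 ** (delta_celsius / doubling_step_celsius)

factor = acceleration_factor(20.0)  # chamber held 20 C above body temperature
print(factor)                       # -> 4.0
print(2.5 * factor, "simulated years from 2.5 real years in the chamber")  # about a decade
```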
so basically a close approximation is warm salt water, 03:08:24.340 |
hot salt water is a good testing environment. 03:08:44.980 |
- Yeah, you have to get it at the right pH too. 03:08:51.300 |
- By the way, the other thing that also is interesting 03:08:54.540 |
about our enclosure is if you look at our implant, 03:09:14.940 |
which is actually commonly used in blister packs. 03:09:18.700 |
So when you have a pill and you try to pop the pill, 03:09:30.220 |
is 'cause it's electromagnetically transparent. 03:09:42.580 |
and it's a very, very tough process to scale. 03:09:48.140 |
the materials, the software, the hardware, all of it. 03:09:56.020 |
Is it possible to have multiple Neuralink devices 03:10:04.180 |
To have multiple Neuralink devices implanted? 03:10:08.540 |
Yeah, we've had, I mean, our monkeys have had 03:10:15.780 |
And then we're also looking at potential of having 03:10:28.620 |
- I mean, I wonder if there's some level of customization 03:10:38.180 |
building a generalized neural interface to the brain. 03:10:41.460 |
And that also is strategically how we're approaching this 03:10:55.380 |
and the robot can access any part of the cortex. 03:11:07.780 |
there's kind of a general compute available there. 03:11:14.900 |
to kind of hyper-optimizing for power and efficiency, 03:11:17.700 |
you do need to get to some specialized function, right? 03:11:25.820 |
you are now used to this robotic insertion techniques, 03:11:33.900 |
and also internally convincing ourself that this is safe. 03:11:42.780 |
to other parts of the brain, like visual cortex, 03:11:45.260 |
which we're interested in as our second product, 03:11:48.420 |
obviously, it's a completely different environment. 03:11:50.140 |
The cortex is laid out very, very differently. 03:11:54.540 |
It's gonna be more stimulation focused rather than recording 03:12:01.180 |
we're using the same thin film array technology. 03:12:04.460 |
We're using the same robot insertion technology. 03:12:09.860 |
Now, it's more of the conversation is focused 03:12:13.260 |
and what are the implication of those differences 03:12:22.580 |
That product being restoring sight for blind people. 03:12:27.580 |
So can you speak to stimulating the visual cortex? 03:12:34.660 |
I mean, the possibilities there are just incredible 03:12:42.020 |
who don't have sight or even any aspect of that. 03:12:51.620 |
- Which is, like you said, from recording to stimulation. 03:12:55.860 |
Just any aspect of that that you're both excited 03:13:04.460 |
that we actually have been capable of stimulating 03:13:13.300 |
We have actually demonstrated some of that capabilities 03:13:30.980 |
And obviously there are many, many different ways 03:13:37.740 |
The way in which we're doing that is through electrical, 03:13:42.140 |
and causing that to really change the local environment 03:13:51.500 |
kind of the neurons to depolarize in nearby areas. 03:14:05.820 |
there are aspects of it that's well understood, 03:14:07.620 |
but in the end, like we don't really know anything. 03:14:21.900 |
that convert the photon energy into electrical signals. 03:14:28.260 |
to your back of your head, your visual cortex. 03:14:31.940 |
It goes through actually a thalamic system called LGN 03:14:44.180 |
and then there's a bunch of other higher level 03:14:48.460 |
And there are actually kind of interesting parallels. 03:14:56.660 |
like what the different layers of the network is detecting, 03:15:03.140 |
and they're then detecting some more natural curves. 03:15:06.100 |
And then they start to detect like objects, right? 03:15:19.180 |
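The parallel being drawn is that the earliest stage of that hierarchy, in both visual cortex and convolutional networks, responds to simple oriented edges before later stages respond to curves and whole objects. Purely as a toy illustration of that first stage, here is an oriented edge filter applied by hand:

```python
import numpy as np

# A tiny vertical-edge detector, similar in spirit to V1 simple cells and to
# the oriented filters that the first layer of a CNN typically learns.
edge_kernel = np.array([[-1.0, 0.0, 1.0],
                        [-2.0, 0.0, 2.0],
                        [-1.0, 0.0, 1.0]])

image = np.zeros((8, 8))
image[:, 4:] = 1.0  # left half dark, right half bright: a single vertical edge

response = np.zeros((6, 6))
for i in range(6):
    for j in range(6):
        response[i, j] = np.sum(image[i:i + 3, j:j + 3] * edge_kernel)

print(response)  # large values only in the columns straddling the edge
```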
where does cognition arise and where is color encoded? 03:15:38.620 |
1 million people in the US that are legally blind. 03:15:52.220 |
that normal people can see at 200 feet distance, 03:15:55.060 |
like if you're worse than that, you're legally blind. 03:15:59.100 |
you can't function effectively using sight in the world. 03:16:04.900 |
And yeah, there are different forms of blindness. 03:16:08.860 |
There are forms of blindness where there's some degeneration 03:16:31.500 |
You can actually build retinal prosthetic devices 03:16:40.980 |
And there are many companies that are working on that, 03:16:43.140 |
but that's a very small slice, albeit significant, 03:16:46.900 |
still a smaller slice of folks that are legally blind. 03:16:53.700 |
whether it's in the optic nerve or just the LGN circuitry 03:17:11.420 |
because your biological mechanism is not doing that 03:17:14.220 |
is by placing electrodes in the visual cortex 03:17:21.740 |
whether it's something as unsophisticated as a GoPro 03:17:26.740 |
or some sort of wearable Ray-Ban type glasses 03:17:32.020 |
that Meta's working on that captures a scene. 03:17:38.340 |
to a set of electrical impulses or stimulation pulses 03:17:41.540 |
that you would activate in your visual cortex 03:17:48.100 |
And by playing some concerted kind of orchestra 03:18:01.340 |
that you can also create by just pressing your eyes. 03:18:08.340 |
And the name of the game is really have many of those 03:18:16.700 |
like they're the individual pixels of the screen, right? 03:18:32.780 |
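A very rough sketch of the pipeline being described: downsample a camera frame to the resolution of the electrode grid and turn pixel brightness into per-electrode stimulation amplitudes, capped at a safety limit. All names and numbers here are invented for illustration; this is not the actual signal chain of any Neuralink product.

```python
import numpy as np

def frame_to_stimulation(frame, grid_shape=(32, 32), max_amplitude_ua=10.0):
    """Map a grayscale camera frame (2D array of 0..255 values) onto an electrode grid.

    Each electrode 'pixel' gets an amplitude proportional to the mean brightness
    of the image region it covers, clipped to an assumed safety ceiling.
    """
    h, w = frame.shape
    gh, gw = grid_shape
    block = frame[: (h // gh) * gh, : (w // gw) * gw]
    block = block.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    return np.clip(block / 255.0 * max_amplitude_ua, 0.0, max_amplitude_ua)

fake_frame = np.random.default_rng(6).integers(0, 256, size=(480, 640)).astype(float)
amplitudes = frame_to_stimulation(fake_frame)
print(amplitudes.shape, round(amplitudes.max(), 2))  # a (32, 32) grid of amplitudes
```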
being able to at least be able to have object detection 03:18:41.660 |
and then being able to at least see the edges of things 03:19:02.380 |
One is, obviously if you're blind from birth, 03:19:15.140 |
kind of your brain and different parts of your brain 03:19:27.020 |
I mean, you also hear about people who are blind 03:19:29.580 |
that have heightened sense of hearing or some other senses. 03:19:33.580 |
And the reason for that is because that cortex 03:19:41.380 |
I mean, I guess they're going to have to now map 03:19:54.660 |
Before, so I think that's an interesting caveat. 03:19:59.260 |
The other thing that also is important to highlight 03:20:01.180 |
is that we're currently limited by our biology 03:20:15.020 |
But when you have an external camera with this BCI system, 03:20:21.740 |
you can have whatever other spectrum that you want to see. 03:20:25.900 |
to some sort of weird conscious experience, I've no idea. 03:20:34.060 |
being going beyond the limits of our biology. 03:20:38.220 |
- And if you're able to control the kind of raw signal, 03:20:43.420 |
is that when we use our sight, we're getting the photons 03:20:59.620 |
and there's a lot of possibilities to explore that. 03:21:11.700 |
I mean, my theory of how like visual system works also 03:21:15.500 |
is that, I mean, there's just so many things happening 03:21:23.860 |
And it's unclear exactly where some of the pre-processing 03:21:28.180 |
steps are happening, but I mean, I actually think that 03:21:35.340 |
there's just so much, the reality that we're in, 03:21:38.580 |
if it's a reality, is so there's so much data. 03:21:43.580 |
And I think humans are just unable to actually 03:21:46.340 |
like eat enough actually to process all that information. 03:21:49.940 |
So there's some sort of filtering that does happen, 03:21:57.780 |
But like the analogy that I sometimes think about is, 03:22:05.420 |
and all of the information in the world is a sun, 03:22:16.300 |
So what you do is you end up adding these filters, right? 03:22:25.180 |
And I think, you know, things like our experiences 03:22:36.100 |
that like anesthetic drug or, you know, psychedelics, 03:22:40.340 |
what they're doing is they're kind of swapping out 03:22:45.020 |
or removing older ones and kind of controlling 03:22:51.700 |
but I just took a very high dose of ayahuasca 03:23:00.980 |
and with Neuralink being able to control that, 03:23:07.820 |
not for entertainment purposes or enjoyment purposes, 03:23:15.220 |
And there, especially when the function's completely lost, 03:23:21.900 |
Would you implant a Neuralink device in your own brain? 03:23:38.100 |
and almost get a little antsy, like jealous of people 03:24:04.940 |
from being able to achieve that type of performance. 03:24:17.260 |
and it seems like such a chill way to play video games. 03:24:23.180 |
to appreciate sometimes is that he's doing these things 03:24:27.500 |
while talking and, I mean, it's multitasking, right? 03:24:31.780 |
So it's clearly, it's obviously cognitively intensive, 03:24:36.780 |
but similar to how, when we talk, we move our hands, 03:24:46.900 |
with other assistive technology, as far as I'm aware. 03:24:51.460 |
If you're obviously using an eye-tracking device, 03:25:05.980 |
So it's not just the BPS for the primary task, 03:25:18.540 |
and doing a thing with your mind and looking around also, 03:25:23.540 |
I mean, there's just a lot of parallelization 03:25:29.980 |
like if he wants to really achieve those high-level BPS, 03:25:34.820 |
And that's a separate circuitry that is a big mystery, 03:25:47.940 |
Like you have your primary task and a secondary task, 03:25:51.460 |
and the secondary task is a source of distraction. 03:25:59.820 |
I mean, this is an interesting computational device, right? 03:26:05.300 |
a lot of novel insights that can be gained from everything. 03:26:07.780 |
I mean, I personally am surprised that Nolan's able 03:26:09.660 |
to do such incredible control of the cursor while talking, 03:26:19.140 |
if you're talking in front of the camera, you get nervous. 03:26:23.780 |
and he's able to still achieve high performance. 03:26:26.360 |
Surprising, I mean, all of this is really amazing. 03:26:30.020 |
And I think just after researching this really in depth, 03:26:41.380 |
Well, we should say the registry is for people 03:26:44.140 |
who have quadriplegia and all that kind of stuff, 03:27:02.620 |
what's the high-level vision for P2, P3, P4, P5, 03:27:07.340 |
and just the expansion into other human beings 03:27:30.940 |
And also at the same time, understand the efficacy 03:27:37.380 |
And just because you're living with tetraplegia, 03:27:51.160 |
And it's something that we're hoping to also understand 03:27:59.540 |
not just a very small slice of those individuals, 03:28:05.020 |
to just really build just the best product for them. 03:28:09.900 |
So there's obviously also goals that we have, 03:28:16.100 |
and the primary purpose of the early feasibility study 03:28:26.980 |
before we embark on what's called the pivotal study, 03:28:34.180 |
that starts to look at statistical significance 03:28:40.580 |
And that's required before you can then market the device. 03:28:52.940 |
from people like Nolan, P2, P3, future participants, 03:29:01.180 |
I really don't like the fact that it lasts only six hours, 03:29:03.620 |
I wanna be able to use this computer for like 24 hours. 03:29:08.320 |
I mean, that is user needs and user requirements, 03:29:23.660 |
how they use it, like the high resolution details 03:29:30.620 |
and all that kind of stuff to like life experience. 03:29:37.440 |
So even when we had that sort of recovery event for Nolan, 03:29:42.440 |
he now has the new firmware that he has been updated with. 03:29:53.940 |
for security patches, whatever new functionality, UI, right? 03:29:57.180 |
And that's something that is possible with our implant. 03:30:02.560 |
that can only do the thing that it said it can do. 03:30:08.860 |
and now you have completely new user interface 03:30:11.820 |
and all this bells and whistles and improvements 03:30:22.380 |
- Yeah, it's really cool how the app that Nolan is using, 03:30:24.900 |
there's like calibration, all that kind of stuff. 03:30:33.900 |
What other future capabilities are you kind of looking to? 03:30:42.500 |
What about sort of accelerated typing or speech, 03:30:49.660 |
- Yeah, those are still in the realm of movement program. 03:30:56.140 |
We have the movement program and we have the vision program. 03:30:59.620 |
The movement program currently is focused around 03:31:05.340 |
if you can control 2D cursor in the digital space, 03:31:08.940 |
you could move anything in the physical space. 03:31:11.620 |
So robotic arms, wheelchair, your environment, 03:31:14.980 |
or even really like whether it's through the phone 03:31:33.620 |
if there's a robotic arm or a wheelchair that, 03:31:36.020 |
we can guarantee that they're not gonna hurt themselves 03:31:41.140 |
in the digital domain versus like in the physical space, 03:31:43.560 |
you can actually potentially cause harm to the participants. 03:31:50.720 |
Speech does involve different areas of the brain. 03:31:56.680 |
And there's actually been a lot of really amazing work 03:32:02.660 |
Sergei Stavisky at UC Davis, Jamie Henderson, 03:32:24.320 |
And being able to like even by mouthing the word 03:32:27.880 |
or imagine speech, you can pick up those signals. 03:32:30.920 |
The more sophisticated higher level processing areas 03:32:45.400 |
But yeah, I mean, I think Neuralink's eventual goal 03:32:55.040 |
to be able to understand that and study that. 03:32:58.040 |
- This is where I get to the pothead questions. 03:33:07.160 |
So speech is, there's a muscular component, like you said. 03:33:14.760 |
But then what about the internal things like cognition? 03:33:18.380 |
Like low level thoughts and high level thoughts. 03:33:21.760 |
Do you think we'll start noticing kind of signals 03:33:26.080 |
That could be understood, that could be maybe used 03:33:37.760 |
to kind of get into the hard problem of consciousness. 03:33:47.360 |
all of these are at some point set of electrical signals 03:34:07.840 |
So we're telling ourselves and fooling ourselves 03:34:18.680 |
and really BCI at the end of the day is a set of tools 03:34:21.920 |
that help you kind of study the underlying mechanisms 03:34:25.160 |
in a both like local, but also broader sense. 03:34:30.760 |
And whether there's some interesting patterns 03:34:38.860 |
and you can either like learn from like many, many sets 03:34:43.800 |
and be able to do mind reading or not, I'm not sure. 03:34:55.920 |
There's probably additional set of tools and framework. 03:34:59.440 |
And also like just the hard problem of consciousness 03:35:09.400 |
Like, where's the mind emerge from this complex network? 03:35:12.200 |
- Yeah, how does the subjective experience emerge 03:35:17.520 |
from just a bunch of spikes, electrical spikes? 03:35:32.240 |
There's actually, there actually is some biological 03:35:42.360 |
to kind of start to form some of these experiences 03:35:46.200 |
If you actually look at every one of our brains, 03:35:51.920 |
There's a left-sided brain, there's a right-sided brain. 03:35:54.520 |
And I mean, unless you have some other conditions, 03:35:59.520 |
you normally don't feel like left Lex or right Lex. 03:36:13.320 |
there's a structure that kind of connects the two, the corpus callosum, 03:36:19.340 |
that is supposed to have around 200 to 300 million 03:36:25.640 |
So whether that means that's the number of interface 03:36:31.400 |
and electrodes that we need to create some sort of 03:36:34.740 |
mind meld or from that, like whatever new conscious 03:36:40.980 |
But I do think that there's like kind of an interesting 03:36:52.680 |
- And that threshold is unknown at this time. 03:36:55.640 |
- Oh yeah, these things, everything in this domain 03:37:06.040 |
Do you see a world where there's millions of people, 03:37:12.300 |
like tens of millions, hundreds of millions of people 03:37:18.400 |
or multiple Neuralink devices in their brain? 03:37:21.700 |
First of all, there are, like, if you look at worldwide, 03:37:29.380 |
in the tens, if not hundreds of millions of people. 03:37:34.760 |
So that alone, I think there's a lot of benefit 03:37:43.460 |
And once you start to get into kind of neuro, 03:37:48.000 |
like psychiatric application, you know, depression, 03:37:51.240 |
anxiety, hunger, or, you know, obesity, right? 03:38:13.560 |
And once BCI starts competing with a smartphone 03:38:42.380 |
in many ways, I think BCI can play a role in that. 03:38:47.380 |
And, you know, some of the things that I also talk about is, 03:38:55.560 |
8 billion people walking around with Neuralink. 03:39:06.160 |
- Thanks for listening to this conversation with DJ Sa. 03:39:10.000 |
And now, dear friends, here's Matthew McDougall, 03:39:16.220 |
When did you first become fascinated with the human brain? 03:39:21.580 |
- Since forever, as far back as I can remember, 03:39:37.020 |
what the most important things in the world are. 03:39:43.480 |
And the answer that I came to, that I converged on, 03:39:47.440 |
was that all of the things you can possibly conceive of 03:39:52.440 |
as things that are important for human beings to care about 03:39:56.900 |
are literally contained, you know, in the skull. 03:40:00.960 |
Both the perception of them and their relative values 03:40:03.880 |
and, you know, the solutions to all our problems 03:40:07.080 |
and all of our problems are all contained in the skull. 03:40:14.040 |
how the brain encodes information and generates desires 03:40:26.600 |
You know, you think about all the really great triumphs 03:40:30.960 |
You think about all the really horrific tragedies. 03:40:36.960 |
You think about any prison full of human stories. 03:40:41.960 |
And all of those problems boil down to neurochemistry. 03:40:49.240 |
So if you get a little bit of control over that, 03:40:58.920 |
the way people have dealt with having better tools 03:41:08.400 |
But I think it's an interesting, worthy, and noble pursuit 03:41:16.080 |
- Yeah, that's a fascinating way to look at human history. 03:41:18.680 |
You just imagine all these neurobiological mechanisms, 03:41:34.720 |
gaining a bunch of information over a period of time. 03:41:37.000 |
They have a set of modules that do language and memory 03:41:47.900 |
there's not some glorified notion of a dictator 03:41:54.360 |
of this enormous mind or something like this. 03:42:03.320 |
how well people like that can organize those around them. 03:42:17.120 |
for clues as to how humans are going to behave 03:42:20.280 |
and what particular humans are able to achieve. 03:42:46.320 |
And his work at looking at chimps through the lens of, 03:42:51.320 |
you know, how you would watch an episode of "Friends" 03:42:55.080 |
and understand the motivations of the characters 03:43:13.680 |
you talk about them in terms of their human struggles, 03:43:18.960 |
accord them the dignity of themselves as actors 03:43:45.200 |
And I think doing so gives you the tools you need 03:43:49.440 |
to reduce human behavior from the kind of false complexity 03:43:59.100 |
oh, well, these humans are looking for companionship, 03:44:03.740 |
And I think that that's a pretty powerful tool 03:44:10.860 |
- And I just went to the Amazon jungle for a few weeks, 03:44:17.500 |
that a lot of life on earth is just trying to get laid. 03:44:25.300 |
and they're just trying to impress each other. 03:44:58.900 |
- And all of that is coming from this, the brain. 03:45:02.500 |
- So when did you first start studying the brain 03:45:41.040 |
the systems we think of as homeostatic, automatic mechanisms 03:45:46.040 |
like fighting off a virus, like repairing a wound. 03:45:51.560 |
And sure enough, there are big crossovers between the two. 03:46:11.520 |
or has a huge role in almost everything that your body does. 03:46:14.560 |
Like you try to name an example of something in your body 03:46:27.240 |
I mean, you might say like bone healing or something, 03:46:29.440 |
but even those systems, the hypothalamus and pituitary 03:46:34.240 |
end up playing a role in coordinating the endocrine system 03:46:44.520 |
So non-obvious connections between those things 03:46:48.400 |
implicate the brain as really a potent prime mover 03:46:55.800 |
- One of the things I realized in the other direction too, 03:47:04.240 |
like they affect the brain also, like the immune system. 03:47:07.480 |
I think there's just people who study Alzheimer's 03:47:14.640 |
It's just surprising how much you can understand 03:47:22.880 |
seem to have anything to do with sort of the nervous system. 03:47:28.540 |
- Yeah, you could understand how that would be 03:47:30.880 |
driven by evolution too, just in some simple examples. 03:47:34.680 |
If you get sick, if you get a communicable disease, 03:47:41.200 |
it's pretty advantageous for your immune system 03:47:44.720 |
to tell your brain, hey, now be antisocial for a few days. 03:47:55.800 |
under a blanket and just stay there for a day or two. 03:47:58.600 |
And sure enough, that tends to be the behavior 03:48:03.680 |
If you get sick, elevated levels of interleukins 03:48:23.400 |
- So from there, the early days in neuroscience to surgery, 03:48:34.720 |
- You know, it was sort of an evolution of thought. 03:48:54.640 |
I wanted to affect real changes in the actual world, 03:49:22.040 |
found that there exists these MD/PhD programs 03:49:25.320 |
where you can choose not to choose between them 03:49:43.140 |
particularly because of a researcher at Caltech 03:49:47.700 |
who's one of the godfathers of primate neuroscience. 03:49:57.060 |
into the brains of monkeys to try to understand 03:50:00.180 |
how intentions were being encoded in the brain. 03:50:06.300 |
with the idea that maybe I would be a neurologist 03:50:14.200 |
again, I'm gonna make enemies by saying this, 03:50:18.300 |
but neurology predominantly and distressingly to me 03:50:45.420 |
that are potentially treatable or curable with surgery. 03:50:53.940 |
you can save lives really is at the end of the day 03:51:04.260 |
that happens to be one of the great neurosurgery programs. 03:51:12.740 |
Alex Khalessi and Mike Apuzzo and Steve Giannotta 03:51:12.740 |
to these are humans that have problems and are people. 03:51:36.380 |
And there's nothing fundamentally preventing me 03:51:45.100 |
I changed gears from going into a different specialty 03:51:48.740 |
and switched into neurosurgery, which cost me a year. 03:52:17.780 |
Residency in neurosurgery is sort of a competition of pain, 03:52:42.220 |
And that, I think, necessarily means working long hours 03:52:51.700 |
with whatever regulations are in front of us. 03:53:12.280 |
- Are you seriously saying one of the hardest things 03:53:14.920 |
is literally forcing them to get sleep and rest 03:53:21.840 |
I think the next generation-- - And that's awesome. 03:53:24.920 |
- I think the next generation is more compliant 03:53:30.360 |
All right, I'm just kidding, I'm just kidding. 03:53:47.120 |
as we touched on earlier, primates like power. 03:53:53.200 |
And I think neurosurgery has long had this aura 03:53:57.760 |
of mystique and excellence and whatever about it. 03:54:04.360 |
for people that are cloaked in that authority. 03:54:10.200 |
a walking, fallacious appeal to authority, right? 03:54:28.140 |
- Yeah, one of the, so I have friends who know you, 03:54:40.360 |
which I think indicates that it's not as common 03:54:52.280 |
and I think it gets to people's head a little bit. 03:55:07.080 |
is to just instantly see through fallacy from authority. 03:55:15.560 |
and says, "Well, God damn it, you have to trust me. 03:55:18.560 |
"I'm the guy that built the last 10 rockets," or something. 03:55:33.120 |
And so you don't walk into a room that he's in 03:55:51.560 |
And that's proven, I think, over and over in his case 03:55:58.480 |
there's a fascinating interdisciplinary team at Neuralink 03:56:03.000 |
that you get to interact with, including Elon. 03:56:07.180 |
What do you think is the secret to a successful team? 03:56:16.800 |
- World experts in different disciplines work together. 03:56:19.480 |
- Yeah, there's a sweet spot where people disagree 03:56:31.320 |
and yet are still able to accept information from others 03:56:40.040 |
And so I like the analogy of sort of how you polish rocks. 03:56:45.040 |
You put hard things in a hard container and spin it. 03:57:01.600 |
we've tried to find people that are not afraid 03:57:07.240 |
and occasionally strongly disagree with people 03:57:30.840 |
I passionately put all my chips on this position 03:57:39.000 |
Part of our brains tell us that that is a power loss, 03:57:43.860 |
that is a loss of face, a loss of standing in the community 03:57:54.620 |
And you just have to recognize that that little voice 03:58:05.440 |
to be able to walk away from an idea that you hold onto. 03:58:18.800 |
- Yeah, you'll at least be a member of a winning team. 03:58:26.120 |
You mentioned there's a lot of amazing neurosurgeons at USC. 03:58:38.600 |
working hard while functioning as a member of a team, 03:58:43.600 |
getting a job done that is incredibly difficult, 03:58:47.660 |
working incredibly long hours, being up all night, 03:58:53.480 |
taking care of someone that you think probably won't survive 03:58:58.720 |
working hard to make people that you passionately dislike 03:59:10.240 |
of excellent neurosurgical technique decade over decade. 03:59:15.240 |
And I think we're well-recognized for that excellence. 03:59:20.480 |
So especially Marty Weiss, Steve Giannotta, Mike Apuzzo, 03:59:25.320 |
they made huge contributions not only to surgical technique, 03:59:30.040 |
but they built training programs that trained dozens 03:59:38.240 |
I was just lucky to kind of be in their wake. 04:00:17.620 |
and something was going to get them pretty soon anyway. 04:00:29.580 |
of what is expected of them in the coming years regardless. 04:00:45.500 |
someone in their 30s that didn't have it coming, 04:00:54.020 |
and lo and behold, they've got a huge, malignant, 04:01:00.140 |
You can only do that, I think, a handful of times 04:01:04.660 |
before it really starts eating away at your armor. 04:01:19.140 |
and, you know, they bring her four-year-old daughter in 04:01:57.020 |
that's going to make the world worse for people 04:02:03.380 |
and it's just hard to feel powerless in the face of that. 04:02:21.900 |
because what we're doing is to try to fix that stuff. 04:02:26.900 |
We're trying to give people options to reduce suffering. 04:02:48.460 |
that we're fighting back against entropy, I guess. 04:02:51.320 |
- Yeah, the amount of suffering that's endured 04:02:55.340 |
when some of the things that we take for granted 04:02:58.620 |
that our brain is able to do is taken away is immense, 04:03:02.580 |
and to be able to restore some of that functionality 04:03:11.900 |
- Well, can you take me through the full procedure 04:03:14.220 |
for implanting, let's say, the N1 chip in Neuralink? 04:03:23.820 |
The human part of the surgery that I do is dead simple. 04:03:28.820 |
It's one of the most basic neurosurgery procedures 04:03:36.480 |
that some version of it has been done for thousands of years. 04:03:40.780 |
There are examples, I think, from ancient Egypt 04:03:47.860 |
and from Peru or ancient times in South America, 04:03:58.020 |
in people's skulls, presumably to let out the evil spirits, 04:04:04.220 |
and there's evidence of bone healing around the edge, 04:04:06.780 |
meaning the people at least survived some months 04:04:09.940 |
after a procedure, and so what we're doing is that. 04:04:13.340 |
We are making a cut in the skin on the top of the head 04:04:16.580 |
over the area of the brain that is the most potent 04:04:30.340 |
you know, this part of your brain is lighting up 04:04:43.540 |
- Yep, there's a little squiggle in the cortex right there. 04:04:46.180 |
One of the folds in the brain is kind of doubly folded 04:04:49.540 |
right on that spot, and so you can look at it on an MRI 04:04:55.380 |
and then you do a functional test in a special kind of MRI, 04:05:16.260 |
in anyone who's preparing to enter our trial, 04:05:19.980 |
and say, okay, that part of the brain, we confirm, 04:05:37.180 |
make a perfectly round one-inch diameter hole in the skull, 04:05:44.960 |
open the lining of the brain, the covering of the brain, 04:05:49.540 |
it's like a little bag of water that the brain floats in, 04:05:53.220 |
and then show that part of the brain to our robot, 04:06:14.260 |
to a very precise depth, in a very precise spot, 04:06:25.900 |
and puts the implant into that hole in the skull, 04:07:07.500 |
or aneurysm surgeries that are routinely done. 04:07:32.540 |
at this stage of our human civilization development? 04:07:53.260 |
I remember well a surgery that I was doing many years ago, 04:07:59.260 |
where the plan was to open a small hole behind the ear, 04:08:17.340 |
it can cause just intolerable, horrific shooting pain, 04:08:21.760 |
that people describe like being zapped with a cattle prod. 04:08:26.820 |
is to go move this blood vessel off the nerve. 04:08:35.380 |
and then found that there was a giant aneurysm 04:08:38.900 |
that was not easily visible on the pre-op scans. 04:08:44.100 |
and the human surgeons had no problem with that, 04:08:50.620 |
Robots wouldn't do so well in that situation, 04:08:58.020 |
like the electrode insertion portion of the Neuralink surgery, 04:09:10.880 |
but the robot can't really change the plan midway through. 04:09:14.260 |
It operates according to how it was programmed, 04:09:29.440 |
- So, there could be just a very large number of ways 04:10:00.000 |
and say that it's operating under very narrow parameters. 04:10:07.380 |
it wasn't necessarily programmed to deal with that, 04:10:10.060 |
specifically, but a Waymo or a self-driving Tesla 04:10:14.340 |
would have no problem reacting to that appropriately. 04:10:36.980 |
and in the not familiar case, a human could take over. 04:10:39.920 |
But basically, be very conservative in saying, 04:10:43.400 |
"Okay, this for sure has no issues, no surprises," 04:10:52.480 |
So, you think eventually you'll be out of the job, 04:10:57.480 |
well, you being neurosurgeon, your job being neurosurgeon, 04:11:16.120 |
to go in this line of work depending on (laughs) 04:11:26.360 |
if I have a line of work, I would say it's programming. 04:11:29.280 |
And if you ask me, like, for the last, I don't know, 04:11:37.360 |
You will always have a job if you're a programmer 04:11:47.840 |
and they're really damn good at generating code. 04:11:52.520 |
like, wow, what is the contribution of the human really? 04:11:57.360 |
it does seem that humans have ability, like you said, 04:12:05.080 |
it's the ability to kinda come up with novel ideas 04:12:12.480 |
It seems like machines aren't quite yet able to do that. 04:12:17.480 |
when it's life-critical, as it is in surgery, 04:12:28.280 |
But it's fascinating that, in this case of Neuralink, 04:12:34.320 |
- Yeah, yeah, it's, I do the parts it can't do, 04:12:43.760 |
- I saw that there's a lot of practice going on. 04:12:55.400 |
that there's a proxy on which the surgeries are performed. 04:12:59.120 |
So this is both for the robot and for the human, 04:13:02.320 |
for everybody involved in the entire pipeline. 04:13:09.600 |
So there's no analog to this in human surgery. 04:13:14.600 |
Human surgery is sort of this artisanal craft 04:13:18.120 |
that's handed down directly from master to pupil 04:13:25.500 |
to be a surgeon on humans is by doing surgery on humans. 04:13:38.680 |
they put the trivial parts of the surgery into your hands, 04:13:49.280 |
you get more responsibility in the perfect condition. 04:13:53.880 |
In Neuralink's case, the approach is a bit different. 04:13:57.040 |
We, of course, practiced as far as we could on animals. 04:14:18.320 |
One of the engineers, Fran Romano in particular, 04:14:20.440 |
built a pulsating brain in a custom 3D-printed skull 04:14:30.040 |
including their face and scalp characteristics. 04:14:39.320 |
I mean, it's as close as it really reasonably should get 04:14:54.880 |
And so, when we were doing the practice surgeries, 04:15:15.600 |
And then opening the brain in exactly the right spot 04:15:18.720 |
using standard operative neuronavigation equipment, 04:15:26.200 |
that we do all of our practice surgeries in at Neuralink, 04:15:29.720 |
and having the skull open and have the brain pulse, 04:15:33.660 |
which adds a degree of difficulty for the robot 04:15:35.960 |
to perfectly, precisely plan and insert those electrodes 04:15:49.160 |
on how extensively we practiced for this surgery. 04:16:00.900 |
with the first human getting a Neuralink implant 04:16:13.020 |
- Yeah, well, we were lucky to have just incredible partners 04:16:22.380 |
the premier neurosurgical hospital in the world. 04:16:41.840 |
It was a much more high-pressure surgery in some ways. 04:17:06.700 |
and that just adds pressure that is not typical 04:17:09.880 |
for even the most intense production neurosurgery, 04:17:17.120 |
or placing deep brain stimulation electrodes, 04:17:21.080 |
and it had never been done on a human before. 04:17:25.120 |
And so, definitely a moderate pucker factor there 04:17:38.960 |
say, a degree of brain movement that was unanticipated 04:17:57.040 |
and that surgery was one of the smoothest outcomes 04:18:16.440 |
- Yeah, even with all that practice, all of that, 04:18:21.680 |
that's so high-stakes in terms of people watching. 04:18:35.240 |
- Well, I think wealth is easy to hate or envy or whatever, 04:18:40.240 |
and I think there's a whole industry around driving clicks, 04:18:50.980 |
And so any way to take an event and turn it into bad news 04:19:00.160 |
- It just sucks because I think it puts pressure on people. 04:19:11.200 |
You have to do things that haven't been done before, 04:19:13.200 |
and you have to take risks, calculated risks. 04:19:16.680 |
You have to do all kinds of safety precautions, 04:19:20.060 |
And I just wish there would be more celebration of that, 04:19:23.740 |
of the risk-taking, versus people just waiting 04:19:35.540 |
it's really great that everything went just flawlessly, 04:19:41.700 |
- Now that there's a human with literal skin in the game, 04:19:48.980 |
on this doing well, you have to be a pretty bad person 04:20:08.260 |
- Yeah, I mean, because an MD needs to be in charge 04:20:15.040 |
throughout the process, I unscrubbed from the surgery 04:20:20.040 |
after exposing the brain and presenting it to the robot 04:20:23.140 |
and placed the targets on the robot software interface 04:20:28.140 |
that tells the robot where it's going to insert each thread. 04:20:52.520 |
- Right, the software engineers are amazing on this team 04:21:08.480 |
and it will automatically avoid the blood vessels 04:21:11.240 |
in that region and automatically place a bunch of targets. 04:21:22.740 |
and make dense applications of targets in those regions, 04:21:34.180 |
of finger movements and arm movement intentions. 04:21:37.900 |
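A minimal sketch of the kind of geometric check an automatic target planner might perform, assuming a 2D mask of segmented vessels over the exposed cortical surface; the grid spacing, clearance threshold, and function names here are hypothetical illustrations, not Neuralink's actual planning software.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def plan_targets(vessel_mask, spacing=10, min_clearance=4):
    """Toy target planner: pick grid points on a cortical surface image
    that keep a minimum pixel clearance from segmented blood vessels.

    vessel_mask: 2D boolean array, True where a vessel was segmented.
    spacing: grid spacing between candidate insertion sites, in pixels.
    min_clearance: reject candidates closer than this to any vessel pixel.
    """
    # Distance from every pixel to the nearest vessel pixel.
    dist_to_vessel = distance_transform_edt(~vessel_mask)

    h, w = vessel_mask.shape
    targets = []
    for y in range(spacing // 2, h, spacing):
        for x in range(spacing // 2, w, spacing):
            if dist_to_vessel[y, x] >= min_clearance:
                targets.append((y, x))
    return targets

# Hypothetical usage: a blank field with one simulated vessel running across it.
mask = np.zeros((100, 100), dtype=bool)
mask[48:52, :] = True          # a horizontal "vessel"
sites = plan_targets(mask)
print(len(sites), "candidate insertion sites clear of the vessel")
```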
- I've seen images of this and for me with OCD, 04:21:43.920 |
I think there's a subreddit called Oddly Satisfying. 04:21:48.480 |
- It's oddly satisfying to see the different target sites 04:21:52.120 |
avoiding the blood vessels and also maximizing 04:21:56.360 |
the usefulness of those locations for the signal. 04:21:59.320 |
It just feels good, it's like ah, that's nice. 04:22:06.480 |
- It's extremely satisfying watching the electrodes 04:22:08.800 |
themselves go into the brain and not cause bleeding. 04:22:12.080 |
- Yeah, yeah, so you said the feeling was of relief 04:22:34.620 |
- Yeah, so talking broadly about neurosurgery, 04:22:39.240 |
It's routine for me to put deep brain-stimulating electrodes 04:22:47.840 |
entering from the top and passing about a two millimeter 04:22:52.360 |
wire all the way into the bottom of the brain. 04:22:55.700 |
And that's not revolutionary, a lot of people do that. 04:23:01.560 |
I use a robot from Globus to do that surgery. 04:23:14.280 |
What are you seeing, what kind of technology can you use 04:23:17.840 |
to visualize where you are to light your way? 04:23:20.760 |
- Yeah, so it's a cool process on the software side. 04:23:30.340 |
You put the patient to sleep, put their head in a frame 04:23:43.000 |
and then merge the MRI and the CT in software. 04:23:47.720 |
You have a plan based on the MRI where you can see 04:23:53.580 |
You can't see them on CT, but if you trust the merging 04:23:58.320 |
of the two images, then you indirectly know on the CT 04:24:01.960 |
where that is, and therefore indirectly know where, 04:24:06.400 |
in reference to the titanium frame screwed to their head, 04:24:11.680 |
And so this is '60s technology to manually compute 04:24:16.080 |
trajectories given the entry point and target, 04:24:18.880 |
and dial in some goofy-looking titanium actuators 04:24:27.560 |
with manual actuators with little tick marks on them. 04:24:31.820 |
The modern version of that is to use a robot, 04:24:42.560 |
This small robot arm can show you the trajectory 04:24:49.180 |
and establish a very rigid holder through which 04:24:55.520 |
and pass a small rigid wire deep into that area of the brain 04:24:59.200 |
that's hollow, and put your electrode through 04:25:02.120 |
that hollow wire, and then remove all of that 04:25:06.200 |
So you end up with the electrode very, very precisely placed 04:25:15.000 |
that's already been out in the world for a while. 04:25:21.360 |
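A minimal sketch of the coordinate bookkeeping just described, assuming the rigid MRI-to-frame transform has already been estimated by merging the MRI and CT; the numbers, names, and the identity registration are illustrative only.

```python
import numpy as np

def to_frame_coords(point_mri, R, t):
    """Map a point from MRI space into frame (CT/head-frame) space
    using a rigid transform estimated by image registration.
    R: 3x3 rotation, t: 3-vector translation (both assumed given)."""
    return R @ np.asarray(point_mri, float) + t

def trajectory(entry_mri, target_mri, R, t):
    """Return frame-space entry point, unit direction, and depth to target."""
    entry = to_frame_coords(entry_mri, R, t)
    target = to_frame_coords(target_mri, R, t)
    direction = target - entry
    depth = np.linalg.norm(direction)
    return entry, direction / depth, depth

# Hypothetical numbers, in millimeters; identity registration for illustration.
R, t = np.eye(3), np.zeros(3)
entry, unit_dir, depth = trajectory([30.0, 40.0, 90.0], [10.0, 5.0, 20.0], R, t)
print(f"advance {depth:.1f} mm along {unit_dir.round(3)} from entry {entry}")
```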
Neuralink right now is focused entirely on cortical targets, 04:25:25.680 |
surface targets, because there's no trivial way 04:25:29.760 |
to get, say, hundreds of wires deep inside the brain 04:25:41.600 |
I can't see everything that that DBS electrode 04:25:44.600 |
is passing through on its way to that deep target. 04:25:51.300 |
that there's gonna be about one in 100 patients 04:26:03.680 |
That's not an acceptable safety profile for Neuralink. 04:26:09.280 |
We start from the position that we want this to be 04:26:14.280 |
dramatically, maybe two or three orders of magnitude 04:26:28.400 |
I've been meaning to upgrade to the latest version." 04:26:31.760 |
And so the safety constraints given that are high. 04:26:36.760 |
And so we haven't settled on a final solution 04:26:42.320 |
for arbitrarily approaching deep targets in the brain. 04:26:50.540 |
maybe there's creative ways of doing the same thing 04:26:52.380 |
like mapping out high resolution geometry of blood vessels 04:26:57.360 |
But how do you map out that in a way that's super stable? 04:27:03.700 |
There's a lot of interesting challenges there, right? 04:27:06.100 |
- But there's a lot to do on the surface, luckily. 04:27:10.760 |
We actually have made a huge amount of progress 04:27:18.680 |
as a potential workaround for a spinal cord injury 04:27:25.920 |
to translate motor intentions to a spine mounted implant 04:27:37.280 |
So like the effort there is to try to bridge the brain 04:27:41.320 |
to the spinal cord to the periphery, the peripheral nervous system. 04:27:47.740 |
- We have that working in very crude forms in animals. 04:27:56.140 |
where he's able to digitally move the cursor. 04:27:59.740 |
Here you're doing the same kind of communication 04:28:08.960 |
- So we have anesthetized animals doing grasp 04:28:13.880 |
and moving their legs in a sort of walking pattern. 04:28:20.600 |
But the future is bright for this kind of thing 04:28:23.760 |
and people with paralysis should look forward 04:28:30.280 |
- Yeah, and there's a lot of sort of intermediate 04:28:33.920 |
or extra options where you take like an Optimus robot, 04:28:37.440 |
like the arm, and be able to control the arm. 04:28:43.200 |
- The fingers and hands at the arm as a prosthetic. 04:28:49.480 |
- Exoskeletons, yeah, so that goes hand in hand. 04:28:54.480 |
Although I didn't quite understand until thinking 04:28:56.900 |
about it deeper and doing more research about Neuralink, 04:29:05.760 |
- I didn't quite understand that you could really map 04:29:09.220 |
the intention as you described in the hand knob area, 04:29:20.520 |
That intention can be mapped to actual action 04:29:26.800 |
in the digital world that it can reconnect you 04:29:32.320 |
It can allow you to have freedom, have independence 04:29:40.680 |
- Yeah, our first participant is, he's incredible. 04:29:48.800 |
Just going back to the surgery, your whole journey. 04:29:54.920 |
You mentioned to me offline, you have surgery on Monday. 04:29:58.240 |
So you're like, you're doing surgery all the time. 04:30:09.320 |
There's a million ways of people saying the same thing 04:30:12.120 |
and selling books saying it, but you call it 10,000 hours, 04:30:18.160 |
some percentage of your life focusing on this, 04:30:23.200 |
Repetitions, humility, recognizing that you aren't perfect 04:30:31.260 |
at any stage along the way, recognizing you've got 04:30:37.840 |
being open to feedback and coaching from people 04:30:40.980 |
with a different perspective on how to do it. 04:30:43.240 |
And then just the constant will to do better. 04:30:58.460 |
They force you to want to do better all the time. 04:31:08.120 |
- So every surgery, even if it's the same exact surgery, 04:31:11.720 |
is there a lot of variability between that surgery 04:31:16.280 |
I mean, a good example for us is the angle of the skull 04:31:21.280 |
over hand knob relative to the normal plane of the body axis; 04:31:26.760 |
there's a pretty wide variation there. 04:31:34.960 |
and some people have really steeply angled skulls 04:31:38.520 |
And that has consequences for how their head can be fixed 04:31:50.880 |
And yeah, people's bodies are built as differently 04:31:56.240 |
as the people you see walking down the street, 04:32:06.380 |
there are some people who we've had to kind of exclude 04:32:11.280 |
from our trial for having skulls that are too thick 04:32:14.480 |
or too thin or scalp that's too thick or too thin. 04:32:17.600 |
I think we have like the middle 97% or so of people, 04:32:24.360 |
but you can't account for all human anatomy variability. 04:32:36.480 |
the diagrams are always really clean and crisp. 04:32:43.760 |
But whenever I look at pictures of like real brains, 04:32:52.800 |
- How messy are biological systems in reality? 04:32:55.920 |
Like how hard is it to figure out what's going on? 04:33:02.340 |
that's where experience and skill and education 04:33:08.120 |
really come into play is if you stare at a thousand brains, 04:33:12.280 |
it becomes easier to kind of mentally peel back the, 04:33:17.880 |
say for instance, blood vessels that are obscuring 04:33:20.600 |
the sulci and gyri, kind of the wrinkle pattern 04:33:26.440 |
Occasionally, when you're first starting to do this 04:33:32.480 |
what you thought you were gonna see based on the MRI. 04:33:35.120 |
And with more experience, you learn to kind of peel back 04:33:42.360 |
that layer of blood vessels and see the underlying pattern 04:33:46.960 |
of wrinkles in the brain and use that as a landmark 04:33:56.440 |
That's a pattern of the wrinkles in the brain. 04:33:58.640 |
It's sort of this Greek letter omega shaped area 04:34:13.520 |
- And so there is some uniqueness to that area of the brain, 04:34:16.800 |
like in terms of the geometry, the topology of the thing. 04:34:23.560 |
- So you have this strip of brain running down the top, 04:34:29.480 |
And I'm sure you've seen this picture of the homunculus 04:34:34.480 |
the weird little guy with huge lips and giant hands. 04:34:38.460 |
That guy sort of lays with his legs up at the top 04:34:47.800 |
and then some kind of mouth, lip, tongue areas farther down. 04:34:59.880 |
at least on the left side of the brain in most people 04:35:05.100 |
And so any muscle that you voluntarily move in your body, 04:35:10.100 |
the vast majority of that references that strip 04:35:14.880 |
or those intentions come from that strip of brain 04:35:33.060 |
To do vision, we can't just do the surface of the brain. 04:35:43.520 |
but maybe a centimeter deeper than we're used to 04:36:05.960 |
And then if you look at the threads, they're flexible. 04:36:12.560 |
that kind of approach of the flexible threads 04:36:15.160 |
to deliver the electrodes next to the neurons? 04:36:18.180 |
- Yeah, I mean, the goal there comes from experience. 04:36:23.780 |
that made Utah arrays and used Utah arrays for decades 04:36:34.140 |
this approach to technology arose out of a need 04:36:37.140 |
recognized after Utah arrays would fail routinely 04:37:04.260 |
And so one of the projects that was being worked on 04:37:06.800 |
in the Andersen Lab at Caltech when I got there 04:37:18.760 |
when you're jamming a bed of nails into the brain 04:37:27.560 |
It's like, you know, maybe we've gotten off track here, guys. 04:37:29.960 |
Maybe there's a fundamental redesign necessary. 04:37:48.020 |
And so what we see is our electrode longevity 04:37:51.540 |
and functionality and the health of the brain tissue 04:37:55.320 |
immediately surrounding the electrode is excellent. 04:37:59.220 |
I mean, it goes on for years now in our animal models. 04:38:07.300 |
We'll mention the vasculature, that's really interesting. 04:38:14.320 |
is that it really does control almost everything. 04:38:25.880 |
You wanna be able to turn fertility on and off. 04:38:40.740 |
There are legitimate targets in the brain for doing that. 04:38:43.540 |
Things that aren't immediately obvious as brain problems 04:38:58.700 |
for primary treatments of all the things that bother people. 04:39:04.940 |
- That's a really fascinating way to look at it. 04:39:13.940 |
of something that actually started in the brain, 04:39:17.560 |
the primary source is something in the brain. 04:39:23.360 |
but there are levers you can pull in the brain 04:39:37.840 |
- Would you have a Neuralink chip implanted in your brain? 04:39:43.640 |
I think the use case right now is to use a mouse, right? 04:39:56.400 |
On safety grounds alone, sure, I'll do it tomorrow. 04:39:59.560 |
- You know, you say the use case of the mouse. 04:40:05.220 |
and part of it is just watching Nolan have so much fun, 04:40:14.820 |
'cause if you think about the ways on the smartphone, 04:40:17.940 |
the way you swipe, that was transformational. 04:40:31.240 |
People were sure you need a keyboard to type. 04:40:41.940 |
So there could be a certain rate of speed with the mouse 04:40:52.260 |
And that, if it, I can see myself getting a Neuralink 04:40:58.620 |
for much more rapid interaction with digital devices. 04:41:09.200 |
The value proposition for the average person. 04:41:11.440 |
A keyboard is a pretty clunky human interface, 04:41:18.320 |
It's highly variable in the maximum performance 04:41:31.080 |
and just having a natural word to computer interface 04:41:40.560 |
- It'd be hilarious if that is the reason people do it. 04:41:43.260 |
Even if you have speech to text, that's extremely accurate. 04:41:49.760 |
it'd be hilarious if people went for Neuralink 04:41:52.160 |
just so you avoid the embarrassing aspect of speaking, 04:41:55.600 |
like looking like a douchebag speaking to your phone 04:41:57.560 |
in public, which is a real, that's a real constraint. 04:42:30.640 |
and have it read to you without any observable change 04:42:35.720 |
For one thing, standardized testing is obsolete. 04:42:46.120 |
it could change, I don't know if it transforms society, 04:42:57.560 |
Now, I would, just having to look into the safety 04:43:01.440 |
of everything involved, I would totally try it 04:43:03.820 |
so it doesn't have to go to some like incredible thing 04:43:10.380 |
or to some other, like it connects all over your brain. 04:43:12.600 |
That could be like just connecting to the hand knob. 04:43:15.380 |
You might have a lot of interesting interaction, 04:43:22.320 |
- Yeah, and the technology on the academic side 04:43:34.080 |
that basically made an initial solve of speech decode 04:43:43.080 |
that they were getting with very high accuracy, which is-- 04:43:49.500 |
- Thinking the word and you're able to get it. 04:43:52.680 |
Like you have to have the intention of speaking it. 04:43:59.360 |
Man, it's so amazing to me that you can do the intention, 04:44:05.780 |
All you have to do is just imagine yourself doing it. 04:44:08.440 |
And if you get the feedback that it actually worked, 04:44:23.920 |
That is, to me, it's just really fascinating. 04:44:34.680 |
the capability of my mind to learn this skill. 04:44:47.640 |
- I can't wait to see what people do with it. 04:44:55.700 |
At some point, when these are more widespread, 04:45:17.240 |
even with the bits per second of playing a video game. 04:45:21.700 |
You realize you give a Neuralink to a teenager, 04:45:30.200 |
They're gonna get like hundreds of bits per second. 04:45:56.320 |
the algorithm becomes better and better and better, 04:45:59.560 |
- Yeah, we're scratching the surface on that right now. 04:46:24.700 |
There's this whole community of weirdo biohackers 04:46:32.140 |
was storing private crypto wallet keys and whatever. 04:46:42.260 |
- Do you have some Bitcoin implanted in your body somewhere? 04:47:00.760 |
Went back and found that some community of people loved it 04:47:09.760 |
So there was a lot of change in those cushions. 04:47:22.140 |
You can scan that in by touching it to your phone. 04:47:31.260 |
It's a cool leap to implant something in your body. 04:47:37.980 |
Because for a lot of people that kind of notion 04:47:42.320 |
something electronic inside a biological system 04:47:50.740 |
We're completely fine with knee replacements, 04:48:03.920 |
around the inviolable barrier that the skull represents. 04:48:19.100 |
The question is, you know, what benefit can we provide? 04:48:26.180 |
how much does neuroplasticity come into play? 04:48:30.280 |
For example, just even in the case of healing from surgery 04:48:41.860 |
is that, you know, plasticity decreases with age. 04:48:47.260 |
I have too much gray hair to be optimistic about that. 04:48:52.260 |
There are theoretical ways to increase plasticity 04:48:58.740 |
Nothing that is, you know, totally proven out 04:49:02.320 |
as a robust enough mechanism to offer widely to people. 04:49:16.780 |
Certainly there's been some really amazing work recently 04:49:21.960 |
from Nicholas Schiff, Jonathan Baker, you know, and others 04:49:31.060 |
who have had electrodes placed in the deep nucleus 04:49:38.940 |
And when they apply small amounts of electricity 04:49:46.420 |
They're able to improve people's attention and focus. 04:49:50.820 |
They're able to improve how well people can perform a task. 04:49:54.660 |
I think in one case, someone who was unable to work 04:50:02.180 |
And that's sort of, you know, one of the holy grails 04:50:04.980 |
for me with Neuralink and other technologies like this 04:50:13.600 |
can we make people able to take care of themselves 04:50:21.680 |
Can we make it so someone who's fully dependent 04:50:24.600 |
and even maybe requires a lot of caregiver resources, 04:50:28.940 |
can we put them in a position to be fully independent, 04:50:41.820 |
and what a lot of the people at Neuralink are working for. 04:50:54.400 |
The capacity of the brain to do that is really interesting, 04:50:56.820 |
probably unknown to the degree to which it can do that, 04:51:00.840 |
but you're now connecting an external thing to it, 04:51:13.420 |
and the electronic brain outside of it working together, 04:51:18.420 |
like the possibilities there are really interesting. 04:51:29.020 |
- But of course it is a system that by itself is already, 04:51:36.460 |
and so you don't wanna mess with it too much. 04:51:39.580 |
- Yeah, it's like eliminating a species from an ecology. 04:51:44.580 |
You don't know what the delicate interconnections 04:51:49.980 |
The brain is certainly a delicate, complex beast 04:51:54.580 |
and we don't know every potential downstream consequence 04:52:04.260 |
- Do you see yourself doing, so you mentioned P1, 04:52:23.480 |
I think something that I would very much like 04:52:28.680 |
to work towards is a process that is so simple 04:52:36.700 |
We wanna get away from requiring intense expertise 04:52:42.660 |
or intense experience to have this successfully done 04:52:47.120 |
and make it as simple and translatable as possible. 04:52:51.780 |
I mean, I would love it if every neurosurgeon on the planet 04:52:55.840 |
I think we're probably far from a regulatory environment 04:53:00.580 |
that would allow people that aren't neurosurgeons to do this 04:53:25.940 |
- To a certain degree, yeah, it's a complex relationship. 04:53:32.900 |
- It's funny when in the middle of the surgery, 04:53:57.140 |
How have all the surgeries that you've done over the years, 04:54:07.420 |
how has that changed your understanding of life and death? 04:54:11.460 |
- Yeah, you know, it gives you a very visceral sense. 04:54:42.800 |
leaving, you know, a four-year-old behind, let's say. 04:54:54.040 |
because you see how just mind-numbingly universal death is. 04:54:59.040 |
There's zero chance that I'm going to avoid it. 04:55:25.760 |
and we are very ornate, delicate, brittle DNA machines 04:55:37.760 |
every human that has ever lived died or will die. 04:55:43.120 |
On the other hand, it's just one of the hardest things 04:55:49.000 |
to imagine inflicting on anyone that you love 04:56:03.200 |
And so I wish I had arrived at the point of nirvana 04:56:08.200 |
where death doesn't have a sting, I'm not worried about it, 04:56:19.980 |
if not having found out how to take the tragedy out of it 04:56:26.200 |
when I think about my kids either not having me 04:56:59.900 |
I mean, like it certainly feels like it's not going to end, 04:57:04.900 |
like you live life like it's not going to end. 04:57:09.400 |
- And the fact that this light that's shining, 04:57:23.000 |
to load all that in with Ernest Becker's terror. 04:57:33.900 |
- I think the more you are able to really think through it, 04:57:41.180 |
And if you really can load that in, it's hard. 04:57:49.220 |
because it like helps you get your shit together 04:57:55.300 |
every single moment you're alive is just beautiful. 04:57:59.640 |
- And it's terrifying that it's going to end. 04:58:01.120 |
It's like, almost like you're shivering in the cold, 04:58:11.080 |
- And then it makes you, when you have warmth, 04:58:13.280 |
when you have the safety, when you have the love, 04:58:32.380 |
because if you kept looking at that, it might break you. 04:58:42.180 |
There's the neurosurgeon and then there's a human. 04:58:46.280 |
- And the human is still able to struggle with that 04:58:48.700 |
and feel the fear of that and the pain of that. 04:58:51.580 |
- Yeah, it definitely makes you ask the question 04:59:24.940 |
I get to help on a project that I think matters. 04:59:39.100 |
And it's cool 'cause you read about all this stuff 04:59:43.860 |
I've been reading, before going to the Amazon, 04:59:48.780 |
They would go and explore even the Amazon jungle 04:59:54.960 |
Or early steps into space, early steps in any discipline 04:59:59.880 |
And it's cool 'cause this is like, on the grand scale, 05:00:10.960 |
but you'll be able to interact with the human brain. 05:00:20.840 |
- Yeah, I think ultimately we wanna give people 05:00:28.960 |
If you can give someone a dial that they can turn 05:00:35.840 |
I think that makes people really uncomfortable. 05:00:40.300 |
But now talk about major depressive disorder, 05:00:45.300 |
talk about people that are committing suicide 05:00:49.620 |
and try to justify that queasiness in that light. 05:00:55.500 |
You can give people a knob to take away suicidal ideation, 05:01:11.360 |
- Yeah, you can think about all the suffering 05:01:14.400 |
Every single human being that's suffering right now, 05:01:38.920 |
And we should remember those that are suffering. 05:01:44.180 |
'Cause once again, most of them are suffering quietly. 05:01:46.820 |
- Well, and on a grander scale, the fabric of society. 05:01:52.940 |
how our social fabric is working or not working, 05:02:01.680 |
Those things are made of neurochemistry too, in aggregate. 05:02:06.680 |
Our politics is composed of individuals with human brains. 05:02:15.940 |
is potentially tunable in the sense that, I don't know, 05:02:24.300 |
or tune our addictive behaviors for social media, 05:02:35.400 |
I don't think that leads to a functional society. 05:02:54.300 |
Maybe we could all work together a little more harmoniously 05:03:08.820 |
to make the whole thing work, but there's a sweet spot. 05:03:22.540 |
- You know, I have this sense that I never found it, 05:03:27.420 |
never removed it, like a Dementor in Harry Potter. 05:03:44.920 |
for thinking about what consciousness is in the brain. 05:03:48.620 |
Is that we have a really good intuitive understanding 05:04:04.980 |
of sensory mapping applied to the thought processes 05:04:10.140 |
So what I'm saying is, consciousness is the sensation 05:04:20.380 |
You feel the part of your brain that thinks of red things 05:04:28.660 |
You feel those parts of your brain being active 05:04:33.400 |
the way that I'm feeling my palm being touched, right? 05:04:36.800 |
And that sensory system that feels the brain working 05:04:45.340 |
It's the same way, it's the sensation of touch 05:04:50.380 |
Consciousness is the sensation of you feeling 05:05:13.740 |
And there's this awesome long history of people 05:05:17.060 |
looking at whatever the latest discovery in physics is 05:05:20.340 |
to explain consciousness because it's the most magical, 05:05:24.140 |
the most out there thing that you can think of 05:05:27.220 |
and people always wanna do that with consciousness. 05:05:41.900 |
- Everything we see around us, everything we love, 05:05:43.940 |
everything that's beautiful came from brains like these. 05:05:48.660 |
- It's all electrical activity happening inside your skull. 05:05:52.380 |
- And I, for one, am grateful that it's people like you 05:05:56.980 |
that are exploring all the ways that it works 05:06:19.140 |
You told me that you've met hundreds of people 05:06:25.620 |
and that your motivation for helping at Neuralink 05:06:44.400 |
But just, I think, to summarize at a very high level 05:06:48.780 |
is that people with ALS or severe spinal cord injury 05:06:59.740 |
And that can mean different things for different people. 05:07:04.260 |
just to be able to communicate again independently 05:07:06.300 |
without needing to wear something on their face, 05:07:29.740 |
just being able to respond to their kid in time 05:07:31.540 |
before they run away or get interested in something else. 05:07:48.540 |
This is a problem that with the right resources, 05:07:51.020 |
with the right team, we can make a lot of progress on. 05:07:58.140 |
and something that makes me excited to get up every day. 05:08:10.420 |
it's an engineering problem for the rest of the world 05:08:16.980 |
I'll take a broad view sort of lens on this for a second. 05:08:31.700 |
working on head trackers or mouse sticks or quad sticks. 05:08:35.020 |
I've met many engineers and folks in the community 05:08:38.980 |
And I think for the people we're trying to help, 05:08:41.740 |
it doesn't matter what the complexity of the solution is 05:08:51.420 |
And BCI is one of a collection of such solutions. 05:08:58.260 |
And I think the folks that recognize this immediately 05:09:00.580 |
are usually the people who have spinal cord injury 05:09:04.960 |
why this might be something that could be helpful. 05:09:09.100 |
folks that don't live with severe spinal cord injury 05:09:14.060 |
it's not often obvious why you would want a brain implant 05:09:16.260 |
to be able to connect and navigate a computer. 05:09:20.420 |
that I've learned a huge amount just working with Nolan 05:09:32.960 |
even if you can achieve the same thing, for example, 05:09:34.940 |
with a mouse stick when navigating a computer, 05:09:42.620 |
And so a BCI can really offer a level of independence 05:09:46.880 |
if it wasn't literally physically part of your body, 05:09:56.620 |
to be able to control a cursor on the screen with his mind. 05:10:07.980 |
"monitoring live signals coming out of the brain. 05:10:12.840 |
"to develop new UX paradigms, decoding strategies. 05:10:18.900 |
"how to recover useful BCI to new world record levels 05:10:24.600 |
We'll talk about, I think, every aspect of that. 05:10:33.220 |
and part of that historic, I would say, historic first? 05:10:42.900 |
And so to be able to be even just some small part 05:10:45.700 |
of making it a reality is extremely exciting. 05:10:48.540 |
A couple maybe special moments during that whole process 05:11:00.860 |
At that point in time, I know Nolan quite well. 05:11:19.300 |
And I have the lucky job in that particular procedure 05:11:21.940 |
to just be in charge of monitoring the implant. 05:11:25.620 |
to look at the signals coming off the implant, 05:11:27.500 |
to look at the live brain data streaming off the device 05:11:29.740 |
as threads are being inserted into the brain, 05:11:35.260 |
or that there's no red flags or fault conditions 05:11:40.900 |
And because I had that sort of spectator view of the surgery, 05:11:55.940 |
one thing that most people don't realize is the brain moves. 05:12:00.980 |
when your heart beats, and you can see it visibly. 05:12:04.940 |
So that's something that I think was a surprise to me 05:12:08.060 |
and very, very exciting to be able to see someone's brain 05:12:11.700 |
who you physically know and have talked with that length 05:12:13.980 |
actually pulsing and moving inside their skull. 05:12:15.860 |
- And they used that brain to talk to you previously, 05:12:24.060 |
So the Neuralink implant is active during surgery. 05:12:32.620 |
So that's part of the way you test that the thing is working. 05:12:37.220 |
right after we finished all the thread insertions, 05:12:41.020 |
I started collecting what's called broadband data. 05:12:43.340 |
So broadband is basically the most raw form of signal 05:12:49.260 |
It's essentially a measurement of the local field potential 05:12:53.300 |
or the voltage essentially measured by that electrode. 05:12:57.100 |
And we have a certain mode in our application 05:12:59.700 |
that allows us to visualize where detected spikes are. 05:13:02.100 |
So it visualizes sort of where in the broadband signal, 05:13:09.180 |
And so one of these moments that I'll never forget 05:13:17.300 |
beautiful spikes being shown in the application, 05:13:19.740 |
just streaming live to a device I'm holding in my hand. 05:13:22.660 |
- So this is no signal processing, the raw data, 05:13:25.180 |
and then the signals processing is on top of it, 05:13:29.500 |
- And that's a UX too, 'cause that looks beautiful as well. 05:13:36.900 |
there was actually a lot of cameramen in the room. 05:13:42.100 |
who are all just excited to see robots taking their job. 05:13:44.780 |
And they're all crowded around a small little iPhone 05:13:47.660 |
watching this live brain data stream out of his brain. 05:13:54.100 |
So the computer vision aspect where it detects 05:14:03.660 |
then actually doing the really high precision 05:14:11.940 |
My answer's gonna be pretty lame here, but it was boring. 05:14:17.340 |
Yeah, that's exactly how you want surgery to be. 05:14:29.300 |
- Yeah, all the practice surgeries and the proxies, 05:14:37.940 |
Well, do you remember a moment where he was able 05:14:44.380 |
but get signal from the brain such that it was able 05:14:50.660 |
So we are quite excited to move as quickly as we can, 05:14:53.660 |
and Nolan was really, really excited to get started. 05:14:56.060 |
He wanted to get started actually the day of surgery, 05:14:58.500 |
but we waited 'til the next morning, very patiently. 05:15:03.180 |
And the next morning in the ICU where he was recovering, 05:15:08.420 |
he wanted to get started and actually start to understand 05:15:11.420 |
what kind of signal we can measure from his brain. 05:15:15.860 |
with the Neuralink system, we implant the Neuralink system, 05:15:19.260 |
or the Neuralink implant in the motor cortex. 05:15:25.780 |
So if you imagine closing and opening your hand, 05:15:28.060 |
that kind of signal representation would be present 05:15:31.340 |
If you imagine moving your arm back and forth 05:15:37.060 |
So one of the ways we start to sort of map out 05:15:39.460 |
what kind of signal do we actually have access to 05:15:44.780 |
And body mapping is where you essentially present a visual 05:15:46.740 |
to the user and you say, "Hey, imagine doing this." 05:15:49.580 |
And that visual is a 3D hand opening and closing 05:15:59.860 |
So you can't see them actually move their arm. 05:16:02.380 |
But while they do this task, you can record neural activity 05:16:05.620 |
and you can basically offline model and check, 05:16:13.660 |
hey, there's actually some modulation associated 05:16:28.620 |
of his brain activity and put it in front of him. 05:16:30.660 |
And we said, "Hey, you tell us what's going on." 05:16:36.140 |
And we know that it's modulating some of these neurons. 05:16:38.060 |
So you figure out for us what that is actually representing. 05:16:49.060 |
I see this particular neuron start to fire more." 05:16:53.820 |
And so he said, "Okay, three, two, one, boom." 05:16:57.660 |
you can see instantaneously this neuron is firing. 05:17:00.860 |
Single neuron, I can tell you the exact channel number 05:17:06.020 |
But that single channel firing was a beautiful indication 05:17:09.060 |
that it was behaviorally modulated neural activity 05:17:18.620 |
- Yes, channel and electrode are interchangeable. 05:17:27.420 |
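A minimal sketch of the kind of offline check described in the body-mapping step: compare each channel's firing rate during the cued imagined movement against its rate at rest and flag channels that modulate. The data layout and the 1.5x threshold are assumptions for illustration, not the actual analysis pipeline.

```python
import numpy as np

def modulated_channels(spike_counts, cue_on, min_ratio=1.5):
    """Flag channels whose firing rate during imagined-movement cues
    clearly exceeds their firing rate at rest.

    spike_counts: array (n_bins, n_channels) of spikes per time bin.
    cue_on: boolean array (n_bins,) marking bins where the 3D hand
            animation told the participant to imagine moving.
    min_ratio: hypothetical threshold on cue-rate / rest-rate.
    """
    rate_cue = spike_counts[cue_on].mean(axis=0)
    rate_rest = spike_counts[~cue_on].mean(axis=0)
    ratio = (rate_cue + 1e-9) / (rate_rest + 1e-9)
    return np.flatnonzero(ratio >= min_ratio)

# Hypothetical data: 1,000 bins, 64 channels, channel 7 modulates with the cue.
rng = np.random.default_rng(0)
counts = rng.poisson(1.0, size=(1000, 64))
cue = np.zeros(1000, dtype=bool)
cue[::2] = True
counts[cue, 7] += rng.poisson(3.0, size=cue.sum())
print(modulated_channels(counts, cue))   # expect channel 7 among the flagged ones
```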
That really, when I was learning about all this 05:17:31.300 |
and loading it in, it was just blowing my mind 05:17:35.380 |
that the intention, you can visualize yourself 05:17:37.780 |
moving the finger, that can turn into a signal. 05:17:41.620 |
And the fact that you can then skip that step 05:17:49.180 |
and that leading to a signal that can then be used 05:17:52.580 |
There is so many exciting things there to learn 05:17:56.620 |
about the brain, about the way the brain works. 05:17:58.980 |
The very fact of their existing signal that can be used 05:18:03.340 |
But it feels like that's just like the beginning 05:18:05.780 |
of figuring out how that signal could be used 05:18:10.500 |
- I should also just, there's so many fascinating details 05:18:13.140 |
here, but you mentioned the body mapping step. 05:18:15.740 |
At least in the version I saw that Nolan was showing off, 05:18:26.620 |
'cause I guess it visualizes you moving the hand 05:18:31.620 |
and there's a very like a sexy, polished interface. 05:18:44.580 |
in a really nice video game and this is a tutorial 05:18:50.700 |
- No, I mean, the future should feel like the future. 05:18:53.860 |
I mean, it needs to be simple, but not too simple. 05:18:57.220 |
- Yeah, and I think the UX design component here 05:18:59.700 |
is underrated for BCI development in general. 05:19:06.060 |
between the ways in which you visualize an instruction 05:19:08.820 |
to the user and the kinds of signal you can get back. 05:19:11.380 |
And that quality of sort of your behavioral alignment 05:19:13.700 |
to the neural signal is a function of how good you are 05:19:16.180 |
at expressing to the user what you want them to do. 05:19:18.260 |
And so, yeah, we spend a lot of time thinking about the UX, 05:19:23.900 |
the control surfaces it provides to the user. 05:19:27.700 |
- So maybe it'd be nice to get into a little bit more detail 05:19:48.660 |
And what are the different steps along the way 05:20:02.900 |
it's worth understanding what we're trying to measure 05:20:04.540 |
because that dictates a lot of the requirements 05:20:09.820 |
is really individual neurons producing action potentials. 05:20:13.900 |
you can think of it like a little electrical impulse 05:20:19.260 |
And by being close enough, I mean like within, 05:20:24.300 |
And 100 microns is a very, very tiny distance. 05:20:26.340 |
And so the number of neurons that you're gonna pick up 05:20:30.980 |
is just a small radius around that electrode. 05:20:36.660 |
is that when neurons produce an action potential, 05:20:38.660 |
the width of that action potential is about one millisecond. 05:20:41.540 |
So from the start of the spike to the end of the spike, 05:20:43.300 |
that whole width of that sort of characteristic feature 05:20:51.660 |
that an individual spike is occurring or not, 05:20:55.820 |
or sample the local field potential nearby the neuron 05:20:58.580 |
much more frequently than once a millisecond. 05:21:00.500 |
You need to sample many, many times per millisecond 05:21:18.740 |
what that exact shape of that action potential looks like. 05:21:22.060 |
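To make the sampling requirement concrete, here is the arithmetic under an assumed 20 kHz sampling rate; that figure is an illustration, not a quoted spec.

```python
SAMPLE_RATE_HZ = 20_000     # assumed for illustration
SPIKE_WIDTH_MS = 1.0        # roughly the width of an action potential

samples_per_spike = SAMPLE_RATE_HZ * SPIKE_WIDTH_MS / 1000
print(samples_per_spike)    # ~20 samples across one spike: enough to see
                            # its shape, not just that it happened
```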
And once we've sort of sampled at super high rate 05:21:25.580 |
the underlying electrical field nearby these cells, 05:21:30.940 |
into just where do we detect a spike or where do we not? 05:21:36.460 |
do we detect a spike in this one millisecond or not? 05:21:39.100 |
And we do that because the actual information 05:21:49.460 |
Essentially everything that we care about for decoding 05:21:52.620 |
in the frequency characteristics of spike trains, 05:22:19.980 |
There's a local neighborhood of neurons nearby. 05:22:34.060 |
on exactly this problem of basically given a single electrode 05:22:37.980 |
or given a set of electrodes measuring a set of neurons, 05:22:46.120 |
And this is a problem that's pursued in academic work 05:22:51.660 |
in the underlying sort of neuroscience of the brain. 05:22:59.820 |
then that's a very, very important question to understand. 05:23:06.100 |
if the number of neurons per electrode is relatively small, 05:23:13.100 |
You can think of it like sort of a random projection 05:23:21.100 |
those signals can be thought of as sort of a union of the two. 05:23:26.340 |
that's a totally reasonable trade-off to make 05:23:31.960 |
the relevance of distinguishing individual neurons 05:23:37.420 |
and you can start to rely on sort of correlations 05:23:41.040 |
to help understand when that channel is firing, 05:23:45.720 |
'Cause you know that when that channel is firing 05:23:54.280 |
- Okay, so you have to do this kind of spike detection 05:23:56.660 |
on board and you have to do that super efficiently. 05:24:04.380 |
'cause you don't wanna be generating too much heat. 05:24:05.920 |
So it has to be a super simple signal processing step. 05:24:14.620 |
about what it takes to overcome that challenge? 05:24:26.100 |
And I'll say that I don't think we're at the final step 05:24:31.740 |
but there can be many approaches that we find in the future 05:24:34.220 |
that are much better than what we do right now. 05:24:38.460 |
and there's a lot of academic heritage to these ideas. 05:24:47.820 |
sort of like a convolutional filter almost, if you will, 05:24:51.080 |
and looks for a certain template to be matched. 05:25:04.060 |
that that template is matched within certain bounds, 05:25:20.540 |
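A minimal sketch of the template-matching idea, assuming a 20 kHz sampling rate, a hand-made 1 ms biphasic template, and a normalized-correlation threshold; an on-implant version would be fixed-point and far leaner than this floating-point loop.

```python
import numpy as np

FS = 20_000          # assumed sampling rate, Hz (many samples per 1 ms spike)

def detect_spikes(signal, template, threshold=0.8):
    """Slide a spike-shaped template over the signal and mark sample indices
    where the normalized correlation exceeds a bound."""
    t = template - template.mean()
    t /= np.linalg.norm(t)
    hits = []
    for i in range(len(signal) - len(t)):
        window = signal[i:i + len(t)]
        w = window - window.mean()
        norm = np.linalg.norm(w)
        if norm == 0:
            continue
        if np.dot(w, t) / norm > threshold:
            hits.append(i)
    return hits

# Hypothetical 1 ms biphasic template at 20 kHz (20 samples).
template = np.concatenate([-np.hanning(10), 0.5 * np.hanning(10)])
noise = np.random.default_rng(2).normal(0, 0.05, FS // 10)   # 100 ms of noise
noise[1000:1020] += template                                  # embed one spike
print(detect_spikes(noise, template))                         # indices near 1000
```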
Another approach that we've recently started exploring, 05:25:23.340 |
and this can be combined with the spike detection approach, 05:25:34.620 |
Because the farther away you are from an electrode, 05:25:40.620 |
So you might be able to pick up population-level activity 05:25:47.680 |
what neuroscientists sometimes refer to as the hash 05:25:50.220 |
of activity, the other stuff that's going on. 05:26:00.460 |
That signal is now a floating point representation, 05:26:02.380 |
which means it's more expensive to send out in terms of power. 05:26:04.660 |
It means you have to find different ways to compress it 05:26:14.580 |
you're limited by the amount of data you can send. 05:26:18.300 |
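A minimal sketch of the spike-band-power alternative as described: band-pass the raw signal into an assumed spike band, average the power per bin, then quantize to 8 bits so it is cheaper to send over the link. The band edges, bin width, and quantization scheme here are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 20_000   # assumed sampling rate, Hz

def spike_band_power(raw, lo=300.0, hi=3000.0, bin_ms=15):
    """Band-pass into an assumed spike band, square, and average per bin."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=FS, output="sos")
    band = sosfiltfilt(sos, raw)
    samples_per_bin = int(FS * bin_ms / 1000)
    n_bins = len(band) // samples_per_bin
    return (band[:n_bins * samples_per_bin] ** 2
            ).reshape(n_bins, samples_per_bin).mean(axis=1)

def quantize_u8(power):
    """Compress floating-point power to 8 bits per bin for the radio link."""
    top = max(float(power.max()), 1e-12)
    return np.round(255 * power / top).astype(np.uint8)

raw = np.random.default_rng(3).normal(0, 1.0, FS)     # 1 s of synthetic signal
print(quantize_u8(spike_band_power(raw))[:10])
```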
- And so, and also because you're currently using 05:26:20.860 |
the Bluetooth protocol, you have to batch stuff together. 05:26:23.700 |
But you have to also do this keeping the latency crazy low, 05:26:34.860 |
So I want to build the best mouse in the world. 05:26:38.340 |
I don't want to build like the Chevrolet Spark 05:26:43.100 |
I want to build like the Tesla Roadster version of a mouse. 05:26:50.980 |
that most e-sports competitions are dominated 05:26:54.540 |
This is like a very real possibility for a number of reasons. 05:26:56.740 |
One is that they'll have access to the best technology 05:27:02.620 |
So those two factors together are particularly potent 05:27:22.060 |
If it's a fundamentally different experience, 05:27:30.660 |
if it's just the ability to move the mouse 10X faster, 05:27:41.340 |
that's a really interesting possibility of what they can do. 05:27:43.940 |
Especially as you get really good at it with training. 05:27:50.460 |
Like because you don't have to buffer your intention 05:27:54.500 |
you get just by nature of having a brain implant at all, 05:28:04.180 |
you can sort of plan out sequences of action. 05:28:06.100 |
So you may not get that whole benefit all the time, 05:28:08.300 |
but for sort of like reaction time style games 05:28:12.020 |
somebody's over here, snipe 'em, that kind of thing. 05:28:15.300 |
You actually do have just an inherent advantage 05:28:18.860 |
So the question is just how much faster can you make it? 05:28:21.140 |
And we're already faster than what you would do 05:28:24.420 |
if you're going through muscle from a latency point of view. 05:28:33.460 |
If you think about the best mice in the world, 05:28:36.620 |
that's about five milliseconds-ish of latency, 05:28:41.380 |
There's a lot of characteristics that matter there, 05:28:43.860 |
And the rough time for like a neuron in the brain 05:28:58.580 |
And this is something that if you ask Nolan about it, 05:29:04.580 |
like what does it feel like when you're modulating 05:29:10.180 |
He said it moves before he is like actually intending it to, 05:29:18.500 |
What is that like to have a thing just be so immediate, 05:29:25.300 |
- Yeah, I suppose we've gotten used to that latency, 05:29:29.140 |
So is the bottleneck currently the communication? 05:29:36.100 |
I mean, there's always going to be a bottleneck, 05:29:41.500 |
Bluetooth low energy protocol has some restrictions 05:29:47.660 |
So the protocol itself establishes a standard of, 05:29:50.060 |
you know, the most frequent sort of updates you can send 05:29:57.620 |
of sort of individual spikes impacting control, 05:30:02.460 |
that kind of protocol is going to become a limiting factor 05:30:12.900 |
If you start pushing latency sort of below the level 05:30:18.340 |
Like you need your whole system to be able to be as reactive 05:30:22.260 |
as the sort of limits of what the technology can offer us. 05:30:29.900 |
at something that's at the level of one millisecond. 05:30:52.180 |
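For context on the protocol limit being described: the Bluetooth Low Energy specification's minimum connection interval is 7.5 ms, so updates have to be batched into packets at that cadence even if the decoder internally runs at 1 ms resolution. The arithmetic below just restates that.

```python
ble_min_connection_interval_ms = 7.5   # BLE spec minimum
decoder_tick_ms = 1.0                  # desired update granularity

# Roughly this many decoder updates get batched into each radio packet,
# and the connection interval sets a floor on end-to-end reaction time.
updates_per_packet = ble_min_connection_interval_ms / decoder_tick_ms  # 7.5
```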
I think there's probably a lot of folks listening 05:30:59.620 |
So there's an implant that's wirelessly communicating 05:31:04.700 |
with any digital device that has an app installed. 05:31:08.700 |
So maybe can you tell me at high level what the app is, 05:31:20.900 |
be able to navigate his computer independently. 05:31:36.500 |
But in terms of what's useful for people today, 05:31:40.740 |
to be able to just control mouse and keyboard inputs 05:31:42.540 |
to all the applications that they want to use 05:31:45.460 |
for communicating with their friends, et cetera. 05:31:49.700 |
is really to translate this wireless stream of brain data 05:31:52.380 |
coming off the implant into control of the computer. 05:31:55.500 |
And we do that by essentially building a mapping 05:31:57.980 |
from brain activity to sort of the HID inputs 05:32:02.660 |
So HID is just the protocol for communicating 05:32:13.940 |
But there's a lot of nuance of how that mapping works. 05:32:17.780 |
And we're still in the early stages of a long journey 05:32:24.940 |
of taking the statistical patterns of brain data 05:32:27.540 |
that's being channeled across this Bluetooth connection 05:32:34.420 |
you can think of it in a couple of different parts. 05:32:37.900 |
there's a training step and there's an inference step. 05:32:45.700 |
where the user has to imagine doing different actions. 05:32:48.260 |
So for example, they'll be presented a screen 05:32:51.460 |
and they'll be asked to push that cursor to the right. 05:32:54.220 |
Then imagine pushing that cursor to the left, 05:33:04.980 |
and then imagined behavior, map one to the other. 05:33:04.980 |
And you use that for a control of the computer. 05:33:28.580 |
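As a minimal sketch of that train-then-infer structure: open-loop calibration pairs binned neural features with the cued cursor motion, a model is fit to that pairing, and at inference time each new bin of brain data is mapped to a cursor move. The linear ridge model and the names below are illustrative stand-ins, not the actual decoder.

```python
import numpy as np

# Training step (open loop): X is (timesteps, channels) of binned spike
# features; Y is (timesteps, 2) of the cued cursor velocities the user was
# asked to imagine ("push the cursor to the right", etc.).
def fit_decoder(X, Y, ridge=1e-3):
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

# Inference step (closed loop): map each new feature bin to a velocity,
# which would then be emitted as a relative mouse-move (HID-style) event.
def decode_step(W, x_t, gain=1.0):
    vx, vy = gain * (x_t @ W)
    return vx, vy
```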
Because there's a variety of behavioral challenges 05:33:33.740 |
when you're working with someone who's paralyzed. 05:33:37.460 |
You can't see them attempt to move their hand. 05:33:43.580 |
and validate that they're doing it correctly, 05:33:45.740 |
such that then you can downstream build with confidence, 05:33:55.220 |
what I really mean is at this level of resolution 05:34:06.100 |
at the scale of sort of one millisecond resolution, 05:34:09.820 |
I could build a mapping from my neural spikes 05:34:14.500 |
that you don't observe what they're actually doing. 05:34:19.780 |
that give you more than just sort of a coarse 05:34:19.780 |
you really want it to be as responsive as possible. 05:34:34.780 |
when you're trying to move it from left to right. 05:34:36.740 |
And building a behavioral sort of calibration game 05:34:54.540 |
that the next thing the human is going to likely do 05:35:19.380 |
It's not just about being friendly or nice or usable. 05:35:31.340 |
is like fundamental to the operation of the thing 05:36:06.900 |
what that high resolution version of their intention is? 05:36:12.500 |
because there are enough data points in the dataset, 05:36:22.180 |
this is exactly how hard they're pushing upwards 05:36:25.740 |
this is how hard they're trying to push upwards. 05:36:27.420 |
- It's really important to have very clean labels, yes? 05:36:47.180 |
will have some number of assumptions it makes 05:36:50.580 |
Those assumptions can be formulated in a loss function 05:36:53.300 |
or they can be formulated in terms of heuristics 05:36:56.500 |
or guesstimate what the user is trying to do. 05:37:12.420 |
they're actually trying to go slightly faster 05:37:21.980 |
"Hey, user, imagine moving this cursor a fixed offset. 05:37:26.980 |
just try to move it exactly 200 pixels to the right. 05:37:44.580 |
and therefore potentially you can make it more accurate 05:37:47.140 |
than these heuristics that are trying to guesstimate 05:37:48.740 |
at each millisecond what the user is trying to do. 05:37:59.780 |
- For that step, what are we supposed to be visualizing? 05:38:02.780 |
There's a cursor and you want to move that cursor 05:38:18.740 |
where you're just getting a very large amount of data, 05:38:28.620 |
maybe you can get clean signal as a side effect. 05:38:34.900 |
- Or is that not an effective way for initial calibration? 05:38:40.420 |
So the first thing I would draw a distinction 05:38:43.700 |
between is sort of open loop versus closed loop. 05:38:46.140 |
So open loop, what I mean by that is the user 05:38:52.340 |
where they have some level of control at all. 05:38:55.060 |
In that setup, you really need to have some task 05:38:58.020 |
that gives the user a hint of what you want them to do, 05:39:08.900 |
and figuring out the right way to use it themselves, 05:39:16.940 |
and we can sort of rabbit hole into both of them 05:39:19.260 |
but the sort of challenge with the open loop task 05:39:24.220 |
proprioceptive feedback about what they're doing. 05:39:31.780 |
when they're trying to do an open loop calibration. 05:39:36.100 |
Like imagine if you sort of had your whole right arm numbed 05:39:39.460 |
and you stuck it in a box and you couldn't see it. 05:39:44.380 |
about what the position or activity of your arm was. 05:39:48.900 |
that's moving from left to right, match that speed. 05:39:53.860 |
to invoke whatever that imagined action is in your brain 05:39:59.820 |
But in any situation, you're gonna be inaccurate 05:40:02.500 |
and maybe inconsistent in how you do that task. 05:40:04.980 |
And so that's sort of the fundamental challenge 05:40:11.180 |
and they're able to start moving the mouse on their own, 05:40:15.500 |
they're going to very naturally adapt to that model. 05:40:19.740 |
between the model learning what they're doing 05:40:23.660 |
may not find you the best sort of global minima. 05:40:25.740 |
It may be that your first model was noisy in some ways, 05:40:32.020 |
There's some like part of the data distribution 05:40:38.500 |
they figure out the right sequence of imagined motions 05:40:41.300 |
or the right angle they have to hold their hand at 05:40:45.060 |
but then the next day they come back to their device 05:40:46.980 |
and maybe they don't remember exactly all the tricks 05:40:49.660 |
And so there's a complicated sort of feedback cycle here 05:40:53.620 |
and can make it a very, very difficult debugging process. 05:40:56.540 |
- Okay, there's a lot of really fascinating things there. 05:40:59.980 |
Yeah, actually just to stay on the closed loop. 05:41:08.420 |
this actually happened watching psychology grad students. 05:41:13.940 |
when they don't know how to program themselves. 05:41:15.420 |
They use a piece of software that somebody else wrote 05:41:25.860 |
Like nobody considers maybe we should fix this. 05:41:35.860 |
but you need to still, that might not be the optimal. 05:41:41.260 |
Do you have to restart from scratch every once in a while 05:41:46.380 |
First and foremost, I would say this is not a solved problem. 05:41:52.900 |
I would also say this is not a problem that's solved 05:41:57.740 |
and you can get sort of richer covariance structures 05:42:00.980 |
when trying to come up with good labeling strategies. 05:42:05.900 |
by just scaling channel count, this is one of them. 05:42:10.660 |
That's the first thing I wanna make sure it gets across. 05:42:12.820 |
The second thing is any solution that involves closed loop 05:42:15.660 |
is going to become a very difficult debugging problem. 05:42:25.460 |
'Cause if you can do that, even if the ceiling is lower, 05:42:39.860 |
that that should be an easier debugging problem. 05:42:48.380 |
of how to infer what the user is truly attempting to do. 05:42:51.740 |
although they're moving the cursor on the screen, 05:42:59.580 |
if you want to be able to improve the model further. 05:43:01.580 |
You still have this very complicated guesstimation 05:43:06.660 |
what is the true user intention underlying that signal. 05:43:10.620 |
has the nice property of being easy to debug. 05:43:21.500 |
is that this problem doesn't need to be solved 05:43:26.020 |
You know, even today with the solutions we have now, 05:43:30.660 |
the level of control that can be given to a user, 05:43:38.260 |
But again, I wanna build the world's best mouse. 05:43:42.020 |
that it's not even a question that you want it. 05:43:51.380 |
And a couple maybe details of previous studies 05:43:55.100 |
that I think are very interesting to understand 05:43:56.740 |
when thinking about how to solve this problem. 05:43:58.460 |
The first is that even when you have ground truth data 05:44:02.620 |
and you can get this with an able-bodied monkey, 05:44:04.300 |
a monkey that has a Neuralink device implanted, 05:44:04.300 |
it turns out that the optimal thing to predict 05:44:19.460 |
building a data set of what's going on in the brain 05:44:21.500 |
and what is the mouse exactly doing on the table. 05:44:24.020 |
And it turns out that if you build the mapping 05:44:36.420 |
is trying to go in a straight line to the target. 05:44:40.300 |
is actually more effective in producing a model 05:44:43.420 |
than actually predicting the underlying hand movement. 05:44:48.860 |
- There's obviously a very strong correlation 05:44:51.460 |
but the intention is a more powerful thing to be chasing. 05:45:06.020 |
you're acting on the intention, not the action, 05:45:38.260 |
- So you could imagine giving the user feedback on a screen, 05:45:43.220 |
you don't know what they're attempting to do. 05:45:47.660 |
of I'm doing this correctly or not correctly? 05:45:53.220 |
you're trying to move the cursor a certain position offset. 05:46:00.500 |
imagine moving it 200 pixels from where it was 05:46:09.940 |
okay, I know what the spike train looks like on average 05:46:14.700 |
Maybe I can produce some sort of probabilistic estimate 05:46:17.020 |
of how likely is that to be the action you took 05:46:20.100 |
given the latest trial or trajectory that you imagined. 05:46:23.300 |
And that could give the user some sort of feedback 05:46:24.860 |
of how consistent are they across different trials. 05:46:27.940 |
You could also imagine that if the user is prompted 05:46:33.260 |
that maybe they just become more behaviorally engaged 05:46:35.020 |
to begin with because the task is kind of boring 05:46:38.500 |
And so there may be benefits to the, you know, 05:46:40.900 |
the user experience of showing something on a screen, 05:46:45.660 |
to try to increase that number or push it upwards. 05:46:57.100 |
Hour to hour, day to day, week to week, month to month, 05:47:13.900 |
and then also with Noland during the clinical trial. 05:47:34.940 |
without considering this non-stationarity problem. 05:47:38.060 |
So maybe the first solution here that's important 05:47:40.100 |
is that they can recalibrate whenever they want. 05:47:42.300 |
This is something that Noland has the ability to do today. 05:47:49.220 |
without his caretaker or parents or friends around 05:47:55.140 |
is that when you have a good model calibrated, 05:48:00.140 |
So how often he has to do this recalibration today 05:48:02.220 |
depends really on his appetite for performance. 05:48:04.580 |
There are, we observe sort of a degradation through time 05:48:14.660 |
It can also be mitigated through a combination 05:48:16.340 |
of sort of software features that we provide to the user. 05:48:29.100 |
how smooth the output of that cursor intention actually is. 05:48:33.980 |
which is how easy is it to stop and hold still. 05:48:38.180 |
a great deal of flexibility and troubleshooting mechanisms 05:48:40.900 |
to be able to solve this problem for themselves. 05:48:45.940 |
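One way to picture those user-facing knobs (a gain on cursor speed, a smoothing term that trades responsiveness against the ability to stop and hold still) is a simple filter like the sketch below; the parameter names and defaults are made up.

```python
def smoothed_cursor_velocity(prev_output, decoded_velocity,
                             gain=1.0, smoothing=0.8):
    # Higher smoothing -> steadier cursor, easier to hold still;
    # lower smoothing -> snappier response. The user tunes this trade-off.
    target = gain * decoded_velocity
    return smoothing * prev_output + (1 - smoothing) * target
```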
selecting the mixer, and the mixer you have, it's- 05:48:55.780 |
And so yeah, there's that bias that there's a cursor drift 05:49:02.500 |
Although he said that you guys were just playing around 05:49:06.380 |
with it with him and they're constantly improving. 05:49:20.860 |
I guess, looking to the right side of the screen 05:49:22.500 |
or the left side of the screen to kind of adjust the bias. 05:49:25.460 |
That's one interface action, I guess, to adjust the bias. 05:49:34.220 |
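A sketch of what a bias-correction action like that can amount to: estimate the cursor's resting drift (for example while the user pushes into a screen edge or is believed to be idle) and subtract it from future decoded velocities. The update rule and constant below are illustrative, not the actual implementation.

```python
import numpy as np

def update_bias(bias, recent_velocities, alpha=0.05):
    # Exponentially blend in the average decoded velocity observed during
    # the bias-correction gesture; this becomes the estimated drift.
    return (1 - alpha) * bias + alpha * np.mean(recent_velocities, axis=0)

def debiased(decoded_velocity, bias):
    return decoded_velocity - bias
```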
with sort of brain-gate clinical trial participants 05:49:37.100 |
where they pioneered this idea of bias correction. 05:49:44.580 |
where the user can essentially flash the cursor 05:49:56.340 |
is just sort of what is the default motion of the cursor 05:50:00.060 |
And it turns out that that's one of the first 05:50:02.260 |
sort of qualia of the cursor control experience 05:50:02.260 |
- I mean, I don't know how else to describe it. 05:50:09.860 |
Like, you know, I'm not the guy moving the thing. 05:50:15.540 |
Yeah, I mean, it sounds poetic, but it is deeply true. 05:50:23.060 |
it is a joyful, a really pleasant experience. 05:50:32.620 |
It's like you have the possibility to frustrate people 05:50:40.840 |
it really is truly the case that UX is how the thing works. 05:50:43.600 |
And so it's not just like what's showing on the screen, 05:50:49.800 |
Like we want them to feel like they're in the F1 car, 05:50:51.480 |
not like the, you know, some like minivan, right? 05:50:55.240 |
And that really truly is how we think about it. 05:51:03.120 |
And there's different, you know, control surfaces 05:51:04.920 |
that different kinds of cars and airplanes provide the user. 05:51:21.640 |
that you give the trackpad translates to cursor movement 05:51:28.720 |
a different curve to how much a movement translates 05:51:33.440 |
And that's because somebody sat down a long time ago 05:51:35.720 |
when they're designing the initial input systems 05:51:38.720 |
and they thought through exactly how it feels 05:51:42.920 |
And now we're designing sort of the next generation 05:51:50.240 |
Again, you don't feel the mouse in your hand. 05:51:52.160 |
You don't feel the keys under your fingertips. 05:51:56.960 |
for the user to understand the state of the system 05:51:58.720 |
and how to achieve what they want to achieve. 05:52:00.520 |
And ultimately the end goal is that that UX is completely, 05:52:04.600 |
It becomes something that's so natural and intuitive 05:52:09.120 |
like they have basically direct control over the cursor. 05:52:13.120 |
They're not thinking about the implementation 05:52:14.480 |
of how to make it do what they want it to do. 05:52:17.720 |
- Is there some kind of things along the lines 05:52:21.320 |
of like Fitts' law where you should move the mouse 05:52:24.320 |
in a certain kind of way that maximizes your chance 05:52:37.080 |
No, is there some kind of understanding of the laws of UX 05:52:52.360 |
Like that's different than actual with a mouse. 05:52:57.920 |
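For reference, the Fitts' law being alluded to relates how long a pointing movement takes to the distance and size of the target; a common (Shannon) form of its index of difficulty is sketched below.

```python
import math

def fitts_index_of_difficulty(distance, width):
    # Movement time is modeled as a + b * ID, where ID grows with the
    # distance to the target and shrinks as the target gets wider.
    return math.log2(distance / width + 1)
```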
So I wouldn't claim to have solved that problem yet, 05:53:01.200 |
but there's definitely some things we've learned 05:53:03.040 |
that make it easier for the user to get stuff done. 05:53:08.040 |
And it's pretty straightforward when you verbalize it, 05:53:10.360 |
but it takes a while to actually get to that point 05:53:11.800 |
when you're in the process of debugging this stuff 05:53:14.600 |
One of those things is that any machine learning system 05:53:25.800 |
For example, if you're developing a search algorithm 05:53:28.280 |
in your photos, if you search for your friend Joe 05:53:31.280 |
and it pulls up a photo of your friend Josephine, 05:53:35.280 |
because the cost of an error is not that high. 05:53:38.440 |
In a different scenario where you're trying to, 05:53:41.400 |
you know, detect insurance fraud or something like this, 05:53:45.600 |
because of some machine learning model output, 05:53:47.040 |
then the errors make a lot more sense to be careful about. 05:53:50.200 |
You want to be very thoughtful about how those errors 05:53:57.280 |
that's decoding a velocity output from the brain 05:53:59.680 |
versus an output where you're trying to modulate 05:54:08.360 |
For velocity, it's okay to be on average correct 05:54:10.640 |
because the output of the model is integrated through time. 05:54:13.240 |
So if the user is trying to click at position A 05:54:22.680 |
is on average correct, they can sort of steer through time 05:54:27.240 |
they can get to the point they want to get to. 05:54:30.800 |
For a click, you're performing it almost instantly 05:54:35.240 |
And so you want to be very sure that that click is correct 05:54:38.240 |
because a false click can be very destructive to the user. 05:54:57.120 |
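A toy way to see the asymmetry being described: velocity errors get integrated and visually corrected by the user, while a click is a one-shot decision, so its confidence threshold has to be much stricter. The time step and threshold below are arbitrary examples.

```python
def integrate_cursor(position, decoded_velocity, dt=0.005):
    # Small velocity errors wash out: the user sees the drift and steers back.
    return position + decoded_velocity * dt

def should_click(p_click, threshold=0.95):
    # A false click can't be steered back, so demand high confidence.
    return p_click > threshold
```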
how to build a solution that is, when it's wrong, 05:55:10.600 |
So every action, if an error occurs, has a certain cost. 05:55:13.760 |
And incorporating that into how you interpret the intention, 05:55:19.880 |
mapping it to the action is really important. 05:55:26.560 |
realize there's a cost to like sending the text early. 05:55:33.760 |
like if your cursor, imagine if your cursor misclicked 05:55:43.280 |
because they're over the target they want to hit 05:55:46.000 |
Which means that in the data sets that we build, 05:55:48.520 |
on average is the case that sort of low speeds 05:55:58.000 |
people think that, oh, click is a binary signal. 05:56:04.760 |
for it to become a useful thing for the user. 05:56:07.680 |
I mean, you can sort of take the compound approach of, 05:56:13.400 |
so we can be very confident about the answer. 05:56:16.680 |
The world's best mouse doesn't take a second to click 05:56:22.560 |
And so if you're aiming for that kind of high bar, 05:56:24.600 |
then you really want to solve the underlying problem. 05:56:35.640 |
Maybe a good place to start is to talk about WebGrid 05:56:35.640 |
- Yeah, maybe I'll take one zoom out step there, 05:56:45.600 |
which is just explaining why we care to measure this at all. 05:56:51.400 |
the ability to control the computer as well as I can, 05:57:00.000 |
including all those little details like command tab, 05:57:03.280 |
They need to be able to do it with their brain 05:57:08.840 |
And so we intend to measure and quantify every aspect 05:57:10.880 |
of that to understand how we're progressing towards that goal. 05:57:13.280 |
There's many ways to measure BPS, by the way. 05:57:20.720 |
which is dependent on how fast and accurately they can select 05:57:37.480 |
You could imagine that's built out of a grid, 05:57:41.480 |
And bits per second is a measure that's computed 05:57:43.840 |
by taking the log of the number of targets on the screen. 05:57:47.200 |
You can subtract one if you care to model a keyboard 05:57:51.920 |
But log of the number of targets on the screen 05:57:54.040 |
times the number of correct selections minus incorrect 05:57:56.880 |
divided by some time window, for example, 60 seconds. 05:57:59.840 |
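Written out, the metric just verbalized looks like the sketch below (log2 of the number of targets, optionally minus one, times net correct selections, divided by the time window); the example numbers are made up.

```python
import math

def webgrid_bps(n_targets, n_correct, n_incorrect, seconds, subtract_one=True):
    n = n_targets - 1 if subtract_one else n_targets
    return math.log2(n) * (n_correct - n_incorrect) / seconds

# e.g. a 35x35 grid with 60 correct and 2 incorrect selections in 60 seconds:
print(webgrid_bps(35 * 35, 60, 2, 60.0))   # ~9.9 BPS
```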
And that's sort of the standard way to measure 05:58:03.320 |
And all credit in the world goes to this great professor, 05:58:05.640 |
Dr. Shenoy of Stanford, who came up with that task. 05:58:08.080 |
And he's also one of my inspirations for being in the field. 05:58:13.560 |
to facilitate this kind of bragging rights that we have now 05:58:28.200 |
Yeah, so for Noland and for me playing this task, 05:58:39.200 |
or you could have targets that you just dwell over, 05:58:41.760 |
or you could have targets that you left, right click on. 05:58:43.600 |
You could have targets that are left, right click, 05:58:45.560 |
middle click, scrolling, clicking, and dragging. 05:58:49.800 |
But the simplest, purest form is just blue targets 05:59:07.320 |
or beaten by Noland with his Neuralink device. 05:59:11.280 |
the sort of world record for a human using the device 05:59:23.120 |
And again, the sort of median Neuralink performance 05:59:26.480 |
You can think of it roughly as he's 85% the level of control 05:59:33.880 |
And yeah, I think there's a very interesting journey ahead 05:59:38.880 |
to get us to that same level of 10 BPS performance. 05:59:42.320 |
It's not the case that sort of the tricks that got us 05:59:44.360 |
from, you know, 4 to 6 BPS, and then 6 to 8 BPS 05:59:47.640 |
are gonna be the ones that get us from 8 to 10. 05:59:52.720 |
It's how do you understand at a very, very fine resolution 05:59:57.560 |
And yeah, I highly encourage folks in academia 06:00:01.320 |
- What's the journey with Noland on that quest 06:00:14.840 |
He's really serious about improving his performance 06:00:18.080 |
So what is the journey of trying to figure out 06:00:21.640 |
How much can that be done on the decoding side? 06:00:24.760 |
How much can that be done on the calibration side? 06:00:30.400 |
of like figuring out how to convey his intention 06:00:40.960 |
why Noland's performance is so good is because of Noland. 06:00:43.640 |
Noland is extremely focused and very energetic. 06:00:47.360 |
He'll play WebGrid sometimes for like four hours 06:00:50.760 |
like from 2 a.m. to 6 a.m. he'll be playing WebGrid 06:00:53.080 |
just because he wants to push it to the limits 06:00:59.280 |
We're not saying, "Hey, you should play WebGrid tonight." 06:01:01.320 |
We just gave him the game as part of our research 06:01:18.560 |
is that he was extremely motivated to make this work. 06:01:23.360 |
clinical trial participants from BrainGate and other trials. 06:01:27.440 |
of like they viewed this as their life's work 06:01:29.720 |
to advance the technology as much as they can. 06:01:33.360 |
And if that means selecting targets on the screen 06:01:35.280 |
for four hours from 2 a.m. to 6 a.m. then so be it. 06:01:38.200 |
And there's something extremely admirable about that 06:01:42.360 |
Okay, so now how do you sort of get from where he started 06:01:48.960 |
there's a huge amount of learning to do on his side 06:01:56.000 |
And the most intuitive control for him is sort of, 06:02:02.640 |
So we don't pick up every single neuron in the motor cortex 06:02:07.360 |
So there may be some signals that we have better 06:02:14.520 |
his left ring finger from his left middle finger. 06:02:17.720 |
But on his right hand, we have a good control 06:02:20.440 |
and good modulation detected from the neurons 06:02:22.120 |
we're able to record for his pinky and his thumb 06:02:24.840 |
So you can imagine how these different subspaces 06:02:36.240 |
he was able to go and explore various different ways 06:02:40.280 |
For example, he can imagine controlling the cursor 06:02:46.080 |
You know, he tried like a whole bunch of stuff 06:02:47.600 |
to explore the space of what is the most natural way 06:02:52.200 |
that at the same time is easy for us to decode. 06:02:54.160 |
- Just to clarify, it's through the body mapping procedure 06:02:58.040 |
that you're able to figure out which finger he can move? 06:03:09.520 |
than we represent in that visual on the screen. 06:03:11.360 |
So we show him sort of abstractly, here's a cursor, 06:03:17.440 |
And we obviously have hints about what will work best 06:03:19.360 |
from that body mapping procedure of, you know, 06:03:20.800 |
we know that this particular action we can represent well, 06:03:27.120 |
- But at which point does he no longer visualize 06:03:31.600 |
and it's just visualizing the movement of the cursor? 06:03:45.200 |
It looked like the model wasn't performing super well 06:03:49.760 |
Like what actually happened was he was trying something new 06:03:55.800 |
So he wasn't imagining moving his hand anymore. 06:03:57.940 |
He was just imagining, I don't know what it is, 06:03:59.960 |
some like abstract intention to move the cursor on the screen 06:04:08.060 |
I cannot give a first person account of what that's like, 06:04:12.220 |
but the expletives that he uttered in that moment were, 06:04:16.420 |
that it was a very qualitatively different experience 06:04:18.520 |
for him to just have direct neural control over a cursor. 06:04:30.840 |
because he discovered it, like you said to me, 06:04:37.400 |
through all of this, the process of trying to move the cursor 06:04:44.280 |
But that is clearly a really powerful thing to arrive at, 06:04:47.360 |
which is to let go of trying to control the fingers 06:04:52.360 |
and the hand and control the actual digital device 06:04:59.800 |
the user doesn't have to think about what they need to do 06:05:14.200 |
- So is it just simply learning like high-level software, 06:05:26.840 |
I'm very excited to see with sort of the second participant 06:05:30.200 |
that we implant, what the journey is like for them, 06:05:39.760 |
This wasn't me prompting Noland to go try this. 06:05:39.760 |
But now that we know that that's a possibility, 06:05:48.080 |
that maybe there's a way to, for example, hint the user, 06:05:57.440 |
And from there, we should be able to hopefully understand 06:05:59.760 |
how this is for somebody who has not experienced that before. 06:06:02.440 |
Maybe that's the default mode of operation for them. 06:06:04.240 |
You don't have to go through this intermediate phase 06:06:07.760 |
- Or maybe if that naturally happens for people, 06:06:14.780 |
Actually, sometimes, just like with the four-minute mile, 06:06:20.560 |
- Yeah, enables you to do it, and then it becomes trivial. 06:06:39.400 |
All of a sudden, that's unlocked for everybody. 06:06:59.720 |
And so even just that, help constrain the beam search 06:07:02.720 |
of different approaches that we could explore, 06:07:07.400 |
the set of things that we'll get to try on day one, 06:07:09.080 |
how fast we hope to get them to useful control, 06:07:11.280 |
how fast we can enable them to use it independently, 06:07:16.640 |
and all the participants that came before him 06:07:20.880 |
- So how often are the updates to the decoder? 06:07:26.560 |
and that in the stream he said he plays the snake game 06:07:33.480 |
It's a good way for him to test like how good the update is. 06:07:43.240 |
So how often, like what does the update entail? 06:07:49.640 |
So one is, it's probably worth drawing a distinction 06:07:49.640 |
to understand like what the best approach is. 06:07:57.640 |
where we wanted to have the ability to just go use the device 06:08:04.200 |
I think usually in the context of a research session 06:08:07.280 |
many, many different approaches to, you know, 06:08:09.240 |
even unsupervised approaches that we talked about earlier 06:08:12.560 |
to estimate his true intention and more accurately decode it. 06:08:19.320 |
he'll sometimes work for like eight hours a day. 06:08:22.480 |
hundreds of different models that we would try in that day. 06:08:27.320 |
Now, it's also worth noting that we update the application 06:08:33.400 |
we'll update his application with different features 06:08:39.320 |
he's a very articulate person who is part of the solution. 06:08:43.600 |
He says, hey, here's this thing that I've discovered 06:08:52.000 |
And it often happens that those things are addressed 06:08:54.440 |
within a couple hours of him giving us his feedback. 06:08:57.000 |
That's the kind of iteration cycle we'll have. 06:08:58.520 |
And so sometimes at the beginning of the session, 06:09:01.840 |
he's giving us feedback on the next iteration 06:09:14.560 |
So one of the amazing things about human beings 06:09:25.440 |
that they can provide feedback, continuous feedback. 06:09:28.200 |
- It also requires, just to brag on the team a little bit, 06:09:33.360 |
and it requires the team being absolutely laser focused 06:09:36.160 |
on the user and what will be the best for them. 06:09:43.440 |
We're gonna skip that today and we're gonna do this. 06:09:45.440 |
That level of focus commitment is, I would say, 06:09:52.680 |
And also, you obviously have to have the talent 06:09:54.840 |
to be able to execute on these things effectively. 06:10:00.160 |
- Yeah, and this is such an interesting space of UX design 06:10:00.160 |
- Yeah, UX is not something that you can always solve 06:10:23.720 |
by just constant iterating on different things. 06:10:27.200 |
and think globally, am I even in the right sort of minima 06:10:34.640 |
is the predictor of how successful you will be. 06:10:37.600 |
As a good example, like in an RL simulation, for example, 06:10:56.080 |
combined with what the problem is you're trying to solve. 06:11:04.080 |
- Yeah, that's the old like stories of Steve Jobs, 06:11:13.320 |
And sometimes you have to remove the floppy disk drive 06:11:15.920 |
or whatever the, I forgot all the crazy stories 06:11:18.840 |
of Steve Jobs, like making wild design decisions. 06:11:27.120 |
That some of it is about the love you put into the design, 06:11:33.800 |
which is very much a Steve Jobs, Johnny Ive type thing. 06:11:36.760 |
But when you have a human being using their brain 06:11:41.600 |
to interact with it, it also is deeply about function. 06:11:46.720 |
And that you have to empathize with a human being before you 06:12:05.280 |
sometimes a complete, like rebuilding the design. 06:12:16.440 |
- Yeah, I mean, I'll give one concrete example. 06:12:19.040 |
So he really wanted to be able to read manga. 06:12:29.720 |
You can't scroll with a mouse stick on his iPad 06:12:31.720 |
and on the website that he wanted to be able to use 06:12:36.440 |
- Might be a good quick pause to say the mouse stick 06:12:38.240 |
is the thing he's using, holding a stick in his mouth 06:12:44.960 |
It's basically, you can imagine it's a stylus 06:12:49.000 |
- And it's exhausting, it hurts, and it's inefficient. 06:12:54.000 |
- Yeah, and maybe it's also worth calling out, 06:12:56.120 |
there are other alternative assistive technologies, 06:13:01.640 |
and I think it's also not well understood by folks, 06:13:06.040 |
so he'll have muscle spasms from time to time. 06:13:08.120 |
And so any assistive technology that requires him 06:13:09.760 |
to be positioned directly in front of a camera, 06:13:12.760 |
or anything that requires him to put something in his mouth, 06:13:20.560 |
it'll stab him in the face if he spasms too hard. 06:13:23.240 |
So these kinds of considerations are important 06:13:24.880 |
when thinking about what advantages a BCI has 06:13:24.880 |
wherever you want to, either in the bed or in the chair, 06:13:40.840 |
in how good the solution is in that user's life. 06:13:48.040 |
So again, Manga is something he wanted to be able to read. 06:13:50.800 |
And there's many ways to do scroll with a BCI. 06:13:55.120 |
You can imagine different gestures, for example, 06:14:00.360 |
But scroll is a very fascinating control surface 06:14:02.840 |
because it's a huge thing on the screen in front of you. 06:14:11.400 |
Like you really don't want to have your Manga page 06:14:12.840 |
that you're trying to read be shifted up and down 06:14:15.040 |
a few pixels just because your scroll decoder 06:14:19.680 |
And so this was an example where we had to figure out 06:14:24.760 |
that the errors of the system, whenever they do occur, 06:14:34.560 |
It doesn't interrupt their flow of reading their book. 06:14:45.240 |
And Quick Scroll basically looks at the screen 06:14:47.600 |
and it identifies where on the screen are scroll bars. 06:14:50.960 |
And it does this by deeply integrating with macOS 06:15:00.680 |
And we identified where those scroll bars are 06:15:05.120 |
And the BCI scroll bar looks similar to a normal scroll bar, 06:15:16.800 |
in the same way that you'd use a push to control, 06:15:22.800 |
So it's basically like remapping the velocity 06:15:26.480 |
And the reason that feels so natural and intuitive 06:15:33.960 |
You don't have to like switch your imagined movement. 06:15:36.040 |
You sort of snap onto it and then you're good to go. 06:15:38.040 |
You just immediately can start pulling the page down 06:15:49.640 |
Like when you scroll a page with your fingers on the screen, 06:16:04.720 |
or what's the right amount of page give, if you will, 06:16:08.560 |
when you push it to make it flow the right amount 06:16:14.480 |
And there's a million, I mean, I could tell you, 06:16:19.160 |
that we spent probably like a month getting right 06:16:25.520 |
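Those scroll-feel details (momentum, friction, how an imagined push maps onto page movement) can be pictured with a toy physics loop like this; every constant here is a guess for illustration, not the tuned values.

```python
def scroll_step(scroll_velocity, decoded_push, engaged,
                gain=3.0, friction=0.92, dt=0.005):
    # While "snapped onto" the scroll bar, the imagined up/down push drives
    # scroll velocity; once released, momentum decays with friction.
    if engaged:
        scroll_velocity += gain * decoded_push * dt
    return scroll_velocity * friction   # applied as pixels per tick
```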
- I mean, even the scroll on a smartphone with your finger 06:16:38.520 |
And actually the same kind of visionary UX design 06:16:45.840 |
but also listen to them and also have like visionary big, 06:16:54.720 |
By the way, it just makes me think that scroll bars 06:17:05.120 |
snap to scroll bar action you're talking about 06:17:08.000 |
is something that could potentially be extremely useful 06:17:13.040 |
Even just for users to just improve the experience. 06:17:28.520 |
there should be a snapping to the scroll bar action. 06:17:31.240 |
But of course, maybe I'm okay paying that cost, 06:17:42.440 |
this is necessary because there's an extra cost 06:17:48.480 |
So you have to switch between the scrolling and the reading. 06:17:53.320 |
There has to be a phase shift between the two. 06:17:56.080 |
Like when you're scrolling, you're scrolling. 06:17:59.440 |
So that is one drawback of the current approach. 06:18:02.000 |
Maybe one other just sort of case study here. 06:18:10.640 |
to how we design the decoder, what we choose to decode 06:18:12.920 |
to then how it works once it's being used by the user. 06:18:15.040 |
So another good example in that sort of how it works 06:18:22.000 |
It's also a function of what's going on on the screen. 06:18:28.960 |
that very small, stupid little X that's extremely tiny, 06:18:34.240 |
if you're dealing with sort of a noisy output of the decoder, 06:18:36.680 |
we can understand that that is a small little X 06:18:39.200 |
and actually make it a bigger target for you. 06:18:40.920 |
Similar to how when you're typing on your phone, 06:18:42.880 |
if you're used to like the iOS keyboard, for example, 06:18:46.360 |
it actually adapts the target size of individual keys 06:18:56.240 |
because it knows Lex is the person I'm going to go see. 06:19:01.920 |
even without improvements to the underlying decoder 06:19:07.840 |
So we do that with a feature called magnetic targets. 06:19:09.520 |
We actually index the screen and we understand, 06:19:11.560 |
okay, these are the places that are very small targets 06:19:15.360 |
Here's the kind of cursor dynamics around that location 06:19:17.720 |
that might be indicative of the user trying to select it. 06:19:20.880 |
Let's blow up the size of it in a way that makes it easier 06:19:22.760 |
for the user to sort of snap onto that target. 06:19:25.640 |
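A minimal sketch of a magnetic-target style assist: given the locations of known small targets (found by indexing the screen), nudge the cursor toward a nearby one so its effective size grows. The radius and pull strength are invented for illustration.

```python
import math

def apply_magnetic_targets(cursor, small_targets, snap_radius=40.0, pull=0.3):
    x, y = cursor
    for tx, ty in small_targets:
        if math.hypot(x - tx, y - ty) < snap_radius:
            x += pull * (tx - x)
            y += pull * (ty - y)
    return (x, y)
```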
they matter a lot in helping the user be independent 06:19:41.840 |
So the underlying signal we're trying to decode 06:19:44.320 |
is going to look very different in P2 than in P1. 06:19:48.720 |
is going to mean something different in user one 06:19:52.960 |
with channel 345 is going to be next to a different neuron 06:19:59.280 |
the user experience of how do you get the right 06:20:04.760 |
we hope will translate over multiple generations of users. 06:20:11.520 |
to sort of Noland's user experience desires and preferences. 06:20:11.520 |
you know, when we get a second, third, fourth participant, 06:20:19.680 |
that we find sort of what the right wide minimas are 06:20:24.400 |
And hopefully there's a cross-pollination of things where, 06:20:26.520 |
oh, we didn't think about that with this user 06:20:30.400 |
But with this user who just can fundamentally 06:20:42.680 |
- So the actual mechanism of open loop labeling 06:20:46.840 |
and then closed loop labeling will be the same 06:20:49.760 |
and hopefully can generalize across the different users 06:21:07.200 |
there used to be kind of idea of human computation, 06:21:10.560 |
which is using actions a human would want to do anyway 06:21:30.280 |
And so we thought really quite a bit actually 06:21:38.280 |
that's for long duration and stuff like this. 06:21:40.280 |
It turns out that like people love this game. 06:21:43.000 |
but I didn't know that that was a shared perception. 06:21:48.120 |
WebGrid is, there's a grid of let's say 35 by 35 cells 06:21:56.400 |
and you have to move your mouse over that and click on it. 06:22:01.320 |
- I played this game for so many hours, so many hours. 06:22:06.280 |
- My, I think I have the highest at Neuralink right now. 06:22:10.720 |
- Which is about, if you imagine that 35 by 35 grid, 06:22:13.120 |
you're hitting about a hundred trials per minute. 06:22:15.960 |
So a hundred correct selections in that one minute window. 06:22:21.960 |
- So one of the reasons I think I struggle with that game 06:22:29.320 |
If I can avoid touching the mouse, it's great. 06:22:33.160 |
So how can you explain your high performance? 06:22:55.440 |
And then I'll actually eat like a ton of peanut butter 06:23:02.120 |
This is again a night owl thing I think we share, 06:23:10.400 |
which is, I used to be, I was homeschooled growing up. 06:23:13.000 |
And so I did most of my work like on the floor, 06:23:22.120 |
like there's not a lot of weight on your elbow 06:23:23.640 |
when you're playing so that you can move quickly. 06:23:28.040 |
So it's like small motions that actually move the cursor. 06:23:30.000 |
- Are you moving with your wrist or you're never- 06:23:40.640 |
- Which I've been meaning to go down this rabbit hole 06:23:43.680 |
of people that set the world record in Tetris. 06:23:47.280 |
Those folks, they're playing, there's a way to, 06:23:56.000 |
where like it's using a loophole, like a bug, 06:24:04.400 |
But you do realize there'll be like a few programmers 06:24:10.440 |
I mean, the reason I did this literally was just 06:24:12.360 |
because I wanted the bar to be high for the team. 06:24:17.760 |
It should be like, it should be able to beat all of us 06:24:19.240 |
at least, like that should be the minimum bar. 06:24:23.320 |
- Yeah, I don't know what the limit, I mean, the limits, 06:24:25.000 |
you can calculate just in terms of like screen refresh rate 06:24:28.000 |
and like cursor immediately jumping to the next target. 06:24:31.000 |
But there's, I mean, I'm sure there's limits before that 06:24:32.800 |
with just sort of reaction time and visual perception 06:24:36.600 |
I'd guess it's below 40, but above 20, 06:24:36.600 |
It also matters like how difficult the task is. 06:24:44.080 |
You can imagine like some people might be able to do 06:24:50.760 |
So there's some like task optimizations you could do 06:25:10.720 |
I mean, the first answer that's important to say 06:25:15.520 |
So again, nobody's gotten to that number before. 06:25:22.640 |
What we've seen historically is that different parts 06:25:26.880 |
of the stack would be the bottleneck at different time points. 06:25:28.520 |
So when I first joined Neuralink like three years ago or so, 06:25:31.600 |
one of the major problems was just the latency 06:25:34.280 |
It was just like the radio device wasn't super good. 06:25:38.600 |
And it just like, no matter how good your decoder was, 06:25:41.000 |
if your thing is updating every 30 milliseconds 06:25:42.840 |
or 50 milliseconds, it's just gonna be choppy. 06:25:46.640 |
that's gonna be frustrating and lead to challenges. 06:25:49.360 |
So at that point, it was very clear that the main challenge 06:25:51.880 |
is just get the data off the device in a very reliable way 06:25:55.640 |
such that you can enable the next challenge to be tackled. 06:26:03.000 |
actually the modeling challenge of how do you 06:26:09.600 |
and you have a label you're trying to predict, 06:26:11.440 |
just what is the right neural decoder architecture 06:26:16.760 |
And once you solve that, it became a different bottleneck. 06:26:20.880 |
was actually just sort of software stability and reliability. 06:26:24.320 |
If you have widely varying sort of inference latency 06:26:32.680 |
every once in a while, it decreases your ability 06:26:35.880 |
and it basically just disrupts your control experience. 06:26:38.280 |
And so there's a variety of different software bugs 06:26:40.200 |
and improvements we made that basically increased 06:26:43.760 |
made it much more reliable, much more stable, 06:26:45.920 |
and led to a state where we could reliably collect data 06:26:50.640 |
It's just sort of like the software stack itself. 06:26:55.120 |
there's sort of two major directions you could think about 06:27:01.680 |
So labeling is, again, this fundamental challenge 06:27:22.760 |
to improve BPS further is either completely changing 06:27:29.160 |
So this is serving the direction of functionality. 06:27:31.000 |
Basically, you can imagine giving more clicks. 06:27:32.960 |
For example, a left click, a right click, a middle click, 06:27:35.440 |
different actions like click and drag, for example, 06:27:40.520 |
If you're trying to allow the user to express themselves 06:27:46.760 |
But what actually matters at the end of the day 06:27:48.160 |
is how effective are they at navigating their computer? 06:27:51.000 |
And so from the perspective of the downstream tasks 06:27:52.760 |
that you care about, functionality and extending 06:27:54.400 |
functionality is something we're very interested in 06:27:55.800 |
because not only can it improve the sort of number of BPS, 06:27:59.000 |
but it can also improve the downstream sort of independence 06:28:01.480 |
that the user has and the skill and efficiency 06:28:19.040 |
So what you'll see is that if you sort of plot a curve 06:28:22.560 |
of number of channels that you're using for decode 06:28:36.160 |
So as you move further out in number of channels, 06:28:39.520 |
you get a corresponding sort of logarithmic improvement 06:28:41.600 |
in control quality and offline validation metrics. 06:28:44.400 |
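The shape of that curve can be pictured with a toy example; the constants below are invented, and the only point is diminishing-but-nonzero returns as channel count doubles.

```python
import math

for channels in [128, 256, 512, 1024, 2048]:
    offline_metric = 0.4 + 0.08 * math.log2(channels / 128)
    print(channels, round(offline_metric, 2))
```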
The important nuance here is that each channel 06:28:55.960 |
it might correspond with moving to the right. 06:29:00.280 |
If you want to expand the number of functions 06:29:04.880 |
you really want to have a broader set of channels 06:29:07.120 |
that covers a broader set of imagined movements. 06:29:11.520 |
Like if you had a bunch of different imagined movements 06:29:18.720 |
handwriting to output characters on the screen. 06:29:20.400 |
You could imagine just typing with your fingers 06:29:23.480 |
You could imagine different finger modulations 06:29:26.800 |
for opening some menu or wiggling your, you know, 06:29:30.160 |
your big toe to have like a command tab occur 06:29:33.840 |
So it's really the amount of different actions 06:29:40.960 |
- Right, so that's more about the number of actions. 06:29:44.240 |
So actually as you increase the number of threads, 06:29:46.720 |
that's more about increasing the number of actions 06:29:51.960 |
- One other nuance there that is worth mentioning. 06:29:53.760 |
So again, our goal is really to enable a user 06:29:55.440 |
with paralysis to control the computer as fast as I can. 06:29:57.560 |
So that's BPS with all the same functionality I have, 06:30:10.280 |
the relative importance of any particular feature 06:30:16.120 |
which means that if the sort of neural non-stationarity 06:30:25.960 |
then your reliability of your system will improve. 06:30:28.560 |
So one sort of core thesis that at least I have 06:30:41.040 |
when you say a non-stationarity of the signal, 06:30:47.800 |
what the actual underlying signal looks like. 06:30:49.440 |
So again, I spoke very briefly at the beginning 06:30:51.480 |
about how when you imagine moving to the right 06:30:59.000 |
it's very correlated with the output intention 06:31:01.160 |
or the behavioral task that the user is doing. 06:31:07.920 |
is like the only way the brain can represent information. 06:31:13.240 |
And there's actually evidence like in bats, for example, 06:31:16.800 |
So timing codes of like exactly when particular neurons fire 06:31:20.080 |
is the mechanism of information representation. 06:31:25.040 |
there's substantial evidence that it's rate coding, 06:31:31.080 |
So then if the brain is representing information 06:31:33.600 |
by changing the sort of frequency of a neuron firing, 06:31:39.640 |
between sort of the baseline state of the neuron 06:31:44.760 |
and what has also been observed in academic work 06:31:49.560 |
if you imagine that analogy for like measuring flour 06:31:54.640 |
that baseline state of how much the pot weighs 06:32:02.160 |
you're gonna get a different measurement different days 06:32:03.680 |
because you're measuring with different pots. 06:32:05.320 |
So that baseline rate shifting is really the thing that, 06:32:08.560 |
at least from a first order description of the problem 06:32:15.040 |
but at least at a very first order description 06:32:16.680 |
of the problem, that's what we observed day to day 06:32:18.560 |
is that the baseline firing rate of any particular neuron 06:32:21.240 |
or observed on a particular channel is changing. 06:32:30.720 |
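One generic way to handle that kind of baseline drift (not necessarily what Neuralink ships) is to re-estimate each channel's resting statistics from a recent quiet window and normalize against them, like re-weighing the empty pot before measuring the flour.

```python
import numpy as np

def normalize_firing_rates(binned_counts, baseline_window):
    # baseline_window: recent (timesteps, channels) data assumed to be near
    # the resting state; binned_counts: the current (channels,) observation.
    mu = baseline_window.mean(axis=0)
    sigma = baseline_window.std(axis=0) + 1e-6
    return (binned_counts - mu) / sigma
```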
So with monkeys, we have found various ways to do this. 06:33:01.560 |
between some open loop task and some closed loop task 06:33:16.320 |
And so the exact sort of difference in context 06:33:27.920 |
and what you're trying to use at closed loop time. 06:33:37.800 |
to be able to move the mouse cursor effectively 06:33:46.040 |
- Kicking your ass and talk trash while doing it. 06:33:51.680 |
And yes, if you're trying to normalize to the baseline, 06:34:00.960 |
for folks that aren't familiar with assistive technology, 06:34:12.280 |
I actually was not confident, before Noland, that this would be 06:34:12.280 |
a profoundly transformative technology for people like him. 06:34:25.400 |
Even if you can just offer the same level of control 06:34:31.320 |
but you don't need to have that thing in your face. 06:34:33.080 |
You don't need to be positioned a certain way. 06:34:40.320 |
That level of independence is so game-changing for people. 06:34:43.640 |
It means that they can text a friend at night privately 06:34:47.400 |
It means that they can open up and browse the internet 06:34:58.360 |
And this is even before we start talking about folks 06:35:16.080 |
How much magic, how much science, how much art? 06:35:18.600 |
How difficult is it to come up with a decoder 06:35:22.600 |
that figures out what these sequence of spikes mean? 06:35:29.640 |
There's a couple of different ways to answer this. 06:35:34.160 |
and then I'll go down one of the rabbit holes. 06:35:35.960 |
And so the zoomed out view is that building the decoder 06:35:38.680 |
is really the process of building the dataset 06:35:45.280 |
The direction, I think, of further improvement 06:35:48.400 |
of how do you construct the optimal labels for the model? 06:35:54.560 |
And so I'll go briefly down the second rabbit hole. 06:36:08.480 |
The user is trying to control something on the screen 06:36:19.080 |
So for example, if you just look at validation loss 06:36:26.320 |
Not all of them are equally controllable by the end user. 06:36:33.120 |
that help you capture the thing that actually matters, 06:36:41.800 |
than just a standard supervised learning problem. 06:36:49.680 |
that translate brain data to velocity outputs, for example. 06:37:01.080 |
just fully connected networks to decode the brain activity. 06:37:06.960 |
the relative performance in online control sessions 06:37:09.680 |
of sort of 1D convolution over the input signal. 06:37:22.920 |
You can actually get better validation metrics, 06:37:35.520 |
Now, it turns out that when using that model online, 06:37:37.600 |
the controllability was worse, was far worse, 06:37:42.320 |
And there can be many ways to interpret that, 06:37:48.280 |
that if you were to just throw a bunch of compute 06:37:56.280 |
or come up with or invent many different solutions, 06:38:03.680 |
There's still some artistry left to be uncovered here 06:38:05.360 |
of how to get your model to scale with more compute. 06:38:08.600 |
And that may be fundamentally a labeling problem, 06:38:10.520 |
but there may be other components to this as well. 06:38:22.560 |
- Yeah, I think it's data quality constrained, 06:38:30.880 |
I mean, 'cause it has to be trained on the interactions. 06:38:41.120 |
let's say the simplest example of just 2D velocity, 06:38:42.960 |
then I think, yeah, data quality is the main thing. 06:38:47.240 |
that lets you do all the inputs to the computer 06:38:57.560 |
but when you're building the left click model, 06:39:11.760 |
and when he stopped, the click signal went up. 06:39:12.960 |
So again, there's a contamination between the two inputs. 06:39:17.800 |
he was trying to do sort of a left click and drag 06:39:37.600 |
has not seen this kind of variability before. 06:39:40.000 |
So you need to find some way to help the model with that. 06:39:43.000 |
'cause it feels like all of this is very solvable, 06:39:46.520 |
- Yes, it is fundamentally an engineering challenge. 06:39:50.440 |
that it may not need fundamentally new techniques, 06:39:57.720 |
using CTC loss, for example, internal to Siri, 06:39:57.720 |
they could potentially have very applicable skills to this. 06:40:06.200 |
in the future development of the software stack 06:40:16.240 |
like something I'm excited about from the technology side 06:40:19.320 |
how this technology is going to be best situated 06:40:24.200 |
On the technology entering the world side of things, 06:40:28.400 |
how this device works for folks that cannot speak at all, 06:40:32.160 |
that have no ability to sort of bootstrap themselves 06:40:35.000 |
into useful control by voice command, for example, 06:40:37.680 |
and are extremely limited in their current capabilities. 06:40:40.320 |
I think that will be an incredibly useful signal for us 06:40:42.080 |
to understand, I mean, really what is an existential type 06:40:44.840 |
for all startups, which is product market fit. 06:40:46.840 |
Does this device have the capacity and potential 06:40:48.880 |
to transform people's lives in the current state? 06:40:57.200 |
for the next year or so of clinical trial operations. 06:41:01.720 |
I'm quite excited about basically everything we're doing. 06:41:07.040 |
The most prominent one I would say is scaling channel count. 06:41:10.280 |
So right now we have a thousand channel device. 06:41:14.600 |
And I would expect that curve to continue in the future. 06:41:24.480 |
And so I'm excited about the clarity of gradient 06:41:26.200 |
that that gives us in terms of the user experiences 06:41:28.280 |
we choose to focus our time and resources on. 06:41:32.640 |
yeah, even things as simple as non-stationarity, 06:41:34.560 |
like does that problem just completely go away at that scale 06:41:36.680 |
or do we need to come up with new creative UX's 06:41:44.320 |
the set of functions that you can output from one brain, 06:41:47.200 |
how to deal with all the nuances of both the user experience 06:41:53.080 |
but still needing to be able to modulate all of them 06:41:58.720 |
to feedback loops so how can you make that intuitive 06:42:00.240 |
for a user to control a high dimensional control surface 06:42:04.720 |
I think that's gonna be a super interesting problem. 06:42:28.600 |
which is when you stick more stuff in the brain 06:42:39.080 |
how to most efficiently insert electrodes in the future. 06:42:47.760 |
every single day and what we're working on right now. 06:42:53.320 |
that a thousand electrodes is where it saturates. 06:43:05.760 |
and this is where like the true breakthroughs happen. 06:43:13.480 |
some thoughts are most precisely described in poetry. 06:43:19.040 |
- I think it's because the information bottleneck 06:43:35.240 |
Like if you can express a sentiment such that 06:43:39.520 |
the actual true underlying meaning and beauty 06:43:43.240 |
of the thing that you're trying to get across, 06:43:45.200 |
that sort of the generator function in their brain 06:43:46.880 |
is more powerful than what language can express. 06:43:48.520 |
And so the mechanism of poetry is really just to 06:43:48.520 |
- So being literal sometimes is a suboptimal compression 06:44:02.320 |
- And it's actually in the process of the user 06:44:09.800 |
It's also like when you look at a beautiful painting, 06:44:14.760 |
It's the thought process that occurs when you see that, 06:44:19.800 |
- Yeah, it's resonating with some deep thing within you 06:44:26.720 |
and was able to convey that through the pixels. 06:44:29.480 |
And that's actually gonna be relevant for full-on telepathy. 06:44:32.820 |
You know, it's like if you just read the poetry literally, 06:44:38.240 |
that doesn't say much of anything interesting. 06:44:50.920 |
within the context of the collective intelligence 06:44:53.160 |
of the human species that makes that poem make sense. 06:44:59.880 |
the signal that carries from human to human meaning 06:45:43.080 |
you haven't achieved AGI or something like that. 06:45:45.080 |
- Do you not think that's like some Next Token 06:45:46.880 |
entropy surprise kind of thing going on there? 06:45:54.100 |
And yeah, I do wonder if there is some element 06:45:57.640 |
of the Next Token surprise factor going on there. 06:46:03.240 |
are basically you have some repeated structure 06:46:06.440 |
It's like, okay, clause one, two, three is one thing. 06:46:12.280 |
And they kind of play with exactly when the surprise happens 06:46:25.960 |
This is especially true in classical music heritage. 06:46:40.400 |
And knowing which rules to break is the important part. 06:46:44.000 |
And fundamentally, it must be about the listener 06:46:49.200 |
It's about the audience member perceiving that 06:46:52.920 |
- What do you think is the meaning of human existence? 06:46:57.640 |
- There's a TV show I really like called "The West Wing." 06:47:15.440 |
And the president says, "Yeah, but it also says A, B, C." 06:47:19.440 |
And the person says, "Well, do you believe the Bible 06:47:23.280 |
And the president says, "No, I don't believe it." 06:47:25.840 |
And he says, "Yes, but I also think that neither of us 06:47:31.000 |
I think, like the analogy here for the meaning of life 06:47:34.840 |
is that largely, we don't know the right question to ask. 06:47:39.760 |
sort of the "Hitchhiker's Guide to the Galaxy" version 06:47:48.200 |
it's much more likely we find the meaning of human existence. 06:48:00.760 |
or generally of consciousness and conscious beings 06:48:04.080 |
So again, I think I'll take the I don't know card here, 06:48:07.960 |
but say I do think there are meaningful things we can do 06:48:10.520 |
that improve the likelihood of answering that question. 06:48:26.640 |
in a very painful way when you try to communicate 06:48:31.280 |
Because a lot of the time, the last thing to go 06:48:33.520 |
is they have the ability to somehow wiggle a lip 06:48:35.920 |
or move something that allows them to say yes or no. 06:48:41.200 |
that what matters is, are you asking them the right question 06:48:46.560 |
Well, Bliss, thank you for everything you do. 06:49:03.400 |
the first human being to have a Neuralink device 06:49:18.260 |
- It was sort of a freak thing that happened. 06:49:32.520 |
take the rest of the plunge under the wave or something. 06:49:41.320 |
I did it running into the water with a couple of guys. 06:49:44.880 |
And so my idea of what happened is really just that 06:50:06.780 |
And so I was face down in the water for a while. 06:50:19.780 |
People, I don't know if they like that I say that. 06:50:30.560 |
Like I'm a very relaxed sort of stress-free person. 06:50:45.860 |
It's like, all right, well, what can I do next? 06:50:57.380 |
to try to get healed, to try to get off a ventilator, 06:51:02.500 |
learn as much as I could so I could somehow survive. 06:51:11.400 |
And then thank God I had my family around me. 06:51:26.700 |
More than I can ever thank them for, honestly. 06:51:36.980 |
of providing for them or honestly just don't want to. 06:51:41.300 |
And so they get placed somewhere in some sort of home. 06:51:58.460 |
you'll end up with one or two friends from high school 06:52:10.420 |
and we still get together, all of us, twice a year. 06:52:14.180 |
We call it the spring series and the fall series. 06:52:17.380 |
This last one we all did, we dressed up like X-Men. 06:52:21.140 |
So I did Professor Xavier and it was freaking awesome. 06:52:27.180 |
So yeah, I have such a great support system around me. 06:52:51.460 |
- It's beautiful to see that you see the silver lining 06:52:58.420 |
Do you remember the moment when you first realized 06:53:06.260 |
Right when I, whatever, something hit my head, 06:53:10.080 |
I tried to get up and I realized I couldn't move 06:53:16.260 |
I'm like, all right, I'm paralyzed, can't move. 06:53:19.580 |
If I can't get up, I can't flip over, can't do anything, 06:53:26.860 |
And I knew I couldn't hold my breath forever. 06:53:30.340 |
So I just held my breath and thought about it 06:53:37.900 |
I've heard from other people that, onlookers, I guess, 06:53:37.900 |
the two girls that pulled me out of the water 06:54:04.820 |
that's what my situation was from here on out. 06:54:11.520 |
When I was in the hospital, right before surgery, 06:54:17.840 |
I had brought her with me from college to camp 06:54:22.340 |
And I was like, hey, it's gonna be fine, don't worry. 06:54:25.740 |
I was cracking some jokes to try to lighten the mood. 06:54:36.140 |
'cause at least she'll have some answers then, 06:54:40.500 |
And I didn't want her to be stressed through the whole thing. 06:54:45.060 |
And then, when I first woke up after surgery, 06:54:50.420 |
They had me on fentanyl like three ways, which was awesome. 06:55:01.900 |
and it was still the best I've ever felt on drugs. 06:55:09.700 |
And I remember the first time I saw my mom in the hospital, 06:55:16.700 |
I had like ventilator in, like I couldn't talk or anything. 06:55:35.620 |
But yeah, I never had like a moment of, you know, 06:55:48.560 |
It was always just, I hate that I have to do this, 06:55:52.660 |
but like sitting here and wallowing isn't gonna help. 06:56:04.820 |
I mean, there are days when I don't really feel 06:56:14.380 |
I've more so just wanted to try to do anything possible 06:56:24.300 |
But at the beginning, there were some ups and downs. 06:56:26.780 |
There were some really hard things to adjust to. 06:56:29.180 |
First off, just like the first couple of months, 06:56:32.500 |
the amount of pain I was in was really, really hard. 06:56:36.180 |
I mean, I remember screaming at the top of my lungs 06:56:39.380 |
in the hospital because I thought my legs were on fire. 06:56:48.220 |
I asked them to give me as much pain meds as possible, 06:57:00.500 |
it's hard realizing things that I wanted to do in my life 06:57:11.240 |
and I just don't think that I could do it now 06:57:14.820 |
Maybe it's possible, but I'm not sure I would ever 06:57:29.260 |
I was a huge athlete growing up, so that was pretty hard. 06:57:37.900 |
There's something really special about being able 06:58:00.620 |
basically as much as you're ever gonna get back 06:58:35.460 |
but I was never depressed for long periods of time. 06:58:51.140 |
My understanding that it was all for a purpose. 06:58:57.060 |
involving Neuralink, even if that purpose was, 06:59:05.060 |
And I think it's a really, really popular story 06:59:07.180 |
about how Job has all of these terrible things 06:59:20.340 |
that they're the ones going through something terrible 06:59:22.920 |
and they just need to praise God through the whole thing 06:59:36.880 |
that gets killed or kidnapped or taken from him. 06:59:43.720 |
that happen to those around you who you love. 06:59:49.340 |
and she has to get through something extraordinarily hard 06:59:57.300 |
as best as possible for her because she's the one 07:00:01.980 |
that's really going through this massive trial. 07:00:06.500 |
And obviously my family, my family and my friends, 07:00:33.840 |
I mean, I've just always thought I could do better 07:00:38.860 |
and thought I could do anything I ever wanted to do. 07:00:44.740 |
Like whatever I set my mind to, I felt like I could do it. 07:00:52.500 |
I wanted to travel around and be sort of like a gypsy 07:01:06.140 |
and then going and being a fisherman in Italy, 07:01:31.380 |
she's like the most positive, energetic person in the world. 07:01:50.480 |
- It's just great to see that cynicism didn't take over, 07:01:59.280 |
that you're not gonna let this keep you down? 07:02:02.680 |
Also like I just, it's just kind of how I am. 07:02:06.680 |
I just, like I said, I roll with the punches with everything. 07:02:15.160 |
And whenever I'd see people getting stressed, 07:02:27.380 |
Like just don't stress and everything will be fine. 07:02:37.640 |
But I just don't think stress has had any place in my life 07:02:43.160 |
- What was the experience like of you being selected 07:02:50.200 |
a Neuralink device implanted in your brain? 07:02:50.200 |
It's gonna be the worst version ever in a person. 07:03:28.000 |
I could just tell them like, okay, find someone else 07:03:37.120 |
There's something about being the first one to do something. 07:03:43.520 |
that I would like to do something for the first time, 07:03:51.840 |
I think my like faith had a huge part in that. 07:03:56.840 |
I always felt like God was preparing me for something. 07:04:09.160 |
about not wanting to do any of this as a quadriplegic. 07:04:12.880 |
I told him, you know, I'll go out and talk to people. 07:04:15.560 |
I'll go out and travel the world and talk to, you know, 07:04:19.320 |
stadiums, thousands of people give my testimony. 07:04:24.600 |
Don't make me do all of this in a chair, that sucks. 07:04:33.760 |
I always felt like there was something going on 07:04:49.040 |
how the stars sort of aligned with all of this. 07:04:49.040 |
It just told me like, as the surgery was getting closer, 07:04:58.240 |
it just told me that, you know, it was all meant to happen. 07:05:03.960 |
And so I shouldn't be afraid of anything that's to come. 07:05:07.560 |
And so I wasn't, I kept telling myself, like, you know, 07:05:11.360 |
you say that now, but as soon as the surgery comes, 07:05:17.120 |
and brain surgery is a big deal for a lot of people, 07:05:24.840 |
The amount of times I've been like, thank you God, 07:05:27.120 |
that you didn't take my brain and my personality 07:05:29.360 |
and my ability to think, my like love of learning, 07:05:33.400 |
like my character, everything, like, thank you so much. 07:05:39.540 |
And I was about to let people go like root around in there. 07:05:43.940 |
Like, hey, we're going to go like put some stuff 07:05:48.280 |
And so it was, it was something that gave me pause. 07:05:51.800 |
But like I said, how smoothly everything went, 07:05:54.920 |
I never expected for a second that anything would go wrong. 07:05:58.760 |
Plus the more people I met on the Barrow side 07:05:58.760 |
they're just the most impressive people in the world. 07:06:06.920 |
Like I can't speak enough to how much I trust these people 07:06:12.000 |
with my life and how impressed I am with all of them. 07:06:19.500 |
to like walk into a room and roll into a room 07:06:40.020 |
Like, I don't know, it's so, it's so rewarding 07:06:56.840 |
but the surgery approach the night before, the morning of, 07:07:03.900 |
I think I said that, something like that to Elon 07:07:06.920 |
on the phone beforehand, we were like FaceTiming. 07:07:16.860 |
So we woke up, I think we had to be at the hospital 07:07:37.820 |
Elon was supposed to be there in the morning, 07:07:44.460 |
Had one of the greatest one-liners of my life. 07:08:18.820 |
Woke up and played a bit of a prank on my mom. 07:08:39.660 |
I was like, I would really like to play a prank on my mom. 07:09:05.900 |
And we just do very mean things to her all the time. 07:09:14.480 |
But right after surgery, I was really worried 07:09:18.580 |
that I was going to be too groggy, not all there. 07:09:35.020 |
I was really worried that I was going to start, 07:09:41.740 |
and I wouldn't even know, I wouldn't remember. 07:09:44.820 |
So I was like, please God, don't let that happen. 07:09:48.460 |
And please let me be there enough to do this to my mom. 07:10:37.220 |
She still says she's gonna get me back someday, 07:10:58.900 |
- It's actually a way to show that you're still there, 07:11:06.580 |
- What was the first time you were able to feel 07:11:34.180 |
that were recording some of my neuron spikes. 07:11:44.700 |
My first thought was, I mean, if they're firing now, 07:11:51.340 |
So I started trying to like wiggle my fingers, 07:11:53.820 |
and I just started like scanning through the channels. 07:12:03.060 |
on like top row, like third box over or something. 07:12:26.980 |
Like this is what's supposed to happen, right? 07:12:35.260 |
- And then seeing like that you can notice something, 07:12:36.900 |
and then when you did the index finger, you're like, "Oh." 07:12:39.460 |
- Yeah, I was wiggling kind of all of my fingers 07:12:47.580 |
but that big yellow spike was the one that stood out to me. 07:12:50.620 |
Like I'm sure that if I would've stared at it long enough, 07:12:54.180 |
I could've mapped out maybe a hundred different things, 07:12:57.740 |
but the big yellow spike was the one that I noticed. 07:13:07.780 |
the cognitive effort required to sort of wiggle 07:13:29.260 |
because that's going to create new neural pathways 07:13:32.240 |
or pathways in my spinal cord to reconnect these things, 07:13:59.020 |
I got some bicep control back, and that's about it. 07:14:04.020 |
I can, if I try enough, I can wiggle some of my fingers. 07:14:09.160 |
Not like on command, it's more like if I try to move, 07:14:15.100 |
say, my right pinky, and I just keep trying to move it, 07:14:22.220 |
I know, and that happens with a few different 07:14:33.220 |
when I was in the hospital, came in and told me 07:14:35.420 |
for one guy who had recovered most of his control, 07:14:39.660 |
what he thought about every day was actually walking, 07:14:43.220 |
like the act of walking, just over and over again. 07:14:48.660 |
I tried just imagining walking, which is, it's hard. 07:14:53.660 |
It's hard to imagine all of the steps that go into, 07:14:59.000 |
well, taking a step, all of the things that have to move, 07:15:03.020 |
like all of the activations that have to happen 07:15:06.580 |
along your leg in order for one step to occur. 07:15:09.780 |
- But you're not just imagining, you're doing it, right? 07:15:13.980 |
So it's imagining over again what I had to do 07:15:26.860 |
You don't think about all of the different things 07:15:31.220 |
So I had to recreate that in my head as much as I could, 07:15:35.120 |
and then I practiced it over and over and over. 07:15:37.700 |
- So it's not like a third-person perspective. 07:15:42.060 |
It's not like you're imagining yourself walking. 07:15:53.540 |
- Like frustrating hard, or actually cognitively hard? 07:15:58.660 |
There's a scene in one of the Kill Bill movies, actually, 07:16:10.300 |
I don't know, from a drug that was in her system, 07:16:17.220 |
and she stares at her toe, and she says, "Move." 07:16:22.420 |
And after a few seconds on screen, she does it, 07:16:27.300 |
and she did that with every one of her body parts 07:16:31.700 |
I did that for years, just stared at my body and said, 07:16:49.260 |
And it's hard because it actually is like taxing, 07:16:56.660 |
which is something I would have never expected, 07:17:04.760 |
I don't know, the only way I can describe it is 07:17:07.500 |
there are like signals that aren't getting through 07:17:26.120 |
in whatever body part that I'm trying to move, 07:17:28.560 |
and they just build up and build up and build up 07:17:51.100 |
And then, you know, if you try to stare at a body part 07:18:14.860 |
I wasn't able to control any of my environment. 07:18:20.460 |
a lot of what I was doing was staring at walls. 07:18:47.860 |
It's something that I talked about the other day 07:18:50.060 |
at the All Hands that I did at Neuralink's Austin facility. 07:19:00.300 |
I went to school at Texas A&M, so I've been around before. 07:19:11.460 |
of what they've had me do, especially at the beginning, 07:19:25.700 |
And that's how they sort of train the algorithm 07:19:31.940 |
And so it made things very seamless for me, I think. 07:19:39.300 |
So it's amazing to know, 'cause I've learned a lot 07:19:44.380 |
like with the interface and everything like that. 07:19:46.780 |
It's cool to know that you've been essentially 07:19:48.940 |
like training to be world-class at that task. 07:20:05.060 |
because I've heard other paralyzed people say, 07:20:10.220 |
They tell you two years, but you just never know. 07:20:17.820 |
So I've heard other people say, "Don't give up." 07:20:30.780 |
and she'd been trying to wiggle her index finger 07:20:34.980 |
and she finally got it back like 18 years later. 07:20:34.980 |
I just, I do it when I'm lying down watching TV. 07:20:44.580 |
I'll find myself doing it kind of just almost on its own. 07:20:48.780 |
It's just something I've gotten so used to doing 07:20:51.060 |
that, I don't know, I don't think I'll ever stop. 07:21:03.200 |
but there's that Olympic-level nervous system 07:21:17.820 |
like I can't show my appreciation for it enough, 07:21:25.700 |
that what I'm doing is actually having some effect. 07:21:30.460 |
It's a huge part of the reason why I know now 07:21:38.460 |
because before Neuralink, I was doing it every day, 07:21:42.460 |
and I was just assuming that things were happening. 07:21:47.700 |
I wasn't getting back any mobility or sensation or anything, 07:21:52.620 |
so I could have been running up against a brick wall 07:21:58.580 |
I get to see all the signals happening real-time, 07:22:02.620 |
and I get to see that what I'm doing can actually be mapped. 07:22:07.620 |
When we started doing click calibrations and stuff, 07:22:11.700 |
when I go to click my index finger for a left click, 07:22:29.180 |
that there's still a powerhouse of a brain there 07:22:34.900 |
that brain is, I mean, that's the most important thing 07:22:44.100 |
and saw the environment respond like that little thing, 07:22:55.380 |
I keep telling this to people, it made sense to me. 07:23:03.780 |
and that as long as you had something near it 07:23:07.280 |
that could measure those, that could record those, 07:23:09.700 |
then you should be able to visualize it in some way, 07:23:22.060 |
It was cool to see that their technology worked, 07:23:26.080 |
and that everything that they'd worked so hard for 07:23:30.860 |
But I hadn't moved a cursor or anything at that point. 07:23:42.180 |
I didn't really know much about BCI at that point either. 07:23:55.180 |
okay, this is, it's cool that we got this far, 07:24:02.220 |
It's like, okay, I just thought that they knew 07:24:34.460 |
- I know it must've been within the first maybe week, 07:24:48.780 |
Like it was like, okay, well, how do I explain this? 07:24:58.900 |
it's easy to say, okay, like I did something cool. 07:25:13.960 |
So again, I knew that me trying to move a body part 07:25:35.140 |
and then take that and give me cursor control. 07:25:38.820 |
I don't know like all the ins and outs of it, 07:25:41.300 |
but I was like, there are still signals in my brain firing. 07:25:44.580 |
They just can't get through because there's like a gap 07:25:49.100 |
And so they just, they can't get all the way down 07:25:53.500 |
So when I moved the cursor for the first time, 07:26:06.300 |
with just my mind without like physically trying to move. 07:26:10.820 |
So I guess I can get into that just a little bit, 07:26:12.820 |
like the difference between attempted movement 07:26:20.540 |
So like attempted movement is me physically trying 07:26:28.140 |
I try to attempt to move my hand to the right, 07:26:33.960 |
Attempt to, you know, like lift my finger up and down, 07:26:39.760 |
I'm physically trying to do all of those things, 07:26:57.320 |
when they were going to give me cursor control. 07:27:01.620 |
it was attempt to do this, attempt to do that. 07:27:04.740 |
When Nir was telling me to like imagine doing it, 07:27:24.600 |
and they said, okay, write your name with this pencil. 07:27:30.020 |
Like, okay, now imagine writing your name with that pencil. 07:27:33.180 |
Kids would think like, I guess, like that kind of makes sense 07:27:38.020 |
and they would do it, but that's not something we're taught. 07:27:43.420 |
We think about like thought experiments and things, 07:27:46.460 |
but that's not like a physical action of doing things. 07:27:50.180 |
It's more like what you would do in certain situations. 07:27:53.180 |
So imagine movement, it never really connected with me. 07:28:01.300 |
like swinging a baseball bat or swinging like a golf club, 07:28:06.820 |
but then you go right to that and physically do it. 07:28:17.220 |
So telling me to imagine something versus attempting it, 07:28:20.340 |
it just, there wasn't a lot that I could do there mentally. 07:28:24.460 |
I just kind of had to accept what was going on and try. 07:28:28.720 |
But the attempted movement thing, it all made sense to me. 07:28:34.740 |
then there's a signal being sent in my brain. 07:28:39.740 |
then they should be able to map it to what I'm trying to do. 07:28:42.700 |
And so when I first moved the cursor like that, 07:28:50.740 |
- But can you clarify, is there supposed to be a difference 07:28:52.900 |
between imagined movement and attempted movement? 07:29:05.340 |
is that supposed to be a different part of the brain 07:29:06.860 |
that lights up in those two different situations? 07:29:10.340 |
I think all these signals can still be represented 07:29:12.380 |
in motor cortex, but the difference I think has to do 07:29:15.100 |
with the naturalness of imagining something versus- 07:29:18.900 |
- Attempting it and sort of the fatigue of that over time. 07:29:23.060 |
So like, this is just different ways to prompt you 07:29:27.740 |
to kind of get to the thing that you're around. 07:29:32.020 |
- Attempted movement does sound like the right thing to try. 07:29:37.740 |
- 'Cause imagine for me, I would start visualizing, 07:29:42.780 |
Attempted, I would actually start trying to like, 07:29:49.300 |
When I'm imagining a move, see, I'm like moving my muscle. 07:29:55.460 |
- Like there is a bit of an activation almost, 07:29:58.140 |
versus like visualizing yourself like a picture doing it. 07:30:01.620 |
- Yeah, it's something that I feel like naturally 07:30:05.500 |
If you try to tell someone to imagine doing something, 07:30:21.100 |
It worked just like it should. Worked like a charm. 07:30:21.100 |
but there was a swear word that came out of your mouth 07:30:35.380 |
- Yeah, that's, it blew my mind, like no pun intended, 07:30:47.540 |
just with my thoughts and not attempting to move. 07:30:51.900 |
It's something that I found like over the couple of weeks, 07:31:11.380 |
like I don't have to attempt as much to move it. 07:31:15.900 |
And part of that is something that I'd even talked 07:31:25.260 |
I was watching when I like attempted to move to the right 07:31:29.220 |
and I watched the screen as like, I saw the spikes. 07:31:33.020 |
Like I was seeing the spike, the signals being sent 07:31:42.180 |
when you go to say move your hand or any body part, 07:31:45.540 |
that signal gets sent before you're actually moving, 07:31:53.740 |
And I noticed that there was something going on in my brain 07:32:00.620 |
that my brain was like anticipating what I wanted to do. 07:32:16.180 |
like always in the back, like that's so weird 07:32:20.500 |
It kind of makes sense, but I wonder what that means 07:32:26.520 |
And, you know, and then as I was playing around 07:32:31.520 |
with the attempted movement and playing around 07:32:42.580 |
and what I wanted it to do, like cursor movements, 07:32:45.340 |
what I wanted to do a bit better and a bit better. 07:32:51.720 |
as I was playing WebGrid, I like looked at a target 07:32:56.720 |
before I had started like attempting to move. 07:33:09.240 |
I know I can like maybe be a bit quicker getting there. 07:33:12.000 |
And I looked over and the cursor just shot over. 07:33:24.040 |
I was like, guys, do you know that this works? 07:33:28.440 |
which like they'd all been saying this entire time. 07:33:31.620 |
Like, I can't believe like you're doing all this 07:33:34.640 |
I'm like, yeah, but is it really with my mind? 07:33:36.440 |
Like I'm attempting to move and it's just picking that up. 07:33:40.840 |
But when I moved it for the first time like that, 07:33:58.400 |
And it just opened up a whole new world of possibilities 07:34:02.320 |
of like what could possibly happen with this technology 07:34:05.960 |
and what I might be able to be capable of with it. 07:34:12.280 |
Like you're controlling a digital device with your mind. 07:34:21.360 |
I've seen like scientists talk about like a big aha moment, 07:34:39.520 |
for like the world at large or like this field at large, 07:34:50.080 |
And so that's what I do like all the time now. 07:35:02.320 |
because I've found that there is some interplay with it 07:35:23.920 |
I can just completely think about whatever I'm doing, 07:35:31.200 |
I also like to just experiment with these things. 07:35:33.680 |
Like every now and again, I'll get this idea in my head, 07:35:40.800 |
by the way, I wasn't doing that like you guys wanted me to. 07:35:44.280 |
I was, I thought of something and I wanted to try it. 07:35:49.560 |
So maybe we should like explore that a little bit. 07:35:51.960 |
- So I think that discovery is not just for you, 07:35:58.640 |
whoever uses a Neuralink, that this is possible. 07:36:18.880 |
that paves the way to like, anyone can now do it. 07:36:39.560 |
Describing the app, can you just describe how it works? 07:36:43.800 |
- Yeah, so it's just an app that Neuralink created 07:36:51.960 |
So on the link app, there are a few different settings 07:36:56.440 |
and different modes and things I can do on it. 07:37:05.960 |
Calibration is how I actually get cursor control. 07:37:25.280 |
So it would be, you know, five minutes in calibration 07:37:31.600 |
And then if I'm in it for 10 minutes and 15 minutes, 07:37:49.840 |
And then you also talked about sometimes you'll play 07:37:57.760 |
So Snake is kind of like my litmus test for models. 07:38:09.960 |
It's also how I like connect to the computer, 07:38:13.160 |
So they've given me a lot of like voice controls 07:38:18.360 |
So I can, you know, say like connect or implant disconnect. 07:38:28.040 |
So the charger is also how I connect to the Link App, 07:38:31.720 |
I have to have the implant charger over my head 07:38:42.960 |
I think there's a setting to like wake it up every, 07:38:48.960 |
So we could set it to half an hour or five hours 07:38:51.560 |
or something if I just want it to wake up periodically. 07:39:04.000 |
I have like, I made them give me like a little homework tab 07:39:07.680 |
because I am very forgetful and I forget to do things a lot. 07:39:12.680 |
So I have like a lot of data collection things 07:39:18.120 |
- Is the body mapping part of the data collection 07:39:22.040 |
It's something that they want me to do daily, 07:39:26.880 |
'cause I've been doing so much media and traveling so much. 07:39:34.920 |
for how much I've been slacking on my homework. 07:39:38.360 |
But yeah, it's just something that they want me 07:39:44.240 |
how well the Neuralink is performing over time 07:39:53.920 |
and show like, hey, this is what the Neuralink, 07:39:58.840 |
day one versus day 90 versus day 180 and things like that. 07:40:26.120 |
- So it'd be great to hear your side of the story. 07:40:34.160 |
The cursor will be moving on its own across the screen 07:40:43.400 |
And then the algorithm is training off of what, 07:40:48.120 |
like the signals it's getting are as I'm doing this. 07:40:51.000 |
There are a couple different ways that they've done it. 07:40:58.840 |
And the cursor will go from the middle to one side. 07:41:07.520 |
And they'll do that all the way around the circle. 07:41:10.240 |
And I will follow that cursor the whole time. 07:41:22.440 |
- Can you actually speak to when you say follow? 07:41:34.280 |
I think the better models as I progress through calibration 07:41:49.520 |
will create a model that makes it really effective 07:41:56.700 |
I've tried doing calibration with imagined movement, 07:42:01.700 |
and it just doesn't work as well for some reason. 07:42:12.200 |
I just move, I follow along wherever the cursor is 07:42:38.560 |
that we're doing calibration now might make it a bit better. 07:42:41.920 |
But what I've found is there will be a point in calibration 07:42:58.280 |
the first 15 minutes, I can't use imagined movement. 07:43:05.160 |
And after a certain point, I can just sort of feel it. 07:43:20.040 |
what I am going to do again before I go to do it. 07:43:23.900 |
And so using attempted movement for 15 minutes, 07:43:29.480 |
at some point I can kind of tell when I like move my eyes 07:43:42.680 |
I mean, you are a true pioneer in all of this. 07:43:49.880 |
And there's just, I imagine so many lessons learned 07:43:55.480 |
in all these kinds of different like super technical ways. 07:43:58.360 |
And it's also cool to hear that there's like a different 07:44:01.560 |
like feeling to the experience when it's calibrated 07:44:12.280 |
And that's why there's a different feeling to it. 07:44:14.240 |
And then trying to find the words and the measurements 07:44:19.800 |
But at the end of the day, you can also measure 07:44:21.680 |
that your actual performance, whether it's Snake or WebGrid, 07:44:27.880 |
And you're saying for the open loop calibration, 07:44:36.360 |
- So the open loop, you don't get the feedback 07:44:45.400 |
Like we've done it with a cursor and without a cursor 07:44:49.280 |
So sometimes it's just say for like the center out, 07:44:53.520 |
you'll start calibration with a bubble lighting up 07:45:01.760 |
And then when that bubble, when it's pushed towards 07:45:05.160 |
that bubble for say three seconds, a bubble will pop. 07:45:17.240 |
what they want me to do, like follow the yellow brick road, 07:45:27.160 |
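To make the open-loop calibration described here concrete: it amounts to collecting pairs of neural features and the direction the user is being prompted to push toward, then fitting a decoder on those pairs. The sketch below is a minimal illustration under generic assumptions (binned features, a ridge-regression linear decoder); it is not Neuralink's actual pipeline, and every function name here is hypothetical.

```python
# Hypothetical sketch of open-loop calibration: the cursor or bubble target
# moves on its own, the user attempts the corresponding movement, and
# (neural features, intended direction) pairs train a simple linear decoder.
import numpy as np

def collect_open_loop_pairs(feature_stream, cursor_positions, target_positions):
    """Pair each neural-feature bin with the direction from cursor to target.

    feature_stream:    (T, n_channels) binned neural features
    cursor_positions:  (T, 2) on-screen cursor position per bin
    target_positions:  (T, 2) position of the prompted target per bin
    """
    intended = target_positions - cursor_positions              # where the user is told to push
    norms = np.linalg.norm(intended, axis=1, keepdims=True)
    intended = intended / np.maximum(norms, 1e-6)                # unit "intended velocity"
    return feature_stream, intended

def fit_linear_decoder(X, Y, ridge=1.0):
    """Ridge regression from features X (T, C) to intended velocity Y (T, 2)."""
    C = X.shape[1]
    return np.linalg.solve(X.T @ X + ridge * np.eye(C), X.T @ Y)  # (C, 2) weight matrix

def decode_velocity(W, features, gain=1.0):
    """Closed-loop use: turn a single feature bin into a cursor velocity."""
    return gain * (features @ W)
```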
- They always feel so bad making me do calibration. 07:45:29.840 |
Like, oh, we're about to do a 40 minute calibration. 07:45:33.400 |
I'm like, all right, do you guys wanna do two of them? 07:45:36.240 |
Like I'm always asking to like whatever they need, 07:45:43.220 |
Like I get to lie there and, or sit in my chair 07:45:46.640 |
and like do these things with some great people. 07:45:57.500 |
I could throw something on on my TV in the background 07:46:00.220 |
and kind of like split my attention between them. 07:46:20.640 |
- Yeah, that's one thing that I really, really enjoy 07:46:26.580 |
Like the higher the BPS, the higher the score, 07:46:33.180 |
And so if I, I think I've asked at one point, 07:46:38.460 |
some sort of numerical feedback for calibration, 07:46:40.740 |
like I would like to know what they're looking at. 07:46:42.820 |
Like, oh, you know, it is, we see like this number 07:46:52.460 |
And I would love that because I would like to know 07:47:01.620 |
It doesn't actually mean that calibration is going well 07:47:06.200 |
So it's not like 100% and they don't wanna like skew 07:47:09.300 |
what I'm experiencing or want me to change things 07:47:22.380 |
and something that I really enjoy striving for 07:47:37.300 |
four or five, six seconds between me popping bubbles, 07:47:40.420 |
but towards the end, I like to keep it below like 1.5 07:47:43.780 |
or if I could get it to like one second between like bubbles 07:47:47.820 |
because in my mind that translates really nicely 07:47:52.180 |
where I know if I can hit a target one every second 07:47:59.420 |
That's a way to get a score on the calibrations, 07:48:07.680 |
And the closed loop can already start giving you a sense 07:48:13.780 |
- Yeah, so closed loop is when I first get cursor control 07:48:23.660 |
I am the dumbest person in the room every time 07:48:30.780 |
So I am actually now the one that is like finishing 07:48:37.960 |
I don't even know what the loop is, they've never told me. 07:48:40.100 |
They just say there is a loop and at one point it's open 07:48:52.300 |
- Well, yeah, they're trying to get that number down 07:48:55.740 |
That's what we've been working on a lot recently 07:49:00.000 |
So that way, if this is something that people need to do 07:49:03.220 |
on a daily basis, or if some people need to do 07:49:06.060 |
on like every other day basis or once a week, 07:49:11.060 |
they don't want people to be sitting in calibration 07:49:15.000 |
I think they wanted to get it down seven minutes or below, 07:49:20.260 |
It'd be nice if you never had to do calibration. 07:49:34.940 |
really good models, I'm in calibration 40 or 45 minutes. 07:49:39.400 |
And I don't mind, like I said, they always feel really bad, 07:49:43.680 |
but if it's gonna get me a model that can like break 07:49:51.340 |
So WebGrid, I saw a presentation where Bliss said 07:49:55.980 |
by March, you selected 89,000 targets in WebGrid. 07:50:15.900 |
- Yeah, I'd like to thank everyone who's helped me get here, 07:50:19.540 |
my coaches, my parents for driving me to practice 07:50:29.940 |
- The interviews with athletes are always like that exact. 07:50:44.180 |
They can make it as big or small as you can make a grid. 07:50:51.700 |
And it is a way for them to benchmark how good a BCI is. 07:51:03.860 |
and you're supposed to move the mouse to there 07:51:14.460 |
it's bits per second that you get every time you click one. 07:51:37.220 |
eight bits per second and you've recently broke that. 07:51:53.120 |
And I just had to wait until the latency calmed down 07:51:58.320 |
But I was at like 8.01 and then five seconds of lag. 07:52:03.320 |
And then the next like three targets I clicked 07:52:09.420 |
during that time of lag, I probably would've hit, 07:52:23.180 |
- So that's all you're thinking about right now? 07:52:30.560 |
I think, well, I know nine is very, very achievable. 07:52:34.380 |
I think 10 I could hit maybe in the next month. 07:52:39.140 |
Like I could do it probably in the next few weeks 07:52:41.880 |
- I think you and Elon are basically the same person. 07:52:55.300 |
And I could just tell there's some percentage 07:53:09.180 |
I mean, in a fundamental way, it's really inspiring. 07:53:12.660 |
And what you're doing is inspiring in that way. 07:53:27.260 |
Like the decoding, the software, the hardware, 07:53:38.000 |
- Well, that's also, that's part of the thing 07:53:48.540 |
when they went in and put this thing in my brain. 07:53:52.080 |
to make me more susceptible to these kinds of games, 07:53:55.580 |
to make me addicted to like WebGrid or something. 07:54:07.800 |
- He told me he like does it on the floor with peanut butter 07:54:17.080 |
- Noland, like the first time Noland played this game, 07:54:17.080 |
which really hampers my WebGrid playing ability. 07:54:36.700 |
Basically, I have to wait 0.3 seconds for every click. 07:54:55.140 |
I still hit like 50, I think I hit like 50 something trials, 07:55:00.060 |
net trials per minute in that, which was pretty good. 07:55:03.580 |
'Cause I'm able to like, there's one of the settings 07:55:10.620 |
in order to initiate a click, to start a click. 07:55:13.180 |
So I can tell sort of when I'm on that threshold 07:55:18.180 |
to start initiating a click just a bit early. 07:55:21.940 |
So I'm not fully stopped over the target when I go to click. 07:55:25.540 |
I'm doing it like on my way to the targets a little, 07:55:31.860 |
- Yeah, just a hair right before the targets. 07:55:37.180 |
But that's still, it sucks that there's a ceiling 07:55:41.820 |
- Well, I can get down to 0.2 and 0.1, 0.1 is what I get. 07:55:45.540 |
Yeah, and I've played with that a little bit too. 07:55:48.020 |
I have to adjust a ton of different parameters 07:55:52.060 |
And I don't have control over all that on my end yet. 07:55:55.340 |
It also changes like how the models are trained. 07:56:04.220 |
as I'm playing WebGrid based off of like the WebGrid data 07:56:08.420 |
that I'm, so like if I play WebGrid for 10 minutes, 07:56:21.060 |
The way that they interact, it's just much, much different. 07:56:49.100 |
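A minimal sketch of the dwell-cursor behavior described in this exchange, assuming a simple speed threshold and a 0.3-second dwell window: holding still long enough fires a click, and keeping the cursor moving suppresses accidental ones. The class, parameter names, and values are illustrative assumptions, not the Link app's real settings.

```python
# Hypothetical dwell-click state machine: feed in the cursor speed each
# frame; a click fires only after the cursor stays "still" for the dwell
# time, which is why constant small movements prevent accidental clicks.
import time

class DwellClicker:
    def __init__(self, dwell_seconds=0.3, speed_threshold=0.05):
        self.dwell_seconds = dwell_seconds      # how long the cursor must stay still
        self.speed_threshold = speed_threshold  # below this speed counts as dwelling
        self._still_since = None

    def update(self, cursor_speed, now=None):
        """Returns True on the frame a dwell click fires."""
        now = time.monotonic() if now is None else now
        if cursor_speed < self.speed_threshold:
            if self._still_since is None:
                self._still_since = now                      # started dwelling
            elif now - self._still_since >= self.dwell_seconds:
                self._still_since = None                     # fire the click and reset
                return True
        else:
            self._still_since = None                         # moving again, cancel the dwell
        return False
```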
We, before all the thread retraction stuff happened, 07:56:52.180 |
we were calibrating clicks, left click, right click. 07:56:58.180 |
before I broke the record again with the dwell cursor 07:57:00.500 |
was I think on a 35 by 35 grid with left and right click. 07:57:09.580 |
using multiple clicks 'cause it's more difficult. 07:57:14.580 |
You're supposed to do either a left click or a right click. 07:57:17.940 |
Is it different colors or something like this? 07:57:21.340 |
orange targets for right click is what they had done. 07:57:26.500 |
was with the blue and the orange targets, yeah. 07:57:36.220 |
and being able to like initiate clicks on my own, 07:57:43.740 |
- Yeah, you would start making Bliss nervous about his 17. 07:57:43.740 |
So what did it feel like with the retractions 07:58:00.740 |
The day they told me was the day of my big Neuralink tour 07:58:09.620 |
They told me like right before we went over there, 07:58:13.580 |
My initial reaction was, all right, go and fix it. 07:58:20.220 |
Like I went to sleep, couple hours later I woke up 07:58:33.660 |
they could go in and put in a new one like next day 07:58:51.260 |
I had, like it had opened up so many doors for me 07:58:58.660 |
I thought it would have been a cruel twist of fate 07:59:08.020 |
and then have it all come crashing down after a month. 07:59:11.060 |
And I knew like, say the top of the mountain, 07:59:16.660 |
I was just now starting to climb the mountain 07:59:32.180 |
I don't know, like five minute drive, whatever it is, 07:59:44.380 |
I'm not gonna let this ruin this amazing like tour 07:59:52.180 |
how much I appreciate all the work they're doing. 07:59:58.380 |
And I wanna go have one of the best days of my life. 08:00:10.140 |
And then for a few days, I was pretty down in the dumps. 08:00:18.620 |
I didn't know if it was ever gonna work again. 08:00:24.980 |
that even if I lost the ability to use the Neuralink, 08:00:41.100 |
If I needed to just do like some of the data collection 08:00:44.780 |
every day or body mapping every day for a year, 08:01:09.660 |
And everything that I'd done was just a perk. 08:01:22.340 |
- That said, you were able to get to work your way up, 08:01:27.460 |
So this is like going from Rocky I to Rocky II. 08:01:30.060 |
So when did you first realize that this is possible 08:01:38.780 |
to increase back up and beat your previous record? 08:01:44.260 |
- Again, this feels like I'm interviewing an athlete. 08:02:04.380 |
I think they had switched how they were measuring 08:02:18.660 |
So we're switching from sort of individual spike detection 08:02:24.820 |
with either me or DJ, you probably have some context. 08:02:24.820 |
And I saw the uptake in performance immediately. 08:02:44.140 |
Like I could feel it when they switched over. 08:02:48.300 |
Like everything up till this point for the last few weeks, 08:03:01.820 |
oh, I know I'm still only at like say in web grid terms, 08:03:05.740 |
like four or five BPS compared to my 7.5 before. 08:03:31.040 |
And so I just ran with it, never looked back. 08:03:39.740 |
What was the feedback loop on the figuring out 08:03:42.940 |
in a way that would actually work well for Nolan? 08:03:45.940 |
So maybe just describe first how the actual update worked. 08:03:50.780 |
So we just did an over-the-air software update 08:03:58.100 |
to record sort of averages of populations of neurons 08:04:02.700 |
So we have less resolution about which individual neuron 08:04:07.980 |
of what's going on nearby an electrode overall. 08:04:13.400 |
it was immediate when we flipped that switch. 08:04:16.880 |
you had three or four BPS right out of the box. 08:04:16.880 |
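For context on what switching from individual spike detection to "averages of populations of neurons" might look like in code, here is a hedged sketch contrasting threshold-crossing spike counts with an averaged spike-band power feature. The band edges, threshold, sampling rate, and use of SciPy are assumptions for illustration only, not the actual over-the-air update.

```python
# Hypothetical per-channel features: threshold-crossing spike counts versus
# a coarser spike-band power that summarizes activity near each electrode.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 20_000  # assumed sampling rate, Hz

def spike_counts(raw, threshold_std=-4.5):
    """Count negative threshold crossings per channel. raw: (samples, channels)."""
    thresh = threshold_std * raw.std(axis=0)
    below = raw < thresh
    crossings = below[1:] & ~below[:-1]          # falling-edge crossings only
    return crossings.sum(axis=0)

def spike_band_power(raw, low_hz=500, high_hz=5000):
    """Average power in the spike band per channel, a population-level feature."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=FS, output="sos")
    filtered = sosfiltfilt(sos, raw, axis=0)
    return (filtered ** 2).mean(axis=0)
```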
around like how to make this useful for independent use. 08:04:28.800 |
you can use it independently to do whatever you want. 08:04:39.160 |
And yeah, this is obviously the start of this journey still. 08:04:41.320 |
Hopefully we get back to the places where you're doing 08:04:46.160 |
much more fluidly everything and much more naturally 08:04:48.400 |
the applications that you're trying to interface with. 08:04:51.120 |
- And most importantly, get that web grid number up. 08:05:03.520 |
how hard is it to avoid accidentally clicking? 08:05:05.680 |
- I have to continuously keep it moving, basically. 08:05:16.760 |
and I have .3 seconds to move it before it clicks anything. 08:05:28.040 |
moving it back and forth to keep it from clicking stuff. 08:05:35.640 |
that I was, when I was not using the implant, 08:05:39.280 |
I was just moving my hand back and forth or in circles. 08:05:43.660 |
Like I was trying to keep the cursor from clicking 08:05:46.740 |
and I was just doing it while I was trying to go to sleep. 08:05:55.620 |
Like when you're gaming, accidentally click a thing? 08:06:08.020 |
was because of an accident. - Yeah, I misclicked, yeah. 08:06:12.560 |
any time you lose, you can just say it was accidental. 08:06:16.600 |
- You said the app improved a lot from version one 08:06:23.340 |
So can you just talk about the trial and error 08:06:30.300 |
Like what's that process like of going back and forth 08:06:36.220 |
- It's a lot of me just using it day in and day out 08:06:41.220 |
and saying, "Hey, can you guys do this for me? 08:06:51.020 |
I think a lot of it just doesn't occur to them, maybe, 08:06:56.020 |
until someone is actually using the app, using the implant. 08:07:03.420 |
Or it's very specific to even like me, maybe what I want. 08:07:24.880 |
they've added for me, like that's a dumb idea. 08:07:29.020 |
And so I'm really looking forward to get the next people on 08:07:33.340 |
because I guarantee that they're going to think of things 08:07:43.500 |
And then they're also gonna give me some pushback 08:07:45.940 |
about like, "Yeah, what you are asking them to do here, 08:07:55.120 |
But it's just a lot of different interactions 08:08:03.300 |
The internet, just with the computer in general, 08:08:11.740 |
So it's just me trying to use it as much as possible 08:08:14.820 |
and showing them what works and what doesn't work 08:08:21.980 |
and they usually create amazing things for me. 08:08:36.740 |
'Cause a lot of my feedback is like really dumb. 08:08:43.780 |
And they'll come back super well thought out. 08:08:46.900 |
And it's way better than anything I could have ever thought 08:09:03.980 |
'Cause you said they might have a different set 08:09:09.200 |
Would you be intimidated by their web grid performance? 08:09:15.140 |
- I hope day one they've wiped the floor with me. 08:09:50.580 |
Like I'm just excited to have other people to do this with 08:10:01.480 |
I don't know what kind of advice I could give them, 08:10:03.320 |
but if they have questions, I'm more than happy. 08:10:06.740 |
for the next participant in the clinical trial? 08:10:15.040 |
And that I hope they work really, really hard 08:10:28.380 |
And to go to Neuralink if they need anything. 08:10:33.060 |
Like they do absolutely anything for me that they can. 08:10:40.220 |
It puts my mind at ease for like so many things 08:10:50.620 |
And they're always there and that's really, really nice. 08:10:54.080 |
And so I just, I would tell them not to be afraid 08:10:57.560 |
to go to Neuralink with any questions that they have, 08:10:59.940 |
any concerns, anything that they're looking to do with this 08:11:04.540 |
and any help that Neuralink is capable of providing, 08:11:23.380 |
Maybe that's what I'll just start saying to people. 08:11:26.340 |
- Now you're a real pro athlete, just keep it short. 08:11:28.980 |
Maybe it's good to talk about what you've been able to do 08:11:38.380 |
Like the freedom you gain from this way of interacting 08:12:12.940 |
like needing them to help me with things, the better. 08:12:16.020 |
If I'm able to sit up on my computer all night 08:12:23.500 |
say like on my iPad, like in a position where I can use it 08:12:27.140 |
and then have to have them wait up for me all night 08:12:48.700 |
You know, just being able to have the freedom 08:12:51.500 |
to do things on my own at any hour of the day or night, 08:13:02.740 |
- When you're up at 2 a.m. playing WebGrid by yourself, 08:13:09.820 |
and there's just a light glowing and you're just focused. 08:13:22.580 |
- Yeah, generally it is me playing music of some sort. 08:13:30.860 |
And then it's also just like a race against time 08:13:37.700 |
how much battery percentage I have left on my implant. 08:13:44.180 |
which equates to, you know, X amount of time, 08:13:52.500 |
And so it's a little stressful when that happens. 08:14:15.860 |
and I'm like, it's really gonna screw me over. 08:14:17.220 |
So if I have to, if I'm gonna break this record, 08:14:27.820 |
go back into WebGrid and I'm like, all right, 08:14:42.640 |
Like, it's all I want when I'm playing WebGrid. 08:15:13.620 |
- Oh, multiple targets, that changes the thing. 08:15:18.820 |
times correct minus incorrect divided by time. 08:15:21.460 |
And so you can think of like different clicks 08:15:23.140 |
as basically doubling the number of active targets. 08:15:30.700 |
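Putting Bliss's description into a formula: net correct selections per unit time, scaled by the information carried by each selection, where doubling the number of active targets adds one bit. The log2(n_targets) factor below follows the common BCI convention and is an assumption; the exact variant used for the record is not spelled out in the episode.

```python
# Hedged sketch of the Webgrid bits-per-second metric as described above.
import math

def webgrid_bps(correct, incorrect, seconds, n_targets):
    """Bits per second for a Webgrid run."""
    bits_per_selection = math.log2(n_targets)          # doubling targets adds one bit
    return bits_per_selection * (correct - incorrect) / seconds

# Illustrative example: a 35x35 grid has 1225 cells; with left and right
# click that is effectively 2450 active targets.
# webgrid_bps(correct=60, incorrect=0, seconds=60.0, n_targets=35 * 35 * 2)
```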
And there's also like Zen mode you've played in before, 08:15:45.140 |
- He doesn't like it 'cause it didn't show BPS. 08:15:49.380 |
- I had them put in a giant BPS in the background. 08:15:55.300 |
It's like super hard mode, like just metal mode 08:15:59.420 |
if it's just like a giant number in the back count. 08:16:01.460 |
- We should rename that, metal mode is a much better name. 08:16:14.660 |
is they focus on like science tech victories, 08:16:22.340 |
and then all of the Neuralink stuff happened. 08:16:14.660 |
you will be so far ahead of everyone technologically 08:16:44.980 |
that you will have like musket men, infantry men, 08:16:49.340 |
planes sometimes, and people will still be fighting 08:16:53.060 |
And so if you want to win a domination victory, 08:16:56.300 |
you just get to a certain point with the science 08:16:59.020 |
and then go and wipe out the rest of the world. 08:17:01.620 |
Or you can just take science all the way and win that way. 08:17:20.480 |
- I was, yeah, I like, I was playing only science, 08:17:24.540 |
obviously, like just science all the way, just tech. 08:17:31.420 |
And then I accidentally won through a diplomatic victory 08:17:36.540 |
I was so mad 'cause it's just like ends the game one turn. 08:17:40.580 |
It was like, oh, you won, you're so diplomatic. 08:17:43.860 |
I should have declared war on more people or something. 08:17:48.240 |
But you don't need like giant civilizations with tech, 08:17:51.700 |
especially with Korea, you can keep it pretty small. 08:17:54.340 |
So I generally just get to a certain military unit 08:17:58.100 |
and put them all around my border to keep everyone out. 08:18:05.660 |
- Nice, just work on the science of the tech. 08:18:23.940 |
What other stuff would you like to see improved 08:18:26.180 |
about the Neuralink app and just the entire experience? 08:18:29.540 |
- I would like to, like I said, get back to the, 08:18:35.020 |
like click-on-demand, like the regular clicks. 08:18:38.580 |
I would like to be able to connect to more devices. 08:18:47.180 |
or use it on different consoles, different platforms. 08:19:00.060 |
That would be sick if I could control an Optimus robot. 08:19:03.500 |
The Link app itself, it seems like we are getting 08:19:08.300 |
pretty dialed in to what it might look like down the road. 08:19:21.580 |
The only other thing I would say is like more control 08:19:34.420 |
go into how the cursor moves in certain ways. 08:19:54.140 |
I want as much control over my environment as possible. 08:19:59.980 |
like in like there's menus, usually there's basic mode 08:20:08.540 |
I want as much control over this as possible. 08:20:25.420 |
- While you're using it, like speech to text? 08:20:28.900 |
- Do you type or like, 'cause there's also a keyboard? 08:20:32.580 |
That's another thing I would like to work more on 08:20:34.700 |
is finding some way to type or text in a different way. 08:20:44.220 |
and a virtual keyboard that I can use with the cursor. 08:20:47.100 |
But we've played around with like finger spelling, 08:20:58.020 |
that it's going to be a very similar learning curve 08:21:12.020 |
that at some point I'm going to be doing finger spelling 08:21:18.620 |
That I'll just be able to think the like letter 08:21:26.780 |
That's a lot of work for you to kind of take that leap, 08:21:44.860 |
And so then if I could do something along those lines 08:21:51.740 |
if I can, you know, spell it at a reasonable speed 08:21:56.540 |
then I would just be able to think that through 08:22:10.320 |
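The finger-spelling idea sketched here amounts to classifying a short window of neural activity into an attempted letter and appending it to a text buffer. The toy sketch below, using a generic logistic-regression classifier, is purely illustrative; none of these names or choices describe an existing Neuralink feature.

```python
# Hypothetical letter decoder for attempted finger spelling: train a
# classifier on labeled feature windows, then turn new windows into text.
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_letter_decoder(feature_windows, letter_labels):
    """feature_windows: (n_trials, n_features); letter_labels: list of letters."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(feature_windows, letter_labels)
    return clf

def spell(clf, windows):
    """Turn a sequence of attempted-letter feature windows into a string."""
    return "".join(clf.predict(windows))
```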
- What was the process in terms of like training yourself 08:22:12.940 |
to go from attempted movement to imagined movement? 08:22:17.020 |
So like how long would this kind of process take? 08:22:25.920 |
I think I could make it happen with other things. 08:22:52.340 |
I've seen how much it's impacted my life already 08:22:58.580 |
So I would love to, I would love to get the upgrade. 08:23:02.120 |
- What future capabilities are you excited about 08:23:12.000 |
So for folks who, for example, who are blind, 08:23:14.300 |
so Neuralink enabling people to see or for speech? 08:23:19.300 |
- Yeah, there's a lot that's very, very cool about this. 08:23:23.400 |
So there's like, this is just motor cortex stuff. 08:23:34.960 |
for the first time in their life would just be, 08:23:46.240 |
being able to have some sort of like real time translation 08:23:49.840 |
and cut away that language barrier would be really cool. 08:23:54.240 |
Any sort of like actual impairments that it could solve, 08:24:00.800 |
And then also there are a lot of different disabilities 08:24:10.640 |
I know there's already stuff to help people with seizures 08:24:28.200 |
with being able to stimulate the brain in different ways. 08:24:44.880 |
But I know that there's a lot that can be done 08:24:48.720 |
and being able to go in and physically make changes 08:25:04.120 |
within my lifetime, assuming that I live a long life. 08:25:11.800 |
or things of that nature potentially getting help. 08:25:15.640 |
You can flip a switch like that and make someone happy. 08:25:27.120 |
Like you wanna experience what you like to be on. 08:25:31.600 |
DMT, like you can just flip that switch in the brain. 08:25:39.440 |
and re-experience things that like for the first time, 08:25:42.240 |
like your favorite movie or your favorite book, 08:25:46.320 |
and then re-fall in love with Harry Potter or something. 08:25:49.880 |
I told him, I was like, I don't know how I feel 08:25:51.560 |
about like people being able to just wipe parts 08:26:04.000 |
Just like actually like high resolution replay 08:26:07.320 |
- Yeah, I saw an episode of "Black Mirror" about that once. 08:26:11.120 |
- Yeah, so "Black Mirror" is always kind of considered 08:26:21.720 |
We want to think about the worst possible thing. 08:26:32.520 |
- Hopefully people don't think about that too much with me. 08:26:36.240 |
- Yeah, I assume you're gonna have to take over the world. 08:26:46.400 |
"but I feel like people would take it the wrong way. 08:26:57.100 |
Is that something you would love to be able to do 08:27:02.240 |
to control the robotic arm or the entirety of Optimus? 08:27:07.880 |
- You think there's something like fundamentally different 08:27:09.600 |
about just being able to physically interact with the world? 08:27:14.080 |
I know another thing with being able to give people 08:27:24.560 |
by going in with the brain and having the Neuralink 08:27:28.760 |
that could be transferred through the Optimus as well. 08:27:33.760 |
There's all sorts of really cool interplay between that. 08:27:38.360 |
And then also, like you said, just physically interacting. 08:27:41.360 |
I mean, 99% of the things that I can't do myself 08:28:01.220 |
And it would change the way people like me live 08:28:11.160 |
But being able to interact with the world physically 08:28:16.080 |
And they're not just for having to be a caretaker 08:28:22.760 |
or something, but something like I talked about, 08:28:32.160 |
I might not be able to feel it at that point, 08:28:35.160 |
or maybe I could again with the sensation and stuff, 08:28:37.680 |
but there's something different about reading 08:28:40.360 |
like a physical book than staring at a screen 08:28:45.960 |
I've listened to a ton of them at this point, 08:28:53.440 |
to be able to experience is opening the book, 08:28:57.060 |
bringing it up to you and to feel the touch of the paper. 08:29:05.080 |
I mean, it's just something about the words on the page. 08:29:22.400 |
A lot of things that I interact with in the world, 08:29:35.840 |
They'll like lay something on me so I can feel the weight. 08:29:38.520 |
They will rub a shirt on me so I can feel fabric. 08:29:42.760 |
Like there's something very profound about touch. 08:30:04.120 |
It's one thing that I've asked God for basically every day 08:30:09.120 |
since my accident was just being able to one day move, 08:30:17.520 |
so that way I could squeeze my mom's hand or something 08:30:27.960 |
being able to just interact with the people around me, 08:30:44.640 |
- Also beat Bliss in chess on a physical chess board. 08:30:44.640 |
and everything about him is just so above and beyond 08:31:05.560 |
that anything I can do to take him down a notch, 08:31:15.980 |
Did you ever make sense of why God puts good people 08:31:36.120 |
and I don't think that there's any light without the dark. 08:31:41.120 |
I think that if all of us were happy all the time, 08:31:44.420 |
there would be no reason to turn to God ever. 08:32:12.440 |
one of the first things I said to one of my best friends was, 08:32:15.720 |
and this was within the first month or two after my accident, 08:32:24.000 |
"that God is real and that there really is a God." 08:32:53.440 |
to make sure that we understand how precious he is 08:33:26.840 |
- Oh man, I think people are my biggest inspiration. 08:33:31.840 |
Even just being at Neuralink for a few months, 08:33:38.880 |
looking people in the eyes and hearing their motivations 08:33:42.700 |
for why they're doing this, it's so inspiring. 08:33:54.320 |
doing X, Y, or Z that doesn't really mean that much, 08:33:58.760 |
but instead they're here and they want to better humanity 08:34:03.760 |
and they want to better just the people around them, 08:34:06.800 |
the people that they've interacted with in their life. 08:34:08.880 |
They want to make better lives for their own family members 08:34:13.080 |
or they look at someone like me and they say, 08:34:15.400 |
"I can do something about that, so I'm going to." 08:34:18.080 |
And it's always been what I've connected with most 08:34:36.400 |
and they're going out of their way to make my life better. 08:34:39.960 |
It gives me a lot of hope for just humanity in general, 08:34:42.720 |
how much we care and how much we're capable of 08:34:50.620 |
And I know there's a lot of bad out there in the world, 08:34:54.480 |
but there always has been and there always will be. 08:34:56.980 |
And I think that that is, it shows human resiliency 08:35:07.920 |
and how much we just want to be there and help each other 08:35:18.440 |
because I think that's one of the reasons that we're here 08:35:27.080 |
is just realizing that there are people out there 08:35:31.080 |
- And thank you for being one such human being 08:35:43.120 |
including your epic, unbelievably great performance 08:35:48.220 |
I will be training all night tonight to try to catch up. 08:35:53.000 |
- And I believe in you that you can, once you come back, 08:36:02.920 |
- I'm rooting for you, the whole world is rooting for you. 08:36:11.120 |
with Noland Arbaugh, and before that with Elon Musk, 08:36:02.920 |
please check out our sponsors in the description. 08:36:23.980 |
from Aldous Huxley in "The Doors of Perception." 08:36:32.960 |
But always, and in all circumstances, we are by ourselves. 08:37:01.000 |
all these are private and except through symbols 08:37:20.880 |
Thank you for listening, and hope to see you next time.