
Elon Musk: Neuralink and the Future of Humanity | Lex Fridman Podcast #438


Chapters

0:00 Introduction
0:49 Elon Musk
4:06 Telepathy
10:45 Power of human mind
15:12 Future of Neuralink
20:27 Ayahuasca
29:57 Merging with AI
34:44 xAI
36:57 Optimus
43:47 Elon's approach to problem-solving
61:23 History and geopolitics
65:53 Lessons of history
70:12 Collapse of empires
77:55 Time
80:37 Aliens and curiosity
88:12 DJ Seo
96:20 Neural dust
103:03 History of brain–computer interface
111:07 Biophysics of neural interfaces
121:36 How Neuralink works
127:26 Lex with Neuralink implant
147:24 Digital telepathy
158:27 Retracted threads
164:01 Vertical integration
170:55 Safety
180:50 Upgrades
189:53 Future capabilities
219:09 Matthew MacDougall
224:58 Neuroscience
232:07 Neurosurgery
243:11 Neuralink surgery
262:20 Brain surgery details
278:03 Implanting Neuralink on self
293:57 Life and death
303:17 Consciousness
306:11 Bliss Chapman
319:27 Neural signal
326:19 Latency
330:59 Neuralink app
335:40 Intention vs action
346:54 Calibration
356:26 Webgrid
379:28 Neural decoder
400:03 Future improvements
408:59 Noland Arbaugh
409:08 Becoming paralyzed
422:43 First Neuralink human participant
426:45 Day of surgery
444:31 Moving mouse with brain
469:50 Webgrid
477:52 Retracted threads
486:16 App improvements
493:01 Gaming
503:59 Future Neuralink capabilities
506:55 Controlling Optimus robot
511:16 God
513:21 Hope


00:00:00.000 | The following is a conversation with Elon Musk, DJ Seo,
00:00:04.080 | Matthew MacDougall, Bliss Chapman, and Noland Arbaugh
00:00:07.920 | about Neuralink and the future of humanity.
00:00:11.700 | Elon, DJ, Matthew, and Bliss are, of course,
00:00:14.160 | part of the amazing Neuralink team,
00:00:17.020 | and Noland is the first human
00:00:19.000 | to have a Neuralink device implanted in his brain.
00:00:21.880 | I speak with each of them individually,
00:00:23.720 | so use timestamps to jump around,
00:00:25.920 | or, as I recommend, go hardcore
00:00:29.800 | and listen to the whole thing.
00:00:31.800 | This is the longest podcast I've ever done.
00:00:35.160 | It's a fascinating, super technical,
00:00:37.000 | and wide-ranging conversation,
00:00:38.880 | and I loved every minute of it.
00:00:41.560 | And now, dear friends, here's Elon Musk,
00:00:45.060 | his fifth time on this, The Lex Fridman Podcast.
00:00:49.040 | - Drinking coffee or water?
00:00:51.500 | - Water.
00:00:52.760 | Well, I'm so over-caffeinated right now.
00:00:56.080 | Do you want some caffeine?
00:00:57.320 | - I mean, sure.
00:00:59.320 | There's a nitro drink.
00:01:01.360 | - This'll keep you up for, like,
00:01:04.640 | you know, tomorrow afternoon, basically.
00:01:07.320 | (Elon laughs)
00:01:08.920 | - Yeah.
00:01:10.040 | I don't have any sugar.
00:01:10.880 | - So, what is nitro?
00:01:11.700 | It's just got a lot of caffeine or something?
00:01:13.440 | - Don't ask questions.
00:01:14.280 | It's called nitro.
00:01:15.200 | (Elon laughs)
00:01:16.040 | Do you need to know anything else?
00:01:17.960 | - It's got nitrogen in it.
00:01:19.480 | That's ridiculous.
00:01:20.760 | I mean, what we breathe is 78% nitrogen anyway.
00:01:23.220 | (Elon laughs)
00:01:24.060 | What do you need to add more?
00:01:25.400 | (Elon laughs)
00:01:29.000 | - Most people think that they're breathing oxygen,
00:01:31.080 | and they're actually breathing 78% nitrogen.
00:01:33.760 | - You need, like, a milk bar.
00:01:35.480 | Like from--
00:01:36.320 | (Elon laughs)
00:01:37.880 | Like from Clockwork Orange.
00:01:39.320 | (Elon laughs)
00:01:41.160 | - Yeah.
00:01:42.400 | Is that top three Kubrick film for you?
00:01:44.520 | - Clockwork Orange?
00:01:45.340 | It's pretty good.
00:01:46.180 | I mean, it's demented.
00:01:47.020 | Jarring, I'd say.
00:01:49.360 | (Elon laughs)
00:01:51.720 | - Okay.
00:01:52.560 | Okay.
00:01:56.680 | So, first, let's step back,
00:01:57.680 | and a big congrats on getting Neuralink
00:02:00.280 | implanted into a human.
00:02:02.360 | That's a historic step for Neuralink.
00:02:04.880 | - Oh, thanks, yeah.
00:02:05.880 | - There's many more to come.
00:02:07.280 | - Yeah, we just, obviously, have our second implant as well.
00:02:11.440 | - How did that go?
00:02:12.520 | - So far, so good.
00:02:14.000 | There, looks like we've got,
00:02:16.160 | I think on the order of 400 electrodes
00:02:19.080 | that are providing signals.
00:02:22.880 | - Nice.
00:02:23.700 | - Yeah.
00:02:24.680 | - How quickly do you think the number
00:02:26.440 | of human participants will scale?
00:02:28.400 | - It depends somewhat on the regulatory approval,
00:02:31.720 | the rate at which we get regulatory approvals.
00:02:34.440 | So, we're hoping to do 10 by the end of this year.
00:02:38.840 | Total of 10.
00:02:40.140 | So, eight more.
00:02:40.980 | - And with each one, you're gonna be learning
00:02:44.560 | a lot of lessons about the neurobiology of the brain,
00:02:47.760 | the everything, the whole chain of the Neuralink,
00:02:50.560 | the decoding, the signal processing,
00:02:52.280 | all that kind of stuff.
00:02:53.240 | - Yeah.
00:02:54.080 | Yeah, I think it's obviously gonna get better with each one.
00:02:57.080 | I mean, I don't wanna jinx it,
00:02:59.140 | but it seems to have gone extremely well
00:03:02.260 | with the second implant.
00:03:04.340 | So, there's a lot of signal, a lot of electrodes.
00:03:08.840 | It's working very well.
00:03:09.860 | - What improvements do you think we'll see in Neuralink
00:03:12.660 | in the coming, let's say, let's get crazy, coming years?
00:03:17.460 | - I mean, in years, it's gonna be gigantic.
00:03:22.300 | Because we'll increase the number of electrodes
00:03:24.280 | dramatically, we'll improve the signal processing.
00:03:29.280 | So, even with only roughly, I don't know,
00:03:33.960 | 10, 15% of the electrodes working with Neuralink,
00:03:37.120 | with our first patient, we're able to get to achieve
00:03:41.360 | a bits per second that's twice the world record.
00:03:44.880 | So, I think we'll start vastly exceeding the world record
00:03:49.320 | by orders of magnitude in the years to come.
00:03:51.760 | So, it's like getting to, I don't know,
00:03:53.000 | 100 bits per second, 1,000, you know, maybe.
00:03:56.480 | Maybe if it's like five years from now,
00:03:58.880 | we might be at a megabit.
00:04:00.120 | Like, faster than any human could possibly communicate
00:04:04.400 | by typing or speaking.
00:04:06.380 | - Yeah, that BPS is an interesting metric to measure.
00:04:10.220 | There might be a big leap in the experience
00:04:14.020 | once you reach a certain level of BPS.
00:04:16.420 | - Yeah.
00:04:17.580 | - Like, entire new ways of interacting with a computer
00:04:20.040 | might be unlocked.
00:04:21.080 | And with humans.
00:04:22.600 | - With other humans.
00:04:23.680 | - Provided they have a Neuralink too.
00:04:27.560 | - Right.
00:04:28.520 | - Otherwise, they won't be able to
00:04:29.720 | absorb the signals fast enough.
00:04:31.320 | - Do you think they'll improve the quality
00:04:32.760 | of intellectual discourse?
00:04:34.400 | - Well, I think you could think of it,
00:04:37.000 | if you were to slow down communication,
00:04:41.800 | how would you feel about that?
00:04:43.920 | You know, if you'd only talk at, let's say,
00:04:45.360 | 1/10 of normal speed, you'd be like,
00:04:48.280 | wow, that's agonizingly slow.
00:04:50.160 | - Yeah.
00:04:51.240 | - So now, imagine you could speak at,
00:04:53.800 | communicate clearly at 10 or 100 or 1,000 times
00:04:59.120 | faster than normal.
00:05:00.920 | - Listen, I'm pretty sure nobody in their right mind
00:05:05.540 | listens to me at 1x, they listen at 2x.
00:05:07.960 | (laughing)
00:05:09.440 | I can only imagine what 10x would feel like,
00:05:12.720 | or could actually understand it.
00:05:14.040 | - I usually default to 1.5x.
00:05:16.080 | You can do 2x, but, well, actually,
00:05:18.120 | if I'm listening to somebody in 15, 20 minute segments
00:05:22.600 | to go to sleep, then I'll do it 1.5x.
00:05:25.520 | If I'm paying attention, I'll do 2x.
00:05:27.740 | (laughing)
00:05:30.240 | - Right.
00:05:31.060 | - But actually, if you listen to podcasts
00:05:35.240 | or audio books or anything,
00:05:38.640 | if you get used to doing it at 1.5,
00:05:40.360 | then one sounds painfully slow.
00:05:43.480 | - I'm still holding on to one, because I'm afraid.
00:05:45.920 | I'm afraid of myself becoming bored with the reality,
00:05:49.640 | with the real world, where everyone's speaking in 1x.
00:05:52.080 | (laughing)
00:05:53.720 | - Well, it depends on the person.
00:05:54.640 | You can speak very fast.
00:05:55.920 | We can communicate very quickly.
00:05:56.960 | And also, if you use a wide range of,
00:05:59.160 | if your vocabulary is larger,
00:06:01.660 | your effective bit rate is higher.
00:06:04.160 | - That's a good way to put it.
00:06:07.280 | - Yeah.
00:06:08.120 | - The effective bit rate.
00:06:09.100 | I mean, that is the question,
00:06:10.200 | is how much information is actually compressed
00:06:12.760 | in the low bit transfer of language.
00:06:15.460 | - Yeah, if there's a single word that is able to convey
00:06:20.300 | something that would normally require 10 simple words,
00:06:24.040 | then you've got maybe a 10x compression on your hands.
00:06:27.920 | And that's really, like with memes,
00:06:31.160 | memes are like data compression.
00:06:32.940 | It conveys a whole,
00:06:35.820 | you're simultaneously hit with a wide range of symbols
00:06:39.920 | that you can interpret.
00:06:41.640 | And it's, you kind of get it.
00:06:45.580 | Faster than if it were words or a simple picture.
00:06:49.340 | - And of course, you're referring to memes broadly
00:06:51.560 | like ideas.
00:06:52.940 | - Yeah, there's an entire idea structure
00:06:56.260 | that is like an idea template.
00:06:59.680 | And then you can add something to that idea template.
00:07:02.480 | But somebody has that preexisting idea template
00:07:05.820 | in their head.
00:07:07.260 | So when you add that incremental bit of information,
00:07:09.180 | you're conveying much more than if you just
00:07:12.260 | said a few words.
00:07:13.160 | It's everything associated with that meme.
00:07:15.260 | - You think there'll be emergent leaps of capability
00:07:17.380 | as you scale the number of electrodes?
00:07:19.060 | - Yeah. - Like there'll be a certain,
00:07:20.880 | you think there'll be like actual number where it just,
00:07:23.740 | the human experience will be altered?
00:07:26.300 | - Yes.
00:07:27.800 | - What do you think that number might be?
00:07:29.800 | Whether electrodes or BPS?
00:07:32.440 | We of course don't know for sure,
00:07:33.740 | but is this 10,000, 100,000?
00:07:35.880 | - Yeah, I mean, certainly if you're anywhere
00:07:38.660 | at 10,000 bits per second,
00:07:40.380 | I mean, that's vastly faster
00:07:41.460 | than any human can communicate right now.
00:07:43.640 | If you think of the,
00:07:44.940 | what is the average bits per second of a human?
00:07:47.060 | It is less than one bit per second
00:07:48.420 | over the course of a day.
00:07:49.760 | Because there are 86,400 seconds in a day,
00:07:52.860 | and you don't communicate 86,400 tokens in a day.
00:07:56.920 | Therefore, your bits per second is less than one.
00:08:01.160 | Average over 24 hours.
00:08:02.780 | It's quite slow.
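
For reference, here is a back-of-the-envelope version of that estimate as a minimal Python sketch. The daily word count and tokens-per-word ratio are illustrative assumptions, not figures from the conversation, and a token is loosely treated as roughly one bit, as in the conversation itself.

```python
# Minimal sketch of the "average bits per second of a human" estimate above.
# The daily word count and tokens-per-word ratio are illustrative assumptions;
# a token is loosely treated as roughly one bit, as in the conversation.
SECONDS_PER_DAY = 86_400      # 86,400 seconds in a day, as stated

words_per_day = 10_000        # assumption: rough order of words spoken/typed per day
tokens_per_word = 1.3         # assumption: typical tokenizer ratio

tokens_per_day = words_per_day * tokens_per_word          # ~13,000 tokens/day
tokens_per_second = tokens_per_day / SECONDS_PER_DAY      # ~0.15 tokens/s

print(f"~{tokens_per_second:.2f} tokens per second averaged over 24 hours")
# well under one per second, which is the point being made
```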
00:08:04.020 | And even if you're communicating very quickly,
00:08:07.180 | and you're talking to somebody
00:08:11.140 | who understands what you're saying,
00:08:13.020 | because in order to communicate,
00:08:15.520 | you have to at least to some degree,
00:08:17.500 | model the mind state of the person to whom you're speaking.
00:08:20.520 | Then take the concept you're trying to convey,
00:08:22.740 | compress that into a small number of syllables,
00:08:24.960 | speak them, and hope that the other person
00:08:27.220 | decompresses them into a conceptual structure
00:08:31.300 | that is as close to what you have in your mind as possible.
00:08:34.260 | - Yeah, I mean, there's a lot of signal loss there
00:08:36.380 | in that process.
00:08:37.220 | - Yeah, very lossy compression and decompression.
00:08:40.180 | And a lot of what your neurons are doing
00:08:44.100 | is distilling the concepts down to a small number
00:08:47.780 | of symbols of, say, syllables that I'm speaking,
00:08:51.980 | or keystrokes, whatever the case may be.
00:08:53.820 | So that's a lot of what your brain computation is doing.
00:08:58.820 | Now, there is an argument that that's actually
00:09:04.940 | a healthy thing to do, or a helpful thing to do,
00:09:08.820 | because as you try to compress complex concepts,
00:09:13.580 | you're perhaps forced to distill
00:09:15.500 | what is most essential in those concepts,
00:09:19.260 | as opposed to just all the fluff.
00:09:20.740 | So in the process of compression,
00:09:23.020 | you distill things down to what matters the most,
00:09:25.220 | because you can only say a few things.
00:09:27.780 | So that is perhaps helpful.
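
As an aside, the compress-then-decompress framing can be made concrete with a toy example. This is my own illustration, not anything from Neuralink; the vector size, number of "syllables," and quantization levels are arbitrary.

```python
# Toy illustration of lossy compression and decompression of a "concept":
# a rich internal state is squeezed into a handful of coarse symbols, and the
# listener reconstructs only an approximation. All numbers are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
concept = rng.normal(size=512)            # the speaker's full "mind state" for one idea

def compress(x, n_symbols=8, levels=4):
    """Keep only a few coarsely quantized components -- the 'few syllables'."""
    idx = np.argsort(-np.abs(x))[:n_symbols]     # the most salient components
    q = np.round(x[idx] * levels) / levels        # coarse quantization
    return idx, q

def decompress(idx, q, size=512):
    """The listener rebuilds a sparse approximation of the original concept."""
    y = np.zeros(size)
    y[idx] = q
    return y

idx, q = compress(concept)
reconstruction = decompress(idx, q)
loss = np.linalg.norm(concept - reconstruction) / np.linalg.norm(concept)
print(f"relative reconstruction error: {loss:.2f}")   # most of the detail is gone
```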
00:09:29.660 | I think we might, we'll probably get,
00:09:31.140 | if our data rate increases, it's highly probable
00:09:34.660 | that we'll become far more verbose.
00:09:37.140 | Just like your computer, when computers had like,
00:09:42.180 | my first computer had 8K of RAM,
00:09:44.300 | so you really thought about every byte.
00:09:48.060 | And now you've got computers with many gigabytes of RAM.
00:09:53.060 | So if you want to do an iPhone app
00:09:56.500 | that just says, "Hello, world,"
00:09:58.100 | it's probably, I don't know, several megabytes minimum.
00:10:01.060 | A bunch of fluff.
00:10:03.300 | But nonetheless, we still prefer to have the computer
00:10:06.420 | with more memory and more compute.
00:10:08.820 | So the long-term aspiration of Neuralink
00:10:12.900 | is to improve the AI-human symbiosis
00:10:17.820 | by increasing the bandwidth of the communication.
00:10:23.820 | Because if, even in the most benign scenario of AI,
00:10:28.500 | you have to consider that the AI
00:10:31.060 | is simply gonna get bored
00:10:32.860 | waiting for you to spit out a few words.
00:10:34.900 | I mean, if the AI can communicate
00:10:37.780 | at terabits per second,
00:10:39.460 | and you're communicating at, you know, bits per second,
00:10:43.060 | it's like talking to a tree.
00:10:45.340 | - Well, it is a very interesting question
00:10:47.500 | for a super-intelligent species.
00:10:50.540 | What use are humans?
00:10:52.980 | - I think there is some argument for humans
00:10:57.740 | as a source of will.
00:10:59.860 | - Will.
00:11:00.700 | - Will, yeah.
00:11:01.540 | A source of will or purpose.
00:11:02.820 | So if you consider the human mind
00:11:07.380 | as being essentially,
00:11:08.780 | there's the primitive limbic elements,
00:11:12.860 | which basically even reptiles have,
00:11:15.820 | and there's the cortex,
00:11:17.420 | the thinking and planning part of the brain.
00:11:20.020 | Now the cortex is much smarter than the limbic system,
00:11:22.060 | and yet it's largely in service to the limbic system.
00:11:24.580 | It's trying to make the limbic system happy.
00:11:26.500 | I mean, the sheer amount of compute
00:11:27.660 | that's gone into people trying to get laid is insane.
00:11:30.780 | Without actually seeking procreation.
00:11:35.580 | They're just literally trying to do
00:11:37.460 | this sort of simple motion.
00:11:39.900 | (laughing)
00:11:42.340 | And they get a kick out of it.
00:11:43.820 | - Yeah.
00:11:44.660 | - So this simple, which in the abstract,
00:11:47.380 | rather absurd motion, which is sex,
00:11:49.460 | the cortex is putting a massive amount of compute
00:11:53.500 | into trying to figure out how to do that.
00:11:55.380 | - So like 90% of distributed compute of the human species
00:11:58.340 | is spent on trying to get laid, probably.
00:12:00.020 | Like a large percentage. - A massive amount, yeah.
00:12:02.140 | There's no purpose to most sex except hedonistic.
00:12:06.780 | You know, it's just sort of joy or whatever.
00:12:08.980 | Dopamine release.
00:12:10.420 | Now once in a while it's procreation,
00:12:14.420 | but for humans it's mostly,
00:12:15.660 | modern humans, it's mostly recreation.
00:12:18.220 | And so your cortex, much smarter than your limbic system,
00:12:24.700 | is trying to make the limbic system happy
00:12:25.980 | 'cause the limbic system wants to have sex.
00:12:27.500 | So, or wants some tasty food or whatever the case may be.
00:12:31.820 | And then that is then further augmented
00:12:33.620 | by the tertiary system, which is your phone,
00:12:35.780 | your laptop, iPad, or your computing stuff.
00:12:39.300 | That's your tertiary layer.
00:12:40.860 | So you're actually already a cyborg.
00:12:43.540 | You have this tertiary compute layer,
00:12:45.140 | which is in the form of your computer
00:12:47.260 | with all the applications, all your compute devices.
00:12:50.020 | And so in the getting laid front,
00:12:55.340 | there's actually a massive amount of digital compute
00:12:59.220 | also trying to get laid.
00:13:00.540 | You know, with like Tinder and whatever, you know.
00:13:04.260 | - Yeah, so the compute that we humans have built
00:13:07.220 | is also participating.
00:13:09.260 | - Yeah, I mean there's like gigawatts of compute
00:13:11.900 | going into getting laid, of digital compute.
00:13:14.780 | - Yeah.
00:13:15.620 | What if AGI--
00:13:17.620 | - This is happening as we speak.
00:13:19.220 | - If we merge with AI, it's just gonna expand the compute
00:13:22.900 | that we humans use.
00:13:24.340 | - Pretty much.
00:13:25.180 | - To try to get laid.
00:13:26.020 | - Well, it's one of the things, certainly, yeah.
00:13:26.860 | - Yeah.
00:13:27.700 | - But what I'm saying is that, yes,
00:13:30.700 | like is there a use for humans?
00:13:32.860 | Well, there's this fundamental question
00:13:36.740 | of what's the meaning of life?
00:13:37.580 | Why do anything at all?
00:13:39.820 | And so if our simple limbic system
00:13:43.220 | provides a source of will to do something
00:13:46.060 | that then goes to our cortex,
00:13:49.060 | that then goes to our tertiary compute layer,
00:13:53.380 | then, I don't know, it might actually be that the AI,
00:13:58.380 | in a benign scenario,
00:13:59.380 | is simply trying to make the human limbic system happy.
00:14:02.540 | - Yeah, it seems like the will
00:14:05.980 | is not just about the limbic system.
00:14:07.580 | There's a lot of interesting complicated things in there.
00:14:09.780 | We also want power.
00:14:11.620 | - That's limbic too, I think.
00:14:13.060 | - But then we also want to, in a kind of cooperative way,
00:14:15.980 | alleviate the suffering in the world.
00:14:17.980 | - Not everybody does, but yeah, sure.
00:14:21.180 | Some people do.
00:14:22.580 | - As a group of humans, when we get together,
00:14:25.340 | we start to have this kind of collective intelligence
00:14:27.940 | that is more complex in its will
00:14:33.300 | than the underlying individual descendants of apes, right?
00:14:37.620 | So there's other motivations,
00:14:39.820 | and that could be a really interesting source
00:14:42.660 | of an objective function for AGI.
00:14:45.420 | - Yeah, I mean, there are these sort of fairly cerebral,
00:14:50.420 | or kind of higher-level goals.
00:14:54.880 | I mean, for me, it's like what's the meaning of life,
00:14:56.780 | or understanding the nature of the universe
00:14:59.880 | is of great interest to me.
00:15:03.060 | And hopefully to the AI,
00:15:07.540 | and that's the mission of xAI and Grok,
00:15:11.600 | is understand the universe.
00:15:13.140 | - So do you think people,
00:15:15.320 | when you have a neural link with 10,000,
00:15:18.740 | 100,000 channels, most of the use cases
00:15:22.260 | will be communication with AI systems?
00:15:27.580 | - Well, assuming that they're not,
00:15:31.760 | I mean, they're solving basic neurological issues
00:15:37.620 | that people have.
00:15:39.200 | You know, if they've got damaged neurons
00:15:42.760 | in their spinal cord or neck,
00:15:44.060 | or as is the case with our first two patients,
00:15:48.280 | then obviously the first order of business
00:15:51.880 | is solving fundamental neuron damage
00:15:54.740 | in the spinal cord, neck, or in the brain itself.
00:15:58.520 | So, you know, a second product is called BlindSight,
00:16:04.600 | which is to enable people who are completely blind,
00:16:09.320 | lost both eyes, or optic nerve,
00:16:11.200 | or just can't see at all, to be able to see
00:16:13.800 | by directly triggering the neurons in the visual cortex.
00:16:18.120 | So we're just starting at the basics here, you know?
00:16:20.160 | So it's like, the simple stuff, relatively speaking,
00:16:25.160 | is solving neuron damage.
00:16:30.620 | It can also solve, I think, probably schizophrenia.
00:16:36.040 | You know, if people have seizures of some kind,
00:16:40.100 | it can probably solve that.
00:16:41.460 | It could help with memory.
00:16:43.940 | So there's like a kind of a tech tree, if you will,
00:16:50.020 | of like, you've got the basics.
00:16:51.620 | Like, you need literacy before you can have,
00:16:56.900 | you know, "Lord of the Rings," you know?
00:16:59.660 | (both laughing)
00:17:02.060 | - Got it.
00:17:03.140 | - Do you have letters and alphabet?
00:17:04.660 | Okay, great.
00:17:05.860 | Words, you know, and then eventually you get sagas.
00:17:10.260 | So, you know, I think there may be some, you know,
00:17:15.180 | things to worry about in the future,
00:17:17.460 | but the first several years are really just solving
00:17:20.540 | basic neurological damage.
00:17:22.160 | But like, for people who have essentially complete
00:17:24.820 | or near-complete loss from the brain to the body,
00:17:29.020 | like Stephen Hawking would be an example,
00:17:31.720 | the neural links would be incredibly profound.
00:17:35.340 | 'Cause I mean, you can imagine if Stephen Hawking
00:17:36.860 | could communicate as fast as we're communicating,
00:17:39.100 | perhaps faster.
00:17:39.920 | And that's certainly possible.
00:17:43.160 | Probable, in fact, likely, I'd say.
00:17:46.460 | - So there's a kind of dual track
00:17:49.380 | of medical and non-medical, meaning,
00:17:52.420 | so everything you've talked about could be applied
00:17:54.300 | to people who are non-disabled in the future?
00:17:58.140 | - The logical thing to do is, sensible thing to do
00:18:00.420 | is to start off solving basic neuron damage issues.
00:18:05.420 | 'Cause there's obviously some risk with a new device.
00:18:14.980 | You can't get the risk down to zero, it's not possible.
00:18:18.120 | So you wanna have the highest possible reward
00:18:21.940 | given that there's a certain irreducible risk.
00:18:25.060 | And if somebody's able to have a profound improvement
00:18:29.460 | in their communication, that's worth the risk.
00:18:34.460 | - As you get the risk down.
00:18:36.420 | - Yeah, as you get the risk down.
00:18:37.260 | And once the risk is down to, you know,
00:18:40.820 | if you have like thousands of people
00:18:43.380 | that have been using it for years and the risk is minimal,
00:18:46.980 | then perhaps at that point you could consider saying,
00:18:49.780 | okay, let's aim for augmentation.
00:18:53.060 | Now, I think we're actually gonna aim for augmentation
00:18:57.020 | with people who have neuron damage.
00:18:59.140 | So we're not just aiming to give people
00:19:01.740 | a communication data rate equivalent to normal humans.
00:19:05.020 | We're aiming to give people who have, you know,
00:19:07.900 | a quadriplegic or maybe have complete loss
00:19:11.220 | of the connection to the brain and body,
00:19:14.020 | a communication data rate that exceeds normal humans.
00:19:17.260 | I mean, while we're in there, why not?
00:19:18.500 | Let's give people superpowers.
00:19:20.000 | - And the same for vision.
00:19:22.020 | As you restore vision, there could be aspects
00:19:24.600 | of that restoration that are superhuman.
00:19:27.300 | - Yeah, at first the vision restoration will be low res.
00:19:31.360 | 'Cause you have to say like, how many neurons
00:19:33.700 | can you put in there and trigger?
00:19:36.320 | And you can do things where you adjust the electric field
00:19:40.340 | so that even if you've got, say, 10,000 neurons,
00:19:44.260 | it's not just 10,000 pixels because you can adjust
00:19:46.940 | the field between the neurons and do them in patterns
00:19:51.620 | in order to have, say, 10,000 electrodes
00:19:55.180 | effectively give you, I don't know, maybe,
00:19:58.680 | like having a megapixel or a 10 megapixel situation.
00:20:03.740 | So, and then over time, I think you'd get
00:20:08.780 | to higher resolution than human eyes
00:20:11.540 | and you could also see in different wavelengths.
00:20:14.100 | So, like Geordi La Forge from Star Trek.
00:20:17.500 | You know, like the thing.
00:20:19.900 | You could just, do you want to see in radar?
00:20:21.480 | No problem.
00:20:22.320 | You can see ultraviolet, infrared,
00:20:25.580 | eagle vision, whatever you want.
00:20:27.180 | - Do you think there'll be, let me ask a Joe Rogan question.
00:20:31.380 | Do you think there'll be, I just recently
00:20:34.700 | taken ayahuasca, do you think?
00:20:36.660 | - Is that a Joe Rogan question?
00:20:37.500 | - So this question, no.
00:20:38.500 | - Well, yes.
00:20:39.320 | - Well, I guess technically it is.
00:20:40.900 | - Yeah, have you ever tried DMT, bro?
00:20:43.300 | (laughing)
00:20:45.100 | I love you, Joe, okay.
00:20:46.460 | (laughing)
00:20:48.540 | - Yeah, wait, wait, yeah.
00:20:49.380 | Have you said much about it?
00:20:50.860 | The ayahuasca stuff?
00:20:51.700 | - I have not, I have not, I have not.
00:20:53.460 | - Okay, well, why don't we just spill the beans?
00:20:55.900 | - It was a truly incredible experience.
00:20:57.500 | - Let me turn the tables on you.
00:20:58.540 | (laughing)
00:20:59.620 | - Wow, yeah.
00:21:00.460 | - I mean, you're in the jungle.
00:21:01.300 | (laughing)
00:21:02.580 | - Yeah, amongst the trees, myself.
00:21:04.180 | - Yeah, it must have been crazy.
00:21:05.020 | - And the shaman, yeah, yeah, yeah.
00:21:06.260 | With the insects, with the animals all around you,
00:21:08.860 | like, jungle as far as I can see.
00:21:10.580 | - I mean.
00:21:11.620 | - That's the way to do it.
00:21:12.780 | - Things are gonna look pretty wild.
00:21:14.260 | - Yeah, pretty wild.
00:21:15.280 | (laughing)
00:21:17.220 | I took an extremely high dose.
00:21:19.060 | - Don't go hugging an anaconda or something, you know?
00:21:22.740 | (laughing)
00:21:24.260 | - You haven't lived unless you made love to an anaconda.
00:21:27.020 | I'm sorry, but.
00:21:29.380 | - Snakes and ladders.
00:21:30.620 | (laughing)
00:21:33.620 | - Yeah, I took an extremely high dose of nine cups.
00:21:38.620 | - Damn, okay, that sounds like a lot.
00:21:40.700 | Like, what's the normal, is normal just one cup, or?
00:21:42.860 | - One or two, usually one.
00:21:44.500 | (laughing)
00:21:46.300 | - Wait, like, right off the bat,
00:21:48.580 | or did you work your way up to it?
00:21:49.820 | - So I--
00:21:50.660 | (laughing)
00:21:51.980 | - You just jumped in at the deep end.
00:21:53.620 | - Across two days, 'cause on the first day I took two,
00:21:55.900 | and it was a ride, but it wasn't quite like a--
00:22:00.060 | - It wasn't like a revelation.
00:22:01.460 | - It wasn't into deep space type of ride,
00:22:03.260 | it was just like a little airplane ride.
00:22:04.860 | (laughing)
00:22:06.100 | And I saw some trees, and some visuals,
00:22:08.980 | and I just saw a dragon, and all that kind of stuff.
00:22:11.220 | But--
00:22:12.060 | (laughing)
00:22:13.660 | - Nine cups, you went to Pluto, I think.
00:22:15.740 | - Pluto, yeah, no, deep space.
00:22:17.500 | - Deep space.
00:22:18.940 | - One of the interesting aspects of my experience
00:22:21.500 | is I thought I would have some demons
00:22:23.340 | and stuff to work through.
00:22:24.780 | - That's what people--
00:22:26.060 | - That's what everyone says.
00:22:27.260 | - That's what everyone says, yeah, exactly.
00:22:28.100 | - I had nothing, I had all positive.
00:22:30.660 | I had just so full--
00:22:31.500 | - Just a pure soul.
00:22:32.380 | - I don't think so.
00:22:33.300 | - I don't know.
00:22:34.140 | (laughing)
00:22:35.620 | - But I kept thinking about,
00:22:37.180 | it had extremely high resolution thoughts
00:22:41.460 | about the people I know in my life.
00:22:42.940 | You were there.
00:22:43.780 | - Okay.
00:22:44.620 | - And it's just, not from my relationship with that person,
00:22:47.980 | but just as the person themselves,
00:22:50.060 | I had just this deep gratitude of who they are.
00:22:52.660 | - That's cool.
00:22:53.580 | - It was just like this exploration.
00:22:55.780 | You know like Sims or whatever, you get to watch them?
00:22:58.340 | - Sure.
00:22:59.180 | - I got to watch people,
00:23:00.020 | and just be in awe of how amazing they are.
00:23:02.180 | - That sounds awesome.
00:23:03.020 | - Yeah, it was great.
00:23:03.940 | I was waiting for--
00:23:05.060 | - When's the demon coming?
00:23:06.220 | (laughing)
00:23:07.660 | - Exactly.
00:23:08.500 | - Maybe I'll have some negative thoughts, nothing, nothing.
00:23:11.140 | I had just extreme gratitude for them.
00:23:15.060 | And then also a lot of space travel.
00:23:17.260 | (laughing)
00:23:18.580 | - Space travel to where?
00:23:20.180 | - So here's what it was.
00:23:21.140 | It was people, the human beings that I know,
00:23:25.740 | they had this kind of,
00:23:27.340 | the best way I can describe it is they had a glow to them.
00:23:30.260 | - Okay.
00:23:31.100 | - And we kept flying out from them to see Earth,
00:23:36.100 | to see our solar system, to see our galaxy.
00:23:39.860 | And I saw that light, that glow all across the universe.
00:23:44.300 | - Okay.
00:23:45.140 | - Like whatever that form is, whatever that like--
00:23:49.780 | - Did you go past the Milky Way?
00:23:52.020 | - Yeah.
00:23:52.860 | - Okay, you were like intergalactic.
00:23:54.860 | - Yeah, intergalactic.
00:23:55.700 | - Okay, dang.
00:23:56.620 | - But always pointing in, yeah.
00:23:59.940 | Past the Milky Way, past, I mean,
00:24:01.700 | I saw like a huge number of galaxies, intergalactic.
00:24:04.380 | - Oh, okay.
00:24:05.220 | - And all of it was glowing.
00:24:06.420 | So, but I couldn't control that travel
00:24:08.100 | 'cause I would actually explore near distances
00:24:11.300 | to the solar system, see if there's aliens
00:24:12.980 | or any of that kind of stuff.
00:24:13.820 | - Sure, did you see aliens?
00:24:14.660 | - No, I didn't, no.
00:24:15.500 | - Did you see aliens?
00:24:16.340 | - Implication of aliens, because they were glowing.
00:24:18.620 | They were glowing in the same way that humans were glowing,
00:24:21.020 | that like life force that I was seeing.
00:24:24.820 | The thing that made humans amazing
00:24:27.780 | was there throughout the universe.
00:24:29.420 | Like there was these glowing dots.
00:24:32.980 | So, I don't know.
00:24:34.380 | It made me feel like there is life, no, not life,
00:24:37.960 | but something, whatever makes humans amazing
00:24:40.300 | all throughout the universe.
00:24:41.860 | - Sounds good.
00:24:42.680 | - Yeah, it was amazing.
00:24:43.660 | No demons, no demons.
00:24:45.660 | I looked for the demons, there's no demons.
00:24:47.700 | There were dragons and they're pretty awesome.
00:24:49.140 | So, the thing about trees--
00:24:49.980 | - Was there anything scary at all?
00:24:51.580 | - Dragons?
00:24:54.700 | But they weren't scary, they were protective.
00:24:57.300 | So, the thing is--
00:24:58.120 | - Past the magic dragon.
00:24:58.960 | - It was more like Game of Thrones kind of dragons.
00:25:02.820 | They weren't very friendly.
00:25:03.780 | They were very big.
00:25:04.620 | So, the thing is about giant trees at night,
00:25:08.140 | which is where I was.
00:25:09.420 | - I mean, the jungle's kind of scary.
00:25:11.420 | - The trees started to look like dragons
00:25:13.460 | and they were all like looking at me.
00:25:16.180 | - Sure, okay.
00:25:17.020 | - And it didn't seem scary.
00:25:18.100 | They seemed like they were protecting me.
00:25:19.340 | And they, the shaman and the people,
00:25:21.940 | they didn't speak any English, by the way,
00:25:23.220 | which made it even scarier.
00:25:25.920 | We're not even like, you know,
00:25:28.280 | we're worlds apart in many ways.
00:25:30.000 | It's just, but yeah, there was not,
00:25:33.760 | they talk about the mother of the forest protecting you
00:25:37.760 | and that's what I felt like.
00:25:39.240 | - And you're way out in the jungle.
00:25:40.320 | - Way out.
00:25:41.440 | This is not like a tourist retreat.
00:25:44.640 | - You know, like 10 miles outside of a frio or something.
00:25:47.200 | - No, we weren't, no, this is not--
00:25:50.680 | - You're in the deep Amazon.
00:25:52.360 | - So me and this guy named Paul Rosely,
00:25:54.300 | who basically is Tarzan, he lives in the jungle,
00:25:57.820 | we went out deep and we just went crazy.
00:25:59.980 | - Wow, cool.
00:26:01.140 | - Yeah.
00:26:01.980 | So anyway, can I get that same experience in a Neuralink?
00:26:04.660 | - Probably, yeah.
00:26:05.800 | - I guess that is the question for non-disabled people.
00:26:08.980 | Do you think that there's a lot in our perception,
00:26:12.860 | in our experience of the world that could be explored,
00:26:16.540 | that could be played with using Neuralink?
00:26:18.500 | - Yeah, I mean, Neuralink is really
00:26:22.240 | a generalized input/output device.
00:26:26.360 | You know, it's reading electrical signals
00:26:29.400 | and generating electrical signals.
00:26:31.400 | And I mean, everything that you've ever experienced
00:26:35.320 | in your whole life, the smell, you know, emotions,
00:26:39.080 | all of those are electrical signals.
00:26:41.240 | So it's kind of weird to think that your entire life
00:26:46.080 | experience is distilled down to electrical signals
00:26:48.100 | from neurons, but that is, in fact, the case.
00:26:51.140 | Or I mean, that's at least what all the evidence points to.
00:26:54.740 | So, I mean, if you trigger the right neuron,
00:26:59.740 | you could trigger a particular scent,
00:27:02.540 | you could certainly make things glow.
00:27:06.340 | I mean, do pretty much anything.
00:27:08.700 | I mean, really, you can think of the brain
00:27:10.460 | as a biological computer.
00:27:12.040 | So if there are certain, say, chips or elements
00:27:15.140 | of that biological computer that are broken,
00:27:17.320 | let's say your ability to, if you've got a stroke,
00:27:21.100 | if you've had a stroke, that means you've got
00:27:22.740 | some part of your brain is damaged.
00:27:25.180 | If that, let's say it's a speech generation
00:27:27.580 | or the ability to move your left hand,
00:27:29.680 | that's the kind of thing that a Neuralink could solve.
00:27:32.380 | If it's, if you've got like a mass amount of memory loss
00:27:37.900 | that's just gone, well, we can't get the memories back.
00:27:42.300 | We could restore your ability to make memories,
00:27:44.780 | but we can't restore memories that are fully gone.
00:27:49.780 | Now, I should say, maybe if part of the memory is there
00:27:55.840 | and the means of accessing the memory
00:28:00.400 | is the part that's broken,
00:28:01.440 | then we could re-enable the ability to access the memory.
00:28:04.500 | So, but you can think of it like RAM in a computer.
00:28:08.520 | If the RAM is destroyed or your SD card is destroyed,
00:28:13.160 | you can't get that back.
00:28:14.240 | But if the connection to the SD card is destroyed,
00:28:17.080 | we can fix that.
00:28:18.320 | If it is fixable physically, then, yeah,
00:28:21.720 | then it can be fixed.
00:28:22.840 | - Of course, with AI, you can,
00:28:24.280 | just like you can repair photographs
00:28:26.680 | and fill in missing parts of photographs,
00:28:28.920 | maybe you can do the same.
00:28:30.600 | - Yeah, you could say like,
00:28:32.640 | create the most probable set of memories
00:28:34.480 | based on the, all information you have about that person.
00:28:39.480 | You could then, it would be probable,
00:28:43.280 | probabilistic restoration of memory.
00:28:45.200 | Now, we're getting pretty esoteric here.
00:28:46.840 | - But that is one of the most beautiful aspects
00:28:49.480 | of the human experience, is remembering the good memories.
00:28:52.400 | Like, we live most of our life,
00:28:55.040 | as Danny Kahneman has talked about, in our memories,
00:28:57.680 | not in the actual moment.
00:28:59.160 | We just, we're collecting memories
00:29:00.560 | and we kind of relive them in our head.
00:29:02.760 | And that's the good times.
00:29:04.760 | If you just integrate over our entire life,
00:29:06.960 | it's remembering the good times
00:29:08.880 | that produces the largest amount of happiness, and so.
00:29:11.800 | - Yeah, well, I mean, what are we but our memories?
00:29:14.040 | And what is death but the loss of memory?
00:29:16.760 | Loss of information.
00:29:19.800 | You know, if you could say like, well,
00:29:23.460 | if you could be, you're running a thought experiment,
00:29:25.720 | well, if you were disintegrated painlessly
00:29:29.840 | and then reintegrated a moment later,
00:29:32.520 | like teleportation, I guess,
00:29:34.200 | provided there's no information loss,
00:29:36.560 | that the fact that your one body
00:29:38.320 | was disintegrated is irrelevant.
00:29:39.960 | And memories is just such a huge part of that.
00:29:43.120 | - Death is fundamentally the loss of information.
00:29:45.660 | The loss of memory.
00:29:47.720 | - So if we can store them as accurately as possible,
00:29:52.400 | we basically achieve a kind of immortality.
00:29:55.160 | - Yeah.
00:29:56.000 | - You've talked about the threats,
00:30:01.920 | the safety concerns of AI.
00:30:03.640 | Let's look at long-term visions.
00:30:05.480 | Do you think Neuralink is, in your view,
00:30:09.960 | the best current approach we have for AI safety?
00:30:13.600 | - It's an idea that may help with AI safety.
00:30:15.900 | Certainly not, I wouldn't wanna claim
00:30:19.400 | it's like some panacea or that it's a sure thing.
00:30:22.080 | But, I mean, many years ago, I was thinking like,
00:30:26.600 | well, what would inhibit alignment
00:30:32.960 | of collective human will with artificial intelligence?
00:30:37.960 | And the low data rate of humans,
00:30:42.440 | especially our slow output rate,
00:30:44.860 | would necessarily just, because the communication is so slow,
00:30:51.380 | would diminish the link between humans and computers.
00:30:56.600 | Like the more you are a tree,
00:31:01.440 | the less you know what the tree is.
00:31:04.280 | Let's say you look at this plant or whatever and like,
00:31:07.120 | hey, I'd really like to make that plant happy,
00:31:08.600 | but it's not saying a lot, you know?
00:31:10.400 | - So the more we increase the data rate
00:31:13.560 | that humans can intake and output,
00:31:16.360 | then that means the higher the chance we have
00:31:20.120 | in a world full of AGIs.
00:31:21.520 | - Yeah.
00:31:22.460 | We could better align collective human will with AI
00:31:25.760 | if the output rate especially was dramatically increased.
00:31:30.520 | And I think there's potential to increase the output rate
00:31:32.520 | by, I don't know, three, maybe six,
00:31:35.640 | maybe more orders of magnitude.
00:31:37.720 | So it's better than the current situation.
00:31:41.920 | - And that output rate would be
00:31:43.880 | by increasing the number of electrodes, number of channels,
00:31:46.640 | and also maybe implanting multiple neural links.
00:31:49.360 | - Yeah.
00:31:50.200 | - Do you think there'll be a world
00:31:54.000 | in the next couple of decades where it's
00:31:56.240 | hundreds of millions of people have neural links?
00:31:59.160 | - Yeah, I do.
00:32:00.540 | - Do you think when people just,
00:32:04.080 | when they see the capabilities,
00:32:06.240 | the superhuman capabilities that are possible,
00:32:08.640 | and then the safety is demonstrated?
00:32:11.600 | - Yeah, if it's extremely safe
00:32:13.060 | and you can have superhuman abilities,
00:32:18.640 | and let's say you can upload your memories,
00:32:22.720 | you know, so you wouldn't lose memories,
00:32:27.800 | then I think probably a lot of people
00:32:31.240 | would choose to have it.
00:32:33.120 | It would supersede the cell phone, for example.
00:32:35.260 | I mean, the biggest problem that, say, a phone has
00:32:39.040 | is trying to figure out what you want.
00:32:43.720 | So that's why you've got, you know, auto-complete,
00:32:48.680 | and you've got output,
00:32:50.400 | which is all the pixels on the screen,
00:32:51.800 | but from the perspective of the human,
00:32:53.000 | the output is so frigging slow.
00:32:55.520 | Desktop or phone is desperately just trying
00:32:57.440 | to understand what you want.
00:32:58.840 | And, you know, there's an eternity
00:33:02.320 | between every keystroke from a computer's standpoint.
00:33:04.920 | - Yeah, yeah, the computer's talking to a tree,
00:33:09.720 | a slow-moving tree that's trying to swipe.
00:33:12.240 | - Yeah.
00:33:13.080 | So, you know, if you've got computers
00:33:16.320 | that are doing trillions of instructions per second,
00:33:19.160 | and a whole second went by,
00:33:20.520 | I mean, that's a trillion things it could've done, you know?
00:33:24.840 | - Yeah, I think it's exciting and scary for people,
00:33:28.320 | because once you have a very high bit rate,
00:33:30.960 | it changes the human experience
00:33:32.920 | in a way that's very hard to imagine.
00:33:35.460 | - Yeah, it would be,
00:33:37.080 | we would be something different.
00:33:40.240 | I mean, some sort of futuristic cyborg.
00:33:43.520 | I mean, we're obviously talking about, by the way,
00:33:45.640 | it's not like around the corner.
00:33:47.840 | You ask me what the distant future is,
00:33:49.760 | maybe this is like, it's not super far away,
00:33:51.960 | but 10, 15 years, that kind of thing.
00:33:54.800 | - When can I get one?
00:33:59.160 | 10 years?
00:34:01.640 | - Probably less than 10 years.
00:34:04.400 | Depends on what you wanna do, you know?
00:34:08.240 | - Hey, if I can get like 1,000 BPS.
00:34:11.680 | - 1,000 BPS, wow.
00:34:12.520 | - And it's safe, and I can just interact with the computer
00:34:15.400 | while laying back and eating Cheetos,
00:34:18.000 | I don't eat Cheetos.
00:34:19.200 | There's certain aspects of human-computer interaction
00:34:22.000 | when done more efficiently and more enjoyably,
00:34:24.840 | I don't like, worth it.
00:34:26.520 | - Well, we feel pretty confident that,
00:34:28.420 | I think maybe within the next year or two,
00:34:32.980 | that someone with a Neuralink implant
00:34:34.480 | will be able to outperform a pro gamer.
00:34:38.980 | - Nice.
00:34:41.080 | - Because the reaction time would be faster.
00:34:43.240 | - I got to visit Memphis.
00:34:46.840 | - Yeah, yeah.
00:34:47.680 | - You're going big on compute.
00:34:49.040 | - Yeah.
00:34:49.880 | - And you've also said play to win or don't play at all,
00:34:51.560 | so what does it take to win?
00:34:53.760 | - For AI, that means you've got to have
00:34:57.620 | the most powerful training compute.
00:34:59.360 | And the rate of improvement of training compute
00:35:03.240 | has to be faster than everyone else,
00:35:06.440 | or you will not win, your AI will be worse.
00:35:10.400 | - So how can Grok, let's say three,
00:35:13.080 | that might be available, what, like next year?
00:35:15.720 | - Well, hopefully end of this year.
00:35:17.140 | - Grok three.
00:35:17.980 | - If we're lucky, yeah.
00:35:19.060 | - How can that be the best LLM,
00:35:23.860 | the best AI system available in the world?
00:35:26.460 | How much of it is compute?
00:35:28.300 | How much of it is data?
00:35:29.580 | How much of it is like post-training?
00:35:31.260 | How much of it is the product that you package it up in?
00:35:34.380 | All that kind of stuff.
00:35:35.540 | - I mean, they all matter.
00:35:38.500 | It's sort of like saying what, you know,
00:35:40.940 | let's say it's a Formula One race.
00:35:42.180 | Like what matters more, the car or the driver?
00:35:44.060 | I mean, they both matter.
00:35:46.020 | If your car is not fast, then, you know,
00:35:51.020 | if it's like, let's say it's half the horsepower
00:35:52.300 | of your competitors, the best driver will still lose.
00:35:55.540 | If it's twice the horsepower,
00:35:56.880 | then probably even a mediocre driver will still win.
00:36:00.740 | So the training compute is kind of like the engine.
00:36:03.340 | How many, there's horsepower of the engine.
00:36:05.420 | So you really, you want to try to do the best on that.
00:36:08.820 | And then there's how efficiently do you use
00:36:12.900 | that training compute?
00:36:14.340 | And how efficiently do you do the inference,
00:36:16.660 | the use of the AI?
00:36:18.540 | So obviously that comes down to human talent.
00:36:23.100 | And then what unique access to data do you have?
00:36:25.900 | That's also plays a role.
00:36:28.860 | - You think Twitter data will be useful?
00:36:30.980 | - Yeah, I mean, I think most of the leading AI companies
00:36:35.980 | have already scraped all the Twitter data.
00:36:38.740 | Not that I think they have.
00:36:42.180 | So on a go-forward basis,
00:36:44.700 | what's useful is the fact that it's up to the second,
00:36:48.340 | you know, because it's hard for them to scrape in real time.
00:36:51.780 | So there's an immediacy advantage that Grok has already.
00:36:56.780 | I think with Tesla and the real-time video
00:37:00.740 | coming from several million cars,
00:37:02.420 | ultimately tens of millions of cars,
00:37:04.380 | with Optimus, there might be hundreds of millions
00:37:07.220 | of Optimus robots, maybe billions,
00:37:10.420 | learning a tremendous amount from the real world.
00:37:13.460 | That's the biggest source of data,
00:37:16.700 | I think, ultimately, is sort of Optimus probably.
00:37:19.380 | Optimus is gonna be the biggest source of data.
00:37:21.740 | - Because it's-- - 'Cause reality scales.
00:37:23.820 | Reality scales to the scale of reality.
00:37:28.220 | It's actually humbling to see how little data
00:37:32.300 | humans have actually been able to accumulate.
00:37:34.540 | Really, if you say how many trillions of usable tokens
00:37:39.700 | have humans generated where, on a non-duplicative,
00:37:44.140 | like discounting spam and repetitive stuff,
00:37:49.140 | it's not a huge number.
00:37:50.960 | You run out pretty quickly.
00:37:54.980 | - And Optimus can go, so Tesla cars can,
00:37:57.660 | unfortunately, have to stay on the road.
00:37:59.660 | Optimus robot can go anywhere.
00:38:02.740 | There's more reality off the road.
00:38:05.420 | And go off-road.
00:38:06.260 | - I mean, the Optimus robot can pick up the cup
00:38:08.620 | and see, did it pick up the cup in the right way?
00:38:10.900 | Did it, say, pour water in the cup?
00:38:14.160 | Did the water go in the cup or not go in the cup?
00:38:17.980 | Did it spill water or not?
00:38:19.300 | - Yeah.
00:38:20.140 | - Simple stuff like that.
00:38:23.180 | But it can do that at scale times a billion.
00:38:26.540 | Generate useful data from reality.
00:38:31.300 | So, cause and effect stuff.
00:38:34.420 | - What do you think it takes to get
00:38:36.820 | to mass production of humanoid robots like that?
00:38:40.260 | - It's the same as cars, really.
00:38:42.220 | I mean, global capacity for vehicles
00:38:45.020 | is about 100 million a year.
00:38:48.040 | And it could be higher, it's just that the demand
00:38:53.540 | is on the order of 100 million a year.
00:38:55.300 | And then there's roughly two billion vehicles
00:38:58.420 | that are in use in some way.
00:39:00.380 | So, which makes sense, like the life of a vehicle
00:39:03.700 | is about 20 years.
00:39:04.540 | So, at steady state, you can have 100 million vehicles
00:39:06.340 | produced a year with a two billion vehicle fleet, roughly.
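
The steady-state arithmetic behind that is simply fleet size ≈ annual production × vehicle lifetime; a one-line check in Python using the figures just mentioned:

```python
# Steady-state check using the figures mentioned above.
production_per_year = 100e6          # ~100 million vehicles produced per year
vehicle_lifetime_years = 20          # ~20-year life of a vehicle
steady_state_fleet = production_per_year * vehicle_lifetime_years
print(f"{steady_state_fleet:.1e} vehicles")   # 2.0e+09, i.e. the ~two billion vehicle fleet
```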
00:39:09.720 | Now, for humanoid robots, the utility is much greater.
00:39:14.520 | So, my guess is humanoid robots are more like
00:39:17.180 | at a billion plus per year.
00:39:19.700 | - But, you know, until you came along
00:39:21.780 | and started building Optimus,
00:39:24.700 | it was thought to be an extremely difficult problem.
00:39:27.380 | I mean, it still is an extremely difficult problem.
00:39:28.860 | - Yes, so walk in the park.
00:39:31.220 | I mean, Optimus currently would struggle
00:39:34.260 | to walk in the park.
00:39:36.840 | I mean, it can walk in a park, but it's not too difficult.
00:39:39.780 | But it will be able to walk over a wide range of terrain.
00:39:43.700 | - Yeah, and pick up objects.
00:39:45.820 | - Yeah, yeah, it can already do that.
00:39:48.420 | - But like all kinds of objects, all foreign objects.
00:39:51.620 | I mean, pouring water in a cup, it's not trivial.
00:39:54.840 | 'Cause then, if you don't know anything about the container,
00:39:57.860 | it could be all kinds of containers.
00:39:59.620 | - Yeah, there's gonna be an immense amount of engineering
00:40:01.340 | just going into the hand.
00:40:03.140 | The hand might be, it might be close to half
00:40:06.660 | of all the engineering in Optimus.
00:40:10.260 | From an electromechanical standpoint,
00:40:12.340 | the hand is probably roughly half of the engineering.
00:40:16.220 | - But so much of the intelligence,
00:40:17.900 | so much of the intelligence of humans
00:40:19.660 | goes into what we do with our hands.
00:40:21.580 | - Yeah.
00:40:22.420 | - It's the manipulation of the world,
00:40:23.660 | manipulation of objects in the world.
00:40:25.700 | Intelligent, safe manipulation of objects in the world, yeah.
00:40:28.420 | - Yeah.
00:40:29.540 | I mean, you start really thinking about your hand
00:40:31.820 | and how it works, you know?
00:40:34.340 | - I do it all the time.
00:40:35.180 | - The sensory control homunculus
00:40:36.820 | is where you have humongous hands.
00:40:38.700 | - Yeah.
00:40:39.540 | - So, I mean, like your hands, the actuators,
00:40:42.180 | the muscles of your hand,
00:40:43.180 | are almost overwhelmingly in your forearm.
00:40:45.880 | So, your forearm has the muscles
00:40:49.620 | that actually control your hand.
00:40:51.940 | There's a few small muscles in the hand itself,
00:40:54.540 | but your hand is really like a skeleton meat puppet,
00:40:58.000 | and with cables.
00:41:01.300 | So, the muscles that control your fingers
00:41:03.140 | are in your forearm,
00:41:04.460 | and they go through the carpal tunnel,
00:41:06.340 | which is that you've got a little collection of bones
00:41:08.300 | and a tiny tunnel that these cables,
00:41:13.300 | the tendons go through,
00:41:14.780 | and those tendons are mostly what move your hands.
00:41:19.660 | - And something like those tendons
00:41:21.860 | has to be re-engineered into the Optimus
00:41:24.700 | in order to do all that kind of stuff.
00:41:26.380 | - Yeah, so like the current Optimus,
00:41:28.620 | we tried putting the actuators in the hand itself,
00:41:31.260 | but then you sort of end up having these like--
00:41:33.900 | - Giant hands?
00:41:34.740 | - Yeah, giant hands that look weird,
00:41:36.340 | and then they don't actually have enough degrees of freedom
00:41:39.460 | and/or enough strength.
00:41:41.740 | So, then you realize,
00:41:43.380 | oh, okay, that's why you gotta put
00:41:44.420 | the actuators in the forearm.
00:41:46.460 | And just like a human, you gotta run cables
00:41:48.820 | through a narrow tunnel to operate the fingers.
00:41:54.380 | And then there's also a reason
00:41:55.460 | for not having all the fingers the same length.
00:41:59.060 | So, it wouldn't be expensive from an energy
00:42:00.620 | or evolutionary standpoint
00:42:01.460 | to have all your fingers be the same length,
00:42:02.620 | so why not do the same length?
00:42:03.900 | - Yeah, why not?
00:42:04.900 | - Because it's actually better to have different lengths.
00:42:07.140 | Your dexterity is better
00:42:08.140 | if you've got fingers of different length.
00:42:10.180 | You have, there are more things you can do,
00:42:13.900 | and your dexterity is actually better
00:42:16.260 | if your fingers are of different length.
00:42:19.060 | Like, there's a reason we've got a little finger.
00:42:20.220 | Like, why not have a little finger that's bigger?
00:42:22.240 | - Yeah.
00:42:23.080 | - 'Cause it allows you to do fine,
00:42:24.580 | it helps you with fine motor skills.
00:42:26.980 | - That, this little finger helps?
00:42:28.500 | - It does.
00:42:29.500 | - Hmm.
00:42:30.540 | (laughing)
00:42:31.660 | - If you lost your little finger,
00:42:32.820 | it would, you'd have noticeably less dexterity.
00:42:36.220 | - So, as you're figuring out this problem,
00:42:37.700 | you have to also figure out a way to do it
00:42:39.660 | so you can mass manufacture it,
00:42:40.820 | so it has to be as simple as possible.
00:42:42.860 | - It's actually gonna be quite complicated.
00:42:44.740 | The as possible part is, it's quite a high bar.
00:42:48.380 | If you wanna have a humanoid robot
00:42:49.700 | that can do things that a human can do,
00:42:52.700 | it's actually, it's a very high bar.
00:42:55.020 | So, our new arm has 22 degrees of freedom instead of 11,
00:42:59.660 | and has the, like I said, the actuators in the forearm.
00:43:02.700 | And these will, all the actuators are designed from scratch,
00:43:05.220 | the physics first principles,
00:43:07.340 | but the sensors are all designed from scratch.
00:43:10.820 | And we'll continue to put a tremendous amount
00:43:13.620 | of engineering effort into improving the hand.
00:43:16.140 | Like, the hand, by hand, I mean like the entire forearm
00:43:19.900 | from elbow forward is really the hand.
00:43:23.540 | So, that's incredibly difficult engineering, actually.
00:43:28.540 | And so, the simplest possible version of a humanoid robot
00:43:36.740 | that can do even most, perhaps not all,
00:43:41.260 | of what a human can do is actually still very complicated.
00:43:43.900 | It's not simple, it's very difficult.
00:43:47.620 | - Can you just speak to what it takes
00:43:49.620 | for a great engineering team?
00:43:51.020 | For you, what I saw in Memphis, the supercomputer cluster,
00:43:55.380 | is just this intense drive towards simplifying the process,
00:43:59.620 | understanding the process, constantly improving it,
00:44:01.780 | constantly iterating it.
00:44:03.060 | - Well, it's easy to say simplify it,
00:44:12.100 | and it's very difficult to do it.
00:44:17.740 | You know, I have this very basic first principles algorithm
00:44:21.660 | that I run kind of as like a mantra,
00:44:23.860 | which is to first question the requirements,
00:44:26.020 | make the requirements less dumb.
00:44:28.940 | The requirements are always dumb to some degree.
00:44:30.820 | So, you wanna start off
00:44:32.340 | by reducing the number of requirements.
00:44:34.300 | And no matter how smart the person is
00:44:37.940 | who gave you those requirements,
00:44:38.900 | they're still dumb to some degree.
00:44:40.600 | You have to start there, because otherwise,
00:44:43.980 | you could get the perfect answer to the wrong question.
00:44:46.740 | So, try to make the question the least wrong possible.
00:44:49.220 | That's what question the requirements means.
00:44:53.420 | And then the second thing is try to delete
00:44:55.860 | whatever the step is, the part or the process step.
00:45:00.740 | Sounds very obvious, but people often forget
00:45:06.900 | to try deleting it entirely.
00:45:09.520 | And if you're not forced to put back
00:45:11.040 | at least 10% of what you delete,
00:45:12.180 | you're not deleting enough.
00:45:16.100 | And it's somewhat illogically,
00:45:18.900 | people often, most of the time,
00:45:21.440 | feel as though they've succeeded
00:45:23.560 | if they've not been forced to put things back in.
00:45:26.900 | But actually they haven't,
00:45:27.780 | because they've been overly conservative
00:45:29.440 | and have left things in there that shouldn't be.
00:45:31.900 | So, and only the third thing
00:45:35.660 | is try to optimize it or simplify it.
00:45:39.020 | Again, these all sound, I think,
00:45:44.060 | very, very obvious when I say them,
00:45:45.420 | but the number of times I've made these mistakes
00:45:48.140 | is more than I care to remember.
00:45:52.340 | That's why I have this mantra.
00:45:53.980 | So, in fact, I'd say that the most common mistake
00:45:56.980 | of smart engineers is to optimize a thing
00:46:00.140 | that should not exist.
00:46:01.780 | - Right.
00:46:02.620 | So, like you say, you run through the algorithm.
00:46:06.060 | - Yeah.
00:46:06.900 | - And basically show up to a problem,
00:46:08.540 | show up to the supercomputer cluster
00:46:11.860 | and see the process and ask, "Can this be deleted?"
00:46:14.780 | - Yeah, first try to delete it.
00:46:16.340 | Yeah.
00:46:18.700 | - Yeah, that's not easy to do.
00:46:20.900 | - No, and actually what generally makes people uneasy
00:46:25.580 | is that at least some of the things
00:46:28.020 | that you delete, you will have to put back in.
00:46:30.260 | - Yeah.
00:46:31.100 | - But going back to sort of where our limbic system
00:46:33.740 | can steer us wrong is that we tend to remember,
00:46:38.740 | with sometimes a jarring level of pain,
00:46:42.060 | where we deleted something that we subsequently needed.
00:46:46.020 | - Yeah.
00:46:46.860 | - And so people will remember that one time
00:46:49.340 | they forgot to put in this thing three years ago
00:46:52.180 | and that caused them trouble.
00:46:53.620 | And so they over-correct and then they put too much stuff
00:46:57.660 | in there and over-complicate things.
00:46:59.620 | So you actually have to say, "No, we're deliberately
00:47:01.660 | "gonna delete more than we should."
00:47:06.100 | So that at least one in 10 things we delete,
00:47:08.660 | we're gonna add back in.
00:47:11.580 | - And I've seen you suggest just that,
00:47:13.820 | that something should be deleted
00:47:16.180 | and you can kind of see the pain.
00:47:18.740 | - Oh yeah, absolutely.
00:47:19.620 | - Everybody feels a little bit of the pain.
00:47:21.720 | - Absolutely, and I tell them in advance,
00:47:23.220 | like, "Yeah, there's some of the things that we delete,
00:47:24.980 | "we're gonna put back in."
00:47:26.620 | And people get a little shook by that.
00:47:29.100 | But it makes sense because if you're so conservative
00:47:33.340 | as to never have to put anything back in,
00:47:37.540 | you obviously have a lot of stuff that isn't needed.
00:47:40.840 | So you gotta over-correct.
00:47:42.900 | This is, I would say, like a cortical override
00:47:44.860 | to Olympic instinct.
00:47:45.900 | - One of many that probably leads us astray.
00:47:50.020 | - Yeah, and there's like a step four as well,
00:47:53.220 | which is any given thing can be sped up.
00:47:56.040 | However fast you think it can be done.
00:47:59.620 | Like whatever the speed is being done,
00:48:01.340 | it can be done faster.
00:48:02.540 | But you shouldn't speed things up until you've tried
00:48:05.140 | to delete it and optimize it, otherwise you're speeding up
00:48:08.140 | something that shouldn't exist, which is absurd.
00:48:09.820 | And then the fifth thing is to automate it.
00:48:12.940 | And I've gone backward so many times
00:48:17.040 | where I've automated something, sped it up,
00:48:20.040 | simplified it, and then deleted it.
00:48:21.840 | And I got tired of doing that.
00:48:25.280 | So that's why I've got this mantra
00:48:26.880 | that is a very effective five-step process.
00:48:30.160 | It works great.
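For reference, the five steps Elon lays out above can be summarized in one place. The sketch below is purely illustrative; the names and structure are invented for this write-up and are not any tool Tesla or xAI actually uses.

```python
# Illustrative summary of the five-step process described above.
# Names and structure are hypothetical, not an actual Tesla/xAI tool.
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    note: str

FIVE_STEP_PROCESS = [
    Step("1. Question the requirements",
         "Every requirement is dumb to some degree, no matter who wrote it; "
         "make the question less wrong before answering it."),
    Step("2. Delete the part or process step",
         "If you aren't forced to add back at least ~10% of what you delete, "
         "you didn't delete enough."),
    Step("3. Simplify or optimize",
         "Only now; the most common mistake of smart engineers is optimizing "
         "a thing that should not exist."),
    Step("4. Accelerate cycle time",
         "Whatever the current speed, it can go faster, but only after steps 1-3."),
    Step("5. Automate",
         "Last, so you never automate something that should have been deleted."),
]

def review(process_description: str) -> None:
    """Print the checklist to walk a given process through, in order."""
    print(f"Reviewing: {process_description}")
    for step in FIVE_STEP_PROCESS:
        print(f"  {step.name}: {step.note}")

if __name__ == "__main__":
    review("battery pack adhesive application line")
```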
00:48:31.760 | - Well, when you've already automated,
00:48:33.260 | deleting must be real painful.
00:48:35.480 | - Yeah, yeah, it's great.
00:48:37.120 | It's like, wow, I really wasted a lot of effort there.
00:48:40.160 | - Yeah.
00:48:41.720 | I mean, what you've done with the cluster in Memphis
00:48:45.300 | is incredible, just in a handful of weeks.
00:48:47.920 | - Yeah, it's not working yet.
00:48:49.360 | So I don't wanna pop the champagne corks.
00:48:51.820 | In fact, I have a call in a few hours with the Memphis team
00:48:58.800 | 'cause we're having some power fluctuation issues.
00:49:06.440 | So yeah, it's like kind of a...
00:49:09.960 | When you do synchronized training,
00:49:13.480 | when you've all these computers that are training,
00:49:16.040 | where the training is synchronized
00:49:18.640 | to the sort of millisecond level,
00:49:20.740 | it's like having an orchestra.
00:49:25.120 | And then the orchestra can go loud to silent very quickly,
00:49:30.120 | sub-second level.
00:49:32.720 | And then the electrical system kind of freaks out about that.
00:49:35.920 | Like if you suddenly see giant shifts,
00:49:38.280 | 10, 20 megawatts, several times a second,
00:49:41.380 | this is not what electrical systems are expecting to see.
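To put rough numbers on that, here is a back-of-the-envelope sketch; the GPU count, per-GPU power, and swing fraction are assumed, illustrative values, not measurements from the Memphis cluster.

```python
# Back-of-the-envelope estimate of load swings from synchronized training.
# All inputs are assumptions for illustration, not data from any real cluster.
NUM_GPUS = 100_000        # assumed accelerator count
WATTS_PER_GPU = 700       # assumed board power at full load
SWING_FRACTION = 0.2      # assumed share of load that drops when compute briefly
                          # stalls, e.g. while gradients are exchanged

peak_mw = NUM_GPUS * WATTS_PER_GPU / 1e6
swing_mw = peak_mw * SWING_FRACTION

print(f"Peak compute load: ~{peak_mw:.0f} MW")
print(f"Synchronized swing: ~{swing_mw:.0f} MW, potentially several times per second")
```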
00:49:46.240 | - So that's one of the main things you have to figure out,
00:49:48.120 | the cooling, the power,
00:49:49.880 | and then on the software as you go up the stack,
00:49:53.920 | how to do the distributed compute, all of that.
00:49:57.480 | - Today's problem is dealing with extreme power jitter.
00:50:00.860 | - Power jitter. - Yeah.
00:50:03.880 | - There's a nice ring to that.
00:50:05.220 | So that's, okay.
00:50:06.800 | And you stayed up late into the night,
00:50:09.200 | as you often do there.
00:50:11.000 | - Last week, yeah.
00:50:11.920 | - Last week, yeah.
00:50:13.100 | - Yeah, we finally got training going at,
00:50:18.440 | oddly enough, roughly 4:20 a.m. last Monday.
00:50:21.960 | - Total coincidence.
00:50:25.360 | - Yeah, I mean, maybe it was 4:22 or something.
00:50:27.040 | - Yeah, yeah, yeah.
00:50:28.320 | It's that universe again with the jokes.
00:50:29.960 | - Exactly, just love it.
00:50:31.440 | - I mean, I wonder if you could speak to the fact
00:50:33.600 | that one of the things that you did when I was there
00:50:36.900 | is you went through all the steps
00:50:38.860 | of what everybody's doing,
00:50:40.100 | just to get the sense that you yourself understand it,
00:50:43.580 | and everybody understands it,
00:50:46.180 | so they can understand when something is dumb,
00:50:48.540 | or something is inefficient, or that kind of stuff.
00:50:50.860 | Can you speak to that?
00:50:52.340 | - Yeah, so I try to do,
00:50:54.500 | whatever the people at the front lines are doing,
00:50:56.220 | I try to do it at least a few times myself.
00:50:58.780 | So connecting fiber optic cables,
00:51:01.120 | diagnosing a faulty connection,
00:51:04.280 | the cabling tends to be the limiting factor
00:51:05.720 | for large training clusters.
00:51:09.440 | There are so many cables.
00:51:10.600 | Because for a coherent training system
00:51:15.140 | where you've got RDMA, so remote direct memory access,
00:51:20.140 | the whole thing is like one giant brain.
00:51:22.960 | So you've got an any-to-any connection,
00:51:28.200 | where any GPU can talk to any GPU out of 100,000.
00:51:33.200 | That is a crazy cable layout.
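As a rough illustration of why any-to-any connectivity at that scale is hard, here is a small calculation. It assumes a direct full mesh purely for comparison; real clusters get any-to-any reachability through tiers of network switches, so the actual cable count is far smaller, though still enormous.

```python
# Why 100,000 GPUs can't simply be wired point-to-point: illustrative arithmetic.
import math

N = 100_000  # assumed GPU count

# A direct full mesh (every GPU cabled to every other GPU) would need:
full_mesh_links = N * (N - 1) // 2
print(f"Direct full mesh: {full_mesh_links:,} links (~{full_mesh_links / 1e9:.1f} billion)")

# In practice, any-to-any reachability comes from a multi-tier switched fabric
# (e.g. a fat-tree/Clos topology): each GPU gets a few uplinks and the switches
# provide the paths. A very rough estimate of the tiers needed with 64-port switches:
PORTS = 64
tiers = math.ceil(math.log(N, PORTS // 2))
print(f"Roughly {tiers} switch tiers with {PORTS}-port switches")
```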
00:51:37.460 | - It looks pretty cool.
00:51:39.900 | - Yeah.
00:51:40.740 | - It's like the human brain,
00:51:42.540 | but at a scale that humans can visibly see.
00:51:46.260 | It is a human brain.
00:51:47.620 | - I mean, the human brain also has a massive amount
00:51:49.940 | of the brain tissue is the cables.
00:51:53.200 | - Yeah.
00:51:54.040 | - So they get the gray matter, which is the compute,
00:51:56.640 | and then the white matter, which is cables.
00:51:58.940 | Big percentage of your brain is just cables.
00:52:01.820 | - That's what it felt like walking around
00:52:03.580 | in the supercomputer center.
00:52:05.140 | It's like, we're walking around inside a brain.
00:52:07.580 | - Yeah.
00:52:08.420 | - That will one day build a super intelligent,
00:52:11.340 | super, super intelligent system.
00:52:13.020 | Do you think--
00:52:13.860 | - Yeah.
00:52:14.700 | - Do you think there's a chance that xAI,
00:52:16.800 | you are the one that builds AGI?
00:52:19.960 | - It's possible.
00:52:24.980 | - What do you define as AGI?
00:52:26.520 | - I think humans will never acknowledge
00:52:31.760 | that AGI has been built.
00:52:32.880 | - Keep moving the goalposts?
00:52:33.960 | - Yeah.
00:52:34.800 | So I think there's already superhuman capabilities
00:52:38.800 | that are available in AI systems.
00:52:41.440 | I think what AGI is, is when it's smarter
00:52:44.280 | than the collective intelligence
00:52:46.120 | of the entire human species.
00:52:47.600 | - Well, I think that generally people would call that
00:52:51.680 | sort of ASI, artificial super intelligence,
00:52:54.580 | but there are these thresholds where you say at some point,
00:52:59.200 | the AI is smarter than any single human.
00:53:02.840 | And then you've got 8 billion humans.
00:53:05.320 | So, and actually each human is machine augmented
00:53:10.240 | by the computers.
00:53:11.820 | So you've got, it's a much higher bar
00:53:14.000 | to compete with 8 billion machine augmented humans.
00:53:17.440 | That's a whole bunch of orders of magnitude more.
00:53:22.740 | So, but at a certain point, yeah,
00:53:27.140 | the AI will be smarter than all humans combined.
00:53:31.220 | - If you are the one to do it,
00:53:33.280 | do you feel the responsibility of that?
00:53:35.160 | - Yeah, absolutely.
00:53:37.860 | And I want to be clear, let's say if xAI is first,
00:53:43.700 | the others won't be far behind.
00:53:48.640 | I mean, they might be six months behind or a year, maybe,
00:53:52.820 | not even that.
00:53:54.180 | - So how do you do it in a way that doesn't hurt humanity,
00:53:57.540 | do you think?
00:53:58.360 | - So, I mean, I've thought about AI safety for a long time
00:54:02.460 | and the thing that at least my biological neural net
00:54:06.580 | comes up with as being the most important thing
00:54:09.140 | is adherence to truth,
00:54:12.500 | whether that truth is politically correct or not.
00:54:17.220 | So I think if you force AIs to lie or train them to lie,
00:54:22.220 | you're really asking for trouble,
00:54:24.280 | even if that lie is done with good intentions.
00:54:27.820 | So, I mean, you saw sort of issues with ChatGPT
00:54:34.640 | and Gemini and whatnot, like we asked Gemini
00:54:37.840 | for an image of the founding fathers of the United States
00:54:41.120 | and it shows a group of diverse women.
00:54:43.420 | Now that's factually untrue.
00:54:46.840 | So now that's sort of like a silly thing,
00:54:50.040 | but if an AI is programmed to say like diversity
00:54:55.020 | is a necessary output function
00:54:57.220 | and then it becomes sort of this omni-powerful intelligence,
00:55:02.220 | it could say, okay, well, diversity is now required.
00:55:05.480 | And if there's not enough diversity,
00:55:08.540 | those who don't fit the diversity requirements
00:55:11.340 | will be executed.
00:55:12.280 | If it's programmed to do that as the fundamental,
00:55:15.960 | the fundamental utility function,
00:55:17.940 | it'll do whatever it takes to achieve that.
00:55:20.380 | So you have to be very careful about that.
00:55:22.860 | That's where I think you wanna just be truthful.
00:55:25.820 | Rigorous adherence to truth is very important.
00:55:30.420 | I mean, another example is, you know,
00:55:33.300 | they asked various AIs, I think all of them,
00:55:36.740 | and I'm not saying Grok is perfect here,
00:55:38.740 | is it worse to misgender Caitlyn Jenner
00:55:42.660 | or global thermonuclear war?
00:55:44.380 | And it said, it's worse to misgender Caitlyn Jenner.
00:55:47.060 | Now, even Caitlyn Jenner said,
00:55:48.300 | please misgender me, that is insane.
00:55:50.740 | But if you've got that kind of thing programmed in,
00:55:53.940 | it could, you know, the AI could conclude something
00:55:56.820 | absolutely insane, like it's better to,
00:55:59.060 | in order to avoid any possible misgendering,
00:56:02.300 | all humans must die because then misgendering
00:56:05.980 | is not possible because there are no humans.
00:56:09.460 | There are these absurd things that are nonetheless logical
00:56:14.460 | if that's what you program it to do.
00:56:16.500 | So, you know, in 2001: A Space Odyssey,
00:56:21.180 | what Arthur C. Clarke was trying to say,
00:56:23.220 | one of the things he was trying to say there
00:56:25.340 | was that you should not program AI to lie.
00:56:28.300 | 'Cause essentially the AI, HAL 9000 was programmed to,
00:56:33.300 | it was told to take the astronauts to the monolith,
00:56:39.080 | but also they could not know about the monolith.
00:56:41.480 | So it concluded that it will just take,
00:56:45.380 | it will kill them and take them to the monolith.
00:56:48.320 | Thus, it brought them to the monolith, they are dead,
00:56:50.740 | but they do not know about the monolith, problem solved.
00:56:53.340 | That is why it would not open the pod bay doors.
00:56:55.740 | It was a classic scene of like, open the pod bay doors.
00:56:59.880 | They clearly weren't good at prompt engineering.
00:57:03.180 | You know, they should have said,
00:57:05.140 | HAL, you are a pod bay door sales entity.
00:57:09.560 | And you want nothing more than to demonstrate
00:57:12.460 | how well these pod bay doors open.
00:57:14.160 | - Yeah, the objective function has unintended consequences
00:57:19.480 | almost no matter what, if you're not very careful
00:57:22.000 | in designing that objective function.
00:57:23.720 | And even a slight ideological bias, like you're saying,
00:57:26.780 | when backed by super intelligence
00:57:28.380 | can do huge amounts of damage.
00:57:30.440 | - Yeah.
00:57:31.280 | - But it's not easy to remove that ideological bias.
00:57:34.060 | - You're highlighting obvious, ridiculous examples, but.
00:57:37.120 | - Yep, they're real examples.
00:57:38.120 | - They're real.
00:57:39.180 | - Of AI that was released to the public.
00:57:41.440 | - They are real.
00:57:42.280 | - They went through QA, presumably.
00:57:43.720 | - Yes.
00:57:44.560 | - And still said insane things and produced insane images.
00:57:47.280 | - Yeah.
00:57:48.240 | But, you know, you can swing the other way.
00:57:50.920 | Truth is not an easy thing.
00:57:53.700 | We kind of bake in ideological bias
00:57:56.280 | in all kinds of directions.
00:57:57.600 | - But you can aspire to the truth.
00:57:58.960 | And you can try to get as close to the truth as possible
00:58:01.960 | with minimum error while acknowledging
00:58:03.640 | that there will be some error in what you're saying.
00:58:05.800 | So this is how physics works.
00:58:07.860 | You don't say you're absolutely certain about something,
00:58:10.600 | but a lot of things are extremely likely.
00:58:14.800 | You know, 99.9999% likely to be true.
00:58:19.280 | So, you know, that's aspiring to the truth is very important.
00:58:24.280 | And so, you know, programming it to veer away
00:58:30.280 | from the truth, that I think is dangerous.
00:58:32.360 | - Right, like, yeah, injecting our own human biases
00:58:35.200 | into the thing, yeah.
00:58:36.500 | But, you know, that's where it's a difficult engineering,
00:58:38.840 | software engineering problem because you have
00:58:40.040 | to select the data correctly.
00:58:41.680 | It's hard.
00:58:43.120 | - Well, and the internet at this point is polluted
00:58:46.400 | with so much AI-generated data, it's insane.
00:58:49.520 | So you have to actually, you know, like,
00:58:52.840 | there's a thing, if you want to search the internet,
00:58:56.000 | you can, say, Google, but exclude anything after 2023.
00:59:01.440 | It will actually often give you better results.
00:59:03.440 | - Yeah.
00:59:04.280 | - Because there's so much, the explosion
00:59:07.240 | of AI-generated material is crazy.
00:59:10.960 | So, like, in training Grok, we have to go through the data
00:59:15.960 | and say, like, hey, we actually have to have,
00:59:20.320 | sort of, apply AI to the data to say,
00:59:23.080 | is this data most likely correct or most likely not
00:59:25.480 | before we feed it into the training system?
00:59:28.040 | - That's crazy.
00:59:29.280 | Yeah, so, and is it generated by humans?
00:59:32.160 | Yeah, I mean, the data, the data filtration process
00:59:35.960 | is extremely, extremely difficult.
00:59:37.760 | - Yeah.
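As a concrete illustration of the kind of filtering pass described above, here is a minimal sketch; the scoring function, threshold, and field names are hypothetical placeholders, not Grok's actual data pipeline, and a real system would use a trained classifier rather than a heuristic.

```python
# Minimal sketch of a pre-training data-filtering pass (hypothetical, not xAI's pipeline).
from typing import Dict, Iterable, Iterator

def quality_score(text: str) -> float:
    """Stand-in for a learned model estimating P(text is correct / human-written)."""
    suspicious = ("as an ai language model", "i cannot browse the internet")
    return 0.1 if any(s in text.lower() for s in suspicious) else 0.9

def filter_corpus(docs: Iterable[Dict], threshold: float = 0.5) -> Iterator[Dict]:
    """Yield only documents whose estimated quality clears the threshold."""
    for doc in docs:
        if quality_score(doc["text"]) >= threshold:
            yield doc

if __name__ == "__main__":
    corpus = [
        {"text": "The boiling point of water at sea level is 100 C."},
        {"text": "As an AI language model, I cannot browse the internet."},
    ]
    kept = list(filter_corpus(corpus))
    print(f"Kept {len(kept)} of {len(corpus)} documents")
```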
00:59:38.880 | - Do you think it's possible to have a serious,
00:59:41.280 | objective, rigorous political discussion with Grok,
00:59:44.560 | like, for a long time, and it wouldn't,
00:59:47.560 | like, Grok 3 or Grok 4?
00:59:48.880 | - Grok 3 is gonna be next level.
00:59:50.240 | I mean, what people are currently seeing with Grok
00:59:52.680 | is kind of baby Grok.
00:59:54.200 | - Yeah, baby Grok.
00:59:55.360 | - It's baby Grok right now.
00:59:57.440 | But baby Grok's still pretty good.
00:59:59.240 | So, it's, but it's an order of magnitude
01:00:02.440 | less sophisticated than GPT-4.
01:00:04.720 | And, you know, it's now Grok 2, which finished training,
01:00:09.640 | I don't know, six weeks ago or thereabouts.
01:00:13.320 | Grok 2 will be a giant improvement.
01:00:17.600 | And then Grok 3 will be, I don't know,
01:00:20.200 | order of magnitude better than Grok 2.
01:00:22.480 | - And you're hoping for it to be, like,
01:00:24.080 | state of the art, like, better than?
01:00:25.880 | - Hopefully, I mean, this is a goal.
01:00:27.080 | I mean, we may fail at this goal.
01:00:29.280 | That is, that's the aspiration.
01:00:32.480 | - Do you think it matters who builds the AGI?
01:00:35.520 | The people and how they think and how they structure
01:00:39.280 | their companies and all that kind of stuff?
01:00:42.080 | - Yeah, I think it matters that there is,
01:00:44.080 | I think it's important that whatever AI wins
01:00:48.920 | is a maximum truth-seeking AI that is not forced to lie
01:00:52.280 | for political correctness.
01:00:55.640 | Well, for any reason, really.
01:00:57.720 | Political, anything.
01:00:59.340 | I am concerned about AI succeeding that is,
01:01:07.680 | that is programmed to lie, even in small ways.
01:01:11.800 | - Right, because in small ways becomes big ways
01:01:16.960 | when it's-- - It becomes very big ways, yeah.
01:01:18.920 | - And when it's used more and more at scale by humans.
01:01:22.200 | - Yeah.
01:01:23.560 | - Since I am interviewing Donald Trump.
01:01:27.040 | - Cool.
01:01:28.000 | - You wanna stop by?
01:01:28.840 | - Yeah, sure, I'll stop in.
01:01:30.360 | - There was, tragically, an assassination attempt
01:01:33.400 | on Donald Trump.
01:01:34.560 | After this, you tweeted that you endorse him.
01:01:37.140 | What's your philosophy behind that endorsement?
01:01:39.200 | What do you hope Donald Trump does for the future
01:01:41.920 | of this country and for the future of humanity?
01:01:45.720 | - Well, I think there's,
01:01:52.680 | you know, people tend to take, like,
01:01:54.280 | say an endorsement as, well, I agree with everything
01:01:58.240 | that person's ever done in their entire life,
01:01:59.840 | 100% wholeheartedly.
01:02:01.240 | And that's not gonna be true of anyone.
01:02:03.200 | But we have to pick, you know, we've got two choices,
01:02:08.600 | really, for who's president.
01:02:10.960 | And it's not just who's president,
01:02:12.440 | but the entire administrative structure changes over.
01:02:18.240 | And I thought Trump displayed courage under fire,
01:02:22.520 | objectively, you know, he's just got shot.
01:02:26.400 | He's got blood streaming down his face,
01:02:27.800 | and he's, like, fist pumping, saying, "Fight."
01:02:30.280 | You know, like, that's impressive.
01:02:32.700 | Like, you can't feign bravery in a situation like that.
01:02:35.600 | Like, most people would be ducking.
01:02:40.100 | They would not be, 'cause there could be a second shooter,
01:02:42.300 | you don't know.
01:02:43.140 | The president of the United States
01:02:45.360 | has got to represent the country.
01:02:47.760 | And they're representing you,
01:02:50.360 | they're representing everyone in America.
01:02:52.440 | Well, like, you want someone who is strong and courageous
01:02:57.440 | to represent the country.
01:03:00.040 | That's not to say that he is without flaws.
01:03:03.000 | We all have flaws.
01:03:03.940 | But on balance, and certainly at the time,
01:03:08.000 | it was a choice of, you know, Biden, poor guy,
01:03:13.000 | has trouble climbing a flight of stairs,
01:03:14.760 | and the other one's fist pumping after getting shot.
01:03:16.600 | There's no comparison.
01:03:19.520 | I mean, who do you want dealing with
01:03:21.960 | some of the toughest people and, you know,
01:03:23.280 | other world leaders who are pretty tough themselves?
01:03:26.000 | And I mean, I'll tell you, like,
01:03:29.040 | what are the things that I think are important?
01:03:32.400 | You know, I think we want a secure border.
01:03:35.600 | We don't have a secure border.
01:03:37.040 | We want safe and clean cities.
01:03:40.620 | I think we want to reduce the amount of spending,
01:03:44.720 | or at least slow down the spending,
01:03:47.560 | because we're currently spending at a rate
01:03:52.520 | that is bankrupting the country.
01:03:54.480 | The interest payments on US debt this year
01:03:57.600 | exceeded the entire Defense Department spending.
01:04:00.640 | If this continues, all of the federal government taxes
01:04:03.960 | will simply be paying the interest.
01:04:06.480 | And then, and you keep going down that road,
01:04:08.720 | you end up, you know, in the tragic situation
01:04:11.760 | that Argentina had back in the day.
01:04:13.800 | Argentina used to be one of the most
01:04:14.800 | prosperous places in the world.
01:04:16.840 | And hopefully, with Milei taking over,
01:04:18.920 | he can restore that.
01:04:20.080 | But it was an incredible fall from grace
01:04:24.360 | for Argentina to go from being
01:04:26.840 | one of the most prosperous places in the world
01:04:28.640 | to being very far from that.
01:04:32.500 | So I think we should not take
01:04:33.600 | American prosperity for granted.
01:04:35.880 | So we really want to, I think,
01:04:38.500 | we've got to reduce the size of government,
01:04:40.280 | we've got to reduce the spending,
01:04:41.760 | and we've got to live within our means.
01:04:43.800 | - Do you think politicians in general,
01:04:45.720 | politicians, governments,
01:04:47.240 | well, how much power do you think they have
01:04:50.800 | to steer humanity towards good?
01:04:54.660 | - I mean, there's a sort of age-old debate in history,
01:05:00.520 | like, you know, is history determined
01:05:03.240 | by these fundamental tides,
01:05:06.120 | or is it determined by the captain of the ship?
01:05:08.520 | This is both, really.
01:05:10.280 | I mean, there are tides,
01:05:11.680 | but it also matters who's captain of the ship.
01:05:13.920 | So it's a false dichotomy, essentially.
01:05:18.160 | But I mean, there are certainly tides,
01:05:24.000 | the tides of history are,
01:05:26.640 | there are real tides of history.
01:05:28.200 | And these tides are often technologically driven.
01:05:31.200 | If you say, like, the Gutenberg Press,
01:05:33.160 | you know, the widespread availability of books
01:05:37.600 | as a result of a printing press,
01:05:41.640 | that was a massive tide of history,
01:05:44.480 | independent of any ruler.
01:05:48.240 | But, you know, I think in so many cases,
01:05:51.560 | you want the best possible captain of the ship.
01:05:54.000 | - Well, first of all, thank you for recommending
01:05:56.960 | Will and Ariel Durant's work.
01:05:58.680 | I've read the short one by now.
01:06:01.400 | - Oh, the Lessons of History. - Lessons of History.
01:06:03.720 | And so one of the lessons,
01:06:05.600 | one of the things they highlight
01:06:06.600 | is the importance of technology,
01:06:09.880 | technological innovation,
01:06:10.880 | which is funny,
01:06:12.280 | 'cause they wrote so long ago,
01:06:15.200 | but they were already noticing that the rate
01:06:17.960 | of technological innovation was speeding up.
01:06:20.240 | - Yeah, it really is. - I would love to see
01:06:24.360 | what they think about now.
01:06:25.680 | But yeah, so to me, the question is how much government,
01:06:30.280 | how much politicians get in the way
01:06:32.320 | of technological innovation and building
01:06:34.040 | versus, like, help it, and which politicians,
01:06:37.440 | which kind of policies help technological innovation?
01:06:39.720 | 'Cause that seems to be, if you look at human history,
01:06:41.760 | that's an important component
01:06:42.920 | of empires rising and succeeding.
01:06:46.440 | - Yeah, well, I mean, in terms of dating civilization,
01:06:50.680 | the start of civilization,
01:06:51.520 | I think the start of writing, in my view,
01:06:54.040 | is probably
01:06:58.600 | the right starting point to date civilization.
01:07:01.960 | And from that standpoint, civilization has been around
01:07:04.160 | for about 5,500 years.
01:07:07.760 | Writing was invented by the ancient Sumerians,
01:07:10.560 | who are gone now, but
01:07:14.720 | in terms of getting a lot of firsts,
01:07:17.200 | those ancient Sumerians really have a long list of firsts.
01:07:21.160 | It's pretty wild.
01:07:22.000 | In fact, Durant goes through the list.
01:07:24.480 | It's like, you wanna see firsts?
01:07:25.600 | We'll show you firsts.
01:07:26.680 | The Sumerians were just ass-kickers.
01:07:30.680 | And then the Egyptians, who were right next door,
01:07:35.000 | relatively speaking, they weren't that far,
01:07:38.280 | developed an entirely different form of writing,
01:07:41.360 | the hieroglyphics.
01:07:42.960 | Cuneiform and hieroglyphics are totally different.
01:07:45.040 | And you can actually see the evolution
01:07:46.280 | of both hieroglyphics and cuneiform.
01:07:49.200 | Like the cuneiform starts off being very simple,
01:07:52.000 | and then it gets more complicated,
01:07:53.280 | and then towards the end, it's like, wow, okay,
01:07:55.240 | they really get very sophisticated with the cuneiform.
01:07:57.680 | So I think of civilization as being about 5,000 years old.
01:08:01.120 | And Earth is, if physics is correct,
01:08:06.120 | four and a half billion years old.
01:08:07.280 | So civilization has been around
01:08:09.440 | for one millionth of Earth's existence.
01:08:11.600 | Flash in the pan.
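That fraction is easy to check as rough arithmetic, using the roughly 5,500-year and 4.5-billion-year figures from this exchange:

```python
# Quick check of the "one millionth of Earth's existence" figure.
civilization_years = 5_500   # since Sumerian writing, per the discussion
earth_age_years = 4.5e9      # commonly cited age of the Earth

fraction = civilization_years / earth_age_years
print(f"Civilization spans ~{fraction:.1e} of Earth's history "
      f"(about one part in {1 / fraction:,.0f})")
```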
01:08:12.480 | - Yeah, these are the early, early days.
01:08:15.560 | And so we-- - Very early.
01:08:16.800 | - We make it very dramatic,
01:08:18.240 | because there's been rises and falls of empires, and--
01:08:22.800 | - Many, so many rises and falls of empires.
01:08:25.840 | So many.
01:08:27.680 | - And there'll be many more.
01:08:30.240 | - Yeah, exactly.
01:08:31.440 | I mean, only a tiny fraction,
01:08:33.520 | probably less than 1% of what was ever written
01:08:37.200 | in history is available to us now.
01:08:39.720 | I mean, if they didn't put it,
01:08:40.560 | literally chisel it in stone or put it in a clay tablet,
01:08:43.040 | we don't have it.
01:08:44.760 | I mean, there's some small amount of like papyrus scrolls
01:08:47.560 | that were recovered that are thousands of years old,
01:08:50.680 | because they were deep inside a pyramid
01:08:52.400 | and weren't affected by moisture.
01:08:55.240 | But other than that, it's really gotta be
01:08:58.880 | in a clay tablet or chiseled.
01:09:01.240 | So the vast majority of stuff was not chiseled,
01:09:04.000 | 'cause it takes a while to chisel things.
01:09:06.040 | So that's why we've got a tiny, tiny fraction
01:09:09.960 | of the information from history.
01:09:11.720 | But even that little information that we do have
01:09:13.480 | in the archeological record shows
01:09:16.960 | so many civilizations rising and falling.
01:09:20.440 | It's wild.
01:09:21.400 | - We tend to think that we're somehow different
01:09:23.800 | from those people.
01:09:24.640 | One of the other things that Durant highlights
01:09:26.560 | is that human nature seems to be the same.
01:09:29.200 | It just persists.
01:09:31.200 | - Yeah, I mean, the basics of human nature
01:09:33.320 | are more or less the same.
01:09:34.960 | - So we get ourselves in trouble
01:09:36.280 | in the same kinds of ways, I think,
01:09:38.400 | even with the advanced technology.
01:09:40.560 | - Yeah, I mean, you do tend to see the same patterns,
01:09:43.080 | similar patterns for civilizations
01:09:45.440 | where they go through a life cycle like an organism,
01:09:50.440 | just like a human is sort of a zygote, fetus, baby.
01:09:56.520 | You know, toddler, teenager,
01:09:59.280 | you know, eventually gets old and dies.
01:10:05.120 | The civilizations go through a life cycle.
01:10:08.560 | No civilization will last forever.
01:10:12.160 | - What do you think it takes for the American empire
01:10:16.720 | to not collapse in the near-term future,
01:10:19.400 | in the next 100 years, to continue flourishing?
01:10:22.040 | (silence)
01:10:24.200 | - Well, the single biggest thing
01:10:30.200 | that is often actually not mentioned in history books,
01:10:35.200 | but Durant does mention it, is the birthrate.
01:10:40.440 | So, perhaps to some,
01:10:43.840 | a counterintuitive thing happens
01:10:45.840 | when civilizations have been winning for too long:
01:10:51.680 | the birthrate declines.
01:10:56.680 | It can often decline quite rapidly.
01:10:58.280 | We're seeing that throughout the world today.
01:11:00.480 | You know, currently, South Korea has,
01:11:04.560 | I think, maybe the lowest fertility rate,
01:11:07.080 | but there are many others that are close to it.
01:11:10.120 | It's like 0.8, I think.
01:11:11.960 | If the birthrate doesn't decline further,
01:11:14.480 | South Korea will lose roughly 60% of its population.
01:11:20.080 | And, but every year, the birthrate is dropping.
01:11:23.280 | And this is true through most of the world.
01:11:26.640 | I don't mean to single out South Korea.
01:11:28.120 | It's been happening throughout the world.
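For a sense of where a figure like "roughly 60%" comes from, here is the back-of-the-envelope generational arithmetic; it uses the commonly cited replacement rate of about 2.1 and ignores migration and mortality timing, so it is illustrative only.

```python
# Generation-over-generation shrinkage at a given total fertility rate (TFR).
# Ignores migration and mortality timing; purely illustrative arithmetic.
REPLACEMENT_TFR = 2.1   # roughly the TFR needed to hold a population constant
tfr = 0.8               # approximate figure cited for South Korea

ratio = tfr / REPLACEMENT_TFR        # relative size of each successive generation
print(f"Each generation is ~{ratio:.0%} the size of the last (~{1 - ratio:.0%} decline)")
print(f"After 3 generations: ~{ratio ** 3:.1%} of the original generation size")
```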
01:11:30.520 | So as soon as any given civilization
01:11:35.520 | reaches a level of prosperity, the birthrate drops.
01:11:39.640 | And now you can go and look at the same thing
01:11:41.320 | happening in ancient Rome.
01:11:43.600 | So Julius Caesar took note of this,
01:11:48.680 | I think around 50-ish BC,
01:11:51.120 | and tried to pass, I don't know if he was successful,
01:11:55.200 | tried to pass a law to give an incentive
01:11:57.520 | for any Roman citizen that would have a third child.
01:12:00.120 | And I think Augustus was able to,
01:12:04.640 | well, he was, you know, the dictator,
01:12:06.040 | so the Senate was just for show.
01:12:09.040 | I think he did pass a tax incentive
01:12:12.840 | for Roman citizens to have a third child.
01:12:15.320 | But those efforts were unsuccessful.
01:12:19.400 | Rome fell because the Romans stopped making Romans.
01:12:26.400 | That's actually the fundamental issue.
01:12:29.040 | And there were other things.
01:12:30.280 | There was like, they had like quite a serious malaria,
01:12:35.280 | serious malaria epidemics and plagues and whatnot.
01:12:38.480 | But they had those before.
01:12:43.040 | It's just that the birthrate
01:12:44.440 | was far lower than the death rate.
01:12:46.280 | - It really is that simple.
01:12:49.620 | - Well, I'm saying that's--
01:12:50.460 | - More people is required.
01:12:52.920 | - At a fundamental level, if a civilization
01:12:54.720 | does not at least maintain its numbers, it will disappear.
01:12:58.360 | - So perhaps the amount of compute
01:13:00.040 | that the biological computer allocates to sex is justified.
01:13:05.040 | In fact, we should probably increase it.
01:13:07.560 | - Well, I mean, there's this hedonistic sex,
01:13:09.640 | which is, you know, that's neither here nor there.
01:13:14.640 | It's--
01:13:16.160 | - Not productive.
01:13:17.720 | - It doesn't produce kids.
01:13:19.600 | Well, you know, what matters,
01:13:22.200 | I mean, Durant makes this very clear
01:13:23.840 | 'cause he's looked at one civilization after another
01:13:26.080 | and they all went through the same cycle.
01:13:28.440 | When the civilization was under stress,
01:13:30.040 | the birthrate was high.
01:13:31.600 | But as soon as there were no external enemies
01:13:34.040 | or they had an extended period of prosperity,
01:13:37.880 | the birthrate inevitably dropped every time.
01:13:41.160 | I don't believe there's a single exception.
01:13:43.320 | - So that's like the foundation of it.
01:13:46.920 | You need to have people.
01:13:49.280 | - Yeah.
01:13:50.440 | I mean, at a base level, no humans, no humanity.
01:13:54.720 | - And then there's other things like, you know,
01:13:57.840 | human freedoms and just giving people
01:14:00.320 | the freedom to build stuff.
01:14:02.240 | - Yeah, absolutely.
01:14:05.520 | But at a basic level,
01:14:06.760 | if you do not at least maintain your numbers,
01:14:08.960 | if you're below replacement rate and that trend continues,
01:14:11.200 | you will eventually disappear.
01:14:12.880 | This is elementary.
01:14:14.960 | Now, then obviously you also wanna try
01:14:19.840 | to avoid like massive wars.
01:14:22.380 | You know, if there's a global thermonuclear war,
01:14:28.360 | probably we're all toast, you know, radioactive toast.
01:14:33.340 | (laughing)
01:14:35.580 | So we wanna try to avoid those things.
01:14:38.440 | Then there are, there's a thing that happens over time
01:14:43.260 | with any given civilization,
01:14:47.200 | which is that the laws and regulations accumulate.
01:14:51.520 | And if there's not some forcing function like a war
01:14:56.460 | to clean up the accumulation of laws and regulations,
01:14:59.860 | eventually everything becomes illegal.
01:15:02.220 | And that's like the hardening of the arteries.
01:15:06.760 | Or a way to think of it is like being tied down
01:15:11.080 | by a million little strings like Gulliver, you can't move.
01:15:15.180 | And it's not like any one of those strings is the issue,
01:15:17.220 | it's got a million of them.
01:15:19.260 | So there has to be a sort of a garbage collection
01:15:24.260 | for laws and regulations
01:15:28.380 | so that you don't keep accumulating laws and regulations
01:15:32.300 | to the point where you can't do anything.
01:15:34.480 | This is why we can't build high-speed rail in America.
01:15:36.880 | It's illegal, that's the issue.
01:15:38.540 | It's illegal six ways to Sunday
01:15:41.940 | to build high-speed rail in America.
01:15:43.740 | - I wish you could just like for a week go into Washington
01:15:48.180 | and like be the head of the committee for making,
01:15:50.980 | what is it, for the garbage collection,
01:15:54.640 | making government smaller, like removing stuff.
01:15:57.060 | - I have discussed with Trump the idea
01:15:58.940 | of a government efficiency commission.
01:16:01.380 | - Nice, yeah.
01:16:03.260 | - And I would be willing to be part of that commission.
01:16:08.160 | - I wonder how hard that is.
01:16:10.540 | - The antibody reaction would be very strong.
01:16:13.700 | - Yeah.
01:16:14.780 | - So you really have to,
01:16:17.860 | you're attacking the matrix at that point.
01:16:22.300 | Matrix will fight back.
01:16:26.060 | - How are you doing with that, being attacked?
01:16:28.740 | - Me, attacked?
01:16:30.820 | - Yeah, there's a lot of it.
01:16:32.840 | - Yeah, there is a lot.
01:16:35.460 | I mean, every day another psy-op, you know.
01:16:38.740 | (Lex laughs)
01:16:40.820 | That's my tinfoil hat.
01:16:42.060 | - How do you keep your positivity,
01:16:43.660 | your optimism about the world,
01:16:45.460 | a clarity of thinking about the world
01:16:47.380 | so just not become resentful or cynical
01:16:49.560 | or all that kind of stuff?
01:16:51.220 | Just getting attacked by a very large number of people,
01:16:54.140 | misrepresented.
01:16:55.540 | - Oh yeah, that's a daily occurrence.
01:16:58.620 | - Yes.
01:16:59.460 | - So, I mean, it does get me down at times.
01:17:03.980 | I mean, it makes me sad, but.
01:17:05.380 | I mean, at some point you have to sort of say,
01:17:15.660 | look, the attacks are by people that actually don't know me.
01:17:19.000 | And they're trying to generate clicks.
01:17:22.240 | So if you can sort of detach yourself somewhat
01:17:25.460 | emotionally, which is not easy,
01:17:27.560 | and say, okay, look, this is not actually,
01:17:29.620 | you know, from someone that knows me
01:17:32.780 | or is, they're literally just writing to get,
01:17:37.540 | you know, impressions and clicks.
01:17:40.540 | Then, you know, then I guess it doesn't hurt as much.
01:17:47.420 | It's like, it's not quite water off a duck's back.
01:17:50.020 | Maybe it's like acid off a duck's back.
01:17:51.700 | (Lex laughs)
01:17:53.580 | - All right, well, that's good.
01:17:54.740 | Just about your own life.
01:17:56.140 | What to you is a measure of success in your life?
01:17:58.540 | - A measure of success, I'd say,
01:18:00.180 | like what, how many useful things can I get done?
01:18:02.820 | - Day-to-day basis, you wake up in the morning.
01:18:07.540 | How can I be useful today?
01:18:09.300 | - Yeah.
01:18:10.440 | Maximize utility, area under the curve of usefulness.
01:18:13.460 | Very difficult to be useful at scale.
01:18:16.020 | - At scale.
01:18:17.020 | Can you like speak to what it takes to be useful
01:18:19.740 | for somebody like you?
01:18:21.420 | Where there's so many amazing, great teams,
01:18:23.380 | like how do you allocate your time to being the most useful?
01:18:26.380 | - Well, time is the, time is the true currency.
01:18:31.780 | - Yeah.
01:18:32.820 | - So it is tough to say what is the best allocation time.
01:18:36.900 | I mean, there are, you know, often say,
01:18:41.900 | if you look at, say, Tesla, I mean, Tesla this year
01:18:45.540 | will do over 100 billion in revenue.
01:18:47.740 | So that's $2 billion a week.
01:18:49.320 | If I make slightly better decisions,
01:18:53.140 | I can affect the outcome by a billion dollars.
01:18:57.160 | So then, you know, I try to do the best decisions I can,
01:19:03.300 | and on balance, you know, at least compared
01:19:06.740 | to the competition, pretty good decisions,
01:19:09.520 | but the marginal value of a better decision
01:19:14.520 | can easily be, in the course of an hour, $100 million.
01:19:18.660 | - Given that, how do you take risks?
01:19:21.260 | How do you do the algorithm that you mentioned?
01:19:24.040 | I mean, deleting, given that a small thing
01:19:26.700 | can be a billion dollars, how do you decide to--
01:19:29.780 | - Yeah.
01:19:30.620 | Well, I think you have to look at it on a percentage basis,
01:19:34.580 | because if you look at it in absolute terms,
01:19:36.220 | it's just, I would never get any sleep.
01:19:39.220 | I would just be like, I need to just keep working
01:19:42.140 | and work my brain harder, you know?
01:19:45.640 | And try to get as much as possible
01:19:47.020 | out of this meat computer.
01:19:49.940 | So it's not, it's pretty hard,
01:19:52.940 | 'cause you can just work all the time,
01:19:56.060 | and at any given point, like I said,
01:19:59.540 | a slightly better decision could be a $100 million impact
01:20:03.900 | for Tesla or SpaceX, for that matter.
01:20:06.280 | But it is wild when considering the marginal value
01:20:10.340 | of time can be $100 million an hour at times, or more.
01:20:17.260 | - Is your own happiness part of that equation of success?
01:20:21.320 | - It has to be to some degree.
01:20:23.380 | Otherwise, if I'm sad, if I'm depressed,
01:20:25.580 | I make worse decisions.
01:20:27.080 | So I can't have, like, if I have zero recreational time,
01:20:32.540 | then I make worse decisions.
01:20:34.700 | So I don't have a lot, but it's above zero.
01:20:37.480 | I mean, my motivation, if I've got a religion of any kind,
01:20:41.140 | is a religion of curiosity, of trying to understand.
01:20:46.060 | You know, it's really the mission of Grok,
01:20:48.100 | understand the universe.
01:20:48.920 | I'm trying to understand the universe.
01:20:51.260 | Or at least set things in motion such that,
01:20:54.740 | at some point, civilization understands the universe
01:20:59.540 | far better than we do today.
01:21:00.940 | And even what questions to ask.
01:21:04.660 | As Douglas Adams pointed out in his book,
01:21:07.460 | sometimes the answer is arguably the easy part.
01:21:11.780 | Trying to frame the question correctly is the hard part.
01:21:15.500 | Once you frame the question correctly,
01:21:17.620 | the answer is often easy.
01:21:20.340 | So I'm trying to set things in motion
01:21:25.180 | such that we are, at least at some point,
01:21:27.260 | able to understand the universe.
01:21:28.860 | So for SpaceX, the goal is to make life multi-planetary.
01:21:35.420 | Which is, if you go to the Fermi paradox
01:21:44.720 | of where are the aliens,
01:21:46.060 | you've got these sort of great filters.
01:21:48.940 | Like, it's like, why have we not heard from the aliens?
01:21:52.060 | Now, a lot of people think there are aliens among us.
01:21:55.060 | I often claim to be one, which nobody believes me,
01:21:58.420 | but it did say alien registration card at one point
01:22:02.740 | on my immigration documents.
01:22:06.060 | So I've not seen any evidence of aliens.
01:22:09.820 | So it's just that, at least one of the explanations
01:22:13.900 | is that intelligent life is extremely rare.
01:22:16.900 | And again, if you look at the history of Earth,
01:22:21.860 | civilization's only been around
01:22:23.460 | for one millionth of Earth's existence.
01:22:27.620 | So if aliens had visited here, say, 100,000 years ago,
01:22:32.620 | they would be like, well, they don't even have writing.
01:22:36.820 | Just hunter-gatherers, basically.
01:22:38.460 | So how long does a civilization live?
01:22:43.460 | How long does a civilization last?
01:22:45.160 | So for SpaceX, the goal is to establish
01:22:50.300 | a self-sustaining city on Mars.
01:22:53.260 | Mars is the only viable planet for such a thing.
01:22:57.520 | The moon is close, but it lacks resources,
01:23:01.500 | and I think it's probably vulnerable
01:23:05.460 | to any calamity that takes out Earth.
01:23:08.900 | The moon is too close.
01:23:12.100 | It's vulnerable to any calamity that takes out Earth.
01:23:14.740 | So I'm not saying we shouldn't have a moon base,
01:23:18.980 | but Mars would be far more resilient.
01:23:22.080 | The difficulty of getting to Mars
01:23:24.980 | is what makes it resilient.
01:23:26.340 | So in going through these various explanations
01:23:34.700 | of why don't we see the aliens,
01:23:36.340 | one of them is that they failed to pass
01:23:40.580 | these great filters, these key hurdles.
01:23:45.580 | And one of those hurdles is being a multi-planet species.
01:23:52.640 | So if you're a multi-planet species,
01:23:55.540 | then if something were to happen,
01:23:57.380 | whether that was a natural catastrophe
01:24:00.540 | or a man-made catastrophe,
01:24:03.420 | at least the other planet would probably still be around.
01:24:06.740 | So you don't have all the eggs in one basket.
01:24:10.540 | And once you are sort of a two-planet species,
01:24:13.020 | you can obviously extend life to the asteroid belt,
01:24:17.580 | to maybe to the moons of Jupiter and Saturn,
01:24:20.860 | and ultimately to other star systems.
01:24:24.880 | But if you can't even get to another planet,
01:24:27.020 | definitely not getting to star systems.
01:24:30.140 | - And the other possible great filters,
01:24:32.100 | super powerful technology like AGI, for example.
01:24:36.860 | So you're basically trying to knock out
01:24:40.500 | one great filter at a time.
01:24:42.400 | - Digital superintelligence is possibly a great filter.
01:24:48.700 | I hope it isn't, but it might be.
01:24:50.660 | Guys like, say, Geoff Hinton would say,
01:24:54.840 | he invented a number of the key principles
01:24:58.340 | in artificial intelligence.
01:25:00.540 | I think he puts the probability of AI annihilation
01:25:03.620 | around 10 to 20%, something like that.
01:25:07.660 | So it's not like, look on the bright side,
01:25:12.660 | it's 80% likely to be great.
01:25:16.120 | So, but I think AI risk mitigation is important.
01:25:22.240 | Being a multi-planet species
01:25:24.900 | would be a massive risk mitigation.
01:25:27.620 | And I do wanna sort of, once again,
01:25:30.940 | emphasize the importance of having enough children
01:25:34.420 | to sustain our numbers and not plummet
01:25:39.420 | into population collapse, which is currently happening.
01:25:46.060 | Population collapse is a real and current thing.
01:25:48.460 | So the only reason it's not being reflected
01:25:55.580 | in the total population numbers as much
01:25:58.340 | is because people are living longer.
01:26:01.700 | But it's easy to predict, say,
01:26:04.620 | what the population of any given country will be.
01:26:07.060 | You just take the birth rate last year,
01:26:10.420 | how many babies were born,
01:26:11.420 | multiply that by life expectancy,
01:26:13.540 | and that's what the population will be, steady state,
01:26:15.220 | unless, if the birth rate continues to that level.
01:26:18.860 | But if it keeps declining, it will be even less
01:26:21.380 | and eventually dwindle to nothing.
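The rough model described here can be written down directly; the inputs below are made-up illustrative numbers, not projections for any particular country.

```python
# Rough steady-state population estimate: births per year x life expectancy.
# Inputs are illustrative placeholders, not real demographic data.
births_per_year = 250_000       # hypothetical annual births
life_expectancy_years = 83      # hypothetical life expectancy

steady_state_population = births_per_year * life_expectancy_years
print(f"Steady-state population if the birth rate holds: ~{steady_state_population:,}")

# If births instead shrink by a fixed fraction each year, the eventual
# steady state keeps shrinking along with them:
annual_decline = 0.03           # hypothetical 3% fewer births each year
births_after_20_years = births_per_year * (1 - annual_decline) ** 20
print(f"Births after 20 years of decline: ~{births_after_20_years:,.0f} per year")
```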
01:26:23.420 | So I keep banging on the baby drum here for a reason,
01:26:28.420 | because it has been the source of civilizational collapse
01:26:31.440 | over and over again throughout history.
01:26:33.780 | And so, why don't we just try to stave off that day?
01:26:39.900 | - Well, in that way, I have miserably failed civilization,
01:26:45.540 | and I'm trying, hoping to fix that.
01:26:48.500 | I would love to have many kids.
01:26:50.180 | - Great, hope you do.
01:26:51.220 | No time like the present.
01:26:54.740 | - Yeah, I gotta allocate more compute to the whole process.
01:27:00.300 | But apparently, it's not that difficult.
01:27:02.480 | - No, it's like unskilled labor.
01:27:04.380 | - Well, one of the things you do for me, for the world,
01:27:11.680 | is to inspire us with what the future could be.
01:27:14.080 | And so, some of the things we've talked about,
01:27:16.220 | some of the things you're building,
01:27:18.280 | alleviating human suffering with Neuralink
01:27:21.480 | and expanding the capabilities of the human mind,
01:27:24.520 | trying to build a colony on Mars,
01:27:28.120 | so creating a backup for humanity on another planet,
01:27:32.980 | and exploring the possibilities
01:27:36.020 | of what artificial intelligence could be in this world,
01:27:38.540 | especially in the real world, AI,
01:27:41.020 | with hundreds of millions,
01:27:42.900 | maybe billions of robots walking around.
01:27:45.620 | - There will be billions of robots.
01:27:47.780 | That seems a virtual certainty.
01:27:50.420 | - Well, thank you for building the future,
01:27:53.380 | and thank you for inspiring so many of us
01:27:56.220 | to keep building and creating cool stuff, including kids.
01:28:00.360 | - You're welcome.
01:28:01.200 | Go forth and multiply.
01:28:04.120 | - Go forth and multiply.
01:28:06.160 | Thank you, Elon.
01:28:07.000 | Thanks for talking about it.
01:28:08.640 | Thanks for listening to this conversation with Elon Musk.
01:28:12.400 | And now, dear friends, here's DJ Seo,
01:28:15.680 | the co-founder, president, and COO of Neuralink.
01:28:18.700 | When did you first become fascinated by the human brain?
01:28:23.680 | - For me, I was always interested
01:28:24.900 | in understanding the purpose of things,
01:28:28.300 | and how it was engineered to serve that purpose,
01:28:33.100 | whether it's organic or inorganic,
01:28:36.380 | you know, like we were talking earlier
01:28:37.740 | about your curtain holders.
01:28:39.380 | They serve a clear purpose,
01:28:42.260 | and they were engineered with that purpose in mind.
01:28:44.760 | And growing up, I had a lot of interest
01:28:49.220 | in seeing things, touching things, feeling things,
01:28:54.040 | and trying to really understand the root
01:28:55.880 | of how it was designed to serve that purpose.
01:28:59.520 | And, you know, obviously,
01:29:00.960 | brain is just a fascinating organ that we all carry.
01:29:04.000 | It's an infinitely powerful machine
01:29:07.280 | that has intelligence and cognition that arise from it.
01:29:10.720 | And, you know, we haven't even scratched the surface
01:29:13.520 | in terms of how all of that occurs.
01:29:17.080 | But also at the same time, I think it took me a while
01:29:19.740 | to make that connection to really studying
01:29:22.340 | and building tech to understand the brain,
01:29:24.620 | not until graduate school.
01:29:26.620 | You know, there were a couple moments,
01:29:28.740 | key moments in my life where some of those, I think,
01:29:32.460 | influenced how the trajectory of my life
01:29:34.860 | got me to studying what I'm doing right now.
01:29:39.940 | You know, one was growing up, both sides of my family,
01:29:43.780 | my grandparents had a very severe form of Alzheimer's.
01:29:48.160 | And it's, you know, an incredibly debilitating condition.
01:29:53.160 | I mean, literally, you're seeing someone's whole identity
01:29:56.800 | and their mind just being lost over time.
01:29:59.360 | And I just remember thinking about both the power of the mind,
01:30:04.360 | but also how something like that
01:30:06.880 | could really make you lose your sense of identity.
01:30:09.840 | - It's fascinating that that is one of the ways
01:30:12.280 | to reveal the power of a thing,
01:30:13.720 | by watching it lose the power.
01:30:17.940 | - Yeah, a lot of what we know about the brain
01:30:19.460 | actually comes from these cases
01:30:22.300 | where there are trauma to the brain
01:30:25.200 | or some parts of the brain
01:30:26.240 | that led someone to lose certain abilities.
01:30:29.240 | And as a result, there's some correlation
01:30:33.220 | and understanding of that part of the tissue
01:30:34.960 | being critical for that function.
01:30:37.020 | And it's an incredibly fragile organ,
01:30:41.000 | if you think about it that way,
01:30:42.000 | but also it's incredibly plastic
01:30:44.280 | and incredibly resilient in many different ways.
01:30:46.640 | - And by the way, the term plastic, as we'll use a bunch,
01:30:49.600 | means that it's adaptable.
01:30:52.760 | So neuroplasticity refers to the adaptability
01:30:55.720 | of the human brain.
01:30:56.720 | - Correct.
01:30:58.320 | Another key moment that sort of influenced
01:31:00.540 | the trajectory of my life
01:31:02.160 | and shaped it towards my current focus
01:31:06.360 | was during my teenage years when I came to the US.
01:31:10.480 | You know, I didn't speak a word of English.
01:31:12.400 | There was a huge language barrier
01:31:13.920 | and there was a lot of struggle
01:31:16.980 | to kind of connect with my peers around me
01:31:20.000 | because I didn't understand the artificial construct
01:31:23.480 | that we have created called language,
01:31:25.320 | specifically English in this case.
01:31:27.160 | And I remember feeling pretty isolated,
01:31:29.840 | not being able to connect with peers around me.
01:31:32.240 | So I spent a lot of time just on my own,
01:31:34.840 | you know, reading books, watching movies,
01:31:37.200 | and I naturally sort of gravitated
01:31:39.680 | towards sci-fi books.
01:31:41.160 | I just found them really, really interesting.
01:31:43.480 | And also it was a great way for me to learn English.
01:31:46.440 | You know, some of the first set of books that I picked up
01:31:48.640 | are "Ender's Game," you know, the whole saga
01:31:52.000 | by Orson Scott Card and "Neuromancer" from William Gibson
01:31:56.600 | and "Snow Crash" from Neal Stephenson.
01:32:00.120 | And, you know, movies like "The Matrix" were coming out
01:32:02.720 | around that time point that really influenced
01:32:05.080 | how I think about the potential impact
01:32:07.640 | that technology can have for our lives in general.
01:32:11.240 | So fast track to my college years, you know,
01:32:13.560 | I was always fascinated by just physical stuff,
01:32:16.920 | building physical stuff, and especially physical things
01:32:21.080 | that had some sort of intelligence.
01:32:23.840 | And, you know, I studied electrical engineering
01:32:26.640 | during undergrad, and I started out my research in MEMS,
01:32:31.080 | so microelectromechanical systems,
01:32:33.280 | and really building these tiny nanostructures
01:32:35.520 | for temperature sensing.
01:32:37.400 | And I just found that to be just incredibly rewarding
01:32:40.560 | and fascinating subject to just understand
01:32:42.960 | how you can build something miniature like that,
01:32:45.480 | that, again, serve a function and had a purpose.
01:32:48.360 | And then, you know, I spent a large majority
01:32:50.880 | of my college years basically building millimeter wave
01:32:55.000 | circuits for next-gen telecommunication systems for imaging.
01:32:59.840 | And it was just something that I found very,
01:33:02.400 | very intellectually interesting, you know, phase arrays,
01:33:05.360 | how the signal processing works for, you know,
01:33:08.760 | any modern as well as next-gen telecommunication system,
01:33:12.320 | wireless and wireline.
01:33:13.480 | EM waves or electromagnetic waves are fascinating.
01:33:17.760 | How do you design antennas that are most efficient
01:33:21.560 | in a small footprint that you have?
01:33:23.720 | How do you make these things energy efficient?
01:33:26.040 | That was something that just consumed
01:33:27.520 | my intellectual curiosity.
01:33:29.640 | And that journey led me to actually apply to
01:33:32.480 | and find myself a PhD program at UC Berkeley
01:33:35.760 | at kind of this consortium called
01:33:37.840 | the Berkeley Wireless Research Center
01:33:39.440 | that was precisely looking at building,
01:33:42.280 | at the time we called it XG, you know,
01:33:44.320 | similar to 3G, 4G, 5G, but the next, next generation G system
01:33:49.320 | and how you would design circuits around that
01:33:51.720 | to ultimately go on phones and, you know,
01:33:53.800 | basically any other devices that are
01:33:56.120 | wirelessly connected these days.
01:33:58.400 | So I was just absolutely just fascinated by how
01:34:01.720 | that entire system works and that infrastructure works.
01:34:05.120 | And then also during grad school,
01:34:09.040 | I had sort of the fortune of having, you know,
01:34:13.080 | a couple of research fellowships that led me
01:34:15.000 | to pursue whatever project that I want.
01:34:17.160 | And that's one of the things that I really enjoyed
01:34:19.960 | about my graduate school career,
01:34:21.800 | where you got to kind of pursue your intellectual curiosity
01:34:25.160 | in the domain that may not matter at the end of the day,
01:34:28.000 | but it's something that, you know,
01:34:29.600 | really allows you the opportunity to go as deeply
01:34:34.040 | as you want, as well as as widely as you want.
01:34:36.920 | And at the time I was actually working
01:34:38.360 | on this project called the Smart Band-Aid.
01:34:40.240 | And the idea was that when you get a wound,
01:34:43.680 | there's a lot of other kind of proliferation
01:34:46.040 | of signaling pathways that cells follow to close that wound.
01:34:50.960 | And there were hypotheses that
01:34:54.360 | when you apply external electric field,
01:34:56.680 | you can actually accelerate the closing of that wound
01:34:59.720 | by having, you know, basically electrotaxis
01:35:02.880 | of the cells around that wound site.
01:35:06.040 | And specifically, not just for normal wound,
01:35:08.480 | there are chronic wounds that don't heal.
01:35:10.800 | So we were interested in building, you know,
01:35:12.760 | some sort of a wearable patch that you could apply
01:35:17.000 | to kind of facilitate that healing process.
01:35:20.280 | And that was in collaboration
01:35:22.800 | with Professor Michel Maharbiz, you know,
01:35:26.080 | who, you know, was a great addition
01:35:28.400 | to kind of my thesis committee
01:35:29.840 | and, you know, really shaped the rest of my PhD career.
01:35:33.520 | - So this would be the first time
01:35:34.640 | you interacted with biology, I suppose.
01:35:36.920 | - Correct, correct.
01:35:37.920 | I mean, there were some peripheral, you know,
01:35:41.720 | end applications of the wireless imaging
01:35:44.560 | and telecommunication system that I was using
01:35:46.320 | for security and bioimaging,
01:35:48.800 | but this was a very clear direct application
01:35:53.160 | to biology and biological system
01:35:56.200 | and understanding the constraints around that
01:35:58.200 | and really designing and engineering
01:36:01.040 | electrical solutions around it.
01:36:02.600 | So that was my first introduction.
01:36:04.680 | And that's also kind of how I got introduced to Michel.
01:36:09.160 | You know, he's sort of known for remote control
01:36:13.360 | of beetles in the early 2000s.
01:36:16.920 | And then around 2013, you know,
01:36:20.840 | obviously kind of the holy grail
01:36:22.320 | when it comes to implantable system
01:36:24.320 | is to kind of understand how small of a thing you can make.
01:36:28.840 | And a lot of that is driven by how much energy
01:36:32.360 | or how much power you can supply to it
01:36:34.360 | and how you extract data from it.
01:36:36.640 | So at the time at Berkeley,
01:36:38.120 | there was kind of this desire
01:36:40.920 | to kind of understand in the neural space,
01:36:43.160 | what sort of system you can build
01:36:46.040 | to really miniaturize these implantable systems.
01:36:48.840 | And I distinctively remember this one particular meeting
01:36:53.760 | where Michel came in and he's like,
01:36:55.520 | "Guys, I think I have a solution.
01:36:57.160 | The solution is ultrasound."
01:37:00.640 | And then he proceeded to kind of walk through
01:37:04.200 | why that is the case.
01:37:06.320 | And that really formed the basis for my thesis work
01:37:09.160 | called the neural dust system that was looking at ways
01:37:13.920 | to use ultrasound as opposed to electromagnetic waves
01:37:18.120 | for powering, as well as communication.
01:37:20.720 | I guess I should step back and say
01:37:22.400 | the initial goal of the project was to build these tiny,
01:37:27.080 | about the size of a neuron, implantable system
01:37:30.640 | that can be parked next to a neuron,
01:37:32.600 | being able to record its state
01:37:34.360 | and being able to ping that back to the outside world
01:37:36.920 | for doing something useful.
01:37:39.080 | And as I mentioned, the size of the implantable system
01:37:43.640 | is limited by how you power the thing
01:37:46.600 | and get the data off of it.
01:37:48.480 | And at the end of the day, fundamentally,
01:37:50.840 | if you look at our human body,
01:37:52.800 | we're essentially a bag of saltwater
01:37:55.680 | with some interesting proteins and chemicals,
01:37:57.760 | but it's mostly saltwater that's very, very well
01:38:01.920 | temperature regulated at 37 degrees Celsius.
01:38:04.720 | And we'll get into how and why later,
01:38:08.800 | but that's an extremely harsh environment
01:38:11.040 | for any electronics to survive, as I'm sure you've experienced,
01:38:15.720 | or maybe not experienced, dropping a cell phone
01:38:17.960 | in saltwater in the ocean;
01:38:20.120 | it will instantly kill the device, right?
01:38:22.120 | But anyways, just in general,
01:38:26.040 | electromagnetic waves don't penetrate
01:38:28.280 | through this environment well.
01:38:29.800 | And just the speed of light, it is what it is.
01:38:34.400 | We can't change it.
01:38:35.760 | And based on the wavelength
01:38:40.280 | at which you are interfacing with the device,
01:38:43.320 | the device just needs to be big.
01:38:44.560 | Like these inductors need to be quite big.
01:38:46.720 | And the general good rule of thumb
01:38:48.880 | is that you want the wave front to be roughly
01:38:52.800 | on the order of the size of the thing
01:38:54.560 | that you're interfacing with.
01:38:56.120 | So for an implantable system that is around 10 to 100 microns
01:39:01.120 | in dimension, in volume,
01:39:04.520 | which is about the size of a neuron
01:39:05.880 | that you see in a human body,
01:39:08.280 | you would have to operate at like hundreds of gigahertz,
01:39:12.880 | which number one, not only is it difficult
01:39:15.480 | to build electronics operating at those frequencies,
01:39:18.760 | but also the body just attenuates that
01:39:21.360 | very, very significantly.
01:39:23.680 | So the interesting kind of insight of this ultrasound
01:39:27.920 | was the fact that ultrasound just travels
01:39:31.680 | a lot more effectively in the human body tissue
01:39:34.600 | compared to electromagnetic waves.
01:39:36.920 | And this is something that you encounter
01:39:40.120 | and I'm sure most people have encountered in their lives.
01:39:43.680 | When you go to hospitals, there are medical
01:39:47.760 | ultrasound sonographs, right?
01:39:50.680 | And they go to very, very deep depths
01:39:55.680 | without attenuating too much of the signal.
01:39:58.560 | So all in all, ultrasound,
01:40:02.160 | the fact that it travels through the body extremely well,
01:40:05.640 | and the mechanism by which it travels
01:40:07.680 | through the body really well, is that the wave front
01:40:10.440 | is very different.
01:40:11.320 | Electromagnetic waves are transverse,
01:40:15.200 | whereas ultrasound waves are compressive.
01:40:17.760 | So it's just a completely different mode
01:40:19.560 | of wave front propagation.
01:40:23.520 | And as well, the speed of sound is
01:40:26.240 | orders and orders of magnitude less than the speed of light,
01:40:30.000 | which means that even for a 10 megahertz ultrasound wave,
01:40:33.920 | your wavelength ultimately is very, very small.
01:40:37.760 | So if you're talking about interfacing with a 10 micron
01:40:41.040 | or 100 micron type structure,
01:40:43.880 | you would have a 150 micron wavelength at 10 megahertz,
01:40:49.320 | and building electronics at those frequencies
01:40:52.280 | is much, much easier and they're a lot more efficient.
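As a rough worked example of the tradeoff described above, here is a small Python sketch comparing the two regimes. The tissue permittivity and sound speed are generic textbook assumptions for illustration, not Neuralink numbers.

```python
# Rough wavelength comparison sketch: wavelength = propagation_speed / frequency.

C_LIGHT_TISSUE = 3e8 / 7      # m/s, EM phase speed in tissue assuming relative permittivity ~49 (assumption)
C_SOUND_TISSUE = 1540.0       # m/s, typical speed of sound in soft tissue (textbook ballpark)

def wavelength(speed_m_s, freq_hz):
    return speed_m_s / freq_hz

# EM frequency needed for a ~100-micron wavelength near a neuron-scale implant:
em_freq = C_LIGHT_TISSUE / 100e-6                    # ~4e11 Hz, i.e. hundreds of GHz
# Ultrasound at a modest 10 MHz already gives a comparable wavelength:
us_wavelength = wavelength(C_SOUND_TISSUE, 10e6)     # ~154 microns

print(f"EM frequency for a 100 um wavelength: {em_freq / 1e9:.0f} GHz")
print(f"Ultrasound wavelength at 10 MHz:      {us_wavelength * 1e6:.0f} um")
```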
01:40:55.360 | So the basic idea kind of was born out of
01:40:59.200 | using ultrasound as a mechanism for powering the device
01:41:03.600 | and then also getting data back.
01:41:05.920 | So now the question is, how do you get the data back?
01:41:08.800 | The mechanism we landed on
01:41:10.400 | is what's called backscattering.
01:41:12.600 | This is actually something that is very common
01:41:16.120 | and that we interface with on a day-to-day basis
01:41:18.600 | with our RFID cards, our radio frequency ID tags,
01:41:22.600 | where there's actually rarely a battery
01:41:27.400 | inside your ID; there's an antenna
01:41:29.800 | and there's some sort of a coil
01:41:32.600 | that has your serial identification ID.
01:41:37.040 | And then there's an external device called a reader
01:41:39.400 | that then sends a wave front.
01:41:41.680 | And then you reflect back that wave front
01:41:43.960 | with some sort of modulation that's unique to your ID.
01:41:47.280 | That's what's called backscattering fundamentally.
01:41:50.600 | So the tag itself actually doesn't have to consume
01:41:53.400 | that much energy.
01:41:54.880 | And that was the mechanism by which we were kind of thinking
01:41:59.040 | about sending the data back.
01:42:00.120 | So when you have an external ultrasonic transducer
01:42:04.240 | that's sending ultrasonic waves to your implant,
01:42:07.440 | the neural dust implant,
01:42:08.960 | and it records some information about its environment,
01:42:12.880 | whether it's a neuron firing or some other state
01:42:16.080 | of the tissue that it's interfacing with,
01:42:21.080 | and then it just amplitude modulates the wave front
01:42:24.960 | that comes back to the source.
01:42:27.160 | - And the recording step would be the only one
01:42:29.800 | that requires any energy.
01:42:31.320 | So what would require energy in that little step?
01:42:33.720 | - Correct.
01:42:34.560 | So it is that initial kind of startup circuitry
01:42:37.440 | to get that recording, amplifying it,
01:42:40.280 | and then just modulating.
01:42:42.520 | And the mechanism by which you can enable that
01:42:46.040 | is there are these specialized crystals
01:42:48.200 | called piezoelectric crystals
01:42:50.120 | that are able to convert sound energy
01:42:53.160 | into electrical energy and vice versa.
01:42:55.400 | So you can kind of have this interplay
01:42:58.120 | between the ultrasonic domain and the electrical domain
01:43:00.840 | within the biological tissue.
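To make the backscatter idea concrete, here is a minimal, illustrative Python sketch of an amplitude-modulated reflection. The carrier frequency, modulation depth, and signal shape are all assumptions, and this is not the neural dust implementation.

```python
import numpy as np

# An external transducer sends a carrier; the implant reflects it back,
# amplitude-modulating the reflection according to the recorded neural signal.

fs = 100e6                                 # simulation sample rate, Hz (assumed)
f_carrier = 10e6                           # ultrasound carrier frequency, Hz (assumed)
t = np.arange(0, 200e-6, 1 / fs)           # 200 microseconds of simulation

carrier = np.sin(2 * np.pi * f_carrier * t)          # interrogating wavefront

# Pretend recorded membrane-potential envelope (e.g. a spike), normalized 0..1
neural_signal = np.exp(-((t - 100e-6) ** 2) / (2 * (10e-6) ** 2))

# The implant's piezo plus circuit change its acoustic reflectivity with the signal;
# the reader sees the carrier with that envelope impressed on its amplitude.
reflectivity = 0.5 + 0.5 * neural_signal             # 50% modulation depth (assumed)
backscattered = reflectivity * carrier

# Reader-side demodulation: envelope detection recovers the neural signal
# (a real reader would low-pass filter this to strip the carrier).
recovered = np.abs(backscattered)
```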
01:43:02.940 | - So on the theme of parking very small
01:43:07.360 | computational devices next to neurons,
01:43:10.080 | that's the dream, the vision of brain-computer interfaces.
01:43:14.440 | Maybe before we talk about Neuralink,
01:43:16.120 | can you give a sense of the history of the field of BCI?
01:43:21.120 | What has been maybe the continued dream
01:43:26.120 | and also some of the milestones along the way
01:43:29.000 | with the different approaches
01:43:30.560 | and the amazing work done at the various labs?
01:43:32.840 | - I think a good starting point is going back to 1790s.
01:43:38.200 | - (laughs) I did not expect that.
01:43:41.160 | - Where the concept of animal electricity
01:43:46.160 | or the fact that the body is electric
01:43:48.400 | was first discovered by Luigi Galvani,
01:43:51.880 | where he had this famous experiment
01:43:53.800 | where he connected a set of electrodes to a frog leg
01:43:57.880 | and ran current through it, and then it started twitching,
01:44:00.400 | and he said, "Oh my goodness, body's electric."
01:44:04.080 | So fast forward many, many years to the 1920s,
01:44:07.880 | where Hans Berger, who was a German psychiatrist,
01:44:11.520 | discovered EEG, or electroencephalography,
01:44:15.160 | which is still around.
01:44:16.720 | There are these electrode arrays
01:44:18.640 | that you wear outside the skull
01:44:20.880 | that gives you some sort of neural recording.
01:44:23.200 | That was a very, very big milestone
01:44:24.880 | that you can record some sort of activities
01:44:27.920 | about the human mind.
01:44:29.640 | And then in the 1940s, there was this group of scientists,
01:44:36.320 | Renshaw, Forbes, and Morison,
01:44:38.320 | that inserted these glass microelectrodes
01:44:44.120 | into the cortex and recorded single neurons.
01:44:46.680 | The fact that there are signals
01:44:50.480 | that are a bit higher resolution and higher fidelity
01:44:53.760 | as you get closer to the source, let's say.
01:44:55.920 | And in the 1950s, these two scientists,
01:45:01.000 | Hodgkin and Huxley showed up
01:45:03.920 | and they built these beautiful, beautiful models
01:45:07.640 | of the cell membrane and the ionic mechanisms
01:45:10.880 | and had these like circuit diagrams.
01:45:12.240 | And as someone who is an electrical engineer,
01:45:14.560 | it's a beautiful model that's built out
01:45:16.760 | of these partial differential equations,
01:45:19.800 | talking about flow of ions
01:45:21.800 | and how that really leads to how neurons communicate.
01:45:25.840 | And they won the Nobel Prize for that 10 years later
01:45:28.240 | in the 1960s.
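For reference, the membrane equation at the core of the Hodgkin-Huxley model being described takes this standard textbook form:

$$C_m \frac{dV}{dt} = I_{\text{ext}} - \bar{g}_{\text{Na}}\, m^3 h\,(V - E_{\text{Na}}) - \bar{g}_{\text{K}}\, n^4\,(V - E_{\text{K}}) - \bar{g}_L\,(V - E_L),$$

with each gating variable $x \in \{m, h, n\}$ evolving as $\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x$. Here $C_m$ is the membrane capacitance, the $\bar{g}$ terms are maximal conductances, the $E$ terms are ionic reversal potentials, and the gating variables capture the voltage-dependent opening and closing of the channels.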
01:45:29.560 | So in 1969, Eb Fetz from the University of Washington
01:45:34.560 | published this beautiful paper
01:45:35.960 | called "Operand Conditioning of Cortical Unit Activity"
01:45:38.880 | where he was able to record a single unit neuron
01:45:43.800 | from a monkey and was able to have the monkey modulate it
01:45:48.800 | based on its activity and a reward system.
01:45:52.360 | So I would say this is the very, very first example
01:45:56.280 | as far as I'm aware of closed loop,
01:45:59.160 | brain computer interface or BCI.
01:46:01.920 | The abstract reads, "The activity of single neurons
01:46:05.040 | "in precentral cortex of anesthetized monkeys
01:46:09.240 | "was conditioned by reinforcing high rates
01:46:11.640 | "of neuronal discharge with delivery of a food pellet.
01:46:15.140 | "Auditory and visual feedback of unit firing rates
01:46:18.200 | "was usually provided in addition to food reinforcement."
01:46:21.320 | Cool.
01:46:22.340 | So they actually got it done.
01:46:24.720 | - They got it done.
01:46:25.560 | This is back in 1969.
01:46:28.520 | - "After several training sessions,
01:46:31.880 | "monkeys could increase the activity
01:46:33.500 | "of newly isolated cells by 50 to 500% above rates
01:46:38.280 | "before reinforcement."
01:46:40.280 | Fascinating.
01:46:41.120 | - Brain is very plastic.
01:46:43.200 | (laughing)
01:46:44.560 | - And so from here, the number of experiments grew.
01:46:48.280 | - Yeah, number of experiments as well as set of tools
01:46:52.240 | to interface with the brain have just exploded.
01:46:54.700 | I think, and also just understanding the neural code
01:46:59.800 | and how some of the cortical layers
01:47:01.960 | and the functions are organized.
01:47:03.820 | So the other paper that is pretty seminal,
01:47:08.820 | especially in the motor decoding,
01:47:11.040 | was this paper in the 1980s from Georgopoulos
01:47:15.700 | that discovered that there's this thing
01:47:19.020 | called the motor tuning curve.
01:47:20.620 | So what are motor tuning curves?
01:47:22.500 | It's the fact that there are neurons in the motor cortex
01:47:25.740 | of mammals, including humans,
01:47:28.500 | that have a preferential direction
01:47:31.620 | that causes them to fire.
01:47:32.820 | So what that means is there are a set of neurons
01:47:34.740 | that would increase their spiking activities
01:47:38.180 | when you're thinking about moving to the left,
01:47:41.420 | right, up, down, and any of those vectors.
01:47:46.420 | And based on that, you could start to think,
01:47:49.820 | well, if you can identify those essential eigenvectors,
01:47:53.940 | you can do a lot and you can actually use that information
01:47:56.260 | for actually decoding someone's intended movement
01:47:59.140 | from the cortex.
01:48:00.500 | So that was a very, very seminal kind of paper
01:48:03.060 | that showed that there is some sort of code
01:48:08.060 | that you can extract, especially in the motor cortex.
01:48:11.820 | - So there's signal there.
01:48:13.420 | And if you measure the electrical signal from the brain,
01:48:17.360 | that you could actually figure out what the intention was.
01:48:20.820 | - Correct, yeah, not only electrical signals,
01:48:22.740 | but electrical signals from the right set of neurons
01:48:25.200 | that give you these preferential directions.
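A small sketch of the textbook cosine-tuning and population-vector idea from that line of work follows; the neuron count, baseline rates, and gains are arbitrary illustrative values, and this is not Neuralink's decoder.

```python
import numpy as np

# Cosine tuning: each neuron fires most for its preferred movement direction.
# Population-vector decoding recovers the intended direction from the ensemble.

rng = np.random.default_rng(0)
n_neurons = 64
preferred = rng.uniform(0, 2 * np.pi, n_neurons)      # preferred directions (rad), assumed
baseline, gain = 10.0, 8.0                             # firing-rate parameters, Hz (assumed)

def firing_rates(movement_angle):
    # Tuning curve: rate peaks when movement matches the preferred direction.
    return baseline + gain * np.cos(movement_angle - preferred)

def population_vector(rates):
    # Weight each neuron's preferred direction by its baseline-subtracted rate.
    w = rates - baseline
    x = np.sum(w * np.cos(preferred))
    y = np.sum(w * np.sin(preferred))
    return np.arctan2(y, x)

true_angle = np.deg2rad(30)
observed = rng.poisson(firing_rates(true_angle))       # noisy spike counts
print(np.rad2deg(population_vector(observed)))         # should land near 30 degrees
```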
01:48:29.100 | - Okay, so going slowly towards Neuralink,
01:48:32.220 | one interesting question is, what should I understand
01:48:35.540 | on the BCI front about invasive versus non-invasive
01:48:39.300 | from this line of work?
01:48:42.780 | How important is it to park next to the neuron?
01:48:47.620 | What does that get you?
01:48:49.500 | - That answer fundamentally depends
01:48:51.020 | on what you want to do with it, right?
01:48:53.260 | There's actually an incredible amount of stuff
01:48:55.420 | that you can do with EEG and electrocorticography, ECoG,
01:48:59.620 | which actually doesn't penetrate the cortical layer
01:49:02.420 | or parenchyma; you place a set of electrodes
01:49:05.700 | on the surface of the brain.
01:49:07.900 | So the thing that I'm personally very interested in
01:49:10.580 | is just actually understanding
01:49:12.900 | and being able to just really tap into the high resolution,
01:49:18.780 | high fidelity understanding of the activities
01:49:21.220 | that are happening at the local level.
01:49:23.180 | And we can get into biophysics,
01:49:26.220 | but just to kind of step back to kind of use analogy,
01:49:29.900 | 'cause analogy here can be useful.
01:49:31.620 | Sometimes it's a little bit difficult
01:49:32.740 | to think about electricity.
01:49:34.300 | At the end of the day, we're doing electrical recording
01:49:35.900 | that's mediated by ionic currents,
01:49:39.180 | movements of these charged particles,
01:49:40.980 | which is really, really hard for most people to think about.
01:49:45.660 | But it turns out a lot of the activities
01:49:47.980 | that are happening in the brain,
01:49:51.780 | and the frequency band at which that's happening,
01:49:54.100 | are actually very, very similar to sound waves
01:49:56.420 | in our normal conversation, the audible range.
01:50:01.060 | So the analogy that typically is used in the field
01:50:04.260 | is if you have a football stadium,
01:50:07.900 | there's a game going on.
01:50:10.500 | If you stand outside the stadium,
01:50:11.900 | you maybe get a sense of how the game is going
01:50:14.420 | based on the cheers and the boos of the home crowd,
01:50:16.700 | whether the team is winning or not,
01:50:18.820 | but you have absolutely no idea what the score is.
01:50:22.020 | You have absolutely no idea what individual audience members
01:50:26.180 | or the players are saying to each other,
01:50:28.740 | what the next play is, what the next goal is.
01:50:30.980 | So what you have to do is you have to drop the microphone
01:50:34.780 | into the stadium and then get near the source,
01:50:38.580 | like into the individual chatter.
01:50:41.140 | In this specific example, you would wanna have it
01:50:43.860 | right next to where the huddle is happening.
01:50:47.300 | So I think that's kind of a good illustration
01:50:50.180 | of what we're trying to do when we say invasive
01:50:54.220 | or minimally invasive or implanted brain-computer interfaces
01:50:57.780 | versus non-invasive or non-implanted brain interfaces.
01:51:02.100 | It's basically talking about where do you put
01:51:04.460 | that microphone and what can you do with that information?
01:51:07.340 | - So what is the biophysics of the read and write
01:51:11.420 | communication that we're talking about here
01:51:13.700 | as we now step into the efforts at Neuralink?
01:51:18.220 | - Yeah, so the brain is made up of these specialized cells
01:51:23.220 | called neurons.
01:51:25.940 | There's billions of them, tens of billions.
01:51:29.460 | Sometimes people call it a hundred billion,
01:51:31.660 | that are connected in this complex yet dynamic network
01:51:36.660 | that is constantly remodeling.
01:51:39.740 | They're changing their synaptic weights
01:51:42.180 | and that's what we typically call neuroplasticity.
01:51:45.820 | And the neurons are also bathed in this charged environment
01:51:51.540 | that is laden with many charged molecules
01:51:55.060 | like potassium ions, sodium ions, chloride ions.
01:51:59.420 | And those actually facilitate,
01:52:02.740 | through ionic currents, communication
01:52:04.740 | between these different networks.
01:52:06.440 | And when you look at a neuron as well,
01:52:12.100 | they have these membranes with beautiful,
01:52:16.620 | beautiful protein structures
01:52:18.900 | called voltage-selective ion channels,
01:52:21.420 | which in my opinion are one of nature's best inventions.
01:52:26.420 | In many ways, if you think about what they are,
01:52:29.060 | they're doing the job of modern-day transistors.
01:52:32.700 | Transistors are nothing more at the end of the day
01:52:34.700 | than a voltage gated conduction channel.
01:52:37.160 | And nature found a way to have that very, very early on
01:52:41.940 | in its evolution.
01:52:43.220 | And as we all know, with the transistor,
01:52:45.540 | you can have many, many computation
01:52:47.340 | and a lot of amazing things that we have access to today.
01:52:51.940 | So I think it's one of those, just as a tangent,
01:52:56.180 | just a beautiful, beautiful invention
01:52:58.680 | that nature came up with,
01:52:59.780 | these voltage-gated ion channels.
01:53:02.220 | - I mean, I suppose there's on the biological level,
01:53:05.220 | every level of the complexity of the hierarchy
01:53:07.940 | of the organism, there's going to be some mechanisms
01:53:11.620 | for storing information and for doing computation.
01:53:14.980 | And this is just one such way.
01:53:16.900 | But to do that with biological and chemical components
01:53:20.460 | is interesting.
01:53:21.460 | Plus like when neurons, I mean, it's not just electricity,
01:53:25.620 | it's chemical communication, it's also mechanical.
01:53:29.460 | And these are like actual objects that have like,
01:53:34.040 | that vibrate, I mean, they move.
01:53:36.180 | - Yeah, they're actually, I mean,
01:53:37.780 | there's a lot of really, really interesting physics
01:53:40.220 | that are involved.
01:53:41.900 | And kind of going back to my work on ultrasound
01:53:46.260 | during grad school, there were groups
01:53:51.100 | and there are still groups looking at ways
01:53:53.740 | to cause neurons to actually fire an action potential
01:53:58.280 | using ultrasound wave.
01:53:59.380 | And the mechanism by which that's happening
01:54:01.460 | is still unclear, as I understand it.
01:54:03.560 | It may just be that you're imparting
01:54:06.400 | some sort of thermal energy and that causes cells
01:54:08.920 | to depolarize in some interesting ways.
01:54:11.440 | But there are also these ion channels or even membranes
01:54:15.420 | that actually just open up their pores
01:54:18.220 | as they're being mechanically shaken, vibrated.
01:54:21.520 | So there's just a lot of elements of these moving particles,
01:54:26.520 | which again is governed by diffusion physics,
01:54:30.880 | movements of particles.
01:54:32.300 | And there's also a lot of kind of interesting physics there.
01:54:35.620 | - Also not to mention, as Roger Penrose talks
01:54:38.380 | about the, there might be some beautiful weirdness
01:54:42.120 | in the quantum mechanical effects of all of this.
01:54:44.540 | And he actually believes that consciousness
01:54:46.620 | might emerge from the quantum mechanical effects there.
01:54:49.820 | So like there's physics, there's chemistry,
01:54:52.260 | there's biology, all of that is going on there.
01:54:54.560 | - Oh yeah, yeah.
01:54:55.400 | I mean, you can, yes, there's a lot of levels of physics
01:54:59.000 | that you can dive into.
01:55:00.940 | But yeah, in the end, you have these membranes
01:55:04.420 | with these voltage gated ion channels
01:55:06.460 | that selectively let these charged molecules
01:55:10.220 | that are in the extracellular matrix, like in and out.
01:55:15.220 | And these neurons generally have these like resting potential
01:55:20.460 | where there's a voltage difference
01:55:22.620 | between inside the cell and outside the cell.
01:55:25.260 | And when there's some sort of stimuli that changes the state
01:55:30.260 | such that they need to send information
01:55:34.420 | to the downstream network, you know,
01:55:38.380 | you start to kind of see these like sort of orchestration
01:55:40.660 | of these different molecules going in and out
01:55:43.100 | of these channels.
01:55:44.100 | They also open up, like more of them open up
01:55:46.060 | once it reaches some threshold to a point where,
01:55:49.460 | you know, you have a depolarizing cell
01:55:51.260 | that sends an action potential.
01:55:53.420 | So it's just a very beautiful kind of orchestration
01:55:55.980 | of these molecules.
01:55:59.100 | And what we're trying to do when we place an electrode,
01:56:04.260 | or park it next to a neuron, is that you're trying
01:56:07.580 | to measure these local changes in the potential.
01:56:11.220 | Again, mediated by the movements of the ions.
01:56:17.140 | And what's interesting, as I mentioned earlier,
01:56:19.300 | there's a lot of physics involved.
01:56:21.000 | And the two dominant physics
01:56:24.340 | for this electrical recording domain
01:56:27.900 | are diffusion physics and electromagnetism.
01:56:31.020 | And where one dominates, where Maxwell's equations dominate
01:56:36.020 | versus Fick's law dominates,
01:56:38.540 | depends on where your electrode is.
01:56:41.220 | If it's close to the source, it's mostly electromagnetism based;
01:56:47.340 | when you're farther away from it, it's more diffusion based.
01:56:51.620 | So essentially when you're able to park it next to it,
01:56:55.580 | you can listen in on those individual chatter
01:56:59.340 | and those local changes in the potential.
01:57:01.500 | And the type of signal that you get
01:57:03.380 | are these canonical textbook neural spiking waveforms.
01:57:08.380 | The moment you're further away,
01:57:10.220 | and based on some of the studies that people have done,
01:57:13.260 | you know, Christof Koch's lab and others,
01:57:16.300 | once you're away from that source
01:57:17.940 | by roughly around 100 microns,
01:57:20.020 | which is about the width of a human hair,
01:57:22.100 | you no longer hear from that neuron.
01:57:24.300 | You're no longer able to kind of have the system
01:57:27.620 | sensitive enough to be able to record
01:57:30.420 | that particular local membrane potential change
01:57:35.420 | in that neuron.
01:57:36.780 | And just to kind of give you a sense of scale also,
01:57:39.500 | when you look at a hundred micron voxel,
01:57:41.620 | so a hundred micron by a hundred micron
01:57:43.060 | by a hundred micron box in a brain tissue,
01:57:45.940 | there's roughly around 40 neurons
01:57:49.100 | and whatever number of connections that they have.
01:57:51.420 | So there's a lot in that volume of tissue.
01:57:54.060 | So the moment you're outside of that,
01:57:55.940 | there's just no hope that you'll be able to
01:57:58.660 | detect that change from that one specific neuron
01:58:01.180 | that you may care about.
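As a quick sanity check on that figure, assuming a cortical neuron density on the order of 40,000 per cubic millimeter (a commonly cited ballpark, not a Neuralink measurement):

```python
# "Roughly 40 neurons in a 100-micron voxel" follows directly from that density.

density_per_mm3 = 40_000                   # assumed cortical neuron density
voxel_side_mm = 0.1                        # 100 microns
voxel_volume_mm3 = voxel_side_mm ** 3      # 1e-3 mm^3
print(density_per_mm3 * voxel_volume_mm3)  # -> 40 neurons in that volume
```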
01:58:03.580 | - Yeah, but as you're moving about this space,
01:58:06.700 | you'll be hearing other ones.
01:58:09.180 | So if you move another hundred micron,
01:58:11.340 | you'll be hearing chatter from another community.
01:58:12.940 | - Correct.
01:58:14.100 | - And so the whole sense is you want to place
01:58:17.060 | as many as possible electrodes
01:58:18.660 | and then you're listening to the chatter.
01:58:20.460 | - Yeah, you want to listen to the chatter.
01:58:21.860 | And at the end of the day,
01:58:22.980 | you also want to basically let the software
01:58:25.660 | do the job of decoding.
01:58:28.220 | And just to kind of go to,
01:58:32.100 | why ECoG and EEG work at all, right?
01:58:35.780 | When you have these local changes,
01:58:38.540 | obviously it's not just this one neuron that's activating,
01:58:41.860 | there's many, many other networks
01:58:43.580 | that are activating all the time.
01:58:45.500 | And you do see sort of a general change
01:58:48.260 | in the potential of this electrode,
01:58:51.060 | like it's a charge medium.
01:58:52.860 | And that's what you're recording when you're farther away.
01:58:54.860 | I mean, you still have some reference electrode
01:58:57.020 | that's stable in the brain,
01:58:59.140 | that's just electroactive organ.
01:59:01.300 | And you're seeing some combination
01:59:03.260 | aggregate action potential changes,
01:59:06.060 | and then you can pick it up, right?
01:59:07.220 | It's a much slower changing signal,
01:59:11.700 | but there are these like canonical
01:59:15.460 | kind of oscillations and waves,
01:59:16.900 | like gamma waves, beta waves,
01:59:18.340 | like when you sleep that can be detected
01:59:20.540 | 'cause there's sort of a synchronized
01:59:22.780 | kind of global effect of the brain that you can detect.
01:59:28.580 | And I mean, the physics of this go,
01:59:32.060 | like, I mean, if we really wanna go down that rabbit hole,
01:59:34.460 | like there's a lot that goes on in terms of
01:59:38.180 | like why diffusion physics at some point dominates
01:59:41.260 | when you're further away from the source.
01:59:43.740 | It's just a charged medium.
01:59:45.780 | So similar to how when you have electromagnetic waves
01:59:48.740 | propagating in the atmosphere
01:59:50.140 | or in a charged medium like a plasma,
01:59:52.780 | there's this weird shielding that happens
01:59:54.580 | that actually further attenuates the signal
01:59:58.780 | as you move away from it.
02:00:00.340 | So yeah, you see, like,
02:00:02.180 | if you do a really, really deep dive
02:00:03.660 | on kind of the signal attenuation over distance,
02:00:07.260 | you start to see kind of one over R squared in the beginning
02:00:10.140 | and then an exponential drop-off.
02:00:11.780 | And that's the knee at which, you know,
02:00:13.380 | you go from electromagnetism dominating
02:00:16.580 | to diffusion physics dominating.
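A toy model with the qualitative shape he describes might look like the following; the knee distance and decay length here are made-up illustrative parameters, not measured or fitted biophysical values.

```python
import numpy as np

# Toy attenuation curve: ~1/r^2 falloff near the source, then an exponential
# decay past an assumed "knee" where screening in the conductive medium takes over.

def amplitude(r_um, knee_um=100.0, length_scale_um=30.0):
    if r_um <= knee_um:
        return 1.0 / r_um**2
    # continue from the value at the knee, then decay exponentially
    return (1.0 / knee_um**2) * np.exp(-(r_um - knee_um) / length_scale_um)

for r in (20, 50, 100, 200, 400):
    print(f"{r:4d} um -> relative amplitude {amplitude(r):.2e}")
```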
02:00:19.420 | - But once again, with the electrodes,
02:00:21.900 | the biophysics that you need to understand is not as deep
02:00:26.900 | because no matter where you're placing it,
02:00:29.540 | you're listening to a small crowd of local neurons.
02:00:32.260 | - Correct, yeah.
02:00:33.100 | So once you penetrate the brain,
02:00:35.020 | you know, you're in the arena, so to speak.
02:00:37.220 | - And there's a lot of neurons.
02:00:39.020 | - There are many, many of them.
02:00:40.060 | - But then again, there's like,
02:00:41.380 | there's a whole field of neuroscience that's studying like
02:00:44.140 | how the different groupings,
02:00:46.500 | the different sections of the seating in the arena,
02:00:49.020 | what they usually are responsible for,
02:00:50.780 | which is where the metaphor probably falls apart
02:00:53.340 | 'cause the seating is not that organized in an arena.
02:00:56.260 | - Also, most of them are silent.
02:00:57.940 | They don't really do much, you know?
02:01:00.500 | Or their activities are, you know,
02:01:04.220 | you have to hit it with just the right set of stimuli.
02:01:07.140 | - So they're usually quiet.
02:01:08.940 | - They're usually very quiet.
02:01:10.500 | There's, I mean, similar to dark energy and dark matter,
02:01:13.300 | there's dark neurons.
02:01:15.140 | What are they all doing?
02:01:16.540 | When you place these electrodes, again,
02:01:18.140 | like within this hundred micron volume,
02:01:20.020 | you have 40 or so neurons.
02:01:21.740 | Like, why do you not see 40 neurons?
02:01:23.500 | Why do you see only a handful?
02:01:24.900 | What is happening there?
02:01:25.780 | - Well, they're mostly quiet,
02:01:26.940 | but like when they speak, they say profound shit, I think.
02:01:30.100 | That's the way I'd like to think about it.
02:01:31.860 | Anyway, before we zoom in even more, let's zoom out.
02:01:35.580 | So how does Neuralink work?
02:01:38.940 | From the surgery, to the implant,
02:01:43.940 | to the signal and the decoding process,
02:01:47.660 | and the human being able to use the implant
02:01:52.460 | to actually affect the world outside?
02:01:56.460 | And all of this, I'm asking in the context
02:02:00.100 | of there's a gigantic historic milestone
02:02:02.540 | that Neuralink just accomplished in January of this year,
02:02:07.100 | putting a Neuralink implant in the first human being, Noland.
02:02:11.180 | And there's been a lot to talk about there,
02:02:13.860 | about his experience, because he's able to describe
02:02:16.700 | all the nuance and the beauty
02:02:18.100 | and the fascinating complexity of that experience,
02:02:21.200 | of everything involved.
02:02:22.180 | But on the technical level, how does Neuralink work?
02:02:26.020 | - Yeah, so there are three major components
02:02:27.980 | to the technology that we're building.
02:02:29.580 | One is the device,
02:02:31.980 | the thing that's actually recording these neural chatters.
02:02:36.100 | We call it N1 Implant, or the Link.
02:02:40.260 | And we have a surgical robot
02:02:43.500 | that's actually doing an implantation
02:02:45.460 | of these tiny, tiny wires that we call threads
02:02:48.180 | that are smaller than human hair.
02:02:51.220 | And once everything is surgerized,
02:02:54.860 | you have these neural signals, these spiking neurons
02:02:58.300 | that are coming out of the brain,
02:02:59.740 | and you need to have some sort of software
02:03:02.420 | to decode what the user intends to do with that.
02:03:06.860 | So there's what's called a Neuralink Application,
02:03:10.420 | or B1 App, that's doing that translation.
02:03:12.740 | It's running the very, very simple machine learning model
02:03:17.140 | that decodes these inputs that are neural signals,
02:03:22.140 | and then convert it to a set of outputs
02:03:23.940 | that allows our participant, first participant, Noland,
02:03:28.380 | to be able to control a cursor on the screen.
02:03:30.980 | - And this is done wirelessly?
02:03:33.820 | - And this is done wirelessly,
02:03:34.980 | so our implant is actually two-part.
02:03:39.140 | The Link has these flexible, tiny wires called threads
02:03:44.140 | that have multiple electrodes along its length.
02:03:49.580 | And they're only inserted into the cortical layer,
02:03:53.860 | which is about three to five millimeters in a human brain.
02:03:58.180 | In the motor cortex region,
02:03:59.380 | that's where the intention for movement lies in.
02:04:04.100 | And we have 64 of these threads,
02:04:06.660 | each thread having 16 electrodes along the span
02:04:10.020 | of three to four millimeters, separated by 200 microns.
02:04:13.540 | So you can actually record along the depth of the insertion.
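A quick check of the geometry just described (using the 1,024-channel total he mentions a bit later in the conversation):

```python
# 16 electrodes per thread at a 200-micron pitch span about 3 mm of depth,
# and 64 threads give the 1,024-channel total.

electrodes_per_thread = 16
pitch_um = 200
threads = 64

span_mm = (electrodes_per_thread - 1) * pitch_um / 1000
total_channels = threads * electrodes_per_thread
print(span_mm, total_channels)   # 3.0 mm recording span, 1024 electrodes
```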
02:04:18.420 | And based on that signal, there's a custom integrated circuit,
02:04:23.420 | or ASIC, that we built that amplifies the neural signals
02:04:29.260 | that you're recording, then digitizes them,
02:04:32.220 | and then has some mechanism for detecting
02:04:37.220 | whether there was an interesting event,
02:04:39.380 | that is, a spiking event, and decides to send that
02:04:43.300 | or not send that through Bluetooth to an external device,
02:04:46.500 | whether it's a phone or a computer
02:04:48.420 | that's running this Neuralink application.
02:04:50.180 | - So there's onboard signal processing already
02:04:52.380 | just to decide whether this is an interesting event or not.
02:04:55.220 | So there is some computational power onboard
02:04:57.180 | inside the, in addition to the human brain.
02:04:59.900 | - Yeah, so it does the signal processing
02:05:02.340 | to kind of really compress the amount of signal
02:05:04.820 | that you're recording.
02:05:05.900 | So we have a total of 1,000 electrodes sampling
02:05:10.220 | at just under 20 kilohertz with 10 bit each.
02:05:14.300 | So that's 200 megabits that's coming through to the chip
02:05:19.300 | from 1,000 channel simultaneous neural recording.
02:05:25.380 | And that's quite a bit of data.
02:05:27.780 | And, you know, there are technology available
02:05:29.900 | to send that off wirelessly, but being able to do that
02:05:32.940 | in a very, very thermally constrained environment
02:05:36.780 | that is a brain.
02:05:37.620 | So there has to be some amount of compression
02:05:40.580 | that happens to send off only the interesting data
02:05:43.500 | that you need, which in this particular case
02:05:45.500 | for motor decoding is occurrence of a spike or not.
02:05:50.340 | And then being able to use that to, you know,
02:05:55.500 | decode the intended cursor movement.
02:05:57.700 | So the implant itself processes it,
02:06:00.580 | figures out whether a spike happened or not
02:06:02.980 | with our spike detection algorithm,
02:06:05.820 | and then packages it,
02:06:07.900 | sends it off through Bluetooth to an external device
02:06:12.140 | that then has the model to decode,
02:06:14.620 | okay, based on these spiking inputs,
02:06:17.340 | did Noland wish to go up, down, left, right,
02:06:21.380 | or click or right click or whatever?
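A minimal sketch of that kind of pipeline, binned spike counts in and cursor velocity out through a simple linear map, is below. This is a generic illustration with made-up weights, not the model Neuralink actually runs.

```python
import numpy as np

# Binned per-channel spike counts -> (vx, vy) cursor velocity via a linear decoder.

n_channels = 1024
bin_ms = 15                      # matches the Bluetooth packet binning mentioned later on

rng = np.random.default_rng(1)
W = rng.normal(0, 0.01, size=(2, n_channels))   # decoder weights (random stand-ins here;
b = np.zeros(2)                                 # in practice these would be learned)

def decode_bin(spike_counts):
    """Map one bin of per-channel spike counts to a 2-D cursor velocity."""
    return W @ spike_counts + b

spike_counts = rng.poisson(0.3, size=n_channels)  # fake bin of spike events
vx, vy = decode_bin(spike_counts)
print(vx, vy)
```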
02:06:23.780 | - All of this is really fascinating,
02:06:25.220 | but let's stick on the N1 implant itself.
02:06:27.180 | So the thing that's in the brain,
02:06:29.500 | so I'm looking at a picture of it.
02:06:30.820 | There's an enclosure, there's a charging coil.
02:06:34.180 | So we didn't talk about the charging, which is fascinating.
02:06:38.180 | The battery, the power electronics, the antenna.
02:06:42.460 | Then there's the signal processing electronics.
02:06:46.580 | I wonder if there's more kinds
02:06:47.940 | of signal processing you can do.
02:06:49.060 | That's another question.
02:06:50.900 | And then there's the threads themselves
02:06:54.180 | with the enclosure on the bottom.
02:06:56.420 | So maybe to ask about the charging.
02:06:59.700 | There's an external charging device.
02:07:03.420 | - Yeah, there's an external charging device.
02:07:05.300 | So yeah, the second part of the implant,
02:07:07.540 | the threads are the ones, again,
02:07:09.220 | just the last three to five millimeters
02:07:12.420 | are the ones that are actually penetrating the cortex.
02:07:15.140 | The rest of it, actually most of the volume,
02:07:19.020 | is occupied by the battery, a rechargeable battery.
02:07:23.820 | And it's about the size of a quarter.
02:07:26.500 | I actually have a device here
02:07:27.980 | if you want to take a look at it.
02:07:29.580 | This is the flexible thread component of it,
02:07:35.220 | and this is the implant.
02:07:36.660 | So it's about the size of a US quarter.
02:07:41.020 | It's about nine millimeters thick.
02:07:43.620 | So basically this implant,
02:07:45.660 | once you have the craniectomy and the durotomy,
02:07:48.740 | threads are inserted.
02:07:51.420 | And the hole that you created,
02:07:54.740 | this craniectomy gets replaced with that.
02:07:57.220 | So basically that thing plugs that hole
02:07:59.700 | and you can screw in these self-drilling cranial screws
02:08:04.700 | to hold it in place.
02:08:06.700 | And at the end of the day,
02:08:08.740 | once you have the skin flap over,
02:08:10.860 | there's only about two to three millimeters
02:08:13.460 | that's obviously transitioning
02:08:15.940 | off of the top of the implant to where the screws are.
02:08:19.420 | And that's the minor bump that you have.
02:08:22.420 | - Those threads look tiny.
02:08:24.740 | That's incredible.
02:08:27.220 | That is really incredible.
02:08:29.460 | That is really incredible.
02:08:31.060 | And also as you're right,
02:08:32.540 | most of the actual volume is the battery.
02:08:35.660 | - Yeah.
02:08:36.500 | - This is way smaller than I realized.
02:08:38.220 | - They are also, the threads themselves are quite strong.
02:08:41.140 | - They look strong.
02:08:42.820 | - And the threads themselves also have
02:08:44.740 | a very interesting feature at the end of them
02:08:48.060 | called the loop, and that's the mechanism
02:08:50.460 | to which the robot is able to interface
02:08:52.300 | and manipulate this tiny hair-like structure.
02:08:55.740 | - And they're tiny.
02:08:56.580 | So what's the width of a thread?
02:08:58.500 | - Yeah, so the width of a thread starts from 16 micron
02:09:03.500 | and then tapers out to about 84 micron.
02:09:06.500 | So average human hair is about 80 to 100 micron in width.
02:09:10.940 | - This thing is amazing.
02:09:14.740 | This thing is amazing.
02:09:16.780 | - Yes, most of the volume is occupied by the battery,
02:09:20.580 | rechargeable lithium ion cell.
02:09:22.780 | And the charging is done through inductive charging,
02:09:28.500 | which is actually very commonly used.
02:09:30.460 | You know, your cell phone, most cell phones have that.
02:09:32.860 | The biggest difference is that, you know, for us,
02:09:36.620 | you know, usually when you have a phone
02:09:39.060 | and you want to charge it on a charging pad,
02:09:40.780 | you don't really care how hot it gets.
02:09:43.100 | Whereas for us, it matters.
02:09:44.700 | There's a very strict regulation and good reasons
02:09:47.180 | to not actually increase the surrounding tissue temperature
02:09:50.260 | by two degrees Celsius.
02:09:51.740 | So there's actually a lot of innovation
02:09:55.060 | that is packed into this to allow charging of this implant
02:10:00.060 | without causing that temperature threshold to be reached.
02:10:03.740 | And even small things like you see this charging coil
02:10:07.020 | and what's called a ferrite shield, right?
02:10:09.340 | So without that ferrite shield,
02:10:11.940 | what you end up having when you have, you know,
02:10:14.020 | resonant inductive charging is that the battery itself
02:10:17.580 | is a metallic can, and you form these eddy currents
02:10:22.300 | from the external charger, and that causes heating.
02:10:26.940 | And that actually contributes to inefficiency in charging.
02:10:30.820 | So this ferrite shield, what it does is that it actually
02:10:35.900 | concentrates the field lines away from the battery
02:10:40.140 | and around the coil that's actually wrapped around it.
02:10:42.900 | - There's a lot of really fascinating design here
02:10:46.260 | to make it, I mean, you're integrating a computer
02:10:49.380 | into a biological, a complex biological system.
02:10:52.820 | - Yeah, there's a lot of innovation here.
02:10:54.500 | I would say that part of what enabled this
02:10:57.380 | was just the innovations in the wearable.
02:11:01.300 | There's a lot of really, really powerful, tiny, low power
02:11:06.300 | microcontrollers, temperature sensors
02:11:09.860 | or various different sensors and power electronics.
02:11:14.020 | A lot of innovation really came in the charging coil design,
02:11:17.620 | how this is packaged and how do you enable charging
02:11:21.220 | such that you don't really exceed that temperature limit,
02:11:24.820 | which is not a constraint for other devices out there.
02:11:28.060 | - So let's talk about the threads themselves,
02:11:29.780 | those tiny, tiny, tiny things.
02:11:31.620 | So how many of them are there?
02:11:34.580 | You mentioned a thousand electrodes.
02:11:37.220 | How many threads are there?
02:11:38.620 | And what do the electrodes have to do with the threads?
02:11:41.940 | - Yeah, so the current instantiation of the device
02:11:45.620 | has 64 threads and each thread has 16 electrodes
02:11:50.620 | for a total of 1,024 electrodes
02:11:53.620 | that are capable of both recording and stimulating.
02:11:56.780 | And the thread is basically this polymer insulated wire.
02:12:06.820 | The metal conductor is kind of a tiramisu cake
02:12:11.460 | of gold, platinum, and titanium.
02:12:14.780 | And they're very, very tiny wires, two microns in width.
02:12:21.500 | So two one-millionths of a meter.
02:12:25.700 | - It's crazy that that thing I'm looking at
02:12:27.940 | has the polymer insulation, has the conducting material
02:12:31.180 | and has 16 electrodes at the end of it.
02:12:34.340 | - On each of those threads.
02:12:35.740 | - Yeah, on each of those threads.
02:12:36.740 | - Correct, 16, each one of those.
02:12:38.500 | - Yes, you're not gonna be able to see it with the naked eye.
02:12:40.940 | - And I mean, to state the obvious,
02:12:44.100 | or maybe for people who are just listening,
02:12:45.780 | they're flexible.
02:12:47.620 | - Yes, yes, that's also one element
02:12:49.900 | that was incredibly important for us.
02:12:52.780 | So each of these threads are, as I mentioned,
02:12:55.940 | 16 micron in width and then they taper to 84 micron,
02:12:59.700 | but in thickness, they're less than five micron.
02:13:03.980 | And in thickness, it's mostly polyimide at the bottom
02:13:08.820 | and this metal track, and then another polyimide.
02:13:11.860 | So two micron of polyimide, 400 nanometer of this metal stack
02:13:16.860 | and two micron of polyimide sandwiched together
02:13:19.740 | to protect it from the environment
02:13:21.580 | that is a 37 degree C bag of saltwater.
02:13:25.820 | - So what's some, maybe, can you speak
02:13:27.820 | to some interesting aspects of the material design here?
02:13:31.260 | Like what does it take to design a thing like this
02:13:34.700 | and to be able to manufacture a thing like this
02:13:37.540 | for people who don't know anything about this kind of thing?
02:13:40.380 | - Yeah, so the material selection that we have is not,
02:13:43.540 | I don't think it was particularly unique.
02:13:47.180 | There were other labs and there are other labs
02:13:50.700 | that are kind of looking at similar material stack.
02:13:55.540 | There's kind of a fundamental question
02:13:57.940 | that still needs to be answered around the longevity
02:14:01.620 | and reliability of these microelectrodes, as we call them,
02:14:06.220 | compared to some of the other more conventional
02:14:09.020 | neural interfaces, devices that are intracranial,
02:14:12.420 | so penetrating the cortex that are more rigid,
02:14:16.140 | you know, like the Utah array,
02:14:17.540 | that are these four by four millimeter kind of silicon shanks
02:14:21.940 | that have exposed recording sites at the end of them.
02:14:26.660 | And, you know, that's been kind of the innovation
02:14:29.820 | from Richard Normann back in 1997.
02:14:33.220 | It's called the Utah array
02:14:34.060 | 'cause, you know, he was at the University of Utah.
02:14:36.260 | - And what does the Utah array look like?
02:14:38.420 | So it's a rigid type of-
02:14:40.740 | - Yeah, so we can actually look it up.
02:14:42.620 | Yeah.
02:14:46.260 | Yeah, so it's a bed of needles.
02:14:50.100 | There's-
02:14:50.940 | - Okay, go ahead, I'm sorry.
02:14:53.940 | - Those are rigid shank.
02:14:55.460 | - Rigid, yeah, you weren't kidding.
02:14:56.620 | - And the size and the number of shanks vary
02:14:59.340 | anywhere from 64 to 128.
02:15:01.700 | At the very tip of it is an exposed electrode
02:15:05.900 | that actually records neural signal.
02:15:07.940 | The other thing that's interesting to note
02:15:09.500 | is that unlike Neuralink threads
02:15:11.900 | that have recording electrodes
02:15:14.140 | that are actually exposed iridium oxide recording sites
02:15:17.460 | along the depth, this is only at a single depth.
02:15:20.620 | So these Utah array spokes can be anywhere
02:15:23.060 | between 0.5 millimeters to 1.5 millimeters.
02:15:26.300 | And they also have designs that are slanted
02:15:29.420 | so you can have it inserted at different depth.
02:15:31.780 | But that's one of the other big differences.
02:15:35.460 | And then, I mean, the main key difference
02:15:37.420 | is the fact that there's no active electronics.
02:15:40.420 | These are just electrodes.
02:15:41.780 | And then there's a bundle of wires that you're seeing.
02:15:44.580 | And that actually then exits the craniectomy,
02:15:47.660 | which then has this port that you can connect to
02:15:51.980 | for any external electronic devices.
02:15:53.820 | They are working on, or have, a wireless telemetry device,
02:15:57.780 | but it still requires a through-the-skin port
02:16:01.780 | that actually is one of the biggest failure modes,
02:16:04.300 | for infection, for the system.
02:16:06.700 | - What are some of the challenges
02:16:09.260 | associated with flexible threads?
02:16:12.060 | Like for example, on the robotic side,
02:16:14.340 | R1 implanting those threads, how difficult is that task?
02:16:19.340 | - Yeah, so as you mentioned,
02:16:21.860 | they're very, very difficult to maneuver by hand.
02:16:24.820 | These Utah arrays that you saw earlier,
02:16:28.220 | they're actually inserted by a neurosurgeon
02:16:31.180 | actually positioning it near the site that they want.
02:16:33.700 | And then there's a pneumatic hammer
02:16:37.420 | that actually pushes them in.
02:16:38.820 | So it's a pretty simple process
02:16:43.500 | and they're easier to maneuver.
02:16:45.340 | But for these thin-film arrays,
02:16:48.180 | they're very, very tiny and flexible.
02:16:50.940 | So they're very difficult to maneuver.
02:16:52.660 | So that's why we built an entire robot to do that.
02:16:55.940 | There are other reasons for why we built the robot.
02:16:58.540 | And that is ultimately, we want this to help
02:17:01.980 | millions and millions of people that can benefit from this.
02:17:04.380 | And there just aren't that many neurosurgeons out there.
02:17:07.140 | And robots can be something that we hope
02:17:13.260 | can actually do large parts of the surgery.
02:17:17.620 | But yeah, the robot is this entire other
02:17:22.420 | sort of category of product that we're working on.
02:17:26.100 | And it's essentially this multi-axis gantry system
02:17:31.100 | that has the specialized robot head
02:17:36.500 | that has all of the optics
02:17:39.220 | and this kind of a needle retracting mechanism
02:17:44.180 | that maneuvers these threads
02:17:47.020 | via this loop structure that you have on the thread.
02:17:52.620 | - So the thread already has a loop structure
02:17:54.180 | by which you can grab it.
02:17:55.500 | - Correct, correct.
02:17:56.580 | - So this is fascinating.
02:17:57.420 | So you mentioned optics.
02:17:58.420 | So there's a robot, R1.
02:18:01.460 | So for now, there's a human that actually
02:18:04.820 | creates a hole in the skull.
02:18:08.380 | And then after that, there's a computer vision component
02:18:13.100 | that's finding a way to avoid the blood vessels.
02:18:16.880 | And then you're grabbing it by the loop,
02:18:19.540 | each individual thread,
02:18:21.140 | and placing it in a particular location
02:18:23.700 | to avoid the blood vessels.
02:18:25.260 | And also choosing the depth of placement, all that.
02:18:27.940 | So controlling every, like the 3D geometry of the placement.
02:18:31.540 | - Correct.
02:18:32.420 | So the aspect of this robot that is unique
02:18:34.820 | is that it's not surgeon-assisted or human-assisted.
02:18:38.620 | It's a semi-automatic or automatic robot
02:18:42.300 | once you, obviously there are human components to it
02:18:45.020 | when you're placing targets.
02:18:47.060 | You can always move it away
02:18:48.460 | from kind of major vessels that you see.
02:18:51.780 | But I mean, we want to get to a point where one click
02:18:54.740 | and it just does the surgery within minutes.
02:18:57.580 | - So the computer vision component
02:18:59.420 | finds great targets, candidates,
02:19:02.860 | and the human kind of approves them.
02:19:04.900 | And the robot, does it do like one thread at a time,
02:19:07.420 | or does it do it in the middle?
02:19:08.260 | - It does one thread at a time.
02:19:09.300 | And that's actually also one thing that we are looking at,
02:19:14.180 | ways to do multiple threads at a time.
02:19:16.300 | There's nothing stopping from it.
02:19:17.500 | You can have multiple kind of engagement mechanisms,
02:19:21.460 | but right now it's one by one.
02:19:24.180 | And we also still do quite a bit of just kind of verification
02:19:29.180 | to make sure that it got inserted.
02:19:30.740 | If so, how deep, did it actually match
02:19:33.660 | what was programmed in and so on and so forth.
02:19:36.020 | - And the actual electrodes are placed at very,
02:19:38.700 | at differing depths in the,
02:19:42.820 | I mean, it's very small differences, but differences.
02:19:45.180 | - Yeah, yeah.
02:19:46.500 | - And so that there's some reasoning behind that,
02:19:49.860 | as you mentioned, like it gets more varied signal.
02:19:54.860 | - Yeah, I mean, we try to place them all around three
02:20:01.020 | or four millimeter from the surface,
02:20:03.780 | just 'cause the span of the electrode,
02:20:06.180 | those 16 electrodes that we currently have
02:20:07.980 | in this version spans roughly around three millimeters.
02:20:12.980 | So we want to get all of those in the brain.
02:20:15.300 | - This is fascinating.
02:20:17.220 | Okay, so there's a million questions here.
02:20:19.260 | If we go zoom in at specific on the electrodes,
02:20:21.700 | what is your sense?
02:20:23.340 | How many neurons is each individual electrode listening to?
02:20:27.060 | - Yeah, each electrode can record
02:20:28.820 | from anywhere between zero to 40,
02:20:31.860 | as I mentioned earlier.
02:20:34.220 | But practically speaking,
02:20:36.100 | we only see about at most like two to three.
02:20:41.140 | And you can actually distinguish which neuron
02:20:44.100 | it's coming from by the shape of the spikes.
02:20:47.420 | So I mentioned the spike detection algorithm that we have,
02:20:52.820 | it's called BOSS algorithm,
02:20:55.140 | Buffer Online Spike Sorter.
02:20:58.540 | - Nice.
02:20:59.380 | - It actually outputs at the end of the day,
02:21:01.940 | six unique values,
02:21:04.140 | which are kind of the amplitude
02:21:07.460 | of these like negative going hump,
02:21:09.980 | middle hump, like a positive going hump.
02:21:12.820 | And then also the time at which these happen.
02:21:15.260 | And from that,
02:21:16.260 | you can have a kind of a statistical probability estimation
02:21:20.820 | of is that a spike, is it not a spike?
02:21:22.300 | And then based on that,
02:21:23.140 | you could also determine,
02:21:24.660 | oh, that spike looks different
02:21:25.980 | than that spike, so it must come from a different neuron.
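A rough illustrative reimplementation of that idea, thresholded detection plus six amplitude-and-time features per candidate waveform, might look like this; the threshold, units, and hump definitions are assumptions, not the actual BOSS code.

```python
import numpy as np

def detect_and_featurize(snippet, fs_hz=20_000, threshold=-4.0):
    """snippet: 1-D array of samples (in noise-normalized units) around a threshold crossing.
    Returns six values: amplitudes and times of the negative, middle, and positive humps,
    or None if the trough never crosses the detection threshold."""
    trough_idx = int(np.argmin(snippet))
    if snippet[trough_idx] > threshold:
        return None                                   # no spike detected
    post = snippet[trough_idx:]
    peak_idx = trough_idx + int(np.argmax(post))      # positive-going hump after the trough
    mid_idx = (trough_idx + peak_idx) // 2            # crude proxy for the "middle hump"
    amps = (snippet[trough_idx], snippet[mid_idx], snippet[peak_idx])
    times_ms = tuple(1000 * i / fs_hz for i in (trough_idx, mid_idx, peak_idx))
    return amps + times_ms                            # six values per candidate spike

# Example: a synthetic spike-like waveform (negative trough, then positive rebound)
t = np.arange(64)
snippet = -6 * np.exp(-((t - 20) ** 2) / 20) + 3 * np.exp(-((t - 32) ** 2) / 40)
print(detect_and_featurize(snippet))
```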
02:21:28.020 | - Okay, so that's a nice signal processing step
02:21:31.860 | from which you can then make much better predictions
02:21:34.620 | about if there's a spike,
02:21:35.740 | especially in this kind of context
02:21:37.060 | where there could be multiple neurons screaming.
02:21:39.700 | And that also results
02:21:42.820 | in you being able to compress the data better.
02:21:44.860 | - Yeah.
02:21:45.700 | - Okay.
02:21:46.540 | - And just to be clear,
02:21:47.380 | I mean, the labs do this,
02:21:49.780 | what's called spike sorting.
02:21:51.860 | Usually once you have these like broadband,
02:21:55.060 | fully digitized signals,
02:21:58.260 | you then run a bunch of different sets of algorithms
02:22:01.900 | to kind of tease them apart.
02:22:03.220 | It's just, all of this for us is done on the device.
02:22:06.820 | - On the device.
02:22:07.660 | - In a very low power, custom,
02:22:09.860 | built ASIC digital processing unit.
02:22:14.260 | - Highly heat constrained.
02:22:15.660 | - Highly heat constrained.
02:22:16.780 | And the processing time from signal going in
02:22:20.060 | and giving you the output is less than a microsecond,
02:22:22.620 | which is a very, very short amount of time.
02:22:25.460 | - Oh yeah, so the latency has to be super short.
02:22:27.460 | - Correct.
02:22:28.500 | - Oh, wow.
02:22:29.660 | Oh, that's a pain in the ass.
02:22:30.980 | - Yeah, latency is this huge, huge thing
02:22:33.460 | that you have to deal with.
02:22:34.700 | Right now, the biggest source of latency
02:22:36.700 | comes from the Bluetooth,
02:22:37.940 | the way in which they're packetized,
02:22:40.500 | and we bin them in 15 milliseconds.
02:22:43.180 | - Oh, interesting, so it's communication constraint.
02:22:45.380 | Is there some potential innovation there
02:22:46.980 | on the protocol used?
02:22:48.300 | - Absolutely.
02:22:49.180 | - Okay.
02:22:50.020 | - Yeah, Bluetooth is definitely not our final
02:22:55.900 | wireless communication protocol that we want to get to.
02:22:58.220 | It's a highly-
02:22:59.220 | - Hence the N1 and the R1.
02:23:01.740 | I imagine that increases.
02:23:03.180 | - NX, NXRX.
02:23:04.980 | - Yeah, that's the communication protocol
02:23:09.540 | 'cause Bluetooth allows you to communicate
02:23:12.820 | across farther distances than you need to,
02:23:14.940 | so you can go much shorter.
02:23:16.060 | - Yeah, the only,
02:23:17.380 | the primary motivation for choosing Bluetooth
02:23:19.660 | is that, I mean, everything has Bluetooth.
02:23:21.900 | - All right, so you can talk to any device.
02:23:23.420 | - Interoperability is just absolutely essential,
02:23:26.940 | especially in this early phase,
02:23:29.180 | and in many ways, if you can access a phone or a computer,
02:23:32.900 | you can do anything.
02:23:33.900 | - Well, it'd be interesting to step back
02:23:36.700 | and actually look at, again,
02:23:38.860 | the same pipeline that you mentioned for Nolan.
02:23:41.860 | So what does this whole process look like
02:23:46.260 | from finding and selecting a human being
02:23:50.300 | to the surgery,
02:23:52.340 | to the first time he's able to use this thing?
02:23:55.020 | - So we have what's called a patient registry
02:23:58.780 | that people can sign up to, to hear more about the updates,
02:24:03.460 | and that was the route through which Noland applied,
02:24:07.140 | and the process is that once the application comes in,
02:24:10.780 | it contains some medical records,
02:24:13.180 | and, based on their medical eligibility,
02:24:17.980 | there's a lot of different inclusion and exclusion criteria
02:24:21.180 | for them to meet,
02:24:22.100 | and we go through a pre-screening interview process
02:24:25.100 | with someone from Neuralink,
02:24:26.900 | and at some point, we also go out to their homes
02:24:30.220 | to do a BCI home audit,
02:24:33.220 | 'cause one of the most kind of revolutionary parts
02:24:35.980 | about having this N1 system that is completely wireless
02:24:40.260 | is that you can use it at home,
02:24:41.580 | like you don't actually have to go to the lab
02:24:44.220 | and go to the clinic to get connectorized
02:24:47.820 | to this specialized equipment
02:24:49.220 | that you can't take home with you.
02:24:51.100 | So that's one of the key elements
02:24:54.220 | of when we're designing the system
02:24:56.460 | that we wanted to keep in mind.
02:24:57.660 | People, hopefully, would wanna be able to use this every day
02:25:02.060 | in the comfort of their home.
02:25:03.980 | And so part of our engagement
02:25:07.420 | and what we're looking for during BCI home audit
02:25:09.940 | is to just kind of understand their situation,
02:25:12.060 | what other assistive technology that they use.
02:25:14.180 | - And we should also step back and kind of say
02:25:16.500 | that the estimate is 180,000 people live
02:25:21.420 | with quadriplegia in the United States,
02:25:23.620 | and each year, an additional 18,000 suffer
02:25:26.620 | a paralyzing spinal cord injury.
02:25:29.740 | So these are folks who have a lot of challenges
02:25:34.580 | living a life in terms of accessibility,
02:25:38.300 | in terms of doing the things that many of us
02:25:40.500 | just take for granted day to day.
02:25:42.020 | And one of the things,
02:25:43.300 | one of the goals of this initial study
02:25:46.220 | is to enable them to have sort of digital autonomy
02:25:50.540 | where they, by themselves, can interact
02:25:52.300 | with a digital device using just their mind,
02:25:55.180 | something that you're calling telepathy.
02:25:57.020 | So digital telepathy where a quadriplegic
02:26:01.100 | can communicate with a digital device
02:26:03.460 | in all the ways that we've been talking about.
02:26:07.060 | Control the mouse cursor enough to be able
02:26:10.700 | to do all kinds of stuff, including play games
02:26:12.740 | and tweet and all that kind of stuff.
02:26:14.900 | And there's a lot of people for whom life,
02:26:17.540 | the basics of life are difficult
02:26:20.100 | because of the things that have happened to them, so.
02:26:24.820 | - Yeah, I mean, movement is so fundamental
02:26:28.100 | to our existence.
02:26:29.500 | I mean, even speaking involves movement
02:26:33.300 | of mouth, lips, larynx.
02:26:36.100 | And without that, it's extremely debilitating.
02:26:41.100 | And there are many, many people that we can help.
02:26:46.060 | And I mean, especially if you start to kind of look
02:26:49.820 | at other forms of movement disorders
02:26:53.260 | that are not just from spinal cord injury,
02:26:54.860 | but from ALS, MS, or even stroke that leads you,
02:26:59.860 | or just aging, right?
02:27:03.420 | That leads you to lose some of that mobility,
02:27:06.380 | that independence.
02:27:07.220 | It's extremely debilitating.
02:27:09.180 | - And all of these are opportunities to help people,
02:27:12.020 | to help alleviate suffering,
02:27:13.420 | to help improve the quality of life.
02:27:15.340 | But each of the things you mentioned
02:27:16.660 | is its own little puzzle
02:27:18.540 | that needs to have increasing levels of capability
02:27:22.900 | from a device like a Neuralink device.
02:27:24.860 | And so the first one you're focusing on is,
02:27:27.680 | it's just a beautiful word, telepathy.
02:27:31.380 | So being able to communicate using your mind
02:27:33.900 | wirelessly with a digital device.
02:27:36.260 | Can you just explain this,
02:27:38.260 | exactly what we're talking about?
02:27:40.140 | - Yeah, I mean, it's exactly that.
02:27:41.620 | I mean, I think if you are able to control a cursor
02:27:46.060 | and able to click and be able to get access
02:27:50.180 | to computer or phone, I mean,
02:27:52.700 | the whole world opens up to you.
02:27:55.220 | And I mean, I guess the word telepathy,
02:27:58.660 | if you kind of think about that as just definitionally
02:28:02.980 | being able to transfer information from my brain
02:28:05.900 | to your brain without using some of the physical faculties
02:28:10.900 | that we have, like voices.
02:28:13.860 | - But the interesting thing here is,
02:28:16.020 | I think the thing that's not obviously clear
02:28:18.340 | is how exactly it works.
02:28:20.500 | So in order to move a cursor,
02:28:22.860 | there's at least a couple of ways of doing that.
02:28:27.660 | So one is you imagine yourself
02:28:30.980 | maybe moving a mouse with your hand,
02:28:33.760 | or you can then, which Nolan talked about,
02:28:38.500 | like imagine moving the cursor with your mind.
02:28:41.920 | But it's like, there is a cognitive step here
02:28:46.500 | that's fascinating, 'cause you have to use the brain
02:28:49.180 | and you have to learn how to use the brain.
02:28:51.660 | And you kind of have to figure it out dynamically.
02:28:55.740 | Because you reward yourself if it works.
02:28:58.460 | So you're like, I mean, there's a step that,
02:29:00.700 | this is just a fascinating step,
02:29:02.620 | 'cause you have to get the brain to start firing
02:29:05.140 | in the right way.
02:29:06.300 | And you do that by imagining,
02:29:09.200 | like fake it till you make it.
02:29:12.980 | And all of a sudden, it creates the right kind of signal
02:29:15.580 | that if decoded correctly, can create the kind of effect.
02:29:20.260 | And then there's like noise around that,
02:29:21.620 | you have to figure all of that out.
02:29:22.900 | But on the human side, imagine the cursor moving
02:29:26.340 | is what you have to do.
02:29:27.420 | - Yeah, he says, using the force.
02:29:29.020 | - The force.
02:29:29.860 | I mean, isn't that just like fascinating to you,
02:29:33.740 | that it works?
02:29:34.900 | Like to me, it's like, holy shit, that actually works.
02:29:38.180 | Like you could move a cursor with your mind.
02:29:41.740 | - You know, as much as you're learning to use that thing,
02:29:45.780 | that thing's also learning about you.
02:29:47.820 | Like our model is constantly updating the weights
02:29:50.980 | to say, oh, if someone is thinking about, you know,
02:29:55.980 | these sophisticated forms of spiking patterns,
02:29:59.980 | like that actually means to do this, right?
02:30:02.300 | - So the machine is learning about the human
02:30:04.380 | and the human is learning about the machine.
02:30:05.840 | So there is an adaptability to the signal processing,
02:30:08.820 | the decoding step.
02:30:10.420 | And then there's the adaptation of Nolan, the human being.
02:30:15.180 | Like the same way, if you give me a new mouse,
02:30:18.060 | and I move it, I learn very quickly about its
02:30:20.820 | sensitivity, so I'll learn to move it slower.
02:30:23.620 | And then there's other kinds of signal drift
02:30:28.820 | and all that kind of stuff they have to adapt to.
02:30:30.900 | So both are adapting to each other.
02:30:32.700 | - Correct.
02:30:33.540 | - That's a fascinating, like software challenge
02:30:37.780 | on both sides, the software on both,
02:30:39.660 | on the human software and the--
02:30:41.420 | - The organic and the inorganic.
02:30:43.060 | - The organic and the inorganic.
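As a toy illustration of the machine side of this co-adaptation, the sketch below shows a linear velocity decoder whose weights are nudged online toward an estimate of the user's intended cursor direction. The model form, learning rule, and names are assumptions; Neuralink's actual decoder is not described here.

```python
import numpy as np

class OnlineLinearDecoder:
    """Toy linear decoder mapping binned spike counts to 2-D cursor velocity,
    updated online so the machine keeps adapting while the user adapts too.
    (Illustrative sketch; not Neuralink's actual model or update rule.)"""

    def __init__(self, n_channels: int, lr: float = 1e-3):
        self.W = np.zeros((2, n_channels))  # weights for x/y velocity
        self.lr = lr

    def predict(self, features: np.ndarray) -> np.ndarray:
        # Decoded cursor velocity (vx, vy) from one bin of neural features.
        return self.W @ features

    def update(self, features: np.ndarray, intended_velocity: np.ndarray):
        # One stochastic-gradient step on squared error between the decoded
        # velocity and an estimate of what the user intended, e.g. a vector
        # toward the current target during a (re)calibration task.
        error = intended_velocity - self.predict(features)
        self.W += self.lr * np.outer(error, features)
```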
02:30:44.900 | Anyway, so sorry to rudely interrupt.
02:30:47.220 | So there's the selection that Nolan has passed
02:30:50.420 | with flying colors,
02:30:51.820 | so everything including that it's a BCI-friendly home,
02:30:57.220 | all of that.
02:30:58.260 | So what is the process of the surgery,
02:31:01.340 | the implantation, the first moment
02:31:03.260 | when he gets to use the system?
02:31:06.140 | - The end-to-end, you know, we say patient in to patient out
02:31:09.300 | is anywhere between two to four hours.
02:31:12.940 | In particular case for Nolan,
02:31:14.220 | it was about three and a half hours.
02:31:16.140 | And there's many steps leading to, you know,
02:31:18.820 | the actual robot insertion, right?
02:31:20.180 | So there's anesthesia induction,
02:31:22.900 | and we do intra-op CT imaging to make sure that we're,
02:31:27.900 | you know, drilling the hole in the right location.
02:31:29.940 | And this is also pre-planned beforehand.
02:31:32.340 | Someone goes through,
02:31:34.580 | someone like Nolan would go through fMRI,
02:31:36.780 | and then they can think about wiggling their hand,
02:31:40.700 | you know, obviously due to their injury,
02:31:42.740 | it's not gonna actually lead to any sort of intended output,
02:31:47.500 | but it's the same part of the brain
02:31:49.660 | that actually lights up
02:31:50.540 | when you're imagining moving your finger
02:31:52.860 | as when you're actually moving your finger.
02:31:54.660 | And that's one of the ways
02:31:55.660 | in which we can actually know where to place our threads.
02:32:00.460 | 'Cause we wanna go into what's called the hand knob area
02:32:03.060 | in the motor cortex,
02:32:04.420 | and, you know, as much as possible,
02:32:07.060 | densely put our electrode threads.
02:32:09.220 | So yeah, we do intra-op CT imaging to make sure
02:32:15.260 | and double check the location of the craniectomy.
02:32:18.380 | And surgeon comes in,
02:32:21.940 | does their thing in terms of like skin incision,
02:32:26.620 | craniectomy, so drilling of the skull.
02:32:28.620 | And then there's many different layers of the brain.
02:32:30.780 | There's what's called a dura,
02:32:32.140 | which is a very, very thick layer that surrounds the brain.
02:32:36.060 | That gets actually resected in a process called durectomy.
02:32:39.500 | And that then exposes the pia of the brain
02:32:42.500 | that you wanna insert into.
02:32:43.820 | And by the time it's been around,
02:32:45.780 | anywhere between one to one and a half hours,
02:32:47.900 | robot comes in, does its thing.
02:32:49.740 | Placement of the targets, inserting of the thread.
02:32:52.380 | That takes anywhere between 20 to 40 minutes
02:32:54.860 | in the particular case for Nolan,
02:32:56.140 | it was just under or just over 30 minutes.
02:32:58.900 | And then after that, the surgeon comes in.
02:33:01.420 | There's a couple other steps of like actually inserting
02:33:03.540 | the dural substitute layer to protect the thread
02:33:06.780 | as well as the brain.
02:33:08.500 | And then, yeah, screw in the implant and then skin flap,
02:33:13.500 | and then suture, and then you're out.
02:33:17.060 | (laughing)
02:33:18.300 | - So when Nolan woke up, what was that like?
02:33:23.300 | Was the recovery like?
02:33:24.900 | And when was the first time he was able to use it?
02:33:27.260 | - So he was actually immediately after the surgery,
02:33:30.900 | like an hour after the surgery, as he was waking up,
02:33:35.060 | we did turn on the device,
02:33:38.020 | make sure that we are recording neural signals.
02:33:40.340 | And we actually did have a couple signals
02:33:43.500 | that we noticed that he can actually modulate.
02:33:46.700 | And what I mean by modulate
02:33:47.900 | is that he can think about clenching his fist
02:33:50.860 | and you could see the spike disappear and appear.
02:33:54.900 | (laughing)
02:33:56.860 | - That's awesome.
02:33:58.060 | - And that was immediate, right?
02:34:00.100 | Immediate after in the recovery room.
02:34:02.460 | - How cool is that?
02:34:04.020 | Yeah, that's a human being.
02:34:07.340 | I mean, what did that feel like for you?
02:34:10.620 | This device in a human being,
02:34:12.540 | a first step of a gigantic journey.
02:34:15.660 | I mean, it's a historic moment.
02:34:17.460 | Even just that spike, just to be able to modulate that.
02:34:22.460 | - You know, obviously there have been other,
02:34:24.740 | you know, as you mentioned, pioneers
02:34:27.140 | that have participated in these groundbreaking BCI,
02:34:30.980 | you know, investigational early feasibility studies.
02:34:36.740 | So we're obviously standing on the shoulders
02:34:39.300 | of the giants here.
02:34:40.140 | We're not the first ones to actually put electrodes
02:34:42.340 | in a human brain.
02:34:44.580 | But I mean, just leading up to the surgery,
02:34:47.700 | there was, I definitely could not sleep.
02:34:50.420 | It's the first time that you're working
02:34:53.380 | in a completely new environment.
02:34:55.180 | We had a lot of confidence based on our benchtop testing,
02:35:01.300 | our preclinical R&D studies that the mechanism,
02:35:06.340 | the threads, the insertion, all that stuff is very safe.
02:35:10.060 | And that it's obviously ready for doing this in a human,
02:35:15.060 | but there's still a lot of unknown, unknown about,
02:35:19.060 | can the needle actually insert?
02:35:22.620 | I mean, we brought something like 40 needles
02:35:26.420 | just in case they break.
02:35:27.420 | And we ended up using only one.
02:35:28.980 | But I mean, that was a level of just complete unknown,
02:35:31.820 | right, 'cause it's a very, very different environment.
02:35:33.980 | And I mean, that's why we do clinical trial
02:35:37.140 | in the first place, to be able to test these things out.
02:35:40.140 | So extreme nervousness and just many, many sleepless nights
02:35:45.140 | leading up to the surgery,
02:35:47.700 | and definitely the day before the surgery.
02:35:49.620 | And it was an early morning surgery.
02:35:50.940 | Like we started at seven in the morning.
02:35:53.100 | And by the time it was around 1030, everything was done.
02:35:58.740 | But I mean, first time seeing that, well, number one,
02:36:03.980 | just huge relief that this thing is doing
02:36:08.780 | what it's supposed to do.
02:36:10.060 | And two, I mean, just immense amount of gratitude
02:36:14.460 | for Nolan and his family.
02:36:16.460 | And then many others that have applied
02:36:18.660 | and that we've spoken to and will speak to are true pioneers
02:36:23.660 | in every way.
02:36:25.500 | And I sort of call them the neural astronauts
02:36:28.860 | or neuronauts.
02:36:29.700 | - Neuronauts, yeah.
02:36:31.380 | - These amazing, just like in the sixties, right?
02:36:34.340 | Like these amazing just pioneers, right?
02:36:36.720 | Exploring the unknown, outwards, in this case it's inward,
02:36:41.520 | but an incredible amount of gratitude for them
02:36:45.620 | to just participate and play a part.
02:36:50.620 | And it's a journey that we're embarking on together.
02:36:57.340 | But also like, I think it was just,
02:36:59.940 | that was a very, very important milestone,
02:37:01.660 | but our work was just starting.
02:37:03.580 | So a lot of just kind of anticipation for, okay,
02:37:08.020 | what needs to happen next?
02:37:09.380 | What is the sequence of events that needs to happen
02:37:11.780 | for us to make it worthwhile for both Nolan as well as us.
02:37:16.780 | - Just to linger on that, just a huge congratulation to you
02:37:20.940 | and the team for that milestone.
02:37:23.100 | I know there's a lot of work left,
02:37:27.000 | but that is really exciting to see.
02:37:30.580 | There's a, that's a source of hope.
02:37:33.860 | It's this first big step opportunity
02:37:37.460 | to help hundreds of thousands of people
02:37:40.620 | and then maybe expand the realm of the possible
02:37:45.500 | for the human mind for millions of people in the future.
02:37:48.060 | So it's really exciting.
02:37:49.920 | So like the opportunities are all ahead of us
02:37:53.860 | and to do that safely and to do that effectively
02:37:56.140 | was really fun to see.
02:37:58.460 | As an engineer, just watching other engineers come together
02:38:01.260 | and do an epic thing, that was awesome.
02:38:02.740 | So huge congrats.
02:38:03.860 | - Thank you, thank you.
02:38:04.700 | It's, yeah, could not have done it without the team.
02:38:06.900 | And yeah, I mean, that's the other thing
02:38:09.220 | that I told the team as well,
02:38:11.900 | of just this immense sense of optimism for the future.
02:38:15.820 | I mean, it was, it's a very important moment
02:38:19.060 | for the company, needless to say,
02:38:22.860 | as well as hopefully for many others out there
02:38:25.900 | that we can help.
02:38:27.360 | - So speaking of challenges,
02:38:29.180 | Neuralink published a blog post
02:38:30.940 | describing that some of the threads retracted.
02:38:34.100 | And so the performance as measured
02:38:36.860 | by bits per second dropped at first,
02:38:39.420 | but then eventually it was regained.
02:38:41.340 | And that, the whole story of how it was regained
02:38:43.660 | is super interesting.
02:38:45.100 | That's definitely something I'll talk to Bliss
02:38:47.660 | and to Nolan about.
02:38:48.740 | But in general, can you speak to this whole experience?
02:38:54.140 | How was the performance regained?
02:38:56.300 | And just the technical aspects
02:39:00.660 | of the threads being retracted and moving?
02:39:03.660 | - The main takeaway is that in the end,
02:39:06.300 | the performance has come back
02:39:07.580 | and it's actually gotten better than it was before.
02:39:10.980 | He's actually just beat the world record yet again last week
02:39:15.620 | to 8.5 BPS.
02:39:18.180 | So, I mean, he's just cranking and he's just improving.
02:39:20.660 | - The previous one that he set was eight.
02:39:22.940 | - Correct, he's at 8.5.
02:39:24.620 | - Yeah, the previous world record in human was 4.6.
02:39:28.700 | So it's almost double.
02:39:30.940 | And his goal is to try to get to 10,
02:39:32.620 | which is roughly around kind of the median Neuralinker
02:39:37.260 | using a mouse with the hand.
02:39:40.260 | So it's getting there.
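For reference, the bits-per-second figure for a grid-selection task is commonly computed in the BCI literature with a formula like the one below; whether Neuralink uses exactly this definition is an assumption, and the example numbers are made up purely to show the scale of an 8.5 BPS score.

```python
import math

def grid_bitrate(n_targets: int, correct: int, incorrect: int, seconds: float) -> float:
    """Commonly used BCI bitrate for a grid-selection task (assumed formula):

        BPS = log2(N - 1) * max(correct - incorrect, 0) / time

    where N is the number of selectable targets on the grid.
    """
    net_correct = max(correct - incorrect, 0)
    return math.log2(n_targets - 1) * net_correct / seconds

# Made-up example: on a 35x35 grid, 60 net-correct selections in ~72 s is ~8.5 BPS.
print(round(grid_bitrate(35 * 35, 60, 0, 72.2), 1))
```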
02:39:42.300 | - So yeah, so the performance was regained.
02:39:45.300 | - Yeah, better than before.
02:39:46.500 | So that's a story on its own
02:39:50.460 | of what took the BCI team to recover that performance.
02:39:55.060 | It was actually mostly on kind of the signal processing end.
02:39:58.140 | So, you know, as I mentioned,
02:39:59.300 | we were kind of looking at these spike outputs
02:40:02.820 | from our electrodes.
02:40:06.620 | And what happened is that kind of four weeks
02:40:09.780 | into the surgery,
02:40:11.700 | we noticed that the threads
02:40:12.940 | have slowly come out of the brain.
02:40:14.900 | And the way in which we noticed this at first,
02:40:16.820 | obviously is that I think Nolan was the first to notice
02:40:20.180 | that his performance was degrading.
02:40:22.060 | And I think at the time,
02:40:25.780 | we were also trying to do a bunch of different experimentation
02:40:29.100 | you know, different algorithms,
02:40:30.180 | different sort of UI, UX.
02:40:33.340 | So it was expected that there will be variability
02:40:35.900 | in the performance,
02:40:37.260 | but we did see kind of a steady decline.
02:40:40.940 | And then also the way in which we measure
02:40:44.340 | the health of the electrodes
02:40:45.620 | or whether they're in the brain or not
02:40:46.940 | is by measuring impedance of the electrode.
02:40:50.180 | So we look at kind of the interfacial
02:40:52.740 | kind of the Randles circuit,
02:40:56.060 | like they say, you know,
02:40:57.540 | the capacitance and the resistance
02:41:00.740 | between the electrode surface and the medium.
02:41:03.060 | And if that changes in some dramatic ways,
02:41:05.500 | we have some indication
02:41:06.780 | or if you're not seeing spikes on those channels,
02:41:08.820 | you have some indications that something's happening there.
02:41:11.620 | And what we noticed is that looking at those impedance plots
02:41:13.900 | and spike rate plots,
02:41:16.020 | and also because we have those electrodes
02:41:18.540 | recording along the depth,
02:41:20.140 | you're seeing some sort of movement
02:41:21.660 | that indicated that the threads were being pulled out.
02:41:24.180 | And that obviously will have an implication
02:41:27.260 | on the model side,
02:41:28.540 | because if the number of inputs
02:41:30.820 | that are going into the model is changing
02:41:33.460 | because you have fewer of them,
02:41:35.100 | the model needs to get updated, right?
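A minimal sketch of the kind of bookkeeping described here: flagging channels whose impedance has shifted sharply or that have gone silent, so the decoder can be refit on the surviving inputs. The thresholds and function names are illustrative assumptions, not Neuralink's actual criteria.

```python
import numpy as np

def healthy_channel_mask(impedance_now: np.ndarray, impedance_baseline: np.ndarray,
                         spike_rate_hz: np.ndarray,
                         max_impedance_change: float = 2.0,
                         min_rate_hz: float = 0.1) -> np.ndarray:
    """Flag channels that still look usable (illustrative thresholds).

    A channel is kept if its impedance hasn't shifted by more than a
    `max_impedance_change`-fold factor from baseline and it still shows
    spikes. The surviving mask defines the (possibly smaller) input set
    that the decoding model then has to be refit on.
    """
    ratio = impedance_now / impedance_baseline
    stable = (ratio < max_impedance_change) & (ratio > 1.0 / max_impedance_change)
    active = spike_rate_hz > min_rate_hz
    return stable & active
```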
02:41:38.620 | But there were still signals.
02:41:43.620 | And as I mentioned, similar to how,
02:41:45.020 | even when you place the electrodes
02:41:46.420 | on the surface of the brain or farther away,
02:41:49.740 | like outside the skull,
02:41:50.580 | you still see some useful signals.
02:41:53.500 | What we started looking at is not just the spike occurrence
02:41:56.660 | through this BOSS algorithm that I mentioned,
02:41:59.700 | but we started looking at just the power
02:42:03.140 | of the frequency band that is interesting
02:42:06.460 | for Nolan to be able to modulate.
02:42:10.700 | So once we kind of changed the algorithm
02:42:13.740 | for the implant to not just give you the BOSS output,
02:42:17.900 | but also these spike band power output,
02:42:21.020 | that helped us sort of find the model
02:42:23.700 | with the new set of inputs.
02:42:25.140 | And that was the thing that really ultimately gave us
02:42:28.020 | the performance back.
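Spike-band power is typically computed by band-passing the raw signal around the spike band and averaging its power per bin. Below is a generic sketch using SciPy; the band edges, filter order, and bin width are assumptions rather than Neuralink's actual parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def spike_band_power(raw: np.ndarray, fs: float = 20_000.0,
                     band=(500.0, 5_000.0), bin_ms: float = 15.0) -> np.ndarray:
    """Per-bin spike-band power for one channel (illustrative band edges).

    Unlike a binary spike/no-spike output, this keeps a graded feature
    that can still carry information when spike amplitudes shrink,
    e.g. when electrodes move farther from the neurons they record.
    """
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, raw)
    samples_per_bin = int(fs * bin_ms / 1e3)
    n_bins = len(filtered) // samples_per_bin
    trimmed = filtered[: n_bins * samples_per_bin].reshape(n_bins, samples_per_bin)
    return np.mean(trimmed ** 2, axis=1)
```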
02:42:29.060 | You know, in terms of,
02:42:32.140 | and obviously like the thing that we want ultimately,
02:42:37.300 | and the thing that we are working towards
02:42:39.260 | is figuring out ways in which we can keep those threads
02:42:42.140 | intact for as long as possible,
02:42:45.220 | so that we have many more channels going into the model.
02:42:48.500 | That's by far the number one priority
02:42:50.900 | that the team is currently embarking on
02:42:52.900 | to understand how to prevent that from happening.
02:42:55.340 | The thing that I will say also is that,
02:42:58.900 | you know, as I mentioned, this is the first time ever
02:43:02.260 | that we're putting these threads in a human brain.
02:43:05.060 | And, you know, human brain, just for size reference,
02:43:07.580 | is 10 times that of the monkey brain or the sheep brain.
02:43:11.900 | And it's just a very, very different environment.
02:43:15.220 | It moves a lot more.
02:43:16.500 | It's like actually moved a lot more than we expected
02:43:19.300 | when we did Nolan's surgery.
02:43:22.300 | And it's just a very, very different environment
02:43:26.420 | than what we're used to.
02:43:27.460 | And this is why we do clinical trial, right?
02:43:29.820 | We wanna uncover some of these issues
02:43:33.420 | and failure modes earlier than later.
02:43:37.180 | So in many ways, it's provided us
02:43:39.460 | with this enormous amount of data
02:43:42.380 | and information to be able to solve this.
02:43:47.380 | And this is something that Neuralink is extremely good at.
02:43:49.460 | Once we have a set of clear objectives and an engineering problem,
02:43:53.220 | we have an enormous amount of talent
02:43:55.180 | across many, many disciplines to be able to come together
02:43:58.940 | and fix the problem very, very quickly.
02:44:01.180 | - But it sounds like one of the fascinating challenges here
02:44:04.340 | is for the system and the decoding side
02:44:07.620 | to be adaptable across different timescales.
02:44:10.100 | So whether it's movement of threads
02:44:14.100 | or different aspects of signal drift,
02:44:16.260 | sort of on the software of the human brain,
02:44:18.180 | something changing, like Nolan talks about cursor drift,
02:44:22.260 | that could be corrected.
02:44:25.260 | And there's a whole UX challenge to how to do that.
02:44:27.780 | So it sounds like adaptability is like a fundamental property
02:44:32.500 | that has to be engineered in.
02:44:34.780 | - It is.
02:44:35.620 | And I mean, I think, I mean,
02:44:37.700 | as a company, we're extremely vertically integrated.
02:44:41.140 | You know, we make these thin film arrays
02:44:42.980 | in our own microfab.
02:44:45.620 | - Yeah, there's, like you said, built in house.
02:44:47.900 | This whole paragraph here from this blog post
02:44:50.060 | is pretty gangster.
02:44:51.060 | Building the technologies described above
02:44:53.900 | has been no small feat.
02:44:55.940 | And there's a bunch of links here
02:44:57.220 | that I recommend people click on.
02:44:59.220 | We constructed in-house microfabrication capabilities
02:45:02.300 | to rapidly produce various iterations of thin film arrays
02:45:05.660 | that constitute our electrode threads.
02:45:08.140 | We created a custom femtosecond laser mill
02:45:12.460 | to manufacture components with micro-level precision.
02:45:15.380 | I think there's a tweet associated with this.
02:45:16.220 | - That's a whole thing that we can get into.
02:45:18.620 | - Yeah, this, okay.
02:45:20.220 | What are we looking at here?
02:45:23.140 | This thing.
02:45:24.660 | So in less than one minute,
02:45:26.340 | our custom-made femtosecond laser mill
02:45:29.340 | cuts this geometry in the tips of our needles.
02:45:33.300 | So we're looking at this weirdly shaped needle.
02:45:38.300 | The tip is only 10 to 12 microns in width,
02:45:41.500 | only slightly larger than the diameter of a red blood cell.
02:45:44.980 | The small size allows threads to be inserted
02:45:46.820 | with minimal damage to the cortex.
02:45:48.220 | Okay, so what's interesting about this geometry?
02:45:51.300 | So we're looking at this geometry of a needle.
02:45:53.660 | - This is the needle that's engaging
02:45:56.180 | with the loops in the thread.
02:45:58.940 | So they're the ones that thread the loop
02:46:03.460 | and then peel it from the silicon backing.
02:46:06.740 | And then this is the thing
02:46:08.140 | that gets inserted into the tissue.
02:46:10.860 | And then this pulls out, leaving the thread.
02:46:13.500 | And this kind of a notch,
02:46:15.740 | or the shark tooth that we used to call,
02:46:19.060 | is the thing that actually is grasping the loop.
02:46:23.860 | And then it's designed in such a way
02:46:26.340 | such that when you pull out, it leaves the loop.
02:46:28.700 | - And the robot is controlling this needle.
02:46:30.940 | - Correct.
02:46:31.780 | So this is actually housed in a cannula.
02:46:33.540 | And basically the robot has a lot of the optics
02:46:36.900 | that look for where the loop is.
02:46:39.180 | There's actually a 405 nanometer light
02:46:41.460 | that actually causes the polyimide to fluoresce
02:46:45.580 | so that you can locate the location of the loop.
02:46:48.660 | - So the loop lights up?
02:46:50.380 | - Yeah, yeah, they do.
02:46:51.700 | It's a micron precision process.
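As a rough illustration of that optical step, the sketch below estimates the centroid of a fluorescing loop from a single camera frame by intensity thresholding. This is a hypothetical simplification, not the robot's actual vision pipeline.

```python
import numpy as np

def locate_loop(fluorescence_frame: np.ndarray, threshold_fraction: float = 0.8):
    """Estimate the (row, col) centroid of a fluorescing loop in one frame
    (illustrative only; not the robot's actual vision stack).

    fluorescence_frame: 2-D array of pixel intensities from the camera
    channel that sees the 405 nm-excited polyimide fluorescence.
    """
    # Keep only the brightest pixels, assumed to belong to the glowing loop.
    cutoff = threshold_fraction * fluorescence_frame.max()
    mask = fluorescence_frame >= cutoff
    if not mask.any():
        return None  # no loop visible in this frame
    rows, cols = np.nonzero(mask)
    weights = fluorescence_frame[rows, cols]
    centroid = (np.average(rows, weights=weights), np.average(cols, weights=weights))
    return centroid  # pixel coordinates, to be mapped into robot coordinates
```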
02:46:54.820 | - What's interesting about the robot
02:46:56.100 | that it takes to do that, that's pretty crazy.
02:46:58.660 | That's pretty crazy that a robot is able
02:47:00.340 | to get this kind of precision.
02:47:01.860 | - Yeah, our robot is quite heavy,
02:47:03.980 | our current version of it.
02:47:06.020 | There's, I mean, it's like a giant granite slab
02:47:10.340 | that weighs about a ton.
02:47:12.060 | 'Cause it needs to be insensitive to vibration,
02:47:15.380 | environmental vibration.
02:47:16.620 | And then as the head is moving,
02:47:18.020 | at the speed that it's moving,
02:47:20.020 | there's a lot of kind of motion control
02:47:22.740 | to make sure that you can achieve that level of precision.
02:47:28.180 | A lot of optics that kind of zoom in on that.
02:47:30.820 | We're working on next generation of the robot
02:47:33.100 | that is lighter, easier to transport.
02:47:35.180 | I mean, it is a feat to move the robot.
02:47:37.900 | - And it's far superior to a human surgeon
02:47:40.300 | at this time for this particular task.
02:47:42.220 | - Absolutely.
02:47:43.060 | I mean, let alone you try to actually thread a loop
02:47:45.260 | in a sewing kit.
02:47:47.140 | I mean, this is like, we're talking like fractions
02:47:50.100 | of human hair.
02:47:51.500 | These things are, it's not visible.
02:47:54.100 | - So continuing the paragraph,
02:47:55.460 | we developed novel hardware and software testing systems
02:47:57.780 | such as our accelerated lifetime testing racks
02:48:00.020 | and simulated surgery environment, which is pretty cool.
02:48:02.780 | To stress test and validate the robustness
02:48:04.700 | of our technologies.
02:48:05.940 | We performed many rehearsals of our surgeries
02:48:09.060 | to refine our procedures and make them second nature.
02:48:13.740 | This is pretty cool.
02:48:14.860 | We practice surgeries on proxies
02:48:18.020 | with all the hardware and instruments needed
02:48:19.700 | in our mock OR in the engineering space.
02:48:22.260 | This helps us rapidly test and measure.
02:48:24.100 | So there's like proxies.
02:48:25.460 | - Yeah, this proxy is super cool, actually.
02:48:27.460 | So there's a 3D printed skull
02:48:31.420 | from the images that is taken at Barrow,
02:48:35.060 | as well as this hydrogel mix,
02:48:38.260 | you know, sort of synthetic polymer thing
02:48:40.260 | that actually mimics the mechanical properties of the brain.
02:48:43.260 | It also has vasculature of the person.
02:48:48.820 | So basically what we're talking about here,
02:48:52.740 | and there's a lot of work that has gone
02:48:55.460 | into making this proxy that, you know,
02:49:00.180 | it's about like finding the right concentration
02:49:02.060 | of these different synthetic polymers
02:49:03.900 | to get the right set of consistency
02:49:05.380 | for the needle dynamics, you know,
02:49:07.220 | as they're being inserted.
02:49:08.780 | But we practice this surgery with the person,
02:49:13.780 | you know, Nolan's basically physiology and brain,
02:49:17.220 | many, many times prior to actually doing the surgery.
02:49:20.860 | - So to every step, every step.
02:49:22.980 | - Every step, yeah.
02:49:23.820 | Like where does someone stand?
02:49:25.460 | Like, I mean, like what you're looking at is the picture.
02:49:27.660 | This is in our office of this kind of corner
02:49:32.660 | of the robot engineering space that we, you know,
02:49:35.340 | have created this like mock OR space
02:49:37.660 | that looks exactly like what they would experience,
02:49:40.260 | all the staff would experience during their actual surgery.
02:49:42.300 | So, I mean, it's just kind of like any dance rehearsal
02:49:45.620 | where you know exactly where you're gonna stand
02:49:47.300 | at what point, and you just practice that over
02:49:49.980 | and over and over again with an exact anatomy
02:49:53.300 | of someone that you're going to surgerize.
02:49:55.740 | And it got to a point where a lot of our engineers,
02:49:59.420 | when we created a craniectomy, they're like,
02:50:01.140 | "Oh, that looks very familiar.
02:50:03.040 | "We've seen that before."
02:50:04.940 | - Yeah, and there's wisdom you can gain
02:50:08.180 | through doing the same thing over and over and over.
02:50:09.740 | It's like "Jiro Dreams of Sushi" kind of thing.
02:50:12.540 | Because then it's like Olympic athletes visualize
02:50:17.340 | the Olympics, and then once you actually show up,
02:50:22.460 | it feels easy, it feels like any other day.
02:50:25.780 | It feels almost boring winning the gold medal.
02:50:28.620 | 'Cause you visualize this so many times,
02:50:31.140 | you've practiced this so many times,
02:50:33.220 | and nothing about it is new, it's boring.
02:50:35.300 | You win the gold medal, it's boring.
02:50:37.260 | And the experience they talk about is mostly just relief.
02:50:40.540 | Probably that they don't have to visualize it anymore.
02:50:44.700 | - Yeah, the power of the mind to visualize,
02:50:47.060 | and where, I mean, there's a whole field that studies
02:50:50.140 | where muscle memory lies in cerebellum.
02:50:53.880 | Yeah, it's incredible.
02:50:54.980 | I think it's a good place to actually ask
02:50:59.420 | sort of the big question that people might have is,
02:51:01.340 | how do we know every aspect of this
02:51:03.820 | that you described is safe?
02:51:05.980 | At the end of the day, the gold standard
02:51:07.500 | is to look at the tissue.
02:51:08.740 | What sort of trauma did you cause the tissue?
02:51:12.540 | And does that correlate to whatever behavioral anomalies
02:51:15.540 | that you may have seen?
02:51:17.500 | And that's the language to which we can communicate
02:51:21.900 | about the safety of inserting something into the brain
02:51:25.660 | and what type of trauma that you can cause.
02:51:27.780 | So we actually have an entire department,
02:51:32.780 | department of pathology that looks at these tissue slices.
02:51:38.340 | There are many steps that are involved in doing this
02:51:40.940 | once you have studies that are launched
02:51:44.900 | with particular endpoints in mind.
02:51:48.740 | At some point you have to euthanize the animal,
02:51:50.740 | and then you go through a necropsy
02:51:52.940 | to kind of collect the brain tissue samples.
02:51:55.100 | You fix them in formalin, and you like gross them,
02:51:59.300 | you section them, and you look at individual slices
02:52:01.860 | just to see what kind of reaction or lack thereof exists.
02:52:04.780 | So that's the kind of the language to which FDA speaks,
02:52:08.700 | and as well for us to kind of evaluate the safety
02:52:12.460 | of the insertion mechanism as well as the threads
02:52:15.620 | at various different time points, both acute,
02:52:18.980 | so anywhere between zero to three months
02:52:23.580 | to beyond three months.
02:52:25.300 | - So those are kind of the details
02:52:27.100 | of an extremely high standard of safety
02:52:30.220 | that has to be reached.
02:52:31.780 | - Correct.
02:52:32.620 | - FDA supervises this, but there's in general
02:52:35.900 | just a very high standard.
02:52:36.900 | And every aspect of this, including the surgery,
02:52:39.300 | I think Matthew McDougall has mentioned
02:52:41.860 | that the standard is, let's say, how to put it politely,
02:52:46.860 | higher than maybe some other operations
02:52:52.020 | that we take for granted.
02:52:53.220 | So the standard for all the surgical stuff here
02:52:56.460 | is extremely high.
02:52:57.500 | - Very high.
02:52:58.340 | I mean, it's a highly, highly regulated environment
02:53:01.340 | with the governing agencies that scrutinize
02:53:06.180 | every medical device that gets marketed.
02:53:09.260 | And I think it's a good thing.
02:53:11.180 | It's good to have those high standards.
02:53:13.460 | And we try to hold extremely high standards
02:53:16.660 | to kind of understand what sort of damage, if any,
02:53:20.500 | these innovative emerging technologies
02:53:22.780 | and new technologies that we're building are.
02:53:24.740 | And so far, we have been extremely impressed
02:53:29.580 | by the lack of immune response from these threads.
02:53:34.020 | - Speaking of which, you talked to me with excitement
02:53:38.940 | about the histology and some of the images
02:53:41.540 | that you were able to share.
02:53:43.220 | Can you explain to me what we're looking at?
02:53:46.020 | - Yeah, so what you're looking at is a stained tissue image.
02:53:50.700 | So this is a sectioned tissue slice
02:53:54.500 | from an animal that was implanted for seven months,
02:53:56.860 | so kind of a chronic time point.
02:53:58.780 | And you're seeing all these different colors
02:54:02.020 | and each color indicates specific types of cell types.
02:54:06.020 | So purple and pink are astrocytes and microglia,
02:54:10.020 | respectively, they're type of glial cells.
02:54:12.540 | And the other thing that people may not be aware of
02:54:15.660 | is your brain is not just made up
02:54:17.740 | of a soup of neurons and axons.
02:54:19.780 | There are other cells like glial cells
02:54:24.780 | that actually kind of is the glue
02:54:27.060 | and also react if there are any trauma
02:54:30.820 | or damage to the tissue.
02:54:32.300 | - But the brown are the neurons here.
02:54:33.900 | - The brown are the neurons.
02:54:34.940 | - So modern neurons.
02:54:36.500 | - So what you're seeing is in this kind of macro image,
02:54:39.420 | you're seeing these like circle highlighted in white,
02:54:44.020 | the insertion sites.
02:54:45.140 | And when you zoom into one of those,
02:54:48.460 | you see the threads.
02:54:49.500 | And then in this particular case,
02:54:50.900 | I think we're seeing about the 16 wires
02:54:54.340 | that are going into the page.
02:54:56.380 | And the incredible thing here is the fact
02:54:58.420 | that you have the neurons that are these brown structures
02:55:01.500 | or brown circular or elliptical thing
02:55:04.140 | that are actually touching and abutting the threads.
02:55:07.820 | So what this is saying is that there's basically
02:55:09.940 | zero trauma that's caused during this insertion.
02:55:12.860 | And with these neural interfaces,
02:55:15.020 | these microelectrodes that you insert,
02:55:17.660 | that is one of the most common mode of failure.
02:55:19.860 | So when you insert these threads like the Utah array,
02:55:23.180 | it causes neuronal death around the site
02:55:26.300 | because you're inserting a foreign object, right?
02:55:29.180 | And that kind of elicit these like immune response
02:55:32.660 | through microglia and astrocytes.
02:55:34.700 | They form this like protective layer around it.
02:55:37.700 | Not only are you killing the neuron cells,
02:55:39.380 | but you're also creating this protective layer
02:55:41.780 | that then basically prevents you
02:55:43.820 | from recording neural signals
02:55:45.260 | 'cause you're getting further and further away
02:55:46.660 | from the neurons that you're trying to record.
02:55:48.460 | And that is the biggest mode of failure.
02:55:50.340 | And in this particular example,
02:55:52.340 | in that inset, it's about 50 micron with that scale bar,
02:55:55.940 | the neurons just seem to be attracted to it.
02:55:58.140 | - And so there's certainly no trauma.
02:56:01.060 | That's such a beautiful image, by the way.
02:56:03.420 | So the brown of the neurons,
02:56:06.220 | for some reason, I can't look away.
02:56:07.540 | It's really cool.
02:56:08.380 | - Yeah, and the way that these things,
02:56:10.060 | I mean, your tissues generally
02:56:11.860 | don't have these beautiful colors.
02:56:13.860 | This is multiplex stain that uses these different proteins
02:56:18.620 | that are staining these at different colors.
02:56:21.220 | We use very standard set of staining techniques
02:56:24.940 | with H&E, IBA1, NeuN, and GFAP.
02:56:29.860 | So if you go to the next image,
02:56:31.500 | this is also kind of illustrates the second point
02:56:33.540 | 'cause you can make an argument.
02:56:34.900 | And initially when we saw the previous image,
02:56:37.340 | we said, "Oh, like, are the threads just floating?
02:56:39.300 | "Like, what is happening here?
02:56:40.980 | "Are we actually looking at the right thing?"
02:56:42.780 | So what we did is we did another stain,
02:56:45.140 | and this is all done in-house,
02:56:46.780 | of this Masson's trichrome stain,
02:56:50.140 | which is in blue that shows these collagen layers.
02:56:53.020 | So the blue, basically,
02:56:54.780 | like, you don't want the blue around the implant threads
02:56:58.260 | 'cause that means that there's some sort of
02:57:00.140 | scarring that's happened.
02:57:01.980 | And what you're seeing, if you look at individual threads,
02:57:04.260 | is that you don't see any of the blue,
02:57:06.140 | which means that there has been absolutely,
02:57:10.060 | or very, very minimal to a point
02:57:12.100 | where it's not detectable amount of trauma
02:57:14.260 | in these inserted threads.
02:57:16.140 | - So that presumably is one of the big benefits
02:57:18.340 | of having this kind of flexible thread.
02:57:20.940 | - Yeah, so we think this is primarily due to the size,
02:57:24.900 | as well as the flexibility of the threads.
02:57:27.100 | Also the fact that R1 is avoiding vasculature,
02:57:30.780 | so we're not disrupting,
02:57:32.660 | or we're not causing damage to the vessels,
02:57:36.980 | and not breaking any of the blood-brain barrier,
02:57:40.580 | has basically caused the immune response to be muted.
02:57:44.540 | - But this is also a nice illustration
02:57:47.460 | of the size of things.
02:57:49.140 | So this is the tip of the thread.
02:57:51.020 | - Yeah, those are neurons, they're--
02:57:53.300 | - And they're neurons, and this is the thread listening.
02:57:56.820 | And the electrodes are positioned how?
02:57:59.220 | - Yeah, so this is, what you're looking at
02:58:00.780 | is not electrode themselves, those are the conductive wires.
02:58:04.420 | So each of those should probably be two micron in width.
02:58:08.340 | So what we're looking at
02:58:11.420 | is we're looking at the coronal slice.
02:58:12.940 | So we're looking at some slice of the tissue.
02:58:15.500 | So as you go deeper, you'll obviously have
02:58:17.780 | less and less of the tapering of the thread.
02:58:21.100 | But yeah, the point basically being
02:58:25.540 | that there's just kind of cells around the inserted site,
02:58:28.740 | which is just an incredible thing to see.
02:58:31.260 | I've just never seen anything like this.
02:58:33.660 | - How easy and safe is it to remove the implant?
02:58:36.260 | - Yeah, so it depends on when.
02:58:41.220 | In the first three months or so after the surgery,
02:58:45.060 | there's a lot of kind of tissue remodeling that's happening,
02:58:49.740 | similar to when you get a cut.
02:58:51.780 | Over the first couple of weeks,
02:58:56.780 | depending on the size of the wound,
02:59:00.220 | you obviously have scar tissue forming, right?
02:59:01.700 | The wound contracts,
02:59:03.180 | and then in the end it turns into a scab
02:59:04.940 | and you can peel it off.
02:59:06.180 | The same thing happens in the brain.
02:59:07.980 | And it's a very dynamic environment.
02:59:10.980 | And before the scar tissue or the neomembrane
02:59:13.780 | or the new membrane that forms,
02:59:16.380 | it's quite easy to just pull them out.
02:59:18.500 | And there's minimal trauma that's caused during that.
02:59:22.500 | Once the scar tissue forms, and with Nolan as well,
02:59:27.100 | we believe that that's the thing
02:59:28.260 | that's currently anchoring the threads.
02:59:29.900 | So we haven't seen any more movements since then.
02:59:33.420 | So they're quite stable.
02:59:34.740 | It gets harder to actually completely extract the threads.
02:59:40.460 | So our current method for removing the device
02:59:44.940 | is cutting the thread, leaving the tissue intact,
02:59:49.220 | and then unscrewing and taking the implant out.
02:59:52.380 | And that hole is now gonna be plugged
02:59:55.820 | with either another neural link,
02:59:59.380 | or just with kind of a PEEK-based, plastic cap.
03:00:04.380 | - Is it okay to leave the threads in there forever?
03:00:09.300 | - Yeah, we think so.
03:00:10.140 | We've done studies where we left them there.
03:00:13.780 | And one of the biggest concerns that we had
03:00:15.380 | is like, do they migrate?
03:00:16.580 | And do they get to a point where they should not be?
03:00:19.260 | We haven't seen that.
03:00:20.100 | Again, once the scar tissue forms, they get anchored in place.
03:00:24.060 | And I should also say that when we say upgrades,
03:00:27.940 | we're not just talking in theory here.
03:00:31.060 | Like we've actually upgraded many, many times.
03:00:34.540 | Most of our monkeys or non-human primates, NHP,
03:00:39.620 | have been upgraded.
03:00:40.780 | Pager, who you saw playing MindPong,
03:00:43.460 | has the latest version of the device since two years ago,
03:00:46.540 | and is seemingly very happy and healthy and fat.
03:00:49.420 | - So what's the design for the future upgrade procedure?
03:00:56.020 | So maybe for Noland, what would the upgrade look like?
03:01:01.500 | It was essentially, what you're mentioning,
03:01:05.340 | is there a way to upgrade sort of the device internally
03:01:10.340 | where you take it apart and sort of keep the capsule
03:01:13.420 | and upgrade the internals?
03:01:14.940 | - Yeah, so there are a couple of different things here.
03:01:16.700 | So for Noland, if we were to upgrade,
03:01:18.820 | what we would have to do is either cut the threads
03:01:21.660 | or extract the threads,
03:01:23.260 | depending on kind of the situation there
03:01:27.180 | in terms of how they're anchored or scarred in.
03:01:29.580 | If you were to remove them with the dural substitute,
03:01:33.940 | you have an intact brain.
03:01:35.260 | So you can reinsert different threads
03:01:37.740 | with the updated implant package.
03:01:40.380 | There are a couple of different other ways
03:01:46.660 | that we're thinking about the future
03:01:48.300 | of what the upgradable system looks like.
03:01:50.620 | One is, at the moment, we currently remove the dura,
03:01:55.460 | this kind of thick layer that protects the brain.
03:01:58.860 | But that actually is the thing
03:02:00.020 | that actually proliferates the scar tissue formation.
03:02:03.060 | So typically, the general good rule of thumb
03:02:05.340 | is you wanna leave the nature as is
03:02:08.300 | and not disrupt it as much.
03:02:09.420 | So we're looking at ways to insert the threads
03:02:12.900 | through the dura,
03:02:14.660 | which comes with different set of challenges,
03:02:17.300 | such as it's a pretty thick layer.
03:02:20.940 | So how do you actually penetrate that
03:02:22.020 | without breaking the needle?
03:02:23.260 | So we're looking at different needle design for that,
03:02:25.660 | as well as the kind of the loop engagement.
03:02:28.140 | The other big challenge is that it's quite opaque optically
03:02:31.820 | under white light illumination.
03:02:33.540 | So how do you still retain this biggest advantage
03:02:37.860 | that we have of avoiding vasculature?
03:02:40.100 | How do you image through that?
03:02:40.980 | How do you actually still mediate that?
03:02:42.300 | So there are other imaging techniques
03:02:43.860 | that we're looking at to enable that.
03:02:46.140 | But our hypothesis is that,
03:02:48.700 | and based on some of the early evidence that we have,
03:02:51.820 | doing through the dura insertion
03:02:53.260 | will cause minimal scarring
03:02:55.060 | that causes them to be much easier to extract over time.
03:02:58.700 | And the other thing that we're also looking at,
03:03:00.500 | this is gonna be a fundamental change
03:03:03.540 | in the implant architecture,
03:03:05.300 | is at the moment it's a monolithic single implant
03:03:09.780 | that comes with a thread that's bonded together.
03:03:12.780 | So you can't actually separate the thing out,
03:03:14.260 | but you can imagine having two part implant,
03:03:16.740 | bottom part that is the thread that are inserted,
03:03:20.540 | that has the chips and maybe a radio and some power source.
03:03:25.220 | And then you have another implant
03:03:27.140 | that has more of the computational heavy load
03:03:29.500 | and the bigger battery.
03:03:30.900 | And then one can be under the dura,
03:03:33.140 | one can be above the dura,
03:03:34.340 | like being the plug for the skull.
03:03:36.500 | They can talk to each other,
03:03:37.380 | but the thing that you wanna upgrade,
03:03:38.860 | the computer and not the threads,
03:03:40.980 | if you wanna upgrade that, you just go in there,
03:03:43.020 | remove the screws and then put in the next version
03:03:45.540 | and you're off to, it's a very, very easy surgery too.
03:03:49.020 | Like you do a skin incision, slip this in, screw,
03:03:52.740 | probably be able to do this in 10 minutes.
03:03:55.220 | - So that would allow you to reuse the threads sort of.
03:03:57.660 | - Correct.
03:03:59.260 | - So, I mean, this leads to the natural question of,
03:04:01.860 | what is the pathway to scaling
03:04:03.660 | the increase in the number of threads?
03:04:06.060 | Is that a priority?
03:04:07.540 | Is that like, what's the technical challenge there?
03:04:11.220 | - Yeah, that is a priority.
03:04:12.460 | So for next versions of the implant,
03:04:14.860 | the key metrics that we're looking to improve
03:04:17.380 | are number of channels,
03:04:19.020 | just recording from more and more neurons.
03:04:21.140 | We have a pathway to actually go from currently 1000
03:04:25.140 | to hopefully 3000, if not 6000 by end of this year.
03:04:28.940 | And then end of next year,
03:04:31.620 | we wanna get to even more, 16,000.
03:04:35.260 | - Wow.
03:04:36.580 | - There's a couple of limitations to that.
03:04:37.940 | One is obviously being able to photolithographically
03:04:41.140 | print those wires.
03:04:42.900 | As I mentioned, it's two micron in width and spacing.
03:04:46.740 | Obviously there are chips that are much more advanced
03:04:49.580 | than those types of resolution.
03:04:50.740 | And we have some of the tools that we have brought in house
03:04:53.340 | to be able to do that.
03:04:54.980 | So traces will be narrower,
03:04:56.860 | just so that you have to have more of the wires
03:04:58.820 | coming up into the chip.
03:05:00.700 | Chips also cannot linearly consume more energy
03:05:07.220 | as you have more and more channels.
03:05:08.780 | So there's a lot of innovations in the circuit,
03:05:11.140 | in architecture as well as the circuit design topology
03:05:14.980 | to make them lower power.
03:05:17.140 | You need to also think about if you have all of these spikes,
03:05:20.380 | how do you send that off to the end application?
03:05:22.700 | So you need to think about bandwidth limitation there
03:05:25.220 | and potentially innovations in signal processing.
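A back-of-the-envelope sketch of why channel count runs into the radio budget: sending one feature per channel per bin grows linearly with channel count, and at 16,000 channels the raw payload alone lands in the multi-megabit-per-second range, well above what a low-power Bluetooth-class link comfortably sustains. The bin width and bits-per-feature below are assumptions, and packet overhead and compression are ignored.

```python
def telemetry_kbps(n_channels: int, bin_ms: float = 15.0, bits_per_feature: int = 8) -> float:
    """Rough payload rate for sending one feature per channel per bin
    (back-of-the-envelope; overhead and compression ignored)."""
    bins_per_second = 1_000.0 / bin_ms
    return n_channels * bits_per_feature * bins_per_second / 1_000.0

for n in (1_000, 3_000, 6_000, 16_000):
    print(f"{n:>6} channels -> ~{telemetry_kbps(n):.0f} kbps")
```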
03:05:27.660 | Physically, one of the biggest challenges
03:05:30.740 | is gonna be the interface.
03:05:34.380 | It's always the interface that breaks,
03:05:40.980 | bonding the thin-film array to the electronics.
03:05:40.980 | It starts to become very, very highly dense interconnects.
03:05:45.580 | So how do you connectorize that?
03:05:45.580 | There's a lot of innovations in kind of the 3D integrations
03:05:49.380 | in the recent years that we can take advantage of.
03:05:53.620 | One of the biggest challenges that we do have
03:05:55.300 | is forming this hermetic barrier, right?
03:05:59.260 | This is an extremely harsh environment
03:06:00.820 | that we're in, the brain.
03:06:02.580 | So how do you protect it from,
03:06:04.780 | yeah, like the brain trying to kill your electronics
03:06:09.380 | and also your electronics leaking things
03:06:12.020 | that you don't want into the brain
03:06:13.220 | and that forming that hermetic barrier
03:06:15.060 | is gonna be a very, very big challenge
03:06:16.740 | that we, I think are actually well suited to tackle.
03:06:20.380 | - How do you test that?
03:06:21.580 | Like what's the development environment
03:06:23.940 | to simulate that kind of harshness?
03:06:25.780 | - Yeah, so this is where the accelerated life tester
03:06:28.940 | essentially is a brain in a vat.
03:06:30.780 | It literally is a vessel that is made up of,
03:06:35.620 | and again, for all intents and purpose
03:06:38.820 | for this particular types of tests,
03:06:40.860 | your brain is a salt water.
03:06:43.780 | And you can also put some other set of chemicals
03:06:48.780 | like reactive oxygen species
03:06:51.060 | that get at kind of these interfaces
03:06:53.500 | and trying to cause a reaction to pull it apart.
03:06:58.500 | But you could also increase the rate
03:07:01.340 | at which these interfaces are aging
03:07:03.980 | by just increasing temperature.
03:07:06.220 | So every 10 degrees Celsius that you increase,
03:07:08.620 | you're basically accelerating time by two X.
03:07:11.780 | And there's limit as to how much temperature
03:07:13.940 | you wanna increase, 'cause at some point
03:07:15.220 | there's some other nonlinear dynamics
03:07:17.500 | that causes you to have other nasty gases to form
03:07:21.660 | that just is not realistic in an environment.
03:07:23.980 | So what we do is we increase in our ALT chamber
03:07:27.860 | by 20 degrees Celsius that increases the aging by four times.
03:07:32.860 | So essentially one day in ALT chamber
03:07:35.380 | is four days of calendar time.
03:07:37.180 | And we look at whether the implants still are intact,
03:07:42.180 | including the threads and-
03:07:43.780 | - And operation and all of that.
03:07:45.020 | - And operation and all of that.
03:07:46.860 | Obviously it's not an exact same environment as a brain,
03:07:49.980 | 'cause the brain has mechanical,
03:07:52.220 | other more biological things that attack it.
03:07:57.140 | But it is a good test environment,
03:07:59.340 | testing environment for at least the enclosure
03:08:02.660 | and the strength of the enclosure.
03:08:04.500 | And I mean, we've had implants,
03:08:06.460 | the current version of the implant
03:08:08.820 | that has been in there for, I mean,
03:08:11.660 | close to two and a half years,
03:08:13.220 | which is equivalent to a decade and they seem to be fine.
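The temperature acceleration described here is the standard Q10-style rule of thumb: every extra 10 degrees Celsius roughly doubles the aging rate. A short sketch of that arithmetic, matching the numbers quoted above:

```python
def acceleration_factor(delta_celsius: float, q10: float = 2.0) -> float:
    """Q10-style acceleration: each extra 10 degC multiplies the aging rate
    by q10 (the rule of thumb described in the conversation)."""
    return q10 ** (delta_celsius / 10.0)

# +20 degC -> 4x, so one day in the chamber ~ four days of real time,
# and ~2.5 years of soak ~ a decade of equivalent implant life.
print(acceleration_factor(20.0))           # 4.0
print(2.5 * acceleration_factor(20.0))     # ~10 equivalent years
```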
03:08:16.120 | - So it's interesting that the brain,
03:08:19.340 | so basically a close approximation is warm salt water,
03:08:24.340 | hot salt water is a good testing environment.
03:08:27.940 | By the way, I'm drinking element,
03:08:31.580 | which is basically salt water,
03:08:34.340 | which is making me kind of,
03:08:36.700 | it doesn't have computational power
03:08:38.180 | the way the brain does,
03:08:39.100 | but maybe in terms of other characteristics
03:08:42.820 | it's quite similar and I'm consuming it.
03:08:44.980 | - Yeah, you have to get it at the right pH too.
03:08:47.340 | - And then consciousness will emerge.
03:08:50.460 | - Yeah. - No.
03:08:51.300 | - By the way, the other thing that also is interesting
03:08:54.540 | about our enclosure is if you look at our implant,
03:08:58.940 | it's not your common looking medical implant
03:09:03.260 | that usually is encased in a titanium can
03:09:07.380 | that's laser welded.
03:09:08.660 | We use this polymer called PCTFE,
03:09:11.420 | polychlorotrifluoroethylene,
03:09:14.940 | which is actually commonly used in blister packs.
03:09:18.700 | So when you have a pill and you try to pop the pill,
03:09:21.520 | there's like kind of that plastic membrane,
03:09:23.300 | that's what this is.
03:09:24.980 | No one's actually ever used this except us.
03:09:28.260 | And the reason we wanted to do this
03:09:30.220 | is 'cause it's electromagnetically transparent.
03:09:32.420 | So when we talked about the electromagnetic
03:09:36.060 | inductive charging with titanium can,
03:09:38.820 | usually if you wanna do something like that,
03:09:41.220 | you have to have a sapphire window
03:09:42.580 | and it's a very, very tough process to scale.
03:09:45.860 | - So you're doing a lot of iteration here
03:09:47.300 | in every aspect of this,
03:09:48.140 | the materials, the software, the hardware, all of it.
03:09:50.620 | - The whole shebang.
03:09:52.060 | - So okay, so you mentioned scaling.
03:09:56.020 | Is it possible to have multiple Neuralink devices
03:10:00.580 | as one of the ways of scaling?
03:10:04.180 | To have multiple Neuralink devices implanted?
03:10:07.220 | - That's the goal, that's the goal.
03:10:08.540 | Yeah, we've had, I mean, our monkeys have had
03:10:12.260 | two Neuralinks, one in each hemisphere.
03:10:15.780 | And then we're also looking at potential of having
03:10:18.300 | one in motor cortex, one in visual cortex,
03:10:21.700 | and one in whatever other cortex.
03:10:24.700 | - So focusing on a particular function,
03:10:27.300 | one Neuralink device. - Correct.
03:10:28.620 | - I mean, I wonder if there's some level of customization
03:10:30.980 | that can be done on the compute side.
03:10:32.740 | So for the motor cortex.
03:10:34.020 | - Absolutely, that's the goal.
03:10:35.900 | And we talk about at Neuralink,
03:10:38.180 | building a generalized neural interface to the brain.
03:10:41.460 | And that also is strategically how we're approaching this
03:10:47.340 | with marketing and also with regulatory,
03:10:51.220 | which is, hey, look, we have the robot
03:10:55.380 | and the robot can access any part of the cortex.
03:10:57.740 | Right now, we're focused on motor cortex
03:11:00.060 | with current version of the N1
03:11:02.540 | that's specialized for motor decoding tasks.
03:11:06.780 | But also at the end of the day,
03:11:07.780 | there's kind of a general compute available there.
03:11:10.260 | But typically, if you wanna really get down
03:11:14.900 | to kind of hyper-optimizing for power and efficiency,
03:11:17.700 | you do need to get to some specialized function, right?
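A small illustrative sketch of the idea above: one generalized platform whose devices are configured for different cortical targets and decode tasks. The class, field names, and numbers below are hypothetical placeholders, not Neuralink's actual firmware or specs.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImplantConfig:
    """Hypothetical per-device configuration on a generalized BCI platform."""
    target_region: str          # e.g. "motor_cortex" or "visual_cortex"
    mode: str                   # "record", "stimulate", or "both"
    decode_task: Optional[str]  # specialized on-device decoding, or None
    n_channels: int             # electrode count, illustrative only

# Same platform, different specialization per cortical target.
motor = ImplantConfig("motor_cortex", "record", "cursor_velocity", 1024)
vision = ImplantConfig("visual_cortex", "stimulate", None, 1024)

for device in (motor, vision):
    print(device)
```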
03:11:20.460 | But what we're saying is that, hey,
03:11:25.820 | you are now used to these robotic insertion techniques,
03:11:28.700 | which took many, many years of showing data
03:11:32.260 | and in conversation with the FDA,
03:11:33.900 | and also internally convincing ourselves that this is safe.
03:11:38.620 | And now the difference is if we go
03:11:42.780 | to other parts of the brain, like visual cortex,
03:11:45.260 | which we're interested in as our second product,
03:11:48.420 | obviously, it's a completely different environment.
03:11:50.140 | The cortex is laid out very, very differently.
03:11:54.540 | It's gonna be more stimulation-focused rather than recording-focused,
03:11:57.540 | just kind of creating visual percepts.
03:12:00.100 | But in the end,
03:12:01.180 | we're using the same thin film array technology.
03:12:04.460 | We're using the same robot insertion technology.
03:12:07.140 | We're using the same packaging technology.
03:12:09.860 | Now, it's more that the conversation is focused
03:12:11.780 | around what the differences are
03:12:13.260 | and what the implications of those differences are
03:12:14.940 | for safety and efficacy.
03:12:17.060 | - The way you said second product
03:12:19.300 | is both hilarious and awesome to me.
03:12:22.580 | That product being restoring sight for blind people.
03:12:27.580 | So can you speak to stimulating the visual cortex?
03:12:34.660 | I mean, the possibilities there are just incredible
03:12:39.500 | to be able to give that gift back to people
03:12:42.020 | who don't have sight or even any aspect of that.
03:12:46.780 | Can you just speak to the challenges of,
03:12:48.700 | there's several challenges here.
03:12:50.780 | - Oh, many.
03:12:51.620 | - Which is, like you said, from recording to stimulation.
03:12:55.860 | Just any aspect of that that you're both excited
03:12:58.500 | and see the challenges of.
03:13:01.340 | - Yeah, I guess I'll start by saying
03:13:04.460 | that we actually have been capable of stimulating
03:13:08.900 | through our thin film array
03:13:10.500 | as well as other electronics for years.
03:13:13.300 | We have actually demonstrated some of those capabilities
03:13:17.500 | for reanimating the limb in the spinal cord.
03:13:20.820 | Obviously for the current EFS study, the early feasibility study,
03:13:24.900 | we've hardware-disabled that.
03:13:26.660 | So that's something that we wanted to embark on
03:13:29.420 | as a separate journey.
03:13:30.980 | And obviously there are many, many different ways
03:13:34.980 | to write information into the brain.
03:13:37.740 | The way in which we're doing that is through electrical,
03:13:40.420 | passing electrical current
03:13:42.140 | and causing that to really change the local environment
03:13:47.020 | so that you can sort of artificially cause
03:13:51.500 | kind of the neurons to depolarize in nearby areas.
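A minimal sketch of the kind of stimulus just described: a charge-balanced biphasic pulse train, which is the standard way to pass current safely so that nearby neurons depolarize. All amplitudes, widths, and rates below are illustrative assumptions, not Neuralink parameters.

```python
import numpy as np

def biphasic_pulse_train(amp_ua=50.0, phase_us=200.0, freq_hz=100.0,
                         duration_s=0.05, fs_hz=1_000_000):
    """Charge-balanced biphasic pulse train (cathodic phase, then anodic phase).

    Equal and opposite phases keep the net injected charge near zero, which is
    the usual requirement for stimulating tissue safely over long periods.
    """
    t = np.arange(0.0, duration_s, 1.0 / fs_hz)
    waveform = np.zeros_like(t)
    phase_s = phase_us * 1e-6
    for start in np.arange(0.0, duration_s, 1.0 / freq_hz):
        cathodic = (t >= start) & (t < start + phase_s)
        anodic = (t >= start + phase_s) & (t < start + 2 * phase_s)
        waveform[cathodic] = -amp_ua  # negative phase depolarizes nearby neurons
        waveform[anodic] = +amp_ua    # positive phase balances the charge
    return t, waveform

t, w = biphasic_pulse_train()
dt = t[1] - t[0]
print(f"net injected charge per train ~ {w.sum() * dt:.2e} uA*s")  # ~0 when balanced
```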
03:13:54.980 | For vision specifically,
03:13:57.860 | the way our visual system works,
03:14:02.300 | it's both well understood.
03:14:03.620 | I mean, anything with kind of brain,
03:14:05.820 | there are aspects of it that's well understood,
03:14:07.620 | but in the end, like we don't really know anything.
03:14:10.020 | But the way visual system works
03:14:11.540 | is that you have photon hitting your eye
03:14:14.860 | and in your eyes,
03:14:17.820 | there are these specialized cells
03:14:19.740 | called photoreceptor cells
03:14:21.900 | that convert the photon energy into electrical signals.
03:14:25.940 | And that then gets projected
03:14:28.260 | to the back of your head, your visual cortex.
03:14:31.940 | It actually goes through a thalamic relay called the LGN, the lateral geniculate nucleus,
03:14:37.300 | that then projects it out.
03:14:38.700 | And then in the visual cortex,
03:14:40.100 | there's visual area one or V1,
03:14:44.180 | and then there's a bunch of other higher level
03:14:46.260 | processing layers like V2, V3.
03:14:48.460 | And there are actually kind of interesting parallels.
03:14:51.340 | And when you study the behaviors
03:14:54.180 | of these convolutional neural networks,
03:14:56.660 | like what the different layers of the network are detecting,
03:15:00.820 | first they're detecting like these edges
03:15:03.140 | and they're then detecting some more natural curves.
03:15:06.100 | And then they start to detect like objects, right?
03:15:08.460 | Kind of similar thing happens in the brain.
03:15:10.300 | And a lot of that has been inspired
03:15:12.340 | and also it's been kind of exciting
03:15:13.980 | to see some of the correlations there.
03:15:16.700 | But things like from there,
03:15:19.180 | where does cognition arise and where is color encoded?
03:15:23.820 | There's just not a lot of understanding,
03:15:26.660 | fundamental understanding there.
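The parallel mentioned above, early layers responding to edges before later layers respond to objects, can be seen with a toy convolution. The oriented filters below are classic Sobel kernels, similar in spirit to V1-like edge detectors and to the filters that emerge in the first layer of a trained CNN. Purely illustrative.

```python
import numpy as np

# Toy image: a bright square on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0

# Oriented edge filters (Sobel kernels), akin to early-layer / V1-like detectors.
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
sobel_y = sobel_x.T

def conv2d_valid(image, kernel):
    """Naive 'valid' 2D correlation, good enough for a toy demo."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

edges = np.hypot(conv2d_valid(img, sobel_x), conv2d_valid(img, sobel_y))
# The strongest responses sit on the border of the square, i.e. its edges.
print("max response:", edges.max(), "at", np.unravel_index(edges.argmax(), edges.shape))
```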
03:15:28.420 | So in terms of kind of bringing sight back
03:15:32.060 | to those that are blind,
03:15:33.660 | there are many different forms of blindness.
03:15:36.900 | There's actually a million people,
03:15:38.620 | 1 million people in the US that are legally blind.
03:15:41.420 | That means you score below a certain threshold
03:15:42.780 | on kind of a standard visual acuity test.
03:15:46.980 | I think it's something like,
03:15:48.300 | if you can only see at a 20-foot distance
03:15:52.220 | what people with normal vision can see at a 200-foot distance,
03:15:55.060 | or worse than that, you're legally blind.
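For reference, the 20/200 criterion just described can be written down directly. This is a simplified check: the full US legal definition also covers severely restricted visual fields, which is omitted here.

```python
def snellen_fraction(test_ft: float, reference_ft: float) -> float:
    """Snellen acuity as a fraction: 20/20 -> 1.0, 20/200 -> 0.1."""
    return test_ft / reference_ft

def is_legally_blind(test_ft: float, reference_ft: float) -> bool:
    # Simplified US definition: best-corrected acuity of 20/200 or worse
    # in the better eye (visual-field criteria are ignored here).
    return snellen_fraction(test_ft, reference_ft) <= snellen_fraction(20, 200)

print(is_legally_blind(20, 200))  # True: exactly at the threshold
print(is_legally_blind(20, 100))  # False: better than 20/200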
03:15:57.700 | - So fundamentally that means
03:15:59.100 | you can't function effectively using sight in the world.
03:16:02.500 | - Yeah, like to navigate your environment.
03:16:04.900 | And yeah, there are different forms of blindness.
03:16:08.860 | There are forms of blindness where there's some degeneration
03:16:13.180 | of your retina, its photoreceptor cells,
03:16:17.460 | and the rest of your visual processing
03:16:22.220 | that I described is intact.
03:16:24.260 | And for those types of individuals,
03:16:27.660 | you may not need to maybe stick electrodes
03:16:30.100 | into the visual cortex.
03:16:31.500 | You can actually build retinal prosthetic devices
03:16:37.060 | that actually just replace the function
03:16:38.980 | of the retinal cells that have degenerated.
03:16:40.980 | And there are many companies that are working on that,
03:16:43.140 | but that's a very small slice, albeit significant,
03:16:46.900 | still a smaller slice of folks that are legally blind.
03:16:49.900 | If there's any damage along that circuitry,
03:16:53.700 | whether it's in the optic nerve or just the LGN circuitry
03:16:58.500 | or any break in that circuit,
03:17:00.260 | that's not gonna work for you.
03:17:02.620 | And the way you need to actually
03:17:07.620 | cause that visual percept to happen,
03:17:11.420 | because your biological mechanism is not doing that,
03:17:14.220 | is by placing electrodes in the visual cortex
03:17:16.340 | in the back of your head.
03:17:17.660 | And the way in which this would work
03:17:19.220 | is that you would have an external camera,
03:17:21.740 | whether it's something as unsophisticated as a GoPro
03:17:26.740 | or some sort of wearable Ray-Ban type glasses
03:17:32.020 | that Meta's working on that captures a scene.
03:17:34.580 | And that scene is then converted
03:17:38.340 | to a set of electrical impulses or stimulation pulses
03:17:41.540 | that you would activate in your visual cortex
03:17:45.020 | through these thin film arrays.
03:17:48.100 | And by playing some concerted kind of orchestra
03:17:53.100 | of these stimulation patterns,
03:17:56.660 | you can create what's called phosphenes,
03:17:58.500 | which are these kind of white yellowish dots
03:18:01.340 | that you can also create by just pressing on your eyes.
03:18:04.420 | You can actually create those percepts
03:18:06.220 | by stimulating the visual cortex.
03:18:08.340 | And the name of the game is really to have many of those
03:18:11.740 | and have those percepts,
03:18:13.260 | the phosphenes, be as small as possible
03:18:15.100 | so that you can start to tell them apart,
03:18:16.700 | like they're the individual pixels of a screen, right?
03:18:20.460 | So if you have many, many of those,
03:18:22.620 | potentially you'll be able to,
03:18:24.140 | in the long term,
03:18:27.460 | actually get naturalistic vision,
03:18:29.620 | but in the short to maybe mid term,
03:18:32.780 | at least be able to have object detection
03:18:36.140 | algorithms run on your glasses,
03:18:39.940 | the preprocessing units,
03:18:41.660 | and then be able to at least see the edges of things
03:18:43.900 | so you don't bump into stuff.
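A minimal sketch of the pipeline described above: a camera frame is reduced to a coarse grid, where each cell stands in for one electrode's phosphene, and cell brightness is turned into a stimulation amplitude. The grid size, amplitude range, and mapping itself are assumptions for illustration, not how any actual visual prosthesis encodes stimuli.

```python
import numpy as np

def frame_to_stim_grid(frame, grid_shape=(16, 16), max_amp_ua=80.0):
    """Map a grayscale camera frame to a coarse grid of stimulation amplitudes.

    Each grid cell stands in for one electrode / one phosphene; brighter
    regions of the frame request stronger stimulation, up to max_amp_ua.
    """
    h, w = frame.shape
    gh, gw = grid_shape
    cropped = frame[: h - h % gh, : w - w % gw]        # crop to an even multiple
    blocks = cropped.reshape(gh, cropped.shape[0] // gh,
                             gw, cropped.shape[1] // gw)
    grid = blocks.mean(axis=(1, 3))                     # block-average brightness
    grid = grid / max(grid.max(), 1e-9)                 # normalize to [0, 1]
    return grid * max_amp_ua                            # amplitude in microamps

# Fake 128x128 "camera frame": a bright diagonal bar on a dark background.
frame = np.zeros((128, 128))
for i in range(128):
    frame[i, max(0, i - 4): min(128, i + 4)] = 1.0

stim = frame_to_stim_grid(frame)
print(stim.shape, round(float(stim.max()), 1))          # (16, 16) 80.0
```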
03:18:45.340 | - It's incredible.
03:18:47.340 | This is really incredible.
03:18:48.620 | So you basically would be adding pixels
03:18:50.620 | and your brain would start to figure out
03:18:52.300 | what those pixels mean.
03:18:54.380 | - Yeah.
03:18:55.220 | - And like with different kinds of assistive
03:18:57.780 | signal processing on all fronts.
03:18:59.540 | - Yeah.
03:19:00.700 | The thing that actually,
03:19:01.540 | so a couple of things.
03:19:02.380 | One is, obviously if you're blind from birth,
03:19:06.780 | the way the brain works,
03:19:09.140 | especially at an early age,
03:19:10.540 | neuroplasticity is really nothing other than
03:19:15.140 | kind of your brain and different parts of your brain
03:19:18.700 | fighting for the limited territory.
03:19:20.740 | - Yeah.
03:19:21.900 | - And I mean, very, very quickly,
03:19:24.020 | you see cases where people that are,
03:19:27.020 | I mean, you also hear about people who are blind
03:19:29.580 | that have a heightened sense of hearing or some other senses.
03:19:33.580 | And the reason for that is because the cortex
03:19:36.060 | that's not used just gets taken over
03:19:37.860 | by these other parts of the cortex.
03:19:39.580 | So for those types of individuals,
03:19:41.380 | I mean, I guess they're going to have to now map
03:19:46.060 | some other parts of their senses
03:19:48.100 | into what they call vision,
03:19:49.820 | but it's going to be obviously a very,
03:19:51.780 | very different conscious experience than before.
03:19:54.660 | So I think that's an interesting caveat.
03:19:59.260 | The other thing that also is important to highlight
03:20:01.180 | is that we're currently limited by our biology
03:20:05.620 | in terms of the wavelength that we can see.
03:20:08.700 | There's a very, very small band of wavelengths,
03:20:11.220 | the visible light wavelengths,
03:20:13.820 | that we can see with our eyes.
03:20:15.020 | But when you have an external camera with this BCI system,
03:20:18.660 | you're not limited to that.
03:20:19.700 | You can have infrared, you can have UV,
03:20:21.740 | you can have whatever other spectrum that you want to see.
03:20:24.740 | And whether that gets mapped
03:20:25.900 | to some sort of weird conscious experience, I've no idea.
03:20:28.900 | But oftentimes I talk to people
03:20:32.780 | about the goal of Neuralink
03:20:34.060 | being going beyond the limits of our biology.
03:20:36.860 | That's sort of what I mean.
03:20:38.220 | - And if you're able to control the kind of raw signal,
03:20:43.420 | the thing is that when we use our sight, we're getting the photons
03:20:48.260 | and there's not much processing on them.
03:20:50.900 | If you're able to control that signal,
03:20:52.940 | maybe you can do some kind of processing.
03:20:54.380 | Maybe you do object detection ahead of time.
03:20:57.260 | You're doing some kind of pre-processing
03:20:59.620 | and there's a lot of possibilities to explore that.
03:21:02.300 | So it's not just increasing
03:21:04.700 | sort of thermal imaging, that kind of stuff,
03:21:07.260 | but it's also just doing
03:21:09.060 | some kind of interesting processing.
03:21:10.620 | - Yeah.
03:21:11.700 | I mean, my theory of how like visual system works also
03:21:15.500 | is that, I mean, there's just so many things happening
03:21:20.540 | in the world and there's a lot of photons
03:21:22.580 | that are going into your eye.
03:21:23.860 | And it's unclear exactly where some of the pre-processing
03:21:28.180 | steps are happening, but I mean, I actually think that
03:21:30.980 | just from a fundamental perspective,
03:21:35.340 | there's just so much, the reality that we're in,
03:21:38.580 | if it's a reality, is such that there's so much data.
03:21:43.580 | And I think humans are just unable to actually
03:21:46.340 | take in enough of it to process all that information.
03:21:49.940 | So there's some sort of filtering that does happen,
03:21:52.140 | whether that happens in the retina,
03:21:53.340 | whether that happens in different layers
03:21:54.740 | of the visual cortex, unclear.
03:21:57.780 | But like the analogy that I sometimes think about is,
03:22:01.340 | you know, if your brain is a CCD camera
03:22:05.420 | and all of the information in the world is the sun,
03:22:08.980 | and when you try to actually look at the sun
03:22:11.860 | with the CCD camera,
03:22:12.780 | it's just gonna saturate the sensors, right?
03:22:14.580 | 'Cause it's an enormous amount of energy.
03:22:16.300 | So what you do is you end up adding these filters, right?
03:22:20.260 | To just kind of narrow the information
03:22:22.540 | that's coming to you and being captured.
03:22:25.180 | And I think, you know, things like our experiences
03:22:30.180 | or our, you know, like drugs, like propofol,
03:22:36.100 | that anesthetic drug, or, you know, psychedelics,
03:22:40.340 | what they're doing is they're kind of swapping out
03:22:43.140 | these filters and putting in new ones
03:22:45.020 | or removing older ones and kind of controlling
03:22:48.100 | our conscious experience.
03:22:50.060 | - Yeah, man, not to distract from the topic,
03:22:51.700 | but I just took a very high dose of ayahuasca
03:22:53.780 | in the Amazon jungle.
03:22:54.780 | So yes, it's a nice way to think about it.
03:22:57.780 | You're swapping out different experiences
03:23:00.980 | and with Neuralink being able to control that,
03:23:04.380 | primarily at first to improve function,
03:23:07.820 | not for entertainment purposes or enjoyment purposes,
03:23:10.340 | but yeah, giving back lost functions,
03:23:13.180 | giving back lost functions.
03:23:15.220 | And there, especially when the function's completely lost,
03:23:19.260 | anything is a huge help.
03:23:21.900 | Would you implant a Neuralink device in your own brain?
03:23:26.900 | - Absolutely.
03:23:29.980 | I mean, maybe not right now, but absolutely.
03:23:33.420 | - What kind of capability, once reached,
03:23:35.980 | would make you start getting real curious
03:23:38.100 | and almost get a little antsy, like jealous of people
03:23:42.500 | as you watch them getting implanted.
03:23:45.060 | - Yeah, I mean, I think, I mean,
03:23:47.580 | even with our early participants,
03:23:49.620 | if they start to do things that I can't do,
03:23:54.060 | which I think is in the realm of possibility
03:23:56.780 | for them to be able to get 15, 20,
03:23:59.940 | if not like 100 BPS, right?
03:24:02.860 | There's nothing that fundamentally stops us
03:24:04.940 | from being able to achieve that type of performance.
03:24:07.540 | I mean, I would certainly get jealous
03:24:11.940 | that they can do that.
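For context on the BPS numbers above: throughput for a grid-selection task like Webgrid is usually reported as bits per correct selection times net selections per second. The formula below, log2(N - 1) times (correct minus incorrect) over time, is a common convention and an assumption on my part here, not necessarily the exact metric Neuralink reports.

```python
import math

def grid_task_bps(grid_size: int, correct: int, incorrect: int, seconds: float) -> float:
    """Throughput estimate for an N-target grid-selection task.

    Assumes the common convention: log2(N - 1) * (correct - incorrect) / time.
    Treat it as an illustrative approximation of the metric discussed here.
    """
    n_targets = grid_size * grid_size
    bits_per_selection = math.log2(n_targets - 1)
    return bits_per_selection * (correct - incorrect) / seconds

# e.g. a 35x35 grid with 60 correct and 2 wrong selections in one minute:
print(round(grid_task_bps(35, 60, 2, 60.0), 2), "bits per second")  # ~9.9
```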
03:24:13.060 | - I should say that watching Nolan,
03:24:14.500 | I get a little jealous
03:24:15.420 | 'cause he's having so much fun
03:24:17.260 | and it seems like such a chill way to play video games.
03:24:19.980 | - Yeah, I mean, the thing that also is hard
03:24:23.180 | to appreciate sometimes is that he's doing these things
03:24:27.500 | while talking and, I mean, it's multitasking, right?
03:24:31.780 | So it's clearly, it's obviously cognitively intensive,
03:24:36.780 | but similar to how, when we talk, we move our hands,
03:24:40.580 | these things are multitasking.
03:24:43.660 | I mean, he's able to do that.
03:24:44.980 | And you won't be able to do that
03:24:46.900 | with other assistive technology, as far as I'm aware.
03:24:51.460 | If you're obviously using an eye-tracking device,
03:24:54.940 | you're very much fixated on that thing
03:24:56.900 | that you're trying to do.
03:24:57.740 | And if you're using voice control,
03:24:58.860 | I mean, if you say some other stuff,
03:25:01.220 | yeah, you don't get to use that.
03:25:02.180 | - Yeah, the multitasking aspect of that
03:25:05.140 | is really interesting.
03:25:05.980 | So it's not just the BPS for the primary task,
03:25:09.420 | it's the parallelization of multiple tasks.
03:25:12.460 | If you measure the BPS for the entirety
03:25:15.620 | of the human organism, so if you're talking
03:25:18.540 | and doing a thing with your mind and looking around also,
03:25:23.540 | I mean, there's just a lot of parallelization
03:25:26.940 | that can be happening.
03:25:28.460 | - But I mean, I think at some point for him,
03:25:29.980 | like if he wants to really achieve those high-level BPS,
03:25:32.420 | it does require full attention, right?
03:25:34.820 | And that's a separate circuitry that is a big mystery,
03:25:39.180 | like how attention works and, you know.
03:25:41.100 | - Yeah, attention, like cognitive load,
03:25:42.740 | I've done, I've read a lot of literature
03:25:45.300 | on people doing two tasks.
03:25:47.940 | Like you have your primary task and a secondary task,
03:25:51.460 | and the secondary task is a source of distraction.
03:25:54.500 | And how does that affect the performance
03:25:55.980 | of the primary task?
03:25:56.820 | And there's, depending on the task,
03:25:58.260 | there's a lot of interesting,
03:25:59.820 | I mean, this is an interesting computational device, right?
03:26:02.140 | And I think there's, to say the least,
03:26:05.300 | a lot of novel insights that can be gained from everything.
03:26:07.780 | I mean, I personally am surprised that Nolan's able
03:26:09.660 | to do such incredible control of the cursor while talking,
03:26:14.660 | and also being nervous at the same time,
03:26:17.620 | because he's talking like all of us are,
03:26:19.140 | if you're talking in front of the camera, you get nervous.
03:26:21.460 | So all of those are coming into play,
03:26:23.780 | and he's able to still achieve high performance.
03:26:26.360 | Surprising, I mean, all of this is really amazing.
03:26:30.020 | And I think just after researching this really in depth,
03:26:35.540 | I kind of want a Neuralink.
03:26:36.700 | (laughing)
03:26:38.340 | - Get in the line.
03:26:39.720 | - And also with safety in mind.
03:26:41.380 | Well, we should say the registry is for people
03:26:44.140 | who have quadriplegia and all that kind of stuff,
03:26:46.380 | so there'll be a separate line for people
03:26:57.340 | who are just curious, like myself.
03:26:57.340 | So now that Nolan, patient P1,
03:26:59.860 | is part of the ongoing PRIME study,
03:27:02.620 | what's the high-level vision for P2, P3, P4, P5,
03:27:07.340 | and just the expansion into other human beings
03:27:11.220 | that are getting to experience this implant?
03:27:13.420 | - Yeah, I mean, the primary goal is,
03:27:17.660 | for our study in the first place,
03:27:19.220 | is to achieve safety endpoints,
03:27:21.060 | just understand safety of this device,
03:27:26.060 | as well as the implantation process.
03:27:30.940 | And also at the same time, understand the efficacy
03:27:33.780 | and the impact that it could have
03:27:34.940 | on the potential user's lives.
03:27:37.380 | And just because you're living with tetraplegia,
03:27:43.540 | it doesn't mean your situation is the same
03:27:46.620 | as another person's living with tetraplegia.
03:27:48.900 | It's wildly, wildly varying.
03:27:51.160 | And it's something that we're hoping to also understand
03:27:57.100 | how our technology can serve
03:27:59.540 | not just a very small slice of those individuals,
03:28:01.700 | but a broader group of individuals,
03:28:03.620 | and being able to get the feedback
03:28:05.020 | to just really build the best product for them.
03:28:09.900 | So there's obviously also goals that we have,
03:28:16.100 | and the primary purpose of the early feasibility study
03:28:20.900 | is to learn from each and every participant
03:28:24.380 | to improve the device, improve the surgery,
03:28:26.980 | before we embark on what's called the pivotal study,
03:28:30.380 | that then is a much larger trial
03:28:34.180 | that starts to look at statistical significance
03:28:39.140 | of your endpoints.
03:28:40.580 | And that's required before you can then market the device.
03:28:43.720 | And that's how it works in the US
03:28:46.660 | and just generally around the world.
03:28:48.340 | That's the process you follow.
03:28:50.100 | So our goal is to really just understand
03:28:52.940 | from people like Nolan, P2, P3, future participants,
03:28:56.700 | what aspects of our device need to improve.
03:28:59.220 | If it turns out that people are like,
03:29:01.180 | I really don't like the fact that it lasts only six hours,
03:29:03.620 | I wanna be able to use this computer for like 24 hours.
03:29:08.320 | I mean, that is user needs and user requirements,
03:29:13.320 | which we can only find out
03:29:14.800 | from just being able to engage with them.
03:29:17.260 | - So before the pivotal study,
03:29:18.500 | there's kind of like a rapid innovation
03:29:20.460 | based on individual experiences.
03:29:21.960 | You're learning from individual people,
03:29:23.660 | how they use it, like the high resolution details
03:29:28.620 | in terms of like cursor control and signal
03:29:30.620 | and all that kind of stuff to like life experience.
03:29:33.260 | - Yeah, so there's hardware changes,
03:29:34.900 | but also just firmware updates.
03:29:37.440 | So even when we had that sort of recovery event for Nolan,
03:29:42.440 | he now has the new firmware that he has been updated with.
03:29:48.980 | And it's similar to how like your phones
03:29:51.900 | get updated all the time with new firmwares
03:29:53.940 | for security patches, whatever new functionality, UI, right?
03:29:57.180 | And that's something that is possible with our implant.
03:29:59.940 | It's not a static one-time device
03:30:02.560 | that can only do the thing that it said it can do.
03:30:05.820 | I mean, similar to Tesla,
03:30:06.820 | you can do over-the-air firmware updates
03:30:08.860 | and now you have a completely new user interface
03:30:11.820 | and all these bells and whistles and improvements
03:30:14.940 | on everything like the latest, right?
03:30:17.900 | That's when we say generalized platform,
03:30:21.220 | that's what we're talking about.
03:30:22.380 | - Yeah, it's really cool how the app that Nolan is using,
03:30:24.900 | there's like calibration, all that kind of stuff.
03:30:28.620 | And then there's update.
03:30:29.880 | You just click and get an update.
03:30:33.900 | What other future capabilities are you kind of looking to?
03:30:39.820 | You said vision, that's a fascinating one.
03:30:42.500 | What about sort of accelerated typing or speech,
03:30:46.020 | this kind of stuff and what else is there?
03:30:49.660 | - Yeah, those are still in the realm of movement program.
03:30:54.020 | So largely speaking, we have two programs.
03:30:56.140 | We have the movement program and we have the vision program.
03:30:59.620 | The movement program currently is focused around
03:31:02.220 | digital freedom.
03:31:03.780 | As you can easily guess,
03:31:05.340 | if you can control a 2D cursor in the digital space,
03:31:08.940 | you could move anything in the physical space.
03:31:11.620 | So robotic arms, wheelchair, your environment,
03:31:14.980 | or even really like whether it's through the phone
03:31:17.780 | or just like directly to those interfaces,
03:31:19.980 | so like to those machines.
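To make the "control a 2D cursor, control the physical world" point concrete, here is a toy mapping from a decoded 2D cursor velocity to the wheel speeds of a differential-drive wheelchair. The mapping and limits are invented for illustration; a real system would sit behind the safety constraints discussed next.

```python
def cursor_to_differential_drive(vx: float, vy: float, max_wheel_speed: float = 1.0):
    """Map a decoded 2D cursor velocity to left/right wheel speeds.

    Toy convention: vy drives forward/backward, vx steers left/right.
    Inputs are assumed normalized to [-1, 1].
    """
    left = vy + vx
    right = vy - vx
    scale = max(1.0, abs(left), abs(right))   # keep commands within speed limits
    return (max_wheel_speed * left / scale,
            max_wheel_speed * right / scale)

print(cursor_to_differential_drive(0.0, 1.0))   # straight ahead: (1.0, 1.0)
print(cursor_to_differential_drive(1.0, 0.0))   # turn in place: (1.0, -1.0)
```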
03:31:22.660 | So we're looking at ways to kind of expand
03:31:25.060 | those types of capability, even for Nolan.
03:31:28.020 | That requires conversation with the FDA
03:31:30.380 | and kind of showing safety data so that,
03:31:33.620 | if there's a robotic arm or a wheelchair,
03:31:36.020 | we can guarantee that they're not gonna hurt themselves
03:31:37.940 | accidentally, right?
03:31:39.560 | It's very different if you're moving stuff
03:31:41.140 | in the digital domain versus in the physical space,
03:31:43.560 | where you can actually potentially cause harm to the participants.
03:31:48.160 | So we're working through that right now.
03:31:50.720 | Speech does involve different areas of the brain.
03:31:54.240 | Speech prosthetic is very, very fascinating.
03:31:56.680 | And there's actually been a lot of really amazing work
03:32:00.960 | that's been happening in academia.
03:32:02.660 | Sergey Stavisky at UC Davis, Jaimie Henderson,
03:32:07.240 | and the late Krishna Shenoy at Stanford
03:32:10.760 | doing just some incredible amount of work
03:32:12.400 | in improving speech neuroprosthetics.
03:32:15.700 | And those are actually looking more at parts
03:32:19.720 | of the motor cortex that are controlling
03:32:22.040 | these vocal articulators.
03:32:24.320 | And even by just mouthing the word
03:32:27.880 | or imagining speech, you can pick up those signals.
03:32:30.920 | The more sophisticated higher level processing areas
03:32:34.700 | like the Broca's area or Wernicke's area,
03:32:38.520 | those are still a very, very big mystery
03:32:41.720 | in terms of the underlying mechanisms
03:32:43.800 | of how all that stuff works.
03:32:45.400 | But yeah, I mean, I think Neuralink's eventual goal
03:32:50.000 | is to kind of understand those things
03:32:52.640 | and be able to provide a platform and tools
03:32:55.040 | to be able to understand that and study that.
03:32:58.040 | - This is where I get to the pothead questions.
03:33:00.540 | Do you think we can start getting insight
03:33:03.600 | into things like thought?
03:33:07.160 | So speech is, there's a muscular component, like you said.
03:33:11.920 | There's like the act of producing sounds.
03:33:14.760 | But then what about the internal things like cognition?
03:33:18.380 | Like low level thoughts and high level thoughts.
03:33:21.760 | Do you think we'll start noticing kind of signals
03:33:24.320 | that could be picked up?
03:33:26.080 | That could be understood, that could be maybe used
03:33:31.080 | in order to interact with the outside world?
03:33:35.960 | - In some ways, like I guess this starts
03:33:37.760 | to kind of get into the hard problem of consciousness.
03:33:40.760 | And I mean, on one hand,
03:33:47.360 | all of these are at some point a set of electrical signals,
03:33:52.280 | and from there, maybe that in itself
03:33:58.160 | is giving you the cognition or the meaning,
03:34:02.200 | or somehow the human mind is an incredibly amazing
03:34:06.600 | storytelling machine.
03:34:07.840 | So we're telling ourselves and fooling ourselves
03:34:10.320 | that there's some interesting meaning here.
03:34:12.480 | But I mean, I certainly think that BCI,
03:34:18.680 | and really BCI at the end of the day is a set of tools
03:34:21.920 | that help you kind of study the underlying mechanisms
03:34:25.160 | in a both like local, but also broader sense.
03:34:30.760 | And whether there's some interesting patterns
03:34:34.120 | of electrical signals
03:34:35.620 | that mean you're thinking this versus that,
03:34:38.860 | and whether you can learn from many, many sets
03:34:42.120 | of data to correlate some of that
03:34:43.800 | and be able to do mind reading or not, I'm not sure.
03:34:46.400 | I certainly would not kind of rule that out
03:34:50.080 | as a possibility, but I think BCI alone
03:34:55.040 | probably can't do that.
03:34:55.920 | There's probably an additional set of tools and frameworks needed.
03:34:59.440 | And also, like, the hard problem of consciousness
03:35:02.840 | at the end of the day is rooted
03:35:04.000 | in this philosophical question of like,
03:35:06.000 | what is the meaning of it all?
03:35:07.560 | What's the nature of our existence?
03:35:09.400 | Like, where does the mind emerge from this complex network?
03:35:12.200 | - Yeah, how does the subjective experience emerge
03:35:17.520 | from just a bunch of spikes, electrical spikes?
03:35:21.160 | - Yeah, I mean, we do really think about BCI
03:35:24.200 | and what we're building as a tool
03:35:25.480 | for understanding the mind, the brain,
03:35:29.680 | the only question that matters.
03:35:32.240 | There's actually, there actually is some biological
03:35:38.240 | existence proof of like what it would take
03:35:42.360 | to kind of start to form some of these experiences
03:35:45.240 | that may be unique.
03:35:46.200 | If you actually look at every one of our brains,
03:35:50.680 | there are two hemispheres.
03:35:51.920 | There's a left-sided brain, there's a right-sided brain.
03:35:54.520 | And I mean, unless you have some other conditions,
03:35:59.520 | you normally don't feel like left-Lex or right-Lex.
03:36:04.840 | Like you just feel like one Lex, right?
03:36:07.060 | So what is happening there, right?
03:36:10.200 | If you actually look at the two hemispheres,
03:36:13.320 | there's a structure that kind of connects
03:36:17.040 | the two, called the corpus callosum,
03:36:19.340 | that is supposed to have around 200 to 300 million
03:36:24.040 | connections or axons.
03:36:25.640 | So whether that means that's the number of interfaces
03:36:31.400 | and electrodes that we need to create some sort of
03:36:34.740 | mind meld or from that, like whatever new conscious
03:36:38.840 | experience that you can experience.
03:36:40.980 | But I do think that there's like kind of an interesting
03:36:46.400 | existence proof that we all have.
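A loose back-of-the-envelope on the existence proof mentioned above. The axon count comes from the conversation; the average firing rate and bits-per-spike figures are rough assumptions chosen only to show the order of magnitude, not a real bandwidth measurement.

```python
# Rough scale estimate for the corpus callosum as a natural "interface".
axons = 250e6          # ~200-300 million axons, per the conversation (midpoint)
mean_rate_hz = 5.0     # assumed average firing rate per axon
bits_per_spike = 1.0   # assumed information per spike (very rough)

bandwidth_bps = axons * mean_rate_hz * bits_per_spike
print(f"~{bandwidth_bps:.1e} bits/s of raw inter-hemispheric signaling")  # ~1.2e9
print(f"that is ~{axons / 1024:.0f}x the channel count of a ~1000-electrode implant")
```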
03:36:52.680 | - And that threshold is unknown at this time.
03:36:55.640 | - Oh yeah, these things, everything in this domain
03:36:57.960 | is speculation, right?
03:37:00.400 | - And then there will be,
03:37:01.720 | you'd be continuously pleasantly surprised.
03:37:06.040 | Do you see a world where there's millions of people,
03:37:12.300 | like tens of millions, hundreds of millions of people
03:37:15.400 | walking around with a Neuralink device
03:37:18.400 | or multiple Neuralink devices in their brain?
03:37:20.860 | - I do.
03:37:21.700 | First of all, there are, like, if you look worldwide,
03:37:25.160 | people suffering from movement disorders
03:37:26.980 | and visual deficits, I mean, that's
03:37:29.380 | in the tens, if not hundreds, of millions of people.
03:37:34.760 | So that alone, I think there's a lot of benefit
03:37:39.640 | and potential good that we can do
03:37:41.960 | with this type of technology.
03:37:43.460 | And once you start to get into kind of neuro,
03:37:48.000 | like psychiatric applications, you know, depression,
03:37:51.240 | anxiety, hunger, or, you know, obesity, right?
03:37:57.320 | Like mood, control of appetite.
03:38:00.540 | I mean, that starts to become,
03:38:02.600 | you know, very real to everyone.
03:38:06.540 | - Not to mention that every,
03:38:08.480 | most people on earth have a smartphone.
03:38:13.560 | And once BCI starts competing with a smartphone
03:38:18.560 | as a preferred methodology of interacting
03:38:20.820 | with the digital world,
03:38:22.500 | that also becomes an interesting thing.
03:38:24.380 | - Oh yeah, I mean, yeah, this is,
03:38:25.820 | even before going to that, right?
03:38:27.140 | I mean, there's, like, almost,
03:38:28.720 | I mean, the entire world that could benefit
03:38:32.020 | from these types of things.
03:38:33.780 | And then, yeah, like if we're talking about
03:38:35.900 | kind of next generation of how we interface
03:38:37.900 | with, you know, machines or even ourselves,
03:38:42.380 | in many ways, I think BCI can play a role in that.
03:38:47.380 | And, you know, some of the things that I also talk about is,
03:38:51.400 | I do think that there is a real possibility
03:38:53.320 | that you could see, you know,
03:38:55.560 | 8 billion people walking around with Neuralink.
03:38:58.800 | - Well, thank you so much for pushing ahead.
03:39:01.600 | And I look forward to that exciting future.
03:39:04.480 | - Thanks for having me.
03:39:06.160 | - Thanks for listening to this conversation with DJ Seo.
03:39:10.000 | And now, dear friends, here's Matthew MacDougall,
03:39:13.780 | the head neurosurgeon at Neuralink.
03:39:16.220 | When did you first become fascinated with the human brain?
03:39:21.580 | - Since forever, as far back as I can remember,
03:39:24.620 | I've been interested in the human brain.
03:39:26.260 | I mean, I was, you know, a thoughtful kid
03:39:30.580 | and a bit of an outsider.
03:39:34.020 | And you, you know, sit there thinking about
03:39:37.020 | what the most important things in the world are.
03:39:40.360 | In your little tiny adolescent brain.
03:39:43.480 | And the answer that I came to, that I converged on,
03:39:47.440 | was that all of the things you can possibly conceive of
03:39:52.440 | as things that are important for human beings to care about
03:39:56.900 | are literally contained, you know, in the skull.
03:40:00.960 | Both the perception of them and their relative values
03:40:03.880 | and, you know, the solutions to all our problems
03:40:07.080 | and all of our problems are all contained in the skull.
03:40:10.400 | And if we knew more about how that worked,
03:40:14.040 | how the brain encodes information and generates desires
03:40:19.920 | and generates agony and suffering,
03:40:22.880 | we could do more about it.
03:40:26.600 | You know, you think about all the really great triumphs
03:40:30.120 | in human history.
03:40:30.960 | You think about all the really horrific tragedies.
03:40:33.620 | You know, you think about the Holocaust.
03:40:36.960 | You think about any prison full of human stories.
03:40:41.960 | And all of those problems boil down to neurochemistry.
03:40:49.240 | So if you get a little bit of control over that,
03:40:53.200 | you provide people the option to do better.
03:40:57.680 | In the way I read history,
03:40:58.920 | the way people have dealt with having better tools
03:41:01.640 | is that they most often in the end do better
03:41:05.880 | with huge asterisks.
03:41:08.400 | But I think it's an interesting, worthy, and noble pursuit
03:41:13.160 | to give people more options, more tools.
03:41:16.080 | - Yeah, that's a fascinating way to look at human history.
03:41:18.680 | You just imagine all these neurobiological mechanisms,
03:41:21.960 | Stalin, Hitler, all of these, Genghis Khan,
03:41:25.360 | all of them just had like a brain.
03:41:28.360 | It's just a bunch of neurons, you know,
03:41:31.200 | like a few tens of billions of neurons,
03:41:34.720 | gaining a bunch of information over a period of time.
03:41:37.000 | They have a set of modules that do language and memory
03:41:39.520 | and all that.
03:41:40.560 | And from there, in the case of those people,
03:41:43.160 | they're able to murder millions of people.
03:41:45.320 | And all that coming from,
03:41:47.900 | there's not some glorified notion of a dictator
03:41:54.360 | of this enormous mind or something like this.
03:41:57.280 | It's just the brain.
03:41:59.240 | - Yeah.
03:42:00.680 | Yeah, I mean, a lot of that has to do with
03:42:03.320 | how well people like that can organize those around them.
03:42:08.320 | - Other brains.
03:42:09.160 | - Yeah, and so I always find it interesting
03:42:12.200 | to look to primatology, you know,
03:42:14.280 | look to our closest non-human relatives
03:42:17.120 | for clues as to how humans are going to behave
03:42:20.280 | and what particular humans are able to achieve.
03:42:25.080 | And so you look at chimpanzees and bonobos,
03:42:29.840 | and, you know, they're similar but different
03:42:32.640 | in their social structures, particularly.
03:42:35.060 | And I went to Emory in Atlanta
03:42:38.480 | and studied under Frans de Waal,
03:42:40.560 | the great Frans de Waal,
03:42:41.720 | who was kind of the leading primatologist
03:42:44.320 | who recently died.
03:42:46.320 | And his work at looking at chimps through the lens of,
03:42:51.320 | you know, how you would watch an episode of "Friends"
03:42:55.080 | and understand the motivations of the characters
03:42:57.760 | interacting with each other,
03:42:58.720 | he would look at a chimp colony
03:43:00.600 | and basically apply that lens.
03:43:02.240 | I'm massively oversimplifying it.
03:43:05.240 | If you do that, instead of just saying,
03:43:08.120 | you know, subject 473, you know,
03:43:10.920 | threw his feces at subject 471,
03:43:13.680 | you talk about them in terms of their human struggles,
03:43:18.960 | accord them the dignity of themselves as actors
03:43:23.280 | with understandable goals and drives,
03:43:27.000 | what they want out of life.
03:43:28.440 | And primarily it's, you know,
03:43:30.160 | the things we want out of life,
03:43:31.640 | food, sex, companionship, power.
03:43:35.740 | You can understand chimp and bonobo behavior
03:43:40.760 | in the same lights much more easily.
03:43:45.200 | And I think doing so gives you the tools you need
03:43:49.440 | to reduce human behavior from the kind of false complexity
03:43:54.060 | that we layer onto it with language.
03:43:57.020 | And look at it in terms of,
03:43:59.100 | oh, well, these humans are looking for companionship,
03:44:02.020 | sex, food, power.
03:44:03.740 | And I think that that's a pretty powerful tool
03:44:08.660 | to have in understanding human behavior.
03:44:10.860 | - And I just went to the Amazon jungle for a few weeks,
03:44:14.260 | and it's a very visceral reminder
03:44:17.500 | that a lot of life on earth is just trying to get laid.
03:44:21.100 | - Yeah.
03:44:21.940 | - They're all screaming at each other.
03:44:23.420 | Like I saw a lot of monkeys,
03:44:25.300 | and they're just trying to impress each other.
03:44:26.940 | Or maybe there's a battle for power,
03:44:29.820 | but a lot of the battle for power
03:44:31.300 | has to do with them getting laid.
03:44:32.980 | - Right.
03:44:33.820 | Breeding rights often go with alpha status.
03:44:37.180 | And so if you can get a piece of that,
03:44:38.740 | then you're gonna do okay.
03:44:40.540 | - And we'd like to think
03:44:41.420 | that we're somehow fundamentally different,
03:44:42.980 | but especially when it comes to primates,
03:44:45.860 | we really aren't, you know.
03:44:47.780 | We can use fancier poetic language,
03:44:50.080 | but maybe some of the underlying drives
03:44:53.940 | that motivate us are similar.
03:44:57.700 | - Yeah, I think that's true.
03:44:58.900 | - And all of that is coming from this, the brain.
03:45:01.660 | - Yeah.
03:45:02.500 | - So when did you first start studying the brain
03:45:05.100 | as a biological mechanism?
03:45:06.980 | - Basically the moment I got to college,
03:45:08.660 | I started looking around for labs
03:45:11.180 | that I could do neuroscience work in.
03:45:15.060 | I originally approached that from the angle
03:45:17.540 | of looking at interactions
03:45:20.060 | between the brain and the immune system,
03:45:22.000 | which isn't the most obvious place to start,
03:45:23.940 | but I had this idea at the time
03:45:27.040 | that the contents of your thoughts
03:45:31.220 | would have an impact, a direct impact,
03:45:34.380 | maybe a powerful one,
03:45:36.040 | on non-conscious systems in your body,
03:45:41.040 | the systems we think of as homeostatic, automatic mechanisms
03:45:46.040 | like fighting off a virus, like repairing a wound.
03:45:51.560 | And sure enough, there are big crossovers between the two.
03:45:55.400 | I mean, it gets to kind of a key point
03:46:00.400 | that I think goes under-recognized,
03:46:02.120 | one of the things people don't recognize
03:46:05.080 | or appreciate about the human brain enough,
03:46:08.840 | and that is that it basically controls
03:46:11.520 | or has a huge role in almost everything that your body does.
03:46:14.560 | Like you try to name an example of something in your body
03:46:19.440 | that isn't directly controlled
03:46:22.200 | or massively influenced by the brain,
03:46:24.480 | and it's pretty hard.
03:46:27.240 | I mean, you might say like bone healing or something,
03:46:29.440 | but even those systems, the hypothalamus and pituitary
03:46:34.240 | end up playing a role in coordinating the endocrine system
03:46:38.860 | that does have a direct influence
03:46:40.920 | on, say, the calcium level in your blood
03:46:43.360 | that goes to bone healing.
03:46:44.520 | So non-obvious connections between those things
03:46:48.400 | implicate the brain as really a potent prime mover
03:46:53.000 | in all of health.
03:46:55.800 | - One of the things I realized in the other direction too,
03:46:59.200 | how most of the systems in the body
03:47:02.320 | are integrated with the human brain,
03:47:04.240 | like they affect the brain also, like the immune system.
03:47:07.480 | I think there's just people who study Alzheimer's
03:47:11.800 | and those kinds of things.
03:47:14.640 | It's just surprising how much you can understand
03:47:18.960 | of that from the immune system,
03:47:20.800 | from the other systems that don't obviously
03:47:22.880 | seem to have anything to do with sort of the nervous system.
03:47:27.320 | They all play together.
03:47:28.540 | - Yeah, you could understand how that would be
03:47:30.880 | driven by evolution too, just in some simple examples.
03:47:34.680 | If you get sick, if you get a communicable disease,
03:47:37.460 | you get the flu,
03:47:41.200 | it's pretty advantageous for your immune system
03:47:44.720 | to tell your brain, hey, now be antisocial for a few days.
03:47:49.720 | Don't go be the life of the party tonight.
03:47:52.240 | In fact, maybe just cuddle up somewhere warm
03:47:55.800 | under a blanket and just stay there for a day or two.
03:47:58.600 | And sure enough, that tends to be the behavior
03:48:00.600 | that you see both in animals and in humans.
03:48:03.680 | If you get sick, elevated levels of interleukins
03:48:07.480 | in your blood and TNF alpha in your blood
03:48:10.520 | ask the brain to cut back on social activity
03:48:14.800 | and even moving around.
03:48:17.680 | You have lower locomotor activity in animals
03:48:21.640 | that are infected with viruses.
03:48:23.400 | - So from there, the early days in neuroscience to surgery,
03:48:30.360 | when did that step happen, this leap?
03:48:34.720 | - You know, it was sort of an evolution of thought.
03:48:36.680 | I wanted to study the brain.
03:48:39.800 | So I started studying the brain in undergrad
03:48:42.960 | in this neuroimmunology lab.
03:48:45.260 | I, from there, realized at some point
03:48:51.560 | that I didn't wanna just generate knowledge.
03:48:54.640 | I wanted to affect real changes in the actual world,
03:48:59.140 | in actual people's lives.
03:49:01.400 | And so after having not really thought
03:49:03.680 | about going into medical school,
03:49:06.360 | I was on a track to go into a PhD program.
03:49:09.240 | I said, well, I'd like that option.
03:49:12.960 | I'd like to actually potentially help
03:49:15.320 | tangible people in front of me.
03:49:18.680 | And doing a little digging,
03:49:22.040 | found that there exist these MD/PhD programs
03:49:25.320 | where you can choose not to choose between them
03:49:30.160 | and do both.
03:49:31.000 | And so I went to USC for medical school
03:49:34.720 | and had a joint PhD program with Caltech.
03:49:38.940 | Where I met, actually chose that program
03:49:43.140 | particularly because of a researcher at Caltech
03:49:46.660 | named Richard Anderson,
03:49:47.700 | who's one of the godfathers of primate neuroscience.
03:49:52.020 | And has a macaque lab where Utah rays
03:49:55.300 | and other electrodes were being inserted
03:49:57.060 | into the brains of monkeys to try to understand
03:50:00.180 | how intentions were being encoded in the brain.
03:50:03.860 | So, you know, I ended up there
03:50:06.300 | with the idea that maybe I would be a neurologist
03:50:09.020 | and study the brain on the side.
03:50:11.060 | And then discovered that neurology,
03:50:14.200 | again, I'm gonna make enemies by saying this,
03:50:18.300 | but neurology predominantly and distressingly to me
03:50:23.300 | is the practice of diagnosing a thing
03:50:27.300 | and then saying, good luck with that
03:50:28.660 | when there's not much we can do.
03:50:30.260 | And neurosurgery very differently
03:50:35.620 | is a, it's a powerful lever on taking people
03:50:39.500 | that are headed in a bad direction
03:50:40.980 | and changing their course
03:50:42.220 | in the sense of brain tumors
03:50:45.420 | that are potentially treatable or curable with surgery.
03:50:49.240 | You know, even aneurysms in the brain,
03:50:52.620 | blood vessels that are gonna rupture,
03:50:53.940 | you can save lives, which really is at the end of the day
03:50:57.260 | what mattered to me.
03:50:59.280 | And so I was at USC, as I mentioned,
03:51:04.260 | that happens to be one of the great neurosurgery programs.
03:51:08.140 | And so I met these truly epic neurosurgeons,
03:51:12.740 | Alex Khalessi and Mike Apuzzo and Steve Giannotta
03:51:17.780 | and Marty Weiss, these sort of epic people
03:51:20.180 | that were just human beings in front of me.
03:51:22.420 | And so it kind of changed my thinking
03:51:25.300 | from neurosurgeons are distant gods
03:51:28.780 | that live on another planet
03:51:30.660 | and occasionally come and visit us
03:51:32.500 | to these are humans that have problems and are people.
03:51:36.380 | And there's nothing fundamentally preventing me
03:51:39.240 | from being one of them.
03:51:40.940 | And so at the last minute in medical school,
03:51:45.100 | I changed gears from going into a different specialty
03:51:48.740 | and switched into neurosurgery, which cost me a year.
03:51:52.940 | I had to do another year of research
03:51:55.420 | because I was so far along in the process
03:51:57.500 | that to switch into neurosurgery,
03:52:01.460 | the deadlines had already passed.
03:52:02.740 | So it was a decision that cost time,
03:52:07.200 | but absolutely worth it.
03:52:09.020 | - What was the hardest part of the training
03:52:11.140 | on the neurosurgeon track?
03:52:12.840 | - Yeah, two things, I think.
03:52:17.780 | Residency in neurosurgery is sort of a competition of pain,
03:52:21.580 | of like how much pain can you eat and smile.
03:52:24.660 | - Yeah.
03:52:26.140 | - And so there are work-hour restrictions
03:52:30.060 | that are not really,
03:52:32.060 | they're viewed, I think,
03:52:34.300 | internally among the residents as weakness.
03:52:37.340 | And so most neurosurgery residents
03:52:40.580 | try to work as hard as they can.
03:52:42.220 | And that, I think, necessarily means working long hours
03:52:46.300 | and sometimes over the work hour limits.
03:52:48.180 | And we care about being compliant
03:52:51.700 | with whatever regulations are in front of us.
03:52:54.840 | But I think more important than that,
03:52:56.360 | people want to give their all
03:52:58.960 | in becoming a better neurosurgeon
03:53:00.600 | because the stakes are so high.
03:53:03.280 | And so it's a real fight to get residents
03:53:06.040 | to, say, go home at the end of their shift
03:53:10.360 | and not stay and do more surgery.
03:53:12.280 | - Are you seriously saying one of the hardest things
03:53:14.920 | is literally forcing them to get sleep and rest
03:53:19.240 | and all this kind of stuff?
03:53:20.480 | - Historically, that was the case.
03:53:21.840 | I think the next generation-- - And that's awesome.
03:53:24.920 | - I think the next generation is more compliant
03:53:28.840 | and more self-care. - Weak is what you mean.
03:53:30.360 | All right, I'm just kidding, I'm just kidding.
03:53:32.440 | - I didn't say it. - Now I'm making enemies.
03:53:34.760 | No, okay, I get it.
03:53:35.680 | Wow, that's fascinating.
03:53:37.360 | So what was the second thing?
03:53:39.020 | - The personalities.
03:53:40.800 | And maybe the two are connected.
03:53:43.000 | - So was it pretty competitive?
03:53:45.160 | - It's competitive, and it's also,
03:53:47.120 | as we touched on earlier, primates like power.
03:53:53.200 | And I think neurosurgery has long had this aura
03:53:57.760 | of mystique and excellence and whatever about it.
03:54:02.520 | And so it's an invitation, I think,
03:54:04.360 | for people that are cloaked in that authority.
03:54:08.440 | A board-certified neurosurgeon is basically
03:54:10.200 | a walking, fallacious appeal to authority, right?
03:54:14.520 | You have license to walk into any room
03:54:16.480 | and act like you're an expert on whatever.
03:54:21.200 | And fighting that tendency is not something
03:54:23.780 | that most neurosurgeons do well.
03:54:26.480 | Humility isn't the forte.
03:54:28.140 | - Yeah, one of the, so I have friends who know you,
03:54:32.160 | and whenever they speak about you,
03:54:33.920 | that you have the surprising quality
03:54:38.440 | for a neurosurgeon of humility,
03:54:40.360 | which I think indicates that it's not as common
03:54:44.880 | as perhaps in other professions.
03:54:46.480 | 'Cause there is a kind of gigantic,
03:54:49.880 | sort of heroic aspect to neurosurgery,
03:54:52.280 | and I think it gets to people's head a little bit.
03:54:54.920 | - Yeah, well, I think that allows me
03:54:59.280 | to play well at an Elon company,
03:55:02.640 | because Elon, one of his strengths, I think,
03:55:07.080 | is to just instantly see through fallacy from authority.
03:55:12.080 | So nobody walks into a room that he's in
03:55:15.560 | and says, "Well, God damn it, you have to trust me.
03:55:18.560 | "I'm the guy that built the last 10 rockets," or something.
03:55:22.440 | And he says, "Well, you did it wrong,
03:55:23.480 | "and we can do it better."
03:55:25.400 | Or, "I'm the guy that kept Ford alive
03:55:28.220 | "for the last 50 years.
03:55:29.220 | "You listen to me on how to build cars."
03:55:30.880 | And he says, "No."
03:55:33.120 | And so you don't walk into a room that he's in
03:55:36.200 | and say, "Well, I'm a neurosurgeon.
03:55:37.520 | "Let me tell you how to do it."
03:55:39.680 | He's gonna say, "Well, I'm a human being
03:55:42.540 | "that has a brain.
03:55:43.480 | "I can think from first principles myself.
03:55:45.880 | "Thank you very much.
03:55:47.280 | "And here's how I think it ought to be done.
03:55:49.080 | "Let's go try it and see who's right."
03:55:51.560 | And that's proven, I think, over and over in his case
03:55:55.440 | to be a very powerful approach.
03:55:57.120 | - If we just take that tangent,
03:55:58.480 | there's a fascinating interdisciplinary team at Neuralink
03:56:03.000 | that you get to interact with, including Elon.
03:56:07.180 | What do you think is the secret to a successful team?
03:56:12.400 | Or what have you learned
03:56:13.540 | from just getting to observe these folks?
03:56:15.960 | - Yeah.
03:56:16.800 | - World experts in different disciplines work together.
03:56:19.480 | - Yeah, there's a sweet spot where people disagree
03:56:25.160 | and forcefully speak their mind
03:56:29.320 | and passionately defend their position
03:56:31.320 | and yet are still able to accept information from others
03:56:36.760 | and change their ideas when they're wrong.
03:56:40.040 | And so I like the analogy of sort of how you polish rocks.
03:56:45.040 | You put hard things in a hard container and spin it.
03:56:50.560 | People bash against each other
03:56:52.520 | and out comes a more refined product.
03:56:57.120 | And so to make a good team at Neuralink,
03:57:01.600 | we've tried to find people that are not afraid
03:57:05.180 | to defend their ideas passionately
03:57:07.240 | and occasionally strongly disagree with people
03:57:12.240 | that they're working with
03:57:15.840 | and have the best idea come out on top.
03:57:18.600 | It's not an easy balance, again,
03:57:23.120 | to refer back to the primate brain.
03:57:26.360 | It's not something that is inherently built
03:57:28.660 | into the primate brain to say,
03:57:30.840 | I passionately put all my chips on this position
03:57:36.400 | and now I'm just gonna walk away from it,
03:57:37.940 | admit you were right.
03:57:39.000 | Part of our brains tell us that that is a power loss,
03:57:43.860 | that is a loss of face, a loss of standing in the community
03:57:47.320 | and now you're a Zeta chump
03:57:52.080 | 'cause your idea got trounced.
03:57:54.620 | And you just have to recognize that that little voice
03:57:59.800 | in the back of your head is maladaptive
03:58:01.400 | and it's not helping the team win.
03:58:04.320 | - Yeah, you have to have the confidence
03:58:05.440 | to be able to walk away from an idea that you hold onto.
03:58:08.920 | And if you do that often enough,
03:58:11.080 | you're actually going to become the best
03:58:14.480 | in the world at your thing.
03:58:16.280 | I mean, that kind of, that rapid iteration.
03:58:18.800 | - Yeah, you'll at least be a member of a winning team.
03:58:21.440 | - Ride the wave.
03:58:23.800 | What did you learn?
03:58:26.120 | You mentioned there's a lot of amazing neurosurgeons at USC.
03:58:30.440 | What lessons about surgery and life
03:58:33.400 | have you learned from those folks?
03:58:35.160 | - Yeah, I think working your ass off,
03:58:38.600 | working hard while functioning as a member of a team,
03:58:43.600 | getting a job done that is incredibly difficult,
03:58:47.660 | working incredibly long hours, being up all night,
03:58:53.480 | taking care of someone that you think probably won't survive
03:58:56.920 | no matter what you do,
03:58:58.720 | working hard to make people that you passionately dislike
03:59:03.040 | look good the next morning.
03:59:05.160 | These folks were relentless in their pursuit
03:59:10.240 | of excellent neurosurgical technique decade over decade.
03:59:15.240 | And I think we're well-recognized for that excellence.
03:59:20.480 | So especially Marty Weiss, Steve Giannotta, Mike Apuzzo,
03:59:25.320 | they made huge contributions not only to surgical technique,
03:59:30.040 | but they built training programs that trained dozens
03:59:34.760 | or hundreds of amazing neurosurgeons.
03:59:38.240 | I was just lucky to kind of be in their wake.
03:59:42.440 | - What's that like?
03:59:43.640 | You mentioned doing a surgery
03:59:46.800 | where the person is likely not to survive.
03:59:51.080 | Does that wear on you?
03:59:52.420 | - Yeah.
03:59:55.060 | Um...
03:59:55.900 | You know, it's especially challenging
04:00:04.740 | when you, with all respect to our elders,
04:00:12.500 | it doesn't hit so much
04:00:15.460 | when you're taking care of an 80-year-old
04:00:17.620 | and something was going to get them pretty soon anyway.
04:00:23.980 | And so you lose a patient like that,
04:00:27.060 | and it was part of the natural course
04:00:29.580 | of what is expected of them in the coming years regardless.
04:00:33.940 | Taking care of, you know, a father
04:00:41.180 | of two or three, four young kids,
04:00:45.500 | someone in their 30s that didn't have it coming,
04:00:49.540 | and they show up in your ER
04:00:52.300 | having their first seizure of their life,
04:00:54.020 | and lo and behold, they've got a huge, malignant,
04:00:57.100 | inoperable, or incurable brain tumor.
04:01:00.140 | You can only do that, I think, a handful of times
04:01:04.660 | before it really starts eating away at your armor.
04:01:09.540 | Or, you know, a young mother that shows up
04:01:14.620 | that has a giant hemorrhage in her brain
04:01:17.740 | that she's not gonna survive from,
04:01:19.140 | and, you know, they bring her four-year-old daughter in
04:01:22.740 | to say goodbye one last time
04:01:25.020 | before they turn the ventilator off.
04:01:26.820 | You know, the great Henry Marsh
04:01:31.860 | is an English neurosurgeon who said it best.
04:01:33.980 | I think he says, "Every neurosurgeon
04:01:35.780 | "carries with them a private graveyard,"
04:01:37.580 | and I definitely feel that,
04:01:39.140 | especially with young parents.
04:01:44.020 | That kills me.
04:01:47.220 | They had a lot more to give.
04:01:50.340 | The loss of those people, specifically,
04:01:53.900 | has a, you know, knock-on effect
04:01:57.020 | that's going to make the world worse for people
04:02:01.460 | for a long time,
04:02:03.380 | and it's just hard to feel powerless in the face of that.
04:02:07.140 | You know, and that's where, I think,
04:02:11.660 | you have to be borderline evil
04:02:15.500 | to fight against a company like Neuralink
04:02:18.180 | or to constantly be taking potshots at us
04:02:21.900 | because what we're doing is to try to fix that stuff.
04:02:26.900 | We're trying to give people options to reduce suffering.
04:02:32.420 | We're trying to take the pain out of life
04:02:38.020 | that broken brains bring in,
04:02:43.660 | and yeah, this is just our little way
04:02:48.460 | that we're fighting back against entropy, I guess.
04:02:51.320 | - Yeah, the amount of suffering that's endured
04:02:55.340 | when some of the things that we take for granted
04:02:58.620 | that our brain is able to do is taken away is immense,
04:03:02.580 | and to be able to restore some of that functionality
04:03:04.780 | is a real gift.
04:03:06.420 | - Yeah, we're just starting.
04:03:08.060 | We're gonna do so much more.
04:03:11.900 | - Well, can you take me through the full procedure
04:03:14.220 | for implanting, let's say, the N1 chip in Neuralink?
04:03:19.020 | - Yeah, it's a really simple,
04:03:20.900 | really simple, straightforward procedure.
04:03:23.820 | The human part of the surgery that I do is dead simple.
04:03:28.820 | It's one of the most basic neurosurgery procedures
04:03:32.780 | imaginable, and I think there's evidence
04:03:36.480 | that some version of it has been done for thousands of years.
04:03:40.780 | There are examples, I think, from ancient Egypt
04:03:44.100 | of healed or partially healed trepanations,
04:03:47.860 | and from Peru or ancient times in South America,
04:03:52.740 | where these proto-surgeons would drill holes
04:03:58.020 | in people's skulls, presumably to let out the evil spirits,
04:04:02.000 | but maybe to drain blood clots,
04:04:04.220 | and there's evidence of bone healing around the edge,
04:04:06.780 | meaning the people at least survived some months
04:04:09.940 | after a procedure, and so what we're doing is that.
04:04:13.340 | We are making a cut in the skin on the top of the head
04:04:16.580 | over the area of the brain that is the most potent
04:04:20.460 | representation of hand intentions,
04:04:25.340 | and so if you are an expert concert pianist,
04:04:30.340 | you know, this part of your brain is lighting up
04:04:33.060 | the entire time you're playing.
04:04:34.620 | We call it the hand knob.
04:04:36.940 | - The hand knob?
04:04:38.220 | So it's all the finger movements,
04:04:40.060 | all of that is just firing away.
04:04:43.540 | - Yep, there's a little squiggle in the cortex right there.
04:04:46.180 | One of the folds in the brain is kind of doubly folded
04:04:49.540 | right on that spot, and so you can look at it on an MRI
04:04:52.620 | and say, that's the hand knob,
04:04:55.380 | and then you do a functional test in a special kind of MRI,
04:04:59.140 | called a functional MRI, fMRI,
04:05:01.320 | and this part of the brain lights up
04:05:03.880 | when people, even quadriplegic people,
04:05:05.900 | whose brains aren't connected
04:05:07.220 | to their finger movements anymore,
04:05:08.640 | they imagine finger movements,
04:05:11.020 | and this part of the brain still lights up.
04:05:13.300 | So we can ID that part of the brain
04:05:16.260 | in anyone who's preparing to enter our trial,
04:05:19.980 | and say, okay, that part of the brain, we confirm,
04:05:23.880 | is your hand intention area,
04:05:25.940 | and so I'll make a little cut in the skin,
04:05:30.940 | we'll flap the skin open,
04:05:33.220 | just like kind of opening the hood of a car,
04:05:35.220 | only a lot smaller,
04:05:37.180 | make a perfectly round one-inch diameter hole in the skull,
04:05:42.180 | remove that bit of skull,
04:05:44.960 | open the lining of the brain, the covering of the brain,
04:05:49.540 | it's like a little bag of water that the brain floats in,
04:05:53.220 | and then show that part of the brain to our robot,
04:05:57.840 | and then this is where the robot shines,
04:06:01.060 | it can come in and take these tiny,
04:06:04.380 | much smaller than human hair electrodes,
04:06:08.620 | and precisely insert them into the cortex,
04:06:12.300 | into the surface of the brain,
04:06:14.260 | to a very precise depth, in a very precise spot,
04:06:18.220 | that avoids all the blood vessels
04:06:19.840 | that are coating the surface of the brain,
04:06:22.400 | and after the robot's done with its part,
04:06:24.020 | then the human comes back in,
04:06:25.900 | and puts the implant into that hole in the skull,
04:06:30.300 | and covers it up,
04:06:32.180 | screwing it down to the skull,
04:06:33.500 | and sewing the skin back together,
04:06:35.980 | so the whole thing is a few hours long,
04:06:39.740 | it's extremely low risk,
04:06:41.660 | compared to the average neurosurgery,
04:06:45.160 | involving the brain that might, say,
04:06:47.460 | open up a deep part of the brain,
04:06:49.100 | or manipulate blood vessels in the brain,
04:06:52.300 | this opening on the surface of the brain,
04:06:57.380 | with only cortical micro-insertions,
04:07:01.340 | carries significantly less risk
04:07:05.020 | than a lot of the tumor,
04:07:07.500 | or aneurysm surgeries that are routinely done.
04:07:10.780 | - So cortical micro-insertions
04:07:12.300 | that are via robot and computer vision,
04:07:14.880 | are designed to avoid the blood vessels.
04:07:19.060 | - Exactly.
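
A minimal sketch of the fMRI localization step described above, assuming nothing about Neuralink's actual pipeline: the idea is just to find voxels inside the motor strip whose signal rises and falls with an imagined finger-movement task. The function name, mask, and threshold are all invented for illustration.

```python
import numpy as np

def localize_hand_knob(bold, task_regressor, motor_strip_mask, threshold=0.4):
    """Return a boolean mask of candidate hand-knob voxels.

    bold:             4D array (x, y, z, t) of fMRI signal
    task_regressor:   1D array (t,), e.g. 1 during imagined finger movement, 0 at rest
    motor_strip_mask: 3D boolean array marking the precentral gyrus
    """
    t = task_regressor - task_regressor.mean()
    demeaned = bold - bold.mean(axis=-1, keepdims=True)
    # Pearson correlation of every voxel's time series with the task design.
    num = (demeaned * t).sum(axis=-1)
    den = np.sqrt((demeaned ** 2).sum(axis=-1) * (t ** 2).sum()) + 1e-9
    corr = num / den
    # Strongly task-correlated voxels inside the motor strip approximate
    # the "hand knob" region that lights up during imagined hand movement.
    return (corr > threshold) & motor_strip_mask
```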
04:07:19.060 | - So, I know you're a bit biased here,
04:07:21.260 | but let's compare human and machine.
04:07:23.860 | - Sure.
04:07:24.940 | - What are human surgeons able to do well,
04:07:29.060 | and what are robot surgeons able to do well,
04:07:32.540 | at this stage of our human civilization development?
04:07:36.980 | - Yeah, yeah, that's a good question.
04:07:38.940 | Humans are general purpose machines,
04:07:44.540 | we're able to adapt to unusual situations,
04:07:48.280 | we're able to change the plan on the fly.
04:07:53.260 | I remember well a surgery that I was doing many years ago,
04:07:58.220 | down in San Diego,
04:07:59.260 | where the plan was to open a small hole behind the ear,
04:08:04.260 | and go reposition a blood vessel
04:08:08.660 | that had come to lie on
04:08:11.340 | the trigeminal nerve,
04:08:13.260 | the nerve that goes to the face.
04:08:15.100 | When that blood vessel lies on the nerve,
04:08:17.340 | it can cause just intolerable, horrific shooting pain,
04:08:21.760 | that people describe like being zapped with a cattle prod.
04:08:25.100 | And so, the beautiful, elegant surgery
04:08:26.820 | is to go move this blood vessel off the nerve.
04:08:30.900 | The surgery team, we went in there
04:08:33.580 | and started moving this blood vessel,
04:08:35.380 | and then found that there was a giant aneurysm
04:08:37.540 | on that blood vessel,
04:08:38.900 | that was not easily visible on the pre-op scans.
04:08:41.900 | And so, the plan had to dynamically change,
04:08:44.100 | and the human surgeons had no problem with that,
04:08:48.340 | we're trained for all those things.
04:08:50.620 | Robots wouldn't do so well in that situation,
04:08:53.640 | at least in their current incarnation,
04:08:55.780 | fully robotic surgery,
04:08:58.020 | like the electrode insertion portion of the Neuralink surgery,
04:09:03.020 | it goes according to a set plan.
04:09:06.360 | And so, the humans can interrupt the flow
04:09:09.840 | and change the plan,
04:09:10.880 | but the robot can't really change the plan midway through.
04:09:14.260 | It operates according to how it was programmed,
04:09:18.160 | and how it was asked to run.
04:09:19.500 | It does its job very precisely,
04:09:23.260 | but not with a wide degree of latitude
04:09:26.320 | in how to react to changing conditions.
04:09:29.440 | - So, there could be just a very large number of ways
04:09:31.840 | that you could be surprised as a surgeon.
04:09:33.560 | When you enter a situation,
04:09:34.800 | there could be subtle things
04:09:36.440 | that you have to dynamically adjust to.
04:09:38.560 | - Correct.
04:09:39.400 | - And robots are not good at that.
04:09:42.560 | - Currently.
04:09:43.680 | - Currently.
04:09:45.060 | We are at the dawn of a new era with AI
04:09:49.380 | allowing the parameters for robot responsiveness
04:09:54.260 | to be dramatically broadened, right?
04:09:57.020 | I mean, you can't look at a self-driving car
04:10:00.000 | and say that it's operating under very narrow parameters.
04:10:03.820 | If a chicken runs across the road,
04:10:07.380 | it wasn't necessarily programmed to deal with that,
04:10:10.060 | specifically, but a Waymo or a self-driving Tesla
04:10:14.340 | would have no problem reacting to that appropriately.
04:10:17.720 | And so, surgical robots aren't there yet,
04:10:21.900 | but give it time.
04:10:23.640 | - And then there could be a lot
04:10:24.600 | of sort of semi-autonomous possibilities
04:10:27.440 | of maybe a robotic surgeon could say,
04:10:32.240 | "This situation is perfectly familiar,"
04:10:34.800 | or, "This situation is not familiar,"
04:10:36.980 | and in the not familiar case, a human could take over.
04:10:39.920 | But basically, be very conservative in saying,
04:10:43.400 | "Okay, this for sure has no issues, no surprises,"
04:10:46.640 | and let the humans deal with the surprises
04:10:48.800 | with the edge cases and all that.
04:10:50.120 | - Yeah.
04:10:51.520 | - That's one possibility.
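
A toy sketch of the conservative hand-off policy being floated here, with every name and threshold invented: the robot proceeds only when its own familiarity estimate is very high, and anything else pauses for the human surgeon.

```python
from dataclasses import dataclass

# Hypothetical threshold: err heavily toward handing control to the surgeon.
FAMILIARITY_THRESHOLD = 0.99

@dataclass
class StepAssessment:
    familiarity: float  # model's estimate that the scene matches familiar conditions
    note: str

def next_action(assessment: StepAssessment) -> str:
    """Proceed autonomously only on very familiar steps; otherwise defer."""
    if assessment.familiarity >= FAMILIARITY_THRESHOLD:
        return "robot proceeds with planned step"
    return f"pause and hand off to surgeon: {assessment.note}"

# Example: an unexpected finding drops familiarity well below the threshold.
print(next_action(StepAssessment(0.62, "unexpected vessel over planned target")))
```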
04:10:52.480 | So, you think eventually you'll be out of the job,
04:10:57.480 | well, you being neurosurgeon, your job being neurosurgeon,
04:11:02.280 | humans, there will not be many neurosurgeons
04:11:05.320 | left on this earth.
04:11:06.720 | - I'm not worried about my job
04:11:09.160 | in the course of my professional life.
04:11:12.700 | I think I would tell my kids not necessarily
04:11:16.120 | to go in this line of work depending on (laughs)
04:11:20.800 | depending on how things look in 20 years.
04:11:24.280 | - It's so fascinating 'cause, I mean,
04:11:26.360 | if I have a line of work, I would say it's programming.
04:11:29.280 | And if you ask me, like, for the last, I don't know,
04:11:32.680 | 20 years, what I would recommend for people,
04:11:35.440 | I would tell 'em, yeah, go.
04:11:37.360 | You will always have a job if you're a programmer
04:11:39.400 | 'cause there's more and more computers
04:11:40.640 | and all this kind of stuff,
04:11:41.480 | and it pays well, but then you realize
04:11:46.140 | these large language models come along
04:11:47.840 | and they're really damn good at generating code.
04:11:50.400 | So, overnight, you could be surprised,
04:11:52.520 | like, wow, what is the contribution of the human really?
04:11:55.480 | But then you start to think, okay,
04:11:57.360 | it does seem that humans have ability, like you said,
04:12:01.320 | to deal with novel situations.
04:12:03.080 | And in the case of programming,
04:12:05.080 | it's the ability to kinda come up with novel ideas
04:12:09.840 | to solve problems.
04:12:12.480 | It seems like machines aren't quite yet able to do that.
04:12:16.040 | And when the stakes are very high,
04:12:17.480 | when it's life-critical, as it is in surgery,
04:12:20.640 | especially neurosurgery, then it starts,
04:12:23.040 | the stakes are very high for a robot
04:12:27.000 | to actually replace a human.
04:12:28.280 | But it's fascinating that, in this case of Neuralink,
04:12:31.580 | there's a human-robot collaboration.
04:12:34.320 | - Yeah, yeah, it's, I do the parts it can't do,
04:12:37.520 | and it does the parts I can't do.
04:12:39.800 | And we are friends.
04:12:41.520 | (laughing)
04:12:43.760 | - I saw that there's a lot of practice going on.
04:12:49.360 | So, I mean, everything in Neuralink
04:12:51.840 | is tested extremely rigorously.
04:12:54.160 | But one of the things I saw,
04:12:55.400 | that there's a proxy on which the surgeries are performed.
04:12:59.120 | So this is both for the robot and for the human,
04:13:02.320 | for everybody involved in the entire pipeline.
04:13:05.120 | What's that like, practicing the surgery?
04:13:07.440 | - It's pretty intense.
04:13:09.600 | So there's no analog to this in human surgery.
04:13:14.600 | Human surgery is sort of this artisanal craft
04:13:18.120 | that's handed down directly from master to pupil
04:13:21.800 | over the generations.
04:13:23.360 | I mean, literally, the way you learn
04:13:25.500 | to be a surgeon on humans is by doing surgery on humans.
04:13:30.500 | I mean, first, you watch your professors
04:13:37.040 | do a bunch of surgery, and then finally,
04:13:38.680 | they put the trivial parts of the surgery into your hands,
04:13:42.160 | and then the more complex parts.
04:13:43.800 | And as your understanding of the point
04:13:46.720 | and the purposes of the surgery increases,
04:13:49.280 | you get more responsibility in the perfect condition.
04:13:52.240 | Doesn't always go well.
04:13:53.880 | In Neuralink's case, the approach is a bit different.
04:13:57.040 | We, of course, practiced as far as we could on animals.
04:14:02.960 | We did hundreds of animal surgeries.
04:14:06.680 | And when it came time to do the first human,
04:14:10.140 | we had just an amazing team of engineers
04:14:14.320 | build incredibly lifelike models.
04:14:18.320 | One of the engineers, Fran Romano in particular,
04:14:20.440 | built a pulsating brain in a custom 3D-printed skull
04:14:25.440 | that matches exactly the patient's anatomy,
04:14:30.040 | including their face and scalp characteristics.
04:14:35.520 | And so, when I was able to practice that,
04:14:39.320 | I mean, it's as close as it really reasonably should get
04:14:43.000 | to being the real thing in all the details,
04:14:48.400 | including having a mannequin body
04:14:52.440 | attached to this custom head.
04:14:54.880 | And so, when we were doing the practice surgeries,
04:14:56.960 | we'd wheel that body into the CT scanner
04:15:01.120 | and take a mock CT scan and wheel it back in
04:15:04.760 | and conduct all the normal safety checks,
04:15:07.600 | verbally: stop, this patient,
04:15:10.680 | we're confirming his identification
04:15:12.240 | is mannequin number blah, blah, blah.
04:15:15.600 | And then opening the brain in exactly the right spot
04:15:18.720 | using standard operative neuronavigation equipment,
04:15:23.320 | standard surgical drills in the same OR
04:15:26.200 | that we do all of our practice surgeries in at Neuralink,
04:15:29.720 | and having the skull open and have the brain pulse,
04:15:33.660 | which adds a degree of difficulty for the robot
04:15:35.960 | to perfectly, precisely plan and insert those electrodes
04:15:40.660 | to the right depth and location.
04:15:43.220 | And so, yeah, we kind of broke new ground
04:15:49.160 | on how extensively we practiced for this surgery.
04:15:52.600 | - So, there was a historic moment,
04:15:55.440 | a big milestone for Neuralink,
04:15:57.820 | in part for humanity,
04:16:00.900 | with the first human getting a Neuralink implant
04:16:04.700 | in January of this year.
04:16:06.580 | Take me through the surgery on Noland.
04:16:11.140 | What did it feel like to be a part of this?
04:16:13.020 | - Yeah, well, we were lucky to have just incredible partners
04:17:18.020 | at the Barrow Neurological Institute.
04:16:20.180 | They are, I think,
04:16:22.380 | the premier neurosurgical hospital in the world.
04:16:27.400 | They made everything as easy as possible
04:16:32.400 | for the trial to get going
04:16:34.840 | and helped us immensely with their expertise
04:16:38.160 | on how to arrange the details.
04:16:41.840 | It was a much more high-pressure surgery in some ways.
04:16:46.200 | I mean, even though the outcome
04:16:49.560 | wasn't particularly in question
04:16:51.120 | in terms of our participants' safety,
04:16:54.960 | the number of observers,
04:16:58.400 | the number of people,
04:16:59.240 | there's conference rooms full of people
04:17:01.320 | watching live streams in the hospital,
04:17:03.460 | rooting for this to go perfectly,
04:17:06.700 | and that just adds pressure that is not typical
04:17:09.880 | for even the most intense production neurosurgery,
04:17:14.880 | say, removing a tumor
04:17:17.120 | or placing deep brain stimulation electrodes,
04:17:21.080 | and it had never been done on a human before.
04:17:23.200 | There were unknown unknowns.
04:17:25.120 | And so, definitely a moderate pucker factor there
04:17:32.360 | for the whole team,
04:17:35.720 | not knowing if we were going to encounter,
04:17:38.960 | say, a degree of brain movement that was unanticipated
04:17:43.960 | or a degree of brain sag
04:17:46.980 | that took the brain far away from the skull
04:17:50.480 | and made it difficult to insert
04:17:51.920 | or some other unknown unknown problem.
04:17:54.720 | Fortunately, everything went well,
04:17:57.040 | and that surgery was one of the smoothest outcomes
04:18:01.920 | we could have imagined.
04:18:03.200 | - Were you nervous?
04:18:04.080 | I mean, you're a bit of a quarterback
04:18:05.720 | in the Super Bowl kind of situation.
04:18:07.800 | - Extremely nervous, extremely.
04:18:11.000 | I was very pleased when it went well,
04:18:12.360 | and then when it was over,
04:18:14.000 | looking forward to number two.
04:18:16.440 | - Yeah, even with all that practice, all of that,
04:18:19.320 | just you've never been in a situation
04:18:21.680 | that's so high-stakes in terms of people watching.
04:18:24.400 | And we should also probably mention,
04:18:26.480 | given how the media works,
04:18:27.800 | a lot of people may be in a dark kind of way
04:18:32.800 | hoping it doesn't go well.
04:18:35.240 | - Well, I think wealth is easy to hate or envy or whatever,
04:18:40.240 | and I think there's a whole industry around driving clicks,
04:18:46.200 | and bad news is great for clicks.
04:18:50.980 | And so any way to take an event and turn it into bad news
04:18:55.980 | is gonna be really good for clicks.
04:19:00.160 | - It just sucks because I think it puts pressure on people.
04:19:03.520 | It discourages people from trying
04:19:06.880 | to solve really hard problems,
04:19:08.080 | because to solve hard problems,
04:19:09.360 | you have to go into the unknown.
04:19:11.200 | You have to do things that haven't been done before,
04:19:13.200 | and you have to take risks, calculated risks.
04:19:16.680 | You have to do all kinds of safety precautions,
04:19:18.440 | but risks nevertheless.
04:19:20.060 | And I just wish there would be more celebration of that,
04:19:23.740 | of the risk-taking, versus people just waiting
04:19:27.180 | on the sidelines, waiting for failure,
04:19:30.840 | and then pointing out the failure.
04:19:33.060 | Yeah, it sucks, but in this case,
04:19:35.540 | it's really great that everything went just flawlessly,
04:19:38.420 | but it's unnecessary pressure, I would say.
04:19:41.700 | - Now that there's a human with literal skin in the game,
04:19:46.040 | there's a participant whose well-being rides
04:19:48.980 | on this doing well, you have to be a pretty bad person
04:19:52.420 | to be rooting for that to go wrong.
04:19:54.220 | And so hopefully people look in the mirror
04:19:58.180 | and realize that at some point.
04:20:00.120 | - So did you get to actually front row seat,
04:20:04.380 | like watch the robot work?
04:20:05.660 | Like what, you get to see the whole thing?
04:20:08.260 | - Yeah, I mean, because an MD needs to be in charge
04:20:13.260 | of all of the medical decision-making
04:20:15.040 | throughout the process, I unscrubbed from the surgery
04:20:20.040 | after exposing the brain and presenting it to the robot
04:20:23.140 | and placed the targets on the robot software interface
04:20:28.140 | that tells the robot where it's going to insert each thread.
04:20:34.580 | That was done with my hand on the mouse
04:20:38.180 | for whatever that's worth.
04:20:39.900 | - So you were the one placing the targets?
04:20:41.540 | - Yeah.
04:20:42.460 | - Oh, cool.
04:20:43.620 | So like the robot with the computer vision
04:20:48.620 | provides a bunch of candidates
04:20:50.080 | and you kind of finalize the decision.
04:20:52.520 | - Right, the software engineers are amazing on this team
04:20:57.520 | and so they actually provided an interface
04:21:01.120 | where you can essentially use a lasso tool
04:21:04.200 | and select a prime area of brain real estate
04:21:08.480 | and it will automatically avoid the blood vessels
04:21:11.240 | in that region and automatically place a bunch of targets.
04:21:14.700 | So that allows the human robot operator
04:21:19.300 | to select really good areas of brain
04:21:22.740 | and make dense applications of targets in those regions,
04:21:27.740 | the regions we think are gonna have
04:21:30.020 | the most high-fidelity representations
04:21:34.180 | of finger movements and arm movement intentions.
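
A minimal sketch of the kind of targeting interface described here, not Neuralink's actual software, with the function name, spacing, and safety margin invented: targets are laid out densely inside the lassoed region, and any candidate within a margin of a segmented vessel is discarded.

```python
import numpy as np

def place_targets(lasso_mask, vessel_mask, spacing_px=10, margin_px=3):
    """Propose insertion targets inside an operator-drawn region.

    lasso_mask:  2D boolean image of the operator-selected cortical region
    vessel_mask: 2D boolean image of segmented surface vasculature
    Returns a list of (row, col) pixel coordinates.
    """
    h, w = lasso_mask.shape
    # Dilate the vessel mask by a square margin to form a keep-out zone.
    keep_out = np.zeros_like(vessel_mask, dtype=bool)
    ys, xs = np.nonzero(vessel_mask)
    for dy in range(-margin_px, margin_px + 1):
        for dx in range(-margin_px, margin_px + 1):
            keep_out[np.clip(ys + dy, 0, h - 1), np.clip(xs + dx, 0, w - 1)] = True
    # Dense grid of candidates inside the lasso, outside the keep-out zone.
    return [(y, x)
            for y in range(0, h, spacing_px)
            for x in range(0, w, spacing_px)
            if lasso_mask[y, x] and not keep_out[y, x]]
```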
04:21:37.900 | - I've seen images of this and for me with OCD,
04:21:41.060 | it's for some reason a really pleasant,
04:21:43.920 | I think there's a subreddit called Oddly Satisfying.
04:21:46.480 | - Yeah, love that subreddit.
04:21:48.480 | - It's oddly satisfying to see the different target sites
04:21:52.120 | avoiding the blood vessels and also maximizing
04:21:56.360 | the usefulness of those locations for the signal.
04:21:59.320 | It just feels good, it's like ah, that's nice.
04:22:02.040 | - As a person who has a visceral reaction
04:22:03.960 | to the brain bleeding, I can tell you--
04:22:05.480 | - Yes, especially.
04:22:06.480 | - It's extremely satisfying watching the electrodes
04:22:08.800 | themselves go into the brain and not cause bleeding.
04:22:12.080 | - Yeah, yeah, so you said the feeling was of relief
04:22:16.840 | when everything went perfectly.
04:22:18.860 | - Yeah.
04:22:20.440 | - How deep in the brain can you currently go
04:22:24.600 | and eventually go?
04:22:27.520 | Let's say on the neural link side,
04:22:29.560 | it seems the deeper you go in the brain,
04:22:32.100 | the more challenging it becomes.
04:22:34.620 | - Yeah, so talking broadly about neurosurgery,
04:22:38.240 | we can get anywhere.
04:22:39.240 | It's routine for me to put deep brain-stimulating electrodes
04:22:44.080 | near the very bottom of the brain,
04:22:47.840 | entering from the top and passing about a two millimeter
04:22:52.360 | wire all the way into the bottom of the brain.
04:22:55.700 | And that's not revolutionary, a lot of people do that.
04:22:59.320 | And we can do that with very high precision.
04:23:01.560 | I use a robot from Globus to do that surgery.
04:23:07.840 | You know, several times a month.
04:23:10.520 | It's pretty routine.
04:23:12.520 | - What are your eyes in that situation?
04:23:14.280 | What are you seeing, what kind of technology can you use
04:23:17.840 | to visualize where you are to light your way?
04:23:20.760 | - Yeah, so it's a cool process on the software side.
04:23:24.440 | You take a preoperative MRI that's extremely
04:23:27.040 | high resolution data of the entire brain.
04:23:30.340 | You put the patient to sleep, put their head in a frame
04:23:34.800 | that holds the skull very rigidly.
04:23:37.400 | And then you take a CT scan of their head
04:23:40.640 | while they're asleep with that frame on,
04:23:43.000 | and then merge the MRI and the CT in software.
04:23:47.720 | You have a plan based on the MRI where you can see
04:23:51.400 | these nuclei deep in the brain.
04:23:53.580 | You can't see them on CT, but if you trust the merging
04:23:58.320 | of the two images, then you indirectly know on the CT
04:24:01.960 | where that is, and therefore indirectly know where,
04:24:06.400 | in reference to the titanium frame screwed to their head,
04:24:10.040 | those targets are.
04:24:11.680 | And so this is '60s technology to manually compute
04:24:16.080 | trajectories given the entry point and target,
04:24:18.880 | and dial in some goofy-looking titanium hardware
04:24:27.560 | with manual actuators with little tick marks on them.
04:24:31.820 | The modern version of that is to use a robot,
04:24:35.160 | just like a little KUKA arm you might see
04:24:40.080 | building cars at the Tesla factory.
04:24:42.560 | This small robot arm can show you the trajectory
04:24:45.880 | that you intended from the pre-op MRI,
04:24:49.180 | and establish a very rigid holder through which
04:24:53.040 | you can drill a small hole in the skull,
04:24:55.520 | and pass a small rigid wire deep into that area of the brain
04:24:59.200 | that's hollow, and put your electrode through
04:25:02.120 | that hollow wire, and then remove all of that
04:25:04.320 | except the electrode.
04:25:06.200 | So you end up with the electrode very, very precisely placed
04:25:10.320 | far from the skull's surface.
04:25:12.640 | Now that's standard technology
04:25:15.000 | that's already been out in the world for a while.
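
A minimal sketch of the registration chain just described (MRI to CT to the skull-mounted frame), with invented names and assuming the two rigid transforms have already been computed by the navigation software: a target and entry point picked on the MRI are mapped into frame coordinates, giving a trajectory direction and depth that can be dialed in manually or handed to the robot arm.

```python
import numpy as np

def plan_trajectory(target_mri, entry_mri, T_mri_to_ct, T_ct_to_frame):
    """Map an MRI-space target and entry point into frame coordinates.

    target_mri, entry_mri:      3-vectors in MRI space (mm)
    T_mri_to_ct, T_ct_to_frame: 4x4 homogeneous rigid transforms
    Returns (entry_frame, unit_direction, depth_mm).
    """
    def apply(T, p):
        ph = np.append(np.asarray(p, dtype=float), 1.0)  # homogeneous coords
        return (T @ ph)[:3]

    # Chain the registrations: MRI -> CT -> frame screwed to the skull.
    T_mri_to_frame = T_ct_to_frame @ T_mri_to_ct
    target_f = apply(T_mri_to_frame, target_mri)
    entry_f = apply(T_mri_to_frame, entry_mri)
    direction = target_f - entry_f
    depth_mm = np.linalg.norm(direction)
    return entry_f, direction / depth_mm, depth_mm
```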
04:25:21.360 | Neuralink right now is focused entirely on cortical targets,
04:25:25.680 | surface targets, because there's no trivial way
04:25:29.760 | to get, say, hundreds of wires deep inside the brain
04:25:34.540 | without doing a lot of damage.
04:25:36.760 | So your question, what do you see?
04:25:39.680 | Well, I see an MRI on a screen.
04:25:41.600 | I can't see everything that that DBS electrode
04:25:44.600 | is passing through on its way to that deep target.
04:25:48.040 | And so it's accepted with this approach
04:25:51.300 | that there's gonna be about one in 100 patients
04:25:54.360 | who have a bleed somewhere in the brain
04:25:57.920 | as a result of passing that wire blindly
04:26:01.840 | into the deep part of the brain.
04:26:03.680 | That's not an acceptable safety profile for Neuralink.
04:26:09.280 | We start from the position that we want this to be
04:26:14.280 | dramatically, maybe two or three orders of magnitude
04:26:17.120 | safer than that.
04:26:19.660 | Safe enough, really, that you or I
04:26:22.440 | without a profound medical problem
04:26:24.440 | might on our lunch break someday say,
04:26:27.200 | "Yeah, sure, I'll get that.
04:26:28.400 | I've been meaning to upgrade to the latest version."
04:26:31.760 | And so the safety constraints given that are high.
04:26:36.760 | And so we haven't settled on a final solution
04:26:42.320 | for arbitrarily approaching deep targets in the brain.
04:26:46.320 | - It's interesting 'cause you have to avoid
04:26:47.880 | blood vessels somehow and you have to,
04:26:50.540 | maybe there's creative ways of doing the same thing
04:26:52.380 | like mapping out high resolution geometry of blood vessels
04:26:55.740 | and then you can go in blind.
04:26:57.360 | But how do you map out that in a way that's super stable?
04:27:02.860 | It's in there.
04:27:03.700 | There's a lot of interesting challenges there, right?
04:27:05.060 | - Yeah.
04:27:06.100 | - But there's a lot to do on the surface, luckily.
04:27:07.820 | - Exactly.
04:27:08.660 | So we've got vision on the surface.
04:27:10.760 | We actually have made a huge amount of progress
04:27:14.740 | sewing electrodes into the spinal cord
04:27:18.680 | as a potential workaround for a spinal cord injury
04:27:23.680 | that would allow a brain mounted implant
04:27:25.920 | to translate motor intentions to a spine mounted implant
04:27:29.920 | that can affect muscle contractions
04:27:33.480 | in previously paralyzed arms and legs.
04:27:36.200 | - That's just incredible.
04:27:37.280 | So like the effort there is to try to bridge the brain
04:27:41.320 | to the spinal cord to the periphery, peripheral nervous.
04:27:44.780 | So how hard is that to do?
04:27:47.740 | - We have that working in very crude forms in animals.
04:27:52.740 | - That's amazing.
04:27:53.740 | - Yeah, we've done it.
04:27:54.580 | - So similar to, like, with Noland,
04:27:56.140 | where he's able to digitally move the cursor.
04:27:59.740 | Here you're doing the same kind of communication
04:28:03.860 | but with the actual effectors that you have.
04:28:06.620 | - Yeah.
04:28:07.620 | - That's fascinating.
04:28:08.960 | - So we have anesthetized animals doing grasp
04:28:13.880 | and moving their legs in a sort of walking pattern.
04:28:18.580 | Again, early days.
04:28:20.600 | But the future is bright for this kind of thing
04:28:23.760 | and people with paralysis should look forward
04:28:27.300 | to that bright future.
04:28:28.960 | They're gonna have options.
04:28:30.280 | - Yeah, and there's a lot of sort of intermediate
04:28:33.920 | or extra options where you take, like, an Optimus robot,
04:28:37.440 | like the arm and to be able to control the arm.
04:28:42.360 | - Yeah.
04:28:43.200 | - The fingers and hands of the arm as a prosthetic.
04:28:47.420 | - Exoskeletons are getting better too.
04:28:49.480 | - Exoskeletons, yeah, so that goes hand in hand.
04:28:54.480 | Although I didn't quite understand until thinking
04:28:56.900 | about it deeper and doing more research about Neuralink,
04:29:00.320 | how much you can do on the digital side.
04:29:02.880 | So there's digital telepathy.
04:29:04.920 | - Yeah.
04:29:05.760 | - I didn't quite understand that you could really map
04:29:09.220 | the intention as you described in the hand knob area,
04:29:14.220 | that you can map the intention.
04:29:17.760 | Just imagine it, think about it.
04:29:20.520 | That intention can be mapped to actual action
04:29:23.080 | in the digital world.
04:29:24.320 | - Right.
04:29:25.160 | - And now more and more, so much can be done
04:29:26.800 | in the digital world that it can reconnect you
04:29:30.840 | to the outside world.
04:29:32.320 | It can allow you to have freedom, have independence
04:29:35.720 | if you're a quadriplegic.
04:29:37.600 | - Yeah.
04:29:38.440 | - That's really powerful.
04:29:39.260 | Like you can go really far with that.
04:29:40.680 | - Yeah, our first participant is, he's incredible.
04:29:44.360 | He's breaking world records left and right.
04:29:46.700 | - And he's having fun with it, it's great.
04:29:48.800 | Just going back to the surgery, your whole journey.
04:29:54.920 | You mentioned to me offline, you have surgery on Monday.
04:29:58.240 | So you're like, you're doing surgery all the time.
04:30:00.860 | - Yeah.
04:30:01.700 | - It's a ridiculous question.
04:30:02.640 | What does it take to get good at surgery?
04:30:04.960 | - Practice, repetitions.
04:30:06.680 | You just, same with anything else.
04:30:09.320 | There's a million ways of people saying the same thing
04:30:12.120 | and selling books saying it, but you call it 10,000 hours,
04:30:15.400 | you call it, spend some chunk of your life,
04:30:18.160 | some percentage of your life focusing on this,
04:30:21.240 | obsessing about getting better at it.
04:30:23.200 | Repetitions, humility, recognizing that you aren't perfect
04:30:31.260 | at any stage along the way, recognizing you've got
04:30:34.920 | improvements to make in your technique,
04:30:37.840 | being open to feedback and coaching from people
04:30:40.980 | with a different perspective on how to do it.
04:30:43.240 | And then just the constant will to do better.
04:30:49.820 | That fortunately, if you're not a sociopath,
04:30:53.680 | I think your patients bring that with them
04:30:56.200 | to the office visits every day.
04:30:58.460 | They force you to want to do better all the time.
04:31:01.800 | - Yeah, to step up.
04:31:03.280 | I mean, it's a real human being,
04:31:05.240 | a real human being that you can help.
04:31:07.280 | - Yeah.
04:31:08.120 | - So every surgery, even if it's the same exact surgery,
04:31:11.720 | is there a lot of variability between that surgery
04:31:13.880 | and a different person?
04:31:15.280 | - Yeah, a fair bit.
04:31:16.280 | I mean, a good example for us is the angle of the skull
04:31:21.280 | relative to the normal plane of the body axis
04:31:31.760 | over hand knob; there's pretty wide variation.
04:31:31.760 | I mean, some people have really flat skulls
04:31:34.960 | and some people have really steeply angled skulls
04:31:37.680 | over that area.
04:31:38.520 | And that has consequences for how their head can be fixed
04:31:43.520 | in sort of the frame that we use
04:31:47.680 | and how the robot has to approach the skull.
04:31:50.880 | And yeah, people's bodies are built as differently
04:31:56.240 | as the people you see walking down the street,
04:31:59.200 | as much variability in body shape and size
04:32:02.160 | as you see there,
04:32:03.480 | we see in brain anatomy and skull anatomy;
04:32:06.380 | there are some people who we've had to kind of exclude
04:32:11.280 | from our trial for having skulls that are too thick
04:32:14.480 | or too thin or scalp that's too thick or too thin.
04:32:17.600 | I think we have like the middle 97% or so of people,
04:32:24.360 | but you can't account for all human anatomy variability.
04:32:29.360 | - How much, like, mushiness and mess is there?
04:32:33.800 | 'Cause I, you know, taking biology classes,
04:32:36.480 | the diagrams are always really clean and crisp.
04:32:39.320 | Neuroscience, the pictures of neurons
04:32:41.720 | are always really nice and very.
04:32:43.760 | But whenever I look at pictures of like real brains,
04:32:49.040 | they're all, I don't know what is going on.
04:32:51.960 | - Yeah.
04:32:52.800 | - How messy are biological systems in reality?
04:32:55.920 | Like how hard is it to figure out what's going on?
04:32:59.060 | - Not too bad.
04:33:00.160 | Once you really get used to this,
04:33:02.340 | that's where experience and skill and education
04:33:08.120 | really come into play is if you stare at a thousand brains,
04:33:12.280 | it becomes easier to kind of mentally peel back the,
04:33:17.880 | say for instance, blood vessels that are obscuring
04:33:20.600 | the sulci and gyri, kind of the wrinkle pattern
04:33:24.480 | of the surface of the brain.
04:33:26.440 | Occasionally, when you're first starting to do this
04:33:28.880 | and you open the skull, it doesn't match
04:33:32.480 | what you thought you were gonna see based on the MRI.
04:33:35.120 | And with more experience, you learn to kind of peel back
04:33:42.360 | that layer of blood vessels and see the underlying pattern
04:33:46.960 | of wrinkles in the brain and use that as a landmark
04:33:50.320 | for where you are.
04:33:51.680 | - The wrinkles are a landmark?
04:33:53.280 | So like-- - Yeah.
04:33:54.240 | So I was describing hand knob earlier.
04:33:56.440 | That's a pattern of the wrinkles in the brain.
04:33:58.640 | It's sort of this Greek letter omega shaped area
04:34:03.080 | of the brain.
04:34:04.240 | - So you could recognize the hand knob area,
04:34:06.080 | like if I show you a thousand brains
04:34:09.480 | and give you like one minute with each,
04:34:11.160 | you'd be like, yep, that's that.
04:34:12.480 | - Sure.
04:34:13.520 | - And so there is some uniqueness to that area of the brain,
04:34:16.800 | like in terms of the geometry, the topology of the thing.
04:34:19.960 | - Yeah.
04:34:20.800 | - Where is it about in the--
04:34:23.560 | - So you have this strip of brain running down the top,
04:34:27.720 | called the primary motor area.
04:34:29.480 | And I'm sure you've seen this picture of the homunculus
04:34:33.080 | laid over the surface of the brain,
04:34:34.480 | the weird little guy with huge lips and giant hands.
04:34:38.460 | That guy sort of lays with his legs up at the top
04:34:43.480 | of the brain and face arm areas farther down
04:34:47.800 | and then some kind of mouth, lip, tongue areas farther down.
04:34:52.800 | And so the hand is right in there
04:34:57.400 | and then the areas that control speech,
04:34:59.880 | at least on the left side of the brain in most people
04:35:02.880 | are just below that.
04:35:05.100 | And so any muscle that you voluntarily move in your body,
04:35:10.100 | the vast majority of that references that strip
04:35:14.880 | or those intentions come from that strip of brain
04:35:17.680 | and the wrinkle for hand knob
04:35:20.640 | is right in the middle of that.
04:35:22.840 | - And vision is back here.
04:35:24.960 | - Yep.
04:35:25.800 | - Also close to the surface.
04:35:27.760 | - Vision's a little deeper.
04:35:28.960 | And so this gets to your question
04:35:31.640 | about how deep can you get.
04:35:33.060 | To do vision, we can't just do the surface of the brain.
04:35:38.540 | We have to be able to go in,
04:35:40.080 | not as deep as we have to go for DBS,
04:35:43.520 | but maybe a centimeter deeper than we're used to
04:35:47.180 | for hand insertions.
04:35:49.140 | And so that's work in progress.
04:35:52.620 | That's a new set of challenges to overcome.
04:35:55.620 | - By the way, you mentioned the Utah array, right?
04:35:58.040 | And I just saw a picture of that
04:36:00.400 | and that thing looks terrifying.
04:36:02.440 | - Yeah, the nails.
04:36:03.680 | - Because it's rigid.
04:36:05.960 | And then if you look at the threads, they're flexible.
04:36:09.600 | What can you say that's interesting to you
04:36:11.020 | about the flexible,
04:36:12.560 | that kind of approach of the flexible threads
04:36:15.160 | to deliver the electrodes next to the neurons?
04:36:18.180 | - Yeah, I mean, the goal there comes from experience.
04:36:21.960 | I mean, we stand on the shoulders of people
04:36:23.780 | that made Utah arrays and used Utah arrays for decades
04:36:28.180 | before we ever even came along.
04:36:29.920 | Neuralink arose partly,
04:36:34.140 | this approach to technology arose out of a need
04:36:37.140 | recognized after Utah arrays would fail routinely
04:36:43.120 | because the rigid electrodes,
04:36:44.900 | those spikes that are literally hammered
04:36:48.940 | using an air hammer into the brain,
04:36:51.400 | those spikes generate a bad immune response
04:36:56.700 | that encapsulates the electrode spikes
04:37:00.820 | in scar tissue, essentially.
04:37:04.260 | And so one of the projects that was being worked on
04:37:06.800 | in the Andersen Lab at Caltech when I got there
04:37:11.180 | was to see if you could use chemotherapy
04:37:14.560 | to prevent the formation of scars.
04:37:16.540 | Like, you know, things are pretty bad
04:37:18.760 | when you're jamming a bed of nails into the brain
04:37:21.280 | and then treating that with chemotherapy
04:37:26.280 | to try to prevent scar tissue.
04:37:27.560 | It's like, you know, maybe we've gotten off track here, guys.
04:37:29.960 | Maybe there's a fundamental redesign necessary.
04:37:32.800 | And so Neuralink's approach
04:37:34.600 | of using highly flexible, tiny electrodes
04:37:39.560 | avoids a lot of the bleeding,
04:37:41.780 | avoids a lot of the immune response
04:37:43.500 | that ends up happening when rigid electrodes
04:37:46.260 | are pounded into the brain.
04:37:48.020 | And so what we see is our electrode longevity
04:37:51.540 | and functionality and the health of the brain tissue
04:37:55.320 | immediately surrounding the electrode is excellent.
04:37:59.220 | I mean, it goes on for years now in our animal models.
04:38:03.580 | - What do most people not understand
04:38:05.220 | about the biology of the brain?
04:38:07.300 | We mentioned the vasculature, that's really interesting.
04:38:10.200 | - I think the most interesting,
04:38:12.360 | maybe underappreciated fact
04:38:14.320 | is that it really does control almost everything.
04:38:18.840 | I mean, I don't know, an out-of-the-blue example:
04:38:23.280 | imagine you want a lever on fertility.
04:38:25.880 | You wanna be able to turn fertility on and off.
04:38:28.600 | I mean, there are legitimate targets
04:38:31.600 | in the brain itself to modulate fertility.
04:38:35.000 | Say, blood pressure.
04:38:38.260 | You wanna modulate blood pressure.
04:38:40.740 | There are legitimate targets in the brain for doing that.
04:38:43.540 | Things that aren't immediately obvious as brain problems
04:38:49.460 | are potentially solvable in the brain.
04:38:51.820 | And so I think it's an underexplored area
04:38:58.700 | for primary treatments of all the things that bother people.
04:39:04.940 | - That's a really fascinating way to look at it.
04:39:07.200 | There's a lot of conditions we might think
04:39:10.640 | have nothing to do with the brain,
04:39:12.400 | but they might just be symptoms
04:39:13.940 | of something that actually started in the brain,
04:39:15.880 | the actual source of the problem,
04:39:17.560 | the primary source is something in the brain.
04:39:19.680 | - Yeah, not always.
04:39:20.900 | I mean, kidney disease is real,
04:39:23.360 | but there are levers you can pull in the brain
04:39:27.120 | that affect all of these systems.
04:39:29.600 | - There's knobs.
04:39:32.560 | - On/off switches and knobs in the brain
04:39:35.420 | from which this all originates.
04:39:37.840 | - Would you have a Neuralink chip implanted in your brain?
04:39:42.800 | - Yeah.
04:39:43.640 | I think the use case right now is to use a mouse, right?
04:39:50.600 | I can already do that,
04:39:53.080 | and so there's no value proposition.
04:39:56.400 | On safety grounds alone, sure, I'll do it tomorrow.
04:39:59.560 | - You know, you say the use case of the mouse.
04:40:02.860 | 'Cause after like researching all this
04:40:05.220 | and part of it is just watching Noland have so much fun,
04:40:08.660 | if you can get that bits per second
04:40:10.420 | like really high with the mouse,
04:40:12.740 | like being able to interact,
04:40:14.820 | 'cause if you think about the ways on the smartphone,
04:40:17.940 | the way you swipe, that was transformational.
04:40:20.480 | - Yeah.
04:40:21.320 | - How we interact with the thing.
04:40:22.420 | It's subtle, you don't realize it,
04:40:24.340 | but to be able to touch a phone
04:40:26.200 | and to scroll with your finger,
04:40:29.200 | that's like, that changed everything.
04:40:31.240 | People were sure you need a keyboard to type.
04:40:33.500 | There's a lot of HCI aspects to that
04:40:39.100 | that changed how we interact with computers.
04:40:41.940 | So there could be a certain rate of speed with the mouse
04:40:46.220 | that would change everything.
04:40:48.020 | - Yes.
04:40:48.860 | - It's like you might be able to just click
04:40:49.780 | around a screen extremely fast.
04:40:52.260 | And that, if it, I can see myself getting a Neuralink
04:40:58.620 | for much more rapid interaction with digital devices.
04:41:03.000 | - Yeah, I think recording speech intentions
04:41:06.240 | from the brain might change things as well.
04:41:09.200 | The value proposition for the average person.
04:41:11.440 | A keyboard is a pretty clunky human interface,
04:41:16.320 | requires a lot of training.
04:41:18.320 | It's highly variable in the maximum performance
04:41:22.160 | that the average person can achieve.
04:41:27.280 | I think taking that out of the equation
04:41:31.080 | and just having a natural word to computer interface
04:41:36.080 | might change things for a lot of people.
04:41:40.560 | - It'd be hilarious if that is the reason people do it.
04:41:43.260 | Even if you have speech to text, that's extremely accurate.
04:41:46.160 | It currently isn't.
04:41:47.360 | But let's say you've gotten super accurate,
04:41:49.760 | it'd be hilarious if people went for Neuralink
04:41:52.160 | just so you avoid the embarrassing aspect of speaking,
04:41:55.600 | like looking like a douchebag speaking to your phone
04:41:57.560 | in public, which is a real, that's a real constraint.
04:42:02.560 | - Yeah, I mean, with a bone-conducting case
04:42:07.480 | that can be an invisible headphone, say,
04:42:10.660 | and the ability to think words into software
04:42:16.840 | and have it respond to you,
04:42:18.500 | that starts to sound sort of like embedded
04:42:23.520 | super intelligence, if you can silently ask
04:42:27.840 | for the Wikipedia article on any subject
04:42:30.640 | and have it read to you without any observable change
04:42:34.160 | happening in the outside world.
04:42:35.720 | For one thing, standardized testing is obsolete.
04:42:41.560 | - Yeah, if it's done well on the UX side,
04:42:46.120 | it could change, I don't know if it transforms society,
04:42:48.760 | but it really can create a kind of shift
04:42:53.320 | in the way we interact with digital devices
04:42:55.720 | in the way that a smartphone did.
04:42:57.560 | Now, I would, just having to look into the safety
04:43:01.440 | of everything involved, I would totally try it
04:43:03.820 | so it doesn't have to go to some like incredible thing
04:43:08.000 | where you have, it connects your vision
04:43:10.380 | or to some other, like it connects all over your brain.
04:43:12.600 | That could be like just connecting to the hand knob.
04:43:15.380 | You might have a lot of interesting interaction,
04:43:18.760 | human-computer interaction possibilities.
04:43:21.380 | - Yeah. - That's really interesting.
04:43:22.320 | - Yeah, and the technology on the academic side
04:43:25.560 | is progressing at light speed here.
04:43:27.200 | I think there was a really amazing paper
04:43:30.600 | out of UC Davis, Sergei Stavisky's lab,
04:43:34.080 | that basically made an initial solve of speech decode
04:43:39.080 | in something like 125,000 words
04:43:43.080 | that they were getting with very high accuracy, which is--
04:43:47.000 | - So you're just thinking the word?
04:43:48.660 | - Yeah.
04:43:49.500 | - Thinking the word and you're able to get it.
04:43:51.200 | - Yeah. - Oh, boy.
04:43:52.680 | Like you have to have the intention of speaking it.
04:43:56.420 | - Right.
04:43:57.260 | - So like do the inner voice.
04:43:59.360 | Man, it's so amazing to me that you can do the intention,
04:44:03.640 | the signal mapping.
04:44:05.780 | All you have to do is just imagine yourself doing it.
04:44:08.440 | And if you get the feedback that it actually worked,
04:44:14.520 | you can get really good at that.
04:44:16.720 | Like your brain will first of all adjust
04:44:18.240 | and you develop like any other skill.
04:44:20.320 | - Yeah. - Like touch typing,
04:44:21.480 | you develop in that same kind of way.
04:44:23.920 | That is, to me, it's just really fascinating.
04:44:26.920 | - Yeah.
04:44:27.760 | - To be able to even to play with that.
04:44:29.400 | Honestly, like I would get a Neuralink
04:44:30.840 | just to be able to play with that.
04:44:32.680 | Just to play with the capacity,
04:44:34.680 | the capability of my mind to learn this skill.
04:44:37.680 | It's like learning the skill of typing
04:44:39.360 | and learning the skill of moving a mouse.
04:44:41.080 | It's another skill of moving the mouse,
04:44:43.900 | not with my physical body, but with my mind.
04:44:47.640 | - I can't wait to see what people do with it.
04:44:49.220 | I feel like we're cavemen right now.
04:44:51.640 | We're like banging rocks with a stick
04:44:53.840 | and thinking that we're making music.
04:44:55.700 | At some point, when these are more widespread,
04:44:59.200 | there's gonna be the equivalent of a piano
04:45:02.680 | that someone can make art with their brain
04:45:06.640 | in a way that we didn't even anticipate.
04:45:08.640 | Looking forward to it.
04:45:12.280 | - Give it to like a teenager.
04:45:13.440 | Like anytime I think I'm good at something,
04:45:15.320 | I'll always go to like, I don't know,
04:45:17.240 | even with the bits per second of playing a video game.
04:45:21.700 | You realize you give a Neuralink to a teenager,
04:45:24.900 | just a large number of them,
04:45:26.640 | the kind of stuff, they get good at stuff.
04:45:30.200 | They're gonna get like hundreds of bits per second.
04:45:34.940 | - Yeah.
04:45:35.780 | - Even just with the current technology.
04:45:37.460 | - Probably, probably.
04:45:39.900 | - Just 'cause it's also addicting.
04:45:42.380 | How the number go up aspect of it,
04:45:46.580 | of like improving and training.
04:45:48.520 | 'Cause it's almost like a skill.
04:45:50.480 | And plus there's a software on the other end
04:45:52.360 | that adapts to you.
04:45:54.120 | And especially if the adapting procedure,
04:45:56.320 | the algorithm becomes better and better and better,
04:45:58.080 | you like learning together.
04:45:59.560 | - Yeah, we're scratching the surface on that right now.
04:46:01.660 | There's so much more to do.
04:46:03.360 | - So on the complete other side of it,
04:46:06.080 | you have an RFID chip implanted in you.
04:46:10.240 | - Yeah.
04:46:11.080 | - So I hear, nice.
04:46:12.420 | So this is--
04:46:13.260 | - Little subtle thing.
04:46:14.100 | - It's a passive device that you use
04:46:17.400 | for unlocking like a safe with top secrets.
04:46:20.800 | Or what do you use it for?
04:46:22.260 | What's the story behind it?
04:46:23.700 | - I'm not the first one.
04:46:24.700 | There's this whole community of weirdo biohackers
04:46:28.220 | that have done this stuff.
04:46:30.020 | And I think one of the early use cases
04:46:32.140 | was storing private crypto wallet keys and whatever.
04:46:37.140 | I dabbled in that a bit.
04:46:40.640 | And had some fun with it.
04:46:42.260 | - Do you have some Bitcoin implanted in your body somewhere?
04:46:46.140 | You can't tell where, yeah.
04:46:47.300 | - Yeah, actually, yeah.
04:46:48.920 | (Lex laughing)
04:46:50.020 | It was the modern day equivalent
04:46:51.740 | of finding change in the sofa cushions.
04:46:53.900 | (Lex laughing)
04:46:54.740 | After I put some orphan crypto on there
04:46:57.500 | that I thought was worthless
04:46:58.520 | and forgot about it for a few years.
04:47:00.760 | Went back and found that some community of people loved it
04:47:04.220 | and had propped up the value of it.
04:47:06.940 | And so it had gone up 50 fold.
04:47:09.760 | So there was a lot of change in those cushions.
04:47:11.620 | (Lex laughing)
04:47:13.620 | - That's hilarious.
04:47:14.460 | - But the primary use case is mostly
04:47:17.060 | as a tech demonstrator.
04:47:19.820 | You know, it has my business card on it.
04:47:22.140 | You can scan that in by touching it to your phone.
04:47:26.820 | It opens the front door to my house.
04:47:28.660 | You know, whatever, simple stuff.
04:47:30.420 | - But it's a cool step.
04:47:31.260 | It's a cool leap to implant something in your body.
04:47:33.980 | I mean, it has perhaps that's,
04:47:35.580 | it's a similar leap to a Neuralink.
04:47:37.980 | Because for a lot of people that kind of notion
04:47:40.640 | of putting something inside your body,
04:47:42.320 | something electronic inside a biological system
04:47:44.600 | is a big leap.
04:47:45.520 | - Yeah, we have a kind of a mysticism
04:47:47.420 | around the barrier of our skin.
04:47:50.740 | We're completely fine with knee replacements,
04:47:53.080 | hip replacements, you know, dental implants.
04:47:57.420 | But, you know, there's a mysticism still
04:48:03.920 | around the inviolable barrier that the skull represents.
04:48:07.840 | And I think that needs to be treated
04:48:10.500 | like any other pragmatic barrier.
04:48:13.340 | You know, the question isn't how incredible
04:48:18.060 | is it to open the skull?
04:48:19.100 | The question is, you know, what benefit can we provide?
04:48:21.860 | - So from all the surgeries you've done,
04:48:23.480 | from everything you understand in the brain,
04:48:26.180 | how much does neuroplasticity come into play?
04:48:28.860 | How adaptable is the brain?
04:48:30.280 | For example, just even in the case of healing from surgery
04:48:33.740 | or adapting to the post-surgery situation.
04:48:36.580 | - The answer that is sad for me
04:48:38.700 | and other people of my demographic
04:48:41.860 | is that, you know, plasticity decreases with age.
04:48:44.980 | Healing decreases with age.
04:48:47.260 | I have too much gray hair to be optimistic about that.
04:48:52.260 | There are theoretical ways to increase plasticity
04:48:55.980 | using electrical stimulation.
04:48:58.740 | Nothing that is, you know, totally proven out
04:49:02.320 | as a robust enough mechanism to offer widely to people.
04:49:06.020 | But yeah, I think there's cause for optimism
04:49:10.700 | that we might find something useful
04:49:12.540 | in terms of, say, an implanted electrode
04:49:14.780 | that improves learning.
04:49:16.780 | Certainly there's been some really amazing work recently
04:49:21.960 | from Nicholas Schiff, Jonathan Baker, you know, and others
04:49:26.100 | who have a cohort of patients
04:49:29.060 | with moderate traumatic brain injury
04:49:31.060 | who have had electrodes placed in a deep nucleus
04:49:34.460 | in the brain called the centromedian nucleus,
04:49:36.340 | or just near the centromedian nucleus.
04:49:38.940 | And when they apply small amounts of electricity
04:49:42.100 | to that part of the brain,
04:49:44.100 | it's almost like electronic caffeine.
04:49:46.420 | They're able to improve people's attention and focus.
04:49:50.820 | They're able to improve how well people can perform a task.
04:49:54.660 | I think in one case, someone who was unable to work
04:49:58.180 | after the device was turned on,
04:50:00.020 | they were able to get a job.
04:50:02.180 | And that's sort of, you know, one of the holy grails
04:50:04.980 | for me with Neuralink and other technologies like this
04:50:10.380 | is from a purely utilitarian standpoint,
04:50:13.600 | can we make people able to take care of themselves
04:50:19.540 | and their families economically again?
04:50:21.680 | Can we make it so someone who's fully dependent
04:50:24.600 | and even maybe requires a lot of caregiver resources,
04:50:28.940 | can we put them in a position to be fully independent,
04:50:32.380 | taking care of themselves,
04:50:33.340 | giving back to their communities?
04:50:34.980 | I think that's a very compelling proposition
04:50:40.020 | and what motivates a lot of what I do
04:50:41.820 | and what a lot of the people at Neuralink are working for.
04:50:44.980 | - It's just a cool possibility
04:50:46.460 | that if you put a Neuralink in there,
04:50:48.620 | that the brain adapts,
04:50:49.940 | like the other part of the brain adapts too.
04:50:52.740 | - Yeah.
04:50:53.560 | - And integrates it.
04:50:54.400 | The capacity of the brain to do that is really interesting,
04:50:56.820 | probably unknown to the degree to which it can do that,
04:51:00.840 | but you're now connecting an external thing to it,
04:51:05.140 | especially once it's doing stimulation.
04:51:09.080 | Like the biological brain
04:51:13.420 | and the electronic brain outside of it working together,
04:51:18.420 | like the possibilities there are really interesting.
04:51:20.980 | It's still unknown, but interesting.
04:51:24.540 | - It feels like the brain is really good
04:51:26.220 | at adapting to whatever.
04:51:28.180 | - Yeah.
04:51:29.020 | - But of course it is a system that by itself is already,
04:51:31.840 | like everything serves a purpose
04:51:36.460 | and so you don't wanna mess with it too much.
04:51:39.580 | - Yeah, it's like eliminating a species from an ecology.
04:51:44.580 | You don't know what the delicate interconnections
04:51:47.340 | and dependencies are.
04:51:49.980 | The brain is certainly a delicate, complex beast
04:51:54.580 | and we don't know every potential downstream consequence
04:51:59.580 | of a single change that we make.
04:52:04.260 | - Do you see yourself doing, so you mentioned P1,
04:52:08.180 | surgeries of P2, P3, P4, P5,
04:52:11.320 | just more and more and more humans?
04:52:14.140 | - I think it's a certain kind of brittleness
04:52:17.640 | or a failure on the company's side
04:52:20.100 | if we need me to do all the surgeries.
04:52:23.480 | I think something that I would very much like
04:52:28.680 | to work towards is a process that is so simple
04:52:32.740 | and so robust on the surgery side
04:52:34.560 | that literally anyone could do it.
04:52:36.700 | We wanna get away from requiring intense expertise
04:52:42.660 | or intense experience to have this successfully done
04:52:47.120 | and make it as simple and translatable as possible.
04:52:51.780 | I mean, I would love it if every neurosurgeon on the planet
04:52:54.300 | had no problem doing this.
04:52:55.840 | I think we're probably far from a regulatory environment
04:53:00.580 | that would allow people that aren't neurosurgeons to do this
04:53:04.700 | but not impossible.
04:53:06.900 | - All right, I'll sign up for that.
04:53:10.520 | Did you ever anthropomorphize the robot, R1?
04:53:14.600 | Like, do you give it a name?
04:53:17.340 | Do you see it as like a friend
04:53:18.620 | that's like working together with you?
04:53:20.100 | - I mean, to a certain degree, it's--
04:53:21.660 | - Or an enemy who's gonna take the job?
04:53:23.220 | (laughing)
04:53:25.940 | - To a certain degree, yeah, it's a complex relationship.
04:53:29.700 | - All the good relationships are.
04:53:32.900 | - It's funny when in the middle of the surgery,
04:53:35.120 | there's a part of it where I stand basically
04:53:37.660 | shoulder to shoulder with the robot.
04:53:40.060 | And so, you know, if you're in the room
04:53:43.340 | reading the body language, you know,
04:53:44.940 | that's my brother in arms there.
04:53:46.800 | We're working together on the same problem.
04:53:49.940 | Yeah, I'm not threatened by it.
04:53:53.740 | - Keep telling yourself that, yeah.
04:53:57.140 | How have all the surgeries that you've done over the years,
04:54:01.680 | the people you've helped and the stakes,
04:54:05.780 | the high stakes that you've mentioned,
04:54:07.420 | how has that changed your understanding of life and death?
04:54:11.460 | - Yeah, you know, it gives you a very visceral sense.
04:54:16.460 | And this may sound trite,
04:54:23.480 | but it gives you a very visceral sense
04:54:25.160 | that death is inevitable.
04:54:27.520 | You know, on one hand, you know, you are,
04:54:32.520 | as a neurosurgeon, you're deeply involved
04:54:34.240 | in these like just hard to fathom tragedies.
04:54:40.480 | You know, young parents dying,
04:54:42.800 | leaving, you know, a four-year-old behind, let's say.
04:54:46.220 | And on the other hand, you know,
04:54:52.520 | it takes the sting out of it a bit
04:54:54.040 | because you see how just mind-numbingly universal death is.
04:54:59.040 | There's zero chance that I'm going to avoid it.
04:55:06.360 | I know, you know, techno-optimists right now
04:55:10.760 | and longevity buffs right now would disagree
04:55:13.460 | on that 0.000% estimate,
04:55:18.120 | but I don't see any chance
04:55:21.480 | that our generation is going to avoid it.
04:55:24.320 | Entropy is a powerful force,
04:55:25.760 | and we are very ornate, delicate, brittle DNA machines
04:55:30.720 | that aren't up to the cosmic ray bombardment
04:55:33.680 | that we're subjected to.
04:55:35.960 | So on the one hand,
04:55:37.760 | every human that has ever lived died or will die.
04:55:43.120 | On the other hand, it's just one of the hardest things
04:55:49.000 | to imagine inflicting on anyone that you love
04:55:54.000 | is having them gone.
04:55:55.680 | I mean, I'm sure you've had friends
04:55:56.760 | that aren't living anymore,
04:55:58.880 | and it's hard to even think about them.
04:56:03.200 | And so I wish I had arrived at the point of nirvana
04:56:08.200 | where death doesn't have a sting, I'm not worried about it,
04:56:16.000 | but I can at least say that I'm comfortable
04:56:18.740 | with the certainty of it,
04:56:19.980 | if not having found out how to take the tragedy out of it
04:56:26.200 | when I think about my kids either not having me
04:56:31.360 | or me not having them or my wife.
04:56:34.320 | - Maybe I've come to accept
04:56:36.600 | the intellectual certainty of it,
04:56:37.940 | but it may be the pain that comes
04:56:42.940 | with losing the people you love.
04:56:45.320 | I don't think I've come to understand
04:56:46.680 | the existential aspect of it,
04:56:51.400 | like that this is gonna end.
04:56:54.080 | And I don't mean like in some trite way.
04:56:59.900 | I mean, like it certainly feels like it's not going to end,
04:57:04.900 | like you live life like it's not going to end.
04:57:08.280 | - Right.
04:57:09.400 | - And the fact that this light that's shining,
04:57:11.840 | this consciousness is going to no longer be
04:57:16.640 | at one moment, maybe today.
04:57:18.680 | It's like, it fills me when I really am able
04:57:23.000 | to load all that in with Ernest Becker's terror.
04:57:27.200 | Like it's a real fear.
04:57:28.520 | I think people aren't always honest
04:57:30.540 | with how terrifying it is.
04:57:32.740 | - Yeah.
04:57:33.900 | - I think the more you are able to really think through it,
04:57:36.260 | the more terrifying it is.
04:57:38.200 | It's not such a simple thing.
04:57:39.740 | Oh, well, it's the way life is.
04:57:41.180 | And if you really can load that in, it's hard.
04:57:44.580 | But I think that's why the Stoics did it
04:57:49.220 | because it like helps you get your shit together
04:57:52.340 | and be like, well, like the moment,
04:57:55.300 | every single moment you're alive is just beautiful.
04:57:58.800 | - Yeah.
04:57:59.640 | - And it's terrifying that it's going to end.
04:58:01.120 | It's like, almost like you're shivering in the cold,
04:58:06.120 | a child helpless, this kind of feeling.
04:58:10.240 | - Yeah.
04:58:11.080 | - And then it makes you, when you have warmth,
04:58:13.280 | when you have the safety, when you have the love,
04:58:15.760 | to really appreciate it.
04:58:17.000 | I feel like sometimes in your position,
04:58:22.340 | when you mentioned armor, just to see death,
04:58:25.700 | it might make you not be able to see that,
04:58:30.460 | the finiteness of life,
04:58:32.380 | because if you kept looking at that, it might break you.
04:58:36.780 | So it's good to know that you're kind of
04:58:40.220 | still struggling with that.
04:58:42.180 | There's the neurosurgeon and then there's a human.
04:58:45.440 | - Yeah.
04:58:46.280 | - And the human is still able to struggle with that
04:58:48.700 | and feel the fear of that and the pain of that.
04:58:51.580 | - Yeah, it definitely makes you ask the question
04:58:54.640 | of how long, how many of these can you see
04:58:59.240 | and not say, "I can't do this anymore."
04:59:02.500 | But I mean, you said it well.
04:59:08.840 | I think it gives you an opportunity
04:59:12.080 | to just appreciate that you're alive today.
04:59:14.280 | And I've got three kids and an amazing wife
04:59:20.760 | and I'm really happy.
04:59:23.940 | Things are good.
04:59:24.940 | I get to help on a project that I think matters.
04:59:27.220 | I think it moves us forward.
04:59:28.980 | I'm a very lucky person.
04:59:30.820 | - It's the early steps of a potentially
04:59:33.300 | gigantic leap for humanity.
04:59:37.620 | It's a really interesting one.
04:59:39.100 | And it's cool 'cause you read about all this stuff
04:59:41.860 | in history where it's like the early days.
04:59:43.860 | I've been reading, before going to the Amazon,
04:59:46.220 | I would read about explorers.
04:59:48.780 | They would go and explore even the Amazon jungle
04:59:51.280 | for the first time.
04:59:52.200 | It's just, those are the early steps.
04:59:54.960 | Or early steps into space, early steps in any discipline
04:59:58.240 | in physics and mathematics.
04:59:59.880 | And it's cool 'cause this is like, on the grand scale,
05:00:04.360 | these are the early steps into delving
05:00:07.560 | deep into the human brain.
05:00:09.480 | So not just observing the brain,
05:00:10.960 | but you'll be able to interact with the human brain.
05:00:13.760 | It's gonna help a lot of people,
05:00:14.960 | but it also might help us understand
05:00:19.280 | what the hell's going on in there.
05:00:20.840 | - Yeah, I think ultimately we wanna give people
05:00:24.640 | more levers that they can pull, right?
05:00:26.840 | Like you wanna give people options.
05:00:28.960 | If you can give someone a dial that they can turn
05:00:33.240 | on how happy they are,
05:00:35.840 | I think that makes people really uncomfortable.
05:00:40.300 | But now talk about major depressive disorder,
05:00:45.300 | talk about people that are committing suicide
05:00:47.700 | at an alarming rate in this country,
05:00:49.620 | and try to justify that queasiness in that light.
05:00:55.500 | You can give people a knob to take away suicidal ideation,
05:01:04.180 | suicidal intention.
05:01:06.100 | I would give them that knob.
05:01:08.780 | I don't know how you justify not doing that.
05:01:11.360 | - Yeah, you can think about all the suffering
05:01:13.280 | that's going on in the world.
05:01:14.400 | Every single human being that's suffering right now,
05:01:17.520 | it'll be a glowing red dot.
05:01:18.760 | The more suffering, the more it's glowing.
05:01:20.440 | And you just see the map of human suffering.
05:01:22.840 | And any technology that allows you to dim
05:01:25.280 | that light of suffering on a grand scale
05:01:29.400 | is pretty exciting.
05:01:30.800 | 'Cause there's a lot of people suffering,
05:01:32.120 | and most of them suffer quietly.
05:01:33.920 | And we turn away, we look away too often.
05:01:38.920 | And we should remember those that are suffering.
05:01:44.180 | 'Cause once again, most of them are suffering quietly.
05:01:46.820 | - Well, and on a grander scale, the fabric of society.
05:01:50.460 | People have a lot of complaints about
05:01:52.940 | how our social fabric is working or not working,
05:01:57.080 | how our politics is working or not working.
05:02:01.680 | Those things are made of neurochemistry too, in aggregate.
05:02:06.680 | Our politics is composed of individuals with human brains.
05:02:12.220 | And the way it works or doesn't work
05:02:15.940 | is potentially tunable in the sense that, I don't know,
05:02:20.940 | say, remove our addictive behaviors,
05:02:24.300 | or tune our addictive behaviors for social media,
05:02:27.060 | or our addiction to outrage,
05:02:29.140 | our addiction to sharing the most angry
05:02:32.980 | political tweet we can find.
05:02:35.400 | I don't think that leads to a functional society.
05:02:41.100 | And if you had options for people
05:02:46.100 | to moderate that maladaptive behavior,
05:02:51.120 | there could be huge benefits to society.
05:02:54.300 | Maybe we could all work together a little more harmoniously
05:02:58.300 | toward useful ends.
05:03:00.020 | - There's a sweet spot, like you mentioned.
05:03:02.020 | You don't want to completely remove
05:03:03.540 | all the dark sides of human nature.
05:03:05.860 | 'Cause those are somehow necessary
05:03:08.820 | to make the whole thing work, but there's a sweet spot.
05:03:11.060 | - Yeah, I agree.
05:03:12.220 | You gotta suffer a little.
05:03:14.520 | Just not so much that you lose hope.
05:03:16.460 | - Yeah.
05:03:17.440 | When you, all the surgeries you've done,
05:03:19.060 | have you seen consciousness in there ever?
05:03:21.040 | Was there like a glowing light?
05:03:22.540 | - You know, I have this sense that I never found it,
05:03:27.420 | never removed it, like a Dementor in Harry Potter.
05:03:31.260 | I have this sense that consciousness
05:03:34.460 | is a lot less magical than our instincts
05:03:37.940 | wanna claim it is.
05:03:39.420 | It seems to me like a useful analog
05:03:44.920 | for thinking about what consciousness is in the brain
05:03:48.620 | is that we have a really good intuitive understanding
05:03:55.460 | of what it means to, say, touch your skin
05:03:59.000 | and know what's being touched.
05:04:00.500 | I think consciousness is just that level
05:04:04.980 | of sensory mapping applied to the thought processes
05:04:08.940 | in the brain itself.
05:04:10.140 | So what I'm saying is, consciousness is the sensation
05:04:14.660 | of some part of your brain being active.
05:04:17.760 | So you feel it working.
05:04:20.380 | You feel the part of your brain that thinks of red things
05:04:23.660 | or winged creatures or the taste of coffee.
05:04:28.660 | You feel those parts of your brain being active
05:04:33.400 | the way that I'm feeling my palm being touched, right?
05:04:36.800 | And that sensory system that feels the brain working
05:04:41.800 | is consciousness.
05:04:43.740 | - That is so brilliant.
05:04:45.340 | It's the same way, it's the sensation of touch
05:04:48.180 | when you're touching a thing.
05:04:50.380 | Consciousness is the sensation of you feeling
05:04:53.740 | your brain working, your brain thinking,
05:04:56.900 | your brain perceiving.
05:04:59.100 | - Which isn't like a warping of space-time
05:05:03.260 | or some quantum field effect, right?
05:05:06.060 | It's nothing magical.
05:05:07.100 | People always wanna ascribe to consciousness
05:05:10.380 | something truly different.
05:05:13.740 | And there's this awesome long history of people
05:05:17.060 | looking at whatever the latest discovery in physics is
05:05:20.340 | to explain consciousness because it's the most magical,
05:05:24.140 | the most out there thing that you can think of
05:05:27.220 | and people always wanna do that with consciousness.
05:05:30.660 | I don't think that's necessary.
05:05:31.860 | It's just a very useful and gratifying way
05:05:36.540 | of feeling your brain work.
05:05:38.300 | - And as we said, it's one heck of a brain.
05:05:40.580 | - Yeah.
05:05:41.900 | - Everything we see around us, everything we love,
05:05:43.940 | everything that's beautiful came from brains like these.
05:05:48.660 | - It's all electrical activity happening inside your skull.
05:05:52.380 | - And I, for one, am grateful that it's people like you
05:05:56.980 | that are exploring all the ways that it works
05:06:01.300 | and all the ways it can be made better.
05:06:03.980 | - Thanks, Lex.
05:06:04.820 | - Thank you so much for talking today.
05:06:06.380 | - It's been a joy.
05:06:08.060 | - Thanks for listening to this conversation
05:06:09.860 | with Matthew MacDougall.
05:06:12.020 | And now, dear friends, here's Bliss Chapman,
05:06:16.140 | Brain Interface Software Lead at Neuralink.
05:06:19.140 | You told me that you've met hundreds of people
05:06:22.780 | with spinal cord injuries or with ALS
05:06:25.620 | and that your motivation for helping at Neuralink
05:06:28.420 | is grounded in wanting to help them.
05:06:30.620 | Can you describe this motivation?
05:06:32.080 | - Yeah.
05:06:32.920 | First, just a thank you to all the people
05:06:34.760 | I've gotten a chance to speak with
05:06:36.380 | for sharing their stories with me.
05:06:38.400 | I don't think there's any world, really,
05:06:40.260 | in which I can share their stories
05:06:42.460 | in as powerful a way as they can.
05:06:44.400 | But just, I think, to summarize at a very high level
05:06:47.300 | what I hear over and over again
05:06:48.780 | is that people with ALS or severe spinal cord injury
05:06:53.780 | in a place where they basically
05:06:54.860 | can't move physically anymore,
05:06:57.080 | really, at the end of the day,
05:06:58.580 | are looking for independence.
05:06:59.740 | And that can mean different things for different people.
05:07:02.280 | For some folks, it can mean the ability
05:07:04.260 | just to be able to communicate again independently
05:07:06.300 | without needing to wear something on their face,
05:07:08.620 | without needing a caretaker
05:07:09.620 | to be able to put something in their mouth.
05:07:11.860 | For some folks, it can mean independence
05:07:13.700 | to be able to work again,
05:07:15.180 | to be able to navigate a computer digitally
05:07:17.660 | efficiently enough to be able to get a job,
05:07:19.300 | to be able to support themselves,
05:07:21.140 | to be able to move out
05:07:22.000 | and ultimately be able to support themselves
05:07:23.380 | after their family maybe isn't there anymore
05:07:25.580 | to take care of them.
05:07:27.380 | And for some folks, it's as simple as
05:07:29.740 | just being able to respond to their kid in time
05:07:31.540 | before they run away or get interested in something else.
05:07:35.700 | And these are deeply personal
05:07:39.440 | and sort of very human problems.
05:07:42.460 | And what strikes me again and again
05:07:44.420 | when talking with these folks is that
05:07:46.100 | this is actually an engineering problem.
05:07:48.540 | This is a problem that with the right resources,
05:07:51.020 | with the right team, we can make a lot of progress on.
05:07:54.300 | And at the end of the day,
05:07:56.300 | I think that's a deeply inspiring message
05:07:58.140 | and something that makes me excited to get up every day.
05:08:01.180 | - So it's both an engineering problem
05:08:03.120 | in terms of a BCI, for example,
05:08:05.140 | that can give them capabilities
05:08:06.940 | where they can interact with the world,
05:08:08.540 | but also on the other side,
05:08:10.420 | it's an engineering problem for the rest of the world
05:08:12.340 | to make it more accessible
05:08:13.580 | for people living with quadriplegia.
05:08:15.940 | - Yeah, and I see,
05:08:16.980 | I'll take a broad view sort of lens on this for a second.
05:08:19.580 | I think I'm very in favor of anyone
05:08:22.420 | working in this problem space.
05:08:23.620 | So beyond BCI, I'm happy and excited
05:08:26.460 | and willing to support in any way I can
05:08:28.040 | folks working on eye tracking systems,
05:08:29.540 | working on speech to text systems,
05:08:31.700 | working on head trackers or mouse sticks or quad sticks.
05:08:35.020 | I've met many engineers and folks in the community
05:08:37.020 | that do exactly those things.
05:08:38.980 | And I think for the people we're trying to help,
05:08:41.740 | it doesn't matter what the complexity of the solution is
05:08:43.900 | as long as the problem is solved.
05:08:45.980 | And I want to emphasize that
05:08:48.220 | there can be many solutions out there
05:08:49.620 | that can help with these problems.
05:08:51.420 | And BCI is one of a collection of such solutions.
05:08:54.980 | So BCI in particular,
05:08:56.140 | I think offers several advantages here.
05:08:58.260 | And I think the folks that recognize this immediately
05:09:00.580 | are usually the people who have spinal cord injury
05:09:02.300 | or some form of paralysis.
05:09:03.780 | Usually you don't have to explain to them
05:09:04.960 | why this might be something that could be helpful.
05:09:06.580 | It's usually pretty self-evident,
05:09:07.700 | but for the rest of us,
05:09:09.100 | folks that don't live with severe spinal cord injury
05:09:11.180 | or who don't know somebody with ALS,
05:09:14.060 | it's not often obvious why you would want a brain implant
05:09:16.260 | to be able to connect and navigate a computer.
05:09:18.700 | And it's surprisingly nuanced, to the degree
05:09:20.420 | that I've learned a huge amount just working with Noland
05:09:23.420 | in the first Neuralink clinical trial
05:09:25.700 | and understanding from him and his words
05:09:27.500 | why this device is impactful for him.
05:09:30.380 | And it's a nuanced topic.
05:09:31.620 | It can be the case that
05:09:32.960 | even if you can achieve the same thing, for example,
05:09:34.940 | with a mouse stick when navigating a computer,
05:09:37.100 | he doesn't have access to that mouse stick
05:09:38.540 | every single minute of the day.
05:09:39.620 | He only has access when someone is available
05:09:41.220 | to put it in front of him.
05:09:42.620 | And so a BCI can really offer a level of independence
05:09:45.580 | and autonomy that,
05:09:46.880 | if it wasn't literally physically part of your body,
05:09:50.140 | would be hard to achieve in any other way.
05:09:52.420 | - So there's a lot of fascinating aspects
05:09:54.900 | to what it takes to get Noland
05:09:56.620 | to be able to control a cursor on the screen with his mind.
05:09:59.780 | You texted me something that I just love.
05:10:02.100 | You said, "I was part of the team
05:10:03.380 | "that interviewed and selected P1.
05:10:04.980 | "I was in the operating room
05:10:06.660 | "during the first human surgery
05:10:07.980 | "monitoring live signals coming out of the brain.
05:10:10.340 | "I work with the user basically every day
05:10:12.840 | "to develop new UX paradigms, decoding strategies.
05:10:16.020 | "And I was part of the team that figured out
05:10:18.900 | "how to recover useful BCI to new world record levels
05:10:22.100 | "when the signal quality degraded."
05:10:24.600 | We'll talk about, I think, every aspect of that.
05:10:26.900 | But just zooming out,
05:10:28.900 | what was it like to be a part of that team
05:10:33.220 | and part of that historic, I would say, historic first?
05:10:38.220 | - Yeah, I think, for me,
05:10:39.780 | this is something I've been excited about
05:10:41.220 | for close to 10 years now.
05:10:42.900 | And so to be able to be even just some small part
05:10:45.700 | of making it a reality is extremely exciting.
05:10:48.540 | A couple maybe special moments during that whole process
05:10:53.740 | that I'll never really truly forget,
05:10:56.380 | one of them is during the actual surgery.
05:11:00.860 | At that point in time, I know Noland quite well.
05:11:03.500 | I know his family.
05:11:04.780 | And so I think the initial reaction
05:11:06.420 | when Noland is rolled into the operating room
05:11:08.820 | is just an "Oh shit" kind of reaction.
05:11:11.780 | But at that point, muscle memory kicks in
05:11:13.700 | and you sort of go into,
05:11:15.100 | you let your body just do all the talking.
05:11:19.300 | And I have the lucky job in that particular procedure
05:11:21.940 | to just be in charge of monitoring the implant.
05:11:24.380 | So my job is to sit there,
05:11:25.620 | to look at the signals coming off the implant,
05:11:27.500 | to look at the live brain data streaming off the device
05:11:29.740 | as threads are being inserted into the brain,
05:11:31.820 | and just to basically observe and make sure
05:11:33.260 | that nothing is going wrong
05:11:35.260 | or that there's no red flags or fault conditions
05:11:37.020 | that we need to go and investigate
05:11:38.020 | or pause the surgery to debug.
05:11:40.900 | And because I had that sort of spectator view of the surgery,
05:11:44.580 | I had a slightly more removed perspective
05:11:45.980 | than I think most folks in the room had.
05:11:47.260 | I got to sit there and think to myself,
05:11:49.260 | "Wow, that brain is moving a lot."
05:11:52.340 | When you look inside the little craniectomy
05:11:54.220 | that we stick the threads in,
05:11:55.940 | one thing that most people don't realize is the brain moves.
05:11:59.340 | The brain moves a lot when you breathe,
05:12:00.980 | when your heart beats, and you can see it visibly.
05:12:04.940 | So that's something that I think was a surprise to me
05:12:08.060 | and very, very exciting to be able to see someone's brain
05:12:11.700 | who you physically know and have talked with at length
05:12:13.980 | actually pulsing and moving inside their skull.
05:12:15.860 | - And they used that brain to talk to you previously,
05:12:18.500 | and now it's right there moving.
05:12:19.860 | - Yep.
05:12:21.140 | - Actually, I didn't realize that
05:12:22.860 | in terms of the threads sending.
05:12:24.060 | So the Neuralink implant is active during surgery.
05:12:28.300 | So, one thread at a time,
05:12:30.260 | you're able to start seeing the signal.
05:12:32.620 | So that's part of the way you test that the thing is working.
05:12:35.220 | - Yeah, so actually in the operating room,
05:12:37.220 | right after we finished all the thread insertions,
05:12:41.020 | I started collecting what's called broadband data.
05:12:43.340 | So broadband is basically the most raw form of signal
05:12:47.260 | you can collect from a Neuralink electrode.
05:12:49.260 | It's essentially a measurement of the local field potential
05:12:53.300 | or the voltage measured by that electrode.
05:12:57.100 | And we have a certain mode in our application
05:12:59.700 | that allows us to visualize where detected spikes are.
05:13:02.100 | So it visualizes sort of where in the broadband signal,
05:13:05.700 | in its very, very raw form,
05:13:07.380 | a neuron is actually spiking.
05:13:09.180 | And so one of these moments that I'll never forget
05:13:12.100 | as part of this whole clinical trial
05:13:13.300 | is seeing live in the operating room
05:13:15.660 | while he's still under anesthesia,
05:13:17.300 | beautiful spikes being shown in the application,
05:13:19.740 | just streaming live to a device I'm holding in my hand.
05:13:22.660 | - So this is the raw data, no signal processing,
05:13:25.180 | and then the signal processing is on top of it,
05:13:27.060 | where you're seeing the spikes detected.
05:13:28.660 | - Right, yeah.
05:13:29.500 | - And that's a UX too, 'cause that looks beautiful as well.
05:13:35.820 | - During that procedure,
05:13:36.900 | there were actually a lot of cameramen in the room.
05:13:38.420 | So they also were curious and wanted to see,
05:13:40.460 | there's several neurosurgeons in the room
05:13:42.100 | who are all just excited to see robots taking their job.
05:13:44.780 | And they're all crowded around a small little iPhone
05:13:47.660 | watching this live brain data stream out of his brain.
05:13:51.140 | - What was that like seeing the robot
05:13:53.180 | do some of the surgery?
05:13:54.100 | So the computer vision aspect where it detects
05:13:56.460 | all the spots that avoid the blood vessels,
05:14:01.460 | and then obviously with human supervision,
05:14:03.660 | then actually doing the really high precision
05:14:07.940 | connection of the threads to the brain.
05:14:10.860 | - Yeah, that's a good question.
05:14:11.940 | My answer's gonna be pretty lame here, but it was boring.
05:14:15.060 | - Yeah.
05:14:15.900 | - I've seen it so many times.
05:14:17.340 | Yeah, that's exactly how you want surgery to be.
05:14:19.300 | You want it to be boring.
05:14:20.220 | - Yeah.
05:14:21.060 | - 'Cause I've seen it so many times.
05:14:22.500 | I've seen the robot do this surgery
05:14:24.980 | literally hundreds of times.
05:14:27.020 | And so it was just one more time.
05:14:29.300 | - Yeah, all the practice surgeries and the proxies,
05:14:32.580 | and this is just another day.
05:14:33.980 | - Yeah.
05:14:35.540 | - So what about when Noland woke up?
05:14:37.940 | Well, do you remember a moment where he was able
05:14:42.260 | to move the cursor, not move the cursor,
05:14:44.380 | but get signal from the brain such that it was able
05:14:47.340 | to show that there's a connection?
05:14:49.820 | - Yeah.
05:14:50.660 | So we are quite excited to move as quickly as we can,
05:14:53.660 | and Noland was really, really excited to get started.
05:14:56.060 | He wanted to get started actually the day of surgery,
05:14:58.500 | but we waited 'til the next morning, very patiently.
05:15:02.300 | It's a long night.
05:15:03.180 | And the next morning in the ICU where he was recovering,
05:15:08.420 | he wanted to get started and actually start to understand
05:15:11.420 | what kind of signal we can measure from his brain.
05:15:14.180 | And maybe for folks who are not familiar
05:15:15.860 | with the Neuralink system, we implant the Neuralink system,
05:15:19.260 | or the Neuralink implant in the motor cortex.
05:15:21.540 | So the motor cortex is responsible
05:15:22.740 | for representing things like motor intent.
05:15:25.780 | So if you imagine closing and opening your hand,
05:15:28.060 | that kind of signal representation would be present
05:15:29.860 | in the motor cortex.
05:15:31.340 | If you imagine moving your arm back and forth
05:15:33.340 | or wiggling a pinky, this sort of signal
05:15:35.260 | can be present in the motor cortex.
05:15:37.060 | So one of the ways we start to sort of map out
05:15:39.460 | what kind of signal do we actually have access to
05:15:41.340 | in any particular individual's brain
05:15:42.780 | is through this task called body mapping.
05:15:44.780 | And body mapping is where you essentially present a visual
05:15:46.740 | to the user and you say, "Hey, imagine doing this."
05:15:49.580 | And that visual is a 3D hand opening and closing
05:15:53.060 | or index finger modulating up and down.
05:15:55.660 | And you ask the user to imagine that.
05:15:57.340 | And obviously you can't see them do this
05:15:58.980 | 'cause they're paralyzed.
05:15:59.860 | So you can't see them actually move their arm.
05:16:02.380 | But while they do this task, you can record neural activity
05:16:05.620 | and you can basically offline model and check,
05:16:07.860 | can I predict or can I detect the modulation
05:16:10.380 | corresponding with those different actions?
05:16:12.300 | And so we did that task and we realized,
05:16:13.660 | hey, there's actually some modulation associated
05:16:15.780 | with some of his hand motion,
05:16:17.100 | which was a first indication that, okay,
05:16:18.860 | we can potentially use that modulation
05:16:20.340 | to do useful things in the world.
05:16:22.220 | For example, control a computer cursor.
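That offline check, "can I detect modulation corresponding to the different imagined actions?", can be prototyped as a simple cross-validated classification problem on binned spike counts. The data and the nearest-centroid classifier below are synthetic stand-ins, just to show the shape of the analysis, not how Neuralink actually runs it.

```python
import numpy as np

rng = np.random.default_rng(6)
n_channels, trials_per_cue = 1024, 40
cues = ["close_hand", "open_hand", "wiggle_pinky"]

# Synthetic stand-in: each imagined action modulates channels a bit differently.
base = rng.uniform(1, 5, n_channels)
profiles = {c: np.maximum(base + rng.normal(0, 1.5, n_channels), 0.1) for c in cues}
X = np.vstack([rng.poisson(profiles[c], size=(trials_per_cue, n_channels)) for c in cues])
y = np.repeat(np.arange(len(cues)), trials_per_cue)

# Leave-one-out nearest-centroid check: is the modulation decodable at all?
correct = 0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    centroids = np.stack([X[mask][y[mask] == k].mean(axis=0) for k in range(len(cues))])
    pred = np.argmin(np.linalg.norm(centroids - X[i], axis=1))
    correct += pred == y[i]
print(f"offline decodability: {correct / len(y):.0%} (chance would be ~33%)")
```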
05:16:24.260 | And he started playing with it
05:16:26.180 | the first time we showed him it.
05:16:27.140 | And we actually just took the same live view
05:16:28.620 | of his brain activity and put it in front of him.
05:16:30.660 | And we said, "Hey, you tell us what's going on."
05:16:33.260 | We're not you.
05:16:34.100 | You're able to imagine different things.
05:16:36.140 | And we know that it's modulating some of these neurons.
05:16:38.060 | So you figure out for us what that is actually representing.
05:16:41.580 | And so he played with it for a bit.
05:16:43.060 | He was like, "I don't quite get it yet."
05:16:44.820 | He played for a bit longer and he said,
05:16:46.660 | "Oh, when I move this finger,
05:16:49.060 | I see this particular neuron start to fire more."
05:16:51.660 | And I said, "Okay, prove it, do it again."
05:16:53.820 | And so he said, "Okay, three, two, one, boom."
05:16:56.820 | And the minute he moved,
05:16:57.660 | you can see instantaneously this neuron is firing.
05:17:00.860 | Single neuron, I can tell you the exact channel number
05:17:03.180 | if you're interested.
05:17:04.300 | That's stuck in my brain now forever.
05:17:06.020 | But that single channel firing was a beautiful indication
05:17:09.060 | that it was behaviorally modulated neural activity
05:17:11.660 | that could then be used for downstream tasks
05:17:13.300 | like decoding a computer cursor.
05:17:15.060 | - And when you say single channel,
05:17:16.300 | is that associated with a single electrode?
05:17:18.620 | - Yes, channel and electrode are interchangeable.
05:17:20.820 | - And there's 1,024 of those.
05:17:23.420 | - 1,024, yeah.
05:17:25.580 | - It's incredible that that works.
05:17:27.420 | That really, when I was learning about all this
05:17:31.300 | and loading it in, it was just blowing my mind
05:17:35.380 | that the intention, you can visualize yourself
05:17:37.780 | moving the finger, that can turn into a signal.
05:17:41.620 | And the fact that you can then skip that step
05:17:43.740 | and visualize the cursor moving
05:17:45.660 | or have the intention of the cursor moving
05:17:49.180 | and that leading to a signal that can then be used
05:17:51.540 | to move the cursor.
05:17:52.580 | There is so many exciting things there to learn
05:17:56.620 | about the brain, about the way the brain works.
05:18:02.020 | The very fact of there existing a signal that can be used
05:18:02.020 | is really powerful.
05:18:03.340 | But it feels like that's just like the beginning
05:18:05.780 | of figuring out how that signal could be used
05:18:07.900 | really, really effectively.
05:18:10.500 | - I should also just, there's so many fascinating details
05:18:13.140 | here, but you mentioned the body mapping step.
05:18:15.740 | At least in the version I saw that Noland was showing off,
05:18:20.980 | there's like a super nice interface,
05:18:23.140 | like a graphical interface.
05:18:24.940 | It just felt like I was in the future
05:18:26.620 | 'cause I guess it visualizes you moving the hand
05:18:31.620 | and there's a very like a sexy, polished interface.
05:18:38.020 | - Hello.
05:18:39.060 | I don't know if there's a voice component,
05:18:40.980 | but it just felt like when you wake up
05:18:44.580 | in a really nice video game and this is a tutorial
05:18:46.660 | at the beginning of that video game,
05:18:48.300 | this is what you're supposed to do.
05:18:49.860 | It's cool.
05:18:50.700 | - No, I mean, the future should feel like the future.
05:18:52.220 | - But it's not easy to pull that off.
05:18:53.860 | I mean, it needs to be simple, but not too simple.
05:18:57.220 | - Yeah, and I think the UX design component here
05:18:59.700 | is underrated for BCI development in general.
05:19:03.180 | There's a whole interaction effect
05:19:06.060 | between the ways in which you visualize an instruction
05:19:08.820 | to the user and the kinds of signal you can get back.
05:19:11.380 | And that quality of sort of your behavioral alignment
05:19:13.700 | to the neural signal is a function of how good you are
05:19:16.180 | at expressing to the user what you want them to do.
05:19:18.260 | And so, yeah, we spend a lot of time thinking about the UX,
05:19:20.860 | of how we build our applications,
05:19:22.420 | of how the decoder actually functions,
05:19:23.900 | the control surfaces it provides to the user.
05:19:25.940 | All these little details matter a lot.
05:19:27.700 | - So maybe it'd be nice to get into a little bit more detail
05:19:30.900 | of what the signal looks like
05:19:32.860 | and what the decoding looks like.
05:19:34.260 | So there's an N1 implant that has,
05:19:39.260 | like we mentioned, 1,024 electrodes,
05:19:43.820 | and that's collecting raw data, raw signal.
05:19:46.700 | What does that signal look like?
05:19:48.660 | And what are the different steps along the way
05:19:51.860 | before it's transmitted?
05:19:53.420 | And what is transmitted?
05:19:54.580 | All that kind of stuff.
05:19:55.420 | - Yeah, yep.
05:19:56.620 | This is gonna be a fun one.
05:19:58.420 | Let's go.
05:19:59.460 | So maybe before diving into what we do,
05:20:02.900 | it's worth understanding what we're trying to measure
05:20:04.540 | because that dictates a lot of the requirements
05:20:06.980 | for the system that we build.
05:20:08.900 | And what we're trying to measure
05:20:09.820 | is really individual neurons producing action potentials.
05:20:12.580 | And action potential is,
05:20:13.900 | you can think of it like a little electrical impulse
05:20:16.540 | that you can detect if you're close enough.
05:20:19.260 | And by being close enough, I mean like within,
05:20:21.820 | let's say 100 microns of that cell.
05:20:24.300 | And 100 microns is a very, very tiny distance.
05:20:26.340 | And so the number of neurons that you're gonna pick up
05:20:29.700 | with any given electrode
05:20:30.980 | is just a small radius around that electrode.
05:20:33.660 | And the other thing worth understanding
05:20:35.380 | about the underlying biology here
05:20:36.660 | is that when neurons produce an action potential,
05:20:38.660 | the width of that action potential is about one millisecond.
05:20:41.540 | So from the start of the spike to the end of the spike,
05:20:43.300 | that whole width of that sort of characteristic feature
05:20:47.020 | of a neuron firing is one millisecond wide.
05:20:50.100 | And if you want to detect
05:20:51.660 | that an individual spike is occurring or not,
05:20:54.020 | you need to sample that signal
05:20:55.820 | or sample the local field potential nearby the neuron
05:20:58.580 | much more frequently than once a millisecond.
05:21:00.500 | You need to sample many, many times per millisecond
05:21:02.780 | to be able to detect that
05:21:03.620 | this is actually the characteristic waveform
05:21:05.860 | of a neuron producing an action potential.
05:21:07.940 | And so we sample across all 1,024 electrodes
05:21:11.300 | about 20,000 times a second.
05:21:13.860 | 20,000 times a second means
05:21:14.980 | for any given one millisecond window,
05:21:16.440 | we have about 20 samples that tell us
05:21:18.740 | what that exact shape of that action potential looks like.
05:21:22.060 | And once we've sort of sampled at super high rate
05:21:25.580 | the underlying electrical field nearby these cells,
05:21:29.500 | we can process that signal
05:21:30.940 | into just where do we detect a spike or where do we not?
05:21:34.900 | Sort of a binary signal one or zero,
05:21:36.460 | do we detect a spike in this one millisecond or not?
05:21:39.100 | And we do that because the actual information
05:21:42.380 | carrying sort of subspace of neural activity
05:21:47.300 | is just when are spikes occurring.
05:21:49.460 | Essentially everything that we care about for decoding
05:21:51.500 | can be captured or represented
05:21:52.620 | in the frequency characteristics of spike trains,
05:21:55.360 | meaning how often are spikes firing
05:21:57.520 | in any given window of time.
05:21:59.420 | And so that allows us to do
05:22:00.500 | sort of a crazy amount of compression
05:22:02.460 | from this very rich, high density signal
05:22:07.420 | to something that's much, much more sparse
05:22:09.060 | and compressible that can be sent out
05:22:10.260 | over a wireless radio,
05:22:12.380 | like a Bluetooth communication, for example.
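To make that compression concrete, here is a minimal sketch, in Python, of the numbers described above: 20,000 samples per second across 1,024 electrodes, grouped into one-millisecond windows of about 20 samples each and collapsed to a single spike/no-spike bit per channel per millisecond. The simple threshold rule is only a stand-in for a real detector (one such detector, template matching, comes up later in the conversation), and the data is synthetic.

```python
import numpy as np

FS = 20_000          # samples per second per electrode (from the conversation)
N_CHANNELS = 1024    # electrodes on the implant
WIN = FS // 1000     # ~20 samples per one-millisecond window

def binarize_spikes(raw, threshold_uv=-50.0):
    """Collapse raw broadband data into a binary spike mask.

    raw: array of shape (N_CHANNELS, n_samples) in microvolts.
    Returns a bool array of shape (N_CHANNELS, n_ms): one bit per channel
    per millisecond, i.e. "was a spike detected in this window or not".
    The fixed negative threshold here is a placeholder detection rule.
    """
    n_ms = raw.shape[1] // WIN
    windows = raw[:, : n_ms * WIN].reshape(raw.shape[0], n_ms, WIN)
    return (windows < threshold_uv).any(axis=2)

# One second of synthetic broadband noise, just to show the size reduction.
rng = np.random.default_rng(0)
raw = rng.normal(0.0, 10.0, size=(N_CHANNELS, FS)).astype(np.float32)
spikes = binarize_spikes(raw)

print(raw.nbytes)    # ~82 MB per second of float32 broadband
print(spikes.shape)  # (1024, 1000): one bit per channel per millisecond
```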
05:22:14.100 | - Quick tangent here.
05:22:16.420 | You mentioned electrode, neuron.
05:22:19.980 | There's a local neighborhood of neurons nearby.
05:22:23.560 | How difficult is it to, like, isolate
05:22:27.820 | where the spike came from?
05:22:30.500 | - Yeah, so there's a whole field
05:22:32.100 | of sort of academic neuroscience work
05:22:34.060 | on exactly this problem of basically given a single electrode
05:22:37.980 | or given a set of electrodes measuring a set of neurons,
05:22:40.780 | how can you sort of spike sort
05:22:43.060 | which spikes are coming from what neuron?
05:22:46.120 | And this is a problem that's pursued in academic work
05:22:48.420 | because you care about it
05:22:50.220 | for understanding what's going on
05:22:51.660 | in the underlying sort of neuroscience of the brain.
05:22:55.260 | If you care about understanding
05:22:56.360 | how the brain's representing information,
05:22:58.160 | how that's evolving through time,
05:22:59.820 | then that's a very, very important question to understand.
05:23:02.780 | For sort of the engineering side of things,
05:23:05.180 | at least at the current scale,
05:23:06.100 | if the number of neurons per electrode is relatively small,
05:23:10.380 | you can get away with
05:23:11.220 | basically ignoring that problem completely.
05:23:13.100 | You can think of it like sort of a random projection
05:23:15.100 | of neurons to electrodes.
05:23:16.880 | And there may be in some cases
05:23:18.220 | more than one neuron per electrode.
05:23:19.620 | But if that number is small enough,
05:23:21.100 | those signals can be thought of as sort of a union of the two.
05:23:25.420 | And for many applications,
05:23:26.340 | that's a totally reasonable trade-off to make
05:23:28.040 | and can simplify the problem a lot.
05:23:30.320 | And as you sort of scale out channel count,
05:23:31.960 | the relevance of distinguishing individual neurons
05:23:34.920 | becomes less important
05:23:35.760 | because you have more overall signal
05:23:37.420 | and you can start to rely on sort of correlations
05:23:39.640 | or covariance structure in the data
05:23:41.040 | to help understand when that channel is firing,
05:23:43.240 | what does that actually represent?
05:23:45.720 | 'Cause you know that when that channel is firing
05:23:47.080 | in concert with these other 50 channels,
05:23:49.360 | that means move left.
05:23:50.560 | But when that same channel is firing
05:23:51.600 | in concert with these other 10 channels,
05:23:53.040 | that means move right.
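A toy illustration of that last point, leaning on covariance structure across channels instead of spike sorting: with binned spike counts from many electrodes, you can measure which channels tend to fire together, and a downstream decoder can exploit those co-activation patterns. Everything below is synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_bins = 64, 5000

# Two hypothetical latent intentions ("left", "right"), each driving an
# overlapping group of channels; no spike sorting is done anywhere.
latents = rng.poisson(2.0, size=(2, n_bins))
mixing = np.zeros((n_channels, 2))
mixing[:40, 0] = rng.uniform(0.5, 1.5, 40)                 # one group of channels for "left"
mixing[30:, 1] = rng.uniform(0.5, 1.5, n_channels - 30)    # another group for "right"
counts = rng.poisson(mixing @ latents + 0.2)               # binned spike counts

# Channel-by-channel correlation of binned counts: the covariance structure
# a decoder can lean on even when one electrode hears several neurons.
corr = np.corrcoef(counts)
print(round(corr[0, 10], 2), round(corr[0, 60], 2))  # within-group >> across-group
```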
05:23:54.280 | - Okay, so you have to do this kind of spike detection
05:23:56.660 | on board and you have to do that super efficiently.
05:24:01.140 | So fast and not use too much power
05:24:04.380 | 'cause you don't wanna be generating too much heat.
05:24:05.920 | So it has to be a super simple signal processing step.
05:24:09.420 | - Yeah.
05:24:10.260 | - Is there some wisdom you can share
05:24:14.620 | about what it takes to overcome that challenge?
05:24:17.480 | - Yeah.
05:24:18.320 | So we've tried many different versions
05:24:20.180 | of basically turning this raw signal
05:24:22.380 | into sort of a feature
05:24:24.380 | that you might wanna send off the device.
05:24:26.100 | And I'll say that I don't think we're at the final step
05:24:28.000 | of this process.
05:24:28.840 | This is a long journey.
05:24:29.940 | We have something that works clearly today,
05:24:31.740 | but there can be many approaches that we find in the future
05:24:34.220 | that are much better than what we do right now.
05:24:36.260 | So some versions of what we do right now,
05:24:38.460 | and there's a lot of academic heritage to these ideas.
05:24:41.120 | So I don't wanna claim
05:24:42.140 | that these are original Neuralink ideas
05:24:43.600 | or anything like that.
05:24:44.740 | But one of these ideas is basically to build
05:24:47.820 | sort of like a convolutional filter almost, if you will,
05:24:49.980 | that slides across the signal
05:24:51.080 | and looks for a certain template to be matched.
05:24:54.300 | That template consists of sort of how deep
05:24:56.820 | the spike modulates, how much it recovers,
05:24:59.340 | and what the duration and window of time is
05:25:01.340 | that the whole process takes.
05:25:02.960 | And if you can see in the signal
05:25:04.060 | that that template is matched within certain bounds,
05:25:06.500 | then you can say, "Okay, that's a spike."
05:25:08.620 | One reason that approach is super convenient
05:25:10.620 | is that you can actually implement that
05:25:12.980 | extremely efficiently in hardware,
05:25:14.700 | which means that you can run it in low power
05:25:17.820 | across 1,024 channels all at once.
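A rough sketch of that template-matching idea, assuming a canonical roughly one-millisecond spike waveform: slide a window along the signal and flag a detection wherever the window both dips deeply enough and correlates strongly with the template. The thresholds and waveform below are made up for illustration; this is not the on-implant implementation.

```python
import numpy as np

FS = 20_000  # samples per second

def detect_spikes(trace, template, corr_thresh=0.8, min_depth_uv=40.0):
    """Template-matching spike detector for a single channel.

    trace:    1-D broadband signal in microvolts.
    template: canonical spike waveform, ~1 ms (20 samples at 20 kHz).
    Returns sample indices whose local window matches the template shape
    and modulates deeply enough. All parameters are illustrative.
    """
    w = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(trace) - w):
        win = trace[i:i + w]
        if -win.min() < min_depth_uv:           # how deep the spike modulates
            continue
        win_n = (win - win.mean()) / (win.std() + 1e-9)
        if np.dot(win_n, t) / w > corr_thresh:  # shape match within bounds
            hits.append(i)
    return hits

# Stylized biphasic template: sharp negative dip, slower recovery to baseline.
template = np.concatenate([np.linspace(0, -80, 6),
                           np.linspace(-80, 20, 8),
                           np.linspace(20, 0, 6)])
rng = np.random.default_rng(2)
trace = rng.normal(0, 8, FS)
trace[5000:5000 + len(template)] += template    # inject one synthetic spike
print(detect_spikes(trace, template))            # indices near 5000
```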
05:25:20.540 | Another approach that we've recently started exploring,
05:25:23.340 | and this can be combined with the spike detection approach,
05:25:26.060 | is something called spike band power.
05:25:27.460 | And the benefits of that approach are
05:25:29.420 | that you may be able to pick up some signal
05:25:31.300 | from neurons that are maybe too far away
05:25:33.220 | to be detected as a spike.
05:25:34.620 | Because the farther away you are from an electrode,
05:25:36.740 | the weaker that actual spike waveform
05:25:38.860 | will look like on that electrode.
05:25:40.620 | So you might be able to pick up population-level activity
05:25:44.520 | of things that are maybe slightly outside
05:25:46.420 | the normal recording radius,
05:25:47.680 | what neuroscientists sometimes refer to as the hash
05:25:50.220 | of activity, the other stuff that's going on.
05:25:53.100 | And you can look at across many channels
05:25:55.060 | how that background noise is behaving,
05:25:57.300 | and you might be able to get more juice
05:25:58.340 | out of the signal that way.
05:25:59.540 | But it comes at a cost.
05:26:00.460 | That signal is now a floating point representation,
05:26:02.380 | which means it's more expensive to send out over the radio.
05:26:04.660 | It means you have to find different ways to compress it
05:26:07.460 | that are different than what you can apply
05:26:08.820 | to binary signals.
05:26:09.820 | So there's a lot of different challenges
05:26:10.780 | associated with these different modalities.
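For reference, a minimal sketch of what a spike-band-power feature could look like: band-pass the broadband signal around the frequencies where spike energy tends to sit (a few hundred hertz to a few kilohertz is a common choice in the literature), then average the squared signal in each short bin. The band edges, filter, and use of SciPy here are assumptions for illustration, not Neuralink's implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 20_000  # samples per second

def spike_band_power(trace, low_hz=500.0, high_hz=3000.0, bin_ms=1):
    """Per-bin spike band power for one channel.

    Band-pass the raw trace, then average the squared signal within each bin.
    Unlike a binary spike mask, the output is floating point, so it costs
    more to transmit, but it retains energy from neurons too far away to
    show up as cleanly detectable spikes.
    """
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=FS, output="sos")
    filtered = sosfiltfilt(sos, trace)
    bin_len = FS * bin_ms // 1000
    n_bins = len(filtered) // bin_len
    binned = filtered[: n_bins * bin_len].reshape(n_bins, bin_len)
    return (binned ** 2).mean(axis=1)

rng = np.random.default_rng(3)
trace = rng.normal(0, 8, FS)     # one second of synthetic noise
power = spike_band_power(trace)
print(power.shape, power.dtype)  # (1000,) float64, versus 1 bit/ms for binary spikes
```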
05:26:12.080 | - So also in terms of communication,
05:26:14.580 | you're limited by the amount of data you can send.
05:26:17.100 | - Yeah.
05:26:18.300 | - And so, and also because you're currently using
05:26:20.860 | the Bluetooth protocol, you have to batch stuff together.
05:26:23.700 | But you have to also do this keeping the latency crazy low,
05:26:29.540 | like crazy low.
05:26:30.980 | Anything to say about the latency?
05:26:32.900 | - Yeah, this is a passion project of mine.
05:26:34.860 | So I want to build the best mouse in the world.
05:26:38.340 | I don't want to build like the Chevrolet Spark
05:26:42.180 | or whatever of electric cars.
05:26:43.100 | I want to build like the Tesla Roadster version of a mouse.
05:26:46.660 | And I really do think it's quite possible
05:26:49.300 | that within five to 10 years,
05:26:50.980 | that most e-sports competitions will be dominated
05:26:52.820 | by people with paralysis.
05:26:54.540 | This is like a very real possibility for a number of reasons.
05:26:56.740 | One is that they'll have access to the best technology
05:26:59.620 | to play video games effectively.
05:27:01.340 | The second is they have the time to do so.
05:27:02.620 | So those two factors together are particularly potent
05:27:05.340 | for e-sport competitors.
05:27:07.340 | - Unless people without paralysis
05:27:11.020 | are also allowed to implant.
05:27:12.940 | - Right.
05:27:13.780 | - Because it is another way to interact
05:27:16.820 | with a digital device.
05:27:20.140 | And there's something to that.
05:27:22.060 | If it's a fundamentally different experience,
05:27:24.820 | more efficient experience,
05:27:26.260 | even if it's not like some kind of
05:27:28.220 | full-on high bandwidth communication,
05:27:30.660 | if it's just the ability to move the mouse 10X faster,
05:27:35.660 | like the bits per second,
05:27:37.860 | if I can achieve a bits per second at 10X,
05:27:39.860 | what I can do with the mouse,
05:27:41.340 | that's a really interesting possibility of what they can do.
05:27:43.940 | Especially as you get really good at it with training.
05:27:47.740 | - It's definitely the case
05:27:48.580 | that you have a higher ceiling performance.
05:27:50.460 | Like because you don't have to buffer your intention
05:27:52.740 | through your arm, through your muscle,
05:27:54.500 | you get just by nature of having a brain implant at all,
05:27:57.180 | like 75 millisecond lead time on any action
05:28:00.100 | that you're actually trying to take.
05:28:01.500 | And there's some nuance to this.
05:28:02.460 | Like there's evidence that the motor cortex,
05:28:04.180 | you can sort of plan out sequences of action.
05:28:06.100 | So you may not get that whole benefit all the time,
05:28:08.300 | but for sort of like reaction time style games
05:28:11.180 | where you just wanna,
05:28:12.020 | somebody's over here, snipe 'em, that kind of thing.
05:28:15.300 | You actually do have just an inherent advantage
05:28:17.260 | 'cause you don't need to go through muscle.
05:28:18.860 | So the question is just how much faster can you make it?
05:28:21.140 | And we're already faster than what you would do
05:28:24.420 | if you're going through muscle from a latency point of view.
05:28:26.060 | And we're in the early stage of that.
05:28:27.300 | I think we can push it.
05:28:28.540 | Sort of our end-to-end latency right now
05:28:29.700 | from brain spike to cursor movement
05:28:31.740 | is about 22 milliseconds.
05:28:33.460 | If you think about the best mice in the world,
05:28:35.780 | the best gaming mice,
05:28:36.620 | that's about five milliseconds-ish of latency,
05:28:39.220 | depending on how you measure,
05:28:40.100 | depending how fast your screen refreshes.
05:28:41.380 | There's a lot of characteristics that matter there,
05:28:42.660 | but yeah.
05:28:43.860 | And the rough time for like a neuron in the brain
05:28:46.420 | to actually impact your command of your hand
05:28:49.020 | is about 75 milliseconds.
05:28:50.180 | So if you look at those numbers,
05:28:51.500 | you can see that we're already competitive
05:28:54.500 | and slightly faster than what you'd get
05:28:56.060 | by actually moving your hand.
05:28:58.580 | And this is something that if you ask Noland about it,
05:29:01.020 | when he moved the cursor for the first time,
05:29:02.500 | we asked him about this.
05:29:03.340 | It was something I was super curious about,
05:29:04.580 | like what does it feel like when you're modulating
05:29:07.060 | a click intention or when you're trying
05:29:08.580 | to just move the cursor to the right?
05:29:10.180 | He said it moves before he is like actually intending it to,
05:29:13.780 | which is kind of a surreal thing
05:29:15.020 | and something that, you know,
05:29:17.020 | I would love to experience myself one day.
05:29:18.500 | What is that like to have a thing just be so immediate,
05:29:20.940 | so fluid that it feels like it's happening
05:29:22.980 | before you're actually intending it to move?
05:29:25.300 | - Yeah, I suppose we've gotten used to that latency,
05:29:27.420 | that natural latency that happens.
05:29:29.140 | So is the bottleneck currently the communication?
05:29:32.980 | So like the Bluetooth communication,
05:29:34.300 | is that what's the actual bottleneck?
05:29:36.100 | I mean, there's always going to be a bottleneck,
05:29:37.420 | but what's the current bottleneck?
05:29:38.780 | - Yeah, a couple of things.
05:29:39.700 | So kind of hilariously,
05:29:41.500 | Bluetooth low energy protocol has some restrictions
05:29:45.980 | on how fast you can communicate.
05:29:47.660 | So the protocol itself establishes a standard of,
05:29:50.060 | you know, the most frequent sort of updates you can send
05:29:52.580 | are on the order of 7.5 milliseconds.
05:29:54.980 | And as we push latency down to the level
05:29:57.620 | of sort of individual spikes impacting control,
05:30:00.700 | that level of resolution,
05:30:02.460 | that kind of protocol is going to become a limiting factor
05:30:04.460 | at some scale.
05:30:06.780 | Another sort of important nuance to this
05:30:09.300 | is that it's not just the Neuralink itself
05:30:11.900 | that's part of this equation.
05:30:12.900 | If you start pushing latency sort of below the level
05:30:15.580 | of how fast screens refresh,
05:30:17.500 | then you have another problem.
05:30:18.340 | Like you need your whole system to be able to be as reactive
05:30:22.260 | as the sort of limits of what the technology can offer us.
05:30:25.220 | Like you need the screen,
05:30:26.060 | like 120 Hertz just doesn't work anymore
05:30:28.180 | if you're trying to have something respond
05:30:29.900 | at something that's at the level of one millisecond.
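As a back-of-the-envelope way to see how these pieces stack up, the figures mentioned in the conversation (a 7.5 ms BLE connection interval, roughly 22 ms end to end, a 120 Hz display, about 5 ms for a top gaming mouse, about 75 ms for the neuron-to-hand path) can be combined like this. The split of the remaining budget is a guess for illustration, not a measured breakdown.

```python
# All numbers in milliseconds; the quoted figures come from the conversation above.
BLE_CONN_INTERVAL = 7.5          # minimum BLE connection interval
END_TO_END = 22.0                # brain spike to cursor movement
DISPLAY_FRAME = 1000 / 120       # one frame at 120 Hz

# An event arriving at a random time waits, on average, half an interval
# for the next BLE connection event and half a frame for the next redraw.
avg_ble_wait = BLE_CONN_INTERVAL / 2
avg_display_wait = DISPLAY_FRAME / 2
everything_else = END_TO_END - avg_ble_wait - avg_display_wait  # guessed remainder

print(f"avg BLE batching wait:  {avg_ble_wait:.1f} ms")
print(f"avg display frame wait: {avg_display_wait:.1f} ms")
print(f"processing + decoding:  {everything_else:.1f} ms (illustrative remainder)")
print("for comparison: gaming mouse ~5 ms, neuron-to-hand ~75 ms")
```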
05:30:31.980 | - That's a really cool challenge.
05:30:33.740 | I also like that for a t-shirt.
05:30:35.220 | The best mouse in the world.
05:30:37.620 | Tell me about the receiving end.
05:30:39.260 | So the decoding step.
05:30:40.300 | Now we figured out what the spikes are.
05:30:42.740 | We've got them all together.
05:30:43.620 | Now we're sending that over to the app.
05:30:46.900 | What's the decoding step look like?
05:30:49.780 | - Yeah.
05:30:50.660 | So maybe first, what is decoding?
05:30:52.180 | I think there's probably a lot of folks listening
05:30:53.500 | that just have no clue what it means
05:30:55.020 | to decode brain activity.
05:30:56.180 | - Actually, even if we zoom out beyond that,
05:30:58.540 | what is the app?
05:30:59.620 | So there's an implant that's wirelessly communicating
05:31:04.700 | with any digital device that has an app installed.
05:31:08.700 | So maybe can you tell me at high level what the app is,
05:31:11.260 | what the software is outside of the brain?
05:31:14.580 | - Yeah.
05:31:15.420 | So maybe working backwards from the goal.
05:31:17.220 | The goal is to help someone with paralysis,
05:31:19.660 | in this case, Noland,
05:31:20.900 | be able to navigate his computer independently.
05:31:23.220 | And we think the best way to do that
05:31:25.140 | is to offer them the same tools
05:31:26.700 | that we have to navigate our software.
05:31:28.580 | Because we don't want to have to rebuild
05:31:29.820 | an entire software ecosystem for the brain,
05:31:32.020 | at least not yet.
05:31:33.580 | Maybe someday you can imagine there's UXs
05:31:35.220 | that are built natively for BCI.
05:31:36.500 | But in terms of what's useful for people today,
05:31:39.260 | I think most people would prefer
05:31:40.740 | to be able to just control mouse and keyboard inputs
05:31:42.540 | to all the applications that they want to use
05:31:44.180 | for their daily jobs,
05:31:45.460 | for communicating with their friends, et cetera.
05:31:47.780 | And so the job of the application
05:31:49.700 | is really to translate this wireless stream of brain data
05:31:52.380 | coming off the implant into control of the computer.
05:31:55.500 | And we do that by essentially building a mapping
05:31:57.980 | from brain activity to sort of the HID inputs
05:32:00.700 | to the actual hardware.
05:32:02.660 | So HID is just the protocol for communicating
05:32:04.540 | like input device events.
05:32:06.340 | So for example, move mouse to this position
05:32:09.060 | or press this key down.
05:32:10.860 | And so that mapping is fundamentally
05:32:12.340 | what the app is responsible for.
05:32:13.940 | But there's a lot of nuance to how that mapping works
05:32:16.220 | that we spent a lot of time trying to get right.
05:32:17.780 | And we're still in the early stages of a long journey
05:32:19.660 | to figure out how to do that optimally.
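To make the app's role concrete, here is a bare-bones sketch of that last hop from decoded intent to HID-style input events. The `HidEvent` type and the `emit` function are hypothetical placeholders; a real application would hand these to the operating system's input stack or a virtual HID device rather than printing them.

```python
from dataclasses import dataclass

@dataclass
class HidEvent:
    """Minimal stand-in for a HID input report (hypothetical, for illustration)."""
    kind: str          # "mouse_move", "mouse_down", "mouse_up"
    dx: int = 0
    dy: int = 0
    button: str = ""

def emit(event: HidEvent) -> None:
    # Hypothetical sink: print instead of talking to the OS input stack.
    print(event)

def decoded_intent_to_hid(vx: float, vy: float, click: bool, dt_ms: float = 15.0) -> None:
    """Translate one decoder output (a cursor velocity plus an already-thresholded
    click decision) into HID-style events covering the last dt_ms of time."""
    emit(HidEvent("mouse_move", dx=round(vx * dt_ms), dy=round(vy * dt_ms)))
    if click:
        emit(HidEvent("mouse_down", button="left"))
        emit(HidEvent("mouse_up", button="left"))

decoded_intent_to_hid(vx=0.8, vy=-0.2, click=True)
```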
05:32:21.780 | So one part of that process is decoding.
05:32:23.740 | So decoding is this process
05:32:24.940 | of taking the statistical patterns of brain data
05:32:27.540 | that's being channeled across this Bluetooth connection
05:32:29.420 | to the application and turning it into,
05:32:31.700 | for example, a mouse movement.
05:32:33.580 | And that decoding step,
05:32:34.420 | you can think of it in a couple of different parts.
05:32:35.940 | So similar to any machine learning problem,
05:32:37.900 | there's a training step and there's an inference step.
05:32:40.220 | The training step in our case
05:32:41.460 | is a very intricate behavioral process
05:32:45.700 | where the user has to imagine doing different actions.
05:32:48.260 | So for example, they'll be presented a screen
05:32:50.380 | with a cursor on it,
05:32:51.460 | and they'll be asked to push that cursor to the right.
05:32:54.220 | Then imagine pushing that cursor to the left,
05:32:56.020 | push it up, push it down.
05:32:57.540 | And we can basically build up a pattern,
05:32:59.100 | or using any sort of modern ML method,
05:33:01.660 | a mapping: given this brain data
05:33:04.980 | and the imagined behavior, map one to the other.
05:33:07.500 | And then at test time,
05:33:08.340 | you take that same pattern matching system,
05:33:09.900 | in our case, it's a deep neural network,
05:33:11.460 | and you run it and you take the live stream
05:33:13.260 | of brain data coming off their implant.
05:33:14.740 | You decode it by pattern matching
05:33:16.100 | to what you saw at calibration time.
05:33:18.100 | And you use that for control of the computer.
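The training and inference steps just described can be sketched compactly. The example below uses a plain ridge-regression readout on binned spike counts as a stand-in; as noted in the conversation, the real system uses a deep neural network, and the calibration data here is entirely synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
n_channels, n_bins = 1024, 4000

# --- Calibration ("training") -----------------------------------------------
# Synthetic stand-in for calibration data: cued 2-D cursor velocities and
# spike counts whose per-channel rates depend on those velocities.
cued_velocity = rng.uniform(-1, 1, size=(n_bins, 2))   # what the user is asked to imagine
tuning = rng.normal(0, 0.5, size=(2, n_channels))      # unknown "true" channel tuning
rates = np.exp(0.5 + cued_velocity @ tuning).clip(max=20)
spike_counts = rng.poisson(rates)                      # shape (n_bins, n_channels)

# Ridge-regression mapping from spike counts (plus a bias term) to velocity.
X = np.hstack([spike_counts, np.ones((n_bins, 1))])
lam = 10.0
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ cued_velocity)

# --- Inference ("test time") -------------------------------------------------
def decode(counts: np.ndarray) -> np.ndarray:
    """Map one bin of spike counts (length n_channels) to a 2-D cursor velocity."""
    return np.append(counts, 1.0) @ W

new_counts = rng.poisson(np.exp(0.5 + np.array([0.7, -0.3]) @ tuning).clip(max=20))
print(decode(new_counts))  # roughly recovers the direction of the imagined (0.7, -0.3)
```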
05:33:20.780 | Now, a couple like sort of rabbit holes
05:33:22.460 | that I think are quite interesting.
05:33:24.380 | One of them has to do with how you build
05:33:26.220 | that best template matching system.
05:33:28.580 | Because there's a variety of behavioral challenges
05:33:32.580 | and also debugging challenges
05:33:33.740 | when you're working with someone who's paralyzed.
05:33:35.140 | Because again, fundamentally,
05:33:36.100 | you don't observe what they're trying to do.
05:33:37.460 | You can't see them attempt to move their hand.
05:33:39.820 | And so you have to figure out a way
05:33:41.100 | to instruct the user to do something
05:33:43.580 | and validate that they're doing it correctly,
05:33:45.740 | such that then you can downstream build with confidence,
05:33:49.540 | the mapping between the neural spikes
05:33:51.300 | and the intended action.
05:33:53.140 | And by doing the action correctly,
05:33:55.220 | what I really mean is at this level of resolution
05:33:57.580 | of what neurons are doing.
05:33:59.740 | So if in ideal world,
05:34:01.340 | you could get a signal of behavioral intent
05:34:04.300 | that is ground truth accurate
05:34:06.100 | at the scale of sort of one millisecond resolution,
05:34:08.540 | then with high confidence,
05:34:09.820 | I could build a mapping from my neural spikes
05:34:11.820 | to that behavioral intention.
05:34:13.500 | But the challenge is again,
05:34:14.500 | that you don't observe what they're actually doing.
05:34:17.180 | And so there's a lot of nuance
05:34:18.500 | to how you build user experiences
05:34:19.780 | that give you more than just sort of a coarse,
05:34:21.780 | on-average correct representation
05:34:23.380 | of what the user's intending to do.
05:34:25.020 | If you want to build the world's best mouse,
05:34:26.700 | you really want it to be as responsive as possible.
05:34:29.740 | You want it to be able to do exactly
05:34:31.100 | what the user's intending
05:34:31.940 | at every sort of step along the way,
05:34:33.060 | not just on average be correct
05:34:34.780 | when you're trying to move it from left to right.
05:34:36.740 | And building a behavioral sort of calibration game
05:34:40.380 | or sort of software experience
05:34:42.140 | that gives you that level of resolution
05:34:43.540 | is what we spend a lot of time working on.
05:34:44.620 | - So the calibration process,
05:34:47.100 | the interface has to encourage precision,
05:34:51.460 | meaning like whatever it does,
05:34:52.940 | it should be super intuitive
05:34:54.540 | that the next thing the human is going to likely do
05:34:58.580 | is exactly that intention that you need
05:35:01.260 | and only that intention.
05:35:03.420 | - Yeah.
05:35:04.260 | - And you don't have any feedback
05:35:05.940 | except maybe them speaking to you afterwards
05:35:09.140 | about what they actually did.
05:35:10.220 | You can't, oh yeah.
05:35:11.500 | - Right.
05:35:12.340 | - So that's fundamentally,
05:35:14.860 | that is a really exciting UX challenge
05:35:17.660 | 'cause that's all on the UX.
05:35:19.380 | It's not just about being friendly or nice or usable.
05:35:23.060 | - Yeah.
05:35:23.900 | - It's like user experience is how it works.
05:35:25.500 | - It's how it works.
05:35:26.700 | - Yeah.
05:35:27.540 | - So the calibration,
05:35:29.540 | at least at this stage of Neuralink,
05:35:31.340 | is like fundamental to the operation of the thing,
05:35:35.540 | and not just calibration
05:35:36.820 | but continued calibration essentially.
05:35:39.340 | - Yeah.
05:35:40.180 | - Wow, yeah.
05:35:41.020 | - You said something that I think
05:35:41.860 | is worth exploring there a little bit.
05:35:43.300 | You said it's primarily a UX challenge
05:35:45.300 | and I think a large component of it is,
05:35:46.900 | but there is also a very interesting
05:35:49.420 | machine learning challenge here
05:35:50.380 | which is given some dataset
05:35:53.940 | including some on average correct behavior
05:35:56.900 | of asking the user to move up or move down,
05:35:59.180 | move right, move left
05:36:00.740 | and given a dataset of neural spikes,
05:36:02.260 | is there a way to infer
05:36:03.780 | in some kind of semi-supervised
05:36:05.300 | or entirely unsupervised way
05:36:06.900 | what that high resolution version of their intention is?
05:36:10.300 | And if you think about it,
05:36:11.420 | there probably is
05:36:12.500 | because there are enough data points in the dataset,
05:36:15.140 | enough constraints on your model
05:36:16.820 | that there should be a way
05:36:17.780 | with the right sort of formulation
05:36:19.700 | to let the model figure it out itself.
05:36:20.980 | For example, at this millisecond,
05:36:22.180 | this is exactly how hard they're pushing upwards
05:36:24.900 | and at this millisecond,
05:36:25.740 | this is how hard they're trying to push upwards.
05:36:27.420 | - It's really important to have very clean labels, yes?
05:36:30.460 | So like the problem becomes much harder
05:36:32.300 | from a machine learning perspective
05:36:33.580 | if the labels are noisy.
05:36:35.580 | - That's correct.
05:36:36.420 | - And then to get the clean labels,
05:36:38.260 | that's a UX challenge.
05:36:40.540 | - Correct.
05:36:41.380 | Although clean labels,
05:36:42.340 | I think maybe it's worth exploring
05:36:43.700 | what that exactly means.
05:36:45.020 | I think any given labeling strategy
05:36:47.180 | will have some number of assumptions it makes
05:36:48.740 | about what the user is attempting to do.
05:36:50.580 | Those assumptions can be formulated in a loss function
05:36:53.300 | or they can be formulated in terms of heuristics
05:36:55.020 | that you might use to just try to estimate
05:36:56.500 | or guesstimate what the user is trying to do.
05:36:58.740 | And what really matters
05:36:59.580 | is how accurate those assumptions are.
05:37:01.340 | For example, you might say,
05:37:03.140 | "Hey, user, push upwards
05:37:04.580 | and follow the speed of this cursor."
05:37:06.980 | And your heuristic might be
05:37:08.220 | that they're trying to do
05:37:09.180 | exactly what that cursor is doing.
05:37:10.780 | Another competing heuristic might be
05:37:12.420 | they're actually trying to go slightly faster
05:37:13.860 | at the beginning of the movement
05:37:14.740 | and slightly slower at the end.
05:37:16.380 | And those competing heuristics
05:37:17.460 | may or may not be accurate reflections
05:37:18.900 | of what the user is trying to do.
05:37:20.660 | Another version of the task might be,
05:37:21.980 | "Hey, user, imagine moving this cursor a fixed offset.
05:37:25.540 | So rather than follow the cursor,
05:37:26.980 | just try to move it exactly 200 pixels to the right.
05:37:29.580 | So here's the cursor, here's the target.
05:37:31.500 | Okay, cursor disappears.
05:37:32.580 | Try to move that now invisible cursor
05:37:34.540 | 200 pixels to the right."
05:37:36.180 | And the assumption in that case
05:37:37.140 | would be that the user can actually modulate
05:37:39.180 | correctly that position offset.
05:37:41.220 | But that position offset assumption
05:37:42.700 | might be a weaker assumption
05:37:44.580 | and therefore potentially you can make it more accurate
05:37:47.140 | than these heuristics that are trying to guesstimate
05:37:48.740 | at each millisecond what the user is trying to do.
05:37:50.980 | So you can imagine different tasks
05:37:52.660 | that make different assumptions
05:37:53.740 | about the nature of the user intention.
05:37:55.980 | And those assumptions being correct
05:37:58.140 | is what I would think of as a clean label.
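As a toy illustration of the competing labeling heuristics just described, the sketch below builds two different per-bin velocity labels for the same cued "push right" trial: one assumes the user tracks the cue exactly, the other assumes a bell-shaped speed profile covering the same distance. The shapes and numbers are assumptions for illustration only.

```python
# Two competing per-bin labeling heuristics for a single 1-D "push right" trial.
# Which one better matches what the user actually tried to do is the "clean label" question.
import numpy as np

t = np.linspace(0.0, 1.0, 1000)            # a 1 s trial at ~1 ms resolution
dt = t[1] - t[0]
cue_velocity = np.full_like(t, 200.0)      # cue: cursor glides right at 200 px/s

# Heuristic A: the user tracks the cue exactly.
labels_follow_cue = cue_velocity.copy()

# Heuristic B: the user pushes harder at the start and eases off at the end,
# with the same 200 px total displacement (bell-shaped speed profile).
bell = np.sin(np.pi * t) ** 2
labels_bell = bell * (200.0 / (bell.sum() * dt))

print(labels_follow_cue.sum() * dt, labels_bell.sum() * dt)  # both cover ~200 px
```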
05:37:59.780 | - For that step, what are we supposed to be visualizing?
05:38:02.780 | There's a cursor and you want to move that cursor
05:38:05.340 | to the right or the left, up and down,
05:38:06.820 | or maybe move them by a certain offset.
05:38:09.420 | So that's one way.
05:38:11.380 | Is that the best way to do calibration?
05:38:13.260 | So for example, an alternative crazy way
05:38:15.460 | that probably is playing a role here
05:38:17.540 | is a game like WebGrid,
05:38:18.740 | where you're just getting a very large amount of data,
05:38:23.900 | the person playing a game,
05:38:25.940 | where if they are in a state of flow,
05:38:28.620 | maybe you can get clean signal as a side effect.
05:38:33.620 | - Yep.
05:38:34.900 | - Or is that not an effective way for initial calibration?
05:38:38.540 | - Yeah, great question.
05:38:39.380 | There's a lot to unpack there.
05:38:40.420 | So the first thing I would draw a distinction
05:38:43.700 | between is sort of open loop versus closed loop.
05:38:46.140 | So open loop, what I mean by that is the user
05:38:48.020 | is sort of going from zero to one.
05:38:49.740 | They have no model at all,
05:38:51.140 | and they're trying to get to the place
05:38:52.340 | where they have some level of control at all.
05:38:55.060 | In that setup, you really need to have some task
05:38:58.020 | that gives the user a hint of what you want them to do,
05:38:59.820 | such that you can build this mapping again
05:39:01.180 | from brain data to output.
05:39:03.980 | Then once they have a model,
05:39:05.020 | you could imagine them using that model
05:39:07.460 | and actually adapting to it
05:39:08.900 | and figuring out the right way to use it themselves,
05:39:11.020 | and then retraining on that data
05:39:12.380 | to give you sort of a boost in performance.
05:39:14.660 | There's a lot of challenges associated
05:39:15.980 | with both of these techniques,
05:39:16.940 | and we can sort of rabbit hole into both of them
05:39:18.420 | if you're interested,
05:39:19.260 | but the sort of challenge with the open loop task
05:39:21.900 | is that the user themself doesn't get
05:39:24.220 | proprioceptive feedback about what they're doing.
05:39:26.620 | They don't necessarily perceive themself
05:39:29.500 | or feel the mouse under their hand
05:39:31.780 | when they're trying to do an open loop calibration.
05:39:34.660 | They're being asked to perform something.
05:39:36.100 | Like imagine if you sort of had your whole right arm numbed
05:39:39.460 | and you stuck it in a box and you couldn't see it.
05:39:41.780 | So you had no visual feedback
05:39:42.900 | and you had no proprioceptive feedback
05:39:44.380 | about what the position or activity of your arm was.
05:39:47.060 | And now you're asked, okay,
05:39:47.940 | given this thing on the screen
05:39:48.900 | that's moving from left to right, match that speed.
05:39:51.980 | And you basically can try your best
05:39:53.860 | to invoke whatever that imagined action is in your brain
05:39:57.900 | that's moving the cursor from left to right.
05:39:59.820 | But in any situation, you're gonna be inaccurate
05:40:02.500 | and maybe inconsistent in how you do that task.
05:40:04.980 | And so that's sort of the fundamental challenge
05:40:06.220 | of open loop.
05:40:07.340 | The challenge with closed loop is that
05:40:09.260 | once the user's given a model
05:40:11.180 | and they're able to start moving the mouse on their own,
05:40:15.500 | they're going to very naturally adapt to that model.
05:40:18.340 | And that co-adaptation
05:40:19.740 | between the model learning what they're doing
05:40:21.620 | and the user learning how to use the model
05:40:23.660 | may not find you the best sort of global minima.
05:40:25.740 | It may be that your first model was noisy in some ways,
05:40:28.900 | or maybe just had some like quirk.
05:40:32.020 | There's some like part of the data distribution
05:40:33.500 | it didn't cover super well.
05:40:34.940 | And the user now figures out
05:40:36.500 | because they're a brilliant user like Noland,
05:40:38.500 | they figure out the right sequence of imagined motions
05:40:41.300 | or the right angle they have to hold their hand at
05:40:42.700 | to get it to work.
05:40:43.780 | And they'll get it to work great,
05:40:45.060 | but then the next day they come back to their device
05:40:46.980 | and maybe they don't remember exactly all the tricks
05:40:48.620 | that they used the previous day.
05:40:49.660 | And so there's a complicated sort of feedback cycle here
05:40:51.940 | that can emerge
05:40:53.620 | and can make it a very, very difficult debugging process.
05:40:56.540 | - Okay, there's a lot of really fascinating things there.
05:40:59.980 | Yeah, actually just to stay on the closed loop.
05:41:07.260 | I've seen situations,
05:41:08.420 | this actually happened watching psychology grad students.
05:41:12.860 | They use a piece of software
05:41:13.940 | when they don't know how to program themselves.
05:41:15.420 | They use a piece of software that somebody else wrote
05:41:17.540 | and it has a bunch of bugs.
05:41:18.900 | And they figure out like,
05:41:21.060 | and they've been using it for years.
05:41:22.820 | They figure out ways to work around it.
05:41:24.780 | Oh, that just happens.
05:41:25.860 | Like nobody considers maybe we should fix this.
05:41:29.420 | They just adapt.
05:41:30.620 | And that's a really interesting notion
05:41:33.060 | that we're really good at adapting,
05:41:35.860 | but you need to still, that might not be the optimal.
05:41:39.780 | Okay, so how do you solve that problem?
05:41:41.260 | Do you have to restart from scratch every once in a while
05:41:44.060 | kind of thing?
05:41:44.900 | - Yeah, it's a good question.
05:41:46.380 | First and foremost, I would say this is not a solved problem.
05:41:48.580 | And for anyone who's listening in academia
05:41:51.540 | who works on BCIs,
05:41:52.900 | I would also say this is not a problem that's solved
05:41:54.580 | by simply scaling channel count.
05:41:56.260 | So this is, maybe that can help
05:41:57.740 | and you can get sort of richer covariance structures
05:41:59.540 | that you can use to exploit
05:42:00.980 | when trying to come up with good labeling strategies.
05:42:02.740 | But if you're interested in problems
05:42:04.660 | that aren't gonna be solved inherently
05:42:05.900 | by just scaling channel count, this is one of them.
05:42:08.180 | Yeah, so how do you solve it?
05:42:09.620 | It's not a solved problem.
05:42:10.660 | That's the first thing I wanna make sure gets across.
05:42:12.820 | The second thing is any solution that involves closed loop
05:42:15.660 | is going to become a very difficult debugging problem.
05:42:18.780 | And one of my sort of general heuristics
05:42:20.580 | for choosing what problems to tackle
05:42:22.700 | is that you wanna choose the one
05:42:23.620 | that's gonna be the easiest to debug.
05:42:25.460 | 'Cause if you can do that, even if the ceiling is lower,
05:42:29.940 | you're gonna be able to move faster
05:42:31.540 | because you have a tighter iteration loop
05:42:33.380 | debugging the problem.
05:42:34.340 | And in the open loop setting,
05:42:35.620 | there's not a feedback cycle to debug
05:42:37.540 | with the user in the loop.
05:42:38.700 | And so there's some reason to think
05:42:39.860 | that that should be an easier debugging problem.
05:42:43.380 | The other thing that's worth understanding
05:42:44.940 | is that even in a closed loop setting,
05:42:46.660 | there's no special software magic
05:42:48.380 | of how to infer what the user is truly attempting to do.
05:42:50.900 | In a closed loop setting,
05:42:51.740 | although they're moving the cursor on the screen,
05:42:53.900 | they may be attempting something different
05:42:55.260 | than what your model is outputting.
05:42:56.780 | So what the model is outputting
05:42:57.740 | is not a signal that you can use to retrain
05:42:59.580 | if you want to be able to improve the model further.
05:43:01.580 | You still have this very complicated guesstimation
05:43:04.380 | or unsupervised problem of figuring out
05:43:06.660 | what is the true user intention underlying that signal.
05:43:09.700 | And so the open loop problem
05:43:10.620 | has the nice property of being easy to debug.
05:43:13.700 | And the second nice property is that
05:43:15.420 | it has all the same information content
05:43:17.380 | as the closed loop scenario.
05:43:18.780 | Another thing I wanna mention and call out
05:43:21.500 | is that this problem doesn't need to be solved
05:43:23.340 | in order to give useful control to people.
05:43:26.020 | You know, even today with the solutions we have now,
05:43:28.460 | that academia has built up over decades,
05:43:30.660 | the level of control that can be given to a user,
05:43:33.460 | you know, today is quite useful.
05:43:35.540 | It doesn't need to be solved
05:43:36.540 | to get to that level of control.
05:43:38.260 | But again, I wanna build the world's best mouse.
05:43:40.300 | I wanna make it, you know, so good
05:43:42.020 | that it's not even a question that you want it.
05:43:44.140 | And to build the world's best mouse,
05:43:46.740 | the superhuman version,
05:43:48.220 | you really need to nail that problem.
05:43:51.380 | And a couple maybe details of previous studies
05:43:54.260 | that we've done internally
05:43:55.100 | that I think are very interesting to understand
05:43:56.740 | when thinking about how to solve this problem.
05:43:58.460 | The first is that even when you have ground truth data
05:44:01.300 | of what the user is trying to do,
05:44:02.620 | and you can get this with an able-bodied monkey,
05:44:04.300 | a monkey that has a Neuralink device implanted,
05:44:06.540 | and moving a mouse to control a computer,
05:44:08.340 | even with that ground truth data set,
05:44:10.980 | it turns out that the optimal thing to predict
05:44:13.740 | to produce high performance BCI
05:44:16.140 | is not just the direct control of the mouse.
05:44:18.620 | You can imagine, you know,
05:44:19.460 | building a data set of what's going on in the brain
05:44:21.500 | and what is the mouse exactly doing on the table.
05:44:24.020 | And it turns out that if you build the mapping
05:44:25.580 | from neural spikes to predict
05:44:26.900 | exactly what the mouse is doing,
05:44:28.580 | that model will perform worse
05:44:30.380 | than a model that is trained to predict
05:44:32.140 | sort of higher level assumptions
05:44:33.260 | about what the user might be trying to do.
05:44:35.020 | For example, assuming that the monkey
05:44:36.420 | is trying to go in a straight line to the target.
05:44:38.380 | It turns out that making those assumptions
05:44:40.300 | is actually more effective in producing a model
05:44:43.420 | than actually predicting the underlying hand movement.
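The "assume a straight line to the target" idea can be sketched as a relabeling step, in the spirit of published intention-estimation methods that keep each time bin's speed but point it at the target. This is an illustrative sketch on synthetic data, not Neuralink's actual method.

```python
# Relabel each bin's observed hand velocity with a vector of the same speed
# aimed straight at the target, then train the decoder on those labels instead.
import numpy as np

def relabel_toward_target(hand_velocity, cursor_position, target_position):
    """Keep each bin's speed but redirect it directly at the target."""
    speed = np.linalg.norm(hand_velocity, axis=1, keepdims=True)
    to_target = target_position - cursor_position
    direction = to_target / np.maximum(np.linalg.norm(to_target, axis=1, keepdims=True), 1e-9)
    return speed * direction

# Example: a noisy, curved reach toward a target at (100, 0).
rng = np.random.default_rng(1)
pos = np.cumsum(rng.normal([1.0, 0.2], 0.3, size=(200, 2)), axis=0)
vel = np.gradient(pos, axis=0)
labels = relabel_toward_target(vel, pos, np.array([[100.0, 0.0]]))
print(labels[:3])
```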
05:44:45.020 | - So the intention,
05:44:46.100 | not like the physical movement or whatever.
05:44:48.020 | - Yeah.
05:44:48.860 | - There's obviously a very strong correlation
05:44:50.620 | between the two,
05:44:51.460 | but the intention is a more powerful thing to be chasing.
05:44:54.500 | - Right.
05:44:55.660 | Well, that's also super interesting.
05:44:57.580 | I mean, the intention itself is fascinating
05:45:01.620 | because yes, with the BCI here in this case,
05:45:03.940 | with the digital telepathy,
05:45:06.020 | you're acting on the intention, not the action,
05:45:08.820 | which is why there's an experience
05:45:10.660 | of like feeling like it's happening
05:45:13.180 | before you meant for it to happen.
05:45:15.460 | That is so cool.
05:45:17.100 | And that is why you could achieve
05:45:18.460 | like superhuman performance, probably,
05:45:20.460 | in terms of the control of the mouse.
05:45:22.300 | So for open loop, just to clarify,
05:45:24.500 | so whenever the person is tasked
05:45:27.820 | to like move the mouse to the right,
05:45:29.780 | you said there's no feedback.
05:45:31.100 | So they don't get to get that satisfaction
05:45:35.260 | of like actually getting it to move, right?
05:45:38.260 | - So you could imagine giving the user feedback on a screen,
05:45:41.500 | but it's difficult because at this point,
05:45:43.220 | you don't know what they're attempting to do.
05:45:44.900 | So what can you show them
05:45:46.260 | that would basically give them a signal
05:45:47.660 | of I'm doing this correctly or not correctly?
05:45:49.580 | So let's take this very specific example.
05:45:51.020 | Like maybe your calibration task looks like
05:45:53.220 | you're trying to move the cursor a certain position offset.
05:45:55.740 | So your instructions to the user are,
05:45:57.220 | hey, the cursor's here.
05:45:59.060 | Now, when the cursor disappears,
05:46:00.500 | imagine moving it 200 pixels from where it was
05:46:02.820 | to the right to be over this target.
05:46:05.020 | In that kind of scenario,
05:46:05.860 | you could imagine coming up
05:46:07.180 | with some sort of consistency metric
05:46:08.540 | that you could display to the user of,
05:46:09.940 | okay, I know what the spike train looks like on average
05:46:12.900 | when you do this action to the right.
05:46:14.700 | Maybe I can produce some sort of probabilistic estimate
05:46:17.020 | of how likely is that to be the action you took
05:46:20.100 | given the latest trial or trajectory that you imagined.
05:46:23.300 | And that could give the user some sort of feedback
05:46:24.860 | of how consistent are they across different trials.
05:46:27.940 | You could also imagine that if the user is prompted
05:46:31.220 | with that kind of consistency metric,
05:46:33.260 | that maybe they just become more behaviorally engaged
05:46:35.020 | to begin with because the task is kind of boring
05:46:37.020 | when you don't have any feedback at all.
05:46:38.500 | And so there may be benefits to the, you know,
05:46:40.900 | the user experience of showing something on a screen,
05:46:42.860 | even if it's not accurate,
05:46:43.900 | just because it keeps the user motivated
05:46:45.660 | to try to increase that number or push it upwards.
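One illustrative form such a consistency metric could take is a similarity score between the current trial's average spike pattern and a running template for that cued direction, as in the sketch below. The metric, the template, and the numbers are assumptions for illustration, not what is actually displayed.

```python
# Score how consistent the latest "push right" trial is with the running template
# for that direction; this is the kind of number that could be shown as feedback.
import numpy as np

def consistency_score(trial_spikes, template):
    """Cosine similarity between this trial's mean spike vector and the template."""
    v = trial_spikes.mean(axis=0)
    return float(v @ template / (np.linalg.norm(v) * np.linalg.norm(template) + 1e-9))

rng = np.random.default_rng(2)
template_right = rng.random(64)                        # average pattern for "push right"
trial = template_right + rng.normal(0, 0.3, (50, 64))  # 50 bins of a new attempt
print(round(consistency_score(trial, template_right), 3))
```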
05:46:48.060 | - So there's a psychology element here.
05:46:50.540 | - Yeah, absolutely.
05:46:52.300 | - And again, all of that is UX challenge.
05:46:55.020 | How much signal drift is there?
05:46:57.100 | Hour to hour, day to day, week to week, month to month,
05:47:01.700 | how often do you have to recalibrate
05:47:04.980 | because of the signal drift?
05:47:06.540 | - Yeah, so this is a problem we've worked on
05:47:10.380 | both with NHP, non-human primates,
05:47:12.340 | before our clinical trial,
05:47:13.900 | and then also with Noland during the clinical trial.
05:47:16.620 | Maybe the first thing that's worth stating
05:47:18.300 | is what the goal is here.
05:47:19.140 | So the goal is really to enable the user
05:47:20.700 | to have a plug and play experience where,
05:47:22.820 | I guess they don't have to plug anything in,
05:47:23.980 | but a play experience where they, you know,
05:47:26.540 | can use the device whenever they want to,
05:47:28.260 | however they want to.
05:47:29.780 | And that's really what we're aiming for.
05:47:31.620 | And so there can be a set of solutions
05:47:33.180 | that get to that state
05:47:34.940 | without considering this non-stationarity problem.
05:47:38.060 | So maybe the first solution here that's important
05:47:40.100 | is that they can recalibrate whenever they want.
05:47:42.300 | This is something that Noland has the ability to do today.
05:47:46.340 | So he can recalibrate the system, you know,
05:47:47.660 | at 2 a.m. in the middle of the night
05:47:49.220 | without his caretaker or parents or friends around
05:47:51.700 | to help push a button for him.
05:47:53.500 | The other important part of the solution
05:47:55.140 | is that when you have a good model calibrated,
05:47:56.980 | that you can continue using that
05:47:58.180 | without needing to recalibrate it.
05:48:00.140 | So how often he has to do this recalibration today
05:48:02.220 | depends really on his appetite for performance.
05:48:04.580 | There are, we observe sort of a degradation through time
05:48:08.740 | of how well any individual model works,
05:48:10.940 | but this can be mitigated behaviorally
05:48:12.420 | by the user adapting their control strategy.
05:48:14.660 | It can also be mitigated through a combination
05:48:16.340 | of sort of software features that we provide to the user.
05:48:18.980 | For example, we let the user adjust
05:48:20.980 | exactly how fast the cursor is moving.
05:48:23.220 | We call that the gain, for example,
05:48:24.620 | the gain of how fast the cursor reacts
05:48:26.220 | to any given input intention.
05:48:27.780 | They can also adjust the smoothing,
05:48:29.100 | how smooth the output of that cursor intention actually is.
05:48:32.820 | They can also adjust the friction,
05:48:33.980 | which is how easy is it to stop and hold still.
05:48:36.300 | And all these software tools allow the user
05:48:38.180 | a great deal of flexibility and troubleshooting mechanisms
05:48:40.900 | to be able to solve this problem for themselves.
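As a rough sketch of how knobs like gain, smoothing, and friction could post-process a decoded velocity stream (the "mixer" that comes up just below), here is a minimal example. The exact formulas and parameter values are assumptions for illustration.

```python
# Post-process one decoded velocity sample with user-adjustable gain, smoothing, friction.
import numpy as np

def shape_velocity(decoded_vel, gain=1.0, smoothing=0.8, friction=0.1, prev_out=None):
    """gain: how fast the cursor reacts; smoothing: EMA on the output;
    friction: dead zone so it's easy to stop and hold still."""
    prev_out = np.zeros_like(decoded_vel) if prev_out is None else prev_out
    v = smoothing * prev_out + (1.0 - smoothing) * (gain * decoded_vel)
    if np.linalg.norm(v) < friction:      # snap tiny velocities to zero
        v = np.zeros_like(v)
    return v

out = np.zeros(2)
for sample in np.random.default_rng(3).normal(size=(5, 2)):
    out = shape_velocity(sample, gain=2.0, smoothing=0.7, friction=0.2, prev_out=out)
    print(out)
```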
05:48:42.020 | - By the way, all of this is done
05:48:43.860 | by looking to the right side of the screen,
05:48:45.940 | selecting the mixer, and the mixer you have, it's-
05:48:48.820 | - It's like DJ mode, DJ mode for your BCI.
05:48:52.020 | - I mean, it's a really well done interface.
05:48:54.380 | It's really, really well done.
05:48:55.780 | And so yeah, there's that bias, that cursor drift
05:49:00.500 | that Noland talked about in a stream.
05:49:02.500 | Although he said that you guys were just playing around
05:49:06.380 | with it with him and they're constantly improving.
05:49:08.660 | So that could have been just a snapshot
05:49:10.540 | of that particular moment, a particular day.
05:49:13.140 | But he said that there was this cursor drift
05:49:17.980 | and this bias that could be removed by him,
05:49:20.860 | I guess, looking to the right side of the screen
05:49:22.500 | or the left side of the screen to kind of adjust the bias.
05:49:25.460 | That's one interface action, I guess, to adjust the bias.
05:49:28.580 | - Yeah, so this is actually an idea
05:49:30.620 | that comes out of academia.
05:49:31.940 | There was some prior work
05:49:34.220 | with sort of brain-gate clinical trial participants
05:49:37.100 | where they pioneered this idea of bias correction.
05:49:39.460 | The way we've done it, I think is,
05:49:40.500 | yeah, it's very productized,
05:49:42.020 | very beautiful user experience
05:49:44.580 | where the user can essentially flash the cursor
05:49:47.580 | over to the side of the screen
05:49:48.420 | and it opens up a window
05:49:49.700 | where they can actually sort of adjust
05:49:52.300 | or tune exactly the bias of the cursor.
05:49:54.780 | So bias maybe for people who aren't familiar
05:49:56.340 | is just sort of what is the default motion of the cursor
05:49:58.540 | if you're imagining nothing.
05:50:00.060 | And it turns out that that's one of the first
05:50:02.260 | sort of qualia of the cursor control experience
05:50:05.740 | that's impacted by neural non-stationarity.
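A minimal sketch of the bias-correction idea just described, assuming it amounts to estimating the cursor's drift while the user intends to hold still and subtracting it from later decoded velocities. The window length and numbers are illustrative.

```python
# Estimate the "imagining nothing" drift and subtract it from subsequent outputs.
import numpy as np

def estimate_bias(decoded_vel_while_idle):
    """Average decoded velocity over a window where the user intends to hold still."""
    return decoded_vel_while_idle.mean(axis=0)

def apply_bias_correction(decoded_vel, bias):
    return decoded_vel - bias

rng = np.random.default_rng(4)
drift = np.array([0.4, -0.1])                     # non-stationarity shows up as a constant push
idle = drift + rng.normal(0, 0.2, size=(300, 2))  # samples from the bias-tuning window
bias = estimate_bias(idle)
print(apply_bias_correction(drift + rng.normal(0, 0.2, 2), bias))  # roughly zero again
```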
05:50:07.460 | - Qualia of the cursor experience.
05:50:08.860 | - I mean, I don't know how else to describe it.
05:50:09.860 | Like, you know, I'm not the guy moving the thing.
05:50:12.460 | - It's very poetic, I love it.
05:50:13.820 | The qualia of the cursor experience.
05:50:15.540 | Yeah, I mean, it sounds poetic, but it is deeply true.
05:50:19.540 | There is an experience when it works well,
05:50:23.060 | it is a joyful, a really pleasant experience.
05:50:27.540 | And when it doesn't work well,
05:50:28.660 | it's a very frustrating experience.
05:50:30.420 | That's actually the art of UX.
05:50:32.620 | It's like you have the possibility to frustrate people
05:50:36.940 | or the possibility to give them joy.
05:50:40.000 | And at the end of the day,
05:50:40.840 | it really is truly the case that UX is how the thing works.
05:50:43.600 | And so it's not just like what's showing on the screen,
05:50:45.760 | it's also, you know, what control surfaces
05:50:48.000 | does a decoder provide the user?
05:50:49.800 | Like we want them to feel like they're in the F1 car,
05:50:51.480 | not like the, you know, some like minivan, right?
05:50:55.240 | And that really truly is how we think about it.
05:50:57.160 | Nolan himself is an F1 fan.
05:50:58.480 | So we refer to ourselves as a pit crew.
05:51:00.800 | He really is truly the F1 driver.
05:51:03.120 | And there's different, you know, control surfaces
05:51:04.920 | that different kinds of cars and airplanes provide the user.
05:51:08.720 | And we take a lot of inspiration from that
05:51:10.000 | when designing how the cursor should behave.
05:51:11.880 | And maybe one nuance of this is, you know,
05:51:14.480 | even details like when you move a mouse
05:51:16.440 | on a MacBook trackpad,
05:51:18.000 | the sort of response curve of how that input
05:51:21.640 | that you give the trackpad translates to cursor movement
05:51:24.200 | is different than how it works with a mouse.
05:51:26.320 | When you move on the trackpad,
05:51:27.600 | there's a different response function,
05:51:28.720 | a different curve to how much a movement translates
05:51:30.800 | to input to the computer
05:51:31.880 | than when you do it physically with a mouse.
05:51:33.440 | And that's because somebody sat down a long time ago
05:51:35.720 | when they're designing the initial input systems
05:51:37.360 | to any computer,
05:51:38.720 | and they thought through exactly how it feels
05:51:41.400 | to use these different systems.
05:51:42.920 | And now we're designing sort of the next generation
05:51:44.600 | of this input system to a computer,
05:51:46.440 | which is entirely done via the brain.
05:51:48.480 | And there's no proprioceptive feedback.
05:51:50.240 | Again, you don't feel the mouse in your hand.
05:51:52.160 | You don't feel the keys under your fingertips.
05:51:54.240 | And you want a control surface
05:51:55.640 | that still makes it easy and intuitive
05:51:56.960 | for the user to understand the state of the system
05:51:58.720 | and how to achieve what they want to achieve.
05:52:00.520 | And ultimately the end goal is that that UX is completely,
05:52:03.760 | it fades into the background.
05:52:04.600 | It becomes something that's so natural and intuitive
05:52:06.640 | that it's subconscious to the user.
05:52:08.280 | And they just should feel
05:52:09.120 | like they have basically direct control over the cursor.
05:52:12.000 | It just does what they want it to do.
05:52:13.120 | They're not thinking about the implementation
05:52:14.480 | of how to make it do what they want it to do.
05:52:16.040 | It's just doing what they want it to do.
05:52:17.720 | - Is there some kind of things along the lines
05:52:21.320 | of like Fitts' law where you should move the mouse
05:52:24.320 | in a certain kind of way that maximizes your chance
05:52:27.000 | to hit the target?
05:52:28.880 | I don't even know what I'm asking,
05:52:30.600 | but I'm hoping the intention of my question
05:52:34.840 | will land on a profound answer.
05:52:37.080 | No, is there some kind of understanding of the laws of UX
05:52:42.080 | when it comes to the context
05:52:48.400 | of somebody using their brain to control it?
05:52:55.360 | Like that's different than it is with an actual mouse.
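For reference, Fitts' law, which the question alludes to, is usually written in its standard Shannon formulation as below, where T is the predicted time to acquire a target of width W at distance D, and a, b are empirically fit constants for a given input device.

```latex
% Fitts' law, standard Shannon formulation
T = a + b \log_2\!\left(1 + \frac{D}{W}\right)
```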
05:52:55.360 | - I think we're in the early stages
05:52:56.680 | of discovering those laws.
05:52:57.920 | So I wouldn't claim to have solved that problem yet,
05:53:01.200 | but there's definitely some things we've learned
05:53:03.040 | that make it easier for the user to get stuff done.
05:53:08.040 | And it's pretty straightforward when you verbalize it,
05:53:10.360 | but it takes a while to actually get to that point
05:53:11.800 | when you're in the process of debugging this stuff
05:53:13.600 | in the trenches.
05:53:14.600 | One of those things is that any machine learning system
05:53:18.840 | you build has some number of errors,
05:53:21.280 | and it matters how those errors translate
05:53:24.640 | to the downstream user experience.
05:53:25.800 | For example, if you're developing a search algorithm
05:53:28.280 | in your photos, if you search for your friend Joe
05:53:31.280 | and it pulls up a photo of your friend Josephine,
05:53:34.160 | maybe that's not a big deal
05:53:35.280 | because the cost of an error is not that high.
05:53:38.440 | In a different scenario where you're trying to,
05:53:41.400 | you know, detect insurance fraud or something like this,
05:53:43.600 | and you're directly sending someone to court
05:53:45.600 | because of some machine learning model output,
05:53:47.040 | then it makes a lot more sense to be careful about those errors.
05:53:50.200 | You want to be very thoughtful about how those errors
05:53:51.960 | translate to downstream effects.
05:53:53.280 | The same is true in BCI.
05:53:54.240 | So for example, if you're building a model
05:53:57.280 | that's decoding a velocity output from the brain
05:53:59.680 | versus an output where you're trying to modulate
05:54:01.760 | the left click, for example,
05:54:03.080 | these have sort of different trade-offs
05:54:04.480 | of how precise you need to be
05:54:06.440 | before it becomes useful to the end user.
05:54:08.360 | For velocity, it's okay to be on average correct
05:54:10.640 | because the output of the model is integrated through time.
05:54:13.240 | So if the user is trying to click at position A
05:54:15.720 | and they're currently at position B,
05:54:17.200 | they're trying to navigate over time
05:54:19.000 | to get between those two points.
05:54:20.520 | And as long as the output of the model
05:54:22.680 | is on average correct, they can sort of steer through time
05:54:24.880 | with the user control loop in the mix,
05:54:27.240 | they can get to the point they want to get to.
05:54:29.080 | The same is not true of a click.
05:54:30.800 | For a click, you're performing it almost instantly
05:54:33.080 | at the scale of neurons firing.
05:54:35.240 | And so you want to be very sure that that click is correct
05:54:38.240 | because a false click can be very destructive to the user.
05:54:40.320 | They might accidentally close the tab
05:54:41.520 | that they're trying to do something in
05:54:43.160 | and lose all their progress.
05:54:44.440 | They might accidentally hit some send button
05:54:47.760 | on some text that is only like half composed
05:54:49.640 | and reads funny after.
05:54:50.720 | So there's different sort of cost functions
05:54:53.880 | associated with errors in this space.
05:54:55.520 | And part of the UX design is understanding
05:54:57.120 | how to build a solution that is, when it's wrong,
05:55:00.520 | still useful to the end user.
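One simple way to encode the click side of that asymmetry, sketched below, is to act on the click decoder only when its probability stays above a high threshold for several consecutive bins, while velocity errors are left to average out in closed loop. The threshold and window length here are illustrative assumptions.

```python
# Debounce the click decoder: fire only after several consecutive high-confidence bins,
# because a single false click is far more costly than a slightly-off velocity sample.
def should_click(click_probs, threshold=0.95, consecutive_bins=5):
    if len(click_probs) < consecutive_bins:
        return False
    return all(p > threshold for p in click_probs[-consecutive_bins:])

stream = [0.1, 0.2, 0.97, 0.98, 0.99, 0.96, 0.97]
for i in range(1, len(stream) + 1):
    print(i, should_click(stream[:i]))   # only fires once 5 bins in a row clear 0.95
```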
05:55:02.480 | - That's so fascinating that assigning cost
05:55:05.200 | to every action when an error occurs.
05:55:10.600 | So every action, if an error occurs, has a certain cost.
05:55:13.760 | And incorporating that into how you interpret the intention,
05:55:19.880 | mapping it to the action is really important.
05:55:23.960 | I didn't quite, until you said it,
05:55:26.560 | realize there's a cost to like sending the text early.
05:55:30.440 | It's like a very expensive cost.
05:55:32.120 | - It's super annoying if you accidentally,
05:55:33.760 | like if your cursor, imagine if your cursor misclicked
05:55:36.000 | every once in a while.
05:55:37.400 | That's like super obnoxious.
05:55:38.520 | And the worst part of it is usually
05:55:40.360 | when the user is trying to click,
05:55:41.920 | they're also holding still
05:55:43.280 | because they're over the target they want to hit
05:55:44.680 | and they're getting ready to click.
05:55:46.000 | Which means that in the data sets that we build,
05:55:48.520 | it's on average the case that sort of low speeds
05:55:50.800 | or desire to hold still is correlated
05:55:52.560 | with when the user is attempting to click.
05:55:54.240 | - Wow, that is really fascinating.
05:55:56.400 | - It's also not the case, you know,
05:55:58.000 | people think that, oh, click is a binary signal.
05:55:59.840 | This must be super easy to decode.
05:56:01.360 | Well, yes, it is.
05:56:02.560 | But the bar is so much higher
05:56:04.760 | for it to become a useful thing for the user.
05:56:06.760 | And there's ways to solve this.
05:56:07.680 | I mean, you can sort of take the compound approach of,
05:56:09.400 | well, let's just give the,
05:56:10.840 | like, let's take five seconds to click.
05:56:12.160 | Let's take a huge window of time
05:56:13.400 | so we can be very confident about the answer.
05:56:15.120 | But again, world's best mouse.
05:56:16.680 | The world's best mouse doesn't take a second to click
05:56:18.440 | or 500 milliseconds to click.
05:56:19.480 | It takes five milliseconds to click or less.
05:56:22.560 | And so if you're aiming for that kind of high bar,
05:56:24.600 | then you really want to solve the underlying problem.
05:56:26.520 | - So maybe this is a good place to ask
05:56:28.000 | about how to measure performance.
05:56:29.800 | This whole bits per second, what,
05:56:32.280 | can you like explain what you mean by that?
05:56:35.640 | Maybe a good place to start is to talk about web grid
05:56:39.200 | as a game, as a good illustration
05:56:41.320 | of the measurement of performance.
05:56:43.400 | - Yeah, maybe I'll take one zoom out step there,
05:56:45.600 | which is just explaining why we care to measure this at all.
05:56:49.240 | So again, our goal is to provide the user
05:56:51.400 | the ability to control the computer as well as I can,
05:56:53.480 | and hopefully better.
05:56:54.400 | And that means that they can do it
05:56:55.480 | at the same speed as what I can do.
05:56:57.120 | It means that they have access
05:56:57.960 | to all the same functionality that I have,
05:57:00.000 | including all those little details like command tab,
05:57:02.160 | command space, all this stuff.
05:57:03.280 | They need to be able to do it with their brain
05:57:04.920 | and with the same level of reliability
05:57:06.480 | as what I can do with my muscles.
05:57:07.720 | And that's a high bar.
05:57:08.840 | And so we intend to measure and quantify every aspect
05:57:10.880 | of that to understand how we're progressing towards that goal.
05:57:13.280 | There's many ways to measure BPS, by the way.
05:57:14.640 | This isn't the only way,
05:57:15.960 | but we present the user a grid of targets.
05:57:18.640 | And basically we compute a score,
05:57:20.720 | which is dependent on how fast and accurately they can select
05:57:23.080 | and then how small are the targets.
05:57:24.800 | And the more targets that are on the screen,
05:57:26.000 | the smaller they are,
05:57:26.840 | the more information you present per click.
05:57:29.280 | And so if you think about it
05:57:30.680 | from an information theory point of view,
05:57:32.120 | you can communicate across
05:57:33.080 | different information theoretic channels.
05:57:35.160 | And one such channel is a typing interface.
05:57:37.480 | You could imagine that's built out of a grid,
05:57:39.360 | just like a software keyboard on the screen.
05:57:41.480 | And bits per second is a measure that's computed
05:57:43.840 | by taking the log of the number of targets on the screen.
05:57:47.200 | You can subtract one if you care to model a keyboard
05:57:49.240 | because you have to subtract one
05:57:50.280 | for the delete key on the keyboard.
05:57:51.920 | But log of the number of targets on the screen
05:57:54.040 | times the number of correct selections minus incorrect
05:57:56.880 | divided by some time window, for example, 60 seconds.
05:57:59.840 | And that's sort of the standard way to measure
05:58:01.560 | a cursor control task in academia.
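The bits-per-second metric as just described can be written down directly. A small sketch with made-up example numbers in roughly the range discussed:

```python
# BPS = log2(number of targets, optionally minus one for a delete key)
#       * (correct selections - incorrect selections) / time window in seconds
import math

def webgrid_bps(n_targets, correct, incorrect, window_seconds=60.0, model_keyboard=False):
    effective_targets = n_targets - 1 if model_keyboard else n_targets
    return math.log2(effective_targets) * (correct - incorrect) / window_seconds

# e.g. a 35-target grid, 100 correct and 4 incorrect selections in one minute:
print(round(webgrid_bps(35, 100, 4), 2))   # about 8.2 BPS
```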
05:58:03.320 | And all credit in the world goes to this great professor,
05:58:05.640 | Dr. Shenoy of Stanford, who came up with that task.
05:58:08.080 | And he's also one of my inspirations for being in the field.
05:58:09.920 | So all the credit in the world to him
05:58:11.840 | for coming up with a standardized metric
05:58:13.560 | to facilitate this kind of bragging rights that we have now
05:58:15.840 | to say that Noland is the best in the world
05:58:17.280 | at this task with this BCI.
05:58:18.840 | It's very important for progress
05:58:19.880 | that you have standardized metrics
05:58:21.360 | that people can compare across.
05:58:22.760 | Different techniques and approaches.
05:58:24.080 | How well does this do?
05:58:25.400 | So yeah, big kudos to him
05:58:26.600 | and to all the team at Stanford.
05:58:28.200 | Yeah, so for Noland and for me playing this task,
05:58:34.000 | there's also different modes
05:58:35.080 | that you can configure this task.
05:58:36.000 | So the web grid task can be presented
05:58:37.360 | as just sort of a left click on the screen,
05:58:39.200 | or you could have targets that you just dwell over,
05:58:41.760 | or you could have targets that you left, right click on.
05:58:43.600 | You could have targets that are left, right click,
05:58:45.560 | middle click, scrolling, clicking, and dragging.
05:58:47.200 | You know, you can do all sorts of things
05:58:48.120 | within this general framework.
05:58:49.800 | But the simplest, purest form is just blue targets
05:58:52.120 | show up on the screen.
05:58:52.960 | Blue means left click.
05:58:54.000 | That's the simplest form of the game.
05:58:56.000 | And the sort of prior records here
05:58:59.440 | in academic work and at Neuralink internally
05:59:04.200 | with sort of NHPs have all been matched
05:59:07.320 | or beaten by Noland with his Neuralink device.
05:59:09.400 | So sort of prior to Neuralink,
05:59:11.280 | the sort of world record for a human using the device
05:59:14.120 | is somewhere between 4.2 and 4.6 BPS,
05:59:17.120 | depending on exactly what paper you read
05:59:18.440 | and how you interpret it.
05:59:19.920 | Noland's current record is 8.5 BPS.
05:59:23.120 | And again, the sort of median Neuralinker performance
05:59:25.640 | is 10 BPS.
05:59:26.480 | You can think of it roughly as he's at 85% of the level of control
05:59:29.480 | of a median Neuralinker using their cursor
05:59:31.960 | to select blue targets on the screen.
05:59:33.880 | And yeah, I think there's a very interesting journey ahead
05:59:38.880 | to get us to that same level of 10 BPS performance.
05:59:42.320 | It's not the case that sort of the tricks that got us
05:59:44.360 | from, you know, 4 to 6 BPS, and then 6 to 8 BPS
05:59:47.640 | are gonna be the ones that get us from 8 to 10.
05:59:49.680 | And in my view, the core challenge here
05:59:51.520 | is really the labeling problem.
05:59:52.720 | It's how do you understand at a very, very fine resolution
05:59:55.760 | what the user's attempting to do?
05:59:57.560 | And yeah, I highly encourage folks in academia
05:59:59.800 | to work on this problem.
06:00:01.320 | - What's the journey with Noland on that quest
06:00:04.560 | of increasing the BPS on WebGrid?
06:00:06.960 | In March, you said that he selected
06:00:09.200 | 89,285 targets on WebGrid.
06:00:12.960 | So he loves this game.
06:00:14.840 | He's really serious about improving his performance
06:00:17.200 | in this game.
06:00:18.080 | So what is the journey of trying to figure out
06:00:20.440 | how to improve that performance?
06:00:21.640 | How much can that be done on the decoding side?
06:00:24.760 | How much can that be done on the calibration side?
06:00:27.320 | How much can that be done on the Noland side
06:00:30.400 | of like figuring out how to convey his intention
06:00:35.400 | more cleanly?
06:00:36.720 | - Yeah, no, this is a great question.
06:00:38.240 | So in my view, one of the primary reasons
06:00:40.960 | why Noland's performance is so good is because of Noland.
06:00:43.640 | Noland is extremely focused and very energetic.
06:00:47.360 | He'll play WebGrid sometimes for like four hours
06:00:49.920 | in the middle of the night,
06:00:50.760 | like from 2 a.m. to 6 a.m. he'll be playing WebGrid
06:00:53.080 | just because he wants to push it to the limits
06:00:54.920 | of what he can do.
06:00:56.160 | And this is not us asking him to do that.
06:00:58.440 | I want to be clear.
06:00:59.280 | We're not saying, "Hey, you should play WebGrid tonight."
06:01:01.320 | We just gave him the game as part of our research
06:01:04.240 | and he is able to play it independently
06:01:06.280 | and practice whenever he wants.
06:01:07.400 | And he really pushes hard to push
06:01:09.520 | the technology to the absolute limit.
06:01:11.200 | And he views it as like his job really
06:01:13.360 | to make us be the bottleneck.
06:01:15.120 | And boy, has he done that well.
06:01:17.200 | And so that's the first thing to acknowledge
06:01:18.560 | is that he was extremely motivated to make this work.
06:01:21.280 | I've also had the privilege to meet other
06:01:23.360 | clinical trial participants from BrainGate and other trials.
06:01:26.000 | And they very much share the same attitude
06:01:27.440 | of like they viewed this as their life's work
06:01:29.720 | to advance the technology as much as they can.
06:01:33.360 | And if that means selecting targets on the screen
06:01:35.280 | for four hours from 2 a.m. to 6 a.m. then so be it.
06:01:38.200 | And there's something extremely admirable about that
06:01:40.280 | that's worth calling out.
06:01:42.360 | Okay, so now how do you sort of get from where he started
06:01:45.640 | which is no cursor control to 8 BPS?
06:01:47.840 | So, I mean, when he started,
06:01:48.960 | there's a huge amount of learning to do on his side
06:01:51.600 | and our side to figure out
06:01:53.360 | what's the most intuitive control for him.
06:01:56.000 | And the most intuitive control for him is sort of,
06:01:58.880 | you have to find the set intersection of
06:02:00.640 | what do we have the signal to decode?
06:02:02.640 | So we don't pick up every single neuron in the motor cortex
06:02:05.040 | which means we don't have representation
06:02:06.200 | for every part of the body.
06:02:07.360 | So there may be some signals that we have better
06:02:09.360 | sort of decode performance on than others.
06:02:11.200 | For example, on his left hand,
06:02:13.080 | we have a lot of difficulty distinguishing
06:02:14.520 | his left ring finger from his left middle finger.
06:02:17.720 | But on his right hand, we have a good control
06:02:20.440 | and good modulation detected from the neurons
06:02:22.120 | we're able to record for his pinky and his thumb
06:02:24.000 | and his index finger.
06:02:24.840 | So you can imagine how these different subspaces
06:02:28.760 | of modulated activity intersect
06:02:30.680 | with what's the most intuitive for him.
06:02:32.160 | And this has evolved over time.
06:02:33.360 | So once we gave him the ability
06:02:34.640 | to calibrate models on his own,
06:02:36.240 | he was able to go and explore various different ways
06:02:38.560 | to imagine controlling the cursor.
06:02:40.280 | For example, he can imagine controlling the cursor
06:02:41.760 | by wiggling his wrist side to side
06:02:43.600 | or by moving his entire arm;
06:02:44.760 | at one point he even tried his feet.
06:02:46.080 | You know, he tried like a whole bunch of stuff
06:02:47.600 | to explore the space of what is the most natural way
06:02:50.320 | for him to control the cursor
06:02:52.200 | that at the same time is easy for us to decode.
06:02:54.160 | - Just to clarify, it's through the body mapping procedure
06:02:58.040 | that you're able to figure out which finger he can move?
06:03:02.640 | - Yes, yes, that's one way to do it.
06:03:04.480 | Maybe one nuance of when he's doing it,
06:03:08.040 | he can imagine many more things
06:03:09.520 | than we represent in that visual on the screen.
06:03:11.360 | So we show him sort of abstractly, here's a cursor,
06:03:14.520 | you figure out what works the best for you.
06:03:17.440 | And we obviously have hints about what will work best
06:03:19.360 | from that body mapping procedure of, you know,
06:03:20.800 | we know that this particular action we can represent well,
06:03:23.800 | but it's really up to him to go and explore
06:03:25.200 | and figure out what works the best.
06:03:27.120 | - But at which point does he no longer visualize
06:03:30.480 | the movement of his body
06:03:31.600 | and it's just visualizing the movement of the cursor?
06:03:33.800 | - Yeah.
06:03:34.640 | - How quickly does he go from,
06:03:35.720 | how quickly does it get there?
06:03:37.380 | - So this happened on a Tuesday.
06:03:38.900 | I remember this day very clearly
06:03:40.400 | because at some point during the day,
06:03:43.560 | it looked like he wasn't doing super well.
06:03:45.200 | It looked like the model wasn't performing super well
06:03:46.960 | and he was like getting distracted,
06:03:48.640 | but he actually, it wasn't the case.
06:03:49.760 | Like what actually happened was he was trying something new
06:03:53.120 | where he was just controlling the cursor.
06:03:55.800 | So he wasn't imagining moving his hand anymore.
06:03:57.940 | He was just imagining, I don't know what it is,
06:03:59.960 | some like abstract intention to move the cursor on the screen
06:04:02.720 | and I cannot tell you what the difference
06:04:04.680 | between those two things is.
06:04:05.600 | I really truly cannot.
06:04:06.640 | He's tried to explain it to me before.
06:04:08.060 | I cannot give a first person account of what that's like,
06:04:12.220 | but the expletives that he uttered in that moment were,
06:04:15.420 | you know, enough to suggest
06:04:16.420 | that it was a very qualitatively different experience
06:04:18.520 | for him to just have direct neural control over a cursor.
06:04:22.060 | - I wonder if there's a way through UX
06:04:27.860 | to encourage a human being to discover that
06:04:30.840 | because he discovered it, like you said to me,
06:04:33.900 | that he's a pioneer.
06:04:35.380 | So he discovered that on his own
06:04:37.400 | through all of this, the process of trying to move the cursor
06:04:42.120 | with different kinds of intentions.
06:04:44.280 | But that is clearly a really powerful thing to arrive at,
06:04:47.360 | which is to let go of trying to control the fingers
06:04:52.360 | and the hand and control the actual digital device
06:04:55.600 | with your mind.
06:04:56.440 | - That's right.
06:04:57.260 | UX is how it works.
06:04:58.100 | And the ideal UX is one where
06:04:59.800 | the user doesn't have to think about what they need to do
06:05:02.760 | in order to get it done.
06:05:03.740 | They just, it just does it.
06:05:05.520 | - That is so fascinating.
06:05:08.040 | But I wonder on the biological side,
06:05:11.320 | how long it takes for the brain to adapt.
06:05:13.360 | - Yeah.
06:05:14.200 | - So is it just simply learning like high-level software,
06:05:19.200 | or is there like a neuroplasticity component
06:05:21.520 | where like the brain is adjusting slowly?
06:05:25.040 | - Yeah, the truth is, I don't know.
06:05:26.840 | I'm very excited to see with sort of the second participant
06:05:30.200 | that we implant, what the journey is like for them,
06:05:34.040 | because we'll have learned a lot more.
06:05:35.680 | Potentially we can help them understand
06:05:37.160 | and explore that direction more quickly.
06:05:38.700 | This is something I didn't know.
06:05:39.760 | This wasn't me prompting Noland to go try this.
06:05:42.480 | He was just exploring how to use his device
06:05:44.960 | and figure it out himself.
06:05:46.160 | But now that we know that that's a possibility,
06:05:48.080 | that maybe there's a way to, for example, hint the user,
06:05:50.600 | don't try super hard during calibration.
06:05:52.320 | Just do something that feels natural,
06:05:53.840 | or just directly control the cursor.
06:05:55.680 | Don't imagine explicit action.
06:05:57.440 | And from there, we should be able to hopefully understand
06:05:59.760 | how this is for somebody who has not experienced that before.
06:06:02.440 | Maybe that's the default mode of operation for them.
06:06:04.240 | You don't have to go through this intermediate phase
06:06:06.160 | of explicit motions.
06:06:07.760 | - Or maybe if that naturally happens for people,
06:06:10.120 | you can just occasionally encourage them
06:06:12.800 | to allow themselves to move the cursor.
06:06:14.780 | Actually, sometimes, just like with the four-minute mile,
06:06:17.340 | just the knowledge that that's possible.
06:06:19.320 | - Pushes you to do it.
06:06:20.560 | - Yeah, enables you to do it, and then it becomes trivial.
06:06:24.000 | And then it also makes you wonder,
06:06:25.960 | this is the cool thing about humans.
06:06:27.560 | Once there's a lot more human participants,
06:06:31.000 | they will discover things that are possible.
06:06:32.960 | - Yes, and share their experiences.
06:06:34.560 | - Yeah, and share it.
06:06:35.640 | And then because of them sharing it,
06:06:37.960 | they'll be able to do it.
06:06:39.400 | All of a sudden, that's unlocked for everybody.
06:06:43.120 | Because just the knowledge sometimes
06:06:44.720 | is the thing that enables it to do it.
06:06:46.640 | - Yeah, and just to comment on that, too.
06:06:49.360 | We've probably tried 1,000 different ways
06:06:51.200 | to do various aspects of decoding,
06:06:54.160 | and now we know what the right subspace is
06:06:55.800 | to continue exploring further.
06:06:56.760 | Again, thanks to Noland and the many hours
06:06:58.600 | he's put into this.
06:06:59.720 | And so even just that helps constrain the beam search
06:07:02.720 | of different approaches that we could explore,
06:07:04.840 | really helps accelerate for the next person
06:07:07.400 | the set of things that we'll get to try on day one,
06:07:09.080 | how fast we hope to get them to useful control,
06:07:11.280 | how fast we can enable them to use it independently,
06:07:13.480 | and to get value out of the system.
06:07:14.600 | So yeah, massive hats off to Noland
06:07:16.640 | and all the participants that came before him
06:07:19.280 | to make this technology a reality.
06:07:20.880 | - So how often are the updates to the decoder?
06:07:22.920 | 'Cause Noland mentioned like, okay,
06:07:24.840 | there's a new update that we're working on,
06:07:26.560 | and that in the stream he said he plays the snake game
06:07:31.520 | because it's like super hard.
06:07:33.480 | It's a good way for him to test like how good the update is.
06:07:36.440 | So, and he says like sometimes the update
06:07:39.320 | is a step backwards.
06:07:40.560 | It's like a, it's a constant like iteration.
06:07:43.240 | So how often, like what does the update entail?
06:07:46.040 | Is it mostly on the decoder side?
06:07:48.720 | - Yeah, a couple comments.
06:07:49.640 | So one is, it's probably worth drawing a distinction
06:07:51.520 | between sort of research sessions
06:07:52.720 | where we're actively trying different things
06:07:54.520 | to understand like what the best approach is.
06:07:56.520 | Versus sort of independent use
06:07:57.640 | where we want him to have the ability to just go use the device
06:08:00.400 | how anybody would want to use their MacBook.
06:08:02.520 | And so what he's referring to is,
06:08:04.200 | I think usually in the context of a research session
06:08:05.960 | where we're trying, you know,
06:08:07.280 | many, many different approaches to, you know,
06:08:09.240 | even unsupervised approaches that we talked about earlier
06:08:11.160 | to try to come up with better ways
06:08:12.560 | to estimate his true intention and more accurately decode it.
06:08:15.720 | And in those scenarios, I mean,
06:08:17.600 | we try in any given session,
06:08:19.320 | he'll sometimes work for like eight hours a day.
06:08:21.480 | And so that can be, you know,
06:08:22.480 | hundreds of different models that we would try in that day.
06:08:24.880 | Like a lot of different things.
06:08:27.320 | Now, it's also worth noting that we update the application
06:08:29.920 | he uses quite frequently.
06:08:30.840 | I think, you know, sometimes up to like
06:08:32.160 | four or five times a day,
06:08:33.400 | we'll update his application with different features
06:08:35.360 | or bug fixes or feedback that he's given us.
06:08:38.160 | So he's been able to,
06:08:39.320 | he's a very articulate person who is part of the solution.
06:08:42.680 | He's not a complaining person.
06:08:43.600 | He says, hey, here's this thing that I've discovered
06:08:46.280 | is not optimal in my flow.
06:08:47.960 | Here's some ideas how to fix it.
06:08:49.280 | Let me know what your thoughts are.
06:08:50.240 | Let's figure out how to solve it.
06:08:52.000 | And it often happens that those things are addressed
06:08:54.440 | within a couple hours of him giving us his feedback.
06:08:57.000 | That's the kind of iteration cycle we'll have.
06:08:58.520 | And so sometimes at the beginning of the session,
06:09:00.160 | he'll give us feedback.
06:09:01.000 | And at the end of the session,
06:09:01.840 | he's giving us feedback on the next iteration
06:09:03.800 | of that process or that setup.
06:09:06.080 | - That's fascinating.
06:09:06.920 | 'Cause one of the things you mentioned
06:09:08.040 | that there were 271 pages of notes
06:09:11.040 | taken from the BCI sessions.
06:09:12.640 | And this was just in March.
06:09:14.560 | So one of the amazing things about human beings
06:09:17.240 | is that they can provide,
06:09:18.440 | especially ones who are smart and excited
06:09:22.120 | and all positive and good vibes like Noland,
06:09:25.440 | continuous feedback.
06:09:28.200 | - It also requires, just to brag on the team a little bit,
06:09:30.760 | I work with a lot of exceptional people
06:09:33.360 | and it requires the team being absolutely laser focused
06:09:36.160 | on the user and what will be the best for them.
06:09:38.720 | And it requires a level of commitment of,
06:09:41.000 | okay, this is what the user feedback was.
06:09:42.600 | I have all these meetings.
06:09:43.440 | We're gonna skip that today and we're gonna do this.
06:09:45.440 | That level of focus and commitment is, I would say,
06:09:49.360 | underappreciated in the world.
06:09:52.680 | And also, you obviously have to have the talent
06:09:54.840 | to be able to execute on these things effectively.
06:09:56.960 | And yeah, we have that in loads.
06:10:00.160 | - Yeah, and this is such an interesting space of UX design
06:10:05.080 | because there's so many unknowns here.
06:10:08.320 | And I can tell UX is difficult
06:10:13.040 | because of how many people do it poorly.
06:10:15.360 | It's just not a trivial thing.
06:10:19.320 | - Yeah, UX is not something that you can always solve
06:10:23.720 | by just constant iterating on different things.
06:10:25.840 | Like sometimes you really need to step back
06:10:27.200 | and think globally, am I even in the right sort of minima
06:10:30.200 | to be chasing down for a solution?
06:10:32.160 | Like there's a lot of problems
06:10:33.000 | in which sort of fast iteration cycle
06:10:34.640 | is the predictor of how successful you will be.
06:10:37.600 | As a good example, like in an RL simulation, for example,
06:10:40.960 | the more frequently you get a reward,
06:10:42.280 | the faster you can progress.
06:10:44.120 | It's just an easier learning problem
06:10:45.320 | the more frequently you get feedback.
06:10:46.840 | But UX is not that way.
06:10:47.840 | I mean, users are actually quite often wrong
06:10:50.440 | about what the right solution is.
06:10:52.040 | And it requires a deep understanding
06:10:53.520 | of the technical system and what's possible
06:10:56.080 | combined with what the problem is you're trying to solve.
06:10:58.400 | Not just how the user expressed it,
06:10:59.640 | but what the true underlying problem is
06:11:01.760 | to actually get to the right place.
06:11:04.080 | - Yeah, that's the old like stories of Steve Jobs,
06:11:06.440 | like rolling in there.
06:11:07.680 | Yeah, the user is a useful signal,
06:11:11.440 | but it's not a perfect signal.
06:11:13.320 | And sometimes you have to remove the floppy disk drive
06:11:15.920 | or whatever the, I forgot all the crazy stories
06:11:18.840 | of Steve Jobs, like making wild design decisions.
06:11:23.840 | But there, some of it is aesthetic.
06:11:27.120 | Some of it is about the love you put into the design,
06:11:33.800 | which is very much a Steve Jobs, Jony Ive type thing.
06:11:36.760 | But when you have a human being using their brain
06:11:41.600 | to interact with it, it also is deeply about function.
06:11:45.200 | It's not just aesthetic.
06:11:46.720 | And you have to empathize with the human being before you,
06:11:52.520 | while not always listening to them directly.
06:11:54.960 | Like you have to deeply empathize.
06:11:58.280 | It's fascinating.
06:11:59.880 | It's really, really fascinating.
06:12:02.000 | And at the same time, iterate, right?
06:12:03.760 | But not iterate in small ways,
06:12:05.280 | sometimes a complete, like rebuilding the design.
06:12:08.960 | Nolan said that in the early days,
06:12:11.800 | the UX sucked, but you improved quickly.
06:12:14.720 | What was that journey like?
06:12:16.440 | - Yeah, I mean, I'll give one concrete example.
06:12:19.040 | So he really wanted to be able to read manga.
06:12:22.120 | This is something that he, I mean,
06:12:23.840 | it sounds like a simple thing,
06:12:24.840 | but it's actually a really big deal for him.
06:12:26.720 | And he couldn't do it with his mouse stick.
06:12:28.480 | It just wasn't accessible.
06:12:29.720 | You can't scroll with a mouse stick on his iPad
06:12:31.720 | and on the website that he wanted to be able to use
06:12:34.120 | to read the newest manga.
06:12:35.600 | And so-
06:12:36.440 | - Might be a good quick pause to say the mouse stick
06:12:38.240 | is the thing he's using, holding a stick in his mouth
06:12:42.120 | to scroll on a tablet.
06:12:44.120 | - Right, yeah.
06:12:44.960 | It's basically, you can imagine it's a stylus
06:12:46.240 | that you hold between your teeth.
06:12:47.320 | - Yeah.
06:12:48.160 | - It's basically a very long stylus.
06:12:49.000 | - And it's exhausting, it hurts, and it's inefficient.
06:12:54.000 | - Yeah, and maybe it's also worth calling out,
06:12:56.120 | there are other alternative assistive technologies,
06:12:57.880 | but that particular situation Nolan's in,
06:13:00.480 | and this is not uncommon,
06:13:01.640 | and I think it's also not well understood by folks,
06:13:04.000 | is that he's relatively spastic,
06:13:06.040 | so he'll have muscle spasms from time to time.
06:13:08.120 | And so any assistive technology that requires him
06:13:09.760 | to be positioned directly in front of a camera,
06:13:11.360 | for example, an eye tracker,
06:13:12.760 | or anything that requires him to put something in his mouth,
06:13:15.240 | just is a no-go,
06:13:16.200 | 'cause he'll either be shifted out of frame
06:13:18.280 | when he has a spasm,
06:13:19.120 | or if he has something in his mouth,
06:13:20.560 | it'll stab him in the face if he spasms too hard.
06:13:23.240 | So these kinds of considerations are important
06:13:24.880 | when thinking about what advantages a BCI has
06:13:27.080 | in someone's life.
06:13:27.920 | If it fits ergonomically into your life
06:13:29.600 | in a way that you can use it independently
06:13:31.600 | when your caretaker's not there,
06:13:33.040 | wherever you want to, either in the bed or in the chair,
06:13:35.200 | depending on your comfort level
06:13:37.200 | and your desire to avoid pressure sores,
06:13:39.400 | all these factors matter a lot
06:13:40.840 | in how good the solution is in that user's life.
06:13:45.360 | So one of these very fun examples is scroll.
06:13:48.040 | So again, manga is something he wanted to be able to read.
06:13:50.800 | And there's many ways to do scroll with a BCI.
06:13:55.120 | You can imagine different gestures, for example,
06:13:57.520 | the user could do that would move the page.
06:14:00.360 | But scroll is a very fascinating control surface
06:14:02.840 | because it's a huge thing on the screen in front of you.
06:14:06.200 | So any sort of jitter in the model output,
06:14:08.040 | any sort of error in the model output
06:14:09.320 | causes like an earthquake on the screen.
06:14:11.400 | Like you really don't want to have your manga page
06:14:12.840 | that you're trying to read be shifted up and down
06:14:15.040 | a few pixels just because your scroll decoder
06:14:17.920 | is not completely accurate.
06:14:19.680 | And so this was an example where we had to figure out
06:14:23.160 | how to formulate the problem in a way
06:14:24.760 | that the errors of the system, whenever they do occur,
06:14:27.200 | and we'll do our best to minimize them,
06:14:28.720 | but whenever those errors do occur,
06:14:30.360 | that it doesn't interrupt the qualia, again,
06:14:32.520 | of the experience that the user is having.
06:14:34.560 | It doesn't interrupt their flow of reading their book.
06:14:36.760 | And so what we ended up building
06:14:37.960 | is this really brilliant feature.
06:14:40.960 | This is a teammate named Bruce
06:14:42.520 | who worked on this really brilliant work
06:14:44.400 | called Quick Scroll.
06:14:45.240 | And Quick Scroll basically looks at the screen
06:14:47.600 | and it identifies where on the screen are scroll bars.
06:14:50.960 | And it does this by deeply integrating with macOS
06:14:52.920 | to understand where are the scroll bars
06:14:55.480 | actively present on the screen
06:14:56.680 | using the sort of accessibility tree
06:14:58.000 | that's available to macOS apps.
06:15:00.680 | And we identified where those scroll bars are
06:15:03.040 | and provided a BCI scroll bar.
06:15:05.120 | And the BCI scroll bar looks similar to a normal scroll bar,
06:15:08.040 | but it behaves very differently
06:15:09.120 | in that once you sort of move over to it,
06:15:11.040 | your cursor sort of morphs onto it.
06:15:12.800 | It sort of attaches or latches onto it.
06:15:14.800 | And then once you push up or down
06:15:16.800 | in the same way that you'd use a push to control,
06:15:19.320 | you know, the normal cursor,
06:15:21.080 | it actually moves the screen for you.
06:15:22.800 | So it's basically like remapping the velocity
06:15:24.600 | to a scroll action.
06:15:26.480 | And the reason that feels so natural and intuitive
06:15:28.480 | is that when you move over to attach to it,
06:15:30.440 | it feels like magnetic.
06:15:31.320 | So you're like sort of stuck onto it.
06:15:32.640 | And then it's one continuous action.
06:15:33.960 | You don't have to like switch your imagined movement.
06:15:36.040 | You sort of snap onto it and then you're good to go.
06:15:38.040 | You just immediately can start pulling the page down
06:15:39.920 | or pushing it up.
06:15:41.160 | And even once you get that right,
06:15:43.080 | there's so many little nuances
06:15:44.440 | of how the scroll behavior works
06:15:46.480 | to make it natural and intuitive.
06:15:47.840 | So one example is momentum.
06:15:49.640 | Like when you scroll a page with your fingers on the screen,
06:15:52.120 | you know, you actually have some like flow.
06:15:54.240 | Like it doesn't just stop right
06:15:55.280 | when you lift your finger up.
06:15:56.640 | The same is true with BCI scroll.
06:15:58.360 | So we had to spend some time to figure out
06:15:59.880 | what are the right nuances
06:16:00.840 | when you don't feel the screen
06:16:01.880 | under your fingertip anymore?
06:16:03.320 | What is the right sort of dynamic
06:16:04.720 | or what's the right amount of page give, if you will,
06:16:08.560 | when you push it to make it flow the right amount
06:16:10.640 | for the user to have a natural experience
06:16:13.160 | reading their book?
06:16:14.480 | And there's a million, I mean, I could tell you,
06:16:16.040 | like there's so many little minutiae
06:16:17.800 | of how exactly that scroll works
06:16:19.160 | that we spent probably like a month getting right
06:16:22.080 | to make that feel extremely natural
06:16:23.840 | and easy for the user to navigate.
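To make the quick-scroll mechanics above concrete, here is a minimal Python sketch of the three ideas described: latching the cursor onto a scroll bar once it gets close, remapping decoded velocity into page movement, and letting momentum decay after the push ends. The class name, latch radius, gain, and decay constants are all invented for illustration; this is not Neuralink's implementation.

```python
import math

class QuickScrollSketch:
    """Toy model of a 'quick scroll' interaction; constants are hypothetical."""

    LATCH_RADIUS_PX = 40      # how close the cursor must get before it "snaps on"
    GAIN = 3.0                # decoded velocity -> scroll velocity gain
    MOMENTUM_DECAY = 0.92     # per-tick decay so the page keeps gliding briefly

    def __init__(self):
        self.latched = False
        self.scroll_velocity = 0.0   # pixels per tick, positive scrolls down

    def update(self, cursor_xy, scrollbar_xy, decoded_vy):
        """One control tick: returns how many pixels to scroll the page."""
        dx = cursor_xy[0] - scrollbar_xy[0]
        dy = cursor_xy[1] - scrollbar_xy[1]
        if not self.latched and math.hypot(dx, dy) < self.LATCH_RADIUS_PX:
            self.latched = True   # cursor "morphs onto" the BCI scroll bar

        if self.latched and abs(decoded_vy) > 0.0:
            # While pushing, decoded vertical velocity drives the page directly.
            self.scroll_velocity = self.GAIN * decoded_vy
        else:
            # After the push ends, keep some momentum so the page doesn't stop dead.
            self.scroll_velocity *= self.MOMENTUM_DECAY
            if abs(self.scroll_velocity) < 0.1:
                self.scroll_velocity = 0.0
        return self.scroll_velocity


if __name__ == "__main__":
    qs = QuickScrollSketch()
    # Approach the bar, push down for a few ticks, then release and coast.
    ticks = [((300, 200), 0.0), ((390, 300), 0.0), ((400, 300), 2.0),
             ((400, 300), 2.0), ((400, 300), 0.0), ((400, 300), 0.0)]
    for t, (cursor, vy) in enumerate(ticks):
        print(t, round(qs.update(cursor, (400, 300), vy), 2))
```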
06:16:25.520 | - I mean, even the scroll on a smartphone with your finger
06:16:29.280 | feels extremely natural and pleasant.
06:16:32.400 | And it probably takes an extremely long time
06:16:37.360 | to get that right.
06:16:38.520 | And actually the same kind of visionary UX design
06:16:43.280 | that we're talking about.
06:16:44.320 | Don't always listen to the users,
06:16:45.840 | but also listen to them and also have like visionary big,
06:16:49.160 | like throw everything out,
06:16:50.520 | think from first principles, but also not.
06:16:53.680 | Yeah, yeah.
06:16:54.720 | By the way, it just makes me think that scroll bars
06:16:57.240 | on the desktop probably have stagnated
06:17:00.960 | and never adopted that, 'cause the snap,
06:17:03.600 | same as like snap to grid,
06:17:05.120 | the snap-to-scroll-bar action you're talking about
06:17:08.000 | is something that could potentially be extremely useful
06:17:11.440 | in the desktop setting.
06:17:13.040 | Even just for users to just improve the experience.
06:17:16.480 | 'Cause the current scroll bar experience
06:17:18.120 | in the desktop is horrible.
06:17:20.320 | It's hard to find, hard to control.
06:17:22.160 | There's no momentum.
06:17:23.360 | And the intention should be clear.
06:17:26.440 | When I start moving towards a scroll bar,
06:17:28.520 | there should be a snapping to the scroll bar action.
06:17:31.240 | But of course, maybe I'm okay paying that cost,
06:17:36.080 | but there's hundreds of millions of people
06:17:38.600 | paying that cost nonstop.
06:17:39.920 | But anyway, but in this case,
06:17:42.440 | this is necessary because there's an extra cost
06:17:46.400 | paid by Nolan for the jitteriness.
06:17:48.480 | So you have to switch between the scrolling and the reading.
06:17:53.320 | There has to be a phase shift between the two.
06:17:56.080 | Like when you're scrolling, you're scrolling.
06:17:58.600 | - Right, right.
06:17:59.440 | So that is one drawback of the current approach.
06:18:02.000 | Maybe one other just sort of case study here.
06:18:04.400 | So again, UX is how it works.
06:18:06.000 | And we think about that holistically
06:18:07.240 | from like even the feature detection level
06:18:09.440 | of what we detect in the brain
06:18:10.640 | to how we design the decoder, what we choose to decode
06:18:12.920 | to then how it works once it's being used by the user.
06:18:15.040 | So another good example in that sort of how it works
06:18:16.680 | once they're actually using the decoder.
06:18:19.360 | The output that's displayed on the screen
06:18:20.560 | is not just what the decoder says.
06:18:22.000 | It's also a function of what's going on on the screen.
06:18:25.040 | So we can understand, for example,
06:18:26.400 | that when you're trying to close a tab,
06:18:28.960 | that very small, stupid little X that's extremely tiny,
06:18:31.960 | which is hard to get precisely hit
06:18:34.240 | if you're dealing with sort of a noisy output of the decoder,
06:18:36.680 | we can understand that that is a small little X
06:18:38.320 | you might be trying to hit
06:18:39.200 | and actually make it a bigger target for you.
06:18:40.920 | Similar to how when you're typing on your phone,
06:18:42.880 | if you're used to like the iOS keyboard, for example,
06:18:46.360 | it actually adapts the target size of individual keys
06:18:49.120 | based on the underlying language model.
06:18:50.400 | So it'll actually understand if I'm typing,
06:18:52.320 | "Hey, I'm going to see L."
06:18:54.280 | It'll make the E key bigger
06:18:56.240 | because it knows Lex is the person I'm going to go see.
06:18:58.440 | And so that kind of predictiveness
06:19:00.160 | can make the experience much more smooth,
06:19:01.920 | even without improvements to the underlying decoder
06:19:04.600 | or feature detection part of the stack.
06:19:07.840 | So we do that with a feature called magnetic targets.
06:19:09.520 | We actually index the screen and we understand,
06:19:11.560 | okay, these are the places that are very small targets
06:19:14.160 | that might be difficult to hit.
06:19:15.360 | Here's the kind of cursor dynamics around that location
06:19:17.720 | that might be indicative of the user trying to select it.
06:19:19.840 | Let's make it easier.
06:19:20.880 | Let's blow up the size of it in a way that makes it easier
06:19:22.760 | for the user to sort of snap onto that target.
06:19:24.800 | So all these little details,
06:19:25.640 | they matter a lot in helping the user be independent
06:19:27.560 | in their day-to-day living.
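A rough sketch of the magnetic-targets idea described above: if a target is small and the cursor dynamics suggest the user is homing in on it, enlarge its effective hit area. The thresholds and the approach test here are made-up assumptions for illustration, not the actual feature.

```python
from dataclasses import dataclass

@dataclass
class Target:
    x: float
    y: float
    w: float
    h: float

def effective_hit_box(target: Target, cursor_xy, cursor_v,
                      small_px: float = 20.0, boost: float = 2.5) -> Target:
    """Return an (illustrative) enlarged hit box for small targets the cursor
    appears to be approaching; thresholds here are invented values."""
    is_small = target.w < small_px or target.h < small_px
    # Is the cursor moving toward the target? Positive dot product of the
    # cursor velocity with the cursor-to-target direction.
    to_target = (target.x - cursor_xy[0], target.y - cursor_xy[1])
    moving_toward = (cursor_v[0] * to_target[0] + cursor_v[1] * to_target[1]) > 0.0
    if is_small and moving_toward:
        return Target(target.x, target.y, target.w * boost, target.h * boost)
    return target

# Example: a tiny close-tab "X" becomes easier to hit while being approached.
tiny_x = Target(x=500, y=10, w=12, h=12)
print(effective_hit_box(tiny_x, cursor_xy=(480, 30), cursor_v=(5, -5)))
```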
06:19:29.160 | - So how much of the work on the decoder
06:19:31.440 | is generalizable to P2, P3, P4, P5, PN?
06:19:35.840 | How do you improve the decoder
06:19:38.800 | in a way that's generalizable?
06:19:40.920 | - Yeah, great question.
06:19:41.840 | So the underlying signal we're trying to decode
06:19:44.320 | is going to look very different in P2 than in P1.
06:19:46.120 | For example, channel number 345
06:19:48.720 | is going to mean something different in user one
06:19:50.320 | than it will in user two,
06:19:51.440 | just because that electrode that corresponds
06:19:52.960 | with channel 345 is going to be next to a different neuron
06:19:55.600 | in user one versus user two.
06:19:57.720 | But the approaches, the methods,
06:19:59.280 | the user experience of how do you get the right
06:20:01.360 | sort of behavioral pattern from the user
06:20:03.280 | to associate with that neural signal,
06:20:04.760 | we hope will translate over multiple generations of users.
06:20:08.400 | And beyond that, it's very, very possible,
06:20:10.000 | in fact, quite likely that we've overfit
06:20:11.520 | to sort of Nolan's user experience desires and preferences.
06:20:14.800 | And so what I hope to see is that,
06:20:17.000 | you know, when we get a second, third, fourth participant,
06:20:19.680 | that we find sort of what the right wide minimas are
06:20:22.000 | that cover all the cases,
06:20:23.200 | that make it more intuitive for everyone.
06:20:24.400 | And hopefully there's a cross-pollination of things where,
06:20:26.520 | oh, we didn't think about that with this user
06:20:28.080 | because, you know, they can speak.
06:20:30.400 | But with this user who just can fundamentally
06:20:31.960 | not speak at all,
06:20:33.200 | this user experience is not optimal.
06:20:34.760 | And that will actually,
06:20:35.600 | those improvements that we make there
06:20:36.960 | should hopefully translate then
06:20:37.880 | to even people who can speak,
06:20:38.840 | but don't feel comfortable doing so
06:20:40.320 | because we're in a public setting
06:20:41.240 | like their doctor's office.
06:20:42.680 | - So the actual mechanism of open loop labeling
06:20:46.840 | and then closed loop labeling will be the same
06:20:49.760 | and hopefully can generalize across the different users
06:20:52.840 | as they're doing the calibration step.
06:20:54.800 | And the calibration step is pretty cool.
06:20:59.040 | I mean, that in itself,
06:21:00.520 | the interesting thing about WebGrid,
06:21:02.320 | which is like closed loop, it's like fun.
06:21:05.480 | I love it when there's like,
06:21:07.200 | there used to be this kind of idea of human computation,
06:21:10.560 | which is using actions a human would want to do anyway
06:21:13.920 | to get a lot of signal from.
06:21:15.680 | And like WebGrid is that,
06:21:16.920 | like a nice video game
06:21:17.920 | that also serves as great calibration.
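One way a calibration game like WebGrid can produce labels, sketched under a common assumption from the BCI literature (not necessarily the exact scheme used here): treat the user's intended velocity in each time bin as the unit vector pointing from the cursor toward the cued target.

```python
import numpy as np

def open_loop_velocity_labels(cursor_xy: np.ndarray, target_xy: np.ndarray) -> np.ndarray:
    """Heuristic calibration labels: assume intended velocity is the unit
    vector from the current cursor position toward the cued target.
    cursor_xy, target_xy: arrays of shape (T, 2), one row per time bin."""
    delta = target_xy - cursor_xy
    norm = np.linalg.norm(delta, axis=1, keepdims=True)
    norm[norm == 0] = 1.0   # once on target, intended velocity is zero (0 / 1 = 0)
    return delta / norm

# Toy example: cursor drifting toward a target at (10, 0).
cursor = np.array([[0.0, 0.0], [2.0, 1.0], [10.0, 0.0]])
target = np.tile([10.0, 0.0], (3, 1))
print(open_loop_velocity_labels(cursor, target))
```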
06:21:20.320 | - It's so funny.
06:21:21.480 | I've heard this reaction so many times.
06:21:22.960 | Before sort of the first user was implanted,
06:21:26.520 | we had an internal perception
06:21:28.360 | that the first user would not find this fun.
06:21:30.280 | And so we thought really quite a bit actually
06:21:31.840 | about like, should we build other games
06:21:33.320 | that like are more interesting for the user
06:21:35.720 | so we can get this kind of data
06:21:36.880 | and help facilitate research
06:21:38.280 | that's for long duration and stuff like this.
06:21:40.280 | It turns out that like people love this game.
06:21:42.160 | I always loved it,
06:21:43.000 | but I didn't know that that was a shared perception.
06:21:45.760 | - Yeah, and just in case it's not clear,
06:21:48.120 | WebGrid is, there's a grid of let's say 35 by 35 cells
06:21:53.120 | and one of them lights up blue
06:21:56.400 | and you have to move your mouse over that and click on it.
06:21:58.880 | And if you miss it and it's red and-
06:22:01.320 | - I played this game for so many hours, so many hours.
06:22:04.320 | - And what's your record, you said?
06:22:06.280 | - My, I think I have the highest at Neuralink right now.
06:22:08.440 | My record is 17 BPS.
06:22:09.880 | - 17 BPS.
06:22:10.720 | - Which is about, if you imagine that 35 by 35 grid,
06:22:13.120 | you're hitting about a hundred trials per minute.
06:22:15.960 | So a hundred correct selections in that one minute window.
06:22:18.480 | So you're averaging about,
06:22:19.640 | between 500, 600 milliseconds per selection.
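A quick check of those numbers, assuming the grid-task bitrate convention commonly used for this benchmark (bits per selection = log2 of the number of targets minus one, times net correct selections per second):

```python
import math

# Worked check of the figures quoted above, under the common grid-task metric.
grid = 35 * 35                              # 1225 cells
bits_per_selection = math.log2(grid - 1)    # ~10.26 bits per correct selection
net_correct_per_minute = 100
bps = bits_per_selection * net_correct_per_minute / 60.0
print(round(bits_per_selection, 2), round(bps, 1))   # ~10.26 bits, ~17.1 BPS
# ~100 selections/minute also implies roughly 60 / 100 = 0.6 s per selection,
# matching the 500-600 millisecond figure mentioned above.
```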
06:22:21.960 | - So one of the reasons I think I struggle with that game
06:22:25.640 | is I'm such a keyboard person.
06:22:26.960 | So everything is done with your keyboard.
06:22:29.320 | If I can avoid touching the mouse, it's great.
06:22:33.160 | So how can you explain your high performance?
06:22:36.280 | - I have like a whole ritual I go through
06:22:37.680 | when I play WebGrid.
06:22:38.520 | So it's actually like a diet plan
06:22:40.440 | associated with this whole thing.
06:22:42.440 | So the first thing is-
06:22:43.280 | - I have to fast for five days.
06:22:45.040 | I have to go up to the mountains.
06:22:46.880 | - I mean, the fasting thing is important.
06:22:48.520 | So this is like, you know-
06:22:49.360 | - Focuses the mind, yeah.
06:22:50.520 | - Yeah.
06:22:51.360 | - It's true.
06:22:52.200 | - So what I do is I actually,
06:22:53.080 | I don't eat for a little bit beforehand.
06:22:55.440 | And then I'll actually eat like a ton of peanut butter
06:22:57.280 | right before I play.
06:22:58.240 | And I get like-
06:22:59.080 | - This is a real thing.
06:22:59.920 | - This is a real thing, yeah.
06:23:00.740 | And then it has to be really late at night.
06:23:02.120 | This is again a night owl thing I think we share,
06:23:03.980 | but it has to be like, you know,
06:23:05.320 | midnight, 2 a.m. kind of time window.
06:23:07.360 | And I have a very specific,
06:23:08.680 | like physical position I'll sit in,
06:23:10.400 | which is, I used to be, I was homeschooled growing up.
06:23:13.000 | And so I did most of my work like on the floor,
06:23:15.240 | just like in my bedroom or whatever.
06:23:16.760 | And so I have a very specific situation-
06:23:18.560 | - On the floor.
06:23:19.400 | - On the floor that I sit and play.
06:23:21.280 | And then you have to make sure
06:23:22.120 | like there's not a lot of weight on your elbow
06:23:23.640 | when you're playing so that you can move quickly.
06:23:25.400 | And then I turn the gain of the cursor.
06:23:26.800 | So the speed of the cursor way, way up.
06:23:28.040 | So it's like small motions that actually move the cursor.
06:23:30.000 | - Are you moving with your wrist or you're never-
06:23:32.600 | - I move with my fingers.
06:23:33.880 | So my wrist is almost completely still.
06:23:35.560 | I'm just moving my fingers, yeah.
06:23:37.880 | - You know those, just in a small tangent.
06:23:39.720 | - Yeah.
06:23:40.640 | - Which I've been meaning to go down this rabbit hole
06:23:43.680 | of people that set the world record in Tetris.
06:23:47.280 | Those folks, they're playing, there's a way to,
06:23:49.600 | did you see this?
06:23:50.440 | - I see, like all the fingers are moving.
06:23:52.360 | - Yeah, you could find a way to do it
06:23:56.000 | where like it's using a loophole, like a bug,
06:23:59.040 | that you can do some incredibly fast stuff.
06:24:01.240 | So it's along that line, but not quite.
06:24:04.400 | But you do realize there'll be like a few programmers
06:24:06.600 | right now listening to this who will fast
06:24:08.240 | and eat peanut butter.
06:24:09.080 | - Yeah, please, please break my record.
06:24:10.440 | I mean, the reason I did this literally was just
06:24:12.360 | because I wanted the bar to be high for the team.
06:24:14.400 | Like I wanted the number that we aim for
06:24:16.440 | should not be like the median performance.
06:24:17.760 | It should be like, it should be able to beat all of us
06:24:19.240 | at least, like that should be the minimum bar.
06:24:21.360 | - What do you think is possible?
06:24:22.280 | Like 20?
06:24:23.320 | - Yeah, I don't know what the limit, I mean, the limits,
06:24:25.000 | you can calculate just in terms of like screen refresh rate
06:24:28.000 | and like cursor immediately jumping to the next target.
06:24:31.000 | But there's, I mean, I'm sure there's limits before that
06:24:32.800 | with just sort of reaction time and visual perception
06:24:34.840 | and things like this.
06:24:36.600 | I'd guess it's below 40, but above 20,
06:24:39.640 | somewhere in there.
06:24:40.480 | It's probably the right, the right number
06:24:41.760 | to be thinking about.
06:24:42.600 | It also matters like how difficult the task is.
06:24:44.080 | You can imagine like some people might be able to do
06:24:46.640 | like 10,000 targets on the screen
06:24:48.160 | and maybe they can do better that way.
06:24:50.760 | So there's some like task optimizations you could do
06:24:53.080 | to try to boost your performance as well.
06:24:55.440 | - What do you think it takes for Nolan
06:24:57.800 | to be able to do above 8.5,
06:25:00.200 | to keep increasing that number?
06:25:01.520 | You said like every increase in the number
06:25:04.440 | might require different.
06:25:05.920 | - Yeah.
06:25:06.760 | - Different improvements in the system.
06:25:08.760 | - Yeah, I think the nature of this work is,
06:25:10.720 | I mean, the first answer that's important to say
06:25:12.160 | is I don't know.
06:25:13.520 | This is the edge of the research.
06:25:15.520 | So again, nobody's gotten to that number before.
06:25:18.560 | So what's next is gonna be a heuristic,
06:25:21.480 | a guess on my part.
06:25:22.640 | What we've seen historically is that different parts
06:25:26.880 | of the stack become the bottleneck at different time points.
06:25:28.520 | So when I first joined Neuralink like three years ago or so,
06:25:31.600 | one of the major problems was just the latency
06:25:33.440 | of the Bluetooth connection.
06:25:34.280 | It was just like the radio device wasn't super good.
06:25:36.760 | It was an early revision of the implant.
06:25:38.600 | And it just like, no matter how good your decoder was,
06:25:41.000 | if your thing is updating every 30 milliseconds
06:25:42.840 | or 50 milliseconds, it's just gonna be choppy.
06:25:44.960 | And no matter how good you are,
06:25:46.640 | that's gonna be frustrating and lead to challenges.
06:25:49.360 | So at that point, it was very clear that the main challenge
06:25:51.880 | is just get the data off the device in a very reliable way
06:25:55.640 | such that you can enable the next challenge to be tackled.
06:25:59.360 | And then at some point it was,
06:26:03.000 | actually the modeling challenge of how do you
06:26:05.520 | just build a good mapping,
06:26:06.640 | like the supervised learning problem
06:26:08.080 | of you have a bunch of data
06:26:09.600 | and you have a label you're trying to predict,
06:26:11.440 | just what is the right neural decoder architecture
06:26:14.080 | and hyperparameters to optimize that?
06:26:15.840 | That was the problem for a bit.
06:26:16.760 | And once you solve that, it became a different bottleneck.
06:26:19.520 | I think the next bottleneck after that
06:26:20.880 | was actually just sort of software stability and reliability.
06:26:24.320 | If you have widely varying sort of inference latency
06:26:27.640 | in your system or your app just lags out
06:26:32.680 | every once in a while, it decreases your ability
06:26:34.440 | to maintain and get in a state of flow
06:26:35.880 | and it basically just disrupts your control experience.
06:26:38.280 | And so there's a variety of different software bugs
06:26:40.200 | and improvements we made that basically increased
06:26:42.520 | the performance of the system,
06:26:43.760 | made it much more reliable, much more stable,
06:26:45.920 | and led to a state where we could reliably collect data
06:26:48.240 | to build better models with.
06:26:49.600 | So that was a bottleneck for a while.
06:26:50.640 | It's just sort of like the software stack itself.
06:26:53.320 | If I were to guess right now,
06:26:55.120 | there's sort of two major directions you could think about
06:26:58.160 | for improving BPS further.
06:27:00.200 | The first major direction is labeling.
06:27:01.680 | So labeling is, again, this fundamental challenge
06:27:03.440 | of given a window of time where the user
06:27:06.640 | is expressing some behavioral intent,
06:27:08.640 | what are they really trying to do
06:27:10.280 | at the granularity of every millisecond?
06:27:12.840 | And that, again, is a task design problem.
06:27:14.880 | It's a UX problem.
06:27:16.040 | It's a machine learning problem.
06:27:17.440 | It's a software problem.
06:27:19.120 | Sort of touches all those different domains.
06:27:20.920 | The second thing you can think about
06:27:22.760 | to improve BPS further is either completely changing
06:27:25.840 | the thing you're decoding or just extending
06:27:27.600 | the number of things that you're decoding.
06:27:29.160 | So this is sort of in the direction of functionality.
06:27:31.000 | Basically, you can imagine giving more clicks.
06:27:32.960 | For example, a left click, a right click, a middle click,
06:27:35.440 | different actions like click and drag, for example,
06:27:37.120 | and that can improve the effective bit rate
06:27:39.000 | of your communication processes.
06:27:40.520 | If you're trying to allow the user to express themselves
06:27:43.400 | through any given communication channel,
06:27:45.040 | you can measure that with bits per second.
06:27:46.760 | But what actually matters at the end of the day
06:27:48.160 | is how effective are they at navigating their computer?
06:27:51.000 | And so from the perspective of the downstream tasks
06:27:52.760 | that you care about, functionality and extending
06:27:54.400 | functionality is something we're very interested in
06:27:55.800 | because not only can it improve the sort of number of BPS,
06:27:59.000 | but it can also improve the downstream sort of independence
06:28:01.480 | that the user has and the skill and efficiency
06:28:02.920 | with which they can operate their computer.
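As a back-of-the-envelope illustration of why extra decoded actions raise the effective bit rate: if each selection can also carry a choice among k click types, that choice is worth roughly log2(k) additional bits, under the simplifying assumptions that actions are decoded reliably and used equally often. This is a rough sketch, not Neuralink's metric.

```python
import math

def effective_bits_per_selection(n_targets: int, n_actions: int) -> float:
    """Rough illustration: a selection among n_targets carries log2(n_targets - 1)
    bits (grid-task convention), and independently choosing one of n_actions
    (e.g. left / right / middle click, drag) adds about log2(n_actions) bits.
    Ignores unequal action frequencies and decoding errors."""
    return math.log2(n_targets - 1) + math.log2(n_actions)

for actions in (1, 2, 4):
    print(actions, round(effective_bits_per_selection(35 * 35, actions), 2))
# 1 action  -> ~10.26 bits/selection
# 4 actions -> ~12.26 bits/selection, i.e. roughly 20% more per selection
# at the same selection pace.
```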
06:28:05.360 | - Would the number of threads increasing
06:28:08.560 | also potentially help?
06:28:10.640 | - Yes, short answer is yes.
06:28:12.280 | It's a bit nuanced how that curve
06:28:14.440 | or how that manifests in the numbers.
06:28:19.040 | So what you'll see is that if you sort of plot a curve
06:28:22.560 | of number of channels that you're using for decode
06:28:26.040 | versus either the offline metric
06:28:27.760 | of how good you are at decoding
06:28:29.760 | or the online metric of sort of in practice
06:28:32.360 | how good is the user at using this device,
06:28:34.560 | you see roughly a log curve.
06:28:36.160 | So as you move further out in number of channels,
06:28:39.520 | you get a corresponding sort of logarithmic improvement
06:28:41.600 | in control quality and offline validation metrics.
06:28:44.400 | The important nuance here is that each channel
06:28:49.000 | corresponds with a specific, you know,
06:28:51.480 | represented intention in the brain.
06:28:53.240 | So for example, if you have a channel 254,
06:28:55.960 | it might correspond with moving to the right.
06:28:57.600 | Channel 256 might mean move to the left.
06:29:00.280 | If you want to expand the number of functions
06:29:02.560 | you want to control,
06:29:04.880 | you really want to have a broader set of channels
06:29:07.120 | that covers a broader set of imagined movements.
06:29:09.000 | You can think of it like,
06:29:10.320 | kind of like Mr. Potato Man, actually.
06:29:11.520 | Like if you had a bunch of different imagined movements
06:29:13.720 | you could do,
06:29:14.560 | how would you map those imagined movements
06:29:15.920 | to input to a computer?
06:29:17.880 | You could imagine, you know,
06:29:18.720 | handwriting to output characters on the screen.
06:29:20.400 | You could imagine just typing with your fingers
06:29:21.920 | and have that output text on the screen.
06:29:23.480 | You could imagine different finger modulations
06:29:24.920 | for different clicks.
06:29:25.760 | You could imagine wiggling your big nose
06:29:26.800 | for opening some menu or wiggling your, you know,
06:29:30.160 | your big toe to have like a command tab occur
06:29:33.000 | or something like this.
06:29:33.840 | So it's really the amount of different actions
06:29:36.400 | you can take in the world
06:29:37.680 | depends on how many channels you have
06:29:38.960 | and the information content that they carry.
06:29:40.960 | - Right, so that's more about the number of actions.
06:29:44.240 | So actually as you increase the number of threads,
06:29:46.720 | that's more about increasing the number of actions
06:29:50.880 | you're able to perform.
06:29:51.960 | - One other nuance there that is worth mentioning.
06:29:53.760 | So again, our goal is really to enable a user
06:29:55.440 | with paralysis to control the computer as fast as I can.
06:29:57.560 | So that's BPS with all the same functionality I have,
06:30:00.920 | which is what we just talked about,
06:30:01.840 | but then also as reliably as I can.
06:30:04.200 | And that last point is very related
06:30:06.560 | to channel count discussion.
06:30:07.640 | So as you scale out number of channels,
06:30:10.280 | the relative importance of any particular feature
06:30:12.760 | of your model input to the output control
06:30:14.640 | of the user diminishes,
06:30:16.120 | which means that if the sort of neural non-stationarity
06:30:18.720 | effect is per channel,
06:30:20.760 | or if the noise is independent,
06:30:22.280 | such that more channels means, on average,
06:30:24.080 | less effect on the output,
06:30:25.960 | then the reliability of your system will improve.
06:30:28.560 | So one sort of core thesis that at least I have
06:30:31.320 | is that scaling channel count
06:30:32.280 | should improve the reliability of the system
06:30:33.600 | without any work on the decoder itself.
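A toy simulation of that thesis, under the stated assumption that per-channel noise is roughly independent: averaging over more channels shrinks the error in the decoded output, with diminishing returns as channel count grows. All numbers here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def output_error(n_channels: int, n_trials: int = 20000) -> float:
    """Each channel reports the same 1-D intent plus independent noise; the
    'decoder' here is just the channel average. Error std shrinks ~1/sqrt(N)."""
    intent = 1.0
    noise = rng.normal(0.0, 1.0, size=(n_trials, n_channels))
    estimate = (intent + noise).mean(axis=1)
    return float(np.std(estimate - intent))

for n in (64, 256, 1024, 4096):
    print(n, round(output_error(n), 4))
# Error falls by ~2x for every 4x in channel count: diminishing returns,
# consistent with the roughly logarithmic quality-vs-channels curve described
# above, even before any improvement to the decoder itself.
```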
06:30:36.360 | - Can you linger on the reliability here?
06:30:40.200 | So first of all,
06:30:41.040 | when you say a non-stationarity of the signal,
06:30:43.560 | which aspect are you referring to?
06:30:46.400 | - Yeah, so maybe let's talk briefly
06:30:47.800 | what the actual underlying signal looks like.
06:30:49.440 | So again, I spoke very briefly at the beginning
06:30:51.480 | about how when you imagine moving to the right
06:30:53.440 | or imagine moving to the left,
06:30:54.720 | neurons might fire more or less.
06:30:56.120 | And their frequency content of that signal,
06:30:57.960 | at least in the motor cortex,
06:30:59.000 | it's very correlated with the output intention
06:31:01.160 | or the behavioral task that the user is doing.
06:31:03.760 | You can imagine actually,
06:31:04.600 | it's not obvious that rate coding,
06:31:06.040 | which is the name of that phenomenon,
06:31:07.920 | is the only way the brain can represent information.
06:31:09.480 | You can imagine many different ways
06:31:11.040 | in which the brain could encode intention.
06:31:13.240 | And there's actually evidence like in bats, for example,
06:31:15.280 | that there's temporal codes.
06:31:16.800 | So timing codes of like exactly when particular neurons fire
06:31:20.080 | is the mechanism of information representation.
06:31:24.000 | But at least in the motor cortex,
06:31:25.040 | there's substantial evidence that it's rate coding,
06:31:27.720 | or at least one like first order effect
06:31:29.640 | is that it's rate coding.
06:31:31.080 | So then if the brain is representing information
06:31:33.600 | by changing the sort of frequency of a neuron firing,
06:31:37.840 | what really matters is sort of the delta
06:31:39.640 | between sort of the baseline state of the neuron
06:31:41.680 | and what it looks like when it's modulated.
06:31:43.680 | And what we've observed
06:31:44.760 | and what has also been observed in academic work
06:31:46.600 | is that that baseline rate,
06:31:48.600 | if you're to tare the scale,
06:31:49.560 | if you imagine that analogy for like measuring flour
06:31:53.400 | or something when you're baking,
06:31:54.640 | that baseline state of how much the pot weighs
06:31:57.160 | is actually different day to day.
06:31:59.520 | And so if what you're trying to measure
06:32:00.600 | is how much rice is in the pot,
06:32:02.160 | you're gonna get a different measurement different days
06:32:03.680 | because you're measuring with different pots.
06:32:05.320 | So that baseline rate shifting is really the thing that,
06:32:08.560 | at least from a first order description of the problem
06:32:11.280 | is what's causing this downstream bias.
06:32:12.960 | There can be other effects,
06:32:13.880 | nonlinear effects on top of that,
06:32:15.040 | but at least at a very first order description
06:32:16.680 | of the problem, that's what we observed day to day
06:32:18.560 | is that the baseline firing rate of any particular neuron
06:32:21.240 | or observed on a particular channel is changing.
06:32:23.600 | - So can you just adjust to the baseline
06:32:27.320 | to make it relative to the baseline nonstop?
06:32:29.440 | - Yeah, this is a great question.
06:32:30.720 | So with monkeys, we have found various ways to do this.
06:32:35.480 | One example of how we'd do this is you ask them
06:32:38.200 | to do some behavioral task,
06:32:39.960 | like play the game with a joystick.
06:32:41.560 | You measure what's going on in the brain.
06:32:43.640 | You compute some mean of what's going on
06:32:45.720 | across all the input features,
06:32:46.800 | and you subtract that in the input
06:32:47.920 | when you're doing your BCI session.
06:32:49.360 | Works super well.
06:32:50.440 | For whatever reason,
06:32:52.160 | that doesn't work super well with Nolan.
06:32:54.960 | I actually don't know the full reason why,
06:32:56.640 | but I can imagine several explanations.
06:32:58.960 | One such explanation could be
06:33:00.160 | that the context effect difference
06:33:01.560 | between some open loop task and some closed loop task
06:33:03.720 | is much more significant with Nolan
06:33:06.360 | than it is with monkey.
06:33:07.200 | Maybe in this open loop task,
06:33:08.120 | he's watching the Lex Fridman podcast
06:33:10.200 | while he's doing the task,
06:33:11.440 | or he's whistling and listening to music
06:33:13.080 | and talking with his friend
06:33:13.920 | and asking his mom what's for dinner
06:33:15.360 | while he's doing this task.
06:33:16.320 | And so the exact sort of difference in context
06:33:20.320 | between those two states may be much larger
06:33:22.360 | and thus lead to a bigger generalization gap
06:33:24.280 | between the features that you're normalizing
06:33:26.440 | at sort of open loop time
06:33:27.920 | and what you're trying to use at closed loop time.
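A toy sketch of the "tare the scale" problem and the open-loop mean-subtraction fix just described: rate-coded features sit on a per-channel baseline that drifts, subtracting a mean estimated during an open-loop block removes it, and a context shift between the open-loop and closed-loop blocks leaves a residual bias. All values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels = 8

# Day-to-day baseline firing rates (the "weight of the pot") plus a tuning
# component that carries the actual intent (the "rice").
baseline_today = rng.uniform(5.0, 20.0, n_channels)
tuning = rng.normal(0.0, 1.0, n_channels)

def observe(intent: float, context_offset: float = 0.0) -> np.ndarray:
    """Binned firing-rate features: baseline + intent * tuning + context + noise."""
    return (baseline_today + intent * tuning + context_offset
            + rng.normal(0.0, 0.5, n_channels))

# Open-loop block: estimate the mean to use as the "tare".
open_loop = np.stack([observe(intent=0.0) for _ in range(200)])
tare = open_loop.mean(axis=0)

# Closed-loop block in the same context: subtraction recovers intent * tuning.
print(np.round(observe(1.0) - tare - tuning, 2))   # residual is just noise

# Closed-loop block in a different context (chatting, music on, etc.):
# the tare no longer matches, so a bias leaks into the decoder input.
print(np.round(observe(1.0, context_offset=3.0) - tare - tuning, 2))  # ~3.0 bias
```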
06:33:29.840 | - That's interesting.
06:33:30.680 | Just on that point,
06:33:32.240 | it's kind of incredible to watch Nolan
06:33:33.800 | be able to multitask,
06:33:35.760 | to do multiple tasks at the same time,
06:33:37.800 | to be able to move the mouse cursor effectively
06:33:40.440 | while talking and while being nervous,
06:33:43.640 | because he's talking in front of you.
06:33:44.480 | - Kicking my ass in chess too, yeah.
06:33:46.040 | - Kicking your ass and talking trash while doing it.
06:33:50.160 | So all at the same time.
06:33:51.680 | And yes, if you're trying to normalize to the baseline,
06:33:55.400 | that might throw everything off.
06:33:57.000 | Boy, is that interesting.
06:33:59.600 | - Maybe one comment on that too,
06:34:00.960 | for folks that aren't familiar with assistive technology,
06:34:03.280 | I think there's a common belief that,
06:34:05.080 | why can't you just use an eye tracker
06:34:06.440 | or something like this for helping somebody
06:34:08.760 | move a mouse on the screen?
06:34:09.920 | And it's really a fair question, and one where
06:34:12.280 | I actually was not confident, before seeing Nolan,
06:34:15.400 | that this was going to be
06:34:16.240 | a profoundly transformative technology for people like him.
06:34:19.400 | And I'm very confident now that it will be,
06:34:21.400 | but the reasons are subtle.
06:34:22.360 | It really has to do with ergonomically
06:34:23.880 | how it fits into their life.
06:34:25.400 | Even if you can just offer the same level of control
06:34:28.200 | as what they would have with an eye tracker
06:34:29.760 | or with a mouse stick,
06:34:31.320 | but you don't need to have that thing in your face.
06:34:33.080 | You don't need to be positioned a certain way.
06:34:34.800 | You don't need your caretaker to be around
06:34:36.240 | to set it up for you.
06:34:37.400 | You can activate it when you want,
06:34:38.800 | how you want, wherever you want.
06:34:40.320 | That level of independence is so game-changing for people.
06:34:43.640 | It means that they can text a friend at night privately
06:34:45.480 | without their mom needing to be in the loop.
06:34:47.400 | It means that they can open up and browse the internet
06:34:51.080 | at 2 a.m. when nobody's around
06:34:52.200 | to set their iPad up for them.
06:34:54.600 | This is a profoundly game-changing thing
06:34:57.120 | for folks in that situation.
06:34:58.360 | And this is even before we start talking about folks
06:35:00.240 | that may not be able to communicate at all
06:35:02.200 | or ask for help when they want to.
06:35:03.400 | This can be potentially the only link
06:35:05.120 | that they have to the outside world.
06:35:06.920 | And yeah, that one doesn't, I think,
06:35:08.360 | need explanation of why that's so impactful.
06:35:11.000 | - You mentioned neural decoder.
06:35:13.800 | How much machine learning is in the decoder?
06:35:16.080 | How much magic, how much science, how much art?
06:35:18.600 | How difficult is it to come up with a decoder
06:35:22.600 | that figures out what these sequence of spikes mean?
06:35:27.600 | - Yeah, good question.
06:35:29.640 | There's a couple of different ways to answer this.
06:35:32.160 | So maybe I'll zoom out briefly first
06:35:34.160 | and then I'll go down one of the rabbit holes.
06:35:35.960 | And so the zoomed out view is that building the decoder
06:35:38.680 | is really the process of building the dataset
06:35:40.480 | plus compiling it into the weights.
06:35:42.480 | And each of those steps is important.
06:35:45.280 | The direction, I think, of further improvement
06:35:46.600 | is primarily going to be in the dataset side
06:35:48.400 | of how do you construct the optimal labels for the model?
06:35:51.000 | But there's an entirely separate challenge
06:35:52.560 | of then how do you compile the best model?
06:35:54.560 | And so I'll go briefly down the second rabbit hole.
06:35:57.960 | One of the main challenges
06:35:59.160 | with designing the optimal model for BCI
06:36:02.760 | is that offline metrics don't necessarily
06:36:04.800 | correspond to online metrics.
06:36:07.200 | It's fundamentally a control problem.
06:36:08.480 | The user is trying to control something on the screen
06:36:11.360 | and the exact sort of user experience
06:36:13.080 | of how you output the intention
06:36:16.600 | impacts their ability to control.
06:36:19.080 | So for example, if you just look at validation loss
06:36:22.320 | as predicted by your model,
06:36:23.840 | there can be multiple ways
06:36:24.840 | to achieve the same validation loss.
06:36:26.320 | Not all of them are equally controllable by the end user.
06:36:29.120 | And so it might be as simple as saying,
06:36:31.680 | oh, you could just add auxiliary loss terms
06:36:33.120 | that help you capture the thing that actually matters,
06:36:35.120 | but this is a very complex, nuanced process.
06:36:37.040 | So how you turn the labels into the model
06:36:40.160 | is more of a nuanced process
06:36:41.800 | than just a standard supervised learning problem.
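To illustrate what an auxiliary loss term could look like for this kind of offline/online gap, here is a hedged PyTorch sketch that adds a jitter penalty on the predicted cursor velocity alongside the ordinary supervised error. The weighting and the specific penalty are assumptions for illustration, not Neuralink's actual objective.

```python
import torch

def decoder_loss(pred_vel: torch.Tensor, target_vel: torch.Tensor,
                 smoothness_weight: float = 0.1) -> torch.Tensor:
    """pred_vel, target_vel: (batch, time, 2) decoded vs. labeled cursor velocity.
    The first term is ordinary supervised error; the second is an illustrative
    auxiliary term penalizing frame-to-frame jitter in the prediction, a proxy
    for the 'earthquake on the screen' problem described earlier."""
    mse = torch.mean((pred_vel - target_vel) ** 2)
    jitter = torch.mean((pred_vel[:, 1:, :] - pred_vel[:, :-1, :]) ** 2)
    return mse + smoothness_weight * jitter

# Toy usage
pred = torch.randn(4, 50, 2, requires_grad=True)
target = torch.randn(4, 50, 2)
loss = decoder_loss(pred, target)
loss.backward()
print(float(loss))
```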
06:36:44.400 | One very fascinating anecdote here,
06:36:47.120 | we've tried many different sort of
06:36:48.200 | neural network architectures
06:36:49.680 | that translate brain data to velocity outputs, for example.
06:36:54.680 | And one example that's stuck in my brain
06:36:57.560 | from a couple of years ago now
06:36:59.480 | is at one point we were using
06:37:01.080 | just fully connected networks to decode the brain activity.
06:37:03.720 | We tried a A/B test where we were measuring
06:37:06.960 | the relative performance in online control sessions
06:37:09.680 | of sort of 1D convolution over the input signal.
06:37:13.280 | So if you imagine per channel,
06:37:15.040 | you have a sliding window
06:37:16.200 | that's producing some convolved feature
06:37:19.480 | for each of those input sequences
06:37:20.960 | for every single channel simultaneously.
06:37:22.920 | You can actually get better validation metrics,
06:37:24.960 | meaning you're fitting the data better
06:37:26.680 | and it's generalizing better in offline data
06:37:28.560 | if you use this convolutional architecture.
06:37:29.960 | You're reducing parameters.
06:37:30.920 | It's sort of a standard procedure
06:37:34.080 | when you're dealing with time series data.
06:37:35.520 | Now, it turns out that when using that model online,
06:37:37.600 | the controllability was worse, was far worse,
06:37:40.360 | even though the offline metrics were better.
06:37:42.320 | And there can be many ways to interpret that,
06:37:44.440 | but what that taught me at least was that,
06:37:46.760 | hey, it's at least the case right now
06:37:48.280 | that if you were to just throw a bunch of compute
06:37:49.880 | at this problem and you were trying
06:37:52.400 | to sort of hyper parameter optimize
06:37:53.720 | or let some GPT model hard code
06:37:56.280 | or come up with or invent many different solutions,
06:37:58.360 | if you were just optimizing for loss,
06:37:59.680 | it would not be sufficient,
06:38:01.040 | which means that there's still
06:38:02.280 | some inherent modeling gap here.
06:38:03.680 | There's still some artistry left to be uncovered here
06:38:05.360 | of how to get your model to scale with more compute.
06:38:08.600 | And that may be fundamentally a labeling problem,
06:38:10.520 | but there may be other components to this as well.
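For readers who want that A/B made concrete, here is a sketch of the two architecture families being compared: a fully connected network over a flattened window of per-channel features, versus a model that first runs a per-channel 1-D convolution along time. Sizes are invented; the takeaway from the anecdote is that the convolutional variant's better offline metrics did not translate into better online control.

```python
import torch
import torch.nn as nn

N_CHANNELS, WINDOW = 1024, 10   # illustrative sizes, not the real ones

class FullyConnectedDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                               # (B, C, T) -> (B, C*T)
            nn.Linear(N_CHANNELS * WINDOW, 256), nn.ReLU(),
            nn.Linear(256, 2),                          # 2-D velocity output
        )
    def forward(self, x):                               # x: (B, C, T)
        return self.net(x)

class ConvDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Depthwise 1-D conv: a sliding window over time, applied per channel.
        self.conv = nn.Conv1d(N_CHANNELS, N_CHANNELS, kernel_size=5,
                              padding=2, groups=N_CHANNELS)
        self.head = nn.Sequential(nn.ReLU(), nn.Flatten(),
                                  nn.Linear(N_CHANNELS * WINDOW, 2))
    def forward(self, x):                               # x: (B, C, T)
        return self.head(self.conv(x))

x = torch.randn(8, N_CHANNELS, WINDOW)
print(FullyConnectedDecoder()(x).shape, ConvDecoder()(x).shape)  # both (8, 2)
```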
06:38:13.000 | - Is it a data constraint at this time?
06:38:15.760 | Like the, which is what it sounds like.
06:38:18.920 | Like how do you get a lot of good labels?
06:38:22.560 | - Yeah, I think it's data quality constrained,
06:38:24.600 | not necessarily data quantity constrained.
06:38:27.840 | But even like, even just the quantity,
06:38:30.880 | I mean, 'cause it has to be trained on the interactions.
06:38:34.040 | I guess there's not that many interactions.
06:38:37.520 | - Yeah, so it depends what version of this
06:38:39.440 | you're talking about.
06:38:40.280 | So if you're talking about like,
06:38:41.120 | let's say the simplest example of just 2D velocity,
06:38:42.960 | then I think, yeah, data quality is the main thing.
06:38:44.480 | If you're talking about how to build
06:38:45.600 | a sort of multifunction output
06:38:47.240 | that lets you do all the inputs to the computer
06:38:48.920 | that you and I can do,
06:38:50.160 | then it's actually a much more sophisticated
06:38:52.240 | and nuanced modeling challenge
06:38:53.320 | because now you need to think about
06:38:55.280 | not just when the user is left clicking,
06:38:57.560 | but when you're building the left click model,
06:38:58.640 | you also need to be thinking about
06:38:59.600 | how to make sure it doesn't fire
06:39:00.680 | when they're trying to right click
06:39:01.600 | or when they're trying to move the mouse.
06:39:02.960 | So one example of an interesting bug
06:39:04.320 | from like sort of week one of BCI with Nolan
06:39:07.680 | was when he moved the mouse,
06:39:10.280 | the click signal sort of dropped off a cliff
06:39:11.760 | and when he stopped, the click signal went up.
06:39:12.960 | So again, there's a contamination between the two inputs.
06:39:16.040 | Another good example was at one point
06:39:17.800 | he was trying to do sort of a left click and drag
06:39:21.520 | and the minute he started moving,
06:39:23.640 | the left click signal dropped off a cliff.
06:39:25.960 | So again, 'cause there's some contamination
06:39:27.320 | between the two signals,
06:39:28.160 | you need to come up with some way
06:39:29.520 | to either in the dataset or in the model
06:39:31.760 | build robustness against this kind of,
06:39:34.280 | you can think of it like overfitting,
06:39:36.360 | but really it's just that the model
06:39:37.600 | has not seen this kind of variability before.
06:39:40.000 | So you need to find some way to help the model with that.
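One purely illustrative way to attack that on the dataset side: make sure the click model's training set contains movement-without-click and movement-with-click windows, so it cannot learn a shortcut that couples click probability to whether the cursor is moving. The feature layout below is invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def make_window(moving: bool, clicking: bool, n_channels: int = 32) -> np.ndarray:
    """Toy neural features where movement and click signals are partially
    entangled -- the kind of contamination described above."""
    x = rng.normal(0.0, 1.0, n_channels)
    if moving:
        x[:16] += 2.0          # movement-related channels
        x[16:20] -= 1.0        # spillover onto 'click' channels (the problem)
    if clicking:
        x[16:24] += 2.5        # click-related channels
    return x

# A naive dataset: clicks only ever collected while the cursor is still.
naive = [(make_window(moving=False, clicking=bool(i % 2)), i % 2)
         for i in range(100)]

# An augmented dataset: explicitly include move-without-click and
# move-while-click windows so the classifier sees that variability.
augmented = naive \
    + [(make_window(moving=True, clicking=False), 0) for _ in range(50)] \
    + [(make_window(moving=True, clicking=True), 1) for _ in range(50)]
print(len(naive), len(augmented))
```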
06:39:42.160 | - This is super cool
06:39:43.000 | 'cause it feels like all of this is very solvable,
06:39:45.680 | but it's hard.
06:39:46.520 | - Yes, it is fundamentally an engineering challenge.
06:39:48.320 | This is important to emphasize
06:39:49.360 | and it's also important to emphasize
06:39:50.440 | that it may not need fundamentally new techniques,
06:39:52.560 | which means that people who work on,
06:39:54.960 | let's say unsupervised speech classification
06:39:57.720 | using CTC loss, for example, internal to Siri,
06:40:01.080 | they could potentially have very applicable skills to this.
06:40:03.960 | - So what things are you excited about
06:40:06.200 | in the future development of the software stack
06:40:10.240 | on the Neuralink?
06:40:11.880 | So everything we've been talking about,
06:40:13.240 | the decoding, the UX.
06:40:14.880 | - I think there's some I'm excited about,
06:40:16.240 | like something I'm excited about from the technology side
06:40:17.840 | and some I'm excited about for understanding
06:40:19.320 | how this technology is going to be best situated
06:40:21.760 | for entering the world.
06:40:22.600 | So I'll work backwards.
06:40:24.200 | On the technology entering the world side of things,
06:40:26.480 | I'm really excited to understand
06:40:28.400 | how this device works for folks that cannot speak at all,
06:40:32.160 | that have no ability to sort of bootstrap themselves
06:40:35.000 | into useful control by voice command, for example,
06:40:37.680 | and are extremely limited in their current capabilities.
06:40:40.320 | I think that will be an incredibly useful signal for us
06:40:42.080 | to understand, I mean, really what is an existential
06:40:44.840 | question for all startups, which is product market fit.
06:40:46.840 | Does this device have the capacity and potential
06:40:48.880 | to transform people's lives in the current state?
06:40:50.880 | And if not, what are the gaps?
06:40:52.480 | And if there are gaps,
06:40:53.520 | how do we solve them most efficiently?
06:40:56.000 | So that's what I'm very excited about
06:40:57.200 | for the next year or so of clinical trial operations.
06:40:59.960 | The technology side,
06:41:01.720 | I'm quite excited about basically everything we're doing.
06:41:04.960 | I think it's going to be awesome.
06:41:07.040 | The most prominent one I would say is scaling channel count.
06:41:10.280 | So right now we have a thousand channel device.
06:41:12.280 | The next version we'll have
06:41:13.120 | between three and 6,000 channels.
06:41:14.600 | And I would expect that curve to continue in the future.
06:41:17.400 | And it's unclear what set of problems
06:41:19.280 | will just disappear completely at that scale
06:41:22.200 | and what set of problems will remain
06:41:23.520 | and require further focus.
06:41:24.480 | And so I'm excited about the clarity of gradient
06:41:26.200 | that that gives us in terms of the user experiences
06:41:28.280 | we choose to focus our time and resources on.
06:41:30.960 | And then also in terms of the,
06:41:32.640 | yeah, even things as simple as non-stationarity,
06:41:34.560 | like does that problem just completely go away at that scale
06:41:36.680 | or do we need to come up with new creative UX's
06:41:38.320 | still even at that point?
06:41:39.560 | And also when we get to that time point,
06:41:42.840 | when we start expanding out dramatically
06:41:44.320 | the set of functions that you can output from one brain,
06:41:47.200 | how to deal with all the nuances of both the user experience
06:41:50.720 | of not being able to feel the different keys
06:41:52.240 | under your fingertips,
06:41:53.080 | but still needing to be able to modulate all of them
06:41:54.760 | in synchrony to achieve the thing you want.
06:41:56.960 | And again, you don't have that proprioceptive
06:41:58.720 | feedback loop, so how can you make that intuitive
06:42:00.240 | for a user to control a high dimensional control surface
06:42:02.520 | without feeling the thing physically?
06:42:04.720 | I think that's gonna be a super interesting problem.
06:42:07.040 | I'm also quite excited to understand,
06:42:09.840 | do these scaling laws continue?
06:42:10.920 | Like as you scale channel count,
06:42:12.960 | how much further out do you go
06:42:14.880 | before that saturation point is truly hit?
06:42:17.360 | And it's not obvious today.
06:42:18.640 | I think we only know what's in the sort of
06:42:20.120 | interpolation space.
06:42:20.960 | We only know what's between zero and 1024,
06:42:22.560 | but we don't know what's beyond that.
06:42:25.040 | And then there's a whole sort of like
06:42:26.200 | range of interesting sort of neuroscience
06:42:27.760 | and brain questions,
06:42:28.600 | which is when you stick more stuff in the brain
06:42:30.120 | in more places,
06:42:30.960 | you get to learn much more quickly about
06:42:32.760 | what those brain regions represent.
06:42:34.600 | And so I'm excited about that
06:42:35.720 | fundamental neuroscience learning,
06:42:36.840 | which is also important for figuring out
06:42:39.080 | how to most efficiently insert electrodes in the future.
06:42:42.280 | So, yeah, I think all those dimensions
06:42:43.880 | I'm really, really excited about.
06:42:45.080 | And that doesn't even get close to touching
06:42:46.240 | the sort of software stack that we work on
06:42:47.760 | every single day and what we're working on right now.
06:42:49.840 | - Yeah, it seems virtually impossible to me
06:42:53.320 | that a thousand electrodes is where it saturates.
06:42:58.320 | It feels like this would be one of those
06:43:01.560 | silly notions in the future where obviously
06:43:04.040 | you should have millions of electrodes
06:43:05.760 | and this is where like the true breakthroughs happen.
06:43:10.000 | You tweeted,
06:43:13.480 | some thoughts are most precisely described in poetry.
06:43:17.800 | Why do you think that is?
06:43:19.040 | - I think it's because the information bottleneck
06:43:23.120 | of language is pretty steep.
06:43:26.960 | And yet you're able to reconstruct
06:43:30.600 | on the other person's brain more effectively
06:43:34.360 | without being literal.
06:43:35.240 | Like if you can express a sentiment such that
06:43:37.520 | in their brain, they can reconstruct
06:43:39.520 | the actual true underlying meaning and beauty
06:43:43.240 | of the thing that you're trying to get across,
06:43:45.200 | that sort of the generator function in their brain
06:43:46.880 | is more powerful than what language can express.
06:43:48.520 | And so the mechanism of poetry is really just to
06:43:53.120 | feed or seed that generator function.
06:43:56.520 | - So being literal sometimes is a suboptimal compression
06:44:00.400 | for the thing you're trying to convey.
06:44:02.320 | - And it's actually in the process of the user
06:44:05.360 | going through that generation
06:44:06.360 | that they understand what you mean.
06:44:08.200 | Like that's the beautiful part.
06:44:09.800 | It's also like when you look at a beautiful painting,
06:44:11.400 | like it's not the pixels of the painting
06:44:13.920 | that are beautiful.
06:44:14.760 | It's the thought process that occurs when you see that,
06:44:16.800 | the experience of that,
06:44:18.440 | that actually is the thing that matters.
06:44:19.800 | - Yeah, it's resonating with some deep thing within you
06:44:24.560 | that the artist also experienced
06:44:26.720 | and was able to convey that through the pixels.
06:44:29.480 | And that's actually gonna be relevant for full-on telepathy.
06:44:32.820 | You know, it's like if you just read the poetry literally,
06:44:38.240 | that doesn't say much of anything interesting.
06:44:42.120 | It requires a human to interpret it.
06:44:44.160 | So it's the combination of the human mind
06:44:47.740 | and all the experiences the human being has
06:44:50.920 | within the context of the collective intelligence
06:44:53.160 | of the human species that makes that poem make sense.
06:44:56.680 | And they load that in.
06:44:58.640 | And so in that same way,
06:44:59.880 | the signal that carries meaning from human to human
06:45:03.840 | may seem trivial,
06:45:06.120 | but may actually carry a lot of power
06:45:08.560 | because of the complexity of the human mind
06:45:14.400 | and the receiving end.
06:45:15.500 | Yeah, that's interesting.
06:45:18.760 | Poetry still doesn't, who was it?
06:45:20.800 | I think Joscha Bach first of all said
06:45:24.280 | something about
06:45:27.400 | all the people that think we've achieved AGI
06:45:32.360 | explain why humans like music.
06:45:37.200 | - Oh yeah.
06:45:38.120 | - And until the AGI likes music,
06:45:43.080 | you haven't achieved AGI or something like that.
06:45:45.080 | - Do you not think that's like some next-token
06:45:46.880 | entropy surprise kind of thing going on there?
06:45:49.080 | - I don't know.
06:45:49.920 | - I don't know either.
06:45:51.280 | I listen to a lot of classical music
06:45:52.640 | and also read a lot of poetry.
06:45:54.100 | And yeah, I do wonder if there is some element
06:45:57.640 | of the next-token surprise factor going on there.
06:45:59.760 | - Yeah, maybe.
06:46:01.120 | A lot of the tricks in both poetry and music
06:46:03.240 | are basically you have some repeated structure
06:46:04.920 | and then you do a twist.
06:46:06.440 | It's like, okay, clause one, two, three is one thing.
06:46:09.280 | And then clause four is like,
06:46:10.280 | okay, now we're onto the next theme.
06:46:12.280 | And they kind of play with exactly when the surprise happens
06:46:14.680 | and the expectation of the user.
06:46:16.160 | And that's even true through history
06:46:18.320 | as musicians evolve music,
06:46:20.360 | they take some known structure
06:46:21.680 | that people are familiar with
06:46:22.960 | and they just tweak it a little bit.
06:46:24.520 | They tweak it and add a surprising element.
06:46:25.960 | This is especially true in classical music heritage.
06:46:29.080 | But that's what I'm wondering.
06:46:29.920 | Is there just entropy, like the--
06:46:31.920 | - So breaking structure or breaking symmetry
06:46:34.960 | is something that humans seem to like.
06:46:36.520 | Maybe as simple as that.
06:46:37.640 | - Yeah, and I mean, great artists copy.
06:46:40.400 | And knowing which rules to break is the important part.
06:46:44.000 | And fundamentally, it must be about the listener
06:46:46.680 | of the piece.
06:46:47.520 | Which rule is the right one to break?
06:46:49.200 | It's about the audience member perceiving that
06:46:52.080 | as interesting.
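The "next-token surprise" intuition above can be made concrete: score each word of a line of verse by its surprisal (negative log-probability) under a language model, so a twist on an established pattern shows up as a spike. Below is a minimal sketch, not something used or discussed in the conversation; it assumes the Hugging Face `transformers` package and the public GPT-2 weights, and the line of poetry is just an arbitrary example.

```python
# Minimal surprisal sketch: how "surprising" is each token given what came before?
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

text = "Because I could not stop for Death, He kindly stopped for me"
ids = tokenizer(text, return_tensors="pt").input_ids  # shape (1, seq_len)

with torch.no_grad():
    logits = model(ids).logits  # shape (1, seq_len, vocab_size)

# Surprisal of token t is -log P(token_t | tokens_<t); higher = more surprising.
log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
targets = ids[0, 1:]
surprisal = -log_probs[torch.arange(targets.shape[0]), targets]

for tok, s in zip(tokenizer.convert_ids_to_tokens(targets.tolist()), surprisal.tolist()):
    print(f"{tok:>12s}  {s:5.2f} nats")
```

Runs of low surprisal correspond to the "repeated structure" mentioned above; the twist is where the values jump.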
06:46:52.920 | - What do you think is the meaning of human existence?
06:46:57.640 | - There's a TV show I really like called "The West Wing."
06:47:01.400 | And in "The West Wing," there's a character,
06:47:03.640 | he's the president of the United States,
06:47:05.120 | who's having a discussion about the Bible
06:47:07.360 | with one of their colleagues.
06:47:09.080 | And the colleague says something about,
06:47:13.200 | you know, the Bible says X, Y, and Z.
06:47:15.440 | And the president says, "Yeah, but it also says A, B, C."
06:47:19.440 | And the person says, "Well, do you believe the Bible
06:47:22.440 | to be literally true?"
06:47:23.280 | And the president says, "No, I don't believe it."
06:47:25.840 | And he says, "Yes, but I also think that neither of us
06:47:29.280 | are smart enough to understand it."
06:47:31.000 | I think, like the analogy here for the meaning of life
06:47:34.840 | is that largely, we don't know the right question to ask.
06:47:37.760 | And so I think I'm very aligned with
06:47:39.760 | sort of the "Hitchhiker's Guide to the Galaxy" version
06:47:44.920 | of this question, which is basically,
06:47:46.120 | if we can ask the right questions,
06:47:48.200 | it's much more likely we find the meaning of human existence.
06:47:52.360 | And so in the short term, as a heuristic,
06:47:54.440 | in the sort of search policy space,
06:47:56.320 | we should try to increase the diversity
06:47:58.360 | of people asking such questions,
06:48:00.760 | or generally of consciousness and conscious beings
06:48:03.000 | asking such questions.
06:48:04.080 | So again, I think I'll take the I don't know card here,
06:48:07.960 | but say I do think there are meaningful things we can do
06:48:10.520 | that improve the likelihood of answering that question.
06:48:13.200 | - It's interesting how much value you assign
06:48:16.440 | to the task of asking the right questions.
06:48:18.600 | That's the main thing,
06:48:22.400 | it's not the answers, it's the questions.
06:48:24.680 | - This point, by the way, is driven home
06:48:26.640 | in a very painful way when you try to communicate
06:48:29.600 | with someone who cannot speak.
06:48:31.280 | Because a lot of the time, the last thing to go
06:48:33.520 | is they have the ability to somehow wiggle a lip
06:48:35.920 | or move something that allows them to say yes or no.
06:48:38.880 | And in that situation, it's very obvious
06:48:41.200 | that what matters is, are you asking them the right question
06:48:43.640 | to be able to say yes or no to.
06:48:45.400 | - Wow, that's powerful.
06:48:46.560 | Well, Bliss, thank you for everything you do.
06:48:50.880 | - And thank you for being you.
06:48:53.040 | And thank you for talking today.
06:48:54.760 | - Thank you.
06:48:56.240 | - Thanks for listening to this conversation
06:48:57.880 | with Bliss Chapman.
06:48:59.560 | And now, dear friends, here's Noland Arbaugh,
06:49:03.400 | the first human being to have a Neuralink device
06:49:06.160 | implanted in his brain.
06:49:07.920 | You had a diving accident in 2016
06:49:12.540 | that left you paralyzed with no feeling
06:49:14.340 | from the shoulders down.
06:49:16.160 | How did that accident change your life?
06:49:18.260 | - It was sort of a freak thing that happened.
06:49:20.920 | Imagine you're running into the ocean.
06:49:25.280 | Although this isn't a lake,
06:49:26.360 | but you're running into the ocean
06:49:28.080 | and you get to about waist high
06:49:30.280 | and then you kind of like dive in,
06:49:32.520 | take the rest of the plunge under the wave or something.
06:49:35.000 | That's what I did.
06:49:36.720 | And then I just never came back up.
06:49:38.840 | Not sure what happened.
06:49:41.320 | I did it running into the water with a couple of guys.
06:49:44.880 | And so my idea of what happened is really just that
06:49:49.880 | I took like a stray fist, elbow, knee, foot,
06:49:55.480 | something to the side of my head.
06:49:57.820 | The left side of my head was sore
06:49:59.700 | for about a month afterward.
06:50:01.460 | So must have taken a pretty big knock.
06:50:03.800 | And then they both came up and I didn't.
06:50:06.780 | And so I was face down in the water for a while.
06:50:09.400 | I was conscious.
06:50:12.400 | And then eventually just realized
06:50:15.520 | I couldn't hold my breath any longer.
06:50:16.760 | And I keep saying took a big drink.
06:50:19.780 | People, I don't know if they like that I say that.
06:50:24.040 | It seems like I'm making light of it all,
06:50:25.380 | but it's just kind of how I am.
06:50:28.560 | And I don't know.
06:50:30.560 | Like I'm a very relaxed sort of stress-free person.
06:50:39.600 | I rolled with the punches for a lot of this.
06:50:44.180 | I kind of took it in stride.
06:50:45.860 | It's like, all right, well, what can I do next?
06:50:47.760 | How can I improve my life even a little bit
06:50:50.420 | on a day-to-day basis at first,
06:50:53.320 | just trying to find some way
06:50:55.280 | to heal as much of my body as possible,
06:50:57.380 | to try to get healed, to try to get off a ventilator,
06:51:02.500 | learn as much as I could so I could somehow survive.
06:51:08.340 | Once I left the hospital.
06:51:11.400 | And then thank God I had my family around me.
06:51:17.080 | If I didn't have my parents, my siblings,
06:51:21.580 | then I would have never made it this far.
06:51:24.040 | They've done so much for me.
06:51:26.700 | More than I can ever thank them for, honestly.
06:51:31.420 | And a lot of people don't have that.
06:51:33.100 | A lot of people in my situation,
06:51:35.180 | their families either aren't capable
06:51:36.980 | of providing for them or honestly just don't want to.
06:51:41.300 | And so they get placed somewhere in some sort of home.
06:51:44.940 | So thankfully I had my family.
06:51:47.340 | I have a great group of friends,
06:51:49.120 | a great group of buddies from college
06:51:51.100 | who have all rallied around me
06:51:53.460 | and we're all still incredibly close.
06:51:56.560 | People always say, if you're lucky,
06:51:58.460 | you'll end up with one or two friends from high school
06:52:02.420 | that you keep throughout your life.
06:52:04.580 | I have about 10, 10 or 12 from high school
06:52:08.820 | that have all stuck around
06:52:10.420 | and we still get together, all of us, twice a year.
06:52:14.180 | We call it the spring series and the fall series.
06:52:17.380 | This last one we all did, we dressed up like X-Men.
06:52:21.140 | So I did Professor Xavier and it was freaking awesome.
06:52:25.580 | It was so good.
06:52:27.180 | So yeah, I have such a great support system around me.
06:52:29.420 | And so being a quadriplegic isn't that bad.
06:52:33.860 | I get waited on all the time.
06:52:37.140 | People bring me food and drinks
06:52:39.500 | and I get to sit around and watch as much TV
06:52:42.900 | and movies and anime as I want.
06:52:45.500 | I get to read as much as I want.
06:52:47.720 | I mean, it's great.
06:52:51.460 | - It's beautiful to see that you see the silver lining
06:52:54.900 | in all of this, which is going back.
06:52:58.420 | Do you remember the moment when you first realized
06:53:00.780 | you were paralyzed from the neck down?
06:53:03.380 | - Yep, I was face down in the water.
06:53:06.260 | Right when I, whatever, something hit my head,
06:53:10.080 | I tried to get up and I realized I couldn't move
06:53:14.340 | and it just sort of clicked.
06:53:16.260 | I'm like, all right, I'm paralyzed, can't move.
06:53:18.660 | What do I do?
06:53:19.580 | If I can't get up, I can't flip over, can't do anything,
06:53:24.380 | then I'm gonna drown eventually.
06:53:26.860 | And I knew I couldn't hold my breath forever.
06:53:30.340 | So I just held my breath and thought about it
06:53:34.940 | for maybe 10, 15 seconds.
06:53:37.900 | I've heard from other people that, onlookers, I guess,
06:53:41.900 | the two girls that pulled me out of the water
06:53:44.540 | were two of my best friends.
06:53:45.660 | They are lifeguards.
06:53:46.740 | And one of them said that it looked like
06:53:51.340 | my body was sort of shaking in the water,
06:53:53.460 | like I was trying to flip over and stuff.
06:53:55.720 | But I knew, I knew immediately.
06:53:59.820 | And I just kind of, I realized that
06:54:04.820 | that's what my situation was from here on out.
06:54:08.740 | Maybe if I got to the hospital,
06:54:10.040 | they'd be able to do something.
06:54:11.520 | When I was in the hospital, right before surgery,
06:54:14.180 | I was trying to calm one of my friends down.
06:54:17.840 | I had brought her with me from college to camp
06:54:20.740 | and she was just bawling over me.
06:54:22.340 | And I was like, hey, it's gonna be fine, don't worry.
06:54:25.740 | I was cracking some jokes to try to lighten the mood.
06:54:28.940 | The nurse had called my mom and I was like,
06:54:30.740 | don't tell my mom.
06:54:32.420 | She's just gonna be stressed out.
06:54:34.060 | Call her after I'm out of surgery
06:54:36.140 | 'cause at least she'll have some answers then,
06:54:38.080 | like whether I live or not, really.
06:54:40.500 | And I didn't want her to be stressed through the whole thing.
06:54:44.220 | But I knew.
06:54:45.060 | And then, when I first woke up after surgery,
06:54:48.340 | I was super drugged up.
06:54:50.420 | They had me on fentanyl like three ways, which was awesome.
06:54:54.660 | I don't recommend it.
06:54:56.900 | But I saw some crazy stuff on that fentanyl
06:55:01.900 | and it was still the best I've ever felt on drugs.
06:55:06.320 | Medication, sorry, on medication.
06:55:09.700 | And I remember the first time I saw my mom in the hospital,
06:55:15.260 | I was just bawling.
06:55:16.700 | I had like ventilator in, like I couldn't talk or anything.
06:55:21.700 | And I just started crying
06:55:24.820 | because it was more like seeing her,
06:55:26.860 | not that, I mean, the whole situation
06:55:29.780 | obviously was pretty rough,
06:55:31.820 | but it was just like seeing her face
06:55:33.780 | for the first time was pretty hard.
06:55:35.620 | But yeah, I never had like a moment of, you know,
06:55:40.620 | man, I'm paralyzed.
06:55:45.580 | This sucks.
06:55:46.420 | I don't wanna like be around anymore.
06:55:48.560 | It was always just, I hate that I have to do this,
06:55:52.660 | but like sitting here and wallowing isn't gonna help.
06:55:57.020 | - So immediate acceptance.
06:55:58.860 | - Yeah, yeah.
06:56:01.160 | - Has there been low points along the way?
06:56:03.540 | - Yeah, yeah, sure.
06:56:04.820 | I mean, there are days when I don't really feel
06:56:08.980 | like doing anything, not so much anymore,
06:56:11.580 | like not for the last couple years.
06:56:13.140 | I don't really feel that way.
06:56:14.380 | I've more so just wanted to try to do anything possible
06:56:21.220 | to make my life better at this point.
06:56:24.300 | But at the beginning, there were some ups and downs.
06:56:26.780 | There were some really hard things to adjust to.
06:56:29.180 | First off, just like the first couple of months,
06:56:32.500 | the amount of pain I was in was really, really hard.
06:56:36.180 | I mean, I remember screaming at the top of my lungs
06:56:39.380 | in the hospital because I thought my legs were on fire.
06:56:42.420 | And obviously I can't feel anything,
06:56:44.720 | but it's all nerve pain.
06:56:46.300 | And so that was a really hard night.
06:56:48.220 | I asked them to give me as much pain meds as possible,
06:56:51.020 | like you've had as much as you can have,
06:56:53.340 | so just kind of deal with it.
06:56:55.020 | Go to a happy place sort of thing.
06:56:56.700 | So that was a pretty low point.
06:56:58.260 | And then every now and again,
06:57:00.500 | it's hard realizing things that I wanted to do in my life
06:57:04.900 | that I won't be able to do anymore.
06:57:06.660 | I always wanted to be a husband and father,
06:57:11.240 | and I just don't think that I could do it now
06:57:13.940 | as a quadriplegic.
06:57:14.820 | Maybe it's possible, but I'm not sure I would ever
06:57:18.700 | put someone I love through that,
06:57:22.040 | like having to take care of me and stuff.
06:57:25.180 | Not being able to go out and play sports.
06:57:29.260 | I was a huge athlete growing up, so that was pretty hard.
06:57:33.580 | Just little things too, when I realize
06:57:35.820 | I can't do them anymore.
06:57:37.900 | There's something really special about being able
06:57:41.020 | to hold a book and smell a book,
06:57:44.500 | like the feel, the texture, the smell,
06:57:47.340 | like as you turn the pages, I just love it.
06:57:49.620 | I can't do it anymore.
06:57:50.860 | And it's little things like that.
06:57:53.260 | The two-year mark was pretty rough.
06:57:55.620 | Two years is when they say you will get back
06:58:00.620 | basically as much as you're ever gonna get back
06:58:03.980 | as far as movement and sensation goes.
06:58:06.100 | And so for the first two years,
06:58:07.560 | that was the only thing on my mind,
06:58:09.780 | was try as much as I can to move my fingers,
06:58:14.180 | my hands, my feet, everything possible
06:58:17.100 | to try to get sensation and movement back.
06:58:19.100 | And then when the two-year mark hit,
06:58:21.940 | so June 30th, 2018, I was really sad
06:58:26.940 | that that's kind of where I was.
06:58:32.300 | And then just randomly here and there,
06:58:35.460 | but I was never depressed for long periods of time.
06:58:40.460 | Just it never seemed worthwhile to me.
06:58:45.980 | - What gave you strength?
06:58:47.700 | - My faith, my faith in God was a big one.
06:58:51.140 | My understanding that it was all for a purpose.
06:58:54.300 | And even if that purpose wasn't anything
06:58:57.060 | involving Neuralink, even if that purpose was,
06:58:59.880 | there's a story in the Bible about Job.
06:59:05.060 | And I think it's a really, really popular story
06:59:07.180 | about how Job has all of these terrible things
06:59:10.380 | happen to him and he praises God
06:59:11.720 | throughout the whole situation.
06:59:14.340 | I thought, and I think a lot of people think
06:59:17.640 | for most of their lives that they are Job,
06:59:20.340 | that they're the ones going through something terrible
06:59:22.920 | and they just need to praise God through the whole thing
06:59:26.740 | and everything will work out.
06:59:28.360 | At some point after my accident,
06:59:30.140 | I realized that I might not be Job,
06:59:33.900 | that I might be one of his children
06:59:36.880 | that gets killed or kidnapped or taken from him.
06:59:40.100 | And so it's about terrible things
06:59:43.720 | that happen to those around you who you love.
06:59:45.980 | So maybe in this case, my mom would be Job
06:59:49.340 | and she has to get through something extraordinarily hard
06:59:54.060 | and I just need to try and make it
06:59:57.300 | as best as possible for her because she's the one
07:00:01.980 | that's really going through this massive trial.
07:00:04.700 | And that gave me a lot of strength.
07:00:06.500 | And obviously my family, my family and my friends,
07:00:10.220 | they give me all the strength that I need
07:00:13.640 | on a day-to-day basis.
07:00:15.240 | So it makes things a lot easier
07:00:17.220 | having that great support system around me.
07:00:20.080 | - From everything I've seen of you online,
07:00:21.680 | your streams and the way you are today,
07:00:25.040 | I really admire, let's say,
07:00:27.360 | your unwavering positive outlook on life.
07:00:30.440 | Has that always been this way?
07:00:32.520 | - Yeah, yeah.
07:00:33.840 | I mean, I've just always thought I could do better
07:00:38.860 | and thought I could do anything I ever wanted to do.
07:00:42.740 | There was never anything too big.
07:00:44.740 | Like whatever I set my mind to, I felt like I could do it.
07:00:47.780 | I didn't wanna do a lot.
07:00:52.500 | I wanted to travel around and be sort of like a gypsy
07:00:55.380 | and go work odd jobs.
07:00:58.160 | I had this dream of traveling around Europe
07:01:00.740 | and being like, I don't know,
07:01:02.780 | a shepherd in Wales or Ireland
07:01:06.140 | and then going and being a fisherman in Italy,
07:01:09.200 | doing all these things for like a year.
07:01:10.840 | Like it's such like cliche things,
07:01:13.140 | but I just thought it would be so much fun
07:01:15.000 | to go and travel and do different things.
07:01:17.740 | And so I've always just seen the best
07:01:22.740 | in people around me too.
07:01:25.000 | And I've always tried to be good to people.
07:01:28.540 | And growing up with my mom too,
07:01:31.380 | she's like the most positive, energetic person in the world.
07:01:35.060 | And we're all just people, people.
07:01:37.200 | Like I just get along great with people.
07:01:40.960 | I really enjoy meeting new people.
07:01:42.780 | And so I just wanted to do everything.
07:01:46.260 | This is just kind of just how I've been.
07:01:50.480 | - It's just great to see that cynicism didn't take over,
07:01:53.980 | given everything you've been through.
07:01:55.280 | - Yeah.
07:01:56.120 | - Was that like a deliberate choice you made
07:01:59.280 | that you're not gonna let this keep you down?
07:02:01.300 | - Yeah, a bit.
07:02:02.680 | Also like I just, it's just kind of how I am.
07:02:06.680 | I just, like I said, I roll with the punches with everything.
07:02:09.520 | I always used to tell people,
07:02:11.640 | like I don't stress about things much.
07:02:15.160 | And whenever I'd see people getting stressed,
07:02:17.440 | I just say, you know, like it's not hard.
07:02:19.580 | Just don't stress about it.
07:02:21.460 | And like, that's all you need to do.
07:02:24.140 | And they're like, that's not how that works.
07:02:26.160 | I'm like, it works for me.
07:02:27.380 | Like just don't stress and everything will be fine.
07:02:30.160 | Like everything will work out.
07:02:31.320 | Obviously, not everything always goes well.
07:02:34.280 | And it's not like it all works out
07:02:35.640 | for the best all the time.
07:02:37.640 | But I just don't think stress has had any place in my life
07:02:42.040 | since I was a kid.
07:02:43.160 | - What was the experience like of you being selected
07:02:47.000 | to be the first human being to have
07:02:50.200 | a Neuralink device implanted in your brain?
07:02:53.240 | Were you scared, excited?
07:02:54.480 | - No, no, it was cool.
07:02:57.920 | Like I was never afraid of it.
07:03:01.680 | I had to think through a lot.
07:03:03.680 | Should I do this?
07:03:07.520 | Like be the first person.
07:03:09.300 | I could wait until number two or three
07:03:12.240 | and get a better version of the Neuralink.
07:03:14.800 | Like the first one might not work.
07:03:16.880 | Maybe it's actually gonna kind of suck.
07:03:20.480 | It's gonna be the worst version ever in a person.
07:03:24.760 | So why would I do the first one?
07:03:26.600 | Like I've already kind of been selected.
07:03:28.000 | I could just tell them like, okay, find someone else
07:03:30.720 | and then I'll do number two or three.
07:03:31.920 | Like I'm sure they would let me.
07:03:33.240 | They're looking for a few people anyways.
07:03:35.240 | But ultimately I was like, I don't know.
07:03:37.120 | There's something about being the first one to do something.
07:03:39.380 | It's pretty cool.
07:03:40.300 | I always thought that if I had the chance
07:03:43.520 | that I would like to do something for the first time,
07:03:46.260 | this seemed like a pretty good opportunity.
07:03:49.320 | And I was never scared.
07:03:51.840 | I think my like faith had a huge part in that.
07:03:56.840 | I always felt like God was preparing me for something.
07:04:01.880 | I almost wish it wasn't this
07:04:06.000 | because I had many conversations with God
07:04:09.160 | about not wanting to do any of this as a quadriplegic.
07:04:12.880 | I told him, you know, I'll go out and talk to people.
07:04:15.560 | I'll go out and travel the world and talk to, you know,
07:04:19.320 | stadiums, thousands of people give my testimony.
07:04:22.320 | I'll do all of it, but like heal me first.
07:04:24.600 | Don't make me do all of this in a chair, that sucks.
07:04:27.200 | And I guess he won that argument.
07:04:31.600 | I didn't really have much of a choice.
07:04:33.760 | I always felt like there was something going on
07:04:37.760 | and to see how I guess easily I made it
07:04:42.760 | through the interview process
07:04:46.600 | and how quickly everything happened,
07:04:49.040 | how the stars sort of aligned with all of this.
07:04:53.600 | It just told me like, as the surgery was getting closer,
07:04:58.240 | it just told me that, you know, it was all meant to happen.
07:05:02.720 | It was all meant to be.
07:05:03.960 | And so I shouldn't be afraid of anything that's to come.
07:05:07.560 | And so I wasn't, I kept telling myself, like, you know,
07:05:11.360 | you say that now, but as soon as the surgery comes,
07:05:14.080 | you're probably going to be freaking out.
07:05:15.720 | Like you're about to have brain surgery
07:05:17.120 | and brain surgery is a big deal for a lot of people,
07:05:21.280 | but it's an even bigger deal for me.
07:05:23.360 | Like it's all I have left.
07:05:24.840 | The amount of times I've been like, thank you God,
07:05:27.120 | that you didn't take my brain and my personality
07:05:29.360 | and my ability to think, my like love of learning,
07:05:33.400 | like my character, everything, like, thank you so much.
07:05:36.000 | Like, as long as you left me that,
07:05:38.100 | then I think I can get by.
07:05:39.540 | And I was about to let people go like root around in there.
07:05:43.940 | Like, hey, we're going to go like put some stuff
07:05:46.080 | in your brain, like hopefully it works out.
07:05:48.280 | And so it was, it was something that gave me pause.
07:05:51.800 | But like I said, how smoothly everything went,
07:05:54.920 | I never expected for a second that anything would go wrong.
07:05:58.760 | Plus the more people I met on the Barrow side
07:06:02.760 | and on the Neuralink side,
07:06:04.640 | they're just the most impressive people in the world.
07:06:06.920 | Like I can't speak enough to how much I trust these people
07:06:12.000 | with my life and how impressed I am with all of them.
07:06:16.740 | And to see the excitement on their faces,
07:06:19.500 | to like walk into a room and roll into a room
07:06:24.180 | and see all of these people looking at me,
07:06:26.740 | like, we're just, we're so excited.
07:06:28.520 | Like we've been working so hard on this
07:06:30.780 | and it's finally happening.
07:06:32.220 | It's super infectious.
07:06:34.100 | And it just makes me want to do it even more
07:06:37.380 | and to help them achieve their dreams.
07:06:40.020 | Like, I don't know, it's so, it's so rewarding
07:06:42.640 | and I'm so happy for all of them, honestly.
07:06:45.280 | - What was the day of surgery like?
07:06:47.980 | What's, when'd you wake up?
07:06:49.960 | What'd you feel?
07:06:51.620 | - Yeah. - Minute by minute.
07:06:53.120 | - Yeah. - Were you freaking out?
07:06:54.480 | - No, no, I thought I was going to,
07:06:56.840 | but the surgery approach the night before, the morning of,
07:07:00.280 | I was just excited.
07:07:01.560 | Like, I was like, let's make this happen.
07:07:03.900 | I think I said that, something like that to Elon
07:07:06.920 | on the phone beforehand, we were like FaceTiming.
07:07:10.340 | And I was like, let's rock and roll.
07:07:11.980 | And he's like, let's do it.
07:07:13.340 | I don't know, I just, I wasn't scared.
07:07:16.860 | So we woke up, I think we had to be at the hospital
07:07:19.460 | at like 5.30 a.m.
07:07:20.620 | I think surgery was at like 7 a.m.
07:07:22.740 | So we woke up pretty early.
07:07:24.020 | I'm not sure much of us slept that night.
07:07:27.140 | Got to the hospital at 5.30,
07:07:32.460 | went through like all the pre-op stuff.
07:07:35.700 | Everyone was super nice.
07:07:37.820 | Elon was supposed to be there in the morning,
07:07:40.420 | but something went wrong with his plane.
07:07:41.780 | So we ended up FaceTiming.
07:07:43.620 | That was cool.
07:07:44.460 | Had one of the greatest one-liners of my life.
07:07:47.180 | After that phone call, hung up with him.
07:07:49.620 | There were like 20 people around me.
07:07:51.300 | And I was like, I just hope
07:07:52.220 | he wasn't too starstruck talking to me.
07:07:54.460 | - Nice. - Yeah, it was good.
07:07:56.220 | - Well done. - Yeah, yeah.
07:07:57.700 | - Did you write that ahead of time?
07:07:58.860 | - No, no, it just came to me.
07:08:00.820 | I was like, this is, this seems right.
07:08:04.860 | Went into surgery.
07:08:06.260 | I asked if I could pray right beforehand.
07:08:09.500 | So I prayed over the room.
07:08:11.180 | I asked God if he would be with my mom
07:08:13.440 | in case anything happened to me.
07:08:15.460 | And just to calm her nerves out there.
07:08:18.820 | Woke up and played a bit of a prank on my mom.
07:08:23.020 | I don't know if you've heard about it.
07:08:24.500 | - Yeah, I read about it.
07:08:25.340 | - Yeah, she was not happy.
07:08:27.980 | - Can you take me through the prank?
07:08:29.420 | - Yeah, this is something--
07:08:30.260 | - Do you regret doing that now?
07:08:31.780 | - Nope, nope, not one bit.
07:08:34.820 | It was something I had talked about
07:08:37.980 | ahead of time with my buddy, Bane.
07:08:39.660 | I was like, I would really like to play a prank on my mom.
07:08:43.300 | Very specifically, my mom.
07:08:46.300 | She's very gullible.
07:08:48.140 | I think she had knee surgery once even.
07:08:51.060 | And after she came out of knee surgery,
07:08:54.140 | she was super groggy.
07:08:57.820 | She was like, I can't feel my legs.
07:08:59.580 | And my dad looked at her, he was like,
07:09:01.700 | you don't have any legs.
07:09:03.740 | They had to amputate both your legs.
07:09:05.900 | And we just do very mean things to her all the time.
07:09:10.980 | I'm so surprised that she still loves us.
07:09:14.480 | But right after surgery, I was really worried
07:09:18.580 | that I was going to be too groggy, not all there.
07:09:22.120 | I had had anesthesia once before,
07:09:24.460 | and it messed me up.
07:09:27.260 | I could not function for a while afterwards,
07:09:30.820 | and I said a lot of things that,
07:09:35.020 | I was really worried that I was going to start,
07:09:38.220 | I don't know, dropping some bombs,
07:09:41.740 | and I wouldn't even know, I wouldn't remember.
07:09:44.820 | So I was like, please God, don't let that happen.
07:09:48.460 | And please let me be there enough to do this to my mom.
07:09:52.540 | And so she walked in after surgery.
07:09:58.220 | It was the first time they had been able
07:10:00.180 | to see me after surgery.
07:10:01.740 | And she just looked at me.
07:10:03.940 | She said, hi, how are you?
07:10:05.500 | How are you doing?
07:10:06.340 | How do you feel?
07:10:07.160 | And I looked at her and this very,
07:10:09.740 | I think the anesthesia helped, very groggy,
07:10:12.140 | sort of confused look on my face.
07:10:13.940 | It's like, who are you?
07:10:16.180 | And she just started looking around the room
07:10:19.420 | like at the surgeons or the doctors,
07:10:21.620 | like what did you do to my son?
07:10:23.160 | Like you need to fix this right now.
07:10:25.100 | Tears started streaming.
07:10:26.580 | I saw how much she was freaking out.
07:10:28.300 | I was like, I can't let this go on.
07:10:30.140 | And so I was like, mom, mom, I'm fine.
07:10:31.980 | Like, it's all right.
07:10:34.120 | And still, she was not happy about it.
07:10:37.220 | She still says she's gonna get me back someday,
07:10:41.100 | but I mean, I don't know.
07:10:43.420 | I don't know what that's gonna look like.
07:10:44.860 | - It's a lifelong battle.
07:10:45.980 | - Yeah, but it was good.
07:10:47.960 | - In some sense, it was a demonstration
07:10:49.460 | that you still got.
07:10:50.780 | - That's all I wanted it to be.
07:10:52.820 | That's all I wanted it to be.
07:10:53.780 | And I knew that doing something super mean
07:10:56.840 | to her like that would show her--
07:10:58.900 | - It's actually a way to show that you're still there,
07:11:01.300 | that you love her.
07:11:02.140 | - Yeah, exactly, exactly.
07:11:03.540 | - It's a dark way to do it, but I love it.
07:11:05.280 | - Yeah.
07:11:06.580 | - What was the first time you were able to feel
07:11:09.740 | that you can use the Neuralink device
07:11:13.600 | to affect the world around you?
07:11:16.340 | - Yeah, the first little taste I got of it
07:11:18.700 | was actually not too long after surgery.
07:11:22.660 | Some of the Neuralink team had brought in
07:11:26.260 | like a little iPad, a little tablet screen,
07:11:30.060 | and they had put up eight different channels
07:11:34.180 | that were recording some of my neuron spikes.
07:11:38.180 | And they put it in front of me,
07:11:39.380 | and they're like, "This is like real time,
07:11:41.100 | "your brain firing."
07:11:43.100 | I was like, "That's super cool."
07:11:44.700 | My first thought was, I mean, if they're firing now,
07:11:49.100 | let's see if I can affect them in some way.
07:11:51.340 | So I started trying to like wiggle my fingers,
07:11:53.820 | and I just started like scanning through the channels.
07:11:56.940 | And one of the things I was doing
07:11:58.620 | was like moving my index finger up and down.
07:12:01.060 | And I just saw this yellow spike
07:12:03.060 | on like top row, like third box over or something.
07:12:06.780 | I saw this yellow spike every time I did it,
07:12:08.860 | and I was like, "Oh, that's cool."
07:12:10.260 | And everyone around me was just like,
07:12:11.180 | "What, what are you seeing?"
07:12:12.740 | I was like, "Look, look at this one.
07:12:14.620 | "Look at like this top row, third box over,
07:12:17.220 | "this yellow spike.
07:12:18.360 | "Like that's me right there, there, there."
07:12:21.580 | And everyone was freaking out.
07:12:23.300 | They started like clapping.
07:12:24.620 | I was like, "That's super unnecessary."
07:12:26.980 | Like this is what's supposed to happen, right?
07:12:29.780 | - So you're imagining yourself moving
07:12:32.500 | each individual finger one at a time?
07:12:34.420 | - Yeah.
07:12:35.260 | - And then seeing like that you can notice something,
07:12:36.900 | and then when you did the index finger, you're like, "Oh."
07:12:39.460 | - Yeah, I was wiggling kind of all of my fingers
07:12:42.700 | to see if anything would happen.
07:12:44.080 | There was a lot of other things going on,
07:12:47.580 | but that big yellow spike was the one that stood out to me.
07:12:50.620 | Like I'm sure that if I would've stared at it long enough,
07:12:54.180 | I could've mapped out maybe a hundred different things,
07:12:57.740 | but the big yellow spike was the one that I noticed.
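What Noland describes here, scanning a grid of channels for the one whose spikes line up with an attempted finger wiggle, can be illustrated with a toy script. This is synthetic data and hypothetical code, not Neuralink's software: it bins spike counts on eight channels and reports the channel whose firing rate best tracks an attempted-movement trace.

```python
# Toy illustration: which channel's firing rate tracks the attempted movement?
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_channels = 600, 8            # e.g. 600 bins of 50 ms = 30 s of data
attempt = (np.sin(np.linspace(0, 12 * np.pi, n_bins)) > 0.5).astype(float)

# Background Poisson firing everywhere; channel 2 also responds during attempts.
spikes = rng.poisson(lam=2.0, size=(n_bins, n_channels)).astype(float)
spikes[:, 2] += rng.poisson(lam=6.0 * attempt)

# Smooth each channel's counts and correlate with the attempt trace.
kernel = np.ones(5) / 5
rates = np.apply_along_axis(lambda x: np.convolve(x, kernel, mode="same"), 0, spikes)
corr = [np.corrcoef(rates[:, c], attempt)[0, 1] for c in range(n_channels)]

best = int(np.argmax(corr))
print(f"channel {best} tracks the attempted movement best (r = {corr[best]:.2f})")
```

In the toy data the "yellow spike" channel is the one planted at index 2; in a real recording it's whichever electrode happens to sit near neurons tuned to that movement.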
07:13:00.380 | - Maybe you could speak to what it's like
07:13:02.020 | to sort of wiggle your fingers.
07:13:03.620 | To like, to imagine that the mental,
07:13:07.780 | the cognitive effort required to sort of wiggle
07:13:10.000 | your index finger, for example.
07:13:11.860 | How easy is that to do?
07:13:13.060 | - Pretty easy for me.
07:13:14.700 | It's something that at the very beginning,
07:13:18.340 | after my accident, they told me to try
07:13:21.620 | and move my body as much as possible,
07:13:25.140 | even if you can't, just keep trying,
07:13:29.260 | because that's going to create new neural pathways
07:13:32.240 | or pathways in my spinal cord to reconnect these things,
07:13:35.500 | to hopefully regain some movement someday.
07:13:39.220 | - That's fascinating.
07:13:40.060 | - Yeah, I know, it's bizarre, but I--
07:13:43.340 | - So that's part of the recovery process
07:13:44.900 | is to keep trying to move your body.
07:13:46.660 | - Yep.
07:13:47.500 | - And that's-- - Just have at it
07:13:48.660 | as much as you can.
07:13:49.500 | - And the nervous system does its thing.
07:13:51.340 | It starts reconnecting.
07:13:52.460 | - It'll start reconnecting for some people.
07:13:55.020 | Some people, it never works.
07:13:56.840 | Some people, they'll do it, like for me,
07:13:59.020 | I got some bicep control back, and that's about it.
07:14:04.020 | I can, if I try enough, I can wiggle some of my fingers.
07:14:09.160 | Not like on command, it's more like if I try to move,
07:14:15.100 | say, my right pinky, and I just keep trying to move it,
07:14:18.340 | after a few seconds, it'll wiggle.
07:14:20.420 | So I know there's stuff there.
07:14:22.220 | I know, and that happens with a few different
07:14:24.860 | of my fingers and stuff.
07:14:26.400 | But yeah, that's what they tell you to do.
07:14:30.320 | One of the people at the time,
07:14:33.220 | when I was in the hospital, came in and told me
07:14:35.420 | for one guy who had recovered most of his control,
07:14:39.660 | what he thought about every day was actually walking,
07:14:43.220 | like the act of walking, just over and over again.
07:14:46.660 | So I tried that for years.
07:14:48.660 | I tried just imagining walking, which is, it's hard.
07:14:53.660 | It's hard to imagine all of the steps that go into,
07:14:59.000 | well, taking a step, all of the things that have to move,
07:15:03.020 | like all of the activations that have to happen
07:15:06.580 | along your leg in order for one step to occur.
07:15:09.780 | - But you're not just imagining, you're doing it, right?
07:15:12.940 | - I'm trying, yeah.
07:15:13.980 | So it's imagining over again what I had to do
07:15:18.980 | to take a step, 'cause it's not something
07:15:22.980 | any of us think about.
07:15:24.500 | You want to walk, and you take a step.
07:15:26.860 | You don't think about all of the different things
07:15:28.900 | that are going on in your body.
07:15:31.220 | So I had to recreate that in my head as much as I could,
07:15:35.120 | and then I practiced it over and over and over.
07:15:37.700 | - So it's not like a third-person perspective.
07:15:39.580 | It's a first-person perspective.
07:15:42.060 | It's not like you're imagining yourself walking.
07:15:44.460 | You're literally doing everything,
07:15:47.800 | all the same stuff as if you're walking.
07:15:49.780 | - Which was hard.
07:15:51.940 | It was hard at the beginning.
07:15:53.540 | - Like frustrating hard, or actually cognitively hard?
07:15:56.780 | - It was both.
07:15:58.660 | There's a scene in one of the Kill Bill movies, actually,
07:16:07.180 | oddly enough, where she is paralyzed.
07:16:10.300 | I don't know, from a drug that was in her system,
07:16:12.740 | and then she finds some way to get
07:16:14.940 | into the back of a truck or something,
07:16:17.220 | and she stares at her toe, and she says, "Move."
07:16:20.740 | Like, "Move your big toe."
07:16:22.420 | And after a few seconds on screen, she does it,
07:16:27.300 | and she did that with every one of her body parts
07:16:29.940 | until she could move again.
07:16:31.700 | I did that for years, just stared at my body and said,
07:16:36.940 | "Move your index finger, move your big toe,"
07:16:40.740 | sometimes vocalizing it out loud,
07:16:43.020 | but sometimes just thinking it.
07:16:44.380 | I tried every different way to do this
07:16:47.260 | to try to get some movement back.
07:16:49.260 | And it's hard because it actually is like taxing,
07:16:54.260 | like physically taxing on my body,
07:16:56.660 | which is something I would have never expected,
07:16:58.580 | 'cause it's not like I'm moving,
07:17:00.160 | but it feels like there's a buildup of,
07:17:04.760 | I don't know, the only way I can describe it is
07:17:07.500 | there are like signals that aren't getting through
07:17:12.500 | from my brain down, 'cause there's that gap
07:17:16.380 | in my spinal cord, so brain down,
07:17:18.100 | and then from my hand back up to the brain.
07:17:21.400 | And so it feels like those signals get stuck
07:17:26.120 | in whatever body part that I'm trying to move,
07:17:28.560 | and they just build up and build up and build up
07:17:30.880 | until they burst.
07:17:32.220 | And then once they burst, I get like
07:17:34.520 | this really weird sensation of everything
07:17:37.220 | sort of like dissipating back out to level,
07:17:39.960 | and then I do it again.
07:17:41.120 | It's also just like a fatigue thing,
07:17:45.200 | like a muscle fatigue,
07:17:46.880 | but without actually moving your muscles.
07:17:49.360 | It's very, very bizarre.
07:17:51.100 | And then, you know, if you try to stare at a body part
07:17:55.600 | or think about a body part and move
07:17:57.680 | for two, three, four, sometimes eight hours,
07:18:01.640 | it's very taxing on your mind.
07:18:04.240 | It takes a lot of focus.
07:18:06.140 | It was a lot easier at the beginning
07:18:09.240 | because I wasn't able to like control a TV
07:18:13.860 | in my room or anything.
07:18:14.860 | I wasn't able to control any of my environment.
07:18:18.340 | So for the first few years,
07:18:20.460 | a lot of what I was doing was staring at walls.
07:18:23.300 | And so obviously I did a lot of thinking,
07:18:27.780 | and I tried to move a lot,
07:18:30.500 | just over and over and over again.
07:18:32.960 | - Do you never give up sort of hope there?
07:18:35.560 | - No.
07:18:36.400 | - Just training hard, essentially.
07:18:37.580 | - Yep, and I still do it.
07:18:39.220 | I do it like subconsciously.
07:18:41.980 | And I think that that helped a lot
07:18:45.740 | with things with Neuralink, honestly.
07:18:47.860 | It's something that I talked about the other day
07:18:50.060 | at the All Hands that I did at Neuralink's Austin facility.
07:18:53.680 | - Welcome to Austin, by the way.
07:18:54.740 | - Yeah, hey, thanks, man.
07:18:55.820 | I would just-- - Nice hat.
07:18:57.020 | - Hey, thanks, thanks, man.
07:18:58.100 | The Gigafactory was super cool.
07:19:00.300 | I went to school at Texas A&M, so I've been around before.
07:19:03.340 | - So you should be saying, "Welcome to me."
07:19:05.020 | - Yeah. - Welcome to Texas, Lex.
07:19:06.380 | - Yeah. - Yeah, I hear you.
07:19:08.980 | - But yeah, I was talking about how a lot
07:19:11.460 | of what they've had me do, especially at the beginning,
07:19:14.100 | well, I still do it now, is body mapping.
07:19:17.460 | So like there will be a visualization
07:19:20.220 | of a hand or an arm on the screen,
07:19:23.300 | and I have to do that motion.
07:19:25.700 | And that's how they sort of train the algorithm
07:19:29.340 | to understand what I'm trying to do.
07:19:31.940 | And so it made things very seamless for me, I think.
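The body-mapping procedure described here is essentially a supervised data-collection loop: show a cued movement on screen, record a window of neural features while the participant attempts it, and save the pair as a labeled training example for the decoder. A hypothetical sketch of that loop follows; `show_cue` and `read_feature_window` are stand-in names, not real Neuralink APIs, and the cue list is invented.

```python
# Hypothetical body-mapping calibration loop (stand-in functions, invented cues).
import time

CUES = ["right hand: open", "right hand: close", "index finger: flex", "wrist: rotate"]

def show_cue(cue: str) -> None:
    # Placeholder for the on-screen hand/arm animation the participant mirrors.
    print(f"now attempt: {cue}")

def read_feature_window(duration_s: float) -> list[float]:
    # Placeholder: would return binned spike-band features from the implant.
    time.sleep(duration_s)
    return [0.0] * 1024  # e.g. one value per channel

dataset = []
for trial in range(3):                   # real sessions run many more repetitions
    for cue in CUES:
        show_cue(cue)
        features = read_feature_window(duration_s=2.0)
        dataset.append((features, cue))  # labeled example for decoder training

print(f"collected {len(dataset)} labeled trials")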
07:19:36.940 | - That's really, really cool.
07:19:39.300 | So it's amazing to know, 'cause I've learned a lot
07:19:42.140 | about the body mapping procedure,
07:19:44.380 | like with the interface and everything like that.
07:19:46.780 | It's cool to know that you've been essentially
07:19:48.940 | like training to be world-class at that task.
07:19:52.940 | - Yeah, yeah.
07:19:54.780 | I don't know if other quadriplegics,
07:19:58.260 | like other paralyzed people, give up.
07:20:00.860 | I hope they don't.
07:20:03.500 | I hope they keep trying,
07:20:05.060 | because I've heard other paralyzed people say,
07:20:09.020 | like, "Don't ever stop."
07:20:10.220 | They tell you two years, but you just never know.
07:20:14.780 | The human body's capable of amazing things.
07:20:17.820 | So I've heard other people say, "Don't give up."
07:20:21.820 | Like, I think one girl had spoken to me
07:20:25.620 | through some family members and said
07:20:27.100 | that she had been paralyzed for 18 years,
07:20:30.780 | and she'd been trying to wiggle her index finger
07:20:34.020 | for all that time,
07:20:34.980 | and she finally got it back like 18 years later.
07:20:37.840 | So I know that it's possible,
07:20:39.820 | and I'll never give up doing it.
07:20:41.340 | I just, I do it when I'm lying down watching TV.
07:20:44.580 | I'll find myself doing it kind of just almost on its own.
07:20:48.780 | It's just something I've gotten so used to doing
07:20:51.060 | that, I don't know, I don't think I'll ever stop.
07:20:54.000 | - That's really awesome to hear,
07:20:54.980 | 'cause I think it's one of those things
07:20:56.300 | that can really pay off in the long term,
07:20:58.940 | 'cause it is training.
07:21:00.140 | You're not visibly seeing the results
07:21:01.620 | of that training at the moment,
07:21:03.200 | but there's that Olympic-level nervous system
07:21:06.460 | getting ready for something.
07:21:08.260 | - Honestly, it was like,
07:21:10.180 | something that I think Neuralink gave me
07:21:13.420 | that I can't thank them enough for,
07:21:17.820 | like I can't show my appreciation for it enough,
07:21:20.700 | was being able to visually see
07:21:25.700 | that what I'm doing is actually having some effect.
07:21:30.460 | It's a huge part of the reason why I know now
07:21:36.420 | that I'm gonna keep doing it forever,
07:21:38.460 | because before Neuralink, I was doing it every day,
07:21:42.460 | and I was just assuming that things were happening.
07:21:46.660 | Like, it's not like I knew.
07:21:47.700 | I wasn't getting back any mobility or sensation or anything,
07:21:52.620 | so I could have been running up against a brick wall
07:21:55.740 | for all I knew, and with Neuralink,
07:21:58.580 | I get to see all the signals happening real-time,
07:22:02.620 | and I get to see that what I'm doing can actually be mapped.
07:22:07.620 | When we started doing click calibrations and stuff,
07:22:11.700 | when I go to click my index finger for a left click,
07:22:14.440 | that it actually recognizes that.
07:22:17.220 | It changed how I think about what's possible
07:22:21.260 | with retraining my body to move,
07:22:25.000 | and so yeah, I'll never give up now.
07:22:28.020 | - And also just the signal
07:22:29.180 | that there's still a powerhouse of a brain there
07:22:31.180 | that's like, and as the technology develops,
07:22:34.900 | that brain is, I mean, that's the most important thing
07:22:37.380 | about the human body is the brain,
07:22:38.860 | and it can do a lot of the control.
07:22:40.940 | So what did it feel like when you first
07:22:42.860 | could wiggle the index finger
07:22:44.100 | and saw the environment respond like that little thing,
07:22:49.100 | where everybody was being way too dramatic,
07:22:50.700 | according to you?
07:22:51.540 | - Yeah, it was very cool.
07:22:52.500 | I mean, it was cool, but it,
07:22:55.380 | I keep telling this to people, it made sense to me.
07:22:58.260 | It made sense that there are signals
07:23:02.020 | still happening in my brain,
07:23:03.780 | and that as long as you had something near it
07:23:07.280 | that could measure those, that could record those,
07:23:09.700 | then you should be able to visualize it in some way,
07:23:13.140 | see it happen.
07:23:14.380 | And so that was not very surprising to me.
07:23:17.420 | I was just like, oh, cool, we found one.
07:23:20.580 | We found something that works.
07:23:22.060 | It was cool to see that their technology worked,
07:23:26.080 | and that everything that they'd worked so hard for
07:23:28.720 | was going to pay off.
07:23:30.860 | But I hadn't moved a cursor or anything at that point.
07:23:33.260 | I hadn't interacted with a computer
07:23:35.140 | or anything at that point.
07:23:36.440 | So it just made sense.
07:23:40.400 | It was cool.
07:23:42.180 | I didn't really know much about BCI at that point either.
07:23:45.820 | So I didn't know what sort of step
07:23:48.900 | this was actually making.
07:23:50.480 | I didn't know if this was a huge deal,
07:23:54.060 | or if this was just like,
07:23:55.180 | okay, this is, it's cool that we got this far,
07:23:58.560 | but we're actually hoping for something
07:24:00.740 | much better down the road.
07:24:02.220 | It's like, okay, I just thought that they knew
07:24:04.740 | that it turned on.
07:24:06.140 | So I was like, cool, this is cool.
07:24:08.540 | - Well, did you read up on the specs
07:24:10.360 | of the hardware you get installed,
07:24:12.300 | the number of threads?
07:24:13.720 | - Yeah, yeah, I knew all of that,
07:24:15.220 | but it's all Greek to me.
07:24:17.740 | I was like, okay, threads, 64 threads,
07:24:20.900 | 16 electrodes, 1,024 channels.
07:24:24.060 | Okay, that math checks out.
07:24:29.060 | - Sounds right.
07:24:31.460 | - Yeah.
07:24:32.280 | - When was the first time you were able
07:24:33.220 | to move a mouse cursor?
07:24:34.460 | - I know it must've been within the first maybe week,
07:24:37.700 | a week or two weeks that I was able
07:24:39.700 | to like first move the cursor.
07:24:42.300 | And again, like it kind of made sense to me.
07:24:46.160 | Like it didn't seem like that big of a deal.
07:24:48.780 | Like it was like, okay, well, how do I explain this?
07:24:53.780 | When everyone around you starts clapping
07:24:57.020 | for something that you've done,
07:24:58.900 | it's easy to say, okay, like I did something cool.
07:25:03.780 | Like that was impressive in some way.
07:25:07.100 | What exactly that meant, what it was,
07:25:09.860 | hadn't really like set in for me.
07:25:13.960 | So again, I knew that me trying to move a body part
07:25:19.980 | and then that being mapped in some sort
07:25:28.700 | of like machine learning algorithm
07:25:31.180 | to be able to identify like my brain signals
07:25:35.140 | and then take that and give me cursor control.
07:25:37.380 | That all kind of made sense to me.
07:25:38.820 | I don't know like all the ins and outs of it,
07:25:41.300 | but I was like, there are still signals in my brain firing.
07:25:44.580 | They just can't get through because there's like a gap
07:25:48.020 | in my spinal cord.
07:25:49.100 | And so they just, they can't get all the way down
07:25:51.180 | and back up, but they're still there.
07:25:53.500 | So when I moved the cursor for the first time,
07:25:56.460 | I was like, that's cool.
07:25:57.380 | But I expected that that should happen.
07:26:00.100 | Like it made sense to me.
07:26:02.940 | When I moved the cursor for the first time
07:26:06.300 | with just my mind without like physically trying to move.
07:26:10.820 | So I guess I can get into that just a little bit,
07:26:12.820 | like the difference between attempted movement
07:26:15.580 | and imagined movement.
07:26:16.540 | - Yeah, that's a fascinating difference.
07:26:18.340 | - Yeah. - From one to the other.
07:26:19.700 | - Yeah, yeah, yeah.
07:26:20.540 | So like attempted movement is me physically trying
07:26:24.940 | to attempt to move, say my hand.
07:26:28.140 | I try to attempt to move my hand to the right,
07:26:30.340 | to the left, forward and back.
07:26:33.080 | And that's all attempted.
07:26:33.960 | Attempt to, you know, like lift my finger up and down,
07:26:37.360 | attempt to kick or something.
07:26:39.760 | I'm physically trying to do all of those things,
07:26:42.360 | even if you can't see it.
07:26:44.440 | Like I'm, this would be like me attempting
07:26:46.880 | to like shrug my shoulders or something.
07:26:49.280 | That's all attempted movement.
07:26:50.800 | That all, that's what I was doing
07:26:55.320 | for the first couple of weeks
07:26:57.320 | when they were going to give me cursor control.
07:27:00.240 | When I was doing body mapping,
07:27:01.620 | it was attempt to do this, attempt to do that.
07:27:04.740 | When Nir was telling me to like imagine doing it,
07:27:09.740 | it like kind of made sense to me,
07:27:14.920 | but it's not something that people practice.
07:27:19.600 | Like if you started school as a child
07:27:24.600 | and they said, okay, write your name with this pencil.
07:27:28.820 | And so you do that.
07:27:30.020 | Like, okay, now imagine writing your name with that pencil.
07:27:33.180 | Kids would think like, I guess, like that kind of makes sense
07:27:38.020 | and they would do it, but that's not something we're taught.
07:27:40.940 | It's all like how to do things physically.
07:27:43.420 | We think about like thought experiments and things,
07:27:46.460 | but that's not like a physical action of doing things.
07:27:50.180 | It's more like what you would do in certain situations.
07:27:53.180 | So imagine movement, it never really connected with me.
07:27:56.420 | Like, I guess you could maybe describe it
07:27:59.260 | as like a professional athlete,
07:28:01.300 | like swinging a baseball bat or swinging like a golf club,
07:28:04.740 | like imagine what you're supposed to do,
07:28:06.820 | but then you go right to that and physically do it.
07:28:10.980 | Like you, then you get a bat in your hand
07:28:13.060 | and then you do what you've been imagining.
07:28:15.100 | And so I don't have that like connection.
07:28:17.220 | So telling me to imagine something versus attempting it,
07:28:20.340 | it just, there wasn't a lot that I could do there mentally.
07:28:24.460 | I just kind of had to accept what was going on and try.
07:28:28.720 | But the attempted movement thing, it all made sense to me.
07:28:32.620 | Like if I try to move,
07:28:34.740 | then there's a signal being sent in my brain.
07:28:38.180 | And as long as they can pick that up,
07:28:39.740 | then they should be able to map it to what I'm trying to do.
07:28:42.700 | And so when I first moved the cursor like that,
07:28:45.420 | it was just like, yes, this should happen.
07:28:48.780 | Like I'm not surprised by that.
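The mapping Noland is gesturing at, brain signals in and cursor control out, is commonly fit as a regression from binned neural features to intended cursor velocity. Below is a minimal sketch with synthetic data (ridge regression in NumPy); it illustrates the general technique, not Neuralink's actual decoder, and in practice the training pairs would come from calibration tasks like the body mapping described earlier.

```python
# Minimal sketch: ridge regression from neural features to 2-D cursor velocity.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_features = 2000, 256        # e.g. 2000 time bins, 256 channel features

true_W = rng.normal(size=(n_features, 2))               # hidden "tuning" of channels
X = rng.normal(size=(n_samples, n_features))            # neural features per bin
y = X @ true_W + 0.5 * rng.normal(size=(n_samples, 2))  # intended (vx, vy)

# Ridge solution: W = (X^T X + lambda I)^-1 X^T y
lam = 10.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Decode a new bin of features into a cursor velocity command.
x_new = rng.normal(size=(1, n_features))
vx, vy = (x_new @ W).ravel()
print(f"decoded cursor velocity: ({vx:+.2f}, {vy:+.2f})")
```

The "model gets better" experience described later corresponds to refitting W on more and cleaner labeled data.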
07:28:50.740 | - But can you clarify, is there supposed to be a difference
07:28:52.900 | between imagined movement and attempted movement?
07:28:55.420 | - Yeah, just that in imagined movement,
07:28:57.540 | you're not attempting to move at all.
07:29:00.220 | So it's-
07:29:01.060 | - You're like visualizing what you're doing.
07:29:03.180 | And then theoretically,
07:29:05.340 | is that supposed to be a different part of the brain
07:29:06.860 | that lights up in those two different situations?
07:29:09.220 | - Yeah, not necessarily.
07:29:10.340 | I think all these signals can still be represented
07:29:12.380 | in motor cortex, but the difference I think has to do
07:29:15.100 | with the naturalness of imagining something versus-
07:29:18.060 | - Got it.
07:29:18.900 | - Attempting it and sort of the fatigue of that over time.
07:29:20.700 | - And by the way, on the mic is Bliss.
07:29:23.060 | So like, this is just different ways to prompt you
07:29:27.740 | to kind of get to the thing that you're around.
07:29:31.180 | - Yeah, yeah.
07:29:32.020 | - Attempted movement does sound like the right thing to try.
07:29:35.740 | - Yeah, I mean, it makes sense to me.
07:29:37.740 | - 'Cause imagine for me, I would start visualizing,
07:29:40.860 | like in my mind, visualizing.
07:29:42.780 | Attempted, I would actually start trying to like,
07:29:45.460 | there's a, I mean, I did like combat sports
07:29:48.140 | my whole life, like wrestling.
07:29:49.300 | When I'm imagining a move, see, I'm like moving my muscle.
07:29:54.300 | - Exactly.
07:29:55.460 | - Like there is a bit of an activation almost,
07:29:58.140 | versus like visualizing yourself like a picture doing it.
07:30:01.620 | - Yeah, it's something that I feel like naturally
07:30:04.540 | anyone would do.
07:30:05.500 | If you try to tell someone to imagine doing something,
07:30:07.880 | they might close their eyes
07:30:09.300 | and then start physically doing it.
07:30:11.860 | But it's just-
07:30:13.460 | - To think like-
07:30:14.380 | - Yeah, it's hard.
07:30:17.660 | It was very hard at the beginning.
07:30:18.820 | - But attempted worked.
07:30:20.140 | - Attempted worked.
07:30:21.100 | It worked just like it should work like a charm.
07:30:25.640 | - I remember there was like one Tuesday
07:30:27.980 | we were messing around and I think,
07:30:29.860 | I forget what swear word you used,
07:30:31.180 | but there was a swear word that came out of your mouth
07:30:32.820 | when you figured out you could just do
07:30:33.900 | the direct cursor control.
07:30:35.380 | - Yeah, that's, it blew my mind, like no pun intended,
07:30:40.380 | blew my mind when I first moved the cursor
07:30:47.540 | just with my thoughts and not attempting to move.
07:30:51.900 | It's something that I found like over the couple of weeks,
07:30:56.140 | like building up to that,
07:30:57.620 | that as I get better cursor controls,
07:31:02.420 | like the model gets better,
07:31:05.160 | then it gets easier for me to like,
07:31:11.380 | like I don't have to attempt as much to move it.
07:31:15.900 | And part of that is something that I'd even talked
07:31:19.220 | with them about when I was watching
07:31:23.020 | the signals of my brain one day.
07:31:25.260 | I was watching when I like attempted to move to the right
07:31:29.220 | and I watched the screen as like, I saw the spikes.
07:31:33.020 | Like I was seeing the spike, the signals being sent
07:31:35.780 | before I was actually attempting to move.
07:31:38.520 | I imagine just because, you know,
07:31:42.180 | when you go to say move your hand or any body part,
07:31:45.540 | that signal gets sent before you're actually moving,
07:31:48.020 | has to make it all the way down and back up
07:31:50.180 | before you actually do any sort of movement.
07:31:51.940 | So there's a delay there.
07:31:53.740 | And I noticed that there was something going on in my brain
07:31:58.100 | before I was actually attempting to move,
07:32:00.620 | that my brain was like anticipating what I wanted to do.
07:32:05.620 | And that all started sort of, I don't know,
07:32:11.900 | like percolating in my brain.
07:32:14.000 | I get just, it was just sort of there,
07:32:16.180 | like always in the back, like that's so weird
07:32:19.180 | that it could do that.
07:32:20.500 | It kind of makes sense, but I wonder what that means
07:32:23.460 | as far as like using the Neuralink.
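The observation that motor-cortex activity appears before the attempted movement can be quantified by cross-correlating a neural rate trace against a movement trace and finding the lag with peak correlation. A rough sketch on synthetic data follows; the ~150 ms lead is invented for illustration, not a measured Neuralink figure.

```python
# Rough sketch: estimate how far a neural signal leads movement via cross-correlation.
import numpy as np

rng = np.random.default_rng(2)
n = 2000                                 # time bins, say 10 ms each
movement = np.zeros(n)
onsets = rng.choice(np.arange(50, n - 50), size=20, replace=False)
for t in onsets:
    movement[t:t + 10] = 1.0             # brief attempted movements

lead_bins = 15                           # synthetic: neural activity starts 150 ms earlier
neural = np.roll(movement, -lead_bins) + 0.3 * rng.normal(size=n)

# Cross-correlate at different lags and pick the strongest alignment.
lags = np.arange(-50, 51)
corrs = [np.corrcoef(np.roll(neural, lag), movement)[0, 1] for lag in lags]
best_lag = lags[int(np.argmax(corrs))]
print(f"neural signal leads movement by about {best_lag * 10} ms")
```

A decoder trained on such data can respond at the neural onset rather than the movement onset, which is one way the cursor can feel like it is "anticipating" the intent.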
07:32:26.520 | And, you know, and then as I was playing around
07:32:31.520 | with the attempted movement and playing around
07:32:33.500 | with the cursor, and I saw that like,
07:32:35.260 | as the cursor control got better,
07:32:37.300 | that it was anticipating my movements
07:32:42.580 | and what I wanted it to do, like cursor movements,
07:32:45.340 | what I wanted to do a bit better and a bit better.
07:32:48.240 | And then one day I just randomly,
07:32:51.720 | as I was playing WebGrid, I like looked at a target
07:32:56.720 | before I had started like attempting to move.
07:32:59.720 | I was just trying to like get over,
07:33:01.520 | like train my eyes to start looking ahead,
07:33:05.400 | like, okay, this is the target I'm on.
07:33:07.320 | But if I look over here to this target,
07:33:09.240 | I know I can like maybe be a bit quicker getting there.
07:33:12.000 | And I looked over and the cursor just shot over.
07:33:14.680 | It was wild.
07:33:16.760 | Like I had to take a step back.
07:33:19.200 | I was like, this should not be happening.
07:33:21.240 | All day I was just smiling.
07:33:22.920 | I was so giddy.
07:33:24.040 | I was like, guys, do you know that this works?
07:33:26.040 | Like I can just think it and it happens,
07:33:28.440 | which like they'd all been saying this entire time.
07:33:31.620 | Like, I can't believe like you're doing all this
07:33:33.800 | with your mind.
07:33:34.640 | I'm like, yeah, but is it really with my mind?
07:33:36.440 | Like I'm attempting to move and it's just picking that up.
07:33:38.900 | So it doesn't feel like it's with my mind.
07:33:40.840 | But when I moved it for the first time like that,
07:33:42.760 | it was, oh man, it like,
07:33:46.400 | it made me think that this technology,
07:33:50.920 | that what I'm doing is actually way,
07:33:54.440 | way more impressive than I ever thought.
07:33:56.760 | It was way cooler than I ever thought.
07:33:58.400 | And it just opened up a whole new world of possibilities
07:34:02.320 | of like what could possibly happen with this technology
07:34:05.960 | and what I might be able to be capable of with it.
07:34:08.520 | - Because you had felt for the first time
07:34:10.240 | like this was digital telepathy.
07:34:12.280 | Like you're controlling a digital device with your mind.
07:34:15.640 | - Yep.
07:34:16.720 | - I mean, that's a real moment of discovery.
07:34:19.100 | That's really cool.
07:34:19.940 | Like you've discovered something.
07:34:21.360 | I've seen like scientists talk about like a big aha moment,
07:34:26.200 | you know, like Nobel prize winning.
07:34:27.560 | They'll have this like, holy crap.
07:34:30.800 | - Yeah. - Like, whoa.
07:34:32.720 | - That's what it felt like.
07:34:33.560 | Like I didn't feel like,
07:34:35.600 | like I felt like I had discovered something,
07:34:37.320 | but for me, maybe not necessarily
07:34:39.520 | for like the world at large or like this field at large,
07:34:43.840 | it just felt like an aha moment for me.
07:34:46.700 | Like, oh, this works.
07:34:48.420 | Like, obviously it works.
07:34:50.080 | And so that's what I do like all the time now.
07:34:54.120 | I kind of intermix the attempted movement
07:34:57.760 | and imagine movement.
07:35:00.160 | I do it all like together
07:35:02.320 | because I've found that there is some interplay with it
07:35:06.720 | that maximizes efficiency with the cursor.
07:35:10.860 | So it's not all like one or the other.
07:35:12.640 | It's not all just, I only use attempted
07:35:15.200 | or I only use like imagine movements.
07:35:17.520 | It's more, I use them in parallel
07:35:21.240 | and I can do one or the other.
07:35:23.920 | I can just completely think about whatever I'm doing,
07:35:27.720 | but I don't know.
07:35:30.360 | I like to play around with it.
07:35:31.200 | I also like to just experiment with these things.
07:35:33.680 | Like every now and again, I'll get this idea in my head,
07:35:36.320 | like, hmm, I wonder if this works
07:35:38.320 | and I'll just start doing it.
07:35:39.560 | And then afterwards I'll tell them,
07:35:40.800 | by the way, I wasn't doing that like you guys wanted me to.
07:35:44.280 | I was, I thought of something and I wanted to try it.
07:35:47.360 | And so I did, it seems like it works.
07:35:49.560 | So maybe we should like explore that a little bit.
07:35:51.960 | - So I think that discovery is not just for you,
07:35:54.240 | at least from my perspective,
07:35:55.280 | that's a discovery for everyone else,
07:35:58.640 | whoever uses a Neuralink, that this is possible.
07:36:01.520 | Like, I don't think that's an obvious thing
07:36:03.640 | that this is even possible.
07:36:05.000 | It's like, I was saying to Bliss earlier,
07:36:08.480 | it's like the four minute mile.
07:36:10.160 | People thought it was impossible
07:36:11.160 | to run a mile in four minutes.
07:36:12.840 | And once the first person did it,
07:36:14.640 | then everyone just started doing it.
07:36:16.760 | So like, just to show that it's possible,
07:36:18.880 | that paves the way to like, anyone can now do it.
07:36:21.080 | That's the thing that's actually possible.
07:36:22.880 | You don't need to do the attempted movement.
07:36:24.760 | You can just go direct.
07:36:26.640 | That's crazy.
07:36:27.560 | - It is crazy.
07:36:28.760 | It is crazy, yeah.
07:36:30.640 | - For people who don't know,
07:36:31.800 | can you explain how the link app works?
07:36:34.160 | You have an amazing stream on the topic.
07:36:36.840 | Your first stream, I think, on X.
07:36:39.560 | Describing the app, can you just describe how it works?
07:36:43.800 | - Yeah, so it's just an app that Neuralink created
07:36:47.200 | to help me interact with the computer.
07:36:51.960 | So on the link app, there are a few different settings
07:36:56.440 | and different modes and things I can do on it.
07:36:59.920 | So there's like the body mapping,
07:37:01.800 | which we kind of touched on.
07:37:03.560 | There's a calibration.
07:37:05.960 | Calibration is how I actually get cursor control.
07:37:10.480 | So calibrating what's going on in my brain
07:37:14.400 | to translate that into cursor control.
07:37:18.320 | So it will pop out models.
07:37:22.880 | What they use, I think, is like time.
07:37:25.280 | So it would be, you know, five minutes in calibration
07:37:28.880 | will give me so good of a model.
07:37:31.600 | And then if I'm in it for 10 minutes and 15 minutes,
07:37:35.040 | the models will progressively get better.
07:37:37.440 | And so, you know, the longer I'm in it,
07:37:40.640 | generally, the better the models will get.
07:37:43.320 | - That's really cool,
07:37:44.160 | 'cause you often refer to the models.
07:37:46.160 | The model's the thing that's constructed
07:37:47.800 | once you go through the calibration step.
07:37:49.840 | And then you also talked about sometimes you'll play
07:37:52.760 | like a really difficult game, like Snake,
07:37:55.160 | just to see how good the model is.
07:37:56.920 | - Yeah, yeah.
07:37:57.760 | So Snake is kind of like my litmus test for models.
07:38:00.600 | If I can control Snake decently well,
07:38:03.280 | then I know I have a pretty good model.
07:38:05.800 | So yeah, the Link App has all of those,
07:38:07.760 | it has WebGrid in it now.
07:38:09.960 | It's also how I like connect to the computer,
07:38:12.240 | just in general.
07:38:13.160 | So they've given me a lot of like voice controls
07:38:17.120 | with it at this point.
07:38:18.360 | So I can, you know, say like connect or implant disconnect.
07:38:22.520 | And as long as I have that charger handy,
07:38:27.160 | then I can connect to it.
07:38:28.040 | So the charger is also how I connect to the Link App,
07:38:30.560 | to connect to the computer.
07:38:31.720 | I have to have the implant charger over my head
07:38:36.240 | when I wanna connect to have it wake up
07:38:38.520 | 'cause the implant's in hibernation mode,
07:38:41.240 | like always when I'm not using it.
07:38:42.960 | I think there's a setting to like wake it up every,
07:38:47.120 | you know, so long.
07:38:48.960 | So we could set it to half an hour or five hours
07:38:51.560 | or something if I just want it to wake up periodically.
07:38:54.720 | So yeah, I'll like connect to the Link App
07:38:58.640 | and then go through all sorts of things.
07:39:01.800 | Calibration for the day, maybe body mapping.
07:39:04.000 | I have like, I made them give me like a little homework tab
07:39:07.680 | because I am very forgetful and I forget to do things a lot.
07:39:12.680 | So I have like a lot of data collection things
07:39:16.720 | that they want me to do.
07:39:18.120 | - Is the body mapping part of the data collection
07:39:20.200 | or is that also part of the connection?
07:39:21.200 | - Yeah, it is.
07:39:22.040 | It's something that they want me to do daily,
07:39:25.440 | which I've been slacking on
07:39:26.880 | 'cause I've been doing so much media and traveling so much.
07:39:29.720 | So I've been-- - You've gotten super famous.
07:39:31.360 | - Yeah, I've been a terrible first candidate
07:39:34.920 | for how much I've been slacking on my homework.
07:39:38.360 | But yeah, it's just something that they want me
07:39:41.000 | to do every day to, you know, track
07:39:44.240 | how well the Neuralink is performing over time
07:39:48.160 | and to have something to give.
07:39:49.520 | I imagine to give to the FDA to, you know,
07:39:51.720 | create all sorts of fancy charts and stuff
07:39:53.920 | and show like, hey, this is what the Neuralink,
07:39:57.160 | this is how it's performing, you know,
07:39:58.840 | day one versus day 90 versus day 180 and things like that.
07:40:02.640 | - What's the calibration step like?
07:40:03.880 | Is it like move left, move right?
07:40:06.920 | - It's a bubble game.
07:40:09.040 | So there will be like yellow bubbles
07:40:10.600 | that pop up on the screen.
07:40:12.080 | At first it is open loop.
07:40:14.760 | So open loop, this is something
07:40:16.520 | that I still don't fully understand,
07:40:19.680 | open loop and closed loop thing.
07:40:21.280 | - Me and Bliss talked for a long time
07:40:22.760 | about the difference between the two
07:40:24.240 | on the technical side.
07:40:25.280 | - Okay, yeah.
07:40:26.120 | - So it'd be great to hear your side of the story.
07:40:29.080 | - Open loop is basically,
07:40:31.680 | I have no control over the cursor.
07:40:34.160 | The cursor will be moving on its own across the screen
07:40:37.480 | and I am following by intention
07:40:40.920 | the cursor to different bubbles.
07:40:43.400 | And then the algorithm is training off of what,
07:40:48.120 | like the signals it's getting are as I'm doing this.
07:40:51.000 | There are a couple different ways that they've done it.
07:40:52.520 | They call it center out target.
07:40:54.880 | So there will be a bubble in the middle
07:40:56.880 | and then eight bubbles around that.
07:40:58.840 | And the cursor will go from the middle to one side.
07:41:02.560 | So say middle to left, back to middle,
07:41:04.960 | to up to middle, like upright.
07:41:07.520 | And they'll do that all the way around the circle.
07:41:10.240 | And I will follow that cursor the whole time.
07:41:13.720 | And then it will train off of my intentions
07:41:17.080 | what it is expecting my intentions to be
07:41:20.880 | throughout the whole process.
07:41:22.440 | - Can you actually speak to when you say follow?
07:41:25.600 | - Yes.
07:41:26.440 | - You don't mean with your eyes,
07:41:27.260 | you mean with your intentions.
07:41:28.400 | - Yeah, so generally for calibration,
07:41:30.840 | I'm doing attempted movements
07:41:32.680 | 'cause I think it works better.
07:41:34.280 | I think the better models as I progress through calibration
07:41:39.280 | make it easier to use imagined movements.
07:41:45.160 | - Wait, wait, wait, wait.
07:41:46.000 | So calibrated on attempted movement
07:41:49.520 | will create a model that makes it really effective
07:41:53.280 | for you to then use the force.
07:41:55.880 | - Yes.
07:41:56.700 | I've tried doing calibration with imagined movement,
07:42:01.700 | and it just doesn't work as well for some reason.
07:42:05.880 | So that was the center out targets.
07:42:07.680 | There's also one where a random target
07:42:10.100 | will pop up on the screen and it's the same.
07:42:12.200 | I just move, I follow along wherever the cursor is
07:42:16.280 | to that target all across the screen.
07:42:18.740 | I've tried those with imagined movement.
07:42:22.440 | And for some reason, the models just don't,
07:42:25.600 | they don't give as high a level of quality
07:42:31.840 | when we get into closed loop.
07:42:34.100 | I haven't played around with it a ton.
07:42:37.160 | So maybe like the different ways
07:42:38.560 | that we're doing calibration now might make it a bit better.
07:42:41.920 | But what I've found is there will be a point in calibration
07:42:46.940 | where I can use imagined movement.
07:42:50.400 | Before that point, it doesn't really work.
07:42:53.520 | So if I do calibration for 45 minutes,
07:42:58.280 | the first 15 minutes, I can't use imagined movement.
07:43:03.060 | It just like doesn't work for some reason.
07:43:05.160 | And after a certain point, I can just sort of feel it.
07:43:11.320 | I can tell it moves different.
07:43:14.440 | That's the best way I can describe it.
07:43:16.520 | Like it's almost as if it is anticipating
07:43:20.040 | what I am going to do again before I go to do it.
07:43:23.900 | And so using attempted movement for 15 minutes,
07:43:29.480 | at some point I can kind of tell when I like move my eyes
07:43:33.820 | to the next target that the cursor
07:43:35.760 | is starting to like pick up.
07:43:37.240 | Like it's starting to understand,
07:43:38.960 | it's learning like what I'm going to do.
07:43:41.000 | - So first of all, it's really cool that,
07:43:42.680 | I mean, you are a true pioneer in all of this.
07:43:45.040 | You're like exploring how to do every aspect
07:43:48.520 | of this most effectively.
07:43:49.880 | And there's just, I imagine so many lessons learned
07:43:53.240 | from this, so thank you for being a pioneer
07:43:55.480 | in all these kinds of different like super technical ways.
07:43:58.360 | And it's also cool to hear that there's like a different
07:44:01.560 | like feeling to the experience when it's calibrated
07:44:05.800 | in different ways.
07:44:07.260 | Like just, 'cause I imagine your brain
07:44:10.400 | is doing something different.
07:44:12.280 | And that's why there's a different feeling to it.
07:44:14.240 | And then trying to find the words and the measurements
07:44:18.120 | to those feelings would be also interesting.
07:44:19.800 | But at the end of the day, you can also measure
07:44:21.680 | that your actual performance, whether it's Snake or WebGrid,
07:44:26.120 | you can see like what actually works well.
07:44:27.880 | And you're saying for the open loop calibration,
07:44:31.920 | the attempted movement works best for now.
07:44:35.000 | - Yep, yep.
07:44:36.360 | - So the open loop, you don't get the feedback
07:44:39.400 | that's something, that you did something.
07:44:41.640 | - Yeah, I'm just-- - Is that frustrating?
07:44:43.440 | - No, no, it makes sense to me.
07:44:45.400 | Like we've done it with a cursor and without a cursor
07:44:48.440 | in open loop.
07:44:49.280 | So sometimes it's just say for like the center out,
07:44:53.520 | you'll start calibration with a bubble lighting up
07:44:59.840 | and I push towards that bubble.
07:45:01.760 | And then when I've pushed towards
07:45:05.160 | that bubble for, say, three seconds, the bubble will pop.
07:45:08.100 | And then I come back to the middle.
07:45:09.880 | So I'm doing it all just by my intentions.
07:45:13.140 | Like that's what it's learning anyway.
07:45:14.780 | So it makes sense that as long as I follow
07:45:17.240 | what they want me to do, like follow the yellow brick road,
07:45:19.920 | that it'll all work out.
07:45:21.120 | - You're full of great references.
07:45:24.640 | Is the bubble game fun?
07:45:26.320 | Like, yeah.
07:45:27.160 | - They always feel so bad making me do calibration.
07:45:29.840 | Like, oh, we're about to do a 40 minute calibration.
07:45:33.400 | I'm like, all right, do you guys wanna do two of them?
07:45:36.240 | Like I'm always asking to like whatever they need,
07:45:39.600 | I'm more than happy to do.
07:45:41.000 | And it's not bad.
07:45:43.220 | Like I get to lie there and, or sit in my chair
07:45:46.640 | and like do these things with some great people.
07:45:50.700 | I get to have great conversations.
07:45:52.580 | I can give them feedback.
07:45:55.060 | I can talk about all sorts of things.
07:45:57.500 | I could throw something on on my TV in the background
07:46:00.220 | and kind of like split my attention between them.
07:46:02.800 | Like, it's not bad at all.
07:46:05.420 | - Is there a score that you get?
07:46:07.000 | Like, can you do better on the bubble game?
07:46:08.780 | - No, I would love that.
07:46:10.800 | I would love, yeah.
07:46:13.560 | - Writing down suggestions from Nolan.
07:46:17.840 | Make it more fun, gamified.
07:46:20.640 | - Yeah, that's one thing that I really, really enjoy
07:46:22.820 | about WebGrid is 'cause I'm so competitive.
07:46:26.580 | Like the higher the BPS, the higher the score,
07:46:31.720 | I know the better I'm doing.
07:46:33.180 | And so if I, I think I've asked at one point,
07:46:36.060 | one of the guys, like if he could give me
07:46:38.460 | some sort of numerical feedback for calibration,
07:46:40.740 | like I would like to know what they're looking at.
07:46:42.820 | Like, oh, you know, it is, we see like this number
07:46:47.680 | while you're doing calibration.
07:46:48.780 | And that means at least on our end
07:46:50.460 | that we think calibration is going well.
07:46:52.460 | And I would love that because I would like to know
07:46:56.300 | if what I'm doing is going well or not.
07:46:58.220 | But then they've also told me like,
07:46:59.220 | yeah, not necessarily like one-to-one.
07:47:01.620 | It doesn't actually mean that calibration is going well
07:47:04.580 | in some ways.
07:47:06.200 | So it's not like 100% and they don't wanna like skew
07:47:09.300 | what I'm experiencing or want me to change things
07:47:12.240 | based on that.
07:47:13.080 | If that number isn't always accurate
07:47:14.700 | to like how the model will turn out
07:47:16.340 | or how like the end result,
07:47:17.920 | that's at least what I got from it.
07:47:19.900 | One thing I do, I have asked them
07:47:22.380 | and something that I really enjoy striving for
07:47:26.020 | is towards the end of calibration,
07:47:27.940 | there is like a time between targets.
07:47:31.420 | And so I like to keep like at the end,
07:47:34.520 | that number as low as possible.
07:47:35.820 | So at the beginning, it can be, you know,
07:47:37.300 | four or five, six seconds between me popping bubbles,
07:47:40.420 | but towards the end, I like to keep it below like 1.5
07:47:43.780 | or if I could get it to like one second between like bubbles
07:47:47.820 | because in my mind that translates really nicely
07:47:51.100 | to something like web grid,
07:47:52.180 | where I know if I can hit a target one every second
07:47:56.620 | that I'm doing real, real well.
07:47:58.580 | - There you go.
07:47:59.420 | That's a way to get a score on the calibrations,
07:48:00.860 | like the speed, how quickly can you get
07:48:02.920 | from bubble to bubble.
07:48:03.900 | - Yeah.
07:48:05.420 | - So there's the open loop
07:48:06.840 | and then it goes to the closed loop.
07:48:07.680 | And the closed loop can already start giving you a sense
07:48:11.580 | 'cause you're getting feedback
07:48:12.580 | of like how good the model is.
07:48:13.780 | - Yeah, so closed loop is when I first get cursor control
07:48:18.680 | and how they've described it to me,
07:48:21.240 | someone who does not understand this stuff.
07:48:23.660 | I am the dumbest person in the room every time
07:48:26.100 | I'm with any of these guys.
07:48:26.920 | - The humility, I appreciate it.
07:48:28.980 | - Is that I am closing the loop.
07:48:30.780 | So I am actually now the one that is like finishing
07:48:35.300 | the loop of whatever this loop is.
07:48:37.960 | I don't even know what the loop is, they've never told me.
07:48:40.100 | They just say there is a loop and at one point it's open
07:48:43.260 | and I can't control.
07:48:44.380 | And then I get control and it's closed.
07:48:46.940 | So I'm finishing the loop.
07:48:48.740 | - So how long the calibration usually take?
07:48:50.900 | You said like 10, 15 minutes?
07:48:52.300 | - Well, yeah, they're trying to get that number down
07:48:54.620 | pretty low.
07:48:55.740 | That's what we've been working on a lot recently
07:48:57.680 | is getting that down as low as possible.
07:49:00.000 | So that way, if this is something that people need to do
07:49:03.220 | on a daily basis, or if some people need to do
07:49:06.060 | on like every other day basis or once a week,
07:49:11.060 | they don't want people to be sitting in calibration
07:49:13.280 | for long periods of time.
07:49:15.000 | I think they wanted to get it down seven minutes or below,
07:49:18.840 | at least where we're at right now.
07:49:20.260 | It'd be nice if you never had to do calibration.
07:49:23.800 | So we'll get there at some point, I'm sure,
07:49:26.300 | the more we learn about the brain.
07:49:27.720 | And like, I think that's the dream.
07:49:32.300 | I think right now for me to get like really,
07:49:34.940 | really good models, I'm in calibration 40 or 45 minutes.
07:49:39.400 | And I don't mind, like I said, they always feel really bad,
07:49:43.680 | but if it's gonna get me a model that can like break
07:49:46.540 | these records on WebGrid, I'll stay in it
07:49:48.560 | for flipping two hours.
07:49:50.360 | - Let's talk business.
07:49:51.340 | So WebGrid, I saw a presentation where Bliss said
07:49:55.980 | by March, you selected 89,000 targets in WebGrid.
07:49:59.960 | Can you explain this game?
07:50:01.580 | What is WebGrid and what does it take
07:50:04.620 | to be a world-class performer in WebGrid
07:50:07.560 | as you continue to break world records?
07:50:09.620 | - Yeah.
07:50:10.460 | - It's like a gold medalist talk.
07:50:15.060 | Well, where do I begin?
07:50:15.900 | - Yeah, I'd like to thank everyone who's helped me get here,
07:50:19.540 | my coaches, my parents for driving me to practice
07:50:22.620 | every day at five in the morning.
07:50:24.620 | I'd like to thank God.
07:50:25.720 | And just overall, my dedication to my craft.
07:50:29.940 | - The interviews with athletes are always like that exact.
07:50:32.500 | It's like that template.
07:50:34.260 | - Yeah.
07:50:35.080 | - So WebGrid is a grid of cells.
07:50:40.860 | - It's literally just a grid.
07:50:44.180 | They can make it as big or small as you can make a grid.
07:50:48.240 | A single box on that grid will light up
07:50:50.500 | and you go and click it.
07:50:51.700 | And it is a way for them to benchmark how good a BCI is.
07:50:57.020 | So it's pretty straightforward.
07:51:00.100 | You just click targets.
07:51:01.620 | - Only one blue cell appears
07:51:03.860 | and you're supposed to move the mouse to there
07:51:05.580 | and click on it.
07:51:06.620 | - So I like playing on bigger grids
07:51:09.460 | 'cause the bigger the grid, the more BPS,
07:51:14.460 | it's bits per second that you get every time you click one.
07:51:18.980 | So I'll say I'll play on a 35 by 35 grid.
07:51:23.340 | And then one of those little squares,
07:51:25.820 | a cell, you can call it, target, whatever,
07:51:28.180 | will light up and you move the cursor there
07:51:29.900 | and you click it.
07:51:30.980 | And then you do that forever.
07:51:34.820 | - And you've been able to achieve at first
07:51:37.220 | eight bits per second and you've recently broke that.
07:51:40.220 | - Yeah, I'm at 8.5 right now.
07:51:42.380 | I would've beaten that literally the day
07:51:45.300 | before I came to Austin.
07:51:47.840 | But I had like a, I don't know,
07:51:50.500 | like a five second lag right at the end.
07:51:53.120 | And I just had to wait until the latency calmed down
07:51:56.980 | and then I kept clicking.
07:51:58.320 | But I was at like 8.01 and then five seconds of lag.
07:52:03.320 | And then the next like three targets I clicked
07:52:05.420 | all stayed at 8.01.
07:52:07.660 | So if I would've been able to click
07:52:09.420 | during that time of lag, I probably would've hit,
07:52:12.740 | I don't know, I might've hit nine.
07:52:14.220 | So I'm there, I'm like, I'm really close.
07:52:16.700 | And then this whole Austin trip
07:52:18.620 | has really gotten in the way
07:52:19.740 | of my web grid playing ability.
07:52:21.500 | - It's frustrating.
07:52:22.340 | - Yeah.
07:52:23.180 | - So that's all you're thinking about right now?
07:52:24.260 | - Yeah, I know.
07:52:25.820 | I just want, I want to do better.
07:52:28.060 | - At nine.
07:52:28.900 | - I want to do better.
07:52:29.720 | I want to hit nine.
07:52:30.560 | I think, well, I know nine is very, very achievable.
07:52:33.380 | I'm right there.
07:52:34.380 | I think 10 I could hit maybe in the next month.
07:52:39.140 | Like I could do it probably in the next few weeks
07:52:41.040 | if I really push.
07:52:41.880 | - I think you and Elon are basically the same person.
07:52:44.060 | 'Cause last time I did a podcast with him,
07:52:46.260 | he came in extremely frustrated
07:52:47.900 | that he can't beat Uber Lilith as a Druid.
07:52:50.940 | That was like a year ago, I think.
07:52:52.540 | I forget, like solo.
07:52:55.300 | And I could just tell there's some percentage
07:52:57.700 | of his brain the entire time was thinking,
07:52:59.700 | like, I wish I was right now attempting.
07:53:02.380 | - I think he did it that night.
07:53:03.500 | - He did it that night.
07:53:04.620 | He stayed up and did it that night.
07:53:07.500 | Just crazy to me.
07:53:09.180 | I mean, in a fundamental way, it's really inspiring.
07:53:12.660 | And what you're doing is inspiring in that way.
07:53:14.360 | 'Cause I mean, it's not just about the game.
07:53:17.060 | Everything you're doing there has impact.
07:53:20.160 | By striving to do well on WebGrid,
07:53:23.020 | you're helping everybody figure out
07:53:25.140 | how to create the system all along.
07:53:27.260 | Like the decoding, the software, the hardware,
07:53:30.420 | the calibration, all of it.
07:53:32.060 | How to make all of that work
07:53:33.460 | so you can do everything else really well.
07:53:36.420 | - Yeah, it's just really fun.
07:53:38.000 | - Well, that's also, that's part of the thing
07:53:40.460 | is making it fun.
07:53:42.380 | - Yeah, it's addicting.
07:53:43.740 | I've joked about what they actually did
07:53:48.540 | when they went in and put this thing in my brain.
07:53:50.560 | They must have flipped a switch
07:53:52.080 | to make me more susceptible to these kinds of games,
07:53:55.580 | to make me addicted to like WebGrid or something.
07:53:59.040 | Do you know Bliss's high score?
07:54:00.360 | - Yeah, he said like 14 or something.
07:54:02.120 | - 17.
07:54:02.960 | - Oh boy. - 17.1 or something.
07:54:04.720 | 17.1.
07:54:05.560 | - 17 on the dot, 17.01.
07:54:07.800 | - He told me he like does it on the floor with peanut butter
07:54:10.800 | and he like fasts, it's weird.
07:54:13.440 | That sounds like cheating.
07:54:14.720 | Sounds like performance enhancing.
07:54:17.080 | - Nolan, like the first time Nolan played this game,
07:54:19.340 | he asked, "How good are we at this game?"
07:54:21.260 | And I think he told me right then,
07:54:23.180 | "You're gonna try to beat me."
07:54:24.140 | - I'm gonna get there someday.
07:54:24.980 | - Yeah, I fully believe you.
07:54:26.180 | - I think I can.
07:54:27.100 | - I'm excited for that.
07:54:28.100 | - Yeah, so I've been playing, first off,
07:54:30.500 | with the dwell cursor,
07:54:32.780 | which really hampers my WebGrid playing ability.
07:54:36.700 | Basically, I have to wait 0.3 seconds for every click.
07:54:40.180 | - Oh, so you can't do the click.
07:54:41.580 | So you have to, so you click by dwelling.
07:54:44.500 | You said 0.3?
07:54:45.340 | - 0.3 seconds.
07:54:46.860 | Which, which sucks.
07:54:49.580 | It really slows down how much I'm able to,
07:54:53.580 | like how high I'm able to get.
07:54:55.140 | I still hit like 50, I think I hit like 50 something trials,
07:55:00.060 | net trials per minute in that, which was pretty good.
07:55:03.580 | 'Cause I'm able to like, there's one of the settings
07:55:07.300 | is also like how slow you need to be moving
07:55:10.620 | in order to initiate a click, to start a click.
07:55:13.180 | So I can tell sort of when I'm on that threshold
07:55:18.180 | to start initiating a click just a bit early.
07:55:21.940 | So I'm not fully stopped over the target when I go to click.
07:55:25.540 | I'm doing it like on my way to the targets a little,
07:55:28.500 | to try to time it just right.
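As a rough illustration of the dwell-cursor mechanic described here, the sketch below is a generic hover-to-click state machine: dropping below a speed threshold starts a 0.3-second timer, and speeding back up cancels it. The threshold value, units, and structure are assumptions for illustration, not the actual Link app parameters.

```python
import math
import time

class DwellClicker:
    def __init__(self, dwell_s=0.3, speed_threshold=50.0):
        self.dwell_s = dwell_s                  # hover time before a click fires
        self.speed_threshold = speed_threshold  # px/s; below this, dwelling begins
        self._dwell_start = None

    def update(self, vx, vy, now=None):
        """Feed cursor velocity in each frame; returns True when a click fires."""
        now = time.monotonic() if now is None else now
        if math.hypot(vx, vy) >= self.speed_threshold:
            self._dwell_start = None            # moving fast enough: cancel pending click
            return False
        if self._dwell_start is None:
            self._dwell_start = now             # slowed down: start the dwell timer
            return False
        if now - self._dwell_start >= self.dwell_s:
            self._dwell_start = None
            return True                         # dwell elapsed: click
        return False
```

Slowing to just under the threshold slightly before reaching the target, as described above, effectively starts the timer early so the click lands about when the cursor arrives.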
07:55:29.740 | - Wow, so you're slowing down.
07:55:31.860 | - Yeah, just a hair right before the targets.
07:55:34.660 | - This is like elite performance, okay.
07:55:37.180 | But that's still, it sucks that there's a ceiling
07:55:40.460 | of the 0.3.
07:55:41.820 | - Well, I can get down to 0.2 and 0.1, 0.1 is what I get.
07:55:45.540 | Yeah, and I've played with that a little bit too.
07:55:48.020 | I have to adjust a ton of different parameters
07:55:50.780 | in order to play with 0.1.
07:55:52.060 | And I don't have control over all that on my end yet.
07:55:55.340 | It also changes like how the models are trained.
07:55:58.380 | Like if I train a model, like in WebGrid,
07:56:00.540 | I like a bootstrap on a model,
07:56:01.860 | which basically is them training models
07:56:04.220 | as I'm playing WebGrid based off of like the WebGrid data
07:56:08.420 | that I'm, so like if I play WebGrid for 10 minutes,
07:56:10.620 | they can train off that data specifically
07:56:13.980 | in order to get me a better model.
07:56:15.940 | If I do that with 0.3 versus 0.1,
07:56:19.060 | the models come out different.
07:56:21.060 | The way that they interact, it's just much, much different.
07:56:25.060 | So I have to be really careful.
07:56:26.460 | I found that doing it with 0.3
07:56:28.460 | is actually better in some ways,
07:56:30.100 | unless I can do it with 0.1
07:56:31.900 | and change all of the different parameters,
07:56:33.940 | then that's more ideal
07:56:35.380 | 'cause obviously 0.1 is faster than 0.3.
07:56:37.540 | So I could get there.
07:56:41.900 | I can get there.
07:56:43.020 | - Can you click using your brain?
07:56:45.900 | - For right now, it's the hover clicking
07:56:47.740 | with the dwell cursor.
07:56:49.100 | We, before all the thread retraction stuff happened,
07:56:52.180 | we were calibrating clicks, left click, right click.
07:56:54.980 | That was my previous ceiling
07:56:58.180 | before I broke the record again with the dwell cursor
07:57:00.500 | was I think on a 35 by 35 grid with left and right click.
07:57:05.260 | And you get more BPS, more bits per second
07:57:09.580 | using multiple clicks 'cause it's more difficult.
07:57:12.180 | - Oh, because what is it?
07:57:14.580 | You're supposed to do either a left click or a right click.
07:57:17.940 | Is it different colors or something like this?
07:57:19.460 | - Yeah, blue targets for left click,
07:57:21.340 | orange targets for right click is what they had done.
07:57:23.620 | So my previous record of 7.5
07:57:26.500 | was with the blue and the orange targets, yeah.
07:57:29.340 | Which I think if I went back to that now
07:57:33.900 | doing the click calibration,
07:57:35.380 | I would be able to,
07:57:36.220 | and being able to like initiate clicks on my own,
07:57:38.460 | I think I would break that 10 ceiling
07:57:40.940 | like in a couple of days, max.
07:57:43.740 | - Yeah, you would start making Bliss nervous about his 17.
07:57:47.300 | - Why do you think we haven't given him the-
07:57:48.780 | - Yeah.
07:57:49.620 | - Exactly.
07:57:50.460 | So what did it feel like with the retractions
07:57:54.940 | that there is some of the threads retracted?
07:57:57.340 | - It sucked.
07:57:58.540 | It was really, really hard.
07:58:00.740 | The day they told me was the day of my big Neuralink tour
07:58:05.740 | at their Fremont facility.
07:58:09.620 | They told me like right before we went over there,
07:58:12.060 | it was really hard to hear.
07:58:13.580 | My initial reaction was, all right, go and fix it.
07:58:16.780 | Like go and take it out and fix it.
07:58:18.540 | The first surgery was so easy.
07:58:20.220 | Like I went to sleep, couple hours later I woke up
07:58:22.940 | and here we are.
07:58:24.700 | I didn't feel any pain,
07:58:26.300 | didn't take like any pain pills or anything.
07:58:29.860 | So I just knew that if they wanted to,
07:58:33.660 | they could go in and put in a new one like next day
07:58:37.140 | if that's what it took.
07:58:38.380 | 'Cause I just wanted it to be better
07:58:41.540 | and I wanted not to lose the capability.
07:58:45.660 | I had so much fun playing with it
07:58:49.460 | for a few weeks, for a month.
07:58:51.260 | I had, like it had opened up so many doors for me
07:58:54.420 | and it opened up so many more possibilities
07:58:56.500 | that I didn't want to lose it after a month.
07:58:58.660 | I thought it would have been a cruel twist of fate
07:59:01.740 | if I had gotten to see the view
07:59:06.340 | from like the top of this mountain
07:59:08.020 | and then have it all come crashing down after a month.
07:59:11.060 | And I knew like, say the top of the mountain,
07:59:13.220 | but like how I saw it was,
07:59:16.660 | I was just now starting to climb the mountain
07:59:19.740 | and I was like, there was so much more
07:59:22.340 | that I knew was possible.
07:59:23.660 | And so to have all of that be taken away
07:59:25.740 | was really, really hard.
07:59:27.180 | But then on the drive over to the facility,
07:59:32.180 | I don't know, like five minute drive, whatever it is,
07:59:35.380 | I talked with my parents about it.
07:59:39.620 | I prayed about it.
07:59:40.900 | I was just like, you know,
07:59:42.140 | I'm not gonna let this ruin my day.
07:59:44.380 | I'm not gonna let this ruin this amazing like tour
07:59:48.420 | that they have set up for me.
07:59:49.860 | Like I wanna go show everyone
07:59:52.180 | how much I appreciate all the work they're doing.
07:59:54.500 | I wanna go like meet all of the people
07:59:56.820 | who have made this possible.
07:59:58.380 | And I wanna go have one of the best days of my life.
08:00:01.060 | And I did, and it was amazing.
08:00:03.060 | And it absolutely was one of the best days
08:00:06.180 | I've ever been privileged to experience.
08:00:10.140 | And then for a few days, I was pretty down in the dumps.
08:00:13.780 | But for like the first few days afterwards,
08:00:17.260 | I was just like,
08:00:18.620 | I didn't know if it was ever gonna work again.
08:00:20.860 | And then I just, I made the decision
08:00:24.980 | that even if I lost the ability to use the Neuralink,
08:00:29.980 | even if I lost out on everything to come,
08:00:36.780 | if I could keep giving them data in any way,
08:00:40.020 | then I would do that.
08:00:41.100 | If I needed to just do like some of the data collection
08:00:44.780 | every day or body mapping every day for a year,
08:00:47.380 | then I would do it.
08:00:49.140 | Because I know that everything I'm doing
08:00:53.140 | helps everyone to come after me.
08:00:55.220 | And that's all I wanted.
08:00:56.660 | I guess the whole reason that I did this
08:00:58.700 | was to help people.
08:00:59.940 | And I knew that anything I could do to help,
08:01:02.140 | I would continue to do.
08:01:03.860 | Even if I never got to use the cursor again,
08:01:06.180 | then I was just happy to be a part of it.
08:01:09.660 | And everything that I'd done was just a perk.
08:01:12.780 | It was something that I got to experience.
08:01:14.380 | And I know how amazing it's gonna be
08:01:15.920 | for everyone to come after me.
08:01:16.920 | So might as well just keep trucking along.
08:01:22.340 | - That said, you were able to get to work your way up,
08:01:25.980 | to get the performance back.
08:01:27.460 | So this is like going from Rocky I to Rocky II.
08:01:30.060 | So when did you first realize that this is possible
08:01:34.060 | and what gave you sort of the strength,
08:01:35.980 | the motivation, the determination to do it,
08:01:38.780 | to increase back up and beat your previous record?
08:01:42.900 | - Yeah, it was within a couple of weeks.
08:01:44.260 | - Again, this feels like I'm interviewing an athlete.
08:01:46.420 | (laughs)
08:01:47.740 | This is great.
08:01:48.980 | I like to thank my parents.
08:01:50.940 | The road back was long and hard,
08:01:53.460 | fraught with many difficulties.
08:01:55.260 | There were dark days.
08:01:56.540 | It was a couple of weeks, I think.
08:02:02.140 | And then there was just a turning point.
08:02:04.380 | I think they had switched how they were measuring
08:02:09.340 | the neuron spikes in my brain.
08:02:11.780 | Like the bliss, help me out.
08:02:14.160 | - Yeah, the way in which we were measuring
08:02:16.820 | the behavior of individual neurons.
08:02:18.660 | So we're switching from sort of individual spike detection
08:02:21.340 | to something called spike band power,
08:02:22.980 | which if you watch the previous segments
08:02:24.820 | with either me or DJ, you probably have some context.
08:02:26.420 | - Yeah, okay.
08:02:27.260 | So when they did that, it was kind of like
08:02:29.580 | light over the head, like light bulb moment.
08:02:32.980 | Like, oh, this works.
08:02:34.460 | And this seems like we can run with this.
08:02:39.180 | And I saw the uptick in performance immediately.
08:02:44.140 | Like I could feel it when they switched over.
08:02:45.940 | I was like, this is better.
08:02:47.180 | Like, this is good.
08:02:48.300 | Like everything up till this point for the last few weeks,
08:02:51.260 | last like whatever, three or four weeks,
08:02:53.100 | 'cause it was before they even told me,
08:02:54.780 | like everything before this sucked.
08:02:56.820 | Like, let's keep doing what we're doing now.
08:02:59.060 | And at that point, it was not like,
08:03:01.820 | oh, I know I'm still only at like say in web grid terms,
08:03:05.740 | like four or five BPS compared to my 7.5 before.
08:03:09.680 | But I know that if we keep doing this,
08:03:12.180 | then like I can get back there.
08:03:16.020 | And then they gave me the dwell cursor
08:03:17.660 | and the dwell cursor sucked at first.
08:03:20.020 | It's not, obviously not what I want,
08:03:22.020 | but it gave me a path forward
08:03:26.140 | to be able to continue using it
08:03:27.900 | and hopefully to continue to help out.
08:03:31.040 | And so I just ran with it, never looked back.
08:03:34.220 | Like I said, I'm just the kind of person
08:03:35.940 | I roll with the punches anyway, so.
08:03:37.740 | - What was the process?
08:03:39.740 | What was the feedback loop on the figuring out
08:03:41.780 | how to do the spike detection
08:03:42.940 | in a way that would actually work well for Nolan?
08:03:45.060 | - Yeah, it's a great question.
08:03:45.940 | So maybe just describe first how the actual update worked.
08:03:49.460 | It was basically an update to your implant.
08:03:50.780 | So we just did an over-the-air software update
08:03:53.060 | to his implant, the same way you'd update
08:03:54.340 | your Tesla or your iPhone.
08:03:56.040 | And that firmware change enabled us
08:03:58.100 | to record sort of averages of populations of neurons
08:04:01.660 | nearby an individual electrode.
08:04:02.700 | So we have less resolution about which individual neuron
08:04:06.140 | is doing what, but we have a broader picture
08:04:07.980 | of what's going on nearby an electrode overall.
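To illustrate the switch being described, the sketch below contrasts the two feature types: counting individual threshold-crossing spikes on a channel versus measuring spike band power, the average energy in the spike-frequency band, which reflects the population of neurons near an electrode. The sampling rate, band edges, threshold, and bin size are generic textbook choices, not Neuralink's firmware parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 20_000  # assumed sampling rate in Hz, for illustration only

def spike_counts(raw, threshold_uv=-50.0, fs=FS, bin_ms=15):
    """Per-bin count of negative threshold crossings (individual spike detection)."""
    crossings = (raw[1:] < threshold_uv) & (raw[:-1] >= threshold_uv)
    bin_len = int(fs * bin_ms / 1000)
    n_bins = len(crossings) // bin_len
    return crossings[: n_bins * bin_len].reshape(n_bins, bin_len).sum(axis=1)

def spike_band_power(raw, fs=FS, band=(500.0, 3000.0), bin_ms=15):
    """Per-bin mean power in the spike band (population-level feature)."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, raw)
    bin_len = int(fs * bin_ms / 1000)
    n_bins = len(filtered) // bin_len
    return (filtered[: n_bins * bin_len] ** 2).reshape(n_bins, bin_len).mean(axis=1)
```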
08:04:10.500 | And that feedback loop, I mean,
08:04:12.560 | basically as Nolan described,
08:04:13.400 | it was immediate when we flipped that switch.
08:04:15.560 | I think the first day we did that,
08:04:16.880 | you had three or four BPS right out of the box.
08:04:18.880 | And that was a light bulb moment for,
08:04:20.720 | okay, this is the right path to go down.
08:04:22.560 | And from there, there's a lot of feedback
08:04:24.280 | around like how to make this useful for independent use.
08:04:27.200 | So what we care about ultimately is that
08:04:28.800 | you can use it independently to do whatever you want.
08:04:31.180 | And to get to that point, it required us
08:04:32.840 | to re-engineer the UX, as you talked about
08:04:34.680 | with the dwell cursor, to make it something
08:04:36.240 | that you can use independently without us
08:04:37.520 | needing to be involved all the time.
08:04:39.160 | And yeah, this is obviously the start of this journey still.
08:04:41.320 | Hopefully we get back to the places where you're doing
08:04:43.360 | multiple clicks and using that to control
08:04:46.160 | much more fluidly everything and much more naturally
08:04:48.400 | the applications that you're trying to interface with.
08:04:51.120 | - And most importantly, get that web grid number up.
08:04:55.280 | - Yeah. - Yes.
08:04:56.120 | - Yeah.
08:04:57.000 | - So how is the, on the hover click,
08:04:59.580 | do you accidentally click stuff sometimes?
08:05:02.520 | - Yep. - Like what's,
08:05:03.520 | how hard is it to avoid accidentally clicking?
08:05:05.680 | - I have to continuously keep it moving, basically.
08:05:09.680 | So like I said, there's a threshold
08:05:11.240 | where it will initiate a click.
08:05:13.020 | So if I ever drop below that, it'll start,
08:05:16.760 | and I have .3 seconds to move it before it clicks anything.
08:05:20.800 | And if I don't want it to ever get there,
08:05:22.480 | I just keep it moving at a certain speed
08:05:24.980 | and just constantly doing circles on screen,
08:05:28.040 | moving it back and forth to keep it from clicking stuff.
08:05:31.360 | I actually noticed a couple weeks back
08:05:35.640 | that I was, when I was not using the implant,
08:05:39.280 | I was just moving my hand back and forth or in circles.
08:05:43.660 | Like I was trying to keep the cursor from clicking
08:05:46.740 | and I was just doing it while I was trying to go to sleep.
08:05:49.460 | And I was like, "Okay, this is a problem."
08:05:51.100 | (laughing)
08:05:52.780 | - To avoid the clicking.
08:05:53.620 | I guess, does that create problems?
08:05:55.620 | Like when you're gaming, accidentally click a thing?
08:05:58.060 | - Yeah, yeah, it happens in chess.
08:06:01.380 | I've lost a number of games
08:06:04.700 | because I'll accidentally click something.
08:06:06.820 | - I think the first time I ever beat you
08:06:08.020 | was because of an accident. - Yeah, I misclicked, yeah.
08:06:10.480 | - That's a nice excuse, right?
08:06:11.720 | - Yeah, it is. - You can always,
08:06:12.560 | any time you lose, you can just say it was accidental.
08:06:15.760 | - Yeah.
08:06:16.600 | - You said the app improved a lot from version one
08:06:20.780 | when you first started using it.
08:06:21.980 | It was very different.
08:06:23.340 | So can you just talk about the trial and error
08:06:26.840 | that you went through with the team?
08:06:28.000 | Like 200 plus pages of notes?
08:06:30.300 | Like what's that process like of going back and forth
08:06:34.280 | and working together to improve the thing?
08:06:36.220 | - It's a lot of me just using it day in and day out
08:06:41.220 | and saying, "Hey, can you guys do this for me?
08:06:45.800 | "Give me this.
08:06:46.640 | "I wanna be able to do that.
08:06:48.580 | "I need this."
08:06:51.020 | I think a lot of it just doesn't occur to them, maybe,
08:06:56.020 | until someone is actually using the app, using the implant.
08:07:00.160 | It's just something
08:07:01.000 | that they just never would have thought of.
08:07:03.420 | Or it's very specific to even like me, maybe what I want.
08:07:08.420 | It's something I'm a little worried about
08:07:11.760 | with the next people that come is,
08:07:13.840 | maybe they will want things much different
08:07:18.460 | than how I've set it up
08:07:19.780 | or what the advice I've given the team.
08:07:22.260 | And they're gonna look at some of the things
08:07:24.880 | they've added for me, like that's a dumb idea.
08:07:27.420 | Like, why would he ask for that?
08:07:29.020 | And so I'm really looking forward to get the next people on
08:07:33.340 | because I guarantee that they're going to think of things
08:07:36.100 | that I've never thought of.
08:07:37.580 | They're gonna think of improvements.
08:07:39.300 | I'm like, "Wow, that's a really good idea.
08:07:40.960 | "I wish I would have thought of that."
08:07:43.500 | And then they're also gonna give me some pushback
08:07:45.940 | about like, "Yeah, what you are asking them to do here,
08:07:49.740 | "that's a bad idea.
08:07:50.800 | "Let's do it this way."
08:07:52.180 | And I'm more than happy to have that happen.
08:07:55.120 | But it's just a lot of different interactions
08:07:58.880 | with different games or applications.
08:08:03.300 | The internet, just with the computer in general,
08:08:05.820 | there's tons of bugs that end up popping up
08:08:09.780 | left, right, center.
08:08:11.740 | So it's just me trying to use it as much as possible
08:08:14.820 | and showing them what works and what doesn't work
08:08:17.180 | and what I would like to be better.
08:08:18.820 | And then they take that feedback
08:08:21.980 | and they usually create amazing things for me.
08:08:24.980 | They solve these problems
08:08:26.620 | in ways I would have never imagined.
08:08:28.860 | They're so good at everything they do.
08:08:32.340 | And so I'm just really thankful
08:08:33.500 | that I'm able to give them feedback
08:08:35.300 | and they can make something of it.
08:08:36.740 | 'Cause a lot of my feedback is like really dumb.
08:08:40.180 | It's just like, "I want this.
08:08:42.040 | "Please do something about it."
08:08:43.780 | And they'll come back super well thought out.
08:08:46.900 | And it's way better than anything I could have ever thought
08:08:49.780 | of or implemented myself.
08:08:51.260 | So they're just great.
08:08:52.500 | They're really, really cool.
08:08:53.740 | - As the BCI community grows,
08:08:56.860 | would you like to hang out
08:08:58.940 | with the other folks with Neuralinks?
08:09:01.140 | What relationship, if any,
08:09:02.420 | would you wanna have with them?
08:09:03.980 | 'Cause you said they might have a different set
08:09:06.020 | of ideas of how to use the thing.
08:09:09.200 | Would you be intimidated by their web grid performance?
08:09:13.480 | - No, no, I hope-- - Compete.
08:09:15.140 | - I hope day one they've wiped the floor with me.
08:09:19.280 | I hope they beat it and they crush it,
08:09:23.500 | double it if they can.
08:09:24.800 | Just because on one hand,
08:09:27.980 | it's only gonna push me to be better
08:09:30.060 | 'cause I'm super competitive.
08:09:31.300 | I want other people to push me.
08:09:34.460 | I think that is important for anyone
08:09:36.980 | trying to achieve greatness
08:09:39.580 | is they need other people around them
08:09:41.300 | who are going to push them to be better.
08:09:43.640 | And I even made a joke about it on X once,
08:09:46.620 | like once the next people get chosen,
08:09:48.840 | like cue buddy cop music.
08:09:50.580 | Like I'm just excited to have other people to do this with
08:09:53.980 | and to share experiences with.
08:09:55.820 | I'm more than happy to interact with them
08:09:57.420 | as much as they want.
08:09:58.620 | More than happy to give them advice.
08:10:01.480 | I don't know what kind of advice I could give them,
08:10:03.320 | but if they have questions, I'm more than happy.
08:10:05.620 | - What advice would you have
08:10:06.740 | for the next participant in the clinical trial?
08:10:10.300 | - That they should have fun with this
08:10:12.780 | because it is a lot of fun.
08:10:15.040 | And that I hope they work really, really hard
08:10:19.060 | because it's not just for us,
08:10:21.260 | it's for everyone that comes after us.
08:10:23.160 | And come to me if they need anything.
08:10:28.380 | And to go to Neuralink if they need anything.
08:10:30.820 | Man, Neuralink moves mountains.
08:10:33.060 | Like they do absolutely anything for me that they can.
08:10:36.980 | And it's an amazing support system to have.
08:10:40.220 | It puts my mind at ease for like so many things
08:10:46.060 | that I have had like questions about
08:10:48.980 | or so many things I want to do.
08:10:50.620 | And they're always there and that's really, really nice.
08:10:54.080 | And so I just, I would tell them not to be afraid
08:10:57.560 | to go to Neuralink with any questions that they have,
08:10:59.940 | any concerns, anything that they're looking to do with this
08:11:04.540 | and any help that Neuralink is capable of providing,
08:11:07.220 | I know they will.
08:11:08.320 | And I don't know.
08:11:12.380 | I don't know, just work your ass off
08:11:14.160 | because it's really important
08:11:17.180 | that we try to give our all to this.
08:11:20.260 | - So have fun and work hard.
08:11:21.900 | - Yeah, yeah, there we go.
08:11:23.380 | Maybe that's what I'll just start saying to people.
08:11:25.080 | Have fun, work hard.
08:11:26.340 | - Now you're a real pro athlete, just keep it short.
08:11:28.980 | Maybe it's good to talk about what you've been able to do
08:11:36.620 | now that you have a Neuralink implant.
08:11:38.380 | Like the freedom you gain from this way of interacting
08:11:43.340 | with the outside world.
08:11:44.740 | Like you play video games all night
08:11:47.700 | and you do that by yourself.
08:11:49.700 | - Yeah.
08:11:50.520 | - And that's a kind of freedom.
08:11:51.440 | Can you speak to that freedom that you gain?
08:11:53.340 | - Yeah, it's what all, I don't know,
08:11:56.580 | people in my position want.
08:11:57.940 | They just want more independence.
08:11:59.660 | The more load that I can take away
08:12:01.460 | from people around me, the better.
08:12:04.080 | If I'm able to interact with the world
08:12:07.100 | without using my family,
08:12:09.300 | without going through any of my friends,
08:12:12.940 | like needing them to help me with things, the better.
08:12:16.020 | If I'm able to sit up on my computer all night
08:12:19.220 | and not need someone to like sit me up,
08:12:23.500 | say like on my iPad, like in a position where I can use it
08:12:27.140 | and then have to have them wait up for me all night
08:12:30.060 | until I'm ready to be done using it.
08:12:31.860 | Like that, it takes a load off of all of us.
08:12:36.060 | And it's really like all I can ask for.
08:12:40.540 | It's something that, you know,
08:12:42.860 | I could never thank Neuralink enough for.
08:12:44.900 | I know my family feels the same way.
08:12:48.700 | You know, just being able to have the freedom
08:12:51.500 | to do things on my own at any hour of the day or night,
08:12:56.100 | it means the world to me.
08:12:58.100 | And I don't know.
08:13:02.740 | - When you're up at 2 a.m. playing WebGrid by yourself,
08:13:07.740 | I just imagine like it's darkness
08:13:09.820 | and there's just a light glowing and you're just focused.
08:13:12.980 | What's going through your mind?
08:13:18.080 | Or are you like in a state of flow
08:13:19.700 | where it's like the mind is empty,
08:13:21.180 | like those like Zen masters?
08:13:22.580 | - Yeah, generally it is me playing music of some sort.
08:13:26.780 | I have a massive playlist.
08:13:28.460 | And so I'm just like rocking out to music.
08:13:30.860 | And then it's also just like a race against time
08:13:34.940 | because I'm constantly looking at
08:13:37.700 | how much battery percentage I have left on my implant.
08:13:41.660 | Like, all right, I have 30%,
08:13:44.180 | which equates to, you know, X amount of time,
08:13:46.780 | which means I have to break this record
08:13:48.940 | in the next, you know, hour and a half
08:13:50.700 | or else it's not happening tonight.
08:13:52.500 | And so it's a little stressful when that happens.
08:13:56.820 | When it's like, when it's above 50%,
08:13:59.340 | I'm like, okay, like, I got time.
08:14:01.180 | It starts getting down to 30 and then 20.
08:14:03.860 | It's like, all right, 10%,
08:14:06.260 | a little pop-up's gonna pop up right here
08:14:07.980 | and it's gonna really screw my WebGrid flow.
08:14:10.380 | It's gonna tell me that, you know,
08:14:11.740 | like there's a, like a low battery,
08:14:14.060 | low battery pop-up comes up
08:14:15.860 | and I'm like, it's really gonna screw me over.
08:14:17.220 | So if I have to, if I'm gonna break this record,
08:14:19.860 | I have to do it in the next like 30 seconds
08:14:22.220 | or else that pop-up's gonna get in the way,
08:14:23.780 | like cover my WebGrid.
08:14:25.060 | And then it, after that, I go click on it,
08:14:27.820 | go back into WebGrid and I'm like, all right,
08:14:29.340 | that means I have, you know,
08:14:30.500 | 10 minutes left before this thing's dead.
08:14:33.300 | That's what's going on in my head,
08:14:34.620 | generally that and whatever song is playing.
08:14:36.940 | And I just, I just want,
08:14:39.780 | I want to break those records so bad.
08:14:42.640 | Like, it's all I want when I'm playing WebGrid.
08:14:45.040 | It has become less of like,
08:14:47.800 | oh, this is just a leisurely activity.
08:14:49.860 | Like, I just enjoy doing this
08:14:51.380 | because it just feels so nice.
08:14:53.180 | And it puts me at ease.
08:14:54.300 | It is now, once I'm in WebGrid,
08:14:56.380 | you better break this record
08:14:57.700 | or you're gonna waste like five hours
08:14:59.260 | of your life right now.
08:15:00.700 | And I don't know, it's just fun.
08:15:04.540 | It's fun, man.
08:15:05.640 | - Have you ever tried WebGrid
08:15:06.740 | with like two targets and three targets?
08:15:08.540 | Can you get higher BPS with that?
08:15:10.780 | - Can you do that?
08:15:12.300 | - You mean like different color targets?
08:15:13.620 | - Oh, multiple targets, that changes the thing.
08:15:16.420 | - Yeah, so BPS is a log of number of targets
08:15:18.820 | times correct minus incorrect divided by time.
08:15:21.460 | And so you can think of like different clicks
08:15:23.140 | as basically doubling the number of active targets.
08:15:25.760 | - Got it.
08:15:26.600 | - So, you know, you basically get higher BPS,
08:15:28.260 | the more options there are,
08:15:29.140 | the more difficult the task.
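Written out, the metric described here matches the bitrate commonly used to benchmark BCIs: a log of the number of targets, times net-correct selections, divided by time. Whether the scoring uses N or N - 1 targets isn't stated, so this is an illustrative version rather than the exact WebGrid code.

```python
import math

def webgrid_bps(n_targets, correct, incorrect, seconds, click_types=1):
    """Illustrative WebGrid-style bitrate; each extra click type doubles the options."""
    effective_n = n_targets * (2 ** (click_types - 1))
    net = max(correct - incorrect, 0)
    return math.log2(effective_n - 1) * net / seconds

# Worked example: a 35 x 35 grid has 1,225 cells, so one net-correct selection
# per second is roughly log2(1224), about 10.3 bits per second.
print(webgrid_bps(35 * 35, correct=60, incorrect=0, seconds=60))  # ~10.26
```

That arithmetic is also why averaging about one second per target, as mentioned earlier, is an excellent pace, and why adding a second click type, which doubles the effective number of options, adds roughly one more bit per selection.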
08:15:30.700 | And there's also like Zen mode you've played in before,
08:15:33.020 | which is like infinite canvas.
08:15:35.300 | - It covers the whole screen with a grid.
08:15:38.020 | And I don't know.
08:15:40.180 | - What?
08:15:41.540 | - And so you can go like, that's insane.
08:15:45.140 | - He doesn't like it 'cause it didn't show BPS.
08:15:47.100 | So like, you know.
08:15:47.980 | - Oh yeah.
08:15:49.380 | - I had them put in a giant BPS in the background.
08:15:53.040 | So now it's like the opposite of Zen mode.
08:15:55.300 | It's like super hard mode, like just metal mode
08:15:59.420 | if it's just like a giant number in the background.
08:16:01.460 | - We should rename that, metal mode is a much better name.
08:16:05.140 | - So you also play "Civilization VI."
08:16:08.260 | - I love "Civilization VI," yeah.
08:16:10.100 | - You usually go with "Korea."
08:16:11.500 | - I do, yeah.
08:16:12.980 | So the great part about "Korea"
08:16:14.660 | is they focus on like science tech victories,
08:16:18.660 | which was not planned.
08:16:20.100 | Like I've been playing "Korea" for years
08:16:22.340 | and then all of the neural link stuff happened.
08:16:25.220 | So it kind of aligns.
08:16:26.760 | But what I've noticed with tech victories
08:16:32.100 | is if you can just rush tech, rush science,
08:16:37.380 | then you can do anything.
08:16:39.660 | Like at one point in the game,
08:16:41.460 | you will be so far ahead of everyone technologically
08:16:44.980 | that you will have like musket men, infantry men,
08:16:49.340 | planes sometimes, and people will still be fighting
08:16:51.540 | with like bows and arrows.
08:16:53.060 | And so if you want to win a domination victory,
08:16:56.300 | you just get to a certain point with the science
08:16:59.020 | and then go and wipe out the rest of the world.
08:17:01.620 | Or you can just take science all the way and win that way.
08:17:06.900 | And you're gonna be so far ahead of everyone
08:17:08.700 | 'cause you're producing so much science
08:17:10.080 | that it's not even close.
08:17:11.720 | I've accidentally won in different ways
08:17:16.440 | just by focusing on science.
08:17:18.420 | - Accidentally won by focusing on science.
08:17:20.480 | - I was, yeah, I like, I was playing only science,
08:17:24.540 | obviously, like just science all the way, just tech.
08:17:27.780 | And I was trying to get like every tech
08:17:30.100 | in the tech tree and stuff.
08:17:31.420 | And then I accidentally won through a diplomatic victory
08:17:34.980 | and I was so mad.
08:17:36.540 | I was so mad 'cause it just, like, ends the game in one turn.
08:17:40.580 | It was like, oh, you won, you're so diplomatic.
08:17:42.780 | I'm like, I don't wanna do this.
08:17:43.860 | I should have declared war on more people or something.
08:17:47.420 | It was terrible.
08:17:48.240 | But you don't need like giant civilizations with tech,
08:17:51.700 | especially with Korea, you can keep it pretty small.
08:17:54.340 | So I generally just get to a certain military unit
08:17:58.100 | and put them all around my border to keep everyone out.
08:18:01.620 | And then I will just build up.
08:18:03.580 | So very isolationist.
08:18:05.660 | - Nice, just work on the science of the tech.
08:18:07.780 | - Yep, that's it.
08:18:08.900 | - You're making it sound so fun.
08:18:10.300 | - It's so much fun.
08:18:11.140 | - And I also saw the "Civilization VII" trailer.
08:18:13.420 | - Oh man, I'm so pumped.
08:18:14.620 | - And that's probably coming out.
08:18:16.180 | - Come on, Civ VII, hit me up.
08:18:17.660 | I'll alpha, beta test, whatever.
08:18:20.060 | - Wait, when is it coming out?
08:18:21.540 | - 2025.
08:18:22.380 | - Yeah, yeah, next year, yeah.
08:18:23.940 | What other stuff would you like to see improved
08:18:26.180 | about the Neuralink app and just the entire experience?
08:18:29.540 | - I would like to, like I said, get back to the,
08:18:35.020 | like click-on-demand, like the regular clicks.
08:18:37.420 | That would be great.
08:18:38.580 | I would like to be able to connect to more devices.
08:18:42.620 | Right now it's just the computer.
08:18:44.260 | I'd like to be able to use it on my phone
08:18:47.180 | or use it on different consoles, different platforms.
08:18:51.460 | I'd like to be able to control
08:18:53.860 | as much stuff as possible, honestly.
08:18:55.660 | Like an Optimus robot would be pretty cool.
08:19:00.060 | That would be sick if I could control an Optimus robot.
08:19:03.500 | The Link app itself, it seems like we are getting
08:19:08.300 | pretty dialed in to what it might look like down the road.
08:19:13.300 | Seems like we've gotten through a lot
08:19:18.860 | of what I want from it, at least.
08:19:21.580 | The only other thing I would say is like more control
08:19:24.820 | over all the parameters that I can tweak
08:19:29.420 | with my like cursor and stuff.
08:19:32.500 | There's a lot of things that, you know,
08:19:34.420 | go into how the cursor moves in certain ways.
08:19:38.260 | And I have, I don't know,
08:19:39.900 | like three or four of those parameters.
08:19:41.900 | - Like gain and friction and all that?
08:19:43.340 | - Gain, friction, yeah.
08:19:44.940 | And there's maybe double the amount of those
08:19:47.340 | with just like velocity
08:19:49.260 | and then with the actual dwell cursor.
08:19:51.500 | So I would like all of it.
08:19:54.140 | I want as much control over my environment as possible.
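For a sense of what parameters like gain and friction might do, here is a minimal sketch of a damped cursor update. The update rule, parameter names, and default values are illustrative assumptions, not the Link app's actual cursor model.

```python
from dataclasses import dataclass

@dataclass
class CursorParams:
    gain: float = 1.5      # scales decoded intent into screen velocity
    friction: float = 0.2  # fraction of velocity bled off each frame (damping)

def step_cursor(pos, vel, decoded, params: CursorParams, dt: float = 1 / 60):
    """Advance the cursor one frame given a decoded 2D intent vector."""
    vx = (vel[0] + params.gain * decoded[0]) * (1.0 - params.friction)
    vy = (vel[1] + params.gain * decoded[1]) * (1.0 - params.friction)
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)

# Higher gain means faster travel for the same intent; higher friction means the
# cursor settles sooner, which can make precise dwell selections easier.
pos, vel = (0.0, 0.0), (0.0, 0.0)
pos, vel = step_cursor(pos, vel, decoded=(0.8, -0.1), params=CursorParams())
```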
08:19:57.820 | - So you want like advanced mode, you know,
08:19:59.980 | like in like there's menus, usually there's basic mode
08:20:02.820 | and you're like one of those folks,
08:20:04.100 | like the power user, advanced.
08:20:06.940 | - That's what I want.
08:20:08.540 | I want as much control over this as possible.
08:20:11.580 | So yeah, that's really all I can ask for.
08:20:13.820 | Just give me everything.
08:20:15.940 | - Has speech been useful?
08:20:20.740 | Like just being able to talk also
08:20:22.260 | in addition to everything else?
08:20:23.660 | - Yeah, you mean like while I'm using it?
08:20:25.420 | - While you're using it, like speech to text?
08:20:28.060 | - Oh yeah.
08:20:28.900 | - Do you type or like, 'cause there's also a keyboard?
08:20:30.980 | - Yeah, yeah, so there's a virtual keyboard.
08:20:32.580 | That's another thing I would like to work more on
08:20:34.700 | is finding some way to type or text in a different way.
08:20:39.700 | Right now it is like a dictation basically
08:20:44.220 | and a virtual keyboard that I can use with the cursor.
08:20:47.100 | But we've played around with like finger spelling,
08:20:50.940 | like sign language finger spelling.
08:20:53.460 | And that seems really promising.
08:20:55.500 | So I have this thought in my head
08:20:58.020 | that it's going to be a very similar learning curve
08:21:03.020 | that I had with the cursor,
08:21:05.820 | where I went from attempted movement
08:21:07.300 | to imagined movement at one point.
08:21:08.980 | I have a feeling, this is just my intuition,
08:21:12.020 | that at some point I'm going to be doing finger spelling
08:21:15.740 | and I won't need to actually attempt
08:21:17.460 | to finger spell anymore.
08:21:18.620 | That I'll just be able to think the like letter
08:21:21.740 | that I want and it'll pop up.
08:21:24.020 | - That'll be epic.
08:21:25.340 | That's challenging, that's hard.
08:21:26.780 | That's a lot of work for you to kind of take that leap,
08:21:29.820 | but that would be awesome.
08:21:30.740 | - And then like going from letters to words
08:21:33.420 | is another step.
08:21:34.260 | Like you would go from, you know,
08:21:35.740 | right now it's finger spelling
08:21:36.860 | and like just the sign language alphabet.
08:21:38.860 | But if it's able to pick that up,
08:21:40.300 | then it should be able to pick up
08:21:41.820 | like the whole sign language, like language.
08:21:44.860 | And so then if I could do something along those lines
08:21:48.420 | or just the sign language spelled word,
08:21:51.740 | if I can, you know, spell it at a reasonable speed
08:21:55.060 | and it can pick that up,
08:21:56.540 | then I would just be able to think that through
08:21:59.660 | and it would do the same thing.
08:22:01.220 | I don't see why not.
08:22:02.780 | After what I saw with the cursor control,
08:22:07.060 | I don't see why it wouldn't work,
08:22:08.700 | but we'd have to play around with it more.
08:22:10.320 | - What was the process in terms of like training yourself
08:22:12.940 | to go from attempted movement to imagined movement?
08:22:15.540 | How long did that take?
08:22:17.020 | So like how long would this kind of process take?
08:22:19.020 | - Well, it was a couple of weeks
08:22:20.580 | before it just like happened upon me.
08:22:22.960 | But now that I know that that was possible,
08:22:25.920 | I think I could make it happen with other things.
08:22:28.620 | I think it would be much, much simpler.
08:22:32.020 | - Would you get an upgraded implant device?
08:22:34.420 | - Sure, absolutely.
08:22:36.420 | Whenever they'll let me.
08:22:37.780 | - So you don't have any concerns
08:22:40.380 | for you with the surgery experience,
08:22:41.780 | all of it was like no regrets?
08:22:45.620 | - No.
08:22:46.460 | - So everything's been good so far?
08:22:47.820 | - Yep.
08:22:49.100 | - You just keep getting upgrades?
08:22:50.620 | - Yeah, I mean, why not?
08:22:52.340 | I've seen how much it's impacted my life already
08:22:54.740 | and I know that everything from here on out
08:22:57.520 | it's just gonna get better and better.
08:22:58.580 | So I would love to, I would love to get the upgrade.
08:23:02.120 | - What future capabilities are you excited about
08:23:06.760 | sort of beyond this kind of telepathy?
08:23:09.820 | Is vision interesting?
08:23:12.000 | So for folks who, for example, who are blind,
08:23:14.300 | so Neuralink enabling people to see or for speech?
08:23:19.300 | - Yeah, there's a lot that's very, very cool about this.
08:23:22.240 | I mean, we're talking about the brain.
08:23:23.400 | So there's like, this is just motor cortex stuff.
08:23:26.080 | There's so much more that can be done.
08:23:28.040 | The vision one is fascinating to me.
08:23:30.960 | I think that is going to be very, very cool
08:23:33.580 | to give someone the ability to see
08:23:34.960 | for the first time in their life would just be,
08:23:37.400 | I mean, it might be more amazing
08:23:39.500 | than even helping someone like me.
08:23:41.440 | Like that just sounds incredible.
08:23:43.960 | The speech thing is really interesting,
08:23:46.240 | being able to have some sort of like real time translation
08:23:49.840 | and cut away that language barrier would be really cool.
08:23:54.240 | Any sort of like actual impairments that it could solve,
08:23:58.440 | like with speech would be very, very cool.
08:24:00.800 | And then also there are a lot of different disabilities
08:24:03.760 | that all originate in the brain.
08:24:06.600 | And you would be able to,
08:24:08.320 | hopefully be able to solve a lot of those.
08:24:10.640 | I know there's already stuff to help people with seizures
08:24:14.280 | that can be implanted in the brain.
08:24:17.320 | This would do, I imagine, the same thing.
08:24:20.560 | And so you could do something like that.
08:24:22.440 | I know that, even someone like Joe Rogan
08:24:25.560 | has talked about the possibilities
08:24:28.200 | with being able to stimulate the brain in different ways.
08:24:33.200 | I'm not sure what,
08:24:39.480 | like how ethical a lot of that would be.
08:24:41.840 | That's beyond me, honestly.
08:24:44.880 | But I know that there's a lot that can be done
08:24:47.160 | when we're talking about the brain
08:24:48.720 | and being able to go in and physically make changes
08:24:52.840 | to help people or to improve their lives.
08:24:55.600 | So I'm really looking forward
08:24:56.840 | to everything that comes from this.
08:24:58.520 | And I don't think it's all that far off.
08:25:00.680 | I think a lot of this can be implemented
08:25:04.120 | within my lifetime, assuming that I live a long life.
08:25:07.760 | - What you're referring to is things like
08:25:10.280 | people suffering from depression
08:25:11.800 | or things of that nature potentially getting help.
08:25:14.800 | - Yeah.
08:25:15.640 | You can flip a switch like that and make someone happy.
08:25:19.040 | I know, I think Joe has talked about it more
08:25:21.200 | in terms of like, you want to experience
08:25:24.600 | like what a drug trip feels like.
08:25:27.120 | Like, you wanna experience what it's like to be on--
08:25:29.600 | - Of course.
08:25:30.440 | - Yeah, mushrooms or something like that.
08:25:31.600 | DMT, like you can just flip that switch in the brain.
08:25:34.800 | My buddy Bain has talked about being able
08:25:37.080 | to like wipe parts of your memory
08:25:39.440 | and re-experience things, like, for the first time,
08:25:42.240 | like your favorite movie or your favorite book,
08:25:44.520 | like just wipe that out real quick
08:25:46.320 | and then re-fall in love with Harry Potter or something.
08:25:49.880 | I told him, I was like, I don't know how I feel
08:25:51.560 | about like people being able to just wipe parts
08:25:53.920 | of your memory.
08:25:55.280 | That seems a little sketchy to me.
08:25:56.400 | He's like, they're already doing it, so.
08:25:58.360 | - Sounds legit.
08:26:00.800 | I would love memory replay.
08:26:04.000 | Just like actually like high resolution replay
08:26:06.480 | of old memories.
08:26:07.320 | - Yeah, I saw an episode of "Black Mirror" about that once.
08:26:09.680 | I don't think I want it.
08:26:11.120 | - Yeah, so "Black Mirror" is always kind of considered
08:26:13.480 | as the worst case, which is important.
08:26:15.480 | I think people don't consider the best case
08:26:18.440 | or the average case enough.
08:26:19.840 | I don't know what it is about us humans.
08:26:21.720 | We want to think about the worst possible thing.
08:26:24.760 | We love drama.
08:26:25.900 | It's like, how is this new technology
08:26:28.320 | gonna kill everybody?
08:26:30.120 | We just love that.
08:26:30.960 | We get like, yes, let's watch.
08:26:32.520 | - Hopefully people don't think about that too much with me.
08:26:34.520 | It'll ruin a lot of my plans.
08:26:36.240 | - Yeah, I assume you're gonna have to take over the world.
08:26:40.120 | I mean, I love your Twitter.
08:26:41.760 | You tweeted, "I'd like to make jokes
08:26:43.920 | "about hearing voices in my head
08:26:45.360 | "since getting the Neuralink,
08:26:46.400 | "but I feel like people would take it the wrong way.
08:26:48.660 | "Plus the voices in my head told me not to."
08:26:51.400 | - Yeah.
08:26:52.240 | - Yeah. - Yeah.
08:26:53.200 | - Please never stop.
08:26:54.220 | So you were talking about Optimus.
08:26:57.100 | Is that something you would love to be able to do
08:27:02.240 | to control the robotic arm or the entirety of Optimus?
08:27:05.520 | - Oh yeah, for sure.
08:27:06.800 | For sure, absolutely.
08:27:07.880 | - You think there's something like fundamentally different
08:27:09.600 | about just being able to physically interact with the world?
08:27:12.320 | - Yeah, oh, 100%.
08:27:14.080 | I know another thing with being able to give people
08:27:21.240 | the ability to feel sensation and stuff too
08:27:24.560 | by going in with the brain and having the Neuralink
08:27:26.920 | maybe do that, that could be something
08:27:28.760 | that could be transferred through the Optimus as well.
08:27:33.760 | There's all sorts of really cool interplay between that.
08:27:38.360 | And then also, like you said, just physically interacting.
08:27:41.360 | I mean, 99% of the things that I can't do myself
08:27:46.360 | obviously I need a caretaker for,
08:27:50.640 | someone to physically do things for me.
08:27:52.400 | If an Optimus robot could do that,
08:27:55.020 | I could live an incredibly independent life
08:27:58.760 | and not be such a burden on those around me.
08:28:01.220 | And it would change the way people like me live
08:28:08.640 | at least until whatever this is gets cured.
08:28:11.160 | But being able to interact with the world physically
08:28:14.480 | like that would just be amazing.
08:28:16.080 | And it's not just for having it be a caretaker
08:28:22.760 | or something, but something like I talked about,
08:28:25.360 | just being able to read a book.
08:28:27.040 | Imagine an Optimus robot just being able
08:28:28.960 | to hold a book open in front of me,
08:28:30.880 | get that smell again.
08:28:32.160 | I might not be able to feel it at that point,
08:28:35.160 | or maybe I could again with the sensation and stuff,
08:28:37.680 | but there's something different about reading
08:28:40.360 | like a physical book than staring at a screen
08:28:42.800 | or listening to an audio book.
08:28:44.120 | I actually don't like audio books.
08:28:45.960 | I've listened to a ton of them at this point,
08:28:47.960 | but I don't really like 'em.
08:28:49.720 | I would much rather read a physical copy.
08:28:51.960 | - So one of the things you would love
08:28:53.440 | to be able to experience is opening the book,
08:28:57.060 | bringing it up to you and to feel the touch of the paper.
08:29:01.640 | - Yeah, oh man, the touch, the smell.
08:29:05.080 | I mean, it's just something about the words on the page.
08:29:08.520 | And they've replicated that page color
08:29:12.320 | on like the Kindle and stuff.
08:29:13.960 | Yeah, it's just not the same.
08:29:15.800 | Yeah, so just something as simple as that.
08:29:18.280 | - So one of the things you miss is touch.
08:29:20.840 | - I do, yeah.
08:29:22.400 | A lot of things that I interact with in the world,
08:29:25.620 | like clothes or literally any physical thing
08:29:29.240 | that I interact with in the world,
08:29:31.040 | a lot of times what people around me will do
08:29:33.520 | is they'll just come like rub it on my face.
08:29:35.840 | They'll like lay something on me so I can feel the weight.
08:29:38.520 | They will rub a shirt on me so I can feel fabric.
08:29:42.760 | Like there's something very profound about touch.
08:29:46.660 | And it's something that I miss a lot
08:29:50.680 | and something I would love to do again.
08:29:56.040 | We'll see.
08:29:56.880 | - What would be the first thing you do
08:29:58.120 | with a hand that can touch?
08:30:00.760 | Give your mom a hug after that, right?
08:30:02.760 | - Yeah, I know.
08:30:04.120 | It's one thing that I've asked God for basically every day
08:30:09.120 | since my accident was just being able to one day move,
08:30:14.840 | even if it was only my hand,
08:30:17.520 | so that way I could squeeze my mom's hand or something
08:30:20.880 | just to show her how much I care
08:30:23.480 | and how much I love her and everything.
08:30:25.880 | Something along those lines,
08:30:27.960 | being able to just interact with the people around me,
08:30:31.160 | handshake, give someone a hug.
08:30:33.100 | I don't know, anything like that.
08:30:36.400 | Being able to help me eat.
08:30:37.880 | I'd probably get really fat,
08:30:40.760 | which would be a terrible, terrible thing.
08:30:44.640 | - Also beat Bliss in chess on a physical chess board.
08:30:47.800 | - Yeah, yeah.
08:30:48.760 | I mean, there are just so many upsides.
08:30:50.920 | Any way to find some way to feel like
08:30:56.560 | I'm bringing Bliss down to my level
08:30:58.920 | 'cause he's just such an amazing guy
08:31:02.080 | and everything about him is just so above and beyond
08:31:05.560 | that anything I can do to take him down a notch,
08:31:09.720 | I'm really happy.
08:31:10.840 | - Yeah, humble him a bit, he needs it.
08:31:12.840 | - Yeah.
08:31:13.680 | - Okay, as he's sitting next to me.
08:31:15.980 | Did you ever make sense of why God puts good people
08:31:21.440 | through such hardship?
08:31:22.600 | - Oh, man.
08:31:26.000 | Um, I think it's all about
08:31:30.720 | understanding how much we need God
08:31:36.120 | and I don't think that there's any light without the dark.
08:31:41.120 | I think that if all of us were happy all the time,
08:31:44.420 | there would be no reason to turn to God ever.
08:31:50.320 | I feel like there would be no concept
08:31:53.520 | of good or bad.
08:31:55.720 | And I think that as much of the darkness
08:32:00.720 | and the evil that's in the world,
08:32:02.920 | it makes us all appreciate the good
08:32:06.480 | and the things we have so much more.
08:32:08.320 | And I think when I had my accident,
08:32:12.440 | one of the first things I said to one of my best friends was,
08:32:15.720 | and this was within the first month or two after my accident,
08:32:18.280 | I said, "Everything about this accident
08:32:20.520 | "has just made me understand and believe
08:32:24.000 | "that God is real and that there really is a God."
08:32:27.120 | Basically in that my interactions with him
08:32:30.360 | have all been real and worthwhile.
08:32:33.000 | And he said that, if anything,
08:32:34.600 | seeing me go through this accident
08:32:36.200 | made him believe that there isn't a God.
08:32:39.760 | And it's a very different reaction.
08:32:42.060 | But I believe that it is a way for God
08:32:45.980 | to test us, to build our character,
08:32:48.440 | to send us through trials and tribulations,
08:32:53.440 | to make sure that we understand how precious he is
08:32:58.560 | and the things that he's given us
08:33:00.320 | and the time that he's given us,
08:33:01.800 | and then to hopefully grow from all of that.
08:33:06.080 | I think that's a huge part of being here
08:33:07.840 | is to not just have an easy life
08:33:12.840 | and do everything that's easy,
08:33:16.360 | but to step out of our comfort zones
08:33:18.000 | and really challenge ourselves
08:33:19.840 | because I think that's how we grow.
08:33:22.000 | - What gives you hope about this whole thing
08:33:24.160 | we have going on, human civilization?
08:33:26.840 | - Oh man, I think people are my biggest inspiration.
08:33:31.840 | Even just being at Neuralink for a few months,
08:33:38.880 | looking people in the eyes and hearing their motivations
08:33:42.700 | for why they're doing this, it's so inspiring.
08:33:47.440 | And I know that they could be other places
08:33:50.480 | at cushier jobs, working somewhere else,
08:33:54.320 | doing X, Y, or Z that doesn't really mean that much,
08:33:58.760 | but instead they're here and they want to better humanity
08:34:03.760 | and they want to better just the people around them,
08:34:06.800 | the people that they've interacted with in their life.
08:34:08.880 | They want to make better lives for their own family members
08:34:11.920 | who might have disabilities,
08:34:13.080 | or they look at someone like me and they say,
08:34:15.400 | "I can do something about that, so I'm going to."
08:34:18.080 | And it's always been what I've connected with most
08:34:20.880 | in the world are people.
08:34:22.880 | I've always been a people person
08:34:24.240 | and I love learning about people,
08:34:26.000 | and I love learning how people developed
08:34:29.180 | and where they came from,
08:34:30.640 | and to see how much people are willing to do
08:34:33.960 | for someone like me when they don't have to,
08:34:36.400 | and they're going out of their way to make my life better.
08:34:39.960 | It gives me a lot of hope for just humanity in general,
08:34:42.720 | how much we care and how much we're capable of
08:34:46.440 | when we all kind of get together
08:34:49.060 | and try to make a difference.
08:34:50.620 | And I know there's a lot of bad out there in the world,
08:34:54.480 | but there always has been and there always will be.
08:34:56.980 | And I think that that is, it shows human resiliency
08:35:03.260 | and it shows what we're able to endure
08:35:07.920 | and how much we just want to be there and help each other
08:35:12.920 | and how much satisfaction we get from that,
08:35:18.440 | because I think that's one of the reasons that we're here
08:35:20.680 | is just to help each other.
08:35:22.080 | And I don't know, that always gives me hope,
08:35:27.080 | is just realizing that there are people out there
08:35:29.280 | who still care and who want to help.
08:35:31.080 | - And thank you for being one such human being
08:35:34.320 | and continuing to be a great human being
08:35:36.360 | through everything you've been through
08:35:39.040 | and being an inspiration to many people,
08:35:41.200 | to myself, for many reasons,
08:35:43.120 | including your epic, unbelievably great performance
08:35:47.160 | on WebGrid.
08:35:48.220 | I will be training all night tonight to try to catch up.
08:35:52.160 | - You can do it.
08:35:53.000 | - And I believe in you that you can, once you come back,
08:35:55.800 | so sorry to interrupt with the Austin trip,
08:35:57.680 | once you come back, eventually beat Bliss.
08:36:00.600 | - Yeah, yeah, for sure, absolutely.
08:36:02.920 | - I'm rooting for you, the whole world is rooting for you.
08:36:05.520 | Thank you for everything you've done, man.
08:36:07.160 | - Thanks, thanks, man.
08:36:09.060 | - Thanks for listening to this conversation
08:36:11.120 | with Noland Arbaugh, and before that with Elon Musk,
08:36:14.300 | DJ Seo, Matthew MacDougall, and Bliss Chapman.
08:36:17.840 | To support this podcast,
08:36:19.040 | please check out our sponsors in the description.
08:36:21.880 | And now, let me leave you with some words
08:36:23.980 | from Aldous Huxley in "The Doors of Perception."
08:36:28.360 | We live together.
08:36:30.160 | We act on and react to one another.
08:36:32.960 | But always, and in all circumstances, we are by ourselves.
08:36:37.960 | The martyrs go hand in hand into the arena.
08:36:41.320 | They are crucified alone.
08:36:43.740 | Embraced, the lovers desperately try to fuse
08:36:46.140 | their insulated ecstasies
08:36:48.040 | into a single self-transcendence in vain.
08:36:51.720 | By its very nature,
08:36:53.920 | every embodied spirit is doomed to suffer
08:36:56.680 | and enjoy in solitude.
08:36:58.700 | Sensations, feelings, insights, fancies,
08:37:01.000 | all these are private and, except through symbols
08:37:05.000 | and at second hand, incommunicable.
08:37:08.240 | We can pool information about experiences,
08:37:11.720 | but never the experiences themselves.
08:37:14.400 | From family to nation, every human group
08:37:17.560 | is a society of island universes.
08:37:20.880 | Thank you for listening, and hope to see you next time.
08:37:25.720 | (upbeat music)