David Eagleman: Neuroplasticity and the Livewired Brain | Lex Fridman Podcast #119
Chapters
0:00 Introduction
5:05 Livewired
16:39 Hardware vs software
25:53 Brain-computer interfaces
35:12 2020 is a challenge for neuroplasticity
46:08 Free will
50:43 Nature of evil
58:55 Psychiatry
66:28 GPT-3
73:31 Intelligence in the brain
81:51 Neosensory
91:27 Book recommendations
94:07 Meaning of life
96:53 Advice for young people
00:00:00.000 |
The following is a conversation with David Eagleman, 00:00:03.000 |
a neuroscientist and one of the great science communicators 00:00:06.360 |
of our time, exploring the beauty and mystery 00:00:13.140 |
about the human mind, and his new one called Livewired. 00:00:21.200 |
that is fascinating to me, which is neuroplasticity 00:00:34.480 |
to get a discount and to support this podcast. 00:00:37.800 |
As a side note, let me say that the adaptability 00:00:41.400 |
of the human mind at the biological, chemical, 00:00:44.360 |
cognitive, psychological, and even sociological levels 00:00:48.480 |
is the very thing that captivated me many years ago 00:00:51.840 |
when I first began to wonder how we might engineer 00:01:03.680 |
As new, smarter and smarter devices and AI systems 00:01:12.080 |
will our brain be able to adapt, to catch up, to excel? 00:01:16.520 |
I personally believe yes, that we're far from reaching 00:01:19.520 |
the limitation of the human mind and the human brain, 00:01:23.080 |
just as we are far from reaching the limitations 00:01:27.880 |
If you enjoy this thing, subscribe on YouTube, 00:01:54.960 |
This show is brought to you by Athletic Greens, 00:01:58.260 |
the all-in-one daily drink to support better health 00:02:02.640 |
Even with a balanced diet, it's difficult to cover 00:02:09.120 |
Their daily drink is like nutritional insurance 00:02:11.780 |
for your body, that's delivered straight to your door. 00:02:20.320 |
sometimes 24 hours, dinner to dinner, sometimes more. 00:02:26.920 |
It's delicious, refreshing, just makes me feel good. 00:02:30.160 |
I think it's like 50 calories, less than a gram of sugar, 00:02:34.400 |
but has a ton of nutrients to make sure my body 00:02:40.200 |
Go to athleticgreens.com/lex to claim a special offer 00:02:53.320 |
how awesome vitamin D is for your immune system. 00:02:57.560 |
So click the athleticgreens.com/lex in the description 00:03:02.080 |
to get the free stuff and to support this podcast. 00:03:05.320 |
This show is sponsored by BetterHelp, spelled H-E-L-P, help. 00:03:15.440 |
match you with a licensed professional therapist 00:03:21.800 |
it's professional counseling done securely online. 00:03:25.480 |
I'm a bit from the David Goggins line of creatures 00:03:30.640 |
usually on long runs or all nights full of self-doubt. 00:03:41.620 |
For most people, I think a good therapist can help in this. 00:03:49.240 |
It's easy, private, affordable, available worldwide. 00:03:54.040 |
save you time and schedule a weekly audio and video session. 00:04:18.440 |
in the context of the history of money is fascinating. 00:04:46.360 |
from the App Store or Google Play and use code LEXPODCAST, 00:04:49.760 |
you get $10, and Cash App will also donate $10 to FIRST, 00:04:53.880 |
an organization that is helping to advance robotics 00:04:56.920 |
and STEM education for young people around the world. 00:05:00.380 |
And now, here's my conversation with David Eagleman. 00:05:04.580 |
You have a new book coming out on the changing brain. 00:05:10.320 |
Can you give a high-level overview of the book? 00:05:15.420 |
So the thing is, we typically think about the brain 00:05:33.420 |
as it learns and adapts to the world around it. 00:05:38.640 |
- So it's LiveWire, meaning like hardware, but changing. 00:05:44.560 |
Well, the hardware and the software layers are blended. 00:05:47.480 |
And so, typically, engineers are praised for their efficiency 00:05:59.720 |
that you get out of a piece of hardware like that 00:06:02.840 |
But what the brain is doing is completely different. 00:06:05.240 |
And I am so excited about where this is all going 00:06:08.360 |
because I feel like this is where our engineering will go. 00:06:14.400 |
So currently, we build all our devices a particular way. 00:06:18.280 |
But I can't tear half the circuitry out of your cell phone 00:06:36.640 |
They have a slight limp on the other side of their body, 00:06:43.440 |
Sometimes children are born without a hemisphere. 00:06:48.600 |
so that everything is on the single remaining hemisphere. 00:06:59.680 |
that is simply trying to accomplish the tasks in front of it 00:07:03.440 |
by rewiring itself with the available real estate. 00:07:06.920 |
- How much of that is a quirk or a feature of evolution? 00:07:20.840 |
'Cause you said you kind of look forward to the idea 00:07:27.480 |
like this in the future, like creating live wire systems. 00:07:30.720 |
How hard do you think is it to create systems like that? 00:07:34.000 |
It has proven itself to be a difficult challenge. 00:07:37.040 |
What I mean by that is even though it's taken evolution 00:07:43.180 |
all we have to do now is peek at the blueprints. 00:07:50.840 |
But that's the part that I mean is a difficult challenge 00:07:52.720 |
because there are tens of thousands of neuroscientists. 00:07:57.080 |
We're all poking and prodding and trying to figure this out, 00:08:09.400 |
and you could look inside the nucleus of a cell 00:08:10.940 |
and you'd see hundreds of thousands of things 00:08:17.400 |
the order of the base pairs and all the rest is details. 00:08:20.300 |
Then it simplifies it and we come to understand something. 00:08:25.120 |
which I've written over 10 years, by the way, 00:08:26.920 |
is to try to distill things down to the principles 00:08:29.920 |
of what plastic systems are trying to accomplish. 00:08:34.180 |
- But to even just linger, you said it's possible 00:08:50.080 |
I mean, you kind of hear things here and there. 00:08:51.760 |
This is why I'm really excited about your book 00:08:56.840 |
sort of popular sources to think about this stuff. 00:09:01.480 |
I mean, there's a lot of, I think from my perspective, 00:09:05.040 |
what I heard is there's been debates over decades 00:09:07.920 |
about how much neuroplasticity there is in the brain 00:09:12.040 |
and so on and people have learned a lot of things 00:09:23.880 |
like how malleable is the hardware of the human brain? 00:09:28.340 |
Maybe you said children at each stage of life. 00:09:33.640 |
I think part of the confusion about plasticity 00:09:40.320 |
and then people might read that from a distance 00:09:42.360 |
and they think, oh, well, Fred didn't recover 00:09:48.980 |
but then you do it with a child and they are plastic. 00:09:52.160 |
And so part of my goal here was to pull together 00:10:04.880 |
The principles are that plasticity diminishes, 00:10:09.520 |
By the way, we should just define plasticity. 00:10:11.560 |
It's the ability of a system to mold into a new shape 00:10:16.260 |
That's why we make things that we call plastic 00:10:19.880 |
because they are moldable and they can hold that new shape, 00:10:25.240 |
- And so maybe we'll use a lot of terms that are synonymous. 00:10:29.480 |
So something is plastic, something is malleable, 00:10:34.440 |
changing, live wire, the name of the book is synonymous. 00:10:39.840 |
but I'll tell you why I chose live wire instead of plasticity. 00:10:42.920 |
So I use the term plasticity in the book, but sparingly, 00:10:47.080 |
because that was a term coined by William James 00:10:58.480 |
But that's not what's actually happening in the brain. 00:11:11.000 |
that we've been having, your brain is changing, 00:11:18.000 |
oh yeah, like that time Lex and I sat together 00:11:21.400 |
- I wonder if your brain will have a Lex thing 00:11:29.320 |
- Yeah, no, I'll probably never get rid of it. 00:11:32.000 |
you and I don't see each other for the next 35 years. 00:11:43.360 |
- Back when we lived outside virtual reality. 00:11:52.880 |
I mean, it's the term that's used in the field 00:11:54.400 |
and so that's why we need to use it still for a while. 00:11:57.800 |
But yeah, it implies something gets molded into shape 00:12:00.800 |
But in fact, the whole system is completely changing. 00:12:03.240 |
- Then back to how malleable is the human brain 00:12:08.840 |
So what, just at a high level, is it malleable? 00:12:21.760 |
after reading thousands of papers on this issue 00:12:33.200 |
that cements itself into place pretty quickly 00:12:37.800 |
And I argue that's because of the stability of the data. 00:12:41.080 |
In other words, what you're getting in from the world, 00:12:44.000 |
you've got a certain number of angles, colors, shapes, 00:12:46.960 |
you know, it's essentially the world is visually stable. 00:12:52.480 |
As opposed to, let's say the somatosensory cortex, 00:12:56.600 |
from your body, or the motor cortex right next to it, 00:13:01.920 |
You get taller over time, you get fatter, thinner over time, 00:13:05.760 |
you might break a leg and have to limp for a while, 00:13:08.760 |
So because the data there is always changing, 00:13:20.920 |
you find that it appears to be this, you know, 00:13:28.040 |
But the point is, different parts of the brain 00:13:31.960 |
- Do you think it's possible that depending on 00:13:41.480 |
So like, you know, if you look at different cultures 00:13:44.200 |
that experience, like if you keep your eyes closed, 00:13:49.320 |
let's say you keep your eyes closed for your entire life, 00:13:53.600 |
then the visual cortex might be much less malleable. 00:13:58.600 |
The reason I bring that up is like, you know, 00:14:01.320 |
maybe we'll talk about brain-computer interfaces 00:14:06.040 |
like is this, is the malleability a genetic thing, 00:14:11.040 |
or is it more about the data, like you said, that comes in? 00:14:14.360 |
- Ah, so the malleability itself is a genetic thing. 00:14:17.960 |
The big trick that Mother Nature discovered with humans 00:14:24.200 |
as opposed to most other creatures to different degrees. 00:14:31.200 |
its brain does the same thing every generation. 00:14:34.160 |
If you compare an alligator 100,000 years ago 00:14:36.200 |
to an alligator now, they're essentially the same. 00:14:38.400 |
We, on the other hand, as humans, drop into a world 00:14:48.160 |
and the language, and the beliefs, and the customs, 00:14:50.320 |
and so on, that's what Mother Nature has done with us, 00:14:55.080 |
and it's been a tremendously successful trick. 00:14:57.360 |
We've taken over the whole planet as a result of this. 00:15:03.280 |
this is a nice feature, like if you were to design a thing 00:15:07.080 |
to survive in this world, do you put it at age zero, 00:15:17.720 |
do you make it malleable and just throw it in, 00:15:19.560 |
take the risk that you're maybe going to die, 00:15:23.380 |
but you're going to learn a lot in the process, 00:15:25.200 |
and if you don't die, you'll learn a hell of a lot 00:15:29.320 |
- So this is the experiment that Mother Nature ran, 00:15:31.360 |
and it turns out that, for better or worse, we've won. 00:15:34.960 |
I mean, yeah, we put other animals into zoos, 00:15:46.640 |
- So that's a beautiful feature of the human brain, 00:15:52.400 |
on the topic of Mother Nature, what do we start with? 00:16:01.040 |
What it's, terrific engineering that's set up in there, 00:16:07.640 |
okay, just make sure that things get to the right place. 00:16:13.800 |
or all this very complicated machinery in the ear 00:16:32.900 |
That's the trick, is how do you engineer something 00:16:37.240 |
that's ready to take in and fill in the blanks? 00:16:57.020 |
and there's chemical communication from the synapses. 00:17:08.240 |
the timing and the nature of the electrical signals, 00:17:10.920 |
I guess, and the hardware would be the actual synapses. 00:17:16.800 |
I wanna get away from the hardware and software metaphor, 00:17:21.880 |
as activity passes through the system, it changes things. 00:17:27.840 |
are really used to thinking about is synapses, 00:17:31.660 |
Of course, each neuron connects with 10,000 of its neighbors, 00:17:37.660 |
is the changing of the strength of that connection, 00:17:45.580 |
The receptor distribution inside that neuron, 00:17:55.460 |
all the way down to biochemical cascades inside the cell, 00:18:01.860 |
which is these little proteins that are attached to the DNA 00:18:08.740 |
that cause more genes to be expressed or repressed. 00:18:18.820 |
is because that's really all we can measure well. 00:18:21.980 |
And all this other stuff is really, really hard to see 00:18:38.900 |
So pace layers is a concept that Stewart Brand 00:18:44.020 |
So you have fashion, which changes rapidly in cities. 00:18:47.460 |
You have governance, which changes more slowly. 00:18:52.460 |
You have the structure, the buildings of a city, 00:18:55.280 |
which changes more slowly, all the way down to nature. 00:18:58.660 |
You've got all these different layers of things 00:19:00.460 |
that are changing at different paces, at different speeds. 00:19:03.620 |
I've taken that idea and mapped it onto the brain, 00:19:06.580 |
which is to say you have some biochemical cascades 00:19:10.620 |
when something happens, all the way down to things 00:19:19.860 |
about particular kinds of things that happen. 00:19:25.180 |
is called Ribot's Law, which is that older memories 00:19:33.180 |
you'll be able to remember things from your young life. 00:19:45.060 |
where older memories are more stable than newer memories. 00:19:56.700 |
And so this is, I think, the way we have to think 00:20:00.300 |
about the brain, not as, okay, you've got neurons, 00:20:05.300 |
- So yeah, so the idea of liveware and livewired 00:20:13.860 |
yeah, it's a gradual spectrum between software and hardware. 00:20:18.260 |
And so the metaphor completely doesn't make sense. 00:20:22.020 |
'Cause like when you talk about software and hardware, 00:20:26.500 |
I mean, of course, software is unlike hardware, 00:20:31.500 |
but even hardware, but like, so there's two groups, 00:20:39.060 |
There's the operating system, there's machine code, 00:20:44.500 |
But somehow that's actually fundamentally different 00:20:46.900 |
than the layers of abstractions in the hardware. 00:20:57.900 |
'cause it's hard to know what to think about that. 00:21:03.220 |
this is an important question for machine learning, 00:21:13.900 |
I mean, it just learns on all of these different levels 00:21:19.860 |
And as a result, what happens is as you practice something, 00:21:31.300 |
So let's say you take up, do you know how to surf? 00:21:35.780 |
Let's say you take up surfing now at this age. 00:21:42.060 |
you don't know how to read the waves, things like that. 00:21:48.660 |
You're of course conscious when you're first doing it, 00:21:50.860 |
you're thinking about, okay, where am I doing? 00:21:57.180 |
In fact, you can't even unpack what it is that you did. 00:22:04.140 |
You're just doing it, you're changing your balance 00:22:05.620 |
when you come, you know, you do this to go to a stop. 00:22:15.980 |
Survival of course being the top thing that's relevant, 00:22:25.620 |
And so we actually shape our circuitry around that. 00:22:28.700 |
- I mean, you mentioned this gets slower and slower 00:22:36.100 |
even on this podcast with a developmental neurobiologist, 00:22:50.460 |
And like, that's like, what, that's mind blowing 00:23:06.100 |
is that it remains malleable your whole life. 00:23:10.260 |
you'll be able to remember new faces and names. 00:23:15.940 |
And thank goodness, 'cause the world is changing rapidly 00:23:19.780 |
I just sent my mother an Alexa and she figured out 00:23:30.380 |
The interesting part is that really your goal 00:24:09.900 |
And I think, God, that's actually what it's like 00:24:12.620 |
to be inside your head and my head and anybody's head 00:24:16.500 |
is that you're essentially on your own planet in there. 00:24:24.220 |
where you've absorbed all of your experiences 00:24:31.260 |
And we've got this very thin bandwidth of communication 00:24:51.860 |
And what it's trying to do, this is the important part, 00:24:56.700 |
what's my place in, how do I function in the world? 00:25:09.180 |
And I think what happens when people get older and older, 00:25:13.020 |
it may not be that plasticity is diminishing. 00:25:17.820 |
essentially has set itself up in a way where it says, 00:25:20.820 |
"Okay, I've pretty much got a really good understanding 00:25:22.860 |
of the world now and I don't really need to change." 00:25:33.360 |
It's just that I think this notion that we all have 00:25:38.340 |
is in part because the motivation isn't there. 00:25:42.460 |
- But if you were 80 and you get fired from your job 00:25:45.780 |
how to program a WordPress site or something, 00:25:49.800 |
So the capability, the possibility of change is there. 00:25:57.100 |
the interesting challenge to this plasticity, 00:26:03.500 |
If we could talk about brain-computer interfaces 00:26:09.080 |
about the efforts of Elon Musk, Neuralink, BCI in general 00:26:13.700 |
in this regard, which is adding a machine, a computer, 00:26:18.700 |
the capability of a computer to communicate with the brain 00:26:26.860 |
and then like the futuristic kind of thoughts? 00:26:29.740 |
First of all, it's terrific that people are jumping 00:26:31.860 |
into doing that 'cause it's clearly the future. 00:26:34.500 |
The interesting part is our brains have pretty good methods 00:26:38.940 |
So maybe it's your fat thumbs on a cell phone or something, 00:26:45.920 |
But we have pretty rapid ways of communicating 00:26:55.640 |
you might be able to get a little bit faster, 00:27:19.580 |
And also it's not clear how many people would say, 00:27:23.140 |
I'm gonna volunteer to get something in my head 00:27:29.760 |
So I think it's, Mother Nature surrounds the brain 00:27:41.620 |
about the brain is, the person is never the same 00:27:48.980 |
Now, whether or not that's true or whatever, who cares? 00:27:51.540 |
But it's a big deal to do an open-head surgery. 00:27:53.980 |
So what I'm interested in is how can we get information 00:28:00.580 |
- Got it, without messing with the biological part, 00:28:06.420 |
with the intricate biological thing that we got going on 00:28:15.340 |
which is wonderful, is going to be in patient cases. 00:28:21.740 |
whether for Parkinson's or epilepsy or whatever. 00:28:27.740 |
and getting more higher density of electrodes. 00:28:30.900 |
I just don't think as far as the future of BCI goes, 00:28:34.340 |
I don't suspect that people will go in and say, 00:28:59.300 |
If you really think about it, it seems extremely difficult 00:29:04.740 |
and almost, I mean, just technically difficult 00:29:17.860 |
But the thing about the future is it's hard to predict. 00:29:31.640 |
it may be able to discover something very surprising 00:29:36.640 |
of our ability to directly communicate with the brain. 00:29:41.440 |
is figuring out how to play with this malleable brain, 00:30:08.640 |
on the human side and then be able to communicate, 00:30:19.620 |
the computer and the brain, like when you sleep. 00:30:22.480 |
I mean, there's a lot of futuristic kind of things 00:30:30.280 |
about the actual intricacies of the communication 00:30:34.120 |
of the brain that it's hard to find the common language. 00:30:38.520 |
- Well, interestingly, the technologies that have been built 00:30:43.520 |
don't actually require the perfect common language. 00:30:48.360 |
So for example, hundreds of thousands of people 00:30:53.480 |
meaning cochlear implants or retinal implants. 00:30:56.960 |
So this is, you take essentially a digital microphone, 00:31:00.360 |
you slip an electrode strip into the inner ear 00:31:06.620 |
and you plug it into the retina at the back of the eye 00:31:13.900 |
don't speak exactly the natural biological language, 00:31:19.440 |
And it turns out that as recently as about 25 years ago, 00:31:24.400 |
a lot of people thought this was never gonna work. 00:31:26.540 |
They thought it wasn't gonna work for that reason, 00:31:32.220 |
there's some correlation between what I can touch 00:31:38.620 |
I clap my hands and I have signals coming in there 00:31:41.520 |
and it figures out how to speak any language. 00:31:54.720 |
or the brain figures out the efficient way of communication. 00:32:00.200 |
And what I've proposed is the potato head theory 00:32:03.200 |
of evolution, which is that all our eyes and nose 00:32:14.180 |
And part of the reason that I think this is right, 00:32:20.380 |
you find all kinds of weird peripheral devices plugged in 00:32:23.540 |
and the brain figures out what to do with the data. 00:32:25.740 |
And I don't believe that mother nature has to reinvent 00:32:28.560 |
the principles of brain operation each time to say, 00:32:32.660 |
oh, now I'm gonna have heat pits to detect infrared. 00:32:48.920 |
Oh, great, I'm gonna mold myself around the data 00:32:52.620 |
- It's kind of fascinating to think that we think 00:32:55.780 |
of smartphones and all this new technology as novel. 00:32:58.740 |
It's totally novel as outside of what evolution 00:33:02.660 |
ever intended or like what nature ever intended. 00:33:05.580 |
It's fascinating to think that like the entirety 00:33:08.420 |
of the process of evolution is perfectly fine 00:33:10.940 |
and ready for the smartphone and the internet. 00:33:17.260 |
And whatever comes to cyborgs, to virtual reality, 00:33:23.500 |
there's all these like books written about what's natural 00:33:32.620 |
It's kind of, you know, we're probably not giving 00:33:43.140 |
You'll see the ease with which they pick up on stuff. 00:33:46.300 |
And as Kevin Kelly said, technology is what gets invented 00:33:54.500 |
But the stuff that already exists when you're born, 00:33:56.260 |
that's not even tech, that's just background furniture. 00:33:58.140 |
Like the fact that the iPad exists for my son and daughter, 00:34:02.340 |
So yeah, it's because we have this incredibly 00:34:07.260 |
malleable system, it just absorbs whatever is going on 00:34:11.900 |
- But do you think, just to linger for a little bit more, 00:34:22.180 |
Like we're kind of, you know, for the machine 00:34:26.560 |
to adjust to the brain, for the brain to adjust to the machine 00:34:32.340 |
So for example, when you put electrodes in the motor cortex 00:34:35.700 |
to control a robotic arm for somebody who's paralyzed, 00:34:39.220 |
the engineers do a lot of work to figure out, 00:34:42.820 |
so that we can detect what's going on from these cells 00:34:45.660 |
and figure out how to best program the robotic arm to move 00:34:49.740 |
given the data that we're measuring from these cells. 00:35:00.860 |
So if there's a piece of food there and she's hungry, 00:35:04.020 |
she'll figure out how to get this food into her mouth 00:35:08.220 |
with the robotic arm because that is what matters. 00:35:15.500 |
that paints a really promising and beautiful, 00:35:25.140 |
You know, so many things happened this year, 2020, 00:35:29.620 |
that you think like, how are we ever going to deal with it? 00:35:41.980 |
I actually think, so 2020 has been an awful year 00:35:46.980 |
but the one silver lining has to do with brain plasticity, 00:35:50.440 |
which is to say we've all been on our gerbil wheels. 00:36:10.900 |
We're having to create new things all the time 00:36:24.820 |
who stay cognitively active their whole lives. 00:36:27.980 |
Some fraction of them have Alzheimer's disease physically, 00:36:40.820 |
It's because they're challenged all the time. 00:36:45.540 |
all this novelty, all these responsibilities, 00:36:47.220 |
chores, social life, all these things happening. 00:36:49.660 |
And as a result, they're constantly building new roadways, 00:36:54.700 |
And that's the only good news is that we are in a situation 00:36:58.900 |
where suddenly we can't just operate like automatons anymore. 00:37:01.980 |
We have to think of completely new ways to do things. 00:37:06.700 |
- I don't know why this question popped into my head. 00:37:17.180 |
- You say this is the promising silver lining, 00:37:19.620 |
just from your own, 'cause you've written about this 00:37:25.220 |
but just this whole pandemic kind of changed the way, 00:37:29.900 |
it knocked us out of this hamster wheel like that of habit. 00:37:42.260 |
who either are ready or are going to lose their business, 00:37:47.260 |
is basically it's taking the dreams that people have had 00:37:56.140 |
this particular dream you've had will no longer be possible. 00:38:08.100 |
I mean, it's gonna be a rough time for many or most people, 00:38:24.540 |
This is obviously the plot in lots of Hollywood movies 00:38:32.340 |
I mean, in general, even though we plan our lives 00:38:38.420 |
as best we can, it's predicated on our notion of, 00:38:54.300 |
- Yeah, you know, for me, one exciting thing, 00:39:03.420 |
He does, he's a, if you see it, you would recognize it. 00:39:12.420 |
and he does these incredible, beautiful videos. 00:39:30.260 |
how to put content online that teaches people. 00:39:37.740 |
you know, Nobel Prize-winning faculty become YouTubers. 00:39:47.340 |
Like what Grant said, which is like the possibility 00:39:57.860 |
the world doesn't, you know, there's faculty. 00:40:04.260 |
that are experts in a particular beautiful field 00:40:18.140 |
And one possibility is they try to create that 00:40:26.460 |
This, of course, has been happening for a while already. 00:40:28.940 |
I mean, for example, when I go and I give book talks, 00:40:33.820 |
will come up to me afterwards and say something 00:40:38.860 |
And they'll say, "Oh, I saw it on a TED Talk." 00:40:42.900 |
Here you got the best person in the world on subject X 00:40:46.460 |
giving a 15-minute talk as beautifully as he or she can. 00:41:03.220 |
And hopefully that person knew what he or she was teaching 00:41:06.460 |
and often didn't and, you know, just made things up. 00:41:08.780 |
So the opportunity now has become extraordinary 00:41:24.340 |
all of the knowledge and the data and so on that it can get, 00:41:34.340 |
because we grew up with a lot of just-in-case learning. 00:41:39.340 |
So, you know, just in case you ever need to know 00:41:42.020 |
these dates in Mongolian history, here they are. 00:41:44.700 |
But what kids are growing up with now, like my kids, 00:41:48.780 |
So as soon as they're curious about something, 00:42:06.300 |
They had outlined seven different levels of learning, 00:42:08.340 |
and the highest level is when you're curious about a topic. 00:42:14.900 |
and as a result, they're gonna be so much smarter 00:42:19.740 |
I mean, my boy is eight years old, my girl is five. 00:42:22.180 |
But I mean, the things that he knows are amazing 00:42:29.900 |
- Yeah, it's just fascinating what the brain, 00:42:33.660 |
'cause of all those TED Talks just loaded in there. 00:42:40.860 |
there's a sense that our attention span is growing shorter, 00:43:04.220 |
And the point is that the brain is able to adjust to it 00:43:11.860 |
within this new medium of information that we have. 00:43:24.940 |
I mean, it's just adjusting to the entirety of things, 00:43:30.100 |
And then pops up COVID that forces us all to be home 00:43:56.820 |
Like how much is the whole thing predetermined? 00:44:01.820 |
Like how much is it already encoded in there? 00:44:12.380 |
The actions, the decisions, the judgments, the-- 00:44:31.380 |
You can't even separate them because you come to the table 00:44:39.680 |
like whether your mother is smoking or drinking, 00:44:41.820 |
things like that, whether she's stressed, so on, 00:44:43.740 |
those all influence how you're gonna pop out of the womb. 00:44:50.180 |
between all of your experiences and the nature. 00:44:55.500 |
What I mean is, I think of it like a space-time cone 00:45:01.820 |
and depending on the experiences that you have, 00:45:03.140 |
you might go off in this direction, or that direction, 00:45:04.820 |
or that direction, because there's interaction all the way, 00:45:12.380 |
So some genes get repressed, some get expressed, and so on. 00:45:26.380 |
and that is the layer that sits on top of the DNA 00:45:32.540 |
That is directly related to the experiences that you have. 00:45:35.100 |
So if, just as an example, they take rat pups, 00:45:38.660 |
and one group is placed away from their parents, 00:45:43.540 |
and taken good care of, that changes their gene expression 00:45:58.300 |
into things like education, and good childcare, and so on, 00:46:03.420 |
because these formative years matter so much. 00:46:17.280 |
- No, no, these are my favorite kind of questions. 00:46:20.900 |
We don't know, if you ask most neuroscientists, 00:46:28.640 |
because as far as we can tell, it's a machine. 00:46:32.200 |
Enormously sophisticated, 86 billion neurons, 00:46:41.300 |
Each neuron in your head has the entire human genome in it. 00:46:47.620 |
These are incredibly complicated biochemical cascades. 00:46:49.980 |
Each one is connected to 10,000 of its neighbors, 00:46:53.040 |
like half a quadrillion connections in the brain. 00:46:58.180 |
but it is fundamentally appears to just be a machine. 00:47:12.600 |
So that's the camp that pretty much all of us fall into, 00:47:14.860 |
but I will say our science is still quite young. 00:47:18.120 |
And you know, I'm a fan of the history of science. 00:47:20.860 |
And the thing that always strikes me as interesting 00:47:22.780 |
is when you look back at any moment in science, 00:47:28.500 |
and they just, they simply didn't know about, you know, 00:47:38.620 |
they all feel like we've converged to the final answer. 00:47:47.180 |
And in fact, this is what drives good science 00:47:49.540 |
is recognizing that we don't have most of the puzzle pieces. 00:47:52.660 |
So as far as the free will question goes, I don't know. 00:47:55.620 |
At the moment, it seems, wow, it would be really impossible 00:47:57.920 |
to figure out how something else could fit in there. 00:48:02.720 |
our textbooks might be very different than they are now. 00:48:07.620 |
where do you think free will could be squeezed into there? 00:48:15.220 |
that our brain just creates kinds of illusions 00:48:19.860 |
Or like where could it possibly be squeezed in? 00:48:33.140 |
- Yeah, exactly, I'm not saying this is what I believe 00:48:36.740 |
I give this at the end of my book, "Incognito." 00:48:38.960 |
So the whole book of "Incognito" is about, you know, 00:49:04.460 |
that when you turn this knob, you hear voices coming from, 00:49:11.720 |
you try to figure out like, how does this thing operate? 00:49:22.200 |
And so what you end up developing is a whole theory 00:49:24.320 |
about how this connection, this pattern of wires 00:49:29.060 |
But it would never strike you that in distant cities, 00:49:31.800 |
there's a radio tower and there's invisible stuff beaming. 00:49:34.520 |
And that's actually the origin of the voices. 00:49:44.120 |
what we know about the brain for absolutely certain 00:49:46.060 |
is that when you damage pieces and parts of it, 00:49:50.560 |
But how would you know if there's something else going on 00:49:52.680 |
that we can't see, like electromagnetic radiation, 00:50:01.360 |
of how totally, because we don't know most of how, 00:50:06.800 |
because we don't know most of how our universe works, 00:50:10.320 |
how totally off base we might be with our science. 00:50:13.920 |
Until, I mean, yeah, I mean, that's inspiring. 00:50:36.320 |
And that's all the things we're pursuing in our labs, 00:50:39.240 |
but there's this whole space of unknown unknowns 00:50:41.480 |
that we haven't even realized we haven't asked yet. 00:50:44.040 |
- Let me kind of ask a weird, maybe a difficult question. 00:50:50.600 |
I've been recently reading a lot about World War II. 00:50:54.360 |
I'm currently reading a book I recommend for people, 00:50:56.640 |
which is, as a Jew, it's been difficult to read, 00:51:04.940 |
So let me just ask about like the nature of genius, 00:51:14.260 |
we look at Hitler, Stalin, modern day Jeffrey Epstein, 00:51:30.660 |
What do we think about that in a live wired brain? 00:51:34.980 |
Like how do we think about these extreme people? 00:51:45.480 |
first of all, I saw a cover of Time Magazine some years ago, 00:51:51.620 |
and it was a big sagittal slice of the brain, 00:52:01.380 |
and there was a little spot that was pointing to Hitler. 00:52:03.140 |
And these Time Magazine covers always make me mad 00:52:05.820 |
because it's so goofy to think that we're gonna find 00:52:16.900 |
we are all about the world and the culture around us. 00:52:22.460 |
got all this positive feedback about what was going on, 00:52:27.580 |
and the crazier and crazier the ideas he had, 00:52:34.180 |
Somehow he was getting positive feedback from that, 00:52:37.380 |
and all these other people, they all spun each other up. 00:52:42.060 |
I mean, look at the cultural revolution in China 00:52:47.060 |
or the Russian Revolution or things like this, 00:52:59.780 |
where they all say, "Well, would I have thought 00:53:06.740 |
"Everyone around me seems to think it's right." 00:53:11.940 |
of having a live wired brain is that you can get crowds 00:53:17.500 |
So it's interesting to, we would pinpoint Hitler 00:53:20.420 |
as saying, "That's the evil guy," but in a sense, 00:53:29.660 |
In other words, Hitler was just a representation 00:53:34.700 |
of whatever was going on with that huge crowd 00:53:39.380 |
So I only bring that up to say that it's very difficult 00:53:49.100 |
He obviously got feedback for what he was doing. 00:53:51.620 |
The other thing, by the way, about what we often think of 00:53:55.720 |
as being evil in society is my lab recently published 00:54:04.560 |
which is a very important part of this puzzle. 00:54:16.040 |
and this seems to be a really fundamental thing. 00:54:20.120 |
where we brought people in, we stick them in the scanner, 00:54:23.380 |
and I don't know, and stop me if you know this, 00:54:36.040 |
So you actually see a syringe needle enter the hand 00:54:38.280 |
and come out, and it's really, what that does 00:54:47.280 |
Now, the interesting thing is it's not your hand 00:54:51.640 |
This is you seeing someone else's hand get stabbed, 00:54:54.160 |
and you feel like, oh God, this is awful, right? 00:54:58.600 |
with somebody's hand getting poked with a Q-tip, 00:55:02.560 |
but you don't have that same level of response. 00:55:06.120 |
Now what we do is we label each hand with a one-word label, 00:55:10.100 |
Christian, Jewish, Muslim, atheist, Scientologist, Hindu. 00:55:14.360 |
And now, do, do, do, do, the computer goes around, 00:55:16.160 |
picks a hand, stabs the hand, and the question is, 00:55:23.240 |
versus the one label that happens to match you? 00:55:25.700 |
And it turns out for everybody across all religions, 00:55:31.040 |
than their out-group, and when I say they care, 00:55:32.440 |
what I mean is you get a bigger response from their brain. 00:55:40.600 |
You care much more about your in-group than your out-group. 00:55:42.880 |
And I wish this weren't true, but this is how humans are. 00:55:55.280 |
like, if it's genetically built into the brain, 00:56:03.240 |
There are two, actually, there are two other things we did 00:56:07.060 |
as part of this study that I think matter for this point. 00:56:13.000 |
And by the way, this is not a cognitive thing. 00:56:21.900 |
What we did next is we next have it where we say, 00:56:24.800 |
okay, the year is 2025, and these three religions 00:56:28.640 |
are now in a war against these three religions. 00:56:33.560 |
and you have two allies now against these others. 00:56:36.000 |
And now it happens over the course of many trials. 00:56:38.680 |
You see everybody gets stabbed at different times. 00:56:41.600 |
And the question is, do you care more about your allies? 00:56:45.920 |
you didn't really care when they got stabbed, 00:56:49.760 |
that they're now your allies, you care more about them. 00:56:55.320 |
how ingrained is this or how arbitrary is it? 00:57:10.080 |
We give them a band that says Augustinian on it, 00:57:16.720 |
and they see a thing on the screen that says, 00:57:18.480 |
the Augustinians and Justinians are two warring tribes. 00:57:22.120 |
Some are labeled Augustinian, some are Justinian. 00:57:24.600 |
And now, you care more about whichever team you're on 00:57:27.840 |
than the other team, even though it's totally arbitrary, 00:57:32.920 |
So it's a state that's very easy to find ourselves in. 00:57:37.000 |
In other words, just before walking in the door, 00:57:39.480 |
they'd never even heard of Augustinian versus Justinian, 00:57:43.920 |
simply because they're told they're on this team. 00:57:46.400 |
- You know, now I did my own personal study of this. 00:57:49.620 |
So once you're an Augustinian, that tends to be sticky 00:58:15.200 |
- You know, we never tried that about saying, 00:58:16.600 |
"Okay, now you're a Justinian, you were an Augustinian." 00:58:28.540 |
and what happens is they look at the way monkeys behave 00:58:34.480 |
and how they treat members of the other tribe of monkeys. 00:58:37.520 |
And then what they do, I've forgotten how they do that 00:58:43.140 |
And very quickly, they end up becoming a part 00:58:47.280 |
and behaving badly towards their original troop. 00:58:55.080 |
In your book, you have a good light bulb joke. 00:59:00.080 |
"How many psychiatrists does it take to change a light bulb? 00:59:04.420 |
"Only one, but the light bulb has to want to change." 00:59:11.800 |
- Okay, so given, I've been interested in psychiatry 00:59:19.160 |
I've kind of early on dreamed to be a psychiatrist 00:59:30.680 |
for somebody else to help this live-wired brain to adjust? 00:59:45.920 |
when you're talking with a friend and you say, 00:59:48.920 |
And your friend says, "Hey, just look at it this way." 00:59:51.620 |
All we have access to under normal circumstances 01:00:05.800 |
So that's how psychiatrists sort of helped us. 01:00:07.240 |
But more importantly, the role that psychiatrists 01:00:10.120 |
have played is that there's this sort of naive assumption 01:00:15.160 |
which is that everyone is fundamentally just like us. 01:00:18.360 |
And when you're a kid, you believe this entirely, 01:00:21.160 |
but as you get older and you start realizing, 01:00:26.320 |
And to be inside that person's head is totally different 01:00:29.120 |
than what it is to be inside my head, or their psychopathy. 01:00:37.520 |
He's just doing what he needs to do to get what he needs. 01:00:45.520 |
and it is different to be inside those heads. 01:00:47.800 |
This is where the field of psychiatry comes in. 01:00:53.400 |
about the degree to which neuroscience is leaking into 01:00:58.520 |
and what the landscape will look like 50 years from now. 01:01:00.880 |
It may be that psychiatry as a profession changes a lot 01:01:07.760 |
and neuroscience will essentially be able to take over 01:01:11.000 |
But it has been extremely useful to understand 01:01:14.760 |
the differences between how people behave and why, 01:01:39.600 |
and psychiatric disorders or quote-unquote mind problems. 01:01:44.360 |
So on that topic, how do you think about this gray area? 01:01:51.240 |
There was psychiatry, and then there were guys and gals 01:02:07.280 |
where this matters a lot is the legal system, 01:02:14.400 |
is someone shows up in front of the judge's bench, 01:02:21.880 |
What we do, 'cause we feel like, hey, this is fair, 01:02:24.320 |
is we say, all right, you're gonna get the same sentence. 01:02:26.040 |
You'll all get three years in prison or whatever it is. 01:02:30.280 |
This guy's got schizophrenia, this guy's a psychopath, 01:02:32.080 |
this guy's tweaked out on drugs, and so on and so on, 01:02:34.080 |
that it actually doesn't make sense to keep doing that. 01:02:49.560 |
in the whole world, in terms of the percentage 01:02:53.320 |
So there's a much more refined thing we can do 01:02:59.640 |
and has the opportunity to change the legal system, 01:03:02.360 |
which is to say, this doesn't let anybody off the hook. 01:03:04.440 |
It doesn't say, oh, it's not your fault, and so on. 01:03:09.820 |
so it's not about, hey, how blameworthy are you? 01:03:13.380 |
But instead is about, hey, what do we do from here? 01:03:18.160 |
and you have them break rocks in the hot summer sun 01:03:21.560 |
in a chain gang, that doesn't help their schizophrenia. 01:03:42.380 |
bring to the table is lots of really useful things 01:03:44.940 |
you can do with schizophrenia, with drug addiction, 01:03:48.480 |
And that's why, so I don't know if you know this, 01:04:01.660 |
I'll just, without going down that rabbit hole, 01:04:04.700 |
I'll just say one of the very simplest things to do 01:04:15.800 |
Because if you go, by the way, to a regular court 01:04:17.780 |
and the person says, or the defense lawyer says, 01:04:22.880 |
most of the jury will say, meh, I call bullshit on that. 01:04:30.780 |
And it turns out people who know about schizophrenia 01:04:46.620 |
Having a drug court where you have judges and juries 01:05:05.080 |
- What's the process of injecting expertise into this? 01:05:14.500 |
So what happens is a county has a completely full jail 01:05:19.820 |
And then they realize, God, we don't have any money. 01:05:23.460 |
And that's when they turn to, God, we need something smarter. 01:05:26.700 |
And that's when they set up specialized court systems. 01:05:30.940 |
- We all function best when our back is against the wall. 01:05:40.820 |
And suddenly our backs are against the wall, all of us. 01:05:47.580 |
I mean, I'm a big believer in the possibility 01:05:56.220 |
And when it becomes too big of a bureaucracy, 01:05:59.220 |
it starts functioning poorly, it starts wasting money. 01:06:02.620 |
It's nice to, I mean, COVID reveals that nicely. 01:06:07.180 |
And lessons to be learned about who gets elected 01:06:14.220 |
Hopefully this inspires talented young people 01:06:23.660 |
Yeah, so that's the positive silver lining of COVID. 01:06:55.860 |
and it's able to generate some pretty impressive things. 01:07:06.020 |
People get so upset when machine learning people 01:07:25.580 |
but it's very different from what the brain does. 01:07:27.620 |
So it's a good impersonator, but just as one example, 01:07:32.620 |
everybody takes a passage that GPT-3 has written 01:07:37.500 |
and they say, wow, look at this, and it's pretty good, right? 01:07:40.420 |
But it's already gone through a filtering process 01:07:48.860 |
Human creativity is about absorbing everything around it 01:07:58.820 |
But we also know, we also have very good models 01:08:04.780 |
And so, I don't know if you speak French or something, 01:08:09.820 |
'cause then you'll say, wait, what are you doing? 01:08:18.140 |
I know the vocabulary that you know and don't know. 01:08:25.140 |
And so, of all the possible sentences I could say, 01:08:31.700 |
so that it's something useful for our conversation. 01:08:34.540 |
- Yeah, in real time, but also throughout your life. 01:08:42.980 |
- Exactly, but this is what GPT-3 does not do. 01:08:51.100 |
But it doesn't know how to make it so that you, Lex, 01:09:00.420 |
- Well, of course, it could be all the impressive results. 01:09:04.700 |
The question is, if you raise the number of parameters, 01:09:11.900 |
Raising more parameters won't, here's the thing. 01:09:18.140 |
'cause I suspect they will be at some point 50 years, 01:09:21.140 |
But what we are missing in artificial neural networks 01:09:28.300 |
where you've got units and you've got synapses 01:09:33.420 |
And it's done incredibly mind-blowing, impressive things, 01:09:35.860 |
but it's not doing the same algorithms as the human brain. 01:09:40.300 |
So when I look at my children as little kids, 01:09:50.660 |
They can navigate social conversation with an adult. 01:09:58.220 |
They are active thinkers in our world and doing things. 01:10:09.100 |
but we also know the things that they can't do well, 01:10:12.820 |
like, you know, like be generally intelligent, 01:10:23.580 |
It's, to me, it's possible that we'll be surprised. 01:10:33.140 |
and I'm glad I'm getting to say this on your podcast 01:10:36.220 |
so we can look back at this in two years and 10 years, 01:10:38.380 |
is that we've got to be much more sophisticated 01:10:45.700 |
and this is something I talk about in Livewired, 01:10:54.620 |
Artificial neural networks don't have some basic things 01:10:56.580 |
that we have, like caring about relevance, for example. 01:11:10.100 |
that I mentioned earlier is based on survival 01:11:12.780 |
but then all the things about my life and your life, 01:11:22.020 |
They don't even have a yen to survive and things like that. 01:11:24.740 |
- So we filter out a bunch of the junk we don't need. 01:11:58.100 |
I don't know, from an Elon Musk or something like that. 01:12:00.760 |
It's doing a filtration of it's throwing away 01:12:04.060 |
all the parameters it doesn't need for this task. 01:12:06.900 |
And it's figuring out how to do that successfully. 01:12:09.700 |
And then ultimately it's not doing a very good job right now 01:12:12.700 |
but it's doing a lot better job than we expected. 01:12:21.660 |
And we see like, oh, wow, it produced these three. 01:12:26.820 |
that produced that didn't really make any sense. 01:12:28.940 |
It's because it has no idea what it is like to be a human. 01:12:32.740 |
And all the things that you might want to say 01:12:39.860 |
in this modern political climate or whatever. 01:12:44.180 |
- And it somehow boils down to fear of mortality 01:12:46.700 |
and all of these human things at the end of the day, 01:12:56.880 |
but you've got all these more things like, you know, 01:13:02.300 |
of my department reads this, I want it to come off well there 01:13:05.760 |
I want to make sure she, you know, and so on. 01:13:08.240 |
- So those are all the things that humans are able to 01:13:14.880 |
- What it requires though is having a model of your chairman, 01:13:18.760 |
having a model of your mother, having a model of, you know, 01:13:34.560 |
again, this may be going into speculation land, 01:13:40.740 |
Is, okay, so the brain seems to be intelligent 01:13:48.400 |
So where do you think intelligence arises in the brain? 01:13:55.680 |
- So if you mean where location wise, it's no single spot. 01:14:02.000 |
I'm looking at New York city, where is the economy? 01:14:15.520 |
is interacting from everything going on at once. 01:14:20.000 |
humans are much smarter than fish, maybe not dolphins, 01:14:30.000 |
So what we mean when we say, oh, we're smarter, 01:14:33.680 |
and figure out a new pathway to get where we need to go. 01:14:36.720 |
And that's because fish are essentially coming to the table 01:14:47.560 |
but you know, I saw someone else do this thing 01:14:49.200 |
and I read once that you could do this other thing 01:14:52.800 |
- So do you think there's, is there something, 01:15:00.120 |
what feature of the brain of the live wire aspect of it 01:15:08.120 |
So like, is it the ability of neurons to reconnect? 01:15:26.880 |
- Yeah, I'm actually just trying to write some up 01:15:30.080 |
if you want to build a robot, start with the stomach. 01:15:32.480 |
And what I mean by that, what I mean by that is 01:15:37.240 |
it has to care about surviving, that kind of thing. 01:15:55.480 |
and then it figures out how to walk on three legs. 01:16:00.600 |
got its front wheel stuck in some Martian soil, 01:16:09.960 |
Wouldn't it be terrific if we could build a robot 01:16:17.800 |
That's the kind of thing that we want to be able to build 01:16:25.120 |
is because its motor and somatosensory systems 01:16:29.920 |
"Turns out we've got a body plan that's different 01:16:42.480 |
"So I'm just gonna figure out how to operate with this. 01:16:49.920 |
It just says, "Oh, geez, I was pre-programmed 01:16:51.560 |
"to have four wheels, now I have three, I'm screwed." 01:17:00.840 |
and there's a few psychologists, Sheldon Solomon, 01:17:03.520 |
I think I just spoke with him on this podcast, 01:17:09.940 |
which is, like, Ernest Becker is a philosopher 01:17:19.040 |
- And so, I don't know, it sounds compelling as an idea 01:17:23.160 |
that we're all, I mean, that all of the civilization 01:17:25.800 |
we've constructed is based on this, but it's-- 01:17:31.240 |
I think that, yes, fundamentally, this desire to survive 01:17:35.160 |
is at the core of it, I would agree with that, 01:17:47.260 |
why you chose to write your tweet this way and that way, 01:17:49.280 |
and it really has nothing to do with the survival part. 01:17:51.320 |
It has to do with trying to impress fellow humans 01:17:55.240 |
- Yeah, so many things built on top of each other. 01:18:00.860 |
we wanna be able to somehow engineer this drive 01:18:08.280 |
I mean, because as humans, we're not just about survival. 01:18:11.440 |
We're aware of the fact that we're going to die, 01:18:14.680 |
which is a very kind of, we're aware of, like, 01:18:20.120 |
Confucius said, he said, "Each person has two lives. 01:18:32.680 |
- I mean, you could argue this kind of Freudian thing, 01:18:34.640 |
which Ernest Becker argues is they actually figured it out 01:18:34.640 |
and the reason most people, when I ask them about 01:18:49.520 |
whether they're afraid of death, they basically say no. 01:18:53.080 |
They basically say, like, "I'm afraid I won't get, 01:19:04.760 |
for a particular set of, like, a book you're writing. 01:19:08.160 |
As opposed to, like, what the hell, this thing ends. 01:19:16.200 |
as I've encountered, do not meditate on the idea 01:19:22.840 |
In the next five minutes, it could be all over. 01:19:29.900 |
I think that somehow brings you closer to, like, 01:19:39.480 |
- I think it might be the core, but like I said, 01:19:48.820 |
"Death whispers at my ear, live, for I come." 01:19:53.380 |
So it is certainly motivating when we think about that. 01:20:04.280 |
I mean, I know for, just speaking for me personally, 01:20:08.880 |
It's instead, oh, I want to get this, you know, 01:20:14.760 |
or I want to make sure my coauthor isn't mad at me 01:20:26.520 |
But for the AI systems, none of that is there. 01:20:30.240 |
Like a neural network does not fear its mortality. 01:20:40.720 |
but I wonder, it's an interesting speculation 01:20:52.780 |
Right now, if you have a robot roll into the room, 01:20:54.840 |
it's gonna be frozen, 'cause it doesn't have any reason 01:21:02.600 |
about this is how I should navigate my next move 01:21:15.780 |
I mean, it's very flexible in terms of the goals 01:21:20.000 |
and creative in terms of the goals we generate 01:21:22.920 |
You show up to a party without a goal usually, 01:21:27.960 |
- Yes, but this goes back to the question about free will, 01:21:35.640 |
would I go and talk to that couple over there 01:21:56.120 |
which is this, when we were talking about BCI. 01:22:00.240 |
but what I'm spending 90% of my time doing now 01:22:04.640 |
- Yes, I wasn't sure what the company is involved in. 01:22:18.880 |
but my interest has been how you can get data streams 01:22:29.680 |
We've built this in many different form factors. 01:22:35.400 |
So these things, as I'm speaking, for example, 01:22:38.240 |
it's capturing my voice and running algorithms 01:22:41.040 |
and then turning that into patterns of vibration here. 01:22:50.760 |
So the information is getting up to their brain this way, 01:22:55.840 |
So it turns out on day one, people are pretty good, 01:22:58.200 |
better than you'd expect at being able to say, 01:23:12.400 |
in other words, a new subjective internal experience. 01:23:15.400 |
So on day one, they say, "Whoa, what was that? 01:23:26.400 |
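(As a rough illustration of the sound-to-touch pipeline David describes here, the sketch below maps a single audio frame to vibration-motor intensities. The sampling rate, frame size, number of motors, band edges, and normalization are all assumptions for illustration; this is not Neosensory's actual algorithm.)

```python
import numpy as np

SAMPLE_RATE = 16000   # Hz, assumed microphone rate
FRAME_SIZE = 512      # samples per analysis frame (~32 ms at 16 kHz)
NUM_MOTORS = 4        # hypothetical number of vibration motors on the band

def frame_to_motor_intensities(frame, sample_rate=SAMPLE_RATE, num_motors=NUM_MOTORS):
    """Map one audio frame to per-motor vibration intensities in [0, 1].

    Window the frame, take its magnitude spectrum, split the spectrum
    into log-spaced bands (one per motor), then compress and normalize
    each band's energy so louder bands drive stronger vibration.
    """
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)

    # Log-spaced band edges between 100 Hz and Nyquist (speech-relevant range).
    edges = np.logspace(np.log10(100.0), np.log10(sample_rate / 2.0), num_motors + 1)
    band_energy = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])

    # Compress dynamic range and scale to [0, 1] for the motor drivers.
    compressed = np.log1p(band_energy)
    peak = compressed.max()
    return compressed / peak if peak > 0 else compressed

if __name__ == "__main__":
    # Stand-in for one captured voice frame: a 440 Hz tone plus a little noise.
    t = np.arange(FRAME_SIZE) / SAMPLE_RATE
    frame = np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.randn(FRAME_SIZE)
    print(frame_to_motor_intensities(frame))  # four intensities, one per motor
```

(In a real device, frames like this would stream continuously from a microphone and the resulting intensities would be sent to the motor drivers on the wrist.)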
- And by the way, that's exactly how you learn 01:23:29.400 |
So what you, of course, you don't remember this, 01:23:30.560 |
but when you were an infant, all you have are, 01:23:33.120 |
your eardrum vibrating causes spikes to go down, 01:23:36.560 |
your auditory nerves and impinging your auditory cortex. 01:23:39.980 |
Your brain doesn't know what those mean automatically, 01:23:50.880 |
and that correlates with what's going on there. 01:23:55.880 |
"as an internal experience, as a conscious experience." 01:24:01.320 |
The weird part is that you can feed data into the brain, 01:24:04.200 |
not through the ears, but through any channel 01:24:17.220 |
could expand arbitrarily, which is fascinating. 01:24:30.960 |
and you need to be able to see and hear other stuff. 01:24:34.760 |
which is this incredible computational material 01:24:39.760 |
And we don't use our skin for much of anything nowadays. 01:24:47.760 |
and you're passing in all this information that way. 01:24:51.400 |
And what I'm doing here with the deaf community 01:25:01.480 |
I'm just replacing the ears with the skin, and that works. 01:25:09.920 |
So what if you took something like your visual system, 01:25:18.840 |
So we've hooked that up, infrared and ultraviolet detectors, 01:25:25.640 |
one of my engineers built it, the infrared detector, 01:25:27.840 |
I was walking in the dark between two houses, 01:25:29.840 |
and suddenly I felt all this infrared radiation. 01:25:34.240 |
and I found an infrared camera, a night vision camera. 01:25:37.200 |
But I immediately, oh, there's that thing there. 01:25:47.440 |
what I'm really interested in is sensory addition. 01:25:52.080 |
that isn't even part of what we normally pick up 01:25:58.080 |
or Twitter, or stock market, or things like that. 01:26:02.720 |
like the moods of other people or something like that. 01:26:04.720 |
- Sure, now what you need is a way to measure that. 01:26:06.960 |
So as long as there's a machine that can measure it, 01:26:08.640 |
it's easy, it's trivial to feed this in here, 01:26:21.240 |
I forgot how you put it, but it was eloquent, 01:26:24.160 |
without getting, cutting into the brain, basically. 01:26:33.800 |
You just put it on, and when you're done, you take it off. 01:26:36.920 |
Yeah, and so, and the name of the company, by the way, 01:26:46.320 |
with certain plug and play devices, and then that's it. 01:26:56.600 |
and there's no reason we have to be stuck there. 01:26:58.220 |
We can expand our umwelt by adding new senses, yeah. 01:27:03.080 |
- Oh, I'm sorry, the umwelt is the slice of reality 01:27:12.840 |
It's such an important concept, which is to say, 01:27:20.160 |
you pick up on butyric acid, you pick up on odor, 01:27:29.200 |
you're picking up on air compression waves coming back, 01:27:35.200 |
you're picking up on changes in the electrical field 01:27:57.800 |
we've got little bits that we can pick up on. 01:27:59.640 |
One of the things I like to do with my students 01:28:01.440 |
is talk about, imagine that you are a bloodhound dog, right? 01:28:25.800 |
and you looked at your human master and thought, 01:28:30.800 |
How could you not know that there's a cat 100 yards away 01:28:34.800 |
And so the idea is because we're stuck in our umwelt, 01:28:39.440 |
is we think, okay, well, yeah, we're seeing reality, 01:28:41.880 |
but you can have very different sorts of realities 01:28:44.880 |
depending on the peripheral plug and play devices 01:28:50.000 |
if we're being honest, probably our umwelt is, 01:28:53.020 |
you know, some infinitely tiny percent of the possibilities 01:29:03.400 |
Even if you could, I mean, there's a guy named 01:29:15.960 |
Like we're very, we're almost like we're floating out there 01:29:24.200 |
It's fascinating that we can have extra senses 01:29:29.680 |
- Exactly, and by the way, this has been the fruits 01:29:38.200 |
But of course, depending on how you calculate it, 01:29:40.120 |
it's less than a 10 trillionth of the electromagnetic 01:29:45.800 |
The reason I say it depends is because, you know, 01:29:46.960 |
it's actually infinite in all directions presumably. 01:29:54.960 |
- Exactly, so understanding how big the world is out there. 01:29:57.200 |
- And the same with the world of really small 01:30:03.160 |
- Exactly, and so the reason I think this kind of thing 01:30:05.040 |
matters is because we now have an opportunity 01:30:10.960 |
okay, well, I'm just gonna include other things 01:30:13.160 |
in my umwelt, so I'm gonna include infrared radiation 01:30:15.800 |
and have a direct perceptual experience of that. 01:30:35.120 |
- Yeah, I feel like this is the most important thing 01:30:40.640 |
what I'm devoting my time and my life to, but-- 01:30:56.240 |
- Exactly, so if you ask what I think about Neuralink, 01:30:59.320 |
I think it's amazing what those guys are doing 01:31:01.240 |
and working on, but I think it's not practical 01:31:04.800 |
For example, for people who are deaf, they buy this 01:31:07.680 |
and, you know, every day we're getting tons of emails 01:31:14.520 |
I didn't even know that was happening out there. 01:31:18.520 |
By the way, this is, you know, less than a 10th 01:31:20.320 |
of the price of a hearing aid and like 250 times 01:31:30.680 |
brilliant folks like yourself could recommend 01:31:37.000 |
So I'll, in the introduction, mention all the books 01:31:44.840 |
But is there three books, technical, fiction, 01:31:56.440 |
perhaps some of which you would want to recommend 01:32:10.560 |
And so I grew up in the mountains in New Mexico. 01:32:28.240 |
I would actually recommend "Invisible Cities." 01:32:44.160 |
by exactly what we were talking about earlier 01:32:45.960 |
about how you can only see a little bit of the, 01:32:48.720 |
what we call visible light in the electromagnetic radiation. 01:32:52.760 |
and then he reviewed "Incognito" for the Washington Post. 01:33:30.200 |
- It's a collection of short stories that I love. 01:33:35.800 |
both watching the PBS series and then reading the book, 01:33:38.200 |
and that influenced me a huge amount in terms of what I do. 01:34:01.440 |
- That was my aspiration. - Is the aspiration. 01:34:03.720 |
I mean, you're doing an incredible job of it. 01:34:07.920 |
So you open the book "Livewired" with a quote by Heidegger. 01:34:11.920 |
"Every man is born as many men and dies as a single one." 01:34:21.680 |
So he had his own reason why he was writing that, 01:34:23.840 |
but I meant this in terms of brain plasticity, 01:34:41.720 |
or you could have been thousands of different men 01:34:55.400 |
but the day you die, you will be exactly Lex. 01:35:23.360 |
when we choose all these different trajectories 01:35:25.160 |
and end up with one, what's the meaning of it all? 01:35:40.360 |
I mean, this is the question that everyone has attacked 01:35:51.040 |
So if you grew up in a secular or scientific society, 01:35:53.240 |
you have a different way of attacking that question. 01:36:01.200 |
- I mean, I think one of the fundamental things, I guess, 01:36:14.200 |
is equivalent to, or at least runs in parallel 01:36:20.840 |
'Cause it's kinda, that's the underlying question. 01:36:24.520 |
And by the way, you know, this is the interesting thing 01:36:27.760 |
You know, we've got all these layers of things 01:36:30.840 |
And so if you keep asking yourself the question about, 01:36:33.680 |
what is the optimal way for me to be spending my time? 01:36:53.720 |
- So you've, I think, just in your eyes, in your work, 01:37:04.720 |
If you were to give advice to a young person today 01:37:14.080 |
about life, about how to discover their passion, 01:37:30.080 |
And this is back to this issue of why COVID is useful for us 01:37:35.920 |
The fact is the jobs that will exist 20 years from now, 01:37:40.560 |
we can't even imagine the jobs that are gonna exist. 01:37:43.000 |
And so when young people that I know go into college 01:37:46.840 |
And so on, college is and should be less and less vocational 01:37:52.680 |
and then I'm gonna do that the rest of my career." 01:37:58.240 |
So the important thing is learning how to learn, 01:38:06.880 |
what I talk to them is, you know, what you digest, 01:38:14.040 |
of things that you can remix and be creative with. 01:38:25.960 |
You're constantly, whoa, you go down some mole hole 01:38:36.240 |
And what I tell people is just always do a gut check about, 01:38:40.240 |
okay, I'm reading this paper and yeah, I think that, 01:38:47.720 |
I tell them just to keep a real sniff out for that. 01:38:50.520 |
And when you find those things, keep going down those paths. 01:38:55.080 |
I mean, that's one of the challenges and the downsides 01:39:00.080 |
is that sometimes people are a little bit afraid 01:39:16.280 |
and that trajectory, it doesn't last forever. 01:39:20.200 |
So just if something sparks your imagination, 01:39:26.400 |
- I don't think there's a more beautiful way to end it. 01:39:29.960 |
David, it's a huge honor to finally meet you. 01:39:34.880 |
I've talked to so many people who are passionate 01:39:43.680 |
I think you're already there with Carl Sagan. 01:39:47.520 |
Yeah, it was an honor talking with you today. 01:39:54.920 |
with David Eagleman, and thank you to our sponsors, 01:40:03.880 |
to get a discount and to support this podcast. 01:40:07.400 |
If you enjoy this thing, subscribe on YouTube, 01:40:28.160 |
the product of billions of years of molecules 01:40:35.160 |
That we're composed only of highways of fluids 01:40:42.660 |
That trillions of synaptic connections hum in parallel. 01:40:46.280 |
That this vast egg-like fabric of micro-thin circuitry 01:40:50.340 |
runs algorithms undreamt of in modern science. 01:40:56.800 |
our decision-making, loves, desires, fears, and aspirations. 01:41:01.800 |
To me, understanding this would be a numinous experience, 01:41:07.680 |
better than anything ever proposed in any holy text. 01:41:12.100 |
Thank you for listening, and hope to see you next time.