
David Eagleman: Neuroplasticity and the Livewired Brain | Lex Fridman Podcast #119


Chapters

0:00 Introduction
5:05 Livewired
16:39 Hardware vs software
25:53 Brain-computer interfaces
35:12 2020 is a challenge for neuroplasticity
46:08 Free will
50:43 Nature of evil
58:55 Psychiatry
66:28 GPT-3
73:31 Intelligence in the brain
81:51 Neosensory
91:27 Book recommendations
94:07 Meaning of life
96:53 Advice for young people

Whisper Transcript

00:00:00.000 | The following is a conversation with David Eagleman,
00:00:03.000 | a neuroscientist and one of the great science communicators
00:00:06.360 | of our time, exploring the beauty and mystery
00:00:09.280 | of the human brain.
00:00:10.800 | He's an author of a lot of amazing books
00:00:13.140 | about the human mind, and his new one called LiveWired.
00:00:18.040 | LiveWired is a work of 10 years on a topic
00:00:21.200 | that is fascinating to me, which is neuroplasticity
00:00:24.700 | or the malleability of the human brain.
00:00:26.800 | Quick summary of the sponsors.
00:00:29.240 | Athletic Greens, BetterHelp, and Cash App.
00:00:32.680 | Click the sponsor links in the description
00:00:34.480 | to get a discount and to support this podcast.
00:00:37.800 | As a side note, let me say that the adaptability
00:00:41.400 | of the human mind at the biological, chemical,
00:00:44.360 | cognitive, psychological, and even sociological levels
00:00:48.480 | is the very thing that captivated me many years ago
00:00:51.840 | when I first began to wonder how we might engineer
00:00:54.960 | something like it in the machine.
00:00:57.760 | The open question today in the 21st century
00:01:00.680 | is what are the limits of this adaptability?
00:01:03.680 | As new, smarter and smarter devices and AI systems
00:01:07.400 | come to life, or as better and better
00:01:09.680 | brain-computer interfaces are engineered,
00:01:12.080 | will our brain be able to adapt, to catch up, to excel?
00:01:16.520 | I personally believe yes, that we're far from reaching
00:01:19.520 | the limitation of the human mind and the human brain,
00:01:23.080 | just as we are far from reaching the limitations
00:01:26.040 | of our computational systems.
00:01:27.880 | If you enjoy this thing, subscribe on YouTube,
00:01:31.360 | review it with Five Stars on Apple Podcasts,
00:01:33.560 | follow on Spotify, support on Patreon,
00:01:36.200 | or connect with me on Twitter @lexfridman.
00:01:39.160 | As usual, I'll do a few minutes of ads now
00:01:41.920 | and no ads in the middle.
00:01:43.400 | I try to make these interesting,
00:01:45.060 | but I give you time stamps so you can skip.
00:01:47.920 | But please do check out the sponsors
00:01:49.720 | by clicking the links in the description.
00:01:51.640 | It's the best way to support this podcast.
00:01:54.960 | This show is brought to you by Athletic Greens,
00:01:58.260 | the all-in-one daily drink to support better health
00:02:01.080 | and peak performance.
00:02:02.640 | Even with a balanced diet, it's difficult to cover
00:02:04.760 | all of your nutritional bases.
00:02:07.000 | That's where Athletic Greens will help.
00:02:09.120 | Their daily drink is like nutritional insurance
00:02:11.780 | for your body, as delivered straight to your door.
00:02:15.360 | As you may know, I fast often,
00:02:17.040 | sometimes intermittent fasting for 16 hours,
00:02:20.320 | sometimes 24 hours, dinner to dinner, sometimes more.
00:02:24.480 | I break the fast with Athletic Greens.
00:02:26.920 | It's delicious, refreshing, just makes me feel good.
00:02:30.160 | I think it's like 50 calories, less than a gram of sugar,
00:02:34.400 | but has a ton of nutrients to make sure my body
00:02:36.460 | has what it needs despite what I'm eating.
00:02:40.200 | Go to athleticgreens.com/lex to claim a special offer
00:02:44.560 | of a free vitamin D3K2 for a year.
00:02:49.160 | If you listened to The Joe Rogan Experience,
00:02:51.440 | you might've listened to him rant about
00:02:53.320 | how awesome vitamin D is for your immune system.
00:02:56.120 | So there you have it.
00:02:57.560 | So click the athleticgreens.com/lex in the description
00:03:02.080 | to get the free stuff and to support this podcast.
00:03:05.320 | This show is sponsored by BetterHelp, spelled H-E-L-P, help.
00:03:11.400 | Check it out at betterhelp.com/lex.
00:03:14.200 | They figure out what you need,
00:03:15.440 | match you with a licensed professional therapist
00:03:17.520 | in under 48 hours.
00:03:19.360 | It's not a crisis line, it's not self-help,
00:03:21.800 | it's professional counseling done securely online.
00:03:25.480 | I'm a bit from the David Goggins line of creatures
00:03:28.080 | and so have some demons to contend with,
00:03:30.640 | usually on long runs or all nights full of self-doubt.
00:03:34.920 | I think suffering is essential for creation,
00:03:37.840 | but you can suffer beautifully
00:03:39.520 | in a way that doesn't destroy you.
00:03:41.620 | For most people, I think a good therapist can help in this.
00:03:45.280 | So it's at least worth a try.
00:03:47.120 | Check out their reviews, they're good.
00:03:49.240 | It's easy, private, affordable, available worldwide.
00:03:52.620 | You can communicate by text,
00:03:54.040 | save you time and schedule a weekly audio and video session.
00:03:58.240 | Check it out at betterhelp.com/lex.
00:04:01.500 | This show is presented by Cash App,
00:04:04.580 | the number one finance app in the App Store.
00:04:06.600 | When you get it, use code LEXPODCAST.
00:04:09.240 | Cash App lets you send money to friends,
00:04:10.920 | buy Bitcoin and invest in the stock market
00:04:12.880 | with as little as $1.
00:04:14.520 | Since Cash App allows you to buy Bitcoin,
00:04:16.840 | let me mention that cryptocurrency
00:04:18.440 | in the context of the history of money is fascinating.
00:04:21.820 | I recommend "The Ascent of Money"
00:04:23.360 | as a great book on this history.
00:04:25.400 | Debits and credits on ledgers
00:04:27.280 | started around 30,000 years ago,
00:04:29.960 | and the first decentralized cryptocurrency
00:04:32.200 | released just over 10 years ago.
00:04:34.160 | So given that history,
00:04:35.520 | cryptocurrency is still very much
00:04:37.400 | in its early days of development,
00:04:39.280 | but it's still aiming to
00:04:40.660 | and just might redefine the nature of money.
00:04:43.820 | So again, if you get Cash App
00:04:46.360 | from the App Store or Google Play and use code LEXPODCAST,
00:04:49.760 | you get $10, and Cash App will also donate $10 to FIRST,
00:04:53.880 | an organization that is helping to advance robotics
00:04:56.920 | and STEM education for young people around the world.
00:05:00.380 | And now, here's my conversation with David Eagleman.
00:05:04.580 | You have a new book coming out on the changing brain.
00:05:10.320 | Can you give a high-level overview of the book?
00:05:13.240 | It's called "LiveWired," by the way.
00:05:14.600 | - Yeah.
00:05:15.420 | So the thing is, we typically think about the brain
00:05:17.760 | in terms of the metaphors we already have,
00:05:19.480 | like hardware and software.
00:05:21.200 | That's how we build all our stuff.
00:05:22.380 | But what's happening in the brain
00:05:24.440 | is fundamentally so different.
00:05:26.860 | So I coined this new term LiveWire,
00:05:29.040 | which is a system that's constantly
00:05:31.280 | reconfiguring itself physically
00:05:33.420 | as it learns and adapts to the world around it.
00:05:37.100 | It's physically changing.
00:05:38.640 | - So it's LiveWire, meaning like hardware, but changing.
00:05:43.640 | - Yeah, exactly.
00:05:44.560 | Well, the hardware and the software layers are blended.
00:05:47.480 | And so, typically, engineers are praised for their efficiency
00:05:52.480 | in making something really clean and clear,
00:05:55.200 | like, okay, here's the hardware layer,
00:05:56.380 | then I'm gonna run software on top of it.
00:05:57.760 | And there's all sorts of universality
00:05:59.720 | that you get out of a piece of hardware like that
00:06:01.840 | that's useful.
00:06:02.840 | But what the brain is doing is completely different.
00:06:05.240 | And I am so excited about where this is all going
00:06:08.360 | because I feel like this is where our engineering will go.
00:06:14.400 | So currently, we build all our devices a particular way.
00:06:18.280 | But I can't tear half the circuitry out of your cell phone
00:06:21.760 | and expect it to still function.
00:06:23.780 | But you can do that with the brain.
00:06:27.480 | So just as an example, kids who are under
00:06:30.320 | about seven years old can get one half
00:06:32.600 | of their brain removed.
00:06:33.640 | It's called a hemispherectomy.
00:06:35.360 | And they're fine.
00:06:36.640 | They have a slight limp on the other side of their body,
00:06:38.440 | but they can function just fine that way.
00:06:41.480 | And this is generally true.
00:06:43.440 | Sometimes children are born without a hemisphere.
00:06:46.280 | And their visual system rewires
00:06:48.600 | so that everything is on the single remaining hemisphere.
00:06:53.040 | What thousands of cases like this teach us
00:06:56.300 | is that it's a very malleable system
00:06:59.680 | that is simply trying to accomplish the tasks in front of it
00:07:03.440 | by rewiring itself with the available real estate.
00:07:06.920 | - How much of that is a quirk or a feature of evolution?
00:07:10.640 | Like how hard is it to engineer?
00:07:12.520 | 'Cause evolution took a lot of work.
00:07:15.240 | Trillions of organisms had to die
00:07:17.960 | to create this thing we have in our skull.
00:07:20.840 | 'Cause you said you kind of look forward to the idea
00:07:24.520 | that we might be engineering our systems
00:07:27.480 | like this in the future, like creating live wire systems.
00:07:30.720 | How hard do you think is it to create systems like that?
00:07:33.160 | - Great question.
00:07:34.000 | It has proven itself to be a difficult challenge.
00:07:37.040 | What I mean by that is even though it's taken evolution
00:07:40.220 | a really long time to get where it is now,
00:07:43.180 | all we have to do now is peek at the blueprints.
00:07:47.600 | It's just three pounds, this organ,
00:07:49.280 | and we just figure out how to do it.
00:07:50.840 | But that's the part that I mean is a difficult challenge
00:07:52.720 | because there are tens of thousands of neuroscientists.
00:07:57.080 | We're all poking and prodding and trying to figure this out,
00:07:59.200 | but it's an extremely complicated system.
00:08:00.680 | But it's only gonna be complicated
00:08:02.680 | until we figure out the general principles.
00:08:05.320 | Exactly like if you had a magic camera
00:08:09.400 | and you could look inside the nucleus of a cell
00:08:10.940 | and you'd see hundreds of thousands of things
00:08:13.240 | moving around or whatever,
00:08:14.200 | and then it takes Crick and Watson to say,
00:08:16.080 | oh, you're just trying to maintain
00:08:17.400 | the order of the base pairs and all the rest is details.
00:08:20.300 | Then it simplifies it and we come to understand something.
00:08:23.480 | That was my goal in Livewired,
00:08:25.120 | which I've written over 10 years, by the way,
00:08:26.920 | is to try to distill things down to the principles
00:08:29.920 | of what plastic systems are trying to accomplish.
00:08:34.180 | - But to even just linger, you said it's possible
00:08:36.760 | to be born with just one hemisphere
00:08:38.840 | and you still are able to function.
00:08:41.960 | First of all, just to pause on that,
00:08:43.480 | I mean, that's amazing.
00:08:45.640 | I don't know if people quite,
00:08:50.080 | I mean, you kind of hear things here and there.
00:08:51.760 | This is why I'm really excited about your book
00:08:54.320 | is I don't know if there's definitive
00:08:56.840 | sort of popular sources to think about this stuff.
00:09:01.480 | I mean, there's a lot of, I think from my perspective,
00:09:05.040 | what I heard is there's been debates over decades
00:09:07.920 | about how much neuroplasticity there is in the brain
00:09:12.040 | and so on and people have learned a lot of things
00:09:14.820 | and now it's converging towards people
00:09:16.880 | that are understanding it's much more neuroplastic,
00:09:19.640 | much more plastic than people realize.
00:09:21.400 | But just like linger on that topic,
00:09:23.880 | like how malleable is the hardware of the human brain?
00:09:28.340 | Maybe you said children at each stage of life.
00:09:32.280 | - Yeah, so here's the whole thing.
00:09:33.640 | I think part of the confusion about plasticity
00:09:36.920 | has been that there are studies
00:09:38.200 | at all sorts of different ages
00:09:40.320 | and then people might read that from a distance
00:09:42.360 | and they think, oh, well, Fred didn't recover
00:09:45.320 | when half his brain was taken out
00:09:47.020 | and so clearly you're not plastic,
00:09:48.980 | but then you do it with a child and they are plastic.
00:09:52.160 | And so part of my goal here was to pull together
00:09:56.360 | the tens of thousands of papers on this,
00:09:59.040 | both from clinical work and from,
00:10:01.920 | all the way down to the molecular
00:10:03.480 | and understand what are the principles here?
00:10:04.880 | The principles are that plasticity diminishes,
00:10:08.240 | that's no surprise.
00:10:09.520 | By the way, we should just define plasticity.
00:10:11.560 | It's the ability of a system to mold into a new shape
00:10:14.840 | and then hold that shape.
00:10:16.260 | That's why we make things that we call plastic
00:10:19.880 | because they are moldable and they can hold that new shape,
00:10:23.640 | like a plastic toy or something.
00:10:25.240 | - And so maybe we'll use a lot of terms that are synonymous.
00:10:29.480 | So something is plastic, something is malleable,
00:10:34.440 | changing, livewired, the name of the book, is synonymous.
00:10:38.720 | - So I'll tell you, exactly right,
00:10:39.840 | but I'll tell you why I chose live wire instead of plasticity.
00:10:42.920 | So I use the term plasticity in the book, but sparingly,
00:10:47.080 | because that was a term coined by William James
00:10:51.000 | over 100 years ago and he was, of course,
00:10:53.940 | very impressed with plastic manufacturing,
00:10:55.760 | that you could mold something into shape
00:10:57.600 | and then it holds that.
00:10:58.480 | But that's not what's actually happening in the brain.
00:11:01.360 | It's constantly rewiring your entire life.
00:11:03.680 | You never hit an end point.
00:11:06.400 | The whole point is for it to keep changing.
00:11:08.240 | So even in the few minutes of conversation
00:11:11.000 | that we've been having, your brain is changing,
00:11:12.560 | my brain is changing.
00:11:13.840 | Next time I see your face, I will remember,
00:11:18.000 | oh yeah, like that time Lex and I sat together
00:11:19.880 | and we did these things.
00:11:21.400 | - I wonder if your brain will have a Lex thing
00:11:24.000 | going on for the next few months.
00:11:25.440 | It'll stay there until you get rid of it
00:11:27.840 | 'cause it was useful for now.
00:11:29.320 | - Yeah, no, I'll probably never get rid of it.
00:11:30.840 | Let's say for some circumstance,
00:11:32.000 | you and I don't see each other for the next 35 years.
00:11:34.320 | When I run into you, I'll be like, oh yeah.
00:11:36.240 | - That looks familiar.
00:11:37.480 | - Yeah, yeah, and we sat down for a podcast
00:11:40.480 | back when there were podcasts.
00:11:42.520 | Exactly.
00:11:43.360 | - Back when we lived outside virtual reality.
00:11:46.160 | - Exactly.
00:11:47.080 | - So you chose livewired as opposed to plastic.
00:11:50.200 | - Exactly, because plastic implies,
00:11:52.880 | I mean, it's the term that's used in the field
00:11:54.400 | and so that's why we need to use it still for a while.
00:11:57.800 | But yeah, it implies something gets molded into shape
00:11:59.680 | and then holds that shape forever.
00:12:00.800 | But in fact, the whole system is completely changing.
00:12:03.240 | - Then back to how malleable is the human brain
00:12:07.720 | at each stage of life?
00:12:08.840 | So what, just at a high level, is it malleable?
00:12:13.840 | - So yes, and plasticity diminishes.
00:12:17.240 | But one of the things that I felt like
00:12:19.920 | I was able to put together for myself
00:12:21.760 | after reading thousands of papers on this issue
00:12:23.920 | is that different parts of the brain
00:12:26.920 | have different plasticity windows.
00:12:30.880 | So for example, with the visual cortex,
00:12:33.200 | that cements itself into place pretty quickly
00:12:36.000 | over the course of a few years.
00:12:37.800 | And I argue that's because of the stability of the data.
00:12:41.080 | In other words, what you're getting in from the world,
00:12:44.000 | you've got a certain number of angles, colors, shapes,
00:12:46.960 | you know, it's essentially the world is visually stable.
00:12:50.000 | So that hardens around that data.
00:12:52.480 | As opposed to, let's say the somatosensory cortex,
00:12:55.160 | which is the part that's taking information
00:12:56.600 | from your body, or the motor cortex right next to it,
00:12:58.600 | which is what drives your body.
00:13:00.400 | The fact is bodies are always changing.
00:13:01.920 | You get taller over time, you get fatter, thinner over time,
00:13:05.760 | you might break a leg and have to limp for a while,
00:13:07.920 | stuff like that.
00:13:08.760 | So because the data there is always changing,
00:13:11.640 | and by the way, you might get on a bicycle,
00:13:12.960 | you might get a surfboard, things like that.
00:13:15.720 | Because the data is always changing,
00:13:16.840 | that stays more malleable.
00:13:18.860 | And when you look through the brain,
00:13:20.920 | you find that it appears to be this, you know,
00:13:24.000 | how stable the data is determines how fast
00:13:26.640 | something hardens into place.
00:13:28.040 | But the point is, different parts of the brain
00:13:30.000 | harden into place at different times.
00:13:31.960 | - Do you think it's possible that depending on
00:13:35.560 | how much data you get on different sensors,
00:13:38.720 | that it stays more malleable longer?
00:13:41.480 | So like, you know, if you look at different cultures
00:13:44.200 | that experience, like if you keep your eyes closed,
00:13:47.600 | or maybe you're blind, I don't know,
00:13:49.320 | let's say you keep your eyes closed for your entire life,
00:13:53.600 | then the visual cortex might be much less malleable.
00:13:58.600 | The reason I bring that up is like, you know,
00:14:01.320 | maybe we'll talk about brain-computer interfaces
00:14:03.320 | a little bit down the line, but you know,
00:14:06.040 | like is this, is the malleability a genetic thing,
00:14:11.040 | or is it more about the data, like you said, that comes in?
00:14:14.360 | - Ah, so the malleability itself is a genetic thing.
00:14:17.960 | The big trick that Mother Nature discovered with humans
00:14:20.840 | is make a system that's really flexible,
00:14:24.200 | as opposed to most other creatures to different degrees.
00:14:28.080 | So if you take an alligator, it's born,
00:14:31.200 | its brain does the same thing every generation.
00:14:34.160 | If you compare an alligator 100,000 years ago
00:14:36.200 | to an alligator now, they're essentially the same.
00:14:38.400 | We, on the other hand, as humans, drop into a world
00:14:42.320 | with a half-baked brain, and what we require
00:14:45.360 | is to absorb the culture around us,
00:14:48.160 | and the language, and the beliefs, and the customs,
00:14:50.320 | and so on, that's what Mother Nature has done with us,
00:14:55.080 | and it's been a tremendously successful trick.
00:14:57.360 | We've taken over the whole planet as a result of this.
00:15:00.120 | - So that's an interesting point,
00:15:01.240 | I mean, just to linger on it, that, I mean,
00:15:03.280 | this is a nice feature, like if you were to design a thing
00:15:07.080 | to survive in this world, do you put it at age zero,
00:15:11.780 | already equipped to deal with the world
00:15:14.600 | in a like hard-coded way, or do you put it,
00:15:17.720 | do you make it malleable and just throw it in,
00:15:19.560 | take the risk that you're maybe going to die,
00:15:23.380 | but you're going to learn a lot in the process,
00:15:25.200 | and if you don't die, you'll learn a hell of a lot
00:15:27.600 | to be able to survive in the environment.
00:15:29.320 | - So this is the experiment that Mother Nature ran,
00:15:31.360 | and it turns out that, for better or worse, we've won.
00:15:34.960 | I mean, yeah, we put other animals into zoos,
00:15:37.480 | and we, yeah, that's right.
00:15:38.320 | - AI might do better.
00:15:39.600 | - Okay, fair enough, that's true.
00:15:41.320 | And maybe what the trick Mother Nature did
00:15:43.640 | is just the stepping stone to AI, but.
00:15:46.640 | - So that's a beautiful feature of the human brain,
00:15:50.200 | that it's malleable, but let's,
00:15:52.400 | on the topic of Mother Nature, what do we start with?
00:15:56.360 | Like how blank is the slate?
00:15:58.360 | - Ah, so it's not actually a blank slate.
00:16:01.040 | What it's, terrific engineering that's set up in there,
00:16:05.320 | but much of that engineering has to do with,
00:16:07.640 | okay, just make sure that things get to the right place.
00:16:10.060 | For example, like the fibers from the eyes
00:16:12.240 | getting to the visual cortex,
00:16:13.800 | or all this very complicated machinery in the ear
00:16:15.600 | getting to the auditory cortex and so on.
00:16:17.680 | So things, first of all, there's that.
00:16:19.840 | And then what we also come equipped with
00:16:21.480 | is the ability to absorb language
00:16:24.560 | and culture and beliefs and so on.
00:16:27.040 | So you're already set up for that.
00:16:28.560 | So no matter what you're exposed to,
00:16:29.960 | you will absorb some sort of language.
00:16:32.900 | That's the trick, is how do you engineer something
00:16:34.960 | just enough that it's then a sponge
00:16:37.240 | that's ready to take in and fill in the blanks?
00:16:40.000 | - How much of the malleability is hardware?
00:16:42.400 | How much is software?
00:16:43.280 | Is that useful at all in the brain?
00:16:45.060 | So what are we talking about?
00:16:46.960 | So there's neurons, there's synapses,
00:16:51.960 | and all kinds of different synapses,
00:16:54.040 | and there's chemical communication,
00:16:55.880 | like electrical signals,
00:16:57.020 | and there's chemical communication from the synapses.
00:17:01.500 | What, I would say, the software would be
00:17:08.240 | the timing and the nature of the electrical signals,
00:17:10.920 | I guess, and the hardware would be the actual synapses.
00:17:14.120 | So here's the thing.
00:17:15.020 | This is why I really, if we can,
00:17:16.800 | I wanna get away from the hardware and software metaphor,
00:17:19.540 | because what happens is,
00:17:21.880 | as activity passes through the system, it changes things.
00:17:25.060 | Now, the thing that computer engineers
00:17:27.840 | are really used to thinking about is synapses,
00:17:30.260 | where two neurons connect.
00:17:31.660 | Of course, each neuron connects with 10,000 of its neighbors,
00:17:33.640 | but at a point where they connect,
00:17:35.340 | what we're all used to thinking about
00:17:37.660 | is the changing of the strength of that connection,
00:17:40.800 | the synaptic weight.
00:17:43.260 | But in fact, everything is changing.
00:17:45.580 | The receptor distribution inside that neuron,
00:17:48.460 | so that you're more or less sensitive
00:17:50.660 | to the neurotransmitter.
00:17:51.820 | Then the structure of the neuron itself,
00:17:54.460 | and what's happening there,
00:17:55.460 | all the way down to biochemical cascades inside the cell,
00:17:58.320 | all the way down to the nucleus,
00:18:00.140 | and for example, the epigenome,
00:18:01.860 | which is these little proteins that are attached to the DNA
00:18:06.560 | that cause conformational changes,
00:18:08.740 | that cause more genes to be expressed or repressed.
00:18:12.540 | All of these things are plastic.
00:18:14.980 | The reason that most people only talk
00:18:17.300 | about the synaptic weights
00:18:18.820 | is because that's really all we can measure well.
00:18:21.980 | And all this other stuff is really, really hard to see
00:18:23.940 | with our current technology,
00:18:24.900 | so essentially that just gets ignored.
00:18:26.980 | But in fact, the system is plastic
00:18:29.140 | at all these different levels.
00:18:30.300 | And my way of thinking about this
00:18:34.900 | is an analogy to pace layers.
00:18:38.900 | So pace layers is a concept that Stewart Brand
00:18:41.860 | suggested about how to think about cities.
00:18:44.020 | So you have fashion, which changes rapidly in cities.
00:18:47.460 | You have governance, which changes more slowly.
00:18:52.460 | You have the structure, the buildings of a city,
00:18:55.280 | which changes more slowly, all the way down to nature.
00:18:58.660 | You've got all these different layers of things
00:19:00.460 | that are changing at different paces, at different speeds.
00:19:03.620 | I've taken that idea and mapped it onto the brain,
00:19:06.580 | which is to say you have some biochemical cascades
00:19:09.420 | that are just changing really rapidly
00:19:10.620 | when something happens, all the way down to things
00:19:12.380 | that are more and more cemented in there.
00:19:14.940 | And this is actually,
00:19:16.420 | this actually allows us to understand a lot
00:19:19.860 | about particular kinds of things that happen.
00:19:21.460 | For example, one of the oldest,
00:19:22.700 | probably the oldest rule in neurology
00:19:25.180 | is called Ribot's Law, which is that older memories
00:19:28.020 | are more stable than newer memories.
00:19:30.120 | So when you get old and demented,
00:19:33.180 | you'll be able to remember things from your young life.
00:19:36.660 | Maybe you'll remember this podcast,
00:19:37.900 | but you won't remember what you did
00:19:39.740 | a month ago or a year ago.
00:19:41.920 | And this is a very weird structure, right?
00:19:43.860 | No other system works this way,
00:19:45.060 | where older memories are more stable than newer memories.
00:19:48.740 | But it's because through time,
00:19:52.420 | things get more and more cemented
00:19:53.940 | into deeper layers of the system.
00:19:56.700 | And so this is, I think, the way we have to think
00:20:00.300 | about the brain, not as, okay, you've got neurons,
00:20:03.460 | you've got synaptic weights, and that's it.
00:20:05.300 | - So yeah, so the idea of liveware and livewired
00:20:09.420 | is that it's like a, it's a gradual,
00:20:13.860 | yeah, it's a gradual spectrum between software and hardware.
00:20:18.260 | And so the metaphor completely doesn't make sense.
00:20:22.020 | 'Cause like when you talk about software and hardware,
00:20:24.500 | it's really hard lines.
00:20:26.500 | I mean, of course, software is unlike hardware,
00:20:31.500 | but even hardware, but like, so there's two groups,
00:20:36.300 | but in the software world,
00:20:37.660 | there's levels of abstractions, right?
00:20:39.060 | There's the operating system, there's machine code,
00:20:42.140 | and then it gets higher and higher levels.
00:20:44.500 | But somehow that's actually fundamentally different
00:20:46.900 | than the layers of abstractions in the hardware.
00:20:50.040 | But in the brain, it's all like the same.
00:20:53.140 | And I love the city, the city metaphor.
00:20:55.380 | I mean, yeah, it's kind of mind-blowing,
00:20:57.900 | 'cause it's hard to know what to think about that.
00:21:01.420 | Like if I were to ask the question,
00:21:03.220 | this is an important question for machine learning,
00:21:06.340 | is how does the brain learn?
00:21:09.740 | So essentially you're saying that,
00:21:13.900 | I mean, it just learns on all of these different levels
00:21:17.180 | at all different paces.
00:21:18.940 | - Exactly right.
00:21:19.860 | And as a result, what happens is as you practice something,
00:21:22.660 | you get good at something,
00:21:24.460 | you're physically changing the circuitry.
00:21:26.620 | You're adapting your brain around the thing
00:21:29.980 | that is relevant to you.
00:21:31.300 | So let's say you take up, do you know how to surf?
00:21:34.940 | - Nope. - Okay, great.
00:21:35.780 | Let's say you take up surfing now at this age.
00:21:39.220 | What happens is you'll be terrible at first,
00:21:41.100 | you don't know how to operate your body,
00:21:42.060 | you don't know how to read the waves, things like that.
00:21:43.980 | And through time you get better and better.
00:21:45.700 | What you're doing is you're burning that
00:21:46.980 | into the actual circuitry of your brain.
00:21:48.660 | You're of course conscious when you're first doing it,
00:21:50.860 | you're thinking about, okay, what am I doing?
00:21:52.140 | What's my body weight?
00:21:53.620 | But eventually when you become a pro at it,
00:21:55.340 | you are not conscious of it at all.
00:21:57.180 | In fact, you can't even unpack what it is that you did.
00:22:00.220 | Think about riding a bicycle.
00:22:01.980 | You can't describe how you're doing it.
00:22:04.140 | You're just doing it, you're changing your balance
00:22:05.620 | when you come, you know, you do this to go to a stop.
00:22:08.060 | So this is what we're constantly doing
00:22:10.820 | is actually shaping our own circuitry
00:22:14.300 | based on what is relevant for us.
00:22:15.980 | Survival of course being the top thing that's relevant,
00:22:18.780 | but interestingly, especially with humans,
00:22:21.460 | we have these particular goals in our lives,
00:22:23.420 | computer science, neuroscience, whatever.
00:22:25.620 | And so we actually shape our circuitry around that.
00:22:28.700 | - I mean, you mentioned this gets slower and slower
00:22:30.380 | with age, but is there, like I've,
00:22:33.100 | I think I've read and spoken offline,
00:22:36.100 | even on this podcast with a developmental neurobiologist,
00:22:41.100 | I guess would be the right terminology,
00:22:43.460 | is like looking at the very early,
00:22:45.580 | like from embryonic stem cells to like,
00:22:48.380 | to the creation of the brain.
00:22:50.460 | And like, that's like, what, that's mind blowing
00:22:53.700 | how much stuff happens there.
00:22:55.380 | So it's very malleable at that stage.
00:22:57.500 | And then, but after that,
00:23:01.620 | at which point does it stop being malleable?
00:23:04.180 | - So that's the interesting thing
00:23:06.100 | is that it remains malleable your whole life.
00:23:08.540 | So even when you're an old person,
00:23:10.260 | you'll be able to remember new faces and names.
00:23:13.180 | You'll be able to learn new sorts of tasks.
00:23:15.940 | And thank goodness, 'cause the world is changing rapidly
00:23:18.020 | in terms of technology and so on.
00:23:19.780 | I just sent my mother an Alexa and she figured out
00:23:22.620 | how to go in the settings and do the thing.
00:23:23.900 | And I was really impressed by that,
00:23:25.820 | that she was able to do it.
00:23:26.980 | So there are parts of the brain
00:23:28.580 | that remain malleable their whole life.
00:23:30.380 | The interesting part is that really your goal
00:23:34.220 | is to make an internal model of the world.
00:23:36.100 | Your goal is to say, okay, the brain
00:23:40.540 | is trapped in silence and darkness,
00:23:42.300 | and it's trying to understand
00:23:43.540 | how the world works out there, right?
00:23:46.020 | - I love that image.
00:23:47.100 | Yeah, I guess it is.
00:23:48.260 | - Yeah.
00:23:49.100 | - You forget, it's like this lonely thing
00:23:53.100 | is sitting in its own container
00:23:54.700 | and trying to actually throw a few sensors,
00:23:57.180 | figure out what the hell's going on.
00:23:58.860 | - You know what I sometimes think about is
00:24:00.740 | that movie, "The Martian" with Matt Damon.
00:24:03.140 | I mean, it was written in a book, of course,
00:24:05.660 | but the movie poster shows Matt Damon
00:24:08.500 | all alone on the red planet.
00:24:09.900 | And I think, God, that's actually what it's like
00:24:12.620 | to be inside your head and my head and anybody's head
00:24:16.500 | is that you're essentially on your own planet in there.
00:24:20.340 | And I'm essentially on my own planet.
00:24:21.780 | And everyone's got their own world
00:24:24.220 | where you've absorbed all of your experiences
00:24:26.820 | up to this moment in your life
00:24:28.140 | that have made you exactly who you are
00:24:29.700 | and same for me and everyone.
00:24:31.260 | And we've got this very thin bandwidth of communication
00:24:36.260 | and I'll say something like,
00:24:38.780 | "Oh yeah, that tastes just like peaches."
00:24:40.860 | And you'll say, "Oh, I know what you mean."
00:24:42.980 | But the experience, of course,
00:24:44.420 | might be vastly different for us.
00:24:46.700 | But anyway, yes, so the brain is trapped
00:24:49.660 | in silence and darkness, each one of us.
00:24:51.860 | And what it's trying to do, this is the important part,
00:24:54.220 | it's trying to make an internal model
00:24:55.860 | of what's going on out there,
00:24:56.700 | what's my place in, how do I function in the world?
00:24:59.420 | How do I interact with other people?
00:25:01.580 | Do I say something nice and polite?
00:25:03.460 | Do I say something aggressive and mean?
00:25:04.820 | Do I, you know, all these things
00:25:06.340 | that it's putting together about the world.
00:25:09.180 | And I think what happens when people get older and older,
00:25:13.020 | it may not be that plasticity is diminishing.
00:25:16.220 | It may be that their internal model
00:25:17.820 | essentially has set itself up in a way where it says,
00:25:20.820 | "Okay, I've pretty much got a really good understanding
00:25:22.860 | of the world now and I don't really need to change."
00:25:25.860 | When much older people find themselves
00:25:28.700 | in a situation where they need to change,
00:25:30.860 | they can actually or are able to do it.
00:25:33.360 | It's just that I think this notion that we all have
00:25:35.400 | that plasticity diminishes as we grow older
00:25:38.340 | is in part because the motivation isn't there.
00:25:41.620 | - I see.
00:25:42.460 | - But if you were 80 and you get fired from your job
00:25:44.180 | and suddenly had to figure out
00:25:45.780 | how to program a WordPress site or something,
00:25:47.500 | you'd figure it out.
00:25:48.980 | - Got it.
00:25:49.800 | So the capability, the possibility of change is there.
00:25:53.820 | But let me ask the highest challenge,
00:25:57.100 | the interesting challenge to this plasticity,
00:26:00.940 | to this liveware system.
00:26:03.500 | If we could talk about brain-computer interfaces
00:26:06.060 | and Neuralink, what are your thoughts
00:26:09.080 | about the efforts of Elon Musk, Neuralink, BCI in general
00:26:13.700 | in this regard, which is adding a machine, a computer,
00:26:18.700 | the capability of a computer to communicate with the brain
00:26:22.860 | and the brain to communicate with a computer
00:26:24.980 | at the very basic applications
00:26:26.860 | and then like the futuristic kind of thoughts?
00:26:28.900 | - Yeah.
00:26:29.740 | First of all, it's terrific that people are jumping
00:26:31.860 | into doing that 'cause it's clearly the future.
00:26:34.500 | The interesting part is our brains have pretty good methods
00:26:37.320 | of interacting with technology.
00:26:38.940 | So maybe it's your fat thumbs on a cell phone or something,
00:26:41.660 | or maybe it's watching a YouTube video
00:26:44.660 | and getting into your eye that way.
00:26:45.920 | But we have pretty rapid ways of communicating
00:26:48.420 | with technology and getting data.
00:26:50.020 | So if you actually crack open the skull
00:26:52.820 | and go into the inner sanctum of the brain,
00:26:55.640 | you might be able to get a little bit faster,
00:26:58.260 | but I'll tell you, I'm not so sanguine
00:27:03.260 | on the future of that as a business.
00:27:06.740 | And I'll tell you why.
00:27:07.560 | It's because there are various ways
00:27:10.020 | of getting data in and out.
00:27:11.140 | And an open-head surgery is a big deal.
00:27:14.460 | Neurosurgeons don't wanna do it
00:27:16.500 | 'cause there's always risk of death
00:27:17.980 | and infection on the table.
00:27:19.580 | And also it's not clear how many people would say,
00:27:23.140 | I'm gonna volunteer to get something in my head
00:27:25.940 | so that I can text faster, 20% faster.
00:27:29.760 | So I think it's, Mother Nature surrounds the brain
00:27:33.680 | with this armored bunker of the skull
00:27:37.780 | because it's a very delicate material.
00:27:39.620 | And there's an expression in neurosurgery
00:27:41.620 | about the brain is, the person is never the same
00:27:47.780 | after you open up their skull.
00:27:48.980 | Now, whether or not that's true or whatever, who cares?
00:27:51.540 | But it's a big deal to do an open-head surgery.
00:27:53.980 | So what I'm interested in is how can we get information
00:27:57.420 | in and out of the brain
00:27:58.260 | without having to crack the skull open?
00:28:00.580 | - Got it, without messing with the biological part,
00:28:03.580 | like directly connecting or messing
00:28:06.420 | with the intricate biological thing that we got going on
00:28:10.380 | that seems to be working.
00:28:11.860 | - Yeah, exactly.
00:28:12.700 | And by the way, where Neuralink is going,
00:28:15.340 | which is wonderful, is going to be in patient cases.
00:28:18.200 | It really matters for all kinds of surgeries
00:28:20.500 | that a person needs,
00:28:21.740 | whether for Parkinson's or epilepsy or whatever.
00:28:23.980 | It's a terrific new technology
00:28:25.620 | for essentially sewing electrodes in there
00:28:27.740 | and getting more higher density of electrodes.
00:28:29.940 | So that's great.
00:28:30.900 | I just don't think as far as the future of BCI goes,
00:28:34.340 | I don't suspect that people will go in and say,
00:28:38.300 | "Yeah, drill a hole in my head and do that."
00:28:40.580 | - Well, it's interesting
00:28:41.620 | 'cause I think there's a similar intuition,
00:28:44.220 | but say in the world of autonomous vehicles,
00:28:46.620 | that folks know how hard it is
00:28:49.360 | and it seems damn impossible.
00:28:51.480 | The similar intuition about,
00:28:52.920 | I'm sticking on the Elon Musk thing,
00:28:54.640 | is just a good, easy example.
00:28:57.000 | Similar intuition about colonizing Mars.
00:28:59.300 | If you really think about it, it seems extremely difficult
00:29:04.740 | and almost, I mean, just technically difficult
00:29:08.960 | to a degree where you wanna ask,
00:29:12.000 | is it really worth doing, worth trying?
00:29:14.840 | And then the same is applied with BCI.
00:29:17.860 | But the thing about the future is it's hard to predict.
00:29:22.860 | So the exciting thing to me with,
00:29:26.680 | so once it does, once, if successful,
00:29:29.640 | it's able to help patients,
00:29:31.640 | it may be able to discover something very surprising
00:29:36.640 | of our ability to directly communicate with the brain.
00:29:39.560 | So exactly what you're interested in
00:29:41.440 | is figuring out how to play with this malleable brain,
00:29:46.440 | but help assist it somehow.
00:29:49.520 | I mean, it's such a compelling notion to me
00:29:52.280 | that we're now working on
00:29:53.680 | all these exciting machine learning systems
00:29:56.000 | that are able to learn from data.
00:29:59.840 | And then if we can have this other brain
00:30:04.100 | that's a learning system that's live wired
00:30:08.640 | on the human side and then be able to communicate,
00:30:11.600 | it's like a self-play mechanism
00:30:14.160 | was able to beat the world champion at Go.
00:30:17.840 | So they can play with each other,
00:30:19.620 | the computer and the brain, like when you sleep.
00:30:22.480 | I mean, there's a lot of futuristic kind of things
00:30:24.560 | that it's just exciting possibilities.
00:30:27.680 | But I hear you, we understand so little
00:30:30.280 | about the actual intricacies of the communication
00:30:34.120 | of the brain that it's hard to find the common language.
00:30:38.520 | - Well, interestingly, the technologies that have been built
00:30:43.520 | don't actually require the perfect common language.
00:30:48.360 | So for example, hundreds of thousands of people
00:30:51.120 | are walking around with artificial ears
00:30:52.480 | and artificial eyes,
00:30:53.480 | meaning cochlear implants or retinal implants.
00:30:56.960 | So this is, you take a essentially digital microphone,
00:31:00.360 | you slip an electrode strip into the inner ear
00:31:03.260 | and people can learn how to hear that way.
00:31:05.220 | Or you take an electrode grid
00:31:06.620 | and you plug it into the retina at the back of the eye
00:31:09.260 | and people can learn how to see that way.
00:31:11.180 | The interesting part is those devices
00:31:13.900 | don't speak exactly the natural biological language,
00:31:17.140 | they speak the dialect of Silicon Valley.
00:31:19.440 | And it turns out that as recently as about 25 years ago,
00:31:24.400 | a lot of people thought this was never gonna work.
00:31:26.540 | They thought it wasn't gonna work for that reason,
00:31:29.000 | but the brain figures it out.
00:31:30.400 | It's really good at saying, okay, look,
00:31:32.220 | there's some correlation between what I can touch
00:31:34.240 | and feel and hear and so on.
00:31:35.860 | And the data that's coming in or between,
00:31:38.620 | I clap my hands and I have signals coming in there
00:31:41.520 | and it figures out how to speak any language.
00:31:44.260 | - Oh, that's fascinating.
00:31:45.100 | So like no matter if it's Neuralink,
00:31:49.100 | so directly communicating with the brain
00:31:51.740 | or it's a smartphone or Google Glass
00:31:54.720 | or the brain figures out the efficient way of communication.
00:31:58.820 | - Well, exactly, exactly.
00:32:00.200 | And what I've proposed is the potato head theory
00:32:03.200 | of evolution, which is that all our eyes and nose
00:32:07.300 | and mouth and ears and fingertips,
00:32:08.820 | all this stuff is just plug and play.
00:32:10.740 | And the brain can figure out what to do
00:32:13.260 | with the data that comes in.
00:32:14.180 | And part of the reason that I think this is right,
00:32:17.600 | and I care so deeply about this,
00:32:18.940 | is when you look across the animal kingdom,
00:32:20.380 | you find all kinds of weird peripheral devices plugged in
00:32:23.540 | and the brain figures out what to do with the data.
00:32:25.740 | And I don't believe that mother nature has to reinvent
00:32:28.560 | the principles of brain operation each time to say,
00:32:32.660 | oh, now I'm gonna have heat pits to detect infrared.
00:32:34.500 | Now I'm gonna have something to detect,
00:32:37.460 | you know, electro receptors on the body.
00:32:39.200 | Now I'm gonna detect something to pick up
00:32:40.540 | the magnetic field of the earth
00:32:42.540 | with cryptochromes in the eye.
00:32:43.940 | And so instead the brain says, oh, I got it.
00:32:45.820 | There's data coming in.
00:32:47.260 | Is that useful?
00:32:48.100 | Can I do something with it?
00:32:48.920 | Oh, great, I'm gonna mold myself around the data
00:32:51.440 | that's coming in.
00:32:52.620 | - It's kind of fascinating to think that we think
00:32:55.780 | of smartphones and all this new technology as novel.
00:32:58.740 | It's totally novel as outside of what evolution
00:33:02.660 | ever intended or like what nature ever intended.
00:33:05.580 | It's fascinating to think that like the entirety
00:33:08.420 | of the process of evolution is perfectly fine
00:33:10.940 | and ready for the smartphone and the internet.
00:33:14.220 | Like it's ready.
00:33:15.260 | It's ready to be valuable to that.
00:33:17.260 | And whatever comes to cyborgs, to virtual reality,
00:33:21.500 | we kind of think like this is, you know,
00:33:23.500 | there's all these like books written about what's natural
00:33:27.020 | and we're like destroying our natural selves
00:33:29.620 | by like embracing all this technology.
00:33:32.620 | It's kind of, you know, we're probably not giving
00:33:35.380 | the brain enough credit.
00:33:36.500 | Like this thing is just fine with new tech.
00:33:40.280 | - Oh, exactly.
00:33:41.120 | It wraps itself around.
00:33:41.940 | And by the way, wait till you have kids.
00:33:43.140 | You'll see the ease with which they pick up on stuff.
00:33:46.300 | And as Kevin Kelly said, technology is what gets invented
00:33:51.300 | after you're born.
00:33:54.500 | But the stuff that already exists when you're born,
00:33:56.260 | that's not even tech, that's just background furniture.
00:33:58.140 | Like the fact that the iPad exists for my son and daughter,
00:34:00.860 | like that's just background furniture.
00:34:02.340 | So yeah, it's because we have this incredibly
00:34:07.260 | malleable system, it just absorbs whatever is going on
00:34:10.300 | in the world and learns what to do with it.
00:34:11.900 | - But do you think, just to linger for a little bit more,
00:34:15.420 | do you think it's possible to co-adjust?
00:34:22.180 | Like we're kind of, you know, for the machine
00:34:26.560 | to adjust to the brain, for the brain to adjust to the machine
00:34:29.220 | I guess that's what's already happening.
00:34:31.020 | - Sure, that is what's happening.
00:34:32.340 | So for example, when you put electrodes in the motor cortex
00:34:35.700 | to control a robotic arm for somebody who's paralyzed,
00:34:39.220 | the engineers do a lot of work to figure out,
00:34:41.060 | okay, what can we do with the algorithm here
00:34:42.820 | so that we can detect what's going on from these cells
00:34:45.660 | and figure out how to best program the robotic arm to move
00:34:49.740 | given the data that we're measuring from these cells.
00:34:51.920 | But also the brain is learning too.
00:34:54.540 | So, you know, the paralyzed woman says,
00:34:57.020 | wait, I'm trying to grab this thing.
00:34:58.820 | And by the way, it's all about relevance.
00:35:00.860 | So if there's a piece of food there and she's hungry,
00:35:04.020 | she'll figure out how to get this food into her mouth
00:35:08.220 | with the robotic arm because that is what matters.
00:35:11.320 | - Well, that's, okay, first of all,
00:35:15.500 | that paints a really promising and beautiful,
00:35:17.620 | for some reason, really optimistic picture
00:35:20.140 | that our brain is able to adjust to so much.
00:35:25.140 | You know, so many things happened this year, 2020,
00:35:29.620 | that you think like, how are we ever going to deal with it?
00:35:32.960 | And it's somehow encouraging and inspiring
00:35:36.960 | that we're going to be okay.
00:35:41.140 | - Well, that's right.
00:35:41.980 | I actually think, so 2020 has been an awful year
00:35:45.100 | for almost everybody in many ways,
00:35:46.980 | but the one silver lining has to do with brain plasticity,
00:35:50.440 | which is to say we've all been on our gerbil wheels.
00:35:55.180 | We've all been in our routines.
00:35:56.500 | And as I mentioned, our internal models
00:35:59.780 | are all about how do you maximally succeed?
00:36:02.300 | How do you optimize your operation
00:36:04.540 | in this circumstance where you are, right?
00:36:07.180 | And then all of a sudden, bang, 2020 comes,
00:36:09.020 | we're completely off our wheels.
00:36:10.900 | We're having to create new things all the time
00:36:14.860 | and figure out how to do it.
00:36:15.940 | And that is terrific for brain plasticity
00:36:18.060 | because, and we know this because there are
00:36:21.860 | very large studies on older people
00:36:24.820 | who stay cognitively active their whole lives.
00:36:27.980 | Some fraction of them have Alzheimer's disease physically,
00:36:32.300 | but nobody knows that when they're alive.
00:36:34.640 | Even though their brain is getting chewed up
00:36:36.020 | with the ravages of Alzheimer's,
00:36:38.620 | cognitively, they're doing just fine.
00:36:40.820 | It's because they're challenged all the time.
00:36:44.100 | They've got all these new things going on,
00:36:45.540 | all this novelty, all these responsibilities,
00:36:47.220 | chores, social life, all these things happening.
00:36:49.660 | And as a result, they're constantly building new roadways,
00:36:52.780 | even as parts degrade.
00:36:54.700 | And that's the only good news is that we are in a situation
00:36:58.900 | where suddenly we can't just operate like automatons anymore.
00:37:01.980 | We have to think of completely new ways to do things.
00:37:05.620 | And that's wonderful.
00:37:06.700 | - I don't know why this question popped into my head.
00:37:11.180 | It's quite absurd, but are we gonna be okay?
00:37:16.100 | - Yeah.
00:37:17.180 | - You say this is the promising silver lining,
00:37:19.620 | just from your own, 'cause you've written about this
00:37:21.740 | and thought about this outside of maybe even
00:37:24.020 | the plasticity of the brain,
00:37:25.220 | but just this whole pandemic kind of changed the way,
00:37:29.900 | it knocked us out of this hamster wheel like that of habit.
00:37:34.900 | A lot of people had to reinvent themselves.
00:37:39.420 | Unfortunately, and I have a lot of friends
00:37:42.260 | who either already have or are going to lose their business,
00:37:47.260 | is basically it's taking the dreams that people have had
00:37:52.740 | and said, like, said this dream,
00:37:56.140 | this particular dream you've had will no longer be possible.
00:38:00.100 | So you have to find something new.
00:38:02.060 | What are your, are we gonna be okay?
00:38:06.020 | - Yeah, we'll be okay in the sense that,
00:38:08.100 | I mean, it's gonna be a rough time for many or most people,
00:38:11.820 | but in the sense that it is sometimes useful
00:38:16.140 | to find that what you thought was your dream
00:38:20.780 | was not the thing that you're going to do.
00:38:24.540 | This is obviously the plot in lots of Hollywood movies
00:38:26.780 | that someone says, I'm gonna do this,
00:38:27.940 | and then that gets foiled
00:38:29.260 | and they end up doing something better.
00:38:31.100 | But this is true in life.
00:38:32.340 | I mean, in general, even though we plan our lives
00:38:38.420 | as best we can, it's predicated on our notion of,
00:38:42.100 | okay, given everything that's around me,
00:38:43.580 | this is what's possible for me next.
00:38:45.800 | But it takes 2020 to knock you off that,
00:38:49.400 | where you think, oh, well, actually,
00:38:50.840 | maybe there's something I can be doing
00:38:52.680 | that's bigger, that's better.
00:38:54.300 | - Yeah, you know, for me, one exciting thing,
00:38:56.580 | and I just talked to Grant Sanderson.
00:38:59.620 | I don't know if you know who he is.
00:39:00.700 | It's the 3Blue1Brown.
00:39:02.100 | It's a YouTube channel.
00:39:03.420 | He does, he's a, if you see it, you would recognize it.
00:39:08.220 | He's like a really famous math guy,
00:39:11.100 | and he's a math educator,
00:39:12.420 | and he does these incredible, beautiful videos.
00:39:15.020 | And now I see sort of at MIT,
00:39:16.980 | folks are struggling to try to figure out,
00:39:19.660 | you know, if we do teach remotely,
00:39:21.780 | how do we do it effectively?
00:39:23.540 | So you have these world-class researchers
00:39:27.940 | and professors trying to figure out
00:39:30.260 | how to put content online that teaches people.
00:39:33.740 | And to me, a possible future of that is,
00:39:37.740 | you know, Nobel Prize-winning faculty become YouTubers.
00:39:42.740 | Like that, that to me is so exciting.
00:39:47.340 | Like what Grant said, which is like the possibility
00:39:50.940 | of creating canonical videos
00:39:52.500 | on the thing you're a world expert in.
00:39:55.020 | You know, there's so many topics that just,
00:39:57.860 | the world doesn't, you know, there's faculty.
00:40:00.900 | I mentioned Russ Tedrick.
00:40:02.140 | There's all these people in robotics
00:40:04.260 | that are experts in a particular beautiful field
00:40:07.900 | on which there's only just papers.
00:40:09.900 | There's no popular book.
00:40:12.700 | There's no clean canonical video
00:40:16.460 | showing the beauty of a subject.
00:40:18.140 | And one possibility is they try to create that
00:40:22.380 | and share it with the world.
00:40:25.460 | - This is the beautiful thing.
00:40:26.460 | This, of course, has been happening for a while already.
00:40:28.940 | I mean, for example, when I go and I give book talks,
00:40:31.740 | often what'll happen is some 13-year-old
00:40:33.820 | will come up to me afterwards and say something
00:40:35.380 | and I'll say, "My God, that was so smart.
00:40:37.180 | "Like, how did you know that?"
00:40:38.860 | And they'll say, "Oh, I saw it on a TED Talk."
00:40:40.340 | Well, what an amazing opportunity.
00:40:42.900 | Here you got the best person in the world on subject X
00:40:46.460 | giving a 15-minute talk as beautifully as he or she can.
00:40:51.460 | And the 13-year-old just grows up with that.
00:40:53.460 | That's just the mother's milk, right?
00:40:55.180 | As opposed to when we grew up, you know,
00:40:57.940 | I had whatever homeroom teacher I had
00:41:00.380 | and, you know, whatever classmates I had.
00:41:03.220 | And hopefully that person knew what he or she was teaching
00:41:06.460 | and often didn't and, you know, just made things up.
00:41:08.780 | So the opportunity now has become extraordinary
00:41:12.980 | to get the best of the world.
00:41:14.540 | And the reason this matters, of course,
00:41:15.740 | is because obviously, back to plasticity,
00:41:18.580 | the way our brain gets molded
00:41:22.020 | is by absorbing everything from the world,
00:41:24.340 | all of the knowledge and the data and so on that it can get,
00:41:28.300 | and then springboarding off of that.
00:41:33.060 | And we're in a very lucky time now
00:41:34.340 | because we grew up with a lot of just-in-case learning.
00:41:39.340 | So, you know, just in case you ever need to know
00:41:42.020 | these dates in Mongolian history, here they are.
00:41:44.700 | But what kids are growing up with now, like my kids,
00:41:47.260 | is tons of just-in-time learning.
00:41:48.780 | So as soon as they're curious about something,
00:41:50.140 | they ask Alexa, they ask Google Home,
00:41:51.780 | they get the answer right there
00:41:53.100 | in the context of their curiosity.
00:41:54.580 | The reason this matters is because
00:41:57.140 | for plasticity to happen, you need to care.
00:42:00.420 | You need to be curious about something.
00:42:02.460 | And this is something, by the way,
00:42:03.740 | that the ancient Romans had noted.
00:42:06.300 | They had outlined seven different levels of learning,
00:42:08.340 | and the highest level is when you're curious about a topic.
00:42:10.580 | But anyway, so kids now are getting
00:42:12.940 | tons of just-in-time learning,
00:42:14.900 | and as a result, they're gonna be so much smarter
00:42:17.380 | than we are.
00:42:18.820 | And we can already see that.
00:42:19.740 | I mean, my boy is eight years old, my girl is five.
00:42:22.180 | But I mean, the things that he knows are amazing
00:42:25.700 | because it's not just him having to do
00:42:27.740 | the rote memorization stuff that we did.
00:42:29.900 | - Yeah, it's just fascinating what the brain,
00:42:32.220 | what young brains look like now
00:42:33.660 | 'cause of all those TED Talks just loaded in there.
00:42:36.860 | - Yes. - And there's also,
00:42:38.420 | I mean, a lot of people write kind of,
00:42:40.860 | there's a sense that our attention span is growing shorter,
00:42:44.840 | but it's complicated because, for example,
00:42:49.060 | the majority of people,
00:42:50.740 | 80-plus percent of people,
00:42:52.540 | listen to the entirety of these things,
00:42:54.620 | two, three hours of it.
00:42:56.020 | Podcasts, long-form podcasts
00:42:58.300 | are becoming more and more popular.
00:43:00.920 | So it's all a really giant, complicated mess.
00:43:04.220 | And the point is that the brain is able to adjust to it
00:43:07.420 | and somehow form a worldview
00:43:11.860 | within this new medium of information that we have.
00:43:16.860 | You have these short tweets
00:43:19.940 | and you have these three, four-hour podcasts
00:43:22.860 | and you have Netflix movies.
00:43:24.940 | I mean, it's just adjusting to the entirety of things,
00:43:27.420 | just absorbing it and taking it all in.
00:43:30.100 | And then pops up COVID that forces us all to be home
00:43:34.620 | and it all just adjusts and figures it out.
00:43:39.220 | - Yeah, yeah, exactly. - It's fascinating.
00:43:41.900 | We've been talking about the brain a little bit
00:43:43.900 | as if it's something separate from the human
00:43:48.420 | that carries it.
00:43:50.300 | Like whenever you talk about the brain,
00:43:52.220 | it's easy to forget that that's us.
00:43:56.820 | Like how much is the whole thing predetermined?
00:44:01.820 | Like how much is it already encoded in there?
00:44:07.780 | And how much is it the-- - What's the it?
00:44:11.540 | - What's the it?
00:44:12.380 | The actions, the decisions, the judgments, the--
00:44:22.100 | - You mean like who you are?
00:44:23.380 | - Who you are.
00:44:24.200 | - Oh, yeah, yeah, okay, great question.
00:44:25.780 | Right, so there used to be a big debate
00:44:27.420 | about nature versus nurture.
00:44:28.900 | And we now know that it's always both.
00:44:31.380 | You can't even separate them because you come to the table
00:44:34.740 | with a certain amount of nature,
00:44:35.580 | for example, your whole genome and so on.
00:44:37.740 | The experiences you have in the womb,
00:44:39.680 | like whether your mother is smoking or drinking,
00:44:41.820 | things like that, whether she's stressed, so on,
00:44:43.740 | those all influence how you're gonna pop out of the womb.
00:44:47.260 | From there, everything is an interaction
00:44:50.180 | between all of your experiences and the nature.
00:44:55.500 | What I mean is, I think of it like a space-time cone
00:44:59.900 | where you have, you drop into the world,
00:45:01.820 | and depending on the experiences that you have,
00:45:03.140 | you might go off in this direction, or that direction,
00:45:04.820 | or that direction, because there's interaction all the way,
00:45:08.420 | your experiences determine what happens
00:45:11.060 | with the expression of your genes.
00:45:12.380 | So some genes get repressed, some get expressed, and so on.
00:45:15.940 | And you actually become a different person
00:45:17.580 | based on your experiences.
00:45:18.900 | There's a whole field called epigenomics,
00:45:21.100 | which is, or epigenetics, I should say,
00:45:23.980 | which is about the epigenome,
00:45:26.380 | and that is the layer that sits on top of the DNA
00:45:30.340 | and causes the genes to express differently.
00:45:32.540 | That is directly related to the experiences that you have.
00:45:35.100 | So if, just as an example, they take rat pups,
00:45:38.660 | and one group is placed away from their parents,
00:45:41.540 | and the other group is groomed, and licked,
00:45:43.540 | and taken good care of, that changes their gene expression
00:45:46.060 | for the rest of their life.
00:45:46.900 | They go off in different directions
00:45:48.100 | in this space-time cone.
00:45:50.900 | So yeah, this is, of course, why it matters
00:45:55.780 | that we take care of children and pour money
00:45:58.300 | into things like education, and good childcare, and so on,
00:46:01.900 | for children broadly,
00:46:03.420 | because these formative years matter so much.
00:46:08.260 | - So is there a free will?
00:46:10.080 | - This is a great question.
00:46:13.780 | - I apologize for the absurd high-level
00:46:16.440 | philosophical questions.
00:46:17.280 | - No, no, these are my favorite kind of questions.
00:46:19.020 | Here's the thing, here's the thing.
00:46:20.900 | We don't know, if you ask most neuroscientists,
00:46:23.300 | they'll say that we can't really think
00:46:26.780 | of how you would get free will in there,
00:46:28.640 | because as far as we can tell, it's a machine.
00:46:30.280 | It's a very complicated machine.
00:46:32.200 | Enormously sophisticated, 86 billion neurons,
00:46:36.120 | about the same number of glial cells.
00:46:38.140 | Each of these things is as complicated
00:46:40.140 | as the city of San Francisco.
00:46:41.300 | Each neuron in your head has the entire human genome in it.
00:46:43.500 | It's expressing millions of gene products.
00:46:47.620 | These are incredibly complicated biochemical cascades.
00:46:49.980 | Each one is connected to 10,000 of its neighbors,
00:46:51.860 | which means you have, you know,
00:46:53.040 | like half a quadrillion connections in the brain.
00:46:55.740 | So it's incredibly complicated,
00:46:58.180 | but it fundamentally appears to just be a machine.
00:47:01.560 | And therefore, if there's nothing in it
00:47:04.980 | that's not being driven by something else,
00:47:07.400 | then it seems it's hard to understand
00:47:10.180 | where free will would come from.
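A quick back-of-the-envelope check of the numbers above, as a minimal Python sketch. The one assumption (not stated in the conversation) is that each connection is shared between the two neurons it joins, which is what brings the total near the "half a quadrillion" figure:

```python
# Rough arithmetic on the figures quoted above; illustrative only.
neurons = 86e9             # ~86 billion neurons
partners_per_neuron = 1e4  # ~10,000 connections per neuron

# Assumption: each connection is shared by the two neurons it joins,
# so dividing by 2 avoids counting every connection twice.
connections = neurons * partners_per_neuron / 2
print(f"{connections:.1e} connections")  # ~4.3e14, roughly half a quadrillion
```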
00:47:12.600 | So that's the camp that pretty much all of us fall into,
00:47:14.860 | but I will say our science is still quite young.
00:47:18.120 | And you know, I'm a fan of the history of science.
00:47:20.860 | And the thing that always strikes me as interesting
00:47:22.780 | is when you look back at any moment in science,
00:47:26.100 | everybody believes something is true,
00:47:28.500 | and they just, they simply didn't know about, you know,
00:47:31.340 | what Einstein revealed or whatever.
00:47:33.160 | And so who knows?
00:47:35.100 | - And at any moment in history,
00:47:38.620 | they all feel like we've converged to the final answer.
00:47:40.700 | - Exactly, exactly.
00:47:41.780 | Like all the pieces of the puzzle are there.
00:47:43.780 | And I think that's a funny illusion
00:47:45.620 | that's worth getting rid of.
00:47:47.180 | And in fact, this is what drives good science
00:47:49.540 | is recognizing that we don't have most of the puzzle pieces.
00:47:52.660 | So as far as the free will question goes, I don't know.
00:47:55.620 | At the moment, it seems, wow, it would be really impossible
00:47:57.920 | to figure out how something else could fit in there.
00:48:00.020 | But you know, a hundred years from now,
00:48:02.720 | our textbooks might be very different than they are now.
00:48:05.620 | - I mean, could I ask you to speculate
00:48:07.620 | where do you think free will could be squeezed into there?
00:48:11.060 | Like what's that even, is it possible
00:48:15.220 | that our brain just creates kinds of illusions
00:48:17.700 | that are useful for us?
00:48:19.860 | Or like where could it possibly be squeezed in?
00:48:24.180 | - Well, let me give a speculation answer
00:48:27.140 | to your very nice question, but you know,
00:48:30.700 | and the listeners to this podcast,
00:48:32.180 | don't quote me on this. - It's not a quote.
00:48:33.140 | - Yeah, exactly, I'm not saying this is what I believe
00:48:35.140 | to be true, but let me just give an example.
00:48:36.740 | I give this at the end of my book, "Incognito."
00:48:38.960 | So the whole book of "Incognito" is about, you know,
00:48:41.380 | all of what's happening in the brain.
00:48:42.940 | And essentially I'm saying, look,
00:48:44.100 | here's all the reasons to think
00:48:45.500 | that free will probably does not exist.
00:48:47.060 | But at the very end, I say, look,
00:48:50.620 | imagine that you are, you know,
00:48:53.900 | imagine that you're a Kalahari Bushman
00:48:56.180 | and you find a radio in the sand
00:48:58.940 | and you've never seen anything like this.
00:49:01.140 | And you look at this radio and you realize
00:49:04.460 | that when you turn this knob, you hear voices coming from,
00:49:07.220 | there are voices coming from it.
00:49:08.740 | So being a, you know, a radio materialist,
00:49:11.720 | you try to figure out like, how does this thing operate?
00:49:14.040 | So you take off the back cover
00:49:15.440 | and you realize there's all these wires.
00:49:16.840 | And when you take out some wires,
00:49:19.780 | the voices get garbled or stop or whatever.
00:49:22.200 | And so what you end up developing is a whole theory
00:49:24.320 | about how this connection, this pattern of wires
00:49:26.840 | gives rise to voices.
00:49:29.060 | But it would never strike you that in distant cities,
00:49:31.800 | there's a radio tower and there's invisible stuff beaming.
00:49:34.520 | And that's actually the origin of the voices.
00:49:36.800 | And this is just necessary for it.
00:49:38.680 | So I mentioned this just as a speculation.
00:49:42.580 | Say, look, how would we know,
00:49:44.120 | what we know about the brain for absolutely certain
00:49:46.060 | is that when you damage pieces and parts of it,
00:49:48.560 | things get jumbled up.
00:49:50.560 | But how would you know if there's something else going on
00:49:52.680 | that we can't see, like electromagnetic radiation,
00:49:55.120 | that is what's actually generating this?
00:49:58.160 | - Yeah, you paint a beautiful example
00:50:01.360 | of how, because we don't know most of how
00:50:06.800 | our universe works,
00:50:10.320 | how totally off base we might be with our science.
00:50:13.920 | I mean, yeah, that's inspiring.
00:50:18.480 | That's beautiful.
00:50:19.320 | It's kind of terrifying.
00:50:21.120 | It's humbling.
00:50:21.960 | It's all of the above.
00:50:23.960 | - And the important part just to recognize
00:50:26.840 | is that of course we're in the position
00:50:28.640 | of having massive unknowns.
00:50:31.780 | And we have, of course, the known unknowns.
00:50:36.320 | And that's all the things we're pursuing in our labs,
00:50:38.400 | trying to figure out,
00:50:39.240 | but there's this whole space of unknown unknowns
00:50:41.480 | that we haven't even realized we haven't asked yet.
00:50:44.040 | - Let me kind of ask a weird, maybe a difficult question.
00:50:47.680 | Part of it has to do with,
00:50:50.600 | I've been recently reading a lot about World War II.
00:50:54.360 | I'm currently reading a book I recommend for people,
00:50:56.640 | which is, as a Jew, it's been difficult to read,
00:51:00.960 | but The Rise and Fall of the Third Reich.
00:51:04.940 | So let me just ask about like the nature of genius,
00:51:08.940 | the nature of evil.
00:51:10.520 | If we look at somebody like Einstein,
00:51:14.260 | we look at Hitler, Stalin, modern day Jeffrey Epstein,
00:51:19.260 | just folks who through their life have done,
00:51:23.600 | with Einstein, done works of genius,
00:51:25.680 | and with the others I mentioned,
00:51:27.620 | have done evil on this world.
00:51:30.660 | What do we think about that in a live wired brain?
00:51:34.980 | Like how do we think about these extreme people?
00:51:39.980 | - Here's what I'd say.
00:51:41.740 | This is a very big and difficult question,
00:51:43.580 | but what I would say briefly on it is,
00:51:45.480 | first of all, I saw a cover of Time Magazine some years ago,
00:51:51.620 | and it was a big sagittal slice of the brain,
00:51:55.560 | and it said something like,
00:51:56.960 | "What makes us good and evil?"
00:51:59.060 | And there was a little spot pointing to
00:52:00.540 | a picture of Gandhi,
00:52:01.380 | and there was a little spot that was pointing to Hitler.
00:52:03.140 | And these Time Magazine covers always make me mad
00:52:05.820 | because it's so goofy to think that we're gonna find
00:52:08.700 | some spot in the brain or something.
00:52:11.020 | Instead, the interesting part is,
00:52:13.200 | because we're live wired,
00:52:16.900 | we are all about the world and the culture around us.
00:52:20.620 | So somebody like Adolf Hitler
00:52:22.460 | got all this positive feedback about what was going on,
00:52:27.580 | and the crazier and crazier the ideas he had,
00:52:29.780 | he's like, "Let's set up death camps
00:52:31.940 | "and murder a bunch of people," and so on.
00:52:34.180 | Somehow he was getting positive feedback from that,
00:52:37.380 | and all these other people, they all spun each other up.
00:52:40.340 | And you look at anything like,
00:52:42.060 | I mean, look at the cultural revolution in China
00:52:47.060 | or the Russian Revolution or things like this,
00:52:51.820 | where you look at these and you think,
00:52:52.660 | my God, how do people all behave like this?
00:52:55.340 | But it's easy to see groups of people
00:52:57.820 | spinning themselves up in particular ways,
00:52:59.780 | where they all say, "Well, would I have thought
00:53:02.300 | "this was right in a different circumstance?
00:53:04.700 | "I don't know, but Fred thinks it's right,
00:53:05.900 | "and Steve thinks it's right.
00:53:06.740 | "Everyone around me seems to think it's right."
00:53:08.220 | And so part of the maybe downside
00:53:11.940 | of having a live wired brain is that you can get crowds
00:53:14.460 | of people doing things as a group.
00:53:17.500 | So it's interesting that we would pinpoint Hitler
00:53:20.420 | and say, "That's the evil guy," but in a sense,
00:53:23.620 | I think it was Tolstoy who said,
00:53:24.660 | "The king becomes slave to the people."
00:53:29.660 | In other words, Hitler was just a representation
00:53:34.700 | of whatever was going on with that huge crowd
00:53:37.380 | that he was surrounded with.
00:53:39.380 | So I only bring that up to say that it's very difficult
00:53:44.380 | to say what it is about this person's brain
00:53:48.260 | or that person's brain.
00:53:49.100 | He obviously got feedback for what he was doing.
00:53:51.620 | The other thing, by the way, about what we often think of
00:53:55.720 | as being evil in society is my lab recently published
00:54:00.720 | some work on in-groups and out-groups,
00:54:04.560 | which is a very important part of this puzzle.
00:54:08.220 | So it turns out that we are very engineered
00:54:13.220 | to care about in-groups versus out-groups,
00:54:16.040 | and this seems to be a really fundamental thing.
00:54:18.640 | So we did this experiment in my lab
00:54:20.120 | where we brought people in, we stick them in the scanner,
00:54:23.380 | and I don't know, and stop me if you know this,
00:54:24.960 | but we show them on the screen six hands,
00:54:29.960 | and the computer boop, boop, boop, boop,
00:54:32.320 | goes around, randomly picks a hand,
00:54:33.400 | and then you see that hand gets stabbed
00:54:34.920 | with a syringe needle.
00:54:36.040 | So you actually see a syringe needle enter the hand
00:54:38.280 | and come out, and it's really, what that does
00:54:40.720 | is it triggers parts of the pain matrix,
00:54:44.560 | this area in your brain that's involved
00:54:46.120 | in feeling physical pain.
00:54:47.280 | Now, the interesting thing is it's not your hand
00:54:48.880 | that was stabbed.
00:54:49.960 | So what you're seeing is empathy.
00:54:51.640 | This is you seeing someone else's hand get stabbed,
00:54:54.160 | and you feel like, oh God, this is awful, right?
00:54:56.240 | Okay.
00:54:57.440 | We contrast that, by the way,
00:54:58.600 | with somebody's hand getting poked with a Q-tip,
00:55:00.840 | which looks visually the same,
00:55:02.560 | but you don't have that same level of response.
00:55:06.120 | Now what we do is we label each hand with a one-word label,
00:55:10.100 | Christian, Jewish, Muslim, atheist, Scientologist, Hindu.
00:55:14.360 | And now, do, do, do, do, the computer goes around,
00:55:16.160 | picks a hand, stabs the hand, and the question is,
00:55:19.200 | how much does your brain care
00:55:21.400 | about all the people in your out-group
00:55:23.240 | versus the one label that happens to match you?
00:55:25.700 | And it turns out for everybody across all religions,
00:55:29.160 | they care much more about their in-group
00:55:31.040 | than their out-group, and when I say they care,
00:55:32.440 | what I mean is you get a bigger response from their brain.
00:55:35.580 | Everything's the same.
00:55:36.420 | It's the same hands.
00:55:38.820 | It's just a one-word label.
00:55:40.600 | You care much more about your in-group than your out-group.
00:55:42.880 | And I wish this weren't true, but this is how humans are.
00:55:45.720 | - I wonder how fundamental that is,
00:55:47.840 | or if it's the emergent thing about culture.
00:55:52.840 | Like, if we lived alone,
00:55:55.280 | like, if it's genetically built into the brain,
00:55:57.520 | like this longing for tribe.
00:56:00.320 | - So I'll tell you, we addressed that.
00:56:02.280 | So here's what we did.
00:56:03.240 | There are two, actually, there are two other things we did
00:56:07.060 | as part of this study that I think matter for this point.
00:56:09.560 | One is, so, okay, so we show
00:56:11.460 | that you have a much bigger response.
00:56:13.000 | And by the way, this is not a cognitive thing.
00:56:14.400 | This is a very low-level, basic response
00:56:17.400 | to seeing pain in somebody, okay.
00:56:19.000 | - Great study, by the way.
00:56:20.040 | - Thanks, thanks, thanks.
00:56:21.900 | What we did next is we have it where we say,
00:56:24.800 | okay, the year is 2025, and these three religions
00:56:28.640 | are now in a war against those three religions.
00:56:30.760 | And it's all randomized, right?
00:56:31.800 | But what you see is your thing,
00:56:33.560 | and you have two allies now against these others.
00:56:36.000 | And now it happens over the course of many trials.
00:56:38.680 | You see everybody gets stabbed at different times.
00:56:41.600 | And the question is, do you care more about your allies?
00:56:43.480 | And the answer is yes.
00:56:44.320 | Suddenly, people who a moment ago,
00:56:45.920 | you didn't really care when they got stabbed,
00:56:47.280 | now, simply with this one word thing
00:56:49.760 | that they're now your allies, you care more about them.
00:56:52.680 | But then what I wanted to do was look at
00:56:55.320 | how ingrained is this or how arbitrary is it?
00:56:57.640 | So we brought new participants in,
00:57:00.080 | and we said, here's a coin, toss the coin.
00:57:02.720 | If it's heads, you're an Augustinian.
00:57:04.160 | If it's tails, you're a Justinian.
00:57:06.160 | These are totally made up.
00:57:08.000 | Okay, so they toss it, they get whatever.
00:57:10.080 | We give them a band that says Augustinian on it,
00:57:13.360 | whatever tribe they're in now.
00:57:15.120 | And they get in the scanner,
00:57:16.720 | and they see a thing on the screen that says,
00:57:18.480 | the Augustinians and Justinians are two warring tribes.
00:57:21.080 | Then you see a bunch of hands.
00:57:22.120 | Some are labeled Augustinian, some are Justinian.
00:57:24.600 | And now, you care more about whichever team you're on
00:57:27.840 | than the other team, even though it's totally arbitrary,
00:57:29.680 | and you know it's arbitrary
00:57:30.600 | 'cause you're the one who tossed the coin.
00:57:32.920 | So it's a state that's very easy to find ourselves in.
00:57:37.000 | In other words, just before walking in the door,
00:57:39.480 | they'd never even heard of Augustinian versus Justinian,
00:57:41.720 | and now their brain is representing it
00:57:43.920 | simply because they're told they're on this team.
00:57:46.400 | - You know, now I did my own personal study of this.
00:57:49.620 | So once you're an Augustinian, that tends to be sticky
00:57:55.500 | because I've been a Packers fan,
00:57:57.440 | going to be a Packers fan my whole life.
00:57:59.080 | Now, when I'm in Boston with the Patriots,
00:58:03.040 | it's been tough going for my brain
00:58:04.880 | to switch to the Patriots.
00:58:07.120 | (laughing)
00:58:07.960 | So it's interesting, once you're in,
00:58:10.040 | the tribe is sticky.
00:58:12.120 | - Yeah, I bet that's true.
00:58:13.760 | That's it, you know.
00:58:15.200 | - You know, we never tried that, saying,
00:58:16.600 | "Okay, now you're a Justinian, you were an Augustinian."
00:58:19.200 | We never saw how sticky it is,
00:58:21.840 | but there are studies of this,
00:58:24.420 | of monkey troops on some island,
00:58:28.540 | and what happens is they look at the way monkeys behave
00:58:33.620 | when they're part of this tribe
00:58:34.480 | and how they treat members of the other tribe of monkeys.
00:58:37.520 | And then what they do, I've forgotten how they do that
00:58:39.560 | exactly, but they end up switching a monkey
00:58:42.100 | so he ends up in the other troop.
00:58:43.140 | And very quickly, they end up becoming a part
00:58:45.240 | of that other troop and hating
00:58:47.280 | and behaving badly towards their original troop.
00:58:50.400 | - These are fascinating studies, by the way.
00:58:52.040 | - Yeah.
00:58:52.880 | - This is beautiful.
00:58:55.080 | In your book, you have a good light bulb joke.
00:59:00.080 | "How many psychiatrists does it take to change a light bulb?
00:59:04.420 | "Only one, but the light bulb has to want to change."
00:59:07.920 | Sorry.
00:59:08.760 | (laughing)
00:59:09.760 | I'm a sucker for a good light bulb joke.
00:59:11.800 | - Okay, so given, I've been interested in psychiatry
00:59:16.800 | my whole life, just maybe tangentially.
00:59:19.160 | Early on, I kind of dreamed of being a psychiatrist,
00:59:22.520 | until I understood what it entails.
00:59:24.460 | But is there hope for psychiatry,
00:59:30.680 | for somebody else to help this live-wired brain to adjust?
00:59:37.000 | - Oh yeah, I mean, in the sense that,
00:59:40.200 | and this has to do with this issue
00:59:41.240 | about us being trapped on our own planet.
00:59:43.280 | Forget psychiatrists, just think of like
00:59:45.920 | when you're talking with a friend and you say,
00:59:47.400 | "Oh, I'm so upset about this."
00:59:48.920 | And your friend says, "Hey, just look at it this way."
00:59:51.620 | All we have access to under normal circumstances
00:59:55.800 | is just the way we're seeing something.
00:59:57.440 | And so it's super helpful to have friends
01:00:01.400 | and communities and psychiatrists and so on
01:00:03.520 | to help things change that way.
01:00:05.800 | So that's how psychiatrists sort of helped us.
01:00:07.240 | But more importantly, the role that psychiatrists
01:00:10.120 | have played is that there's this sort of naive assumption
01:00:13.960 | that we all come to the table with,
01:00:15.160 | which is that everyone is fundamentally just like us.
01:00:18.360 | And when you're a kid, you believe this entirely,
01:00:21.160 | but as you get older and you start realizing,
01:00:22.800 | okay, there's something called schizophrenia
01:00:25.280 | and that's a real thing.
01:00:26.320 | And to be inside that person's head is totally different
01:00:29.120 | than what it is to be inside my head. Or there's psychopathy.
01:00:32.160 | And to be inside the psychopath's head,
01:00:34.960 | he doesn't care about other people.
01:00:36.320 | He doesn't care about hurting other people.
01:00:37.520 | He's just doing what he needs to do to get what he needs.
01:00:40.920 | That's a different head.
01:00:42.120 | There's a million different things going on,
01:00:45.520 | and it is different to be inside those heads.
01:00:47.800 | This is where the field of psychiatry comes in.
01:00:51.320 | Now, I think it's an interesting question
01:00:53.400 | about the degree to which neuroscience is leaking into
01:00:57.040 | and taking over psychiatry
01:00:58.520 | and what the landscape will look like 50 years from now.
01:01:00.880 | It may be that psychiatry as a profession changes a lot
01:01:06.400 | or maybe goes away entirely,
01:01:07.760 | and neuroscience will essentially be able to take over
01:01:10.160 | some of these functions.
01:01:11.000 | But it has been extremely useful to understand
01:01:14.760 | the differences between how people behave and why,
01:01:18.760 | and what you can tell about what's going on
01:01:20.160 | inside their brain,
01:01:21.280 | just based on observation of their behavior.
01:01:23.840 | - This might be years ago, but I'm not sure.
01:01:28.920 | There's an Atlantic article you've written
01:01:31.080 | about moving away from a distinction
01:01:35.040 | between neurological disorders,
01:01:37.440 | quote-unquote brain problems,
01:01:39.600 | and psychiatric disorders or quote-unquote mind problems.
01:01:44.360 | So on that topic, how do you think about this gray area?
01:01:47.720 | - Yeah, this is exactly the evolution
01:01:50.400 | that things are going.
01:01:51.240 | There was psychiatry, and then there were guys and gals
01:01:54.680 | in labs poking cells, and so on.
01:01:56.480 | Those were the neuroscientists.
01:01:57.520 | But yeah, I think these are moving together
01:01:59.400 | for exactly the reason you just cited.
01:02:00.960 | And where this matters a lot,
01:02:03.080 | the Atlantic article that I wrote
01:02:05.160 | was called "The Brain on Trial,"
01:02:07.280 | where this matters a lot is the legal system,
01:02:10.080 | because the way we run our legal system now,
01:02:13.240 | and this is true everywhere in the world,
01:02:14.400 | is someone shows up in front of the judge's bench,
01:02:17.920 | or let's say there's five people
01:02:19.040 | in front of the judge's bench,
01:02:20.560 | and they've all committed the same crime.
01:02:21.880 | What we do, 'cause we feel like, hey, this is fair,
01:02:24.320 | is we say, all right, you're gonna get the same sentence.
01:02:26.040 | You'll all get three years in prison or whatever it is.
01:02:28.240 | But in fact, brains can be so different.
01:02:30.280 | This guy's got schizophrenia, this guy's a psychopath,
01:02:32.080 | this guy's tweaked out on drugs, and so on and so on,
01:02:34.080 | that it actually doesn't make sense to keep doing that.
01:02:38.080 | And what we do in this country
01:02:40.760 | more than anywhere in the world
01:02:42.640 | is we imagine that incarceration
01:02:44.560 | is a one-size-fits-all solution.
01:02:45.960 | And you may know,
01:02:47.720 | America has the highest incarceration rate
01:02:49.560 | in the whole world, in terms of the percentage
01:02:51.320 | of our population we put behind bars.
01:02:53.320 | So there's a much more refined thing we can do
01:02:57.000 | as neuroscience comes in and changes,
01:02:59.640 | and has the opportunity to change the legal system,
01:03:02.360 | which is to say, this doesn't let anybody off the hook.
01:03:04.440 | It doesn't say, oh, it's not your fault, and so on.
01:03:06.520 | But what it does is it changes the equation
01:03:09.820 | so it's not about, hey, how blameworthy are you?
01:03:13.380 | But instead is about, hey, what do we do from here?
01:03:15.660 | What's the best thing to do from here?
01:03:16.720 | So if you take somebody with schizophrenia
01:03:18.160 | and you have them break rocks in the hot summer sun
01:03:21.560 | in a chain gang, that doesn't help their schizophrenia.
01:03:24.960 | That doesn't fix the problem.
01:03:27.580 | If you take somebody with a drug addiction
01:03:29.740 | who's in jail for being caught
01:03:31.780 | with two ounces of some illegal substance,
01:03:33.880 | and you put them in prison,
01:03:36.140 | it doesn't actually fix the addiction.
01:03:37.700 | It doesn't help anything.
01:03:38.940 | Happily, what neuroscience and psychiatry
01:03:42.380 | bring to the table is lots of really useful things
01:03:44.940 | you can do with schizophrenia, with drug addiction,
01:03:47.260 | things like this.
01:03:48.480 | And that's why, so I don't know if you know this,
01:03:50.820 | but I run a national nonprofit
01:03:52.160 | called the Center for Science and Law.
01:03:53.820 | And it's all about this intersection
01:03:55.340 | of neuroscience and the legal system.
01:03:57.380 | And we're trying to implement changes
01:03:59.060 | in every county, in every state.
01:04:01.660 | I'll just, without going down that rabbit hole,
01:04:04.700 | I'll just say one of the very simplest things to do
01:04:07.540 | is to set up specialized court systems
01:04:09.140 | where you have a mental health court
01:04:12.660 | that has judges and juries with expertise
01:04:14.980 | in mental illness.
01:04:15.800 | Because if you go, by the way, to a regular court
01:04:17.780 | and the person says, or the defense lawyer says,
01:04:21.220 | this person has schizophrenia,
01:04:22.880 | most of the jury will say, meh, I call bullshit on that.
01:04:26.600 | Because they don't know about schizophrenia.
01:04:28.140 | They don't know what it's about.
01:04:30.780 | And it turns out people who know about schizophrenia
01:04:34.500 | feel very differently as a juror
01:04:35.940 | than someone who happens not to know
01:04:37.460 | anything about schizophrenia.
01:04:38.300 | They think it's an excuse.
01:04:39.560 | So you have judges and juries
01:04:41.780 | with expertise in mental illness,
01:04:43.040 | and they know the rehabilitative strategies
01:04:44.780 | that are available.
01:04:45.780 | That's one thing.
01:04:46.620 | Having a drug court where you have judges and juries
01:04:48.540 | with expertise in rehabilitative strategies
01:04:50.580 | and what can be done and so on.
01:04:51.900 | A specialized prostitution court and so on.
01:04:54.180 | All these different things.
01:04:55.740 | By the way, this is very easy for counties
01:04:57.640 | to implement this sort of thing.
01:04:59.300 | And this is, I think, where this matters
01:05:01.860 | to get neuroscience into public policy.
01:05:05.080 | - What's the process of injecting expertise into this?
01:05:08.660 | - Yeah, I'll tell you exactly what it is.
01:05:10.580 | A county needs to run out of money first.
01:05:12.460 | I've seen this happen over and over.
01:05:14.500 | So what happens is a county has a completely full jail
01:05:17.240 | and they say, you know what?
01:05:18.580 | We need to build another jail.
01:05:19.820 | And then they realize, God, we don't have any money.
01:05:21.260 | We can't afford this.
01:05:22.300 | We've got too many people in jail.
01:05:23.460 | And that's when they turn to, God, we need something smarter.
01:05:26.700 | And that's when they set up specialized court systems.
01:05:28.900 | (laughing)
01:05:30.940 | - We all function best when our back is against the wall.
01:05:34.300 | - And that's what COVID is good for.
01:05:36.180 | It's because we've all had our routines
01:05:38.380 | and we are optimized for the things we do.
01:05:40.820 | And suddenly our backs are against the wall, all of us.
01:05:43.020 | - Yeah, it's really, I mean,
01:05:44.620 | one of the exciting things about COVID.
01:05:47.580 | I mean, I'm a big believer in the possibility
01:05:51.940 | of what government can do for the people.
01:05:56.220 | And when it becomes too big of a bureaucracy,
01:05:59.220 | it starts functioning poorly, it starts wasting money.
01:06:02.620 | It's nice to, I mean, COVID reveals that nicely.
01:06:07.180 | And lessons to be learned about who gets elected
01:06:11.740 | and who goes into government.
01:06:14.220 | Hopefully this inspires talented young people
01:06:19.740 | to go into government,
01:06:20.860 | to revolutionize different aspects of it.
01:06:23.660 | Yeah, so that's the positive silver lining of COVID.
01:06:28.660 | I mean, I thought it'd be fun to ask you,
01:06:30.820 | I don't know if you're paying attention
01:06:31.980 | to the machine learning world and GPT-3.
01:06:34.700 | So the GPT-3 is this language model,
01:06:39.340 | this neural network that's able to,
01:06:41.260 | it has 175 billion parameters.
01:06:44.940 | So it's very large, and it's trained
01:06:47.900 | in an unsupervised way on the internet.
01:06:51.580 | It just reads a lot of unstructured texts
01:06:55.860 | and it's able to generate some pretty impressive things.
01:06:59.460 | The human brain compared to that
01:07:01.180 | has about a thousand times more synapses.
01:07:06.020 | People get so upset when machine learning people
01:07:10.100 | compare these networks to the brain.
01:07:12.060 | And we know synapses are different.
01:07:14.420 | It's very different, very different.
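For scale, here is a rough comparison of the two numbers mentioned, reusing the half-quadrillion connection estimate from earlier in the conversation; this is an order-of-magnitude illustration, not a precise count:

```python
# Order-of-magnitude comparison; both figures are rough estimates from the conversation.
gpt3_parameters = 175e9     # 175 billion parameters
brain_connections = 4.3e14  # ~half a quadrillion connections (earlier estimate)

ratio = brain_connections / gpt3_parameters
print(f"brain has ~{ratio:,.0f}x more connections than GPT-3 has parameters")
# ~2,457x, i.e. the same ballpark as "about a thousand times more"
```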
01:07:16.900 | - But like, what do you think about GPT-3?
01:07:20.580 | - Here's what I think, here's what I think.
01:07:21.900 | A few things.
01:07:22.740 | What GPT-3 is doing is extremely impressive,
01:07:25.580 | but it's very different from what the brain does.
01:07:27.620 | So it's a good impersonator, but just as one example,
01:07:32.620 | everybody takes a passage that GPT-3 has written
01:07:37.500 | and they say, wow, look at this, and it's pretty good, right?
01:07:40.420 | But it's already gone through a filtering process
01:07:42.420 | of humans looking at it and saying,
01:07:43.740 | okay, well, that's crap, that's crap.
01:07:45.340 | Oh, here's a sentence that's pretty cool.
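The "filtering process" being described is easy to picture in code: generate many continuations, then have a human keep only the ones that read well. Below is a minimal sketch using the openly available GPT-2 model from the Hugging Face transformers library as a stand-in for GPT-3 (which is only served through OpenAI's hosted API); the prompt and sample count are arbitrary illustration values:

```python
# Sketch of the generate-many, cherry-pick-a-few workflow described above.
# GPT-2 stands in for GPT-3; the prompt and counts are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

candidates = generator(
    "The brain is livewired, which means",
    max_new_tokens=40,
    num_return_sequences=10,  # produce many candidate continuations
    do_sample=True,
)

# The "filter" is a human reading all of these and sharing only the one or
# two that happen to read well; the rest are never shown to anyone.
for i, candidate in enumerate(candidates):
    print(f"[{i}] {candidate['generated_text']}\n")
```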
01:07:47.500 | Now here's the thing.
01:07:48.860 | Human creativity is about absorbing everything around it
01:07:51.740 | and remixing that and coming up with stuff.
01:07:53.420 | So in that sense, we're sort of like GPT-3,
01:07:56.060 | we're remixing what we've gotten in before.
01:07:58.820 | But we also know, we also have very good models
01:08:03.140 | of what it is to be another human.
01:08:04.780 | And so, I don't know if you speak French or something,
01:08:08.500 | but I'm not gonna start speaking in French
01:08:09.820 | 'cause then you'll say, wait, what are you doing?
01:08:11.500 | I don't understand it.
01:08:12.420 | Instead, everything coming out of my mouth
01:08:14.700 | is meant for your ears.
01:08:16.260 | I know what you'll understand.
01:08:18.140 | I know the vocabulary that you know and don't know.
01:08:20.140 | I know what parts you care about.
01:08:21.860 | That's a huge part of it.
01:08:25.140 | And so, of all the possible sentences I could say,
01:08:28.560 | I'm navigating this thin bandwidth
01:08:31.700 | so that it's something useful for our conversation.
01:08:34.540 | - Yeah, in real time, but also throughout your life.
01:08:36.740 | I mean, we're co-evolving together.
01:08:39.740 | We're learning how to communicate together.
01:08:42.980 | - Exactly, but this is what GPT-3 does not do.
01:08:46.220 | All it's doing is saying, okay,
01:08:47.300 | I'm gonna take all these sentences
01:08:48.580 | and remix stuff and pop some stuff out.
01:08:51.100 | But it doesn't know how to make it so that you, Lex,
01:08:53.660 | will feel like, oh yeah,
01:08:54.580 | that's exactly what I needed to hear.
01:08:56.800 | That's the next sentence that I needed
01:08:58.940 | to know about for something.
01:09:00.420 | - Well, of course, it could be all the impressive results.
01:09:03.860 | We'll see.
01:09:04.700 | The question is, if you raise the number of parameters,
01:09:07.720 | whether it's going to be able to, after some--
01:09:09.980 | - It will not be.
01:09:11.060 | It will not be.
01:09:11.900 | Raising more parameters won't, here's the thing.
01:09:15.180 | It's not that I think neural networks
01:09:16.760 | can't be like the human brain,
01:09:18.140 | 'cause I suspect they will be at some point, 50 years,
01:09:20.300 | you know, who knows?
01:09:21.140 | But what we are missing in artificial neural networks
01:09:26.060 | is we've got this basic structure
01:09:28.300 | where you've got units and you've got synapses
01:09:30.780 | and they're connected.
01:09:32.460 | And that's great.
01:09:33.420 | And it's done incredibly mind-blowing, impressive things,
01:09:35.860 | but it's not doing the same algorithms as the human brain.
01:09:40.300 | So when I look at my children as little kids,
01:09:43.660 | you know, as infants,
01:09:44.620 | they can do things that no GPT-3 can do.
01:09:47.700 | They can navigate a complex room.
01:09:50.660 | They can navigate social conversation with an adult.
01:09:54.940 | They can lie.
01:09:55.780 | They can do a million things.
01:09:58.220 | They are active thinkers in our world and doing things.
01:10:03.220 | And this, of course, I mean, look,
01:10:04.640 | we totally agree on how incredibly awesome
01:10:07.860 | artificial neural networks are right now,
01:10:09.100 | but we also know the things that they can't do well,
01:10:12.820 | like, you know, like be generally intelligent,
01:10:14.980 | do all these different things.
01:10:16.420 | - Reason about the world, efficiently learn,
01:10:18.740 | efficiently adapt.
01:10:19.940 | - Exactly.
01:10:20.780 | - But it's still the rate of improvement.
01:10:23.580 | It's, to me, it's possible that we'll be surprised.
01:10:28.140 | - I agree.
01:10:28.980 | It's possible we'll be surprised.
01:10:30.100 | But what I would assert,
01:10:33.140 | and I'm glad I'm getting to say this on your podcast
01:10:36.220 | so we can look back at this in two years and 10 years,
01:10:38.380 | is that we've got to be much more sophisticated
01:10:41.620 | than units and synapses between them.
01:10:44.780 | Let me give you an example,
01:10:45.700 | and this is something I talk about in Livewired,
01:10:47.300 | which is despite the amazing impressiveness,
01:10:50.580 | mind-blowing impressiveness,
01:10:52.420 | computers don't have some basic things.
01:10:54.620 | Artificial neural networks don't have some basic things
01:10:56.580 | that we like, caring about relevance, for example.
01:10:59.500 | So as humans, we are confronted
01:11:02.420 | with tons of data all the time,
01:11:03.940 | and we only encode particular things
01:11:05.660 | that are relevant to us.
01:11:07.780 | We have this very deep sense of relevance
01:11:10.100 | that I mentioned earlier is based on survival
01:11:11.940 | at the most basic level,
01:11:12.780 | but then all the things about my life and your life,
01:11:16.460 | what's relevant to you, that we encode.
01:11:18.800 | This is very useful.
01:11:20.540 | Computers at the moment don't have that.
01:11:22.020 | They don't even have a yen to survive and things like that.
01:11:24.740 | - So we filter out a bunch of the junk we don't need.
01:11:27.540 | We're really good at efficiently
01:11:29.940 | zooming in on things we need.
01:11:31.500 | Again, it could be argued, you know,
01:11:34.540 | let me put on my Freud hat,
01:11:37.540 | maybe that's our conscious mind.
01:11:39.640 | There's no reason that neural networks
01:11:44.060 | aren't doing the same kind of filtration.
01:11:46.140 | I mean, in a sense that's what GPT-3 is doing,
01:11:48.500 | so there's a priming step.
01:11:50.820 | It's doing an essential kind of filtration
01:11:53.400 | when you ask it to generate tweets from,
01:11:58.100 | I don't know, from an Elon Musk or something like that.
01:12:00.760 | It's doing a filtration, it's throwing away
01:12:04.060 | all the parameters it doesn't need for this task.
01:12:06.900 | And it's figuring out how to do that successfully.
01:12:09.700 | And then ultimately it's not doing a very good job right now
01:12:12.700 | but it's doing a lot better job than we expected.
01:12:15.460 | - But it won't ever do a really good job.
01:12:17.500 | And I'll tell you why.
01:12:18.340 | I mean, so let's say we say,
01:12:20.100 | hey, produce an Elon Musk tweet.
01:12:21.660 | And we see like, oh, wow, it produced these three.
01:12:23.980 | That's great.
01:12:24.820 | But again, we're not seeing the 3000
01:12:26.820 | that it produced that didn't really make any sense.
01:12:28.940 | It's because it has no idea what it is like to be a human.
01:12:32.740 | And all the things that you might want to say
01:12:34.540 | and all the reasons you wouldn't,
01:12:35.380 | like when you go to write a tweet,
01:12:37.140 | you might write something you think,
01:12:37.980 | ah, it's not gonna come off quite right
01:12:39.860 | in this modern political climate or whatever.
01:12:41.580 | Like, you know, you change things.
01:12:44.180 | - And it somehow boils down to fear of mortality
01:12:46.700 | and all of these human things at the end of the day,
01:12:49.980 | all contained with that tweeting experience.
01:12:52.540 | (laughing)
01:12:53.740 | - Well, interestingly, the fear of mortality
01:12:55.540 | is at the bottom of this,
01:12:56.880 | but you've got all these more things like, you know,
01:12:59.480 | oh, I want to, just in case the chairman
01:13:02.300 | of my department reads this, I want it to come off well there
01:13:04.280 | just in case my mom looks at this tweet,
01:13:05.760 | I want to make sure she, you know, and so on.
01:13:08.240 | - So those are all the things that humans are able to
01:13:10.600 | sort of throw into the calculation.
01:13:12.400 | I mean-
01:13:14.880 | - What it requires though is having a model of your chairman,
01:13:18.760 | having a model of your mother, having a model of, you know,
01:13:22.080 | the person you want to go on a date with
01:13:24.540 | who might look at your tweet and so on.
01:13:26.040 | All these things are, you're running models
01:13:28.240 | of what it is like to be them.
01:13:31.020 | So in terms of the structure of the brain,
01:13:34.560 | again, this may be going into speculation land,
01:13:37.180 | I hope you go along with me.
01:13:39.040 | (laughing)
01:13:40.740 | Is, okay, so the brain seems to be intelligent
01:13:45.240 | and our AI systems aren't very intelligent currently.
01:13:48.400 | So where do you think intelligence arises in the brain?
01:13:52.880 | Like what is it about the brain?
01:13:55.680 | - So if you mean where, location-wise, there's no single spot.
01:13:59.920 | It would be equivalent to asking,
01:14:02.000 | I'm looking at New York City, where is the economy?
01:14:06.520 | The answer is you can't point to anywhere.
01:14:08.180 | The economy is all about the interaction
01:14:09.980 | of all of the pieces and parts of the city.
01:14:12.200 | And that's what, you know, intelligence,
01:14:14.160 | whatever we mean by that in the brain,
01:14:15.520 | comes from the interaction of everything going on at once.
01:14:18.200 | - In terms of a structure, so we look,
01:14:20.000 | humans are much smarter than fish, maybe not dolphins,
01:14:24.840 | but dolphins are mammals, right?
01:14:26.720 | - I assert that what we mean by smarter
01:14:28.560 | has to do with live wiring.
01:14:30.000 | So what we mean when we say, oh, we're smarter,
01:14:32.160 | is oh, we can figure out a new thing
01:14:33.680 | and figure out a new pathway to get where we need to go.
01:14:36.720 | And that's because fish are essentially coming to the table
01:14:39.640 | with, you know, okay, here's the hardware,
01:14:41.480 | go, swim, mate, eat.
01:14:43.680 | But we have the capacity to say,
01:14:46.040 | okay, look, I'm gonna absorb, oh, oh,
01:14:47.560 | but you know, I saw someone else do this thing
01:14:49.200 | and I read once that you could do this other thing
01:14:51.960 | and so on.
01:14:52.800 | - So do you think there's, is there something,
01:14:54.600 | I know these are mysteries,
01:14:56.680 | but like architecturally speaking,
01:15:00.120 | what feature of the brain of the live wire aspect of it
01:15:05.120 | that is really useful for intelligence?
01:15:08.120 | So like, is it the ability of neurons to reconnect?
01:15:13.120 | Like, is there something,
01:15:16.160 | is there any lessons about the human brain
01:15:18.160 | you think might be inspiring for us
01:15:22.000 | to take into the artificial,
01:15:24.640 | into the machine learning world?
01:15:26.880 | - Yeah, I'm actually just trying to write something up
01:15:29.080 | on this now called, you know,
01:15:30.080 | if you want to build a robot, start with the stomach.
01:15:32.480 | And what I mean by that, what I mean by that is
01:15:35.400 | a robot has to care, it has to have hunger,
01:15:37.240 | it has to care about surviving, that kind of thing.
01:15:40.360 | Here's an example.
01:15:41.280 | So the penultimate chapter of my book,
01:15:44.160 | I titled "The Wolf and the Mars Rover."
01:15:46.600 | And I just look at this simple comparison
01:15:48.880 | of you look at a wolf,
01:15:51.160 | it gets its leg caught in a trap.
01:15:52.560 | What does it do?
01:15:53.400 | It gnaws its leg off,
01:15:55.480 | and then it figures out how to walk on three legs.
01:15:58.240 | No problem.
01:15:59.080 | Now the Mars Rover, Curiosity,
01:16:00.600 | got its front wheel stuck in some Martian soil,
01:16:03.960 | and it died.
01:16:05.200 | This project that cost billions of dollars
01:16:08.680 | died 'cause of its wheels.
01:16:09.960 | Wouldn't it be terrific if we could build a robot
01:16:12.760 | that chewed off its front wheel
01:16:14.600 | and figured out how to operate
01:16:15.920 | with a slightly different body plan?
01:16:17.800 | That's the kind of thing that we want to be able to build
01:16:21.040 | and to get there, what we need,
01:16:23.280 | the whole reason the wolf is able to do that
01:16:25.120 | is because its motor and somatosensory systems
01:16:27.920 | are live wired.
01:16:28.760 | So it says, "Oh, you know what?
01:16:29.920 | "Turns out we've got a body plan that's different
01:16:31.560 | "than what I thought a few minutes ago,
01:16:34.060 | "but I have a yen to survive,
01:16:37.500 | "and I care about relevance,"
01:16:38.800 | which in this case is getting to food,
01:16:40.680 | getting back to my pack and so on.
01:16:42.480 | "So I'm just gonna figure out how to operate with this.
01:16:44.560 | "Oh, whoops, that didn't work.
01:16:45.940 | "Oh, okay, I'm kind of getting it to work."
01:16:48.520 | But the Mars Rover doesn't do that.
01:16:49.920 | It just says, "Oh, geez, I was pre-programmed
01:16:51.560 | "to have four wheels, now I have three, I'm screwed."
01:16:54.040 | - Yeah, I don't know if you're familiar
01:16:55.920 | with a philosopher named Ernest Becker.
01:16:58.200 | He wrote a book called "The Denial of Death,"
01:17:00.840 | and there's a few psychologists, Sheldon Solomon,
01:17:03.520 | I think I just spoke with him on this podcast,
01:17:06.340 | who developed terror management theory,
01:17:09.940 | which is, like, Ernest Becker is a philosopher
01:17:12.960 | that basically said that fear of mortality
01:17:17.240 | is at the core of it.
01:17:18.200 | - Yeah.
01:17:19.040 | - And so, I don't know, it sounds compelling as an idea
01:17:23.160 | that we're all, I mean, that all of the civilization
01:17:25.800 | we've constructed is based on this, but it's--
01:17:29.120 | - I'm familiar with his work.
01:17:30.320 | Here's what I think.
01:17:31.240 | I think that, yes, fundamentally, this desire to survive
01:17:35.160 | is at the core of it, I would agree with that,
01:17:37.400 | but how that expresses itself in your life
01:17:40.680 | ends up being very different.
01:17:41.720 | The reason you do what you do is, I mean,
01:17:45.300 | you could list the hundred reasons
01:17:47.260 | why you chose to write your tweet this way and that way,
01:17:49.280 | and it really has nothing to do with the survival part.
01:17:51.320 | It has to do with trying to impress fellow humans
01:17:53.500 | and surprise them and say something.
01:17:55.240 | - Yeah, so many things built on top of each other.
01:17:56.880 | - Yeah, exactly.
01:17:57.720 | - But it's fascinating to think that
01:17:59.240 | in artificial intelligence systems,
01:18:00.860 | we wanna be able to somehow engineer this drive
01:18:05.360 | for survival, for immortality.
01:18:08.280 | I mean, because as humans, we're not just about survival.
01:18:11.440 | We're aware of the fact that we're going to die,
01:18:14.680 | which is a very kind of, we're aware of, like--
01:18:16.920 | - Most people aren't, by the way.
01:18:18.460 | - Aren't?
01:18:19.300 | - Aren't.
01:18:20.120 | Confucius said, he said, "Each person has two lives.
01:18:25.120 | "The second one begins when you realize
01:18:27.960 | "that you have just one."
01:18:29.480 | - Yeah.
01:18:30.320 | - But most people, it takes a long time
01:18:31.320 | for most people to get there.
01:18:32.680 | - I mean, you could argue this kind of Freudian thing,
01:18:34.640 | which Ernest Becker argues is they actually figured it out
01:18:39.640 | early on, and the terror they felt was like
01:18:45.360 | the reason it's been suppressed,
01:18:47.480 | and the reason most people, when I ask them about
01:18:49.520 | whether they're afraid of death, they basically say no.
01:18:53.080 | They basically say, like, "I'm afraid I won't get to,
01:18:56.820 | "like, submit the paper before I die."
01:18:59.720 | Like, they kind of see, they see death
01:19:01.840 | as a kind of inconvenient deadline
01:19:04.760 | for a particular set of, like, a book you're writing.
01:19:08.160 | As opposed to, like, what the hell, this thing ends.
01:19:12.360 | It's like, at any moment, like, most people,
01:19:16.200 | as I've encountered, do not meditate on the idea
01:19:18.840 | that, like, right now you could die.
01:19:21.680 | Like, right now.
01:19:22.840 | In the next five minutes, it could be all over.
01:19:27.840 | And, you know, meditate on that idea.
01:19:29.900 | I think that somehow brings you closer to, like,
01:19:33.880 | the core of the motivations,
01:19:36.560 | and the core of the human cognition.
01:19:39.480 | - I think it might be the core, but like I said,
01:19:41.360 | it is not what drives us day to day.
01:19:43.840 | Yeah, there's so many things on top of it.
01:19:45.560 | But it is interesting.
01:19:46.400 | I mean, as the ancient poet said,
01:19:48.820 | "Death whispers at my ear, live, for I come."
01:19:53.380 | So it is certainly motivating when we think about that.
01:19:58.080 | Okay, I've got some deadline.
01:19:59.240 | I don't know exactly what it is,
01:20:00.360 | but I better make stuff happen.
01:20:02.180 | It is motivating, but I don't think,
01:20:04.280 | I mean, I know for, just speaking for me personally,
01:20:06.760 | that's not what motivates me day to day.
01:20:08.880 | It's instead, oh, I want to get this, you know,
01:20:13.360 | program up and running before this,
01:20:14.760 | or I want to make sure my coauthor isn't mad at me
01:20:17.280 | because I haven't gotten this in,
01:20:18.200 | or I don't want to miss this grant deadline,
01:20:19.600 | or, you know, whatever the thing is.
01:20:21.160 | - Yeah, it's too distant in a sense.
01:20:23.820 | Nevertheless, it is good to reconnect.
01:20:26.520 | But for the AI systems, none of that is there.
01:20:30.240 | Like a neural network does not fear its mortality.
01:20:33.880 | And that seems to be somehow fundamentally
01:20:37.900 | missing the point.
01:20:39.680 | - I think that's missing the point,
01:20:40.720 | but I wonder, it's an interesting speculation
01:20:42.480 | about whether you can build an AI system
01:20:43.740 | that is much closer to being a human
01:20:45.820 | without the mortality and survival piece,
01:20:48.440 | but just the thing of relevance.
01:20:51.200 | Just, I care about this versus that.
01:20:52.780 | Right now, if you have a robot roll into the room,
01:20:54.840 | it's gonna be frozen, 'cause it doesn't have any reason
01:20:56.720 | to go there versus there.
01:20:57.720 | It doesn't have any particular set of things
01:21:02.600 | about this is how I should navigate my next move
01:21:05.640 | because I want something.
01:21:07.760 | - Yeah, the thing about humans
01:21:10.940 | is they seem to generate goals.
01:21:13.800 | They're like, you said live-wired.
01:21:15.780 | I mean, it's very flexible in terms of the goals
01:21:20.000 | and creative in terms of the goals we generate
01:21:21.820 | when we enter a room.
01:21:22.920 | You show up to a party without a goal usually,
01:21:26.760 | and then you figure it out along the way.
01:21:27.960 | - Yes, but this goes back to the question about free will,
01:21:30.020 | which is when I walk into the party,
01:21:31.980 | if you rewound it 10,000 times,
01:21:35.640 | would I go and talk to that couple over there
01:21:38.000 | versus that person?
01:21:39.080 | I might do this exact same thing every time
01:21:41.720 | because I've got some goal stack,
01:21:43.720 | and I think, okay, well, at this party,
01:21:45.840 | I really wanna meet these kind of people,
01:21:47.680 | or I feel awkward, or whatever my goals are.
01:21:51.360 | By the way, so there was something
01:21:52.600 | that I meant to mention earlier,
01:21:54.560 | if you don't mind going back,
01:21:56.120 | which is this, when we were talking about BCI.
01:21:58.420 | So I don't know if you know this,
01:22:00.240 | but what I'm spending 90% of my time doing now
01:22:02.600 | is running a company.
01:22:03.800 | Do you know about this?
01:22:04.640 | - Yes, I wasn't sure what the company is involved in.
01:22:08.120 | - Right, so-- - Can you talk about it?
01:22:09.440 | - Yeah, yeah.
01:22:10.840 | So when it comes to the future of BCI,
01:22:13.040 | you can put stuff into the brain invasively,
01:22:18.880 | but my interest has been how you can get data streams
01:22:22.120 | into the brain non-invasively.
01:22:24.220 | So I run a company called Neosensory,
01:22:26.400 | and what we build is this little wristband.
01:22:29.680 | We've built this in many different form factors.
01:22:30.640 | - Oh, wow, that's it?
01:22:31.960 | - Yeah, this is it.
01:22:32.800 | And it's got these vibratory motors in it.
01:22:35.400 | So these things, as I'm speaking, for example,
01:22:38.240 | it's capturing my voice and running algorithms
01:22:41.040 | and then turning that into patterns of vibration here.
01:22:44.480 | So people who are deaf, for example,
01:22:48.760 | learn to hear through their skin.
01:22:50.760 | So the information is getting up to their brain this way,
01:22:54.280 | and they learn how to hear.
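To make the sound-to-vibration idea concrete, here is a minimal sketch of one plausible mapping: split each audio frame's spectrum into a few frequency bands and drive one vibration motor per band. This is an illustration only, not Neosensory's actual algorithm; the sample rate, frame size, and motor count are assumptions:

```python
# Illustrative sensory-substitution mapping: one audio frame -> motor intensities.
# Not Neosensory's actual algorithm; all constants here are assumptions.
import numpy as np

SAMPLE_RATE = 16_000  # Hz, assumed microphone rate
FRAME_SIZE = 512      # samples per frame (~32 ms)
NUM_MOTORS = 4        # assume a handful of vibratory motors on the band

def frame_to_motor_levels(frame: np.ndarray) -> np.ndarray:
    """Map one audio frame to per-motor vibration intensities in [0, 1]."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    bands = np.array_split(spectrum, NUM_MOTORS)    # low to high frequency
    energy = np.array([band.mean() for band in bands])
    peak = energy.max()
    return energy / peak if peak > 0 else energy    # normalize to [0, 1]

# Example: a 440 Hz tone should mostly drive the lowest-frequency motor.
t = np.arange(FRAME_SIZE) / SAMPLE_RATE
print(frame_to_motor_levels(np.sin(2 * np.pi * 440 * t)))
```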
01:22:55.840 | So it turns out on day one, people are pretty good,
01:22:58.200 | better than you'd expect at being able to say,
01:23:00.520 | "Oh, that's weird.
01:23:01.360 | "Was that a dog barking?
01:23:02.500 | "Was that a baby crying?
01:23:03.340 | "Was that a door knock, a doorbell?"
01:23:05.040 | People are pretty good at it.
01:23:06.520 | But with time, they get better and better,
01:23:09.280 | and what it becomes is a new qualia,
01:23:12.400 | in other words, a new subjective internal experience.
01:23:15.400 | So on day one, they say, "Whoa, what was that?
01:23:18.360 | "Oh, that was the dog barking."
01:23:20.640 | But by three months later, they say,
01:23:23.360 | "Oh, there's a dog barking somewhere.
01:23:24.500 | "Oh, there's the dog."
01:23:25.560 | - That's fascinating.
01:23:26.400 | - And by the way, that's exactly how you learn
01:23:27.720 | how to use your ears.
01:23:29.400 | So what you, of course, you don't remember this,
01:23:30.560 | but when you were an infant, all you have are,
01:23:33.120 | your eardrum vibrating causes spikes to go down
01:23:36.560 | your auditory nerves and impinge on your auditory cortex.
01:23:39.980 | Your brain doesn't know what those mean automatically,
01:23:43.080 | but what happens is you learn how to hear
01:23:44.840 | by looking for correlations.
01:23:46.480 | You clap your hands as a baby,
01:23:48.920 | you look at your mother's mouth moving,
01:23:50.880 | and that correlates with what's going on there.
01:23:53.200 | And eventually, your brain says,
01:23:54.700 | "All right, I'm just gonna summarize this
01:23:55.880 | "as an internal experience, as a conscious experience."
01:23:59.440 | And that's exactly what happens here.
01:24:01.320 | The weird part is that you can feed data into the brain,
01:24:04.200 | not through the ears, but through any channel
01:24:06.340 | that gets there.
01:24:07.280 | As long as the information gets there,
01:24:08.600 | your brain figures out what to do with it.
01:24:10.360 | - That's fascinating.
01:24:11.440 | Like expanding the set of sensors,
01:24:14.840 | and it could be arbitrarily,
01:24:17.220 | could expand arbitrarily, which is fascinating.
01:24:21.320 | - Well, exactly.
01:24:22.160 | And by the way, the reason I use this skin,
01:24:24.600 | there's all kinds of cool stuff going on
01:24:26.400 | in the AR world with glasses and with it.
01:24:28.240 | But the fact is your eyes are overtaxed
01:24:30.000 | and your ears are overtaxed,
01:24:30.960 | and you need to be able to see and hear other stuff.
01:24:33.480 | But you're covered with the skin,
01:24:34.760 | which is this incredible computational material
01:24:38.200 | with which you can feed information.
01:24:39.760 | And we don't use our skin for much of anything nowadays.
01:24:42.560 | My joke in the lab is that I say
01:24:44.760 | we don't call this the "waist" for nothing,
01:24:46.240 | 'cause originally we built this as the vest,
01:24:47.760 | and you're passing in all this information that way.
01:24:51.400 | And what I'm doing here with the deaf community
01:24:56.800 | is what's called sensory substitution,
01:24:59.440 | where I'm capturing sound and,
01:25:01.480 | I'm just replacing the ears with the skin, and that works.
01:25:04.760 | One of the things I talk about in "Livewired"
01:25:07.520 | is sensory expansion.
01:25:09.920 | So what if you took something like your visual system,
01:25:12.400 | which picks up on a very thin slice
01:25:13.880 | of the electromagnetic spectrum,
01:25:15.680 | and you could see infrared or ultraviolet.
01:25:18.840 | So we've hooked that up, infrared and ultraviolet detectors,
01:25:21.480 | and I can feel what's going on.
01:25:23.180 | So just as an example,
01:25:24.020 | the first night I built the infrared detector,
01:25:25.640 | well, one of my engineers built it,
01:25:27.840 | I was walking in the dark between two houses,
01:25:29.840 | and suddenly I felt all this infrared radiation.
01:25:32.160 | I was like, where does that come from?
01:25:33.000 | And I just followed my wrist,
01:25:34.240 | and I found an infrared camera, a night vision camera.
01:25:37.200 | But I immediately, oh, there's that thing there.
01:25:41.280 | Of course I would have never seen it,
01:25:42.520 | but now it's just part of my reality.
01:25:45.720 | - That's fascinating.
01:25:46.560 | - Yeah, and then of course,
01:25:47.440 | what I'm really interested in is sensory addition.
01:25:50.300 | What if you could pick up on stuff
01:25:52.080 | that isn't even part of what we normally pick up
01:25:55.360 | on like the magnetic field of the earth,
01:25:58.080 | or Twitter, or stock market, or things like that.
01:26:01.120 | - Or the, I don't know, some weird stuff,
01:26:02.720 | like the moods of other people or something like that.
01:26:04.720 | - Sure, now what you need is a way to measure that.
01:26:06.960 | So as long as there's a machine that can measure it,
01:26:08.640 | it's easy, it's trivial to feed this in here,
01:26:10.440 | and it comes to be part of your reality.
01:26:14.720 | It's like you have another sensor.
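A rough sketch of the "if a machine can measure it, you can feed it in" point: any scalar feed, a magnetometer reading, a stock ticker, whatever, can be normalized against its own recent history and sent to the wristband as a vibration intensity. The feed values and the send_to_wristband stub below are hypothetical placeholders, not a real device API.

```python
# Sketch of "if a machine can measure it, you can feed it in": normalize
# any scalar feed into a vibration intensity. The readings and the
# send_to_wristband stub are hypothetical placeholders, not a real API.
from collections import deque

class SenseFeed:
    def __init__(self, window: int = 100):
        self.history = deque(maxlen=window)  # recent readings used for normalization

    def to_intensity(self, reading: float) -> float:
        """Map a raw reading onto [0, 1] relative to its recent range."""
        self.history.append(reading)
        lo, hi = min(self.history), max(self.history)
        return 0.5 if hi == lo else (reading - lo) / (hi - lo)

def send_to_wristband(intensity: float) -> None:
    print(f"vibrate at {intensity:.2f}")  # stand-in for real hardware output

feed = SenseFeed()
for reading in [48.0, 52.0, 51.0, 60.0, 45.0]:  # e.g. fake magnetometer values
    send_to_wristband(feed.to_intensity(reading))
```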
01:26:16.640 | - And that kind of thing is without,
01:26:19.360 | like if you look at Neuralink,
01:26:21.240 | I forgot how you put it, but it was eloquent,
01:26:24.160 | without cutting into the brain, basically.
01:26:26.560 | - Yeah, exactly, exactly.
01:26:27.720 | So this costs, at the moment, $399.
01:26:31.080 | - That's not gonna kill you.
01:26:31.920 | - Yeah, it's not gonna kill you.
01:26:33.800 | You just put it on, and when you're done, you take it off.
01:26:36.920 | Yeah, and so, and the name of the company, by the way,
01:26:39.560 | is Neosensory for new senses,
01:26:41.680 | because the whole idea is--
01:26:43.080 | - Beautiful.
01:26:43.920 | - You can, as I said, you come to the table
01:26:46.320 | with certain plug and play devices, and then that's it.
01:26:48.460 | Like I can pick up on this little bit
01:26:49.560 | of the electromagnetic radiation,
01:26:50.560 | I can pick up on this little frequency band
01:26:53.800 | for hearing and so on, but I'm stuck there,
01:26:56.600 | and there's no reason we have to be stuck there.
01:26:58.220 | We can expand our umwelt by adding new senses, yeah.
01:27:02.120 | - What's umwelt?
01:27:03.080 | - Oh, I'm sorry, the umwelt is the slice of reality
01:27:06.120 | that you pick up on.
01:27:06.960 | So each animal has its own--
01:27:08.480 | - Hell of a word.
01:27:09.360 | - Umwelt, yeah, exactly.
01:27:10.920 | - Nice.
01:27:11.760 | - I'm sorry, I forgot to define it before.
01:27:12.840 | It's such an important concept, which is to say,
01:27:16.000 | for example, if you are a tick,
01:27:20.160 | you pick up on butyric acid, an odor,
01:27:23.200 | and you pick up on temperature, that's it.
01:27:24.440 | That's how you construct your reality,
01:27:26.120 | with those two sensors.
01:27:27.240 | If you are a blind echolocating bat,
01:27:29.200 | you're picking up on air compression waves coming back,
01:27:31.880 | you know, echolocation.
01:27:32.920 | If you are the black ghost knifefish,
01:27:35.200 | you're picking up on changes in the electrical field
01:27:38.360 | around you with electroreception.
01:27:40.800 | That's how they swim around and tell
01:27:42.080 | there's a rock there and so on.
01:27:43.400 | But that's all they pick up on.
01:27:45.440 | That's their umwelt.
01:27:46.880 | That's the signals they get from the world
01:27:50.160 | from which to construct their reality.
01:27:51.640 | And they can be totally different umwelts.
01:27:53.880 | - That's fascinating.
01:27:54.720 | - And so our human umwelt is, you know,
01:27:57.800 | we've got little bits that we can pick up on.
01:27:59.640 | One of the things I like to do with my students
01:28:01.440 | is talk about, imagine that you are a bloodhound dog, right?
01:28:06.120 | You are a bloodhound dog with a huge snout
01:28:07.840 | with 200 million scent receptors in it,
01:28:09.720 | and your whole world is about smelling.
01:28:11.720 | You've got slits in your nostrils,
01:28:14.080 | taking in big nosefuls of air and so on.
01:28:15.640 | Do you have a dog?
01:28:16.480 | - Nope, used to.
01:28:17.680 | - Used to, okay, right.
01:28:18.520 | So you know, you walk your dog around
01:28:19.720 | and your dog is smelling everything.
01:28:21.400 | The whole world is full of signals
01:28:22.680 | that you do not pick up on.
01:28:24.200 | And so imagine if you were that dog
01:28:25.800 | and you looked at your human master and thought,
01:28:27.360 | my God, what is it like to have
01:28:28.480 | the pitiful little nose of a human?
01:28:30.800 | How could you not know that there's a cat 100 yards away
01:28:33.080 | or that your friend was here six hours ago?
01:28:34.800 | And so the idea is because we're stuck in our umwelt,
01:28:38.360 | because we have this little pitiful nose,
01:28:39.440 | we think, okay, well, yeah, we're seeing reality,
01:28:41.880 | but you can have very different sorts of realities
01:28:44.880 | depending on the peripheral plug and play devices
01:28:47.320 | you're equipped with.
01:28:48.160 | - It's fascinating to think that like,
01:28:50.000 | if we're being honest, probably our umwelt is,
01:28:53.020 | you know, some infinitely tiny percent of the possibilities
01:28:59.040 | of how you can sense quote unquote reality.
01:29:03.400 | Even if you could, I mean, there's a guy named
01:29:05.600 | Donald Hoffman, yeah, who basically says
01:29:10.600 | we're really far away from reality
01:29:13.840 | in terms of our ability to sense anything.
01:29:15.960 | Like we're very, we're almost like we're floating out there
01:29:20.680 | that's almost like completely detached
01:29:22.400 | from the actual physical reality.
01:29:24.200 | It's fascinating that we can have extra senses
01:29:27.040 | that could help us get a little bit closer.
01:29:29.680 | - Exactly, and by the way, this has been the fruit
01:29:33.320 | of science: realizing, like, for example,
01:29:36.080 | you know, you open your eyes
01:29:36.920 | and there's the world around you, right?
01:29:38.200 | But of course, depending on how you calculate it,
01:29:40.120 | it's less than a 10 trillionth of the electromagnetic
01:29:42.880 | spectrum that we call visible light.
01:29:45.800 | The reason I say it depends is because, you know,
01:29:46.960 | it's actually infinite in all directions presumably.
01:29:49.240 | - Yeah, and so that's exactly that.
01:29:51.240 | And then science allows you to actually look
01:29:53.600 | into the rest of it.
01:29:54.960 | - Exactly, so understanding how big the world is out there.
01:29:57.200 | - And the same with the world of really small
01:29:59.120 | and the world of really large.
01:30:00.520 | - Exactly.
01:30:01.360 | - That's beyond our ability to sense.
01:30:03.160 | - Exactly, and so the reason I think this kind of thing
01:30:05.040 | matters is because we now have an opportunity
01:30:07.800 | for the first time in human history to say,
01:30:10.960 | okay, well, I'm just gonna include other things
01:30:13.160 | in my umwelt, so I'm gonna include infrared radiation
01:30:15.800 | and have a direct perceptual experience of that.
01:30:19.120 | And so I'm very, you know, I mean,
01:30:21.440 | so, you know, I've given up my lab
01:30:22.760 | and I run this company 90% of my time now.
01:30:25.440 | That's what I'm doing.
01:30:26.280 | I still teach at Stanford and I'm, you know,
01:30:27.840 | teaching courses and stuff like that, but--
01:30:30.240 | - This is like, this is your passion.
01:30:33.080 | The fire is on this.
01:30:35.120 | - Yeah, I feel like this is the most important thing
01:30:37.520 | that's happening right now.
01:30:38.940 | I mean, obviously I think that 'cause that's
01:30:40.640 | what I'm devoting my time and my life to, but--
01:30:43.360 | - I mean, it's a brilliant set of ideas.
01:30:45.200 | It certainly is like, it's a step
01:30:48.440 | in a very vibrant future, I would say.
01:30:52.920 | Like, the possibilities there are endless.
01:30:56.240 | - Exactly, so if you ask what I think about Neuralink,
01:30:59.320 | I think it's amazing what those guys are doing
01:31:01.240 | and working on, but I think it's not practical
01:31:03.560 | for almost everybody.
01:31:04.800 | For example, for people who are deaf, they buy this
01:31:07.680 | and, you know, every day we're getting tons of emails
01:31:10.440 | and tweets and whatever from people saying,
01:31:11.640 | "Wow, I picked up on this."
01:31:12.760 | And then I had no idea that was a,
01:31:14.520 | I didn't even know that was happening out there.
01:31:16.880 | And they're coming to hear.
01:31:18.520 | By the way, this is, you know, less than a 10th
01:31:20.320 | of the price of a hearing aid and like 250 times
01:31:23.200 | less than a cochlear implant.
01:31:25.240 | - That's amazing.
01:31:26.080 | People love hearing about what, you know,
01:31:30.680 | brilliant folks like yourself could recommend
01:31:33.920 | in terms of books.
01:31:35.040 | Of course, you're an author of many books.
01:31:37.000 | So I'll, in the introduction, mention all the books
01:31:39.760 | you've written.
01:31:40.600 | People should definitely read "Livewired."
01:31:42.560 | I've gotten a chance to read some of it.
01:31:44.000 | It's amazing.
01:31:44.840 | But is there three books, technical, fiction,
01:31:48.320 | philosophical, that had an impact on you
01:31:52.160 | when you were younger or today and books,
01:31:56.440 | perhaps some of which you would want to recommend
01:31:59.600 | that others read?
01:32:00.560 | - You know, as an undergraduate, I majored
01:32:02.920 | in British and American literature.
01:32:04.200 | That was my major, 'cause I love literature.
01:32:06.840 | I grew up with literature.
01:32:08.720 | My father had these extensive bookshelves.
01:32:10.560 | And so I grew up in the mountains in New Mexico.
01:32:13.880 | And so that was mostly where I spent my time
01:32:15.520 | was reading books.
01:32:16.360 | But, you know, I love, you know,
01:32:19.480 | Faulkner, Hemingway.
01:32:23.720 | I love many South American authors,
01:32:26.160 | Gabriel Garcia Marquez and Italo Calvino.
01:32:28.240 | I would actually recommend "Invisible Cities."
01:32:29.920 | I just, I loved that book.
01:32:31.440 | - By?
01:32:32.320 | - Italo Calvino, sorry.
01:32:33.520 | It's a book of fiction.
01:32:37.040 | Anthony Doerr wrote a book called
01:32:39.000 | "All the Light We Cannot See,"
01:32:41.040 | which actually was inspired by "Incognito,"
01:32:44.160 | by exactly what we were talking about earlier
01:32:45.960 | about how you can only see a little bit of the,
01:32:48.720 | what we call visible light in the electromagnetic radiation.
01:32:51.440 | I wrote about this in "Incognito,"
01:32:52.760 | and then he reviewed "Incognito" for the Washington Post.
01:32:54.800 | - Oh, no, that's awesome.
01:32:56.000 | - And then he wrote this book.
01:32:57.560 | The book has nothing to do with that,
01:32:58.720 | but that's where the title comes from.
01:33:00.160 | - Yeah.
01:33:01.000 | - "All the Light We Cannot See"
01:33:01.840 | is about the rest of the spectrum.
01:33:02.960 | But that's an absolutely gorgeous book.
01:33:07.960 | - That's a book of fiction.
01:33:09.400 | - Yeah, it's a book of fiction.
01:33:10.480 | One that people are surprised by.
01:33:11.400 | - What's it about?
01:33:12.240 | - It takes place during World War II
01:33:13.760 | about these two young people,
01:33:15.400 | one of whom is blind.
01:33:16.440 | - Anything else?
01:33:19.800 | So what I need, so you mentioned Hemingway.
01:33:21.920 | - I mean.
01:33:22.760 | - "Old Man and the Sea."
01:33:24.800 | What's your favorite?
01:33:26.480 | - "Snow's the Kill of Ninjaro."
01:33:29.360 | - Oh, wow, okay.
01:33:30.200 | - It's a collection of short stories that I love.
01:33:32.320 | As far as nonfiction goes,
01:33:33.320 | I grew up with "Cosmos,"
01:33:35.800 | both watching the PBS series and then reading the book,
01:33:38.200 | and that influenced me a huge amount in terms of what I do.
01:33:42.000 | From the time I was a kid,
01:33:43.000 | I felt like I wanna be Carl Sagan.
01:33:45.440 | That's what I loved.
01:33:46.280 | And in the end, I studied space physics
01:33:49.360 | for a while as an undergrad,
01:33:51.000 | but then I, in my last semester,
01:33:53.600 | discovered neuroscience,
01:33:55.480 | and I just thought, wow, I'm hooked on that.
01:33:57.560 | - So the Carl Sagan of the brain.
01:34:01.440 | - That was my aspiration. - Is the aspiration.
01:34:03.720 | I mean, you're doing an incredible job of it.
01:34:07.920 | So you open the book "Livewired" with a quote by Heidegger.
01:34:11.920 | "Every man is born as many men and dies as a single one."
01:34:15.680 | Well, what do you mean?
01:34:18.960 | - I'll tell you what I meant by it.
01:34:21.680 | So he had his own reason why he was writing that,
01:34:23.840 | but I meant this in terms of brain plasticity,
01:34:25.840 | in terms of "Livewired,"
01:34:26.960 | which is this issue that I mentioned before
01:34:28.600 | about this cone, the space-time cone
01:34:30.720 | that we are in, which is that
01:34:32.800 | when you dropped into the world,
01:34:35.800 | you, Lex, had all this different potential.
01:34:38.000 | You could have been a great surfer
01:34:40.480 | or a great chess player,
01:34:41.720 | or you could have been thousands of different men
01:34:45.560 | when you grew up, but what happened is,
01:34:47.920 | through things that were not your choice
01:34:49.400 | and choices you made along the way,
01:34:50.720 | you ended up navigating a particular path,
01:34:52.840 | and now you're exactly who you are.
01:34:54.240 | You still have lots of potential,
01:34:55.400 | but the day you die, you will be exactly Lex.
01:34:59.200 | You will be that one person.
01:35:01.680 | - So in that context,
01:35:04.120 | first of all, it's just a beautiful,
01:35:06.840 | it's a humbling picture,
01:35:08.840 | but it's a beautiful one,
01:35:10.000 | 'cause it's all the possible trajectories,
01:35:12.760 | and you pick one, you walk down that road,
01:35:14.600 | and it's the Robert Frost poem.
01:35:16.320 | But on that topic, let me ask the biggest
01:35:18.920 | and the most ridiculous question.
01:35:20.560 | So in this "Livewired" brain,
01:35:23.360 | when we choose all these different trajectories
01:35:25.160 | and end up with one, what's the meaning of it all?
01:35:28.000 | What's, is there a why here?
01:35:32.200 | What's the meaning of life, David Eagleman?
01:35:36.440 | - That's it.
01:35:37.280 | (laughing)
01:35:40.360 | I mean, this is the question that everyone has attacked
01:35:42.920 | from their own "Livewired" point of view,
01:35:45.240 | by which I mean, culturally,
01:35:47.240 | if you grew up in a religious society,
01:35:49.160 | you have one way of attacking that question.
01:35:51.040 | So if you grew up in a secular or scientific society,
01:35:53.240 | you have a different way of attacking that question.
01:35:55.400 | Obviously, I don't know.
01:35:57.800 | I abstain on that question.
01:35:59.640 | (laughing)
01:36:01.200 | - I mean, I think one of the fundamental things, I guess,
01:36:03.920 | in that, in all those possible trajectories,
01:36:06.240 | is you're always asking.
01:36:09.200 | I mean, that's the act of asking,
01:36:11.480 | what the heck is this thing for,
01:36:14.200 | is equivalent to, or at least runs in parallel
01:36:18.440 | to all the choices that you're making.
01:36:20.840 | 'Cause it's kinda, that's the underlying question.
01:36:23.680 | - Well, that's right.
01:36:24.520 | And by the way, you know, this is the interesting thing
01:36:26.120 | about human psychology.
01:36:27.760 | You know, we've got all these layers of things
01:36:29.440 | at which we can ask questions.
01:36:30.840 | And so if you keep asking yourself the question about,
01:36:33.680 | what is the optimal way for me to be spending my time?
01:36:36.440 | What should I be doing?
01:36:37.280 | What charity should I get involved with?
01:36:39.000 | If you're asking those big questions,
01:36:41.360 | that steers you appropriately.
01:36:44.680 | If you're the type of person who never asks,
01:36:46.440 | "Hey, is there something better
01:36:47.280 | "I could be doing with my time?"
01:36:48.840 | Then presumably you won't optimize
01:36:51.000 | whatever it is that is important to you.
01:36:53.720 | - So you've, I think, just in your eyes, in your work,
01:36:58.080 | there's a passion that just is obvious
01:37:02.320 | and it's inspiring, it's contagious.
01:37:04.720 | If you were to give advice to a young person today
01:37:11.280 | in the crazy chaos that we live today,
01:37:14.080 | about life, about how to discover their passion,
01:37:21.440 | is there some words that you could give?
01:37:23.800 | - First of all, I would say the main thing
01:37:26.800 | for a young person is stay adaptable.
01:37:30.080 | And this is back to this issue of why COVID is useful for us
01:37:33.280 | because it forces us off our tracks.
01:37:35.920 | The fact is the jobs that will exist 20 years from now,
01:37:39.520 | we don't even have names for,
01:37:40.560 | we can't even imagine the jobs that are gonna exist.
01:37:43.000 | And so when young people that I know go into college
01:37:45.440 | and they say, "Hey, what should I major in?"
01:37:46.840 | And so on, college is and should be less and less vocational
01:37:51.080 | as in, "Oh, I'm gonna learn how to do this
01:37:52.680 | and then I'm gonna do that the rest of my career."
01:37:54.520 | The world just isn't that way anymore
01:37:56.040 | with the exponential speed of things.
01:37:58.240 | So the important thing is learning how to learn,
01:38:00.840 | learning how to be live wired and adaptable.
01:38:03.960 | That's really key.
01:38:05.000 | And what I advise young people,
01:38:06.880 | when I talk to them, is, you know, what you digest,
01:38:11.600 | that's what gives you the raw storehouse
01:38:14.040 | of things that you can remix and be creative with.
01:38:17.760 | And so eat broadly and widely.
01:38:21.640 | And obviously this is the wonderful thing
01:38:23.400 | about the internet world we live in now
01:38:25.000 | is you kind of can't help it.
01:38:25.960 | You're constantly, whoa, you go down some mole hole
01:38:28.640 | of Wikipedia and you think,
01:38:29.800 | oh, I didn't even realize that was a thing.
01:38:31.160 | I didn't know that existed.
01:38:32.600 | And so-
01:38:33.780 | - Embrace that.
01:38:34.620 | - Embrace that, yeah, exactly.
01:38:36.240 | And what I tell people is just always do a gut check about,
01:38:40.240 | okay, I'm reading this paper and yeah, I think that,
01:38:42.680 | but this paper, wow, that really,
01:38:45.640 | I really cared about that in some way.
01:38:47.720 | I tell them just to keep a real sniff out for that.
01:38:50.520 | And when you find those things, keep going down those paths.
01:38:54.040 | - Yeah, don't be afraid.
01:38:55.080 | I mean, that's one of the challenges and the downsides
01:38:58.320 | of having so many beautiful options
01:39:00.080 | is that sometimes people are a little bit afraid
01:39:02.900 | to really commit, but that's very true.
01:39:05.640 | If there's something that just sparks
01:39:08.440 | your interest and passion, just run with it.
01:39:10.880 | I mean, it goes back to the Heidegger quote.
01:39:13.460 | I mean, we only get this one life
01:39:16.280 | and that trajectory, it doesn't last forever.
01:39:20.200 | So if something sparks your imagination,
01:39:23.560 | your passion, just run with it.
01:39:24.960 | - Yeah, exactly.
01:39:26.400 | - I don't think there's a more beautiful way to end it.
01:39:29.960 | David, it's a huge honor to finally meet you.
01:39:32.600 | Your work is inspiring so many people.
01:39:34.880 | I've talked to so many people who are passionate
01:39:36.320 | about neuroscience, about the brain,
01:39:38.280 | even outside that read your book.
01:39:40.880 | So I hope you keep doing so.
01:39:43.680 | I think you're already there with Carl Sagan.
01:39:46.120 | I hope you continue growing.
01:39:47.520 | Yeah, it was an honor talking with you today.
01:39:50.080 | Thanks so much.
01:39:50.920 | - Great, you too, Lex, wonderful.
01:39:52.280 | - Thanks for listening to this conversation
01:39:54.920 | with David Eagleman, and thank you to our sponsors,
01:39:58.120 | Athletic Greens, BetterHelp, and Cash App.
01:40:01.520 | Click the sponsor links in the description
01:40:03.880 | to get a discount and to support this podcast.
01:40:07.400 | If you enjoy this thing, subscribe on YouTube,
01:40:09.680 | review it with Five Stars on Apple Podcast,
01:40:11.920 | follow us on Spotify, support on Patreon,
01:40:14.680 | or connect with me on Twitter @LexFriedman.
01:40:18.400 | And now let me leave you with some words
01:40:20.080 | from David Eagleman in his book, "Some,"
01:40:22.680 | for details from the afterlives.
01:40:25.160 | Imagine for a moment that we're nothing but
01:40:28.160 | the product of billions of years of molecules
01:40:30.520 | coming together and ratcheting up
01:40:33.040 | through natural selection.
01:40:35.160 | That we're composed only of highways of fluids
01:40:37.520 | and chemicals sliding along roadways
01:40:40.040 | within billions of dancing cells.
01:40:42.660 | That trillions of synaptic connections hum in parallel.
01:40:46.280 | That this vast egg-like fabric of micro-thin circuitry
01:40:50.340 | runs algorithms undreamt of in modern science.
01:40:54.440 | And that these neural programs give rise to
01:40:56.800 | our decision-making, loves, desires, fears, and aspirations.
01:41:01.800 | To me, understanding this would be a numinous experience,
01:41:07.680 | better than anything ever proposed in any holy text.
01:41:12.100 | Thank you for listening, and hope to see you next time.
01:41:15.640 | (upbeat music)