
Christof Koch: Consciousness | Lex Fridman Podcast #2


Chapters

0:00 Introduction
12:20 Pure Experience
15:16 Turing Test for Intelligence and Consciousness
16:45 A Theory of Consciousness
18:25 Why Is Consciousness a Hard Problem
20:57 Russellian Monism
22:17 The Difference between Life, Intelligence, and Consciousness
22:50 Hylomorphism
24:13 What Is Death
28:19 Consciousness Is Not Required To Create Human Level Intelligence
28:36 What We Mean by Intelligence
30:35 Empathy
31:25 Fear and Suffering Are Essential To Have Consciousness
34:21 Where Does Religion Fit into Your Thinking about Consciousness
52:07 Mini Organoids
53:28 Cognitive Neuroscience

Transcript

00:00:00.000 | As part of MIT course 6.S099 on Artificial General Intelligence,
00:00:05.000 | I got a chance to sit down with Christof Koch,
00:00:07.320 | who is one of the seminal figures in neurobiology, neuroscience,
00:00:12.600 | and generally in the study of consciousness.
00:00:15.480 | He is the president, the chief scientific officer
00:00:18.680 | of the Allen Institute for Brain Science in Seattle.
00:00:22.120 | From 1986 to 2013, he was the professor at Caltech.
00:00:27.040 | Before that, he was at MIT.
00:00:28.960 | He is extremely well cited, over 100,000 citations.
00:00:33.760 | His research, his writing, his ideas have had big impact
00:00:37.760 | on the scientific community and the general public
00:00:40.200 | in the way we think about consciousness,
00:00:42.400 | in the way we see ourselves as human beings.
00:00:44.640 | He's the author of several books,
00:00:46.360 | The Quest for Consciousness, a Neurobiological Approach,
00:00:49.240 | and a more recent book,
00:00:50.480 | Consciousness, Confessions of a Romantic Reductionist.
00:00:55.080 | If you enjoy this conversation, this course,
00:00:57.720 | subscribe, click the little bell icon
00:00:59.840 | to make sure you never miss a video,
00:01:01.560 | and in the comments, leave suggestions
00:01:04.280 | for any people you'd like to see be part of the course
00:01:07.040 | or any ideas that you would like us to explore.
00:01:09.840 | Thanks very much, and I hope you enjoy.
00:01:11.880 | Okay, before we delve into the beautiful mysteries
00:01:15.360 | of consciousness, let's zoom out a little bit.
00:01:18.720 | And let me ask, do you think there's intelligent life
00:01:21.960 | out there in the universe?
00:01:23.520 | - Yes, I do believe so.
00:01:25.440 | We have no evidence of it,
00:01:26.680 | but I think the probabilities are overwhelming
00:01:28.880 | in favor of it, given a universe
00:01:31.560 | where we have 10 to the 11 galaxies,
00:01:33.600 | and each galaxy has between 10 to the 11, 10 to the 12 stars,
00:01:37.920 | and we know most stars have one or more planets.
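As a rough sanity check on the numbers quoted here, a minimal back-of-the-envelope sketch in Python (the 10^11 galaxy and star counts are just the round figures Koch cites, not precise astronomical values):

```python
# Back-of-the-envelope arithmetic for the figures quoted above:
# ~10^11 galaxies, each with ~10^11 to 10^12 stars, most with one or more planets.
galaxies = 1e11
stars_per_galaxy_low, stars_per_galaxy_high = 1e11, 1e12

stars_low = galaxies * stars_per_galaxy_low    # ~1e22 stars
stars_high = galaxies * stars_per_galaxy_high  # ~1e23 stars

print(f"Rough star count in the observable universe: {stars_low:.0e} to {stars_high:.0e}")
```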
00:01:40.320 | - So how does that make you feel?
00:01:43.440 | - It still makes me feel special
00:01:45.480 | because I have experiences.
00:01:49.040 | I feel the world, I experience the world.
00:01:52.000 | And independent of whether there are other creatures
00:01:55.360 | out there, I still feel the world,
00:01:57.600 | and I have access to this world
00:01:59.440 | in this very strange, compelling way,
00:02:02.040 | and that's the core of human existence.
00:02:05.640 | - Now, you said human.
00:02:07.000 | Do you think, if those intelligent creatures are out there,
00:02:10.360 | do you think they experience their world?
00:02:14.120 | - Yes, if they are evolved,
00:02:15.360 | if they are a product of natural evolution,
00:02:17.360 | as they would have to be,
00:02:18.280 | they will also experience their own world.
00:02:20.160 | So consciousness isn't just human, you're right,
00:02:22.520 | it's much wider.
00:02:23.680 | It's probably, it may be spread across all of biology.
00:02:27.120 | We have, the only thing that we have special
00:02:29.440 | is we can talk about it.
00:02:31.320 | Of course, not all people can talk about it.
00:02:33.080 | Babies and little children can't talk about it,
00:02:35.200 | patients who have a stroke in the left inferior frontal gyrus
00:02:39.440 | can't talk about it,
00:02:40.800 | but most normal adult people can talk about it,
00:02:43.200 | and so we think that makes us special
00:02:45.440 | compared to, let's say, monkeys or dogs or cats or mice
00:02:47.920 | or all the other creatures that we share the planet with.
00:02:50.680 | But all the evidence seems to suggest
00:02:52.560 | that they too experience the world,
00:02:54.320 | and so it's overwhelmingly likely that other alien,
00:02:57.120 | that aliens would also experience their world.
00:02:59.440 | Of course, differently,
00:03:00.280 | because they have a different sensorium,
00:03:01.720 | they have different sensors,
00:03:02.720 | they have a very different environment.
00:03:04.840 | But I would strongly suppose
00:03:08.240 | that they also have experiences.
00:03:10.320 | They feel pain and pleasure
00:03:12.320 | and see in some sort of spectrum
00:03:14.760 | and hear and have all the other senses.
00:03:17.720 | - Of course, their language,
00:03:18.760 | if they have one, would be different,
00:03:20.120 | so we might not be able to understand their poetry
00:03:22.600 | about the experiences that they have.
00:03:24.360 | - That's correct.
00:03:25.200 | - Right.
00:03:26.120 | So, in a talk, in a video,
00:03:29.040 | I've heard you mention Sapuzo,
00:03:31.560 | a Dachshund that you came up with,
00:03:33.240 | that you grew up with,
00:03:34.080 | was part of your family when you were young.
00:03:37.000 | First of all, you're technically a Midwestern boy.
00:03:40.760 | You just-- - Technically.
00:03:42.080 | (laughs)
00:03:43.400 | - But after that, you traveled around a bit,
00:03:46.480 | hence a little bit of the accent.
00:03:48.280 | You talked about Sapuzo,
00:03:49.960 | the Dachshund,
00:03:50.800 | having these elements of humanness,
00:03:55.000 | of consciousness that you discovered.
00:03:56.520 | So, I just wanted to ask,
00:03:58.120 | can you look back in your childhood
00:03:59.840 | and remember when was the first time
00:04:02.160 | you realized you, yourself,
00:04:04.360 | sort of from a third-person perspective,
00:04:06.080 | are a conscious being?
00:04:08.680 | This idea of stepping outside yourself
00:04:12.640 | and seeing there's something special
00:04:15.160 | going on here in my brain.
00:04:16.640 | - I can't really actually,
00:04:18.920 | it's a good question.
00:04:19.760 | I'm not sure I recall a discrete moment.
00:04:22.000 | I mean, you take it for granted
00:04:23.760 | because that's the only world you know, right?
00:04:25.880 | The only world I know, you know,
00:04:27.800 | is the world of seeing and hearing voices
00:04:30.240 | and touching and all the other things.
00:04:33.680 | So, it's only much later,
00:04:35.480 | at early in my undergraduate days
00:04:37.640 | when I enrolled in physics and in philosophy
00:04:40.920 | that I really thought about it
00:04:42.000 | and thought, well, this is really,
00:04:43.000 | fundamentally, very, very mysterious.
00:04:45.360 | And there's nothing really in physics right now
00:04:47.480 | that explains this transition
00:04:48.880 | from the physics of the brain to feelings.
00:04:51.920 | Where do the feelings come in?
00:04:54.040 | So, you can look at the foundational equation
00:04:55.520 | of quantum mechanics, general relativity.
00:04:57.520 | You can look at the periodic table of the elements.
00:04:59.880 | You can look at the endless ATGC chat in our genes
00:05:04.240 | and nowhere is consciousness.
00:05:05.560 | Yet, I wake up every morning to a world
00:05:07.440 | where I have experiences.
00:05:09.480 | And so, that's the heart
00:05:11.000 | of the ancient mind-body problem.
00:05:12.560 | How do experiences get into the world?
00:05:16.600 | - So, what is consciousness?
00:05:19.160 | - Experience.
00:05:20.240 | Consciousness is any experience.
00:05:24.480 | Some people call it subjective feelings.
00:05:26.440 | Some people call it phenomenology.
00:05:29.400 | Some people call it qualia, if they're philosopher.
00:05:31.440 | But they all denote the same thing.
00:05:32.800 | It feels like something,
00:05:34.000 | in the famous word of the philosopher Thomas Nagel,
00:05:37.160 | it feels like something to be a bat
00:05:39.640 | or to be an American,
00:05:43.600 | or to be angry, or to be sad,
00:05:45.320 | or to be in love, or to have pain.
00:05:47.880 | And that is what experience is.
00:05:51.080 | Any possible experience.
00:05:52.960 | Could be as mundane as just sitting in a chair.
00:05:55.240 | Could be as exalted as having a mystical moment
00:05:58.800 | in deep meditation.
00:06:01.160 | Those are just different forms of experiences.
00:06:03.280 | - Experience.
00:06:04.200 | So, if you were to sit down with maybe the next,
00:06:08.840 | skip a couple generations, of IBM Watson,
00:06:11.600 | something that won Jeopardy.
00:06:13.080 | What is the gap, I guess the question is,
00:06:15.120 | between Watson, that might be much smarter than you,
00:06:20.120 | than us, than any human alive,
00:06:23.000 | but may not have experience.
00:06:26.200 | What is the gap?
00:06:27.120 | - Well, so that's a big, big question.
00:06:30.400 | That's occupied people for the last,
00:06:32.720 | certainly the last 50 years,
00:06:34.600 | since the advent, the birth of computers.
00:06:38.960 | That's a question Alan Turing tried to answer.
00:06:40.920 | And of course, he did it in this indirect way,
00:06:42.840 | by proposing a test, an operational test.
00:06:45.120 | But that's not really,
00:06:47.760 | he tried to get at what does it mean for a person to think.
00:06:51.480 | And then he had this test, you lock 'em away,
00:06:53.880 | and then you have a communication with them,
00:06:55.320 | and then you try to guess after a while
00:06:57.640 | whether that is a person or whether it's a computer system.
00:07:00.560 | There's no question that now, or very soon,
00:07:03.080 | Alexa or Siri or Google Now will pass this test.
00:07:07.520 | And you can game it, but ultimately,
00:07:10.120 | certainly in your generation,
00:07:12.040 | there will be machines that will speak with complete poise,
00:07:14.960 | that will remember everything you ever said,
00:07:16.640 | they'll remember every email you ever had.
00:07:18.640 | Like Samantha, remember in the movie "Her"?
00:07:21.840 | There's no question it's gonna happen.
00:07:24.240 | But of course, the key question is,
00:07:25.640 | does it feel like anything to be Samantha in the movie "Her"?
00:07:29.560 | Does it feel like anything to be Watson?
00:07:32.040 | And there, one has to very strongly think,
00:07:36.680 | there are two different concepts here that we commingle.
00:07:38.880 | There is the concept of intelligence,
00:07:41.320 | natural or artificial,
00:07:42.520 | and there is a concept of consciousness,
00:07:45.080 | of experience, natural or artificial.
00:07:47.280 | Those are very, very different things.
00:07:49.680 | Now, historically, we associate consciousness
00:07:52.120 | with intelligence, why?
00:07:54.080 | Because we live in a world, leaving aside computers,
00:07:56.840 | of natural selection, where we're surrounded by creatures,
00:08:00.560 | either our own kin that are less or more intelligent,
00:08:03.760 | or we go across species.
00:08:05.360 | Some are more adapted to a particular environment,
00:08:07.920 | others are less adapted, whether it's a whale or dog
00:08:10.520 | or you go talk about a paramecium or a little worm.
00:08:14.480 | And we see the complexity of the nervous system
00:08:17.040 | goes from one cell to specialized cells,
00:08:21.000 | to a worm where 30% of its cells
00:08:22.640 | are nerve cells,
00:08:25.560 | to a creature like us or like a blue whale
00:08:27.400 | that has 100 billion, even more nerve cells.
00:08:30.000 | And so, based on behavioral evidence
00:08:32.480 | and based on the underlying neuroscience,
00:08:34.720 | we believe that as these creatures become more complex,
00:08:38.440 | they are better adapted to their particular ecological niche,
00:08:42.440 | and they become more conscious,
00:08:45.000 | partly because their brain grows,
00:08:46.480 | and we believe consciousness is in the brain, unlike the
00:08:48.720 | ancients. Almost every culture thought
00:08:51.760 | that consciousness and intelligence
00:08:53.880 | have to do with your heart.
00:08:55.720 | And you still see that today, you see,
00:08:57.200 | honey, I love you with all my heart.
00:08:59.360 | But what you should actually say is,
00:09:00.760 | you say, no, honey, I love you
00:09:01.760 | with all my lateral hypothalamus.
00:09:04.280 | And for Valentine's Day, you should give your sweetheart,
00:09:07.160 | you know, a hypothalamic shaped piece of chocolate,
00:09:10.040 | not a heart-shaped chocolate, right?
00:09:11.840 | Anyway, so we still have this language,
00:09:13.200 | but now we believe it's a brain.
00:09:14.520 | And so we see brains of different complexity,
00:09:16.480 | and we think, well,
00:09:17.640 | they have different levels of consciousness.
00:09:19.840 | They're capable of different experiences.
00:09:22.400 | But now we confront a world where we know,
00:09:27.880 | where we're beginning to engineer intelligence,
00:09:31.920 | and it's radically unclear whether the intelligence
00:09:34.880 | we're engineering has anything to do with consciousness
00:09:37.840 | and whether it can experience anything.
00:09:40.000 | 'Cause fundamentally, what's the difference?
00:09:42.160 | Intelligence is about function.
00:09:44.360 | Intelligence, no matter exactly how you define it,
00:09:46.440 | sort of adaptation to new environments,
00:09:48.800 | being able to learn and quickly understand,
00:09:51.480 | you know, the setup of this and what's going on
00:09:53.560 | and who are the actors and what's gonna happen next,
00:09:55.760 | that's all about function.
00:09:57.600 | Consciousness is not about function.
00:10:00.240 | Consciousness is about being.
00:10:02.280 | It's in some sense much more fundamental.
00:10:04.360 | You can see this in several cases.
00:10:08.720 | You can see it, for instance, in the case of the clinic.
00:10:12.040 | When you're dealing with patients who, let's say,
00:10:14.640 | had a stroke or were in a traffic accident, et cetera,
00:10:18.720 | they're pretty much immobile.
00:10:20.040 | Terri Schiavo, you may have heard historically,
00:10:22.960 | she was a person here in the '90s in Florida.
00:10:26.600 | Her heart stood still.
00:10:27.520 | She was reanimated.
00:10:28.640 | Then for the next 14 years,
00:10:29.840 | she was what's called in a vegetative state.
00:10:32.160 | So there are thousands of people in a vegetative state.
00:10:33.920 | So they're like this.
00:10:36.480 | Occasionally, they open their eyes
00:10:38.400 | for two, three, four, five, six, eight hours
00:10:40.440 | and then close their eyes.
00:10:41.280 | They have sleep-wake cycle.
00:10:42.880 | Occasionally, they have behavior.
00:10:44.120 | They do like, you know, they're,
00:10:46.600 | but there's no way that you can establish
00:10:49.360 | a lawful relationship between what you say
00:10:51.520 | or the doctor says or the mom says
00:10:53.280 | and what the patient does.
00:10:54.720 | - Correct.
00:10:55.560 | - So there isn't any behavior,
00:10:59.400 | yet in some of these people,
00:11:00.800 | there is still experience.
00:11:02.360 | You can design and build brain machine interfaces
00:11:06.600 | where you can see they still experience something.
00:11:09.200 | And of course, in these cases of locked-in state,
00:11:11.440 | there's this famous book called
00:11:13.200 | "The Diving Bell and the Butterfly"
00:11:15.200 | where you had an editor, a French editor,
00:11:16.840 | he had a stroke in the brainstem,
00:11:19.520 | unable to move except for vertical eye movements.
00:11:22.880 | He could just move his eyes up and down.
00:11:25.200 | And he dictated an entire book.
00:11:27.680 | And some people even lose this at the end.
00:11:30.040 | And all the evidence seems to suggest
00:11:31.880 | that they're still in there.
00:11:33.480 | In this case, you have no behavior, you have consciousness.
00:11:37.160 | Second case is tonight, like all of us,
00:11:39.520 | you're gonna go to sleep.
00:11:41.040 | Close your eyes, you go to sleep.
00:11:42.680 | You will wake up inside your sleeping body
00:11:44.960 | and you will have conscious experiences.
00:11:47.560 | They are different from everyday experience.
00:11:49.760 | You might fly, you might not be surprised
00:11:51.880 | that you're flying.
00:11:52.720 | You might meet a long-dead pet childhood dog
00:11:56.040 | and you're not surprised that you're meeting them,
00:11:57.920 | you know, but you have conscious experience
00:11:59.320 | of love, of hate, you know, they can be very emotional.
00:12:02.240 | Your body during this state, typically REM sleep,
00:12:05.080 | sends an active signal to your motor neurons
00:12:08.320 | to paralyze you, it's called atonia, right?
00:12:11.400 | Because if you don't have that, like some patients,
00:12:13.680 | what do you do?
00:12:14.520 | You act out your dreams.
00:12:15.560 | You get, for example, REM sleep behavior disorder,
00:12:17.320 | which is bad juju to get, okay?
00:12:20.600 | Third case is pure experience.
00:12:22.680 | So I recently had this,
00:12:23.960 | what some people call a mystical experience.
00:12:27.440 | I went to Singapore and went into a flotation tank, right?
00:12:31.040 | So this is a big tub filled with water
00:12:35.000 | that's body temperature and Epsom salt.
00:12:37.720 | You strip completely naked, you lie inside of it.
00:12:39.880 | You close the lid. - Darkness.
00:12:41.920 | - Complete darkness, soundproof.
00:12:44.760 | So very quickly you become bodyless
00:12:46.920 | because you're floating and you're naked.
00:12:48.560 | You have no rings, no watch, no nothing.
00:12:50.440 | You don't feel your body anymore.
00:12:52.800 | It's no sound, soundless.
00:12:54.400 | There's no photon, sightless, timeless,
00:12:59.120 | because after a while, early on,
00:13:00.440 | you actually hear your heart,
00:13:02.280 | but then you sort of adapt to that
00:13:04.480 | and then sort of the passage of time ceases.
00:13:07.880 | And if you train yourself, like in a meditation,
00:13:10.560 | not to think, early on you think a lot.
00:13:12.400 | It's a little bit spooky.
00:13:13.520 | You feel somewhat uncomfortable
00:13:14.960 | or you think, well, I'm gonna get bored.
00:13:16.920 | But if you try to not to think actively,
00:13:18.880 | you become mindless.
00:13:20.480 | There you are, bodyless, timeless,
00:13:22.640 | soundless, sightless, mindless,
00:13:26.880 | but you're in a conscious experience.
00:13:27.960 | You're not asleep.
00:13:29.120 | You're not asleep.
00:13:29.960 | You are pure being.
00:13:33.600 | There isn't any function.
00:13:34.560 | You aren't doing any computation.
00:13:36.240 | You're not remembering.
00:13:37.080 | You're not projecting.
00:13:37.920 | You're not planning, yet you are fully conscious.
00:13:40.600 | - You're fully conscious.
00:13:41.800 | There's something going on out there.
00:13:42.640 | It could be just a side effect.
00:13:44.200 | So what is the--
00:13:46.040 | - You mean epiphenomena.
00:13:47.760 | - So what's the-- - You mean a side effect.
00:13:49.440 | - Meaning what is the function of you being able
00:13:54.320 | to lay in this sensory-free deprivation tank
00:13:59.320 | and still have a conscious experience?
00:14:01.080 | - Evolutionary? - Evolutionary.
00:14:02.640 | - Obviously, we didn't evolve with flotation tanks
00:14:05.080 | in our environment.
00:14:06.760 | I mean, so biology is notoriously bad
00:14:09.240 | at asking why questions, teleological questions.
00:14:11.480 | Why do we have two eyes?
00:14:12.440 | Why don't we have four eyes, like some creatures,
00:14:14.240 | or three eyes or something?
00:14:15.440 | Well, no, there's probably, there is a function to that,
00:14:18.200 | but we're not very good at answering those questions.
00:14:21.120 | We can speculate endlessly.
00:14:22.480 | Biology is very, or science is very good
00:14:24.160 | about mechanistic questions.
00:14:25.440 | Why is there a charge in the universe, right?
00:14:27.280 | We find ourselves in a universe
00:14:28.400 | where there are positive and negative charges, why?
00:14:30.760 | Why does quantum mechanics hold?
00:14:32.440 | Why doesn't some other theory hold?
00:14:35.800 | Quantum mechanics hold in our universe.
00:14:37.280 | It's very unclear why.
00:14:38.440 | So teleological questions, why questions,
00:14:40.680 | are difficult to answer.
00:14:41.640 | Clearly, there's some relationship between complexity,
00:14:45.280 | brain processing power, and consciousness.
00:14:48.080 | But, however, in these cases, in these three examples I gave,
00:14:52.280 | one is an everyday experience at night.
00:14:54.320 | The other one is a trauma,
00:14:55.800 | and third one is in principle,
00:14:57.640 | everybody can have these sort of mystical experiences.
00:15:00.400 | You have a dissociation of function,
00:15:03.080 | of intelligence, from consciousness.
00:15:09.000 | - You caught me asking a why question.
00:15:12.080 | Let me ask a question that's not a why question.
00:15:15.040 | You're giving a talk later today on the Turing test
00:15:17.880 | for intelligence and consciousness,
00:15:19.200 | drawing lines between the two.
00:15:21.200 | So is there a scientific way to say
00:15:23.680 | there's consciousness present in this entity or not?
00:15:27.880 | And to anticipate your answer,
00:15:30.680 | 'cause there's a neurobiological answer.
00:15:34.120 | So we can test the human brain,
00:15:35.600 | but if you take a machine brain
00:15:37.280 | that you don't know tests for yet,
00:15:40.320 | how would you even begin to approach a test
00:15:44.080 | if there's consciousness present in this thing?
00:15:46.360 | - Okay, that's a really good question.
00:15:47.680 | So let me take it in two steps.
00:15:49.000 | So as you point out, for humans,
00:15:53.160 | let's just stick with humans,
00:15:54.160 | there's now a test called the ZAP and ZIP.
00:15:56.600 | It's a procedure where you ping the brain
00:15:58.440 | using transcranial magnetic stimulation.
00:16:00.880 | You look at the electrical reverberations,
00:16:03.200 | essentially using EEG,
00:16:05.080 | and then you can measure the complexity
00:16:06.800 | of this brain response.
00:16:07.800 | And you can do this in awake people,
00:16:09.400 | in asleep, normal people,
00:16:10.800 | you can do it in awake people and then anesthetize them.
00:16:13.680 | You can do it in patients.
00:16:15.440 | And it has 100% accuracy: in all those cases
00:16:19.680 | where it's clear that the patient or the person
00:16:21.480 | is either conscious or unconscious,
00:16:23.000 | the complexity is either high or low.
00:16:24.840 | And then you can adapt these techniques
00:16:26.240 | to similar creatures like monkeys and dogs
00:16:28.200 | and mice that have very similar brains.
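The "zap and zip" procedure Koch describes here corresponds, in the published literature, to the perturbational complexity index: perturb the cortex with TMS, record the EEG reverberation, binarize it, and compress it. Purely to illustrate the "zip" step, here is a minimal toy sketch in Python; the channel counts, binarization, and normalization below are illustrative assumptions, not the actual clinical pipeline.

```python
import numpy as np

def lz_complexity(s: str) -> int:
    """Number of phrases in a simple Lempel-Ziv (LZ76-style) parsing of a binary string."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the current phrase while it already occurs earlier in the string
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

def perturbational_complexity(binary_response: np.ndarray) -> float:
    """Toy PCI-like score: normalized LZ complexity of a binarized channels x time response."""
    s = "".join(binary_response.astype(int).astype(str).ravel())
    n = len(s)
    return lz_complexity(s) * np.log2(n) / n  # ~1 for maximally diverse responses

rng = np.random.default_rng(0)
rich_response = rng.integers(0, 2, size=(32, 300))            # diverse, "awake-like" toy data
stereotyped = np.tile(rng.integers(0, 2, size=(32, 1)), 300)  # repetitive, "unconscious-like"

print(perturbational_complexity(rich_response))   # close to 1
print(perturbational_complexity(stereotyped))     # much lower
```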
00:16:31.080 | Now, of course, you point out that may not help you
00:16:34.520 | because a computer doesn't have a cortex,
00:16:36.520 | and if I send a magnetic pulse into my iPhone
00:16:38.880 | or my computer, it's probably gonna break something.
00:16:41.400 | So we don't have that.
00:16:42.400 | So what we need ultimately,
00:16:45.240 | we need a theory of consciousness.
00:16:47.400 | We can't just rely on our intuition.
00:16:49.240 | Our intuition is, well, yeah,
00:16:50.480 | if somebody talks, they're conscious.
00:16:53.320 | However, then there are all these children,
00:16:55.200 | babies don't talk, right?
00:16:56.600 | But we believe that babies also have conscious experiences.
00:17:01.240 | And then there are all these patients I mentioned,
00:17:04.160 | and they don't talk.
00:17:05.000 | When you dream, you can't talk because you're paralyzed.
00:17:06.960 | So what we ultimately need,
00:17:09.160 | we can't just rely on our intuition.
00:17:11.120 | We need a theory of consciousness that tells us
00:17:13.720 | what is it about a piece of matter?
00:17:15.400 | What is it about a piece of highly excitable matter,
00:17:17.720 | like the brain or like a computer,
00:17:19.600 | that gives rise to conscious experience?
00:17:21.720 | We all believe, none of us believe anymore
00:17:23.600 | in the old story, it's a soul, right?
00:17:25.560 | That used to be the most common explanation
00:17:27.160 | that most people accept.
00:17:28.120 | And still a lot of people today believe,
00:17:30.000 | well, there's God, and God endowed
00:17:32.120 | only us with a special thing that animals don't have.
00:17:35.360 | Rene Descartes famously said,
00:17:36.840 | "A dog, if you hit it with your carriage,
00:17:38.720 | "may yelp, may cry,
00:17:39.720 | "but it doesn't have this special thing.
00:17:41.720 | "It doesn't have the magic sauce."
00:17:43.800 | - Soul, yeah.
00:17:44.920 | - It doesn't have res cogitans, a soul.
00:17:46.600 | Now we believe that isn't the case anymore.
00:17:48.840 | So what is the difference between brains
00:17:51.840 | and these guys, silicon?
00:17:54.320 | And in particular, once their behavior matches,
00:17:58.160 | so if you have Siri or Alexa in 20 years from now
00:18:01.240 | that she can talk just as good as any possible human,
00:18:04.480 | what grounds do you have to say she's not conscious?
00:18:06.920 | In particular, if she says,
00:18:08.200 | it's of course she will.
00:18:10.000 | Well, of course I'm conscious.
00:18:11.440 | You ask her, how are you doing?
00:18:12.480 | And she'll say, well, you know,
00:18:14.000 | she'll generate some appropriate answer, of course.
00:18:17.040 | She'll behave like a person.
00:18:19.720 | Now there's several differences.
00:18:21.360 | One is, so this relates to the problem,
00:18:24.920 | the very hard, why is consciousness a hard problem?
00:18:28.080 | It's because it's subjective, right?
00:18:30.480 | Only I have it, only I know,
00:18:33.280 | I have direct experience of my own consciousness.
00:18:36.480 | I don't have experience of your consciousness.
00:18:38.640 | Now I assume as a sort of a Bayesian person
00:18:41.240 | who believes in probability theory and all of that,
00:18:43.120 | you know, I can do an abduction to the best available facts.
00:18:47.000 | I deduce your brain is very similar to mine.
00:18:48.840 | If I put you in a scanner,
00:18:49.920 | your brain is roughly gonna behave the same way as I do.
00:18:52.960 | If I gave you this muesli and asked you,
00:18:55.560 | how does it taste?
00:18:56.400 | You'd tell me things that I would also say, more or less.
00:19:00.400 | So I infer based on all of that that you're conscious.
00:19:02.560 | Now with Siri, I can't do that.
00:19:03.840 | So there I really need a theory that tells me
00:19:06.080 | what is it about any system, this or this,
00:19:09.680 | that makes it conscious.
00:19:11.080 | We have such a theory.
00:19:12.520 | - Yes, so the integrated information theory.
00:19:15.680 | But let me first, maybe as introduction
00:19:17.560 | for people who are not familiar, Descartes.
00:19:19.760 | Can you, you talk a lot about panpsychism.
00:19:24.080 | Can you describe what physicalism versus dualism,
00:19:29.080 | this, you mentioned the soul.
00:19:31.080 | What is the history of that idea?
00:19:33.880 | - The idea of panpsychism?
00:19:35.840 | - Oh no, the debate, really,
00:19:37.600 | out of which panpsychism can emerge
00:19:41.160 | of dualism versus physicalism.
00:19:46.160 | Or do you not see panpsychism as fitting into that?
00:19:49.840 | - No, you can argue there's some,
00:19:51.360 | well, okay, so let's step back.
00:19:52.560 | So panpsychism is a very ancient belief
00:19:54.880 | that's been around, I mean, Plato and Aristotle
00:19:57.880 | talks about it, modern philosophers talk about it.
00:20:01.440 | Of course, in Buddhism, the idea is very prevalent
00:20:04.440 | that, I mean, there are different versions of it.
00:20:06.680 | One version says everything is ensouled, everything.
00:20:09.920 | Rocks and stones and dogs and people
00:20:11.840 | and forests and iPhones, all of it.
00:20:14.240 | All matter is ensouled, that's sort of one version.
00:20:16.800 | Another version is that all biology,
00:20:20.240 | all creatures, small or large,
00:20:22.360 | from a single cell to a giant sequoia tree
00:20:25.040 | feel like something.
00:20:25.920 | That's one I think is somewhat more realistic.
00:20:29.120 | So the different versions--
00:20:29.960 | - What do you mean by feel like something?
00:20:31.480 | - Have--
00:20:32.320 | - Have feeling, have some kind of experience.
00:20:33.880 | - It feels like something, it may well be possible
00:20:36.360 | that it feels like something to be a paramecium.
00:20:39.320 | I think it's pretty likely it feels like something
00:20:41.200 | to be a bee or a mouse or a dog.
00:20:43.960 | - Sure, so, okay.
00:20:46.760 | - So that you can say that's also,
00:20:48.400 | so panpsychism is very broad, right?
00:20:51.240 | And you can, so some people, for example, Bertrand Russell,
00:20:55.160 | tried to advocate this idea, it's called Russellian monism,
00:20:59.280 | that panpsychism is really physics viewed from the inside.
00:21:05.040 | So the idea is that physics is very good
00:21:06.960 | at describing relationship among objects,
00:21:09.800 | like charges or like gravity, right?
00:21:12.840 | You know, it describes the relationship
00:21:14.080 | between curvature and mass distribution, okay?
00:21:16.400 | That's the relationship among things.
00:21:18.160 | Physics doesn't really describe the ultimate reality itself.
00:21:20.920 | It's just relationship among, you know,
00:21:22.760 | quarks or all these other stuff--
00:21:24.960 | - Almost from like a third person observer.
00:21:27.360 | - Yes, yes, yes.
00:21:28.960 | And consciousness is what physics feels from the inside.
00:21:31.800 | So my conscious experience,
00:21:33.560 | it's the way the physics of my brain,
00:21:35.800 | particularly my cortex, feels from the inside.
00:21:38.520 | And so if you are paramecium, you gotta remember,
00:21:41.400 | you say paramecium, well, that's a pretty dumb creature.
00:21:43.280 | It is, but it has already a billion different molecules,
00:21:48.000 | probably, you know, 5,000 different proteins
00:21:50.640 | assembled in a highly, highly complex system
00:21:52.720 | that no single person, no computer system so far
00:21:55.880 | on this planet has ever managed to accurately simulate.
00:21:59.000 | Its complexity vastly escapes us.
00:22:01.240 | Yes, and it may well be that that little thing
00:22:03.480 | feels like a tiny bit.
00:22:04.880 | Now, it doesn't have a voice in the head like me.
00:22:06.560 | It doesn't have expectations.
00:22:08.320 | You know, it doesn't have all that complex things,
00:22:10.120 | but it may well feel like something.
00:22:12.520 | - Yeah, so this is really interesting.
00:22:14.680 | Can we draw some lines and maybe try to understand
00:22:17.640 | the difference between life, intelligence,
00:22:21.120 | and consciousness?
00:22:22.280 | How do you see all of those?
00:22:25.200 | If you had to define what is a living thing,
00:22:28.520 | what is a conscious thing,
00:22:30.080 | and what is an intelligent thing,
00:22:31.480 | do those intermix for you,
00:22:32.800 | or are they totally separate?
00:22:34.680 | - Okay, so A, that's a question
00:22:36.000 | that we don't have a full answer to.
00:22:38.080 | - A lot of the stuff we're talking about today
00:22:40.080 | is full of mysteries and fascinating ones, right?
00:22:42.480 | - Well, for example, you can go to Aristotle,
00:22:44.360 | who's probably the most important scientist
00:22:46.840 | and philosopher who's ever lived
00:22:47.960 | in certain Western culture.
00:22:49.520 | He had this idea, it's called hylomorphism,
00:22:51.680 | it's quite popular these days,
00:22:53.440 | that there are different forms of soul.
00:22:55.120 | The soul is really the form of something.
00:22:56.720 | He says, "All biological creatures have a vegetative soul."
00:23:00.560 | That's life principle.
00:23:01.680 | Today, we think we understand something more
00:23:03.480 | that it's biochemistry and nonlinear thermodynamics.
00:23:06.360 | Then he says, "They have a sensitive soul."
00:23:09.160 | Only animals and humans have also a sensitive soul,
00:23:13.280 | or a petitive soul.
00:23:14.600 | They can see, they can smell, and they have drives.
00:23:18.120 | They wanna reproduce, they wanna eat, et cetera.
00:23:20.880 | And then only humans have what he called a rational soul.
00:23:25.080 | Okay? - Right.
00:23:25.920 | - And that idea then made it into Christendom,
00:23:28.120 | and then the rational soul is the one that lives forever.
00:23:30.720 | He was very unclear.
00:23:31.560 | He wasn't really, I mean, different readings
00:23:33.600 | of Aristotle give different,
00:23:35.000 | whether did he believe that rational soul
00:23:37.000 | was immortal or not?
00:23:38.240 | I probably think he didn't.
00:23:39.680 | But then, of course, that made it into,
00:23:41.080 | through Plato into Christianity,
00:23:42.440 | and then the soul became immortal,
00:23:43.880 | and then became the connection to God.
00:23:47.400 | Now, so you ask me, essentially,
00:23:49.880 | what is our modern conception of these three,
00:23:52.400 | Aristotle would have called them different forms.
00:23:56.200 | Life, we think we know something about it,
00:23:58.480 | at least life on this planet, right?
00:24:00.000 | Although we don't understand how it originated,
00:24:01.840 | but it's been difficult to rigorously pin down.
00:24:05.160 | You see this in modern definitions of death.
00:24:08.240 | It's in fact, right now, there's a conference ongoing,
00:24:11.120 | again, that tries to define legally and medically
00:24:14.280 | what is death.
00:24:15.320 | It used to be very simple.
00:24:16.400 | Death is you stop breathing, your heart stops beating.
00:24:18.760 | You're dead, right? - Yeah.
00:24:20.000 | - Totally uncontroversial.
00:24:21.440 | If you're unsure, you wait another 10 minutes.
00:24:23.400 | If the patient doesn't breathe, you know, he's dead.
00:24:25.360 | Well, now we have ventilators, we have pacemakers.
00:24:28.640 | So it's much more difficult to define what death is.
00:24:31.000 | Typically, death is defined as the end of life,
00:24:33.440 | and life is defined before death.
00:24:35.720 | - It's the thing before death.
00:24:36.640 | - Okay, so we don't have really very good definitions.
00:24:39.280 | Intelligence, we don't have a rigorous definition.
00:24:41.760 | We know something how to measure.
00:24:43.040 | It's called IQ or G-factors, right?
00:24:45.800 | And we're beginning to build it in a narrow sense, right?
00:24:50.200 | Like Go, AlphaGo, and Watson, and Google Cars,
00:24:55.200 | and Uber cars, and all of that.
00:24:56.640 | That's still narrow AI,
00:24:57.760 | and some people are thinking
00:24:58.840 | about artificial general intelligence.
00:25:01.200 | But roughly, as we said before,
00:25:02.480 | it's something to do with the ability to learn
00:25:04.280 | and to adapt to new environments.
00:25:06.560 | But that is, as I said also,
00:25:08.080 | it's radical difference from experience.
00:25:10.960 | And it's very unclear, if you build a machine that has AGI,
00:25:14.720 | it's not at all, a priori, it's not at all clear
00:25:17.520 | that this machine will have consciousness.
00:25:19.640 | It may or may not.
00:25:20.760 | - So let's ask it the other way.
00:25:22.320 | Do you think, if you were to try to build
00:25:24.480 | an artificial general intelligence system,
00:25:26.560 | do you think figuring out how to build
00:25:29.680 | artificial consciousness would help you get to an AGI?
00:25:34.680 | So, or put another way,
00:25:36.800 | do you think intelligence requires consciousness?
00:25:40.440 | - In human, it goes hand in hand.
00:25:43.080 | In human, or I think in biology,
00:25:44.840 | consciousness, intelligence goes hand in hand.
00:25:47.640 | Why? Because the brain evolved
00:25:50.960 | to be highly complex.
00:25:51.960 | Complexity, via the theory,
00:25:53.520 | integrated information theory,
00:25:55.160 | is sort of ultimately what is closely tied
00:25:58.120 | to consciousness.
00:25:59.240 | Ultimately, it's causal power upon itself.
00:26:01.400 | And so in evolved systems, they go together.
00:26:05.000 | In artificial system, particularly in digital machines,
00:26:07.760 | they do not go together.
00:26:09.320 | And if you ask me, point blank,
00:26:10.760 | is Alexa 20.0 in the year 2040,
00:26:15.280 | when she can easily pass every Turing test,
00:26:17.440 | is she conscious?
00:26:19.120 | Even if she claims she's conscious.
00:26:21.000 | In fact, you could even do a more radical version
00:26:22.880 | of this thought experiment.
00:26:24.080 | We can build a computer simulation of the human brain.
00:26:26.800 | You know what Henry Markram in the Blue Brain Project,
00:26:29.760 | or the Human Brain Project in Switzerland is trying to do.
00:26:32.040 | Let's grant him all the success.
00:26:33.560 | So in 10 years, we have this perfect simulation
00:26:35.680 | of the human brain.
00:26:36.520 | Every neuron is simulated.
00:26:38.200 | And it has a thalamus, and it has motor neurons,
00:26:40.400 | it has a Broca's area, and of course it'll talk,
00:26:43.560 | and it'll say, "Hi, I just woke up, I feel great."
00:26:46.120 | Okay?
00:26:46.960 | Even that computer simulation that can in principle
00:26:49.040 | map onto your brain will not be conscious.
00:26:52.520 | Because it simulates.
00:26:53.360 | It's a difference between the simulated and the real.
00:26:55.880 | So it simulates the behavior associated with consciousness.
00:26:59.160 | It might be, it will if it's done properly,
00:27:01.440 | will have all the intelligence that that particular person
00:27:04.360 | they're simulating has.
00:27:05.920 | But simulating intelligence is not the same
00:27:08.680 | as having conscious experiences.
00:27:10.600 | And I give you a really nice metaphor
00:27:12.280 | that engineers and physicists typically get.
00:27:15.200 | I can write down Einstein's field equation,
00:27:17.600 | nine or 10 equations that describe the link
00:27:19.760 | in general relativity between curvature and mass.
00:27:23.440 | I can do that, I can run this on my laptop
00:27:26.480 | to predict that the center, the black hole
00:27:29.400 | at the center of our galaxy will be so massive
00:27:33.480 | that it'll twist space-time around it
00:27:35.680 | so no light can escape.
00:27:36.880 | It's a black hole, right?
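To make the contrast concrete, this is the kind of laptop calculation Koch is alluding to: the same theory that describes the real black hole can be evaluated numerically without the machine exerting any gravitational pull. A minimal sketch in Python; the four million solar mass figure for Sagittarius A* is the commonly quoted round number, used only for illustration.

```python
# Schwarzschild radius r_s = 2GM/c^2 for a black hole of ~4 million solar masses
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

M_sgr_a_star = 4.0e6 * M_sun        # assumed round figure for Sagittarius A*
r_s = 2 * G * M_sgr_a_star / c**2   # radius inside which light cannot escape

print(f"Schwarzschild radius: {r_s:.2e} m (~{r_s / 1.496e11:.2f} AU)")
```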
00:27:38.440 | But funny, have you ever wondered
00:27:40.280 | why doesn't this computer simulation suck me in?
00:27:44.040 | Right?
00:27:44.880 | It simulates gravity, but it doesn't have
00:27:47.200 | the causal power of gravity.
00:27:48.840 | That's a huge difference.
00:27:50.640 | So it's a difference between the real and the simulator,
00:27:54.360 | just like it doesn't get wet inside a computer
00:27:56.360 | when the computer runs code that simulates a weather storm.
00:27:58.920 | And so in order to have artificial consciousness,
00:28:02.880 | you have to give it the same causal power
00:28:05.480 | as a human brain.
00:28:06.760 | You have to build so-called a neuromorphic machine
00:28:09.080 | that has hardware that is very similar to the human brain,
00:28:12.400 | not a digital, clocked, von Neumann computer.
00:28:16.360 | - So that's, just to clarify though,
00:28:19.120 | you think that consciousness is not required
00:28:22.520 | to create human level intelligence.
00:28:25.400 | It seems to accompany intelligence in the human brain,
00:28:28.660 | but for a machine, it need not.
00:28:30.240 | - That's correct.
00:28:31.640 | - So maybe just because this is AGI,
00:28:35.760 | let's dig in a little bit about
00:28:37.240 | what we mean by intelligence.
00:28:39.080 | So one thing is the G factor,
00:28:42.040 | these kind of IQ tests of intelligence.
00:28:44.140 | But I think if you, maybe another way to say,
00:28:47.200 | so in 2040, 2050, people will have Siri
00:28:51.200 | that is just really impressive.
00:28:53.880 | Do you think people will say Siri is intelligent?
00:28:57.640 | - Yes.
00:28:58.480 | - Intelligence is this amorphous thing.
00:29:01.080 | So to be intelligent, it seems like you have to have
00:29:04.200 | some kind of connections with other human beings,
00:29:07.120 | in the sense that you have to impress them
00:29:09.480 | with your intelligence.
00:29:11.360 | And there feels, you have to somehow operate
00:29:14.600 | in this world full of humans.
00:29:17.280 | And for that, there feels like there has to be
00:29:19.420 | something like consciousness.
00:29:21.700 | So you think you can have just the world's best
00:29:24.180 | natural NLP system, natural language understanding
00:29:27.240 | and generation, and that will be,
00:29:29.720 | that will get us happy and say,
00:29:31.440 | you know what, we've created an AGI.
00:29:33.920 | - I don't know happy, but yes, I do believe
00:29:38.200 | we can get what we call high level functional intelligence,
00:29:41.200 | in particular sort of the G, you know,
00:29:43.200 | this fluid-like intelligence that we cherish,
00:29:47.320 | particularly at a place like MIT, right?
00:29:50.080 | In machines, I see a priori no reasons,
00:29:52.720 | and I see a lot of reason to believe it's gonna happen
00:29:54.880 | very, you know, over the next 50 years or 30 years.
00:29:58.200 | - So for beneficial AI, for creating an AI system
00:30:01.760 | that's, so you mentioned ethics,
00:30:04.920 | that is exceptionally intelligent,
00:30:06.920 | but also does not do, does, you know,
00:30:09.600 | aligns its values with our values as humanity.
00:30:12.800 | Do you think then it needs consciousness?
00:30:14.880 | - Yes, I think that that is a very good argument
00:30:17.320 | that if we're concerned about AI and the threat of AI,
00:30:20.600 | a la Nick Bostrom, existential threat,
00:30:22.800 | I think having an intelligence that has empathy, right?
00:30:27.360 | Why do we find abusing a dog,
00:30:29.480 | why do most of us find that abhorrent, abusing any animal?
00:30:32.840 | Why do we find that abhorrent?
00:30:34.360 | Because we have this thing called empathy,
00:30:37.240 | which if you look at the Greek,
00:30:38.320 | really means feeling with, pathos, empathy.
00:30:41.520 | I have feeling with you.
00:30:43.160 | I see somebody else suffer that isn't even my conspecific,
00:30:46.360 | it's not a person, it's not a love,
00:30:48.480 | it's not my wife or my kids, it's a dog,
00:30:51.280 | but I feel naturally most of us, not all of us,
00:30:53.400 | most of us will feel empathic.
00:30:55.680 | And so it may well be in the long-term interest
00:30:59.960 | of survival of homo sapiens sapiens,
00:31:02.080 | that if we do build AGI and it really becomes very powerful,
00:31:05.920 | that it has an empathic response
00:31:08.480 | and doesn't just exterminate humanity.
00:31:11.880 | - So as part of the full conscious experience,
00:31:14.880 | to create a consciousness, artificial,
00:31:17.480 | or in our human consciousness, do you think fear,
00:31:21.440 | maybe we're gonna get into your earlier days,
00:31:24.320 | with Nietzsche and so on, but do you think fear
00:31:26.200 | and suffering are essential to have consciousness?
00:31:30.680 | Do you have to have the full range of experience
00:31:32.680 | to have a system that has experience?
00:31:36.200 | Or can you have a system that only has
00:31:38.600 | very particular kinds of very positive experiences?
00:31:41.760 | - Look, you can have, in principle,
00:31:44.000 | people have done this in the rat,
00:31:45.400 | where you implant an electrode in the hypothalamus,
00:31:48.240 | the pleasure center of the rat,
00:31:49.680 | and the rat stimulates itself above and beyond anything else.
00:31:52.960 | It doesn't care about food or natural sex
00:31:55.480 | or drink anymore, it just stimulates itself
00:31:57.720 | because it's such a pleasurable feeling.
00:32:00.480 | I guess it's like an orgasm, just you have all day long.
00:32:03.960 | And so, a priori, I see no reason why you need
00:32:07.640 | a great variety.
00:32:11.320 | Now, clearly, to survive, that wouldn't work, right?
00:32:14.560 | But if I'd engineered artificially,
00:32:16.360 | I don't think you need a great variety
00:32:21.960 | of conscious experience.
00:32:23.200 | You could have just pleasure or just fear.
00:32:25.920 | It might be a terrible existence,
00:32:27.520 | but I think that's possible,
00:32:28.920 | at least on conceptual, logical ground.
00:32:31.800 | Because any real creature, whether artificial or engineered,
00:32:34.360 | you wanna give it fear, the fear of extinction,
00:32:36.680 | that we all have, and you also wanna give it
00:32:39.720 | positive, appetitive states,
00:32:41.400 | states that you want the machine encouraged to do
00:32:45.000 | because they give the machine positive feedback.
00:32:48.000 | - So, you mentioned panpsychism,
00:32:50.680 | to jump back a little bit,
00:32:52.480 | everything having some kind of mental property.
00:32:55.920 | How do you go from there to something
00:32:59.600 | like human consciousness?
00:33:02.120 | So, everything having some elements of consciousness.
00:33:05.440 | Is there something special about human consciousness?
00:33:08.200 | - So, just, it's not everything.
00:33:10.640 | Like a spoon, there's no,
00:33:12.720 | the form of panpsychism I think about
00:33:15.560 | doesn't ascribe consciousness to anything,
00:33:17.800 | like this, the spoon or my liver.
00:33:19.960 | However, it is, the theory,
00:33:23.200 | the integrated information theory,
00:33:24.560 | does say that systems, even ones that look
00:33:27.280 | relatively simple from the outside,
00:33:28.880 | at least if they have this internal causal power,
00:33:31.280 | do feel like something.
00:33:35.320 | The theory probably doesn't say anything
00:33:36.960 | about what's special about humans.
00:33:38.880 | Biologically, we know what the one thing
00:33:41.240 | that's special about human is we speak,
00:33:43.680 | and we have an overblown sense of our own importance.
00:33:47.400 | - Right.
00:33:48.360 | - We believe we're exceptional,
00:33:49.760 | and we're just God's gift to the universe.
00:33:53.800 | But behaviorally, the main thing that we have,
00:33:56.080 | we can plan over the long term,
00:33:58.120 | we have language, and that gives us
00:33:59.680 | enormous amount of power,
00:34:01.520 | and that's why we are the current
00:34:03.200 | dominant species on the planet.
00:34:05.800 | - So you mentioned God,
00:34:07.720 | you grew up a devout Roman Catholic,
00:34:10.040 | in a Roman Catholic family.
00:34:12.520 | So, you know, with consciousness,
00:34:15.480 | you're sort of exploring some really deeply
00:34:18.920 | fundamental human things that religion also touches on.
00:34:21.960 | So where does religion fit into your
00:34:24.520 | thinking about consciousness?
00:34:25.600 | And you've grown throughout your life
00:34:29.000 | and changed your views on religion,
00:34:30.600 | as far as I understand.
00:34:32.240 | - Yeah, I mean, I'm now much closer to,
00:34:33.880 | so I'm not a Roman Catholic anymore,
00:34:36.440 | I don't believe there's sort of this God,
00:34:39.120 | the God I was educated to believe in,
00:34:41.760 | you know, sits somewhere in the fullness of time,
00:34:43.920 | I'll be united in some sort of everlasting bliss,
00:34:47.120 | I just don't see any evidence for that.
00:34:50.160 | Look, the world, the night is large and full of wonders.
00:34:53.160 | (laughing)
00:34:54.160 | There are many things that I don't understand,
00:34:55.880 | I think many things that we as a culture don't understand.
00:34:57.920 | look, we don't even understand more than 4%
00:34:59.840 | of all the universe, right?
00:35:01.400 | Dark matter, dark energy, we have no idea what it is,
00:35:03.440 | maybe it's lost socks, what do I know?
00:35:05.640 | So, all I can tell you is,
00:35:09.520 | it's sort of my current religious or spiritual sentiment
00:35:12.560 | is much closer to some form of Buddhism.
00:35:14.560 | - Can you describe--
00:35:16.920 | - Without the reincarnation, unfortunately,
00:35:19.040 | there's no evidence for a reincarnation.
00:35:21.120 | - So can you describe the way Buddhism
00:35:23.960 | sees the world a little bit?
00:35:25.440 | - Well, so, they talk about,
00:35:27.280 | so when I spent several meetings with the Dalai Lama,
00:35:31.840 | and what always impressed me about him,
00:35:33.440 | he really, unlike, for example, let's say,
00:35:35.120 | the Pope or some cardinal, he always emphasized
00:35:38.080 | minimizing the suffering of all creatures.
00:35:40.760 | So they have this, from the early beginning,
00:35:43.240 | they look at suffering in all creatures,
00:35:45.160 | not just in people, but in everybody, this universal,
00:35:48.240 | and of course, by degrees, right?
00:35:49.680 | An animal, generally, is less capable of suffering
00:35:53.520 | than a well-developed, normally developed human,
00:35:57.920 | and they think consciousness pervades in this universe,
00:36:02.280 | and they have these techniques, you know,
00:36:04.840 | you can think of them like mindfulness, et cetera,
00:36:07.040 | and meditation that tries to access sort of
00:36:09.760 | what they claim of this more fundamental aspect of reality.
00:36:13.280 | I'm not sure it's more fundamental,
00:36:14.600 | as I think about it, there's the physical,
00:36:16.480 | and then there's this inside view, consciousness,
00:36:18.880 | and those are the two aspects.
00:36:20.440 | That's the only thing I have access to in my life,
00:36:23.520 | and you've gotta remember, my conscious experience
00:36:25.400 | and your conscious experience comes prior
00:36:27.040 | to anything you know about physics,
00:36:28.800 | comes prior to knowledge about the universe,
00:36:31.080 | and atoms, and super strings, and molecules,
00:36:33.520 | and all of that.
00:36:34.600 | The only thing you directly are acquainted with
00:36:36.880 | is this world that's populated with things,
00:36:39.200 | and images, and sounds in your head,
00:36:41.720 | and touches, and all of that.
00:36:43.400 | - I actually have a question.
00:36:44.240 | So it sounds like you kind of have a rich life.
00:36:47.880 | You talk about rock climbing,
00:36:50.040 | and it seems like you really love literature,
00:36:52.560 | and consciousness is all about experiencing things.
00:36:56.280 | So do you think that has helped your research on this topic?
00:37:00.800 | - Yes, particularly if you think about it,
00:37:03.680 | the various states, so for example,
00:37:05.120 | when you do rock climbing,
00:37:06.440 | or now I do rowing, crew rowing, and I bike every day,
00:37:10.320 | you can get into this thing called the zone,
00:37:12.840 | and I've always, I wonder about it,
00:37:14.960 | particularly with respect to consciousness,
00:37:16.480 | because it's a strangely addictive state.
00:37:18.840 | You wanna, I mean, once people have it once,
00:37:21.840 | they wanna keep on going back to it,
00:37:23.720 | and you wonder, what is it so addicting about it?
00:37:26.040 | And I think it's the experience
00:37:27.800 | of almost close to pure experience,
00:37:30.960 | because in this zone,
00:37:32.840 | you're not conscious of your inner voice anymore.
00:37:35.000 | There's always this inner voice nagging you.
00:37:36.800 | You have to do this, you have to do that,
00:37:38.080 | you have to pay your taxes,
00:37:39.200 | you have this fight with your ex,
00:37:40.480 | and all of those things are always there.
00:37:42.440 | But when you're in the zone, all of that is gone,
00:37:44.600 | and you're just in this wonderful state
00:37:46.440 | where you're fully out in the world, right?
00:37:47.920 | You're climbing, or you're rowing, or biking,
00:37:50.640 | or doing soccer, or whatever you're doing,
00:37:53.440 | and sort of consciousness sort of is this,
00:37:55.840 | you're all action, or in this case, of pure experience.
00:37:59.440 | You're not action at all,
00:38:00.920 | but in both cases, you experience some aspect of,
00:38:04.720 | you touch some basic part of conscious existence
00:38:09.360 | that is so basic and so deeply satisfying.
00:38:11.880 | I think you touch the root of being.
00:38:14.880 | That's really what you're touching there.
00:38:16.280 | You're getting close to the root of being,
00:38:18.440 | and that's very different from intelligence.
00:38:21.680 | - So what do you think about
00:38:23.200 | the simulation hypothesis, simulation theory,
00:38:25.360 | the idea that we all live in a computer simulation?
00:38:27.960 | Have you given it a shot? - Rapture for nerds.
00:38:30.240 | Raptures for nerds.
00:38:31.240 | - Raptures for nerds? (laughs)
00:38:32.560 | - I think it's as likely as the hypothesis
00:38:35.680 | that engaged hundreds of scholars for many centuries,
00:38:39.320 | are we all just existing in the mind of God?
00:38:42.160 | - Right. - And this is just
00:38:43.160 | a modern version of it.
00:38:44.200 | It's equally plausible.
00:38:47.400 | People love talking about these sorts of things.
00:38:49.480 | I know there are books written about the simulation hypothesis.
00:38:52.280 | If that's what people wanna do, that's fine.
00:38:54.800 | Seems rather esoteric.
00:38:56.160 | It's never testable.
00:38:58.200 | - But it's not useful for you to think of in those terms.
00:39:00.320 | So maybe connecting to the questions of free will,
00:39:03.280 | which you've talked about.
00:39:05.000 | I vaguely remember you saying
00:39:07.320 | that the idea that there's no free will,
00:39:09.600 | it makes you very uncomfortable.
00:39:13.360 | So what do you think about free will
00:39:15.760 | from a physics perspective, from a consciousness perspective?
00:39:19.280 | What does it all fit?
00:39:20.840 | - Okay, so from the physics perspective,
00:39:22.480 | leaving aside quantum mechanics,
00:39:24.160 | we believe we live in a fully deterministic world, right?
00:39:27.080 | But then comes, of course, quantum mechanics.
00:39:28.720 | So now we know that certain things
00:39:30.320 | are in principle not predictable,
00:39:32.120 | which, as you said, I prefer
00:39:33.960 | because the idea that the initial condition of the universe
00:39:37.920 | and then everything else, we're just acting out
00:39:39.920 | the initial condition of the universe
00:39:41.320 | that doesn't--
00:39:43.800 | - It's not a romantic notion.
00:39:45.560 | - Certainly not, right?
00:39:47.080 | Now when it comes to consciousness,
00:39:48.440 | I think we do have certain freedom.
00:39:50.640 | We are much more constrained by physics, of course,
00:39:53.280 | and by our past and by our own conscious desires
00:39:55.680 | and what our parents told us
00:39:57.120 | and what our environment tells us.
00:39:59.040 | We all know that, right?
00:40:00.120 | There's hundreds of experiments
00:40:01.280 | that show how we can be influenced.
00:40:03.600 | But finally, in the final analysis,
00:40:05.880 | when you make a life decision,
00:40:06.920 | and I'm talking really about critical decisions,
00:40:09.240 | what you really think.
00:40:10.080 | Should I marry?
00:40:10.920 | Should I go to this school or that school?
00:40:12.320 | Should I take this job or that job?
00:40:14.880 | Should I cheat on my taxes or not?
00:40:16.840 | (laughs)
00:40:17.880 | These are things where you really deliberate.
00:40:20.320 | And I think under those conditions,
00:40:22.200 | you are as free as you can be.
00:40:24.160 | When you bring your entire being,
00:40:26.000 | your entire conscious being to that question
00:40:31.000 | and try to analyze it under all the various conditions,
00:40:34.240 | then you make a decision.
00:40:36.160 | You are as free as you can ever be.
00:40:38.640 | That is, I think, what free will is.
00:40:40.560 | It's not a will that's totally free to do anything it wants.
00:40:44.320 | That's not possible.
00:40:45.720 | - Right.
00:40:46.760 | So as Jack mentioned,
00:40:48.640 | you actually write a blog about books you've read.
00:40:51.720 | Amazing books, and I'm Russian,
00:40:54.400 | from Bulgakov to-
00:40:55.560 | - Ah, The Master and Margarita.
00:40:57.200 | - Yeah.
00:40:58.160 | Neil Gaiman, Carl Sagan, Murakami.
00:41:00.960 | So what is a book that early in your life
00:41:04.320 | transformed the way you saw the world?
00:41:06.480 | Something that changed your life?
00:41:09.120 | - Nietzsche, I guess, did.
00:41:11.040 | Also sprach Zarathustra,
00:41:12.520 | because he talks about some of these problems.
00:41:14.280 | You know, he was one of the first discoverers
00:41:15.960 | of the unconscious.
00:41:17.800 | This is a little bit before Freud,
00:41:20.600 | when it was in the air.
00:41:22.080 | He makes all these claims that people
00:41:25.960 | sort of under the guise of, under the mask of charity,
00:41:29.040 | actually are very non-charitable.
00:41:31.120 | So he's sort of really the first discoverer
00:41:35.960 | of the great land of the unconscious.
00:41:39.480 | And that really struck me.
00:41:41.440 | - And what do you think about the unconscious?
00:41:43.560 | What do you think about Freud?
00:41:44.720 | What do you think about these ideas?
00:41:47.080 | What's, just like dark matter in the universe,
00:41:49.720 | what's over there in that unconscious?
00:41:51.960 | - A lot.
00:41:52.800 | I mean, much more than we think.
00:41:53.960 | This is what a lot of last hundred years of research
00:41:56.920 | has shown.
00:41:57.760 | So I think he was a genius, misguided towards the end,
00:42:00.440 | but he started out as a neuroscientist.
00:42:02.840 | He contributed.
00:42:03.680 | He did studies on the Lamprey.
00:42:07.240 | He himself contributed to the neuron hypothesis,
00:42:10.280 | the idea that there are discrete units
00:42:11.800 | that we call nerve cells now.
00:42:13.360 | And then he wrote about the unconscious.
00:42:18.080 | And I think it's true.
00:42:18.920 | There's lots of stuff happening.
00:42:20.880 | You feel this in particular when you're in a relationship
00:42:23.840 | and it breaks asunder, right?
00:42:25.520 | And then you have this terrible,
00:42:26.720 | you can have love and hate and lust and anger
00:42:29.640 | and all of it's mixed in.
00:42:31.200 | And when you try to analyze yourself,
00:42:33.360 | why am I so upset?
00:42:35.000 | It's very, very difficult to penetrate to those basements,
00:42:38.840 | those caverns in your mind,
00:42:41.200 | because the prying eyes of consciousness
00:42:43.640 | don't have access to those.
00:42:44.960 | But they're there in the amygdala,
00:42:46.600 | or in lots of other places,
00:42:48.560 | they make you upset or angry or sad or depressed.
00:42:51.280 | And it's very difficult to try
00:42:53.120 | to actually uncover the reason.
00:42:54.640 | You can go to a shrink,
00:42:55.520 | you can talk with your friend endlessly.
00:42:57.400 | You construct finally a story why this happened,
00:42:59.960 | why you love her or don't love her or whatever,
00:43:02.080 | but you don't really know whether that actually happened,
00:43:05.760 | because you simply don't have access
00:43:07.400 | to those parts of the brain,
00:43:08.520 | and they're very powerful.
00:43:09.720 | - Do you think that's a feature or a bug of our brain?
00:43:12.560 | The fact that we have this deep,
00:43:14.520 | difficult to dive into subconscious?
00:43:16.920 | - I think it's a feature, because otherwise,
00:43:19.280 | look, we are, like any other brain
00:43:23.800 | or nervous system or computer,
00:43:26.280 | we are severely band-limited.
00:43:28.560 | If everything I do, every emotion I feel,
00:43:32.560 | every eye movement I make,
00:43:34.040 | if all of that had to be
00:43:35.120 | under the control of consciousness,
00:43:37.080 | I wouldn't be here.
00:43:41.040 | So what you do early on, with your brain,
00:43:44.560 | is you have to be conscious when you learn things
00:43:46.400 | like typing or riding a bike.
00:43:48.880 | But then what you do is you train up
00:43:50.720 | routes, I think ones that involve the basal ganglia and striatum,
00:43:54.560 | you train up different parts of your brain.
00:43:56.720 | And then once you do it automatically like typing,
00:43:59.080 | you can show you do it much faster
00:44:00.600 | without even thinking about it,
00:44:01.720 | because you've got these highly specialized,
00:44:03.320 | what Francis Crick and I call zombie agents,
00:44:06.360 | that are taking care of that,
00:44:08.200 | while your consciousness can sort of worry
00:44:09.840 | about the abstract sense of the text you wanna write.
00:44:12.560 | And I think that's true for many, many things.
00:44:14.760 | - But for the things like all the fights you had
00:44:18.360 | with an ex-girlfriend, things that you would think
00:44:21.880 | are not useful to still linger somewhere
00:44:24.000 | in the subconscious.
00:44:25.240 | So that seems like a bug that it would stay there.
00:44:28.400 | - You think it would be better if you can analyze it
00:44:30.320 | and then get it out of the system.
00:44:31.160 | - And get it out of the system,
00:44:32.320 | or just forget it ever happened.
00:44:34.200 | You know, that seems a very buggy kind of--
00:44:37.280 | - Well, yeah, in general, we don't have,
00:44:39.360 | and that's probably functional,
00:44:40.520 | we don't have an ability, unless it's extreme,
00:44:42.800 | there are cases, clinical dissociations, right?
00:44:45.160 | When people are heavily abused,
00:44:47.360 | when they completely repress the memory,
00:44:49.880 | but that doesn't happen in normal people.
00:44:53.440 | We don't have an ability to remove traumatic memories.
00:44:57.120 | And of course, we suffer from that.
00:44:58.960 | On the other hand, probably if you had the ability
00:45:01.920 | to constantly wipe your memory,
00:45:03.840 | you probably do it to an extent that isn't useful to you.
00:45:08.520 | - Yeah, it's a good question.
00:45:10.360 | - It's a balance.
00:45:11.200 | So on the books, as Jack mentioned,
00:45:14.680 | correct me if I'm wrong, but broadly speaking,
00:45:17.520 | in academia and the different scientific disciplines,
00:45:20.360 | certainly in engineering,
00:45:21.960 | reading literature seems to be a rare pursuit.
00:45:25.360 | Perhaps I'm wrong on this, but that's in my experience.
00:45:28.680 | Most people read much more technical texts
00:45:30.680 | and do not sort of escape or seek truth in literature.
00:45:35.680 | It seems like you do.
00:45:38.320 | So what do you think is the value?
00:45:40.120 | What do you think literature adds
00:45:41.160 | to the pursuit of scientific truth?
00:45:43.760 | Do you think it's good, it's useful for--
00:45:47.280 | - I think it's the access to a much wider array
00:45:49.880 | of human experiences.
00:45:52.560 | - How valuable do you think it is?
00:45:54.240 | - Well, if you wanna understand human nature
00:45:56.920 | and nature in general,
00:45:57.920 | then I think you have to better understand
00:46:00.280 | a wide variety of experiences,
00:46:02.520 | not just sitting in a lab, staring at a screen
00:46:04.840 | and having a face flashed onto you
00:46:06.520 | for a hundred milliseconds and pushing a button.
00:46:07.920 | That's what I used to do.
00:46:09.400 | That's what most psychologists do.
00:46:10.600 | There's nothing wrong with that,
00:46:11.920 | but you need to consider lots of other strange states.
00:46:16.720 | - And literature is a shortcut for this.
00:46:18.920 | - Well, yeah, because literature,
00:46:19.920 | that's what literature is all about,
00:46:21.880 | all sorts of interesting experiences that people have,
00:46:24.520 | the contingency of it,
00:46:26.880 | the fact that women experience the world differently,
00:46:29.160 | black people experience the world differently.
00:46:30.800 | And one way to experience that
00:46:33.480 | is reading all this different literature
00:46:35.280 | and trying to find out.
00:46:36.600 | You see everything is so relative, right?
00:46:38.280 | You read a book from 300 years ago,
00:46:39.360 | and they thought about certain problems
00:46:40.760 | very, very differently than we do today.
00:46:43.240 | We today, like any culture, think we know it all.
00:46:46.040 | That's common to every culture.
00:46:47.320 | Every culture believes that, in its heyday, it knows it all.
00:46:49.840 | And then you realize, well,
00:46:50.880 | there's other ways of viewing the universe,
00:46:52.800 | and some of them may have lots of things in their favor.
00:46:55.760 | - So this is a question I wanted to ask
00:46:58.520 | about time scale or scale in general.
00:47:01.600 | When you, with IIT or in general,
00:47:04.760 | try to think about consciousness,
00:47:06.360 | try to think about these ideas,
00:47:08.680 | we kind of naturally think on human time scales,
00:47:13.600 | and also about entities that are sized close to humans.
00:47:18.600 | Do you think of things that are much larger,
00:47:20.600 | or much smaller, as containing consciousness?
00:47:22.520 | And do you think of things that take, you know,
00:47:26.040 | well, ages, eons, to operate
00:47:31.040 | in their conscious cause-effect?
00:47:33.480 | - It's a very good question.
00:47:35.560 | So I think a lot about small creatures
00:47:37.200 | because experimentally, you know,
00:47:38.480 | a lot of people work on flies and bees.
00:47:41.360 | So, and most people just think they are automata.
00:47:43.580 | They're just bugs for heaven's sake, right?
00:47:45.280 | But if you look at their behavior,
00:47:46.520 | like bees, they can recognize individual humans.
00:47:48.840 | They have this very complicated way to communicate.
00:47:52.440 | If you've ever been involved,
00:47:53.600 | or you know your parents when they bought a house,
00:47:55.480 | what sort of agonizing decision that is.
00:47:57.560 | And bees have to do that once a year, right?
00:47:59.960 | When they swarm in the spring.
00:48:01.240 | And then they have this very elaborate way.
00:48:02.920 | They have these nest-site scouts.
00:48:04.040 | They go to the individual sites.
00:48:05.680 | They come back, and they have this dance,
00:48:07.560 | literally, where they dance for several days.
00:48:09.440 | They try to recruit other bees.
00:48:10.680 | It's a very complicated way of deciding.
00:48:12.640 | When they finally want to make a decision,
00:48:14.340 | the entire swarm, the scouts warm up the entire swarm,
00:48:17.420 | then go to one location.
00:48:18.460 | They don't go to 50 locations.
00:48:19.580 | They go to one location that the scouts
00:48:21.360 | have agreed upon by themselves.
00:48:22.620 | That's awesome.
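That scout-and-dance process is, in effect, a distributed consensus algorithm. The following is a toy caricature in Python, not a biological model: the site names, the number of scouts, and all of the probabilities are invented for illustration. Scouts advertise candidate sites roughly in proportion to quality, recruitment snowballs, poor sites are abandoned more readily, and the swarm commits once a single site reaches a quorum.

```python
import random

# Toy caricature of scout-based nest-site selection (hypothetical numbers).
random.seed(1)

site_quality = {"hollow oak": 0.9, "old barn": 0.6, "rock crevice": 0.4}
scouts = [None] * 300           # None = uncommitted scout
QUORUM = 0.8                    # fraction of scouts needed on a single site

for step in range(1, 10_001):
    if step <= 20:              # a few independent discoveries seed the debate
        scouts[random.randrange(len(scouts))] = random.choice(list(site_quality))
    dancers = [c for c in scouts if c is not None]
    for i, choice in enumerate(scouts):
        if choice is None:
            # An uncommitted scout follows a random dancer; better sites are
            # danced for longer, so they convert followers more often.
            if dancers and random.random() < 0.3:
                ad = random.choice(dancers)
                if random.random() < site_quality[ad]:
                    scouts[i] = ad
        elif random.random() < 0.05 * (1 - site_quality[choice]):
            scouts[i] = None    # scouts at poor sites give up more readily
    counts = {s: scouts.count(s) for s in site_quality}
    best, n = max(counts.items(), key=lambda kv: kv[1])
    if n >= QUORUM * len(scouts):
        print(f"step {step}: swarm commits to the {best} ({n}/{len(scouts)} scouts)")
        break
```

In this sketch the positive feedback of recruitment, plus faster abandonment of poorer sites, reliably funnels the whole swarm onto the single best option, without any scout comparing sites directly.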
00:48:23.600 | If we look at the circuit complexity,
00:48:24.980 | it's ten times denser than anything we have in our brain.
00:48:27.740 | Now they only have a million neurons,
00:48:28.900 | but the neurons are amazingly complex.
00:48:30.820 | Complex behavior, very complicated circuitry.
00:48:32.820 | So there's no question they experience something.
00:48:34.920 | Their life is very different.
00:48:36.300 | They're tiny.
00:48:37.320 | They only live, you know,
00:48:38.860 | well, worker bees live maybe for two months.
00:48:42.100 | So I think, and IIT tells you this,
00:48:44.340 | in principle, the substrate of consciousness
00:48:47.220 | is the substrate that maximizes the cause-effect power
00:48:50.180 | over all possible spatiotemporal grains.
00:48:53.120 | So when I think about, for example,
00:48:54.500 | do you know the science fiction story, The Black Cloud?
00:48:57.700 | Okay, it's a classic by Fred Hoyle, the astronomer.
00:49:00.060 | He has this cloud intervening between the earth
00:49:02.860 | and the sun, leading to some sort of
00:49:06.460 | global cooling.
00:49:07.660 | This is written in the 50s.
00:49:09.020 | It turns out that, using the radio dish,
00:49:12.580 | they can communicate with an entity.
00:49:14.420 | It's actually an intelligent entity.
00:49:16.380 | And they sort of, they convince it to move away.
00:49:20.180 | So here you have a radically different entity.
00:49:22.340 | And in principle, IIT says, well,
00:49:24.660 | you can measure the integrated information
00:49:26.860 | in principle at least.
00:49:27.840 | And yes, if the maximum of that occurs
00:49:30.660 | at a timescale of months,
00:49:32.580 | rather than at some fraction of a second,
00:49:35.600 | yes, then it would experience life
00:49:37.980 | where each moment is a month,
00:49:40.700 | rather than a fraction
00:49:43.420 | of a second as in the human case.
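As a rough numerical illustration of that idea (this is not the actual IIT phi calculus; the toy dynamics and the mutual-information proxy below are invented for the example), take a system whose hidden state changes slowly but is observed through fast noise, and ask at which temporal grain the past best constrains the future. The proxy measure peaks at an intermediate grain, which is the flavor of IIT's claim that experience unfolds at whatever grain maximizes cause-effect power.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(steps, flip_prob=5e-4, noise=0.4):
    """Slow hidden binary state, observed through fast per-step noise."""
    s = np.zeros(steps, dtype=int)
    for t in range(1, steps):
        s[t] = s[t - 1] ^ (rng.random() < flip_prob)
    return s ^ (rng.random(steps) < noise)

def mutual_info(a, b):
    """Mutual information in bits between two binary sequences."""
    p = np.zeros((2, 2))
    np.add.at(p, (a, b), 1)
    p /= p.sum()
    outer = p.sum(1, keepdims=True) @ p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / outer[nz])).sum())

x = simulate(500_000)
for grain in (1, 100, 10_000):
    # Coarse-grain in time: majority vote over non-overlapping windows.
    w = (x[: len(x) // grain * grain].reshape(-1, grain).mean(1) > 0.5).astype(int)
    print(f"grain {grain:>6}: MI(past; future) = {mutual_info(w[:-1], w[1:]):.3f} bits")
```

At the finest grain the per-step noise hides the structure, at a very coarse grain the slow state has already decorrelated, and the proxy is largest at the intermediate grain; by analogy, that intermediate grain would be the timescale of each "moment" for such a system.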
00:49:46.700 | And so there may be forms of consciousness
00:49:48.660 | that we simply don't recognize for what they are
00:49:50.940 | because they are so radically different
00:49:53.300 | from anything you and I are used to.
00:49:55.780 | Again, that's why it's good to read
00:49:58.020 | or to watch science fiction movies
00:49:59.300 | in order to think about this.
00:50:00.740 | Like this. Do you know Stanisław Lem,
00:50:04.060 | the Polish science fiction writer?
00:50:05.380 | He wrote Solaris.
00:50:06.220 | It was turned into a Hollywood movie.
00:50:07.660 | - Yes.
00:50:08.500 | - That was his best-known novel, from the '60s;
00:50:10.220 | he's very, very engineering, he has an engineering background.
00:50:12.980 | His most interesting novel is called The Invincible,
00:50:15.940 | where human civilization,
00:50:17.740 | they have this mission to this planet
00:50:21.020 | and everything is destroyed and they discover machines,
00:50:24.100 | humans got killed and then these machines took over
00:50:27.100 | and there was a machine evolution,
00:50:29.060 | a Darwinian evolution, he talks about this very vividly.
00:50:31.860 | And finally, the dominant,
00:50:34.260 | the dominant machine intelligence organism that survived
00:50:37.980 | were gigantic clouds of little hexagonal
00:50:40.900 | universal cellular automata.
00:50:42.460 | This was written in the '60s.
00:50:44.220 | So typically they're all lying on the ground
00:50:46.140 | individual by themselves, but in times of crisis,
00:50:48.460 | they can communicate, they assemble into gigantic nets,
00:50:51.300 | into clouds of trillions of these particles
00:50:54.300 | and then they become hyperintelligent
00:50:55.780 | and they can beat anything that humans can throw at it.
00:50:59.980 | It's a very beautiful and compelling way
00:51:02.260 | to have an intelligence where finally the humans
00:51:05.220 | leave the planet, they simply are unable to understand
00:51:08.140 | and comprehend this creature.
00:51:09.420 | And they can say, well, either we can nuke
00:51:10.740 | the entire planet and destroy it
00:51:12.580 | or we just have to leave because fundamentally
00:51:14.820 | it's an alien, it's so alien from us and our ideas
00:51:18.140 | that we cannot communicate with them.
00:51:19.980 | - Yeah, actually, in a conversation about cellular automata,
00:51:24.580 | Stephen Wolfram brought up his idea that there could
00:51:27.940 | already be these artificial general intelligences,
00:51:30.580 | like super smart, or maybe conscious, beings,
00:51:32.500 | in the cellular automata; we just don't know
00:51:34.900 | how to talk to them.
00:51:36.340 | So it's a problem of the language of communication,
00:51:37.700 | because you don't know what to do with it.
00:51:39.740 | So one sort of view is that
00:51:41.380 | consciousness is only something you can measure,
00:51:45.100 | so it's not conscious
00:51:47.740 | if you can't measure it.
00:51:50.140 | - So you're making an ontological
00:51:51.700 | and an epistemic statement.
00:51:53.060 | One is ontological; it's just like with the multiverses,
00:51:57.380 | that might be true, but I can't communicate with them.
00:51:59.740 | I can't have any knowledge of them.
00:52:01.540 | That's an epistemic argument.
00:52:03.460 | Those are two different things.
00:52:04.300 | So it may well be possible.
00:52:05.820 | Look, another case that's happening right now,
00:52:07.620 | people are building these mini organoids.
00:52:09.700 | Do you know about this?
00:52:10.860 | So you can take stem cells from under your arm,
00:52:12.940 | put them in a dish, add four transcription factors,
00:52:15.260 | and then you can induce them to grow into large,
00:52:18.500 | well, large, they're a few millimeters across,
00:52:19.820 | like half a million neurons
00:52:21.500 | that look like nerve cells in a dish, called mini organoids.
00:52:24.780 | At Harvard, at Stanford, everywhere they're building them.
00:52:27.220 | It may well be possible that they're beginning
00:52:29.220 | to feel like something,
00:52:30.740 | but we can't really communicate with them right now.
00:52:33.540 | So people are beginning to think about the ethics of this.
00:52:36.300 | So yes, he may be perfectly right,
00:52:37.980 | but the one question, are they conscious or not,
00:52:41.780 | is totally separate from the question of how I would know.
00:52:43.940 | Those are two different things.
00:52:45.740 | - If you could give advice to a young researcher
00:52:49.660 | sort of dreaming of understanding
00:52:52.260 | or creating human level intelligence or consciousness,
00:52:56.420 | what would you say?
00:52:59.900 | - Just follow your dreams, read widely.
00:53:03.580 | - No, I mean, I suppose what discipline,
00:53:05.860 | what is the pursuit that they should take on?
00:53:08.860 | Is it neuroscience,
00:53:09.700 | is it computational cognitive science,
00:53:11.620 | is it philosophy, is it computer science or robotics?
00:53:14.980 | - No, in a sense that, okay, so the only known system
00:53:21.460 | that has a high level of intelligence is Homo sapiens.
00:53:24.380 | So if you wanted to build it,
00:53:25.660 | it's probably good to continue to study closely
00:53:28.140 | what humans do, so cognitive neuroscience,
00:53:30.500 | somewhere between cognitive neuroscience on the one hand,
00:53:33.060 | then some philosophy of mind,
00:53:34.540 | and then AI, computer science.
00:53:37.300 | You can look at all the original ideas, neural networks,
00:53:40.340 | they all came from neuroscience, right?
00:53:42.340 | Reinforcement learning, whether it's Minsky building the SNARC,
00:53:46.300 | or whether it's the early Hubel and Wiesel experiments
00:53:48.580 | at Harvard that then gave rise to networks
00:53:50.540 | and then multi-layer networks.
00:53:52.740 | So it may well be possible, in fact,
00:53:55.040 | some people argue that to make the next big step in AI,
00:53:58.020 | once we realize the limits of deep convolutional networks,
00:54:01.620 | they can do certain things,
00:54:02.540 | but they can't really understand.
00:54:04.140 | I can't really show them just one image and have them understand.
00:54:08.820 | I can show you a single image of somebody,
00:54:11.780 | a pickpocket who steals a wallet from a purse.
00:54:14.820 | You immediately know that's a pickpocket.
00:54:16.980 | Now, a computer system would just say,
00:54:18.260 | well, it's a man, it's a woman, it's a purse, right?
00:54:20.900 | Unless you train this machine
00:54:22.460 | by showing it 100,000 pickpockets, right?
00:54:24.460 | So it doesn't have this easy understanding
00:54:26.900 | that you have, right?
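A minimal sketch of that point, assuming a recent version of torchvision and its off-the-shelf COCO-trained detector (the image path "street_scene.jpg" is a placeholder): the network can name the parts of the scene, such as "person" and "handbag", but a relational, intentional category like "pickpocketing" is simply not in its output vocabulary unless it has been trained on many labeled examples of it.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# A few of the standard COCO category ids used by torchvision detectors.
COCO_NAMES = {1: "person", 27: "backpack", 31: "handbag", 32: "tie"}

# Older torchvision versions use pretrained=True instead of weights="DEFAULT".
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

img = to_tensor(Image.open("street_scene.jpg").convert("RGB"))
with torch.no_grad():
    det = model([img])[0]

for label, score in zip(det["labels"].tolist(), det["scores"].tolist()):
    if score > 0.8:
        print(COCO_NAMES.get(label, f"class {label}"), round(score, 2))
# The output is a list of object categories; nothing in it says "theft".
```

The detector enumerates objects; inferring what the hand reaching into the purse means is exactly the kind of one-shot scene understanding the speaker is pointing at.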
00:54:28.900 | So some people make the argument,
00:54:31.140 | in order to go to the next step,
00:54:32.420 | where you really wanna build machines that understand
00:54:34.440 | in the way you and I do, we have to go to psychology.
00:54:37.040 | We need to understand how we do it
00:54:38.380 | and how our brains enable us to do it.
00:54:40.300 | And so therefore, being on the cusp,
00:54:42.060 | it's also so exciting to try to understand better our nature
00:54:48.180 | and then to build, to take some of those insights
00:54:49.000 | and build them in.
00:54:49.000 | So I think the most exciting thing
00:54:49.940 | is somewhere in the interface between cognitive science,
00:54:52.560 | neuroscience, AI, computer science, and philosophy of mind.
00:54:56.660 | - Beautiful. Yeah, I'd say that,
00:54:58.100 | from the machine learning, from the computer science,
00:55:00.300 | computer vision perspective,
00:55:01.900 | much of the research kind of ignores
00:55:04.580 | the way the human brain works.
00:55:05.900 | - Yes. - It ignores even psychology
00:55:07.580 | or literature or studying the brain.
00:55:09.540 | I would hope Josh Tenenbaum talks about
00:55:12.740 | bringing that in more and more.
00:55:14.620 | And that's, yeah.
00:55:15.900 | So you've worked on some amazing stuff
00:55:19.020 | throughout your life.
00:55:19.940 | What's the thing that you're really excited about?
00:55:24.380 | What's the mystery that you would love to uncover
00:55:27.820 | in the near term,
00:55:28.820 | beyond all the mysteries that you're already surrounded by?
00:55:32.820 | - Well, so there's a structure called the claustrum.
00:55:35.860 | - Okay. - This is a structure,
00:55:37.300 | it's underneath our cortex, it's yay big.
00:55:39.940 | You have one on the left, one on the right,
00:55:41.540 | underneath this part, underneath the insula.
00:55:43.460 | It's very thin, it's like one millimeter.
00:55:45.180 | It's embedded in wiring, in white matter,
00:55:47.580 | so it's very difficult to image.
00:55:49.860 | And it has connection to every cortical region.
00:55:54.860 | And Francis Crick, the last paper he ever wrote,
00:55:56.980 | he dictated corrections the day he died in hospital
00:55:59.380 | on this paper.
00:56:00.780 | He and I, we hypothesized, well,
00:56:03.420 | 'cause it has this unique anatomy,
00:56:05.220 | it gets input from every cortical area
00:56:07.020 | and projects back to every cortical area.
00:56:10.740 | That the function of this structure is similar,
00:56:13.900 | it's just a metaphor,
00:56:14.980 | to the role of a conductor in a symphony orchestra.
00:56:18.740 | You have all the different cortical players.
00:56:20.540 | You have some that do motion,
00:56:21.820 | some that do theory of mind,
00:56:23.140 | some that infer social interaction,
00:56:24.740 | and color, and hearing,
00:56:25.780 | and all the different modules in cortex.
00:56:28.100 | But of course, what consciousness is,
00:56:29.540 | consciousness puts it all together into one package,
00:56:32.060 | the binding problem, all of that.
00:56:33.300 | And this is really the function,
00:56:35.180 | because it has relatively few neurons compared to cortex,
00:56:38.140 | but it talks, it sort of receives input from all of them,
00:56:41.780 | and it projects back to all of them.
00:56:43.460 | And so we are testing that right now.
00:56:45.020 | We've got this beautiful neuronal reconstruction
00:56:47.460 | in the mouse, called crown-of-thorns neurons,
00:56:51.060 | that are in the claustrum,
00:56:52.540 | that have the most widespread connection
00:56:54.420 | of any neuron I've ever seen.
00:56:56.340 | You have individual neurons
00:56:58.540 | that sit in the claustrum, tiny,
00:56:59.940 | but then these single neurons
00:57:01.540 | have this huge axonal tree that covers
00:57:03.980 | both ipsi- and contralateral cortex.
00:57:06.980 | And we're trying, using fancy tools like optogenetics,
00:57:10.340 | to turn those neurons on or off
00:57:12.340 | and study what happens in the mouse.
00:57:15.380 | - So this thing is perhaps where the parts become the whole.
00:57:19.380 | It integrates--
00:57:20.220 | - Perhaps, it's one of the structures,
00:57:22.060 | it's a very good way of putting it,
00:57:24.260 | where the individual parts turn into the whole
00:57:27.100 | of the conscious experience.
00:57:29.760 | - Well, with that, thank you very much for being here today.
00:57:35.340 | - Thank you very much. - It's wonderful
00:57:36.180 | to have you back at MIT. - Excellent, Jack.
00:57:37.660 | Thank you very much.
00:57:38.660 | (upbeat music)