Christof Koch: Consciousness | Lex Fridman Podcast #2
Chapters
0:00
12:20 Pure Experience
15:16 Turing Test for Intelligence and Consciousness
16:45 A Theory of Consciousness
18:25 Why Is Consciousness a Hard Problem
20:57 Russellian Monism
22:17 The Difference between Life, Intelligence, and Consciousness
22:50 Hylomorphism
24:13 What Is Death
28:19 Consciousness Is Not Required To Create Human Level Intelligence
28:36 What We Mean by Intelligence
30:35 Empathy
31:25 Fear and Suffering Are Essential To Have Consciousness
34:21 Where Does Religion Fit into Your Thinking about Consciousness
52:07 Mini Organoids
53:28 Cognitive Neuroscience
00:00:00.000 |
As part of MIT course 6S099 on Artificial General Intelligence, 00:00:05.000 |
I got a chance to sit down with Christof Koch, 00:00:07.320 |
who is one of the seminal figures in neurobiology, neuroscience, 00:00:15.480 |
He is the president, the chief scientific officer 00:00:18.680 |
of the Allen Institute for Brain Science in Seattle. 00:00:22.120 |
From 1986 to 2013, he was a professor at Caltech. 00:00:28.960 |
He is extremely well cited, over 100,000 citations. 00:00:33.760 |
His research, his writing, his ideas have had big impact 00:00:37.760 |
on the scientific community and the general public 00:00:46.360 |
The Quest for Consciousness: A Neurobiological Approach, 00:00:50.480 |
Consciousness: Confessions of a Romantic Reductionist. 00:01:04.280 |
for any people you'd like to see be part of the course 00:01:07.040 |
or any ideas that you would like us to explore. 00:01:11.880 |
Okay, before we delve into the beautiful mysteries 00:01:15.360 |
of consciousness, let's zoom out a little bit. 00:01:18.720 |
And let me ask, do you think there's intelligent life 00:01:26.680 |
but I think the probabilities are overwhelming 00:01:33.600 |
and each galaxy has between 10 to the 11 and 10 to the 12 stars, 00:00:37.920 |
and we know most stars have one or more planets. 00:01:52.000 |
And independent of whether there are other creatures 00:02:07.000 |
Do you think, if those intelligent creatures are out there, 00:02:20.160 |
So consciousness isn't just human, you're right, 00:02:23.680 |
It may well be spread across all of biology. 00:02:33.080 |
Babies and little children can talk about it, 00:02:35.200 |
patients who have a stroke in the left inferior frontal gyrus 00:02:40.800 |
but most normal adult people can talk about it, 00:02:45.440 |
compared to, let's say, monkeys or dogs or cats or mice 00:02:47.920 |
or all the other creatures that we share the planet with. 00:02:54.320 |
and so it's overwhelmingly likely 00:02:54.320 |
that aliens would also experience their world. 00:03:20.120 |
so we might not be able to understand their poetry 00:03:37.000 |
First of all, you're technically a Midwestern boy. 00:04:23.760 |
because that's the only world you know, right? 00:04:45.360 |
And there's nothing really in physics right now 00:04:54.040 |
So, you can look at the foundational equation 00:04:57.520 |
You can look at the periodic table of the elements. 00:04:59.880 |
You can look at the endless ATGC code in our genes 00:05:29.400 |
Some people call it qualia, if they're philosophers. 00:05:34.000 |
in the famous words of the philosopher Thomas Nagel, 00:05:52.960 |
Could be as mundane as just sitting in a chair. 00:05:55.240 |
Could be as exalted as having a mystical moment 00:06:01.160 |
Those are just different forms of experiences. 00:06:04.200 |
So, if you were to sit down with maybe the next, 00:06:15.120 |
between Watson, that might be much smarter than you, 00:06:38.960 |
That's a question Alan Turing tried to answer. 00:06:40.920 |
And of course, he did it in this indirect way, 00:06:47.760 |
he tried to get at what does it mean for a person to think. 00:06:51.480 |
And then he had this test, you lock 'em away, 00:06:57.640 |
whether that is a person or whether it's a computer system. 00:07:03.080 |
Alexa or Siri or Google Now will pass this test. 00:07:12.040 |
there will be machines that will speak with complete poise, 00:07:25.640 |
does it feel like anything to be Samantha in the movie "Her"? 00:07:36.680 |
there are two different concepts here that we commingle. 00:07:49.680 |
Now, historically, we associate consciousness 00:07:54.080 |
Because we live in a world, leaving aside computers, 00:07:56.840 |
of natural selection, where we're surrounded by creatures, 00:08:00.560 |
either our own kin that are less or more intelligent, 00:08:05.360 |
Some are more adapted to a particular environment, 00:08:07.920 |
others are less adapted, whether it's a whale or dog 00:08:10.520 |
or you go talk about a paramecium or a little worm. 00:08:14.480 |
And we see the complexity of the nervous system 00:08:34.720 |
we believe that as these creatures become more complex, 00:08:38.440 |
they are better adapted to their particular ecological niche, 00:08:46.480 |
and we believe consciousness, unlike the ancient, 00:08:48.720 |
ancient peoples thought, almost every culture thought 00:09:04.280 |
And for Valentine's Day, you should give your sweetheart, 00:09:07.160 |
you know, a hypothalamic shaped piece of chocolate, 00:09:14.520 |
And so we see brains of different complexity, 00:09:27.880 |
where we're beginning to engineer intelligence, 00:09:31.920 |
and it's radically unclear whether the intelligence 00:09:34.880 |
we're engineering has anything to do with consciousness 00:09:44.360 |
Intelligence, no matter exactly how you define it, 00:09:51.480 |
you know, the setup of this and what's going on 00:09:53.560 |
and who are the actors and what's gonna happen next, 00:10:08.720 |
You can see it, for instance, in the clinic. 00:10:12.040 |
When you're dealing with patients who, let's say, 00:10:14.640 |
had a stroke or were in a traffic accident, et cetera, 00:10:20.040 |
Terri Schiavo, you may have heard historically, 00:10:22.960 |
she was a person here in the '90s in Florida. 00:10:32.160 |
So there are thousands of people in a vegetative state. 00:11:02.360 |
You can design and build brain machine interfaces 00:11:06.600 |
where you can see they still experience something. 00:11:09.200 |
And of course, in these cases of locked-in state, 00:11:19.520 |
unable to move except for vertical eye movements. 00:11:33.480 |
In this case, you have no behavior, you have consciousness. 00:11:56.040 |
and you're not surprised that you're meeting them, 00:11:59.320 |
of love, of hate, you know, they can be very emotional. 00:12:02.240 |
Your brain during this state, typically REM sleep, 00:12:05.080 |
sends an active signal to your motor neurons 00:12:11.400 |
Because if you don't have that, like some patients, 00:12:15.560 |
You get, for example, REM sleep behavior disorder, 00:12:27.440 |
I went to Singapore and went into a flotation tank, right? 00:12:37.720 |
You strip completely naked, you lie inside of it. 00:13:07.880 |
And if you train yourself, like in a meditation, 00:13:37.920 |
You're not planning, yet you are fully conscious. 00:13:49.440 |
- Meaning what is the function of you being able 00:14:02.640 |
- Obviously, we didn't evolve with flotation tanks 00:14:09.240 |
at asking why questions, teleological questions. 00:14:12.440 |
Why don't we have four eyes, like some creatures, 00:14:15.440 |
Well, no, there's probably, there is a function to that, 00:14:18.200 |
but we're not very good at answering those questions. 00:14:25.440 |
Why is there a charge in the universe, right? 00:14:28.400 |
where there are positive and negative charges, why? 00:14:41.640 |
Clearly, there's some relationship between complexity, 00:14:48.080 |
But, however, in these cases, in these three examples I gave, 00:14:57.640 |
everybody can have these sort of mystical experiences. 00:15:12.080 |
Let me ask a question that's not a why question. 00:15:15.040 |
You're giving a talk later today on the Turing test 00:15:23.680 |
there's consciousness present in this entity or not? 00:15:44.080 |
if there's consciousness present in this thing? 00:16:10.800 |
you can do it in awake people and then anesthetize them. 00:16:15.440 |
And it has 100% accuracy that in all those cases, 00:16:31.080 |
Now, of course, you point out that may not help you 00:16:36.520 |
and if I send a magnetic pulse into my iPhone 00:16:38.880 |
or my computer, it's probably gonna break something. 00:16:56.600 |
But we believe that babies also have conscious experiences. 00:17:01.240 |
And then there are all these patients I mentioned, 00:17:05.000 |
When you dream, you can't talk because you're paralyzed. 00:17:15.400 |
What is it about a piece of highly excitable matter, 00:17:32.120 |
only us with a special thing that animals don't have. 00:17:54.320 |
And in particular, once their behavior matches, 00:17:58.160 |
so if you have Siri or Alexa in 20 years from now 00:18:01.240 |
that she can talk just as good as any possible human, 00:18:04.480 |
what grounds do you have to say she's not conscious? 00:18:24.920 |
the very hard, why is consciousness a hard problem? 00:18:33.280 |
I have direct experience of my own consciousness. 00:18:36.480 |
I don't have experience of your consciousness. 00:18:41.240 |
who believes in probability theory and all of that, 00:18:43.120 |
you know, I can do an abduction to the best available facts. 00:18:49.920 |
your brain is roughly gonna behave the same way as I do. 00:18:56.400 |
You'd tell me things that I would also say, more or less. 00:19:00.400 |
So I infer based on all of that that you're conscious. 00:19:03.840 |
So there I really need a theory that tells me 00:19:24.080 |
Can you describe what physicalism versus dualism, 00:19:46.160 |
Or do you not see panpsychism as fitting into that? 00:19:54.880 |
that's been around, I mean, Plato and Aristotle 00:19:57.880 |
talks about it, modern philosophers talk about it. 00:20:01.440 |
Of course, in Buddhism, the idea is very prevalent 00:20:04.440 |
that, I mean, there are different versions of it. 00:20:06.680 |
One version says everything is in souls, everything. 00:20:14.240 |
All matter is in soul, that's sort of one version. 00:20:25.920 |
That's one I think is somewhat more realistic. 00:20:32.320 |
- Have feeling, have some kind of experience. 00:20:33.880 |
- It feels like something, it may well be possible 00:20:36.360 |
that it feels like something to be a paramecium. 00:20:39.320 |
I think it's pretty likely it feels like something 00:20:51.240 |
And you can, so some people, for example, Bertrand Russell, 00:20:55.160 |
tried to advocate this idea, it's called Russellian monism, 00:20:59.280 |
that panpsychism is really physics viewed from the inside. 00:21:14.080 |
between curvature and mass distribution, okay? 00:21:18.160 |
Physics doesn't really describe the ultimate reality itself. 00:21:28.960 |
And consciousness is what physics feels from the inside. 00:21:35.800 |
particularly my cortex, feels from the inside. 00:21:38.520 |
And so if you're a paramecium, you gotta remember, 00:21:41.400 |
you say paramecium, well, that's a pretty dumb creature. 00:21:43.280 |
It is, but it has already a billion different molecules, 00:21:52.720 |
that no single person, no computer system so far 00:21:55.880 |
on this planet has ever managed to accurately simulate. 00:22:01.240 |
Yes, and it may well be that that little thing 00:22:04.880 |
Now, it doesn't have a voice in the head like me. 00:22:08.320 |
You know, it doesn't have all that complex things, 00:22:14.680 |
Can we draw some lines and maybe try to understand 00:22:38.080 |
- A lot of the stuff we're talking about today 00:22:40.080 |
is full of mysteries and fascinating ones, right? 00:22:42.480 |
- Well, for example, you can go to Aristotle, 00:22:56.720 |
He says, "All biological creatures have a vegetative soul." 00:23:03.480 |
that it's biochemistry and nonlinear thermodynamics. 00:23:09.160 |
Only animals and humans have also a sensitive soul, 00:23:14.600 |
They can see, they can smell, and they have drives. 00:23:18.120 |
They wanna reproduce, they wanna eat, et cetera. 00:23:20.880 |
And then only humans have what he called a rational soul. 00:23:25.920 |
- And that idea then made it into Christendom, 00:23:28.120 |
and then the rational soul is the one that lives forever. 00:23:49.880 |
what is our modern conception of these three, 00:23:52.400 |
Aristotle would have called them different forms. 00:24:00.000 |
Although we don't understand how it originated, 00:24:01.840 |
but it's been difficult to rigorously pin down. 00:24:08.240 |
It's in fact, right now, there's a conference ongoing, 00:24:11.120 |
again, that tries to define legally and medically 00:24:16.400 |
Death is you stop breathing, your heart stops beating. 00:24:21.440 |
If you're unsure, you wait another 10 minutes. 00:24:23.400 |
If the patient doesn't breathe, you know, he's dead. 00:24:25.360 |
Well, now we have ventilators, we have pacemakers. 00:24:28.640 |
So it's much more difficult to define what death is. 00:24:31.000 |
Typically, death is defined as the end of life, 00:24:36.640 |
- Okay, so we don't have really very good definitions. 00:24:39.280 |
Intelligence, we don't have a rigorous definition. 00:24:45.800 |
And we're beginning to build it in a narrow sense, right? 00:24:50.200 |
Like Go, AlphaGo, and Watson, and Google Cars, 00:25:02.480 |
it's something to do with the ability to learn 00:25:10.960 |
And it's very unclear, if you build a machine that has AGI, 00:25:14.720 |
a priori, it's not at all clear 00:25:29.680 |
artificial consciousness would help you get to an AGI? 00:25:36.800 |
do you think intelligence requires consciousness? 00:25:44.840 |
consciousness and intelligence go hand in hand. 00:25:55.160 |
is sort of ultimately what is closely tied 00:26:05.000 |
In artificial systems, particularly in digital machines, 00:26:21.000 |
In fact, you could even do a more radical version 00:26:24.080 |
We can build a computer simulation of the human brain. 00:26:26.800 |
You know what Henry Markram in the Blue Brain Project, 00:26:29.760 |
or the Human Brain Project in Switzerland, is trying to do. 00:26:33.560 |
So in 10 years, we have this perfect simulation 00:26:38.200 |
And it has a thalamus, and it has motor neurons, 00:26:40.400 |
it has a Broca's area, and of course it'll talk, 00:26:43.560 |
and it'll say, "Hi, I just woke up, I feel great." 00:26:46.960 |
Even that computer simulation that can in principle 00:26:53.360 |
It's a difference between the simulated and the real. 00:26:55.880 |
So it simulates the behavior associated with consciousness. 00:27:01.440 |
will have all the intelligence that that particular person 00:27:19.760 |
in general relativity between curvature and mass. 00:27:29.400 |
at the center of our galaxy will be so massive 00:27:40.280 |
why doesn't this computer simulation suck me in? 00:27:50.640 |
So it's a difference between the real and the simulator, 00:27:54.360 |
just like it doesn't get wet inside a computer 00:27:56.360 |
when the computer runs code that simulates a weather storm. 00:27:58.920 |
And so in order to have artificial consciousness, 00:28:06.760 |
You have to build so-called a neuromorphic machine 00:28:09.080 |
that has hardware that is very similar to the human brain, 00:28:44.140 |
But I think if you, maybe another way to say, 00:28:53.880 |
Do you think people will say Siri is intelligent? 00:29:01.080 |
So to be intelligent, it seems like you have to have 00:29:04.200 |
some kind of connections with other human beings, 00:29:17.280 |
And for that, there feels like there has to be 00:29:21.700 |
So you think you can have just the world's best 00:29:24.180 |
NLP system, natural language understanding 00:29:38.200 |
we can get what we call high level functional intelligence, 00:29:43.200 |
this fluid-like intelligence that we cherish, 00:29:52.720 |
and I see a lot of reason to believe it's gonna happen 00:29:54.880 |
very, you know, over the next 50 years or 30 years. 00:29:58.200 |
- So for beneficial AI, for creating an AI system 00:30:09.600 |
aligns its values with our values as humanity. 00:30:14.880 |
- Yes, I think that that is a very good argument 00:30:17.320 |
that if we're concerned about AI and the threat of AI, 00:30:22.800 |
I think having an intelligence that has empathy, right? 00:30:29.480 |
why do most of us find that abhorrent, abusing any animal? 00:30:38.320 |
really means feeling with; pathos, empathy. 00:30:43.160 |
I see somebody else suffer that isn't even my conspecific, 00:30:51.280 |
but I feel naturally most of us, not all of us, 00:30:55.680 |
And so it may well be in the long-term interest 00:31:02.080 |
that if we do build AGI and it's really becomes very powerful 00:31:11.880 |
- So as part of the full conscious experience, 00:31:17.480 |
or in our human consciousness, do you think fear, 00:31:21.440 |
maybe we're gonna get into your earlier days, 00:31:24.320 |
with Nietzsche and so on, but do you think fear 00:31:26.200 |
and suffering are essential to have consciousness? 00:31:30.680 |
Do you have to have the full range of experience 00:31:38.600 |
very particular kinds of very positive experiences? 00:31:45.400 |
where you implant an electrode in the hypothalamus, 00:31:49.680 |
and the rat stimulates itself above and beyond anything else. 00:32:00.480 |
I guess it's like an orgasm that you have all day long. 00:32:03.960 |
And so, a priori, I see no reason why you need 00:32:11.320 |
Now, clearly, to survive, that wouldn't work, right? 00:32:31.800 |
Because any real creature, whether artificial or engineered, 00:32:34.360 |
you wanna give it fear, the fear of extinction, 00:32:41.400 |
states that you want the machine encouraged to do 00:32:45.000 |
because they give the machine positive feedback. 00:32:52.480 |
everything having some kind of mental property. 00:33:02.120 |
So, everything having some elements of consciousness. 00:33:05.440 |
Is there something special about human consciousness? 00:33:28.880 |
at least if they have this internal causal power, 00:33:43.680 |
and we have an overblown sense of our own importance. 00:33:53.800 |
But behaviorally, the main thing that we have, 00:34:18.920 |
fundamental human things that religion also touches on. 00:34:41.760 |
you know, sits somewhere in the fullness of time, 00:34:43.920 |
I'll be united in some sort of everlasting bliss, 00:34:50.160 |
Look, the world, the night is large and full of wonders. 00:34:54.160 |
There are many things that I don't understand, 00:35:01.400 |
Dark matter, dark energy, we have no idea what it is, 00:35:09.520 |
it's sort of my current religious or spiritual sentiment 00:35:27.280 |
so when I spent several meetings with the Dalai Lama, 00:35:35.120 |
the Pope or some cardinal, he always emphasized 00:35:45.160 |
not just in people, but in everybody, this universal, 00:35:49.680 |
An animal, generally, is less capable of suffering 00:35:53.520 |
than a well-developed, normally developed human, 00:35:57.920 |
and they think consciousness pervades in this universe, 00:36:04.840 |
you can think of them like mindfulness, et cetera, 00:36:09.760 |
what they claim of this more fundamental aspect of reality. 00:36:16.480 |
and then there's this inside view, consciousness, 00:36:20.440 |
That's the only thing I have access to in my life, 00:36:23.520 |
and you've gotta remember, my conscious experience 00:36:34.600 |
The only thing you directly are acquainted with 00:36:44.240 |
So it sounds like you kind of have a rich life. 00:36:50.040 |
and it seems like you really love literature, 00:36:52.560 |
and consciousness is all about experiencing things. 00:36:56.280 |
So do you think that has helped your research on this topic? 00:37:06.440 |
or now I do rowing, crew rowing, and I bike every day, 00:37:23.720 |
and you wonder, what is it so addicting about it? 00:37:32.840 |
you're not conscious of your inner voice anymore. 00:37:42.440 |
But when you're in the zone, all of that is gone, 00:37:47.920 |
You're climbing, or you're rowing, or biking, 00:37:55.840 |
you're all action, or in this case, of pure experience. 00:38:00.920 |
but in both cases, you experience some aspect of, 00:38:04.720 |
you touch some basic part of conscious existence 00:38:23.200 |
the simulation hypothesis, simulation theory, 00:38:25.360 |
the idea that we all live in a computer simulation? 00:38:27.960 |
Have you given it a shot? - Rapture for nerds. 00:38:35.680 |
that engaged hundreds of scholars for many centuries, 00:38:47.400 |
People love talking about these sorts of things. 00:38:49.480 |
I know there are books written about the simulation hypothesis. 00:38:58.200 |
- But it's not useful for you to think of in those terms. 00:39:00.320 |
So maybe connecting to the questions of free will, 00:39:15.760 |
from a physics perspective, from a consciousness perspective? 00:39:24.160 |
we believe we live in a fully deterministic world, right? 00:39:27.080 |
But then comes, of course, quantum mechanics. 00:39:33.960 |
because the idea that the initial condition of the universe 00:39:37.920 |
and then everything else, we're just acting out 00:39:50.640 |
We are much more constrained by physics, of course, 00:39:53.280 |
and by our past and by our own conscious desires 00:40:06.920 |
and I'm talking really about critical decisions, 00:40:17.880 |
These are things where you really deliberate. 00:40:31.000 |
and try to analyze it under all the various conditions, 00:40:40.560 |
It's not a will that's totally free to do anything it wants. 00:40:48.640 |
you actually write a blog about books you've read. 00:41:12.520 |
because he talks about some of these problems. 00:41:25.960 |
sort of under the guise of, under the mask of charity, 00:41:41.440 |
- And what do you think about the unconscious? 00:41:47.080 |
What's, just like dark matter in the universe, 00:41:53.960 |
This is what a lot of last hundred years of research 00:41:57.760 |
So I think he was a genius, misguided towards the end, 00:42:07.240 |
He contributed himself to the neuron hypothesis, 00:42:20.880 |
You feel this particular, when you're in a relationship 00:42:26.720 |
you can have love and hate and lust and anger 00:42:35.000 |
It's very, very difficult to penetrate to those basements, 00:42:48.560 |
they make you upset or angry or sad or depressed. 00:42:57.400 |
You construct finally a story why this happened, 00:42:59.960 |
why you love her or don't love her or whatever, 00:43:02.080 |
but you don't really know whether that actually happened, 00:43:09.720 |
- Do you think that's a feature or a bug of our brain? 00:43:44.560 |
you have to be conscious when you learn things 00:43:50.720 |
routes I think that involve basal ganglia and striatum, 00:43:56.720 |
And then once you do it automatically like typing, 00:44:09.840 |
about the abstract sense of the text you wanna write. 00:44:12.560 |
And I think that's true for many, many things. 00:44:14.760 |
- But for the things like all the fights you had 00:44:18.360 |
with an ex-girlfriend, things that you would think 00:44:25.240 |
So that seems like a bug that it would stay there. 00:44:28.400 |
- You think it would be better if you can analyze it 00:44:40.520 |
we don't have an ability, unless it's extreme, 00:44:42.800 |
there are cases, clinical dissociations, right? 00:44:53.440 |
We don't have an ability to remove traumatic memories. 00:44:58.960 |
On the other hand, probably if you had the ability 00:45:03.840 |
you probably do it to an extent that isn't useful to you. 00:45:14.680 |
correct me if I'm wrong, but broadly speaking, 00:45:17.520 |
academia and the different scientific disciplines, 00:45:21.960 |
reading literature seems to be a rare pursuit. 00:45:25.360 |
Perhaps I'm wrong on this, but that's in my experience. 00:45:30.680 |
and do not sort of escape or seek truth in literature. 00:45:47.280 |
- I think it's the access to a much wider array 00:46:02.520 |
not just sitting in a lab, staring at a screen 00:46:06.520 |
for a hundred milliseconds and pushing a button. 00:46:11.920 |
but you need to consider lots of other strange states. 00:46:21.880 |
all sorts of interesting experiences that people have, 00:46:26.880 |
the fact that women experience the world different, 00:46:43.240 |
We today, like any culture, think we know it all. 00:46:47.320 |
Every culture believes, in its heyday, that they know it all. 00:46:52.800 |
and some of them may have lots of things in their favor. 00:47:08.680 |
we kind of naturally think in human time scales. 00:47:13.600 |
Or, and also entities that are sized close to humans. 00:47:22.520 |
And do you think of things that take, you know, 00:47:41.360 |
So, and most people just think they are automata. 00:47:46.520 |
like bees, they can recognize individual humans. 00:47:48.840 |
They have this very complicated way to communicate. 00:47:53.600 |
or you know your parents when they bought a house, 00:48:05.680 |
They come back, they have this dance, the waggle dance, 00:48:05.680 |
and then the scouts lead the entire swarm, 00:48:14.340 |
it's 10 times denser than anything we have in our brain. 00:48:24.980 |
Complex behavior, very complicated circuitry. 00:48:32.820 |
So there's no question they experience something. 00:48:47.220 |
is the substrate that maximizes the cause-effect power 00:48:54.500 |
do you know the science fiction story, The Black Cloud? 00:48:57.700 |
Okay, it's a classic by Fred Hoyle, the astronomer. 00:49:00.060 |
He has this cloud intervening between the earth 00:49:16.380 |
And they sort of, they convince it to move away. 00:49:27.840 |
And yes, if the maximum of that occurs 00:49:32.580 |
rather than in a, sort of, fraction of a second, 00:49:40.700 |
or microsecond, right, rather than a fraction 00:49:48.660 |
that we simply don't recognize for what they are 00:50:10.220 |
he has a very, very engineering background. 00:50:12.980 |
His most interesting novel is called The Invincible 00:50:21.020 |
and everything is destroyed and they discover machines, 00:50:24.100 |
humans got killed and then these machines took over 00:50:29.060 |
a Darwinian evolution, he talks about this very vividly. 00:50:34.260 |
the dominant machine intelligence organism that survived 00:50:46.140 |
individual by themselves, but in times of crisis, 00:50:48.460 |
they can communicate, they assemble into gigantic nets, 00:50:55.780 |
and they can beat anything that humans can throw at it. 00:51:02.260 |
to have an intelligence where finally the humans 00:51:05.220 |
leave the planet, they simply are unable to understand 00:51:12.580 |
or we just have to leave because fundamentally 00:51:14.820 |
it's an alien, it's so alien from us and our ideas 00:51:19.980 |
- Yeah, actually in conversation, cellular automata, 00:51:24.580 |
Stephen Wolfram brought up is that there could be 00:51:47.740 |
So it's not conscious if you can't measure it. 00:51:53.060 |
One is it's just like seeing the multiverses, 00:51:57.380 |
that might be true, but I can't communicate with them. 00:52:05.820 |
Look, another case that's happening right now, 00:52:10.860 |
So you can take stem cells from under your arm, 00:52:12.940 |
put it in a dish, add four transcription factors 00:52:15.260 |
and then you can induce them to grow into large, 00:52:21.500 |
that look like nerve cells in a dish called mini organoids. 00:52:24.780 |
At Harvard, at Stanford, everywhere they're building them. 00:52:27.220 |
It may well be possible that they're beginning 00:52:30.740 |
but we can't really communicate with them right now. 00:52:33.540 |
So people are beginning to think about the ethics of this. 00:52:37.980 |
but it's one question, are they conscious or not, 00:52:41.780 |
is totally separate question, how would I know? 00:52:45.740 |
- If you could give advice to a young researcher 00:52:52.260 |
or creating human level intelligence or consciousness, 00:53:05.860 |
what is the pursuit that they should take on? 00:53:11.620 |
is it philosophy, is it computer science or robotics? 00:53:14.980 |
- No, in a sense that, okay, so the only known system 00:53:21.460 |
that has a high level of intelligence is Homo sapiens. 00:53:25.660 |
it's probably good to continue to study closely 00:53:30.500 |
somewhere between cognitive neuroscience on the one hand, 00:53:37.300 |
You can look at all the original ideas, neural networks, 00:53:42.340 |
Reinforcement learning, whether it's Minsky building the SNARC, 00:53:46.300 |
or whether it's the early Hubel and Wiesel experiments 00:53:55.040 |
some people argue that to make the next big step in AI, 00:53:58.020 |
once we realize the limits of deep convolutional networks, 00:54:11.780 |
a pickpocket who steals a wallet from a purse. 00:54:18.260 |
well, it's a man, it's a woman, it's a purse, right? 00:54:32.420 |
where you really wanna build machines that understand, 00:54:34.440 |
in the way you and I do, we have to go to psychology. 00:54:42.060 |
it's also so exciting to try to understand better our nature 00:54:45.860 |
and then to build, to take some of those insight 00:54:49.940 |
is somewhere in the interface between cognitive science, 00:54:52.560 |
neuroscience, AI, computer science, and philosophy of mind. 00:54:58.100 |
from the machine learning, from the computer science, 00:55:19.940 |
What's the thing that you're really excited about? 00:55:24.380 |
What's the mystery that you would love to uncover 00:55:28.820 |
beyond all the mysteries that you're already surrounded by? 00:55:32.820 |
- Well, so there's a structure called the claustrum. 00:55:49.860 |
And it has connection to every cortical region. 00:55:54.860 |
And Francis Crick, the last paper he ever wrote, 00:55:56.980 |
he dictated corrections the day he died in hospital 00:56:10.740 |
That the function of this structure is similar, 00:56:14.980 |
to the role of a conductor in a symphony orchestra. 00:56:29.540 |
consciousness puts it all together into one package, 00:56:35.180 |
because it has relatively few neurons compared to cortex, 00:56:38.140 |
but it talks, it sort of receives input from all of them, 00:56:45.020 |
We've got this beautiful neuronal reconstruction 00:57:06.980 |
and trying to turn, using fancy tools like optogenetics, 00:57:15.380 |
- So this thing is perhaps where the parts become the whole. 00:57:24.260 |
where the individual parts turn into the whole 00:57:29.760 |
- Well, with that, thank you very much for being here today.