AI Simulating Humans to Understand Itself (Joscha Bach) | AI Podcast Clips


00:00:00.000 | - What, let me ask a romanticized question,
00:00:04.040 | what is the most beautiful to you, silly ape,
00:00:07.720 | the most beautiful or surprising idea
00:00:10.360 | in the development of artificial intelligence,
00:00:12.440 | whether in your own life or in the history
00:00:14.520 | of artificial intelligence that you've come across?
00:00:17.040 | - If you build an AI, it can probably make models
00:00:20.960 | of the world at an arbitrary degree of detail, right?
00:00:24.880 | And then it would try to understand its own nature.
00:00:27.340 | It's tempting to think that at some point
00:00:29.160 | when we have general intelligence,
00:00:30.380 | we will have competitions where we let the AIs wake up
00:00:33.740 | in different kinds of physical universes
00:00:35.740 | and we measure how many movements of the Rubik's cube
00:00:38.500 | it takes until it's figured out what's going on
00:00:40.500 | in its universe and what it is, its own nature
00:00:43.500 | and its own physics and so on, right?
00:00:45.260 | So what if we exist in the memory of an AI
00:00:48.020 | that is trying to understand its own nature
00:00:49.940 | and remembers its own genesis and remembers Lex and Joscha
00:00:53.340 | sitting in a hotel room, sparking off some of the ideas
00:00:56.620 | that led to the development of general intelligence?
00:00:59.400 | - So we're a kind of simulation that's running
00:01:01.480 | in an AI system that's trying to understand itself.
00:01:03.960 | - Maybe.
00:01:04.800 | It's not that I believe that,
00:01:07.040 | but I think it's a beautiful idea.
00:01:08.680 | (laughing)
00:01:10.920 | - I mean, yeah, you kind of return to this idea
00:01:15.800 | with the Turing test of intelligence being
00:01:18.600 | the process of asking
00:01:24.040 | and answering what intelligence is.
00:01:26.060 | (inhaling)
00:01:28.000 | I mean, why do you think there is an answer?
00:01:33.000 | Why is there such a search for an answer?
00:01:37.160 | Does there have to be an answer?
00:01:40.320 | You just said an AI system that's trying to understand
00:01:43.520 | the why of, you know, understanding itself.
00:01:46.520 | Is that a fundamental process of greater
00:01:51.640 | and greater complexity, greater and greater intelligence,
00:01:54.400 | the continuous trying to understand itself?
00:01:57.480 | - No, I think you will find that most people
00:01:59.940 | don't care about that because they're well-adjusted
00:02:02.420 | enough to not care.
00:02:04.200 | And the reason why people like you and me care about it
00:02:06.940 | probably has to do with the need to understand ourselves.
00:02:09.900 | It's because we are in fundamental disagreement
00:02:12.140 | with the universe that we wake up in.
00:02:14.620 | - What's the disagreement? - I look down at myself
00:02:15.700 | and I see, oh my God, I'm caught in a monkey.
00:02:17.540 | What's that?
00:02:18.380 | - Right, that's the feeling, right?
00:02:20.140 | - Some people are unhappy with the government
00:02:21.420 | and I'm unhappy with the entire universe
00:02:23.340 | that I find myself in.
00:02:24.560 | - Oh, so you don't think that's a fundamental aspect
00:02:28.760 | of human nature that some people are just suppressing?
00:02:31.660 | That they wake up shocked that they're
00:02:34.220 | in the body of a monkey?
00:02:35.480 | - No, there is a clear adaptive value
00:02:37.300 | to not be confused by that and by--
00:02:41.240 | - Well, no, that's not what I asked.
00:02:43.100 | So yeah, if there's clear adaptive value,
00:02:47.020 | then there's clear adaptive value to,
00:02:50.380 | while fundamentally your brain is confused by that,
00:02:53.080 | creating an illusion, another layer of the narrative
00:02:56.300 | that tries to suppress that
00:03:01.020 | and instead says that what's going on with the government
00:03:04.020 | right now is the most important thing
00:03:05.380 | and what's going on with my football team
00:03:06.840 | is the most important thing.
00:03:08.420 | But it seems to me, like for me,
00:03:13.100 | it was a really interesting moment reading
00:03:14.980 | Ernest Becker's Denial of Death,
00:03:19.600 | this kind of idea that
00:03:24.000 | the fundamental thing from which most of our human mind
00:03:29.000 | springs is this fear of mortality,
00:03:33.080 | of being cognizant of your mortality
00:03:34.760 | and the fear of that mortality.
00:03:37.200 | And then you construct illusions on top of that.
00:03:39.600 | I guess, just to push on it,
00:03:44.360 | you really don't think it's possible
00:03:46.980 | that this worry of the big existential questions
00:03:51.360 | is actually fundamental, as the existentialists thought,
00:03:55.120 | to our existence?
00:03:56.600 | - No, I think that the fear of death only plays a role
00:03:59.320 | as long as you don't see the big picture.
00:04:01.960 | The thing is that minds are software states, right?
00:04:05.160 | Software doesn't have identity.
00:04:07.280 | Software in some sense is a physical law.
00:04:09.320 | - But, hold on a second, but it, yeah.
00:04:11.400 | - Right, so--
00:04:12.240 | - But it feels like there's an identity.
00:04:14.080 | I thought that, for this particular piece of software
00:04:17.180 | and the narrative it tells,
00:04:18.620 | assigning an identity
00:04:20.420 | is a fundamental property of it.
00:04:21.260 | - The maintenance of the identity is not terminal.
00:04:23.820 | It's instrumental to something else.
00:04:25.820 | You maintain your identity so you can serve your meaning,
00:04:28.980 | so you can do the things that you're supposed to do
00:04:30.940 | before you die.
00:04:32.340 | And I suspect that for most people,
00:04:33.940 | the fear of death is the fear of dying
00:04:35.780 | before they're done with the things
00:04:37.100 | that they feel they have to do
00:04:38.180 | even though they cannot quite put their finger on it,
00:04:40.020 | what that is.
00:04:42.600 | - Right, but in the software world,
00:04:46.700 | to return to the question,
00:04:48.780 | then what happens after we die?
00:04:50.900 | (laughs)
00:04:53.620 | Because--
00:04:54.460 | - Why would you care?
00:04:55.280 | You will no longer be there.
00:04:56.120 | The point of dying is that you are gone.
00:04:58.420 | - Well, maybe I'm not.
00:04:59.620 | This is what, you know,
00:05:01.180 | it seems like there's so much
00:05:04.940 | in the idea that
00:05:08.660 | the mind is just a simulation
00:05:10.460 | that's constructing a narrative around some particular
00:05:14.020 | aspects of the quantum mechanical wave function world
00:05:17.820 | that we can't quite get direct access to.
00:05:21.560 | Then the idea of mortality seems to be fuzzy as well.
00:05:27.420 | Maybe there's not a clear end.
00:05:29.060 | - The fuzzy idea is the one of continuous existence.
00:05:32.340 | We don't have continuous existence.
00:05:34.400 | - How do you know that?
00:05:35.240 | - Because it's not computable.
00:05:37.640 | - 'Cause you're saying it's gonna be directly--
00:05:39.700 | - There is no continuous process.
00:05:41.060 | The only thing that binds you together
00:05:42.580 | with the Lex Fridman from yesterday
00:05:44.020 | is the illusion that you have memories about him.
00:05:46.660 | So if you want to upload, it's very easy.
00:05:48.420 | You make a machine that thinks it's you.
00:05:50.540 | Because it's the same thing that you are.
00:05:51.820 | You are a machine that thinks it's you.
00:05:53.380 | - But that's immortality.
00:05:56.220 | - Yeah, but it's just a belief.
00:05:57.320 | You can create this belief very easily
00:05:59.020 | once you realize that the question
00:06:01.100 | whether you are immortal or not
00:06:03.100 | depends entirely on your beliefs and your own continuity.
00:06:06.840 | - But then you can be immortal
00:06:09.280 | by the continuity of the belief.
00:06:12.340 | - You cannot be immortal,
00:06:13.320 | but you can stop being afraid of your mortality
00:06:16.100 | because you realize you were never
00:06:17.980 | continuously existing in the first place.
00:06:20.020 | - Well, I don't know if I'd be more terrified
00:06:23.020 | or less terrified by that.
00:06:24.100 | It seems like the fact that I existed--
00:06:27.460 | - Also, you don't know this state
00:06:28.580 | in which you don't have a self.
00:06:30.160 | You can turn off yourself, you know?
00:06:32.700 | - I can't turn off myself. - You can turn it off.
00:06:34.660 | You can turn it off.
00:06:35.500 | - I can.
00:06:36.320 | - Yes, and you can basically meditate yourself
00:06:38.460 | into a state where you are still conscious,
00:06:40.580 | where things are still happening,
00:06:41.860 | where you know everything that you knew before,
00:06:44.260 | but you're no longer identified with changing anything.
00:06:47.420 | And this means that your self, in a way, dissolves.
00:06:51.060 | There is no longer this person.
00:06:52.420 | You know that this person construct exists in other states,
00:06:55.520 | and it runs on this brain of Lex Fridman,
00:06:58.260 | but it's not a real thing.
00:07:00.780 | It's a construct.
00:07:01.620 | It's an idea, and you can change that idea.
00:07:04.420 | And if you let go of this idea,
00:07:06.900 | if you don't think that you are special,
00:07:09.020 | you realize it's just one of many people,
00:07:10.860 | and it's not your favorite person even, right?
00:07:12.700 | It's just one of many.
00:07:14.420 | And it's the one that you are doomed to control
00:07:16.660 | for the most part,
00:07:17.500 | and that is basically informing the actions of this organism
00:07:21.460 | as a control model.
00:07:22.620 | This is all there is.
00:07:24.060 | And you are somehow afraid
00:07:25.620 | that this control model gets interrupted
00:07:28.280 | or loses the identity of continuity.
00:07:31.540 | - Yeah, so I'm attached.
00:07:32.860 | I mean, yeah, it's a very popular,
00:07:35.460 | a somehow compelling notion that,
00:07:39.300 | like, there's no need to be attached
00:07:41.100 | to this idea of an identity.
00:07:43.160 | But that in itself could be an illusion that you construct.
00:07:49.980 | So the process of meditation,
00:07:51.340 | while popularly thought of as getting
00:07:53.820 | under the concept of identity,
00:07:55.740 | could be just putting a cloak over it,
00:07:58.320 | just telling it to be quiet for the moment.
00:08:04.900 | - I think that meditation is eventually
00:08:06.300 | just a bunch of techniques that let you control attention.
00:08:09.520 | And when you can control attention,
00:08:11.060 | you can get access to your own source code,
00:08:13.820 | hopefully not before you understand what you're doing.
00:08:16.620 | And then you can change the way it works,
00:08:18.340 | temporarily or permanently.
00:08:19.820 | - So yeah, meditation's to get a glimpse
00:08:23.100 | at the source code, get under it,
00:08:25.060 | so basically control or turn off the attention.
00:08:26.740 | - The entire thing is that you learn to control attention.
00:08:28.700 | So everything else is downstream from controlling attention.
00:08:31.900 | - And control the attention
00:08:33.100 | that's looking at the attention.
00:08:35.420 | - Normally we only get attention
00:08:36.860 | in the parts of our mind that create heat,
00:08:38.900 | where you have a mismatch between the model
00:08:40.660 | and the results that are happening.
00:08:42.820 | And so most people are not self-aware
00:08:44.820 | because their control is too good.
00:08:47.500 | If everything works out roughly the way you want,
00:08:49.460 | and the only things that don't work out
00:08:51.260 | is whether your football team wins,
00:08:53.140 | then you will mostly have models about these domains.
00:08:55.940 | And it's only when, for instance,
00:08:57.820 | your fundamental relationships
00:08:59.660 | to the world around you don't work,
00:09:01.040 | because the ideology of your country is insane,
00:09:04.020 | and the other kids are not nerds,
00:09:05.780 | and don't understand why you want
00:09:08.060 | to understand physics,
00:09:10.820 | and you don't understand why somebody
00:09:12.500 | would not want to understand physics.
00:09:17.740 | (gentle music)