AI Simulating Humans to Understand Itself (Joscha Bach) | AI Podcast Clips
00:00:04.040 |
what is the most beautiful to you, silly ape, 00:00:10.360 |
in the development of artificial intelligence 00:00:14.520 |
that you've come across? 00:00:17.040 |
- If you build an AI, it probably can make models 00:00:20.960 |
at an arbitrary degree of detail, right, of the world. 00:00:24.880 |
And then it would try to understand its own nature. 00:00:30.380 |
we have competitions where we will let the AIs wake up 00:00:35.740 |
and we measure how many movements of the Rubik's cube 00:00:38.500 |
it takes until it's figured out what's going on 00:00:40.500 |
in its universe and what it is in its own nature 00:00:49.940 |
and remembers its own genesis and remembers Lex and Joscha 00:00:53.340 |
sitting in a hotel room, sparking some of the ideas 00:00:56.620 |
off that led to the development of general intelligence? 00:00:59.400 |
- So we're a kind of simulation that's running 00:01:01.480 |
in an AI system that's trying to understand itself. 00:01:10.920 |
- I mean, yeah, you kind of return to this idea 00:01:28.000 |
I mean, why? Do you think there is an answer? 00:01:40.320 |
You just said an AI system that's trying to understand 00:01:43.520 |
the why of, you know, itself. 00:01:51.640 |
and greater complexity, greater and greater intelligence? 00:01:54.400 |
Is it the continuous attempt to understand itself? 00:01:59.940 |
don't care about that because they're well-adjusted 00:02:04.200 |
And the reason why people like you and me care about it 00:02:06.940 |
probably has to do with the need to understand ourselves. 00:02:09.900 |
It's because we are in fundamental disagreement 00:02:14.620 |
- What's the disagreement? - They look down on me 00:02:15.700 |
and they see, oh my God, I'm caught in a monkey. 00:02:20.140 |
- Some people are unhappy with the government 00:02:24.560 |
- Oh, so you don't think that's a fundamental aspect 00:02:28.760 |
of human nature that some people are just suppressing? 00:02:50.380 |
while fundamentally your brain is confused by that, 00:02:53.080 |
by creating an illusion, another layer of the narrative 00:03:01.020 |
and instead say that what's going on with the government 00:03:24.000 |
the fundamental thing from which most of our human mind 00:03:37.200 |
And then you construct illusions on top of that. 00:03:46.980 |
that this worry about the big existential questions 00:03:51.360 |
is actually fundamental, as the existentialists thought 00:03:56.600 |
- No, I think that the fear of death only plays a role 00:04:01.960 |
The thing is that minds are software states, right? 00:04:14.080 |
I thought that was for this particular piece of software 00:04:21.260 |
- The maintenance of the identity is not a terminal goal. 00:04:25.820 |
You maintain your identity so you can serve your meaning, 00:04:28.980 |
so you can do the things that you're supposed to do 00:04:38.180 |
even though they cannot quite put their finger on it, 00:05:10.460 |
that's constructing a narrative around some particular 00:05:14.020 |
aspects of the quantum mechanical wave function world 00:05:21.560 |
Then the idea of mortality seems to be fuzzy as well. 00:05:29.060 |
- The fuzzy idea is the one of continuous existence. 00:05:37.640 |
- 'Cause you're saying it's gonna be directly-- 00:05:44.020 |
is the illusion that you have memories about him. 00:06:03.100 |
depends entirely on your beliefs about your own continuity. 00:06:13.320 |
but you can stop being afraid of your mortality 00:06:20.020 |
- Well, I don't know if I'd be more terrified 00:06:32.700 |
- I can't turn off myself. - You can turn it off. 00:06:36.320 |
- Yes, and you can basically meditate yourself into a state 00:06:41.860 |
where you know everything that you knew before, 00:06:44.260 |
but you're no longer identified with changing anything. 00:06:47.420 |
And this means that your self, in a way, dissolves. 00:06:52.420 |
You know that this person construct exists in other states, 00:07:10.860 |
and it's not your favorite person even, right? 00:07:14.420 |
And it's the one that you are doomed to control 00:07:17.500 |
and that is basically informing the actions of this organism 00:07:35.460 |
it's a somehow compelling notion that being attached, 00:07:43.160 |
But that in itself could be an illusion that you construct. 00:08:06.300 |
just a bunch of techniques that let you control attention. 00:08:13.820 |
hopefully not before you understand what you're doing. 00:08:25.060 |
so basically control or turn off the attention. 00:08:26.740 |
- The entire thing is that you learn to control attention. 00:08:28.700 |
So everything else is downstream from controlling attention. 00:08:47.500 |
If everything works out roughly the way you want, 00:08:53.140 |
then you will mostly have models about these domains. 00:09:01.040 |
because the ideology of your country is insane, 00:09:05.780 |
and don't understand why you understand physics, 00:09:08.060 |
and you don't understand why you want to understand physics,