
AI Simulating Humans to Understand Itself (Joscha Bach) | AI Podcast Clips


Transcript

- Let me ask a romanticized question: what is, to you, silly ape, the most beautiful or surprising idea in the development of artificial intelligence, whether in your own life or in the history of artificial intelligence, that you've come across? - If you build an AI, it probably can make models of the world at an arbitrary degree of detail, right?

And then it would try to understand its own nature. It's tempting to think that at some point when we have general intelligence, we have competitions where we will let the AIs wake up in different kinds of physical universes and we measure how many movements of the Rubik's cube it takes until it's figured out what's going on in its universe and what it is in its own nature and its own physics and so on, right?

So what if we exist in the memory of an AI that is trying to understand its own nature and remembers its own genesis and remembers Lex and Joscha sitting in a hotel room, sparking some of the ideas off that led to the development of general intelligence? - So we're a kind of simulation that's running in an AI system that's trying to understand itself.

- Maybe. It's not that I believe that, but I think it's a beautiful idea. (laughing) - I mean, yeah, you kind of return to this idea with the Turing test, of intelligence being the process of asking and answering what is intelligence. (inhaling) I mean, do you think there is an answer?

Why is there such a search for an answer? So does there have to be an answer? You just said an AI system that's trying to understand the why of, you know, understand itself. Is that a fundamental process of greater and greater complexity, greater and greater intelligence, this continuous trying to understand itself?

- No, I think you will find that most people don't care about that because they're well-adjusted enough to not care. And the reason why people like you and me care about it probably has to do with the need to understand ourselves. It's because we are in fundamental disagreement with the universe that we wake up in.

- What's the disagreement? - I look down on myself and I see, oh my God, I'm caught in a monkey. What's that? - Right, that's the feeling, right? - Some people are unhappy with the government and I'm unhappy with the entire universe that I find myself in. - Oh, so you don't think that's a fundamental aspect of human nature that some people are just suppressing?

That they wake up shocked that they're in the body of a monkey? - No, there is a clear adaptive value to not be confused by that and by-- - Well, no, that's not what I asked. So yeah, if there's clear adaptive value, then there's clear adaptive value in creating an illusion, another layer of narrative that, while your brain is fundamentally confused by that, tries to suppress the confusion and instead says that what's going on with the government right now is the most important thing, and what's going on with my football team is the most important thing.

But for me, it was a really interesting moment reading Ernest Becker's The Denial of Death, this idea that the fundamental thing from which most of our human mind springs is this fear of mortality, being cognizant of your mortality and the fear of that mortality.

And then you construct illusions on top of that. I guess, just to push on it, you really don't think it's possible that this worry about the big existential questions is actually fundamental, as the existentialists thought, to our existence. - No, I think that the fear of death only plays a role as long as you don't see the big picture.

The thing is that minds are software states, right? Software doesn't have identity. Software in some sense is a physical law. - But, hold on a second, but it, yeah. - Right, so-- - But it feels like there's an identity. I thought that for this particular piece of software and the narrative it tells, assigning an identity is a fundamental property of it.

- The maintenance of the identity is not terminal. It's instrumental to something else. You maintain your identity so you can serve your meaning, so you can do the things that you're supposed to do before you die. And I suspect that for most people, the fear of death is the fear of dying before they're done with the things that they feel they have to do even though they cannot quite put their finger on it, what that is.

- Right, but in the software world, to return to the question, then what happens after we die? (laughs) Because-- - Why would you care? You will no longer be there. The point of dying is that you are gone. - Well, maybe I'm not. It seems like, you know, there's so much in the idea that the mind is just a simulation that's constructing a narrative around some particular aspects of the quantum mechanical wave function world that we can't quite get direct access to.

Then the idea of mortality seems to be fuzzy as well. Maybe there's not a clear end. - The fuzzy idea is the one of continuous existence. We don't have continuous existence. - How do you know that? - Because it's not computable. - 'Cause you're saying it's gonna be directly-- - There is no continuous process.

The only thing that binds you together with the Lex Fridman from yesterday is the illusion that you have memories about him. So if you want to upload, it's very easy. You make a machine that thinks it's you. Because it's the same thing that you are. You are a machine that thinks it's you.

- But that's immortality. - Yeah, but it's just a belief. You can create this belief very easily once you realize that the question whether you are immortal or not depends entirely on your beliefs and your own continuity. - But then you can be immortal by the continuity of the belief.

- You cannot be immortal, but you can stop being afraid of your mortality because you realize you were never continuously existing in the first place. - Well, I don't know if I'd be more terrified or less terrified by that. It seems like the fact that I existed-- - Also, you don't know this state in which you don't have a self.

You can turn off yourself, you know? - I can't turn off myself. - You can turn it off. - I can? - Yes, and you can basically meditate yourself into a state where you are still conscious, where things are still happening, where you know everything that you knew before, but you're no longer identified with changing anything.

And this means that your self, in a way, dissolves. There is no longer this person. You know that this person construct exists in other states, and it runs on this brain of Lex Fridman, but it's not a real thing. It's a construct. It's an idea, and you can change that idea.

And if you let go of this idea, if you don't think that you are special, you realize it's just one of many people, and it's not your favorite person even, right? It's just one of many. And it's the one that you are doomed to control for the most part, and that is basically informing the actions of this organism as a control model.

This is all there is. And you are somehow afraid that this control model gets interrupted or loses the continuity of its identity. - Yeah, so I'm attached. I mean, it's a very popular, somehow compelling notion that there's no need to be attached to this idea of an identity.

But that in itself could be an illusion that you construct. So the process of meditation, while popularly thought of as getting under the concept of identity, could be just putting a cloak over it, just telling it to be quiet for the moment. - I think that meditation is eventually just a bunch of techniques that let you control attention.

And when you can control attention, you can get access to your own source code, hopefully not before you understand what you're doing. And then you can change the way it works, temporarily or permanently. - So yeah, meditation's to get a glimpse at the source code, get under, so basically control or turn off the attention.

- The entire thing is that you learn to control attention. So everything else is downstream from controlling attention. - And control the attention that's looking at the attention. - Normally we only get attention in the parts of our mind that create heat, where you have a mismatch between the model and the results that are happening.

And so most people are not self-aware because their control is too good. If everything works out roughly the way you want, and the only thing that doesn't work out is whether your football team wins, then you will mostly have models about these domains. And it's only when, for instance, your fundamental relationships to the world around you don't work, because the ideology of your country is insane, and the other kids are not nerds and don't understand why you want to understand physics, and you don't understand why somebody would not want to understand physics--

(audience applauding) (gentle music)