Nick Bostrom: Experience Machine | AI Podcast Clips
00:00:00.000 |
I mean, in philosophy, there's this experience machine thought experiment. 00:00:07.360 |
So Robert Nozick had this thought experiment where you imagine some crazy, super-duper 00:00:14.680 |
neuroscientists of the future have created a machine that could give you any experience you want. 00:00:21.720 |
And for the rest of your life, you could kind of pre-program it in different ways. 00:00:32.360 |
Whatever you dream of, you could have: you want to be a great artist, a great lover, and so on. 00:00:40.740 |
If you step into the experience machine, your experiences would be constantly happy. 00:00:47.820 |
But you would kind of disconnect from the rest of reality and you would float there in a tank. 00:00:53.180 |
And so Nozick thought that most people would choose not to enter the experience machine. 00:01:00.360 |
I mean, many might want to go there for a holiday, but they wouldn't want to sort of spend the rest of their lives there. 00:01:06.000 |
And so he thought that was an argument against certain views of value, according to which 00:01:12.840 |
what we value is a function of what we experience. 00:01:16.200 |
Because in the experience machine, you could have any experience you want, and yet many 00:01:20.000 |
people would think that there would not be much value in that. 00:01:23.440 |
So therefore, what we value depends on things other than what we experience. 00:01:33.360 |
What about the fact that maybe what we value is the ups and downs of life? 00:01:36.960 |
You could have ups and downs in the experience machine, right? 00:01:40.520 |
But what can't you have in the experience machine? 00:01:42.680 |
Well, I mean, that then becomes an interesting question to explore. 00:01:46.920 |
But for example, real connection with other people, if the experience machine is a solo 00:01:51.880 |
machine where it's only you, that's something you wouldn't have there. 00:01:56.320 |
You would have this subjective experience of what would be, like, fake people. 00:02:00.880 |
But if you gave somebody flowers, there wouldn't be anybody there who actually got happy. 00:02:05.920 |
It would just be a little simulation of somebody smiling. 00:02:09.920 |
But the simulation would not be the kind of simulation I'm talking about in the simulation 00:02:12.920 |
argument where the simulated creature is conscious. 00:02:15.520 |
It would just be a kind of smiley face that would look perfectly real to you. 00:02:20.040 |
So we're now drawing a distinction between appearing to be perfectly real and actually being real. 00:02:29.600 |
I mean, like, a big impact on history, maybe that's also something you won't have if you go into the experience machine. 00:02:37.120 |
So some people might actually feel the life I want to have for me is one where I have 00:02:41.360 |
a big positive impact on how history unfolds. 00:02:47.080 |
So you could kind of explore these different possible explanations for why it is you wouldn't 00:02:55.000 |
want to go into the experience machine if that's what you feel. 00:02:59.480 |
And one interesting observation regarding this Nozick thought experiment and the conclusions 00:03:04.800 |
he wanted to draw from it is how much of it is a kind of status quo effect. 00:03:10.280 |
So a lot of people might not want to give up their current reality to plug into this dream machine. 00:03:17.680 |
But if they instead were told, well, what you've experienced up to this point was a 00:03:24.760 |
dream. Now, do you want to disconnect from this and enter the real world, when you have no idea what that real world is even like? 00:03:34.600 |
Or maybe you could say, well, you're actually a farmer in Peru, growing peanuts, and you 00:03:41.440 |
could live for the rest of your life in this way. 00:03:45.280 |
Or would you want to continue your dream life as Lex Fridman, going around the world, making podcasts? 00:03:55.560 |
So if the status quo was that they were actually in the experience machine, I think a lot of 00:04:02.840 |
people might then prefer to live the life that they are familiar with rather than sort of switch to the unfamiliar real one. 00:04:12.640 |
So it might not be so much the reality itself that we are after, but it's more that we are 00:04:16.760 |
maybe involved in certain projects and relationships. 00:04:20.520 |
And we have a self-identity, and these are things that our values are kind of connected with. 00:04:27.380 |
And then whether it's inside a tank or outside a tank in Peru, or whether inside a computer 00:04:34.240 |
or outside a computer, that's kind of less important to what we ultimately care about. 00:04:41.840 |
But still, just to linger on it, it is interesting. 00:04:46.180 |
Maybe people are different, but I find myself quite willing to take the leap to the 00:04:51.060 |
farmer in Peru, especially as virtual reality systems become more realistic. 00:04:58.260 |
I find that possibility appealing, and I think more people would take that leap. 00:05:02.220 |
But in this thought experiment, just to make sure we are on the same page, so in this case, 00:05:05.940 |
the farmer in Peru would not be a virtual reality. 00:05:09.160 |
That would be the real one, your life, like, before this whole experience machine started. 00:05:15.860 |
Well, I kind of assumed that from the description, if you're being very specific. But that kind 00:05:21.060 |
of idea just, like, washes away the concept of what's real. 00:05:26.780 |
I'm still a little hesitant about your kind of distinction between real and illusion, 00:05:34.820 |
because when you can have an illusion that feels, I mean, that looks real, I don't know 00:05:42.580 |
how you can definitively say something is real or not. What's a good way to prove that something is real? 00:05:49.220 |
Well, so I guess in this case, it's more a stipulation. 00:05:52.500 |
In one case, you're floating in a tank with these wires that the super-duper neuroscientists 00:05:58.860 |
have plugged into your head, giving you, like, Lex Fridman experiences. 00:06:03.900 |
In the other, you're actually tilling the soil in Peru, growing peanuts, and then those 00:06:08.580 |
peanuts are being eaten by other people all around the world who buy the exports. 00:06:13.300 |
So there are two different possible situations in one and the same real world that you could be in. 00:06:21.780 |
Just to be clear, when you're in a vat with wires and the neuroscientists, you can still have the experience of farming in Peru? 00:06:30.180 |
No, well, if you wanted to, you could have the experience of farming in Peru, but there wouldn't actually be any peanuts grown in the world. 00:06:43.060 |
So a peanut could be grown, and you could feed things with that peanut. 00:06:49.980 |
And why can't all of that be done in a simulation? 00:06:53.060 |
I hope, first of all, that they actually have peanut farms in Peru. 00:06:56.660 |
I guess we'll get a lot of comments otherwise from angry listeners: 00:07:01.460 |
"I was with you up to the point when you started talking about Peru. 00:07:04.300 |
You should know you can't grow peanuts in that climate." 00:07:09.100 |
No, I mean, in the simulation, I think there's a sense, the important sense, in which it would be real. 00:07:18.700 |
Nevertheless, there is a distinction between inside a simulation and outside a simulation, 00:07:25.220 |
or in the case of Nozick's thought experiment, whether you're in the vat or outside the vat. 00:07:31.220 |
And some of those differences may or may not be important. 00:07:33.780 |
I mean, that comes down to your values and preferences. 00:07:36.980 |
So if the experience machine only gives you the experience of growing peanuts, but you're not actually growing anything. 00:07:47.300 |
No, but there's the other option: within the experience machine, others can plug in too. 00:07:51.860 |
Well, there are versions of the experience machine. 00:07:55.300 |
So, in fact, you might want to distinguish different thought experiments, different versions of the experience machine. 00:08:00.140 |
So in the original thought experiment, maybe it's only you, right? 00:08:04.300 |
And you think, I wouldn't want to go in there. 00:08:05.980 |
Well, that tells you something interesting about what you value and what you care about. 00:08:09.660 |
Then you could say, well, what if you add the fact that there would be other people in there that you could interact with? 00:08:14.900 |
Well, it starts to make it more attractive, right? 00:08:18.360 |
Then you could add in, well, what if you could also have important long-term effects on human 00:08:22.340 |
history and the world, and you could actually do something useful, even though you were 00:08:26.300 |
in there? That makes it maybe even more attractive. 00:08:28.980 |
Like you could actually have a life that had a purpose and consequences. 00:08:34.460 |
So as you sort of add more into it, it becomes more similar to the baseline reality that you were in. 00:08:44.380 |
Yeah, but I just think inside the experience machine, and without taking those steps you 00:08:49.340 |
just mentioned, you still have an impact on the long-term history of the creatures that live 00:08:57.180 |
inside that, of the quote-unquote fake creatures that live inside that experience machine. 00:09:04.820 |
And, like, at a certain point, if there's a person waiting for you inside that experience 00:09:11.860 |
machine, maybe your newly found wife, and she dies, she has fears, she has hopes, and 00:09:22.060 |
when you unplug yourself and plug back in, she's still there, going on about her life. 00:09:28.140 |
Well, in that case, yeah, she starts to have more of an independent existence. 00:09:32.940 |
But it depends, I think, on how she's implemented in the experience machine. 00:09:38.140 |
Take one limit case where all she is is a static picture on the wall, a photograph. 00:09:43.940 |
So you think, well, I can look at her, but that's it. 00:09:49.700 |
But then you think, well, it doesn't really matter much what happens to that, any more than what happens to any other photograph. 00:09:54.460 |
If you tear it up, it means you can't see it anymore, but you haven't harmed the person whose picture it is. 00:10:03.780 |
But if she's actually implemented, say, at a neural level of detail, so that she's a 00:10:09.220 |
fully realized digital mind with the same behavioral repertoire as you have, then very 00:10:17.060 |
possibly she would be a conscious person like you are. 00:10:20.700 |
And then what you do in this experience machine would have real consequences for how this other person feels. 00:10:29.180 |
So you have to specify which of these experience machines you're talking about. 00:10:32.580 |
I think it's not entirely obvious that it would be possible to have an experience machine 00:10:39.380 |
that gave you a normal set of human experiences, which includes experiences of interacting with 00:10:45.700 |
other people, without that also generating consciousnesses corresponding to those other people. 00:10:53.020 |
That is, if you create another entity that you perceive and interact with, one that to you seems completely real. 00:11:00.780 |
Not just when you say hello, they say hello back, but you have a rich interaction, many conversations over time. 00:11:06.340 |
It might be that the only possible way of implementing that would be one that also, as 00:11:12.420 |
a side effect, instantiated this other person in enough detail that you would have a second consciousness there. 00:11:19.180 |
I think that's to some extent an open question. 00:11:23.260 |
So you don't think it's possible to fake consciousness and fake intelligence? 00:11:30.460 |
If you have a very limited interaction with somebody, you could certainly fake that. 00:11:35.780 |
If all you have to go on is somebody said hello to you, that's not enough for you to 00:11:39.780 |
tell whether there was a real person there or a pre-recorded message or a very superficial simulation. 00:11:55.180 |
But if you have a richer set of interactions where you're allowed to ask open-ended questions 00:12:00.620 |
and probe from different angles, where you couldn't give canned answers to all of the possible 00:12:06.420 |
ways that you could probe it, then it starts to become more plausible that the only way 00:12:11.740 |
to realize this thing in such a way that you would get the right answer from any which 00:12:16.620 |
angle you probed it would be a way of instantiating it where you also instantiated a conscious mind.