Richard Dawkins: The Programmer of the Simulation Came About Through Evolution | AI Podcast Clips


00:00:00.000 | - There's still this desire to get answers
00:00:03.520 | to the why question -- that if the world is a simulation,
00:00:07.600 | if we're living in a simulation,
00:00:08.880 | then there's a programmer-like creature
00:00:12.040 | that we can ask questions of.
00:00:13.280 | - Okay, well, let's pursue the idea
00:00:15.360 | that we're living in a simulation,
00:00:16.760 | which is not totally ridiculous, by the way.
00:00:19.180 | - There we go.
00:00:21.120 | - Then you still need to explain the programmer.
00:00:27.520 | The programmer had to come into existence
00:00:29.580 | by some process. Even if we're in a simulation,
00:00:32.940 | the programmer must have evolved.
00:00:34.620 | Or if he's in a sort of--
00:00:37.840 | - Or she.
00:00:38.680 | - Or she.
00:00:39.500 | If she's in a meta-simulation,
00:00:41.720 | then the meta-meta-programmer must have evolved
00:00:45.040 | by a gradual process.
00:00:46.600 | You can't escape that.
00:00:48.320 | Fundamentally, you've got to come back
00:00:49.960 | to a gradual incremental process
00:00:53.360 | of explanation to start with.
00:00:55.160 | - There's no shortcuts in this world.
00:00:59.120 | - No, exactly.
00:01:00.480 | - But maybe to linger on that point about the simulation,
00:01:04.080 | I basically bore the heck out of everybody
00:01:08.240 | I talk to by asking this question,
00:01:14.320 | so, first, do you think we live in a simulation?
00:01:17.160 | Second, do you think it's an interesting thought experiment?
00:01:20.360 | - It's certainly an interesting thought experiment.
00:01:22.640 | I first met it in a science fiction novel
00:01:25.440 | by Daniel Galouye called "Counterfeit World,"
00:01:30.440 | in which our heroes are running a gigantic computer
00:01:38.240 | which simulates the world,
00:01:40.400 | and something goes wrong,
00:01:43.280 | and so one of them has to go down
00:01:44.500 | into the simulated world in order to fix it.
00:01:46.720 | And then the denouement,
00:01:50.360 | the climax of the novel, is that they discover
00:01:52.280 | that they themselves are in another simulation
00:01:54.080 | at a higher level.
00:01:55.440 | So I was intrigued by this,
00:01:56.720 | and I love others of Daniel Galouye's
00:02:00.040 | science fiction novels.
00:02:01.800 | Then it was revived seriously by Nick Bostrom.
00:02:06.440 | - Bostrom, I'm talking to him in an hour.
00:02:08.200 | - Okay.
00:02:09.040 | (laughing)
00:02:10.400 | And he goes further:
00:02:12.520 | he doesn't just treat it as a science fiction speculation,
00:02:15.280 | he actually thinks it's positively likely.
00:02:17.920 | - Yes.
00:02:18.760 | - I mean, he thinks it's very likely, actually.
00:02:20.280 | - Well, he makes a probabilistic argument
00:02:22.440 | which you can use to come up
00:02:24.040 | with very interesting conclusions
00:02:25.640 | about the nature of this universe.
00:02:27.440 | - I mean, he thinks that we're in a simulation
00:02:31.720 | done by, so to speak, our descendants of the future,
00:02:34.320 | but it's still a product of evolution.
00:02:37.920 | It's still ultimately going to be a product of evolution,
00:02:40.000 | even though the super intelligent people of the future
00:02:44.800 | have created our world,
00:02:48.040 | and you and I are just a simulation,
00:02:49.640 | and this table is a simulation, and so on.
00:02:52.740 | I don't actually, in my heart of hearts, believe it,
00:02:56.180 | but I like his argument.
00:02:58.580 | - Well, so the interesting thing is,
00:03:00.380 | I agree with you, but the interesting thing to me,
00:03:04.180 | if I were to say, if we're living in a simulation,
00:03:06.640 | that in that simulation, to make it work,
00:03:09.620 | you still have to do everything gradually,
00:03:11.980 | just like you said.
00:03:13.260 | That even though it's programmed,
00:03:14.940 | I don't think there could be miracles.
00:03:16.580 | Otherwise, it's--
00:03:17.420 | - Well, no, I mean, the programmer,
00:03:20.300 | the upper ones, have to have evolved gradually.
00:03:22.900 | However, the simulation they create could be instantaneous.
00:03:26.100 | I mean, it could be switched on,
00:03:27.500 | and we come into the world with fabricated memories.
00:03:31.260 | - True, but what I'm trying to convey is,
00:03:33.260 | you're making the broader statement,
00:03:35.820 | but I'm saying that, from an engineering perspective,
00:03:38.460 | both the programmer has to evolve slowly
00:03:41.960 | and the simulation does too, because it's like--
00:03:44.420 | - Oh, yeah.
00:03:45.260 | - From an engineering perspective--
00:03:46.260 | - Oh, yeah, it takes a long time to write a program.
00:03:48.660 | - No, like, I don't think you can create the universe
00:03:51.820 | in a snap.
00:03:52.980 | I think you have to grow it.
00:03:54.660 | - Okay, well, that's a good point.
00:03:58.820 | That's an arguable point.
00:04:00.100 | By the way, I have thought about using the Nick Bostrom idea
00:04:05.100 | to solve the riddle we were talking about earlier,
00:04:10.500 | of why the human brain can achieve so much.
00:04:15.220 | I thought of this when my then 100-year-old mother
00:04:19.220 | was marveling at what I could do with a smartphone,
00:04:22.460 | and how I could look up anything in the encyclopedia,
00:04:25.860 | or play her music that she liked, and so on.
00:04:27.780 | She said, "It's all in that tiny little phone."
00:04:29.900 | No, it's out there.
00:04:31.180 | It's in the cloud.
00:04:32.660 | And maybe most of what we do is in a cloud.
00:04:35.660 | So maybe if we are a simulation,
00:04:39.300 | then all the power that we think is in our skull,
00:04:43.380 | may actually be like the power
00:04:44.860 | that we think is in the iPhone
00:04:47.260 | but is actually out there.
00:04:49.100 | - It's an interface to something else.
00:04:50.900 | I mean, that's what people suggest,
00:04:52.300 | including Roger Penrose with panpsychism:
00:04:55.860 | that consciousness is somehow a fundamental part of physics,
00:04:58.980 | and that it doesn't have to all reside inside a brain.
00:05:02.300 | - But Roger thinks it does reside in the skull,
00:05:04.820 | whereas I'm suggesting that it doesn't,
00:05:07.140 | that there's a cloud.
00:05:10.940 | - That'd be a fascinating notion.
00:05:14.380 | On a small tangent,
00:05:16.140 | are you familiar with the work of Donald Hoffman?
00:05:20.780 | Maybe I'm not saying his name correctly,
00:05:23.380 | but, forgetting the name,
00:05:26.620 | the idea that there's a difference
00:05:28.260 | between reality and perception.
00:05:30.500 | So, like, we biological organisms perceive the world
00:05:35.500 | in the way natural selection shaped us to,
00:05:37.820 | in order to be able to survive and so on,
00:05:39.460 | but that doesn't mean that our perception
00:05:42.380 | actually reflects the fundamental reality,
00:05:45.580 | the physical reality underneath.
00:05:47.060 | - Well, I do believe there is a fundamental reality,
00:05:49.100 | and although our perception reflects it,
00:05:55.100 | I do think that our perception is constructive
00:06:01.020 | in the sense that we construct in our minds
00:06:05.100 | a model of what we're seeing.
00:06:07.380 | And so this is really the view of people
00:06:10.460 | who work on visual illusions like Richard Gregory,
00:06:13.740 | who point out that something like a Necker cube --
00:06:16.820 | a two-dimensional picture of a cube on a sheet of paper,
00:06:19.380 | which we nevertheless see as a three-dimensional cube --
00:06:26.060 | flips from one orientation to another
00:06:28.380 | at regular intervals.
00:06:31.460 | What's going on is that the brain is constructing a cube,
00:06:35.700 | but the sense data are compatible with two alternative cubes.
00:06:40.140 | And so rather than stick with one of them,
00:06:41.700 | it alternates between them.
00:06:43.500 | I think that's just a model for what we do all the time
00:06:47.700 | when we see a table, when we see a person,
00:06:49.700 | when we see anything,
00:06:51.180 | we're using the sense data to construct
00:06:55.260 | or make use of a perhaps previously constructed model.
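
As a toy illustration of that constructive-perception idea (my own sketch in Python, not anything described in the conversation; all names here are made up): the Necker cube's sense data fit two internal models equally well, so a perceiver that must commit to one keeps flipping between them.

import random

MODELS = ("cube viewed from above", "cube viewed from below")

def compatible_models(sense_data: str) -> tuple:
    # Both 3-D interpretations project to the same 2-D line drawing,
    # so the sense data cannot rule either model out.
    return MODELS

def perceive(sense_data: str, current: str) -> str:
    candidates = compatible_models(sense_data)
    if len(candidates) > 1 and random.random() < 0.3:
        # Nothing in the data favors one model, so the percept
        # occasionally flips to the other interpretation.
        return next(m for m in candidates if m != current)
    return current  # otherwise stick with the constructed model

percept = MODELS[0]
for step in range(10):
    percept = perceive("2-D drawing of twelve edges", percept)
    print(step, percept)
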
00:06:58.820 | I notice this when I meet somebody
00:07:03.820 | who actually is, say, a friend of mine,
00:07:07.180 | but until I kind of realize that it is him,
00:07:11.500 | he looks different.
00:07:12.700 | And then when I finally clock that it's him,
00:07:15.980 | his features switch like a Necker cube
00:07:18.780 | into the familiar form.
00:07:20.420 | As it were, I've taken his face
00:07:22.620 | out of the filing cabinet inside and grafted it on --
00:07:27.100 | or rather, used the sense data to invoke it.
00:07:31.620 | - Yeah, we do some kind of miraculous compression
00:07:34.300 | on this whole thing to be able to filter out
00:07:36.340 | most of the sense data and make sense of it.
00:07:39.100 | That's just the magical thing that we do.
00:07:41.620 | (upbeat music)