Richard Dawkins: Evolution, Intelligence, Simulation, and Memes | Lex Fridman Podcast #87
Chapters
0:00 Introduction
2:31 Intelligent life in the universe
5:03 Engineering intelligence (are there shortcuts?)
7:06 Is the evolutionary process efficient?
10:39 Human brain and AGI
15:31 Memes
26:37 Does society need religion?
33:10 Conspiracy theories
39:10 Where do morals come from in humans?
46:10 AI began with the ancient wish to forge the gods
49:18 Simulation
56:58 Books that influenced you
1:02:53 Meaning of life
The following is a conversation with Richard Dawkins, 00:00:03.260 |
an evolutionary biologist and author of "The Selfish Gene," 00:00:09.720 |
"The Magic of Reality," and "The Greatest Show on Earth," 00:00:18.500 |
of a lot of fascinating ideas in evolutionary biology 00:00:21.780 |
and science in general, including, funny enough, 00:00:29.940 |
which, in the context of a gene-centered view of evolution, 00:00:41.300 |
and in this way, is one of the most influential thinkers 00:00:50.540 |
For everyone feeling the medical, psychological, 00:01:07.780 |
support it on Patreon, or simply connect with me on Twitter, 00:01:38.380 |
Since Cash App allows you to send and receive money 00:01:43.260 |
security in all digital transactions is very important. 00:01:46.640 |
Let me mention the PCI data security standard 00:01:51.660 |
I'm a big fan of standards for safety and security. 00:01:57.740 |
where a bunch of competitors got together and agreed 00:02:04.540 |
Now we just need to do the same for autonomous vehicles 00:02:07.460 |
and artificial intelligence systems in general. 00:02:10.220 |
So again, if you get Cash App from the App Store, 00:02:15.780 |
you get $10, and Cash App will also donate $10 to FIRST, 00:02:19.540 |
an organization that is helping to advance robotics 00:02:22.220 |
and STEM education for young people around the world. 00:02:25.660 |
And now, here's my conversation with Richard Dawkins. 00:02:33.860 |
- Well, if we accept that there's intelligent life here, 00:02:37.920 |
and we accept that the number of planets in the universe 00:02:40.700 |
is gigantic, I mean, 10 to the 22 stars has been estimated, 00:02:44.740 |
it seems to me highly likely that there is not only life 00:02:47.340 |
in the universe elsewhere, but also intelligent life. 00:02:50.740 |
If you deny that, then you're committed to the view 00:02:57.060 |
I mean, ludicrously, off the charts improbable. 00:03:04.660 |
there are really two steps, the origin of life, 00:03:09.200 |
and then the subsequent evolution to intelligent life, 00:03:20.500 |
It's an interesting question, maybe you're coming onto it, 00:03:22.780 |
how we would recognize intelligence from outer space 00:03:27.820 |
The most likely way we would come across them 00:03:39.020 |
And then we would have to have some means of deciding 00:03:48.900 |
and things like prime numbers would be an obvious thing to, 00:03:57.160 |
I suspect it probably would be obvious, actually. 00:04:05.560 |
it's an open question whether mathematics is the same for us 00:04:14.220 |
if we're governed by the same laws of physics, 00:04:17.260 |
then we should be governed by the same laws of mathematics. 00:04:20.820 |
I suspect that they will have Pythagoras' theorem, et cetera. 00:04:26.420 |
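[Editorial aside, not part of the conversation: the prime-number idea mentioned above can be made concrete. The sketch below is my own toy illustration of why a pulse train counting out primes reads as "obviously" artificial: primes are trivial to verify, but simple natural processes essentially never emit them in order. The function names are invented for this example.]

```python
# Toy sketch (hypothetical, for illustration only): checking whether a
# sequence of pulse counts spells out the first few prime numbers, the
# classic SETI-style marker of an intelligent sender.

def is_prime(n: int) -> bool:
    """Trial-division primality check; fine for the small n used here."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True


def looks_like_primes(pulses: list[int]) -> bool:
    """True if the pulse counts are exactly the first k primes, in order."""
    primes = [n for n in range(2, 200) if is_prime(n)]
    return len(pulses) > 0 and pulses == primes[:len(pulses)]


# A "Contact"-style beacon: 2, 3, 5, 7, 11 pulses in a row.
print(looks_like_primes([2, 3, 5, 7, 11]))   # True
print(looks_like_primes([2, 4, 6, 8, 10]))   # False: an arithmetic ramp,
                                             # which natural periodic
                                             # sources can produce
```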
- Do you think evolution would also be a force 00:04:30.580 |
- I stuck my neck out and said that if we do, 00:04:36.940 |
in the sense that it will work by some kind of 00:04:54.580 |
But I think it would have to be Darwinian, yes. 00:04:59.800 |
- Yes, in the general sense, it would be Darwinian. 00:05:02.500 |
- So let me ask kind of an artificial intelligence 00:05:10.860 |
I guess what could be called intelligent design, 00:05:15.380 |
of a human mind and body by some religious folks, 00:05:21.580 |
So broadly speaking, evolution is, as far as I know, 00:05:25.140 |
again, you can correct me, is the only scientific theory 00:05:27.500 |
we have for the development of intelligent life. 00:05:29.980 |
Like there's no alternative theory as far as I understand. 00:05:36.780 |
- Well, of course, whenever somebody says that, 00:05:51.860 |
but it's almost like Einstein's general relativity 00:06:12.660 |
to do what intelligent design says, you know, 00:06:16.940 |
was done here on Earth, what's your intuition? 00:06:19.780 |
Do you think it's possible to build intelligence, 00:06:27.320 |
or do we need to do something like the evolutionary process? 00:06:35.300 |
I'm committed to the belief that it's ultimately possible, 00:06:39.100 |
because I think there's nothing non-physical in our brains. 00:06:42.320 |
I think our brains work by the laws of physics. 00:06:49.420 |
In practice, though, it might be very difficult. 00:06:52.300 |
And as you suggest, it may be the only way to do it 00:06:55.100 |
is by something like an evolutionary process. 00:07:07.020 |
But in your sense, is the evolutionary process efficient? 00:07:18.280 |
I mean, on the one side, it is deplorably wasteful. 00:07:24.560 |
On the other hand, it does produce magnificent results. 00:07:39.520 |
On the other hand, an engineer would not be proud 00:07:55.840 |
- My favorite example is the recurrent laryngeal nerve. 00:07:59.440 |
This is a nerve, it's one of the cranial nerves, 00:08:04.880 |
that it supplies is the voice box, the larynx. 00:08:15.440 |
and then comes straight back up again to the larynx. 00:08:19.080 |
And I've assisted in the dissection of a giraffe's neck, 00:08:24.080 |
And we watched the, we saw the recurrent laryngeal nerve 00:08:58.800 |
The most direct pathway was behind that artery. 00:09:03.320 |
And then when the mammal, when the tetrapods, 00:09:10.200 |
the marginal cost of changing the embryological design 00:09:14.200 |
to jump that nerve over the artery was too great, 00:09:17.640 |
or rather was, each step of the way was a very small cost, 00:09:22.440 |
but the marginal, but the cost of actually jumping it over 00:09:26.680 |
As the neck lengthened, it was a negligible change 00:09:30.320 |
to just increase the length of the detour a tiny bit, 00:09:33.600 |
each millimeter at a time didn't make any difference. 00:09:35.800 |
And so, but finally, when you get to a giraffe, 00:09:38.240 |
it's a huge detour and no doubt is very inefficient. 00:09:42.840 |
Any engineer would reject that piece of design, 00:09:50.400 |
it's not surprising that we find examples of that sort. 00:09:53.400 |
In a way, what's surprising is there aren't more of them. 00:09:57.200 |
is that the design of living things is so good. 00:09:59.600 |
So natural selection manages to achieve excellent results, 00:10:06.120 |
partly by coming along and cleaning up initial mistakes 00:10:11.120 |
and as it were, making the best of a bad job. 00:10:17.680 |
And it's a mystery from an engineering perspective 00:10:25.680 |
is how many generations have to die for that. 00:10:38.560 |
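[Editorial aside, not from the transcript: the recurrent-laryngeal-nerve story is a textbook local-optimum trap, and it can be sketched numerically. The costs below are invented numbers purely to show the shape of the argument: selection acts greedily, so a detour grows huge because each increment is cheap while the one-shot rewiring is expensive.]

```python
# Toy model (my illustration; all numbers are made up) of the giraffe's
# recurrent laryngeal nerve: greedy, step-by-step selection keeps
# extending a detour because each millimeter costs almost nothing,
# while "jumping the nerve over the artery" is one large rewiring cost.

def greedy_accepts(step_cost: float, jump_cost: float) -> bool:
    """Greedy evolution takes whichever change is cheaper right now."""
    return step_cost < jump_cost


detour_mm = 0
STEP_COST = 0.001   # negligible cost of one more millimeter of nerve
JUMP_COST = 100.0   # large embryological cost of rerouting in one go

for generation in range(1000):      # the neck lengthens 1 mm per step
    if greedy_accepts(STEP_COST, JUMP_COST):
        detour_mm += 1              # so the detour grows, step by step

print(detour_mm)  # 1000 -- a huge detour, built one negligible step at a time
```

The point of the sketch is that no single generation ever faces a choice where rerouting wins, so the globally absurd design is locally rational at every step.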
Elon Musk describes human beings as potentially the, 00:10:47.520 |
or artificial general intelligence is used as the term, 00:10:56.080 |
as potentially the next step in the evolutionary process? 00:10:59.320 |
- Yes, I think that if superhuman intelligence 00:11:16.040 |
It is the dominant trend or one of the dominant trends 00:11:19.240 |
in our fossil history for the last two or three million years. 00:11:32.600 |
The only way that happens is because natural selection 00:11:36.040 |
favors those individuals with the biggest brains 00:11:53.080 |
it would be necessary that the most intelligent, 00:12:00.480 |
to brain size, but let's talk about intelligence. 00:12:06.680 |
that the most intelligent beings have the most, 00:12:21.640 |
and things like that if you have a successful career. 00:12:24.480 |
It may buy you the admiration of your fellow people, 00:12:28.360 |
but it doesn't increase the number of offspring 00:12:31.200 |
that you have, it doesn't increase your genetic legacy 00:12:47.920 |
- And so what do you think, I don't know if you're familiar, 00:12:51.560 |
but there's a general effort of brain-computer interfaces, 00:13:01.640 |
And the long-term dream there is to do exactly that, 00:13:05.080 |
which is expand, I guess expand the size of the brain, 00:13:14.640 |
Do you see this as a promising possible technology, 00:13:17.520 |
or is the interface between the computer and the brain, 00:13:25.360 |
Whether it's promising, I'm really not qualified to say. 00:13:28.840 |
What I do find puzzling is that the brain being as small 00:13:36.160 |
and the individual components being as slow as they are 00:13:55.440 |
or integrated circuits which work as slowly as neurons do. 00:14:04.280 |
Something must be going on that we don't understand. 00:14:10.960 |
I'm not sure if you're familiar with his work. 00:14:13.880 |
He also describes this kind of mystery in the mind, 00:14:23.480 |
So there's no sort of mystical thing going on, 00:14:26.840 |
but there's so much about the material of the brain 00:14:30.680 |
that might be quantum mechanical in nature and so on. 00:14:36.960 |
Do you have any, have you ever thought about, 00:14:39.240 |
do you ever think about ideas of consciousness 00:14:43.600 |
of intelligence and consciousness that seems to pop up, 00:14:48.520 |
- I agree with Roger Penrose that there is a mystery there. 00:14:59.960 |
- But nobody knows anything about consciousness. 00:15:01.880 |
And in fact, if we talk about religion and so on, 00:15:06.880 |
the mystery of consciousness is so awe-inspiring 00:15:13.960 |
that the leap to sort of religious or mystical explanations 00:15:40.320 |
as just the medium through which the software 00:15:42.680 |
of our genetics and the ideas sort of propagate. 00:15:52.200 |
What in this context does the word meme mean? 00:15:56.560 |
- It would mean the cultural equivalent of a gene, 00:16:06.320 |
and the transmission of ideas in the broadest sense. 00:16:35.320 |
in a sort of differential way, in a Darwinian fashion? 00:17:12.600 |
- That's such a fascinating possibility, if that's true. 00:17:45.320 |
these things are actual changes in the body form, 00:17:50.800 |
and which get passed on from generation to generation, 00:17:59.400 |
- But the moment you start drifting away from the physical, 00:18:05.440 |
'cause the space of ideas, ideologies, political systems. 00:18:14.080 |
Are memes a metaphor more, or are they really, 00:18:24.280 |
- Well, I think they're a bit more than a metaphor, 00:18:27.440 |
I mentioned the physical bodily characteristics, 00:18:33.080 |
but with things like the propagation of religious ideas, 00:18:40.400 |
and transversely, as in a sort of epidemiology of ideas. 00:18:59.080 |
from grandparent to parent to child, et cetera, 00:19:02.640 |
is more like conventional genetic transmission. 00:19:11.320 |
Do you think about this implication in social networks, 00:19:17.400 |
and hence the new use of the word meme to describe-- 00:19:23.640 |
provides extremely rapid method of transmission. 00:19:31.160 |
and so I was thinking then in terms of books, newspapers, 00:19:59.080 |
it's like you have Galapagos Islands or something, 00:20:02.000 |
it's the '70s, and the internet allowed all these species 00:20:10.360 |
you can spread a message to millions of people, 00:20:22.120 |
and there's different, I guess, groups that have all, 00:20:28.600 |
- Basically, do you think your work in this direction, 00:20:32.520 |
while fundamentally it was focused on life on Earth, 00:20:35.660 |
do you think it should continue, like, to be taken further? 00:20:39.200 |
- I mean, I do think it would probably be a good idea 00:20:41.240 |
to think in a Darwinian way about this sort of thing. 00:20:45.400 |
We conventionally think of the transmission of ideas 00:20:57.120 |
living in small bands where everybody knew each other, 00:21:00.400 |
and ideas could propagate within the village, 00:21:03.240 |
and they might hop to a neighboring village occasionally, 00:21:07.060 |
and maybe even to a neighboring continent eventually. 00:21:15.820 |
I mean, you have people, it's been called echo chambers, 00:21:20.380 |
where people are in a sort of internet village, 00:21:26.500 |
may be geographically distributed all over the world, 00:21:29.340 |
but they just happen to be interested in the same things, 00:21:39.640 |
they don't all live in one place, they find each other, 00:21:43.660 |
and they talk the same language to each other, 00:21:51.660 |
of the primitive idea of people living in villages 00:22:02.400 |
So is there evolutionary purpose of villages, 00:22:07.240 |
- I wouldn't use a word like evolutionary purpose 00:22:12.520 |
that just emerged, that's the way people happen to live. 00:22:22.920 |
emerge in the same kind of way in this digital space? 00:22:28.000 |
- Is there something interesting to say about the, 00:22:38.560 |
of social interaction in these social networks? 00:22:49.280 |
- Well, a Darwinian selection idea would involve 00:22:52.800 |
investigating which ideas spread and which don't. 00:22:57.640 |
So some ideas don't have the ability to spread. 00:23:01.960 |
I mean, flat Earthism, there are a few people 00:23:12.280 |
can spread because they are attractive in some sense. 00:23:19.520 |
in the Darwinian context, it just has to be attractive 00:23:43.600 |
But there are other criteria which might help it to spread. 00:23:56.280 |
the peacock to survive, it helps it to pass on its genes. 00:23:59.240 |
Similarly, an idea which is actually rubbish, 00:24:10.280 |
- As a small sidestep, I remember reading somewhere, 00:24:13.080 |
I think recently, that in some species of birds, 00:24:18.160 |
sort of the idea that beauty may have its own purpose, 00:24:29.000 |
but there's some aspects of their feathers and so on 00:24:32.320 |
that serve no evolutionary purpose whatsoever. 00:24:37.360 |
that there are some things about beauty that animals do 00:24:50.680 |
Darwin, when he coined the phrase sexual selection, 00:25:05.760 |
that what females found attractive had to be useful. 00:25:09.920 |
It was enough that females found it attractive. 00:25:14.240 |
probably was completely useless in the conventional sense, 00:25:17.360 |
but was not at all useless in the sense of passing on 00:25:31.480 |
And they wanted sexually selected characteristics 00:25:36.240 |
like peacock's tails to be in some sense useful. 00:25:39.440 |
It's a bit of a stretch to think of a peacock's tail 00:25:41.200 |
as being useful, but in the sense of survival, 00:25:50.440 |
there are two schools of thought on sexual selection, 00:25:52.480 |
which are still active and about equally supported now. 00:25:56.000 |
Those who follow Darwin in thinking that it's just enough 00:25:58.760 |
to say it's attractive and those who follow Wallace 00:26:03.160 |
and say that it has to be in some sense useful. 00:26:07.240 |
- Do you fall into one category or the other? 00:26:11.840 |
I think they both could be correct in different cases. 00:26:31.280 |
so attraction is a powerful force in evolution. 00:26:36.040 |
On religion, do you think there will ever be a time 00:26:41.360 |
in our future where almost nobody believes in God 00:26:44.960 |
or God is not a part of the moral fabric of our society? 00:26:51.560 |
I think it may happen after a very long time. 00:26:53.880 |
I think it may take a long time for that to happen. 00:26:56.060 |
- So do you think ultimately for everybody on earth, 00:27:04.280 |
could do a better job than what religion does? 00:27:30.360 |
you mean the very basic sort of inarguable conclusions 00:27:34.600 |
of science versus which political system is better? 00:27:45.520 |
not just by the more rigorous methods of science, 00:27:59.480 |
are we hopelessly, fundamentally tied to religion 00:28:15.420 |
You could mean something like society needs religion 00:28:20.220 |
in order to function properly, something like that. 00:28:25.900 |
- Well, I've read books on it and they're persuasive. 00:28:30.900 |
I don't think they're that persuasive though. 00:28:46.320 |
about the idea that, well, you and I are intelligent enough 00:28:50.680 |
not to believe in God, but the plebs need it sort of thing. 00:29:02.680 |
do you think there's some value of spirituality? 00:29:12.400 |
the amount of things we actually know about our universe 00:29:21.360 |
even the certainty we have about the laws of physics, 00:29:23.720 |
it seems to be that there's yet a huge amount to discover. 00:29:27.680 |
And therefore we're sitting where 99.999% of things 00:29:34.320 |
Do you think there's a role in a kind of spiritual view 00:29:41.360 |
I think it's right to admit that there's a lot we don't know, 00:29:58.920 |
aren't adequate to do the job, then we need better ones. 00:30:02.800 |
And of course, the history of science shows just that, 00:30:09.680 |
and the science advances as science gets better. 00:30:12.080 |
But to invoke a non-scientific, non-physical explanation 00:30:18.080 |
is simply to lie down in a cowardly way and say, 00:30:21.720 |
"We can't solve it, so we're going to invoke magic." 00:30:30.560 |
It may be that we will never actually understand 00:30:32.560 |
everything and that's okay, but let's keep working on it. 00:30:47.240 |
maybe it's the aspect of scientists and not science, 00:30:56.280 |
that can lead us astray in terms of discovering 00:31:01.640 |
some of the big open questions about the universe. 00:31:06.120 |
I mean, there are arrogant people in any walk of life 00:31:15.440 |
So humility is a proper stance for a scientist. 00:31:23.980 |
But in a way to resort to a supernatural explanation 00:31:37.120 |
supernatural explanation must be the right one. 00:31:46.400 |
- So maybe if I could psychoanalyze you for a second, 00:31:50.240 |
you have at times been just slightly frustrated 00:32:03.840 |
that kind of seek supernatural explanations, 00:32:25.400 |
And I mean, obviously many of them are perfectly nice people 00:32:28.520 |
so I don't sort of despise them in that sense. 00:32:43.960 |
They will jump straight to what they think of 00:32:48.000 |
which is the supernatural one, which is not an alternative. 00:32:57.520 |
that we need to actually do some better science. 00:33:10.800 |
- So what about this really interesting space 00:33:16.960 |
but there's large communities, like you said, 00:33:26.680 |
I saw that there's, I've noticed that there's people 00:33:31.000 |
So there's this bigger world of conspiracy theorists, 00:33:36.640 |
which is a kind of, I mean, there's elements of it 00:33:53.760 |
which is also the credo of a good scientist, I would say. 00:33:59.960 |
- I mean, I think it's probably too easy to say 00:34:10.680 |
And so we shouldn't dismiss conspiracy theories out of hand. 00:34:21.720 |
But I mean, there may be other conspiracy theories 00:34:31.880 |
was very influential for me on both sides of the coin. 00:34:45.200 |
and it's very difficult to rigorously scientifically show 00:34:49.560 |
It's just, you have to use some of the human intuition 00:35:07.480 |
there has to be a large number of people working together 00:35:14.840 |
The sheer number who would have to be in on this conspiracy 00:35:18.600 |
and the sheer detail, the attention to detail 00:35:26.080 |
and why would anyone want to suggest that it didn't happen? 00:35:33.120 |
I mean, the physics of it, the mathematics of it, 00:35:36.120 |
the idea of computing orbits and trajectories and things, 00:35:47.280 |
about pointing out that the emperor has no clothes 00:36:02.880 |
You want to prove the entire scientific community wrong. 00:36:12.480 |
But often people who think they're a lone genius 00:36:17.600 |
So you have to judge each case on its merits. 00:36:21.720 |
the mere fact that you're going against the current tide 00:36:27.120 |
You've got to show you're right by looking at the evidence. 00:36:42.200 |
even reaching into presidential politics and so on. 00:36:48.400 |
that believe different kinds of conspiracy theories. 00:37:02.920 |
that are clearly nonsense like, well, flat Earth, 00:37:05.800 |
and also the conspiracy about not landing on the moon 00:37:25.040 |
because it's passed down from generation to generation. 00:37:33.760 |
and childhood indoctrination is a very powerful force. 00:37:38.760 |
But these things like the 9/11 conspiracy theory, 00:38:07.880 |
that they happen to read or meet who spins some yarn. 00:38:18.160 |
and not capable of critical thinking for themselves. 00:38:21.440 |
So I sort of get why the great religions of the world 00:38:33.800 |
And sure enough, flat-earthers is a very minority cult. 00:38:45.600 |
and in "The God Delusion" is the early indoctrination. 00:38:50.280 |
You can get away with a lot of out there ideas 00:38:56.040 |
if the age at which you convey those ideas at first 00:39:02.780 |
So indoctrination is sort of an essential element 00:39:26.440 |
So in general, where do you think morals come from 00:39:31.840 |
- A very complicated and interesting question. 00:39:53.320 |
And we in the 21st century are quite clearly labeled 00:39:58.320 |
21st century people in terms of our moral values. 00:40:05.360 |
I mean, some of us are a little bit more ruthless, 00:40:11.360 |
But we all subscribe to pretty much the same views 00:40:27.880 |
we're much less sexist and so on than we used to be. 00:40:30.520 |
Some people are still racist and some are still sexist, 00:40:42.320 |
And that is the most powerful influence I can see 00:40:50.520 |
And that doesn't have anything to do with religion. 00:40:55.600 |
the morals of the Old Testament are Bronze Age morals, 00:41:15.560 |
petty revenge, killing people for breaking the Sabbath, 00:41:23.720 |
- So at some point, religious texts may have in part 00:41:26.840 |
reflected that Gaussian distribution at that time. 00:41:30.040 |
- I'm sure they did, I'm sure they always reflect that, yes. 00:41:32.120 |
- And then now, but the sort of almost like the meme 00:41:35.920 |
as you describe it of ideas moves much faster 00:41:40.100 |
than religious texts do, or than religious people do. 00:41:53.400 |
So not only should we not get our morals from such texts, 00:42:01.280 |
If we did, then we'd be discriminating against women 00:42:04.560 |
and we'd be racist, we'd be killing homosexuals and so on. 00:42:20.520 |
and you can look at the Bible and you can cherry pick 00:42:23.560 |
particular verses which conform to our modern morality 00:42:27.640 |
and you'll find that Jesus says some pretty nice things, 00:42:30.480 |
which is great, but you're using your 21st century morality 00:42:35.480 |
to decide which verses to pick and which verses to reject. 00:42:39.040 |
And so why not cut out the middleman of the Bible 00:42:42.760 |
and go straight to the 21st century morality, 00:42:58.240 |
And it's a very interesting question to ask why. 00:43:07.400 |
so moral values progress for probably very different reasons. 00:43:14.360 |
has some evolutionary value or if it's merely a drift 00:43:20.440 |
and I'm not sure it's evolutionarily valuable. 00:43:22.600 |
What it is is progressive in the sense that each step 00:43:26.880 |
is a step in the same direction as the previous step. 00:43:33.160 |
by modern standards, more liberal, less violent. 00:43:37.440 |
- See, but more decent, I think you're using terms 00:43:46.760 |
that this is not more decent because we're now, 00:43:49.800 |
you know, there's a lot of weak members of society 00:43:56.840 |
by our standards, if we with hindsight look back at history, 00:44:00.880 |
what we see is a trend in the direction towards us, 00:44:06.760 |
- For us, we see progress, but it's an open question 00:44:13.000 |
I don't see necessarily why we can never return 00:44:19.640 |
But if you look at the history of moral values 00:44:27.360 |
I use the word progressive not in a value judgment sense, 00:44:33.920 |
Each step is the same direction of the previous step. 00:44:48.400 |
like the Romans did in the Colosseum from that stage. 00:45:13.840 |
- Well, there's a Dan Carlin of Hardcore History 00:45:18.960 |
of how we've enjoyed watching the torture of people, 00:45:32.480 |
we're kind of channeling that feeling into something else. 00:45:35.680 |
I mean, there is some dark aspects of human nature 00:45:41.160 |
And I do hope this like higher level software we've built 00:45:53.480 |
And I clearly remember that some of the darker aspects 00:46:00.080 |
- And there have been steps backwards admittedly, 00:46:09.560 |
- So Pamela McCorduck in "Machines Who Think" 00:46:14.920 |
has written that AI began with an ancient wish 00:46:19.580 |
Do you see, it's a poetic description, I suppose, 00:46:23.000 |
but do you see a connection between our civilizations, 00:46:27.040 |
historic desire to create gods, to create religions, 00:46:35.560 |
- I suppose there's a link between an ancient desire 00:46:51.860 |
I mean, I forget, I read somewhere a somewhat facetious 00:46:59.000 |
it's called Google, and we pray to it, and we worship it, 00:47:03.040 |
and we ask its advice like an oracle and so on. 00:47:07.680 |
- You don't see that, you see that as a fun statement, 00:47:18.760 |
- It has a kind of poetic resonance to it, which I get. 00:47:25.720 |
- I wouldn't have bothered to make the point myself, 00:47:29.140 |
So you don't think AI will become a new religion, 00:47:41.160 |
or indeed intelligent aliens from outer space, 00:47:44.520 |
might yield beings that we would regard as gods 00:48:03.080 |
as something very, very powerful and intelligent 00:48:05.520 |
on the one hand, and a god who doesn't need explaining 00:48:09.380 |
by a progressive step-by-step process like evolution 00:48:17.720 |
So the difference, so suppose we did meet an alien 00:48:27.480 |
and we would sort of worship it for that reason. 00:48:32.920 |
in the very important sense that it did not just happen 00:48:40.660 |
It must have come about by a gradual, step-by-step, 00:48:53.800 |
Intelligence, design, comes into the universe late 00:48:58.800 |
as a product of a progressive evolutionary process 00:49:06.440 |
- So most of the work is done through this slow-moving-- 00:49:17.680 |
Yeah, but there's still this desire to get answers 00:49:20.480 |
to the why question that if the world is a simulation, 00:49:38.080 |
- Then you still need to explain the programmer. 00:49:58.700 |
then the meta-meta-programmer must have evolved 00:50:06.940 |
to a gradual, incremental process of explanation 00:50:17.440 |
- But maybe to linger on that point about the simulation, 00:50:25.200 |
bore the heck out of everybody asking this question, 00:50:31.280 |
do you think, first, do you think we live in a simulation? 00:50:34.120 |
Second, do you think it's an interesting thought experiment? 00:50:37.320 |
- It's certainly an interesting thought experiment. 00:50:42.400 |
by Daniel Galouye called "Counterfeit World," 00:50:52.440 |
I mean, our heroes are running a gigantic computer 00:51:00.240 |
and so one of them has to go down into the simulated world 00:51:07.320 |
the climax to the novel is that they discover 00:51:09.240 |
that they themselves are in another simulation 00:51:18.760 |
Then it was revived seriously by Nick Bostrom. 00:51:29.480 |
not just to treat it as a science fiction speculation, 00:51:35.720 |
- I mean, he thinks it's very likely, actually. 00:51:44.400 |
- I mean, he thinks that we're in a simulation 00:51:48.720 |
done by, so to speak, our descendants of the future, 00:51:51.320 |
that the products, but it's still a product of evolution. 00:51:54.880 |
It's still ultimately going to be a product of evolution, 00:51:57.000 |
even though the super intelligent people of the future 00:52:09.680 |
I don't actually, in my heart of hearts, believe it, 00:52:18.080 |
that I agree with you, but the interesting thing to me, 00:52:21.120 |
if I were to say, if we're living in a simulation, 00:52:36.240 |
the higher, the upper ones, have to have evolved gradually. 00:52:39.840 |
However, the simulation they create could be instantaneous. 00:52:44.440 |
and we come into the world with fabricated memories. 00:52:52.800 |
but I'm saying from an engineering perspective, 00:52:55.400 |
both the programmer has to be slowly evolved, 00:53:03.200 |
- Oh, yeah, it takes a long time to write a program. 00:53:17.040 |
By the way, I have thought about using the Nick Bostrom idea 00:53:22.040 |
to solve the riddle of how, we were talking earlier 00:53:27.400 |
about why the human brain can achieve so much. 00:53:32.160 |
I thought of this when my then 100-year-old mother 00:53:36.160 |
was marveling at what I could do with a smartphone, 00:53:39.400 |
and I could call, look up anything in the encyclopedia, 00:53:42.800 |
or I could play her music that she liked, and so on. 00:53:44.680 |
She said, "But it's all in that tiny little phone." 00:53:56.200 |
then all the power that we think is in our skull, 00:54:12.760 |
that consciousness is somehow a fundamental part of physics, 00:54:15.920 |
that it doesn't have to actually all reside inside a brain. 00:54:19.240 |
- But Roger thinks it does reside in the skull, 00:54:33.080 |
are you familiar with the work of Donald Hoffman, I guess? 00:54:47.400 |
So like we biological organisms perceive the world 00:55:06.160 |
although it reflects the fundamental reality, 00:55:12.080 |
I do think that our perception is constructive 00:55:24.360 |
And so, and this is really the view of people 00:55:27.440 |
who work on visual illusions like Richard Gregory, 00:55:30.720 |
who point out that things like a Necker cube, 00:55:33.800 |
which flip from, it's a two-dimensional picture 00:55:48.440 |
What's going on is that the brain is constructing a cube, 00:56:00.440 |
I think that's just a model for what we do all the time 00:56:12.200 |
or make use of a perhaps previously constructed model. 00:56:39.600 |
out of the filing cabinet inside and grafted it onto, 00:56:48.600 |
Yeah, we do some kind of miraculous compression 00:56:58.720 |
So you've written several, many amazing books, 00:57:03.600 |
but let me ask what books, technical or fiction 00:57:08.440 |
or philosophical had a big impact on your own life? 00:57:11.400 |
What books would you recommend people consider reading 00:57:25.200 |
a shame to say I've never read Darwin in the original. 00:57:39.040 |
Everything except genetics is amazingly right 00:58:01.440 |
but he'd be astonished by Mendelian genetics as well. 00:58:04.240 |
- Yeah, it'd be fascinating to see what he thought about, 00:58:08.520 |
- I mean, yes, it would, because in many ways 00:58:11.520 |
it clears up what appeared in his time to be a riddle. 00:58:28.560 |
- Is there something outside sort of more fiction? 00:58:35.480 |
outside of kind of the realm of science and religion, 00:58:43.640 |
that I've learned some science from science fiction. 00:58:48.360 |
I mentioned Daniel Galouye, and that's one example, 00:58:54.800 |
but another of his novels called "Dark Universe," 00:59:00.360 |
but it's a very, very nice science fiction story. 00:59:06.320 |
And we're not told at the beginning of the book 00:59:09.600 |
They stumble around in some kind of underground world 00:59:15.560 |
using echolocation like bats and whales to get around. 00:59:20.360 |
And they've adapted, presumably by Darwinian means, 00:59:28.240 |
But what's interesting is that their mythology, 00:59:38.160 |
And so there's been a fall from a paradise world 00:59:43.160 |
that once existed where light reigned supreme. 00:59:46.960 |
And because of the sin of mankind, light banished them. 00:59:51.960 |
So then they no longer are in light's presence, 01:00:07.880 |
- So some of the same religious elements are present 01:00:10.280 |
in this other totally kind of absurd different form. 01:00:14.160 |
And so it's a wonderful, I wouldn't call it satire 01:00:17.640 |
I mean, a wonderful parable about Christianity 01:00:22.640 |
and the doctrine, the theological doctrine of the fall. 01:00:26.040 |
So I find that kind of science fiction immensely stimulating. 01:00:48.880 |
It suffers from an obnoxious hero, unfortunately, 01:00:51.640 |
but apart from that, you learn a lot of science from it. 01:00:59.520 |
which by the way, the theme of that is taken up 01:01:05.720 |
by another wonderful writer, Carl Sagan, in "Contact," 01:01:10.160 |
where the idea is, again, we will not be visited 01:01:18.520 |
We will be visited, possibly, we might be visited by radio, 01:01:25.200 |
and actually have a concrete influence on the world 01:01:30.560 |
if they make us or persuade us to build a computer 01:01:37.400 |
so that they can then transmit their software by radio. 01:01:44.680 |
And this is the same theme in both Hoyle's book 01:01:56.640 |
that we will never be invaded by physical bodies. 01:02:03.080 |
"The War of the Worlds" of H.G. Wells will never happen, 01:02:06.120 |
but we could be invaded by radio signals, code, 01:02:10.160 |
coded information, which is sort of like DNA. 01:02:13.400 |
And we are, I call them, we are survival machines of our DNA. 01:02:18.400 |
So it has great resonance for me because I think of us, 01:02:25.080 |
I think of bodies, physical bodies, biological bodies 01:02:29.040 |
as being manipulated by coded information, DNA, 01:02:39.560 |
It can be transmitted through the space of information. 01:02:43.880 |
- That's a fascinating possibility that from outer space, 01:02:47.240 |
we can be infiltrated by other memes, by other ideas 01:03:00.920 |
What gives your life fulfillment, purpose, happiness, meaning? 01:03:07.160 |
the meaning of life is the propagation of DNA, 01:03:27.060 |
We want to do whatever it is we do, write a quartet. 01:03:42.760 |
which have goal-seeking machinery built into them. 01:03:53.400 |
We have other goals, which can be very moving, 01:03:59.200 |
They could even be called spiritual in some cases. 01:04:02.700 |
We want to understand the riddle of the universe. 01:04:35.660 |
But what do you make of this unfortunate fact 01:04:55.120 |
If there's something frightening about mortality, 01:05:09.640 |
But eternity is only frightening if you're there. 01:05:15.980 |
and we were effectively dead before we were born. 01:05:21.240 |
"I was dead for billions of years before I was born 01:05:24.100 |
"and never suffered the smallest inconvenience." 01:05:26.360 |
And that's how it's going to be after we leave. 01:05:41.640 |
Richard, it was a huge honor to meet you, to talk to you. 01:05:49.860 |
And thank you to our presenting sponsor, Cash App. 01:05:54.660 |
by downloading Cash App and using code LEXPODCAST. 01:05:58.560 |
If you enjoy this podcast, subscribe on YouTube, 01:06:03.200 |
support on Patreon, or simply connect with me on Twitter 01:06:08.560 |
And now let me leave you with some words of wisdom 01:06:21.940 |
"The potential people who could have been here in my place, 01:06:25.240 |
"but who will in fact never see the light of day 01:06:37.700 |
"We know this because the set of possible people 01:06:49.420 |
"it is you and I, in our ordinariness, that are here. 01:07:03.240 |
"to that prior state from which the vast majority 01:07:08.360 |
Thank you for listening and hope to see you next time.