
Richard Dawkins: Evolution, Intelligence, Simulation, and Memes | Lex Fridman Podcast #87


Chapters

0:00 Introduction
2:31 Intelligent life in the universe
5:03 Engineering intelligence (are there shortcuts?)
7:06 Is the evolutionary process efficient?
10:39 Human brain and AGI
15:31 Memes
26:37 Does society need religion?
33:10 Conspiracy theories
39:10 Where do morals come from in humans?
46:10 AI began with the ancient wish to forge the gods
49:18 Simulation
56:58 Books that influenced you
62:53 Meaning of life


00:00:00.000 | The following is a conversation with Richard Dawkins,
00:00:03.260 | an evolutionary biologist and author of "The Selfish Gene,"
00:00:07.140 | "The Blind Watchmaker," "The God Delusion,"
00:00:09.720 | "The Magic of Reality," and "The Greatest Show on Earth,"
00:00:12.380 | and his latest, "Outgrowing God."
00:00:15.740 | He is the originator and popularizer
00:00:18.500 | of a lot of fascinating ideas in evolutionary biology
00:00:21.780 | and science in general, including, funny enough,
00:00:24.900 | the introduction of the word meme
00:00:27.140 | in his 1976 book, "The Selfish Gene,"
00:00:29.940 | which, in the context of a gene-centered view of evolution,
00:00:33.100 | is an exceptionally powerful idea.
00:00:35.700 | He's outspoken, bold, and often fearless
00:00:39.220 | in the defense of science and reason,
00:00:41.300 | and in this way, is one of the most influential thinkers
00:00:44.540 | of our time.
00:00:45.380 | This conversation was recorded
00:00:48.500 | before the outbreak of the pandemic.
00:00:50.540 | For everyone feeling the medical, psychological,
00:00:52.780 | and financial burden of this crisis,
00:00:54.680 | I'm sending love your way.
00:00:56.660 | Stay strong.
00:00:57.740 | We're in this together.
00:00:59.140 | We'll beat this thing.
00:01:00.980 | This is the Artificial Intelligence Podcast.
00:01:03.500 | If you enjoy it, subscribe on YouTube,
00:01:05.500 | review it with five stars on Apple Podcasts,
00:01:07.780 | support it on Patreon, or simply connect with me on Twitter,
00:01:11.020 | @lexfridman, spelled F-R-I-D-M-A-N.
00:01:14.480 | As usual, I'll do a few minutes of ads now,
00:01:16.980 | and never any ads in the middle
00:01:18.540 | that can break the flow of the conversation.
00:01:20.760 | I hope that works for you
00:01:21.940 | and doesn't hurt the listening experience.
00:01:25.540 | This show is presented by Cash App,
00:01:27.660 | the number one finance app in the App Store.
00:01:29.820 | When you get it, use code LEXPODCAST.
00:01:32.900 | Cash App lets you send money to friends,
00:01:34.900 | buy Bitcoin, and invest in the stock market
00:01:37.220 | with as little as $1.
00:01:38.380 | Since Cash App allows you to send and receive money
00:01:41.640 | digitally, peer-to-peer,
00:01:43.260 | security in all digital transactions is very important.
00:01:46.640 | Let me mention the PCI data security standard
00:01:49.700 | that Cash App is compliant with.
00:01:51.660 | I'm a big fan of standards for safety and security.
00:01:54.620 | PCI DSS is a good example of that,
00:01:57.740 | where a bunch of competitors got together and agreed
00:02:00.420 | that there needs to be a global standard
00:02:02.340 | around the security of transactions.
00:02:04.540 | Now we just need to do the same for autonomous vehicles
00:02:07.460 | and artificial intelligence systems in general.
00:02:10.220 | So again, if you get Cash App from the App Store,
00:02:12.620 | Google Play, and use the code LEXPODCAST,
00:02:15.780 | you get $10, and Cash App will also donate $10 to FIRST,
00:02:19.540 | an organization that is helping to advance robotics
00:02:22.220 | and STEM education for young people around the world.
00:02:25.660 | And now, here's my conversation with Richard Dawkins.
00:02:30.060 | Do you think there's intelligent life
00:02:32.540 | out there in the universe?
00:02:33.860 | - Well, if we accept that there's intelligent life here,
00:02:37.920 | and we accept that the number of planets in the universe
00:02:40.700 | is gigantic, I mean, 10 to the 22 stars has been estimated,
00:02:44.740 | it seems to me highly likely that there is not only life
00:02:47.340 | in the universe elsewhere, but also intelligent life.
00:02:50.740 | If you deny that, then you're committed to the view
00:02:53.940 | that the things that happened on this planet
00:02:55.540 | are staggeringly improbable.
00:02:57.060 | I mean, ludicrously, off the charts improbable.
00:03:00.620 | And I don't think it's that improbable.
00:03:02.980 | Certainly the origin of life itself,
00:03:04.660 | there are really two steps, the origin of life,
00:03:07.180 | which is probably fairly improbable,
00:03:09.200 | and then the subsequent evolution to intelligent life,
00:03:12.220 | which is also fairly improbable.
00:03:13.940 | So the juxtaposition of those two,
00:03:15.620 | you could say is pretty improbable,
00:03:17.340 | but not 10 to the 22 improbable.
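A rough back-of-envelope sketch of that argument, in Python. The only figure taken from the conversation is the 10 to the 22 star estimate; the two per-star probabilities are placeholders assumed purely for illustration.

```python
# Back-of-envelope version of the argument above: two individually improbable
# steps, multiplied across an enormous number of trials.
n_stars = 1e22            # estimated number of stars (figure cited in the conversation)
p_origin_of_life = 1e-9   # placeholder guess: chance life ever starts around a given star
p_intelligence = 1e-6     # placeholder guess: chance that life then evolves intelligence

expected = n_stars * p_origin_of_life * p_intelligence
print(f"Expected intelligent civilizations: {expected:.0e}")
# Even with these deliberately tiny placeholder probabilities, the expected
# count comes out around 1e7, which is the sense in which the joint event is
# "pretty improbable, but not 10 to the 22 improbable".
```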
00:03:20.500 | It's an interesting question, maybe you're coming onto it,
00:03:22.780 | how we would recognize intelligence from outer space
00:03:25.140 | if we encountered it.
00:03:27.820 | The most likely way we would come across them
00:03:29.540 | would be by radio.
00:03:30.900 | It's highly unlikely they'd ever visit us,
00:03:33.580 | but it's not that unlikely
00:03:36.820 | that we would pick up radio signals.
00:03:39.020 | And then we would have to have some means of deciding
00:03:42.060 | that it was intelligent.
00:03:43.320 | People involved in the SETI program
00:03:47.620 | discuss how they would do it,
00:03:48.900 | and things like prime numbers would be an obvious thing to,
00:03:52.700 | an obvious way for them to broadcast,
00:03:54.860 | to say we are intelligent, we are here.
00:03:57.160 | I suspect it probably would be obvious, actually.
00:04:02.140 | - Well, it's interesting, prime numbers,
00:04:03.540 | so the mathematical patterns,
00:04:05.560 | it's an open question whether mathematics is the same for us
00:04:09.140 | as it would be for aliens.
00:04:10.740 | I suppose we could assume that ultimately,
00:04:14.220 | if we're governed by the same laws of physics,
00:04:17.260 | then we should be governed by the same laws of mathematics.
00:04:19.580 | - I think so.
00:04:20.820 | I suspect that they will have Pythagoras' theorem, et cetera.
00:04:23.620 | I mean, I don't think that their mathematics
00:04:25.260 | will be that different.
00:04:26.420 | - Do you think evolution would also be a force
00:04:29.220 | on the alien planets as well?
00:04:30.580 | - I stuck my neck out and said that if we do,
00:04:32.660 | if ever that we do discover life elsewhere,
00:04:35.220 | it will be Darwinian life,
00:04:36.940 | in the sense that it will work by some kind of
00:04:40.580 | natural selection, the non-random survival
00:04:43.180 | of randomly generated codes.
00:04:46.980 | It doesn't mean that the,
00:04:48.860 | it would have to have some kind of genetics,
00:04:50.140 | but it doesn't have to be DNA genetics,
00:04:52.940 | probably wouldn't be, actually.
00:04:54.580 | But I think it would have to be Darwinian, yes.
00:04:57.020 | - So some kind of selection process.
00:04:59.800 | - Yes, in the general sense, it would be Darwinian.
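The phrase "non-random survival of randomly generated codes" is the idea behind the cumulative-selection demo Dawkins published in "The Blind Watchmaker" (the "weasel" program). Below is a minimal Python sketch in that spirit; the fixed target string is a deliberate simplification (real selection has no target), and the mutation rate and brood size are arbitrary choices, not parameters from the conversation.

```python
import random

# Cumulative selection toy: random variation plus non-random survival.
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
MUTATION_RATE = 0.04      # arbitrary per-character mutation probability
BROOD_SIZE = 100          # arbitrary number of offspring per generation

def fitness(s):
    # The non-random part: how many characters match the target.
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    # The random part: each character is occasionally replaced at random.
    return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
                   for c in s)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)   # a randomly generated "code"
generations = 0
while parent != TARGET:
    brood = [parent] + [mutate(parent) for _ in range(BROOD_SIZE)]
    parent = max(brood, key=fitness)    # only the fittest variant survives
    generations += 1
print(f"Reached the target in {generations} generations")
```

Typically this converges in a few hundred generations, which is the cumulative point: random variation alone would essentially never hit the target, but non-random survival of small improvements gets there quickly.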
00:05:02.500 | - So let me ask kind of an artificial intelligence
00:05:06.980 | engineering question.
00:05:08.100 | So you've been an outspoken critic of,
00:05:10.860 | I guess what could be called intelligent design,
00:05:13.540 | which is an attempt to describe the creation
00:05:15.380 | of a human mind and body by some religious folks,
00:05:18.700 | religious folks used to describe.
00:05:21.580 | So broadly speaking, evolution is, as far as I know,
00:05:25.140 | again, you can correct me, is the only scientific theory
00:05:27.500 | we have for the development of intelligent life.
00:05:29.980 | Like there's no alternative theory as far as I understand.
00:05:33.260 | - None has ever been suggested,
00:05:35.180 | and I suspect it never will be.
00:05:36.780 | - Well, of course, whenever somebody says that,
00:05:40.900 | a hundred years later.
00:05:42.580 | - I know.
00:05:43.420 | (laughing)
00:05:44.260 | It's a risk.
00:05:45.100 | - It's a risk.
00:05:45.940 | - But, you want to bet?
00:05:47.820 | I mean, I--
00:05:48.980 | - But it would look, sorry, yes,
00:05:50.500 | it would probably look very similar,
00:05:51.860 | but it's almost like Einstein's general relativity
00:05:54.620 | versus Newtonian physics.
00:05:56.180 | It'll be maybe an alteration of the theory
00:05:59.940 | or something like that,
00:06:00.780 | but it won't be fundamentally different.
00:06:02.500 | But okay, so now for the past 70 years,
00:06:07.500 | even before the AI community has been trying
00:06:10.540 | to engineer intelligence, in a sense,
00:06:12.660 | to do what intelligent design says, you know,
00:06:16.940 | was done here on Earth, what's your intuition?
00:06:19.780 | Do you think it's possible to build intelligence,
00:06:24.460 | to build computers that are intelligent,
00:06:27.320 | or do we need to do something like the evolutionary process?
00:06:30.400 | Like there's no shortcuts here.
00:06:33.380 | - That's an interesting question.
00:06:35.300 | I'm committed to the belief that it's ultimately possible,
00:06:39.100 | because I think there's nothing non-physical in our brains.
00:06:42.320 | I think our brains work by the laws of physics.
00:06:45.180 | And so it must, in principle,
00:06:47.460 | be possible to replicate that.
00:06:49.420 | In practice, though, it might be very difficult.
00:06:52.300 | And as you suggest, it may be the only way to do it
00:06:55.100 | is by something like an evolutionary process.
00:06:57.660 | I'd be surprised.
00:06:58.500 | I suspect that it will come.
00:07:00.460 | But it's certainly been slower in coming
00:07:03.140 | than some of the early pioneers thought.
00:07:05.660 | - Thought it would be, yeah.
00:07:07.020 | But in your sense, is the evolutionary process efficient?
00:07:10.500 | So you can see it as exceptionally wasteful
00:07:12.800 | in one perspective, but at the same time,
00:07:15.200 | maybe that is the only path to--
00:07:17.280 | - It's a paradox, isn't it?
00:07:18.280 | I mean, on the one side, it is deplorably wasteful.
00:07:21.920 | It's fundamentally based on waste.
00:07:24.560 | On the other hand, it does produce magnificent results.
00:07:27.400 | The design of a soaring bird, an albatross,
00:07:33.280 | a vulture, an eagle, is superb.
00:07:37.880 | An engineer would be proud to have done it.
00:07:39.520 | On the other hand, an engineer would not be proud
00:07:41.160 | to have done some of the other things
00:07:42.320 | that evolution has served up.
00:07:45.280 | Some of the sort of botched jobs
00:07:47.280 | that you can easily understand
00:07:49.360 | because of their historical origins,
00:07:51.280 | but they don't look well-designed.
00:07:53.400 | - Do you have examples of bad design?
00:07:55.840 | - My favorite example is the recurrent laryngeal nerve.
00:07:58.080 | I've used this many times.
00:07:59.440 | This is a nerve, it's one of the cranial nerves,
00:08:01.840 | which goes from the brain, and the end organ
00:08:04.880 | that it supplies is the voice box, the larynx.
00:08:08.880 | But it doesn't go straight to the larynx.
00:08:10.280 | It goes right down into the chest
00:08:12.320 | and then loops around an artery in the chest
00:08:15.440 | and then comes straight back up again to the larynx.
00:08:19.080 | And I've assisted in the dissection of a giraffe's neck,
00:08:22.080 | which happened to have died in a zoo.
00:08:24.080 | And we watched the, we saw the recurrent laryngeal nerve
00:08:27.520 | going, whizzing straight past the larynx,
00:08:30.640 | within an inch of the larynx,
00:08:32.200 | down into the chest and then back up again,
00:08:34.360 | which is a detour of many feet.
00:08:39.080 | Very, very inefficient.
00:08:41.240 | The reason is historical.
00:08:43.160 | The ancestors, our fish ancestors,
00:08:45.760 | the ancestors of all mammals and fish,
00:08:47.680 | the most direct pathway of that,
00:08:52.800 | of the equivalent of that nerve,
00:08:54.400 | there wasn't a larynx in those days,
00:08:55.760 | but it innervated part of the gills.
00:08:58.800 | The most direct pathway was behind that artery.
00:09:03.320 | And then when the mammal, when the tetrapods,
00:09:05.840 | when the land vertebrates started evolving,
00:09:07.760 | and then the neck started to stretch,
00:09:10.200 | the marginal cost of changing the embryological design
00:09:14.200 | to jump that nerve over the artery was too great,
00:09:17.640 | or rather was, each step of the way was a very small cost,
00:09:22.440 | but the marginal, but the cost of actually jumping it over
00:09:24.840 | would have been very large.
00:09:26.680 | As the neck lengthened, it was a negligible change
00:09:30.320 | to just increase the length of the detour a tiny bit,
00:09:32.760 | a tiny bit, a tiny bit,
00:09:33.600 | each millimeter at a time didn't make any difference.
00:09:35.800 | And so, but finally, when you get to a giraffe,
00:09:38.240 | it's a huge detour and no doubt is very inefficient.
00:09:41.160 | Now that's bad design.
00:09:42.840 | Any engineer would reject that piece of design,
00:09:46.720 | it's ridiculous.
00:09:47.800 | And there are quite a number of examples,
00:09:49.560 | as you'd expect,
00:09:50.400 | it's not surprising that we find examples of that sort.
00:09:53.400 | In a way, what's surprising is there aren't more of them.
00:09:55.480 | In a way, what's surprising
00:09:57.200 | is that the design of living things is so good.
00:09:59.600 | So natural selection manages to achieve excellent results,
00:10:04.360 | partly by tinkering,
00:10:06.120 | partly by coming along and cleaning up initial mistakes
00:10:11.120 | and as it were, making the best of a bad job.
00:10:14.320 | - That's really interesting.
00:10:15.160 | I mean, it is surprising and beautiful.
00:10:17.680 | And it's a mystery from an engineering perspective
00:10:20.760 | that so many things are well-designed.
00:10:23.080 | I suppose the thing we're forgetting
00:10:25.680 | is how many generations have to die for that.
00:10:30.680 | - That's the inefficiency of it, yes.
00:10:31.760 | That's the horrible wastefulness of it.
00:10:33.560 | - So yeah, we marvel at the final product,
00:10:36.400 | but yeah, the process is painful.
00:10:38.560 | Elon Musk describes human beings as potentially the,
00:10:43.520 | what he calls the biological bootloader
00:10:45.600 | for artificial intelligence
00:10:47.520 | or artificial general intelligence is used as the term,
00:10:50.440 | it's kind of like super intelligence.
00:10:52.560 | Do you see superhuman level intelligence
00:10:56.080 | as potentially the next step in the evolutionary process?
00:10:59.320 | - Yes, I think that if superhuman intelligence
00:11:01.560 | is to be found, it will be artificial.
00:11:03.360 | I don't have any hope that we ourselves,
00:11:06.400 | our brains will go on getting larger
00:11:10.280 | in ordinary biological evolution.
00:11:13.560 | I think that's probably coming to an end.
00:11:16.040 | It is the dominant trend or one of the dominant trends
00:11:19.240 | in our fossil history for the last two or three million years.
00:11:23.720 | - Brain size?
00:11:24.560 | - Brain size, yes.
00:11:25.480 | So it's been swelling rather dramatically
00:11:28.440 | over the last two or three million years.
00:11:30.680 | That is unlikely to continue.
00:11:32.600 | The only way that happens is because natural selection
00:11:36.040 | favors those individuals with the biggest brains
00:11:40.040 | and that's not happening anymore.
00:11:42.000 | - Right, so in general in humans,
00:11:43.960 | the selection pressures are not,
00:11:47.560 | I mean, are they active in any form?
00:11:49.200 | - Well, in order for them to be active,
00:11:53.080 | it would be necessary that the most intelligent,
00:11:55.560 | let's call it intelligence.
00:11:57.120 | Not that intelligence is simply correlated
00:12:00.480 | to brain size, but let's talk about intelligence.
00:12:03.040 | In order for that to evolve, it's necessary
00:12:06.680 | that the most intelligent beings have the most,
00:12:08.680 | individuals have the most children.
00:12:10.480 | And so intelligence may buy you money,
00:12:16.600 | it may buy you worldly success,
00:12:18.920 | it may buy you a nice house and a nice car
00:12:21.640 | and things like that if you have a successful career.
00:12:24.480 | It may buy you the admiration of your fellow people,
00:12:28.360 | but it doesn't increase the number of offspring
00:12:31.200 | that you have, it doesn't increase your genetic legacy
00:12:34.560 | to the next generation.
00:12:35.920 | On the other hand, artificial intelligence,
00:12:38.160 | I mean, computers and technology generally
00:12:42.120 | is evolving by a non-genetic means,
00:12:45.200 | by leaps and bounds, of course.
00:12:47.920 | - And so what do you think, I don't know if you're familiar,
00:12:50.120 | there's a company called Neuralink,
00:12:51.560 | but there's a general effort of brain-computer interfaces,
00:12:54.320 | which is to try to build a connection
00:12:58.000 | between the computer and the brain,
00:12:59.560 | to send signals both directions.
00:13:01.640 | And the long-term dream there is to do exactly that,
00:13:05.080 | which is expand, I guess expand the size of the brain,
00:13:08.640 | expand the capabilities of the brain.
00:13:10.560 | Do you see this as interesting?
00:13:14.640 | Do you see this as a promising possible technology,
00:13:17.520 | or is the interface between the computer and the brain,
00:13:20.440 | like the brain is this wet, messy thing
00:13:22.320 | that's just impossible to interface with?
00:13:24.000 | - Well, of course it's interesting.
00:13:25.360 | Whether it's promising, I'm really not qualified to say.
00:13:28.840 | What I do find puzzling is that the brain being as small
00:13:33.400 | as it is compared to a computer,
00:13:36.160 | and the individual components being as slow as they are
00:13:39.040 | compared to our electronic components,
00:13:41.920 | it is astonishing what it can do.
00:13:44.640 | I mean, imagine building a computer
00:13:48.040 | that fits into the size of a human skull,
00:13:51.680 | and with the equivalent of transistors
00:13:55.440 | or integrated circuits which work as slowly as neurons do.
00:14:00.440 | It's something mysterious about that.
00:14:04.280 | Something must be going on that we don't understand.
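To put rough numbers on that puzzle, here is a crude comparison in Python. Every figure below is a commonly cited order-of-magnitude estimate assumed for illustration, not a number stated in the conversation.

```python
# Crude comparison of the brain's slow, massively parallel hardware with a fast,
# largely serial chip. All figures are ballpark estimates.
neurons = 8.6e10          # roughly 86 billion neurons
synapses = 1.0e14         # on the order of 100 trillion synaptic connections
neuron_rate_hz = 1e2      # a neuron fires at most a few hundred times per second
cpu_clock_hz = 3e9        # a modern CPU core switches a few billion times per second

print(f"Per-element speed gap (CPU vs neuron): about {cpu_clock_hz / neuron_rate_hz:.0e}x")
print(f"Potential synaptic events per second: about {synapses * neuron_rate_hz:.0e}")
# The per-component slowness (a factor of around 10^7) is offset by parallelism
# on a scale no current machine matches, which is part of the mystery being
# pointed at above.
```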
00:14:07.960 | - So I've just talked to Roger Penrose.
00:14:10.960 | I'm not sure if you're familiar with his work.
00:14:13.880 | He also describes this kind of mystery in the mind,
00:14:19.680 | in the brain, but he's a materialist.
00:14:23.480 | So there's no sort of mystical thing going on,
00:14:26.840 | but there's so much about the material of the brain
00:14:29.240 | that we don't understand,
00:14:30.680 | that might be quantum mechanical in nature and so on.
00:14:34.320 | So there are the ideas about consciousness.
00:14:36.960 | Do you have any, have you ever thought about,
00:14:39.240 | do you ever think about ideas of consciousness
00:14:41.480 | or a little bit more about the mystery
00:14:43.600 | of intelligence and consciousness that seems to pop up,
00:14:46.200 | just like you're saying, from our brain?
00:14:48.520 | - I agree with Roger Penrose that there is a mystery there.
00:14:51.840 | He's one of the world's greatest physicists,
00:14:56.400 | so I can't possibly argue with his-
00:14:59.960 | - But nobody knows anything about consciousness.
00:15:01.880 | And in fact, if we talk about religion and so on,
00:15:06.880 | the mystery of consciousness is so awe-inspiring
00:15:11.680 | and we know so little about it
00:15:13.960 | that the leap to sort of religious or mystical explanations
00:15:17.280 | is too easy to make.
00:15:18.800 | - I think that it's just an act of cowardice
00:15:21.320 | to leap to religious explanations,
00:15:22.960 | and Roger doesn't do that, of course.
00:15:24.840 | But I accept that there may be something
00:15:28.960 | that we don't understand about it.
00:15:31.200 | - So correct me if I'm wrong,
00:15:32.760 | but in your book, "Selfish Gene,"
00:15:34.560 | the gene-centered view of evolution
00:15:37.120 | allows us to think of the physical organisms
00:15:40.320 | as just the medium through which the software
00:15:42.680 | of our genetics and the ideas sort of propagate.
00:15:47.200 | So maybe can we start just with the basics?
00:15:52.200 | What in this context does the word meme mean?
00:15:56.560 | - It would mean the cultural equivalent of a gene,
00:15:59.960 | cultural equivalent in the sense of
00:16:02.280 | that which plays the same role as the gene
00:16:04.440 | in the transmission of culture
00:16:06.320 | and the transmission of ideas in the broadest sense.
00:16:09.800 | And it's only a useful word
00:16:11.680 | if there's something Darwinian going on.
00:16:14.320 | Obviously, culture is transmitted,
00:16:16.560 | but is there anything Darwinian going on?
00:16:18.840 | And if there is, that means there has to be
00:16:20.520 | something like a gene,
00:16:22.280 | which becomes more numerous or less numerous
00:16:25.880 | in the population.
00:16:27.800 | - So it can replicate?
00:16:29.640 | - It can replicate.
00:16:30.640 | Well, it clearly does replicate.
00:16:31.960 | There's no question about that.
00:16:33.880 | The question is, does it replicate
00:16:35.320 | in a sort of differential way, in a Darwinian fashion?
00:16:38.480 | Could you say that certain ideas propagate
00:16:40.920 | because they're successful in the meme pool?
00:16:44.480 | In a sort of trivial sense, you can.
00:16:46.480 | Would you wish to say, though,
00:16:49.240 | that in the same way as an animal body
00:16:52.080 | is modified, adapted to serve as a machine
00:16:56.800 | for propagating genes,
00:16:59.160 | is it also a machine for propagating memes?
00:17:01.200 | Could you actually say that something
00:17:02.560 | about the way a human is,
00:17:04.160 | is modified, adapted
00:17:07.520 | for the function of meme propagation?
00:17:12.600 | - That's such a fascinating possibility, if that's true.
00:17:16.040 | If that it's not just about the genes,
00:17:18.320 | which seem somehow more comprehensible,
00:17:21.640 | it's like these things of biology.
00:17:23.600 | The idea that culture, or maybe ideas,
00:17:28.360 | you can really broadly define it,
00:17:30.640 | operates under these mechanisms.
00:17:33.440 | - Even morphology, even anatomy,
00:17:36.120 | does evolve by memetic means.
00:17:39.400 | I mean, things like hairstyles,
00:17:42.000 | styles of makeup, circumcision,
00:17:45.320 | these things are actual changes in the body form,
00:17:49.480 | which are non-genetic,
00:17:50.800 | and which get passed on from generation to generation,
00:17:53.600 | or sideways, like a virus,
00:17:56.080 | in a quasi-genetic way.
00:17:59.400 | - But the moment you start drifting away from the physical,
00:18:03.520 | it becomes interesting,
00:18:05.440 | 'cause the space of ideas, ideologies, political systems.
00:18:09.680 | - Of course, yes.
00:18:10.560 | - So, what's your sense?
00:18:14.080 | Are memes a metaphor more, or are they really,
00:18:20.240 | is there something fundamental,
00:18:22.160 | almost physical presence of memes?
00:18:24.280 | - Well, I think they're a bit more than a metaphor,
00:18:26.120 | and I think that,
00:18:27.440 | I mentioned the physical bodily characteristics,
00:18:31.840 | which are a bit trivial in a way,
00:18:33.080 | but with things like the propagation of religious ideas,
00:18:36.640 | both longitudinally down generations,
00:18:40.400 | and transversely, as in a sort of epidemiology of ideas.
00:18:45.400 | When a charismatic preacher converts people,
00:18:50.560 | that resembles viral transmission,
00:18:55.640 | whereas the longitudinal transmission
00:18:59.080 | from grandparent to parent to child, et cetera,
00:19:02.640 | is more like conventional genetic transmission.
00:19:07.120 | - That's such a beautiful,
00:19:08.840 | especially in the modern day, idea.
00:19:11.320 | Do you think about this implication in social networks,
00:19:13.840 | where the propagation of ideas,
00:19:15.440 | the viral propagation of ideas,
00:19:17.400 | and hence the new use of the word meme to describe--
00:19:21.720 | - Well, the internet, of course,
00:19:23.640 | provides extremely rapid method of transmission.
00:19:27.640 | Before, when I first coined the word,
00:19:30.120 | the internet didn't exist,
00:19:31.160 | and so I was thinking then in terms of books, newspapers,
00:19:36.360 | broadcast radio, television, that kind of thing.
00:19:39.560 | Now, an idea can just leap around the world
00:19:43.600 | in all directions instantly.
00:19:46.400 | And so, the internet provides a step change
00:19:50.960 | in the facility of propagation of memes.
00:19:55.160 | - How does that make you feel?
00:19:56.600 | Isn't it fascinating that sort of ideas,
00:19:59.080 | it's like you have Galapagos Islands or something,
00:20:02.000 | it's the '70s, and the internet allowed all these species
00:20:05.880 | to just globalize.
00:20:08.320 | And in a matter of seconds,
00:20:10.360 | you can spread a message to millions of people,
00:20:13.160 | and these ideas, these memes can breed,
00:20:17.720 | can evolve, can mutate, there's a selection,
00:20:22.120 | and there's different, I guess, groups that have all,
00:20:25.320 | there's a dynamics that's fascinating here.
00:20:27.520 | Do you think-- - Yes.
00:20:28.600 | - Basically, do you think your work in this direction,
00:20:32.520 | while fundamentally it was focused on life on Earth,
00:20:35.660 | do you think it should continue, like, to be taken further?
00:20:39.200 | - I mean, I do think it would probably be a good idea
00:20:41.240 | to think in a Darwinian way about this sort of thing.
00:20:45.400 | We conventionally think of the transmission of ideas
00:20:49.000 | in evolutionary context as being limited to,
00:20:52.560 | in our ancestors, people living in villages,
00:20:57.120 | living in small bands where everybody knew each other,
00:21:00.400 | and ideas could propagate within the village,
00:21:03.240 | and they might hop to a neighboring village occasionally,
00:21:07.060 | and maybe even to a neighboring continent eventually.
00:21:10.180 | And that was a slow process.
00:21:12.180 | Nowadays, villages are international.
00:21:15.820 | I mean, you have people, it's been called echo chambers,
00:21:20.380 | where people are in a sort of internet village,
00:21:23.940 | where the other members of the village
00:21:26.500 | may be geographically distributed all over the world,
00:21:29.340 | but they just happen to be interested in the same things,
00:21:31.840 | use the same terminology, the same jargon,
00:21:35.580 | have the same enthusiasm.
00:21:36.900 | So people like the Flat Earth Society,
00:21:39.640 | they don't all live in one place, they find each other,
00:21:43.660 | and they talk the same language to each other,
00:21:45.180 | they talk the same nonsense to each other.
00:21:47.180 | But so this is a kind of distributed version
00:21:51.660 | of the primitive idea of people living in villages
00:21:56.140 | and propagating their ideas in a local way.
00:21:58.660 | - Is there Darwinist parallel here?
00:22:02.400 | So is there evolutionary purpose of villages,
00:22:05.920 | or is that just a--
00:22:07.240 | - I wouldn't use a word like evolutionary purpose
00:22:09.040 | in that case, but villages will be something
00:22:12.520 | that just emerged, that's the way people happen to live.
00:22:15.760 | - And in just the same kind of way,
00:22:19.960 | the Flat Earth Society, societies of ideas
00:22:22.920 | emerge in the same kind of way in this digital space?
00:22:26.560 | - Yes, yes.
00:22:28.000 | - Is there something interesting to say about the,
00:22:31.040 | I guess, from a perspective of Darwin,
00:22:34.960 | could we fully interpret the dynamics
00:22:38.560 | of social interaction in these social networks?
00:22:43.320 | Or is there some much more complicated thing
00:22:46.800 | need to be developed?
00:22:48.320 | Like what's your sense?
00:22:49.280 | - Well, a Darwinian selection idea would involve
00:22:52.800 | investigating which ideas spread and which don't.
00:22:57.640 | So some ideas don't have the ability to spread.
00:23:01.960 | I mean, flat Earthism, there are a few people
00:23:06.040 | believing it, but it's not gonna spread
00:23:07.360 | because it's obvious nonsense.
00:23:08.880 | But other ideas, even if they are wrong,
00:23:12.280 | can spread because they are attractive in some sense.
00:23:16.480 | - So the spreading and the selection
00:23:19.520 | in the Darwinian context, it just has to be attractive
00:23:24.240 | in some sense, like we don't have to define,
00:23:26.320 | like it doesn't have to be attractive
00:23:27.640 | in the way that animals attract each other.
00:23:30.600 | It could be attractive in some other way.
00:23:32.200 | - Yes, all that matters is,
00:23:35.120 | all that is needed is that it should spread.
00:23:37.720 | And it doesn't have to be true to spread.
00:23:39.640 | Truth is one criterion
00:23:41.400 | which might help an idea to spread.
00:23:43.600 | But there are other criteria which might help it to spread.
00:23:45.960 | As you say, attraction in animals
00:23:49.420 | is not necessarily valuable for survival.
00:23:51.880 | The famous peacock's tail doesn't help
00:23:56.280 | the peacock to survive, it helps it to pass on its genes.
00:23:59.240 | Similarly, an idea which is actually rubbish,
00:24:03.280 | but which people don't know is rubbish
00:24:04.880 | and think is very attractive, will spread
00:24:07.000 | in the same way as a peacock's genes spread.
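A toy model of that claim, in Python: transmission here depends only on an assumed "appeal" number (how readily an idea gets passed on), while a "true" flag is carried along but never consulted. The idea names and all numbers are invented for illustration.

```python
# Toy meme pool: spread is driven by appeal, not truth.
ideas = [
    {"name": "true but dull",    "true": True,  "appeal": 0.02},
    {"name": "false but catchy", "true": False, "appeal": 0.15},
]
POPULATION = 10_000
CONTACTS_PER_STEP = 3     # how many people each adopter exposes per step
STEPS = 40

for idea in ideas:
    adopters = 10.0       # small initial seed for each idea
    for _ in range(STEPS):
        frac_susceptible = 1 - adopters / POPULATION
        # Expected new adopters: exposures that reach a non-adopter and stick.
        new = adopters * CONTACTS_PER_STEP * frac_susceptible * idea["appeal"]
        adopters = min(POPULATION, adopters + new)
    # Note that the "true" flag never enters the update above.
    print(f"{idea['name']}: roughly {adopters:,.0f} adopters after {STEPS} steps")
```

With these made-up values the catchy falsehood saturates the population while the dull truth barely grows, which is the peacock's-tail point: what matters for spread is attractiveness in the transmission sense, not usefulness or truth.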
00:24:10.280 | - As a small sidestep, I remember reading somewhere,
00:24:13.080 | I think recently, that in some species of birds,
00:24:18.160 | sort of the idea that beauty may have its own purpose,
00:24:21.920 | and the idea that some birds,
00:24:26.020 | I'm being ineloquent here,
00:24:29.000 | but there's some aspects of their feathers and so on
00:24:32.320 | that serve no evolutionary purpose whatsoever.
00:24:35.440 | There's somebody making an argument
00:24:37.360 | that there are some things about beauty that animals do
00:24:40.160 | that may be its own purpose.
00:24:42.560 | Does that ring a bell for you?
00:24:45.400 | Does it sound ridiculous?
00:24:46.960 | - I think it's a rather distorted bell.
00:24:50.680 | Darwin, when he coined the phrase sexual selection,
00:24:55.160 | didn't feel the need to suggest
00:25:00.520 | that what was attractive to females,
00:25:03.680 | usually is males attracting females,
00:25:05.760 | that what females found attractive had to be useful.
00:25:08.080 | He said it didn't have to be useful.
00:25:09.920 | It was enough that females found it attractive.
00:25:12.600 | And so it could be completely useless,
00:25:14.240 | probably was completely useless in the conventional sense,
00:25:17.360 | but was not at all useless in the sense of passing on
00:25:20.600 | (Darwin didn't call them genes),
00:25:21.800 | but in the sense of reproducing.
00:25:23.400 | Others, starting with Wallace,
00:25:27.240 | the co-discoverer of natural selection,
00:25:30.320 | didn't like that idea.
00:25:31.480 | And they wanted sexually selected characteristics
00:25:36.240 | like peacock's tails to be in some sense useful.
00:25:39.440 | It's a bit of a stretch to think of a peacock's tail
00:25:41.200 | as being useful, but in the sense of survival,
00:25:43.920 | but others have run with that idea
00:25:46.400 | and have brought it up to date.
00:25:48.520 | And so there's a kind of,
00:25:50.440 | there are two schools of thought on sexual selection,
00:25:52.480 | which are still active and about equally supported now.
00:25:56.000 | Those who follow Darwin in thinking that it's just enough
00:25:58.760 | to say it's attractive and those who follow Wallace
00:26:03.160 | and say that it has to be in some sense useful.
00:26:07.240 | - Do you fall into one category or the other?
00:26:10.560 | - No, I'm open-minded.
00:26:11.840 | I think they both could be correct in different cases.
00:26:14.540 | I mean, they've both been made sophisticated
00:26:17.840 | in a mathematical sense,
00:26:19.280 | more so than when Darwin and Wallace
00:26:20.860 | first started talking about it.
00:26:22.580 | - I'm Russian, I romanticize things,
00:26:24.800 | so I prefer the former,
00:26:27.720 | where the beauty in itself is a powerful,
00:26:31.280 | so attraction is a powerful force in evolution.
00:26:36.040 | On religion, do you think there will ever be a time
00:26:41.360 | in our future where almost nobody believes in God
00:26:44.960 | or God is not a part of the moral fabric of our society?
00:26:49.420 | - Yes, I do.
00:26:51.560 | I think it may happen after a very long time.
00:26:53.880 | I think it may take a long time for that to happen.
00:26:56.060 | - So do you think ultimately for everybody on earth,
00:26:59.280 | religion, other forms of doctrines, ideas,
00:27:04.280 | could do a better job than what religion does?
00:27:07.740 | - Yes.
00:27:09.220 | I mean, following truth.
00:27:12.720 | - Well, truth is a funny, funny word.
00:27:15.240 | And reason too.
00:27:18.040 | There's, yeah, it's a difficult idea now
00:27:23.960 | with truth and the internet, right?
00:27:26.960 | And fake news and so on.
00:27:28.680 | I suppose when you say reason,
00:27:30.360 | you mean the very basic sort of inarguable conclusions
00:27:34.600 | of science versus which political system is better?
00:27:38.240 | - Yes, yes.
00:27:39.400 | I mean truth about the real world,
00:27:42.680 | which is ascertainable by,
00:27:45.520 | not just by the more rigorous methods of science,
00:27:48.120 | but by just ordinary sensory observation.
00:27:52.840 | - So do you think there will ever be a time
00:27:54.440 | when we move past it?
00:27:57.360 | Like, I guess another way to ask it,
00:27:59.480 | are we hopelessly, fundamentally tied to religion
00:28:04.480 | in the way our society functions?
00:28:09.200 | - Well, clearly all individuals
00:28:11.800 | are not hopelessly tied to it
00:28:13.360 | because many individuals don't believe.
00:28:15.420 | You could mean something like society needs religion
00:28:20.220 | in order to function properly, something like that.
00:28:22.440 | And some people have suggested that.
00:28:23.840 | - What's your intuition on that?
00:28:25.900 | - Well, I've read books on it and they're persuasive.
00:28:30.900 | I don't think they're that persuasive though.
00:28:33.720 | I mean, some people suggested that society
00:28:38.320 | needs a sort of figurehead,
00:28:40.920 | which can be a non-existent figurehead
00:28:42.880 | in order to function properly.
00:28:44.640 | I think there's something rather patronizing
00:28:46.320 | about the idea that, well, you and I are intelligent enough
00:28:50.680 | not to believe in God, but the plebs need it sort of thing.
00:28:54.280 | And I think that's patronizing.
00:28:55.840 | And I'd like to think that
00:28:58.360 | that was not the right way to proceed.
00:29:01.200 | - But at the individual level,
00:29:02.680 | do you think there's some value of spirituality?
00:29:08.680 | Sort of, if I think sort of as a scientist,
00:29:12.400 | the amount of things we actually know about our universe
00:29:15.140 | is a tiny, tiny, tiny percentage
00:29:18.060 | of what we could possibly know.
00:29:19.940 | So just from everything,
00:29:21.360 | even the certainty we have about the laws of physics,
00:29:23.720 | it seems to be that there's yet a huge amount to discover.
00:29:27.680 | And therefore we're sitting where 99.999% of things
00:29:31.640 | is just still shrouded in mystery.
00:29:34.320 | Do you think there's a role in a kind of spiritual view
00:29:37.680 | of that, sort of a humbled spiritual?
00:29:39.720 | - I think it's right to be humble.
00:29:41.360 | I think it's right to admit that there's a lot we don't know,
00:29:44.200 | a lot that we don't understand,
00:29:45.360 | a lot that we still need to work on.
00:29:47.640 | We're working on it.
00:29:49.200 | What I don't think is that it helps
00:29:50.960 | to invoke supernatural explanations.
00:29:54.480 | If our current scientific explanations
00:29:58.920 | aren't adequate to do the job, then we need better ones.
00:30:01.400 | We need to work more.
00:30:02.800 | And of course, the history of science shows just that,
00:30:05.240 | that as science goes on,
00:30:06.720 | problems get solved one after another
00:30:09.680 | and the science advances as science gets better.
00:30:12.080 | But to invoke a non-scientific, non-physical explanation
00:30:18.080 | is simply to lie down in a cowardly way and say,
00:30:21.720 | "We can't solve it, so we're going to invoke magic."
00:30:24.440 | Don't let's do that.
00:30:25.280 | Let's say we need better science.
00:30:26.600 | We need more science.
00:30:27.820 | It may be that the science will never do it.
00:30:30.560 | It may be that we will never actually understand
00:30:32.560 | everything and that's okay, but let's keep working on it.
00:30:37.560 | - A challenging question there is,
00:30:41.000 | do you think science can lead us astray
00:30:43.000 | in terms of the humbleness?
00:30:44.240 | So there's some aspect of science,
00:30:47.240 | maybe it's the aspect of scientists and not science,
00:30:51.360 | but of sort of a mix of ego and confidence
00:30:56.280 | that can lead us astray in terms of discovering
00:31:01.640 | some of the big open questions about the universe.
00:31:05.280 | - I think that's right.
00:31:06.120 | I mean, there are arrogant people in any walk of life
00:31:09.240 | and scientists are no exception to that.
00:31:11.080 | And so there are arrogant scientists
00:31:13.160 | who think we've solved everything.
00:31:14.200 | Of course we haven't.
00:31:15.440 | So humility is a proper stance for a scientist.
00:31:18.640 | I mean, it's a proper working stance
00:31:20.120 | because it encourages further work.
00:31:23.980 | But in a way to resort to a supernatural explanation
00:31:28.440 | is a kind of arrogance because it's saying,
00:31:30.520 | well, we don't understand it scientifically,
00:31:32.120 | therefore the non-scientific religious
00:31:37.120 | supernatural explanation must be the right one.
00:31:39.720 | That's arrogant.
00:31:40.880 | What is humble is to say we don't know
00:31:43.120 | and we need to work further on it.
00:31:46.400 | - So maybe if I could psychoanalyze you for a second,
00:31:50.240 | you have at times been just slightly frustrated
00:31:54.080 | with people who have a supernatural.
00:31:59.640 | Has that changed over the years?
00:32:01.400 | Have you become, like, how do people
00:32:03.840 | that kind of have seek supernatural explanations,
00:32:07.280 | how do you see those people as human beings?
00:32:12.280 | Do you see them as dishonest?
00:32:14.040 | Do you see them as sort of ignorant?
00:32:19.040 | Do you see them as, I don't know,
00:32:21.680 | like how do you think of those people?
00:32:23.400 | - Certainly not dishonest.
00:32:25.400 | And I mean, obviously many of them are perfectly nice people
00:32:28.520 | so I don't sort of despise them in that sense.
00:32:31.000 | I think it's often a misunderstanding
00:32:35.880 | that people will jump from the admission
00:32:40.880 | that we don't understand something.
00:32:43.960 | They will jump straight to what they think of
00:32:46.440 | as an alternative explanation,
00:32:48.000 | which is the supernatural one, which is not an alternative.
00:32:50.640 | It's a non-explanation.
00:32:52.000 | Instead of jumping to the conclusion
00:32:55.280 | that science needs more work,
00:32:57.520 | that we need to actually do some better science.
00:33:00.880 | So I don't have, I mean,
00:33:04.080 | personal antipathy towards such people.
00:33:08.280 | I just think they're misguided.
00:33:10.800 | - So what about this really interesting space
00:33:12.800 | that I have trouble with?
00:33:14.560 | So religion I have a better grasp on,
00:33:16.960 | but there's large communities, like you said,
00:33:19.760 | Flat Earth community that I've recently,
00:33:23.680 | because I've made a few jokes about it,
00:33:26.680 | I saw that there's, I've noticed that there's people
00:33:29.440 | that take it quite seriously.
00:33:31.000 | So there's this bigger world of conspiracy theorists,
00:33:36.640 | which is a kind of, I mean, there's elements of it
00:33:40.840 | that are religious as well,
00:33:43.680 | but I think they're also scientific.
00:33:46.120 | So the basic credo of a conspiracy theorist
00:33:50.280 | is to question everything,
00:33:53.760 | which is also the credo of a good scientist, I would say.
00:33:57.920 | So what do you make of this?
00:33:59.960 | - I mean, I think it's probably too easy to say
00:34:04.080 | that by labeling something conspiracy,
00:34:06.960 | you therefore dismiss it.
00:34:08.080 | I mean, occasionally conspiracies are right.
00:34:10.680 | And so we shouldn't dismiss conspiracy theories out of hand.
00:34:14.760 | We should examine them on their own merits.
00:34:17.280 | Flat Earthism is obvious nonsense.
00:34:19.280 | We don't have to examine that much further.
00:34:21.720 | But I mean, there may be other conspiracy theories
00:34:25.640 | which are actually right.
00:34:27.200 | - So I've grew up in the Soviet Union,
00:34:29.560 | so I understand, you know, the space race
00:34:31.880 | was very influential for me on both sides of the coin.
00:34:34.800 | You know, there's a conspiracy theory
00:34:37.880 | that we never went to the moon, right?
00:34:40.200 | And it's like, I can understand it,
00:34:45.200 | and it's very difficult to rigorously scientifically show
00:34:47.880 | one way or the other.
00:34:49.560 | It's just, you have to use some of the human intuition
00:34:52.680 | about who would have to lie,
00:34:54.000 | who would have to work together.
00:34:55.400 | And it's clear that that's very unlikely.
00:34:59.800 | Behind that is my general intuition
00:35:01.760 | that most people in this world are good.
00:35:04.280 | You know, in order to really put together
00:35:06.040 | some conspiracy theories,
00:35:07.480 | there has to be a large number of people working together
00:35:11.400 | and essentially being dishonest.
00:35:13.280 | - Yes, which is improbable.
00:35:14.840 | The sheer number who would have to be in on this conspiracy
00:35:18.600 | and the sheer detail, the attention to detail
00:35:21.600 | they'd have had to have had and so on.
00:35:23.680 | I'd also worry about the motive,
00:35:26.080 | and why would anyone want to suggest that it didn't happen?
00:35:29.200 | What's the...
00:35:30.040 | Why is it so hard to believe?
00:35:33.120 | I mean, the physics of it, the mathematics of it,
00:35:36.120 | the idea of computing orbits and trajectories and things,
00:35:40.600 | it all works mathematically.
00:35:42.400 | Why wouldn't you believe it?
00:35:44.200 | - It's a psychology question
00:35:45.320 | because there's something really pleasant
00:35:47.280 | about pointing out that the emperor has no clothes
00:35:52.280 | when everybody...
00:35:53.800 | Thinking outside the box
00:35:56.880 | and coming up with a true answer
00:35:58.480 | where everybody else is deluded.
00:36:00.000 | There's something...
00:36:00.840 | I mean, I have that for science, right?
00:36:02.880 | You want to prove the entire scientific community wrong.
00:36:05.240 | That's the whole...
00:36:06.080 | - No, that's right.
00:36:07.120 | And of course, historically,
00:36:09.680 | lone geniuses have come out right sometimes.
00:36:12.480 | But often people who think they're a lone genius
00:36:15.240 | have much more often turned out not to.
00:36:17.600 | So you have to judge each case on its merits.
00:36:20.240 | The mere fact that you're a maverick,
00:36:21.720 | the mere fact that you're going against the current tide
00:36:25.960 | doesn't make you right.
00:36:27.120 | You've got to show you're right by looking at the evidence.
00:36:29.880 | - So because you focus so much on religion
00:36:32.520 | and disassembled a lot of ideas there,
00:36:34.360 | and I was wondering if you have ideas
00:36:37.560 | about conspiracy theory groups,
00:36:41.040 | because it's such a prevalent,
00:36:42.200 | even reaching into presidential politics and so on.
00:36:46.240 | It seems like it's a very large communities
00:36:48.400 | that believe different kinds of conspiracy theories.
00:36:50.840 | Is there some connection there
00:36:52.280 | to your thinking on religion?
00:36:55.080 | - It is curious.
00:36:55.960 | It's a matter...
00:36:56.800 | It's an obviously difficult thing.
00:36:58.520 | I don't understand why people believe things
00:37:02.920 | that are clearly nonsense like, well, flat Earth,
00:37:05.800 | and also the conspiracy about not landing on the moon
00:37:08.600 | or that the United States engineered 9/11,
00:37:13.600 | that kind of thing.
00:37:15.640 | - So it's not clearly nonsense,
00:37:18.640 | it's extremely unlikely.
00:37:20.920 | - Okay, it's extremely unlikely.
00:37:22.520 | Religion is a bit different
00:37:25.040 | because it's passed down from generation to generation.
00:37:27.640 | So many of the people who are religious
00:37:29.600 | got it from their parents,
00:37:31.880 | who got it from their parents,
00:37:32.800 | who got it from their parents,
00:37:33.760 | and childhood indoctrination is a very powerful force.
00:37:38.760 | But these things like the 9/11 conspiracy theory,
00:37:43.880 | the Kennedy assassination conspiracy theory,
00:37:47.800 | the man on the moon conspiracy theory,
00:37:50.560 | these are not childhood indoctrination.
00:37:52.320 | These are presumably dreamed up by somebody
00:37:56.840 | who then tells somebody else,
00:37:58.640 | who then wants to believe it.
00:38:01.000 | And I don't know why people are so eager
00:38:04.600 | to fall in line with just some person
00:38:07.880 | that they happen to read or meet who spins some yarn.
00:38:11.480 | I can kind of understand why they believe
00:38:14.240 | what their parents and teachers told them
00:38:16.360 | when they were very tiny
00:38:18.160 | and not capable of critical thinking for themselves.
00:38:21.440 | So I sort of get why the great religions of the world
00:38:25.160 | like Catholicism and Islam go on persisting.
00:38:28.880 | It's because of childhood indoctrination.
00:38:31.240 | But that's not true of flat-earthers.
00:38:33.800 | And sure enough, flat-earthism is a very minority cult.
00:38:37.560 | - Way larger than I ever realized.
00:38:39.880 | - Well, yes, I know.
00:38:40.720 | - But so that's a really clean idea.
00:38:42.240 | And you've articulated that in your new book
00:38:43.920 | "Outgrowing God,"
00:38:45.600 | and in "The God Delusion": it's the early indoctrination.
00:38:49.120 | That's really interesting.
00:38:50.280 | You can get away with a lot of out there ideas
00:38:53.400 | in terms of religious texts
00:38:56.040 | if the age at which you convey those ideas at first
00:39:01.040 | is a young age.
00:39:02.780 | So indoctrination is sort of an essential element
00:39:06.240 | of propagation of religion.
00:39:08.140 | So let me ask on the morality side
00:39:12.480 | in the books that I mentioned,
00:39:13.720 | "God Delusion" and "Outgrown God",
00:39:15.320 | you described that human beings
00:39:17.520 | don't need religion to be moral.
00:39:20.060 | So from an engineering perspective,
00:39:21.660 | we wanna engineer morality into AI systems.
00:39:26.440 | So in general, where do you think morals come from
00:39:30.760 | in humans?
00:39:31.840 | - A very complicated and interesting question.
00:39:36.680 | It's clear to me that the moral standards,
00:39:40.680 | the moral values of our civilization
00:39:44.680 | changes as the decades go by,
00:39:49.600 | certainly as the centuries go by,
00:39:51.040 | even as the decades go by.
00:39:53.320 | And we in the 21st century are quite clearly labeled
00:39:58.320 | 21st century people in terms of our moral values.
00:40:03.320 | There's a spread.
00:40:05.360 | I mean, some of us are a little bit more ruthless,
00:40:08.080 | some of us more conservative,
00:40:09.160 | some of us more liberal and so on.
00:40:11.360 | But we all subscribe to pretty much the same views
00:40:15.940 | when you compare us with say 18th century,
00:40:19.540 | 17th century people,
00:40:21.160 | even 19th century, 20th century people.
00:40:23.320 | So we're much less racist,
00:40:27.880 | we're much less sexist and so on than we used to be.
00:40:30.520 | Some people are still racist and some are still sexist,
00:40:33.260 | but the spread has shifted.
00:40:35.480 | The Gaussian distribution has moved
00:40:37.780 | and moves steadily as the centuries go by.
00:40:42.320 | And that is the most powerful influence I can see
00:40:48.160 | on our moral values.
00:40:50.520 | And that doesn't have anything to do with religion.
00:40:53.080 | I mean, the religion, sorry,
00:40:55.600 | the morals of the Old Testament are Bronze Age morals.
00:41:00.600 | They're deplorable
00:41:02.400 | and they are to be understood
00:41:06.240 | in terms of the people in the desert
00:41:08.400 | who made them up at the time.
00:41:10.360 | And so human sacrifice,
00:41:11.880 | an eye for an eye and a tooth for a tooth,
00:41:15.560 | petty revenge, killing people for breaking the Sabbath,
00:41:19.280 | all that kind of thing, inconceivable now.
00:41:23.720 | - So at some point, religious texts may have in part
00:41:26.840 | reflected that Gaussian distribution at that time.
00:41:30.040 | - I'm sure they did, I'm sure they always reflect that, yes.
00:41:32.120 | - And then now, but the sort of almost like the meme
00:41:35.920 | as you describe it of ideas moves much faster
00:41:40.100 | than religious texts do, than you religious people.
00:41:42.960 | - Basing your morals on religious texts,
00:41:45.200 | which were written millennia ago
00:41:47.120 | is not a great way to proceed.
00:41:51.560 | I think that's pretty clear.
00:41:53.400 | So not only should we not get our morals from such texts,
00:41:58.400 | but we don't, we quite clearly don't.
00:42:01.280 | If we did, then we'd be discriminating against women
00:42:04.560 | and we'd be racist, we'd be killing homosexuals and so on.
00:42:09.480 | So we don't and we shouldn't.
00:42:14.520 | Now, of course, it's possible to use
00:42:18.080 | your 21st century standards of morality
00:42:20.520 | and you can look at the Bible and you can cherry pick
00:42:23.560 | particular verses which conform to our modern morality
00:42:27.640 | and you'll find that Jesus says some pretty nice things,
00:42:30.480 | which is great, but you're using your 21st century morality
00:42:35.480 | to decide which verses to pick and which verses to reject.
00:42:39.040 | And so why not cut out the middleman of the Bible
00:42:42.760 | and go straight to the 21st century morality,
00:42:46.400 | which is where that comes from.
00:42:49.680 | It's a much more complicated question.
00:42:51.200 | Why is it that morality, moral values change
00:42:55.520 | as the centuries go by?
00:42:56.520 | They undoubtedly do.
00:42:58.240 | And it's a very interesting question to ask why.
00:43:00.480 | It's another example of cultural evolution,
00:43:03.280 | just as technology progresses,
00:43:07.400 | so moral values progress for probably very different reasons.
00:43:10.520 | - But it's interesting if the direction
00:43:12.720 | in which that progress is happening
00:43:14.360 | has some evolutionary value or if it's merely a drift
00:43:17.480 | that can go into any direction.
00:43:19.320 | - I'm not sure it's any direction
00:43:20.440 | and I'm not sure it's evolutionarily valuable.
00:43:22.600 | What it is is progressive in the sense that each step
00:43:26.880 | is a step in the same direction as the previous step.
00:43:30.080 | So it becomes more gentle, more decent,
00:43:33.160 | by modern standards, more liberal, less violent.
00:43:37.440 | - See, but more decent, I think you're using terms
00:43:40.440 | and interpreting everything in the context
00:43:42.680 | of the 21st century.
00:43:44.320 | Because Genghis Khan would probably say
00:43:46.760 | that this is not more decent because we're now,
00:43:49.800 | you know, there's a lot of weak members of society
00:43:52.160 | that we're not murdering.
00:43:53.000 | - Yes, and I was careful to say
00:43:54.800 | by the standards of the 21st century,
00:43:56.840 | by our standards, if we with hindsight look back at history,
00:44:00.880 | what we see is a trend in the direction towards us,
00:44:03.920 | towards our present value system.
00:44:06.760 | - For us, we see progress, but it's an open question
00:44:10.680 | whether that won't, you know,
00:44:13.000 | I don't see necessarily why we can never return
00:44:15.600 | to Genghis Khan times.
00:44:16.440 | - Well, we could.
00:44:18.120 | I suspect we won't.
00:44:19.640 | But if you look at the history of moral values
00:44:24.640 | over the centuries, it is in a progressive,
00:44:27.360 | I use the word progressive not in a value judgment sense,
00:44:31.200 | in the sense of a transitive sense.
00:44:33.920 | Each step is the same direction of the previous step.
00:44:37.480 | So things like we don't derive entertainment
00:44:42.480 | from torturing cats.
00:44:45.360 | We don't derive entertainment from,
00:44:48.400 | like the Romans did in the Colosseum from that stage.
00:44:52.640 | - Or rather we suppress the desire to get,
00:44:57.640 | I mean, to have pleasure.
00:44:58.960 | It's probably in us somewhere.
00:45:00.320 | So there's a bunch of parts of our brain,
00:45:02.720 | one that probably, you know, limbic system
00:45:05.400 | that wants certain pleasures.
00:45:06.840 | And that's-
00:45:07.680 | - I mean, I wouldn't have said that,
00:45:10.760 | but you're at liberty to think that.
00:45:13.840 | - Well, there's Dan Carlin of Hardcore History
00:45:17.560 | that has a really nice explanation
00:45:18.960 | of how we've enjoyed watching the torture of people,
00:45:22.120 | the fighting of people, just the torture,
00:45:23.760 | the suffering of people throughout history
00:45:25.800 | as entertainment until quite recently.
00:45:29.920 | And now everything we do with sports,
00:45:32.480 | we're kind of channeling that feeling into something else.
00:45:35.680 | I mean, there is some dark aspects of human nature
00:45:39.000 | that are underneath everything.
00:45:41.160 | And I do hope this like higher level software we've built
00:45:45.540 | will keep us at bay.
00:45:46.920 | - Yes.
00:45:47.760 | - I'm also Jewish and have history
00:45:49.880 | with the Soviet Union and the Holocaust.
00:45:53.480 | And I clearly remember that some of the darker aspects
00:45:57.840 | of human nature creeped up there.
00:45:59.240 | - They do.
00:46:00.080 | - And there have been steps backwards admittedly,
00:46:03.520 | and the Holocaust is an obvious one.
00:46:05.000 | But if you take a broad view of history,
00:46:07.160 | it's in the same direction.
00:46:09.560 | - So Pamela McCorduck in "Machines Who Think"
00:46:14.920 | has written that AI began with an ancient wish
00:46:17.400 | to forge the gods.
00:46:19.580 | Do you see, it's a poetic description, I suppose,
00:46:23.000 | but do you see a connection between our civilizations,
00:46:27.040 | historic desire to create gods, to create religions,
00:46:30.120 | and our modern desire to create technology,
00:46:33.840 | an intelligent technology?
00:46:35.560 | - I suppose there's a link between an ancient desire
00:46:40.360 | to explain away mystery and science,
00:46:45.360 | but artificial intelligence creating gods,
00:46:50.520 | creating new gods.
00:46:51.860 | I mean, I forget, I read somewhere a somewhat facetious
00:46:56.920 | paper which said that we have a new god,
00:46:59.000 | it's called Google, and we pray to it, and we worship it,
00:47:03.040 | and we ask its advice like an oracle and so on.
00:47:05.660 | That's fun.
00:47:07.680 | - You don't see that, you see that as a fun statement,
00:47:11.040 | a facetious statement, you don't see that
00:47:13.320 | as a kind of truth of us creating things
00:47:15.480 | that are more powerful than ourselves,
00:47:17.480 | a natural sort of-
00:47:18.760 | - It has a kind of poetic resonance to it, which I get.
00:47:22.320 | - But-
00:47:23.160 | - I wouldn't-
00:47:24.000 | - But not a scientific one.
00:47:25.720 | - I wouldn't have bothered to make the point myself,
00:47:27.480 | let's put it that way.
00:47:28.320 | - All right.
00:47:29.140 | So you don't think AI will become a new religion,
00:47:33.640 | a new god, like Google?
00:47:34.680 | - Well, yes, I mean, I can see that
00:47:36.800 | the future of intelligent machines,
00:47:41.160 | or indeed intelligent aliens from outer space,
00:47:44.520 | might yield beings that we would regard as gods
00:47:48.160 | in the sense that they are so superior to us
00:47:51.120 | that we might as well worship them.
00:47:53.360 | That's highly plausible, I think.
00:47:56.600 | But I see a very fundamental distinction
00:48:00.000 | between a god who is simply defined
00:48:03.080 | as something very, very powerful and intelligent
00:48:05.520 | on the one hand, and a god who doesn't need explaining
00:48:09.380 | by a progressive step-by-step process like evolution
00:48:13.800 | or like engineering design.
00:48:17.720 | So the difference, so suppose we did meet an alien
00:48:21.600 | from outer space who was marvelously,
00:48:25.640 | magnificently more intelligent than us,
00:48:27.480 | and we would sort of worship it for that reason.
00:48:30.640 | Nevertheless, it would not be a god
00:48:32.920 | in the very important sense that it did not just happen
00:48:36.880 | to be there, like God is supposed to.
00:48:40.660 | It must have come about by a gradual, step-by-step,
00:48:44.640 | incremental, progressive process,
00:48:47.560 | presumably like Darwinian evolution.
00:48:49.760 | So there's all the difference in the world
00:48:52.600 | between those two.
00:48:53.800 | Intelligence, design, comes into the universe late
00:48:58.800 | as a product of a progressive evolutionary process
00:49:02.720 | or a progressive engineering design process.
00:49:06.440 | - So most of the work is done through this slow-moving--
00:49:09.440 | - Exactly. - Progress.
00:49:10.720 | - Exactly.
00:49:11.560 | - Yeah. (chuckles)
00:49:17.680 | Yeah, but there's still this desire to get answers
00:49:20.480 | to the why question that if the world is a simulation,
00:49:24.560 | if we're living in a simulation,
00:49:25.840 | that there's a programmer-like creature
00:49:29.000 | that we can ask questions of.
00:49:30.240 | - Okay, well, let's pursue the idea
00:49:32.320 | that we're living in a simulation,
00:49:33.720 | which is not totally ridiculous, by the way.
00:49:36.140 | - There we go.
00:49:38.080 | - Then you still need to explain the programmer.
00:49:44.480 | The programmer had to come into existence
00:49:46.560 | by some, even if we're in a simulation,
00:49:49.920 | the programmer must have evolved.
00:49:51.600 | Or if he's in a sort of--
00:49:54.800 | - Or she. - Or she.
00:49:56.160 | If she's in a meta-simulation,
00:49:58.700 | then the meta-meta-programmer must have evolved
00:50:02.040 | by a gradual process.
00:50:03.560 | You can't escape that.
00:50:05.200 | Fundamentally, you've got to come back
00:50:06.940 | to a gradual, incremental process of explanation
00:50:11.200 | to start with.
00:50:12.140 | - There's no shortcuts in this world.
00:50:16.080 | - No, exactly.
00:50:17.440 | - But maybe to linger on that point about the simulation,
00:50:21.040 | I basically talk to,
00:50:23.080 | and bore the heck out of, everybody
00:50:25.200 | by asking this question
00:50:28.280 | about whether we live in a simulation.
00:50:31.280 | First, do you think we live in a simulation?
00:50:34.120 | Second, do you think it's an interesting thought experiment?
00:50:37.320 | - It's certainly an interesting thought experiment.
00:50:39.600 | I first met it in a science fiction novel
00:50:42.400 | by Daniel Galouye called "Counterfeit World,"
00:50:47.400 | in which it's all about,
00:50:52.440 | I mean, our heroes are running a gigantic computer
00:50:55.200 | which simulates the world,
00:50:57.360 | and something goes wrong,
00:51:00.240 | and so one of them has to go down into the simulated world
00:51:02.480 | in order to fix it.
00:51:03.680 | And then the denouement of the thing,
00:51:07.320 | the climax to the novel is that they discover
00:51:09.240 | that they themselves are in another simulation
00:51:11.040 | at a higher level.
00:51:12.400 | So I was intrigued by this,
00:51:13.680 | and I love others of Daniel Galouye's
00:51:17.000 | science fiction novels.
00:51:18.760 | Then it was revived seriously by Nick Bostrom.
00:51:23.400 | - Bostrom, talking to him in an hour.
00:51:25.160 | - Okay.
00:51:26.000 | (laughing)
00:51:27.360 | And he goes further,
00:51:29.480 | not just to treat it as a science fiction speculation,
00:51:32.240 | he actually thinks it's positively likely.
00:51:34.880 | - Yes.
00:51:35.720 | - I mean, he thinks it's very likely, actually.
00:51:37.240 | - Well, he makes a probabilistic argument
00:51:39.400 | which you can use to come up with
00:51:41.160 | very interesting conclusions about
00:51:42.880 | the nature of this universe.
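A brief aside for context: the probabilistic argument mentioned here comes from Bostrom's 2003 paper "Are You Living in a Computer Simulation?". The sketch below is a paraphrase of that published argument, not anything stated in the conversation itself, and the notation only roughly follows his paper.

```latex
% Rough sketch of the core of Bostrom's simulation argument (2003 paper),
% added for context; it is not something stated in this conversation.
% f_P     : fraction of civilizations that reach a "posthuman" stage capable
%           of running ancestor simulations
% \bar{N} : average number of ancestor simulations such a civilization runs
\[
  f_{\mathrm{sim}} \;=\; \frac{f_P \, \bar{N}}{f_P \, \bar{N} + 1}
\]
% Unless f_P or \bar{N} is close to zero, f_sim is close to one. That is why
% Bostrom argues at least one of three claims must hold: almost no civilizations
% reach the posthuman stage, almost none of those run ancestor simulations,
% or we are almost certainly living in a simulation.
```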
00:51:44.400 | - I mean, he thinks that we're in a simulation
00:51:48.720 | done by, so to speak, our descendants of the future.
00:51:51.320 | But it's still a product of evolution.
00:51:54.880 | It's still ultimately going to be a product of evolution,
00:51:57.000 | even though the super intelligent people of the future
00:52:01.800 | have created our world,
00:52:05.040 | and you and I are just a simulation,
00:52:06.600 | and this table is a simulation, and so on.
00:52:09.680 | I don't actually, in my heart of hearts, believe it,
00:52:13.120 | but I like his argument.
00:52:15.520 | - Well, so the interesting thing is,
00:52:18.080 | that I agree with you, but the interesting thing to me,
00:52:21.120 | if I were to say, if we're living in a simulation,
00:52:23.600 | that in that simulation, to make it work,
00:52:26.560 | you still have to do everything gradually,
00:52:28.920 | just like you said.
00:52:30.200 | That even though it's programmed,
00:52:31.880 | I don't think there could be miracles.
00:52:33.520 | Otherwise, it's--
00:52:34.360 | - Well, no, I mean, the programmer,
00:52:36.240 | the higher, the upper ones, have to have evolved gradually.
00:52:39.840 | However, the simulation they create could be instantaneous.
00:52:43.040 | I mean, they could be switched on,
00:52:44.440 | and we come into the world with fabricated memories.
00:52:48.200 | - True, but what I'm trying to convey,
00:52:50.200 | so you're saying the broader statement,
00:52:52.800 | but I'm saying from an engineering perspective,
00:52:55.400 | both the programmer has to be slowly evolved,
00:52:58.920 | and the simulation, because it's--
00:53:01.160 | - Oh, yeah.
00:53:02.000 | - From an engineering perspective--
00:53:03.200 | - Oh, yeah, it takes a long time to write a program.
00:53:05.600 | - No, like, I just don't think you can create
00:53:07.880 | the universe in a snap.
00:53:09.920 | I think you have to grow it.
00:53:11.600 | - Okay, well, that's a good point.
00:53:15.760 | That's an arguable point.
00:53:17.040 | By the way, I have thought about using the Nick Bostrom idea
00:53:22.040 | to solve the riddle we were talking about earlier,
00:53:27.400 | of why the human brain can achieve so much.
00:53:32.160 | I thought of this when my then 100-year-old mother
00:53:36.160 | was marveling at what I could do with a smartphone,
00:53:39.400 | and I could look up anything in the encyclopedia,
00:53:42.800 | or I could play her music that she liked, and so on.
00:53:44.680 | She said, "But it's all in that tiny little phone."
00:53:46.840 | No, it's out there, it's in the cloud.
00:53:49.600 | And maybe most of what we do is in a cloud.
00:53:52.600 | So maybe if we are a simulation,
00:53:56.200 | then all the power that we think is in our skull,
00:54:00.280 | it actually may be like the power
00:54:01.760 | that we think is in the iPhone,
00:54:04.160 | but is that actually out there in a--
00:54:06.280 | - It's an interface to something else.
00:54:07.800 | - Yes.
00:54:08.640 | - I mean, that's what some people suggest,
00:54:10.520 | including Roger Penrose with panpsychism,
00:54:12.760 | that consciousness is somehow a fundamental part of physics,
00:54:15.920 | that it doesn't have to actually all reside inside a brain.
00:54:19.240 | - But Roger thinks it does reside in the skull,
00:54:21.720 | whereas I'm suggesting that it doesn't,
00:54:24.040 | that there's a cloud.
00:54:27.840 | - That'd be a fascinating notion.
00:54:31.280 | On a small tangent,
00:54:33.080 | are you familiar with the work of Donald Hoffman, I guess?
00:54:37.680 | Maybe I'm not saying his name correctly,
00:54:40.320 | but forget the name, it's
00:54:43.560 | the idea that there's a difference
00:54:45.200 | between reality and perception.
00:54:47.400 | So like we biological organisms perceive the world
00:54:52.400 | in order for the natural selection process
00:54:54.760 | to be able to survive and so on,
00:54:56.400 | but that doesn't mean that our perception
00:54:59.360 | actually reflects the fundamental reality,
00:55:02.520 | the physical reality underneath.
00:55:04.000 | - Well, I do think that,
00:55:06.160 | although it reflects the fundamental reality,
00:55:09.880 | I do believe there is a fundamental reality,
00:55:12.080 | I do think that our perception is constructive
00:55:17.960 | in the sense that we construct in our minds
00:55:22.040 | a model of what we're seeing.
00:55:24.360 | And this is really the view of people
00:55:27.440 | who work on visual illusions, like Richard Gregory,
00:55:30.720 | who point out that something like a Necker cube,
00:55:33.800 | which is a two-dimensional picture
00:55:37.720 | of a cube on a sheet of paper,
00:55:41.000 | is seen as a three-dimensional cube,
00:55:43.040 | and it flips from one orientation to another
00:55:45.360 | at regular intervals.
00:55:48.440 | What's going on is that the brain is constructing a cube,
00:55:52.680 | but the sense data are compatible
00:55:55.040 | with two alternative cubes.
00:55:57.120 | And so rather than stick with one of them,
00:55:58.640 | it alternates between them.
00:56:00.440 | I think that's just a model for what we do all the time
00:56:04.640 | when we see a table, when we see a person,
00:56:06.640 | when we see anything,
00:56:08.120 | we're using the sense data to construct
00:56:12.200 | or make use of a perhaps previously constructed model.
00:56:15.760 | I noticed this when I meet somebody
00:56:20.800 | who actually is, say, a friend of mine,
00:56:24.160 | but until I kind of realize that it is him,
00:56:28.480 | he looks different.
00:56:29.680 | And then when I finally clock that it's him,
00:56:32.960 | his features switch like a Necker cube
00:56:35.760 | into the familiar form.
00:56:37.440 | As it were, I've taken his face
00:56:39.600 | out of the filing cabinet inside and grafted it onto,
00:56:44.080 | or used the sense data to invoke it.
00:56:48.600 | - Yeah, we do some kind of miraculous compression
00:56:51.240 | on this whole thing to be able to filter out
00:56:53.280 | most of the sense data and make sense of it.
00:56:56.040 | That's just the magical thing that we do.
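As a side note, the alternation Dawkins describes can be captured in a tiny toy model. The sketch below is my own illustration, not anything from Dawkins or Richard Gregory: two interpretations of the same sense data compete through mutual inhibition, and slow fatigue of whichever one is currently winning eventually hands the percept to the other, so the "cube" flips at fairly regular intervals. All names and parameters here are invented for the illustration.

```python
# Toy sketch of bistable perception (illustrative only): two readings of the
# same sense data inhibit each other, and the winner slowly fatigues, so the
# percept flips back and forth like a Necker cube.

def simulate_necker(steps=8000, dt=0.01):
    a1, a2 = 1.0, 0.9      # activity of interpretation 1 and 2
    f1, f2 = 0.0, 0.0      # slow fatigue (adaptation) of each interpretation
    inhibition, gain, tau = 2.0, 2.0, 4.0  # rival suppression, fatigue gain, fatigue time constant
    percept, flip_times = 1, []
    for step in range(steps):
        # each interpretation is driven by the same input (1.0), suppressed by
        # its rival's activity and by its own accumulated fatigue
        drive1 = max(0.0, 1.0 - inhibition * a2 - f1)
        drive2 = max(0.0, 1.0 - inhibition * a1 - f2)
        a1 += dt * (drive1 - a1)
        a2 += dt * (drive2 - a2)
        # fatigue slowly tracks activity, undermining the current winner
        f1 += dt * (gain * a1 - f1) / tau
        f2 += dt * (gain * a2 - f2) / tau
        current = 1 if a1 > a2 else 2
        if current != percept:                 # the percept has flipped
            flip_times.append(round(step * dt, 2))
            percept = current
    return flip_times

print(simulate_necker())  # flip times arrive at roughly regular intervals
```

Mutual inhibition plus slow adaptation is a common ingredient pair in computational models of perceptual rivalry; here it is stripped to a few lines just to show how two equally valid interpretations of the same data can end up taking turns.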
00:56:58.720 | So you've written many amazing books,
00:57:03.600 | but let me ask what books, technical or fiction
00:57:08.440 | or philosophical had a big impact on your own life?
00:57:11.400 | What books would you recommend people consider reading
00:57:17.040 | in their own intellectual journey?
00:57:18.800 | - Darwin, of course.
00:57:22.000 | - The original? I'm actually
00:57:25.200 | ashamed to say I've never read Darwin in the original.
00:57:28.880 | - He's astonishingly prescient
00:57:30.520 | because considering he was writing
00:57:32.040 | in the middle of the 19th century,
00:57:34.680 | Michael Ghiselin said he was working
00:57:37.160 | a hundred years ahead of his time.
00:57:39.040 | Everything except genetics is amazingly right
00:57:42.680 | and amazingly far ahead of his time.
00:57:46.480 | And of course you need to read the updatings
00:57:51.000 | that have happened since his time as well.
00:57:54.680 | I mean, he would be astonished by,
00:57:56.480 | well, let alone Watson and Crick, of course,
00:58:01.440 | but he'd be astonished by Mendelian genetics as well.
00:58:04.240 | - Yeah, it'd be fascinating to see
00:58:06.920 | what he would think about DNA.
00:58:08.520 | - I mean, yes, it would, because in many ways
00:58:11.520 | it clears up what appeared in his time to be a riddle.
00:58:16.520 | The digital nature of genetics clears up
00:58:21.240 | what was a problem, what was a big problem.
00:58:24.160 | Gosh, there's so much that I could think of.
00:58:27.240 | I can't really-
00:58:28.560 | - Is there something outside sort of more fiction?
00:58:32.320 | Is there, when you were young,
00:58:34.080 | were there books that, kind of
00:58:35.480 | outside the realm of science and religion,
00:58:38.760 | just sparked your journey?
00:58:40.520 | - Yes, well, actually, I suppose I could say
00:58:43.640 | that I've learned some science from science fiction.
00:58:48.360 | I mentioned Daniel Galouye, and that's one example,
00:58:54.800 | but another of his novels, called "Dark Universe,"
00:58:59.240 | is not terribly well-known,
00:59:00.360 | but it's a very, very nice science fiction story.
00:59:02.520 | It's about a world of perpetual darkness.
00:59:06.320 | And we're not told at the beginning of the book
00:59:08.160 | why these people are in darkness.
00:59:09.600 | They stumble around in some kind of underground world
00:59:13.320 | of caverns and passages,
00:59:15.560 | using echolocation like bats and whales to get around.
00:59:20.360 | And they've adapted, presumably by Darwinian means,
00:59:24.600 | to survive in perpetual, total darkness.
00:59:28.240 | But what's interesting is that their mythology,
00:59:30.920 | their religion, has echoes of Christianity,
00:59:35.920 | but it's based on light.
00:59:38.160 | And so there's been a fall from a paradise world
00:59:43.160 | that once existed where light reigned supreme.
00:59:46.960 | And because of the sin of mankind, light banished them.
00:59:51.960 | So then they no longer are in light's presence,
00:59:54.440 | but light survives in the form of mythology
00:59:58.280 | and in the form of sayings like,
01:00:00.320 | "The great light almighty,"
01:00:01.440 | "Oh, for light's sake, don't do that,"
01:00:03.360 | and "I hear what you mean"
01:00:06.080 | rather than "I see what you mean."
01:00:07.880 | - So some of the same religious elements are present
01:00:10.280 | in this other totally kind of absurd different form.
01:00:13.320 | - Yes.
01:00:14.160 | And so it's a wonderful, I wouldn't call it satire
01:00:16.800 | because it's too good-natured for that,
01:00:17.640 | but a wonderful parable about Christianity
01:00:22.640 | and the theological doctrine of the Fall.
01:00:26.040 | So I find that kind of science fiction immensely stimulating.
01:00:31.480 | Fred Hoyle's "The Black Cloud."
01:00:33.280 | Oh, by the way, anything by Arthur C. Clarke
01:00:35.000 | I find very, very wonderful too.
01:00:37.360 | Fred Hoyle's "The Black Cloud,"
01:00:39.600 | his first science fiction novel where he,
01:00:44.600 | well, I learned a lot of science from that.
01:00:48.880 | It suffers from an obnoxious hero, unfortunately,
01:00:51.640 | but apart from that, you learn a lot of science from it.
01:00:54.920 | Another of his novels, "A for Andromeda,"
01:00:59.520 | the theme of which, by the way, is taken up
01:01:02.560 | in the science fiction novel "Contact"
01:01:05.720 | by Carl Sagan, another wonderful writer,
01:01:10.160 | where the idea is, again, we will not be visited
01:01:15.160 | from outer space by physical bodies.
01:01:18.520 | We will be visited, possibly, we might be visited by radio,
01:01:21.760 | but the radio signals could manipulate us
01:01:25.200 | and actually have a concrete influence on the world
01:01:30.560 | if they make us or persuade us to build a computer
01:01:35.320 | which runs their software
01:01:37.400 | so that they can then transmit their software by radio.
01:01:41.480 | And then the computer takes over the world.
01:01:44.680 | And this is the same theme in both Hoyle's book
01:01:48.080 | and Sagan's book.
01:01:49.360 | I presume, I don't know whether Sagan knew
01:01:51.240 | about Hoyle's book, probably did.
01:01:53.320 | But it's a clever idea
01:01:56.640 | that we will never be invaded by physical bodies.
01:02:03.080 | "The War of the Worlds" of H.G. Wells will never happen,
01:02:06.120 | but we could be invaded by radio signals, code,
01:02:10.160 | coded information, which is sort of like DNA.
01:02:13.400 | And we are, as I call us, survival machines for our DNA.
01:02:18.400 | So it has great resonance for me because I think of us,
01:02:25.080 | I think of bodies, physical bodies, biological bodies
01:02:29.040 | as being manipulated by coded information, DNA,
01:02:34.040 | which has come down through generations.
01:02:36.520 | - And in the space of memes,
01:02:38.320 | it doesn't have to be physical.
01:02:39.560 | It can be transmitted through the space of information.
01:02:42.680 | - Yes.
01:02:43.880 | - That's a fascinating possibility that from outer space,
01:02:47.240 | we can be infiltrated by other memes, by other ideas
01:02:51.320 | and thereby controlled in that way.
01:02:54.040 | Let me ask the last, the silliest,
01:02:56.760 | or maybe the most important question.
01:02:58.840 | What is the meaning of life?
01:03:00.920 | What gives your life fulfillment, purpose, happiness, meaning?
01:03:06.040 | - From a scientific point of view,
01:03:07.160 | the meaning of life is the propagation of DNA,
01:03:10.160 | but that's not what I feel.
01:03:11.720 | That's not the meaning of my life.
01:03:14.280 | So the meaning of my life is something
01:03:16.400 | which is probably different from yours
01:03:17.640 | and different from other people's,
01:03:18.640 | but we each make our own meaning.
01:03:21.520 | So we set up goals we want to achieve.
01:03:26.240 | We want to write a book.
01:03:27.060 | We want to do whatever it is we do, write a quartet.
01:03:32.060 | We want to win a football match.
01:03:34.900 | And these are short-term goals.
01:03:38.880 | Well, maybe even quite long-term goals,
01:03:41.240 | which are set up by our brains,
01:03:42.760 | which have goal-seeking machinery built into them.
01:03:45.900 | But what we feel, we don't feel motivated
01:03:49.280 | by the desire to pass on our DNA mostly.
01:03:53.400 | We have other goals, which can be very moving,
01:03:57.180 | very important.
01:03:59.200 | They could even be called spiritual in some cases.
01:04:02.700 | We want to understand the riddle of the universe.
01:04:06.060 | We want to understand consciousness.
01:04:07.660 | We want to understand how the brain works.
01:04:09.780 | These are all noble goals.
01:04:13.300 | Some of them can be noble goals anyway,
01:04:15.900 | and they are a far cry
01:04:18.060 | from the fundamental biological goal,
01:04:20.060 | which is the propagation of DNA.
01:04:22.380 | But the machinery that enables us
01:04:24.240 | to set up these higher level goals
01:04:28.180 | is originally programmed into us
01:04:30.960 | by natural selection of DNA.
01:04:34.240 | - The propagation of DNA.
01:04:35.660 | But what do you make of this unfortunate fact
01:04:39.920 | that we are mortal?
01:04:41.700 | Do you ponder your mortality?
01:04:43.960 | Does it make you sad?
01:04:45.740 | Does it-
01:04:47.120 | - I ponder it.
01:04:49.040 | It makes me sad that I shall have to leave
01:04:51.640 | and not see what's going to happen next.
01:04:55.120 | If there's something frightening about mortality,
01:05:00.640 | apart from sort of missing out, as I've said,
01:05:03.380 | something more deeply, darkly frightening,
01:05:07.320 | it's the idea of eternity.
01:05:09.640 | But eternity is only frightening if you're there.
01:05:11.800 | Eternity before we were born,
01:05:14.320 | billions of years before we were born,
01:05:15.980 | and we were effectively dead before we were born.
01:05:19.960 | As I think it was Mark Twain said,
01:05:21.240 | "I was dead for billions of years before I was born
01:05:24.100 | "and never suffered the smallest inconvenience."
01:05:26.360 | And that's how it's going to be after we leave.
01:05:29.480 | So I think of it as really,
01:05:31.480 | eternity is a frightening prospect.
01:05:33.760 | And so the best way to spend it
01:05:35.780 | is under a general anesthetic,
01:05:38.120 | which is what it'll be.
01:05:39.420 | - Beautifully put.
01:05:41.640 | Richard, it was a huge honor to meet you, to talk to you.
01:05:43.960 | Thank you so much for your time.
01:05:45.120 | - Thank you very much.
01:05:46.240 | - Thanks for listening to this conversation
01:05:48.680 | with Richard Dawkins.
01:05:49.860 | And thank you to our presenting sponsor, Cash App.
01:05:53.000 | Please consider supporting the podcast
01:05:54.660 | by downloading Cash App and using code LEXPODCAST.
01:05:58.560 | If you enjoy this podcast, subscribe on YouTube,
01:06:01.040 | review with Five Stars on Apple Podcast,
01:06:03.200 | support on Patreon, or simply connect with me on Twitter
01:06:06.560 | at Lex Fridman.
01:06:08.560 | And now let me leave you with some words of wisdom
01:06:10.920 | from Richard Dawkins.
01:06:12.880 | "We are going to die.
01:06:15.000 | "And that makes us the lucky ones.
01:06:17.560 | "Most people are never going to die
01:06:19.240 | "because they're never going to be born.
01:06:21.940 | "The potential people who could have been here in my place,
01:06:25.240 | "but who will in fact never see the light of day
01:06:27.840 | "outnumber the sand grains of Arabia.
01:06:31.260 | "Certainly those unborn ghosts
01:06:33.480 | "include greater poets than Keats,
01:06:35.700 | "scientists greater than Newton.
01:06:37.700 | "We know this because the set of possible people
01:06:41.260 | "allowed by our DNA so massively exceeds
01:06:44.880 | "the set of actual people.
01:06:47.000 | "In the teeth of these stupefying odds,
01:06:49.420 | "it is you and I, in our ordinariness, that are here.
01:06:54.420 | "We privileged few who won the lottery
01:06:57.540 | "of birth against all odds.
01:06:59.960 | "How dare we whine at our inevitable return
01:07:03.240 | "to that prior state from which the vast majority
01:07:06.260 | "have never stirred."
01:07:08.360 | Thank you for listening and hope to see you next time.
01:07:11.460 | (upbeat music)