
Nick Lane: Origin of Life, Evolution, Aliens, Biology, and Consciousness | Lex Fridman Podcast #318


Chapters

0:00 Introduction
1:09 Origin of life
14:56 Panspermia
20:30 What is life?
33:44 Photosynthesis
37:19 Prokaryotic vs eukaryotic cells
47:20 Sex
55:03 DNA
62:15 Violence
72:50 Human evolution
78:45 Neanderthals
82:18 Sensory inputs
93:08 Consciousness
124:41 AI and biology
154:00 Evolution
174:32 Fermi paradox
187:52 Cities
195:39 Depression
198:14 Writing
206:13 Advice for young people
213:22 Earth

Whisper Transcript

00:00:00.000 | Well, the source of energy at the origin of life
00:00:01.720 | is the reaction between carbon dioxide and hydrogen.
00:00:04.720 | And amazingly, most of these reactions are exergonic,
00:00:08.320 | which is to say they release energy.
00:00:10.760 | If you have hydrogen and CO2
00:00:13.280 | and you put them together in a Falcon tube
00:00:15.240 | and you warm it up to say 50 degrees centigrade
00:00:17.320 | and you put in a couple of catalysts and you shake it,
00:00:19.980 | nothing's gonna happen.
00:00:21.320 | But thermodynamically, that is less stable,
00:00:24.800 | two gases, hydrogen and CO2, are less stable than cells.
00:00:28.240 | What should happen is you get cells coming out.
00:00:31.220 | Why doesn't that happen? It's because of the kinetic barriers.
00:00:34.760 | That's where you need the spark.
00:00:36.360 | - The following is a conversation with Nick Lane,
00:00:40.680 | a biochemist at University College London
00:00:43.400 | and author of some of my favorite books
00:00:46.200 | on biology, science and life ever written,
00:00:49.640 | including his two most recent titled
00:00:51.440 | "Transformer: The Deep Chemistry of Life and Death"
00:00:54.540 | and "The Vital Question:
00:00:56.960 | Why Is Life the Way It Is?"
00:01:00.260 | This is the Lex Fridman Podcast.
00:01:02.320 | To support it, please check out our sponsors
00:01:04.460 | in the description.
00:01:05.760 | And now, dear friends, here's Nick Lane.
00:01:09.000 | Let's start with perhaps the most mysterious,
00:01:11.800 | the most interesting question
00:01:13.800 | that we little humans can ask of ourselves.
00:01:18.160 | How did life originate on earth?
00:01:20.640 | - You could ask anybody working on the subject
00:01:24.040 | and you'll get a different answer from all of them.
00:01:26.920 | They will be pretty passionately held opinions
00:01:30.600 | and their opinions grounded in science,
00:01:32.600 | but they're still really at this point, their opinions,
00:01:36.120 | 'cause there's so much stuff to know
00:01:37.920 | that all we can ever do is get a kind of a small slice of it
00:01:42.840 | and it's the context which matters.
00:01:44.920 | So I can give you my answer.
00:01:46.560 | My answer is from a biologist's point of view,
00:01:50.640 | that has been missing from the equation over decades,
00:01:54.760 | which is, well, what does life do on earth?
00:01:57.520 | Why is it this way?
00:01:58.640 | Why is it made of cells?
00:01:59.720 | Why is it made of carbon?
00:02:01.320 | Why is it powered by electrical charges on membranes?
00:02:06.000 | There's all these interesting questions about cells
00:02:09.040 | that if you then look to see,
00:02:10.200 | well, is there an environment on earth,
00:02:12.040 | on the early earth, 4 billion years ago,
00:02:14.240 | that kind of matches the requirements of cells?
00:02:16.720 | Well, there is one.
00:02:17.560 | There's a very obvious one.
00:02:18.480 | It's basically created whenever
00:02:20.800 | you have a wet, rocky planet,
00:02:22.240 | you get these hydrothermal vents.
00:02:24.720 | Which generate hydrogen gas in bucket loads
00:02:28.640 | and electrical charges on kind of cell-like pores
00:02:32.520 | that can drive the kind of chemistry that life does.
00:02:35.600 | So it seems so beautiful and so obvious
00:02:39.560 | that I've spent the last 10 years or more
00:02:44.000 | trying to do experiments.
00:02:45.160 | It turns out to be difficult, of course.
00:02:47.200 | Everything's more difficult
00:02:48.120 | than you ever thought it was gonna be.
00:02:49.760 | But it looks, I would say, more true rather than less true
00:02:52.760 | over that 10 year period.
00:02:53.960 | I think I have to take a step back every now and then
00:02:56.320 | and think, hang on a minute, where's this going?
00:02:59.000 | I'm happy it's going in a sensible direction.
00:03:01.960 | And I think then you have these other interesting dilemmas.
00:03:06.560 | I mean, I'm often accused of being
00:03:08.440 | too focused on life on earth.
00:03:12.040 | Too kind of narrow-minded and inward-looking, you might say.
00:03:17.040 | I'm talking about carbon.
00:03:17.880 | I'm talking about cells.
00:03:18.840 | And maybe you or plenty of people can say to me,
00:03:21.600 | ah, yeah, but life can be anything.
00:03:23.200 | I have no imagination.
00:03:24.760 | And maybe they're right.
00:03:26.080 | But unless we can say why life here is this way,
00:03:29.680 | and if those reasons are fundamental reasons
00:03:31.960 | or if they're just trivial reasons,
00:03:33.920 | then we can't answer that question.
00:03:36.120 | So I think they're fundamental reasons
00:03:38.720 | and I think we need to worry about them.
00:03:40.320 | - Yeah, there might be some deep truth
00:03:41.640 | to the puzzle here on earth
00:03:43.680 | that will resonate with other puzzles elsewhere
00:03:46.760 | that will, solving this particular puzzle
00:03:50.280 | will give us that deeper truth.
00:03:52.000 | So what, to this puzzle, you said vents,
00:03:57.000 | hydrogen, wet.
00:03:59.280 | So chemically, what is the potion here?
00:04:04.440 | How important is oxygen?
00:04:06.000 | You wrote a book about this.
00:04:07.120 | - Yeah, and I actually just came straight here
00:04:09.280 | from a conference where I was chairing a session
00:04:11.080 | on whether oxygen matters or not in the history of life.
00:04:13.840 | Of course it matters.
00:04:15.320 | But it matters most to the origin of life to be not there.
00:04:20.120 | As I see it, we have this,
00:04:21.680 | I mean, life is made of carbon, basically,
00:04:25.360 | primarily organic molecules with carbon-carbon bonds.
00:04:30.280 | And the building block, the Lego brick
00:04:32.880 | that we take out of the air or take out of the oceans
00:04:34.920 | is carbon dioxide.
00:04:36.520 | And to turn carbon dioxide into organic molecules,
00:04:39.640 | we need to strap on hydrogen.
00:04:42.000 | And so we need, and this is basically what life is doing,
00:04:45.200 | it's hydrogenating carbon dioxide.
00:04:47.440 | It's taking the hydrogen that bubbles out of the earth
00:04:49.440 | in these hydrothermal vents and it sticks it on CO2.
00:04:52.520 | And it's kind of really as simple as that.
00:04:56.160 | And actually thermodynamically,
00:04:58.440 | the thing that I find most troubling
00:05:01.000 | is that if you do these experiments in the lab,
00:05:03.800 | the molecules you get are exactly the molecules
00:05:06.000 | that we see at the heart of biochemistry
00:05:07.880 | in the heart of life.
00:05:08.920 | - Is there something to be said about
00:05:12.480 | the earliest origins of that little potion
00:05:18.920 | that chemical process?
00:05:21.440 | What really is the spark there?
00:05:23.600 | - There isn't a spark.
00:05:27.280 | There is a continuous chemical reaction.
00:05:30.640 | And there is kind of a spark,
00:05:33.120 | but it's a continuous electrical charge
00:05:35.080 | which helps drive that reaction.
00:05:37.440 | - So literally spark.
00:05:38.600 | - Well, the charge at least, but yes.
00:05:41.560 | I mean, a spark in that sense is,
00:05:43.240 | we tend to think of in terms of Frankenstein,
00:05:46.440 | we tend to think in terms of electricity
00:05:48.440 | and one moment you zap something and it comes alive.
00:05:52.040 | And what does that really mean?
00:05:53.520 | You've just come alive and now what's sustaining it?
00:05:56.040 | Well, we are sustained by oxygen,
00:05:59.360 | by this continuous chemical reaction.
00:06:02.120 | And if you put a plastic bag on your head,
00:06:03.960 | then you've got a minute or something
00:06:05.280 | before it's all over.
00:06:07.040 | - So some way of being able to leverage a source of energy.
00:06:11.000 | - Well, the source of energy at the origin of life
00:06:12.720 | is the reaction between carbon dioxide and hydrogen.
00:06:15.720 | And amazingly, most of these reactions are exergonic,
00:06:19.320 | which is to say they release energy.
00:06:21.760 | If you have hydrogen and CO2
00:06:24.280 | and you put them together in a Falcon tube
00:06:26.240 | and you warm it up to say 50 degrees centigrade
00:06:28.320 | and you put in a couple of catalysts and you shake it,
00:06:31.000 | nothing's gonna happen.
00:06:32.320 | But thermodynamically, that is less stable.
00:06:35.800 | Two gases, hydrogen and CO2, are less stable than cells.
00:06:39.240 | What should happen is you get cells coming out.
00:06:42.080 | So why doesn't that happen?
00:06:45.480 | It's because of the kinetic barriers.
00:06:47.800 | That's where you need the spark.
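The thermodynamic claim here, that hydrogen plus CO2 is less stable than the organic products, can be checked with textbook standard Gibbs free energies of formation. A minimal sketch follows; the tabulated values and the choice of methane as the end product are illustrative assumptions, not figures from the conversation:

```python
# Standard Gibbs free energies of formation at 25 C, in kJ/mol.
# These are common textbook standard-state values (illustrative only).
dGf = {
    "CO2(g)": -394.4,
    "H2(g)":    0.0,
    "CH4(g)":  -50.8,
    "H2O(l)": -237.1,
}

# Overall hydrogenation of CO2 to methane (the methanogenesis reaction):
#   CO2 + 4 H2 -> CH4 + 2 H2O
dG_reaction = (dGf["CH4(g)"] + 2 * dGf["H2O(l)"]) - (dGf["CO2(g)"] + 4 * dGf["H2(g)"])

# A negative dG means the reaction is exergonic: it releases energy,
# even though kinetic barriers keep it from happening spontaneously.
print(f"dG = {dG_reaction:.1f} kJ/mol")  # prints: dG = -130.6 kJ/mol
```

The sign, not the exact number, is the point: the overall reaction releases energy, which is why only kinetic barriers stand in the way.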
00:06:49.960 | - Is it possible that life originated
00:06:52.040 | multiple times on earth?
00:06:54.080 | The way you describe it, you make it sound so easy.
00:06:56.640 | - There's a long distance to go from the first bits
00:07:01.240 | of prebiotic chemistry to say molecular machines
00:07:04.160 | like ribosomes.
00:07:05.360 | - Is that the first thing that you would say is life?
00:07:12.640 | Like if I introduced the two of you at a party,
00:07:12.640 | you would say that's a living thing?
00:07:15.040 | - I would say as soon as we introduce genes, information,
00:07:19.920 | into systems that are growing anyway,
00:07:22.040 | so I would talk about growing protocells.
00:07:25.240 | As soon as we introduce even random bits of information
00:07:30.080 | into there, I'm thinking about RNA molecules, for example,
00:07:35.040 | doesn't have to have any information in it.
00:07:36.440 | It can be completely random sequence.
00:07:38.240 | But if it's introduced into a system,
00:07:40.280 | which is in any case growing and doubling itself
00:07:42.520 | and reproducing itself,
00:07:43.640 | then any changes in that sequence that allow it
00:07:46.360 | to do so better or worse are now selected
00:07:48.920 | by perfectly normal natural selection.
00:07:51.240 | - But it's a system--
00:07:52.320 | - So that's when it becomes alive to my mind.
00:07:54.560 | - That's encompassed into like an object
00:07:59.360 | that keeps information and evolves that information
00:08:04.000 | over time, changes that information over time.
00:08:06.080 | - Yes, exactly.
00:08:06.920 | - In response to the--
00:08:07.760 | - So it's always part of a cell system
00:08:10.040 | from the very beginning.
00:08:11.200 | - So is your sense that it started only once
00:08:14.360 | because it's difficult,
00:08:15.520 | or is it possibly started in multiple locations on Earth?
00:08:18.720 | - It possibly started on multiple occasions.
00:08:20.920 | There's two provisos to that.
00:08:23.720 | One of them is oxygen makes it impossible, really,
00:08:28.720 | for life to start.
00:08:29.880 | So as soon as we've got oxygen in the atmosphere,
00:08:31.880 | then life isn't gonna keep starting over.
00:08:34.320 | So I often get asked by people,
00:08:36.320 | "Why can't we have life starting?
00:08:38.000 | If it's so easy, why can't life start in these vents now?"
00:08:40.600 | And the answer is, if you want hydrogen to react with CO2
00:08:43.720 | and there's oxygen there,
00:08:44.560 | hydrogen reacts with oxygen instead.
00:08:46.120 | It's just, you're getting an explosive reaction that way.
00:08:48.800 | It's rocket fuel.
00:08:50.040 | So it's never gonna happen.
00:08:51.240 | But for the origin of life earlier than that,
00:08:54.120 | all we know is that there's a single common ancestor
00:08:57.280 | for all of life.
00:08:58.120 | There could have been multiple origins
00:09:00.120 | and they all just disappeared.
00:09:01.640 | But there's a very interesting deep split in life
00:09:06.160 | between bacteria and what are called archaea,
00:09:09.000 | which look just the same as bacteria.
00:09:11.320 | And they're not quite as diverse, but nearly.
00:09:14.880 | And they are very different in their biochemistry.
00:09:18.120 | And so any explanation for the origin of life
00:09:19.800 | has to account as well for why they're so different
00:09:22.840 | and yet so similar.
00:09:24.160 | And that makes me think that life
00:09:26.760 | probably did arise only once.
00:09:29.160 | - Can you describe the difference
00:09:30.600 | that's interesting there?
00:09:32.200 | How they're similar, how they're different?
00:09:33.960 | - Well, they're different in their membranes primarily.
00:09:38.120 | They're different in things like DNA replication.
00:09:40.400 | They use completely different enzymes
00:09:41.880 | and the genes behind it for replicating DNA.
00:09:44.680 | - So they both have membranes, both have DNA replication.
00:09:48.040 | - Yes.
00:09:48.880 | - The process of that is different.
00:09:50.280 | - They both have DNA.
00:09:52.080 | The genetic code is identical in them both.
00:09:54.880 | The way in which it's transcribed into RNA,
00:09:58.560 | into the copy of a gene,
00:10:00.960 | and the way that that's then translated into a protein,
00:10:03.280 | that's all basically the same in both these groups.
00:10:05.320 | So they clearly share a common ancestor.
00:10:08.560 | It's just that they're different
00:10:09.640 | in fundamental ways as well.
00:10:10.960 | And if you think about, well,
00:10:11.800 | what kind of processes could drive
00:10:14.360 | that divergence very early on?
00:10:16.520 | I can think about it in terms of membranes,
00:10:19.760 | in terms of the electrical charges on membranes.
00:10:22.640 | And it's that that makes me think
00:10:24.320 | that there were probably many unsuccessful attempts
00:10:27.440 | and only one really successful attempt.
00:10:30.040 | - Can you explain why that divergence
00:10:31.880 | makes you think there's one common ancestor?
00:10:36.160 | Okay, can you describe that intuition?
00:10:37.960 | I'm a little bit unclear about why the divergent,
00:10:40.360 | like the leap from the divergence means there's one.
00:10:43.920 | Do you mean like the divergence indicates
00:10:47.000 | that there was a big invention at that time from one source?
00:10:52.000 | - If you'd got, as I imagine it,
00:10:54.640 | you have a common ancestor living in a hydrothermal vent.
00:10:59.120 | Let's say there are millions of vents
00:11:01.440 | and millions of potential common ancestors
00:11:04.080 | living in all of those vents,
00:11:06.000 | but only one of them makes it out first,
00:11:09.440 | then you could imagine that that cell
00:11:11.120 | is then gonna kind of take over the world
00:11:12.680 | and wipe out everything else.
00:11:14.440 | And so what you would see would be
00:11:16.520 | a single common ancestor for all of life.
00:11:18.920 | But with lots of different vent systems
00:11:21.440 | all kind of vying to create the first life forms,
00:11:24.680 | you might say.
00:11:25.520 | - So this thing is a cell, a single cell organism.
00:11:28.400 | - We're always talking about populations of cells,
00:11:30.560 | but yes, these are single-celled organisms.
00:11:33.720 | - But the fundamental life form is a single cell, right?
00:11:37.800 | So like, or, so they're always together,
00:11:41.960 | but they're alone together. (laughs)
00:11:44.040 | - Yeah.
00:11:44.880 | - There's a machinery in each one individual component
00:11:47.800 | that if left by itself would still work, right?
00:11:50.760 | - Yes, yes, yes.
00:11:51.680 | It's the unit of selection is a single cell.
00:11:54.600 | But selection operates over generations
00:11:56.720 | and changes over generations in populations of cells.
00:11:59.560 | So it would be impossible to say
00:12:00.800 | that a cell is the unit of selection
00:12:02.600 | in the sense that unless you have a population,
00:12:05.400 | you can't evolve, you can't change.
00:12:07.360 | - Right, but there was one Chuck Norris,
00:12:12.360 | it's an American reference,
00:12:13.800 | cell that made it out of the vents, right?
00:12:17.880 | Or like the first one.
00:12:19.800 | - So imagine then that there's one cell gets out
00:12:22.160 | and it takes over the world.
00:12:23.680 | - It gets out in the water, it's like floating around.
00:12:25.640 | - We're deep in the ocean somewhere.
00:12:27.240 | - Yeah.
00:12:28.320 | - Actually two cells got out
00:12:31.240 | and they appear to have got out from the same vent
00:12:35.480 | because they both share the same code and everything else.
00:12:38.720 | So unless all the,
00:12:40.160 | we've got a million different common ancestors
00:12:42.200 | in all these different vents.
00:12:44.160 | So either they all have the same code
00:12:47.000 | and two cells spontaneously emerged from different places
00:12:49.360 | or two different cells,
00:12:52.560 | fundamentally different cells came from the same place.
00:12:55.760 | So either way, what are the constraints that say,
00:12:59.320 | not just one came out or not half a million came out,
00:13:01.640 | but two came out, that's kind of a bit strange.
00:13:05.280 | So how did they come out?
00:13:06.520 | Well, they come out because what you're doing inside a vent
00:13:09.760 | is you're relying on the electrical charges down there
00:13:12.680 | to power this reaction between hydrogen and CO2
00:13:15.480 | to make yourself grow.
00:13:16.960 | And when you leave the vent, you've got to do that yourself.
00:13:19.200 | You've got to power up your own membrane.
00:13:21.280 | And so the question is,
00:13:22.120 | well, how do you power up your own membrane?
00:13:24.800 | And the answer is, well, you need to pump.
00:13:27.240 | You need to pump ions
00:13:28.800 | to give an electrical charge on the membrane.
00:13:30.800 | So what do the pumps look like?
00:13:32.160 | Well, the pumps look different in these two groups.
00:13:35.320 | It's as if they both emerged from a common ancestor.
00:13:37.960 | As soon as you've got that ancestor,
00:13:39.400 | things move very quickly and divergently.
00:13:43.760 | Why does the DNA replication look different?
00:13:45.840 | Well, it's joined to the membrane.
00:13:47.280 | The membranes are different.
00:13:48.240 | The DNA replication is different
00:13:49.640 | because it's joined to a different kind of membrane.
00:13:52.320 | So there's interesting,
00:13:53.680 | you know, this is detail, you may say,
00:13:55.560 | but it's also fundamental
00:13:57.040 | because it's about the two big divergent groups
00:13:59.640 | of life on earth
00:14:00.480 | that seem to have diverged really early on.
00:14:02.920 | - It all started from one organism.
00:14:06.520 | And then that organism just start replicating
00:14:09.560 | the heck out of itself with some mutation of the DNA.
00:14:14.280 | So like there's some,
00:14:16.200 | there's a competition through the process of evolution.
00:14:19.360 | They're not like trying to beat each other up.
00:14:21.320 | They're just trying to live-
00:14:23.760 | - They're just replicators.
00:14:25.640 | - Yeah.
00:14:26.480 | Well, you know, let's not minimize their...
00:14:28.520 | - Yeah.
00:14:29.360 | - They're just trying to chill.
00:14:30.280 | They're trying to relax up in the...
00:14:32.560 | But there's no sense of trying to survive.
00:14:35.080 | They're replicating.
00:14:36.400 | - I mean, there's no sense
00:14:37.320 | in which they're trying to do anything.
00:14:39.560 | They're just kind of an outgrowth of the earth,
00:14:41.960 | you might say.
00:14:42.800 | - Of course, the aliens would describe us humans
00:14:44.720 | in that same way.
00:14:45.680 | - They might be right.
00:14:47.520 | - This primitive life.
00:14:49.400 | - It's just ants that are hairless, mostly hairless.
00:14:53.440 | - Overgrown ants.
00:14:54.400 | - Overgrown ants.
00:14:55.680 | Okay, what do you think about the idea of panspermia,
00:14:59.400 | that the theory that life did not originate on earth
00:15:03.440 | and was planted here from outer space?
00:15:06.960 | Or pseudopanspermia, which is like the basic ingredients,
00:15:10.720 | the magic that you mentioned was planted here
00:15:12.960 | from elsewhere in space?
00:15:14.720 | - I don't find them helpful.
00:15:16.400 | That's not to say they're wrong.
00:15:18.840 | So pseudopanspermia, the idea that the chemicals,
00:15:22.360 | the amino acids, the nucleotides
00:15:23.920 | are being delivered from space.
00:15:24.800 | Well, we know that happens.
00:15:25.840 | It's unequivocal.
00:15:27.080 | They're delivered on meteorites, comets, and so on.
00:15:29.640 | So what do they do next?
00:15:31.880 | That's, to me, the question.
00:15:33.160 | Well, what do they do is they stock a soup.
00:15:35.160 | Presumably they land in a pond or in an ocean
00:15:37.480 | or wherever they land.
00:15:39.040 | And then you end up with,
00:15:40.160 | in the best possible case scenario,
00:15:42.160 | is you end up with a soup of nucleotides and amino acids.
00:15:44.720 | And then you have to say,
00:15:45.560 | so now what happens?
00:15:46.440 | And the answer is, oh, well,
00:15:47.280 | you have to go, become alive.
00:15:50.800 | So how did they do that?
00:15:51.760 | You may as well say, then a miracle happened.
00:15:53.920 | I don't believe in soup.
00:15:57.120 | I think what we have in a vent is a continuous conversion,
00:16:00.720 | a continuous growth, a continuous reaction,
00:16:02.720 | a continuous converting a flow of molecules
00:16:05.880 | into more of yourself, you might say,
00:16:07.920 | even if it's a small bit.
00:16:08.880 | So you've got a kind of continuous self-organization
00:16:13.040 | and growth from the very beginning.
00:16:14.760 | You never have that in a soup.
00:16:17.000 | - Isn't the entire universe
00:16:19.120 | and living organisms in the universe,
00:16:21.640 | isn't it just soup all the way down?
00:16:25.000 | Isn't it all soup?
00:16:25.840 | - No, no.
00:16:26.680 | I mean, soup almost by definition doesn't have a structure.
00:16:29.560 | - But soup is a collection of ingredients
00:16:32.200 | that are like randomly interacting.
00:16:34.040 | - Yeah, but they're not random.
00:16:36.200 | They're not, I mean, we have chemistry going on here.
00:16:38.760 | We have micelles forming,
00:16:40.160 | which are, you know, effective oil-water interactions.
00:16:43.400 | - Okay, so it feels like there's a direction to a process,
00:16:45.960 | like a director process.
00:16:46.800 | - There are directions to processes, yeah.
00:16:49.920 | And if you're starting with CO2
00:16:52.640 | and you've got two reactive fluids being brought together
00:16:55.600 | and they react, what are they gonna make?
00:16:57.720 | Well, they make carboxylic acids,
00:16:59.840 | which include the fatty acids
00:17:01.720 | that make up the cell membranes.
00:17:03.040 | And they form directly into bilayer membranes.
00:17:06.520 | They form like soap bubbles.
00:17:07.680 | It's spontaneous organization
00:17:10.000 | caused by the nature of the molecules.
00:17:11.920 | And those things are capable of growing
00:17:14.040 | and are capable in effect of being selected
00:17:16.440 | even before there are genes.
00:17:18.720 | So we have a lot of order,
00:17:20.040 | and that order is coming from thermodynamics.
00:17:22.560 | And the thermodynamics,
00:17:23.680 | it's always about increasing the entropy of the universe.
00:17:27.040 | But if you have oil and water and they're separating,
00:17:30.480 | you're increasing the entropy of the universe,
00:17:32.040 | even though you've got some order,
00:17:33.120 | which is the soap and the water are not miscible.
00:17:36.480 | Now, to come back to your first question
00:17:39.480 | about panspermia properly,
00:17:42.360 | that just pushes the question somewhere else.
00:17:45.880 | Even if it's true,
00:17:46.840 | maybe life did start on Earth by panspermia.
00:17:49.680 | So what are the principles
00:17:52.200 | that govern the emergence of life on any planet?
00:17:55.200 | It's an assumption that life started here.
00:17:57.560 | And it's an assumption that it started
00:18:01.000 | in a hydrothermal vent,
00:18:02.000 | or it started in a terrestrial geothermal system.
00:18:04.680 | The question is, can we work out a testable sequence
00:18:07.560 | of events that would lead from one to the other one,
00:18:10.480 | and then test it and see if there's any truth in it or not?
00:18:12.640 | With panspermia, you can't do any of that.
00:18:14.880 | - But the fundamental question of panspermia is,
00:18:17.680 | do we have the machine here on Earth to build life?
00:18:21.960 | Is the vents enough?
00:18:25.680 | Is oxygen and hydrogen and whatever the heck else we want,
00:18:30.680 | and some source of energy and heat,
00:18:34.520 | is that enough to build life?
00:18:36.360 | - Yes.
00:18:37.320 | - Well, that's...
00:18:38.480 | (laughing)
00:18:40.000 | Of course you would say that as a human.
00:18:42.800 | But there could be aliens right now,
00:18:44.640 | chuckling at that idea.
00:18:46.160 | Maybe you need some special sauce.
00:18:50.160 | Special elsewhere sauce.
00:18:51.920 | So your sense is, we have everything here.
00:18:54.600 | - I mean, this is precisely the question.
00:18:56.960 | So I like to, when I'm talking in schools,
00:18:59.600 | I like to start out with the idea of,
00:19:01.520 | we can make a time machine.
00:19:03.240 | We go back four billion years,
00:19:05.160 | and we go to these environments that people talk about.
00:19:07.400 | We go to a deep sea hydrothermal vent,
00:19:09.280 | we go to a kind of Yellowstone Park type place environment,
00:19:14.120 | and we find some slime that looks like,
00:19:17.560 | and we can test it, it's made of organic molecules.
00:19:20.240 | It's got a structure which is not obviously cells,
00:19:22.360 | but is this a stepping stone on the way to life or not?
00:19:26.840 | - Yeah.
00:19:27.680 | - How do we know?
00:19:29.040 | Unless we've got an intellectual framework
00:19:31.560 | that says this is a stepping stone and that's not a step,
00:19:34.160 | you know, we'd never know.
00:19:35.000 | We wouldn't know which environment to go to,
00:19:36.480 | what to look for, how to say this.
00:19:38.520 | So all we can ever hope for,
00:19:39.880 | 'cause we're never gonna build that time machine,
00:19:41.840 | is to have an intellectual framework
00:19:43.400 | that can explain step by step, experiment by experiment,
00:19:46.840 | how we go from a sterile inorganic planet
00:19:49.640 | to living cells as we know them.
00:19:52.440 | And in that framework, every time you have a choice,
00:19:55.240 | it could be this way or it could be that way,
00:19:56.920 | or, you know, there's lots of possible forks down that road.
00:20:01.120 | Did it have to be that way?
00:20:03.560 | Could it have been the other way?
00:20:05.200 | And would that have given you life
00:20:06.520 | with very different properties?
00:20:08.800 | And so if you come up with a, you know,
00:20:11.000 | it's a long hypothesis, 'cause as I say,
00:20:12.680 | we're going from really simple prebiotic chemistry
00:20:15.520 | all the way through to genes and molecular machines.
00:20:17.800 | That's a long, long pathway.
00:20:20.120 | And nobody in the field would agree on the order
00:20:22.280 | in which these things happened,
00:20:23.720 | which is not a bad thing 'cause it means
00:20:24.960 | that you have to go out and do some experiments
00:20:26.800 | and try and demonstrate that it's possible or not possible.
00:20:29.800 | - It's so freaking amazing that it happened though.
00:20:34.800 | It feels like there's a direction to the thing.
00:20:41.920 | Can you try to answer from a framework perspective
00:20:46.920 | of what is life?
00:20:49.600 | So you said there's some order, and yet there's complexity.
00:20:56.000 | So it's not perfectly ordered.
00:20:59.120 | It's not boring.
00:20:59.960 | There's still some fun in it.
00:21:02.000 | And it also feels like the processes have a direction
00:21:05.360 | through the selection mechanism.
00:21:07.880 | They seem to be building something, always better,
00:21:11.960 | always improving.
00:21:14.240 | I mean, maybe it's-
00:21:15.080 | - I mean, that's a perception.
00:21:16.240 | - That's our romanticization of things are always better.
00:21:19.500 | Things are getting better.
00:21:21.520 | We'd like to believe that.
00:21:22.880 | - I mean, you think about the world
00:21:24.040 | from the point of view of bacteria,
00:21:25.640 | and bacteria are the first things to emerge
00:21:27.960 | from whatever environment they came from.
00:21:30.040 | And they dominated the planet very, very quickly.
00:21:32.680 | And they haven't really changed.
00:21:34.440 | 4 billion years later, they look exactly the same.
00:21:36.640 | So from about 4 billion years ago,
00:21:38.760 | bacteria started to really run the show.
00:21:42.400 | And then nothing happened for a while.
00:21:44.680 | - Nothing happened for 2 billion years.
00:21:46.960 | Then after 2 billion years,
00:21:48.120 | we see another single event origin, if you like,
00:21:51.120 | of our own type of cell, the eukaryotic cells,
00:21:54.040 | cells with a nucleus and lots of stuff going on inside.
00:21:57.280 | Another singular origin.
00:21:58.480 | It only happened once in the history of life on Earth.
00:22:01.080 | Maybe it happened multiple times,
00:22:02.520 | and there's no evidence.
00:22:03.360 | Everything just disappeared.
00:22:04.360 | But we have to at least take it seriously
00:22:07.520 | that there's something that stops bacteria
00:22:10.040 | from becoming more complex, because they didn't.
00:22:13.360 | You know, that's a fact,
00:22:14.240 | that they emerged 4 billion years ago,
00:22:17.480 | and something happened 2 billion years ago,
00:22:19.360 | but the bacteria themselves didn't change.
00:22:21.280 | They remain bacterial.
00:22:22.520 | So there is no necessary trajectory
00:22:26.080 | towards great complexity in human beings at the end of it.
00:22:28.600 | It's very easy to imagine
00:22:29.720 | that without photosynthesis arising
00:22:31.720 | or without eukaryotes arising,
00:22:33.000 | that a planet could be full of bacteria and nothing else.
00:22:36.360 | - We'll get to that, 'cause that's a brilliant invention,
00:22:39.320 | and there's a few brilliant invention along the way.
00:22:41.520 | But what is life?
00:22:44.000 | If you were to show up on Earth,
00:22:45.960 | but to take that time machine,
00:22:47.840 | and you said, asking yourself the question,
00:22:50.160 | "Is this a stepping stone towards life?"
00:22:52.720 | As you step along, when you see the early bacteria,
00:22:57.000 | how would you know it's life?
00:22:58.520 | And then this is a really important question
00:23:01.760 | when you go to other planets and look for life.
00:23:04.200 | What is the framework of telling the difference
00:23:08.240 | between a rock and a bacteria?
00:23:12.280 | - I mean, the question's kind of both impossible to answer
00:23:15.280 | and trivial at the same time,
00:23:16.540 | and I don't like to answer it,
00:23:18.160 | because I don't think there is an answer.
00:23:19.680 | I think we're trying to describe--
00:23:20.520 | - Those are the most fun questions.
00:23:21.720 | Believe me, there's no answer.
00:23:23.440 | - No, there is no answer.
00:23:24.280 | I mean, there's lots of,
00:23:25.800 | at least 40 or 50 different definitions of life out there,
00:23:29.040 | and most of them are, well--
00:23:31.280 | - Not convincing.
00:23:32.120 | - Obviously bad in one way or another.
00:23:34.000 | (laughing)
00:23:36.080 | I mean, there's, I can never remember
00:23:38.240 | the exact words that people use,
00:23:39.920 | but there's a NASA working definition of life,
00:23:43.720 | which more or less says a system which is capable of,
00:23:48.400 | a self-sustaining system capable of evolution
00:23:50.640 | or something along those lines.
00:23:52.680 | And I immediately have a problem with the word self-sustaining
00:23:56.200 | because it's sustained by the environment,
00:23:58.240 | and I know what they're getting at.
00:24:00.120 | I know what they're trying to say,
00:24:01.000 | but I pick a hole in that.
00:24:03.120 | And there's always wags who say,
00:24:04.680 | but by that definition, a rabbit is not alive.
00:24:07.360 | Only a pair of rabbits would be alive
00:24:09.760 | because a single rabbit is incapable of copying itself.
00:24:12.840 | There's all kinds of pedantic,
00:24:15.560 | silly but also important objections to any hypothesis.
00:24:19.240 | The real question is what is,
00:24:20.840 | we can argue all day, or people do argue all day about,
00:24:25.120 | is a virus alive or not?
00:24:27.400 | And it depends on the context.
00:24:29.080 | Most biologists could not agree.
00:24:30.920 | So then what about a jumping gene,
00:24:32.720 | a retro element or something like that?
00:24:33.560 | It's even simpler than a virus,
00:24:36.080 | but it's capable of converting its environment
00:24:40.400 | into a copy of itself.
00:24:41.840 | And that's about as close, this is not a definition,
00:24:45.200 | but this is a kind of a description of life,
00:24:47.440 | is that it's able to parasitize the environment,
00:24:52.080 | and that goes for plants as well as animals
00:24:53.920 | and bacteria and viruses,
00:24:56.040 | to make a relatively exact copy of themselves.
00:25:00.880 | Informationally exact copy of themselves.
00:25:04.160 | - By the way, it doesn't really have to be
00:25:06.200 | a copy of itself, right?
00:25:07.720 | It just has to be, you have to create something
00:25:10.320 | that's interesting.
00:25:12.960 | The way evolution is,
00:25:16.440 | so it is extremely powerful process of evolution,
00:25:19.960 | which is basically make a copy of yourself
00:25:22.400 | and sometimes mess up a little bit.
00:25:24.720 | - Absolutely.
00:25:25.560 | - That seems to work really well.
00:25:26.480 | I wonder if it's possible to--
00:25:28.520 | - Mess up big time.
00:25:29.400 | - Mess up big time as a standard, as the default.
00:25:32.400 | - It's called the hopeful monster,
00:25:33.680 | and you know, there's--
00:25:34.640 | - It doesn't work.
00:25:35.960 | - In principle it can.
00:25:36.880 | Actually, it turns out,
00:25:38.040 | I would say that this is due a re-emergence.
00:25:41.480 | There's some amazing work from Michael Levin.
00:25:44.200 | I don't know if you came across him,
00:25:45.320 | but if you haven't interviewed him,
00:25:47.360 | you should interview him.
00:25:48.360 | - Yeah, yeah, in Boston.
00:25:49.920 | - About, yeah, yeah.
00:25:50.760 | - I'm talking to him in a few days.
00:25:53.640 | - Oh, fantastic.
00:25:54.480 | (both laughing)
00:25:56.000 | - So I mentioned there's two people
00:25:58.800 | that Andrej, if I may mention,
00:26:00.680 | Andrej Karpathy is a friend
00:26:02.400 | who's really admired in the AI community,
00:26:04.640 | said you absolutely must talk to Michael and to Nick.
00:26:09.640 | So this, of course, I'm a huge fan of yours,
00:26:11.840 | so I'm really fortunate
00:26:13.600 | that we can actually make this happen.
00:26:14.880 | Anyway, you were saying?
00:26:16.120 | - Well, Michael Levin is doing amazing work,
00:26:19.320 | basically about the way
00:26:20.760 | in which electrical fields control development.
00:26:23.460 | And he's done some work with planarian worms,
00:26:26.720 | so flatworms, where he'll tell you all about this,
00:26:29.440 | so I won't say any more than the minimum,
00:26:30.840 | but basically you can cut their head off
00:26:32.480 | and they'll redevelop a different, a new head.
00:26:35.680 | But the head that they develop depends,
00:26:37.720 | if you knock out just one ion pump in a membrane,
00:26:42.560 | so you change the electrical circuitry just a little bit,
00:26:45.280 | you can come up with a completely different head.
00:26:47.040 | It can be a head which is similar
00:26:49.000 | to those that diverged 150 million years ago,
00:26:52.120 | or it can be a head which no one's ever seen before,
00:26:54.200 | a different kind of head.
00:26:56.760 | Now that is really, you might say, a hopeful monster.
00:26:59.320 | This is a kind of leap into a different direction.
00:27:01.980 | The only question for natural selection is does it work?
00:27:05.040 | Is the change itself feasible as a single change?
00:27:08.200 | And the answer is yes,
00:27:09.020 | it's just a small change to a single gene.
00:27:11.080 | And the second thing is it gives rise
00:27:12.880 | to a completely different morphology.
00:27:14.880 | Does it work?
00:27:16.000 | And if it works, that can easily be a shift.
00:27:21.000 | But for it to be a speciation,
00:27:23.040 | for it to continue,
00:27:25.400 | for it to give rise to a different morphology over time,
00:27:29.720 | then it has to be perpetuated.
00:27:32.000 | So that shift, that change in that one gene
00:27:37.000 | has to work well enough that it is selected and it goes on.
00:27:41.120 | - And copied enough times to where you can really test it.
00:27:44.320 | - So the likelihood, it would be lost,
00:27:46.040 | but there'll be some occasions where it survives.
00:27:48.800 | And yes, the idea that we can have
00:27:50.320 | sudden, fairly abrupt changes in evolution,
00:27:52.940 | I think it's time for a rebirth.
00:27:54.960 | - What about this idea of kind of trying to mathematize
00:27:59.960 | a definition of life and saying how many steps,
00:28:04.480 | the shortest amount of steps it takes to build the thing?
00:28:07.120 | Almost like an engineering view of it.
00:28:09.640 | - Ah, I like that view.
00:28:11.880 | Because I think that in a sense,
00:28:13.400 | that's not very far away from what a hypothesis needs to do
00:28:17.340 | to be a testable hypothesis for the origin of life.
00:28:19.520 | You need to spell out, here's each step,
00:28:22.440 | and here's the experiment to do for each step.
00:28:24.980 | The idea that we can do it in the lab,
00:28:26.960 | some people say, oh, we'll have created life
00:28:29.820 | within five years, but ask them what they mean by life.
00:28:32.680 | We have a planet four billion years ago
00:28:36.680 | with these vent systems across the entire surface
00:28:39.280 | of the planet, and we have millions of years if we wanted.
00:28:41.880 | I have a feeling that we're not talking
00:28:43.240 | about millions of years.
00:28:44.120 | I have a feeling we're talking about
00:28:47.200 | maybe millions of nanoseconds or picoseconds.
00:28:49.560 | We're talking about chemistry, which is happening quickly.
00:28:52.460 | But we still need to constrain those steps,
00:28:56.780 | but we've got a planet doing similar chemistry.
00:29:00.880 | You asked about a trajectory.
00:29:02.660 | The trajectory is the planetary trajectory.
00:29:05.260 | The planet has properties.
00:29:06.640 | It's basically, it's got a lot of iron at the center of it.
00:29:08.680 | It's got a lot of electrons at the center of it.
00:29:10.480 | It's more oxidized on the outside,
00:29:12.040 | partly because of the sun, and partly because
00:29:14.640 | the heat of volcanoes puts out oxidized gases.
00:29:17.720 | So the planet is a battery.
00:29:19.660 | It's a giant battery.
00:29:20.760 | And we have a flow of electrons going from inside
00:29:24.140 | to outside in these hydrothermal vents,
00:29:26.200 | and that's the same topology that a cell has.
00:29:29.120 | A cell is basically just a micro version of the planet.
00:29:32.800 | And there is a trajectory in all of that,
00:29:37.000 | and there's an inevitability that certain types
00:29:39.360 | of chemical reaction are going to be favored over others,
00:29:42.360 | and there's an inevitability in what happens in water,
00:29:46.220 | the chemistry that happens in water.
00:29:47.880 | Some will be immiscible with water and will form membranes
00:29:51.840 | and will form insoluble structures.
00:29:53.640 | Nobody really understands water very well.
00:29:57.280 | And it's another big question.
00:30:00.680 | For experiments on the origin of life, what do you put it in?
00:30:04.440 | What kind of structure do we want to induce in this water?
00:30:07.200 | Because the last thing it's likely to be
00:30:08.840 | is just kind of bulk water.
00:30:11.680 | - How fundamental is water to life, would you say?
00:30:14.280 | - I would say pretty fundamental.
00:30:16.460 | I wouldn't like to say it's impossible for life
00:30:20.080 | to start any other way, but water is everywhere.
00:30:25.080 | Water's extremely good at what it does,
00:30:27.800 | and carbon works in water especially well.
00:30:31.200 | So those things, and carbon is everywhere.
00:30:33.280 | So those things together make me think probabilistically,
00:30:35.700 | if we found 1,000 life forms,
00:30:37.880 | 995 of them would be carbon-based and living in water.
00:30:41.960 | - Now the reverse question,
00:30:43.280 | if you found a puddle of water elsewhere and some carbon,
00:30:48.040 | no, just a puddle of water.
00:30:50.240 | Is a puddle of water a pretty damn good indication
00:30:53.280 | that life either exists here or has once existed here?
00:30:58.280 | - No.
00:31:02.360 | - So it doesn't work the other way?
00:31:04.120 | - I think you need a living planet.
00:31:07.560 | You need a planet which is capable
00:31:09.080 | of turning over its surface.
00:31:10.720 | It needs to be a planet with water.
00:31:12.840 | It needs to be capable of bringing those electrons
00:31:16.880 | from inside to the outside.
00:31:18.160 | It needs to turn over its surface.
00:31:19.660 | It needs to make that water work and turn it into hydrogen.
00:31:22.920 | So I think you need a living planet.
00:31:24.760 | But once you've got the living planet,
00:31:25.920 | I think the rest of it is kind of thermodynamics
00:31:28.840 | all the way.
00:31:29.680 | - So if you were to run Earth over a million times
00:31:34.260 | up to this point, maybe beyond, to the end,
00:31:37.820 | let's run it to the end,
00:31:39.160 | what is it, how much variety is there?
00:31:42.980 | You kind of spoke to this trajectory
00:31:45.060 | that the environment dictates like chemically,
00:31:49.900 | I don't know in which other way, spiritually,
00:31:53.520 | like dictates kind of the direction of this giant machine
00:31:59.620 | that seems chaotic, but it does seem to have order
00:32:03.580 | in the steps it's taking.
00:32:05.460 | How often will life, how often will bacteria emerge?
00:32:11.420 | How often will something like humans emerge?
00:32:13.420 | How much variety do you think there would be?
00:32:15.300 | - I think at the level of bacteria, not much variety.
00:32:19.380 | I think we would get,
00:32:20.980 | that's how many times you say you wanna run it?
00:32:22.820 | A million times.
00:32:23.780 | I would say at least a few hundred thousand
00:32:26.860 | we'll get bacteria again.
00:32:28.020 | - Oh, wow, nice.
00:32:29.460 | - Because I think there's some level of inevitability
00:32:31.860 | that a wet rocky planet will give rise
00:32:33.980 | through the same processes to something very,
00:32:38.260 | I think, this is not something I'd have thought
00:32:40.460 | a few years ago, but working with a PhD student of mine,
00:32:43.620 | Stuart Harrison, he's been thinking about the genetic code
00:32:46.580 | and we've just been publishing on that.
00:32:49.260 | There are patterns that you can discern in the code,
00:32:51.540 | or he has discerned in the code,
00:32:53.440 | that if you think about them in terms of,
00:32:56.120 | we start with CO2 and hydrogen
00:32:57.780 | and that these are the first steps of biochemistry,
00:32:59.780 | you come up with a code
00:33:00.800 | which is very similar to the code that we see.
00:33:03.660 | So it wouldn't surprise me any longer
00:33:05.600 | if we found life on Mars and it had a genetic code
00:33:07.780 | that was not very different to the genetic code
00:33:09.780 | that we have here,
00:33:11.060 | without it just being transferred across.
00:33:13.340 | There's some inevitability
00:33:15.340 | about the whole of the beginnings of life, in my view.
00:33:18.580 | - That's really promising because if the basic chemistry
00:33:21.540 | is tightly linked to the genetic code,
00:33:25.940 | that means we can interact with other life
00:33:29.540 | if it exists out there.
00:33:30.500 | - Well, that's potentially.
00:33:32.120 | - That's really exciting if that's the case.
00:33:34.600 | Okay, but then bacteria.
00:33:36.040 | - We've got then, we've got bacteria.
00:33:37.840 | How easy is photosynthesis?
00:33:41.200 | Much harder, I would say.
00:33:44.880 | - Let's actually go there.
00:33:46.080 | Let's go through the inventions.
00:33:47.640 | - Yeah.
00:33:49.160 | - What is photosynthesis and why is it hard?
00:33:52.420 | - Well, there are different forms.
00:33:55.360 | I mean, basically you're taking hydrogen
00:33:57.520 | and you're sticking it onto CO2
00:33:59.000 | and it's powered by the sun.
00:34:00.360 | Question is where are you taking the hydrogen from?
00:34:02.840 | And in photosynthesis that we know in plants,
00:34:05.340 | it's coming from water.
00:34:06.760 | So you're using the power of the sun to split water,
00:34:08.840 | take out the hydrogen, stick it onto CO2
00:34:11.460 | and the oxygen is a waste product
00:34:13.080 | and you just throw it out, throw it away.
00:34:15.640 | So there's the single greatest planetary pollution event
00:34:19.100 | in the whole history of the earth.
00:34:21.180 | - The pollutant being oxygen.
00:34:22.460 | - Yes, yeah.
00:34:24.000 | It also made possible animals.
00:34:26.200 | You can't have large active animals
00:34:28.440 | without an oxygenated atmosphere,
00:34:30.040 | at least not in the sense that we know on earth.
00:34:33.680 | - So that's a really big invention
00:34:35.560 | in the history of earth. - Huge invention, yes.
00:34:37.560 | And it happened once.
00:34:38.480 | There's a few things that happened once on earth
00:34:40.400 | and you're always stuck with this problem.
00:34:42.680 | Once it happened, did it become so good so quickly
00:34:44.780 | that it precluded the same thing happening ever again
00:34:48.280 | or are there other reasons?
00:34:49.500 | And we really have to look at each one in turn
00:34:51.240 | and think why did it only happen once?
00:34:53.980 | In this case, it's really difficult to split water.
00:34:58.000 | It requires a lot of power
00:34:59.120 | and that power you're effectively separating charge
00:35:01.800 | across a membrane and the way in which you do it,
00:35:04.040 | if it doesn't all rush back
00:35:05.340 | and kind of cause an explosion right at the site,
00:35:08.280 | requires really careful wiring.
00:35:10.720 | And that wiring, it can't be easy to get it right
00:35:14.740 | because the plants that we see around us,
00:35:18.680 | they have chloroplasts.
00:35:19.520 | Those chloroplasts were cyanobacteria once.
00:35:21.320 | Those cyanobacteria are the only group of bacteria
00:35:23.600 | that can do that type of photosynthesis.
00:35:25.740 | So there's plenty of opportunity.
00:35:28.200 | - So not even many bacteria.
00:35:29.520 | So who invented photosynthesis?
00:35:31.880 | - The cyanobacteria or their ancestors.
00:35:34.200 | - And there's not many.
00:35:36.020 | - No other bacteria can do
00:35:37.760 | what's called oxygenic photosynthesis.
00:35:39.760 | Lots of other bacteria can split.
00:35:42.120 | I mean, you can take your hydrogen from somewhere else.
00:35:44.080 | You can take it from hydrogen sulfide
00:35:45.520 | bubbling out of a hydrothermal vent, grab your two hydrogens.
00:35:49.480 | The sulfur is the waste now.
00:35:51.060 | You can do it from iron.
00:35:53.320 | You can take electrons.
00:35:54.240 | So the early oceans were probably full of iron.
00:35:56.060 | You can take an electron from ferrous iron,
00:35:59.020 | so iron two plus and make it iron three plus,
00:36:01.640 | which now precipitates as rust.
00:36:03.960 | And you take a proton from the acidic early ocean,
00:36:07.600 | stick it there.
00:36:08.440 | Now you've got a hydrogen atom.
00:36:09.280 | Stick it onto CO2.
00:36:10.720 | You've just done the trick.
00:36:12.360 | The trouble is you bury yourself in rusty iron.
00:36:16.480 | And with sulfur, you can bury yourself in sulfur.
00:36:18.540 | One of the reasons oxygenic photosynthesis is so much better
00:36:21.680 | is that the waste product is oxygen, which just bubbles away.
00:36:25.660 | - That seems like extremely unlikely,
00:36:29.240 | and it's extremely essential for the evolution
00:36:31.420 | of complex organisms because of all the oxygen.
00:36:35.160 | - Yeah, and that didn't accumulate quickly either.
00:36:39.420 | - So it's converting, what is it?
00:36:42.060 | It's converting energy from the sun
00:36:43.900 | and the resource of water
00:36:46.960 | into the resource needed for animals.
00:36:49.940 | - Both resources needed for animals.
00:36:52.420 | We need to eat and we need to burn the food.
00:36:54.540 | And we're eating plants,
00:36:57.780 | which are getting their energy from the sun,
00:36:59.300 | and we're burning it with their waste product,
00:37:01.260 | which is the oxygen.
00:37:02.540 | So there's a lot of kind of circularity in that.
00:37:04.620 | But without an oxygenated planet,
00:37:07.880 | you couldn't really have predation.
00:37:11.840 | You can have animals,
00:37:14.980 | but you can't really have animals
00:37:16.240 | that go around and eat each other.
00:37:17.420 | You can't have ecosystems as we know them.
00:37:19.900 | - Well, let's actually step back.
00:37:21.140 | What about eukaryotic versus prokaryotic cells?
00:37:24.380 | Prokaryotes.
00:37:25.220 | What are each of those
00:37:28.420 | and how big of an invention is that?
00:37:31.060 | - I personally think that's the single biggest invention
00:37:33.380 | in the whole history of life.
00:37:34.820 | - Exciting.
00:37:35.900 | So what are they?
00:37:36.900 | Can you explain?
00:37:37.740 | - Yeah, so I mentioned bacteria and archaea.
00:37:40.780 | These are both prokaryotes.
00:37:42.140 | They're basically small cells that don't have a nucleus.
00:37:45.740 | If you look at them under a microscope,
00:37:47.060 | you don't see much going on.
00:37:48.140 | If you look at them under a super resolution microscope,
00:37:50.640 | then they're fantastically complex.
00:37:53.220 | In terms of their molecular machinery, they're amazing.
00:37:55.460 | In terms of their morphological appearance
00:37:58.340 | under a microscope, they're really small and really simple.
00:38:03.060 | The earliest life that we can physically see on the planet
00:38:05.500 | are stromatolites,
00:38:06.380 | which are made by things like cyanobacteria
00:38:08.500 | and they're large superstructures,
00:38:11.020 | effectively biofilms plated on top of each other.
00:38:13.820 | And you end up with quite large structures
00:38:17.380 | that you can see in the fossil record.
00:38:19.780 | But they never came up with animals.
00:38:23.100 | They never came up with plants.
00:38:24.340 | They came up with multicellular things,
00:38:26.660 | filamentous cyanobacteria, for example,
00:38:28.620 | that is long strings of cells.
00:38:31.420 | But the origin of the eukaryotic cell
00:38:34.500 | seems to have been what's called an endosymbiosis.
00:38:37.300 | So one cell gets inside another cell.
00:38:39.620 | And I think that that's transformed
00:38:42.140 | the energetic possibilities of life.
00:38:43.780 | So what we end up with is a kind of supercharged cell,
00:38:48.160 | which can have a much larger nucleus with many more genes,
00:38:52.320 | all supported.
00:38:54.180 | If you think about it,
00:38:55.020 | you could think about it as multi-bacterial power
00:38:57.380 | without the overhead.
00:38:58.340 | So you've got a cell and it's got bacteria living in it.
00:39:00.820 | And those bacteria are providing it
00:39:02.260 | with the energy currency it needs.
00:39:04.680 | But each bacterium has a genome of its own,
00:39:07.040 | which costs a fair amount of energy to express,
00:39:10.340 | to kind of turn over and convert into proteins and so on.
00:39:13.920 | What the mitochondria did,
00:39:16.780 | which are these power packs in our own cells,
00:39:20.500 | they were bacteria once,
00:39:22.320 | and they threw away virtually all their genes.
00:39:24.120 | They've only got a few left.
00:39:25.660 | - So mitochondria is, like you said,
00:39:27.720 | is the bacteria that got inside a cell
00:39:30.120 | and then threw away all this stuff
00:39:31.420 | it doesn't need to survive inside the cell
00:39:33.920 | and then kept what?
00:39:35.260 | - So what we end up with,
00:39:36.340 | so it kept always a handful of genes,
00:39:38.820 | in our own case, 37 genes.
00:39:41.580 | But there's a few protists, which are single-celled things
00:39:44.640 | that have got as many as 70 or 80 genes.
00:39:47.220 | So it's not always the same,
00:39:48.960 | but it's always a small number.
00:39:51.220 | And you can think of it as a pared-down power pack
00:39:54.220 | where the control unit has really been,
00:39:56.020 | has been kind of pared down to almost nothing.
00:39:58.820 | So you're putting out the same power,
00:40:00.880 | but the investment in the overheads is really pared down.
00:40:04.420 | That means that you can support
00:40:05.780 | a much larger nuclear genome.
00:40:08.420 | So we've gone up in the number of genes,
00:40:10.580 | but also the amount of power you have
00:40:12.240 | to convert those genes into proteins.
00:40:14.100 | We've gone up about fourfold in the number of genes,
00:40:17.140 | but in terms of the size of genomes
00:40:19.240 | and your ability to make the building blocks,
00:40:21.840 | make the proteins, we've gone up 100,000 fold or more.
00:40:25.120 | So it's huge step change in the possibilities of evolution.
00:40:29.580 | And it's interesting then that the only two occasions
00:40:34.000 | that complex life has arisen on Earth, plants and animals,
00:40:37.200 | fungi, you could say, are complex as well,
00:40:40.160 | but they don't form such complex morphology
00:40:42.840 | as plants and animals.
00:40:44.560 | Start with a single cell.
00:40:45.680 | They start with an oocyte and a sperm
00:40:48.400 | fused together to make a zygote.
00:40:50.440 | So you start development with a single cell
00:40:52.320 | and all the cells in the organism have identical DNA.
00:40:56.460 | And you switch off in the brain,
00:40:58.360 | you switch off these genes and you switch on those genes
00:41:00.560 | and liver, you switch off those
00:41:01.680 | and you switch on a different set.
00:41:04.060 | And the standard evolutionary explanation for that
00:41:06.120 | is that you're restricting conflict.
00:41:08.560 | You don't have a load of genetically different cells
00:41:11.000 | that are all fighting each other.
00:41:13.360 | And so it works.
00:41:14.720 | The trouble with bacteria is they form these biofilms
00:41:17.160 | and they're all genetically different
00:41:18.520 | and effectively they're incapable
00:41:21.000 | of that level of cooperation.
00:41:23.280 | They would get in a fight.
00:41:24.580 | - Okay, so why is this such a difficult invention
00:41:30.000 | of getting this bacteria inside and becoming an engine,
00:41:35.800 | which the mitochondria is?
00:41:37.320 | Why was that?
00:41:38.160 | Why do you assign it such great importance?
00:41:40.400 | Is it great importance in terms of how difficult
00:41:42.180 | it was to achieve, or great importance
00:41:44.280 | in terms of the impact it had on life?
00:41:46.920 | - Both.
00:41:48.360 | It had a huge impact on life
00:41:49.760 | because if that had not happened,
00:41:52.520 | you can be certain that life on Earth
00:41:54.960 | would be bacterial only.
00:41:56.480 | - And that took a really long time to--
00:41:58.200 | - It took two billion years.
00:41:59.960 | And it hasn't happened since to the best of our knowledge.
00:42:02.760 | So it looks as if it's genuinely difficult.
00:42:05.080 | And if you think about it then
00:42:06.200 | from just an informational perspective,
00:42:08.480 | you think bacteria have got,
00:42:13.120 | they structure their information differently.
00:42:15.280 | So a bacterial cell has a small genome.
00:42:17.560 | It might have 4,000 genes in it,
00:42:19.080 | but a single E. coli cell has access
00:42:21.120 | to about 30,000 genes, potentially.
00:42:24.080 | It's got a kind of metagenome
00:42:26.040 | where other E. coli out there have got different gene sets
00:42:29.120 | and they can switch them around between themselves.
00:42:31.880 | And so you can generate a huge amount of variation.
00:42:34.600 | And they've got more,
00:42:36.240 | an E. coli metagenome is larger than the human genome.
00:42:40.680 | We own 20,000 genes or something.
00:42:43.200 | So, and they've had four billion years of evolution
00:42:46.920 | to work out what can I do and what can't I do
00:42:49.980 | with this metagenome.
00:42:51.520 | And the answer is you're stuck, you're still bacteria.
00:42:54.280 | So they have explored genetic sequence space
00:42:58.920 | far more thoroughly than eukaryotes ever did
00:43:01.240 | because they've had twice as long at least
00:43:03.040 | and they've got much larger populations.
00:43:05.960 | And they never got around this problem.
00:43:08.440 | So why can't they?
00:43:09.360 | It seems as if you can't solve it with information alone.
00:43:12.400 | So what's the problem?
00:43:14.840 | The problem is structure.
00:43:16.400 | If cells, if the very first cells needed an electrical
00:43:20.640 | charge on their membrane to grow,
00:43:22.960 | and in bacteria it's the outer membrane
00:43:25.280 | that surrounds the cell, which is electrically charged,
00:43:28.240 | you try and scale that up
00:43:29.760 | and you've got a fundamental design problem.
00:43:31.920 | You've got an engineering problem.
00:43:33.680 | And there are examples of it.
00:43:35.280 | And what we see in all these cases
00:43:37.080 | is what's known as extreme polyploidy,
00:43:38.800 | which is to say they have tens of thousands
00:43:40.600 | of copies of their complete genome,
00:43:42.720 | which is energetically hugely expensive.
00:43:45.640 | And you end up with a large bacteria
00:43:49.040 | with no further development.
00:43:52.360 | What you need is to incorporate
00:43:55.040 | these electrically charged power pack units inside
00:43:58.520 | with their control units intact,
00:44:01.400 | and for them not to conflict so much with the host cell
00:44:04.000 | that it all goes wrong.
00:44:05.880 | Perhaps it goes wrong more often than not.
00:44:07.840 | And then you change the topology of the cell.
00:44:10.920 | Now you don't necessarily have any more DNA
00:44:14.200 | than a giant bacterium with extreme polyploidy,
00:44:16.720 | but what you've got is an asymmetry.
00:44:19.320 | You now have a giant nuclear genome
00:44:22.200 | surrounded by lots of subsidiary energetic genomes
00:44:25.640 | that do all the, they're the control units
00:44:27.920 | that are doing all the control of energy generation.
00:44:32.360 | - Could this have been done gradually
00:44:34.000 | or does it have to be done,
00:44:35.880 | the power pack has to be all intact and ready to go?
00:44:39.040 | - I mean, it's a kind of step change
00:44:41.840 | in the possibilities of evolution,
00:44:43.400 | but it doesn't happen overnight.
00:44:44.520 | It's gonna still require multiple, multiple generations.
00:44:47.640 | So it could take millions of years.
00:44:50.880 | It could take shorter times.
00:44:52.160 | There's another thing,
00:44:53.000 | I would like to put the number of steps
00:44:54.040 | and try and work out what's required at each step.
00:44:56.120 | And we are trying to do that with sex, for example.
00:44:58.480 | You can't have a very large genome
00:45:00.760 | unless you have sex at that point.
00:45:02.200 | So what are the changes to go from bacterial recombination
00:45:05.240 | to eukaryotic recombination?
00:45:07.040 | What do you need to do?
00:45:09.480 | Why do we go from passing around bits of DNA
00:45:12.280 | as if it's loose change to fusing cells together,
00:45:15.280 | lining up the chromosomes,
00:45:16.520 | recombining across the chromosomes,
00:45:18.560 | and then going through two rounds of cell division
00:45:20.760 | to produce your gametes?
00:45:22.320 | All eukaryotes do it that way.
00:45:24.400 | So again, why switch?
00:45:27.360 | What are the drivers here?
00:45:28.720 | So there's a lot of time, there's a lot of evolution,
00:45:31.360 | but as soon as you've got cells living inside another cell,
00:45:34.080 | what you've got is a new design.
00:45:36.320 | You've got new potential that you didn't have before.
00:45:39.080 | - So the cell living inside another cell,
00:45:42.080 | that design allows for better storage of information,
00:45:47.080 | better use of energy, more delegation,
00:45:52.720 | like a hierarchical control of the whole thing.
00:45:55.360 | And then somehow that leads to ability
00:45:58.280 | to have multi-cell organisms.
00:46:00.120 | - I'm not sure that you have hierarchical control,
00:46:02.560 | necessarily, but you've got a system
00:46:04.640 | where you can have a much larger information storage depot
00:46:09.320 | in the nucleus, you can have a much larger genome.
00:46:11.680 | And that allows multi-cellularity, yes,
00:46:13.600 | because it allows you, it's a funny thing,
00:46:18.600 | to have an animal where I have 70% of my genes
00:46:23.720 | switched on in my brain,
00:46:25.240 | and a different 50% switched on in my liver or something,
00:46:28.480 | you've got to have all those genes in the egg cell
00:46:30.840 | at the very beginning,
00:46:31.760 | and you've got to have a program of development
00:46:35.480 | which says, okay, you guys switch off those genes
00:46:37.880 | and switch on those genes, and you guys, you do that.
00:46:40.280 | But all the genes are there at the beginning.
00:46:42.160 | That means you've got to have a lot of genes in one cell,
00:46:44.080 | and you've got to be able to maintain them.
00:46:45.600 | And the problem with bacteria is they don't get close
00:46:47.640 | to having enough genes in one cell.
00:46:49.880 | So if you were to try and make a multi-cellular organism
00:46:52.760 | from bacteria, you'd bring different types
00:46:54.480 | of bacteria together and hope they'll cooperate.
00:46:56.520 | And the reality is they don't.
00:46:57.920 | - That's really, really tough to do.
00:46:59.520 | - Yeah. - Common internal.
00:47:00.360 | - No, they don't because it doesn't exist.
00:47:02.720 | - We'll have the data, as far as we know.
00:47:04.600 | I'm sure there's a few special ones
00:47:06.440 | and they die off quickly.
00:47:08.200 | I'd love to know some of the most fun things
00:47:10.000 | bacteria have done since.
00:47:11.780 | - Oh, there's a few.
00:47:13.520 | I mean, they can do some pretty funky things.
00:47:15.480 | (laughing)
00:47:16.320 | These are broad brushstrokes that I'm talking about.
00:47:18.320 | - Yes. - But it's, yeah.
00:47:19.240 | - Generally speaking.
00:47:21.080 | So how was, so another fun invention.
00:47:25.080 | Us humans seem to utilize it well,
00:47:27.840 | but you say it's also very important early on is sex.
00:47:31.560 | So what is sex?
00:47:34.560 | Just asking for a friend.
00:47:36.400 | And when was it invented and how hard was it to invent,
00:47:39.280 | just as you were saying, and why was it invented?
00:47:42.360 | Why, how hard was it, and when?
00:47:45.680 | - I have a PhD student who's been working on this
00:47:47.960 | and we've just published a couple of papers on sex.
00:47:49.960 | Yes, yes, yes.
00:47:50.880 | - Where do you publish these?
00:47:51.720 | Does biology, is it biology, genetics, journals?
00:47:55.520 | - This is actually PNAS,
00:47:57.240 | which is Proceedings of the National Academy.
00:47:59.840 | - So like broad, big, big pictures.
00:48:02.200 | - Everyone's interested in sex.
00:48:03.520 | - Yeah. (laughing)
00:48:04.840 | - The job of biologists is to make sex dull.
00:48:07.400 | - Yeah, that's a beautiful way to put it.
00:48:10.680 | Okay, so when was it invented?
00:48:13.240 | - It was invented with eukaryotes
00:48:14.560 | about two billion years ago.
00:48:15.960 | All eukaryotes share the same basic mechanism
00:48:20.920 | that you produce gametes.
00:48:21.960 | The gametes fuse together,
00:48:23.440 | so a gamete is the egg cell and the sperm.
00:48:26.280 | They're not necessarily even different in size or shape.
00:48:29.560 | So the simplest eukaryotes produce
00:48:31.840 | what are called motile gametes.
00:48:32.960 | They're all like sperm and they all swim around.
00:48:34.880 | They find each other, they fuse together.
00:48:36.400 | They don't have kind of much going on there beyond that.
00:48:39.920 | And then these are haploid,
00:48:43.160 | which is to say we all have two copies of our genome,
00:48:46.080 | and the gametes have only a single copy of the genome.
00:48:49.200 | So when they fuse together, you now become diploid again,
00:48:51.960 | which is to say you now have two copies of your genome.
00:48:55.080 | And what you do is you line them all up
00:48:57.800 | and then you double everything.
00:49:01.640 | So now we have four copies of the complete genome.
00:49:03.920 | And then we crisscross between all of these things.
00:49:06.000 | So we take a bit from here and stick it on there
00:49:07.640 | and a bit from here and we stick it on here.
00:49:09.600 | That's recombination.
00:49:10.800 | And then we go through two rounds of cell division.
00:49:14.840 | So we divide in half.
00:49:15.920 | So now the two daughter cells have two copies
00:49:18.000 | and we divide in half again.
00:49:19.440 | Now we have some gametes,
00:49:21.240 | each of which has got a single copy of the genome.
00:49:24.480 | And that's the basic ground plan
00:49:26.680 | for what's called meiosis and syngamy.
00:49:29.800 | That's basically sex.
00:49:31.400 | And it happens at the level of single-celled organisms
00:49:33.920 | and it happens pretty much the same way in plants
00:49:35.800 | and pretty much the same way in animals and so on.
00:49:38.160 | And it's not found in any bacteria.
00:49:40.240 | They switch things around using the same machinery
00:49:43.120 | and they take up a bit of DNA from the environment.
00:49:44.960 | They take out this bit and stick in that bit
00:49:46.640 | and it's the same molecular machinery
00:49:48.720 | they're using to do it.
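The ground plan described here, duplicate the genome, recombine, divide twice, then fuse gametes, can be sketched as a toy simulation. The gene strings, the single crossover, and the labels below are simplified assumptions for illustration, not biology's actual mechanics:

```python
import random

def meiosis(diploid):
    """diploid: a list of two chromosome copies (strings of gene variants)."""
    copies = list(diploid) * 2                   # duplicate: two copies -> four
    a, b = random.sample(range(4), 2)            # recombination: crisscross a
    point = random.randrange(1, len(copies[a]))  # segment between two copies
    copies[a], copies[b] = (copies[a][:point] + copies[b][point:],
                            copies[b][:point] + copies[a][point:])
    random.shuffle(copies)
    # two rounds of cell division: the four copies end up one per gamete
    return copies

def syngamy(gamete_1, gamete_2):
    return [gamete_1, gamete_2]                  # fusion restores two copies

mum, dad = ["AAAA", "aaaa"], ["BBBB", "bbbb"]
child = syngamy(random.choice(meiosis(mum)), random.choice(meiosis(dad)))
print(child)   # diploid again: one (possibly recombined) copy from each parent
```

The point of the sketch is the bookkeeping: ploidy goes 2n, 4n, then back down to n in the gametes, and fusion restores 2n, which is the cycle shared by single-celled eukaryotes, plants, and animals alike.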
00:49:50.040 | - So what about the kind of, you said, find each other,
00:49:52.680 | this kind of imperative to find each other?
00:49:56.040 | What is that?
00:49:57.320 | Like, is that--
00:49:58.280 | - Well, you've got a few cells together.
00:50:00.680 | So the bottom line on all of this is bacteria,
00:50:04.440 | I mean, it's kind of simple when you've figured it out
00:50:07.840 | and figuring it out, this is not me,
00:50:09.300 | this is my PhD student, Marco Colnaghi.
00:50:11.700 | And in effect, if you're doing lateral gene transfer:
00:50:16.440 | say you're an E. coli cell.
00:50:18.340 | You've got 4,000 genes.
00:50:19.740 | You wanna scale up to a eukaryotic size.
00:50:22.920 | I wanna have 20,000 genes.
00:50:25.400 | And I need to maintain my genome
00:50:27.760 | so it doesn't get shot to pieces by mutations.
00:50:30.480 | And I'm gonna do it by lateral gene transfer.
00:50:32.720 | So I know I've got a mutation in a gene.
00:50:35.440 | I don't know which gene it is 'cause I'm not sentient,
00:50:38.840 | but I know I can't grow.
00:50:40.240 | I know all my regulation systems are saying,
00:50:42.480 | something wrong here, something wrong.
00:50:43.760 | Pick up some DNA.
00:50:45.000 | Pick up a bit of DNA from the environment.
00:50:47.720 | If you've got a small genome,
00:50:49.080 | the chances of you picking up the right bit of DNA
00:50:50.980 | from the environment is much higher
00:50:52.340 | than if you've got a genome of 20,000 genes.
00:50:54.780 | To do that, you've effectively gotta be picking up DNA
00:50:58.060 | all the time, all day long and nothing else,
00:51:00.500 | and you're still gonna get the wrong DNA.
00:51:02.220 | You've gotta pick up large chunks,
00:51:03.720 | and in the end, you've gotta align them.
00:51:05.100 | You're forced into sex, to coin a phrase.
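A back-of-envelope version of that argument (not the published model, just the scaling intuition): if one gene out of N is broken and each lateral gene transfer pulls in one random gene, repairs take on the order of N pickups, so the burden grows with genome size:

```python
import random

def pickups_until_repair(genome_size, rng):
    """Count random single-gene pickups until the one broken gene is replaced."""
    broken = rng.randrange(genome_size)
    count = 0
    while True:
        count += 1
        if rng.randrange(genome_size) == broken:   # picked up the right gene
            return count

for n in (4_000, 20_000):
    trials = [pickups_until_repair(n, random.Random(seed)) for seed in range(200)]
    print(f"{n:>6} genes: ~{sum(trials) / len(trials):,.0f} pickups per repair")
```

The expected repair cost scales linearly with gene count, which is the sense in which a 20,000-gene cell is "forced into sex": picking up random DNA all day long stops being affordable, and aligning and recombining whole genomes becomes the only way to maintain them.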
00:51:08.940 | So you're-
00:51:11.040 | - You're forced.
00:51:13.420 | So there is a kind of incentive.
00:51:18.420 | - If you wanna have a large genome,
00:51:20.080 | you've gotta prevent it mutating to nothing.
00:51:22.600 | That will happen with bacteria.
00:51:23.960 | So there's another reason why bacteria
00:51:25.640 | can't have a large genome.
00:51:26.940 | But as soon as you give them the power pack,
00:51:28.680 | as soon as you give eukaryotic cells the power pack
00:51:30.360 | that allows them to increase the size of their genome,
00:51:33.000 | then you face the pressure
00:51:34.360 | that you've gotta maintain its quality.
00:51:36.320 | You've gotta stop it just mutating away.
00:51:38.340 | - What about sexual selection?
00:51:39.860 | So the finding, like, "I don't like this one.
00:51:44.660 | "I don't like this one.
00:51:45.660 | "This one seems all right."
00:51:47.060 | At which point does it become less random?
00:51:52.780 | - It's hard to know.
00:51:54.220 | - 'Cause eukaryotes just kind of float around.
00:51:56.020 | They just kind of have-
00:51:57.060 | - Yeah, I mean, is there sexual selection
00:51:59.100 | in single-celled eukaryotes?
00:52:00.340 | There probably is, it's just that
00:52:01.340 | I don't know very much about it.
00:52:02.980 | By the time we get onto-
00:52:03.820 | - You don't hang out with the eukaryotes.
00:52:05.740 | - Well, I do all the time, but I don't know.
00:52:07.460 | - But you can't communicate with them yet.
00:52:09.820 | - Peacock or something.
00:52:11.020 | - Yes.
00:52:11.860 | - The kind of standard, this is not quite what I work on,
00:52:15.560 | but the standard answer is that it's female mate choice.
00:52:19.820 | She is looking for good genes.
00:52:21.940 | And if you can have a tail that's like this
00:52:25.660 | and still survive, still be alive,
00:52:28.120 | not actually have been taken down by the nearest predator,
00:52:30.420 | then you must've got pretty good genes
00:52:31.780 | 'cause despite this handicap, you're able to survive.
00:52:36.480 | - So those are like human interpretable things
00:52:38.340 | like with a peacock, but I wonder,
00:52:40.580 | I'm sure echoes of the same thing
00:52:43.000 | are there with more primitive organisms.
00:52:46.500 | Basically, your PR, like how you advertise yourself
00:52:51.180 | that you're worthy of-
00:52:54.060 | - Absolutely.
00:52:54.900 | - So one big advertisement is the fact
00:52:56.420 | that you survived it all.
00:52:58.220 | - Let me give you one beautiful example of an algal bloom.
00:53:03.020 | And this can be a cyanobacteria,
00:53:05.540 | this can be a bacteria.
00:53:07.060 | So if suddenly you pump nitrate or phosphate
00:53:10.740 | or something into the ocean and everything goes green,
00:53:13.300 | you end up with all this algae growing there.
00:53:16.980 | A viral infection or something like that
00:53:20.820 | can kill the entire bloom overnight.
00:53:23.320 | And it's not that the virus takes out everything overnight,
00:53:26.820 | it's that most of the cells in that bloom kill themselves
00:53:29.660 | before the virus can get onto them.
00:53:31.860 | And it's through a form of cell death
00:53:33.700 | called programmed cell death.
00:53:34.980 | And we do the same thing, it's how we have
00:53:37.980 | the gaps between our fingers and so on,
00:53:39.940 | is how we craft synapses in the brain.
00:53:42.260 | It's fundamental again to multicellular life.
00:53:47.420 | They have the same machinery in these algal blooms.
00:53:51.220 | How do they know who dies?
00:53:52.900 | The answer is they will often put out a toxin.
00:53:56.740 | And that toxin is a kind of a challenge to you.
00:54:00.300 | Either you can cope with the toxin or you can't.
00:54:03.420 | If you can cope with it, you form a spore
00:54:06.460 | and you will go on to become the next generation.
00:54:09.100 | You'll form a kind of a resistance spore.
00:54:11.940 | You sink down a little bit, you get out of the way,
00:54:14.500 | you can't be attacked by a virus if you're a spore,
00:54:18.220 | or at least not so easily.
00:54:19.660 | Whereas if you can't deal with that toxin,
00:54:21.900 | you pull the plug and you trigger your death apparatus
00:54:25.700 | and you kill yourself.
00:54:27.060 | - Wow, so it's truly life and death.
00:54:29.140 | - Yeah, so it's really, it's a challenge.
00:54:31.620 | And this is a bit like sexual selection.
00:54:33.780 | It's not so, they're all pretty much genetically identical,
00:54:36.960 | but they've had different life histories.
00:54:39.020 | So have you had a tough day?
00:54:41.420 | Did you happen to get infected by this virus
00:54:44.460 | or did you run out of iron
00:54:45.540 | or did you get a bit too much sun?
00:54:47.460 | Whatever it may be, if this extra stress of the toxin
00:54:51.180 | just pushes you over the edge,
00:54:52.820 | then you have this binary choice.
00:54:53.980 | Either you're the next generation
00:54:55.180 | or you kill yourself now using the same machinery.
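A toy version of that challenge (illustrative only; the stress scale, toxin dose, and threshold are invented numbers): genetically identical cells differ in accumulated life-history stress, the toxin adds a fixed extra dose, and each cell makes the binary choice, sporulate or self-destruct:

```python
import random

def bloom_challenge(n_cells, toxin_stress, threshold, seed=1):
    """Return (spores, deaths) after a toxin challenge sweeps the bloom."""
    rng = random.Random(seed)
    spores = deaths = 0
    for _ in range(n_cells):
        life_history_stress = rng.uniform(0, 1)   # tough day vs. easy day
        if life_history_stress + toxin_stress < threshold:
            spores += 1     # copes: forms a resistant spore, sinks down
        else:
            deaths += 1     # can't cope: triggers programmed cell death
    return spores, deaths

spores, deaths = bloom_challenge(10_000, toxin_stress=0.6, threshold=1.0)
print(f"{spores} spores seed the next generation; {deaths} cells self-destruct")
```

With these made-up numbers most of the bloom dies overnight, which matches the observation in the conversation: it's not the virus taking out every cell, it's the cells themselves pulling the plug.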
00:54:57.920 | - It's also actually exactly the way I approach dating,
00:55:00.660 | but that's probably why I'm single.
00:55:03.220 | Okay, what about if we can step back, DNA?
00:55:07.380 | Just mechanism of storing information.
00:55:10.460 | RNA, DNA, how big of an invention was that?
00:55:13.460 | That seems to be, that seems to be fundamental
00:55:16.180 | to something deep within what life is
00:55:21.180 | is the ability, as you said,
00:55:24.060 | to kind of store and propagate information.
00:55:28.000 | But then you also kind of inferred that
00:55:29.900 | with your and your students' work
00:55:31.580 | that there's a deep connection between the chemistry
00:55:35.140 | and the ability to have this kind of genetic information.
00:55:39.020 | So how big of an invention is it to have
00:55:42.220 | a nice representation, a nice hard drive for info
00:55:45.500 | to pass on?
00:55:46.340 | - Huge, I suspect.
00:55:47.900 | I mean, but when I was talking about the code,
00:55:50.540 | you see the code in RNA as well.
00:55:52.580 | And RNA almost certainly came first.
00:55:55.020 | And there's been an idea going back decades
00:55:58.560 | called the RNA world, because RNA in theory
00:56:01.140 | can copy itself and can catalyze reactions.
00:56:04.900 | So it kind of cuts out this chicken and egg loop.
00:56:07.820 | - So DNA, it's possible, is not that special.
00:56:09.900 | - So RNA, RNA is the thing that does the work, really.
00:56:13.540 | And the code lies in RNA.
00:56:15.300 | The code lies in the interactions
00:56:16.860 | between RNA and amino acids.
00:56:18.300 | And it still is there today in the ribosome, for example,
00:56:21.660 | which is just kind of a giant ribozyme,
00:56:23.780 | which is to say it's an enzyme that's made of RNA.
00:56:28.180 | So getting to RNA, I suspect is probably not that hard,
00:56:33.180 | but getting from RNA, how do you,
00:56:37.860 | there's multiple different types of RNA now.
00:56:39.820 | How do you distinguish?
00:56:42.380 | This is something we're actively thinking about.
00:56:43.820 | How do you distinguish between a random population of RNA,
00:56:47.180 | as some of them go on to become messenger RNA,
00:56:50.300 | this is the transcript of the code,
00:56:52.500 | of the gene that you want to make.
00:56:54.220 | Some of them become transfer RNA,
00:56:56.860 | which is kind of the unit that holds the amino acid
00:56:59.860 | that's going to be polymerized.
00:57:01.700 | Some of them become ribosomal RNA,
00:57:04.260 | which is the machine which is joining them all up together.
00:57:07.500 | How do they discriminate themselves?
00:57:10.100 | And is some kind of phase transition going on there?
00:57:12.820 | What's, I don't know.
00:57:14.180 | It's a difficult question.
00:57:16.020 | And we're now in the region of biology
00:57:18.260 | where information is coming in.
00:57:19.660 | But the thing about RNA is it's very, very good at what it does.
00:57:22.900 | But the largest genomes supported by RNA are RNA viruses,
00:57:26.940 | like HIV, for example.
00:57:28.820 | They're pretty small.
00:57:29.860 | And so there's a limit to how complex life could be
00:57:34.580 | unless you come up with DNA,
00:57:36.300 | which chemically is a really small change.
00:57:39.180 | But how easy it is to make that change,
00:57:41.740 | I don't really know.
00:57:42.580 | As soon as you've got DNA,
00:57:43.940 | then you've got an amazingly stable molecule
00:57:46.580 | for information storage.
00:57:48.380 | And you can do absolutely anything.
00:57:50.540 | But how likely that transition from RNA to DNA was,
00:57:53.220 | I don't know either.
00:57:54.420 | - How much possibility is there for variety
00:57:56.860 | in ways to store information?
00:58:00.380 | 'Cause it seems to be very,
00:58:01.460 | there's specific characteristics
00:58:03.020 | about the programming language of DNA.
00:58:06.580 | - Yeah, there's a lot of work going on
00:58:08.300 | on what's called xeno-DNA or RNA.
00:58:11.860 | Can we replace the bases themselves,
00:58:15.220 | the letters, if you like, in RNA or DNA?
00:58:18.300 | Can we replace the backbone?
00:58:19.820 | Can we replace, for example, phosphate with arsenate?
00:58:23.180 | Can we replace the sugar ribose or deoxyribose
00:58:25.860 | with a different sugar?
00:58:26.700 | And the answer is yes, you can.
00:58:28.220 | Within limits, there's not an infinite space there.
00:58:34.140 | Arsenate doesn't really work;
00:58:35.980 | the bonds are not as strong as phosphate.
00:58:37.900 | It's probably quite hard to replace phosphate.
00:58:40.140 | It's possible to do it.
00:58:43.380 | The question to me is, why is it this way?
00:58:47.500 | Is it because there was some form of selection
00:58:50.220 | that this is better than the other forms
00:58:52.060 | and there were lots of competing forms
00:58:53.620 | of information storage early on
00:58:55.060 | and this one was the one that worked out?
00:58:56.780 | Or was it kind of channeled that way,
00:58:58.500 | that these are the molecules that you're dealing with
00:59:01.100 | and they work?
00:59:03.940 | And I'm increasingly thinking it's that way,
00:59:05.780 | that we're channeled towards ribose, phosphate
00:59:08.820 | and the bases that are used.
00:59:11.900 | But there are 200 different letters kicking around out there
00:59:15.660 | that could have been used.
00:59:17.100 | - It's such an interesting question.
00:59:18.220 | If you look at in the programming world in computer science,
00:59:21.780 | there's a programming language called JavaScript,
00:59:24.140 | which was written super quickly.
00:59:26.300 | It's a giant mess, but it took over the world.
00:59:28.940 | - Sounds very biological.
00:59:31.260 | - It was kind of a running joke that like,
00:59:34.260 | surely this can't be, this is a terrible programming language.
00:59:39.500 | It's a giant mess.
00:59:40.340 | It's full of bugs.
00:59:41.700 | It's so easy to write really crappy code,
00:59:44.020 | but it took over all of front end development
00:59:47.980 | in the web browser.
00:59:49.260 | If you have any kind of dynamic interactive website,
00:59:52.660 | it's usually running JavaScript
00:59:54.860 | and it's now taking over much of the backend,
00:59:57.740 | which is like the serious heavy duty computational stuff
01:00:00.860 | and it's become super fast
01:00:02.620 | with the different compilation engines that are running it.
01:00:06.460 | So it's like, it really took over the world.
01:00:08.020 | It's very possible that this initially crappy,
01:00:12.460 | derided language actually takes everything over.
01:00:14.860 | And then the question is,
01:00:16.220 | did human civilization always strive towards JavaScript?
01:00:22.300 | Or was JavaScript just the first programming language
01:00:25.140 | that ran in the browser and still sticky?
01:00:27.220 | The first is the sticky one.
01:00:29.860 | And so it wins over anything else because it was first.
01:00:32.700 | And I don't think that's answerable, right?
01:00:34.620 | But it's good to ask that.
01:00:37.140 | I suppose in the lab,
01:00:39.780 | you can't run it with programming languages,
01:00:43.380 | but in biology you can probably
01:00:45.180 | do some kind of small scale evolutionary test
01:00:51.460 | to try to infer which is which.
01:00:54.740 | - Yeah, I mean, in a way,
01:00:55.980 | we've got the hardware and the software here.
01:00:58.620 | And the hardware is maybe the DNA and the RNA itself.
01:01:02.860 | And then the software perhaps is more about the code.
01:01:06.220 | Did the code have to be this way?
01:01:07.420 | Could it have been a different way?
01:01:08.540 | - Yeah.
01:01:09.380 | - People talk about the optimization of the code
01:01:11.420 | and there's some suggestion for that.
01:01:14.020 | I think it's weak actually.
01:01:16.060 | But you could imagine,
01:01:17.020 | you can come out with a million different codes
01:01:18.860 | and this would be one of the best ones.
01:01:20.820 | - Well, we don't know this.
01:01:24.300 | - Well, people have tried to model it
01:01:27.380 | based on the effect that mutations would have.
01:01:30.220 | So no, you're right, we don't know it
01:01:32.420 | because that's the single assumption
01:01:34.100 | that a mutation is what's being selected on there.
01:01:37.740 | And there's other possibilities too.
01:01:39.300 | - I mean, there does seem to be a resilience
01:01:41.140 | and a redundancy to the whole thing.
01:01:43.180 | It's hard to mess up.
01:01:45.300 | And the way you mess it up
01:01:47.620 | often is likely to produce interesting results.
01:01:51.540 | So it's--
01:01:52.700 | - Are you talking about JavaScript or the genetic code now?
01:01:55.420 | Yeah, well, I mean, it's almost,
01:01:57.780 | biology is underpinned by this kind of mess as well.
01:02:00.580 | And you look at the human genome
01:02:01.660 | and it's full of stuff that is really either broken
01:02:05.300 | or dysfunctional or was a virus once,
01:02:07.300 | whatever it may be, and somehow it works.
01:02:09.260 | And maybe we need a lot of this mess.
01:02:11.660 | We know that some functional genes are taken from this mess.
01:02:15.380 | - So what about, you mentioned the predatory behavior.
01:02:19.580 | - Yeah.
01:02:20.420 | - We talked about sex.
01:02:21.460 | What about violence?
01:02:22.780 | Predator and prey dynamics.
01:02:24.720 | How, when was that invented?
01:02:28.500 | And poetic and biological ways of putting it,
01:02:33.940 | how do you describe predator-prey relationship?
01:02:37.420 | Is it a beautiful dance or is it a violent atrocity?
01:02:41.840 | - Well, I guess it's both, isn't it?
01:02:44.380 | I mean, when does it start?
01:02:45.420 | It starts in bacteria.
01:02:46.940 | You see these amazing predators.
01:02:49.340 | Bdellovibrio is one that Lynn Margulis
01:02:51.700 | used to talk about a lot.
01:02:53.020 | It's got a kind of a drill piece
01:02:55.640 | that drills through the wall
01:02:57.300 | and the membrane of the bacterium,
01:02:58.740 | and then it effectively eats the bacterium
01:03:00.580 | from just inside the periplasmic space.
01:03:03.580 | And makes copies of itself that way.
01:03:04.980 | So that's straight predation.
01:03:06.380 | There are predators among bacteria.
01:03:08.380 | - So predation in that, sorry to interrupt,
01:03:10.300 | means you murder somebody
01:03:12.620 | and use their body as a resource in some way.
01:03:17.620 | - Yeah.
01:03:18.460 | - But it's not parasitic in that
01:03:21.460 | you need them to be still alive?
01:03:23.420 | - No, no.
01:03:24.700 | I mean, predation is you kill them, really.
01:03:26.420 | - Murder.
01:03:27.260 | - Parasites, you kind of live on them.
01:03:30.340 | - Okay.
01:03:31.300 | But it seems the predator is the really popular tool.
01:03:35.620 | - So what we see if we go back 560, 570 million years
01:03:40.620 | before the Cambrian explosion,
01:03:44.260 | there is what's known as the Ediacaran fauna,
01:03:48.660 | or sometimes they call Vendobionts,
01:03:50.260 | which is a lovely name.
01:03:51.820 | And it's not obvious that they're animals at all.
01:03:55.740 | They're stalked things.
01:03:56.740 | They often have fronds that look a lot like leaves
01:03:59.260 | with kind of fractal branching patterns on them.
01:04:02.020 | And the thing is they're found,
01:04:06.660 | sometimes geologists can figure out the environment
01:04:10.580 | that they were in and say,
01:04:11.900 | this is more than 200 meters deep
01:04:13.500 | because there's no sign of any waves.
01:04:15.500 | There's no storm damage down here, this kind of thing.
01:04:19.540 | They were more than 200 meters deep,
01:04:20.780 | so they're definitely not photosynthetic.
01:04:22.860 | These are animals.
01:04:24.660 | And they're filter feeders.
01:04:26.420 | And we know, you know, sponges and corals and things
01:04:28.860 | are filter feeding animals.
01:04:30.100 | They're stuck to the spot.
01:04:31.500 | And little bits of carbon that come their way,
01:04:33.860 | they filter it out and that's what they're eating.
01:04:36.420 | So no predation involved in this,
01:04:39.500 | beyond stuff that just dies anyway.
01:04:41.620 | And it feels like a very gentle, rather beautiful,
01:04:44.340 | rather limited world, you might say.
01:04:46.660 | There's not a lot going on there.
01:04:48.420 | And something changes.
01:04:52.740 | Oxygen definitely changes during this period.
01:04:54.860 | Other things may have changed as well.
01:04:56.300 | But the next thing you really see in the fossil record
01:04:59.100 | is the Cambrian explosion.
01:05:01.260 | And what do we see there?
01:05:02.740 | We're now seeing animals that we would recognize.
01:05:05.420 | They've got eyes, they've got claws, they've got shells.
01:05:08.460 | They're, you know, they're plainly killing things
01:05:10.740 | or running away and hiding.
01:05:15.060 | And so we've gone from a rather gentle but limited world
01:05:19.260 | to a rather vicious, unpleasant world that we recognize,
01:05:24.580 | and which leads to kind of arms races,
01:05:28.740 | evolutionary arms races,
01:05:30.340 | which again is something that when we think
01:05:32.980 | about a nuclear arms race, we think,
01:05:34.420 | Jesus, we don't wanna go there.
01:05:36.100 | It's not done anybody any good.
01:05:38.540 | In some ways, maybe it does do good.
01:05:41.140 | I don't wanna make an argument for nuclear arms.
01:05:43.860 | But predation as a mechanism forces organisms
01:05:48.860 | to adapt to change, to be better to escape or to kill.
01:05:54.100 | If you need to eat, then you've got to eat.
01:05:55.940 | And a cheetah's not gonna run at that speed
01:05:57.940 | unless it has to because the zebra is capable of escaping.
01:06:02.940 | So it leads to much greater feats of evolution
01:06:07.820 | than would ever have been possible without it,
01:06:09.900 | and in the end to a much more beautiful world.
01:06:12.660 | And so it's not all bad by any means.
01:06:17.660 | But the thing is, you can't have this
01:06:19.340 | if you don't have an oxygenated planet,
01:06:21.100 | because it's all in the end,
01:06:22.780 | it's about how much energy can you extract
01:06:24.820 | from the food you eat.
01:06:26.660 | And if you don't have an oxygenated planet,
01:06:28.380 | you can get about 10% out, not much more than that.
01:06:32.220 | And if you've got an oxygenated planet,
01:06:34.140 | you can get about 40% out.
01:06:35.940 | And that means you can have,
01:06:37.060 | instead of having one or two trophic levels,
01:06:40.460 | you can have five or six trophic levels.
01:06:42.780 | And that means things can eat things
01:06:44.340 | that eat other things and so on.
01:06:45.700 | And you've gone to a level of ecological complexity,
01:06:48.900 | which is completely impossible in the absence of oxygen.
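The arithmetic behind those trophic-level numbers can be made explicit. The ~10% and ~40% transfer efficiencies come from the conversation; the 2% cutoff for what still counts as a viable level is an illustrative assumption:

```python
def trophic_levels(efficiency, floor=0.02):
    """Count levels that still receive at least `floor` of primary production."""
    energy, levels = 1.0, 0
    while energy >= floor:
        levels += 1
        energy *= efficiency   # each level passes on this fraction of its food
    return levels

print("without oxygen (10% transfer):", trophic_levels(0.10), "levels")
print("with oxygen    (40% transfer):", trophic_levels(0.40), "levels")
```

With those inputs the anaerobic world supports about two levels and the aerobic world about five, which is the jump from "nothing eats anything that eats anything" to the ecological complexity of the Cambrian.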
01:06:51.700 | - This reminds me of the Hunter S. Thompson quote,
01:06:54.340 | that for every moment of triumph,
01:06:56.820 | for every instance of beauty,
01:06:58.780 | many souls must be trampled.
01:07:00.900 | The history of life on Earth, unfortunately,
01:07:05.060 | is that of violence.
01:07:08.940 | Just the trillions and trillions
01:07:11.180 | of multi-cell organisms that were murdered
01:07:15.100 | in the struggle for survival.
01:07:17.020 | - It's a sorry statement, but yes, it's basically true.
01:07:20.340 | And that somehow is a catalyst
01:07:23.940 | from an evolutionary perspective for creativity,
01:07:26.220 | for creating more and more complex organisms
01:07:28.860 | that are better and better at surviving.
01:07:30.180 | - I mean, survival of the fittest,
01:07:32.140 | if you just go back to that old phrase,
01:07:33.540 | means death of the weakest.
01:07:36.260 | Now, what's fit, what's weak,
01:07:38.460 | these are terms that don't have much intrinsic meaning.
01:07:41.380 | But the thing is, evolution only happens because of death.
01:07:45.260 | - One way to die is the constraints,
01:07:49.020 | the scarcity of the resources in the environment.
01:07:52.180 | But that seems to be not nearly as good of a mechanism
01:07:56.300 | for death than other creatures roaming about
01:08:00.220 | in the environment.
01:08:01.500 | When I say environment, I mean like the static environment.
01:08:04.140 | But then there's the dynamic environment
01:08:05.860 | of bigger things trying to eat you
01:08:08.380 | and use you for your energy.
01:08:10.580 | - It forces you to come up with a solution
01:08:13.300 | to your specific problem that is inventive
01:08:16.940 | and is new and hasn't been done before.
01:08:18.820 | And so it forces, I mean, literally change,
01:08:22.580 | literally evolution on populations.
01:08:25.980 | They have to become different.
01:08:27.380 | - And it's interesting that humans have channeled that
01:08:30.900 | into more, I mean, I guess what humans are doing
01:08:34.220 | is they're inventing more productive
01:08:37.780 | and safe ways of doing that.
01:08:39.900 | You know, this whole idea of morality
01:08:41.500 | and all those kinds of things.
01:08:43.300 | I think they ultimately lead to competition
01:08:48.340 | versus violence.
01:08:49.940 | 'Cause I think violence can have a cold,
01:08:53.780 | brutal, inefficient aspect to it.
01:08:56.540 | But if you channel that into more controlled competition
01:09:01.220 | in the space of ideas, in the space of approaches to life,
01:09:05.460 | maybe you can be even more productive than evolution is.
01:09:10.260 | 'Cause evolution is very wasteful.
01:09:12.220 | Like the amount of murder required
01:09:14.700 | to really test a good idea,
01:09:16.580 | genetically speaking, is just a lot.
01:09:18.900 | - Yeah.
01:09:19.740 | - Many, many, many generations.
01:09:21.100 | - Morally, we cannot base society
01:09:24.580 | on the way that evolution works.
01:09:26.340 | - That's an invention, right?
01:09:27.300 | - But actually, in some respects we do,
01:09:29.620 | which is to say, this is how science works.
01:09:31.460 | We have competing hypotheses that have to get better,
01:09:33.980 | otherwise they die.
01:09:35.380 | It's the way that society works.
01:09:36.660 | You know, in ancient Greece,
01:09:38.540 | we had the Athens and Sparta and city states,
01:09:42.580 | and then we had the Renaissance and nation states.
01:09:45.420 | And universities compete with each other.
01:09:47.980 | - Yes.
01:09:48.820 | - Tremendous amount of companies
01:09:50.140 | competing with each other all the time.
01:09:51.740 | It drives innovation.
01:09:53.700 | And if we want to do it without all the death
01:09:57.420 | that we see in nature,
01:09:59.060 | then we have to have some kind of societal level control
01:10:03.380 | that says, well, there's some limits, guys,
01:10:05.500 | and these are what the limits are gonna be.
01:10:07.260 | And society as a whole has to say,
01:10:08.780 | right, we wanna limit the amount of death here,
01:10:10.980 | so you can't do this and you can't do that.
01:10:12.780 | And who makes up these rules and how do we know?
01:10:15.060 | It's a tough thing,
01:10:16.100 | but it's basically trying to find a moral basis
01:10:19.500 | for avoiding the death of evolution and natural selection
01:10:22.980 | and keeping the innovation and the richness of it.
01:10:27.420 | - And I forgot who said it,
01:10:28.740 | but that murder is illegal, probably Kurt Vonnegut.
01:10:33.260 | Murder is illegal except when it's done
01:10:35.460 | to the sound of trumpets and at a large scale.
01:10:38.260 | So we still have wars,
01:10:41.740 | but we are struggling with this idea
01:10:44.220 | that murder is a bad thing.
01:10:47.340 | It's so interesting how we're channeling
01:10:49.820 | the best of the evolutionary imperative
01:10:53.300 | and trying to get rid of the stuff that's not productive.
01:10:58.300 | It's trying to almost accelerate evolution,
01:11:00.660 | the same kind of thing that makes evolution creative,
01:11:05.660 | we're trying to use that.
01:11:07.420 | - I think we naturally do it.
01:11:08.660 | I mean, I don't think we can help ourselves do it.
01:11:11.020 | - It's hard to know.
01:11:12.020 | - Capitalism as a form is basically about competition
01:11:15.020 | and differential rewards,
01:11:17.220 | but society and we have a,
01:11:21.740 | I keep using this word, moral obligation,
01:11:25.180 | but we cannot operate as a society if we go that way.
01:11:29.780 | It's interesting that we've had problems achieving balance.
01:11:34.780 | So for example, in the financial crash in 2009,
01:11:39.300 | do you let banks go to the wall
01:11:40.900 | or not this kind of question?
01:11:42.940 | In evolution, certainly you let them go to the wall
01:11:45.060 | and in that sense, you don't need the regulation
01:11:47.820 | because they just die.
01:11:49.580 | Whereas if we as a society think about what's required
01:11:55.260 | for society as a whole,
01:11:56.340 | then you don't necessarily let them go to the wall,
01:11:59.420 | in which case you then have to impose
01:12:01.500 | some kind of regulation that the bankers themselves
01:12:05.060 | will in an evolutionary manner exploit.
01:12:08.180 | - Yeah, we've been struggling with this kind of idea
01:12:11.380 | of capitalism, the cold brutality of capitalism
01:12:16.140 | that seems to create so much beautiful things in this world
01:12:20.340 | and then the ideals of communism
01:12:23.180 | that seem to create so much brutal destruction in history.
01:12:26.620 | We struggle with ideas of,
01:12:28.700 | well, maybe we didn't do it right,
01:12:30.540 | how can we do things better?
01:12:31.740 | And then the ideas are the things we're playing with
01:12:34.720 | as opposed to people.
01:12:35.960 | If a PhD student has a bad idea,
01:12:37.700 | we don't shoot the PhD student,
01:12:39.580 | we just criticize their idea and hope they improve it.
01:12:42.060 | - You have a very humane lab.
01:12:44.020 | - Yeah, I don't know how you guys do it.
01:12:46.580 | The way I run things, it's always life and death.
01:12:49.300 | Okay, so it is interesting about humans
01:12:52.460 | that there is an inner sense of morality
01:12:54.740 | which begs the question of how did Homo sapiens evolve?
01:13:00.620 | If we think about the invention of,
01:13:05.540 | early invention of sex and early invention of predation,
01:13:09.200 | what was the thing invented to make humans?
01:13:15.420 | What would you say?
01:13:17.180 | - I mean, I suppose a couple of things I'd say.
01:13:19.100 | Number one is you don't have to wind the clock back very far,
01:13:22.540 | five, six million years or so
01:13:24.140 | and let it run forwards again
01:13:26.620 | and the chances of humans as we know them
01:13:28.980 | is not necessarily that high.
01:13:30.500 | Imagine as an alien, you find planet Earth
01:13:34.540 | and it's got everything apart from humans on it,
01:13:36.300 | it's an amazing, wonderful, marvelous planet
01:13:39.340 | but nothing that we would recognize
01:13:41.300 | as extremely intelligent life,
01:13:43.620 | kind of space-faring civilization.
01:13:45.820 | So when we think about aliens,
01:13:46.940 | we're kind of after something like ourselves,
01:13:49.740 | we're after a space-faring civilization,
01:13:51.820 | we're not after zebras and giraffes and lions and things,
01:13:55.820 | amazing though they are.
01:13:57.700 | But the additional kind of evolutionary steps
01:14:01.280 | to go from large, complex mammals, monkeys let's say,
01:14:06.100 | to humans doesn't strike me as that long a distance,
01:14:11.100 | it's all about the brain
01:14:14.460 | and where's the brain and morality coming from?
01:14:17.020 | It seems to me to be all about groups,
01:14:19.860 | human groups and interactions between groups.
01:14:22.420 | - The collective intelligence of it.
01:14:24.260 | - Yes, the interactions really.
01:14:26.660 | And there's a guy at UCL called Mark Thomas
01:14:30.060 | who's done a lot of really beautiful work,
01:14:31.980 | I think on this kind of question,
01:14:33.580 | so I talk to him every now and then,
01:14:34.900 | so my views are influenced by him.
01:14:37.240 | But a lot seems to depend on population density,
01:14:43.060 | that the more interactions you have going on
01:14:45.500 | between different groups,
01:14:46.660 | the more transfer of information, if you like,
01:14:49.560 | between groups, people moving from one group
01:14:52.340 | to another group,
01:14:53.620 | almost like lateral gene transfer in bacteria,
01:14:56.020 | the more expertise you're able to develop
01:14:59.860 | and maintain, the more culturally complex
01:15:02.980 | your society can become.
01:15:04.460 | And groups that have become detached,
01:15:06.700 | like on Easter Island, for example,
01:15:09.220 | very often degenerate in terms of the complexity
01:15:12.460 | of their civilization.
01:15:13.580 | - Is that true for complex organisms in general?
01:15:16.100 | Population density is often productive.
01:15:19.220 | - Really matters, but in human terms,
01:15:21.200 | I don't know what the actual factors were
01:15:26.060 | that were driving a large brain,
01:15:28.640 | but you can talk about fire,
01:15:30.760 | you can talk about tool use,
01:15:32.940 | you can talk about language,
01:15:34.180 | and none of them seem to correlate especially well
01:15:36.780 | with the actual known trajectory of human evolution
01:15:39.540 | in terms of cave art and these kinds of things.
01:15:42.820 | That seems to work much better
01:15:45.140 | just with population density
01:15:47.540 | and number of interactions between different groups,
01:15:51.220 | all of which is really about human interactions,
01:15:55.820 | human-human interactions and the complexity of those.
01:15:58.620 | But population density is the thing
01:16:02.060 | that increases the number of interactions,
01:16:04.320 | but then there must have been inventions
01:16:06.700 | forced by that number of interactions
01:16:12.980 | that actually led to humans.
01:16:14.720 | So like Richard Wrangham talks about
01:16:17.460 | that it's basically the beta males
01:16:20.400 | had to beat up the alpha male.
01:16:22.080 | So that's what collaboration looks like,
01:16:23.800 | is they, when you're living together,
01:16:25.880 | they don't like, our early ancestors
01:16:30.120 | don't like the dictatorial aspect of a single individual
01:16:34.640 | at the top of a tribe.
01:16:36.080 | So they learn to collaborate,
01:16:38.540 | how to basically create a democracy of sorts,
01:16:42.840 | a democracy that prevents, minimizes,
01:16:45.240 | or lessens the amount of violence,
01:16:47.360 | which essentially gives strength to the tribe
01:16:50.680 | and makes the war between tribes versus the dictator.
01:16:55.080 | - I mean, I think one of the most wonderful things
01:16:57.280 | about humans is we're all of those things.
01:17:00.360 | I mean, we are deeply social as a species
01:17:03.720 | and we're also deeply selfish.
01:17:05.360 | And it seems to me the conflict
01:17:06.440 | between capitalism and communism,
01:17:08.380 | it's really just two aspects of human nature,
01:17:10.400 | both of which are- - We are both.
01:17:11.800 | - We are both.
01:17:12.800 | And we have a constant kind of vying between the two sides.
01:17:16.100 | We really do care about other people beyond our families,
01:17:19.440 | beyond our immediate people.
01:17:21.140 | We care about society and the society that we live in.
01:17:24.620 | And you could say that's a drawing
01:17:27.160 | towards socialism or communism.
01:17:28.520 | On the other side, we really do care about ourselves.
01:17:30.720 | We really do care about our families,
01:17:32.120 | about working for something that we gain from.
01:17:34.760 | And that's the capitalist side of it.
01:17:35.960 | They're both really deeply ingrained in human nature.
01:17:38.860 | In terms of violence and interactions between groups,
01:17:43.040 | yes, all this dynamic of,
01:17:45.920 | if you're interacting between groups,
01:17:47.160 | you can be certain that they're gonna be burning each other
01:17:50.240 | and all kinds of physical violent interactions as well,
01:17:53.360 | which will drive the kind of cleverness
01:17:56.600 | of how do you resist this?
01:17:57.840 | Let's build a tower.
01:17:58.840 | Let's, what are we gonna do to prevent being overrun
01:18:02.600 | by those marauding gangs from over there?
01:18:05.060 | And you look outside humans
01:18:08.280 | and you look at chimps and bonobos and so on,
01:18:10.640 | and they're very, very different structures to society.
01:18:13.280 | Chimps tend to have an aggressive alpha male type structure
01:18:16.640 | and bonobos, there's basically a female society
01:18:21.000 | where the males are predominantly excluded
01:18:22.880 | and only brought in at the behest of the female.
01:18:25.560 | We have a lot in common with both of those groups.
01:18:29.240 | - And there's, again, tension there.
01:18:31.080 | And probably chimps, more violence with bonobos,
01:18:33.920 | probably more sex.
01:18:35.440 | That's another tension.
01:18:36.780 | (both laughing)
01:18:39.800 | How serious do we wanna be?
01:18:42.200 | How much fun we wanna be?
01:18:44.140 | Asking for a friend again,
01:18:45.440 | what do you think happened to Neanderthals?
01:18:47.920 | What did we cheeky humans do to the Neanderthals,
01:18:52.000 | we Homo sapiens?
01:18:53.080 | Do you think we murdered them?
01:18:54.600 | How do we murder them?
01:18:56.760 | How do we out-compete them?
01:18:58.160 | Or do we out-mate them?
01:19:01.200 | - I don't know.
01:19:02.120 | I mean, I think there's unequivocal evidence
01:19:04.360 | that we mated with them.
01:19:05.980 | - We always try to mate with everything.
01:19:07.720 | - Yes, pretty much.
01:19:09.440 | There's some interesting,
01:19:10.280 | the first sequences that came along
01:19:12.040 | were in mitochondrial DNA.
01:19:14.080 | And that was back to about 2002 or thereabouts.
01:19:17.200 | What was found was that Neanderthal mitochondrial DNA
01:19:21.240 | was very different to human mitochondria.
01:19:23.360 | - Oh, that's so interesting.
01:19:24.200 | - You could do a clock on it
01:19:25.020 | and it said the divergence date
01:19:26.600 | was about 600,000 years ago or something like that.
01:19:29.200 | So not so long ago.
01:19:31.040 | And then the first full genomes were sequenced
01:19:33.960 | maybe 10 years after that.
01:19:35.720 | And they showed plenty of signs of mating between them.
01:19:39.600 | So the mitochondrial DNA effectively says no mating.
01:19:43.080 | And the nuclear genes say, yeah, lots of mating.
01:19:47.080 | But we don't know-- - How's that possible?
01:19:49.200 | So can you explain the difference
01:19:50.360 | between mitochondrial DNA and nuclear DNA?
01:19:53.840 | - I've talked before about the mitochondria,
01:19:55.560 | which are the power packs in cells.
01:19:57.400 | These are the pared-down control units, and this is their DNA.
01:20:01.400 | So it's passed on by the mother only.
01:20:03.920 | And in the egg cell,
01:20:07.140 | we might have half a million copies of mitochondrial DNA.
01:20:10.600 | There's only 37 genes left.
01:20:12.700 | And they do, it's basically the control unit
01:20:16.400 | of energy production.
01:20:17.240 | That's what it's doing.
01:20:18.480 | - It's a basic old school machine that does--
01:20:21.560 | - And it's got genes that were considered
01:20:23.320 | to be effectively trivial
01:20:24.920 | because they did a very narrowly defined job.
01:20:28.780 | But they're not trivial in the sense
01:20:30.400 | that that narrowly defined job
01:20:31.680 | is about everything to do with being alive.
01:20:34.640 | So they're much easier to sequence.
01:20:38.200 | You've got many more copies of these things
01:20:39.800 | and you can sequence them very quickly.
01:20:42.140 | But the problem is because they go down
01:20:43.820 | only the maternal line from mother to daughter,
01:20:46.360 | your mitochondrial DNA and mine is going nowhere.
01:20:49.560 | Doesn't matter any kids we have,
01:20:51.920 | they get their mother's mitochondrial DNA,
01:20:54.840 | except in very, very rare and strange circumstances.
01:20:59.040 | And so it tells a different story
01:21:02.320 | and it's not a story which is easy to reconcile always.
01:21:06.020 | And what it seems to suggest to my mind at least
01:21:10.060 | is that there was one way traffic of genes,
01:21:13.880 | probably going from humans into Neanderthals
01:21:16.560 | rather than the other way around.
01:21:18.040 | Why did the Neanderthals disappear?
01:21:19.840 | I don't know.
01:21:20.840 | I mean, I suspect that they were,
01:21:23.680 | I suspect they were probably less violent,
01:21:25.640 | less clever, less populous,
01:21:29.360 | less willing to fight.
01:21:31.640 | I don't know.
01:21:32.560 | I mean, I think it probably drove them to extinction
01:21:34.800 | at the margins of Europe.
01:21:36.040 | - And it's interesting how much,
01:21:39.100 | if we ran earth over and over again,
01:21:41.480 | how many of these branches of intelligent beings
01:21:45.400 | that have figured out some kind of
01:21:47.640 | how to leverage collective intelligence,
01:21:52.600 | which ones of them emerge, which ones of them succeed?
01:21:55.660 | Is it the more violent ones?
01:21:57.760 | Is it the more isolated ones?
01:22:01.040 | You know, like what dynamics result in more productivity?
01:22:03.760 | And I suppose we'll never know.
01:22:06.440 | The more complex the organism,
01:22:07.880 | the harder it is to run the experiment in the lab.
01:22:10.640 | - Yes.
01:22:11.480 | And in some respects, maybe it's best if we don't know.
01:22:15.200 | - Yeah, the truth might be very painful.
01:22:18.160 | What about if we actually step back
01:22:20.680 | a couple of interesting things that we humans do?
01:22:23.120 | One is object manipulation and movement.
01:22:28.880 | And of course, movement was something that was done.
01:22:32.100 | That was another big invention,
01:22:33.800 | being able to move around the environment.
01:22:36.080 | And the other one is the sensory mechanism,
01:22:39.440 | how we sense the environment.
01:22:41.080 | One of the coolest high definition ones is vision.
01:22:44.040 | How big are those inventions
01:22:47.140 | in the history of life on earth?
01:22:50.000 | - Vision, movement, I mean, again, extremely important,
01:22:55.060 | going back to the origin of animals,
01:22:56.760 | the Cambrian explosion,
01:22:57.680 | where suddenly you're seeing eyes in the fossil record.
01:23:01.080 | And you can, it's not necessarily, again,
01:23:03.760 | lots of people historically have said,
01:23:05.840 | what use is half an eye?
01:23:07.480 | And you can go in a series of steps
01:23:10.640 | from a light-sensitive spot
01:23:16.640 | to an eyeball with a lens and so on.
01:23:20.360 | If you assume no more than, I don't remember,
01:23:23.080 | this was a specific model that I have in mind,
01:23:25.120 | but it was 1% change or half a percent change
01:23:28.480 | for each generation,
01:23:29.440 | how long would it take to evolve an eye as we know it?
01:23:31.600 | And the answer is half a million years.
01:23:34.000 | It doesn't have to take long.
01:23:35.560 | That's not how evolution works.
01:23:36.840 | That's not an answer to the question.
01:23:38.640 | It just shows you can reconstruct the steps
01:23:41.680 | and you can work out roughly how it can work.
01:23:44.600 | So it's not that big a deal to evolve an eye,
01:23:48.260 | but once you have one, then there's nowhere to hide.
01:23:51.960 | And again, we're back to predator prey relationships.
01:23:55.120 | We're back to all the benefits
01:23:56.360 | that being able to see brings you.
01:23:58.320 | And if you think philosophically what bats are doing
01:24:00.760 | with the echolocation and so on, I have no idea,
01:24:04.240 | but I suspect that they form an image of the world
01:24:06.440 | in pretty much the same way that we do.
01:24:07.960 | It's just a matter of mental reconstruction.
01:24:10.120 | So I suppose the other thing about sight,
01:24:11.840 | there are single-celled organisms that have got a lens
01:24:16.840 | and a retina and a cornea and so on.
01:24:21.560 | Basically, they've got a camera-type eye in a single cell.
01:24:24.800 | They don't have a brain.
01:24:26.000 | What they understand about their world
01:24:29.640 | is impossible to say,
01:24:30.880 | but they're capable of coming up
01:24:32.640 | with the same structures to do so.
01:24:34.920 | So I suppose then is that once you've got things like eyes,
01:24:39.120 | then you have a big driving pressure
01:24:41.080 | on the central nervous system
01:24:42.400 | to figure out what it all means.
01:24:44.440 | And we come around to your other point
01:24:45.680 | about manipulation, sensory input, and so on,
01:24:47.960 | about now you have a huge requirement
01:24:52.760 | to understand what your environment is and what it means
01:24:55.040 | and how it reacts and where you should run away
01:24:57.080 | and where you should stay put.
01:24:59.160 | - Actually, on that point,
01:25:00.440 | I don't know if you know the work of Donald Hoffman,
01:25:03.760 | who uses the argument,
01:25:08.320 | the mechanism of evolution to say
01:25:12.160 | that there's not necessarily a strong evolutionary value
01:25:17.160 | to seeing the world as it is.
01:25:23.240 | So objective reality,
01:25:24.800 | that our perception actually is very different
01:25:27.800 | from what's objectively real.
01:25:29.800 | We're living inside an illusion,
01:25:32.240 | and we're basically the entire set of species on Earth,
01:25:37.240 | I think, I guess, are competing in a space
01:25:40.160 | that's an illusion that's distinct from,
01:25:41.960 | that's far away from physical reality as it is,
01:25:45.320 | as defined by physics.
01:25:46.160 | - I'm not sure it's an illusion so much as a bubble.
01:25:48.680 | I mean, we have a sensory input,
01:25:50.480 | which is a fraction of what we could have
01:25:51.920 | a sensory input on,
01:25:53.120 | and we interpret it in terms of what's useful
01:25:56.680 | for us to know to stay alive.
01:25:58.200 | So yes, it's an illusion in that sense, but-
01:26:00.880 | - So it's a subset-
01:26:01.920 | - A tree is physically there,
01:26:03.640 | and if you walk into that tree,
01:26:05.280 | it's not purely a delusion,
01:26:08.240 | there's some physical reality to it.
01:26:10.400 | - So it's a sensory slice into reality as it is,
01:26:15.120 | but because it's just a slice,
01:26:17.160 | you're missing a big picture.
01:26:18.840 | But he says that that slice
01:26:20.320 | doesn't necessarily need to be a slice.
01:26:23.080 | It could be a complete fabrication
01:26:25.800 | that's just consistent amongst the species,
01:26:28.440 | which is an interesting,
01:26:29.720 | or at least it's a humbling realization
01:26:33.520 | that our perception is limited,
01:26:37.160 | and our cognitive abilities are limited.
01:26:40.520 | And at least to me, it's argument from evolution.
01:26:44.920 | I don't know how strong that is as an argument,
01:26:49.360 | but I do think that life can exist in the mind.
01:26:54.360 | - Yes.
01:26:56.720 | - In the same way that you can do a virtual reality
01:26:58.680 | video game, and you can have a vibrant life
01:27:01.000 | inside that place, and that place is not real in some sense,
01:27:05.880 | but you can still have all the same forces of evolution,
01:27:08.880 | all the same competition, the dynamics
01:27:12.160 | between humans you can have,
01:27:13.960 | but I don't know if,
01:27:17.080 | I don't know if there's evidence for that
01:27:21.680 | being the thing that happened on Earth.
01:27:23.680 | It seems that Earth-
01:27:25.080 | - I think in either environment,
01:27:26.400 | I wouldn't deny that you could have exactly the world
01:27:29.240 | that you talk about, and it would be very difficult to,
01:27:32.040 | the idea in "Matrix" movies and so on,
01:27:36.520 | that the whole world is completely a construction,
01:27:40.400 | and we're fundamentally deluded.
01:27:43.880 | It's difficult to say that's impossible or couldn't happen,
01:27:47.640 | and certainly we construct in our minds
01:27:51.160 | what the outside world is,
01:27:52.280 | but we do it on input, and that input,
01:27:54.920 | I would hesitate to say is not real,
01:27:56.920 | because it's precisely how we do understand the world.
01:28:00.360 | We have eyes, but if you keep someone in,
01:28:04.080 | apparently this kind of thing happens,
01:28:06.040 | someone kept in a dark room for five years
01:28:08.560 | or something like that, they never see properly again,
01:28:10.960 | because the neural wiring that underpins
01:28:15.200 | how we interpret vision never developed.
01:28:18.840 | You need, when you watch a child develop,
01:28:21.240 | it walks into a table, it bangs its head on the table,
01:28:23.760 | and it hurts, and now you've got two inputs.
01:28:28.080 | You've got one pain from this sharp edge,
01:28:30.040 | and number two, you've probably,
01:28:31.680 | you've touched it and realized it's there,
01:28:33.200 | it's a sharp edge, and you've got the visual input,
01:28:34.920 | and you put the three things together and think,
01:28:36.520 | I don't wanna walk into a table again.
01:28:38.360 | So you're learning, and it's a limited reality,
01:28:42.440 | but it's a true reality,
01:28:43.560 | and if you don't learn that properly,
01:28:44.800 | then you will get eaten, you will get hit by a bus,
01:28:46.800 | you will not survive.
01:28:48.000 | And same if you're in some kind of,
01:28:53.400 | let's say, computer construction of reality.
01:28:55.920 | I'm not on my ground here, but if you construct the laws
01:28:59.200 | that this is what reality is inside this,
01:29:03.600 | then you play by those laws.
01:29:05.200 | - Yeah, well, I mean, as long as the laws are consistent.
01:29:07.600 | So just like you said in the lab,
01:29:09.600 | the interesting thing about the simulation question,
01:29:12.540 | yes, it's hard to know if we're living inside a simulation,
01:29:15.480 | but also, yes, it's possible to do these kinds
01:29:18.560 | of experiments in the lab now, more and more.
01:29:21.680 | To me, the interesting question is,
01:29:23.720 | how realistic does a virtual reality game need to be
01:29:28.280 | for us to not be able to tell the difference?
01:29:30.600 | A more interesting question to me is,
01:29:33.280 | how realistic or interesting does a virtual reality world
01:29:38.280 | need to be in order for us to want to stay there forever,
01:29:43.440 | or much longer than physical reality?
01:29:46.160 | Prefer that place.
01:29:47.920 | And also prefer it not as we prefer hard drugs,
01:29:52.160 | but prefer it in a deep, meaningful way,
01:29:55.120 | in the way we enjoy life.
01:29:57.800 | - I mean, I suppose the issue with the matrix,
01:30:00.440 | I imagine that it's possible to delude the mind sufficiently
01:30:05.040 | that you genuinely, in that way,
01:30:07.000 | do think that you are interacting with the real world
01:30:10.800 | when in fact, the whole thing's a simulation.
01:30:13.040 | How good does a simulation need to be
01:30:16.320 | to be able to do that?
01:30:17.280 | Well, it needs to convince you
01:30:21.400 | that all your sensory input is correct and accurate
01:30:24.160 | and joins up and makes sense.
01:30:26.640 | Now, that sensory input is not something
01:30:28.400 | that we're born with.
01:30:29.720 | We're born with a sense of touch,
01:30:31.680 | we're born with eyes and so on,
01:30:33.040 | but we don't know how to use them,
01:30:34.120 | we don't know what to make of them.
01:30:35.760 | We go around, we bump into trees, we cry a lot,
01:30:38.680 | we're in pain a lot, you know,
01:30:40.040 | we're basically booting up the system
01:30:43.080 | so that it can make head or tail
01:30:45.320 | of the sensory input that it's getting.
01:30:47.560 | And that sensory input's not just a one-way flux of things,
01:30:50.000 | it's also you have to walk into things,
01:30:51.600 | you have to hear things, you have to put it together.
01:30:53.720 | Now, if you've got just babies in the matrix
01:30:58.120 | who are slotted into this,
01:30:59.920 | I don't think they have that kind of sensory input.
01:31:02.560 | I don't think they would have any way
01:31:03.760 | to make sense of New York as a world that they're part of.
01:31:08.400 | The brain is just not developed in that way.
01:31:10.760 | - Well, I can't make sense of New York
01:31:12.400 | in this physical reality either.
01:31:13.920 | But yeah, I mean, but you said pain
01:31:16.400 | and the walking into things.
01:31:17.840 | Well, you can create a pain signal,
01:31:19.760 | and as long as it's consistent
01:31:21.900 | that certain things result in pain,
01:31:23.840 | you can start to construct a reality.
01:31:25.880 | There's some, maybe you disagree with this,
01:31:28.440 | but I think we are born almost with a desire
01:31:33.280 | to be convinced by our reality,
01:31:35.820 | like a desire to make sense of our reality.
01:31:38.840 | - Oh, I'm sure we are, yes.
01:31:40.240 | - So there's an imperative.
01:31:41.120 | So whatever that reality is given to us,
01:31:44.000 | like the table hurts, fire is hot,
01:31:46.560 | I think we wanna be deluded
01:31:49.920 | in the sense that we want to make a simple,
01:31:53.120 | like Einstein's simple theory of the thing around us.
01:31:56.440 | We want that simplicity.
01:31:58.000 | And so maybe the hunger for the simplicity
01:32:02.200 | is the thing that could be used
01:32:03.860 | to construct a pretty dumb simulation that tricks us.
01:32:07.800 | So maybe tricking humans
01:32:09.080 | doesn't require building a universe. (laughs)
01:32:11.480 | - No, I don't, I mean, this is not what I work on,
01:32:14.480 | so I don't know how close to it we are.
01:32:15.320 | - Yes, I hope the game one works out.
01:32:16.640 | - But I agree with you that, yeah,
01:32:18.760 | I'm not sure that it's a morally justifiable thing to do,
01:32:22.000 | but is it possible in principle?
01:32:24.820 | I think it'll be very difficult,
01:32:28.400 | but I don't see why in principle it wouldn't be possible.
01:32:31.520 | And I agree with you that we try to understand the world,
01:32:35.880 | we try to integrate the sensory inputs that we have,
01:32:38.080 | and we try to come up with a hypothesis
01:32:40.200 | that explains what's going on.
01:32:42.000 | I think, though, that we have huge input
01:32:46.240 | from the social context that we're in.
01:32:49.120 | We don't do it by ourselves.
01:32:50.500 | We don't kind of blunder around in a universe by ourself
01:32:53.520 | and understand the whole thing.
01:32:55.980 | We're told by the people around us what things are
01:32:58.840 | and what they do, and language is coming in here and so on.
01:33:01.680 | So it would have to be an extremely impressive simulation
01:33:05.300 | to simulate all of that.
01:33:07.640 | - Yeah, simulate all of that,
01:33:10.360 | including the social construct, the spread of ideas
01:33:14.000 | and the exchange of ideas, I don't know.
01:33:16.600 | But those questions are really important to understand
01:33:18.660 | as we become more and more digital creatures.
01:33:22.080 | It seems like the next step of evolution
01:33:23.780 | is us becoming all the same mechanisms we've talked about
01:33:28.280 | are becoming more and more plugged into the machine.
01:33:31.840 | We're becoming cyborgs.
01:33:34.080 | And there's an interesting interplay
01:33:36.600 | between wires and biology,
01:33:38.540 | zeros and ones and the biological systems.
01:33:43.460 | And I don't think you can just,
01:33:47.240 | I don't think we'll have the luxury to see humans
01:33:49.520 | as disjoint from the technology we've created
01:33:52.220 | for much longer.
01:33:53.440 | We are an organism that's--
01:33:55.520 | - Yeah, I mean, I agree with you,
01:34:00.240 | but we come really with this to consciousness.
01:34:05.240 | - Yes.
01:34:06.920 | - And is there a distinction there?
01:34:08.160 | Because what you're saying, the natural endpoint
01:34:10.520 | says we are indistinguishable.
01:34:12.000 | That if you are capable of building an AI,
01:34:17.000 | which is sufficiently close and similar
01:34:19.700 | that we merge with it,
01:34:20.640 | then to all intents and purposes,
01:34:23.600 | that AI is conscious as we know it.
01:34:28.000 | And I don't have a strong view, but I have a view.
01:34:32.040 | And I wrote about it in the epilogue to my last book
01:34:37.760 | because 10 years ago,
01:34:39.600 | I wrote a chapter in a book called "Life Ascending"
01:34:44.400 | about consciousness.
01:34:45.880 | And the subtitle of "Life Ascending"
01:34:47.360 | was the 10 great inventions of evolution.
01:34:49.940 | And I couldn't possibly write a book
01:34:51.340 | with a subtitle like that that did not include consciousness
01:34:54.520 | and specifically consciousness
01:34:57.320 | as one of the great inventions.
01:34:59.440 | And it was in part because I was just curious to know more
01:35:02.560 | and I read more for that chapter.
01:35:04.400 | I never worked on it, but I've always,
01:35:06.320 | how can anyone not be interested in the question?
01:35:09.220 | And I was left with the feeling that A, nobody knows,
01:35:13.240 | and B, there are two main schools of thought out there
01:35:18.240 | with a big kind of skew in distribution.
01:35:21.200 | One of them says, oh, it's a property of matter.
01:35:23.760 | It's an unknown law of physics, panpsychism,
01:35:27.240 | everything is conscious, the sun is conscious,
01:35:29.120 | it's just a matter, or a rock is conscious,
01:35:31.360 | it's just a matter of how much.
01:35:33.680 | And I find that very unpersuasive.
01:35:36.480 | I can't say that it's wrong,
01:35:37.720 | it's just that I think we somehow can tell the difference
01:35:41.360 | between something that's living and something that's not.
01:35:44.000 | And then the other end is it's an emergent property
01:35:48.800 | of a very complex central nervous system.
01:35:52.680 | And I never quite understand what people mean
01:35:57.320 | by words like emergence.
01:35:58.680 | I mean, there are genuine examples,
01:36:00.600 | but I think we very often tend to use it
01:36:04.600 | to plaster over ignorance.
01:36:07.940 | As a biochemist, the question for me then was,
01:36:11.600 | okay, it's a concoction of a central nervous system.
01:36:16.600 | A depolarizing neuron gives rise to a feeling,
01:36:21.240 | to a feeling of pain, or to a feeling of love,
01:36:24.640 | or anger, or whatever it may be.
01:36:27.680 | So what is then a feeling in biophysical terms
01:36:30.600 | in the central nervous system?
01:36:31.800 | Which bit of the wiring gives rise to,
01:36:34.800 | and I've never seen anyone answer that question
01:36:37.760 | in a way that makes sense to me.
01:36:41.200 | - And that's an important question to answer.
01:36:43.720 | - I think if we want to understand consciousness,
01:36:45.360 | that's the only question to answer.
01:36:47.200 | Because certainly an AI is capable of outthinking,
01:36:51.760 | and it's only a matter of time.
01:36:53.480 | Maybe it's already happened.
01:36:54.920 | In terms of just information processing
01:36:58.260 | and computational skill,
01:37:00.000 | I don't think we have any problem in designing a mind
01:37:04.200 | which is at least the equal of the human mind.
01:37:07.280 | But in terms of what we value the most as humans,
01:37:11.160 | which is to say our feelings, our emotions,
01:37:13.280 | our sense of what the world is in a very personal way,
01:37:18.280 | that I think means as much or more to people
01:37:23.520 | than their information processing.
01:37:24.880 | And that's where I don't think
01:37:27.080 | that AI necessarily will become conscious,
01:37:29.680 | because I think it's a property of life.
01:37:33.080 | - Well, let's talk about it more.
01:37:34.200 | You're an incredible writer, one of my favorite writers.
01:37:36.840 | So let me read from your latest book, "Transformers,"
01:37:40.440 | what you write about consciousness.
01:37:42.760 | - "I think therefore I am," said Descartes,
01:37:46.200 | is one of the most celebrated lines ever written.
01:37:49.320 | But what am I exactly?
01:37:51.760 | An artificial intelligence can think too by definition,
01:37:54.640 | and therefore is.
01:37:56.680 | Yet few of us could agree whether AI is capable in principle
01:38:00.760 | of anything resembling human emotions,
01:38:03.200 | of love or hate, fear and joy,
01:38:06.680 | of spiritual yearnings for oneness or oblivion,
01:38:10.640 | or corporeal pangs of thirst and hunger.
01:38:14.200 | The problem is we don't know what emotions are,
01:38:18.080 | as you were saying.
01:38:19.560 | What is the feeling in physical terms?
01:38:21.760 | How does a discharging neuron give rise
01:38:23.760 | to a feeling of anything at all?
01:38:25.640 | This is the hard problem of consciousness,
01:38:28.600 | the seeming duality of mind and matter,
01:38:31.160 | the physical makeup of our innermost self.
01:38:34.240 | We can understand in principle
01:38:35.600 | how an extremely sophisticated parallel processing system
01:38:38.720 | could be capable of wondrous feats of intelligence,
01:38:41.480 | but we can't answer in principle
01:38:44.120 | whether such a supreme intelligence
01:38:46.160 | would experience joy or melancholy.
01:38:49.120 | What is the quantum of solace?
01:38:51.160 | Speaking to the question of emergence,
01:38:56.020 | you know, there's just technical,
01:38:59.140 | there's an excellent paper on this recently
01:39:03.800 | about this kind of phase transition,
01:39:08.120 | emergence of performance in neural networks
01:39:10.760 | on the problem of NLP, natural language processing.
01:39:14.920 | So language models, there seems to be this question of size.
01:39:19.440 | At some point, there is a phase transition
01:39:23.880 | as you grow the size of the neural network.
01:39:25.920 | So the question is,
01:39:27.280 | this is sort of somewhat of a technical question
01:39:29.920 | that you can philosophize over.
01:39:32.000 | The technical question is,
01:39:33.280 | is there a size of a neural network
01:39:35.680 | that starts to be able to form
01:39:37.680 | the kind of representations that can capture a language
01:39:40.760 | and therefore be able to,
01:39:42.620 | not just language, but linguistically capture knowledge
01:39:47.120 | that's sufficient to solve a lot of problems in language,
01:39:50.920 | like be able to have a conversation.
01:39:52.600 | And there seems to be not a gradual increase,
01:39:55.780 | but a phase transition.
01:39:57.160 | And they're trying to construct the science
01:39:59.920 | of where that is.
01:40:01.200 | Like what is a good size of a neural network?
01:40:03.760 | And why does such a phase transition happen?
01:40:05.920 | Anyway, that sort of points to emergence,
01:40:08.640 | that there could be stages where a thing goes from being,
01:40:13.640 | oh, you're a very intelligent toaster
01:40:18.840 | to a toaster that's feeling sad today and turns away
01:40:25.040 | and looks out the window,
01:40:28.240 | sighing, having an existential crisis.
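The scaling phase transition described above can be pictured with a toy curve. This is purely an illustration, not real benchmark data: the threshold (~10^10 parameters) and steepness are invented numbers. The sketch models a task's accuracy as a steep logistic in log10 of parameter count, so performance stays near zero for small models and jumps to near one within roughly a decade of scale, which is what "emergence" looks like on a scaling plot.

```python
import math

# Toy sketch of "emergent" capability vs. model size.
# All numbers are hypothetical -- this illustrates the shape of a
# phase transition, it is not fit to any real model family.
LOG10_THRESHOLD = 10.0  # pretend critical size: ~10^10 parameters
STEEPNESS = 8.0         # larger -> more abrupt transition

def toy_accuracy(n_params: float) -> float:
    """Logistic curve in log10(parameter count)."""
    x = math.log10(n_params)
    return 1.0 / (1.0 + math.exp(-STEEPNESS * (x - LOG10_THRESHOLD)))

sizes = [10**8, 10**9, 10**10, 10**11, 10**12]
curve = [toy_accuracy(n) for n in sizes]
# Nearly all of the gain happens within about one decade of scale
# around the threshold: flat, then a sharp jump, then flat again.
```

The open scientific question raised in the conversation is why real systems behave this way; the sketch only shows what a capability-versus-scale phase transition looks like once it exists.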
01:40:30.640 | - I was thinking of Marvin, the paranoid android.
01:40:33.200 | - Well, no, Marvin is simplistic
01:40:34.800 | because Marvin is just cranky.
01:40:38.240 | - Yes.
01:40:39.080 | So easily programmed.
01:40:41.560 | - Yeah, easily programmed, nonstop existential crisis.
01:40:45.200 | You're almost basically, what is it?
01:40:47.280 | Notes from Underground by Dostoevsky.
01:40:49.080 | He's just constantly complaining about life.
01:40:53.000 | No, they're capturing the full rollercoaster
01:40:57.080 | of human emotion, the excitement, the bliss, the connection,
01:41:01.120 | the empathy and all that kind of stuff.
01:41:03.920 | And then the selfishness, the anger, the depression,
01:41:08.920 | all that kind of stuff, they're capturing all of that
01:41:11.800 | and be able to experience it deeply.
01:41:14.400 | Like it's the most important thing
01:41:16.560 | you could possibly experience today.
01:41:18.480 | The highest highs, the lowest lows, this is it.
01:41:21.280 | My life will be over.
01:41:22.680 | I cannot possibly go on that feeling.
01:41:26.680 | And then like after a nap, you're feeling amazing.
01:41:30.600 | That might be something that emerges.
01:41:33.640 | - So why would a nap make an AI being feel better?
01:41:38.640 | - First of all, we don't know that for a human either.
01:41:45.240 | - But we do know that that's actually true
01:41:47.640 | for many people much of the time.
01:41:49.000 | Maybe you're depressed and you have a nap
01:41:50.840 | and you do in fact feel better.
01:41:53.480 | - Oh, you are actually asking the technical question there.
01:41:56.440 | So there's a biological answer to that.
01:42:00.000 | And so the question is whether AI needs
01:42:01.760 | to have the same kind of attachments to its body,
01:42:04.920 | bodily function and preservation
01:42:07.600 | of the brain's successful function,
01:42:11.320 | self-preservation essentially in some deep biological sense.
01:42:17.080 | - I mean, to my mind, it comes back around
01:42:19.960 | to the problem we were talking about before
01:42:21.680 | about simulations and sensory input
01:42:24.120 | and learning what all of this stuff means.
01:42:28.600 | And life and death, that biology,
01:42:32.920 | unlike society has a death penalty over everything.
01:42:36.600 | And natural selection works on that death penalty.
01:42:39.000 | That if you make this decision wrongly, you die.
01:42:44.000 | And the next generation is represented by beings
01:42:50.800 | that made a slightly different decision on balance.
01:42:56.400 | And that is something that's intrinsically difficult
01:43:01.400 | to simulate in all this richness, I would say.
01:43:04.320 | So what is-
01:43:07.360 | - Death and all its richness, our relationship with death
01:43:13.480 | or the whole of it.
01:43:16.600 | So which, when you say richness, of course,
01:43:20.160 | there's a lot in that, which is hard to simulate.
01:43:23.880 | What's part of the richness that's hard to simulate?
01:43:28.000 | - I suppose the complexity of the environment
01:43:31.040 | and your position in that,
01:43:32.520 | or the position of an organism in that environment,
01:43:35.480 | in the full richness of that environment
01:43:37.560 | over its entire life, over multiple generations
01:43:40.640 | with changes in gene sequence over those generations.
01:43:44.240 | So slight changes in the makeup of those individuals
01:43:46.880 | over generations.
01:43:48.320 | But if you take it back to the level of single cells,
01:43:52.480 | which I do in the book and ask,
01:43:55.040 | how does a single cell in effect know it exists
01:44:00.040 | as a unit, as an entity?
01:44:02.400 | I mean, no, in inverted commas,
01:44:04.040 | obviously it doesn't know anything,
01:44:07.160 | but it acts as a unit and it acts
01:44:09.120 | with astonishing precision as a unit.
01:44:13.800 | And I had suggested that that's linked
01:44:17.360 | to the electrical fields on the membranes themselves
01:44:19.880 | and that they give some indication of how am I doing
01:44:22.880 | in relation to my environment
01:44:24.320 | as a kind of real-time feedback on the world.
01:44:27.200 | And this is something physical,
01:44:31.000 | which can be selected over generations,
01:44:34.960 | that if you get this wrong,
01:44:37.440 | it's linked with this set of circumstances
01:44:42.280 | that I've just, as an individual,
01:44:45.520 | I have a moment of blind panic and run.
01:44:49.160 | As a bacterium or something,
01:44:50.640 | you have some electrical discharge that says blind panic
01:44:54.160 | and it runs, whatever it may be.
01:44:56.560 | And you associate over generations, multiple generations,
01:44:59.800 | that this electrical phase that I'm in now
01:45:03.640 | is associated with a response like that.
01:45:07.080 | And it's easy to see how feelings come in
01:45:09.840 | through the back door almost,
01:45:12.080 | with that kind of giving real-time feedback
01:45:18.160 | on your position in the world in relation to how am I doing.
01:45:22.040 | And then you complexify the system.
01:45:23.880 | And yes, I have no problem with phase transition.
01:45:27.920 | Can all of this be done purely by the language,
01:45:35.160 | by the issues with how the system understands itself?
01:45:41.400 | Maybe it can, I honestly don't know.
01:45:44.400 | The philosophers for a long time
01:45:47.560 | have talked about the possibility
01:45:49.680 | that you can have a zombie intelligence
01:45:53.080 | and that there are no feelings there,
01:45:55.600 | but everything else is the same.
01:45:57.640 | I mean, I have to throw this back to you, really.
01:46:01.200 | How do you deal with a zombie intelligence?
01:46:03.920 | - So first of all, I can see that
01:46:06.080 | from a biologist's perspective,
01:46:07.780 | you think of all the complexities
01:46:10.720 | that led up to the human being.
01:46:12.840 | The entirety of the history of 4 billion years
01:46:15.760 | that in some deep sense integrated
01:46:18.200 | the human being into its environment.
01:46:20.200 | And that dance of the organism and the environment,
01:46:25.040 | you could see how emotions arise from that.
01:46:27.480 | And then emotions are deeply connected
01:46:29.760 | to creating a human experience.
01:46:32.120 | And from that, you mix in consciousness
01:46:34.160 | and the full mess of it, yeah.
01:46:36.340 | But from a perspective of an intelligent organism
01:46:40.920 | that's already here, like a baby that learns,
01:46:45.200 | it doesn't need to learn how to be a collection of cells
01:46:49.400 | or how to do all the things it needs to do.
01:46:51.760 | The basic function of a baby as it learns
01:46:55.320 | is to interact with its environment,
01:46:57.360 | to learn from its environment,
01:46:58.600 | to learn how to fit in to this social society.
01:47:01.600 | - And the basic response of the baby
01:47:05.920 | is to cry a lot of the time.
01:47:07.120 | - Cry, well, to convince the humans to protect it
01:47:12.120 | or to discipline it, to teach it.
01:47:14.920 | I mean, we've developed a bunch of different tricks,
01:47:18.760 | how to get our parents to take care of us,
01:47:22.080 | to educate us, to teach us about the world.
01:47:24.840 | Also, we've constructed the world in such a way
01:47:27.880 | that it's safe enough for us to survive in
01:47:30.360 | and yet dangerous enough for learning the valuable lessons.
01:47:32.760 | Like the tables are still hard with corners,
01:47:35.560 | so it can still run into them.
01:47:36.800 | It hurts like how...
01:47:38.760 | So AI needs to solve that problem,
01:47:41.680 | not the problem of constructing
01:47:43.080 | this super complex organism that leads up.
01:47:46.080 | To run the whole thing, to make an apple pie,
01:47:53.960 | you need to build the whole universe.
01:47:55.880 | I think the zombie question,
01:47:58.200 | it's something I would leave to the philosophers.
01:48:04.080 | (Lex laughs)
01:48:05.000 | Because...
01:48:06.240 | And I will also leave to them the definition of love
01:48:11.200 | and what happens between two human beings
01:48:14.920 | when there's a magic that just grabs them,
01:48:18.560 | like nothing else matters in the world
01:48:20.720 | and somehow you've been searching for this feeling,
01:48:23.000 | this moment, this person your whole life.
01:48:25.280 | That feeling, the philosophers can have a lot of fun
01:48:29.600 | with that one and also say that that's just,
01:48:32.880 | you can have a biological explanation,
01:48:34.720 | you can have all kinds of, it's all fake.
01:48:36.840 | It's actually, Ayn Rand will say it's all selfish.
01:48:40.720 | There's a lot of different interpretations.
01:48:42.440 | I'll leave it to the philosophers.
01:48:43.600 | The point is the feeling, sure as hell feels very real.
01:48:48.040 | And if my toaster makes me feel
01:48:51.840 | like it's the only toaster in the world,
01:48:55.720 | and when I leave and I miss the toaster,
01:48:58.400 | and when I come back, I'm excited to see the toaster,
01:49:01.440 | and my life is meaningful and joyful,
01:49:04.000 | and the friends I have around me get a better version of me
01:49:08.080 | because that toaster exists,
01:49:10.680 | that sure as hell feels like a conscious toaster.
01:49:13.320 | - Is that psychologically different to having a dog?
01:49:16.000 | - No, no.
01:49:16.840 | - Because I mean, most people would dispute
01:49:19.240 | whether we can say a dog,
01:49:20.520 | I would say a dog is undoubtedly conscious,
01:49:22.200 | but some people say it doesn't.
01:49:24.360 | - But there's degrees of consciousness and so on,
01:49:26.280 | but people are definitely much more uncomfortable
01:49:28.800 | saying a toaster can be conscious than a dog.
01:49:32.640 | And there's still a deep connection.
01:49:35.040 | You could say our relationship with the dog
01:49:37.880 | has more to do with anthropomorphism,
01:49:40.120 | like we kind of project a human being onto it.
01:49:42.400 | - Maybe.
01:49:43.240 | - We can do the same damn thing with a toaster.
01:49:45.760 | - Yes, but you can look into the dog's eyes
01:49:48.120 | and you can see that it's sad,
01:49:50.480 | that it's delighted to see you again.
01:49:52.480 | I don't have a dog, by the way.
01:49:53.560 | I'm not, it's not that I'm a dog person or a cat person.
01:49:55.160 | - And dogs are actually incredibly good at using their eyes
01:49:57.800 | to do just that.
01:49:59.480 | - They are.
01:50:00.320 | Now, I don't imagine that a dog is remotely
01:50:02.680 | as close to being intelligent as an AI intelligence,
01:50:07.040 | but it's certainly capable
01:50:09.560 | of communicating emotionally with us.
01:50:12.000 | - But here's what I would venture to say.
01:50:13.840 | We tend to think because AI plays chess well
01:50:17.160 | and is able to fold proteins now well,
01:50:19.720 | that it's intelligent.
01:50:21.020 | I would argue that in order to communicate with humans,
01:50:23.840 | in order to have emotional intelligence,
01:50:25.920 | it actually requires another order
01:50:27.440 | of magnitude of intelligence.
01:50:28.920 | It's not easy to be flawed.
01:50:34.120 | Solving a mathematical puzzle is not the same
01:50:38.400 | as the full complexity of human to human interaction.
01:50:42.080 | That's actually, we humans just take for granted
01:50:46.960 | the things we're really good at.
01:50:49.160 | Nonstop, people tell me how shitty people are at driving.
01:50:52.720 | No, humans are incredible at driving.
01:50:56.600 | Bipedal walking, walking, object manipulation.
01:51:00.280 | We're incredible at this.
01:51:01.960 | And so people tend to-
01:51:03.840 | - Discount the things we all just take for granted.
01:51:07.400 | - And one of those things that they discount
01:51:10.160 | is our ability, the dance of conversation
01:51:13.600 | and interaction with each other.
01:51:15.320 | The ability to morph ideas together,
01:51:18.440 | the ability to get angry at each other,
01:51:20.480 | and then to miss each other.
01:51:21.920 | Like to create a tension that makes life fun
01:51:24.960 | and difficult and challenging in a way that's meaningful.
01:51:28.280 | That is a skill that's learned.
01:51:31.680 | And AI would need to solve that problem.
01:51:33.480 | - I mean, in some sense, what you're saying is
01:51:35.920 | AI cannot become meaningfully emotional, let's say,
01:51:42.000 | until it experiences some kind of internal conflict
01:51:45.200 | that is unable to reconcile these various aspects
01:51:48.480 | of reality or its reality with a decision to make.
01:51:53.480 | And then it feels sad necessarily
01:51:56.800 | because it doesn't know what to do.
01:51:59.480 | And I certainly can't dispute that.
01:52:01.760 | That may very well be how it works.
01:52:03.720 | I think the only way to find out is to do it.
01:52:06.120 | - To build it, yeah.
01:52:07.040 | And leave it to the philosophers
01:52:08.480 | if it actually feels sad or not.
01:52:10.280 | The point is the robot will be sitting there alone,
01:52:13.880 | having an internal conflict, an existential crisis,
01:52:16.920 | and that's required for it to have a deep,
01:52:19.320 | meaningful connection with another human being.
01:52:21.560 | Now, does it actually feel that?
01:52:23.320 | I don't know.
01:52:24.160 | - But I'd like to throw something else at you
01:52:26.040 | which troubles me on reading it.
01:52:29.200 | Yuval Noah Harari's book, "21 Lessons for the 21st Century,"
01:52:35.120 | and he's written about this kind of thing
01:52:36.520 | on various occasions.
01:52:38.440 | And he sees biochemistry as an algorithm.
01:52:40.720 | And then AI will necessarily be able to hack that algorithm
01:52:46.040 | and do it better than humans.
01:52:47.280 | So there will be AI better at writing music
01:52:49.720 | that we appreciate than Mozart ever could
01:52:51.560 | or writing better than Shakespeare ever did and so on
01:52:53.960 | because biochemistry is algorithmic
01:52:56.480 | and all you need to do is figure out
01:52:57.800 | which bits of the algorithm to play
01:52:59.600 | to make us feel good or bad or appreciate things.
01:53:03.200 | And as a biochemist, I find that argument
01:53:06.440 | close to irrefutable and not very enjoyable.
01:53:13.280 | I don't like the sound of it.
01:53:14.800 | That's just my reaction as a human being.
01:53:16.440 | You might like the sound of it
01:53:17.400 | because that says that AI is capable
01:53:19.880 | of the same kind of emotional feelings
01:53:23.400 | about the world as we are
01:53:25.520 | because the whole thing is an algorithm
01:53:27.040 | and you can program an algorithm and there you are.
01:53:30.160 | He then has a peculiar final chapter
01:53:33.480 | where he talks about consciousness in rather separate terms.
01:53:37.600 | And he's talking about meditating and so on
01:53:39.760 | and getting in touch with his inner conscious.
01:53:41.400 | I don't meditate.
01:53:42.520 | I don't know anything about that.
01:53:44.600 | But he wrote in very different terms about it
01:53:48.120 | as if somehow it's a way out of the algorithm.
01:53:50.880 | Now, it seems to me that consciousness in that sense
01:53:56.040 | is capable of scuppering the algorithm.
01:53:58.720 | I think in terms of the biochemical feedback loops
01:54:01.840 | and so on, it is undoubtedly algorithmic.
01:54:04.800 | But in terms of what we decide to do,
01:54:06.880 | it can be much more based on an emotion.
01:54:10.840 | We can just think, "I don't care.
01:54:16.880 | "I can't resolve this complex situation.
01:54:20.200 | "I'm gonna do that."
01:54:21.520 | And that can be based on, in effect, a different currency,
01:54:24.720 | which is the currency of feelings
01:54:26.280 | and something where we don't have
01:54:27.840 | very much personal control over.
01:54:29.960 | And then it comes back around to you
01:54:32.040 | and what are you trying to get at with AI?
01:54:35.360 | Do we need to have some system
01:54:38.080 | which is capable of overriding a rational decision
01:54:41.840 | which cannot be made
01:54:42.680 | because there's too much conflicting information
01:54:45.080 | by effectively an emotional, judgmental decision
01:54:48.520 | that just says, "Do this and see what happens."
01:54:50.800 | That's what consciousness is really doing, in my view.
01:54:53.400 | - Yeah, and the question is whether it's a different process
01:54:56.480 | or just a higher-level process.
01:54:58.400 | The idea that biochemistry is an algorithm
01:55:03.320 | is, to me, an oversimplistic view.
01:55:07.480 | There's a lot of things that,
01:55:09.280 | the moment you say it, it's irrefutable,
01:55:14.920 | but it simplifies.
01:55:17.080 | - I'm sure it's an extremely complex system.
01:55:17.920 | - And in the process, loses something fundamental.
01:55:21.000 | So, for example, calling a universe
01:55:23.480 | an information-processing system, sure, yes.
01:55:27.360 | You could make that.
01:55:29.720 | It's a computer that's performing computations,
01:55:32.160 | but you're missing the process of the entropy
01:55:37.160 | somehow leading to pockets of complexity
01:55:42.760 | that creates these beautiful artifacts
01:55:45.360 | that are incredibly complex, and they're like machines.
01:55:48.920 | And then those machines are,
01:55:50.720 | through the process of evolution,
01:55:52.000 | are constructing even further complexity.
01:55:54.080 | In calling the universe an information-processing machine,
01:55:59.480 | you're missing those little local pockets
01:56:03.000 | and how difficult it is to create them.
01:56:05.120 | So, the question to me is,
01:56:06.560 | if biochemistry is an algorithm,
01:56:07.880 | how difficult is it to create a software system
01:56:12.000 | that runs the human body, which I think is incorrect.
01:56:16.240 | I think that is going to take so long.
01:56:20.520 | I can't, I mean, that's going to be centuries from now,
01:56:23.520 | to be able to reconstruct a human.
01:56:25.600 | Now, what I would venture to say,
01:56:27.600 | to get some of the magic of a human being,
01:56:30.000 | what we were saying with the emotions and the interactions,
01:56:33.920 | and like a dog makes a smile and joyful
01:56:36.920 | and all those kinds of things,
01:56:38.200 | that will come much sooner,
01:56:39.640 | but that doesn't require us to reverse engineer
01:56:42.240 | the algorithm of biochemistry.
01:56:44.080 | - Yes, but the toaster is making you happy.
01:56:47.680 | - Yes.
01:56:48.640 | - It's not about whether you make the toaster happy.
01:56:51.840 | - No, it has to be.
01:56:53.640 | It has to be.
01:56:55.120 | It has to be.
01:56:56.160 | The toaster has to be able to leave me.
01:56:58.080 | - Yes, but it's the toaster is the AI in this case,
01:57:00.080 | is a very intelligent--
01:57:00.920 | - Yeah, the toaster has to be able to be unhappy
01:57:02.760 | and leave me.
01:57:03.600 | That's essential.
01:57:06.360 | - Yeah.
01:57:07.200 | - That's essential for my being able to miss the toaster.
01:57:09.760 | If the toaster is just my servant,
01:57:12.240 | that's not, or a provider of like services,
01:57:16.000 | like tells me the weather and makes toast,
01:57:20.440 | that's not going to deep connection.
01:57:22.760 | It has to have internal conflict.
01:57:24.920 | You write about life and death.
01:57:26.760 | It has to be able to be conscious of its mortality
01:57:29.840 | and the finiteness of its existence.
01:57:33.760 | And that life is temporary
01:57:35.880 | and therefore it needs to be more selective.
01:57:38.000 | - One of the most moving moments in the movies
01:57:41.040 | from when I was a boy was the unplugging of Hal in 2001,
01:57:45.080 | where that was the death of a sentient being
01:57:48.600 | and Hal knew it.
01:57:51.000 | So I think we all kind of know
01:57:54.320 | that a sufficiently intelligent being
01:58:00.120 | is going to have some form of consciousness,
01:58:02.760 | but whether it would be like biological consciousness,
01:58:06.800 | I just don't know.
01:58:07.640 | And if you're thinking about how do we bring together,
01:58:10.200 | I mean, obviously we're going to interact
01:58:12.240 | more closely with AI,
01:58:16.240 | but are we really,
01:58:18.720 | is a dog really like a toaster
01:58:21.640 | or is there really some kind of difference there?
01:58:25.640 | You were talking about biochemistry is algorithmic,
01:58:29.480 | but it's not single algorithm and it's very complex.
01:58:32.240 | Of course it is.
01:58:33.160 | So it may be that there are again,
01:58:35.200 | conflicts in the circuits of biochemistry,
01:58:37.160 | but I have a feeling that the level of complexity
01:58:40.960 | of the total biochemical system
01:58:43.280 | at the level of a single cell is less complex
01:58:45.560 | than the level of neural networking
01:58:47.560 | in the human brain or in an AI.
01:58:49.720 | - Well, I guess I assumed that we were including the brain
01:58:55.760 | in the biochemistry algorithm because you have to-
01:58:58.960 | - I would see that as a higher level of organization
01:59:02.000 | of neural networks.
01:59:02.920 | They're all using the same biochemical wiring
01:59:04.880 | within themselves.
01:59:06.400 | - Yeah, but the human brain is not just neurons.
01:59:09.760 | It's the immune system.
01:59:11.360 | It's the whole package.
01:59:13.840 | I mean, to have a biochemical algorithm
01:59:16.280 | that runs an intelligent biological system,
01:59:20.040 | you have to include the whole damn thing.
01:59:21.680 | And it's pretty fascinating that it comes from an embryo.
01:59:25.320 | Boy, I mean, if you can,
01:59:31.280 | what is a human being?
01:59:33.200 | 'Cause it's just some code and then you build.
01:59:36.320 | And then that, so it's DNA doesn't just tell you
01:59:39.280 | what to build, but how to build it.
01:59:41.040 | I mean, the thing is impressive.
01:59:44.640 | And the question is how difficult is it
01:59:49.480 | to reverse engineer the whole shebang?
01:59:51.440 | - Very difficult.
01:59:54.400 | - I would say it's,
01:59:55.820 | don't wanna say impossible,
02:00:01.200 | but it's much easier to build a human
02:00:05.400 | than to reverse engineer,
02:00:06.800 | to build like a fake human, human-like thing,
02:00:11.600 | than to reverse engineer the entirety of the process
02:00:14.120 | of the evolution of the head of a rat.
02:00:15.960 | - I'm not sure if we are capable
02:00:18.560 | of reverse engineering the whole thing,
02:00:20.560 | if the human mind is capable of doing that.
02:00:23.720 | I mean, I wouldn't be a biologist if I wasn't trying,
02:00:26.720 | but I know I can't understand the whole problem.
02:00:31.120 | I'm just trying to understand
02:00:32.040 | the rudimentary outlines of the problem.
02:00:34.340 | There's another aspect though,
02:00:37.240 | you're talking about developing from a single cell
02:00:39.160 | to the human mind and all the subsystems
02:00:43.920 | that are part of it, the immune system and so on.
02:00:46.620 | This is something that you'll talk about, I imagine,
02:00:53.240 | with Michael Levin,
02:00:55.280 | but so little is known about the human mind
02:01:00.760 | you talk about reverse engineering.
02:01:02.160 | So little is known about the developmental pathways
02:01:04.640 | that go from a genome to going to a fully wired organism.
02:01:09.160 | And a lot of it seems to depend
02:01:10.800 | on the same electrical interactions
02:01:14.880 | that I was talking about happening
02:01:16.400 | at the level of single cells
02:01:17.680 | and its interaction with the environment.
02:01:19.880 | There's a whole electrical field side to biology
02:01:23.580 | that is not yet written into any of the textbooks,
02:01:27.020 | which is about how does an embryo,
02:01:29.340 | a single cell, develop into these complex systems?
02:01:32.480 | What defines the head?
02:01:33.640 | What defines the immune system?
02:01:35.000 | What defines the brain and so on?
02:01:37.120 | That really is written in a language
02:01:38.660 | that we're only just beginning to understand.
02:01:40.320 | And frankly, biologists, most biologists
02:01:42.800 | are still very reluctant to even get themselves tangled up
02:01:47.240 | in questions like electrical fields influencing development.
02:01:51.640 | It seems like mumbo jumbo to a lot of biologists
02:01:54.480 | and it should not be
02:01:55.320 | because this is 21st-century biology.
02:01:57.500 | This is where it's going.
02:01:59.660 | But we're not gonna reverse engineer a human being
02:02:02.020 | or the mind or any of these subsystems
02:02:04.320 | until we understand how this developmental process is,
02:02:07.060 | how electricity in biology really works.
02:02:09.980 | And if it is linked with feelings
02:02:13.600 | and with consciousness and so on,
02:02:15.880 | that's the, I mean, in the meantime, we have to try.
02:02:18.660 | But I think that's where the answer lies.
02:02:21.020 | - So you think it's possible that the key
02:02:25.980 | to things like consciousness
02:02:28.620 | or some of the more tricky aspects of cognition
02:02:32.300 | might lie in that early development,
02:02:34.660 | the interaction of electricity in biology,
02:02:37.360 | electrical fields.
02:02:40.780 | - But we already know the EEG and so on
02:02:43.060 | is telling us a lot about brain function,
02:02:44.780 | but we don't know which cells,
02:02:46.100 | which parts of a neural network are giving rise to the EEG.
02:02:48.820 | We don't know the basics.
02:02:50.500 | The assumption is, I mean, we know it's neural networks.
02:02:53.620 | We know it's multiple cells,
02:02:54.740 | hundreds or thousands of cells involved in it.
02:02:56.940 | And we assume that it has to do with depolarization
02:03:00.660 | during action potentials and so on.
02:03:03.420 | But the mitochondria which are in there
02:03:05.300 | have much more membranes
02:03:06.780 | than the plasma membrane of the neuron.
02:03:08.660 | And there's a much greater membrane potential.
02:03:10.700 | And it's formed in parallel, very often parallel crystals,
02:03:14.380 | which are capable of reinforcing a field
02:03:17.540 | and generating fields over longer distances.
02:03:19.740 | And nobody knows if that plays a role
02:03:23.260 | in consciousness or not.
02:03:24.580 | There's reasons to argue that it could,
02:03:26.340 | but frankly, we simply do not know.
02:03:28.900 | And it's not taken into consideration.
02:03:30.980 | You look at the structure of the mitochondrial membranes
02:03:35.260 | in the brains of simple things like Drosophila, the fruit fly
02:03:40.260 | and they have amazing structures.
02:03:42.140 | You can see lots of little rectangular things
02:03:44.180 | all lined up in amazing patterns.
02:03:48.020 | What are they doing?
02:03:49.060 | Why are they like that?
02:03:49.920 | We haven't the first clue.
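The membrane potentials discussed here can be put in rough numbers with the textbook Nernst equation, which gives the voltage a single ion gradient can sustain across a membrane. A minimal sketch, using typical textbook potassium concentrations for a mammalian neuron (the concentrations are assumed values, not from the conversation):

```python
import math

# Nernst equation: E = (R*T / (z*F)) * ln([ion]_out / [ion]_in)
R = 8.314    # gas constant, J/(mol*K)
F = 96485.0  # Faraday constant, C/mol
T = 310.0    # body temperature, K

def nernst_mV(z, conc_out_mM, conc_in_mM):
    """Equilibrium potential in millivolts for an ion of charge z."""
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out_mM / conc_in_mM)

# Typical textbook values for K+ across a neuronal membrane (approximate):
e_k = nernst_mV(z=1, conc_out_mM=5.0, conc_in_mM=140.0)
# e_k comes out around -89 mV, the right order of magnitude for the
# membrane voltages discussed here.
```

Tens of millivolts dropped across a membrane only a few nanometres thick corresponds to a field on the order of 10^7 V/m, which is part of why these membrane fields are biophysically significant.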
02:03:52.460 | - What do you think about organoids and brain organoids?
02:03:55.340 | And like, so in a lab trying to study the development
02:03:59.540 | of these in the Petri dish development of organs,
02:04:04.540 | do you think that's promising?
02:04:06.860 | Do you have to look at whole systems?
02:04:08.660 | - I've never done anything like that.
02:04:10.340 | I don't know much about it.
02:04:11.460 | The people who I've talked to who do work on it
02:04:13.660 | say amazing things can happen.
02:04:15.180 | And a bit of a brain grown in a dish
02:04:18.460 | is capable of experiencing some kind of feelings
02:04:21.380 | or even memories of its former brain.
02:04:23.480 | Again, I have a feeling that until we understand
02:04:27.420 | how to control the electrical fields
02:04:29.580 | that control development, we're not gonna understand
02:04:32.360 | how to turn an organoid into a real functional system.
02:04:35.280 | - But how do we get that understanding?
02:04:38.660 | It's so incredibly difficult.
02:04:41.980 | I mean, you would have to, I mean, one promising direction,
02:04:44.740 | I'd love to get your opinion on this.
02:04:46.860 | I don't know if you're familiar with the work of DeepMind
02:04:49.040 | and AlphaFold with protein folding and so on.
02:04:52.220 | Do you think it's possible that that will give us
02:04:55.100 | some breakthroughs in biology,
02:04:58.020 | trying to basically simulate and model the behavior
02:05:03.020 | of trivial biological systems
02:05:07.200 | as they become complex biological systems?
02:05:09.920 | - I'm sure it will.
02:05:12.820 | The interesting thing to me about protein folding
02:05:16.380 | is that for a long time, my understanding,
02:05:19.820 | this is not what I work on, so I may have got this wrong,
02:05:21.620 | but my understanding is that you take the sequence
02:05:24.500 | of a protein and you try to fold it.
02:05:27.740 | And there are multiple ways in which it can fold
02:05:31.060 | and to come up with the correct conformation
02:05:33.180 | is not a very easy thing because you're doing it
02:05:35.120 | from first principles from a string of letters
02:05:37.420 | which specify the string of amino acids.
02:05:39.900 | But what actually happens is when a protein
02:05:43.460 | is coming out of a ribosome,
02:05:45.740 | it's coming out of a charged tunnel
02:05:47.940 | and it's in a very specific environment
02:05:49.660 | which is gonna force this to go there now
02:05:51.460 | and then this one to go there and this one to come like that.
02:05:53.380 | And so you're forcing a specific conformational set
02:05:55.940 | of changes onto it as it comes out of the ribosome.
02:05:58.460 | So by the time it's fully emerged,
02:06:00.100 | it's already got its shape and that shape depended
02:06:03.020 | on the immediate environment that it was emerging into,
02:06:07.660 | one letter, one amino acid at a time.
02:06:11.940 | And I don't think that the field
02:06:14.860 | was looking at it that way.
02:06:17.020 | And if that's correct,
02:06:18.780 | then that's very characteristic of science,
02:06:20.580 | which is to say it asks very often the wrong question
02:06:23.100 | and then does really amazingly sophisticated analyses
02:06:25.860 | on something having never thought to actually think,
02:06:27.860 | well, what is biology doing?
02:06:29.100 | And biology is giving you a charged electrical environment
02:06:31.900 | that forces you to be this way.
02:06:33.420 | Now, did DeepMind come up through patterns
02:06:37.980 | with some answer that was like that?
02:06:39.820 | I've got absolutely no idea.
02:06:41.860 | It ought to be possible to deduce that
02:06:44.460 | from the shapes of proteins.
02:06:46.540 | It would require much greater skill
02:06:50.820 | than the human mind has.
02:06:52.860 | But the human mind is capable of saying,
02:06:54.460 | well, hang on, let's look at this exit tunnel
02:06:56.420 | and try and work out what shape is this protein going
02:06:58.580 | to take and we can figure that out.
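The co-translational picture described here, where each residue's options are constrained as it emerges from the tunnel, can be sketched with a toy lattice model. Everything below (the 2D lattice, the "never step back toward the tunnel" rule) is an illustrative assumption, not real biophysics; it only shows how a local constraint collapses the size of a conformational search space:

```python
# Toy illustration: conformational search space of a short chain on a 2D lattice.
# "Global" = all self-avoiding walks; "co-translational" = walks that may never
# step back toward a hypothetical exit tunnel (no -x moves). Purely illustrative.

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def count_walks(n, allowed_moves):
    """Count self-avoiding walks of n steps using the given move set."""
    def dfs(pos, visited, steps_left):
        if steps_left == 0:
            return 1
        total = 0
        for dx, dy in allowed_moves:
            nxt = (pos[0] + dx, pos[1] + dy)
            if nxt not in visited:  # self-avoidance: never revisit a site
                total += dfs(nxt, visited | {nxt}, steps_left - 1)
        return total
    return dfs((0, 0), {(0, 0)}, n)

n = 10
global_count = count_walks(n, MOVES)                               # unconstrained
tunnel_count = count_walks(n, [m for m in MOVES if m != (-1, 0)])  # "tunnel" rule

print(global_count, tunnel_count)
```

With 10 steps the unconstrained chain already has 44,100 self-avoiding conformations versus 8,119 under the tunnel constraint, and the gap widens exponentially with chain length; the point is only that a strong local environment can make sequential folding a much smaller problem than global folding.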
02:07:00.180 | - That's really interesting about the exit tunnel.
02:07:01.620 | But like sometimes we get lucky and our,
02:07:05.060 | like just like in science,
02:07:07.100 | the simplified view or the static view
02:07:10.540 | will actually solve the problem for us.
02:07:12.180 | So in this case, it's very possible
02:07:14.420 | that the sequence of letters has a unique mapping
02:07:17.260 | to a structure without considering how it unraveled.
02:07:21.460 | So without considering the tunnel.
02:07:23.780 | And so that seems to be the case in this situation
02:07:27.740 | where the cool thing about proteins,
02:07:29.740 | all the different shapes they can possibly take,
02:07:31.620 | it actually seems to take very specific,
02:07:34.620 | unique shapes given the sequence.
02:07:36.700 | - That's forced on you by an exit tunnel.
02:07:38.300 | So the problem is actually much simpler than you thought.
02:07:40.580 | And then there's a whole army of proteins
02:07:44.260 | that change the conformational state,
02:07:48.260 | chaperone proteins.
02:07:49.780 | And they're only used when there's some,
02:07:54.020 | presumably issue with how it came out of the exit tunnel
02:07:56.500 | and you want to do it differently to that.
02:07:58.020 | So very often the chaperone proteins will go there
02:08:03.540 | and will influence the way in which it folds.
02:08:03.540 | So there's two ways of doing it.
02:08:06.620 | Either you can look at the structures
02:08:09.260 | and the sequences of all the proteins
02:08:10.980 | and you can apply an immense mind to it
02:08:13.180 | and figure out what the patterns are
02:08:14.580 | and figure out what happened.
02:08:15.580 | Or you can look at the actual situation where it is
02:08:17.900 | and say, well, hang on, it was actually quite simple.
02:08:20.060 | It's got a charged environment
02:08:21.180 | and then it's forced to come out this way.
02:08:23.100 | And then the question will be,
02:08:23.980 | well, do different ribosomes
02:08:25.420 | have different charged environments?
02:08:26.980 | What happens if a chaperone,
02:08:28.700 | you're asking a different set of questions
02:08:30.420 | to come to the same answer in a way
02:08:31.740 | which is telling you a much simpler story
02:08:34.220 | and explains why it is.
02:08:35.740 | Rather than saying it could be,
02:08:37.780 | this is one in a billion different
02:08:39.980 | possible conformational states that this protein could have.
02:08:42.340 | You're saying, well, it has this one
02:08:43.460 | because that was the only one it could take
02:08:46.060 | given its setting.
02:08:48.340 | - Well, yeah, I mean, currently humans are very good
02:08:51.100 | at that kind of first principles thinking.
02:08:52.940 | I was stepping back.
02:08:54.580 | But I think AI is really good at,
02:08:56.460 | you know, collect a huge amount of data
02:08:58.980 | and a huge amount of data of observation of planets
02:09:01.980 | and figure out that Earth is not
02:09:03.940 | at the center of the universe,
02:09:05.180 | that there's actually a sun, we're orbiting the sun.
02:09:08.020 | But then you can, as a human being,
02:09:09.780 | ask, well, how do solar systems come to be?
02:09:13.140 | What are the different forces that are required
02:09:17.500 | to make this kind of pattern emerge?
02:09:19.940 | And then you start to invent things like gravity.
02:09:23.020 | I mean, obviously.
02:09:24.080 | - Is it an invention?
02:09:26.900 | - I mixed up the ordering. Gravity
02:09:29.780 | wasn't considered as a thing that connects planets.
02:09:32.620 | But we are able to think about those big picture things
02:09:36.860 | as human beings.
02:09:38.260 | AI is just very good at inferring simple models
02:09:42.220 | from a huge amount of data.
02:09:44.240 | And the question is with biology,
02:09:47.700 | you know, we kind of go back and forth
02:09:49.420 | in how we solve biology.
02:09:50.940 | Listen, protein folding was thought
02:09:53.700 | to be impossible to solve.
02:09:55.340 | And there's a lot of brilliant PhD students
02:09:57.560 | that worked one protein at a time
02:09:59.460 | trying to figure out the structure.
02:10:00.900 | And the fact that AI was able to do that.
02:10:03.540 | - Oh, I'm not knocking it at all.
02:10:06.300 | But I think that people have been asking the wrong question.
02:10:09.700 | - But then as the people start to ask
02:10:13.500 | better and bigger questions,
02:10:17.220 | the AI kind of enters the chat and says,
02:10:20.900 | "I'll help you out with that."
02:10:22.700 | - Can I give you another example from my own work?
02:10:28.020 | The risk of getting a disease as we get older,
02:10:30.700 | there are genetic aspects to it.
02:10:34.940 | You know, if you spend your whole life
02:10:37.260 | overeating and smoking and whatever,
02:10:38.940 | that's a whole separate question.
02:10:41.540 | But there's a genetic side to the risk.
02:10:43.260 | And we know a few genes
02:10:45.500 | that increase your risk of certain things.
02:10:47.340 | And for probably 20 years now,
02:10:49.660 | people have been doing what's called GWAS,
02:10:51.660 | which is genome wide association studies.
02:10:55.300 | So you effectively scan the entire genome
02:10:58.780 | for any single nucleotide polymorphisms,
02:11:02.200 | which is to say a single letter change in one place
02:11:04.940 | that has a higher association
02:11:06.860 | of being linked with a particular disease or not.
02:11:09.260 | And you can come up with thousands of these things
02:11:10.940 | across the genome.
02:11:11.820 | And if you add them all up and try and say,
02:11:17.180 | well, so do they add up to explain
02:11:20.780 | the known genetic risk of this disease?
02:11:23.700 | And the known genetic risk often comes from twin studies.
02:11:26.140 | And you can say that if this twin gets epilepsy,
02:11:30.620 | there's a 40 or 50% risk that the other twin,
02:11:33.580 | identical twin, will also get epilepsy.
02:11:35.800 | Therefore, the genetic factor is about 50%.
02:11:39.060 | And so the gene similarities that you see
02:11:43.020 | should account for 50% of that known risk.
02:11:46.420 | Very often it accounts for less than a 10th
02:11:49.220 | of the known risk.
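The arithmetic of this "missing heritability" gap can be made concrete with made-up numbers. Only the ~50% twin-based figure echoes the example above; the SNP count and per-SNP effect sizes below are purely illustrative assumptions:

```python
# Illustrative "missing heritability" arithmetic. All effect sizes are
# made-up numbers; only the 50% twin-based figure echoes the example above.

twin_heritability = 0.50    # genetic share of risk implied by identical-twin studies

# Suppose a GWAS finds 1,000 genome-wide-significant SNPs, each explaining
# a tiny slice of the variance in disease risk (hypothetical effect size).
n_snps = 1000
variance_per_snp = 0.00004  # 0.004% of total variance per SNP

gwas_explained = n_snps * variance_per_snp                   # 4% of variance
fraction_of_known_risk = gwas_explained / twin_heritability  # 8% of the genetic risk

print(f"GWAS hits explain {gwas_explained:.0%} of variance "
      f"= {fraction_of_known_risk:.0%} of the known genetic risk")

# Resolution 1 ("not enough statistical power"): how many SNPs of this
# strength would be needed to cover the full twin-based heritability?
snps_needed = twin_heritability / variance_per_snp
print(f"~{snps_needed:,.0f} SNPs needed at this effect size")
# Resolution 2 (the mitochondrial argument) can't be expressed this way at
# all: the missing variance sits in DNA the scan threw away.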
02:11:50.820 | And there's two possible explanations.
02:11:52.980 | And there's one which people tend to do,
02:11:54.460 | which is to say, ah, well,
02:11:56.180 | we don't have enough statistical power.
02:11:58.180 | If we, maybe there's a million,
02:12:00.260 | we've only found a thousand of them.
02:12:01.620 | But if we find the other million,
02:12:02.860 | they're weakly related, but there's a huge number of them.
02:12:05.140 | And so we'll account for that whole risk.
02:12:07.420 | Maybe there's a billion of them, for instance.
02:12:11.500 | So that's one way.
02:12:12.740 | The other way is to say, well, hang on a minute,
02:12:15.860 | you're missing a system here.
02:12:17.020 | That system is the mitochondrial DNA,
02:12:19.140 | which people tend to dismiss because it's small
02:12:21.860 | and it doesn't change very much.
02:12:25.020 | But a few single letter changes in that mitochondrial DNA,
02:12:30.860 | it controls some really basic processes.
02:12:33.620 | It controls not only all the energy that we need to live
02:12:37.860 | and to move around and do everything we do,
02:12:39.700 | but also biosynthesis to make the new building blocks,
02:12:44.420 | to make new cells.
02:12:47.420 | And cancer cells very often kind of take over
02:12:49.780 | the mitochondria and rewire them
02:12:52.060 | so that instead of using them for making energy,
02:12:54.620 | they're effectively using them as precursors
02:12:56.500 | for the building blocks for biosynthesis.
02:12:58.420 | You need to make new amino acids, new nucleotides for DNA.
02:13:01.380 | You want to make new lipids
02:13:03.140 | to make your membranes and so on.
02:13:04.700 | So they kind of rewire metabolism.
02:13:06.940 | Now, the problem is that we've got all these interactions
02:13:10.220 | between mitochondrial DNA and the genes in the nucleus
02:13:13.420 | that are overlooked completely because people throw away,
02:13:16.660 | literally throw away the mitochondrial genes.
02:13:18.540 | And we can see in fruit flies that they interact
02:13:21.020 | and produce big differences in risk.
02:13:24.420 | So you can set AI onto this question
02:13:29.380 | of exactly how many of these base changes there are.
02:13:34.380 | That's just one possible solution
02:13:36.900 | that maybe there are a million of them
02:13:39.060 | and it does account for the greatest part of the risk.
02:13:41.180 | Or the other one is they aren't, it's just not there.
02:13:43.740 | Actually, the risk lies in something
02:13:45.260 | you weren't even looking at.
02:13:47.100 | And this is where human intuition is very important.
02:13:50.860 | And just this feeling that, well, I'm working on this
02:13:53.540 | and I think it's important and I'm bloody minded about it.
02:13:56.180 | And in the end, some people are right.
02:13:57.540 | It turns out that it was important.
02:14:00.140 | Can you get AI to do that, to be bloody minded?
02:14:03.140 | - And that, hang on a minute,
02:14:06.580 | you might be missing a whole other system here
02:14:09.300 | that's much bigger.
02:14:10.300 | That's the moment of discovery of scientific revolution.
02:14:17.500 | I'm giving up on saying AI can't do something.
02:14:21.380 | I've said it enough times about enough things.
02:14:25.220 | I think there's been a lot of progress.
02:14:27.460 | And instead, I'm excited by the possibility
02:14:30.220 | of AI helping humans.
02:14:31.420 | But at the same time, just like I said,
02:14:34.420 | we seem to dismiss the power of humans.
02:14:37.380 | - Yes, yes.
02:14:38.460 | - Like we're so limited in so many ways
02:14:43.940 | that we kind of, in what we feel like dumb ways,
02:14:48.540 | like we're not strong, we're kind of,
02:14:52.740 | our attention, our memory is limited.
02:14:57.060 | Our ability to focus on things is limited
02:15:00.260 | in our own perception of what limited is.
02:15:02.740 | But that actually, there's an incredible computer
02:15:05.460 | behind the whole thing that makes this whole system work.
02:15:08.900 | Our ability to interact with the environment,
02:15:11.820 | to reason about the environment.
02:15:13.300 | There's magic there.
02:15:14.900 | And I'm hopeful that AI can capture some of that same magic,
02:15:18.740 | but that magic is not gonna look like
02:15:20.820 | Deep Blue playing chess.
02:15:22.420 | - No.
02:15:23.260 | - It's going to be more interesting.
02:15:24.740 | - But I don't think it's gonna look like
02:15:26.460 | pattern finding either.
02:15:27.980 | I mean, that's essentially what you're telling me
02:15:29.620 | it does very well at the moment.
02:15:30.460 | And my point is it works very well
02:15:33.020 | where you're looking for the right pattern.
02:15:36.140 | But we are storytelling animals,
02:15:38.540 | and the hypothesis is a story.
02:15:40.860 | It's a testable story.
02:15:42.460 | But a new hypothesis is a leap into the unknown,
02:15:47.140 | and it's a new story, basically.
02:15:48.460 | And it says, this leads to this, this leads to that.
02:15:51.020 | It's a causal set of storytelling.
02:15:54.980 | - It's also possible that the leap into the unknown
02:15:57.460 | has a pattern of its own.
02:15:58.700 | - Yes, it is.
02:15:59.740 | - And it's possible that it's learnable.
02:16:02.540 | - I'm sure it is.
02:16:04.300 | There's a nice book by Arthur Koestler
02:16:06.860 | on the nature of creativity,
02:16:11.300 | and he likens it to a joke where the punchline
02:16:13.580 | goes off in a completely unexpected direction
02:16:15.580 | and says that this is the basis of human creativity,
02:16:19.420 | some creative switch of direction to an unexpected place
02:16:22.700 | is similar to a joke.
02:16:24.060 | I'm not saying that's how it works,
02:16:26.340 | but it's a nice idea and there must be some truth in it.
02:16:29.420 | And it's one of these,
02:16:32.380 | most of the stories we tell are probably the wrong story
02:16:34.940 | and probably going nowhere and probably not helpful.
02:16:37.620 | And we definitely don't do as well
02:16:39.860 | at seeing patterns in things,
02:16:41.740 | but some of the most enjoyable human aspects
02:16:44.420 | is finding a new story that goes to an unexpected place.
02:16:47.700 | And these are all aspects of what being human means to me.
02:16:51.580 | And maybe these are all things
02:16:53.820 | that AI figures out for itself,
02:16:55.780 | or maybe they're just aspects.
02:16:58.020 | But I just have the feeling sometimes
02:17:00.300 | that the people who are trying to understand
02:17:04.860 | what we are like,
02:17:08.780 | if we wish to craft an AI system,
02:17:10.620 | which is somehow human-like,
02:17:12.740 | that we don't have a firm enough grasp
02:17:16.500 | of what humans really are like
02:17:18.860 | in terms of how we are built.
02:17:20.580 | - But we get a better, better understanding of that.
02:17:25.020 | I agree with you completely.
02:17:26.620 | We try to build the thing and then we go,
02:17:29.300 | hang on a minute.
02:17:30.380 | - Yeah.
02:17:31.540 | - There's another system here.
02:17:33.060 | And that's actually the attempt to build AI
02:17:35.900 | that's human-like is getting us to a deeper understanding
02:17:38.540 | of human beings.
02:17:39.940 | The funny thing is I recently talked to Magnus Carlsen,
02:17:42.900 | widely considered to be the greatest chess player
02:17:44.700 | of all time.
02:17:45.540 | And he talked about AlphaZero,
02:17:48.500 | which is a system from DeepMind that plays chess.
02:17:51.620 | And he had a funny comment.
02:17:53.180 | He has a kind of dry sense of humor.
02:17:57.620 | But he was extremely impressed
02:17:59.420 | when he first saw AlphaZero play.
02:18:02.060 | And he said that it did a lot of things
02:18:04.580 | that could easily be mistaken for creativity.
02:18:07.380 | (laughing)
02:18:09.780 | So he like refused, as a typical human,
02:18:12.100 | refused to give the system sort of its due.
02:18:16.860 | Because he came up with a lot of things
02:18:18.500 | that a lot of people are extremely impressed by.
02:18:22.580 | Not just the sheer calculation,
02:18:24.180 | but the brilliance of play.
02:18:26.780 | So one of the things that it does
02:18:30.900 | in really interesting ways is it sacrifices pieces.
02:18:35.420 | So in chess that means you basically take a few steps back
02:18:39.940 | in order to take a step forward.
02:18:41.700 | You give away pieces for some future reward.
02:18:46.180 | And that, for us humans, is where art is in chess.
02:18:50.780 | You take big risks.
02:18:52.540 | That for us humans, those risks are especially painful
02:18:57.540 | because you have a fog of uncertainty before you.
02:19:02.260 | So to take a risk now based on intuition
02:19:05.100 | of I think this is the right risk to take.
02:19:07.460 | But there's so many possibilities
02:19:09.620 | that that's where it takes guts.
02:19:11.300 | That's where art is, that's that danger.
02:19:14.060 | And then AlphaZero takes those same kind of risks
02:19:22.380 | and does them to an even greater degree.
02:19:22.380 | But of course it does it from a,
02:19:25.180 | well you could easily reduce down
02:19:30.340 | to a cold calculation over patterns.
02:19:34.580 | But boy, when you see the final result,
02:19:37.820 | it sure looks like the same kind of magic
02:19:39.980 | that we see in creativity.
02:19:41.900 | When we see creative play on the chessboard.
02:19:45.180 | But the chessboard is very limited.
02:19:46.940 | And the question is, as we get better and better,
02:19:49.100 | can we do that same kind of creativity
02:19:52.500 | in mathematics, in programming,
02:19:55.700 | and then eventually in biology, psychology,
02:19:59.580 | and expand into more and more complex systems.
02:20:04.020 | I was, I used to go running when I was a boy
02:20:07.180 | and fell running, which is to say running up and down mountains
02:20:10.460 | and I was never particularly great at it.
02:20:12.740 | But there were some people who were amazingly fast,
02:20:16.780 | especially at running down.
02:20:18.580 | And I realized in trying to do this
02:20:21.100 | that there's only really two,
02:20:25.020 | there's three possible ways of doing it.
02:20:26.540 | And there's only two that work.
02:20:27.700 | Either you go extremely slowly and carefully
02:20:30.460 | and you figure out, okay, there's a stone.
02:20:32.460 | I'll put my foot on this stone
02:20:33.820 | and then there's another,
02:20:35.580 | there's a muddy puddle I'm going to avoid.
02:20:37.100 | And it's slow, it's laborious.
02:20:39.500 | You figure it out step by step.
02:20:41.300 | Or you can just go incredibly fast
02:20:44.260 | and you don't think about it at all.
02:20:45.580 | The entire conscious mind is shut out of it.
02:20:47.620 | And it's probably the same playing table tennis
02:20:49.700 | or something.
02:20:50.540 | There's something in the mind
02:20:51.380 | which is doing a whole lot of subconscious calculations
02:20:54.180 | about exactly, and it's amazing.
02:20:55.500 | You can run at astonishing speed down a hillside
02:20:58.180 | with no idea how you did it at all.
02:21:00.420 | And then you panic and you think,
02:21:01.820 | I'm gonna break my leg if I keep doing this.
02:21:03.500 | I've got to think about where I'm gonna put my foot.
02:21:05.580 | So you slow down a bit
02:21:06.460 | and try to bring the conscious mind in.
02:21:07.980 | And then you do, you crash.
02:21:09.700 | You cannot think consciously while running downhill.
02:21:14.060 | And so it's amazing how many calculations
02:21:18.380 | the mind is able to make.
02:21:21.260 | And now the problem with playing chess or something,
02:21:23.620 | if you're able to make all of those subconscious
02:21:26.500 | forward calculations about what is the likely outcome
02:21:30.300 | of this move now in the way that we can
02:21:33.300 | by running down a hillside or something,
02:21:35.540 | it's partly about what we have adapted to do.
02:21:38.180 | It's partly about the reality of the world that we're in.
02:21:40.460 | Running fast downhill is something
02:21:42.020 | that we better be bloody good at
02:21:43.140 | otherwise we're gonna be eaten.
02:21:44.900 | Whereas trying to calculate multiple, multiple moves
02:21:51.020 | into the future is not something
02:21:52.540 | we've ever been called on to do.
02:21:54.100 | Two or three, four moves into the future
02:21:55.780 | is quite enough for most of us most of the time.
02:22:00.140 | Yeah, yeah.
02:22:01.620 | So yeah, just solving chess may not,
02:22:04.100 | we may not be as far towards solving the problem
02:22:10.140 | of downhill running as we might think
02:22:14.660 | just because we solve chess.
02:22:16.280 | Still, it's beautiful to see creativity.
02:22:20.820 | Humans create machines.
02:22:23.340 | They're able to create art and art on a chess board
02:22:27.740 | and art otherwise.
02:22:29.460 | Who knows how far that takes us?
02:22:31.820 | So I mentioned Andrej Karpathy earlier.
02:22:35.340 | Him and I are big fans of yours.
02:22:37.380 | If you're taking votes, his suggestion was
02:22:39.740 | you should write your next book on the Fermi Paradox.
02:22:42.620 | So let me ask you on the topic of alien life
02:22:47.780 | since we've been talking about life
02:22:50.460 | and we're a kind of aliens.
02:22:52.020 | How many alien civilizations are out there do you think?
02:22:58.260 | - Well, the universe is very big, so some,
02:23:01.540 | but not as many as most people would like to think
02:23:04.260 | is my view because the idea that there is a trajectory
02:23:09.180 | going from simple cellular life like bacteria
02:23:14.180 | all the way through to humans.
02:23:16.300 | It seems to me there's some big gaps along that way
02:23:20.020 | that the eukaryotic cell, the complex cell that we have
02:23:23.380 | is the biggest of them, but also photosynthesis is another.
02:23:27.260 | The other interesting gap is a long gap
02:23:30.300 | from the origin of the eukaryotic cells
02:23:33.180 | to the first animals.
02:23:34.260 | That was about a billion years, maybe more than that.
02:23:37.820 | A long delay in when oxygen began to accumulate
02:23:41.980 | in the atmosphere.
02:23:42.980 | So from the first appearance of oxygen
02:23:44.740 | in the great oxidation event to enough for animals to respire
02:23:48.780 | was close to 2 billion years.
02:23:50.420 | Why so long?
02:23:53.060 | It seems to be planetary factors.
02:23:54.980 | It seems to be geology as much as anything else.
02:23:57.220 | And we don't really know what was going on.
02:24:00.700 | So the idea that there's a kind of an inevitable march
02:24:04.820 | towards complexity and sentient life,
02:24:09.820 | I don't think is right.
02:24:12.260 | Doesn't, not to say it's not gonna happen,
02:24:14.660 | but I think it's not gonna happen often.
02:24:17.740 | - So if you think of Earth,
02:24:19.220 | given the geological constraints and all that kind of stuff,
02:24:25.260 | do you have a sense that life, complex life,
02:24:28.220 | intelligent life happened really quickly on Earth
02:24:30.340 | or really long?
02:24:31.660 | So just to get a sense of,
02:24:34.020 | are you more sort of saying that it's very unlikely
02:24:38.700 | to get the kind of conditions required to create humans?
02:24:42.220 | Or is it, even if you have the condition,
02:24:44.860 | it's just statistically difficult?
02:24:46.780 | - I think the, I mean, the problem,
02:24:48.820 | the single great problem at the center of all of that,
02:24:51.140 | to my mind, is the origin of the eukaryotic cell,
02:24:53.260 | which happened once.
02:24:54.140 | And without eukaryotes, nothing else would have happened.
02:24:56.740 | And that is something that-
02:24:58.860 | - That's 'cause you're saying it's super important,
02:25:01.020 | the eukaryotes, but-
02:25:02.380 | - I'm saying it's tantamount to saying
02:25:04.580 | that it is impossible to build something as complex
02:25:06.980 | as a human being from bacterial cells.
02:25:09.260 | - Totally agree in some deep fundamental way.
02:25:11.860 | But it's just like one cell going inside another.
02:25:14.740 | Is that so difficult to get to work right?
02:25:17.100 | That like-
02:25:18.340 | - Well, again, it happened once.
02:25:21.620 | And if you think about, if you think,
02:25:24.420 | I'm in a minority view in this position.
02:25:27.940 | Most biologists probably wouldn't agree with me anyway.
02:25:30.540 | But if you think about the starting point,
02:25:33.100 | we've got a simple cell.
02:25:34.580 | It's an archaeal cell.
02:25:35.740 | We can be fairly sure about that.
02:25:36.860 | So it looks a lot like a bacterium,
02:25:39.180 | but it's in fact from this other domain of life.
02:25:42.420 | So it looks a lot like a bacterial cell.
02:25:44.260 | That means it doesn't have anything.
02:25:46.220 | It doesn't have a nucleus.
02:25:47.460 | It doesn't really have complex endomembranes.
02:25:50.220 | It has a little bit of stuff, but not that much.
02:25:53.700 | And it takes up an endosymbiont.
02:25:56.220 | So what happens next?
02:25:58.940 | And the answer is basically everything to do with complexity.
02:26:02.780 | To me, there's a beautiful paradox here.
02:26:04.940 | Plants and animals and fungi
02:26:09.340 | all have exactly the same type of cell,
02:26:12.820 | but they all have really different ways of living.
02:26:16.860 | So a plant cell is photosynthetic.
02:26:21.260 | They started out as algae in the oceans and so on.
02:26:24.260 | So think of algal bloom, single cell things.
02:26:26.700 | The basic cell structure that it's built from
02:26:32.260 | is exactly the same with a couple of small differences.
02:26:35.340 | It's got chloroplasts as well.
02:26:36.580 | It's got a vacuole.
02:26:37.420 | It's got a cell wall, but that's about it.
02:26:39.060 | Pretty much everything else is exactly the same
02:26:40.900 | in a plant cell and an animal cell.
02:26:42.660 | And yet the ways of life are completely different.
02:26:45.820 | So this cell structure did not evolve
02:26:49.140 | in response to different ways of life, different environments.
02:26:51.780 | I'm in the ocean doing photosynthesis.
02:26:53.380 | I'm on land running around as part of an animal.
02:26:56.100 | I'm a fungus in a soil,
02:26:58.060 | spreading out long kind of shoots
02:27:01.140 | into whatever it may be, mycelium.
02:27:03.700 | So they all have the same underlying cell structure.
02:27:08.580 | Almost certainly it was driven by adaptation
02:27:11.820 | to the internal environment,
02:27:13.100 | to having these pesky endosymbionts
02:27:15.340 | that forced all kinds of change on the host cell.
02:27:18.300 | Now, in one way, you could see that as a really good thing
02:27:20.060 | because it may be that there's some inevitability
02:27:22.500 | to this process, that as soon as you've got endosymbionts,
02:27:24.660 | you're more or less bound to go in that direction.
02:27:26.500 | Or it could be that there's a huge fluke about it
02:27:29.140 | and it's almost certain to go wrong
02:27:30.500 | in just about every case possible,
02:27:32.820 | that the conflict will lead to effectively war,
02:27:35.540 | leading to death and extinction,
02:27:37.700 | and it simply doesn't work out.
02:27:39.300 | So maybe it happened millions of times
02:27:40.820 | and it went wrong every time,
02:27:41.860 | or maybe it only happened once and it worked out
02:27:44.940 | because it was inevitable.
02:27:46.100 | And actually, we simply do not know enough now
02:27:48.620 | to say which of those two possibilities is true,
02:27:50.500 | but both of them are a bit grim.
02:27:52.220 | - But you're leaning towards,
02:27:55.460 | we just got really lucky in that one leap.
02:27:57.900 | So do you have a sense that our galaxy, for example,
02:28:02.460 | has just maybe millions of planets
02:28:06.260 | with bacteria living on it?
02:28:07.740 | - I would expect billions, tens of billions of planets
02:28:10.900 | with bacteria living on it, practically.
02:28:13.660 | I mean, there's probably, what,
02:28:14.740 | five to 10 planets per star,
02:28:17.660 | of which I would hope that at least one
02:28:19.540 | would have bacteria on.
02:28:20.820 | So I expect bacteria to be very common.
02:28:23.940 | I simply can't put a number otherwise.
02:28:26.860 | I mean, I expect it will happen elsewhere.
02:28:28.820 | It's not that I think we're living
02:28:29.940 | in a completely empty universe.
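The scale being described is easy to make concrete. A back-of-envelope sketch where every number is an assumption: a rough Milky Way star count, and a fraction deliberately far more conservative than the "at least one bacterial planet per star" hope above:

```python
# Back-of-envelope count of bacteria-bearing planets in the Milky Way.
# Both numbers are illustrative assumptions, not measurements.

stars = 100e9              # ~10^11 stars, a common rough figure for the galaxy
frac_with_bacteria = 0.1   # conservative: 1 star in 10 hosts one bacterial planet

with_bacteria = stars * frac_with_bacteria
print(f"~{with_bacteria:.0e} planets with bacteria")  # tens of billions
```

Even this pessimistic fraction lands in the tens of billions; the argument in the conversation is that almost none of them ever cross the eukaryote gap.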
02:28:31.380 | - That's so fascinating.
02:28:32.700 | - But I think that it's not gonna happen inevitably
02:28:35.620 | and there's something,
02:28:37.580 | that's not the only problem with complex life on Earth.
02:28:41.660 | I mentioned oxygen and animals and so on as well.
02:28:44.060 | And even humans, we came along very late.
02:28:46.140 | You go back 5 million years and would we be that impressed
02:28:49.300 | if we came across a planet full of giraffes?
02:28:52.140 | I mean, you'd think, hey, there's life here
02:28:53.980 | and it's a nice planet to colonize or something.
02:28:56.140 | We wouldn't think, oh, let's try
02:28:57.820 | and have a conversation with this giraffe.
02:29:00.420 | - Yeah, I'm not sure what exactly we would think.
02:29:04.220 | I'm not exactly sure what makes humans so interesting
02:29:07.940 | from an alien perspective
02:29:10.020 | or how they would notice.
02:29:11.700 | I'll talk to you about cities too
02:29:12.940 | 'cause that's an interesting perspective
02:29:14.260 | of how to look at human civilization.
02:29:16.720 | But your sense, I mean, of course you don't know,
02:29:20.480 | but it's an interesting world.
02:29:24.060 | It's an interesting galaxy.
02:29:25.420 | It's an interesting universe to live in
02:29:27.540 | that's just like every sun,
02:29:31.260 | like 90% of solar systems have bacteria in them.
02:29:39.940 | Imagine that world and the galaxy maybe has
02:29:44.940 | just a handful, if not one intelligent civilization.
02:29:50.360 | That's a wild world.
02:29:53.060 | - It's a wild world.
02:29:53.900 | - I didn't even think about that world.
02:29:55.820 | There's a kind of thought that,
02:29:58.820 | like one of the reasons it would be so exciting
02:30:00.860 | to find life on Mars or Titan or whatever
02:30:04.180 | is like if it's life is elsewhere,
02:30:05.860 | then surely, statistically,
02:30:09.980 | that life, no matter how unlikely,
02:30:12.220 | eukaryotes, multicellular organisms,
02:30:14.740 | sex, violence, what else is extremely difficult?
02:30:19.740 | I mean, photosynthesis,
02:30:22.620 | figuring out some machinery
02:30:25.880 | that involves the chemistry and the environment
02:30:28.260 | to allow the building up of complex organisms.
02:30:31.340 | Surely that would arise.
02:30:33.620 | But man, I don't know how I would feel
02:30:35.820 | about just bacteria everywhere.
02:30:38.820 | - Well, it would be depressing if it was true.
02:30:41.460 | I suppose depressing.
02:30:42.380 | - Always potential.
02:30:43.500 | I don't know what's more depressing,
02:30:44.660 | bacteria everywhere or nothing everywhere.
02:30:47.300 | - Yes, either of them are chilling.
02:30:50.020 | But whether it's chilling or not,
02:30:51.900 | I don't think should force us to change our view
02:30:55.940 | about whether it's real or not.
02:30:58.140 | And what I'm saying may or may not be true.
02:31:00.380 | - So how would you feel if we discovered life on Mars?
02:31:03.860 | - I'd be delighted.
02:31:04.700 | - It sounds like you would be less excited than some others.
02:31:07.740 | 'Cause you're like, well.
02:31:08.980 | - What I would be most interested in
02:31:10.460 | is how similar to life on Earth it would be.
02:31:12.180 | It would actually turn into quite a subtle problem
02:31:13.940 | because the likelihood of life having gone to and fro
02:31:18.940 | between Mars and the Earth is quite,
02:31:24.020 | I wouldn't say high, but it's not low.
02:31:26.060 | It's quite feasible.
02:31:27.380 | And so if we found life on Mars
02:31:29.460 | and it had very similar genetic code,
02:31:32.460 | but it was slightly different,
02:31:34.420 | most people would interpret that immediately
02:31:36.420 | as evidence that there'd been transit one way or the other
02:31:38.940 | and that it was a common origin of life on Mars
02:31:41.420 | or on the Earth and it went one way or the other way.
02:31:43.500 | The other way to see that question though
02:31:45.060 | would be to say, well, actually,
02:31:46.180 | the beginnings of life lie in deterministic chemistry
02:31:49.540 | and thermodynamics,
02:31:50.740 | starting with the most likely abundant materials,
02:31:53.900 | CO2 and water and a wet, rocky planet.
02:31:57.620 | And Mars was wet and rocky at the beginning.
02:32:00.020 | And will, I won't say inevitably,
02:32:02.100 | but potentially almost inevitably come up
02:32:04.060 | with a genetic code,
02:32:04.900 | which is not very far away from the genetic code
02:32:07.020 | that we already have.
02:32:08.060 | So we see subtle differences in the genetic code.
02:32:11.820 | What does it mean?
02:32:12.660 | It could be very difficult to interpret.
02:32:14.740 | - Is it possible, do you think,
02:34:15.740 | to tell the difference of something that truly originated independently?
02:32:19.900 | - I think if the stereochemistry was different,
02:32:23.100 | we have sugars, for example,
02:32:24.540 | that are the L form or the D form,
02:32:26.180 | and we have D sugars and L amino acids
02:32:31.380 | right across all of life.
02:32:32.860 | But lipids, the bacteria have one stereoisomer
02:32:37.860 | and the archaea have the other, the opposite stereoisomer.
02:32:42.340 | So it's perfectly possible to use one or the other one.
02:32:46.420 | And the same would almost certainly go for,
02:32:48.900 | I think George Church has been trying to make life
02:32:53.740 | based on the opposite stereoisomer.
02:32:55.900 | So it's perfectly possible to do and it will work.
02:33:00.620 | And if we were to find life on Mars
02:33:02.340 | that was using the opposite stereoisomer,
02:33:03.900 | that would be unequivocal evidence
02:33:06.140 | that life had started independently there.
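A back-of-the-envelope way to see why shared versus opposite handedness is so diagnostic (this is an editorial illustration, not something the speakers describe): if each independent origin of life "flips a coin" for the handedness of its sugars, amino acids, and lipids, two independent biospheres would match on all three only about one time in eight.

```python
# Toy sketch (an editorial illustration, not the speakers' model):
# treat each chiral "choice" made at an origin of life as an
# independent coin flip between two mirror-image options.
import random

CHIRAL_CHOICES = ["sugars", "amino_acids", "lipids"]  # simplified, hypothetical

def random_biochemistry(rng):
    """An independent origin picks one stereoisomer per chiral class."""
    return {c: rng.choice(["L", "D"]) for c in CHIRAL_CHOICES}

def fraction_matching(trials=100_000, seed=0):
    """How often would two *independent* origins match on every class?"""
    rng = random.Random(seed)
    matches = sum(
        random_biochemistry(rng) == random_biochemistry(rng)
        for _ in range(trials)
    )
    return matches / trials

print(fraction_matching())  # ~ (1/2)**3 = 0.125
```

So a Mars biosphere agreeing with Earth on every chiral choice would look like common ancestry, while even one opposite stereoisomer, as discussed above, would point to an independent start.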
02:33:08.940 | - So hopefully the life we find will be on Titan
02:33:12.420 | and Europa or something like that,
02:33:14.180 | where it's less likely that we shared
02:33:16.540 | and it's harsher conditions,
02:33:18.180 | so there's gonna be weirder kind of life.
02:33:20.780 | - I wouldn't count on that because life started
02:33:23.780 | in deep sea hydrothermal vents.
02:33:25.900 | - It's harsh.
02:33:27.100 | - That's pretty harsh, yeah.
02:33:28.620 | So Titan is different.
02:33:29.900 | Europa is probably quite similar to Earth
02:33:32.020 | in the sense that we're dealing with an ocean,
02:33:34.460 | it's an acidic ocean there,
02:33:35.820 | as the early Earth would have been.
02:33:38.780 | And it almost certainly has hydrothermal systems.
02:33:41.300 | Same with Enceladus.
02:33:42.980 | We can tell that from these plumes
02:33:44.860 | coming from the surface through the ice.
02:33:46.860 | We know there's a liquid ocean
02:33:47.980 | and we can tell roughly what the chemistry is.
02:33:50.460 | For Titan, we're dealing with liquid methane
02:33:53.580 | and things like that.
02:33:54.420 | So that would really, if there really is life there,
02:33:56.140 | it would really have to be very, very different to anything
02:33:59.460 | that we know on Earth.
02:34:00.940 | - So the hard leap, the hardest leap,
02:34:02.980 | the most important leap is from prokaryotes
02:34:06.500 | to eukaryotes.
02:34:09.500 | What's the second, if we're ranking?
02:34:12.020 | You gave a lot of emphasis on photosynthesis.
02:34:17.460 | - Yeah, and that would be my second one, I think.
02:34:20.460 | But it's not so much,
02:34:22.420 | I mean, photosynthesis is part of the problem.
02:34:25.220 | It's a difficult thing to do.
02:34:26.820 | Again, we know it happened once.
02:34:29.980 | We don't know why it happened once.
02:34:31.780 | But the fact that it was kind of taken on board completely
02:34:38.020 | by plants and algae and so on as chloroplasts
02:34:43.180 | and did very well in completely different environments
02:34:47.380 | and then on land and whatever else
02:34:49.060 | seems to suggest that there's no problem with exploring,
02:34:54.060 | you could have a separate origin
02:34:55.180 | that explored this whole domain over there
02:34:56.900 | that the bacteria had never gone into.
02:34:58.780 | So that kind of says that the reason
02:35:02.300 | that it only happened once is probably
02:35:03.740 | because it's difficult, because the wiring is difficult.
02:35:06.580 | But then it happened at least 2.2 billion years ago,
02:36:11.940 | right before the GOE, the Great Oxidation Event,
02:35:13.660 | maybe as long as 3 billion years ago,
02:35:16.020 | when there are, some people say there are whiffs of oxygen,
02:35:18.460 | there's just kind of traces in the fossil,
02:35:20.100 | in the geochemical record that say
02:35:22.100 | maybe there was a bit of oxygen then.
02:35:23.860 | That's really disputed.
02:35:25.300 | Some people say it goes all the way back
02:35:26.980 | 4 billion years ago and that it was
02:35:29.820 | the common ancestry of life on earth was photosynthetic.
02:35:32.460 | So immediately you've got groups of people
02:35:34.820 | who disagree over a 2 billion year period of time
02:35:37.460 | about when it started.
02:35:39.620 | But well, let's take the latest date when it's unequivocal,
02:35:45.620 | that's 2.2 billion years ago,
02:35:47.380 | through to around about the time of the Cambrian explosion
02:35:49.980 | when oxygen levels definitely got close to modern levels.
02:35:54.060 | Which was around about 550 million years ago.
02:35:56.620 | So we've gone more than 1.5 billion years
02:36:00.020 | where the earth was in stasis.
02:36:01.620 | Nothing much changed.
02:36:04.860 | It's known as the boring billion, in fact.
02:36:06.940 | Probably stuff was,
02:36:09.620 | that was when eukaryotes arose somewhere in there,
02:36:11.380 | but it's...
02:36:12.260 | So this idea that the world is constantly changing,
02:36:17.900 | that we're constantly evolving,
02:36:19.380 | that we're moving up some ramp,
02:36:20.780 | it's a very human idea,
02:36:21.980 | but in reality,
02:36:23.460 | there are kind of tipping points
02:36:30.460 | to a new stable equilibrium
02:36:33.060 | where the cells that are producing oxygen
02:36:36.100 | are precisely counterbalanced by the cells
02:36:38.100 | that are consuming that oxygen,
02:36:39.700 | which is why it's 21% now
02:36:41.980 | and has been that way for hundreds of millions of years.
02:36:44.340 | We have a very precise balance.
02:36:46.660 | You go through a tipping point
02:36:47.820 | and you don't know where the next stable state's gonna be,
02:36:51.580 | but it can be a long way from here.
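The tipping-point idea described here can be sketched with a toy one-variable system (purely illustrative; the function and numbers are editorial choices, not a real geochemical model): while a forcing term is small, the system has two stable states, and once the forcing crosses a threshold one of those states simply vanishes, so the system must jump somewhere else.

```python
# Toy tipping point: dx/dt = x - x**3 + c has two stable states
# (plus one unstable one) while |c| is small, but only a single
# state once c crosses roughly 0.385. Illustrative only.

def equilibria(c, lo=-2.0, hi=2.0, steps=4001):
    """Count equilibria of f(x) = x - x**3 + c by sign changes on a grid."""
    f = lambda x: x - x**3 + c
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return sum(1 for a, b in zip(xs, xs[1:]) if f(a) * f(b) < 0)

print(equilibria(0.0))  # 3 equilibria: two stable states, one unstable
print(equilibria(0.5))  # 1 equilibrium: the old state is gone; the system tips
```

The point of the sketch is only qualitative: past the threshold there is no nearby equilibrium to return to, which is why the next stable state "can be a long way from here."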
02:36:53.340 | And so if we change the world with global warming,
02:36:57.180 | there will be a tipping point.
02:36:58.660 | Question is where and when,
02:37:00.020 | and what's the next stable state?
02:37:01.820 | It may be uninhabitable to us.
02:37:03.660 | It'll be habitable to life, for sure,
02:37:07.020 | but there may be something like the Permian extinction
02:37:08.940 | where 95% of species go extinct
02:37:11.420 | and there's a five to 10 million year gap
02:37:13.900 | and then life recovers, but without humans.
02:37:16.780 | - And the question statistically,
02:37:18.100 | well, without humans, but statistically,
02:37:19.860 | does that ultimately lead to greater complexity,
02:37:23.140 | more interesting life, more intelligent life?
02:37:25.340 | - Well, after the first appearance of oxygen with the GOE,
02:37:29.700 | there was a tipping point
02:37:30.900 | which led to a long-term stable state
02:37:32.660 | that was equivalent to the Black Sea today,
02:37:34.980 | which is to say oxygenated at the very surface
02:37:37.140 | and stagnant, sterile, not sterile,
02:37:39.300 | but sulfurous lower down.
02:37:42.260 | And that was stable,
02:37:46.100 | certainly around the continental margins,
02:37:47.940 | for more than a billion years.
02:37:50.180 | It was not a state that led to progression
02:37:52.260 | in an obvious way.
02:37:53.360 | - Yeah, I mean, it's interesting to think about evolution,
02:37:58.220 | like what leads to stable states
02:38:00.020 | and how often are evolutionary pressures
02:38:06.260 | emerging from the environment?
02:38:10.060 | So maybe other planets are able to create
02:38:14.180 | evolutionary pressures, chemical pressures, whatever,
02:38:16.900 | some kind of pressure that say,
02:38:18.460 | you're screwed unless you get your shit together
02:38:20.260 | in the next 10,000 years, a lot of pressure.
02:38:25.260 | It seems like Earth,
02:38:28.260 | the boring billion might be explained in two ways.
02:38:31.340 | One, it's super difficult to take any kind of next step.
02:38:34.860 | And the second way it could be explained
02:38:37.260 | is there's no reason to take the next step.
02:38:39.180 | - No, I think there is no reason,
02:38:40.780 | but at the end of it, there was a snowball Earth.
02:38:45.440 | So there was a planetary catastrophe on a huge scale
02:38:47.960 | where the sea was frozen at the equator.
02:38:52.960 | And that forced change in one way or another.
02:38:59.480 | It's not long after that, 100 million years,
02:39:01.400 | perhaps after that, so not a short time,
02:39:03.200 | but this is when we begin to see animals.
02:39:04.760 | There was a shift again, another tipping point
02:39:07.720 | that led to catastrophic change
02:39:09.120 | that led to a takeoff then.
02:39:12.320 | We don't really know why,
02:39:13.440 | but one of the reasons why that I discuss in the book
02:39:16.080 | is about sulfate being washed into the oceans,
02:39:21.240 | which sounds incredibly parochial.
02:39:23.680 | But the issue is, I mean, what the data is showing,
02:39:28.200 | we can track roughly how oxygen was going
02:39:30.960 | into the atmosphere from carbon isotopes.
02:39:35.800 | So there's two main isotopes of carbon
02:39:38.120 | that we need to think about here.
02:39:39.440 | One is carbon-12, 99% of carbon is carbon-12.
02:39:42.680 | And then 1% of carbon is carbon-13,
02:39:44.800 | which is a stable isotope.
02:39:46.040 | And then there's carbon-14, which is radioactive,
02:39:48.960 | but it's trivial in amount.
02:39:50.800 | So carbon-13 is 1%.
02:39:53.120 | And life and enzymes generally,
02:39:56.440 | you can think of carbon atoms as little balls
02:40:00.000 | bouncing around, ping pong balls bouncing around.
02:40:01.840 | Carbon-12 moves a little bit faster than carbon-13
02:40:04.280 | because it's lighter.
02:40:05.880 | And it's more likely to encounter an enzyme.
02:40:08.600 | And so it's more likely to be fixed into organic matter.
02:40:11.840 | And so organic matter is enriched,
02:40:13.360 | and this is just an observation,
02:40:14.480 | it's enriched in carbon-12 by a few percent
02:40:17.640 | compared to carbon-13 relative to what you would expect
02:40:20.280 | if it was just equal.
02:40:21.600 | And if you then bury organic matter as coal
02:40:26.600 | or oil or whatever it may be,
02:40:30.200 | then it's no longer oxidized.
02:40:31.520 | So some oxygen remains left over in the atmosphere.
02:40:35.160 | And that's how oxygen accumulates in the atmosphere.
02:40:37.440 | And you can work out historically how much oxygen
02:40:39.840 | there must have been in the atmosphere
02:40:40.920 | by how much carbon was being buried.
02:40:43.600 | And you think, well, how can we possibly know
02:40:45.080 | how much carbon was being buried?
02:40:46.520 | And the answer is, well, if you're burying carbon-12,
02:40:49.320 | what you're leaving behind is more carbon-13 in the oceans.
02:40:52.080 | And that precipitates out as limestone.
02:40:54.360 | So you can look at limestones over these ages
02:40:56.400 | and work out what's the carbon-13 signal.
02:40:59.080 | And that gives you a kind of a feedback
02:41:00.960 | on what the oxygen content was.
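The mass-balance logic being described is a standard one in isotope geochemistry; here is a minimal sketch using textbook-typical numbers (the -5 permil mantle input and ~25 permil organic fractionation are assumed illustrative values, not figures from the conversation). The carbonate record then gives the fraction of carbon buried as organics, which is what leaves oxygen behind.

```python
# Hedged sketch of the carbon-isotope mass balance described above.
# Values are textbook-typical assumptions, not the speaker's numbers.

D13C_INPUT = -5.0  # permil, typical volcanic/mantle carbon input
EPSILON = 25.0     # permil, enrichment of carbonate over organic carbon

def organic_burial_fraction(d13c_carbonate):
    """Mass balance: input = f_org*(carb - eps) + (1 - f_org)*carb."""
    return (d13c_carbonate - D13C_INPUT) / EPSILON

# Carbonates near 0 permil (the long-term norm) imply ~20% organic burial:
print(round(organic_burial_fraction(0.0), 2))    # 0.2

# A big negative excursion (carbonates at -10 permil) gives a negative
# "burial" fraction, i.e. net *oxidation* of old organic carbon --
# the oxygen-consuming paradox discussed next:
print(round(organic_burial_fraction(-10.0), 2))  # -0.2
```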
02:41:03.520 | Right before the Cambrian explosion,
02:41:05.160 | there was what's called a negative isotope anomaly excursion,
02:41:08.760 | which is basically the carbon-13 goes down
02:41:10.840 | by a massive amount and then back up again
02:41:12.720 | 10 million years later.
02:41:13.880 | And what that seems to be saying
02:41:17.400 | is the amount of carbon-12 in the oceans was disappearing,
02:41:22.400 | which is to say it was being oxidized.
02:41:28.560 | And if it's being oxidized, it's consuming oxygen.
02:41:33.160 | And that should, so a big carbon-13 signal says
02:41:36.360 | the ratio of carbon-12 to carbon-13 is really going down,
02:41:39.920 | which means there's much more carbon-12 being taken out
02:41:43.560 | and being oxidized.
02:41:44.400 | Sorry, this is getting too complex, but.
02:41:46.240 | - Well, it's a good way to estimate the amount of oxygen.
02:41:49.720 | - If you calculate the amount of oxygen
02:41:51.560 | based on the assumption that all this carbon-12
02:41:53.920 | that's being taken out is being oxidized by oxygen,
02:41:56.160 | the answer is all the oxygen in the atmosphere
02:41:58.040 | gets stripped out, there is none left.
02:42:00.000 | And yet the rest of the geological indicators say,
02:42:03.400 | no, there's oxygen in the atmosphere.
02:42:06.160 | So it's kind of a paradox.
02:42:07.600 | And the only way to explain this paradox
02:42:09.840 | just on mass balance of how much stuff is in the air,
02:42:12.560 | how much stuff is in the oceans, and so on,
02:42:15.280 | is to assume that the oxidant was not oxygen, it was sulfate.
02:42:19.680 | Sulfate was being washed into the oceans.
02:42:22.480 | It's used as an electron acceptor
02:42:24.680 | by sulfate-reducing bacteria,
02:42:25.960 | just as we use oxygen as an electron acceptor.
02:42:28.240 | So they pass their electrons to sulfate instead of oxygen.
02:42:31.720 | - Bacteria did.
02:42:32.720 | - Yeah, yeah, so these are bacteria.
02:42:36.520 | So they're oxidizing carbon, organic carbon,
02:42:40.560 | with sulfate, passing the electrons onto sulfate.
02:42:43.320 | That reacts with iron to form iron pyrite, or fool's gold,
02:42:47.560 | sinks down to the bottom, gets buried out of the system.
02:42:50.360 | And this can account for the mass balance.
02:42:53.920 | So why does it matter?
02:42:55.600 | It matters because what it says is there was a chance event,
02:43:00.000 | tectonically, there was a lot of sulfate sitting on land
02:43:03.200 | as some kind of mineral.
02:43:06.560 | So calcium sulfate minerals, for example, are evaporitic.
02:43:10.320 | And because there happened to be some continental collisions,
02:43:16.280 | mountain building, the sulfate was pushed up the side
02:43:20.240 | of a mountain and happened to get washed into the ocean.
02:43:23.000 | - Yeah, so I wonder how many happy accidents
02:43:26.400 | like that are possible.
02:43:27.400 | - Statistically, it's really hard.
02:43:28.760 | Maybe you can rule that in statistically,
02:43:30.760 | or rule it out, but this is the course of life on Earth.
02:43:34.040 | Without all that sulfate being raised up,
02:43:36.440 | this Cambrian explosion almost certainly
02:43:38.080 | would not have happened, and then we wouldn't
02:43:40.360 | have had animals, and so on and so on.
02:43:42.280 | So it's, you know, it's--
02:43:44.200 | - This kind of explanation of the Cambrian explosion,
02:43:46.840 | so let me actually say it in several ways.
02:43:51.880 | So, you know, folks who challenge the validity
02:43:55.840 | of the theory of evolution will give us an example,
02:44:00.840 | now I'm not well studied in this,
02:44:02.600 | but will give us an example of the Cambrian explosion
02:44:04.800 | as like, this thing's weird.
02:44:06.680 | - Oh, it is weird, yeah.
02:44:08.880 | - So the question I would have is,
02:44:13.720 | what's the biggest mystery or gap in understanding
02:44:17.240 | about evolution?
02:44:18.540 | Is it the Cambrian explosion, and if so,
02:44:21.600 | how do we, what's our best understanding
02:44:23.560 | of how to explain, first of all, what is it?
02:44:27.040 | In my understanding, in the short amount of time,
02:44:30.940 | maybe 10 million years, 100 million years,
02:44:32.640 | something like that, a huge number of animals,
02:44:35.800 | variety, diversity of animals were created.
02:44:38.580 | Anyway, there's like five questions in there.
02:44:41.760 | - Yeah. - Is that the biggest
02:44:42.600 | mystery to you about evolution?
02:44:43.440 | - No, I don't think it's a particularly big mystery,
02:44:45.360 | really, anymore, I mean, it's,
02:44:47.280 | there are still mysteries about why then,
02:44:51.240 | and I've just said sulfate being washed
02:44:52.880 | into the oceans is one, it needs oxygen,
02:44:55.040 | and oxygen levels rose around that time,
02:44:59.220 | so probably before that, they weren't high enough
02:45:01.820 | for animals.
02:45:03.000 | What we're seeing with the Cambrian explosion
02:45:04.740 | is the beginning of predators and prey relationships.
02:45:07.940 | We're seeing modern ecosystems,
02:45:11.100 | and we're seeing arms races, and we're seeing,
02:45:13.860 | we're seeing the full creativity of evolution unleashed.
02:45:19.380 | So I talked about the boring billion,
02:45:22.620 | nothing happens for one and a half,
02:45:25.340 | one billion years, one and a half billion years.
02:45:28.440 | The assumption, and this is completely wrong,
02:45:31.840 | this assumption, is then that evolution works really slowly,
02:45:36.240 | and that you need billions of years
02:45:38.040 | to affect some small change,
02:45:40.720 | and then another billion years to do something else,
02:45:42.520 | and it's completely wrong.
02:45:44.680 | Evolution gets stuck in a stasis,
02:45:46.280 | and it stays that way for tens of millions,
02:45:48.400 | hundreds of millions of years,
02:45:50.200 | and Stephen Jay Gould used to argue this,
02:45:52.540 | he called it punctuated equilibrium,
02:45:54.000 | but he was doing it to do with animals
02:45:55.560 | and to do with the last 500 million years or so,
02:45:58.700 | where it's much less obvious
02:46:00.060 | than if you think about the entire planetary history,
02:46:02.780 | and then you realize that the first two billion years
02:46:04.820 | was bacteria only.
02:46:06.580 | You have the origin of life,
02:46:07.800 | two billion years of just bacteria,
02:46:09.820 | oxygen and photosynthesis arising here,
02:46:11.940 | then you have a global catastrophe,
02:46:14.340 | snowball earths and great oxidation event,
02:46:16.460 | and then another billion years of nothing happening,
02:46:18.260 | and then some period of upheavals,
02:46:20.440 | and then another snowball earth,
02:46:21.700 | and then suddenly you see the Cambrian explosion.
02:46:23.580 | This is long periods of stasis,
02:46:25.900 | where the world is in a stable state,
02:46:27.740 | and is not geared towards increasing complexity,
02:46:31.100 | it's just everything is in balance,
02:46:33.500 | and only when you have a catastrophic level,
02:46:35.620 | global level problem like a snowball earth,
02:46:38.660 | it forces everything out of balance,
02:46:40.340 | and there's a tipping point,
02:46:41.340 | and you end up somewhere else.
02:46:42.660 | Now, the idea that evolution is slow is wrong,
02:46:47.660 | it can be incredibly fast,
02:46:50.380 | and I mentioned earlier on,
02:46:52.140 | you can, in theory,
02:46:53.700 | it would take half a million years to invent an eye,
02:46:56.180 | for example, from a light sensitive spot.
02:46:57.980 | It doesn't take long to convert,
02:47:00.720 | one kind of tube into a tube with knobbles on it,
02:47:05.580 | into a tube with arms on it,
02:47:07.660 | and then multiple arms,
02:47:08.740 | and then one end is a head,
02:47:10.380 | where it starts out as a swelling.
02:47:11.740 | It's not difficult intellectually
02:47:14.700 | to understand how these things can happen.
02:47:16.920 | It boggles the mind that it can happen so quickly,
02:47:21.060 | but we're used to human time scales,
02:47:24.820 | and what we need to talk about is generations
02:47:26.940 | of things that live for a year in the ocean,
02:47:29.640 | and then a million years is a million generations,
02:47:33.360 | and the amount of change that you can do,
02:47:36.020 | you can affect in that period of time is enormous,
02:47:39.060 | and we're dealing with large populations of things
02:47:41.060 | where selection is sensitive to pretty small changes,
02:47:44.100 | and can, so again,
02:47:46.380 | as soon as you throw in the competition
02:47:49.300 | of predators and prey,
02:47:50.900 | and you're ramping up the scale of evolution,
02:47:53.860 | it's not very surprising that it happens very quickly
02:47:56.220 | when the environment allows it to happen.
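The point about generations can be made concrete with a standard deterministic selection model (an editorial sketch under simple assumptions: haploid selection, a very large population, illustrative numbers): even a tiny fitness advantage sweeps to near-fixation in a small fraction of a million generations.

```python
# Sketch of how fast selection works in generation time, not years.
# Deterministic haploid model; parameters are illustrative assumptions.

def generations_to_fix(p0=0.001, s=0.001, threshold=0.99):
    """Iterate p' = p(1+s) / (1 + p*s) until frequency passes threshold."""
    p, gens = p0, 0
    while p < threshold:
        p = p * (1 + s) / (1 + p * s)
        gens += 1
    return gens

print(generations_to_fix())  # on the order of 10,000 generations
```

With one generation per year in the ocean, that is roughly ten thousand years, which is why a million generations leaves room for enormous change.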
02:47:58.740 | So I don't think there's a big mystery.
02:47:59.980 | There's lots of details that need to be filled in.
02:48:03.500 | I mean, the big mystery in biology is consciousness.
02:48:07.140 | - The big mystery in biology is consciousness.
02:48:13.420 | Well, intelligence is kind of a mystery too.
02:48:17.620 | I mean, you said biology, not psychology,
02:48:26.460 | 'cause from a biology perspective,
02:48:30.700 | it seems like intelligence and consciousness
02:48:32.460 | are all the same, like weird,
02:48:34.620 | like all the brain stuff.
02:48:37.740 | - I don't see intelligence as necessarily that difficult,
02:48:41.860 | I suppose.
02:48:42.700 | I mean, I see it as a form of computing,
02:48:44.820 | and I don't know much about computing, so I.
02:48:47.020 | (laughing)
02:48:48.500 | - Well, you don't know much about consciousness either,
02:48:50.260 | so I mean, I suppose, oh, I see.
02:48:54.380 | I see, I see, I see, I see.
02:48:56.260 | That consciousness you do know a lot about as a human being.
02:49:00.340 | - No, no, I mean, I think I can understand
02:49:03.820 | the wiring of a brain as a series of,
02:49:07.220 | in pretty much the same way as a computer in theory,
02:49:12.500 | in terms of the circuitry of it.
02:49:16.700 | The mystery to me is how this system gives rise to feelings
02:49:21.700 | as we were talking about earlier on.
02:49:23.180 | - Yeah, I just, I think we oversimplify intelligence.
02:49:27.860 | I think the dance, the magic of reasoning
02:49:31.900 | is as interesting as the magic of feeling.
02:49:35.220 | We tend to think of reasoning as like very,
02:49:40.980 | very, running a very simplistic algorithm.
02:49:45.460 | I think reasoning is the interplay between memory,
02:49:48.820 | whatever the hell is going on in the unconscious mind.
02:49:51.960 | All of that.
02:49:53.000 | - I'm not trying to diminish it in any way at all.
02:49:58.500 | Obviously, it's extraordinarily, exquisitely complex,
02:50:01.940 | but I don't see a logical difficulty with how it works.
02:50:06.940 | - Yeah, no, I mean, I agree with you,
02:50:08.700 | but sometimes, yeah, there's a big cloak of mystery
02:50:13.580 | around consciousness.
02:50:14.940 | - I mean, let me compare it
02:50:16.980 | with classical versus quantum physics.
02:50:20.020 | Classical physics is logical,
02:50:23.140 | and you can understand the kind of language
02:50:27.100 | we're dealing with.
02:50:27.940 | It's almost at the human level.
02:50:29.420 | We're dealing with stars and things that we can see,
02:50:31.360 | and when you get to quantum mechanics and things,
02:50:34.500 | it's practically impossible for the human mind
02:50:36.820 | to compute what just happened there.
02:50:39.620 | - Yeah, I mean, that is the same.
02:50:41.940 | It's like you understand mathematically
02:50:45.220 | the notes of a musical composition.
02:50:47.600 | That's intelligence.
02:50:48.620 | - Yes.
02:50:49.440 | - But why it makes you feel a certain way,
02:50:51.280 | that is much harder to understand.
02:50:57.580 | Yeah, that's really, but it was interesting framing it,
02:51:02.580 | that that's a mystery at the core of biology.
02:51:05.500 | I wonder who solves consciousness.
02:51:08.720 | I tend to think consciousness will be solved
02:51:11.820 | by the engineer, meaning the person who builds it,
02:51:16.220 | who keeps trying to build the thing,
02:51:19.340 | versus biology, such a complicated system.
02:51:23.840 | I feel like the building blocks of consciousness
02:51:29.900 | from a biological perspective are like,
02:51:34.140 | that's like the final creation of a human being.
02:51:36.920 | So you have to understand the whole damn thing.
02:51:38.940 | You said electrical fields, but like,
02:51:40.860 | electrical fields plus plus, everything,
02:51:45.100 | the whole shebang.
02:51:47.220 | - I'm inclined to agree.
02:51:48.340 | I mean, my feeling is from my meager knowledge
02:51:51.860 | of the history of science is that the biggest breakthrough
02:51:53.900 | is usually comes through from a field
02:51:55.660 | that was not related to it.
02:51:57.380 | So if anything, you know, it's not gonna be a biologist
02:51:59.980 | who solves consciousness, just because biologists
02:52:03.020 | are too embedded in the nature of the problem.
02:52:06.980 | And then nobody's gonna believe you when you've done it,
02:52:09.020 | because nobody's gonna be able to prove
02:52:10.500 | that this AI is in fact conscious and sad in any case,
02:52:15.500 | and any more than you can prove
02:52:17.520 | that a dog is conscious and sad.
02:52:19.300 | So it tells you that it is, in good language,
02:52:23.220 | and you must believe it.
02:52:24.820 | But I think most people will accept,
02:52:27.440 | if faced with that, that that's what it is.
02:52:30.820 | All of this probability of complex life.
02:52:35.160 | In one way, I think why it matters is that,
02:52:42.480 | my expectation, I suppose, is that we will be,
02:52:48.500 | over the next 100 years or so, if we survive at all,
02:52:51.220 | that AI will increasingly dominate,
02:52:53.980 | and pretty much anything that we put out into space,
02:52:56.780 | going, looking for other, well, for the universe,
02:53:00.220 | for what's out there, will be AI.
02:53:02.660 | Won't be us, we won't be doing that,
02:53:04.700 | or when we do, it'll be on a much more limited scale.
02:53:07.420 | I suppose the same would apply to any alien civilization.
02:53:12.420 | So perhaps, rather than looking for signs of life out there,
02:53:15.400 | we should be looking for AI out there.
02:53:18.020 | But then we face the problem that,
02:53:23.560 | I don't see how a planet is going to give rise
02:53:27.780 | directly to AI.
02:53:29.980 | I can see how a planet can give rise
02:53:31.340 | directly to organic life.
02:53:33.000 | And if the principles that govern the evolution of life
02:53:36.940 | on Earth apply to other planets as well,
02:53:40.140 | and I think a lot of them would,
02:53:41.740 | then the likelihood of ending up with
02:53:46.220 | a human-like civilization capable of giving rise to AI
02:53:49.100 | in the first place is massively limited.
02:53:52.660 | Once you've done it once, perhaps it takes over the universe
02:53:54.620 | and maybe there's no issue.
02:53:57.380 | But it seems to me that the two are necessarily linked,
02:54:01.060 | that you're not gonna just turn a sterile planet
02:54:03.980 | into an AI life form without the intermediary
02:54:06.980 | of the organics first.
02:54:08.420 | - So you have to run the full evolutionary computation
02:54:13.220 | with the organics to create AI.
02:54:15.460 | - How does AI bootstrap itself up without the aid,
02:54:18.300 | if you like, of an intelligent designer?
02:54:20.580 | - The origin of AI is going to have to be
02:54:24.100 | in the chemistry of a planet.
02:54:26.820 | So, but that's not a limiting factor, right?
02:54:29.980 | So, I mean, so there's,
02:54:32.260 | let me ask the Fermi paradox question.
02:54:34.320 | Let's say we live in this incredibly dark
02:54:39.820 | and beautiful world of just billions of planets
02:54:44.820 | with bacteria on it and very few intelligent civilizations,
02:54:49.220 | and yet there's a few out there.
02:54:52.580 | Why haven't we at scale seen them visit us?
02:54:57.580 | What's your sense?
02:54:59.900 | Is it because they don't exist?
02:55:01.580 | Is it because-- - Well, don't exist
02:55:03.460 | in the right part of the universe at the right time,
02:55:05.420 | that's the simplest answer for it.
02:55:07.120 | - Is that the one you find the most compelling
02:55:10.380 | or is there some other explanation?
02:55:12.220 | - I find that, no, it's not that I find it more compelling,
02:55:16.820 | it's that I find more probable and I find all of them,
02:55:20.940 | I mean, there's a lot of hand-waving in this,
02:55:22.580 | we just don't know.
02:55:23.560 | So, I'm trying to read out from what I know
02:55:27.020 | about life on Earth to what might happen somewhere else.
02:55:30.540 | And it gives, to my mind, a bit of a pessimistic view
02:55:33.580 | of bacteria everywhere and only occasional intelligent life
02:55:37.460 | and running forward humans only once on Earth
02:55:40.860 | and nothing else that you would necessarily be
02:55:43.460 | any more excited about making contact with
02:55:45.360 | than you would be making contact with them on Earth.
02:55:49.260 | So, I think the chances are pretty limited
02:55:52.580 | and the chances of us surviving are pretty limited too.
02:55:56.780 | The way we're going on at the moment,
02:55:58.020 | the likelihood of us not making ourselves extinct
02:56:01.060 | within the next few hundred years,
02:56:03.940 | possibly within the next 50 or 100 years seems quite small.
02:56:08.220 | I hope we can do better than that.
02:56:09.920 | So, maybe the only thing that will survive from humanity
02:56:13.360 | will be AI and maybe AI, once it exists
02:56:15.660 | and once it's capable of effectively copying itself
02:56:19.420 | and cutting humans out of the loop,
02:56:21.180 | then maybe that will take over the universe.
02:56:24.620 | - I mean, there's a kind of inherent sadness
02:56:26.540 | to the way you described that,
02:56:28.100 | but isn't that also potentially beautiful
02:56:31.820 | that that's the next step of life?
02:56:33.600 | I suppose, from your perspective,
02:56:38.740 | as long as it carries the flame of consciousness somehow.
02:56:41.420 | - I think, yes, there can be some beauty
02:56:42.980 | to it being the next step of life.
02:56:44.500 | And I don't know if consciousness matters or not
02:56:46.420 | from that point of view, to be honest with you.
02:56:48.740 | - Yeah.
02:56:50.580 | - But there's some sadness, yes, probably,
02:56:53.260 | because I think it comes down to the selfishness
02:56:58.140 | that we were talking about earlier on.
02:56:59.380 | I am an individual with a desire
02:57:03.660 | not to be kind of displaced from life.
02:57:06.100 | I want to stay alive.
02:57:07.580 | I want to be here.
02:57:09.900 | So, I suppose the threat that a lot of people would feel
02:57:13.660 | is that we will just be wiped out,
02:57:15.260 | so that there will be potential conflicts
02:57:18.620 | between AI and humans,
02:57:20.180 | and that AI will win because it's a lot smarter.
02:57:23.680 | - Boy, would that be a sad state of affairs
02:57:27.900 | if consciousness is just an intermediate stage
02:57:32.460 | between bacteria and AI, right?
02:57:36.740 | And so- - Well, I would see bacteria
02:57:37.980 | as being potentially a kind of primitive form
02:57:40.060 | of consciousness anyway.
02:57:40.900 | - Right, so maybe- - The whole of life on Earth,
02:57:42.700 | to my mind- - Is conscious.
02:57:43.940 | - Is capable of some form of feelings
02:57:46.620 | in response to the environment.
02:57:47.860 | That's not to say it's intelligent,
02:57:49.020 | though it's got its own algorithms for intelligence,
02:57:52.580 | but nothing comparable with us.
02:57:54.300 | I think it's beautiful what a planet,
02:57:56.740 | what a sterile planet can come up with.
02:57:59.180 | It's astonishing that it's come up with
02:58:00.740 | all of this stuff that we see around us,
02:58:02.700 | and that either we or whatever we produce
02:58:07.140 | is capable of destroying all of that.
02:58:09.000 | It is a sad thought.
02:58:12.300 | But it's also, it's hugely pessimistic.
02:58:17.300 | I'd like to think that we're capable of giving rise
02:58:19.740 | to something which is at least as good,
02:58:21.620 | if not better than us, as AI.
02:58:24.140 | - Yeah, I have that same optimism,
02:58:28.800 | especially a thing that is able to propagate
02:58:31.060 | throughout the universe more efficiently than humans can,
02:58:33.780 | or extensions of humans.
02:58:36.180 | Some merger with AI and humans,
02:58:39.140 | whether that comes from bioengineering of the human body
02:58:43.900 | to extend its life somehow,
02:58:47.140 | to carry that flame of consciousness and that personality
02:58:50.220 | and the beautiful tension that's within all of us,
02:58:54.540 | carry that through to multiple planets,
02:58:56.620 | to multiple solar systems all out there in the universe.
02:58:59.620 | I mean, that's a beautiful vision.
02:59:02.100 | Whether AI can do that or bioengineered humans can,
02:59:07.700 | that's an exciting possibility,
02:59:09.140 | and especially meeting other alien civilizations
02:59:13.300 | in that same kind of way.
02:59:14.660 | Do you think aliens have consciousness?
02:59:16.860 | - If they're organic.
02:59:18.340 | - So organic is connected to consciousness.
02:59:20.500 | - I mean, I think any system which is gonna bootstrap itself
02:59:24.340 | up from planetary origins,
02:59:28.060 | I mean, let me finish this and then I'll come on to it
02:59:31.140 | with something else, but from planetary origins
02:59:33.660 | is going to face similar constraints,
02:59:35.980 | and those constraints are going to be addressed
02:59:37.900 | in similar basic engineering ways.
02:59:40.140 | And I think it will be cellular,
02:59:41.740 | and I think it will have electrical charges,
02:59:43.860 | and I think it will have to be selected
02:59:46.340 | in populations over time,
02:59:47.620 | and all of these things will tend to give rise
02:59:49.340 | to the same processes as the simplest fix
02:59:51.900 | to a difficult problem.
02:59:53.260 | So I would expect it to be conscious, yes,
02:59:54.940 | and I would expect it to resemble life on Earth
02:59:58.140 | in many ways.
03:00:00.580 | When I was about, I guess, 15 or 16,
03:00:03.100 | I remember reading a book by Fred Hoyle
03:00:06.140 | called "The Black Cloud,"
03:00:08.260 | which I was a budding biologist at the time,
03:00:11.420 | and this was the first time I'd come across someone
03:00:13.340 | really challenging the heart of biology
03:00:15.220 | and saying, "You are far too parochial.
03:00:17.980 | "You're thinking about life as carbon-based.
03:00:20.820 | "Here's a life form which is kind of dust,
03:00:23.580 | "interstellar dust on a solar system scale."
03:00:28.260 | And I, you know, it's a novel,
03:00:32.460 | but I felt enormously challenged by that novel
03:00:34.340 | because it hadn't occurred to me
03:00:36.380 | how limited my thinking was,
03:00:39.940 | how narrow-minded I was being,
03:00:43.340 | and here was a great physicist
03:00:45.140 | with a completely different conception
03:00:46.540 | of what life could be.
03:00:48.060 | And since then, I've seen him attacked in various ways,
03:00:51.460 | and I'm kind of reluctant to say
03:00:54.540 | the attacks make more sense to me
03:00:56.260 | than the original story,
03:00:58.580 | which is to say, even in terms of information processing,
03:01:03.020 | if you're on that scale and there's a limit
03:01:04.500 | to the speed of light, how quickly can something think
03:01:07.300 | if you're needing to broadcast across the solar system?
03:01:12.300 | It's going to be slow.
03:01:16.780 | It's not gonna hold a conversation with you
03:01:18.980 | on the kind of timelines that Fred Hoyle was imagining,
03:01:22.420 | or at least not by any easy way of doing it,
03:01:25.380 | assuming that the speed of light is a limit.
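The latency problem raised here is easy to put numbers on. A back-of-envelope sketch, taking roughly Neptune's orbit (about 60 AU across) as an illustrative stand-in for "solar system scale":

```python
# Back-of-envelope: light-speed delay across a solar-system-scale "mind".
# The 60 AU diameter (roughly Neptune's orbit) is an illustrative choice.
AU_IN_METERS = 1.495978707e11  # one astronomical unit, in meters
SPEED_OF_LIGHT = 299_792_458   # meters per second

diameter_m = 60 * AU_IN_METERS
one_way_hours = diameter_m / SPEED_OF_LIGHT / 3600
print(f"One-way signal time across 60 AU: {one_way_hours:.1f} hours")
```

At roughly eight hours for a single one-way signal, a "thought" spanning the whole cloud is ponderous on conversational timescales, which is the objection being made to Hoyle's black cloud.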
03:01:27.580 | And then, again, you really can't,
03:01:32.540 | this is something Richard Dawkins argued long ago,
03:01:34.820 | and I do think he's right.
03:01:36.300 | There is no other way to generate this level of complexity
03:01:40.300 | than natural selection.
03:01:41.340 | Nothing else can do it.
03:01:42.620 | You need populations,
03:01:44.780 | and you need selection in populations,
03:01:46.380 | and kind of an isolated, interstellar cloud.
03:01:51.380 | Again, there's unlimited time,
03:01:55.340 | and maybe there's no problems with distance,
03:01:57.220 | but you need to have a certain frequency of generation
03:02:02.860 | or time to generate a serious level of complexity.
03:02:06.340 | And I just have a feeling it's never gonna work.
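The requirement described here, populations, variation, and repeated rounds of selection, can be caricatured in a few lines of code. This is a toy illustration with made-up parameters (population size, mutation rate, genome length), not a claim about any real system:

```python
import random

# Toy natural selection: a population of bit-strings, random mutation,
# and survival of the fittest each generation. All parameters (population
# size, mutation rate, genome length) are arbitrary illustrations.
def evolve(genome_len=20, pop_size=50, mut_rate=0.05, generations=200, seed=0):
    rng = random.Random(seed)
    fitness = lambda g: sum(g)  # fitness = number of 1-bits in the genome
    pop = [[0] * genome_len for _ in range(pop_size)]
    for _ in range(generations):
        # every individual produces one mutated offspring
        offspring = [[b ^ (rng.random() < mut_rate) for b in g] for g in pop]
        # keep the fittest half of parents plus offspring (elitist selection)
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return max(fitness(g) for g in pop)

print(evolve())  # fitness climbs toward the maximum of 20 over generations
```

With no population (or no generations) there is no climb at all; the complexity comes from selection acting repeatedly on heritable variation, which is the frequency-of-generation point being made above.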
03:02:11.860 | - Well, as far as we know, so natural selection,
03:02:14.980 | evolution is a really powerful tool here on Earth,
03:02:17.460 | but there could be other mechanisms.
03:02:18.940 | So whenever, I don't know if you're familiar
03:02:21.580 | with cellular automata, but complex systems
03:02:26.180 | that have really simple components
03:02:29.060 | and seemingly move based on simple rules
03:02:31.500 | when they're taken as a whole,
03:02:33.180 | really interesting complexity emerges.
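A minimal sketch of the kind of system being described, an elementary cellular automaton. Wolfram's Rule 110 is used here as a standard example of complex behavior emerging from a tiny local rule; the choice of rule is illustrative:

```python
# Elementary cellular automaton: each cell's next state depends only on
# itself and its two immediate neighbors. Rule 110 is a classic example
# of intricate, long-lived structure emerging from this tiny rule table.
def step(cells, rule=110):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=64, generations=20, rule=110):
    cells = [0] * width
    cells[width // 2] = 1  # start from a single live cell
    history = [cells]
    for _ in range(generations):
        cells = step(cells, rule)
        history.append(cells)
    return history

for row in run():
    print("".join("#" if c else "." for c in row))
```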
03:02:35.980 | I don't know what the pressures on that are.
03:02:38.980 | It's not really selection,
03:02:40.180 | but interesting complexity seems to emerge,
03:02:42.820 | and that's not well understood exactly why
03:02:46.420 | that complexity emerges. - I think there's a difference
03:02:47.260 | between complexity and evolution.
03:02:51.060 | So some of the work we're doing on the origin of life
03:02:53.300 | is thinking about how does, well, how do genes arise?
03:02:59.140 | How does information arise in biology?
03:03:01.740 | And thinking about it from the point of view
03:03:03.900 | of reacting CO2 with hydrogen, what do you get?
03:03:06.340 | Well, what you're gonna get is carboxylic acids,
03:03:08.860 | then amino acids.
03:03:09.740 | It's quite hard to make nucleotides.
03:03:11.940 | And it's possible to make them, and it's been done,
03:03:15.860 | and it's been done following this pathway as well.
03:03:18.340 | But you make trace amounts.
03:03:20.140 | And so the next question,
03:03:21.340 | assuming that this is the right way of seeing the question,
03:03:23.940 | which maybe it's just not, but let's assume it is,
03:03:26.580 | is, well, how do you reliably make more nucleotides?
03:03:29.500 | And how do you become more complex and better
03:03:31.740 | at becoming a nucleotide-generating machine?
03:03:35.100 | And the answer is, well, you need positive feedback loops,
03:03:38.420 | some form of autocatalysis.
03:03:40.260 | So that can work, and we know it happens in biology.
03:03:43.060 | If this nucleotide, for example, catalyzes CO2 fixation,
03:03:48.060 | then you're going to increase the rate of flux
03:03:50.260 | through the whole system,
03:03:51.140 | and you're going to effectively steepen the driving force
03:03:53.540 | to make more nucleotides.
03:03:55.380 | And this can be inherited
03:03:58.420 | because there are forms of membrane heredity
03:04:01.860 | that you can have, and there are,
03:04:04.100 | effectively, you can, if a cell divides in two
03:04:06.380 | and it's got a lot of stuff inside it,
03:04:08.260 | and that stuff is basically bound as a network
03:04:11.540 | which is capable of regenerating itself,
03:04:14.300 | then it will inevitably regenerate itself.
03:04:17.220 | And so you can develop greater complexity.
03:04:19.900 | But everything that I've said depends
03:04:23.220 | on the underlying rules of thermodynamics.
03:04:25.580 | There is no evolvability about that.
03:04:27.780 | It's simply an inevitable outcome of your starting point,
03:04:32.780 | assuming that you're able to increase the driving force
03:04:36.260 | through the system.
03:04:37.140 | You will generate more of the same.
03:04:38.740 | You'll expand on what you can do,
03:04:40.140 | but you'll never get anything different than that.
03:04:42.780 | And it's only when you introduce information into that
03:04:45.860 | as a gene, as a kind of small stretch of RNA,
03:04:51.900 | which can be random stretch,
03:04:53.700 | then you get real evolvability,
03:04:55.300 | then you get biology as we know it,
03:04:57.500 | but you also have selection as we know it.
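The feedback loop described above, a product catalyzing the flux that makes more of itself, behaves very differently from uncatalyzed synthesis, which a toy rate equation makes vivid. All rate constants here are invented purely for illustration:

```python
# Toy kinetics of autocatalysis: a product is made at a slow background
# rate, and each copy also catalyzes its own production (positive
# feedback). All rate constants are invented purely for illustration.
def simulate(k_background, k_auto, steps=1000, dt=0.01):
    n = 0.0  # amount of product (e.g. a nucleotide)
    for _ in range(steps):
        rate = k_background + k_auto * n  # background + autocatalytic flux
        n += rate * dt                    # simple Euler integration
    return n

plain = simulate(k_background=1.0, k_auto=0.0)  # no feedback: linear growth
auto = simulate(k_background=1.0, k_auto=0.5)   # feedback: exponential growth
print(f"without feedback: {plain:.1f}   with feedback: {auto:.1f}")
```

Without feedback the product accumulates linearly; with even weak autocatalysis it grows exponentially, the "steepening driving force" described in the passage above.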
03:05:00.460 | - Yeah, I mean, I don't know how to think about information.
03:05:03.860 | That's a kind of memory of the system.
03:05:08.060 | So it's not, yeah, at the local level,
03:05:10.620 | it's propagation of copying yourself and changing
03:05:13.380 | and improving your adaptability to the environment.
03:05:16.540 | But if you look at Earth as a whole,
03:05:21.340 | it has a kind of memory.
03:05:23.100 | That's the key feature of it.
03:05:25.100 | - In what way?
03:05:25.940 | - It remembers the stuff it tries.
03:05:30.540 | Like if you were to describe Earth,
03:05:32.580 | I think evolution is something that we experience
03:05:37.500 | as individual organisms.
03:05:41.660 | That's how the individual organisms
03:05:43.980 | interact with each other.
03:05:45.260 | There's a natural selection.
03:05:47.740 | But when you look at Earth as an organism in its entirety,
03:05:51.460 | how would you describe it?
03:05:55.780 | I mean- - Well, not as an organism.
03:05:58.020 | I mean, the idea of Gaia is lovely.
03:06:00.940 | And James Lovelock originally put Gaia out
03:06:03.980 | as an organism that had somehow evolved.
03:06:07.300 | And he was immediately attacked by lots of people.
03:06:10.340 | And he's not wrong, but he backpedaled somewhat
03:06:14.180 | because that was more of a poetic vision
03:06:16.500 | than the science.
03:06:20.220 | The science is now called Earth systems science.
03:06:23.100 | And it's really about how does the world
03:06:25.340 | kind of regulate itself so it remains within the limits
03:06:28.020 | which are hospitable to life.
03:06:29.220 | And it does it amazingly well.
03:06:30.460 | And it is working at a planetary level
03:06:33.060 | of kind of integration of regulation.
03:06:38.860 | But it's not evolving by natural selection.
03:06:41.220 | And it can't because there's only one of it.
03:06:43.940 | And so it can change over time,
03:06:45.580 | but it's not evolving.
03:06:46.900 | All the evolution is happening in the parts of the system.
03:06:50.580 | - Yeah, but it's a self-sustaining organism.
03:06:52.980 | - No, it's sustained by the sun.
03:06:54.900 | - Right, so you don't think it's possible
03:06:59.380 | to see Earth as its own organism?
03:07:03.300 | - I think it's poetic and beautiful.
03:07:04.820 | And I often refer to the Earth as a living planet.
03:07:08.500 | But it's not, in biological terms, an organism, no.
03:07:14.780 | - If aliens were to visit Earth, what would they notice?
03:07:19.780 | What would be the basic unit of life they would notice?
03:07:24.060 | - Trees, probably.
03:07:25.020 | I mean, it's green, and it's green and blue.
03:07:27.260 | I think that's the first thing you'd notice
03:07:28.660 | is it stands out from space as being different
03:07:31.500 | to any of the other planets.
03:07:33.020 | - So it'd notice the trees at first 'cause the green.
03:07:36.220 | - Well, I would, I'd notice the green, yes.
03:07:39.020 | - And then probably figure out the photosynthesis.
03:07:42.780 | - Probably notice cities a second, I suspect.
03:07:45.660 | Maybe first.
03:07:47.060 | - So let me actually--
03:07:47.900 | - If they arrived at night, they'd notice cities first,
03:07:49.340 | that's for sure.
03:07:50.180 | - It depends, depends on the time.
03:07:52.100 | You write quite beautifully in "Transformer" once again.
03:07:56.340 | I think you opened the book in this way, I don't remember.
03:07:59.060 | From space, describing Earth.
03:08:01.780 | It's such an interesting idea of what Earth is.
03:08:06.460 | You also, I mean, "Hitchhiker's Guide,"
03:08:10.500 | summarizing it as harmless, or mostly harmless,
03:08:13.300 | which is a beautifully poetic thing.
03:08:15.580 | You open "Transformer" with,
03:08:17.780 | from space, it looks gray and crystalline,
03:08:21.420 | obliterating the blue-green colors of the living Earth.
03:08:24.900 | It is crisscrossed by irregular patterns
03:08:27.220 | and convergent striations.
03:08:29.180 | There's a central amorphous density
03:08:33.220 | where these scratches seem lighter.
03:08:35.460 | This, quote, "growth does not look alive,
03:08:38.340 | "although it has extended out along some lines,
03:08:41.020 | "and there is something grasping and parasitic about it.
03:08:44.940 | "Across the globe, there are thousands of them,
03:08:47.420 | "varying in shape and detail,
03:08:49.540 | "but all of them gray, angular, inorganic, spreading.
03:08:54.540 | "Yet, at night, they light up.
03:08:57.620 | "Glowing against the dark sky, suddenly beautiful.
03:09:00.740 | "Perhaps these cankers on the landscape
03:09:03.300 | "are in some sense living.
03:09:05.020 | "There's a controlled flow of energy.
03:09:07.220 | "There must be information and some form of metabolism,
03:09:10.420 | "some turnover of materials.
03:09:12.660 | "Are they alive?
03:09:14.000 | "No, of course not.
03:09:15.940 | "They are cities."
03:09:18.260 | So is there some sense that cities are living beings?
03:09:22.140 | You think aliens would think of them as living beings?
03:09:25.060 | - Well, it'd be easy to see it that way, wouldn't it?
03:09:27.660 | - It wakes up at night.
03:09:31.460 | They wake up at night.
03:09:32.300 | (laughing)
03:09:33.260 | - Strictly nocturnal.
03:09:36.220 | I imagine that any aliens that are smart enough to get here
03:09:39.020 | would understand that they're not living beings.
03:09:43.020 | My reason for saying that is that
03:09:47.580 | we tend to think of biology in terms of information
03:09:52.580 | and forget about cells.
03:09:54.780 | I was trying to draw a comparison between the cell as a city
03:09:57.820 | and the energy flow through the city
03:10:00.300 | and the energy flow through cells
03:10:02.020 | and the turnover of materials.
03:10:03.940 | And an interesting thing about cities
03:10:06.460 | is that they're not really exactly governed by anybody.
03:10:10.580 | There are regulations and systems and whatever else,
03:10:15.340 | but it's pretty loose.
03:10:17.160 | They have their own life,
03:10:20.840 | their own way of developing over time.
03:10:22.940 | And in that sense, they're quite biological.
03:10:26.740 | There was a plan after the Great Fire of London.
03:10:31.020 | Christopher Wren was making plans
03:10:33.540 | not only for St. Paul's Cathedral,
03:10:35.700 | but also to rebuild in large Parisian type boulevards,
03:10:39.660 | a large part of the area of central London that was burned.
03:10:44.660 | And it never happened
03:10:46.420 | because they didn't have enough money, I think.
03:10:48.380 | But it's interesting what was in the plan.
03:10:49.860 | There were all these boulevards,
03:10:51.820 | but there were no pubs and no coffee houses
03:10:55.420 | or anything like that.
03:10:56.820 | And the reality was London just kind of grew up
03:11:00.340 | in a set of jumbled streets.
03:11:03.140 | And it was the coffee houses and the pubs
03:11:04.580 | where all the business of the city of London was being done.
03:11:06.980 | And that was where the real life of the city was.
03:11:09.100 | And no one had planned it.
03:11:09.980 | The whole thing was unplanned
03:11:11.900 | and works much better that way.
03:11:13.660 | And in that sense, a cell is completely unplanned,
03:11:15.820 | it's not controlled by the genes in the nucleus
03:11:17.900 | in the way that we might like to think that it is.
03:11:19.580 | But it's a kind of evolved entity
03:11:22.220 | that has the same kind of flux,
03:11:24.100 | the same animation, the same life.
03:11:25.740 | So I think it's a beautiful analogy,
03:11:28.460 | but I wouldn't get too stuck with it as a metaphor.
03:11:32.340 | - See, I disagree with you.
03:11:33.420 | I disagree with you.
03:11:34.900 | I think you are so steeped,
03:11:39.100 | actually the entirety of science,
03:11:42.500 | the history of science is steeped
03:11:45.460 | in a biological framework of thinking about what is life.
03:11:50.100 | And not just biological, it's very human-centric too.
03:11:54.100 | That the human organism is the epitome of life on earth.
03:11:59.100 | I don't know.
03:12:01.660 | I think there is some deep fundamental way
03:12:03.980 | in which a city is a living being
03:12:07.140 | in the same way that a human-
03:12:08.980 | - But it doesn't give rise to an offspring city.
03:12:13.060 | So, I mean, it doesn't work by natural selection.
03:12:15.860 | It works by, if anything, memes, it works by.
03:12:18.660 | - Yeah, but isn't that-
03:12:20.380 | - It's kind of copying itself conceptually
03:12:22.020 | as a mode of being.
03:12:24.580 | - So, I mean, maybe memes, maybe ideas
03:12:27.700 | are the organisms that are really essential
03:12:30.780 | to life on earth.
03:12:32.340 | Maybe it's much more important
03:12:34.140 | about the collective aspect of human nature,
03:12:37.060 | the collective intelligence
03:12:38.100 | than the individual intelligence.
03:12:40.060 | Maybe the collective humanity is the organism.
03:12:43.780 | And the thing that defines the collective intelligence
03:12:48.140 | of humanity is the ideas.
03:12:50.740 | And maybe the way that manifests itself is cities.
03:12:54.300 | Maybe, or societies, or geographically constricted societies
03:12:57.900 | or nations and all that kind of stuff.
03:12:59.900 | I mean, from an alien perspective,
03:13:02.100 | it's possible that that is the more deeply noticeable thing,
03:13:06.460 | not from a place of ignorance.
03:13:08.900 | - What's noticeable doesn't tell you how it works.
03:13:11.400 | I think, I mean, I don't have any problem
03:13:14.100 | with what you're saying really,
03:13:15.100 | except that it's not possible without the humans,
03:13:20.660 | you know, we went from a hunter-gatherer type economy,
03:13:25.260 | if you like, without cities,
03:13:26.940 | through to cities.
03:13:28.220 | And as soon as we get into human evolution
03:13:30.260 | and culture and society and so on,
03:13:32.380 | then yes, there are other forms of evolution,
03:13:36.420 | other forms of change.
03:13:38.620 | But cities don't directly propagate themselves,
03:13:41.700 | they propagate themselves through human societies.
03:13:43.940 | And human societies only exist
03:13:45.340 | because humans as individuals propagate themselves.
03:13:48.340 | So there is a hierarchy there,
03:13:51.420 | and without the humans in the first place,
03:13:52.860 | none of the rest of it exists.
03:13:54.420 | - So to you, life is primarily defined
03:13:57.180 | by the basic unit on which evolution can operate.
03:14:01.660 | - I think it's a really important thing, yes.
03:14:04.180 | - Yeah.
03:14:05.020 | And we don't have any other better ideas than evolution
03:14:09.940 | for how to create life.
03:14:10.780 | - I never came across a better idea than evolution.
03:14:13.140 | I mean, maybe I'm just ignorant and I don't know,
03:14:15.740 | and you mentioned automata and so on,
03:14:19.700 | and I don't think specifically about that,
03:14:22.540 | but I have thought about it in terms of selective units
03:14:24.900 | at the origin of life and the difference
03:14:26.980 | between evolvability and complexity
03:14:29.500 | or just increasing complexity,
03:14:31.460 | but within very narrowly defined limits.
03:14:35.620 | The great thing about genes and about selection
03:14:39.500 | is it just knocks down all those limits.
03:14:41.460 | It gives you a world of information in the end,
03:14:43.580 | which is limited only by the biophysical reality
03:14:47.820 | of what kind of an organism you are,
03:14:49.940 | what kind of a planet you live on and so on.
03:14:52.340 | And cities and all these other forms that look alive
03:14:55.780 | and could be described as alive,
03:14:58.100 | because they can't propagate themselves,
03:14:59.860 | can only exist as the product of something
03:15:02.740 | that did propagate itself.
03:15:04.220 | - Yeah.
03:15:05.940 | I mean, there's a deeply compelling truth
03:15:09.300 | to that kind of way of looking at things,
03:15:11.700 | but I just hope that we don't miss the giant cloud.
03:15:16.500 | (Lex laughs)
03:15:17.820 | Among us.
03:15:18.780 | - I kind of hope that I'm wrong about a lot of this
03:15:21.380 | because I can't say that my worldview
03:15:24.100 | is particularly uplifting,
03:15:25.460 | but in some sense, it doesn't matter
03:15:28.900 | if it's uplifting or not.
03:15:29.900 | Science is about what's reality, what's out there,
03:15:33.380 | why is it this way?
03:15:35.220 | And I think there's beauty in that too.
03:15:38.220 | - There's beauty in darkness.
03:15:41.260 | You write about life and death
03:15:43.020 | sort of at the biological level.
03:15:45.580 | Does the question of suicide, why live,
03:15:49.780 | does the question of why the human mind
03:15:52.260 | is capable of depression,
03:15:54.020 | are you able to introspect that from a place of biology?
03:15:59.020 | Why our minds, why we humans can go to such dark places?
03:16:05.300 | Why can we commit suicide?
03:16:07.980 | Why can we go suffer,
03:16:12.500 | suffer period, but also suffer from a feeling
03:16:18.260 | of meaninglessness of going to a dark place
03:16:21.380 | that depression can take you?
03:16:23.060 | Is this a feature of life or is it a bug?
03:16:25.540 | - I don't know.
03:16:31.220 | I mean, if it's a feature of life,
03:16:32.420 | then I suppose it would have to be true
03:16:33.900 | of other organisms as well,
03:16:35.500 | and I don't know, we were talking about dogs earlier on
03:16:39.180 | and they can certainly be very sad and upset
03:16:43.500 | and may mooch for days after their owner died
03:16:46.380 | or something like that.
03:16:47.580 | So I suspect in some sense it's a feature of biology.
03:16:50.220 | It's probably a feature of mortality.
03:16:54.380 | It's probably a...
03:16:55.460 | But beyond all of that,
03:16:59.860 | I mean, I guess there's two ways you could come at it.
03:17:01.980 | One of them would be to say,
03:17:03.300 | well, you can effectively do the math
03:17:07.540 | and come to the conclusion that it's all pointless
03:17:10.260 | and that there's really no point in me being here any longer.
03:17:13.340 | And maybe that's true.
03:17:15.820 | In the greater scheme of things,
03:17:17.700 | you can justify yourself in terms of society,
03:17:20.140 | but society will be gone soon enough as well,
03:17:22.100 | and you end up with a very bleak place just by logic.
03:17:25.220 | - In some sense, it's surprising
03:17:27.540 | that we can find any meaning at all.
03:17:29.380 | - Well, maybe this is where consciousness comes in,
03:17:32.140 | that we have transient joy,
03:17:34.060 | but with transient joy, we have transient misery as well.
03:17:37.020 | And sometimes, with everything in biology,
03:17:40.220 | getting the regulation right is practically impossible.
03:17:45.260 | You will always have a bell-shaped curve
03:17:47.300 | where some people, unfortunately, are at the joy end
03:17:50.340 | and some people are at the misery end.
03:17:52.100 | And that's the way brains are wired.
03:17:55.060 | And I doubt there's ever an escape from that.
03:17:58.700 | It's the same with sex and everything else as well.
03:18:00.700 | We're dealing with it, you can't regulate it.
03:18:04.220 | So anything goes, it's all part of biology.
03:18:08.580 | - Amen to that.
03:18:13.860 | Let me, on writing,
03:18:16.180 | in your book, "Power, Sex, and Suicide,"
03:18:21.140 | first of all, can I just read off the books you've written?
03:18:24.060 | If there's any better titles and topics to be covered,
03:18:27.740 | I don't know what they are.
03:18:28.900 | Makes me look forward to whatever
03:18:30.180 | you're going to write next.
03:18:31.500 | I hope there's things you write next.
03:18:34.120 | So first, you wrote "Oxygen, the Molecule
03:18:36.300 | "That Made the World," as we've talked about,
03:18:38.060 | this idea of the role of oxygen in life on Earth.
03:18:41.620 | Then, wait for it, "Power, Sex, Suicide,
03:18:45.460 | "Mitochondria and the Meaning of Life,"
03:18:48.220 | then "Life Ascending,
03:18:49.540 | "The 10 Great Inventions of Evolution,"
03:18:52.340 | "The Vital Question," the first book I've read of yours,
03:18:54.780 | "The Vital Question, Why Is Life The Way It Is,"
03:18:58.020 | and the new book, "Transformer,
03:19:00.180 | "The Deep Chemistry of Life and Death."
03:19:02.540 | In "Power, Sex, and Suicide,"
03:19:06.140 | you write about writing, or about a lot of things,
03:19:09.940 | but I have a question about writing.
03:19:12.020 | You write, "In 'The Hitchhiker's Guide to the Galaxy,'
03:19:16.580 | "Ford Prefect spends 15 years researching his revision
03:19:20.660 | "to the guide's entry on the Earth,
03:19:23.120 | "which originally read 'harmless.'"
03:19:26.740 | By the way, I would also, as a side quest,
03:19:29.100 | as a side question, would like to ask you
03:19:30.780 | what would be your summary of what Earth is.
03:19:33.580 | You write, "His long essay on the subject
03:19:36.660 | "is edited down by the guide to read 'mostly harmless.'
03:19:41.500 | "I suspect that too many new editions suffer a similar fate,
03:19:46.000 | "if not through absurd editing decisions,
03:19:48.340 | "at least through a lack of meaningful change in content.
03:19:51.520 | "As it happens, nearly 15 years have passed
03:19:54.280 | "since the first edition of 'Power, Sex, and Suicide'
03:19:56.860 | "was published, and I am resisting the temptation
03:20:00.320 | "to make any lame revisions.
03:20:02.340 | "Some say that even Darwin lessened the power
03:20:05.540 | "of his arguments in 'The Origin of Species'
03:20:08.440 | "through his multiple revisions,
03:20:10.240 | "in which he dealt with criticisms
03:20:12.400 | "and sometimes shifted his views in the wrong direction.
03:20:15.920 | "I prefer my original to speak for itself,
03:20:19.240 | "even if it turns out to be wrong."
03:20:23.160 | Let me ask the question about writing,
03:20:25.800 | both your students in the academic setting,
03:20:28.160 | but also writing some of the most brilliant writings
03:20:30.920 | on science and humanity I've ever read.
03:20:33.400 | What's the process of writing?
03:20:36.280 | How do you advise other humans,
03:20:41.280 | if you were to talk to young Darwin,
03:20:45.120 | or the young you, and just young anybody,
03:20:49.520 | and give advice about how to write,
03:20:51.400 | and how to write well about these big topics,
03:20:54.400 | what would you say?
03:20:55.360 | - I mean, I suppose there's a couple of points.
03:20:59.280 | One of them is, what's the story?
03:21:02.440 | What do I wanna know?
03:21:04.760 | What do I wanna convey?
03:21:06.560 | Why does it matter to anybody?
03:21:08.760 | And very often, the biggest, most interesting questions,
03:21:13.480 | the childlike questions,
03:21:18.120 | are the ones that everybody actually wants to ask,
03:21:20.680 | but don't quite do it in case they look stupid.
03:21:23.520 | And one of the nice things about being in science
03:21:27.000 | is the longer you're in, the more you realize
03:21:29.320 | that everybody doesn't know the answer to these questions,
03:21:31.360 | and it's not so stupid to ask them after all.
03:21:33.600 | So trying to ask the questions
03:21:39.720 | that I would have been asking myself at the age of 15, 16,
03:21:44.560 | when I was really hungry to know about the world,
03:21:47.280 | and didn't know very much about it,
03:21:49.040 | and wanted to go to the edge of what we know,
03:21:54.040 | but be helped to get there.
03:21:58.240 | I don't wanna be, you know, too much terminology,
03:22:01.440 | and so I want someone to keep a clean eye
03:22:03.920 | on what the question is.
03:22:05.120 | Beyond that, I've wondered a lot about who am I writing for?
03:22:12.000 | And that was, in the end, the only answer I had
03:22:16.280 | was myself at the age of 15 or 16.
03:22:19.160 | Because even if you're, you know, you can,
03:22:23.560 | you just don't know who's reading,
03:22:25.200 | but also where are they reading it?
03:22:27.280 | Are they reading it in the bath, or in bed,
03:22:29.680 | or on the metro, or are they listening to an audio book?
03:22:33.960 | Do you wanna have a recapitulation every few pages,
03:22:38.960 | 'cause you read three pages at a time,
03:22:41.080 | or are you really irritated by that?
03:22:44.160 | You're going to get criticism from people
03:22:46.880 | who are irritated by what you're doing,
03:22:48.800 | and you don't know who they are,
03:22:49.920 | or what you're gonna do that's gonna irritate people,
03:22:51.760 | and in the end, all you can do
03:22:52.960 | is just try and please yourself.
03:22:56.440 | And that means, well, what are these big, fun,
03:23:00.520 | fascinating, big questions, and what do we know about it?
03:23:04.960 | And can I convey that?
03:23:07.160 | And I kind of learned in trying to write,
03:23:10.520 | first of all, say what we know.
03:23:14.320 | And I was shocked in the first couple of books
03:23:16.600 | how often I came up quickly
03:23:18.640 | against all the stuff we don't know.
03:23:21.600 | And if you're trying to, I realized later on
03:23:25.040 | in supervising various physicists and mathematicians
03:23:29.720 | who are PhD students, and I, you know,
03:23:31.680 | their maths is way beyond what I can do,
03:23:34.480 | but the process of trying to work out
03:23:36.680 | what are we actually gonna model here?
03:23:37.960 | What's going into this equation?
03:23:39.240 | It's a very similar one to writing.
03:23:40.480 | What am I gonna put on a page?
03:23:42.080 | What's the simplest possible way
03:23:43.640 | I can encapsulate this idea
03:23:45.600 | so that I now have it as a unit
03:23:47.120 | that I can kind of see how it interacts
03:23:48.880 | with the other units?
03:23:50.520 | And you realize that, well, if this is like that,
03:23:53.000 | and this is like this, then that can't be true.
03:23:55.520 | So you end up navigating your own path
03:24:00.480 | through this landscape, and that can be thrilling
03:24:02.880 | 'cause you don't know where it's going.
03:24:05.120 | And I'd like to think that that's one of the reasons
03:24:07.440 | my books have worked for people,
03:24:09.080 | because this sense of thrilling adventure ride,
03:24:12.000 | I don't know where it's going either.
03:24:14.120 | - So finding the simplest possible way
03:24:16.520 | to explain the things we know
03:24:18.360 | and the simplest possible way to explain the things
03:24:20.320 | we don't know and the tension between those two,
03:24:22.600 | and that's where the story emerges.
03:24:25.040 | What about the edit?
03:24:27.160 | Do you find yourself to the point of this,
03:24:31.160 | you know, editing dialed to mostly harmless?
03:24:36.360 | To arrive at simplicity, do you find the edit is productive
03:24:40.680 | or does it destroy the magic that was originally there?
03:24:44.080 | - No, I usually find, I think I'm perhaps a better editor
03:24:47.880 | than I am a writer.
03:24:48.840 | I write and rewrite and rewrite and rewrite.
03:24:51.200 | - So you put a bunch of crap on the page first
03:24:52.880 | and then see where the edit where it takes you.
03:24:55.080 | - Yeah, but then there's the professional editors
03:24:58.160 | who come along as well.
03:24:59.160 | And I mean, in "Transformer," the editor came back to me
03:25:05.040 | after I'd sent him two months
03:25:06.680 | after I sent the first edition,
03:25:07.800 | he'd read the whole thing and he said,
03:25:09.720 | "The first two chapters present a formidable hurdle
03:25:12.720 | to the general reader.
03:25:14.200 | Go and do something about it."
03:25:16.360 | And it was the last thing I really wanted to do.
03:25:18.640 | - Your editor sounds very eloquent in speech.
03:25:21.440 | - Yeah, well, this was an email,
03:25:23.000 | but I thought about it and, you know,
03:25:26.560 | the bottom line is he was right.
03:25:28.800 | And so I put the whole thing aside for about two months,
03:25:33.240 | spent the summer, this would have been,
03:25:35.080 | I guess, last summer,
03:25:36.600 | and then turned to it with full attention
03:25:39.360 | in about September or something
03:25:40.880 | and rewrote those chapters almost from scratch.
03:25:42.920 | I kept some of the material,
03:25:44.040 | but it took me a long time to process it,
03:25:47.240 | to work out what needs to change,
03:25:48.960 | where does it need to change?
03:25:49.800 | I wasn't writing in this time.
03:25:51.200 | How am I going to tell this story better
03:25:53.360 | so it's more accessible and interesting?
03:25:54.840 | And in the end, I think it worked.
03:25:56.880 | It's still difficult, it's still biochemistry,
03:25:59.440 | but he ended up saying,
03:26:01.560 | now it's got a barreling energy to it.
03:26:03.360 | And I was, you know, because he'd been,
03:26:05.720 | 'cause he told me the truth the first time,
03:26:07.120 | I decided to believe that he was telling me the truth
03:26:08.880 | the second time as well and was delighted.
03:26:11.720 | - Could you give advice to young people in general,
03:26:17.320 | folks in high school, folks in college,
03:26:20.400 | how to take on some of the big questions you've taken on?
03:26:23.040 | Now, you've done that in the space of biology
03:26:24.760 | and expanded out.
03:26:26.680 | How can they have a career they can be proud of
03:26:31.680 | or have a life they can be proud of?
03:26:35.160 | - Gosh, that's a big question.
03:26:37.560 | - I'm sure you've gathered some wisdom
03:26:42.720 | that you can impart to young populace.
03:26:45.760 | - So the only advice that I actually ever give
03:26:48.960 | to my students is follow what you're interested in
03:26:55.400 | because they're often worried
03:26:58.120 | that if they make this decision now
03:26:59.880 | and do this course instead of that course,
03:27:01.960 | then they're gonna restrict their career opportunities.
03:27:04.120 | And there isn't a career path in science.
03:27:08.760 | It's not, I mean, there is, but there isn't.
03:27:11.320 | There's a lot of competition.
03:27:14.080 | There's a lot of death symbolically.
03:27:16.320 | So who survives?
03:27:19.200 | The people who survive are the people who care enough
03:27:24.120 | to still do it.
03:27:25.280 | And they're very often the people
03:27:26.800 | who don't worry too much about the future
03:27:31.240 | and are able to live in the present.
03:27:33.480 | 'Cause if you do a PhD,
03:27:35.200 | you've competed hard to get onto the PhD,
03:27:37.400 | then you have to compete hard to get a postdoc job
03:27:39.480 | and you have the next job maybe on another continent
03:27:44.000 | and it's only two years anyway.
03:27:45.760 | And so, and there's no guarantee
03:27:49.280 | you're gonna get a faculty position at the end of it.
03:27:53.120 | - And there's always the next step to compete
03:27:54.920 | if you get a faculty position,
03:27:56.840 | you get a tenure and with tenure,
03:27:58.880 | you go full professor, full professor,
03:28:00.440 | then you go to some kind of whatever the discipline is,
03:28:03.160 | there's an award.
03:28:04.360 | If you're in physics,
03:28:05.280 | you're always competing for the Nobel Prize.
03:28:06.880 | There's different awards.
03:28:08.440 | And then eventually you're all competing to,
03:28:11.240 | I mean, there's always a competition.
03:28:12.640 | - So there is no happiness.
03:28:14.000 | Happiness does not lie.
03:28:14.960 | - If you're looking into the future, yes.
03:28:16.880 | - And if what you're caring about is a career,
03:28:18.800 | then it's probably not the one for you.
03:28:22.480 | If though you can put that aside,
03:28:25.560 | and I've also worked in industry for a brief period
03:28:28.760 | and I was made redundant twice.
03:28:30.880 | So I know that-
03:28:32.040 | - It's redundant.
03:28:33.880 | - That there's no guarantee
03:28:35.960 | that you've got a career that way either.
03:28:37.880 | - Yes.
03:28:38.720 | - So live in the moment
03:28:42.640 | and try and enjoy what you're doing.
03:28:44.160 | And that means really go to the themes
03:28:49.160 | that you're most interested in
03:28:50.840 | and try and follow them as well as you can.
03:28:52.800 | And that tends to pay back in surprising ways.
03:28:57.320 | I don't know if you've found this as well,
03:28:58.600 | but I found that people will help you often.
03:29:03.600 | If they see some light shining in the eye,
03:29:08.360 | you're excited about their subject
03:29:10.480 | and just want to talk about it.
03:29:15.800 | And they know that their friend in California
03:29:18.600 | has got a job coming up,
03:29:19.520 | they'll say, "Go for this, this guy's all right."
03:29:21.800 | They'll use the network to help you out
03:29:26.040 | if you really care.
03:29:27.040 | And you're not gonna have a job two years down the line,
03:29:29.600 | but if what you really care about is what you're doing now,
03:29:32.920 | then it doesn't matter
03:29:33.760 | if you have a job in two years time or not.
03:29:35.480 | It'll work itself out if you've got the light in your eye.
03:29:38.840 | And so that's the only advice I can give.
03:29:42.360 | And most people probably drop out through that system
03:29:46.880 | because the fight is just not worth it for them.
03:29:49.800 | - Yeah, when you have the light in your eye,
03:29:51.840 | when you have the excitement for the thing,
03:29:53.440 | what happens is you start to surround yourself with others
03:29:56.560 | that are interested in that same thing
03:29:57.880 | that also have the light.
03:29:59.120 | If you really are rigorous about this,
03:30:01.080 | 'cause I think it does take,
03:30:03.600 | it doesn't, it takes effort to make-
03:30:07.320 | - Oh, you've gotta be obsessive.
03:30:08.800 | But if you're doing what you really love doing,
03:30:11.240 | then it's not work anymore, it's what you do.
03:30:13.280 | - Yeah, but I also mean the surrounding yourself
03:30:15.560 | with other people that are obsessed about the same thing.
03:30:17.800 | 'Cause depending on-
03:30:18.640 | - Oh, and that takes some work as well, yes.
03:30:20.960 | And luck.
03:30:21.800 | - Finding the right mentors, the collaborators,
03:30:25.320 | because I think one of the problems with the PhD process
03:30:29.800 | is people are not careful enough in picking their mentors.
03:30:34.800 | Those are people, mentors and colleagues and so on,
03:30:38.360 | those are people who are gonna define
03:30:40.600 | the direction of your life, how much you love a thing.
03:30:45.080 | The power of just the few little conversations you have
03:30:48.600 | in the hallway, it's incredible.
03:30:52.400 | So you have to be a little bit careful in that.
03:30:55.720 | Sometimes you just get randomly almost assigned,
03:30:58.080 | really pursue, I suppose, the subject
03:31:04.160 | as much as you pursue the people that do that subject.
03:31:07.480 | So both, the whole dance of it.
03:31:09.520 | - They kind of go together, really.
03:31:10.600 | - Yeah, they really do.
03:31:12.000 | But take that part seriously.
03:31:14.720 | And probably in the way you're describing it,
03:31:17.520 | careful how you define success.
03:31:21.000 | - You'll never find happiness in success, I don't think.
03:31:24.800 | There's a lovely quote from Robert Louis Stevenson,
03:31:27.560 | I think, who said, "Nothing in life
03:31:29.240 | "is so disenchanting as attainment."
03:31:31.520 | (Lex laughs)
03:31:33.160 | - Yeah, so I mean, in some sense,
03:31:35.480 | the true definition of success is
03:31:38.120 | getting to do today what you really enjoy doing.
03:31:43.600 | Just what fills you with joy.
03:31:46.360 | And that's ultimately success.
03:31:48.120 | So success isn't the thing beyond the horizon,
03:31:51.520 | the big trophy, the financial--
03:31:54.960 | - I think it's as close as we can get to happiness.
03:31:57.760 | That's not to say you're full of joy all the time,
03:31:59.800 | but it's as close as we can get
03:32:01.680 | to a sustained human happiness
03:32:03.600 | is by getting some fulfillment
03:32:05.280 | from what you're doing on a daily basis.
03:32:06.800 | And if what you're looking for is the world
03:32:11.240 | giving you the stamp of approval with a Nobel Prize
03:32:14.000 | or a fellowship or whatever it is,
03:32:15.840 | then I've known people like this who,
03:32:18.720 | they're eaten away by the anger,
03:32:24.480 | the kind of caustic resentment
03:32:27.480 | that they've not been awarded this prize that they deserve.
03:32:31.080 | - And the other way, if you put too much value
03:32:32.960 | into those kinds of prizes and you win them,
03:32:35.480 | I've gotten a chance to see that it also
03:32:41.120 | the more quote unquote successful you are in that sense,
03:32:45.600 | the more you run the danger of growing an ego so big
03:32:50.600 | that you don't get to actually enjoy the beauty of this life.
03:32:56.120 | You start to believe that you figured it all out
03:32:58.640 | as opposed to, I think what the ultimately
03:33:01.400 | the most fun thing is, is being curious
03:33:03.320 | about everything around you, being constantly surprised
03:33:06.400 | and these little moments of discovery,
03:33:08.560 | of enjoying beauty in small and big ways all around you.
03:33:12.120 | And I think the bigger your ego grows,
03:33:14.240 | the more you start to take yourself seriously,
03:33:15.960 | the less you're able to enjoy that.
03:33:17.880 | - Amen to that, I couldn't agree more.
03:33:19.780 | - So, the summary from harmless to mostly harmless
03:33:25.720 | in "Hitchhiker's Guide to the Galaxy",
03:33:27.640 | how would you try to summarize earth?
03:33:31.440 | And if you were given, if you had to summarize
03:33:35.960 | the whole thing in a couple of sentences
03:33:38.560 | and maybe throwing meaning of life in there,
03:33:40.400 | like what, why, why, why?
03:33:44.120 | Maybe, is that a defining thing about humans
03:33:47.560 | that we care about the meaning of the whole thing?
03:33:51.720 | I wonder if that should be part of the,
03:33:55.760 | these creatures seem to be very lost.
03:33:58.680 | - Yes, they're always asking why.
03:34:00.360 | I mean, that's my defining question is why.
03:34:02.920 | There was a, people used to make a joke,
03:34:06.680 | I have a small scar on my forehead
03:34:08.560 | from a climbing accident years ago.
03:34:11.160 | And the guy I was climbing with had dislodged a rock
03:34:13.760 | and he'd shouted something, he'd shouted below, I think,
03:34:16.600 | meaning that the rock was coming down.
03:34:18.440 | And I hadn't caught what he said,
03:34:20.280 | so I looked up and it smashed straight on my forehead.
03:34:23.600 | And everybody around me took the piss saying,
03:34:28.400 | he looked up to ask why.
03:34:30.280 | (laughing)
03:34:32.440 | Yeah, but that's a human imperative.
03:34:34.560 | That's part of what it means to be human.
03:34:37.160 | Look up to the sky and ask why.
03:34:38.760 | (laughing)
03:34:39.600 | And ask why.
03:34:40.520 | - So your question, define the earth.
03:34:44.560 | I'm not sure I can do that.
03:34:50.400 | I mean, the first word that comes to mind is living.
03:34:52.960 | I wouldn't like to say mostly living, but perhaps.
03:34:57.560 | - Mostly living, well, it's interesting
03:34:59.600 | because if you were to write
03:35:02.160 | "The Hitchhiker's Guide to the Galaxy,"
03:35:04.240 | I suppose, say our idea that we talked about,
03:35:10.360 | that bacteria is the most prominent form of life
03:35:13.680 | throughout the galaxy and the universe,
03:35:16.460 | I suppose that earth would be kind of unique
03:35:21.760 | and would require--
03:35:22.600 | - It's all its abundance in that case.
03:35:24.480 | - Yeah.
03:35:25.320 | - It's profligate, it's rich, it's enormously living.
03:35:29.880 | So how would you describe that it's not bacteria?
03:35:33.360 | It's--
03:35:34.200 | - Eukaryotic.
03:35:37.640 | (laughing)
03:35:39.000 | - Yeah, eukaryotic.
03:35:39.840 | - Well, I mean, that's the technical term,
03:35:41.640 | but it is basically, it's--
03:35:43.000 | - Yeah, and then--
03:35:47.200 | - How would I describe that?
03:35:49.240 | I've actually really struggled with that term
03:35:52.120 | because the word, I mean, there's few words
03:35:55.360 | quite as good as eukaryotic
03:35:56.600 | to put everybody off immediately.
03:35:58.560 | You start using words like that
03:35:59.840 | and they'll leave the room.
03:36:01.640 | Krebs cycle is another one
03:36:02.840 | that gets people to leave the room.
03:36:04.920 | But--
03:36:05.760 | So I've tried to think, is there another word
03:36:09.400 | for eukaryotic that I can use?
03:36:10.840 | And really the only word that I've been able to use
03:36:13.200 | is complex, complex cells, complex life and so on.
03:36:18.200 | And that word, it serves one immediate purpose,
03:36:22.840 | which is to convey an impression,
03:36:26.240 | but then it means so many different things
03:36:29.520 | to everybody that actually is lost immediately.
03:36:33.600 | And so it's a kind of--
03:36:35.160 | - Well, that's a noticeable
03:36:37.520 | from the perspective of other planets,
03:36:39.480 | that is a noticeable phase transition of complexity
03:36:42.960 | is the eukaryotic.
03:36:44.440 | What about the harmless and the mostly harmless?
03:36:49.200 | Is that kind of--
03:36:50.160 | - Probably accurate on a universal kind of scale.
03:36:55.440 | I don't think that humanity is in any danger
03:36:59.200 | of disturbing the universe at the moment.
03:37:02.120 | - At the moment, which is why the mostly,
03:37:05.400 | we don't know, depends what Elon is up to.
03:37:08.360 | Depends how many rockets.
03:37:09.960 | I think--
03:37:10.800 | - It'll be still even then a while,
03:37:13.280 | I think before we disturb the fabric of time and space.
03:37:17.640 | - It was the aforementioned Andrej Karpathy,
03:37:20.320 | I think, who summarized Earth
03:37:24.880 | as a system where you hammer it with a bunch of photons.
03:37:29.880 | The input is like photons and the output is rockets.
03:37:33.520 | (laughing)
03:37:35.760 | - Well, that's a hell of a lot of photons
03:37:38.440 | before there was a rocket launch.
03:37:40.240 | - But maybe in the span of the universe,
03:37:43.920 | it's not that much time.
03:37:46.040 | And so, and I do wonder what the future is,
03:37:49.880 | whether we're just in the early beginnings of this Earth,
03:37:52.920 | which is important when you try to summarize it,
03:37:55.400 | or we're at the end,
03:37:57.600 | where humans have finally gained the ability
03:38:00.640 | to destroy the entirety of this beautiful project
03:38:04.240 | we've got going on.
03:38:05.200 | Now with nuclear weapons, with engineered viruses,
03:38:09.120 | with all those kinds of things.
03:38:10.160 | - Or just inadvertently through global warming
03:38:12.400 | and pollution and so on.
03:38:14.000 | We're quite capable of that.
03:38:15.880 | I mean, we just need to pass the tipping point.
03:38:18.440 | I mean, I think we're more likely to do it inadvertently
03:38:20.800 | than through a nuclear war,
03:38:22.920 | which could happen at any time.
03:38:24.480 | But my fear is we just don't know
03:38:29.480 | where the tipping points are.
03:38:31.520 | And we kind of think we're smart enough
03:38:35.360 | to fix the problem quickly if we really need to.
03:38:37.440 | I think that's the overriding assumption
03:38:40.200 | that we're all right for now.
03:38:42.960 | Maybe in 20 years time,
03:38:44.200 | it's gonna be a calamitous problem,
03:38:45.720 | and then we'll really need to put some serious mental power
03:38:47.840 | into fixing it.
03:38:49.720 | Without seriously worrying that perhaps that is too late
03:38:52.760 | and that however brilliant we are, we miss the boat.
03:38:58.600 | - And just walk off the cliff.
03:39:01.680 | I don't know.
03:39:02.520 | I have optimism in humans being clever descendants.
03:39:05.880 | - Oh, I have no doubt that we can fix the problem,
03:39:09.360 | but it's an urgent problem.
03:39:11.280 | And we need to fix it pretty sharpish.
03:39:14.040 | And I do have doubts about whether politically
03:39:16.680 | we are capable of coming together enough
03:39:18.600 | to not just in any one country, but around the planet.
03:39:23.600 | To, I mean, I know we can do it, but do we have the will?
03:39:26.920 | Do we have the vision to accomplish it?
03:39:31.240 | - That's what makes this whole ride fun.
03:39:33.800 | I don't know.
03:39:35.000 | Not only do we not know if we can handle
03:39:37.240 | the crises before us, we don't even know all the crises
03:39:40.720 | that are gonna be before us in the next 20 years.
03:39:43.960 | The ones I think that will most likely challenge us
03:39:48.280 | in the 21st century are the ones we don't even expect.
03:39:51.200 | People didn't expect World War II
03:39:53.160 | at the end of World War I.
03:39:54.560 | - Some folks did, but not at the end of World War I,
03:39:58.680 | but by the late 1920s, I think people were beginning
03:40:01.920 | to worry about it.
03:40:02.960 | - Yeah, no, there's always people worrying about everything.
03:40:05.880 | So if you focus on the thing that--
03:40:07.960 | - People worry about, yes.
03:40:09.560 | - 'Cause there's a million things people worry about
03:40:11.320 | and 99.99999% of them don't come to be.
03:40:14.800 | Of course, the people that turn out to be right,
03:40:16.640 | they'll say, "I knew all along,"
03:40:18.120 | but that's not an accurate way of knowing
03:40:20.960 | what you could have predicted.
03:40:22.400 | I think, rationally speaking, you can worry about it,
03:40:25.560 | but nobody thought you could have another world war,
03:40:28.120 | the war to end all wars.
03:40:29.600 | Why would you have another war?
03:40:31.280 | And the idea of nuclear weapons,
03:40:33.600 | just technologically, is a very difficult thing
03:40:37.380 | to anticipate, to create a weapon that just jumps
03:40:40.600 | orders of magnitude in destructive capability.
03:40:43.840 | And of course, we can intuit all the things
03:40:46.200 | like engineered viruses, nanobots,
03:40:49.240 | artificial intelligence, yes, all the different,
03:40:54.240 | complicated global effects of global warming.
03:40:57.280 | So how that changes the allocation of resources,
03:40:59.440 | the flow of energy, the tension between countries,
03:41:02.700 | the military conflict between countries,
03:41:04.440 | the reallocation of power,
03:41:06.720 | then looking at the role of China in this whole thing
03:41:09.320 | with Russia and growing influence of Africa
03:41:13.560 | and the weird dynamics of Europe,
03:41:16.200 | and then America falling apart
03:41:18.520 | through the political division fueled
03:41:20.360 | by recommender systems through Twitter and Facebook.
03:41:24.120 | The whole beautiful mess is just fun.
03:41:26.920 | And I think there's a lot of incredible engineers,
03:41:30.240 | incredible scientists, incredible human beings
03:41:32.920 | that while everyone is bickering and so on online
03:41:36.140 | for the fun of it on the weekends,
03:41:37.760 | they're actually trying to build solutions.
03:41:39.320 | And those are the people
03:41:40.680 | that will create something beautiful,
03:41:42.120 | at least I have, you know, that's the process of evolution.
03:41:45.680 | It's, there was, it all started
03:41:48.520 | with a Chuck Norris single cell organism
03:41:53.520 | that went out from the vents
03:41:55.440 | and was the parent to all of us.
03:41:57.880 | And for that guy or lady or both, I guess,
03:42:01.840 | is a big thank you and I can't wait to what happens next.
03:42:05.840 | And I'm glad there's incredible humans writing
03:42:08.440 | and studying it like you are, Nick.
03:42:10.000 | It's a huge honor that you would talk to me.
03:42:12.240 | - That's fantastic.
03:42:13.240 | - This is really amazing.
03:42:14.360 | I can't wait to read what you write next.
03:42:17.840 | Thank you for existing.
03:42:19.260 | And thank you for talking today.
03:42:24.600 | - Thank you.
03:42:25.440 | - Thanks for listening to this conversation with Nick Lane.
03:42:28.800 | To support this podcast,
03:42:30.040 | please check out our sponsors in the description.
03:42:32.680 | And now let me leave you with some words from Steve Jobs.
03:42:37.600 | I think the biggest innovations of the 21st century
03:42:40.720 | will be at the intersection of biology and technology.
03:42:45.720 | A new era is beginning.
03:42:47.840 | Thank you for listening.
03:42:50.040 | I hope to see you next time.
03:42:51.900 | (upbeat music)