
Lee Cronin: Origin of Life, Aliens, Complexity, and Consciousness | Lex Fridman Podcast #269


Chapters

0:00 Introduction
2:02 Life and chemistry
15:27 Self-replicating molecules
25:51 Origin of life
42:16 Life on Mars
47:20 Aliens
54:01 Origin of life continued
60:55 Fermi Paradox
70:35 UFOs
78:56 Science and authority
84:59 Pickle experiment
87:54 Assembly theory
130:53 Free will
142:08 Cellular automata
165:40 Chemputation
182:54 Universal programming language for chemistry
196:05 Chemputer safety
208:47 Automated engineering of nanomaterials
217:46 Consciousness
227:19 Joscha Bach
238:35 Meaning of life


00:00:00.000 | The following is a conversation with Lee Cronin,
00:00:02.720 | a chemist from the University of Glasgow,
00:00:05.160 | who's one of the most fascinating, brilliant,
00:00:07.440 | out of the box thinking scientists I've ever spoken to.
00:00:11.520 | This episode was recorded more than two weeks ago,
00:00:14.600 | so the war in Ukraine is not mentioned.
00:00:17.200 | I have been spending a lot of time each day
00:00:19.400 | talking to people in Ukraine and Russia.
00:00:22.080 | I have family, friends, colleagues,
00:00:23.920 | and loved ones in both countries.
00:00:27.320 | I will try to release a solo episode on this war,
00:00:30.200 | but I've been failing to find the words
00:00:31.880 | that make sense of it for myself and others, so I may not.
00:00:36.880 | I ask for your understanding no matter which path I take.
00:00:40.600 | Most of my time is spent trying to help
00:00:44.400 | as much as I can privately.
00:00:46.220 | I'm talking to people who are suffering,
00:00:49.320 | who are angry, afraid.
00:00:51.400 | When I returned to this conversation with Lee,
00:00:55.680 | I couldn't help but smile.
00:00:57.880 | He's a beautiful, brilliant, and hilarious human being.
00:01:01.560 | He's basically a human manifestation
00:01:03.680 | of the mad scientist Rick Sanchez from Rick and Morty.
00:01:06.480 | I thought about quitting this podcast for a time,
00:01:10.960 | but for now at least, I'll keep going.
00:01:14.080 | I love people too much, you the listener.
00:01:17.880 | I meet folks on the street or when I run.
00:01:21.320 | You say a few kind words about the podcast,
00:01:23.600 | and we talk about life, the small things,
00:01:26.800 | and the big things.
00:01:28.240 | All of it gives me hope.
00:01:29.500 | People are just amazing.
00:01:32.360 | You are amazing.
00:01:34.060 | I ask for your support, wisdom, and patience
00:01:37.280 | as I keep going with this silly little podcast,
00:01:41.520 | including through some difficult conversations,
00:01:43.960 | and hopefully many fascinating and fun ones too.
00:01:52.160 | This is the Lex Fridman Podcast.
00:01:54.320 | To support it, please check out our sponsors
00:01:56.400 | in the description.
00:01:57.640 | And now, dear friends,
00:01:59.480 | Here's Lee Cronin.
00:02:01.480 | How do you think life originated on Earth,
00:02:05.760 | and what insights does that give us about life?
00:02:09.360 | - If we go back to the origin of Earth,
00:02:11.440 | and you think about maybe 4.7, 4.6, 4.5 billion years ago,
00:02:16.440 | planet was quite hot.
00:02:17.600 | There was a limited number of minerals.
00:02:19.760 | There was some carbon, some water,
00:02:22.120 | and I think that maybe it's a really simple set of chemistry
00:02:25.320 | that we really don't understand.
00:02:28.240 | So that means you've got a finite number of elements
00:02:30.640 | that are going to react very simply with one another,
00:02:34.280 | and out of that mess comes a cell.
00:02:36.280 | So literally sand turns into cells,
00:02:38.960 | and it seems to happen quick.
00:02:41.040 | So what I think I can say with some degree of,
00:02:44.480 | I think not certainty, but curiosity,
00:02:46.860 | genuine curiosity is that life happened fast.
00:02:50.280 | - Yeah, so when we say fast, this is a pretty surprising fact
00:02:55.280 | and maybe you can actually correct me and elaborate,
00:02:58.080 | but it seems like most, like 70 or 80% of the time
00:03:02.520 | that Earth has been around, there's been life on it.
00:03:04.440 | Like some very significant percentage.
00:03:06.160 | So when you say fast, like the slow part is from single cell
00:03:11.160 | or from bacteria to some more complicated organism.
00:03:14.280 | It seems like most of the time that Earth has been around,
00:03:17.560 | it's been single cell or like very basic organisms,
00:03:21.600 | like a couple of billion years.
00:03:23.120 | But yeah, you're right.
00:03:24.400 | That's really, I recently kind of revisited our history
00:03:28.760 | and saw this, and I was just looking at the timeline.
00:03:32.840 | Wait a minute, like how did life just spring up so quickly?
00:03:36.400 | Like really quickly.
00:03:38.200 | That makes me think that it really wanted to.
00:03:41.420 | Like put another way, it's very easy for life to spring.
00:03:45.980 | - Yeah, I agree, I think it's much more inevitable.
00:03:48.840 | And I think I try to kind of, not provoke,
00:03:52.760 | but try and push chemists to think about,
00:03:54.680 | 'cause chemists are central to this problem, right?
00:03:57.800 | Of understanding the origin of life on Earth at least,
00:04:00.040 | because we're made of chemistry.
00:04:02.360 | But I wonder if the origin of life on a planet,
00:04:05.360 | or sorry, the emergence of life on a planet
00:04:07.200 | is as common as the formation of a star.
00:04:12.200 | And if you start framing it in that way,
00:04:15.400 | it allows you to then look at the universe
00:04:17.000 | slightly differently, because,
00:04:18.960 | and we can get into this, I think, in quite some detail.
00:04:21.560 | But I think, to come back to your question,
00:04:24.240 | I have little idea of how life got started.
00:04:27.800 | But I know it was simple.
00:04:29.640 | And I know that the process of selection
00:04:32.440 | had to occur before the biology was established.
00:04:36.320 | So that selection built the framework
00:04:39.800 | from which life kind of grew in complexity
00:04:43.520 | and capability and functionality and autonomy.
00:04:46.320 | And I think these are all really important words
00:04:48.320 | that we can unpack over the next while.
00:04:51.120 | - Can you say all the words again?
00:04:53.080 | So you said selection, so natural selection,
00:04:57.920 | the original A/B testing.
00:04:59.800 | - And so, and then complexity,
00:05:01.840 | and then the degree of autonomy and sophistication.
00:05:05.440 | Because I think that people misunderstand what life is.
00:05:08.960 | Some people say that life is a cell,
00:05:11.960 | and some people that say that life is a virus,
00:05:15.240 | or life is a, you know, an on/off switch.
00:05:19.760 | I don't think it's that.
00:05:20.960 | Life is the universe developing a memory.
00:05:24.020 | And the laws of physics, and the way,
00:05:27.440 | well, there are no laws of physics.
00:05:28.840 | Physics is just memory-free stuff, right?
00:05:32.040 | There's only a finite number of ways
00:05:34.440 | you can arrange the fundamental particles to do things.
00:05:38.600 | Life is the universe developing a memory.
00:05:43.440 | So it's like sewing a piece of art slowly,
00:05:48.440 | and then you can look back at it.
00:05:50.600 | So there's a stickiness to life.
00:05:55.600 | It's like universe doing stuff.
00:05:58.680 | And when you say memory, it's like there's a stickiness
00:06:01.800 | to a bunch of the stuff that's building together.
00:06:04.080 | So you can, in a stable way, trace back the complexity
00:06:09.080 | and that tells a coherent story.
00:06:13.080 | - Yeah, and I think, yeah.
00:06:14.520 | - Okay.
00:06:15.440 | That's, by the way, very poetic.
00:06:17.120 | (laughs)
00:06:18.400 | - Beautiful.
00:06:19.520 | - Life is the universe developing a memory.
00:06:22.160 | Okay, and then there's autonomy,
00:06:25.920 | you said, and complexity we'll talk about.
00:06:27.600 | But it's a really interesting idea
00:06:30.120 | that selection preceded biology.
00:06:33.560 | - Yeah, I think--
00:06:35.080 | - So first of all, what is chemistry?
00:06:38.400 | Like, does sand still count as chemistry?
00:06:41.360 | - Sure, I mean, as a chemist, a card-carrying chemist,
00:06:43.960 | if I'm allowed a card, I don't know.
00:06:45.920 | Don't know what I am most days, actually.
00:06:46.760 | - What is a card made of?
00:06:48.240 | (laughs)
00:06:49.600 | What's the chemical composition of the card?
00:06:52.360 | - So what is chemistry?
00:06:53.840 | Well, chemistry is the thing that happens
00:06:55.600 | when you bring electrons together and you form bonds.
00:06:58.320 | So bonds, or I say to people
00:07:00.360 | when they talk about life elsewhere,
00:07:02.360 | and I just say, well, there's bonds, there's hope.
00:07:04.800 | Because bonds allow you to get heterogeneity,
00:07:07.000 | they allow you to record those memories.
00:07:09.520 | Or, at least on Earth, you could imagine
00:07:12.360 | a Stanislav Lem-type world where you might have life
00:07:16.760 | emerging or intelligence emerging before life.
00:07:19.680 | That may be something like Solaris or something.
00:07:21.960 | But to get to selection, if atoms can combine and form bonds,
00:07:26.960 | those bonds, those atoms can bond to different elements,
00:07:31.880 | and those molecules will have different identities
00:07:35.600 | and interact with each other differently,
00:07:37.160 | and then you can start to have some degree
00:07:39.120 | of causation or interaction, and then selection,
00:07:42.560 | and then existence, and then you go up
00:07:47.240 | the path of complexity.
00:07:49.840 | And so, at least on Earth, as we know it,
00:07:52.560 | there is a sufficient pool of available chemicals
00:07:56.400 | to start searching that combinatorial space of bonds.
00:08:01.480 | - So, okay, this is a really interesting question.
00:08:03.480 | Let's lay it out.
00:08:04.800 | So, bonds, almost like cards.
00:08:07.920 | We say there's bonds, there is life,
00:08:12.040 | there's intelligence, there's consciousness.
00:08:14.320 | And what you just made me realize is
00:08:18.120 | those can emerge, let's put bonds aside,
00:08:24.640 | those can emerge in any order.
00:08:26.440 | That's really brilliant.
00:08:28.820 | So, intelligence can come before life.
00:08:32.160 | It's like panpsychics believe that consciousness,
00:08:36.200 | I guess, comes before life and before intelligence.
00:08:41.200 | So, consciousness permeates all matter,
00:08:43.880 | it's some kind of fabric of reality.
00:08:46.000 | Okay, so within this framework,
00:08:47.960 | you can kind of arrange everything,
00:08:49.960 | but you need to have the bonds
00:08:52.340 | that precedes everything else.
00:08:55.360 | Oh, and the other thing is selection.
00:08:57.560 | So, like the mechanism of selection.
00:08:59.840 | That could precede, couldn't that precede bonds too?
00:09:04.680 | Whatever the hell selection is.
00:09:06.240 | - I would say that there is an elegant order to it.
00:09:09.280 | Bonds allow selection, allows the emergence of life,
00:09:13.600 | allows the emergence of multicellularity,
00:09:15.880 | and then more information processing,
00:09:18.280 | building state machines all the way up.
00:09:19.680 | However, you could imagine a situation
00:09:22.160 | if you had, I don't know, a neutron star or a sun
00:09:25.360 | or what, ferromagnetic loops interacting with one another
00:09:28.640 | and these oscillators building state machines
00:09:31.080 | and these state machines reading something out
00:09:32.720 | in the environment.
00:09:34.280 | Over time, these state machines would be able
00:09:36.440 | to literally record what happened in the past
00:09:39.320 | and sense what's going on in the present
00:09:41.520 | and imagine the future.
00:09:43.120 | However, I don't think it's ever gonna be
00:09:46.200 | within a human comprehension, that type of life.
00:09:49.000 | I wouldn't count it out because, you know,
00:09:52.480 | whenever you, I know in science,
00:09:54.040 | whenever I say something's impossible,
00:09:55.880 | I then wake up the next day and say,
00:09:56.960 | no, that's actually wrong.
00:09:57.800 | I mean, there are some limits, of course.
00:10:00.480 | I don't see myself traveling faster than light anytime soon.
00:10:03.440 | - Eric Weinstein says that's possible,
00:10:05.560 | so he will say you're wrong.
00:10:06.600 | - Sure, but I'm an experimentalist as well,
00:10:08.680 | so one of my, I have two superpowers.
00:10:11.360 | My stupidity, and I don't mean that as a, you know,
00:10:14.520 | I'm like absolutely, completely witless,
00:10:16.600 | but I mean my ability to kind of just start again
00:10:19.000 | and ask the question and then do it with an experiment.
00:10:22.240 | I always wanted to be a theoretician growing up,
00:10:24.360 | but I just didn't have the intellectual capability,
00:10:27.120 | but I was able to think of experiments in my head
00:10:30.600 | I could then do in my lab or in the, you know,
00:10:32.240 | when I was a child outside,
00:10:35.760 | and then those experiments in my head and then outside
00:10:38.640 | reinforced one another, so I think that's a very good way
00:10:41.360 | of kind of grounding the science, right?
00:10:44.320 | - Well, that's a nice way to think about theoreticians
00:10:46.640 | is they're just people who run experiments in their head.
00:10:49.640 | I mean, that's exactly what Einstein did, right?
00:10:51.600 | But you were also capable of doing that in the head,
00:10:54.440 | in your head, inside your head and in the real world
00:10:57.000 | and the connection between the two
00:10:59.440 | is when you first discovered your superpower of stupidity.
00:11:02.560 | I like it.
00:11:03.400 | - Yes, there you go. - Okay,
00:11:04.220 | what's the second superpower?
00:11:05.440 | Your accent or is that?
00:11:08.400 | - Well, I don't know.
00:11:09.680 | I am genuinely curious, so my, so I have, you know,
00:11:13.760 | like everybody, ego problems,
00:11:15.040 | but my curiosity is bigger than my ego,
00:11:16.920 | so as long as that happens, I can--
00:11:19.880 | - Oh, that's awesome.
00:11:21.280 | That is so powerful.
00:11:22.280 | You're just dropping some powerful lines.
00:11:23.960 | So curiosity is bigger than ego.
00:11:26.760 | That's something I have to think about
00:11:28.120 | 'cause you always struggle about the role of ego in life
00:11:30.800 | and that's so nice to think about.
00:11:35.800 | Don't think about the size of ego,
00:11:37.760 | the absolute size of ego.
00:11:38.920 | Think about the relative size of ego
00:11:40.800 | to the other horses pulling at you
00:11:44.080 | and if the curiosity one is bigger,
00:11:46.600 | then ego will do just fine and make you fun to talk to.
00:11:51.360 | Anyway, so those are the two superpowers.
00:11:53.640 | How do those connect to natural selection
00:11:55.320 | or selection and bonds and I forgot already,
00:11:58.480 | life and consciousness?
00:12:00.000 | - So we're going back to selection in the universe
00:12:02.120 | and origin of life on Earth.
00:12:03.840 | I mean, selection is a,
00:12:06.520 | I'm convinced that selection is a force in the universe.
00:12:09.520 | Not a fundamental force, but a directing,
00:12:13.240 | but it is a directing force because existence,
00:12:15.920 | although existence appears to be the default,
00:12:19.200 | the existence of what?
00:12:21.120 | Why does, and we can get to this later I think,
00:12:24.360 | but it's amazing that discrete things exist
00:12:29.160 | and you see this cup,
00:12:31.040 | it's not the sexiest cup in the world,
00:12:33.800 | but it's pretty functional.
00:12:35.080 | This cup, the complexity of this cup
00:12:38.840 | isn't just in the object,
00:12:39.800 | it is literally the lineage of people making cups
00:12:42.200 | and recognizing that, seeing that in their head,
00:12:44.480 | making an abstraction of a cup
00:12:45.760 | and then making a different one.
00:12:47.160 | So I wonder how many billions of cups
00:12:50.480 | have come before this one
00:12:52.960 | and that's the process of selection and existence
00:12:55.120 | and the only reason the cup is still used,
00:12:56.640 | it's quite useful.
00:12:57.480 | I like the handle, it's convenient so I don't die,
00:12:59.800 | I get hydration.
00:13:01.720 | And so I think we are missing something fundamental
00:13:04.960 | in the universe about selection
00:13:07.120 | and I think what biology is,
00:13:08.520 | is a selection amplifier
00:13:12.640 | and that this is where autonomy comes in
00:13:15.000 | and actually I think that how humanity is gonna,
00:13:17.360 | humans and autonomous robots
00:13:20.480 | or whatever we're gonna call them in the future,
00:13:22.240 | we'll supercharge that even further.
00:13:24.680 | So selection is happening in the universe,
00:13:26.920 | but if you look in the asteroid belt,
00:13:28.840 | selection, if objects are being kicked in and out
00:13:31.640 | of the asteroid belt,
00:13:33.560 | those trajectories are quite complex.
00:13:35.080 | You don't really look at that as productive selection
00:13:37.080 | because it's not doing anything to improve its function.
00:13:40.040 | But is it?
00:13:40.880 | The asteroid belt has existed for some time.
00:13:43.720 | So there is some selection going on,
00:13:45.520 | but the functionality is somewhat limited.
00:13:49.520 | On Earth, at the formation of Earth,
00:13:53.440 | interaction of chemicals and molecules in the environment
00:13:56.880 | gave selection and then things could happen
00:13:59.320 | 'cause you could think about in chemistry,
00:14:01.800 | we could have an infinite number of reactions happen,
00:14:04.080 | but they don't all,
00:14:04.920 | all the reactions that are allowed to happen don't happen.
00:14:07.840 | Because there are energy barriers.
00:14:08.920 | So there must be some things called catalysts out there
00:14:12.080 | or there are bits of minerals
00:14:14.640 | that when two molecules get together in that mineral,
00:14:17.040 | it lowers the energy barrier for the reaction
00:14:19.200 | and so the reaction is promoted.
00:14:21.560 | So suddenly you get one reaction
00:14:22.920 | over another series of possibilities occurring
00:14:26.160 | that makes a particular molecule
00:14:28.120 | and this keeps happening in steps.
00:14:30.040 | And before you know it,
00:14:31.600 | almost these waves as discrete reactions work together
00:14:34.400 | and you start to build a machinery
00:14:38.080 | that is run by existence.
00:14:41.880 | So as you go forward in time,
00:14:44.960 | the fact that the molecules, the bonds are getting,
00:14:48.040 | there are more bonds in a molecule,
00:14:49.520 | there's more function,
00:14:50.920 | there's more capability for this molecule
00:14:52.480 | to interact with other molecules, to redirect them,
00:14:55.320 | it's like a series of little,
00:14:57.160 | and I don't want to use this term too much,
00:15:00.000 | but it's almost thinking about
00:15:01.120 | the simplest von Neumann constructor
00:15:04.680 | that's the simplest molecule
00:15:06.000 | that could build a more complicated molecule
00:15:07.560 | to build a more complicated molecule.
00:15:09.120 | And before you know it,
00:15:10.000 | when that more complicated molecule can act
00:15:11.920 | on the causal chain that's produced itself and change it,
00:15:15.880 | suddenly you start to get towards some kind of autonomy
00:15:18.480 | and that's where life I think emerges in earnest.
00:15:21.840 | - Every single word in the past few paragraphs,
00:15:24.680 | let's break those apart.
00:15:26.200 | Who's von Neumann?
00:15:28.960 | What's a constructor?
00:15:30.920 | The closing of the loop that you're talking about,
00:15:33.880 | the molecule that starts becoming,
00:15:37.920 | I think you said the smallest von Neumann constructor,
00:15:41.800 | the smallest, the minimal.
00:15:43.240 | So what do all those things mean
00:15:45.360 | and what are we supposed to imagine
00:15:47.920 | when we think about the smallest von Neumann constructor?
00:15:51.200 | - So John von Neumann is a real hero,
00:15:54.160 | actually not just me, but many people,
00:15:55.760 | I think computer science and physics.
00:15:58.560 | He was an incredible intellect
00:16:00.520 | who probably solved a lot of the problems
00:16:02.960 | that we're working on today,
00:16:04.080 | I just forgot to write them down.
00:16:06.480 | And I'm not sure if it's John von Neumann or Johnny
00:16:09.440 | as I think his friends called him,
00:16:10.760 | but I think he was Hungarian, mathematician,
00:16:14.800 | came to the US and basically was involved
00:16:18.520 | in the Manhattan Project and developing computation
00:16:21.400 | and came up with all sorts of ideas
00:16:24.280 | and I think was one of the first people
00:16:25.680 | to come up with cellular automata.
00:16:27.400 | - Oh really, I didn't know this little fact.
00:16:30.280 | - I think so.
00:16:31.120 | I think so.
00:16:31.960 | - Well anyway, if he didn't come up with it,
00:16:34.840 | he probably did come up with it and didn't write it down.
00:16:37.240 | - There was a couple of people who did at the same time
00:16:38.800 | and then Conway obviously took it on
00:16:40.120 | and then Wolfram loves CAs,
00:16:42.600 | there is his fabric of the universe.
00:16:44.720 | And what I think he imagined was that he wasn't satisfied
00:16:48.320 | and this may be incorrect recollection,
00:16:51.360 | but so a lot of what I say is gonna be kind of,
00:16:53.800 | you know, just way out of my--
00:16:56.120 | - You're just part of the universe,
00:16:59.120 | creating its memory, designing--
00:17:02.280 | - Exactly, yeah, rewriting history.
00:17:04.160 | - Rewriting history.
00:17:05.000 | - Exactly, imperfectly.
00:17:06.360 | So but what I mean is I think he liked this idea
00:17:09.080 | of thinking about how could a Turing machine
00:17:14.080 | literally build itself without a Turing machine, right?
00:17:16.760 | It's like literally how do state machines emerge
00:17:19.600 | and I think that von Neumann constructors,
00:17:21.640 | he wanted to conceive of a minimal thing,
00:17:24.760 | an automaton, that could build itself
00:17:28.400 | and what would those rules look like in the world?
00:17:30.680 | And that's what a von Neumann kind of constructor
00:17:32.800 | looked like, like it's a minimal hypothetical object
00:17:35.480 | that could build itself, self replicate.
00:17:37.880 | And I'm really fascinated by that because I think that
00:17:42.280 | although it's probably not exactly what happened,
00:17:46.960 | it's a nice model because as chemists,
00:17:48.800 | if we could go back to the origin of life
00:17:50.800 | and think about what is a minimal machine
00:17:54.120 | that can get structured randomly,
00:17:56.000 | so like with no prime mover, with no architect,
00:18:01.240 | it assembles through just existence.
00:18:03.600 | So random stuff bumping in together
00:18:06.200 | and you make this first molecule.
00:18:07.800 | So you have molecule A and molecule A interacts
00:18:11.520 | with another random molecule B and they get together
00:18:14.080 | and they realize by working together,
00:18:15.520 | they can make more of themselves.
00:18:17.600 | But then they realize they can mutate
00:18:19.560 | so they can make AB prime.
00:18:21.440 | So AB prime is different to AB and then AB prime
00:18:25.960 | can then act back where A and B were being created
00:18:29.680 | and slightly nudge that causal chain
00:18:32.920 | and make AB prime more evolvable or learn more.
00:18:37.920 | So that's the closing the loop part.
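A toy sketch of the loop just described (all species names, rates, and pool sizes here are invented for illustration, not Cronin's actual chemistry): a replicator AB copies itself out of a finite pool of free A and B, slowly degrades back into the pool, and occasionally mutates into a variant AB' that copies a little faster.

```python
# Toy model of the A + B -> AB -> AB' loop described above. Species, rates and
# pool size are invented purely for illustration; nothing here is real chemistry.
import random

def simulate(steps=3000, pool=5_000, seed=1):
    random.seed(seed)
    counts = {"AB": 1, "ABprime": 0}           # one chance-formed replicator to start
    copy_rate = {"AB": 0.02, "ABprime": 0.03}  # assumed per-step copy probabilities
    decay = 0.015                              # per-step chance a replicator degrades back to A + B
    mutation = 0.01                            # chance a copy of AB comes out as the variant AB'
    for t in range(steps):
        for species in ("AB", "ABprime"):
            copies = deaths = 0
            for _ in range(counts[species]):
                if random.random() < decay:
                    deaths += 1                # degraded molecules return material to the pool
                elif pool > 0 and random.random() < copy_rate[species]:
                    copies += 1
                    pool -= 1                  # each new copy consumes free A and B
            pool += deaths
            counts[species] -= deaths
            if species == "AB":
                mutants = sum(random.random() < mutation for _ in range(copies))
                counts["AB"] += copies - mutants
                counts["ABprime"] += mutants
            else:
                counts["ABprime"] += copies
        if t % 500 == 0:
            print(t, counts, "free pool:", pool)

simulate()
```

Once the pool is saturated, the faster-copying variant steadily displaces the original, which is the sense in which AB' acts back on the causal chain that produced it.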
00:18:40.560 | - Closing the loop part, got it.
00:18:42.040 | It feels like the mutation part is not that difficult.
00:18:46.920 | It feels like the difficult part is just creating
00:18:48.680 | a copy of yourself as step one.
00:18:50.880 | It seems like one of the greatest inventions
00:18:55.880 | in the history of the universe
00:18:57.600 | is the first molecule that figured out,
00:19:01.640 | holy shit, I can create a copy of myself.
00:19:04.480 | How hard is that?
00:19:06.920 | - I think it's really, really easy.
00:19:09.720 | - Okay, I did not expect that.
00:19:11.000 | - I think it's really, really easy.
00:19:12.640 | Well, let's take a step back.
00:19:15.120 | I think replicating molecules are rare,
00:19:19.240 | but if you say, I think I was saying on,
00:19:22.040 | I probably got into trouble on Twitter the other day,
00:19:23.880 | so I was trying to work this.
00:19:24.920 | There's about more than 18 mils of water in there.
00:19:27.160 | So one mole of water, 6.022 times 10 to the 23 molecules.
00:19:31.720 | That's about the number of stars in the universe,
00:19:33.600 | I think, of the order.
00:19:34.680 | So there's three universe worth, but between one--
00:19:37.160 | - Somebody corrected you on Twitter.
00:19:38.440 | - Yeah, as ever, I'm always being corrected.
00:19:40.840 | It's a great, but there's a lot of molecules in the water.
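For reference, a quick sketch of the arithmetic behind that comparison (the 54 mL volume is an assumption picked to give roughly three moles, and star counts for the observable universe are only order-of-magnitude estimates, commonly quoted around 10^22 to 10^24):

```python
# Back-of-the-envelope check of the "moles of water vs. stars" comparison above.
AVOGADRO = 6.022e23     # molecules per mole
ML_PER_MOLE = 18.0      # ~18 mL of water is one mole (18 g/mol at ~1 g/mL)
volume_ml = 54.0        # assumed volume, chosen to give roughly three moles
moles = volume_ml / ML_PER_MOLE
print(f"{moles:.0f} moles ≈ {moles * AVOGADRO:.2e} water molecules")
```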
00:19:44.120 | And so there's a lot of, so although it's,
00:19:46.920 | for you and me, really hard to conceive of,
00:19:49.520 | if existence is not the default for a long period of time,
00:19:55.240 | because what happens is the molecules get degraded.
00:19:57.840 | So much of the possibilities in the universe
00:20:00.120 | are just broken back into atoms.
00:20:01.920 | So you have this kind of destruction of the molecules
00:20:06.080 | for our chemical reactions.
00:20:07.720 | So you only need one or two molecules
00:20:10.000 | to become good at copying themselves,
00:20:12.280 | for them suddenly to then take resources in the pool
00:20:15.080 | and start to grow.
00:20:16.400 | And so then replication, actually, over time,
00:20:19.000 | when you have bonds, I think is much simpler, much easier.
00:20:23.040 | And I even found this in my lab years ago.
00:20:25.520 | I had, one of the reasons I started doing
00:20:27.560 | inorganic chemistry and making rust,
00:20:30.400 | making a bit of rust based on a thing called molybdenum.
00:20:33.720 | Molybdenum oxide, is this molybdenum oxide, very simple.
00:20:38.620 | But when you add acid to it, and some electrons,
00:20:40.940 | they make these molecules you just cannot possibly imagine,
00:20:44.700 | would be constructed big, gigantic wheels
00:20:46.960 | of 154 molybdenum atoms in a wheel.
00:20:50.160 | Or an icosahedron, 132 molybdenum atoms,
00:20:54.360 | all in the same pot.
00:20:55.440 | And I realized when I, and I just finished experiments
00:20:58.040 | two years ago, I've just published a couple of papers
00:21:00.200 | on this, that they're actually,
00:21:02.680 | there is a random small molecule with 12 atoms in it
00:21:06.400 | that can form randomly, but it happens to template
00:21:09.020 | its own production.
00:21:10.840 | And then by chance it templates the ring.
00:21:14.060 | Just an accident, just like, just an absolute accident.
00:21:17.280 | And that ring also helps make the small 12 mer.
00:21:21.960 | And so you have what's called an autocatalytic set,
00:21:25.520 | where A makes B, and B helps make A, and vice versa.
00:21:30.520 | And you then make this loop.
00:21:34.560 | So it's a bit like these, they all work in synergy
00:21:39.560 | to make this chain of events that grow.
00:21:43.500 | And it doesn't take a very sophisticated model
00:21:46.600 | to show that if you have these objects competing
00:21:50.540 | and then collaborating to help one another build,
00:21:53.280 | they just grow out of the mess.
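A deliberately unsophisticated sketch of that point, with made-up species and rate constants rather than the real molybdate system: two species that each catalyse the other's formation grow out of a trace impurity, then stall when the feedstock runs out.

```python
# Crude model of an autocatalytic set: A catalyses formation of B from a feedstock,
# B catalyses formation of A, and both slowly degrade. All names and rate constants
# are invented; this illustrates mutual catalysis in general, not the Mo chemistry.
def autocatalytic_set(steps=801, dt=0.01):
    A, B, feed = 1e-6, 0.0, 1.0                # a trace of A formed "by accident"
    k_cat, k_spont, k_decay = 5.0, 1e-4, 0.05
    for t in range(steps):
        prod_A = (k_spont + k_cat * B) * feed  # B templates A
        prod_B = (k_spont + k_cat * A) * feed  # A templates B
        A += (prod_A - k_decay * A) * dt
        B += (prod_B - k_decay * B) * dt
        feed = max(0.0, feed - (prod_A + prod_B) * dt)  # growth stalls as feedstock runs out
        if t % 200 == 0:
            print(f"t={t:4d}  A={A:.4f}  B={B:.4f}  feed={feed:.4f}")

autocatalytic_set()
```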
00:21:55.000 | And although they seem improbable, they are improbable.
00:21:58.320 | In fact, impossible in one step.
00:22:01.560 | There's multiple steps.
00:22:02.800 | This is when people look at the blind watchmaker
00:22:06.040 | argument when you're talking about how could a watch
00:22:08.320 | spontaneously emerge?
00:22:10.860 | Well, it doesn't.
00:22:11.760 | It's a lineage of watches and cruder devices
00:22:14.560 | that are bootstrapped onto one another.
00:22:17.840 | - Right.
00:22:20.040 | So it's very improbable, but once you get
00:22:24.240 | that little discovery, like with the wheel and fire,
00:22:28.480 | it just gets, explodes in, because it's so successful,
00:22:34.360 | it explodes.
00:22:35.180 | It's basically selection.
00:22:37.800 | So this templating mechanism that allows you
00:22:40.520 | to have a little blueprint for yourself,
00:22:43.440 | how you go through different procedures
00:22:45.080 | is to build copies of yourself.
00:22:46.640 | So in chemistry somehow it's possible to imagine
00:22:52.920 | that that kind of thing is easy to spring up.
00:22:57.000 | In more complex organisms, it feels like a different thing
00:22:59.800 | and much more complicated.
00:23:01.680 | We're having multiple abstractions of the birds
00:23:05.080 | and the bees conversation here.
00:23:06.820 | But with human, sorry, with complex organisms,
00:23:10.240 | it feels difficult to have reproduction.
00:23:13.280 | To, that's going to get clipped out.
00:23:17.120 | I'm going to make fun of that.
00:23:18.600 | (laughs)
00:23:20.800 | It's difficult to develop this idea
00:23:23.840 | of making copies of yourself or no.
00:23:25.840 | 'Cause that seems like a magical idea for life to,
00:23:32.960 | That feels like very necessary for what selection is,
00:23:36.560 | for what evolution is.
00:23:37.720 | But then if selection precedes all of this,
00:23:40.640 | then maybe these are just like echoes
00:23:43.440 | of the selecting mechanism at different scales.
00:23:47.360 | - Yeah, that's exactly it.
00:23:48.440 | So selection is the default in the universe.
00:23:50.720 | If you want to, and what happens is that life,
00:23:54.820 | the solution that life has got on Earth,
00:23:57.680 | life on Earth, biology on Earth, is unique to Earth.
00:24:01.320 | We can talk about that.
00:24:03.240 | And that was really hard fought for.
00:24:05.760 | But that is a solution that works on Earth,
00:24:08.680 | the ribosome, the fundamental machine
00:24:10.640 | that is responsible for every life,
00:24:13.200 | every cell on Earth, or wherever it is
00:24:16.600 | in the kingdom of life.
00:24:18.520 | That is an incredibly complex object.
00:24:21.160 | But it was evolved over time,
00:24:22.880 | and it wasn't evolved in a vacuum.
00:24:24.600 | And I think that once we understand
00:24:26.800 | that selection can occur without the ribosome,
00:24:31.800 | but what the ribosome does,
00:24:34.160 | it's a phase transition in replication.
00:24:37.220 | And I think that that, and also technology,
00:24:39.640 | that is probably much easier to get to than we think.
00:24:44.640 | - Why do you put the ribosome as the central
00:24:49.200 | part of living organisms on Earth?
00:24:52.400 | - It basically is a combination
00:24:53.720 | of two different polymer systems,
00:24:55.840 | so RNA and peptides.
00:24:57.280 | So the RNA world, if you like, gets transmitted
00:25:00.900 | and builds proteins, and the proteins are responsible
00:25:03.720 | for all the catalysis.
00:25:05.200 | The majority of the catalysis goes on in the cell.
00:25:07.340 | No ribosome, no proteins, no decoding, no evolution.
00:25:11.500 | - So ribosome is looking at the action.
00:25:14.340 | You don't put the RNA itself as the critical thing.
00:25:17.820 | Like information, you put action as the most important thing.
00:25:20.460 | - Yeah, I think the actual molecules
00:25:22.460 | that we have in biology right now
00:25:23.700 | are entirely contingent on the history of life on Earth.
00:25:27.340 | There are so many possible solutions.
00:25:29.020 | And this is where chemistry gets itself,
00:25:30.900 | into origin of life chemistry gets itself
00:25:32.580 | into a bit of a trap.
00:25:33.460 | - Yeah, let me interrupt you there.
00:25:35.260 | You've tweeted, you're gonna get,
00:25:36.560 | I'm gonna cite your tweets, like it's Shakespeare.
00:25:39.260 | - Okay.
00:25:40.520 | - It's surprising you haven't gotten canceled
00:25:41.960 | on Twitter yet.
00:25:42.800 | Your brilliance, once again, saves you.
00:25:45.660 | I'm just kidding.
00:25:47.500 | There's, you like to have a little bit of fun on Twitter.
00:25:51.300 | You've tweeted that, quote,
00:25:52.720 | "Origin of life research is a scam."
00:25:56.860 | So if this is Shakespeare, can we analyze this word?
00:25:59.900 | Why is the origin of life research a scam?
00:26:02.300 | Aren't you kind of doing origin of life research?
00:26:05.220 | - Okay, it was tongue in cheek, but yeah,
00:26:08.140 | I think, and I meant it as tongue in cheek.
00:26:10.940 | I am, I'm not doing the origin of life research.
00:26:14.460 | I'm trying to make artificial life.
00:26:16.820 | And I also want to bound the likelihood
00:26:19.700 | of the origin of life on Earth,
00:26:21.340 | but more importantly, to find origin of life elsewhere.
00:26:23.820 | But let me directly address the tweet.
00:26:26.020 | There are many, many good chemists out there
00:26:27.900 | doing origin of life research, but I want to nudge them.
00:26:31.500 | And I think they're brilliant.
00:26:32.460 | Like, there's no question.
00:26:35.280 | The chemistry they are doing, the motivation is great.
00:26:38.780 | So what I meant by that tweet is saying
00:26:40.580 | that maybe they're making assumptions about saying,
00:26:42.980 | if only I could make this particular type of molecule,
00:26:47.440 | say this RNA molecule or this phosphodiester
00:26:52.100 | or this other molecule,
00:26:54.140 | it's gonna somehow unlock the origin of life.
00:26:57.800 | And I think that origin of life has been looking at this
00:27:00.560 | for a very long time.
00:27:01.980 | And whilst I think it's brilliant to work out
00:27:05.700 | how you can get to those molecules,
00:27:08.220 | I think that chemistry and chemists doing origin of life
00:27:11.420 | could be nudged into doing something even more profound.
00:27:15.940 | And so the argument I'm making, it's a bit like right now,
00:27:20.260 | let's say, I don't know, the first Tesla
00:27:22.940 | that makes its way to, I don't know,
00:27:25.820 | into a new country in the world.
00:27:27.020 | Let's say there's a country X
00:27:29.660 | that has never had a Tesla before,
00:27:31.180 | and they get the Tesla. - Russia.
00:27:33.380 | - And what they do is they take the Tesla apart
00:27:35.620 | and say, we wanna find the origin of cars in the universe
00:27:38.460 | and say, okay, how did this form and how did this form?
00:27:41.140 | And they just randomly keep making
00:27:42.620 | till they make the door, they make the wheel,
00:27:44.360 | they make the steering column and all this stuff.
00:27:46.260 | And they say, oh, that's the route.
00:27:48.140 | That's the way cars emerged on earth.
00:27:50.420 | But actually we know that there's a causal chain of cars
00:27:53.340 | going right back to Henry Ford and the horse and carriage.
00:27:56.100 | And before that, maybe, you know,
00:27:57.760 | where people were using wheels.
00:28:00.340 | And I think that obsession with the identities
00:28:04.340 | that we see in biology right now
00:28:06.140 | are giving us a false sense of security
00:28:09.260 | about what we're looking for.
00:28:10.960 | And I think the origin of life chemistry
00:28:13.780 | is in danger of not making the progress that it deserves.
00:28:20.100 | Because the chemists are doing this.
00:28:22.780 | The field is exploding right now.
00:28:24.260 | There's amazing people out there, young and old, doing this.
00:28:27.540 | And there's deservedly so more money going in.
00:28:30.340 | You know, I used to complain,
00:28:31.860 | there's more money being spent searching for the Higgs boson
00:38:34.100 | that we know exists, than on the origin of life.
00:28:36.300 | You know, why is that?
00:38:37.460 | The origin, if we understand the origin of life,
00:38:39.780 | we're gonna actually work out what life is,
00:38:42.220 | we're gonna be able to bound the likelihood
00:28:43.660 | of finding life elsewhere in the universe.
00:28:45.700 | And most important for us, we are gonna know
00:28:48.620 | or have a good idea of what the future of humanity
00:28:50.700 | looks like.
00:28:51.700 | You know, we need to understand
00:28:53.440 | that although we're precious,
00:28:54.980 | we're not the only life forms in the universe.
00:28:57.140 | Or that's my very strong impression.
00:28:58.720 | I have no data for that.
00:29:00.100 | It's just right now a belief.
00:29:02.180 | And I want to turn that belief
00:29:03.380 | into more than a belief by experimentation.
00:29:07.180 | But coming back to the scam,
00:29:09.480 | the scam is if we just make this RNA,
00:29:11.740 | we've got this, you know, this fluke event,
00:29:16.020 | we know how that's simple.
00:29:17.540 | Let's make this phosphodiester,
00:29:19.180 | or let's make ATP or ADP.
00:29:21.220 | We've got that part nailed.
00:29:22.420 | Let's now make this other molecule, another molecule.
00:29:24.420 | And how many molecules are gonna be enough?
00:29:26.460 | And then the reason I say this
00:29:28.360 | is when you go back to Craig Venter,
00:29:30.600 | when he invented his life form, Synthia,
00:29:34.440 | this minimal plasmid,
00:29:38.220 | is a mycoplasma, something, I don't know the name of it.
00:29:43.140 | But he made this wonderful cell
00:29:45.560 | and said, "I've invented life."
00:29:48.340 | Not quite.
00:29:49.180 | He facsimiled the genome from this entity
00:29:52.700 | and made it in the lab, all the DNA,
00:29:55.620 | but he didn't make the cell.
00:29:56.940 | He had to take an existing cell
00:29:59.980 | that has a causal chain going all the way back to Luca.
00:30:02.660 | And he showed when he took out the gene,
00:30:05.300 | the genes, and put in his genes, synthesized,
00:30:08.260 | the cell could boot up.
00:30:09.860 | But it's remarkable
00:30:10.780 | that he could not make a cell from scratch.
00:30:12.900 | And even now today,
00:30:14.220 | synthetic biologists cannot make a cell from scratch
00:30:17.300 | because there's some contingent information
00:30:20.340 | embodied outside the genome in the cell.
00:30:23.780 | And that is just incredible.
00:30:26.140 | So there's lots of layers to the scam.
00:30:28.120 | - Well, let me then ask the question,
00:30:32.940 | how can we create life in the lab from scratch?
00:30:37.220 | What have been the most promising attempts
00:30:39.420 | at creating life in the lab from scratch?
00:30:41.860 | Has anyone actually been able to do it?
00:30:44.220 | Do you think anyone will be able to do it
00:30:46.340 | in the near future if they haven't already?
00:30:48.500 | - Yeah, I think that nobody has made life
00:30:53.660 | in the lab from scratch.
00:30:54.740 | Lots of people would argue that they have made progress.
00:30:57.380 | So Craig Venter, I think the synthesis
00:30:59.300 | of a synthetic genome is a milestone in human achievement.
00:31:03.580 | Brilliant.
00:31:04.420 | - Yeah, can we just walk back and say,
00:31:06.340 | what would you say from your perspective,
00:31:09.740 | one of the world experts in exactly this area,
00:31:12.980 | what does it mean to create life from scratch?
00:31:15.100 | Where if you sit back, whether you do it
00:31:17.260 | or somebody else does it, it's like,
00:31:20.060 | damn, we just created life.
00:31:24.580 | - Well, I can tell you what I would expect,
00:31:27.580 | I would like to be able to do,
00:31:30.240 | is to go from sand to cells in my lab.
00:31:35.240 | And--
00:31:36.380 | - Can you explain what sand is?
00:31:39.060 | You used a board. - Yeah, just inorganic stuff.
00:31:40.980 | Like basically, so sand is just silica.
00:31:44.820 | Silicon oxide with some other ions in it,
00:31:47.220 | maybe some inorganic carbon, some carbonates.
00:31:50.980 | Just basically clearly dead stuff
00:31:52.980 | that you could just grind rocks into sand.
00:31:55.580 | - And it would be what, in a kind of vacuum
00:31:57.940 | so they could remove anything else that could possibly
00:32:00.700 | be like a shadow of life that can assist in the chemical--
00:32:06.340 | - You could do that, you could insist and say,
00:32:07.940 | look, I'm gonna take, and not just inorganic,
00:32:09.940 | I want some more, I wanna cheat and have some organic,
00:32:12.580 | but I want inorganic organic,
00:32:13.900 | and I'll explain the play on words in a moment.
00:32:16.100 | So I would like to basically put into a world,
00:32:19.420 | let's say a completely, you know, a synthetic world,
00:32:22.780 | if you like, a closed world,
00:32:24.500 | put some inorganic materials
00:32:26.700 | and just literally add some energy in some form,
00:32:30.020 | be it lightning or heat, UV light,
00:32:33.660 | and run this thing in cycles over time
00:32:37.180 | and let it solve the search problem.
00:32:38.620 | So I see the origin of life as a search problem
00:32:41.140 | in chemical space.
00:32:42.700 | And then I would wait, literally wait for a life form
00:32:45.580 | to crawl out of the test tube,
00:32:46.580 | that's the joke I tell my group.
00:32:48.100 | Literally wait for a very, don't worry,
00:32:51.620 | it's gonna be very feeble,
00:32:52.540 | it's not gonna take over the world.
00:32:53.900 | You know, there's ways of ethically containing it.
00:32:56.140 | - Famous last words.
00:32:57.580 | - It was, indeed, indeed, indeed.
00:33:00.220 | But I--
00:33:01.420 | - You know this is being recorded, right?
00:33:03.220 | It'll make you, it will not make you look good
00:33:05.660 | once it crawls out of the lab
00:33:07.660 | and destroys all of human civilization, but yes, let's--
00:33:09.940 | - But there is very good,
00:33:11.060 | there is a very good things you can do to prevent that.
00:33:13.420 | For instance, if you put stuff in your world
00:33:15.580 | which isn't Earth abundant,
00:33:17.460 | so let's say we make life based on molybdenum
00:33:20.420 | and it escapes, it would die immediately
00:33:21.940 | 'cause there's not enough molybdenum in the environment.
00:33:23.460 | So we can put in, we can do responsible life.
00:33:26.700 | Or as I fantasize with my research group on our away day
00:33:29.900 | that would go in, it's, you know,
00:33:31.500 | I think it's actually morally,
00:33:33.540 | if we don't find, until humanity finds life in the universe,
00:33:37.980 | this is going on a tangent,
00:33:39.260 | it's our moral obligation to make origin of life bombs,
00:33:42.020 | identify dead planets and bomb them
00:33:43.820 | with our origin of life machines and make them alive.
00:33:46.580 | I think it is our moral obligation to do that.
00:33:49.700 | I'm sure some people might argue with me about that,
00:33:51.780 | but I think that we need more life in the universe.
00:33:54.420 | - And then we kind of forget we did it
00:33:58.100 | and then come back.
00:33:59.100 | And then--
00:34:01.220 | - Say, where did you come from?
00:34:02.260 | But coming back to the, what I'd expect,
00:34:04.020 | so I'll just say--
00:34:04.860 | - Father, are you back?
00:34:06.820 | I think this is, once again, a Rick and Morty episode.
00:34:09.300 | - Definitely, definitely all Rick and Morty
00:34:11.020 | all the way down.
00:34:11.940 | So I imagine we have this pristine experiment
00:34:15.540 | and everything is, you know, sanitized.
00:34:18.100 | And we put in inorganic materials and we have cycles,
00:34:21.940 | whether day, night cycles, up, down, whatever.
00:34:24.500 | And we look for evidence of replication
00:34:27.260 | and evolution over time.
00:34:28.780 | And that's what the experiment should be.
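One toy way to phrase "run cycles and look for evidence of replication", with everything invented for illustration (random strings stand in for molecules, a contrived rule stands in for a replicator, and no real protocol is implied): if something in a randomly recombining pool starts copying itself, the diversity of the pool collapses, which shows up as a falling entropy.

```python
# Toy readout for "did selection kick in?": watch the Shannon entropy of a pool of
# randomly generated strings across cycles. A contrived replication rule (strings of
# one repeated letter copy themselves) stands in for whatever chemistry might emerge.
import math
import random
from collections import Counter

def shannon_entropy(pool):
    counts = Counter(pool)
    n = len(pool)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def run_cycles(cycles=31, pool_size=500, seed=0):
    random.seed(seed)
    alphabet = "abcd"
    pool = ["".join(random.choices(alphabet, k=4)) for _ in range(pool_size)]
    for cycle in range(cycles):
        new = ["".join(random.choices(alphabet, k=4)) for _ in range(pool_size // 10)]
        copies = [s for s in pool if len(set(s)) == 1 for _ in range(3)]
        pool = (pool + new + copies)[-pool_size:]   # finite pool: oldest material degrades
        if cycle % 5 == 0:
            print(f"cycle {cycle:3d}  entropy = {shannon_entropy(pool):5.2f} bits")

run_cycles()
```

In a pool where nothing copies itself the entropy stays near its random-mixture value; a sustained drop toward a few dominant species is the kind of signature the sealed-world experiment would be watching for.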
00:34:30.580 | Now, are there people doing this in the world right now?
00:34:32.460 | There are a couple of,
00:34:33.780 | there's some really good groups doing this.
00:34:35.780 | There's some really interesting scientists
00:34:37.140 | doing this around the world.
00:34:38.700 | They're kind of, perhaps, too much associated with the scam.
00:34:43.700 | So, and so they're using molecules
00:34:48.500 | that are already, were already invented by biology.
00:34:50.860 | So there's a bit of replication built in.
00:34:54.100 | But I still think the work they're doing is amazing.
00:34:57.660 | But I would like people to be a bit freer
00:34:59.780 | and say, let's just basically shake a load of sand in a box
00:35:03.500 | and wait for life to come out.
00:35:04.580 | Because that's what happened on Earth.
00:35:06.340 | And so we have to understand that.
00:35:08.020 | Now, how would I know I've been successful?
00:35:10.180 | Well, because I'm not obsessing
00:35:11.940 | with what molecules are in life now,
00:35:15.100 | I would wager a vast quantity of money.
00:35:19.860 | I'm not very rich, so it'd just be a few dollars.
00:35:21.940 | But for me, the solution space will be different.
00:35:26.940 | So the genetic material will be not RNA.
00:35:31.060 | The proteins will not be what we think.
00:35:33.980 | The solutions will be just completely different.
00:35:36.180 | And it might be, and it'll be very feeble,
00:35:37.820 | because that's the other thing we should be able to show
00:35:40.940 | fairly robustly, that even if I did make,
00:35:43.860 | or someone did make a new life form in the lab,
00:35:46.100 | it would be so poor that it's not gonna leap out.
00:35:50.100 | It is, the fear about making a lethal life form
00:35:55.100 | in the lab from scratch is similar to us imagining
00:36:00.700 | that we're gonna make the Terminator
00:36:02.580 | in the Boston Dynamics tomorrow.
00:36:04.500 | It's simply not.
00:36:05.340 | And the problem is, we don't communicate that properly.
00:36:08.500 | I know you yourself, you explain this very well.
00:36:12.900 | There is not the AI catastrophe coming.
00:36:15.000 | We're very far away from that.
00:36:17.020 | That doesn't mean we should ignore it.
00:36:18.460 | Same with the origin of life catastrophe.
00:36:20.020 | It's not coming anytime soon.
00:36:21.940 | We shouldn't ignore it.
00:36:23.140 | But we shouldn't let that fear stop us
00:36:25.380 | from doing those experiments.
00:36:26.580 | - But this is a much, much longer discussion,
00:36:28.540 | 'cause there's a lot of details there.
00:36:30.020 | I would say there's potential for catastrophic events
00:36:33.620 | to happen in much dumber ways.
00:36:36.660 | In the AI space, there's a lot of ways to create,
00:36:40.060 | like social networks are creating a kind of accelerated
00:36:45.060 | set of events that we might not be able to control.
00:36:48.380 | That social network virality in the digital space
00:36:53.260 | can create mass movements of ideas that can then,
00:36:58.820 | if times are tough, create military conflict
00:37:01.980 | and all those kinds of things.
00:37:02.980 | But that's not super intelligent AI.
00:37:05.660 | That's an interesting at-scale application of AI.
00:37:09.740 | If you look at viruses, viruses are pretty dumb.
00:37:13.300 | But at scale, their application is pretty detrimental.
00:37:16.820 | And so origin of life, much like all the kind of virology,
00:37:21.820 | the very contentious word of gain-of-function research
00:37:28.540 | in virology, sort of like research on viruses,
00:37:31.760 | messing with them genetically,
00:37:36.100 | that can create a lot of problems if not done well.
00:37:39.000 | So we have to be very cautious.
00:37:41.820 | So there's a kind of, whenever you're ultra-cautious
00:37:44.920 | about stuff in AI or in virology and biology,
00:37:49.920 | it borders on cynicism, I would say,
00:37:53.420 | where it's like everything we do is going to turn out
00:37:56.760 | to be destructive and terrible,
00:37:58.680 | so I'm just going to sit here and do nothing.
00:38:01.000 | Okay, that's a possible solution,
00:38:03.460 | except for the fact that somebody's going to do it.
00:38:06.360 | It's science and technology progresses,
00:38:10.500 | so we have to do it in an ethical way, in a good way,
00:38:14.800 | considering in a transparent way, in an open way,
00:38:17.820 | considering all the possible positive trajectories
00:38:23.800 | that could be taken and making sure,
00:38:25.480 | as much as possible, that we walk those trajectories.
00:38:28.260 | So yeah, I don't think Terminator is coming,
00:38:31.080 | but a totally unexpected version of Terminator
00:38:35.220 | may be around the corner.
00:38:36.320 | - Yeah, it might be here already.
00:38:37.300 | Yeah, so I agree with that.
00:38:39.000 | And so going back to the origin of life discussion,
00:38:41.460 | I think that in synthetic biology right now,
00:38:44.500 | we have to be very careful about how we edit genomes
00:38:49.080 | and edit synthetic biology to do things.
00:38:51.480 | So that's where things might go wrong,
00:38:53.920 | in the same way as Twitter turning ourselves
00:38:56.320 | into kind of strange scale effects.
00:38:59.520 | I would love origin of life research
00:39:01.840 | or artificial life research to get to the point
00:39:04.640 | where we have those worries,
00:39:06.860 | because that's why I think we're just so far away from that.
00:39:10.200 | Right now, I think there are two really important angles.
00:39:13.160 | There is the origin of life people,
00:39:15.900 | researchers who are faithfully working on this
00:39:18.840 | and trying to make those molecules,
00:39:20.600 | the scam molecules I talk about.
00:39:22.800 | And then there are people on the creationist side
00:39:24.920 | who are saying, look, the fact you can't make these molecules
00:39:26.960 | and you can't make a cell means that evolution isn't true
00:39:29.960 | and all this other stuff.
00:39:30.800 | - Gotcha.
00:39:31.620 | - Yeah, and I find that really frustrating
00:39:34.200 | because actually the origin of life researchers
00:39:36.080 | are all working in good faith, right?
00:39:38.160 | And so what I'm trying to do is give origin of life research
00:39:41.920 | a little bit more of an open context.
00:39:46.920 | And one of the things I think is important,
00:39:49.440 | I really want to make a new life form in my lifetime.
00:39:52.340 | I really want to prove that life is a general phenomena,
00:39:56.060 | a bit like gravity in the universe,
00:39:58.160 | because I think that's gonna be really important
00:39:59.960 | for humanity's global psychological state,
00:40:04.920 | meaning going forward.
00:40:06.280 | - That's beautifully put.
00:40:09.120 | So one, it will help us understand ourselves,
00:40:12.480 | so that's useful for science.
00:40:14.240 | But two, it gives us a kind of hope,
00:40:17.660 | if not an awe at all the huge amounts
00:40:22.660 | of alien civilizations that are out there.
00:40:26.360 | If you can build life and understand
00:40:29.120 | just how easy it is to build life,
00:40:31.920 | then that's just as good, if not much better,
00:40:35.520 | than discovering life on another planet.
00:40:38.060 | I mean, it's cheaper, it's much cheaper and much easier
00:40:44.460 | and probably much more conclusive
00:40:47.460 | because once you're able to create life,
00:40:50.020 | like you said, it's a search problem,
00:40:52.280 | that there's probably a lot of different ways to do it.
00:40:55.680 | So once you find the first solution,
00:40:58.920 | you probably have all the right methodology
00:41:00.640 | for finding all kinds of other solutions.
00:41:02.680 | - Yeah, and wouldn't it be great if we could find a solution?
00:41:04.880 | I mean, it's probably a bit late for,
00:41:07.580 | I mean, I worry about climate change,
00:41:09.080 | but I'm not that worried about climate change.
00:41:10.680 | And I think one day you could think about,
00:41:13.440 | could we engineer a new type of life form
00:41:15.160 | that could basically, and I don't want to do this,
00:41:17.920 | I don't think we should do this necessarily,
00:41:19.460 | but it's a good thought experiment,
00:41:21.200 | that would perhaps take CO2 out of the atmosphere
00:41:23.620 | or an intermediate life form, so it's not quite alive,
00:41:26.080 | it's almost like an add-on,
00:41:28.200 | that we can, a time-dependent add-on
00:41:32.200 | you could give to, say, cyanobacteria in the ocean
00:41:35.100 | or to, maybe, to wheat and say, right,
00:41:37.500 | we're just gonna fix a bit more CO2
00:41:40.320 | and we're gonna work out how much we need to fix
00:41:42.060 | to basically save the climate
00:41:44.600 | and we're gonna use evolutionary principles
00:41:47.680 | to basically get there.
00:41:49.560 | What worries me is that biology has had a few billion years
00:41:51.960 | to find a solution for CO2 fixation.
00:41:54.360 | It hasn't really done,
00:41:56.000 | the solution isn't brilliant for our needs,
00:41:58.920 | but biology wasn't thinking about our needs,
00:42:00.800 | biology was thinking about biology's needs.
00:42:03.200 | But I think if we can do, as you say, make life in the lab,
00:42:06.720 | then suddenly we don't need to go to everywhere
00:42:09.720 | and conclusively prove it.
00:42:10.960 | I think we make life in the lab,
00:42:12.440 | we look at the extent of life in the solar system,
00:42:14.320 | how far did Earth life get?
00:42:16.480 | Probably we're all Martians,
00:42:17.760 | probably life got going on Mars,
00:42:19.280 | the chemistry on Mars, seeded Earth,
00:42:21.200 | that might have been a legitimate way
00:42:22.920 | to kind of truncate the search space.
00:42:25.560 | But in the outer solar system,
00:42:26.600 | we might have completely different life forms
00:42:28.080 | on Enceladus, on Europa, and Titan.
00:42:32.040 | And that would be a cool thing because--
00:42:33.400 | - Okay, wait a minute, wait a minute, wait a minute.
00:42:36.400 | Did you just say that you think, in terms of likelihood,
00:42:40.360 | life started on Mars, like statistically speaking,
00:42:43.600 | life started on Mars and seeded Earth?
00:42:45.840 | - It could be possible because life was,
00:42:48.360 | so Mars was habitable for the type of life
00:42:51.000 | that we have right now, type of chemistry before Earth.
00:42:53.920 | So it seems to me that Mars got searching, doing chemistry.
00:42:57.920 | - And started way before.
00:43:00.800 | - Yeah, and so they had a few more replicators
00:43:03.040 | and some other stuff.
00:43:04.240 | And if those replicators got ejected from Mars
00:43:06.320 | and landed on Earth, and Earth was like,
00:43:09.200 | I don't need to start again.
00:43:10.440 | - Right.
00:43:11.600 | - Thanks for that.
00:43:12.440 | And then it just carried on.
00:43:13.320 | So I'm not going, I think we will find evidence
00:43:16.520 | of life on Mars, either life we put there by mistake,
00:43:20.040 | contamination, or actually life,
00:43:22.200 | the earliest remnants of life.
00:43:24.520 | And that would be really exciting.
00:43:26.000 | There's a really good reason to go there.
00:43:28.100 | But I think it's more unlikely
00:43:29.600 | because of the gravitational situation in the solar system.
00:43:31.600 | If we find life in the outer solar system--
00:43:33.680 | - Titan and all that, that would be its own thing.
00:43:35.880 | - Exactly.
00:43:36.720 | - Wow, that would be so cool.
00:43:37.960 | If we go to Mars and we find life
00:43:39.960 | that looks a hell of a lot similar to Earth life,
00:43:43.000 | and then we'll go to Titan and all those weird moons
00:43:46.760 | with the ices and the volcanoes and all that kind of stuff.
00:43:49.040 | And then we find there something that looks,
00:43:52.480 | I don't know, way weirder.
00:43:54.880 | - Yeah.
00:43:55.720 | - Some other, some non-RNA type of situation.
00:43:57.600 | - But we might find almost life,
00:43:59.560 | like in the prebiotic chemical space.
00:44:01.800 | And I think there are four types of exoplanets
00:44:03.640 | we can go look for, right?
00:44:04.560 | 'Cause when JWST goes up and touch wood,
00:44:07.200 | it goes up and everything's fine.
00:44:09.120 | When we look at a star, well, we know statistically
00:44:12.000 | most stars have planets around them.
00:44:13.440 | What type of planet are they?
00:44:14.680 | Are they gonna be dead?
00:44:16.400 | Are they gonna be just prebiotic, origin of life coming?
00:44:20.600 | So are they gonna be technological?
00:44:22.560 | And so with intelligence on them, and will they have died?
00:44:27.240 | So from, had life on them, but not--
00:44:30.320 | - Those are the four states of--
00:44:31.400 | - They're four, and so suddenly,
00:44:33.200 | it's a bit like I want to classify planets
00:44:34.960 | the way we classify stars.
00:44:37.000 | - Yeah.
00:44:37.840 | - And I think that in terms of,
00:44:39.640 | rather than having this, oh,
00:44:41.200 | we've found methane as evidence of life.
00:44:43.520 | We've found oxygen as evidence of life.
00:44:45.280 | We found whatever molecule marker,
00:44:47.920 | and start to then frame things a little bit more.
00:44:51.600 | - As those four states.
00:44:53.000 | - Yeah.
00:44:53.840 | - Which, by the way, you're just saying four,
00:44:55.120 | but there could be a before the dead,
00:44:58.760 | there could be other states
00:45:00.120 | that we humans can't even conceive of.
00:45:01.720 | - Yeah, yeah, just prebiotic, almost alive,
00:45:04.600 | got the possibility to come alive.
00:45:06.280 | I think--
00:45:07.120 | - But there could be a post-technological.
00:45:09.280 | Whatever we think of as technology,
00:45:12.040 | there could be a pre-conscious,
00:45:16.880 | like where we all meld into one
00:45:19.720 | super intelligent conscious,
00:45:20.960 | or some weird thing that naturally happens over time.
00:45:23.920 | - Yeah, yeah, I mean, I think that all bets are off on that.
00:45:26.880 | - The metaverse.
00:45:28.120 | - Yeah, we are.
00:45:28.960 | - In that case, we join into a virtual metaverse,
00:45:32.480 | and start creating, which is kind of an interesting idea,
00:45:35.360 | almost arbitrary number of copies of each other,
00:45:38.480 | much more quickly, to mess with different ideas.
00:45:42.120 | I can create 1,000 copies of Lex,
00:45:45.400 | like every possible version of Lex,
00:45:48.080 | and then just see, and then I just have them
00:45:51.240 | argue with each other until, in the space of ideas,
00:45:54.840 | and see who wins out.
00:45:56.320 | How could that possibly go wrong?
00:45:58.040 | But anyway, there's, especially in this digital space,
00:46:01.640 | where you can start exploring with AIs mixed in,
00:46:04.520 | you can start engineering arbitrary intelligences,
00:46:07.480 | you can start playing in the space of ideas,
00:46:10.080 | which might move us into a world
00:46:11.560 | that looks very different than a biological world.
00:46:15.240 | Our current world, the technology,
00:46:17.600 | is still very much tied to our biology.
00:46:20.800 | We might move past that completely.
00:46:23.520 | - Oh, definitely, we definitely will.
00:46:25.960 | - 'Cause that could be another phase then.
00:46:27.320 | - Sure.
00:46:28.160 | - Because then you--
00:46:29.000 | - But I did say technological, so I think I agree with you.
00:46:30.800 | I think, so you can have, let's get this right.
00:46:32.520 | So, dead world, no prospect of alive.
00:46:36.520 | Prebiotic world, life emerging, living, and technological.
00:46:40.960 | And you probably, and the dead one,
00:46:42.200 | you probably won't be able to tell
00:46:43.080 | between the dead never being alive and the dead one,
00:46:46.040 | maybe you've got some artifacts, and maybe there's five.
00:46:48.040 | There's probably not more than five.
00:46:50.440 | And I think the technological one could allow,
00:46:53.080 | could have life on it still, but it might just have exceeded.
00:46:56.360 | 'Cause one way that life might survive on Earth
00:46:58.840 | is if we can work out how to deal with the coming,
00:47:01.840 | the real climate change that comes when the sun expands.
00:47:05.520 | It might be a way to survive that, you know?
00:47:07.720 | But yeah, I think that we need to start thinking
00:47:11.720 | statistically when it comes to looking for life
00:47:14.320 | in the universe.
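A minimal sketch of the kind of planet classification being described, in Python. The five states follow the list above; the observation flags ("technosignature", "selection_signature", and so on) are hypothetical placeholders, not real JWST data products or an actual proposed scheme.

```python
from enum import Enum, auto

class PlanetState(Enum):
    DEAD_NEVER_ALIVE = auto()   # no prospect of life
    PREBIOTIC = auto()          # chemistry underway, life emerging
    LIVING = auto()             # ongoing biology / selection
    TECHNOLOGICAL = auto()      # intelligence and technology present
    DEAD_AFTER_LIFE = auto()    # had life, now extinct; artifacts may remain

def classify(obs: dict) -> PlanetState:
    """Toy decision rule over made-up observation flags."""
    if obs.get("technosignature"):
        return PlanetState.TECHNOLOGICAL
    if obs.get("selection_signature"):
        return PlanetState.LIVING
    if obs.get("complex_prebiotic_chemistry"):
        return PlanetState.PREBIOTIC
    if obs.get("extinct_biosphere_remnants"):
        return PlanetState.DEAD_AFTER_LIFE
    return PlanetState.DEAD_NEVER_ALIVE

print(classify({"selection_signature": True}))  # PlanetState.LIVING
```

The hard part, of course, is deciding which observations justify each flag, which is exactly the point about framing things statistically rather than treating single molecular markers like methane or oxygen as evidence of life.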
00:47:15.240 | - Let me ask you then, sort of, statistically,
00:47:20.240 | how many alien civilizations are out there
00:47:23.880 | in those four phases that you're talking about?
00:47:26.480 | When you look up to the stars,
00:47:28.600 | and you're sipping on some wine,
00:47:30.800 | and talking to other people with British accents
00:47:35.320 | about something intelligent and intellectual, I'm sure,
00:47:38.040 | do you think there's a lot of alien civilizations
00:47:42.720 | looking back at us and wondering the same?
00:47:45.680 | - My romantic view of the universe
00:47:48.520 | is really taking loans from my logical self.
00:47:52.240 | So what I'm saying is I have no doubt, I have no idea.
00:47:55.480 | But having said that, there is no reason to suppose
00:48:00.480 | that life is as hard as we first thought it was.
00:48:04.800 | And so if we just take Earth as a marker,
00:48:08.760 | and if I think that life is a much more general phenomena
00:48:12.200 | than just our biology,
00:48:13.280 | then I think the universe is full of life.
00:48:16.680 | And the reason for the Fermi paradox
00:48:18.840 | is not that they're not out there,
00:48:22.200 | it's just that we can't interact with the other life forms
00:48:24.960 | because they're so different.
00:48:27.040 | And I'm not saying that they're necessarily
00:48:28.720 | like as depicted in Arrival or other, you know,
00:48:32.960 | I'm just saying that perhaps
00:48:35.280 | there are very few universal facts in the universe,
00:48:40.120 | and maybe that it's not,
00:48:44.040 | our technologies are quite divergent.
00:48:47.440 | And so I think that it's very hard to know
00:48:49.760 | how we're gonna interact with alien life.
00:48:51.640 | - You think there's a lot of kinds of life that's possible.
00:48:53.960 | I guess that was the intuition.
00:48:55.480 | - Yeah.
00:48:56.560 | - You provided the intuition that biology itself,
00:49:02.360 | or even the particular kind of biology
00:49:04.640 | that we have on Earth, is something that is just one sample
00:49:09.640 | of a nearly infinite number of other possible
00:49:15.520 | complex, autonomous, self-replicating types of things
00:49:19.280 | that could be possible.
00:49:20.640 | And so we're almost unable to see
00:49:24.280 | the alternative versions of us.
00:49:27.040 | I mean, we'll still be able to detect them,
00:49:31.840 | we'll still be able to interact with them,
00:49:33.840 | we'll still be able to, like,
00:49:35.680 | which, what exactly is lost in translation?
00:49:39.240 | Why can't we see them?
00:49:41.280 | Why can't we talk to them?
00:49:42.720 | 'Cause I too have a sense,
00:49:46.100 | you put it way more poetically,
00:49:49.760 | but it seems both statistically
00:49:54.760 | and sort of romantically,
00:49:58.600 | it feels like the universe should be teeming with life,
00:50:02.040 | like super intelligent life.
00:50:04.000 | And I just, I sit there and the Fermi paradox is very,
00:50:09.960 | it's felt very distinctly by me when I look up at the stars,
00:50:14.080 | because it's like, it's the same way I feel
00:50:17.400 | when I'm driving through New Jersey
00:50:19.200 | and listening to Bruce Springsteen and feel quite sad.
00:50:23.160 | It's like Louis C.K. talks about pulling off
00:50:25.000 | to the side of the road and just weeping a little bit.
00:50:27.680 | I'm almost like wondering like,
00:50:31.160 | hey, why aren't you talking to us?
00:50:33.800 | It feels lonely.
00:50:34.960 | It feels lonely 'cause it feels like they're out there.
00:50:38.240 | - I think that there are a number of answers to that.
00:50:40.120 | I think the Fermi paradox is perhaps based on
00:50:43.360 | the assumption that if life did emerge in the universe,
00:50:47.240 | it would be similar to our life
00:50:49.160 | and there's only one solution.
00:50:50.920 | And I think that what we've got to start to do
00:50:52.800 | is go out and look for selection detection,
00:50:56.120 | or an evolution detection,
00:50:58.320 | rather than life detection.
00:50:59.940 | And I think that once we start to do that,
00:51:03.440 | we might start to see really interesting things.
00:51:07.000 | And we haven't been doing this for very long.
00:51:09.200 | And we are living in an expanding universe
00:51:12.640 | and that makes the problem a little bit harder.
00:51:15.680 | - Everybody's always leaving.
00:51:17.200 | Distance wise.
00:51:19.840 | - I'm very optimistic that we will,
00:51:22.880 | well, I don't know, there are two movies that came out
00:51:25.800 | within six months of one another, "Ad Astra" and "Cosmos".
00:51:30.120 | "Ad Astra", the very expensive blockbuster
00:51:32.880 | with Brad Pitt in it and saying there is no life
00:51:35.880 | and it's all, we've got a life on earth
00:51:37.800 | has more pressures than "Cosmos",
00:51:39.080 | which is a UK production, which basically
00:51:41.360 | aliens came and visited earth one day
00:51:42.920 | and they were discovered in the UK.
00:51:45.160 | It was quite, it's a fun film.
00:51:47.440 | But I really loved those two films.
00:51:50.440 | And at the same time, those films,
00:51:53.560 | at the time those films are coming out,
00:51:54.760 | I was working on a paper, a life detection paper
00:51:58.560 | and I found it was so hard to publish this paper.
00:52:02.080 | And I was almost as depressed,
00:52:03.720 | I got so depressed trying to get this science out there
00:52:06.560 | that I felt the depression of the film "Ad Astra",
00:52:11.360 | like, there's no life elsewhere in the universe.
00:52:14.760 | And, but I'm incredibly optimistic
00:52:17.440 | that I think we will find life in the universe,
00:52:19.360 | firm evidence of life.
00:52:21.080 | And it will have to start on earth,
00:52:22.480 | making life on earth and surprising us.
00:52:24.400 | We have to surprise ourselves
00:52:25.680 | and make non-biological life on earth.
00:52:28.440 | And then people say, well, you made this life on earth,
00:52:31.400 | therefore it's, you're part of the causal chain of that.
00:52:34.400 | And that might be true, but if I can show
00:52:37.080 | how I'm able to do it with a very little cheating
00:52:40.800 | or very little information inputs,
00:52:42.360 | just creating like a model planet, some description
00:52:45.920 | and watching it, watching life emerge,
00:52:48.520 | then I think that we will be able to persuade
00:52:51.160 | even the hardest critic that it's possible.
00:52:55.080 | Now, with regards to the Fermi paradox,
00:52:57.680 | I think that we might crush that with the JWST.
00:53:01.400 | It's basically, if I recall correctly,
00:53:03.120 | the mirror is about 10 times the size of the Hubble,
00:53:06.440 | that we're gonna be able to do spectroscopy,
00:53:09.040 | look at colors of exoplanets, I think, not brilliantly,
00:53:12.680 | but we'll be able to start to classify them.
00:53:15.280 | And we'll start to get a real feel
00:53:18.840 | for what's going on in the universe on these exoplanets.
00:53:21.400 | 'Cause it's only in the last few decades, I think,
00:53:25.200 | maybe even the last decade, that we
00:53:27.600 | came to recognize that exoplanets are even common.
00:53:33.840 | And I think that that gives us a lot of optimism
00:53:36.160 | that life is gonna be out there.
00:53:38.640 | But I think we have to start framing,
00:53:40.560 | we have to start preparing the fact
00:53:44.040 | that biology is only one solution.
00:53:46.920 | I can tell you with confidence that biology on Earth
00:53:50.640 | does not exist anywhere else in the universe.
00:53:52.620 | We are absolutely unique.
00:53:54.680 | - Well, okay, I love the confidence,
00:53:56.840 | but where does that confidence come from?
00:54:01.840 | You know, chemistry,
00:54:03.540 | like how many options does chemistry really have?
00:54:07.640 | - Many, that's the point.
00:54:08.880 | And the thing is, this is where
00:54:10.120 | the origin of life scam comes in,
00:54:11.840 | is that people don't quite count,
00:54:15.120 | they don't count the numbers.
00:54:16.240 | So if biology, as you find on Earth, is common everywhere,
00:54:19.960 | then there's something really weird going on.
00:54:21.600 | It's basically written in the quantum mechanics,
00:54:23.920 | there's some kind of,
00:54:24.880 | these bonds must form over these bonds,
00:54:26.680 | and this catalyst must form over this catalyst,
00:54:28.280 | when they're all quite equal.
00:54:30.160 | Life is contingent.
00:54:31.840 | The origin of life on Earth
00:54:33.760 | was contingent upon the chemistry available
00:54:35.680 | at the origin of life on Earth.
00:54:38.620 | So that means if we want to find other Earth-like worlds,
00:54:43.360 | we look for the same kind of rocky world,
00:54:45.120 | we might look in the same zone as Earth,
00:54:47.840 | and we might expect, reasonably,
00:54:50.320 | to find biological-like stuff going on.
00:54:53.840 | That would be a reasonable hypothesis,
00:54:55.540 | but it won't be the same, it can't be.
00:54:57.360 | It's like saying, I don't believe in magic,
00:55:00.480 | that's why I'm sure.
00:55:02.520 | I just don't believe in magic,
00:55:03.680 | I believe in statistics, and I can do experiments.
00:55:06.480 | And so I won't get the same,
00:55:08.440 | exactly the same sequence of events,
00:55:10.040 | I'll get something different.
00:55:11.460 | And so there is TikTok elsewhere in the universe,
00:55:14.400 | but it's not the same as our TikTok, right?
00:55:17.160 | That's what I mean.
00:55:18.000 | - Which aspect of it is not the same?
00:55:20.160 | - Well, I just think,
00:55:21.160 | so what is TikTok?
00:55:23.520 | TikTok is a social media where people upload videos,
00:55:26.840 | right, of silly videos.
00:55:28.120 | So I guess there might be--
00:55:29.440 | - Well, there's humor, there's attention,
00:55:31.000 | there's ability to process,
00:55:33.100 | there's ability for intelligent organisms
00:55:35.480 | to collaborate on ideas,
00:55:37.320 | and find humor in ideas, and play with those ideas,
00:55:39.760 | make them viral, memes.
00:55:43.760 | Humor seems to be kind of fundamental to human experience.
00:55:46.600 | - And I think that that's a really interesting question
00:55:48.800 | we can ask, is humor a fundamental thing in the universe?
00:55:52.360 | I think maybe it will be, right?
00:55:53.640 | In terms of, you think about in a game theoretic sense,
00:55:56.800 | humor, the emergence of humor serves a role
00:56:00.440 | in our game engine.
00:56:01.640 | And so if selection is fundamental in the universe,
00:56:05.000 | so is humor.
00:56:05.840 | - Well, I actually don't know exactly
00:56:10.320 | what role humor serves.
00:56:12.560 | Maybe it's like, from a chemical perspective,
00:56:15.360 | it's like a catalyst for,
00:56:19.360 | I guess it serves several purposes.
00:56:21.520 | One is the catalyst for spreading ideas on the internet,
00:56:23.760 | that's modern humor.
00:56:25.520 | But humor is also a good way to deal
00:56:28.260 | with the difficulty of life.
00:56:30.840 | It's a kind of valve, release valve for suffering.
00:56:34.580 | Like, throughout human history, life has been really hard.
00:56:39.120 | And for the people that I've known in my life
00:56:41.240 | who've lived through some really difficult things,
00:56:44.800 | humor is part of how they deal with that.
00:56:47.240 | - Yeah.
00:56:48.080 | - It's usually dark humor.
00:56:49.360 | But yeah, it's interesting.
00:56:51.520 | I don't know exactly sort of what's the more
00:56:55.120 | mathematically general way to formulate
00:56:57.540 | what the hell is humor.
00:56:58.880 | What role does it serve?
00:57:01.040 | But I still, we're kind of joking here,
00:57:03.800 | but it's a counterintuitive idea to me
00:57:09.800 | to think that life elsewhere in the universe
00:57:14.640 | is very different than life on Earth.
00:57:17.440 | And also, like, all of each instantiation of life
00:57:22.440 | is likely very different from each other.
00:57:26.040 | - Yeah.
00:57:26.880 | - Like, maybe there's a few clusters of similar-like life,
00:57:30.520 | but it's much more likely is what you're saying.
00:57:34.200 | To me, it's a kind of novel thought.
00:57:36.220 | I'm not sure what to do with it.
00:57:37.560 | But you're saying that there's,
00:57:39.520 | it's more common to be a weird outcast
00:57:43.280 | in the full spectrum of life
00:57:44.880 | than it is to be in some usual cluster.
00:57:48.160 | So every instantiation of a kind of chemistry
00:57:51.540 | that results in complexity that's autonomous
00:57:53.480 | and self-replicating, however the hell you define life,
00:57:56.920 | that is going to be very different every time.
00:57:59.720 | I don't know.
00:58:00.560 | It feels like selection is a fundamental
00:58:05.000 | kind of directed force in the universe.
00:58:08.000 | Won't selection result in a few pockets
00:58:11.560 | of interesting complexities?
00:58:13.480 | I mean, yeah.
00:58:14.960 | If we ran Earth over again, over and over and over,
00:58:20.920 | you're saying it's going to come up
00:58:22.280 | with there's not gonna be elephants every time?
00:58:26.000 | - Yeah, I don't think so.
00:58:27.480 | I think that there will be similarities.
00:58:30.520 | And I think we don't know enough
00:58:32.260 | about how selection globally works.
00:58:36.500 | But it might be that the emergence of elephants
00:58:42.060 | was wired into the history of Earth in some way,
00:58:44.400 | like the gravitational force, how evolution was going,
00:58:47.920 | Cambrian explosions, blah, blah, blah,
00:58:49.360 | the emergence of mammals.
00:58:50.760 | But I just don't know enough about the contingency,
00:58:53.480 | right, the variability.
00:58:54.920 | All I do know is you count the number of bits
00:58:57.240 | of information required to make an element,
00:58:59.280 | sorry, an elephant, and think about the causal chain
00:59:03.580 | that provide the lineage of elephants
00:59:05.700 | going all the way back to Luca.
00:59:07.660 | There's a huge scope for divergence.
00:59:10.340 | - Yeah, but just like you said,
00:59:12.580 | with chemistry and selection,
00:59:15.020 | the things that result in self-replicating chemistry
00:59:20.420 | and self-replicating organisms,
00:59:26.540 | those are extremely unlikely, as you're saying.
00:59:29.340 | But once they're successful, they multiply.
00:59:33.080 | So it might be a tiny subset of all things
00:59:37.640 | that are possible in the universe, chemically speaking,
00:59:40.080 | it might be a very tiny subset
00:59:41.800 | is actually successful at creating elephants.
00:59:44.600 | Or elephant-like slash human-like creatures.
00:59:48.080 | - Well, there's two different questions here.
00:59:48.920 | So the first one, if we were to reset Earth
00:59:51.440 | and to start again--
00:59:52.720 | - At the different phases, sorry to keep interrupting.
00:59:54.920 | - Yeah, no, if we restart Earth and start again,
00:59:56.900 | say we could go back to the beginning and do the experiment
00:59:59.260 | or have a number of Earths, how similar would biology be?
01:00:02.620 | I would say that there would be broad similarities.
01:00:06.440 | But the emergence of mammals is not a given
01:00:09.140 | unless we're gonna throw an asteroid at each planet
01:00:11.580 | each time and try and faithfully reproduce what happened.
01:00:15.740 | Then there's the other thing about
01:00:17.960 | when you go to another Earth-like planet elsewhere,
01:00:20.420 | maybe there's a different ratio, particular elements,
01:00:22.900 | maybe there's the bombardment at the beginning of the planet
01:00:27.260 | was quicker or longer than Earth.
01:00:30.420 | And I just don't have enough information there.
01:00:32.780 | What I do know is that the complexity
01:00:37.140 | of the story of life on Earth
01:00:39.540 | gives us lots of scope for variation.
01:00:42.300 | And I just don't think it's a reasonable
01:00:44.540 | mathematical assumption to think that life on Earth
01:00:48.980 | that happened again would be the same as what we have now.
01:00:51.560 | - Okay, but you've also extended that to say that we might,
01:00:55.700 | as an explanation for the Fermi paradox,
01:00:58.420 | that that means we're not able to interact with them.
01:01:02.700 | Or that's an explanation for why we haven't at scale
01:01:06.680 | heard from aliens is--
01:01:08.020 | - Well, right now-- - They're different than us.
01:01:10.500 | - We've only been looking for, say, 70, 80 years.
01:01:13.480 | So I think that the reason we have not found aliens yet
01:01:17.440 | is that we haven't worked out what life is.
01:01:19.980 | - No, but the aliens have worked that out, surely.
01:01:23.940 | I mean, statistically speaking,
01:01:26.940 | there must be a large number of aliens
01:01:30.520 | that are way ahead of us on this whole life question.
01:01:33.500 | Unless there's something about this stage
01:01:37.320 | of intellectual evolution that often quickly results
01:01:40.900 | in nuclear war and destroys itself.
01:01:42.680 | There's something in this process that eventually,
01:01:49.680 | I don't know, crystallizes the complexity
01:01:52.440 | and it stops, either dies or stops developing.
01:01:55.800 | But most likely, they already figured it out.
01:01:58.120 | And why aren't they contacting us?
01:01:59.840 | There's some grad student somewhere
01:02:03.880 | wants to study a new green planet.
01:02:06.480 | - Maybe they have.
01:02:07.800 | I mean, maybe, I mean, I don't have a coherent answer
01:02:11.080 | to your question other than to say
01:02:12.880 | that if there are other aliens out there
01:02:15.440 | and they're far more advanced,
01:02:17.540 | they might be in contact with each other
01:02:19.840 | and they might also, we might be at a point
01:02:22.400 | where what I'm saying quite critically
01:02:24.560 | is it takes two to talk, right?
01:02:26.800 | So the aliens might be there,
01:02:28.000 | but if we don't have the ability to recognize them
01:02:31.160 | and talk to them, then the aliens aren't going
01:02:33.640 | to want to talk to us.
01:02:36.040 | And I think that's a critical point
01:02:37.520 | that probably if that's a filter,
01:02:42.160 | there needs to be an ability for one
01:02:44.400 | to communicate with the other
01:02:45.720 | and we need to know what life is
01:02:47.520 | before we do that.
01:02:48.360 | So we haven't qualified to even join their club
01:02:50.900 | to have a talk.
01:02:51.840 | - Well, I think they still want to teach us how to talk.
01:02:54.440 | Right?
01:02:55.280 | But my worry is that,
01:02:57.660 | or I think they would want to teach us how to talk
01:03:00.860 | like you do when you meet it.
01:03:02.320 | Like when you even meet, I was going to say child,
01:03:07.940 | but that's a human species.
01:03:09.700 | I mean, like an ant.
01:03:11.380 | You want to try to communicate with them
01:03:15.660 | through whatever devices you can,
01:03:17.400 | given what an ant is like.
01:03:19.360 | I just, I worry mostly about that humans
01:03:21.880 | are just too close minded or don't have the right tools.
01:03:25.040 | - No, I'm going to push back on this quite significantly.
01:03:27.200 | I would say, because we don't understand what life is
01:03:30.640 | and because we don't understand how life emerged
01:03:33.640 | in the universe, we don't understand the physics
01:03:36.080 | that gave rise to life yet.
01:03:37.400 | And that means our description, fundamental description,
01:03:40.640 | I'm way out of my pay grade, even further out.
01:03:42.960 | But I'll say it anyway, because I think it's a fun--
01:03:44.800 | - You don't get paid much anyway, as you said earlier.
01:03:47.460 | (laughing)
01:03:49.640 | - So I would say that we,
01:03:50.940 | because we don't understand the universe yet,
01:03:52.900 | we do not understand how the universe spat out life.
01:03:56.020 | And we don't know what life is.
01:03:57.620 | And I think that until we understand that,
01:03:59.740 | it is going to limit our ability to even,
01:04:01.940 | we don't qualify to talk to the aliens.
01:04:05.020 | So I'm going to say that they might be there,
01:04:08.440 | but we just, I'm not going to say that I believe
01:04:10.540 | in interdimensional aliens being present in this room.
01:04:12.460 | - Yeah, but I think you're just being self-critical,
01:04:14.800 | like we don't qualify.
01:04:16.400 | I think the fact that we don't qualify qualifies us.
01:04:19.320 | We're interesting in our innocence.
01:04:22.040 | - No, I'm saying that because we don't understand
01:04:25.680 | causal chains and the way that information
01:04:27.720 | is propagated in the universe,
01:04:29.320 | and we don't understand what replication is yet,
01:04:31.160 | and we don't understand how life emerged,
01:04:35.120 | I think that we would not recognize aliens.
01:04:38.720 | And if someone doesn't recognize you,
01:04:42.140 | you wouldn't go and talk to it.
01:04:44.300 | You don't go and talk to ants.
01:04:46.700 | You don't go and talk to birds,
01:04:48.220 | or maybe some birds you do, right?
01:04:49.780 | 'Cause you can, there's just enough cognition.
01:04:52.140 | So I'm saying because we don't have enough,
01:04:55.100 | our cognitive abilities are not yet where they need to be,
01:04:58.740 | we probably aren't even communicating with them.
01:05:00.180 | - So you don't agree with the dating strategy
01:05:02.900 | of playing hard to get?
01:05:04.660 | 'Cause us humans, that seems to attract us.
01:05:07.460 | - Within a species, that's fine,
01:05:09.240 | but I think we don't have the abstraction.
01:05:11.660 | No, actually, I think in this talk, in this conversation,
01:05:15.600 | you've helped me crystallize something
01:05:16.920 | that I think has been troubling me for a long time
01:05:19.220 | with the Fermi paradox.
01:05:20.160 | I'm pretty sure that a reasonable avenue
01:05:23.840 | is to say that you would not go and talk to your cat
01:05:28.480 | about calculus, right?
01:05:31.160 | - But I would still pet it.
01:05:32.280 | - Sure, but I'm not talking about petting a cat.
01:05:34.280 | The analogy is that the aliens are not going to talk to us
01:05:37.160 | because we, and I'm using calculus
01:05:39.320 | as an analogy for abstraction,
01:05:41.400 | because we lack the layer, the fundamental layer
01:05:45.360 | of understanding what life is
01:05:47.400 | and what the universe is in our reality
01:05:50.280 | that it would be so counterproductive
01:05:52.840 | interacting with intelligent alien species
01:05:55.280 | that it would cause more angst for human race.
01:05:58.840 | - They don't care, okay.
01:06:01.480 | They gotta be self-interested,
01:06:03.040 | so they'll probably, they more care about
01:06:05.240 | is it interesting for them.
01:06:06.880 | Maybe they, I mean, surely there's a way
01:06:10.680 | to pet the cat in this analogy,
01:06:15.680 | because even if we lack complete understanding,
01:06:20.840 | it must be a very frustrating experience
01:06:23.160 | for other kinds of intelligence to communicate with us,
01:06:26.760 | still there must be a way to interact with us,
01:06:30.360 | like perturb the system in interesting ways
01:06:34.280 | to see what these creatures do.
01:06:35.820 | We might actually find the answer,
01:06:37.440 | I mean, again, out of my pay grade,
01:06:39.280 | in a simulation of Earth,
01:06:42.240 | or say, let's say a simulation
01:06:43.560 | where we allow an intelligent AI to emerge, right,
01:06:47.160 | and that AI, we then give it,
01:06:50.320 | the objective is to be curious,
01:06:53.320 | interact with other intelligence in its universe,
01:06:56.200 | and then we might find the parameters required
01:07:00.920 | for that AI to walk wherever,
01:07:02.440 | and I think you'll find
01:07:03.760 | if the AI will not talk to other AIs
01:07:06.180 | that don't share the ability to abstract
01:07:09.380 | at the level of the AI,
01:07:10.420 | because it's just a cat,
01:07:11.980 | and are you gonna travel 20 light years
01:07:14.220 | to go and pet a cat?
01:07:15.300 | - So not because of the inability to do so,
01:07:19.180 | but because of like boredom,
01:07:20.820 | is it's more interested,
01:07:22.220 | it will start talking to,
01:07:23.900 | it will spend most,
01:07:25.180 | it will spend a majority of its time
01:07:28.620 | talking to other AI systems
01:07:30.260 | that can at least somewhat understand it,
01:07:32.460 | it's much more fun.
01:07:33.300 | - A bit like, do we know that plants are conscious?
01:07:34.960 | Well, plants aren't conscious in the way we typically think,
01:07:36.960 | but we don't talk to them.
01:07:37.800 | They could be, right?
01:07:38.640 | - Yeah, but there's a lot of people on Earth
01:07:40.080 | who like gardening.
01:07:41.120 | There's always going to be a weird--
01:07:42.440 | - They're not talking, they're just gardening.
01:07:44.160 | - Okay, well, you're not romantic enough
01:07:46.240 | to see gardening as a way of communication
01:07:48.120 | between humans and plants.
01:07:48.960 | - Oh, okay, you've got me there.
01:07:50.320 | - But there is ways,
01:07:52.120 | there's always going to be the people who are curious,
01:07:55.000 | Jane Goodall, who lives with the chimps, right?
01:07:58.560 | There's always going to be curious, intelligent species
01:08:00.900 | that visit the weird Earth planet and try to interact.
01:08:05.900 | I mean, it's, yeah, I think it's a super cool idea
01:08:10.360 | that you're expressing.
01:08:11.200 | I just kind of have a sense,
01:08:13.020 | maybe it's a hope that there's always going to be
01:08:16.160 | a desire to interact even with those
01:08:18.180 | that can't possibly understand
01:08:20.880 | the depth of what you understand.
01:08:23.000 | - So I'm with you,
01:08:24.240 | so I want to be as positive as you
01:08:25.700 | that aliens do exist and we will interact with them.
01:08:29.380 | What I'm trying to do is to give you
01:08:30.840 | a reasonable hypothesis why we haven't yet.
01:08:35.080 | And also something to strive for, to be able to do that.
01:08:38.880 | I mean, there is the other view
01:08:41.760 | that the universe is just too big
01:08:43.800 | and life is just too rare.
01:08:46.100 | But I want to come up with an alternative explanation,
01:08:49.320 | which I think is reasonable
01:08:51.120 | and not being philosophically and scientifically thought out
01:08:54.680 | which is this, if you can't actually communicate
01:08:57.880 | with the object, the thing competently,
01:09:01.740 | you don't even know it's there,
01:09:03.520 | then there's no point yet.
01:09:05.200 | - See, I disagree with that,
01:09:06.440 | but I'm totally aligned with your hopeful vision,
01:09:08.480 | which is like, we need to understand the origin of life.
01:09:11.520 | It will help us engineer life,
01:09:13.480 | it will help us engineer intelligent life
01:09:15.120 | through perhaps on a computer side through simulation
01:09:18.560 | and explore all the ways that life emerges.
01:09:20.960 | And that will allow us to,
01:09:22.400 | I think the fundamental reason
01:09:24.120 | we don't see overwhelming amounts of life
01:09:27.160 | is I actually believe aliens,
01:09:31.160 | of course, these are all just kind of open-minded beliefs.
01:09:35.760 | It's difficult to know for sure about any of this,
01:09:37.960 | but I think there's a lot of alien civilizations
01:09:40.920 | which are actively communicating with us.
01:09:43.000 | And we're too dumb.
01:09:45.480 | We don't have the right tools to see it.
01:09:47.760 | - That's what I'm saying.
01:09:48.600 | - No, but maybe I misinterpreted you,
01:09:51.880 | but I interpreted you to say they kind of tried a few times
01:09:56.240 | and they're like, oh God, humans are too dumb.
01:09:57.560 | - No, no, no, what I'm saying is we,
01:09:59.160 | so this goes two ways.
01:10:00.680 | Yeah, I agree with you.
01:10:01.800 | There could be information out there,
01:10:03.880 | but just put in such a way
01:10:05.000 | that we just don't understand it yet.
01:10:07.320 | So sorry if I didn't make that clear.
01:10:09.280 | I mean, it's not just, I don't think we,
01:10:11.080 | I think we qualify as soon as we can decode their signal.
01:10:15.600 | - Right, so when you say qualify, got it, got it.
01:10:17.880 | So you mean we're just not smart enough,
01:10:19.720 | the word qualify was throwing me off.
01:10:21.320 | So we're not smart enough to do,
01:10:22.520 | it's like, but we just need to get smarter.
01:10:25.240 | And there's a lot of people who believe,
01:10:27.440 | let me get your opinion on this, about UFO sightings.
01:10:31.640 | So sightings of weird phenomena that,
01:10:36.640 | what does UFO mean?
01:10:41.680 | It means it's a flying object
01:10:45.040 | and it's not identified clearly at the time of sighting.
01:10:50.040 | That's what UFO means.
01:10:51.780 | So it could be a physics phenomena,
01:10:53.720 | it could be ball lightning, it could be all kinds
01:10:55.280 | of fascinating, I was always fascinated
01:10:57.400 | with ball lightning as a,
01:10:58.620 | like the fact that there could be physical phenomena
01:11:03.160 | in this world that are observable by the human eye,
01:11:06.280 | of course, all physical phenomena generally are fascinating
01:11:09.640 | that are, that really smart people can't explain.
01:11:14.160 | I love that, 'cause it's like, wait a minute,
01:11:16.580 | especially if you can replicate it,
01:11:18.560 | it's like, wait a minute, how does this happen?
01:11:20.160 | That's like the precursor to giant discoveries
01:11:23.600 | in chemistry and biology and physics and so on.
01:11:26.000 | But it sucks when those events are super rare, right?
01:11:29.560 | Physical, like ball lightning.
01:11:31.360 | So that's out there.
01:11:34.200 | And then, of course, that phenomena could have
01:11:36.960 | other interpretations that don't have to do
01:11:39.220 | with the physics, the chemistry, the biology of Earth.
01:11:42.200 | It could have to do with more
01:11:44.520 | extraterrestrial explanations that, in large part,
01:11:49.200 | thanks to Hollywood and movies and all those kinds
01:11:51.080 | of things, captivates the imaginations
01:11:53.120 | of millions of people.
01:11:54.320 | But just because it's science fiction
01:11:58.640 | that captivates the imagination of people
01:12:00.480 | doesn't mean that some of those sightings,
01:12:03.520 | all it takes is one.
01:12:05.340 | One of those sightings is actually a sign
01:12:09.160 | that it's extraterrestrial intelligence,
01:12:11.960 | that it's an object that's not of this particular world.
01:12:16.960 | Do you think there's a chance that that's the case?
01:12:18.960 | What do you make, especially the pilot sightings,
01:12:21.200 | what do you make of those?
01:12:23.340 | - So I agree there's a chance, there's always a chance.
01:12:26.820 | Any good scientist would have to,
01:12:29.420 | or observationist would have to, you know,
01:12:32.220 | I want to see if aliens exist, come to Earth.
01:12:36.140 | What I know about the universe is I think it's unlikely
01:12:39.460 | right now that there are aliens visiting us,
01:12:42.180 | but not impossible.
01:12:44.460 | I think the releases, the dramatization
01:12:48.540 | that's been happening politically,
01:12:49.820 | saying we're gonna release all this information,
01:12:51.580 | this classified information,
01:12:53.580 | I was kind of disappointed because it was just
01:12:57.260 | very poor material.
01:13:01.980 | And right now, the ability to capture high-resolution video,
01:13:06.980 | everybody is carrying around with them
01:13:10.460 | an incredible video device now,
01:13:12.420 | and we haven't got more compelling data.
01:13:18.220 | And so we're just seeing grainy pictures,
01:13:22.660 | a lot of hearsay, instrument kind of malfunctions
01:13:26.420 | and whatnot, and so I think on balance,
01:13:29.140 | I think it's extremely unlikely,
01:13:30.620 | but I think something really interesting is happening.
01:13:33.320 | And also during the pandemic, right?
01:13:36.380 | We've all been locked down, we all want to have,
01:13:38.220 | we want to, our imaginations are running riot,
01:13:42.080 | and I think that, I don't think that the information
01:13:45.980 | out there has convinced me there are anything interesting
01:13:48.340 | on the UFO side, but what it has made me very interested
01:13:51.340 | about is how humanity is opening up its mind
01:13:54.820 | to ponder aliens and the mystery of our universe.
01:13:59.820 | And so I don't want to dissuade people
01:14:02.700 | from having those thoughts and say you're stupid
01:14:04.440 | and look at that, it's clearly incorrect.
01:14:06.140 | That's not right, that's not fair.
01:14:07.820 | What I would say is that I lack sufficient data,
01:14:10.900 | replicated observations to make me go,
01:14:14.100 | oh, I'm gonna take this seriously,
01:14:16.620 | but I'm really interested by the fact
01:14:18.300 | that there is this great deal of interest,
01:14:21.640 | and I think that it drives me to maybe want
01:14:25.420 | to make an artificial life form even more
01:14:28.620 | and to help NASA and the Air Force
01:14:31.540 | and whoever go and look for things even more
01:14:33.300 | because I think humanity wants to know what's out there.
01:14:36.180 | There's this yearning, isn't there?
01:14:38.220 | - Yeah, but I, see, I almost,
01:14:43.820 | depending on the day, I sometimes agree with you,
01:14:45.420 | but, with the thing you just said,
01:14:47.820 | but one of the disappointing things to me about the sightings
01:14:52.820 | I still hold the belief that a non-zero number of them
01:15:00.020 | is an indication of something very interesting.
01:15:04.900 | So I don't side with the people who say
01:15:09.780 | everything can be explained
01:15:10.820 | with like sensor artifacts kind of thing.
01:15:13.300 | - Yeah, I'd agree with you.
01:15:14.260 | I didn't say that either.
01:15:15.140 | I would say I just don't have enough data.
01:15:16.980 | - Right, but the thing I wanna push back on
01:15:19.620 | is the statement that everybody
01:15:21.300 | has a high-definition camera.
01:15:23.420 | One of the disappointing things to me
01:15:24.980 | about the report that the government released,
01:15:26.980 | but in general, just having worked with government,
01:15:29.380 | having worked with people all over,
01:15:32.820 | is how incompetent we are.
01:15:36.220 | Like, if you look at the response to the pandemic,
01:15:38.820 | how incompetent we are in the face of great challenges
01:15:43.260 | without great leadership,
01:15:46.060 | how incompetent we are in the face
01:15:47.900 | of the great mysteries before us without great leadership.
01:15:51.660 | And I just think it's actually,
01:15:54.020 | the fact that there's a lot of high-definition cameras
01:15:56.020 | is not enough to capture the full richness of weird,
01:16:01.020 | of the mysterious phenomena out there
01:16:03.900 | of which extraterrestrial intelligence
01:16:06.380 | visiting Earth could be one.
01:16:08.060 | I don't think we have,
01:16:09.420 | I don't think everybody having a smartphone
01:16:12.060 | in their pocket is enough.
01:16:14.900 | I think that allows for TikTok videos.
01:16:17.460 | I don't think it allows for the capture
01:16:19.900 | of even interesting, relatively rare human events.
01:16:23.820 | That's not that common.
01:16:25.420 | It's rare to be in the right moment in the right time
01:16:28.580 | to be able to capture the thing.
01:16:30.580 | - I agree, I agree.
01:16:31.500 | Let me rephrase what I think on this.
01:16:33.940 | I haven't seen enough information.
01:16:35.780 | I haven't really actively sought it out.
01:16:37.420 | I must admit.
01:16:38.860 | But I'm with you in that I love the idea
01:16:41.420 | of anomaly detection in chemistry in particular.
01:16:44.100 | I want to make anomalies, sorry,
01:16:45.820 | or not necessarily make anomalies.
01:16:47.740 | I want to understand an anomaly.
01:16:50.780 | Let me give you two from chemistry,
01:16:52.540 | which are really quite interesting.
01:16:55.660 | Phlogiston, going way back,
01:16:59.300 | where people said there's this thing called Phlogiston.
01:17:02.060 | And for ages, the alchemists got really this kind of,
01:17:05.980 | that fire is a thing.
01:17:07.900 | That's one.
01:17:09.940 | And then we determined that Phlogiston
01:17:11.180 | wasn't what we thought it was.
01:17:13.380 | Let's go to physics, the ether.
01:17:15.260 | The ether's a hard one,
01:17:16.420 | because I think actually the ether might exist.
01:17:18.460 | And I'll tell you what I think the ether is later.
01:17:20.980 | - Can you explain ether?
01:17:25.060 | - So as the vacuum,
01:17:26.100 | so it's the light traveling through the ether in the vacuum.
01:17:28.620 | There is some thing that we call the ether
01:17:30.700 | that basically mediates the movement of light, say.
01:17:34.380 | And I think that,
01:17:36.140 | and then the other one is cold fusion,
01:17:38.220 | which is more of,
01:17:39.060 | so a few years ago,
01:17:40.340 | that people observed it
01:17:43.100 | when they did some electrochemistry,
01:17:44.300 | when they were splitting water into hydrogen and oxygen,
01:17:48.260 | that you got more energy out than you put in.
01:17:50.660 | And people got excited,
01:17:51.900 | and they thought that this was a nuclear reaction.
01:17:54.660 | And in the end, it was kind of discredited,
01:17:57.940 | because you didn't detect neutrons and all this stuff.
01:17:59.900 | But I'm pretty sure,
01:18:02.980 | I'm a chemist,
01:18:03.820 | I'm telling you this on your podcast,
01:18:05.380 | but why not?
01:18:06.220 | I'm pretty sure there's interesting
01:18:07.380 | electrochemical phenomena
01:18:08.380 | that's not completely bottomed out yet,
01:18:09.900 | that there is something there.
01:18:11.460 | However, we lack the technology
01:18:13.580 | and the experimental design.
01:18:15.980 | So what I'm saying,
01:18:16.980 | in your response about aliens,
01:18:18.260 | is we lack the experimental design
01:18:20.420 | to really capture these anomalies.
01:18:22.500 | And we are encircling the planet
01:18:25.740 | with many more detection systems.
01:18:27.380 | We've got satellites everywhere.
01:18:28.980 | So there is,
01:18:29.900 | I do hope that we are gonna discover more anomalies.
01:18:32.300 | And remember,
01:18:33.460 | that the solar system isn't just static in space,
01:18:37.020 | it's moving through the universe.
01:18:39.180 | So there's just more and more chance.
01:18:40.740 | I'm not sure what's going on with Avi Loeb,
01:18:41.940 | he's generating all sorts of, kind of,
01:18:44.700 | a cult, I would say,
01:18:48.100 | with this.
01:18:49.020 | But I'm not against him.
01:18:50.500 | I think there is a finite chance,
01:18:52.060 | if there are aliens in the universe,
01:18:53.180 | that we're gonna happen upon them.
01:18:54.780 | Because we're moving through the universe.
01:18:56.260 | - What's the nature of the following
01:18:58.460 | that Avi Loeb has?
01:18:59.780 | - He's doubling down more and more and more,
01:19:01.260 | and say there are aliens,
01:19:02.260 | interdimensional aliens and everything else, right?
01:19:04.500 | He's gone from space junk accelerating out of,
01:19:06.900 | to interdimensional stuff,
01:19:08.700 | in a very short space of time.
01:19:10.580 | (laughs)
01:19:11.420 | - I see.
01:19:12.260 | - He's obviously bored.
01:19:13.100 | (laughs)
01:19:14.620 | - Yeah.
01:19:15.460 | - Or he wants to tap into the psyche and understand.
01:19:18.780 | And he's playfully kind of trying to interact
01:19:21.180 | with society and his peers to say,
01:19:22.940 | stop saying it's not possible.
01:19:26.260 | And which I agree with,
01:19:27.380 | we shouldn't do that.
01:19:28.580 | But we should frame it statistically,
01:19:30.660 | in the same way we should frame everything
01:19:32.820 | as good scientists, statistically.
01:19:34.920 | - Yeah.
01:19:36.860 | Good scientists... recently,
01:19:41.820 | the idea of good scientists is something
01:19:43.560 | I take quite skeptically.
01:19:47.740 | I've been listening to a lot of scientists
01:19:49.420 | tell me about what is good science.
01:19:52.660 | - That makes me sad,
01:19:53.500 | 'cause you've been interviewing
01:19:54.460 | what I would consider a lot of really good scientists.
01:19:56.700 | - No, that's true.
01:19:57.540 | - A lot of really good thinkers.
01:19:58.380 | - But that's exactly right.
01:19:59.940 | And most of the people I talk to
01:20:01.340 | are incredible human beings.
01:20:03.500 | But there's a humility that's required.
01:20:05.820 | Science is not,
01:20:06.980 | science cannot be dogmatism.
01:20:09.980 | - Sure, I agree.
01:20:10.980 | I mean-
01:20:11.820 | - Authority,
01:20:12.660 | a PhD does not give you authority.
01:20:17.020 | A lifelong pursuit of a particular task
01:20:19.500 | does not give you authority.
01:20:20.980 | You're just as lost and clueless as everybody else,
01:20:23.980 | but you're more curious and more stubborn.
01:20:27.260 | So that's a nice quality to have.
01:20:29.900 | But overall, just
01:20:31.340 | using the word science and statistics can often,
01:20:38.620 | as you know,
01:20:41.140 | kind of become a catalyst for dismissing
01:20:45.580 | new ideas, out of the box ideas,
01:20:48.620 | wild ideas, all that kind of stuff.
01:20:50.420 | - Well, yes and no.
01:20:52.180 | I think that, so I like to,
01:20:55.540 | some people find me extremely annoying in science
01:20:59.700 | because I'm basically,
01:21:01.180 | I'm quite rude and disruptive,
01:21:04.100 | not in a rude way, you know,
01:21:05.580 | some people say they're ugly or stupid
01:21:07.500 | or anything like that.
01:21:08.340 | I just say,
01:21:09.180 | you know, you're wrong.
01:21:12.220 | Or why do you think this?
01:21:13.300 | And something I,
01:21:15.340 | gift I got given by society when I was very young,
01:21:17.700 | 'cause I was in the learning difficulties class at school,
01:21:20.660 | is I was told I was stupid.
01:21:22.780 | And so I know I'm stupid,
01:21:24.860 | but I always wanted to be smart, right?
01:21:26.940 | I always, I remember going to school going,
01:21:29.020 | maybe today they're gonna tell me
01:21:30.540 | I'm not as stupid as I was yesterday.
01:21:32.340 | And it was always disappointed, always.
01:21:34.420 | And so when I went into academia and everyone said,
01:21:36.620 | you're wrong, I was like,
01:21:38.260 | join the queue.
01:21:39.140 | - Yeah. (laughs)
01:21:40.700 | - Because it allowed me to walk through the,
01:21:42.500 | you know, the wall.
01:21:43.420 | So I think that people like to,
01:21:45.260 | always imagine science as a bit like
01:21:46.740 | living in a Japanese house,
01:21:48.060 | the paper walls,
01:21:49.260 | and everyone sits in their room.
01:21:51.980 | And I annoy people because I walk straight through the wall,
01:21:54.300 | not because, why should I be a chemist
01:21:56.700 | and not a mathematician?
01:21:57.980 | Why should I be a mathematician and not a computer scientist?
01:22:00.380 | Because if the problem requires us
01:22:02.260 | to walk through those walls,
01:22:04.180 | but I like walking through the walls.
01:22:07.860 | Like, but as long, then I have to put up,
01:22:11.100 | you know, I have to do good science.
01:22:12.420 | I have to win the people in those rooms
01:22:15.420 | across by good science,
01:22:17.220 | by taking their criticisms and addressing them head on.
01:22:20.940 | And I think we must do that.
01:22:22.380 | And I think that I try and do that in my own way.
01:22:25.180 | And I kind of love walking through the walls.
01:22:28.100 | And it gives me, it's difficult for me personally,
01:22:33.180 | it's quite painful,
01:22:35.060 | but it always leads to a deeper understanding
01:22:38.260 | of the people I'm with.
01:22:39.340 | In particular, you know, the arguments I have
01:22:41.020 | with all sorts of interesting minds,
01:22:43.380 | because I want to solve the problem
01:22:45.780 | or I want to understand more about why I exist.
01:22:48.340 | You know, that's it really.
01:22:52.620 | And I think we have to not dismiss science
01:22:55.100 | on that basis.
01:22:55.940 | I think we can work with science.
01:22:58.220 | - No, science is beautiful,
01:22:59.420 | but humans with egos and all those kinds of things
01:23:03.180 | can sometimes misuse good things,
01:23:08.180 | like social justice,
01:23:10.700 | like all ideas we aspire to, misuse these beautiful ideas
01:23:15.700 | to manipulate people, all those kinds of things.
01:23:20.100 | - Sure.
01:23:20.940 | - And that's, there's assholes in every space
01:23:24.620 | and walk of life, including science.
01:23:26.580 | - Yeah, yeah, yeah, of course.
01:23:27.420 | - And those are no good, but yes, you're right.
01:23:29.820 | The scientific method has proven to be quite useful.
01:23:34.580 | That said, for difficult questions,
01:23:36.540 | for difficult explanations for rare phenomena,
01:23:41.540 | you have to walk cautiously.
01:23:44.780 | Because the scientific method,
01:23:50.220 | when you totally don't understand something,
01:23:53.220 | and it's rare, and you can't replicate it,
01:23:55.500 | doesn't quite apply.
01:23:57.140 | - Yeah, yeah, yeah, I agree with you.
01:23:59.100 | The challenge is to not dismiss the anomaly
01:24:02.140 | because you can't replicate it.
01:24:03.540 | I mean, we can talk about this.
01:24:05.260 | This is something I realized
01:24:06.540 | when we were developing assembly theory.
01:24:08.580 | People thinking that the track they're on
01:24:13.980 | is so dogmatic, but there is this thing that they see,
01:24:18.060 | but they don't see, and it takes a bit of time,
01:24:20.260 | and you just have to keep reframing it.
01:24:21.780 | And my approach is to say, well, why can't this be right?
01:24:25.060 | Why must we accept that RNA is the only way into life?
01:24:29.460 | I mean, who said?
01:24:31.220 | Does RNA have a special class of information
01:24:35.820 | that's encoded in the universe?
01:24:37.180 | No, of course it doesn't, right?
01:24:38.620 | RNA is not a special molecule
01:24:41.060 | in the space of all the other molecules.
01:24:42.860 | - But it's so elegant and simple,
01:24:44.540 | and it works so well for the evolutionary process
01:24:46.700 | that we kind of use that as an intuition
01:24:48.620 | to explain that that must be the only way to have life.
01:24:51.460 | - Sure.
01:24:52.300 | - But you mentioned assembly theory.
01:24:55.140 | Well, first, let me pause, bathroom break.
01:24:56.860 | Need it?
01:24:57.700 | - Yeah, let's take two minutes.
01:24:59.900 | - We took a quick break, and offline,
01:25:02.500 | you mentioned to me that you have a lab in your home,
01:25:07.380 | and then I said that you're basically Rick
01:25:10.640 | from Rick and Morty, which is something I've been thinking
01:25:13.660 | this whole conversation.
01:25:15.420 | And then you say that there's a glowing pickle
01:25:20.300 | that you used something involving cold plasma,
01:25:23.180 | I believe, I don't know,
01:25:24.020 | but can you explain the glowing pickle situation?
01:25:26.380 | (laughing)
01:25:28.620 | And is there many, arbitrarily many versions of you
01:25:34.380 | in alternate dimensions that you're aware of?
01:25:37.380 | - I tried to make an electrochemical memory at home
01:25:41.500 | and using a pickle, the only way I could get any traction
01:25:47.560 | with it was actually by plugging it
01:25:49.100 | into a very high voltage alternating current
01:25:51.580 | and then putting in a couple of electrodes.
01:25:53.820 | But my kids weren't impressed.
01:25:54.940 | They're not impressed with anything I do,
01:25:56.540 | any experiments I do at home.
01:25:57.860 | I think it's quite funny.
01:25:58.860 | - But you connected a pickle to some electro, I mean--
01:26:01.180 | - To 240 volts, yeah, AC.
01:26:03.100 | - Yeah.
01:26:03.940 | - And then had a couple of electrodes on it.
01:26:04.820 | So what happens is a pickle,
01:26:06.180 | this is a classic thing you do, I mean, I shouldn't,
01:26:10.860 | pranks you do, you put a pickle into the mains
01:26:13.140 | and just run away and leave it.
01:26:15.140 | And what happens is it starts to decompose,
01:26:17.580 | it heats up and then explodes,
01:26:19.020 | because the water turns to steam
01:26:20.940 | and it just violently explodes.
01:26:22.700 | But I wondered if I could cause the iron,
01:26:25.260 | sodium potassium ions in the pickle to migrate.
01:26:27.380 | It'd been in a jar, right?
01:26:28.700 | So it'd been in a brine.
01:26:31.100 | That was, yeah, that was not my best experiment.
01:26:34.300 | So I've done far better experiments in my lab at home.
01:26:36.900 | - At that time it was a failed experiment,
01:26:38.460 | but you never know, it could,
01:26:40.140 | every experiment is a successful experiment
01:26:44.540 | if you stick with it long enough.
01:26:45.980 | - Well, I mean, I got kicked out of my own lab
01:26:48.220 | by my research team many years ago, and for good reason.
01:26:50.820 | I mean, my team is brilliant
01:26:51.900 | and I used to go in and just break things.
01:26:53.820 | So what I do do at home is I have
01:26:55.780 | a kind of electronics workshop
01:26:57.620 | and I prototype experiments there.
01:27:01.380 | Then I try and suggest to my team sometimes,
01:27:05.260 | maybe we can try this thing.
01:27:07.180 | And they would just say,
01:27:08.020 | "Oh, well, that's not gonna work because of this."
01:27:10.140 | And I'll say, "Ah-ha, but actually I've tried
01:27:11.980 | "and here's some code and here's some hardware,
01:27:14.020 | "can we have a go?"
01:27:15.260 | So I'm doing that less and less now
01:27:17.500 | as I get even more busy, but that's quite fun
01:27:20.740 | 'cause then they feel that we're in the experiment together.
01:27:24.180 | - You do, in fact, brilliantly,
01:27:27.420 | just like Rick from Rick and Morty,
01:27:30.100 | connect up chemistry with computation.
01:27:34.260 | And when we say chemistry,
01:27:40.140 | we don't mean the simulation of chemistry,
01:27:45.140 | or modeling of chemistry.
01:27:46.940 | We mean chemistry in the physical space
01:27:49.260 | as well as in the digital space, which is fascinating.
01:27:52.220 | We'll talk about that,
01:27:54.260 | but first you mentioned assembly theory.
01:27:56.020 | So we'll stick on theory in these big ideas.
01:28:00.780 | I would say revolutionary ideas.
01:28:03.020 | This intersection between mathematics and philosophy.
01:28:07.300 | What is assembly theory?
01:28:09.140 | And generally speaking,
01:28:10.740 | how would we recognize life if we saw it?
01:28:15.300 | - So assembly theory is a theory,
01:28:17.620 | goes back a few years now,
01:28:18.900 | my struggle for maybe almost 10 years
01:28:22.100 | when I was going to origin of life conferences
01:28:24.380 | and artificial life conferences
01:28:25.780 | where I thought that everybody was dancing around
01:28:28.740 | the problem of what life is and what it does.
01:28:32.700 | But I'll tell you about what assembly theory is
01:28:34.380 | 'cause I think it's easier.
01:28:35.820 | So assembly theory literally says,
01:28:37.260 | if you take an object, any given object,
01:28:40.940 | and you are able to break the object
01:28:42.620 | into parts very gently,
01:28:43.860 | so just maybe let's say take a piece
01:28:46.380 | of very intricate Chinese porcelain
01:28:48.980 | and you tap it just with a hammer or the nail at some point
01:28:52.220 | and it will fragment into many parts.
01:28:54.940 | And if that object is able to fragment into many
01:28:57.860 | and you count those parts, the different parts,
01:29:00.500 | so they're unsymmetrical,
01:29:01.820 | assembly theory says the larger the number of parts,
01:29:07.180 | unsymmetrical parts that object has,
01:29:10.140 | the more likely it is that object has been created
01:29:12.500 | by an evolutionary or information process,
01:29:15.060 | especially if that object is not one-off,
01:29:18.980 | you've got an abundance of them.
01:29:22.140 | And that's really important.
01:29:23.540 | Because what I'm literally saying about the abundance,
01:29:26.860 | if you have a one-off object and you break it into parts
01:29:30.300 | and it has lots of parts, you'd say,
01:29:32.500 | well, that could be incredibly intricate and complex,
01:29:37.500 | but it could be just random.
01:29:39.980 | And I was troubled with this for years
01:29:41.740 | 'cause I saw in reality that assembly theory works.
01:29:44.700 | But when I talked to very good
01:29:46.180 | computational complexity theorists,
01:29:49.700 | algorithmic complexity people,
01:29:51.380 | they said, you haven't really done this properly,
01:29:53.860 | you haven't thought about it.
01:29:54.780 | It's like, this is the random problem.
01:29:57.660 | And so I kept working this up
01:30:00.020 | 'cause I invented assembly theory in chemistry,
01:30:03.140 | first of all, with molecules.
01:30:05.100 | And so the thought experiment was,
01:30:06.820 | how complex does a molecule need to be when I find it
01:30:11.500 | that it couldn't possibly have arisen
01:30:12.780 | by chance probabilistically?
01:30:15.260 | And if I found this molecule,
01:30:16.540 | and was able to detect it in enough quantities
01:30:18.180 | in the same object, with a machine,
01:30:19.500 | like a mass spectrometer.
01:30:20.820 | So typically in a mass spectrometer,
01:30:22.540 | you weigh the molecules in an electric field,
01:30:24.340 | you probably have to have on the order
01:30:25.660 | of 10,000 identical molecules to get a signal.
01:30:28.540 | So 10,000 identical molecules that are complex,
01:30:32.900 | what's the chance of them occurring by chance?
01:30:36.700 | Well, we can do the math.
01:30:37.540 | Let's take a molecule like strychnine
01:30:39.300 | or, yeah, so strychnine is a good molecule actually to take
01:30:44.300 | or Viagra is a good molecule.
01:30:46.820 | I made jokes about Viagra 'cause it's a complex molecule.
01:30:49.580 | And one of my friends said,
01:30:50.420 | "Yeah, if we find Viagra on Mars in detectable quantities,
01:30:53.300 | "we know something is up."
01:30:54.780 | (laughing)
01:30:57.020 | But anyway, it's a complex molecule.
01:31:00.300 | So what you do is you take this molecule
01:31:01.740 | in the mass spectrometer and you hit it with some electrons
01:31:03.860 | or an electric field and it breaks apart.
01:31:06.300 | And the larger the number of different parts,
01:31:09.980 | you know when it starts to get to a threshold.
01:31:12.820 | My idea was that that molecule could not be created
01:31:15.060 | by chance probabilistically.
01:31:17.220 | So that was where assembly theory was born
01:31:20.300 | in an experiment, in a mass spec experiment.
01:31:22.820 | And I was thinking about this
01:31:24.380 | because NASA is sending the mass spectrometers
01:31:26.220 | to Mars, to Titan, it's gonna send them to Europa.
01:31:29.620 | There's gonna be a nuclear-powered mass spectrometer
01:31:32.020 | going to Titan.
01:31:33.620 | I mean, this is the coolest experiment ever.
01:31:35.860 | They're not only sending a drone
01:31:37.260 | that's gonna fly around Titan,
01:31:38.420 | it's gonna be powered by a nuclear slug, a nuclear battery
01:31:43.420 | and it's gonna have a mass spectrometer on it.
01:31:45.940 | - Is this already launched?
01:31:47.140 | - No, it's Dragonfly and it's gonna be launched
01:31:50.180 | in a few years.
01:31:51.020 | I think it got pushed a year because of the pandemic.
01:31:53.460 | So I think three or four years.
01:31:55.220 | - Dragonfly, nuclear Dragonfly is going to fly to Titan
01:32:00.900 | and collect data about the composition
01:32:05.900 | of the various chemicals on Titan.
01:32:09.580 | - Yeah, I'm trying to convince NASA.
01:32:11.380 | I don't know if I'll be able to convince
01:32:12.660 | the Dragonfly team that they should apply this approach,
01:32:17.000 | but they will get data and depending on how good
01:32:19.220 | their mass spectrometer is.
01:32:20.840 | But I had this thought experiment anyway
01:32:22.380 | and I did this thought experiment.
01:32:24.340 | And for me, it seemed to work.
01:32:26.620 | I turned the thought experiment into an algorithm
01:42:29.080 | in assembly theory. And basically, with assembly theory,
01:42:31.600 | let's just make it generic
01:32:33.740 | and let's just take the word abracadabra.
01:42:36.200 | So, if you have a book
01:42:38.920 | with lots of words in it
01:42:41.140 | and you find abracadabra one off
01:42:42.780 | and it's a book that's been written in a random way by,
01:42:46.240 | you know, a set of monkeys in a room.
01:32:48.440 | - And typewriters.
01:42:49.600 | - Yeah, on typewriters, and you find one off
01:32:51.320 | abracadabra, no big deal.
01:32:52.500 | But if you find lots of reoccurrences of abracadabra,
01:32:55.920 | well, that means something weird is going on.
01:32:57.500 | But let's think about the assembly number of abracadabra.
01:33:00.460 | So abracadabra has a, you know,
01:33:03.280 | has a number of letters in it.
01:33:04.840 | You can break it down.
01:33:05.680 | So you just cut the letters up.
01:33:07.540 | But when you actually reassemble abracadabra,
01:33:09.660 | the minimum number of ways of organizing those letters,
01:33:12.440 | so you'd have an A, a B, you know, and keep going up.
01:33:16.560 | And the thing is,
01:33:18.880 | when you cut abracadabra up into parts,
01:33:21.560 | you can put it together again in seven steps.
01:33:24.060 | So what does that mean?
01:33:24.900 | That means you're allowed to reuse
01:33:26.960 | things you make earlier in the chain,
01:33:29.480 | and that's the memory of the universe,
01:33:32.520 | the process that makes abracadabra.
01:33:35.040 | And because of that causal chain,
01:33:36.460 | you can then get to abracadabra
01:33:38.500 | in fewer steps than the number of letters,
01:33:40.440 | having to specify only seven.
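A minimal sketch of the counting described above, assuming the usual string convention for assembly pathways: start from single characters, join any two objects already built, and reuse anything in the pool at no extra cost. The function and the specific pathway are illustrative, not Lee's published algorithm; the pathway reproduces the seven steps for abracadabra by building "abra" once and reusing it.

```python
# A toy illustration (not Lee's published algorithm): verify a proposed assembly
# pathway for a string and count its joining steps. The pool starts with the
# single characters; each step concatenates two objects already in the pool,
# and anything built earlier can be reused at no extra cost.

def count_assembly_steps(target, pathway):
    pool = set(target)                       # basic building blocks: single characters
    for left, right in pathway:
        assert left in pool and right in pool, f"{left!r} or {right!r} not built yet"
        pool.add(left + right)               # new object joins the pool, reusable later
    assert target in pool, "pathway never builds the target"
    return len(pathway)

# One seven-step pathway for "abracadabra": build "abra" once, reuse it at the end.
pathway = [
    ("a", "b"), ("ab", "r"), ("abr", "a"),           # -> abra
    ("abra", "c"), ("abrac", "a"), ("abraca", "d"),  # -> abracad
    ("abracad", "abra"),                             # reuse "abra" -> abracadabra
]
print(count_assembly_steps("abracadabra", pathway))  # 7
```

Finding the true minimum over all possible pathways is the hard part; as comes up later in the conversation, that search blows up quickly, which is why it is described as an NP-hard optimization.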
01:33:43.200 | So if you take that to a molecule
01:33:44.640 | and you cut the molecule up into parts,
01:33:47.920 | and you can, on the causal chain,
01:33:49.820 | and you basically start with the atoms and then bonds,
01:33:52.640 | and then you randomly add on those parts
01:33:54.520 | to make the A, make the B, make the,
01:33:57.120 | and keep going all the way up,
01:33:58.760 | I found that literally, assembly theory
01:34:01.600 | allows me to say how compressed a molecule is,
01:34:04.920 | so whether there's some information in there.
01:34:07.840 | And I realized assembly theory wasn't,
01:34:10.480 | isn't just confined to molecular space,
01:34:13.280 | it can apply to anything.
01:34:14.160 | But let me finish the molecular argument.
01:34:16.020 | So what I did is I had this theory,
01:34:18.680 | I, with one of my students, we wrote an algorithm.
01:34:22.680 | We basically took the 20 million molecules from a database
01:34:26.320 | and we just calculated their assembly number.
01:34:28.720 | And that's the index.
01:34:29.600 | Like basically, if I take a molecule
01:34:32.040 | and I cut it up into bonds,
01:34:33.920 | what is the minimum number of steps I need to take
01:34:36.640 | to reform that molecule from atoms?
01:34:39.720 | - So reusability of previously formed things
01:34:42.600 | is somehow a fundamental part of what--
01:34:43.840 | - Exactly, it's like memory in the universe, right?
01:34:46.400 | I'm making lots of leaps here, like, it's kind of weird.
01:34:48.840 | I'm saying, right, there's a process that can form
01:34:50.880 | the A and the B and the C, let's say.
01:34:53.400 | And then when there's,
01:34:54.320 | and because we've formed A and B before,
01:34:56.320 | we can use A and B again with no extra cost except one unit.
01:35:00.080 | So that's kind of the chain of events.
01:35:02.560 | - And that's how you think about memory here
01:35:04.260 | when you say the universe,
01:35:05.960 | when you talk about the universe
01:35:07.560 | or life is the universe creating memory.
01:35:11.880 | - Exactly.
01:35:12.880 | So we went through chemical space
01:35:15.240 | and we looked at the assembly numbers
01:35:16.960 | and we were able to classify it.
01:35:18.640 | So, okay, let's test it, let's go.
01:35:21.600 | So we're able to take a whole bunch of molecules
01:35:23.960 | and assign an assembly index to them, okay?
01:35:27.480 | And it's just a function of the number of bonds
01:35:30.720 | in the molecule and how much symmetry.
01:35:32.360 | So literally, assembly theory is a measure
01:35:34.800 | of how little symmetry a molecule has.
01:35:37.920 | So the more asymmetry, the more information,
01:35:40.600 | the more weird it is,
01:35:41.440 | like a Jackson Pollock of some description.
01:35:44.400 | So I then went and did a load of experiments
01:35:47.760 | and I basically took those molecules,
01:35:49.960 | I cut them up in the mass spec
01:35:51.320 | and measured the number of peaks
01:35:52.640 | without any knowledge of the molecule.
01:35:55.200 | And we found the assembly number,
01:35:57.200 | there was almost not quite a one-to-one correlation,
01:36:00.600 | but almost because not all bonds are equal,
01:36:02.760 | they have different energies.
01:36:03.940 | I then did this using two other spectroscopic techniques,
01:36:07.000 | NMR, nuclear magnetic resonance,
01:36:08.720 | which uses radio frequency to basically jangle the molecules
01:36:11.720 | and get a signature out.
01:36:13.080 | And I also used infrared.
01:36:15.080 | And infrared and NMR almost gave us a one-to-one correlation.
01:36:18.200 | So what am I saying?
01:36:19.560 | Saying by taking a molecule
01:36:21.800 | and doing either infrared or NMR or mass spec,
01:36:26.060 | I can work out how many parts there are in that molecule
01:36:30.080 | and then put it on a scale.
01:36:32.360 | And what we did in the next part of the work
01:36:35.600 | is we took molecules randomly from the environment,
01:36:39.180 | from outer space, from all around Earth,
01:36:41.440 | from the sea, from Antarctica,
01:36:45.720 | and from fossils and so on.
01:36:47.560 | And even NASA, 'cause they didn't believe us,
01:36:49.760 | blinded some samples.
01:36:51.600 | And we found that all these samples that came from biology
01:36:56.600 | produced molecules that had a very high assembly number,
01:37:00.560 | above a threshold of about 15.
01:37:02.920 | So basically, all the stuff that came
01:37:05.880 | from that abiotic origin was low.
01:37:08.680 | There was no complexity there.
01:37:10.240 | So we suddenly realized that on Earth, at least,
01:37:12.680 | there is a cutoff that natural phenomena
01:37:16.160 | cannot produce molecules that need
01:37:18.080 | more than 15 steps to make them.
01:37:20.040 | So I realized that this is a way to make a scale of life,
01:37:25.280 | a scale of technology as well.
01:37:27.400 | And literally, you could just go sniffing for molecules
01:37:30.480 | off Earth, on Titan, on Mars.
01:37:33.360 | And when you find a molecule in the mass spectrometer
01:37:35.680 | that gives you more than 15 parts,
01:37:37.760 | you'll know pretty much for sure
01:37:40.680 | that it had to be produced by evolution.
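A hedged sketch of the decision rule Lee describes here, with hypothetical sample data and a made-up detection floor; only the ~15-step threshold comes from the discussion above.

```python
# Illustrative sketch only: the threshold of 15 comes from the discussion above,
# but the sample data and detection floor here are invented for the example.

ASSEMBLY_THRESHOLD = 15      # steps; above this, no abiotic sample was observed
MIN_COPIES = 10_000          # rough order-of-magnitude mass-spec detection floor

def likely_made_by_evolution(sample):
    """sample: list of (assembly_index, copy_number) for each detected molecule."""
    return any(index > ASSEMBLY_THRESHOLD and copies >= MIN_COPIES
               for index, copies in sample)

abiotic_like = [(3, 10**7), (7, 10**5)]        # hypothetical mineral/meteorite-type sample
biotic_like  = [(4, 10**6), (21, 5 * 10**4)]   # hypothetical biological sample
print(likely_made_by_evolution(abiotic_like))  # False
print(likely_made_by_evolution(biotic_like))   # True
```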
01:37:43.200 | And this allowed me to come up with a general definition
01:37:45.400 | of life based on assembly theory,
01:37:47.320 | to say that if I find an object
01:37:49.600 | that has a large number of parts,
01:37:51.760 | say an iPhone, or Boeing 747,
01:37:54.680 | or any complex object, and I can find it in abundance
01:37:58.760 | and cut it up, I can tell you whether that has been produced
01:38:03.080 | by an informational process or not.
01:38:05.000 | And that's what assembly theory kind of does.
01:38:07.720 | But it goes a bit further.
01:38:09.080 | I then realized that this isn't just about life,
01:38:12.280 | it's about causation.
01:38:14.320 | So actually, it tells you about
01:38:15.840 | whether there's a causal structure.
01:38:17.360 | So now I can look at objects in the universe,
01:38:19.000 | say that again, this cup, and say, right,
01:38:20.800 | I'm gonna look at how many independent parts it has.
01:38:23.760 | So that's the assembly number.
01:38:25.560 | I'll then look at the abundance, how many cups.
01:38:28.000 | There are two on this table,
01:38:28.960 | maybe there's a few more you got stashed away.
01:38:31.160 | So assembly is a function of the complexity of the object
01:38:37.240 | times the number of copy numbers of that object,
01:38:39.400 | or a function of the copy number, normalized.
01:38:41.920 | So I realized there's a new quantity in the universe.
01:38:45.120 | You have energy, entropy, and assembly.
01:38:48.160 | - So assembly, the way we should think about that
01:38:51.240 | is how much reusability there is.
01:38:55.220 | Because what-- - Yes.
01:38:56.060 | - Reusability is like, can you play devil's advocate to this?
01:38:59.720 | So could this just be a nice tertiary signal
01:39:04.720 | for living organisms, like some kind of distant signal
01:39:09.720 | that's, yeah, this is a nice property,
01:39:12.320 | but it's not capturing something fundamental?
01:39:14.880 | Or do you think reusability is something fundamental
01:39:17.320 | to life in complex organisms?
01:39:19.480 | - I think reusability is fundamental in the universe,
01:39:22.480 | not just for life in complex organisms.
01:39:24.400 | It's about causation.
01:39:26.240 | So I think assembly tells you, if you find objects,
01:39:29.520 | 'cause you can do this with trajectories as well,
01:39:31.760 | if you think about it, the fact there are objects
01:39:36.760 | in the universe on Earth is weird.
01:39:39.800 | You think about it, we should just have
01:39:42.440 | a combinatorial explosion of stuff.
01:39:44.600 | The fact that not everything exists is really weird.
01:39:49.600 | - Yeah, and then there, as I'm looking at two mugs
01:39:54.600 | and two water bottles, and the things that exist
01:39:57.760 | are similar and multiply in copies of each other.
01:40:02.760 | - Yeah, yeah, so I would say that assembly allows you
01:40:06.360 | to do something that statistical mechanics
01:40:08.800 | and people looking at entropy have got stuck with
01:40:11.840 | for a while.
01:40:13.080 | So I'm making, it's pretty bold.
01:40:14.600 | I mean, I'm writing a paper with Sarah Walker
01:40:16.560 | on this at the moment, and we're realizing,
01:40:19.320 | we don't wanna get ahead of ourselves
01:40:20.320 | because I think that there's lots of ways
01:40:22.280 | where this is, you know, we're not gonna get ahead
01:40:24.840 | of ourselves, but we're gonna get ahead of ourselves
01:40:26.760 | and it's a really interesting idea.
01:40:30.520 | It works for molecules, and it appears to work
01:40:33.440 | for any objects produced by causation.
01:40:35.560 | 'Cause you can take a motor car,
01:40:36.440 | you can look at the assembly of the motor car,
01:40:37.640 | look at a book, look at the assembly of the book.
01:40:39.760 | Assembly theory tells you there's a way
01:40:41.600 | of compressing and reusing, and so when people,
01:40:44.760 | I talk to information theorists, they say,
01:40:47.000 | "Oh, this is just logical depth."
01:40:48.800 | I say, "It is like logical depth,
01:40:51.200 | "but it's experimentally measurable."
01:40:53.960 | They say, "Oh, it's a bit like Kolmogorov complexity."
01:40:55.720 | I say, "But it's computable."
01:40:57.880 | And now, okay, it's not infinitely computable,
01:41:00.600 | gets NP hard very quickly, right?
01:41:02.320 | It's a very hard problem when you,
01:41:04.200 | but it's computable enough, it's tractable enough
01:41:07.760 | to be able to tell the difference between a molecule
01:41:09.240 | that's been formed by the random background
01:41:11.120 | and by causation.
01:41:13.600 | And I think that that's really interesting
01:41:16.560 | because until now, there's no way
01:41:19.080 | of measuring complexity objectively.
01:41:22.240 | Complexity has required algorithmic comparisons
01:41:26.280 | and programs and human beings to label things.
01:41:30.560 | Assembly is label-free.
01:41:32.680 | Well, not entirely.
01:41:34.200 | We can talk about what that means in a minute.
01:41:37.040 | - Okay, my brain has been broken a couple times here.
01:41:42.040 | - I'm sorry I explained it really badly.
01:41:43.480 | - No, it was very well explained.
01:41:45.040 | It was just fascinating, and it's,
01:41:47.520 | my brain is broken into pieces,
01:41:50.400 | and I'm trying to assemble it.
01:41:52.000 | So NP hard, so when you have a molecule,
01:41:58.200 | you're trying to figure out, okay,
01:42:01.040 | if we were to reuse parts of this molecule,
01:42:03.800 | which parts can we reuse to,
01:42:08.680 | as an optimization problem, NP hard,
01:42:11.600 | to figure out the minimum amount of reused components
01:42:15.880 | that will create this molecule.
01:42:18.160 | And it becomes difficult when you start
01:42:19.700 | to look at huge, huge molecules, arbitrarily large.
01:42:23.760 | 'Cause I'm also mapping this,
01:42:25.160 | can I think about this in complexity generally,
01:42:28.800 | like looking at a cellular automata system
01:42:30.840 | and saying, can this be used as a measure of complexity
01:42:35.600 | for an arbitrarily complicated system?
01:42:38.780 | - Yeah, I think it can.
01:42:40.400 | - It can.
01:42:41.240 | - And I think that the question is,
01:42:42.920 | and what's the benefit?
01:42:43.880 | 'Cause there's plenty of, I mean,
01:42:46.560 | in computer science and mathematics and physics,
01:42:48.580 | people have been really seriously studying complexity
01:42:51.560 | for a long time.
01:42:53.240 | And I think there's some really interesting problems
01:42:55.160 | of where we coarse-grain and we lose information.
01:42:58.600 | And all assembly theory does really,
01:43:00.120 | assembly theory just explains weak emergence.
01:43:03.200 | And so what assembly theory says,
01:43:05.120 | look, going from the atoms that interact,
01:43:08.920 | those first replicators that build one another.
01:43:12.040 | Assembly at the minimal level just tells you evidence
01:43:14.960 | that there's been replication and selection.
01:43:17.920 | And I think the more selected something is,
01:43:20.440 | the higher the assembly.
01:43:22.520 | And so we're able to start to know
01:43:25.600 | how to look for selection in the universe.
01:43:27.160 | If you go to the moon,
01:43:28.440 | there's nothing of very high assembly on the moon
01:43:30.160 | except the human artifacts we've left there.
01:43:32.960 | So again, let's go back to the sandbox.
01:43:36.080 | And assembly theory says,
01:43:37.880 | if all the sand grains could stick together,
01:43:40.080 | that's the infinite combinatorial explosion in the universe.
01:43:43.200 | That should be the default.
01:43:44.800 | Well, we don't have that.
01:43:45.960 | Now let's assemble sand grains together
01:43:48.640 | and do them in every possible way.
01:43:51.080 | So we have a series of minimal operations
01:43:54.760 | that can move the sand together.
01:43:56.200 | But all that doesn't exist either.
01:43:57.920 | Now, because we have specific memory where we say,
01:43:59.920 | well, we're gonna put three sand grains in line
01:44:01.800 | or four and make a cross or a triangle
01:44:03.920 | or something unsymmetrical.
01:44:05.480 | And once we've made the triangle
01:44:06.680 | and the unsymmetrical thing, we remember that,
01:44:08.400 | we can use it again 'cause on that causal chain.
01:44:10.920 | So what assembly theory allows you to do
01:44:12.480 | is go to the actual object that you find in space.
01:44:16.120 | And actually the way you get there is by disassembling it.
01:44:20.320 | Assembly theory works by disassembling objects you have
01:44:24.440 | and understanding the steps to create them.
01:44:27.720 | And it works for molecules beautifully
01:44:30.160 | 'cause you just break bonds.
01:44:31.800 | - But like you said, it's gonna be hard, it's very difficult.
01:44:33.920 | It's a difficult problem to figure out
01:44:35.600 | how to break them apart.
01:44:36.600 | - For molecules, it's easy.
01:44:37.640 | If you just keep low enough in molecular weight space,
01:44:40.800 | it's good enough.
01:44:41.720 | So it's a complete theory.
01:44:43.880 | When we start to think about objects,
01:44:45.320 | we can start to assign,
01:44:46.960 | we can start to think about things at different levels,
01:44:49.080 | different, what you assign as your atom.
01:44:51.560 | So in a molecule, the atom, this is really confusing
01:44:54.640 | 'cause the word atom, I mean smallest breakable part.
01:44:57.280 | So in a molecule, the atom is the bond
01:44:59.720 | 'cause you break bonds, not atoms, right?
01:45:02.440 | So in a car, the atom might be, I don't know,
01:45:06.120 | a small amount of iron or the smallest reusable part,
01:45:09.800 | a rivet, a piece of plastic or something.
01:45:13.520 | So you gotta be really careful.
01:45:14.800 | In a microprocessor, the atoms might be transistors.
01:45:18.480 | And so the amount of assembly that something has
01:45:23.480 | is a function, you have to look at the atom level.
01:45:26.640 | What are you, where are your parts?
01:45:28.240 | What are you counting?
01:45:29.080 | - That's one of the things you get to choose.
01:45:30.800 | What is, at what scale is the atom?
01:45:32.760 | What is the minimal thing? - Exactly.
01:45:34.760 | - I mean, there's huge amounts of trade-offs
01:45:36.760 | in when you approach a system and try to analyze,
01:45:39.720 | like if you approach Earth, you're an alien civilization,
01:45:42.480 | try to study Earth, what is the atom
01:45:44.960 | for trying to measure the complexity of life?
01:45:48.420 | Is it, are humans the atoms?
01:45:50.560 | - I would say to start with, you just use molecules.
01:45:53.520 | I can say for sure, if there are molecules
01:45:56.060 | of sufficient complexity on Earth,
01:45:58.240 | then I know that life has made them.
01:45:59.760 | And you can go further and show technology.
01:46:01.400 | There are molecules that exist on Earth
01:46:03.880 | that are not possible even by biology.
01:46:06.160 | You needed technology and you needed microprocessors
01:46:08.480 | to get there, so that's really cool.
01:46:11.480 | - And there's a correlation between that,
01:46:14.680 | between the coolness of that and assembly number,
01:46:18.620 | whatever the measure, what would you call the measure?
01:46:21.600 | - Assembly index.
01:46:22.560 | - Yeah, assembly index.
01:46:23.400 | - So there are three kind of fundamental kind of labels
01:46:26.400 | we have, so there's the quantity of assembly
01:46:28.960 | and the assembly, so if you have a box,
01:46:31.160 | let's just have a box of molecules.
01:46:32.560 | So I'm gonna have my box.
01:46:34.240 | We count the number of identical molecules
01:46:36.560 | and then we chop each molecule up
01:46:38.040 | in an individual molecule class
01:46:40.520 | and calculate the assembly number.
01:46:42.280 | So basically, you then have a function
01:46:45.760 | that sums over all the molecules for each assembly
01:46:49.440 | and then you divide through, so you make it,
01:46:51.680 | divide through by the number of molecules.
01:46:54.240 | - So that's the assembly index for the box?
01:46:56.200 | - So that will tell you the amount of assembly in the box.
01:46:58.640 | So basically, the assembly equation we come up with
01:47:01.080 | is like basically the sum of e to the power
01:47:05.000 | of the assembly index of molecule i
01:47:07.440 | times the number of copies of the molecule i
01:47:10.560 | and then you normalize.
01:47:11.480 | So you sum them all up and then normalize.
01:47:13.420 | So some boxes are gonna be more assembled than others.
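One way to write down the box calculation Lee sketches here, as a rough formalization of the spoken description; the exact normalization in the published work may differ, and the numbers below are hypothetical.

```python
import math

# Rough transcription of the spoken formula for the assembly of a box:
#   Assembly ~ sum_i  e^(a_i) * n_i  /  N_total
# where a_i is the assembly index of distinct molecule i, n_i its copy number,
# and N_total the total number of molecules in the box.

def box_assembly(molecules):
    """molecules: list of (assembly_index, copy_number) for each distinct molecule."""
    total = sum(n for _, n in molecules)
    return sum(math.exp(a) * n for a, n in molecules) / total

sand_box = [(2, 10**12), (3, 10**11)]   # simple molecules, enormous copy numbers
cell_box = [(18, 10**6), (25, 10**4)]   # complex molecules, far fewer copies
print(box_assembly(sand_box) < box_assembly(cell_box))  # True: the living box is more assembled
```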
01:47:16.400 | - Yeah, that's what they tell me.
01:47:17.760 | So if you were to look at me as a box,
01:47:19.960 | let's say I'm a box, am I assembling my parts
01:47:23.840 | in terms of like, how do you know,
01:47:25.480 | what's my assembly index?
01:47:27.040 | And be gentle.
01:47:28.760 | - So let's just, we'll talk about the molecules in you.
01:47:30.960 | So let's just take a pile of sand the same way as you
01:47:34.560 | and I would take you and just cut up all the molecules.
01:47:39.560 | I mean, and look at the number of copies
01:47:42.600 | and assembly number.
01:47:43.880 | So in sand, let's say, there's probably gonna be
01:47:46.040 | nothing more than assembly number two or three,
01:47:48.360 | but there might be trillions and trillions of sand grains.
01:47:52.000 | In your body, there might be,
01:47:53.880 | the assembly number's gonna be higher,
01:47:56.160 | but there might not be quite as many copies
01:47:58.320 | because the molecular weight is higher.
01:48:01.220 | - So you do wanna average it out?
01:48:03.000 | - You can average, you can do average it out.
01:48:04.400 | - I'm not defined by the most impressive molecules.
01:48:07.160 | - No, no, you're an average in your volume.
01:48:08.720 | Well, I mean, we're just working this out,
01:48:10.520 | but what's really cool is you're gonna have
01:48:13.520 | a really high assembly.
01:48:15.160 | The sand will have a very low assembly.
01:48:16.720 | Your causal power is much higher.
01:48:19.520 | You get to make decisions, you're alive, you're aspiring.
01:48:22.960 | Assembly says something about causal power in the universe.
01:48:26.600 | And that's not supposed to exist
01:48:28.880 | because physicists don't accept that causation
01:48:31.640 | exists at the bottom.
01:48:33.160 | - So I understand at the chemical level
01:48:34.600 | why the assembly is causation,
01:48:37.680 | why is it causation?
01:48:38.600 | 'Cause it's capturing the memory.
01:48:40.960 | - Exactly.
01:48:41.800 | - Capturing memory, but there's not an action to it.
01:48:45.680 | So I'm trying to see how it leads to life.
01:48:50.680 | - Well, it's what life does.
01:48:53.000 | So I think it's, we don't know.
01:48:55.040 | - Yeah, that's a good question.
01:48:57.320 | What is life versus what does life do?
01:49:00.140 | - Yeah, so this is the definition of life,
01:49:02.480 | the only definition we need.
01:49:04.200 | - What's the assembly in terms?
01:49:05.760 | - That life is able to create objects in abundance
01:49:10.760 | that are so complex, the assembly number is so high,
01:49:15.700 | they can't possibly form in an environment
01:49:18.720 | where there's just random interactions.
01:49:20.560 | So suddenly you can put life on a scale.
01:49:24.880 | And then life doesn't exist, actually, in that case.
01:49:27.880 | It's just how evolved you are.
01:49:30.320 | And you as an object,
01:49:33.120 | because you have incredible causal power,
01:49:35.680 | you could go and launch rockets or build cars
01:49:40.680 | or create drugs or, you can do so many things.
01:49:46.360 | You can build stuff, build more artifacts
01:49:49.000 | that show that you have had causal power.
01:49:52.360 | And that causal power was this kind of a lineage.
01:49:55.360 | And I think that over time,
01:49:57.840 | I've been realizing that physics as a discipline
01:50:02.600 | has a number of problems associated with it.
01:50:04.920 | Me as a chemist, it's kind of interesting
01:50:07.320 | that assembly theory, and I'm really,
01:50:09.160 | I wanna maintain some credibility in the physicist's eyes,
01:50:13.480 | but I have to push them
01:50:14.400 | because physics is a really good discipline.
01:50:17.960 | It's reduced the number.
01:50:20.080 | Physics is about reducing the belief system.
01:50:22.240 | But they're down to some things in their belief system,
01:50:24.720 | which kind of really makes me grumpy.
01:50:27.760 | Number one is requiring order
01:50:29.600 | at the beginning of the universe magically.
01:50:31.040 | We don't need that.
01:50:32.240 | The second is the second law.
01:50:34.000 | Well, we don't actually need that.
01:50:36.120 | - This is blasphemous.
01:50:39.200 | - Well, in a minute, I'll recover my career in a second.
01:50:42.280 | Although I think the only good thing
01:50:43.960 | about being the Regius chair
01:50:45.120 | is that there has to be an act of parliament
01:50:46.840 | to fire me.
01:50:47.840 | - Yeah.
01:50:48.680 | (both laughing)
01:50:51.340 | You can always go to Lee's Twitter and protest.
01:50:56.680 | And I think the third thing is that,
01:50:58.600 | so we've got the order at the beginning.
01:51:01.680 | - Second law.
01:51:03.360 | - The second law, and the fact that causation is emergent.
01:51:06.760 | All right, and that time is emergent.
01:51:09.000 | - Sean Carroll just turned off this program.
01:51:11.200 | I think he believes that it's emergent.
01:51:13.840 | So causation is not emergent.
01:51:15.520 | - That's clearly incorrect
01:51:16.960 | because we wouldn't exist otherwise.
01:51:20.120 | So physicists have kind of got confused about time.
01:51:23.400 | Time is a real thing.
01:51:25.560 | Well, I mean, so look,
01:51:27.360 | I'm very happy with the current description
01:51:31.160 | of the universe as physics give me
01:51:32.720 | because I can do a lot of stuff, right?
01:51:33.960 | I can go to the moon with Newtonian physics, I think.
01:51:36.560 | And I can understand the orbit of Mercury with relativity.
01:51:41.080 | And I can build transistors with quantum mechanics, right?
01:51:43.960 | And I can do all this stuff.
01:51:45.760 | So I'm not saying the physics is wrong.
01:51:47.280 | I'm just saying, if we say that time is fundamental,
01:51:50.480 | i.e. time is non-negotiable, there's a global clock,
01:51:54.120 | I don't need to require that there's order
01:51:57.560 | being magically made in the past
01:51:59.720 | because that asymmetry is built
01:52:01.360 | into the way the universe is.
01:52:03.360 | - So if time is fundamental,
01:52:07.000 | I mean, you've been referring to this kind of
01:52:09.560 | an interesting formulation of that is memory.
01:52:13.920 | - Yeah.
01:52:14.760 | - So time is hard to like put a finger on,
01:52:18.960 | like what the hell are we talking about?
01:52:20.880 | It's just a direction.
01:52:21.800 | But memory is a construction,
01:52:24.640 | especially when you have like,
01:52:26.680 | think about these local pockets of complexity,
01:52:29.000 | these non-zero assembly index entities
01:52:34.000 | that's being constructed and they remember.
01:52:36.360 | Never forget molecules.
01:52:39.160 | - But remember, the thing is I invented assembly theory.
01:52:41.920 | I'll tell you I invented it.
01:52:44.440 | When I was a kid, I mean, the thing is,
01:52:46.160 | I keep making fun of myself to my search group.
01:52:48.320 | I've only ever had one idea.
01:52:49.640 | I keep exploring that idea
01:52:51.040 | over the 40 years or so since I had the idea.
01:52:53.720 | I used to--
01:52:54.560 | - Well, aren't you the idea that the universe had?
01:52:56.440 | So it's very kind of hierarchical.
01:52:58.480 | Anyway, go ahead, I'm sorry.
01:52:59.960 | - That's very poetic.
01:53:02.080 | - Yeah.
01:53:02.920 | - So I think I came up with assembly theory
01:53:04.760 | with the following idea when I was a kid.
01:53:07.240 | I was obsessed about survival kits.
01:53:09.520 | What is the minimum stuff I would need
01:53:12.440 | to basically replicate my reality?
01:53:14.000 | And I love computers and I love technology
01:53:17.040 | or what technology is gonna become.
01:53:18.320 | So I imagined that I would have basically
01:53:20.760 | this really big truck full of stuff.
01:53:22.880 | And I thought, well, can I delete some of that stuff out?
01:53:25.520 | Can I have a blueprint?
01:53:26.520 | And then in the end, I kept making it smaller.
01:53:29.560 | It got to maybe half a truck and into a suitcase.
01:53:32.320 | And then I went, okay, well, screw it.
01:53:34.000 | I wanna carry my entire technology in my pocket.
01:53:37.600 | How do I do it?
01:53:38.640 | And I'm not gonna launch into a Steve Jobs-y,
01:53:41.160 | iPhone kind of thing.
01:53:43.200 | I came up with a matchbox survival kit.
01:53:45.960 | In that matchbox survival kit,
01:53:47.280 | I would have the minimum stuff
01:53:48.520 | that would allow me to interact the environment
01:53:50.560 | to build my shelter, to build a fishing rod,
01:53:54.240 | to build a water purification system.
01:53:57.280 | And it's kind of like, so what did I use in my box
01:53:59.480 | to assemble in the environment,
01:54:01.620 | to assemble, to assemble, to assemble?
01:54:04.280 | And I realized I could make a causal chain
01:54:07.040 | in my survival kit.
01:54:08.200 | So I guess that's probably why I've been obsessed
01:54:10.200 | with assembly theory for so long.
01:54:12.000 | And I was just pre-configured to find it somewhere.
01:54:17.000 | And when I saw it in molecules,
01:54:19.440 | I realized that the causal structure that we say emerges
01:54:24.360 | and the physics kind of gets really stuck
01:54:27.160 | because they're saying that time,
01:54:28.600 | you can go backwards in time.
01:54:29.760 | I mean, how do we let physicists get away
01:54:32.160 | with the notion that we can go back in time
01:54:34.360 | and meet ourselves?
01:54:35.240 | I mean, that's clearly a very hard thing to let up.
01:54:40.240 | Physicists would not let other sciences get away
01:54:46.040 | with that kind of heresy, right?
01:54:49.120 | So why are physicists allowed to get away with it?
01:54:51.080 | - So first of all, to push back, to play devil's advocate,
01:54:54.280 | you are clearly married to the idea of memory.
01:54:58.280 | You see in this, again, from Rick and Morty way,
01:55:02.200 | you have these deep dreams of the universe
01:55:06.360 | that is writing the story through its memories,
01:55:08.880 | through its chemical compounds
01:55:10.840 | that are just building on top of each other.
01:55:12.440 | And then they find useful components they can reuse.
01:55:16.120 | And then the reused components create systems
01:55:19.760 | that themselves are then reused
01:55:21.800 | and all in this way construct things.
01:55:24.080 | But when you think of that as memory,
01:55:27.880 | it seems like quite sad that you can walk that back.
01:55:31.960 | But at the same time, it feels like that memory,
01:55:34.880 | you can walk in both directions on that memory
01:55:37.040 | in terms of time.
01:55:38.000 | - You could walk in both directions,
01:55:39.520 | but I don't think that that makes any sense
01:55:42.000 | because the problem that I have with time being reversible
01:55:47.000 | is that, I mean, I'm just a,
01:55:53.440 | I'm a dumb experimental chemist, right?
01:55:55.160 | So I love burning stuff, burning stuff and building stuff.
01:55:58.880 | But when I think of reversible phenomena,
01:56:01.120 | I imagine in my head,
01:56:02.200 | I have to actually manufacture some time.
01:56:05.200 | I have to borrow time from the universe to do that.
01:56:08.120 | I can't, when anyone says,
01:56:10.040 | let's imagine that we can go back in time or reversibility,
01:56:13.120 | you can't do that.
01:56:13.960 | You can't step out of time.
01:56:15.160 | Time is non-negotiable, it's happening.
01:56:17.000 | - No, but see, you're assuming that time is fundamental,
01:56:20.480 | which most of us do when we go day to day,
01:56:23.720 | but it takes a leap of wild imagination
01:56:27.280 | to think that time is emergent.
01:56:29.720 | - No, time is not emergent.
01:56:30.920 | Yeah, I mean, this is an argument we can have,
01:56:32.880 | but I believe I can come up with an experiment.
01:56:35.920 | - An experiment that proves
01:56:37.160 | that time cannot possibly be emergent?
01:56:38.840 | - Experiment that shows how assembly theory
01:56:43.320 | kind of is the way that the universe produces selection
01:56:48.200 | and that selection gives rise to life.
01:56:50.680 | And also to say, well, hang on,
01:56:53.400 | we could allow ourselves to have a theory
01:56:55.320 | that requires us to have these statements to be possible.
01:56:59.200 | Like we need to have order in the past
01:57:03.160 | or we can have used the past hypothesis,
01:57:06.240 | which is order in the past, but as well, okay.
01:57:10.840 | And we have to have an arrow of time,
01:57:12.840 | we have to require that entropy increases.
01:57:16.520 | And we have to say, and then we can say,
01:57:17.800 | look, the universe is completely closed
01:57:19.960 | and there's no novelty or that novelty is predetermined.
01:57:23.960 | What I'm saying is very, very important
01:57:26.440 | that time is fundamental, which means if you think about it,
01:57:30.080 | the universe becomes more and more novel each step.
01:57:35.880 | It generates more states in the next step than it had before.
01:57:35.880 | So that means bigger search.
01:57:37.560 | So what I'm saying is that the universe
01:57:39.480 | wasn't capable of consciousness at day one,
01:57:41.680 | actually, because it didn't have enough states.
01:57:45.760 | But today the universe is, so it's like how--
01:57:48.800 | - All right, all right, hold on a second.
01:57:51.600 | Now we've pissed off the panpsychics too, okay.
01:57:54.060 | No, this is brilliant, sorry.
01:57:57.560 | Part of me is just joking, having fun with this thing,
01:58:00.240 | but 'cause you're saying a lot of brilliant stuff
01:58:02.640 | and I'm trying to slow it down before my brain explodes.
01:58:05.640 | So 'cause I wanna break apart
01:58:08.080 | some of the fascinating things you're saying.
01:58:10.320 | So novelty, novelty is increasing in the universe
01:58:14.680 | because the number of states is increasing.
01:58:16.960 | What do you mean by states?
01:58:18.440 | - So I think the physicists almost got everything right.
01:58:20.760 | I can't fault them at all.
01:58:22.920 | I just think there's a little bit of dogma.
01:58:24.440 | I'm just trying to play devil's advocate.
01:58:26.040 | I'm very happy to be entirely wrong on this, right?
01:58:28.760 | I'm not right on many things at all.
01:58:31.080 | But if I can make less assumptions
01:58:34.080 | about the universe with this,
01:58:35.920 | then potentially that's a more powerful way
01:58:38.100 | of looking at things.
01:58:39.720 | - If you think of time as fundamental,
01:58:41.800 | you can make less assumptions overall.
01:58:43.480 | - Exactly.
01:58:44.600 | The time is fundamental.
01:58:45.760 | I don't need to add on a magical second law
01:58:48.020 | because the second law comes out of the fact
01:58:49.480 | the universe is actually, there's more states available.
01:58:52.720 | I mean, we might even be able to do weird things
01:58:54.580 | like dark energy in the universe
01:58:55.880 | might actually just be time, right?
01:58:58.640 | - Yeah, but then you still have to explain
01:59:00.960 | why time is fundamental.
01:59:02.560 | 'Cause I can give you one explanation
01:59:04.480 | that's simpler than time and say God.
01:59:06.320 | Just because it's simple doesn't mean it's,
01:59:09.960 | you still have to explain God
01:59:12.640 | and you still have to explain time.
01:59:14.400 | Why is it fundamental?
01:59:15.600 | - So let's just say existence is default,
01:59:18.040 | which means time is the default.
01:59:19.600 | So, look, lots of--
01:59:20.440 | - Wait, how did you go from the existence
01:59:22.040 | is the default to time is the default?
01:59:23.360 | - Well, look, we exist, right?
01:59:25.000 | So let's just be very--
01:59:27.040 | - We're yet to talk about what exist means
01:59:29.080 | because consciousness is not. - All right, let's go
01:59:29.920 | all the way back, yeah, yeah, okay.
01:59:31.120 | I think it's very poetic and beautiful
01:59:32.600 | what you're weaving into this.
01:59:34.200 | I don't think this conversation is even about the assembly index,
01:59:37.240 | which is fascinating and we'll keep mentioning it,
01:59:41.160 | and this idea
01:59:43.040 | that I don't think is necessarily connected to time.
01:59:47.000 | - Oh, I think it is deeply connected.
01:59:48.480 | I just can't explain it.
01:59:50.320 | - So you don't think everything you've said
01:59:52.600 | about assembly theory and assembly index
01:59:56.080 | can still be correct even if time is emergent?
01:59:58.640 | - So, yeah, right now, assembly theory appears to work.
02:00:01.640 | I appear to be able to measure objects of high assembly
02:00:04.880 | in a mass spectrometer and look at their abundance
02:00:06.660 | and all that's fine, right?
02:00:08.160 | It's a nice, if nothing else, it's a nice way
02:00:10.560 | of looking at how molecules can compress things.
02:00:13.200 | Now, am I saying that a time has to be fundamental
02:00:17.200 | not emergent for assembly theory to work?
02:00:18.740 | No, I think I'm saying that the universe,
02:00:23.320 | it appears that the universe has many different ways
02:00:25.720 | of using time.
02:00:26.560 | You could have three different types of time.
02:00:28.360 | You could just have time that's,
02:00:30.000 | the way I would think of it,
02:00:30.880 | if you want to hold onto emergent time,
02:00:33.320 | I think that's fine, let's do that for a second.
02:00:35.600 | Hold onto emergent time
02:00:37.480 | and the universe is just doing its thing.
02:00:39.600 | Then assembly time only exists
02:00:41.420 | when the universe starts to write memories through bonds.
02:00:43.800 | So let's just say there's rocks running around,
02:00:45.640 | when the bond happens and selection starts,
02:00:49.360 | suddenly the universe is remembering cause in the past
02:00:54.360 | and those structures will have effects in the future.
02:00:58.000 | So suddenly a new type of time emerges at that point,
02:01:00.760 | which has a direction.
02:01:02.360 | And I think Sean Carroll at this point
02:01:04.240 | might even turn the podcast back on and go,
02:01:06.560 | okay, I can deal with that, that's fine.
02:01:08.860 | But I'm just basically trying to condense the conversation
02:01:10.960 | and say, hey, let's just have time fundamental
02:01:13.080 | and see how that screws with people's minds.
02:01:15.000 | - You're triggering people by saying fundamental.
02:01:17.740 | - Why not?
02:01:18.580 | - Well, you just say, like, let's say--
02:01:19.880 | - Why am I, look, I'm walking through the wall.
02:01:22.160 | Why should I grow up in a world where time,
02:01:27.000 | I don't go back in time, I don't meet myself in the past.
02:01:30.760 | There are no aliens coming from the future, right?
02:01:34.240 | It's just like--
02:01:35.080 | - No, no, no, but that's not, no, no, no, hold on a second.
02:01:38.680 | That's like saying we're talking about biology
02:01:41.800 | or like evolutionary psychology and you're saying,
02:01:44.900 | okay, let's just assume that clothing is fundamental.
02:01:48.540 | People wearing clothes is fundamental.
02:01:50.400 | It's like, no, no, no, wait a minute.
02:01:52.320 | You can't, like, I think you're gonna get in a lot
02:01:55.240 | of trouble if you assume time is fundamental.
02:01:57.600 | - Why?
02:01:58.440 | Give me one reason why I'm getting into trouble
02:02:00.040 | with time being fundamental.
02:02:01.320 | - Because you might not understand the origins
02:02:05.600 | of this memory that might be deeper.
02:02:08.080 | Like, this memory, that could be a thing
02:02:11.180 | that's explaining the construction
02:02:13.300 | of these higher complexities better
02:02:17.700 | than just saying it's a search.
02:02:21.160 | It's chemicals doing a search for reusable structures
02:02:26.160 | that they can like then use as bricks to build a house.
02:02:32.680 | - Okay, so I accept that.
02:02:34.400 | So let's go back a second because it's a kind of,
02:02:37.920 | I wanted to drop the time bomb at this part
02:02:40.680 | because I think we can carry on discussing it
02:02:42.160 | for many, many, many, many, many days, many months.
02:02:45.640 | But I'm happy to accept that it might be wrong.
02:02:50.920 | But what I would like to do is imagine a universe
02:02:53.800 | where time is fundamental and time is emergent
02:02:56.600 | and ask, let's just then talk about causation
02:03:00.320 | because physicists require that causation,
02:03:02.900 | so this is where I'm gonna go, causation emerges
02:03:05.880 | and it doesn't exist at the micro scale.
02:03:08.440 | Well, that clearly is wrong because if causation has
02:03:11.080 | to emerge at the macro scale, life cannot emerge.
02:03:13.660 | So how does life emerge?
02:03:15.200 | Life requires molecules to bump into each other,
02:03:17.940 | produce replicators.
02:03:19.840 | Those replicators need to produce polymers.
02:03:21.880 | There needs to be cause and effect at the molecular level.
02:03:24.600 | There needs to be a non-ergodic to an ergodic transition
02:03:28.160 | at some point and those replicators have consequence,
02:03:33.160 | material consequence in the universe.
02:03:37.520 | Physicists just say, oh, you know what?
02:03:39.720 | I'm gonna have a bunch of particles in a box.
02:03:42.480 | I'm gonna think about it in either Newtonian way
02:03:44.840 | and a quantum way and I'll add on an arrow time
02:03:48.020 | so I can label things and causation will happen
02:03:51.200 | magically later.
02:03:52.040 | Well, how?
02:03:52.860 | Explain causation and they can't.
02:03:55.760 | The only way I can reconcile causation
02:03:58.540 | is having a fundamental time because this allows me
02:04:01.440 | to have a deterministic universe that creates novelty
02:04:04.680 | and there's so many things to unpack here
02:04:08.320 | but let's go back to the point.
02:04:09.780 | You said, can assembly theory work with emergent time?
02:04:12.600 | Sure, it can but it doesn't give me a deep satisfaction
02:04:16.560 | about how causation and assembly gives rise
02:04:21.080 | to these objects that move through time and space.
02:04:24.600 | And again, what am I saying?
02:04:25.600 | To bring it back, I can say without fear,
02:04:29.800 | take this water bottle and look at this water bottle
02:04:32.080 | and look at the features on it.
02:04:32.900 | There's writing, you've got a load of them.
02:04:35.760 | I know that causal structures gave rise to this.
02:04:38.640 | In fact, I'm not looking at just one water bottle here.
02:04:40.840 | I'm looking at every water bottle
02:04:42.120 | that's ever been conceived of by humanity.
02:04:44.540 | This here is a special object.
02:04:46.960 | In fact, Leibniz knew this.
02:04:48.700 | Leibniz, who was around at the same time as Newton,
02:04:52.080 | he kind of got stuck.
02:04:53.200 | I think Leibniz actually invented assembly theory.
02:04:56.040 | He gave soul, the soul that you see in objects
02:04:58.680 | wasn't the mystical soul, it is assembly.
02:05:01.080 | It is the fact there's been a history of objects related
02:05:04.000 | and without the object in the past,
02:05:06.780 | this object wouldn't exist.
02:05:08.380 | There is a lineage and there is conserved structures,
02:05:12.640 | causal structures have given rise to those.
02:05:14.800 | - Fair enough.
02:05:17.900 | And you're saying it's just a simpler view
02:05:20.340 | of time is fundamental.
02:05:22.940 | - And it shakes the physicist's cage a bit, right?
02:05:25.820 | I was gonna say, but I think that--
02:05:27.920 | (laughing)
02:05:29.100 | - I just enjoy the fact that physicists are in cages.
02:05:32.500 | This is good.
02:05:33.340 | - I think that, I would say that Lee Smolin,
02:05:36.760 | I don't want to speak for Lee.
02:05:37.940 | I'm talking to Lee about this.
02:05:39.940 | I think Lee also is in agreement
02:05:41.820 | that time has to be fundamental.
02:05:43.740 | But I think he goes further.
02:05:46.020 | Even in space, I don't think you can go back
02:05:47.780 | to the same place in space.
02:05:49.960 | I've been to Austin a few times now.
02:05:51.380 | This is my, I think, third time I've been to Austin.
02:05:54.100 | Is Austin in the same place?
02:05:56.420 | The solar system is moving through space.
02:05:58.460 | I'm not back in the same space.
02:06:01.900 | Locally, I am.
02:06:02.960 | Every event in the universe is unique.
02:06:06.300 | - In space.
02:06:08.780 | - And time.
02:06:09.620 | - And time.
02:06:10.580 | Doesn't mean we can't go back, though.
02:06:13.260 | I mean, let's just rest this conversation,
02:06:18.260 | which was beautiful, with a quote from the Rolling Stones
02:06:22.580 | that you can't always get what you want.
02:06:25.260 | Which is, you want time to be fundamental,
02:06:27.820 | but if you try, you'll get what you need,
02:06:30.100 | which is assembly theory.
02:06:32.160 | Okay, let me ask you about,
02:06:34.340 | continue talking about complexity,
02:06:36.740 | and to clarify it with this beautiful theory of yours
02:06:41.740 | that you're developing, and I'm sure will continue developing
02:06:45.300 | both in the lab and in theory.
02:06:48.280 | Yeah, it can't be said enough.
02:06:52.220 | Just the ideas you're playing with in your head
02:06:54.980 | are just, and we've been talking about it,
02:06:57.020 | are just beautiful.
02:06:58.100 | So if we talk about complexity a little bit more generally,
02:07:01.100 | maybe in an admiring, romantic way,
02:07:05.700 | how does complexity emerge from simple rules?
02:07:09.180 | The why, the how.
02:07:11.260 | Okay, the nice algorithm of assembly is there.
02:07:15.300 | - I would say that the problem I have right now,
02:07:17.300 | is I mean, you're right, we can, about time as well.
02:07:20.300 | The problem is I have this hammer called assembly,
02:07:22.580 | and everything I see is a nail.
02:07:24.520 | So now, let's just apply it to all sorts of things.
02:07:27.460 | We take the Bénard instability.
02:07:28.900 | The Bénard instability is you have oil,
02:07:31.120 | you heat up oil, let's say on a frying pan,
02:07:35.120 | when you get convection, you get honeycomb patterns.
02:07:37.500 | Take the formation of snowflakes, right?
02:07:40.580 | Take the emergence of a tropical storm,
02:07:45.580 | or the storm on Jupiter.
02:07:47.580 | When people say, let's talk about complexity in general,
02:07:50.100 | what they're saying is, let's take this collection
02:07:52.920 | of objects that are correlated in some way,
02:07:56.880 | and try and work out how many moving parts there are,
02:08:00.300 | how this got, how this exists.
02:08:02.940 | So what people have been doing for a very long time
02:08:04.820 | is taking complexity and counting what they've lost,
02:08:09.300 | calculating the entropy.
02:08:11.140 | And the reason why I'm pushing very hard on assembly,
02:08:13.900 | is entropy tells you how much you've lost.
02:08:16.380 | It doesn't tell you which microstates are gone.
02:08:18.800 | But if you embrace the bottom up with assembly,
02:08:21.740 | you keep those states, and you then understand the causal chain
02:08:26.340 | that gives rise to the emergence.
02:08:28.700 | So what I think assembly will help us do
02:08:31.460 | is understand weak emergence at the very least,
02:08:34.300 | and maybe allow us to crack open complexity in a new way.
02:08:39.300 | And I've been fascinated with complexity theory
02:08:42.740 | for many years.
02:08:43.580 | I mean, as soon as I could, you know,
02:08:46.640 | I learned of the Mandelbrot set,
02:08:48.460 | and I could write, just type it up in my computer
02:08:51.620 | and run it, and just show it, see it kind of unfold.
02:08:55.740 | It was just this kind of, this mathematical reality
02:09:00.180 | that existed in front of me, I just found incredible.
02:09:03.980 | But then I realized that actually, we were cheating.
02:09:07.100 | We're putting in the boundary conditions all the time,
02:09:09.500 | we're putting in information.
02:09:11.500 | And so, when people talk to me about the complexity
02:09:17.420 | of things, I say, but relative to what?
02:09:17.420 | How do you measure them?
02:09:18.700 | So my attempt, my small attempt, naive attempt,
02:09:23.540 | because there's many greater minds than mine
02:09:25.420 | on the planet right now thinking about this properly,
02:09:27.500 | and you've had some of them on the podcast, right?
02:09:29.420 | They're just absolutely fantastic.
02:09:33.100 | But I'm wondering if we might be able to reformat
02:09:35.620 | the way we would explore algorithmic complexity
02:09:39.880 | using assembly.
02:09:41.080 | What's the minimum number of constraints we need
02:09:45.260 | in our system for this to unfold?
02:09:47.860 | So whether it's like, you know, if you take some particles
02:09:50.860 | and put them in a box, at a certain box size,
02:09:53.780 | you get quasi-crystallinity coming out, right?
02:09:57.100 | But that emergence, it's not magic.
02:10:00.540 | It must come from the boundary conditions you put in.
02:10:03.780 | So all I'm saying is a lot of the complexity that we see
02:10:07.100 | is a direct read of the constraints we put in,
02:10:10.260 | but we just don't understand.
02:10:11.980 | So as I said earlier to the poor origin of life chemists,
02:10:14.620 | you know, origin of life is a scam.
02:10:17.340 | I would say lots of the complexity calculation theory
02:10:20.220 | is a bit of a scam, 'cause we put the constraints in,
02:10:22.900 | but we don't count them correctly.
02:10:25.260 | And I'm wondering if--
02:10:26.140 | - Oh, you're thinking, and sorry to drop this,
02:10:28.900 | as assembly theory, assembly index is a way
02:10:32.140 | to count to the constraints.
02:10:33.300 | - Yes, that's it, that's all it is.
02:10:35.020 | So assembly theory doesn't lower any of the importance
02:10:38.700 | of complexity theory, but it allows us to go across domains
02:10:42.180 | and start to compare things,
02:10:44.460 | compare the complexity of a molecule, of a microprocessor,
02:10:47.820 | of the text you're writing, of the music you may compose.
02:10:52.020 | - You've tweeted, quote, "Assembly theory explains
02:10:56.700 | "why Nietzsche understood we had limited freedom
02:10:59.460 | "rather than radical freedom."
02:11:01.980 | So we've applied assembly theory to cellular automata
02:11:04.580 | in life and chemistry.
02:11:06.100 | What does Nietzsche have to do with assembly theory?
02:11:09.620 | - Oh, that gets me into free will and everything.
02:11:12.100 | - So let me say that again.
02:11:14.580 | Assembly theory explains why Nietzsche understood
02:11:16.660 | we had limited freedom rather than radical freedom.
02:11:20.420 | Limited freedom, I suppose, is referring to the fact
02:11:23.500 | that there's constraints.
02:11:25.020 | - Yeah.
02:11:26.700 | - What is radical freedom?
02:11:28.780 | What is freedom?
02:11:30.180 | - So Sartre was like, believed in absolute freedom
02:11:33.460 | and that he could do whatever he wanted in his imagination.
02:11:38.620 | And Nietzsche understood that his freedom
02:11:41.780 | was somewhat more limited.
02:11:43.180 | And it kind of takes me back to this computer game
02:11:46.180 | that I played when I was 10.
02:11:47.940 | So I think it's called "Dragon's Lair."
02:11:49.940 | (laughing)
02:11:50.980 | - Okay.
02:11:51.820 | - Do you know "Dragon's Lair?"
02:11:52.660 | - I think I know "Dragon's Lair," yeah.
02:11:54.100 | - "Dragon's Lair," I knew I was being conned, right?
02:11:56.380 | "Dragon's Lair," when you play the game,
02:11:57.780 | you're lucky that you grew up
02:11:59.060 | in a basically procedurally generated world.
02:12:01.540 | - That was RPG a little bit.
02:12:03.180 | No, it's like, is it turn-based play, was it?
02:12:06.860 | - It was a role-playing game.
02:12:07.980 | - Role-playing.
02:12:08.820 | - But really good graphics, and on one of the first LaserDiscs.
02:12:11.660 | And when you actually flicked the stick,
02:12:13.580 | you took, it was like a graphical adventure game
02:12:16.580 | with animation.
02:12:17.980 | - Yeah.
02:12:18.820 | - And when I played this game, I really,
02:12:19.980 | you could get through the game in 12 minutes
02:12:22.060 | if you knew what you were doing,
02:12:22.980 | if you were not making mistakes.
02:12:23.900 | You just play the disc, play the disc, play the disc.
02:12:25.700 | So it was just about timing.
02:12:27.460 | And actually, it was a complete fraud
02:12:29.660 | because all the animation has been pre-recorded on the disc.
02:12:33.260 | - Yeah.
02:12:34.100 | - It's like "The Black Mirror," the first interactive
02:12:36.100 | where they had all the, you know,
02:12:37.460 | several million kind of permutations of the movie
02:12:41.460 | that you could select on Netflix.
02:12:42.780 | I've forgotten the name of it.
02:12:44.700 | So this was exactly that in the LaserDiscs.
02:12:47.660 | You basically go left, go right,
02:12:49.340 | fight the ogre, slay the dragon.
02:12:52.060 | And when you flick the joystick at the right time,
02:12:53.860 | it just goes to the next animation to play.
02:12:55.860 | - Yeah.
02:12:56.700 | - It's not really generating it.
02:12:57.580 | - Yeah.
02:12:58.420 | - And I played that game and I knew I was being had.
02:13:00.540 | (laughs)
02:13:01.460 | - So, oh, okay, I see.
02:13:03.200 | So to you, "Dragon Lair" is the first time
02:13:06.660 | you realized that free will is an illusion.
02:13:09.380 | - Yeah.
02:13:10.220 | (laughs)
02:13:11.180 | - And why does assembly theory give you hints
02:13:14.980 | about free will, whether it's an illusion or not?
02:13:17.140 | - Yeah, so no, so not totally.
02:13:18.820 | If I do think I have some will
02:13:21.340 | and I think I am an agent and I think I can interact
02:13:24.020 | and I can play around with the model I have of the world
02:13:27.100 | and the cost functions, right,
02:13:28.260 | and I can hack my own cost functions,
02:13:30.260 | which means I have a little bit of free will.
02:13:32.340 | But as much as I want to do stuff in the universe,
02:13:35.160 | I don't think I could suddenly say,
02:13:37.340 | I mean, actually, this is ridiculous
02:13:38.540 | 'cause now I say I could try and do it, right?
02:13:39.900 | It's like I'm gonna suddenly give up everything
02:13:41.220 | and become a rapper tomorrow, right?
02:13:44.180 | Maybe I could try that,
02:13:45.900 | but I don't have sufficient agency
02:13:48.700 | to make that necessarily happen, I'm on a trajectory.
02:13:51.500 | So when in "Dragon's Lair,"
02:13:52.380 | I know that I have some trajectories that I can play with,
02:13:56.540 | where Sartre realized he thought
02:13:58.940 | that he had no assembly, no memory,
02:14:00.820 | he could just leap across and do everything.
02:14:03.420 | And Nietzsche said, okay, I realize I don't have full freedom
02:14:07.580 | but I have some freedom.
02:14:09.340 | And the assembly theory basically says that.
02:14:11.300 | It says, if you have these constraints in your past,
02:14:14.200 | they limit what you are able to do in the future,
02:14:16.420 | but you can use them to do amazing things.
02:14:19.300 | Let's say I'm a poppy plant and I'm creating some opiates.
02:14:23.860 | Opiates are really interesting molecules.
02:14:25.780 | I mean, they're obviously great for medicine,
02:14:27.900 | but also cause great problems in society,
02:14:29.740 | but let's imagine we fast forward a billion years,
02:14:33.420 | what will the opioids look like in a billion years?
02:14:37.780 | Well, we can guess
02:14:38.620 | because we can see how those proteins will evolve
02:14:41.300 | and we can see how the secondary metabolites will change,
02:14:44.900 | but they can't go radical.
02:14:46.460 | They can't suddenly become, I don't know,
02:14:48.580 | like a molecule that you'd find in an OLED in a display.
02:14:52.260 | They will have some, they will be limited
02:14:54.240 | by the causal chain that produced them.
02:14:57.060 | And that's what I'm getting at, saying you're,
02:14:59.980 | we're predictive, we are unpredictably predictable
02:15:03.540 | or predictably unpredictable within a constraint
02:15:07.520 | on the trajectory we're on.
02:15:08.860 | - Yeah, so the predictably part is the constraints
02:15:11.740 | of the trajectory and the unpredictable part
02:15:14.140 | is the part that you still haven't really clarified
02:15:16.540 | of the origin of the little bit of freedom.
02:15:19.980 | - Yeah.
02:15:20.820 | - So you're just arguing, you're basically saying
02:15:23.620 | that radical freedom is impossible.
02:15:26.740 | You're really operating in a world of constraints
02:15:29.080 | that are constrained by the memory of the trajectory
02:15:31.740 | of the chemistry that led to who you are.
02:15:33.740 | Okay, but you know, even just a tiny bit of freedom,
02:15:38.740 | even if everything, if everywhere you are in physics,
02:15:44.020 | in cages, if you can move around in that cage a little bit,
02:15:49.020 | you're free.
02:15:50.580 | - I agree.
02:15:51.420 | - And so the question is, in assembly theory,
02:15:55.700 | if we're thinking about free will,
02:15:57.500 | where does the little bit of freedom come from?
02:15:59.700 | What is the eye that can decide to be a rapper?
02:16:03.700 | What, why, what is that?
02:16:06.940 | That's a cute little trick we've convinced each other of
02:16:10.440 | so we can do fun tricks at parties
02:16:13.460 | or is there something fundamental
02:16:15.300 | that allows us to feel free, to be free?
02:16:18.860 | - I think that that's the question that I wanna answer.
02:16:21.860 | I know you wanna answer it and I think it's so profound.
02:16:25.820 | Let me have a go at it.
02:16:27.140 | I would say that I don't take the stance of Sam Harris
02:16:30.260 | 'cause I think Sam Harris, when he said,
02:16:31.780 | the way he says it is almost, it's really interesting.
02:16:33.980 | I'd love to talk to him about it.
02:16:35.020 | Sam Harris almost thinks himself out of existence, right?
02:16:37.660 | Because, do you know what I mean?
02:16:40.620 | - Yeah, well, he has different views
02:16:43.580 | on consciousness versus free will.
02:16:45.340 | I think he saves himself with consciousness.
02:16:47.420 | He thinks himself out of existence with free will.
02:16:49.980 | - Yeah, yeah, exactly.
02:16:50.940 | So that means there's no point, right?
02:16:53.260 | So I-- - He's a leaf
02:16:54.300 | floating on a river.
02:16:55.820 | - Yeah, I think that he's, I don't know,
02:17:00.820 | I'd love to ask him whether he really believes that
02:17:03.580 | and then we could play some games.
02:17:04.660 | - Oh yeah. - No, no, I then would say,
02:17:06.380 | I'll get him to play a game of cards with me
02:17:08.900 | and I'll work out the conditions
02:17:10.020 | on which he says no.
02:17:11.580 | And then I'll get him to the conditions he says yes
02:17:13.420 | and then I'll trap him in his logical inconsistency
02:17:15.620 | with that argument.
02:17:17.180 | Because at some point when he loses enough money
02:17:19.980 | or the prospect of losing enough money,
02:17:21.940 | there's a way of basically mapping out a series of,
02:17:26.940 | so what will is about, let's not call it free will,
02:17:30.900 | what will is about is to have a series of decisions
02:17:34.540 | equally weighted in front of you
02:17:36.820 | and those decisions aren't necessarily
02:17:38.420 | energy minimization, those decisions are a function
02:17:42.740 | of the model you've made in your mind,
02:17:44.340 | you're in your simulation.
02:17:45.940 | And the way you've interacted in reality
02:17:48.740 | and also other interactions that you're having
02:17:52.100 | with other individuals and happenstance.
02:17:55.260 | And I think that you, there's a little bit of delay in time.
02:18:00.260 | So I think what you're able to do is say,
02:18:03.820 | well, I'm gonna do the counterfactual.
02:18:06.660 | I've done all of them.
02:18:08.140 | And I'm gonna go this way.
02:18:10.740 | And you probably don't know why.
02:18:12.060 | I think free will is actually very complex interaction
02:18:14.780 | between your unconscious and your conscious brain.
02:18:18.780 | And I think the reason why we're arguing about it,
02:18:20.780 | it's so interesting in that we just,
02:18:23.780 | some people outsource their free will
02:18:26.940 | to their unconscious brain.
02:18:28.800 | And some people try and overthink
02:18:31.140 | the free will in the conscious brain.
02:18:33.100 | I would say that Sam Harris has realized
02:18:35.460 | his conscious brain doesn't have free will,
02:18:36.900 | but his unconscious brain does.
02:18:38.580 | That's my guess, right?
02:18:39.420 | - And that he can't have access to the unconscious brain.
02:18:41.780 | - Yeah, and that's kind of annoying.
02:18:43.660 | - So he's just, he's going to, through meditation,
02:18:46.380 | come to acceptance with that fact.
02:18:48.020 | - Yeah, which is maybe okay.
02:18:49.820 | Maybe, but I do think that I have the ability
02:18:54.180 | to make decisions and I like my decisions.
02:18:55.980 | In fact, I mean, this is an argument I have
02:18:58.500 | with some people that some days I feel I have no free will
02:19:03.260 | and it's just an illusion.
02:19:04.780 | And this is one, and it makes me more radical,
02:19:06.620 | if you like, you know, as a,
02:19:08.860 | that I get to explore more of the state space.
02:19:11.620 | And I'm like, I'm gonna try and affect the world now.
02:19:13.620 | I'm really gonna ask the question
02:19:15.220 | that maybe I dare not ask or dare not,
02:19:17.620 | or do the thing I dare not do.
02:19:19.660 | And that allows me to kind of explore more.
02:19:21.800 | - It's funny that if you truly accept
02:19:25.540 | that there's no free will,
02:19:27.020 | that is a kind of radical freedom.
02:19:30.620 | It's funny, but you're,
02:19:34.020 | because the little bit of the illusion
02:19:37.620 | under that framework that you have
02:19:39.980 | that you can make choices,
02:19:41.940 | if choice is just an illusion of psychology,
02:19:45.420 | you can do whatever the hell you want.
02:19:46.780 | That's the-- - But we don't, do we?
02:19:48.820 | And I think--
02:19:49.660 | - But because you don't truly accept
02:19:51.500 | that you think that there's,
02:19:53.960 | like, you think there's a choice,
02:19:56.860 | which is why you don't just do whatever the hell you want.
02:20:00.980 | Like, you feel like there's some responsibility
02:20:03.060 | for making the wrong choice, which is why you don't do it.
02:20:05.580 | But if you truly accept that the choice
02:20:07.500 | has already been made,
02:20:08.900 | then you can go, I don't know,
02:20:12.500 | what is the most radical thing?
02:20:14.120 | I mean, but, yeah, I don't, I wonder what,
02:20:19.860 | what am I preventing myself from doing
02:20:21.880 | that I would really want to do?
02:20:23.480 | Probably, like, humor stuff.
02:20:26.180 | Like, I would love to,
02:20:29.100 | if I could, like, save a game, do the thing,
02:20:32.780 | and then reload it later, like, do undo,
02:20:35.780 | it'd probably be humor,
02:20:37.100 | just to do something, like, super hilarious.
02:20:40.380 | That's super embarrassing, and then just go.
02:20:44.780 | I mean, it's basically just fun.
02:20:46.660 | I would add more fun to the world.
02:20:48.020 | - I mean, I sometimes do that.
02:20:49.420 | You know, I sometimes,
02:20:52.060 | I try and mess up my reality in unusual ways
02:20:56.380 | by just doing things because I'm bored, but not bored.
02:20:59.320 | I'm not expressing this very well.
02:21:00.580 | I think that this is a really interesting problem,
02:21:02.500 | that perhaps the hard sciences don't really understand
02:21:05.540 | that they are responsible for,
02:21:06.640 | because the question about how life emerged,
02:21:09.500 | and how intelligence emerged,
02:21:10.700 | and consciousness, and free will,
02:21:12.280 | they're all ultimately boiling down
02:21:14.480 | to some of the same mechanics, I think.
02:21:16.720 | My feeling is that they are the same problem
02:21:19.260 | again and again and again.
02:21:20.920 | The transition from a, you know, a boring world,
02:21:24.740 | or a world in which there is no selection.
02:21:26.780 | So I wonder if free will has something to do
02:21:28.540 | with selection and models,
02:21:30.260 | and also the models you're generating in the brain,
02:21:32.500 | and also the amount of memory,
02:21:34.140 | of working memory you have available at any one time
02:21:35.980 | to generate counterfactuals.
02:21:37.500 | - Well, that's fascinating.
02:21:38.340 | So like the decision-making process
02:21:40.580 | is a kind of selection.
02:21:42.140 | - Yeah. - And that could be just--
02:21:43.260 | - Absolutely. - Yet another,
02:21:45.140 | yet another manifestation of the selection mechanism
02:21:48.220 | that's pervasive throughout the universe.
02:21:50.340 | Okay, that's fascinating to think about.
02:21:52.660 | (laughs)
02:21:54.780 | Yeah.
02:21:55.700 | There's not some kind of fundamental,
02:21:56.980 | its own thing, or something like that,
02:21:59.300 | that is just yet another example of selection.
02:22:01.980 | - Yeah.
02:22:02.820 | And in a universe that's intrinsically open,
02:22:06.300 | you want to do that because you generate novelty.
02:22:08.660 | - You mentioned something about,
02:22:10.700 | do cellular automata exist outside the human mind
02:22:13.940 | in our little offline conversation.
02:22:16.040 | Why is that an interesting question?
02:22:18.780 | So cellular automata, complexity,
02:22:22.420 | what's the relationship between complexity
02:22:24.300 | and the human mind, and trees falling in the forest?
02:22:28.900 | - Infrastructure, so the CA,
02:22:30.500 | so when John von Neumann and Conway and Feynman
02:22:34.020 | were doing CA, they were doing it on paper.
02:22:35.780 | - CA is cellular automata.
02:22:37.140 | - Just drawing them on paper.
02:22:38.860 | - How awesome is that,
02:22:40.020 | that they were doing cellular automata on paper?
02:22:42.260 | - Yeah. - And then they were doing
02:22:43.420 | a computer that takes like forever
02:22:46.420 | to print out anything and program.
02:22:48.780 | - Sure.
02:22:49.620 | - People are now, with the TikTok,
02:22:51.020 | kids these days with the TikTok,
02:22:52.620 | don't understand how amazing it is
02:22:55.240 | to just play with cellular automata,
02:22:57.060 | arbitrarily changing the rules as you want,
02:23:00.020 | the initial conditions, and see the beautiful
02:23:03.060 | patterns emerge, sing with fractals,
02:23:04.980 | all of that, just all.
02:23:06.180 | - You've just given me a brilliant idea.
02:23:07.580 | I wonder if there's a TikTok account
02:23:08.780 | that's just dedicated to putting out CA rules,
02:23:10.700 | and if it isn't, we should make one.
02:23:12.620 | - 100%.
02:23:14.060 | And that will get--
02:23:14.900 | - Then we have millions of views.
02:23:16.660 | - Millions, yes.
02:23:17.740 | No, it'll get dozens.
02:23:20.220 | - Or just have it running.
02:23:21.460 | So look, I kind of,
02:23:23.060 | I love CAs.
02:23:25.460 | (laughing)
02:23:27.700 | Yeah, no.
02:23:30.140 | - We just have to make one.
02:23:32.540 | I actually, a few years ago,
02:23:34.140 | I made some robots that talk to each other,
02:23:36.140 | chemical robots that played the game of Hex,
02:23:40.020 | invented by John Nash, by doing chemistry,
02:23:42.940 | and they communicated via Twitter
02:23:44.940 | which experiments they were doing.
02:23:46.540 | And they had a lookup table of experiments.
02:23:49.780 | And robot one said, "I'm doing experiment 10."
02:23:51.700 | And the other robot, "Okay, I'll do experiment one then."
02:23:53.700 | - And they communicated--
02:23:54.820 | - Like publicly or DMs?
02:23:57.300 | - Yeah, yeah, yeah.
02:23:58.700 | - Can you maybe quickly explain what the Game of Hex is?
02:24:01.460 | - Yeah, so it's basically a board, hexagonal board,
02:24:04.260 | and you try and basically, you color each hexagon,
02:24:06.660 | each element on the board of each hexagon,
02:24:08.820 | and you try and get from one side to the other,
02:24:10.660 | and the other one tries to block you.
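A minimal sketch of the win condition Lee describes for Hex, assuming a simple row-and-column board representation; this is illustrative only, not the robots' actual code.

# Toy Game of Hex win check: each player colours hexagons and tries to connect
# their two opposite sides. Cells are (row, col) on an n x n rhombus.

from collections import deque

NEIGHBOURS = [(-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0)]

def has_won(board, player):
    """board[r][c] is 0 (empty), 1 or 2. Player 1 connects top to bottom,
    player 2 connects left to right."""
    n = len(board)
    if player == 1:
        starts = [(0, c) for c in range(n) if board[0][c] == 1]
        finished = lambda r, c: r == n - 1
    else:
        starts = [(r, 0) for r in range(n) if board[r][0] == 2]
        finished = lambda r, c: c == n - 1
    seen, queue = set(starts), deque(starts)
    while queue:
        r, c = queue.popleft()
        if finished(r, c):
            return True
        for dr, dc in NEIGHBOURS:
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < n and board[nr][nc] == player \
                    and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

# Player 1 (top-to-bottom) has a winning chain down the first column:
board = [[1, 0, 2],
         [1, 2, 0],
         [1, 0, 2]]
print(has_won(board, 1))  # True
print(has_won(board, 2))  # False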
02:24:12.420 | - How are they connected, what-
02:24:13.860 | - So what the robots-
02:24:14.780 | - So the chemical-
02:24:16.740 | - Yeah, let's go back.
02:24:17.560 | So the two robots, each robot was doing dye chemistry.
02:24:20.640 | So making RGB, red, green, blue, red, green, blue,
02:24:23.020 | red, green, blue.
02:24:24.180 | And they could just choose from experiments
02:24:26.220 | to do red, green, blue.
02:24:28.500 | Initially, I said to my group,
02:24:29.460 | we need to make two chemical robots that play chess,
02:24:31.500 | and my group were like, that's too hard, no.
02:24:33.700 | - Too complicated. - Go away.
02:24:35.340 | But anyway, so we had the robot.
02:24:36.540 | (laughing)
02:24:38.380 | - By the way, people listening to this
02:24:40.580 | should probably know that Lee Cronin
02:24:43.300 | leads an amazing group of brilliant people.
02:24:46.940 | He's exceptionally well-published.
02:24:48.820 | He's written a huge number of amazing papers.
02:24:52.220 | Whenever he calls himself stupid,
02:24:55.660 | it is a sign of humility,
02:24:57.940 | and I deeply respect that and appreciate it.
02:25:00.240 | So people listening to this should know
02:25:01.980 | this is a world-class scientist
02:25:04.340 | who doesn't take himself seriously,
02:25:05.620 | which I really appreciate and love.
02:25:07.800 | Anywho, talking about serious science,
02:25:11.740 | we're back to your group rejecting your idea
02:25:15.980 | of chemical robots playing chess via dyes,
02:25:21.980 | so you went to a simpler game of hex.
02:25:24.540 | Okay, so what else?
02:25:26.300 | - The team that did it were brilliant.
02:25:27.540 | I really take, I think they still have PTSD from doing it,
02:25:31.500 | 'cause I said, this is a workshop.
02:25:32.620 | What I'd often do is I have about 60 people on my team,
02:25:36.780 | and occasionally before lockdown,
02:25:38.260 | I would say, I'm a bit bored,
02:25:40.020 | we're gonna have a workshop on something.
02:25:41.660 | Who wants to come?
02:25:42.660 | And then basically about 20 people turn up to my office,
02:25:44.820 | and I say, we're gonna do this mad thing.
02:25:46.700 | And then it would just self-organize,
02:25:49.380 | and some of them would be like, no, I'm not doing this.
02:25:51.140 | And then you get left with the happy dozen.
02:25:54.540 | And what we did is we built this robot,
02:25:56.100 | and doing dye chemistry is really easy.
02:25:57.820 | You can just take two molecules,
02:25:59.060 | react them together, and change color.
02:26:00.900 | And what I wanted to do is have a palette
02:26:03.100 | of different molecules you could react combinatorially
02:26:05.860 | and get different colors.
02:26:07.020 | So you got two robots.
02:26:08.060 | And I went, wouldn't it be cool
02:26:09.460 | if the robots basically shared
02:26:12.100 | the same list of reactions to do,
02:26:14.140 | and they said, oh, and then you could do
02:26:16.500 | a kind of multi-core chemistry.
02:26:18.740 | Like they weren't, so you could have two chemical reactions
02:26:20.540 | going on at once, and they could basically
02:26:22.660 | outsource the problem.
02:26:23.740 | - But they're sharing the same tape.
02:26:25.300 | - Exactly.
02:26:26.140 | - Okay. - So robot one would say,
02:26:27.260 | I'm gonna do experiment one,
02:26:28.980 | and the other robot says, I'll do experiment 100.
02:26:30.900 | And then they cross it off.
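The "shared tape" coordination Lee describes can be sketched in a few lines: two robots pulling experiments from one shared list and crossing each one off so neither repeats the other's work. The names and messages below are made up for illustration; the real robots announced their claims over Twitter.

# Two robots share one list of experiments and cross each entry off as they claim it.

import itertools
import queue

experiments = queue.Queue()
for i in range(1, 11):                       # a lookup table of ten dye experiments
    experiments.put(f"experiment {i}")

def robot(name):
    """Claim the next unclaimed experiment, announce it, then repeat until none remain."""
    while True:
        try:
            job = experiments.get_nowait()   # crossing it off the shared list
        except queue.Empty:
            return
        yield f"{name}: I'm doing {job}"

# Interleave the two robots' announcements, as if they were messaging each other.
for message in itertools.chain.from_iterable(zip(robot("robot 1"), robot("robot 2"))):
    print(message)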
02:26:32.900 | But I wanted to make it--
02:26:33.740 | - That's brilliant, by the way.
02:26:34.820 | - I wanted to make-- - That is genius.
02:26:36.340 | Sorry. - Well, I wanted
02:26:37.540 | to make it groovier.
02:26:38.380 | And I said, look, let's have them competing.
02:26:40.900 | - Yeah. - To make,
02:26:42.020 | so they're playing a game of hex,
02:26:43.740 | and so when the robot does an experiment,
02:26:46.460 | and the more blue the dye,
02:26:48.500 | the more it gets, the higher chance it gets
02:26:51.220 | to make the move it wants on the hex board.
02:26:53.500 | So if it gets a red color, it's like,
02:26:55.380 | it gets down-weighted in the other robot.
02:26:57.740 | And so what the robots could do is they play,
02:26:59.420 | each player move, and 'cause the fitness function
02:27:02.380 | or the optimization function was to make the color blue,
02:27:05.320 | they started to invent reactions
02:27:09.260 | we weren't on the list.
02:27:11.100 | And they did this by not cleaning,
02:27:13.140 | because we made cleaning optional.
02:27:14.740 | So when one robot realized, if it didn't clean its pipes,
02:27:17.420 | it could get blue more quickly.
02:27:18.820 | - Yeah. - And the other robot
02:27:19.860 | realized that, so it was like getting dirty as well.
02:27:21.700 | And they-- (laughing)
02:27:23.980 | - Unintended consequences of super intelligence.
02:27:26.380 | Okay, but-- - That was the game.
02:27:28.340 | And we-- - But communicating
02:27:29.620 | through Twitter, though.
02:27:30.460 | - They were doing it through Twitter,
02:27:31.780 | and Twitter banned them a couple of times.
02:27:33.300 | I said, come on, you've got a couple of robots
02:27:34.780 | doing chemistry, it's really cool.
02:27:36.060 | Stop banning them. - Yeah.
02:27:37.220 | - But in the end, we had to take them off Twitter,
02:27:39.420 | and they just communicated via a server.
02:27:41.180 | 'Cause it was just, there were people saying,
02:27:43.140 | you can still find it, Cronin Lab 1
02:27:44.940 | and Cronin Lab 2 on Twitter.
02:27:46.700 | But it was like, make move, wait,
02:27:49.140 | mix A and B, wait 10 seconds, answer blue.
02:27:52.740 | - I really find it super compelling
02:27:56.140 | that you would have a chemical entity
02:27:59.700 | that's communicating with the world.
02:28:03.180 | - That was one of the things I wanna do
02:28:04.020 | in my origin of life reaction, right?
02:28:05.940 | Is basically have a reactor
02:28:10.220 | that's basically just randomly enumerating
02:28:12.020 | through chemical space and have some kind of cycle.
02:28:14.780 | And then read out what the molecule's reading out
02:28:17.220 | using a mass spectrometer, and then convert that to text,
02:28:20.820 | and publish it on Twitter,
02:28:22.420 | and then wait until it says I'm alive.
02:28:24.300 | (laughing)
02:28:25.860 | I reckon that Twitter account would get a lot of followers.
02:28:28.860 | - Yeah. - And I'm still trying
02:28:29.700 | to convince my group that we should just make
02:28:31.060 | an origin of life Twitter account,
02:28:32.700 | where it's going, and it's like, hello, testing, I'm here.
02:28:37.340 | - Well, I'll share it, I like it.
02:28:39.140 | I particularly enjoy this idea.
02:28:42.980 | Of a non-human entity communicating with the world
02:28:46.700 | via a human-designed social network.
02:28:48.860 | It's quite a beautiful idea.
02:28:51.500 | So, we were talking about CAs existing
02:28:59.220 | outside the human mind.
02:29:00.260 | - Yeah, so I really admire Stephen Wolfram.
02:29:02.620 | I think he's a genius, clearly a genius.
02:29:05.420 | And trapped is actually,
02:29:06.420 | it's like a problem with being so smart,
02:29:08.300 | is you get trapped in your own mind, right?
02:29:11.580 | And I tried to actually, I tried to convince Stephen
02:29:14.580 | that assembly theory wasn't nonsense.
02:29:15.860 | He was like, no, it's just nonsense.
02:29:17.820 | I was a little bit sad by that.
02:29:18.660 | - So he called it nonsense, even if it applied
02:29:20.900 | to the simplest construct of a one-dimensional
02:29:23.980 | cellular automata, for example?
02:29:25.020 | - Yeah, yeah, but I mean, actually,
02:29:26.260 | maybe I'm doing myself a bit too down.
02:29:28.300 | It was just as a theory was coming through,
02:29:30.980 | and I didn't really know how to explain it.
02:29:33.220 | But we are gonna use assembly theory and CAs
02:29:35.900 | in cellular automata, but I wanted to,
02:29:38.660 | what I was really curious about
02:29:40.020 | is why people marvel, I mean, you marvel at CAs
02:29:43.820 | and their complexity, and I said, well,
02:29:45.540 | hang on, that complexity's baked in,
02:29:47.180 | because if you play the game of life in a CA,
02:29:50.500 | you have to run it on a computer.
02:29:52.700 | You have to have a, you have to do a number of operations,
02:29:55.660 | put in the boundary conditions.
02:29:57.180 | So is it surprising that you get this structure out?
02:29:59.860 | Is it manufactured by the boundary conditions?
02:30:02.900 | And it is interesting, because I think,
02:30:06.300 | a cellular automata, running them,
02:30:09.380 | is teaching me something about what real numbers are and aren't.
02:30:14.020 | And I haven't quite got there yet.
02:30:15.140 | I was playing on the airplane coming over,
02:30:16.820 | and just realized I have no idea
02:30:18.260 | what real numbers are, really.
02:30:20.060 | And I was like, well, I do actually have some notion
02:30:21.820 | of what real numbers are, and I think
02:30:26.260 | thinking about real numbers as functions,
02:30:28.860 | rather than numbers, is more appropriate.
02:30:31.020 | And then, if you then apply that to CAs,
02:30:33.580 | then you're saying, well, actually,
02:30:36.140 | why am I seeing this complexity in this rule?
02:30:39.340 | You've got this deterministic system,
02:30:44.660 | and yet you get this incredible structure coming out.
02:30:48.060 | Well, isn't that what you'd get with any real number,
02:30:51.300 | as you apply it as a function?
02:30:54.300 | And you're trying to read it out to an arbitrary position?
02:30:57.180 | And I wonder if CAs are just helping me,
02:31:00.260 | well, my misunderstanding of CAs
02:31:02.300 | might be helping me understand them
02:31:03.540 | in terms of real numbers.
02:31:04.380 | I don't know what you think.
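One way to make "a real number is a function, not a finished object" concrete is a procedure that hands you the expansion of, say, the square root of 2 out to whatever position you ask for. The sketch below, using Python's integer square root, is only an illustration of that framing, not anything the speakers propose.

# Represent sqrt(2) by a procedure that yields its decimal expansion to any depth,
# using exact integer arithmetic. You only "see more" of the number by running it longer.

from math import isqrt

def sqrt2_digits(n_digits):
    """Return sqrt(2) as a string with n_digits digits after the decimal point."""
    scaled = isqrt(2 * 10 ** (2 * n_digits))    # floor(sqrt(2) * 10**n_digits)
    s = str(scaled)
    return s[0] + "." + s[1:]

print(sqrt2_digits(10))   # 1.4142135623
print(sqrt2_digits(40))   # read it out to an arbitrary position, like running a CA longer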
02:31:05.220 | - Yeah, well, the function,
02:31:06.460 | but the devil's in the function.
02:31:10.740 | Like, which is the function
02:31:13.380 | that's generating your real number?
02:31:16.180 | It seems like it's very important,
02:31:20.780 | the specific algorithm of that function,
02:31:22.660 | 'cause some lead to something super trivial,
02:31:25.140 | some lead to something that's all chaotic,
02:31:27.260 | and some lead to things that just walk that
02:31:31.820 | fine line of complexity and structure.
02:31:35.380 | - I think we agree.
02:31:36.220 | So let's take it back a second.
02:31:37.180 | So take the logistic map or something, logistic equation,
02:31:40.780 | where you have this equation,
02:31:42.380 | which is you don't know what's gonna happen to N plus one,
02:31:46.540 | but once you've done N plus one, you know, full time.
02:31:48.900 | You can't predict it.
02:31:50.380 | For me, CAs and logistic equation feel similar.
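The logistic map Lee refers to is the textbook x(n+1) = r * x(n) * (1 - x(n)). A few lines show the point: the update rule is trivial to apply, yet two almost identical starting points quickly diverge, so you only know the trajectory by actually iterating it. Parameter values here are illustrative.

# Logistic map: a trivial deterministic rule whose long-run trajectory you can
# only discover by running it. Two starts differing by one part in a million diverge.

def logistic_trajectory(x0, r=3.99, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)   # a one-in-a-million change in the start
for n in (0, 10, 20, 30):
    print(n, round(a[n], 6), round(b[n], 6))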
02:31:53.620 | And I think what's incredibly interesting,
02:31:57.380 | and I share your kind of wonder at running a CA,
02:32:02.380 | but also I'm saying,
02:32:04.940 | well, what is it about the boundary conditions
02:32:06.660 | and the way I'm running that calculation?
02:32:08.620 | So in my group, with my team,
02:32:10.220 | we actually made a chemical CA.
02:32:12.060 | We made game of life.
02:32:13.100 | We actually made a physical grid.
02:32:15.140 | I haven't been able to publish this paper.
02:32:16.460 | It's been trapped in purgatory for a long time.
02:32:18.860 | - So you wrote it up as a paper,
02:32:20.180 | how to do a chemical formulation of the game of life,
02:32:23.060 | which is like-- - We made a chemical computer
02:32:24.500 | and little cells.
02:32:25.740 | - And I was playing game of life.
02:32:26.940 | With a BZ reaction,
02:32:28.220 | so each cell would pulse on and off, on and off, on and off.
02:32:31.260 | We have little stirrer bars and we have little gates.
02:32:34.260 | And we actually played Conway's game of life in there.
02:32:37.300 | And we got structures in that.
02:32:38.860 | We got structures in that game from the chemistry
02:32:41.900 | that you wouldn't expect from the actual CA.
02:32:44.300 | So that was kind of cool in that--
02:32:46.500 | - 'Cause they were interacting outside of the cells now or--
02:32:50.420 | - So what's happening is you're getting noise.
02:32:52.180 | So the thing is that you've got this BZ reaction
02:32:55.420 | that goes on, off, on, off, on, off.
02:32:56.900 | But there's also a wake.
02:32:58.820 | And those wakes constructively interfere
02:33:00.980 | or in such a non-trivial way that's non-deterministic.
02:33:05.980 | And the non-determinism in the system
02:33:11.040 | gives very rich dynamics.
02:33:12.740 | And I was wondering if I could physically
02:33:14.020 | make a chemical computer with this CA
02:33:16.980 | that gives me something different
02:33:20.140 | that I can't get in a silicon representation of a CA.
02:33:25.380 | Where all the states are clean.
02:33:27.100 | 'Cause you don't have the noise trailing
02:33:29.420 | into the next round.
02:33:30.660 | You just have the state.
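As a caricature of that "wake", here is a Game of Life variant in which a decaying residue of each cell's past activity perturbs the neighbour count, so the same rules can give different runs. It is a toy sketch only, not the unpublished chemical experiment; the leak parameter and the random perturbation are assumptions made purely for illustration.

# Conway-style rules, plus a decaying "wake" from earlier activity that leaks
# (with a little randomness) into the next neighbour count. Purely illustrative.

import random

def step(grid, wake, leak=0.3):
    n = len(grid)
    new_grid = [[0] * n for _ in range(n)]
    new_wake = [[0.0] * n for _ in range(n)]
    for r in range(n):
        for c in range(n):
            neighbours = sum(grid[(r + dr) % n][(c + dc) % n]
                             for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                             if (dr, dc) != (0, 0))
            effective = neighbours + wake[r][c] * random.random()   # the wake perturbs the rule
            if grid[r][c] == 1:
                new_grid[r][c] = 1 if 2 <= effective <= 3 else 0
            else:
                new_grid[r][c] = 1 if round(effective) == 3 else 0
            new_wake[r][c] = leak * (wake[r][c] + grid[r][c])
    return new_grid, new_wake

n = 8
grid = [[random.randint(0, 1) for _ in range(n)] for _ in range(n)]
wake = [[0.0] * n for _ in range(n)]
for _ in range(5):
    grid, wake = step(grid, wake)
    print(*["".join("#" if x else "." for x in row) for row in grid], sep="\n")
    print()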
02:33:32.100 | - So the paper in particular.
02:33:33.740 | So it's just a beautiful idea to use a chemical computer
02:33:37.020 | to construct a cellular automata
02:33:38.980 | and the famous one of game of life.
02:33:41.260 | But it's also interesting.
02:33:42.300 | And it's a really interesting scientific question
02:33:45.780 | of whether some kind of random perturbations
02:33:48.380 | or some source of randomness can have a,
02:33:55.060 | significant constructive effect
02:33:58.940 | on the complexity of the system.
02:34:00.300 | - And indeed, I mean, whether it's random
02:34:03.100 | or just non-deterministic.
02:34:04.700 | And can we bake in that non-determinism at the beginning?
02:34:08.140 | You know, I wonder what is the,
02:34:10.540 | I'm trying to think about what is the encoding space.
02:34:12.500 | The encoding space is pretty big.
02:34:14.260 | We have 49 stirrers, so 49 cells, 49 chem bits,
02:34:19.260 | all connected to one another in like an analog computer
02:34:23.700 | but being read out discretely as the BZ reaction.
02:34:26.980 | So just to say the BZ reaction is a chemical oscillator.
02:34:30.260 | And what happened in each cell
02:34:31.460 | is it goes between red and blue.
02:34:33.100 | So two Russians discovered it, Belousov and Zhabotinsky.
02:34:36.380 | I think Belousov first proposed it
02:34:38.460 | and everyone said, "You're crazy, it breaks the second law."
02:34:40.740 | And Zhabotinsky said, "No, it doesn't break the second law.
02:34:42.740 | It's consuming a fuel."
02:34:44.160 | And so, and then, and it's like,
02:34:47.580 | there's a lot of chemistry hidden
02:34:50.340 | in the Russian literature actually.
02:34:51.980 | That just because Russians just wrote it in Russian,
02:34:54.740 | they didn't publish it in English-speaking journals.
02:34:56.580 | - It's heartbreaking actually.
02:34:57.580 | - Well, yeah, sad and it's great that it's there, right?
02:35:01.700 | It's not lost.
02:35:02.540 | I'm sure we will find a way of translating it properly.
02:35:05.100 | - Well, the silver lining slash greater sadness
02:35:08.780 | of all of this is there's probably ideas
02:35:12.900 | in English-speaking science, too.
02:35:12.900 | Like there's ideas in certain disciplines
02:35:16.220 | that if discovered by other disciplines
02:35:20.140 | would crack open some of the biggest mysteries
02:35:22.380 | in those disciplines.
02:35:23.620 | Like computer science, for example,
02:35:26.360 | is trying to solve problems
02:35:30.120 | like nobody else has ever tried to solve problems.
02:35:32.860 | As if it's not already been all addressed
02:35:35.500 | in cognitive science and psychology and mathematics
02:35:38.500 | and physics and just whatever you want to, economics even.
02:35:42.860 | But if you look into that literature,
02:35:44.660 | you might be able to discover some beautiful ideas.
02:35:47.620 | - Obviously Russian is an interesting case of that
02:35:51.220 | because there's a loss in translation.
02:35:53.460 | But you said there's a source of fuel, a source of energy.
02:35:57.420 | - Yeah, yeah, so the BZ reaction,
02:35:59.180 | you have an acid in there called malonic acid.
02:36:02.500 | And what happens is it,
02:36:03.580 | it's basically like a battery that powers it
02:36:08.420 | and it loses CO2, so decarboxylates.
02:36:10.660 | It's just a chemical reaction.
02:36:12.020 | What that means we have to do is continuously feed
02:36:14.460 | or we just keep the BZ reaction going
02:36:16.500 | in a long enough time so it's like it's reversible in time.
02:36:21.500 | (laughs)
02:36:22.660 | - But only like.
02:36:23.900 | - But only like.
02:36:25.020 | But it's fascinating.
02:36:26.340 | I mean, the team that did it,
02:36:27.660 | I'm really proud of their persistence.
02:36:29.660 | We made a chemical computer.
02:36:32.220 | It can solve little problems.
02:36:34.420 | It can solve traveling salesman problems actually.
02:36:36.340 | - Nice.
02:36:37.820 | - But like I say, it's--
02:36:38.660 | - But not any faster than the regular computer.
02:36:41.020 | Is there something you could do?
02:36:43.180 | - Maybe.
02:36:44.020 | I'm not sure.
02:36:46.220 | I think we can come up with a way of solving problems,
02:36:49.780 | also really complex, hard ones,
02:36:52.100 | 'cause it's an analog computer.
02:36:55.780 | It can energy minimize really quickly.
02:36:59.620 | It doesn't have to basically go through
02:37:00.780 | every element in the matrix.
02:37:02.620 | Like flip it, it just reads out.
02:37:05.460 | So we could actually do Monte Carlo
02:37:07.940 | by just shaking the box.
02:37:10.620 | It's literally a box shaker.
02:37:12.100 | You don't actually have to encode the shaking of the box
02:37:14.900 | in a silicon memory and then just shuffle everything around.
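On silicon, "shaking the box" has to be emulated as a Monte Carlo search. A standard simulated-annealing sketch on a small travelling-salesman instance shows what the chemical computer would be doing physically; the instance, cooling schedule, and parameters below are assumptions for illustration, not the group's actual solver.

# Monte Carlo / simulated annealing on a small random travelling salesman instance:
# the digital equivalent of physically shaking the box and letting it settle.

import math
import random

cities = [(random.random(), random.random()) for _ in range(12)]

def tour_length(order):
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

order = list(range(len(cities)))
temperature = 1.0
for _ in range(20000):
    i, j = sorted(random.sample(range(len(cities)), 2))
    candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]   # reverse a segment
    delta = tour_length(candidate) - tour_length(order)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        order = candidate
    temperature *= 0.9995                                          # slowly stop shaking
print("tour length after shaking:", round(tour_length(order), 3))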
02:37:17.500 | Yeah, and you--
02:37:18.340 | - It's analog, it's natural.
02:37:19.660 | So it's an organic computer.
02:37:23.460 | - Yeah, yeah.
02:37:24.300 | So I was playing around with this
02:37:25.900 | and I was kind of annoying some of my colleagues
02:37:27.500 | and wondering if we could get to chemical supremacy,
02:37:29.820 | like quantum supremacy.
02:37:31.500 | And I kind of calculated how big the grid has to be
02:37:36.020 | so we can actually start to solve problems
02:37:37.820 | faster than a silicon computer.
02:37:40.060 | But I'm not willing to state how big that is yet
02:37:43.300 | 'cause I'm probably wrong.
02:37:44.780 | It's not that it's any top secret thing.
02:37:46.980 | It's I think I can make a chemical computer
02:37:49.460 | that can solve optimization problems
02:37:51.420 | faster than a silicon computer.
02:37:53.060 | - That's fascinating.
02:37:55.580 | But then you're unsure how big that has to be.
02:37:57.860 | - Yeah, I think, I mean--
02:37:58.700 | - It might be a big box, hard to shake.
02:38:00.900 | - It might be exactly a big box, hard to shake
02:38:03.060 | and basically a bit sloppy.
02:38:05.180 | - Did we answer the question about
02:38:08.220 | whether cellular automata exist outside the mind?
02:38:11.260 | - We didn't, but I would posit that they don't
02:38:14.580 | and I, but I think minds can, well--
02:38:17.740 | - So the mind is fundamental.
02:38:19.860 | What's the, why?
02:38:21.340 | - Well, I mean, sorry, let's go to the back.
02:38:23.260 | So as a physical phenomena,
02:38:25.020 | do CAs exist in physical reality, right?
02:38:28.540 | I would say they probably don't exist
02:38:30.820 | outside the human mind,
02:38:32.020 | but now I've constructed them,
02:38:32.980 | they exist in computer memories,
02:38:34.700 | they exist in my lab, they exist on paper.
02:38:36.980 | So they are, they emerge from the human mind.
02:38:40.700 | I'm just interested in,
02:38:41.940 | because Stephen Wolfram likes CAs,
02:38:43.900 | a lot of people like CAs,
02:38:45.020 | and likes to think of them
02:38:46.380 | as minimal computational elements.
02:38:49.380 | I'm just saying, well, do they exist in reality
02:38:52.020 | or are they a representation of a simple machine
02:38:54.300 | that's just very elegant to implement?
02:38:56.220 | - So it's a platonic question, I guess.
02:38:59.060 | - Yeah. - I mean, it's,
02:39:00.260 | there's initial conditions,
02:39:03.180 | there's a memory in the system,
02:39:05.340 | there are simple rules that dictate
02:39:07.380 | the evolution of the system.
02:39:09.100 | So what exists?
02:39:10.980 | The idea, the rules, the--
02:39:12.820 | - Yeah, people are using CAs
02:39:14.860 | as models for things in reality
02:39:16.860 | to say, hey, look, you can do this thing in a CA.
02:39:21.100 | When I see this, I'm saying, oh, that's cool,
02:39:24.540 | but what does that tell me about reality?
02:39:26.140 | Where's the CA in space time?
02:39:27.620 | - Oh, I see.
02:39:28.460 | Well, right, it's a mathematical object.
02:39:30.540 | So for people who don't know cellular automata,
02:39:33.060 | there's usually a grid,
02:39:34.660 | whether it's one-dimensional, two-dimensional,
02:39:36.140 | or three-dimensional,
02:39:36.980 | and it evolves by simple local rules,
02:39:39.460 | like you die or are born
02:39:41.020 | depending on whether the neighbors are alive or dead,
02:39:44.420 | and it turns out if you have,
02:39:46.780 | with certain kinds of initial conditions
02:39:51.740 | and with certain kinds of very simple rules,
02:39:53.380 | you can create arbitrarily complex and beautiful systems.
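For concreteness, here is the smallest version of that picture: a one-dimensional elementary cellular automaton of the kind Wolfram catalogued, with rule 110 chosen as an example. A three-cell lookup table produces strikingly structured output from a single live cell; the width and step count are arbitrary.

# Elementary 1-D cellular automaton: each cell looks at itself and its two
# neighbours, and a lookup table derived from the rule number decides the next row.

def run_elementary_ca(rule=110, width=64, steps=32):
    table = {tuple(int(b) for b in f"{i:03b}"): (rule >> i) & 1 for i in range(8)}
    row = [0] * width
    row[width // 2] = 1                       # a single live cell in the middle
    for _ in range(steps):
        print("".join("#" if c else "." for c in row))
        row = [table[(row[(i - 1) % width], row[i], row[(i + 1) % width])]
               for i in range(width)]

run_elementary_ca()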
02:39:57.660 | And to me, whether drugs are involved or not,
02:40:02.660 | I can sit back for hours and enjoy the mystery of it,
02:40:09.580 | how such complexity can emerge.
02:40:12.300 | It gives me almost like,
02:40:14.740 | people talk about religious experiences.
02:40:17.340 | It gives me a sense that you get to have a glimpse
02:40:22.340 | at the origin of this whole thing.
02:40:26.860 | Whatever is creating this complexity from such simplicity
02:40:31.860 | is the very thing that brought my mind to life,
02:40:38.060 | this me, the human, our human civilization.
02:40:42.940 | And yes, those constructs are pretty trivial.
02:40:47.940 | I mean, that's part of their magic,
02:40:50.740 | is even in this trivial framework,
02:40:54.300 | you could see the emergence,
02:40:56.500 | or especially in this trivial framework,
02:40:58.800 | you could see the emergence of complexity from simplicity.
02:41:01.820 | I guess what, Lee, you're saying is that this is not,
02:41:05.980 | you know, this is highly unlike systems
02:41:10.980 | we see in the physical world,
02:41:13.060 | even though they probably carry some of the same magic,
02:41:17.300 | like mechanistically.
02:41:18.620 | - I mean, I'm saying that the operating system
02:41:22.380 | that a CA has to exist on is quite complex.
02:41:26.140 | And so I wonder if you're getting the complexity
02:41:28.940 | out of the CA from the boundary conditions
02:41:30.500 | of the operating system, the underlying digital computer.
02:41:33.580 | - Oh, wow, those are some strong words.
02:41:35.940 | Against CAs, then, I didn't realize--
02:41:37.140 | - Not against, I mean, I'm in love with CAs as well.
02:41:40.660 | I'm just saying they aren't as trivial as people think.
02:41:44.500 | They are incredible.
02:41:45.900 | To get to that richness,
02:41:47.220 | you have to iterate billions of times.
02:41:51.020 | And you need a display, and you need a math coprocessor,
02:41:54.940 | and you need a von Neumann machine
02:41:57.700 | based on a Turing machine
02:41:58.940 | with digital error correction and states.
02:42:01.580 | - Wow, to think that for the simplicity of a grid,
02:42:05.780 | you're basically saying a grid is not simple.
02:42:08.820 | - Yeah.
02:42:09.780 | - It requires incredible complexity
02:42:11.700 | to bring a grid to life.
02:42:12.820 | - Yeah.
02:42:13.660 | Yeah, that's--
02:42:16.100 | - What is simple?
02:42:17.380 | - That's all I wanted to say.
02:42:18.420 | I agree with you with the wonder of CAs.
02:42:19.980 | I just think, but remember, we take so much for granted
02:42:22.980 | what the CA is resting on.
02:42:24.820 | 'Cause von Neumann and Feynman weren't showing,
02:42:28.340 | weren't seeing these elaborate structures.
02:42:29.980 | They could not get that far.
02:42:31.460 | - Yeah, but that's the limitation of their mind.
02:42:34.820 | - Yeah, yeah, exactly, the limitation of their pencil.
02:42:37.420 | - But I think that's, the question is
02:42:40.700 | whether the essential elements of the cellular automata
02:42:44.180 | is present without all the complexities required
02:42:51.340 | to build a computer.
02:42:52.460 | And my intuition, the reason I find it incredible
02:42:56.100 | is that, yeah, my intuition is yes.
02:42:58.340 | It might look different.
02:43:00.860 | There might not be a grid-like structure,
02:43:05.100 | but local interactions operating under simple rules
02:43:09.740 | and resulting in multi-hierarchical complex structures
02:43:14.540 | feels like a thing that doesn't require a computer.
02:43:17.940 | - I agree, but coming back to von Neumann and Feynman
02:43:20.980 | and Wolfram, their minds, the non-trivial minds,
02:43:24.820 | to create those architectures and do it
02:43:26.540 | and to put on those state transitions
02:43:30.740 | and I think that's something
02:43:32.060 | that's really incredibly interesting,
02:43:34.520 | that is understanding how the human mind
02:43:38.080 | builds those state transition machines.
02:43:40.720 | - I could see how deeply in love
02:43:43.380 | with the idea of memory you are.
02:43:45.020 | So it's like how much of E equals MC squared
02:43:49.700 | is more than an equation?
02:43:53.860 | It has Albert Einstein in it.
02:43:56.460 | Like you're saying, you can't just say
02:43:59.500 | this is like the equations of physics
02:44:03.940 | are a really good simple capture of a physical phenomena.
02:44:08.940 | It is also, that equation has the memory of the humans.
02:44:15.020 | - Absolutely, absolutely, yeah.
02:44:17.680 | - But I don't, I don't know if you're implying this,
02:44:21.460 | I don't, that's a beautiful idea,
02:44:25.060 | but I don't know if I'm comfortable
02:44:27.140 | with that sort of diminishing the power of that equation.
02:44:31.060 | - No, no, it enhances it.
02:44:31.900 | - Because it's built on the shoulders, it enhances it.
02:44:33.620 | - I think it enhances it.
02:44:34.460 | It's not, that equation is a minimal compressed
02:44:36.940 | representation of reality, right?
02:44:39.340 | We can use machine learning or Max Tegmark's AI Feynman
02:44:43.440 | to find lots of solutions for gravity,
02:44:45.100 | but isn't it wonderful that the laws that we do find
02:44:47.340 | are the maximally compressed representations?
02:44:49.620 | - Yeah, but that representation, you can now give it,
02:44:54.500 | I guess the universe has the memory of Einstein
02:44:56.620 | with that representation, but then you can now give it
02:44:59.500 | as a gift for free to other alien civilizations.
02:45:00.900 | - Yeah, yeah, it's low memory.
02:45:02.820 | Einstein had to go through a lot of pain
02:45:03.980 | to get that, but it's low memory.
02:45:04.940 | So I say that physics and chemistry and biology
02:45:07.300 | are the same discipline.
02:45:08.780 | They're just physics, laws in physics,
02:45:11.820 | there's no such thing as a law in physics,
02:45:13.380 | it's just low memory stuff.
02:45:15.600 | Because you've got low memory stuff,
02:45:16.940 | you can, things reoccur quickly.
02:45:19.820 | As you get building more memory, you get to chemistry,
02:45:22.120 | so things become more contingent.
02:45:24.040 | When you get to biology, more contingent still,
02:45:26.220 | and then technology.
02:45:27.420 | So the more memory you need, the more your laws are local.
02:45:31.500 | That's all I'm saying, in that the less memory,
02:45:34.340 | the more the laws are universal, because they're not laws,
02:45:36.340 | they are just low memory states.
02:45:37.940 | - We have to talk about a thing you've kind of mentioned
02:45:44.060 | already a bunch of times, but doing computation
02:45:48.420 | through chemistry, chemical-based computation.
02:45:51.860 | I've seen you refer to it as, in a sexy title,
02:45:56.860 | of chemputation, chemputation.
02:46:00.940 | So what is chemputation,
02:46:02.620 | and what is chemical-based computation?
02:46:06.180 | - Okay, so chemputation is a name I gave
02:46:09.700 | to the process of building a state machine
02:46:12.980 | to make any molecule physically in the lab.
02:46:15.900 | And so, as a chemist, chemists make molecules by hand.
02:46:21.660 | And they're quite hard,
02:46:23.180 | chemists have a lot of tacit knowledge, a lot of ambiguity.
02:46:26.860 | It's not possible to go uniformly to the literature
02:46:29.740 | and read a recipe to make a molecule,
02:46:32.460 | and then go and make it in the lab every time.
02:46:35.420 | Some recipes are better than others,
02:46:37.460 | but they all assume some knowledge.
02:46:42.460 | And it's not universal what that is.
02:46:44.900 | - Like, so it's carried from human to human,
02:46:49.620 | some of that implicit knowledge.
02:46:51.100 | And you're saying, can we remove the human from the picture?
02:46:53.180 | Can we, like, program?
02:46:55.700 | What, by the way, what is a state machine?
02:46:58.660 | - So a state machine is a, I suppose,
02:47:01.820 | a object, either abstract or mechanical,
02:47:05.740 | where you can do a unit operation on it
02:47:10.580 | and flick it from one state to another.
02:47:12.300 | So a turnstile would be a good example of a state machine.
02:47:15.940 | - There's some kinds of states
02:47:17.380 | and some kind of transitions between states,
02:47:19.660 | and it's very formal in nature
02:47:23.620 | in terms of like, it's precise how you do those transitions.
02:47:26.140 | - Yes, and you can mathematically, precisely,
02:47:28.100 | describe a state machine.
02:47:29.420 | So, I mean, you know, a very simple Boolean gates
02:47:33.140 | are a very good way of building
02:47:35.420 | kind of logic-based state machines.
02:47:37.500 | Obviously, a Turing machine,
02:47:39.900 | the concept of a Turing machine
02:47:41.460 | where you have a tape and a read head
02:47:43.500 | and a series of rules in a table,
02:47:46.180 | and you would basically look at what's on the tape
02:47:49.380 | and if you're shifting the tape from left to right,
02:47:51.620 | and if you see a zero or a one,
02:47:53.500 | you look in your lookup table and say,
02:47:54.940 | "Right, I've seen a zero and a one."
02:47:57.420 | I then do, I then respond to that.
02:48:01.700 | So the turnstile would be,
02:48:03.700 | is there a human being pushing the turnstile
02:48:06.660 | in direction clockwise?
02:48:09.260 | If yes, I will open, let them go.
02:48:11.380 | If it's anti-clockwise, no.
02:48:12.980 | So yeah, so a state machine has some labels
02:48:15.100 | and a transition diagram.
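Written out, the turnstile from Lee's example is just a transition table over a couple of states. The state and input names in the sketch below are made up purely for illustration.

# The turnstile as an explicit state machine: states, inputs, and a transition table.

TRANSITIONS = {
    ("locked", "push_clockwise"):     "open",    # pushing the right way opens it
    ("locked", "push_anticlockwise"): "locked",
    ("open",   "walk_through"):       "locked",  # it relocks behind you
    ("open",   "push_anticlockwise"): "open",
}

def run(machine, start, inputs):
    state = start
    for symbol in inputs:
        state = machine.get((state, symbol), state)   # unknown inputs leave the state alone
        print(f"{symbol:20s} -> {state}")
    return state

run(TRANSITIONS, "locked", ["push_anticlockwise", "push_clockwise", "walk_through"])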
02:48:17.900 | - So you're looking to come up with a chemical computer
02:48:22.140 | to form state machines to create molecules?
02:48:25.580 | Or what's the chicken and the egg?
02:48:29.340 | - So computation is not a chemical computer,
02:48:31.500 | 'cause we talked a few minutes
02:48:32.380 | about actually doing computations with chemicals.
02:48:35.260 | What I'm now saying is I want to use state machines
02:48:37.500 | to transform chemicals.
02:48:39.500 | - So build chemicals programmatically.
02:48:42.700 | - Yeah, I mean, I get in trouble saying this.
02:48:44.460 | I said to my group,
02:48:46.220 | oh, I shouldn't say it 'cause it's,
02:48:47.780 | but I said, look, we should make the crack bot,
02:48:49.580 | is it in the crack robot?
02:48:51.180 | The robot that makes crack. - The crack bot?
02:48:52.740 | - The robot that- - Oh, oh, oh, crack bot.
02:48:55.220 | - The robot that makes crack,
02:48:56.420 | but maybe we should scrub this from, but-
02:48:58.940 | - No, or, well, so maybe you can educate me
02:49:03.100 | on breaking bad with like math, right?
02:49:06.140 | - Yeah, so in breaking bad-
02:49:07.580 | - You wanna make basically some kind of mix
02:49:12.100 | of ex machina and breaking bad.
02:49:15.380 | - No, I don't, I don't, for the record, I don't,
02:49:17.220 | but I've said- - You don't.
02:49:18.220 | I said that's what I'm going to do
02:49:19.860 | once you release the papers.
02:49:21.540 | But I shaved my head and I'm going to live a life of crime.
02:49:28.660 | Anyway, I'm sorry.
02:49:29.740 | - No, no, so yeah, let's get back to,
02:49:32.140 | so indeed, it is about making drugs,
02:49:34.460 | but importantly, making important drugs.
02:49:36.660 | - All drugs matter.
02:49:39.860 | - Yeah, but let's go back.
02:49:41.500 | So the basic thesis is chemistry is very analog.
02:49:46.020 | There is no state machine.
02:49:47.340 | And I wandered into the, through the paper walls
02:49:52.780 | in the Japanese house a few years ago and said,
02:49:54.900 | "Okay, hey, organic chemist, why are you doing this analog?"
02:49:58.140 | They said, "Well, chemistry is really hard.
02:50:00.380 | You can't automate it, it's impossible."
02:50:03.340 | I said, "But is it impossible?"
02:50:05.260 | They said, "Yeah."
02:50:06.100 | And they said, you know, I got the impression
02:50:07.660 | they're saying it's magic.
02:50:09.540 | And so when people tell me things are magic,
02:50:11.780 | it's like, no, no, they can't be magic, right?
02:50:14.340 | So let's break this down.
02:50:15.660 | And so what I did is I went to my group one day
02:50:19.540 | about eight years ago and said,
02:50:21.100 | "Hey guys, I've written this new programming language
02:50:23.020 | for you."
02:50:24.380 | And so everything is clear.
02:50:26.660 | And you know, you're not allowed to just wander
02:50:28.940 | around the lab willy nilly.
02:50:30.100 | You have to pick up things in order,
02:50:31.500 | go to the balance at the right time and all this stuff.
02:50:34.260 | And they looked at me as if I was insane
02:50:35.980 | and basically kicked me out of the lab and said,
02:50:37.380 | "No, don't do that.
02:50:38.220 | We're not doing that."
02:50:39.540 | And I said, "Okay."
02:50:40.380 | So I went back the next day and said,
02:50:42.020 | "I'm gonna find some money so we can make cool robots
02:50:45.740 | to do chemical reactions."
02:50:46.780 | And everyone went, "That's cool."
02:50:48.900 | And so in that process-
02:50:51.540 | - So first you try to convert the humans to become robots
02:50:54.140 | and next you agree you might as well just create the robots.
02:50:57.300 | Yes, but so in that, the formalization process.
02:51:00.140 | - Yeah, so what I did is I said, "Look, chemical,
02:51:02.620 | to make a molecule, you need to do four things abstractly.
02:51:04.940 | I want to make a chemical Turing machine
02:51:07.020 | 'cause a Turing machine, you think about,
02:51:08.620 | let's imagine a Turing machine.
02:51:10.060 | Turing machine is the ultimate abstraction of a computation
02:51:14.580 | because it's been shown by Turing and others
02:51:17.900 | that basically a universal Turing machine
02:51:19.940 | should be able to do all computations that you can imagine."
02:51:23.020 | It's like, "Wow, why don't I think of a Turing machine
02:51:25.620 | for chemistry?
02:51:26.460 | Let's think of a magic robot that can make any molecule.
02:51:29.720 | Let's think about that for a second."
02:51:31.500 | "Okay, great.
02:51:32.420 | How do we then implement it?"
02:51:33.500 | And I think, "So what is the abstraction?"
02:51:35.740 | So to make any molecule, you have to do a reaction.
02:51:39.260 | So you have to put reagents together,
02:51:40.380 | do a reaction in a flask, typically.
02:51:43.860 | Then after the reaction, you have to stop the reaction.
02:51:45.860 | So you do what's called a workup.
02:51:47.340 | So whatever, cool it down, add some liquid to it, extract.
02:51:51.540 | So then after you do the workup, you separate.
02:51:53.540 | So you then remove the molecules, separate them all out.
02:51:55.900 | And then the final step is purification.
02:51:58.140 | So reaction, workup, separate, purify.
02:52:01.700 | So this is basically exactly like a Turing machine
02:52:05.900 | where you have your tape head, you have some rules,
02:52:09.420 | and then you run it.
02:52:10.360 | So I thought, "Cool."
02:52:11.760 | I went to all the chemists and said,
02:52:12.740 | "Look, chemistry isn't that hard.
02:52:15.300 | Reaction, workup, separation, purification.
02:52:18.340 | Do that in cycles forever for any molecule,
02:52:20.940 | all the chemistry, done."
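That four-step abstraction can be written as a loop of unit operations. In the sketch below the operation bodies are placeholders standing in for real pumps, valves, and reactors, and the step descriptions are invented; this is a sketch of the abstraction only, not the actual chemputer software.

# Every synthesis as repeated cycles of four unit operations, plus a clean step.

UNIT_OPERATIONS = ["reaction", "workup", "separation", "purification"]

def run_step(operation, step):
    # in the real machine this would drive pumps and valves, then clean the backbone
    print(f"step {step['name']}: {operation} ({step.get(operation, 'default conditions')})")

def synthesise(steps):
    for step in steps:
        for operation in UNIT_OPERATIONS:
            run_step(operation, step)
        print(f"step {step['name']}: clean backbone\n")

synthesise([
    {"name": "1", "reaction": "mix reagent A and B, 60 C, 2 h", "separation": "filter"},
    {"name": "2", "reaction": "add reagent C", "purification": "recrystallise"},
])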
02:52:23.180 | And they said, "Chemistry is that hard."
02:52:26.180 | I said, "But just in principle."
02:52:27.980 | And I got a few very enlightened people to say,
02:52:30.420 | "Yeah, okay, in principle, but it ain't gonna work."
02:52:33.140 | And this was in about 2013, 2014.
02:52:36.620 | And I found myself going to an architecture conference
02:52:39.820 | almost by accident.
02:52:40.820 | It's like, "Why am I at this random conference
02:52:42.740 | on architecture?"
02:52:44.420 | And that was because I published a paper
02:52:45.940 | on inorganic architecture.
02:52:47.860 | And they said, "Come to architecture conference."
02:52:49.420 | But the inorganic architecture is not nano architecture.
02:52:52.380 | And I went, "Okay."
02:52:53.200 | And then I found these guys at the conference,
02:52:54.900 | 3D printing ping pong balls and shapes.
02:52:57.420 | And this is 3D printing was cool.
02:52:58.940 | I was like, "This is ridiculous.
02:53:00.460 | Why are you 3D printing ping pong balls?"
02:53:01.900 | And I gave them a whole load of abuse
02:53:03.420 | like I normally do when I first meet people,
02:53:05.340 | how to win friends and influence people.
02:53:07.260 | And then I was like, "Oh my God, you guys are geniuses."
02:53:10.300 | And so I got from, they were a bit confused
02:53:12.020 | 'cause I was calling them idiots
02:53:13.100 | and then call them geniuses.
02:53:14.140 | It's like, "Will you come to my lab
02:53:16.140 | and we're gonna build a robot to do chemistry
02:53:18.260 | with a 3D printer?"
02:53:19.100 | And I said, "Oh, that's cool, all right."
02:53:20.980 | So I had them come to the lab
02:53:22.340 | and we started to 3D print test tubes.
02:53:24.060 | So you imagine, 3D print a bottle
02:53:26.780 | and then use the same gantry to basically,
02:53:30.500 | rather than to squirt out plastic out of a nozzle,
02:53:34.180 | have a little syringe and get chemicals in.
02:53:36.420 | So we had the 3D printer that could simultaneously
02:53:38.740 | print the test tube and then put chemicals
02:53:40.580 | into the test tube.
02:53:42.100 | And then-
02:53:42.940 | - Wow, so it's really end to end.
02:53:44.500 | - Yeah, I was like, "That'll be cool
02:53:45.580 | 'cause they've got G-code to do it all."
02:53:47.140 | I was like, "That's cool."
02:53:48.780 | So I got my group doing this and I developed it a bit.
02:53:50.820 | And I realized that we could take those unit operations.
02:53:54.660 | And we built a whole bunch of pumps and valves.
02:53:57.180 | And I realized that I could basically take the literature
02:54:00.820 | and I made the first version of the computer in 2016, 17.
02:54:05.820 | I made some architectural decisions.
02:54:07.540 | So I designed the pumps and valves in my group.
02:54:09.420 | I did all the electronics in my group.
02:54:10.780 | They were brilliant.
02:54:12.300 | I cannot pay tribute to my group enough in doing this.
02:54:15.900 | They were just brilliant.
02:54:16.740 | And there were some poor souls there that said,
02:54:18.420 | "Lee, why are you making this design electronics?"
02:54:21.300 | I'm like, "Well, 'cause I don't understand it."
02:54:24.260 | They're like, "So you're making this design stuff
02:54:26.020 | because you don't understand?"
02:54:26.860 | I was like, "Yeah."
02:54:27.700 | It's like, "But can we not just buy some?"
02:54:29.660 | I said, "Well, we can, but then I don't understand
02:54:31.540 | how to, you know, what bus they're gonna use
02:54:34.540 | and the serial ports and all this stuff.
02:54:36.060 | I just wanted..."
02:54:37.180 | And I made, I came up with a decision
02:54:39.020 | to design a bunch of pumps and valves
02:54:41.100 | and use power over ethernet.
02:54:42.620 | So I got one cable for power and data, plug them all in,
02:54:46.340 | plug them all into a router.
02:54:48.100 | And then I made the state machine.
02:54:51.380 | And there was a couple of cool things I did.
02:54:53.660 | Oh, they did actually.
02:54:54.900 | We got the abstraction.
02:54:57.180 | So reaction, workup, separation, purification.
02:55:01.580 | And then I made the decision to do it in batch.
02:55:06.100 | Now it's in batch.
02:55:07.900 | All chemistry had been digitized before, apparently,
02:55:10.660 | once it's been done.
02:55:11.820 | But everyone had been doing it in flow.
02:55:13.940 | And flow is continuous and there are infinities everywhere.
02:55:16.220 | And you have to just...
02:55:17.220 | And I realized that I could actually make a state machine
02:55:20.620 | where I basically put stuff in the reactor,
02:55:23.860 | turn it from one state to another state,
02:55:26.900 | stop it and just read it out.
02:55:28.380 | And okay, and I was kind of bitching
02:55:30.660 | at electrical engineers saying, "You have it easy.
02:55:32.340 | You don't have to clean out the electrons."
02:55:33.660 | You know, electrons don't leave a big mess.
02:55:36.380 | They leave some EM waste.
02:55:37.900 | But in my state machine, I built in cleaning.
02:55:39.780 | So it's like, we do an operation
02:55:41.620 | and then it cleans the backbone and then can do it again.
02:55:43.300 | So there's no... - That's fascinating.
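A minimal sketch of that batch state machine idea, with every name and operation below invented for illustration (this is not the Chemputer's actual control code): each unit operation runs on a shared backbone, and a cleaning step follows every operation so the next step always starts from a known state.

```python
# Illustrative batch state machine: run a unit operation, then clean the
# backbone, so there is no carry-over between states. All names are made up.

class Backbone:
    def __init__(self):
        self.dirty = False

    def run(self, operation, **params):
        print(f"running {operation} with {params}")
        self.dirty = True          # every chemical operation contaminates the lines

    def clean(self):
        print("flushing backbone with solvent")
        self.dirty = False

def execute(procedure):
    backbone = Backbone()
    for operation, params in procedure:
        backbone.run(operation, **params)
        backbone.clean()           # built-in cleaning after each state transition

# The four abstract stages mentioned above: reaction, workup, separation, purification.
procedure = [
    ("reaction",     {"reagents": ["A", "B"], "temp_c": 80, "hours": 2}),
    ("workup",       {"quench": "water"}),
    ("separation",   {"method": "liquid-liquid"}),
    ("purification", {"method": "recrystallisation"}),
]

execute(procedure)
```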
02:55:44.420 | - So what we managed to do over a couple of years
02:55:47.140 | is develop the hardware, develop the state machine.
02:55:50.740 | And we encoded three molecules.
02:55:52.140 | We did three, the first three, we did Nytol,
02:55:54.140 | a sleeping drug, rufinamide, an anticonvulsant, and Viagra.
02:55:57.180 | You know, and I could make jokes on the paper.
02:55:59.540 | It's a hard problem, blah, blah, blah, blah.
02:56:02.140 | - Yeah, that's very good.
02:56:03.900 | - And then in the next one, what we did is said,
02:56:05.780 | "Okay, my poor organic chemist said,
02:56:07.780 | "Look, Lee, we've worked with you this long.
02:56:11.180 | We've made a robot that looks like
02:56:12.860 | it's gonna take our jobs away.
02:56:14.660 | And not just take our jobs away,
02:56:17.820 | that what we love in the lab,
02:56:19.060 | but now we have to become programmers.
02:56:20.820 | But we're not even good programmers.
02:56:22.020 | We just have to spend ages writing lines of code
02:56:25.060 | that are boring and it's not as elegant."
02:56:27.420 | I went, "You're right."
02:56:28.860 | So then, but I knew because I had this abstraction
02:56:32.700 | and I knew that there was language,
02:56:34.940 | I could suddenly develop a state machine
02:56:36.540 | that would interpret the language
02:56:37.700 | which was lossy and ambiguous and populate my abstraction.
02:56:42.080 | So I built a chemical programming language
02:56:44.300 | that is actually gonna be recursively enumerable.
02:56:46.940 | It's gonna be a Turing complete language actually,
02:56:48.580 | which is kind of cool,
02:56:49.420 | which means it's formally verifiable.
02:56:51.500 | So where we are now is we can now read the literature
02:56:55.260 | using a bit of natural language processing.
02:56:56.860 | It's not the best.
02:56:57.700 | There are many other groups have done better job,
02:56:59.500 | but we can use that language reading
02:57:01.220 | to populate the state machine
02:57:03.180 | and basically add, subtract.
02:57:05.860 | We've got a number of primitives
02:57:08.780 | that we basically program into loops
02:57:12.660 | that we dovetail together,
02:57:14.300 | and we can make any molecule with it.
02:57:15.780 | - Okay, so that's the kind of program synthesis.
02:57:18.260 | So you start at like,
02:57:19.460 | literally you're talking about like a paper,
02:57:21.780 | like a scientific paper that's being read
02:57:24.660 | through natural language processing,
02:57:27.060 | extracting some kind of details about chemical reactions
02:57:32.060 | and the chemical molecules and composites involved.
02:57:38.500 | And then that, in GPT terms,
02:57:43.500 | serves as a prompt for the program synthesis,
02:57:47.620 | which is kind of trivial right now.
02:57:49.300 | There you have a bunch of different like for loops
02:57:51.220 | and so on that creates a program in this chemical language
02:57:56.100 | that can then be interpreted by the chemical computer,
02:58:00.140 | the chemputer.
02:58:01.700 | - Yeah, chemputer.
02:58:02.540 | That's the word. - Chemputer, yeah.
02:58:04.260 | Everything sounds better in your British accent,
02:58:07.820 | but I love it.
02:58:09.260 | So into the computer and that's able to then
02:58:12.060 | basically be a 3D printer for these, for molecules.
02:58:17.100 | - Yeah, I wouldn't call it a 3D printer.
02:58:18.420 | I would call it a universal chemical reaction system
02:58:21.020 | because 3D printing gives the wrong impression,
02:58:22.660 | but yeah, and it purifies.
02:58:25.180 | And the nice thing is that that code now,
02:58:27.620 | we call it the CHI-DL code,
02:58:30.500 | is really interesting because now,
02:58:34.520 | so chemputation, what is chemputation?
02:58:36.220 | Chemputation is to chemistry what computation is to mathematics, I think.
02:58:41.220 | Chemputation is the process of taking chemical code
02:58:44.660 | and some input reagents and making the molecule
02:58:49.660 | reproducibly every time without fail.
02:58:52.180 | What is computation?
02:58:53.320 | It's the process of using a program
02:58:55.540 | to take some input conditions
02:58:57.740 | and give you the same output every time, right, reliably.
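To make the analogy concrete, here is a toy sketch (no real chemistry, all names invented) that treats a chemputer run as a deterministic function of the chemical code and the input reagents, the way a program is a deterministic function of its inputs.

```python
# Toy illustration of the analogy only: same code plus same reagents
# should give the same product, every time.

import hashlib

def chemputer_run(chi_dl_code, reagents):
    """Stand-in for a run: deterministically maps code + reagents to a product label."""
    digest = hashlib.sha256((chi_dl_code + "|".join(reagents)).encode()).hexdigest()
    return f"product-{digest[:8]}"

code = "add A to B; reflux 2 h; extract; purify"
print(chemputer_run(code, ("A", "B")))   # same inputs ...
print(chemputer_run(code, ("A", "B")))   # ... same output, reliably
```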
02:59:02.220 | - So the problem is,
02:59:04.500 | now maybe you can push back and correct me on this.
02:59:07.460 | So I know biology is messy.
02:59:09.940 | My question is how messy is chemistry?
02:59:12.380 | So if we use the analogy of a computer,
02:59:15.640 | it's easier to make computation in a computer very precise,
02:59:20.640 | that it's repeatable, it makes errors almost never.
02:59:26.180 | It does it the exact same way over and over and over and over.
02:59:29.540 | What about chemistry?
02:59:30.540 | Is there messiness in the whole thing?
02:59:32.700 | Can that be somehow leveraged?
02:59:33.940 | Can that be controlled?
02:59:34.780 | Can that be removed?
02:59:36.180 | Do we wanna remove it from the system?
02:59:38.260 | - Oh yes and no, right.
02:59:39.700 | Is there messiness?
02:59:40.700 | There is messiness because chemistry is like
02:59:44.980 | you're doing reactions on billions of molecules
02:59:49.220 | and they don't always work.
02:59:50.180 | But you've got purification there.
02:59:52.100 | And so what we found is at the beginning,
02:59:54.460 | everyone said it can't work.
02:59:55.940 | It's gonna be too messy, it'll just fail.
02:59:58.020 | And I said, but you managed to get chemistry
02:59:59.980 | to work in the lab.
03:00:00.820 | Are you magic?
03:00:01.640 | Are you doing something?
03:00:02.660 | So I would say, now go back to the first ever computer
03:00:05.940 | or the ENIAC, 5 million solder joints,
03:00:09.500 | 400,000 valves that are exploding all the time.
03:00:12.800 | Was that, would you have gone, okay, that's messy.
03:00:16.380 | So, have we got the equivalent
03:00:19.300 | of the ENIAC in my lab?
03:00:21.100 | We've got 15 chemputers in the lab now
03:00:23.580 | and they, are they unreliable?
03:00:24.980 | Yeah, they fall apart here and there.
03:00:26.980 | But are they getting better really quickly?
03:00:29.580 | Yeah.
03:00:30.420 | Are they now able to reliably make more?
03:00:32.340 | Are we at the point in the lab
03:00:33.340 | where there are some molecules
03:00:34.820 | we would rather make on the chemputer
03:00:37.460 | than have a human being make?
03:00:38.540 | Yeah, we've just done, we've just made
03:00:40.460 | an anti-influenza molecule, some antivirals,
03:00:45.460 | Arbidol, six steps on the chemputer, that would take a human being
03:00:49.740 | about one week of continuous labor to make.
03:00:54.740 | And all they do now is load up the reagents,
03:00:56.180 | press go button, and just go away and drink coffee.
03:00:58.260 | - Wow.
03:00:59.140 | So this, I mean, and this is,
03:01:01.940 | you're saying this chemputer's just in the early days.
03:01:04.340 | And so like some of the criticism
03:01:06.260 | just has to do with the early days.
03:01:08.020 | And, you know, I would have said that something like this
03:01:13.940 | is quite impossible.
03:01:13.940 | You know, so the fact that you're doing this is incredible.
03:01:17.940 | Not impossible, of course, but extremely difficult.
03:01:20.780 | - It did seem really difficult.
03:01:22.940 | And I do keep pinching myself when I go in the lab.
03:01:25.100 | I was like, is it working?
03:01:25.940 | Like, yep.
03:01:27.300 | It's not, you know, it does clog.
03:01:29.260 | It does stop.
03:01:30.460 | - You gotta clean, this is great.
03:01:32.460 | - You know, but it's getting more reliable
03:01:35.100 | because I made some, we just made design decisions
03:01:37.980 | and said we are not gonna abandon the abstraction.
03:01:40.500 | Think about it, if the von Neumann implementation
03:01:44.780 | was abandoned, I mean, think about what we do
03:01:46.900 | to semiconductors to really constrain them,
03:01:49.540 | to what we do to silicon in a fab lab.
03:01:53.460 | We take computation for granted.
03:01:55.380 | Silicon is not in its natural state.
03:01:57.660 | We are doping the hell out of it.
03:01:59.020 | - It's incredible what they're able to accomplish
03:02:00.900 | and achieve that reliability at the scale they do.
03:02:03.660 | Like you said, that's after Moore's law,
03:02:06.420 | what we have now, and how it started, you know,
03:02:11.420 | now we're here. - So think about it now.
03:02:14.060 | - We started at the bottom, now we're here.
03:02:15.500 | - We only have 20 million molecules,
03:02:17.060 | well, say 20 million molecules in one database,
03:02:19.580 | maybe a few hundred million
03:02:21.260 | in all the pharmaceutical companies.
03:02:23.420 | And those few hundred million molecules
03:02:25.740 | are responsible for all the drugs
03:02:27.460 | that we've had in humanity except, you know,
03:02:29.620 | biologics for the last 50 years.
03:02:32.300 | Now imagine what happens when a drug goes out of print,
03:02:35.500 | goes out of print because there's only a finite number
03:02:37.460 | of manufacturing facilities in the world
03:02:38.740 | that make these drugs.
03:02:39.580 | - Goes out of print. - Yeah.
03:02:40.980 | - It's the printing press for chemistry.
03:02:44.060 | - Yeah, and not only that, we can protect the Chi DL
03:02:47.500 | so we can stop bad actors doing it, we can encrypt them,
03:02:50.620 | and we can give people--
03:02:51.460 | - Chi DL, that's the name, sorry to interrupt,
03:02:52.860 | is the name of the programming language?
03:02:53.940 | - Yeah, the Chi DL is the name of the programming language
03:02:56.740 | and the code we give the chemicals.
03:02:58.540 | So Chi, as in, you know, just for,
03:03:01.980 | it's actually like an XML format,
03:03:04.140 | but I've now taken it from script
03:03:06.460 | to a fully expressible programming language
03:03:10.220 | so we can do dynamics and there's for loops in there
03:03:12.700 | and conditional statements.
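The real CHI-DL is an XML-derived format from the Cronin group, and its actual syntax is not reproduced here. As a hedged illustration of the flavour being described, a procedure built from unit operations, a loop, and a conditional might look like this when rendered as plain Python:

```python
# Illustrative only: not real CHI-DL syntax. A procedure written as a small
# set of unit operations, with a loop and a conditional over an in-line
# measurement. All names and numbers are invented.

def add(vessel, reagent, volume_ml):
    print(f"add {volume_ml} mL {reagent} to {vessel}")

def stir(vessel, minutes):
    print(f"stir {vessel} for {minutes} min")

def reflux(vessel, hours):
    print(f"reflux {vessel} for {hours} h")

def heat(vessel, temp_c, minutes):
    print(f"heat {vessel} to {temp_c} C for {minutes} min")

def measure_yield(vessel):
    return 0.62  # placeholder for an in-line analysis step

reactor = "reactor_1"
add(reactor, "A", 10)
add(reactor, "B", 5)

for _ in range(3):                # a loop over repeated additions
    add(reactor, "catalyst", 0.5)
    stir(reactor, 10)

reflux(reactor, 2)

if measure_yield(reactor) < 0.5:  # a conditional on an in-line measurement
    heat(reactor, 90, 30)         # push the reaction further if the yield is low
```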
03:03:13.820 | - Right, but the structure, it started out as a,
03:03:16.220 | like an XML type of thing. - Yeah, yeah, yeah, yeah.
03:03:19.900 | And now we also, the chemist doesn't need
03:03:21.460 | to program in Chi DL, they can just go to the software
03:03:23.660 | and type in add A to B, reflux,
03:03:26.220 | do what they would normally do,
03:03:27.180 | and it just converts it to Chi DL
03:03:28.500 | and they have a linter to check it.
03:03:29.940 | And they're correct. - So how do you,
03:03:31.780 | you know, not with ASCII, but because it's a Greek letter,
03:03:36.220 | how do you go with, how do you spell it
03:03:39.700 | just using the English alphabet?
03:03:42.260 | - We just-- - XDL?
03:03:43.620 | - XDL, but we put in Chi.
03:03:46.020 | And it was named by one of my students
03:03:47.580 | and one of my postdocs many years ago,
03:03:49.900 | and I quite liked it.
03:03:51.020 | It's like-- - It's a cool name.
03:03:52.260 | - It's important, I think, when the team are contributing
03:03:54.580 | to such big ideas, 'cause there are ideas as well,
03:03:57.540 | I try not to just rename, I didn't call it Cronin
03:04:00.540 | or anything like that, 'cause they keep saying, you know,
03:04:03.140 | is it, the chemistry, when they're putting stuff
03:04:07.380 | in the computer, one of my students said,
03:04:08.780 | "We're asking now, is it Cronan complete?"
03:04:10.660 | And I was like, "What does that mean?"
03:04:11.500 | He said, "Well, can we make it on the damn machine?"
03:04:13.980 | (laughing)
03:04:15.460 | And I was like, "Oh, is that a compliment or a pejorative?"
03:04:18.500 | They're like, "Well, it might be both."
03:04:20.740 | (laughing)
03:04:21.660 | - Yeah, so you tweeted, quote,
03:04:24.500 | "Why does chemistry need a universal programming language?"
03:04:27.420 | Question mark.
03:04:28.260 | For all the reasons you can think of,
03:04:31.820 | reliability, interoperability, collaboration,
03:04:35.700 | remove ambiguity, lower cost, increase safety,
03:04:39.100 | open up discovery, molecular customization,
03:04:42.340 | and publication of executable chemical code.
03:04:47.380 | Which is fascinating, by the way, just publish code.
03:04:51.980 | And can you maybe elaborate a little bit more
03:04:56.260 | about this CHI-DL?
03:04:57.660 | What does a universal language of chemistry look like?
03:05:01.500 | A Cronin complete language.
03:05:04.820 | - It's a Turing complete language, really.
03:05:07.380 | But so what it has, it has a series of operators in it,
03:05:11.020 | like add, heat, stir.
03:05:15.540 | So there's a bunch of just unit operations.
03:05:17.940 | And all it is, really, is just,
03:05:19.640 | when I talked about this with chemical engineers,
03:05:22.740 | they said you've just rediscovered chemical engineering.
03:05:26.020 | And I said, "Well, yeah, I know."
03:05:27.780 | They said, "Well, that's trivial."
03:05:30.140 | I said, "Well, not really."
03:05:31.860 | Well, yes, it is trivial, and that's why it's good.
03:05:33.700 | Because not only have we rediscovered chemical engineering,
03:05:36.820 | we've made it implementable on a universal hardware
03:05:38.940 | that doesn't cost very much money.
03:05:40.740 | And so the CHI-DL has a series of statements.
03:05:43.940 | Like, define the reactor.
03:05:46.140 | Define the reagents.
03:05:49.300 | So they're all labels, so you assign them.
03:05:52.180 | And what I also implemented at the beginning is,
03:05:55.420 | because I give all the hardware IP address,
03:05:57.980 | you put it on a graph.
03:05:59.540 | And so what it does is,
03:06:01.260 | the graph is equivalent to the processor firmware,
03:06:04.980 | the processor code.
03:06:06.600 | So when you take your CHI-DL
03:06:08.620 | and you go to run it on your computer,
03:06:10.180 | you can run it on any compatible hardware
03:06:12.020 | in any configuration.
03:06:12.860 | It says, "What does your graph look like?"
03:06:14.820 | As long as I can solve the problem on the graph
03:06:17.020 | with these unit operations,
03:06:18.100 | you have the resources available, it compiles.
03:06:20.380 | Chem-piles.
03:06:21.620 | - Aha. (laughs)
03:06:24.140 | - All right, we could carry on for years.
03:06:27.100 | But it is really, it's chem-pilation.
03:06:29.100 | - Chem-pilation, yeah.
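A toy version of that "chempilation" check might look like the following, with made-up module names and addresses: the procedure declares the unit operations it needs, the graph declares what the connected, network-addressed hardware can do, and the procedure only compiles if every requirement can be met on that particular graph.

```python
# Illustrative only: compile a procedure against whatever hardware graph
# happens to be plugged in. Module types, addresses and operations are invented.

hardware_graph = {
    "192.168.0.11": {"type": "pump",      "operations": {"add", "transfer"}},
    "192.168.0.12": {"type": "reactor",   "operations": {"stir", "heat", "reflux"}},
    "192.168.0.13": {"type": "separator", "operations": {"separate"}},
    "192.168.0.14": {"type": "rotavap",   "operations": {"evaporate"}},
}

procedure_needs = {"add", "reflux", "separate", "evaporate"}

def chempile(needs, graph):
    # Pool everything the connected modules can do, then check coverage.
    available = set().union(*(module["operations"] for module in graph.values()))
    missing = needs - available
    if missing:
        raise RuntimeError(f"cannot chempile: no module provides {missing}")
    return "ok"

print(chempile(procedure_needs, hardware_graph))
```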
03:06:29.940 | - And what it now does is it says,
03:06:31.860 | "Okay, the problem we had before is,
03:06:34.460 | "it was possible to do robotics for chemistry,
03:06:37.740 | "but the robots were really expensive.
03:06:39.360 | "They were unique.
03:06:40.520 | "They were vendor-locked."
03:06:42.840 | And what I want to do is to make sure
03:06:44.540 | that every chemist in the world
03:06:46.020 | can get access to machinery like this
03:06:48.580 | at virtually no cost, because it makes it safer.
03:06:51.800 | It makes it more reliable.
03:06:53.740 | And then, if you go to the literature
03:06:55.300 | and you find a molecule
03:06:56.140 | that could potentially cure cancer,
03:06:58.380 | and let's say the molecule
03:06:59.260 | that could potentially cure cancer
03:07:00.380 | takes you three years to repeat,
03:07:02.980 | and maybe a student finishes their PhD in the time
03:07:06.580 | and they never get it back,
03:07:07.940 | so that it's really hard to kind of get
03:07:12.380 | all the way to that molecule,
03:07:13.460 | and it limits the ability of humanity to build on it.
03:07:16.420 | If they just download the code and can execute it,
03:07:19.020 | it turns, I would say,
03:07:20.860 | the electronic laboratory notebook in chemistry
03:07:23.180 | is a data cemetery,
03:07:25.240 | because no one will ever reproduce it.
03:07:28.580 | Now, the data cemetery becomes a Jupyter notebook,
03:07:30.900 | and you can just execute it.
03:07:31.740 | - A notebook, and people can play with it.
03:07:33.300 | - Yeah.
03:07:34.140 | - The access to it-- - Reverse it.
03:07:34.980 | - Orders of magnitude is increased.
03:07:37.860 | We'll talk about, so, as with all technologies,
03:07:41.460 | I think there's way more exciting possibilities,
03:07:43.780 | but there are also terrifying possibilities,
03:07:45.660 | and we'll talk about all of them,
03:07:47.400 | but let me just kind of linger
03:07:49.500 | on the machine learning side of this.
03:07:51.420 | So, you're describing programming,
03:07:53.460 | but it's a language.
03:07:56.380 | I don't know if you've heard about OpenAI Codex,
03:07:58.820 | which is-- - Yep, I'm playing with it.
03:08:00.620 | - You're playing, of course you are.
03:08:02.540 | (both laughing)
03:08:05.200 | You really are from Rick and Morty.
03:08:07.940 | This is great, okay.
03:08:10.860 | Except philosophically, I mean,
03:08:12.420 | he is, I guess, kind of philosophically deep, too.
03:08:14.980 | So, for people who don't know, GPT, GPT-3,
03:08:18.540 | it's a language model that can do
03:08:20.820 | natural language generation, so you can give it a prompt,
03:08:23.900 | and it can complete the rest of it,
03:08:26.440 | but it turns out that that kind of prompt,
03:08:28.940 | it's not just completes the rest of it,
03:08:30.740 | it's generating, like, novel-sounding text,
03:08:35.740 | and then you can apply that to generation
03:08:38.260 | of other kinds of stuff.
03:08:40.700 | So, these kinds of transformer-based language models
03:08:43.900 | are really good at forming deep representations
03:08:48.900 | of a particular space, like a medium, like language.
03:08:54.420 | So, you can then apply it to a specific subset
03:08:56.420 | of language, like programming.
03:08:58.380 | So, you can have it learn the representation
03:09:01.700 | of the Python programming language,
03:09:03.200 | and use it to then generate syntactically
03:09:06.580 | and semantically correct programs.
03:09:10.460 | So, you can start to make progress
03:09:12.140 | on one of the hardest problems in computer science,
03:09:14.800 | which is program synthesis.
03:09:16.060 | How do you write programs that accomplish different tasks?
03:09:19.620 | So, what OpenAI Codex does is it generates
03:09:24.620 | those programs based on a prompt of some kind.
03:09:28.700 | Usually, you can do a natural language prompt,
03:09:30.780 | so basically, as you do when you program,
03:09:32.780 | you write some comment, which serves as
03:09:36.340 | the basic documentation of the inputs and the outputs
03:09:39.500 | and the function of the particular set of code,
03:09:41.220 | and it's able to generate that.
03:09:43.300 | Point being is you can generate programs
03:09:45.860 | using machine learning, using neural networks.
03:09:49.660 | Those programs operate on the boring old computer.
03:09:56.140 | Can you generate programs that operate,
03:09:59.580 | there's gotta be a clever version of programs for this,
03:10:01.700 | but can you write programs that operate on a chemputer?
03:10:06.460 | - Yep, there's actually software out there right now,
03:10:08.780 | you can go and do it.
03:10:10.300 | - Really?
03:10:11.140 | - Yeah, yeah, it's a heuristic, it's rule-based,
03:10:14.300 | but we have, what we've done, inspired by Codex, actually,
03:10:19.300 | is over the summer, I ran a little workshop.
03:10:23.540 | Some of my groups got this inspired idea
03:10:25.140 | that we should get a load of students
03:10:28.060 | and ask them to manually collect data,
03:10:30.660 | to label chemical procedures into CHI-DL,
03:10:35.240 | and we have a cool SynthReader,
03:10:38.840 | so there's a bunch of people doing this right now,
03:10:41.960 | but they're doing it without abstraction,
03:10:45.140 | and because we have an abstraction
03:10:47.480 | that's implementable in the hardware,
03:10:50.280 | we've developed basically a chemical analog of Codex.
03:10:54.520 | - When you say, sorry to interrupt,
03:10:56.920 | when you say abstraction in the hardware,
03:10:59.320 | what do you mean?
03:11:00.160 | - So right now, a lot of people do machine learning
03:11:01.800 | and reading chemistry and saying,
03:11:04.480 | oh, you've got all these operations,
03:11:06.000 | add, shake, whatever, here,
03:11:07.840 | but because they don't have a uniform,
03:11:11.240 | I mean, there's a couple of groups doing it,
03:11:12.800 | competitors, actually, and they're good, very good,
03:11:15.660 | but they can't run that code automatically.
03:11:20.520 | They are losing meaning,
03:11:23.700 | and the really important thing that you have to do
03:11:28.280 | is generate context,
03:11:30.000 | and so what we've learned to do with our abstraction
03:11:32.780 | is make sure we can pull the context out of the text,
03:11:36.400 | and so can we take a chemical procedure
03:11:40.480 | and read it and generate our executable code?
03:11:43.960 | - What's the hardest part about that whole pipeline,
03:11:46.000 | from the initial text,
03:11:47.040 | interpreting the initial text of a paper,
03:11:49.360 | extracting the meaningful context
03:11:52.240 | and the meaningful chemical information,
03:11:54.400 | to then generating the program,
03:11:56.800 | to then running that program in the hardware?
03:12:00.320 | What's the hardest part about that pipeline
03:12:02.360 | as we look towards a universal Turing computer?
03:12:05.820 | - So the hardest-- - Chemputers.
03:12:07.780 | - The hardest thing with the pipeline
03:12:10.620 | is that the software, the model,
03:12:14.900 | gets confused between some meanings, right?
03:12:18.340 | So if, you know, chemists are very good at inventing words
03:12:21.740 | that aren't broken down,
03:12:22.880 | so I would, the classic word that you would use
03:12:26.000 | for boiling something is called reflux.
03:12:28.240 | So reflux is, you would have a solvent
03:12:31.060 | in a round-bottom flask, at reflux it would be boiling,
03:12:33.740 | going up the reflux condenser and coming down.
03:12:36.020 | But that term, reflux, to reflux, could be changed,
03:12:39.660 | you know, to people often make up words, new words,
03:12:44.340 | and then the software can fall over.
03:12:47.380 | But what we've been able to do is a bit like in Python,
03:12:50.980 | or any programming language,
03:12:52.560 | is identify when things aren't matched.
03:12:55.140 | So you present the code, you say, "This isn't matched,
03:12:57.340 | "you may want to think about this,"
03:12:58.580 | and then the user goes and says, "Oh, I mean reflux,"
03:13:01.140 | and just ticks the box and corrects it.
03:13:02.700 | So what the Codex or the ChemX does in this case,
03:13:07.700 | is it just, it suggests the first go,
03:13:12.540 | and then the chemist goes in and corrects it.
03:13:14.140 | And I really want the chemist to correct it,
03:13:16.260 | because it's not safe, I believe, for it to allow AI
03:13:21.260 | to just read literature and generate code at this stage.
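A minimal sketch of that human-in-the-loop check, with the vocabulary and messages invented for illustration: action words pulled from a written procedure are matched against a known set, and anything unmatched (a made-up synonym for "reflux", say) is flagged for the chemist to confirm rather than being silently guessed by the software.

```python
# Illustrative linter pass: report unknown action words so a chemist can
# confirm or correct them before any code is executed.

KNOWN_ACTIONS = {"add", "stir", "heat", "reflux", "filter", "evaporate", "extract"}

def lint(extracted_actions):
    unmatched = [a for a in extracted_actions if a.lower() not in KNOWN_ACTIONS]
    for action in unmatched:
        # In the real workflow a chemist reviews these suggestions; here we just report.
        print(f"'{action}' is not a known operation -- did you mean 'reflux'?")
    return unmatched

lint(["add", "boil-up", "extract"])   # 'boil-up' would be sent back to the user
```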
03:13:25.100 | - 'Cause now you're having actual, by the way, ChemX, nice.
03:13:29.180 | Nice name.
03:13:32.020 | So you are unlike, which is fascinating,
03:13:38.020 | is that we live in a fascinating moment in human history.
03:13:42.380 | But yes, you're literally connecting AI
03:13:47.060 | to some physical, and like,
03:13:51.020 | it's building something in the physical realm.
03:13:53.620 | - Yeah.
03:13:54.460 | - Especially in the space of chemistry
03:13:56.300 | that operates sort of invisibly.
03:14:02.460 | - Yeah, yeah, I would say that's right.
03:14:05.460 | And it's really important to understand
03:14:07.420 | those labeling schemes, right?
03:14:09.260 | And one of the things I was never,
03:14:11.340 | I was always worried about at the beginning,
03:14:12.700 | that the abstraction was gonna fall over.
03:14:15.260 | And the way we did it was just by brute force to start with.
03:14:18.340 | We just kept reading the literature and saying,
03:14:19.820 | "Is there anything new, can we add a new rule in?"
03:14:21.900 | And actually, our CHI-DL language expanded, exploded.
03:14:25.380 | There were so many extra things we had to keep adding.
03:14:28.140 | And then I realized the primitives still were maintained,
03:14:31.140 | and I could break them down again.
03:14:32.660 | So it's pretty good.
03:14:35.100 | I mean, there are problems.
03:14:36.100 | There are problems of interpreting any big sentence
03:14:38.500 | and turning it into an actionable code.
03:14:40.700 | And the Codex is not without its problems.
03:14:42.300 | You can crash it quite easily, right?
03:14:44.100 | You can generate nonsense.
03:14:45.540 | But boy, it's interesting.
03:14:47.660 | I would love to learn to program now using Codex, right?
03:14:52.660 | Just hacking around, right?
03:14:54.180 | And I wonder if chemists in the future
03:14:55.420 | will learn to do chemistry by just hacking around
03:15:00.100 | with the system, writing in different things.
03:15:02.100 | Because the key thing that we're doing with chemistry
03:15:04.300 | is that where a lot of mathematical chemistry went wrong
03:15:07.220 | is people, and I think Wolfram does this in Mathematica,
03:15:11.500 | he assumes that chemistry is a reaction
03:15:14.700 | where atom A or molecule A reacts with molecule B
03:15:17.420 | to give molecule C.
03:15:18.260 | That's not what chemistry is.
03:15:20.460 | Chemistry is take some molecule,
03:15:23.100 | take a liquid or a solid, mix it up and heat it,
03:15:26.780 | and then extract it.
03:15:29.380 | So the programming language is actually with respect
03:15:32.540 | to the process operations.
03:15:35.380 | And if you think in process space,
03:15:37.780 | not in chemical graph space, you unlock everything.
03:15:41.500 | Because there's only a finite number of processes
03:15:43.220 | you need to do in chemistry.
03:15:45.180 | And that's reassuring.
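One way to see the distinction is to write the same chemistry down both ways (names purely illustrative): the reaction-graph view records which molecules become which, while the process view records only the finite set of physical operations that the chemputer actually executes.

```python
# Two representations of the same step, for illustration only.

# Reaction-graph view: molecule A plus molecule B gives molecule C.
reaction_graph_view = {"reactants": ["A", "B"], "product": "C"}

# Process view: the physical operations you actually perform.
process_view = [
    ("add",     {"what": "A"}),
    ("add",     {"what": "B"}),
    ("heat",    {"temp_c": 80, "hours": 2}),
    ("extract", {"solvent": "ether"}),
]

# Only a handful of distinct operation types ever appear in the process view:
print({operation for operation, _ in process_view})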
03:15:47.220 | And so we're in the middle of it.
03:15:49.060 | It's really exciting.
03:15:50.420 | It's not the be all and the end all.
03:15:53.580 | And there is, like I say, errors that can creep in.
03:15:55.780 | One day we might be able to do it
03:15:57.420 | without human interaction, you simulate it,
03:15:59.380 | and you'll know enough about the simulation
03:16:01.460 | that the lab won't catch fire.
03:16:05.380 | But there are so many safety issues right now
03:16:07.020 | that we've got to really be very careful,
03:16:09.700 | protecting the user, protecting the environment,
03:16:11.900 | protecting misuse.
03:16:13.020 | I mean, there's lots to discuss
03:16:14.860 | if you want to go down that route,
03:16:16.060 | because it's very, very interesting.
03:16:17.620 | You don't want novichoks being made,
03:16:20.580 | or explosives being made,
03:16:23.620 | or recreational drugs being made.
03:16:26.300 | But how do you stop a molecular biologist
03:16:28.580 | making a drug that's gonna be important
03:16:31.420 | for them looking at their particular assay,
03:16:35.220 | versus a bad actor trying to make methamphetamine?
03:16:38.940 | - I saw how you looked at me when you said bad actor,
03:16:41.060 | but that's exactly what I'm gonna do.
03:16:42.900 | I'm trying to get the details of this so I can be first.
03:16:45.420 | - Don't worry, we can protect you from yourself.
03:16:47.460 | - Okay. (laughs)
03:16:50.220 | I'm not sure that's true, but that statement gives me hope.
03:16:54.060 | Does this ultimately excite you about the future,
03:16:59.060 | or does it terrify you?
03:17:01.980 | So, we mentioned that time is fundamental.
03:17:06.980 | It seems like you're at the cutting edge
03:17:11.260 | of progress that will have to happen, that will happen,
03:17:15.780 | that there's no stopping it.
03:17:17.340 | And as we've been talking about,
03:17:20.780 | I see obviously a huge number of exciting possibilities.
03:17:24.660 | So, whenever you automate these kinds of things,
03:17:29.300 | just the world opens up.
03:17:31.380 | It's like programming itself,
03:17:33.540 | and the computer, regular computer,
03:17:35.980 | has created innumerable applications,
03:17:39.500 | and made the world better in so many dimensions.
03:17:43.540 | And it created, of course, a lot of negative things
03:17:45.940 | that we, for some reason, like to focus on,
03:17:48.940 | using that very technology to tweet about it.
03:17:52.340 | But I think it made a much better world,
03:17:55.460 | but it created a lot of new dangers.
03:17:57.540 | So, maybe you can speak to,
03:18:00.900 | when you kind of stand at the end of the road
03:18:07.780 | for building a really solid, reliable, universal chemputer,
03:18:12.780 | what are the possibilities that are positive?
03:18:16.380 | What are the possibilities that are negative?
03:18:18.140 | How can we minimize the chance of the negative?
03:18:21.060 | - Yeah, that's a really good question.
03:18:22.460 | So, there's so many positive things,
03:18:23.940 | from drug discovery, from supply chain stress,
03:18:28.740 | for basically enabling chemists
03:18:30.700 | to basically be more productive in the lab, right?
03:18:32.620 | Well, the chemputer's not gonna replace the chemist.
03:18:35.300 | There's gonna be a Moore's law of molecules, right?
03:18:37.380 | There's gonna be so many more molecules we can design,
03:18:39.740 | so many more diseases we can cure.
03:18:41.540 | - So, chemists in the lab, as researchers,
03:18:43.420 | that's better for science,
03:18:44.740 | so they can build a bunch of,
03:18:46.500 | like, they could do science
03:18:49.100 | at a much more accelerated pace.
03:18:50.380 | So, it's not just the development of drugs,
03:18:51.980 | it's actually like doing the basic understanding
03:18:54.300 | of the science of drugs.
03:18:55.180 | - And the personalization, the cost of drugs right now,
03:18:57.780 | we're all living longer, we're all having more and more,
03:19:00.380 | we know more about our genomic development,
03:19:02.340 | we know about our predetermination,
03:19:05.020 | and we might be able to,
03:19:06.460 | one dream I've got is like,
03:19:07.940 | imagine, you know, you've got your genome assistant,
03:19:11.900 | tells you you're gonna get cancer in seven years time,
03:19:14.580 | and you have your personal chemputer
03:19:16.820 | that cooks up the right molecule just for you to cure it,
03:19:20.060 | right, that's a really positive idea.
03:19:22.460 | The other thing is, is when drugs,
03:19:25.260 | so right now, I think it's absolutely outrageous
03:19:28.660 | that not all of humanity has access to medicine.
03:19:33.140 | And I think the chemputer might be able
03:19:35.100 | to change that fundamentally,
03:19:36.020 | because it will disrupt the way things are manufactured.
03:19:38.940 | So let's stop thinking about manufacturing
03:19:40.860 | in different factories, let's say that chemputers,
03:19:44.420 | clinical grade chemputers or drug grade chemputers
03:19:47.020 | will be in facilities all around the world,
03:19:49.740 | and they can make things on demand as a function of the cost,
03:19:54.060 | you know, maybe people won't be able to afford
03:19:55.620 | the latest and greatest patent,
03:19:57.220 | but maybe they'll be able to get the next best thing,
03:20:00.180 | and will basically democratize,
03:20:02.820 | make available drugs to everybody that they need,
03:20:05.980 | you know, and you know, there's lots of
03:20:07.900 | really interesting things there.
03:20:10.540 | So I think that's gonna happen.
03:20:12.540 | I think that now let's take the negative.
03:20:16.980 | Before we do that, let's imagine what happened,
03:20:19.460 | go back to a really tragic accident a few years ago,
03:20:21.900 | well not an accident, an act of murder by that pilot
03:20:25.500 | on the, I think it was Eurowings or Swiss Wings,
03:20:28.380 | but what he did is, the plane took off,
03:20:31.740 | he waited till his pilot went to the toilet,
03:20:33.540 | he was a co-pilot, he locked the door,
03:20:35.620 | and then set the autopilot above the Alps,
03:20:39.940 | he set the altimeter or the descent height to zero,
03:20:44.220 | so the computer just took the plane into the Alps.
03:20:48.100 | Now, I mean, that was such a tragedy,
03:20:51.660 | obviously the guy was mentally ill,
03:20:53.180 | but it wasn't just a tragedy for him,
03:20:55.100 | it was for all the people on board,
03:20:57.060 | but what if, and I was inspired by this
03:20:58.940 | in my thinking, what can I do
03:21:01.660 | to anticipate problems like this in the chemputer?
03:21:05.260 | Had the software, and I'm sure Boeing and Airbus
03:21:08.380 | will be thinking, oh, maybe I can give the computer
03:21:11.260 | a bit more situational awareness,
03:21:12.660 | so whenever one tries to drop the height of the plane,
03:21:15.820 | and it knows it's above the Alps,
03:21:16.860 | we'll just say, oh no, computer says no,
03:21:18.460 | we're not letting you do that.
03:21:20.540 | Of course, he would have been able to find another way,
03:21:22.240 | maybe fly it until it runs out of fuel or something,
03:21:24.260 | but you know.
03:21:25.340 | - Keep anticipating all the large number of trajectories
03:21:28.820 | that can go negative, all those kinds of,
03:21:30.420 | running into the Alps, and try to at least make it easy
03:21:35.420 | for the engineers to build systems that are protecting us.
03:21:38.500 | - Yeah, and let's just think,
03:21:40.460 | what in the chemputer world right now with CHI-DLs,
03:21:43.020 | let's not just think about what I'm doing right now.
03:21:45.020 | What I'm doing right now is, it's completely open, right?
03:21:47.100 | Everyone's gonna know CHI-DLs,
03:21:48.260 | and be playing with them, making them more easier,
03:21:50.100 | and easier, and easier, but what we're gonna start to do,
03:21:53.020 | it makes sense to encrypt the CHI-DLs
03:21:57.540 | in such a way you, let's say you work
03:21:59.980 | for a pharmaceutical company,
03:22:01.180 | and you have a license to make a given molecule.
03:22:03.580 | Well, you get issued with a license
03:22:05.220 | by the FDA or your local authority,
03:22:07.460 | and they'll say, right, your license to do it,
03:22:08.940 | here it is, it's encrypted, and the CHI-DL gets run.
03:22:12.360 | So you have a license for that instance of use.
03:22:14.600 | Easy to do.
03:22:15.460 | Computer science has already solved the problem.
03:22:17.620 | So the fact that we all trust online banking, right,
03:22:21.260 | right now, and that we can secure it,
03:22:23.120 | I'm 100% sure we can secure the chemputer.
03:22:26.620 | And because of the way we have it,
03:22:28.660 | it's like the same mapping problem,
03:22:31.980 | that to actually reverse engineer a CHI-DL
03:22:34.460 | will be as hard as reverse engineering the encryption key.
03:22:37.420 | You know, brute force, it will be cheaper
03:22:40.420 | to just actually buy the regulated medicine.
03:22:44.420 | And actually, people aren't gonna want to then
03:22:46.900 | make their own fake pharmaceuticals,
03:22:49.380 | because it'll be so cheap to do it.
03:22:51.420 | We'll drop the cost of access to drugs.
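As a deliberately simplified sketch of that "license for that instance of use" idea (standard library only; a real system would use proper public-key infrastructure, which is not specified here), the issuing authority signs the procedure, the licensee, and a run identifier, and the machine refuses anything whose signature does not verify.

```python
# Illustrative per-instance licensing check. Names and keys are invented;
# this is not a description of any real regulatory system.

import hmac, hashlib

REGULATOR_KEY = b"regulator-secret"   # held by the issuing authority, not the user

def issue_license(chi_dl: str, licensee: str, run_id: int) -> str:
    message = f"{chi_dl}|{licensee}|{run_id}".encode()
    return hmac.new(REGULATOR_KEY, message, hashlib.sha256).hexdigest()

def machine_will_run(chi_dl: str, licensee: str, run_id: int, license_sig: str) -> bool:
    expected = issue_license(chi_dl, licensee, run_id)
    return hmac.compare_digest(expected, license_sig)

sig = issue_license("make molecule X", "PharmaCo", run_id=1)
print(machine_will_run("make molecule X", "PharmaCo", 1, sig))   # True: licensed run
print(machine_will_run("make something else", "PharmaCo", 1, sig))  # False: refuse
```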
03:22:53.700 | Now, what will happen?
03:22:54.780 | Recreational drugs.
03:22:56.200 | People will start saying, well,
03:22:57.300 | I want access to recreational drugs.
03:22:59.700 | Well, it's gonna be up to,
03:23:01.220 | it's gonna accelerate that social discussion
03:23:03.940 | that's happening in the US and Canada
03:23:05.700 | and the UK, everywhere, right?
03:23:07.140 | - 'Cause cost goes down, access goes up.
03:23:10.340 | - Giving cannabis, THC, to some people who've got epilepsy
03:23:13.780 | is literally, forgive the term, a no-brainer,
03:23:16.540 | because these poor people go from seizures like every day
03:23:19.660 | to maybe seizures just once every few months.
03:23:22.500 | - That's an interesting idea,
03:23:23.940 | that try to minimize the chance that it can get
03:23:26.240 | into the hands of individuals,
03:23:29.020 | like terrorists or people that want to do harm.
03:23:32.960 | Now, with that kind of thing,
03:23:35.680 | you're putting a lot of power in the hands of governments,
03:23:39.160 | in the hands of institutions,
03:23:40.480 | and so then emerge the kind of natural criticism
03:23:44.120 | you might have of governments
03:23:45.320 | that can sometimes use these for ill,
03:23:48.400 | use them as weapons of war,
03:23:50.980 | not tools of betterment.
03:23:54.740 | So, and sometimes not just war against other nations,
03:23:59.580 | but war against its own people,
03:24:01.540 | as it has been done throughout history.
03:24:03.500 | - Well, I'm thinking, so there's another way of doing it,
03:24:06.260 | a decentralized peer-to-peer version,
03:24:09.260 | where, and what you have to do,
03:24:11.200 | I'm not saying you should adopt a blockchain,
03:24:13.060 | but there is a way of maybe taking CHI-DLs
03:24:14.940 | and putting them on a blockchain.
03:24:16.220 | Here's an idea, let's just say,
03:24:17.780 | the way we're doing it in my lab right now
03:24:18.820 | is we go to the literature,
03:24:20.400 | we take a recipe to make a molecule,
03:24:23.120 | convert that to CHI-DL,
03:24:24.960 | and diligently make it in the robot and validate it.
03:24:28.160 | So, I would call mining, proof of work, proof of synthesis.
03:24:33.200 | All right?
03:24:34.040 | - Proof of the synthesis, that's pretty cool.
03:24:34.880 | - Yeah, yeah, but this is cool,
03:24:35.920 | because suddenly, when you actually synthesize it,
03:24:38.960 | you can get the analytical data,
03:24:40.200 | but there's also a fingerprint in there
03:24:41.520 | of the impurities that get carried across,
03:24:43.680 | 'cause you can never make something 100% pure.
03:24:46.440 | That fingerprint will allow you to secure your CHI-DL.
03:24:49.700 | So, what you do is encrypt those two things.
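A minimal sketch of such a proof-of-synthesis record (field names invented): the executable procedure and the measured analytical fingerprint, impurity profile included, are hashed together so that a claimed synthesis can later be checked against what was actually made.

```python
# Illustrative proof-of-synthesis hash. The analytical data here is made up.

import hashlib, json

def proof_of_synthesis(chi_dl: str, fingerprint: dict) -> str:
    record = json.dumps({"procedure": chi_dl, "fingerprint": fingerprint}, sort_keys=True)
    return hashlib.sha256(record.encode()).hexdigest()

fingerprint = {
    "nmr_peaks_ppm": [1.2, 2.1, 7.3],
    "impurities":    {"solvent_residue": 0.8, "side_product": 0.3},   # percent
}

print(proof_of_synthesis("add A to B; reflux; purify", fingerprint))
```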
03:24:51.500 | So, suddenly, you can have people out there mining,
03:24:54.140 | and what you could do, perhaps,
03:24:56.260 | is do the type of thing,
03:24:58.060 | we need to basically look at the way
03:25:00.220 | that contact tracing should have been done in COVID,
03:25:03.220 | where people are given the information.
03:25:05.500 | So, you have just been in contact with someone COVID,
03:25:07.940 | you choose, I'm not telling you to stay at home,
03:25:10.540 | you choose, right?
03:25:12.260 | So, now, if we could imagine a similar thing,
03:25:14.260 | like, you have got access to these chemicals,
03:25:17.460 | they will have these effects,
03:25:18.500 | you choose and publicize it,
03:25:20.940 | or maybe it's out somewhere, I don't know,
03:25:22.660 | I'm not a policymaker on this.
03:25:25.260 | And my job here is to not just make the technology possible,
03:25:29.220 | but to have as open as a discussion as possible
03:25:31.940 | with people to say, "Hey, can we stop childhood mortality
03:25:35.420 | "with this technology?"
03:25:36.900 | And do those benefits outweigh the one-off
03:25:40.300 | where people might use it for terrorism,
03:25:42.420 | or people might use it for recreational drugs?
03:25:45.380 | Chemify, which is the name of the entity
03:25:47.380 | that will make this happen,
03:25:48.580 | I think we have some social responsibilities
03:25:51.020 | as an entity to make sure that we're not enabling people
03:25:53.380 | to manufacture personal drugs, weapons at will.
03:25:56.760 | And what we have to do is have a discussion with society,
03:25:59.660 | with the people that invest in this,
03:26:01.280 | with people that are gonna pay for this,
03:26:03.680 | to say, "Well, do you wanna live longer?
03:26:05.780 | "And do you wanna be healthier?
03:26:07.460 | "And are you willing to accept some of the risks?"
03:26:09.660 | And I think that's a discussion to have.
03:26:11.820 | - So by the way, when you say personal drugs,
03:26:14.580 | do you mean the illegal ones?
03:26:16.260 | Or do you have a concern of just putting the manufacture
03:26:19.860 | of any kind of legal drugs in the hands of regular people?
03:26:24.860 | 'Cause they might, like dose matters,
03:26:28.140 | they might take way too much?
03:26:29.940 | - I mean, I would say, to be honest,
03:26:32.020 | the chances of chemputers being, well, one should never say never,
03:26:35.740 | so the fact I can now say this means
03:26:37.580 | it's totally gonna come true, right?
03:26:39.260 | - And I'm going to do it.
03:26:40.860 | - I cannot imagine that chemputers will be
03:26:43.300 | in people's houses anytime soon,
03:26:45.420 | but they might be at the local pharmacy, right?
03:26:48.940 | And if you've got a drug manufacturing facility
03:26:51.140 | in every town, then you just go
03:26:53.580 | and they give you a prescription,
03:26:54.700 | they do it in such a way, they format it
03:26:56.260 | so that you don't have to take 10 pills every day.
03:26:59.380 | You get one manufactured for you
03:27:01.940 | that has all the materials you need
03:27:03.580 | and the right distribution.
03:27:04.980 | - Got it.
03:27:05.820 | But you mentioned recreation of drugs,
03:27:07.300 | and the reason I mention it,
03:27:09.220 | 'cause I know people are gonna speak up on this,
03:27:11.980 | if the drug is legal, there's, to me,
03:27:14.780 | no reason why you can't manufacture it for recreation.
03:27:18.780 | - I mean, you can do it right now.
03:27:19.620 | - What do you have against fun, Lee?
03:27:21.420 | - So, I mean, I'm a chemistry professor in a university
03:27:27.540 | who's an entrepreneur as well.
03:27:29.420 | I just think I need to be as responsible
03:27:31.780 | as I can in the discussion.
03:27:33.020 | - Sure.
03:27:34.220 | No, sure, sure.
03:27:35.140 | But I know, so let me be the one that says
03:27:37.780 | there's nothing, 'cause you have said recreational drugs
03:27:41.980 | and terrorism in the same sentence.
03:27:43.900 | - Yeah, yeah, okay.
03:27:45.060 | - I think let's make sure we draw a line
03:27:48.020 | that there's real dangers to the world
03:27:50.780 | of terrorists of bio-warfare,
03:27:53.420 | and then there's a little bit of weed.
03:27:56.660 | - So, I mean, I think it's up to the society
03:28:00.020 | to tell its governments what it wants,
03:28:03.060 | what's acceptable, right?
03:28:04.780 | And if it becomes, let's say that THCs
03:28:07.820 | become heavily acceptable, and that you can modify them.
03:28:12.820 | So, let's say it's like blood type.
03:28:15.660 | There's a particular type of THC
03:28:17.420 | that you tolerate better than I do,
03:28:20.380 | then why not have a machine that makes the one you like?
03:28:23.180 | - Yeah.
03:28:24.020 | - And then, and why not--
03:28:24.860 | - It's the perfect brownie.
03:28:26.060 | - Yeah, and I think that that's fine.
03:28:28.740 | But I'm, you know, we're so far away from that.
03:28:32.180 | I can barely get the thing to work in the lab, right?
03:28:34.540 | I mean, it's reliability and all this other stuff,
03:28:36.780 | but what I think's gonna happen in the short term,
03:28:39.100 | it's gonna turbocharge molecular discovery, reliability,
03:28:43.980 | and that will change the world.
03:28:45.500 | - That's super exciting.
03:28:46.980 | You have a draft of a paper titled
03:28:49.340 | Autonomous Intelligent Exploration,
03:28:50.900 | Discovery and Optimization of Nanomaterials.
03:28:53.780 | So, we are talking about automating engineering
03:28:56.940 | of nanomaterials.
03:28:58.820 | How hard is this problem?
03:29:01.100 | And as we continue down this thread
03:29:03.180 | of the positives and the worrisome,
03:29:06.780 | what are the things we should be excited about?
03:29:09.660 | And what are the things we should be terrified about?
03:29:12.300 | And how do we minimize the chance
03:29:13.940 | of the terrifying consequences?
03:29:18.060 | - So, in this robot, the robot does all the heavy lifting.
03:29:21.380 | So, the robot basically is an embodied AI.
03:29:24.460 | I really like AI in a domain-specific way.
03:29:29.460 | One of the, I should say at this point,
03:29:31.460 | there was an attempt in the '60s, Joshua Lederberg
03:29:35.460 | and some really important people did this,
03:29:37.700 | that made an AI to try and guess if organic molecules
03:29:41.420 | in a mass spectrometer were alien or not.
03:29:43.700 | - Yes.
03:29:44.740 | - And they failed 'cause they didn't have assembly theory.
03:29:47.620 | (laughing)
03:29:48.460 | - I see.
03:29:49.300 | - And when I--
03:29:50.140 | - Wait, what does assembly theory give you
03:29:51.820 | about alien versus human life?
03:29:53.540 | - Well, no, it tells you about unknown,
03:29:55.660 | the degree of unknowns.
03:29:56.540 | You can fingerprint stuff.
03:29:57.500 | They weren't looking at,
03:29:58.860 | they were trying to basically just look at the corpus
03:30:01.260 | of complex organic molecules.
03:30:04.060 | So, when I was a bit down about assembly theory,
03:30:06.300 | 'cause I couldn't convince referees
03:30:08.180 | and couldn't convince computational people interested
03:30:12.420 | in computational complexity,
03:30:14.540 | I was really quite depressed about it.
03:30:16.340 | And I mean, I've been working with Sarah Walker's team,
03:30:20.140 | and I think she also invented assembly theory in some way.
03:30:23.620 | We can talk about it later.
03:30:25.180 | When I found the AI not working for the DENDRAL project,
03:30:30.980 | I suddenly realized I wasn't totally insane.
03:30:34.700 | Coming back to this nano robot,
03:30:37.220 | so what it does, it's basically like a chemputer,
03:30:40.020 | but now what it does is it squirts a liquid
03:30:42.980 | with gold in it in a test tube,
03:30:45.420 | and it adds some reducing agents,
03:30:49.660 | so some electrons to make the gold turn into a nanoparticle.
03:30:53.260 | Now, when gold becomes a nanoparticle,
03:30:54.820 | it gets a characteristic color, a plasmon.
03:30:56.820 | So, it's a bit like if you look at the sheen
03:30:58.740 | on a gold wedding ring or a gold bar or something,
03:31:01.780 | those are the ways that conducting electrons
03:31:03.660 | basically reflect light.
03:31:05.860 | What we did is we randomly squirt the gold particle
03:31:09.580 | and the reducing agent in, and we measure the UV,
03:31:11.940 | we measure the color.
03:31:13.540 | And so, what we do is we've got, the robot has a mind,
03:31:16.900 | so it has a mind where, in a simulation,
03:31:20.380 | it randomly generates nanoparticles,
03:31:23.020 | and the plasmon, the color that comes out,
03:31:24.820 | randomly imagines in its head.
03:31:27.340 | It then, where the other,
03:31:28.300 | so that's the imaginary side of the robot.
03:31:29.820 | In the physical side of the robot,
03:31:30.860 | it squirts in the chemicals and looks at the color,
03:31:34.620 | and it uses a genetic algorithm,
03:31:36.420 | and a map elite, actually, on it,
03:31:38.680 | and it goes around in cycles
03:31:40.420 | and refines the color to the objective.
03:31:45.380 | Now, we use two different points.
03:31:46.780 | We have an exploration and an optimization.
03:31:50.300 | They're two different.
03:31:51.140 | So, the exploration just says, just do random stuff
03:31:54.420 | and see how many different things you can get.
03:31:55.940 | And when you get different things,
03:31:57.500 | try and optimize and make the peak sharper, sharper, sharper.
03:32:00.780 | And what it does, after a number of cycles,
03:32:03.220 | is it physically takes a sample
03:32:05.420 | of the optimized nanomaterial,
03:32:07.880 | resets all the round bottom flasks, cleans them,
03:32:11.020 | and puts the seed, physical seed, back in.
03:32:14.660 | And what this robot is able to do is search a space
03:32:17.740 | of 10 to the 23 possible reactions
03:32:20.620 | in just 1,000 experiments in three days.
03:32:24.180 | And it makes five generations of nanoparticles,
03:32:26.540 | which get nicer and nicer in terms of shape
03:32:28.580 | and color and definition.
03:32:30.100 | And then, at the end, it outputs a CHI-DL code.
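Stripped of all the real chemistry, the explore-then-optimise loop being described might be sketched like this (every number and formula below is made up; the real system uses a genetic algorithm with MAP-Elites over actual UV-vis measurements):

```python
# Illustrative explore/optimise loop: random recipes are "run" against a
# stand-in scoring function, the best survivors seed mutation rounds that
# push toward a sharper target peak. Nothing here is real chemistry.

import random

def run_experiment(recipe):
    """Stand-in for squirting reagents and reading the UV-vis spectrum."""
    gold, reducer = recipe
    peak_position = 520 + 200 * (gold - reducer)     # fake plasmon position, nm
    return -abs(peak_position - 520)                 # higher score = closer to target

def random_recipe():
    return (random.random(), random.random())        # (gold volume, reducer volume)

def mutate(recipe, step=0.05):
    return tuple(min(1.0, max(0.0, x + random.uniform(-step, step))) for x in recipe)

# Exploration: try random stuff, keep the best few as seeds.
population = [random_recipe() for _ in range(50)]
seeds = sorted(population, key=run_experiment, reverse=True)[:5]

# Optimisation: several "generations" of mutate-and-select, reseeding each round.
best = seeds[0]
for generation in range(5):
    candidates = [mutate(best) for _ in range(20)] + [best]
    best = max(candidates, key=run_experiment)
    print(f"generation {generation}: score {run_experiment(best):.2f}")
```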
03:32:32.500 | - Wow, it's doing the search for programs
03:32:38.220 | in the physical space.
03:32:39.700 | So, it's doing a kind of reinforcement learning.
03:32:42.620 | - Yeah, yeah, in the physical space.
03:32:44.180 | - With the exploration and the optimization.
03:32:46.580 | - And that CHI-DL will work on any chemputer
03:32:48.740 | or any qualified hardware.
03:32:49.580 | - So, now that's it.
03:32:51.500 | Now, that's a general piece of code.
03:32:53.460 | - Yeah.
03:32:54.700 | - Replicate somewhat, maybe perfectly, what it created.
03:32:58.940 | That's amazing, that's incredible.
03:33:00.340 | - But the nanoparticles themselves are dumb.
03:33:02.180 | The robot does all the thinking.
03:33:03.540 | So, we don't try and imply any self-replication
03:33:06.620 | or try and get the particles to make themselves,
03:33:09.700 | although it would be cool to try.
03:33:11.500 | - So, well, there you go.
03:33:13.060 | Those are famous last words
03:33:14.940 | for the end of human civilization.
03:33:16.620 | Would be cool to try.
03:33:17.860 | So, is it possible to create molecules
03:33:21.420 | to start approaching this question
03:33:24.860 | that we started this conversation,
03:33:26.620 | which is the origin of life,
03:33:29.580 | to start to create molecules that have lifelike qualities?
03:33:34.020 | So, have the replication, have complex,
03:33:37.740 | start to create complex organisms.
03:33:40.180 | - So, we have done this with the oxides
03:33:42.060 | I talked about earlier, the molybdenum oxides and the rings
03:33:44.540 | and the balls.
03:33:45.780 | And the problem is that, well, they do,
03:33:49.340 | they autocatalytically enhance one another.
03:33:52.020 | So, they would, I guess you would call it self-replication.
03:33:55.780 | But because there's limited function and mutation,
03:34:00.780 | they're pretty dumb.
03:34:02.740 | So, they don't do very much.
03:34:04.540 | So, I think the prospect of us being able to engineer
03:34:09.540 | a nanomaterial life form in the short term,
03:34:13.700 | like I said earlier, my aim is to do this, of course.
03:34:16.420 | I mean, on one hand, I'm saying it's impossible.
03:34:18.460 | On the other hand, I'm saying I'm doing it.
03:34:19.580 | So, which is it, Lee?
03:34:20.420 | You know, it's like, well, I think we can do it,
03:34:23.780 | but only in the robot.
03:34:25.020 | So, the causal chain that's gonna allow it
03:34:26.500 | is in the robot.
03:34:27.540 | These particles, if they do start to self-replicate,
03:34:30.060 | the system's gonna be so fragile
03:34:32.340 | that I don't think anything dangerous will come out.
03:34:35.980 | And it doesn't mean we shouldn't treat them
03:34:38.300 | as potentially, you know, I mean,
03:34:41.340 | I don't want to scare people, like gain of function,
03:34:43.260 | we're gonna produce stuff that comes out.
03:34:45.180 | Our number one kill switch is that we always try
03:34:48.060 | to search a space of objects that don't exist in our,
03:34:53.060 | that don't exist in the environment.
03:34:55.820 | So, even if something got out, it just would die immediately.
03:34:58.500 | It's like making a silicon life form or something,
03:35:00.860 | or, you know.
03:35:02.060 | - Which is the opposite of what, oftentimes,
03:35:04.300 | gain of function research is focused on, like,
03:35:06.380 | how do you get a dangerous thing to be closer
03:35:10.140 | to something that works with humans?
03:35:11.700 | - Yeah.
03:35:12.540 | - So, have it jump to humans.
03:35:13.420 | So, that's one good mode to operate on
03:35:16.820 | is always try to operate on chemical entities
03:35:21.820 | that are very different than the kind of chemical environment
03:35:25.220 | that humans operate in.
03:35:26.300 | - Yeah, and also, I mean, I'll say something dramatic,
03:35:29.540 | which may not be true, so I should be careful.
03:35:34.540 | If, let's say, we did discover a new living system,
03:35:39.100 | and it was made out of a shadow biosphere,
03:35:41.960 | and we just released it in the environment, who cares?
03:35:45.380 | It's gonna use different stuff.
03:35:46.980 | - It'll just live.
03:35:49.600 | - Just live, yeah.
03:35:51.620 | I found one of my biggest fantasies
03:35:53.860 | is actually is like a planet
03:35:55.820 | that's basically half in the sun.
03:35:57.260 | It doesn't rotate, right?
03:35:58.420 | And you have two different origins of life on that planet,
03:36:02.340 | and they don't share the same chemistry.
03:36:05.140 | - Yeah.
03:36:06.020 | - And then the only time they recognize each other
03:36:08.100 | is when they become intelligent.
03:36:08.980 | They go, "Well, what's that moving?"
03:36:10.660 | - Yeah.
03:36:11.500 | (laughing)
03:36:12.340 | - I wonder if--
03:36:13.180 | - So, they co-evolve, and that's fascinating.
03:36:14.740 | I mean, so one fascinating thing to do
03:36:17.460 | is exactly what you were saying, which is a life bomb,
03:36:20.780 | which is like, try to focus on atmospheres
03:36:25.780 | or chemical conditions of other planets,
03:36:29.700 | and try within this kind of exploration,
03:36:33.700 | optimization system, try to discover life forms
03:36:39.920 | that can work in those conditions,
03:36:41.860 | and then you send those life forms over there.
03:36:44.540 | - Yeah.
03:36:45.380 | - And see what kind of stuff they build up.
03:36:46.300 | Like, you can do like a large-scale,
03:36:48.780 | it's kind of a safe physical environment
03:36:51.860 | to do large-scale experiments.
03:36:53.300 | It's another planet.
03:36:54.700 | - Yeah, so look, I'm gonna say something quite contentious.
03:36:57.220 | I mean, Elon wants to go to Mars.
03:36:59.000 | I think it's brilliant he wants to go to Mars,
03:37:00.460 | but I would counter that and say,
03:37:02.540 | is Elon just obsessed with getting humanity off Earth,
03:37:06.060 | or what about just technology?
03:37:08.100 | So, if we do technology, so Elon either needs
03:37:10.700 | to take a chemputer to Mars,
03:37:11.820 | 'cause he needs to manufacture drugs, right, on demand,
03:37:13.940 | right, 'cause zero-cost payload
03:37:15.820 | and all that stuff is just code,
03:37:17.460 | or what we do is we actually say, hang on,
03:37:20.060 | it's quite hard for humans to survive on Mars.
03:37:22.580 | Why don't we write a series of origin of life algorithms
03:37:25.480 | where we embed our culture in it, right?
03:37:27.620 | It's a very Ridley Scott Prometheus, right?
03:37:29.820 | - Yeah, yeah.
03:37:30.660 | - Which is a terrible film, by the way, but anyway.
03:37:33.020 | And dump it on Mars, and just terraform Mars,
03:37:37.300 | and what we do is we evolve life on Mars
03:37:39.660 | that is suited to life on Mars,
03:37:42.180 | rather than brute-forcing human life on Mars.
03:37:44.660 | - So, one of the questions is, you know,
03:37:47.300 | what is human culture, what are the things you encode?
03:37:50.180 | Some of it is knowledge, some of it is information,
03:37:52.660 | but the thing that Elon talks about,
03:37:55.380 | and the thing I think about,
03:37:57.460 | I think you think about as well,
03:37:58.680 | is some of the more unique aspects of what makes us human,
03:38:03.680 | which is our particular kind of consciousness.
03:38:09.780 | So, he talks about the flame of human consciousness.
03:38:12.740 | - Yeah.
03:38:13.580 | - That's one of the questions,
03:38:14.620 | is can we instill consciousness into other beings?
03:38:19.620 | Because that's a sad thought,
03:38:24.300 | that whatever this thing inside our minds
03:38:27.860 | that hopes and dreams and fears and loves can all die.
03:38:32.860 | - Yeah, but I think you already know
03:38:35.260 | the answer to that question.
03:38:37.540 | I have a robot lawnmower at home.
03:38:40.140 | My kids call it CC, cool cutter.
03:38:42.380 | It's a robo-mow, and the way it works,
03:38:45.980 | it has an electric field around the perimeter,
03:38:48.020 | and it just, you tell it the area,
03:38:49.900 | and it goes out and goes from its base station,
03:38:52.860 | just mows a bit, gets to the perimeter,
03:38:55.180 | detects the perimeter, then chooses a random angle,
03:38:57.580 | rotates around and goes on.
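[The random-bounce strategy Lee describes above is simple enough to sketch in code. This is a purely illustrative toy simulation; the yard dimensions, step length, and step count are assumed parameters, not anything taken from the actual robo-mow.]

```python
import math
import random

# Toy model of the strategy described above: drive in a straight line,
# and when the perimeter is reached, pick a new random heading and carry on.
YARD_W, YARD_H = 20.0, 12.0   # rectangular yard in metres (assumed)
STEP = 0.1                    # metres travelled per simulation step (assumed)

def mow(steps: int = 200_000) -> set:
    """Return the set of one-metre grid cells the mower passes through."""
    x, y = YARD_W / 2, YARD_H / 2                # start from the base station
    heading = random.uniform(0, 2 * math.pi)
    visited = set()
    for _ in range(steps):
        nx = x + STEP * math.cos(heading)
        ny = y + STEP * math.sin(heading)
        if not (0 <= nx <= YARD_W and 0 <= ny <= YARD_H):
            # perimeter detected: choose a new random angle and try again
            heading = random.uniform(0, 2 * math.pi)
            continue
        x, y = nx, ny
        visited.add((int(x), int(y)))
    return visited

cells = mow()
print(f"covered roughly {len(cells)} of ~{int(YARD_W * YARD_H)} one-metre cells")
```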
03:38:58.900 | - Yeah.
03:39:00.460 | - My kids call it cool cutter, it's a she.
03:39:02.860 | I don't know why it's a she,
03:39:03.820 | they just, when they were quite young, they called it,
03:39:07.300 | I don't wanna be sexist there, it could be a he,
03:39:08.940 | but they liked.
03:39:09.900 | - They gendered the lawnmower?
03:39:13.180 | - They gendered the lawnmower.
03:39:14.140 | - Okay. - Yeah, why not?
03:39:15.540 | But I was thinking this lawnmower,
03:39:17.820 | if you apply integrated information theory to lawnmowers,
03:39:20.780 | lawnmowers are conscious.
03:39:22.660 | Now, integrated information theory
03:39:25.020 | is something people say is a flawed way of measuring
03:39:27.780 | consciousness, but I don't think it is.
03:39:29.060 | I think assembly theory actually measures consciousness
03:39:32.100 | in the same way.
03:39:33.260 | Consciousness is something that is generated
03:39:35.940 | over a population of objects, of humans.
03:39:38.820 | Consciousness didn't suddenly spring into being.
03:39:41.020 | Our consciousness has evolved together, right?
03:39:42.860 | The fact we're here and the robots we leave behind,
03:39:45.980 | they all have some of that, so we won't lose it all.
03:39:49.140 | Sure, consciousness requires that we have many models
03:39:52.180 | being generated, it's not just one domain-specific AI,
03:39:54.660 | right, I think the way to create consciousness,
03:39:57.420 | I'm gonna say unashamedly, the best way to make
03:40:00.740 | a consciousness is in a chemical system,
03:40:02.840 | because you just have access to many more states.
03:40:05.300 | And the problem right now with making silicon consciousness
03:40:08.580 | is you just don't have enough states.
03:40:10.860 | So there are more possible states,
03:40:12.780 | or sorry, there are more possible configurations
03:40:14.620 | in your brain than there are atoms in the universe.
03:40:17.940 | And you can switch between them.
03:40:20.500 | You can't do that on a core i10.
03:40:22.300 | It's got 10 billion, 12 billion, 14 billion transistors,
03:40:25.260 | but you can't reconfigure them as dynamically.
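[A quick arithmetic aside on the "more configurations than atoms" claim: if each element is modelled as a bare binary switch, n elements give 2^n configurations, which passes the commonly cited ~10^80 atoms of the observable universe after only a few hundred elements. This is a rough sketch with approximate published figures; it shows the combinatorics only, not the point Lee is actually making about how dynamically those states can be reconfigured.]

```python
from math import log10

# Crude model: each element (neuron or transistor) is a binary switch,
# so n elements have 2**n possible configurations. Figures are rough estimates.
ATOMS_LOG10 = 80  # the observable universe holds roughly 10**80 atoms

def digits_of_state_count(n_elements: int) -> float:
    """log10 of 2**n, i.e. how many decimal digits the configuration count has."""
    return n_elements * log10(2)

for n in (270, 14_000_000_000, 86_000_000_000):
    d = digits_of_state_count(n)
    print(f"{n} binary elements -> ~10^{d:.0f} configurations,",
          "more than the atom count" if d > ATOMS_LOG10 else "fewer")
```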
03:40:28.340 | - Well, you've shared this intuition a few times already
03:40:31.420 | that the larger number of states somehow correlates
03:40:35.500 | to greater possibility of life,
03:40:38.380 | but it's also possible that constraints are essential here.
03:40:42.340 | - Yeah, yeah.
03:40:43.380 | I mean, but coming back to the,
03:40:45.580 | you worry that something's lost, I agree.
03:40:47.620 | But I think that we will get to an AGI,
03:40:53.580 | but I wonder if it's not, if it can't be separate
03:40:58.220 | from human consciousness,
03:41:00.020 | because the causal chain that produced it came from humans.
03:41:02.980 | So what I kind of try and suggest heavily
03:41:06.340 | to people who worry about the existential threat of AI is,
03:41:11.340 | I mean, you put it much more elegantly earlier,
03:41:13.380 | like we should worry about algorithms,
03:41:15.820 | dumb algorithms written by human beings on Twitter,
03:41:18.620 | driving us insane, right?
03:41:20.860 | And doing, acting in odd ways.
03:41:23.220 | - Yeah, I think intelligence,
03:41:24.900 | this is what I have been ineloquent in trying to describe.
03:41:29.580 | Partially because I try not to think too deeply
03:41:34.340 | through this stuff, because then you become a philosopher.
03:41:36.260 | I still aspire to actually building a bunch of stuff.
03:41:40.460 | But my sense is super intelligence leads to
03:41:44.980 | deep integration into human society.
03:41:48.760 | So like intelligence is strongly correlated.
03:41:51.980 | Like intelligence, the way we conceive of intelligence
03:41:56.980 | materializes as a thing that becomes a fun entity
03:42:01.980 | to have at a party with humans.
03:42:06.380 | So like it's a mix of wit, intelligence, humor,
03:42:09.940 | like intelligence, like knowledge,
03:42:11.700 | ability to do reasoning and so on,
03:42:14.280 | but also humor, emotional intelligence,
03:42:17.940 | ability to love, to dream, to share those dreams,
03:42:23.420 | to play the game of human civilization,
03:42:28.040 | the push and pull, the whole dance of it,
03:42:30.060 | the whole dance of life.
03:42:31.340 | And I think that kind of super intelligent being
03:42:35.380 | is not the thing that worries me.
03:42:38.100 | I think that ultimately will enrich life.
03:42:41.060 | It's again, the dumb algorithms,
03:42:42.980 | the dumb algorithms that scale
03:42:45.420 | in the hands of people that
03:42:47.760 | don't study history, that don't study human psychology
03:42:50.460 | and human nature, just applying too broadly
03:42:54.420 | for selfish near-term interests.
03:42:56.980 | That's the biggest danger.
03:42:58.460 | - Yeah, and I think it's not a new danger, right?
03:43:01.340 | I now know how I should use Twitter
03:43:04.620 | and how I shouldn't use Twitter, right?
03:43:07.440 | I like to provoke people into thinking.
03:43:09.600 | I don't want to provoke people into outrage.
03:43:11.540 | It's not fun, it's not a good thing for humans to do, right?
03:43:14.620 | And I think that when you get people into outrage,
03:43:16.700 | they take sides.
03:43:18.380 | And taking sides is really bad,
03:43:20.020 | but I think that we're all beginning to see this.
03:43:22.060 | And I think that actually I'm very optimistic
03:43:24.780 | about how things will evolve,
03:43:26.220 | because I wonder how much productivity
03:43:31.220 | has Twitter and social media taken out of humanity?
03:43:33.780 | 'Cause how many now, I mean,
03:43:36.580 | so the good thing about Twitter is it gives power,
03:43:38.940 | so it gives voice to minorities, right?
03:43:41.300 | And that's good to some degree,
03:43:44.220 | but I wonder how much voice does it give
03:43:47.460 | to all sorts of other problems
03:43:52.460 | that don't need to emerge?
03:43:54.300 | - By the way, when you say minorities,
03:43:55.780 | I think, or at least if I were to agree with you,
03:43:59.980 | what I would say is minorities broadly defined
03:44:02.380 | meaning these small groups of people, and that
03:44:07.380 | it magnifies the concerns of the small versus the big.
03:44:12.820 | - That is good to some degree,
03:44:15.700 | but I think, I mean, I have to be careful,
03:44:17.980 | because I think I have a very,
03:44:20.940 | I mean, I think that the world isn't that broken, right?
03:44:23.420 | I think the world is a pretty cool place.
03:44:25.100 | I think academia is really great.
03:44:26.980 | I think climate change presents
03:44:29.980 | a really interesting problem for humanity
03:44:32.380 | that we will solve.
03:44:33.540 | - I like how you said it, it's a pretty cool problem.
03:44:35.700 | (laughs)
03:44:36.780 | For civilization, it's a big one.
03:44:38.460 | - Well, it's a bunch of, I wanna--
03:44:39.820 | - There's a bunch of really, really big problems
03:44:42.620 | that if solved can significantly improve
03:44:45.940 | the quality of life for a large,
03:44:47.540 | that ultimately is what we're trying to do,
03:44:49.540 | improve how awesome life is
03:44:51.660 | for the maximum number of people.
03:44:53.420 | - Yeah, and I think that coming back to consciousness,
03:44:56.620 | I don't think the universe is doomed to heat death, right?
03:45:00.300 | I'm one of the optimists, that's why I want to
03:45:02.580 | kind of nudge you into thinking that time is fundamental,
03:45:04.740 | 'cause if time is fundamental,
03:45:05.820 | then suddenly you don't have to give it back.
03:45:09.180 | The universe just constructs stuff,
03:45:11.140 | and what we see around us in our construction,
03:45:13.220 | I know everyone's worried about how fragile civilization is,
03:45:16.020 | and I mean, look at the fundamentals.
03:45:17.700 | We're good until the sun expands, right?
03:45:21.980 | We've got quite a lot of resource on Earth.
03:45:23.740 | We're trying to be quite cooperative.
03:45:25.660 | Humans are nice to each other
03:45:27.740 | when they're not under enormous stress.
03:45:31.820 | But coming back to the consciousness thing,
03:45:33.740 | are we going to send human beings to Mars
03:45:36.580 | or conscious robots to Mars,
03:45:38.380 | or are we gonna send some hybrid?
03:45:41.020 | And I don't know the answer to that question right now.
03:45:43.340 | I guess Elon's gonna have a pretty good go at getting there.
03:45:46.500 | I'm not sure whether, I have my doubts,
03:45:50.900 | but I'm not qualified.
03:45:53.020 | I'm sure people have their doubts that chemputation works,
03:45:56.420 | but I've got it working.
03:45:58.020 | - And most of the cool technologies we have today
03:46:02.780 | and take for granted, like the airplane,
03:46:06.900 | aforementioned airplane,
03:46:08.580 | were things that people doubted,
03:46:10.820 | that the majority of people doubted before they came to life,
03:46:15.300 | and then they came to life.
03:46:16.140 | And speaking of hybrid AI and humans,
03:46:19.160 | it's fascinating to think about all the different ways
03:46:21.540 | that hybridization, that merger can happen.
03:46:25.100 | I mean, we currently have the smartphone,
03:46:27.340 | so there's already a hybrid,
03:46:28.540 | but there's all kinds of ways that hybrid happens,
03:46:31.460 | how we and other technology play together, like a computer,
03:46:35.820 | how that changes the fabric of human civilization
03:46:39.420 | is like wide open, who knows?
03:46:41.660 | Who knows?
03:46:42.580 | If you remove, if you remove cancer,
03:46:47.580 | if you remove major diseases from humanity,
03:46:52.520 | there's going to be a bunch of consequences
03:46:55.660 | we're not anticipating, many of them positive,
03:46:59.020 | but many of them negative.
03:47:01.140 | Many of them, most of them, at least I hope,
03:47:04.140 | are weird and wonderful and fun
03:47:06.620 | in ways that are totally unexpected.
03:47:08.820 | And we'll be sitting on a porch with a bottle of Jack Daniels
03:47:11.820 | in a rocker, and we'll say,
03:47:13.660 | kids these days don't appreciate
03:47:15.260 | how hard we had it back in the day.
03:47:17.240 | I gotta ask you, speaking of nudging,
03:47:21.340 | you and Joscha Bach nudge each other on Twitter quite a bit
03:47:25.940 | in wonderful intellectual debates.
03:47:29.420 | And of course, for people who don't know,
03:47:30.820 | Joscha Bach is this brilliant guy.
03:47:32.340 | He's been on the podcast a couple times.
03:47:34.340 | You tweeted, or he tweeted, Joscha Bach,
03:47:39.620 | everyone should follow him as well.
03:47:41.140 | You should definitely follow Mr. Lee Cronin,
03:47:43.380 | Dr. Lee Cronin.
03:47:44.460 | He tweeted, "Dinner with Lee Cronin.
03:47:48.100 | "We discussed that while we can translate
03:47:50.820 | "every working model of existence into a Turing machine,
03:47:54.360 | "the structure of the universe might be given
03:47:56.740 | "by wakes of nonexistence in a pattern generated
03:47:59.640 | "by all possible automata,
03:48:02.200 | "which exist by default."
03:48:04.340 | And then he followed on saying,
03:48:07.340 | "Face to face is the best."
03:48:09.940 | So the dinner was face to face.
03:48:12.180 | What is Joscha talking about?
03:48:14.260 | In wakes, quote, "Wakes of nonexistence
03:48:18.340 | "in a pattern generated by all possible automata,
03:48:22.600 | "which exist by default."
03:48:25.480 | So automata exist by default, apparently.
03:48:29.860 | And then there's wakes of nonexistence.
03:48:31.780 | What the hell is nonexistence in the universe?
03:48:34.980 | And also, in another conversation,
03:48:39.360 | you tweeted, "It's state machines all the way down,"
03:48:44.420 | which presumably is the origin story
03:48:46.220 | of this dinner discussion.
03:48:47.820 | And then Joscha said, again, nudging,
03:48:51.980 | nudging, nudging/trolling.
03:48:55.520 | Joscha said, "You've seen the light.
03:48:57.860 | "Welcome, friend.
03:48:59.220 | "Many foundational physicists effectively believe
03:49:01.580 | "in some form of hypercomputation."
03:49:04.420 | Lee is coming around to this idea.
03:49:07.060 | And then you said, "I think there are notable differences.
03:49:10.120 | "First, I think the universe does not have
03:49:12.160 | "to be a computer.
03:49:13.020 | "Second, I want to understand how the universe emerges
03:49:16.720 | "constructors that build computers.
03:49:18.380 | "And third, is that there is something
03:49:21.820 | "below church touring."
03:49:24.980 | Okay.
03:49:26.580 | What the heck is this dinner conversation about?
03:49:30.780 | Maybe, put another way, maybe zooming out a little bit,
03:49:34.700 | are there interesting agreements or disagreements
03:49:37.020 | between you and Joscha Bach that can elucidate
03:49:41.540 | some of the other topics we've been talking about?
03:49:44.060 | - Yeah, so Joscha has an incredible mind,
03:49:46.100 | and he's so well-read,
03:49:48.780 | and uses language really elegantly.
03:49:53.300 | It bamboozles me at times.
03:49:54.820 | So often, I'm describing concepts in a way
03:50:00.420 | that I built from the ground up,
03:50:03.140 | and so we misunderstand each other a lot.
03:50:05.300 | - And he's floating in the clouds?
03:50:07.020 | Is that what you're saying?
03:50:08.180 | - Something like, not quite, but I think,
03:50:09.780 | so this concept of a Turing machine.
03:50:11.620 | So Turing machines, I would argue,
03:50:15.420 | and I think this is the crucial point,
03:50:18.500 | the universe is not a Turing machine.
03:50:21.060 | Biology is not even a Turing machine, right?
03:50:23.340 | Because Turing machines don't evolve, right?
03:50:25.620 | There is this problem that people see
03:50:27.700 | Turing machines everywhere.
03:50:28.620 | But isn't it interesting,
03:50:30.100 | the universe gave rise to biology
03:50:31.580 | that gave rise to intelligence
03:50:32.820 | that gave rise to Alan Turing,
03:50:34.420 | who invented the abstraction of the Turing machine,
03:50:37.060 | and allowed us to digitize.
03:50:39.580 | And so I've been looking for the mystery
03:50:43.340 | at the origin of life, the origin of intelligence,
03:50:45.500 | and the origin of this.
03:50:46.420 | And when I discuss this with Joscha, I think
03:50:49.260 | he was saying, well, the universe,
03:50:51.300 | of course the universe is a Turing machine.
03:50:53.260 | Of course, there's a hypercomputer there.
03:50:56.260 | And I think we got kind of trapped in our words,
03:50:58.580 | in terms, because of course it's possible
03:51:01.340 | for a Turing machine or computers
03:51:03.060 | to exist in the universe.
03:51:04.180 | We have them.
03:51:05.500 | But what I'm trying to understand is,
03:51:07.740 | where did the transition of continuous to discrete occur?
03:51:11.340 | And this is because of my general foolishness
03:51:14.900 | in understanding the continuous.
03:51:19.540 | But I guess what I'm trying to say is,
03:51:23.540 | there were constructors before there were abstractors.
03:51:26.780 | Because how did the universe abstract itself into existence?
03:51:31.460 | And it goes back to earlier saying,
03:51:32.820 | could the intelligence of the universe have come first?
03:51:35.220 | - What's a constructor, what's an abstractor?
03:51:37.260 | - So the abstractor is the ability of say,
03:51:40.580 | of Alan Turing and Gödel and Church,
03:51:45.100 | to think about the mathematical universe,
03:51:48.980 | and to label things.
03:51:50.740 | And then from those labels, to come up with a set of axioms,
03:51:54.460 | with those labels, and to basically understand
03:51:57.620 | the universe mathematically and say,
03:51:59.140 | okay, I can label things.
03:52:01.180 | But where did the labeler come from?
03:52:03.140 | Where is the prime labeler?
03:52:04.700 | - Even if the universe is not a Turing computer,
03:52:09.700 | does that negate the possibility that a Turing computer
03:52:14.820 | can simulate the universe?
03:52:16.260 | Like, just because the abstraction was formed
03:52:18.700 | at a later time, does that mean that abstraction,
03:52:22.660 | this goes back to our cellular automata conversation.
03:52:24.900 | - Yeah.
03:52:25.740 | - You're taking away some of the magic
03:52:26.660 | from the cellular automata,
03:52:27.660 | because very intelligent biological systems
03:52:30.380 | came up with that cellular automata.
03:52:32.380 | - Well, this is where the existence is the default, right?
03:52:34.380 | Is it, does the fact that we exist,
03:52:36.260 | and we can come up with a Turing machine,
03:52:38.100 | does that mean the universe
03:52:39.620 | has to be a Turing machine as well?
03:52:41.980 | - But can it be a Turing machine though?
03:52:44.020 | So there's the 'has to be' and the 'can it be.'
03:52:46.900 | - Can it be, sure.
03:52:50.300 | I don't know, I don't understand if it has to be or not.
03:52:53.660 | Can it be?
03:52:54.780 | But can the universe have Turing machines in it?
03:52:58.020 | Sure, they exist now.
03:53:00.060 | I'm wondering though, maybe,
03:53:04.380 | and this is when things get really hairy,
03:53:07.460 | is I think the universe, maybe in the past,
03:53:09.420 | did not have the computational power that it has now.
03:53:17.900 | - This is almost like a law of physics,
03:53:20.700 | kind of, so the computational power is not conserved,
03:53:24.260 | you can get some free lunch?
03:53:28.700 | - Yeah, I mean, the fact that we now,
03:53:30.660 | we sit here in this period in time,
03:53:32.340 | and we can imagine all these robots
03:53:33.980 | and all these machines, and we built them.
03:53:36.060 | And so we can easily imagine going back in time
03:53:38.300 | that the universe was capable of having them,
03:53:39.780 | but I don't think it can.
03:53:41.940 | - So the universe may have been a lot dumber computationally?
03:53:45.260 | - And I think that's why,
03:53:46.820 | I don't want to go back to the time discussion,
03:53:48.540 | but I think it has some relationship with it.
03:53:50.780 | The universe is basically smarter now than it used to be,
03:53:53.460 | and it's gonna continue getting smarter over time
03:53:57.060 | because of novelty generation,
03:53:59.420 | and the ability to create objects
03:54:00.980 | within objects within objects.
03:54:02.460 | - You know, perhaps grounded in physics,
03:54:05.260 | there's this intuition of conservation.
03:54:07.900 | - Yeah. - That stuff is conserved.
03:54:09.220 | Like, you're not, you've always had all,
03:54:12.460 | everything, you're just rearranging books
03:54:15.260 | on the bookshelf through time.
03:54:18.580 | - So, okay. - But you're saying,
03:54:19.660 | like, new books are being written.
03:54:21.700 | - Which laws do you want to break?
03:54:23.020 | At the origin of the Big Bang,
03:54:25.940 | you had to break the second law
03:54:27.380 | 'cause we got order for free.
03:54:28.500 | - Yeah.
03:54:29.580 | - Well, what I'm telling you now
03:54:30.420 | is that the energy isn't conserved in the universe.
03:54:33.460 | - Oh, it's the second law, okay, I gotcha.
03:54:35.460 | - So because, but not in a mad way.
03:54:38.060 | - Okay, so computation potentially is not conserved,
03:54:44.660 | which is a fascinating idea.
03:54:46.460 | Intelligence is not conserved.
03:54:49.200 | Complexity is not conserved.
03:54:52.500 | I suppose that's deeply connected
03:54:57.420 | to time being fundamental.
03:55:00.340 | - The natural consequence of that
03:55:03.180 | is if time is fundamental
03:55:05.300 | and the universe is inflating in time, if you like,
03:55:09.660 | then there are one or two conservation laws
03:55:11.860 | that we need to have a look at.
03:55:14.060 | And I wonder if that means the total energy
03:55:16.220 | of the universe is actually increasing over time.
03:55:18.980 | And this may be completely ludicrous,
03:55:21.100 | but we do have all this dark energy.
03:55:25.300 | We have some anomalies, let's say,
03:55:28.060 | dark matter and dark energy.
03:55:29.380 | If we don't add them in,
03:55:30.980 | galaxies don't hold together, so dark matter, I think,
03:55:35.420 | you know, you need it to hold the galaxies together,
03:55:36.980 | and there's some other observational issues.
03:55:38.900 | Could dark energy just be time?
03:55:42.140 | So figuring out what dark energy is
03:55:44.500 | might give us some deep clues about this,
03:55:48.120 | not just time, but the consequences of time.
03:55:53.140 | - So it could be that, I mean,
03:55:55.300 | I'm not saying there's perpetual motion allowed
03:55:57.220 | and it's free lunch, but I'm saying
03:55:59.100 | if the universe is intrinsically asymmetric
03:56:02.860 | and it appears to be,
03:56:04.120 | and it's generating novelty and it appears to,
03:56:09.720 | couldn't that just be mechanistically how reality works?
03:56:13.520 | And therefore, I don't really like this idea that the,
03:56:18.520 | so I want to live in a deterministic universe
03:56:21.920 | that is undetermined.
03:56:23.180 | - Yeah. - Right?
03:56:25.160 | The only way I can do that is if time is fundamental.
03:56:27.640 | Because otherwise, all the rest of it
03:56:30.360 | is just sleight of hand, because the physicists will say,
03:56:33.320 | yes, everything's deterministic.
03:56:35.360 | Newtonian mechanics is deterministic.
03:56:38.540 | Quantum mechanics is deterministic.
03:56:40.520 | Let's take the Everettian view.
03:56:42.440 | And then basically we can just draw out
03:56:44.080 | this massive universe branching,
03:56:46.160 | but it closes again, we get it all back.
03:56:48.000 | And don't worry, your feeling of free will is effective.
03:56:50.920 | But what the physicists are actually saying
03:56:53.540 | is the entire future is mapped out.
03:56:55.520 | And that is clearly problematical.
03:56:59.180 | And clearly--
03:57:01.640 | - It's just that's not so clear.
03:57:05.080 | - Yeah. - It's just problematic.
03:57:07.820 | - Well, yeah, yeah, so it's--
03:57:10.300 | - 'Cause that maybe is just the way it is.
03:57:12.100 | It's problematic to you, a particular creature
03:57:14.220 | along this timeline.
03:57:15.060 | - I want to reduce the number of beliefs
03:57:16.500 | I need to understand the universe.
03:57:18.500 | So if time is fundamental, I don't need
03:57:21.240 | to have magic order at the beginning,
03:57:23.980 | and I don't need a second law.
03:57:25.580 | - But you do need the magical belief
03:57:27.400 | that time is fundamental.
03:57:28.780 | - Well, I need the observation that I'm seeing
03:57:32.200 | to be just how it is all the way down.
03:57:34.380 | - But the Earth also looks flat
03:57:36.000 | if you just go with your observation.
03:57:40.820 | So we can't necessarily trust our observation.
03:57:43.500 | - I know the Earth isn't flat
03:57:44.640 | because I can send a satellite into space.
03:57:46.400 | I can fly. - No, but now you see,
03:57:47.440 | now you're using the tools of science
03:57:49.720 | and the technology of science.
03:57:50.560 | - But I'm saying I'm gonna do experiments
03:57:53.240 | that start to show.
03:57:55.160 | I mean, I think that it's worth,
03:57:57.400 | so if you can't, so if I cannot do an experiment
03:58:01.060 | or a thought experiment that will test this assumption,
03:58:04.640 | then the assumption is without merit, really, in the end.
03:58:07.760 | You know, that's fine.
03:58:09.520 | - Yeah, that's a beautiful ideal you hold yourself to.
03:58:12.360 | That's, given that you think deeply
03:58:16.960 | in a philosophical way, you think about some
03:58:19.560 | of these really important issues
03:58:21.880 | and you have theoretical frameworks like assembly theory,
03:58:25.940 | it's really nice that you're always grounded
03:58:30.020 | with experiment.
03:58:31.360 | - Oh, I have. - That's so refreshing.
03:58:33.280 | That's so beautifully refreshing.
03:58:35.340 | Now that we're holding you to the grounded in experiment,
03:58:40.340 | to the harsh truth of reality,
03:58:42.280 | let me ask the big ridiculous question.
03:58:45.240 | What is the meaning of this whole thing?
03:58:47.440 | What's the meaning of life?
03:58:49.920 | This time is fundamental.
03:58:52.640 | It's marching forward and along this long timeline,
03:58:58.120 | come along a bunch of descendants of apes
03:59:01.580 | that have come up with cellular automata and computers
03:59:05.580 | and now chemputers, why?
03:59:08.220 | - I have so many different answers to this question.
03:59:11.820 | It depends on what day.
03:59:13.820 | I would say that given the way of the conversation
03:59:16.060 | we've had today, I'd say the meaning,
03:59:18.500 | well, we make our own meaning, I think that's fine,
03:59:20.940 | but I think the universe wants to explore
03:59:24.700 | every possible configuration that it's allowed
03:59:27.220 | us to explore and this goes back to the kind of question
03:59:30.400 | that you're asking about Joscha,
03:59:32.000 | and the existence and non-existence of things, right?
03:59:35.200 | So if the universe is a Turing machine,
03:59:37.060 | it's churning through a lot of states
03:59:38.900 | and you think about combinatorial theory,
03:59:42.200 | before assembly theory, so everything is possible.
03:59:45.240 | What Joscha and I were saying is,
03:59:47.520 | well, not everything is,
03:59:48.920 | we don't see the combinatorial explosion.
03:59:52.020 | We see something else and what we see is evidence of memory
03:59:57.740 | so there clearly seems to be some interference
04:00:00.500 | between the combinatorial explosion of things
04:00:04.140 | and what the universe allows and it's like this kind of
04:00:07.540 | constructive destructive interference.
04:00:09.780 | So maybe the universe is not just about,
04:00:13.620 | it is assembling objects in space and time
04:00:18.220 | and those objects are able to search more space and time
04:00:22.140 | and the universe is just infinitely creative
04:00:24.240 | and I guess I'm searching for why the universe
04:00:27.300 | is infinitely creative, or whether it is infinitely creative,
04:00:29.460 | and maybe the meaning is just simply to make
04:00:31.380 | as many objects, create as many things as possible
04:00:34.820 | and I see a future of the universe that doesn't result
04:00:37.220 | in the heat death of the universe.
04:00:38.740 | The universe is gonna extract every ounce of creativity
04:00:42.820 | it can out of it 'cause that's what we see on Earth, right?
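[To make the "combinatorial explosion" point above concrete, here is a toy back-of-the-envelope sketch, using my own numbers and assumptions rather than anything Lee or Joscha stated: even granting every atom in the universe an absurdly fast sampling rate for its whole age, the space of modest-length sequences is already out of reach, which is the sense in which the objects we do see carry memory rather than being a uniform draw from the space.]

```python
# Toy illustration of the combinatorial explosion versus what could ever be sampled.
ALPHABET = 20          # e.g. the 20 standard amino acids (assumed example)
ATOMS = 10**80         # rough atom count of the observable universe
AGE_SECONDS = 4.35e17  # ~13.8 billion years in seconds
FAST_RATE = 1e12       # deliberately generous: 10^12 trial sequences per atom per second

max_trials = ATOMS * AGE_SECONDS * FAST_RATE   # wildly generous upper bound on sampling

def sequence_space(length: int) -> int:
    """Number of distinct linear sequences of the given length."""
    return ALPHABET ** length

for n in (60, 85, 100):
    space = sequence_space(n)
    print(f"length {n}: {space:.2e} sequences, exceeds all possible trials: {space > max_trials}")
# Around length ~85 the sequence space overtakes even this generous trial budget,
# yet functional proteins hundreds of residues long exist.
```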
04:00:45.900 | - And if you think that almost like intelligence
04:00:49.020 | is not conserved, that maybe creativity isn't either.
04:00:52.340 | Maybe like it's an infinite well.
04:00:57.040 | So like creativity is ultimately tied to novelty.
04:01:00.460 | You're coming up with cool new configurations of things
04:01:03.700 | and maybe that just can continue indefinitely
04:01:06.320 | and this human species that was created along the way
04:01:09.920 | is probably just one method like that's able to ask
04:01:14.920 | the universe about itself.
04:01:16.600 | It's just one way to explore creativity.
04:01:19.500 | Maybe there's other meta levels on top of that.
04:01:22.700 | Like obviously as a collective intelligence
04:01:24.880 | we'll create organisms, maybe there'll be organisms
04:01:27.980 | that ask themselves questions about themselves
04:01:32.560 | in a deeper, bigger picture way than we humans do.
04:01:38.220 | First they'll ask questions about the humans
04:01:39.840 | and then construct some kind of hybrid systems
04:01:42.920 | that ask themselves about the collective aspect.
04:01:45.520 | Just like some weird stacking
04:01:47.120 | that we can't even imagine yet.
04:01:49.100 | - And that stacking, I mean I have discussed
04:01:50.600 | this stacking a lot with Sarah Walker
04:01:52.840 | who's a professor of physics and astrobiology at ASU.
04:01:57.440 | And we argue about how creative the universe is gonna be
04:02:02.060 | and is it as deterministic as all that
04:02:03.880 | because I think she's more of a free will thinker
04:02:07.640 | and I'm of a less free will thinker
04:02:10.080 | but I think we're beginning to converge
04:02:12.240 | and understanding that.
04:02:14.000 | Because there's simply a missing understanding.
04:02:16.000 | Right now we don't understand how the universe,
04:02:19.040 | we don't know what rules the universe has
04:02:21.340 | to allow the universe to contemplate itself.
04:02:23.460 | So asking the meaning of it before we even know
04:02:26.500 | what those rules are is premature
04:02:28.280 | but my guess is that it's not meaningless
04:02:30.660 | and it isn't just about the,
04:02:32.060 | and there are three levels of meanings.
04:02:33.880 | Obviously the universe wants us to do stuff.
04:02:35.840 | We're interacting with each other
04:02:36.900 | so we create meaning in our own society
04:02:38.540 | and our own interactions with humanity.
04:02:41.220 | But I do think there is something else going on.
04:02:44.480 | But because reality is so weird
04:02:47.660 | we're just scratching at that
04:02:49.980 | and I think that we have to make the experiments better
04:02:52.960 | and we have to perhaps join across
04:02:56.580 | not just for the computationalists,
04:02:58.620 | and what I tried to do with Joscha is meet him halfway.
04:03:01.140 | Say, well, what happens if I become a computationalist?
04:03:03.220 | What do I gain?
04:03:04.240 | A lot it turns out
04:03:05.660 | because I can make Turing machines in the universe.
04:03:08.180 | 'Cause on the one hand I'm making computers
04:03:09.860 | which are state machines inspired by Turing.
04:03:12.020 | On the other hand I'm saying they can't exist.
04:03:14.580 | Well clearly I can't have my cake and eat it
04:03:16.940 | so there's something weird going on there.
04:03:18.980 | So then did the universe have to make a continuous
04:03:21.260 | to a discrete transition or is the universe just discrete?
04:03:23.820 | It's probably just discrete.
04:03:25.360 | So if it's just discrete then there are,
04:03:27.500 | I will then give Joscha his Turing-like property
04:03:31.260 | in the universe but maybe there's something else below it
04:03:33.620 | which is the constructor that constructs a Turing machine
04:03:36.660 | that then constructs, you know,
04:03:38.700 | it's a bit like you generate a computing system
04:03:42.920 | that then is able to build an abstraction
04:03:44.620 | that then recognizes, it can make a generalizable abstraction
04:03:48.500 | because human beings with mathematics have been able
04:03:52.500 | to go on and build physical computers
04:03:55.180 | if that makes any sense.
04:03:57.060 | And I think that's the meaning.
04:03:58.100 | I think that's, you know, as far as we can,
04:04:00.800 | the meaning will be further elucidated
04:04:02.380 | with further experiments.
04:04:03.660 | (laughing)
04:04:05.900 | - Well you mentioned Sarah.
04:04:08.420 | I think you and Sarah Walker
04:04:11.420 | are just these fascinating human beings.
04:04:14.460 | I'm really fortunate to have the opportunity
04:04:18.580 | to be in your presence, to study your work,
04:04:20.160 | to follow along with your work.
04:04:21.660 | I'm a big fan.
04:04:23.740 | Like I told you offline, I hope we get a chance
04:04:26.500 | to talk again with perhaps just the two of us
04:04:29.900 | but also with Sarah.
04:04:30.860 | That's a fascinating dynamic for people who haven't heard,
04:04:35.340 | I suppose on Clubhouse is where I heard you guys talk
04:04:38.240 | but you have an incredible dynamic.
04:04:39.940 | And I also can't wait to hear you and Joscha talk.
04:04:43.220 | So I think if there's some point in this predetermined
04:04:48.180 | or yet to be determined future
04:04:50.100 | where the three of us, you and Sarah,
04:04:54.900 | or the four of us with Joscha could meet and talk
04:04:57.940 | would be a beautiful future.
04:04:59.340 | And I look forward to most futures
04:05:02.160 | but I look forward to that one in particular.
04:05:04.620 | Lee, thank you so much for spending your valuable time
04:05:07.500 | with me today.
04:05:08.340 | - Thanks, Lex.
04:05:09.160 | It's been a pleasure.
04:05:10.660 | - Thanks for listening to this conversation with Lee Cronin.
04:05:13.620 | To support this podcast,
04:05:15.140 | please check out our sponsors in the description.
04:05:18.140 | And now let me leave you with some words
04:05:20.380 | from the mad scientist, Rick Sanchez
04:05:22.460 | of Rick and Morty fame.
04:05:23.960 | "To live is to risk it all.
04:05:27.520 | Otherwise you're just an inert chunk
04:05:31.820 | of randomly assembled molecules
04:05:34.220 | drifting wherever the universe blows you."
04:05:39.140 | Thank you for listening and hope to see you next time.
04:05:41.780 | (upbeat music)
04:05:44.360 | (upbeat music)