Lee Cronin: Origin of Life, Aliens, Complexity, and Consciousness | Lex Fridman Podcast #269
Chapters
0:00 Introduction
2:02 Life and chemistry
15:27 Self-replicating molecules
25:51 Origin of life
42:16 Life on Mars
47:20 Aliens
54:01 Origin of life continued
60:55 Fermi Paradox
70:35 UFOs
78:56 Science and authority
84:59 Pickle experiment
87:54 Assembly theory
130:53 Free will
142:08 Cellular automata
165:40 Chemputation
182:54 Universal programming language for chemistry
196:05 Chemputer safety
208:47 Automated engineering of nanomaterials
217:46 Consciousness
227:19 Joscha Bach
238:35 Meaning of life
00:00:00.000 |
The following is a conversation with Lee Cronin, 00:00:05.160 |
who's one of the most fascinating, brilliant, 00:00:07.440 |
out of the box thinking scientists I've ever spoken to. 00:00:11.520 |
This episode was recorded more than two weeks ago, 00:00:27.320 |
I will try to release a solo episode on this war, 00:00:31.880 |
that makes sense of it for myself and others, so I may not. 00:00:36.880 |
I ask for your understanding no matter which path I take. 00:00:51.400 |
When I returned to this conversation with Lee, 00:00:57.880 |
He's a beautiful, brilliant, and hilarious human being. 00:01:03.680 |
of the mad scientist Rick Sanchez from Rick and Morty. 00:01:06.480 |
I thought about quitting this podcast for a time, 00:01:37.280 |
as I keep going with this silly little podcast, 00:01:41.520 |
including through some difficult conversations, 00:01:43.960 |
and hopefully many fascinating and fun ones too. 00:02:05.760 |
and what insights does that give us about life? 00:02:11.440 |
and you think about maybe 4.7, 4.6, 4.5 billion years ago, 00:02:22.120 |
and I think that maybe it's a really simple set of chemistry 00:02:28.240 |
So that means you've got a finite number of elements 00:02:30.640 |
that are going to react very simply with one another, 00:02:41.040 |
So what I think I can say with some degree of, 00:02:46.860 |
genuine curiosity is that life happened fast. 00:02:50.280 |
- Yeah, so when we say fast, this is a pretty surprising fact 00:02:55.280 |
and maybe you can actually correct me and elaborate, 00:02:58.080 |
but it seems like most, like 70 or 80% of the time 00:03:02.520 |
that Earth has been around, there's been life on it. 00:03:06.160 |
So when you say fast, like the slow part is from single cell 00:03:11.160 |
or from bacteria to some more complicated organism. 00:03:14.280 |
It seems like most of the time that Earth has been around, 00:03:17.560 |
it's been single cell or like very basic organisms, 00:03:24.400 |
That's really, I recently kind of revisited our history 00:03:28.760 |
and saw this, and I was just looking at the timeline. 00:03:32.840 |
Wait a minute, like how did life just spring up so quickly? 00:03:38.200 |
That makes me think that it really wanted to. 00:03:41.420 |
Like put another way, it's very easy for life to spring. 00:03:45.980 |
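The 70 or 80% figure mentioned above can be sanity-checked with rough numbers; the ages below are approximate, commonly cited values, not figures from the conversation:

```python
# Rough sanity check of the claim that life has existed for most of
# Earth's history. Ages are approximate, commonly cited values.
earth_age = 4.5        # age of Earth, billions of years
oldest_life = 3.7      # oldest widely accepted evidence of life, billions of years ago
complex_life = 0.6     # rough onset of complex multicellular animals

print(f"fraction of Earth's history with life:    {oldest_life / earth_age:.0%}")
print(f"fraction with complex multicellular life: {complex_life / earth_age:.0%}")
```

On these assumptions, life spans roughly 80% of Earth's history, while complex multicellular life covers only the last ~13%, which is the "slow part" being described.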
- Yeah, I agree, I think it's much more inevitable. 00:03:54.680 |
'cause chemists are central to this problem, right? 00:03:57.800 |
Of understanding the origin of life on Earth at least, 00:04:02.360 |
But I wonder if the origin of life on a planet, 00:04:18.960 |
and we can get into this, I think, in quite some detail. 00:04:32.440 |
had to occur before the biology was established. 00:04:43.520 |
and capability and functionality and autonomy. 00:04:46.320 |
And I think these are all really important words 00:05:01.840 |
and then the degree of autonomy and sophistication. 00:05:05.440 |
Because I think that people misunderstand what life is. 00:05:11.960 |
and some people that say that life is a virus, 00:05:34.440 |
you can arrange the fundamental particles to do things. 00:05:58.680 |
And when you say memory, it's like there's a stickiness 00:06:01.800 |
to a bunch of the stuff that's building together. 00:06:04.080 |
So you can, in a stable way, trace back the complexity 00:06:41.360 |
- Sure, I mean, as a chemist, a card-carrying chemist, 00:06:55.600 |
when you bring electrons together and you form bonds. 00:07:02.360 |
and I just say, well, there's bonds, there's hope. 00:07:04.800 |
Because bonds allow you to get heterogeneity, 00:07:12.360 |
a Stanisław Lem-type world where you might have life 00:07:16.760 |
emerging or intelligence emerging before life. 00:07:19.680 |
That may be something like Solaris or something. 00:07:21.960 |
But to get to selection, if atoms can combine and form bonds, 00:07:26.960 |
those bonds, those atoms can bond to different elements, 00:07:31.880 |
and those molecules will have different identities 00:07:39.120 |
of causation or interaction, and then selection, 00:07:52.560 |
there is a sufficient pool of available chemicals 00:07:56.400 |
to start searching that combinatorial space of bonds. 00:08:01.480 |
- So, okay, this is a really interesting question. 00:08:32.160 |
It's like panpsychics believe that consciousness, 00:08:36.200 |
I guess, comes before life and before intelligence. 00:08:59.840 |
That could precede, couldn't that precede bonds too? 00:09:06.240 |
- I would say that there is an elegant order to it. 00:09:09.280 |
Bonds allow selection, allows the emergence of life, 00:09:22.160 |
if you had, I don't know, a neutron star or a sun 00:09:25.360 |
or what, a ferromagnetic loops interacting with one another 00:09:28.640 |
and these oscillators building state machines 00:09:31.080 |
and these state machines reading something out 00:09:34.280 |
Over time, these state machines would be able 00:09:36.440 |
to literally record what happened in the past 00:09:46.200 |
within a human comprehension, that type of life. 00:10:00.480 |
I don't see myself traveling faster than light anytime soon. 00:10:11.360 |
My stupidity, and I don't mean that as a, you know, 00:10:16.600 |
but I mean my ability to kind of just start again 00:10:19.000 |
and ask the question and then do it with an experiment. 00:10:22.240 |
I always wanted to be a theoretician growing up, 00:10:24.360 |
but I just didn't have the intellectual capability, 00:10:27.120 |
but I was able to think of experiments in my head 00:10:30.600 |
I could then do in my lab or in the, you know, 00:10:35.760 |
and then those experiments in my head and then outside 00:10:38.640 |
reinforced one another, so I think that's a very good way 00:10:44.320 |
- Well, that's a nice way to think about theoreticians 00:10:46.640 |
is they're just people who run experiments in their head. 00:10:49.640 |
I mean, that's exactly what Einstein did, right? 00:10:51.600 |
But you were also capable of doing that in the head, 00:10:54.440 |
in your head, inside your head and in the real world 00:10:59.440 |
is when you first discovered your superpower of stupidity. 00:11:09.680 |
I am genuinely curious, so my, so I have, you know, 00:11:28.120 |
'cause you always struggle about the role of ego in life 00:11:46.600 |
then ego will do just fine and make you fun to talk to. 00:12:00.000 |
- So we're going back to selection in the universe 00:12:06.520 |
I'm convinced that selection is a force in the universe. 00:12:13.240 |
but it is a directing force because existence, 00:12:15.920 |
although existence appears to be the default, 00:12:21.120 |
Why does, and we can get to this later I think, 00:12:39.800 |
it is literally the lineage of people making cups 00:12:42.200 |
and recognizing that, seeing that in their head, 00:12:52.960 |
and that's the process of selection and existence 00:12:57.480 |
I like the handle, it's convenient so I don't die, 00:13:01.720 |
And so I think we are missing something fundamental 00:13:15.000 |
and actually I think that how humanity is gonna, 00:13:20.480 |
or whatever we're gonna call them in the future, 00:13:28.840 |
selection, if objects are being kicked in and out 00:13:35.080 |
You don't really look at that as productive selection 00:13:37.080 |
because it's not doing anything to improve its function. 00:13:53.440 |
interaction of chemicals and molecules in the environment 00:14:01.800 |
we could have an infinite number of reactions happen, 00:14:04.920 |
all the reactions that are allowed to happen don't happen. 00:14:08.920 |
So there must be some things called catalysts out there 00:14:14.640 |
that when two molecules get together in that mineral, 00:14:17.040 |
it lowers the energy barrier for the reaction 00:14:22.920 |
over another series of possibilities occurring 00:14:31.600 |
almost these waves as discrete reactions work together 00:14:44.960 |
the fact that the molecules, the bonds are getting, 00:14:52.480 |
to interact with other molecules, to redirect them, 00:15:11.920 |
on the causal chain that's produced itself and change it, 00:15:15.880 |
suddenly you start to get towards some kind of autonomy 00:15:18.480 |
and that's where life I think emerges in earnest. 00:15:21.840 |
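The point about catalysts lowering the energy barrier can be made quantitative with the Arrhenius equation, k = A·exp(-Ea/RT). The barrier heights below are purely illustrative values, not from the conversation:

```python
import math

R = 8.314   # gas constant, J/(mol*K)
T = 298.0   # room temperature, K

def arrhenius_rate(ea_kj_per_mol, prefactor=1.0):
    """Relative reaction rate k = A * exp(-Ea / RT)."""
    return prefactor * math.exp(-ea_kj_per_mol * 1000 / (R * T))

uncatalyzed = arrhenius_rate(100.0)  # illustrative barrier without a catalyst
catalyzed = arrhenius_rate(70.0)     # same reaction with the barrier lowered
ratio = catalyzed / uncatalyzed
print(f"speed-up from lowering the barrier 100 -> 70 kJ/mol: {ratio:.1e}")
```

Lowering an illustrative barrier by just 30 kJ/mol speeds the reaction up by a factor of roughly 10^5, which is why a mineral surface acting as a catalyst can select one reaction pathway over all the others.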
- Every single word in the past few paragraphs, 00:15:30.920 |
The closing of the loop that you're talking about, 00:15:37.920 |
I think you said the smallest von Neumann constructor, 00:15:47.920 |
when we think about the smallest von Neumann constructor? 00:16:06.480 |
And I'm not sure if it's John von Neumann or Johnny 00:16:18.520 |
in the Manhattan Project and developing computation 00:16:34.840 |
he probably did come up with it and didn't write it down. 00:16:37.240 |
- There was a couple of people who did at the same time 00:16:44.720 |
And what I think he imagined was that he wasn't satisfied 00:16:51.360 |
but so a lot of what I say is gonna be kind of, 00:17:06.360 |
So but what I mean is I think he liked this idea 00:17:14.080 |
literally build itself without a Turing machine, right? 00:17:16.760 |
It's like literally how do state machines emerge 00:17:28.400 |
and what would those rules look like in the world? 00:17:30.680 |
And that's what a von Neumann kind of constructor 00:17:32.800 |
looked like, like it's a minimal hypothetical object 00:17:37.880 |
And I'm really fascinated by that because I think that 00:17:42.280 |
although it's probably not exactly what happened, 00:17:56.000 |
so like with no prime mover, with no architect, 00:18:07.800 |
So you have molecule A and molecule A interacts 00:18:11.520 |
with another random molecule B and they get together 00:18:21.440 |
So AB prime is different to AB and then AB prime 00:18:25.960 |
can then act back where A and B were being created 00:18:32.920 |
and make AB prime more evolvable or learn more. 00:18:42.040 |
It feels like the mutation part is not that difficult. 00:18:46.920 |
It feels like the difficult part is just creating 00:19:22.040 |
I probably got into trouble on Twitter the other day, 00:19:24.920 |
There's a bit more than 18 ml of water in there. 00:19:24.920 |
So one mole of water, 6.022 times 10 to the 23 molecules. 00:19:31.720 |
That's about the number of stars in the universe, 00:19:34.680 |
So there's three universes' worth, but between one-- 00:19:40.840 |
It's a great, but there's a lot of molecules in the water. 00:19:49.520 |
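The arithmetic behind this comparison is simple: 18 ml of water is about one mole, i.e. Avogadro's number of molecules. The star count below is a rough, commonly cited estimate for the observable universe:

```python
# Back-of-envelope check of the "three universes' worth" comparison.
AVOGADRO = 6.022e23    # molecules per mole
molar_mass = 18.0      # g/mol for water
density = 1.0          # g/ml for water

molecules = (18.0 * density / molar_mass) * AVOGADRO   # ~1 mole of water
stars = 2e23           # rough estimate of stars in the observable universe

print(f"molecules in 18 ml of water: {molecules:.3e}")
print(f"universes' worth of stars:   {molecules / stars:.1f}")
```

With ~2 x 10^23 stars assumed, the molecules in that small glass of water outnumber the stars about three to one.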
if existence is not the default for a long period of time, 00:19:55.240 |
because what happens is the molecules get degraded. 00:20:01.920 |
So you have this kind of destruction of the molecules 00:20:12.280 |
for them suddenly to then take resources in the pool 00:20:16.400 |
And so then replication, actually, over time, 00:20:19.000 |
when you have bonds, I think is much simpler, much easier. 00:20:30.400 |
making a bit of rust based on a thing called molybdenum. 00:20:33.720 |
Molybdenum oxide, it's just molybdenum oxide, very simple. 00:20:38.620 |
But when you add acid to it, and some electrons, 00:20:40.940 |
they make these molecules you just cannot possibly imagine, 00:20:50.160 |
Or a dodecahedron, 132 molybdenum atoms, 00:20:55.440 |
And I realized when I, and I just finished experiments 00:20:58.040 |
two years ago, I've just published a couple of papers 00:21:02.680 |
there is a random small molecule with 12 atoms in it 00:21:06.400 |
that can form randomly, but it happens to template 00:21:14.060 |
Just an accident, just like, just an absolute accident. 00:21:17.280 |
And that ring also helps make the small 12-mer. 00:21:21.960 |
And so you have what's called an autocatalytic set, 00:21:25.520 |
where A makes B, and B helps make A, and vice versa. 00:21:34.560 |
So it's a bit like these, they all work in synergy 00:21:43.500 |
And it doesn't take a very sophisticated model 00:21:46.600 |
to show that if you have these objects competing 00:21:50.540 |
and then collaborating to help one another build, 00:21:55.000 |
And although they seem improbable, they aren't improbable. 00:22:02.800 |
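The autocatalytic loop described above (A helps make B, B helps make A, both degrading because "existence is not the default") can be sketched as a toy rate model; all rate constants here are arbitrary illustrative values, not measured chemistry:

```python
# Toy simulation of a two-member autocatalytic set drawing on a
# finite pool of raw material. All constants are illustrative.
dt = 0.01
raw = 1000.0            # pool of available raw material
a, b = 0.01, 0.01       # trace initial amounts of A and B
k_cat = 0.005           # catalyzed formation rate
k_decay = 0.01          # degradation rate (molecules fall apart)

for step in range(2000):
    make_a = k_cat * b * raw * dt   # B templates the formation of A
    make_b = k_cat * a * raw * dt   # A templates the formation of B
    raw -= make_a + make_b
    a += make_a - k_decay * a * dt
    b += make_b - k_decay * b * dt

print(f"A: {a:.1f}, B: {b:.1f}, raw material left: {raw:.1f}")
```

Starting from trace amounts, the pair grows exponentially and consumes essentially the whole pool, which is the sense in which a seemingly improbable object can take over once the mutual catalysis closes the loop.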
This is where people look at the blind watchmaker 00:22:06.040 |
argument when you're talking about how could a watch 00:22:24.240 |
that little discovery, like with the wheel and fire, 00:22:28.480 |
it just gets, explodes in, because it's so successful, 00:22:46.640 |
So in chemistry somehow it's possible to imagine 00:22:52.920 |
that that kind of thing is easy to spring up. 00:22:57.000 |
In more complex organisms, it feels like a different thing 00:23:01.680 |
We're having multiple abstractions of the birds 00:23:06.820 |
But with human, sorry, with complex organisms, 00:23:25.840 |
'Cause that seems like a magical idea for life to, 00:23:32.960 |
That feels like very necessary for what selection is, 00:23:43.440 |
of the selecting mechanism at different scales. 00:23:50.720 |
If you want to, and what happens is that life, 00:23:57.680 |
life on Earth, biology on Earth, is unique to Earth. 00:24:26.800 |
that selection can occur without the ribosome, 00:24:39.640 |
that is probably much easier to get to than we think. 00:24:57.280 |
So the RNA world, if you like, gets transmitted 00:25:00.900 |
and builds proteins, and the proteins are responsible 00:25:05.200 |
The majority of the catalysis goes on in the cell. 00:25:07.340 |
No ribosome, no proteins, no decoding, no evolution. 00:25:14.340 |
You don't put the RNA itself as the critical thing. 00:25:17.820 |
Like information, you put action as the most important thing. 00:25:23.700 |
are entirely contingent on the history of life on Earth. 00:25:36.560 |
I'm gonna cite your tweets, like it's Shakespeare. 00:25:40.520 |
- It's surprising you haven't gotten canceled 00:25:47.500 |
There's, you like to have a little bit of fun on Twitter. 00:25:56.860 |
So if this is Shakespeare, can we analyze this word? 00:26:02.300 |
Aren't you kind of doing origin of life research? 00:26:10.940 |
I am, I'm not doing the origin of life research. 00:26:21.340 |
but more importantly, to find origin of life elsewhere. 00:26:27.900 |
doing origin of life research, but I want to nudge them. 00:26:35.280 |
The chemistry they are doing, the motivation is great. 00:26:40.580 |
that maybe they're making assumptions about saying, 00:26:42.980 |
if only I could make this particular type of molecule, 00:26:54.140 |
it's gonna somehow unlock the origin of life. 00:26:57.800 |
And I think that origin of life has been looking at this 00:27:01.980 |
And whilst I think it's brilliant to work out 00:27:08.220 |
I think that chemistry and chemists doing origin of life 00:27:11.420 |
could be nudged into doing something even more profound. 00:27:15.940 |
And so the argument I'm making, it's a bit like right now, 00:27:33.380 |
- And what they do is they take the Tesla apart 00:27:35.620 |
and say, we wanna find the origin of cars in the universe 00:27:38.460 |
and say, okay, how did this form and how did this form? 00:27:42.620 |
till they make the door, they make the wheel, 00:27:44.360 |
they make the steering column and all this stuff. 00:27:50.420 |
But actually we know that there's a causal chain of cars 00:27:53.340 |
going right back to Henry Ford and the horse and carriage. 00:28:00.340 |
And I think that obsession with the identities 00:28:13.780 |
is in danger of not making the progress that it deserves. 00:28:24.260 |
There's amazing people out there, young and old, doing this. 00:28:27.540 |
And there's deservedly so more money going in. 00:28:31.860 |
there's more money being spent searching for the Higgs boson 00:28:37.460 |
The origin, we understand the origin of life. 00:28:48.620 |
or have a good idea of what the future of humanity 00:28:54.980 |
we're not the only life forms in the universe. 00:29:22.420 |
Let's now make this other molecule, another molecule. 00:29:38.220 |
is a mycoplasma, something, I don't know the name of it. 00:29:59.980 |
that has a causal chain going all the way back to LUCA. 00:30:05.300 |
the genes, and put in his genes, synthesized, 00:30:14.220 |
synthetic biologists cannot make a cell from scratch 00:30:32.940 |
how can we create life in the lab from scratch? 00:30:54.740 |
Lots of people would argue that they have made progress. 00:30:59.300 |
of a synthetic genome milestone in human achievement. 00:31:09.740 |
one of the world experts in exactly this area, 00:31:12.980 |
what does it mean to create life from scratch? 00:31:39.060 |
You used the word. - Yeah, just inorganic stuff. 00:31:47.220 |
maybe some inorganic carbon, some carbonates. 00:31:57.940 |
so they could remove anything else that could possibly 00:32:00.700 |
be like a shadow of life that can assist in the chemical-- 00:32:06.340 |
- You could do that, you could insist and say, 00:32:07.940 |
look, I'm gonna take, and not just inorganic, 00:32:09.940 |
I want some more, I wanna cheat and have some organic, 00:32:13.900 |
and I'll explain the play on words in a moment. 00:32:16.100 |
So I would like to basically put into a world, 00:32:19.420 |
let's say a completely, you know, a synthetic world, 00:32:26.700 |
and just literally add some energy in some form, 00:32:38.620 |
So I see the origin of life as a search problem 00:32:42.700 |
And then I would wait, literally wait for a life form 00:32:53.900 |
You know, there's ways of ethically containing it. 00:33:03.220 |
It'll make you, it will not make you look good 00:33:07.660 |
and destroys all of human civilization, but yes, let's-- 00:33:11.060 |
there are very good things you can do to prevent that. 00:33:17.460 |
so let's say we make life based on molybdenum 00:33:21.940 |
'cause there's not enough molybdenum in the environment. 00:33:23.460 |
So we can put in, we can do responsible life. 00:33:26.700 |
Or as I fantasize with my research group on our away day 00:33:33.540 |
if we don't find, until humanity finds life in the universe, 00:33:39.260 |
it's our moral obligation to make origin of life bombs, 00:33:43.820 |
with our origin of life machines and make them alive. 00:33:46.580 |
I think it is our moral obligation to do that. 00:33:49.700 |
I'm sure some people might argue with me about that, 00:33:51.780 |
but I think that we need more life in the universe. 00:34:06.820 |
I think this is, once again, a Rick and Morty episode. 00:34:11.940 |
So I imagine we have this pristine experiment 00:34:18.100 |
And we put in inorganic materials and we have cycles, 00:34:21.940 |
whether day, night cycles, up, down, whatever. 00:34:30.580 |
Now, are there people doing this in the world right now? 00:34:38.700 |
They're kind of, perhaps, too much associated with the scam. 00:34:48.500 |
that are already, were already invented by biology. 00:34:54.100 |
But I still think the work they're doing is amazing. 00:34:59.780 |
and say, let's just basically shake a load of sand in a box 00:35:19.860 |
I'm not very rich, so it'd just be a few dollars. 00:35:21.940 |
But for me, the solution space will be different. 00:35:33.980 |
The solutions will be just completely different. 00:35:37.820 |
because that's the other thing we should be able to show 00:35:43.860 |
or someone did make a new life form in the lab, 00:35:46.100 |
it would be so poor that it's not gonna leap out. 00:35:50.100 |
It is, the fear about making a lethal life form 00:35:55.100 |
in the lab from scratch is similar to us imagining 00:36:05.340 |
And the problem is, we don't communicate that properly. 00:36:08.500 |
I know you yourself, you explain this very well. 00:36:26.580 |
- But this is a much, much longer discussion, 00:36:30.020 |
I would say there's potential for catastrophic events 00:36:36.660 |
In the AI space, there's a lot of ways to create, 00:36:40.060 |
like social networks are creating a kind of accelerated 00:36:45.060 |
set of events that we might not be able to control. 00:36:48.380 |
That social network virality in the digital space 00:36:53.260 |
can create mass movements of ideas that can then, 00:37:05.660 |
That's an interesting at-scale application of AI. 00:37:09.740 |
If you look at viruses, viruses are pretty dumb. 00:37:13.300 |
But at scale, their application is pretty detrimental. 00:37:16.820 |
And so origin of life, much like all the kind of virology, 00:37:21.820 |
the very contentious word of gain-of-function research 00:37:28.540 |
in virology, sort of like research on viruses, 00:37:36.100 |
that can create a lot of problems if not done well. 00:37:41.820 |
So there's a kind of, whenever you're ultra-cautious 00:37:44.920 |
about stuff in AI or in virology and biology, 00:37:53.420 |
where it's like everything we do is going to turn out 00:37:58.680 |
so I'm just going to sit here and do nothing. 00:38:03.460 |
except for the fact that somebody's going to do it. 00:38:10.500 |
so we have to do it in an ethical way, in a good way, 00:38:14.800 |
considering in a transparent way, in an open way, 00:38:17.820 |
considering all the possible positive trajectories 00:38:25.480 |
as much as possible, that we walk those trajectories. 00:38:31.080 |
but a totally unexpected version of Terminator 00:38:39.000 |
And so going back to the origin of life discussion, 00:38:44.500 |
we have to be very careful about how we edit genomes 00:39:01.840 |
or artificial life research to get to the point 00:39:06.860 |
because that's why I think we're just so far away from that. 00:39:10.200 |
Right now, I think there are two really important angles. 00:39:15.900 |
researchers who are faithfully working on this 00:39:22.800 |
And then there are people on the creationist side 00:39:24.920 |
who are saying, look, the fact you can't make these molecules 00:39:26.960 |
and you can't make a cell means that evolution isn't true 00:39:34.200 |
because actually the origin of life researchers 00:39:38.160 |
And so what I'm trying to do is give origin of life research 00:39:49.440 |
I really want to make a new life form in my lifetime. 00:39:52.340 |
I really want to prove that life is a general phenomena, 00:39:58.160 |
because I think that's gonna be really important 00:40:09.120 |
So one, it will help us understand ourselves, 00:40:31.920 |
then that's just as good, if not much better, 00:40:38.060 |
I mean, it's cheaper, it's much cheaper and much easier 00:40:52.280 |
that there's probably a lot of different ways to do it. 00:41:02.680 |
- Yeah, and wouldn't it be great if we could find a solution? 00:41:09.080 |
but I'm not that worried about climate change. 00:41:15.160 |
that could basically, and I don't want to do this, 00:41:21.200 |
that would perhaps take CO2 out of the atmosphere 00:41:23.620 |
or an intermediate life form, so it's not quite alive, 00:41:32.200 |
you could give to, say, cyanobacteria in the ocean 00:41:40.320 |
and we're gonna work out how much we need to fix 00:41:49.560 |
What worries me is that biology has had a few billion years 00:42:03.200 |
But I think if we can do, as you say, make life in the lab, 00:42:06.720 |
then suddenly we don't need to go to everywhere 00:42:12.440 |
we look at the extent of life in the solar system, 00:42:26.600 |
we might have completely different life forms 00:42:33.400 |
- Okay, wait a minute, wait a minute, wait a minute. 00:42:36.400 |
Did you just say that you think, in terms of likelihood, 00:42:40.360 |
life started on Mars, like statistically speaking, 00:42:51.000 |
that we have right now, type of chemistry before Earth. 00:42:53.920 |
So it seems to me that Mars got searching, doing chemistry. 00:43:00.800 |
- Yeah, and so they had a few more replicators 00:43:04.240 |
And if those replicators got ejected from Mars 00:43:13.320 |
So I'm not going, I think we will find evidence 00:43:16.520 |
of life on Mars, either life we put there by mistake, 00:43:29.600 |
because of the gravitational situation in the solar system. 00:43:33.680 |
- Titan and all that, that would be its own thing. 00:43:39.960 |
that looks a hell of a lot similar to Earth life, 00:43:43.000 |
and then we'll go to Titan and all those weird moons 00:43:46.760 |
with the ices and the volcanoes and all that kind of stuff. 00:43:55.720 |
- Some other, some non-RNA type of situation. 00:44:01.800 |
And I think there are four types of exoplanets 00:44:09.120 |
When we look at a star, well, we know statistically 00:44:16.400 |
Are they gonna be just prebiotic, origin of life coming? 00:44:22.560 |
And so with intelligence on them, and will they have died? 00:44:47.920 |
and start to then frame things a little bit more. 00:44:53.840 |
- Which, by the way, you're just saying four, 00:45:20.960 |
or some weird thing that naturally happens over time. 00:45:23.920 |
- Yeah, yeah, I mean, I think that all bets are off on that. 00:45:28.960 |
- In that case, we join into a virtual metaverse, 00:45:32.480 |
and start creating, which is kind of an interesting idea, 00:45:35.360 |
almost arbitrary number of copies of each other, 00:45:38.480 |
much more quickly, to mess with different ideas. 00:45:51.240 |
argue with each other until, in the space of ideas, 00:45:58.040 |
But anyway, there's, especially in this digital space, 00:46:01.640 |
where you can start exploring with AIs mixed in, 00:46:04.520 |
you can start engineering arbitrary intelligences, 00:46:11.560 |
that looks very different than a biological world. 00:46:29.000 |
- But I did say technological, so I think I agree with you. 00:46:30.800 |
I think, so you can have, let's get this right. 00:46:36.520 |
Prebiotic world, life emerging, living, and technological. 00:46:43.080 |
between the dead never being alive and the dead one, 00:46:46.040 |
maybe you've got some artifacts, and maybe there's five. 00:46:50.440 |
And I think the technological one could allow, 00:46:53.080 |
could have life on it still, but it might just have exceeded. 00:46:56.360 |
'Cause one way that life might survive on Earth 00:46:58.840 |
is if we can work out how to deal with the coming, 00:47:01.840 |
the real climate change that comes when the sun expands. 00:47:07.720 |
But yeah, I think that we need to start thinking 00:47:11.720 |
statistically when it comes to looking for life 00:47:15.240 |
- Let me ask you then, sort of, statistically, 00:47:23.880 |
in those four phases that you're talking about? 00:47:30.800 |
and talking to other people with British accents 00:47:35.320 |
about something intelligent and intellectual, I'm sure, 00:47:38.040 |
do you think there's a lot of alien civilizations 00:47:52.240 |
So what I'm saying is I have no doubt, I have no idea. 00:47:55.480 |
But having said that, there is no reason to suppose 00:48:00.480 |
that life is as hard as we first thought it was. 00:48:08.760 |
and if I think that life is a much more general phenomena 00:48:22.200 |
it's just that we can't interact with the other life forms 00:48:28.720 |
like as depicted in Arrival or other, you know, 00:48:35.280 |
there are very few universal facts in the universe, 00:48:51.640 |
- You think there's a lot of kinds of life that's possible. 00:49:04.640 |
that we have on Earth is something that is just one sample 00:49:15.520 |
complex autonomous self-replicating type of things 00:49:58.600 |
it feels like the universe should be teeming with life, 00:50:04.000 |
And I just, I sit there and the Fermi paradox is very, 00:50:09.960 |
it's felt very distinctly by me when I look up at the stars, 00:50:19.200 |
and listening to Bruce Springsteen and feel quite sad. 00:50:25.000 |
to the side of the road and just weeping a little bit. 00:50:34.960 |
It feels lonely 'cause it feels like they're out there. 00:50:38.240 |
- I think that there are a number of answers to that. 00:50:40.120 |
I think the Fermi paradox is perhaps based on 00:50:43.360 |
the assumption that if life did emerge in the universe, 00:50:50.920 |
And I think that what we've got to start to do 00:51:03.440 |
we might start to see really interesting things. 00:51:07.000 |
And we haven't been doing this for very long. 00:51:12.640 |
and that makes the problem a little bit harder. 00:51:22.880 |
well, I don't know, there are two movies that came out 00:51:25.800 |
within six months of one another, "Ad Astra" and "Cosmos". 00:51:32.880 |
with Brad Pitt in it and saying there is no life 00:51:54.760 |
I was working on a paper, a life detection paper 00:51:58.560 |
and I found it was so hard to publish this paper. 00:52:03.720 |
I got so depressed trying to get this science out there 00:52:06.560 |
that I felt the depression of the film in "Ad Astra" 00:52:11.360 |
like life is, there's no life elsewhere in the universe. 00:52:17.440 |
that I think we will find life in the universe, 00:52:28.440 |
And then people say, well, you made this life on earth, 00:52:31.400 |
therefore it's, you're part of the causal chain of that. 00:52:37.080 |
how I'm able to do it with a very little cheating 00:52:42.360 |
just creating like a model planet, some description 00:52:48.520 |
then I think that we will be even to persuade 00:52:57.680 |
I think that we might crush that with the JWST. 00:53:03.120 |
the mirror is about 10 times the size of the Hubble, 00:53:09.040 |
look at colors of exoplanets, I think, not brilliantly, 00:53:18.840 |
for what's going on in the universe on these exoplanets. 00:53:21.400 |
'Cause it's only in the last few decades, I think, 00:53:27.600 |
came to recognize that exoplanets even are common. 00:53:33.840 |
And I think that that gives us a lot of optimism 00:53:46.920 |
I can tell you with confidence that biology on Earth 00:53:50.640 |
does not exist anywhere else in the universe. 00:54:03.540 |
like how many options does chemistry really have? 00:54:16.240 |
So if biology, as you find on Earth, is common everywhere, 00:54:19.960 |
then there's something really weird going on. 00:54:21.600 |
They're basically written in the quantum mechanics, 00:54:26.680 |
and this catalyst must form over this catalyst, 00:54:38.620 |
So that means if we want to find other Earth-like worlds, 00:55:03.680 |
I believe in statistics, and I can do experiments. 00:55:11.460 |
And so there is TikTok elsewhere in the universe, 00:55:23.520 |
TikTok is a social media where people upload videos, 00:55:37.320 |
and find humor in ideas, and play with those ideas, 00:55:43.760 |
Humor seems to be kind of fundamental to human experience. 00:55:46.600 |
- And I think that that's a really interesting question 00:55:48.800 |
we can ask, is humor a fundamental thing in the universe? 00:55:53.640 |
In terms of, you think about in a game theoretic sense, 00:56:01.640 |
And so if selection is fundamental in the universe, 00:56:12.560 |
Maybe it's like, from a chemical perspective, 00:56:21.520 |
One is the catalyst for spreading ideas on the internet, 00:56:30.840 |
It's a kind of valve, release valve for suffering. 00:56:34.580 |
Like, throughout human history, life has been really hard. 00:56:39.120 |
And for the people that I've known in my life 00:56:41.240 |
who've lived through some really difficult things, 00:57:17.440 |
And also, like, all of each instantiation of life 00:57:26.880 |
- Like, maybe there's a few clusters of similar-like life, 00:57:30.520 |
but it's much more likely is what you're saying. 00:57:48.160 |
So every instantiation of a kind of chemistry 00:57:53.480 |
and self-replicating, however the hell you define life, 00:57:56.920 |
that is going to be very different every time. 00:58:14.960 |
If we ran Earth over again, over and over and over, 00:58:22.280 |
with there's not gonna be elephants every time? 00:58:36.500 |
But it might be that the emergence of elephants 00:58:42.060 |
was wired into the history of Earth in some way, 00:58:44.400 |
like the gravitational force, how evolution was going, 00:58:50.760 |
But I just don't know enough about the contingency, 00:58:54.920 |
All I do know is you count the number of bits 00:58:59.280 |
sorry, an elephant, and think about the causal chain 00:59:15.020 |
the things that result in self-replicating chemistry 00:59:26.540 |
those are extremely unlikely, as you're saying. 00:59:37.640 |
that are possible in the universe, chemically speaking, 00:59:41.800 |
is actually successful at creating elephants. 00:59:48.080 |
- Well, there's two different questions here. 00:59:52.720 |
- At the different phases, sorry to keep interrupting. 00:59:54.920 |
- Yeah, no, if we restart Earth and start again, 00:59:56.900 |
say we could go back to the beginning and do the experiment 00:59:59.260 |
or have a number of Earths, how similar would biology be? 01:00:02.620 |
I would say that there would be broad similarities. 01:00:09.140 |
unless we're gonna throw an asteroid at each planet 01:00:11.580 |
each time and try and faithfully reproduce what happened. 01:00:17.960 |
when you go to another Earth-like planet elsewhere, 01:00:20.420 |
maybe there's a different ratio, particular elements, 01:00:22.900 |
maybe there's the bombardment at the beginning of the planet 01:00:30.420 |
And I just don't have enough information there. 01:00:44.540 |
mathematical assumption to think that life on Earth 01:00:48.980 |
that happened again would be the same as what we have now. 01:00:51.560 |
- Okay, but you've also extended that to say that we might, 01:00:58.420 |
that that means we're not able to interact with them. 01:01:02.700 |
Or that's an explanation for why we haven't at scale 01:01:08.020 |
- Well, right now-- - Is there different than us. 01:01:10.500 |
- We've only been looking for, say, 70, 80 years. 01:01:13.480 |
So I think that the reason we have not found aliens yet 01:01:19.980 |
- No, but the aliens have worked that out, surely. 01:01:30.520 |
that are way ahead of us on this whole life question. 01:01:37.320 |
of intellectual evolution that often quickly results 01:01:42.680 |
There's something in this process that eventually, 01:01:52.440 |
and it stops, either dies or stops developing. 01:01:55.800 |
But most likely, they already figured it out. 01:02:07.800 |
I mean, maybe, I mean, I don't have a coherent answer 01:02:28.000 |
but if we don't have the ability to recognize them 01:02:31.160 |
and talk to them, then the aliens aren't going 01:02:48.360 |
So we haven't qualified to even join their club 01:02:51.840 |
- Well, I think they still want to teach us how to talk. 01:02:57.660 |
or I think they would want to teach us how to talk 01:03:02.320 |
Like when you even meet, I was going to say child, 01:03:21.880 |
are just too close minded or don't have the right tools. 01:03:25.040 |
- No, I'm going to push back on this quite significantly. 01:03:27.200 |
I would say, because we don't understand what life is 01:03:30.640 |
and because we don't understand how life emerged 01:03:33.640 |
in the universe, we don't understand the physics 01:03:37.400 |
And that means our description, fundamental description, 01:03:40.640 |
I'm way out of my pay grade, even further out. 01:03:42.960 |
But I'll say it anyway, because I think it's a fun-- 01:03:44.800 |
- You don't get paid much anyway, as you said earlier. 01:03:50.940 |
because we don't understand the universe yet, 01:03:52.900 |
we do not understand how the universe spat out life. 01:04:05.020 |
So I'm going to say that they might be there, 01:04:08.440 |
but we just, I'm not going to say that I believe 01:04:10.540 |
in interdimensional aliens being present in this room. 01:04:12.460 |
- Yeah, but I think you're just being self-critical, 01:04:16.400 |
I think the fact that we don't qualify qualifies us. 01:04:22.040 |
- No, I'm saying that because we don't understand 01:04:29.320 |
and we don't understand what replication is yet, 01:04:49.780 |
'Cause you can, there's just enough cognition. 01:04:55.100 |
our cognitive abilities are not yet where they need to be, 01:04:58.740 |
we probably aren't even communicating with them. 01:05:00.180 |
- So you don't agree with the dating strategy 01:05:11.660 |
No, actually, I think in this talk, in this conversation, 01:05:16.920 |
that I think has been troubling me for a long time 01:05:23.840 |
is to say that you would not go and talk to your cat 01:05:32.280 |
- Sure, but I'm not talking about petting a cat. 01:05:34.280 |
The analogy is that the aliens are not going to talk to us 01:05:41.400 |
because we lack the layer, the fundamental layer 01:05:55.280 |
that it would cause more angst for the human race. 01:05:55.280 |
because even if we lack complete understanding, 01:06:23.160 |
for other kinds of intelligence to communicate with us, 01:06:26.760 |
still there must be a way to interact with us, 01:06:43.560 |
where we allow an intelligent AI to emerge, right, 01:06:53.320 |
interact with other intelligence in its universe, 01:06:56.200 |
and then we might find the parameters required 01:07:33.300 |
- A bit like, do we know that plants are conscious? 01:07:34.960 |
Well, plants aren't conscious in the way we typically think, 01:07:42.440 |
- They're not talking, they're just gardening. 01:07:52.120 |
there's always going to be the people who are curious, 01:07:55.000 |
Jane Goodall, who lives with the chimps, right? 01:07:58.560 |
There's always going to be curious, intelligent species 01:08:00.900 |
that visit the weird Earth planet and try to interact. 01:08:05.900 |
I mean, it's, yeah, I think it's a super cool idea 01:08:13.020 |
maybe it's a hope that there's always going to be 01:08:25.700 |
that aliens do exist and we will interact with them. 01:08:35.080 |
And also something to strive for, to be able to do that. 01:08:46.100 |
But I want to come up with an alternative explanation, 01:08:51.120 |
and not being philosophically and scientifically thought out 01:08:54.680 |
which is this, if you can't actually communicate 01:09:06.440 |
but I'm totally aligned with your hopeful vision, 01:09:08.480 |
which is like, we need to understand the origin of life. 01:09:15.120 |
through perhaps on a computer side through simulation 01:09:31.160 |
of course, these are all just kind of open-minded beliefs. 01:09:35.760 |
It's difficult to know for sure about any of this, 01:09:37.960 |
but I think there's a lot of alien civilizations 01:09:51.880 |
but I interpreted you to say they kind of tried a few times 01:09:56.240 |
and they're like, oh God, humans are too dumb. 01:10:11.080 |
I think we qualify as soon as we can decode their signal. 01:10:15.600 |
- Right, so when you say qualify, got it, got it. 01:10:27.440 |
let me get your opinion on this, about UFO sightings. 01:10:45.040 |
and it's not identified clearly at the time of sighting. 01:10:53.720 |
it could be ball lightning, it could be all kinds 01:10:58.620 |
like the fact that there could be physical phenomena 01:11:03.160 |
in this world that are observable by the human eye, 01:11:06.280 |
of course, all physical phenomena generally are fascinating 01:11:09.640 |
that are, that really smart people can't explain. 01:11:14.160 |
I love that, 'cause it's like, wait a minute, 01:11:18.560 |
it's like, wait a minute, how does this happen? 01:11:20.160 |
That's like the precursor to giant discoveries 01:11:23.600 |
in chemistry and biology and physics and so on. 01:11:26.000 |
But it sucks when those events are super rare, right? 01:11:34.200 |
And then, of course, that phenomena could have 01:11:39.220 |
with the physics, the chemistry, the biology of Earth. 01:11:44.520 |
extraterrestrial explanations that, in large part, 01:11:49.200 |
thanks to Hollywood and movies and all those kinds 01:12:11.960 |
that it's an object that's not of this particular world. 01:12:16.960 |
Do you think there's a chance that that's the case? 01:12:18.960 |
What do you make, especially the pilot sightings, 01:12:23.340 |
- So I agree there's a chance, there's always a chance. 01:12:32.220 |
I want to see if aliens exist, come to Earth. 01:12:36.140 |
What I know about the universe is I think it's unlikely 01:12:49.820 |
saying we're gonna release all this information, 01:12:53.580 |
I was kind of disappointed because it was just 01:13:01.980 |
And right now, the ability to capture high-resolution video, 01:13:22.660 |
a lot of hearsay, instrument kind of malfunctions 01:13:30.620 |
but I think something really interesting is happening. 01:13:36.380 |
We've all been locked down, we all want to have, 01:13:38.220 |
we want to, our imaginations are running riot, 01:13:42.080 |
and I think that, I don't think that the information 01:13:45.980 |
out there has convinced me there are anything interesting 01:13:48.340 |
on the UFO side, but what it has made me very interested 01:13:54.820 |
to ponder aliens and the mystery of our universe. 01:14:02.700 |
from having those thoughts and say you're stupid 01:14:07.820 |
What I would say is that I lack sufficient data, 01:14:33.300 |
because I think humanity wants to know what's out there. 01:14:43.820 |
depending on the day, I sometimes agree with you, 01:14:47.820 |
but one of the disappointing things to me about the sightings 01:14:52.820 |
I still hold the belief that a non-zero number of them 01:15:00.020 |
is an indication of something very interesting. 01:15:24.980 |
about the report that the government released, 01:15:26.980 |
but in general, just having worked with government, 01:15:36.220 |
Like, if you look at the response to the pandemic, 01:15:38.820 |
how incompetent we are in the face of great challenges 01:15:47.900 |
of the great mysteries before us without great leadership. 01:15:54.020 |
the fact that there's a lot of high-definition cameras 01:15:56.020 |
is not enough to capture the full richness of weird, 01:16:19.900 |
of even interesting, relatively rare human events. 01:16:25.420 |
It's rare to be in the right moment in the right time 01:16:41.420 |
of anomaly detection in chemistry in particular. 01:16:59.300 |
where people said there's this thing called Phlogiston. 01:17:02.060 |
And for ages, the alchemists got really this kind of, 01:17:16.420 |
because I think actually the ether might exist. 01:17:18.460 |
And I'll tell you what I think the ether is later. 01:17:26.100 |
so it's the light traveling through the ether in the vacuum. 01:17:30.700 |
that basically mediates the movement of light, say. 01:17:44.300 |
when they were splitting water into hydrogen and oxygen, 01:17:48.260 |
that you got more energy out than you put in. 01:17:51.900 |
and they thought that this was a nuclear reaction. 01:17:57.940 |
because you didn't detect neutrons and all this stuff. 01:18:29.900 |
I do hope that we are gonna discover more anomalies. 01:18:33.460 |
that the solar system isn't just static in space, 01:19:02.260 |
interdimensional aliens and everything else, right? 01:19:04.500 |
He's gone from space junk accelerating out of, 01:19:15.460 |
- Or he wants to tap into the psyche and understand. 01:19:18.780 |
And he's playfully kind of trying to interact 01:19:54.460 |
what I would consider a lot of really good scientists. 01:20:20.980 |
You're just as lost and clueless as everybody else, 01:20:31.340 |
using the word science and statistics can often, 01:20:55.540 |
some people find me extremely annoying in science 01:21:15.340 |
gift I got given by society when I was very young, 01:21:17.700 |
'cause I was in the learning difficulties class at school, 01:21:34.420 |
And so when I went into academia and everyone said, 01:21:51.980 |
And I annoy people because I walk straight through the wall, 01:21:57.980 |
Why should I be a mathematician and not a computer scientist? 01:22:17.220 |
by taking their criticisms and addressing them head on. 01:22:22.380 |
And I think that I try and do that in my own way. 01:22:25.180 |
And I kind of love walking through the walls. 01:22:28.100 |
And it gives me, it's difficult for me personally, 01:22:35.060 |
but it always leads to a deeper understanding 01:22:39.340 |
In particular, you know, the arguments I have 01:22:45.780 |
or I want to understand more about why I exist. 01:22:59.420 |
but humans with egos and all those kinds of things 01:23:10.700 |
like all ideas we all aspire to misuse these beautiful ideas 01:23:15.700 |
to manipulate people to all those kinds of things. 01:23:20.940 |
- And that's, there's assholes in every space 01:23:27.420 |
- And those are no good, but yes, you're right. 01:23:29.820 |
The scientific method has proven to be quite useful. 01:23:36.540 |
for difficult explanations for rare phenomena, 01:24:13.980 |
is so dogmatic, but there is this thing that they see, 01:24:18.060 |
but they don't see, and it takes a bit of time, 01:24:21.780 |
And my approach is to say, well, why can't this be right? 01:24:25.060 |
Why must we accept that RNA is the only way into life? 01:24:44.540 |
and it works so well for the evolutionary process 01:24:48.620 |
to explain that that must be the only way to have life. 01:25:02.500 |
you mentioned to me that you have a lab in your home, 01:25:10.640 |
from Rick and Morty, which is something I've been thinking 01:25:15.420 |
And then you say that there's a glowing pickle 01:25:20.300 |
that you used something involving cold plasma, 01:25:24.020 |
but can you explain the glowing pickle situation? 01:25:28.620 |
And is there many, arbitrarily many versions of you 01:25:34.380 |
in alternate dimensions that you're aware of? 01:25:37.380 |
- I tried to make an electrochemical memory at home 01:25:41.500 |
and using a pickle, the only way I could get any traction 01:25:58.860 |
- But you connected a pickle to some electro, I mean-- 01:26:06.180 |
this is a classic thing you do, I mean, I shouldn't, 01:26:10.860 |
pranks you do, you put a pickle into the mains 01:26:25.260 |
sodium potassium ions in the pickle to migrate. 01:26:31.100 |
That was, yeah, that was not my best experiment. 01:26:34.300 |
So I've done far better experiments in my lab at home. 01:26:45.980 |
- Well, I mean, I got kicked out of my own lab 01:26:48.220 |
by my research team many years ago, and for good reason. 01:27:08.020 |
"Oh, well, that's not gonna work because of this." 01:27:10.140 |
And I'll say, "Ah-ha, but actually I've tried 01:27:11.980 |
"and here's some code and here's some hardware, 01:27:17.500 |
as I get even more busy, but that's quite fun 01:27:20.740 |
'cause then they feel that we're in the experiment together. 01:27:49.260 |
as well as in the digital space, which is fascinating. 01:28:03.020 |
This intersection between mathematics and philosophy. 01:28:22.100 |
when I was going to origin of life conferences 01:28:25.780 |
where I thought that everybody was dancing around 01:28:28.740 |
the problem of what life is and what it does. 01:28:32.700 |
But I'll tell you about what assembly theory is 01:28:48.980 |
and you tap it just with a hammer or the nail at some point 01:28:54.940 |
And if that object is able to fragment into many 01:28:57.860 |
and you count those parts, the different parts, 01:29:01.820 |
assembly theory says the larger the number of parts, 01:29:10.140 |
the more likely it is that object has been created 01:29:23.540 |
Because what I'm literally saying about the abundance, 01:29:26.860 |
if you have a one-off object and you break it into parts 01:29:32.500 |
well, that could be incredibly intricate and complex, 01:29:41.740 |
'cause I saw in reality that assembly theory works. 01:29:51.380 |
they said, you haven't really done this properly, 01:30:00.020 |
'cause I invented an assembly theory in chemistry, 01:30:06.820 |
how complex does a molecule need to be when I find it 01:30:25.660 |
of 10,000 identical molecules to get a signal. 01:30:28.540 |
So 10,000 identical molecules that are complex, 01:30:32.900 |
what's the chance of them occurring by chance? 01:30:39.300 |
or, yeah, so strychnine is a good molecule actually to take 01:30:46.820 |
I made jokes about Viagra 'cause it's complex molecule. 01:30:50.420 |
"Yeah, if we find Viagra on Mars in detectable quantities, 01:31:01.740 |
in the mass spectrometer and you hit it with some electrons 01:31:06.300 |
And if the larger the number of different parts, 01:31:09.980 |
you know when it starts to get to a threshold. 01:31:12.820 |
My idea was that that molecule could not be created 01:31:24.380 |
because NASA is sending the mass spectrometers 01:31:26.220 |
to Mars, to Titan, it's gonna send them to Europa. 01:31:29.620 |
There's gonna be a nuclear-powered mass spectrometer 01:31:38.420 |
it's gonna be powered by a nuclear slug, a nuclear battery 01:31:43.420 |
and it's gonna have a mass spectrometer on it. 01:31:47.140 |
- No, it's Dragonfly and it's gonna be launched 01:31:51.020 |
I think it got pushed a year because of the pandemic. 01:31:55.220 |
- Dragonfly, nuclear Dragonfly is going to fly to Titan 01:32:12.660 |
the Dragonfly team that they should apply this approach, 01:32:17.000 |
but they will get data and depending on how good 01:32:26.620 |
I turned the thought experiment into an algorithm 01:32:29.080 |
in assembly theory and I basically, assembly theory, 01:32:38.920 |
so if you have a book with lots of words in it 01:32:42.780 |
and it's a book that's been written by, in a random way, 01:32:49.600 |
- Monkeys on typewriters, and you find a one-off. 01:32:52.500 |
But if you find lots of reoccurrences of abracadabra, 01:32:55.920 |
well, that means something weird is going on. 01:32:57.500 |
But let's think about the assembly number of abracadabra. 01:33:07.540 |
But when you actually reassemble abracadabra, 01:33:09.660 |
the minimum number of ways of organizing those letters, 01:33:12.440 |
so you'd have an A, a B, you know, and keep going up. 01:33:21.560 |
you can put it together again in seven steps. 01:33:26.960 |
you're allowed to reuse things you make in a chain 01:33:29.480 |
at the beginning, that's the memory of the universe, 01:33:49.820 |
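The abracadabra count can be sketched in code. The following is a minimal illustration, not Lee's actual algorithm: it assumes single letters as the basic building blocks and simply verifies that one proposed sequence of joins, reusing "abra", builds the word in seven steps.

```python
# A minimal sketch of the assembly pathway for "abracadabra".
# Basic units are the individual letters; each step joins two
# objects that are already available, and anything built can be
# reused later (the "memory" Lee describes).
def verify_assembly_pathway(target, steps):
    available = set(target)            # start from the basic building blocks
    for left, right in steps:
        # both pieces must already exist (a letter or a prior product)
        assert left in available and right in available
        available.add(left + right)    # the product is now reusable
    assert target in available
    return len(steps)                  # number of joining operations

# The seven joins, reusing "abra" in the final step:
steps = [("a", "b"), ("ab", "r"), ("abr", "a"), ("abra", "c"),
         ("abrac", "a"), ("abraca", "d"), ("abracad", "abra")]
print(verify_assembly_pathway("abracadabra", steps))  # prints 7
```

This only confirms an upper bound of seven; the assembly index proper is the minimum over all possible pathways, which in general requires a search.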
and you basically start with the atoms and then bonds, 01:34:01.600 |
allows me to say how compressed a molecule is, 01:34:18.680 |
I, with one of my students, we wrote an algorithm. 01:34:22.680 |
We basically took the 20 million molecules from a database 01:34:26.320 |
and we just calculated their assembly number. 01:34:33.920 |
what is the minimum number of steps I need to take 01:34:43.840 |
- Exactly, it's like memory in the universe, right? 01:34:46.400 |
I'm making lots of leaps here, like, it's kind of weird. 01:34:48.840 |
I'm saying, right, there's a process that can form 01:34:56.320 |
we can use A and B again with no extra cost except one unit. 01:35:00.080 |
So that's the kind of what the chain of events. 01:35:21.600 |
So we're able to take a whole bunch of molecules 01:35:27.480 |
And it's just a function of the number of bonds 01:35:57.200 |
there was almost not quite a one-to-one correlation, 01:36:03.940 |
I then did this using two other spectroscopic techniques, 01:36:08.720 |
which uses radio frequency to basically jangle the molecules 01:36:15.080 |
And infrared and NMR almost gave us a one-to-one correlation. 01:36:21.800 |
and doing either infrared or NMR or mass spec, 01:36:26.060 |
I can work out how many parts there are in that molecule 01:36:35.600 |
is we took molecules randomly from the environment, 01:36:47.560 |
And even NASA, 'cause they didn't believe us, 01:36:51.600 |
And we found that all these samples that came from biology 01:36:56.600 |
produced molecules that had a very high assembly number, 01:37:10.240 |
So we suddenly realized that on Earth, at least, 01:37:20.040 |
So I realized that this is a way to make a scale of life, 01:37:27.400 |
And literally, you could just go sniffing for molecules 01:37:33.360 |
And when you find a molecule in the mass spectrometer 01:37:43.200 |
And this allowed me to come up with a general definition 01:37:54.680 |
or any complex object, and I can find it in abundance 01:37:58.760 |
and cut it up, I can tell you whether that has been produced 01:38:05.000 |
And that's what assembly theory kind of does. 01:38:09.080 |
I then realized that this isn't just about life, 01:38:17.360 |
So now I can look at objects in the universe, 01:38:20.800 |
I'm gonna look at how many independent parts it has. 01:38:25.560 |
I'll then look at the abundance, how many cups. 01:38:28.960 |
maybe there's a few more you got stashed away. 01:38:31.160 |
So assembly is a function of the complexity of the object 01:38:37.240 |
times the number of copy numbers of that object, 01:38:39.400 |
or a function of the copy number, normalized. 01:38:41.920 |
So I realized there's a new quantity in the universe. 01:38:48.160 |
- So assembly, the way we should think about that 01:38:56.060 |
- Reusability is like, can you play devil's advocate to this? 01:39:04.720 |
for living organisms, like some kind of distant signal 01:39:12.320 |
but it's not capturing something fundamental? 01:39:14.880 |
Or do you think reusability is something fundamental 01:39:19.480 |
- I think reusability is fundamental in the universe, 01:39:26.240 |
So I think assembly tells you, if you find objects, 01:39:29.520 |
'cause you can do this with trajectories as well, 01:39:31.760 |
if you think about it, the fact there are objects 01:39:44.600 |
The fact that not everything exists is really weird. 01:39:49.600 |
- Yeah, and then there, as I'm looking at two mugs 01:39:54.600 |
and two water bottles, and the things that exist 01:39:57.760 |
are similar and multiply in copies of each other. 01:40:02.760 |
- Yeah, yeah, so I would say that assembly allows you 01:40:08.800 |
and people looking at entropy have got stuck with 01:40:14.600 |
I mean, I'm writing a paper with Sarah Walker 01:40:22.280 |
where this is, you know, we're not gonna get ahead 01:40:24.840 |
of ourselves, but we're gonna get ahead of ourselves 01:40:30.520 |
It works for molecules, and it appears to work 01:40:36.440 |
you can look at the assembly of the motor car, 01:40:37.640 |
look at a book, look at the assembly of the book. 01:40:41.600 |
of compressing and reusing, and so when people, 01:40:53.960 |
They say, "Oh, it's a bit like Kolmogorov complexity." 01:40:57.880 |
And now, okay, it's not infinitely computable, 01:41:04.200 |
but it's computable enough, it's tractable enough 01:41:07.760 |
to be able to tell the difference between a molecule 01:41:22.240 |
Complexity has required algorithmic comparisons 01:41:26.280 |
and programs and human beings to label things. 01:41:34.200 |
We can talk about what that means in a minute. 01:41:37.040 |
- Okay, my brain has been broken a couple times here. 01:42:11.600 |
to figure out the minimum amount of reused components 01:42:19.700 |
to look at huge, huge molecules, arbitrarily large. 01:42:25.160 |
can I think about this in complexity generally, 01:42:30.840 |
and saying, can this be used as a measure of complexity 01:42:46.560 |
in computer science and mathematics and physics, 01:42:48.580 |
people have been really seriously studying complexity 01:42:53.240 |
And I think there's some really interesting problems 01:42:55.160 |
of where we coarse-grain and we lose information. 01:43:00.120 |
assembly theory just explains weak emergence. 01:43:08.920 |
those first replicators that build one another. 01:43:12.040 |
Assembly at the minimal level just tells you evidence 01:43:28.440 |
there's nothing of very high assembly on the moon 01:43:40.080 |
that's the infinite combinatorial explosion in the universe. 01:43:57.920 |
Now, because we have specific memory where we say, 01:43:59.920 |
well, we're gonna put three sand grains in line 01:44:06.680 |
and the unsymmetrical thing, we remember that, 01:44:08.400 |
we can use it again 'cause on that causal chain. 01:44:12.480 |
is go to the actual object that you find in space. 01:44:16.120 |
And actually the way you get there is by disassembling it. 01:44:20.320 |
Assembly theory works by disassembling objects you have 01:44:31.800 |
- But like you said, it's gonna be hard, it's very difficult. 01:44:37.640 |
If you just keep low enough in molecular weight space, 01:44:46.960 |
we can start to think about things at different levels, 01:44:51.560 |
So in a molecule, the atom, this is really confusing 01:44:54.640 |
'cause the word atom, I mean smallest breakable part. 01:45:02.440 |
So in a car, the atom might be, I don't know, 01:45:06.120 |
a small amount of iron or the smallest reusable part, 01:45:14.800 |
In a microprocessor, the atoms might be transistors. 01:45:18.480 |
And so the amount of assembly that something has 01:45:23.480 |
is a function, you have to look at the atom level. 01:45:29.080 |
- That's one of the things you get to choose. 01:45:36.760 |
in when you approach a system and try to analyze, 01:45:39.720 |
like if you approach Earth, you're an alien civilization, 01:45:44.960 |
for trying to measure the complexity of life? 01:45:50.560 |
- I would say to start with, you just use molecules. 01:46:06.160 |
You needed technology and you needed microprocessors 01:46:14.680 |
between the coolness of that and assembly number, 01:46:18.620 |
whatever the measure, what would you call the measure? 01:46:23.400 |
- So there are three kind of fundamental kind of labels 01:46:45.760 |
that sums over all the molecules for each assembly 01:46:56.200 |
- So that will tell you the amount of assembly in the box. 01:46:58.640 |
So basically, the assembly equation we come up with 01:47:13.420 |
So some boxes are gonna be more assembled than others. 01:47:19.960 |
let's say I'm a box, am I assembling my parts 01:47:28.760 |
- So let's just, we'll talk about the molecules in you. 01:47:30.960 |
So let's just take a pile of sand the same way as you 01:47:34.560 |
and I would take you and just cut up all the molecules. 01:47:43.880 |
So in sand, let's say, there's probably gonna be 01:47:46.040 |
nothing more than assembly number two or three, 01:47:48.360 |
but there might be trillions and trillions of sand grains. 01:48:03.000 |
- You can average, you can do average it out. 01:48:04.400 |
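The box comparison above can be sketched numerically. This is an illustration, not the exact equation from the conversation: the exponential weighting and the (n − 1) copy-number term follow assembly theory's published formulation, and the normalization by total object count is an assumption, since the conversation doesn't spell it out.

```python
import math

# A sketch of the ensemble assembly quantity: complexity of each object
# weighted by its copy number, normalized by the total number of objects.
def assembly(objects):
    """objects: list of (assembly_index, copy_number) pairs."""
    total = sum(n for _, n in objects)
    return sum(math.exp(a) * (n - 1) / total for a, n in objects)

# A box of sand: tiny assembly index, enormous copy number.
sand = [(2, 10**6)]
# A box of biology: high-assembly molecules, still in abundance.
biology = [(15, 10**4), (12, 10**4)]
print(assembly(sand) < assembly(biology))  # prints True
```

Note that a one-off object contributes nothing, since n − 1 = 0: this echoes the earlier point that a single intricate object, found without copies, is not evidence of assembly.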
- I'm not defined by the most impressive molecules. 01:48:19.520 |
You get to make decisions, you're alive, you're aspiring. 01:48:22.960 |
Assembly says something about causal power in the universe. 01:48:28.880 |
because physicists don't accept that causation 01:48:41.800 |
- Capturing memory, but there's not an action to it. 01:49:05.760 |
- That life is able to create objects in abundance 01:49:10.760 |
that are so complex, the assembly number is so high, 01:49:24.880 |
And then life doesn't exist, actually, in that case. 01:49:35.680 |
you could go and launch rockets or build cars 01:49:40.680 |
or create drugs or, you can do so many things. 01:49:52.360 |
And that causal power was this kind of a lineage. 01:49:57.840 |
I've been realizing that physics as a discipline 01:50:09.160 |
I wanna maintain some credibility in the physicist's eyes, 01:50:22.240 |
But they're down to some things in their belief system, 01:50:24.720 |
which is kind of really makes me kind of grumpy. 01:50:39.200 |
- Well, in a minute, I'll recover my career in a second. 01:50:45.120 |
means I think there has to be an act of parliament 01:50:51.340 |
You can always go to Lee's Twitter and protest. 01:51:03.360 |
- The second law, and the fact that causation is emergent. 01:51:20.120 |
So physicists have kind of got confused about time. 01:51:33.960 |
I can go to the moon with Newtonian physics, I think. 01:51:36.560 |
And I can understand the orbit of Mercury with relativity. 01:51:41.080 |
And I can build transistors with quantum mechanics, right? 01:51:47.280 |
I'm just saying, if we say that time is fundamental, 01:51:50.480 |
i.e. time is non-negotiable, there's a global clock, 01:52:07.000 |
I mean, you've been referring to this kind of 01:52:09.560 |
an interesting formulation of that is memory. 01:52:26.680 |
think about these local pockets of complexity, 01:52:39.160 |
- But remember, the thing is I invented assembly theory. 01:52:46.160 |
I keep making fun of myself to my research group. 01:52:46.160 |
over the 40 years or so since I had the idea. 01:52:54.560 |
- Well, aren't you the idea that the universe had? 01:53:22.880 |
And I thought, well, can I delete some of that stuff out? 01:53:26.520 |
And then in the end, I kept making it smaller. 01:53:29.560 |
It got to maybe half a truck and into a suitcase. 01:53:34.000 |
I wanna carry my entire technology in my pocket. 01:53:38.640 |
And I'm not like gonna launch into Steve Jobby 01:53:48.520 |
that would allow me to interact the environment 01:53:57.280 |
And it's kind of like, so what did I use in my box 01:54:08.200 |
So I guess that's probably why I've been obsessed 01:54:12.000 |
And I was just pre-configured to find it somewhere. 01:54:19.440 |
I realized that the causal structure that we say emerges 01:54:35.240 |
I mean, that's clearly a very hard thing to let up. 01:54:40.240 |
Physicists would not let other sciences get away 01:54:49.120 |
So why are physicists allowed to get away with it? 01:54:51.080 |
- So first of all, to push back, to play devil's advocate, 01:54:54.280 |
you are clearly married to the idea of memory. 01:54:58.280 |
You see in this, again, from Rick and Morty way, 01:55:06.360 |
that is writing the story through its memories, 01:55:12.440 |
And then they find useful components they can reuse. 01:55:16.120 |
And then the reused components create systems 01:55:27.880 |
it seems like quite sad that you can walk that back. 01:55:31.960 |
But at the same time, it feels like that memory, 01:55:34.880 |
you can walk in both directions on that memory 01:55:42.000 |
because the problem that I have with time being reversible 01:55:55.160 |
So I love burning stuff, burning stuff and building stuff. 01:56:05.200 |
I have to borrow time from the universe to do that. 01:56:10.040 |
let's imagine that we can go back in time or reversibility, 01:56:17.000 |
- No, but see, you're assuming that time is fundamental, 01:56:30.920 |
Yeah, I mean, this is an argument we can have, 01:56:32.880 |
but I believe I can come up with an experiment. 01:56:43.320 |
kind of is the way that the universe produces selection 01:56:55.320 |
that requires us to have these statements to be possible. 01:57:06.240 |
which is order in the past, but as well, okay. 01:57:19.960 |
and there's no novelty or that novelty is predetermined. 01:57:26.440 |
that time is fundamental, which means if you think about it, 01:57:30.080 |
the universe becomes more and more novel each step. 01:57:32.680 |
It generates more states in the next step than it was before. 01:57:41.680 |
actually, because it didn't have enough states. 01:57:45.760 |
But today the universe is, so it's like how-- 01:57:51.600 |
Now we've pissed off the panpsychics too, okay. 01:57:57.560 |
Part of me is just joking, having fun with this thing, 01:58:00.240 |
but 'cause you're saying a lot of brilliant stuff 01:58:02.640 |
and I'm trying to slow it down before my brain explodes. 01:58:08.080 |
some of the fascinating things you're saying. 01:58:10.320 |
So novelty, novelty is increasing in the universe 01:58:18.440 |
- So I think the physicists almost got everything right. 01:58:26.040 |
I'm very happy to be entirely wrong on this, right? 01:58:49.480 |
the universe is actually, there's more states available. 01:58:52.720 |
I mean, we might even be able to do weird things 01:59:29.080 |
because consciousness is not. - All right, let's go 01:59:34.200 |
I don't think this conversation is even about the assembly, 01:59:37.240 |
which is fascinating and we'll keep mentioning it 01:59:43.040 |
that I don't think is necessarily connected to time. 01:59:56.080 |
can still be correct even if time is emergent? 01:59:58.640 |
- So, yeah, right now, assembly theory appears to work. 02:00:01.640 |
I appear to be able to measure objects of high assembly 02:00:04.880 |
in a mass spectrometer and look at their abundance 02:00:08.160 |
It's a nice, if nothing else, it's a nice way 02:00:10.560 |
of looking at how molecules can compress things. 02:00:13.200 |
Now, am I saying that a time has to be fundamental 02:00:23.320 |
it appears that the universe has many different ways 02:00:26.560 |
You could have three different types of time. 02:00:33.320 |
I think that's fine, let's do that for a second. 02:00:41.420 |
when the universe starts to write memories through bonds. 02:00:43.800 |
So let's just say there's rocks running around, 02:00:49.360 |
suddenly the universe is remembering cause in the past 02:00:54.360 |
and those structures will have effects in the future. 02:00:58.000 |
So suddenly a new type of time emerges at that point, 02:01:08.860 |
But I'm just basically trying to condense the conversation 02:01:10.960 |
and say, hey, let's just have time fundamental 02:01:15.000 |
- You're triggering people by saying fundamental. 02:01:19.880 |
- Why am I, look, I'm walking through the wall. 02:01:27.000 |
I don't go back in time, I don't meet myself in the past. 02:01:30.760 |
There are no aliens coming from the future, right? 02:01:35.080 |
- No, no, no, but that's not, no, no, no, hold on a second. 02:01:38.680 |
That's like saying we're talking about biology 02:01:41.800 |
or like evolutionary psychology and you're saying, 02:01:44.900 |
okay, let's just assume that clothing is fundamental. 02:01:52.320 |
You can't, like, I think you're gonna get in a lot 02:01:55.240 |
of trouble if you assume time is fundamental. 02:01:58.440 |
Give me one reason why I'm getting into trouble 02:02:01.320 |
- Because you might not understand the origins 02:02:21.160 |
It's chemicals doing a search for reusable structures 02:02:26.160 |
that they can like then use as bricks to build a house. 02:02:34.400 |
So let's go back a second because it's a kind of, 02:02:40.680 |
because I think we can carry on discussing it 02:02:42.160 |
for many, many, many, many, many days, many months. 02:02:45.640 |
But I'm happy to accept that it might be wrong. 02:02:50.920 |
But what I would like to do is imagine a universe 02:02:53.800 |
where time is fundamental and time is emergent 02:02:56.600 |
and ask, let's just then talk about causation 02:03:02.900 |
so this is where I'm gonna go, causation emerges 02:03:08.440 |
Well, that clearly is wrong because if causation has 02:03:11.080 |
to emerge at the macro scale, life cannot emerge. 02:03:15.200 |
Life requires molecules to bump into each other, 02:03:21.880 |
There needs to be cause and effect at the molecular level. 02:03:24.600 |
There needs to be a non-ergodic to an ergodic transition 02:03:28.160 |
at some point and those replicators have consequence, 02:03:39.720 |
I'm gonna have a bunch of particles in a box. 02:03:42.480 |
I'm gonna think about it in either a Newtonian way 02:03:44.840 |
or a quantum way and I'll add on an arrow of time 02:03:48.020 |
so I can label things and causation will happen 02:03:58.540 |
is having a fundamental time because this allows me 02:04:01.440 |
to have a deterministic universe that creates novelty 02:04:09.780 |
You said, can assembly theory work with emergent time? 02:04:12.600 |
Sure, it can but it doesn't give me a deep satisfaction 02:04:21.080 |
to these objects that move through time and space. 02:04:29.800 |
take this water bottle and look at this water bottle 02:04:35.760 |
I know that causal structures gave rise to this. 02:04:38.640 |
In fact, I'm not looking at just one water bottle here. 02:04:53.200 |
I think Leibniz actually invented assembly theory. 02:04:56.040 |
He gave soul, the soul that you see in objects 02:05:01.080 |
It is the fact there's been a history of objects related 02:05:08.380 |
There is a lineage and there is conserved structures, 02:05:22.940 |
- And it shakes the physicist's cage a bit, right? 02:05:29.100 |
- I just enjoy the fact that physicists are in cages. 02:05:51.380 |
This is my, I think, third time I've been to Austin. 02:06:18.260 |
which was beautiful, with a quote from the Rolling Stones 02:06:36.740 |
and to clarify it with this beautiful theory of yours 02:06:41.740 |
that you're developing, and I'm sure will continue developing 02:06:52.220 |
Just the ideas you're playing with in your head 02:06:58.100 |
So if we talk about complexity a little bit more generally, 02:07:05.700 |
how does complexity emerge from simple rules? 02:07:11.260 |
Okay, the nice algorithm of assembly is there. 02:07:15.300 |
- I would say that the problem I have right now, 02:07:17.300 |
is I mean, you're right, we can, about time as well. 02:07:20.300 |
The problem is I have this hammer called assembly, 02:07:24.520 |
So now, let's just apply it to all sorts of things. 02:07:35.120 |
when you get convection, you get honeycomb patterns. 02:07:47.580 |
When people say, let's talk about complexity in general, 02:07:50.100 |
what they're saying is, let's take this collection 02:07:56.880 |
and try and work out how many moving parts there are, 02:08:02.940 |
So what people have been doing for a very long time 02:08:04.820 |
is taking complexity and counting what they've lost, 02:08:11.140 |
And the reason why I'm pushing very hard on assembly, 02:08:16.380 |
It doesn't tell you the microstates are gone. 02:08:18.800 |
But if you embrace the bottom up with assembly, 02:08:21.740 |
those states, and you then understand the causal chain 02:08:31.460 |
is understand weak emergence at the very least, 02:08:34.300 |
and maybe allow us to crack open complexity in a new way. 02:08:39.300 |
And I've been fascinated with complexity theory 02:08:48.460 |
and I could write, just type it up in my computer 02:08:51.620 |
and run it, and just show it, see it kind of unfold. 02:08:55.740 |
It was just this kind of, this mathematical reality 02:09:00.180 |
that existed in front of me, I just found incredible. 02:09:03.980 |
But then I realized that actually, we were cheating. 02:09:07.100 |
We're putting in the boundary conditions all the time, 02:09:11.500 |
And so, when people talk to me about the complexity 02:09:18.700 |
So my attempt, my small attempt, naive attempt, 02:09:25.420 |
on the planet right now thinking about this properly, 02:09:27.500 |
and you've had some of them on the podcast, right? 02:09:33.100 |
But I'm wondering if we might be able to reformat 02:09:35.620 |
the way we would explore algorithmic complexity 02:09:41.080 |
What's the minimum number of constraints we need 02:09:47.860 |
So whether it's like, you know, if you take some particles 02:09:50.860 |
and put them in a box, at a certain box size, 02:09:53.780 |
you get quasi-crystallinity coming out, right? 02:10:00.540 |
It must come from the boundary conditions you put in. 02:10:03.780 |
So all I'm saying is a lot of the complexity that we see 02:10:07.100 |
is a direct read of the constraints we put in, 02:10:11.980 |
So as I said earlier to the poor origin of life chemists, 02:10:17.340 |
I would say lots of the complexity calculation theory 02:10:20.220 |
is a bit of a scam, 'cause we put the constraints in, 02:10:26.140 |
- Oh, you're thinking, and sorry to drop this, 02:10:35.020 |
So assembly theory doesn't lower any of the importance 02:10:38.700 |
of complexity theory, but it allows us to go across domains 02:10:44.460 |
compare the complexity of a molecule, of a microprocessor, 02:10:47.820 |
of the text you're writing, of the music you may compose. 02:10:52.020 |
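The cross-domain comparison rests on the assembly index: the minimum number of join operations needed to build an object when any previously built part can be reused for free. A toy brute-force sketch on strings, as an illustration of the definition only, not the mass-spectrometry measurement Cronin's group uses for molecules:

```python
from collections import deque

def assembly_index(s):
    """Minimum number of join operations to build string `s`, where single
    characters are free and any fragment, once built, can be reused.
    Brute-force breadth-first search; only practical for short strings."""
    if len(s) <= 1:
        return 0
    units = frozenset(s)
    # Only contiguous substrings of the target can ever be useful fragments.
    subs = {s[i:j] for i in range(len(s)) for j in range(i + 2, len(s) + 1)}
    start = frozenset()
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        built, depth = queue.popleft()
        avail = units | built
        for x in avail:
            for y in avail:
                z = x + y
                if z == s:
                    return depth + 1
                if z in subs and z not in built:
                    nxt = built | {z}
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, depth + 1))
    return None
```

Reuse is what separates this from plain length: "abab" takes only 2 joins (a+b gives ab, then ab+ab), while "abcd", with nothing to reuse, takes 3.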
- You've tweeted, quote, "Assembly theory explains 02:10:56.700 |
"why Nietzsche understood we had limited freedom 02:11:01.980 |
So we've applied assembly theory to cellular automata 02:11:06.100 |
What does Nietzsche have to do with assembly theory? 02:11:09.620 |
- Oh, that gets me into free will and everything. 02:11:14.580 |
Assembly theory explains why Nietzsche understood 02:11:16.660 |
we had limited freedom rather than radical freedom. 02:11:20.420 |
Limited freedom, I suppose, is referring to the fact 02:11:30.180 |
- So Sartre was like, believed in absolute freedom 02:11:33.460 |
and that he could do whatever he wanted in his imagination. 02:11:43.180 |
And it kind of takes me back to this computer game 02:11:54.100 |
- "Dragon's Lair," I knew I was being conned, right? 02:12:03.180 |
No, it's like, is it turn-based play, was it? 02:12:08.820 |
- But really good graphics, and one of the first LaserDiscs. 02:12:13.580 |
you took, it was like a graphical adventure game 02:12:23.900 |
You just play the disc, play the disc, play the disc. 02:12:29.660 |
because all the animation has been pre-recorded on the disc. 02:12:34.100 |
- It's like "The Black Mirror," the first interactive 02:12:37.460 |
several million kind of permutations of the movie 02:12:52.060 |
And when you flick the joystick at the right time, 02:12:58.420 |
- And I played that game and I knew I was being had. 02:13:11.180 |
- And why does assembly theory give you hints 02:13:14.980 |
about free will, whether it's an illusion or not? 02:13:21.340 |
and I think I am an agent and I think I can interact 02:13:24.020 |
and I can play around with the model I have of the world 02:13:30.260 |
which means I have a little bit of free will. 02:13:32.340 |
But as much as I want to do stuff in the universe, 02:13:38.540 |
'cause now I say I could try and do it, right? 02:13:39.900 |
It's like I'm gonna suddenly give up everything 02:13:48.700 |
to make that necessarily happen, I'm on a trajectory. 02:13:52.380 |
I know that I have some trajectories that I can play with, 02:14:03.420 |
And Nietzsche said, okay, I realize I don't have full freedom 02:14:11.300 |
It says, if you have these constraints in your past, 02:14:14.200 |
they limit what you are able to do in the future, 02:14:19.300 |
Let's say I'm a poppy plant and I'm creating some opiates. 02:14:25.780 |
I mean, they're obviously great for medicine, 02:14:29.740 |
but let's imagine we fast forward a billion years, 02:14:33.420 |
what will the opioids look like in a billion years? 02:14:38.620 |
because we can see how those proteins will evolve 02:14:41.300 |
and we can see how the secondary metabolites will change, 02:14:48.580 |
like a molecule that you'd find in an OLED in a display. 02:14:57.060 |
And that's what I'm getting at, saying you're, 02:14:59.980 |
we're predictive, we are unpredictably predictable 02:15:03.540 |
or predictably unpredictable within a constraint 02:15:08.860 |
- Yeah, so the predictably part is the constraints 02:15:14.140 |
is the part that you still haven't really clarified 02:15:20.820 |
- So you're just arguing, you're basically saying 02:15:26.740 |
You're really operating in a world of constraints 02:15:29.080 |
that are constrained by the memory of the trajectory 02:15:33.740 |
Okay, but you know, even just a tiny bit of freedom, 02:15:38.740 |
even if everything, if everywhere you are in physics, 02:15:44.020 |
in cages, if you can move around in that cage a little bit, 02:15:51.420 |
- And so the question is, in assembly theory, 02:15:57.500 |
where does the little bit of freedom come from? 02:15:59.700 |
What is the eye that can decide to be a rapper? 02:16:06.940 |
That's a cute little trick we've convinced each other of 02:16:18.860 |
- I think that that's the question that I wanna answer. 02:16:21.860 |
I know you wanna answer it and I think it's so profound. 02:16:27.140 |
I would say that I don't take the stance of Sam Harris 02:16:31.780 |
the way he says it is almost, it's really interesting. 02:16:35.020 |
Sam Harris almost thinks himself out of existence, right? 02:16:47.420 |
He thinks himself out of existence with free will. 02:17:00.820 |
I'd love to ask him whether he really believes that 02:17:11.580 |
And then I'll get him to the conditions he says yes 02:17:13.420 |
and then I'll trap him in his logical inconsistency 02:17:17.180 |
Because at some point when he loses enough money 02:17:21.940 |
there's a way of basically mapping out a series of, 02:17:26.940 |
so what will is about, let's not call it free will, 02:17:30.900 |
what will is about is to have a series of decisions 02:17:38.420 |
energy minimization, those decisions are a function 02:17:48.740 |
and also other interactions that you're having 02:17:55.260 |
And I think that you, there's a little bit of delay in time. 02:18:12.060 |
I think free will is actually very complex interaction 02:18:14.780 |
between your unconscious and your conscious brain. 02:18:18.780 |
And I think the reason why we're arguing about it, 02:18:39.420 |
- And that he can't have access to the unconscious brain. 02:18:43.660 |
- So he's just, he's going to, through meditation, 02:18:49.820 |
Maybe, but I do think that I have the ability 02:18:58.500 |
with some people that some days I feel I have no free will 02:19:04.780 |
And this is one, and it makes me more radical, 02:19:08.860 |
that I get to explore more of the state space. 02:19:11.620 |
And I'm like, I'm gonna try and affect the world now. 02:19:56.860 |
which is why you don't just do whatever the hell you want. 02:20:00.980 |
Like, you feel like there's some responsibility 02:20:03.060 |
for making the wrong choice, which is why you don't do it. 02:20:56.380 |
by just doing things because I'm bored, but not bored. 02:21:00.580 |
I think that this is a really interesting problem, 02:21:02.500 |
that perhaps the hard sciences don't really understand 02:21:20.920 |
The transition from a, you know, a boring world, 02:21:30.260 |
and also the models you're generating in the brain, 02:21:34.140 |
of working memory you have available at any one time 02:21:45.140 |
yet another manifestation of the selection mechanism 02:21:59.300 |
that is just yet another example of selection. 02:22:06.300 |
you want to do that because you generate novelty. 02:22:10.700 |
do cellular automata exist outside the human mind 02:22:24.300 |
and the human mind, and trees falling in the forest? 02:22:30.500 |
so when John von Neumann and Conway and Feynman 02:22:40.020 |
that they were doing cellular automata on paper? 02:23:00.020 |
the initial conditions, and see the beautiful 02:23:08.780 |
that's just dedicated to putting out CA rules, 02:23:49.780 |
And robot one said, "I'm doing experiment 10." 02:23:51.700 |
And the other robot, "Okay, I'll do experiment one then." 02:23:58.700 |
- Can you maybe quickly explain what the Game of Hex is? 02:24:01.460 |
- Yeah, so it's basically a board, hexagonal board, 02:24:04.260 |
and you try and basically, you color each hexagon, 02:24:08.820 |
and you try and get from one side to the other, 02:24:17.560 |
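The win condition described here, an unbroken chain of your color from one side to the other, reduces to a graph-connectivity check over the six hexagonal neighbors. A minimal sketch (the board encoding and the convention that player "R" connects left to right are illustrative assumptions):

```python
def hex_wins(board, player):
    """True if `player`'s stones form a connected chain from the left edge
    to the right edge of an n-by-n Hex board (a rhombus of hexagons).
    Each cell has six neighbors on the skewed hex grid."""
    n = len(board)
    neighbors = ((-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0))
    # Depth-first search seeded from the player's stones on the left edge.
    stack = [(r, 0) for r in range(n) if board[r][0] == player]
    seen = set(stack)
    while stack:
        r, c = stack.pop()
        if c == n - 1:
            return True
        for dr, dc in neighbors:
            nr, nc = r + dr, c + dc
            if (0 <= nr < n and 0 <= nc < n
                    and (nr, nc) not in seen and board[nr][nc] == player):
                seen.add((nr, nc))
                stack.append((nr, nc))
    return False
```

Hex famously cannot end in a draw, which is why a pure connectivity test like this is the whole win check.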
So the two robots, each robot was doing dye chemistry. 02:24:20.640 |
So making RGB, red, green, blue, red, green, blue, 02:24:29.460 |
we need to make two chemical robots that play chess, 02:24:48.820 |
He's written a huge number of amazing papers. 02:25:27.540 |
I really take, I think they still have PTSD from doing it, 02:25:32.620 |
What I'd often do is I have about 60 people on my team, 02:25:42.660 |
And then basically about 20 people turn up to my office, 02:25:49.380 |
and some of them would be like, no, I'm not doing this. 02:26:03.100 |
of different molecules you could react combinatorially 02:26:18.740 |
Like they weren't, so you could have two chemical reactions 02:26:28.980 |
and the other robot says, I'll do experiment 100. 02:26:57.740 |
And so what the robots could do is they play, 02:26:59.420 |
each player move, and 'cause the fitness function 02:27:02.380 |
or the optimization function was to make the color blue, 02:27:14.740 |
So when one robot realized, if it didn't clean its pipes, 02:27:19.860 |
realized that, so it was like getting dirty as well. 02:27:23.980 |
- Unintended consequences of super intelligence. 02:27:33.300 |
I said, come on, you've got a couple of robots 02:27:37.220 |
- But in the end, we had to take them off Twitter, 02:27:41.180 |
'Cause it was just, there were people saying, 02:28:12.020 |
through chemical space and have some kind of cycle. 02:28:14.780 |
And then read out what the molecule's reading out 02:28:17.220 |
using a mass spectrometer, and then convert that to text, 02:28:25.860 |
I reckon that Twitter account would get a lot of followers. 02:28:29.700 |
to convince my group that we should just make 02:28:32.700 |
where it's going, and it's like, hello, testing, I'm here. 02:28:42.980 |
Of a non-human entity communicating with the world 02:29:11.580 |
And I tried to actually, I tried to convince Stephen 02:29:20.900 |
to the simplest construct of a one-dimensional 02:29:40.020 |
is why people marvel, I mean, you marvel at CAs 02:29:47.180 |
because if you play the game of life in a CA, 02:29:52.700 |
You have to have a, you have to do a number of operations, 02:29:57.180 |
So is it surprising that you get this structure out? 02:29:59.860 |
Is it manufactured by the boundary conditions? 02:30:09.380 |
is teaching me something about what real numbers are and aren't. 02:30:20.060 |
And I was like, well, I do actually have some notion 02:30:36.140 |
why am I seeing this complexity in this rule? 02:30:44.660 |
and yet you get this incredible structure coming out. 02:30:48.060 |
Well, isn't that what you'd get with any real number, 02:30:54.300 |
And you're trying to read it out to an arbitrary position? 02:31:27.260 |
and some lead to things that are just walked out, 02:31:37.180 |
So take the logistic map or something, logistic equation, 02:31:42.380 |
which is, you don't know what's gonna happen at N plus one, 02:31:46.540 |
but once you've done N plus one, you know it for all time. 02:31:50.380 |
For me, CAs and logistic equation feel similar. 02:31:57.380 |
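The logistic map he's referring to is x_{n+1} = r·x_n·(1 − x_n): each step is a trivial deterministic rule, yet at r = 4 the orbit is chaotic, while at r = 2 it settles onto a fixed point. A minimal sketch:

```python
def logistic(r, x0, n):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k) for n steps."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# At r = 2 the orbit converges to the fixed point x = 0.5; at r = 4 it is
# chaotic, yet every iterate is fully determined by the one before it --
# exactly the "once you've done N plus one, you know it" point.
```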
and I share your kind of wonder at running a CA, 02:32:04.940 |
well, what is it about the boundary conditions 02:32:16.460 |
It's been trapped in purgatory for a long time. 02:32:20.180 |
how to do a chemical formulation of the game of life, 02:32:23.060 |
which is like-- - We made a chemical computer 02:32:28.220 |
so each cell would pulse on and off, on and off, on and off. 02:32:31.260 |
We have little stirrer bars and we have little gates. 02:32:34.260 |
And we actually played Conway's game of life in there. 02:32:38.860 |
We got structures in that game from the chemistry 02:32:46.500 |
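The idealized rule being played in that chemical grid is just a birth/survival count over eight neighbors. A minimal digital sketch on a small bounded grid like the one described (the chemical version reads each cell's state from an oscillating reaction rather than from memory):

```python
def life_step(grid):
    """One step of Conway's Game of Life on a bounded grid of 0/1 cells:
    a live cell survives with 2 or 3 live neighbors, a dead cell is born
    with exactly 3 (cells outside the grid count as dead)."""
    rows, cols = len(grid), len(grid[0])

    def live_neighbors(r, c):
        total = sum(grid[i][j]
                    for i in range(max(0, r - 1), min(rows, r + 2))
                    for j in range(max(0, c - 1), min(cols, c + 2)))
        return total - grid[r][c]  # don't count the cell itself

    return [[1 if (grid[r][c] and live_neighbors(r, c) in (2, 3))
             or (not grid[r][c] and live_neighbors(r, c) == 3) else 0
             for c in range(cols)] for r in range(rows)]
```

The simplest oscillator, a "blinker" of three cells, flips between a row and a column with period two.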
- 'Cause they were interacting outside of the cells now or-- 02:32:50.420 |
- So what's happening is you're getting noise. 02:32:52.180 |
So the thing is that you've got this BZ reaction 02:33:00.980 |
or in such a non-trivial way that's non-deterministic. 02:33:20.140 |
that I can't get in a silicon representation of a CA. 02:33:33.740 |
So it's just a beautiful idea to use a chemical computer 02:33:42.300 |
And it's a really interesting scientific question 02:34:04.700 |
And can we bake in that non-determinism at the beginning? 02:34:10.540 |
I'm trying to think about what is the encoding space. 02:34:14.260 |
We have 49 stirrers, so 49 cells, 49 chem bits, 02:34:19.260 |
all connected to one another in like an analog computer 02:34:23.700 |
but being read out discretely as the BZ reaction. 02:34:26.980 |
So just to say the BZ reaction is a chemical oscillator. 02:34:33.100 |
So two Russians discovered it, Belousov-Zhabotinsky. 02:34:38.460 |
and everyone said, "You're crazy, it breaks the second law." 02:34:40.740 |
And Zhabotinsky said, "No, it doesn't break the second law. 02:34:51.980 |
That just because Russians just wrote it in Russian, 02:34:54.740 |
they didn't publish it in English-speaking journals. 02:34:57.580 |
- Well, yeah, sad and it's great that it's there, right? 02:35:02.540 |
I'm sure we will find a way of translating it properly. 02:35:05.100 |
- Well, the silver lining slash greater sadness 02:35:20.140 |
would crack open some of the biggest mysteries 02:35:30.120 |
like nobody else has ever tried to solve problems. 02:35:35.500 |
in cognitive science and psychology and mathematics 02:35:38.500 |
and physics and just whatever you want to, economics even. 02:35:44.660 |
you might be able to discover some beautiful ideas. 02:35:47.620 |
- Obviously Russian is an interesting case of that 02:35:53.460 |
But you said there's a source of fuel, a source of energy. 02:35:59.180 |
you have an acid in there called malonic acid. 02:36:12.020 |
What that means we have to do is continuously feed 02:36:16.500 |
in a long enough time so it's like it's reversible in time. 02:36:34.420 |
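The fed-oscillator idea can be sketched with the Brusselator, a textbook two-species chemical oscillator in the same spirit as the BZ reaction (this is not the actual BZ mechanism, and the dimensionless parameters are the standard textbook choice): the constant inflow `a` plays the role of the continuously fed malonic-acid fuel, and oscillation is sustained only while it is fed.

```python
def brusselator(a=1.0, b=3.0, x=1.0, y=1.0, dt=0.001, steps=60_000):
    """Forward-Euler integration of the Brusselator rate equations:
        dx/dt = a + x^2 y - (b + 1) x
        dy/dt = b x - x^2 y
    The constant feed `a` is the fuel term; for b > 1 + a^2 the system
    leaves its fixed point and settles onto a sustained limit cycle."""
    xs = []
    for _ in range(steps):
        dx = a + x * x * y - (b + 1) * x
        dy = b * x - x * x * y
        x, y = x + dt * dx, y + dt * dy
        xs.append(x)
    return xs
```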
It can solve traveling salesman problems actually. 02:36:38.660 |
- But not any faster than the regular computer. 02:36:46.220 |
I think we can come up with a way of solving problems, 02:37:12.100 |
You don't actually have to encode the shaking of the box 02:37:14.900 |
in a silicon memory and then just shuffle everything around. 02:37:25.900 |
and I was kind of annoying some of my colleagues 02:37:27.500 |
and wondering if we could get to chemical supremacy, 02:37:31.500 |
And I kind of calculated how big the grid has to be 02:37:55.580 |
But then you're unsure how big that has to be. 02:38:00.900 |
- It might be exactly a big box, hard to shake 02:38:11.260 |
- We didn't, but I would posit that they don't 02:38:36.980 |
So they are, they emerge from the human mind. 02:38:49.380 |
I'm just saying, well, do they exist in reality 02:38:52.020 |
or are they a representation of a simple machine 02:39:16.860 |
to say, hey, look, you can do this thing in a CA. 02:39:21.100 |
When I see this, I'm saying, oh, that's cool, 02:39:30.540 |
So for people who don't know cellular automata, 02:39:34.660 |
whether it's one-dimensional, two-dimensional, 02:39:53.380 |
you can create arbitrarily complex and beautiful systems. 02:39:57.660 |
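The one-dimensional case makes the "simple rules, arbitrary complexity" point in a few lines: each cell's next state is looked up from the eight-entry truth table encoded by Wolfram's rule number (rule 30 is the classic chaotic one). A minimal sketch:

```python
def eca_step(cells, rule):
    """One step of a 1D elementary cellular automaton on a periodic row.
    `rule` is Wolfram's 0-255 numbering: bit k of `rule` is the next state
    of a cell whose (left, center, right) neighborhood reads k in binary."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4
                      + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]
```

Iterating this from a single live cell and stacking the rows is all it takes to reproduce the structures people marvel at.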
And to me, whether drugs are involved or not, 02:40:02.660 |
I can sit back for hours and enjoy the mystery of it, 02:40:17.340 |
It gives me a sense that you get to have a glimpse 02:40:26.860 |
Whatever is creating this complexity from such simplicity 02:40:31.860 |
is the very thing that brought my mind to life, 02:40:42.940 |
And yes, those constructs are pretty trivial. 02:40:58.800 |
you could see the emergence of complexity from simplicity. 02:41:01.820 |
I guess what, Lee, you're saying is that this is not, 02:41:13.060 |
even though they probably carry some of the same magic, 02:41:18.620 |
- I mean, I'm saying that the operating system 02:41:26.140 |
And so I wonder if you're getting the complexity 02:41:30.500 |
of the operating system, the underlying digital computer. 02:41:37.140 |
- Not against, I mean, I'm in love with CAs as well. 02:41:40.660 |
I'm just saying they aren't as trivial as people think. 02:41:51.020 |
And you need a display, and you need a math coprocessor, 02:42:01.580 |
- Wow, to think that for the simplicity of a grid, 02:42:05.780 |
you're basically saying a grid is not simple. 02:42:19.980 |
I just think, but remember, we take so much for granted 02:42:24.820 |
'Cause von Neumann and Feynman weren't showing, 02:42:31.460 |
- Yeah, but that's the limitation of their mind. 02:42:34.820 |
- Yeah, yeah, exactly, the limitation of their pencil. 02:42:40.700 |
whether the essential elements of the cellular automata 02:42:44.180 |
is present without all the complexities required 02:42:52.460 |
And my intuition, the reason I find it incredible 02:43:05.100 |
but local interactions operating under simple rules 02:43:09.740 |
and resulting in multi-hierarchical complex structures 02:43:14.540 |
feels like a thing that doesn't require a computer. 02:43:17.940 |
- I agree, but coming back to von Neumann and Feynman 02:43:20.980 |
and Wolfram, their minds, the non-trivial minds, 02:44:03.940 |
are a really good simple capture of a physical phenomena. 02:44:08.940 |
It is also, that equation has the memory of the humans. 02:44:17.680 |
- But I don't, I don't know if you're implying this, 02:44:27.140 |
with that sort of diminishing the power of that equation. 02:44:31.900 |
- Because it's built on the shoulders, it enhances it. 02:44:34.460 |
It's not, that equation is a minimal compressed 02:44:39.340 |
We can use machine learning or Max Tegmark's AI Feynman 02:44:45.100 |
but isn't it wonderful that the laws that we do find 02:44:47.340 |
are the maximally compressed representations? 02:44:49.620 |
- Yeah, but that representation, you can now give it, 02:44:54.500 |
I guess the universe has the memory of Einstein 02:44:56.620 |
with that representation, but then you can now give it 02:44:59.500 |
as a gift for free to other alien civilizations. 02:45:04.940 |
So I say that physics and chemistry and biology 02:45:19.820 |
As you get building more memory, you get to chemistry, 02:45:24.040 |
When you get to biology, more contingent still, 02:45:27.420 |
So the more memory you need, the more your laws are local. 02:45:31.500 |
That's all I'm saying, in that the less memory, 02:45:34.340 |
the more the laws are universal, because they're not laws, 02:45:37.940 |
- We have to talk about a thing you've kind of mentioned 02:45:44.060 |
already a bunch of times, but doing computation 02:45:48.420 |
through chemistry, chemical-based computation. 02:45:51.860 |
I've seen you refer to it as, in a sexy title, 02:46:15.900 |
And so, as a chemist, chemists make molecules by hand. 02:46:23.180 |
chemists have a lot of tacit knowledge, a lot of ambiguity. 02:46:26.860 |
It's not possible to go uniformly to the literature 02:46:32.460 |
and then go and make it in the lab every time. 02:46:51.100 |
And you're saying, can we remove the human from the picture? 02:47:12.300 |
So a turnstile would be a good example of a state machine. 02:47:23.620 |
in terms of like, it's precise how you do those transitions. 02:47:26.140 |
- Yes, and you can mathematically, precisely, 02:47:29.420 |
So, I mean, you know, very simple Boolean gates 02:47:46.180 |
and you would basically look at what's on the tape 02:47:49.380 |
and if you're shifting the tape from left to right, 02:48:17.900 |
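The tape-and-rules picture he's describing is easy to make concrete: a Turing machine is just a lookup from (state, symbol) to (next state, symbol to write, move direction). A minimal sketch running a standard textbook binary-increment machine (the rule table is illustrative, not anything from the chemputer):

```python
def run_tm(tape, rules, state="scan", pos=0, blank="_", max_steps=10_000):
    """Run a Turing machine until it reaches the 'halt' state.
    `rules` maps (state, symbol) -> (next_state, write_symbol, 'L' or 'R')."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        state, write, move = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Binary increment: scan right to the end of the number, then walk back
# left, flipping 1s to 0s until the carry lands on a 0 (or falls off).
INCREMENT = {
    ("scan", "0"): ("scan", "0", "R"),
    ("scan", "1"): ("scan", "1", "R"),
    ("scan", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt", "1", "L"),
    ("carry", "_"): ("halt", "1", "L"),
}
```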
- So you're looking to come up with a chemical computer 02:48:32.380 |
about actually doing computations with chemicals. 02:48:35.260 |
What I'm now saying is I want to use state machines 02:48:42.700 |
- Yeah, I mean, I get in trouble saying this. 02:48:47.780 |
but I said, look, we should make the crack bot, 02:49:15.380 |
- No, I don't, I don't, for the record, I don't, 02:49:21.540 |
But I shaved my head and I'm going to live a life of crime. 02:49:41.500 |
So the basic thesis is chemistry is very analog. 02:49:47.340 |
And I wandered into the, through the paper walls 02:49:52.780 |
in the Japanese house a few years ago and said, 02:49:54.900 |
"Okay, hey, organic chemist, why are you doing this analog?" 02:50:06.100 |
And they said, you know, I got the impression 02:50:11.780 |
it's like, no, no, they can't be magic, right? 02:50:15.660 |
And so what I did is I went to my group one day 02:50:21.100 |
"Hey guys, I've written this new programming language 02:50:26.660 |
And you know, you're not allowed to just wander 02:50:31.500 |
go to the balance at the right time and all this stuff. 02:50:35.980 |
and basically kicked me out of the lab and said, 02:50:42.020 |
"I'm gonna find some money so we can make cool robots 02:50:51.540 |
- So first you try to convert the humans to become robots 02:50:54.140 |
and next you agree you might as well just create the robots. 02:50:57.300 |
Yes, but so in that, the formalization process. 02:51:00.140 |
- Yeah, so what I did is I said, "Look, chemical, 02:51:02.620 |
to make a molecule, you need to do four things abstractly. 02:51:10.060 |
Turing machine is the ultimate abstraction of a computation 02:51:19.940 |
should be able to do all computations that you can imagine." 02:51:23.020 |
It's like, "Wow, why don't I think of a Turing machine 02:51:26.460 |
Let's think of a magic robot that can make any molecule. 02:51:35.740 |
So to make any molecule, you have to do a reaction. 02:51:43.860 |
Then after the reaction, you have to stop the reaction. 02:51:47.340 |
So whatever, cool it down, add some liquid to it, extract. 02:51:51.540 |
So then after you do the workup, you separate. 02:51:53.540 |
So you then remove the molecules, separate them all out. 02:52:01.700 |
So this is basically exactly like a Turing machine 02:52:05.900 |
where you have your tape head, you have some rules, 02:52:27.980 |
And I got a few very enlightened people to say, 02:52:30.420 |
"Yeah, okay, in principle, but it ain't gonna work." 02:52:36.620 |
And I found myself going to an architecture conference 02:52:40.820 |
It's like, "Why am I at this random conference 02:52:47.860 |
And they said, "Come to architecture conference." 02:52:49.420 |
But the inorganic architecture is not nano architecture. 02:52:53.200 |
And then I found these guys at the conference, 02:53:07.260 |
And then I was like, "Oh my God, you guys are geniuses." 02:53:16.140 |
and we're gonna build a robot to do chemistry 02:53:30.500 |
rather than to squirt out plastic out of a nozzle, 02:53:36.420 |
So we had the 3D printer that could simultaneously 02:53:48.780 |
So I got my group doing this and I developed it a bit. 02:53:50.820 |
And I realized that we could take those unit operations. 02:53:54.660 |
And we built a whole bunch of pumps and valves. 02:53:57.180 |
And I realized that I could basically take the literature 02:54:00.820 |
and I made the first version of the computer in 2016, 17. 02:54:07.540 |
So I designed the pumps and valves in my group. 02:54:12.300 |
I cannot pay tribute to my group enough in doing this. 02:54:16.740 |
And there were some poor souls there that said, 02:54:18.420 |
"Lee, why are you making this design electronics?" 02:54:21.300 |
I'm like, "Well, 'cause I don't understand it." 02:54:24.260 |
They're like, "So you're making this design stuff 02:54:29.660 |
I said, "Well, we can, but then I don't understand 02:54:42.620 |
So I got one cable for power and data, plug them all in, 02:54:57.180 |
So reaction, workup, separation, purification. 02:55:01.580 |
And then I made the decision to do it in batch. 02:55:07.900 |
All chemistry had been digitized before, apparently, 02:55:13.940 |
And flow is continuous and there are infinities everywhere. 02:55:17.220 |
And I realized that I could actually make a state machine 02:55:30.660 |
at electrical engineers saying, "You have it easy. 02:55:37.900 |
But in my state machine, I built in cleaning. 02:55:41.620 |
and then it cleans the backbone and then can do it again. 02:55:44.420 |
- So what we managed to do over a couple of years 02:55:47.140 |
is develop the hardware, develop the state machine. 02:55:54.140 |
a sleeping drug, rufinamide, anesthesia and Viagra. 02:55:57.180 |
You know, and I could make jokes on the paper. 02:56:03.900 |
- And then in the next one, what we did is said, 02:56:22.020 |
We just have to spend ages writing lines of code 02:56:28.860 |
So then, but I knew because I had this abstraction 02:56:37.700 |
which was lossy and ambiguous and populate my abstraction. 02:56:44.300 |
that is actually gonna be recursively enumerable. 02:56:46.940 |
It's gonna be a Turing complete language actually, 02:56:51.500 |
So where we are now is we can now read the literature 02:56:57.700 |
There are many other groups have done better job, 02:57:15.780 |
- Okay, so that's the kind of program synthesis. 02:57:27.060 |
extracting some kind of details about chemical reactions 02:57:32.060 |
and the chemical molecules and composites involved. 02:57:49.300 |
There you have a bunch of different like for loops 02:57:51.220 |
and so on that creates a program in this chemical language 02:57:56.100 |
that can then be interpreted by the chemical computer, 02:58:04.260 |
Everything sounds better in your British accent, 02:58:12.060 |
basically be a 3D printer for these, for molecules. 02:58:18.420 |
I would call it a universal chemical reaction system 02:58:21.020 |
because 3D printing gives the wrong impression, 02:58:36.220 |
Chemputation is to chemistry what computation is to mathematics, I think. 02:58:41.220 |
Chemputation is the process of taking chemical code 02:58:44.660 |
and some input reagents and making the molecule 02:58:57.740 |
and give you an output same every time, right, reliably. 02:59:04.500 |
now maybe you can push back and correct me on this. 02:59:15.640 |
it's easier to make computation in a computer very precise, 02:59:20.640 |
that it's repeatable, it makes errors almost never. 02:59:26.180 |
If it does the exact same way over and over and over and over 02:59:44.980 |
you're doing reactions on billions of molecules 03:00:02.660 |
So I would say, now go back to the first ever computer 03:00:09.500 |
400,000 valves that are exploding all the time. 03:00:12.800 |
Was that, would you have gone, okay, that's messy. 03:00:16.380 |
So we've got the, and have we got the equivalent 03:00:45.460 |
six steps on the chemputer that would take a human being 03:00:49.740 |
about one week of continuous labor to make Arbidol. 03:00:56.180 |
press go button, and just go away and drink coffee. 03:01:01.940 |
you're saying this chemputer's just the early days. 03:01:08.020 |
And yes, I would say that something like this 03:01:13.940 |
You know, so the fact that you're doing this is incredible. 03:01:17.940 |
Not impossible, of course, but extremely difficult. 03:01:22.940 |
And I do keep pinching myself when I go in the lab. 03:01:35.100 |
because I made some, we just made design decisions 03:01:37.980 |
and said we are not gonna abandon the abstraction. 03:01:40.500 |
Think about it, if the von Neumann implementation 03:01:44.780 |
was abandoned, I mean, think about what we do 03:01:59.020 |
- It's incredible what they're able to accomplish 03:02:00.900 |
and achieve that reliability at the scale they do. 03:02:06.420 |
what we have now, and how it started, you know, 03:02:17.060 |
well, say 20 million molecules in one database, 03:02:32.300 |
Now imagine what happens when a drug goes out of print, 03:02:35.500 |
goes out of print because there's only a finite number 03:02:44.060 |
- Yeah, and not only that, we can protect the χDL 03:02:47.500 |
so we can stop bad actors doing it, we can encrypt them, 03:02:51.460 |
- χDL, that's the name, sorry to interrupt, 03:02:53.940 |
- Yeah, the χDL is the name of the programming language 03:03:10.220 |
so we can do dynamics and there's for loops in there 03:03:13.820 |
- Right, but the structure, it started out as a, 03:03:16.220 |
like an XML type of thing. - Yeah, yeah, yeah, yeah. 03:03:21.460 |
to program in χDL, they can just go to the software 03:03:31.780 |
you know, not with ASCII, but because it's a Greek letter, 03:03:52.260 |
- It's important, I think, when the team are contributing 03:03:54.580 |
to such big ideas, 'cause there are ideas as well, 03:03:57.540 |
I try not to just rename, I didn't call it Cronin 03:04:00.540 |
or anything that, 'cause they keep saying, you know, 03:04:03.140 |
is it, the chemistry, when they're putting stuff 03:04:11.500 |
He said, "Well, can we make it on the damn machine?" 03:04:15.460 |
And I was like, "Oh, is that a compliment or a pejorative?" 03:04:24.500 |
"Why does chemistry need a universal programming language?" 03:04:31.820 |
reliability, interoperability, collaboration, 03:04:35.700 |
remove ambiguity, lower cost, increase safety, 03:04:47.380 |
Which is fascinating, by the way, just publish code. 03:04:51.980 |
And can you maybe elaborate a little bit more 03:04:57.660 |
What does a universal language of chemistry look like? 03:05:07.380 |
But so what it has, it has a series of operators in it, 03:05:19.640 |
with chemical engineers, when I talked about this, 03:05:22.740 |
that you've just rediscovered chemical engineering. 03:05:31.860 |
Well, yes, it is trivial, and that's why it's good. 03:05:33.700 |
Because not only have we rediscovered chemical engineering, 03:05:36.820 |
we've made it implementable on a universal hardware 03:05:40.740 |
And so the χDL has a series of statements. 03:05:52.180 |
And what I also implemented at the beginning is, 03:06:01.260 |
the graph is equivalent to the processor firmware, 03:06:14.820 |
As long as I can solve the problem on the graph 03:06:18.100 |
you have the resources available, it compiles. 03:06:34.460 |
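The compile step described here, where a program is valid only if the hardware graph can supply the resources each statement needs, can be sketched as a simple capability check. The module and operation names below are hypothetical placeholders:

```python
# Sketch of "it compiles": a χDL-like program is accepted only
# if the hardware graph offers a module for every operation it
# uses. Module and operation names here are hypothetical.

HARDWARE_GRAPH = {
    "reactor":   {"add", "stir", "heat_reflux"},
    "separator": {"separate"},
    "rotavap":   {"evaporate"},
}

def compiles(program):
    """True if every operation in the program is supported by
    at least one module wired into the hardware graph."""
    available = set()
    for operations in HARDWARE_GRAPH.values():
        available |= operations
    return all(op in available for op in program)

ok = compiles(["add", "heat_reflux", "separate", "evaporate"])
bad = compiles(["add", "distill"])  # no module provides "distill"
```

In the real system the check is over a graph of plumbing connections, not a flat set; this sketch only captures the compile-or-refuse behavior.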
"it was possible to do robotics for chemistry, 03:06:48.580 |
at virtually no cost, because it makes it safer. 03:07:02.980 |
and maybe a student finishes their PhD in the time 03:07:13.460 |
and it limits the ability of humanity to build on it. 03:07:16.420 |
If they just download the code and can execute it, 03:07:20.860 |
the electronic laboratory notebook in chemistry 03:07:28.580 |
For now, the data cemetery is a Jupyter notebook, 03:07:37.860 |
We'll talk about, so, as with all technologies, 03:07:41.460 |
I think there's way more exciting possibilities, 03:07:56.380 |
I don't know if you've heard about OpenAI Codex, 03:08:12.420 |
he is, I guess, kind of philosophically deep, too. 03:08:20.820 |
natural language generation, so you can give it a prompt, 03:08:40.700 |
So, these kinds of transformer-based language models 03:08:43.900 |
are really good at forming deep representations 03:08:48.900 |
of a particular space, like a medium, like language. 03:08:54.420 |
So, you can then apply it to a specific subset 03:09:12.140 |
on one of the hardest problems in computer science, 03:09:16.060 |
How do you write programs that accomplish different tasks? 03:09:24.620 |
those programs based on a prompt of some kind. 03:09:28.700 |
Usually, you can do a natural language prompt, 03:09:36.340 |
the basic documentation of the inputs and the outputs 03:09:39.500 |
and the function of the particular set of code, 03:09:45.860 |
using machine learning, using neural networks. 03:09:49.660 |
Those programs operate on the boring old computer. 03:09:59.580 |
there's gotta be a clever version of programs for this, 03:10:01.700 |
but can you write programs that operate on a chemputer? 03:10:06.460 |
- Yep, there's actually software out there right now, 03:10:11.140 |
- Yeah, yeah, it's a heuristic, it's rule-based, 03:10:14.300 |
but we have, what we've done, inspired by Codex, actually, 03:10:38.840 |
so there's a bunch of people doing this right now, 03:10:50.280 |
we've developed basically a chemical analog of Codex. 03:11:00.160 |
- So right now, a lot of people do machine learning 03:11:12.800 |
competitors, actually, and they're good, very good, 03:11:23.700 |
and the really important thing that you have to do 03:11:30.000 |
and so what we've learned to do with our abstraction 03:11:32.780 |
is make sure we can pull the context out of the text, 03:11:40.480 |
and read it and generate our executable code? 03:11:43.960 |
- What's the hardest part about that whole pipeline, 03:11:56.800 |
to then running that program in the hardware? 03:12:02.360 |
as we look towards a universal Turing computer? 03:12:18.340 |
So if, you know, chemists are very good at inventing words 03:12:22.880 |
so I would, the classic word that you would use 03:12:31.060 |
in a round-bottom flask, at reflux it would be boiling, 03:12:33.740 |
going up the reflux condenser and coming down. 03:12:36.020 |
But that term, reflux, to reflux, could be changed, 03:12:39.660 |
you know, to people often make up words, new words, 03:12:47.380 |
But what we've been able to do is a bit like in Python, 03:12:55.140 |
So you present the code, you say, "This isn't matched, 03:12:58.580 |
and then the user goes and says, "Oh, I mean reflux," 03:13:02.700 |
So what the Codex or the ChemX does in this case, 03:13:12.540 |
and then the chemist goes in and corrects it. 03:13:16.260 |
because it's not safe, I believe, for it to allow AI 03:13:21.260 |
to just read literature and generate code at this stage. 03:13:25.100 |
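The fuzzy matching described here, where an unrecognized verb like a variant of "reflux" is suggested but left for the chemist to confirm, can be sketched with the standard library's string matcher. The verb vocabulary is a small invented subset, not the real χDL operation set:

```python
import difflib

# Sketch of matching free-text synthesis verbs against a fixed
# operation vocabulary: a near-miss verb is suggested but
# flagged for a human chemist to confirm, and an unknown verb
# is never silently accepted.

KNOWN_VERBS = ["add", "stir", "heat", "reflux", "filter", "evaporate"]

def match_verb(word):
    """Return (best_match_or_None, needs_human_review)."""
    hits = difflib.get_close_matches(word.lower(), KNOWN_VERBS,
                                     n=1, cutoff=0.6)
    if hits:
        return hits[0], hits[0] != word.lower()
    return None, True

suggestion, review = match_verb("Refluxed")    # near-miss: suggest "reflux"
unknown, must_define = match_verb("sonicate")  # no close match: ask the user
```

In the described pipeline the suggested fix is shown to the chemist, who confirms or corrects it before any code is generated.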
- 'Cause now you're having actual, by the way, ChemX, nice. 03:13:38.020 |
is that we live in a fascinating moment in human history. 03:13:51.020 |
it's building something in the physical realm. 03:14:15.260 |
And the way we did it was just by brute force to start with. 03:14:18.340 |
We just kept reading the literature and saying, 03:14:19.820 |
"Is there anything new, can we add a new rule in?" 03:14:21.900 |
And actually, our χDL language expanded, exploded. 03:14:25.380 |
There was so many extra things we had to keep adding. 03:14:28.140 |
And then I realized the primitives still were maintained, 03:14:36.100 |
There are problems of interpreting any big sentence 03:14:47.660 |
I would love to learn to program now using Codex, right? 03:14:55.420 |
will learn to do chemistry by just hacking around 03:15:00.100 |
with the system, writing in different things. 03:15:02.100 |
Because the key thing that we're doing with chemistry 03:15:04.300 |
is that where a lot of mathematical chemistry went wrong 03:15:07.220 |
is people, and I think Wolfram does this in Mathematica, 03:15:14.700 |
where atom A or molecule A reacts with molecule B 03:15:23.100 |
take a liquid or a solid, mix it up and heat it, 03:15:29.380 |
So the programming language is actually with respect 03:15:37.780 |
not in chemical graph space, you unlock everything. 03:15:41.500 |
Because there's only a finite number of processes 03:15:53.580 |
And there is, like I say, errors that can creep in. 03:16:05.380 |
But there are so many safety issues right now 03:16:09.700 |
protecting the user, protecting the environment, 03:16:35.220 |
on a bad actor trying to make methamphetamine? 03:16:38.940 |
- I saw how you looked at me when you said bad actor, 03:16:42.900 |
I'm trying to get the details of this so I can be first. 03:16:45.420 |
- Don't worry, we can protect you from yourself. 03:16:50.220 |
I'm not sure that's true, but that statement gives me hope. 03:16:54.060 |
Does this ultimately excite you about the future, 03:17:11.260 |
of progress that will have to happen, that will happen, 03:17:20.780 |
I see obviously a huge number of exciting possibilities. 03:17:24.660 |
So, whenever you automate these kinds of things, 03:17:39.500 |
and made the world better in so many dimensions. 03:17:43.540 |
And it created, of course, a lot of negative things 03:17:48.940 |
using that very technology to tweet about it. 03:18:00.900 |
when you kind of stand at the end of the road 03:18:07.780 |
for building a really solid, reliable, universal chemputer, 03:18:12.780 |
what are the possibilities that are positive? 03:18:16.380 |
What are the possibilities that are negative? 03:18:18.140 |
How can we minimize the chance of the negative? 03:18:23.940 |
from drug discovery, from supply chain stress, 03:18:30.700 |
to basically build more productive in the lab, right? 03:18:32.620 |
Well, the chemputer's not gonna replace the chemist. 03:18:35.300 |
There's gonna be a Moore's law of molecules, right? 03:18:37.380 |
There's gonna be so many more molecules we can design, 03:18:51.980 |
it's actually like doing the basic understanding 03:18:55.180 |
- And the personalization, the cost of drugs right now, 03:18:57.780 |
we're all living longer, we're all having more and more, 03:19:07.940 |
imagine, you know, your genome assistant 03:19:11.900 |
tells you you're gonna get cancer in seven years time, 03:19:16.820 |
that cooks up the right molecule just for you to cure it, 03:19:25.260 |
so right now, I think it's absolutely outrageous 03:19:28.660 |
that not all of humanity has access to medicine. 03:19:36.020 |
because it will disrupt the way things are manufactured. 03:19:40.860 |
in different factories, let's say that chemputers, 03:19:44.420 |
clinical grade chemputers or drug grade chemputers 03:19:49.740 |
and they can make things on demand as a function of the cost, 03:19:54.060 |
you know, maybe people won't be able to afford 03:19:57.220 |
but maybe they'll be able to get the next best thing, 03:20:02.820 |
make available drugs to everybody that they need, 03:20:16.980 |
Before we do that, let's imagine what happened, 03:20:19.460 |
go back to a really tragic accident a few years ago, 03:20:21.900 |
well not an accident, an act of murder by that pilot 03:20:25.500 |
on the, I think it was Eurowings or Swiss Wings, 03:20:39.940 |
he set the altimeter or the descent height to zero, 03:20:44.220 |
so the computer just took the plane into the Alps. 03:21:01.660 |
to anticipate problems like this in the chemputer? 03:21:05.260 |
Had the software, and I'm sure Boeing and Airbus 03:21:08.380 |
will be thinking, oh, maybe I can give the computer 03:21:12.660 |
so whenever one tries to drop the height of the plane, 03:21:20.540 |
Of course, he would have been able to find another way, 03:21:22.240 |
maybe fly it until it runs out of fuel or something, 03:21:25.340 |
- Keep anticipating all the large number of trajectories 03:21:30.420 |
running into the Alps, and try to at least make it easy 03:21:35.420 |
for the engineers to build systems that are protecting us. 03:21:40.460 |
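Carrying the aircraft analogy over to the chemputer, one minimal protective layer is to screen every requested product against a deny-list before a program may compile. This is a toy sketch with placeholder entries, not a real controlled-substance screen:

```python
# Toy guard layer for the chemputer, echoing the aircraft
# analogy: refuse to compile a program whose requested products
# hit a deny-list. Entries are placeholders, not a real
# controlled-substance database.

DENY_LIST = {"nerve_agent_x", "controlled_substance_y"}

def guard(requested_products):
    """Return (allowed, blocked): allowed is False if any
    requested product appears on the deny-list."""
    blocked = [p for p in requested_products if p in DENY_LIST]
    return len(blocked) == 0, blocked

allowed, _ = guard(["aspirin"])
refused, hits = guard(["aspirin", "nerve_agent_x"])
```

As with the cockpit example, a determined bad actor could route around any single rule, so a real system would layer many such checks.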
what in the chemputer world right now with χDLs, 03:21:43.020 |
let's just not think about what I'm doing right now. 03:21:45.020 |
What I'm doing right now is, it's completely open, right? 03:21:48.260 |
and be playing with them, making them easier, 03:21:50.100 |
and easier, and easier, but what we're gonna start to do, 03:22:01.180 |
and you have a license to make a given molecule. 03:22:07.460 |
and they'll say, right, your license to do it, 03:22:08.940 |
here it is, it's encrypted, and the χDL gets run. 03:22:12.360 |
So you have a license for that instance of use. 03:22:15.460 |
Computer science has already solved the problem. 03:22:17.620 |
So the fact that we all trust online banking, right, 03:22:31.980 |
that you, to actually reverse engineer a χDL 03:22:34.460 |
will be as hard as reverse engineering the encryption key. 03:22:44.420 |
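The licensing idea described here, an encrypted authorization bound to one instance of use, can be illustrated with an HMAC-signed token over the program hash plus a fresh nonce. The actual cryptographic scheme is not specified in the conversation, so this is purely a sketch using the Python standard library:

```python
import hashlib
import hmac
import secrets

# Illustrative single-use license for running one chemical
# program: the issuer signs (program_hash, nonce) with a secret
# key, and the machine verifies the signature before executing.
# This is a toy sketch, not the scheme actually deployed.

SERVER_KEY = secrets.token_bytes(32)

def issue_license(program_text):
    nonce = secrets.token_hex(16)  # binds one instance of use
    digest = hashlib.sha256(program_text.encode()).hexdigest()
    payload = f"{digest}:{nonce}".encode()
    sig = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    return digest, nonce, sig

def verify_license(program_text, digest, nonce, sig):
    expected = hashlib.sha256(program_text.encode()).hexdigest()
    payload = f"{expected}:{nonce}".encode()
    good_sig = hmac.new(SERVER_KEY, payload, hashlib.sha256).hexdigest()
    return expected == digest and hmac.compare_digest(sig, good_sig)

program = "<xdl><Add reagent='water'/></xdl>"  # hypothetical program text
d, n, s = issue_license(program)
valid = verify_license(program, d, n, s)           # authorized run
tampered = verify_license(program + "!", d, n, s)  # program was altered
```

A real deployment would additionally record spent nonces server-side so a token cannot be replayed.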
And actually, people aren't gonna want to then 03:23:10.340 |
- Giving cannabis, THC, to some people who've got epilepsy 03:23:13.780 |
is literally, forgive the term, a no-brainer, 03:23:16.540 |
because these poor people go from seizures like every day 03:23:19.660 |
to maybe seizures just once every few months. 03:23:23.940 |
that try to minimize the chance that it can get 03:23:29.020 |
like terrorists or people that want to do harm. 03:23:35.680 |
you're putting a lot of power in the hands of governments, 03:23:40.480 |
and so then emerge the kind of natural criticism 03:23:54.740 |
So, and sometimes not just war against other nations, 03:24:03.500 |
- Well, I'm thinking, so there's another way of doing it, 03:24:11.200 |
I'm not saying you should adopt a blockchain, 03:24:24.960 |
and diligently make it in the robot and validate it. 03:24:28.160 |
So, I would call mining, instead of proof of work, proof of synthesis. 03:24:34.040 |
- Proof of the synthesis, that's pretty cool. 03:24:35.920 |
because suddenly, when you actually synthesize it, 03:24:43.680 |
'cause you can never make something 100% pure. 03:24:46.440 |
That fingerprint will allow you to secure your χDL. 03:24:46.440 |
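The proof-of-synthesis fingerprint, the idea that a real batch's unavoidable impurity profile can secure the program that made it, can be sketched by coarsening a measured spectrum to instrument precision and hashing it. The spectra below are invented illustrative numbers:

```python
import hashlib

# Toy sketch of "proof of synthesis": round a measured spectrum
# to instrument precision, then hash it. Identical batches give
# the same fingerprint; a different impurity profile does not.

def fingerprint(spectrum, decimals=2):
    coarse = tuple(round(x, decimals) for x in spectrum)
    return hashlib.sha256(repr(coarse).encode()).hexdigest()

batch_a = [0.1203, 0.8812, 0.3399]        # measured peaks, batch A
batch_a_again = [0.1201, 0.8807, 0.3401]  # same batch, re-measured
batch_b = [0.2510, 0.7004, 0.3399]        # different impurity profile

same = fingerprint(batch_a) == fingerprint(batch_a_again)
diff = fingerprint(batch_a) == fingerprint(batch_b)
```

Because you can never make something 100% pure, the residual profile acts like a physical nonce that only a machine which actually ran the synthesis can produce.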
So, suddenly, you can have people out there mining, 03:25:00.220 |
that contact tracing should have been done in COVID, 03:25:05.500 |
So, you have just been in contact with someone COVID, 03:25:07.940 |
you choose, I'm not telling you to stay at home, 03:25:12.260 |
So, now, if we could imagine a similar thing, 03:25:14.260 |
like, you have got access to these chemicals, 03:25:25.260 |
And my job here is to not just make the technology possible, 03:25:29.220 |
but to have as open as a discussion as possible 03:25:31.940 |
with people to say, "Hey, can we stop childhood mortality 03:25:42.420 |
or people might use it for recreational drugs? 03:25:51.020 |
as an entity to make sure that we're not enabling people 03:25:53.380 |
to manufacture personal drugs, weapons at will. 03:25:56.760 |
And what we have to do is have a discussion with society, 03:26:07.460 |
"And are you willing to accept some of the risks?" 03:26:11.820 |
- So by the way, when you say personal drugs, 03:26:16.260 |
Or do you have a concern of just putting the manufacturer 03:26:19.860 |
of any kind of legal drugs in the hands of regular people? 03:26:32.020 |
the chances of chemputers being, well, I should never say never, 03:26:45.420 |
but they might be at the local pharmacy, right? 03:26:48.940 |
And if you've got a drug manufacturing facility 03:26:56.260 |
so that you don't have to take 10 pills every day. 03:27:09.220 |
'cause I know people are gonna speak up on this, 03:27:14.780 |
no reason why you can't manufacture it for recreation. 03:27:21.420 |
- So, I mean, I'm a chemistry professor in a university 03:27:37.780 |
there's nothing, 'cause you have said recreational drugs 03:28:07.820 |
become heavily acceptable, and that you can modify them. 03:28:20.380 |
then why not have a machine that makes the one you like? 03:28:28.740 |
But I'm, you know, we're so far away from that. 03:28:32.180 |
I can barely get the thing to work in the lab, right? 03:28:34.540 |
I mean, it's reliability and all this other stuff, 03:28:36.780 |
but what I think's gonna happen in the short term, 03:28:39.100 |
it's gonna turbocharge molecular discovery, reliability, 03:28:53.780 |
So, we are talking about automating engineering 03:29:06.780 |
what are the things we should be excited about? 03:29:09.660 |
And what are the things we should be terrified about? 03:29:18.060 |
- So, in this robot, the robot does all the heavy lifting. 03:29:31.460 |
there was an attempt in the '60s, Joshua Lederberg 03:29:37.700 |
that made an AI to try and guess if organic molecules 03:29:44.740 |
- And they failed 'cause they didn't have assembly theory. 03:29:58.860 |
they were trying to basically just look at the corpus 03:30:04.060 |
So, when I was a bit down about assembly theory, 03:30:08.180 |
and couldn't convince computational people interested 03:30:16.340 |
And I mean, I've been working with Sarah Walker's team, 03:30:20.140 |
and I think she also invented assembly theory in some way. 03:30:25.180 |
When I found the AI not working for the Dendral project, 03:30:37.220 |
so what it does, it's basically like a computer, 03:30:49.660 |
so some electrons to make the gold turn into a nanoparticle. 03:30:58.740 |
on a gold wedding ring or a gold bar or something, 03:31:05.860 |
What we did is we randomly squirt the gold particle 03:31:09.580 |
and the reducing agent in, and we measure the UV, 03:31:13.540 |
And so, what we do is we've got, the robot has a mind, 03:31:30.860 |
it squirts in the chemicals and looks at the color, 03:31:51.140 |
So, the exploration just says, just do random stuff 03:31:54.420 |
and see how many different things you can get. 03:31:57.500 |
try and optimize and make the peak sharper, sharper, sharper. 03:32:07.880 |
resets all the round bottom flasks, cleans them, 03:32:14.660 |
And what this robot is able to do is search a space 03:32:24.180 |
And it makes five generations of nanoparticles, 03:32:30.100 |
And then, at the end, it outputs a χDL code. 03:32:39.700 |
So, it's doing a kind of reinforcement learning. 03:32:54.700 |
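The explore/exploit loop described for the nanoparticle robot, random squirts during exploration and then iterative sharpening of the UV peak during exploitation, can be sketched as a toy search over a single recipe parameter. The score function here is invented; real feedback comes from a UV-vis spectrometer, not a formula:

```python
import random

def peak_sharpness(recipe):
    """Invented stand-in for UV-vis feedback: the sharpest peak
    occurs at a reducing-agent fraction of 0.7."""
    return 1.0 - abs(recipe - 0.7)

def search(generations=5, samples=20, seed=0):
    """Generation 0 explores random recipes; later generations
    exploit by perturbing the best recipe found so far."""
    rng = random.Random(seed)
    best = rng.random()  # initial random recipe
    for gen in range(generations):
        for _ in range(samples):
            if gen == 0:   # explore: do random stuff
                candidate = rng.random()
            else:          # exploit: make the peak sharper
                candidate = min(1.0, max(0.0, best + rng.gauss(0, 0.05)))
            if peak_sharpness(candidate) > peak_sharpness(best):
                best = candidate
    return best

best_recipe = search()
```

As in the robot, each generation builds on the last, and the winning recipe is what would be written out as executable chemical code.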
- Replicate somewhat, maybe perfectly, what it created. 03:33:03.540 |
So, we don't try and imply any self-replication 03:33:06.620 |
or try and get the particles to make themselves, 03:33:29.580 |
to start to create molecules that have lifelike qualities? 03:33:42.060 |
I talked about earlier, the molybdenum oxides and the rings 03:33:52.020 |
So, they would, I guess you would call it self-replication. 03:33:55.780 |
But because there's limited function and mutation, 03:34:04.540 |
So, I think the prospect of us being able to engineer 03:34:13.700 |
like I said earlier, my aim is to do this, of course. 03:34:16.420 |
I mean, on one hand, I'm saying it's impossible. 03:34:20.420 |
You know, it's like, well, I think we can do it, 03:34:27.540 |
These particles, if they do start to self-replicate, 03:34:32.340 |
that I don't think anything dangerous will come out. 03:34:41.340 |
I don't want to scare people, like gain of function, 03:34:45.180 |
Our number one kill switch is that we always try 03:34:48.060 |
to search a space of objects that don't exist in our, 03:34:55.820 |
So, even if something got out, it just would die immediately. 03:34:58.500 |
It's like making a silicon life form or something, 03:35:04.300 |
gain of function research is focused on, like, 03:35:06.380 |
how do you get a dangerous thing to be closer 03:35:16.820 |
is always try to operate on chemical entities 03:35:21.820 |
that are very different than the kind of chemical environment 03:35:26.300 |
- Yeah, and also, I mean, I'll say something dramatic, 03:35:29.540 |
which may not be true, so I should be careful. 03:35:34.540 |
If, let's say, we did discover a new living system, 03:35:41.960 |
and we just released it in the environment, who cares? 03:35:58.420 |
And you have two different origins of life on that planet, 03:36:06.020 |
- And then the only time they recognize each other 03:36:13.180 |
- So, they co-evolve, and that's fascinating. 03:36:17.460 |
is exactly what you were saying, which is a life bomb, 03:36:33.700 |
optimization system, try to discover life forms 03:36:41.860 |
and then you send those life forms over there. 03:36:54.700 |
- Yeah, so look, I'm gonna say something quite contentious. 03:36:59.000 |
I think it's brilliant he wants to go to Mars, 03:37:02.540 |
is Elon just obsessed with getting humanity off Earth, 03:37:08.100 |
So, if we do technology, so Elon either needs 03:37:11.820 |
'cause he needs to manufacture drugs, right, on demand, 03:37:20.060 |
it's quite hard for humans to survive on Mars. 03:37:22.580 |
Why don't we write a series of origin of life algorithms 03:37:30.660 |
- Which is a terrible film, by the way, but anyway. 03:37:33.020 |
And dump it on Mars, and just terraform Mars, 03:37:42.180 |
rather than brute-forcing human life on Mars. 03:37:47.300 |
what is human culture, what are the things you encode? 03:37:50.180 |
Some of it is knowledge, some of it is information, 03:37:58.680 |
is some of the more unique aspects of what makes us human, 03:38:03.680 |
which is our particular kind of consciousness. 03:38:09.780 |
So, he talks about the flame of human consciousness. 03:38:14.620 |
is can we instill consciousness into other beings? 03:38:27.860 |
that hopes and dreams and fears and loves can all die. 03:38:45.980 |
it has an electric field around the perimeter, 03:38:49.900 |
and it goes out and goes from its base station, 03:38:55.180 |
detects the perimeter, then chooses a random angle, 03:39:03.820 |
they just, when they were quite young, they called it, 03:39:07.300 |
I don't wanna be sexist there, it could be a he, 03:39:17.820 |
if you apply integrated information theory to lawnmowers, 03:39:25.020 |
is that people say it's a flawed way of measuring 03:39:29.060 |
I think assembly theory actually measures consciousness 03:39:41.020 |
Our consciousness has evolved together, right? 03:39:42.860 |
The fact we're here and the robots we leave behind, 03:39:45.980 |
they all have some of that, so we won't lose it all. 03:39:49.140 |
Sure, consciousness requires that we have many models 03:39:52.180 |
being generated, it's not just one domain-specific AI, 03:39:54.660 |
right, I think the way to create consciousness, 03:39:57.420 |
I'm gonna say unashamedly, the best way to make 03:40:02.840 |
because you just have access to many more states. 03:40:05.300 |
And the problem right now we're making silicon consciousness 03:40:12.780 |
or sorry, there are more possible configurations 03:40:14.620 |
possible in your brain than there are atoms in the universe. 03:40:22.300 |
It's got 10 billion, 12 billion, 14 billion transistors, 03:40:25.260 |
but you can't reconfigure them as dynamically. 03:40:28.340 |
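The back-of-envelope claim about brain configurations versus atoms can be checked with log arithmetic: treating each of roughly 1e14 synapses as a single binary switch (a huge undercount of real synaptic states) already gives 2 to the power 1e14 configurations, against roughly 1e80 atoms in the observable universe:

```python
import math

# Order-of-magnitude check: ~1e14 synapses, each crudely
# treated as binary, versus ~1e80 atoms in the observable
# universe. Both counts are rough textbook figures.
synapses = 1e14
log10_states = synapses * math.log10(2)  # log10 of 2**synapses
log10_atoms = 80.0

# The exponent of the state count exceeds the exponent of the
# atom count by more than eleven orders of magnitude.
ratio_of_exponents = log10_states / log10_atoms
```

So even this crude binary model exceeds the atom count by an astronomically large margin, which is the spirit of the claim; the transistor comparison then turns on reconfigurability rather than raw count.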
- Well, you've shared this intuition a few times already 03:40:31.420 |
that the larger number of states somehow correlates 03:40:38.380 |
but it's also possible that constraints are essential here. 03:40:53.580 |
but I wonder if it's not, it can't be separate from human, 03:40:58.220 |
it can't be separate from human consciousness, 03:41:00.020 |
because the causal chain that produced it came from humans. 03:41:06.340 |
to people worry about the existential threat of AI saying, 03:41:11.340 |
I mean, you put it much more elegantly earlier, 03:41:15.820 |
dumb algorithms written by human beings on Twitter, 03:41:24.900 |
this is what I have been ineloquent in trying to describe it. 03:41:29.580 |
Partially because I try not to think too deeply 03:41:34.340 |
through this stuff, because then you become a philosopher. 03:41:36.260 |
I still aspire to actually building a bunch of stuff. 03:41:51.980 |
Like intelligence, the way we conceive of intelligence 03:41:56.980 |
materializes as a thing that becomes a fun entity 03:42:06.380 |
So like it's a mix of wit, intelligence, humor, 03:42:17.940 |
ability to love, to dream, to share those dreams, 03:42:31.340 |
And I think that kind of super intelligent being 03:42:47.760 |
don't study history, that don't study human psychology 03:42:58.460 |
- Yeah, and I think it's not a new danger, right? 03:43:11.540 |
It's not fun, it's not a good thing for humans to do, right? 03:43:14.620 |
And I think that when you get people into outrage, 03:43:20.020 |
but I think that we're all beginning to see this. 03:43:22.060 |
And I think that actually I'm very optimistic 03:43:31.220 |
has Twitter and social media taken out of humanity? 03:43:36.580 |
so the good thing about Twitter is it gives power, 03:43:55.780 |
I think, or at least if I were to agree with you, 03:43:59.980 |
what I would say is minorities broadly defined 03:44:07.380 |
it magnifies the concerns of the small versus the big. 03:44:20.940 |
I mean, I think that the world isn't that broken, right? 03:44:33.540 |
- I like how you said it, it's a pretty cool problem. 03:44:39.820 |
- There's a bunch of really, really big problems 03:44:53.420 |
- Yeah, and I think that coming back to consciousness, 03:44:56.620 |
I don't think the universe is doomed to heat death, right? 03:45:00.300 |
It's one of the optimists, that's why I want to 03:45:02.580 |
kind of nudge you into thinking that time is fundamental, 03:45:05.820 |
then suddenly you don't have to give it back. 03:45:11.140 |
and what we see around us in our construction, 03:45:13.220 |
I know everyone's worried about how fragile civilization is, 03:45:41.020 |
And I don't know the answer to that question right now. 03:45:43.340 |
I guess Elon's gonna have a pretty good go at getting there. 03:45:53.020 |
I'm sure people have their doubts that chemputation works, 03:45:58.020 |
- And most of the cool technologies we have today 03:46:10.820 |
every majority of people doubted before they came to life, 03:46:19.160 |
it's fascinating to think about all the different ways 03:46:28.540 |
but there's all kinds of ways that hybrid happens, 03:46:31.460 |
how we and other technology play together, like a computer, 03:46:35.820 |
how that changes the fabric of human civilization 03:46:55.660 |
we're not anticipating, many of them positive, 03:47:08.820 |
And we sitting on a porch with a bottle of Jack Daniels 03:47:21.340 |
you and Joscha Bach nudge each other on Twitter quite a bit 03:47:50.820 |
"every working model of existence into a Turing machine, 03:47:54.360 |
"the structure of the universe might be given 03:47:56.740 |
"by wakes of nonexistence in a pattern generated 03:48:18.340 |
"in a pattern generated by all possible automata, 03:48:31.780 |
What the hell is nonexistence in the universe? 03:48:39.360 |
you tweeted, "It's state machines all the way down," 03:48:59.220 |
"Many foundational physicists effectively believe 03:49:07.060 |
And then you said, "I think there are notable differences. 03:49:13.020 |
"Second, I want to understand how the universe emerges 03:49:26.580 |
What the heck is this dinner conversation about? 03:49:30.780 |
Maybe, put another way, maybe zooming out a little bit, 03:49:34.700 |
are there interesting agreements or disagreements 03:49:37.020 |
between you and Joscha Bach that can elucidate 03:49:41.540 |
some of the other topics we've been talking about? 03:50:11.620 |
So a Turing machine, Turing machines, I would argue, 03:50:34.420 |
who invented the abstraction of the Turing machine, 03:50:43.340 |
at the origin of life, the origin of intelligence, 03:50:56.260 |
And I think we got kind of trapped in our words, 03:51:07.740 |
where did the transition of continuous to discrete occur? 03:51:11.340 |
And this is because of my general foolishness 03:51:23.540 |
there were constructors before there were abstractors. 03:51:26.780 |
Because how did the universe abstract itself into existence? 03:51:32.820 |
could the universe of intelligence have come first? 03:51:35.220 |
- What's a constructor, what's an abstractor? 03:51:50.740 |
And then from those labels, to come up with a set of axioms, 03:51:54.460 |
with those labels, and to basically understand 03:52:04.700 |
- Even if the universe is not a Turing computer, 03:52:09.700 |
does that negate the possibility that a Turing computer 03:52:16.260 |
Like, just because the abstraction was formed 03:52:18.700 |
at a later time, does that mean that abstraction, 03:52:22.660 |
this is to our cellular automata conversation. 03:52:32.380 |
- Well, this is where the existence is the default, right? 03:52:44.020 |
That's a, so the has to be and the can it be. 03:52:50.300 |
I don't know, I don't understand if it has to be or not. 03:52:54.780 |
But can the universe have Turing machines in it? 03:53:09.420 |
did not have the computational power that it has now. 03:53:36.060 |
And so we can easily imagine going back in time 03:53:38.300 |
that the universe was capable of having them, 03:53:41.940 |
- So the universe may have been a lot dumber computationally? 03:53:46.820 |
I don't want to go back to the time discussion, 03:53:48.540 |
but I think it has some relationship with it. 03:53:50.780 |
The universe is basically smarter now than it used to be, 03:53:53.460 |
and it's gonna continue getting smarter over time 03:54:02.460 |
- You know, there's a, perhaps there's ground in physics, 03:54:30.420 |
is that the energy isn't conserved in the universe. 03:54:38.060 |
- Okay, so computation potentially is not conserved, 03:55:05.300 |
and the universe is inflating in time, if you like, 03:55:16.220 |
of the universe is actually increasing over time. 03:55:30.980 |
galaxies, so dark matter, I think, doesn't hold. 03:55:35.420 |
You know, you need to hold the galaxies together 03:55:55.300 |
I'm not saying there's perpetual motion allowed 03:56:04.120 |
and it's generating novelty and it appears to, 03:56:09.720 |
couldn't that just be mechanistically how reality works? 03:56:13.520 |
And therefore, I don't really like this idea that the, 03:56:18.520 |
so I want to live in a deterministic universe 03:56:25.160 |
The only way I can do that is if time is fundamental. 03:56:30.360 |
is just sleight of hand, because the physicists will say, 03:56:48.000 |
And don't worry, your feeling of free will is effective. 03:57:12.100 |
It's problematic to you, a particular creature 03:57:28.780 |
- Well, I need the observation that I'm seeing 03:57:40.820 |
So we can't necessarily trust our observation. 03:57:57.400 |
so if you can't, so if I cannot do an experiment 03:58:01.060 |
or a thought experiment that will test this assumption, 03:58:04.640 |
then the assumption is without merit, really, in the end. 03:58:09.520 |
- Yeah, that's a beautiful ideal you hold yourself to. 03:58:21.880 |
and you have theoretical frameworks like assembly theory, 03:58:35.340 |
Now that we're holding you to the grounded in experiment, 03:58:52.640 |
It's marching forward and along this long timeline, 03:59:01.580 |
that have come up with cellular automata and computers 03:59:08.220 |
- I have so many different answers to this question. 03:59:13.820 |
I would say that given the way of the conversation 03:59:18.500 |
well, we make our own meaning, I think that's fine, 03:59:24.700 |
every possible configuration that it's allowed 03:59:27.220 |
us to explore and this goes back to the kind of question 03:59:32.000 |
and the existence and non-existence of things, right? 03:59:42.200 |
before assembly theory, so everything is possible. 03:59:52.020 |
We see something else and what we see is evidence of memory 03:59:57.740 |
so there clearly seems to be some interference 04:00:00.500 |
between the combinatorial explosion of things 04:00:04.140 |
and what the universe allows and it's like this kind of 04:00:18.220 |
and those objects are able to search more space and time 04:00:24.240 |
and I guess I'm searching for why the universe 04:00:27.300 |
is infinitely creative, creating 04:00:31.380 |
as many objects, as many things as possible 04:00:34.820 |
and I see a future of the universe that doesn't result 04:00:38.740 |
The universe is gonna extract every ounce of creativity 04:00:42.820 |
it can out of it 'cause that's what we see on Earth, right? 04:00:45.900 |
- And if you think that almost like intelligence 04:00:49.020 |
is not conserved, that maybe creativity isn't either. 04:00:57.040 |
So like creativity is ultimately tied to novelty. 04:01:00.460 |
You're coming up with cool new configurations of things 04:01:03.700 |
and maybe that just can continue indefinitely 04:01:06.320 |
and this human species that was created along the way 04:01:09.920 |
is probably just one method like that's able to ask 04:01:19.500 |
Maybe there's other meta levels on top of that. 04:01:24.880 |
we'll create organisms, maybe there'll be organisms 04:01:27.980 |
that ask themselves questions about themselves 04:01:32.560 |
in a deeper, bigger picture way than we humans do. 04:01:39.840 |
and then construct some kind of hybrid systems 04:01:42.920 |
that ask themselves about the collective aspect. 04:01:52.840 |
who's a professor of physics and astrobiology at ASU. 04:01:57.440 |
And we argue about how creative the universe is gonna be 04:02:03.880 |
because I think she's more of a free will thinker 04:02:14.000 |
Because there's simply a missing understanding. 04:02:16.000 |
Right now we don't understand how the universe, 04:02:23.460 |
So asking the meaning of it before we even know 04:02:41.220 |
But I do think there is something else going on. 04:02:49.980 |
and I think that we have to make the experiments better 04:02:58.620 |
and what I tried to do with Joscha is meet him halfway. 04:03:01.140 |
Say, well, what happens if I become a computationalist? 04:03:05.660 |
because I can make Turing machines in the universe. 04:03:12.020 |
On the other hand I'm saying they can't exist. 04:03:18.980 |
So then did the universe have to make a continuous 04:03:21.260 |
to a discrete transition or is the universe just discrete? 04:03:27.500 |
I will then give Joscha his Turing-like property 04:03:31.260 |
in the universe but maybe there's something else below it 04:03:33.620 |
which is the constructor that constructs a Turing machine 04:03:38.700 |
it's a bit like you generate a computing system 04:03:44.620 |
that then recognizes, it can make a generalizable abstraction 04:03:48.500 |
because human beings with mathematics have been able 04:04:23.740 |
Like I told you offline, I hope we get a chance 04:04:26.500 |
to talk again with perhaps just the two of us 04:04:30.860 |
That's a fascinating dynamic for people who haven't heard, 04:04:35.340 |
I suppose on Clubhouse is where I heard you guys talk 04:04:39.940 |
And I also can't wait to hear you and Joscha talk. 04:04:43.220 |
So I think if there's some point in this predetermined 04:04:54.900 |
or the four of us with Joscha could meet and talk 04:04:54.900 |
but I look forward to that one in particular. 04:05:04.620 |
Lee, thank you so much for spending your valuable time 04:05:10.660 |
- Thanks for listening to this conversation with Lee Cronin. 04:05:15.140 |
please check out our sponsors in the description. 04:05:39.140 |
Thank you for listening and hope to see you next time.