
Sara Walker: Physics of Life, Time, Complexity, and Aliens | Lex Fridman Podcast #433


Chapters

0:00 Introduction
1:07 Definition of life
21:45 Time and space
32:26 Technosphere
36:51 Theory of everything
45:32 Origin of life
67:10 Assembly theory
83:24 Aliens
95:14 Great Perceptual Filter
99:12 Fashion
103:14 Beauty
109:35 Language
116:16 Computation
126:03 Consciousness
134:55 Artificial life
158:48 Free will
165:32 Why anything exists

Whisper Transcript | Transcript Only Page

00:00:00.000 | So you have an original life event.
00:00:01.760 | It evolves for four billion years, at least on our planet.
00:00:04.560 | It evolves a technosphere.
00:00:06.160 | The technologies themselves start having this property
00:00:09.200 | we call life, which is the phase we're undergoing now.
00:00:11.960 | It solves the origin of itself,
00:00:15.240 | and then it figures out how that process all works,
00:00:17.980 | understands how to make more life,
00:00:19.360 | and then can copy itself onto another planet,
00:00:21.360 | so the whole structure can reproduce itself.
00:00:23.560 | - The following is a conversation with Sara Walker,
00:00:29.080 | her third time on this podcast.
00:00:31.120 | She is an astrobiologist and theoretical physicist
00:00:34.840 | interested in the origin of life
00:00:37.000 | and in discovering alien life on other worlds.
00:00:41.640 | She has written an amazing new upcoming book
00:00:44.000 | titled "Life as No One Knows It:
00:00:46.240 | The Physics of Life's Emergence."
00:00:48.840 | This book is coming out on August 6th,
00:00:51.560 | so please go pre-order it now.
00:00:55.200 | It will blow your mind.
00:00:57.640 | This is a Lex Fridman podcast.
00:00:59.400 | To support it, please check out our sponsors
00:01:01.260 | in the description.
00:01:02.740 | And now, dear friends, here's Sara Walker.
00:01:06.440 | You opened the book "Life as No One Knows It:
00:01:10.700 | The Physics of Life's Emergence,"
00:01:12.360 | with a distinction between the materialists
00:01:15.400 | and the vitalists, so what's the difference?
00:01:17.920 | Can you maybe define the two?
00:01:19.480 | - I think the question there is about
00:01:22.560 | whether life can be described
00:01:26.580 | in terms of matter and physical things,
00:01:30.860 | or whether there is some other feature
00:01:35.720 | that's not physical that actually animates living things.
00:01:40.120 | So for a long time, people maybe have called that a soul.
00:01:44.120 | It's been really hard to pin down what that is.
00:01:45.800 | So I think the vitalist idea is really
00:01:48.080 | that it's kind of a dualistic interpretation
00:01:51.000 | that there's sort of the material properties,
00:01:53.480 | but there's something else that animates life
00:01:55.900 | that is there when you're alive,
00:01:57.640 | and it's not there when you're dead.
00:01:59.360 | And materialists kind of don't think
00:02:01.760 | that there's anything really special
00:02:02.920 | about the matter of life and the material substrates
00:02:05.240 | that life is made out of.
00:02:06.620 | So they disagree on some really fundamental points.
00:02:10.320 | - Is there a gray area between the two?
00:02:12.520 | Like maybe all there is is matter,
00:02:15.360 | but there's so much we don't know
00:02:17.720 | that it might as well be magic.
00:02:19.360 | Like whatever that magic is that the vitalists see.
00:02:24.040 | Meaning like there's just so much mystery
00:02:27.340 | that it's really unfair to say
00:02:29.940 | that it's boring and understood
00:02:32.780 | and as simple as quote-unquote physics.
00:02:35.860 | - Yeah, I think the entire universe is just a giant mystery.
00:02:39.920 | I guess that's what motivates me as a scientist.
00:02:42.260 | And so oftentimes when I look at open problems
00:02:44.440 | like the nature of life or consciousness
00:02:46.620 | or what is intelligence or are there souls
00:02:50.940 | or whatever question that we have
00:02:53.260 | that we feel like we aren't even
00:02:55.460 | on the tip of answering yet,
00:02:56.860 | I think we have a lot more work to do
00:03:00.140 | to really understand the answers to these questions.
00:03:02.700 | So it's not magic, it's just the unknown.
00:03:05.900 | And I think a lot of the history
00:03:07.380 | of humans coming to understand the world around us
00:03:09.720 | has been taking ideas that we once thought
00:03:13.340 | were magic or supernatural
00:03:15.360 | and really understanding them in a much deeper way
00:03:17.980 | that we learn what those things are.
00:03:22.700 | And they still have an air of mystery
00:03:24.500 | even when we understand them.
00:03:26.020 | There's no sort of bottom to our understanding.
00:03:30.580 | - So do you think the vitalists have a point
00:03:32.700 | that they're more eager and able
00:03:35.820 | to notice the magic of life?
00:03:39.180 | - I think that no tradition, vitalists included,
00:03:42.620 | is ever fully wrong about the nature
00:03:44.820 | of the things that they're describing.
00:03:46.580 | So a lot of times when I look at different ways
00:03:50.220 | that people have described things
00:03:51.660 | across human history, across different cultures,
00:03:54.220 | there's always a seed of truth in them.
00:03:55.860 | And I think it's really important to try to look for those
00:03:58.260 | because if there are narratives
00:03:59.380 | that humans have been telling ourselves
00:04:01.820 | for thousands of years, for thousands of generations,
00:04:04.640 | there must be some truth to them.
00:04:06.500 | We've been learning about reality for a really long time
00:04:11.460 | and we recognize the patterns that reality presents us.
00:04:14.740 | We don't always understand what those patterns are.
00:04:17.380 | And so I think it's really important
00:04:18.500 | to pay attention to that.
00:04:19.380 | So I don't think the vitalists were actually wrong.
00:04:21.180 | And a lot of what I talk about in the book,
00:04:24.260 | but also I think about a lot just professionally,
00:04:27.340 | is the nature of our definitions of what's material
00:04:30.940 | and how science has come to invent the concept of matter.
00:04:34.540 | And that some of those things actually really are inventions
00:04:39.100 | that happened in a particular time
00:04:40.620 | in a particular technology
00:04:42.540 | that could learn about certain patterns
00:04:44.940 | and help us understand them.
00:04:46.420 | And that there are some patterns we still don't understand.
00:04:49.380 | And if we knew how to measure those things
00:04:53.660 | or we knew how to describe them in a more rigorous way,
00:04:58.100 | we would realize that the material world, matter,
00:05:01.260 | has more properties than we thought that it did.
00:05:04.140 | And one of those might be associated
00:05:05.620 | with the thing that we call life.
00:05:06.640 | Life could be a material property
00:05:08.620 | and still have a lot of the features
00:05:10.840 | that the vitalists thought were mysterious.
00:05:12.940 | - So we may still expand our understanding
00:05:16.340 | of what is incorporated in the category of matter.
00:05:19.220 | That will eventually incorporate such magical things
00:05:22.380 | that the vitalists have noticed.
00:05:24.300 | - Yeah. - Like life.
00:05:25.780 | - Yeah, so I think about,
00:05:27.340 | I always like to use examples from physics,
00:05:29.100 | so I'll probably do that to like,
00:05:31.740 | like it's just my go-to place.
00:05:34.660 | But in the history of gravitational physics,
00:05:38.420 | for example, in the history of motion,
00:05:41.120 | when Aristotle came up with his theories of motion,
00:05:43.220 | he did it by the material properties he thought things had.
00:05:45.780 | So there was a concept of things falling to earth
00:05:48.800 | because they were solid-like
00:05:50.780 | and things rising to the heavens 'cause they were air-like
00:05:53.140 | and things moving around the planet
00:05:54.660 | 'cause they were celestial-like.
00:05:56.500 | But then we came to realize that thousands of years later
00:05:59.420 | and after the invention of many technologies
00:06:01.580 | that allowed us to actually measure time
00:06:05.180 | in a mechanistic way and track planetary motion,
00:06:09.740 | and we could roll balls down inclined planes
00:06:12.700 | and track that progress,
00:06:14.020 | we realized that if we just talked about mass
00:06:16.220 | and acceleration, we could unify all motion in the universe
00:06:20.320 | in a really simple description.
00:06:22.540 | So we didn't really have to worry about the fact
00:06:24.440 | that my cup is heavy and the air is light.
00:06:27.000 | Like the same laws describe them.
00:06:29.680 | If we have the right material properties
00:06:31.400 | to talk about what those laws are actually interacting with.
00:06:35.040 | And so I think the issue with life
00:06:37.000 | is we don't know how to think about information
00:06:38.940 | in a material way.
00:06:40.480 | And so we haven't been able to build a unified description
00:06:44.200 | of what life is or the kind of things that evolution builds
00:06:49.200 | because we haven't really invented
00:06:52.000 | the right material concept yet.
00:06:54.080 | - So when talking about motion,
00:06:57.200 | the laws of physics appear to be the same
00:07:00.520 | everywhere out in the universe.
00:07:02.560 | You think the same is true for other kinds of matter
00:07:06.600 | that we might eventually include life in?
00:07:08.840 | - I think life obeys universal principles.
00:07:13.720 | I think there is some deep underlying explanatory framework
00:07:17.440 | that will tell us about the nature of life in the universe
00:07:22.000 | and will allow us to identify life
00:07:24.440 | that we can't yet recognize because it's too different.
00:07:28.280 | - You write about the paradox of defining life.
00:07:30.600 | Why does it seem to be so easy
00:07:32.480 | and so complicated at the same time?
00:07:35.320 | - You know, all the sort of classic definitions
00:07:37.600 | people want to use just don't work.
00:07:40.240 | They don't work in all cases.
00:07:41.560 | So Carl Sagan had this wonderful essay
00:07:45.040 | on definitions of life where I think he talks about
00:07:47.520 | aliens coming from another planet.
00:07:49.480 | If they saw Earth, they might think that cars
00:07:51.600 | were the dominant life form
00:07:52.880 | because there's so many of them on our planet.
00:07:55.240 | And like humans are inside them
00:07:56.680 | and you might want to exclude machines,
00:07:59.360 | but any definition, you know,
00:08:00.600 | like classic biology textbook definitions
00:08:02.880 | would also include them.
00:08:04.000 | And so, you know, he wanted to draw a boundary
00:08:06.140 | between these kinds of things by trying to exclude them,
00:08:11.140 | but they were naturally included by the definitions
00:08:13.480 | people want to give.
00:08:14.320 | And in fact, what he ended up pointing out
00:08:15.800 | is that all of the definitions of life that we have,
00:08:18.800 | whether it's life is a self-reproducing system
00:08:21.960 | or life eats to survive or life requires compartments,
00:08:26.640 | whatever it is, there's always a counter example
00:08:29.520 | that challenges that definition.
00:08:31.040 | This is why viruses are so hard or why fire is so hard.
00:08:34.600 | And so we've had a really hard time trying to pin down
00:08:38.000 | from a definitional perspective exactly what life is.
00:08:42.640 | - Yeah, you actually bring up the zombie ant fungus.
00:08:46.020 | I enjoyed looking at this thing
00:08:48.500 | as an example of one of the challenges.
00:08:49.940 | You mentioned viruses, but this is a parasite.
00:08:53.300 | Look at that.
00:08:54.300 | - Did you see this in the jungle?
00:08:55.540 | - Infects ants.
00:08:57.460 | Actually, one of the interesting things about the jungle,
00:09:00.660 | everything is ephemeral.
00:09:01.820 | Like everything eats everything really quickly.
00:09:04.700 | So if an organism dies, that organism disappears.
00:09:09.680 | - Yeah.
00:09:11.320 | - It's a machine that doesn't have,
00:09:13.180 | I wanted to say it doesn't have a memory or a history,
00:09:17.160 | which is interesting given your work on history
00:09:20.000 | in defining a living being.
00:09:23.520 | The jungle forgets very quickly.
00:09:25.080 | It wants to erase the fact that you existed very quickly.
00:09:28.480 | - Yeah, but it can't erase it.
00:09:29.600 | It's just restructuring it.
00:09:30.720 | And I think the other thing that is really vivid to me
00:09:33.500 | about this example that you're giving
00:09:35.360 | is how much death is necessary for life.
00:09:39.120 | So I worry a bit about notions of immortality
00:09:44.120 | and whether immortality is a good thing or not.
00:09:47.560 | So I have sort of a broad conception
00:09:49.200 | that life is the only thing the universe generates
00:09:53.040 | that actually has even the potential to be immortal.
00:09:55.600 | But that's as like the sort of process
00:09:57.880 | that you're describing where life is about memory
00:10:00.000 | and historical contingency and construction
00:10:02.520 | of new possibilities.
00:10:03.940 | But when you look at any instance of life,
00:10:05.700 | especially one as dynamic as what you're describing,
00:10:08.860 | it's a constant birth and death process.
00:10:11.180 | But that birth and death process is like
00:10:13.420 | the way that the universe can explore
00:10:16.660 | what possibilities can exist and not everything,
00:10:20.740 | not every possible human or every possible ant
00:10:23.820 | or every possible zombie ant
00:10:26.780 | or every possible tree will ever live.
00:10:29.980 | So it's an incredibly dynamic and creative place
00:10:33.520 | because of all that death.
00:10:34.920 | - So does this thing, this is a parasite that needs the ant.
00:10:39.360 | So is this a living thing or is this not a living thing?
00:10:42.360 | So this is, it just pierces the ant.
00:10:44.940 | I mean, it-- - Right.
00:10:46.680 | - And I've seen a lot of this, by the way.
00:10:49.080 | Organisms working together in the jungle,
00:10:51.560 | like ants protecting a delicious piece of fruit.
00:10:55.260 | So they need the fruit, but like if you touch that fruit,
00:10:58.480 | they're going to, like the forces emerge.
00:11:01.640 | They're fighting you, they're defending that fruit.
00:11:04.080 | - Right. - To the death.
00:11:05.840 | It's just nature seems to find mutual benefits, right?
00:11:09.080 | - Yeah, it does.
00:11:10.480 | I think the thing that's perplexing for me
00:11:13.360 | about these kind of examples is,
00:11:15.680 | effectively the ant's dead, but it's staying alive now
00:11:18.440 | 'cause it's piloted by this fungus.
00:11:20.720 | And so that gets back to this thing
00:11:23.680 | that we were talking about a few minutes ago
00:11:24.760 | about how the boundary of life is really hard to define.
00:11:26.880 | So, you know, anytime that you want to draw a boundary
00:11:30.560 | around something and you say,
00:11:32.360 | this feature is the thing that makes this alive,
00:11:34.600 | or this thing is alive on its own,
00:11:37.340 | there's not ever really a clear boundary.
00:11:39.680 | And these kind of examples are really good at showing that
00:11:42.160 | because it's like the thing that you would have thought
00:11:44.800 | is the living organism is now dead,
00:11:47.480 | except that it has another living organism
00:11:49.440 | that's piloting it.
00:11:50.340 | So the two of them together are alive in some sense,
00:11:53.880 | but they're now in this kind of weird symbiotic relationship
00:11:57.460 | that's taking this ant to its death.
00:11:59.240 | - So what do you do with that
00:12:00.400 | in terms of when you try to define life?
00:12:02.840 | - I think we have to get rid of the notion
00:12:04.280 | of an individual as being relevant.
00:12:06.040 | And this is really difficult because, you know,
00:12:10.520 | a lot of the ways that we think about life,
00:12:12.320 | like the fundamental unit of life is the cell,
00:12:15.240 | individuals are alive,
00:12:16.920 | but we don't think about how gray that distinction is.
00:12:22.520 | So for example, you might consider, you know,
00:12:27.520 | self-reproduction to be the most defining feature of life.
00:12:30.440 | A lot of people do, actually. Like, you know,
00:12:31.920 | one of the standard definitions
00:12:33.760 | that a lot of people like to use in astrobiology
00:12:36.160 | is life as a self-sustaining chemical system
00:12:38.420 | capable of Darwinian evolution,
00:12:40.300 | which I was once quoted as agreeing with,
00:12:41.980 | and I was really offended because I hate that definition.
00:12:45.560 | I think it's terrible.
00:12:46.760 | And I think it's terrible that people use it.
00:12:48.640 | I think like every word in that definition
00:12:50.080 | is actually wrong as a descriptor of life.
00:12:52.520 | - Life is a self-sustaining chemical system
00:12:54.720 | capable of Darwinian evolution.
00:12:56.160 | Why is that?
00:12:57.000 | That seems like a pretty good definition.
00:12:57.840 | - Yeah, I know.
00:12:58.660 | If you wanna make me angry, you can pretend I said that.
00:13:00.700 | (laughing)
00:13:01.760 | And believed it.
00:13:02.920 | - So self-sustaining chemical system,
00:13:07.160 | Darwinian evolution, what is self-sustaining?
00:13:10.280 | What's so frustrating?
00:13:11.840 | I mean, which aspect is frustrating to you?
00:13:13.300 | But it's also those very interesting words.
00:13:15.320 | - Yeah, they're all interesting words.
00:13:17.720 | And, you know, together they sound really smart
00:13:19.800 | and they sound like they box in what life is,
00:13:21.720 | but you can use any of the words individually
00:13:24.680 | and you can come up with counterexamples
00:13:27.060 | that don't fulfill that property.
00:13:28.960 | The self-sustaining one is really interesting
00:13:30.840 | thinking about humans, right?
00:13:33.280 | Like we're not self-sustaining,
00:13:34.440 | we're dependent on societies.
00:13:36.160 | And so, you know, I find it paradoxical
00:13:39.160 | that it might be that societies,
00:13:41.720 | because they're self-sustaining units,
00:13:43.260 | are now more alive than individuals are.
00:13:45.980 | And that could be the case,
00:13:47.200 | but I still think we have some property
00:13:49.280 | associated with life.
00:13:50.220 | I mean, that's the thing that we're trying to describe.
00:13:52.680 | So that one's quite hard.
00:13:54.400 | And in general, you know,
00:13:56.080 | no organism is really self-sustaining.
00:13:58.320 | They always require an environment.
00:13:59.600 | So being self-sustaining is coupled in some sense
00:14:02.240 | to the world around you.
00:14:03.500 | We don't live in a vacuum.
00:14:06.460 | So that part's already challenging.
00:14:10.740 | And then you can go to chemical system.
00:14:14.080 | I don't think that's good either.
00:14:15.160 | I think there's a confusion
00:14:16.760 | because life emerges in chemistry,
00:14:18.820 | that life is chemical.
00:14:21.080 | I don't think life is chemical.
00:14:22.440 | I think life emerges in chemistry
00:14:24.600 | because chemistry is the first thing the universe builds,
00:14:28.400 | where it cannot exhaust all the possibilities
00:14:31.020 | because the combinatorial space of chemistry is too large.
00:14:33.680 | - Well, but is it possible to have a life
00:14:35.040 | that is not a chemical system?
00:14:36.600 | - Yes.
00:14:37.560 | - There's a guy I know named Lee Cronin
00:14:39.520 | has been on a podcast a couple of times
00:14:41.120 | who just got really pissed off listening to this.
00:14:44.280 | He probably just got really pissed off hearing
00:14:46.440 | that for people who somehow don't know he's a chemist.
00:14:49.500 | - Yeah, but he would agree with that statement.
00:14:51.380 | - Would he?
00:14:52.300 | I don't think he would.
00:14:53.700 | I don't think he would.
00:14:54.740 | He would broaden the definition of chemistry
00:14:56.860 | until it would include everything.
00:14:58.140 | - Oh, sure.
00:14:59.100 | - Okay, so you--
00:14:59.940 | - Or maybe, I don't know.
00:15:00.760 | - But wait, but you said that universe,
00:15:02.160 | that's the first thing it creates is chemistry.
00:15:04.920 | - Well, to be very precise,
00:15:06.540 | it's not the first thing it creates.
00:15:07.540 | Obviously, it has to make atoms first,
00:15:09.380 | but it's the first thing.
00:15:10.580 | If you think about how the universe originated,
00:15:14.100 | atoms were made in Big Bang nucleosynthesis
00:15:17.240 | and then later in stars, and then planets formed,
00:15:20.400 | and planets become engines of chemistry.
00:15:23.100 | They start exploring what kind of chemistry is possible,
00:15:27.500 | and the combinatorial space of chemistry is so large
00:15:32.500 | that even on every planet in the entire universe,
00:15:36.400 | you will never express every possible molecule.
00:15:38.740 | I like this example, actually, that Lee gave me,
00:15:42.720 | which is to think about taxol.
00:15:43.960 | It has a molecular weight of about 853.
00:15:46.920 | It's got a lot of atoms, but it's not astronomically large,
00:15:50.180 | and if you tried to make one molecule
00:15:55.180 | with that molecular formula
00:15:57.320 | and every three-dimensional shape you could make
00:15:59.120 | with that molecular formula,
00:16:00.540 | it would fill 1.5 universes in volume,
00:16:05.540 | so with one unique molecule.
00:16:08.260 | That's just one molecule, so chemical space is huge,
00:16:12.440 | and I think it's really important to recognize that
00:16:15.280 | because if you wanna ask a question
00:16:16.520 | of why does life emerge in chemistry,
00:16:18.680 | well, life emerges in chemistry
00:16:20.000 | because life is the physics
00:16:21.760 | of how the universe selects what gets to exist,
00:16:25.120 | and those things get created
00:16:27.560 | along historically contingent pathways in memory
00:16:29.440 | and all the other stuff that we can talk about,
00:16:31.920 | but the universe has to actually make
00:16:34.120 | historically contingent choices in chemistry
00:16:36.040 | 'cause it can't exhaust all possible molecules.
00:16:38.300 | - What kind of things can you create
00:16:39.960 | that's outside the combinatorial space of chemistry?
00:16:44.040 | That's what I'm trying to understand.
00:16:45.080 | - Oh, if it's not chemical,
00:16:46.800 | so I think some of the things
00:16:48.760 | that have evolved in our biosphere
00:16:50.720 | I would call as much alive as chemistry, as a cell,
00:16:55.720 | but they seem much more abstract,
00:16:57.400 | so for example, I think language is alive,
00:16:59.880 | I think, or at least life, I think memes are, I think--
00:17:04.880 | - You're saying language is life.
00:17:07.720 | - Yes. - Language is alive.
00:17:09.240 | - Oh boy, we're gonna have to explore that one.
00:17:11.720 | (laughing)
00:17:12.840 | Okay, but-- - Life maybe not,
00:17:14.000 | maybe not alive, but I actually don't know
00:17:16.640 | where I stand exactly on that.
00:17:18.640 | I've been thinking about that a little bit more lately,
00:17:20.480 | but mathematics too, and it's interesting
00:17:23.360 | 'cause people think that math has this platonic reality
00:17:26.720 | that exists outside of our universe,
00:17:28.240 | and I think it's a feature of our biosphere,
00:17:30.640 | and it's telling us something
00:17:33.200 | about the structure of our cells,
00:17:34.960 | and I find that really interesting
00:17:37.560 | 'cause when you would sort of internalize
00:17:39.040 | all of these things that we notice about the world,
00:17:41.120 | and you start asking, well, what do these look like
00:17:42.880 | if I was something outside of myself
00:17:46.120 | observing these systems that we're all embedded in,
00:17:50.200 | what would that structure look like?
00:17:51.760 | And I think we look really different
00:17:53.240 | than the way that we talk about
00:17:55.480 | what we look like to each other.
00:17:57.160 | - What do you think a living organism in math is?
00:17:59.600 | Is it one axiomatic system, or is it individual theorems,
00:18:02.560 | or is it-- - I think it's the fact
00:18:05.880 | that it's open-ended in some sense.
00:18:10.200 | It's another open-ended combinatorial space,
00:18:13.920 | and the recursive properties of it
00:18:15.960 | allow creativity to happen,
00:18:19.040 | which is what you see with the revolution in the last century
00:18:22.760 | with Gödel's theorem and Turing,
00:18:25.600 | and there's clear places where mathematics
00:18:30.520 | notices holes in the universe. (laughs)
00:18:32.840 | - So it seems like you're sneaking up
00:18:34.440 | on a different kind of definition of life,
00:18:36.040 | open-ended, large combinatorial space.
00:18:39.440 | - Yeah. - Room for creativity.
00:18:41.440 | - Definitely not chemical.
00:18:44.080 | I mean, chemistry is one substrate.
00:18:45.920 | - Restricted to chemical. - Yeah.
00:18:47.760 | - Okay, what about the third thing,
00:18:49.280 | which I think would be the hardest,
00:18:51.040 | 'cause you probably like it the most,
00:18:52.680 | is evolution or selection.
00:18:54.800 | - Well, specifically, it's Darwinian evolution,
00:18:57.440 | and I think Darwinian evolution is a problem,
00:18:59.960 | but the reason that that definition is a problem
00:19:02.280 | is not because evolution is in the definition,
00:19:04.400 | but because the implication is that,
00:19:09.120 | that most people would wanna make
00:19:11.080 | is that an individual is alive,
00:19:12.960 | and the evolutionary process,
00:19:14.720 | at least the Darwinian evolutionary process,
00:19:16.240 | most evolutionary processes,
00:19:17.400 | they don't happen at the level of individuals.
00:19:21.280 | They happen at the level of populations.
00:19:22.840 | So again, you would be saying something
00:19:25.040 | like what we saw with the self-sustaining definition,
00:19:27.360 | which is that populations are alive,
00:19:29.800 | but individuals aren't,
00:19:30.680 | because populations evolve and individuals don't,
00:19:32.920 | and obviously, maybe you're alive
00:19:34.960 | because your gut microbiome is evolving,
00:19:37.640 | but Lex is an entity right now,
00:19:39.480 | is not evolving by canonical theories of evolution.
00:19:43.440 | In assembly theory, which is attempting to explain life,
00:19:48.280 | evolution is a much broader thing.
00:19:50.680 | - So an individual organism
00:19:52.720 | can evolve under assembly theory?
00:19:54.480 | - Yes, you're constructing yourself all the time.
00:19:56.840 | Assembly theory is about construction
00:19:58.600 | and how the universe selects for things to exist.
00:20:01.080 | - What if you formulate everything
00:20:02.560 | like a population is a living organism?
00:20:04.800 | - That's fine, too.
00:20:06.400 | But this again gets back to,
00:20:07.840 | so I think what all of the,
00:20:11.200 | we can nitpick at definitions.
00:20:12.520 | I don't think it's incredibly helpful to do it,
00:20:14.680 | but the reason for me-- - But it's fun.
00:20:16.760 | - Yeah, it is fun.
00:20:17.600 | It is really fun.
00:20:18.440 | And actually, I do think it's useful
00:20:20.760 | in the sense that when you see
00:20:22.480 | the ways that they all break down,
00:20:24.200 | you either have to keep forcing in
00:20:28.000 | your sort of conception of life you want to have,
00:20:30.360 | or you have to say all these definitions
00:20:31.880 | are breaking down for a reason.
00:20:33.040 | Maybe I should adopt a more expansive definition
00:20:36.120 | that encompasses all the things that I think are life.
00:20:39.880 | And so for me, I think life is the process
00:20:43.560 | of how information structures matter over time and space.
00:20:48.200 | And an example of life is what emerges on a planet
00:20:51.680 | and yields an open-ended cascade
00:20:54.160 | of generation of structure and increasing complexity.
00:20:57.680 | And this is the thing that life is.
00:21:00.080 | And any individual is just a particular instance
00:21:02.960 | of these lineages that are structured across time.
00:21:07.960 | And so we focus so much on these individuals
00:21:11.360 | that are these short temporal moments
00:21:13.120 | in this larger causal structure
00:21:15.080 | that actually is the life on our planet.
00:21:18.000 | And I think that's why these definitions break down,
00:21:21.560 | because they're not general enough,
00:21:23.760 | they're not universal enough, they're not deep enough,
00:21:25.520 | they're not abstract enough
00:21:26.520 | to actually capture that regularity.
00:21:28.400 | - 'Cause we're focused on those,
00:21:29.920 | that little ephemeral thing that we call human life.
00:21:33.760 | - Aristotle focusing on heavy things falling
00:21:37.200 | because they're earth-like,
00:21:38.400 | and things floating 'cause they're air-like.
00:21:42.520 | It's the wrong thing to focus on.
00:21:45.080 | - So what exactly are we missing
00:21:46.680 | by focusing on such a short span of time?
00:21:50.560 | - I think we're missing most of what we are.
00:21:52.360 | So one of the issues,
00:21:54.280 | I've been thinking about this really viscerally lately.
00:21:57.600 | It's weird when you do theoretical physics,
00:21:59.160 | 'cause I think it literally changes
00:22:00.680 | the structure of your brain,
00:22:02.400 | and you see the world differently,
00:22:03.960 | especially when you're trying to build new abstractions.
00:22:05.880 | - Do you think it's possible,
00:22:06.800 | if you're a theoretical physicist,
00:22:08.800 | it's easy to fall off the cliff
00:22:10.960 | and go descend into madness?
00:22:13.240 | - I mean, I think you're always on the edge of it,
00:22:15.040 | but I think what is amazing about being a scientist
00:22:19.520 | and trying to do things rigorously is it keeps your sanity.
00:22:23.520 | So I think if I wasn't a theoretical physicist,
00:22:26.240 | I would be probably not sane.
00:22:29.600 | But what it forces you to do is hold the fire.
00:22:31.840 | You have to hold yourself to the fire
00:22:33.240 | of these abstractions in my mind
00:22:35.600 | have to really correspond to reality,
00:22:37.440 | and I have to really test that all the time.
00:22:39.600 | And so I love building new abstractions,
00:22:41.680 | and I love going to those incredibly creative spaces
00:22:46.440 | that people don't see
00:22:47.920 | as part of the way that we understand the world now.
00:22:51.720 | But ultimately, I have to make sure
00:22:54.120 | that whatever I'm pulling from that space
00:22:55.840 | is something that's really usable
00:22:57.480 | and really relates to the world outside of me.
00:23:00.560 | That's what science is.
00:23:01.680 | - So we were talking about what we're missing
00:23:04.160 | when we look at a small stretch of time
00:23:07.200 | and a small stretch of space.
00:23:09.600 | - Yeah, so the issue is we evolve perception
00:23:14.480 | to see reality a certain way, right?
00:23:17.040 | So for us, space is really important,
00:23:19.280 | and time feels fleeting.
00:23:20.800 | And I had a really wonderful mentor, Paul Davies,
00:23:24.880 | most of my career.
00:23:25.720 | And Paul's amazing because he gives these
00:23:28.080 | little seed thought experiments all the time.
00:23:30.720 | Something he used to ask me all the time
00:23:32.360 | when I was a postdoc, this is kind of a random tangent,
00:23:34.480 | but was how much of the universe
00:23:37.000 | could be converted into technology
00:23:39.000 | if you were thinking about long-term futures
00:23:41.040 | and stuff like that.
00:23:41.880 | And it's a weird thought experiment,
00:23:43.320 | but there's a lot of deep things there.
00:23:45.520 | And I do think a lot about the fact
00:23:46.840 | that we're really limited in our interactions with reality
00:23:50.640 | by the particular architectures that we evolved.
00:23:54.040 | And so we're not seeing everything.
00:23:56.400 | And in fact, our technology tells us this all the time
00:23:58.760 | because it allows us to see the world in new ways
00:24:01.200 | by basically allowing us to perceive the world
00:24:03.200 | in ways that we couldn't otherwise.
00:24:05.200 | And so what I'm getting at with this is
00:24:07.520 | I think that living objects are actually huge.
00:24:12.520 | Like they're some of the biggest structures in the universe,
00:24:16.000 | but they are not big in space, they are big in time.
00:24:18.800 | And we actually can't resolve that feature.
00:24:21.480 | We don't interact with it on a regular basis.
00:24:23.480 | So we see them as these fleeting things
00:24:25.400 | that have this really short temporal clock time
00:24:27.720 | without seeing how large they are.
00:24:29.760 | When I'm saying time here, I really,
00:24:31.400 | like the way that people could picture it
00:24:33.920 | is in terms of causal structure.
00:24:35.360 | So if you think about the history of the universe
00:24:38.160 | to get to you, and you imagine
00:24:40.320 | that that entire history is you,
00:24:42.880 | that is the picture I have in my mind
00:24:47.240 | when I look at every living thing.
00:24:49.000 | - So you have a tweet for everything.
00:24:52.640 | You tweeted-- - Doesn't everyone?
00:24:54.800 | - You have a lot of poetic, profound tweets.
00:24:56.960 | Sometimes they're puzzles
00:25:00.240 | that take a long time to figure out.
00:25:02.960 | - Well, you know what the trick is?
00:25:04.600 | The reason they're hard to write
00:25:05.840 | is because it's compressing a very deep idea
00:25:09.000 | into a short amount of space.
00:25:10.800 | And I really like doing that intellectual exercise
00:25:12.640 | 'cause I find it productive for me.
00:25:14.280 | - Yeah, it's a very interesting
00:25:15.800 | kind of compression algorithm though.
00:25:18.040 | - Yeah, I like language.
00:25:18.960 | I think it's really fun to play with.
00:25:20.440 | - Yeah, I wonder if AI can decompress it.
00:25:23.760 | That'd be interesting.
00:25:24.600 | - I think, I would like to try this,
00:25:26.760 | but I think I use language in certain ways
00:25:29.680 | that are non-canonical and I do it very purposefully.
00:25:32.920 | And it would be interesting to me
00:25:34.000 | how AI would interpret it.
00:25:35.720 | - Yeah, your tweets would be a good Turing test
00:25:37.800 | for superintelligence.
00:25:40.080 | Anyway, you tweeted that things only look emergent
00:25:43.720 | because we can't see time.
00:25:46.640 | So if we could see time,
00:25:50.200 | what would the world look like?
00:25:51.520 | You're saying you'll be able to see
00:25:54.740 | everything that an object has been,
00:25:57.040 | every step of the way that led to this current moment
00:26:01.120 | and all the interactions
00:26:05.800 | that were required to make that evolution happen.
00:26:08.520 | You would see this gigantic tail.
00:26:11.040 | - The universe is far larger in time than it is in space.
00:26:16.040 | - Yeah.
00:26:17.000 | - And this planet is one of the biggest things
00:26:20.720 | in the universe.
00:26:22.080 | - Oh, so the more complexity, the bigger--
00:26:23.960 | - Yeah, so like, yeah, technosphere.
00:26:27.160 | I think the modern technosphere is the largest object
00:26:30.800 | in time in the universe that we know about.
00:26:33.160 | - And when you say technosphere, what do you mean?
00:26:35.200 | - I mean the global integration of life
00:26:39.080 | and technology on this planet.
00:26:40.760 | - So all the things,
00:26:41.640 | all the technological things we've created?
00:26:44.160 | - But I don't think of them as separate.
00:26:45.440 | They're very integrated with the structure
00:26:47.040 | that generated them.
00:26:48.440 | So you can almost imagine it,
00:26:50.160 | time is constantly bifurcating
00:26:52.160 | and it's generating new structures
00:26:54.400 | and these new structures are locally constructing the future.
00:26:59.400 | And so things like you and I are very close together in time
00:27:04.560 | 'cause we didn't diverge very early
00:27:06.600 | in the history of the universe.
00:27:07.440 | It's very recent.
00:27:08.760 | And I think this is one of the reasons
00:27:10.280 | that we can understand each other so well
00:27:12.160 | and we can communicate effectively.
00:27:14.600 | And I might have some sense of what it feels like to be you,
00:27:18.880 | but other organisms bifurcated from us in time earlier.
00:27:23.760 | This is just the concept of phylogeny, right?
00:27:26.800 | But if you take that deeper
00:27:28.280 | and you really think about that as the structure
00:27:30.920 | of the physics that generates life
00:27:34.120 | and you take that very seriously,
00:27:36.120 | all of that causation is still bundled up
00:27:40.720 | in the objects we observe today.
00:27:42.800 | And so you and I are close in this temporal structure,
00:27:47.800 | but we're so close because we're really big
00:27:53.480 | and we only are very different
00:27:55.120 | in sort of like the most recent moments
00:27:57.000 | in the time that's like embedded in us.
00:27:59.000 | It's hard to use words to visualize what's in minds.
00:28:05.560 | (laughing)
00:28:07.360 | I have such a hard time with this sometimes.
00:28:09.640 | Actually, I was thinking of this on the way over here.
00:28:12.000 | You have pictures in your brain
00:28:14.280 | and then they're hard to put into words,
00:28:16.280 | but I realized I always say I have a visual,
00:28:18.720 | but it's not actually I have a visual, it's I have a feeling
00:28:21.360 | 'cause oftentimes I cannot actually draw a picture
00:28:23.480 | in my mind for the things that I say,
00:28:27.600 | but sometimes they go through a picture
00:28:29.000 | before they get to words.
00:28:30.140 | But I like experimenting with words
00:28:31.560 | because I think they help paint pictures.
00:28:33.840 | - Yeah, it's again some kind of compressed feeling
00:28:37.280 | that you can query to get a sense
00:28:40.860 | of the bigger visualization that you have in mind.
00:28:43.880 | It's just a really nice compression.
00:28:45.680 | But I think the idea of this object
00:28:49.880 | that in it contains all the information
00:28:52.760 | about the history of an entity that you see now,
00:28:55.720 | just trying to visualize that is pretty cool.
00:28:58.400 | - Yeah.
00:28:59.320 | - I mean, obviously the mind breaks down quickly
00:29:02.880 | as you step seconds and minutes back in time.
00:29:06.080 | But I guess it's just a gigantic object
00:29:11.080 | we're supposed to be thinking about.
00:29:15.120 | - Yeah, I think so.
00:29:15.960 | And I think this is one of the reasons
00:29:17.160 | that we have such an ability to abstract as humans
00:29:21.200 | because we are so gigantic that like the space
00:29:23.880 | that we can go back into is really large.
00:29:26.180 | So like the more abstract you're going,
00:29:27.720 | like the deeper you're going in that space.
00:29:29.640 | - But in that sense, aren't we fundamentally all connected?
00:29:33.320 | - Yes, and this is why the definition of life
00:29:35.880 | cannot be the individual, it has to be these lineages
00:29:38.320 | because they're all connected, they're interwoven
00:29:39.960 | and they're exchanging parts all the time.
00:29:42.120 | - Yeah, so that maybe there's certain aspects
00:29:44.000 | of those lineages that can be lifelike,
00:29:47.320 | they can be characteristics,
00:29:48.360 | they can be measured like with the assembly theory
00:29:50.080 | that have more or less life,
00:29:52.360 | but they're all just fingertips of a much bigger object.
00:29:57.320 | - Yeah, I think life is very high dimensional.
00:29:59.720 | And in fact, I think you can be alive in some dimensions
00:30:02.400 | and not in others.
00:30:04.320 | Like if you could project all the causation that's in you
00:30:07.880 | and in some features of you,
00:30:10.840 | very little causation is required and like very little
00:30:13.480 | history and in some features a lot is.
00:30:16.860 | So it's quite difficult to take this really high dimensional,
00:30:21.560 | very deep structure and project it into things
00:30:25.360 | that we really can understand and say like,
00:30:27.320 | this is the one thing that we're seeing
00:30:31.480 | because it's not one thing.
00:30:33.600 | - It's funny, we're talking about this now
00:30:35.040 | and I'm slowly starting to realize,
00:30:37.160 | one of the things I saw when I took ayahuasca,
00:30:40.020 | afterwards actually.
00:30:42.480 | So the actual ceremony is four or five hours,
00:30:45.080 | but afterwards you're still riding
00:30:47.760 | whatever the thing that you're riding.
00:30:49.640 | And I got a chance to afterwards hang out with some friends
00:30:54.640 | and just shoot the shit in the forest.
00:30:58.560 | And I get to see their faces.
00:31:01.680 | And what was happening with their faces and their hair
00:31:05.120 | is I would get this interesting effect.
00:31:07.240 | First of all, everything was beautiful
00:31:08.560 | and I just had so much love for everybody.
00:31:12.000 | But I could see their past selves like behind them.
00:31:17.000 | It was this effect where I guess it's a blurring effect
00:31:22.760 | of where like if I move like this,
00:31:26.200 | the faces that were just there are still there
00:31:29.200 | and it would just float like this behind them,
00:31:34.200 | which will create this incredible effect.
00:31:36.160 | But it's also another way to think about that
00:31:37.960 | is I'm visualizing a little bit of that object,
00:31:42.960 | of the thing they wore just a few seconds ago.
00:31:45.640 | It's a cool little effect.
00:31:47.000 | - That's very cool.
00:31:47.840 | - And now it's like giving it a bit more profundity
00:31:51.540 | to the effect that was just beautiful aesthetically,
00:31:54.300 | but it's also beautiful from a physics perspective
00:31:59.120 | because that is a past self.
00:32:00.760 | I get a little glimpse at the past selves that they were,
00:32:04.640 | but then you take that to its natural conclusion,
00:32:09.640 | not just a few seconds ago,
00:32:11.120 | but just to the beginning of the universe.
00:32:13.600 | And you could probably get down that lineage.
00:32:17.520 | - It's crazy that there's billions of years
00:32:19.200 | inside all of us.
00:32:20.320 | - All of us.
00:32:21.840 | And then we connect, obviously, not too long ago.
00:32:25.560 | You mentioned just the technosphere
00:32:28.840 | and you also wrote that the most alive thing
00:32:31.160 | on this planet is our technosphere.
00:32:33.520 | - Yeah.
00:32:34.360 | - Why is the technology we create a kind of life form?
00:32:36.800 | Why are you seeing it as life?
00:32:39.480 | - Because it's creative,
00:32:40.960 | but with us, obviously, like not independently of us.
00:32:43.600 | And also because of this sort of lineage view of life.
00:32:46.440 | And I think about life often as a planetary scale phenomena
00:32:50.680 | 'cause that's sort of the natural boundary
00:32:52.240 | for all of this causation that's bundled
00:32:54.780 | in every object in our biosphere.
00:32:58.200 | And so for me, it's just sort of the current boundary
00:33:02.600 | of how far life on our planet has pushed
00:33:07.040 | into the things that our universe can generate.
00:33:10.280 | And so it's the furthest thing, it's the biggest thing.
00:33:13.800 | And I think a lot about the nature of life
00:33:16.760 | across different scales.
00:33:18.040 | And so we have cells inside of us that are alive
00:33:23.040 | and we feel like we're alive,
00:33:24.760 | but we don't often think about the societies
00:33:26.840 | that we're embedded in as alive
00:33:29.200 | or a global scale organization of us
00:33:32.920 | and our technology on the planet as alive.
00:33:35.960 | But I think if you have this deeper view
00:33:40.400 | into the nature of life, which I think is necessary also
00:33:43.320 | to solve the origin of life,
00:33:44.440 | then you have to include those things.
00:33:47.200 | - All of them.
00:33:48.040 | So you have to simultaneously think about--
00:33:50.440 | - Every scale. - Life at every single scale.
00:33:52.460 | - Yeah.
00:33:53.300 | - The planetary and the bacteria level.
00:33:55.640 | - Yeah, this is the hard thing
00:33:57.320 | about solving the problem of life, I think,
00:33:58.980 | is how many things you have to integrate
00:34:01.440 | into building a sort of a unified picture
00:34:05.480 | of this thing that we wanna call life.
00:34:07.040 | And a lot of our theories of physics
00:34:08.760 | are built on building deep regularities
00:34:16.760 | that explain a really broad class of phenomena.
00:34:16.760 | I think we haven't really traditionally thought
00:34:18.200 | about life that way.
00:34:19.400 | But I think to get at some of these hardest questions,
00:34:23.020 | like looking for life on other planets
00:34:24.640 | or the origin of life, you really have
00:34:25.920 | to think about it that way.
00:34:27.320 | And so most of my professional work
00:34:30.200 | is just trying to understand every single thing
00:34:33.180 | on this planet that might be an example of life,
00:34:35.240 | which is pretty much everything,
00:34:36.440 | and then trying to figure out
00:34:37.680 | what's the deeper structure underlying that.
00:34:40.040 | - Yeah, Schrodinger wrote that living matter,
00:34:42.680 | while not eluding the laws of physics
00:34:44.600 | as established up to date,
00:34:46.240 | is likely to involve other laws of physics,
00:34:49.440 | hitherto unknown.
00:34:52.080 | So to him--
00:34:54.160 | - I love that quote.
00:34:55.880 | - There was a sense that at the bottom of this
00:34:59.560 | are new laws of physics that could explain
00:35:02.800 | this thing that we call life.
00:35:04.880 | - Yeah, Schrodinger really tried
00:35:07.080 | to do what physicists try to do,
00:35:10.000 | which is explain things.
00:35:12.220 | And his attempt was to try to explain life
00:35:17.220 | in terms of non-equilibrium physics,
00:35:20.760 | because he thought that was the best description
00:35:23.540 | that we could generate at the time.
00:35:25.920 | And so he did come up with something really insightful,
00:35:28.120 | which was to predict the structure of DNA
00:35:31.400 | as an aperiodic crystal.
00:35:33.480 | And that was for a very precise reason,
00:35:35.800 | that was the only kind of physical structure
00:35:38.080 | that could encode enough information
00:35:39.840 | to actually specify a cell.
00:35:41.480 | We knew some things about genes,
00:35:43.080 | but not about DNA and its actual structure
00:35:46.400 | when he proposed that.
00:35:47.840 | But in the book, he tried to explain
00:35:50.040 | life as kind of going against entropy.
00:35:53.280 | And so some people have talked about it
00:35:54.300 | as Schrodinger's paradox, how can life persist
00:35:56.560 | when the second law of thermodynamics is there?
00:35:59.480 | But in open systems, that's not so problematic.
00:36:02.120 | And really the question is,
00:36:04.000 | why can life generate so much order?
00:36:07.240 | And we don't have a physics to describe that.
00:36:10.000 | And it's interesting, generations of physicists
00:36:13.360 | have thought about this problem.
00:36:14.680 | Oftentimes, it's like when people are retiring,
00:36:17.280 | they're like, "Oh, now I can work on life."
00:36:20.200 | Or they're more senior in their career
00:36:21.520 | and they've worked on other more traditional problems.
00:36:23.360 | And there's still a lot of impetus in the physics community
00:36:26.200 | to think that non-equilibrium physics will explain life.
00:36:29.400 | But I think that's not the right approach.
00:36:32.380 | I don't think ultimately the solution
00:36:34.280 | to what life is, is there.
00:36:35.400 | And I don't really think entropy has much to do with it
00:36:39.480 | unless it's entirely reformulated.
00:36:42.000 | - Well, 'cause you have to explain how interesting order,
00:36:44.160 | how complexity emerges from the soup.
00:36:47.240 | - Yes, from randomness.
00:36:49.040 | - From randomness.
00:36:49.960 | Physics currently can't do that.
00:36:51.560 | - No, physics hardly even acknowledges
00:36:53.720 | that the universe is random at its base.
00:36:55.680 | We like to think we live in a deterministic universe
00:36:59.480 | and everything's deterministic,
00:37:00.640 | but I think that's probably an artifact
00:37:04.480 | of the way that we've written down laws of physics
00:37:07.080 | since Newton invented modern physics
00:37:10.360 | in his conception of motion and gravity,
00:37:12.220 | for which he formulated laws that had initial conditions
00:37:17.120 | and fixed dynamical laws.
00:37:21.400 | And that's been sort of become the standard canon
00:37:24.240 | of how people think the universe works
00:37:25.800 | and how we need to describe any physical system
00:37:27.800 | is with an initial condition and a law of motion.
00:37:30.600 | And I think that's not actually
00:37:32.760 | the way the universe really works.
00:37:34.160 | I think it's a good approximation for the kind of systems
00:37:36.920 | that physicists have studied so far.
00:37:38.960 | And I think it will radically fail in the longterm
00:37:42.680 | at describing reality at its more basal levels.
00:37:47.680 | I'm not saying there's a base.
00:37:49.440 | I don't think that reality has a ground.
00:37:51.520 | And I don't think there's a theory of everything,
00:37:53.680 | but I think there are better theories
00:37:55.280 | and I think there are more explanatory theories.
00:37:57.480 | And I think we can get to something
00:37:59.640 | that explains much more than the current laws of physics do.
00:38:02.300 | - When you say theory of everything,
00:38:03.440 | you mean like everything, everything.
00:38:05.360 | - Yeah, you know, like in physics right now,
00:38:07.720 | it's really popular to talk about theories of everything.
00:38:09.960 | So string theory is supposed to be a theory of everything
00:38:11.600 | because it unifies quantum mechanics and gravity.
00:38:14.640 | And people have their different pet theories of everything.
00:38:18.960 | And the challenge with a theory of everything,
00:38:21.440 | I really love this quote from David Krakauer,
00:38:23.960 | which is a theory of everything
00:38:25.240 | is a theory of everything
00:38:26.200 | except those things that theorize.
00:38:27.920 | - Oh, you mean removing the observer from the thing.
00:38:30.360 | - Yeah, but it's also weird
00:38:32.320 | because if a theory of everything explained everything,
00:38:34.800 | it should also explain the theory.
00:38:36.380 | So the theory has to be recursive.
00:38:38.480 | And none of our theories of physics are recursive.
00:38:41.040 | So it's a weird concept.
00:38:44.280 | - Yeah, but it's very difficult
00:38:45.200 | to integrate the observer into a theory.
00:38:47.520 | - I don't think so.
00:38:48.800 | I think you can build a theory
00:38:50.120 | acknowledging that you're an observer inside the universe.
00:38:52.880 | - But doesn't it become recursive in that way?
00:38:55.360 | And you're saying it's possible
00:38:56.960 | to make a theory that's okay with that?
00:39:00.200 | - I think so.
00:39:01.080 | I mean, I don't think,
00:39:02.080 | there's always gonna be the paradox of another meta level
00:39:06.520 | you could build on the meta level, right?
00:39:08.440 | So like if you assume this is your universe
00:39:10.080 | and you're the observer outside of it,
00:39:11.700 | you have some meta description of that universe,
00:39:14.280 | but then you need a meta description
00:39:15.560 | of you describing that universe, right?
00:39:17.040 | So this is one of the biggest challenges that we face
00:39:22.040 | being observers inside our universe.
00:39:24.320 | And also why the paradoxes
00:39:26.560 | and the foundations of mathematics
00:39:27.880 | and any place that we try to have observers in the system
00:39:31.640 | or a system describing itself show up.
00:39:35.380 | But I think it is possible to build a physics
00:39:38.240 | that builds in those things intrinsically
00:39:40.480 | without having them be paradoxical
00:39:42.360 | or have holes in the descriptions.
00:39:44.480 | And so one place I think about this quite a lot,
00:39:49.120 | which I think can give you sort of a more concrete example
00:39:51.840 | is the nature of like what we call fundamental.
00:39:54.920 | So we typically define fundamental right now
00:39:59.080 | in terms of the smallest indivisible units of matter.
00:40:03.480 | So again, you have to have a definition
00:40:04.680 | of what you think material is and matter is.
00:40:06.560 | But right now, what's fundamental are elementary particles.
00:40:10.720 | And we think they're fundamental
00:40:12.480 | because we can't break them apart further.
00:40:14.120 | And obviously we have theories like string theory
00:40:16.020 | that if they're right, would replace the current description
00:40:19.560 | of what's the most fundamental thing in our universe
00:40:21.480 | by replacing it with something smaller.
00:40:23.440 | But we can't get to those theories
00:40:26.120 | because we're technologically limited.
00:40:28.280 | And so if you look at this from a historical perspective
00:40:34.640 | and you think about explanations changing
00:40:37.760 | as physical systems like us,
00:40:41.320 | learn more about the reality in which they live.
00:40:43.520 | We once considered atoms to be the most fundamental thing.
00:40:46.920 | And it literally comes from the word indivisible.
00:40:50.640 | And then we realized atoms had substructure
00:40:52.440 | because we built better technology,
00:40:54.160 | which allowed us to "see the world better"
00:40:56.800 | and resolve smaller features of it.
00:40:58.840 | And then we built even better technology,
00:41:01.400 | which allowed us to see even smaller structure
00:41:03.600 | and get down to the standard model particles.
00:41:06.280 | And we think that there might be structure below that,
00:41:08.880 | but we can't get there yet with our technology.
00:41:11.200 | So what's fundamental, the way we talk about it
00:41:14.480 | in current physics is not actually fundamental.
00:41:19.480 | It's the boundaries of what we can observe in our universe,
00:41:22.680 | what we can see with our technology.
00:41:24.760 | And so if you wanna build a theory that's about us
00:41:28.200 | and about what's inside the universe
00:41:32.960 | that we can observe, not what's at the boundary of it,
00:41:36.240 | you need to talk about objects that are in the universe
00:41:40.160 | that you can actually break apart to smaller things.
00:41:42.840 | So I think the things that are fundamental
00:41:44.280 | are actually the constructed objects.
00:41:45.800 | They're the ones that really exist
00:41:47.080 | and you really understand their properties
00:41:48.480 | because you know how the universe constructed them
00:41:50.240 | because you can actually take them apart.
00:41:51.920 | You can understand the intrinsic laws that built them.
00:41:54.480 | But the things at the boundary are just at the boundary,
00:41:56.600 | they're evolving with us.
00:41:58.000 | And we'll learn more about that structure as we go along.
00:42:00.840 | But really, if we wanna talk about
00:42:02.120 | what's fundamental inside our universe,
00:42:03.560 | we have to talk about all these things
00:42:04.860 | that are traditionally considered emergent,
00:42:07.500 | but really just structures in time
00:42:10.320 | that have causal histories that constructed them
00:42:13.120 | and are really actually what our universe is about.
00:42:17.600 | - So we should focus on the construction methodology
00:42:20.700 | as the fundamental thing.
00:42:21.880 | Do you think there's a bottom
00:42:24.360 | to the smallest possible thing that makes up the universe?
00:42:27.620 | - I don't see one.
00:42:29.440 | - And it'll take way too long.
00:42:31.120 | It'll take longer to find that
00:42:33.080 | than it will to understand the mechanism that created life.
00:42:36.120 | - I think so, yeah.
00:42:37.080 | I think for me, the frontier in modern physics,
00:42:40.900 | where the new physics lies,
00:42:42.200 | is not in high energy particle physics,
00:42:44.480 | it's not in quantum gravity,
00:42:46.460 | it's not in any of these sort of traditionally sold,
00:42:48.960 | this is gonna be the newest, deepest insight
00:42:50.600 | we have into the nature of reality.
00:42:51.920 | It is going to be in studying the problems of life
00:42:54.920 | and intelligence and the things that are sort of
00:42:57.860 | also our current existential crises as a civilization
00:43:01.860 | or a culture that's going through
00:43:04.220 | an existential trauma of inventing technologies
00:43:08.240 | that we don't understand right now.
00:43:09.700 | - The existential trauma and the terror we feel
00:43:12.580 | that that technology might somehow destroy us,
00:43:15.440 | us meaning living, intelligent living organisms,
00:43:18.500 | yet we don't understand what that even means.
00:43:20.940 | - Well, humans have always been afraid
00:43:22.140 | of our technologies though, right?
00:43:23.420 | So it's kind of a fascinating thing
00:43:25.900 | that every time we invent something we don't understand,
00:43:27.940 | it takes us a little while to catch up with it.
00:43:29.500 | - I think also in part, humans kind of love being afraid.
00:43:33.140 | - Yeah, we love being traumatized.
00:43:34.660 | - It's weird.
00:43:36.060 | - We wanna learn more,
00:43:37.060 | and then when we learn more, it traumatizes us.
00:43:39.160 | (laughing)
00:43:41.140 | You know, I never thought about this before,
00:43:42.660 | but I think this is one of the reasons I love what I do
00:43:44.500 | is because it traumatizes me all the time.
00:43:46.540 | (laughing)
00:43:47.380 | That sounds really bad.
00:43:48.740 | But what I mean is, I love the shock of realizing that,
00:43:52.860 | coming to understand something in a way
00:43:54.500 | that you never understood it before.
00:43:56.540 | I think, it seems to me when I see a lot of the ways
00:44:00.540 | other people react to new ideas,
00:44:01.860 | that they don't feel that way intrinsically.
00:44:03.340 | But for me, that's like, that's why I do what I do.
00:44:06.020 | I love, I love that feeling.
00:44:08.780 | - But you're also working on a topic
00:44:11.660 | where it's fundamentally ego-destroying,
00:44:14.380 | 'cause you're talking about like life.
00:44:17.300 | It's humbling to think that we're not,
00:44:20.300 | the individual human is not special.
00:44:23.180 | - Yeah.
00:44:24.020 | - And you're like very viscerally exploring that.
00:44:27.260 | - Yeah, I'm trying to embody that.
00:44:30.940 | 'Cause I think you have to live the physics
00:44:33.500 | to understand it.
00:44:34.380 | But there's a great quote about Einstein.
00:44:37.140 | I don't know if this is true or not,
00:44:38.220 | that he once said that he could feel
00:44:39.380 | a light beam in his belly.
00:44:40.780 | And I think, (laughing)
00:44:44.540 | but I think you gotta think about it though, right?
00:44:46.780 | If you're a really deep thinker
00:44:48.100 | and you're really thinking about reality that deeply,
00:44:49.940 | and you are part of the reality
00:44:51.340 | that you're trying to describe, you feel it.
00:44:53.580 | You really feel it.
00:44:54.540 | - That's what I was saying about,
00:44:56.540 | you're always like walking along the cliff.
00:44:59.260 | If you fall off, you're falling into madness.
00:45:01.620 | - Yes, it's a constant, constant descent into madness.
00:45:05.220 | - The fascinating thing about physicists and madness
00:45:07.540 | is that you don't know if you've fallen off the cliff.
00:45:10.420 | - Yeah, you know you don't know.
00:45:11.500 | - That's the cool thing about that.
00:45:12.780 | - I rely on other people to tell me.
00:45:14.580 | Actually, this is very funny,
00:45:15.580 | 'cause I have these conversations with my students often.
00:45:18.380 | They're worried about going crazy.
00:45:19.660 | (laughing)
00:45:20.620 | Like, (laughing)
00:45:22.540 | I reassure them that one of the reasons they'll stay sane
00:45:25.020 | is by trying to work on concrete problems.
00:45:27.140 | - Going crazy or waking up, I don't know which one it is.
00:45:32.340 | - Yeah.
00:45:33.460 | - So what do you think is the origin of life on Earth?
00:45:37.340 | And how can we talk about it in a productive way?
00:45:40.380 | - The origin of life is like this boundary
00:45:43.020 | that the universe can only cross
00:45:47.460 | if a structure that emerges can reinforce its own existence,
00:45:51.860 | which is self-reproduction, autocatalysis,
00:45:54.260 | things people traditionally talk about.
00:45:56.220 | But it has to be able to maintain its own existence
00:45:59.220 | against this sort of randomness that happens in chemistry
00:46:03.100 | and this randomness that happens in the quantum world.
00:46:06.460 | And it's in some sense the emergence
00:46:08.260 | of a deterministic structure that says,
00:46:11.140 | I'm gonna exist and I'm gonna keep going.
00:46:13.380 | But pinning that down is really hard.
00:46:16.860 | We have ways of thinking about it in assembly theory
00:46:18.780 | that I think are pretty rigorous.
00:46:20.020 | And one of the things I'm really excited about
00:46:21.700 | is trying to actually quantify
00:46:24.220 | in an assembly theoretic way
00:46:26.420 | when the origin of life happens.
00:46:28.420 | But the basic process I have in mind
00:46:31.860 | is like a system that has no causal contingency,
00:46:36.860 | no constraints of objects,
00:46:38.660 | basically constraining the existence of other objects
00:46:40.820 | or allowing the existence of other objects.
00:46:45.660 | And so that sounds very abstract,
00:46:47.300 | but you can just think of like a chemical reaction
00:46:49.740 | can't happen if there's not a catalyst, for example,
00:46:52.740 | or a baby can't be born if there wasn't a parent.
00:46:55.860 | So there's a lot of causal contingency
00:46:57.780 | that's necessary for certain things to happen.
00:47:00.500 | So you think about this sort of unconstrained random system,
00:47:05.500 | there's nothing that reinforces
00:47:07.580 | the existence of other things.
00:47:08.900 | So the sort of resources just get washed out
00:47:12.340 | in all of these different structures
00:47:13.620 | and none of them exist again.
00:47:16.140 | Or they just, they're not very complicated
00:47:19.260 | if they're in high abundance.
00:47:20.980 | And some random events allow some things
00:47:23.820 | to start reinforcing the existence
00:47:26.980 | of a small subset of objects.
00:47:28.820 | And if they can do that,
00:47:30.700 | like just molecules basically recognizing each other
00:47:34.820 | and being able to catalyze certain reactions,
00:47:37.180 | there's this kind of transition point that happens
00:47:42.740 | where unless you get a self-reinforcing structure,
00:47:47.740 | something that can maintain its own existence,
00:47:50.940 | it actually can't cross this boundary
00:47:52.900 | to make any objects in high abundance
00:47:57.180 | without having this sort of past history
00:47:59.780 | that it's carrying with it
00:48:00.940 | and maintaining the existence of that past history.
00:48:03.220 | And that boundary point where objects can't exist
00:48:05.940 | unless they have the selection and history in them
00:48:08.500 | is what we call the origin of life.
00:48:09.860 | And pretty much everything beyond that boundary
00:48:12.180 | is holding on for dear life
00:48:15.100 | to all of the causation and causal structure
00:48:17.100 | that's basically put it there.
00:48:19.380 | And it's carving its way through this possibility space
00:48:22.900 | into generating more and more structure.
00:48:26.900 | And that's when you get the open-ended cascade of evolution.
00:48:29.620 | But that boundary point is really hard to cross.
00:48:31.540 | And then what happens when you cross that boundary point
00:48:33.580 | and the way objects come into existence
00:48:35.180 | is also like really fascinating dynamics
00:48:37.140 | because as things become more complex,
00:48:40.020 | the assembly index increases,
00:48:41.500 | I can explain all these things.
00:48:42.620 | Sorry, you can tell me what you wanna explain.
00:48:44.860 | I mean, explain or what people will wanna hear.
00:48:48.000 | Sorry, I have a very vivid visual in my brain
00:48:53.660 | and it's really hard to articulate it.
00:48:55.180 | - Gotta convert it to language.
00:48:56.500 | - I know, it's so hard.
00:48:59.320 | It's going from a feeling to a visual to language
00:49:02.600 | is so stifling sometimes.
00:49:03.440 | - And then I have to convert it
00:49:05.660 | from language to a visual, to a feeling.
00:49:09.300 | - Yeah.
00:49:10.140 | - I think it's working.
00:49:11.180 | - I hope so.
00:49:12.580 | - I really like the self-reinforcing objects.
00:49:14.580 | I mean, just so I understand,
00:49:18.340 | one way to create a lot of the same kind of object
00:49:21.900 | is make them self-reinforcing.
00:49:23.500 | - Yes.
00:49:25.260 | So self-reproduction has this property, right?
00:49:28.300 | Like if that system can make itself,
00:49:30.480 | then it can persist in time, right?
00:49:32.820 | 'Cause all objects decay, they all have a finite lifetime.
00:49:35.620 | So if you're able to make a copy of yourself
00:49:38.300 | before you die, before the second law eats you
00:49:42.260 | or whatever people think happens,
00:49:44.700 | then that structure can persist in time.
00:49:47.380 | - So that's a way to sort of emerge out of a random soup,
00:49:50.900 | out of the randomness of soup.
00:49:52.580 | - Right, but things that can copy themselves are very rare.
00:49:55.340 | - Yeah, rare.
00:49:56.180 | - And so what ends up happening is that you get structures
00:50:02.480 | that enable the existence of other things.
00:50:06.160 | And then somehow, only for some sets of objects,
00:50:11.160 | you get closed structures that are self-reinforcing
00:50:14.040 | and allow that entire structure to persist.
00:50:16.760 | - Right, so the one object A reinforces the existence
00:50:21.480 | of object B, but object A can die.
00:50:24.880 | - Yeah.
00:50:25.720 | - So you have to close that loop.
00:50:27.360 | - Right.
00:50:28.200 | So this is the classic idea.
00:50:29.040 | - This is all very unlikely statistically,
00:50:30.720 | but, you know, that's sufficiently,
00:50:34.140 | so you're saying there's a chance.
00:50:38.000 | - There is a chance.
00:50:38.840 | - It's still a probability, but once you solve that,
00:50:41.800 | once you close the loop, you can create
00:50:43.280 | a lot of those objects.
00:50:44.480 | - And that's what we're trying to figure out
00:50:45.760 | is what are the causal constraints that close the loop.
00:50:47.720 | So there is this idea that's been in the literature
00:50:49.640 | for a really long time that was originally proposed
00:50:51.520 | by Stuart Kaufman as really critical
00:50:52.960 | to the origin life called autocatalytic set.
00:50:54.560 | So autocatalytic set is exactly this property.
00:50:56.640 | We have A makes B, B makes C, C makes A,
00:50:59.680 | and you get a closed system.
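A minimal way to see why closing the loop matters is a toy kinetic model (illustrative only — the rate constants `k` and `d` are made-up numbers, not real chemistry): every species decays, and each is produced in proportion to its catalyst's abundance. Close the A → B → C → A loop and the collective persists; break it and everything washes out.

```python
def simulate(closed: bool, steps: int = 2000, dt: float = 0.01,
             k: float = 1.2, d: float = 1.0):
    """Euler-integrate a 3-species catalytic chain A -> B -> C.
    If `closed`, C also catalyzes A, completing the loop."""
    a, b, c = 1.0, 1.0, 1.0
    for _ in range(steps):
        da = (k * c if closed else 0.0) - d * a  # open chain: nothing makes A
        db = k * a - d * b                       # A catalyzes production of B
        dc = k * b - d * c                       # B catalyzes production of C
        a, b, c = a + dt * da, b + dt * db, c + dt * dc
    return a, b, c
```

With production rate `k` only slightly above decay rate `d`, the closed loop grows collectively while the open chain decays toward zero — the loop's members maintain each other's existence, which is exactly the mutual reinforcement being described.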
00:51:00.800 | But the problem with the theory of autocatalytic sets
00:51:03.000 | is it's incredibly brittle as a theory,
00:51:06.160 | and it requires a lot of ad hoc assumptions.
00:51:08.920 | Like you have to assume function.
00:51:11.760 | You have to say this thing makes B.
00:51:14.160 | It's not an emergent property,
00:51:15.600 | the association between A and B.
00:51:17.560 | And so the way I think about it is much more general.
00:51:21.360 | If you think about these histories that make objects,
00:51:25.840 | it's kind of like the structure of the histories
00:49:28.520 | collapses in such a way
00:51:32.680 | that these things are all
00:51:34.040 | in the same sort of causal structure,
00:51:36.320 | and that causal structure actually loops back on itself
00:51:38.920 | to be able to generate some of the things
00:51:40.600 | that make the higher level structures.
00:51:43.360 | Lee has a beautiful example of this, actually,
00:51:45.400 | in molybdenum.
00:51:46.240 | It's like the first non-organic autocatalytic set.
00:51:51.240 | It's a self-reproducing molybdenum ring,
00:51:55.200 | but it's like molybdenum.
00:51:57.280 | And basically, if you look at the molybdenum,
00:52:00.320 | it makes a huge molybdenum ring.
00:52:01.960 | I don't remember exactly how big it is.
00:52:03.400 | It might be like 150 molybdenum atoms or something.
00:52:06.440 | But if you think about the configuration space
00:52:08.240 | of that object, it's exponentially large
00:52:10.480 | how many possible molecules.
00:52:11.640 | So why does the entire system collapse
00:52:13.680 | on just making that one structure?
00:52:15.520 | If you start from molybdenum atoms
00:52:17.800 | that are maybe just a couple of them stuck together.
00:52:20.440 | And so what they see in this system
00:52:22.960 | is there's a few intermediate stages.
00:52:24.920 | So there's like some random events
00:52:26.280 | where the chemistry comes together
00:52:27.440 | and makes these structures.
00:52:28.720 | And then once you get to this very large one,
00:52:30.640 | it becomes a template for the smaller ones.
00:52:32.320 | And then the whole system just reinforces
00:52:33.880 | its own production.
00:52:34.880 | - How did Lee find this molybdenum?
00:52:38.920 | (laughing)
00:52:41.180 | - If I knew how Lee's brain worked,
00:52:44.220 | I think I would understand a lot more about the universe.
00:52:47.160 | - This is not an algorithmic discovery, it's a--
00:52:49.640 | - No, but I think it goes to the deepest roots
00:52:52.660 | of when he started thinking about origins of life.
00:52:54.700 | So I mean, I don't know all his history,
00:52:56.720 | but like what he's told me
00:52:57.800 | is he started out in crystallography.
00:53:01.320 | And there's some things that he would just,
00:53:04.000 | like people would just take for granted
00:53:06.280 | about chemical structures
00:53:08.840 | that he was like deeply perplexed about.
00:53:10.840 | Like just like why are these like really intricate,
00:53:13.960 | really complex structures forming so easily
00:53:16.120 | under these conditions?
00:53:17.840 | And he was really interested in life,
00:53:21.960 | but he started in that field.
00:53:23.360 | So he's just carried with him
00:53:24.740 | these sort of deep insights from these systems
00:53:26.620 | that seem like they're totally not alive
00:53:28.360 | and just like these metallic chemistries
00:53:30.960 | into actually thinking about the deep principles of life.
00:53:35.120 | So I think he already knew a lot about that chemistry
00:53:40.120 | and he also, you know, assembly theory came
00:53:45.680 | from him thinking about how these systems work.
00:53:48.380 | So he had some intuition about what was going on
00:53:51.420 | with this molybdenum ring.
00:53:53.400 | - The molybdenum might be able to be the thing
00:53:56.280 | that makes a ring.
00:53:57.640 | - They knew about them for a long time,
00:53:59.080 | but they didn't know that the mechanism
00:54:00.440 | of why that particular structure formed
00:54:02.000 | was autocatalytic feedback.
00:54:03.840 | And so that's what they figured out in this paper.
00:54:06.880 | And I actually think that paper is revealing
00:54:09.200 | some of the mechanism of the origin of life transition.
00:54:11.640 | 'Cause really what you see,
00:54:13.320 | like the origin of life is basically like,
00:54:15.620 | you should have a combinatorial explosion
00:54:18.140 | of the space of possible structures
00:54:21.900 | that are too large to exhaust.
00:54:23.540 | And yet you see it collapse on this, you know,
00:54:26.880 | really small space of possibilities
00:54:29.220 | that's mutually reinforcing itself to keep existing.
00:54:32.140 | That is the origin of life.
00:54:34.900 | - There's some set of structures that result
00:54:37.860 | in this autocatalytic feedback.
00:54:40.540 | - Yeah.
00:54:41.380 | - And what is it, tiny, tiny, tiny, tiny percent?
00:54:44.660 | - I think it's a small space.
00:54:47.860 | Like chemistry is very large.
00:54:49.320 | - And so like--
00:54:51.180 | - There might be a lot of them out there, but we don't know.
00:54:53.700 | - And one of them is the thing
00:54:55.060 | that probably started life on Earth.
00:54:56.660 | - That's right.
00:54:57.500 | - Or many, many starts.
00:54:58.860 | - Yes.
00:54:59.700 | - And it keeps starting maybe.
00:55:00.520 | - Yeah.
00:55:01.360 | I mean, there's also all kinds of other weird properties
00:55:03.500 | that happen around this kind of phase boundary.
00:55:08.500 | So this other project that I have in my lab
00:55:11.940 | is focused on the origin of chirality,
00:55:13.800 | which is, you know, thinking about,
00:55:17.280 | so chirality is this property of molecules
00:55:19.320 | that they can come in mirror image forms.
00:55:21.180 | So like just, like chirality means hand.
00:55:23.320 | So your left and right hand
00:55:25.820 | are what's called non-superimposable
00:55:27.660 | because if you try to lay one on the other,
00:55:29.280 | you can't actually lay them directly on top of each other.
00:55:32.620 | And that's the property of being a mirror image.
00:55:34.680 | So there's this sort of perplexing property
00:55:36.580 | of the chemistry of life
00:55:37.420 | that no one's been able to really adequately explain
00:55:39.940 | that all of the amino acids in proteins are left-handed
00:55:44.420 | and all of the bases in RNA and DNA are right-handed.
00:55:49.420 | And yet the chemistry of these building block units,
00:55:54.060 | the amino acids and nucleobases
00:55:55.540 | is the same for left and right-handed.
00:55:57.020 | So you have to have like some kind of symmetry breaking
00:55:59.420 | where you go from these chemistries
00:56:00.900 | that seem entirely equivalent
00:56:02.580 | to only having one chemistry take over as the dominant form.
00:56:07.420 | And for a long time, I had been really,
00:56:10.340 | I actually did my PhD on the origin of chirality.
00:56:13.300 | I was working on it
00:56:14.420 | as like a symmetry breaking problem in physics.
00:56:16.780 | This is how I got started on the origin of life.
00:56:18.780 | And then I left it for a long time
00:56:19.940 | 'cause I thought it was like one of the most boring problems
00:56:21.700 | in the origin of life, but I've come back to it
00:56:23.260 | 'cause I think there's something really deep going on here
00:56:25.140 | related to this like combinatorial explosion
00:56:27.540 | of the space of possibilities.
00:56:29.060 | But just to get to that point,
00:56:32.660 | like this feature of this handedness has been the main focus,
00:56:36.200 | but people take for granted
00:56:39.060 | the existence of chiral molecules at all,
00:56:41.580 | that this property of having a handedness,
00:56:44.720 | and they just assume that,
00:56:47.740 | you know, like it's just a generic feature of chemistry.
00:56:50.420 | But if you actually look at molecules,
00:56:53.140 | if you look at chemical space,
00:56:54.740 | which is like the space of all possible molecules
00:56:56.540 | that people can generate,
00:56:58.060 | and you look at small molecules,
00:57:00.100 | things that have less than about seven to 11 heavy atoms,
00:57:04.140 | so things that are not hydrogen,
00:57:05.780 | almost every single molecule in that space is achiral,
00:57:08.900 | like doesn't have a chiral center.
00:57:10.300 | So it would be like a spoon.
00:57:12.060 | A spoon doesn't have it,
00:57:13.260 | like it's the same as its mirror image.
00:57:14.940 | It's not like a hand that's different than its mirror image.
00:57:17.940 | But if you get to like this threshold boundary,
00:57:22.100 | above that boundary, almost every single molecule is chiral.
00:57:26.300 | So you go from a universe
00:57:27.640 | where almost nothing has a mirror image form,
00:57:30.340 | there's no mirror image universe of possibilities,
00:57:32.900 | to this one where every single structure
00:57:35.180 | has pretty much a mirror image version.
00:57:37.620 | And what we've been looking at in my lab
00:57:41.300 | is that it seems to be the case
00:57:43.140 | that the origin of life transition
00:57:45.900 | happens around the time when you start accumulating.
00:57:49.260 | You push your molecules to a large enough complexity
00:57:52.940 | that chiral molecules become very likely to form.
00:57:57.220 | And then there's a cascade of molecular recognition
00:58:00.220 | where chiral molecules can recognize each other.
00:58:03.060 | And then you get this sort of autocatalytic feedback
00:58:05.140 | and things self-reinforcing.
00:58:06.660 | - So is chirality in itself an interesting feature
00:58:10.620 | or just an accident of complexity?
00:58:12.020 | - No, it's a super interesting feature.
00:58:13.500 | I think chirality breaks symmetry in time, not space.
00:58:15.980 | So we think of it as a spatial property,
00:58:18.760 | like a left and right hand.
00:58:20.180 | But if I choose the left hand,
00:58:21.760 | I'm basically choosing the future of that system
00:58:24.460 | for all time because I've basically made a choice
00:58:27.020 | between the ways that that molecule can now react
00:58:29.760 | with every other object in its chemical universe.
00:58:32.380 | - Oh, I see.
00:58:33.220 | - And so you're actually like,
00:58:34.620 | when you have this splitting of making a molecule
00:58:37.560 | that now has another form it could have had
00:58:40.580 | by the same exact atomic composition,
00:58:42.820 | but now it's just a mirror image isomer,
00:58:44.740 | you're basically splitting the universe
00:58:46.220 | of possibilities every time.
00:58:47.740 | - Yeah, in two.
00:58:50.020 | - In two, but molecules can have more than one chiral center
00:58:52.100 | and that's not the only stereoisomerism that they can have.
00:58:54.580 | So this is one of the reasons that Taxol
00:58:56.540 | fills 1.5 universes of space.
00:58:58.860 | It's all of these spatial permutations
00:59:00.740 | that you do on these objects
00:59:01.860 | that actually makes the space so huge.
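The doubling compounds quickly. A toy sketch of the spatial permutations (using the standard R/S configuration labels from chemistry, and ignoring meso symmetries that can merge some forms):

```python
from itertools import product

def stereo_configurations(n_centers: int) -> list:
    """Enumerate the spatial configurations of a molecule with
    n chiral centers: each center is independently R or S, so the
    space of mirror-image variants doubles with every center added."""
    return [''.join(cfg) for cfg in product('RS', repeat=n_centers)]

# 10 centers already give 2**10 = 1024 distinct configurations
```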
00:59:03.860 | So the point of this sort of chiral transition
00:59:06.740 | that I'm putting out is chirality is actually a signature
00:59:10.060 | of being in a complex chemical space.
00:59:12.660 | And the fact that we think it's a really generic feature
00:59:15.980 | of chemistry and it's really prevalent
00:59:17.500 | is because most of the chemistry we study on earth
00:59:19.640 | is a product already of life.
01:00:21.500 | And it also has to do with this transition in assembly,
01:00:24.420 | this transition in possibility spaces,
00:59:26.900 | because I think there's something really fundamental
00:59:29.100 | going on at this boundary
00:59:31.940 | that you don't really need to go that far
00:59:33.900 | into chemical space to actually see life
00:59:37.780 | in terms of this depth in time,
00:59:39.380 | this depth in symmetries of objects
00:59:42.100 | in terms of like chiral symmetries
00:59:43.940 | or this assembly structure.
00:59:46.200 | But getting past this boundary
00:59:50.460 | that's not very deep in that space requires life.
00:59:53.380 | It's a really weird property and it's really weird
00:59:59.060 | that so many abrupt things happen in chemistry
01:00:01.180 | at that same scale.
01:00:02.380 | - So would that be the greatest invention ever made on earth
01:00:07.380 | in its evolutionary history?
01:00:09.940 | So I really like that formulation of it.
01:00:12.260 | Nick Lane has a book called "Life Ascending"
01:00:15.180 | where he lists the 10 great inventions of evolution,
01:00:18.380 | the origin of life being first,
01:00:20.860 | and DNA, the hereditary material
01:00:23.860 | that encodes the genetic instructions
01:00:25.340 | for all living organisms,
01:00:26.620 | then photosynthesis, the process that allows organisms
01:00:29.700 | to convert sunlight into chemical energy,
01:00:31.860 | producing oxygen as a byproduct,
01:00:34.220 | the complex cell, eukaryotic cells,
01:00:36.900 | which contain a nucleus and organelles
01:00:38.620 | that rose from simple bacterial cells,
01:00:41.340 | sex, sexual reproduction,
01:00:44.060 | movement, so just the ability to move
01:00:46.500 | under which you have the predation,
01:00:48.540 | the predators and ability of living organisms
01:00:51.340 | to find food. - I like that movement's
01:00:52.180 | in there, that's cool.
01:00:53.100 | - Yeah, but a movement includes
01:00:54.700 | a lot of interesting stuff in there,
01:00:55.900 | like predator-prey dynamic,
01:00:58.460 | which, not to romanticize "Nature is Metal,"
01:01:02.180 | that seems like an important one.
01:01:04.060 | I don't know, it's such a computationally powerful thing
01:01:08.940 | to have a predator and prey.
01:01:10.740 | - Well, it's efficient for things to eat
01:01:12.980 | other things that are already alive
01:01:14.540 | 'cause they don't have to go all the way back
01:01:15.740 | to the base chemistry.
01:01:18.840 | - Well, that, but maybe I just like deadlines,
01:01:21.260 | but it creates an urgency, you're gonna get eaten.
01:01:23.940 | - You gotta live.
01:01:24.900 | - Yeah, like survival,
01:01:27.140 | it's not just the static environment
01:01:28.700 | you're battling against. - Oh, I see.
01:01:29.660 | - You're like, the dangers against which
01:01:34.340 | you're trying to survive are also evolving.
01:01:37.660 | This is just a much faster way
01:01:39.820 | to explore the space of possibilities.
01:01:42.780 | - I actually think it's a gift
01:01:43.900 | that we don't have much time.
01:01:45.340 | - Yes, sight, the ability to see,
01:01:47.960 | so the increasing complexifying of sensory organisms,
01:01:52.860 | consciousness, and death,
01:01:55.380 | the concept of programmed cell death,
01:01:58.580 | these are all inventions along the line.
01:02:02.980 | - I like invention as a word for them,
01:02:04.700 | I think that's good.
01:02:05.580 | - Which are the more interesting inventions to you,
01:02:07.740 | the origin of life?
01:02:08.660 | Because you kind of are not glorifying
01:02:12.280 | the origin of life itself.
01:02:13.900 | There's a process. - No, I think the origin
01:02:15.580 | of life is a continual process, that's why.
01:02:17.700 | I'm interested in the first transition
01:02:19.260 | in solving that problem 'cause I think it's the hardest,
01:02:21.180 | but I think it's happening all the time.
01:02:24.180 | - When you look back at the history of Earth,
01:02:26.340 | like what are you impressed happened?
01:02:28.340 | - I like sight as an invention
01:02:33.340 | 'cause I think having sensory perception
01:02:37.260 | and trying to comprehend the world,
01:02:39.900 | to use the anthropocentric terms,
01:02:41.420 | is like a really critical feature of life.
01:02:45.540 | And I also, it's interesting the way
01:02:47.500 | that sight has complexified over time, right?
01:02:51.220 | So like if you think of the origin of life,
01:02:54.140 | nothing on the planet could see, right?
01:02:57.580 | So like for a long time, life had no sight.
01:03:00.260 | And then like photon receptors were invented.
01:03:04.460 | And then when multicellularity evolved,
01:03:07.460 | those cells eventually grew into eyes
01:03:12.460 | and we had the multicellular eye.
01:03:14.500 | And then it's interesting when you get to societies,
01:03:16.500 | like human societies, that we invent
01:03:18.700 | even better technologies of seeing,
01:03:20.380 | like telescopes and microscopes,
01:03:22.500 | which allow us to see deeper into the universe
01:03:24.680 | or at smaller scales.
01:03:26.120 | So I think that's pretty profound,
01:03:30.660 | the way that sight has transformed the ability of life
01:03:35.660 | to literally see the reality in which it's existing.
01:03:43.500 | - I think consciousness is also
01:03:47.140 | obviously deeply interesting.
01:03:49.460 | I've gotten kind of obsessed with like octopus.
01:03:51.960 | They're just so weird.
01:03:54.220 | And the fact that like they evolved complex nervous systems
01:03:57.900 | kind of independently is like, seems very alien.
01:04:01.020 | - Yeah, there's a lot of alien like organisms.
01:04:03.020 | That's another thing I saw in the jungle.
01:04:05.860 | - Yeah.
01:04:07.900 | - Just things that are like, oh, okay.
01:04:09.660 | They make one of those, huh?
01:04:11.660 | It just feels like there's-- - Do you have any examples?
01:04:14.200 | - There's a frog that's as thin as a sheet of paper.
01:04:18.560 | And I was like, what?
01:04:19.760 | And it gives birth through like pores.
01:04:22.060 | - Oh, I've seen videos of that.
01:04:23.740 | It's so gross when the babies come out.
01:04:26.020 | Did you see that like in person,
01:04:28.220 | like the babies coming out? - Oh, no, no.
01:04:29.740 | I saw the, without the--
01:04:31.980 | - Have you seen videos of that?
01:04:32.820 | It's so gross.
01:04:34.420 | It's one of the grossest things I've ever seen.
01:04:36.540 | - Well, gross is just the other side of beautiful.
01:04:41.420 | It's like, oh, wow, that's possible.
01:04:44.500 | - I guess if I was one of those frogs,
01:04:45.900 | I would think that was the most beautiful event
01:04:47.420 | I'd ever seen.
01:04:48.260 | Although like human childbirth
01:04:49.660 | is not that beautiful either.
01:04:51.460 | - Yeah, it's all a matter of perspective.
01:04:54.340 | - Well, we come in the world so violently.
01:04:56.220 | It's just like, it's amazing.
01:04:58.780 | - I mean, the world is a violent place.
01:05:00.980 | So again, it's just another side of the coin.
01:05:04.940 | - You know what?
01:05:05.780 | This actually makes me think of one that's not up there,
01:05:07.500 | which I do find really incredibly amazing
01:05:10.860 | is the process of like the germline cell in organisms.
01:05:15.860 | Like basically like every living thing on this planet
01:05:23.300 | at some point in its life has to go through a single cell.
01:05:26.300 | And this whole issue of like development,
01:05:28.180 | like the developmental program is kind of crazy.
01:05:30.180 | Like how do you build you out of a single cell?
01:05:32.620 | How does a single cell know how to do that?
01:05:34.700 | Like pattern formation of a multicellular organism
01:05:38.180 | obviously like evolves with DNA,
01:05:40.300 | but there's a lot of stuff happening there
01:05:41.980 | about when cells take on certain morphologies
01:05:45.300 | and things that people don't understand,
01:05:46.620 | like the actual shape formation mechanism.
01:05:48.620 | And a lot of people study that
01:05:49.780 | and there's a lot of advances being made now in that field.
01:05:53.900 | I think it's pretty shocking though
01:05:55.020 | that like how little we know about that process.
01:05:58.300 | And often it's left off of people's lists.
01:06:00.020 | It's just kind of interesting.
01:06:01.740 | Embryogenesis is fascinating.
01:06:03.380 | - Yeah, 'cause you start from just one cell.
01:06:05.940 | - Yeah, and the genes in all the cells are the same, right?
01:06:08.500 | So like the differentiation has to be
01:06:11.580 | something that's like much more about like the actual
01:06:15.660 | expression of genes over time
01:06:20.700 | and like how they get switched on and off
01:06:22.660 | and also the physical environment
01:06:24.060 | of like the cell interacting with other cells.
01:06:26.140 | And there's just a lot of stuff going on.
01:06:28.820 | - Yeah, the computation, the intelligence of that process
01:06:32.740 | might be like the most important thing to understand
01:06:36.300 | and we just kind of don't really think about it.
01:06:38.180 | - Right.
01:06:39.020 | - We think about the final product.
01:06:40.180 | - Yeah.
01:06:41.620 | - Maybe the key to understanding the organism
01:06:44.900 | is understanding that process, not the final product.
01:06:48.500 | - Probably, yes.
01:06:50.340 | I think most of the things about understanding anything
01:06:52.300 | about what we are are embedded in time.
01:06:54.300 | - Well, of course you would say that.
01:06:55.260 | - I know, so predictable.
01:06:56.780 | It's turning into a deterministic universe.
01:07:01.460 | - It always has been, always was, like the meme.
01:07:05.180 | - Yeah, always was, but it won't be in the future.
01:07:07.660 | - Well, that's, before we talk about the future,
01:07:09.620 | let's talk about the past, the assembly theory.
01:07:11.860 | - Yes.
01:07:12.700 | - Can you explain assembly theory to me?
01:07:15.180 | I listened to Lee talk about it for many hours
01:07:17.380 | and I understood nothing.
01:07:18.460 | No, I'm just kidding.
01:07:20.220 | I just wanted to take another,
01:07:21.900 | you've been already talking about it,
01:07:23.300 | but just what from a big picture view
01:07:28.300 | is the assembly theory way of thinking about our world?
01:07:37.500 | Our universe?
01:07:38.860 | - Yeah, I think the first thing is the observation
01:07:43.860 | that life seems to be the only thing in the universe
01:07:49.740 | that builds complexity in the way that we see it here,
01:07:52.660 | and complexity is obviously a loaded term,
01:07:55.100 | so I'll just use assembly instead,
01:07:57.820 | 'cause I think assembly is more precise.
01:07:59.820 | But the idea that all the things on your desk here,
01:08:05.140 | from your computer to the pen to us sitting here
01:08:10.140 | don't exist anywhere else in the universe
01:08:11.980 | as far as we know, they only exist on this planet,
01:08:14.660 | and it took a long evolutionary history to get to us,
01:08:17.820 | is a real feature that we should take seriously
01:08:21.540 | as one that's deeply embedded in the laws of physics
01:08:24.940 | and the structure of the universe that we live in.
01:08:27.540 | Standard physics would say that all of that complexity
01:08:30.420 | traces back to the infinitesimal deviations
01:08:35.420 | in the initial state of the universe,
01:08:37.620 | that there was some order there.
01:08:39.540 | I find that deeply unsatisfactory,
01:08:41.620 | and what assembly theory says that's very different
01:08:46.620 | is that the universe is basically constructing itself,
01:08:51.620 | and when you get to these combinatorial spaces
01:08:56.700 | like chemistry, where the space of possibilities
01:09:00.140 | is too large to exhaust them all,
01:09:02.460 | you can only construct things along
01:09:06.620 | historically contingent paths,
01:09:08.820 | like you basically have causal chains of events
01:09:11.020 | that happen to allow other things to come into existence,
01:09:15.100 | and that this is the way that complex objects get formed
01:09:20.100 | is basically scaffolding on the past history
01:09:22.380 | of objects making more complex objects
01:09:24.540 | making more complex objects.
01:09:26.220 | That idea in itself is easy to state and simple,
01:09:28.780 | but it has some really radical implications
01:09:31.420 | as far as what you think is the nature of the physics
01:09:36.420 | that would describe life,
01:09:38.380 | and so what assembly theory does formally
01:09:41.340 | is try to measure the boundary in the space of all things
01:09:45.980 | that chemically could exist, for example,
01:09:48.500 | like all possible molecules,
01:09:49.780 | where is the boundary above which we should say
01:09:52.180 | these things are too complex to happen
01:09:54.020 | outside of an evolutionary chain of events,
01:09:56.940 | outside of selection,
01:09:58.740 | and we formalize that with two observables,
01:10:02.180 | one of them's the copy number of the object,
01:10:03.900 | so how many of the object did you observe,
01:10:05.860 | and the second one is what's the minimal number
01:10:08.660 | of recursive steps to make it,
01:10:10.220 | so if you start from elementary building blocks,
01:10:14.980 | like bonds for molecules, and you put them together,
01:10:18.700 | and then you take things you've made already
01:10:20.540 | and build up to the object,
01:10:21.500 | what's the shortest number of steps you had to take,
01:10:24.460 | and what Lee's been able to show in the lab with his team
01:10:28.420 | is that for organic chemistry, it's about 15 steps,
01:10:33.420 | and then you only see molecules that,
01:10:38.020 | you know, the only molecules that we observe
01:10:40.900 | that are past that threshold are ones that are in life,
01:10:45.100 | and in fact, one of the things I'm trying to do
01:10:46.620 | with this idea of like trying to actually quantify
01:10:48.540 | the origin of life as a transition
01:10:50.700 | in like a phase transition in assembly theory
01:10:53.300 | is actually be able to explain
01:10:55.860 | why that boundary is where it is,
01:10:58.020 | 'cause I think that's actually the boundary
01:10:59.300 | that life must cross, so the idea of going back
01:11:02.340 | to this thing we were talking about before
01:11:03.740 | about these structures that can reinforce
01:11:06.180 | their own existence and move past that boundary,
01:11:08.980 | 15 seems to be that boundary in chemical space.
01:11:12.140 | It's not a universal number.
01:11:13.300 | It will be different for different assembly spaces,
01:11:16.380 | but that's what we've experimentally validated so far,
01:11:18.980 | and then--
01:11:19.820 | - So literally 15, like the assembly index is 15?
01:11:22.700 | - It's 15 or so for the experimental data, yeah.
01:11:25.620 | - So that's when you start getting the self-reinforcing.
01:11:28.980 | - That's when you have to have that feature
01:11:31.260 | in order to observe molecules in high abundance
01:11:34.820 | in that space.
01:11:35.820 | - So copy number is the number of exact copies.
01:11:39.020 | That's what you mean by high abundance,
01:11:40.580 | and assembly index or the complexity of the object
01:11:43.820 | is how many steps it took to create it, recursive?
01:11:47.460 | - Recursive, yeah.
01:11:49.220 | So you can think of objects in assembly theory
01:11:51.220 | as basically recursive stacks
01:11:53.020 | of the construction steps to build them.
01:11:55.580 | So it's like you take this step,
01:11:59.020 | and then you make this object,
01:12:00.100 | and you make this object, and make this object,
01:12:01.780 | and then you get up to the final object,
01:12:03.220 | but that object is all of that history
01:12:05.020 | rolled up into the current structure.
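Walker's picture of an object as a recursive stack of construction steps can be sketched as a toy program. This is an illustrative brute force over strings, not the published assembly-index algorithm (which operates on molecular bond graphs): the assembly index here is the minimum number of join operations needed to build a string when every fragment built so far can be reused.

```python
from collections import deque

def assembly_index(target: str) -> int:
    # Minimum number of join (concatenation) operations needed to build
    # `target` from its single characters, where every fragment built so
    # far can be reused -- the recursive stacking described above.
    # Brute-force breadth-first search; only feasible for short strings.
    subs = {target[i:j] for i in range(len(target))
            for j in range(i + 1, len(target) + 1)}
    start = frozenset(target)  # the individual characters come for free
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        frags, steps = queue.popleft()
        if target in frags:
            return steps
        for a in frags:
            for b in frags:
                new = a + b
                # prune: only fragments that appear inside the target help
                if new in subs and new not in frags:
                    nxt = frags | {new}
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, steps + 1))
    raise ValueError("target should always be reachable")

# Reuse shortens the construction: "abab" recycles its "ab" fragment,
# while "abcd" has nothing to recycle.
print(assembly_index("abab"))  # 2  (a+b -> ab, ab+ab -> abab)
print(assembly_index("abcd"))  # 3  (ab, then abc, then abcd)
```

The object's "rolled up history" is the sequence of fragments the search reuses along the shortest construction path.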
01:12:06.900 | - What if you took the long way home?
01:12:08.660 | - You can't take the long way.
01:12:10.660 | - Why not?
01:12:11.500 | - The long way doesn't exist.
01:12:12.900 | - That's a good song, though.
01:12:14.300 | What do you mean the long way doesn't exist?
01:12:16.940 | If I do a random walk from A to B,
01:12:20.260 | I'll eventually, if I start at A,
01:12:22.460 | I'll eventually end up at B,
01:12:23.980 | and that random walk would be much shorter--
01:12:25.660 | - So it turns out--
01:12:26.500 | - Longer than the short--
01:12:27.340 | - No, if you look at objects,
01:12:29.220 | and so we define something we call the assembly universe,
01:12:33.380 | and the assembly universe is ordered in time.
01:12:35.620 | It's actually ordered in the causation,
01:12:37.460 | the number of steps to produce an object,
01:12:39.500 | and so all objects in the universe are, in some sense,
01:12:42.900 | exist in a layer that's defined by their assembly index,
01:12:47.580 | and the size of each layer is growing exponentially.
01:12:51.420 | So what you're talking about,
01:12:54.300 | if you wanna look at the long way of getting to an object,
01:12:57.060 | as I'm increasing the assembly index of an object,
01:12:59.260 | I'm moving deeper and deeper
01:13:00.540 | into an exponentially growing space,
01:13:02.940 | and it's actually also the case
01:13:04.660 | that the sort of typical path to get to that object
01:13:08.140 | is also exponentially growing
01:13:09.660 | with respect to the assembly index,
01:13:11.620 | and so if you want to try to make
01:13:13.900 | a more and more complex object,
01:13:15.260 | and you wanna do it by a typical path,
01:13:18.740 | that's actually an exponentially receding horizon,
01:13:21.540 | and so most objects that come into existence
01:13:23.660 | have to be causally very similar to the things that exist
01:13:26.180 | 'cause they're close by in that space,
01:13:27.580 | and they can actually get to it
01:13:28.740 | by an almost shortest path for that object.
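The claim that "the long way doesn't exist" can be illustrated with a toy Monte Carlo (an assumption-laden sketch, not anything from the assembly theory papers): trajectories that join random fragments find shallow objects easily but essentially never stumble onto deep ones, which is why near-shortest paths dominate.

```python
import random

def random_hit_rate(target: str, steps: int, trials: int, seed: int = 0) -> float:
    # Fraction of random join-trajectories that ever produce `target`.
    # Each step concatenates two fragments chosen uniformly at random --
    # a "typical path" through assembly space rather than a shortest one.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        frags = sorted(set(target))  # start from the single characters
        for _ in range(steps):
            new = rng.choice(frags) + rng.choice(frags)
            if len(new) <= len(target):
                frags.append(new)
        if target in frags:
            hits += 1
    return hits / trials

shallow = random_hit_rate("ab", steps=5, trials=2000)
deep = random_hit_rate("abababab", steps=20, trials=2000)
# Shallow objects are hit often; deep ones almost never, even with
# four times as many steps -- the "typical path" horizon recedes.
print(shallow, deep)
```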
01:13:30.700 | - Yeah, the almost shortest path is the most likely,
01:13:33.100 | and like by a lot.
01:13:35.780 | - By a lot.
01:13:36.820 | - Okay, so if you see a high copy number.
01:13:39.540 | - Yeah, imagine yourself--
01:13:40.380 | - A copy number greater than one.
01:13:42.060 | - Yeah, I mean, basically we live,
01:13:44.060 | the more complex we get,
01:13:45.340 | we live in a space that is growing exponentially large,
01:13:50.180 | and the ways of getting to objects in the space
01:13:53.020 | are also growing exponentially large,
01:13:55.180 | and so we're this kind of recursively stacked structure
01:14:00.180 | of all of these objects
01:14:03.460 | that are clinging onto each other for existence,
01:14:06.340 | and then they grab something else
01:14:08.100 | and are able to bring that thing into existence
01:14:10.420 | 'cause it's kind of similar to them.
01:14:11.980 | - But there is a phase transition.
01:14:13.620 | There is a--
01:14:14.460 | - There is a transition.
01:14:15.300 | - There is a place where you would say,
01:14:16.900 | oh, that's life. - I think it's actually abrupt.
01:14:18.500 | I've never been able to say that in my entire career before.
01:14:20.980 | I've always gone back and forth
01:14:22.020 | about whether the origin of life
01:14:23.180 | was kind of gradual or abrupt.
01:14:24.300 | I think it's very abrupt.
01:14:26.220 | - Poetically, chemically, literally, what snaps?
01:14:28.940 | Okay, that's very beautiful. - It snaps.
01:14:30.820 | - Okay, but-- - We'll be poetic today,
01:14:32.340 | but no, I think there's like a lot of random exploration,
01:14:35.740 | and then the possibility space
01:14:38.980 | just collapses on the structure kind of really fast,
01:14:42.340 | one that can reinforce its own existence
01:14:43.980 | because it's basically fighting against nonexistence.
01:14:47.100 | - Yeah, you tweeted,
01:14:49.620 | "The most significant struggle for existence
01:14:52.420 | "in the evolutionary process
01:14:53.720 | "is not among the objects that do exist,
01:14:56.380 | "but between the ones that do
01:14:58.500 | "and those that never have the chance to."
01:15:01.180 | This is where selection does most of its causal work.
01:15:04.760 | The objects that never get a chance to exist,
01:15:10.860 | the struggle between the ones that never get a chance
01:15:12.620 | to exist and the ones that, okay, what's that line exactly?
01:15:16.500 | - I don't know, we can make songs out of all of these.
01:15:18.380 | - What are the objects that never get a chance to exist?
01:15:20.740 | What does that mean?
01:15:21.580 | - So there was this website, I forgot what it was,
01:15:25.140 | but it's like a neural network
01:15:27.660 | that just generates a human face,
01:15:29.540 | and it's like this person does not exist.
01:15:31.020 | I think that's what it's called, right?
01:15:32.140 | So you can just click on that all day,
01:15:33.480 | and you can look at people all day that don't exist.
01:15:36.140 | All of those people exist
01:15:37.380 | in that space of things that don't exist.
01:15:40.460 | - Yeah, but there's the real struggle.
01:15:44.340 | - Yeah, so the struggle of the quote,
01:15:46.540 | the struggle for existence is,
01:15:48.660 | that goes all the way back to Darwin's writing
01:15:50.380 | about natural selection, right?
01:15:51.720 | So the whole idea of survival of the fittest
01:15:53.740 | is everything struggling to exist,
01:15:55.140 | this predator-prey dynamic.
01:15:57.020 | And the fittest survive,
01:15:59.620 | and so the struggle for existence
01:16:02.060 | is really what selection is all about.
01:16:04.040 | And that's true.
01:16:07.540 | We do see things that do exist
01:16:10.740 | competing to continue to exist.
01:16:12.900 | But each time that,
01:16:15.860 | like if you think about this space of possibilities,
01:16:18.700 | and each time the universe generates a new structure,
01:16:23.420 | or like an object that exists
01:16:26.500 | generates a new structure along this causal chain,
01:16:29.620 | it's generating something that exists
01:16:32.820 | that never existed before.
01:16:34.540 | And each time that we make that kind of decision,
01:16:37.620 | we're excluding a huge space of possibilities.
01:16:40.260 | And so actually like as this process
01:16:42.220 | of increasing assembly index,
01:16:44.160 | it's not just that like the space
01:16:46.060 | that these objects exist in is exponentially growing,
01:16:49.000 | but there are objects in that space
01:16:51.940 | that are exponentially receding away from us.
01:16:54.980 | So they're becoming exponentially less
01:16:56.900 | and less likely to ever exist.
01:16:59.140 | And so existence excludes a huge number of things.
01:17:03.540 | - Just because of the accident of history,
01:17:06.180 | how it ended up--
01:17:07.300 | - Yeah, it is in part an accident,
01:17:09.620 | because I think some of the structure that gets generated
01:17:12.860 | is driven a bit by randomness.
01:17:16.940 | I think a lot of it,
01:17:18.260 | so one of the conceptions that we have in assembly theory
01:17:22.420 | is the universe is random at its base.
01:17:24.500 | You can see this in chemistry,
01:17:25.660 | like unconstrained chemical reactions are pretty random.
01:17:28.460 | And then, and also quantum mechanics,
01:17:33.220 | there's lots of places that give evidence for that.
01:17:36.260 | And deterministic structures emerge
01:17:38.380 | by things that can causally reinforce themselves
01:17:41.140 | and maintain persistence over time.
01:17:43.820 | And so we are some of the most deterministic things
01:17:47.540 | in the universe.
01:17:48.380 | And so like we can generate very regular structure
01:17:51.700 | and we can generate new structure
01:17:53.740 | along a particular lineage,
01:17:55.640 | but the possibility space at the sort of tips,
01:17:58.260 | like the things we can generate next is really huge.
01:18:01.200 | So there's some stochasticity in what we actually,
01:18:04.380 | you know, instantiate as like the next structures
01:18:08.460 | that get built in the biosphere.
01:18:10.460 | It's not completely deterministic,
01:18:13.420 | 'cause the space of future possibilities
01:18:15.100 | is always larger than the space of things that exist now.
01:18:17.780 | - So how many instantiations of life is out there,
01:18:21.660 | do you think?
01:18:22.480 | So how often does this happen?
01:18:26.660 | What we see happen here on Earth,
01:18:28.820 | how often is this process repeated
01:18:30.760 | throughout our galaxy, throughout the universe?
01:18:32.960 | - So I said before, like right now,
01:18:35.040 | I think the origin of life is a continuous process on Earth.
01:18:37.560 | Like I think this idea of like combinatorial spaces
01:18:40.840 | that our biosphere generates, not just chemistry,
01:18:42.960 | but other spaces, often cross this threshold
01:18:46.680 | where they then allow themselves to persist
01:18:49.800 | with particular regular structure over time.
01:18:51.500 | So language is another one where, you know,
01:18:53.820 | like the space of, you know, possible configurations
01:18:57.680 | of the 26 letters of the English alphabet
01:18:59.820 | is astronomically large,
01:19:01.320 | but we use with very high regularity certain structures.
01:19:04.080 | And then we associate meaning to them
01:19:07.520 | because of the regularity of like how much we use them,
01:19:10.000 | right, so meaning is an emergent property
01:19:11.660 | of the causation and the objects
01:19:13.420 | and like how often they recur
01:19:14.840 | and what the relationship of the recurrence is
01:19:17.080 | to other objects.
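The gap between the combinatorial space of letters and the few structures we actually reuse is easy to see empirically. Here is a small sketch that uses one of Walker's own sentences from this conversation as the sample text: of all possible three-letter combinations, only a tiny sliver ever occurs, and a handful recur with high copy number.

```python
from collections import Counter

# One sentence from the conversation, with spaces stripped, as sample text.
text = (
    "meaning is an emergent property of the causation and the objects "
    "and how often they recur and what the relationship of the "
    "recurrence is to other objects"
).replace(" ", "")

trigrams = Counter(text[i:i + 3] for i in range(len(text) - 2))
print(f"{len(trigrams)} distinct trigrams observed of {26 ** 3} possible")
print(trigrams.most_common(3))  # a few structures carry high copy numbers
```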
01:19:18.220 | - Meaning is the emergent property, okay, got it.
01:19:20.360 | - Well, this is why you can play
01:19:21.320 | with language so much, actually.
01:19:23.080 | So words don't really carry meaning,
01:19:24.640 | it's just about how you lace them together.
01:19:27.320 | - Yeah, but from where does the--
01:19:29.200 | - But you don't have a lot of room.
01:19:31.240 | Obviously, as a speaker of a given language,
01:19:33.440 | you don't have a lot of room with a given word to wiggle,
01:19:35.760 | but you do have a certain amount of room
01:19:40.000 | to push the meanings of words.
01:19:42.300 | And I do this all the time and you have to do it
01:19:46.440 | with the kind of work that I do
01:19:48.160 | because if you want to discover an abstraction,
01:19:53.160 | like some kind of concept that we don't understand yet,
01:19:56.960 | it means we don't have the language.
01:19:58.920 | And so the words that we have are inadequate
01:20:01.920 | to describe the things.
01:20:02.840 | This is why we're having a hard time talking
01:20:04.040 | about assembly theory
01:20:04.880 | 'cause it's a newly emerging idea.
01:20:07.040 | And so I'm constantly playing with words in different ways
01:20:12.280 | to try to convey the meaning
01:20:13.720 | that is actually behind the words, but it's hard to do.
01:20:18.200 | - So you have to wiggle within the constraints.
01:20:20.320 | - Yes, lots of wiggle.
01:20:21.920 | - The great orators are just good at wiggling.
01:20:27.440 | Do you wiggle?
01:20:28.260 | - I'm not a very good wiggler, no.
01:20:32.280 | This is the problem.
01:20:33.360 | This is part of the problem.
01:20:34.560 | - No, I like playing with words a lot.
01:20:37.160 | You know, it's very funny
01:20:38.080 | 'cause I know you talked about this with Lee,
01:20:40.640 | but people are so offended by the writing of the paper
01:20:43.600 | that came out last fall.
01:20:45.280 | And it was interesting because the ways that we use words
01:20:48.680 | were not the way that people were interacting with the words.
01:20:52.080 | And I think that was part of the mismatch
01:20:54.200 | where we were trying to use words in a new way
01:20:56.400 | 'cause we were trying to describe something
01:20:58.480 | that hadn't been described adequately before,
01:21:02.360 | but we had to use the words that everyone else uses
01:21:04.560 | for things that are related.
01:21:06.160 | And so it was really interesting to watch that clash play out
01:21:09.360 | in real time for me,
01:21:10.840 | being someone that tries to be so precise with my word usage,
01:21:14.120 | knowing that it's always gonna be vague.
01:21:17.080 | - Boy, can I relate.
01:21:18.480 | What is truth?
01:21:21.920 | Is truth the thing you meant when you wrote the words
01:21:24.760 | or is truth the thing that people understood
01:21:27.440 | when they read the words?
01:21:28.640 | - Oh, yeah.
01:21:29.480 | - I think that compression mechanism into language
01:21:32.200 | is a really interesting one
01:21:33.200 | and that's why Twitter is a nice exercise.
01:21:36.200 | - I love Twitter.
01:21:37.240 | - You get to write a thing
01:21:39.080 | and you think a certain thing when you write it.
01:21:42.880 | And then you get to see all these other people interpret it
01:21:45.360 | in all kinds of different ways.
01:21:46.760 | - I use it as an experimental platform for that reason.
01:21:49.680 | - I wish there was a higher diversity
01:21:51.920 | of interpretation mechanisms applied to tweets,
01:21:56.000 | meaning like all kinds of different people would come to it.
01:22:00.680 | Like some people that see the good in everything
01:22:03.200 | and some people that are ultra cynical,
01:22:05.320 | a bunch of haters and a bunch of lovers and a bunch of-
01:22:07.680 | - Maybe they could do better jobs
01:22:08.880 | with presenting material to people.
01:22:11.120 | It's usually based on interest,
01:22:15.520 | but I think it would be really nice
01:22:16.600 | if you got like 10% of your Twitter feed
01:22:19.040 | was random stuff sampled from other places.
01:22:21.560 | That'd be kind of fun.
01:22:23.160 | - I also would love to filter,
01:22:25.040 | just like bin the response to tweets
01:22:29.480 | by like the people that hate on everything.
01:22:32.880 | - Yes.
01:22:33.720 | - The people that are-
01:22:34.560 | - Oh, that would be fantastic.
01:22:35.640 | - The people that are like super positive on everything.
01:22:38.400 | And then they'll just kind of,
01:22:40.880 | I guess, normalize the response.
01:22:43.000 | 'Cause then it'd be cool to see
01:22:44.400 | if the people that are usually positive about everything
01:22:46.440 | are hating on you or like totally don't understand
01:22:49.440 | or completely misunderstood.
01:22:51.040 | - Yeah, usually it takes a lot of clicking to find that out.
01:22:54.440 | - Yeah.
01:22:55.280 | - Yeah, so it'd be better if it was sorted, yeah.
01:22:56.520 | - The more clicking you do,
01:22:57.960 | the more damaging it is to the soul.
01:23:01.200 | - Yeah.
01:23:02.040 | It's like, instead of like,
01:23:02.860 | like you could have the blue check,
01:23:03.700 | but you should have like,
01:23:04.540 | are you a pessimist, an optimist?
01:23:06.480 | - Yeah, there's a lot of colors.
01:23:07.300 | - Chaotic neutral?
01:23:09.140 | - Yeah, a whole rainbow of checks.
01:23:13.180 | And then you realize there's more categories
01:23:15.520 | than we can possibly express in colors.
01:23:17.400 | - Yeah, of course.
01:23:19.720 | People are complex.
01:23:20.960 | - That's our best feature.
01:23:24.800 | I don't know how we got to the wiggling required,
01:23:29.160 | given the constraints of language,
01:23:31.560 | because I think we started with me asking about alien life,
01:23:35.320 | which is how many different times
01:23:41.600 | did the phase transition happen elsewhere?
01:23:45.920 | Do you think there's other alien civilizations out there?
01:23:48.920 | This goes into like the,
01:23:50.600 | are you on the boundary of insane or not?
01:23:53.240 | But when you think about the structure
01:23:55.600 | of the physics of what we are that deeply,
01:23:57.980 | it really changes your conception of things.
01:23:59.800 | And going to this idea of the universe
01:24:04.800 | being kind of small in physical space
01:24:10.240 | compared to how big it is in time and like how large we are,
01:24:13.760 | it really makes me question
01:24:15.280 | about whether there's any other structure
01:24:18.240 | that's like this giant crystal in time,
01:24:20.560 | this giant causal structure,
01:24:22.000 | like our biosphere/technosphere
01:24:24.800 | is anywhere else in the universe.
01:24:28.740 | - Why not?
01:24:29.660 | - I don't know.
01:24:31.800 | - Just because this one is gigantic
01:24:33.600 | doesn't mean there's other gigantic ones.
01:24:36.360 | - But I think the universe is expanding, right?
01:24:38.920 | It's expanding in space,
01:24:40.040 | but in assembly theory, it's also expanding in time.
01:24:42.640 | And actually that's driving the expansion in space.
01:24:46.240 | And the expansion in time is also driving the expansion
01:24:50.360 | in the sort of combinatorial space of things on our planet.
01:24:53.920 | So that's driving the sort of pace of technology
01:24:57.000 | and all the other things.
01:24:57.880 | So time is driving all of these things,
01:25:00.080 | which is a little bit crazy to think
01:25:01.720 | that the universe is just getting bigger
01:25:03.240 | because time is getting bigger.
01:25:04.980 | But like the sort of visual that gets built in my brain
01:25:10.400 | about that is like the structure
01:25:12.500 | that we're building on this planet
01:25:13.780 | is packing more and more time
01:25:16.400 | in this very small volume of space, right?
01:25:18.840 | 'Cause our planet hasn't changed its physical size
01:25:21.080 | in four billion years,
01:25:22.000 | but there's like a ton of causation and recursion and time,
01:25:27.000 | whatever word you wanna use, information packed into this.
01:25:31.000 | And I think this is also embedded
01:25:34.680 | in sort of the virtualization of our technologies
01:25:38.200 | or the abstraction of language and all of these things.
01:25:41.160 | These things that seem really abstract
01:25:43.640 | are just really deep in time.
01:25:45.180 | And so what that looks like
01:25:49.420 | is you have a planet that becomes increasingly virtualized.
01:25:53.700 | And so it's getting bigger and bigger in time,
01:25:55.720 | but not really expanding out in space.
01:25:57.340 | And the rest of space is like kind of moving away from it.
01:25:59.860 | Again, it's a sort of exponentially receding horizon.
01:26:02.380 | And I'm just not sure how far
01:26:04.340 | into this evolutionary process something gets
01:26:07.740 | if it can ever see
01:26:08.680 | that there's another such structure out there.
01:26:10.900 | - What do you mean by virtualized in that context?
01:26:13.540 | - Virtual as sort of a play on virtual reality
01:26:16.520 | and like simulation theories,
01:26:18.440 | but virtual also in a sense of,
01:26:20.680 | we talk about virtual particles in particle physics,
01:26:25.740 | which are very critical to doing calculations
01:26:29.080 | about predicting the properties of real particles,
01:26:30.960 | but we don't observe them directly.
01:26:33.000 | So what I mean by virtual here is virtual reality for me,
01:26:38.000 | things that appear virtual, appear abstract
01:26:42.480 | are just things that are very deep in time
01:26:44.500 | in the structure of the things that we are.
01:26:48.120 | So if you think about you as a 4 billion year old object,
01:26:51.380 | the things that are part of you,
01:26:53.740 | like your capacity to use language or think abstractly
01:26:56.740 | or have mathematics are just very deep temporal structures.
01:27:01.420 | That's why they look like they're informational and abstract
01:27:05.140 | is because they're existing in this temporal part of you,
01:27:08.660 | but not necessarily spatial part.
01:27:10.180 | - Just because I have a 4 billion year old history,
01:27:12.060 | why does that mean I can't hang out with aliens?
01:27:15.100 | - There's a couple of ideas that are embedded here.
01:27:16.880 | So one of them comes again from Paul.
01:27:19.380 | He wrote this book years ago, "The Eerie Silence,"
01:27:24.340 | about why we're alone.
01:27:25.260 | And he concluded the book with this idea
01:27:27.300 | of quintelligence or something,
01:27:28.660 | but this idea that really advanced intelligence
01:27:32.380 | would basically just build itself into a quantum computer
01:27:36.220 | and it wouldn't wanna operate in the vacuum of space
01:27:39.020 | 'cause that's the best place to do quantum computation.
01:27:41.060 | It would just run out all of its computations indefinitely,
01:27:43.740 | but it would look completely dark to the rest of the universe
01:27:46.100 | and I don't think as typical,
01:27:48.220 | I don't think that's actually the right physics,
01:27:50.220 | but I think something about that idea
01:27:51.820 | as I do with all ideas is partially correct.
01:27:54.020 | And Freeman Dyson also had this amazing paper
01:27:56.780 | about how long life could persist in a universe
01:27:59.900 | that was exponentially expanding.
01:28:01.780 | And his conception was, if you imagine an analog life form,
01:28:05.620 | it could run slower and slower and slower
01:28:08.980 | and slower and slower as a function of time.
01:28:12.140 | And so it would be able to run indefinitely
01:28:15.260 | even against an exponentially expanding universe
01:28:17.980 | because it would just run exponentially slower.
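Dyson's argument (from his 1979 paper "Time Without End") can be caricatured numerically. The 1/t scalings below are my own illustrative choices, not Dyson's actual thermodynamics; the point is only that subjective time can grow without bound while the total energy spent stays finite.

```python
def run_forever(t_max: float, dt: float = 1.0, t0: float = 1.0):
    # Toy model: an analog mind whose clock rate slows as 1/t while the
    # energy cost per "thought" also falls as 1/t as the universe cools
    # (illustrative scalings, not Dyson's actual ones).
    subjective, energy, t = 0.0, 0.0, t0
    while t < t_max:
        rate = 1.0 / t              # thoughts per unit cosmic time
        cost = 1.0 / t              # energy per thought
        subjective += rate * dt     # grows like log(t): unbounded
        energy += rate * cost * dt  # grows like the sum of 1/t^2: bounded
        t += dt
    return subjective, energy

for t_max in (1e2, 1e4, 1e6):
    s, e = run_forever(t_max)
    print(f"t_max={t_max:.0e}  subjective={s:7.2f}  energy={e:.4f}")
# Subjective time keeps climbing; total energy converges to a finite value.
```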
01:28:20.340 | And so I guess part of what I'm doing in my brain
01:28:23.620 | is putting those two things together
01:28:25.340 | along with this idea that we are building,
01:28:28.960 | like if you imagine with our technology,
01:28:32.220 | we're now building virtual realities, right?
01:28:34.420 | Like things we actually call virtual reality,
01:28:37.140 | which required 4 billion years of history
01:28:40.060 | and a whole bunch of data to basically embed them
01:28:42.380 | in a computer architecture.
01:28:43.380 | So now you can put like an Oculus headset on
01:28:45.860 | and think that you're in this world, right?
01:28:47.740 | And what you really are embedded in
01:28:50.020 | is in a very deep temporal structure.
01:28:52.460 | And so it's huge in time, but it's very small in space.
01:28:56.080 | And you can go lots of places in the virtual space, right?
01:28:59.340 | But you're still stuck in like your physical body
01:29:01.620 | and like sitting in the chair.
01:29:03.220 | And so part of it is it might be the case
01:29:06.200 | that sufficiently evolved biospheres
01:29:10.900 | kind of virtualize themselves
01:29:13.420 | and they internalize their universe
01:29:15.020 | in their sort of temporal causal structure
01:29:17.420 | and they close themselves off from the rest of the universe.
01:29:19.860 | - I just don't know if a deep temporal structure
01:29:21.900 | necessarily means that you're closed off.
01:29:24.540 | - No, I don't either.
01:29:25.380 | So that's kind of my fear.
01:29:26.700 | So I'm not sure I'm agreeing with what I say.
01:29:29.940 | I'm just saying like, this is one sort of conclusion.
01:29:32.340 | And you know, like in my most sort of like,
01:29:34.380 | it's interesting 'cause I don't do psychedelic drugs.
01:29:37.980 | But when people describe to me
01:29:39.020 | like your thing with the faces and stuff,
01:29:40.500 | and like I have had a lot of deep conversations
01:29:42.660 | with friends that have done psychedelic drugs
01:29:44.540 | for intellectual reasons and otherwise.
01:29:47.060 | But I'm always like,
01:29:48.040 | oh, it sounds like you're just doing theoretical physics.
01:29:50.180 | Like that's what brains do on theoretical physics.
01:29:52.680 | So I live in these like really abstract spaces
01:29:56.460 | most of the time.
01:29:57.940 | But there's also this issue of extinction, right?
01:30:01.140 | Like extinction events are basically
01:30:03.020 | pinching off an entire like causal structure,
01:30:05.620 | the one of these, like,
01:30:06.660 | I'm gonna call them time crystals.
01:30:07.940 | I don't like know what,
01:30:08.780 | but there's like these very large objects in time,
01:30:10.920 | pinching off that whole structure from the rest of it.
01:30:13.300 | And so it's like,
01:30:14.180 | if you imagine that sort of same thing in the universe,
01:30:17.700 | I, you know, I once thought
01:30:19.060 | that sufficiently advanced technologies
01:30:20.900 | would look like black holes.
01:30:22.060 | - They would be just completely imperceptible to us.
01:30:23.780 | - Yeah.
01:30:24.620 | So there might be lots of aliens out there.
01:30:28.000 | Maybe that's the explanation for all the singularities.
01:30:30.460 | They're all pinched off causal structures
01:30:31.960 | that virtualize their reality and kind of broke off from us.
01:30:34.340 | - Black holes in every way.
01:30:36.100 | So like untouchable to us or unlikely to be detectable by us.
01:30:41.100 | - Right.
01:30:43.420 | - With whatever sensory mechanisms we have.
01:30:45.140 | - Yeah.
01:30:45.980 | But the other way I think about it is,
01:30:47.860 | there is probably hopefully life out there.
01:30:51.260 | So like I do work on life detection efforts
01:30:54.300 | in the solar system,
01:30:55.320 | and I'm trying to help
01:30:56.160 | with the Habitable Worlds Observatory
01:30:58.260 | mission planning right now,
01:31:00.480 | and working with like the biosignatures team for that,
01:31:02.360 | like to think about exoplanet biosignatures.
01:31:04.240 | So like I have some optimism that we might find things,
01:31:08.520 | but they're the challenges
01:31:11.680 | that we don't know the likelihood for life,
01:31:13.880 | like which is what you were talking about.
01:31:15.600 | So if I get to a more grounded discussion,
01:31:18.560 | what I'm really interested in doing
01:31:21.280 | is trying to solve the origin of life
01:31:24.360 | so we can understand how likely life is out there.
01:31:27.340 | So I don't think that the,
01:31:28.600 | I think that the problem of discovering alien life
01:31:32.240 | and solving the origin of life are deeply coupled,
01:31:35.020 | and in fact are one and the same problem.
01:31:37.160 | And that the first contact with alien life
01:31:40.400 | will actually be in an origin of life experiment.
01:31:43.280 | But that part I'm super interested in.
01:31:45.440 | And then there's this other feature
01:31:46.980 | that I think about a lot,
01:31:48.200 | which is our own technological phase of development
01:31:52.520 | as sort of like,
01:31:53.500 | what is this phase in the evolution of life on a planet?
01:31:58.000 | If you think about a biosphere emerging on a planet
01:32:00.680 | and evolving over billions of years
01:32:02.400 | and evolving into a technosphere.
01:32:04.320 | When a technosphere can move off planet
01:32:09.200 | and basically reproduce itself on another planet,
01:32:11.680 | now you have biospheres reproducing themselves.
01:32:16.040 | Basically, they have to go through technology to do that.
01:32:19.880 | And so there are ways of thinking about
01:32:23.160 | sort of the nature of intelligent life
01:32:25.600 | and how it spreads in that capacity
01:32:27.480 | that I'm also really excited about and thinking about.
01:32:30.240 | And all of those things for me are connected.
01:32:33.960 | We have to solve the origin of life
01:32:35.280 | in order for us to get off planet
01:32:37.640 | because we basically have to start life on another planet.
01:32:40.400 | And we also have to solve the origin of life
01:32:41.960 | in order to recognize other alien intelligence.
01:32:44.240 | All of these things are literally the same problem.
01:32:46.920 | - Right, understanding the origin of life here on Earth
01:32:49.600 | is a way to understand ourselves
01:32:51.440 | and understanding ourselves is a prerequisite
01:32:54.600 | for being able to detect other intelligent civilizations.
01:32:59.600 | I, for one, take it for what it's worth, on ayahuasca,
01:33:04.960 | one of the things I did is zoom out,
01:33:07.840 | like aggressively, like a spaceship.
01:33:11.320 | And it would always go quickly to the galaxy
01:33:14.200 | and from the galaxy to this representation of the universe.
01:33:19.200 | And at least for me, from that perspective,
01:33:22.600 | it seemed like it was full of alien life.
01:33:25.000 | Not just alien life, but intelligent life.
01:33:28.960 | - I like that.
01:33:29.800 | - And conscious life.
01:33:31.440 | So I don't know how to convert it into words.
01:33:34.880 | It's more like a feeling, like you were saying.
01:33:36.880 | A feeling converted to a visual, converted to words.
01:33:40.560 | So I had a visual with it, but really it was a feeling
01:33:44.720 | that it was just full of this vibrant energy
01:33:49.600 | that I was feeling when I'm looking
01:33:52.160 | at the people in my life and full of gratitude.
01:33:55.980 | But that same exact thing is everywhere in the universe.
01:33:59.920 | - Right.
01:34:01.280 | I totally agree with this.
01:34:02.440 | Like that visual, I really love.
01:34:04.280 | And I think we live in a universe
01:34:06.680 | that generates life and purpose.
01:34:10.280 | And it's part of the structure of just the world.
01:34:14.460 | And so maybe this sort of lonely view I have is,
01:34:19.280 | I never thought about it this way
01:34:20.120 | 'til you were describing that.
01:34:20.960 | I was like, I wanna live in that universe.
01:34:22.000 | And I'm a very optimistic person,
01:34:23.480 | and I love building visions of reality that are positive.
01:34:28.200 | But I think for me right now in the intellectual process,
01:34:30.760 | I have to tunnel through this particular way
01:34:33.600 | of thinking about the loneliness
01:34:36.440 | of being separated in time from everything else.
01:34:40.160 | Which I think we also all are,
01:34:41.880 | because time is what defines us as individuals.
01:34:44.640 | - So part of you is drawn to the trauma of being alone.
01:34:48.000 | - Yeah. - Deeper in the physics.
01:34:50.000 | - Yeah, but also part of what I mean
01:34:52.040 | is you have to go through ideas
01:34:54.280 | you don't necessarily agree with
01:34:56.760 | to work out what you're trying to understand.
01:34:59.320 | And I'm trying to be inside this structure
01:35:01.400 | so I can really understand it.
01:35:02.840 | And I don't think I've been able to,
01:35:04.800 | I'm so deeply embedded in what we are
01:35:07.720 | intellectually right now that I don't have an ability
01:35:10.600 | to see these other ones that you're describing,
01:35:14.140 | if they're there.
01:35:14.980 | - Well, one of the things you kind of described
01:35:16.600 | that you already spoke to,
01:35:17.920 | you call it the great perceptual filter.
01:35:20.340 | - Yeah.
01:35:21.180 | - So there's the famous great filter,
01:35:23.200 | which is basically the idea that
01:35:27.400 | there's some really powerful moment
01:35:30.540 | in every intelligent civilization
01:35:33.560 | where they destroy themselves.
01:35:35.080 | - Yeah.
01:35:37.000 | - That explains why we have not seen aliens.
01:35:39.600 | And you're saying that there's something like that
01:35:42.280 | in the temporal history of the creation of complex objects
01:35:45.760 | that at a certain point, they become an island,
01:35:48.880 | an island too far to reach based on the perceptions.
01:35:52.720 | - I hope not, but yeah, I worry about it, yeah.
01:35:55.480 | - But that's basically meaning
01:35:57.720 | there's something fundamental about the universe
01:35:59.880 | where if the more complex you become,
01:36:02.320 | the harder it will be to perceive other complex.
01:36:04.840 | - Yeah.
01:36:05.680 | I mean, just think about us with microbial life, right?
01:36:07.640 | Like we used to once be cells.
01:36:09.800 | And for most of human history,
01:36:11.360 | we didn't even recognize cellular life was there
01:36:13.040 | until we built a new technology,
01:36:14.800 | microscopes that allowed us to see them, right?
01:36:17.840 | So that's kind of, it's kind of weird, right?
01:36:20.280 | Like, like things that-
01:36:21.520 | - And they're close to us.
01:36:22.360 | - They're close, they're everywhere.
01:36:24.400 | - But also in the history of the development
01:36:26.520 | of complex objects, they're pretty close.
01:36:28.320 | - Yeah, super close, super close.
01:36:31.160 | Like, yeah, I mean, everything on this planet is like,
01:36:34.800 | it's like pretty much the same thing.
01:36:36.640 | Like, like the space of possibilities is so huge.
01:36:40.520 | It's like, we're virtually identical.
01:36:42.480 | So how many flavors or kinds of life
01:36:46.000 | do you think are possible?
01:36:47.640 | - I'm kind of like trying to imagine
01:36:49.000 | all the little flickering lights in the universe,
01:36:50.680 | like in the way that you were describing it,
01:36:52.120 | that was kind of cool.
01:36:52.960 | - It was so, I mean, it was awesome to me.
01:36:54.640 | It was exactly that, it was like lights.
01:36:57.160 | - Yeah.
01:36:58.000 | - The way you maybe see a city,
01:36:59.320 | but a city from like up above,
01:37:03.360 | you see a city with the flickering lights,
01:37:05.200 | but there's a coldness to the city.
01:37:07.200 | - Yeah.
01:37:08.040 | - There's some, you know, that, you know,
01:37:09.480 | humans are capable of good and evil,
01:37:11.040 | and you could see like, there's a complex feeling
01:37:13.280 | to the city.
01:37:14.120 | I had no such complex feeling about seeing the lights
01:37:18.840 | of all the galaxies, whatever, the billions of galaxies.
01:37:22.360 | - Yeah, this is kind of cool.
01:37:23.200 | I'll answer the question in a second,
01:37:24.160 | but I just maybe like this idea of flickering lights
01:37:26.520 | and intelligence is interesting to me because I, you know,
01:37:29.600 | like we have such a human centric view
01:37:31.920 | of alien intelligences that a lot of the work
01:37:34.960 | that I've been doing with my lab
01:37:35.960 | is just trying to take inspiration
01:37:38.400 | from non-human life on earth.
01:37:42.000 | And so I have this really talented undergrad student
01:37:45.480 | that's basically building a model of alien communication
01:37:49.340 | based on fireflies.
01:37:50.960 | So one of my colleagues, Orit Peleg,
01:37:53.560 | she's totally brilliant,
01:37:54.720 | but she goes out with like GoPro cameras
01:37:57.000 | and like, you know, films in high resolution,
01:37:58.920 | all these firefly flickering.
01:37:59.920 | And she has like this theory about how their signaling
01:38:02.880 | evolved to like maximally differentiate
01:38:05.400 | the flickering patterns.
01:38:08.080 | So like she has a theory basically that predicts,
01:38:11.120 | you know, like this species should flash like this.
01:38:13.640 | If this one's flashing like this,
01:38:14.720 | this other one's gonna do it at a slower rate
01:38:16.560 | so that the, you know, like they can distinguish
01:38:18.800 | each other living in the same environment.
01:38:21.000 | And so this undergrads building this model
01:38:22.760 | where you have like a pulsar background
01:38:24.760 | of all these like giant flashing sources in the universe
01:38:27.160 | and an alien intelligence, you know,
01:38:28.960 | wants to signal it's there.
01:38:30.280 | So it's flashing like a firefly.
01:38:31.980 | And I just like, I like the idea of thinking
01:38:34.880 | about a non-human alien.
01:38:36.720 | So that was really fun.
01:38:37.760 | - The mechanism of the flashing, unfortunately,
01:38:39.960 | is like the diversity of that is very high
01:38:42.400 | and we might not be able to see it.
01:38:43.640 | That's what--
01:38:44.480 | - Yeah, well, I think there's some ways
01:38:45.800 | we might be able to differentiate that signal.
01:38:47.520 | I'm still thinking about this part of it.
01:38:48.960 | So one is like if you have pulsars
01:38:52.000 | and they all have a certain spectrum
01:38:53.840 | to their pulsing patterns,
01:38:55.480 | and you have this one signal that's in there
01:38:57.600 | that's basically tried to maximally differentiate itself
01:39:00.160 | from all the other sources in the universe,
01:39:02.160 | it might stick out in the distribution.
01:39:03.800 | Like there might be ways of actually being able to tell
01:39:05.520 | if it's an anomalous pulsar, basically.
01:39:08.080 | But I don't know if that would really work or not.
01:39:10.720 | So still thinking about it.
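The anomaly-detection idea Walker describes here can be sketched as a toy model (a hypothetical illustration, not code from her lab): sample pulse periods for a population of natural flashing sources, add one engineered beacon whose period is chosen to sit far outside that distribution, and check whether a simple z-score test picks it out.

```python
import random
import statistics

def flag_anomalies(periods, z_threshold=3.0):
    """Return indices of periods that deviate strongly from the population."""
    mu = statistics.mean(periods)
    sigma = statistics.stdev(periods)
    return [i for i, p in enumerate(periods)
            if abs(p - mu) / sigma > z_threshold]

random.seed(0)
# Natural sources: pulse periods clustered around ~1 second.
natural = [random.gauss(1.0, 0.1) for _ in range(1000)]
# An engineered beacon "maximally differentiates" itself by flashing
# at a period far outside the natural distribution.
periods = natural + [5.0]

print(flag_anomalies(periods))
```

Whether this would work on real survey data is exactly the open question she raises: real pulsars span a wide range of periods and spectra, so an actual test would need a richer signature than a single outlying period.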
01:39:12.600 | - You tweeted, "If one wants to understand
01:39:14.840 | "how truly combinatorially and compositionally complex
01:39:17.720 | "our universe is, they only need a step
01:39:20.720 | "into the world of fashion."
01:39:22.520 | - Yeah.
01:39:23.360 | - It's bonkers how big the constructible space
01:39:27.560 | of human aesthetics is.
01:39:29.560 | Can you explain?
01:39:30.480 | Can we explore the space of human aesthetics?
01:39:34.080 | - Yeah, I don't know.
01:39:35.920 | I've been kind of obsessed with,
01:39:38.280 | I never know how to pronounce it, Schiaparelli.
01:39:40.360 | You know, like they have ears and things.
01:39:42.760 | Like it's such like a weird, grotesque aesthetic.
01:39:45.280 | But like it's totally bizarre.
01:39:48.680 | But what I meant, like I have a visceral experience
01:39:51.200 | when I walk into my closet.
01:39:52.640 | I have like a lot of (laughs)
01:39:55.040 | - How big is your closet?
01:39:56.160 | - It's pretty big.
01:39:57.180 | It's like I do assembly theory every morning
01:40:00.200 | when I walk in my closet,
01:40:01.160 | because I really like a very large,
01:40:04.600 | combinatorial, diverse palette,
01:40:06.080 | but I never know what I'm gonna build in the morning.
01:40:08.200 | - Do you get rid of stuff?
01:40:09.520 | - Sometimes.
01:40:10.840 | - Or do you have trouble getting rid of stuff?
01:40:12.640 | - I have trouble getting rid of some stuff.
01:40:14.240 | It depends on what it is.
01:40:15.280 | If it's vintage, it's hard to get rid of,
01:40:17.520 | 'cause it's kind of hard to replace.
01:40:19.420 | It depends on the piece, yeah.
01:40:22.160 | - So you have, your closet is one of those
01:40:25.160 | temporal time crystals that--
01:40:26.840 | - Yeah.
01:40:27.680 | - That just, you get to visualize the entire history.
01:40:29.920 | - It's a physical manifestation of my personality.
01:40:32.760 | - Right.
01:40:33.600 | So why is that a good visualization
01:40:37.360 | of the combinatorial and compositionally complex universe?
01:40:42.360 | - I think it's an interesting feature of our species
01:40:45.480 | that we allow, we get to express ourselves
01:40:47.560 | through what we wear.
01:40:48.880 | Right, like if you think about all those animals
01:40:51.160 | in the jungle you saw,
01:40:52.320 | like they're born looking the way they look,
01:40:54.360 | and then they're stuck with it for life.
01:40:55.800 | - That's true.
01:40:56.640 | I mean, it is one of the loudest, clearest,
01:40:59.480 | most consistent ways we signal to each other.
01:41:01.720 | - Yeah.
01:41:02.540 | - It's the clothing we wear.
01:41:03.380 | - Yeah.
01:41:04.720 | And it's highly dynamic.
01:41:06.040 | I mean, you can be dynamic if you want to.
01:41:07.920 | Very few people are, there's a certain bravery,
01:41:11.200 | but it's actually more about confidence,
01:41:13.200 | willing to play with style and play with aesthetics.
01:41:19.080 | And I think it's interesting
01:41:21.080 | when you start experimenting with it,
01:41:23.280 | how it changes the fluidity of the social spaces
01:41:26.200 | and the way that you interact with them.
01:41:27.720 | - But there's also commitment.
01:41:28.920 | Like, you have to wear that outfit all day.
01:41:31.700 | - I know, I know, it's a big commitment.
01:41:34.520 | Do you feel like that every morning?
01:41:35.840 | (laughing)
01:41:36.800 | - I wear it, that's why.
01:41:37.640 | - You're like, this is a life commitment.
01:41:39.140 | (laughing)
01:41:40.840 | - All I have is suits and a black shirt and jeans.
01:41:43.880 | Those are the two outfits.
01:41:45.120 | - Yeah.
01:41:45.960 | Well, see, this is the thing though, right?
01:41:47.240 | It simplifies your thought process in the morning.
01:41:49.400 | So like, I have other ways I do that.
01:41:50.840 | I park in the same exact parking spot
01:41:53.160 | when I go to work on the fourth floor of a parking garage
01:41:55.360 | because no one ever parks on the fourth floor.
01:41:56.880 | So I don't have to remember where I park my car.
01:41:59.960 | But I really like aesthetics and playing with them.
01:42:04.120 | So I'm willing to spend part of my cognitive energy
01:42:06.720 | every morning trying to figure out
01:42:07.880 | what I want to be that day.
01:42:09.160 | - Did you deliberately think about
01:42:10.720 | the outfit you're wearing today?
01:42:12.840 | - Yep.
01:42:13.880 | - Was there backup options?
01:42:15.080 | Were you going back and forth between some--
01:42:16.360 | - Three or four.
01:42:17.200 | But I really like the look.
01:42:18.960 | - Were they drastically different?
01:42:20.240 | - Yes.
01:42:21.080 | (laughing)
01:42:22.520 | - It's okay.
01:42:23.360 | - And even this one could have been really different
01:42:25.080 | because it's not just the sort of jacket and the shoes
01:42:28.960 | and the hairstyle.
01:42:30.860 | It's like the jewelry and the accessories.
01:42:32.640 | So any outfit is a lot of small decisions.
01:42:37.640 | - Well, I think your current outfit
01:42:39.200 | is like a lot of shades of yellow.
01:42:42.120 | There's like a theme.
01:42:43.760 | - Yeah.
01:42:44.600 | - It's nice.
01:42:45.440 | It's really, I'm grateful that you did that.
01:42:47.760 | - Thanks.
01:42:48.600 | - It's an art form.
01:42:49.920 | - Yeah, yellow's my daughter's favorite color.
01:42:52.000 | And I never really thought about yellow much,
01:42:53.700 | but she's been obsessed with yellow.
01:42:54.980 | She's seven now.
01:42:56.280 | And I don't know.
01:42:57.120 | I just really love it.
01:42:58.200 | - I guess you can pick a color
01:42:59.320 | and just make that the constraint.
01:43:01.800 | - Yeah.
01:43:02.640 | - And just go with it.
01:43:03.480 | - I'm playing with yellow a lot lately.
01:43:04.600 | Like this is not even the most yellow
01:43:05.960 | 'cause I black pants on,
01:43:06.960 | but I have worn outfits
01:43:10.280 | that have probably five shades of yellow in them.
01:43:12.280 | - Wow.
01:43:13.120 | (laughing)
01:43:14.560 | What do you think beauty is?
01:43:17.560 | We seem to, so underlying this idea
01:43:20.160 | of playing with aesthetics
01:43:21.180 | is we find certain things beautiful.
01:43:23.480 | - Yeah.
01:43:24.320 | - That humans find beautiful.
01:43:27.120 | And why do we need to find things beautiful?
01:43:30.080 | - Yeah, it's interesting.
01:43:32.200 | It's not, I'm not, I mean,
01:43:34.720 | I am attracted to style and aesthetics
01:43:38.000 | because I think they're beautiful,
01:43:39.120 | but it's much more because I think it's fun to play with.
01:43:42.680 | And so I will get to the beauty thing,
01:43:47.580 | but I guess I wanna just explain a little bit
01:43:50.080 | about my motivation in this space
01:43:51.480 | 'cause it's really an intellectual thing for me.
01:43:54.720 | And Stuart Brand has this great infographic
01:43:57.800 | about the layers of human society.
01:44:01.040 | And I think it starts with the natural sciences
01:44:03.560 | and physics at the bottom,
01:44:04.840 | and it goes through all these layers and it's economics.
01:44:06.920 | And then fashion is at the top,
01:44:08.360 | is the fastest moving part of human culture.
01:44:11.480 | And I think I really like that
01:44:13.840 | because it's so dynamic and so short
01:44:16.560 | and its temporal longevity
01:44:19.200 | contrasted with studying the laws of physics,
01:44:22.040 | which are like the deep structure of reality
01:44:24.320 | that I feel like bridging those scales
01:44:26.560 | tells me much more about the structure
01:44:29.320 | of the world that I live in.
01:44:31.160 | - That said, there's some kinds of fashions,
01:44:33.720 | like a dude in a black suit with a black tie
01:44:37.360 | seems to be less dynamic.
01:44:40.760 | - Yeah.
01:44:41.580 | - It seems to persist through time.
01:44:43.240 | - Are you embodying this?
01:44:44.720 | - Yeah, I think so.
01:44:45.760 | I think, I think it's just-
01:44:48.720 | - I'd like to see you wearing yellow, Lex.
01:44:50.880 | - I don't, I wouldn't even know what to do with myself.
01:44:54.000 | I would freak out.
01:44:55.440 | I wouldn't know how to act in the world.
01:44:56.280 | - You wouldn't know how to be you, yeah.
01:44:58.280 | I know this is amazing though, isn't it?
01:44:59.880 | Amazing, like you have the choice to do it.
01:45:02.120 | But one of my favorite, just on the question of beauty,
01:45:04.920 | one of my favorite fashion designers of all time
01:45:07.320 | is Alexander McQueen.
01:45:08.920 | And he was really phenomenal, but like his early,
01:45:13.160 | and actually I kind of used,
01:45:15.720 | like what happened to him in the fashion industry
01:45:17.440 | as a coping mechanism with our paper
01:45:18.920 | when like the Nature paper in the fall
01:45:22.000 | when everyone was saying it was controversial
01:45:23.640 | and how terrible that like, you know,
01:45:25.400 | but controversial is good, right?
01:45:26.520 | But like when Alexander McQueen, you know,
01:45:28.520 | first came out with his fashion lines,
01:45:30.080 | he was mixing horror and beauty.
01:45:32.440 | And people were horrified.
01:45:34.200 | It was so controversial.
01:45:35.640 | Like they, like it was macabre.
01:45:37.560 | He had like, you know, like it looked like
01:45:39.240 | there was blood on the models and like-
01:45:42.000 | - It was beautiful.
01:45:42.840 | We're just looking at some pictures here.
01:45:44.040 | - Yeah, no, I mean, his stuff is amazing.
01:45:47.880 | His first like runway line I think was called Nihilism.
01:45:52.480 | I don't know if you could find it.
01:45:55.080 | You know, I mean, he was really dramatic.
01:45:56.840 | He carried a lot of trauma with him.
01:45:59.680 | There you go, that's, yeah.
01:46:01.040 | Yeah. - Wow.
01:46:03.560 | - But he changed the fashion industry.
01:46:05.240 | His stuff became very popular.
01:46:07.720 | - That's a good outfit to show up to a party.
01:46:09.600 | - Right, right.
01:46:10.960 | But this gets at the question,
01:46:12.280 | like is that horrific or is it beautiful?
01:46:15.920 | And I think, you know, he had a traumatic,
01:46:18.720 | he ended up committing suicide
01:46:22.680 | and actually he left his death note on "The Descent of Man."
01:46:26.680 | So he was a really deep person.
01:46:29.400 | - So I mean, great fashion certainly
01:46:30.800 | has that kind of depth to it.
01:46:32.200 | - Yeah, it sure does.
01:46:33.640 | So I think it's the intellectual pursuit, right?
01:46:35.880 | Like it's not, so this is like very highly intellectual.
01:46:38.760 | And I think it's a lot like how I play with language
01:46:41.600 | is the same way that I play with fashion
01:46:43.200 | or the same way that I play with ideas
01:46:44.760 | in theoretical physics.
01:46:45.680 | Like there's always this space
01:46:47.440 | that you can just push things just enough.
01:46:49.840 | So they're like, they look like something
01:46:51.880 | someone thinks is familiar, but they're not familiar.
01:46:54.520 | And yeah, and I think that's really cool.
01:46:58.520 | - It seems like beauty doesn't have much function, right?
01:47:01.680 | But it seems to also have a lot of influence
01:47:06.680 | on the way we collaborate with each other.
01:47:09.760 | - It has tons of function.
01:47:10.600 | What do you mean it doesn't have function?
01:47:11.440 | - I guess sexual selection incorporates beauty somehow.
01:47:14.600 | - But why?
01:47:15.720 | Because beauty is a sign of health or something?
01:47:18.240 | I don't even.
01:47:19.640 | - Oh, evolutionarily, maybe.
01:47:21.800 | But then beauty becomes a signal of other things, right?
01:47:24.520 | So it's really not like,
01:47:25.960 | and then beauty becomes an adaptive trait.
01:47:28.400 | So it can change with different,
01:47:29.800 | like maybe some species would think,
01:47:32.560 | well, you thought the frog having babies
01:47:34.360 | come out of its back was beautiful
01:47:35.840 | and I thought it was grotesque.
01:47:37.400 | Like there's not a universal definition of what's beautiful.
01:47:40.560 | It is something that is dependent on your history
01:47:44.440 | and how you interact with the world.
01:47:46.520 | And I guess what I like about beauty,
01:47:49.680 | like any other concept, is when you turn it on its head.
01:47:52.240 | So maybe the traditional conception
01:47:57.120 | of why women wear makeup and they dress certain ways
01:48:02.120 | is because they wanna look beautiful
01:48:04.040 | and pleasing to people.
01:48:07.240 | And I just like to do it 'cause it's a confidence thing.
01:48:11.040 | It's about embodying the person that I want to be
01:48:15.920 | and about owning that person.
01:48:18.680 | And then the way that people interact with that person
01:48:21.160 | is very different than if I didn't have the,
01:48:23.200 | like if I wasn't using that attribute as part of,
01:48:26.720 | and obviously that's influenced by the society I live
01:48:29.960 | and like what's aesthetically pleasing things.
01:48:31.840 | But it's interesting to be able to turn that around
01:48:33.560 | and not have it necessarily be about the aesthetics,
01:48:36.000 | but about the power dynamics that the aesthetics create.
01:48:38.560 | - But you're saying there's some function to beauty
01:48:40.880 | in that way, in the way you're describing,
01:48:42.640 | in the dynamic it creates in the social interaction.
01:48:44.840 | - Well, the point is you're saying it's an adaptive trait
01:48:47.560 | for like sexual selection or something.
01:48:49.320 | And I'm saying that the adaptation that beauty confers
01:48:52.080 | is far richer than that.
01:48:53.960 | And some of the adaptation is about social hierarchy
01:48:57.320 | and social mobility and just plain social dynamics.
01:49:01.040 | Like why do some people dress goth?
01:49:03.400 | It's 'cause they identify with a community
01:49:05.360 | and a culture associated with that.
01:49:06.920 | And they get, you know, and that's a beautiful aesthetic.
01:49:10.120 | It's a different aesthetic.
01:49:11.360 | Some people don't like it.
01:49:13.120 | - So it has the same richness as does language.
01:49:16.160 | - Yes.
01:49:17.000 | - It's the same kind of--
01:49:17.880 | - Yes, and I think too few people think about
01:49:22.400 | the way that they, the aesthetics they build for themselves
01:49:26.000 | in the morning and how they carry it in the world
01:49:27.680 | and the way that other people interact with that
01:49:30.560 | because they put clothes on
01:49:32.520 | and they don't think about clothes as carrying function.
01:49:35.800 | - Let's jump from beauty to language.
01:49:37.920 | There's so many ways to explore the topic of language.
01:49:41.080 | You called it, you said that language is,
01:49:44.060 | parts of language or language in itself
01:49:47.080 | and the mechanism of language is a kind of living life form.
01:49:50.440 | You've tweeted a lot about this in all kinds of poetic ways.
01:49:55.480 | Let's talk about the computation aspect of it.
01:49:57.800 | You tweeted, "The world is not a computation,
01:50:01.640 | "but computation is our best current language
01:50:03.720 | "for understanding the world.
01:50:05.260 | "It is important we recognize this
01:50:07.120 | "so we can start to see the structure
01:50:09.060 | "of our future languages that will allow us
01:50:11.140 | "to see deeper than computation allows us."
01:50:14.720 | So what's the use of language in helping us understand
01:50:17.720 | and make sense of the world?
01:50:19.440 | - I think one thing that I feel like I notice
01:50:23.520 | much more viscerally than I feel like
01:50:25.520 | I hear other people describe
01:50:27.840 | is that the representations in our mind
01:50:32.040 | and the way that we use language
01:50:34.320 | are not the things like,
01:50:37.140 | actually, I mean, this is an important point
01:50:40.720 | going back to what Gödel did,
01:50:42.360 | but also this idea of signs and symbols
01:50:44.240 | and all kinds of ways of separating them.
01:50:45.800 | There's like the word, right?
01:50:48.200 | And then there's like what the word means about the world.
01:50:51.480 | And we often confuse those things.
01:50:54.480 | And what I feel very viscerally,
01:50:58.320 | I almost sometimes think I have some kind of like
01:50:59.960 | synesthesia for language or something
01:51:01.760 | and I just like don't interact with it
01:51:03.160 | like the way that other people do.
01:51:04.860 | But for me, words are objects
01:51:07.280 | and the objects are not the things that they describe.
01:51:09.400 | They have like a different ontology to them.
01:51:12.240 | Like they're physical things and they carry causation
01:51:15.800 | and they can create meaning,
01:51:18.180 | but they're not what we think they are.
01:51:23.180 | And also like the internal representations in our mind,
01:51:26.320 | like the things I'm seeing about this room are probably,
01:51:28.980 | you know, like they're a small projection
01:51:30.640 | of the things that are actually in this room.
01:51:32.640 | And I think we have such a difficult time moving past
01:51:36.600 | the way that we build representations in the mind
01:51:39.120 | and the way that we structure our language
01:51:40.680 | to realize that those are approximations
01:51:42.540 | to what's out there and they're fluid
01:51:43.940 | and we can play around with them
01:51:45.000 | and we can see deeper structure underneath them
01:51:47.400 | that I think like we're missing a lot.
01:51:51.000 | - Yeah, but also the life of the mind
01:51:52.520 | is in some ways richer than the physical reality.
01:51:56.080 | - Sure.
01:51:56.920 | - What's going on in your mind might be a projection
01:52:00.520 | - Right.
01:52:01.360 | - of what's here, but there's also all kinds
01:52:03.560 | of other stuff going on there.
01:52:04.760 | - Yeah, for sure.
01:52:05.880 | I love this essay by Poincaré
01:52:08.640 | about like mathematical creativity,
01:52:10.560 | where he talks about this sort of like frothing
01:52:12.160 | of all these things and then like somehow
01:52:13.680 | you build theorems on top of it
01:52:15.080 | and they become kind of concrete.
01:52:16.880 | But like, and I also think about this with language.
01:52:18.960 | It's like, there's a lot of stuff happening in your mind,
01:52:21.520 | but you have to compress it in this few sets of words
01:52:23.920 | to try to convey it to someone.
01:52:25.680 | So it's a compactification of the space.
01:52:30.300 | And it's not a very efficient one.
01:52:32.440 | And I think just recognizing
01:52:35.000 | that there's a lot that's happening behind language
01:52:37.440 | is really important.
01:52:38.280 | I think this is one of the great things
01:52:41.400 | about the existential trauma of large language models,
01:52:44.280 | I think, is the recognition
01:52:46.160 | that language is not the only thing required.
01:52:48.640 | Like there's something underneath it.
01:52:52.300 | Not by everybody.
01:52:53.260 | - Can you just speak to the feeling you have
01:52:59.160 | when you think about words?
01:53:01.020 | So is there, like, what's the magic of words to you?
01:53:03.560 | Is it, like, do you feel,
01:53:04.720 | it almost sometimes feels like you're playing with it.
01:53:09.280 | - Yeah, I was just gonna say, it's like a playground.
01:53:11.520 | - But you're almost like,
01:53:12.840 | I think one of the things you enjoy,
01:53:14.500 | maybe I'm projecting, is deviating,
01:53:17.800 | like using words in ways that not everyone uses them.
01:53:20.760 | Like slightly sort of deviating from the norm a little bit.
01:53:25.400 | - I love doing that in everything I do,
01:53:26.900 | but especially with language.
01:53:28.680 | - But not so far that it doesn't make sense.
01:53:31.040 | - Exactly.
01:53:32.000 | - So you're always, like, tethered to reality, to the norm,
01:53:37.000 | but, like, are playing with it.
01:53:38.880 | Like basically fucking with people's minds a little bit.
01:53:42.080 | I mean, like, you know,
01:53:43.840 | and in so doing, creating a different perspective
01:53:47.280 | on the thing that's been previously explored
01:53:50.080 | in a different way.
01:53:51.400 | - Yeah, it's literally my favorite thing to do.
01:53:53.600 | - Yeah, use words as one way to make people think.
01:53:57.800 | - Yeah, so I, you know, a lot of my sort of,
01:54:01.440 | like, what happens in my mind when I'm thinking about ideas
01:54:05.920 | is I've been presented with this information
01:54:07.760 | about how people think about things.
01:54:09.840 | And I try to go around to different communities
01:54:12.800 | and hear the ways that different,
01:54:14.680 | whether it's like, you know,
01:54:16.240 | hanging out with a bunch of artists or philosophers
01:54:18.600 | or scientists thinking about things.
01:54:20.880 | Like they all think about it different ways.
01:54:22.600 | And then I just try to figure out, like,
01:54:24.600 | how do you take the structure of the way
01:54:27.520 | that we're talking about it and turn it slightly?
01:54:31.080 | So you have all the same pieces
01:54:34.000 | that everybody sees are there,
01:54:35.320 | but the description that you've come up with
01:54:37.040 | seems totally different.
01:54:38.200 | So they can understand that there's,
01:54:40.440 | like, they understand the pattern you're describing,
01:54:42.480 | but they never heard the structure underlying it
01:54:44.640 | described the way that you describe it.
01:54:47.120 | - Is there words or terms you remember
01:54:49.960 | that disturbed people the most,
01:54:54.960 | maybe the positive sense of disturbed?
01:54:58.000 | This assembly theory, I suppose, is one.
01:55:00.600 | - Yeah, I mean, the first couple sentences of that paper
01:55:03.800 | disturbed people a lot.
01:55:05.000 | And I think they were really carefully constructed
01:55:07.000 | in exactly this kind of way.
01:55:08.480 | - What was that?
01:55:09.320 | Let me look it up.
01:55:10.160 | - Oh, it was really fun.
01:55:11.000 | But I think it's interesting 'cause I do,
01:55:17.320 | you know, sometimes I'm very upfront about it.
01:55:19.000 | I say I'm gonna use the same word
01:55:20.640 | in probably six different ways in a lecture, and I will.
01:55:25.640 | - You're right.
01:55:26.680 | Scientists have grappled
01:55:27.680 | with reconciling biological evolution
01:55:29.720 | with immutable laws of the universe defined by physics.
01:55:33.080 | These laws underpin life's origin, evolution,
01:55:36.080 | and the--
01:55:36.920 | - He came up with this with me when he was here, too.
01:55:38.640 | - The development of human culture.
01:55:41.320 | Well, he was, I think your love for words
01:55:44.480 | runs deeper than these.
01:55:46.160 | - Yeah, for sure.
01:55:47.640 | I mean, this is part of the sort of brilliant thing
01:55:50.760 | about our collaboration is, you know,
01:55:55.040 | complementary skillsets.
01:55:56.720 | So I love playing with the abstract space of language,
01:56:00.640 | and it's a really interesting playground
01:56:04.640 | when I'm working with Lee
01:56:05.640 | because he thinks at a much deeper level of abstraction
01:56:10.280 | than can be expressed by language.
01:56:11.880 | And the ideas we work on
01:56:13.840 | are hard to talk about for that reason.
01:56:16.680 | - What do you think about computation as a language?
01:56:19.120 | - I think it's a very poor language.
01:56:20.760 | A lot of people think it's a really great one,
01:56:22.200 | but I think it has some nice properties.
01:56:25.200 | But I think the feature of it that, you know,
01:56:28.240 | is compelling is this kind of idea of universality
01:56:30.920 | that, like, you can, if you have a language,
01:56:34.880 | you can describe things in any other language.
01:56:37.640 | - Well, for me, one of the people who kind of revealed
01:56:40.320 | the expressive power of computation,
01:56:43.680 | aside from Alan Turing, is Stephen Wolfram,
01:56:46.160 | through all the explorations
01:56:47.360 | of, like, cellular automata type of objects
01:56:49.560 | that he did in "A New Kind of Science" and afterwards.
01:56:53.280 | So what do you get from that?
01:56:55.120 | The kind of computational worlds
01:57:00.680 | that are revealed through even something
01:57:02.320 | as simple as cellular automata.
01:57:04.560 | It seems like that's a really nice way
01:57:06.640 | to explore languages that are far outside
01:57:11.160 | our human languages, and do so rigorously,
01:57:14.880 | and understand how those kinds of complex systems
01:57:19.880 | can interact with each other, can emerge,
01:57:22.480 | all that kind of stuff.
01:57:23.640 | - I don't think that they're outside our human languages.
01:57:28.360 | I think they define the boundary
01:57:29.960 | of the space of human languages.
01:57:32.440 | - They allow us to explore things within that space,
01:57:34.760 | which is also fantastic.
01:57:36.160 | But I think there is a set of ideas that takes,
01:57:38.560 | and Stephen Wolfram has worked on this quite a lot,
01:57:42.480 | and contributed very significantly to it.
01:57:45.000 | And, you know, I really like some of the stuff
01:57:47.800 | that Stephen's doing with, like, his physics project,
01:57:50.560 | but don't agree with a lot of the foundations of it.
01:57:52.640 | But I think the space is really fun that he's exploring.
01:57:55.600 | You know, there's this assumption
01:57:58.400 | that computation is at the base of reality.
01:58:01.160 | And I kind of see it at the top of reality,
01:58:04.800 | not at the base, because I think computation
01:58:07.520 | was built by our biosphere.
01:58:08.800 | It's something that happened
01:58:10.280 | after many billion years of evolution.
01:58:13.200 | And it doesn't happen in every physical object.
01:58:16.600 | It only happens in some of them.
01:58:18.400 | And I think one of the reasons
01:58:20.160 | that we feel like the universe is computational
01:58:24.440 | is because it's so easy for us,
01:58:27.200 | as things that have the theory of computation in our minds,
01:58:32.200 | and actually, in some sense,
01:58:34.640 | it might be related to the functioning of our minds,
01:58:36.880 | and how we build languages to describe the world,
01:58:40.040 | and sets of relations to describe the world.
01:58:42.240 | But it's easy for us to go out into the world
01:58:47.840 | and build computers.
01:58:49.680 | And then we mistake our ability to do that
01:58:52.520 | with assuming that the world is computational.
01:58:55.040 | And I'll give you a really simple example.
01:58:57.480 | This one came from John Conway.
01:58:59.120 | I one time had a conversation with him,
01:59:01.640 | which was really delightful.
01:59:03.480 | He was really fun.
01:59:04.760 | But he was pointing out that if you, you know,
01:59:08.760 | string lights in a barn, you know,
01:59:11.920 | you can program them to have your favorite
01:59:14.960 | one-dimensional CA,
01:59:16.840 | and you might even be able to make them, you know,
01:59:18.600 | do a, like, be capable of universal computation.
01:59:21.680 | Is universal computation a feature of the string lights?
01:59:26.000 | - Well, no.
01:59:27.000 | - No, it's probably not.
01:59:28.600 | It's a feature of the fact that you, as a programmer,
01:59:31.280 | had a theory that you could embed
01:59:33.400 | in the physical architecture of the string lights.
01:59:35.560 | Now, what happens, though,
01:59:37.120 | is we get confused by this kind of distinction
01:59:39.320 | between us as agents in the world
01:59:41.480 | that actually can transfer things that life does
01:59:44.040 | onto other physical substrates with what the world is.
01:59:47.640 | And so, for example, you'll see people, you know,
01:59:51.200 | doing, studying the mathematics
01:59:52.640 | of chemical reaction networks and saying,
01:59:54.600 | "Well, chemistry is Turing universal,"
01:59:57.200 | or studying the laws of physics and saying,
01:59:59.080 | "The laws of physics are Turing universal."
02:00:01.200 | But anytime that you wanna do that,
02:00:03.000 | you always have to prepare an initial state.
02:00:05.120 | You have to, you know, you have to constrain the rule space,
02:00:08.480 | and then you have to actually be able to demonstrate
02:00:10.840 | the properties of computation.
02:00:12.720 | And all of that requires an agent or a designer
02:00:15.080 | to be able to do that.
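Conway's string-lights example can be made concrete with a short sketch: a one-dimensional cellular automaton of the kind one could program into a strip of lights, here using Rule 110 (the classic elementary rule known to be capable of universal computation). The grid width, wrap-around boundary, and step count are illustrative choices, not anything specified in the conversation:

```python
# A minimal sketch of a one-dimensional cellular automaton ("1D CA"),
# the kind of program one could load onto a strip of string lights.
# Rule 110 is the classic elementary rule proven capable of universal
# computation; width, wrap-around edges, and step count are arbitrary.

def step(cells, rule=110):
    """Advance one generation: each cell's next state is looked up from
    the rule number using the states of its left neighbor, itself, and
    its right neighbor (edges wrap around)."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=15, rule=110):
    """Light a single middle cell, then print each generation as a row
    of on/off 'lights'."""
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)
    return cells
```

The point made in the conversation survives the demo: the computation lives in the rule and the prepared initial state the programmer chose, not in the lights themselves.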
02:00:16.160 | - But it gives you an intuition.
02:00:19.600 | If you look at a 1D or 2D cellular automata,
02:00:22.600 | it gives you, it allows you to build an intuition
02:00:25.720 | of how you can have complexity emerge
02:00:28.240 | from very simple beginnings,
02:00:30.200 | very simple initial conditions.
02:00:31.360 | - I think that's the intuition
02:00:32.880 | that people have derived from it.
02:00:34.240 | The intuition I get from cellular automata
02:00:37.960 | is that the flat space of an initial condition
02:00:40.360 | in a fixed dynamical law is not rich enough
02:00:42.440 | to describe an open-ended generation process.
02:00:45.040 | And so the way I see cellular automata
02:00:47.280 | is they're embedded slices
02:00:48.560 | in a much larger causal structure.
02:00:50.080 | And if you wanna look at a deterministic slice
02:00:52.040 | of that causal structure,
02:00:52.960 | you might be able to extract a set of consistent rules
02:00:55.720 | that you might call a cellular automata,
02:00:57.400 | but you could embed them in a much larger space
02:00:59.920 | that's not dynamical and is about the causal structure
02:01:03.080 | relations between all of those computations.
02:01:05.600 | And that would be the space cellular automata live in.
02:01:08.560 | And I think that's the space that Stephen is talking about
02:01:12.080 | when he talks about his Ruliad
02:01:13.360 | and these hypergraphs of all these possible computations.
02:01:16.480 | But I wouldn't take that as my base reality
02:01:18.880 | because I think, again, computation itself,
02:01:21.280 | this abstract property computation,
02:01:23.440 | is not at the base of reality.
02:01:25.280 | - So can we just linger on that Ruliad?
02:01:28.320 | - Yeah.
02:01:29.600 | One Ruliad to rule them all.
02:01:31.920 | - Yeah, so this is part of Wolfram's physics project.
02:01:36.920 | It's what he calls the entangled limit
02:01:39.320 | of everything that is computationally possible.
02:01:42.040 | So what's your problem with the Ruliad?
02:01:45.600 | - Well, it's interesting.
02:01:47.280 | So Stephen came to a workshop we had in the Beyond Center
02:01:50.400 | in the fall, and the workshop theme was mathematics.
02:01:53.600 | Is it evolved or eternal?
02:01:55.080 | And he gave a talk about the Ruliad.
02:01:56.960 | And he was talking about how a lot of the things
02:02:00.360 | that we talk about in the Beyond Center,
02:02:01.760 | like does reality have a bottom?
02:02:04.200 | If it has a bottom, what is it?
02:02:05.800 | - I need to go to--
02:02:09.920 | - We'll have you to one sometime.
02:02:11.600 | - This is great.
02:02:12.440 | Does reality have a bottom?
02:02:14.720 | - Yeah, so we had one that was,
02:02:16.560 | it was called "Infinite Turtles or Ground Truth."
02:02:19.880 | And it was really just about this issue.
02:02:23.000 | But the thing that was interesting,
02:02:24.560 | I think Stephen was trying to make the argument
02:02:27.400 | that fundamental particles aren't fundamental,
02:02:30.040 | gravitation is not fundamental.
02:02:31.760 | These are just turtles, and computation is fundamental.
02:02:37.480 | And I just, I remember pointing out to him,
02:02:39.520 | I was like, well, computation is your turtle.
02:02:41.920 | And I think it's a weird turtle to have.
02:02:45.600 | - First of all, isn't it okay to have a turtle?
02:02:47.320 | - It's totally fine to have a turtle.
02:02:49.120 | Everyone has a turtle.
02:02:50.520 | You can't build a theory without a turtle.
02:02:52.960 | It's just, so it depends on the problem you wanna describe.
02:02:55.880 | And I actually, the reason I can't get behind
02:02:59.240 | Stephen's ontology is I don't know
02:03:01.680 | what question he's trying to answer.
02:03:03.760 | And without a question to answer,
02:03:05.160 | I don't understand why you're building a theory of reality.
02:03:07.520 | - And the question you're trying to answer is--
02:03:10.360 | - What life is.
02:03:11.480 | - What life is, which, another simpler way of phrasing this,
02:03:15.720 | how did life originate?
02:03:17.520 | - Well, I started working on the origin of life.
02:03:19.440 | And I think what my challenge was there
02:03:22.880 | was no one knew what life was.
02:03:24.520 | And so you can't really talk about the origination
02:03:26.400 | of something if you don't know what it is.
02:03:28.480 | And so the way I would approach it is
02:03:31.600 | if you wanna understand what life is,
02:03:34.080 | then proving that physics amounts to solving the origin of life.
02:03:37.960 | So there's the theory of what life is,
02:03:40.360 | but there's the actual demonstration
02:03:42.000 | that that theory is an accurate description
02:03:43.920 | of the phenomena you aim to describe.
02:03:45.440 | So again, they're the same problem.
02:03:47.720 | It's not like I can decouple origin of life
02:03:49.600 | from what life is.
02:03:50.440 | It's like that is the problem.
02:03:53.480 | And the point I guess I'm making about having a question
02:03:57.760 | is no matter what slice of reality you take,
02:04:00.880 | what regularity of nature you're gonna try to describe,
02:04:04.040 | there will be an abstraction
02:04:06.920 | that unifies that structure of reality, hopefully.
02:04:11.280 | And that will have a fundamental layer to it, right?
02:04:16.280 | 'Cause you have to explain something
02:04:18.760 | in terms of something else.
02:04:20.480 | But so if I wanna explain life, for example,
02:04:23.160 | then my fundamental description of nature
02:04:24.880 | has to be something I think
02:04:26.200 | that has to do with time being fundamental.
02:04:28.840 | But if I wanted to describe,
02:04:30.680 | I don't know, the sort of interactions of matter and light,
02:04:37.480 | you know, I have elementary particles be fundamental.
02:04:40.360 | If I wanna describe electricity and magnetism in the 1800s,
02:04:43.720 | I have to have waves be fundamental, right?
02:04:47.340 | So like you, or in quantum mechanics,
02:04:50.120 | like it's a wave function that's fundamental
02:04:52.240 | 'cause that's the sort of explanatory paradigm
02:04:54.960 | of your theory.
02:04:55.800 | So I guess I don't know what problem
02:05:02.920 | saying computation is fundamental solves.
02:05:06.300 | - Doesn't he want to understand how does the basic,
02:05:11.360 | quantum mechanics and general relativity emerge?
02:05:13.800 | - Yeah, but that's-- - And how does time--
02:05:16.120 | - Right, so I think-- - But then that doesn't
02:05:17.560 | really answer an important question for us.
02:05:19.040 | - Well, I think that the issue is general relativity
02:05:22.640 | and quantum mechanics are expressed
02:05:23.920 | in mathematical languages.
02:05:26.040 | And then computation is a mathematical language.
02:05:29.240 | So you're basically saying that maybe there's
02:05:31.080 | a more universal mathematical language
02:05:32.920 | for describing theories of physics that we already know.
02:05:35.240 | That's an important question.
02:05:36.440 | And I do think that's what Stephen's trying to do
02:05:38.120 | and do well.
02:05:39.640 | But then the question becomes,
02:05:41.640 | does that formulation of a more universal language
02:05:45.660 | for describing the laws of physics that we know now
02:05:48.800 | tell us anything new about the nature of reality?
02:05:52.040 | Or is it a language?
02:05:53.620 | - And to you, languages are fundamental, can be fundamental?
02:05:57.920 | - The language itself is never the fundamental thing.
02:06:02.040 | It's whatever it's describing.
02:06:03.920 | - So one of the possible titles you were thinking about
02:06:06.440 | originally for the book is the hard problem of life.
02:06:10.880 | Sort of reminiscent of the hard problem of consciousness.
02:06:13.720 | So you're saying that assembly theory
02:06:15.960 | is supposed to be answering the question
02:06:18.560 | about what is life.
02:06:20.040 | So let's go to the other hard problems.
02:06:22.160 | You also say that the easiest of the hard problems
02:06:25.400 | is the hard problem of life.
02:06:27.920 | So what do you think is the nature of intelligence
02:06:34.800 | and consciousness?
02:06:36.000 | Do you think something like assembly theory
02:06:41.240 | can help us understand that?
02:06:46.160 | - I think if assembly theory is an accurate depiction
02:06:51.160 | of the physics of life,
02:06:54.760 | it should shed a lot of light on those problems.
02:06:58.360 | And in fact, I sometimes wonder
02:07:00.080 | if the problems of consciousness and intelligence
02:07:02.000 | are at all different than the problem of life generally.
02:07:05.620 | And I'm of two minds of it,
02:07:10.280 | but I in general try to,
02:07:13.740 | the process of my thinking
02:07:15.900 | is trying to regularize everything into one theory.
02:07:18.260 | So pretty much every interaction I have is like,
02:07:20.980 | oh, how do I fold that into?
02:07:22.700 | And so I'm just building this giant abstraction
02:07:24.900 | that's basically trying to take every piece of data
02:07:27.220 | I've ever gotten in my brain into a theory of what life is.
02:07:31.280 | And consciousness and intelligence
02:07:34.740 | are obviously some of the most interesting things
02:07:36.820 | that life has manifest.
02:07:38.160 | And so I think they're very telling
02:07:40.820 | about some of the deeper features
02:07:43.620 | about the nature of life.
02:07:45.060 | - It does seem like they're all flavors of the same thing,
02:07:49.160 | but it's interesting to wonder at which stage
02:07:52.340 | there's something that we would recognize as life
02:07:55.780 | in a sort of canonical silly human way
02:07:58.780 | and something that we would recognize as intelligence.
02:08:02.660 | At which stage does that emerge?
02:08:04.140 | At which assembly index does that emerge
02:08:06.260 | and at which assembly index is it consciousness?
02:08:08.940 | Something that we would canonically recognize
02:08:11.820 | as consciousness.
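The assembly index being asked about here can be illustrated with a toy. For strings (assembly theory is actually formulated for molecules and measured experimentally, so this is only a sketch of the concept, not the published method), the assembly index is the minimum number of join operations needed to build an object from its basic parts, where any already-built piece can be reused:

```python
from itertools import product

def assembly_index(target):
    """Toy assembly index for a string: the minimum number of join
    (concatenation) steps needed to build `target` from its individual
    characters, where previously built pieces may be reused.
    Brute-force breadth-first search; practical only for short strings."""
    pool0 = frozenset(target)              # basic building blocks: the characters
    if target in pool0:
        return 0
    frontier, seen, depth = {pool0}, {pool0}, 0
    while frontier:
        depth += 1
        next_frontier = set()
        for pool in frontier:
            for a, b in product(pool, repeat=2):
                joined = a + b
                if joined not in target:   # a useful piece must be a substring
                    continue
                if joined == target:
                    return depth
                grown = pool | {joined}
                if grown not in seen:
                    seen.add(grown)
                    next_frontier.add(grown)
        frontier = next_frontier
    return None
```

Reuse is what the measure rewards: `"abab"` has index 2 (build `"ab"` once, then join it to itself), while the same-length `"abcd"`, with no reusable parts, costs 3 joins. Objects with lots of repeated substructure, the kind selection tends to produce, come out with low indices relative to their size.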
02:08:12.660 | Is this use of flavors the same as you meant
02:08:15.540 | when you were talking about flavors of alien life?
02:08:18.700 | - Yeah, sure, yeah.
02:08:20.580 | I mean, it's the same as the flavors of ice cream
02:08:22.940 | and the flavors of fashion.
02:08:24.700 | - Yeah, but we were talking about in terms of colors
02:08:26.960 | and very nondescript.
02:08:28.260 | But the way that you just talked about flavors now
02:08:30.020 | was more in the space of consciousness and intelligence.
02:08:32.480 | It was kind of much more specific.
02:08:34.540 | - It'd be nice if there was a formal way of expressing.
02:08:38.620 | - Quantifying flavors.
02:08:39.740 | - Quantifying flavors.
02:08:41.500 | - It seems like I would order it
02:08:46.260 | life, consciousness, intelligence probably
02:08:49.140 | as like the order in which things emerge
02:08:52.100 | and they're all just the same.
02:08:54.000 | We're using the word life differently here.
02:08:59.220 | I mean, life sort of when I'm talking about
02:09:01.780 | what is a living versus non-living thing
02:09:04.220 | at a bar with a person, I'm already like
02:09:06.220 | four or five drinks in, that kind of thing.
02:09:09.540 | - Just that. (laughs)
02:09:11.460 | - Like we're not being too philosophical.
02:09:13.420 | Like there's a thing that moves
02:09:14.740 | and here's a thing that doesn't move.
02:09:16.540 | But maybe consciousness precedes that.
02:09:21.500 | It's a weird dance there.
02:09:23.940 | Does life precede consciousness, or does consciousness precede life?
02:09:29.040 | And I think that understanding of what life is
02:09:33.820 | in the way you're doing will help us disentangle that.
02:09:37.500 | - Depending on what you wanna explain,
02:09:38.940 | as I was saying before,
02:09:39.780 | you have to assume something's fundamental.
02:09:42.020 | And so because people can't explain consciousness,
02:09:45.300 | there's a temptation for some people
02:09:46.940 | to wanna take consciousness as fundamental
02:09:48.780 | and assume everything else is derived out of that.
02:09:50.820 | And then you get some people
02:09:52.460 | that wanna assume consciousness preceded life.
02:09:56.300 | And I don't find either of those views
02:09:58.980 | particularly illuminating.
02:10:00.640 | I think, 'cause I don't wanna assume a phenomenology
02:10:05.140 | before I explain a thing.
02:10:06.820 | And so what I've tried really hard to do
02:10:08.500 | is not assume that I think life is anything,
02:10:13.300 | except hold on to sort of the patterns and structures
02:10:16.420 | that seem to be the sort of consistent ways
02:10:18.500 | that we talk about this thing
02:10:19.660 | and then try to build a physics that describes that.
02:10:22.500 | And I think that's a really different approach
02:10:24.340 | than saying consciousness is this thing
02:10:27.940 | we all feel and experience about things.
02:10:30.900 | I would wanna understand the regularities
02:10:33.620 | associated with that and build a deeper structure
02:10:35.620 | underneath that and build into it.
02:10:37.340 | I wouldn't wanna assume that thing
02:10:39.340 | and that I understand that thing,
02:10:40.740 | which is usually how I see people talk about it.
02:10:43.000 | - The difference between life and consciousness.
02:10:46.220 | - Yeah. - Which comes first.
02:10:48.180 | - Yeah, so I think if you're thinking about
02:10:51.740 | this sort of thinking about living things
02:10:55.300 | as these giant causal structures
02:10:57.820 | or these objects that are deep in time
02:10:59.420 | or whatever language we end up using to describe it,
02:11:04.700 | it seems to me that consciousness is about
02:11:09.700 | the fact that we have a conscious experience
02:11:13.540 | is because we are these temporally extended objects.
02:11:16.500 | So consciousness and the abstraction that we have
02:11:19.260 | in our minds is actually a manifestation
02:11:21.380 | of all the time that's rolled up in us.
02:11:22.940 | And it's just because we're so huge
02:11:24.900 | that we have this very large inner space
02:11:27.220 | that we're experiencing that's not,
02:11:29.100 | and it's also separated off from the rest of the world
02:11:31.300 | because we're the separate thread in time.
02:11:33.620 | And so our consciousness is not exactly shared
02:11:36.140 | with anything else because nothing else occupies
02:11:38.620 | the same part of time that we occupy.
02:11:42.220 | But I can understand something about you maybe being
02:11:45.700 | conscious because you and I didn't separate
02:11:48.420 | that far in the past in terms of our causal histories.
02:11:52.860 | So in some sense, we can even share experiences
02:11:55.180 | with each other through language
02:11:57.020 | because of that sort of overlap in our structure.
02:12:00.540 | - Well then if consciousness is merely temporal separateness,
02:12:05.540 | then that comes before life.
02:12:07.780 | - It's not merely temporal separateness.
02:12:09.860 | It's about the depth in that time.
02:12:11.940 | So it's-- - Yes.
02:12:12.940 | - The reason that my conscious experience
02:12:14.740 | is not the same as yours is because we're separated in time.
02:12:17.260 | The fact that I have a conscious experience
02:12:19.020 | is because I'm an object that's super deep in time.
02:12:21.300 | So I'm huge in time.
02:12:23.780 | And that means that there's a lot,
02:12:25.780 | that I am basically, in some sense,
02:12:28.020 | a universe onto myself because my structure
02:12:30.340 | is so large relative to the amount of space that I occupy.
02:12:33.300 | - But it feels like that's possible to do
02:12:37.220 | before you get anything like bacteria.
02:12:40.340 | - I think there's a horizon,
02:12:42.340 | and I don't know how to articulate this yet.
02:12:44.100 | It's a little bit like the horizon at the origin of life
02:12:46.340 | where the space inside a particular structure
02:12:50.180 | becomes so large that it has some access
02:12:52.820 | to a space that's not, that doesn't feel as physical.
02:12:57.780 | It's almost like this idea of counterfactuals.
02:13:00.100 | So I think the past history of your horizon
02:13:05.820 | is just much larger than can be encompassed
02:13:08.820 | in a small configuration of matter.
02:13:11.260 | So you can pull this stuff into existence.
02:13:14.380 | This property is maybe a continuous property,
02:13:17.220 | but there's something really different
02:13:18.820 | about human-level physical systems
02:13:23.340 | and human-level ability to understand reality.
02:13:27.620 | I really love David Deutsch's conception
02:13:29.660 | of universal explainers,
02:13:31.060 | and that's related to the theory of universal computation.
02:13:34.780 | And I think there's some transition that happens there.
02:13:40.260 | But maybe to describe that a little bit better,
02:13:44.020 | what I can also say is what intelligence is
02:13:46.340 | in this framework.
02:13:47.260 | So you have these objects that are large in time.
02:13:52.140 | They were selected to exist by constraining
02:13:54.980 | the possible space of objects to this particular,
02:13:58.540 | like all of the matter is funneled
02:14:00.060 | into this particular configuration of object over time.
02:14:04.580 | And so these objects arise through selection,
02:14:07.220 | but the more selection that you have embedded in you,
02:14:09.860 | the more possible selection you have on your future.
02:14:12.460 | And so selection and evolution,
02:14:16.780 | we usually think about in the past sense
02:14:18.900 | where selection happened in the past,
02:14:22.140 | but objects that are high-density configurations of matter
02:14:26.900 | that have a lot of selection in them
02:14:28.580 | are also selecting agents in the universe.
02:14:31.220 | So they actually embody the physics of selection
02:14:33.380 | and they can select on possible futures.
02:14:35.500 | And I guess what I'm saying with respect to consciousness
02:14:38.660 | and the experience we have
02:14:40.660 | is that there's something very deep about that structure
02:14:43.460 | and the nature of how we exist in that structure
02:14:46.100 | that has to do with how we're navigating that space
02:14:50.380 | and how we generate that space
02:14:52.060 | and how we continue to persist in that space.
02:14:54.420 | - Is there shortcuts we can take
02:14:57.860 | to artificially engineering living organisms,
02:15:02.080 | artificial life, artificial consciousness,
02:15:04.620 | artificial intelligence?
02:15:06.500 | So maybe just looking pragmatically
02:15:09.700 | at the LLMs we have now,
02:15:13.600 | do you think those can exhibit qualities of life,
02:15:18.780 | qualities of consciousness,
02:15:20.900 | qualities of intelligence
02:15:22.340 | in the way we think of intelligence?
02:15:24.140 | - I mean, I think they already do,
02:15:25.460 | but not in the way I hear popularly discussed.
02:15:27.860 | So there are obviously signatures of intelligence
02:15:31.340 | and a part of a ecosystem of intelligent systems,
02:15:36.340 | but I don't know that individually,
02:15:40.540 | I would assign all the properties to them that people have.
02:15:44.900 | It's a little like,
02:15:46.540 | so we talked about the history of eyes before
02:15:48.900 | and like how I scaled up into technological forms
02:15:52.580 | and language has also had a really interesting history
02:15:55.680 | and got much more interesting,
02:15:57.340 | I think once we started writing it down
02:15:59.660 | and then in inventing books and things,
02:16:02.540 | but like every time that we started
02:16:05.580 | storing language in a new way,
02:16:08.280 | we were kind of existentially traumatized by it.
02:16:11.980 | So like the idea of written language was traumatic
02:16:14.740 | because it seemed like the dead were speaking to us
02:16:17.360 | even though they were deceased and books were traumatic
02:16:19.580 | because like suddenly there were lots of copies
02:16:23.760 | of this information available to everyone
02:16:25.660 | and it was gonna somehow dilute it.
02:16:28.300 | And large language models are kind of interesting
02:16:30.940 | because they don't feel as static, they're very dynamic.
02:16:33.940 | But if you think about language
02:16:35.420 | in the way I was describing before,
02:16:36.700 | is language is this very large in time structure
02:16:39.620 | and before it had been something that was distributed
02:16:42.780 | over human brains as a dynamic structure
02:16:45.620 | and occasionally we store components
02:16:47.600 | of that very large dynamic structure
02:16:50.660 | in books or in written language.
02:16:52.860 | Now we can actually store the dynamics of that structure
02:16:56.620 | in a physical artifact, which is a large language model.
02:16:59.940 | And so I think about it almost like the evolution of genomes
02:17:03.780 | in some sense where there might've been
02:17:05.980 | like really primitive genes in the first living things
02:17:08.380 | and they didn't store a lot of information
02:17:10.260 | or they were like really messy.
02:17:11.840 | And then we, like by the time you get to the eukaryote cell
02:17:14.580 | you have this really dynamic genetic architecture
02:17:16.660 | that's rewritable, right?
02:17:18.260 | And like, and it has all of these different properties.
02:17:20.540 | And I think large language models
02:17:22.740 | are kind of like the genetic system for language
02:17:25.980 | in some sense where it's allowing
02:17:29.600 | and a sort of archiving that's highly dynamic.
02:17:32.900 | And I think it's very paradoxical to us
02:17:35.060 | because obviously in human history
02:17:37.100 | we haven't been used to conversing
02:17:39.040 | with anything that's not human.
02:17:41.740 | But now we can converse basically
02:17:45.580 | with a crystallization of human language in a computer
02:17:50.580 | that's a highly dynamic crystal
02:17:52.300 | because it's a crystallization in time
02:17:54.080 | of this massive abstract structure
02:17:56.300 | that's evolved over human history
02:17:58.380 | and is now put into a small device.
02:18:01.180 | - I think crystallization kind of implies
02:18:03.420 | a limit on its capabilities.
02:18:07.460 | - I think there's not, I mean it very purposefully
02:18:10.640 | because a particular instantiation of a language model
02:18:13.360 | trained on a particular data set
02:18:14.900 | becomes a crystal of the language at that time it was trained
02:18:17.380 | but obviously we're iterating with the technology
02:18:19.360 | and evolving it.
02:18:20.460 | - I guess the question is when you crystallize it,
02:18:23.060 | when you compress it, when you archive it,
02:18:25.500 | you're archiving some slice of the collective intelligence
02:18:30.220 | of the human species.
02:18:31.380 | - That's right.
02:18:32.220 | - And the question is like, how powerful is that?
02:18:36.220 | - Right, it's a societal level technology, right?
02:18:38.500 | We've actually put collective intelligence in a box.
02:18:40.720 | - Yeah, I mean how much smarter is the collective intelligence
02:18:44.960 | of humans versus a single human?
02:18:47.560 | And that's the question of AGI
02:18:50.880 | versus human level intelligence.
02:18:54.000 | Superhuman level intelligence
02:18:55.400 | versus human level intelligence.
02:18:57.580 | Like how much smarter can this thing when done well,
02:19:00.280 | when we solve a lot of the computation complexities,
02:19:04.720 | maybe there's some data complexities
02:19:07.240 | and how to really archive this thing,
02:19:09.400 | crystallize this thing really well,
02:19:11.800 | how powerful is this thing gonna be?
02:19:13.240 | Like what's your--
02:19:14.080 | - I think, I actually, I don't like the sort of language
02:19:19.040 | we use around that
02:19:19.880 | and I think the language really matters.
02:19:22.440 | So I don't know how to talk about
02:19:23.920 | how much smarter one human is than another, right?
02:19:27.300 | Like usually we talk about abilities
02:19:31.520 | or particular talents someone has.
02:19:34.880 | And going back to David Deutsch's idea
02:19:40.000 | of universal explainers,
02:19:41.380 | adopting the view that we're the first kinds of structures
02:19:47.400 | our biosphere has built
02:19:49.040 | that can understand the rest of reality.
02:19:51.760 | We have this universal comprehension capability.
02:19:54.260 | He makes an argument that basically we're the first things
02:20:00.160 | that actually are capable of understanding anything.
02:20:02.480 | It doesn't matter,
02:20:03.320 | it doesn't mean an individual understands everything,
02:20:05.920 | but like we have that capability.
02:20:07.860 | And so there's not a difference between that
02:20:10.040 | and what people talk about with AGI.
02:20:11.960 | In some sense, AGI is a universal explainer.
02:20:15.560 | But it might be that a computer is much more efficient
02:20:19.800 | at doing, I don't know, prime factorization or something
02:20:24.560 | than a human is,
02:20:25.880 | but it doesn't mean that it's necessarily smarter
02:20:30.560 | or has a broader reach of the kind of things
02:20:32.720 | that can understand than a human does.
02:20:35.280 | And so I think we really have to think about,
02:20:37.960 | is it a level shift or is it we're enhancing
02:20:41.720 | certain kinds of capabilities humans have
02:20:44.280 | in the same way that we can enhance eyesight
02:20:46.320 | by making telescopes and microscopes?
02:20:48.720 | Are we enhancing capabilities we have into technologies
02:20:52.240 | and the entire global ecosystem
02:20:54.280 | is getting more intelligent?
02:20:55.880 | Or is it really that we're building some super machine
02:20:58.280 | in a box that's gonna be smart and kill everybody?
02:21:00.560 | Like, that sounds like a science,
02:21:01.920 | like it's not even a science fiction narrative,
02:21:03.960 | it's a bad science fiction narrative.
02:21:05.720 | I just don't think it's actually accurate
02:21:07.680 | to any of the technologies we're building
02:21:09.160 | or the way that we should be describing them.
02:21:10.640 | It's not even how we should be describing ourselves.
02:21:12.840 | - So the benevolent stories is a benevolent system
02:21:16.360 | that's able to transform our economy, our way of life
02:21:20.980 | by just 10X-ing the GDP of countries--
02:21:25.660 | - Well, these are human questions, right?
02:21:27.540 | I don't think they're necessarily questions
02:21:29.940 | that we're gonna outsource to an artificial intelligence.
02:21:33.280 | I think what is happening and will continue to happen
02:21:36.100 | is there's a co-evolution between humans and technology
02:21:40.220 | that's happening.
02:21:41.100 | And we're coexisting in this ecosystem right now
02:21:44.780 | and we're maintaining a lot of the balance.
02:21:46.700 | And for the balance to shift to the technology
02:21:49.740 | would require some very bad human actors,
02:21:52.220 | which is a real risk, or some sort of,
02:21:56.580 | I don't know, some sort of dynamic that favors,
02:22:03.540 | like, I just don't know how that plays out
02:22:05.700 | without human agency actually trying to put it
02:22:08.540 | in that direction.
02:22:09.580 | - It could also be how rapid the rate--
02:22:12.300 | - The rapid rate is scary.
02:22:13.720 | So like, I think the things that are terrifying
02:22:16.980 | are the ideas of deep fakes,
02:22:19.940 | or all the kinds of issues that become legal issues
02:22:24.940 | about artificial intelligence technologies,
02:22:30.140 | and using them to control weapons,
02:22:32.780 | or using them for child pornography,
02:22:37.780 | or faking out that someone's loved one
02:22:42.900 | was kidnapped or killed and ran--
02:22:45.940 | There's all kinds of things that are super scary
02:22:48.060 | in this landscape,
02:22:48.900 | and all kinds of new legislation needs to be built,
02:22:50.820 | and all kinds of guardrails on the technology
02:22:54.700 | to make sure that people don't abuse it need to be built.
02:22:56.980 | And that needs to happen.
02:22:58.980 | And I think one function of sort of
02:23:01.180 | the artificial intelligence doomsday
02:23:04.980 | sort of part of our culture right now
02:23:09.060 | is it's sort of our immune response
02:23:11.100 | to knowing that's coming.
02:23:13.220 | And we're overscaring ourselves,
02:23:15.540 | so we try to act more quickly, which is good.
02:23:17.780 | But I just, it's about the words that we use
02:23:23.140 | versus the actual things happening behind the words.
02:23:26.380 | I think one thing that's good
02:23:27.860 | is when people are talking about things different ways,
02:23:29.620 | it makes us think about them.
02:23:31.340 | And also when things are existentially threatening,
02:23:34.940 | we wanna pay attention to those.
02:23:36.700 | But the ways that they're existentially threatening
02:23:38.620 | and the ways that we're experiencing existential trauma,
02:23:41.180 | I don't think that we're really gonna understand
02:23:42.700 | for another century or two, if ever.
02:23:44.700 | And I certainly think they're not the way
02:23:47.140 | that we're describing them now.
02:23:48.700 | - Well, creating existential trauma
02:23:52.700 | is one of the things that makes life fun, I guess.
02:23:55.420 | - Yeah, it's just what we do to ourselves.
02:23:57.780 | - It gives us really exciting big problems to solve.
02:24:00.940 | - Yeah, for sure.
02:24:01.780 | - Do you think we will see these AI systems become conscious
02:24:06.780 | or convince us that they're conscious?
02:24:09.660 | And then maybe we'll have relationships with them,
02:24:12.460 | romantic relationships?
02:24:14.580 | - Well, I think people are gonna have
02:24:15.660 | romantic relationships with them.
02:24:17.180 | And I also think that some people
02:24:19.860 | will be convinced already that they're conscious.
02:24:21.780 | But I think in order, what does it take to convince
02:24:25.780 | people that something is conscious?
02:24:31.660 | I think that we actually have to have an idea
02:24:33.820 | of what we're talking about.
02:24:35.540 | Like we have to have a theory that explains
02:24:38.740 | when things are conscious or not that's testable, right?
02:24:41.500 | And we don't have one right now.
02:24:42.980 | So I think until we have that,
02:24:44.900 | it's always gonna be this sort of gray area
02:24:46.940 | where some people think it has it
02:24:47.780 | and some people think it doesn't,
02:24:49.380 | because we don't actually know what we're talking about
02:24:51.420 | that we think it has.
02:24:52.420 | - So do you think it's possible to get out of the gray area
02:24:54.980 | and really have a formal test for consciousness?
02:24:57.500 | - For sure.
02:24:58.340 | - And for life as you were--
02:24:59.980 | - For sure.
02:25:00.860 | - As we've been talking about for some of the--
02:25:02.540 | - Yeah.
02:25:03.380 | - Consciousness is a tricky one.
02:25:04.940 | - It is a tricky one.
02:25:05.820 | I mean, that's why it's called
02:25:06.660 | the hard problem of consciousness, because it's hard.
02:25:09.340 | And it might even be outside of the purview of science,
02:25:12.540 | which means that we can't understand it
02:25:14.420 | in a scientific way.
02:25:15.460 | There might be other ways of coming to understand it,
02:25:17.180 | but those may not be the ones that we necessarily want
02:25:20.580 | for technological utility
02:25:23.260 | or for developing laws with respect to,
02:25:25.740 | because the laws are the things
02:25:28.420 | that are gonna govern the technology.
02:25:30.780 | - Well, I think that's actually
02:25:31.860 | where the hard problem of consciousness,
02:25:35.660 | a different hard problem of consciousness,
02:25:37.540 | is that I fear that humans will resist.
02:25:42.540 | That's the last thing they will resist,
02:25:46.620 | is calling something else conscious.
02:25:48.500 | - Oh, that's interesting.
02:25:49.700 | I think it depends on the culture, though,
02:25:51.100 | because, I mean, some cultures already think
02:25:52.780 | like everything's imbued with a life essence
02:25:57.220 | or kind of conscious.
02:25:58.540 | - I don't think those cultures have nuclear weapons.
02:26:00.740 | - No, they don't.
02:26:01.580 | And they're probably not building
02:26:02.420 | the most advanced technologies.
02:26:04.340 | - The cultures that are primed
02:26:05.860 | for destroying the other,
02:26:08.060 | constructing very effective propaganda machines
02:26:11.980 | of what the other is,
02:26:13.220 | the group to hate,
02:26:15.940 | are the cultures that I worry would--
02:26:19.780 | - Yeah, I know.
02:26:20.620 | - Would be very resistant to label something,
02:26:25.620 | to sort of acknowledge the consciousness
02:26:29.660 | laden in a thing that was created by us humans.
02:26:32.660 | - And so what do you think the risks are there
02:26:35.020 | that the conscious things will get angry with us
02:26:37.540 | and fight back?
02:26:39.020 | - No, that we would torture and kill conscious beings.
02:26:42.500 | - Oh, yeah.
02:26:44.820 | I think we do that quite a lot.
02:26:47.060 | Anyway, without, I mean, I don't,
02:26:50.180 | I mean, it goes back to your,
02:26:52.260 | and I don't know how to feel about this,
02:26:54.260 | but, you know, like we talked already
02:26:55.820 | about the predator-prey thing,
02:26:57.060 | that like, in some sense, you know,
02:27:00.420 | being alive requires eating other things that are alive.
02:27:03.180 | And even if you're a vegetarian,
02:27:04.740 | or, you know, like try to have like,
02:27:06.300 | you're like, you're still eating living things.
02:27:09.340 | - So maybe part of the story of Earth
02:27:11.500 | will involve a predator-prey dynamic
02:27:14.900 | between humans and human creations.
02:27:19.820 | - Yeah.
02:27:20.660 | - And all of that is part of the--
02:27:21.500 | - But I don't like thinking about them as like,
02:27:24.180 | our technologies as a separate species,
02:27:26.140 | 'cause this again goes back
02:27:27.300 | to this sort of levels of selection issue.
02:27:29.820 | And, you know, if you think about humans,
02:27:32.780 | individually alive,
02:27:34.380 | you miss the fact that societies are also alive.
02:27:37.340 | And so I think about it much more in the sense of,
02:27:41.900 | an ecosystem's not the right word,
02:27:44.780 | but we don't have the right words for these things of like,
02:27:47.620 | and this is why I talk about the technosphere,
02:27:49.380 | it's a system that is both human and technological.
02:27:52.780 | It's not human or technological.
02:27:55.340 | And so this is the part that I think we're really good for,
02:28:02.300 | like, and this is driving in part a lot
02:28:04.460 | of the sort of attitude of like,
02:28:06.860 | I'll kill you first with my nuclear weapons.
02:28:10.860 | We're really good at identifying things as other.
02:28:13.860 | We're not really good at understanding when we're the same
02:28:16.660 | or when we're part of an integrated system
02:28:18.460 | that's actually functioning together
02:28:19.820 | in some kind of cohesive way.
02:28:21.380 | So even if you look at like, you know,
02:28:23.180 | the division in American politics or something, for example,
02:28:26.020 | it's important that there's multiple sides
02:28:27.900 | that are arguing with each other
02:28:29.180 | because that's actually how you resolve society's issues.
02:28:31.660 | It's not like a bad feature.
02:28:33.020 | I think like some of the sort of extreme positions
02:28:35.740 | and like the way people talk about it are maybe not ideal,
02:28:39.340 | but that's how societies solve problems.
02:28:42.860 | What it looks like for an individual
02:28:44.140 | is really different than the societal level outcomes
02:28:46.420 | and the fact that like, there is,
02:28:49.140 | I don't wanna call it cognition or computation,
02:28:51.180 | I don't know what you call it,
02:28:52.380 | but like there is a process playing out
02:28:54.180 | in the dynamics of societies
02:28:55.780 | that we are all individual actors in,
02:28:58.180 | and like, we're all part of that.
02:29:00.620 | It requires all of us acting individually,
02:29:02.780 | but like this higher level structure
02:29:05.060 | is playing out some things
02:29:06.660 | and like things are getting solved
02:29:07.900 | for it to be able to maintain itself.
02:29:09.740 | And that's the level that our technologies live at.
02:29:13.180 | They don't live at our level.
02:29:14.500 | They live at the societal level
02:29:15.900 | and they're deeply integrated with the social organism,
02:29:18.700 | if you wanna call it that.
02:29:20.060 | And so I really get upset
02:29:23.220 | when people talk about the species
02:29:25.420 | of artificial intelligence.
02:29:27.060 | I'm like, you mean we live in an ecosystem
02:29:29.420 | of all these kind of intelligent things
02:29:31.100 | and these animating technologies
02:29:32.500 | that we're in some sense helping to come alive.
02:29:36.220 | We are generating them,
02:29:37.900 | but it's not like the biosphere
02:29:39.300 | eliminated all of its past history
02:29:41.060 | when it invented a new species.
02:29:42.900 | All of these things get scaffolded
02:29:44.900 | and we're also augmenting ourselves
02:29:46.660 | at the same time that we're building technologies.
02:29:48.340 | I don't think we can anticipate
02:29:49.620 | what that system's gonna look like.
02:29:51.020 | - So in some fundamental way,
02:29:52.380 | you always wanna be thinking about
02:29:54.140 | the planet as one organism.
02:29:55.980 | - The planet is one living thing.
02:29:58.060 | What happens when it becomes multi-planetary?
02:30:00.100 | Is it still just the--
02:30:01.860 | - Still the same causal chain.
02:30:03.580 | - Same causal chain.
02:30:04.540 | - It's like when the first cell split into two.
02:30:06.420 | That's what I was talking about.
02:30:07.620 | When a planet reproduces itself,
02:30:09.460 | the technosphere emerges enough understanding.
02:30:12.580 | It's like this recursive,
02:30:14.220 | like the entire history of life is just recursion, right?
02:30:17.260 | So you have an original life event.
02:30:18.980 | It evolves for four billion years at least on our planet.
02:30:21.820 | It evolves a technosphere.
02:30:23.380 | The technologies themselves start having
02:30:27.300 | this property we call life,
02:30:29.100 | which is the phase we're undergoing now.
02:30:31.060 | It solves the origin of itself
02:30:34.300 | and then it figures out how that process all works,
02:30:37.060 | understands how to make more life
02:30:38.460 | and then can copy itself onto another planet
02:30:40.420 | so the whole structure can reproduce itself.
02:30:42.620 | And so the original life is happening again right now
02:30:46.780 | on this planet in the technosphere
02:30:49.100 | with the way that our planet
02:30:50.380 | is undergoing another transition,
02:30:52.100 | just like at the original life
02:30:53.180 | when geochemistry transitioned to biology,
02:30:55.540 | which is the global,
02:30:56.700 | for me it was a planetary scale transition.
02:30:58.500 | It was a multi-scale thing
02:30:59.860 | that happened from the scale of chemistry
02:31:01.340 | all the way to planetary cycles.
02:31:02.980 | It's happening now all the way from individual humans
02:31:06.900 | to the internet, which is a global technology
02:31:10.020 | and all the other things.
02:31:11.060 | Like there's this multi-scale process that's happening
02:31:12.980 | and transitioning us globally.
02:31:14.380 | And it is really like, it's a dramatic transition.
02:31:16.580 | It's happening really fast and we're living in it.
02:31:20.900 | - You think this technosphere that we've created,
02:31:22.500 | this increasingly complex technosphere
02:31:24.420 | will spread to other planets?
02:31:26.100 | - I hope so.
02:31:26.940 | I think so.
02:31:28.660 | - You think we'll become a type two Kardashev civilization?
02:31:31.740 | - I don't really like the Kardashev scale.
02:31:35.020 | And it goes back to like,
02:31:36.460 | I don't like a lot of the narratives about life
02:31:38.220 | 'cause they're very like, you know,
02:31:40.420 | survival of the fittest, energy consuming,
02:31:42.940 | this, that, and the other thing.
02:31:43.780 | It's very like, I don't know,
02:31:45.140 | sort of old world, you know, like conqueror mentality.
02:31:49.180 | - What's the alternative to that exactly?
02:31:51.140 | - I mean, I think it does require life
02:31:54.020 | to use new energy sources in order to expand the way it is.
02:31:57.700 | So that part's accurate.
02:31:59.460 | But I think this sort of process of life,
02:32:02.180 | like being the mechanism
02:32:05.820 | that the universe creatively expresses itself,
02:32:08.460 | generates novelty, explores the space of the possible,
02:32:11.820 | is really the thing that's most deeply intrinsic to life.
02:32:14.700 | And so, you know,
02:32:16.180 | these sort of energy consuming scales of technology,
02:32:20.500 | I think is missing the sort of actual feature
02:32:24.260 | that's most prominent about any alien life
02:32:27.220 | that we might find,
02:32:28.100 | which is that it's literally our universe, our reality,
02:32:31.940 | trying to creatively express itself
02:32:33.700 | and trying to find out what can exist
02:32:35.340 | and trying to make it exist.
02:32:36.380 | - See, but past a certain level of complexity,
02:32:38.420 | unfortunately, maybe you can correct me,
02:32:40.380 | but we're built, all complex life on Earth
02:32:43.140 | is built on a foundation of that predator-prey dynamic.
02:32:45.980 | - Yes.
02:32:46.900 | - And so like, I don't know if we can escape that.
02:32:48.660 | I don't know how-- - No, we can't.
02:32:49.860 | - But this is why I'm okay with having a finite lifetime.
02:32:52.860 | And, you know, one of the reasons I'm okay with that,
02:32:55.300 | actually, goes back to this issue
02:32:58.140 | of the fact that we're resource bound.
02:33:00.420 | We live in a, you know, like,
02:33:02.060 | we have a finite amount of material,
02:33:04.340 | whatever way you wanna define material, I think,
02:33:06.340 | like for me, you know, material is time,
02:33:08.940 | material is information,
02:33:10.580 | but we have a finite amount of material.
02:33:12.900 | If time is a generating mechanism,
02:33:14.900 | it's always gonna be finite because the universe is,
02:33:18.020 | you know, like, it's a resource that's getting generated,
02:33:20.900 | but it has a size,
02:33:22.020 | which means that all the things that could exist
02:33:26.980 | don't exist, and in fact, most of them never will.
02:33:29.620 | So death is a way to make room in the universe
02:33:32.100 | for other things to exist
02:33:33.060 | that wouldn't be able to exist otherwise.
02:33:34.780 | So if the universe, over its entire temporal history,
02:33:37.460 | wants to maximize the number of things,
02:33:39.060 | wants is a hard word, maximize is a hard word,
02:33:41.180 | all these things are approximate,
02:33:42.460 | but wants to maximize the number of things that can exist,
02:33:45.820 | the best way to do it is to make recursively
02:33:48.060 | embedded stacked objects like us
02:33:49.740 | that have a lot of structure and a small volume of space,
02:33:53.020 | and to have those things turn over rapidly
02:33:55.060 | so you can create as many of them as possible.
02:33:58.060 | - So there for sure is a bunch of those kinds of things
02:34:01.300 | throughout the universe.
02:34:02.340 | - Hopefully.
02:34:03.740 | Hopefully our universe is teeming with life.
02:34:05.540 | - This is like early on in the conversation,
02:34:07.340 | you mentioned that we really don't understand much,
02:34:11.660 | like there's mystery all around us.
02:34:14.540 | - Yes.
02:34:15.380 | - We have to like bet money on it, like what percent?
02:34:18.020 | So like say a million years from now,
02:34:21.300 | the story of science and human understanding,
02:34:26.300 | understanding that started on Earth is written,
02:34:32.220 | like what chapter are we on?
02:34:34.620 | Are we like, is this like 1%, 10%, 20%, 50%, 90%?
02:34:39.620 | How much do we understand?
02:34:42.500 | Like the big stuff.
02:34:44.100 | - Not like the details, but like big,
02:34:48.060 | important questions and ideas.
02:34:51.580 | - I think we're in our 20s and--
02:34:55.500 | - 20% of the 20%?
02:34:56.580 | - No, like age wise, let's say we're in our 20s,
02:34:59.300 | but the lifespan is gonna keep getting longer.
02:35:02.060 | - You can't do that.
02:35:02.900 | - I can.
02:35:03.820 | You know why I use that though?
02:35:05.140 | I'll tell you why, why my brain went there,
02:35:07.940 | is because anybody that gets an education in physics
02:35:12.380 | has this sort of trope about how all the great physicists
02:35:15.620 | did their best work in their 20s.
02:35:18.460 | And then you don't do any good work after that.
02:35:20.340 | And I always thought it was kind of funny,
02:35:22.340 | because for me, physics is not complete.
02:35:26.780 | It's not nearly complete, but most physicists think
02:35:31.740 | that we understand most of the structure of reality.
02:35:34.460 | And so I think I actually,
02:35:37.500 | I think I put this in the book somewhere,
02:35:39.460 | but this idea to me that societies would discover everything
02:35:44.100 | while they're young is very consistent
02:35:46.060 | with the way we talk about physics right now.
02:35:48.940 | But I don't think that's actually the way
02:35:50.900 | that things are gonna go.
02:35:51.900 | And you're finding that people
02:35:54.460 | that are making major discoveries are getting older
02:35:58.060 | in some sense than they were,
02:35:59.300 | and our lifespan is also increasing.
02:36:01.340 | So I think there is something about age
02:36:04.780 | and your ability to learn
02:36:06.060 | and how much of the world you can see
02:36:08.260 | that's really important over a human lifespan,
02:36:10.780 | but also over the lifespan of societies.
02:36:12.900 | And so I don't know how big the frontier is.
02:36:16.020 | I don't actually think it has a limit.
02:36:18.340 | I don't believe in infinity as a physical thing,
02:36:22.340 | but I think as a receding horizon,
02:36:25.660 | I think because the universe is getting bigger,
02:36:27.320 | you can never know all of it.
02:36:28.780 | - Well, I think it's about 1.7%.
02:36:34.460 | - 1.7, where does that come from?
02:36:36.060 | It's a finite, I don't know, I just made it up.
02:36:38.860 | - That number had to come from somewhere.
02:36:40.900 | - I think seven is the thing that people usually pick.
02:36:44.620 | - 7%?
02:36:45.540 | - So I wanted to say 1%, but I thought it'd be funnier
02:36:49.460 | to add a point, so humor, inject a little humor in there.
02:36:53.460 | So the seven is for the humor.
02:36:54.700 | One is for how much mystery I think there is out there.
02:36:59.220 | - 99% mystery, 1% known.
02:37:01.900 | - In terms of really big, important questions.
02:37:04.780 | - Yeah. - It's like the list,
02:37:06.500 | say there's gonna be like 200 chapters.
02:37:08.460 | Like the stuff that's gonna remain true.
02:37:10.460 | - But you think the book has a finite size.
02:37:14.460 | - Yeah, yeah.
02:37:15.540 | - And I don't.
02:37:16.380 | I mean, not that I believe in infinities,
02:37:19.580 | but I don't, I think the size of the book is growing.
02:37:22.220 | - Well, the fact that the size of the book is growing
02:37:25.900 | is one of the chapters in the book.
02:37:28.500 | - Oh, there you go.
02:37:29.860 | Oh, we're being recursive.
02:37:31.780 | (laughing)
02:37:33.700 | - You can't have an ever-growing book.
02:37:36.620 | - Yes, you can.
02:37:37.700 | - I mean, you just, I mean, I don't even.
02:37:39.820 | Because then--
02:37:41.620 | - Well, you couldn't have been asking this
02:37:42.780 | at the origin of life, right?
02:37:43.820 | Because obviously like you wouldn't have existed
02:37:45.780 | at the origin of life.
02:37:46.620 | But like the question of intelligence
02:37:48.620 | and artificial general, like those questions
02:37:51.020 | did not exist then.
02:37:52.900 | And so, and they in part existed
02:37:56.020 | because the universe invented a space
02:37:57.460 | for those questions to exist through evolution.
02:38:01.900 | - But like, I think that question will still stand
02:38:04.740 | a thousand years from now.
02:38:06.100 | - It will, but there will be other questions
02:38:07.700 | we can't anticipate now that we'll be asking.
02:38:10.140 | - Yeah, and maybe we'll develop the kinds of languages
02:38:13.740 | that we'll be able to ask much better questions.
02:38:15.540 | - Right, or like the theory of like gravitation,
02:38:18.220 | for example, like when we invented that theory,
02:38:20.260 | like we only knew about the planets in our solar system.
02:38:22.700 | Right, and now, you know, many centuries later,
02:38:24.980 | we know about all these planets around other stars
02:38:26.700 | and black holes and other things
02:38:27.940 | that we could never have anticipated.
02:38:29.900 | So, and then we can ask questions about them.
02:38:33.020 | You know, like we wouldn't have been asking
02:38:34.260 | about singularities and like,
02:38:36.340 | can they really be physical things in the universe
02:38:38.260 | several hundred years ago?
02:38:39.100 | That question couldn't exist.
02:38:40.500 | - Yeah, but it's not, I still think
02:38:44.840 | those are chapters in a book.
02:38:47.060 | Like I don't get a sense from that.
02:38:48.820 | - So do you think the universe has an end?
02:38:51.780 | If you think it's a book with an end?
02:38:54.140 | - I think the number of words required
02:38:57.380 | to describe how the universe works has an end, yes.
02:39:00.100 | Meaning like, I don't care if it's infinite or not.
02:39:06.100 | - Right.
02:39:06.940 | - As long as the explanation is simple and it exists.
02:39:09.780 | - Oh, I see.
02:39:10.700 | - And I think there is a finite explanation
02:39:13.980 | for each aspect of it.
02:39:15.460 | The consciousness, the life.
02:39:18.340 | - Yeah.
02:39:19.180 | - I mean, very probably there's like some,
02:39:22.660 | the black hole thing is like, what's going on there?
02:39:26.860 | Where's that going?
02:39:27.700 | Like, where did they, what?
02:39:29.580 | And then, you know, why the big bang?
02:39:32.420 | Like, what?
02:39:33.820 | - Right.
02:39:34.660 | - It's probably there's just a huge number of universes
02:39:37.580 | and it's like universes inside universes.
02:39:40.220 | - I think universes inside universes is maybe possible.
02:39:43.780 | - I just think it's,
02:39:44.980 | every time we assume this is all there is,
02:39:50.620 | it turns out there's much more.
02:39:52.980 | - The universe is a huge place.
02:39:54.260 | - And when we mostly talked about the past
02:39:55.980 | and the richness of the past,
02:39:57.020 | but the future, I mean, with many worlds,
02:39:59.580 | interpretation of quantum mechanics, so--
02:40:02.860 | - Oh, I am not a many worlds person.
02:40:04.380 | - You're not.
02:40:05.220 | - No, are you?
02:40:06.900 | How many Lexes are there?
02:40:08.020 | - Depending on the day.
02:40:09.460 | Well--
02:40:10.300 | - Do some of them wear yellow jackets?
02:40:11.620 | - At the moment you asked the question, there was one.
02:40:14.860 | At the moment I'm answering it,
02:40:16.060 | there's now near infinity, apparently.
02:40:18.800 | I mean, the future is bigger than the past, yes?
02:40:24.980 | - Yes.
02:40:25.820 | - I think so.
02:40:26.660 | - In the past, according to you, it's already gigantic.
02:40:28.860 | - Yeah, but yeah, I mean,
02:40:30.380 | that's consistent with many worlds, right?
02:40:31.900 | 'Cause there's this constant branching.
02:40:34.220 | But it doesn't really have a directionality to it.
02:40:36.980 | It's a, I don't know, many worlds is weird.
02:40:39.540 | So my interpretation of reality is like,
02:40:41.700 | if you fold it up, all that bifurcation of many worlds,
02:40:44.780 | and you just fold it into the structure that is you,
02:40:46.980 | and you just said you are all of those many worlds,
02:40:50.020 | and that sort of, your history converged on you,
02:40:54.660 | but you're actually an object exists that's like,
02:40:57.620 | that was selected to exist,
02:41:00.140 | and you're self-consistent with the other structures.
02:41:02.900 | So the quantum mechanical reality
02:41:04.740 | is not the one that you live in.
02:41:06.420 | It's this very deterministic classical world,
02:41:10.580 | and you're carving a path through that space.
02:41:14.340 | But I don't think that you're constantly
02:41:16.900 | branching into new spaces.
02:41:18.300 | I think you are that space.
02:41:19.780 | - Wait, so to you, at the bottom, it's deterministic.
02:41:23.260 | I thought you said the universe is--
02:41:24.260 | - No, it's random at the bottom, right?
02:41:26.380 | But like this randomness that we see
02:41:28.020 | at the bottom of reality, that is quantum mechanics,
02:41:30.900 | I think people have assumed that that is reality.
02:41:33.980 | And what I'm saying is,
02:41:36.100 | all those things you see in many worlds,
02:41:37.840 | all those versions of you,
02:41:39.320 | just collect them up and bundle them up,
02:41:42.020 | and they're all you.
02:41:43.820 | And what has happened is,
02:41:46.820 | elementary particles don't have,
02:41:48.620 | they don't live in a deterministic universe,
02:41:50.580 | the things that we study in quantum experiments.
02:41:52.780 | They live in this fuzzy random space.
02:41:55.020 | But as that structure collapsed
02:41:57.060 | and started to build structures that were deterministic
02:41:59.620 | and evolved into you,
02:42:01.220 | you are a very deterministic macroscopic object.
02:42:04.820 | And you can look down on that universe
02:42:06.460 | that doesn't have time in it, that random structure.
02:42:09.000 | And you can see that all of these possibilities
02:42:12.420 | look possible, but they don't look,
02:42:14.140 | they're not possible for you,
02:42:15.440 | because you're constrained by this giant
02:42:17.660 | causal structural history.
02:42:20.580 | So you can't live in all those universes.
02:42:24.060 | You'd have to go all the way back
02:42:25.860 | to the very beginning of the universe
02:42:27.780 | and retrace everything again to be a different you.
02:42:29.860 | - So where's the source of the free will
02:42:31.280 | for the macro object?
02:42:32.580 | - It's the fact that you're a deterministic structure
02:42:37.320 | living in a random background.
02:42:38.880 | And also all of that selection bundled in you
02:42:41.220 | allows you to select on possible futures,
02:42:43.180 | so that's where your will comes from.
02:42:44.940 | And there's just always a little bit of randomness,
02:42:46.840 | because the universe is getting bigger.
02:42:48.740 | And this idea that the past and the present
02:42:53.700 | is not large enough yet to contain the future,
02:42:57.180 | the extra structure has to come from somewhere.
02:43:00.140 | And some of that is because outside of those
02:43:03.140 | giant causal structures that are things like us,
02:43:06.520 | it's fucking random out there.
02:43:08.420 | And it's scary.
02:43:11.260 | And we're all hanging on to each other,
02:43:12.620 | because the only way to hang on to each other,
02:43:14.780 | the only way to exist is to cling on
02:43:17.380 | to all of these causal structures
02:43:18.780 | that we happen to co-inhabit existence with
02:43:21.900 | and try to keep reinforcing each other's existence.
02:43:24.460 | - All the selection bundled in.
02:43:27.380 | - And us.
02:43:28.220 | But free will's totally consistent with that.
02:43:30.500 | - I don't know what I think about that.
02:43:31.740 | That's complicated to imagine.
02:43:33.940 | Just that little bit of randomness is enough.
02:43:36.780 | Okay.
02:43:37.620 | - Well, it's also, it's not just the randomness.
02:43:39.660 | There's two features.
02:43:40.480 | One is randomness helps generate some novelty
02:43:42.660 | and some flexibility.
02:43:44.060 | But it's also that because you're the structure
02:43:47.180 | that's deep in time,
02:43:48.580 | you have this combinatorial history that's you.
02:43:51.460 | And I think about time and assembly theory,
02:43:55.440 | not as linear time, but as combinatorial time.
02:43:57.980 | So if you have all of this structure
02:44:00.140 | that you're built out of,
02:44:01.880 | you, in principle, you know,
02:44:04.940 | your future can be combinations of that structure.
02:44:07.600 | You obviously need to persist yourself as a coherent you.
02:44:10.140 | So you wanna optimize for a future
02:44:13.980 | in that combinatorial space that still includes you
02:44:17.660 | most of the time for most of us.
02:44:20.860 | And when you make those kinds of,
02:44:25.340 | and then that gives you a space to operate in.
02:44:28.460 | And that's your sort of horizon
02:44:30.620 | where your free will can operate.
02:44:32.900 | And your free will can't be instantaneous.
02:44:34.620 | So for like example, like I'm sitting here
02:44:37.220 | talking to you right now,
02:44:38.060 | I can't be in the UK and I can't be in Arizona,
02:44:41.180 | but I could plan.
02:44:42.760 | I could execute my free will over time
02:44:44.620 | because free will is a temporal feature of life
02:44:47.700 | to be there, you know, tomorrow or the next day
02:44:50.620 | if I wanted to.
02:44:51.460 | - But what about like the instantaneous decisions
02:44:53.900 | you're making, like to, I don't know,
02:44:56.420 | to put your hand on the table?
02:44:58.140 | That's--
02:44:58.980 | - I think those were already decided a while ago.
02:45:01.540 | I don't think free will is ever instantaneous.
02:45:05.020 | - But on a longer time horizon,
02:45:07.940 | there's some kind of steering going on?
02:45:10.360 | - Mm-hmm.
02:45:11.200 | - And who's doing the steering?
02:45:14.480 | - You are.
02:45:15.320 | - Hmm.
02:43:16.160 | And you being this macro object that encompasses--
02:45:20.920 | - Or you being Lex.
02:45:22.320 | (laughs)
02:45:24.000 | Whatever you wanna call it.
02:45:25.200 | (laughs)
02:43:27.040 | - There you go assigning words to things once again.
02:45:31.280 | - I know.
02:45:32.240 | - Why does anything exist at all?
02:45:34.240 | You've kind of taken that as a starting point.
02:45:39.200 | - Yeah.
02:45:40.040 | - Why do things exist?
02:45:40.880 | - I think that's the hardest question.
02:45:42.800 | - Isn't it just hard questions stacked on top of each other?
02:45:45.560 | - It is.
02:45:46.400 | - Wouldn't it be the same kind of question of what is life?
02:45:49.160 | - It is the same.
02:45:50.080 | Well, that's sort of like, I try to fold
02:45:51.940 | all of the questions into that question
02:45:53.640 | 'cause I think that one's really hard.
02:45:55.400 | And I think the nature of existence is really hard.
02:45:57.640 | - You think actually like answering what is life
02:45:59.760 | will help us understand existence?
02:46:01.260 | Maybe there's, it's turtles all the way down.
02:46:05.600 | It'll just, understanding the nature of turtles
02:46:08.680 | will help us kind of march down,
02:46:11.040 | even if we don't have the experimental methodology
02:46:13.960 | of reaching before the Big Bang.
02:46:15.800 | - Right.
02:46:16.720 | So, well I think there's sort of two questions embedded here.
02:46:20.400 | I think the one that we can't answer by answering life
02:46:22.680 | is why certain things exist and others don't.
02:46:26.760 | But I think the sort of ultimate question,
02:46:30.360 | the sort of like prime mover question
02:46:32.460 | of why anything exists, we will not be able to answer.
02:46:36.160 | - What's outside the universe?
02:46:38.440 | - Oh, there's nothing outside the universe.
02:46:40.720 | So I have a very, I am a very,
02:46:44.300 | I am like the most physicalist that like anyone could be.
02:46:49.080 | So like for me, everything exists in our universe.
02:46:53.240 | And I like to think like everything exists here.
02:46:58.240 | So even when we talk about the multiverse,
02:47:00.520 | I don't like, to me, it's not like
02:47:02.300 | there's all these other universes
02:47:03.560 | outside of our universe that exists.
02:47:05.020 | The multiverse is a concept that exists in human minds here.
02:47:09.440 | And it allows us to have some counterfactual reasoning,
02:47:12.900 | to reason about our own cosmology,
02:47:14.720 | and therefore it's causal in our biosphere
02:47:17.100 | to understanding the reality that we live in
02:47:19.540 | and building better theories.
02:47:21.240 | But I don't think that the multiverse is something like,
02:47:24.100 | and also math, like I don't think there's a platonic world
02:47:26.960 | that mathematical things live in.
02:47:28.840 | I think mathematical things are here on this planet.
02:47:32.700 | Like I don't think it makes sense
02:47:33.880 | to talk about things that exist outside of the universe.
02:47:36.720 | If you're talking about them,
02:47:38.400 | you're already talking about something
02:47:39.600 | that exists inside the universe and is part of the universe
02:47:42.120 | and is part of like what the universe is building.
02:47:44.320 | - It all originates here, it all exists here in some--
02:47:47.160 | - I mean, what else would there be?
02:47:49.420 | - There could be things you can't possibly understand
02:47:52.440 | outside of all of this that we call the universe.
02:47:55.000 | - And you can say that,
02:47:55.920 | and that's an interesting philosophy.
02:47:57.460 | But again, this is sort of like pushing on the boundaries
02:47:59.860 | of like the way that we understand things.
02:48:01.840 | I think it's more constructive to say
02:48:04.280 | the fact that I can talk about those things
02:48:05.920 | is telling me something about the structure
02:48:07.640 | of where I actually live and where I exist.
02:48:09.560 | - Just because it's more constructive
02:48:10.960 | doesn't mean it's true.
02:48:12.160 | - Well, it may not be true.
02:48:15.740 | It may be something that allows me
02:48:17.640 | to build better theories I can test
02:48:19.400 | to try to understand something objective.
02:48:21.240 | - And in the end, that's a good way to get to the truth.
02:48:25.120 | - Exactly.
02:48:26.000 | - Even if you realize you were wrong in the past.
02:48:28.680 | - Yeah, so there's no such thing as experimental Platonism.
02:48:32.400 | But if you think math is an object
02:48:34.680 | that emerged in our biosphere,
02:48:35.780 | you can start experimenting with that idea.
02:48:37.920 | And that, to me, is really interesting.
02:48:41.600 | Like to think about, well, people,
02:48:43.520 | I mean, mathematicians do think about math
02:48:45.080 | sometimes as an experimental science.
02:48:46.500 | But to think about math itself
02:48:48.920 | as an object for study by physicists
02:48:53.920 | rather than a tool physicists use to describe reality,
02:48:57.240 | becomes the part of reality they're trying to describe,
02:49:00.240 | to me, is a deeply interesting inversion.
02:49:02.240 | - What to you is most beautiful
02:49:03.600 | about this kind of exploration of the physics of life
02:49:08.440 | that you've been doing?
02:49:09.640 | - I love the way it makes me feel.
02:49:12.820 | - And then you have to try to convert the feelings
02:49:18.840 | into visuals and the visuals into words.
02:49:20.280 | - Yeah, so I think I love,
02:49:23.040 | yeah, I love the way it makes me feel
02:49:24.880 | to have ideas that I think are novel.
02:49:29.840 | And I think that the dual side of that
02:49:32.320 | is the painful process of trying to communicate that
02:49:34.720 | with other human beings
02:49:35.900 | to test if they have any kind of reality to them.
02:49:38.580 | And I also love that process.
02:49:41.680 | I love trying to figure out
02:49:43.720 | how to explain really deep, abstract things
02:49:46.840 | that I don't think that we understand
02:49:49.440 | and trying to understand them with other people.
02:49:51.640 | And I also love the shock value
02:49:53.480 | of this kind of idea we were talking about before
02:49:58.480 | of being on the boundary of what we understand.
02:50:01.600 | And so people can kind of see what you're seeing,
02:50:03.960 | but they haven't ever seen it that way before.
02:50:06.560 | And I love the shock value that people have,
02:50:10.520 | like that immediate moment of recognizing
02:50:12.360 | that there's something beyond the way
02:50:13.720 | that they thought about things before.
02:50:15.480 | And being able to deliver that to people,
02:50:17.360 | I think is one of the biggest joys that I have,
02:50:19.400 | is just like, maybe it's that sense of mystery,
02:50:22.040 | like to share that there's something beyond the frontier
02:50:24.920 | of how we understand, and we might be able to see it.
02:50:27.400 | - And you get to see the humans transformed by the new idea?
02:50:31.520 | - Yes.
02:50:32.840 | And I think my greatest wish in life
02:50:36.800 | is to somehow contribute to an idea
02:50:38.840 | that transforms the way that we think.
02:50:42.120 | Like, I have my problem I wanna solve,
02:50:45.080 | but the thing that gives me joy about it
02:50:47.080 | is really changing something.
02:50:49.020 | And ideally getting to a deeper understanding
02:50:53.520 | of how the world works, and what we are.
02:50:56.540 | - Yeah, I would say understanding life at a deep level
02:51:01.600 | is probably one of the most exciting problems,
02:51:06.000 | one of the most exciting questions.
02:51:07.760 | So I'm glad you're trying to answer just that,
02:51:12.480 | and doing it in style.
02:51:15.440 | - It's the only way to do anything.
02:51:17.680 | - Thank you so much for this amazing conversation,
02:51:19.480 | thank you for being you, Sarah.
02:51:21.880 | This was awesome.
02:51:23.080 | - Thanks, Lex.
02:51:24.040 | - Thanks for listening to this conversation
02:51:26.400 | with Sarah Walker.
02:51:27.600 | To support this podcast,
02:51:28.880 | please check out our sponsors in the description.
02:51:31.680 | And now, let me leave you with some words
02:51:33.520 | from Charles Darwin.
02:51:34.640 | In the long history of humankind, and animalkind too,
02:51:40.200 | those who learn to collaborate
02:51:42.240 | and improvise most effectively have prevailed.
02:51:45.840 | Thank you for listening, and hope to see you next time.
02:51:49.580 | (upbeat music)
02:51:52.160 | (upbeat music)