
Chris Mason: Space Travel, Colonization, and Long-Term Survival in Space | Lex Fridman Podcast #283


Chapters

0:00 Introduction
1:04 Human extinction awareness
9:49 Heat death of the universe
15:25 Alone in the universe
19:01 Aliens
27:11 Entropy goggles
41:25 Genetics
49:34 Scott Kelly
55:32 Adapting to space
65:34 Sex in space
68:06 Colonizing planets
74:45 Culture on Mars
79:11 Commercial space flights
86:29 Podcast in space
94:03 Axiom Space
96:20 Designing space experiments
103:09 Robots in space
105:50 Space exploration
109:48 War in space
113:25 Launch toward the Second Sun
119:35 Chlorohumans
125:10 Extreme microbiome project
131:37 Space travel breakthroughs
143:36 Clones
149:28 AI age prediction
154:59 Advice for young people
161:16 Dark times
165:39 Mortality
169:57 Visiting ISS and deep space
171:07 Meaning of life


00:00:00.000 | - Would that make you sad to die on Mars?
00:00:02.560 | Looking back at the planet you were born on?
00:00:05.320 | - No, I think it would be actually,
00:00:07.640 | in some ways, maybe the best way to die,
00:00:09.200 | knowing that you're in the first wave of people
00:00:11.840 | expanding the reach into the stars.
00:00:14.320 | It'd be an honor.
00:00:15.160 | - The following is a conversation with Chris Mason,
00:00:20.280 | professor of genomics, physiology,
00:00:22.240 | and biophysics at Cornell.
00:00:24.320 | He and colleagues do some of their research out in space,
00:00:27.720 | experiments on space missions
00:00:29.600 | that seek to discern the molecular basis of changes
00:00:32.520 | in the human body during long-term human space travel.
00:00:36.360 | On this topic, he also wrote an epic book
00:00:39.840 | titled "The Next 500 Years,
00:00:42.080 | Engineering Life to Reach New Worlds"
00:00:44.320 | that boldly looks at what it takes to colonize space
00:00:47.240 | far beyond our planet,
00:00:48.800 | and even journey out towards livable worlds
00:00:52.340 | beyond our solar system.
00:00:54.340 | This is the Lex Fridman Podcast.
00:00:56.400 | To support it, please check out our sponsors
00:00:58.680 | in the description.
00:00:59.920 | And now, dear friends, here's Chris Mason.
00:01:04.220 | You wrote a book called
00:01:05.360 | "The Next 500 Years, Engineering Life to Reach New Worlds,"
00:01:09.560 | and you dedicated it to, quote,
00:01:11.520 | "To all humans and any extinction-aware sentience."
00:01:16.260 | How fundamental is awareness of death and extinction
00:01:18.800 | to the human condition?
00:01:20.720 | - I think this is actually one of the most
00:01:22.640 | human-specific traits and features that we have.
00:01:26.160 | It's actually maybe one of the few things
00:01:28.560 | that only we have and no one else has.
00:01:30.760 | So it sounds scary.
00:01:32.320 | Sounds like what people often don't like
00:01:33.760 | to think about their death,
00:01:34.600 | except now and again, or at funerals,
00:01:36.840 | or to recognize their mortality.
00:01:38.240 | But if you do it at a species-wide level,
00:01:41.240 | it's something that is actually an exemplary,
00:01:43.440 | human-specific trait that you're exhibiting.
00:01:45.440 | You think about something that is the loss
00:01:47.680 | of not just your life or your family or everyone you see,
00:01:49.980 | but everyone like you.
00:01:51.480 | And that is, I dedicated it because I think
00:01:54.200 | we might not be the last sentience to have this awareness.
00:01:56.880 | I'm actually hoping we'll just be the first.
00:01:59.080 | But as far as we know, we're the only.
00:02:01.000 | And I think this is the, part of the moral thrust
00:02:03.680 | for the book is that we're the only ones
00:02:05.000 | that have this awareness.
00:02:05.840 | That gives us a duty that only we can exercise so far.
00:02:09.120 | - So we definitely contemplate our own mortality
00:02:11.600 | at the individual level.
00:02:13.120 | It is true.
00:02:16.840 | When you wrote it, it was really powerful to realize for me
00:02:21.640 | that we do contemplate our extinction.
00:02:24.680 | And that is a creative force.
00:02:27.480 | So at the individual level,
00:02:29.360 | contemplating your own death is a creative force.
00:02:32.600 | Like I have a deadline.
00:02:34.440 | But contemplating the extinction of the whole species,
00:02:38.240 | I suppose that stretches through human history.
00:02:42.840 | That's many of the sort of subtext of religious ideas
00:02:47.840 | is that like, if we screw this up, it's gonna be over.
00:02:53.080 | And Revelation and every religious text
00:02:55.560 | has some view of either the birth or the death
00:02:57.800 | of the world as they know it.
00:02:59.000 | But it was very abstract.
00:03:00.840 | It was fiction almost, or in some cases,
00:03:03.720 | complete fiction of what you hope or think might happen.
00:03:06.320 | But it's become much more quantified and much more real,
00:03:09.120 | I think, in the past several hundred years.
00:03:11.200 | And especially in the past few decades,
00:03:12.600 | where we can see a sense of responsibility
00:03:15.800 | at a planetary scale.
00:03:16.800 | So when we think about, like say, terraforming Mars,
00:03:19.640 | that would just be the second planet we've engineered
00:03:21.840 | at a planetary scale.
00:03:22.720 | We're already doing it for this one, just not that well.
00:03:25.560 | - Well, yeah, that's right.
00:03:26.680 | So we're like a bunch of ants,
00:03:28.800 | extinction-aware sentience ants
00:03:36.280 | that are busy trying to terraform this planet
00:03:39.080 | to make it habitable so it can flourish.
00:03:42.960 | And then you say that it's our duty
00:03:46.360 | to expand beyond Earth, to expand to other planets.
00:03:53.320 | To find a good backup, off-site backup solution.
00:03:58.320 | Why the word duty?
00:04:00.720 | It's an interesting word.
00:04:02.000 | - Duty is something that usually puts people to sleep,
00:04:04.960 | I'll say this.
00:04:05.800 | So duty, duty is a bit like death.
00:04:08.280 | People don't often like to really think,
00:04:09.760 | wake up in the morning and think,
00:04:10.600 | what is my duty today?
00:04:12.520 | Most people, there are some people
00:04:13.680 | that think about it every day.
00:04:14.520 | People in active military service wake up,
00:04:16.120 | it's a very concrete sense of duty to country.
00:04:19.120 | Sometimes you can think about it though in terms of family.
00:04:20.960 | You feel a duty towards your spouse, your kids,
00:04:23.160 | your parents.
00:04:24.320 | You feel a real duty to them
00:04:25.920 | because you want them to flourish and to be safe.
00:04:29.320 | So we do have this sense of duty,
00:04:30.680 | but you don't, very much like death,
00:04:32.840 | you don't think about it actively.
00:04:34.400 | Usually, it's something that just becomes embedded
00:04:36.600 | in your day-to-day existence.
00:04:38.880 | But I think about duty because this is,
00:04:41.360 | people think about duties for themselves,
00:04:42.560 | but there has never been a real overarching duty
00:04:45.000 | that we all feel as a species for each other
00:04:48.640 | and for generations that haven't yet been born.
00:04:50.480 | And I think I want people to have a sense
00:04:53.040 | of the same love and compassion
00:04:55.680 | and fighting even to the tooth and nail,
00:04:58.240 | whether the way you protect your family,
00:04:59.520 | the way you'd fight for a country, for example,
00:05:01.920 | to feel the same way towards the rarity
00:05:03.560 | and preciousness of life
00:05:04.680 | and feel that sense of duty towards,
00:05:06.920 | particularly extinction-aware life,
00:05:08.680 | which is just us so far.
00:05:09.760 | This ability that we have this awareness
00:05:12.360 | of not only our own frailty,
00:05:13.880 | which of course is often talked about,
00:05:16.040 | and climate change and people think about pandemics,
00:05:18.920 | but other species that we sometimes cause extinction,
00:05:21.280 | but very soon will be even de-extinctifying species
00:05:24.320 | like the woolly mammoth. Colossal,
00:05:26.280 | a recent startup that's doing that,
00:05:27.640 | I'm on their advisory board,
00:05:28.920 | and it might happen in three or four years.
00:05:30.800 | So it's an interesting point in history
00:05:33.400 | where we can actually think about preventing death
00:05:36.280 | at a species-wide level and even resurrecting things
00:05:38.560 | that we have killed or that have gone away,
00:05:40.800 | which brings its own series of questions
00:05:42.880 | of just as when you delete something from an ecosystem,
00:05:45.520 | adding something can be completely catastrophic.
00:05:47.600 | And so there are no real guidelines yet on how to do that,
00:05:50.880 | but the technology now exists,
00:05:52.480 | which is pretty extraordinary.
00:05:53.440 | - Yeah, I've just been working on backup
00:05:55.560 | and restoring databases quite a bit recently,
00:05:58.960 | and you can do quite a lot of damage
00:06:00.920 | when you restore improperly.
00:06:03.200 | When we bring back the mammoths,
00:06:05.920 | it might be, you have to be careful bringing that back.
00:06:09.560 | The best of science, the best of engineering,
00:06:12.720 | is both dangerous and exciting,
00:06:14.680 | and that's why you have to have the best people,
00:06:17.080 | but also the most morally grounded people
00:06:20.200 | pushing us forward.
00:06:21.480 | But on the point of duty, there's a kind of sense
00:06:27.160 | that there's something special to humanity,
00:06:30.280 | to human beings that we want to preserve.
00:06:33.120 | And if that little flame, whatever that is, dies,
00:06:38.120 | that will be a real shame for the universe.
00:06:43.000 | What is that?
00:06:43.960 | What is special about human beings?
00:06:45.400 | What is special about the human condition
00:06:47.200 | that we want to preserve?
00:06:48.440 | Why do we matter?
00:06:50.880 | - There are some people who think we don't.
00:06:52.480 | There are some people who say,
00:06:53.320 | "Well, humans, take it or leave it.
00:06:55.440 | "They think they're misanthropes."
00:06:57.080 | So the book is, in the one sense, a call to misanthropes
00:07:01.520 | to hopefully shake them out of their slumber.
00:07:04.680 | But there's some people--
00:07:05.520 | - What does the word misanthrope mean?
00:07:06.560 | - Just people that dislike humanity.
00:07:08.800 | They're just, again, they're all just--
00:07:10.960 | - They're called nihilists, Donny.
00:07:12.600 | That's a shout out for Big Lebowski fans.
00:07:14.680 | (both laughing)
00:07:16.880 | - Nothing matters.
00:07:17.720 | And why does any,
00:07:18.560 | and they just apply it more particularly to humans.
00:07:20.800 | But there are endless reasons, I think,
00:07:24.680 | to cherish and celebrate what humans have done.
00:07:27.800 | At the same time, many things we've done awfully,
00:07:29.760 | and genocide, and nuclear weapons testing
00:07:34.000 | on unsuspecting citizens of remote islands.
00:07:36.800 | There are definitely things we've done bad.
00:07:37.960 | But the poetry, the music, the engineering feats,
00:07:41.960 | the getting to the moon and eventually,
00:07:44.120 | and already rovers on Mars,
00:07:45.480 | these extraordinary feats
00:07:47.480 | that humans have already accomplished,
00:07:49.320 | attest to really a sense of beauty,
00:07:51.200 | I think is something that is,
00:07:52.600 | you can't ask ants or cockroaches
00:07:57.640 | about their favorite paintings.
00:07:59.280 | Or maybe if you could,
00:08:01.280 | it would be very different from ours.
00:08:02.480 | But in either case,
00:08:03.320 | there's a unique perspective that we carry.
00:08:06.200 | And I think, so that's something,
00:08:07.800 | even just the age old question in biology,
00:08:09.960 | I'm a geneticist,
00:08:10.800 | so this comes up a lot of what makes humans unique?
00:08:12.260 | And so, is it bipedalism?
00:08:14.300 | Is it our intelligence?
00:08:15.260 | Is it tool making?
00:08:16.140 | Is it language?
00:08:17.180 | All those things I just listed,
00:08:18.360 | other species have some degree of those traits.
00:08:21.020 | So it's a question of degree,
00:08:22.700 | not of type of trait that defines humans a little bit.
00:08:26.460 | But I think for the extinction awareness,
00:08:28.700 | that is a uniquely human trait.
00:08:30.540 | That is, to our knowledge,
00:08:32.020 | no other species, or entity, or AI, or sentience
00:08:35.260 | that carries that awareness of the frailty of life,
00:08:38.220 | of our own life, but all life.
00:08:39.900 | - Maybe it is that awareness of the frailty of life
00:08:43.220 | that allows us to be so urgently creative.
00:08:47.340 | Create beauty, create innovation.
00:08:50.380 | It just seems like if you just measure,
00:08:52.180 | humans are able to create some
00:08:55.260 | sort of subjectively beautiful things.
00:08:57.580 | And I see science that way.
00:08:58.940 | I see engineering that way.
00:09:00.460 | And ants are less efficient at that.
00:09:03.540 | They also create beautiful things.
00:09:06.380 | But less aggressively, less innovation,
00:09:10.420 | less building, like standing on the shoulders of giants,
00:09:12.740 | building on top of each other over and over and over,
00:09:14.980 | where you're getting these hierarchical systems,
00:09:19.300 | where you create on greater levels of abstraction.
00:09:21.820 | Then you use ideas to communicate those ideas,
00:09:24.580 | and you share those ideas,
00:09:25.460 | and all of a sudden you have the rockets
00:09:26.940 | going on into space.
00:09:28.740 | - Which ants have been building the same structures
00:09:30.500 | for millions and millions of years with no real change.
00:09:33.260 | And so that is the key differentiator.
00:09:34.900 | - Yet.
00:09:35.740 | - Yet.
00:09:36.580 | - That's right.
00:09:37.420 | We've got an experiment going right now,
00:09:38.340 | and maybe it'll change.
00:09:40.460 | - Well, yeah, we will bring up some extreme organisms.
00:09:45.460 | Another thing you're interested in.
00:09:48.140 | Okay.
00:09:48.980 | One interesting thing that comes up much later in your book
00:09:54.700 | is something I also haven't thought of,
00:10:00.380 | and it's quite inspiring,
00:10:01.960 | which is the heat death of the universe.
00:10:05.540 | Is something worth fighting against?
00:10:08.860 | (laughing)
00:10:09.700 | - Yes.
00:10:10.540 | - That's also an engineering problem.
00:10:12.260 | - Yes.
00:10:13.100 | - You kind of, I mean,
00:10:16.100 | you seriously look at the next 500 years,
00:10:20.700 | and that's such a beautiful thing.
00:10:22.540 | Seriously, we'll talk about the uncertainty
00:10:25.680 | involved with that, and all the different trajectories,
00:10:27.940 | but to seriously look at that,
00:10:30.500 | and then to seriously look at what happens
00:10:33.980 | when the sun runs out, what happens
00:10:37.260 | when the universe comes to an end.
00:10:41.340 | Like, we have an opportunity and a kind of duty,
00:10:45.980 | like you said, to fight against that.
00:10:47.540 | And that was so inspiring to me to think,
00:10:51.060 | wait, maybe we'll actually,
00:10:54.700 | that's a worthy thing to think about.
00:10:56.620 | - Maybe we can prevent it, actually.
00:10:58.460 | - Right.
00:10:59.740 | Come up with the best known understanding, current,
00:11:03.460 | of how things end.
00:11:04.700 | You know, we kind of are building an intuition,
00:11:09.740 | and data, and models of the way the universe is,
00:11:13.660 | the way it started, the way it's going to end.
00:11:15.700 | So our best model of the end,
00:11:18.020 | let's start thinking about how that could be prevented,
00:11:21.580 | how that could be avoided, how that could be channeled,
00:11:24.380 | and misdirected, and you can pivot it somehow.
00:11:29.080 | That's really inspiring, that's really powerful.
00:11:32.200 | I never really thought about it.
00:11:34.060 | Eventually, all things end.
00:11:36.220 | And that was the kind of melancholic notion
00:11:39.900 | behind all of it.
00:11:41.100 | None of this matters, in a way.
00:11:44.700 | To me, that's also inspiring to enjoy the moment,
00:11:50.540 | to really live in the moment,
00:11:52.620 | 'cause that is truly where beauty exists, is in the moment.
00:11:55.820 | But there is a long-lasting aspect to beauty
00:11:59.740 | that is part of the engineering ethic,
00:12:01.580 | which is like, tell me what the problem is,
00:12:03.820 | and we're gonna solve it. - Solve it, yeah.
00:12:06.060 | - So what do you think about that,
00:12:08.020 | the long scale, beyond 500 years?
00:12:11.100 | Do humans have a chance?
00:12:13.500 | - Absolutely, I think we have the best chance
00:12:16.140 | of any species, and actually the best chance
00:12:18.700 | that humanity's ever had.
00:12:19.900 | So I think a lot of people fear that we can
00:12:23.020 | or will kill ourselves.
00:12:23.860 | Actually, my favorite question I ask
00:12:27.060 | at the end of every interview
00:12:28.020 | for every potential graduate student, medical student,
00:12:30.580 | or faculty, whoever I'm interviewing, for whatever reason,
00:12:33.460 | the last question is, well, how long do you think
00:12:34.940 | that humans or our evolutionary derivatives will last?
00:12:37.900 | And the answers are shockingly wide-ranging.
00:12:40.220 | Some people say, I think we've only got 100 years left,
00:12:42.820 | or some people say billions, some people say
00:12:44.740 | as long as the universe lasts.
00:12:46.240 | But the person who once said,
00:12:47.940 | it was a medical student, applicant,
00:12:49.660 | who said, I think we've only got 100 years left.
00:12:51.180 | And I was like, really, for all of humanity,
00:12:53.020 | everything will be gone in 100 years?
00:12:54.460 | And he said, yes.
00:12:55.300 | And I said, well, sweet Jesus, man, why go to med school?
00:12:59.460 | Why not go sell bananas on the beach?
00:13:00.900 | And then he said, I really wanna make the last few hundred years
00:13:03.980 | count, really matter.
00:13:05.340 | And I said, oh, well, that's actually kind of,
00:13:07.140 | sort of hopeful in a really dark way.
00:13:08.700 | But I think we've never been better situated
00:13:12.020 | to actually last for the long term.
00:13:13.780 | Even though we've also never been at the greater risk
00:13:16.580 | of being able to destroy ourselves,
00:13:17.980 | ever since really the first nuclear test,
00:13:20.020 | when they, Toby Ord has a great book about this
00:13:22.540 | called "The Precipice," where the precipice for humanity
00:13:25.180 | is at one point we made technologies
00:13:26.760 | that we weren't sure whether or not
00:13:28.100 | they would destroy the Earth or the entire universe.
00:13:31.060 | So the math was incomplete and there was too much error,
00:13:33.300 | but they tested the bomb anyway.
00:13:35.260 | But it's an extraordinary place as a species to think,
00:13:38.180 | we now have something in our hands
00:13:39.300 | that may destroy the Earth and possibly a chain reaction
00:13:42.300 | that destroys the whole universe.
00:13:43.700 | Let's try it anyway, as a stage that we're at as a species.
00:13:47.660 | - But with that power comes an ability
00:13:49.220 | to get to other planets to survive long term.
00:13:52.220 | And when you think about the heat death,
00:13:53.380 | that just becomes, that's an ad infinitum question.
00:13:55.860 | If you keep thinking, well, we survive,
00:13:57.020 | we go to the next sun, and then you go to the next sun,
00:13:59.300 | eventually the question will be,
00:14:00.260 | well, if you just keep doing that forever,
00:14:02.260 | at some point the universe either continues to expand
00:14:04.820 | or it could collapse back in itself.
00:14:06.220 | And the heat death is more likely at this point
00:14:08.700 | where it just keeps expanding and expanding,
00:14:10.220 | everything gets too far away.
00:14:11.860 | But even in that case, I think if we had
00:14:13.900 | a fundamental knowledge of physics and space-time
00:14:16.100 | that you could try and restructure it
00:14:17.260 | quite literally the shape of the universe to prevent it,
00:14:19.700 | I think we would, I think we would wanna survive.
00:14:21.660 | I think, unless we had done the math
00:14:23.740 | and we think that there's a greater chance
00:14:25.220 | that the next universe would form and make more life,
00:14:28.620 | maybe we would, but even then,
00:14:30.740 | I think humans have always wanted to survive
00:14:32.740 | and you could argue maybe should survive because--
00:14:35.220 | - And are able to engineer systems that help us survive.
00:14:38.180 | - Yeah, yeah, and always have, yeah.
00:14:40.300 | - So what is this though, the Tsar Bomb, yeah, the hydrogen.
00:14:43.620 | Yeah, there's nothing more terrifying
00:14:48.540 | and somehow inspiring than watching
00:14:51.540 | the mushroom cloud of a nuclear explosion.
00:14:54.860 | It's like humans are capable of this.
00:14:57.500 | They're capable of leveraging the power of nature.
00:15:00.480 | - To completely obliterate-- - Destroy everything.
00:15:04.700 | - And to create propulsion.
00:15:06.180 | I mean, most of the Voyager spacecraft are nuclear-powered
00:15:08.500 | because it's still in many ways the most efficient way
00:15:10.300 | to get a tiny amount of fissile material
00:15:12.900 | and make power out of it.
00:15:14.500 | So they're still slowly drifting,
00:15:16.020 | they're past the heliosphere,
00:15:17.300 | they're now into interstellar space
00:15:19.340 | and they're nuclear-powered.
00:15:20.340 | So it's like any tool or technology.
00:15:22.180 | It's a tool or a weapon depending on how you hold it.
00:15:25.140 | - Are we alone in the universe, Chris Mason?
00:15:29.720 | What do you think?
00:15:30.560 | So the presumption that you've just mentioned
00:15:33.360 | is let's just focus on our thing.
00:15:35.140 | - Yeah, for now.
00:15:36.580 | Well, I think we, as far as we know,
00:15:39.460 | there's no other sentient life out in the universe
00:15:41.320 | that we've found yet.
00:15:42.760 | And I think there's probably bacterial life out there
00:15:45.400 | just because we found it everywhere we've looked on Earth.
00:15:48.340 | It is, and there's, you know,
00:15:50.420 | halophilic organisms that can survive in extreme salts.
00:15:52.980 | There are psychrophiles that survive in extreme cold.
00:15:55.580 | There's basically organisms that can survive
00:15:57.020 | in really almost any possible environment
00:15:59.020 | that can adapt and find a way to live.
00:16:01.500 | But as far as we know, we're the only sentient ones.
00:16:03.500 | And I think this is the famous, the Drake equation,
00:16:06.740 | or how many, where is everyone,
00:16:09.780 | is what Enrico Fermi said,
00:16:11.420 | is why haven't we heard from anyone
00:16:13.380 | if there are these other life forms?
00:16:14.400 | I actually think the question is wrong
00:16:16.780 | to phrase it that way
00:16:17.740 | because the Earth has only been here for 4.5 billion years.
00:16:22.580 | And life may be only for a few billion of those years.
00:16:25.700 | Complex life only for several hundred
00:16:27.580 | million years of life we've actually had.
00:16:30.620 | And humans, only in the past few million years
00:16:32.300 | since our last common ancestor.
00:16:33.340 | So it's not that much time.
00:16:35.480 | But even further back,
00:16:36.960 | the universe hasn't had that much time itself
00:16:38.780 | to cool and create atoms
00:16:40.620 | and have them spread around the universe.
00:16:42.660 | So the current estimate's 13.8 billion years
00:16:45.860 | of just the whole universe.
00:16:46.700 | But it's been the first five or six of those billion years
00:16:49.380 | really just like cooling and making enough of the stars
00:16:52.340 | to then make the atoms that would come from supernovas.
00:16:54.260 | So I actually think we might be the first,
00:16:57.980 | or still one of the very few,
00:16:59.180 | or one of the early life forms.
00:17:00.240 | But the universe itself hasn't had that much time
00:17:03.220 | to make life in a galactic and universal timeframe.
00:17:06.980 | You needed billions of years for the elements to be created
00:17:09.860 | and then distributed.
00:17:11.320 | And we're only really in the,
00:17:12.700 | I think the last few billion years
00:17:13.900 | where I think even life could have been made.
00:17:16.100 | So I think the question of where is everyone
00:17:18.580 | is the wrong question.
00:17:19.620 | I think the question is,
00:17:21.220 | I think we are the first ones at the party.
00:17:23.780 | Let's set up the liquor, let's set up the food.
00:17:27.020 | I just think we're the first ones at the party of life,
00:17:29.500 | but more people are coming.
00:17:31.660 | - One of the early attendees to the party.
00:17:34.900 | - Yeah, maybe the first, as far as we know, the first.
00:17:37.300 | But maybe we'll find some--
00:17:38.140 | - In the local pocket of the universe.
00:17:40.380 | - Yeah.
00:17:41.900 | - 'Cause the parties then expand and it overflows.
00:17:44.620 | - Yeah, that's right.
00:17:45.620 | And then there's a mosh pit
00:17:46.660 | and then you bump into the other galaxy.
00:17:48.940 | I think the question should be,
00:17:51.620 | when else is everyone getting here
00:17:54.300 | instead of where is everyone?
00:17:55.420 | I think we've just started
00:17:57.260 | on the genesis of life in the universe.
00:17:59.540 | - Yeah, so not where are they,
00:18:01.340 | but more about when and who
00:18:04.900 | and how do we set up the party.
00:18:06.340 | - Right, and then how do we help them?
00:18:08.100 | I think it's an interesting other moral question
00:18:09.340 | is do we, a lot of Star Trek episodes,
00:18:12.100 | the prime directive is you do not interfere
00:18:13.980 | with another planet if you could pass by a planet.
00:18:16.500 | I think it's time to also revisit that
00:18:18.140 | because what if you go by a planet
00:18:20.180 | and we think that with, as far as we can tell,
00:18:23.060 | with enough certainty that they would never be able
00:18:25.300 | to leave their planet and then the sun eventually
00:18:28.180 | would engulf that planet, wherever that planet might be
00:18:30.380 | in some solar system.
00:18:31.980 | But if we had a way to help them,
00:18:33.620 | their culture, their science, their technology,
00:18:35.780 | everything about a different species to survive,
00:18:39.180 | would we not interfere?
00:18:41.460 | I think that would actually be wrong to say,
00:18:43.500 | well, we can save this life here and we decide not to.
00:18:47.580 | We decide after millions and billions of years pass
00:18:50.860 | and we know the sun will engulf that planet.
00:18:52.620 | Like what will happen with our planet?
00:18:54.180 | We don't interfere.
00:18:55.620 | That's watching a train hit someone on the tracks
00:18:57.860 | and not moving the train.
00:18:59.220 | - In terms of the effort of humans
00:19:04.380 | becoming a multi-planetary species,
00:19:06.260 | in terms of priorities, how much would you allocate
00:19:13.020 | to trying to make contact with aliens
00:19:16.260 | and getting their help?
00:19:17.500 | And if we look at the next 500 and beyond years,
00:19:22.220 | and just versus option number two,
00:19:25.380 | really just focusing on setting up the party
00:19:28.220 | on our own engineering, on our own,
00:19:31.780 | the genome, the biology of humanity,
00:19:35.700 | the AI collaborating with humans,
00:19:38.660 | just all the engineering challenges
00:19:40.780 | and opportunities that we're exploring.
00:19:43.900 | - I'm focused in my lab, of course,
00:19:46.260 | a lot on the engineering of genomes,
00:19:47.860 | the monitoring of astronauts during long missions.
00:19:51.260 | Reaching out to other aliens,
00:19:54.020 | we've been doing reach out to aliens
00:19:55.940 | since the first radio wave's been broadcast,
00:19:57.740 | so we're doing some of it, but to do a real--
00:19:59.220 | - You made it sound like your lab
00:20:00.620 | is mostly focused on biology,
00:20:02.100 | but you also reach out occasionally to aliens.
00:20:04.500 | - Sure, sure, occasionally.
00:20:06.220 | When they visit, they bring their whiskey
00:20:08.260 | and we have a drink.
00:20:09.580 | But I think we can do,
00:20:14.100 | we've been broadcasting into space for,
00:20:16.020 | at this point, almost a century, getting close to,
00:20:18.540 | but it's not been structured.
00:20:20.980 | So I think it's very cheap and easy
00:20:22.340 | to send out structured messages,
00:20:24.660 | like what Carl Sagan wrote about in "Contact,"
00:20:26.300 | doing prime numbers and sending those out
00:20:27.820 | to indicate intelligence.
00:20:29.100 | So there's things we can do that I think
00:20:31.500 | are very cheap and very easy,
00:20:32.620 | so we should do some of that.
00:20:33.660 | We can walk and chew gum at the same time.
00:20:35.940 | This is one of the biggest critiques
00:20:37.540 | people often say of space research,
00:20:39.100 | and even space flight in general,
00:20:40.460 | is it's too expensive, shouldn't we solve poverty,
00:20:42.660 | shouldn't we cure diseases?
00:20:44.220 | And the answer's always, as it always has been,
00:20:46.460 | is that you can walk and chew gum at the same time.
00:20:48.220 | You can pass the Civil Rights Act
00:20:50.340 | and go to the moon in the same decade.
00:20:51.900 | You can improve and get rid of structural inequality
00:20:55.660 | while getting to the moon and Mars in this decade.
00:20:58.660 | So I think we can do both.
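As a rough illustration of the kind of structured message Chris is describing here (the prime-number beacon idea he attributes to Sagan's "Contact", not a scheme he specifies), a few lines of Python can generate such a signal. The function names and the dot-pulse format below are purely hypothetical, just a sketch of the idea:

```python
# Minimal sketch of a "Contact"-style beacon: emit the first n primes as
# groups of pulses. A rising sequence of primes is cheap to transmit and
# unlikely to be produced by any natural radio source.

def first_primes(n):
    """Return the first n prime numbers by simple trial division."""
    primes = []
    candidate = 2
    while len(primes) < n:
        if all(candidate % p != 0 for p in primes):
            primes.append(candidate)
        candidate += 1
    return primes

def beacon(n=8):
    """Render each prime as a burst of pulses ('.') separated by gaps."""
    return " ".join("." * p for p in first_primes(n))

if __name__ == "__main__":
    print(first_primes(8))  # [2, 3, 5, 7, 11, 13, 17, 19]
    print(beacon(5))        # .. ... ..... ....... ...........
```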
00:21:00.900 | - Yeah, and they kinda help each other.
00:21:02.580 | There's sometimes criticism of ridiculous science,
00:21:05.460 | like studying penguins or something,
00:21:07.100 | or studying the patterns of birds or fish and so on.
00:21:10.780 | - Some congressman stands up and says,
00:21:12.180 | "This is a waste of taxpayer dollars,"
00:21:13.780 | and then someone says, "Oh, but we..."
00:21:15.300 | And for example, CRISPR was pure research for 25 years.
00:21:18.940 | Now it's a household word,
00:21:20.300 | and students are editing genomes in high school.
00:21:23.060 | But it was just pure research on weird bacteria
00:21:26.380 | living actually in salt, hypersaline lakes and rivers
00:21:30.740 | for decades, and then eventually became
00:21:32.580 | a massive therapeutic, which has led to curing of diseases
00:21:35.140 | in this past year.
00:21:36.140 | - And there's stuff that you discover
00:21:38.260 | as part of the research that you didn't anticipate
00:21:40.940 | that have nothing to do with the actual research,
00:21:43.660 | like oceanography is one of the interesting things
00:21:48.420 | about that whole field is that it's a huge amount of data,
00:21:51.740 | neuroscience too, actually.
00:21:53.240 | So you could discover computer science things,
00:21:57.020 | like machine learning things,
00:21:58.860 | or even data storage manipulation,
00:22:01.440 | distributed compute things by forcing yourself
00:22:04.740 | to get something done on the oceanography side.
00:22:07.660 | That's how you invent the internet
00:22:11.100 | and all those kinds of things.
00:22:13.180 | So to me, aliens, looking for aliens out there
00:22:17.860 | in the universe is a motivator that just inspires,
00:22:22.860 | inspires everybody, young people, old people,
00:22:26.980 | scientists, artists, engineers, entrepreneurs, everybody.
00:22:33.260 | Somehow that line between fear and beauty.
00:22:38.200 | 'Cause we're--
00:22:40.140 | - Aliens are like perfectly merged, basically.
00:22:42.700 | - 'Cause we don't know.
00:22:43.820 | I mean, for you, let's start talking
00:22:46.620 | about primitive alien life.
00:22:48.380 | Are you excited by it, or are you terrified?
00:22:51.740 | - I wanna make a lotion out of it.
00:22:53.060 | I think it'd be great if it's alien life,
00:22:54.700 | assuming it's safe, but I'm very excited.
00:22:56.500 | It doesn't have to be a lotion.
00:22:57.340 | - You just said a half sentence, assuming it's safe.
00:23:00.020 | That's the fundamental question I'm trying to get at.
00:23:02.860 | - If you could, yeah, presuming it's safe.
00:23:04.700 | So I think, you know, we have this,
00:23:07.060 | this beginning of some planetary protection
00:23:09.060 | is happening now, is we're gonna send,
00:23:10.740 | we're bringing rocks back from Mars in 2033,
00:23:13.340 | if all goes according to plan.
00:23:15.140 | But there's always a danger.
00:23:16.440 | What if you bring this back?
00:23:17.280 | What if it's alive?
00:23:18.120 | What if it will kill all of humanity?
00:23:19.820 | Or Michael Crichton wrote a book,
00:23:21.100 | "The Andromeda Strain," about this very idea.
00:23:23.620 | And it could, but it hopefully won't.
00:23:27.020 | And the only way you can really gauge that
00:23:29.300 | is the same way we do with any infectious agent
00:23:31.260 | here on Earth, right?
00:23:32.100 | It's a new pathogen, a new organism.
00:23:34.140 | You do it slowly, carefully.
00:23:35.580 | You often do it with levels of containment.
00:23:37.820 | So, you know, and it's gonna be,
00:23:40.460 | probably have to be where some pioneers go
00:23:42.380 | and would be, for example, on Mars.
00:23:43.780 | There might be other organisms there
00:23:45.300 | that only get activated once there's an ambient temperature
00:23:47.700 | and more humidity, and then suddenly,
00:23:49.300 | the first settlers on Mars are encountering
00:23:51.140 | a strange new fungus or something
00:23:52.840 | that's not even like a fungus,
00:23:53.900 | 'cause it might be a different clade of life,
00:23:55.660 | different branch of life.
00:23:56.900 | And it could be very dangerous, or it could be very inert.
00:23:59.900 | I mean, most of life on Earth is not really
00:24:03.860 | dangerous or harmful.
00:24:06.100 | Let me go back, I'll get on this.
00:24:07.580 | Most of life on Earth is neither harmful
00:24:09.860 | nor beneficial to you.
00:24:10.740 | It's just, they're making its own way in the universe,
00:24:12.780 | just trying to survive.
00:24:14.120 | It's when, you know, it's inside of you
00:24:15.860 | and replicating in your cells and destroying your cells
00:24:17.820 | like a virus, like COVID, SARS-CoV-2,
00:24:21.220 | that it becomes a big problem, of course.
00:24:22.700 | But it's, you know, just doesn't really have agency.
00:24:25.560 | It's just trying to get by.
00:24:26.540 | And so, for example, most of the bacteria
00:24:28.060 | on the table on your skin in the subway are pretty inert.
00:24:32.740 | They're just, you know, people hanging around for the ride.
00:24:36.140 | - And actually, just 'cause we're talking
00:24:38.820 | so much trash about viruses,
00:24:40.340 | most viruses are, don't bother humans.
00:24:43.340 | - Yeah, they're phages.
00:24:44.180 | Almost all, the vast majority of viruses are phages.
00:24:46.420 | There's this battle in biology that is really dorky,
00:24:49.620 | is that bacteria think that they're the most,
00:24:51.900 | you know, people who study bacteria
00:24:53.340 | think the bacteria are the most important,
00:24:54.580 | 'cause there's trillions and trillions of them.
00:24:56.700 | They run a lot of our own biology in our body.
00:24:59.020 | But then people who study phages, they say,
00:25:00.700 | well, there's 10 times more phages than the bacteria,
00:25:02.500 | which can attack the bacteria and destroy them as well.
00:25:05.280 | So phage people think that they run the world.
00:25:08.240 | But we need 'em both.
00:25:09.920 | - What do you think about viruses?
00:25:13.300 | So, 'cause you said alien organisms.
00:25:16.180 | Wouldn't we encounter something like bacteria,
00:25:18.140 | something like viruses, as the first alien life form?
00:25:22.180 | Are they, first of all, are viruses alive or not?
00:25:26.340 | - So, the book definition,
00:25:28.220 | if you pick up a biology textbook,
00:25:29.740 | they'd say, technically, no,
00:25:31.360 | because they don't have the ability
00:25:33.140 | to self-replicate independently.
00:25:35.580 | But I would think, if you restructure
00:25:37.940 | how you view what life is,
00:25:39.220 | as far as autonomously aggregating
00:25:42.140 | and replicating of information.
00:25:44.040 | For example, AI at some point,
00:25:46.180 | what if there's an AI platform that we could consider alive?
00:25:48.100 | Like, at what point would you allow it to say it's alive?
00:25:50.900 | And I think we have the same definitional challenge there,
00:25:53.540 | is that if it can continually propagate instructions
00:25:57.020 | for its own existence, then it is a version of living.
00:26:01.020 | I think viruses don't get that category
00:26:03.860 | because they can't do it on their own.
00:26:06.380 | But they are a version of life, I'd say,
00:26:08.260 | but probably not alive.
00:26:09.540 | - Well, they are expressing themselves
00:26:13.260 | and doing so, on occasion,
00:26:15.280 | quite powerfully in human civilization.
00:26:17.380 | So, like you said, at which point
00:26:23.080 | are AI systems allowed to say?
00:26:26.200 | - We're life, we are--
00:26:27.780 | - Allowed.
00:26:29.440 | Humans must allow them.
00:26:32.800 | And viruses didn't ask for permission
00:26:34.820 | to express themselves to humans,
00:26:36.660 | they just kinda did.
00:26:39.200 | We didn't have to allow them.
00:26:40.640 | Are they, overall, though, exciting or terrifying to you,
00:26:46.580 | as somebody who has studied viruses?
00:26:49.080 | - Well, whenever given two options,
00:26:50.340 | there's always two more.
00:26:51.180 | You can do both or neither.
00:26:52.160 | So, here, I'll say they're both terrifying and exciting,
00:26:55.220 | I think, to me.
00:26:56.320 | More exciting than terrifying, I think,
00:26:57.980 | if I had to make that sandwich,
00:26:59.040 | and how many layers are meat versus cheese.
00:27:01.600 | There's a lot more cheese of excitement.
00:27:03.780 | - And meat is the fear, apparently,
00:27:06.200 | in this metaphor, apparently.
00:27:07.040 | - In this sandwich.
00:27:08.280 | Well, I love both, so it's a hell of a delicious sandwich.
00:27:11.900 | You quote President Dwight D. Eisenhower in your book,
00:27:16.300 | quote, "Plans are useless, but planning is essential."
00:27:20.260 | And you provide a thought experiment
00:27:21.820 | called Entropy Goggles.
00:27:23.140 | Can you describe this thought experiment?
00:27:26.040 | - Happily, I do this almost every day, somewhere,
00:27:30.880 | when I'm sitting in a given room.
00:27:32.220 | I will, well, a quick comment about that quote, actually,
00:27:35.420 | for all the NASA planning meetings for the twin study
00:27:37.500 | and other missions, that was often the quote
00:27:40.020 | that goes put up on the wall before we sit down
00:27:41.660 | for the day to plan the mission.
00:27:43.260 | It was that quote, which I--
00:27:44.100 | - Plans are useless.
00:27:45.260 | - But planning is essential, which I thought was hilarious
00:27:46.940 | for an official NASA meeting.
00:27:48.980 | But it was because you need to have a plan,
00:27:50.900 | but you have to know that plan might change.
00:27:52.460 | And so I think that's just a quick context
00:27:55.100 | for that quote.
00:27:55.940 | Craig Kundrot, who's a leader at NASA's headquarters now,
00:27:59.700 | would always put that first slide up,
00:28:00.980 | and I'm like, hmm, this meeting's either gonna go
00:28:02.900 | really well or really bad.
00:28:04.500 | I don't know what's about to happen.
00:28:05.780 | But it's an inspiring quote because it's very true.
00:28:08.660 | In any case, the Entropy Goggles is a thought experiment
00:28:11.420 | I detail in my book, which is if you just sit in a room,
00:28:16.100 | any room, wherever you are, and imagine what it will look
00:28:19.260 | like in 10 years, 100 years, 500 years,
00:28:22.900 | or even thousands of years, it is a wonderfully terrifying
00:28:27.500 | and exciting exercise, again, it's definitely both,
00:28:29.820 | because you realize the transience of everything.
00:28:31.860 | You think of what might survive.
00:28:33.180 | Almost everything that you're looking at
00:28:34.500 | will probably not be there in hundreds of years.
00:28:37.060 | It will be at the very least degraded,
00:28:38.740 | or it might be changed, altered,
00:28:39.820 | completely different, moved.
00:28:43.780 | That trait, though, of humans, to just sit there
00:28:46.260 | and project into the future, easily, really seamlessly
00:28:49.980 | with whatever you're doing previously,
00:28:52.420 | is powerful because it shows what can change
00:28:56.220 | and what should change in some cases,
00:28:58.020 | but also that left to its own devices,
00:29:01.700 | the universe would, Entropy would come take over
00:29:03.700 | and really things would decay, things would be destroyed.
00:29:06.380 | But the only thing really preventing, I think,
00:29:08.100 | some of the entropy is humans, these sort of sentient
00:29:11.180 | creatures that are aware of extinction like ourselves.
00:29:13.740 | It's really one of the only forces in the universe
00:29:15.660 | that's counteracting the second law of thermodynamics,
00:29:18.020 | this entropy that's always increasing.
00:29:20.660 | Technically, we're actually still increasing it
00:29:22.220 | because we emit heat and we never have perfect
00:29:24.420 | capture of all of energy, but we're the only things
00:29:27.020 | really actively and consciously resisting it.
00:29:30.180 | Really, you could say life in general does this.
00:29:32.140 | Like ants do this when they build their big homes.
00:29:34.580 | They're rearranging the universe to make a nice place
00:29:36.980 | for themselves and they're counteracting entropy.
00:29:39.420 | But we could actually do it in a way that would be
00:29:40.980 | at a large scale and for long term.
00:29:43.960 | - So, but the entropy goggles is just a way to realize
00:29:46.440 | how transient everything is and just imagine
00:29:49.520 | everything that will decay or change in the room around you.
00:29:51.840 | So, anyone listening, if they're listening on a train
00:29:54.280 | or driving in their car or someone is listening right now,
00:29:57.760 | looking around, everything can and will change.
00:30:00.000 | But at first, it's terrifying to see that,
00:30:03.440 | oh my gosh, everything will decay and go away.
00:30:06.120 | But then I think it's actually liberating.
00:30:08.400 | I think, wait, I can affect this, I can prevent it
00:30:11.480 | or I can affect it or I can improve the change
00:30:13.200 | that may occur all by itself, say naturally.
00:30:16.080 | And so I think it is, but it is that awareness,
00:30:19.320 | again, of the frailty of life, the ever insistence
00:30:23.200 | in increasing entropy that you can address though.
00:30:26.760 | Actually, I say the same thing to first year
00:30:28.080 | medical students, I teach them genetics.
00:30:29.540 | I say, I point early in the course, I say,
00:30:31.960 | here's all these charts of how the human body
00:30:34.000 | decays over time.
00:30:35.840 | And I call it the inexorable march towards molecular
00:30:39.400 | oblivion, which the students often find,
00:30:42.160 | they kind of laugh at, oh, because on all the charts,
00:30:43.920 | they're 22 years old, but older people do not laugh
00:30:47.240 | as much of the thought of molecular oblivion.
00:30:49.440 | But we're all marching towards it to a large degree.
00:30:52.680 | - So this is both a great thought experiment
00:30:55.400 | for the environment around you, so just looking
00:30:58.640 | at all the objects around you, that they will dissipate,
00:31:03.640 | they will disappear with time.
00:31:05.900 | But then it's also the thing you mentioned,
00:31:08.280 | which is how can I affect any of the world?
00:31:11.840 | Like, you're one little creature, and it's like,
00:31:16.840 | your life is kind of, you get dropped into this ocean,
00:31:21.240 | and you make a little splash.
00:31:22.920 | And how do I make it so the splash lasts
00:31:26.320 | for a little bit longer?
00:31:29.080 | 'Cause it ultimately will, I suppose the wave
00:31:33.200 | will continue indefinitely, but it'd be such a small
00:31:36.360 | impact, it's almost indetectable.
00:31:38.280 | And so how do I have that impact at all?
00:31:40.620 | On so many levels, I get to experience this as a human.
00:31:47.160 | Like, I recently had my cold storage hacked
00:31:53.040 | to where it was locked, essentially.
00:31:56.080 | It wasn't hacked, it was locked.
00:31:57.640 | And so you get to lose all your data.
00:31:59.240 | So for example, if you lose all your data,
00:32:01.840 | if you lose all your online presence, your social media,
00:32:06.040 | your emails, if you, like, think of all the things
00:32:09.720 | you could lose in a fire.
00:32:10.840 | There's been a lot of fires in the United States
00:32:12.600 | if you lose your home.
00:32:14.600 | And it makes you realize, wait a minute,
00:32:16.760 | this is exactly a nice simulation
00:32:19.880 | of what will happen anyway, eventually.
00:32:23.600 | And that eventually comes pretty quickly.
00:32:25.200 | And so it allows you to focus on, you know,
00:32:28.320 | how can I actually affect, so what matters?
00:32:31.720 | What lasts?
00:32:32.760 | And what brings me joy?
00:32:35.320 | I suppose that the ultimate answer is nothing lasts.
00:32:38.600 | So you have to focus on the things in the moment
00:32:41.800 | that bring you joy and that have a positive impact
00:32:44.400 | on those around you.
00:32:45.680 | That focusing on something that's long-lasting is perhaps,
00:32:49.160 | I don't know, it's complicated, right?
00:32:53.280 | 'Cause like--
00:32:54.120 | - Well, it used to be foolhardy to say,
00:32:56.840 | I wanna think, like, legacy is often what people think of
00:32:59.240 | as they approach the end of their life.
00:33:00.360 | What is my legacy, what have I done?
00:33:02.000 | Maybe even younger in life.
00:33:03.040 | But it used to be really foolhardy to say,
00:33:05.440 | I could affect something that would,
00:33:06.880 | people would build the building, architect would say,
00:33:08.640 | I'm gonna put my name on this building,
00:33:10.480 | and there I'll have some sense of immortality.
00:33:12.920 | But that's a fleeting dream.
00:33:14.840 | It's not, you can't reach immortality.
00:33:18.320 | And if you could, it would be resource,
00:33:20.480 | you know, taxing on everyone else, if you really were.
00:33:23.760 | But I think it's okay.
00:33:25.480 | I mean, the book's for the next 500 years,
00:33:27.280 | but I presume I'll be dead
00:33:28.240 | for the vast majority of that time.
00:33:30.960 | But that is actually the liberating state of mortality,
00:33:33.520 | is you know that you don't have forever.
00:33:35.840 | So it means what can you do that is the most impactful.
00:33:38.440 | But you can build things that you say,
00:33:40.000 | I want to pass this on to the next generation.
00:33:42.680 | Again, the most obvious thing we do with this
00:33:44.120 | is if people have kids.
00:33:45.460 | But they don't think of this
00:33:47.160 | as a intergenerational responsibility.
00:33:49.400 | They think of it as, well, I was at the bar one night,
00:33:51.040 | and met this hot girl, and then things happened.
00:33:53.480 | Sometimes it's more planned than that.
00:33:54.540 | But the, there's no overarching sense of,
00:33:57.520 | wait, I could have something that three or four generations
00:33:59.960 | from now, well, that someone will receive this gift
00:34:02.000 | that was planned for them
00:34:02.840 | long before they were born or gestating.
00:34:05.440 | And I think we have that capacity.
00:34:07.600 | And that can be a version of legacy.
00:34:09.620 | But it's even okay if no one knows exactly who started it,
00:34:13.820 | but that the benefit was wrought by people,
00:34:16.800 | you know, again, hundreds or even thousands of years
00:34:18.680 | after you got it started.
00:34:19.920 | So I think this is, again, it's something that is,
00:34:23.800 | only really people that are economically secure
00:34:27.880 | can even begin to do this,
00:34:29.100 | where you can say, you know,
00:34:29.940 | think of Maslow's hierarchy of needs,
00:34:31.920 | where you need to satisfy your physical needs,
00:34:33.480 | all your structural needs, and have shelter.
00:34:35.560 | And so, you know, I'm sitting from a position
00:34:37.000 | of great privilege to be able to pontificate
00:34:39.040 | about what I hope I could do for things
00:34:40.840 | for people that come 200 years from now.
00:34:42.440 | But nonetheless, more and more people can do that.
00:34:45.720 | Humanity's never been in a better state,
00:34:47.800 | quantifiably, to be able to start to think
00:34:49.960 | about these intergenerational responsibilities.
00:34:52.800 | - Yeah, this is an interesting balance,
00:34:54.840 | 'cause like, it seems that if you let the ego flare up,
00:34:59.080 | a little bit, that's good for productivity.
00:35:01.620 | Like saying, I can somehow achieve immortality
00:35:03.740 | if what I do is going to be pretty good.
00:35:06.140 | But then, that's actually being kind of dishonest
00:35:10.540 | with yourself, 'cause it won't,
00:35:13.100 | in the long arc of history, it won't matter,
00:35:15.300 | in terms of your own ego, but it will have a small piece
00:35:20.460 | to play in a larger puzzle.
00:35:22.660 | - And help people, you know.
00:35:23.980 | - And help people many generations from now.
00:35:27.380 | - And that they said, there are all these people
00:35:28.940 | who were looking after me before I was ever born.
00:35:31.080 | I think, 'cause it's a bit of just,
00:35:34.360 | when you go to a campsite, there's a camping rule
00:35:38.240 | that you always leave the campsite better than you found it.
00:35:40.160 | So if the fire pit was somewhat damaged
00:35:42.720 | and you got there, you fix it.
00:35:43.620 | If there was no wood, you leave a few bits of logs
00:35:46.280 | for the next person who comes.
00:35:47.780 | And this ethos is something that we just picked up
00:35:50.520 | from camping, and so I think if we did that as people,
00:35:52.900 | the world would be a better place,
00:35:53.920 | and the world coming ahead would also be.
00:35:57.640 | - That said, with these entropy glasses,
00:36:00.520 | how can you see through the fog?
00:36:03.560 | 500 years is a long time.
00:36:05.820 | First of all, why 500 years?
00:36:08.020 | Most people, this is so refreshing,
00:36:10.700 | 'cause most colleagues and friends I talk to
00:36:14.420 | don't have the guts to think even like 10 years out.
00:36:19.420 | They start doing wishy-washy kind of statements
00:36:22.300 | about, well, you don't know.
00:36:23.780 | But it's so refreshing to say, all right,
00:36:25.740 | I know there's so many trajectories
00:36:27.640 | that this world can take, but I'm going to pick a few
00:36:30.860 | and think through them and think what,
00:36:34.580 | well, it's the quote, right?
00:36:35.820 | Plans are useless, but planning is essential.
00:36:38.220 | So why 500 years?
00:36:40.900 | - So 500 was a little bit of what I felt like
00:36:43.540 | I could see clearly through the entropy goggles.
00:36:45.700 | I feel like I can't see--
00:36:47.140 | - Just a contradiction in terms, yeah.
00:36:48.620 | - Right, right, right, I can see.
00:36:50.780 | I mean, for example, if you said,
00:36:52.220 | Chris, what's gonna happen in a million years?
00:36:54.740 | Well, I'll start to describe what happens to,
00:36:58.620 | the moon will be farther away
00:37:00.020 | 'cause it moves several inches away every year.
00:37:02.260 | And so then eventually you can't have a full,
00:37:04.860 | lunar eclipse after a while.
00:37:06.020 | I think about structures of continental change
00:37:08.580 | and things will move.
00:37:09.420 | I could describe some things,
00:37:10.240 | but it starts to become so vague.
00:37:12.820 | It's just not a useful exercise.
00:37:14.300 | I think if it's too far out, if it's too soon,
00:37:16.820 | that's not that much different
00:37:18.140 | from what people just do with the news
00:37:19.540 | and say, I think this is what the economy might look like
00:37:21.420 | over the next year or two years.
00:37:23.140 | Economists are notoriously not held accountable
00:37:26.420 | when they have really bad predictions.
00:37:27.820 | You can make really awful predictions
00:37:29.460 | and no one seems to care.
00:37:30.300 | You can just make another one next week.
00:37:31.580 | So too short is, I think, not necessarily as helpful.
00:37:35.860 | But 500, I actually, when I was first working on the book
00:37:39.180 | and thinking about time, I thought,
00:37:40.020 | well, do I do a thousand or two?
00:37:42.060 | I kept thinking about, the main idea was,
00:37:44.540 | if I were to pick this up 500 years from now,
00:37:47.100 | what would it look like?
00:37:47.940 | I changed the number.
00:37:48.780 | If I pick up a thousand years from now or a hundred.
00:37:51.900 | And I kept trying to think of,
00:37:53.460 | what are some timeframes
00:37:54.980 | where really large scale changes have happened?
00:37:56.660 | And so, in some sense, you could argue
00:37:58.660 | that humans have been mostly the same
00:38:00.140 | for about three or 4,000 years.
00:38:01.580 | And the best example is this.
00:38:02.980 | You looked at some of the Homer's poems
00:38:04.940 | or the Greek tragedies in Oedipus, for example.
00:38:08.620 | Humans are really almost identical.
00:38:11.260 | We're still petty and people have affairs
00:38:14.340 | and people do things they shouldn't.
00:38:16.300 | People, it's a--
00:38:17.140 | - You're saying all those things like it's bad.
00:38:19.140 | - I know, it's just me.
00:38:20.620 | You read that it's astounding and in some sense soothing
00:38:24.020 | that the Greek tragedies of 2,300 years ago
00:38:27.140 | are very relatable to what happens
00:38:29.340 | in every high school.
00:38:30.460 | So, that's why you read them in high school.
00:38:32.260 | Like, oh, that's really a clear part of the human condition.
00:38:34.700 | So, on that sense, some things are really permanent.
00:38:36.620 | But I want to think of a few reasons I chose 500
00:38:39.980 | is that it's a timeframe where I could foresee
00:38:42.740 | clear development of some biotechnology
00:38:44.620 | that will get us to a new place,
00:38:46.120 | including missions to Mars that are planned
00:38:48.260 | that will be there and that would start
00:38:49.700 | to have settlements there on the moon and Mars.
00:38:52.620 | And I could see also that by that time,
00:38:55.240 | I think we would have enough knowledge of biology
00:38:57.260 | and technology and space medicine
00:39:00.100 | to start to prepare for an interstellar mission,
00:39:02.660 | to actually send people on a craft
00:39:04.220 | that would have what's called a generation ship.
00:39:06.900 | People live and die on the same spacecraft
00:39:08.580 | on the way towards a destination.
00:39:10.460 | But I think we need that much time
00:39:11.900 | to actually perfect the technology
00:39:14.140 | and to learn enough about physiology
00:39:15.740 | to be able to make it for that distance.
00:39:18.280 | - And the book is kind of focused on the human story.
00:39:22.900 | So, a specific slice of the possible futures.
00:39:26.300 | - Yes.
00:39:27.500 | - There could be sort of AI systems,
00:39:29.620 | there could be other technologies
00:39:30.900 | that kind of build up the world.
00:39:33.060 | So much of the world might be lived in virtual reality.
00:39:35.940 | So, you're not touching any of that,
00:39:37.260 | you're sticking to biology.
00:39:38.620 | Well, not, you're touching a little bit,
00:39:41.180 | but focused on what the cells that make up the human body.
00:39:45.660 | How do they change?
00:39:47.180 | How do we design technologies to repair them?
00:39:50.060 | And how do we protect them
00:39:52.560 | as they travel out into the cosmos?
00:39:56.820 | - Absolutely, and it's something that is part of the duty.
00:39:58.780 | If your duty is to keep life safe,
00:40:00.640 | you have to consider all means to do so.
00:40:03.180 | And engineering life to save itself
00:40:05.600 | is definitely on that list.
00:40:07.660 | And I think we can imagine in that timeframe,
00:40:11.300 | 500 years, that we would,
00:40:13.820 | there will be AI that's continually advancing.
00:40:17.160 | I actually say that I'm matter agnostic towards cognition.
00:40:21.300 | So, if your matter is carbon atoms and cells and tissues
00:40:24.780 | and you have cognition, bravo, good for you.
00:40:27.060 | If you're silicon based and you're in chips
00:40:29.660 | and you're in AI, that's all virtual,
00:40:31.540 | but we reach a state of well beyond the Turing test
00:40:34.380 | and really clearly intelligent, congratulations to you too.
00:40:37.620 | So, I feel like this sense of duty is applicable
00:40:41.580 | regardless of what the state of matter
00:40:43.180 | your cognition is based in.
00:40:44.380 | So, I would imagine that AI platforms
00:40:47.320 | that are really intelligent
00:40:48.280 | might also get a sense of this duty.
00:40:49.800 | Or I hope that would, I wrote the book on them too.
00:40:51.800 | - That can carry that flame
00:40:53.540 | of whatever makes humans special.
00:40:55.480 | So, but why nevertheless is so much of your focus
00:41:00.480 | on this human meat vehicle?
00:41:04.000 | Do you think it's essential?
00:41:06.200 | - It doesn't have to be meat, no, it definitely does not.
00:41:08.060 | It could be, I'm hoping that the AI platforms
00:41:11.160 | that we've built or that would become,
00:41:12.880 | that would start to build themselves
00:41:14.940 | would also carry the sense of duty.
00:41:17.000 | 'Cause at that point they would be life.
00:41:18.400 | And so, whichever means that life,
00:41:21.600 | whatever form life takes, it should have this duty I think.
00:41:25.660 | - Will it have the lessons of genetics, genomics,
00:41:29.780 | DNA and RNA and proteins and the squishy stuff
00:41:33.940 | that makes us human, are those lessons a temporary thing
00:41:38.860 | that will discard or will those lessons
00:41:42.100 | be carried forward?
00:41:43.540 | - You mean like if the machines completely take over,
00:41:45.740 | let's say, and it's all--
00:41:46.980 | - Not necessarily completely take over,
00:41:48.360 | but either completely take over or merge with humans
00:41:51.300 | in some interesting way where we,
00:41:53.460 | as opposed to figuring out how to repair cells
00:41:56.000 | and protect cells, we start having some cyborg cells.
00:42:01.000 | - I think we will, there'll definitely be a blending
00:42:03.100 | and blending's already happened.
00:42:04.020 | There's prosthetic limbs, there's cybernetic limbs,
00:42:06.980 | there's neural link, progress being made
00:42:10.100 | to blend biology and cybernetics and machines for sure.
00:42:13.700 | But I think in the long term, we'll see that
00:42:18.380 | the biology would still be useful
00:42:21.820 | because it's a manufacturing system.
00:42:24.920 | All of life is a way to create copies of things
00:42:27.720 | or to replicate information,
00:42:28.900 | including storage of information.
00:42:30.900 | Actually, hard drives are probably one of the worst ways
00:42:32.760 | for long-term storage.
00:42:34.340 | DNA might end up being the best way to have
00:42:37.020 | millennia or even longer scale storage
00:42:40.380 | where you want something that has redundancy
00:42:41.980 | that's built in and it can store
00:42:43.900 | and can be put at really cold temperatures
00:42:45.660 | and survive even cosmic rays.
00:42:47.860 | So I think DNA might be the best hard drive
00:42:50.700 | of the future potentially.
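To make the DNA-as-a-hard-drive idea concrete, here is a minimal sketch of the core trick, assuming the simplest possible two-bits-per-base mapping; the encode/decode helpers below are purely illustrative, and real DNA data-storage schemes layer error correction and sequence constraints on top of this.

```python
# Minimal sketch: store arbitrary bytes in a synthetic DNA strand by mapping
# every 2 bits to one nucleotide, then recover them. Real DNA storage adds
# error correction and avoids long repeats; this is only the core intuition.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a string of A/C/G/T (2 bits per base)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from the synthetic strand."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"hello from orbit"
strand = encode(message)
print(strand[:16])                 # "CGGACGCCCGTACGTA" for the first four bytes
assert decode(strand) == message   # round-trips losslessly
```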
00:42:52.220 | - This is really interesting.
00:42:53.060 | Okay, what is DNA?
00:42:55.020 | What is RNA?
00:42:56.620 | And what are genes?
00:42:58.100 | - Yes, we should, 'cause most,
00:42:59.500 | I presume the audience knows it,
00:43:00.660 | but some might just be first-time listeners.
00:43:03.420 | - There's a person right now--
00:43:04.820 | - Who's late.
00:43:05.860 | - In Brazil smoking a joint, sitting on the beach,
00:43:09.700 | and just wants to learn about DNA.
00:43:11.820 | So please, can you explain it to them?
00:43:14.420 | - DNA, the deoxyribonucleic acid, is the recipe for life.
00:43:19.180 | It is what carries the instructions.
00:43:21.300 | In almost all of your cells, you have a copy of your genome.
00:43:24.000 | It's actually the reason I became a geneticist
00:43:25.980 | is 'cause the day I learned that as an embryo,
00:43:28.700 | we start with just a single cell,
00:43:30.100 | but all the instructions that are there
00:43:31.540 | to make every single type of cell in your body,
00:43:34.480 | I was, and still am, endlessly fascinated by that.
00:43:37.980 | That is extraordinary.
00:43:39.220 | That is, to me, the most beautiful thing
00:43:41.260 | in the entire universe.
00:43:42.180 | That is, from one single embryo,
00:43:45.060 | everything is there to make the entire body.
00:43:46.860 | - Which aspect of that is most beautiful?
00:43:48.540 | So is it that there is this information within DNA
00:43:52.860 | that's stored efficiently,
00:43:55.220 | and it also stores information on how to build,
00:43:58.100 | not just what to build.
00:43:59.700 | - Yeah.
00:44:00.540 | - And so from all of that, what's the sexiest,
00:44:03.980 | what's the most beautiful aspect?
00:44:05.220 | Is it the entire machinery,
00:44:07.180 | or is it just the information is there?
00:44:09.420 | - It's the fact that the machinery is the information.
00:44:13.540 | It becomes its own manufacturer is what is extraordinary.
00:44:18.540 | Imagine if you took a two-by-four
00:44:21.220 | and you threw it on the ground,
00:44:22.140 | and you said I'll be back in a day,
00:44:23.340 | and then a whole house was made when you came back.
00:44:25.100 | I mean, we would all lose our minds.
00:44:26.700 | A lot of people would poop their pants.
00:44:27.780 | People would have to wear adult diapers.
00:44:28.980 | It would be a big scene if that happened.
00:44:31.780 | And we're actually getting close to that,
00:44:33.380 | to people having autonomous house building.
00:44:35.300 | It's not quite there yet,
00:44:36.140 | but there are people trying to make robots
00:44:38.700 | that will build entire houses for you.
00:44:39.940 | - But you need much more than the block of wood.
00:44:41.700 | - Right, right, that's the extraordinary thing,
00:44:43.420 | is just one piece of wood there,
00:44:45.180 | and say I'll just leave it there for a few days,
00:44:46.860 | and I'll come back.
00:44:47.700 | That's basically what embryos do.
00:44:48.900 | Okay, it takes nine months, a little bit longer,
00:44:50.420 | but still, that is nothing short of magic, right?
00:44:53.620 | So I think that's what I love about the fact
00:44:56.980 | that DNA carries that information.
00:44:58.260 | Now, the information is static,
00:45:00.340 | so to actually read that information,
00:45:02.180 | and to actually put it into motion,
00:45:04.020 | is where RNA comes in.
00:45:05.020 | So this ribonucleic acid,
00:45:06.620 | so it just has one other oxygen added to it,
00:45:09.500 | versus DNA, but it's the transcribed version.
00:45:11.660 | It's like if you look at a book,
00:45:13.380 | and you have it in your hands,
00:45:14.980 | but then you start to read it aloud,
00:45:16.340 | it becomes the active form of the recipe for life,
00:45:19.140 | is the RNA.
00:45:20.260 | And those RNAs also then get translated to become proteins,
00:45:24.060 | to become active forms like enzymes.
00:45:26.180 | You think of like your hair,
00:45:27.620 | or think of other ways you digest food.
00:45:29.620 | There's all these active proteins going around
00:45:32.340 | that are copying your DNA, making RNA,
00:45:35.020 | making sure your DNA is safe.
00:45:36.620 | All these built-in systems
00:45:37.980 | to keep your cells in check and working,
00:45:40.900 | and these are often in protein form.
00:45:43.140 | And so genes are really these constructs,
00:45:47.180 | basically what are the instruction sets?
00:45:48.820 | Like how many versions of instructions
00:45:50.700 | do you have in your genome?
00:45:51.980 | So the genome is the collection
00:45:53.660 | of all the DNA of a person.
00:45:54.940 | For humans, it's about three billion letters
00:45:57.060 | of genetic code.
00:45:57.900 | Three billion A's, C's, G's, and T's,
00:46:00.140 | these nucleotides that are the recipe for life,
00:46:02.340 | and that's it.
00:46:03.180 | That is the entire instruction set
00:46:04.780 | to go from that one embryo up to a full human,
00:46:07.460 | which is pretty efficient,
00:46:08.380 | to say that's actually not that much information.
00:46:11.300 | And in that three billion letters
00:46:12.620 | are snippets of the genes,
00:46:14.360 | which are independently regulated,
00:46:16.860 | autonomous instruction sets, if you will,
00:46:18.980 | these really active forms of the instructions
00:46:22.700 | from your DNA to say, "Make a protein,
00:46:24.980 | make this RNA, or turn off some other part of a cell."
00:46:28.460 | All those instructions are there in our DNA,
00:46:31.420 | and there's about 60,000 of these genes
00:46:33.500 | that are in our genome.
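As a rough sense of scale for that point, here is the back-of-the-envelope arithmetic, using only the round numbers from the conversation (three billion letters, four possible bases) and ignoring compression or the two copies each cell carries.

```python
# Back-of-the-envelope: how much raw information is in "three billion letters"?
# Four possible bases (A, C, G, T) means 2 bits per position.

bases = 3_000_000_000        # ~3 billion letters in one copy of the genome
bits_per_base = 2            # log2(4)
total_bytes = bases * bits_per_base / 8

print(f"~{total_bytes / 1e6:.0f} MB")  # ~750 MB: the whole instruction set fits on a DVD
```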
00:46:34.580 | - So how do those all lead up to you
00:46:36.740 | having a personality, good memory and bad memory,
00:46:42.060 | some of the functional characteristics
00:46:43.900 | that we at the human level are able to interpret,
00:46:46.380 | the way your face look, the way you smile,
00:46:50.260 | you're good at running or jumping,
00:46:54.540 | whether you're good at math and all those kinds of things.
00:46:56.860 | - There's an age-old debate of nature versus nurture.
00:46:59.180 | So like most things, if given two options,
00:47:01.420 | you can of course have both.
00:47:02.340 | So almost every trait that we know of in humanity
00:47:05.620 | has mixtures of nurture and nature.
00:47:07.940 | Some of them are purely nurture.
00:47:09.620 | So most people are probably familiar with twin studies,
00:47:11.820 | but twin studies are one of the best ways to gauge
00:47:14.120 | how much is something nurture versus nature,
00:47:16.020 | how much of it is really ingrained
00:47:18.340 | and has probably less ability to change
00:47:20.020 | versus how much can you really train.
00:47:21.320 | So height, for example,
00:47:23.380 | is one of the most obvious inheritable traits,
00:47:25.980 | but it doesn't have one gene.
00:47:26.940 | It probably has at least 50 or 60 genes
00:47:29.500 | that contribute to height.
00:47:30.700 | So there's not like a gene for height.
00:47:33.340 | Some people think of like the gene for cystic fibrosis.
00:47:35.580 | Now in that case, that's true.
00:47:36.780 | There is one gene that if you have mutations,
00:47:38.960 | you get cystic fibrosis as a disease.
00:47:41.140 | But for other traits, they're much more complicated.
00:47:43.420 | They can have dozens or even hundreds of genes
00:47:45.580 | that influence your risk and what appears.
00:47:48.140 | But from twin studies, you take monozygotic twins,
00:47:50.660 | twins that are identical, and you can clearly tell.
00:47:53.060 | They look, they have the same facial structure,
00:47:54.460 | similar intonation, similar even likes.
00:47:56.900 | And you compare them to dizygotic twins,
00:47:59.460 | or when you have fraternal twins,
00:48:00.740 | you can have a male and female,
00:48:02.100 | for example, in the same uterus.
00:48:03.300 | And those are dizygotic twins or two zygotes.
00:48:06.420 | So in that case, they share 50% of their DNA,
00:48:09.140 | but they share the same womb.
00:48:10.820 | And then what you can look at is,
00:48:12.840 | what's the difference between identical twins
00:48:14.180 | versus fraternal twins?
00:48:15.740 | And calculate that difference for any trait.
00:48:17.900 | And that gives you an estimate of the heritability,
00:48:20.020 | or what's called H squared.
00:48:21.820 | So that's what we've been doing for almost every trait
00:48:24.060 | in humanity for the past 100 years,
00:48:26.060 | we've been trying to measure this.
00:48:27.340 | And religion is one that's a negative control.
00:48:29.020 | So if you separate people and see what religion
00:48:30.860 | they become, there's no gene for religion,
00:48:32.960 | or what religion you choose.
00:48:34.460 | So often, the correlation there is zero,
00:48:37.280 | because it should be.
00:48:38.120 | It's a nurture trait, what religion you end up taking
00:48:41.500 | is not encoded in your DNA.
00:48:43.980 | - Religion meaning Islam, Judaism, Christianity,
00:48:47.880 | but there could be aspects of religions that--
00:48:50.820 | - Good question, there is religiosity as a trait
00:48:53.740 | that has been studied in twins,
00:48:55.100 | and that has a heritable component to some degree.
00:48:58.060 | So, and things like boredom susceptibility is a trait.
00:49:01.020 | One of my favorite papers just looked at,
00:49:02.460 | how likely is it that people get bored?
00:49:04.220 | And they looked at identical twins and fraternal twins,
00:49:06.620 | and there's a heritability of about 30%.
00:49:08.700 | So it's mostly not heritable, it's mostly environmental,
00:49:11.340 | but that means to some degree, whether or not you're bored,
00:49:14.300 | you can say, well, it's a little bit of my genes.
00:49:16.420 | You could, a little bit, not a lot,
00:49:18.460 | but most traits have some degree,
00:49:20.260 | and they're probably overlapping with other traits.
00:49:22.100 | Like your boredom susceptibility
00:49:23.900 | versus risk-seeking behavior are interrelated.
00:49:25.940 | So how likely are you to say, I wanna go cliff jumping,
00:49:29.840 | or I wanna go, I wanna do freebasing,
00:49:31.580 | or I wanna do something else that's risky behavior.
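The twin comparison he describes reduces to a simple calculation; below is a minimal sketch using Falconer's classic formula, which estimates heritability as twice the gap between identical-twin and fraternal-twin correlations. The correlation values are made up purely to illustrate the logic.

```python
# Minimal sketch of estimating heritability (H^2) from twin data with
# Falconer's formula. Correlation values below are hypothetical examples.

def heritability(r_mz: float, r_dz: float) -> float:
    """H^2 = 2 * (identical-twin correlation - fraternal-twin correlation)."""
    return 2.0 * (r_mz - r_dz)

# Strongly heritable trait: identical twins match far more than fraternal twins.
print(f"height-like trait:   H^2 ~ {heritability(0.90, 0.55):.2f}")   # ~0.70

# Mostly environmental trait (like the ~30% boredom-susceptibility example).
print(f"boredom-like trait:  H^2 ~ {heritability(0.45, 0.30):.2f}")   # ~0.30

# Pure nurture trait (the religion negative control): no gap, so H^2 ~ 0.
print(f"religion (control):  H^2 ~ {heritability(0.60, 0.60):.2f}")   # ~0.00
```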
00:49:34.960 | - So speaking of twin studies,
00:49:37.100 | Scott Kelly spent 340 consecutive days out in space.
00:49:42.100 | You analyzed his molecular data,
00:49:45.340 | DNA, RNA, proteins, small molecules.
00:49:47.780 | What did you learn about the effect of space
00:49:50.540 | on the human body from Scott?
00:49:53.340 | - We learned that space is rough on the human body,
00:49:56.140 | but that the human body is amazingly
00:50:00.020 | and monstrously responsive to adapt to that challenge.
00:50:03.980 | It can rise to the occasion.
00:50:05.940 | So we can see there, Scott had,
00:50:08.060 | as almost all astronauts do, a bit of puffiness and spikes
00:50:11.240 | in his bloodstream of these, what are called cytokines,
00:50:13.940 | or these inflammation markers. The body
00:50:15.540 | is clearly saying to itself, holy crap, I'm in space.
00:50:19.100 | And liters of fluid move to the upper torso,
00:50:21.900 | and they get a puffy face, what's called the,
00:50:23.820 | an astronaut face that is very common,
00:50:26.040 | but it goes away after a few days.
00:50:27.980 | And some astronauts maintain high levels of stress
00:50:31.480 | for their whole mission, as measured by cortisol
00:50:33.280 | or some of these other inflammation markers.
00:50:35.440 | Whereas Scott actually had a little spike,
00:50:37.300 | but then he was cool as a cucumber for most of the mission.
00:50:40.260 | But he had spent, at that time,
00:50:41.760 | that was the longest ever mission for a US astronaut.
00:50:44.420 | A few cosmonauts have gone a little bit longer,
00:50:46.700 | but there'd never been a deep molecular analysis
00:50:48.620 | of what happens to the body after about a year in space.
00:50:50.540 | So it was the first study of this kind.
00:50:53.100 | And what we found is, when he got back,
00:50:55.780 | we saw all the same markers of stress on the body
00:50:58.520 | and changes spiked up to levels we'd never seen
00:51:01.340 | for any other astronaut before.
00:51:02.620 | So it seemed like going to space for a year
00:51:04.580 | wasn't so hard as much as returning to gravity
00:51:07.020 | after a year, it was much harder on the body.
00:51:09.580 | He notoriously broke out in a rash all over his body,
00:51:13.020 | and really, even the weight of clothing on his skin
00:51:15.660 | was too heavy, it created all this irritation
00:51:17.340 | 'cause his body had not felt the weight
00:51:19.500 | of just a simple T-shirt.
00:51:21.440 | It had had zero weight, of course, right,
00:51:23.180 | when it was up in space.
00:51:24.620 | So that led to all this inflammation, all these changes.
00:51:26.660 | He had to, you know, he was much more comfortable
00:51:29.140 | just to walk around nude.
00:51:30.460 | In that case, it was for medical reasons.
00:51:31.820 | Some people do this recreationally.
00:51:33.460 | He was doing it for medical purposes.
00:51:34.780 | - I do it for medical reasons as well.
00:51:37.140 | - All the time.
00:51:37.980 | - I mean, people say--
00:51:38.820 | - I have a prescription, the doctor told me.
00:51:41.700 | - So he was allergic to Earth, you can say.
00:51:43.060 | - Yeah, exactly.
00:51:43.900 | - Which is fascinating to think about, actually.
00:51:45.540 | How quick did his body adapt there?
00:51:47.540 | - So there, it was about three to four days
00:51:49.220 | he got back to normal, at least in terms of the inflammation.
00:51:51.660 | But what's extraordinary is that we measured
00:51:54.620 | a lot of other molecules, genes, structural changes,
00:51:58.660 | tissue, looked at his eyeballs, looked at his vasculature.
00:52:01.300 | It took him, even six months after the mission,
00:52:03.420 | a lot of the genes that had become activated
00:52:05.220 | in response to space flight were still active.
00:52:08.140 | So things like, we could see his body repairing DNA.
00:52:11.140 | It was being irradiated by cosmic rays
00:52:13.060 | and by the radiation.
00:52:14.220 | It's the equivalent of giving three or four
00:52:15.940 | chest x-rays every day, just in space.
00:52:18.860 | And we could see his body working hard
00:52:21.180 | at the molecular level to repair itself.
00:52:23.020 | And even in his urine, we could see bits of
00:52:24.980 | what's called 8-oxoguanosine, a form of damaged DNA
00:52:28.020 | that you could see coming out.
00:52:29.500 | And we see it for other astronauts as well.
00:52:30.860 | So it's very common.
00:52:31.700 | You can see damaged DNA, the response of the body
00:52:33.900 | to repair the DNA.
00:52:35.500 | But even though he'd been back on Earth for six months,
00:52:37.100 | that was still happening, even six months later.
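As rough arithmetic on that chest x-ray comparison: assuming the commonly quoted ~0.1 mSv per chest x-ray (an outside figure, not stated in the conversation), the daily and mission-length doses work out roughly as follows.

```python
# Back-of-the-envelope dose from "three or four chest x-rays per day".
# The 0.1 mSv per chest x-ray figure is an assumed typical value.

msv_per_chest_xray = 0.1
xrays_per_day = 3.5          # midpoint of "three or four"
mission_days = 340           # length of Scott Kelly's mission

daily_dose = xrays_per_day * msv_per_chest_xray
mission_dose = daily_dose * mission_days

print(f"~{daily_dose:.2f} mSv/day, ~{mission_dose:.0f} mSv over the mission")
# ~0.35 mSv/day, ~119 mSv total
```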
00:52:39.540 | - How do you, wait, how do you explain that?
00:52:42.020 | - So some of this has to do with,
00:52:43.580 | when you have a gene get activated,
00:52:45.600 | you might think, oh, it's like a light switch.
00:52:46.980 | I'll look at my wall, just flip a light on or off.
00:52:49.620 | And sometimes turning a gene on or off is that simple.
00:52:52.200 | Sometimes you just flip it on because the gene
00:52:53.740 | is already ready to go.
00:52:55.460 | Other cases, though, you have to reprogram
00:52:57.140 | even the structure of how your DNA is packaged.
00:53:00.060 | It's called an epigenetic rearrangement.
00:53:01.820 | In that case, we could see that a lot of these genes
00:53:04.260 | had been, his cells had changed the structure
00:53:06.940 | of how DNA was packaged.
00:53:08.500 | And it remained open even months after the mission.
00:53:11.140 | Now, after about a year, it was actually
00:53:12.500 | almost all back to normal.
00:53:13.420 | 99% of all the genes were back to where they were
00:53:15.480 | in pre-flight levels.
00:53:16.900 | So it means that, you know, eventually you'll adapt,
00:53:18.660 | but there's almost a lag time,
00:53:21.100 | kind of like jet lag for the body,
00:53:22.540 | but jet lag for your cells to repair all the DNA.
00:53:25.060 | - What was the most surprising thing
00:53:27.920 | that you found in that study?
00:53:29.620 | - There were several surprises.
00:53:30.580 | One is just that he, that the repair,
00:53:33.420 | as I just mentioned, that the repair took so long.
00:53:35.020 | I thought maybe a week or a few days,
00:53:36.540 | he'll be back to normal.
00:53:37.820 | But to see this molecular echo in his cells
00:53:40.820 | of his time in space still occurring was interesting.
00:53:43.060 | His telomeres was one that was really surprising.
00:53:44.980 | The caps on the ends of your chromosomes,
00:53:47.380 | which keep all your DNA packaged,
00:53:49.160 | and you get half your chromosomes from your mother
00:53:50.920 | and half from your father,
00:53:51.760 | and then you go on and make all your cells.
00:53:53.740 | Normally, these shrink as you get older,
00:53:55.660 | and telomere length is just an overall sign
00:53:58.340 | of aging, getting shorter.
00:53:59.980 | His telomeres got longer in space.
00:54:02.260 | And so this was really surprising
00:54:03.260 | 'cause we thought the opposite would happen.
00:54:04.420 | So he, that was genetically one surprise,
00:54:07.100 | and also some of the mutations we found in his blood,
00:54:09.820 | he had less mutations in blood,
00:54:11.180 | as if his body was almost being cleansed;
00:54:12.980 | that a low dose of radiation was sort of cleansing his body
00:54:15.540 | of maybe the cells that were about to die
00:54:17.140 | is one of our main theories on what's happening.
00:54:19.940 | - And of course, you can't really,
00:54:21.640 | you have theories, but you can't,
00:54:23.640 | 'cause the number of subjects in the study is small.
00:54:27.020 | - Right, right, it's notoriously one
00:54:28.980 | of the lowest-powered studies in human history, yes.
00:54:31.020 | But what you lack in subjects,
00:54:34.180 | you can make up for in the number of sampling times.
00:54:36.060 | So we did basically 260 samples collected
00:54:39.180 | over the course of three years.
00:54:40.260 | So we really, almost every few weeks,
00:54:42.340 | had a full workup, including in space.
00:54:45.300 | So that was the way we tried to make up for it.
00:54:47.180 | But we've tried in other model organisms.
00:54:49.180 | In mice, we've seen this.
00:54:50.020 | We've looked now in 59 other astronauts.
00:54:52.260 | And in every astronaut that we've looked at,
00:54:53.740 | their telomeres get longer in space.
00:54:56.100 | - Does that indicate anything about lifespan,
00:54:58.420 | all those kinds of things, or no?
00:55:00.160 | You can't make any of those kinds of jumps?
00:55:01.580 | - Not yet.
00:55:02.420 | I won't make that jump yet,
00:55:03.240 | but it does indicate that there is a version
00:55:05.660 | of cleansing, if you will, that's happening in space.
00:55:08.820 | A mixture of, we see this actually clinically
00:55:11.340 | at our hospital.
00:55:12.260 | You can do a low dose of radiation
00:55:14.340 | with some targeted therapies
00:55:15.740 | to kind of activate your immune cells.
00:55:17.780 | It's even tried clinically.
00:55:19.700 | So this idea of just a little bit of stress on the body,
00:55:22.380 | or what's called hormesis,
00:55:23.780 | may prime you into actively cleansing
00:55:26.220 | things that were about to die.
00:55:27.460 | - And that includes stress caused by space.
00:55:31.180 | - Yes, yeah, apparently.
00:55:32.440 | - So how do we adapt the human body
00:55:37.440 | to stress of this kind for periods of multiple years?
00:55:42.140 | What lessons do you draw from that study
00:55:44.840 | and other experiments in space
00:55:46.640 | that give you an indication
00:55:48.560 | of how we can survive for multiple years?
00:55:51.960 | - I think we know that the radiation
00:55:53.720 | is one of the biggest risk factors,
00:55:55.400 | and this has been well described by NASA
00:55:57.560 | and many other astronauts and researchers.
00:56:00.160 | And so there, we don't have to just measure the radiation
00:56:03.560 | or just look at DNA being damaged.
00:56:05.680 | We can actually actively repair it.
00:56:07.640 | This happens naturally in all of our cells.
00:56:09.360 | There's little enzymes, little proteins,
00:56:12.380 | and really many machines that go around
00:56:14.840 | and scan DNA for nicks and breaks and repair it.
00:56:17.520 | We could improve them.
00:56:18.800 | We could add more of them,
00:56:20.440 | or you can even activate them before you go into space.
00:56:24.200 | We have one set of cells in my lab
00:56:26.280 | where you activate them before we irradiate them
00:56:28.720 | and actually prepare them for the dose of radiation.
00:56:31.520 | And now that is what's called epigenetic CRISPR therapies,
00:56:35.160 | where you can actually,
00:56:36.000 | instead of adding or taking away a gene or modifying a cell,
00:56:39.720 | you just change kind of how it's packaged.
00:56:41.240 | Like I was just describing that the DNA,
00:56:43.160 | the genes are still there.
00:56:44.280 | We're just changing how they get used.
00:56:45.640 | And so you can actually preemptively activate
00:56:47.960 | the DNA repair genes, and we've done this for cells.
00:56:49.900 | We haven't done this yet for astronauts,
00:56:51.240 | but we've done it for cells.
00:56:53.320 | And a similar idea to this is being used
00:56:55.380 | to treat sickle cell disease and beta thalassemia.
00:56:57.880 | You can reactivate a gene that was dormant
00:57:00.760 | in a way as a therapy.
00:57:02.400 | - So should we make human genes resilient
00:57:05.080 | to harsh conditions,
00:57:06.120 | or should we get good at repairing them?
00:57:08.120 | - I wanna get good at--
00:57:11.320 | - Okay, sorry to interrupt.
00:57:13.080 | I think every time I ask this question,
00:57:14.760 | you have taught me that there's always a third option.
00:57:17.800 | Say both. - I will say both.
00:57:19.640 | I know for copy, it's good to just have one big statement,
00:57:24.640 | but you wanna do both, or a third option.
00:57:27.200 | I would want to do electromagnetic shielding.
00:57:29.820 | I would wanna do a fourth option of maybe
00:57:31.960 | some other kind of physical defenses.
00:57:34.040 | - So outside of the human body.
00:57:35.240 | - Yeah, so we're taking the same protection
00:57:37.480 | to keep astronauts safe that's outside of them,
00:57:39.280 | and just putting it in their cells is what I propose.
00:57:41.360 | Now, it's a bit radical today,
00:57:42.920 | because we're just starting this in clinical trials
00:57:46.920 | to treat diseases on Earth.
00:57:48.900 | So it's not ready, I think, to do in astronauts.
00:57:50.540 | But in the book, I propose by about the year 2040,
00:57:53.360 | that's when we'd reach this next phase,
00:57:54.680 | where I think, well, we'd have known enough
00:57:56.240 | about the clinical response.
00:57:57.320 | We'll have the technology ironed out.
00:57:59.160 | That's about when it's time, I think, to try it.
00:58:01.520 | - So what are some interesting early milestones?
00:58:05.360 | So you said 2040.
00:58:08.080 | What do we have to look forward to in the next 10, 20 years,
00:58:11.400 | according to your book, according to your thoughts?
00:58:13.440 | - A lot of really exciting developments,
00:58:15.060 | where if you really want to activate genes,
00:58:18.840 | like I was just describing,
00:58:20.000 | or repair a specific disease gene,
00:58:22.880 | you can actually CRISPR it out and modify it.
00:58:24.760 | This has been already published,
00:58:26.120 | and well-documented.
00:58:27.320 | But as I was alluding to,
00:58:29.120 | more and more we'll see people
00:58:30.480 | that you just want to temporarily change
00:58:32.440 | your genes' functions, and change their activity.
00:58:35.240 | So the best example of this is for beta thalassemia.
00:58:38.080 | We all have hemoglobin in our blood
00:58:39.660 | that carries oxygen around.
00:58:41.240 | And when you're an adult, it's a different version.
00:58:43.060 | It's a different gene.
00:58:43.900 | You have one gene when you're a fetus,
00:58:45.160 | called fetal hemoglobin.
00:58:46.760 | When you're an adult, you have a different gene.
00:58:48.080 | But they both are making a protein that carries oxygen.
00:58:50.960 | After you're born,
00:58:51.840 | the fetal hemoglobin gene gets just turned off.
00:58:53.880 | Just goes away, and you replace it with adult hemoglobin.
00:58:56.800 | But if your gene for hemoglobin is bad as an adult,
00:59:00.360 | then one of the therapies is,
00:59:01.200 | "Well, let's turn back on the gene
00:59:03.080 | "that you had when you were a fetus."
00:59:04.280 | And it's actually already led to cures
00:59:05.760 | for sickle cell and beta thalassemia in this past year.
00:59:09.160 | So it's this extraordinary idea of like,
00:59:10.280 | "Well, you already have some of the genetic solutions
00:59:13.260 | "in your body.
00:59:14.100 | "Why don't we just reactivate them,
00:59:15.360 | "and see if you can live?"
00:59:16.500 | And indeed you can.
00:59:17.340 | So I think we'll see more of that.
00:59:18.500 | That's for severe disease,
00:59:19.520 | but eventually you could see it for more,
00:59:21.760 | I think, work-related purposes.
00:59:22.920 | Like if you're working in a dangerous mine,
00:59:24.720 | or in a high-radiation environment,
00:59:27.920 | you could basically start to prime it for work safety.
00:59:31.080 | Basically, we need to genetically protect you.
00:59:33.160 | Now, it would have to be shown
00:59:34.920 | that that genetic option is safe, reliable,
00:59:39.040 | that it's better, at least as good,
00:59:40.880 | or if not better, than other shielding methods.
00:59:43.360 | But I think we'll start to see that more
00:59:44.760 | in the next 10, 20 years.
00:59:46.520 | And eventually, as I describe in the book,
00:59:48.040 | you could get to recreational genetics.
00:59:49.600 | You could say, "Well, I wanna turn some genes on
00:59:51.920 | "just for this weekend,
00:59:52.800 | "because I'm going to a high altitude,
00:59:54.720 | "so I'd like to prepare for that."
00:59:55.880 | And so instead of having to take weeks and weeks
00:59:57.640 | for acclimation, you could just do
00:59:59.560 | some quick epigenetic therapies,
01:00:00.880 | and have a good time in the mountains,
01:00:02.120 | and then come back and turn 'em back off.
01:00:03.600 | - So this is stuff to do on Earth,
01:00:05.960 | across thousands of humans,
01:00:07.960 | and then you start getting good data
01:00:09.480 | about what the effects on the human body are.
01:00:12.100 | How do we make humans survive across an entire lifetime,
01:00:16.100 | for, let's say, several decades in space?
01:00:19.000 | - If it's just in space, it'll be hard,
01:00:20.880 | 'cause you'll need, basically, some gravity at some point.
01:00:22.880 | I think you'd need orbital platforms
01:00:24.820 | that give you at least some partial gravity, if not 1G.
01:00:28.600 | If you're on Mars, it's actually,
01:00:30.440 | you know, even though the gravity's 38% of Earth's,
01:00:33.120 | just having that gravity would be enough.
01:00:34.520 | And if you could get under the surface,
01:00:36.160 | into some of the lava tubes,
01:00:37.700 | where you have some protection above you from the radiation,
01:00:40.480 | I think that would be,
01:00:42.120 | you probably could survive quite well there.
01:00:43.560 | So I think it's the, just in space parts, that's hard.
01:00:46.000 | You'd need some gravity.
01:00:47.380 | You need some additional protection from the radiation.
01:00:50.080 | - Can you linger on the lava tubes on Mars?
01:00:53.200 | What are the lava tubes on Mars?
01:00:54.920 | - Yeah, so they are just what they sound like.
01:00:57.640 | They were large masses of lava at one point on the planet,
01:01:01.120 | pushing really quickly through the environment.
01:01:04.000 | And they created these, basically, these small caverns,
01:01:07.040 | which you could go in, in theory, and build a small habitat
01:01:09.480 | and then puff it up, kind of like blowing up a balloon,
01:01:12.400 | and have a protective habitat.
01:01:13.860 | Basically, it's a little bit underground.
01:01:14.960 | So one of the next helicopter missions
01:01:17.320 | being planned at the Jet Propulsion Lab
01:01:18.880 | is to see if you can get a helicopter
01:01:19.920 | to go into a lava tube, which is just
01:01:22.600 | like it sounds: imagine taking out a big worm
01:01:25.000 | that has burrowed into the landscape
01:01:26.860 | and leaving behind the hollow column,
01:01:29.360 | and that's what the tubes look like.
01:01:30.500 | So one of the future helicopters might even go explore
01:01:33.560 | one of them, there's a mission being planned right now.
01:01:35.440 | - So they're accessible
01:01:36.960 | without a significant amount of drilling?
01:01:38.520 | - Yeah, that's the other advantage.
01:01:39.480 | Yeah, you can get to them, 'cause some of them are exposed.
01:01:41.640 | You could do a little bit of drilling
01:01:42.560 | and then see, essentially, this entire cavern.
01:01:44.720 | - And that protects you a little bit from the radiation.
01:01:46.840 | - Right, 'cause you have some soil above you, basically,
01:01:48.640 | which would be, or regolith, which would be nice.
01:01:50.680 | - What about source of food?
01:01:52.080 | What's a good, so that's part of biology,
01:01:56.120 | how you power this whole thing.
01:01:58.320 | What about source of food across decades?
01:02:01.260 | - In space, we'd have to-- - In space.
01:02:02.560 | - Plants have been grown in flight,
01:02:03.840 | and you can get some nutrients,
01:02:05.100 | but right now it is very reliant
01:02:07.520 | on all the upmass being sent up,
01:02:09.840 | all the freeze-dried food that then gets rehydrated,
01:02:13.280 | which doesn't taste awful, but is not self-reliant.
01:02:17.760 | So I think those would have to be small bioreactors.
01:02:19.960 | It'd have to be a lot of work on fermentation,
01:02:23.040 | a lot of work on, essentially, prototrophic organisms,
01:02:26.040 | the organisms that can make all of the 20 amino acids
01:02:28.800 | that you would need to eat.
01:02:30.200 | I describe a little bit in the book,
01:02:31.360 | what if we did a prototrophic human,
01:02:33.160 | where you could have, like right now,
01:02:34.680 | we need to get some of our amino acids,
01:02:37.000 | 'cause we can't make them all, which I think is kind of sad.
01:02:39.640 | So what if we could make all of our own amino acids
01:02:41.520 | or all of our own vitamins?
01:02:43.240 | I also, I think that's one case
01:02:45.160 | where another adaptation could be
01:02:47.040 | to activate the vitamin C gene.
01:02:49.440 | Like right now, you'd have to have limes
01:02:50.880 | or some other source of vitamin C in space,
01:02:53.160 | but we actually carry the gene inside of our genome
01:02:55.440 | to make vitamin C.
01:02:56.520 | Look at dogs and cats, for example.
01:02:58.000 | They have these kind of wet noses.
01:02:59.780 | You don't see them going out and getting margaritas,
01:03:02.440 | although dogs can drink beer and get drunk.
01:03:04.840 | They don't need vitamin C.
01:03:06.800 | They have no risk of scurvy,
01:03:08.200 | because they can make the vitamin C all by themselves.
01:03:10.880 | So can other wet-nosed primates, called strepsirrhines,
01:03:13.880 | but we are dry-nosed primates,
01:03:15.240 | and we lost this ability sometime
01:03:18.080 | 10 or 20 million years ago.
01:03:19.080 | We no longer make our own vitamin C,
01:03:20.480 | but the gene for it, it's called Gulo,
01:03:22.720 | is still in our DNA.
01:03:23.760 | It's what's called a pseudogene.
01:03:24.880 | It's just broken down.
01:03:26.260 | It's like having a, like in our genome,
01:03:27.920 | we have these functional genes,
01:03:28.760 | like a nice BMW, a nice car that works well,
01:03:30.760 | but we also have this wrecking yard,
01:03:32.240 | this junkyard of old cars, old genes, old functions
01:03:35.440 | in our DNA that we could bring back.
01:03:37.360 | And so vitamin C is one of them
01:03:38.360 | that would be very easy to do.
01:03:39.960 | So then you could activate the gene, repair it basically,
01:03:42.000 | repair it so we can make our own vitamin C.
01:03:44.640 | Now we'd have to do it again carefully,
01:03:45.920 | 'cause what if we lost vitamin C?
01:03:48.080 | The production of vitamin C as a species,
01:03:49.640 | what if it was a good reason that we lost it?
01:03:51.080 | Maybe it was helping in some other way
01:03:53.280 | that we can't see now,
01:03:54.560 | but you'd start slowly, do it in cells,
01:03:56.680 | then do it potentially in animal models,
01:03:59.200 | in other primates, and then try it in humans.
01:04:01.560 | But that's something else I'd like to see
01:04:03.160 | so we wouldn't have to make as much food in orbit.
01:04:04.840 | You could actually start to make
01:04:06.300 | as much of your own food in your own cells.
01:04:08.160 | - So the input to the system in terms of energy
01:04:10.600 | could be much more restricted.
01:04:11.800 | It doesn't have to have the diversity
01:04:13.200 | we currently need as humans.
01:04:14.520 | - But I don't wanna be a robot.
01:04:15.520 | Humans love, as I do, texture.
01:04:18.240 | I realize that made me sound like I wasn't human,
01:04:20.600 | but humans love food and flavors and textures and smells.
01:04:24.840 | All that is actually attenuated in flight.
01:04:27.080 | So you'd wanna not forget our humanity
01:04:30.080 | and this love of all the benefits and wonder of food
01:04:35.000 | and cooking and smells.
01:04:36.760 | - Well, speak for yourself, 'cause for me,
01:04:38.640 | I eat the same thing every single day
01:04:40.840 | and I find beauty in everything.
01:04:44.320 | And some beauty is more easily accessible outside of Earth.
01:04:49.160 | And food is not one of those things, I think.
01:04:51.600 | What about insects?
01:04:52.780 | The people bring that up, basically,
01:04:55.800 | food that has sex with itself and multiplies.
01:04:58.640 | So cockroaches and so on, they're a source
01:05:00.600 | of a lot of protein and a lot of the amino acids.
01:05:05.600 | - And bedbugs.
01:05:06.800 | There's a guy at the American Museum of Natural History
01:05:08.560 | in New York, he loves bedbugs, Lou Sorkin.
01:05:11.520 | And he has a monthly meeting where he talks about
01:05:14.160 | which insects would be the best for eating.
01:05:16.240 | And one month he gave a whole talk about bedbugs,
01:05:18.880 | that they're pretty gross, but in terms of the value
01:05:22.160 | of what you can get for protein, they're really good.
01:05:23.840 | So they're a good candidate.
01:05:26.360 | I think if you could deep fry 'em,
01:05:28.360 | if you deep fry anything, you can pretty much eat it.
01:05:30.040 | You'd need a fryer up in space,
01:05:31.520 | but they're a candidate.
01:05:33.000 | - All right, what, technical question,
01:05:36.360 | what are the major challenges of sex in space?
01:05:39.120 | Asking for a friend for reproduction purposes.
01:05:42.880 | So like when we're looking about survival
01:05:45.400 | of the human species across generations.
01:05:48.960 | Do we need gravity, essentially?
01:05:52.160 | - For sex in space, we know that gestation can happen
01:05:54.800 | in space where the babies can develop, at least in mice.
01:05:57.320 | We know that it's possible for worms to replicate in flight,
01:06:00.080 | so it's possible for other invertebrates
01:06:03.720 | to show they can make babies in space.
01:06:05.200 | But for humans, NASA's official stance on this
01:06:07.920 | is that there has never been sex in space, officially.
01:06:11.160 | I think, you know, if we all wonder about that,
01:06:16.160 | I think humans are very predictable in that regard.
01:06:19.880 | Again, going back to the Greek tragedies,
01:06:21.240 | I think that probably someone did something
01:06:23.520 | close to it at some point.
01:06:24.440 | And so I think we know that sperm can be sent into space
01:06:27.640 | and brought back and be used for fertilization,
01:06:29.320 | for in vitro fertilization for humans.
01:06:31.680 | But sex itself in space would be,
01:06:34.680 | I think when we start to get bigger structures
01:06:36.200 | that have a bit more privacy,
01:06:38.200 | I think there'll be a lot of it.
01:06:39.680 | And it has to be, you know, this is a big question
01:06:41.680 | of who goes up into space.
01:06:43.760 | It's now becoming more of regular, in quotes,
01:06:47.720 | people who have prosthetic limbs,
01:06:49.920 | or cancer survivors like Hayley Arceneaux
01:06:51.800 | who just went up on the Inspiration4 mission.
01:06:53.320 | So she's been a great researcher
01:06:55.400 | in helping with a lot of the science from that mission.
01:06:56.920 | We are doing the same analysis on them
01:06:58.420 | as we've been doing for the twin study
01:06:59.920 | and for other astronauts.
01:07:00.840 | We're doing basically all the same molecular profile
01:07:03.280 | before, during, and after space flight.
01:07:04.600 | So there, we now know that other people can go into space.
01:07:07.800 | As those more and more regular Joes and Janes go up,
01:07:10.960 | I think we'll see a lot more of it.
01:07:12.600 | But so far we have no data, we have no video of it either.
01:07:16.840 | We have no real knowledge other than it would be,
01:07:19.000 | it would need a lot of Velcro,
01:07:20.360 | I think is my only real answer there.
01:07:22.120 | - Well, I'm a fan of Velcro and duct tape.
01:07:25.200 | I think that's gonna be-- - I think that, yeah.
01:07:26.880 | - Those two are essential for anything,
01:07:29.360 | any kind of engineering out in anywhere, honestly,
01:07:32.400 | in all kinds of harsh conditions.
01:07:34.200 | But that is, I mean, on the topic of sex in general,
01:07:39.160 | just social interaction with humans is fascinating.
01:07:42.800 | The current missions are very focused on science
01:07:46.080 | and very technical engineering things.
01:07:48.320 | But there's still a human element that seeps in.
01:07:51.400 | And the more we travel out to space,
01:07:53.320 | the more the humans, the natural human drama,
01:07:56.960 | the love, the hate that emerges,
01:07:59.320 | it's all gonna be right there.
01:08:00.480 | - It's a Greek tragedy just in space, basically.
01:08:02.240 | I think it's gonna be--
01:08:03.200 | - Or a reality show.
01:08:04.360 | (laughing)
01:08:06.480 | So what about the colonization of other planets?
01:08:09.760 | If we look at Mars, when you,
01:08:12.300 | first of all, do you think it's a worthy effort
01:08:17.120 | looking at this particular one planet
01:08:19.380 | to put humans on Mars
01:08:21.500 | and to start thinking about colonizing Mars?
01:08:26.600 | - It's one of the closest options.
01:08:28.400 | It's not the best option, though, by far.
01:08:30.440 | We put in the book measures of Earth Similarity Index,
01:08:33.840 | or something called ESIs.
01:08:35.200 | How close is the gravity, the temperature,
01:08:37.780 | the solar incidence on the surface?
01:08:40.880 | How close is it to Earth is a calculation
01:08:43.160 | many astronomers make when they look for exoplanets.
01:08:45.600 | And Mars is pretty far down the scale, with an ESI of about 0.7.
01:08:50.400 | I mean, Earth is one, so the best you can get is one.
01:08:52.160 | Earth is just like Earth.
01:08:54.040 | It gets a score of one.
01:08:54.920 | Anything above, you know, some of the best exoplanets
01:08:57.680 | that are in the habitable zone,
01:08:58.840 | or where there could be liquid water,
01:09:00.760 | start to get above 0.8 or 0.9.
01:09:02.440 | But most planets are very low.
01:09:04.520 | They're 0.1, 0.2.
01:09:05.600 | They're either way too big,
01:09:07.120 | with crushing gravity, or way too small,
01:09:09.520 | too close to a sun.
01:09:11.280 | But Mars is, even though it's not that great
01:09:13.720 | on the ESI scale, it is still very,
01:09:16.040 | relatively close, you know, galactically.
01:09:18.720 | And Venus is just too hot right now.
01:09:20.780 | So I think Venus would also be a great candidate.
01:09:22.460 | But it's much easier to survive in a place
01:09:25.720 | where it's very cold, but you can be sealed and survive,
01:09:27.920 | whereas on Venus, we probably just have no technology
01:09:30.040 | to survive anywhere except in the clouds.
01:09:31.760 | So it's just currently our best option,
01:09:34.200 | but it's not the best option for sure.
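For a sense of how an index like that can be computed, here is a minimal sketch in the spirit of the commonly cited Earth Similarity Index formula; the property values and weights below are rough, illustrative numbers, not the exact figures from the book or from any catalog.

```python
# Minimal sketch of an Earth Similarity Index-style score: multiply, over a few
# properties, the terms (1 - |x - x_earth| / (x + x_earth)) ** (w / n).
# Values and weights are approximate and for illustration only.

from math import prod

# (planet value, Earth value, weight): radius, density, and escape velocity are
# Earth-normalized; surface temperature is in kelvin.
mars_properties = [
    (0.53, 1.0, 0.57),     # mean radius
    (0.71, 1.0, 1.07),     # bulk density
    (0.45, 1.0, 0.70),     # escape velocity
    (210.0, 288.0, 5.58),  # surface temperature (K)
]

def esi(properties):
    n = len(properties)
    return prod(
        (1 - abs(x - x_earth) / (x + x_earth)) ** (w / n)
        for x, x_earth, w in properties
    )

print(f"Mars  ESI ~ {esi(mars_properties):.2f}")  # roughly 0.65-0.7 with these inputs
print(f"Earth ESI = {esi([(e, e, w) for _, e, w in mars_properties]):.2f}")  # 1.00 by construction
```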
01:09:36.360 | - So over time, the ESI changes across millennia.
01:09:39.920 | - It does, yeah.
01:09:40.880 | - So Venus is gonna get cooler and cooler.
01:09:43.420 | Okay, but what are the big challenges to you
01:09:47.680 | in colonizing Mars, from a biology perspective,
01:09:50.920 | from a human perspective, from an engineering perspective?
01:09:54.200 | - There's several big challenges to Mars.
01:09:55.640 | And even the first one is even just the word colonize.
01:09:57.600 | So I think there's even a social challenge.
01:09:59.560 | Like a lot of people, Daniel Wood actually studies this
01:10:02.840 | at MIT, is we shouldn't even use the word colonize,
01:10:06.680 | but then we probably shouldn't use the word settle either,
01:10:08.720 | 'cause there's settlements that have some other baggage
01:10:11.000 | to that word as well.
01:10:11.880 | And then maybe we should use the word explore,
01:10:13.400 | but at some point you can say,
01:10:14.660 | we're going there to survive there.
01:10:16.240 | And so colonization still is the word most people use,
01:10:18.640 | but I try to say go explore and build or settle.
01:10:22.280 | But I think the first challenge is social.
01:10:24.680 | I think getting people to think that this will not be
01:10:27.040 | like the colonization efforts of the past.
01:10:29.200 | The hope is that this will be a very different version
01:10:31.760 | of humanity exploring.
01:10:32.720 | That's my hope.
01:10:33.920 | History, you could say, has proved me wrong
01:10:36.280 | every single time.
01:10:37.220 | Like every time humans have gone somewhere,
01:10:39.240 | it's usually been a tale of exploitation,
01:10:41.920 | strife, and drama again, and then,
01:10:45.260 | and often murder, genocide.
01:10:47.240 | Like it's actually a pretty dark history,
01:10:49.280 | if you think of just all the colonization efforts.
01:10:51.800 | But I think most of it was done
01:10:54.240 | in a really dark area of humanity,
01:10:55.920 | where the average life expectancy
01:10:57.800 | was less than half what it is today.
01:11:00.460 | Life was brutish and short,
01:11:02.780 | as Hobbes famously said.
01:11:04.640 | So it was a rough existence, right?
01:11:07.920 | So I think some of the ugliness of humanity
01:11:10.960 | in prior colonization times was a consequence of the time.
01:11:14.560 | And at least that's my hope.
01:11:15.480 | I think that now we would have it be much more,
01:11:18.440 | I think, inclusive, much more responsible,
01:11:21.080 | much more, much less evil, frankly.
01:11:23.600 | Like we'd go there and you would need commercialization.
01:11:26.120 | You need efforts to do mining, for example,
01:11:28.400 | bring things back, but it'd have to be some degree
01:11:30.760 | of there are some areas that are viewed as commons
01:11:33.720 | or that are untouchable, like places that are parks.
01:11:35.880 | We do this today, even if there's a lake, for example,
01:11:38.720 | the first, you know, several hundred feet of a lake
01:11:41.260 | are all for public property and everything.
01:11:42.720 | Like you can own property, but just not certain areas.
01:11:44.400 | So I think we'd have to make sure we do that
01:11:46.200 | so that it's not completely exploited.
01:11:48.120 | But the, so that's on the social, the human side.
01:11:50.560 | The technological, we've talked a little bit
01:11:52.040 | about where you'd have to live.
01:11:53.000 | You'd want to be underground with engineering
01:11:54.720 | and modifying even human cells to make sure you survive.
01:11:57.960 | The soil does have a lot of perchlorates,
01:11:59.640 | which is a problem for growing plants,
01:12:00.840 | but there's ways to extract them.
01:12:02.360 | There's a fair amount of water.
01:12:03.640 | There's actually this beautiful image
01:12:04.720 | of all the known water on Mars
01:12:06.520 | that NASA posted about a year ago.
01:12:08.600 | And there's water everywhere.
01:12:09.920 | Not lots of it everywhere, but almost everywhere you look,
01:12:12.840 | there's at least a little bit of water,
01:12:14.200 | just a few feet under the surface.
01:12:16.160 | And by the caps, there's a lot.
01:12:17.320 | So I think we could get some water
01:12:19.300 | and we could also do self-generating reactors,
01:12:23.140 | machines that could make food, start to even make beer
01:12:26.540 | if you go long enough down the path.
01:12:28.080 | But the technical challenges are definitely,
01:12:30.820 | the engineering and the manufacturing are gonna be hard
01:12:34.260 | because you have to build the buildings
01:12:35.920 | basically out of the soil that's there.
01:12:37.360 | So you have to really go there
01:12:39.300 | and try and build with whatever you can.
01:12:40.460 | So that has to be perfected still.
01:12:42.200 | But then once you're in those buildings, those structures,
01:12:44.300 | you need to create all the biology
01:12:45.880 | that will feed the populace, feed them.
01:12:47.740 | So, which we don't have the technology for yet.
01:12:50.500 | We have bits of it,
01:12:51.340 | but I think that's gonna be the biggest challenge
01:12:52.500 | is making Mars really truly independent.
01:12:55.020 | But that'll probably take, as I say in the book,
01:12:56.820 | several hundred years before I think we'd get there.
01:13:00.040 | - It's interesting 'cause we're also exploring
01:13:02.660 | ways to motivate society to take on this challenge.
01:13:08.560 | It's the JFK thing and then the Cold War
01:13:12.960 | that inspired the race to space.
01:13:17.020 | And I think as a human species,
01:13:18.780 | we're actually trying to figure out different ideas
01:13:22.060 | for how to motivate everybody
01:13:23.620 | to work on the same project together.
01:13:25.540 | - But yet compete at the same time.
01:13:27.980 | - Well, that's one idea and that's worked well.
01:13:30.300 | - Competition. - Competition.
01:13:31.580 | It's not necessarily the only idea,
01:13:33.840 | but it's the one that worked well so far.
01:13:35.660 | So maybe the only way to truly build a colony on Mars
01:13:40.660 | or a successful sort of human civilization on Mars
01:13:44.700 | is to get China to get competitive about it.
01:13:49.700 | - And they are.
01:13:50.620 | They've announced they wanna have boots
01:13:51.860 | on the Red Planet by 2033,
01:13:53.540 | which is two to four years earlier
01:13:55.540 | than when NASA's supposed to do it.
01:13:57.080 | So we'll see if they get there first.
01:13:59.140 | But I think it's a space race 2.0,
01:14:00.860 | but it's not just the US and Russia this time.
01:14:03.180 | It's China, it's India, it's the UAE, it's Europe,
01:14:05.900 | it's JAXA, the Japanese Space Agency,
01:14:09.140 | and there's the US.
01:14:09.980 | So now it went from just a two-person race
01:14:13.580 | to a whole field, a whole field of runners,
01:14:18.300 | if you will, on the track trying to get to Mars first.
01:14:20.220 | And I think, I mean, this can be like anything.
01:14:22.740 | If you start to have settlements and construction projects
01:14:25.900 | and places to visit on Mars,
01:14:27.740 | I think that the true mark of a place
01:14:30.380 | being actually settled is when you start to be able to pick.
01:14:33.220 | You're like, well, I wanna go to this destination,
01:14:35.380 | not this one, because they have better Martian cocktails
01:14:37.340 | here, but this one's not as good.
01:14:38.560 | So this idea of innovating and competing
01:14:42.140 | will continue to drive, I think, humans as it always has.
01:14:45.820 | - You write this fascinating thing, which is, quote,
01:14:48.820 | "People living on Mars will have developed
01:14:51.660 | "entirely new cultures, dialects, products,
01:14:53.920 | "and even new religions or variations of current religions.
01:14:57.300 | "For example, a Martian Muslim will need to pray upward
01:15:01.380 | "toward the dusty sky."
01:15:02.820 | I love that you've thought through the geometry of this.
01:15:06.500 | "For example, a Martian Muslim will need to pray upward
01:15:09.500 | "toward the dusty sky since Earth,
01:15:11.560 | "and therefore Mecca, will sometimes be overhead.
01:15:14.900 | "Or when Mecca's below the Martians' feet,
01:15:18.260 | "the prayer direction to Allah will stay downward
01:15:22.560 | "toward the 38% gravity floor.
01:15:25.580 | "Perhaps a second Mecca will be built on the new planet."
01:15:29.640 | End quote.
01:15:30.480 | That's another interesting question.
01:15:34.480 | How will culture be different on Mars
01:15:37.220 | in the early days and beyond?
01:15:38.900 | - Yeah, it'll be, as we've seen with all of human history,
01:15:42.960 | I think even just when people migrate and they move,
01:15:46.040 | even the dialects change.
01:15:47.440 | Even just going to the South in the United States,
01:15:49.560 | there's the, "Oh, y'all, come on down."
01:15:52.240 | And that's not even that far away.
01:15:53.760 | Or even just people on Long Island versus New York City,
01:15:56.240 | and it'll be with a big nasally accent,
01:15:58.120 | and oh yeah, and the people will just get,
01:16:00.000 | or even Wisconsin, I'm from Wisconsin,
01:16:01.840 | where there'll be this big nasally tone,
01:16:03.280 | "Welcome to Wisconsin and Minnesota."
01:16:06.400 | - I wonder who defines that culture,
01:16:08.040 | because it's very likely that the early humans on Mars
01:16:12.680 | will be very technically savvy.
01:16:15.000 | They have to be for engineering challenges.
01:16:17.020 | Well, actually, I don't know.
01:16:18.220 | It could be, and this has to do with your extreme microbiome project,
01:16:22.060 | like, is it going to be the extreme survivalists,
01:16:26.820 | or is it going to be the engineers and scientists,
01:16:29.740 | or is it gonna be both?
01:16:31.300 | Because my experience of scientists,
01:16:33.100 | they like the comfort of the lab.
01:16:37.460 | - Yes, yeah. - They don't.
01:16:38.700 | Well, no, there's some,
01:16:39.540 | I keep contradicting myself nonstop.
01:16:41.420 | There's some bad-ass scientists that travel to Antarctica
01:16:43.980 | and all that kind of stuff, so.
01:16:45.620 | - It's an evolutionary selection for humans
01:16:47.780 | who can stare at a screen for eight hours at a time,
01:16:49.660 | or pipette for 12 hours at a time,
01:16:51.380 | and not talk to anybody.
01:16:52.820 | So it's not surprising when our scientists
01:16:54.740 | are a little bit awkward in social situations.
01:16:57.360 | But we can train them out of that.
01:16:59.500 | We can get them to engage other humans,
01:17:01.580 | not all of them, but hopefully most of them.
01:17:03.780 | So I think the culture will definitely be different.
01:17:07.220 | There'll be different dialects, different foods.
01:17:09.180 | There'll be different values.
01:17:10.980 | There very likely will be a different religion.
01:17:13.240 | Kim Stanley Robinson wrote a lot about this in his books,
01:17:15.860 | the new Martian religion that was created.
01:17:17.660 | So I think this idea has been discussed in science fiction.
01:17:20.820 | It's almost unavoidable, because,
01:17:23.460 | I mean, just think of all the religions
01:17:24.960 | that have happened on Earth with very little
01:17:29.960 | beyond terrestrial drama,
01:17:33.460 | but suddenly you have a different planet,
01:17:35.540 | and you then need a deity that would span multiple planets,
01:17:39.220 | and I don't even know how you do that.
01:17:40.140 | But I think someone will think of a way
01:17:42.060 | and make up something.
01:17:43.940 | - Yeah, people look for ways to draw meaning.
01:17:48.640 | So religion, for a lot of people,
01:17:51.460 | myths, common ideas are a source of meaning.
01:17:55.180 | And when you're on another planet,
01:17:56.880 | boy, does the sense of what is meaningful change.
01:18:03.180 | 'Cause it's humbling.
01:18:05.900 | The harshness of the conditions is humbling.
01:18:08.460 | The very practical fact that Earth,
01:18:12.300 | from which you came, is not so special,
01:18:14.740 | 'cause you're clearly not on Earth currently,
01:18:16.860 | and you're doing fine, and you made it.
01:18:19.620 | - At some point, I mean, it'll be pretty harsh,
01:18:21.140 | like what Shackleton did doing this exploration
01:18:24.820 | of Antarctica and going, it was a very dangerous mission,
01:18:27.860 | barely made it, people died.
01:18:29.340 | Actually, he didn't believe in scurvy at the time,
01:18:31.580 | so he didn't take enough vitamin C,
01:18:32.900 | and some of his people died from not having vitamin C.
01:18:35.020 | So if we had had that gene active,
01:18:36.940 | the vitamin C pseudogene, they'd have been okay.
01:18:38.100 | But there, I think, the early settlers,
01:18:41.540 | it'll be a very different crew.
01:18:42.700 | But once it's comfort, once people are comfortable there,
01:18:44.580 | I think they're gonna, I hope they'll draw more meaning.
01:18:47.380 | 'Cause more planets should be more meaning.
01:18:49.220 | I feel like it's like more hands is a better massage.
01:18:53.540 | I don't know if that's the best analogy here, but--
01:18:55.620 | - I think Aristotle said that, yeah.
01:18:57.700 | (laughing)
01:18:59.740 | I should mention that your book has incredible quotes.
01:19:02.580 | It's great writing, but also just incredible quotes
01:19:04.700 | at the beginning of chapters that are really--
01:19:06.460 | - Thanks, it's basically my favorite quotes.
01:19:08.140 | I'm like, well, I'm writing a book,
01:19:08.980 | I'm gonna put my favorite quotes in there.
01:19:10.420 | - Might as well put 'em all down.
01:19:12.580 | What are your thoughts about the efforts
01:19:15.420 | of Elon Musk and SpaceX in pushing this commercial
01:19:19.420 | spaceflight, and I mean, other companies,
01:19:21.620 | Axiom Space as well?
01:19:23.380 | What are your thoughts on their efforts?
01:19:26.740 | - It's like a gold rush.
01:19:27.980 | Space Race 2.0, there's a lot of terms for it,
01:19:30.620 | the new Space Race, I think it's fabulous.
01:19:32.820 | I think it's moving at a pace that is unprecedented,
01:19:37.100 | and also there's a lot of investment from the commercial
01:19:39.340 | and private sector pushing it forward.
01:19:40.700 | So Elon, most notoriously, doing a lot of it
01:19:42.540 | just himself with SpaceX.
01:19:43.580 | So we've worked really closely with the SpaceX ops teams
01:19:46.660 | and medical team, planning the Inspiration4 mission,
01:19:49.660 | and now some of the Polaris missions which are happening.
01:19:51.780 | And Jared Isaacman has been a fabulous colleague,
01:19:55.580 | collaborator, pilot for the missions.
01:19:58.180 | Again, we're doing the same deep profiling
01:20:00.900 | and molecular characterization of these astronauts
01:20:02.980 | as we've done for Scott Kelly and other astronauts
01:20:05.620 | that are from NASA.
01:20:07.180 | And we're seeing so far, actually,
01:20:08.780 | there'll be a lot of this presented later this year,
01:20:10.960 | it seems like it's pretty safe.
01:20:13.340 | Again, there's dangers, we can see real stress on the body,
01:20:15.700 | very obvious changes, some of the same changes
01:20:18.220 | that Scott Kelly experienced.
01:20:19.660 | But for the most part, they return back to normal,
01:20:21.100 | even for a short three-day mission.
01:20:23.080 | I remember chatting with Jared,
01:20:24.780 | and we were presenting the data to them
01:20:26.100 | actually just a few weeks ago,
01:20:27.100 | kind of a briefing to the crew,
01:20:29.180 | and 'cause they went to 590 kilometers,
01:20:30.980 | they went basically several hundred kilometers higher
01:20:32.620 | than the space station or the Hubble.
01:20:34.540 | You normally receive more radiation,
01:20:36.620 | the farther you get from Earth, there's more radiation.
01:20:38.300 | He was worried, you know, did we get cooked?
01:20:40.460 | It was kind of his question for me in the briefing.
01:20:42.020 | I said, well, actually, it looks like
01:20:43.260 | you can go back into the microwave.
01:20:44.660 | You didn't get fully cooked, you can go a little bit farther.
01:20:46.500 | So for the Polaris mission, they're gonna go even farther.
01:20:49.940 | And then also, open the hatch and go on these new spacesuits
01:20:52.780 | that SpaceX are designing that'll be much nimbler,
01:20:55.300 | not as much of a giant, you know,
01:20:57.660 | Dr. Octagon kind of a spacesuit,
01:21:00.420 | but really looks like just a nice spacesuit,
01:21:03.580 | and they're gonna go out into the vacuum of space.
01:21:05.560 | And so, you know, pushing all the engineering
01:21:07.920 | for these missions, which are privately funded,
01:21:09.700 | so it's people who just say, I wanna go up in space
01:21:12.140 | and see if I can push the limits, has been fabulous,
01:21:14.560 | but I think the most fabulous part is Jared in particular,
01:21:17.080 | but others, other commercial spaceflight drivers
01:21:19.700 | like John Shoffner or Peggy Whitson for the Axiom missions
01:21:22.660 | are coming to us, the scientists, researchers, saying,
01:21:25.180 | I don't just wanna go up into space just to hang out.
01:21:27.500 | How much science can I get done when I'm up there?
01:21:29.740 | What can I do, what experiments can I do?
01:21:31.660 | Give me, you know, blood, tissue, urine, semen, tears,
01:21:35.500 | I'll give you any biofluid, you know,
01:21:37.780 | and I always email them back and say,
01:21:39.580 | listen, every one of your cells is worthy of study.
01:21:42.540 | I send, you know, so I have this really kind of creepy
01:21:45.060 | geneticist email response, like I want all of your cells,
01:21:47.500 | you know, but it's true, because there's so much
01:21:49.180 | we don't know, I wanna learn as much as we can
01:21:51.540 | about every time I go up, anyone.
01:21:53.440 | So we're doing it, you know, with NASA astronauts,
01:21:54.820 | but it's been some of this influx of new crews
01:21:58.640 | that are willing to do almost anything, right?
01:22:00.740 | So including, we did skin biopsies
01:22:02.620 | for the Inspiration4 crew before and after spaceflight,
01:22:05.340 | and that's never been done before.
01:22:06.460 | We've never seen the structure of the skin
01:22:07.900 | and how it changes in response to microgravity,
01:22:10.340 | and also the microbes that change.
01:22:12.180 | And so we have these beautiful images
01:22:13.580 | of even the structure of skin changing,
01:22:15.460 | and the inflammation that we've seen,
01:22:16.740 | and like for Scott Kelly, for example,
01:22:18.680 | we now have a molecule by molecule map
01:22:20.760 | of what happens to skin, which has never been done before.
01:22:23.540 | - What are the interesting surprises there?
01:22:25.700 | - So one of the interesting things we can see,
01:22:27.740 | part of what's driving inflammation
01:22:28.860 | is we can actually see macrophages
01:22:30.740 | and there's other dendritic cells,
01:22:32.020 | pieces, like cells that are part of the immune system
01:22:34.380 | kind of creeping along towards the surface of the skin,
01:22:36.520 | which is, now we know it's actually
01:22:38.060 | physically driving the immune system,
01:22:39.300 | is these cells going and creating this inflammation,
01:22:41.860 | which is what leads to some of the rashes.
01:22:43.860 | But we didn't see as much in them as we saw, for example,
01:22:46.340 | some of the signatures of Scott Kelly.
01:22:47.820 | So we can see within the crew who's getting
01:22:50.340 | more of a rash or not, or who didn't experience any rash.
01:22:53.640 | And some people had changes in vision,
01:22:55.920 | some people had other GI problems,
01:22:59.080 | even looking at sort of what happens to the gut
01:23:01.040 | and looking at the microbiome of the gut,
01:23:02.640 | other people didn't.
01:23:03.480 | So we're able to see,
01:23:04.840 | and start to get a little bit predictive
01:23:06.000 | about their medicine.
01:23:06.840 | Right now we're just diagnosing,
01:23:07.680 | but it'd be good to say, if you're going into space,
01:23:10.240 | here's exactly what you need for each bacteria in your body,
01:23:13.280 | here's what you could maybe take to get rid of nausea,
01:23:15.200 | or other ways we could monitor you
01:23:16.240 | to keep the inflammation down.
01:23:17.880 | - What does it take to prepare for one of these missions?
01:23:21.040 | 'Cause you mentioned some of the folks
01:23:22.800 | are not necessarily lifelong astronauts.
01:23:25.440 | You're talking about more and more regular civilians.
01:23:27.720 | What does it take physiologically and psychologically
01:23:30.760 | to prepare for these?
01:23:31.640 | - They have to do a lot of the same training
01:23:33.000 | that most astronauts do.
01:23:34.120 | So a lot of it's in Hawthorne at SpaceX headquarters,
01:23:36.920 | which if you can ever get a chance to do a tour,
01:23:38.560 | it's fabulous.
01:23:39.400 | It's really, you can see all these giant rockets
01:23:41.160 | being built, and then we're drawing blood
01:23:42.240 | over there right next to them.
01:23:43.080 | So it's a really cool place.
01:23:44.280 | But the training, they have to go through a lot of the ops,
01:23:46.640 | a lot of the programming, just in case.
01:23:48.680 | Most of the systems are automated on the Dragon,
01:23:50.920 | and other spacecraft, but just in case.
01:23:53.440 | So they have to go through the majority of the training.
01:23:55.960 | If you wanna go to the space station,
01:23:57.440 | as the Axiom missions are, including John Shoffner,
01:24:00.120 | you have to do training for some of the Russian modules.
01:24:02.920 | And if you don't do that training,
01:24:04.160 | then you're not allowed to go to the Russian part
01:24:05.960 | of the space station, apparently.
01:24:07.000 | So right now, John Shoffner, for example,
01:24:09.120 | unless he completes this additional training,
01:24:10.680 | all in Russian, he's not allowed--
01:24:12.480 | - All in Russian?
01:24:13.320 | - Otherwise he has to learn enough Russian
01:24:14.280 | to be just functional. - Wow.
01:24:16.160 | So it's not just technical, he also has to--
01:24:18.400 | - Enough, enough Russian. - Enough, enough Russian.
01:24:20.760 | And so, and if he doesn't learn,
01:24:23.160 | he can't go to that part of the space station.
01:24:24.520 | So interesting things like that.
01:24:25.680 | But you'll be, you know, it's not that far.
01:24:27.400 | You're like, oh, I can see it right there.
01:24:28.600 | I can't float over to that capsule.
01:24:31.720 | But technically he can't go, so you know.
01:24:33.640 | - Is there a Chinese component to the
01:24:36.920 | International Space Station?
01:24:38.040 | Is there a collaboration there?
01:24:39.200 | - Sadly not, they're building their own space station.
01:24:42.120 | I'm glad they're building a space station.
01:24:43.320 | Actually, eventually there'll be probably
01:24:44.280 | four space stations in orbit by 2028.
01:24:47.120 | Some from Orbital Reef, some from Lockheed Martin.
01:24:49.840 | Of course, Axiom is far ahead right now.
01:24:52.240 | They're probably gonna be done first.
01:24:54.040 | But the extraordinary thing is,
01:24:56.820 | unfortunately there's no collaboration between the--
01:25:00.880 | - You see that as a negative,
01:25:02.320 | that's not the positive kind of competition.
01:25:04.160 | - It's a good question.
01:25:05.360 | So maybe, for example, when we get different NASA grants,
01:25:09.080 | you apply for a grant, you get to the lab,
01:25:11.200 | it goes through Cornell, the grants office.
01:25:13.640 | I have to sign, as a scientist, as the PI on the mission,
01:25:16.660 | say, I promise I will move no funds or resources
01:25:19.520 | or any staff to anyone in China or work with anyone in China
01:25:23.040 | with these dollars that you're giving
01:25:24.400 | to the lab for this mission.
01:25:26.040 | And so every other grant I get from the NASA, DOD,
01:25:29.080 | or sorry, let me go back to that.
01:25:30.880 | Every other grant I get from, say, the NIH or the NSF,
01:25:34.640 | even sometimes DOD, you don't have to promise
01:25:37.100 | that you won't talk to anybody in China about it.
01:25:39.260 | But for NASA alone, it's congressionally mandated.
01:25:41.960 | You have to promise and sign all that paperwork
01:25:43.560 | saying I can't do anything with anyone in China about this.
01:25:46.560 | And what I view as sad about that is I wanna at least
01:25:48.760 | be able to chat with them about it
01:25:50.040 | and know what they're up to,
01:25:50.860 | but we can't even go to a conference in China,
01:25:54.000 | technically with NASA funds, about, say, space medicine
01:25:56.440 | or engineering a new rocket.
01:25:58.120 | I can go with personal funds, but I can't use those funds.
01:26:01.880 | - Like, you should be able to go to a conference
01:26:03.600 | in a friendly way, talk shit to the other scientists.
01:26:06.200 | Like, the way scientists do really well,
01:26:09.000 | which is like they compliment,
01:26:10.460 | but it's a backhanded compliment,
01:26:11.840 | like, you're doing a really good job here,
01:26:14.640 | and then you kind of imply
01:26:15.700 | that you're doing a much better job.
01:26:16.760 | That's the core of competition.
01:26:18.880 | You get jealous and then everybody's trying to improve,
01:26:21.440 | but then you're ultimately talking,
01:26:22.760 | you're ultimately collaborating closely,
01:26:24.600 | you're competing closely as opposed to in your own silos.
01:26:28.300 | Well, let me ask, in terms of preparing for space flight,
01:26:34.440 | I tweeted about this and I joked about it,
01:26:43.080 | and I talk to Elon quite a lot these days.
01:26:46.400 | What I tweeted was,
01:26:47.960 | I'd like to do a podcast in space one day.
01:26:50.700 | And it was a silly thing, 'cause I was thinking,
01:26:55.600 | for some reason in my mind,
01:26:56.680 | I was thinking 10, 20 years from now,
01:26:58.720 | and then I realized, like, wait, why not now?
01:27:02.720 | There's no, just even seeing what Axiom is doing,
01:27:07.440 | what Inspiration4 is doing,
01:27:08.680 | it's like regular civilians could start going up.
01:27:12.880 | So let me ask you this question.
01:27:14.320 | When do you think, we saw Jeff Bezos go out into orbit,
01:27:17.960 | when do you think Elon goes up to space?
01:27:20.640 | So his thinking about this is that it's partially about being responsible
01:27:25.640 | and waiting until it's safe, because he has such direct engineering
01:27:30.920 | roles in the running of multiple companies.
01:27:34.680 | So at which point do you think,
01:27:36.700 | what's your prediction for the year that Elon will go up?
01:27:40.000 | - I think he'd probably go up by 2026, I would say,
01:27:45.000 | because the number of missions planned,
01:27:47.760 | there'll be several missions per year
01:27:50.480 | through multiple space agencies and companies
01:27:53.280 | that are really making low-Earth orbit very routine.
01:27:57.200 | And by go up, I think it might also,
01:27:58.840 | for example, the Inspiration4 mission
01:28:00.600 | just went up for three days in flight,
01:28:02.000 | and there was enough time to get up there,
01:28:03.840 | do some experiments, enjoy the view, and then he came back.
01:28:07.160 | The Axiom missions are a bit more complicated,
01:28:08.840 | there's docking up in the space station,
01:28:10.240 | it's a shared atmosphere,
01:28:11.840 | so you have to follow all the ISS protocols.
01:28:14.040 | What's interesting about the Dragon capsule
01:28:16.120 | and the Inspiration4 and some of these
01:28:17.960 | what are called free-flyer missions,
01:28:19.780 | you can just launch into space,
01:28:20.620 | you basically have your own little mini space station
01:28:22.400 | for a few days, it's not that big, right?
01:28:24.200 | But I think that's what we'd probably see him do first,
01:28:26.560 | because we're gonna see a lot more tests of those
01:28:29.560 | in the next two, three years,
01:28:31.640 | but they're already been demonstrated to be safe,
01:28:34.360 | and then you're not trying to go for 10 to 20 days
01:28:36.440 | or months or years at a time.
01:28:38.520 | You're just up in space for a few days,
01:28:40.080 | but you're in proper space, it's an orbital flight,
01:28:42.120 | it's not just a suborbital flight.
01:28:44.560 | You could do a podcast from there, I think.
01:28:46.120 | - 2026, I wonder how the audio works.
01:28:49.040 | See, also, can you comment on 2026?
01:28:51.800 | I'll start getting ready.
01:28:53.640 | I'll start pushing him on this, I'm quite serious,
01:28:55.960 | it's a fascinating kind of--
01:28:57.160 | - Axiom 2 still has room,
01:28:58.320 | you could go on that mission if you wanted to.
01:29:00.040 | - So I'll ask you about Axiom.
01:29:01.760 | How strict are these?
01:29:06.400 | So this seems surreal, that civilians are traveling up.
01:29:11.200 | So how much bureaucracy is there still,
01:29:14.160 | in your experience, for the scientific,
01:29:16.400 | I mean, I know it's a difficult question
01:29:17.760 | to ask a scientist, 'cause you get to,
01:29:20.480 | you don't wanna complain too much,
01:29:23.080 | but how much, there's sometimes bureaucracy
01:29:26.120 | with NSF and DOD and the funding
01:29:28.200 | and all those kinds of things that kind of prevent you
01:29:32.200 | from being as free as you might sometimes like
01:29:35.520 | to do all kinds of wild experiments and crazy experiments.
01:29:38.560 | Now, the benefit of that is that you don't do
01:29:42.120 | any wild and crazy experiments that hurt people.
01:29:45.320 | And so it's very important to put safety first,
01:29:48.360 | but it's like a dance, a little too much restrictions
01:29:51.560 | of bureaucracy can hamper the flourishing of science,
01:29:55.240 | a little too little of that can get some crazy scientists
01:29:59.240 | to start doing unethical experiments.
01:30:00.840 | Okay, that said, NASA and just space flight in general
01:30:04.880 | is sort of famously very risk-averse.
01:30:09.880 | So what's your sense currently about like,
01:30:14.400 | even like doing a podcast, right?
01:30:17.320 | - Podcast, unless it's, I think with mixed martial arts
01:30:22.240 | is a pretty safe activity, unless you're doing
01:30:25.120 | the octagon version of your podcast.
01:30:27.360 | I mean, just getting there and back
01:30:28.400 | is the only real risky part, which is still risky.
01:30:31.720 | But I think, you're not asking to do open heart surgery
01:30:34.840 | in space, you're just saying, what if I do a podcast?
01:30:37.080 | And I think-- - Well, fun.
01:30:38.680 | You're trying to ask to have fun.
01:30:40.840 | And I feel like fun sounds dangerous, any kind of fun.
01:30:44.400 | - That's what's been extraordinary,
01:30:46.360 | is that traditionally, yes, I think most of the space
01:30:49.560 | agencies have been very, by definition, bureaucratic
01:30:52.080 | because they're coming from the government,
01:30:54.360 | but they've been that way for a really good reason,
01:30:56.360 | is that safety, in the early '60s, we knew almost nothing
01:30:59.920 | about the body in space, except for some of the work
01:31:02.800 | that pilots had done at really high altitudes.
01:31:04.360 | So we really didn't know what at all to expect.
01:31:05.880 | So it's good that there were decades of resolute focus
01:31:09.520 | on just safety, but now we know it's pretty safe.
01:31:12.680 | We know the physiological responses, we know what to expect.
01:31:15.080 | We can also treat some of it.
01:31:16.640 | We'll hopefully soon will treat a lot more of it.
01:31:19.160 | But if you just wanna go up there, it's actually,
01:31:21.640 | now it's just a question of cost.
01:31:23.040 | Like imagine, I think the way you can view a lot
01:31:25.760 | of the commercial spaceflight companies is that
01:31:28.240 | if you have the funds, you can basically plan the mission.
01:31:31.040 | All the training they'll do is to help you get prepared
01:31:33.120 | for how you run some of the instrumentation,
01:31:35.320 | how you can fly the rocket to a limited degree,
01:31:37.880 | and how to use some of the equipment.
01:31:39.560 | But fundamentally, it's no longer a question of years
01:31:43.240 | and years of training and selection,
01:31:44.640 | this impossible odds task of becoming an astronaut.
01:31:47.720 | It's frankly just a question of funds.
01:31:50.080 | - Expensive plane ride.
01:31:51.320 | So how much, you mentioned Axiom, is it known how much
01:31:56.200 | it costs for the plane ride?
01:31:58.720 | - There is no official number, and it depends
01:32:01.560 | on the mission, of course.
01:32:02.640 | So if you ask them, often they'll say,
01:32:05.080 | "Well, how serious are you?"
01:32:06.120 | They don't just wanna give out random numbers to people.
01:32:09.120 | But the numbers, 'cause for example,
01:32:11.120 | we proposed one mission, we want to do a new twin study
01:32:13.960 | where someone goes up and stays up there for 500, 550 days.
01:32:17.160 | But you're basically gonna be up there
01:32:18.000 | for the longest time ever, to simulate the time it would take
01:32:20.400 | to get to Mars and back for the shortest possible duration,
01:32:22.880 | about 550 days.
01:32:24.960 | 'Cause if you went there and immediately turned around,
01:32:26.640 | you could maybe make that mission.
01:32:27.680 | Otherwise, it's a three-year mission.
01:32:30.560 | And there, you're looking at the ranges of,
01:32:33.760 | it's 50 to $100 million in that ballpark range.
01:32:36.960 | But the reason it's so variable is it depends.
01:32:39.480 | What are you doing up there?
01:32:40.440 | If you're up there, for example, for two years, basically,
01:32:44.080 | almost two years, that's a long time
01:32:46.640 | to just be in one spot, right?
01:32:47.480 | So could you be doing some things where your time
01:32:50.880 | is valuable, so you can do experiments,
01:32:52.160 | and people pay for those, and that defrays the cost.
01:32:54.440 | Or you could build something, or you could do podcasts,
01:32:57.360 | and maybe fundraise on the podcast.
01:33:00.160 | As long as you, the reason the cost is variable
01:33:02.320 | is because it depends, well, do you have all the money?
01:33:04.680 | And you say, I wanna go and just sit in space
01:33:06.320 | for two years and do nothing.
01:33:07.440 | Well, then you have to pay for all that time
01:33:08.600 | that you're up there, if you wanna do things.
01:33:10.200 | - Yeah, I see the official AX-1 mission
01:33:13.600 | was 55 million for a trip to the ISS.
01:33:17.200 | - It's not that bad.
01:33:19.680 | It could be worse.
01:33:21.400 | - Wait, Sergey just posted a $35,000 price tag per night,
01:33:26.260 | per person on the ISS.
01:33:27.760 | Is that real?
01:33:28.960 | I don't know.
01:33:29.800 | - That's what, right, that's why--
01:33:30.760 | - That's like a real hotel stay.
01:33:32.160 | So to stay, oh, so interesting.
01:33:35.080 | And then I'm sure there's costs with the docking
01:33:38.440 | and all those kinds of things.
01:33:39.640 | That's from the perspective of Axiom,
01:33:41.400 | the private company, or SpaceX, or whoever is paying the cost
01:33:46.400 | in the short term and in the long term.
01:33:51.160 | - Yeah, and the thing about a lot of that cost
01:33:53.480 | is rocket fuel, a lot of it is the ride.
01:33:56.120 | So I've been on calls where Axiom's like,
01:33:58.080 | hey, SpaceX, give us, make it a little cheaper.
01:34:00.240 | We can make it cheaper on our end.
01:34:01.080 | It's the cost, that is the rocket.
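For a rough sense of how those numbers stack up, here is a minimal, illustrative sketch in Python using the figures tossed around above (a ride quoted around $55 million and roughly $35,000 per person per night on the station). Whether the ride price is per seat or for the whole mission isn't pinned down in the conversation, so both constants are placeholders rather than official pricing.

```python
# Illustrative only: rough cost arithmetic from the figures mentioned in the conversation.
RIDE_COST_USD = 55_000_000   # quoted ballpark for an AX-1 style trip (placeholder)
ISS_NIGHTLY_USD = 35_000     # quoted per-person, per-night station fee (placeholder)

def mission_cost(crew: int, nights: int) -> int:
    """Ride plus station stay for the whole crew, ignoring training, docking fees, etc."""
    return RIDE_COST_USD + crew * nights * ISS_NIGHTLY_USD

print(f"4 people, 10 nights: ${mission_cost(4, 10):,}")   # -> $56,400,000
```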
01:34:03.560 | - So SpaceX is giving Axiom a ride in this case.
01:34:07.040 | What is Axiom Space?
01:34:08.320 | Can you speak to this particular private company?
01:34:11.160 | What's their mission, what's their goal?
01:34:12.840 | And what is the Axiom-1 mission that just went up?
01:34:15.240 | - Yeah, so the Axiom Space is a private spaceflight company
01:34:20.240 | that's building the first private space station.
01:34:23.960 | We've actually seen the videos and footage
01:34:25.800 | and hardware being put together,
01:34:27.240 | so they're in the process of constructing it.
01:34:29.720 | The hope is that by 2024, one of the first modules
01:34:33.560 | will be up and connected to the ISS.
01:34:35.880 | Eventually it'll be expanded, and then by 2028,
01:34:38.160 | the plan is it'll be completely detached and free-floating.
01:34:42.040 | And it will be, maybe even a little bit sooner,
01:34:44.020 | depending on how fast it goes,
01:34:45.060 | but they're building the world's first private space station.
01:34:48.180 | So if you wanna have a wedding up there,
01:34:50.240 | you just have to multiply the number of guests
01:34:51.720 | times the number of nights,
01:34:52.560 | and you could have a wedding up there.
01:34:53.520 | It'd be very expensive, but if you wanna do it, you can do it.
01:34:55.840 | It's like, you can have a lab up there.
01:34:57.640 | If you wanna do experiments, you can do experiments.
01:34:59.280 | You just figure out the cost.
01:35:00.120 | If you wanna have a beer up there,
01:35:02.240 | you can make your own, brew your own beer.
01:35:03.520 | And so this is the first beer made in space.
01:35:05.200 | For some reason, you wanna do it, you can pay for it.
01:35:06.920 | So it's opened up this space
01:35:09.320 | where if you can find the funds for it, you propose it.
01:35:12.080 | You can probably just do it.
01:35:14.120 | - Okay, cool.
01:35:15.600 | So what is the Axiom-1 mission that just went up?
01:35:18.520 | Can you tell me what happened?
01:35:19.840 | - Axiom-1 is the first private,
01:35:22.400 | the first commercial crew to go to the space station.
01:35:25.360 | So Inspiration4 was the first commercial private crew
01:35:28.640 | to just go into space.
01:35:29.480 | They went into space and actually did an orbital mission
01:35:31.840 | for just about three days.
01:35:33.480 | But Axiom-1 is the first, again, on the SpaceX rockets,
01:35:36.600 | but launched up, docked to the space station,
01:35:38.720 | and they're up there for about 10 days to do experiments,
01:35:41.480 | to work with staff, actually just take some pictures.
01:35:44.600 | But it's a mission, actually doing a lot of experiments.
01:35:47.320 | They're doing almost 80 different experiments.
01:35:48.760 | So it's a lot of, it's very science-heavy,
01:35:50.080 | which I love as a scientist.
01:35:51.880 | But it's the ability to show that you can fundraise
01:35:54.320 | and launch up a crew that's all privately funded
01:35:57.200 | and then go to the space station.
01:35:58.280 | - Yeah, it's four people.
01:35:59.200 | - Yeah, four people.
01:36:00.040 | And the Axiom-2 will also likely be four people.
01:36:02.280 | The two that have been announced
01:36:03.400 | are John Shoffner and Peggy Whitson.
01:36:05.240 | Peggy Whitson's already a prior NASA astronaut,
01:36:07.680 | has been up many times, done many experiments.
01:36:09.880 | She knows the space station like her own house.
01:36:12.640 | And we recently did a training with Peggy and John
01:36:15.160 | in my lab at Cornell to get ready for some other
01:36:17.720 | genomics experiments that we'll do on that mission.
01:36:19.960 | - So they're doing the experiments, too.
01:36:22.600 | What does it take to design an experiment
01:36:25.680 | and to run a design experiment here on Earth
01:36:28.280 | that runs up there?
01:36:29.640 | And then also to actually do the running of the experiment?
01:36:32.360 | What are the constraints?
01:36:33.640 | What are the opportunities?
01:36:35.200 | All that kind of stuff.
01:36:36.680 | - The biggest concern is what do you need
01:36:39.000 | for reagents or materials, the liquids that you might use
01:36:41.380 | for any experiment?
01:36:42.440 | What if it floats away?
01:36:43.400 | What if it gets in someone's eye?
01:36:44.720 | 'Cause things always float away in space.
01:36:46.880 | There's notoriously panels in the space station
01:36:48.880 | where you don't wanna look behind
01:36:49.720 | because it's got a little fungus
01:36:51.200 | or food has gotten stuck there
01:36:52.560 | and sometimes found months and months or years later.
01:36:55.360 | So things float around.
01:36:57.000 | - The little things, wow.
01:36:58.880 | - And so if you have anything you need to do your experiment
01:37:01.320 | that's a liquid or a solid, whatever that is,
01:37:02.680 | it has to go through toxicity testing.
01:37:04.840 | And the big question is if this thing,
01:37:06.600 | whatever you wanna use, gets in someone's eye,
01:37:08.720 | will they lose their vision or be really injured?
01:37:11.480 | And if the answer is yes, it doesn't mean you can't use it.
01:37:14.120 | It just means if the answer's yes,
01:37:14.960 | you have to then go through multiple levels of containment.
01:37:16.920 | There's a glove box on the space station
01:37:19.240 | where you can actually do experiments
01:37:20.760 | that have triple layers of containment.
01:37:22.160 | So you can still use some harsh reagents,
01:37:24.480 | but you have to do them in that glove box.
01:37:26.960 | And so, but you can propose almost anything.
01:37:29.000 | The biggest challenge is the weight.
01:37:30.200 | If it's a heavy, it's $10,000 per kilogram
01:37:32.920 | to get something up into space.
01:37:34.620 | So if you have a big, heavy object,
01:37:36.400 | there's some costs you have to consider.
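As a back-of-the-envelope illustration of that per-kilogram figure, here's a tiny Python sketch; the example payload masses are invented for illustration, not real flight hardware specs.

```python
# Illustrative only: launch cost at the ~$10,000/kg figure quoted above.
COST_PER_KG_USD = 10_000  # approximate cost to get one kilogram to orbit (as quoted)

example_payloads_kg = {   # made-up example masses, not real hardware
    "handheld DNA sequencer": 0.5,
    "reagent kit for the glove box": 3.0,
    "small sample freezer": 45.0,
}

for name, mass_kg in example_payloads_kg.items():
    print(f"{name}: {mass_kg} kg -> ${mass_kg * COST_PER_KG_USD:,.0f} to launch")
```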
01:37:38.080 | - And that includes not just the materials,
01:37:40.200 | but the equipment used to analyze the materials.
01:37:43.320 | - One of the ones we worked on actually with Kate Rubins
01:37:45.480 | was putting the first DNA sequencer in space
01:37:47.400 | called the Biomolecule Sequencer Mission.
01:37:49.800 | Also with Aaron Burton and Sarah Castro-Wallace.
01:37:52.600 | But there, the interesting thing is
01:37:54.840 | we had to prepare this tiny little sequencer.
01:37:57.000 | It sequences DNA.
01:37:57.840 | You can do it really quickly, within minutes.
01:38:00.420 | And what's extraordinary is what you have to do,
01:38:03.040 | if you wanna get a piece of machinery up there,
01:38:04.800 | you have to do destructive testing.
01:38:06.320 | So you have to destroy it and see what happens.
01:38:07.760 | How does it destroy?
01:38:08.600 | Do little pieces of glass break everywhere?
01:38:10.520 | If so, that's a problem.
01:38:11.440 | So you have to redesign it.
01:38:13.000 | And do fire testing.
01:38:13.880 | How does it burn?
01:38:14.720 | How does your device explode in a fire, or doesn't it?
01:38:18.120 | You have to test that, and then you do vibration testing.
01:38:19.880 | So you have to basically,
01:38:20.720 | if you wanna fly one thing into space,
01:38:22.280 | you need to make four of them and destroy
01:38:23.840 | at least three of them to know how they destroy.
01:38:26.080 | - Destructive, fire, and vibration testing.
01:38:31.080 | I was just asking for a friend,
01:38:32.480 | how do you, from a scientific perspective,
01:38:34.800 | do destructive testing?
01:38:36.840 | And how do you do fire testing,
01:38:38.800 | and how do you do vibration testing?
01:38:40.240 | - Vibration, like just large shakers.
01:38:42.240 | So that's, actually it's mostly to simulate launch.
01:38:44.760 | They have a lot of machinery at NASA and at SpaceX
01:38:47.560 | to do just, make sure, does the thing completely fall apart
01:38:50.160 | if it has a high vibrational,
01:38:51.760 | essentially force attached to it?
01:38:53.280 | So it's just kind of like a big shaker.
01:38:55.420 | Fire testing is just to simulate what would happen
01:38:59.120 | if there was a normal fire.
01:39:00.360 | That's something that gets up to
01:39:02.200 | fire temperatures, several hundred degrees Celsius.
01:39:04.720 | And--
01:39:05.560 | - Open fire, or are we talking like you put in a toaster?
01:39:08.000 | - No, it's more like-- - Is it heat,
01:39:08.880 | or is it actual flames?
01:39:09.720 | - It's flames and heat.
01:39:10.960 | But it's not like a kiln or anything like that.
01:39:13.120 | You don't wanna know how does it burn in a kiln.
01:39:14.760 | It's more, is it flammable is the first big question.
01:39:17.080 | Like does it just start on fire?
01:39:18.120 | If it gets a little bit of flame on it,
01:39:19.200 | does it just light up like a Christmas tree?
01:39:20.760 | - Is there a YouTube video of this?
01:39:22.680 | - You know, I've checked--
01:39:23.520 | - Did you film any of this?
01:39:24.480 | - Not, not, no.
01:39:26.120 | Aaron Burton might still have some of the videos.
01:39:28.960 | We're in the middle of doing some testing
01:39:30.360 | for the new sequencer called the Mark 1C.
01:39:32.360 | So I will make videos of that.
01:39:34.720 | - I would love to see that,
01:39:36.560 | if anything, for my private collection.
01:39:39.000 | This is very exciting.
01:39:40.280 | - And the destructive testing is just,
01:39:41.600 | often it can be something as simple as a hammer.
01:39:43.680 | It's really how does it shatter?
01:39:44.560 | The question is, are there glass components?
01:39:46.400 | So it's like office space.
01:39:48.040 | - That's right.
01:39:48.880 | - That scene with the fax machine.
01:39:51.280 | - That's right, that's right.
01:39:52.120 | We blow it up, yeah, to the "Damn It Feels Good
01:39:53.800 | to Be a Gangsta" soundtrack.
01:39:55.080 | Yeah, that's a great scene.
01:39:56.440 | - That's so, that's so exciting.
01:39:58.880 | That kind of, that's the best of engineering
01:40:00.440 | is like that kind of testing.
01:40:02.360 | What else about designing experiments?
01:40:04.840 | Like what kind of stuff do you wanna get in there?
01:40:07.000 | You said 80 different experiments.
01:40:09.080 | So we're staying in the realm of biology and genetics.
01:40:13.160 | - Yeah, for now.
01:40:14.000 | But we also wanna do,
01:40:15.480 | some of the experiments that have been discussed
01:40:16.840 | in the lab have been,
01:40:18.640 | and some that are being planned as well.
01:40:19.640 | But I think the most controversial one
01:40:20.840 | that's come up in our planning,
01:40:22.920 | it gets back to sex in space.
01:40:24.280 | Is can human embryos divide
01:40:27.200 | and actually begin to develop in space?
01:40:28.880 | But then if we do that experiment,
01:40:30.600 | that means you're taking viable human embryos,
01:40:33.360 | watching them develop in space.
01:40:34.680 | Then you could freeze them and bring them down
01:40:35.960 | and characterize them to see,
01:40:37.520 | but to answer that question.
01:40:38.760 | 'Cause we actually don't know,
01:40:40.040 | can a human embryo actually develop well in zero gravity?
01:40:43.320 | We just don't know.
01:40:44.900 | But to find that out,
01:40:46.000 | that means we'd have to literally sacrifice embryos probably.
01:40:49.000 | Which itself has, of course,
01:40:50.440 | a lot of complications, ethical considerations.
01:40:53.680 | Some people just wouldn't.
01:40:54.520 | It's a non-starter for lots of people.
01:40:56.560 | - But we do know that the sperm survive in space,
01:41:00.360 | as you earlier said.
01:41:01.200 | - Yes.
01:41:02.020 | - And nobody cares about sperm apparently.
01:41:04.400 | - We're doing several studies
01:41:05.400 | on autism risk for fathers and sperm.
01:41:07.920 | And it's really easy to get sperm, I'll just tell you.
01:41:10.560 | People say, you're helping us.
01:41:11.920 | - That's what I hear.
01:41:12.760 | - That's right.
01:41:13.600 | - I read that somewhere on Wikipedia.
01:41:14.940 | - Asking for a friend.
01:41:16.040 | - Okay, cool.
01:41:18.620 | Are you involved in Axiom 1, Axiom 2 experiments?
01:41:22.180 | Is your lab directly or indirectly involved
01:41:26.660 | in terms of experiment design?
01:41:27.780 | What are you excited about?
01:41:29.140 | Different experiments that are happening out there.
01:41:31.620 | - Some of them we're doing a lot of the direct training
01:41:33.300 | for the crews.
01:41:34.140 | So they're saying,
01:41:34.960 | how do you do a modern genetics experiment?
01:41:37.820 | So for the Axiom 1,
01:41:39.020 | for Inspiration4 and Axiom 1,
01:41:40.980 | we're also collaborating with TRISH,
01:41:42.320 | which is the translational research arm for NASA.
01:41:44.980 | That's in Houston.
01:41:46.020 | And there it's a lot of sharing of samples and data
01:41:48.500 | for all these missions.
01:41:49.820 | Basically for all the commercial spaceflight missions,
01:41:52.300 | there'll be a repository where you can look at the data
01:41:54.100 | from the astronauts.
01:41:54.920 | You can look at some of the genetic information,
01:41:56.340 | some of the molecular changes.
01:41:58.100 | So that's being built up with TRISH,
01:41:59.580 | which has been a fabulous collaboration
01:42:00.940 | between Cornell and TRISH.
01:42:02.480 | But the other thing we're doing is for Axiom 2
01:42:03.740 | is training them.
01:42:04.820 | How do you, for example,
01:42:06.180 | if you want to look at a virus,
01:42:07.380 | you can take a swab of something,
01:42:08.820 | extract it, sequence it,
01:42:10.500 | and say, do I have Omicron?
01:42:11.860 | Or do I have a different virus?
01:42:13.060 | And we're gonna do some of the exact same work in flight,
01:42:15.340 | but we're having the astronauts do the extraction,
01:42:18.140 | the sequencing, and the analysis of all the molecules.
01:42:20.940 | And so one common occurrence is herpes
01:42:23.540 | is reactivated often in spaceflight, oral herpes.
01:42:26.680 | So you can see that viral reactivation
01:42:28.660 | is one of the biggest kind of mysteries in spaceflight,
01:42:30.780 | where the immune system seems to be responding a lot.
01:42:33.460 | It's active, the body's really perturbed,
01:42:35.920 | but viruses start shedding again.
01:42:38.020 | And it's really, and this happens clinically.
01:42:39.860 | Again, we see this for like, for example,
01:42:41.220 | hepatitis C or hepatitis B,
01:42:43.140 | you can get infected with it,
01:42:43.980 | and it can stay in your body for decades
01:42:46.060 | and still kind of be hiding in the body.
01:42:47.500 | And in this case, we see it in spaceflight,
01:42:49.300 | herpes comes back.
01:42:50.220 | So we want to figure out, is it there, first of all,
01:42:52.420 | and then when is it happening and characterize it better,
01:42:55.080 | but have the astronauts do it themselves
01:42:56.660 | rather than collecting it and bringing it back to Earth
01:42:58.460 | and figuring out later.
01:42:59.660 | We could see in real time how it's happening.
01:43:01.500 | And then also look at their blood.
01:43:02.500 | We'll see what is changing in their blood
01:43:04.660 | in real time with these new sequencers.
01:43:06.460 | So I'm excited about the genomics in space, if you will.
01:43:09.420 | - So clearly, somebody that loves robots,
01:43:12.180 | how many robots are up there in space
01:43:13.700 | that help with the experiments?
01:43:15.180 | Like, how much technology is there, would you say?
01:43:19.860 | Is it really a manual activity,
01:43:23.920 | or is there a lot of robots helping out?
01:43:25.660 | - Good question.
01:43:26.500 | So far, it's almost all manual,
01:43:28.020 | just 'cause the robots have to all undergo
01:43:29.620 | the destructive fire and vibrational testing.
01:43:32.660 | So if you have a million--
01:43:33.500 | - This is so exciting.
01:43:34.500 | - So if you can get--
01:43:36.220 | - That thing is a lot less than a million.
01:43:38.220 | So we could destroy it.
01:43:39.380 | - We're definitely gonna test it out for the,
01:43:42.980 | I guess in which order?
01:43:44.060 | And they have to do it separately for each one?
01:43:45.740 | - Yeah, each one, yeah.
01:43:46.580 | - They have to do vibration, fire.
01:43:48.380 | Note to self, do fire testing for the legged robots
01:43:54.660 | and the destructive testing.
01:43:57.020 | That would be fascinating.
01:43:58.500 | I wonder if any of the folks I'm working with
01:44:01.860 | did that kind of testing on the materials.
01:44:03.660 | Like, what breaks first with the robots?
01:44:05.860 | - That's a very key question.
01:44:07.100 | And also, the big question,
01:44:08.700 | so what's interesting about this,
01:44:09.540 | for Axiom and for these commercial spaceflight areas,
01:44:12.780 | if you can fund it, you could fly it, right?
01:44:14.580 | So if you had to say,
01:44:15.400 | "I wanna fly these series of robots up
01:44:17.580 | "because I think they could help build something
01:44:19.260 | "or they could help measure or repair the spacecraft."
01:44:22.140 | - Or you have to come up with a good reason.
01:44:24.260 | - Well, for NASA, you have to have a good reason,
01:44:25.420 | but I think for private spaceflight,
01:44:26.900 | you could have, "The reason is I'm curious."
01:44:28.620 | And that could just be it.
01:44:29.460 | - Exactly, curiosity.
01:44:30.300 | - And I've got a private fund, or I've got your own money.
01:44:32.540 | - And then you pay per kilogram, essentially.
01:44:36.460 | And I mean, there are some things.
01:44:37.340 | You can't say, "I wanna send a nuclear bomb up there
01:44:39.740 | "because I'm curious."
01:44:40.580 | I don't think that would fly.
01:44:41.420 | - And there's probably rules
01:44:42.860 | in terms of free-floating robots, right?
01:44:46.020 | They probably have to be attached.
01:44:47.980 | Like, it's an orchestra that plays together.
01:44:50.180 | All the experiments that are up there,
01:44:51.400 | there's probably, it's not silos.
01:44:54.100 | It's not separate, they're separate kind of things.
01:44:56.340 | But you're saying it's all mostly manual.
01:44:58.300 | How much electronics is there
01:44:59.540 | in terms of data collection,
01:45:00.620 | in terms of all that kind of stuff?
01:45:01.700 | - A lot of electronics.
01:45:02.540 | So a lot of it's tablets.
01:45:03.400 | There's laptops up there.
01:45:05.260 | The whole space station is running
01:45:07.820 | and humming on electronics.
01:45:08.740 | Actually, one of the biggest complaints astronauts have
01:45:10.900 | is sleeping up there is hard,
01:45:12.780 | not only 'cause you're in zero gravity,
01:45:13.860 | but there's a consistent loud hum of the space station.
01:45:16.900 | There's so many things active and humming and moving
01:45:20.500 | that are keeping the station alive.
01:45:22.140 | The CO2 scrubbers, all the instrumentation, it's loud.
01:45:26.340 | So I think it is a very well-powered lab, basically,
01:45:31.340 | in flight.
01:45:32.380 | But the future space stations, I think,
01:45:34.900 | will be very different because they're being built
01:45:37.060 | more for pleasure than business,
01:45:39.140 | or a little bit of both.
01:45:40.340 | But they're built for, we want people to,
01:45:43.100 | at least when you talk to Axiom,
01:45:44.260 | when you talk to the other industry partners,
01:45:46.860 | they wanna make space more fun and engaging
01:45:49.580 | and open to new ideas.
01:45:50.940 | - So that's looking at the fun stuff
01:45:53.820 | going on in the next few years.
01:45:56.260 | But if we zoom out once again,
01:45:58.820 | how and when do we get outside of the solar system?
01:46:02.060 | You mentioned this before.
01:46:04.420 | Or maybe you can mention the other hops we might take.
01:46:08.420 | You know what, let's step back a little.
01:46:11.420 | And where are some of the fun places we might visit first
01:46:15.580 | in a semi-permanent way inside the solar system
01:46:18.740 | that you think are worth visiting?
01:46:20.500 | - Yeah, at the end of 500 years,
01:46:21.580 | I'm hoping we make the big launch
01:46:23.220 | towards another solar system.
01:46:25.300 | Really driven by the fact that we now actually
01:46:27.700 | have exoplanets that we know we might be able to get to
01:46:29.820 | and survive on.
01:46:31.060 | Whereas 20 years ago, we really had almost none,
01:46:33.580 | certainly none that we knew were habitable.
01:46:35.460 | And exoplanet discoveries even just
01:46:37.780 | started to happen in '89 and really the early '90s
01:46:41.180 | for the real validated ones.
01:46:42.340 | So I'm hoping over the next 500 years,
01:46:45.500 | we go from thousands of possible habitable planets
01:46:47.540 | to hundreds of thousands or millions,
01:46:50.220 | especially with some of the recent telescopes launched,
01:46:52.460 | we'll find them.
01:46:53.380 | But before we get there, I have a whole section
01:46:56.940 | where I really describe the magic of Titan
01:46:58.540 | because it has all this methane,
01:47:00.460 | which is a great hydrocarbon you can use to make fuel.
01:47:02.700 | You can use it, it's cold as all bejesus on Titan.
01:47:06.420 | But if you can-- - Ice.
01:47:07.820 | - It's, yeah, it's--
01:47:09.380 | - So what's Titan made up of?
01:47:10.780 | What is Titan?
01:47:12.380 | Everybody loves Titan.
01:47:13.740 | - Yeah, it is, it's a favorite,
01:47:15.820 | it's this kind of eerie green-hued moon
01:47:19.820 | that's around Saturn, and it's, you know,
01:47:24.820 | this large moon that is, you know,
01:47:27.580 | so cold it has these methane lakes
01:47:30.540 | where the methane normally is a gas,
01:47:32.020 | but there it actually would be so cold,
01:47:33.220 | it's like a lake of methane.
01:47:34.700 | You could go swimming in it, potentially.
01:47:36.580 | There might be some degree of rocks
01:47:38.580 | or maybe mountains there,
01:47:39.800 | but they might also be made of like frozen methane.
01:47:42.600 | So no one's ever, no person's obviously been there,
01:47:45.500 | but we have, you know, enough satellite imagery
01:47:47.900 | and some data that you could actually
01:47:49.820 | potentially survive on Titan.
01:47:51.300 | So I think that'd be one place where I'm hoping
01:47:53.300 | that we would at least have a bit of an outpost.
01:47:56.020 | It might not be a luxurious retreat
01:47:58.260 | 'cause it's really cold.
01:47:59.340 | - Is there a life on Titan, you think,
01:48:01.460 | underneath the surface somewhere?
01:48:03.020 | - Maybe, well, actually, with all that carbon
01:48:06.900 | and all those hydrocarbons,
01:48:08.060 | it is very possible that some microbial life could be there
01:48:10.900 | and hanging out waiting for us to dip our toes
01:48:13.620 | into the methane and find it.
01:48:15.620 | But we don't know yet,
01:48:17.500 | but I think that's one place I'd like to see an outpost.
01:48:19.740 | I would like to see other outposts near Jupiter,
01:48:22.180 | but Jupiter has extremely high radiation, actually,
01:48:24.780 | so even places like Io,
01:48:26.500 | which are volcanically active and quite amazing,
01:48:29.140 | we probably couldn't survive that long
01:48:30.940 | or that close to Jupiter, though,
01:48:32.100 | it has because it's such a giant planet.
01:48:34.620 | It emits back out a lot of radiation
01:48:37.140 | that it's collecting from other parts of the universe
01:48:39.180 | and it juts back out.
01:48:40.340 | So if you get too close to Jupiter,
01:48:42.540 | you'd actually almost certainly not be able to survive,
01:48:44.940 | depending on which part of it,
01:48:46.240 | but that's one risk about Jupiter.
01:48:49.300 | But it'd be cool to see the giant red spot up close,
01:48:51.580 | maybe have some spots there.
01:48:53.060 | Mars is top one, then you get to pick Titan or Io,
01:48:59.780 | so ice, fire and ice,
01:48:59.780 | the Robert Frost poem comes to mind.
01:49:02.060 | And then Europa is that--
01:49:03.820 | - Europa would be cool, too, and Enceladus,
01:49:05.340 | which is a big ocean that might be there,
01:49:07.060 | like an alien ocean that's under the,
01:49:09.180 | it might be even water ice that's there,
01:49:11.420 | even liquid water potentially there under the surface,
01:49:13.900 | so that'd be a great candidate.
01:49:15.500 | The asteroids like Ceres would be good,
01:49:17.620 | or Eros, ones big enough that you get a little bit of gravity.
01:49:20.820 | That'd be interesting, you could have it
01:49:22.420 | maybe a habitable place there.
01:49:24.140 | And they just might be big enough
01:49:25.160 | that you could get there, survive,
01:49:26.740 | and even have a tiny bit of gravity, but not much.
01:49:28.860 | - Why do you like asteroids?
01:49:30.220 | - Well--
01:49:31.060 | - Are you just, we're just listing vacation spots.
01:49:33.580 | - Yeah, vacation spots, basically, yes.
01:49:34.940 | I'd say, well, so they probably have
01:49:36.140 | a lot of rare earth minerals
01:49:37.580 | that you could use for manufacturing,
01:49:38.940 | which is why part of the space economy
01:49:41.220 | that's being built up now is people really wanna go
01:49:43.420 | and hollow out the asteroids and bring back all the,
01:49:46.020 | you know, all the resources from it.
01:49:47.800 | So this legally is very possible,
01:49:50.620 | 'cause even though the Space Act
01:49:53.860 | prevents people from militarizing space,
01:49:56.180 | or owning all of it,
01:49:57.580 | if you get the resources out of an asteroid,
01:50:00.140 | but you don't actually say you own it,
01:50:01.380 | that's still, that's perfectly legal.
01:50:03.420 | So you could--
01:50:04.260 | - What's the Space Act?
01:50:05.700 | - Space Act, it was 1967,
01:50:07.580 | was the first large-scale agreement
01:50:10.340 | between major international parties,
01:50:12.820 | particularly the US and Russia, but also many others,
01:50:15.300 | to say that space should be a place for humanity,
01:50:18.220 | to not militarize it, to not weaponize it,
01:50:21.500 | also establish some of the basic sharing principles
01:50:23.620 | between countries who are going into space.
01:50:26.500 | And there was a plan to make an additional act in the '90s,
01:50:30.900 | the Lunar, actually I'm blanking on the name of it,
01:50:34.300 | but there hasn't been any significant legislation
01:50:37.140 | that has been universally accepted since the Space Act.
01:50:39.860 | - So, but the primary focus--
01:50:43.420 | - Was on the militarization.
01:50:44.260 | - Was on the militarization.
01:50:45.620 | - Which was, in theory, not allowed,
01:50:47.300 | which so far has stayed true,
01:50:49.840 | but there's no, is there any legal framework
01:50:53.740 | for who owns space and space,
01:50:57.700 | like different geographical regions of space,
01:51:01.420 | both out in space and on asteroids and planets and moons?
01:51:06.420 | - Currently, you can't own,
01:51:09.260 | you're not supposed to be able to own.
01:51:10.580 | I mean, people have tried to sell bits of the moon,
01:51:12.540 | for example, or sell names of stars,
01:51:14.420 | which is pretty harmless,
01:51:15.920 | but you're not supposed to be able to own
01:51:17.460 | any part of the moon, or an asteroid for that matter,
01:51:20.040 | but you're allowed to mine the resources from it.
01:51:23.640 | So in theory, you could go catch an asteroid,
01:51:26.400 | hollow out the whole thing like you eat an orange,
01:51:28.960 | and leave the shell, and say, "Okay, I'm done.
01:51:31.540 | "I never owned it, but I just extracted
01:51:33.640 | "everything inside of it, and now I'm done with it."
01:51:36.240 | - Of course, you see there's going to be
01:51:40.280 | some contentious battles, even wars, over those resources.
01:51:45.880 | Hopefully, at a very small scale,
01:51:47.500 | it's more like conflict or human tension, but...
01:51:51.460 | Oh boy, it's like war makes for human flourishing,
01:51:58.980 | after the war, somehow.
01:52:01.220 | Sometimes, there's just this explosion of conflict,
01:52:06.220 | and afterwards, for a long while, there's a flourishing.
01:52:10.220 | And again, conflict and flourishing,
01:52:12.180 | and hopefully, over a stretch of millennia,
01:52:15.580 | the rate of conflict, and the destructiveness
01:52:18.400 | of conflict decreases.
01:52:20.160 | - It has, at least in the past 100 years,
01:52:22.320 | the number of wars, number of military actions,
01:52:25.160 | casualties, have all decreased.
01:52:27.120 | I don't know if it's gonna stay that way for humanity.
01:52:30.000 | I think the trajectory's there.
01:52:33.480 | I think the warmongering is less tolerated
01:52:36.080 | by the international community.
01:52:38.280 | It's more scrutinized.
01:52:39.540 | It still happens.
01:52:40.640 | Right now, there's an ongoing war
01:52:41.680 | between Russia and Ukraine.
01:52:43.060 | You've spoken a lot about it.
01:52:44.500 | There sometimes will be small military actions,
01:52:50.620 | but I think, and even there,
01:52:53.700 | there's a large military action across most of the country,
01:52:57.180 | but not all of it, actually.
01:52:58.940 | I think we see less over time
01:53:00.680 | of large-scale multi-country invasions,
01:53:03.020 | like we've seen in the past.
01:53:04.300 | I think maybe that won't happen ever again,
01:53:06.020 | but you might see country-to-country battles happening,
01:53:09.220 | which has always happened, I think,
01:53:11.140 | but hopefully, less of that as well.
01:53:12.900 | - And yet, the destructiveness of our weapons increases,
01:53:17.200 | so it's a complicated race in both directions.
01:53:21.400 | We become more peaceful and more destructive
01:53:23.320 | at the same time.
01:53:24.160 | That's fascinating.
01:53:25.520 | How do we get outside the solar system?
01:53:27.660 | You write an epic line,
01:53:32.660 | I believe it's the title of one of the sections,
01:53:35.520 | "Launch Toward the Second Sun."
01:53:37.720 | That journey of saying we're going to,
01:53:42.200 | somehow the solar system feels like home.
01:53:44.320 | Earth is home, but the solar system is home.
01:53:47.520 | It's our sun.
01:53:48.400 | The sun is a source of life.
01:53:51.640 | And going towards the second sun,
01:53:55.000 | leaving this home behind, that's one hell of a journey.
01:53:59.540 | So what does that journey look like?
01:54:01.960 | When is it to happen, and what's required to make it happen?
01:54:06.400 | - To get to that state, I actually
01:54:09.640 | describe a number of options.
01:54:10.720 | We'd have to have people survive
01:54:12.280 | across multiple generations, live and die on the same spacecraft
01:54:14.800 | towards another star.
01:54:16.800 | Propulsion technology, you need to have that in place.
01:54:19.360 | I assume we don't have dramatic improvements.
01:54:22.200 | I describe ways it could happen, like antimatter drives
01:54:24.520 | or things that could make it possible to go faster.
01:54:27.240 | But since it's a book of nonfiction,
01:54:29.440 | I just make no big leaps other than what we know of today
01:54:32.200 | that's possible.
01:54:33.280 | And if that's the case, you'd need probably 20 generations
01:54:36.280 | to live and die on one spacecraft to make it towards
01:54:39.200 | what is our known closest habitable exoplanet.
01:54:41.880 | Now that sounds daunting: you need to have the life support,
01:54:44.800 | self-reliance, self-sustainability all in that one.
01:54:46.840 | It'd be a large spacecraft.
01:54:48.280 | You'd have to grow your own food,
01:54:49.400 | probably still have some areas with gravity.
01:54:51.640 | It would be complicated, but I think after 500 years,
01:54:53.960 | we could actually have the technology and the means
01:54:56.400 | and the understanding of biology to enable that.
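To put that 20-generation figure in scale, here's a minimal sketch assuming the target is Proxima Centauri b at roughly 4.24 light-years and about 25 years per generation; both numbers are illustrative assumptions, not figures taken from the book.

```python
# Illustrative only: average speed needed to cross ~4.24 light-years in ~20 generations.
LIGHT_YEAR_KM = 9.461e12
C_KM_S = 299_792                      # speed of light in km/s

distance_ly = 4.24                    # assumed distance to Proxima Centauri b
generations = 20
years_per_generation = 25             # assumed generation length

travel_years = generations * years_per_generation
speed_km_s = distance_ly * LIGHT_YEAR_KM / (travel_years * 365.25 * 24 * 3600)

print(f"Trip time: {travel_years} years")
print(f"Required average speed: {speed_km_s:,.0f} km/s ({speed_km_s / C_KM_S:.2%} of c)")
```

The point of the sketch is just scale: roughly a percent of light speed sustained for centuries, which is why the propulsion question matters so much for that final leg of the 500-year plan.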
01:54:59.460 | And so with that as a backdrop,
01:55:01.360 | you could have people hibernate, I talk about,
01:55:03.080 | like maybe you need to hibernate instead of just people
01:55:05.360 | living their normal life, but I think the hibernation
01:55:07.840 | technology doesn't work that well yet.
01:55:09.440 | And I don't know if it might pan out and maybe in 200 years,
01:55:12.360 | it gets really good and then people can all just sleep
01:55:14.040 | in pods, great.
01:55:15.880 | So I think this is the minimum viable product
01:55:19.080 | with everything that we have today and nothing else.
01:55:21.640 | So if that's the case, which of course,
01:55:23.600 | I'm sure we'll have more in 500 years, but basically with what we know today,
01:55:26.320 | you have people live and die on the spacecraft.
01:55:27.880 | And that sounds almost like a prison sentence.
01:55:29.440 | You say, if you were born onto a spacecraft,
01:55:32.040 | and when you got to an old enough age, you were told,
01:55:34.760 | yes, we're on a spacecraft,
01:55:36.800 | you will live your whole life on this,
01:55:38.720 | let's say something the size of a building.
01:55:40.760 | And this is everyone you'll ever know, and then you'll die.
01:55:43.720 | And then your children will also carry on the mission.
01:55:46.160 | Would those people feel proud and excited to say,
01:55:49.400 | we are the vanguard and hope of humanity.
01:55:52.260 | We're going towards a new sun and maybe they'd love it.
01:55:55.360 | Or would they, after 10 generations,
01:55:57.280 | maybe they would rebel and say to hell with this.
01:55:59.660 | I'm tired of being in this prison.
01:56:01.000 | This is a bad idea.
01:56:02.360 | We're turning around or we're going somewhere else
01:56:04.880 | or a mutiny happens and they kill each other.
01:56:06.480 | Right, so we would have to really make sure
01:56:08.800 | that the mental health, the structure of the society
01:56:11.000 | is built so they could sustain that mission.
01:56:12.640 | That's a crazy mission.
01:56:14.280 | But it's not that much different from spaceship Earth.
01:56:17.080 | Here we are stuck on one planet.
01:56:18.560 | We don't have planetary liberty.
01:56:19.900 | We can't go to another planet right now.
01:56:21.920 | We can't even really go to another moon that easily.
01:56:24.480 | So we, and I love Earth,
01:56:26.920 | there's lots of wonderful things here,
01:56:28.040 | but it's still just this one planet and we're stuck on it.
01:56:30.120 | So everyone that you know and love and live with here
01:56:33.240 | will be dead someday and that's all you'll ever know too.
01:56:36.460 | So I think it's a difference of scale,
01:56:38.120 | not a difference of type in terms of an experience.
01:56:40.720 | - Yeah, it's still a spaceship traveling out in space.
01:56:43.400 | Earth is still a spaceship traveling out in space.
01:56:46.200 | So it is a kind of prison.
01:56:48.920 | It's always, everybody lives in a prison.
01:56:52.840 | - Well, let's say it's a limited planetary experience.
01:56:55.120 | We'll say it's like that.
01:56:56.840 | Prison sounds so dark, but--
01:56:59.240 | - Just, yeah, just like prison is a limited geographic
01:57:04.320 | and culinary experience.
01:57:06.780 | - But I don't want it to be viewed that way.
01:57:08.140 | I want to think, wait, this is,
01:57:09.460 | what an extraordinary gift.
01:57:10.700 | And we wouldn't probably just launch one generation ship.
01:57:12.640 | We'd probably launch 10 or 20 of them,
01:57:14.100 | to the different best candidates,
01:57:16.420 | and hopefully get there.
01:57:17.740 | - And yeah, I mean, the fact is limitations and constraints
01:57:22.700 | make life fascinating because the human mind
01:57:26.740 | somehow struggles against those constraints
01:57:29.060 | and that's how beauty is created.
01:57:30.640 | So there is kind of a threshold, you know,
01:57:34.220 | being stuck in one room is different
01:57:37.280 | than being stuck in a building
01:57:39.040 | and being stuck in a city, being stuck in,
01:57:43.000 | like, I wonder what the threshold of people,
01:57:46.860 | like, I lived for a long time in a studio
01:57:51.120 | and then I upgraded gloriously to a one bedroom apartment.
01:57:54.820 | And the power to be able to like close the door.
01:57:59.820 | - It was magnificent, right?
01:58:01.920 | It's just like, wow, you can speak volumes.
01:58:06.080 | It's like, you can escape, that feels like freedom.
01:58:08.720 | That's the definition of freedom,
01:58:10.520 | having a door where you could close it
01:58:14.440 | and now you're alone with your thoughts
01:58:16.840 | and then you can open it and you enter
01:58:19.560 | and now there's other humans as freedom.
01:58:23.280 | So the threshold of what freedom,
01:58:25.600 | the experience of freedom is like is really fascinating.
01:58:29.160 | And like you said, there could be technologies
01:58:30.800 | in terms of hibernation, VR, ultra reality, virtual reality.
01:58:35.360 | 'Cause, you know, 30 years ago, it sounded awful,
01:58:37.880 | I think, you'd be stuck in a spacecraft,
01:58:39.280 | but now you could bring the totality
01:58:40.520 | of all of human history, culture, every music,
01:58:43.120 | every bit of music, song, every movie, every book,
01:58:46.140 | can all be in one tablet, basically, right?
01:58:47.720 | So, and also you'd still get updates from Netflix
01:58:51.000 | if you're on the way towards another star.
01:58:52.360 | You could still get downloads and so,
01:58:54.200 | but eventually maybe the crew
01:58:55.120 | would start to make their own shows.
01:58:56.320 | And they'd be like, well, I don't want the Earth shows.
01:58:57.660 | I wanna talk about, I'm gonna make a drama
01:58:59.400 | on this spacecraft, but I think it would have
01:59:01.680 | to be big enough so it feels like
01:59:02.800 | at least the size of a building.
01:59:04.040 | I think people's intuitions about quarantining
01:59:06.840 | have really become very immediate
01:59:08.960 | because we've all had to experience it
01:59:10.360 | to some degree in the past two years.
01:59:12.120 | And we've survived, but definitely we've learned
01:59:14.240 | that you need a really good internet connection.
01:59:16.440 | You need some ability to go somewhere sometimes.
01:59:20.200 | And that might just be as simple
01:59:21.440 | as people leaving the spacecraft
01:59:22.520 | to go to something that's another thing connected to it
01:59:24.360 | or just go out into the vacuum of space
01:59:26.460 | for an afternoon to experience it.
01:59:28.840 | So people need recreation, people need games,
01:59:31.200 | people need toys, people love to play.
01:59:34.100 | - What are chlorohumans?
01:59:38.440 | - Chlorohumans is a description
01:59:40.320 | of how you can embed chloroplasts into human skin
01:59:43.240 | or the thing that makes plants green
01:59:45.560 | so they can absorb light from the sun
01:59:47.640 | and then get all their energy that way.
01:59:49.200 | And of course, humans don't do this,
01:59:51.440 | but I describe in the book in the far future,
01:59:53.640 | maybe 300, 400 years from now,
01:59:55.800 | if we could work on the ways
01:59:57.320 | that animals and plants work together,
02:00:00.120 | you could embed chloroplasts in human skin.
02:00:01.920 | And then if you're hungry,
02:00:03.360 | you go outside and you lay out your skin
02:00:05.200 | and then you absorb sunlight
02:00:06.560 | and then you go back in when you get full.
02:00:08.440 | If you only wanted to lay outside for just say one hour
02:00:11.520 | to get your day's full value of energy,
02:00:15.880 | you'd need about two tennis courts worth of skin
02:00:17.760 | that you could lay out
02:00:19.360 | and maybe your friends would plan or something.
02:00:20.960 | But if they plan it,
02:00:21.800 | then their shadows would block your sun.
02:00:22.960 | So maybe you leave your skin out there
02:00:24.520 | and you could roll up your skin,
02:00:25.600 | go back inside after about one hour
02:00:27.840 | and that's how much skin you would need to have exposed
02:00:30.340 | with some reasonable assumptions
02:00:31.600 | about the light capture and efficiency of the chloroplast.
02:00:35.040 | So it was just kind of a fun concept in the book
02:00:36.800 | of green humans going around, absorbing light from the sun.
02:00:40.540 | Something I've dreamed about since I was a kid.
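A rough back-of-envelope check of that two-tennis-courts figure, sketched in Python. The irradiance and conversion-efficiency numbers below are illustrative assumptions of mine, not values taken from the book:

```python
# Back-of-envelope: skin area needed to collect a full day's energy
# from one hour of sunlight. All constants are rough assumptions.
DAILY_ENERGY_J = 2000 * 4184         # ~2000 kcal per day, in joules
SOLAR_IRRADIANCE_W_M2 = 1000         # bright midday sun at Earth's surface
EFFICIENCY = 0.005                   # assumed sunlight-to-usable-energy efficiency (~0.5%)
EXPOSURE_S = 3600                    # one hour laid out in the sun

area_m2 = DAILY_ENERGY_J / (SOLAR_IRRADIANCE_W_M2 * EFFICIENCY * EXPOSURE_S)
TENNIS_COURT_M2 = 260.9              # area of a doubles tennis court
print(f"~{area_m2:.0f} m^2 of skin, about {area_m2 / TENNIS_COURT_M2:.1f} tennis courts")
# With these assumptions: ~465 m^2, roughly two tennis courts.
# Staying out four hours, or a higher assumed efficiency, shrinks the area proportionally.
```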
02:00:42.280 | - Is there engineering ways of like having that much skin
02:00:47.280 | and being able to laying it out efficiently?
02:00:50.040 | Like is there, it sounds absurd, but.
02:00:53.300 | - You could roll it up.
02:00:54.140 | Or you could just lay outside longer.
02:00:55.800 | I wanted to think if you just had one hour
02:00:57.560 | and how much skin would you need.
02:00:58.840 | But if you just went out there for four hours,
02:01:01.160 | you need something that's smaller,
02:01:02.120 | but you know, thick,
02:01:03.880 | so it's as of a half a tennis court.
02:01:04.960 | So you could make a--
02:01:05.800 | - Could be like wings, gigantic wings.
02:01:07.520 | - And you lay them out there.
02:01:09.140 | But also, that's if you needed all your energy
02:01:12.080 | only from your skin.
02:01:13.040 | So if you just get a little bit of it,
02:01:14.280 | your energy, of course,
02:01:15.120 | you could just walk around with your skin as is
02:01:17.240 | and you'd still have to eat, but not as much.
02:01:19.280 | And I describe that because we'd need other ways
02:01:22.160 | to think about making your own energy
02:01:24.140 | if you're on a really long mission that's far from stars.
02:01:26.060 | You could turn on a lamp that would give you
02:01:27.380 | some of that essentially exact wavelength of light
02:01:30.180 | you need for your chloroplasts in your skin.
02:01:33.020 | But that's something I'm hoping would happen
02:01:37.020 | in three to 400 years, but it would be hard
02:01:39.780 | because you're taking a plant organelle
02:01:42.180 | and putting it in an animal cell,
02:01:43.860 | which sounds weird, but we have mitochondria inside of us,
02:01:45.860 | which is basically where our cells captured a bacterium,
02:01:48.020 | and now it walks around with us all the time.
02:01:49.980 | So there's precedent for it in evolution.
02:01:52.760 | - How much, by the way, speaking of which,
02:01:54.880 | does evolution help us here?
02:01:56.920 | So we talked a lot about engineering,
02:02:00.560 | building genetically modifying humans
02:02:04.920 | to make them more resilient or having mechanisms
02:02:07.820 | for repairing parts of humans.
02:02:10.380 | What about evolving humans or evolving organisms
02:02:16.020 | that live on humans?
02:02:17.940 | Sort of the thing you mentioned,
02:02:20.240 | which you've already learned,
02:02:21.800 | is that humans are pretty adaptable.
02:02:23.840 | Now what does that mean?
02:02:24.840 | You also, somewhere wrote that there's trillions of cells
02:02:29.800 | that make up the human body.
02:02:31.800 | And those are all organisms,
02:02:33.840 | and they're also very adaptable.
02:02:36.420 | So can we leverage the natural process of evolution,
02:02:40.360 | of the slaughter, the selection, the mutation,
02:02:42.920 | and the adaptation that is all,
02:02:45.920 | sorry to throw slaughter into there,
02:02:47.400 | it's just acknowledging that a lot of organisms
02:02:49.960 | have to die in evolution.
02:02:51.900 | Can we use that for long-term space travel
02:02:57.960 | or colonization, occupation?
02:03:02.240 | Is there a good word for this?
02:03:03.800 | Of planets.
02:03:05.920 | - Like to terraform the planet?
02:03:07.640 | - Terraform the planet.
02:03:08.880 | No, to adjust the human body to the planet.
02:03:11.120 | - Oh, there's not really a term for that yet, I guess, to--
02:03:15.160 | - Adapt to the new vacation spot.
02:03:18.320 | - Yeah, I called it just directed evolution in the book,
02:03:22.040 | is that you guide the evolution towards what you want.
02:03:25.320 | In this case, sometimes you can engineer your cells
02:03:27.640 | to make exactly what you want,
02:03:28.480 | but other times you put people on planets
02:03:30.480 | and see how they change.
02:03:32.040 | Actually, later in the book,
02:03:33.500 | I imagine if you have humans on multiple planets,
02:03:35.960 | you could have this virtuous cycle,
02:03:37.160 | where as people adapt and evolve here,
02:03:39.320 | you'd sequence their DNA and see how they change,
02:03:41.240 | and then send the information back to the other planet,
02:03:43.720 | and then study them with more resources.
02:03:45.640 | So you'd be able to then have a continual exchange
02:03:48.560 | of what's evolving in which way on different planets,
02:03:51.040 | and then each planet would learn from the changes
02:03:53.560 | that they see at the other planet.
02:03:55.960 | - Does the evolution happen at the scale of human,
02:03:58.320 | or do we need the individual,
02:04:00.240 | or is it more efficient to do bacteria?
02:04:03.520 | - Bacteria are cheaper and faster and easier,
02:04:05.640 | but we also have a lot of bacteria in us,
02:04:08.120 | on us, and all around us.
02:04:09.640 | Even the bacteria in the space station
02:04:10.940 | are continually evolving.
02:04:12.480 | - Did you study that, by the way?
02:04:13.640 | Like non-human cells, like the microbiomes.
02:04:18.480 | - Yep, so we've seen it for the astronauts.
02:04:20.560 | We can actually see their immune system
02:04:23.800 | respond to the microbiome of the space station.
02:04:25.760 | So as soon as you get into that aluminum tube,
02:04:28.000 | there's a whole ecosystem that's already up there,
02:04:29.800 | and we can actually see, we saw this with Scott Kelly,
02:04:32.080 | we've seen this with other astronauts,
02:04:33.440 | you can see the T-cells in their body,
02:04:34.940 | they actually are responding to little peptides,
02:04:36.640 | the molecules of the bacteria.
02:04:39.480 | The immune system is looking for a specific bacteria,
02:04:42.120 | and then once it sees new ones, it remembers it,
02:04:44.200 | and you can see the body looking for the microbes
02:04:46.560 | that are only on the space station,
02:04:48.320 | that you don't see on Earth.
02:04:49.360 | And then when Scott came back,
02:04:50.560 | he actually had more of those microbes
02:04:52.160 | embedded in his skin and in his mouth and stool
02:04:54.920 | that weren't there before.
02:04:55.880 | So he like picked up new hitchhikers in the space station
02:04:58.640 | and brought some of them back down with him.
02:05:00.080 | - So there's like long-term ecosystems up in the space station.
02:05:03.200 | - 20 years, they've been up there for 20 years, yes.
02:05:05.840 | - There's some like Chuck Norris type of bacteria
02:05:08.440 | up there, I'm sure.
02:05:11.720 | - You're part of the Extreme Microbiome Project.
02:05:14.120 | What does that involve, and what kind of fun organisms
02:05:18.040 | have you learned about, have you gotten to explore?
02:05:23.440 | - We have a really fun project, XMP,
02:05:26.440 | the Extreme Microbiome, which is as it sounds like.
02:05:28.440 | We look for really odd places,
02:05:30.320 | like heavy radiation environments,
02:05:32.200 | high salt, high or low temperature,
02:05:34.760 | you know, strange area, the space station, for example,
02:05:36.920 | lots of radiation and microgravity.
02:05:38.760 | Places where organisms can evolve
02:05:41.040 | for interesting adaptations,
02:05:42.480 | and some of them have been organisms we've seen
02:05:44.800 | like a candy pink lake in Australia called Lake Hillier,
02:05:48.720 | which we just published a paper on this.
02:05:50.680 | - Why is it pink?
02:05:51.640 | - So it's actually Dunaliella salina,
02:05:53.880 | which is one of these organisms.
02:05:54.920 | There's a mixture of bacteria and some algae
02:05:57.440 | that are there that make it bright pink.
02:05:59.200 | So they actually make carotenoids,
02:06:00.600 | these like really sort of orangey and kind of pink molecules
02:06:03.840 | when you look at them in the light.
02:06:04.840 | So if you get enough of the bacteria, it becomes pink.
02:06:07.880 | So, and it's not just pink,
02:06:09.000 | it's like bubblegum pink, the lake.
02:06:11.080 | And so we, that's just an odd,
02:06:13.280 | it's a halophile, means that it grows in 30% salt.
02:06:16.920 | And if you go below 10, 15% salt, it doesn't even grow.
02:06:20.040 | It actually kills it.
02:06:21.240 | - Oh, oh. - Yeah, there it is,
02:06:22.600 | Lake Hillier.
02:06:23.440 | - Is it toxic to humans or no?
02:06:25.240 | - So when you walk in the pink lake,
02:06:27.240 | actually, it's so hypertonic, meaning it's so salty,
02:06:30.520 | you can feel it lysing and killing your cells on your foot.
02:06:33.260 | So it actually hurts to walk in 'cause it's so salty.
02:06:36.240 | So yeah, but it won't kill, it'll--
02:06:38.600 | - Listen, you have to suffer for art.
02:06:40.120 | - That's right. - Great art requires suffering.
02:06:42.520 | - I mean, so it is a beautiful lake.
02:06:44.880 | You have to get permits to go sample there,
02:06:46.400 | but we actually just got an email last week.
02:06:48.220 | There's pilots who fly over this in Australia
02:06:50.160 | because they love the color.
02:06:51.680 | So he emailed us, one of the pilots,
02:06:53.000 | and he said, "Hey, guys, I saw you publish this paper.
02:06:55.520 | "It's not as pink as it used to be,"
02:06:57.080 | 'cause he loves flying over it,
02:06:57.920 | and it was like a little bit less pink
02:06:59.160 | 'cause it had a bunch of rain in the past few weeks.
02:07:01.540 | So it was just a little bit diluted.
02:07:02.660 | So we reassured him it'll get more pink as they grow again.
02:07:06.440 | But basically, yeah, it's a beautiful pink lake.
02:07:09.000 | - That is gorgeous.
02:07:10.640 | - It's almost like a Dr. Seuss book or something.
02:07:12.720 | It doesn't even look real. - Is it hard to get to?
02:07:14.760 | - Yeah, there's no road.
02:07:16.400 | You have to basically fly, land nearby it,
02:07:19.040 | and then paddle in, so it's not next to anything.
02:07:21.480 | So it's hard to get to, but once you get there,
02:07:24.480 | it's beautiful.
02:07:25.480 | - If anyone knows how to get there, let me know.
02:07:27.280 | I wanna go there.
02:07:28.360 | Okay, cool, what are some other extreme organisms
02:07:31.720 | that you study?
02:07:33.120 | - Other ones, there's some organisms we've studied
02:07:35.160 | in the space station called Acinetobacter pitii,
02:07:37.840 | which is often found in human skin,
02:07:40.400 | but we've found hundreds of strains in the space station
02:07:43.640 | that we've brought down and curated and then sequenced.
02:07:47.000 | And this is with Kasthuri Venkateswaran,
02:07:49.520 | who's at Jet Propulsion Laboratory working with him.
02:07:52.000 | And they have evolved, so they no longer look
02:07:54.400 | like any Earth-based Acinetobacter.
02:07:57.120 | They don't look like, they're now basically a new species.
02:08:00.280 | So actually, there are different species of bacteria
02:08:04.000 | and fungi that have now mutated so much in the space station
02:08:07.040 | that they're literally new species.
02:08:08.920 | And so we've found some of those that have,
02:08:10.680 | just they're evolving, as life is always evolving,
02:08:13.520 | and we can see it also in the space station.
02:08:14.360 | - So an entirely new species born in the space station.
02:08:17.240 | - Yeah, that's completely different.
02:08:18.360 | So we found one species, actually,
02:08:21.160 | that we named after a donor to Cornell,
02:08:25.000 | someone who's donated funds to research.
02:08:27.120 | So we named a different species of fungus after him,
02:08:29.480 | Naganishia tulchinskyi, 'cause he's Igor Tulchinsky.
02:08:33.240 | So as a thank you for him donating to Cornell,
02:08:35.680 | we said we've named this fungus
02:08:37.440 | that we found on the space station for you.
02:08:39.320 | - Was he grateful, or did he stop funding all the research?
02:08:43.000 | (laughing)
02:08:44.160 | - He was very grateful, and then,
02:08:45.240 | and I told him, I said, if you have an ex-girlfriend,
02:08:47.800 | we could try and name a genital fungus
02:08:50.280 | after her or something, if you want.
02:08:51.920 | And he said, yeah, he said maybe.
02:08:53.400 | (laughing)
02:08:55.600 | - He stopped answering emails after that.
02:08:57.880 | Okay, what about in extreme conditions,
02:09:02.880 | in ice, in heat, is that something of interest to you,
02:09:06.760 | in the things that survive where most things can't?
02:09:10.600 | - Yes, of keen interest.
02:09:12.120 | I think that will be the roadmap
02:09:13.720 | for some of the potential adaptations
02:09:15.800 | we could think of for human cells,
02:09:17.360 | or certainly for the microbiome,
02:09:19.840 | like just all the microorganisms in and on and around us.
02:09:23.040 | So we've seen, even there's this one crater,
02:09:26.320 | it's called the Lake of Fire, it's in Turkmenistan,
02:09:30.680 | where it's been on fire because of oil
02:09:32.760 | that had been set on fire decades ago,
02:09:34.600 | and it's still burning.
02:09:35.820 | So we collected some samples from there,
02:09:37.460 | and those were some Pseudomonas putida,
02:09:40.000 | some species we found there that can--
02:09:41.520 | - So there's stuff alive there.
02:09:42.360 | - That seems to be surviving there
02:09:44.360 | by this large pit of fire.
02:09:46.440 | Oh yeah, there it is, the desert.
02:09:47.480 | It's been just on fire for decades, apparently.
02:09:50.560 | - What the?
02:09:51.400 | (laughing)
02:09:52.240 | - So this is another place that--
02:09:53.080 | - It's just a lake of fire.
02:09:55.320 | - Yeah, yeah, and it's--
02:09:56.600 | - Soviet scientists had set up a drilling rig here
02:09:59.200 | for extraction of natural gas.
02:10:02.280 | Of course, it would be in this part of the world
02:10:04.640 | that you would get something like this,
02:10:05.920 | but the rig collapsed, and methane gas
02:10:08.480 | is being released from the crater.
02:10:10.240 | Yeah, so for those just listening,
02:10:13.120 | we're looking at a lake full of fire,
02:10:17.000 | and there's something alive there, allegedly.
02:10:20.960 | - And Pseudomonas are known to be
02:10:22.520 | some of the most tough organisms.
02:10:23.760 | They actually can clean toxic waste
02:10:26.200 | from, you know, years of Superfund sites
02:10:28.280 | where there's so much waste that's been deposited,
02:10:30.960 | you'll find them there as well.
02:10:31.880 | Actually, there's one place, the Gowanus Canal
02:10:34.240 | it's called,
02:10:35.520 | in New York City in Brooklyn,
02:10:37.480 | and it is a complete toxic waste dump.
02:10:39.680 | That was where a lot of waste in the 1700s was dumped,
02:10:42.640 | and so the gateway to hell is what it's called.
02:10:45.360 | But the--
02:10:46.200 | (laughing)
02:10:47.020 | - That's the nickname for the lake.
02:10:48.520 | (laughing)
02:10:51.040 | - So the Gowanus Canal is also a place
02:10:53.360 | that has been fun to sequence and see Pseudomonas species
02:10:57.720 | that can survive there,
02:10:58.560 | basically pulling toxins from the environment.
02:11:00.380 | So it's as if you create this toxic landscape,
02:11:03.360 | and then evolution comes in and says,
02:11:05.120 | oh, fine, I'll make things that can survive here.
02:11:07.520 | And when you look at the biochemistry of those species,
02:11:10.080 | what they've created is their own salvation, basically.
02:11:12.680 | The selection has made them survivors,
02:11:14.940 | and suddenly you can use that
02:11:16.000 | to remediate other polluted sites, for example.
02:11:18.900 | - That explains Twitter perfectly.
02:11:20.520 | The toxicity created adaptation
02:11:23.840 | for the psychological microbiome that is social media.
02:11:29.440 | Okay, beautiful, but you just actually jump back
02:11:33.840 | to the interstellar travel.
02:11:36.420 | Assuming the technology of today, yes,
02:11:41.040 | what are some wild innovations that might happen
02:11:45.800 | in the space of physics or biology?
02:11:47.860 | By the way, where do you think
02:11:49.660 | is the most exciting breakthroughs
02:11:51.600 | for interstellar travel that will happen
02:11:53.800 | in the next 500 years?
02:11:55.240 | Is it physics, is it biology, is it computer science?
02:11:59.800 | So information or DNA,
02:12:02.120 | like some kind of informational type of thing?
02:12:06.520 | Is it biological, like physiological,
02:12:09.160 | making the body resilient, live longer,
02:12:12.900 | and resilient to the harsh conditions of space?
02:12:16.840 | Or is it the actual vehicle of transport,
02:12:20.720 | which would be applied physics?
02:12:22.960 | - As you can probably guess, I'll say all of the above.
02:12:25.160 | (Dave laughs)
02:12:26.000 | - It's a question, never.
02:12:27.520 | - But to break those down, though,
02:12:29.480 | I think the AI, I hope in the book later
02:12:31.520 | that we would have really good machine companions,
02:12:34.360 | that the AI, I really hope the AIs that we build,
02:12:37.760 | like realistically, we are the programmers who make them,
02:12:40.620 | I would feel a colossal failure if we didn't make AI
02:12:43.320 | that was embedded with a sense of duty and caretaking
02:12:46.280 | and friendship and even creativity.
02:12:48.200 | Like we have the opportunity.
02:12:50.080 | I've coded algorithms myself.
02:12:52.960 | We're building them, so it's incumbent upon us
02:12:55.320 | to actually make them not assholes, I think, frankly.
02:12:58.300 | So it'd just be--
02:12:59.140 | - It's a technical term.
02:13:00.480 | - Air. (laughs)
02:13:01.360 | - Actually, on that point, just to linger on the AI front,
02:13:05.040 | can you steelman the case that HAL 9000
02:13:07.560 | from "Space Odyssey" was doing the right thing?
02:13:10.560 | So for people who haven't seen "2001 Space Odyssey,"
02:13:16.200 | HAL 9000 is very kind of focused on the mission,
02:13:21.680 | cares a lot about the mission,
02:13:23.160 | and kind of wants to hurt the astronauts
02:13:26.720 | that try to get in the way of the mission.
02:13:29.080 | - I think he was doing what he was programmed to do,
02:13:31.200 | which was just to follow the mission,
02:13:33.360 | but didn't have a sense of, you know, a broader duty.
02:13:38.360 | I mean, he was--
02:13:39.800 | - What's the broader duty exactly?
02:13:42.080 | Maintaining the well-being of astronauts?
02:13:45.360 | - Yeah, or giving them another option.
02:13:46.960 | I think he viewed them as completely expendable,
02:13:49.960 | rather than say--
02:13:50.800 | - Not completely, it's a trade-off.
02:13:53.320 | - Oh.
02:13:54.160 | - So like, a doctor has to make decisions like this, too.
02:13:58.280 | You're restricted on the resources.
02:14:00.720 | You have to make life and death decisions.
02:14:02.960 | So maybe HAL 9000 had a long-term vision
02:14:06.880 | of what is good for the civilization back at home.
02:14:09.920 | - Maybe a deontogenic vision of what was the best duty
02:14:13.720 | for the genetics, you could say.
02:14:15.920 | - What's deontogenic mean?
02:14:17.600 | - It's a word I made up in the book.
02:14:19.240 | It's like, what is your genetic duty?
02:14:20.680 | It's like, when you think of your DNA,
02:14:23.400 | what are you supposed to do with it,
02:14:24.440 | which is kind of the value of life.
02:14:25.960 | But if HAL was a silicon-based version of genetics,
02:14:30.160 | which is just his own maintenance of himself
02:14:32.880 | and self-survival, you could argue
02:14:35.040 | he was doing the right thing for himself.
02:14:36.600 | But I think a human in that circumstance
02:14:38.980 | might have tried to find a way to,
02:14:40.680 | even if the astronauts don't agree with the mission,
02:14:43.800 | to figure out some way to get them
02:14:44.640 | on a different spacecraft to go away or something,
02:14:47.280 | versus just say, well, you're in the way of the mission,
02:14:50.120 | you have to die, is I think,
02:14:52.640 | but a combination can always be made,
02:14:54.000 | to your point with doctors.
02:14:55.080 | Sometimes you'd like to save three people,
02:14:56.480 | but you can only save two,
02:14:58.080 | and you have to at some point pick.
02:14:59.240 | But I think that since it's a false dichotomy,
02:15:01.680 | I think HAL wasn't programmed to
02:15:04.520 | and didn't try to find a third solution.
02:15:07.280 | - Perhaps, just like Stuart Russell proposes this idea
02:15:10.720 | that AI systems should have self-doubt.
02:15:14.240 | They should be always uncertain in their final decision,
02:15:16.480 | and that would help HAL sort of get out
02:15:19.440 | the local optimum of this is the mission.
02:15:24.000 | Always be a little bit like,
02:15:25.360 | hmm, not sure if this is the right thing.
02:15:27.120 | And then you're forced to kind of contend
02:15:29.160 | with other humans, with other entities,
02:15:31.440 | on what is the right decision.
02:15:32.960 | So the worst thing about decisions from that perspective
02:15:38.960 | is if you're extremely confident
02:15:43.240 | and you're stubborn and immovable.
02:15:46.880 | - Right.
02:15:47.720 | - But programming doubt, that sounds complicated.
02:15:50.560 | That sounds like--
02:15:51.400 | - Go wrong.
02:15:52.240 | - Yeah.
02:15:53.080 | - So many ways.
02:15:53.920 | - You can go wrong either way.
02:15:54.760 | If you're too confident, you won't see the other options.
02:15:56.600 | If you have too much doubt, you won't move.
02:15:58.000 | You'll be paralyzed by the options.
02:15:59.920 | So you need some middle ground,
02:16:01.240 | which I think is what most people experience every day.
02:16:04.120 | We all love the concept of being a steadfast,
02:16:07.960 | resolute leader, making big decisions quickly
02:16:10.440 | and without question.
02:16:13.000 | But at the same time, we know people can be blinded
02:16:15.880 | to things they're missing if they're too headstrong.
02:16:18.600 | - So how would you improve HAL 9000?
02:16:20.880 | - I think I would include other,
02:16:23.080 | 'cause HAL is one program, much like we do for humans.
02:16:26.400 | You get feedback from other humans
02:16:28.560 | before you make a decision that affects all of them.
02:16:30.480 | So I think HAL could have gotten feedback
02:16:32.480 | from other AI systems that said,
02:16:33.920 | "Well, are there other options here?"
02:16:35.920 | And done it probably very quickly.
02:16:37.360 | Or you can even embed a programming system
02:16:39.640 | where the AI has a primary function,
02:16:41.080 | but at times of uncertainty,
02:16:42.800 | queries a series of other programmed AIs
02:16:44.880 | to ask for a consensus almost,
02:16:46.560 | more like a democracy of the AI.
02:16:48.640 | But since it's all programmed,
02:16:49.760 | you could bring it all together and say there's a primary,
02:16:51.400 | but it only activates the parliament, if you will,
02:16:53.880 | for a decision when needed.
02:16:55.920 | Now, I don't know how you program
02:16:56.840 | dramatically different AIs all in one system
02:16:58.880 | that are different enough,
02:17:00.320 | but conceptually it's possible.
02:17:02.560 | Of course, that can lead to log jam
02:17:04.680 | and government and parliament doesn't do anything
02:17:06.320 | or Congress doesn't do anything.
02:17:07.600 | So there's trade-offs, but it's one idea.
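A toy sketch of that "parliament of AIs" idea: a primary policy acts on its own when it is confident, and polls a panel of differently built advisors when it is not. The function names, threshold, and stand-in policies below are hypothetical, just to make the shape of the idea concrete:

```python
# Minimal sketch of a primary AI that defers to a "parliament" of advisors
# whenever its own confidence drops below a doubt threshold.
from collections import Counter
from typing import Callable, Dict, List, Tuple

Decision = Tuple[str, float]  # (action, confidence in [0, 1])

def parliament_decide(
    primary: Callable[[Dict], Decision],
    advisors: List[Callable[[Dict], Decision]],
    situation: Dict,
    doubt_threshold: float = 0.8,
) -> str:
    action, confidence = primary(situation)
    if confidence >= doubt_threshold:
        return action                                    # confident: act directly
    votes = [action] + [adv(situation)[0] for adv in advisors]
    return Counter(votes).most_common(1)[0][0]           # uncertain: majority vote

# Stand-in policies for illustration only.
mission_first = lambda s: ("continue_mission", 0.55)
crew_safety   = lambda s: ("protect_crew", 0.90)
conservative  = lambda s: ("protect_crew", 0.70)

print(parliament_decide(mission_first, [crew_safety, conservative], {}))
# -> "protect_crew": the low-confidence primary defers to the panel.
```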
02:17:13.640 | - I'm sorry, Dave, I'm afraid I can't do that.
02:17:16.080 | That I find really compelling the idea.
02:17:20.600 | I'd love to set that up in my own life at some point.
02:17:23.200 | So you're stuck there on a spaceship with an AI system
02:17:29.680 | and it's just the two of you and you have to figure it out.
02:17:33.080 | I love that challenge.
02:17:36.480 | I love that almost a really deep human conflict
02:17:43.040 | of through conversation have to arrive at something.
02:17:46.360 | You really try to understand what survival is a stake.
02:17:49.000 | You have to try to understand the other being.
02:17:51.400 | Now, you think it's just a robot.
02:17:52.760 | We keep saying it's just programmed.
02:17:55.000 | But you know what?
02:17:55.840 | When you talk to another human--
02:17:57.320 | - It's just a bag of meat.
02:17:59.200 | - And then you disagree and you're like,
02:18:01.760 | everybody starts using terms like how dumb can you be?
02:18:05.560 | How ignorant can you be?
02:18:07.160 | Come on, this is the right way.
02:18:08.720 | What are you talking about?
02:18:09.640 | This is what you're talking about is insane.
02:18:11.600 | And when the stakes go up, when it's life and death,
02:18:14.800 | you have to convince another person.
02:18:16.360 | First, you have to understand another person.
02:18:18.440 | In this case, you have to understand the machine
02:18:22.520 | without knowing how it was programmed
02:18:24.720 | because as a programmer, even, I mean,
02:18:28.280 | this is very much true for these Lego robots.
02:18:31.680 | I really make sure that everything that's programmed
02:18:34.800 | is sufficiently large and has a sufficient degree
02:18:39.320 | of uncertainty where I'm constantly surprised.
02:18:42.640 | I don't know how it works.
02:18:44.360 | I kind of know how it works, but I'm surprised constantly.
02:18:47.400 | And there, there's a human component
02:18:51.160 | of trying to figure each other out.
02:18:53.000 | And if it's high stakes--
02:18:55.400 | - Life and death.
02:18:56.280 | - Through conversation, I mean, to me,
02:18:58.600 | that's actually what makes a great companion out in space
02:19:01.320 | is like you're both in charge of each other's life
02:19:05.040 | and you both don't quite know how each other works.
02:19:10.040 | And also you don't treat each other as a servant.
02:19:15.400 | So I don't know if Hal was treated that way a little bit
02:19:20.920 | where you're like a servant as opposed to a friend,
02:19:26.600 | a companion, a teammate.
02:19:29.100 | 'Cause I think the worst part about treating an AI system
02:19:34.880 | or another human being as a servant is what it does to you.
02:19:39.080 | - If you treat them as a means to an end
02:19:40.880 | rather than end in of itself, then you've debased them.
02:19:44.520 | - And lessened the humanity in yourself.
02:19:46.760 | - Yeah, at the same time.
02:19:48.000 | - Which is, I mean, that's why they talked about
02:19:50.160 | kids have to be polite to Alexa
02:19:52.080 | because they find if they're, you know, if people are,
02:19:55.880 | if kids are rude to AI systems, they actually, that--
02:19:59.920 | - It's a bad sign, right?
02:20:01.920 | - It's a bad sign and it develops the wrong thing
02:20:04.420 | in terms of how they treat other human beings.
02:20:07.560 | So that's AI.
02:20:09.160 | So what about physics?
02:20:10.760 | Can we do, in terms of, can we travel
02:20:13.280 | close to the speed of light?
02:20:14.640 | Can we travel faster than the speed of light?
02:20:17.200 | - I would love to fold space.
02:20:19.000 | We know wormholes are technically possible,
02:20:21.480 | but we have no way to do it.
02:20:22.760 | I'd love to see advanced wormhole technology,
02:20:25.300 | antimatter drives, antimatter is notoriously missing
02:20:29.360 | for most of the universe, so--
02:20:30.680 | - What is antimatter drive?
02:20:32.600 | - Antimatter would be where you just purify bits
02:20:34.620 | of antimatter, basically, that is the opposite of matter.
02:20:38.140 | So if you can have an anti-electron,
02:20:39.940 | you can convert to the electron,
02:20:41.580 | you could have even the complete atoms,
02:20:43.420 | it would be anti-atoms.
02:20:44.700 | And when you put them together,
02:20:45.980 | there would be pure energy released in theory.
02:20:48.380 | And that could drive the most powerful possible engine
02:20:51.380 | for space travel.
02:20:52.940 | But the only place you can make antimatter
02:20:56.340 | is in large particle accelerators and only very briefly.
02:20:59.240 | So that is hard, but if that could work,
02:21:01.420 | that would be extraordinary.
02:21:02.880 | Fusion drives would be great,
02:21:03.860 | just getting nuclear fusion well-controlled,
02:21:06.380 | and that would actually give you pretty good propulsion.
02:21:08.140 | So I think that's the most likely thing we'll see,
02:21:09.980 | is fusion drives.
02:21:10.860 | Fusion technology is getting better and better every year.
02:21:13.220 | Or it's that old saying,
02:21:14.060 | fusion is always 10 years away,
02:21:15.500 | every year it's always 10 years away,
02:21:17.100 | but it's getting better, and I think--
02:21:18.940 | - That saying is something that is a century old,
02:21:22.420 | or less than a century old.
02:21:24.040 | Over multiple centuries, that saying might--
02:21:27.060 | - Yeah.
02:21:27.900 | - Might actually become,
02:21:30.300 | fusion might actually become a reality for propulsion.
02:21:32.700 | - So that would be, I think,
02:21:33.620 | very likely to see in the next few centuries.
02:21:35.580 | And then biology was the other part.
02:21:37.460 | Or anything else, physics?
02:21:38.300 | I mean, physics, you could imagine ways
02:21:40.420 | that have electromagnetic shielding.
02:21:42.460 | So it could be, you could deflect all the cosmic rays
02:21:44.780 | that are coming at your spacecraft with a large,
02:21:46.660 | almost like force fields, quite frankly.
02:21:48.620 | That would take some development to do,
02:21:50.460 | but that would be good to see.
02:21:52.580 | - And uploading human memories and consciousness
02:21:58.820 | into digital form.
02:22:00.140 | - Yeah, this kind of blends the machine and physics
02:22:02.380 | with the biology developments.
02:22:04.060 | I think, you know, there's a lot of great work
02:22:06.180 | being done on longevity.
02:22:07.660 | I have a, one of my companies itself works on longevity.
02:22:10.020 | It's called Onegevity.
02:22:10.860 | And so I'm working on it myself,
02:22:12.820 | on ways to improve how we monitor health and wellness now,
02:22:16.020 | and live longer, live better.
02:22:17.920 | Many people are doing this,
02:22:18.760 | this is what the whole purpose of medicine is,
02:22:20.340 | to a large degree.
02:22:21.700 | But I don't think we'll live,
02:22:23.460 | in the book I propose we might get out to,
02:22:25.500 | live to 150 years.
02:22:27.020 | I think that's reasonable.
02:22:28.460 | But say humans are gonna live to be two, three, four,
02:22:30.420 | 500 years, or some people,
02:22:32.020 | I meet people like this every week,
02:22:33.180 | who say, I think I'm not going to die.
02:22:35.220 | To which I always say, I hope you're right.
02:22:37.240 | But I think you should plan
02:22:38.140 | that you're not going to be right.
02:22:39.660 | But I want people, also as we mentioned earlier,
02:22:41.980 | being immortal would really fundamentally change
02:22:43.660 | the social contract and how you plan,
02:22:45.780 | and how you allocate resources.
02:22:46.980 | Not necessarily bad, but it would just be different.
02:22:49.740 | But I also just think we don't know yet of any way
02:22:52.060 | to undo the ravages to the human body
02:22:55.260 | that occur over time.
02:22:56.140 | We can repair some of it, replace some of it,
02:22:58.160 | but it's okay to assume that you're gonna die.
02:23:02.660 | And I don't just assume, know you're gonna die,
02:23:05.180 | 'cause then you have a bit of liberty
02:23:06.560 | about what you can do quickly and do next.
02:23:08.740 | But I think we will get better.
02:23:11.300 | I think we could see people live potentially to 150
02:23:14.060 | with some of the tools and methods and living longer.
02:23:16.660 | - But upload, living might become--
02:23:20.620 | - Living in a brain, like in the Kurzweil singularity,
02:23:23.780 | where we all have this rapture-like moment,
02:23:25.500 | and we go up and upload into the cloud
02:23:27.060 | and live forever.
02:23:28.600 | I don't know if it would still be the same
02:23:31.360 | as what we consider the view of self in this flesh form.
02:23:36.280 | If we could really get a complete representation
02:23:38.280 | of a person's entire personality up into digital form,
02:23:42.600 | I mean, that would be immortality, basically.
02:23:44.240 | - Or a loose representation.
02:23:46.080 | I'd go through the thought experiment of,
02:23:48.760 | I like thinking about clones.
02:23:51.280 | - Twins, twins are clones, basically.
02:23:55.800 | - I don't have twins.
02:23:57.640 | - The ability to generate,
02:24:00.800 | I mean, you're stuck with those clones.
02:24:03.300 | The twins is a fixed number of clones,
02:24:05.460 | so that's a genetic clone.
02:24:07.180 | I mean a philosophical clone
02:24:08.820 | where you can keep generating them.
02:24:10.140 | - Versions.
02:24:11.060 | - And then the reason I really like that construction,
02:24:16.060 | thinking about that, for me personally,
02:24:18.340 | is it nicely encapsulates how I feel about being human,
02:24:24.220 | because why do I matter?
02:24:25.980 | How would I, if I do another copy of me,
02:24:30.900 | how would I defend why I matter as a human being?
02:24:36.860 | And I don't think I can,
02:24:38.340 | 'cause that clone is just fine.
02:24:40.860 | It's not even a perfect, like a reasonable clone.
02:24:43.700 | Like most people I know that love me and who I love,
02:24:48.220 | they'll be just fine with the clone.
02:24:50.020 | (laughing)
02:24:51.700 | They'd be like, and they'll be surprised,
02:24:53.820 | like, oh, you're like, you move kind of weird,
02:24:57.860 | but overall, but otherwise, I'll take it.
02:25:00.620 | And if that's possible to do that kind of copying,
02:25:03.640 | and no, I don't want to say perfect clone,
02:25:06.420 | 'cause I think perfect clone
02:25:07.340 | is very difficult engineering-wise.
02:25:09.100 | I mean like a pretty crappy copy.
02:25:11.700 | - Would still be okay for most of them.
02:25:12.540 | - Just like wears suits a lot, has a weird way of talking.
02:25:17.540 | I mean, I think there's a lot of elements there,
02:25:20.100 | like in the digital space, especially with the metaverse.
02:25:23.780 | - Yeah.
02:25:24.700 | - You can clone, I think, in the next few decades,
02:25:28.740 | you'll be able to clone people's behavioral patterns
02:25:32.500 | pretty well, and visual, at least in the virtual reality,
02:25:37.500 | in the digital representation, if you are.
02:25:41.020 | And then you have to really contend with,
02:25:42.620 | like, why do I matter?
02:25:43.860 | Maybe what matters isn't the individual person,
02:25:46.700 | but what matters are the ideas that that person plays with.
02:25:49.940 | So it doesn't matter if there's 1,000 clones,
02:25:53.340 | what matters is that I'm currently thinking about X,
02:25:57.900 | so some kind of problem that I'm trying to solve,
02:25:59.500 | and those ideas, and I'm sharing those ideas,
02:26:01.540 | maybe ideas of the organisms,
02:26:03.180 | and not the meat vehicles of the organism.
02:26:05.820 | Maybe that's a cultural shift
02:26:07.900 | where we won't necessarily treat any one body
02:26:11.820 | as fundamentally unique or important,
02:26:16.780 | but the sort of, the ideas that those bodies play with.
02:26:21.100 | I mean, that sounds crazy.
02:26:22.700 | - No, it's abstract, but very relevant.
02:26:24.740 | Derek Parfit wrote this great book
02:26:26.020 | called "Reasons and Persons"
02:26:27.060 | about how you really define an individual
02:26:29.300 | as not just your own thoughts and your own self-reflection,
02:26:32.020 | but where almost, he argues,
02:26:34.020 | more defined by how other people see you.
02:26:36.180 | See, like, if you walked out into the world
02:26:38.280 | and, say, suddenly nobody knew who you were or recognized you
02:26:41.460 | you'd be, in some regards, deceased, right?
02:26:43.900 | If everyone just suddenly had massive amnesia
02:26:46.700 | and no one knew who you are,
02:26:48.300 | and never remembered, no memory
02:26:49.460 | of anything you'd ever done together,
02:26:51.420 | you'd be very alone.
02:26:52.740 | You'd be basically starting from scratch,
02:26:54.580 | like as if you'd just been born, basically.
02:26:56.420 | So, and he also writes thought experiments,
02:26:58.220 | like what if half of your neurons get replaced
02:26:59.980 | with half of someone else, or a quarter, or 60%?
02:27:02.340 | At what point do you stop being you
02:27:04.100 | and become that other person?
02:27:05.660 | And the argument he makes is it's more than just
02:27:07.220 | what percentage of your neurons are swapped out.
02:27:09.420 | It's also the relationships you have with so many people
02:27:11.420 | that partly define you.
02:27:12.620 | No, not completely, but they're a key component
02:27:14.940 | of how you view yourself,
02:27:15.860 | how they view what you are in the world.
02:27:17.940 | And he actually goes so far to say
02:27:21.100 | that they're probably more important
02:27:22.780 | than even what's in your head.
02:27:23.620 | Like if you swap out all of your thoughts,
02:27:26.740 | but when you walk out into the world,
02:27:27.780 | everyone still treats you and talks to you the same way
02:27:29.540 | as this memory of what you are,
02:27:31.220 | that is still like an entity that's defined you,
02:27:33.300 | even if all of your, you know,
02:27:34.940 | there's even movies like "Trading Places" about this
02:27:37.260 | with Eddie Murphy, or like the ideas of people
02:27:38.700 | who can swap bodies.
02:27:39.860 | The reason those are comedies
02:27:41.900 | is 'cause they're fish-out-of-water comedies,
02:27:43.820 | but they go to the point of what defines you
02:27:45.540 | is not just you, but also how you're viewed.
02:27:47.300 | - Well, you as an entity exist in the memories
02:27:49.300 | of other beings, and so that, yeah,
02:27:53.020 | the entities as they exist in their form,
02:27:58.100 | in those memories, perhaps are more important
02:28:01.060 | to who you are than what's in your head.
02:28:03.900 | And that clones then are, how do they do,
02:28:08.900 | they lessen?
02:28:10.340 | Not really, they just distribute,
02:28:12.780 | they just scale the you-ness
02:28:15.180 | that can be experienced by other humans.
02:28:17.300 | Like if I could be doing five podcasts right now
02:28:20.020 | at the same time, then in theory,
02:28:21.540 | but I'd have to have some way to transmit the memory
02:28:23.860 | of each one I did, which would be hard,
02:28:26.380 | but not impossible if it's all digital.
02:28:27.740 | You could aggregate and accrete more and more
02:28:29.780 | of the memories into one entity.
02:28:30.940 | - Oh, I see, but I thought at the moment of cloning,
02:28:34.260 | it's like cloning a Git repository,
02:28:36.380 | then you're no longer as branched.
02:28:38.980 | You share the version, view one of Chris,
02:28:42.660 | that a lot of people have experienced,
02:28:44.620 | like your high school friends, college friends,
02:28:47.100 | colleagues, and so on, but now you moved on
02:28:49.060 | to your music career, and one of your clones did.
02:28:52.380 | And then that's fundamentally new experiences
02:28:55.420 | that you still, your colleagues can still experience
02:28:59.220 | the memories of the old Chris,
02:29:00.900 | but the new one is totally, you're going to have
02:29:03.980 | new communities experiencing, connecting to those,
02:29:06.820 | and then you can just propagate.
02:29:08.700 | And the ones that don't get a lot of likes on social media,
02:29:12.940 | we can quietly dispose of.
02:29:15.980 | - We want to maximize the clones of Chris
02:29:18.180 | that can get a lot of likes on Facebook.
02:29:21.660 | Okay, just returning briefly to the topic of AI,
02:29:26.660 | are you working on AI stuff too?
02:29:33.740 | - A lot of machine learning tools for genomics.
02:29:35.980 | - For genomics, 'cause I was seeing this interspersed,
02:29:38.940 | 'cause you're such a biology,
02:29:40.820 | I mean, I suppose computational biology person,
02:29:45.340 | but what about the, are you working on Age of Prediction?
02:29:49.660 | - Yes, yeah, so you've heard about the book, I guess, yeah.
02:29:53.060 | - What--
02:29:53.900 | - That's actually written with the philanthropist
02:29:56.060 | I mentioned, who we named the space station fungus after,
02:29:58.180 | so that's coming out next year, actually, yeah.
02:30:00.500 | - What's the effort there?
02:30:01.500 | What's your interest in sort of the more narrow AI tools
02:30:06.500 | of prediction and machine learning, all that kind of stuff?
02:30:09.900 | - I think, called the Age of Prediction,
02:30:11.680 | so the next book that's coming,
02:30:13.100 | is all the ways where machine learning tools,
02:30:15.540 | predictive algorithms have fundamentally changed our lives.
02:30:18.300 | So some of them are obvious to me,
02:30:20.300 | where, for example, when we sequence cancer patient's DNA,
02:30:24.340 | and we have predictions of exactly which drug
02:30:26.220 | will work with it, that's actually a very simple algorithm.
02:30:28.800 | But other ones involve predicting, say, the age of blood
02:30:32.100 | that's left at the scene of a crime,
02:30:33.780 | which uses computational tools to look at each piece of DNA
02:30:37.140 | and what it might reveal for its epigenetic state,
02:30:39.700 | and then predicting, essentially,
02:30:41.220 | how old you are at any given moment.
02:30:43.340 | And it also gets to longevity,
02:30:45.060 | 'cause sometimes you can see if you're aging faster
02:30:47.660 | or slower than you should be.
02:30:48.600 | So some tools are in medicine or even forensics.
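As a sketch of what an epigenetic-clock-style predictor looks like in practice: a penalized linear regression over CpG methylation levels, in the spirit of published clocks. The data below is synthetic and every number is made up, purely to show the shape of the method:

```python
# Minimal sketch of an "epigenetic clock": predict age from methylation levels.
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_cpgs = 500, 2000                       # people x CpG sites (synthetic)
X = rng.uniform(0, 1, (n_samples, n_cpgs))          # methylation beta values in [0, 1]
true_weights = np.zeros(n_cpgs)
true_weights[:50] = rng.normal(0, 5, 50)            # only a few sites carry age signal
age = X @ true_weights + rng.normal(0, 3, n_samples) + 40

X_train, X_test, y_train, y_test = train_test_split(X, age, random_state=0)
clock = ElasticNetCV(cv=5).fit(X_train, y_train)    # sparse linear "clock"
predicted = clock.predict(X_test)
print("mean absolute error (years):", np.abs(predicted - y_test).mean().round(2))
# "Age acceleration" = predicted age minus chronological age, the quantity
# used to ask whether someone is aging faster or slower than expected.
```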
02:30:51.520 | But my favorite part, a lot of the book is,
02:30:53.100 | where does this show up in economics as well as in medicine?
02:30:56.140 | So predictive tools, I mean, I think the most notorious
02:30:58.980 | one people thought of was during the 2012 election,
02:31:02.140 | and 2016 election especially,
02:31:04.480 | we were seeing these really big differences
02:31:05.800 | of how Facebook was monitoring feeds.
02:31:08.020 | And so prediction is not just better medicine
02:31:10.740 | or in finance and economics.
02:31:12.060 | People think about stock traders,
02:31:13.380 | people doing predictive algorithms.
02:31:15.500 | But what you view in your feed,
02:31:17.900 | how you vote and what you saw,
02:31:20.220 | Facebook did experiments,
02:31:21.360 | they called it social contagion experiments
02:31:23.300 | to see can we restructure what people see
02:31:27.020 | and then how they respond,
02:31:27.900 | actually kind of be really predictive and manipulative,
02:31:31.300 | frankly, with what happens,
02:31:32.940 | and then can that change how they vote?
02:31:34.180 | And the answer seemed to be yes
02:31:35.340 | for a good amount of the populace in 2016 in the US.
02:31:38.140 | So I think we're seeing more and more
02:31:40.060 | these algorithms show up all over the place.
02:31:41.940 | And so the book is about where they're good,
02:31:44.680 | for example, in medicine, they're phenomenal.
02:31:46.200 | They have fundamentally changed
02:31:47.260 | how we treat cancer patients.
02:31:48.840 | But where they're risky,
02:31:49.680 | like if someone's trying to steal your vote
02:31:52.020 | or manipulate your thoughts potentially negatively.
02:31:55.220 | - So in medicine, you're hopeful about prediction.
02:31:58.420 | - Yeah, most of the AI in medicine,
02:32:00.180 | the machine learning tools for image recognition,
02:32:02.420 | for example, for pathology samples,
02:32:04.500 | where normally you think,
02:32:05.340 | oh, someone takes a big bit of tissue
02:32:07.260 | and then puts it onto a slide.
02:32:09.140 | Normally there's pathologists
02:32:10.660 | that have been training for years
02:32:11.980 | to look at a chunk of your tissue and say,
02:32:13.660 | okay, is this cancer?
02:32:14.940 | What kind of cancer?
02:32:15.800 | What treatment should I do?
02:32:17.220 | But there's an old joke about pathologists
02:32:18.740 | that you can give 10 slides to 10 different pathologists
02:32:21.460 | and get 11 different diagnoses,
02:32:23.300 | which is as awful as it sounds,
02:32:25.060 | because you're having someone squint
02:32:26.560 | at a stained microscope slide.
02:32:28.460 | But instead, if you use a lot of the AI tools
02:32:30.240 | where you can actually segment the image,
02:32:32.020 | high resolution characterization with multiple probes,
02:32:35.380 | it's what AI was built to do.
02:32:36.900 | You have a large training data set,
02:32:38.620 | and then you have test samples afterward.
02:32:39.900 | You can do far better
02:32:41.140 | than almost every pathologist on the planet
02:32:43.740 | and get a much more accurate diagnostic.
02:32:45.300 | So that's for breast cancer, for prostate cancer,
02:32:48.540 | for leukemia, we've seen the diagnostic tools
02:32:50.500 | explode with AI power.
02:32:52.180 | - Is it currently mostly empowering doctors
02:32:54.940 | or can it replace doctors?
02:32:57.260 | - Watson notoriously was made by IBM
02:32:59.500 | to try and replace doctors.
02:33:00.940 | - I love IBM so much.
02:33:02.380 | - I was in the room when we got a tour of Watson
02:33:05.300 | for the first time with the dean of our medical school.
02:33:07.940 | And these programmers came out and they said,
02:33:09.700 | "Listen, here's this example of a patient.
02:33:11.180 | "And watch Watson diagnose the patient
02:33:13.580 | "and recommend the right treatment."
02:33:15.380 | And then at one point in the conversation,
02:33:16.700 | remember this is a room of,
02:33:18.340 | I'm a PhD, so a geneticist,
02:33:19.820 | some programmers, some MDs,
02:33:22.180 | leaders of the medical school.
02:33:24.500 | The dean is there, and he says,
02:33:25.920 | "You could imagine someday this could replace doctors."
02:33:28.100 | In a room full of doctors, right?
02:33:29.260 | So it was a really poor choice of words,
02:33:30.740 | 'cause everyone's like, "No, you want to help the doctors."
02:33:33.820 | But I think the view from the programmers
02:33:35.700 | is often a bit naive,
02:33:36.940 | that they could fundamentally replace doctors.
02:33:39.160 | Now, in some cases they can.
02:33:40.300 | For the pathology description I just mentioned,
02:33:43.220 | I think the AI tools already do a better job.
02:33:45.460 | And we've only really been doing this for about five years.
02:33:47.380 | So you imagine another five years of optimization and data,
02:33:50.620 | they're gonna take over, right?
02:33:51.860 | And they should, because staring and squinting at screens
02:33:54.080 | for hours on end is not the best use of human ingenuity.
02:33:56.740 | So I think in some cases they'll take over.
02:33:58.820 | Other cases, they'll augment, they'll help.
02:34:00.820 | - Yeah, that human ingenuity,
02:34:02.980 | actually, especially for AI people,
02:34:04.860 | is sometimes difficult to characterize.
02:34:06.340 | I have this debate all the time about autonomous driving.
02:34:10.100 | It's a lot more difficult than people realize.
02:34:13.220 | - You're an expert on it,
02:34:14.060 | or you focus a lot on that for your research, right?
02:34:15.900 | - I'm an expert in nothing.
02:34:17.420 | (laughing)
02:34:19.460 | Except in not being an expert, I think.
02:34:22.260 | (laughing)
02:34:24.420 | Or asking stupid questions where the answer is both.
02:34:27.340 | Okay.
02:34:28.180 | (laughing)
02:34:31.060 | But there is some ingenuity that's hard
02:34:32.980 | to kind of encapsulate that is human.
02:34:35.820 | For a doctor, the decision-maker, it's the HAL 9000.
02:34:39.220 | You can have a perfect system that is able
02:34:42.500 | to know the optimal answer,
02:34:44.740 | but there's some human element that's missing.
02:34:47.260 | And sometimes the suboptimal answer
02:34:49.860 | in the long term is the right one.
02:34:53.500 | It's the self-doubt that is essential for human progress.
02:34:56.380 | That's weird.
02:34:57.220 | I'm not sure what that is.
02:34:59.820 | If I can, let me ask you to be the wise old sage
02:35:03.580 | and give advice to young people today.
02:35:06.100 | - Sure.
02:35:07.140 | - In high school, in college,
02:35:10.300 | about how to have a career they can be proud of,
02:35:14.700 | or maybe a life they can be proud of,
02:35:16.660 | on this planet or others.
02:35:20.140 | - Yeah, I think for the Padawans out there
02:35:22.940 | and younglings looking up at the stars,
02:35:26.980 | you have to know that this day that you're alive
02:35:30.420 | is quantifiably the best day that's ever happened,
02:35:33.740 | and that tomorrow will be even better than this day
02:35:36.180 | in terms of the capacity for discovery,
02:35:37.980 | the amount of data that exists.
02:35:39.980 | Again, it's not my opinion.
02:35:41.060 | That's just an empirical fact of the state
02:35:42.860 | of genetics research, knowledge,
02:35:44.660 | accretion of humanity's acumen for many disciplines.
02:35:48.580 | So with that ability to do so many things,
02:35:51.600 | it can be sometimes just terrifying.
02:35:53.940 | Well, what do I pick?
02:35:54.780 | If I could do everything,
02:35:56.140 | with the most possibility ever in human history,
02:35:58.220 | how do you pick one thing to do?
02:35:59.820 | And that's just the thing.
02:36:01.060 | What do you find yourself daydreaming about?
02:36:02.900 | What's the thing that keeps you up at night?
02:36:05.780 | And if you don't have anything
02:36:06.700 | that keeps you up at night sometimes,
02:36:08.440 | you go find something that keeps you up at night.
02:36:09.980 | 'Cause that is kind of this,
02:36:11.500 | sometimes I feel like I get woken up
02:36:13.780 | by someone on the inside of my skull
02:36:16.180 | who's knocking, trying to get out.
02:36:17.740 | It's kind of that almost haunting feeling of,
02:36:20.260 | I need to wake up, there's things that have to be done.
02:36:22.620 | There are questions I don't know the answer to.
02:36:24.980 | And there's a lot of times it's as simple as,
02:36:26.660 | how do we engineer cells to survive more radiation?
02:36:28.780 | But I read a paper and then it came back to me a week later
02:36:31.460 | as how we could use some of these tools,
02:36:33.380 | or these genes, or these methods.
02:36:35.180 | Really being pleasantly haunted by something
02:36:38.660 | is a wonderful place to be
02:36:40.100 | and find that thing that bothers you.
02:36:42.500 | Because there'll be good days and there'll be bad days,
02:36:44.660 | but you wanna be, even on the worst possible days,
02:36:46.900 | working on the thing that you love the most.
02:36:48.940 | And then all the usual normal phrases apply.
02:36:51.800 | Like then you never work a day in your life
02:36:53.580 | if you have a job you love, the usual phrases.
02:36:55.580 | But it's true and it's actually really hard to find.
02:36:59.020 | I think a lot of times you'll have to do work
02:37:01.520 | for random jobs that maybe you don't like
02:37:03.420 | for five or even 10 years.
02:37:04.980 | Or you might have to go to school for 10 to 15 to 20 years
02:37:07.960 | to finally get to the right spot
02:37:09.380 | where you have the knowledge, the experience,
02:37:11.900 | and even frankly just reputation,
02:37:13.660 | and people trust you, you've done enough good work.
02:37:15.700 | And only then can you really do the thing you love most.
02:37:18.780 | So you have to be a little bit patient
02:37:20.900 | and impatient at the same time.
02:37:21.940 | You have to do both.
02:37:23.020 | - And the interesting thing is,
02:37:24.820 | when you're trying to find that thing
02:37:27.460 | that excites you, you have to,
02:37:32.100 | especially in this modern world,
02:37:33.460 | I think silence the distractions.
02:37:37.680 | 'Cause once you find that thing,
02:37:39.580 | you hear that little voice in your head,
02:37:43.140 | there's still Instagram and TikTok and video games
02:37:47.620 | and other exciting sort of dopamine rushes
02:37:50.980 | that can like pull you away
02:37:52.660 | and make it seem like they're the same thing,
02:37:54.500 | but they're not really.
02:37:56.300 | There's some little flame there that's longer lasting.
02:38:01.300 | And I think you have to silence everything else
02:38:05.060 | to let that sort of flame become a fire.
02:38:09.020 | So it's interesting 'cause so much of the internet
02:38:14.020 | is designed to convert that natural predisposition
02:38:19.360 | that humans have to get excited about stuff,
02:38:22.020 | convert that into attention and money and ads and so on.
02:38:26.460 | But we have to be conscious of that.
02:38:28.260 | I think a lot of that is full of fun and is awesome.
02:38:31.100 | I think TikTok and Instagram is full of fun.
02:38:33.380 | - Amazing, yeah.
02:38:34.220 | And creativity leads to people making amazing videos.
02:38:37.740 | My daughter loves TikTok,
02:38:40.140 | and there are people who do makeup art on TikTok,
02:38:42.660 | things that are mind-blowing.
02:38:43.860 | You think, they made that video just to put it on TikTok,
02:38:46.140 | to practice their art and share it with the world,
02:38:47.860 | and it's fabulous.
02:38:48.700 | But then if my daughter watches TikTok
02:38:50.340 | for like three hours straight,
02:38:51.300 | I'm like, "What are you doing exactly?"
02:38:52.820 | And she's like, "Well, you know."
02:38:54.460 | So it's hard.
02:38:56.380 | But I remember when I was a kid, I remember I played Nintendo
02:38:58.820 | and sometimes I'd play for like 10 hours a day.
02:39:00.540 | Even in grad school, I'd sometimes play Counter-Strike
02:39:02.620 | or Half-Life, like 12 hours straight.
02:39:04.820 | And I'm like, "What was I?"
02:39:05.900 | So at one point I built a new computer,
02:39:07.860 | and I just didn't install some of the games I had.
02:39:09.860 | I was like, "I'm just gonna not install them
02:39:11.020 | "because otherwise I'll play them for too long."
02:39:12.740 | - Yeah, I would love to, (laughs)
02:39:16.460 | I'm getting props from the team.
02:39:18.980 | I would love to lay out all the things
02:39:23.020 | I've ever done in my life to myself
02:39:25.740 | 'cause I think I would be less judgmental of others
02:39:28.860 | and more understanding, more patient,
02:39:31.340 | because the amount of hours I spent playing Diablo
02:39:34.780 | and video games, it's insane.
02:39:37.460 | - I'm sure it adds up to weeks, maybe months of my life
02:39:40.660 | that it was just, you know.
02:39:42.300 | But I feel like I was probably, I tell myself at least,
02:39:44.260 | I was problem-solving.
02:39:45.860 | It's hand-eye coordination, or that's an old,
02:39:47.660 | I don't know if that really is even remotely true,
02:39:48.900 | but some of the games, like the Final Fantasy ones,
02:39:51.540 | are ones where you actually had to solve problems
02:39:53.020 | and think, and there was some degree of strategy.
02:39:56.180 | - They were actually just expanding the diversity
02:39:59.500 | of human character that makes up you.
02:40:01.500 | It's like, you can't just focus on,
02:40:04.140 | not that you can't, but perhaps it's more beneficial
02:40:07.300 | to focus, to not focus on a singular thing
02:40:11.300 | for many, many years at a time.
02:40:12.980 | That could be one of the downsides of a PhD
02:40:15.660 | if you're not careful, is that you become too singular
02:40:18.740 | and you focus not just on the problem,
02:40:20.940 | but on a particular community.
02:40:22.780 | And you don't do wild stuff.
02:40:24.300 | You don't do interdisciplinary stuff.
02:40:27.060 | You don't go out painting or getting drunk or dancing.
02:40:31.740 | Whatever the variety, whatever injects variety
02:40:34.740 | to the years of difficult reading, research paper
02:40:39.740 | after research paper, that whole process,
02:40:41.980 | you have to be very careful to add variety into it.
02:40:45.380 | And maybe that involves playing a little bit
02:40:46.940 | of Counter-Strike or Diablo, whatever floats your boat.
02:40:49.940 | - Or dancing, New York City's a great place for this.
02:40:53.100 | There's Sunrise Rooftop Dancing, a party that does this.
02:40:56.660 | - That's a thing?
02:40:57.500 | - It's a thing, so you go there.
02:40:59.140 | I have some people from my lab that go.
02:41:00.460 | I've only been once, but at Sunrise,
02:41:02.700 | and you see the sun rise over the city
02:41:04.460 | and there's huge house music, and you play
02:41:06.060 | and you dance like crazy, and then you go to work.
02:41:08.500 | You go to lab, you go to wherever you're going.
02:41:09.660 | But you can, it's good to squeeze in some weird,
02:41:12.140 | crazy sunrise rooftop dancing or things like that
02:41:15.140 | when you can.
02:41:16.840 | - If we can, if we may, to some difficult, dark places.
02:41:21.840 | - I'll bring a flashlight.
02:41:23.540 | (laughing)
02:41:24.500 | - Maybe something, find something that can warm your soul
02:41:28.420 | or inspires others.
02:41:30.040 | Are there dark periods, dark times in your life
02:41:33.820 | that you had to overcome?
02:41:35.420 | - Yeah, like many people, I had friends I've lost.
02:41:38.860 | I had a friend when I was younger who committed suicide.
02:41:41.620 | And that was actually, I remember being so struck by it,
02:41:45.260 | I couldn't understand it.
02:41:46.440 | I didn't understand mental illness at the time.
02:41:49.000 | I was very young, I was only, I think, 11 at the time.
02:41:51.820 | And I really was confused more than anything else
02:41:54.580 | about how someone could take their life.
02:41:56.280 | And I actually, once I got over the grief of it all,
02:41:59.980 | I really, it cemented in my head
02:42:02.540 | that I would never commit suicide.
02:42:05.580 | I could tell this to my wife, if it looks like
02:42:08.140 | I hung myself, go find my killer,
02:42:09.680 | 'cause I would never do it.
02:42:11.100 | It's gotta be staged.
02:42:12.640 | But at the same time, I've begun to appreciate
02:42:14.520 | there are times where the suffering is so great
02:42:16.740 | and diseases can be so awful that sometimes,
02:42:19.660 | euthanasia is an actual exit.
02:42:22.560 | But I just have friends I've lost along the way.
02:42:25.460 | But that's not too different.
02:42:28.760 | Everyone has people they've lost along the way.
02:42:30.340 | But I actually never had too dark of a childhood
02:42:35.340 | or been in too dark of a place.
02:42:36.420 | I mean, the hardest things have been
02:42:38.100 | really weird relationship breakups
02:42:39.980 | where I felt like love, falling in love,
02:42:42.900 | and then losing that person, just breaking up,
02:42:45.100 | not like they died, but where you felt like
02:42:48.660 | you just could barely move.
02:42:49.940 | And you literally felt like your heart was moved
02:42:52.300 | in your body to a different location.
02:42:54.660 | And that sort of scraping sense of existence.
02:42:57.660 | But also at the same time, that's been where I've,
02:43:00.780 | in some ways, been the most alive,
02:43:02.260 | where I lost what I thought at the time
02:43:04.500 | was the love of my life.
02:43:05.700 | But then was able to actually, I think,
02:43:09.720 | carve a deeper trench into my heart,
02:43:12.780 | which then could be filled with more joy.
02:43:14.420 | Pablo Neruda wrote about this,
02:43:17.580 | and Khalil Gibran, that the deepest, deepest sorrows,
02:43:20.820 | I think, have later translated in my life
02:43:22.700 | into places that can be filled
02:43:24.700 | with greater amounts of joy.
02:43:26.420 | - I love thinking of sorrows as a digging of a ditch
02:43:29.140 | that can then be filled with more good stuff.
02:43:32.580 | - Eventually.
02:43:33.940 | Not at the time, but for a while,
02:43:34.900 | it's just a giant empty cavern
02:43:36.100 | full of blood and tears and pain.
02:43:37.540 | But then, yeah, it comes later, I'd say.
02:43:41.180 | - There is an element to life where this too shall pass.
02:43:44.860 | So any moment of sorrow or joy, it's gonna be over.
02:43:49.860 | And treasure it, no matter what.
02:43:55.860 | I mean, I do definitely think about losing love.
02:43:58.880 | That's like a celebration of love.
02:44:00.580 | - And even any amount of living, I think, is better.
02:44:04.700 | That's why I just adamantly don't think
02:44:06.340 | I'd ever really commit suicide,
02:44:07.740 | is 'cause anything I take is better than nothing.
02:44:10.700 | Take the worst case scenario,
02:44:11.620 | where there's no heaven, there's no hell.
02:44:13.420 | That's just it.
02:44:14.260 | If you just die and that's really just it,
02:44:15.500 | then anything that you have in living
02:44:18.420 | is by definition infinitely better than the zero,
02:44:21.380 | 'cause at least it's something.
02:44:23.100 | And so I appreciate sad, I enjoy sadness,
02:44:25.820 | which sounds like an oxymoron,
02:44:26.860 | but I sometimes even long for a good sadness,
02:44:29.700 | like a rainy day and I'm staring out a window,
02:44:32.700 | squinting and drinking some underpriced whiskey,
02:44:36.620 | and then just moping.
02:44:39.580 | And like, "What are you doing?"
02:44:40.420 | I'm just moping today, but I want at least one day
02:44:42.620 | where I do that or something.
02:44:44.500 | - I actually had a conversation offline with Rick Rubin,
02:44:47.740 | he's a music producer, about this.
02:44:49.860 | And he told me, he has a way of speaking
02:44:54.300 | that's all sage-like, and he says,
02:44:57.740 | "Be careful that you spend some time appreciating
02:45:02.740 | "that sadness, but don't become addicted to it."
02:45:08.420 | - Yeah, yeah.
02:45:09.260 | - That there's a line you can cross,
02:45:12.460 | and then you actually push away the joy.
02:45:16.100 | - Because you feel like, because the sadness
02:45:17.820 | can be all-encompassing, and therefore even more real
02:45:20.620 | than what might seem like fleeting happiness.
02:45:22.620 | - Yes.
02:45:23.460 | - And so, yeah.
02:45:24.300 | (laughing)
02:45:25.140 | - Yeah, right, you can, sadness, if you let it,
02:45:28.700 | can be a thing that stays with you longer and stickier.
02:45:32.260 | But just witnessing suicide
02:45:35.820 | made you appreciate life more, yeah.
02:45:38.540 | And just an appreciation of death
02:45:40.700 | is actually an appreciation of life at the same time.
02:45:43.340 | - Are you afraid of your death?
02:45:46.860 | - No.
02:45:47.700 | - When you think about it?
02:45:48.540 | - I think it's like being afraid of the sunrise,
02:45:51.780 | it doesn't make sense.
02:45:54.700 | - So you're a part of this fabric that is humanity,
02:45:58.620 | and then you just think generationally.
02:46:02.140 | - Yeah, I think I wanna do as much as I can.
02:46:06.140 | I feel like I would die,
02:46:08.180 | I feel like I've lived a full life already.
02:46:09.620 | I actually believe that since age 17 onward.
02:46:11.740 | I feel like even then, I mean, then the bar was low.
02:46:14.220 | I feel like I had at least had sex once.
02:46:16.540 | I had good friends.
02:46:18.100 | - What else is there?
02:46:18.940 | - Good friends, right, at that age.
02:46:20.460 | But then I had also really read a lot of philosophy,
02:46:24.700 | had traveled a bit, felt like I had started
02:46:26.780 | to at least see the world, and had lived
02:46:28.020 | somewhat of a life. But from then on,
02:46:30.500 | I felt that I wouldn't feel like I was cheated
02:46:33.540 | if I had died from that day forward,
02:46:35.580 | that I had gotten at least enough of life
02:46:37.460 | to feel, not that I would be okay with dying,
02:46:41.180 | but that I knew I was gonna die,
02:46:43.660 | I wasn't afraid I was gonna die,
02:46:44.980 | and it actually was very liberating.
02:46:47.500 | And it's only gotten better since then.
02:46:48.780 | So I think some of that may or may not have been
02:46:51.780 | drug-related euphoria, but nonetheless, the joy stuck.
02:46:54.780 | And I think it's just gotten more true ever since,
02:46:58.780 | is that the default state is one of very rich appreciation
02:47:03.020 | because it's so fleeting.
02:47:05.820 | And so I knew I would die happy, I guess, even at age 17,
02:47:10.340 | but now my metrics have changed a little bit.
02:47:13.300 | I've had sex more than one time now, so that's really big.
02:47:15.580 | - Congratulations, this is very exciting news.
02:47:17.900 | - At least four times.
02:47:19.580 | But the-- - Multiples.
02:47:21.420 | - Right, right, and professionally I've accomplished things.
02:47:24.180 | Like some of the genetic dreams I had
02:47:26.720 | when I was 16 or 17, I'm now actually making them in my lab.
02:47:29.920 | I actually like to say my scientific goals and statements
02:47:33.120 | have really been the same since I've been 17.
02:47:35.920 | It's just now everyone takes me seriously
02:47:37.920 | because I'm a professor and actually I've done--
02:47:39.920 | - And you're mentoring people, you're an educator.
02:47:41.920 | - To the next generation, yeah.
02:47:42.760 | - Also patients.
02:47:44.800 | - Yeah, and helping patients live longer
02:47:46.320 | and seeing the hope in their eyes
02:47:48.140 | when they went from, even my own grandfather,
02:47:49.940 | went from a two-month prognosis of living
02:47:52.880 | with metastatic cancer to living for more than two years.
02:47:56.320 | He eventually succumbed to it, but knowing
02:47:58.080 | you can use the tools of predictive medicine to save people.
02:48:00.880 | And so now, looking out ahead, I feel like it's,
02:48:06.480 | I would die very happy if I saw boots on the red planet
02:48:09.000 | and people there.
02:48:10.160 | And the other advice to the younglings I'd say is,
02:48:12.920 | the first time I proposed the twin study to NASA,
02:48:14.960 | they said no, several times, said no,
02:48:17.240 | we don't have a plan for a mission like that,
02:48:18.960 | it's not gonna happen.
02:48:19.800 | So don't give up, just persevere, is the oldest--
02:48:24.040 | - But you were, I didn't know, I knew you were part
02:48:26.680 | of leading the NASA twin study,
02:48:28.560 | but you were also part of the failure to do so early.
02:48:31.560 | - Early, so the first, actually,
02:48:32.800 | 'cause when you start a lab in academia,
02:48:34.360 | they say, here's a pile of money,
02:48:36.120 | write grants and bring in more money
02:48:37.720 | and train people and start a lab.
02:48:39.520 | But, so I actually wrote NASA and said,
02:48:41.360 | I don't, I'm not requesting any funds.
02:48:43.160 | I have funds, they just gave me a bunch of money.
02:48:44.920 | I would like to, though, do a deep genetic profile
02:48:47.200 | of astronauts before and after space flight
02:48:49.200 | and do it ideally if we have some twins
02:48:51.020 | or do genetics and epigenetics and microbiome.
02:48:53.920 | But John Charles, who's the director
02:48:55.120 | of the Human Research Program, said,
02:48:56.640 | no, we don't even have those samples
02:48:58.880 | banked that you would want, the old samples,
02:49:01.240 | and we don't have any plans for missions like that right now,
02:49:03.360 | so we can't do it.
02:49:05.360 | And that was the first time I, you know,
02:49:06.640 | it's like saying to someone, listen,
02:49:08.560 | I'll buy a house for you.
02:49:09.960 | I just have this, I'll buy, and they're like,
02:49:11.200 | oh, no, no thanks.
02:49:12.360 | 'Cause it felt like I was offering
02:49:13.360 | a really unique research opportunity.
02:49:15.460 | But then that failure of saying,
02:49:17.720 | well, we're not ready yet, it's not time.
02:49:19.400 | Once they had the solicitation,
02:49:20.880 | he reached out and said, oh, actually,
02:49:22.360 | I think we've got something along the lines
02:49:23.960 | of what you were thinking a few years ago.
02:49:25.160 | So sometimes when some things get rejected,
02:49:27.760 | or someone says no, say, okay, maybe it's just too early,
02:49:29.640 | but don't give up, I think, and say,
02:49:31.480 | you know, so to me, when someone says no, not right now,
02:49:34.840 | I'll be like, okay, I'll just, I'll come back in a year.
02:49:37.880 | No just means no for now. And so,
02:49:39.800 | sometimes no means you have a crappy idea.
02:49:43.600 | That is true, I do have crappy ideas,
02:49:45.520 | and so does everybody, but if I really believe in it,
02:49:48.120 | I just say, okay, I'll be back.
02:49:52.840 | - Yeah, this too shall pass, the no.
02:49:56.060 | (laughing)
02:49:58.880 | Do you hope to go out to ISS, out to deep space one day?
02:50:03.880 | - I would love to go.
02:50:05.560 | I wanna be a little bit older so that if I die,
02:50:07.320 | it's not as traumatic for my daughter and family,
02:50:10.240 | but yeah, I feel like if I'm a little bit older,
02:50:13.840 | I definitely, I would even potentially do
02:50:15.500 | a one-way trip to Mars if it's later in life.
02:50:17.880 | - So would you like to, do you think you will
02:50:22.040 | step foot on Mars?
02:50:24.000 | - I would love to, and I think I might.
02:50:26.000 | I think, it may be that one-way trip,
02:50:29.920 | 'cause I think they'll need settlers
02:50:31.320 | who would wanna go and stay there,
02:50:34.120 | and build and be there for the long term,
02:50:35.760 | knowing it's high risk, knowing it's--
02:50:37.120 | - And your resume fits, so you'll have a lot
02:50:39.880 | of cool stuff to do there.
02:50:41.160 | - Yeah, I can help 'em. - At least on the surface,
02:50:42.880 | you'll be able to sell yourself well.
02:50:45.420 | Resilience, experience, motivation.
02:50:51.260 | - Would that make you sad to die on Mars?
02:50:54.260 | - No. - Looking back at the planet
02:50:55.980 | you were born on?
02:50:57.180 | - No, I think it would be, actually,
02:50:59.480 | in some ways, it may be the best way to die,
02:51:01.020 | knowing that you're in the first wave of people
02:51:03.660 | expanding the reach into the stars.
02:51:06.180 | It'd be an honor.
02:51:07.380 | - Why do you think we're here?
02:51:08.580 | What's the meaning of life?
02:51:10.500 | - To serve as the guardians of life itself.
02:51:12.780 | That is the duty for our species,
02:51:15.900 | is to recognize and really manifest
02:51:19.100 | this unique responsibility that we have,
02:51:21.260 | and only we have so far.
02:51:23.340 | So I think, yeah, to me, the meaning of life
02:51:26.980 | is for life, in its simplest form,
02:51:29.300 | to be able to survive, but to leverage
02:51:33.060 | the frailty of life into its ability to protect itself.
02:51:36.540 | And quite literally, the guardians of the galaxy
02:51:38.860 | is basically what we are.
02:51:40.020 | We're guarding ourselves and also life.
02:51:42.780 | I mean, life is just so precious.
02:51:44.280 | As far as we know, it is completely rare in the universe.
02:51:47.540 | And I do think a lot, well, what if this is
02:51:49.700 | the only universe that's ever come into being
02:51:51.060 | and it won't come back again?
02:51:52.700 | And like, this is it.
02:51:54.420 | And if that's true, we have to serve as its shepherds.
02:51:57.860 | - Leverage the frailty of life to protect it.
02:52:02.340 | And this is all life, so we get the opportunity,
02:52:04.840 | we humans get the opportunity to be smart enough,
02:52:07.780 | to be clever enough, to be motivated enough
02:52:09.780 | to actually protect the other life that's on this.
02:52:11.900 | - Including AI, including life that's to come.
02:52:13.660 | That might be very different from what we imagine today.
02:52:15.940 | And that would make you sad if we were replaced
02:52:17.940 | by the kinder, smarter AI?
02:52:21.220 | - Nope, I think about that in the book a bit.
02:52:23.060 | I think I would be okay with it if they carry some echo
02:52:27.300 | of that duty and they bring that with them.
02:52:30.060 | It would be sad if they're like,
02:52:31.900 | to hell with everyone, we're gonna destroy everything
02:52:33.580 | we come across and become like nanobots
02:52:36.060 | that make everything gray goo.
02:52:38.100 | That seems, but that would still be a version of life,
02:52:40.500 | just not one that, I think, is pretty,
02:52:43.940 | but technically it'd be alive.
02:52:45.780 | So philosophically, could I object?
02:52:48.940 | It's borderline.
02:52:50.340 | - Yeah, but romantically, no.
02:52:52.180 | - Romantic, there's--
02:52:53.140 | - They need to carry the duty.
02:52:54.500 | - There's some, yes, yes, there's a bit of a romance
02:52:56.940 | to the philosophy that's in there.
02:52:58.420 | - And you also end the book with a universe
02:53:02.620 | that creates new universes.
02:53:06.300 | So if this isn't the only universe,
02:53:11.660 | do you think that's in our future,
02:53:13.540 | that we might launch--
02:53:14.900 | - New universe?
02:53:15.740 | - New offspring universes?
02:53:17.220 | - It's very possible.
02:53:18.060 | I mean, multiverse is a controversial field
02:53:20.580 | 'cause it's very much hypothetical,
02:53:22.180 | but this universe has been created,
02:53:25.980 | the one we're in now, and so it's happened before,
02:53:28.260 | it certainly could happen again.
02:53:29.940 | Some of them might be happening in parallel.
02:53:31.540 | I think if you look at billions of years,
02:53:33.820 | trillions of years in the future
02:53:34.900 | of technological development, it's certainly possible
02:53:37.620 | we could start to have little baby universes,
02:53:40.420 | grow them like cabbage, get them out, saute them,
02:53:43.620 | make them have flavor.
02:53:45.620 | - Yeah, create something delicious.
02:53:47.300 | Well, it sounds difficult, but it's our human duty to try.
02:53:51.660 | As you said, Chris, this is an incredible conversation.
02:53:54.700 | You're an incredible person, a scientist, explorer.
02:53:57.900 | I can't wait to see what you do in this world.
02:54:00.140 | And I hope to be there with you on Mars.
02:54:04.620 | I would like to also breathe my last breath
02:54:08.540 | on that sexy red planet that's our neighbor.
02:54:12.700 | - Podcast from Mars, at least space.
02:54:14.380 | I think space should be coming.
02:54:15.420 | Space is pretty good, space is pretty good.
02:54:18.100 | But Mars next.
02:54:19.820 | Chris, thanks so much for talking to me.
02:54:21.180 | - Thanks for having me.
02:54:22.020 | It's really an honor and a pleasure to be here, thanks.
02:54:24.660 | - Thanks for listening to this conversation
02:54:26.100 | with Chris Mason.
02:54:27.340 | To support this podcast,
02:54:28.700 | please check out our sponsors in the description.
02:54:31.340 | And now, let me leave you with some words
02:54:33.500 | from Stanislaw Lem in Solaris.
02:54:35.620 | Man has gone out to explore other worlds
02:54:39.620 | and other civilizations without having explored
02:54:42.860 | his own labyrinth of dark passages and secret chambers
02:54:46.380 | and without finding what lies behind doorways
02:54:49.540 | that he himself has sealed.
02:54:51.860 | Thank you for listening and hope to see you next time.
02:54:55.860 | (upbeat music)