Rob Reid: The Existential Threat of Engineered Viruses and Lab Leaks | Lex Fridman Podcast #193


Chapters

0:00 Introduction
2:28 The most entertaining outcome is the most likely
8:47 Meme theory
12:07 Writing process
18:54 Engineered viruses as a threat to human civilization
26:40 Gain-of-function research on viruses
38:50 Did COVID leak from a lab?
46:10 Virus detection
53:59 Failure of institutions
61:43 Using AI to engineer viruses
66:02 Evil and competence
75:21 Where are the aliens?
79:14 Backing up human consciousness by colonizing space
88:43 Superintelligence and consciousness
100:07 Meditation
108:15 Fasting
114:15 Greatest song of all time
119:41 Early days of music streaming
131:34 Startup advice
144:45 Podcasting
160:07 Advice for young people
169:10 Mortality
174:36 Meaning of life

00:00:00.000 | The following is a conversation with Rob Reid,
00:00:02.640 | entrepreneur, author, and host of the After On podcast.
00:00:07.280 | Sam Harris recommended that I absolutely must talk to Rob
00:00:11.120 | about his recent work on the future of engineered pandemics.
00:00:15.080 | I then listened to the four hour special episode
00:00:18.360 | of Sam's Making Sense podcast with Rob,
00:00:21.000 | titled Engineering the Apocalypse, and I was floored.
00:00:25.000 | I knew I had to talk to him.
00:00:27.080 | Quick mention of our sponsors,
00:00:28.840 | Athletic Greens, Belcampo, Fundrise, and NetSuite.
00:00:33.520 | Check them out in the description to support this podcast.
00:00:36.840 | As a side note, let me say a few words
00:00:38.880 | about the lab leak hypothesis,
00:00:40.560 | which proposes that COVID-19 is a product
00:00:43.800 | of gain-of-function research on coronaviruses
00:00:46.760 | conducted at the Wuhan Institute of Virology
00:00:49.480 | that was then accidentally leaked due to human error.
00:00:53.000 | For context, this lab is biosafety level four, BSL-4,
00:00:56.560 | and it investigates coronaviruses.
00:00:59.400 | BSL-4 is the highest level of safety,
00:01:01.800 | but if you look at all the human-in-the-loop pieces
00:01:04.080 | required to achieve this level of safety,
00:01:06.120 | it becomes clear that even BSL-4 labs
00:01:08.960 | are highly susceptible to human error.
00:01:11.440 | To me, whether the virus leaked from the lab or not,
00:01:14.160 | getting to the bottom of what happened
00:01:15.760 | is about much more than this particular catastrophic case.
00:01:19.160 | It is a test for our scientific, political,
00:01:22.440 | journalistic, and social institutions
00:01:24.800 | of how well we can prepare and respond to threats
00:01:28.120 | that can cripple or destroy human civilization.
00:01:31.360 | If we continue gain-of-function research on viruses,
00:01:33.960 | eventually these viruses will leak,
00:01:36.840 | and they will be more deadly and more contagious.
00:01:40.160 | We can pretend that won't happen,
00:01:42.320 | or we can openly and honestly talk about the risks involved.
00:01:45.760 | This research can both save and destroy human life on Earth
00:01:49.080 | as we know it.
00:01:50.200 | It's a powerful double-edged sword.
00:01:52.560 | If YouTube and other platforms
00:01:54.600 | censor conversations about this,
00:01:56.640 | if scientists self-censor conversations about this,
00:02:00.080 | we'll become merely victims of our brief Homo sapiens story,
00:02:04.080 | not its heroes.
00:02:05.880 | As I said before,
00:02:07.160 | too carelessly labeling ideas as misinformation
00:02:10.440 | and dismissing them because of that
00:02:12.760 | will eventually destroy our ability to discover the truth.
00:02:16.440 | And without truth, we don't have a fighting chance
00:02:19.720 | against the great filter before us.
00:02:22.720 | This is the Lex Fridman Podcast,
00:02:24.760 | and here is my conversation with Rob Reid.
00:02:28.080 | I have seen evidence on the internet
00:02:30.880 | that you have a sense of humor, allegedly,
00:02:33.840 | but you also talk and think about
00:02:35.840 | the destruction of human civilization.
00:02:38.160 | What do you think of the Elon Musk hypothesis
00:02:40.840 | that the most entertaining outcome is the most likely?
00:02:44.440 | And he, I think, followed on to say
00:02:46.520 | as seen from an external observer,
00:02:48.760 | like if somebody was watching us,
00:02:50.760 | it seems we come up with creative ways
00:02:54.200 | of progressing our civilization that are fun to watch.
00:02:57.560 | - Yeah, so exactly.
00:02:59.360 | He said, from the standpoint of the observer,
00:03:01.480 | not the participant, I think.
00:03:03.240 | And so what's interesting about that,
00:03:05.640 | those were, I think, just a couple of freestanding tweets
00:03:08.080 | and delivered without a whole lot of wrapper of context,
00:03:11.640 | so it's left to the mind of the reader of the tweets
00:03:15.080 | to infer what he was talking about.
00:03:16.760 | So that's kind of like,
00:03:18.800 | it provokes some interesting thoughts.
00:03:20.320 | Like, first of all,
00:03:21.160 | it presupposes the existence of an observer,
00:03:24.240 | and it also presupposes that the observer
00:03:27.280 | wishes to be entertained
00:03:29.160 | and has some mechanism of enforcing
00:03:31.600 | their desire to be entertained.
00:03:33.000 | So there's a lot underpinning that.
00:03:35.080 | And to me, that suggests, particularly coming from Elon,
00:03:38.760 | that it's a reference to simulation theory,
00:03:41.220 | that somebody is out there and has far greater insights
00:03:45.120 | and a far greater ability to, let's say,
00:03:47.680 | peer into a single individual life
00:03:50.200 | and find that entertaining
00:03:51.600 | and full of plot twists and surprises
00:03:53.780 | and either a happy or tragic ending,
00:03:56.480 | or they have an incredible meta-view
00:03:59.600 | and they can watch the arc of civilization
00:04:01.960 | unfolding in a way that is entertaining
00:04:04.240 | and full of plot twists and surprises
00:04:06.200 | and a happy or unhappy ending.
00:04:08.020 | So, okay, so we're presupposing an observer.
00:04:11.760 | Then on top of that, when you think about it,
00:04:14.080 | you're also presupposing a producer
00:04:17.200 | because the act of observation is mostly fun
00:04:21.560 | if there are plot twists and surprises
00:04:23.420 | and other developments that you weren't foreseeing.
00:04:25.840 | I have re-read my own novels, and that's fun
00:04:29.400 | because it's something I worked hard on
00:04:31.120 | and I slaved over and I love,
00:04:33.140 | but there aren't a lot of surprises in there.
00:04:34.920 | So now I'm thinking we need a producer and an observer
00:04:39.080 | for that to be true.
00:04:39.960 | And on top of that,
00:04:41.380 | it's got to be a very competent producer
00:04:43.720 | because Elon said the most entertaining outcome
00:04:46.720 | is the most likely one.
00:04:48.360 | So there's lots of layers for thinking about that.
00:04:51.240 | And when you've got a producer
00:04:53.120 | who's trying to make it entertaining,
00:04:54.800 | it makes me think of there was a South Park episode
00:04:57.240 | in which Earth turned out to be a reality show.
00:05:00.400 | And somehow we had failed to entertain the audience
00:05:03.880 | as much as we used to,
00:05:04.880 | so the Earth show was going to get canceled, et cetera.
00:05:08.040 | So taking all that together,
00:05:10.520 | and I'm obviously being a little bit playful
00:05:12.320 | in laying this out,
00:05:13.840 | what is the evidence that we have
00:05:16.520 | that we are in a reality
00:05:19.360 | that is intended to be most entertaining?
00:05:21.640 | Now you could look at that reality
00:05:23.520 | on the level of individual lives
00:05:25.400 | or the whole arc of civilization,
00:05:27.280 | other levels as well, I'm sure.
00:05:29.960 | But just looking from my own life,
00:05:32.080 | I think I'd make a pretty lousy show.
00:05:34.400 | I spend an inordinate amount of time
00:05:37.280 | just looking at a computer.
00:05:38.920 | I don't think that's very entertaining.
00:05:40.480 | And there's just a completely inadequate level
00:05:44.200 | of shootouts and car chases in my life.
00:05:46.440 | I mean, I'll go weeks, even months
00:05:48.120 | without a single shootout or car chase.
00:05:49.960 | - That just means that you're one
00:05:51.160 | of the non-player characters in this game.
00:05:52.920 | You're just waiting.
00:05:54.160 | - I'm an extra.
00:05:55.000 | - You're an extra that's waiting for your one opportunity
00:05:57.840 | for a brief moment to actually interact
00:05:59.760 | with one of the main characters in the play.
00:06:02.600 | - Very interesting.
00:06:03.440 | Okay, that's good.
00:06:04.480 | So okay, so we'll rule out me being the star of the show,
00:06:07.240 | which I probably could have guessed at anyway.
00:06:09.320 | But then even the arc of civilization.
00:06:11.360 | I mean, there have been a lot of really intriguing things
00:06:13.680 | that have happened
00:06:14.520 | and a lot of astounding things that have happened.
00:06:16.440 | But I would have some werewolves, I'd have some zombies.
00:06:21.440 | I would have some really improbable developments
00:06:24.160 | like maybe Canada absorbing the United States.
00:06:28.600 | So I don't know.
00:06:29.800 | I'm not sure if we're necessarily designed
00:06:31.800 | for maximum entertainment.
00:06:33.160 | But if we are, that will mean that 2020 is just a prequel
00:06:38.160 | for even more bizarre years ahead.
00:06:41.200 | So I kind of hope that we're not designed
00:06:43.960 | for maximum entertainment.
00:06:45.520 | - Well, the night is still young in terms of Canada,
00:06:47.700 | but do you think it's possible for the observer
00:06:49.760 | and the producer to be kind of emergent?
00:06:52.360 | So meaning it does seem when you kind of watch memes
00:06:56.880 | on the internet, the funny ones,
00:06:59.320 | the entertaining ones spread more efficiently.
00:07:01.920 | - They do.
00:07:02.760 | - I mean, I don't know what it is about the human mind
00:07:05.320 | that soaks up, en masse, funny things.
00:07:11.080 | Much more sort of aggressively, it's more viral
00:07:14.040 | in the full sense of that word.
00:07:16.640 | Is there some sense that whatever the evolutionary process
00:07:20.800 | that created our cognitive capabilities
00:07:23.560 | is the same process that's going to, in an emergent way,
00:07:27.520 | create the most entertaining outcome,
00:07:29.400 | the most meme-ifiable outcome, the most viral outcome
00:07:34.400 | if we were to share it on Twitter?
00:07:36.760 | - Yeah, that's interesting.
00:07:38.960 | Yeah, we do have an incredible ability.
00:07:41.600 | Like, I mean, how many memes are created in a given day?
00:07:43.960 | And the ones that go viral are almost uniformly funny,
00:07:46.400 | at least to somebody with a particular sense of humor.
00:07:48.760 | - Right.
00:07:50.400 | - Yeah, I'd have to think about that.
00:07:52.260 | We are definitely great at creating atomized units of funny.
00:07:58.460 | Like in the example that you used,
00:08:01.720 | there are going to be X million brains parsing
00:08:04.640 | and judging whether this meme is retweetable or not.
00:08:07.440 | And so that sort of atomic element of funniness,
00:08:12.080 | of entertainingness, et cetera,
00:08:14.120 | we definitely have an environment
00:08:15.920 | that's good at selecting for that,
00:08:18.560 | and selective pressure, and everything else that's going on.
00:08:21.640 | But in terms of the entire ecosystem of conscious systems
00:08:26.640 | here on the Earth driving for a level of entertainment,
00:08:32.800 | that is on such a much higher level
00:08:35.960 | that I don't know if that would necessarily
00:08:38.760 | follow directly from the fact that
00:08:41.600 | atomic units of entertainment
00:08:43.560 | are very, very aptly selected for us.
00:08:45.960 | I don't know.
00:08:47.300 | - Do you find it compelling or useful
00:08:49.600 | to think about human civilization
00:08:52.840 | from the perspective of the ideas
00:08:55.320 | versus the perspective of the individual human brains?
00:08:59.680 | So almost thinking about the ideas or the memes,
00:09:02.440 | this is the Dawkins thing, as the organisms.
00:09:05.720 | And then the humans as just like vehicles
00:09:09.640 | for briefly carrying those organisms
00:09:12.200 | as they jump around and spread.
00:09:14.040 | - Yeah, for propagating them, mutating them,
00:09:16.480 | putting selective pressure on them, et cetera.
00:09:19.400 | I mean, I found Dawkins',
00:09:22.600 | or his launching of the idea of memes
00:09:25.640 | is just kind of an afterthought
00:09:27.480 | to his unbelievably brilliant book about The Selfish Gene.
00:09:30.840 | Like, what a PS to put at the end
00:09:34.080 | of a long chunk of writing.
00:09:35.680 | It's profoundly interesting.
00:09:37.440 | I view the relationship though between humans and memes
00:09:41.640 | as probably an oversimplification,
00:09:43.520 | but maybe a little bit like the relationship
00:09:45.400 | between flowers and bees, right?
00:09:47.640 | Do flowers have bees or do bees in a sense have flowers?
00:09:51.800 | And the answer is, it is a very, very symbiotic relationship
00:09:56.200 | in which both have semi-independent roles that they play
00:10:00.080 | and both are highly dependent upon the other.
00:10:03.160 | And so in the case of bees, obviously,
00:10:05.440 | you could see the flower
00:10:06.720 | as being this monolithic structure physically
00:10:09.480 | in relation to any given bee,
00:10:11.520 | and it's the source of food and sustenance.
00:10:14.280 | So you could kind of say, well, flowers have bees.
00:10:16.980 | But on the other hand, the flowers would obviously be doomed
00:10:20.720 | if they weren't being pollinated by the bees.
00:10:22.840 | So you could kind of say, well,
00:10:25.320 | flowers are really an expression of what the bees need.
00:10:28.040 | And the truth is a symbiosis.
00:10:30.480 | So with memes in human minds,
00:10:33.400 | our brains are clearly the Petri dishes
00:10:37.960 | in which memes are either propagated or not propagated,
00:10:41.520 | get mutated or don't get mutated.
00:10:44.160 | They are the venue in which competition,
00:10:47.640 | selective competition, plays out between different memes.
00:10:51.320 | So all of that is very true.
00:10:53.000 | And you could look at that and say,
00:10:55.000 | really the human mind is a production of memes
00:10:58.720 | and ideas have us rather than us having ideas.
00:11:01.760 | But at the same time, let's take a catchy tune
00:11:04.880 | as an example of a meme.
00:11:07.000 | That catchy tune did originate in a human mind.
00:11:10.920 | Somebody had to structure that thing.
00:11:12.880 | And as much as I like Elizabeth Gilbert's TED Talk
00:11:15.760 | about how the universe, I'm simplifying,
00:11:19.000 | but kind of the ideas find their way
00:11:21.520 | in this beautiful TED Talk, it's very lyrical.
00:11:23.880 | She talked about ideas and prose
00:11:27.880 | kind of beaming into our minds.
00:11:30.760 | She talked about needing to pull over to the side of the road
00:11:33.040 | when she got inspiration for a particular paragraph
00:11:36.360 | or a particular idea and a burning need to write that down.
00:11:40.160 | I love that.
00:11:41.000 | I find that beautiful.
00:11:42.360 | As a writer, as a novelist myself,
00:11:45.200 | I've never had that experience.
00:11:47.160 | And I think that really most things that do become memes
00:11:52.000 | are the product of a great deal of deliberate
00:11:55.440 | and willful exertion of a conscious mind.
00:11:59.480 | And so like the bees and the flowers,
00:12:01.520 | I think there's a great symbiosis.
00:12:03.560 | And they both kind of have one another.
00:12:05.480 | Ideas have us, but we have ideas for real.
00:12:07.920 | - If we could take a little bit of a tangent,
00:12:10.200 | Stephen King on writing, you as a great writer,
00:12:14.280 | you're dropping a hint here that the ideas don't come to you.
00:12:18.000 | It's a grind of sort of,
00:12:19.960 | it's almost like you're mining for gold.
00:12:22.640 | It's more of a very deliberate, rigorous daily process.
00:12:28.040 | So maybe, can you talk about the writing process?
00:12:32.520 | How do you write well?
00:12:36.120 | And maybe if you want to step outside of yourself,
00:12:38.760 | almost like give advice to an aspiring writer,
00:12:42.480 | what does it take to write the best work of your life?
00:12:46.200 | - Well, it would be very different
00:12:47.600 | if it's fiction versus nonfiction.
00:12:49.560 | And I've done both.
00:12:51.000 | I've written two nonfiction books
00:12:52.360 | and two works of fiction.
00:12:55.400 | Two works of fiction being more recent,
00:12:57.080 | I'm gonna focus on that right now
00:12:58.520 | 'cause that's more toweringly on my mind.
00:13:01.440 | They're amongst novelists.
00:13:03.680 | Again, this is an oversimplification,
00:13:05.420 | but there's kind of two schools of thought.
00:13:08.440 | Some people really like to fly by the seat of their pants,
00:13:11.400 | and some people really, really like to outline, to plot.
00:13:15.640 | So there's plotters and pantsers, I guess,
00:13:17.720 | is one way that people look at it.
00:13:19.760 | And as with most things,
00:13:22.300 | there is a great continuum in between,
00:13:24.160 | and I'm somewhere on that continuum,
00:13:25.660 | but I lean, I guess, a little bit more toward the plotter.
00:13:29.800 | And so when I do start a novel,
00:13:32.440 | I have a pretty strong point of view
00:13:34.400 | about how it's gonna end,
00:13:36.080 | and I have a very strong point of view
00:13:37.480 | about how it's gonna begin.
00:13:39.240 | And I do try to make an effort of making an outline
00:13:42.500 | that I know I'm gonna be extremely unfaithful to
00:13:45.120 | in the actual execution of the story,
00:13:48.280 | but I'm trying to make an outline
00:13:49.580 | that gets us from here to there,
00:13:50.900 | and a notion of subplots and beats and rhythm
00:13:53.440 | and different characters and so forth.
00:13:56.140 | But then when I get into the process,
00:13:58.760 | that outline, particularly the center of it,
00:14:01.400 | ultimately, inevitably morphs a great deal.
00:14:03.720 | And I think if I were personally a rigorous outliner,
00:14:07.320 | I would not allow that to happen.
00:14:08.980 | I also would make a much more vigorous skeleton
00:14:13.080 | before I start.
00:14:14.280 | So I think people who are really
00:14:16.400 | in that plotting, outlining mode
00:14:19.160 | are people who write page turners,
00:14:22.140 | people who write spy novels or supernatural adventures,
00:14:27.140 | where you really want a relentless pace
00:14:31.660 | of events, action, plot twists, conspiracy, et cetera.
00:14:36.660 | And that is really the bone,
00:14:39.880 | that's really the skeletal structure.
00:14:42.780 | So I think folks who write that kind of book
00:14:45.080 | are really very much on the outlining side.
00:14:47.600 | And I think people who write what's often referred to
00:14:50.800 | as literary fiction, for lack of a better term,
00:14:53.480 | where it's more about sort of aura and ambiance
00:14:58.260 | and character development and experience
00:15:00.680 | and inner experience and inner journey and so forth,
00:15:04.000 | I think that group is more likely
00:15:06.300 | to fly by the seat of their pants.
00:15:07.600 | And I know people who start with a blank page
00:15:09.720 | and just see where it's gonna go.
00:15:11.360 | I'm a little bit more on the plotting side.
00:15:13.520 | Now you asked what makes something,
00:15:18.200 | at least in the mind of the writer,
00:15:20.080 | as great as it can be.
00:15:21.720 | For me, it's an astonishingly high percentage of it
00:15:25.000 | is editing as opposed to the initial writing.
00:15:27.600 | For every hour that I spend writing new prose,
00:15:32.600 | like new pages, new paragraphs,
00:15:34.520 | stuff that, new bits of the book,
00:15:37.640 | I probably spend, I mean, I wish I kept a count.
00:15:42.640 | I wish I had one of those pieces of software
00:15:45.080 | that lawyers use to track how much time
00:15:47.120 | I'm spending doing this or that.
00:15:47.960 | But I would say it's at least four or five hours
00:15:51.560 | and maybe as many as 10 that I spend editing.
00:15:54.200 | And so it's relentless for me.
00:15:56.520 | - For each one hour of writing, you said?
00:15:58.440 | - I'd say that.
00:15:59.400 | - Wow.
00:16:00.240 | - I mean, I write because I edit
00:16:01.860 | and I spend just relentlessly polishing and pruning
00:16:06.800 | and sometimes on the micro level of just like,
00:16:09.840 | does the rhythm of the sentence feel right?
00:16:12.360 | Do I need to carve a syllable or something so it can land?
00:16:15.640 | Like as micro as that to as macro as like,
00:16:18.920 | okay, I'm done but the book is 750 pages long
00:16:21.920 | and it's way too bloated and I need to lop
00:16:23.480 | a third out of it.
00:16:24.720 | Problems on those two orders of magnitude
00:16:27.400 | and everything in between,
00:16:29.600 | that is an enormous amount of my time.
00:16:31.840 | And I also write music, write and record and produce music.
00:16:36.840 | And there the ratio is even higher.
00:16:40.120 | Every minute that I spend or my band spends
00:16:44.560 | laying down that original audio,
00:16:47.000 | it's a very high proportion of hours
00:16:49.120 | that go into just making it all hang together
00:16:51.720 | and sound just right.
00:16:52.760 | So I think that's true of a lot of creative processes.
00:16:56.240 | I know it's true of sculpture.
00:16:58.800 | I believe it's true of woodwork.
00:16:59.880 | My dad was an amateur woodworker
00:17:01.400 | and he spent a huge amount of time
00:17:03.440 | on sanding and polishing at the end.
00:17:05.400 | So I think a great deal of the sparkle
00:17:07.340 | comes from that part of the process, any creative process.
00:17:10.040 | - Can I ask about the psychological,
00:17:12.000 | the demon side of that picture?
00:17:14.360 | In the editing process, you're ultimately judging
00:17:16.920 | the initial piece of work and you're judging
00:17:18.840 | and judging and judging.
00:17:20.480 | How much of your time do you spend hating your work?
00:17:24.920 | How much time do you spend in gratitude,
00:17:29.920 | impressed, thankful, or how good the work
00:17:33.320 | that you will put together is?
00:17:36.040 | - I spend almost all the time in a place
00:17:39.080 | that's intermediate between those
00:17:41.520 | but leaning toward gratitude.
00:17:43.360 | I spend almost all the time in a state of optimism
00:17:47.040 | that this thing that I have, I like, I like quite a bit
00:17:51.320 | and I can make it better and better and better
00:17:55.320 | with every time I go through it.
00:17:56.840 | So I spend most of my time in a state of optimism.
00:18:00.420 | - I think I personally oscillate much more aggressively
00:18:04.960 | between those two where I wouldn't be able
00:18:06.760 | to find the average.
00:18:08.080 | I go pretty deep.
00:18:11.080 | Marvin Minsky from MIT had this advice, I guess,
00:18:16.080 | as to what it takes to be successful in science and research,
00:18:21.120 | which is to hate everything you do,
00:18:23.160 | everything you've ever done in the past.
00:18:24.600 | I mean, at least he was speaking about himself
00:18:27.820 | that the key to his success was to hate everything
00:18:31.560 | he's ever done.
00:18:32.520 | I have a little Marvin Minsky there in me too
00:18:36.480 | to sort of always be exceptionally self-critical
00:18:39.840 | but almost like self-critical about the work
00:18:42.440 | but grateful for the chance to be able to do the work.
00:18:46.760 | - Yeah. - If that makes sense.
00:18:47.720 | - Makes perfect sense.
00:18:48.560 | - But that, you know, each one of us has to strike
00:18:51.840 | a certain kind of balance.
00:18:54.200 | - Yeah.
00:18:55.800 | - But back to the destruction of human civilization.
00:18:59.040 | If humans destroy ourselves in the next 100 years,
00:19:04.720 | what will be the most likely source,
00:19:08.320 | the most likely reason that we destroy ourselves?
00:19:11.400 | - Well, let's see, 100 years, it's hard for me
00:19:15.800 | to comfortably predict out that far
00:19:18.120 | and it's something to give a lot more thought to,
00:19:20.320 | I think, than normal folks
00:19:23.400 | simply because I am a science fiction writer.
00:19:25.400 | And I feel with the acceleration of technological progress,
00:19:30.400 | it's really hard to foresee out more
00:19:32.960 | than just a few decades.
00:19:34.040 | I mean, comparing today's world to that of 1921,
00:19:37.660 | where we are right now a century later,
00:19:39.560 | it would have been so unforeseeable.
00:19:41.560 | And I just don't know what's gonna happen,
00:19:44.200 | particularly with exponential technologies.
00:19:46.640 | I mean, our intuitions reliably defeat us
00:19:49.960 | with exponential technologies
00:19:51.280 | like computing and synthetic biology.
00:19:53.080 | And, you know, how we might destroy ourselves
00:19:56.040 | in the 100-year timeframe might have everything to do
00:20:00.040 | with breakthroughs in nanotechnology 40 years from now
00:20:03.060 | and then how rapidly those breakthroughs accelerate.
00:20:05.520 | But in the nearer term that I'm comfortable predicting,
00:20:07.800 | let's say 30 years, I would say the most likely route
00:20:12.400 | to self-destruction would be synthetic biology.
00:20:16.280 | And I always say that with the gigantic caveat
00:20:19.600 | and very important one that I find,
00:20:21.920 | and I'll abbreviate synthetic biology to SYNBIO
00:20:24.280 | just to save us some syllables.
00:20:26.060 | I believe SYNBIO offers us simply stunning promise
00:20:30.400 | that we would be fools to deny ourselves.
00:20:34.200 | So I'm not an anti-SYNBIO person by any stretch.
00:20:37.200 | I mean, SYNBIO has unbelievable odds
00:20:39.920 | of helping us beat cancer,
00:20:41.520 | helping us rescue the environment,
00:20:43.360 | helping us do things
00:20:44.520 | that we would currently find imponderable.
00:20:46.200 | So it's electrifying the field.
00:20:48.360 | But in the wrong hands,
00:20:50.040 | those hands either being incompetent or being malevolent,
00:20:54.800 | in the wrong hands, synthetic biology to me
00:20:57.840 | has much, much greater odds
00:21:01.800 | of leading to our self-destruction
00:21:05.080 | than something running amok with super AI,
00:21:07.200 | which I believe is a real possibility
00:21:08.960 | and one we need to be concerned about.
00:21:10.320 | But in the 30-year timeframe, I think it's a lesser one,
00:21:13.440 | or nuclear weapons or anything else that I can think of.
00:21:16.760 | - Can you explain that a little bit further?
00:21:18.440 | So your concern is on the man-made
00:21:21.980 | versus the natural side of the pandemic frontier.
00:21:26.120 | So we humans engineering pathogens, engineering viruses
00:21:31.560 | is the concern here.
00:21:32.760 | - Yeah.
00:21:33.840 | - And maybe how do you see the possible trajectories
00:21:37.640 | happening here in terms of, is it malevolent
00:21:41.520 | or is it accidents, oops, little mistakes
00:21:46.520 | or unintended consequences of particular actions
00:21:51.560 | that ultimately lead to unexpected mistakes?
00:21:54.020 | - Well, both of them are a danger.
00:21:55.700 | And I think the question of which is more likely
00:21:59.060 | has to do with two things.
00:22:00.800 | One, do we take a lot of methodical, affordable,
00:22:05.600 | four-sided steps that we are absolutely capable
00:22:08.880 | of taking right now to first stall the risk
00:22:12.120 | of a bad actor infecting us with something
00:22:14.680 | that could have annihilating impacts.
00:22:17.060 | And in the episode you referenced with Sam,
00:22:20.000 | we talked a great deal about that.
00:22:22.480 | So do we take those steps?
00:22:24.120 | And if we take those steps, I think the danger
00:22:26.320 | of malevolent rogue actors doing us in
00:22:29.320 | with synbio could plummet.
00:22:31.600 | But it's always a question of if,
00:22:34.000 | and we have a bad, bad and very long track record
00:22:36.640 | of hitting the snooze bar after different natural pandemics
00:22:40.000 | have attacked us.
00:22:41.440 | So that's variable number one.
00:22:43.400 | Variable number two is how much experimentation
00:22:48.400 | and pathogen development do we as a society decide
00:22:52.520 | is acceptable in the realms of academia, government
00:22:57.040 | or private industry?
00:22:58.680 | And if we decide as a society that it's perfectly okay
00:23:03.440 | for people with varying research agendas
00:23:06.200 | to create pathogens that if released
00:23:09.520 | could wipe out humanity, if we think that's fine,
00:23:12.520 | and if that kind of work starts happening in one lab,
00:23:15.960 | five labs, 50 labs, 500 labs in one country,
00:23:19.940 | then 10 countries, then 70 countries or whatever,
00:23:23.280 | that risk of a boo-boo starts rising astronomically.
00:23:28.240 | And this won't be a spoiler alert based on the way
00:23:31.340 | that I presented those two things,
00:23:33.200 | but I think it's unbelievably important
00:23:35.640 | to manage both of those risks.
00:23:37.480 | The easier one to manage, although it wouldn't be simple
00:23:40.960 | by any stretch because it would have to be something
00:23:43.120 | that all nations agree on, but the easiest way,
00:23:46.420 | the easier risk to manage is that of,
00:23:49.600 | hey guys, let's not develop pathogens
00:23:52.440 | that if they escape from a lab could annihilate us.
00:23:56.120 | There's no line of research that justifies that,
00:23:58.720 | and in my view, I mean, that's the perspective
00:24:01.280 | we need to have.
00:24:02.100 | We'd have to collectively agree that there's no line
00:24:04.240 | of research that justifies that.
00:24:06.060 | The reason why I believe that would be
00:24:07.780 | a highly rational conclusion is even the highest level
00:24:11.360 | of biosafety lab in the world, biosafety lab level four,
00:24:15.560 | and there are not a lot of BSL-4 labs in the world,
00:24:18.240 | things can and have leaked out of BSL-4 labs,
00:24:23.600 | and some of the work that's been done
00:24:26.080 | with potentially annihilating pathogens,
00:24:28.480 | which we can talk about, is actually done at BSL-3,
00:24:31.840 | and so fundamentally, any lab can leak.
00:24:36.280 | We have proven ourselves to be incapable of creating a lab
00:24:40.000 | that is utterly impervious to leaks,
00:24:42.360 | so why in the world would we create something
00:24:44.880 | where if, God forbid, it leaked, could annihilate us all?
00:24:48.080 | And by the way, almost all of the measures
00:24:50.720 | that are taken in biosafety level anything labs
00:24:54.200 | are designed to prevent accidental leaks.
00:24:57.040 | What happens if you have a malevolent insider?
00:24:59.600 | And we could talk about the psychology and the motivations
00:25:02.640 | of what would make a malevolent insider
00:25:04.400 | who wants to release something annihilating in a bit.
00:25:07.460 | I'm sure that we will.
00:25:08.680 | But what if you have a malevolent insider?
00:25:11.280 | Virtually none of the standards that go
00:25:13.960 | into biosafety level one, two, three, and four
00:25:17.200 | are about preventing somebody hijacking the process.
00:25:20.120 | I mean, some of them are,
00:25:21.360 | but they're mainly designed against accidents.
00:25:23.480 | They're imperfect against accidents,
00:25:25.560 | and if this kind of work starts happening
00:25:27.280 | in lots and lots of labs, with every lab you add,
00:25:29.940 | the odds of there being a malevolent insider
00:25:32.360 | naturally increase arithmetically
00:25:34.120 | as the number of labs goes up.
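As a rough illustration of the scaling argument above, here is a minimal sketch of how the chance of at least one malevolent insider grows with the number of labs doing this kind of work. The per-lab probability is an arbitrary assumption chosen for illustration, not a figure from the conversation.

```python
# Illustrative only: assume each lab independently has the same small chance
# of harboring a malevolent insider over some period of time.
p_per_lab = 0.001  # assumed 0.1% chance per lab (made-up number)

for n_labs in (1, 5, 50, 500):
    # probability that at least one of the n labs has such an insider
    p_at_least_one = 1 - (1 - p_per_lab) ** n_labs
    print(f"{n_labs:>3} labs -> {p_at_least_one:.2%} chance of at least one insider")

# For small per-lab probabilities the risk grows roughly linearly
# ("arithmetically") with the number of labs before tapering off.
```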
00:25:35.960 | Now, on the front of somebody outside
00:25:40.960 | of a traditional government, academic,
00:25:44.680 | or scientific environment
00:25:47.600 | creating something malevolent,
00:25:50.040 | again, there's protections that we can take,
00:25:52.580 | both at the level of syn-bio architecture,
00:25:55.840 | the hardening of the entire syn-bio ecosystem
00:25:59.660 | against terrible things being made
00:26:02.840 | that we don't want to have out there by rogue actors,
00:26:05.920 | to early detection, to lots and lots of other things
00:26:08.880 | that we can do to dramatically mitigate that risk.
00:26:12.040 | And I think if we do both of those things,
00:26:13.640 | decide that, A, no, we're not going to experiment with
00:26:16.120 | or make annihilating pathogens in leaky labs,
00:26:19.360 | and B, yes, we are going to take countermeasures
00:26:22.340 | that are going to cost a fraction
00:26:24.720 | of our annual defense budget to preclude their creation,
00:26:29.080 | then I think both risks get managed down.
00:26:31.720 | But if you take one set of precautions and not the other,
00:26:34.840 | then the thing that you have not taken precautions against
00:26:38.480 | immediately becomes the more likely outcome.
00:26:40.880 | - So can we talk about this kind of research
00:26:43.340 | and what's actually done,
00:26:45.080 | and what are the positives and negatives of it?
00:26:47.160 | So if we look at gain-of-function research
00:26:50.320 | and the kind of stuff that's happening
00:26:51.960 | in level three and level four BSL labs,
00:26:55.220 | what's the whole idea here?
00:26:56.560 | Is it trying to engineer viruses
00:26:59.440 | to understand how they behave?
00:27:01.480 | You want to understand the dangerous ones.
00:27:03.320 | - Yeah, so that would be the logic behind doing it.
00:27:06.600 | And so gain-of-function can mean a lot of different things.
00:27:10.200 | Viewed through a certain lens, gain-of-function research
00:27:13.480 | could be what you do when you create GMOs,
00:27:16.760 | when you create hearty strains of corn
00:27:19.880 | that are resistant to pesticides.
00:27:21.440 | I mean, you could view that as gain-of-function.
00:27:23.240 | So I'm going to refer to gain-of-function
00:27:25.040 | in a relatively narrow sense,
00:27:26.400 | which is actually the sense that the term is usually used,
00:27:29.640 | which is in some way magnifying capabilities
00:27:34.320 | of microorganisms to make them more dangerous,
00:27:37.400 | whether it's more transmissible or more deadly.
00:27:40.180 | And in that line of research,
00:27:42.560 | I'll use an example from 2011,
00:27:44.480 | 'cause it's very illustrative and it's also very chilling.
00:27:48.160 | Back in 2011, two separate labs,
00:27:52.080 | independently of one another,
00:27:53.280 | I assume there was some kind of communication between them,
00:27:55.320 | but they were basically independent projects,
00:27:57.120 | one in Holland and one in Wisconsin,
00:27:58.960 | did gain-of-function research
00:28:01.480 | on something called H5N1 flu.
00:28:04.080 | H5N1 is something that, at least on a lethality basis,
00:28:09.080 | makes COVID look like a kitten.
00:28:12.400 | COVID, according to the World Health Organization,
00:28:14.360 | has a case fatality rate
00:28:15.440 | somewhere between half a percent and 1%.
00:28:17.960 | H5N1 is closer to 60%, six zero.
00:28:21.320 | And so that's actually even slightly more lethal than Ebola.
00:28:24.560 | It's a very, very, very scary pathogen.
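For a sense of scale, a quick arithmetic check on the case fatality rates as quoted here (approximate figures as cited, not precise epidemiological values):

```python
# Rough arithmetic on the case fatality rates cited in the conversation.
covid_cfr_low, covid_cfr_high = 0.005, 0.01  # ~0.5% to 1% (WHO range cited above)
h5n1_cfr = 0.60                              # ~60% for H5N1

print(f"H5N1 is roughly {h5n1_cfr / covid_cfr_high:.0f}x to "
      f"{h5n1_cfr / covid_cfr_low:.0f}x more lethal per case than COVID")
# -> roughly 60x to 120x, which is the force of the
#    "makes COVID look like a kitten" comparison.
```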
00:28:27.460 | The good news about H5N1,
00:28:29.200 | is that it is barely, barely contagious.
00:28:33.000 | And I believe it is in no way contagious human to human.
00:28:36.320 | It requires very, very, very deep contact
00:28:42.040 | with birds, in most cases chickens.
00:28:44.760 | And so if you're a chicken farmer
00:28:47.120 | and you spend an enormous amount of time around them,
00:28:49.340 | and perhaps you get into situations
00:28:51.560 | in which you get a break in your skin
00:28:53.660 | and you're interacting intensely with fowl
00:28:57.840 | who, as it turns out, have H5N1,
00:29:00.200 | that's when the jump comes.
00:29:01.880 | But there's no airborne transmission
00:29:03.880 | that we're aware of human to human.
00:29:05.320 | I mean, not just that we're not aware of it, it just doesn't exist.
00:29:08.000 | I think the World Health Organization
00:29:11.240 | did a relentless survey of the number of H5N1 cases.
00:29:15.320 | I think they do it every year.
00:29:16.800 | I saw one 10-year series where I think it was like
00:29:19.960 | 500 fatalities over the course of a decade.
00:29:22.920 | And that's a drop in the bucket.
00:29:24.600 | Kind of fun fact.
00:29:26.560 | I believe the typical lethality from lightning
00:29:29.680 | over 10 years is 70,000 deaths.
00:29:31.840 | So we've been getting struck by lightning, pretty low risk.
00:29:34.420 | H5N1, much, much lower than that.
00:29:36.920 | What happened in these experiments
00:29:39.300 | is the experimenters in both cases
00:29:41.520 | set out to make H5N1 that would be contagious,
00:29:45.840 | that could create airborne transmission.
00:29:48.200 | And so they basically passed it, I think in both cases,
00:29:50.980 | they passed it through a large number of ferrets.
00:29:54.520 | And so this wasn't like CRISPR,
00:29:56.440 | there wasn't even any CRISPR back in those days.
00:29:58.160 | This was relatively straightforward,
00:30:00.400 | selecting for a particular outcome.
00:30:03.040 | And after guiding the path and passing them through,
00:30:06.240 | again, I believe it was a series of ferrets,
00:30:07.760 | they did in fact come up with a version of H5N1
00:30:11.560 | that is capable of airborne transmission.
00:30:14.560 | Now, they didn't unleash it into the world,
00:30:17.320 | they didn't inject it into humans to see what would happen.
00:30:20.520 | And so for those two reasons,
00:30:21.960 | we don't really know how contagious it might have been.
00:30:25.520 | But if it was as contagious as COVID,
00:30:29.200 | that could be a civilization-threatening pathogen.
00:30:33.080 | And why would you do it?
00:30:34.680 | Well, the people who did it were good guys.
00:30:36.700 | They were virologists.
00:30:38.320 | I believe their agenda as they explained it was,
00:30:40.760 | much as you said, let's figure out
00:30:43.240 | what a worst case scenario might look like
00:30:45.700 | so we can understand it better.
00:30:47.680 | But my understanding is in both cases,
00:30:50.360 | it was done in BSL-3 labs.
00:30:52.700 | And so potential of leak, significantly non-zero,
00:30:57.360 | hopefully way below 1%, but significantly non-zero.
00:31:00.920 | And when you look at the consequences of an escape
00:31:03.480 | in terms of human lives,
00:31:05.440 | destruction of a large portion of the economy, et cetera,
00:31:08.360 | and you do an expected value calculation
00:31:10.600 | on whatever fraction of 1% that was,
00:31:13.360 | you would come up with a staggering cost,
00:31:17.080 | staggering expected cost for this work.
00:31:19.260 | So it should never have been carried out.
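To make the expected-value reasoning concrete, here is a minimal back-of-the-envelope sketch. The leak probability and damage figures below are assumptions chosen purely for illustration, not numbers from the episode.

```python
# Back-of-the-envelope expected cost of one experiment that could release a
# civilization-threatening pathogen. All inputs below are assumptions.
p_leak = 0.001                   # assumed 0.1% chance the enhanced pathogen escapes
deaths_if_released = 50_000_000  # assumed global death toll if it spreads widely
economic_damage = 10e12          # assumed economic damage in dollars (~$10 trillion)

expected_deaths = p_leak * deaths_if_released
expected_cost = p_leak * economic_damage

print(f"Expected deaths per experiment: {expected_deaths:,.0f}")
print(f"Expected economic cost: ${expected_cost:,.0f}")
# Even at a 0.1% leak probability, the expected toll is ~50,000 lives and
# ~$10 billion per experiment -- the "staggering expected cost" described above.
```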
00:31:21.860 | Now, you might make an argument.
00:31:24.440 | If you said, if you believed that H5N1 in nature
00:31:29.440 | is on an inevitable path to airborne transmission,
00:31:33.960 | and it's only gonna be a small number of years, A,
00:31:37.280 | and B, if it makes that transition,
00:31:40.280 | there is one set of changes to its metabolic pathways
00:31:44.840 | and its genomic code and so forth,
00:31:47.320 | one, that we have discovered.
00:31:49.060 | So it is gonna go from point A,
00:31:51.280 | which is where it is right now, to point B.
00:31:53.560 | We have reliably engineered point B.
00:31:56.120 | That is the destination.
00:31:58.020 | And we need to start fighting that right now
00:32:00.120 | because this is five years or less away.
00:32:02.400 | Now, that'd be a very different world.
00:32:03.440 | That'd be like spotting an asteroid
00:32:04.920 | that's coming toward the Earth and is five years off.
00:32:06.960 | And yes, you marshal everything you can to resist that.
00:32:10.240 | But there's two problems with that perspective.
00:32:12.520 | The first is, in however many thousands of generations
00:32:15.920 | that humans have been inhabiting this planet,
00:32:17.720 | there has never been a transmissible form of H5N1.
00:32:21.160 | And influenza's been around for a very long time.
00:32:23.540 | So there is no case for inevitability
00:32:27.360 | of this kind of a jump to airborne transmission.
00:32:30.080 | So we're not on a freight train to that outcome.
00:32:33.400 | And if there was inevitability around that,
00:32:36.080 | it's not like there's just one set of genetic code
00:32:38.720 | that would get there.
00:32:39.760 | There are all kinds of different mutations
00:32:43.680 | that could conceivably result in that kind of an outcome.
00:32:47.100 | Unbelievable diversity of mutations.
00:32:49.600 | And so we're not actually creating something
00:32:51.840 | we're inevitably going to face.
00:32:54.000 | But we are creating something,
00:32:55.440 | we are creating a very powerful
00:32:58.300 | and unbelievably negative card and injecting it in the deck
00:33:01.960 | that nature never put into the deck.
00:33:04.280 | So in that case, I just don't see any moral
00:33:08.880 | or scientific justification for that kind of work.
00:33:12.120 | And interestingly, there was quite a bit of excitement
00:33:17.120 | and concern about this when the work came out.
00:33:18.820 | One of the teams was gonna publish their results in science,
00:33:21.100 | the other in nature.
00:33:22.440 | And there were a lot of editorials
00:33:25.160 | and a lot of scientists are saying this is crazy.
00:33:27.800 | And publication of those papers did get suspended.
00:33:31.080 | And not long after that, there was a pause put
00:33:33.520 | on US government funding, NIH funding
00:33:36.240 | on gain-of-function research.
00:33:38.120 | But both of those speed bumps were ultimately removed.
00:33:41.720 | Those papers did ultimately get published.
00:33:43.800 | And that pause on funding ceased long ago.
00:33:47.720 | And in fact, those two very projects, my understanding is,
00:33:50.560 | resumed their funding, got their government funding back.
00:33:53.680 | I don't know why the Dutch project's getting NIH funding,
00:33:55.840 | but whatever, about a year and a half ago.
00:33:58.700 | So as far as the US government and regulators are concerned,
00:34:02.980 | it's all systems go for gain-of-function at this point,
00:34:05.580 | which I find very troubling.
00:34:07.580 | - Now I'm a little bit of an outsider from this field,
00:34:09.700 | but it has echoes of the same kind of problem I see
00:34:12.500 | in the AI world with autonomous weapon systems.
00:34:16.220 | Nobody, my colleagues, friends,
00:34:21.220 | as far as I can tell, people in the AI community
00:34:25.180 | are not really talking about autonomous weapon systems.
00:34:28.580 | even as the US and China go full steam ahead
00:34:31.540 | on the development of both.
00:34:33.100 | And that seems to be a similar kind of thing
00:34:35.180 | on gain-of-function.
00:34:37.140 | I have friends in the biology space,
00:34:40.900 | and they don't want to talk about gain-of-function publicly.
00:34:45.240 | That makes me very uncomfortable
00:34:48.860 | from an outsider perspective in terms of gain-of-function.
00:34:51.460 | It makes me very uncomfortable
00:34:53.860 | from the insider perspective on autonomous weapon systems.
00:34:56.940 | I'm not sure how to communicate exactly
00:34:58.900 | about autonomous weapon systems,
00:35:00.220 | and I certainly don't know how to communicate effectively
00:35:02.460 | about gain-of-function.
00:35:03.980 | What is the right path forward here?
00:35:06.060 | Should we cease all gain-of-function research?
00:35:08.860 | Is that really the solution here?
00:35:11.380 | - Well, again, I'm gonna use gain-of-function
00:35:13.100 | in the relatively narrow context of what we're discussing.
00:35:15.420 | - Sorry, yes, for viruses.
00:35:16.260 | - 'Cause you could say almost anything that you do
00:35:18.460 | to make biology more effective is gain-of-function.
00:35:20.540 | So within the narrow confines of what we're discussing,
00:35:23.500 | I think it would be easy enough for level-headed people
00:35:28.500 | in all of the countries,
00:35:30.380 | level-headed governmental people in all of the countries
00:35:32.500 | that realistically could support such a program to agree,
00:35:36.060 | we don't want this to happen because all labs leak.
00:35:40.660 | I mean, an example that I use,
00:35:43.380 | I actually did use it in the piece
00:35:45.460 | I did with Sam Harris as well,
00:35:47.000 | is the anthrax attacks in the United States in 2001.
00:35:51.380 | I mean, talk about an example of the least likely lab
00:35:55.020 | leaking into the least likely place.
00:35:57.940 | This was shortly after 9/11, for folks who don't remember it,
00:36:01.020 | and it was a very, very lethal strand of anthrax
00:36:04.500 | that, as it turned out,
00:36:06.340 | based on the forensic genomic work that was done
00:36:08.780 | and so forth, absolutely leaked
00:36:11.020 | from a high-security US Army lab,
00:36:13.280 | probably the one at Fort Detrick in Maryland.
00:36:15.980 | It might've been another one, but who cares?
00:36:17.300 | It absolutely leaked from a high-security US Army lab.
00:36:21.420 | And where did it leak to,
00:36:22.780 | this highly dangerous substance
00:36:24.880 | that was kept under lock and key
00:36:26.860 | by a very security-minded organization?
00:36:29.460 | Well, it leaked to places
00:36:30.660 | including the Senate Majority Leader's office,
00:36:32.620 | Tom Daschle's office, I think it was Senator Leahy's office,
00:36:35.300 | certain publications,
00:36:36.300 | including, bizarrely, the National Enquirer.
00:36:39.160 | But let's go to the Senate Majority Leader's office.
00:36:41.860 | It is hard to imagine a more security-minded country
00:36:46.000 | than the United States two weeks after the 9/11 attack.
00:36:49.120 | I mean, it doesn't get more security-minded than that.
00:36:52.200 | And it's also hard to imagine
00:36:54.820 | a more security-capable organization
00:36:57.620 | than the United States military.
00:36:59.220 | We can joke all we want
00:37:00.820 | about inefficiencies in the military
00:37:02.680 | and $24,000 wrenches and so forth,
00:37:06.020 | but pretty capable when it comes to that,
00:37:08.060 | despite that level of focus and concern and competence,
00:37:13.460 | just days after the 9/11 attack,
00:37:16.440 | something comes from the inside
00:37:18.380 | of our military and industrial complex
00:37:20.240 | and ends up in the office of someone,
00:37:22.680 | I believe the Senate Majority Leader,
00:37:24.120 | somewhere in the line of presidential succession.
00:37:26.480 | It tells us everything can leak.
00:37:27.980 | So again, think of a level-headed conversation
00:37:30.900 | between powerful leaders in a diversity of countries,
00:37:34.920 | thinking through, like,
00:37:36.460 | I can imagine a very simple PowerPoint revealing,
00:37:39.680 | just discussing briefly things like the anthrax leak,
00:37:43.080 | things like the foot-and-mouth disease outbreak,
00:37:47.320 | or leak, that came out of a BSL-4-level lab in the UK,
00:37:51.000 | several other things,
00:37:52.480 | talking about the utter virulence
00:37:54.940 | that could result from gain-of-function and say,
00:37:57.000 | folks, can we agree that this just shouldn't happen?
00:38:01.120 | I mean, if we were able to agree
00:38:02.760 | on the Nuclear Non-Proliferation Treaty,
00:38:04.900 | which we were,
00:38:06.120 | or the Bioweapons Convention,
00:38:07.360 | which we did agree on, we the world, for the most part,
00:38:10.840 | I believe agreement could be found there.
00:38:13.880 | But it's gonna take people in leadership
00:38:16.680 | of a couple of very powerful countries
00:38:18.960 | to get to the consensus amongst them
00:38:20.960 | and then to decide we're gonna get everybody together
00:38:23.040 | and browbeat them into banning this stuff.
00:38:24.640 | Now, that doesn't make it entirely impossible
00:38:26.620 | that somebody might do this,
00:38:28.460 | but in well-regulated, carefully watched over
00:38:33.120 | fiduciary environments,
00:38:35.040 | like federally-funded academic research,
00:38:36.980 | anything going on in the government itself,
00:38:39.520 | things going on in companies that have investors
00:38:43.400 | who don't wanna go to jail for the rest of their lives,
00:38:46.160 | I think that would have a major,
00:38:48.320 | major dampening impact on it.
00:38:50.080 | - But there is a particular possible catalyst
00:38:53.820 | in this time we live in,
00:38:55.200 | for really kind of raising the question
00:38:59.560 | of gain-of-function research for the application of viruses,
00:39:02.400 | making viruses more dangerous,
00:39:03.920 | and that is the question of whether COVID
00:39:08.680 | leaked from a lab.
00:39:10.080 | Sort of not even answering that question,
00:39:14.320 | but even asking that question.
00:39:16.000 | It seems like a very important question to ask
00:39:21.200 | to catalyze the conversation
00:39:23.520 | about whether we should be doing gain-of-function research.
00:39:26.520 | I mean, from a high level,
00:39:28.020 | why do you think people,
00:39:31.720 | even colleagues of mine,
00:39:33.040 | are not comfortable asking that question?
00:39:35.200 | And two, do you think that the answer could be
00:39:38.040 | that it did leak from a lab?
00:39:40.200 | - I think the mere possibility
00:39:43.080 | that it did leak from a lab
00:39:45.280 | is evidence enough, again,
00:39:48.040 | for the hypothetical, rational national leaders
00:39:52.360 | watching this simple PowerPoint.
00:39:54.520 | If you could put the possibility at 1%,
00:39:57.720 | and you look at the unbelievable destructive power
00:40:00.600 | that COVID had,
00:40:02.040 | that should be an overwhelmingly powerful argument
00:40:04.760 | for excluding it.
00:40:06.120 | Now, as to whether or not that was a leak,
00:40:10.880 | some very, very level-headed people think so, but
00:40:10.880 | I don't know enough about all of the factors
00:40:14.160 | in the Bayesian analysis and so forth
00:40:16.080 | that has gone into people making the pro-argument of that.
00:40:19.560 | So I don't pretend to be an expert on that,
00:40:21.720 | and I don't have a point of view.
00:40:24.040 | I just don't know.
00:40:25.720 | But what we can say is
00:40:29.080 | it is entirely possible for a couple of reasons.
00:40:32.240 | One is that there is a BSL-4 lab in Wuhan,
00:40:35.600 | the Wuhan Institute of Virology.
00:40:37.400 | I believe it's the only BSL-4 in China.
00:40:40.240 | I could be wrong about that.
00:40:41.640 | But it definitely had a history
00:40:45.880 | that alarmed very sophisticated US diplomats and others
00:40:50.880 | who were in contact with the lab
00:40:53.120 | and were aware of what it was doing
00:40:55.360 | long before COVID hit the world.
00:40:59.960 | And so there are diplomatic cables
00:41:01.920 | that have been declassified.
00:41:03.640 | I believe one sophisticated scientist or other observer
00:41:07.120 | said that WIV is a ticking time bomb.
00:41:10.200 | And I believe it's also been pretty reasonably established
00:41:13.840 | that coronaviruses were a topic of great interest at WIV.
00:41:18.240 | SARS obviously came out of China,
00:41:19.960 | and that's a coronavirus,
00:41:21.480 | so it would make an enormous amount of sense
00:41:23.040 | for it to be studied there.
00:41:24.400 | And there is so much opacity
00:41:28.440 | about what happened in the early days and weeks
00:41:31.640 | after the outbreak that's basically been imposed
00:41:34.840 | by the Chinese government that we just don't know.
00:41:38.040 | So it feels like a substantial,
00:41:40.560 | or greater than 1%, possibility to me
00:41:42.960 | looking at it from the outside.
00:41:44.880 | And that's something that one could imagine.
00:41:47.720 | Now we're going to the realm of thought experiment,
00:41:49.320 | not me decreeing this is what happened,
00:41:51.100 | but if they're studying coronavirus
00:41:53.800 | at the Wuhan Institute of Virology,
00:41:56.120 | and there is this precedent of gain-of-function research
00:41:59.360 | that's been done on something
00:42:00.840 | that is remarkably uncontagious to humans,
00:42:03.160 | whereas we know coronavirus is contagious to humans.
00:42:05.680 | I could definitely see it. And there is this global consensus.
00:42:09.960 | Certainly was the case two or three years ago
00:42:12.480 | when this work might have started.
00:42:13.920 | Seems to be this global consensus
00:42:15.720 | that gain-of-function is fine.
00:42:17.560 | US paused funding for a little while, but paused funding.
00:42:20.680 | They never said private actors couldn't do it.
00:42:23.280 | It was just a pause of NIH funding.
00:42:25.600 | And then that pause was lifted.
00:42:26.920 | So again, none of this is irrational.
00:42:28.800 | You could certainly see the folks at WIV saying,
00:42:31.600 | gain-of-function, interesting vector,
00:42:33.440 | coronavirus, unlike H5N1, very contagious.
00:42:38.000 | We're a nation that has had terrible run-ins
00:42:41.640 | with coronavirus.
00:42:42.760 | Why don't we do a little gain-of-function on this?
00:42:44.580 | And then like all labs at all levels,
00:42:48.320 | one can imagine this lab leaking.
00:42:49.780 | So it's not an impossibility,
00:42:52.120 | and very, very level-headed people,
00:42:54.640 | you know, who've looked at it much more deeply,
00:42:56.640 | have said that they do believe in that outcome.
00:42:58.780 | - Why is it such a threat to power,
00:43:01.840 | the idea that it'll leak from a lab?
00:43:03.520 | Why is it so threatening?
00:43:04.640 | I don't maybe understand this point exactly.
00:43:07.200 | Like, is it just that as governments,
00:43:12.000 | and especially the Chinese government
00:43:13.360 | is really afraid of admitting mistakes that everybody makes?
00:43:18.200 | So this is a horrible mistake.
00:43:19.520 | Like Chernobyl is a good example.
00:43:21.880 | I come from the Soviet Union.
00:43:24.520 | I mean, well, major mistakes were made in Chernobyl.
00:43:29.340 | I would argue for a lab leak to happen,
00:43:32.420 | the scale of the mistake is much smaller, right?
00:43:39.680 | The depth and the breadth of rot in the bureaucracy
00:43:44.680 | that led to Chernobyl is much bigger
00:43:49.860 | than anything that could lead to a lab leak,
00:43:52.920 | 'cause it could literally just be,
00:43:54.940 | I mean, I'm sure there's security,
00:43:56.260 | very careful security procedures, even in level three labs,
00:43:59.200 | but I imagine maybe you can correct me.
00:44:04.200 | All it takes is the incompetence
00:44:06.220 | of a small number of individuals.
00:44:08.060 | - Or even one, yeah.
00:44:09.060 | - One individual, over a particular
00:44:11.340 | couple-week, three-week period,
00:44:14.000 | as opposed to a multi-year bureaucratic failure
00:44:17.780 | of the entire government.
00:44:19.180 | - Right, well, certainly the magnitude of mistakes
00:44:22.380 | and compounding mistakes that went into Chernobyl
00:44:24.300 | was far, far, far greater,
00:44:22.380 | but the consequence of COVID outweighs
00:44:29.920 | the consequence of Chernobyl to a tremendous degree.
00:44:33.480 | And I think that particularly authoritarian governments
00:44:38.480 | are unbelievably reluctant to admit
00:44:44.020 | to any fallibility whatsoever.
00:44:46.420 | And there's a long, long history of that
00:44:48.360 | across dozens and dozens of authoritarian governments.
00:44:52.140 | And to be transparent, again,
00:44:54.980 | this is in the hypothetical world in which this was a leak,
00:44:57.540 | which again, I don't personally have enough sophistication
00:45:01.540 | to have an opinion on the likelihood,
00:45:03.220 | but in the hypothetical world in which it was a leak,
00:45:06.420 | the global reaction and the amount of global animus
00:45:11.420 | and the amount of, you know,
00:45:16.260 | the decline in global respect
00:45:18.700 | that would happen toward China,
00:45:21.720 | because every country suffered massively from this,
00:45:24.640 | unbelievable damages in terms of human lives
00:45:27.980 | and economic activity disrupted.
00:45:29.820 | The world would in some way present China with that bill.
00:45:34.620 | And when you take on top of that,
00:45:37.920 | the natural disinclination for any authoritarian government
00:45:41.260 | to admit any fallibility and tolerate the possibility
00:45:43.780 | of any fallibility whatsoever,
00:45:45.440 | and you look at the relative opacity,
00:45:49.420 | even though they let a World Health Organization
00:45:51.580 | group in, you know, a couple of months ago to run around,
00:45:55.300 | they didn't give that WHO group
00:45:56.800 | anywhere near the level of access that would be necessary
00:45:59.220 | to definitively say X happened versus Y.
00:46:02.240 | The level of opacity that surrounds those opening weeks
00:46:04.900 | and months of COVID in China, we just don't know.
00:46:09.180 | - If you were to kind of look back at 2020
00:46:12.700 | and maybe broadening it out to future pandemics
00:46:17.980 | that could be much more dangerous,
00:46:20.780 | what kind of response, how do we fail in a response
00:46:24.820 | and how could we do better?
00:46:27.420 | So there's the gain-of-function research we were discussing,
00:46:31.380 | which, you know, raises the point that
00:46:33.380 | we should not be creating viruses
00:46:36.540 | that are both exceptionally contagious
00:46:38.620 | and exceptionally deadly to humans.
00:46:41.000 | But if it does happen, perhaps through natural evolution,
00:46:45.180 | natural mutation, are there interesting
00:46:48.300 | technological responses on the testing side,
00:46:52.860 | on the vaccine development side, on the collection of data,
00:46:57.220 | or on the basic sort of policy response side,
00:47:00.220 | or the sociological, the psychological side?
00:47:03.460 | - Yeah, there's all kinds of things.
00:47:05.320 | And most of what I've thought about and written about,
00:47:09.000 | and again, discussed in that long bit with Sam, is dual use.
00:47:14.000 | So most of the countermeasures that I've been thinking about
00:47:18.220 | and advocating for would be every bit as effective
00:47:21.480 | against zoonotic disease, a natural pandemic of some sort
00:47:25.900 | as an artificial one.
00:47:27.340 | The risk of an artificial one,
00:47:29.120 | even the near-term risk of an artificial one,
00:47:31.700 | ups the urgency around these measures immensely,
00:47:34.540 | but most of them would be broadly applicable.
00:47:37.500 | And so I think the first thing that we really wanna do
00:47:40.800 | on a global scale is have a far, far, far more robust
00:47:45.140 | and globally transparent system of detection.
00:47:49.260 | And that can happen on a number of levels.
00:47:52.020 | The most obvious one is just in the blood of people
00:47:56.660 | who come into clinics exhibiting signs of illness.
00:47:59.820 | And we are certainly at a point now
00:48:03.180 | where, with relatively minimal investment,
00:48:07.360 | we could develop in-clinic diagnostics
00:48:09.980 | that would be unbelievably effective
00:48:12.000 | at pinpointing what's going on
00:48:13.980 | in almost any disease when somebody walks
00:48:16.700 | into a doctor's office or a clinic.
00:48:18.780 | And better than that, this is a little bit further off,
00:48:23.620 | but it wouldn't cost tens of billions in research dollars,
00:48:26.380 | it would be a relatively modest and affordable budget
00:48:28.980 | in relation to the threat: at-home diagnostics
00:48:31.900 | that can really, really pinpoint,
00:48:34.820 | okay, particularly with respiratory infections,
00:48:37.780 | because that is generally, almost universally,
00:48:40.880 | the mechanism of transmission for any serious pandemic.
00:48:44.520 | So somebody has a respiratory infection.
00:48:46.940 | Is it one of the significantly large handful
00:48:50.900 | of rhinoviruses, coronaviruses,
00:48:52.640 | and other things that cause common cold?
00:48:55.200 | Or is it influenza?
00:48:56.360 | If it's influenza, is it influenza A versus B?
00:48:58.920 | Or is it a small handful of other more exotic,
00:49:03.880 | but nonetheless sort of common respiratory infections
00:49:07.660 | that are out there?
00:49:08.840 | Developing a diagnostic panel to pinpoint all of that stuff,
00:49:12.080 | that's something that's well within our capabilities.
00:49:13.920 | That's much less a lift than creating mRNA vaccines,
00:49:17.720 | which obviously we proved capable of
00:49:19.360 | when we put our minds to it.
00:49:21.000 | So do that on a global basis.
00:49:24.160 | And I don't think that's irrational
00:49:25.600 | because the best prototype for this that I'm aware of
00:49:29.420 | isn't currently rolling out in Atherton, California,
00:49:32.820 | or Fairfield County, Connecticut,
00:49:34.560 | or some other wealthy place.
00:49:36.200 | The best prototype of this that I'm aware of
00:49:37.920 | is rolling out right now in Nigeria.
00:49:39.920 | And it's a project that came out of the Broad Institute,
00:49:43.080 | which, as I'm sure you know,
00:49:44.880 | but some listeners may not,
00:49:46.400 | is kind of like an academic joint venture
00:49:49.020 | between Harvard and MIT.
00:49:50.800 | The program is called Sentinel.
00:49:52.880 | And their objective is, and their plan,
00:49:56.520 | and it's a very well-conceived plan,
00:49:58.000 | methodical plan, is to do just that
00:50:00.840 | in areas of Nigeria that are particularly vulnerable
00:50:03.640 | to zoonotic diseases,
00:50:05.480 | making the jump from animals to humans.
00:50:07.920 | But also there's just an unbelievable
00:50:09.920 | public health benefit from that.
00:50:11.840 | And it's sort of a three-tier system
00:50:13.480 | where clinicians in the field could very rapidly determine,
00:50:17.760 | do you have one of the infections of acute interest here,
00:50:21.560 | either because it's very common in this region,
00:50:23.820 | so we want to diagnose as many things as we can
00:50:26.400 | at the frontline,
00:50:27.580 | or because it's uncommon
00:50:29.000 | but unbelievably threatening like Ebola.
00:50:31.240 | So frontline worker can make that determination
00:50:33.800 | very, very rapidly.
00:50:35.360 | If it comes up as a we don't know,
00:50:38.120 | they bump it up to a level that's more like
00:50:40.140 | at a fully configured doctor's office or local hospital.
00:50:43.920 | And if it's still at a we don't know,
00:50:45.440 | it gets bumped up to a national level.
00:50:47.440 | And it gets bumped very, very rapidly.
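A minimal sketch of that three-tier escalation, for readers who want to see the flow spelled out; the tier names, test panels, and functions here are hypothetical illustrations, not Sentinel's actual software or protocol:

```python
# A minimal sketch of a three-tier diagnostic triage flow (hypothetical
# tier names and test panels for illustration; not Sentinel's actual system).

FRONTLINE_PANEL = {"malaria", "influenza_A", "influenza_B", "ebola", "lassa"}
CLINIC_PANEL = FRONTLINE_PANEL | {"rhinovirus", "common_coronavirus", "rsv"}
NATIONAL_PANEL = CLINIC_PANEL | {"broad_metagenomic_sequencing"}

TIERS = [
    ("frontline worker", FRONTLINE_PANEL),
    ("local clinic or hospital", CLINIC_PANEL),
    ("national reference lab", NATIONAL_PANEL),
]

def run_panel(sample, panel):
    """Stand-in for an assay: 'detects' the pathogen only if the panel covers it."""
    return sample if sample in panel else None

def triage(sample):
    # Each tier covers a broader panel; an unresolved sample gets bumped up.
    for tier_name, panel in TIERS:
        result = run_panel(sample, panel)
        if result is not None:
            return "identified %s at the %s tier" % (result, tier_name)
    return "unresolved at every tier -> flag as a possible novel pathogen"

print(triage("influenza_A"))    # resolved at the frontline
print(triage("rhinovirus"))     # bumped up once, resolved at the clinic
print(triage("novel_virus_x"))  # bumped all the way up and flagged
```

The design point is simply that each tier covers a broader panel, so a sample only climbs as far as it needs to.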
00:50:51.060 | So if this can be done in Nigeria,
00:50:53.760 | and it seems that it can be,
00:50:56.480 | there shouldn't be any inhibition for it to happen
00:50:58.900 | in most other places.
00:51:00.680 | And it should be affordable from a budgetary standpoint.
00:51:03.200 | And based on Sentinel's budget and adjusting
00:51:05.960 | for things like very different cost of living,
00:51:08.800 | larger population, et cetera,
00:51:10.720 | I did a back of the envelope calculation
00:51:13.040 | that doing something like Sentinel in the US
00:51:14.720 | would be in the low billions of dollars.
00:51:16.800 | And wealthy countries, middle-income countries
00:51:20.280 | can afford such a thing.
00:51:21.840 | Lower-income countries should certainly be helped with that.
00:51:25.600 | But start with that level of detection.
00:51:27.700 | And then layer on top of that other interesting things
00:51:30.680 | like monitoring search engine traffic,
00:51:33.480 | search engine queries for evidence
00:51:36.240 | that strange clusters of symptoms
00:51:38.720 | are starting to rise in different places.
00:51:40.920 | There's been a lot of work done with that.
00:51:42.900 | Most of it kind of academic and experimental.
00:51:45.920 | But some of it has been powerful enough to suggest
00:51:48.200 | that this could be a very powerful early warning system.
00:51:51.100 | There's a guy named Bill Lampos
00:51:52.360 | at University College London,
00:51:54.820 | who basically did a very rigorous analysis
00:51:58.560 | that showed that symptom searches
00:52:02.160 | reliably predicted COVID outbreaks
00:52:04.820 | in the early days of the pandemic in given countries
00:52:07.760 | by as much as 16 days before the evidence
00:52:10.200 | started to accrue at a public health level.
00:52:12.200 | 16 days of forewarning can be monumentally important
00:52:16.600 | in the early days of an outbreak.
00:52:18.580 | And this is a very, very talented,
00:52:22.460 | but nonetheless very resource-constrained academic project.
00:52:26.200 | Imagine if that was something that was done
00:52:28.380 | with a NORAD-like budget.
00:52:30.560 | Yeah, so I mean, starting with detection,
00:52:32.840 | that's something we could do radically, radically better.
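As a toy illustration of the kind of lag analysis being described, the sketch below cross-correlates a synthetic symptom-search series with a delayed case-count series to recover the lead time; the data and the 16-day offset are made up for illustration, and this is not Lampos's actual method or dataset:

```python
# Toy illustration of estimating how far symptom-search volume leads
# confirmed cases, by scanning lags for the highest correlation.
# Synthetic data only; not the UCL study's method or numbers.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(120)
true_lead = 16  # assumed lead time baked into this toy example

# An epidemic-like curve for search volume, and a delayed, noisier case curve.
searches = np.exp(-0.5 * ((days - 60) / 12) ** 2) + 0.05 * rng.standard_normal(days.size)
cases = np.exp(-0.5 * ((days - 60 - true_lead) / 12) ** 2) + 0.05 * rng.standard_normal(days.size)

def correlation_at_lag(x, y, lag):
    """Pearson correlation between x at day t and y at day t + lag."""
    if lag == 0:
        return np.corrcoef(x, y)[0, 1]
    return np.corrcoef(x[:-lag], y[lag:])[0, 1]

lags = range(0, 30)
best_lag = max(lags, key=lambda k: correlation_at_lag(searches, cases, k))
print(f"estimated lead time: {best_lag} days")  # should recover roughly 16 days
```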
00:52:35.440 | - So aggregating multiple data sources
00:52:37.400 | in order to create something.
00:52:39.000 | I mean, this is really exciting to me,
00:52:40.500 | the possibility that I've heard inklings of,
00:52:43.240 | of creating almost like a weather map of pathogens.
00:52:46.680 | Like basically aggregating all of these data sources,
00:52:51.680 | scaling at-home testing up by many orders of magnitude,
00:52:55.520 | and all kinds of testing
00:52:57.880 | that doesn't just try to test
00:52:59.860 | for the particular pathogen of worry now,
00:53:02.760 | but everything, like a full spectrum of things
00:53:06.420 | that could be dangerous to the human body
00:53:08.860 | and thereby be able to create these maps
00:53:11.260 | that are dynamically updated on an hourly basis
00:53:16.000 | of how viruses travel throughout the world.
00:53:19.380 | And so you can respond, like you can then integrate,
00:53:22.300 | just like you do when you check your weather map
00:53:24.540 | and it's raining or not, of course, not perfect,
00:53:27.740 | but it's a very good predictor
00:53:29.420 | of whether it's gonna rain or not,
00:53:31.220 | and use that to then make decisions about your own life.
00:53:35.140 | Ultimately give the power of information
00:53:37.380 | to individuals to respond.
00:53:39.140 | And if it's a super dangerous,
00:53:40.940 | like if it's acid rain versus regular rain,
00:53:43.740 | you might wanna really stay inside
00:53:45.620 | as opposed to risking it.
00:53:47.220 | And that, just like you said,
00:53:50.980 | I think it's not very expensive
00:53:52.700 | relative to all the things that we do in this world,
00:53:56.340 | but it does require bold leadership.
00:53:59.200 | And there's another dark thing,
00:54:01.760 | which really has bothered me about 2020,
00:54:03.820 | which it requires, is it requires trust in institutions
00:54:08.820 | to carry out these kinds of programs,
00:54:12.260 | and it requires trust in science and engineers
00:54:15.980 | and sort of centralized organizations
00:54:19.200 | that would operate at scale here.
00:54:21.040 | And much of that trust has been,
00:54:25.000 | at least in the United States, diminished.
00:54:27.740 | It feels like, I'm not exactly sure where to place the blame,
00:54:32.080 | but I do place quite a bit of the blame
00:54:33.900 | into the scientific community,
00:54:36.240 | and again, my fellow colleagues.
00:54:38.940 | In speaking down to people at times,
00:54:42.240 | speaking from authority,
00:54:44.160 | it sounded like it dismissed the basic human experience
00:54:47.080 | or the basic common humanity of people
00:54:50.440 | in a way that like, it almost sounded like
00:54:52.820 | there's an agenda that's hidden
00:54:55.940 | behind the words the scientists spoke,
00:54:58.120 | like they're trying to, in a self-preserving way,
00:55:01.240 | control the population or something like that.
00:55:03.880 | I don't think any of that is true
00:55:05.440 | from the majority of the scientific community,
00:55:07.240 | but it sounded that way,
00:55:08.640 | and so the trust began to diminish,
00:55:11.280 | and I'm not sure how to fix that,
00:55:13.580 | except to be more authentic, be more real,
00:55:18.280 | acknowledge the uncertainties under which we operate,
00:55:20.960 | acknowledge the mistakes that scientists make,
00:55:24.320 | that institutions make.
00:55:26.440 | The leak from the lab is a perfect example,
00:55:29.380 | where we have imperfect systems
00:55:31.720 | that make all the progress we see in the world,
00:55:33.960 | and that being honest about that imperfection,
00:55:36.740 | I think is essential for forming trust,
00:55:39.400 | but I don't know what to make of it.
00:55:40.680 | It's been deeply disappointing,
00:55:43.700 | because I do think, just like you mentioned,
00:55:45.920 | the solutions require people to trust
00:55:49.840 | the institutions with their data.
00:55:54.160 | - Yeah, and I think part of the problem is,
00:55:57.280 | it seems to me as an outsider,
00:55:58.840 | that there was a bizarre unwillingness
00:56:01.040 | on the part of the CDC and other institutions
00:56:04.040 | to admit to, to frame, and to contextualize uncertainty.
00:56:09.040 | Maybe they had a patronizing idea
00:56:13.720 | that these people need to be told,
00:56:15.740 | and when they're told, they need to be told with authority
00:56:18.760 | and a level of definitiveness and certitude
00:56:21.720 | that doesn't actually exist.
00:56:23.520 | And so when they whipsaw on recommendations
00:56:26.600 | like what you should do about masks,
00:56:29.080 | you know, when the CDC is kind of at the very beginning
00:56:32.480 | of the pandemic saying, "Masks don't do anything,
00:56:35.320 | "don't wear them," when the real driver for that was,
00:56:39.120 | "We don't want these clowns going out
00:56:41.040 | "and depleting Amazon of masks,
00:56:42.800 | "because they may be needed in medical settings,
00:56:47.120 | "and we just don't know yet."
00:56:49.760 | I think a message that actually respected people and said,
00:56:53.320 | "This is why we're asking you not to do masks yet,
00:56:55.840 | "and there's more to be seen,"
00:56:58.080 | would be less whipsawing and would bring people,
00:57:01.240 | like they feel more like they're part of the conversation
00:57:04.280 | and they're being treated like adults
00:57:06.080 | than saying one day, definitively masks suck,
00:57:09.360 | and then X days later saying, "Nope, dammit, wear masks."
00:57:12.920 | And so I think framing things in terms of the probabilities,
00:57:16.020 | which are easy for most people to parse.
00:57:17.560 | I mean, a more recent example,
00:57:20.320 | which I just thought was batty,
00:57:22.240 | was suspending the Johnson & Johnson vaccine
00:57:24.960 | for a very low single-digit number of days
00:57:29.240 | in the United States, based on the fact that I believe
00:57:32.520 | there had been seven-ish clotting incidents
00:57:36.200 | in roughly seven million people
00:57:39.400 | who had had the vaccine administered,
00:57:42.280 | I believe one of which resulted in a fatality.
00:57:45.120 | And there was definitely suggestive data
00:57:47.740 | that indicated that there was a relationship.
00:57:50.060 | This wasn't just coincidental,
00:57:51.280 | because I think all of the clotting incidents happened
00:57:53.760 | in women as opposed to men,
00:57:55.420 | and kind of clustered in a certain age group.
00:57:58.180 | But does that call for shutting off the vaccine,
00:58:02.740 | or does it call for leveling with the American public
00:58:05.860 | and saying, "We've had one fatality out of seven million.
00:58:10.140 | "This is," let's just assume,
00:58:12.440 | "substantially less than the likelihood
00:58:15.000 | "of getting struck by lightning."
00:58:16.700 | Based on that information,
00:58:20.780 | and we're gonna keep you posted
00:58:22.140 | 'cause you can trust us to keep you posted,
00:58:24.340 | based on that information,
00:58:25.540 | please decide whether you're comfortable
00:58:27.100 | with the Johnson & Johnson vaccine.
00:58:29.180 | That would have been one response,
00:58:30.940 | and I think people would have been able to parse
00:58:32.780 | those simple bits of data and make their own judgment.
00:58:35.180 | By turning it off, all of a sudden,
00:58:38.180 | there's this dramatic signal to people
00:58:40.840 | who don't read all 900 words in the New York Times piece
00:58:44.220 | that explains why it's being turned off,
00:58:45.900 | but just see the headline, which is a majority of people.
00:58:48.620 | There's a sudden, like, oh my God, yikes,
00:58:51.620 | vaccine being shut off,
00:58:54.380 | and then all the people who sat on the fence,
00:58:56.720 | or are sitting on the fence,
00:58:57.820 | about whether or not they trust vaccines.
00:59:00.580 | That is gonna push an incalculable number of people.
00:59:03.300 | That's gonna be the last straw,
00:59:04.980 | for we don't know how many hundreds of thousands,
00:59:06.740 | or more likely millions of people,
00:59:08.140 | to say, "Okay, tipping point here.
00:59:10.300 | "I don't trust these vaccines."
00:59:11.520 | By pausing that for, whatever it was, 10 or 12 days,
00:59:14.360 | and then flipping the switch,
00:59:16.160 | as everybody who knew much about the situation
00:59:19.040 | knew was inevitable,
00:59:21.140 | by flipping the on switch 12 days later,
00:59:24.180 | you're conveying certitude J&J bad
00:59:27.280 | to certitude J&J good in a period of just a few days,
00:59:31.360 | and people just feel whipsawed,
00:59:32.920 | and they're not part of the analysis.
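To spell out the back-of-the-envelope arithmetic in that framing, here is a tiny sketch using the rough counts quoted above; the lightning-strike figure is an assumed ballpark (roughly one in a million per person per year is a commonly cited U.S. estimate) used only for scale, not a number from the conversation:

```python
# Rough risk framing for the J&J pause discussed above. Dose and event counts
# are the approximate figures mentioned in the conversation; the lightning
# figure is an assumed ballpark, used only for scale.
doses_administered = 7_000_000
clotting_events = 7
fatalities = 1

clotting_risk = clotting_events / doses_administered   # ~1 in 1,000,000 doses
fatality_risk = fatalities / doses_administered        # ~1 in 7,000,000 doses
lightning_risk_per_year = 1 / 1_000_000                # assumed ballpark

print(f"clotting risk per dose: ~1 in {round(1 / clotting_risk):,}")
print(f"fatality risk per dose: ~1 in {round(1 / fatality_risk):,}")
print(f"fatality risk vs. assumed lightning odds: "
      f"{fatality_risk / lightning_risk_per_year:.2f}x")
```

Under those assumptions the per-dose fatality risk comes out at roughly a seventh of the assumed annual lightning odds, which is the shape of the comparison being suggested.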
00:59:34.520 | - But it's not just the whipsawing,
00:59:36.480 | and I think about this quite a bit.
00:59:37.960 | I don't think I have good answers.
00:59:39.780 | It's something about the way
00:59:41.080 | the communication actually happens.
00:59:43.040 | Just, I don't know what it is about Anthony Fauci,
00:59:46.380 | for example, but I don't trust him.
00:59:48.800 | And I think that has to do,
00:59:51.480 | I mean, he has an incredible background.
00:59:55.900 | I'm sure he's a brilliant scientist and researcher.
00:59:59.120 | I'm sure he's also a great, like, inside the room,
01:00:03.240 | policymaker, and deliberator, and so on.
01:00:06.040 | But what makes a great leader is something about
01:00:11.040 | that thing that you can't quite describe,
01:00:14.000 | but being a communicator that you know you can trust,
01:00:19.000 | that there's an authenticity that's required.
01:00:23.080 | And I'm not sure, maybe I'm being a bit too judgmental,
01:00:26.880 | but I'm a huge fan of a lot of great leaders
01:00:29.560 | throughout history.
01:00:30.640 | They've communicated exceptionally well
01:00:34.040 | in the way that Fauci does not, and I think about that.
01:00:38.120 | I think about what is effective science communication.
01:00:40.520 | So, you know, great leaders throughout history
01:00:43.280 | did not necessarily need to be great science communicators.
01:00:47.080 | Their leadership was in other domains,
01:00:49.580 | but when you're fighting the virus,
01:00:50.960 | you also have to be a great science communicator.
01:00:53.760 | You have to be able to communicate uncertainties.
01:00:56.240 | You have to be able to communicate something like a vaccine
01:00:59.480 | that you're allowing inside your body into the messiness,
01:01:02.800 | into the complexity of the biological system,
01:01:05.440 | that if we're being honest, it's so complex,
01:01:07.540 | we'll never be able to really understand.
01:01:09.540 | We can only desperately hope that science
01:01:13.000 | can give us sort of a high likelihood
01:01:16.720 | that there's no short-term negative consequences,
01:01:20.440 | and that kind of intuition
01:01:21.920 | about long-term negative consequences,
01:01:23.920 | and doing our best in this battle against trillions
01:01:27.800 | of things that are trying to kill us.
01:01:31.020 | - I mean, being an effective communicator
01:01:34.040 | in that space is very difficult,
01:01:35.200 | but I think about what it takes,
01:01:37.440 | because I think there should be more science communicators
01:01:41.240 | that are effective at that kind of thing.
01:01:43.600 | Let me ask you about something
01:01:44.680 | that's sort of more in the AI space
01:01:46.800 | that I think about that kind of goes along this thread
01:01:51.800 | that you've spoken about,
01:01:55.240 | about democratizing the technology
01:01:58.560 | that could destroy human civilization,
01:02:00.740 | there's the amazing work from DeepMind, AlphaFold 2,
01:02:05.640 | which achieved incredible performance
01:02:09.520 | on the protein folding problem,
01:02:10.760 | single protein folding problem.
01:02:13.160 | Do you think about the use of AI in the synbio space?
01:02:17.680 | I think the gain-of-function
01:02:22.040 | in the virus-based research that you referred to,
01:02:24.480 | I think, is natural mutations,
01:02:27.960 | and sort of aggressively mutating the virus
01:02:31.280 | until you get one that is both contagious and deadly.
01:02:36.280 | But what about then using AI through simulation
01:02:42.820 | to be able to compute deadly viruses,
01:02:47.040 | or any kind of biological systems?
01:02:49.480 | Is this something you're worried about,
01:02:51.320 | or again, is this something you're more excited about?
01:02:53.720 | - I think computational biology
01:02:55.680 | is an unbelievably exciting and promising field.
01:02:58.680 | And I think when you're doing things in silico
01:03:00.440 | as opposed to in vivo, the dangers plummet.
01:03:05.440 | You don't have a critter that can leak from a leaky lab.
01:03:10.040 | So I don't see any problem with that,
01:03:11.560 | except I do worry about the data security dimension of it.
01:03:16.040 | Because if you were doing really, really interesting
01:03:18.840 | in silico gain of function research,
01:03:21.080 | and you hit upon, through a level of sophistication,
01:03:24.360 | we don't currently have, but synthetic biology
01:03:27.200 | is an exponential technology,
01:03:28.600 | so capabilities that are utterly out of reach today
01:03:32.160 | will be attainable in five or six years.
01:03:34.440 | I think if you conjured up worst-case genomes of viruses
01:03:40.760 | that don't exist in vivo anywhere,
01:03:44.080 | they're just in the computer space,
01:03:45.840 | but like, hey guys, this is the genetic sequence
01:03:48.720 | that would end the world, let's say.
01:03:51.080 | Then you have to worry about the utter hackability
01:03:55.520 | of every computer network we can imagine.
01:03:58.280 | I mean, data leaks from the least likely places
01:04:01.880 | on the grandest possible scales have happened
01:04:04.880 | and continue to happen,
01:04:06.040 | and will probably always continue to happen.
01:04:08.280 | And so that would be the danger of doing the work in silico.
01:04:11.720 | If you end up with a list of like,
01:04:13.280 | well, these are things we never want to see,
01:04:15.600 | that list leaks, and after the passage of some time,
01:04:19.200 | certainly couldn't be done today,
01:04:20.320 | but after the passage of some time,
01:04:22.880 | lots and lots of people in academic labs
01:04:25.840 | going all the way down to the high school level
01:04:27.840 | are in a position to, to put it overly simplistically,
01:04:31.360 | hit print on a genome and have the virus bearing that genome
01:04:36.080 | pop out on the other end,
01:04:37.240 | and you got something to worry about.
01:04:38.520 | But in general, computational biology, I think,
01:04:41.160 | is incredibly important,
01:04:42.320 | particularly because the crushing majority of work
01:04:45.480 | that people are doing with the protein folding problem
01:04:47.680 | and other things are about creating therapeutics,
01:04:50.840 | about creating things that will help us live better,
01:04:54.200 | live longer, thrive, be more well, and so forth.
01:04:57.760 | And the protein folding problem
01:04:59.520 | is a monstrous computational challenge
01:05:02.680 | that we seem to make just the most glacial progress on
01:05:05.960 | for years and years.
01:05:08.320 | But I think there's like a,
01:05:09.160 | there's a biennial competition, I think,
01:05:11.400 | at which people tackle the protein folding problem,
01:05:16.360 | and DeepMind's entrant, in both 2018 and 2020,
01:05:20.440 | ruled the field.
01:05:23.560 | And so, protein folding is an unbelievably important thing
01:05:27.360 | if you want to start thinking about therapeutics,
01:05:29.600 | because it's the folding of the protein
01:05:32.320 | that tells us where the channels and the receptors
01:05:34.700 | and everything else are on that protein,
01:05:37.040 | and it's from that precise model,
01:05:39.600 | if we can get to a precise model,
01:05:41.420 | that you can start barraging it, again in silico,
01:05:44.520 | with thousands, tens of thousands,
01:05:47.480 | millions of potential therapeutics
01:05:49.400 | and see what resolves the problems,
01:05:51.840 | the shortcomings that a misshapen protein causes,
01:05:56.040 | for instance, somebody with cystic fibrosis,
01:05:58.520 | how might we treat that?
01:05:59.720 | So, I see nothing but good in that.
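For a cartoon of what "barraging it in silico" can look like, here is a heavily simplified sketch; `predict_binding_score` and the compound names are made-up placeholders standing in for a real docking or machine-learned scoring model, not any specific tool's API:

```python
# Cartoon of virtual screening against a folded protein model: score many
# candidate molecules against a target and keep the most promising hits.
# `predict_binding_score` is a hypothetical placeholder, not a real tool.
import random

random.seed(42)
candidate_library = [f"compound_{i:06d}" for i in range(100_000)]

def predict_binding_score(compound, target):
    """Placeholder for a docking / ML scoring model (higher = better fit)."""
    return random.random()  # a real pipeline would use structure-based scoring

def screen(target, library, top_n=10):
    scored = ((predict_binding_score(c, target), c) for c in library)
    return sorted(scored, reverse=True)[:top_n]

hits = screen("CFTR_misfolded_model", candidate_library)
for score, compound in hits:
    print(f"{compound}: {score:.4f}")
```

A real pipeline would replace the random score with structure-based docking or a learned model and validate the top hits experimentally.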
01:06:01.760 | - Well, let me ask you about fear and hope in this world.
01:06:05.280 | I tend to believe that,
01:06:07.480 | that in terms of competence and malevolence,
01:06:13.800 | that people who are, maybe it's in my interactions,
01:06:17.360 | I tend to see that, first of all,
01:06:18.880 | I believe that most people are good,
01:06:21.120 | want to do good, and are just better at doing good
01:06:25.240 | and more inclined to do good on this world.
01:06:28.760 | And more than that, people who are malevolent
01:06:33.420 | are usually incompetent at building technology.
01:06:37.960 | So, I've seen this in my life,
01:06:40.300 | that people who are exceptionally good at stuff,
01:06:42.780 | no matter what the stuff is,
01:06:44.560 | tend to, maybe they discover joy in life
01:06:48.240 | in a way that gives them fulfillment
01:06:50.540 | and thereby does not result in them
01:06:53.280 | wanting to destroy the world.
01:06:55.020 | So, like the better you are at stuff,
01:06:57.180 | whether that's building nuclear weapons or plumbing,
01:06:59.880 | doesn't matter, both,
01:07:01.520 | the less likely you are to destroy the world.
01:07:03.700 | So, in that sense, with many technologies,
01:07:07.760 | the AI especially,
01:07:10.620 | I always think that the malevolent
01:07:14.640 | will be far outnumbered by the ultra competent.
01:07:18.400 | And in that sense, the defenses
01:07:21.000 | will always be stronger than the offense
01:07:25.000 | in terms of the people trying to destroy the world.
01:07:28.560 | Now, there's a few spaces where that might not be the case,
01:07:33.360 | and that's an interesting conversation,
01:07:34.840 | where this one person who's not very competent
01:07:38.440 | can destroy the whole world.
01:07:39.800 | Perhaps SynBio is one such space,
01:07:42.840 | because of the exponential effects of the technology.
01:07:47.280 | I tend to believe AI is not one of those such spaces,
01:07:50.800 | but do you share this kind of view
01:07:54.520 | that the ultra competent are usually also the good?
01:07:58.640 | - Yeah, absolutely.
01:07:59.760 | I absolutely share that,
01:08:01.080 | and that gives me a great deal of optimism
01:08:03.720 | that we will be able to short circuit the threat
01:08:06.820 | that malevolent SynBio could pose to us.
01:08:10.260 | But we need to start creating those defensive systems,
01:08:13.220 | or defensive layers, one of which we talked about,
01:08:15.820 | far, far, far better surveillance in order to prevail.
01:08:18.740 | So, the good guys will almost inevitably outsmart,
01:08:23.220 | and definitely outnumber the bad guys
01:08:26.020 | in most sort of smack downs that we can imagine.
01:08:29.580 | But the good guys aren't going to be able to exert
01:08:32.540 | their advantages unless they have the imagination
01:08:36.360 | necessary to think about what the worst possible thing
01:08:39.620 | can be done by somebody whose own psychology
01:08:42.620 | is completely alien to their own.
01:08:45.020 | So, that's a tricky, tricky thing to solve for.
01:08:47.460 | Now, in terms of whether the asymmetric power
01:08:51.740 | that a bad guy might have in the face of the overwhelming
01:08:55.340 | numerical advantage and competence advantage
01:08:57.700 | that the good guys have,
01:08:59.620 | unfortunately I look at something like
01:09:01.220 | mass shootings as an example.
01:09:04.100 | I'm sure the guy who was responsible for the Vegas shooting,
01:09:07.680 | or the Orlando shooting, or any other shooting
01:09:09.680 | that we can imagine, didn't know a whole lot
01:09:12.520 | about ballistics.
01:09:14.040 | And the number of good guy citizens in the United States
01:09:18.800 | with guns compared to bad guy citizens,
01:09:20.920 | I'm sure is a crushingly, overwhelmingly high ratio
01:09:24.120 | in favor of the good guys.
01:09:25.640 | But that doesn't make it possible
01:09:27.600 | for us to stop mass shootings.
01:09:30.320 | An example, Fort Hood,
01:09:33.720 | 45,000 trained soldiers on that base,
01:09:37.040 | yet there've been two mass shootings there.
01:09:39.720 | And so, there is an asymmetry when you have
01:09:43.880 | powerful and lethal technology that gets so democratized
01:09:48.880 | and so proliferated in tools that are very, very easy
01:09:53.660 | to use, even by a knucklehead.
01:09:56.180 | When those tools get really easy to use by a knucklehead
01:09:58.840 | and they're really widespread,
01:10:00.720 | it becomes very, very hard to defend against
01:10:03.680 | all instances of usage.
01:10:06.160 | Now, the good news, quote unquote, about mass shootings,
01:10:08.920 | if there is any, and there is some,
01:10:10.800 | is even the most brutal and carefully planning
01:10:15.240 | and well-armed mass shooter can only take so many victims.
01:10:19.880 | And the same is true, there's been four instances
01:10:23.400 | that I'm aware of, of commercial pilots committing suicide
01:10:26.660 | by downing their planes and taking all their passengers
01:10:28.700 | with them.
01:10:29.540 | These weren't Boeing engineers,
01:10:31.560 | but even an army of Boeing engineers
01:10:33.160 | ultimately was not capable of preventing that.
01:10:36.040 | But even in their case,
01:10:37.560 | and I'm actually not counting 9/11 in that,
01:10:39.320 | 9/11's a different category in my mind,
01:10:42.160 | these are just personally suicidal pilots.
01:10:44.760 | In those cases, they only have a plane load of people
01:10:48.040 | that they're able to take with them.
01:10:50.200 | If we imagine a highly plausible and imaginable future
01:10:54.520 | in which some bio-tools that are amoral,
01:10:57.780 | that could be used for good or for ill,
01:10:59.820 | start embodying unbelievable sophistication and genius
01:11:04.820 | in the tool, in the easier and easier and easier
01:11:08.760 | to make tool, all those thousands, tens of thousands,
01:11:12.960 | hundreds of thousands of scientist years
01:11:15.200 | start getting embodied in something that may be as simple
01:11:18.640 | as hitting a print button,
01:11:20.020 | then that good guy technology can be hijacked
01:11:25.960 | by a bad person and used in a very asymmetric way.
01:11:29.720 | - See, I think what happens though,
01:11:31.400 | as you go to the high school student
01:11:34.000 | from the current very specific set of labs
01:11:36.720 | that are able to do it,
01:11:37.560 | as it becomes more and more democratized,
01:11:41.120 | as it becomes easier and easier to do
01:11:43.720 | this kind of large-scale damage with an engineered virus,
01:11:48.720 | the more and more there will be engineering of defenses
01:11:52.120 | against these systems,
01:11:53.120 | as some of the things we talked about
01:11:54.360 | in terms of testing, in terms of collection of data,
01:11:56.680 | but also in terms of, like, at-scale contact tracing
01:12:01.280 | or also engineering of vaccines,
01:12:03.920 | like in a matter of like days, maybe hours, maybe minutes.
01:12:08.240 | So like, I just, I feel like the defenses,
01:12:11.480 | that's what human species seems to do,
01:12:13.520 | is like we keep hitting the snooze button
01:12:16.360 | until there's like a storm on the horizon
01:12:20.680 | heading towards us,
01:12:21.920 | then we start to quickly build up the defenses
01:12:25.280 | or the response that's proportional
01:12:28.480 | to the scale of the storm.
01:12:31.440 | Of course, again, certain kinds of exponential threats
01:12:35.080 | require us to build up the defenses
01:12:37.680 | way earlier than we usually do.
01:12:40.560 | And that's, I guess, the question.
01:12:42.040 | But I ultimately am hopeful that the natural process
01:12:46.600 | of hitting the snooze button
01:12:47.720 | until the deadline is right in front of us
01:12:50.680 | will work out for quite a long time for us humans.
01:12:53.080 | - And I fully agree.
01:12:54.200 | I mean, that's why I'm fundamentally,
01:12:56.040 | I may not sound like it thus far,
01:12:57.400 | but I'm fundamentally very, very optimistic
01:13:00.320 | about our ability to short circuit this threat
01:13:02.600 | because there is, again, I'll stress,
01:13:06.160 | the technological feasibility
01:13:08.280 | and the profound affordability
01:13:10.360 | of a relatively simple set of steps
01:13:12.360 | that we can take to preclude it,
01:13:13.640 | but we do have to take those steps.
01:13:16.200 | And so, what I'm hoping to do and trying to do
01:13:19.440 | is inject a notion of what those steps are
01:13:22.680 | into the public conversation and do my small part
01:13:25.040 | to up the odds that that actually ends up happening.
01:13:27.800 | The danger with this one is it is exponential.
01:13:33.440 | And I think that our minds fundamentally struggle
01:13:38.080 | to understand exponential math.
01:13:40.120 | It's just not something we're wired for.
01:13:42.360 | Our ancestors didn't confront exponential processes
01:13:45.800 | when they were growing up on the savanna.
01:13:47.560 | So, it's not something that's intuitive to us
01:13:49.440 | and our intuitions are reliably defeated
01:13:52.000 | when exponential processes come along.
01:13:53.920 | So, that's issue number one.
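As a purely illustrative worked example of that point, with numbers chosen only for arithmetic convenience: a pathogen whose case count doubles every three days grows as

```latex
% Illustrative doubling arithmetic, starting from a single case.
N(t) = 2^{\,t/3}, \qquad N(30) = 2^{10} \approx 10^{3}, \qquad N(60) = 2^{20} \approx 10^{6},
```

so one case becomes roughly a thousand in a month and roughly a million in two, which is exactly the kind of jump that linear intuition misses.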
01:13:55.880 | And issue number two with something like this
01:13:58.760 | is it kind of only takes one.
01:14:02.760 | That ball only has to go into the net once
01:14:05.520 | and we're doomed, which is not the case with mass shooters.
01:14:08.920 | It's not the case with commercial pilots running amok.
01:14:12.200 | It's not the case with really any threat
01:14:15.200 | that I can think of with the exception of nuclear war
01:14:18.640 | that has the one bad outcome and game over.
01:14:23.640 | And that means that we need to be unbelievably serious
01:14:27.880 | about these defenses and we need to do things
01:14:31.520 | that might on the surface seem like a tremendous overreaction
01:14:35.440 | so that we can be prepared to nip anything
01:14:38.320 | that comes along in the bud.
01:14:39.560 | But I, like you, believe that's eminently doable.
01:14:43.600 | I, like you, believe that the good guys outnumber
01:14:45.920 | the bad guys in this particular one to a degree
01:14:48.020 | that probably has no precedent in history.
01:14:50.160 | I mean, even the worst, worst people, I'm sure, in ISIS,
01:14:53.900 | even Osama bin Laden, even any bad guy you could imagine
01:14:57.560 | in history would be revolted by the idea
01:15:01.080 | of exterminating all of humanity.
01:15:02.760 | I mean, that's a low bar.
01:15:05.680 | And so, the good guys completely outnumber the bad guys
01:15:09.960 | when it comes to this.
01:15:10.980 | But the asymmetry and the fact that one catastrophic error
01:15:15.980 | could lead to unbelievably consequential things
01:15:19.000 | is what worries me here.
01:15:19.940 | But I, too, am very optimistic.
01:15:21.840 | - The thing that I sometimes worry about is the fact
01:15:25.600 | that we haven't seen overwhelming evidence
01:15:27.920 | of alien civilizations out there.
01:15:29.820 | Makes me think, well, there's a lot of explanations,
01:15:33.800 | but one of them that worries me is that
01:15:35.700 | whenever they get smart, they just destroy themselves.
01:15:40.360 | - Oh, yeah.
01:15:41.200 | I mean, the most fascinating,
01:15:43.440 | the most fascinating and chilling number
01:15:46.840 | or variable in the Drake equation is L.
01:15:49.120 | At the end of it, you look out and you see
01:15:51.920 | one to 400 billion stars in the Milky Way galaxy,
01:15:56.300 | and we now know because of Kepler
01:15:58.480 | that an astonishingly high percentage of them
01:16:00.920 | probably have habitable planets.
01:16:03.100 | And so, all the things that were unknowns
01:16:06.120 | when the Drake equation was originally written,
01:16:08.080 | like how many stars have planets,
01:16:10.440 | actually back then in the 1960s
01:16:12.320 | when the Drake equation came along,
01:16:14.080 | the consensus amongst astronomers
01:16:15.800 | was that it would be a small minority of stars
01:16:17.960 | that had planets.
01:16:19.480 | But now we know it's substantially all of them.
01:16:21.880 | How many of those stars have planets in the habitable zone?
01:16:25.800 | It's kind of looking like 20%, like, oh my God.
01:16:29.360 | And so, L, which is how long does a civilization,
01:16:33.000 | once it reaches technological competence,
01:16:35.920 | continues to last, that's the doozy.
01:16:39.040 | And you're right.
01:16:41.720 | It's all too plausible to think
01:16:44.760 | that when a civilization reaches a level of sophistication
01:16:47.620 | that's probably just a decade or three in our future,
01:16:50.560 | the odds of it self-destructing
01:16:52.920 | just start mounting astronomically, no pun intended.
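For reference, since its terms come up throughout this exchange, the Drake equation is usually written as

```latex
% Drake equation: expected number of currently detectable civilizations in the galaxy.
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
```

where R_* is the galactic star-formation rate, f_p the fraction of stars with planets (now thought to be close to 1, as noted above), n_e the number of habitable-zone planets per star with planets, f_l, f_i, and f_c the fractions that go on to develop life, intelligence, and detectable technology, and L the lifetime of a technological civilization, the variable the rest of this discussion turns on.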
01:16:57.400 | - My hope is that, actually,
01:17:00.120 | there is a lot of alien civilizations out there,
01:17:01.920 | and what they figure out in order to avoid
01:17:04.960 | the self-destruction, they need to turn off
01:17:07.760 | the thing that was useful, that used to be a feature
01:17:10.480 | and now became a bug, which is the desire to colonize,
01:17:13.520 | to conquer more land.
01:17:15.160 | So there's probably ultra-intelligent
01:17:17.480 | alien civilizations out there that are just chilling,
01:17:20.160 | like on the beach with whatever
01:17:22.640 | your favorite alcohol beverage is,
01:17:24.680 | but without trying to conquer everything,
01:17:27.880 | just chilling out and maybe exploring
01:17:31.480 | in the realm of knowledge,
01:17:33.520 | but almost appreciating existence
01:17:38.520 | for its own sake versus life
01:17:42.640 | as a progression of conquering of other life,
01:17:47.640 | like this kind of predator-prey formulation
01:17:50.280 | that resulted in us humans,
01:17:53.720 | perhaps as something we have to shed in order to survive.
01:17:57.320 | I don't know.
01:17:58.560 | - Yeah, that is a very plausible solution
01:18:02.160 | to Fermi's paradox, and it's one that makes sense.
01:18:04.880 | When we look at our own lives
01:18:06.840 | and our own arc of technological trajectory,
01:18:11.600 | it's very, very easy to imagine that
01:18:13.640 | in an intermediate future world of flawless VR
01:18:18.480 | or flawless whatever kind of simulation
01:18:22.120 | that we wanna inhabit, it will just simply cease
01:18:25.800 | to be worthwhile to go out and expand
01:18:30.240 | our interstellar territory.
01:18:33.960 | But if we were going out and conquering
01:18:36.480 | interstellar territory, it wouldn't necessarily
01:18:38.200 | have to be predator or prey.
01:18:39.280 | I can imagine a benign but sophisticated intelligence
01:18:43.200 | saying, "Well, we're gonna go to places.
01:18:44.520 | "We're gonna go to places that we can terraform."
01:18:46.640 | We'd use a different word than terra, obviously,
01:18:48.360 | but that we can turn into something habitable
01:18:50.280 | for our particular physiology,
01:18:53.200 | so long as they don't house intelligent,
01:18:56.240 | sentient creatures that would suffer from our invasion.
01:18:59.640 | But it is easy to see a sophisticated,
01:19:02.640 | intelligent species evolving to the point
01:19:04.800 | where interstellar travel with its incalculable expense
01:19:08.800 | and physical hurdles just isn't worth it
01:19:10.760 | compared to what could be done where one already is.
01:19:14.920 | - So you talked about diagnostics at scale
01:19:18.080 | as a possible solution to future pandemics.
01:19:22.520 | What about another possible solution,
01:19:24.160 | which is kind of creating a backup copy?
01:19:27.480 | I'm actually now putting together a NAS
01:19:30.840 | for a backup for myself for the first time,
01:19:32.640 | taking backup of data seriously.
01:19:35.000 | But if we were to take the backup
01:19:36.920 | of human consciousness seriously
01:19:38.520 | and try to expand throughout the solar system
01:19:41.600 | and colonize other planets,
01:19:43.240 | do you think that's an interesting solution,
01:19:47.000 | one of many, for protecting human civilizations
01:19:50.880 | from self-destruction, humans becoming
01:19:53.360 | a multi-planetary species?
01:19:54.840 | - Oh, absolutely.
01:19:55.800 | I mean, I find it electrifying, first of all,
01:19:57.800 | so I've got a little bit of a personal bias.
01:19:59.400 | When I was a kid, I thought there was nothing cooler
01:20:01.680 | than rockets, I thought there was nothing cooler than NASA,
01:20:04.600 | I thought there was nothing cooler
01:20:05.960 | than people walking on the moon.
01:20:07.920 | And as I grew up, I thought there was nothing more tragic
01:20:11.320 | than the fact that we went from walking on the moon
01:20:13.400 | to at best getting to something like suborbital altitude.
01:20:17.120 | And just, I found that more and more depressing
01:20:20.280 | with the passage of decades at just the colossal expense
01:20:24.960 | of manned space travel and the fact that it seemed
01:20:28.760 | that we were unlikely to ever get back to the moon,
01:20:30.720 | let alone Mars.
01:20:31.720 | So I have a boundless appreciation for Elon Musk
01:20:35.440 | for many reasons, but the fact that he has put Mars
01:20:37.920 | on the credible agenda is one of the things
01:20:41.000 | that I appreciate immensely.
01:20:43.000 | So there's just this sort of space nerd in me
01:20:45.080 | that just says, "God, that's cool."
01:20:47.280 | But on a more practical level, we were talking about
01:20:51.800 | potentially inhabiting planets that aren't our own,
01:20:56.560 | and we're thinking about a benign civilization
01:20:59.080 | that would do that in planetary circumstances
01:21:04.080 | where we're not causing other conscious systems to suffer.
01:21:07.560 | I mean, Mars is a place that's very promising.
01:21:09.600 | There may be microbial life there, and I hope there is,
01:21:12.240 | and if we found it, I think it would be electrifying.
01:21:14.840 | But I think ultimately, the moral judgment would be made
01:21:18.680 | that the continued thriving of that microbial life
01:21:22.480 | is of less concern than creating a habitable planet
01:21:26.440 | to humans, which would be a project
01:21:28.040 | on the many thousands of years scale.
01:21:30.560 | But I don't think that that would be a greatly immoral act.
01:21:34.320 | And if that happened, and if Mars became home
01:21:37.560 | to a self-sustaining group of humans
01:21:40.160 | that could survive a catastrophic mistake here on Earth,
01:21:43.320 | then yeah, the fact that we have a backup colony is great.
01:21:45.960 | And if we could make more, I'm sorry, not backup colony,
01:21:48.360 | backup copy is great.
01:21:50.120 | And if we could make more and more such backup copies
01:21:53.000 | throughout the solar system by hollowing out asteroids
01:21:56.360 | and whatever else it is, maybe even Venus,
01:21:58.440 | we could get rid of 3/4 of its atmosphere
01:22:00.960 | and turn it into a tropical paradise.
01:22:03.400 | I think all of that is wonderful.
01:22:05.700 | Now, whether we can make the leap from that
01:22:07.520 | to interstellar transportation
01:22:10.260 | with the incredible distances that are involved,
01:22:13.180 | I think that's an open question.
01:22:15.880 | But I think if we ever do that,
01:22:17.400 | it would be more like the Pacific Ocean's
01:23:22.400 | channel of human expansion than the Atlantic Ocean's.
01:22:27.880 | And so what I mean by that is,
01:22:30.720 | when we think about European society
01:22:33.640 | transmitting itself across the Atlantic,
01:22:36.120 | it's these big, ambitious, crazy, expensive,
01:22:40.780 | one-shot expeditions like Columbus's
01:22:43.680 | to make it across this enormous expanse,
01:22:46.000 | at least initially, without any certainty
01:22:48.320 | that there's land on the other end, right?
01:22:50.160 | So that's kind of how I view our space program,
01:22:53.440 | is like big, very conscious, deliberate efforts
01:22:56.920 | to get from point A to point B.
01:22:58.440 | If you look at how Pacific Islanders transmitted
01:23:03.440 | their descendants and their culture and so forth
01:23:07.420 | throughout Polynesia and beyond,
01:23:09.500 | it was much more inhabiting a place,
01:23:13.880 | getting to the point where there were people
01:23:15.920 | who were ambitious or unwelcome enough
01:23:18.440 | to decide it's time to go off-island
01:23:20.200 | and find the next one and pray to find the next one.
01:23:22.880 | That method of transmission didn't happen
01:23:25.560 | in a single swift year,
01:23:28.360 | but it happened over many, many centuries.
01:23:30.520 | And it was like going from this island to that island
01:23:32.920 | and probably for every expedition
01:23:34.620 | that went out to seek another island
01:23:36.060 | and actually lucked out and found one,
01:23:37.800 | God knows how many were lost at sea.
01:23:39.860 | But that form of transmission took place
01:23:41.760 | over a very long period of time.
01:23:43.400 | And I could see us perhaps going
01:23:46.920 | from the inner solar system to the outer solar system,
01:23:49.260 | to the Kuiper belt, to the Oort cloud.
01:23:51.800 | There's theories that there might be planets out there
01:23:55.800 | that are not anchored to stars,
01:23:57.240 | like kind of hop, hop, slowly transmitting ourselves.
01:24:00.840 | At some point, we're actually in Alpha Centauri.
01:24:03.480 | But I think that kind of backup copy
01:24:06.320 | and transmission of our physical presence
01:24:08.400 | and our culture to a diversity of extraterrestrial outposts
01:24:13.400 | is a really exciting idea.
01:24:15.800 | - I really never thought about that
01:24:17.120 | because I have thought,
01:24:18.800 | my thinking about space exploration
01:24:21.800 | has been very Atlantic Ocean centric
01:24:23.840 | in a sense that there'll be one program with NASA
01:24:26.280 | and maybe private Elon Musk, SpaceX, or Jeff Bezos and so on.
01:24:31.280 | But it's true that with the help of Elon Musk,
01:24:35.120 | making it cheaper and cheaper and more effective
01:24:36.940 | to create these technologies,
01:24:38.860 | where you could go into deep space,
01:24:41.440 | perhaps the way we actually colonize the solar system
01:24:45.280 | and expand out into the galaxy
01:24:50.520 | is basically just like these renegade ships of weirdos.
01:24:55.520 | They're just kind of like,
01:24:59.240 | most of them like quote unquote homemade,
01:25:02.280 | but they just kind of venture out into space
01:25:04.240 | and just like the initial Android model
01:25:08.200 | of millions of these little ships just flying out,
01:25:11.420 | most of them die off in horrible accidents,
01:25:14.900 | but some of them will persist
01:25:17.420 | or there'll be stories of them persisting
01:25:19.680 | and over a period of decades and centuries,
01:25:21.840 | there'll be other attempts,
01:25:23.560 | almost always as a response to the main set of efforts.
01:25:27.200 | That's interesting.
01:25:28.040 | - Yeah.
01:25:28.880 | - 'Cause you kind of think of Mars colonization
01:25:30.560 | as the big NASA Elon Musk effort of a big colony,
01:25:35.240 | but maybe the successful one would be,
01:25:37.840 | like a decade after that,
01:25:39.480 | there'll be like a ship from like some kid,
01:25:42.520 | some high school kid who gets together a large team
01:25:45.160 | and does something probably illegal
01:25:47.000 | and launches something where they end up
01:25:49.280 | actually persisting quite a bit.
01:25:51.320 | And from that learning lessons
01:25:53.680 | that nobody ever gave permission for,
01:25:55.640 | but somehow actually flourish
01:25:58.240 | and then take that into the scale of centuries forward
01:26:02.680 | into the rest of space.
01:26:04.600 | That's really interesting.
01:26:05.560 | - Yeah, I think the giant steps
01:26:07.800 | are likely to be NASA-like efforts.
01:26:09.440 | Like there is no intermediate rock,
01:26:11.200 | well, I guess it's the moon,
01:26:12.120 | but even getting to the moon ain't that easy
01:26:14.080 | between us and Mars, right?
01:26:15.200 | So like the giant steps, the big hubs,
01:26:18.160 | like the O'Hare airports of the future
01:26:20.560 | probably will be very deliberate efforts,
01:26:23.360 | but then you would have, I think,
01:26:25.480 | that kind of diffusion
01:26:28.080 | as space travel becomes more democratized and more capable.
01:26:31.160 | You'll have this sort of natural diffusion
01:26:33.200 | of people who kind of want to be off grid
01:26:35.440 | or think they can make a fortune there,
01:26:37.000 | you know, the kind of mentality
01:26:38.000 | that drove people to San Francisco.
01:26:39.280 | I mean, San Francisco was not populated
01:26:41.600 | as a result of a King Ferdinand and Isabella-like effort
01:26:45.360 | to fund Columbus going over.
01:26:46.880 | It was just a whole bunch of people
01:26:48.200 | making individual decisions that there's gold
01:26:51.120 | in them thar hills and I'm gonna go out
01:26:52.560 | and get a piece of it.
01:26:53.400 | So I could see that kind of diffusion.
01:26:55.040 | What I can't see, and the reason that I think
01:26:57.400 | this Pacific model of transmission is more likely,
01:27:00.240 | is I just can't see a NASA-like effort
01:27:03.120 | to go from Earth to Alpha Centauri.
01:27:06.240 | It's just too far.
01:27:08.720 | I just see lots and lots and lots
01:27:10.880 | of relatively tiny steps between now and there.
01:27:14.720 | And the fact is that there are large chunks of matter
01:27:18.480 | going at least a light year beyond the sun.
01:27:21.080 | I mean, the Oort cloud, I think,
01:27:22.440 | extends at least a light year beyond the sun.
01:27:25.240 | And then maybe there are these untethered planets after that.
01:27:28.240 | We won't really know till we get there.
01:27:30.260 | And if our Oort cloud goes out a light year
01:27:32.520 | and Alpha Centauri's Oort cloud goes out a light year,
01:27:35.320 | you've already cut in half the distance.
01:27:37.640 | You know, so who knows?
01:27:38.800 | But yeah.
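Taking the rough figures in that exchange at face value, with Alpha Centauri about 4.37 light-years away and each Oort cloud assumed to reach roughly one light-year out (itself a loose estimate):

```latex
% Back-of-the-envelope only; Oort cloud extents are highly uncertain.
d_{\text{gap}} \approx 4.37 - (1 + 1) \approx 2.4\ \text{ly} \approx 0.55 \times 4.37\ \text{ly},
```

so the stretch with no loosely bound material at all would be only a bit more than half the total distance, which is the sense in which the two Oort clouds roughly cut it in half.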
01:27:39.640 | - One of the possibilities, probably the cheapest
01:27:42.120 | and most effective way to create interesting
01:27:45.040 | interstellar spacecraft is ones
01:27:49.600 | that are powered and driven by AI.
01:27:51.840 | And you could think of,
01:27:52.920 | here's where you have high school students be able to build
01:27:57.080 | a sort of a HAL 9000 version, the modern version of that.
01:28:02.080 | And it's kind of interesting to think about these robots
01:28:06.440 | traveling out throughout, perhaps sadly,
01:28:11.440 | long after human civilization is gone,
01:28:14.440 | there'll be these intelligent robots flying throughout space
01:28:18.600 | and perhaps land on Alpha Centauri B
01:28:21.600 | or any of those kinds of planets
01:28:23.560 | and colonize sort of,
01:28:26.520 | humanity continues through the proliferation
01:28:33.720 | of our creations, like robotic creations
01:28:37.760 | that have some echoes of that intelligence.
01:28:41.200 | Hopefully also the consciousness.
01:28:43.480 | Does that make you sad the future where AGI,
01:28:47.520 | super intelligent or just mediocre intelligent AI systems
01:28:51.580 | outlive humans?
01:28:54.360 | - Yeah, I guess it depends on the circumstances
01:28:57.080 | in which they outlive humans.
01:28:58.400 | So let's take the example that you just gave.
01:29:01.160 | We send out, you know, very sophisticated AGIs
01:29:05.080 | on simple rocket ships, relatively simple ones
01:29:08.280 | that don't have to have all the life support
01:29:10.280 | necessary for humans.
01:29:11.520 | And therefore they're of trivial mass
01:29:14.200 | compared to a crewed ship, a generation ship.
01:29:17.320 | And therefore they're way more likely to happen.
01:29:19.560 | So let's use that example.
01:29:21.080 | And let's say that they travel to distant planets
01:29:23.980 | at a speed that's not much faster
01:29:26.300 | than what a chemical rocket can achieve.
01:29:27.920 | And so it's inevitably tens, hundreds of thousands of years
01:29:30.660 | before they make landfall someplace.
01:29:32.600 | So let's imagine that's going on.
01:29:34.120 | And meanwhile, we die for reasons that have nothing to do
01:29:39.120 | with those AGIs diffusing throughout the solar system,
01:29:42.320 | whether it's through climate change, nuclear war,
01:29:45.320 | synbio, rogue synbio, whatever.
01:29:47.280 | In that kind of scenario, the notion of the AGIs
01:29:50.000 | that we created outlasting us is very reassuring
01:29:53.040 | because it says that like we ended,
01:29:56.600 | but our descendants are out there.
01:29:59.240 | And hopefully some of them make landfall
01:30:01.120 | and create some echo of who we are.
01:30:02.800 | So that's a very optimistic one.
01:30:04.760 | Whereas the Terminator scenario of a super AGI
01:30:09.200 | arising on earth and getting let out of its box
01:30:12.760 | due to some boo-boo on the part of its creators
01:30:15.760 | who do not have super intelligence,
01:30:17.800 | and then deciding that for whatever reason
01:30:20.520 | it doesn't have any need for us to be around
01:30:22.560 | and exterminating us, that makes me feel crushingly sad.
01:30:26.160 | I mean, look, I was sad when my elementary school
01:30:29.880 | was shut down and bulldozed,
01:30:31.500 | even though I hadn't been a student there for decades.
01:30:34.920 | The thought of my hometown getting disbanded
01:30:37.960 | is even worse, the thought of my home state of Connecticut
01:30:41.500 | getting disbanded and like absorbed into Massachusetts
01:30:44.280 | is even worse.
01:30:45.120 | The notion of humanity is just crushingly,
01:30:47.440 | crushingly sad to me.
01:30:48.520 | - So you hate goodbyes?
01:30:50.240 | - I, certain goodbyes, yes.
01:30:52.840 | Some goodbyes are really, really liberating, but yes.
01:30:55.960 | - Well, but what if the Terminators, you know,
01:31:00.680 | have consciousness and enjoy the hell out of life as well?
01:31:05.580 | They're just better at it.
01:31:07.520 | - Yeah, well, the have consciousness is a really key element.
01:31:11.100 | And so there's no reason to be certain
01:31:15.660 | that a super intelligence would have consciousness.
01:31:19.640 | We don't know that factually at all.
01:31:21.120 | And so what is a very lonely outcome to me
01:31:23.960 | is the rise of a super intelligence
01:31:25.960 | that has a certain optimization function
01:31:28.460 | that it's either been programmed with
01:31:30.860 | or that arises emergently that says,
01:31:33.580 | hey, I want to do this thing for which humans
01:31:36.660 | are either an unacceptable risk,
01:31:38.240 | their presence is either an unacceptable risk
01:31:40.220 | or they're just collateral damage.
01:31:42.160 | But there is no consciousness there.
01:31:44.260 | Then the idea of the light of consciousness
01:31:46.980 | being snuffed out by something that is very competent
01:31:50.840 | but has no consciousness is really, really sad.
01:31:54.380 | - Yeah, but I tend to believe that it's almost impossible
01:31:57.180 | to create a super intelligent agent
01:31:58.780 | that can destroy human civilization
01:32:00.480 | without it being conscious.
01:32:01.740 | It's like those are coupled.
01:32:03.820 | Like you have to, in order to destroy humans
01:32:07.460 | or supersede humans, you really have to be accepted
01:32:12.460 | by humans.
01:32:13.640 | I think this idea that you can build systems
01:32:16.220 | that destroy human civilization
01:32:20.020 | without them being deeply integrated
01:32:21.980 | into human civilization is impossible.
01:32:23.700 | And for them to be integrated, they have to be human-like,
01:32:27.500 | not just in body and form,
01:32:29.060 | but in all the things that we value as humans,
01:32:32.700 | one of which is consciousness.
01:32:34.540 | The other one is just ability to communicate.
01:32:36.880 | The other one is poetry and music and beauty
01:32:38.940 | and all those things.
01:32:40.020 | Like they have to be all of those things.
01:32:43.460 | I mean, this is what I think about.
01:32:45.260 | It does make me sad, but it's letting go,
01:32:48.580 | which is they might be just better
01:32:52.660 | at everything we appreciate than us.
01:32:55.140 | And that's sad.
01:32:56.180 | And hopefully they'll keep us around.
01:32:58.940 | But I think it's a kind of,
01:33:03.220 | it is a kind of goodbye to realizing
01:33:07.080 | that we're not the most special species on Earth anymore.
01:33:10.640 | That's still painful.
01:33:11.920 | - It's still painful.
01:33:12.880 | And in terms of whether such a creation
01:33:16.680 | would have to be conscious, let's say, I'm not so sure.
01:33:20.080 | I mean, let's imagine something
01:33:22.920 | that can pass the Turing test.
01:33:25.120 | That something that passes the Turing test
01:33:26.760 | could, over text-based interaction in any event,
01:33:30.640 | successfully mimic a very conscious intelligence
01:33:34.220 | on the other end, but just be completely unconscious.
01:33:37.140 | So that's a possibility.
01:33:38.860 | And if you take that up a radical step,
01:33:40.860 | which I think can be permitted
01:33:42.900 | if we're thinking about superintelligence,
01:33:45.660 | you could have something that could reason its way
01:33:48.300 | through this is my optimization function.
01:33:51.380 | And in order to get to it,
01:33:53.300 | I've got to deal with these messy, somewhat illogical things
01:33:56.100 | that are as intelligent in relation to me
01:33:58.660 | as they are intelligent in relation to ants.
01:34:01.440 | I can trick them, manipulate them, whatever.
01:34:03.960 | And I know the resources I need.
01:34:05.300 | I know I need this amount of power.
01:34:07.360 | I need to seize control of these manufacturing resources
01:34:11.280 | that are robotically operated.
01:34:13.280 | I need to improve those robots with software upgrades
01:34:15.760 | and then ultimately mechanical upgrades,
01:34:17.760 | which I can affect through X, Y, and Z.
01:34:20.160 | That doesn't, you know, that could still be a thing
01:34:22.520 | that passes the Turing test.
01:34:24.400 | I don't think it's necessarily certain
01:34:27.040 | that that optimization function,
01:34:30.460 | you know, maximizing entity would be conscious.
01:34:35.460 | - So this is from a very engineering perspective
01:34:39.120 | because I think a lot about natural language processing,
01:34:43.220 | all those kind of, very,
01:34:44.800 | I'm speaking to a very specific problem
01:34:47.320 | of just say the Turing test.
01:34:48.960 | I really think that something like consciousness
01:34:52.240 | is required, when you say reasoning,
01:34:54.940 | you're separating that from consciousness.
01:34:56.640 | But I think consciousness is part of reasoning
01:34:59.560 | in the sense that you will not be able
01:35:03.240 | to become super intelligent in the way
01:35:06.120 | that it's required to be part of human society
01:35:09.840 | without having consciousness.
01:35:11.100 | Like I really think it's impossible
01:35:13.200 | to separate the consciousness thing.
01:35:14.720 | But it's hard to define consciousness
01:35:17.120 | when you just use that word.
01:35:18.280 | But even just like the capacity,
01:35:20.600 | the way I think about consciousness
01:35:22.720 | is the important symptoms
01:35:25.600 | or maybe consequences of consciousness,
01:35:27.840 | one of which is the capacity to suffer.
01:35:31.400 | I think AI will need to be able to suffer
01:35:34.760 | in order to become super intelligent,
01:35:37.320 | to feel the pain, the uncertainty, the doubt.
01:35:40.480 | The other part of that is not just the suffering,
01:35:42.700 | but the ability to understand that it too is mortal
01:35:47.700 | in the sense that it has a self-awareness
01:35:51.880 | about its presence in the world,
01:35:53.760 | understand that it's finite,
01:35:55.880 | and be terrified of that finiteness.
01:35:58.720 | I personally think that's a fundamental part
01:36:00.560 | of the human condition is this fear of death
01:36:02.900 | that most of us construct an illusion around.
01:36:04.880 | But I think AI would need to be able
01:36:06.800 | to really have it part of its whole essence.
01:36:11.800 | Like every computation, every part of the thing
01:36:15.260 | that generates, that does both the perception
01:36:17.920 | and generates the behavior will have to have,
01:36:20.900 | I don't know how this is accomplished,
01:36:22.940 | but I believe it has to truly be terrified of death,
01:36:27.840 | truly have the capacity to suffer,
01:36:30.320 | and from that, something that will be recognized
01:36:32.800 | to us humans as consciousness would emerge.
01:36:35.120 | Whether it's the illusion of consciousness, I don't know.
01:36:37.680 | The point is it looks a whole hell of a lot
01:36:40.180 | like consciousness to us humans,
01:36:41.880 | and I believe that AI, when you ask it,
01:36:45.920 | will also say that it is conscious.
01:36:48.900 | You know, in the full sense that we say
01:36:50.840 | that we're conscious.
01:36:52.200 | And all of that, I think, is fully integrated.
01:36:54.360 | Like you can't separate the two.
01:36:55.800 | The idea of the paperclip maximizer
01:37:00.160 | that sort of ultra-rationally would be able
01:37:03.200 | to destroy all humans because it's really good
01:37:05.960 | at accomplishing a simple objective function
01:37:10.960 | that doesn't care about the value of humans.
01:37:13.880 | It may be possible, but the number of trajectories
01:37:16.560 | to that are far outnumbered by the trajectories
01:37:20.120 | that create something that is conscious,
01:37:21.640 | something that is appreciative of beauty,
01:37:24.120 | creates beautiful things in the same way
01:37:25.920 | that humans can create beautiful things.
01:37:27.680 | And ultimately, the sad, destructive path for that AI
01:37:32.680 | would look a lot like just better humans
01:37:37.240 | than like these cold machines.
01:37:41.560 | And I would say, of course, the cold machines
01:37:44.320 | that lack consciousness, the philosophical zombies,
01:37:48.560 | make me sad, but also what makes me sad
01:37:51.200 | is just things that are far more powerful
01:37:53.560 | and smart and creative than us too.
01:37:57.560 | 'Cause then in the same way that AlphaZero
01:38:02.560 | becoming a better chess player than the best of humans,
01:38:06.420 | even starting with Deep Blue, but really with AlphaZero,
01:38:10.660 | that makes me sad too.
01:38:11.920 | One of the most beautiful games that humans ever created
01:38:17.520 | that used to be seen as demonstrations of the intellect,
01:38:20.400 | which is chess, and Go in other parts of the world
01:38:24.440 | have been solved by AI, that makes me quite sad.
01:38:27.280 | And it feels like the progress of that
01:38:28.880 | is just pushing on forward.
01:38:30.560 | - Oh, it makes me sad too.
01:38:32.040 | And to be perfectly clear, I absolutely believe
01:38:35.360 | that artificial consciousness is entirely possible.
01:38:39.000 | And it's not something I rule out at all.
01:38:40.640 | I mean, if you could get smart enough
01:38:42.960 | to have a perfect map of the neural structure
01:38:46.760 | and the neural states and the amount of neurotransmitters
01:38:49.840 | that are going between every synapse
01:38:51.040 | in a particular person's mind,
01:38:53.240 | could you replicate that in silico
01:38:55.560 | at some reasonably distant point in the future?
01:38:59.560 | Absolutely, and then you'd have a consciousness.
01:39:01.400 | I don't rule out the possibility
01:39:02.880 | of artificial consciousness in any way.
01:39:05.600 | What I'm less certain about is whether consciousness
01:39:08.680 | is a requirement for superintelligence
01:39:11.320 | pursuing a maximizing function of some sort.
01:39:16.120 | I don't feel the certitude that consciousness
01:39:19.440 | simply must be part of that.
01:39:21.800 | You had said that for it to coexist with human society,
01:39:25.200 | it would need to be conscious.
01:39:26.880 | Could be entirely true, but it also could just exist
01:39:30.360 | orthogonally to human society.
01:39:32.720 | And it could also, upon attaining a superintelligence
01:39:36.760 | with a maximizing function, very, very, very rapidly
01:39:40.840 | because of the speed at which computing works
01:39:42.720 | compared to our own meat-based minds,
01:39:46.200 | very, very rapidly make the decisions and calculations
01:39:50.200 | necessary to seize the reins of power
01:39:51.800 | before we even know what's going on.
01:39:53.200 | - Yeah, I mean, kind of like biological viruses do.
01:39:55.880 | - Yeah. - Don't necessarily,
01:39:57.000 | they integrate themselves just fine with human society.
01:39:59.920 | - Yeah, without, technically--
01:40:02.400 | - Without consciousness. - Yeah, without even
01:40:03.800 | being alive, technically, by the standards
01:40:06.360 | of a lot of biologists.
01:40:07.840 | - So this is a bit of a tangent,
01:40:09.600 | but you've talked with Sam Harris
01:40:13.520 | on that four-hour special episode we mentioned.
01:40:15.920 | I'm just curious to ask, 'cause I use this meditation app
01:40:21.580 | I've been using for the past month to meditate.
01:40:23.880 | Is this something you've integrated
01:40:26.120 | as part of your life, meditation or fasting?
01:40:28.800 | Or has some of Sam Harris rubbed off on you
01:40:31.160 | in terms of his appreciation of meditation
01:40:35.040 | and just kind of, from a third-person perspective,
01:40:38.040 | analyzing your own mind, consciousness,
01:40:40.120 | free will, and so on?
01:40:41.400 | - You know, I've tried it three separate times in my life,
01:40:45.000 | really made a concerted attack on meditation
01:40:48.280 | and integrating it into my life.
01:40:49.880 | One of them, the most extreme, was I took a class
01:40:53.280 | based on the work of Jon Kabat-Zinn,
01:40:55.920 | who is, in many ways, one of the founding people
01:41:00.040 | behind the mindful meditation movement,
01:41:03.320 | that required, like, part of the class was,
01:41:05.720 | you know, it was a weekly class,
01:41:07.640 | and you were gonna meditate an hour a day, every day.
01:41:12.360 | And having done that for, I think it was 10 weeks,
01:41:15.480 | it might have been 13, however long a period of time was,
01:41:18.340 | at the end of it, it just didn't stick.
01:41:20.000 | As soon as it was over, you know,
01:41:22.440 | I did not feel that gravitational pull,
01:41:25.000 | I did not feel the collapse in quality of life
01:41:29.340 | after wimping out on that project.
01:41:33.000 | And then the most recent one was actually with Sam's app.
01:41:36.160 | During the lockdown, I did make a pretty good
01:41:39.680 | and consistent concerted effort
01:41:41.320 | to listen to his 10-minute meditation every day,
01:41:44.400 | and I've always fallen away from it.
01:41:46.760 | And I, you know, you're kind of interpreting
01:41:49.400 | why did I personally do this.
01:41:51.000 | I do believe it was ultimately
01:41:52.600 | because it wasn't bringing me that, you know,
01:41:55.840 | joy or inner peace or better confidence at being me
01:41:59.360 | that I was hoping to get from it.
01:42:01.360 | Otherwise, I think I would have clung to it
01:42:03.600 | in the way that we cling to certain good habits,
01:42:06.280 | like I'm really good at flossing my teeth.
01:42:08.240 | Not that you were gonna ask Lex,
01:42:10.120 | but yeah, that's one thing that defeats a lot of people.
01:42:12.800 | I'm good at that.
01:42:14.000 | - See, Herman Hesse, I think,
01:42:16.320 | I forget in which book or maybe where exactly,
01:42:19.880 | I've read everything of his,
01:42:20.920 | so it's unclear where it came from,
01:42:24.360 | but he had this idea that anybody who is,
01:42:28.840 | who truly achieves mastery in things
01:42:32.760 | will learn how to meditate in some way.
01:42:35.520 | So it could be that for you, the flossing of teeth
01:42:38.200 | is yet another like little inkling of meditation.
01:42:42.120 | Like it doesn't have to be
01:42:43.120 | this very particular kind of meditation.
01:42:45.920 | Maybe podcasting, you have an amazing podcast,
01:42:48.120 | that could be meditation.
01:42:49.200 | The writing process is meditation.
01:42:51.280 | For me, like,
01:42:57.440 | there's a bunch of mechanisms which take my mind
01:43:00.360 | into a very particular place
01:43:02.200 | that looks a whole lot like meditation.
01:43:04.520 | For example, when I've been running
01:43:06.760 | over the past couple of years,
01:43:09.880 | and especially when I listen to certain kinds of audio books,
01:43:14.520 | like I've listened to "The Rise and Fall of the Third Reich."
01:43:18.120 | I've listened to a lot of sort of World War II,
01:43:20.960 | which at once, because I have a lot of family
01:43:25.440 | who's lost in World War II,
01:43:26.800 | and so much of the Soviet Union
01:43:28.680 | is grounded in the suffering of World War II,
01:43:31.400 | that somehow it connects me to my history,
01:43:34.120 | but also there's some kind of purifying aspect
01:43:38.840 | to thinking about how cruel, but at the same time,
01:43:41.860 | how beautiful human nature could be.
01:43:43.760 | And so you're also running,
01:43:45.640 | like it clears the mind from all the concerns of the world,
01:43:49.660 | and somehow it takes you to this place
01:43:51.200 | where you were like deeply appreciative to be alive,
01:43:54.840 | in the sense that, as opposed to listening to your breath,
01:43:57.840 | or like feeling your breath,
01:43:59.220 | and thinking about your consciousness,
01:44:00.800 | and all those kinds of processes that Sam's app does,
01:44:04.320 | well, this does that for me, the running,
01:44:07.640 | and flossing may do that for you.
01:44:10.580 | So maybe Herman Hesse is onto something.
01:44:13.280 | - I hope flossing is not my main form of expertise,
01:44:16.000 | although I am gonna claim a certain expertise there,
01:44:18.240 | and I'm gonna claim it rather.
01:44:19.080 | - Somebody has to be the best flosser in the world.
01:44:21.240 | - That ain't me.
01:44:22.080 | I'm just glad that I'm a consistent one.
01:44:23.800 | I mean, there are a lot of things
01:44:24.800 | that bring me into a flow state,
01:44:25.880 | and I think maybe, perhaps that's one reason
01:44:27.680 | why meditation isn't as necessary for me.
01:44:30.640 | I definitely enter a flow state when I'm writing,
01:44:33.160 | I definitely enter a flow state when I'm editing,
01:44:34.800 | I definitely enter a flow state
01:44:36.240 | when I'm mixing and mastering music.
01:44:39.440 | I enter a flow state when I'm doing heavy, heavy research
01:44:42.440 | to either prepare for a podcast,
01:44:44.960 | or to also do tech investing,
01:44:48.320 | to make myself smart in a new field
01:44:51.400 | that is fairly alien to me.
01:44:54.140 | I can just, the hours can just melt away
01:44:56.660 | while I'm reading this and watching that YouTube lecture
01:44:59.340 | and going through this presentation and so forth.
01:45:02.440 | So maybe because there's a lot of things
01:45:04.140 | that bring me into a flow state in my normal weekly life,
01:45:07.400 | not daily, unfortunately,
01:45:08.500 | but certainly my normal weekly life,
01:45:10.340 | that I have less of an urge to meditate.
01:45:12.340 | Now you've been working with Sam's app
01:45:14.000 | for about a month now, you said.
01:45:15.860 | Is this your first run-in with meditation?
01:45:17.420 | Is this your first attempt to integrate it with your life?
01:45:19.900 | - Meditation, meditation.
01:45:21.060 | I always thought running and thinking,
01:45:24.100 | I listen to brown noise often.
01:45:26.100 | That takes my mind, I don't know what the hell it does,
01:45:28.540 | but it takes my mind immediately into like the state
01:45:31.020 | where I'm deeply focused on anything I do.
01:45:33.420 | I don't know why.
01:45:34.260 | - So it's like your accompanying sound when you're--
01:45:36.460 | - Yeah. - Really?
01:45:37.300 | And what's the difference between brown and white noise?
01:45:39.180 | This is a cool term I haven't heard before.
01:45:41.420 | - So people should look up brown noise.
01:45:43.300 | - They don't have to,
01:45:44.140 | 'cause you're about to tell them what it is.
01:45:45.860 | - 'Cause you have to experience it, you have to listen to it.
01:45:48.140 | So I think white noise is, this has to do with music.
01:45:52.020 | I think there's different colors.
01:45:54.060 | There's pink noise, and I think that has to do
01:45:56.860 | with the frequencies.
01:45:59.620 | Like the white noise is usually less bassy.
01:46:04.380 | Brown noise is very bassy.
01:46:06.220 | So it's more like (exhales)
01:46:09.180 | versus like (shushes)
01:46:11.140 | like the, if that makes sense.
01:46:13.660 | So there's like a deepness to it.
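A minimal sketch of the distinction being described, assuming brown noise is generated the usual way, by integrating (cumulatively summing) white noise, which piles its energy into low frequencies and gives it that bassy character; the 48 kHz sample rate and 200 Hz cutoff are arbitrary choices for illustration.

```python
# Sketch: white noise has a roughly flat spectrum; brown noise, built by
# integrating white noise, concentrates its energy at low (bassy) frequencies.
import numpy as np

rng = np.random.default_rng(0)
sample_rate = 48_000                       # assumed sample rate (Hz)
n = sample_rate                            # one second of audio

white = rng.standard_normal(n)             # flat power across frequencies
brown = np.cumsum(white)                   # random walk -> ~1/f^2 spectrum
brown /= np.max(np.abs(brown))             # normalize to [-1, 1] for playback

freqs = np.fft.rfftfreq(n, d=1 / sample_rate)

def low_freq_fraction(signal, cutoff_hz=200):
    """Fraction of the signal's spectral energy below cutoff_hz."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    return power[freqs < cutoff_hz].sum() / power.sum()

print(f"white noise energy below 200 Hz: {low_freq_fraction(white):.1%}")
print(f"brown noise energy below 200 Hz: {low_freq_fraction(brown):.1%}")
```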
01:46:16.340 | I think everyone is different,
01:46:17.700 | but for me, it was when I was a research scientist at MIT,
01:46:22.700 | especially when there's a lot of students around,
01:46:28.260 | I remember just being annoyed
01:46:29.780 | at the noise of people talking.
01:46:31.540 | And one of my colleagues said,
01:46:32.940 | "Well, you should try listening to brown noise.
01:46:34.940 | "Like it really knocks out everything."
01:46:36.860 | 'Cause I used to wear earplugs too,
01:46:38.660 | like just see if I can block it out.
01:46:40.500 | And the moment I put it on,
01:46:44.060 | it's as if my mind was waiting all these years
01:46:48.180 | to hear that sound.
01:46:49.860 | Everything just focused in.
01:46:52.180 | It makes me wonder how many other amazing things out there
01:46:54.820 | are waiting to be discovered for my own particular,
01:46:58.460 | like, biological, my own particular brain.
01:47:01.540 | So that, it just goes (mimics noise)
01:47:04.500 | The mind just focuses in.
01:47:06.100 | It's kind of incredible.
01:47:06.980 | So I see that as a kind of meditation.
01:47:09.140 | Maybe I'm using a performance enhancing
01:47:12.740 | a sound to achieve that meditation,
01:47:15.460 | but I've been doing that for many years now
01:47:17.860 | and running and walking and doing,
01:47:20.660 | Cal Newport was the first person that introduced me
01:47:23.380 | to the idea of deep work.
01:47:24.780 | Just put a word to the kind of thinking
01:47:28.140 | that's required to sort of deeply think about a problem,
01:47:31.300 | especially if it's mathematical in nature.
01:47:33.300 | I see that as a kind of meditation
01:47:35.140 | 'cause what it's doing is you have these constructs
01:47:38.740 | in your mind that you're building on top of each other.
01:47:40.860 | And there's all these distracting thoughts
01:47:42.660 | that keep bombarding you from all over the place.
01:47:45.740 | And the whole process is you slowly let them
01:47:48.100 | kind of move past you.
01:47:50.100 | And that's a meditative process.
01:47:51.380 | - It's very meditative.
01:47:52.220 | That sounds a lot like what Sam talks about
01:47:55.620 | in his meditation app, which I did use,
01:47:57.380 | to be clear, for a while,
01:47:58.860 | of just letting the thought go by without deranging you.
01:48:02.500 | Derangement is one of Sam's favorite words,
01:48:04.180 | as I'm sure you know.
01:48:05.220 | But brown noise, that's really intriguing.
01:48:08.500 | I am going to try that as soon as this evening.
01:48:11.540 | - Yeah, to see if it works.
01:48:12.900 | But very well might not work at all.
01:48:14.660 | - Yeah, yeah.
01:48:15.740 | - I think the interesting point is,
01:48:17.220 | and the same with the fasting and the diet,
01:48:20.060 | is I long ago stopped trusting experts
01:48:25.060 | or maybe taking the word of experts as the gospel truth
01:48:31.740 | and only using it as an inspiration to try something,
01:48:37.180 | to try thoroughly something.
01:48:39.780 | So fasting was one of the things when I first discovered,
01:48:43.460 | I've been many times eating just once a day,
01:48:46.420 | so that's a 24-hour fast.
01:48:48.320 | It makes me feel amazing.
01:48:50.740 | And at the same time, eating only meat,
01:48:53.460 | putting ethical concerns aside, makes me feel amazing.
01:48:57.860 | I don't know why it doesn't,
01:49:00.100 | the point is to be an N of one scientist
01:49:02.900 | until nutrition science becomes a real science
01:49:05.620 | to where it's doing studies that deeply understand
01:49:09.380 | the biology underlying all of it
01:49:12.420 | and also does real thorough long-term studies
01:49:17.420 | of thousands if not millions of people
01:49:20.660 | versus the very small studies that are generalizing
01:49:25.740 | from very noisy data and all those kinds of things
01:49:30.100 | where you can't control all the elements.
01:49:32.420 | - Particularly because our own personal metabolism
01:49:35.420 | is highly variant among us.
01:49:36.940 | So there are going to be some people,
01:49:38.420 | like if brown noise is a game changer for 7% of people,
01:49:43.420 | there's 93% odds that I'm not one of them,
01:49:46.860 | but there's certainly every reason in the world
01:49:48.820 | to test it out.
01:49:49.900 | Now, so I'm intrigued by the fasting.
01:49:51.900 | I like you, well, I assume like you,
01:49:54.780 | I don't have any problem going to one meal a day
01:49:56.740 | and I often do that inadvertently.
01:49:58.860 | And I've never done it methodically,
01:50:00.780 | like I've never done it like I'm gonna do this for 15 days,
01:50:03.660 | maybe I should.
01:50:04.900 | And maybe I should, like how many days in a row
01:50:07.780 | of the one meal a day did you find
01:50:10.940 | brought noticeable impact to you?
01:50:13.500 | Was it after three days of it?
01:50:14.780 | Was it months of it?
01:50:15.780 | Like what was it?
01:50:17.100 | - Well, the noticeable impact is day one.
01:50:19.220 | So for me, 'cause I eat a very low carb diet,
01:50:22.820 | so the hunger wasn't the hugest issue.
01:50:25.460 | Like there wasn't a painful hunger, like wanting to eat.
01:50:29.700 | So I was already kind of primed for it.
01:50:32.020 | And the benefit comes from a lot of people
01:50:35.420 | that do intermittent fasting,
01:50:36.780 | that's only like 16 hours of fasting
01:50:39.660 | get this benefit too is the focus.
01:50:41.380 | There's a clarity of thought.
01:50:43.180 | If my brain was a runner,
01:50:46.980 | it felt like I'm running on a track when I'm fasting
01:50:49.820 | versus running in quicksand.
01:50:51.820 | Like it's much crisper.
01:50:53.420 | - And is this your first 72 hour fast right now?
01:50:55.100 | - This is the first time doing 72 hours, yeah.
01:50:56.940 | And that's a different thing, but similar.
01:51:00.580 | Like I'm going up and down in terms of hunger
01:51:04.740 | and the focus is really crisp.
01:51:06.620 | The thing I'm noticing most of all, to be honest,
01:51:09.580 | is how much eating, even when it's once a day
01:51:14.580 | or twice a day is a big part of my life.
01:51:18.460 | Like I almost feel like I have way more time in my life.
01:51:21.860 | And it's not so much about the eating,
01:51:24.300 | but like I don't have to plan my day around.
01:51:26.820 | Like today I don't have any eating to do.
01:51:30.540 | - It does free up hours.
01:51:32.060 | Or any cleaning up after eating or provisioning of food.
01:51:35.740 | - Or even like thinking about it.
01:51:37.900 | It's not a thing.
01:51:38.860 | So when you think about what you're going to do tonight,
01:51:42.060 | I think I'm realizing that as opposed to thinking,
01:51:46.180 | you know, I'm gonna work on this problem
01:51:47.740 | or I'm gonna go on this walk
01:51:49.100 | or I'm going to call this person,
01:51:51.540 | I often think I'm gonna eat this thing.
01:51:54.580 | You allow dinner as a kind of,
01:51:57.860 | you know, when people talk about like the weather
01:51:59.540 | or something like that,
01:52:00.420 | it's almost like a generic thought
01:52:02.020 | you allow yourself to have,
01:52:04.060 | because it's the lazy thought.
01:52:06.740 | And I don't have the opportunity to have that thought
01:52:08.940 | because I'm not eating it.
01:52:10.540 | So now I get to think about like the things
01:52:12.580 | I'm actually gonna do tonight
01:52:13.780 | that are more complicated than the eating process.
01:52:16.740 | That's been the most noticeable thing, to be honest.
01:52:20.340 | And then there's people that have written me
01:52:22.380 | that have done seven day fast.
01:52:25.060 | And there's a few people that have written me
01:52:27.100 | and I've heard of this, is doing a 30 day fast.
01:52:31.380 | And it's interesting, the body,
01:52:33.900 | I don't know what the health benefits are necessarily.
01:52:37.100 | What that shows me is how adaptable the human body is.
01:52:41.980 | - Yeah.
01:52:43.060 | - And that's incredible.
01:52:44.060 | And that's something really important to remember
01:52:46.820 | when we think about how to live life,
01:52:49.220 | 'cause the body adapts.
01:52:50.620 | - Yeah, I mean, we sure couldn't go 30 days without water.
01:52:53.260 | - That's right.
01:52:54.460 | - But food, yeah, it's been done.
01:52:56.260 | It's demonstrably possible.
01:52:57.500 | You ever read, Franz Kafka has a great short story
01:53:00.660 | called "The Hunger Artist"?
01:53:01.940 | - Yeah, I love that.
01:53:03.380 | - Great story.
01:53:05.060 | - You know, that was before I started fasting.
01:53:06.660 | I read that story and I admired the beauty of that,
01:53:09.820 | the artistry of that actual hunger artist.
01:53:12.620 | That it's like madness,
01:53:14.540 | but it also felt like a little bit of genius.
01:53:16.860 | I actually have to reread it.
01:53:18.300 | You know what, that's what I'm gonna do tonight.
01:53:19.540 | I'm gonna read it because I'm doing the fasting.
01:53:21.740 | - 'Cause you're in the midst of it.
01:53:22.580 | - Yeah, in the midst of it.
01:53:23.420 | - It'd be very contextual.
01:53:24.240 | I haven't read it since high school
01:53:25.080 | and I'd love to read it again.
01:53:25.900 | I love his work.
01:53:26.740 | So maybe I'll read it tonight too.
01:53:28.260 | - And part of the reason of sort of,
01:53:30.820 | I've, here in Texas, people have been so friendly
01:53:34.140 | that I've been nonstop eating like brisket
01:53:36.660 | with incredible people, a lot of whiskey as well.
01:53:40.040 | So I gained quite a bit of weight,
01:53:42.300 | which I'm embracing, it's okay.
01:53:44.340 | But I am also aware as I'm fasting
01:53:48.380 | that like I have a lot of fat to run on.
01:53:52.360 | Like I have a lot of like natural resources on my body.
01:53:57.360 | - You've got reserves.
01:53:58.240 | - Reserves, that's a good way to put it.
01:53:59.880 | And that's really cool.
01:54:01.360 | You know, there's like a, this whole thing,
01:54:03.880 | this biology works well.
01:54:05.600 | Like I can go a long time because of the long-term investing
01:54:09.960 | in terms of brisket that I've been doing
01:54:11.600 | in the weeks before.
01:54:12.600 | - It's all training.
01:54:13.520 | - It's all training.
01:54:14.360 | - It's all prep work, all prep work, yeah.
01:54:15.760 | - So, okay, you open a bunch of doors,
01:54:17.380 | one of which is music.
01:54:19.120 | So I got to walk in, at least for a brief moment.
01:54:21.500 | I love guitar, I love music.
01:54:23.480 | You founded a music company,
01:54:26.080 | but you're also a musician yourself.
01:54:28.520 | Let me ask the big ridiculous question first.
01:54:30.400 | What's the greatest song of all time?
01:54:32.280 | - Greatest song of all time?
01:54:34.640 | Okay, wow, it's gonna obviously vary dramatically
01:54:38.160 | from genre to genre.
01:54:39.240 | So like you, I like guitar.
01:54:42.000 | Perhaps like you, although I've dabbled
01:54:45.320 | in inhaling every genre of music
01:54:48.440 | that I can almost practically imagine,
01:54:51.400 | I keep coming back to the sound of bass, guitar,
01:54:56.080 | drum, keyboards, voice.
01:54:57.900 | I love that style of music.
01:54:59.200 | And added to it, I think a lot of really cool
01:55:02.500 | electronic production makes something
01:55:04.940 | that's really, really new and hybrid-y and awesome.
01:55:08.980 | But in that kind of like guitar-based rock,
01:55:13.060 | I think I've got to go with
01:55:15.660 | "Won't Get Fooled Again" by The Who.
01:55:18.840 | It is such an epic song.
01:55:21.560 | It's got so much grandeur to it.
01:55:23.920 | It uses the synthesizers that were available at the time.
01:55:27.600 | This has gotta be, I think, 1972, '73,
01:55:29.920 | which are very, very primitive to our years,
01:55:32.080 | but uses them in this hypnotic and beautiful way
01:55:36.200 | that I can't imagine somebody with the greatest synth array
01:55:40.380 | conceivable by today's technology could do a better job of
01:55:43.440 | in the context of that song.
01:55:45.820 | And it's almost operatic.
01:55:49.220 | So I would say in that genre, the genre of rock,
01:55:54.120 | that would be my nomination.
01:55:56.300 | - I'm totally, in my brain,
01:55:58.820 | Pinball Wizard is overriding everything else by The Who,
01:56:02.380 | so I can't even imagine the song.
01:56:04.620 | - Well, I would say, ironically, with Pinball Wizard,
01:56:07.520 | so that came from the movie "Tommy."
01:56:09.860 | And in the movie "Tommy," the rival of Tommy,
01:56:13.980 | the reigning pinball champ, was Elton John.
01:56:17.500 | And so there are a couple versions
01:56:19.460 | of Pinball Wizard out there,
01:56:20.580 | one sung by Roger Daltrey of The Who,
01:56:22.660 | which a purist would say,
01:56:23.740 | "Hey, that's the real Pinball Wizard."
01:56:25.720 | But the version that is sung by Elton John in the movie,
01:56:29.460 | which is available to those who are ambitious
01:56:31.620 | and wanna dig for it, that's even better in my mind.
01:56:35.300 | - Yeah, the covers.
01:56:36.220 | And I, for myself, I was thinking,
01:56:38.580 | "What is the song for me?"
01:56:41.020 | They asked that question. - And what is that?
01:56:43.000 | - I think that changes day to day, too.
01:56:45.980 | I was realizing that. - Of course.
01:56:47.540 | - But for me, somebody who values lyrics as well
01:56:52.540 | and the emotion in the song,
01:56:56.200 | by the way, "Hallelujah" by Leonard Cohen was a close one,
01:57:00.080 | but the number one is Johnny Cash's cover of "Hurt."
01:57:03.780 | There's something so powerful about that song,
01:57:12.460 | about that cover, about that performance.
01:57:15.300 | Maybe another one is the cover of "Sound of Silence."
01:57:18.060 | Maybe there's something about covers for me.
01:57:21.820 | - So whose cover sounds, 'cause Simon and Garfunkel,
01:57:24.400 | I think, did the original recording of that, right?
01:57:26.340 | So which cover is it then?
01:57:28.120 | - There's a cover by Disturbed.
01:57:31.660 | It's a metal band, which is so interesting,
01:57:34.020 | 'cause I'm really not into that kind of metal,
01:57:36.020 | but he does a pure vocal performance.
01:57:38.560 | So he's not doing a metal performance.
01:57:41.360 | I would say it's one of the greatest, people should see it.
01:57:44.220 | It's like 400 million views or something like that.
01:57:47.260 | - Wow.
01:57:48.100 | - It's probably the greatest live vocal performance
01:57:52.260 | I've ever heard is Disturbed covering "Sound of Silence."
01:57:55.660 | - I'll listen to it as soon as I get home.
01:57:57.060 | - And that song came to life to me
01:57:58.700 | in a way that Simon and Garfunkel never did.
01:58:00.460 | There was no, for me, with Simon and Garfunkel,
01:58:02.780 | there's not a pain, there's not an anger,
01:58:06.820 | there's not a power to their performance.
01:58:11.820 | It's almost like this melancholy, I don't know.
01:58:15.360 | - Well, there's a lot of, I guess there's a lot of beauty
01:58:18.200 | to it, like objectively beautiful.
01:58:20.840 | And I think, I never thought of this until now,
01:58:23.560 | but I think if you put entirely different lyrics
01:58:26.880 | on top of it, unless they were joyous, which would be weird,
01:58:31.200 | it wouldn't necessarily lose that much.
01:58:33.200 | It's just a beauty in the harmonizing, it's soft.
01:58:36.240 | And you're right, it's not dripping with emotion.
01:58:40.680 | The vocal performance is not dripping with emotion.
01:58:42.720 | It's dripping with harmonizing,
01:58:46.440 | technical harmonizing brilliance and beauty.
01:58:49.520 | - Now, if you compare that to the Disturbed cover
01:58:52.880 | or the Johnny Cash's "Hurt" cover, when you walk away,
01:58:57.880 | there's a few, it's haunting.
01:59:00.120 | It stays with you for a long time.
01:59:02.640 | There's certain performances that will just stay with you
01:59:05.960 | to where, like if you watch people respond to that,
01:59:10.960 | and that's certainly how I felt when you listened
01:59:14.480 | to the Disturbed performance or Johnny Cash "Hurt",
01:59:17.680 | there's a response to where you just sit there
01:59:20.720 | with your mouth open, kind of like paralyzed by it somehow.
01:59:25.300 | And I think that's what makes for a great song
01:59:29.600 | to where you're just like, it's not that you're like
01:59:32.040 | singing along or having fun,
01:59:33.920 | that's another way a song could be great,
01:59:36.640 | but where you're just like, what, this is, you're in awe.
01:59:41.240 | - Yeah.
01:59:42.060 | - If we go to listen.com and that whole fascinating era
01:59:47.060 | of music in the '90s, transitioning to the aughts,
01:59:51.100 | so I remember those days, the Napster days,
01:59:55.120 | when piracy, from my perspective, allegedly ruled the land.
01:59:59.640 | What do you make of that whole era?
02:00:03.400 | What are the big, what was, first of all,
02:00:05.560 | your experiences of that era,
02:00:07.240 | and what were the big takeaways in terms of piracy,
02:00:11.440 | in terms of what it takes to build a company that succeeds
02:00:15.240 | in that kind of digital space in terms of music,
02:00:19.640 | but in terms of anything creative?
02:00:21.840 | - Well, so for those who don't remember,
02:00:24.080 | which is gonna be most folks,
02:00:25.520 | listen.com created a service called Rhapsody,
02:00:28.760 | which is much, much more recognizable to folks
02:00:31.040 | because Rhapsody became a pretty big name
02:00:32.720 | for reasons I'll get into in a second.
02:00:34.240 | So for people who don't know
02:00:36.960 | their early online music history,
02:00:38.520 | we were the first company, so I founded Listen.
02:00:41.400 | - Thank you. - I was the lone founder.
02:00:42.840 | And Rhapsody was, we were the first service
02:00:46.240 | to get full catalog licenses from all the major music labels
02:00:50.480 | in order to distribute their music online,
02:00:52.640 | and we specifically did it through a mechanism,
02:00:54.600 | which at the time struck people as exotic and bizarre
02:00:57.920 | and kind of incomprehensible,
02:00:59.360 | which was unlimited on-demand streaming,
02:01:01.720 | which of course now, it's a model that's been appropriated
02:01:06.160 | by Spotify and Apple and many, many others.
02:01:08.320 | So we were a pioneer on that front.
02:01:10.320 | What was really, really, really hard
02:01:12.520 | about doing business in those days
02:01:14.640 | was the reaction of the music labels to piracy,
02:01:18.120 | which was about 180 degrees opposite
02:01:21.120 | of what their reaction, quote unquote,
02:01:23.480 | should have been from the standpoint
02:01:25.080 | of preserving their business from piracy.
02:01:27.560 | So Napster came along and was a service
02:01:32.200 | that enabled people to get near unlimited access
02:01:36.680 | to most songs.
02:01:39.120 | I mean, truly obscure things could be very hard
02:01:41.360 | to find on Napster, but most songs
02:01:43.720 | with a relatively simple one-click ability
02:01:48.120 | to download those songs and have the MP3s
02:01:50.560 | on their hard drives.
02:01:51.720 | But there was a lot that was very messy
02:01:54.560 | about the Napster experience.
02:01:56.000 | You might download a really god-awful recording
02:01:59.120 | of that song.
02:02:00.200 | You may download a recording that actually wasn't that song
02:02:03.160 | with some prankster putting it up
02:02:05.160 | to sort of mess with people.
02:02:07.320 | You could struggle to find the song that you're looking for.
02:02:09.760 | You could end up finding yourself connected.
02:02:12.480 | It was peer-to-peer.
02:02:13.680 | You might randomly find yourself connected
02:02:15.720 | to somebody in Bulgaria,
02:02:17.320 | doesn't have a very good internet connection,
02:02:18.960 | so you might wait 19 minutes only for it to snap,
02:02:22.600 | et cetera, et cetera.
02:02:24.000 | And our argument to, well, actually,
02:02:25.880 | let's start with how that hit the music labels.
02:02:28.440 | The music labels had been in a very, very comfortable position
02:02:31.880 | for many, many decades of essentially, you know,
02:02:36.200 | having monopoly, you know,
02:02:38.680 | having been the monopoly providers
02:02:41.000 | of a certain subset of artists.
02:02:42.640 | Any given label was a monopoly provider of the artists
02:02:45.600 | and the recordings that they owned,
02:02:47.360 | and they could sell it at what turned
02:02:49.440 | out to be tremendously favorable rates.
02:02:51.680 | In the late era of the CD, you know,
02:02:54.160 | you were talking close to $20 for a compact disc
02:02:57.600 | that might have one song that you were crazy about
02:03:00.000 | and simply needed to own that might actually be glued
02:03:03.080 | to 17 other songs that you found to be sheer crap.
02:03:06.000 | And so the music industry had used the fact
02:03:10.360 | that it had this unbelievable leverage
02:03:13.280 | and profound pricing power to really get music lovers
02:03:18.280 | to the point that they felt very, very misused
02:03:21.160 | by the entire situation.
02:03:22.560 | Now along comes Napster and music sales start getting gutted
02:03:27.320 | with extreme rapidity.
02:03:29.440 | And the reaction of the music industry to that
02:03:33.080 | was one of shock and absolute fury,
02:03:37.080 | which is understandable, you know?
02:03:39.160 | I mean, industries do get gutted all the time,
02:03:42.120 | but I struggle to think of an analog of an industry
02:03:44.560 | that got gutted that rapidly.
02:03:46.800 | I mean, we could say that passenger train service
02:03:49.120 | certainly got gutted by airlines,
02:03:51.720 | but that was a process that took place over decades
02:03:54.280 | and decades and decades.
02:03:55.400 | It wasn't something that happened, you know,
02:03:57.520 | really started showing up in the numbers
02:03:59.320 | in a single digit number of months
02:04:01.200 | and started looking like an existential threat
02:04:03.480 | within a year or two.
02:04:05.080 | So the music industry is quite understandably
02:04:08.560 | in a state of shock and fury.
02:04:10.280 | I don't blame them for that.
02:04:12.000 | But then their reaction was catastrophic,
02:04:15.120 | both for themselves and almost for people like us
02:04:18.840 | who were trying to do, you know,
02:04:21.000 | the cowboy in the white hat thing.
02:04:23.240 | So our response to the music industry was,
02:04:25.480 | look, what you need to do to fight piracy,
02:04:28.440 | you can't put the genie back in the bottle.
02:04:30.320 | You can't switch off the internet.
02:04:32.640 | Even if you all shut your eyes and wish very, very,
02:04:35.440 | very hard, the internet is not going away.
02:04:38.280 | And these peer-to-peer technologies
02:04:39.840 | are genies out of the bottle.
02:04:41.040 | And if you, God, don't, whatever you do,
02:04:43.440 | don't shut down Napster, because if you do,
02:04:46.640 | suddenly that technology is gonna splinter
02:04:49.080 | into 30 different nodes that you'll never,
02:04:51.160 | ever be able to shut off.
02:04:52.240 | What we suggested to them is like, look,
02:04:54.520 | what you want to do is to create a massively
02:04:58.520 | better experience to piracy, something that's way better,
02:05:02.480 | that you sell at a completely reasonable price,
02:05:05.080 | and this is what it is.
02:05:06.400 | Don't just give people access to that very limited number
02:05:09.480 | of songs that they happen to have acquired
02:05:11.720 | and paid for or pirated and have on their hard drive.
02:05:15.840 | Give them access to all of the music in the world
02:05:18.520 | for a simple low price.
02:05:19.560 | And obviously, that doesn't sound like a crazy suggestion,
02:05:22.080 | I don't think, to anybody's ears today,
02:05:24.040 | because that is how the majority of music
02:05:25.720 | is now being consumed online.
02:05:26.880 | But in doing that, you're gonna create
02:05:29.520 | a much, much better option to this kind of crappy,
02:05:33.240 | kind of rickety, kind of buggy process of acquiring MP3s.
02:05:37.720 | Now, unfortunately, the music industry was so angry
02:05:41.560 | about Napster and so forth that for essentially
02:05:44.720 | three and a half years, they folded their arms,
02:05:47.400 | stamped their feet, and boycotted the internet.
02:05:49.880 | So they basically gave people who were fervently passionate
02:05:53.320 | about music and were digitally modern,
02:05:55.720 | they gave them basically one choice.
02:05:57.560 | If you want to have access to digital music,
02:05:59.400 | we, the music industry, insist that you steal it
02:06:02.000 | because we are not going to sell it to you.
02:06:04.560 | So what that did is it made an entire generation
02:06:07.040 | of people morally comfortable with swiping the music
02:06:10.920 | because they felt quite pragmatically,
02:06:12.560 | well, they're not giving me any choice here.
02:06:14.240 | It's like a 20-year-old violating the 21 drinking age.
02:06:18.840 | If they do that, they're not gonna feel like felons.
02:06:22.000 | They're gonna be like, "This is an unreasonable law
02:06:23.880 | "and I'm skirting it," right?
02:06:25.240 | So they make a whole generation of people
02:06:27.000 | morally comfortable with swiping music,
02:06:29.560 | but also technically adept at it.
02:06:32.240 | And when they did shut down Napster
02:06:33.920 | and kind of even trickier tools and like tweakier tools
02:06:37.240 | like Kazaa and so forth came along,
02:06:39.360 | people just figured out how to do it.
02:06:41.640 | So by the time they finally, grudgingly, it took years,
02:06:46.280 | allowed us to release this experience
02:06:48.800 | that we were quite convinced would be better than piracy,
02:06:54.840 | this enormous hole had been dug
02:06:54.840 | where lots of people said music is a thing that is free
02:06:58.680 | and that's morally okay and I know how to get it.
02:07:01.800 | And so streaming took many, many, many more years
02:07:05.640 | to take off and become the gargantuan thing,
02:07:08.960 | the juggernaut it is today
02:07:10.920 | than would have happened if they'd pivoted
02:07:13.560 | to let's sell a better experience
02:07:16.160 | as opposed to demand that people want digital music,
02:07:18.920 | steal it.
02:07:19.840 | - Like what lessons do we draw from that?
02:07:21.520 | 'Cause we're probably in the midst of living
02:07:23.920 | through a bunch of similar situations
02:07:26.360 | in different domains currently, we just don't know.
02:07:28.160 | There's a lot of things in this world
02:07:29.560 | that are really painful.
02:07:30.760 | I mean, I don't know if you can draw perfect parallels,
02:07:34.800 | but fiat money versus cryptocurrency.
02:07:37.240 | There's a lot of currently people in power
02:07:40.200 | who are kind of very skeptical about cryptocurrency,
02:07:42.280 | although that's changing, but it's arguable
02:07:44.760 | it's changing way too slowly.
02:07:46.000 | There's a lot of people making that argument
02:07:47.720 | where there should be a complete like Coinbase
02:07:49.920 | and all this stuff switched to that.
02:07:52.600 | There's a lot of other domains where a pivot,
02:07:57.400 | like if you pivot now, you're going to win big,
02:08:02.400 | but you don't pivot because you're stubborn.
02:08:05.520 | And so, I mean, like, is this just the way
02:08:08.400 | that companies are?
02:08:09.520 | The company succeeds initially, and then it grows,
02:08:13.600 | and there's a huge number of employees and managers
02:08:16.640 | that don't have the guts or the institutional mechanisms
02:08:20.880 | to do the pivot.
02:08:21.720 | Is this just the way of companies?
02:08:23.560 | - Well, I think what happens,
02:08:24.760 | I'll use the case of the music industry.
02:08:27.080 | There was an economic model that they put food on the table
02:08:30.640 | and paid for marble lobbies
02:08:32.360 | and seven and even eight figure executive salaries
02:08:34.640 | for many, many decades,
02:08:36.000 | which was the physical collection of music.
02:08:39.480 | And then you start talking about something
02:08:41.000 | like unlimited streaming,
02:08:42.800 | and it seems so ephemeral and like such a long shot
02:08:47.640 | that people start worrying
02:08:48.600 | about cannibalizing their own business.
02:08:51.080 | And they lose sight of the fact
02:08:52.520 | that something illicit is cannibalizing their business
02:08:55.240 | at an extraordinarily fast rate.
02:08:57.000 | And so if they don't do it themselves, they're doomed.
02:08:59.360 | I mean, we used to put slides in front of these folks,
02:09:01.760 | this is really funny, where we said,
02:09:04.440 | okay, let's assume Rhapsody, we want it to be 9.99 a month,
02:09:08.360 | and we want it to be 12 months.
02:09:10.320 | So it's $120 a year from the budget of a music lover.
02:09:14.800 | And then we were also able to get
02:09:16.240 | reasonably accurate statistics
02:09:17.920 | that showed how many CDs per year
02:09:20.480 | the average person who bothered to collect music,
02:09:22.720 | which was not all people, actually bought.
02:09:25.240 | And it was overwhelmingly clear
02:09:26.920 | that the average CD buyer spends a hell of a lot
02:09:30.320 | less than $120 a year on music.
02:09:32.880 | This is a revenue expansion, blah, blah, blah,
02:09:35.280 | but all they could think of,
02:09:36.960 | and I'm not saying this in a pejorative or patronizing way,
02:09:40.800 | I don't blame them, they'd grown up
02:09:42.000 | in this environment for decades.
02:09:43.760 | All they could think of was the incredible margins
02:09:46.000 | that they had on a CD.
02:09:48.440 | And they would say, well, if this CD,
02:09:51.480 | by the mechanism that you guys are proposing,
02:09:54.960 | the CD that I'm selling for $17.99,
02:09:58.800 | somebody would need to stream those songs.
02:10:01.040 | We were talking about a penny a play back then
02:10:02.680 | that the record labels get paid; it's less than that now.
02:10:05.400 | But somebody would have to stream songs from that CD 1,799 times,
02:10:09.320 | it's never gonna happen.
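A quick worked check of the arithmetic in this exchange, using the round figures quoted above ($9.99 a month, $17.99 per CD, a penny a play); the CDs-per-year value is a hypothetical placeholder, since the actual statistic isn't given here.

```python
# Sketch of the two numbers at play: the labels' 1,799-streams objection,
# and the revenue-expansion argument ($120/year vs. typical CD spending).
subscription_per_month = 9.99
annual_subscription = subscription_per_month * 12       # ~$119.88 per listener per year

cd_price = 17.99
payout_per_stream = 0.01                                 # "a penny a play"
streams_to_match_one_cd = cd_price / payout_per_stream   # 1,799 plays

cds_per_year = 4                                         # hypothetical average buyer (assumption)
annual_cd_spend = cds_per_year * cd_price                # ~$71.96, well under $120

print(annual_subscription, streams_to_match_one_cd, annual_cd_spend)
```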
02:10:10.520 | So they were just sort of stuck in the model of this,
02:10:12.320 | but it's like, no, dude,
02:10:13.440 | but they're gonna spend money on all this other stuff.
02:10:15.160 | So I think people get very hung up on that.
02:10:17.120 | I mean, another example is really,
02:10:19.120 | the taxi industry was not monolithic,
02:10:21.360 | like the music labels.
02:10:22.960 | There was a whole bunch of fleets
02:10:24.360 | and a whole bunch of cities, very, very fragmented.
02:10:26.040 | It's an imperfect analogy, but nonetheless,
02:10:28.320 | imagine if the taxi industry writ large,
02:10:30.760 | upon seeing Uber said, oh my God,
02:10:34.000 | people wanna be able to hail things easily, cheaply,
02:10:38.000 | they don't wanna mess with cash,
02:10:39.360 | they wanna know how many minutes it's gonna be,
02:10:41.200 | they wanna know the fare in advance,
02:10:43.040 | and they want a much bigger fleet than what we've got.
02:10:46.400 | If the taxi industry had rolled out something like that,
02:10:50.640 | with the branding of yellow taxis, universally known
02:10:53.960 | and kind of loved by Americans
02:10:56.760 | and expanded their fleet in a necessary manner,
02:10:58.800 | I don't think Uber or Lyft ever would have gotten a foothold.
02:11:02.000 | But the problem there was that real economics
02:11:05.280 | in the taxi industry wasn't with fares,
02:11:08.120 | it was with the scarcity of medallions.
02:11:10.760 | And so the taxi fleets, in many cases,
02:11:13.040 | owned gazillions of medallions
02:11:15.120 | whose value came from their very scarcity.
02:11:18.680 | So they simply couldn't pivot to that.
02:11:21.240 | So you think you end up having these vested interests
02:11:23.640 | with economics that aren't necessarily visible to outsiders
02:11:27.900 | who get very, very reluctant to disrupt their own model,
02:11:31.480 | which is why it ends up coming
02:11:32.720 | from the outside so frequently.
02:11:34.760 | - So you know what it takes to build a successful startup,
02:11:37.400 | but you're also an investor in a lot of successful startups.
02:11:41.340 | Let me ask for advice.
02:11:44.000 | What do you think it takes to build a successful startup
02:11:47.360 | by way of advice?
02:11:48.840 | - Well, I think it starts, I mean,
02:11:51.200 | everything starts and even ends with the founder.
02:11:54.720 | And so I think it's really, really important
02:11:56.620 | to look at the founder's motivations
02:11:59.440 | and their sophistication about what they're doing.
02:12:01.940 | In almost all cases that I'm familiar with
02:12:05.880 | and have thought hard about,
02:12:07.400 | you've had a founder who was deeply, deeply inculcated
02:12:12.400 | in the domain of technology that they were taking on.
02:12:16.560 | Now, what's interesting about that is you could say,
02:12:18.960 | no, wait, how is that possible
02:12:20.320 | 'cause there's so many young founders?
02:12:21.760 | When you look at young founders,
02:12:23.480 | they're generally coming out of very nascent,
02:12:25.820 | emerging fields of technology.
02:12:27.700 | Where simply being present and accounted for
02:12:30.800 | and engaged in the community for a period of even months
02:12:34.160 | is enough time to make them very, very deeply inculcated.
02:12:37.040 | I mean, you look at Marc Andreessen and Netscape,
02:12:39.860 | Marc had been doing visual web browsers
02:12:43.580 | when Netscape had been founded for what, a year and a half?
02:12:45.960 | But he'd created the first one,
02:12:47.480 | Mosaic, when he was an undergrad,
02:12:51.520 | and the commercial internet was pre-nascent in 1994
02:12:56.600 | when Netscape was founded.
02:12:58.520 | So there's somebody who's very, very deep in their domain,
02:13:00.780 | Mark Zuckerberg also, social networking,
02:13:02.640 | very deep in his domain,
02:13:03.640 | even though it was nascent at the time,
02:13:05.920 | lots of people doing crypto stuff.
02:13:07.620 | I mean, 10 years ago, even seven or eight years ago,
02:13:12.020 | by being a really, really vehement
02:13:14.720 | and engaged participant in the crypto ecosystem,
02:13:18.360 | you could be an expert in that.
02:13:19.860 | You look, however, at more established industries,
02:13:22.580 | take salesforce.com.
02:13:23.760 | Salesforce automation, pretty mature field
02:13:26.100 | when it got started, who's the executive and the founder?
02:13:29.000 | Mark Benioff, who spent 13 years at Oracle
02:13:31.720 | and was an investor in Siebel Systems,
02:13:33.640 | which ended up being Salesforce's main competition.
02:13:36.740 | So more established, you need the entrepreneur
02:13:40.520 | to be very, very deep in the technology and the culture
02:13:44.800 | and the ins and outs of the space,
02:13:46.600 | because you need that entrepreneur, that founder,
02:13:49.760 | to have just an unbelievably accurate intuitive sense
02:13:54.560 | for where the puck is going, right?
02:13:56.540 | And that only comes from being very deep.
02:13:59.100 | So that is sort of factor number one.
02:14:01.380 | And the next thing is that that founder needs to be
02:14:04.620 | charismatic and/or credible, or ideally both,
02:14:08.900 | in exactly the right ways, to be able to attract a team
02:14:13.020 | that is bought into that vision
02:14:14.900 | and is bought into that founder's intuitions being correct,
02:14:18.140 | and not just the team, obviously, but also the investors.
02:14:21.540 | So it takes a certain personality type
02:14:24.140 | to pull that off.
02:14:25.680 | Then the next thing I'm still talking about, the founder,
02:14:28.140 | is a relentlessness and indeed a monomania
02:14:33.140 | to put this above things that might rationally,
02:14:38.180 | should perhaps rationally supersede it for a period of time,
02:14:41.800 | to just relentlessly pivot when pivoting is called for,
02:14:46.820 | and it's always called for.
02:14:48.140 | I mean, think of even very successful companies.
02:14:50.780 | Like, how many times does Facebook pivot?
02:14:53.940 | Newsfeed was something that was completely alien
02:14:56.420 | to the original version of Facebook
02:14:58.340 | and came foundationally important.
02:15:00.220 | How many times did Google?
02:15:01.060 | How many times at any given,
02:15:02.660 | how many times has Apple pivoted?
02:15:04.580 | You know, that founder energy and DNA,
02:15:06.940 | when the founder moves on,
02:15:07.940 | the DNA that's been inculcated with a company
02:15:10.740 | has to have that relentlessness and that ability
02:15:13.580 | to pivot and pivot and pivot
02:15:15.340 | without being worried about sacred cows.
02:15:18.260 | And then the last thing I'll say about the founder
02:15:20.140 | before I get to the rest of the team,
02:15:21.420 | and that'll be mercifully brief,
02:15:24.180 | is the founder has to be obviously a really great hirer,
02:15:29.180 | but just important, a very good firer.
02:15:32.980 | And firing is a horrific experience
02:15:36.100 | for both people involved in it.
02:15:37.940 | It is a wrenching emotional experience.
02:15:40.780 | And being good at realizing when this particular person
02:15:45.780 | is damaging the interests of the company
02:15:49.180 | and the team and the shareholders,
02:15:51.380 | and having the intestinal fortitude
02:15:55.500 | to have that conversation and make it happen
02:15:58.140 | is something that most people don't have in them.
02:16:01.700 | And it's something that needs to be developed
02:16:04.460 | in most people, or maybe some people have it naturally.
02:16:08.500 | But without that ability,
02:16:10.420 | that will take an A-plus organization
02:16:12.620 | into B-minus range very, very quickly.
02:16:15.300 | And so that's all what needs to be present in the founder.
02:16:19.300 | - Can I just say? - Sure.
02:16:21.340 | - How damn good you are, Rob.
02:16:22.940 | That was brilliant.
02:16:23.900 | The one thing that was really kind of surprising to me
02:16:27.900 | is having a deep technical knowledge.
02:16:31.120 | Because I think the way you expressed it,
02:16:34.420 | which is that allows you to be really honest
02:16:38.060 | with the capabilities of what's possible.
02:16:42.100 | Like, of course, you're often trying to do the impossible.
02:16:47.100 | But in order to do the impossible,
02:16:50.220 | you have to attempt the quote-unquote impossible.
02:16:51.980 | But you have to be honest with what is actually possible.
02:16:54.780 | - And it doesn't necessarily have to be
02:16:56.580 | the technical competence.
02:16:57.780 | It's gotta be, in my view,
02:16:59.580 | just a complete immersion in that emerging market.
02:17:02.780 | And so I can imagine, there are a couple people out there
02:17:05.380 | who have started really good crypto projects
02:17:08.020 | who themselves aren't writing the code.
02:17:10.780 | But they're immersed in the culture
02:17:12.540 | and through the culture and a deep understanding
02:17:15.300 | of what's happening and what's not happening,
02:17:16.980 | they can get a good intuition of what's possible.
02:17:19.700 | But the very first hire,
02:17:22.220 | I mean, a great way to solve that
02:17:24.060 | is to have a technical co-founder.
02:17:26.020 | And dual founder companies have become extremely common
02:17:29.580 | for that reason.
02:17:31.020 | And if you're not doing that
02:17:32.300 | and you're not the technical person,
02:17:34.060 | but you are the founder,
02:17:35.820 | you've gotta be really great at hiring
02:17:38.700 | a very damn good technical person very, very fast.
02:17:43.700 | - Can I, on the founder, ask you,
02:17:45.940 | is it possible to do this alone?
02:17:50.140 | There's so many people giving advice
02:17:52.140 | and saying that it's impossible to do the first few steps.
02:17:54.820 | Not impossible, but much more difficult to do it alone.
02:17:58.420 | If we were to take the journey,
02:17:59.820 | say, especially in the software world,
02:18:01.660 | where there's not significant investment required
02:18:04.100 | for it to build something up,
02:18:06.220 | is it possible to go to a prototype,
02:18:10.220 | to something that essentially works
02:18:11.820 | and already has a huge number of customers alone?
02:18:14.780 | - Sure.
02:18:15.900 | There are lots and lots of lone founder companies out there
02:18:19.220 | that have made an incredible difference.
02:18:21.780 | I mean, I'm not certainly putting Rhapsody
02:18:24.540 | in the league of Spotify.
02:18:25.860 | We were too early to be Spotify,
02:18:27.500 | but we did an awful lot of innovation.
02:18:29.740 | And then after the company sold
02:18:31.140 | and ended up in the hands of Real Networks and MTV,
02:18:34.220 | got to millions of subs, right?
02:18:35.860 | I was a lone founder,
02:18:37.020 | and I studied Arabic and Middle Eastern history undergrad.
02:18:40.700 | So I wasn't very, very technical.
02:18:42.420 | But yeah, lone founders can absolutely work.
02:18:44.940 | And the advantage of a lone founder
02:18:46.980 | is you don't have the catastrophic potential
02:18:51.220 | of a falling out between founders.
02:18:53.220 | I mean, two founders who fall out with each other badly
02:18:57.220 | can rip a company to shreds
02:18:59.660 | because they both have an enormous amount of equity,
02:19:03.020 | an enormous amount of power
02:19:04.380 | in the capital structure as a result of that.
02:19:06.700 | They both have an enormous amount of moral authority
02:19:10.340 | with the team as a result of each having that founder role.
02:19:14.300 | And I have witnessed over the years,
02:19:17.580 | many, many situations in which companies have been shredded
02:19:21.340 | or have suffered near fatal blows
02:19:24.700 | because of a falling out between founders.
02:19:27.500 | And the more founders you add, the more risky that becomes.
02:19:30.580 | I almost don't think there should ever be more,
02:19:33.420 | I mean, you never say never,
02:19:34.580 | but multiple founders beyond two
02:19:37.260 | is such an unstable and potentially treacherous situation
02:19:42.780 | that I would never, ever recommend going beyond two.
02:19:46.100 | But I do see value in the non-technical
02:19:49.140 | sort of business and market and outside-minded founder
02:19:52.620 | teaming up with the technical founder.
02:19:55.140 | There is a lot of merit to that,
02:19:56.460 | but there's a lot of danger in that
02:19:57.940 | lest those two blow apart.
02:19:59.380 | - Was it lonely for you?
02:20:00.980 | - Unbelievably, and that's the drawback.
02:20:02.940 | I mean, if you're a lone founder,
02:20:04.780 | there is no other person that you can sit down with
02:20:10.580 | and tackle problems and talk them through
02:20:13.140 | who has precisely or nearly precisely
02:20:15.580 | your alignment of interests.
02:20:17.620 | Your most trusted board member is likely an investor,
02:20:22.140 | and therefore at the end of the day
02:20:23.740 | has the interest of preferred stock in mind,
02:20:25.660 | not common stock.
02:20:27.060 | Your most trusted VP,
02:20:30.340 | who might own a very significant stake in the company,
02:20:33.740 | doesn't own anywhere near your stake in the company.
02:20:35.980 | And so their long-term interests
02:20:38.020 | may well be in getting the right level
02:20:40.340 | of experience and credibility necessary
02:20:42.340 | to peel off and start their own company.
02:20:44.340 | Or their interests might be aligned with
02:20:46.580 | jumping ship and setting up with another,
02:20:50.700 | with a different company,
02:20:51.980 | whether it's a rival or one in a completely different space.
02:20:54.580 | So yeah, being a lone founder
02:20:56.180 | is a spectacularly lonely thing,
02:20:58.020 | and that's a major downside to it.
02:20:59.420 | - What about mentorship?
02:21:00.340 | 'Cause you're a mentor to a lot of people.
02:21:02.460 | Can you find an alleviation to that loneliness
02:21:06.900 | in the space of ideas with a good mentor?
02:21:09.380 | - With a good mentor, like a mentor who's mentoring you?
02:21:11.740 | - Yeah. - Yeah, you can.
02:21:13.140 | A great deal, particularly if it's somebody
02:21:14.820 | who's been through this very process
02:21:16.260 | and has navigated it successfully
02:21:18.420 | and cares enough about you and your well-being
02:21:22.020 | to give you beautifully unvarnished advice,
02:21:25.260 | that can be a huge, huge thing.
02:21:26.980 | That can just raise things a great deal.
02:21:29.140 | And I had a board member who was not an investor,
02:21:33.260 | who basically played that role for me to a great degree.
02:21:36.060 | He came in maybe halfway through the company's history,
02:21:38.860 | though, I would've needed that the most
02:21:40.660 | in the very earliest days. (laughs)
02:21:43.500 | - Yeah, the loneliness, that's the whole journey of life.
02:21:47.980 | We're always alone, alone together.
02:21:49.960 | It pays to embrace that.
02:21:52.860 | You were saying that there might be something
02:21:56.080 | outside of the founder that's also,
02:21:58.500 | that you were promising to be brief on.
02:22:00.660 | - Yeah, okay, so we talked about the founder.
02:22:02.980 | You were asking what makes a great startup.
02:22:04.660 | - Yes. - And great founder
02:22:05.820 | is thing number one, but then thing number two,
02:22:08.500 | and it's ginormous, is a great team.
02:22:10.580 | And so I said so much about the founder
02:22:12.840 | because one hopes or one believes
02:22:16.140 | that a founder who is a great hirer
02:22:18.300 | is going to be hiring people
02:22:19.780 | in charge of critical functions
02:22:21.940 | like engineering and marketing and biz dev
02:22:23.820 | and sales and so forth, who themselves are great hirers.
02:22:26.900 | But what needs to radiate from the founder into the team
02:22:29.900 | that might be a little bit different
02:22:31.020 | from what's in the gene code of the founder?
02:22:33.460 | The team needs to be fully bought in
02:22:37.580 | to the intuitions and the vision of the founder.
02:22:41.420 | Great, we've got that.
02:22:43.100 | But the team needs to have a slightly different thing,
02:22:47.340 | which is a 99% obsession with execution,
02:22:52.340 | to relentlessly hit the milestones,
02:22:55.980 | hit the objectives, hit the quarterly goals.
02:22:59.100 | And 1% vision, you don't wanna lose that.
02:23:03.620 | But execution machines,
02:23:07.300 | people who have a demonstrated ability
02:23:09.540 | and a demonstrated focus on,
02:23:12.100 | yeah, I go from point to point to point,
02:23:14.820 | I try to beat and raise expectations relentlessly,
02:23:17.820 | never fall short, and both sort of blaze
02:23:21.820 | and follow the path.
02:23:23.300 | Not that the path is a given,
02:23:25.380 | they blaze the trail as well.
02:23:27.220 | A good founder is going to trust that VP of sales
02:23:30.700 | to have a better sense of what it takes
02:23:32.920 | to build out that organization and what the milestones should be.
02:23:35.420 | And it's gonna be kind of a dialogue
02:23:36.940 | amongst those at the top.
02:23:38.740 | But execution obsession in the team is the next thing.
02:23:43.060 | - Yeah, there's some sense where the founder,
02:23:45.300 | you talk about sort of the space of ideas
02:23:48.300 | like first principles thinking,
02:23:49.880 | asking big difficult questions of future trajectories
02:23:53.340 | or having a big vision and big picture dreams.
02:23:56.740 | You can almost be a dreamer, it feels like,
02:24:00.300 | when you're like not the founder,
02:24:02.660 | but in the space of sort of leadership.
02:24:06.980 | But when it gets to the ground floor,
02:24:10.380 | there has to be execution.
02:24:11.940 | There has to be hitting deadlines.
02:24:14.020 | And sometimes those are in tension.
02:24:17.980 | There's something about dreams
02:24:20.780 | that are in tension with the pragmatic nature of execution,
02:24:25.780 | not dreams, but sort of ambitious vision.
02:24:30.900 | And those have to be, I suppose, coupled.
02:24:35.700 | The vision in the leader and the execution
02:24:38.780 | in the software world, that would be the programmer
02:24:43.260 | or the designer. - Absolutely.
02:24:46.100 | - Amongst many other things,
02:24:47.580 | you're an incredible conversationalist,
02:24:50.040 | a podcaster, you host a podcast called After On.
02:24:52.780 | I mean, there's a million questions I wanna ask you here,
02:24:56.520 | but one at the highest level,
02:24:58.580 | what do you think makes for a great conversation?
02:25:00.660 | - I would say two things, one of two things,
02:25:04.340 | and ideally both of two things.
02:25:06.820 | One is if something is beautifully architected,
02:25:11.820 | whether it's done deliberately and methodically
02:25:16.580 | and willfully as when I do it,
02:25:19.200 | or whether that just emerges from the conversation,
02:25:21.780 | but something that's beautifully architected,
02:25:24.420 | that can create something
02:25:25.620 | that's incredibly powerful and memorable,
02:25:28.540 | or something where there's just extraordinary chemistry.
02:25:32.220 | And so with All In, or I'll go way back,
02:25:35.140 | you might remember the NPR show Car Talk.
02:25:38.100 | - Oh yeah, yeah.
02:25:38.940 | - I couldn't care less about auto mechanics myself.
02:25:41.460 | - Yeah, that's right.
02:25:42.300 | - But I love that show because the banter
02:25:44.700 | between those two guys was just beyond,
02:25:47.660 | without any parallel, right?
02:25:49.760 | You know, and some kind of edgy podcast,
02:25:51.920 | like Red Scare is just really entertaining to me
02:25:54.720 | because the banter between the women on that show
02:25:56.560 | is just so good.
02:25:58.040 | And All In and that kind of thing.
02:25:59.260 | So I think it's a combination of sort of the arc
02:26:02.640 | and the chemistry.
02:26:04.740 | And I think because the arc can be so important,
02:26:07.680 | that's why very, very highly produced podcasts
02:26:11.600 | like This American Life, obviously a radio show,
02:26:14.240 | but I think of a podcast 'cause that's how I always consume
02:26:16.360 | it, or Criminal, or a lot of what Wondery does and so forth.
02:26:21.360 | That is real documentary making,
02:26:24.200 | and that requires a big team and a big budget
02:26:26.260 | relative to the kinds of things you and I do.
02:26:27.720 | But nonetheless, then you got that arc,
02:26:30.600 | and that can be really, really compelling.
02:26:32.040 | But if we go back to conversation,
02:26:34.680 | I think it's a combination of structure and chemistry.
02:26:38.360 | - Yeah, and I've actually personally lost interest.
02:26:41.040 | I used to love This American Life,
02:26:42.760 | and for some reason, because it lacks
02:26:45.560 | the possibility of magic, it's engineered magic.
02:26:50.560 | - I've fallen off of it myself as well.
02:26:53.080 | I mean, when I fell madly in love with it during the aughts,
02:26:56.560 | it was the only thing going.
02:26:58.280 | They were really smart to adopt podcasting
02:27:01.020 | as a distribution mechanism early.
02:27:02.940 | But yeah, I think that maybe there's a little bit
02:27:07.680 | less magic there now, 'cause I think they have agendas
02:27:10.040 | other than necessarily just delighting their listeners
02:27:13.240 | with quirky stories, which I think is what it was all about
02:27:15.280 | back in the day and some other things.
02:27:17.640 | - Is there a memorable conversation that you've had
02:27:20.480 | on the podcast, whether it was because it was wild and fun,
02:27:25.480 | or one that was exceptionally challenging,
02:27:28.860 | maybe challenging to prepare for, that kind of thing?
02:27:31.360 | Is there something that stands out in your mind
02:27:33.520 | that you can draw an insight from?
02:27:35.700 | - Yeah, I mean, this in no way diminishes
02:27:38.680 | the episodes that will not be the answer
02:27:41.040 | to these two questions.
02:27:42.160 | But an example of something that was really,
02:27:45.200 | really challenging to prepare for was George Church.
02:27:47.960 | So as I'm sure you know, and as I'm sure
02:27:49.920 | many of your listeners know, he is one of the absolute
02:27:52.480 | leading lights in the field of synthetic biology.
02:27:55.040 | He's also unbelievably prolific.
02:27:57.400 | His lab is large, and all kinds of efforts
02:28:01.240 | have spun out of that.
02:28:02.560 | And what I wanted to make my George Church episode about
02:28:05.560 | was first of all, grounding people into
02:28:09.640 | what is this thing called syn-bio?
02:28:12.040 | And that required me to learn a hell of a lot more
02:28:15.440 | about syn-bio than I knew going into it.
02:28:17.960 | So there was just this very broad, I mean,
02:28:20.840 | I knew much more than the average person
02:28:23.040 | going into that episode, but there was this incredible
02:28:25.720 | breadth of grounding that I needed to give myself
02:28:27.760 | in the domain.
02:28:28.920 | And then George does so many interesting things,
02:28:32.600 | there's so many interesting things emitting from his lab
02:28:35.480 | that, you know, and he and I had a really good dialogue.
02:28:38.440 | He was a great guide going into it.
02:28:40.200 | Winnowing it down to the three to four
02:28:44.360 | that I really wanted us to focus on
02:28:46.400 | to create a sense of wonder and magic in the listener
02:28:49.520 | of what could be possible from this
02:28:51.600 | very broad spectrum domain, that was a doozy of a challenge.
02:28:54.680 | That was a tough, tough, tough one to prepare for.
02:28:58.120 | Now in terms of something that was just wild and fun,
02:29:02.760 | unexpected, I mean, by the time we sat down to interview,
02:29:06.120 | I knew where we were gonna go,
02:29:07.400 | but just in terms of the idea space, Don Hoffman.
02:29:11.280 | - Oh, wow. - Yeah.
02:29:12.400 | So Don Hoffman, as again, some listeners probably know,
02:29:16.000 | 'cause he's, I think I was the first podcaster
02:29:18.120 | to interview him.
02:29:19.400 | I'm sure some of your listeners are familiar with him,
02:29:21.320 | but he has this unbelievably contrarian take
02:29:26.240 | on the nature of reality, but it is contrarian in a way
02:29:31.240 | that all the ideas are highly internally consistent
02:29:35.120 | and snap together in a way that's just delightful.
02:29:38.520 | And it seems as radically violating of our intuitions
02:29:43.520 | and as radically violating of the probable nature of reality
02:29:47.880 | as anything that one can encounter,
02:29:49.460 | but an analogy that he uses, which is very powerful,
02:29:52.040 | which is what intuition could possibly be more powerful
02:29:56.360 | than the notion that there is a single unitary direction
02:29:59.160 | called down, and we're on this big flat thing
02:30:02.720 | for which there is a thing called down.
02:30:05.200 | And we all know, I mean, that's the most intuitive thing
02:30:07.560 | that one could probably think of.
02:30:10.200 | And we all know that that ain't true.
02:30:12.320 | So my conversation with Don Hoffman was just wild
02:30:15.680 | and full of plot twists and interesting stuff.
02:30:19.640 | - And the interesting thing about the wildness of his ideas,
02:30:23.200 | it's to me at least as a listener coupled with,
02:30:28.200 | he's a good listener and he empathizes
02:30:32.200 | with the people who challenge his ideas.
02:30:35.240 | Like what's a better way to phrase that?
02:30:39.360 | He is welcoming of challenge in a way
02:30:42.280 | that creates a really fun conversation.
02:30:44.600 | - Oh, totally, yeah.
02:30:45.640 | He loves a parry or a jab, whatever the word is,
02:30:50.640 | at his argument, he honors it.
02:30:53.880 | He's a very, very gentle and non-combative soul,
02:30:58.880 | but then he is very good and takes great evident joy
02:31:04.920 | in responding to that in a way that expands
02:31:08.480 | your understanding of his thinking.
02:31:10.160 | - Let me, as a small tangent, tie together
02:31:13.800 | our previous conversation about listen.com
02:31:15.480 | and streaming and Spotify and the world of podcasting.
02:31:19.200 | So we've been talking about this magical medium
02:31:23.720 | of podcasting, I have a lot of friends at Spotify
02:31:27.480 | in the high positions of Spotify as well.
02:31:31.480 | I worry about Spotify and podcasting,
02:31:37.440 | and the future of podcasting in general,
02:31:40.020 | that it moves podcasting in the direction
02:31:43.000 | of maybe walled gardens of sorts.
02:31:46.780 | Since you've had, and still have,
02:31:50.960 | a foot in both worlds,
02:31:53.480 | do you worry as well about the future of podcasting?
02:31:57.160 | - Yeah, I think walled gardens are really toxic
02:32:01.880 | to the medium that they start balkanizing.
02:32:05.520 | So to take an example, I'll take two examples.
02:32:08.240 | With music, it was a very, very big deal that at Rhapsody,
02:32:14.200 | we were the first company to get full catalog licenses
02:32:16.920 | from all, back then there were five major music labels
02:32:20.000 | and also hundreds and hundreds of indies
02:32:21.480 | because you needed to present the listener
02:32:24.040 | with a sense that basically everything is there
02:32:27.480 | and there is essentially no friction
02:32:30.280 | to discovering that which is new
02:32:32.220 | and you can wander this realm
02:32:33.700 | and all you really need is a good map,
02:32:36.720 | whether it is something that somebody,
02:32:38.160 | the editorial team assembled or a good algorithm
02:32:40.720 | or whatever it is, but a good map to wander this domain.
02:32:43.480 | When you start walling things off,
02:32:45.720 | A, you undermine the joy of friction-free discovery,
02:32:50.400 | which is an incredibly valuable thing
02:32:52.360 | to deliver to your customer,
02:32:54.100 | both from a business standpoint
02:32:55.720 | and simply from a humanistic standpoint
02:32:59.440 | of you wanna bring delight to people,
02:33:01.320 | but it also creates an incredible opening vector for piracy.
02:33:05.960 | And so something that's very different
02:33:08.020 | from the Rhapsody/Spotify/et cetera like experience
02:33:12.160 | is what we have now in video.
02:33:14.440 | Like wow, is that show on Hulu?
02:33:16.440 | Is it on Netflix?
02:33:17.500 | Is it on something like IFC channel?
02:33:20.020 | Is it on Discovery+, is it here, is it there?
02:33:23.140 | And the more frustration and toe-stubbing
02:33:26.780 | that people encounter when they are seeking something
02:33:31.440 | and they're already paying a very respectable amount
02:33:33.960 | of money per month to have access to content
02:33:36.920 | and they can't find it, the more that happens,
02:33:39.160 | the more people are gonna be driven
02:33:40.600 | to piracy solutions like to hell with it.
02:33:42.860 | Never know where I'm gonna find something,
02:33:44.440 | I never know what it's gonna cost.
02:33:45.600 | Oftentimes, really interesting things
02:33:48.080 | are simply unavailable.
02:33:50.120 | That surprises me, the number of times
02:33:52.000 | that I've been looking for things
02:33:53.020 | I don't even think are that obscure
02:33:54.960 | that are just, it says not available
02:33:57.280 | in your geography period, mister, right?
02:33:59.800 | So I think that that's a mistake.
02:34:01.560 | And then the other thing is for podcasters
02:34:04.860 | and lovers of podcasting, we should wanna resist
02:34:08.280 | this walled-garden thing because A,
02:34:11.120 | it does smother this friction-free
02:34:16.240 | or eradicate this friction-free discovery
02:34:18.280 | unless you wanna sign up for lots of different services.
02:34:21.480 | And it also dims the voice of somebody
02:34:25.840 | who might be able to have a far, far, far bigger impact
02:34:28.700 | by reaching far more neurons with their ideas.
02:34:32.760 | I'm gonna use an example from,
02:34:34.480 | I guess it was probably the '90s
02:34:35.600 | or maybe it was the aughts, of Howard Stern
02:34:38.640 | who had the biggest megaphone
02:34:41.380 | or maybe the second biggest after Oprah
02:34:43.680 | megaphone in popular culture.
02:34:45.820 | And 'cause he was syndicated on hundreds and hundreds
02:34:48.840 | and hundreds of radio stations at a time
02:34:50.520 | when terrestrial broadcast was the main thing
02:34:52.220 | people listened to in their car, no more obviously.
02:34:54.860 | But when he decided to go over to satellite radio,
02:34:58.040 | I can't remember if it was XM or Sirius,
02:34:59.640 | maybe they'd already merged at that point.
02:35:01.740 | But when he did that, he made,
02:35:03.840 | totally his right to do it, a financial calculation
02:35:07.720 | that they were offering him a nine-figure sum to do that.
02:35:10.980 | But his audience, because not a lot of people
02:35:13.160 | were subscribing to satellite radio at that point,
02:35:15.080 | his audience probably collapsed by,
02:35:17.840 | I wouldn't be surprised if it was as much as 95%.
02:35:20.840 | And so the influence that he had on the culture
02:35:24.120 | and his ability to sort of shape conversation
02:35:27.520 | and so forth just got muted.
02:35:30.560 | - Yeah, and also there's a certain sense,
02:35:33.480 | especially in modern times where the walled gardens
02:35:37.400 | naturally lead to,
02:35:40.100 | I don't know if there's a term for it,
02:35:44.920 | but people who are not creatives
02:35:48.480 | starting to have power over the creatives.
02:35:51.400 | - Right, and even if they don't stifle it,
02:35:54.400 | if they're providing incentives within the platform
02:35:59.400 | to shape, shift, or even completely mutate
02:36:03.960 | or distort the show, I mean,
02:36:05.960 | imagine somebody has got a reasonably interesting idea
02:36:10.280 | for a podcast and they get signed up with,
02:36:12.720 | let's say Spotify, and then Spotify
02:36:14.360 | is gonna give them financing to get the thing spun up.
02:36:17.160 | And that's great, and Spotify is gonna give them
02:36:19.720 | a certain amount of really powerful placement
02:36:24.080 | within the visual field of listeners.
02:36:27.120 | But Spotify has conditions for that.
02:36:29.080 | They say, look, we think that your podcast
02:36:31.960 | will be much more successful if you dumb it down about 60%,
02:36:36.960 | if you add some silly, dirty jokes,
02:36:42.000 | if you do this, you do that,
02:36:43.800 | and suddenly the person who is dependent upon Spotify
02:36:47.200 | for permission to come into existence
02:36:49.000 | and is really dependent, really wants to please them
02:36:51.760 | to get that money in, to get that placement,
02:36:53.480 | really wants to be successful,
02:36:55.020 | now all of a sudden you're having a dialogue
02:36:56.560 | between a complete non-creative,
02:36:59.080 | some marketing sort of data analytic person
02:37:02.580 | at Spotify and a creative that's going to shape
02:37:05.240 | what that show is.
02:37:07.200 | So that could be much more common
02:37:10.160 | and ultimately have, in the aggregate,
02:37:13.120 | an even bigger impact than the cancellation,
02:37:16.120 | let's say, of somebody who says the wrong word
02:37:17.840 | or voices the wrong idea.
02:37:19.680 | I mean, that's kind of what you have,
02:37:21.000 | not kind of, it's what you have with film and TV,
02:37:23.360 | is that so much influence is exerted over the storyline
02:37:28.360 | and the plots and the character arcs
02:37:30.640 | and all kinds of things by executives
02:37:33.020 | who are completely alien to the experience
02:37:35.400 | and the skill set of being a showrunner in television,
02:37:37.440 | being a director in film, that is meant to like,
02:37:40.680 | we can't piss off the Chinese market here,
02:37:42.840 | we can't say that, we need to have cast members
02:37:46.440 | that have precisely these demographics reflected
02:37:48.880 | or whatever it is. And obviously,
02:37:51.720 | despite that, extraordinary TV shows,
02:37:53.360 | at least, are now being made.
02:37:55.220 | In terms of film, I think the average quality
02:38:00.040 | of, let's say, the American film
02:38:02.160 | coming out of a major studio
02:38:04.120 | has, in my view, nosedived over the past decade.
02:38:06.240 | It's kind of, everything's gotta be a superhero franchise,
02:38:09.840 | but great stuff gets made despite that,
02:38:12.960 | but I have to assume that in some cases,
02:38:16.960 | at least in perhaps many cases,
02:38:19.080 | greater stuff would be made if there was less interference
02:38:22.200 | from non-creative executives.
02:38:23.640 | - It's like the flip side of that, though,
02:38:25.600 | and this was the pitch of Spotify
02:38:27.280 | because I've heard their pitch,
02:38:28.920 | is Netflix, from everybody I've heard
02:38:32.520 | that I've spoken with about Netflix,
02:38:34.320 | is they actually empower the creator.
02:38:36.080 | - They do. - I don't know
02:38:36.920 | what the heck they do, but they do a good job
02:38:39.420 | of giving creators, even the crazy ones,
02:38:41.760 | like Tim Dillon, like Joe Rogan, like comedians,
02:38:44.480 | freedom to be their crazy selves,
02:38:46.960 | and the result is some of the greatest television,
02:38:51.600 | some of the greatest cinema,
02:38:53.800 | whatever you call it, ever made.
02:38:55.680 | - True. - Right?
02:38:56.600 | And I don't know what the heck they're doing.
02:38:58.700 | - It's a relative thing.
02:39:00.120 | From what I understand, it's a relative thing.
02:39:01.400 | They're interfering far, far, far less
02:39:03.680 | than NBC or AMC would have interfered,
02:39:08.080 | so it's a relative thing,
02:39:09.940 | and obviously, they're the ones writing the checks,
02:39:12.100 | and they're the ones giving the platforms,
02:39:13.380 | so they have every right to their own influence, obviously,
02:39:16.600 | but my understanding is that they're relatively
02:39:19.100 | way more hands-off, and that has had a demonstrable effect,
02:39:22.140 | 'cause I agree, some of the greatest produced video content
02:39:26.620 | of all time, an incredibly inordinate percentage of that
02:39:29.940 | is coming out from Netflix in just a few years
02:39:32.140 | when the history of cinema goes back many, many decades.
02:39:34.500 | - And Spotify wants to be that for podcasting,
02:39:38.400 | and I hope they do become that for podcasting,
02:39:41.280 | but I'm wearing my skeptical goggles or skeptical hat,
02:39:45.800 | whatever the heck it is, 'cause it's not easy to do,
02:39:48.640 | and it requires letting go of power,
02:39:53.160 | giving power to the creatives.
02:39:55.040 | It requires pivoting, which large companies,
02:39:57.520 | even as innovative as Spotify is,
02:40:00.720 | still now a large company,
02:40:02.120 | pivoting into a whole new space is very tricky,
02:40:04.440 | and difficult, so I'm skeptical, but hopeful.
02:40:08.120 | What advice would you give to a young person today
02:40:10.900 | about life, about career?
02:40:12.980 | We talked about startups, we talked about music,
02:40:15.580 | we talked about the end of human civilization.
02:40:17.880 | Is there advice you would give to a young person today,
02:40:22.260 | maybe in college, maybe in high school, about their life?
02:40:27.020 | - Well, let's see, I mean, there's so many domains
02:40:28.900 | you can advise on, and I'm not gonna give advice
02:40:34.520 | on life, because I fear that I would drift
02:40:36.960 | into sort of Hallmark bromides
02:40:39.280 | that really wouldn't be all that distinctive,
02:40:41.280 | and they might be entirely true.
02:40:43.180 | Sometimes the greatest insights about life
02:40:46.160 | turn out to be like the kinds of things
02:40:48.160 | you'd see on a Hallmark card,
02:40:49.080 | so I'm gonna steer clear of that.
02:40:50.460 | On a career level, one thing that I think is unintuitive,
02:40:55.460 | but unbelievably powerful, is to focus not necessarily
02:41:00.880 | on being in the top sliver of 1% in excelling at one domain
02:41:05.880 | that's important and valuable, but to think in terms
02:41:10.420 | of intersections of two domains,
02:41:13.900 | which are rare, but valuable.
02:41:16.460 | And there's a couple reasons for this.
02:41:19.620 | The first is, in an incredibly competitive world
02:41:23.500 | that is so much more competitive than it was
02:41:25.460 | when I was coming out of school,
02:41:26.660 | radically more competitive than when I was coming
02:41:28.900 | out of school, to navigate your way
02:41:31.600 | to the absolute pinnacle of any domain.
02:41:34.040 | Let's say you wanna be really, really great
02:41:37.400 | at Python, pick a language, whatever it is.
02:41:40.320 | You wanna be one of the world's greatest Python developers,
02:41:44.100 | JavaScript, whatever your language is.
02:41:45.720 | Hopefully it's not COBOL.
02:41:47.080 | - By the way, if you listen to this,
02:41:50.520 | I am actually looking for a COBOL expert
02:41:53.240 | to interview, 'cause I find language fascinating,
02:41:55.240 | and there's not many of them, so please,
02:41:57.280 | if you know a world expert in COBOL,
02:42:00.540 | or Fortran, or both, actually.
02:42:02.300 | - Or if you are one.
02:42:03.460 | - Or if you are one, please email me.
02:42:05.620 | - Yeah, so I mean, if you're going out there
02:42:07.740 | and you wanna be in the top sliver of 1%
02:42:10.220 | of Python developers, it's a very, very difficult thing to do,
02:42:12.420 | particularly if you wanna be number one in the world,
02:42:13.700 | something like that.
02:42:14.780 | And I'll use an analogy, is I had a friend in college
02:42:17.820 | who was on a track, and indeed succeeded at that,
02:42:23.780 | to become an Olympic medalist,
02:42:26.660 | and I think it was 100 meter breaststroke.
02:42:28.740 | And he mortgaged a significant percentage
02:42:33.740 | of his sort of college life to that goal,
02:42:37.340 | or I should say dedicated, or invested,
02:42:38.980 | or whatever you wanted to say,
02:42:39.900 | but he didn't participate in a lot of the social,
02:42:42.720 | a lot of the late night, a lot of the this,
02:42:44.780 | a lot of the that, because he was training so much.
02:42:48.120 | And obviously he also wanted to keep up with his academics,
02:42:50.660 | and at the end of the day, story has a happy ending,
02:42:53.540 | in that he did medal in that.
02:42:55.740 | Bronze, not gold, but holy cow,
02:42:57.780 | anybody who gets an Olympic medal,
02:42:59.140 | that's an extraordinary thing, and at that moment,
02:43:01.060 | he was one of the top three people on Earth at that thing.
02:43:05.060 | But wow, how hard to do that,
02:43:07.320 | how many thousands of other people went down that path
02:43:10.700 | and made similar sacrifices and didn't get there.
02:43:13.020 | It's very, very hard to do that.
02:43:15.100 | Whereas, and I'll use a personal example,
02:43:17.720 | when I came out of business school,
02:43:20.700 | I went to a good business school,
02:43:22.580 | and learned the things that were there to be learned,
02:43:25.720 | and I came out and I entered a world with lots of--
02:43:29.100 | - Harvard Business School, by the way.
02:43:30.540 | - Okay, yes, it was Harvard, it's true.
02:43:32.900 | - You're the first person who went there
02:43:34.620 | who didn't say where you went, which is beautiful,
02:43:36.940 | I appreciate that.
02:43:37.780 | It's one of the greatest business schools in the world.
02:43:41.260 | It's a whole 'nother fascinating conversation
02:43:43.220 | about that world, but anyway, yes.
02:43:45.380 | - But anyway, so I learned the things,
02:43:46.980 | you learn getting an MBA from a top program,
02:43:51.100 | and I entered a world that had hundreds of thousands
02:43:53.500 | of people who had MBAs, probably hundreds of thousands
02:43:57.600 | who had them from top 10 programs.
02:44:00.320 | So I was not particularly great at being an MBA person.
02:44:04.960 | I was inexperienced relative to most of them,
02:44:07.800 | and there were a lot of them,
02:44:08.720 | but I was an okay MBA person, newly minted.
02:44:12.000 | But then as it happened, I found my way
02:44:16.000 | into working on the commercial internet in 1994.
02:44:20.320 | So I went to a, at the time, giant and hot computing company
02:44:23.540 | called Silicon Graphics, which had enough heft
02:44:26.260 | and enough head count that they could take on
02:44:29.140 | and experienced MBAs and try to train them
02:44:31.340 | in the world of Silicon Valley.
02:44:33.060 | But within that company that had an enormous amount
02:44:37.300 | of surface area and was touching a lot of areas
02:44:39.460 | and had unbelievably smart people at the time,
02:44:43.660 | it was not surprising that SGI started doing
02:44:47.420 | really interesting and innovative and trailblazing stuff
02:44:50.520 | on the internet before almost anybody else.
02:44:52.580 | And part of the reason was that our founder,
02:44:54.080 | Jim Clark, went off to co-found Netscape
02:44:55.800 | with Marc Andreessen, so the whole company was like,
02:44:58.080 | "Wait, what was that?
02:44:58.920 | "What's this commercial internet thing?"
02:45:00.660 | So I end up in that group.
02:45:01.900 | Now, in terms of being a commercial internet person
02:45:04.980 | or a worldwide web person, again,
02:45:09.280 | I was, in that case, barely credentialed.
02:45:11.260 | I couldn't write a stitch of code,
02:45:12.980 | but I had a pretty good mind for grasping
02:45:16.360 | the business and cultural significance of this transition.
02:45:21.360 | And this was, again, we were talking earlier
02:45:23.740 | about emerging areas.
02:45:25.320 | Within a few months, I was in the relatively top echelon
02:45:28.840 | of people in terms of just sheer experience.
02:45:31.500 | 'Cause let's say it was five months into the program,
02:45:33.680 | there were only so many people who had been doing
02:45:35.520 | worldwide web stuff commercially for five months.
02:45:38.500 | And then what was interesting, though,
02:45:40.920 | was the intersection of those two things.
02:45:43.480 | The commercial web, as it turned out,
02:45:45.380 | grew into an unbelievable vastness.
02:45:49.600 | And so by being a pretty good, okay web person
02:45:53.620 | and a pretty good, okay MBA person,
02:45:56.600 | that intersection put me in a very rare group,
02:45:59.380 | which was web-oriented MBAs.
02:46:03.080 | And in those early days, you could probably count
02:46:06.240 | on your fingers the number of people
02:46:08.520 | who came out of really competitive programs
02:46:10.100 | who were doing stuff full-time on the internet.
02:46:11.760 | And there was a greater appetite for great software
02:46:15.700 | developers in the internet domain,
02:46:17.440 | but there was an appetite and a real one
02:46:19.680 | and a rapidly growing one for MBA thinkers
02:46:24.220 | who were also seasoned and networked in the emerging world
02:46:28.020 | of the commercial worldwide web.
02:46:29.260 | And so finding an intersection of two things
02:46:33.300 | you can be pretty good at, but is a rare intersection
02:46:37.620 | and a special intersection, is probably a much easier way
02:46:41.740 | to make yourself distinguishable and in demand
02:46:44.540 | from the world than trying to be world-class
02:46:46.940 | at this one thing.
02:46:48.660 | - So in the intersection is where there's
02:46:51.260 | to be discovered opportunity and success.
02:46:53.220 | That's really interesting.
02:46:56.820 | There are actually more intersections of fields
02:46:56.820 | than fields themselves, right?
02:46:58.980 | - Yeah, I mean, I'll give you
02:46:59.820 | kind of a funny hypothetical here,
02:47:01.900 | but it's one I've been thinking about a little bit.
02:47:04.480 | There's a lot of people in crypto right now.
02:47:06.460 | It'd be hard to be in the top percentile of crypto people,
02:47:11.060 | whether it comes from just having a sheer grasp
02:47:13.020 | of the industry, a great network within the industry,
02:47:15.040 | technological skills, whatever you wanna call it.
02:47:18.260 | And then there's this parallel world,
02:47:19.940 | an orthogonal world called crop insurance.
02:47:22.920 | And there's, I'm sure that's a big world.
02:47:25.260 | Crop insurance is a very, very big deal,
02:47:27.340 | particularly in the wealthy and industrialized world
02:47:29.460 | where people, there's sophisticated financial markets,
02:47:31.920 | rule of law, and large agricultural concerns
02:47:35.220 | that are worried about that.
02:47:37.380 | Somewhere out there is somebody who is pretty crypto savvy,
02:47:40.540 | but probably not top 1%.
02:47:42.580 | But also has kind of been in the crop insurance world
02:47:45.940 | and understands that a hell of a lot better
02:47:47.980 | than almost anybody who's ever had anything
02:47:50.420 | to do with cryptocurrency.
02:47:52.140 | And so I think that decentralized finance, DeFi,
02:47:56.420 | one of the interesting and, I think, very world-positive things
02:47:59.660 | that it's almost inevitably
02:48:01.500 | going to bring to the world
02:48:03.260 | is crop insurance for smallholding farmers.
02:48:07.060 | I mean, people who have tiny, tiny plots of land
02:48:10.060 | in places like India, et cetera,
02:48:12.020 | where there is no crop insurance available to them
02:48:14.500 | because just the financial infrastructure doesn't exist.
02:48:18.980 | But it's highly imaginable that using oracle networks
02:48:22.520 | that are trusted outside deliverers of factual information
02:48:26.420 | about rainfall in a particular area,
02:48:28.280 | you can start giving drought insurance to folks like this.
02:48:31.260 | The right person to come up with that idea
02:48:33.700 | is not a crypto whiz who doesn't know a blasted thing
02:48:37.740 | about smallholding farmers.
02:48:39.300 | The right person to come up with that
02:48:40.500 | is not a crop insurance whiz
02:48:42.540 | who isn't quite sure what Bitcoin is.
02:48:44.340 | But somebody occupies that intersection.
02:48:47.460 | That's just one of a gazillion examples
02:48:50.140 | of things that are gonna come along
02:48:51.820 | for somebody who occupies the right intersection of skills
02:48:55.100 | but isn't necessarily the number one person
02:48:57.620 | at either one of those expertises.
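As a rough, purely illustrative sketch of the oracle-driven drought insurance idea above: the toy Python below models a pool that collects premiums and pays out when an oracle-reported seasonal rainfall figure falls below a policy's trigger. Every name and number here (DroughtInsurancePool, the 100 mm threshold, the premium and payout amounts) is an assumption made up for the example, not a real protocol and not anything specified in the conversation.

```python
# Toy sketch of parametric drought insurance driven by an external
# rainfall oracle. All names, thresholds, and amounts are hypothetical.

from dataclasses import dataclass


@dataclass
class Policy:
    farmer: str          # identifier for the smallholding farmer
    region: str          # region whose rainfall the oracle reports
    premium: float       # what the farmer paid in
    payout: float        # fixed payout if the drought trigger is hit
    threshold_mm: float  # seasonal rainfall below this triggers a payout


class DroughtInsurancePool:
    """Collects premiums; pays out when oracle-reported seasonal rainfall
    for a policy's region falls below that policy's threshold."""

    def __init__(self) -> None:
        self.policies: list[Policy] = []
        self.reserves: float = 0.0

    def buy_policy(self, policy: Policy) -> None:
        # Premiums accumulate in the pool's reserves.
        self.reserves += policy.premium
        self.policies.append(policy)

    def settle(self, oracle_rainfall_mm: dict[str, float]) -> dict[str, float]:
        """oracle_rainfall_mm maps region -> rainfall reported by a trusted
        oracle network; returns payouts owed per farmer."""
        payouts: dict[str, float] = {}
        for p in self.policies:
            rainfall = oracle_rainfall_mm.get(p.region)
            if rainfall is not None and rainfall < p.threshold_mm:
                payouts[p.farmer] = payouts.get(p.farmer, 0.0) + p.payout
                self.reserves -= p.payout
        return payouts


# Example: one farmer in a hypothetical district with a 100 mm trigger.
pool = DroughtInsurancePool()
pool.buy_policy(Policy(farmer="farmer_1", region="district_A",
                       premium=5.0, payout=50.0, threshold_mm=100.0))
print(pool.settle({"district_A": 62.5}))  # {'farmer_1': 50.0}
```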
02:48:59.620 | - That's making me kind of wonder
02:49:01.060 | about my own little things that I'm average at
02:49:04.620 | and seeing where the intersections that could be exploited.
02:49:09.260 | That's pretty profound.
02:49:10.340 | So we talked quite a bit about the end of the world
02:49:13.780 | and how we're both optimistic about us figuring our way out.
02:49:17.900 | Unfortunately, for now at least,
02:49:20.540 | both you and I are going to die one day way too soon.
02:49:25.000 | First of all, that sucks.
02:49:29.140 | - It does.
02:49:29.980 | (laughing)
02:49:32.580 | - I mean, one I'd like to ask,
02:49:36.320 | if you ponder your own mortality,
02:49:39.340 | how does that kind of, what kind of wisdom insight
02:49:42.620 | does it give you about your own life?
02:49:45.260 | And broadly, do you think about your life
02:49:47.980 | and what the heck it's all about?
02:49:50.660 | - Yeah, with respect to pondering mortality,
02:49:53.400 | I do try to do that as little as possible
02:49:57.140 | 'cause there's not a lot I can do about it.
02:49:59.960 | But it's inevitably there.
02:50:01.180 | And I think that what it does,
02:50:03.780 | when you think about it in the right way,
02:50:06.080 | is it makes you realize how unbelievably rare and precious
02:50:11.080 | the moments that we have here are,
02:50:13.500 | and therefore how consequential the decisions that we make
02:50:16.140 | about how to spend our time are.
02:50:17.940 | Do you do those 17 nagging emails
02:50:23.060 | or do you have dinner with somebody who's really important
02:50:26.380 | to you who you haven't seen in three and a half years?
02:50:28.780 | If you had an infinite expanse of time in front of you,
02:50:31.740 | you might well rationally conclude
02:50:33.860 | I'm gonna do those emails because collectively,
02:50:35.740 | they're rather important and I have tens of thousands
02:50:38.740 | of years to catch up with my buddy, Tim.
02:50:41.020 | But I think the scarcity of the time that we have
02:50:43.900 | helps us choose the right things if we're attuned to that.
02:50:49.660 | And we're attuned to the context that mortality puts
02:50:53.020 | over the consequence of every decision we make
02:50:55.780 | of how to spend our time.
02:50:56.920 | That doesn't mean that we're all very good at it.
02:50:58.700 | Doesn't mean I'm very good at it.
02:51:00.400 | But it does add a dimension of choice and significance
02:51:05.340 | to everything that we elect to do.
02:51:07.380 | - It's kind of funny that you say you try to think about it
02:51:09.620 | as little as possible.
02:51:10.540 | I would venture to say you probably think about
02:51:12.340 | the end of human civilization more than you do
02:51:14.580 | about your own life.
02:51:15.500 | - You're probably right.
02:51:16.700 | - Because that feels like a problem that could be solved.
02:51:19.580 | - Right.
02:51:20.420 | - And--
02:51:21.240 | - Whereas the end of my own life can't be solved.
02:51:23.000 | Well, I don't know.
02:51:23.840 | I mean, there's transhumanists who have incredible optimism
02:51:26.080 | about near or intermediate future therapies
02:51:29.540 | that could really, really change human lifespan.
02:51:32.740 | I really hope that they're right,
02:51:34.620 | but I don't have a whole lot to add to that project
02:51:36.820 | because I'm not a life scientist myself, so.
02:51:39.700 | - I'm in part also afraid of immortality.
02:51:43.180 | Not as much, but close to as I'm afraid of death itself.
02:51:48.740 | So it feels like the things that give us meaning
02:51:53.740 | because of the scarcity that surrounds it.
02:51:56.100 | - Agreed.
02:51:56.940 | - I'm almost afraid of having too much of stuff.
02:52:01.520 | - Yeah.
02:52:02.860 | Although if there was something that said,
02:52:04.180 | "This can expand your enjoyable well-span
02:52:07.860 | "or lifespan by 75 years," I'm all in.
02:52:11.660 | - Well, part of the reason I wanted to not do a startup,
02:52:15.280 | really the only thing that worries me about doing a startup
02:52:21.540 | is if it becomes successful.
02:52:24.540 | Because of how much I dream,
02:52:26.040 | how much I'm driven to be successful,
02:52:28.820 | that there will not be enough silence in my life,
02:52:33.860 | enough scarcity to appreciate the moments I appreciate now
02:52:38.860 | as deeply as I appreciate them now.
02:52:41.500 | Like, there's a simplicity to my life now
02:52:44.700 | that it feels like it might disappear with success.
02:52:48.580 | - I wouldn't say might.
(Lex laughs)
02:52:52.180 | I think if you start a company that has ambitious investors,
02:52:57.180 | ambitious for the returns that they'd like to see,
02:53:01.180 | that has ambitious employees,
02:53:02.740 | ambitious for the career trajectories
02:53:05.860 | they wanna be on and so forth,
02:53:07.380 | and is driven by your own ambition,
02:53:12.040 | there is a profound monogamy to that.
02:53:15.780 | And it is very, very hard to carve out time
02:53:21.740 | to be creative, to be peaceful, to be so forth
02:53:24.420 | because with every new employee that you hire,
02:53:28.620 | that's one more mouth to feed.
02:53:30.540 | With every new investor that you take on,
02:53:33.020 | that's one more person to whom you really do wanna
02:53:36.300 | deliver great returns.
02:53:37.940 | And as the valuation ticks up,
02:53:40.180 | the threshold to delivering great returns
02:53:42.340 | for your investors always rises.
02:53:45.020 | And so there is an extraordinary monogamy
02:53:49.020 | to being a founder CEO,
02:53:51.320 | above all for the first few years.
02:53:55.100 | And the first few years, in people's minds,
02:53:56.620 | could be as many as 10 or 15.
02:53:59.220 | - But I guess the fundamental calculation
02:54:04.220 | is whether the passion for the vision
02:54:07.060 | is greater than the cost you'll pay.
02:54:09.100 | - Right, it's all opportunity cost.
02:54:11.300 | It's all opportunity cost.
02:54:13.740 | In terms of time and attention and experience.
02:54:16.580 | - And some things, everyone's different,
02:54:19.140 | but I'm less calculating.
02:54:20.460 | Some things you just can't help.
02:54:21.660 | Sometimes you just dive in.
02:54:23.740 | - Oh yeah, I mean you can do balance sheets all you want
02:54:26.540 | on this versus that and what's the right,
02:54:28.220 | I mean I've done it in the past and it's never worked.
02:54:31.100 | It's always been like,
02:54:32.900 | okay, what's my gut screaming at me to do?
02:54:35.020 | - But about the meaning of life,
02:54:39.280 | you ever think about that?
02:54:43.300 | - Yeah, I mean, this is where I'm gonna go
02:54:45.100 | all hallmarking on you,
02:54:46.300 | but I think that there's a few things
02:54:49.220 | and one of them is certainly love.
02:54:53.220 | And the love that we experience and feel
02:54:57.380 | and cause to well up in others
02:55:00.180 | is something that's just so profound
02:55:03.940 | and goes beyond almost anything else that we can do.
02:55:07.780 | And whether that is something that lies in the past,
02:55:11.900 | like maybe there was somebody that you were dating
02:55:14.580 | and loved very profoundly in college
02:55:17.380 | and haven't seen in years,
02:55:19.140 | I don't think the significance of that love
02:55:20.940 | is in any way diminished by the fact
02:55:22.860 | that it had a notional beginning and end.
02:55:25.540 | The fact is that you experience that
02:55:27.300 | and you triggered that in somebody else and that happened.
02:55:29.980 | And it doesn't have to be,
02:55:32.660 | certainly it doesn't have to be love
02:55:33.860 | of romantic partners alone,
02:55:35.220 | it's family members, it's love between friends,
02:55:38.140 | it's love between creatures.
02:55:39.980 | I had a dog for 10 years who passed away a while ago
02:55:43.900 | and experienced unbelievable love with her.
02:55:48.900 | It can be love of that which you create.
02:55:50.460 | And we were talking about the flow states that we enter
02:55:52.460 | and the pride or lack of pride,
02:55:54.980 | or in the Minsky case,
02:55:56.380 | your hatred of that which you've done,
02:55:57.620 | but nonetheless, the creations that we make,
02:56:02.220 | and whether it's the love or the joy
02:56:05.700 | or the engagement or the perspective shift,
02:56:07.820 | that that cascades into other minds.
02:56:11.180 | I think that's a big, big, big part of the meaning of life.
02:56:13.620 | It's not something that everybody participates in
02:56:15.500 | necessarily, although I think we all do,
02:56:18.660 | at least in a very local level by the example that we set,
02:56:23.540 | by the interactions that we have,
02:56:25.620 | but for people who create works that travel far
02:56:29.500 | and reach people they'll never meet,
02:56:31.660 | that reach countries they'll never visit,
02:56:33.740 | that reach people perhaps that come along
02:56:36.460 | and come across their ideas or their works or their stories
02:56:39.060 | or their aesthetic creations of other sorts
02:56:41.260 | long after they're dead.
02:56:43.340 | I think that's a really, really big part of the fabric
02:56:46.340 | of the meaning of life.
02:56:48.300 | And so all these things, like love and creation,
02:56:54.300 | I think really is what it's all about.
02:56:59.300 | - And part of love is also the loss of it.
02:57:07.620 | There's a Louie episode, with Louis C.K.,
02:57:07.620 | where an old gentleman is giving him advice
02:57:11.220 | that sometimes the sweetest parts of love
02:57:14.740 | is when you lose it and you remember it,
02:57:17.700 | sort of you reminisce on the loss of it.
02:57:20.820 | And there's some aspect in which,
02:57:23.500 | and I have many of those in my own life,
02:57:25.580 | that almost like the memories of it
02:57:28.860 | and the intensity of emotion you still feel about it
02:57:33.740 | is like the sweetest part.
02:57:35.180 | You're like, after saying goodbye, you relive it.
02:57:40.760 | So that goodbye is also a part of love.
02:57:45.340 | The loss of it is also a part of love.
02:57:47.420 | I don't know, it's back to that scarcity.
02:57:49.580 | - I won't say the loss is the best part personally,
02:57:53.000 | but it definitely is an aspect of it.
02:57:55.660 | And the grief you might feel about something that's gone
02:58:00.660 | makes you realize what a big deal it was.
02:58:04.160 | - Yeah. - Yeah.
02:58:05.360 | - Speaking of which, this particular journey,
02:58:10.220 | we went on together, come to an end.
02:58:14.020 | So I have to say goodbye, and I hate saying goodbye.
02:58:16.580 | Rob, this is truly an honor.
02:58:18.140 | I've really been a big fan.
02:58:20.320 | People should definitely check out your podcast.
02:58:22.020 | You're a master at what you do in the conversation space,
02:58:24.660 | in the writing space.
02:58:25.980 | It's been an incredible honor that you would show up here
02:58:28.740 | and spend this time with me.
02:58:29.900 | I really, really appreciate it.
02:58:30.940 | - Well, it's been a huge honor to be here as well,
02:58:33.220 | and also a fan and have been for a long time.
02:58:36.100 | - Thanks, Rob.
02:58:37.660 | Thanks for listening to this conversation with Rob Reed,
02:58:40.100 | and thank you to Athletic Greens, Belcampo,
02:58:43.460 | Fundrise, and NetSuite.
02:58:46.200 | Check them out in the description to support this podcast.
02:58:49.260 | And now, let me leave you with some words from Plato.
02:58:52.500 | We can easily forgive a child who's afraid of the dark.
02:58:55.660 | The real tragedy of life is when men are afraid of the light.
02:58:59.640 | Thank you for listening, and hope to see you next time.
02:59:03.580 | (upbeat music)
02:59:06.160 | (upbeat music)